Optical Fiber Cabling

     Fiber Testing FAQ

Optical Power Measurements

  • What are the measurement units for power?

Optical power is measured either in linear units of milliwatts (mW), microwatts (µW) and nanowatts (nW), or on a logarithmic scale in decibels (dB for relative measurements, dBm for absolute power referenced to 1 mW).

  • What is the difference between "dBm" and "dB"?

dB expresses the ratio of two powers, for example the loss in a fiber optic cable. When the powers are measured in linear units (mW, µW or nW), the ratio in dB is calculated on a log scale using this formula:

power (dB) = 10 log (power1/power2)

If we are measuring absolute power levels, the measurement is generally referenced to 1 milliwatt (mW) and expressed as "dBm", so the calculation becomes:

power (dBm) = 10 log (power/1 mW)

Thus 1 mW = 0 dBm, 1 µW = -30 dBm and 1 nW = -60 dBm, while two equal powers compared give 0 dB (e.g., if the power is the same at both ends of a cable, there is no loss).
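
As a quick check of these formulas, here is a minimal Python sketch (the helper names db and dbm are just illustrative):

    import math

    def db(power1, power2):
        # Relative power ratio in dB between two powers in the same linear units
        return 10 * math.log10(power1 / power2)

    def dbm(power_mw):
        # Absolute power in dBm, referenced to 1 milliwatt
        return 10 * math.log10(power_mw / 1.0)

    print(dbm(1.0))      # 1 mW  ->   0 dBm
    print(dbm(1e-3))     # 1 uW  -> -30 dBm
    print(dbm(1e-6))     # 1 nW  -> -60 dBm
    print(db(0.5, 0.5))  # equal powers -> 0 dB, i.e. no loss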

  • What power level should a source have?

LED: -10 to -25 dBm into 62.5/125 fiber

Telecom/LAN laser: 0 to -13 dBm into singlemode fiber, up to +20 dBm with DWDM and fiber amplifier systems

CATV laser: +16 to 0 dBm into singlemode fiber

  • What power level should a receiver see?

It depends on the network and type of source. Measured at the receiver end of the network cable, the power delivered by the source will usually be in these ranges:

LAN/LED: -20 to -35 dBm into 62.5/125 fiber

Telecom/LAN laser: -20 to -45 dBm into singlemode fiber

CATV laser: 0 to -10 dBm into singlemode fiber
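
A small sketch of how these typical ranges might be used as a sanity check on a measured receiver power; the ranges in the dictionary are simply the ones listed above, not equipment specifications:

    # Typical received power ranges (dBm) from the list above; real designs
    # should use the actual receiver sensitivity specs for the equipment.
    TYPICAL_RX_RANGE_DBM = {
        "LAN/LED": (-35, -20),
        "Telecom/LAN laser": (-45, -20),
        "CATV laser": (-10, 0),
    }

    def rx_power_in_typical_range(source_type, measured_dbm):
        low, high = TYPICAL_RX_RANGE_DBM[source_type]
        return low <= measured_dbm <= high

    print(rx_power_in_typical_range("LAN/LED", -27.5))  # True: within the typical LED range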

  • How do you calculate a loss budget?

The loss budget is an estimate of how much attenuation a link should have, calculated by adding the expected losses of every component in the link: fiber attenuation times length, plus the losses of all connectors and splices. You then compare that total loss to the power budget (dynamic range) of the networking equipment to see if the equipment and the link loss are compatible.
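
A minimal sketch of such a calculation, assuming typical maximum component losses (roughly 3.5 dB/km for 62.5/125 multimode fiber at 850 nm, 0.75 dB per mated connector pair and 0.3 dB per splice); substitute the actual values for your own components and wavelength:

    def loss_budget_db(length_km, connector_pairs, splices,
                       fiber_atten_db_per_km=3.5,   # e.g. multimode fiber at 850 nm
                       connector_loss_db=0.75,      # per mated pair, typical maximum
                       splice_loss_db=0.3):         # per splice, typical maximum
        # Estimated end-to-end link attenuation in dB
        return (length_km * fiber_atten_db_per_km
                + connector_pairs * connector_loss_db
                + splices * splice_loss_db)

    # Example: 2 km multimode link with 2 connector pairs and 1 splice
    link_loss = loss_budget_db(2.0, 2, 1)       # 7.0 + 1.5 + 0.3 = 8.8 dB

    # Compare to the equipment: transmitter output minus receiver sensitivity
    tx_dbm, rx_sensitivity_dbm = -15.0, -31.0   # hypothetical LED link specs
    margin_db = (tx_dbm - rx_sensitivity_dbm) - link_loss
    print(round(link_loss, 1), round(margin_db, 1))  # a positive margin means the link should work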

  • How accurate are fiber optic power meters?

All optical power meters calibrated traceable to NIST (the US national standards laboratory) or another national standards lab will measure optical power to an uncertainty of about +/- 0.2 dB, or roughly 5%. Since every power meter has an uncertainty of +/- 0.2 dB, any two meters can differ by 0.4 dB in the worst case (one reading +0.2 dB high and the other -0.2 dB low) even if both are within their specifications!
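
For reference, +/- 0.2 dB and roughly 5% describe about the same uncertainty; a quick sketch of the arithmetic:

    uncertainty_db = 0.2
    # A +0.2 dB error corresponds to reading about 4.7% high in linear power
    fraction_high = 10 ** (uncertainty_db / 10) - 1
    print(round(fraction_high * 100, 1))   # ~4.7 %

    # Worst case disagreement between two in-spec meters (+0.2 dB vs -0.2 dB)
    print(2 * uncertainty_db)              # 0.4 dB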

  • Are more complex or higher priced FO power meters more accurate?

No. The higher priced meters offer better dynamic range and more features, but not better absolute measurement uncertainty.

Why is the measurement uncertainty so high? Because there are three to four calibration transfers from the NIST absolute optical power standard before a meter reaches the customer. The NIST standard itself has an uncertainty of about 1%, and every transfer adds roughly another 1% of error.
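
A rough sketch of how those transfers stack up, assuming the errors simply add (a worst-case view of the chain described above):

    import math

    nist_uncertainty = 0.01    # ~1% for the NIST primary standard
    per_transfer = 0.01        # ~1% added by each calibration transfer
    transfers = 4              # three to four transfers before the meter reaches the user

    total_fraction = nist_uncertainty + transfers * per_transfer
    total_db = 10 * math.log10(1 + total_fraction)
    print(round(total_fraction * 100, 1), "%")   # ~5 %
    print(round(total_db, 2), "dB")              # ~0.21 dB, about the +/- 0.2 dB quoted above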

  • Why do most meters only offer calibrations at a few wavelengths?

NIST only offers calibrations at 850, 1300 and 1550 nm, so meters that claim calibration at other wavelengths have to extrapolate from those values, which increases the measurement uncertainty at the additional wavelengths.

  • If my source is at a slightly different wavelength from the standard calibration wavelength, doesn't that add to measurement error?

Perhaps, but the wavelength of most sources is not known by the person making the measurement. If everyone uses meters calibrated at only a few specific wavelengths, everyone can be testing to the same standard and will get more closely correlated measurements on sources of unknown wavelengths.

  • How do you test optical return loss?

Use an OTDR on installed cable plants and an OCWR on patch cords. ORL testing with an ORL tester (what Telcordia/Bellcore calls an OCWR, or optical continuous wave reflectometer) is only applicable to short patch cords. If you try using one on an installed cable plant, the integrated backscatter from the length of the fiber will overwhelm the connector back reflection: twenty km of fiber gives about the same backscatter as a connector with 20 dB ORL, and you cannot tell where the reflected power comes from! It is better to use an OTDR to find ORL problems on an installed cable plant.
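
For scale, optical return loss in dB relates to the fraction of power reflected back toward the source like this (a minimal sketch):

    def reflected_fraction(orl_db):
        # ORL = 10 log10(incident power / reflected power), so invert it
        return 10 ** (-orl_db / 10)

    print(reflected_fraction(20))   # 0.01 -> 20 dB ORL returns about 1% of the power
    print(reflected_fraction(40))   # 0.0001 -> a much lower-reflectance connection for comparison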