
# 6/1/16: 1/f noise increases at lower frequencies proportional to 1/f

If 1/f noise increases at lower frequencies proportional to 1/f, why don't you get infinite 1/f noise in dc coupled systems?

[edited by: lallison at 3:54 PM (GMT -4) on 6 Jun 2022]
• It is true that 1/f noise power density increases at 10dB per decade once you are below the 1/f corner. Assuming that 1Hz is well below the 1/f corner, you can calculate the total rms noise over a band as: total rms noise = en,1Hz × √ln(fH/fL). Consider a dc-to-10Hz system with 100nV/√Hz at 1Hz: it has about 150nVrms in the decade from 1 to 10Hz. It also has 150nVrms from 0.1 to 1Hz (215nVrms total once you combine that with the 1-to-10Hz decade, since the decades add root-sum-square). Every decade of frequency below contributes another 150nVrms.
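As a quick sanity check, the numbers above can be reproduced with a short script (a sketch assuming the example's 100nV/√Hz spot density at 1Hz):

```python
import math

def pink_noise_rms(en_1hz, f_low, f_high):
    """Total rms noise of a 1/f (pink) noise source integrated from
    f_low to f_high, given the spot density at 1 Hz. The power density
    goes as 1/f, so the integral of en^2/f is en^2 * ln(f_high/f_low)."""
    return en_1hz * math.sqrt(math.log(f_high / f_low))

en = 100e-9  # 100 nV/sqrt(Hz) at 1 Hz, the example value from the post

print(pink_noise_rms(en, 1.0, 10.0))   # one decade: ~150 nVrms
print(pink_noise_rms(en, 0.1, 10.0))   # two decades: ~215 nVrms
```

Note that doubling the number of decades only multiplies the total by √2, which is why each extra decade matters less and less.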

The limit becomes the aperture time: how long you watch the output of your circuit. If you turn on the circuit and watch the output for 100 seconds, the lowest frequency you can observe is 1/(100s) = 0.01Hz. You quickly reach the lifetime of your circuit, and by then the 1/f noise hasn't added that much.

Here is a simple graph of our example going up to 1 BILLION seconds, which is about 31.7 years. The total noise is only 480nVrms. Clearly, over this time period our total uncertainty would depend much more on long-term drift than on the 1/f noise.
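The graph's endpoint can be reproduced from the same formula. A minimal sketch (again assuming the 100nV/√Hz-at-1Hz, dc-to-10Hz example) tabulating total rms noise versus aperture time:

```python
import math

en_1hz = 100e-9   # 100 nV/sqrt(Hz) at 1 Hz, the example value
f_high = 10.0     # upper band edge of the dc-to-10Hz example

# The lowest observable frequency is 1/aperture_time, so the total
# rms noise becomes en_1hz * sqrt(ln(f_high * t)).
for t in [100, 1e4, 1e6, 1e9]:  # aperture time in seconds
    total = en_1hz * math.sqrt(math.log(f_high * t))
    print(f"t = {t:.0e} s: {total * 1e9:.0f} nVrms")
```

Even after seven extra decades of observation time (100 s to 1e9 s), the total grows only from roughly 260nVrms to roughly 480nVrms, which is the point of the answer.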