
Noise calculation in a DC environment

Question asked by HAW on Sep 2, 2016
Latest reply on Sep 6, 2016 by HAW

I am new to precision design and thus to noise analysis.

"Amplifier Noise Principles for the Practical Engineer" by Matt Duff was enlightening.

I also watched

EEVblog #528 - Opamp Input Noise Voltage Tutorial - YouTube 

Engineer It -- How to select an op amp based on datasheet noise specs - YouTube 

and read 

But I also have additional questions, especially about near-DC applications.

Do I assume correctly that I would get the full noise spectrum unless I cut it out with a filter?

Does noise adhere to Ohm's law? So if a 1 kOhm resistor has a 4 nV/rtHz NSD, then over a 100 Hz bandwidth it produces 400 nV rms noise, which corresponds to 400 nV / 1 kOhm = 400 pA rms current noise?
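As a quick numeric sanity check (note that rms noise scales with the square root of bandwidth, not the bandwidth itself, so the numbers below come out 10x smaller than in my question above):

```python
import math

def resistor_noise_nsd(r_ohms, temp_k=300.0):
    """Johnson-Nyquist voltage noise spectral density in V/rtHz: sqrt(4*k*T*R)."""
    k = 1.380649e-23  # Boltzmann constant, J/K
    return math.sqrt(4 * k * temp_k * r_ohms)

r = 1e3
nsd = resistor_noise_nsd(r)        # ~4.07e-9 V/rtHz for 1 kOhm at 300 K
vrms = nsd * math.sqrt(100.0)      # rms over 100 Hz = NSD * sqrt(BW) ~ 40.7 nV
irms = vrms / r                    # short-circuit current noise via Ohm's law ~ 40.7 pA
print(f"{nsd*1e9:.2f} nV/rtHz, {vrms*1e9:.1f} nV rms, {irms*1e12:.1f} pA rms")
```

(Dividing the voltage noise by R to get the short-circuit current noise is itself fine; it's the bandwidth scaling that needs the square root.)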

How about current noise in active filters? Wouldn't the large resistors typically used in them lead to an increase in noise?
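My intuition here, with made-up numbers (the 1 pA/rtHz figure is a hypothetical op amp spec, not from any datasheet): the current noise contribution is just the current NSD times the resistance it flows through, so it scales linearly with R.

```python
import math

def current_noise_contribution(i_nsd_a, r_ohms):
    """Voltage NSD (V/rtHz) produced by a current noise NSD flowing through a resistor."""
    return i_nsd_a * r_ohms

# Hypothetical op amp with 1 pA/rtHz input current noise:
small_r = current_noise_contribution(1e-12, 1e3)    # 1 nV/rtHz through 1 kOhm
large_r = current_noise_contribution(1e-12, 100e3)  # 100 nV/rtHz through 100 kOhm
# The 100 kOhm resistor also adds its own thermal noise (~40 nV/rtHz at 300 K),
# so scaling filter resistors up does scale both contributions up.
print(large_r / small_r)
```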

To test how well (or perhaps how badly ^_^) my understanding holds up, here is an abstract example calculation:

So assume I have an AD5780 DAC that holds constant output voltages for prolonged periods. Let's imagine I need a relatively large output current, which is why I add an AD8655. With a 1/f corner frequency at 1 kHz it's not an ideal op amp for DC, but there is no alternative.

So the DAC has ca. 70 nV/rtHz at ~0 Hz, the op amp 100 nV/rtHz.

Additionally, the DAC has an output impedance of 3.4 kOhm, so current noise needs to be accounted for too. The datasheet doesn't provide the number, so I wildly guess it to be 50 pA/rtHz, which gives 50 pA/rtHz * 3.4 kOhm = 170 nV/rtHz.

Also, at full scale there is probably noise coming from the voltage reference (ADR4550) at approx. 280 nV/rtHz.

The total noise: rt(70^2 + 100^2 + 170^2 + 280^2) ≈ 350 nV/rtHz
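The root-sum-square above, written out (this assumes all four sources are uncorrelated, which I believe is the standard assumption):

```python
import math

# RSS combination of uncorrelated noise sources, all in nV/rtHz:
sources = {
    "AD5780 DAC output": 70.0,
    "AD8655 op amp": 100.0,
    "DAC Z_out * guessed 50 pA/rtHz": 170.0,  # 50e-12 A/rtHz * 3.4e3 Ohm = 170 nV/rtHz
    "ADR4550 reference at full scale": 280.0,
}
total = math.sqrt(sum(v**2 for v in sources.values()))
print(f"total ~ {total:.0f} nV/rtHz")  # dominated by the largest source
```

Note how the 280 nV/rtHz reference dominates: removing the two smallest sources barely changes the total.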

Hope this calculation makes at least partial sense.


Now assume the same, but the DAC changes its output value 100 times per second. Then the noise is 350 nV/rtHz * 100 Hz = 35 µV rms. I suspect I've made a nonsense calculation here, but as the Chinese proverb says, "to ask is a moment's shame, not to know is a lifelong shame".
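For what it's worth, if the question is simply "what is the rms noise in a 100 Hz bandwidth" (assuming the 350 nV/rtHz density is flat over that band), the conversion would use the square root of the bandwidth, giving a number 10x smaller than my guess above:

```python
import math

nsd = 350e-9               # total noise spectral density, V/rtHz
bw = 100.0                 # bandwidth of interest, Hz
vrms = nsd * math.sqrt(bw) # rms noise = NSD * sqrt(BW), not NSD * BW
print(f"{vrms*1e6:.1f} uV rms")
```

Whether a 100 Hz update rate actually means a 100 Hz noise bandwidth is a separate question (the update steps themselves are signal, not noise), so I'd treat this only as the bandwidth-conversion part of the answer.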