Q
We use two different voltages (digital 3.3V and analog 5.2V) to power the codec.
With our onboard power supply we have no problems, but during the
production test we noticed a problem with the power-on sequence.
We have established that the codec doesn't work if a few milliseconds
elapse between powering on the 5.2V and powering on the 3.3V supply.
The problem only occurs if the 5.2V comes up first.
Now our questions:
What is the reason for this effect, and
how much time is allowed between powering on the 5.2V and the 3.3V?
A
There are two possible causes of the failures that I can see. It could be power-supply sequencing. We actually recommend that AVDD and DVDD are powered up
together. On power-up, AVDD should follow DVDD until DVDD reaches its final
value of 3.3V, and then AVDD should continue to its final value of 5.2V.
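The recommended ramp can be checked against captured supply waveforms. The sketch below is illustrative only (the function name, sample format, and tracking tolerance are assumptions, not from the AD73311 datasheet): it flags a sequence where AVDD leads DVDD while DVDD is still ramping, which is exactly the failing case described in the question.

```python
# Illustrative check of the recommended power-up sequencing:
# AVDD should follow DVDD until DVDD reaches 3.3V, then continue to 5.2V.
# The 0.3V tracking tolerance is an assumed value for the sketch.

DVDD_FINAL = 3.3
TRACK_TOL = 0.3  # assumed allowed lead of AVDD over DVDD during the ramp

def sequence_ok(samples):
    """samples: list of (dvdd, avdd) voltage pairs sampled over time.
    While DVDD is still below its final value, AVDD must not lead it
    by more than TRACK_TOL; afterwards AVDD may ramp on to 5.2V."""
    for dvdd, avdd in samples:
        if dvdd < DVDD_FINAL and avdd > dvdd + TRACK_TOL:
            return False
    return True

# AVDD tracks DVDD to 3.3V, then continues to 5.2V: acceptable
good = [(0.0, 0.0), (1.5, 1.5), (3.3, 3.3), (3.3, 4.5), (3.3, 5.2)]
# 5.2V supply comes up first (the failing case in the question)
bad = [(0.0, 5.2), (1.5, 5.2), (3.3, 5.2)]
print(sequence_ok(good))  # True
print(sequence_ok(bad))   # False
```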
The second possibility is that something else happens in the intervening few
milliseconds between powering 5.2V and powering 3.3V. Is it possible that the
DSP or micro attempts to communicate with the AD73311 before DVDD has been
established? It's critical that the absolute maximum ratings are respected:
the digital inputs must remain between -0.3V and DVDD + 0.3V at all times.
Check that there is no high level on the SDI, SDIFS and EN pins before DVDD
is established.
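That absolute-maximum window can also be verified against logged pin voltages. This is a hypothetical helper (the function name and sample format are assumptions): it reports the samples where a digital input such as SDI, SDIFS or EN exceeds -0.3V to DVDD + 0.3V, e.g. when a DSP drives a pin high before DVDD is up.

```python
# Flag digital-input samples that violate the absolute maximum rating
# of -0.3V to DVDD + 0.3V stated in the answer above.

def input_violations(samples):
    """samples: list of (dvdd, vin) pairs for one digital input pin.
    Returns the indices of samples outside the allowed window."""
    return [i for i, (dvdd, vin) in enumerate(samples)
            if not (-0.3 <= vin <= dvdd + 0.3)]

# DVDD still at 0V while the pin is already driven to 3.3V: violation
print(input_violations([(0.0, 3.3), (3.3, 3.3)]))  # [0]
```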
The final chapter in all our seminar books is dedicated to hardware design
techniques and deals with such issues as grounding, decoupling, parasitic
thermocouples and good PCB design.
http://www.analog.com/support/standard_linear/seminar_material/index.html