I am using the ADG3246 to level translate a unidirectional bus from 3.3V down to 1.8V. The bus runs at 150 MHz. The device is physically placed in the middle of the bus.
I have connected BEn and SELn to ground, and VCC to 3.3V.
My problem is that the output of the device is around 2.1V instead of 1.8V. I tried applying a constant high level over many clock cycles and found that the output does eventually settle to 1.8V, but it takes about 200 ns.
I've attached two scope shots to show what's happening:
"data17_r_at_LT.bmp" shows the 3.3V input to the device (purple) and the output (yellow). The top cursor (labelled By) is at 3.3V, the bottom cursor (Ay) is at 1.8V. You can see that the input sits around 3.3V as normal. But the output is much higher than 1.8V. FYI, I'm planning on increasing the value of the series termination at the transmitter to try to reduce the overshoot and ringing.
"data17_r_at_LT_zoomout.bmp" shows a 3.3V input over a longer timebase. You can see it takes almost 200ns for the output to drop down to 1.8V.
Any ideas why the output isn't dropping to 1.8V quickly enough?
Can you please send a schematic of the circuit?
I am interested in the impedance of the driver and the load seen by the device.
I sent you a private message, please have a look.
Apologies for my late reply.
I have answered your private message. Please send a simplified schematic and equivalent circuits for the load and driver impedances.
Again, I replied to your private message but haven't heard back. As I mentioned in my earlier message, I need your email address so I can attach the information, since I can't attach files in private messages.
I have looked at the schematic and it looks fine. You may want to add line terminations on the B port side of the ADG3246.
I suspect that the issue is caused by either the ground lead inductance of your probe, improper line terminations, or an incorrect configuration of the FPGA I/O pins.
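On the probe point: a long ground lead forms a series LC resonator with the probe's input capacitance, which can make a clean edge look like large overshoot and ringing on the scope even when the signal on the board is fine. A back-of-the-envelope check, assuming roughly 10 nH of inductance per centimetre of ground lead and a 10 pF probe (illustrative numbers, not measurements from this setup):

```python
import math

# Estimate the ringing frequency of the probe ground-lead loop.
# Rule of thumb: ~10 nH of inductance per cm of ground lead.
# All values below are assumptions for illustration only.
lead_length_cm = 10.0       # assumed ground-lead length
L = 10e-9 * lead_length_cm  # henries, assumed lead inductance (~100 nH)
C = 10e-12                  # farads, assumed probe input capacitance

f_ring = 1.0 / (2.0 * math.pi * math.sqrt(L * C))  # LC resonant frequency
print(f"Estimated ringing frequency: {f_ring / 1e6:.0f} MHz")
```

With those assumptions the resonance lands in the low hundreds of MHz, i.e. right in the band of a 150 MHz bus's edges. Swapping the ground lead for a short ground spring at the probe tip is a quick way to rule this out.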