I am using the AD3246 to level-translate a unidirectional bus from 3.3 V down to 1.8 V. The bus runs at 150 MHz, and the device is physically placed in the middle of the bus.
I have connected BEn and SELn to ground, and VCC to 3.3 V.
My problem is that the output of the device sits around 2.1 V instead of 1.8 V. If I hold the input high for many clock cycles, the output does eventually settle to 1.8 V, but it takes about 200 ns to get there.
I've attached two scope shots to show what's happening:
"data17_r_at_LT.bmp" shows the 3.3V input to the device (purple) and the output (yellow). The top cursor (labelled By) is at 3.3 V and the bottom cursor (Ay) is at 1.8 V. The input sits at 3.3 V as expected, but the output is well above 1.8 V. Separately, I'm planning to increase the value of the series termination at the transmitter to reduce the overshoot and ringing.
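For context on that series-termination change: the usual sizing rule is that the series resistor plus the driver's output impedance should roughly match the trace's characteristic impedance, so the incident wave launches at half amplitude and settles after one round trip. A minimal sketch of that arithmetic, with placeholder numbers (the `Z0`, `R_DRIVER`, and `R_SERIES_OLD` values below are assumptions for illustration, not my measured values):

```python
# Source-series termination sizing: R_series + R_driver ~= Z0.
# All values are hypothetical placeholders, not measurements.
Z0 = 50.0           # assumed trace characteristic impedance, ohms
R_DRIVER = 20.0     # assumed transmitter output impedance, ohms
R_SERIES_OLD = 10.0 # assumed existing series resistor, ohms

# Target resistor so that driver + resistor matches the line:
r_series_new = Z0 - R_DRIVER

# How far the current network is from matched (a mismatch here
# shows up as overshoot/ringing at the far end of the line):
mismatch = Z0 - (R_DRIVER + R_SERIES_OLD)

print(f"suggested series R: {r_series_new:.0f} ohm")   # 30 ohm
print(f"current mismatch:   {mismatch:.0f} ohm")       # 20 ohm under-terminated
```

If the driver's actual output impedance isn't known, it can be estimated from the datasheet's VOL/IOL figures or by measuring the launch-step amplitude into a known load.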
"data17_r_at_LT_zoomout.bmp" shows the same 3.3 V input over a longer timebase. You can see it takes almost 200 ns for the output to drop down to 1.8 V.
Any ideas why the device isn't pulling the output down to 1.8 V faster?