I am using LTspice's built-in macromodel test fixture for the LTC2862-1 to simulate its operation. I modified it slightly in the following ways (a netlist sketch of these changes follows the list):
- Modelled the transmission line using the built-in LTRA (lossy transmission line) model, plugging the RLC values for Belden 3109A into it. Considered a length of 330ft (~100m); according to the datasheet, the device should work up to 450ft at 1MHz.
- Operating the device at 1MHz, i.e. sending a finite sequence of 10 pulses at 1MHz through one of the transceivers (at the DI pin), with the ~RE/DE pin driven high (driver enabled). The default test fixture uses 10MHz.
- The ground references of the two transceivers are tied together through a 1MΩ resistance (as opposed to the sinusoidal source between the two in the original fixture).
- Added parallel termination of 100Ω (nominal) at both transceivers.
- Added a capacitance of 375pF from lines A and B to their respective grounds, to model the effect of the TVS diode capacitance (SZ1SMA33AT3G): a total of 4 × 375pF caps.
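For concreteness, here is a minimal netlist sketch of the modifications above. The node names (A1/B1, A2/B2, DI, GND1/GND2), the 3.3V logic level, and the per-foot RLC values are illustrative assumptions; substitute the actual Belden 3109A datasheet figures and the node names used by the test fixture:

```
* 330ft of Belden 3109A as a single differential LTRA line
* (placeholder per-foot values for a ~120 Ohm pair; use the datasheet RLC)
O1 A1 B1 A2 B2 CABLE
.model CABLE LTRA(R=0.03 L=166n C=11.5p len=330)

* DI drive: finite burst of 10 pulses at 1MHz (8th PULSE argument = Ncycles)
VDI DI GND1 PULSE(0 3.3 0 5n 5n 0.5u 1u 10)

* Grounds tied through 1 MOhm, replacing the fixture's sinusoidal source
RGND GND1 GND2 1Meg

* 100 Ohm (nominal) parallel termination at both ends
RT1 A1 B1 100
RT2 A2 B2 100

* TVS diode capacitance (SZ1SMA33AT3G), 375pF per line to local ground
C1 A1 GND1 375p
C2 B1 GND1 375p
C3 A2 GND2 375p
C4 B2 GND2 375p
```

Note that the LTRA element is a single two-conductor line, so wiring it across A/B models only the differential mode; the common-mode path through the 375pF caps is only loosely represented. If that turns out to matter, two ground-referenced lines or a lumped RLC ladder may be a better model.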
On running the simulation, the pulses on the transmission line show an unexplained ringing or spiking effect at the midpoint of their "on-times". The ringing does not start at the beginning of the pulse sequence; it appears only after the 4th pulse. After the DI (data input) pulse sequence at the transmitter has finished, the residual ringing on the transmission lines causes the reception of "additional" pulses at the receiver's RO (Receiver Output).
This is observed only in the 1MHz mode; at the default 10MHz data rate the device passes the pulses cleanly. Note that the "spikes" occur in the middle of each pulse's on-time, not during the rising or falling edges.
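In case it helps reproduce the observation, these are the analysis directives I would use, assuming the node names from the sketch above, an RO node called RO2 on the receiving side, and a 1.65V threshold (half of an assumed 3.3V supply):

```
* Simulate the 10us burst plus 10us of post-burst settling
.tran 0 20u 0 1n

* Plot V(A2,B2) (differential line voltage) and V(RO2) (receiver output)

* Flag any spurious RO2 edge after the burst ends at 10us
.meas tran textra FIND time WHEN V(RO2)=1.65 TD=10.5u CROSS=1
```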
Please advise whether this points to an issue with the model/simulation, or whether it is an expected electrical phenomenon.