Post

Group Delay Performance when DSA at Maximum Attenuation State? Why TTD, not Phase Shifter?

Thread Summary

The user asked about TTD performance as the DSA transitions between its maximum and minimum attenuation states, and about the rationale for using TTD rather than phase shifters in the 27-31 GHz and 17-21 GHz bands. The support engineer confirmed that the time delay holds up well across attenuation states and explained that, even though these bandwidths are relatively narrow, the operating frequency range is wide enough to justify true time delay over phase shifting.
AI Generated Content
Category: Hardware
Product Number: ADAR3000, ADAR3001

Dear ADI's experts,

I would like to ask about the TTD performance when the DSA state changes from the maximum attenuation state to the minimum attenuation state. Does the TTD still work well, especially at the DSA's maximum attenuation state?

Another thing: the 27-31 GHz and 17-21 GHz bands are not very wide, so beam squint may not be a big problem. Why not use a phase shifter instead of TTD?

Many Thanks for the feedback!

  • Hi Kim,

The time delay holds up well as the attenuation is changed; maintaining that phase compliance was an important design target. While we still make phase-shifter-based beamformers for very narrowband applications, we decided that for this application there was enough operating frequency range to justify true time delay.
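
    To see why the frequency range matters, here is a minimal sketch (not ADI reference material; the 29 GHz center frequency and 60° scan angle are assumed for illustration) of the beam squint a pure phase-shifter array would exhibit across the 27-31 GHz band. Phase shifters are set for one frequency, so the beam drifts as frequency moves off center; TTD has no such drift by construction.

    ```python
    import math

    def squint_deg(f_hz, f0_hz, scan_deg):
        """Pointing error (degrees) at frequency f for a phase-shifter
        array whose phases were computed to steer to scan_deg at f0.
        A TTD beamformer would return 0 at every frequency."""
        s = (f0_hz / f_hz) * math.sin(math.radians(scan_deg))
        return math.degrees(math.asin(s)) - scan_deg

    # Assumed example: phases set at a 29 GHz center, 60 deg scan.
    f0, scan = 29e9, 60.0
    for f in (27e9, 29e9, 31e9):
        print(f"{f / 1e9:.0f} GHz: squint = {squint_deg(f, f0, scan):+.2f} deg")
    ```

    Even over this "narrow" 4 GHz band, the band-edge squint at a 60° scan works out to several degrees, which can be comparable to the beamwidth of a large array; that is the kind of trade the reply is pointing at.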