Can you talk a little bit about what software defined radio is and explain its typical elements?
Software Defined Radio (SDR) is a radio in which the signal chain is partially implemented in software.
In practical terms, it will have some or all of these …
In practice that means that the signal is digitized by an A/D converter at either IF or baseband and the filtering, demodulation, timing recovery, decryption, and AGC functions etc. are performed in software. In a transmitter the modulated signal would be created in software and then moved to baseband, IF, or RF by a high-speed DAC for transmission.
The fun part is that the analog portion of the signal chain must be linear for all of this to work: the converter has to "see" a clean signal, or at least one that it can work with. So dynamic range vs. cost is still the name of the game and the big tradeoff.
You're right, SDR applies to wideband signals with further multi-band processing.
I think the imperfection of SDR consists in digitizing at the carrier frequency, which limits the usable bandwidth and is expensive in resources.
I think an ADC may digitize the information in wideband signals; it may be possible to solve this by applying ADCs with an amplitude (or RF) detector function.
You brought up a couple of good points.
First, SDR applications are typically for wideband signals. To take this a step further, it's generally because wideband systems are standards-based and the standard imposes "controlled" conditions on the radio. That is, the standard usually imposes test conditions for the radio in the presence of interferers and specifies the signal levels (or includes enough information so the levels can be worked out). With test conditions and known signal levels, plus any margin added by the customer (usually the cellular carriers), it's possible for a systems engineer to generate a set of realizable requirements for a communications system.
Now, for non-standards-based systems, or at least those that do not impose specific test conditions (some military and 2-way radio applications, for instance), the environment is more-or-less unknown from the beginning, and the system has to be designed around worst-case "use cases" and then field-tested in prototype form. In the Boston area, for example, new amateur VHF radios were field-tested on the section of US 95 between Needham and Waltham, locally known in the amateur radio community as "Intermod Alley".
Second, it's still impractical to place the ADC at the antenna because of the jitter requirements this places on the clock.
This gives me an excuse to post links to an application note (AN-501) on aperture jitter:
and a tutorial on converting VCO phase noise to jitter:
Note in the tutorial that the integration bandwidth is important (especially when reading data sheets for clock products -- not all manufacturers use the same bandwidths, so make sure you compare products using the same integration bandwidths) and that the integrated wideband phase noise, not the close-in phase noise, is usually the largest contributor to the total jitter.
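As a concrete illustration of that integration, here is a sketch (my own, not taken from the tutorial) that converts a tabulated SSB phase-noise curve to RMS jitter by integrating the linear power over a chosen offset band:

```python
import numpy as np

def phase_noise_to_jitter(offsets_hz, l_dbc_hz, f_carrier_hz):
    """Integrate an SSB phase-noise curve L(f), given in dBc/Hz versus
    offset frequency, into RMS jitter in seconds."""
    linear = 10.0 ** (np.asarray(l_dbc_hz, dtype=float) / 10.0)
    d = np.diff(np.asarray(offsets_hz, dtype=float))
    # Trapezoidal integration of the linear power over the offset band
    power = np.sum(0.5 * (linear[1:] + linear[:-1]) * d)
    # RMS phase error in radians; the factor of 2 counts both sidebands
    phi_rms = np.sqrt(2.0 * power)
    # Divide by the carrier angular frequency to get seconds of jitter
    return phi_rms / (2.0 * np.pi * f_carrier_hz)
```

With these units, a flat -150 dBc/Hz floor integrated from 10 kHz to 10 MHz on a 100 MHz clock works out to roughly 0.22 ps RMS, and the result depends strongly on where the integration stops, which is exactly the point about matching integration bandwidths.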
I fully agree that converting at RF/IF requires minimal clock jitter, but for converting a wideband signal at frequencies much lower than RF/IF, the jitter demand is much lower.
ADCs with an amplitude (or RF) detector function can convert the wideband signal.
For example, the ADC shown in Fig. 1 works without a clock generator.
The ADC forms its clock from the input RF/IF signal; in that case the jitter comes from the transmitter and the communication channel.
Just to be clear, the jitter requirement due to aperture uncertainty is determined by the highest signal frequency seen by the ADC, whether it is at RF, IF, or baseband, and of course by the required SNR. At RF and IF, the upper sideband sets the highest frequency.
ADCs with an RF-detector function work simply as a synchronous detector and create a digitized image of the wideband signal simultaneously.
For better understanding I will send drawings; I hope it will be clear why the jitter demand is essentially lower.
Here are a couple of relevant tutorials on sampling and ADCs, and an application note that pulls it all together.
First, on ADCs and the Nyquist criterion: http://www.analog.com/static/imported-files/tutorials/MT-002.pdf
Second, on ADCs and aperture uncertainty: http://www.analog.com/static/imported-files/tutorials/MT-007.pdf
And an application note, AN-756, that helps tie it all together, "Sampled Systems and the Effects of Clock Phase Noise and Jitter": http://www.analog.com/static/imported-files/application_notes/5847948184484445938457260443675626756108420567021238941550065879349464383423509029308534504114752208671024345AN_756_0.pdf
The key point in all of this discussion is that the timing jitter creates the aperture uncertainty, that is, uncertainty in when the ADC samples. This uncertainty creates a jitter-dependent variation in amplitude that degrades the ADC's output signal-to-noise ratio. The higher the frequency being sampled, the less jitter can be tolerated for a given signal-to-noise ratio.
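That frequency-versus-jitter trade can be put in one line using the standard jitter-limited SNR relation for a full-scale sine wave (the relation covered in MT-007; the function wrapper is mine):

```python
import math

def jitter_limited_snr_db(f_signal_hz, jitter_rms_s):
    """SNR ceiling, in dB, set by RMS aperture jitter alone when
    sampling a full-scale sine wave: SNR = -20*log10(2*pi*f*tj)."""
    return -20.0 * math.log10(2.0 * math.pi * f_signal_hz * jitter_rms_s)
```

For example, 1 ps RMS of jitter caps a 100 MHz input at roughly 64 dB of SNR, and doubling either the input frequency or the jitter costs 6 dB.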
Hope this helps!
Thanks for your post with the tutorials; you suppose that I am insufficiently educated.
I can explain the problem of jitter with one drawing, Fig. 3, without formulas, and I still suppose that
ADCs will probably develop not only in their characteristics but also in their functions.