I am working on a PicoZed SDR2 Rev. E. I am trying to verify the performance of the AD9361 by reproducing the 2.4 GHz transmit-chain test measurements from page 4 of the attached datasheet. I would like to clarify the transmitter 2.4 GHz test case on that page. The measurement was taken with a 1 MHz tone, as stated in the test condition column.
Here comes my questions:
1) Is the 1 MHz tone generated from the DDS? If so, what scale is it set to in the OS driver and in the no-OS driver?
2) How was the maximum output power obtained? By setting the Tx attenuation to 0 dB? In both the no-OS and OS drivers, I could not find any setting to increase the Tx gain other than reducing the Tx attenuation.
3) For the modulation accuracy (EVM), how did you capture the measurement? As I understand it, EVM (error vector magnitude) measures the error offset of the IQ samples positioned on the constellation diagram and is usually expressed in %, but according to the datasheet the unit is dB. So which EVM are you referring to? I am using an MXA Signal Analyzer N9020A for my measurements.
4) For the carrier leakage there are two readings. Are they aligned with the test conditions as follows: with Tx attenuation set to 0 dB, a carrier leakage of -50 dBc is observed, and with Tx attenuation set to 40 dB, a carrier leakage of -32 dBc is observed. Am I right?
5) The noise floor of -156 dBm/Hz is specified at a 90 MHz offset. My question is: what is this 90 MHz offset about? How does it affect the noise floor, where does this offset come from, and how do I set it?
6) Regarding the isolation figure, what is the isolation about? What do you mean by "TX1 to TX2: 50 dB"?
7) For the test case with the TX LO set to 800 MHz, I noticed the reference clock used is 19.2 MHz. As far as I know, the PicoZed SDR2 uses an onboard 40 MHz crystal. So to verify the 800 MHz test specification in the datasheet, would I need to inject an external 19.2 MHz reference clock? Is there any reason why the reference clock used for the 800 MHz test case differs from the 2.4 GHz and 5.5 GHz cases?
The reason I am asking is that I have noticed some performance differences between the OS-driver and no-OS-driver approaches. My application will also require injecting an external reference clock. Before moving forward, I must be sure that the test specifications stated in the datasheet hold for both the OS and no-OS approaches (this may determine which approach I take, based on the performance observed).
1. Yes, the tone is generated by the DDS; the scale is set to 0 dBFS.
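For reference, the Linux IIO DDS exposes the tone amplitude as a linear scale (a fraction of full scale), and 0 dBFS corresponds to a scale of 1.0. A minimal sketch of the conversion, assuming the scale is a simple full-scale fraction:

```python
import math

def scale_to_dbfs(scale):
    """Convert a linear DDS scale (0 < scale <= 1, fraction of full scale) to dBFS."""
    if scale <= 0:
        raise ValueError("scale must be > 0")
    return 20 * math.log10(scale)

def dbfs_to_scale(dbfs):
    """Convert a dBFS level back to a linear full-scale fraction."""
    return 10 ** (dbfs / 20)

print(scale_to_dbfs(1.0))            # full scale -> 0.0 dBFS
print(round(scale_to_dbfs(0.5), 2))  # half scale -> about -6.02 dBFS
```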
2. Yes, your understanding is correct: maximum power is obtained with the Tx attenuation set to 0 dB.
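As a rough model, the AD9361 Tx output power is the board's maximum output power minus the programmed attenuation (the attenuator spans 0 to 89.75 dB in 0.25 dB steps). A sketch, where the maximum-power figure passed in is an assumed board-level value, not a number from this thread:

```python
def tx_output_power_dbm(max_power_dbm, atten_db):
    """Estimate Tx output power as nominal maximum minus programmed attenuation.

    max_power_dbm is an assumed board-level figure for illustration only;
    check your own datasheet/measurement for the real value.
    """
    if not (0.0 <= atten_db <= 89.75):
        raise ValueError("AD9361 Tx attenuation range is 0 to 89.75 dB")
    return max_power_dbm - atten_db

# Illustrative only: assuming ~7.5 dBm maximum output at 2.4 GHz
print(tx_output_power_dbm(7.5, 0.0))   # 0 dB attenuation -> maximum power
print(tx_output_power_dbm(7.5, 40.0))  # 40 dB attenuation -> 40 dB lower
```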
3. EVM can be represented either as a percentage or in dB. You can capture the IQ data in VSA format and play it back in the VSA software from Keysight.
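The two EVM representations carry the same information: EVM in dB is 20·log10(EVM as a fraction), so 1 % EVM is -40 dB. A small sketch of the conversion:

```python
import math

def evm_pct_to_db(evm_pct):
    """Convert EVM in percent to dB: 20 * log10(EVM / 100)."""
    return 20 * math.log10(evm_pct / 100.0)

def evm_db_to_pct(evm_db):
    """Convert EVM in dB back to percent."""
    return 100.0 * 10 ** (evm_db / 20.0)

print(evm_pct_to_db(1.0))    # 1 % EVM  -> -40.0 dB
print(evm_db_to_pct(-40.0))  # -40 dB   -> 1.0 %
```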
5. It is the noise level measured at a 90 MHz offset from the carrier/tone that was set.
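A density in dBm/Hz can be turned into a total noise power over a measurement bandwidth with P = density + 10·log10(BW), assuming the noise is flat across that bandwidth. A minimal sketch:

```python
import math

def integrated_noise_dbm(density_dbm_hz, bandwidth_hz):
    """Total noise power in a bandwidth, assuming a flat density in dBm/Hz."""
    return density_dbm_hz + 10 * math.log10(bandwidth_hz)

# -156 dBm/Hz over a 1 MHz bandwidth -> -96 dBm of integrated noise
print(integrated_noise_dbm(-156.0, 1e6))
```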
6. It indicates how much coupling can occur onto the other channel: "TX1 to TX2: 50 dB" means a signal on TX1 appears 50 dB lower on TX2.
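In other words, the power leaking onto the other channel is simply the transmitted power minus the isolation figure. A trivial sketch with an assumed TX1 power level:

```python
def coupled_power_dbm(tx_power_dbm, isolation_db):
    """Power leaking onto the other channel, given channel-to-channel isolation in dB."""
    return tx_power_dbm - isolation_db

# Illustrative: transmitting 0 dBm on TX1 with 50 dB TX1-to-TX2 isolation
print(coupled_power_dbm(0.0, 50.0))  # -50 dBm appears on TX2
```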
7. It does not make a difference; it is just a test condition.