I'm trying to use the AD5933 to measure electrochemical impedance. I've been able to measure impedance successfully and am generally getting good use out of the part for my application. However, I've been seeing some odd behavior in all my tests: the output voltage appears to be 1/10 of the expected value. I can change the output range via I2C, but I only get 200, 100, 40, and 20 mV p-p, rather than the expected 2, 1, 0.4, and 0.2 V p-p. I've attached some oscilloscope screenshots to show what I mean; they were probed at the output and input pins of the AD5933.
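For concreteness, here's roughly how I build the control-register high byte that selects the output range. The bit positions and the range/command encodings below are my reading of the datasheet's control register map, so they're worth double-checking:

```python
# Sketch of the AD5933 control register high byte (address 0x80).
# D15-D12 = command nibble, D10-D9 = output range, D8 = PGA gain.
# Bit values below are my reading of the datasheet -- please verify.

RANGE_BITS = {   # (D10, D9) -> nominal output voltage, V p-p
    0b00: 2.0,   # Range 1
    0b11: 1.0,   # Range 2
    0b10: 0.4,   # Range 3
    0b01: 0.2,   # Range 4
}

def control_high_byte(command: int, range_bits: int, pga_gain_x1: bool = True) -> int:
    """Assemble the control high byte: command in D15-D12, range in D10-D9, PGA in D8."""
    assert 0 <= command <= 0xF and 0 <= range_bits <= 0b11
    return (command << 4) | (range_bits << 1) | (1 if pga_gain_x1 else 0)

# Example: "start frequency sweep" (command 0b0010) at Range 1, PGA gain x1
start_sweep_byte = control_high_byte(0b0010, 0b00)
```

So in my code, selecting Range 1 should request the 2 V p-p output, which is what I'm not seeing on the scope.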
My circuit is an exact copy of the analog front end in AN-1252 (http://www.analog.com/media/en/technical-documentation/application-notes/AN-1252.pdf ), with RFB = 1.15 kohm. That said, I don't think the front end matters here, since I'm probing the low signal directly at the AD5933 output pin, before it reaches the op amp or the DUT.
My software executes the following I2C commands to turn on the AD5933:
1. Reset
2. Assign start frequency
3. Assign frequency increment
4. Assign number of frequency increments
5. Assign settling-time cycles
6. Enter standby mode
7. Initialize with start frequency
8. Start frequency sweep
9. Turn off the AD5933
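In case it helps with diagnosis, here's a stripped-down sketch of what those steps look like as register writes in my code. The bus-write layer is stubbed out (I just build a list of (register, byte) pairs), the register addresses, command nibbles, and frequency-code formula are taken from the datasheet, and the helper names are my own:

```python
# Sketch of my AD5933 sweep setup as an ordered list of (register, byte) writes.
# Addresses / commands / formula are from the datasheet; verify against your copy.

MCLK = 16.776e6  # internal clock; the DDS core runs from MCLK/4

def freq_code(f_hz: float, mclk: float = MCLK) -> int:
    """24-bit frequency code per the datasheet formula: f / (mclk/4) * 2**27."""
    return int(round(f_hz / (mclk / 4.0) * (1 << 27)))

def to_bytes(value: int, n: int) -> list:
    """Split a register value into n bytes, MSB first."""
    return [(value >> (8 * i)) & 0xFF for i in reversed(range(n))]

def sweep_setup(f_start_hz, f_inc_hz, n_increments, settle_cycles):
    """Return the writes for steps 1-8, in order. Range/PGA bits are
    zeroed in the command bytes here for brevity."""
    seq = [(0x81, 0x10)]  # 1. reset (D4 of control low byte)
    seq += zip((0x82, 0x83, 0x84), to_bytes(freq_code(f_start_hz), 3))  # 2. start freq
    seq += zip((0x85, 0x86, 0x87), to_bytes(freq_code(f_inc_hz), 3))    # 3. freq increment
    seq += zip((0x88, 0x89), to_bytes(n_increments, 2))                 # 4. # of increments
    seq += zip((0x8A, 0x8B), to_bytes(settle_cycles, 2))                # 5. settling cycles
    seq += [(0x80, 0xB0),   # 6. standby              (command 0b1011)
            (0x80, 0x10),   # 7. init with start freq (command 0b0001)
            (0x80, 0x20)]   # 8. start sweep          (command 0b0010)
    return seq  # step 9 (power down, command 0b1010) goes out after the sweep ends
```

That's the order I send things in; if one of those command bytes or the sequencing looks wrong, that could well be my problem.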
I've tried to follow the flowchart given in the datasheet (http://www.analog.com/media/en/technical-documentation/data-sheets/AD5933.pdf ), but I noticed that some other application and circuit notes describe slightly different procedures.
I've observed this with several different AD5933 parts, so I know it isn't a bad part. I'm guessing it's a software issue, but I'm not sure where. It feels like something simple that I'm overlooking. Has anybody seen anything like this, or have any ideas?
Thanks in advance!