Recently purchased an SUV5 UV radiometer and I'm trying to hook it into my CR3000 (no RS-485 on the CR3000, so I have to use the analog output; no big deal). I have a few other K&Z radiometers, so I followed a similar route to set this one up.
I put green/V+ into a diff channel high and brown/V- into the diff channel low, power into 12V, grounds into grounds, etc., and set up the measurement as follows:
SUV5_Sens = 97.15 uV/W/m^2 from my calibration sheet, so I put in the 1000 to convert from mV to uV and then divide by the sensitivity constant, as I've done with the rest of my K&Z radiometers. I used the mV1000 range because the manual states that the output voltage varies between 0 and 1 volt for -100 to 400 W/m^2.
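In CRBasic, a measurement instruction matching that description would look something like this (a sketch only; the channel number, scan settings, and variable names are assumptions, not from the original post):

```
Public SUV5_UV            'UV irradiance, nominally W/m^2
Const SUV5_Sens = 97.15   'sensitivity from the calibration sheet, uV/W/m^2

'In the Scan: differential measurement on channel 1, mV1000 range.
'The multiplier converts mV to uV, then divides by the sensitivity.
VoltDiff (SUV5_UV,1,mV1000,1,True,0,_60Hz,1000/SUV5_Sens,0)
```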
The problem, though, is that this creates a fantastically high value. The current output from the sensor is about 285 mV, and applying the multiplier of 1000/97.15 uV/W/m^2 balloons that to about 2930 W/m^2, which is clearly silly. So, something isn't right.
Should this be a VoltSE measurement, making it different from my other K&Z radiometers? I'm trying to avoid going back to the field this afternoon to rewire and test that, in case this is just a simple programming issue and I'm missing something obvious.
Hi. The SUV5 output is a straight-line relationship: 0 to 1000 mV corresponds to -100 to 400 W/m^2.
So, in the VoltDiff instruction your multiplier would be 0.5 (a 500 W/m^2 span over 1000 mV) and your offset -100. This should give you the outputs you expect.
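A minimal CRBasic sketch of that instruction, assuming the same differential channel 1 wiring as the original post (variable name and scan settings are placeholders):

```
Public SUV5_UV   'UV irradiance in W/m^2

'0-1000 mV maps to -100 to 400 W/m^2, so:
'multiplier = (400 - (-100)) / 1000 = 0.5 W/m^2 per mV, offset = -100
VoltDiff (SUV5_UV,1,mV1000,1,True,0,_60Hz,0.5,-100)
```

As a sanity check, the 285 mV reading mentioned above would then work out to 285 * 0.5 - 100 = 42.5 W/m^2, a plausible UV irradiance.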