Hi, has anyone been able to successfully add a Vaisala model HMP110 temperature/relative humidity sensor to a CR1000 datalogger? The relative humidity is dead-on accurate, but the temperature readings are far off (by 10+ degrees). We've used the following CRBasic command lines to take readings, and tried various multiplier and offset settings, but have not gotten an accurate temperature reading.
Any help is much appreciated. Thanks.
We have several HMP155 probes on our site. I assume the instructions are the same for the analog-signal model.
Here is the code I use on a CR1000:
If RHair>100 AND RHair<108 Then RHair=100 ' clamp slight over-range RH readings back to 100%
The program is the same, but the multiplier and offset will usually be different. For either of these probes the analogue outputs can be programmed in the sensor to scale to different ranges; this is normally done during manufacture.
You first need to check the documentation that came with the probe to work out what temperature range the probe is set to for the 0-1 V output, e.g. -40 to +60 C. Then you need to work out the math accordingly to give the answer you want. Normally the probes are set to a range of temperatures in deg C, so in the case of "District9RWIS" you need to allow for the fact that you want the answer in Fahrenheit.
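To make that math concrete, here is a small Python sketch (the -40 to +60 C range and the Fahrenheit conversion are just the examples mentioned above; check your own probe's documentation for its actual programmed range):

```python
# Derive the multiplier (slope) and offset (intercept) for a probe whose
# 0-1000 mV output maps linearly onto a known temperature range.

def scale(mv_lo, mv_hi, t_lo, t_hi):
    """Return (multiplier, offset) so that temp = multiplier * mV + offset."""
    mult = (t_hi - t_lo) / (mv_hi - mv_lo)
    offset = t_lo - mult * mv_lo
    return mult, offset

def c_to_f(c):
    """Convert deg C to deg F."""
    return c * 9 / 5 + 32

# Probe programmed for -40 to +60 C over 0-1000 mV:
m_c, b_c = scale(0, 1000, -40, 60)                  # multiplier 0.1, offset -40

# Same probe, but the answer wanted in Fahrenheit: convert the endpoints first.
m_f, b_f = scale(0, 1000, c_to_f(-40), c_to_f(60))  # multiplier 0.18, offset -40

print(m_c, b_c)  # 0.1 -40.0
print(m_f, b_f)  # 0.18 -40.0 (note that -40 C and -40 F coincide)
```

The two numbers for your range then go into the multiplier and offset fields of the logger's voltage measurement instruction.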
I have the very same problem.
Our settings are the ones suggested for the HMP60, and the sensor reads about 8-9 degrees C too low. The HMP110 is rated for -40 to +80 C while the HMP60 is only rated for -40 to +60 C. We have been using the VoltSe settings from the ForumCSI code snippets for the HMP60 (see above), but there seems to be an issue with the reference temperature and/or the temperature range that corresponds to the voltage range. Since the range for the HMP110 is wider, a given voltage translates into a higher temperature. In other words, 0.5 V on the HMP60 should be 10 C (dead middle of its range), while on the HMP110 it corresponds to 20 C. We have fixed this using a multiplier of 0.115 instead of the suggested 0.1 (for the HMP60). Not sure, though, that this is the right way to go about it. Would be glad to hear more about this.
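The size of that error checks out numerically. A quick Python sanity check (assuming both probes output 0-1000 mV over their rated ranges):

```python
# Apply both scalings to the same mid-scale reading (0.5 V = 500 mV).

mv = 500

# HMP60: -40 to +60 C over 0-1000 mV -> multiplier 0.1, offset -40
t_hmp60_scaling = 0.1 * mv - 40    # 10 C

# HMP110: -40 to +80 C over 0-1000 mV -> multiplier 0.12, offset -40
t_hmp110_scaling = 0.12 * mv - 40  # 20 C

# Using the HMP60 multiplier on an HMP110 therefore reads low,
# by 0.02 C per mV -- about 10 C at mid-scale.
print(t_hmp110_scaling - t_hmp60_scaling)
```

This suggests the correct HMP110 multiplier works out to 0.12 rather than 0.115.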
The datalogger instruction is based on the algebraic slope-intercept form y = mx + b. So if you can represent the sensor specs in this form, you will have everything correct.
For the HMP110 you state that the voltage output is 0 to 1000 mV for a corresponding -40 to +80°C. Here y represents the temperature in °C and x represents the mV output of the sensor. The y intercept b is the value of y where the sensor output is 0 mV; when the HMP110 outputs 0 mV, the corresponding temperature must be -40°C, so b, the offset, is -40 and we have y = mx - 40. At full scale, an output of 1000 mV corresponds to +80°C, or 80 = m(1000) - 40. Solving for the slope gives m = (80 + 40)/1000, or m = 0.12.
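The same derivation in a few lines of Python, as a check:

```python
# Slope-intercept fit for the HMP110: temp = m * mV + b

# At 0 mV the probe reads -40 C, so the intercept is:
b = -40

# At full scale, 1000 mV must read +80 C:  80 = m * 1000 + b
m = (80 - b) / 1000
print(m)            # 0.12

# Mid-scale check: 0.5 V (500 mV) should give 20 C
print(m * 500 + b)  # 20.0
```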
So the correct settings should be a multiplier of 0.12 and an offset of -40.
You should validate this against the sensor, which it sounds like you did at 20°C, i.e. 500 mV:
y = mx + b: y = 0.12(500) - 40 = 20°C