Hi: I have an Isco 674 tipping bucket rain gauge. I used Short Cut for programming and initially had a scan rate of 1 minute, but during field calibration runs it was underestimating tips by a great deal (40-50%), missing pulses on P1.
I noticed that the sample code for a similar rain gauge (TE525) used a 5 second scan rate. To see if that made a difference, I changed my program to a 10 second scan rate. Only then did the Isco 674 properly account for the missing pulses.
Question: Based on the documentation, I assumed that the total of tip pulses would be counted independent of scan rate. Is that not the case, and if so, should the scan rate be lowered to something like 5 seconds for rain pulse data?
Let's see your code/instructions/data tables. A scan rate of anywhere from a few seconds to a few minutes should not make a difference.
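For what it's worth, the reason scan rate normally doesn't matter: the CR1000's pulse channels count switch closures in dedicated hardware between scans, so PulseCount() picks up every tip since the previous scan even at a 1 minute interval. Here's a toy model of that (Python, purely illustrative; the tip times are made up and the counter is simplified):

```python
# Illustrative only: the CR1000's pulse channels count tips in hardware
# between scans, so the program's scan rate does not change the total.
# This toy model reads an accumulating counter at two different intervals.

tips = [5, 47, 130, 131, 590, 1201, 1777, 2450, 3001, 3599]  # tip times (s)

def total_tips(tip_times, scan_interval_s, duration_s=3600):
    """Sum 'tips since last scan' reads, like PulseCount() does each scan."""
    total, last_count = 0, 0
    for t in range(scan_interval_s, duration_s + 1, scan_interval_s):
        hardware_count = sum(1 for tip in tip_times if tip <= t)
        total += hardware_count - last_count
        last_count = hardware_count
    return total

print(total_tips(tips, 10), total_tips(tips, 60))  # same total either way
```

If the totals differ in the field like yours did, suspect wiring, a loose terminal, or switch bounce rather than the scan rate.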
Hi: Code below for the 10 second scan. Nothing is different from before except the scan interval, which was 60 seconds.
'Created by Short Cut (4.0)
'Declare Variables and Units
Dim ModSecsWin As Long
Public SW12State As Boolean
Public Flag(8) As Boolean
Units PTemp_C=Deg C
Units Water_Temp_F=Deg F
'Define Data Tables
'Default CR1000 Datalogger Battery Voltage measurement 'BattV'
'Default CR1000 Datalogger Wiring Panel Temperature measurement 'PTemp_C'
'Type T Thermocouple measurements 'Water_Temp_F'
'Generic SDI-12 Sensor measurements 'Stage', 'Depth', 'Res3', and 'Res4'
'Generic Half Bridge, 4 Wire measurements 'HBr4W'
'Generic Single-Ended Voltage measurements 'AirTemp'
'Generic Single-Ended Voltage measurements 'Humidity'
'TE525/TE525WS Rain Gauge measurement 'Rain_in'
'SW12 Timed Control
'Get seconds since 1990 and do a modulo divide by 86400 seconds (24 hours)
ModSecsWin=Public.TimeStamp(1,1) MOD 86400
'Turn ON SW12 between 0700 hours and 1700 hours
If (ModSecsWin>=25200 And ModSecsWin<61200) Then
'Always turn OFF SW12 if battery drops below 11.5 volts
If BattV<11.5 Then SW12State=False
'Set SW12 to the state of 'SW12State' variable
'User Entered Calculation
WaterLevel_Inches= 45.502 - (Depth * 12)
'Call Data Tables and Store Data
Looks as if everything is configured properly. How are you calibrating the gauge? Tipping bucket sensors typically need to be calibrated very slowly to avoid errors; a tip or two per minute is about right. To get this slow rate, I usually calculate the volume of water needed to represent 1 inch of rainfall, convert that volume to mass, then weigh out the corresponding amount of ice and place it on the funnel screen. At room temperature or above, this melts at a nice slow rate.
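The arithmetic for that ice method works out like this (a quick Python check; the 8 inch funnel diameter and 0.01 in./tip resolution here are assumptions for illustration, so substitute your gauge's actual specs):

```python
import math

# Assumed gauge specs -- substitute your own:
orifice_diameter_in = 8.0   # funnel diameter (many gauges use 6 or 8 in.)
tip_resolution_in = 0.01    # rainfall depth per tip
rain_depth_in = 1.0         # depth of "rain" to simulate

# Volume of water equal to 1 inch of rain over the orifice
area_in2 = math.pi * (orifice_diameter_in / 2) ** 2
volume_ml = area_in2 * rain_depth_in * 16.387   # 1 in^3 = 16.387 mL

# Water is ~1 g/mL, so weigh out this many grams of ice
mass_g = volume_ml * 1.0
expected_tips = rain_depth_in / tip_resolution_in

print(f"weigh out ~{mass_g:.0f} g of ice; expect {expected_tips:.0f} tips")
```

Comparing the expected tip count against what ends up in the data table is the actual calibration check.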
Ah, that ice trick is a good idea. I weighed out water in a bottle with a nail in the top and a pin hole in the bottom; it would take at least an hour for a calculated 57 tips. I suspect I had a bad terminal connection at the bucket, which was causing the problem. I recalibrated it yesterday after much diagnosis, and I think I'm good to go. Back to a 1 minute scan with 60 minute recording of data.
It's also a good idea to install a 100 ohm resistor in series with the signal wire going to the logger, as suggested by Campbell Scientific, to minimize switch arcing and prolong switch life. https://s.campbellsci.com/documents/us/manuals/te525.pdf
I noticed that your data table is output at 1 minute after the hour. For hourly data, we would usually have it output at 0 minutes into the hour.
Finally, watching ice melt does eventually get tiring. Back when I had a lot of gauges in the field to test, I would carry ice and a small portable scale. Provided no rain was expected, after servicing the station and tipping bucket, I would put the prescribed amount of ice on the funnel and head to my next site. Then the next time I retrieved the data I would check to see that I had the right amount of tips. Don't forget to remove the calibration tips from the data file.
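One way to strip those calibration tips afterwards (a minimal Python sketch; the timestamp format, column layout, and calibration window are assumptions for illustration, adjust to match your actual data file):

```python
from datetime import datetime

def drop_calibration_rows(rows, cal_start, cal_end):
    """Remove records whose timestamp falls inside the calibration window.

    rows: list of (timestamp_string, rain_inches) tuples from the data file.
    """
    kept = []
    for ts, rain in rows:
        t = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")
        if not (cal_start <= t <= cal_end):
            kept.append((ts, rain))
    return kept

# Example: ice was placed on the funnel from 10:00 to 12:00
rows = [
    ("2024-05-01 09:00:00", 0.00),
    ("2024-05-01 10:00:00", 0.42),  # calibration tips
    ("2024-05-01 11:00:00", 0.58),  # calibration tips
    ("2024-05-01 13:00:00", 0.00),
]
clean = drop_calibration_rows(
    rows,
    datetime(2024, 5, 1, 10, 0),
    datetime(2024, 5, 1, 12, 0),
)
print(clean)  # only the 09:00 and 13:00 records remain
```

Keeping a note of the calibration date and time in the site log makes this step much easier months later.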