
Remote diagnosis of program skipping scans

artyb Apr 23, 2020 03:05 AM


We have a logger that has been installed for a long time, which has started skipping scans. There are some failed sensors, but I haven't spotted parts of the program which would be delayed by that, as they are analogue measurements. The logger is reporting skipped main scans. The main part of the program I can see that might have delays is a section communicating over RS-232, but that runs in a slow scan, so it shouldn't cause the main scan to skip?

I wondered if there is any way to debug remotely which part of the program is causing this? Clearly the skipped-scan counters show whether it is the main scan or the slow scans, but I need to determine which lines or sections cause the issue. I've previously done this by sending a modified program containing lots of timers to see how long execution has taken to each point. Is there any other way to do this which would disrupt the data collection less, such as LoggerNet debugging tools? There have been a few watchdog errors on the logger over the years, so maybe it is a logger hardware issue?

Thank you

JDavis Apr 23, 2020 08:20 AM

Your best clues are the values in the Status table. Look at MeasureTime, ProcessTime, and MaxProcTime. The values are in microseconds. If the program compiles in pipeline mode, measurement and processing can happen at the same time. In sequential mode, you add them together.
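As a minimal sketch (the variable names here are my own, and I'm assuming the usual `Status.FieldName(1,1)` syntax for reading Status-table fields), you can copy those timing fields into Public variables so they can be watched in a numeric monitor or stored to a table:

```crbasic
'Sketch: mirror Status-table timing fields into Public variables.
'Variable names are hypothetical; scan interval is just an example.
Public MeasTm As Long, ProcTm As Long, MaxPTm As Long, Skipped As Long

BeginProg
  Scan (5,Sec,0,0)
    MeasTm  = Status.MeasureTime(1,1)  'us spent measuring in the last scan
    ProcTm  = Status.ProcessTime(1,1)  'us spent processing in the last scan
    MaxPTm  = Status.MaxProcTime(1,1)  'worst-case processing time seen so far
    Skipped = Status.SkippedScan(1,1)  'running count of skipped main scans
  NextScan
EndProg
```

In sequential mode, if MeasTm + ProcTm exceeds the scan interval (in microseconds), the main scan will skip.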

Be mindful of any instruction like SerialIn with a timeout parameter. If a sensor has failed, the instruction will wait out its maximum timeout on every scan.
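To illustrate the point about timeouts (a sketch only — the port, baud rate, and timeout value are assumptions, not from the original posts): SerialIn's TimeOut parameter is in units of 0.01 seconds, so a failed sensor that never sends its terminator makes the instruction block for the full timeout each scan.

```crbasic
'Sketch: a dead RS-232 sensor forces SerialIn to wait out its full timeout.
'Port, baud rate, and timeout are hypothetical values.
Public RawString As String * 64

BeginProg
  SerialOpen (ComC1,9600,0,0,200)
  Scan (10,Sec,0,0)
    'TimeOut = 500 means up to 5 s (units of 0.01 s) waiting for a CR
    'terminator. If the sensor is dead, every scan pays the full 5 s.
    SerialIn (RawString,ComC1,500,&h0D,64)
  NextScan
EndProg
```

A 5-second wait inside a 10-second main scan leaves little headroom, so one slow instruction like this can account for skipped scans on its own.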

artyb Apr 23, 2020 10:20 AM


Thanks for the reply. The program is sequential, and those values add up to more than the main scan interval, which agrees with the skipping of main scans, but it doesn't help me determine where in the program the issue arises.

JDavis Apr 23, 2020 12:41 PM

Look at which measurements on the station are giving incorrect values, and focus on the instructions used to measure those. Particularly digital measurements, because those have timeouts which can be long.

artyb Jun 29, 2020 07:25 AM


For the benefit of others searching the forum: I think the 'InstructionTimes' instruction comes fairly close to what I was looking for, though it would still involve sending a new program.
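For reference, a minimal sketch of how InstructionTimes is used (the array size of 200 is an assumption — the destination array must be declared As Long and be at least as long as the number of instructions in the program):

```crbasic
'Sketch: InstructionTimes fills an array with the execution time of each
'instruction in microseconds. Array size 200 is an assumed upper bound on
'the instruction count for this example.
Public InstTimes(200) As Long

BeginProg
  Scan (10,Sec,0,0)
    'measurements and processing go here...
    InstructionTimes (InstTimes)
  NextScan
EndProg
```

After a scan, scanning the array for the largest values points at the instructions consuming the most time.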


violetteta8 Jul 7, 2020 03:05 PM

EDIT: Pardon me for missing the second portion of your initial comment, artyb. I just realized you said you've done exactly as I'm describing below. I will leave my comment here for others to see if they choose. Hope you've figured out your skips!

For what it's worth, I chose not to use "InstructionTimes" for my process timing because our program is massive, some 8,500+ lines, so an array that large would become extremely cumbersome. We are constantly updating our program with new instruments and QA/QC logic, and we use conditional compiling, so the number of instructions changes; unless there is a way for CRBasic to populate an "NLines" variable, we would likely run into Variable Out of Bounds errors.


Public InstTimesVar(NLines) As Long 'use NLines to define the array length

Is it possible to define the number of lines to a variable?

Anyway, without debugging variables built in, it can be difficult to narrow down which specific task is taking excess time (depending on how many measurements you're performing). So, if you're going to send a program anyway, here's the way we chose to do it. We used the Timer instruction and created Start/End variables for each major task that could get hung up (a failed instrument, a serial timeout, etc.). We then Sample all the variables to a table each scan, so we have a record of all process times every scan interval. This gives us a running record of timing as well.




Public ScanStart, ScanEnd 'Scan start/end times
Public Task1Start, Task1End, Task1Tot 'Start, end, and total time for task 1
Public Task2Start, Task2End, Task2Tot 'Start, end, and total time for task 2

Public DebugTimer 'Timer number used by the Timer instruction

DataTable (Debugger,True,-1) 'Per-scan record of task timings
   CardOut (0,1440) 'Store a day's worth of records
   Sample (1,ScanEnd,FP2)
   Sample (1,Task1Start,FP2)
   Sample (1,Task1End,FP2)
   Sample (1,Task1Tot,FP2)
   Sample (1,Task2Start,FP2)
   Sample (1,Task2End,FP2)
   Sample (1,Task2Tot,FP2)
EndTable



ScanStart = Timer(DebugTimer,mSec,2) 'Reset and start timer, in mSec
ScanStart = ScanStart/1000 'Convert mSec to seconds

Task1Start = Timer(DebugTimer,mSec,4) 'Read timer at beginning of task 1
Task1Start = Task1Start/1000 'Convert to seconds

'Perform Task 1 logic here

Task1End = Timer(DebugTimer,mSec,4) 'Read timer at end of task 1
Task1End = Task1End/1000 'Convert to seconds
Task1Tot = Task1End - Task1Start 'Total time spent in task 1

Task2Start = Timer(DebugTimer,mSec,4) 'Read timer at beginning of task 2
Task2Start = Task2Start/1000 'Convert to seconds

'Perform Task 2 logic here

Task2End = Timer(DebugTimer,mSec,4) 'Read timer at end of task 2
Task2End = Task2End/1000 'Convert to seconds
Task2Tot = Task2End - Task2Start 'Total time spent in task 2

ScanEnd = Timer(DebugTimer,mSec,4)/1000 'Read timer at end of scan, in seconds

CallTable (Debugger) 'Store the timing variables for this scan


This may still be a bit cumbersome depending on how many tasks you have, but it allows flexibility for modifications without dealing with the array sizes and line counts that the InstructionTimes instruction would require.

Hope you find this helpful,

