

RealTime function

Otemohu Nov 30, 2017 07:32 AM

Hi all,

I currently want to create a variable based on the datalogger's real-time clock.
My output format should be YYYYMMDDhhmmss.
If I use the RealTime function, I get the nine time variables, i.e. year, month, day, hour, minute, second, microsecond, day of week and day of year, and I rearrange the output time as follows:

Public rTime(9)
Public DesiredTime As String *17

RealTime (rTime)

DesiredTime = rTime(1) & rTime(2) & rTime(3) & rTime(4) & rTime(5) & rTime(6)

As is, the output time string will sometimes be shortened, because values below 10 have no leading zero. Here is an example:

If the time is 2017 12 01 15:09:27, I get 201712115927 and not 20171201150927.

Do you have a quick solution, without long conditional tests, to keep a leading zero when an rTime element is < 10?

All the best,

JDavis Nov 30, 2017 08:45 AM

I find it easier to use Status.TimeStamp(4,1) to get the timestamp as a string, then use the Mid function to extract the formatted pieces that you need.
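For example, a minimal sketch of that approach (the exact layout of the string returned by TimeStamp format 4 should be checked in the help, so the Mid positions below are assumptions):

Public TStr As String * 30
Public DesiredTime As String * 17

TStr = Status.TimeStamp(4,1)
'Assuming TStr reads like "2017-11-30 08:45:27" (check the help for format 4):
DesiredTime = Mid(TStr,1,4) & Mid(TStr,6,2) & Mid(TStr,9,2) & Mid(TStr,12,2) & Mid(TStr,15,2) & Mid(TStr,18,2)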

The help for the syntax is listed in the help index under TableName.TimeStamp

Otemohu Nov 30, 2017 09:17 AM

I've tested this solution but systematically get a delay, typically 1.08 s. I think I have to force the seconds to zero.

JDavis Nov 30, 2017 09:23 AM

Then go with your first approach, but use the FormatFloat instruction on each component of the timestamp to have the leading zeroes.
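A minimal sketch of that approach, reusing the declarations from the first post (the "%02.0f" format pads each value to two digits with a leading zero, and "%04.0f" pads the year to four):

RealTime (rTime)
DesiredTime = FormatFloat(rTime(1),"%04.0f") & FormatFloat(rTime(2),"%02.0f") & FormatFloat(rTime(3),"%02.0f") & FormatFloat(rTime(4),"%02.0f") & FormatFloat(rTime(5),"%02.0f") & FormatFloat(rTime(6),"%02.0f")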

Otemohu Dec 7, 2017 01:35 AM

OK, I did it, thank you. I will ask another question on the forum regarding a "time slicing" issue.

ArtHeers Jan 4, 2018 12:31 PM

Another function introduced to help with this is Sprintf(), which can be used more simply than FormatFloat, as with this example:

Public RT(9) As Long, RTStr As String
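The call itself might look like this (a sketch; the C-style %04u/%02u specifiers pad each field with leading zeros):

RealTime (RT)
Sprintf (RTStr,"%04u%02u%02u%02u%02u%02u",RT(1),RT(2),RT(3),RT(4),RT(5),RT(6))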

Otemohu Apr 12, 2018 04:03 AM


Here is a discussion we had some months ago. I have configured the CRBasic program of several dataloggers to record variables in a file.
I'm using TableFile ("USR:file.dat",11,7,0,1,Day,OutStat,Lastfilename) and allocate memory on the USR drive (typically 1.5 MB).
As you can see, a file is generated every day and I set MaxFiles to 7 (seven data files). When OutStat is true, then:

If OutStat Then
  Lastfiletime = file.TimeStamp(3,5)
  Desiredtime = timearray(3) & timearray(2) & timearray(1)  'timearray is filled elsewhere from RealTime
  Newfilename = "USR:file_" + Desiredtime
EndIf

So I get a file on the USR drive like "file_20180411.dat", normally generated just after midnight (?).
I configured a collection schedule in the Setup screen to retrieve each file after midnight, and I erase each file from the USR drive after that.
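For reference, a fuller sketch of the rename-on-OutStat pattern, using the Lastfilename variable returned by TableFile (variable names and date handling are illustrative assumptions):

Public rTime(9)
Public Desiredtime As String * 10
Public Newfilename As String * 40

If OutStat Then
  RealTime (rTime)  'date at the moment the file closes, i.e. just after midnight
  Desiredtime = FormatFloat(rTime(1),"%04.0f") & FormatFloat(rTime(2),"%02.0f") & FormatFloat(rTime(3),"%02.0f")
  Newfilename = "USR:file_" & Desiredtime & ".dat"
  FileRename (Lastfilename,Newfilename)
EndIf

Because OutStat becomes true just after midnight, a timestamp read at that moment carries the new day's date; if the file should be named after the day it contains, the date may need to be taken from the table's last record instead.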

I have two issues with this configuration:

1) The filename (e.g. file_20180411.dat) should normally contain data with timestamps of 11 April. But for some loggers, the filename carries the date of the following day (e.g. data with timestamps of 20180412 ends up in file_20180413.dat). I don't know why, or how I can manage this.

2) Another issue is that sometimes no files can be retrieved from the USR drive. In the last few days, I did not retrieve any files from the dataloggers: the USR drives were empty. I don't know what is happening. I suppose that if I re-run the program, I will get files again. If there is a problem in the network, what is the best way to keep files on the USR drive until LoggerNet can retrieve them? I suppose that setting MaxFiles to 7 allows a maximum of seven days of data files on the USR drive, and then, when LoggerNet becomes available, the data files are retrieved and erased?

I don't know if this whole strategy is the best one for producing daily timestamped files. Maybe I need an NL116 with a CF card to keep files in transient memory, without using the USR drive. What I have not managed so far is to generate files with NAN lines when there is no data from the datalogger, rather than files with missing lines.

What do you think about all of this?

All the best,
