Campbell::Comm

Campbell Scientific's CR10 family of dataloggers has been heavily used by WERC and many other research groups for years. Though now superseded by the CR1000 line of loggers, the CR10 line will likely be in service for years to come.

The CR10 loggers were provided with complete documentation, both for developing the datalogging programs themselves and for interfacing with other systems via serial communications. At WERC, we used this documentation to develop code in the Perl programming language for talking to these datalogging systems, with connections via serial radios (FreeWave) and the Internet.

Note that Campbell provides the LoggerNet programs (formerly PC208, among others) to interface dataloggers to PCs; WERC's Campbell::Comm and related programs serve a more limited function than LoggerNet/PC208, being primarily concerned with reliably downloading data and checking/setting the time.

WERC's code is posted online at http://www.uaf.edu/water/staff/irving/csi-code/, including a README file. This system has been in use for several years, downloading data from up to 50 remote loggers hourly in several different radio networks.

This code has not been packaged for distribution or automatic installation, but it is intended to be available under the GNU General Public License (GPL).

Technical descriptions of the primary components follow...

Note that this Perl code has been written, tested, and run only on Linux computers; Perl itself is very portable across platforms, but porting problems (e.g., to the Windows OS family) might be encountered in such areas as process management (fork, exec), alarm timing, and possibly others.


Campbell/Comm.pm

This module becomes part of a Perl program via the use or require keywords. Using Perl's object-oriented programming facilities, it provides the machinery to connect to a remote datalogger and to send commands and receive responses over that connection. Individual methods are provided corresponding to many (though not all) of the telecommunications commands documented in the Campbell Scientific CR10 family manuals.
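As a rough illustration of the calling style, a program using the module might look something like the sketch below; the constructor arguments and method names (new, open, get_status, close) are hypothetical stand-ins, since the actual interface is documented in the module's source and README rather than here.

    #!/usr/bin/perl
    use strict;
    use warnings;

    use Campbell::Comm;                  # the module described above

    # NOTE: every method name below is an illustrative guess, not the real API.
    my $logger = Campbell::Comm->new(
        site    => 'slope_mtn',          # one of the configured sites
        timeout => 60,                   # seconds of inactivity allowed
    );

    $logger->open                        # connect via TCP/IP, serial, or modem
        or die "cannot reach datalogger\n";

    my @status = $logger->get_status;    # e.g. the logger's status strings
    print "$_\n" for @status;

    $logger->close;                      # drop the connection cleanly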

Another Perl module, Net::Telnet (available from CPAN), is needed in order to interact with remote dataloggers. This module automates the telnet protocol, but in essence it boils down to being able to send commands to the remote system and then wait for and handle the responses -- or the absence of a response, if that needs handling.
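Net::Telnet's own interface looks roughly like this; the host, port, and prompt handling below are illustrative, not lifted from Campbell::Comm.

    use strict;
    use warnings;
    use Net::Telnet;

    # Connect to a terminal server fronting the radio network (made-up host/port).
    my $t = Net::Telnet->new(
        Timeout => 30,                   # see the timeout discussion below
        Errmode => 'return',             # report failures instead of dying
    );
    $t->open(Host => 'radio-gateway.example.edu', Port => 3001)
        or die "open failed: ", $t->errmsg, "\n";

    $t->put("\r");                       # send a bare carriage return to wake the line
    my ($before, $match) = $t->waitfor('/\*/');   # wait for the CR10 "*" command prompt
    warn "no prompt seen: ", $t->errmsg, "\n" unless defined $match;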

Besides the nominal TCP/IP connections, Net::Telnet is capable of supporting interfaces over a direct serial line or a modem connection. This usage is implemented in the open() method (or methods).
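One way a direct serial connection can be wired up (a sketch under the assumption that the line has already been configured, not necessarily how the open() methods actually do it) is to open the device as an ordinary filehandle and hand it to Net::Telnet via its fhopen() method:

    use strict;
    use warnings;
    use Net::Telnet;

    my $device = '/dev/ttyS0';           # example serial port

    # Assume the line was configured beforehand, e.g. with:
    #   stty -F /dev/ttyS0 9600 raw -echo
    open(my $fh, '+<', $device) or die "can't open $device: $!";

    my $t = Net::Telnet->new(Timeout => 30, Errmode => 'return');
    $t->fhopen($fh);                     # talk over the serial line instead of TCP

    $t->put("\r");                       # wake the remote end
    my ($before, $match) = $t->waitfor('/\*/');   # wait for the "*" prompt, as above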

A bug in Net::Telnet (this is my view; it may be considered a feature by the authors...) requires a patch; the patch should be reflected in the Net::Telnet version included in the source tree above. Specifically, Net::Telnet goes out of its way to handle timeouts using wall time (i.e., absolute time as measured on a wall clock) rather than the relative inactivity time used by the underlying select() system function. In our case the latter form is required because of the many levels and sources of latency in the communication system.
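The distinction matters because a slow radio link can legitimately go quiet for long stretches without being dead. The loop below sketches the desired behaviour in plain Perl, using IO::Select: the timeout applies to each wait for new bytes, so it effectively resets whenever anything arrives. This illustrates the idea only; it is not the patch itself.

    use strict;
    use warnings;
    use IO::Select;

    # Read from $fh until the CR10 "*" prompt appears, giving up only after
    # $idle_limit seconds pass with no new data (an inactivity timeout),
    # rather than after a fixed wall-clock deadline.
    sub read_until_prompt {
        my ($fh, $idle_limit) = @_;
        my $sel = IO::Select->new($fh);
        my $buf = '';

        while ($buf !~ /\*/) {
            my @ready = $sel->can_read($idle_limit)   # waits at most $idle_limit seconds
                or return undef;                      # nothing new arrived: give up
            sysread($fh, my $chunk, 4096)
                or return undef;                      # connection closed or errored
            $buf .= $chunk;                           # data arrived, so the clock restarts
        }
        return $buf;
    }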

The Perl Expect module might provide a good alternative to Net::Telnet if Campbell::Comm were to be rewritten. Both provide the key functionality of being able to wait for and selectively match on responses from the remote system.

Due to the specific need to communicate with dataloggers via FreeWave radios, the Campbell::Comm module was written with this capability perhaps too implicitly. A rewrite (anticipated) should remove any code not directly concerned with the Campbell dataloggers.

querylogger.pl

This Perl script was written to exercise and test the Campbell::Comm module, but it may be useful in its own right as a way to interface with dataloggers in batch (i.e., non-interactive) form. Each method in the Campbell::Comm module should be represented by a querylogger command that can be given on the command line.

The interface is described if run with no arguments:

   $ querylogger   
   syntax: "$ /home/ken/bin/querylogger OPTIONS SITE COMMAND ...
   where OPTIONS are
       --quiet      suppress output to STDOUT
       -q           same as above
       --logfile=f  print everything to file f
       -l f         same as above
       --comm=s     use rc file section Campbell::Comm-s, or
       -c s         use rc file section Campbell::Comm-s
       --identify   identify site in output (default)
       --noidentify don't identify site in output
   where SITE is one of: site, slope_mtn,
   and COMMANDs may include:
       status       list status strings
       time         show time (setting not yet supported)
       backup=n     n=number of output arrays to spew (no limit)
       MPTR=loc     loc=output storage location (no limit)
       data=n       n=number of data records to get
       binary=n     n=number of data locations to get
       program      print all logger programming 
       flags        optionally toggle specified flag(s), show flags
       ports        show ports (toggle not yet supported)
       memory       show mode A memory settings
       signatures   show mode B signature settings
       input=n[,m][,r-s]  show input storage locations
       capture=[f]  end previous capture, capture output to file f
       get_time_flags experimental using F command
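
For example, a batch status-and-time check of the slope_mtn site, logging everything to a file (the file name here is only an example), might be invoked as:

    $ querylogger --logfile=slope_mtn.log slope_mtn status time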

The querylogger (and loggerdata) scripts assume a specific system configuration of sites and rc files; see below...


loggerdata.pl

The loggerdata.pl script is the workhorse of WERC's system for gathering data from CR10X loggers; aside from the querylogger.pl script (see above), it is the only script we have written around the Campbell::Comm module.

See the data site architecture section below for the site layout assumed by the system.

Like querylogger.pl, loggerdata.pl runs in a non-interactive or batch mode. But where the former program is driven according to the set of commands provided on the command line, the latter is hard-wired to follow a set sequence of commands.

As with any computer program, the details of what this program does are spelled out in its source code. The following steps are from memory and should be confirmed against that source; a rough sketch of the central download loop follows the list.

  • a connection is established to the datalogger
  • several escape characters are sent in case the datalogger has been left in an edit or other state
  • the datalogger command prompt, "*", is obtained
  • the "next data memory location" (DSP) is retrieved from the site state on disk (not from the logger)
  • the datalogger's current data memory position (MPTR) is obtained
  • the G logger telecommunications command is used to set the MPTR to the last DSP value
  • data is requested in either ASCII (the D command) or binary (the F command) mode
  • incoming data is integrity-checked using checksums or signatures
  • confirmed-good data is stored on disk
  • the stored DSP state information is updated
  • the above data query steps are repeated until:
    • the DSP value reaches the logger's MPTR
    • the query times out
    • the allowed processing time for the session elapses
  • the datalogger's time is queried
    • if the logger time is off by more than a specified amount (e.g., 10 seconds), the time is set
  • if the logger program hasn't been downloaded for more than a specified period (e.g., 10 days), the program is downloaded
    • the logger program is stored in a version control system (RCS)
  • the datalogger's flag settings are queried and stored
  • the connection to the remote logger is dropped
  • a log message summarizing the session is composed
    • the log message is printed if the session has been run by a user
    • the log message is appended to the site log file
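
The heart of that sequence is the loop that advances from the stored DSP toward the logger's MPTR. The sketch below is illustrative only: the methods called on $logger and $state are invented stand-ins, and the real bookkeeping lives in loggerdata.pl itself.

    use strict;
    use warnings;

    # Sketch of the central download loop; every method name here is a
    # hypothetical placeholder, not Campbell::Comm's actual interface.
    sub download_session {
        my ($logger, $state, $session_limit) = @_;

        my $dsp  = $state->{dsp};           # stored "next data memory location"
        my $mptr = $logger->get_mptr();     # logger's current data memory position
        $logger->set_mptr($dsp);            # "G" command: back up to where we left off

        my $deadline = time() + $session_limit;   # allowed processing time
        while ($dsp != $mptr && time() < $deadline) {
            my $block = $logger->get_data()        # "D" (ASCII) or "F" (binary) query
                or last;                           # query timed out: stop for now
            last unless $block->{checksum_ok};     # integrity check failed: retry next run
            $state->store($block->{records});      # confirmed-good data goes to disk
            $dsp = $block->{next_location};
            $state->{dsp} = $dsp;                  # update the stored DSP
            $state->save();
        }
        return $dsp == $mptr;               # true if we caught up with the logger
    }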

In summary, the loggerdata.pl program:

  • tries to download data from the last available location
    • all data retrieved is integrity-checked, then stored
  • checks (and possibly sets) the time
  • possibly downloads the datalogger's current program

In general, we make every attempt to ensure that data is downloaded as it exists on the datalogger, and safely stored on the data-gathering computer. Data is not skipped; in particular, if a remote site has "fallen behind", the program will patiently and persistently try to bring the local (on-computer) data store in line with the remote logger.

The datalogger's time setting is of particular importance mainly so that connection times can be kept in synch, assuming that the radios are powered on only at certain times. Of course, another benefit in checking/setting the logger's time is that the stored data timestamps are meaningful in an absolute sense -- but note that this could also be achieved by only checking (and not setting) the time.
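
In code terms, the check/set decision is just a threshold comparison against the data-gathering computer's clock, along the lines of the fragment below; the method names and the 10-second threshold are examples, not the real implementation.

    # Sketch only: decide whether to correct the logger's clock.
    sub maybe_set_time {
        my ($logger, $max_drift) = @_;             # e.g. $max_drift = 10 seconds
        my $offset = $logger->get_time() - time(); # logger clock minus computer clock
        if (abs($offset) > $max_drift) {
            $logger->set_time(time());             # keeps radio-on windows aligned
        }
        return $offset;                            # worth logging in either case
    }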




data site architecture

THIS IS A STUB .... NEEDS WORK!