GP
This is the first utility program given to us by the programmers at EE Internet. Turns out this program just does the html pages and not the graphs. To run it you'll need to install two extra python packages: Kid and pytz. Instructions for that are on the python page. To run gp from the command line you give python the script name plus the name/location of an initialization file:
$ python gp.py ADOT/dot.ini
This setup is Bob's modification of the materials we got from the EEI people. For this run I copied everything off the EEI page and put it all into the directory
/work/python/gws/gp
Important directories you'll be needing and filling:
This is where the html created by gp.py will go. On your first run a directory will be created for each station.
/work/python/gws/gp/ADOT/html
This is where all of the source data files from loggernet are contained. There is no directory structure here; everything is lumped into one directory.
/work/python/gws/gp/ADOT/data
It appears this directory, although listed in the configuration file, is not currently used by gp.py.
/work/python/gws/gp/ADOT/plots
In addition, for the first run there is some code at the end of gp.py that tries to send an email. I commented that section out (lines 314 to 358 in revision 143 of gp.py).
Here's a working initialization file.
[Main]
networkName = ADOT
# Where the configuration files lie
configDir = /work/python/gws/gp/ADOT
# File with all the station information
stationInfoCsv = StationInfo.csv
# Directory containing the template files
templateDir = /work/python/gws/gp/ADOT
# The base dir for rendered files
outputDirBase = /work/python/gws/gp/ADOT/html
# Base URL of the data files (http://, ftp://, file:///home/data, etc)
#dataFileBaseURL = http://werc.engr.uaf.edu/~ken/nslope-adot/
# dataFileBaseURL = http://www.gwscientific.com/remote/adot/
dataFileBaseURL = file:///work/python/gws/gp/ADOT/data/
# Name of the diagnostics files
diagOutputName = adot-diag.html
# Are we generating the "stripped down" include files?
generateInclude = True
# Where are the graphs going?
graphOutputDirBase = /work/python/gws/gp/ADOT/plots
diagURL = http://www.eeinternet.com/dot/html/adot-diag.html
baseStationOrder = bullen,foothills

[DataRecency]
threshold = 0.25
notifyEmail = fnkci@uaf.edu,jkugler@eeinternet.com

[bullen]
baseStationName = Bullen Point Project
stationOrder = DBM1,DBM2,DBM3,DBM4,DBM5,DBM6,DBM7,DBM8,DBR1,DBR2,DBR3,DBR4,DBR5

[foothills]
baseStationName = Kuparuk Foothills Project
stationOrder = DFM1,DFM2,DFM3,DFM4,DFM5,DFR1,DFR2,DFR3
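For reference, this is roughly how a file like that gets read; gp.py presumably does something along these lines with Python's ConfigParser. The section and option names below come from the file above, but the code itself is just my sketch, not theirs (Python 2 style, same era as Kid/gp.py):

import ConfigParser

# Minimal sketch of reading dot.ini
config = ConfigParser.ConfigParser()
config.read('/work/python/gws/gp/ADOT/dot.ini')

network = config.get('Main', 'networkName')                 # 'ADOT'
data_url = config.get('Main', 'dataFileBaseURL')            # the file:/// URL above
groups = config.get('Main', 'baseStationOrder').split(',')  # ['bullen', 'foothills']

# Each group name is itself a section holding the station list for that project
for group in groups:
    print group, config.get(group, 'stationOrder').split(',')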
A bit more on this gp business. It appears to contain main() for this utility, but the bulk of the work is done in gputil.py, which contains the object definitions for the system. Everything is pretty much done through the objects. The sneaky catch is that there's a filename hardcoded into the object: StationInfo.csv. StationInfo.csv basically contains the metadata for all of these sites in csv format (header on the first line). Cracks me up a bit since they're also using the kid module, which I thought was for flexible XML use, but whatever I guess. So, from reading the ADOT StationInfo.csv file below you can get a feel for how all of this info gets used in various ways in the junk output to the html pages:
StationName,PakbusID,StationDescriptiveName,StationLocationDescription,Latitude,Longitude,MagneticDeclination(East),ElevationFt,ElevationM,DataFileDirectory,DataFileNameBase,DataFileSuffixes,OutputDir,TemplateMap
DBM1,201,Accomplishment Creek Met,"Accomplishment Ck, below lake at pass to Rubicon",N 68 24.696,W 148 8.190,22.6,4833,1474.065,,DBM1_,"HrlyAtms.dat,HrlyDiag.dat,HrlySubs.dat",DBM1,dot
DBM2,202,Ribdon Met,Upper Rubicon,N 68 38.548,W 147 21.107,23.1,4648,1417.64,,DBM2_,"HrlyAtms.dat,HrlyDiag.dat,HrlySubs.dat",DBM2,dot
DBM3,203,Juniper Met,Upper Juniper Ck,N 69 4.570,W 146 30.294,23.8,4324,1318.82,,DBM3_,"HrlyAtms.dat,HrlyDiag.dat,HrlySubs.dat",DBM3,dot
DBM4,204,Sag-Ivishak Met,"Close to snow survey site UP1, aka FT1",N 69 12.933,W 148 33.116,22.9,1414,431.27,,DBM4_,"HrlyAtms.dat,HrlyDiag.dat,HrlySubs.dat",DBM4,dot
DBM5,205,Upper Kad Met,Kadleroshilik uplands,N 69 32.968,W 147 56.505,23.5,686,209.23,,DBM5_,"HrlyAtms.dat,HrlyDiag.dat,HrlySubs.dat",DBM5,dot
DBM6,206,Kavik Met,Kavik camp,N 69 40.402,W 146 54.034,24.1,649,197.945,,DBM6_,"HrlyAtms.dat,HrlyDiag.dat,HrlySubs.dat",DBM6,dot
DBM7,207,Lower Kad Met,"Kadleroshilik River, near the old runway",N 70 4.406,W 147 39.000,23.8,78,23.79,,DBM7_,"HrlyAtms.dat,HrlyDiag.dat,HrlySubs.dat",DBM7,dot
DBM8,208,Bullen Met,South of Bullen Pt,N 70 4.792,W 146 49.166,24.3,86,26.23,,DBM8_,"HrlyAtms.dat,HrlyDiag.dat,HrlySubs.dat",DBM8,dot
DBR1,211,TBD,TBD,,,,,,,DBR1_,"",DBR1,dot
DBR2,212,Ribdon Rep,Somewhere around Lupine valley?,N 68 36.337,W 147 23.716,23.2,N/A,N/A,,DBR2_,"HrlyAtms.dat,HrlyDiag.dat",DBR2,dot
DBR3,213,Pogopuk Rep,Between Gilead Ck and Niviak Pass,N 69 17.369,W 146 22.990,23.3,N/A,N/A,,DBR3_,"HrlyAtms.dat,HrlyDiag.dat",DBR3,dot
DBR4,214,Kavik Rep,"South of Kavik Camp, next to winter trail",N 69 37.893,W 146 52.889,23.4,1436,437.98,,DBR4_,"HrlyAtms.dat,HrlyDiag.dat",DBR4,dot
DBR5,215,Franklin Bluffs Rep,"Franklin Bluffs, NE side",N 69 48.633,W 148 19.540,23.1,875,266.875,,DBR5_,"HrlyAtms.dat,HrlyDiag.dat",DBR5,dot
DFM1,601,South White Hills Met,WERC HV5 snowsurvey site,N 69 12.043,W 149 33.508,22.4,962,293.41,,DFM1_,"HrlyAtms.dat,HrlyDiag.dat,HrlySubs.dat",DFM1,dot
DFM2,602,White Hills Met,WERC White Hills previous repeater site,N 69 29.187,W 149 49.284,22.5,,,,DFM2_,"HrlyAtms.dat,HrlyDiag.dat,HrlySubs.dat",DFM2,dot
DFM3,603,North White Hills Met,Between lake and stream to NNE of White Hills,N 69 42.892,W 149 28.227,22.6,276,84.18,,DFM3_,"HrlyAtms.dat,HrlyDiag.dat,HrlySubs.dat",DFM3,dot
DFM4,604,North West Kuparuk Met,WERC H03 snowsurvey site,N 69 56.851,W 149 55.014,22.7,406,123.83,,DFM4_,"HrlyAtms.dat,HrlyDiag.dat,HrlySubs.dat",DFM4,dot
DFM5,605,TBD,TBD,,,,,,,DFM5_,"",DFM5,dot
DFR1,611,Kak Rep,Kakukturat Mtn,N 69 4.357,W 149 30.870,22.3,1667,508.435,,DFR1_,"HrlyAtms.dat,HrlyDiag.dat",DFR1,dot
DFR2,612,Slope Mnt Rep,Slope Mtn Repeater Site (relocated to top),N 68 44.448,W 149 1.989,,3858,1176.69,,DFR2_,"HrlyAtms.dat,HrlyDiag.dat",DFR2,dot
DFR3,613,Shell Pingo Rep,Shell Pingo Top,N 70 1.234,W 147 40.903,23.8,408,124.44,,DFR3_,"HrlyAtms.dat,HrlyDiag.dat",DFR3,dot
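To get a feel for what gputil.py has to pull out of that file, here's a bare-bones sketch of loading it with the standard csv module. This isn't their code, and the bit at the end about composing the .dat filenames is my guess from the DataFileNameBase and DataFileSuffixes columns:

import csv

# Sketch: load StationInfo.csv into one dict per station, keyed by StationName,
# using the header row for the field names.
stations = {}
with open('/work/python/gws/gp/ADOT/StationInfo.csv', 'rb') as f:
    for row in csv.DictReader(f):
        stations[row['StationName']] = row

# The data files appear to be DataFileNameBase + each suffix,
# e.g. DBM1_HrlyAtms.dat, DBM1_HrlyDiag.dat, DBM1_HrlySubs.dat
info = stations['DBM1']
suffixes = [s for s in info['DataFileSuffixes'].split(',') if s]
datafiles = [info['DataFileNameBase'] + s for s in suffixes]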
September 9 after work update
Will continue fleshing this out but had some more thoughts to add.
So, there are generally five parts of input:
- the .ini, which controls general directory locations, stations to process, station groupings etc.
- stationinfo.csv which contains station specific information (including data input file names)
- the .dat files (listed in stationinfo.csv)
- the html templates, which are turned into regular html using the python package kid.
- another key file that basically maps aliases between the variables common to every station and the (possibly different) names those variables have in each specific .dat file (different ways of identifying air temperature, for example); see the sketch below.
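I haven't dug into the format of that alias file yet, but conceptually it boils down to a lookup like this (all the names below are made up for illustration, not from the real file):

# Hypothetical illustration of the alias idea, not the real file format:
# each station's .dat file may call the same quantity something different,
# and the alias file maps those back to one common name.
ALIASES = {
    'air_temperature': ['AirTC', 'AirT_C', 'T_air'],   # made-up column-name variants
}

def canonical_name(column_name):
    # Return the common variable name for a station-specific column, if we know it
    for common, variants in ALIASES.items():
        if column_name == common or column_name in variants:
            return common
    return column_name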
So, changing the way plots are displayed probably involves copying and then modifying one of the html templates. This could also allow annotations to be pulled in from a separate include file, for example. Doing some early QA/QC stuff is even easier. Just write a new script (or point to datasets already QA/QCed from the O.O.-based file system Ken's been working on) that does the qaqc on the data and puts it all into a new file that has a first level of processing done to it; something like the sketch below. No need to change this program at all. The things that would change would be the .ini and stationinfo file, with the general gp.py framework remaining in place.
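As a rough illustration of that idea, a first pass could be as simple as this: read one logger .dat file, null out values that are physically impossible, and write a cleaned copy for gp.py to point at. The column name and limits are placeholders, and it pretends the .dat file has a single header row, which the real loggernet files may not:

import csv

# Sketch of a standalone first-pass QA/QC step (placeholder column name and limits,
# and it assumes a simple one-line csv header just to keep the sketch short).
LIMITS = {'AirTC': (-60.0, 40.0)}   # hypothetical column and plausible range, deg C

def qaqc(infile, outfile):
    with open(infile, 'rb') as src, open(outfile, 'wb') as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            for col, (lo, hi) in LIMITS.items():
                try:
                    if not (lo <= float(row[col]) <= hi):
                        row[col] = 'NAN'      # flag the bad value
                except (KeyError, ValueError):
                    pass                      # missing column or non-numeric entry
            writer.writerow(row)

qaqc('DBM1_HrlyAtms.dat', 'DBM1_HrlyAtms_qc.dat')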
So, for html changes (changing how figures are shown and adding annotation text) there will be at least a minimal amount of coding required to modify gp.py or gp_util.py, but data changes can all be done through the .ini files.
I guess additionally, getting older cr10x data to work with this system wouldn't be a giant deal. The julian date & time would need to be changed into a regular timestamp format like the table-based data has, and the arrays would each need to go into their own data file, but otherwise not too much is required.
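For example, converting the cr10x year / julian day / hhmm columns into a normal timestamp is only a few lines (the exact column layout varies by array, so this is just the general idea):

from datetime import datetime, timedelta

def cr10x_timestamp(year, doy, hhmm):
    # e.g. cr10x_timestamp(2007, 245, 1330) -> 2007-09-02 13:30:00
    # (a cr10x '2400' midnight also works out: 24 hours rolls over to the next day)
    hours, minutes = divmod(int(hhmm), 100)
    return (datetime(int(year), 1, 1)
            + timedelta(days=int(doy) - 1, hours=hours, minutes=minutes))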