DataPro and related utilities

Background

This program and related utilities started out in 2006 or 2007 as a way to leverage the gp.py program for web page creation. Basically, it allowed web pages generated by that tool to display data coming from array-output data loggers (Campbell Scientific CR10X, CR23X, etc.). Along the way it turned into a more full-featured data processing system written in Python with a host of support utilities. Bob Busey wrote the original, and then student (now staff member) Ross Spicer did a nice rewrite and organized it like a proper computer scientist might.

Where to find it

The current release can be found on Ross' GitHub page:
https://github.com/rwspicer/csv_utilities

Example starter files are here:
https://github.com/frankohanlon/DataPro_Sample_Files

Finally, see a semi-real workflow for it here:
DataPro setup and workflow

There are several ways to get it. The easy way is just to download the package as a zip file. The proper way is to use the open-source tool Git (from the command line or a graphical interface):

git clone https://github.com/frankohanlon/DataPro_Sample_Files

Also useful, if you decide to use it, is this Windows/Linux graphical utility, which helps generate the configuration files:
https://github.com/rwspicer/sc_apps

What is included

datapro.py -- DataPro 3.0, the data processing utility
step_function_utility.py -- applies a correction to data values over given
                            time periods. Helps with applying temperature offsets
                            to soil temperature data (a sketch follows this list).
rh_calculator.py -- given data on precipitation and dew point, calculates the
                    relative humidity (a sketch follows this list)
precip_utility.py -- an additional level of QA that gets applied to the summer precipitation data
plotter.py -- a plotting utility for making png charts
PNGen.sh -- generates the plots for each csv file in a directory
get_ip -- a utility to get the IP address of a machine
rad_decoder.py -- takes a netCDF file and converts it to a csv file
qc.py -- this one is handy. So, consider the situation where you find a bunch of bad data. You can use this utility to make your corrections in Excel, and then qc.py will copy those corrections back into the original output from datapro... and make a note of what has changed in the QA log (a sketch follows this list).
tz_shift.py -- for shifting between UTC-0 and AKST time zones
wunder_formatter.py -- takes csv file data and converts it to a URL that can
                       be sent to Weather Underground
LoggerLink -- a program for uploading/downloading data and programs from
              CR1000 data loggers
tutorial.py -- should show how to use all of these things.
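
To make the step-function idea concrete, here is a minimal sketch, assuming a timestamped csv with a soil_temp_c column and hard-coded correction windows; it is not the actual step_function_utility.py code.

<pre>
"""Minimal sketch of a step-function correction (not the real
step_function_utility.py): add a fixed offset to every value whose
timestamp falls inside a correction window. Column names are assumed."""
import csv
from datetime import datetime

# hypothetical correction windows: (start, end, offset in deg C)
CORRECTIONS = [
    (datetime(2014, 6, 1), datetime(2014, 9, 30), -0.35),
]

def apply_step_corrections(in_path, out_path):
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            stamp = datetime.strptime(row["timestamp"], "%Y-%m-%d %H:%M")
            value = float(row["soil_temp_c"])
            for start, end, offset in CORRECTIONS:
                if start <= stamp <= end:
                    value += offset  # apply the step offset
            row["soil_temp_c"] = f"{value:.3f}"
            writer.writerow(row)

if __name__ == "__main__":
    apply_step_corrections("soil_temps.csv", "soil_temps_corrected.csv")
</pre>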
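
For reference, relative humidity can be computed from air temperature and dew point with the Magnus approximation; the sketch below assumes those two inputs (the inputs rh_calculator.py actually expects may differ) and is not taken from the utility itself.

<pre>
"""Sketch of a relative-humidity calculation via the Magnus approximation;
assumes air temperature and dew point inputs. Not the real rh_calculator.py."""
import math

def relative_humidity(air_temp_c, dew_point_c):
    """Return relative humidity in percent from temperatures in deg C."""
    a, b = 17.625, 243.04  # Magnus coefficients
    e_dew = math.exp(a * dew_point_c / (b + dew_point_c))  # actual vapor pressure term
    e_sat = math.exp(a * air_temp_c / (b + air_temp_c))    # saturation vapor pressure term
    return 100.0 * e_dew / e_sat

print(relative_humidity(10.0, 2.0))  # roughly 57% RH
</pre>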
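
The qc.py workflow described above amounts to a cell-by-cell merge: read the hand-corrected copy, overwrite any cell of the original datapro output that differs, and append each change to a QA log. A rough sketch under those assumptions (the file layout and log format are hypothetical):

<pre>
"""Rough sketch of the qc.py idea (not the real qc.py): merge hand-made
corrections back into the original datapro output and record each change
in a QA log. The file layout and log format are assumptions."""
import csv
from datetime import datetime

def merge_corrections(original_path, corrected_path, qa_log_path):
    with open(corrected_path, newline="") as f:
        corrected = list(csv.reader(f))
    with open(original_path, newline="") as f:
        original = list(csv.reader(f))

    with open(qa_log_path, "a") as log:
        for r, (orig_row, corr_row) in enumerate(zip(original, corrected)):
            for c, (old, new) in enumerate(zip(orig_row, corr_row)):
                if old != new:
                    orig_row[c] = new  # copy the correction into place
                    log.write(f"{datetime.now().isoformat()} row {r} col {c}: "
                              f"{old!r} -> {new!r}\n")

    # write the merged rows back over the original datapro output
    with open(original_path, "w", newline="") as f:
        csv.writer(f).writerows(original)
</pre>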

Probably less interesting things:
barrow_monthly.py -- something he wrote to grab Barrow CRN data
noaa_data.py -- another something he wrote to grab Barrow CRN data
LoggerLink.py -- another way to send programs to PakBus data loggers (like the CR1000 or CR800, etc.)
rad_decoder.py -- takes ARM netCDF files and extracts the data into csv files (sketched below).
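
A netCDF-to-csv extraction in the spirit of rad_decoder.py can be sketched with the netCDF4 package; the variable name below is a placeholder, not necessarily what the ARM files actually contain.

<pre>
"""Sketch of a netCDF -> csv extraction in the spirit of rad_decoder.py.
Requires the netCDF4 package; the variable name is a placeholder."""
import csv
from netCDF4 import Dataset, num2date

def netcdf_to_csv(nc_path, csv_path, var_name="down_short_hemisp"):
    with Dataset(nc_path) as nc:
        # convert the numeric time axis to datetime objects
        times = num2date(nc.variables["time"][:], nc.variables["time"].units)
        values = nc.variables[var_name][:]
    with open(csv_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["timestamp", var_name])
        for t, v in zip(times, values):
            writer.writerow([t.isoformat(), float(v)])
</pre>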