DataPro and related utilities
Background
This program and its related utilities started out in 2006 or 2007 as a way to leverage the gp.py web page creation program; the goal was to let web pages generated by that tool display data coming from array-output data loggers (Campbell Scientific CR10X, CR23X, etc.). Along the way it turned into a more fully featured data processing system written in Python, with a host of support utilities. Bob Busey wrote the original, and then student (now staff member) Ross Spicer did a nice rewrite and organized it the way a proper computer scientist might.
Where to find it
The current release can be found on Ross' github page:
https://github.com/rwspicer/csv_utilities
(example starter files here:)
https://github.com/frankohanlon/DataPro_Sample_Files
There are several ways to get it. The easy way is to download the package as a zip file. The proper way is to use the open source tool Git (from the command line or a graphical interface):
git clone https://github.com/frankohanlon/DataPro_Sample_Files
Also useful, if you decide to use DataPro, is this Windows / Linux graphical utility, which helps generate the configuration files:
https://github.com/rwspicer/sc_apps
What is included
datapro.py -- Datapro 3.0, the data processing utility
step_function_utility.py -- applies a correction to data values over given time periods; helps with applying temperature offsets to soil temperature data
rh_calculator.py -- given data on precipitation and dew point, calculates the relative humidity (a worked example of the calculation follows this list)
precip_utility.py -- an additional level of qa that gets applied to the summer precipitation data
plotter.py -- a plotting utility for making png charts
PNGen.sh -- generates the plots for each csv file in a directory
get_ip -- a utility to get the ip address of a machine
rad_decoder.py -- takes an ARM netCDF (cdf) file and extracts the data into a csv file
qc.py -- this one is handy: when you find a bunch of bad data, you can make your corrections in Excel and qc.py will copy those corrections back into the original output from datapro and note what has changed in the qa log (a sketch of this idea also follows the list)
tz_shift.py -- for shifting between the UTC and AKST timezones
wunder_formatter.py -- takes csv file data and converts it to a url that can be sent to Weather Underground
LoggerLink -- a program for uploading / downloading data and programs from CR1000 data loggers
tutorial.py -- should show how to use all of these things

Probably less interesting things:
barrow_monthly.py -- a script for grabbing Barrow CRN data
noaa_data.py -- another script for grabbing Barrow CRN data
LoggerLink.py -- another way to send programs to PakBus data loggers (like the CR1000, CR800, etc.)
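Relative humidity is conventionally derived from air temperature and dew point, for example with the Magnus approximation. The stand-alone sketch below only shows that calculation; it is not taken from rh_calculator.py, and its function names and constants are illustrative assumptions, not the utility's actual code.

<pre>
import math

def saturation_vapor_pressure(temp_c):
    """Magnus approximation of saturation vapor pressure, in hPa,
    for a temperature given in degrees C."""
    return 6.1094 * math.exp(17.625 * temp_c / (243.04 + temp_c))

def relative_humidity(air_temp_c, dew_point_c):
    """Relative humidity in percent from air temperature and dew point (deg C)."""
    return 100.0 * (saturation_vapor_pressure(dew_point_c)
                    / saturation_vapor_pressure(air_temp_c))

print(relative_humidity(20.0, 10.0))   # roughly 52 %
</pre>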
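The qc.py workflow described above amounts to comparing a hand-corrected copy of a csv file against the original datapro output, copying the corrected values back, and recording every change in a qa log. The sketch below illustrates that idea under assumed file names and a flat csv layout; it is not qc.py's actual interface, so check the utility itself for the real usage.

<pre>
import csv

ORIGINAL = "station_soil_temp.csv"          # original datapro output (assumed name)
CORRECTED = "station_soil_temp_fixed.csv"   # copy edited by hand in Excel (assumed name)
QA_LOG = "station_soil_temp_qa.log"         # where the changes get recorded (assumed name)

def read_rows(path):
    with open(path, newline="") as f:
        return list(csv.reader(f))

original = read_rows(ORIGINAL)
corrected = read_rows(CORRECTED)

changes = []
merged = []
for row_num, (old_row, new_row) in enumerate(zip(original, corrected), start=1):
    merged_row = list(old_row)
    for col, (old_val, new_val) in enumerate(zip(old_row, new_row)):
        if old_val != new_val:
            merged_row[col] = new_val
            changes.append(f"row {row_num}, column {col}: {old_val} -> {new_val}")
    merged.append(merged_row)

# write the merged data back over the original output
with open(ORIGINAL, "w", newline="") as f:
    csv.writer(f).writerows(merged)

# and append a record of what changed to the qa log
with open(QA_LOG, "a") as f:
    for line in changes:
        f.write(line + "\n")
</pre>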