Seward Peninsula Sites
Latest revision as of 08:38, 20 June 2022
Trip Itineraries etc.
- 2007 September Nome Trip
- 2008 June Nome Trip
- 2008 August Nome Trip
- 2009 August Nome Trip
- 2009 September Nome Trip
- 2010 June Nome Trip
- 2010 August Nome Trip
- 2010 November Nome Trip
- 2011 August Nome Trip
- 2012 August Nome Trip
- 2014 July Nome Trip
- 2015 July Nome Trip
- 2015 September Nome Trip
- 2016 June Nome Trip
- 2016 April Nome Trip
- 2016 Summer Nome Trip
- 2016 NGEE August Nome Trip
- 2016 NGEE September Nome Trip
- 2017 NGEE June Nome Trip
- General Site Checklist
- Storage Inventory
- Kougarok Storage: Combo changed in 2015. Ask Bob Busey for access if needed.
- Base Callbook
Data
Unofficial data location on ngeedata:
/home/bbusey/working_files/sp_datapro/raw/
Internal wiki data:
https://wiki.alaska.edu/display/iarcsp/
(Additional info on how the public webserver is set up can be found on that wiki: https://wiki.alaska.edu/display/iarcsp/INE+SP+Web+Page+Configuration)
- C1-Grid Data Downloads
- C2-Blueberry Data Downloads
- C3-Guy Rowe Data Downloads
- K1 Burn Data Downloads
- K2 Met Data Downloads
- K3 Mauze Data Downloads
- K1 Snow Thermocouples Data Downloads
- K2 Snow Thermocouples Data Downloads
- Anvil Mountain Data Downloads
- Skookum Pass Data Downloads
- Kigluaiks Repeater Data Downloads
- Kigluaiks Repeater 2 Data Downloads
- Teller Top Data Downloads
- Teller Bottom Data Downloads
- Teller Surface Water Level Data Downloads
- Online Archive Fixes
Instrumentation Serial Numbers
- C1-Grid Instrumentation History
- C2-Blueberry Instrumentation History
- C3-Guy Rowe Instrumentation History
- K1 Burn Instrumentation History
- K2 Met Instrumentation History
- K3 Mauze Instrumentation History
- K1 Snow Thermocouples Instrumentation History
- K2 Snow Thermocouples Instrumentation History
- Anvil Mountain Instrumentation History
- Skookum Pass Instrumentation History
- Kigluaiks Repeater Instrumentation History
- Kigluaiks Repeater 2 Instrumentation History
- Teller Upper Station
- Teller Lower Station
Data QA Notes
- C1-Grid Notes
- C2-Blueberry Notes
- C3-Guy Rowe Notes
- K1 Burn Notes
- K2 Met Notes
- K3 Mauze Notes
- Anvil Mountain Notes
- Skookum Pass Notes
- Kigluaiks Repeater Notes
- Kigluaiks Repeater 2 Notes
- Teller Top Notes
- Teller Bottom Notes
- NGEE Teller SW Notes
Sites Background
As a carry-over from the ATLAS and HARC projects, WERC maintains a network of telemetered data stations on the Seward Peninsula. Archived data is currently found on the UAF WERC site
Principal data sites, Council area:
- C1 Grid (near CALM site U27), active
- C2 Blueberry, active
- C3 Guy Rowe, inactive
Principal data sites, Kougarok area:
- K1 Burn site, active
- K2 Met, active
- K3 Mauze Gulch (near CALM site U28), active
We use radio telemetry to communicate with our stations out on the peninsula. So, there are several radio repeaters in that list, too. They are:
- Anvil Mountain (near Nome)
- Skookum Pass (near Council)
- Kigluaiks (in the Kigluaik Mountains, across the valley to the south of Kougarok)
In addition there are several other sites:
- Snake River (outside Nome on the Nome-Teller Highway)
- Kuzitrin River (60+ miles up the Kougarok Road from Nome)
- Niagara Creek (~1 mile from K2 Met)
See most of them on a map here: Site Map
Radios turned off for C1-grid, C2-blueberry, Skookum Pass, Anvil Mountain on 11/4/2009
Project Background
Originally there were three Council sites, C1, C2, and C3. Each was selected to be in a different type of landscape but C3 kept getting attacked by bears so we eventually removed it. C1 is called C1-Grid because it is located in the valley bottom near Council along the perimeter of a CALM 1 km x 1 km active layer grid. So, the meteorological station was set up to complement the field data collected on the CALM grid. In terms of importance to the original project it was a secondary station so the data elements collected there are basically: Air Temperature / Relative Humidity (1m & 3m), Wind Speed (1m & 3m) & Wind Direction (3m), Net Radiation (in summer), precipitation (in summer), snow depth, soil moisture (vertically profiled), and soil temperature (vertically profiled). The station is also a shorter 3 meter tower.
Council Sites
The blueberry site is located on Blueberry Hill to the SW of Council. It is the primary site for the Council area, so it has more sensors and the tower is a 10 meter high setup. In addition to the sensors mentioned for C1 there are also up-facing shortwave & longwave radiation and down-facing shortwave & longwave radiation sensors. Wind speed & AT/RH (as a single sensor) are also measured at more heights than at C1 (1m, 3m & 10m). In 2007 I made some changes to the station. In the past longwave radiation was measured using a battery; however, it was a lead-mercury battery, which is considered hazardous and has been discontinued, so I re-did the wiring. This will be reflected in the data as a couple of extra data columns that appeared last year. Blueberry is across the Niukluk river (I think that's the river, not 100%) and is sometimes unreachable in fall. So, we weren't able to get across the river to remove the net radiometer in Fall 2007. Data for it shouldn't be trusted much past October 1st, though, because snow or frost accumulating on the domes that protect the radiation sensor degrades the readings.
On a similar note, the precipitation gages are out year-round but we only record summer rainfall. So, you should mark the winter time of year with '7777'. For measured precipitation I don't generally count it until the air temperature is consistently above freezing. It's kind of subjective on the shoulder seasons figuring out what is snow vs. what is rainfall, and if you don't feel comfortable with that, I'm happy to look over the data and select which points to include and which to exclude. It may sound like we're doing something funky but we aren't. The tipping bucket rain gage is only designed to measure liquid precipitation. So, when snow accumulates in the bucket no measurements are recorded until the bucket temperature gets above freezing and melts the snow in the bucket. Since those gages are just used in the summer, we exclude all solid precip that may melt and be counted at the data logger.
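The winter masking described above can be sketched in Python (a minimal sketch, not the actual QA procedure: the record layout and the simple at-or-below-freezing rule are my assumptions for illustration; in practice the shoulder seasons are judged by eye):

```python
# Flag winter tipping-bucket records with the 7777 "not measured" code.
# Assumption: records are (air_temp_C, precip_mm) tuples; the real QA
# decides the summer window subjectively on the shoulder seasons.

WINTER_FLAG = 7777

def mask_winter_precip(records, freeze_c=0.0):
    """Replace precip with 7777 whenever air temperature is at or below freezing."""
    out = []
    for air_temp, precip in records:
        if air_temp <= freeze_c:
            out.append((air_temp, WINTER_FLAG))
        else:
            out.append((air_temp, precip))
    return out

records = [(-5.2, 0.3), (2.1, 1.2), (-0.4, 0.0)]
print(mask_winter_precip(records))  # → [(-5.2, 7777), (2.1, 1.2), (-0.4, 7777)]
```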
Kougarok Sites
K1-burn is another secondary site. It is located on an area burned by an old tundra fire up on a low sloping bench above the Kougarok river. The vegetation in this area is dry tussock and grasses. Much of the organic soil layer burnt off in the fire so recovery has been slow. This site gets visited fairly frequently by animals so you'll see most of the soil moisture sensors have had their cables eaten, as has also happened to the soil temperature sensors. This site is quite similar to C1 that I talked about earlier in terms of sensor setup. In the fall of 2007 we added a snow depth sensor to the site so that will be a new thing. I'm not sure about its reporting though.
K2-met is the 10 meter tower for the Kougarok area. It is located on very wet tundra 300 meters or so up slope from the Kougarok river, so down in the valley. It is a full site similar to C2 although it also has a snow sensor installed. A broken, older sensor was replaced with a newer one in fall 2007.
K3 mauze is located at the perimeter of the Kougarok area CALM grid. It is another 3 meter tower. This site has been hammered by the bears in recent years so some sensors are in and out in terms of their quality.
Nome Base
The base radio and moxa server are located at the UAF Northwest campus in Nome.
Discussion of the Station Data
So, that's a short bit of background on the stations. The data on the CD I gave you is predominantly archived data or data that has been through some level of processing. You can find the up to the day data here:
http://www.uaf.edu/water/projects/near-real-time/rawdatafiles/
http://www.uaf.edu/water/projects/near-real-time/rawdatafiles/atlas-blueberry
http://www.uaf.edu/water/projects/near-real-time/rawdatafiles/atlas-c1-grid
http://www.uaf.edu/water/projects/near-real-time/rawdatafiles/atlas-k-burn
http://www.uaf.edu/water/projects/near-real-time/rawdatafiles/atlas-k-met
http://www.uaf.edu/water/projects/near-real-time/rawdatafiles/atlas-kmauze
Those are all text files, and quite large at this point. Let me know if you find any gaps in the data. Sometimes there is trouble with the back-end connection between the UAF webserver and the computer that retrieves this data over the radio network from Nome.
So, when you look at the data there it will probably look like a bunch of not-real-helpful numbers. Plus you'll see an Array ID at the start of each line which seems to alternate each hour. That is because the meteorological data arrays are mixed together with the subsurface data arrays. It's kind of a byproduct of the data logger we use. The design for them was done in the early 90s or late 80s, so the electronic data storage architecture is robust but perhaps a bit primitive compared to the computers of today.
So, to split out all the data I should have included on that CD a directory called 'batch' or 'bin'. If you look in there you should see some .bat files, which are DOS batch files. I don't know if you're using Linux or Windows; if you're using Linux then pop open a .bat file and you'll see 'grep .........'. You should be able to change the directory paths there (I can help if you have questions about this), but these little batch files basically read through the data above (e.g. atlas-blueberry), pull out all the met data array lines and then all the soil data array lines, and dump them into different files. So, when you mention seeing different data files on the CD I gave you, the data is in various steps of processing.
Oh, also, if you are using a Windows computer (as I am for the data processing) then you should have Cygwin installed. It's a program that gives you many of the free utilities that come with linux/unix/debian (most helpfully for us today, grep). You can download it here if you need: http://www.cygwin.com
So, if you're using Windows, just modify the batch files. I like to use this program: Context, for viewing text files. I find it much more useful than Notepad, but if you have a favorite text editing program, feel free to use it. Anyway, if you're using Windows you'll need to modify the batch file to grep out the 2007 data, I think. The general format for the grep command is:
grep [text string to search for] [input data file] > or >> [output data file]
c:\cygwin\bin\grep 228,3018,2007 "c:\data\atlas\c1-grid\atlas-c1-grid.txt" > "c:\data\atlas\c1-grid\c1-grid_228-2007-soil.csv"
You'll see both '>' and '>>' in the batch files. '>' means put all the data line matches into the output file and create a new file each time. '>>' means put all the data line matches into the output file and append them to the end of the file if it already exists.
So, you'll need to change the paths (e.g. 'c:\data\') to whatever you're using. I like c:\data\project\station_name from Windows or /home/bob/data/project/station_name from Linux, but it's totally your call how you organize. From Linux the line above would look more like this:
grep 228,3018,2007 "/home/bob/data/atlas/c1-grid/atlas-c1-grid.txt" > "/home/bob/data/atlas/c1-grid/c1-grid_228-2007-soil.csv"
So, when you see stuff like this:
CD/atlas/C1-grid/c1-grid_123-2006-met.csv
CD/atlas/C1-grid/c1-grid_123-2007-met.csv
CD/atlas/C1-grid/c1-grid_228-2004-soil.csv
etc. etc. That is output from the grep program. The 123/228 references the array ID. So, that file contains met data from 2007 from this file: C1 Data File. Here's a sample line: 123,3018,2005,1,100,-0.47,-0.176,97.7,96.1,3.639,162.7,4.414,92.7,0,-2.046,12.35
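If you'd rather not run the grep batch files, the same split can be sketched in Python (a minimal sketch; the 123/228 array IDs follow the C1 example above, and the function name and in-memory line lists are mine for illustration):

```python
# Split a mixed raw logger dump into per-array buckets, keyed on the
# Array ID in the first comma-separated field (e.g. 123 = met, 228 = soil).

def split_by_array_id(lines, array_ids=("123", "228")):
    """Return {array_id: [line, ...]} for the requested Array IDs."""
    buckets = {aid: [] for aid in array_ids}
    for line in lines:
        aid = line.split(",", 1)[0].strip()
        if aid in buckets:
            buckets[aid].append(line)
    return buckets

raw = [
    "123,3018,2005,1,100,-0.47,-0.176,97.7",
    "228,3018,2005,1,100,-1.2,-1.5",
    "123,3018,2005,1,200,-0.50,-0.180,97.5",
]
buckets = split_by_array_id(raw)
print(len(buckets["123"]), len(buckets["228"]))  # → 2 1
```

Each bucket can then be written to its own .csv, just as the grep `>` / `>>` redirects do.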
I see that for some files the '123' has been replaced by the regular date & time. But that's probably the only change in the file. I would have done that if I was just checking things out.
So, moving forwards a bit, my rough method for processing the data is this:
- download data off website
- use grep to separate out by data line type and year using the batch file for DOS... or if you use Linux then you can actually edit the .bat files to use the correct paths (like I mentioned above) and then from the command line type this:
chmod u+x c1-grid.bat
this will allow you to just type './c1-grid.bat' just as I have been doing, but you'll do it from Linux rather than Windows.
- open in Excel or use a program I wrote to process the data (more on this below)
- view the data with a data visualization program. I used Origin but Excel is semi-okay (Origin is nice because you can quickly change the time scales), or if you have another program you really like, like Matlab, that's fine, too.
- delete bad data that is apparent through using the visualization program (replace with '6999')
- have someone else do a final check (like me probably)
- post data to the internet
'Process the data' is pretty nebulous. I have a program I wrote which converts thermistor resistances (soil temperature data) from resistance to temperature and then formats the data as it should look on the web. That's the main reason you see so many similarly named files: some are from step 1, some are from step 2, etc. So, that is probably a bit confusing. But for you, using a spreadsheet program like Excel, or whatever you're most comfortable with, is probably better. You can find header information for what is in each column of the data file. The other main bit of 'processing' that happens is to correct the net radiation data for wind. I can dig that equation up for you if you can't find it in those spreadsheets. So, also along the process-the-data lines, some things you'll want to look out for:
- missing lines of data... make a note and check with me if you find any
- repeated lines of data... these are more likely, due to the rsyncing process (the way the UAF website talks to the server downloading data from the radio network).
There are a couple of ways of converting julian dates into regular dates. You could use these programs I've written for Visual Basic: http://werc.engr.uaf.edu/mediawiki/index.php/Excel_Macros
Or, another way is this:
a) type into a cell '12/31/06'
b) change the format of the cell to number
c) if your data is in the format: column 1 is the date you'd like, column 2 is the year, column 3 is the day, column 4 is the hour, then in column 1 you can print (using data for 2007) '= 39082 + b1 + c1/2400', then format this column as a date & time and you'll be in business.
Do some simple boolean stuff to check for gaps of data. Or, just create another column that has as its first cell '1/1/07 1:00', then in the second row of the column '=a1 + 1/24', then look to see if the calculated date (mentioned first) ever varies from the second method. If it doesn't, you're golden. If they do vary then you can track down where the problem is.
In general, if there are multiple lines of the same data (or so it looks like) I will delete the first instance and keep the last. The reason for this is related to the radio telemetry. Some of the connections are quite weak so we don't always get the data on an hourly basis. Sometimes we'll get half the line so we'll need to get things resent later (with regard to the radio network and data retrieval from the data loggers).
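The same conversion and gap check can be sketched in Python instead of Excel (a minimal sketch; it assumes the logger's year / julian day / HHMM hour fields as described above, and the function names are mine):

```python
from datetime import datetime, timedelta

def logger_timestamp(year, julian_day, hhmm):
    """Convert logger (year, julian day, HHMM) fields to a datetime.
    HHMM is the hour field as logged, e.g. 100 -> 01:00; 2400 rolls over
    to midnight of the next day."""
    base = datetime(year, 1, 1) + timedelta(days=julian_day - 1)
    return base + timedelta(hours=hhmm // 100, minutes=hhmm % 100)

def find_gaps(timestamps, step=timedelta(hours=1)):
    """Return (previous, current) pairs wherever spacing isn't exactly one step.
    Catches both missing hours and repeated lines."""
    return [(a, b) for a, b in zip(timestamps, timestamps[1:]) if b - a != step]

ts = [logger_timestamp(2007, 1, h) for h in (100, 200, 400)]  # 03:00 missing
print(find_gaps(ts))  # one gap pair: 02:00 -> 04:00
```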
I guess another note on the array IDs. The met data and the radiation data are contained in the same array ID line. The soil temperature and soil moisture data (most of which has broken sensors) is in a separate array ID. That is the 123, 127 etc. that you wonder about below.
Well, I've written a book, maybe I will post this to our wiki and we can go back and forth on there, too. Totally let me know if you have any questions. I'll be in town through Friday (just learned) so I should be able to help more in the short term than the medium term. thanks
Part 2
Here's a bit about the radiation. I will get you more calibration values once we're back from the June trip the week of 6/16.
So, what's in the data files you've looked at off the internet is the output voltage in millivolts as recorded by the data logger. To get from millivolt output to Net Radiation in Watts/m^2 you'll need to do a couple of things: 1) correct for wind speed, and 2) apply the sensor-specific calibration factor (positive or negative depending on sensor output).
Via this manual (pdf page 5 document page 3): Net Radiometers -- REBS Q-7.1 (Radiation Energy Balance Systems) http://www.campbellsci.com/documents/manuals/q-7-1.pdf
These are the wind speed correction equations (where wind speed is in meters per second, as we have at our stations):
Positive Wind Speed Correction Factor: Wind_CF_Pos = 1 + (0.066 * 0.2 * 1m_Wind_Speed) / (0.066 + (0.2 * 1m_Wind_Speed) )
Negative Wind Speed Correction Factor: Wind_CF_Neg = (0.00174 * 1m_Wind_Speed) + 0.99755
So, if Net Radiation (in mVolts as recorded by the data logger) is positive use this equation:
Corrected_Net_Rad = Wind_CF_Pos * Uncor_Net_Rad * Pos_Calib
If Net Radiation (in mVolts as recorded by the data logger) is negative use this equation: Corrected_Net_Rad = Wind_CF_Neg * Uncor_Net_Rad * Neg_Calib
Here's the equation for C2 as it looks for me in Excel:
=IF(ABS([Net Radiometer Cell])<>6999, IF([Net Radiometer Cell]>0, 9.15*[Net Radiometer Cell]*(1+(0.066*0.2*[1 meter Wind Speed])/(0.066+0.2*[1 meter Wind Speed])), 11.13*[Net Radiometer Cell]*(0.00174*[1 meter Wind Speed]+0.99755)), 7777)
So, given all that you need one more thing: the calibration values for positive and negative calibrations. Here's a first batch, and I'll get you some more after we come back from our trip in a couple weeks.
C1 Grid: Positive Calibration: 9.18, Negative Calibration: 11.31
C2 Blueberry: Positive Calibration: 9.15 Negative Calibration: 11.13
K1 Burn: [Need to track down yet, after 2008 June visit]
K2 Met: [Need to track down yet, after 2008 June visit]
K3 Mauze Gulch: Q99012, Positive Calibration: 8.97, Negative Calibration: 10.90
The raw data input column information you should find in the excel files or here, too:
http://www.uaf.edu/water/projects/atlas/metdata/Councilsites/C1Grid/current-raw.html
http://www.uaf.edu/water/projects/atlas/metdata/Councilsites/Blueberry/current-raw.html
http://www.uaf.edu/water/projects/atlas/metdata/Kougaroksites/k-burn/current-raw.html
http://www.uaf.edu/water/projects/atlas/metdata/Kougaroksites/k-met/current-raw.html
http://www.uaf.edu/water/projects/atlas/metdata/Kougaroksites/kmauze/current-raw.html
The output column information (how everything should be laid out) is here:
http://www.uaf.edu/water/projects/atlas/council/c1/data/met/c1_met_2006.csv http://www.uaf.edu/water/projects/atlas/council/c1/data/soil/c1_soil_2004.csv http://www.uaf.edu/water/projects/atlas/council/c1/data/rad/c1_rad_2005.csv
http://www.uaf.edu/water/projects/atlas/council/c2/data/met/c2_met_2006.csv http://www.uaf.edu/water/projects/atlas/council/c2/data/soil/c2_soil_2004.csv http://www.uaf.edu/water/projects/atlas/council/c2/data/rad/c2_rad_2006.csv
http://www.uaf.edu/water/projects/atlas/kougarok/k1/data/met/k1_met_2006.csv http://www.uaf.edu/water/projects/atlas/kougarok/k1/data/soil/k1_soil_2004.csv http://www.uaf.edu/water/projects/atlas/kougarok/k1/data/rad/k1_rad_2006.csv
http://www.uaf.edu/water/projects/atlas/kougarok/k2/data/met/k2_met_2004.csv http://www.uaf.edu/water/projects/atlas/kougarok/k2/data/soil/k2_soil_2004.csv http://www.uaf.edu/water/projects/atlas/kougarok/k2/data/rad/k2_rad_2004.csv
http://www.uaf.edu/water/projects/atlas/kougarok/k3/data/met/Mauze_met_2006.csv http://www.uaf.edu/water/projects/atlas/kougarok/k3/data/soil/Mauze_soil_2004.csv http://www.uaf.edu/water/projects/atlas/kougarok/k3/data/rad/Mauze_rad_2006.csv
Datapro processing notes
On 5/23/16 we wanted to update the datapro processed output files to use the directly downloaded logger files instead of the ones from the radio network. Here is what we did, using C1 as an example:
- A record of all directly downloaded data files is found here http://ngeedata.iarc.uaf.edu/wiki/C1-Grid_Data_Downloads
- Config files, such as key files to run datapro, are here on ngeedata: /home/bbusey/working_files/sp_datapro/c1-grid/config/
- Copied the raw directly-downloaded data files from the owncloud folder (owncloud/Modern Seward Peninsula/ Seward_Data/raw_data_archive) to /home/bbusey/working_files/sp_datapro/raw/c1-grid/raw
- Deleted some data in each output file in /home/bbusey/working_files/sp_datapro/c1-grid/outputs so that data ended on 12/31/12 or before
- note 1: this folder contains some shortcut links (symlinks) for the website
- note 2: datapro looks to see what the last line of data is, so it doesn't append if the data is already in the file
- Copied the bash script from /home/bbusey/bin/process_sp.sh to /home/bbusey/process_direct_downloads_bin/process_c1grid.sh so we could edit it and run it on the directly downloaded data instead of the radio downloads
- about bash: when you use ssh to bring up a terminal window, that particular instance is a bash terminal or bash shell. So, you can use a bash script to string together a bunch of bash commands. In this case we use a bash script to process each .dat direct download using datapro
- this line specifies that we are running datapro version 3, the key file for array 127, and the direct data file we downloaded from C1 in 2014
python $DATAPRO3 --key_file=$ROOT_DIR_C1/config/c1-met_array_127_key.txt --alt_data_files=$ROOT_DIR_SP/raw/c1-grid/RAW/c1_grid_2014_07_22.dat
- then we run that line again with the next data file, or the next key file, etc.
- we rerun the bash script for every new line and comment out the old line
- Now the files in the /home/bbusey/working_files/sp_datapro/c1-grid/outputs/ folder should then be updated to the end of the last data file
- you can check this by outputting the last two lines of each .csv file in the folder by typing
tail -n2 /home/bbusey/working_files/sp_datapro/c1-grid/outputs/*.csv
- the script we paused (the one that runs hourly on the radio-downloaded files) looks like
python $DATAPRO3 --key_file=$ROOT_DIR_C1/config/c1-met_key.txt
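The edit-and-rerun loop above can also be sketched as a small Python driver that builds the datapro command for each key file / data file pair (a sketch only: the paths follow the C1 example, the environment variable names mirror the bash script, and looping over pairs instead of editing the script between runs is my own simplification):

```python
import os

# Assumed environment variables, mirroring the bash script above;
# the fallback paths are the C1 example paths from these notes.
DATAPRO3 = os.environ.get("DATAPRO3", "datapro3.py")
ROOT_DIR_C1 = os.environ.get("ROOT_DIR_C1", "/home/bbusey/working_files/sp_datapro/c1-grid")
ROOT_DIR_SP = os.environ.get("ROOT_DIR_SP", "/home/bbusey/working_files/sp_datapro")

def datapro_command(key_file, data_file):
    """Return the argv list for one datapro run (built, not executed)."""
    return [
        "python", DATAPRO3,
        "--key_file=%s/config/%s" % (ROOT_DIR_C1, key_file),
        "--alt_data_files=%s/raw/c1-grid/RAW/%s" % (ROOT_DIR_SP, data_file),
    ]

# one (key file, direct download) pair per rerun of the bash script
pairs = [("c1-met_array_127_key.txt", "c1_grid_2014_07_22.dat")]
for key, dat in pairs:
    print(" ".join(datapro_command(key, dat)))
    # subprocess.run(datapro_command(key, dat), check=True)  # uncomment to run
```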