Eddy Flux Measurements, Tussock Station, Imnavait Creek, Alaska - 2011

Abstract: 

The Biocomplexity Station was established in 2005 to measure landscape-level carbon, water, and energy balances at Imnavait Creek, Alaska. The station now contributes valuable data to the AON project established at two nearby stations. Together these will form part of a network of observatories, with Abisko (Sweden), Zackenberg (Greenland), and a location in the Canadian High Arctic providing further data points as part of the International Polar Year. This part of the project focuses on simultaneous measurements of carbon, water, and energy fluxes of the terrestrial landscape at hourly, daily, seasonal, and multi-year time scales. These fluxes are the major regulatory drivers of the Arctic climate system and form key linkages and feedbacks between the land surface, the atmosphere, and the oceans. We will provide a comprehensive description of the state of the regional Arctic system with respect to these variables, its overall regulation and controlling features, and its interaction with the global system.

Project Keywords: 

Data set ID: 

20014

EML revision ID: 

7
Published on EDI/LTER Data Portal

Citation Suggestion: 

Bret-Harte, M., Euskirchen, E., Griffin, K., Shaver, G. 2011. Eddy Flux Measurements, Tussock Station, Imnavait Creek, Alaska - 2011. Environmental Data Initiative. http://dx.doi.org/10.6073/pasta/e7b9e40542eee2a00c3a873f9440705b
Dates

Date Range: 

Thursday, March 3, 2011 to Wednesday, October 26, 2011

Publication Date: 

Monday, December 12, 2011

Methods: 

Similar to the other flux stations involved in the AON project, the Biocomplexity Station generates two types of data: high-frequency eddy covariance (EC) data and low-frequency means of meteorological and subsurface data. On a daily basis, approximately 75 MB of high-frequency binary data and 16 KB of low-frequency ASCII data are collected.

This station uses a CR3000 data logger and a laptop to collect and store data. The CR3000 measures the open-path EC instruments, which are sampled at 10 Hz. Micrometeorological data are scanned at 0.33 Hz, and all data points are averaged every half hour. The CRBASIC program that controls the data logger applies only the most basic corrections and filtering to the raw data; these include shifting the CSAT3 and LI7500 data arrays by 2 and 3 scans, respectively, to account for the inherent processing delays of these sensors. The high-frequency data are processed to yield mass and energy fluxes using a Reynolds decomposition, after which the following corrections are applied: the WPL correction, a coordinate rotation, a spectral correction, and the 'Burba' correction. Further quality-control flags are generated, such as a stationarity test and a footprint analysis. The high-frequency processed tables are then combined with the low-frequency micrometeorological data, after which the data are both filtered and gap-filled. The following procedures are used to filter the data:

1) Parameters based on the engineering specifications of each instrument. This normally involves filtering data that fall outside the operating temperature range.
2) Parameters based on the Automatic Gain Control (AGC, lens-transmissivity flag) of the LI7500. If this value exceeds a given threshold, the lens of the LI7500 is assumed to be obstructed by ice or snow; all measurements from this sensor and from all radiation sensors are then assumed to be similarly obstructed, and these data are filtered.
3) Any sources of air-flow distortion are identified at each site, and all EC measurements from those azimuth directions are filtered. Thus, if the wind originates between these rejection angles, the EC data are filtered out.
4) Parameters for impossible measurements (e.g., negative values from a precipitation gauge or pyranometer).
5) Parameters for previously flagged data, including '-9999'. Data loggers or other processed datasets for which raw data are unavailable sometimes use various flag strings; these are all standardized to 'NaN'.
6) A three-standard-deviation filter to remove extreme outliers.
7) A similarity/cluster filter. With certain instruments, a string of identical values in a time series usually indicates measurement errors; any run of 5 identical values is filtered.

The following procedures are used to gap-fill the data:

1) A Pth-order autoregressive model. Two values for each missing element in a time series are predicted, one with a forward-looking model and one with a backward-looking model. The function currently looks 168 elements in both directions in a time series, and the two predictions are averaged to produce a final estimate. This model can still produce impossible values (e.g., negative measurements from a pyranometer), which are filtered out; thus there may still be gaps in the time series, but they will be minimized.
2) Filtered values at the beginning and end of a time series are predicted using a variant of the MDV (mean diurnal variation) method, in which values are estimated from binned half-hourly averages over the following or preceding seven days.
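As a rough illustration (not the project's actual processing code), the two purely statistical filters above, the three-standard-deviation outlier filter and the similarity/cluster filter, might be sketched as follows; function names and the run-length parameter are illustrative assumptions:

```python
import numpy as np

def sd_filter(x, n_sd=3.0):
    """Replace values more than n_sd standard deviations from the
    series mean with NaN (sketch of filter 6 above)."""
    x = np.asarray(x, dtype=float)
    mu, sd = np.nanmean(x), np.nanstd(x)
    out = x.copy()
    out[np.abs(x - mu) > n_sd * sd] = np.nan
    return out

def cluster_filter(x, run_len=5):
    """Replace any run of `run_len` or more identical consecutive
    values with NaN (sketch of filter 7 above)."""
    x = np.asarray(x, dtype=float)
    out = x.copy()
    start = 0
    for i in range(1, len(x) + 1):
        # close the current run when the value changes or the series ends
        if i == len(x) or x[i] != x[start]:
            if i - start >= run_len:
                out[start:i] = np.nan
            start = i
    return out
```

In practice these filters would be applied per variable and per averaging period, after the instrument-specification and flag-string filters have already set bad records to NaN.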
See the file IC_1993_Metadata_2011.csv for data collection statistics.
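The MDV-style gap-filling described above can also be sketched in simplified form. This is an assumption-laden illustration, not the project's code: it fills each missing half-hourly slot with the mean of valid values in the same time-of-day slot over the surrounding seven days, whereas the documented procedure uses binned averages from only the following or the preceding seven days:

```python
import numpy as np

SLOTS_PER_DAY = 48  # half-hourly averaging periods per day

def mdv_fill(x, window_days=7):
    """Fill NaN gaps with the mean of valid values in the same
    half-hour slot from +/- `window_days` days (simplified MDV sketch)."""
    x = np.asarray(x, dtype=float)
    out = x.copy()
    for i in np.where(np.isnan(x))[0]:
        # indices of the same time-of-day slot on neighboring days
        offsets = np.arange(-window_days, window_days + 1) * SLOTS_PER_DAY
        idx = i + offsets
        idx = idx[(idx >= 0) & (idx < len(x)) & (idx != i)]
        vals = x[idx]
        vals = vals[~np.isnan(vals)]
        if vals.size:
            out[i] = vals.mean()
    return out
```

The autoregressive step would run first over interior gaps; an MDV-style estimator like this is then a natural fallback at the series ends, where a 168-element forward or backward AR window is unavailable.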

AON Data Processing, Filtering and Gap-Filling

http://aon.iab.uaf.edu/data_info


Use of the data requires acceptance of the Arctic LTER Data Use Policy.
