General Operations

Activities and procedures supporting data quality assurance and control (QA/QC)

Station Siting

AZMet sites stations as optimally as local circumstances allow, ideally where shading does not interfere with solar radiation measurements, obstacles such as trees or buildings do not interfere with wind measurements, and surfaces such as asphalt do not interfere with temperature measurements.

Station Maintenance

AZMet typically visits each station every four to six weeks for site inspection and sensor cleaning. On a quarterly basis, these visits include a comparison of air temperature, relative humidity, solar radiation, and wind speed measurements with those from a calibrated mobile reference set. For precipitation, soil temperature, and wind direction, we perform a reading verification test. Sensors that fall outside specified acceptable error ranges are either recalibrated or replaced at the time of the visit.
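
The quarterly comparison against a calibrated reference set can be sketched as follows. This is a minimal illustration, not AZMet's actual procedure: the variable names, units, and tolerance values are assumed for the example.

```python
# Hypothetical sketch: flag station sensors whose readings deviate from a
# calibrated mobile reference set by more than an acceptable error range.
# Tolerances below are illustrative placeholders, not AZMet's actual values.

TOLERANCES = {                # assumed acceptable absolute error per variable
    "air_temp_C": 0.5,
    "rel_humidity_pct": 3.0,
    "solar_rad_Wm2": 25.0,
    "wind_speed_ms": 0.5,
}

def compare_to_reference(station: dict, reference: dict) -> dict:
    """Return each variable's deviation and whether it is within tolerance."""
    results = {}
    for var, tol in TOLERANCES.items():
        diff = station[var] - reference[var]
        results[var] = {"deviation": diff, "within_tolerance": abs(diff) <= tol}
    return results

station_readings = {"air_temp_C": 31.2, "rel_humidity_pct": 24.0,
                    "solar_rad_Wm2": 905.0, "wind_speed_ms": 3.4}
reference_readings = {"air_temp_C": 30.9, "rel_humidity_pct": 21.0,
                      "solar_rad_Wm2": 890.0, "wind_speed_ms": 3.3}

report = compare_to_reference(station_readings, reference_readings)
```

A variable flagged as outside its tolerance would prompt recalibration or replacement during the visit, as described above.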

We record activities and results from clean-and-inspect and sensor-comparison visits, and digitally archive documentation by station and visit date.

Clean-and-inspect and sensor-comparison visits are in addition to any emergency visits to repair problematic sensors or other equipment.

AZMet preventatively replaces selected sensors and other station equipment based on network experience and manufacturer recommendations. For example, we exchange anemometers annually; we clean and replace the bearings of units removed from the field and recalibrate these sensors prior to redeployment. Station batteries are on a five-year replacement cycle.

Data Collection, Processing, Storage, Provision, and Backup

As AZMet collects data from an individual station, initial processing includes automated comparison of each value to the minimum and maximum limits of the corresponding sensor, as specified by the sensor manufacturer. If a value falls outside the sensor range, we automatically replace it with a corresponding missing data value.

Initial processing also includes automated comparison of each value to theoretical physical limits, such as 0 % and 100 % for relative humidity. If a value falls outside its theoretical physical range, we automatically replace it with a corresponding missing data value.
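
The two automated range checks can be illustrated together. The limit values, variable names, and missing-data sentinel below are assumed for the sketch; they do not represent AZMet's actual configuration.

```python
# Illustrative sketch of the two automated range checks described above:
# first against manufacturer-specified sensor limits, then against
# theoretical physical limits. All values here are assumed placeholders.

MISSING = -9999.0  # assumed missing-data sentinel

SENSOR_LIMITS = {"air_temp_C": (-40.0, 60.0),       # manufacturer-specified
                 "rel_humidity_pct": (0.0, 100.0)}
PHYSICAL_LIMITS = {"rel_humidity_pct": (0.0, 100.0)}  # theoretical

def validate(var: str, value: float) -> float:
    """Replace any out-of-range value with the missing-data sentinel."""
    for limits in (SENSOR_LIMITS, PHYSICAL_LIMITS):
        lo, hi = limits.get(var, (float("-inf"), float("inf")))
        if not (lo <= value <= hi):
            return MISSING
    return value

ok_value = validate("rel_humidity_pct", 45.0)   # within both ranges
bad_value = validate("air_temp_C", 72.0)        # outside the sensor range
```

A value that passes the sensor-limit check but fails the physical-limit check (or vice versa) is replaced all the same; the checks are sequential and independent.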

After the above validity checks, AZMet manually reviews recent data for measured variables multiple times per week. Manual review detects faulty sensors that nonetheless record valid values, such as a wind vane that reports the same average wind direction over many hours. It also detects deviations from broader-scale conditions by comparing values and temporal variations at an individual station with those at nearby stations. The latter check is most relevant for variables that are relatively spatially homogeneous, such as air and soil temperature, relative humidity, solar radiation, and average wind speed.
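
The stuck-sensor case, a wind vane reporting an unchanging average direction, lends itself to a simple flat-line check. The function and the six-hour threshold below are hypothetical illustrations, not part of AZMet's review tooling.

```python
# Hypothetical flat-line check: flag a sensor whose hourly values do not
# change over a sustained window, e.g. a wind vane reporting the same
# average direction for many consecutive hours.

def flatline_hours(values: list[float], tolerance: float = 0.0) -> int:
    """Length of the longest run of consecutive values within `tolerance`."""
    longest = run = 1
    for prev, curr in zip(values, values[1:]):
        run = run + 1 if abs(curr - prev) <= tolerance else 1
        longest = max(longest, run)
    return longest

# Hourly average wind direction (degrees); first six hours are identical.
wind_dir = [212.0, 212.0, 212.0, 212.0, 212.0, 212.0, 215.0, 218.0]
suspect = flatline_hours(wind_dir) >= 6   # assumed 6-hour threshold
```

In practice the values would still pass the automated range checks, which is why this pattern is caught by review rather than by the sensor-limit comparison.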

Whenever we write a missing data value to the database for a variable recorded in the datalogger at an individual station, an automated process writes a corresponding missing data value for any derived variables.
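
Propagating a missing value to derived variables can be sketched with a simple dependency map. The variable names, dependencies, and sentinel below are assumed for illustration; AZMet's actual derived variables and schema may differ.

```python
# Sketch of propagating a missing measured value to derived variables.
# The dependency map and variable names are assumed placeholders.

MISSING = -9999.0  # assumed missing-data sentinel

DERIVED_FROM = {   # derived variable -> measured variables it requires
    "vapor_pressure_deficit": ["air_temp_C", "rel_humidity_pct"],
    "heat_index": ["air_temp_C", "rel_humidity_pct"],
    "wind_run": ["wind_speed_ms"],
}

def propagate_missing(record: dict) -> dict:
    """Set a derived variable to MISSING if any of its inputs is MISSING."""
    out = dict(record)
    for derived, inputs in DERIVED_FROM.items():
        if any(out.get(v) == MISSING for v in inputs):
            out[derived] = MISSING
    return out

record = {"air_temp_C": MISSING, "rel_humidity_pct": 20.0,
          "wind_speed_ms": 2.1, "vapor_pressure_deficit": 4.8,
          "heat_index": 32.0, "wind_run": 181.4}
cleaned = propagate_missing(record)
```

Here the missing air temperature invalidates both derived variables that depend on it, while wind run, which depends only on wind speed, is unaffected.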

In cases of a problematic sensor, or of a functional sensor falling out of range during a sensor-comparison visit, we manually review the corresponding data and overwrite them with estimated values where prudent, or otherwise with a corresponding missing data value. Estimated values typically are based on regression against data from nearby stations and are limited to spatially homogeneous variables.
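
A minimal sketch of regression-based estimation, assuming a simple least-squares fit against one nearby station; the station data, variable, and fill value are hypothetical, and an actual procedure might use multiple stations or more robust methods.

```python
# Illustrative gap filling via ordinary least-squares regression on a nearby
# station's concurrent data. All values here are hypothetical.

def linear_fit(x: list[float], y: list[float]) -> tuple[float, float]:
    """Return slope and intercept of the least-squares line y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

# Paired hourly air temperatures (deg C) from a nearby station (x) and the
# problem station (y), drawn from a period when both sensors were healthy.
nearby = [18.0, 21.0, 25.0, 29.0, 31.0]
station = [17.2, 20.4, 24.6, 28.8, 30.9]

slope, intercept = linear_fit(nearby, station)
estimate = slope * 27.0 + intercept  # fill a gap when the nearby site read 27.0
```

Because the fit assumes the two stations track each other closely, this approach is only prudent for spatially homogeneous variables, as noted above.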

Whenever we replace data with missing or estimated values, a new value version for the variable is created in the database. We do not delete overwritten values.
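
The append-only versioning policy can be sketched in miniature. This is a conceptual illustration only; AZMet's database schema is not described in this document, and the class below is an assumption, not its implementation.

```python
# Minimal sketch of append-only value versioning: overwriting a value
# appends a new version rather than deleting the old one.

class VersionedValue:
    def __init__(self, value: float):
        self.versions = [value]          # version 1 is the original reading

    def overwrite(self, new_value: float) -> None:
        """Append a new version; earlier versions remain in the history."""
        self.versions.append(new_value)

    @property
    def current(self) -> float:
        return self.versions[-1]

v = VersionedValue(38.4)      # original sensor reading
v.overwrite(-9999.0)          # replaced with a missing data value
```

The full version history is retained, so an overwritten reading can always be recovered for audit or reanalysis.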

AZMet carries out data collection, processing, storage, provision, and backup on secure servers. Data are not available to the public directly from individual stations, as access to station dataloggers is protected by passwords and two-factor authentication.

Improving Data QA/QC

AZMet continues to refine activities and procedures that support data QA/QC, including further development of automated review, more descriptive flagging of erroneous or potentially problematic data, and techniques for correcting inaccurate values.