Australian Government - Bureau of Meteorology

Surface Based Observations Section

(formerly Networks and Measurements Section)

Automatic Weather Stations for Agricultural
and Other Applications

Year of publication: 1995 (Updated 2005)


PURPOSE OF THIS DOCUMENT

This leaflet has been written to assist members of the agricultural community and other groups interested in environmental monitoring in the purchase and operation of Automatic Weather Stations (AWS). It provides a range of general, technical and meteorological advice which will assist current and potential AWS users.


WHY INSTALL AWS?

AWS have a number of advantages over conventional manual recording.

In general:

  • AWS are more consistent in their measurement
  • AWS provide data at a significantly greater frequency (some provide data every minute)
  • AWS provide data in all weather, day and night, 365 days per year
  • AWS can be installed in sparsely populated areas

However, AWS suffer a number of disadvantages. These are:

  • Some elements are difficult to automate (e.g. cloud cover)
  • AWS require a large capital investment
  • AWS are less flexible than human observers

WHAT ARE THE BASIC REQUIREMENTS FOR AN AWS?

Different users have different requirements:

  • Some AWS are installed for short-term projects (e.g. animal health emergency monitoring or near wildfires), while others serve long-term projects (e.g. studying climate change)
  • Some AWS must provide data in real time (e.g. for irrigation), while others can provide delayed reports (e.g. for climate monitoring)
  • Some AWS must perform in all weather (e.g. for cyclone forecasting), while others need not (e.g. crop disease monitoring).

One requirement is common to all of the above users: the data must be representative of the area and time period under investigation, and must continually meet the required accuracy. In addition, the data collection and storage systems must be cost effective; these, too, should be considered before an AWS is purchased.

WHAT NEEDS TO BE CONSIDERED WHEN PLANNING
FOR AN AWS (OR AWS NETWORK)?

The issues of representativeness, accuracy, collection, and storage may conveniently be broken down into the following topics:

  • siting
  • sensors
  • algorithms
  • maintenance
  • documentation
  • data formats and communications
  • archiving and retrieval
  • cost

These issues are discussed in more detail in the following sections.

SITING

Spatial Representativeness

The AWS should be sited so the variables measured are representative of the area of interest. Subtle variations in exposure may mean that the data are not representative.

Three examples will suffice:

  • rainfall collection efficiency varies with height, due to wind turbulence effects (e.g. rain measured at 1 m above ground level is only 97% of that measured at 300 mm)
  • temperatures measured over a bitumen surface are significantly different to those measured over a grass surface
  • wind speed measured at 3 m is significantly less than wind speed measured at 10 m (the wind directions also differ)
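The wind-speed example can be illustrated with the standard neutral logarithmic wind profile. The sketch below is an illustration only, not a Bureau method; the roughness length of 0.03 m (typical of open grassland) is an assumption:

```python
import math

def wind_at_height(u_ref, z_ref, z, z0=0.03):
    """Scale a wind speed measured at height z_ref (m) to height z (m)
    using a neutral-stability logarithmic profile.  z0 is the surface
    roughness length in metres; 0.03 m is an assumed value typical of
    open grassland."""
    return u_ref * math.log(z / z0) / math.log(z_ref / z0)

# A 10 m/s wind measured at 3 m corresponds to roughly 12.6 m/s at the
# standard 10 m observation height over open grassland.
u10 = wind_at_height(10.0, z_ref=3.0, z=10.0)
```

The roughly 25% difference shows why wind speeds from non-standard mast heights cannot be compared directly with standard 10 m observations.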

To ensure consistency between sites, an Australian set of standards for the physical siting of the instruments, Bureau Specification 2013, has been developed using the World Meteorological Organization's (WMO) guidelines.

Temporal Representativeness

In addition to difficulties with the correct exposure of instruments, thought has to be given to changes in the long-term exposure of the site. Buildings in close proximity to the instrument enclosure will result in the area of representativeness being reduced.

For example, when the instrument enclosure at Sydney was installed in 1788, the instruments were representative of a relatively wide area around Sydney. With subsequent construction of high-rise buildings and freeways, climatic and meteorological conditions only 50m from the site are now significantly different to those at the site.

It is important that the station be inspected regularly and any changes in the siting are properly documented.

The Bureau of Meteorology's Field Operations Group (and the Regional Observations Sections in each State) can provide advice regarding instrument exposure and site inspection procedures. A list of contact addresses is included at Appendix 1.

SENSORS

The sensors used on an AWS are the heart and soul of the system. Therefore a great deal of care should be taken when choosing sensors appropriate to the user's requirements.

The Bureau's standard AWSs use sensors to monitor temperature, humidity, wind speed and direction, pressure and rainfall. Various advanced sensors are available for specialised applications. These sensors can monitor cloud height (ceilometer), visibility, present weather, thunderstorms, soil temperature (at a range of depths) and terrestrial temperature. The Bureau is also investigating other types of systems such as automated evaporation.

The quality of the final data received by the researcher or farmer can only be as good as the quality of the sensors used. No post analysis of the data can improve the accuracy or reliability of the information obtained.

Many AWS manufacturers use sensors which have poor accuracy, and whose calibration may drift significantly over a short time. Some sensors, particularly cheap ones, are also prone to premature failure.

The manufacturer's sensor specifications should be read very carefully, as they can be misleading and the manufacturer's claims often cannot be replicated in the laboratory. For example, a manufacturer may quote the response time of a humidity sensing element but not the combined response time of the element, electronics and filter, which can be orders of magnitude longer. Similarly, a manufacturer may quote an accuracy for a device such as a pressure sensor without giving the confidence limits of that specification. These omissions can make a large difference to the suitability of the device.

There are a number of fundamental characteristics which make up the accuracy and precision of a sensor.

  • Resolution - the smallest change the device can detect (this is not the same as the accuracy of the device).
  • Repeatability - the ability of the sensor to measure a parameter more than once and produce the same result in identical circumstances.
  • Response time - normally defined as the time the sensor takes to measure 63% of the change.
  • Drift - the stability of the sensor's calibration with time.
  • Hysteresis - the ability of the sensor to produce the same measurement whether the phenomenon is increasing or decreasing.
  • Linearity - the deviation of the sensor from ideal straight line behaviour.

All of these factors go into defining the accuracy and precision of a sensor but some are more important in particular situations than others. For example, for monitoring climatic temperature changes a significant amount of data is collected over a long period therefore a sensor is required which has very little drift. However if you want to measure short term wind gusts then the repeatability of the device and the response time become more important.
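Drift, in particular, can be estimated from routine calibration checks. The sketch below fits a least-squares slope to a hypothetical probe's calibration offsets; the figures are illustrative, not from any real sensor:

```python
def drift_rate(calibrations):
    """Estimate a sensor's calibration drift, in units per year, from a
    list of (days_since_install, measured_offset) calibration checks,
    by fitting a least-squares straight line."""
    n = len(calibrations)
    mean_t = sum(t for t, _ in calibrations) / n
    mean_o = sum(o for _, o in calibrations) / n
    num = sum((t - mean_t) * (o - mean_o) for t, o in calibrations)
    den = sum((t - mean_t) ** 2 for t, _ in calibrations)
    return num / den * 365.0  # per-day slope scaled to per-year

# A hypothetical temperature probe checked at installation and then
# yearly, drifting upward by about 0.05 degC per year:
checks = [(0, 0.00), (365, 0.05), (730, 0.11)]
rate = drift_rate(checks)
```

A drift rate of this size would exhaust a 0.3 degC error budget within a few years, which is why climate applications demand low-drift sensors and regular recalibration.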

Another factor to consider is the robustness of the device. As a general rule, these devices are installed in harsh environments. This requires the sensors to be well designed and constructed, have strong waterproof housings for the electronics and be able to withstand extremes of climate variability. It is counterproductive to install a lightweight wind vane that will break the first time a sparrow sits on it, or to use a sensing device designed for laboratory use (e.g. many humidity probes) in a dusty environment. Frequent replacement of lightweight or unreliable instruments can end up costing more than sturdier, more expensive counterparts would. The swapping of sensors can also have a significant effect on the quality of data, frequently introducing discontinuities into a data series.

The usefulness of the data obtained from a sensor is heavily dependent on the calibration of the sensor. For data to be comparable with other sites and networks, the calibration of sensors needs to be traceable back to common standards. This is often difficult to establish, particularly with cheaper sensors, but is of equal importance regardless of the quality of the sensor.

The easiest way to ensure that the calibration is reliable is to buy sensors from a NATA-certified supplier or to have the purchased sensor independently calibrated by a certified laboratory. The other way is to spend some time establishing with the manufacturer the traceability of the standards used by the company. One must not assume a company certified to calibrate rain gauges is certified to calibrate temperature probes as well.
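Once a sensor has been compared against a traceable reference, the comparison can be applied as a correction. The two-point linear fit below is a simple sketch; the readings are hypothetical:

```python
def linear_correction(readings):
    """Fit a gain-and-offset correction (true = gain * reading + offset)
    from paired (sensor_reading, reference_value) comparisons against a
    traceable reference.  A two-point fit, using the first and last
    pairs, is the simplest case."""
    (x1, y1), (x2, y2) = readings[0], readings[-1]
    gain = (y2 - y1) / (x2 - x1)
    offset = y1 - gain * x1
    return gain, offset

# A hypothetical probe reading 0.4 degC low at 0 degC and 0.2 degC low
# at 40 degC against a reference thermometer:
gain, offset = linear_correction([(-0.4, 0.0), (39.8, 40.0)])
corrected = gain * 20.0 + offset   # correct a field reading of 20.0
```

Recording the correction alongside the raw data, rather than silently overwriting readings, preserves the traceability chain.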

Integral to the sensor and its calibration is sensor maintenance. No sensor exists that does not need to be cleaned and checked to verify its calibration. It is important that a maintenance program periodically reassesses the calibration of all sensors; otherwise the data quality will degrade.

The Bureau's Regional Instrument Centre publishes reports on a variety of sensors and can provide copies of these reports or advice on the suitability of sensors. A number of the Centre's Instrument Test Reports are listed in Appendix 2.

ALGORITHMS

The algorithms used to derive meteorological variables should be meaningful, documented, and comparable between networks.

For example, the maximum temperature derived from one second readings can be quite different to a maximum temperature derived from hourly readings, wind gusts based on one second readings will be significantly greater than gusts based on three second readings, and scalar averaging of wind direction generally produces meaningless results.
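The wind direction point can be demonstrated in a few lines. The sketch below contrasts naive scalar averaging with unit-vector averaging, the approach generally recommended for directional data (an illustration, not the Bureau's A2669 algorithm):

```python
import math

def mean_wind_direction(directions_deg):
    """Average wind directions by summing unit vectors, so that
    readings straddling north do not cancel to a spurious result."""
    x = sum(math.sin(math.radians(d)) for d in directions_deg)
    y = sum(math.cos(math.radians(d)) for d in directions_deg)
    return math.degrees(math.atan2(x, y)) % 360.0

# Two readings of 350 and 10 degrees straddle north:
scalar_mean = (350 + 10) / 2                  # 180 -- a southerly, clearly wrong
vector_mean = mean_wind_direction([350, 10])  # ~0 -- north, as expected
```

The scalar mean reports a wind from the opposite direction, which is exactly the kind of meaningless result the text warns about.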

Documenting the algorithms used, and all changes to those algorithms, is necessary for future users of the data.

It should be noted that many AWS manufacturers are unaware of the subtleties involved with the algorithms and with the meaning of the meteorological variables derived.

Bureau Specification A2669 details the algorithms used in the Bureau's AWSs. These algorithms are compatible with those recommended by the WMO, and as used by other National Meteorological Services.

The Bureau's AWS Co-ordinator, the Instrument Engineering Section and the Regional Instrument Centre are available to provide advice on processing algorithms.

MAINTENANCE

AWS should be chosen for their ease of maintenance.

Maintenance should be able to be performed on an AWS without affecting the climatological record. For example, the temperature and humidity sensors should be able to be disabled before the instrument shelter is washed.

Many of the cheaper AWS cannot be adjusted in the field and need to be returned to the manufacturer for periodic calibration. In addition, many of these AWS lack robustness and require frequent maintenance visits to replace electronics and/or sensors.

It is important to consider the lifetime costs of an AWS rather than simply the initial cost. Generally, the lower the initial cost, the higher the ongoing cost to maintain acceptable data. In the end, this may result in either a higher total cost or long periods with no useful data.
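A back-of-envelope comparison makes the lifetime-cost point concrete. All figures below are hypothetical, chosen only to illustrate the arithmetic:

```python
def lifetime_cost(initial, annual_maintenance, years):
    """Total cost of ownership over the AWS's service life -- a simple
    undiscounted sketch; real comparisons would also weigh data quality
    and downtime."""
    return initial + annual_maintenance * years

# Over ten years, a cheap station needing heavy maintenance can cost
# more than a robust one (illustrative figures, in dollars):
cheap = lifetime_cost(10_000, 4_000, 10)   # 50 000
robust = lifetime_cost(40_000, 500, 10)    # 45 000
```

And the cheap option's higher figure still excludes the cost of the data gaps incurred during each repair visit.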

The Bureau's Engineering Maintenance Section (and the Regional Engineering Services Sections in each State) can provide advice regarding the inspection and maintenance of AWS.

DOCUMENTATION

One area of observational networks which is frequently overlooked is proper ongoing documentation of equipment and siting (metadata).

Many station-years of data have been rendered useless for climate-related research due to lack of metadata showing changes in the station's immediate surroundings or instrumentation.

The initial siting of the AWS should be documented with maps and photographs. In addition, all inspection and maintenance visits should be fully documented to record any changes in representativeness and changes or errors detected in the instrumentation.

The Field Operations Group can provide information concerning inspection procedures. Most of the station metadata is stored in SitesDb, a comprehensive database of information concerning sites, systems, equipment and history for 12 000 sites around the country and offshore. The Networks Operations Group is able to provide information on these metadata.

DATA FORMATS AND COMMUNICATIONS

Output Format

Careful thought must be given to the output data format. Ideally, the format used should be:

  • Flexible - so new sensors can be added without having to re-process all the stations records into the new format
  • Simple - such that only simple programming is required to decode the data
  • Preferably human-readable without reformatting - to assist in the quality monitoring of the data
  • Independent of AWS manufacturer - to allow data to be easily exchanged between agencies and to encourage cost competitiveness between manufacturers
  • Unambiguous - features such as check-sums minimise the possibility of undetected data corruption

The use of standard data formats permits easy exchange of data between agencies, and processing with a minimum of reformatting.
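The check-sum idea mentioned above is simple to implement. The sketch below uses a two-digit XOR check-sum in the style of NMEA 0183 sentences; the record layout and station fields are invented for illustration and are not the Bureau's A2669 format:

```python
def append_checksum(record: str) -> str:
    """Append a two-digit hexadecimal XOR check-sum to a plain-text
    record, separated by '*', in the style of NMEA 0183 sentences."""
    csum = 0
    for ch in record:
        csum ^= ord(ch)
    return f"{record}*{csum:02X}"

def verify(line: str) -> bool:
    """Return True if the line's check-sum matches its payload."""
    record, sep, _ = line.rpartition("*")
    return sep == "*" and append_checksum(record) == line

# A hypothetical human-readable AWS record:
msg = append_checksum("AWS,086071,20050601T0900,T=18.4,RH=62")
```

A receiver that checks the trailing digits can reject records corrupted in transmission, rather than silently archiving bad values.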

Most AWS manufacturers use their own proprietary data formats. Their use reduces the user's ability to exchange data to/from other agencies and reduces the AWS owner's flexibility to add AWS of another manufacturer.

The Bureau of Meteorology AWS generate five standard data formats. They are the one second format (for maintenance and real-time read-outs), one minute format (data logging, display), ten minute format (data logging), thirty minute format (forecasting), and three hourly format (international exchange and archiving).

Bureau Specification A2669 details the five data formats. These data formats are compatible with WMO standards and are used by a number of other National Meteorological Services. The Bureau's Networks & Codes Unit can provide advice on data formatting protocols.

Communications

The communications between the AWS and the collection agency should be:

  • Reliable
  • Inexpensive
  • Based on standard protocols

AWSs report observations via a variety of channels, including telephone lines, radio modems, mobile phone networks and satellite links. Consideration must be given to the frequency of messages, cost (satellite telephony can be expensive) and the availability of services.

Bureau Specification A2670 details the communication protocols used by the Bureau's AWS. This specification includes the command set whereby the user can remotely configure the AWS. The Bureau's Instrument Engineering and Communications Engineering Sections can provide advice on AWS communications protocols.

ARCHIVING AND RETRIEVAL

The archival and retrieval of AWS data must be considered.

Apart possibly from short-term projects, AWS data should be kept permanently. This requires balancing the need to store high temporal resolution data against the large volumes generated.
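The trade-off between resolution and volume can be estimated in advance. The record size below is an assumption for illustration; actual record sizes depend on the format chosen:

```python
def annual_volume_mb(record_bytes, interval_minutes):
    """Rough annual raw-data volume for one AWS, in megabytes, ignoring
    compression and transmission overhead.  Figures are illustrative."""
    records_per_year = 365 * 24 * 60 / interval_minutes
    return record_bytes * records_per_year / 1_000_000

# Assuming a 200-byte record, one-minute data generate roughly 200
# times the volume of three-hourly data:
one_minute = annual_volume_mb(200, 1)       # ~105 MB/year per station
three_hourly = annual_volume_mb(200, 180)   # ~0.6 MB/year per station
```

Across a network of hundreds of stations kept permanently, that difference determines whether the archive remains manageable.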

When deciding on a data storage system, consideration should be given to the ease of quality control and retrieval of the data. This applies equally to data stored on hard copy and to data held in electronic form.

The Bureau of Meteorology's National Climate Centre maintains the Bureau's data archives and, under certain conditions, stores data from other agencies. Before any data can be accepted into the Bureau's climatological database, the foregoing issues must be addressed.

The Bureau's Data Management Section (and the Climate and Consultancy Services Sections in each State) can provide advice regarding the requirements for data to be archived in the Bureau's climatological database. Generally, the AWS installation and operation should follow the procedures outlined in this document.

AWS COSTS

No definitive AWS costs can be given as each user has different requirements, as noted above. Each AWS purchase needs to be considered in the context of these requirements. As an analogy, definitive transport costs cannot be given as a user's requirements may vary from a bicycle to an aircraft.

As an indication of costs, an AWS with sensors for air temperature, humidity and rainfall, and conforming to Bureau standards, can be purchased for approximately $40 000. It is also important to consider costs beyond the installation itself. Regular inspection of the site, sensor performance testing, holding sufficient spares in stock, retaining trained technical personnel and monthly communications costs all contribute to keeping an AWS running smoothly.

For a given quality of data, the general rule is: the cheaper the AWS, the less accurate the data, the more prone are the electronics and sensors to failure, and the higher the maintenance costs.

The Bureau's Regional Instrument Centre and Instrument Engineering Section are able to advise on the relative merits of a range of Automatic Weather Stations and can assist in the appropriate choice to suit individual needs.

BUREAU OF METEOROLOGY'S AWS NETWORK

The Bureau of Meteorology operates a network of over 550 AWS across Australia, as shown in Figure 1. The AWS are designed to serve the dual purposes of providing real-time data for the Bureau's forecasting, warning, and information services, as well as high quality data for the Bureau's climate database.

Figure 1.
AWS network map


AWS operated as part of the Bureau network are maintained to the accuracies detailed in Table 1 by maintenance visits at least twice a year.

The requirement to operate in extreme conditions, such as the Antarctic Winter and Tropical Cyclones, means the AWS are constructed to high standards; the requirement for high quality climate data means the AWS are rigorously inspected on a frequent basis. Satisfying both these requirements means the Bureau's AWS are more expensive to install and operate than most other AWS networks. However, for long term use they are more economical. They also allow for the efficient archiving of long term data.

A number of important sites are equipped with a personal computer (MetConsole) to enable observers to add information which cannot conveniently be provided by the AWS (e.g. cloud conditions, evaporation).

Table 1.

Sensor                 Range          Accuracy   Unit
Air pressure           750 to 1060    ±0.3       hPa
Air temperature        -25 to +60     ±0.3       °C
Wet-bulb temperature   -25 to +60     ±0.3       °C
Relative humidity      2 to 100       ±3         %
Wind speed             2 to 180       ±2         knots
Wind direction         0 to 359       ±5         degrees
Rainfall               0 to 999.8     ±2%        mm
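The operating ranges in Table 1 also serve as a first quality-control filter: a reading outside its sensor's range is physically implausible or out of specification. The sketch below is a minimal range check only, one small part of a real QC scheme; the element names are invented for illustration:

```python
# Operating ranges (min, max) taken from Table 1.
RANGES = {
    "pressure_hpa":     (750.0, 1060.0),
    "air_temp_c":       (-25.0, 60.0),
    "rel_humidity_pct": (2.0, 100.0),
    "wind_speed_kn":    (2.0, 180.0),
    "wind_dir_deg":     (0.0, 359.0),
}

def qc_flags(obs: dict) -> list:
    """Return the names of any observed elements that fall outside
    their Table 1 operating range, flagging them as suspect."""
    return [name for name, value in obs.items()
            if name in RANGES
            and not (RANGES[name][0] <= value <= RANGES[name][1])]

# An air temperature of 72 degC is outside the sensor's range:
suspect = qc_flags({"air_temp_c": 72.0, "pressure_hpa": 1013.2})
```

Checks like this catch gross failures; subtler errors (drift, exposure changes) require the calibration and metadata practices described earlier.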


AWS configuration

Figure 2. shows the configuration of a typical Bureau AWS and its communication links to other agencies.


Appendix 2. - Selected list of Instrument Test Reports

ITR Author Issue Date Title Manufacturer Model
469   17/12/59 Electrical Anemometer Munro Cup
499 S. Reynolds & (H.N.Brann) 16/2/62 Examination and calibration of precision Aneroid Barometers (S/N 122/59) Type M1847 and (S/N 272/61) Type M2016 manufactured by Mechanism Ltd Mechanism Ltd DA M1847 & M2016
511 D.E.Handcock (H.N.Brann) Oct 63 Comparison of speeds of response of Dines anemograph and electrical recording cup generator anemometer Dines  
533 S. Reynolds (H.N.Brann) June 65 Portable Barometer with Digital Display of pressure (Digital Aneroid Barometer CBM Ident #5059) Range 900 to 1050 mb Mechanism Ltd  
553 (M.Cassidy) Aug 66 Marquis 4 inch plastic raingauge Marquis  
592 A.F.Young Aug 81 Performance of metric cup-counting anemometers   Cup
595 A.F.Young   Performance of the 150 knot Dines anemograph Dines 150knot
596 A.F.Young 9/1/86 The performance of the 127mm raingauge converter    
599 A.F.Young 3/9/87 Performance testing on Vaisala PA11 digital barometers Vaisala PA11
600 A.F.Young Dec 87 Effects on the performance of a 150 knot Dines anemograph Dines 150knot
601 M.Brunt Oct 88 Comparison of a Monitor thermometer screen with a standard Stevenson screen Monitor Screen
605 A.F.Young 26/1/89 Evaluation of a Monitor Sensors model RGD-01 tipping bucket raingauge Monitor RGD-01
606 A.F.Young 16/6/89 Performance Tests on Monitor Sensors Type RGD-01 Rain Gauge Monitor RGD-01
607 A.F.Young Sep 89 Heavy rainfall capabilities of Monitor Sensors and RIMCO type tipping bucket raingauges Monitor, RIMCO RGD-01, R/TBR-8
608 M.Brunt 29/3/90 Performance tests on Hydrological Services type T.B.2 raingauge Hydrological Services T.B.2
609 M.Brunt 6/4/90 Performance tests on McVan Pty Ltd 203mm Diameter Raingauge McVan  
617 M.Brunt (G.Dowling) 15/11/91 Comparison of the Regional Association V and a Fuess (s/n F6754) standard barometer for the Philippines Atmospheric, Geophysical and Astronomical Services Administration.    
621 J.O.Warne 10/9/92 Evaluation of G.H.Zeal ordinary, minimum and maximum thermometers Zeal  
622 K.Gregory (J.O.Warne) 10/9/92 Setra 470 pressure transducer compliance tests for AWS applications Setra 470
624 K. Gregory (J.O.Warne) 13/1/93 Calibration of Six Rosemount Thermometers Rosemount  
625 P.N.Huysing (J.O.Warne) 15/5/93 Calibration of an ELPRO 12 inch TBRG ELPRO Alert
627 K.Gregory, P.N.Huysing (J.O.Warne) 15/7/94 Calibration of Paroscientific Model 205 Pressure Sensor for use at Heard Island Paroscientific 205
628 P.N.Huysing (J.O.Warne) 5/4/93 Calibration of a Qualimetrics 12 inch TBRG Qualimetrics M103657
629 P.N.Huysing (J.O.Warne) 5/4/93 Calibration of a Monitor RGD-01 TBRG Monitor RGD-01
630 P.N.Huysing (J.O.Warne) 5/4/93 Calibration of a Hydrological Services TB-3 TBRG Hydrological Services TB-3
632 P.N.Huysing, B.S.Keating (J.O.Warne) 18/6/93 Calibration of AIR hand held barometer AIR-HB-1A AIR HB-1A
633 P.N.Huysing (J.O.Warne) 18/6/93 Anemometer Test Report Syncrotac Cup, Bendix 120, R.M.Young 05103, Environdata WS-1
635 J.O.Warne, K.Gregory, P.N.Huysing (J.O.Warne) 31/1/94 Evaluation of Hydrological Services TB3/A Tipping Bucket Raingauge Hydrological Services TB-3A
636 P.Huysing (J.O.Warne) 15/7/94 Clock Drift Test of Hydrological Services Automatic Rainfall Data logger Hydrological Services Prototype
637 J.O.Warne 21/10/94 Summary report on ALMOS AWS ALMOS
638 J.O.Warne 21/10/94 Evaluation of Qualimetrics Model 6011-B TBRG Qualimetrics 6011-B



For all enquiries regarding AWS, please use our feedback service


© Copyright Commonwealth of Australia 2009, Bureau of Meteorology (ABN 92 637 533 532)