Techniques such as uncertainty analysis and data reconciliation are cost-effective methods of improving the effectiveness of network monitoring and the interpretation of measurement data.
With increasing legislative and environmental pressure on the water industry, maintaining and developing the flow metering infrastructure has never been more important. This is particularly true for water balance calculations and for the detection and estimation of leakage.
A significant proportion of modern flow meters rely on assumptions about the flow profile in the pipe. Bends, valves and other pipe components upstream of the measurement device will distort the assumed flow profile and degrade the accuracy of the meter. For example, swirl in the flow will act on the rotor of a turbine meter and, depending on the direction of the swirl, will cause an under- or over-reading.
The water industry would gain real benefits from adopting the oil and gas industry's practice of placing rigorous uncertainty analysis at the heart of network monitoring procedures. Developing uncertainty budget tables for metering systems would allow a water company to go beyond saying: “Our distribution input is 950 ± 30 Ml/day.” It would instead identify the main contributors to the overall uncertainty, including calibration uncertainty and installation effects. By identifying these key factors, the company could target expenditure where it will produce the most benefit.
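The idea of an uncertainty budget can be sketched as follows: each contributor is listed with its standard uncertainty, the independent components are combined in quadrature (root-sum-square), and a coverage factor converts the result into a quotable ± figure. The components and percentage values below are purely illustrative assumptions, not figures from any real metering system.

```python
import math

# Hypothetical uncertainty budget for a distribution-input metering system.
# Each entry: (contributor, standard uncertainty as % of reading).
budget = [
    ("Calibration", 1.0),
    ("Installation effects (upstream bends, swirl)", 2.5),
    ("Data acquisition / signal conversion", 0.5),
]

def combined_uncertainty(components):
    """Root-sum-square combination of independent standard uncertainties."""
    return math.sqrt(sum(u ** 2 for _, u in components))

flow = 950.0                      # Ml/day, the distribution input in the example
u_pct = combined_uncertainty(budget)
k = 2                             # coverage factor for ~95% confidence
expanded = k * u_pct / 100.0 * flow

print(f"Combined standard uncertainty: {u_pct:.2f} % of reading")
print(f"Expanded uncertainty (k=2): ±{expanded:.0f} Ml/day")
```

Laid out this way, the budget makes the dominant term obvious: here the assumed installation effects dwarf calibration, so straightening the installation, not recalibrating the meter, is where spend would pay off.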
In the last few years, the availability of inexpensive computing power and measurement databases has enabled the development of powerful data analysis techniques that allow metering networks to be monitored on a daily basis. Such techniques can give operators detailed information about meter performance and leakage, and are much more effective than the traditional water balance calculation over the distribution network. The most appropriate techniques that may be applied over an inter-connected network of measurements include the following:
Data reconciliation – a self-consistency check that is applied across the whole or part of the metering network. It allows the network operator to identify meters that are either malfunctioning or have drifted out of calibration.
Data filtering – a numerical technique that identifies gross errors and removes outliers by comparing readings against an expected range of flows.
Averaging or smoothing algorithms – a range of techniques that allow the comparison of data averaged or aggregated over a period of time, such as a day or a week. These are especially effective when comparing seasonal demand patterns for water.
Machine learning – computer-based techniques such as genetic algorithms and neural networks that ‘learn’ the behaviour of the measurements over a period of time and predict demand, as well as highlighting anomalous measurement data.
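The first of these, data reconciliation, can be illustrated with a minimal sketch. Assume a toy network (entirely hypothetical meter readings and uncertainties): one input meter feeding two district meters, so mass balance requires the input to equal the sum of the outputs. A standard weighted least-squares adjustment redistributes the imbalance across the meters in proportion to their uncertainties, and the size of each adjustment flags meters that may have drifted.

```python
import numpy as np

# Hypothetical network: input meter m0 feeds district meters m1 and m2,
# so the mass balance constraint is m0 - m1 - m2 = 0.
measured = np.array([100.0, 58.0, 45.0])   # raw readings, Ml/day
sigma = np.array([1.0, 0.8, 0.8])          # standard uncertainty of each meter

A = np.array([[1.0, -1.0, -1.0]])          # constraint matrix: A @ x = 0
V = np.diag(sigma ** 2)                    # measurement covariance

# Weighted least-squares adjustment making the readings self-consistent:
#   x_hat = x - V A^T (A V A^T)^-1 (A x)
imbalance = A @ measured                   # non-zero: readings disagree
x_hat = measured - V @ A.T @ np.linalg.solve(A @ V @ A.T, A @ measured)

print("imbalance before:", imbalance)
print("reconciled flows:", x_hat)
print("imbalance after :", A @ x_hat)      # ~0 after reconciliation

# Adjustments that are large relative to a meter's own uncertainty
# point at instruments that may have drifted out of calibration.
print("normalised adjustments:", (x_hat - measured) / sigma)
```

The reconciled flows satisfy the balance exactly, and a persistently large normalised adjustment on one meter is the self-consistency warning described above.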
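Data filtering and averaging can likewise be sketched in a few lines. The example below (hypothetical hourly readings, with a robust median-based rule standing in for whatever expected-range check an operator actually uses) flags a telemetry spike as a gross error and aggregates hourly readings into daily means of the kind used for seasonal comparison.

```python
from statistics import mean, median

def filter_gross_errors(flows, k=5.0):
    """Split readings into (kept, outliers) using an expected range built
    from the median and median absolute deviation, which are themselves
    robust to the outliers being hunted."""
    m = median(flows)
    mad = median(abs(x - m) for x in flows)
    lo, hi = m - k * mad, m + k * mad
    kept = [x for x in flows if lo <= x <= hi]
    outliers = [x for x in flows if x < lo or x > hi]
    return kept, outliers

def daily_averages(flows, window=24):
    """Aggregate hourly readings into daily means for seasonal comparison."""
    return [mean(flows[i:i + window]) for i in range(0, len(flows), window)]

# Hypothetical hourly flow readings containing one telemetry spike:
readings = [52.1, 51.8, 52.4, 250.0, 51.9, 52.2, 52.0, 51.7]
kept, outliers = filter_gross_errors(readings)
print("outliers removed:", outliers)   # [250.0]
print("clean readings  :", kept)
```

A simple mean-based range would have been dragged upward by the spike itself; using the median keeps the expected range anchored to normal operation, which is why robust statistics are the usual choice for gross-error detection.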
Using data well
Optimising data use is an operational imperative, especially to water companies under environmental, regulatory and resource pressure. Many have recently made substantial investment in new infrastructure, data control systems and general data acquisition. The use of condition monitoring to detect problems in key plant such as pumps and filters is now routine; data analysis is the metering equivalent.
Failure to protect metering investment by complementing it with modern, cost-effective data analysis techniques risks increased capital and operational expenditure through poor targeting of effort.