Big Data and Continuous Monitoring: The Trade-Offs

Steve Wilson

With the projected exponential growth in IoT devices (IDC forecasts 28.1 billion connected IoT devices globally by 2020), remote monitoring technologies will increasingly become the norm for the environmental sector. Here are the trade-offs between big data and continuous monitoring.

The growth in IoT devices will also bring a huge increase in the overall amount of data generated, particularly when one considers that the dominant monitoring technique within the industry is still single ‘spot’ monitoring, either with handheld instruments or via samples sent to a laboratory.

With climate change and air pollution increasingly dominating the political and social agenda (check out the UN Global Pulse website for tonnes of data on how the conversation is accelerating online), the benefits of better data are increasingly obvious to all.

‘Better Data’ vs ‘Bigger Data’

There is a difference between ‘better data’ and ‘bigger data’. Gathering large datasets, while technically feasible, can add to technology costs and encumber customers with additional regulatory burdens, thus inhibiting adoption.

Like most things in life, there is always a trade-off to be made. In this blog, we will seek to articulate those trade-offs and address some of the major issues around moving to continuous or high-frequency monitoring.

What is ‘Continuous Monitoring’?

In our experience, ‘continuous monitoring’ in the context of environmental monitoring generally means setting a sampling interval shorter than the timescale over which the process itself changes, so that no meaningful change is missed between samples.

In the landfill sector, for example, this is often described to us as 60-minute sample intervals for gas monitoring, while 15-minute intervals are common for faster-moving processes such as leachate and surface water monitoring.

If you consider that many of these processes are currently spot-monitored, with a sample taken perhaps once per week, then hourly sampling represents a 168× increase in data generated in the case of gas, and 15-minute sampling a 672× increase in the case of water.
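Those multipliers follow directly from the sampling intervals; here is a quick sanity check in Python (the once-per-week baseline is the assumption stated above):

```python
# Data-volume multipliers relative to a weekly spot sample.
HOURS_PER_WEEK = 24 * 7  # 168

gas_samples_per_week = HOURS_PER_WEEK        # one sample per hour
water_samples_per_week = HOURS_PER_WEEK * 4  # one sample every 15 minutes

print(f"Gas:   {gas_samples_per_week}x weekly spot sampling")    # 168x
print(f"Water: {water_samples_per_week}x weekly spot sampling")  # 672x
```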

While those increases would be welcome if there were no inherent drawbacks, the fact is that there are significant issues to consider both from the perspective of the technology itself and from that of the compliance manager employing it.

What are the trade-offs?

#1. Power

Remote monitoring instruments generally incorporate long-life batteries or utilise solar panels to keep the instruments functioning. Given that an instrument’s power draw is generally proportional to its measurement frequency, the more data you gather, the more you shorten the instrument’s life between charges.

If the monitoring location is remote and difficult or dangerous to access, then reducing the need to visit the instrument to change or recharge its battery is an important consideration.
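To make the relationship concrete, here is a back-of-envelope battery model. All of the figures (capacity, sleep current, energy per sample) are illustrative assumptions, not the specifications of any particular instrument:

```python
# Hypothetical battery-life model: how sampling frequency drives the
# interval between site visits. All figures are illustrative assumptions.

def battery_life_days(capacity_mah: float,
                      idle_draw_ma: float,
                      mah_per_sample: float,
                      samples_per_day: float) -> float:
    """Days until a battery of `capacity_mah` is exhausted."""
    daily_drain = idle_draw_ma * 24 + mah_per_sample * samples_per_day
    return capacity_mah / daily_drain

# Hourly vs. six-hourly sampling on the same (assumed) hardware.
for samples in (24, 4):
    days = battery_life_days(capacity_mah=19_000,  # e.g. a D-cell pack
                             idle_draw_ma=0.1,     # sleep current
                             mah_per_sample=5.0,   # pump + sensors + radio
                             samples_per_day=samples)
    print(f"{samples} samples/day -> ~{days:.0f} days between battery changes")
```

On these assumed numbers, moving from hourly to six-hourly sampling stretches the visit interval from roughly five months to well over two years, which is why sampling frequency is the first lever to consider.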

#2. Sensor Life-span & Performance

As with power draw, there is a direct relationship between the number of measurements undertaken and a sensor’s life-span and performance. Resolving a sensor performance issue also normally requires a site visit, and there is a cost to performing one. This issue is more serious than the power issue mentioned above, in that sensors gradually drift and degrade in performance rather than simply toggling from working one day to not functioning the next.

The same line of argument applies to the filtration media often used in remote monitoring instruments. Other components, such as the pump, suffer wear and tear over longer running hours, which accelerates at higher sampling frequencies. It is important to recognise these hardware constraints, because the key is ultimately to choose a strategy that keeps the system live while still acquiring high-temporal-resolution data.

It should be noted, however, that systems like Ambisense’s (which incorporate a telemetry module and report live data) generally include an analytics back-end, so that the performance of the instrument can be monitored in real time and predictive maintenance and calibration schedules can be put in place to optimise site visits.
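As a minimal sketch of the kind of check such a back-end might run, the function below flags a sensor whose recent readings have drifted from its commissioning baseline. The baseline-comparison approach and the 5% tolerance are illustrative assumptions, not a description of the Ambisense analytics:

```python
# Flag gradual sensor drift from live telemetry. The tolerance and the
# use of a commissioning baseline are illustrative assumptions.
from statistics import mean

def drift_flag(baseline_readings: list[float],
               recent_readings: list[float],
               tolerance: float = 0.05) -> bool:
    """Flag the sensor for maintenance if its recent mean has drifted
    more than `tolerance` (fractional) from its commissioning baseline."""
    baseline = mean(baseline_readings)
    recent = mean(recent_readings)
    return abs(recent - baseline) / baseline > tolerance

# e.g. periodic reference-gas check readings from a CH4 sensor, normalised
commissioning = [1.00, 1.01, 0.99, 1.00]
last_week = [0.93, 0.92, 0.94, 0.92]
if drift_flag(commissioning, last_week):
    print("Schedule a calibration visit before performance degrades further")
```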

#3. Communications

Remote monitoring instruments are often attractive to operators because they minimise the cost and risks of visiting very remote locations. If those instruments incorporate telemetry modules to report back the data, then the telemetry system is normally either GSM- or satellite-based, since 3G/4G and Wi-Fi connections are not available.

Communicating through GSM and Satellite is generally much more expensive than doing so via a data connection as anyone who has moved from sending text messages on their mobile phone to using WhatsApp or Viber can attest to!

While the major telecoms companies and newer entrants like Sigfox have committed large amounts of capital to rolling out M2M and other types of radio networks (for example, LoRaWAN) specifically designed for IoT devices, the reality is that these networks will take time to deploy, and reliance on GSM/satellite is likely to remain for some time.

To give a sense of the costs incurred, a customer taking a landfill gas sample every hour and paying 5p per text would pay almost £450 per year to receive the data, whereas a customer obtaining a sample every six hours would pay just over £70. These costs are not insignificant, particularly if a client is considering rolling out large numbers of devices.
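Those figures follow directly from the stated per-text price:

```python
# SMS telemetry cost: one text per sample at 5p per text.
PRICE_PER_TEXT_GBP = 0.05
HOURS_PER_YEAR = 24 * 365  # 8,760

hourly = HOURS_PER_YEAR * PRICE_PER_TEXT_GBP          # 8,760 texts/year
six_hourly = HOURS_PER_YEAR / 6 * PRICE_PER_TEXT_GBP  # 1,460 texts/year

print(f"Hourly sampling:     £{hourly:,.2f}/year")      # £438.00
print(f"Six-hourly sampling: £{six_hourly:,.2f}/year")  # £73.00
```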

#4. Risk of measurement affecting the measurement

Some measurement techniques inherently affect the properties being measured, e.g. sampling gas from a well and venting it to atmosphere, or measuring well flow-rates through a vented orifice. The more frequent the measurement, the greater the effect, and ultimately it may compromise the very characteristics being monitored.

For this reason, Ambisense instruments return the sampled gas downstream of where it was extracted.

#5. Data overhead

Data acquisition needs to be considered alongside secure and resilient data storage. The receiver(s) of data originating from a growing fleet of sensor devices needs to be capable of handling that traffic without faults or gaps.

Databases must be able to handle the growing volume of data without requiring excessive memory or processing power, while enabling easy filtering and querying so that the user can readily pull down the datasets of interest.
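As a minimal sketch of what that looks like in practice, the example below stores readings in an indexed time-series table so that range queries stay fast as the volume grows. The schema and names are illustrative assumptions, not a description of any particular product:

```python
# Indexed time-series storage sketch; table and column names are illustrative.
import sqlite3

db = sqlite3.connect("readings.db")
db.execute("""CREATE TABLE IF NOT EXISTS readings (
                  device_id TEXT NOT NULL,
                  ts        INTEGER NOT NULL,  -- Unix epoch seconds
                  parameter TEXT NOT NULL,     -- e.g. 'CH4', 'level'
                  value     REAL NOT NULL)""")
# The composite index is what keeps "pull down one well's last month"
# fast once the table grows to millions of rows.
db.execute("""CREATE INDEX IF NOT EXISTS idx_device_ts
              ON readings (device_id, ts)""")

start_ts, end_ts = 1_500_000_000, 1_502_600_000  # query window (epoch seconds)
rows = db.execute("""SELECT ts, value FROM readings
                     WHERE device_id = ? AND parameter = ?
                       AND ts BETWEEN ? AND ?
                     ORDER BY ts""",
                  ("well-07", "CH4", start_ts, end_ts)).fetchall()
```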

#6. Regulatory perception

The frequency of the monitoring regime does not make a site any more or less compliant than it already is, but it can have a significant influence from a regulatory perspective: continuous monitoring filters out the outlier false-positive and false-negative readings that inevitably occur with sparse, infrequent manual sampling.

This ‘true picture’ of the environmental behaviour may be received differently by each party. The increased level of scrutiny may be preferable from a regulator’s perspective but may be a cause of concern for an operator.

Transient events that would otherwise not be picked up by periodic spot monitoring are flagged by continuous monitoring. Balanced interpretation of the time-series data is then needed to judge whether these qualify as compliance breaches, or whether the breach criteria themselves need to be redefined for high-frequency data, as sketched below. (The topic of regulatory perception of continuous monitoring will be the subject of an upcoming blog.)
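One illustration of what ‘balanced interpretation’ might mean: treat an exceedance as actionable only when it persists across a rolling window, so that a single transient spike is distinguished from a sustained breach. The window size and limit below are illustrative assumptions, not regulatory criteria:

```python
# Distinguish a one-sample transient from a sustained exceedance.
# Window size and limit are illustrative assumptions.
from statistics import median

def sustained_exceedance(series: list[float],
                         limit: float,
                         window: int = 4) -> bool:
    """True only if the rolling median exceeds the limit, i.e. the
    exceedance persists rather than being a one-sample transient."""
    return any(
        median(series[i:i + window]) > limit
        for i in range(len(series) - window + 1)
    )

hourly_ch4 = [0.8, 0.9, 4.2, 0.9, 0.8, 0.9]  # one transient spike
print(sustained_exceedance(hourly_ch4, limit=1.0))  # False: spike filtered out
```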

Regulatory Regime May Need to Change

So, if our goal is to produce better datasets to improve environmental compliance, then it is important that the amount of data deemed necessary helps, and does not hinder, the rollout of the very technology that will deliver that compliance.

Hardware constraints are inevitable and need to be acknowledged, although their effect diminishes as technology advances. In determining the appropriate frequency of continuous monitoring, one needs to consider the questions to be answered (why is XYZ behaving this way? Does it improve if I do this? What is the extent of the issue?) and the time-frame needed to establish the answers, while maintaining a live, coherent dataset.

In our opinion, the regulatory regime also needs to change to handle that data to help, and not hinder, those early adopters brave enough to take a chance on new technologies in the first place.