In the latest installment of our Enviro Influencers series, where we talk to people we consider key influencers in the environmental industry, we interviewed Brett Wells, asking for his thoughts on technology, regulation, and the changes and trends he sees coming.
“People are building systems to assess the quality of data in real time, and slowly more and more consultants and regulators are saying ‘this is the way to go’”
AmbiSense: Where will technology have the greatest impact on the environmental industry and why?
Brett Wells: The greatest impact will be felt across the entire industry, covering the type of environmental data that is collected and how it is collected and used. Essentially, it’s the move to big data. There are a couple of drivers for this:
- Environmental monitoring data has historically been expensive to gather and siloed in many ways. However, the falling price of sensor technology makes it possible to reduce the cost per sample by several orders of magnitude, liberating the technology in multiple ways. When you look at the scale of development in sensors and measurements, it’s really quite incredible how rapidly things are moving. You can get really good measurements from very cheap devices.
- The second driver, picking up on the business models that companies like Uber have built out, is that developments in internet technology make it incredibly easy to share the data from these sensors. You can complement a small number of reference stations with many reasonably priced instruments to achieve spatial density.
For example, look at particle monitoring in London. There are probably around 5,000 monitors. If we want to see which ones work and which don’t, we can take a third-party feed to understand that. Even with a range of instruments, we can be analysing in real time and figuring out which ones are coherent and which are incoherent.
A: To make that kind of judgement about whether a sensor is coherent or incoherent, do you think you can do that with the instruments alone, or do you always need a reference sample?
BW: It really depends on the confidence and certainty with which you want to measure. We certainly calibrate all our instruments, and we feed in reference data all the time to do that. The processes we use are fully automated and run every second of every day. Our big thing, of course, is that we want to know whether the data we are producing right now is correct, and we want to know what is going wrong if a reading looks odd. We often anchor the data to a reference instrument, which is frequently not our own equipment.
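The automated quality check Wells describes, anchoring low-cost sensors against a reference instrument, can be sketched very simply. This is a minimal illustration, not his company's actual system: the function name, window shape, and 25% bias threshold are all assumptions made for the example.

```python
# Hypothetical sketch of an automated coherence check: flag a low-cost
# sensor as "incoherent" when its recent readings drift too far from a
# co-located reference instrument. Names and thresholds are illustrative.
from statistics import mean

def is_coherent(sensor_readings, reference_readings, max_rel_bias=0.25):
    """Return True if the sensor's mean over the window stays within
    max_rel_bias (25% here) of the reference instrument's mean."""
    if len(sensor_readings) != len(reference_readings) or not sensor_readings:
        raise ValueError("need equal-length, non-empty windows")
    ref_mean = mean(reference_readings)
    if ref_mean == 0:
        return False  # cannot judge relative bias against a zero baseline
    rel_bias = abs(mean(sensor_readings) - ref_mean) / ref_mean
    return rel_bias <= max_rel_bias

reference = [10.0, 12.0, 11.0, 13.0]        # reference station window
good_sensor = [10.5, 11.8, 11.2, 12.6]      # tracks the reference closely
drifting_sensor = [15.2, 17.9, 16.8, 19.1]  # reads roughly 50% high

print(is_coherent(good_sensor, reference))      # True
print(is_coherent(drifting_sensor, reference))  # False
```

In a real network this check would run continuously on rolling windows for every device, which is what makes the "every second of every day" automation feasible.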
A: Over what time frame do you think this disruption will occur?
BW: I think it is happening now. The issue has been that these new technologies came to market over the last five to ten years, and it has been quite a significant learning process for people. In some cases the technology hasn’t worked and people have been quite disappointed. But now people are building systems to assess the quality of the data in real time, and slowly more and more of the consultants we talk to, and regulators, are saying ‘this is the way to go’. Then it’s the classic ‘let’s get a pilot going’, and invariably clients are very positive about the outcome, so I see this only increasing.
A: How do you see regulatory regimes changing as new technologies become more prevalent? The classic ‘Uber’ model is to ‘ask for forgiveness, not permission’: they went into markets with regulatory controls and effectively bypassed them. Traditionally, regulators have been perceived to act as a brake or a block on innovation. Do you think that is still the case, or are regulators changing their view?
BW: All the contact we have had with regulators in recent years has been incredibly supportive of this new technology. If you put yourself in the position of a regulator, with so much new technology coming in, you really do struggle to keep up.
So you’ve got to work in a sensible and consistent way such that people get comfortable with moving towards deeper technology integration.
Where legal certainty is required, we still rely to some extent on traditional measurements. But because the new technologies are so much more cost effective, we can generate a huge amount of additional information alongside them, with the regulator’s blessing, to better answer the question and report back to polluters much faster. In most cases regulators would much prefer to use this data to facilitate a negotiation and a quick improvement than to bear the time and expense of going to court. Increasingly, large companies and governments do not wish to be seen as ‘bad citizens’ when it comes to the environment, and indicative monitoring is doing great things to encourage and nudge people to move things along.
A: The number of new technologies on the market can make it hard for regulators to understand how people are working. Traditionally there have been standards that companies can obtain so the regulator can be sure they know what they are doing, e.g. ISO accreditation. Do you think there is space for standardisation in this part of the market, or would it just kill innovation?
BW: I think standards do come in, but they are often late. There is growth in standardisation in the area of indicative instruments. However, what you have to remember is that in this industry you are always required to demonstrate something that can stand up to challenge. So by the time standards come in, a lot has already happened. That said, there is a lot you can, and have to, do to validate equipment before a standard is developed.
A: Do you think technology will reshape incumbent business models or simply change the way existing services are delivered?
BW: I think it’s both. If I look at something like Uber, they made the industry more efficient and offered a new service, and the rest of the industry has to keep up, but there is still a place for the incumbent provider (although it might be a different place). Either way, I think it’s a fantastic market opportunity, and I believe environmental monitoring is the same. You just need to think about the information need and what a client is prepared to pay for that information. If you can answer that question more cost effectively, then it’s a winning proposition.
The historical business model has been about controlling the quality of data and that data is very expensive. The traditional reference methods are very expensive and they need a whole lot of equipment around them. But actually, if we can go out and make measurements at one tenth or even one hundredth of the price, and satisfy that information need then that’s a good thing for the user of the information.
So while I feel there is always going to be a marketplace for the traditional approach, a lot of developments are aimed either at making the traditional reference standard more sophisticated or, on the flip side, at making instruments more cost effective, to answer the information need at a lower price.
Traditionally, the costs of the service have been associated with buying and servicing the hardware, but advanced software and analytics, with the data online, actually drive those costs down. I used to manufacture instruments, and the instrument would represent around 15% of the total cost of the measurement over its life cycle, the rest being service costs.
But if we move towards this new business model, we can reduce the costs of operating equipment and still get very good valid data on a 24/7 basis as well as understanding how well the instrument is performing. All of a sudden that is a very attractive business model.
A: What is the coolest piece of tech. you’ve come across in the last 12 months (totally okay to plug your own :))?
BW: I see a huge amount of new and cool technology coming every day. What I’m most excited about is things like nutrient analysis of water, low-cost particle measurements, and a whole lot of new transducer sensors that cost around 5 euro. It’s then possible to use cheap, off-the-shelf electronics to get that data into the cloud, where you can run analytics, share the data, and work out whether you believe it or not.
In many cases we are looking at regional pictures of what is happening on a multitude of devices in real-time. It’s the whole integration of technologies which I think is really cool.
For example, I was speaking to a guy the other day who reckons he can put 100 air sensors out and calibrate them using feeds of other data he has about the environment, and then take the sensors back and check whether the calibration was correct.
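The core of a field calibration like that is typically a simple correction fitted against a trusted data feed. As a hedged illustration only (the function and data below are invented for the example, not taken from the project Wells mentions), a least-squares gain-and-offset fit might look like this:

```python
# Illustrative sketch: fit a linear correction (gain and offset) for a
# low-cost sensor against a reference data feed, then apply it to the
# raw readings. All numbers here are made up for illustration.

def fit_linear_calibration(raw, reference):
    """Least-squares fit of reference ≈ gain * raw + offset."""
    n = len(raw)
    mean_x = sum(raw) / n
    mean_y = sum(reference) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(raw, reference))
    var = sum((x - mean_x) ** 2 for x in raw)
    gain = cov / var
    offset = mean_y - gain * mean_x
    return gain, offset

# Co-location period: this hypothetical sensor reads roughly double
# the reference, so the fitted gain comes out near 0.5.
raw = [5.0, 10.0, 15.0, 20.0]
reference = [2.4, 5.1, 7.4, 10.1]
gain, offset = fit_linear_calibration(raw, reference)  # gain ≈ 0.508, offset ≈ -0.1
calibrated = [gain * x + offset for x in raw]
```

Checking the sensors afterwards, as described above, would amount to repeating the co-location and confirming the fitted gain and offset still hold.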
So it’s really exciting what you can come up with, and there is a really interesting set of propositions to build on.
So the question is, in five or ten years’ time, how much data will companies be generating and keeping themselves, versus how much will come from respected parties who can draw data from many sources and give a more holistic view of what is happening in the environment?
Effectively, then, we could share this information and, just as you rate your Uber driver, you could rate the polluters and use that social pressure to encourage companies to improve their performance.
About Brett Wells
Dr Brett Wells is Managing Director of AirQuality Ltd, a technology company combining precision measurement with IoT and big data technology to deliver quality-assured, real-time environmental data. The company operates a variety of measurement networks: reference stations, remote systems, and dense, multi-tier systems in the Asia-Pacific region. He was previously the CE of Aeroqual, commercialising new gas-sensitive semiconductor technology.