By Alastair MacLeod, CEO, Ground Control
Across the water sector, monitoring data is under increasing scrutiny. What was once collected primarily to meet compliance requirements is now being used far more actively, both operationally and publicly. Regulators expect greater transparency, and when incidents occur, the quality of monitoring data is often examined in detail.
This shift is exposing a persistent issue in many IoT deployments: the gap between data captured in the field and data that can be reliably delivered, verified, and acted upon. In remote water monitoring, connectivity often becomes the weakest link in that chain.
Measurement is not the problem
Most modern sensing technologies used in water monitoring are well understood and widely deployed. Whether measuring flow, level or water quality, the ability to capture data at the asset is rarely the limiting factor. The greater challenge lies in transmitting that data reliably.
Monitoring points are typically located in areas where connectivity is weakest. Riverbanks, reservoirs and remote discharge sites rarely benefit from consistent cellular coverage. Even where coverage exists, it can be unpredictable, affected by network congestion, terrain, weather and power availability.
For many IoT deployments, this creates a gap between what is measured and what is actually recorded centrally. Data may be buffered locally and transmitted later, or gaps may be filled through estimation. These workarounds are often accepted as part of operating in remote environments, but they introduce uncertainty into the dataset.
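The buffering approach described above can be sketched in a few lines. This is a minimal illustration, not any particular vendor's implementation: the key design choice is that each reading is time-stamped at capture, not at transmission, so late delivery shows up as late-arriving data rather than distorting the event record. The class and parameter names are assumptions for the example.

```python
import time
from collections import deque

class StoreAndForwardLogger:
    """Illustrative buffer-and-forward logger: readings keep their
    capture timestamp even when transmission is delayed."""

    def __init__(self, transmit, max_buffer=10_000):
        self.transmit = transmit        # callable: sends one reading, returns True on success
        self.buffer = deque(maxlen=max_buffer)

    def record(self, value):
        # Timestamp at capture, not at transmission, so delayed delivery
        # does not distort the event record.
        self.buffer.append({"ts": time.time(), "value": value})
        self.flush()

    def flush(self):
        # Drain oldest-first while the link is up; stop at the first failure
        # and retry on the next reading or flush.
        while self.buffer:
            if not self.transmit(self.buffer[0]):
                break
            self.buffer.popleft()
```

Note the bounded buffer: on a long outage the oldest readings are eventually dropped rather than exhausting device memory, which is itself a deliberate trade-off a real deployment would need to make explicit.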
When data is questioned, connectivity matters
That uncertainty becomes visible when monitoring data is examined more closely. If a gap appears during a critical event, or if timestamps cannot be clearly traced from capture to reporting, confidence in the data begins to erode.
At that point, the focus shifts from whether sensors are working to whether the system as a whole can be trusted. For organisations deploying IoT at scale, this becomes an architectural issue, as connectivity moves beyond a simple transport layer and becomes part of the data integrity model.
Rethinking connectivity in IoT deployments
Traditional IoT deployments in utilities have relied heavily on cellular networks, largely due to cost and availability. However, this approach assumes a level of coverage and consistency that does not always exist in practice.
Satellite IoT is increasingly being used to address this gap, not as a replacement for cellular but as a complementary layer within the overall architecture. Most monitoring use cases involve small packets of data, such as alerts, threshold breaches or status updates, where the priority is timely delivery rather than bandwidth.
Satellite connectivity provides that assurance by enabling signals to be transmitted independently of local network conditions, which is particularly important for remote or infrastructure-poor sites.
Hybrid connectivity as a design principle
The most effective deployments are moving towards hybrid connectivity models, where different networks are used for different purposes.
Satellite links are typically reserved for critical, time-sensitive data that must be delivered quickly, while cellular networks support routine data transmission, firmware updates and higher-frequency reporting where coverage allows. This approach reduces reliance on a single network, improves resilience and enables more deliberate prioritisation of data.
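The prioritisation logic behind such a hybrid model can be expressed very simply. The sketch below assumes two transport callables, `send_satellite` and `send_cellular`, standing in for whatever links a real deployment uses; the message types treated as critical are likewise illustrative.

```python
# Priority-based routing sketch for a hybrid satellite/cellular deployment.
# Message types treated as critical are an assumption for this example.
CRITICAL_TYPES = {"alert", "threshold_breach"}

def route(message, cellular_available, send_satellite, send_cellular):
    """Send critical messages over satellite regardless of cellular state;
    use cellular for routine traffic, holding it back when coverage drops."""
    if message["type"] in CRITICAL_TYPES:
        send_satellite(message)
        return "satellite"
    if cellular_available:
        send_cellular(message)
        return "cellular"
    return "queued"   # buffered for later transmission, as described above
```

The point of the sketch is the decision order: criticality is evaluated before connectivity, so a threshold breach never waits on cellular coverage returning.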
From an IoT architecture perspective, this reflects a shift towards designing for intermittent connectivity rather than assuming continuous availability.
What this looks like in practice
In a remote monitoring setup where upstream water levels determine whether a pump should activate downstream, reliance on intermittent cellular connectivity introduces a risk that the trigger signal is delayed or missed.
By introducing satellite connectivity for that specific signal, the system behaviour changes. The trigger can be transmitted rapidly when conditions are met, while downstream systems continue to use cellular networks for confirmation and additional data.
This results in a more reliable sequence of events, with each step recorded and time-stamped, improving both operational performance and the ability to audit system behaviour.
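That sequence can be made concrete with a short sketch. For illustration it condenses the upstream and downstream systems into one function; the step names, transports and pump callable are all hypothetical, and the point is that every step in the chain is recorded and time-stamped as it happens.

```python
import time

def run_trigger_sequence(level_m, threshold_m,
                         send_satellite, activate_pump, send_cellular):
    """Hypothetical trigger sequence: the critical signal travels over
    satellite, confirmation over cellular, and each step is time-stamped
    so the sequence of events can be audited afterwards."""
    log = []

    def step(name):
        log.append({"step": name, "ts": time.time()})

    step("level_read")
    if level_m >= threshold_m:
        send_satellite({"event": "activate_pump", "level_m": level_m})
        step("satellite_trigger_sent")
        activate_pump()
        step("pump_activated")
        send_cellular({"event": "confirmation", "level_m": level_m})
        step("cellular_confirmation_sent")
    return log
```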
A similar pattern is emerging in distributed monitoring systems, where data processing is pushed closer to the edge. Sensors and loggers identify threshold breaches locally and trigger alerts, while less time-critical data is transmitted as connectivity allows. This reduces dependency on continuous connections and aligns with hybrid network strategies.
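An edge-side breach detector of this kind might look like the following sketch. The class and parameter names are assumptions; the one design detail worth noting is the hysteresis band, which prevents a level hovering around the threshold from generating a stream of repeated alerts.

```python
class EdgeThresholdMonitor:
    """Illustrative edge-side monitor: alert immediately when a threshold
    is breached, batch everything else for later bulk transmission."""

    def __init__(self, threshold, hysteresis, send_alert):
        self.threshold = threshold
        self.hysteresis = hysteresis    # band below threshold that must be cleared to re-arm
        self.send_alert = send_alert    # time-critical path, e.g. satellite
        self.in_breach = False
        self.batch = []                 # non-urgent readings awaiting uplink

    def process(self, value):
        if not self.in_breach and value >= self.threshold:
            # First crossing: raise the alert on the time-critical path.
            self.in_breach = True
            self.send_alert({"event": "threshold_breach", "value": value})
        elif self.in_breach and value < self.threshold - self.hysteresis:
            # Level has dropped clearly below the threshold: re-arm.
            self.in_breach = False
        # Every reading is retained for later transmission regardless.
        self.batch.append(value)
```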
From data collection to data confidence
As IoT deployments mature, the focus is shifting from simply collecting data to ensuring that data can be trusted. This is particularly important in regulated environments, but it also applies more broadly wherever IoT data is used to support decision-making.
Reliable connectivity underpins this transition by supporting consistent timestamps, reducing data gaps and providing a clearer record of events. It also simplifies integration, as complete and time-aligned data can be more easily fed into analytics platforms, asset management systems and reporting tools without extensive reprocessing.
For organisations investing in IoT, this has a direct impact on value, as data that cannot be relied upon is difficult to use regardless of how advanced the analytics layer may be.
Designing for reliability from the outset
The increasing scrutiny of monitoring data is forcing a rethink of how IoT systems are designed. Connectivity can no longer be treated as a secondary consideration and must be built into the system architecture from the outset.
Satellite IoT plays a role in this design approach, particularly for deployments that extend beyond well-connected environments, as it provides a way to address a known point of failure and improve the reliability of critical data flows.
For many organisations, the question is not whether satellite connectivity is needed everywhere, but where it delivers the most value within a broader connectivity strategy. As IoT continues to expand into remote and distributed assets, those decisions will shape whether monitoring systems deliver not just data, but confidence in that data.
Author biography
Alastair MacLeod is CEO of Ground Control, a provider of satellite and cellular IoT solutions that connect people, assets, and machines in remote environments. He has more than 20 years of leadership in data, telecoms, and deep tech.
