…through Low-Latency Data Processing, Strategic Partnerships, and the Addition of Time-Series API

As summer approaches, electric grid operators are warning of disruptions and outages becoming more frequent. It’s a frightening thought: 12 nursing home residents died in Florida in stifling heat after Hurricane Irma knocked out their facilities’ air-conditioning units in 2017.

But utility failures aren’t just a concern during the hot months. In Texas, a winter storm that left more than 4.5 million homes and businesses without power in 2021 was blamed for hundreds of deaths.

The U.S. power grid is aging, strained, and inefficient. At the same time, demand for power to heat and cool homes is rising as extreme weather fluctuations, attributed to global warming, become more common. In addition, outdated or deficient infrastructure is a real safety concern. All of these factors highlight the need for a better real-time understanding of the health of the utility ecosystem – from production to distribution to restoration.

IoT data opportunities are exploding in the utilities segment

MarketsandMarkets reports that IoT utility market spending is expected to grow to $53.8 billion by 2024, up from $28.6 billion in 2019.

One of the big drivers of this growth is the proliferation, and huge potential, of IoT sensors.

Leveraging IoT data is seen as a way to create an infrastructure that’s powerful, efficient, and more resilient. Utility operators, for example, can detect any changes in usage levels, which can immediately help identify power/gas losses or leaks. Additionally, they can see overloading sooner. Quicker reactions enabled by early warning and phased restoration systems can potentially save lives by ensuring power is delivered to the most critical structures first, such as nursing homes or hospitals.
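To make the "detect changes in usage levels" idea concrete, here is a minimal sketch of the kind of anomaly check an operator could run over streaming meter readings. This is purely illustrative (the rolling-average approach, window size, and threshold are my assumptions, not a description of any utility's actual detection logic):

```python
from collections import deque

def detect_anomalies(readings, window=12, threshold=0.3):
    """Flag readings that deviate from the rolling average by more than
    `threshold` (as a fraction) -- a naive stand-in for loss/leak or
    overload detection on a stream of meter values."""
    history = deque(maxlen=window)
    anomalies = []
    for i, value in enumerate(readings):
        if len(history) == window:
            baseline = sum(history) / window
            if baseline > 0 and abs(value - baseline) / baseline > threshold:
                anomalies.append((i, value, round(baseline, 1)))
        history.append(value)
    return anomalies

# Steady ~100 kW load, then a sudden drop that could indicate a line loss.
load = [100, 102, 99, 101, 100, 98, 103, 100, 99, 101, 100, 102, 55]
print(detect_anomalies(load, window=12, threshold=0.3))
```

A real deployment would run logic like this continuously at the edge, close to the meters, which is exactly where low-latency data infrastructure matters.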

Aerospike is already in discussions with several U.S. utility companies about how to help them modernize their data infrastructure. Use cases include emergency response services, capturing and processing real-time data at the edge, AI-driven demand forecasting and predictive maintenance, data retention, and restoration planning.

Figure: Sample findings through our conversations with utility companies

Let me give you an example of how Aerospike can make a difference. One large Midwestern utility company needs three hours every morning to ingest data into Databricks, a machine learning and analytics platform, plus another 30 to 40 minutes to index and partition that data on the same platform before it is usable. Aerospike can complete the entire cycle in less than 30 minutes – and we have proof-of-concept (PoC) results to back up that claim. Because Aerospike is well suited to mixed workloads (not only writes but also reads), our data platform can also be leveraged for other use cases, such as analytics that require high read throughput to support rapid query access.

Aerospike’s ability to power IoT ecosystems

A database must be the foundation of any IoT strategy. An IoT ecosystem powered by the Aerospike data platform provides the flexibility to push data closer to the points of acquisition and usage on edge networks. This helps drive widescale IoT adoption. The ability to dynamically add new devices future-proofs the data infrastructure. Apart from the utility segment, our IoT use cases also include manufacturing, smart city, connected vehicles, transportation & logistics, healthcare, and oil & gas.

Of course, our IoT data story does not stop at bringing a best-in-class database to the table. We are forming strategic partnerships on multiple fronts to be an effective ecosystem player. One such move is Aerospike joining forces with Ably, which provides a suite of APIs to build, extend, and deliver powerful event-driven applications.

Figure: Example of real-time event-driven architecture for the combined Aerospike, Ably, and Kafka solution

In the combined solution above, Aerospike provides the basis of a high-performance, high-resilience data platform that spans multiple data centers and allows uninterrupted access during localized outages. The platform can deliver billions of transactions in real time and gives customers resilience against localized hardware, network, and software faults. Owing to our patented Hybrid Memory Architecture (HMA), the solution also has a relatively small footprint – up to 80% fewer servers than alternative solutions – resulting in a significant reduction in total cost of ownership (TCO). Ably is an edge messaging platform designed with resilience at its heart and mathematically modeled to work regardless of message volume, the number of open connections, or the quality of the network.

The combined solution is a great fit for both enterprise-grade and mass-market IoT use cases that need data ingestion and processing at speed and scale, with resilience and guaranteed performance – efficiently incorporating event streaming is a key requirement for IoT/telco edge systems. The solution directly addresses the usual adoption challenges: friction, implementation time, and most importantly, cost. It also gets the most out of any existing Kafka (or other event-streaming) implementation by easily enabling data flow in both directions.
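To show the shape of the event-driven flow described above, here is a deliberately simplified simulation. In-memory queues stand in for Kafka topics, a dictionary for the Aerospike store, and a list for an Ably channel; a real deployment would use the Kafka and Ably client SDKs and Aerospike's Kafka connector, and all names below are my own illustrative choices:

```python
import queue

kafka_topic = queue.Queue()   # stand-in for a Kafka topic receiving device events
aerospike_store = {}          # stand-in for the Aerospike system of record
ably_channel = []             # stand-in for messages pushed to edge subscribers

def ingest(device_id, payload):
    """Edge devices publish readings onto the event stream."""
    kafka_topic.put({"device": device_id, **payload})

def process_stream():
    """Consume events: persist to the data platform, then fan out to the edge."""
    while not kafka_topic.empty():
        event = kafka_topic.get()
        aerospike_store[event["device"]] = event   # durable write
        ably_channel.append(event)                 # real-time push to clients

ingest("meter-42", {"kw": 3.2})
ingest("meter-7", {"kw": 1.1})
process_stream()
print(aerospike_store["meter-42"]["kw"], len(ably_channel))
```

The point of the pattern is the bidirectional decoupling: devices never talk to the database or to clients directly, so each leg (ingest, persist, fan-out) can scale and fail independently.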

Aerospike in the time series space

As highlighted in this technical blog by my colleague Ken Tune, although Aerospike’s real-time data platform has its origins in the key-value space, it has a number of bespoke features that make it capable of supporting a much wider range of use cases. Time series data is one of those areas. Native support for time series data is on our product roadmap, but even before that, the combination of buffered writes and efficient map operations lets us optimize for both reading and writing time series data today. The Aerospike Time Series API leverages these features to provide a general-purpose interface for efficient reading and writing of time series data at scale. The Time Series API also ships with a benchmarking tool offering three modes of operation: real-time insert, batch insert, and query.
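The buffered-writes-plus-map-operations idea can be sketched in plain Python. The sketch below is my own illustration of the general storage pattern, not the actual Time Series API: points are buffered in memory and flushed as (timestamp → value) map entries, with each (series, time block) pair packed into one record, which is the kind of layout Aerospike's map operations make efficient.

```python
from collections import defaultdict

BLOCK_SECONDS = 3600  # assumption for illustration: one record per series per hour

class SeriesWriter:
    """Sketch of buffered time-series writes: points accumulate in a buffer
    and are flushed as map entries, many points per record."""
    def __init__(self, buffer_size=4):
        self.buffer_size = buffer_size
        self.buffer = []
        self.store = defaultdict(dict)   # (series, block) -> {ts: value}

    def write(self, series, ts, value):
        self.buffer.append((series, ts, value))
        if len(self.buffer) >= self.buffer_size:
            self.flush()

    def flush(self):
        for series, ts, value in self.buffer:
            block = ts - ts % BLOCK_SECONDS
            self.store[(series, block)][ts] = value   # one map entry per point
        self.buffer.clear()

    def query(self, series, start, end):
        """Range read: only the blocks the interval can touch are scanned."""
        points = []
        block = start - start % BLOCK_SECONDS
        while block <= end:
            for ts, v in self.store.get((series, block), {}).items():
                if start <= ts <= end:
                    points.append((ts, v))
            block += BLOCK_SECONDS
        return sorted(points)

w = SeriesWriter(buffer_size=2)
w.write("meter-42", 30, 3.1)
w.write("meter-42", 60, 3.3)    # buffer full -> flushed automatically
w.write("meter-42", 3630, 2.9)
w.flush()
print(w.query("meter-42", 0, 4000))
```

Buffering amortizes write cost across many points, and block-per-record keying keeps range queries from touching data outside the requested interval.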

In terms of performance, queries retrieving 1M points each (one year of observations taken every 30 seconds) ran at a rate of two per second, with end-to-end latency of ~0.5 seconds, sustained using the benchmarking tool on a modestly sized cluster of three i3en.2xlarge AWS instances. On the write side, a rate of 50K writes per second was easily sustained on the same cluster. Realistically, these numbers can be scaled by a factor of 20 or more simply by increasing the power of the underlying hardware.
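A quick back-of-envelope check of those figures (this is just arithmetic on the numbers quoted above, not additional benchmark data):

```python
SECONDS_PER_YEAR = 365 * 24 * 3600   # 31,536,000

# One observation every 30 seconds for a year -> the "1M points" per query.
points_per_year = SECONDS_PER_YEAR // 30
print(points_per_year)               # 1,051,200 -> ~1M points

# Two such queries per second means the read path sustains ~2M points/s.
points_read_per_second = 2 * points_per_year
print(points_read_per_second)        # 2,102,400
```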

The success of using IoT data for utilities (and other use cases) depends on the effective deployment of a wide range of technologies: a variety of IoT sensors and end devices, real-time edge/core/cloud data infrastructure, and AI/ML capabilities within the analytical tools. The key requirements for the data platform are stability, reliability, and the ability to scale without losing performance. At Aerospike, our promise is to deliver exactly that. We look forward to engaging with you and demonstrating the proof of concept (PoC) that makes the most sense for you.

Please check out our Telco/IoT/Connectivity ecosystem references and let’s continue the conversation here.