
How IoT Changes Computing Architectures

Advances in technology and computing have fundamentally reshaped human life in recent decades. The limitless capabilities of the human imagination, combined with the creativity of audacious technologists and backed by far-sighted investors and resourceful industry giants, have brought about momentous changes in our daily lives, to the point where a life without smartphones, on-demand services and always-synced devices seems unimaginable.

The Internet of Things (IoT) is shaping up to be another potent force, wide-ranging and disruptive in its potential impact on people's daily lives. How we live, which services we take for granted and how organizations conduct business are all likely to be upended by these new webs of smart, connected things.

However, for the Internet of Things to truly materialize, the way computing is done will have to change fundamentally. Mobile devices, social functionality and the supporting cloud computing infrastructure have already extended the reach of the connected internet well beyond homes and workplaces. But once buildings, cars, trains, refrigerators, industrial machines, farm equipment or even farm animals are equipped with devices or smart sensors, computing infrastructure will have to evolve in tandem.

The existing computing infrastructure will be stretched in many ways. Some of the stress will come from sheer numbers: Gartner, a research firm, predicts that the number of wirelessly connected products in existence (not including smartphones or computers) will increase from roughly 5 billion today to 21 billion by 2020. Data from these "smart" things could arrive at rates of hundreds of thousands, or even millions, of updates per second, and in many cases it will be time-series data sampled at very frequent intervals. Handling this fire-hose of data with relatively lightweight hardware will be a key requirement of the technology architectures that evolve to support it. For IoT deployments that are widely distributed geographically, hybrid computing models – in which lightweight data aggregators deployed in local offices, on factory floors and even on farms feed a geographically distributed, highly elastic cloud for data ingest – will become a necessity.
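To make the hybrid model concrete, here is a minimal sketch (all names hypothetical, not any particular product's API) of a lightweight edge aggregator: it batches raw sensor readings locally and ships only per-window summaries upstream, so the cloud ingest tier sees one record per window instead of the full fire-hose.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class EdgeAggregator:
    """Illustrative edge-side batcher for time-series sensor readings."""
    window_size: int = 100                          # readings per summary window
    buffer: list = field(default_factory=list)      # raw readings held locally
    summaries: list = field(default_factory=list)   # stand-in for the cloud uplink

    def ingest(self, timestamp: float, value: float) -> None:
        self.buffer.append((timestamp, value))
        if len(self.buffer) >= self.window_size:
            self.flush()

    def flush(self) -> None:
        if not self.buffer:
            return
        values = [v for _, v in self.buffer]
        # The downsampled summary is all that crosses the WAN link.
        self.summaries.append({
            "start": self.buffer[0][0],
            "end": self.buffer[-1][0],
            "count": len(values),
            "min": min(values),
            "max": max(values),
            "mean": mean(values),
        })
        self.buffer.clear()
```

In a real deployment the `summaries` list would be replaced by a network call to the cloud ingest endpoint, and the window would typically be time-based rather than count-based.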

Even once the data ingest problems are solved, the data analysis and response problems will follow right on their heels. The traditional model of computing has been to collect data, store it and analyze it later; businesses make decisions based on data that is days or weeks old rather than minutes or hours. IoT scenarios will change expectations of analysis and decision-making speeds. Not only will decisions need to be made automatically, on the fly, but the data behind those decisions will need to be crunched in near real-time. New analytics architectures will emerge that store and process data simultaneously, in memory and at very high speed. This is where efficient, highly available and scalable data stores with built-in data structures – which execute analytics right next to where the data is stored, backed by a fast serving layer for the results – become important. Data structures minimize processing overhead, allow developers and architects to shape their schema on the fly to match expected analytics query patterns, and can significantly accelerate analytics performance.
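The idea of executing analytics next to the data can be sketched in a few lines. The toy store below (illustrative only, not any specific product's API) maintains a rolling sum incrementally as each point arrives, so reading the rolling mean is a constant-time lookup rather than a scan over stored history:

```python
from collections import defaultdict, deque

class InMemorySeries:
    """Tiny in-memory time-series store that computes analytics at write time."""

    def __init__(self, window: int = 60):
        self.window = window
        # Per-key bounded buffer of recent values; deque evicts the oldest.
        self.points = defaultdict(lambda: deque(maxlen=window))
        self.running_sum = defaultdict(float)

    def append(self, key: str, value: float) -> None:
        q = self.points[key]
        if len(q) == q.maxlen:           # buffer full: remove oldest from the sum
            self.running_sum[key] -= q[0]
        q.append(value)
        self.running_sum[key] += value   # analytics updated next to the data

    def rolling_mean(self, key: str) -> float:
        q = self.points[key]
        return self.running_sum[key] / len(q) if q else 0.0
```

A production store would offer many such structures (counters, sorted sets, streams) and replicate them for availability, but the principle is the same: the write path keeps the answer warm so the read path never has to recompute it.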

Moovit, the world's #1 transit app, serves 35 million users and handles 100 million daily passenger reports, tracking 4.6 million bus and train stops and 4,500 transit operators. While not a classic IoT scenario, it is a useful proxy for the challenges a service provider might face once stops are equipped with sensors and the vehicles themselves report their status. Providing commuters with the most accurate information in real time requires the infrastructure to respond in under 10 milliseconds after having crunched through optimal routes; failure means delays and frustration for consumers who have come to rely on the service. To guarantee responsiveness and high availability, Moovit uses Redis Labs' elastic, highly scalable Redis service (Redis Cloud), which delivers high-performance responses, automated linear scaling and very high throughput with sub-millisecond latencies. Redis, the high-performance data structure store, provides not just data ingest and processing capabilities; its data structures also facilitate real-time analytics at blazing speed.
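One hot-path query in a service like this is "which stop is nearest to the commuter?" The sketch below shows the geometry involved; the stop names and coordinates are made up for illustration, and a production system would serve this from a geospatial index (such as Redis's geo commands) rather than a linear scan:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))   # 6371 km: mean Earth radius

def nearest_stop(stops, lat, lon):
    """stops: iterable of (name, lat, lon) tuples; returns the closest one."""
    return min(stops, key=lambda s: haversine_km(lat, lon, s[1], s[2]))
```

The linear scan here is O(n) per query; meeting a sub-10-millisecond budget across millions of stops is precisely why the data must already live in memory, pre-indexed, when the query arrives.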

Another important factor influencing the evolution of IoT computing is likely to be cost. Consuming millions of data points, digesting them at appropriate rates and responding at speeds acceptable in the IoT world is best done with in-memory processing. However, as dataset sizes grow, the race will be on to deliver the same world-class performance at ever-decreasing cost. Fortunately, this is where the evolution of memory technology is likely to play a timely and fortuitous role.

Innovations from Samsung, SanDisk and Diablo in the field of Storage Class Memory (SCM), and from Intel and Micron with 3D XPoint (pronounced "three-D cross point") technology, are fundamentally changing the economics of in-memory computing.

Another advantage of the SCM/3D XPoint technologies, when used as memory extenders, is the elimination of storage engines and filesystems: they preserve the byte-access semantics of DRAM and thus dramatically reduce storage-access overhead. Still, at roughly ten times slower than DRAM, they may not be acceptable to most in-memory applications, so the new SCM/3D XPoint chipsets will allow applications to decide which part of the data is stored in the DRAM portion of the chipset and which part in flash. To fully utilize these benefits, in-memory databases and platforms will be redesigned to store data across two tiers of memory, fast and slow, so that developers and architects can start analyzing petabytes of data in memory, cost-effectively.
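A speculative sketch of that two-tier split (all details invented for illustration, not how any particular database implements it): keep the hottest keys in a small fast tier standing in for DRAM, place the rest in a larger slow tier standing in for SCM/flash, and promote keys based on observed access counts.

```python
from collections import Counter

class TieredStore:
    """Toy two-tier key-value store with access-count-driven promotion."""

    def __init__(self, fast_capacity: int = 2):
        self.fast = {}                   # stands in for the DRAM tier
        self.slow = {}                   # stands in for the SCM / flash tier
        self.hits = Counter()            # per-key access counts
        self.fast_capacity = fast_capacity

    def put(self, key, value):
        # New keys land in the fast tier until it fills, then spill to slow.
        tier = self.fast if len(self.fast) < self.fast_capacity else self.slow
        tier[key] = value

    def get(self, key):
        self.hits[key] += 1
        if key in self.fast:
            return self.fast[key]
        value = self.slow[key]
        # Promote a repeatedly accessed key; demote the coldest fast resident.
        if self.hits[key] > 1 and self.fast:
            coldest = min(self.fast, key=lambda k: self.hits[k])
            if self.hits[key] > self.hits[coldest]:
                self.slow[coldest] = self.fast.pop(coldest)
                self.fast[key] = self.slow.pop(key)
        return value
```

Real tiering engines make this decision with far richer signals (recency, access patterns, object size), but the essential trade is the same one the paragraph above describes: put the working set where access is fastest, and let the cheaper, slower tier carry the bulk.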

New analytics architectures, fueled by cheaper memory technologies and in-memory data structures, capable of handling millions and billions of time-series data points with lightning-fast analytics and sub-millisecond responses, are likely to become the norm once the Internet of Things changes human life for good.

It is time for the architects of future IoT applications to look seriously beyond the limitations of current technology and anticipate new architectures for the dramatically different world of smart, connected things.