Imagine the following scenarios:
Scenario 1: Preparing for a high-profile meeting has kept you so busy that you completely forgot about the huge pile of clothes lying in your washing machine, clothes that absolutely must be washed. You suddenly remember this just as you are about to step into the boardroom.
Now suppose you had the power to program the wash, spin, and rinse cycles, set the water temperature, and choose fabric settings on your washing machine from your smartphone. Suppose, in addition, that you could check maintenance and system status information for your premises and, in case something went wrong, summon service professionals, even sending a self-driving car to pick them up.
Scenario 2: You are racing down the highway towards the hospital to attend to a friend who has just had an accident. Your mind is totally blank; even your sense of traffic has abandoned you. The possible outcome: another accident that you simply cannot afford.
However, think of a situation where your car automatically communicates with other vehicles and traffic lights, generating and exchanging a kind of “Here I Am” signal and negotiating the traffic by itself without your intervention. Vehicle-to-vehicle (V2V) communication technologies of exactly this kind are currently being developed under the aegis of the U.S. Department of Transportation, and they should become available to the general public in the near future.
Scenario 3: There is an intrusion or a fire at home while you are out, and you have no control over the situation. But if you could remotely manage and monitor your home by installing fire- and home-invasion-detection devices capable of texting alerts and sounding alarms, you might just be able to notify the responsible authorities before it is too late.
All this sounds great. However, consider that billions of objects are already interconnected via the Internet, and millions more are being added at a very fast rate; the prospect of further accelerating the already tsunami-like influx of data is quite alarming.
This is an issue that a lot of companies are focusing on. One such company is Confluent, whose ‘Stream Data Platform’ helps solve data integration and stream processing challenges by providing a central hub for all of an organization’s data. It makes all your data available as low-latency data streams, exactly the form required for real-time stream processing.
At the heart of this platform is Kafka, a product originally developed at LinkedIn. Kafka excels at delivering real-time data between systems, and hence it could be an ideal platform for delivering notifications to connected devices such as wearables, connected cars, and smart appliances, helping to tame the coming data influx. Confluent enhances Kafka with advanced features and enterprise support that make a company’s data available as real-time streams. The system works via Kafka plug-ins, which let businesses improve their data circulation without modifying their existing databases.
According to Cisco, in 2008 there were already more “things” connected to the Internet than people. By 2020, the number of Internet-connected things is projected to reach 50 billion, with $19 trillion in profits and cost savings coming from IoT over the next decade.
LinkedIn recently revealed some staggering numbers on Kafka’s growth. The platform now handles over a trillion messages per day, compared to a billion daily messages just four years ago, and it currently moves 1.34 million gigabytes of data through its system every week. The platform works by receiving a message from one system and then delivering it to every other system that requires the same data, usually in real time. Against numbers like these, Cisco’s projection doesn’t look so staggering after all!
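That delivery model, one system publishing messages that many other systems can read independently, is Kafka’s core idea: an append-only log per topic, with each consumer keeping track of its own position in the log. The sketch below is not Kafka itself, just a toy in-memory illustration of that model; all names here (`ToyLog`, the `washer-status` topic, the consumer names) are invented for the example.

```python
from collections import defaultdict

class ToyLog:
    """A toy in-memory sketch of Kafka's model: an append-only log
    per topic, with each consumer tracking its own read offset."""

    def __init__(self):
        self.topics = defaultdict(list)   # topic name -> list of messages
        self.offsets = defaultdict(int)   # (consumer, topic) -> next offset to read

    def produce(self, topic, message):
        # Producers simply append to the end of the topic's log.
        self.topics[topic].append(message)

    def consume(self, consumer, topic):
        # Each consumer reads from its own offset, so many systems can
        # receive the same data independently, at their own pace.
        offset = self.offsets[(consumer, topic)]
        new_messages = self.topics[topic][offset:]
        self.offsets[(consumer, topic)] = len(self.topics[topic])
        return new_messages

# One device publishes status updates; two systems consume them independently.
log = ToyLog()
log.produce("washer-status", "cycle=spin temp=40C")
log.produce("washer-status", "cycle=rinse temp=30C")

print(log.consume("phone-app", "washer-status"))    # both messages
print(log.consume("phone-app", "washer-status"))    # [] -- already caught up
print(log.consume("maintenance", "washer-status"))  # both messages, again
```

The key design choice this mirrors is that the broker never pushes data or tracks "who still needs this"; it just retains the log, and consumers pull from wherever they left off. That is what lets one stream of device data feed a phone app, an analytics job, and an alerting system at the same time.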
All this is excellent news for a smarter, highly connected planet of the future. And that future is quite near, almost within our grasp.