If we have learnt anything in 2016, it is that consumers love on-demand experiences that capture their moods, interests and wants. From ride-hailing to gourmet cooking ingredients, speed and convenience are the new genies, fulfilling users’ many wishes.

The new race among customer-centric organizations is therefore to provide the best experience by anticipating and predicting user behavior. Offers made in the context of a user’s current actions and state of mind are likely to convert faster than cold emails and calls placed days after the fact. Enterprise applications that ‘learn’ quickly and customize user experiences on the fly will be the new norm for success. Expectations created by consumer applications will carry over to the enterprise, including business-to-business applications, although likely with a great deal more caution.

The new disruptor in the mix is a technology that has fallen in and out of fashion since the eighties: artificial intelligence. Gleaning meaningful patterns from vast quantities of data is now achievable thanks to big data technologies, cheap compute capacity and advances in machine learning. In 2017, machine learning will change the fabric of the enterprise. It has the potential to transform every enterprise process that aligns business outcomes with customer needs, bringing about a flexible, adaptive enterprise centered on the customer. Imagine a retailer that can predict, based on a user’s recent behavior, which coupon might tip them into an impulse buy. Or a rental-car provider that predicts where a user will stop and pushes discounts for a network of participating neighborhood businesses through a mobile app. Machine learning paves the way for on-the-fly predictions and recommendations, replacing what used to take months of analysis, tradeoffs between business goals and consumer preferences, and hundreds of Excel spreadsheets.
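To make the retailer scenario concrete, here is a minimal sketch of behavior-driven coupon selection. The session data, category names and the frequency-based scoring are all hypothetical stand-ins: a real deployment would score coupons with a model trained on historical conversion data rather than raw view counts.

```python
from collections import Counter

def recommend_coupon(recent_views, coupon_categories):
    """Pick the coupon whose category the user has browsed most often.

    A naive frequency-based stand-in for a trained model: it simply
    counts category views in the current session and ranks coupons
    by that count, returning None if nothing matches.
    """
    counts = Counter(recent_views)
    best = max(coupon_categories, key=lambda c: counts.get(c, 0))
    return best if counts.get(best, 0) > 0 else None

# Hypothetical browsing session; category identifiers are illustrative.
session = ["shoes", "shoes", "jackets", "shoes", "hats"]
print(recommend_coupon(session, ["jackets", "shoes", "electronics"]))
# → shoes
```

The point of the sketch is the shape of the decision, not the scoring: the recommendation is computed inline from the user’s live session, rather than from an offline report produced days later.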

Machine learning is not without risks, however: improper models, insufficient data or incorrect assumptions can lead to false targeting, bad recommendations and lost customer confidence. In other words, the downside of not being intelligent enough is extremely high. A cable provider that learned behavior from several users in the same household and promoted the wrong programs, for example, could land in a lot of hot water.

A likely consequence of this pursuit of customer delight is more personalized interaction between organizations and their customers. Savvy businesses are already aware that higher levels of customer engagement result in greater customer loyalty. A natural evolution for interactive applications is to expand from mobile, social, email and other channels to embrace bots, so that interactions can be personalized and customized. While businesses today reach their customer base through social media accounts, Facebook pages and email, bots that hold intelligent 1:1 conversations with consumers are likely to fulfill the need for immediacy.

Data challenges are likely to come to an inflection point in 2017. As data continues to inundate and overwhelm organizations, enterprises will be pressed to separate their data into “hot”, “warm” and “cold” tiers, based on how quickly each must support decision making. Enterprises have long struggled to tell which data will yield insights and which is largely useless. In the process of creating data lakes, organizations often end up with “data dumps”: effectively cold storage for data that is too hard to use for problem solving, but not useless enough to throw away.
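One simple way to operationalize the tiering idea is to classify records by recency of access. A minimal sketch follows; the one-day and thirty-day cutoffs are illustrative assumptions, since the right thresholds depend entirely on workload and cost targets.

```python
from datetime import datetime, timedelta

# Illustrative cutoffs; real values depend on workload and storage costs.
HOT_WINDOW = timedelta(days=1)
WARM_WINDOW = timedelta(days=30)

def tier_for(last_accessed, now=None):
    """Assign a record to a storage tier by how recently it was accessed."""
    now = now or datetime.utcnow()
    age = now - last_accessed
    if age <= HOT_WINDOW:
        return "hot"
    if age <= WARM_WINDOW:
        return "warm"
    return "cold"

now = datetime(2017, 1, 1)
print(tier_for(datetime(2016, 12, 31, 12), now))  # → hot
print(tier_for(datetime(2016, 12, 10), now))      # → warm
print(tier_for(datetime(2016, 6, 1), now))        # → cold
```

In practice the tier label would drive placement: “hot” data on in-memory or SSD-backed stores, “warm” on cheaper disk, “cold” in archival storage.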

For responsive and interactive applications, both machine learning and performance needs will define data strategy. As reactive applications become progressively intelligent, analytics will move from post-facto batch mode to real-time and inline. Batch-processing big data frameworks like Hadoop, which retroactively provide insights about enterprise data, will not be adequate for modern applications. Real-time processing frameworks such as Apache Spark are rising in popularity, reflecting the need for analysis that can be applied to customer situations as soon as possible. As expectations of analytics increase, implementing high-performance yet cost-effective tiers for the “hot” and “warm” data needs of crucial enterprise applications, whether operational or analytical, will become the custom.
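The batch-versus-inline distinction is easy to show in miniature. A framework like Spark would maintain aggregates like this at scale across a cluster; the toy stand-in below, in plain Python, just makes the shape of inline analytics visible: each event updates the aggregate the moment it arrives, instead of a nightly job re-scanning the whole dataset.

```python
class RunningStats:
    """Inline analytics: update aggregates per event instead of
    recomputing them from scratch in a post-facto batch job."""

    def __init__(self):
        self.count = 0
        self.total = 0.0

    def observe(self, value):
        # Constant-time update on event arrival; no full-table rescan.
        self.count += 1
        self.total += value

    @property
    def mean(self):
        return self.total / self.count if self.count else 0.0

stats = RunningStats()
for purchase in [19.99, 5.50, 42.00]:  # events arriving on a stream
    stats.observe(purchase)
print(round(stats.mean, 2))  # → 22.5
```

The same idea extends to windowed counts, percentiles and model scoring: the aggregate is always current, so a recommendation can be made against the customer’s situation as it stands right now.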

A much-hyped trend of 2016, the Internet of Things has attracted the attention of consumers and enterprises with its promise of connected, intelligent “things” that can sense, adapt and even take corrective action based on a complex set of conditions. Manufacturing, transportation and utility companies have suddenly been catapulted from relative obscurity to IoT pioneers. For all its hype, IoT has remained largely in “test” mode, especially as companies navigate a labyrinth of regulations built for a less automated era, the well-known and well-capitalized “driverless” efforts of Google and Uber notwithstanding. Amazon changed all that, leaping into the fray with drone delivery and disrupting the everyday shopping experience with Amazon Go. Amazon competitors like Target and Walmart have so far been playing catch-up online, but these most recent moves threaten consumer companies worldwide.

Offering a comparable or better shopping experience with today’s adolescent-stage IoT technologies is no mean achievement. Many loose ends need to be tied up as we move past the “wow” phase of IoT. High-speed data ingest, for example, is a foundational concern, as are security and privacy. As hundreds of thousands of things relay status and ask for decisions, collecting the data at volume and scale requires lightweight, high-performance databases that can collect and aggregate data with extremely high throughput and low latencies. The data is often time-series, and in addition to performing well during ingest, the datastore also needs to execute range queries, anomaly detection and other analyses gracefully, with the least overhead. Security controls need to be in place for this data, as much of it can be sensitive. A simple example is an energy management system that monitors buildings for physical presence so that energy can be conserved. Data about which apartments sit empty for long periods, if exposed, can put the physical security of those apartments at risk.
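The combination of ingest, range queries and anomaly detection described above can be sketched in a few lines. This in-memory buffer is purely illustrative, assuming timestamped sensor readings and a naive standard-deviation threshold; a production time-series datastore would add persistence, compression and far higher throughput.

```python
import bisect

class SeriesBuffer:
    """Tiny in-memory time-series buffer: ordered ingest, range
    queries, and a naive deviation-based anomaly check."""

    def __init__(self):
        self.timestamps = []  # kept sorted; sensors mostly append in order
        self.values = []

    def ingest(self, ts, value):
        idx = bisect.bisect(self.timestamps, ts)
        self.timestamps.insert(idx, ts)
        self.values.insert(idx, value)

    def range(self, start, end):
        """Return all values with start <= timestamp <= end."""
        lo = bisect.bisect_left(self.timestamps, start)
        hi = bisect.bisect_right(self.timestamps, end)
        return self.values[lo:hi]

    def anomalies(self, start, end, sigmas=3.0):
        """Flag values in the window more than `sigmas` deviations from its mean."""
        window = self.range(start, end)
        if len(window) < 2:
            return []
        mean = sum(window) / len(window)
        std = (sum((v - mean) ** 2 for v in window) / len(window)) ** 0.5
        return [v for v in window if abs(v - mean) > sigmas * std]

buf = SeriesBuffer()
for ts, reading in [(1, 10.0), (2, 11.0), (3, 200.0)]:
    buf.ingest(ts, reading)
print(buf.range(1, 2))                     # → [10.0, 11.0]
print(buf.anomalies(1, 3, sigmas=1.0))     # → [200.0]
```

Even this toy shows why ingest-path overhead matters: every reading pays the insertion cost, so the data structures on that path must stay lightweight.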

As IoT grows up, an analytics layer that supports the right on-the-fly decision making needs to be in place. It will also need to incorporate techniques such as machine learning, behavior prediction and fraud detection, for the protection of both the consumer and the enterprise. Meeting all of these needs at the right level of performance is not possible without a multi-function, versatile, high-performance technology. Databases like Redis, which can be extended with new functionality while retaining extremely high throughput and sub-millisecond latencies, can be particularly handy elements of IoT stacks.
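A common pattern for time-series ingest in Redis is to store readings in a sorted set keyed by timestamp: one ZADD per event, then ZRANGEBYSCORE to pull a time window. To keep this example self-contained and runnable without a Redis server, the sketch below models those two commands against an in-memory stand-in; the sensor names and readings are hypothetical, and a real deployment would issue the equivalent calls through a Redis client library.

```python
import bisect

class SortedSetStandIn:
    """In-memory stand-in mimicking a Redis sorted set, with the
    timestamp used as the score."""

    def __init__(self):
        self._entries = []  # (score, member) pairs, kept sorted by score

    def zadd(self, score, member):
        # Analogue of ZADD: insert keeping score order.
        bisect.insort(self._entries, (score, member))

    def zrangebyscore(self, lo, hi):
        # Analogue of ZRANGEBYSCORE: members with lo <= score <= hi.
        left = bisect.bisect_left(self._entries, (lo, ""))
        right = bisect.bisect_right(self._entries, (hi, "\uffff"))
        return [member for _, member in self._entries[left:right]]

readings = SortedSetStandIn()
for ts, reading in [(1001, "sensor-a:21.5"),
                    (1002, "sensor-a:21.7"),
                    (1010, "sensor-b:3.2")]:
    readings.zadd(ts, reading)          # one write per incoming event
print(readings.zrangebyscore(1000, 1005))  # time-window query
# → ['sensor-a:21.5', 'sensor-a:21.7']
```

The appeal of this pattern in the IoT context is that both the ingest path and the windowed read stay simple single-purpose operations, which is how a store like Redis sustains high throughput at low latency.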

To summarize, “intelligence” is going to be a requirement: enterprises need to be prepared to offer smooth, smart experiences in 2017, and they need to do so while using all of the data at their disposal shrewdly. Consumers expect immediate action with increasing accuracy, and fulfilling these expectations with sufficient security and privacy controls will determine success or failure for organizations in 2017.