Geoff Clark, General Manager EMEA, Aerospike
The attention span of individuals and organisations is shrinking. Time is compressing. There are only brief windows that can be devoted to any given transaction.
Expectations around situational knowledge, updated in real time, are compounding. Customers assume that applications know who they are and that their transaction can be completed as quickly as possible, wherever they are. If that doesn’t happen, they will abandon the transaction and quickly move on to another provider.
Financial institutions must do more in ever shorter time frames. Success in the digital world comes down largely to how deeply you understand your customer, drawing on multiple data streams, and how well you drive decisions that deliver a genuinely personalised experience.
Having a clear picture of your current situation is critical in business. While you can create models and analytic dashboards, these often cannot provide a real-time view. Their latency may be measured in seconds, minutes, hours, or even days and weeks. As latency grows, the picture you have is no longer of the moment, and that gap can be the difference that defines operating risk. Never has the phrase “time is money” rung truer.
Access to a real-time view of positions enables organisations to manage risk more closely: a transactionally correct view of precisely what they have on their books, aggregated across the business, lets them respond in real time to global market changes.
For banks and financial institutions, it means knowing the content a customer has seen and the environment where they encountered it. Did they just transfer money or make a payment online or in a store? What do their digital journeys look like? What online content have they been viewing? Continually adaptive models use the answers to these questions to select active customers and recommend options that address what they are searching for.
The customer’s digital experience is composed of multiple linked processes working in concert to maintain a continually updated understanding of the customer. Different system elements – user profiles, decisioning, recommendation engines, and machine learning and AI – require data across different time windows. Some of these systems deal with terabytes of data, some with petabytes. Some require absolute consistency, while for others eventual consistency is good enough. In many of these systems, there is a demand for hyperscale within tight time windows.
The ability to take in more data within a given time window determines how current the context is. It is not enough to have a lot of data; you also need the most current view of that data. Operating at hyperscale lets an organisation work with far more data than traditional databases can handle in the same window. This makes models more accurate and, by extension, more valuable.
Do More in Less Time
In many ways, a blend of scale and speed buys you time, so that you can do more within a given window. This is what we refer to as a dividend: more data can be brought to bear, and the process still completes in less time.
We call the systems that interact directly with customers, employees, or clients “edge” systems. They draw and process data within relatively tight time windows. Edge systems sometimes have expiry windows: data is deleted after a set period, once it is deemed “stale.”
Data that’s collected and processed at the edge might be transferred to a system of record or core transactional store, where it can be kept for longer periods to provide historical context. These data sets may grow to petabytes while still needing to support response times measured in milliseconds. Meeting those requirements takes hyperscale capability, so that the performance of data access is not impacted as the information repository grows.
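To make this concrete, here is a minimal sketch using the Aerospike Python client that covers both halves of the pattern: an edge write with an expiry window, and a single-key read from the record store. The host address, namespace, set, bin names, and TTL value are all illustrative assumptions, not a prescribed configuration.

```python
import aerospike

# Connect to a cluster (host and port are illustrative).
config = {"hosts": [("127.0.0.1", 3000)]}
client = aerospike.client(config).connect()

# Edge write: a session record that expires after 15 minutes,
# matching the "expiry window" pattern described above.
session_key = ("edge", "sessions", "user-42")
client.put(
    session_key,
    {"last_page": "mortgage-rates", "device": "mobile"},
    meta={"ttl": 900},  # seconds until the record is deemed stale and removed
)

# Record-store read: a single-key lookup, intended to stay in low
# milliseconds regardless of how large the store grows.
(_, meta, bins) = client.get(session_key)
print(bins, meta)

client.close()
```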
Here’s an example involving a consumer of banking products. The bank wants to promote relevant products based upon the individual’s financial situation, and to predict how the customer will respond to offers and suggestions. It must match the information collected about devices, identity, and patterns that establish context against its database of user profiles. The bank wants to run more sophisticated recommendation engines on these patterns to ensure it is tailoring the product or service to the customer’s needs. Establishing a model that identifies which customers with a given profile are most likely to adopt its products extends the analysis and optimisation beyond just this transaction or online engagement, and also allows the bank to track behaviours in order to validate the model. These processes must run, some in parallel and others in sequence (a simplified sketch follows the list):
- Establish an identity profile of the customer
- Match that profile against a database of profiles that models likely desires and behaviours
- Based on that match, determine the products and/or services to recommend
- Select the content and media types based on the user profile
- Determine pricing and offers based on the user profile, status, and engagement history
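As a rough illustration of how these steps compose, the sketch below (in Python, with hypothetical function names and stubbed return values) runs the profile and matching steps in sequence, since each depends on the previous one, then fans out recommendation, content selection, and pricing in parallel, since each depends only on the matched profile.

```python
import asyncio

async def establish_identity_profile(raw_event: dict) -> dict:
    # Resolve device, identity, and behavioural signals into a profile (stub).
    return {"customer_id": raw_event["customer_id"], "segment": "young-saver"}

async def match_profile(profile: dict) -> dict:
    # Match the profile against modelled desires and behaviours (stub).
    return {**profile, "propensity": 0.72}

async def recommend_products(match: dict) -> list:
    return ["easy-access saver", "regular saver"]

async def select_content(match: dict) -> dict:
    return {"media": "short video", "tone": "informal"}

async def price_offer(match: dict) -> dict:
    return {"rate": "4.1%", "fee_waiver": True}

async def handle_engagement(raw_event: dict) -> dict:
    # Steps 1 and 2 are sequential: matching needs the profile first.
    profile = await establish_identity_profile(raw_event)
    match = await match_profile(profile)
    # Steps 3-5 depend only on the match, so they can run in parallel.
    products, content, offer = await asyncio.gather(
        recommend_products(match), select_content(match), price_offer(match)
    )
    return {"products": products, "content": content, "offer": offer}

print(asyncio.run(handle_engagement({"customer_id": "user-42"})))
```

The fan-out is where the time dividend shows up: the three independent steps cost roughly one step’s latency rather than three.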
The last element, real-time pricing and offers, is something banks will tune over time to take into account the margin and profitability of the customer.
Driving Up Value for Both the Customer and the Business
With models built on hyperscale capability, 10 milliseconds are saved here, another 20 milliseconds there. This creates time to perform additional processing and make more relevant suggestions and offers – better for customers and better for the business. As customer profiles scale up to include more data points, performance must not degrade, so those time savings persist. This is the real-time data innovation dividend that is lifting the customer experience.
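To put rough numbers on that dividend, the sketch below assumes a 100 ms end-to-end response target and illustrative per-step timings; the 10 ms and 20 ms savings come from above, everything else is an assumption for the sake of the arithmetic.

```python
# Illustrative latency budget. The 100 ms target and per-step timings
# are assumptions, not measured figures.
RESPONSE_TARGET_MS = 100

step_latency_ms = {"profile lookup": 25, "model match": 30, "recommendation": 25}
savings_ms = 10 + 20  # "10 milliseconds here, another 20 milliseconds there"

spent_ms = sum(step_latency_ms.values()) - savings_ms
headroom_ms = RESPONSE_TARGET_MS - spent_ms
print(f"Headroom within the response target: {headroom_ms} ms")

# The reclaimed 30 ms becomes headroom for additional processing,
# e.g. a second, more sophisticated scoring pass before responding.
EXTRA_PASS_MS = 30
if headroom_ms >= EXTRA_PASS_MS:
    print("Enough budget to run an extra personalisation pass")
```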