Real-time analytics is strikingly difficult to get right. This kind of complex data science project demands high-velocity data, high-volume data and data from multiple sources, which means it's critical to understand the use cases and benefits, rather than being driven by the tech, before embarking on any real-time project.
In 2020, Gartner predicted that throughout 2021, 75% of data science projects wouldn’t make it past the prototype stage.
Typically, in Ekimetrics’ view, that’s because:
Even with the same use case as another business, each company's specific context requires in-depth industry knowledge as well as in-depth technical expertise. A cookie-cutter approach won't bring the value expected.
Real-time requires heavy engineering and technology. Automating the data and governing the solution so it remains reliable and robust is very different from the needs of other forms of analysis.
Transferring academic AI knowledge into business solutions with compelling business cases can be a real challenge. Bleeding-edge innovation is still finding its feet: it is unlikely to deliver value in the immediate term and will need considerable iteration before it can be widely adopted.
To buck this trend, Ekimetrics employs a three-stage approach to such projects.
Be clear about what data your business needs to drive competitive advantage, and not just what data is available. Understand your data maturity and organise your company and project around your real-time, near-time and some-time use cases and value.
Typically, there are trade-offs to make between the extremely complex and the extremely narrow, especially when it is important to ensure real-time stays real-time. In addition, the implementation, management and operationalization are quite different from those for monthly or even daily reporting.
The best solution is of little value if no-one is using it. It’s critical to bring people along on the journey, with good change management protocols that ensure a solution is used, improved and evolved in light of changing data, questions or priorities.
Implementing real-time: Steer your data opportunity
For each use case, it’s essential to start by understanding whether it is necessary to use the data in real-time. Will real-time bring an adequate uplift in performance for the investment that is required? This is primarily a question of need, combined with data readiness and maturity.
Real-time
Typical applications for real-time are short-term, operational and tactical, including things like on-site, live journey personalisation, brand safety alerts, bot behaviour, website activity alerts, and propensity scoring, for example to drive real-time next best action.
Real-time requires immediate, real-time data streaming through instantaneous connections and live views of data. There can be no delay in the data being surfaced, which means all processes must be fully automated. That also typically means an extremely heavy tech stack is needed, or the scope must be extremely limited.
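To make this concrete, here is a minimal sketch of what fully automated, stream-driven scoring can look like, assuming a Kafka event stream; the topic name, broker address and scoring logic are illustrative assumptions, not a description of any specific implementation.

```python
# Minimal sketch of a real-time scoring consumer on a Kafka stream.
# Topic, broker and scoring rule are hypothetical; the point is that
# every step is automated, with no human in the loop.
import json
from kafka import KafkaConsumer  # pip install kafka-python

def score_lead(event: dict) -> float:
    """Placeholder propensity score; in production this would be a model call."""
    return min(1.0, event.get("pages_viewed", 0) / 10)

consumer = KafkaConsumer(
    "site-events",                       # hypothetical topic of live website events
    bootstrap_servers="localhost:9092",  # assumed broker address
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
)

for message in consumer:                 # blocks forever; each event is handled on arrival
    event = message.value
    score = score_lead(event)
    if score > 0.7:                      # e.g. trigger live personalisation or an alert
        print(f"lead {event.get('lead_id')}: propensity {score:.2f}")
```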
Near-time
For many applications, near-time is more than good enough to deliver the significant majority of the benefit, especially where there is a need for more time to consult, consider or update the data; for example, where there is an inherent time lag between events or where a greater weight of data will deliver a better decision. Applications include campaign optimization, creative optimization, digital engagement metrics, CRM journey monitoring and digital attribution.
Near-time can cope with delayed connections and be managed via frequent, regular reporting. Rather than relying on data streaming, data updates can be asynchronous and scheduled, allowing data to flow between systems and then sync. Ideally there will be a high degree of automation, but a consistent, structured process can deliver adequately in some circumstances. The closer you get to real-time, the more critical the tech stack becomes.
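By contrast with streaming, near-time can often be served by a scheduled batch job. The sketch below is a minimal example using the Python schedule library; the sync function and 15-minute cadence are assumptions for illustration.

```python
# Sketch of a near-time pattern: scheduled, asynchronous refresh instead of streaming.
# The sync function and the 15-minute cadence are illustrative only.
import time
import schedule  # pip install schedule

def sync_campaign_metrics():
    """Pull the latest campaign data from source systems and refresh reporting tables."""
    print("extract -> transform -> load campaign metrics")  # stand-in for the real ETL step

schedule.every(15).minutes.do(sync_campaign_metrics)  # frequent and regular, not instantaneous

while True:
    schedule.run_pending()  # data flows between systems on a schedule, then syncs
    time.sleep(1)
```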
Some-time
Ad hoc analysis is perfectly suited to some-time management, especially where there are new questions that don’t have existing data sources to support them, or where the tools are simply not available to provide the data in a more structured, regular fashion. Examples include lead to conversion reporting, promotional effectiveness, budget optimization, media mix effectiveness and customer funnel metrics.
Use cases with some-time data requirements can rely on irregular updates and ad hoc, manual processes. Data is often slow-moving across myriad sources, and unstructured data frequently requires manual intervention.
Assessing the criticality of data
Mapping the customer journey, with its data touchpoints, sources and hand-offs (for example, to understand conversion or next best action), will help you see where the value of the data lies and how complex it is to marry up and deliver in real-time, and therefore where to focus real-time effort and how to adapt outside of it.
In summary:
Real-time demands fully automated data streaming and live views, backed by a heavy tech stack or a tightly limited scope; it suits operational, tactical use cases such as live personalisation, alerts and propensity scoring.
Near-time runs on scheduled, asynchronous updates and frequent, regular reporting; it suits campaign and creative optimization, CRM journey monitoring and digital attribution.
Some-time relies on irregular updates and ad hoc, manual processes; it suits new questions and analyses such as promotional effectiveness, budget optimization and media mix effectiveness.
Implementing real-time: Build your operational capabilities
Solution design for real-time is highly complex, so it’s important to understand what you are committing to if it is to be a success.
An example of a narrow use case with value could be building next-best actions at scale, using machine learning scoring and activation in the leads funnel to follow, and dynamically intervene in, an on-site purchase journey. To give a sense of scale: when Ekimetrics implemented this for a holiday firm, over 100 variables were captured for each lead, with 14 billion scores and over 100GB of data produced every day.
The inputs required to derive a single propensity score are complex and come from many aspects of the journey, including numeric input, such as the number of contacts a customer has had, resorts viewed, time on site or call duration, and categorical input, such as their country of residence or segment. This single score is just one of many per customer, all designed to provoke a specific action through live personalisation on site or through customer servicing channels, such as webchat or inbound calls.
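As an illustration of how such a score can be produced, the sketch below uses scikit-learn to combine numeric and categorical journey features into a single propensity score; the feature names mirror the examples above, but the data and model choice are assumptions rather than the methodology actually deployed.

```python
# Sketch of a propensity model mixing numeric and categorical journey features.
# Feature names follow the examples in the text; data and model are illustrative.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

numeric = ["n_contacts", "resorts_viewed", "time_on_site", "call_duration"]
categorical = ["country_of_residence", "segment"]

model = Pipeline([
    ("prep", ColumnTransformer([
        ("num", "passthrough", numeric),
        ("cat", OneHotEncoder(handle_unknown="ignore"), categorical),
    ])),
    ("clf", GradientBoostingClassifier()),
])

# One row per lead; y marks whether the lead converted (hypothetical data).
X_train = pd.DataFrame({
    "n_contacts": [1, 4, 2], "resorts_viewed": [3, 9, 1],
    "time_on_site": [120, 600, 45], "call_duration": [0, 300, 0],
    "country_of_residence": ["UK", "FR", "UK"], "segment": ["family", "luxury", "family"],
})
y_train = [0, 1, 0]
model.fit(X_train, y_train)

# One propensity score per lead, feeding live personalisation or next best action.
print(model.predict_proba(X_train)[:, 1])
```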
Those scores can then be used across other applications, perhaps where real-time is less critical, such as outbound follow-up calls, dynamic email campaigns, etc.
To deliver on such a use case, it's important to set the tech framework according to those needs, but to start pragmatically and build from there, rather than finding tech and then working out how and where to apply it. In other words, the 'why' is far more important than the 'what'. Underpinning an example such as this are large data stores, AI platforms and different databases to serve different latency requirements.
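As a rough illustration of that last point, the sketch below maps latency tiers to the kind of store that typically serves them; the mapping is an assumption for illustration, not a prescribed architecture.

```python
# Illustrative mapping of latency tiers to backing stores.
# Store choices are assumptions, not a recommended stack.
LATENCY_TIERS = {
    "real-time": "in-memory key-value store",   # e.g. live propensity scores
    "near-time": "analytical warehouse",        # e.g. campaign reporting tables
    "some-time": "data lake / raw file store",  # e.g. ad hoc analysis extracts
}

def store_for(tier: str) -> str:
    """Pick the backing store a use case should read from, based on its latency need."""
    return LATENCY_TIERS[tier]

print(store_for("real-time"))  # -> in-memory key-value store
```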
In summary:
Choose narrow, high-value use cases rather than over-reaching.
Expect heavy engineering: even a single propensity score draws on many inputs and must be produced reliably at scale.
Set the tech framework according to the need, start pragmatically and build from there.
Implementing real-time: Deploy your solutions
The most common reasons for failure at the deployment stage are that the project does not address a strategic priority, that it is not adopted by operational teams, and that the cost of implementation is allowed to spiral.
If the first step of the process has been followed, the question of strategic priority should already have been addressed; however, new solutions that change processes and demand new ways of working can be difficult to embed operationally. Not only do the data, tech and maths have to work in harmony, but so do the business objectives, human aspects and organisation. Misalignment between any of these is likely to result in a less than satisfactory deployment, meaning value will be lost.
In addition, the cost of maintaining and managing real-time is around ten to fifty times higher than for daily or weekly delivery, so it is vitally important that costs are scoped as accurately as possible before embarking on the project and then managed tightly to ensure the ROI of the project is realised.
Essentially, deployment is a change management task to deliver advantage.
Making sure the solution can be seen and actioned by operational users from the outset is critical, so that further development builds on successes. Tackling the low-hanging fruit first, with testing, metrics and performance monitoring, will open the opportunity to move on to more complex implementations, provided the necessary changes are also made to the decision-making process.
This then needs to be supported by training and democratization of the data and vision throughout the organization, rather than it being the preserve of a small number of super-users. Further, regular strategic review will ensure the project remains on track, taking account of changes in business strategy and objectives or external factors such as competitor activity.
In terms of marketing performance and measurement, where econometrics is increasingly being asked to meet the challenges of attribution, the question becomes one of greater granularity and speed: delivering 'live' media mix modelling (MMM) compared with 'classic' MMM, which is typically delivered over longer periods, from quarterly to annually.
This transition is far from trivial, as it means integrating real-time data into a tool that is used strategically to determine optimal investment across channels and/or countries. So it's necessary to manage that disconnect, while also empowering broader teams and monitoring model performance, as you can no longer rely on presentations to feed in the data needed to manage in real-time.
Further, MMM still needs to perform as it always did at a strategic level; real-time MMM essentially extends the lens, allowing you to zoom in to more tactical actions. Clear guidelines are therefore needed on which decisions should be taken in real-time and which over longer time frames. This allows you to shift from a backward-looking view of the business to a test-and-learn approach.
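To give a flavour of what sits underneath, the sketch below shows the core of a simple MMM: media spend transformed with an adstock carry-over effect and regressed against sales. The decay rates, channels and simulated data are illustrative only; a 'live' MMM would essentially refit such a model far more frequently, on more granular data, which is where the engineering burden described above comes in.

```python
# Minimal MMM core: adstock-transformed media spend regressed on sales.
# Decay rates, channels and simulated data are illustrative only.
import numpy as np
from sklearn.linear_model import LinearRegression

def adstock(spend: np.ndarray, decay: float) -> np.ndarray:
    """Carry-over effect: today's impact includes a decayed share of yesterday's."""
    out = np.zeros_like(spend, dtype=float)
    for t, x in enumerate(spend):
        out[t] = x + (decay * out[t - 1] if t > 0 else 0.0)
    return out

rng = np.random.default_rng(0)
weeks = 104
tv = rng.uniform(0, 100, weeks)
search = rng.uniform(0, 50, weeks)

# Hypothetical ground truth used to simulate sales for the example.
sales = 200 + 1.5 * adstock(tv, 0.6) + 3.0 * adstock(search, 0.2) + rng.normal(0, 10, weeks)

X = np.column_stack([adstock(tv, 0.6), adstock(search, 0.2)])
model = LinearRegression().fit(X, sales)
print("channel contributions per unit spend:", model.coef_)  # recovers roughly [1.5, 3.0]
```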
In summary, successful deployment means addressing a strategic priority, embedding the solution operationally through change management, training and democratization, and keeping implementation costs tightly managed.
Ultimately, any real-time solution must be useful, by driving value; usable, by delivering pragmatically; and used, by supporting adoption.
You can watch the presentation in full here.