Your Next Big Data Project? Operational Analytics
Operational analytics is a great way for companies to use Big Data, according to a new Capgemini report.
If Big Data has a killer application, it is operational analytics.
Companies using Big Data initially focused on the customer experience, with Capgemini/Sloan Management Review research finding that 40 percent of analytics initiatives in 2013 were aimed at the customer while 26 percent focused on operational improvements. A lot has changed in three years, however. In 2016, Capgemini surveyed 600 global executives and found 70 percent now emphasize operations rather than customer experience with their analytics projects.
That's because companies get the most bang for the buck with operational analytics, said Steve Jones, Global VP of Big Data for Capgemini. "Customer stuff is great, but operational efficiency can drive significant top and bottom line advantage."
Operational Analytics' Big Benefits
The benefits of operational analytics are far easier to illustrate than those of customer analytics, Jones said. A Capgemini report titled Going Big: Why Companies Need to Focus on Operational Analytics offers the example of an Asian steel manufacturer that used operational analytics to uncover the root causes of quality issues, achieving a 50 percent reduction in lead time for some of its products and a 60 percent reduction in inventory. In another example, the UK's Network Rail used operational analytics to make better decisions about preventive maintenance for its rail infrastructure, realizing cost savings of 125 million euros (U.S. $141 million) over a five-year period.
This kind of measurability is "a definite advantage," Jones said, as it adds clarity to Big Data analytics projects.
Despite this advantage and the big benefits realized by some companies using operational analytics, relatively few companies are doing it well. Just 39 percent of survey respondents have extensively integrated their operational analytics initiatives with their business processes. Further, just 29 percent said their operational analytics initiatives had yielded the desired results.
While some of the usual technical suspects, including poor quality data and a wide diversity of data formats, contribute to this lack of results, Jones said simply determining the goals of an operational analytics project and establishing the appropriate scope are key challenges for many companies.
"Getting data is the first problem, then determining which analytics need to use that subset of data, and, then once the analytics are done you need to be able to determine how to turn it into something that has real business impact," he said.
An Excel Mindset and Operational Analytics
Companies that Capgemini terms "game changers" – those that have integrated most of their analytics initiatives with their business processes and realized desired benefits from the initiatives – use what Jones calls an "Excel mindset," which he said is the diametric opposite of the "boil the ocean" approach that some companies take with Big Data and other analytics projects.
"They start by saying 'What is the problem we are going to fix?' and then 'What schema and analytics do we need just for that problem? If I am doing stock control, I do not need schema for everything in the business.' Then they build something specific they can integrate into SAP," he said. "You do not want to say 'there is a 68.4 percent chance you need to do this or 74.2 percent chance to do that.' If you are a stock control manager in a warehouse in Des Moines, that really does not help you; you want to know you need to order six items. It is not about a report or a schema, but about actionable intelligence and insight at the point of action."
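Jones's distinction between a probability report and "actionable intelligence at the point of action" can be sketched in a few lines of code. The function name, reorder logic, and inventory numbers below are hypothetical illustrations, not from the Capgemini report; the point is that the output is a concrete instruction (order six items), not a percentage.

```python
# Hypothetical sketch: turn an analytics output (a demand forecast)
# into the concrete action a stock-control manager needs,
# rather than surfacing a raw probability.

def units_to_order(forecast_demand: int, on_hand: int, safety_stock: int) -> int:
    """Return the action itself: how many units to order now."""
    shortfall = forecast_demand + safety_stock - on_hand
    return max(shortfall, 0)  # never recommend a negative order

# The model forecasts demand of 14 units; the warehouse holds 10
# and keeps a safety stock of 2, so the answer is simply: order 6.
print(units_to_order(forecast_demand=14, on_hand=10, safety_stock=2))  # prints 6
```

The design choice mirrors the quote: the analytics run upstream, but what reaches the warehouse in Des Moines is a number of items to order, not a confidence score.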
With the right underlying architecture companies can efficiently roll out point solutions, Jones said, and "get real value because of the low incremental costs."
Big Data infrastructure is far less expensive than traditional data warehouses and quicker and easier to implement, Jones said. Typical components of a flexible Big Data architecture include Hadoop, the statistical language R, Apache HAWQ or a similar query engine, a machine learning library and a graph database. The idea is to create databases on top of machine learning techniques, he said.
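The article names the stack components (Hadoop, R, HAWQ, a machine learning library, a graph database) without showing how they fit together. A minimal, purely illustrative sketch of the pattern Jones describes, with plain Python standing in for the real components: a model scores the raw data, and the scores are then served behind a simple table-like query interface.

```python
# Hypothetical sketch: precompute machine-learning output, then
# expose it as something queryable, like a table. The "model" here
# is a stand-in threshold rule; the machine names and readings are
# invented for illustration.

from statistics import mean

sensor_readings = {
    "pump-1": [0.61, 0.72, 0.80],
    "pump-2": [0.10, 0.12, 0.09],
}

# Scoring step: flag equipment whose average vibration exceeds a threshold.
risk_table = {
    machine: ("high" if mean(values) > 0.5 else "low")
    for machine, values in sensor_readings.items()
}

# Query step: downstream consumers look up results, not raw sensor data.
print(risk_table["pump-1"])  # prints high
```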
Using these kinds of Big Data platforms, Amazon, Google and other Big Data analytics pioneers are "able to deliver huge business innovation on a limited technology base," he said, adding "They innovate in specific, constrained ways, focusing on industrialization and repeatability."
Big Data Haves and Have-nots
Most companies have a long way to go to catch up to Amazon and Google. Capgemini characterizes just 18 percent of companies as game changers. Another 21 percent are "optimizers," companies that have realized early benefits from their analytics initiatives in a limited number of areas within their operations. Twenty percent are "strugglers" that have integrated analytics in most of their business processes but lag in attaining benefits, and 41 percent are "laggards" that are just introducing analytics into their operations.
Among the key differences between game changers and laggards, according to Capgemini:
- Integrating data to achieve a single view of operations data, a step taken by 43 percent of game changers and just 11 percent of laggards
- Routinely collecting unstructured data to improve data quality, practiced by 59 percent of game changers vs. 27 percent of laggards
- Using external data to enhance insights, mentioned by 48 percent of game changers and 23 percent of laggards
- High utilization of operations data, practiced by 68 percent of game changers and 45 percent of laggards
Like other experts, Jones recommends identifying a business problem to be solved with Big Data analytics, then using incremental wins to fund future Big Data projects.
"That first project justifies putting in place a Big Data infrastructure and getting a lot of data in it," he said. "So you might start with predictive maintenance, then move on to stock control. Thanks to that predictive maintenance project, you've already got 90 percent of your data available and 90 percent of the technology. You don't have to go build a data silo; just build set of data distillations to solve that problem."
The model makes sense, he said, because companies generally will incur just a few incremental costs for each new project. "Incremental costs are significantly lower by using these Big Data technologies," he said. "You don't have to establish a stock management system that is going to cost millions of dollars and take several years to implement. You can shift toward an approach that lets you solve your business problems with incremental costs instead of undertaking a series of costly projects."
Embedding Big Data into Business
Perhaps the most important difference between game changers and their less advanced peers, Jones said, is their ability to "go from Big Data being a tech project to being the data fabric embedded within the business."
Looking ahead, Jones said the Internet of Things (IoT) will generate lots of streaming data, which means "streaming and reacting to data in real time will become very, very important."
The next-generation Big Data challenge companies will face, he said, is how to run analytics in a partner's environment. For example, he said, all of the suppliers contributing parts to an airplane generate data; bringing that data together to perform analytics can help solve business problems that affect all of them.
"The first phase of Big Data is getting it right within a company's four walls; the next phase is about collaboration," he said.
Ann All is the editor of Enterprise Apps Today and eSecurity Planet. She has covered business and technology for more than a decade, writing about everything from business intelligence to virtualization.