Data Virtualization and Big Data: A ‘Virtual Case Study’

Pedro Hernandez

Updated · Dec 26, 2012

The following is a “virtual case study” in which Wayne Kernochan takes an existing use case and imagines what would happen if the company extended its use of data virtualization to analytics involving Web-sensor and social-media data. The intent is to give readers additional ideas about profitable future directions in leveraging data virtualization. The details of the “virtual case study” are based on articles in MIT Sloan Management Review; they are factual up until the start of the project described here, while the rest is conjectural.

Data Virtualization Sets the Stage

A large U.S. media/telecommunications company has, over the past few years, been upgrading its customer/subscriber applications to promote greater cost efficiency and a better customer experience. A particular target has been customer services. Efforts to improve the quality of installation (“onboarding”) and upgrade of the company’s solution in the consumer and business markets have improved customer and market perception of the company’s brand and decreased “rework” and repair costs.

Data virtualization has been key to the success of these initiatives. The company has tens of customer-facing, business-critical applications with siloed data sources, all of which needed to be upgraded to support the new customer-service programs, as well as the analytics to determine how well the new strategies were working. The company found that data virtualization sped up implementation of the upgrades and ensured they would fit seamlessly into the new “global architecture” once finished.
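
As a rough sketch of that pattern (not the company’s actual stack), a virtualization layer can attach otherwise siloed databases and query them as if they were one. The example below assumes two hypothetical SQLite silos and uses only Python’s standard library; a real data virtualization product would expose the federated query as a reusable virtual view.

```python
import sqlite3

# Two hypothetical silos: a billing system and an installation ("onboarding") system.
billing = sqlite3.connect("billing.db")
billing.execute("CREATE TABLE IF NOT EXISTS invoices (customer_id INTEGER, amount REAL)")
billing.execute("INSERT INTO invoices VALUES (101, 120.0)")
billing.commit()
billing.close()

onboarding = sqlite3.connect("onboarding.db")
onboarding.execute(
    "CREATE TABLE IF NOT EXISTS installs (customer_id INTEGER, rework_visits INTEGER)")
onboarding.execute("INSERT INTO installs VALUES (101, 2)")
onboarding.commit()
onboarding.close()

# A thin "virtual" layer: attach both silos to one connection and run a single
# federated query, instead of rewriting each application to share one schema.
hub = sqlite3.connect(":memory:")
hub.execute("ATTACH DATABASE 'billing.db' AS billing")
hub.execute("ATTACH DATABASE 'onboarding.db' AS onboarding")
print(hub.execute("""
    SELECT i.customer_id,
           SUM(i.amount)        AS total_spend,
           MAX(o.rework_visits) AS rework_visits
    FROM billing.invoices i
    LEFT JOIN onboarding.installs o ON o.customer_id = i.customer_id
    GROUP BY i.customer_id
""").fetchall())
```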

Moreover, data virtualization allowed rapid implementation of the analytics used to determine solution effectiveness, so that success or failure could be established in a timely fashion and existing strategies could be modified or follow-on strategies rolled out quickly. Finally, the simplification of the company’s architecture cut administrative costs.

Extending Big Data Analytics

Because of the success of its efforts, the company decided to extend its analytics to customer and market data existing outside the company’s firewalls, typically in public clouds: so-called Big Data originating from social media channels and Web sensors. The rationale was that the company’s brand was strongly affected by the effectiveness of its website, as well as by recommendations in channels like Facebook and Twitter.

These channels were also rich sources of context for the data the company was accustomed to collecting at the point of sale. For example, incorporating this data into marketing analyses such as sentiment analysis would allow the company to distinguish customers who were “passionate,” “loyal,” “low maintenance,” or “high maintenance,” not just identify those who spent the most.
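
A toy sketch of that kind of segmentation is shown below; the thresholds, labels, and input fields are purely illustrative assumptions, not the company’s actual model.

```python
def segment_customer(total_spend: float, sentiment: float, contacts_per_month: float) -> str:
    """Combine point-of-sale spend with Web-derived context.

    sentiment: average social-media sentiment in [-1.0, 1.0] (illustrative scale).
    contacts_per_month: support interactions, a rough "maintenance" proxy.
    Thresholds are placeholders, not derived from the case study.
    """
    if sentiment >= 0.5 and total_spend >= 100:
        return "passionate"          # vocal advocate who also spends
    if sentiment >= 0.5:
        return "loyal"               # positive but modest spend
    if contacts_per_month <= 1:
        return "low maintenance"
    return "high maintenance"


print(segment_customer(total_spend=250.0, sentiment=0.7, contacts_per_month=0.5))
```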

Data virtualization, as in the previous stage, proved critical to rapid completion and implementation of the new capabilities. The data virtualization solution proved just as adept at merging semi-structured Web data with the primarily structured internal data as it had been at the simpler case of internal siloed data.
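
A minimal illustration of that merge, assuming JSON social-media records with uneven fields and a small structured customer table keyed by account id (all names hypothetical):

```python
import json

# Semi-structured Web data: social-media posts as JSON, where fields may be missing.
posts = [json.loads(s) for s in (
    '{"handle": "@anna", "text": "install was painless", "likes": 12}',
    '{"handle": "@raj", "text": "still waiting on the upgrade"}',
)]

# Primarily structured internal data: the customer master keyed by account id.
customers = {101: {"name": "Anna B.", "handle": "@anna"},
             102: {"name": "Raj P.", "handle": "@raj"}}

# The "merge": normalize the loose Web records and join them to internal accounts,
# the same kind of mapping a virtualization layer would expose as one logical table.
handle_to_account = {c["handle"]: cid for cid, c in customers.items()}
merged = [
    {"account_id": handle_to_account.get(p.get("handle")),
     "text": p.get("text", ""),
     "likes": p.get("likes", 0)}
    for p in posts
]
print(merged)
```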

Moreover, the data virtualization solution’s ability to load-balance across data-management systems ensured that massive inputs of social media and sensor data did not “crowd out” existing application processing. Finally, the data virtualization solution’s ability to provide one view of the data for end users, developers and administrators simplified the tasks of the latter two enormously – especially given company concerns about security, as this data existed on multiple clouds with varying security schemes, all of which were outside the company firewall.
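
The crowding-out point can be pictured as a simple routing policy: bulk Web/sensor ingestion goes to analytics replicas, while interactive application traffic stays on the operational system. The backend names and rule below are illustrative assumptions, not a description of any specific product’s feature.

```python
from itertools import cycle

# Hypothetical backends: replicas dedicated to heavy social-media / sensor
# analytics, and the primary system serving existing applications.
analytics_replicas = cycle(["analytics-1", "analytics-2"])
OPERATIONAL_PRIMARY = "operational-1"

def route(workload: str) -> str:
    """Pick a backend for a unit of work.

    "batch" covers bulk Web/sensor ingestion and scoring; anything else is
    treated as interactive application traffic.
    """
    if workload == "batch":
        return next(analytics_replicas)   # round-robin the bulk load
    return OPERATIONAL_PRIMARY            # keep application latency stable

print(route("batch"), route("batch"), route("interactive"))
```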

360-Degree Competitive Advantage

The outcome of the project placed the company squarely within the “leaders” group as defined by MIT Sloan Management Review’s studies of users of Big Data. That is, they:

  • Clearly focused on the customer via an unusually extensive 360-degree view;
  • Were able to pick up customer changes exceptionally rapidly, often by catching them on the Web, before they surfaced in buying behavior;
  • Had a very strong understanding of the context behind a buying and repeated-buying decision, due to sentiment analysis and other analytics on Web data; and
  • Were able to supplement existing “portals” and other end-user self-service ad-hoc querying tools with tools and features available on the Web, easily integrated via the data virtualization solution’s “virtual table” features, as sketched after this list.
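
One way to picture the “virtual table” idea: Web-derived records appear as a table alongside internal data, so the same ad-hoc SQL tooling sees a single schema. The sketch below materializes the data in SQLite purely for illustration (a data virtualization product would federate it at query time), and all names are hypothetical.

```python
import sqlite3

# Internal data already reachable through the existing portal's query layer.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)", [(101, 80.0), (102, 45.5)])

# Web-derived records (e.g., scored social-media mentions) arriving as Python dicts.
mentions = [
    {"customer_id": 101, "sentiment": 0.8},
    {"customer_id": 102, "sentiment": -0.2},
]

# Expose the Web data as a table next to the internal one, so ad-hoc SQL tools
# can query both through a single schema.
con.execute("CREATE TABLE web_mentions (customer_id INTEGER, sentiment REAL)")
con.executemany("INSERT INTO web_mentions VALUES (:customer_id, :sentiment)", mentions)

for row in con.execute("""
        SELECT o.customer_id, SUM(o.amount) AS spend, AVG(m.sentiment) AS sentiment
        FROM orders o LEFT JOIN web_mentions m USING (customer_id)
        GROUP BY o.customer_id"""):
    print(row)
```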

Due to the relative newness of cloud architectures and Hadoop, the data virtualization solution played an even more critical role in the success of this project. Without it, unfamiliarity with cloud and Web data access would have made implementation much slower than the company’s original effort; with data virtualization, the slowdown was negligible.

Based on reasonably comparable TCO/ROI studies, I estimate that using data virtualization gave the company roughly a one-year lead over competitors in rolling out the extended 360-degree solution, all else being equal. The company anticipates that rolling out its second generation of extended 360-degree view solutions will extend this lead even further.

The net effect of the project on the company’s bottom line was, if anything, greater than that of the previous project. Customers were better segmented; understanding of the customer from statistics rather than opinion began to permeate the company; customer satisfaction continued to climb; and long-term customer relationships became more common. At the same time, cost efficiencies allowed IT to improve system robustness and responsiveness to end-user needs in an economic environment of slow growth and constrained IT spending – thus significantly improving both the company’s top and bottom lines.

Wayne Kernochan is the president of Infostructure Associates, an affiliate of Valley View Ventures that aims to identify ways for businesses to leverage information for innovation and competitive advantage. Wayne has been an IT industry analyst for 22 years. During that time, he has focused on analytics, databases, development tools and middleware, and ways to measure their effectiveness, such as TCO, ROI, and agility measures. He has worked for respected firms such as Yankee Group, Aberdeen Group and Illuminata, and has helped craft marketing strategies based on competitive intelligence for vendors ranging from Progress Software to IBM.
