Using Data Virtualization to Simplify Application Modernization
Updated · Mar 06, 2015
Application modernization is a big deal for most organizations this year. According to the most recent CSC Global CIO Survey, 70 percent of the 590 IT leaders interviewed say modernizing legacy applications is a critical or high priority in 2015.
It’s not hard to guess why. As Dr. Rainer Sommer, head of GI IT Germany at Zurich Insurance Group, explains in the report, it can be difficult to expand and upgrade traditional applications and the underlying on-premises infrastructure. Application modernization allows companies to take advantage of the cloud’s scalability, flexibility and cost savings.
What’s more, modernization makes it easier for companies to adopt mobile, the Internet of Things (IoT) and Big Data technologies, says Denodo Senior Vice President Suresh Chandrasekaran. Application modernization is a major reason enterprises are revisiting data virtualization solutions like Denodo, Chandrasekaran explained in a recent interview for Enterprise Apps Today.
Not Your Dad’s Data Virtualization
That may be hard to understand if you still believe data virtualization is synonymous with data federation. Ten years ago, data federation was primarily used for real-time reporting and agile business intelligence, Chandrasekaran said. In the past decade, there’s been “a profound shift” in the technology’s capabilities, which now include a metadata discovery layer, discovery of canonical data assets and an abstraction layer that enables real-time data services.
“The primary reason that people are adopting data virtualization is less about real-time integration and more about abstraction and discovery of my enterprise’s data assets,” he said.
Application Modernization Examples
These capabilities can be particularly handy during an application modernization initiative, as companies such as AAA of Northern California, Nevada, and Utah, and B&S Railway have discovered. For example, when AAA decided to separate its Western auto clubs from its insurance divisions, it faced a massive migration for both data and applications.
“We understood the volatility of our data architecture was going to exist for some time,” explained Anthony Kopec, data solutions architect manager for the project, in a video presentation about the effort. Yet, both companies still needed access to applications and data during the migration.
The team used data virtualization to combine the data sources into a virtual data layer, so it no longer mattered where the data actually was. The applications — including cloud-based apps such as Salesforce — were able to access the data layer via Web services and other customized wrappers, Kopec said.
That virtual data layer also allowed the team to bring on new services and modernize the underlying data sources and structures without disrupting the business.
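The mechanics behind this can be pictured with a minimal sketch. The class and source names below are hypothetical illustrations of the pattern Kopec describes, not Denodo's actual API: applications query a logical view name, and the layer routes the request to whichever backend currently holds the data, so a source can be migrated without touching its consumers.

```python
# Minimal sketch of a virtual data layer: callers query logical view
# names, and the layer routes to whichever backend currently serves
# them. All names here are hypothetical, not a vendor API.

class VirtualDataLayer:
    def __init__(self):
        self._sources = {}  # logical view name -> callable returning rows

    def register(self, view_name, fetch_fn):
        """Map a logical view to a concrete source (DB, SaaS API, file...)."""
        self._sources[view_name] = fetch_fn

    def query(self, view_name):
        """Callers never learn which backend serves the view."""
        return self._sources[view_name]()

layer = VirtualDataLayer()

# Today the "members" view is served by a legacy database.
layer.register("members", lambda: [{"id": 1, "name": "Ada"}])
print(layer.query("members"))

# Mid-migration, re-point the view at a cloud API; the same data is
# returned and no calling application changes.
layer.register("members", lambda: [{"id": 1, "name": "Ada"}])
print(layer.query("members"))
```

The point of the sketch is the indirection: swapping the registered source is invisible to every caller, which is what lets the underlying data structures be modernized mid-flight.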
B&S Railway’s business case was slightly different. The company had a more traditional need to modernize legacy applications, which in some cases meant a move to the cloud. Denodo’s data virtualization solution served as the abstraction layer, allowing B&S to move its applications module by module and data source by data source, without impacting operations.
“Lots of companies have modernization of applications for the cloud, mobile, etc.,” Chandrasekaran said. “They have to move very fast, focusing just on the application logic and the UI. The data services team allows them to do that by providing a virtualization layer.”
Accelerated Application Delivery
Data virtualization doesn’t just reduce the chaos of enterprise application modernization. Virtualization can also accelerate the application delivery process, according to Ted Girard, vice president for Delphix Federal. In a recent FCW column, Girard explains how:
“By virtualizing the data inside databases, data warehouses, and applications and files, virtual copies can be distributed throughout the agency ecosystem on demand, drastically reducing the time it takes to request copies of databases for development and testing, rationalize and launch new applications or respond to compliance and reporting requirements. This process can completely transform government data centers by lowering data management and storage costs by up to 90 percent, accelerating application delivery by up to 50 percent and maintaining data integrity and security. Just as server virtualization was transformative, data virtualization is proving to do the same.”
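The idea Girard describes, provisioning "copies" on demand without duplicating storage, resembles copy-on-write overlays. The toy sketch below illustrates that concept only; it is not Delphix's implementation, and all names are made up: each virtual copy shares a read-only base and stores just its own changes, so creating one is effectively instant.

```python
# Toy illustration of copy-on-write "virtual copies": each copy is an
# overlay of private changes on a shared read-only base, so provisioning
# is O(1) and storage grows only with the delta. Hypothetical names;
# not any vendor's implementation.

class VirtualCopy:
    def __init__(self, base):
        self._base = base   # shared, read-only production snapshot
        self._delta = {}    # this copy's private changes

    def read(self, key):
        # Prefer this copy's own writes, fall back to the shared base.
        return self._delta.get(key, self._base.get(key))

    def write(self, key, value):
        self._delta[key] = value  # the base is never modified

prod = {"cust:1": "Alice", "cust:2": "Bob"}
dev = VirtualCopy(prod)    # "copying" the database takes no copying
test = VirtualCopy(prod)

dev.write("cust:1", "Alice-TEST")
print(dev.read("cust:1"))   # Alice-TEST: dev sees its own change
print(test.read("cust:1"))  # Alice: test is isolated from dev
```

Because each environment holds only its delta, dozens of dev and test "databases" can share one physical copy, which is where the storage and provisioning-time savings Girard cites come from.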
More Uses for Data Virtualization
All of this has created a major shift in the use cases for data virtualization, which was once positioned as an integration alternative to ETL. It’s also why companies tend to find multiple uses for today’s data virtualization, Chandrasekaran said.
For instance, one national insurance company found data virtualization helpful in its master data management (MDM) hub project. The company still needed an MDM solution, Chandrasekaran said, and it still needed Informatica’s ETL engine for the heavy integration work. Instead of providing integration for the primary project, data virtualization facilitated service-enabling the master data and combining it with data from non-primary sources in a virtual layer.
Data virtualization can play a role when you need to add context to data, but Chandrasekaran is quick to say it isn’t right for all situations. For instance, you would not want to put data virtualization between an analytics tool and the underlying processing capabilities of Hadoop. But if you’re adding contextual data or streaming data, then data virtualization may help.
“Data virtualization becomes that abstraction view that fills the information gap between all this complexity at the bottom and what business wants, which is much more of a business-focused view of the data assets of the company,” he said.
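That "abstraction view" can be sketched in a few lines. The structures below are invented for illustration: a virtual view joins stable master data with a fast-changing contextual feed at query time, materializing nothing, which is the business-facing view Chandrasekaran describes.

```python
# Illustrative sketch of an abstraction view: combine stable master
# data with fast-changing contextual data at query time, without
# copying either. Data and names are invented for this example.

master = {"SKU-1": {"name": "Widget", "category": "Hardware"}}
live_inventory = {"SKU-1": 42}  # stand-in for a streaming/real-time feed

def business_view(sku):
    """Assemble the business-facing record on demand; nothing is materialized."""
    record = dict(master[sku])               # stable master attributes
    record["on_hand"] = live_inventory.get(sku, 0)  # live context
    return record

print(business_view("SKU-1"))
```

The consumer sees one business-shaped record; where the master attributes and the live figure actually live stays hidden behind the view.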
Loraine Lawson is a freelance writer specializing in technology and business issues, including integration, health care IT, cloud and Big Data.