Metanautix Enables Software-Defined Data Marts

Posted June 26, 2015 by EnterpriseAppsToday.com Staff

Big Data analytics startup rolls out software that allows organizations to forgo buying new hardware and instead build data marts using existing virtualization infrastructure.

We've heard a lot about software-defined storage and software-defined data centers, but Big Data analytics startup Metanautix has made it possible for organizations to build software-defined data marts using its Quest 2.0 software.

According to the company, the product helps organizations build data marts using existing virtualization infrastructure like VMware, instead of buying new hardware. It enables IT to provide centralized operations and apply necessary audit and governance, while allowing individual business units to see only the data that matters to them.

Tony Baer, a principal analyst at research firm Ovum, said in a statement that organizations can use Metanautix to query data stored in multiple sources, including Hadoop, MongoDB, CouchDB and Salesforce. "The data compute engine is not only able to federate data integration with query, but it also offers high performance analytics through SQL and BI tools like Tableau or Eclipse," Baer said.
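To make the federation idea concrete, here is a minimal conceptual sketch. It does not use Quest's actual API (which is not public in this article); instead, two separate SQLite databases stand in for the data silos, and a single SQL statement joins across them without first copying the data into one warehouse:

```python
import sqlite3

# Two attached in-memory databases play the role of separate silos
# (e.g., a CRM system and a Hadoop log store in Quest's case).
conn = sqlite3.connect(":memory:")
conn.execute("ATTACH DATABASE ':memory:' AS crm")   # stand-in for Salesforce
conn.execute("ATTACH DATABASE ':memory:' AS logs")  # stand-in for Hadoop

conn.execute("CREATE TABLE crm.accounts (id INTEGER, name TEXT)")
conn.execute("CREATE TABLE logs.events (account_id INTEGER, action TEXT)")
conn.executemany("INSERT INTO crm.accounts VALUES (?, ?)",
                 [(1, "Acme"), (2, "Globex")])
conn.executemany("INSERT INTO logs.events VALUES (?, ?)",
                 [(1, "login"), (1, "purchase"), (2, "login")])

# One SQL statement spans both "silos": the federated-query pattern.
rows = conn.execute("""
    SELECT a.name, COUNT(*) AS events
    FROM crm.accounts a
    JOIN logs.events e ON e.account_id = a.id
    GROUP BY a.name
    ORDER BY a.name
""").fetchall()
print(rows)  # [('Acme', 2), ('Globex', 1)]
```

The table and attachment names here are illustrative; the point is only that the query engine, not the user, bridges the stores.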

Quest 2.0 is built on the foundation of Metanautix's Quest data compute engine, which it introduced shortly after snagging $7 million in funding last August from Sequoia Capital and Shiva Shivakumar. According to the company, the data compute engine enables end-to-end analysis on-premise or in the cloud, including extract, transform and load (ETL); ad hoc analysis and discovery; and in-memory serving.

Among Quest 2.0’s features:

  • Data is bundled together as a logical unit, a SQL catalog
  • Security controls restrict users from accessing data marts they are not permitted to see
  • Standard InformationSchema tables make it easy to audit data
  • Data may be automatically cached or explicitly snapshotted as needed
  • Multi-node commodity cluster deployments offer scalability
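The first two bullets, bundling data into a logical SQL catalog and restricting what each business unit sees, can be sketched with standard SQL views. This is a generic illustration of the pattern, not Quest's actual catalog syntax or security model; the table, view and column names are hypothetical:

```python
import sqlite3

# A single physical sales table, with a "data mart" defined purely in
# software as a view that exposes only one region's rows.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (region TEXT, customer TEXT, amount REAL)")
db.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("EMEA", "Acme", 120.0),
    ("APAC", "Globex", 75.0),
    ("EMEA", "Initech", 40.0),
])

# The EMEA mart is just a view: no data is copied, and a team querying
# it sees only its own region.
db.execute("""
    CREATE VIEW emea_mart AS
    SELECT customer, amount FROM sales WHERE region = 'EMEA'
""")

rows = db.execute(
    "SELECT customer, amount FROM emea_mart ORDER BY customer"
).fetchall()
print(rows)  # [('Acme', 120.0), ('Initech', 40.0)]
```

In a production engine, GRANT-style permissions on such views (rather than on the underlying tables) would enforce the access restriction; SQLite omits user accounts, so the view alone carries the idea here.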

According to Metanautix, Quest 2.0 works within an organization’s existing architecture and doesn’t require data to be moved to a centralized system first.

"The data marts can flexibly self-fill by referencing data from the silos, role-based access controls enforce data governance, and distributed systems technology delivers performance while improving cluster utilization," said Theo Vassilakis, CEO and founder of Metanautix, in a statement.

Metanautix was co-founded by Vassilakis, who served as the lead engineer for Google's Dremel project, and Apostolos Lerios, who was a data scientist at Facebook. HP was among the companies that beta-tested its software.
