Attunity says its Maestro platform improves IT efficiency across global data centers and in the cloud.
Attunity joins the growing ranks of Big Data vendors with the release of Attunity Maestro, a management platform designed to help organizations automate the process of composing, conducting and monitoring information flow across the enterprise.
According to Attunity, the software will increase productivity by letting organizations design and monitor Big Data transfer processes from a unified control platform.
Supporting global data centers and cloud environments, Attunity Maestro is engineered for medium to large enterprises that need to integrate data transfer processes into daily business activity. The solution accelerates and coordinates the transmission and deployment of Big Data and large-file assets, bringing speed, simplicity and scalability to business and IT processes that depend on information availability.
According to Attunity, the platform aims to meet the needs of a diverse group of users, including IT operations, lines of business and risk management teams. It provides controls for defining, executing, managing and auditing all transaction and automation initiatives. Common uses will include data distribution to remote locations, data consolidation for central analytics, enterprise-wide content management and sharing, and multi-stage content deployment.
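The announcement does not document Maestro's interfaces, but the cycle it describes (defining a transfer, executing it and keeping an auditable record) follows a familiar orchestration pattern. The following Python sketch is purely illustrative: the TransferJob class, its fields and the file paths are hypothetical assumptions, not Maestro's actual API.

```python
# A minimal, hypothetical sketch of the define/execute/audit cycle described
# above. TransferJob and all names here are illustrative assumptions, not
# Attunity Maestro's actual API.
import logging
import shutil
from dataclasses import dataclass, field
from datetime import datetime, timezone
from pathlib import Path

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")


@dataclass
class TransferJob:
    """One unit of data distribution: copy files matching a pattern to a target."""
    name: str
    source: Path
    destination: Path
    pattern: str = "*"
    audit_log: list = field(default_factory=list)  # audit trail of completed moves

    def execute(self) -> None:
        """Copy each matching file and record one auditable entry per transfer."""
        self.destination.mkdir(parents=True, exist_ok=True)
        for src in self.source.glob(self.pattern):
            shutil.copy2(src, self.destination / src.name)  # preserves metadata
            entry = {
                "job": self.name,
                "file": src.name,
                "bytes": src.stat().st_size,
                "completed_at": datetime.now(timezone.utc).isoformat(),
            }
            self.audit_log.append(entry)  # the "auditing" half of the cycle
            logging.info("transferred %(file)s (%(bytes)d bytes)", entry)


if __name__ == "__main__":
    # Example use case from the article: consolidating remote data for
    # central analytics. The paths are placeholders.
    job = TransferJob(name="consolidate-emea",
                      source=Path("data/emea/out"),
                      destination=Path("data/central/in"),
                      pattern="*.csv")
    job.execute()
```

In a real deployment the audit entries would feed a central monitoring console rather than an in-memory list; the sketch is only meant to make the define, execute and audit cycle concrete.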
"Global organizations implementing Big Data initiatives are increasingly challenged with managing and monitoring their most precious and fastest growing asset -- their data," said Shimon Alon, chairman and CEO at Attunity, in a statement. "With Attunity Maestro, we are pleased to address this critical need head-on and help organizations to better manage the flow of information throughout their globally-distributed enterprises."
According to Jeffrey Kelly, principal research contributor and lead Big Data analyst at the open source IT research community Wikibon, managing data flow between heterogeneous systems is a key function for "enterprises that want to become truly data-driven."
This article was originally published on April 3, 2014.