How to Build a Data Assessment Business Case


Updated · Mar 06, 2014

By Michael Collins

Examining an enterprise’s strategic master data within or across multiple systems to ensure quality and accuracy can be a daunting task; however, it can also pay major dividends.

As all key business decisions and transactions are based on an organization’s master data—ranging from manufacturing and warehousing to finance, sales and marketing—conducting a data assessment gives executives proper visibility into whether their strategic and operational data is correct and thus driving optimal company performance.

For example, taking a hard look at a company’s master data often reveals costly errors such as duplicate and misaligned data for assets, customers, materials and vendors; financial issues like incorrect payment terms, credit limits, and cost and profit centers; and problematic HR data regarding current employees. Proactively identifying and correcting these types of data problems can yield millions of dollars in savings for large entities.

Why Conduct a Data Assessment?

Whether a company is implementing a new business system—moving data from an old system to a new one—or looking to improve the quality of data in its existing systems, managers often struggle to get a handle on the quality of their overall master data. By leveraging technology and data experts to review source data at the master, operational and transaction levels, executives can break down that behemoth effort and take a critical step toward understanding whether their organization’s data meets established standards and where the right skills and data resources can be applied to support a well-oiled data operation.

They can also identify where data fails to harmonize across multiple source systems and where business processes are in jeopardy because the data does not support certain requirements.

As an example, an industrial materials operation may have advanced planning and optimization capabilities to manage demand, production planning and detailed scheduling. The system may be best-of-breed, but that means little if the information within it is inaccurate, because inaccurate data cannot yield profitable results by ensuring that the right materials are available at the right times without excesses or shortages. By lifting the hood and examining their organization’s complex master data, company leaders can establish a baseline on current quality levels and build a strong information governance strategy based on a solid understanding of master data relevance and data quality metrics.

Overcoming Data Ambiguity

A data assessment is often sparked by confusion around quantifying an organization’s level of master data quality, determining the complexity of establishing a data governance program or assembling the right team and strategy to support a data conversion. Rather than basing answers on factual discovery of the data, teams often rely on anecdotal conversations representing varying perspectives. Unfortunately, the larger and more dispersed the organization, the less accurately these exchanges represent the organization as a whole, leading to widespread data inaccuracies and decisions based on speculation and ambiguity.

The key is to examine company data from an unbiased perspective with the most critical business processes in mind. This strategy gives management a complete picture of how individual business units treat and interact with data—including important nuances—and the level of adherence to existing standards.

Identifying Key Data Problems

Data inaccuracies can reach across all areas of a company’s operations, but certain ones cause significant and costly damage. Comprehensively assessing corporate source data helps stop errors across the board and prevent unnecessary losses. Major areas where common data errors occur include:

Duplication of Data. This issue presents significant risk for organizations. For example, if a salesperson creates a duplicate record to execute a sales order (whether intentionally or inadvertently), they effectively double the customer’s approved credit limit. Duplicates can also degrade customer satisfaction if one salesperson enters information into one record and another enters data into the duplicate, opening the door to major miscommunications with the customer regarding late payments, order details and the like.
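As a rough illustration, the sketch below flags likely duplicate customer records by comparing a normalized name plus postal code and totals the credit exposure across the matches. The field names, sample records and matching rules are hypothetical; a real assessment would tune the match keys to the organization’s data.

```python
# Minimal sketch: flag likely duplicate customer master records by
# normalizing name and postal code. Fields and sample data are hypothetical.
from collections import defaultdict

customers = [
    {"id": "C001", "name": "Acme Industrial, Inc.", "postal_code": "60601", "credit_limit": 50000},
    {"id": "C002", "name": "ACME Industrial Inc",   "postal_code": "60601", "credit_limit": 50000},
    {"id": "C003", "name": "Globex Manufacturing",  "postal_code": "19103", "credit_limit": 75000},
]

def normalize(name: str) -> str:
    """Lowercase, strip punctuation and common legal suffixes so near-identical names match."""
    cleaned = "".join(ch for ch in name.lower() if ch.isalnum() or ch.isspace())
    for suffix in ("incorporated", "inc", "llc", "ltd"):
        cleaned = cleaned.replace(suffix, "")
    return " ".join(cleaned.split())

# Group records by (normalized name, postal code) and report any group with more than one record.
groups = defaultdict(list)
for record in customers:
    groups[(normalize(record["name"]), record["postal_code"])].append(record)

for key, records in groups.items():
    if len(records) > 1:
        exposure = sum(r["credit_limit"] for r in records)
        ids = ", ".join(r["id"] for r in records)
        print(f"Possible duplicates {ids}: combined credit exposure {exposure}")
```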

Misaligned Data in Multiple Systems. Companies often rely on multiple systems to interface with each other and share common data. They may rely on one system for inputting financial data and another for inputting operational data. If the systems aren’t syncing properly, or if erroneous data entry practices exist, major problems can arise. For example, one system may be intended for adding or changing vendor and procurement/finance-related information and another for managing details around plant maintenance for physical assets. If someone mistakenly used the second system to add vendors or enter information about purchase orders, the two systems would consistently carry different vendors and vendor attribute data, creating issues around missing purchase orders and misaligned budgeting.
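The sketch below illustrates one way to surface this kind of misalignment: compare vendor extracts from two systems and report records that exist in only one system or whose attributes disagree. The system roles, field names and records are hypothetical.

```python
# Minimal sketch: reconcile vendor master data between two systems.
# System names, fields and records are hypothetical.
finance_vendors = {
    "V100": {"name": "Steel Supply Co", "payment_terms": "NET30"},
    "V101": {"name": "Rapid Logistics", "payment_terms": "NET45"},
}
maintenance_vendors = {
    "V100": {"name": "Steel Supply Co", "payment_terms": "NET60"},  # terms disagree
    "V102": {"name": "Valve Works",     "payment_terms": "NET30"},  # missing from finance
}

# Vendors present in only one system.
print("Only in finance system:", sorted(finance_vendors.keys() - maintenance_vendors.keys()))
print("Only in maintenance system:", sorted(maintenance_vendors.keys() - finance_vendors.keys()))

# Vendors present in both systems but with conflicting attribute values.
for vendor_id in sorted(finance_vendors.keys() & maintenance_vendors.keys()):
    for field in ("name", "payment_terms"):
        a = finance_vendors[vendor_id][field]
        b = maintenance_vendors[vendor_id][field]
        if a != b:
            print(f"{vendor_id}: {field} differs ({a!r} vs {b!r})")
```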

Payment Terms. While the financial, sales and purchasing teams may have an idea of which payment terms exist and are in use, they are often amazed at the volume of non-standard payment terms actually in use across the organization. This clarity often enables CFOs and the finance department to pinpoint the source of cash flow issues that were previously unexplained.
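A payment-terms review can start as simply as tallying the terms found on records against an approved standard list, as in the sketch below; the term codes and counts are illustrative only.

```python
# Minimal sketch: tally payment terms observed on vendor/customer records
# against an approved list. Term codes and data are hypothetical.
from collections import Counter

approved_terms = {"NET30", "NET45", "NET60"}
observed_terms = ["NET30", "NET30", "NET90", "2/10 NET30", "NET45", "NET120", "NET90"]

counts = Counter(observed_terms)
for term, count in counts.most_common():
    status = "OK" if term in approved_terms else "NON-STANDARD"
    print(f"{term:12} {count:3}  {status}")

non_standard = sum(c for t, c in counts.items() if t not in approved_terms)
print(f"{non_standard} of {len(observed_terms)} records use non-standard terms")
```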

Chart of Accounts (COA). Complexities and differences across multiple COAs impact the accuracy of financial reporting, which affects the company’s ability to close the books on time and track KPIs in a timely manner. Taking a deep look into the COAs provides visibility into challenges that may arise if a new chart is created to combine separate business units into one. Assessing the COAs across all relevant systems is paramount to optimizing the efficiency and effectiveness of all financial reporting.

Cost and Profit Centers. Ensuring correct data for cost and profit centers is critical for getting an accurate picture of business operations. Companies must be able to confirm that they are posting transactions to the right accounts and, ideally, prevent incorrect postings before they occur; duplication and improper configurations are frequent causes of misposted transactions.
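One basic control is to validate postings against the cost center master, flagging postings that reference unknown or blocked cost centers. The sketch below is a minimal, hypothetical example of that check; the codes and amounts are illustrative only.

```python
# Minimal sketch: validate that postings reference cost centers that exist
# and are active in the master data. Names and sample records are hypothetical.
cost_center_master = {
    "CC-1000": {"description": "Plant Operations", "active": True},
    "CC-2000": {"description": "Sales - East",     "active": False},  # blocked
}
postings = [
    {"doc": "D001", "cost_center": "CC-1000", "amount": 1250.00},
    {"doc": "D002", "cost_center": "CC-2000", "amount": 430.00},   # posted to a blocked center
    {"doc": "D003", "cost_center": "CC-9999", "amount": 87.50},    # unknown cost center
]

for posting in postings:
    center = cost_center_master.get(posting["cost_center"])
    if center is None:
        print(f'{posting["doc"]}: cost center {posting["cost_center"]} not in master data')
    elif not center["active"]:
        print(f'{posting["doc"]}: cost center {posting["cost_center"]} is blocked/inactive')
```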

Human Resources Mistakes. Significant HR errors occur due to inaccurate data, including companies continuing to pay former employees who have not been properly removed from payroll. This may happen with regular payroll or with employees who received a one-time payment and are no longer with the company. There are also frequent problems with employees’ birthdates being entered incorrectly, such as birthdates and hire dates being switched by mistake, which not only impacts benefits eligibility but can result in regulatory violations. Incorrect organization structures often lead to data access and data security vulnerabilities.
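Two of these HR checks are straightforward to express as data rules: terminated employees who are still payroll-active, and birthdates that fall after hire dates (a common sign the fields were swapped). The sketch below assumes hypothetical employee records with those fields.

```python
# Minimal sketch: two simple HR data checks. Field names and sample data are hypothetical.
from datetime import date

employees = [
    {"id": "E01", "birth_date": date(1985, 4, 2),  "hire_date": date(2010, 6, 1),
     "termination_date": None,              "payroll_active": True},
    {"id": "E02", "birth_date": date(1979, 1, 15), "hire_date": date(2008, 3, 3),
     "termination_date": date(2013, 9, 30), "payroll_active": True},   # still being paid
    {"id": "E03", "birth_date": date(2011, 7, 12), "hire_date": date(1983, 5, 20),
     "termination_date": None,              "payroll_active": True},   # dates look swapped
]

for emp in employees:
    # Check 1: terminated but still flagged as active for payroll.
    if emp["termination_date"] and emp["payroll_active"]:
        print(f'{emp["id"]}: terminated {emp["termination_date"]} but still payroll-active')
    # Check 2: birth date after hire date suggests the two fields were switched.
    if emp["birth_date"] > emp["hire_date"]:
        print(f'{emp["id"]}: birth date {emp["birth_date"]} is after hire date {emp["hire_date"]} (possible field swap)')
```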

Engaging in a holistic data assessment offers organizations an unfiltered view into what may be happening with their data, causing unnecessary headaches and negatively impacting business results. Companies may suspect that they have “bad” data and field complaints from employees, but they often struggle to quantify the severity of the issues. Properly examining the data is the first step in understanding where the data stands today and determining the right approach for remediating the issues as part of a solid data governance strategy.

Michael Collins is a global vice president at BackOffice Associates, a provider of information governance and data migration solutions, focusing on helping Fortune 1000 customers manage one of their most critical assets – data.
