Can This Project Be Saved? Yes, with Data First Approach

Posted December 5, 2014 by EnterpriseAppsToday.com Staff

A Data First approach can help organizations avoid problems common in business intelligence projects.

By Pedro Cardoso, BackOffice Associates

In my last article, we discussed how organizations can effectively deliver, improve and sustain business outcomes and value by embracing a "data first" approach. In this post we’ll take a look at some examples of real-world projects in distress, and how a Data First methodology would enable different outcomes. 

The Business Intelligence Project 

At the outset of a business intelligence project the team starts down the path of gathering detailed requirements from key stakeholders and the business community. A governance structure is established to review, consolidate and prioritize feature and reporting requests best aligned with operational and strategic priorities.

With requirements in hand, the BI team proceeds to provision the required data and perform the mappings and transformations needed to populate the target data model structures. Questions of data readiness -- whether current business processes and source systems can actually support the target information model and outcomes -- are not addressed prior to the build (or realization) phase of the project.

Not until late in the user validation and acceptance testing phases are the underlying deficiencies in data readiness and gaps in design assumptions discovered. The project enters "SWAT team" mode, with groups focused on trying to "fix" the data and remediate faulty design assumptions. Business rules and mapping tables find their way into the data warehouse build scope of work, increasing complexity and expanding the scope of work required by the BI team. User adoption of the new enterprise data warehouse (EDW) content is challenged, with existing offline Excel and manually curated reports continuing to be used.

A new initiative is approved, revisiting initial design assumptions, now with a specific focus on business process capabilities around information and data quality management. It becomes clear that most of the original build activity will not be usable, and cost projections for the effort grow to four times the original budgeted amount. Business benefits and process efficiencies are delayed, with an ROI profile that is significantly different from the original.

BI Project, with Data First Approach

In parallel to the project visioning and requirements phase, a core data team would have been focused on data provisioning, profiling and validating alignment to the design from a data and business readiness perspective. This insight would have been available prior to finalizing requirements and solution blueprinting, providing the opportunity to re-align design assumptions and proactively resource any identified data and process capability gaps.
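The profiling activity described above can be sketched in a few lines. This is a minimal illustration, not a prescribed implementation: the field names, the code set, and the readiness checks (null rates and domain conformance) are invented for the example.

```python
# Minimal data-profiling sketch: before requirements are finalized, the
# core data team checks whether source records are ready to populate the
# target model. All names and thresholds here are hypothetical.

def profile(records, required_fields, domains):
    """Summarize readiness of source records against target-model expectations.

    records:          list of dicts extracted from a source system
    required_fields:  fields the target model cannot tolerate as missing
    domains:          {field: set of allowed values} for coded fields
    """
    total = len(records)
    report = {}
    for field in required_fields:
        missing = sum(1 for r in records if not r.get(field))
        report[field] = {"missing_pct": round(100.0 * missing / total, 1)}
    for field, allowed in domains.items():
        bad = sum(1 for r in records if r.get(field) not in allowed)
        report.setdefault(field, {})["out_of_domain_pct"] = round(100.0 * bad / total, 1)
    return report

# Example: half the customer records lack the region code the target
# dimension requires, and one uses a non-standard value ("Europe").
customers = [
    {"id": 1, "region": "EMEA"},
    {"id": 2, "region": ""},
    {"id": 3, "region": "Europe"},   # not in the standard code set
    {"id": 4, "region": None},
]
print(profile(customers, ["region"], {"region": {"EMEA", "AMER", "APAC"}}))
# -> {'region': {'missing_pct': 50.0, 'out_of_domain_pct': 75.0}}
```

Surfacing numbers like these before blueprinting is what gives stakeholders the chance to re-align design assumptions while change is still cheap.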

Stakeholder expectations could be calibrated with the insights provided and, depending on the business process gaps requiring remediation, project plans might require extending to allow for the required activity. In comparison to the previous outcome, while likely still coming in slightly over originally planned costs, the overall cost avoidance and savings from employing a Data First approach would be significant.

End user adoption of the delivered solution would be high, as key business stakeholders were involved throughout the lifecycle of the project, providing the required feedback and ability to "course correct" and align solution outcomes with realistic business expectations. The previous offline reporting processes are discontinued, and the project is labeled a success!

Reporting Tool Project

Users are unhappy, as internal IT and business intelligence teams struggle to service a five-month backlog of reporting and information requests. A fragmented information delivery and systems landscape is posing a challenge for large-scale business transformation and process improvement projects to operate effectively, with a significant knowledge gap existing in the organization. A decision is made to invest in an enterprise reporting tool that can provide users with the ability to "self serve" what they need and reduce IT resource burden.

A tool is selected, and the reporting vendor is engaged to implement the new solution. The "self service" project is scoped with key dimensions, measures and reporting artifacts identified from an exhaustive set of backlogged reporting requirements. Users embrace their new reporting tool and training feedback is overwhelmingly positive.

The build phase begins in earnest. The implementation team works furiously to identify requisite data in source systems and build ETL feeds to populate the data warehouse and target design. Issues around data quality now come to light, with differences in how multiple source systems define and retain transactional data -- posing significant data harmonization challenges. 
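The harmonization challenge above can be made concrete with a small sketch. The system names, status codes, and field names below are invented for illustration; the point is only that two sources recording the "same" order differently must be translated into a common vocabulary before warehouse loads can be compared.

```python
# Hedged sketch of data harmonization across two hypothetical source
# systems ("erp" and "crm") that encode order status and dates differently.
from datetime import datetime

# Per-system translation rules into the common warehouse model.
STATUS_MAP = {
    "erp": {"C": "CLOSED", "O": "OPEN"},
    "crm": {"complete": "CLOSED", "in_progress": "OPEN"},
}
DATE_FORMAT = {"erp": "%d.%m.%Y", "crm": "%Y-%m-%d"}

def harmonize(source, row):
    """Map one source row into the common order model."""
    return {
        "order_id": str(row["id"]),
        "status": STATUS_MAP[source][row["status"]],
        "order_date": datetime.strptime(row["date"], DATE_FORMAT[source]).date().isoformat(),
    }

# The same business event as each system records it.
erp_row = {"id": 1001, "status": "C", "date": "05.12.2014"}
crm_row = {"id": "1001", "status": "complete", "date": "2014-12-05"}

# After harmonization, both describe the event identically.
print(harmonize("erp", erp_row) == harmonize("crm", crm_row))  # True
```

Every mapping table like `STATUS_MAP` that is discovered mid-build, rather than during an up-front readiness assessment, expands the BI team's scope in exactly the way the project described here experienced.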

The project team decides to focus on meeting the known information requirements and enabling the "self service" model with the new reporting tool. Where data availability issues across systems existed, transactional feeds are replaced with an aggregated (summary) approach. Where business process issues exist, extract logic is augmented with business rules and filters to eliminate the data "noise." During user acceptance testing (UAT), additional ETL customizations and tweaks to ETL logic are performed to ensure scripts and validations can be successfully completed.

Data marts come online incrementally and users transition to the new reporting environment. Over time, concerns around report accuracy grow, as it is discovered that the fixes performed during UAT were not sufficient to sustain alignment between what was being reported and reality. Where summary data was sourced, the lack of transaction-level detail poses a challenge when trying to tie numbers back to specific events and answer new questions not previously considered. Because this was approached as a "tool" implementation, no formal data governance or quality management processes were put in place that could address the root causes.
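The drill-down gap created by an aggregated feed is easy to demonstrate. The data below is invented: a summary-only load can answer the questions it was built for, but a new question that needs the discarded transaction detail cannot be answered from the mart at all.

```python
# Illustrating why summary-only feeds block new questions (sample data
# is hypothetical). The warehouse retains only totals by product.

transactions = [
    {"id": "T1", "product": "A", "region": "EMEA", "amount": 100},
    {"id": "T2", "product": "A", "region": "AMER", "amount": 250},
    {"id": "T3", "product": "B", "region": "EMEA", "amount": 75},
]

# The aggregated feed: totals by product, nothing else survives the load.
summary = {}
for t in transactions:
    summary[t["product"]] = summary.get(t["product"], 0) + t["amount"]

print(summary)       # {'A': 350, 'B': 75}

# A known question works against the summary...
print(summary["A"])  # 350
# ...but a new question ("product A revenue by region?") cannot be
# answered from `summary` alone -- the region detail was discarded
# upstream, so analysts fall back to source systems or offline reports.
```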

The issues around having multiple versions of the truth and the need for alignment on the "right numbers" remain a systemic problem. The new reporting tool did not address the underlying information lifecycle management needs that were the core issue.

Reporting Tool Project, with Data First Approach

The first issue with this project is the notion that implementing a reporting tool will cure the underlying knowledge gaps and data quality issues. If the team had focused on the business information needs first and taken steps to address the standardization of data elements, business definitions and closing of key process gaps, the result would be dramatically different. Applying a Data First methodology would have allowed the company to delay the tool selection process and instead spend time properly documenting and modeling enough of the target information model (TIM) needed to enable a narrower, but deep and well-defined, set of reporting and analytics capabilities. 

Tool selection would also have been delayed if the organization discovered that the diverse nature of enterprise data consumption meant a single tool wouldn’t suffice. Instead the funds earmarked for a reporting tool might be diverted to building a robust, extendable EDW and ensuring data was accessible, well documented and business-ready; focusing on enabling a "single source of truth."

Another issue was the myopic focus on the current reports as a key measure of project success rather than providing specific subject-oriented information models that could be used to answer questions across a broad range of meaningful dimensions and measures. The implementation team might have made different decisions early on with this perspective, when initially faced with the gaps in model design.

The bottom line is that one of the most underestimated elements of information and analytics projects is, ironically, the deliberate focus required to profile, validate and align solution design with business process and data readiness capabilities.

Management teams are focused on the final outcome, such as the data warehouse, common reporting platform or KPI dashboards. Project managers are focused on delivering the agreed-upon scope at the agreed cost, with the resources provided and on schedule. IT is focused on the technology and critical landscape, security and solution compatibility considerations.

Organizations must look beyond these functional silos to resource projects with the information lifecycle stewardship experience, skills and competencies that can bring a Data First focus to BI and business transformation projects. Make sure data stewardship has a seat at the head table, and you'll find it will deliver just what your project ordered!

Pedro Cardoso is a senior information and data governance consultant at BackOffice Associates, a provider of information governance and data migration solutions, focusing on helping Fortune 1000 customers manage one of their most critical assets -- data.
