
The importance of Master Data, part two

In my previous blog, I looked at the importance of Master Data and set out why I believe it is a vital component of any successful PLM implementation.  In this second instalment, I want to look at why a Master Data unification project represents a unique opportunity to improve business processes across the board and promote a new ideology beyond the core of PLM.

The process of Master Data unification and organisation is a difficult one to begin.  Too often, business and executive management recognise the necessity of the work but believe that the resources needed to retrieve, organise and unify that Master Data are too significant and costly.  As a standalone project, it can be tough to demonstrate a tangible ROI (return on investment) for the kind of effort necessary to do a Master Data project justice.

To avoid this, organisations should look to align their Master Data project with other PLM-related initiatives: initiatives that improve business processes, business intelligence, integration with third-party extended PLM solutions, reporting and analytics, or that help reduce the administrative overhead caused by redundant data entry.  Pairing a Master Data project with other introspective endeavours that deliver demonstrable benefits to the business takes some of the sting out of the standalone business case.  I call this broader approach the Master Data Ideology.

Where does Master Data reside within your business?  That depends greatly on your IT strategy and architecture, and on the maturity of your Master Data processes.  As with any product development maturity improvement, it is important that Master Data is only re-engineered to a level appropriate for your business, to avoid over-engineering.

Another sticking point for management when it comes to Master Data is the desire to do everything at once.  We would never advise our clients to move immediately from using pencil and paper to a full-blown, state-of-the-art CAD solution, and the same principle applies to Master Data projects.  A typical step-by-step Master Data project will look something like this:

Step One

Low-level use of Excel (or comparable software) for data gathering and organisation.

Step Two

Detailed analysis of how that data will be used within the business, covering the supporting field types and permitted data values, before organising, documenting and educating people on these processes across the entire extended supply chain (a minimal sketch of this kind of check follows Step Three below).

Step Three

At this stage the benefits of a broad Master Data strategy become clear.  A business could, for example, decide to add an Integrated Operational Data Store or a Master Data Hub, which can then be used as part of the staging area for the analytical Business Intelligence warehouse, as well as acting as a hub for co-existing PLM applications.  This allows new business functionality to be delivered on the analytical side in parallel with the new operational and process functionality delivered across the entire business.
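To make Steps One and Two a little more concrete, here is a minimal sketch of the kind of check that can sit behind the gathering template.  It assumes the Excel gathering sheet has been exported to CSV and that a simple data dictionary of field types and permitted values has already been agreed; the file name, field names and rules below are purely illustrative, not a prescription.

```python
import csv

# Illustrative data dictionary agreed in Step Two: the expected type and,
# where relevant, the permitted values for each Master Data field.
# The field names and rules here are hypothetical examples.
DATA_DICTIONARY = {
    "style_number": {"type": str, "required": True},
    "season":       {"type": str, "allowed": {"SS25", "AW25"}},
    "cost_price":   {"type": float, "required": True},
}

def validate_row(row: dict, line_no: int) -> list[str]:
    """Check one gathered record against the agreed data dictionary."""
    issues = []
    for field, rule in DATA_DICTIONARY.items():
        value = (row.get(field) or "").strip()
        if not value:
            if rule.get("required"):
                issues.append(f"line {line_no}: '{field}' is missing")
            continue
        if rule["type"] is float:
            try:
                float(value)
            except ValueError:
                issues.append(f"line {line_no}: '{field}' is not numeric: {value!r}")
        if "allowed" in rule and value not in rule["allowed"]:
            issues.append(f"line {line_no}: '{field}' has unexpected value {value!r}")
    return issues

# Step One output, exported from the Excel gathering template as CSV.
with open("master_data_gathering.csv", newline="", encoding="utf-8") as f:
    all_issues = []
    for line_no, row in enumerate(csv.DictReader(f), start=2):
        all_issues.extend(validate_row(row, line_no))

for issue in all_issues:
    print(issue)
```

Even something this simple surfaces missing fields and unexpected values early, before they reach the wider business.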

A core tenet of the Master Data Ideology is that improving the quality of that data requires more than just PLM software.  As I explained in my previous blog, in today’s hyper-connected world, the same data that underpins PLM is now shared across many different enterprise systems at every stage of the extended product development lifecycle.

When a PLM implementation is built on top of confused and conflicting sources of data, and that data is allowed to populate the extended systems that integrate with PLM, an old adage comes to mind: “garbage in, garbage out”.  Without due care being given to a Master Data project (and one that takes in the entire extended supply chain), a business runs the risk of its PLM solution, where the centralised data will reside, being riddled with poor-quality information.

The right PLM solution is unquestionably the best way for businesses to achieve efficiency savings and remain competitive in a difficult climate, but its potential is limited when implementation occurs without a concurrent Master Data project.  In those circumstances, the PLM solution will quickly disseminate the poor-quality data it was fed to the full range of solutions that rely on it.

Even a thoroughly planned and sophisticated Master Data gathering and cleansing process cannot resolve data quality issues where proper standards and governance procedures are not in place.  Because the data involved permeates the whole business, the Master Data Ideology goes well beyond the initial gathering and cleansing process: it requires a culture change in the understanding of ownership, and in the responsibility for carefully considering the value of business-wide data and how that data works in conjunction with PLM-supporting solutions.

In these broad Master Data projects, the business needs to carefully analyse data ownership, usage, re-engineering and data governance in order to address long-standing issues, prevent new data quality issues from occurring, and provide an enterprise exception-processing framework for efficient data management.

In a typical broad Master Data project, we will see the gathering, cleansing and unification of existing data that has thus far been used by a wide range of systems (CAD / CAM / PDM / ERP / GSD / CRM / Excel / PLM / 2D / 3D / others), into or through a centralised data gathering template.  This is where most data quality issues are discovered and resolved, before the data is loaded into the centralised storage, which, depending on maturity levels, could in some cases be the core PLM itself.  This cleansing exercise alone can deliver the kind of broader business benefits (improving the reliability of every solution in the product lifecycle) that add to the ROI case for the project, as well as ensuring that the consolidated Master Data source is as accurate and consistent as possible.
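As an illustration of that cleansing and unification step (and of the exception processing mentioned above), the sketch below takes records gathered from several hypothetical source systems, applies basic standardisation, groups them on the agreed master key and sets conflicting records aside for an owner to resolve, rather than silently overwriting one source with another.  The systems, fields and values are invented for the example.

```python
from collections import defaultdict

# Hypothetical records gathered from different source systems (ERP, PDM,
# spreadsheets...), each already mapped onto the shared gathering template.
gathered = [
    {"source": "ERP",   "style_number": "ST-1001",  "description": "Crew neck tee"},
    {"source": "PDM",   "style_number": "st-1001 ", "description": "Crew Neck T-Shirt"},
    {"source": "Excel", "style_number": "ST-1002",  "description": "Slim chino"},
]

def normalise(record: dict) -> dict:
    """Apply basic standardisation before records are compared."""
    cleaned = dict(record)
    cleaned["style_number"] = record["style_number"].strip().upper()
    cleaned["description"] = " ".join(record["description"].split())
    return cleaned

# Group by the agreed master key so duplicates and conflicts become visible.
by_key = defaultdict(list)
for record in map(normalise, gathered):
    by_key[record["style_number"]].append(record)

consolidated, exceptions = [], []
for key, records in by_key.items():
    descriptions = {r["description"] for r in records}
    if len(descriptions) == 1:
        consolidated.append(records[0])    # clean, unified record
    else:
        exceptions.append((key, records))  # conflicting data: route to a data owner

print(f"{len(consolidated)} record(s) ready to load, {len(exceptions)} for review")
```

The important design choice is that conflicts are routed to the people who own the data, not resolved automatically: that is the exception-processing framework in miniature.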

When it comes time to import that data into the PLM solution, this is typically done with the aid of a defined template that will also support the integration of other PLM-related solutions, both now and in the future.  As a corollary benefit, once this template is established, the Master Data project can be run across the full range of supporting solutions simultaneously, in the confidence that the data from those disparate systems will conform to the standards established by the template itself.
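By way of illustration only (the template will look different in every business, and no particular PLM product is assumed), such a template can be expressed as a shared set of field formats that every source system’s extract is checked against before it is handed to the PLM load routine.

```python
import re

# Hypothetical definition of the agreed import template: the field names and
# simple formats that every source system's extract must follow.
IMPORT_TEMPLATE = [
    ("style_number",  r"ST-\d{4}"),
    ("season",        r"(SS|AW)\d{2}"),
    ("supplier_code", r"[A-Z]{3}\d{3}"),
]

def conforms_to_template(record: dict) -> bool:
    """True if the record matches the agreed template, field by field."""
    return all(
        re.fullmatch(pattern, str(record.get(field, "")))
        for field, pattern in IMPORT_TEMPLATE
    )

extract = {"style_number": "ST-1001", "season": "SS25", "supplier_code": "ABC123"}
print(conforms_to_template(extract))  # True: safe to hand to the PLM load routine
```

Because every system is measured against the same definition, the template acts as a single, reusable standard for current and future integrations.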

As well as streamlining simultaneous projects, a well-mapped broad Master Data project can serve as a roadmap for future integration projects.  The template defined in the initial gathering and cleansing exercise can act as a framework for those future projects, along with the meta-data defined at the same stage.  Too often today, we find that future integration is overlooked at the start of a PLM project, meaning that re-engineering is required when integration with additional solutions becomes necessary.

Integration (at both a systems and data level) requires a comprehensive strategy, taking in people, processes, organisation and technology.  It can be conducted over many years, change entire ways of working, disrupt business processes and involve a significant number of business stakeholders.  Adding re-engineering because of a lack of foresight will only increase the cost and disruption of those future integration and implementation projects.  The business savings that can be achieved by establishing a robust integration framework and a healthy Data Governance culture at the time of the initial Master Data project should be clear.

A properly-conducted broad Master Data project requires a new mind-set based on a careful analysis of the value that it can bring to the organisation, and an understanding that it simply cannot be run in isolation.

Mark Harrop is a leading Apparel PLM expert with more than 37 years’ experience in the industry.  He is the Managing Director of The Product Development Partnership and a recognised apparel industry thought leader.
