Despite being a relatively new term, Master Data is really a new face on an old problem. Master Data (sometimes referred to as “Reference Data”) is one of the most vital components of any enterprise system, yet one that you will very rarely see mentioned in sales literature. Without it, work can be duplicated, product development complicated unnecessarily, and entire implementations scuppered.
In the world of modern product development, more and more solutions (ERP, CRM, CAD, CAM and more) share a common data source. So what is Master Data and why is it so critical not only to the success of your PLM implementation, but for those solutions in place across your entire global supply chain that rely on the same sets of data?
Managing the data used by a given business (data such as Supplier identifiers, Measurements, Product Types, Colours, Materials, Trims, Labels, Employee, locality and partner data) has always been a major challenge for any PLM implementation – ever since organisations first tried to share or integrate data across systems. Those crucial sources of data are what we now call Master Data.
Much of the data your business holds, whether you realise it or not, falls within the scope of Master Data. Examples of the kind of Master Data that might play a part in a PLM implementation include:
- Product data (item number, bill of materials, product codes)
- Materials (All types)
- Labour Operations (Standard Minute Values)
- POMs (Points of Measure)
- Size Ranges
- Size Categories
- Grading Increments
- Image Types
- Costing Data (Currency tables and exchange rates)
- Country Codes
- Duty Rates
- Roles & Employee data (employee role, names, placement in organisational structure)
- Partner data (Supplier name, address, contact details, classification)
Let’s take a step back and look at how many of today’s businesses develop products without the support of a modern PLM solution. Whether they are a retailer, brand, manufacturer, agent or supplier, the majority of businesses around the world currently develop their products using traditional, paper-based methods.
At best, some of them use what I like to call “Microsoft PDM” – i.e. a combination of PowerPoint, CAD and Word files during the design phase, supplemented by hundreds of Excel spreadsheets that support the technical specification process. In these situations, Excel serves as a bridge between purchasing solutions, ERP and testing; e-mail is heavily relied upon for collaboration; and costing and critical path mapping tend, again, to be done in Excel and with hundreds of phone calls!
These legacy systems and piles of Excel files have held what is, in fact, crucial Master Data for many years. Many such organisations have poorly implemented Data Governance processes for handling changes in this data over time, leading to inefficiency, inaccuracy and many mistakes. Under both the traditional and “Microsoft PDM” methods, data is almost always poorly integrated and of low quality.
These traditional processes have been in place for many decades, but the lack of data governance means that little or no attention has been paid to how the data they rely on is handled, and this is the culture into which many PLM implementations have to fit. The foundations are far from ideal.
These traditional development methods have evolved over time to a state where each process owner decides how to enter data based on their own experience and personal preferences. For example: some process owners will have used capital letters throughout for style or supplier names, while others will have capitalised only the first letter of their style names.
And this is just for a single core system! On top of this, many different supporting solutions will have been acquired, developed and deployed across the business over the years, and in many cases these systems have their own rules dictating how data can be entered. Some will force the capitalisation of the initial style-name letter; others will not be case-sensitive at all; some will limit users to a set field length (e.g. ten characters for a style name, even while other systems within the same organisation support fields of 13 or 25+ characters). The result is potentially three or more ways of entering the same data within the same business.
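To make this concrete, here is a minimal sketch in Python of how three systems’ versions of the same style name can be reduced to one canonical form. The `normalise_style_name` helper and the sample style names are hypothetical, and the ten-character limit is simply the shortest field length from the example above:

```python
def normalise_style_name(raw: str, max_len: int = 10) -> str:
    """Normalise a style name to one canonical form: collapse
    whitespace, upper-case, and truncate to the shortest field
    length any downstream system supports (assumed 10 here)."""
    cleaned = " ".join(raw.split()).upper()
    return cleaned[:max_len]

# The same style as three different systems might hold it:
variants = ["classic tee 2024", "Classic Tee 2024", "CLASSIC TEE 2024 LONG"]
canonical = {normalise_style_name(v) for v in variants}
# All three collapse to a single canonical entry.
```

The important point is not the code but the decision behind it: the business has to agree one canonical rule (case, length, spacing) before any system-by-system clean-up can begin.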
Now, if we multiply this confusion by the hundreds of field types in place in a typical system, and by the number of different systems in a given organisation, I’m sure you can begin to see the enormity of the challenge: bringing a modern PLM solution (which relies on standardised, unified data) into that situation and ensuring that it can still deliver bottom-line efficiency savings to the business.
Master Data, which in its purest form is the establishment of one centralised, unified set of data from which all enterprise systems can draw, is for me as critical as any part of a PLM project – on the same level as functional scoping or detailed design. In most circumstances, though, Master Data is not simply a data-gathering and cleansing exercise; it should be treated as a critical methodology and process that takes place either before or in parallel with PLM process design and implementation.
If we use the analogy that PLM is the vehicle that enables organisations to reach a state of streamlined, modern product development, Master Data should be viewed as the high-octane fuel for that vehicle. There is no doubt that PLM (properly chosen and implemented) can deliver significant value to any business, but this value is limited where a business simply implements the system and leaves it to the end users (each with their own methods of data input) to organise and enter the data that it relies upon.
Unless the business treats Master Data as carefully as the initial design of the PLM project, the data loaded into the PLM solution is often very poor and out of sync with the rest of the business’s solutions. Often the data has not been carefully organised or cleansed, and it will most likely be full of duplications: the same data entered in capitals and in lower case, poorly organised coding of data types, and unnecessary data that adds to the complexity of sorting.
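A first pass at spotting such duplications can be automated. The sketch below uses a hypothetical `find_case_duplicates` helper and invented supplier names; it groups entries that differ only in case or spacing, which is exactly the kind of duplication described above:

```python
from collections import defaultdict

def find_case_duplicates(values):
    """Group raw entries that differ only in case or whitespace,
    a common form of duplication in legacy Master Data."""
    groups = defaultdict(set)
    for v in values:
        # casefold() is a more aggressive lower() suited to matching
        groups[" ".join(v.split()).casefold()].add(v)
    return {key: vs for key, vs in groups.items() if len(vs) > 1}

suppliers = ["ACME TEXTILES", "Acme Textiles", "acme  textiles", "Beta Trims"]
dups = find_case_duplicates(suppliers)
# "Beta Trims" has only one form, so it is not reported.
```

A script like this only surfaces the candidates; deciding which form is the master record is still a business decision, not a technical one.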
Unfortunately this scenario is common: the data enters the system in an ad-hoc and disorganised fashion, compounding the difficulties inherent in the “Microsoft PDM” approach, and this is the reason behind the long delays in those businesses deriving the expected benefits from their solution.
The Master Data approach should apply to all of your business-critical product development data: POMs; Size Ranges; Sample Types; Materials; Product Types; Roles; Employee data; Design & Development Locations; Purchasing Offices; Colours; Labels; Packaging; Trims; Costing Data; and field lengths and types. These should all be carefully examined, especially where their data is used in other supporting solutions such as the Adobe suite, ERP, CRM, tracking and sourcing solutions.
It is vital that data governance rules and a business-wide culture change are considered as early as possible in any implementation process, to ensure that, where PLM or other enterprise-level systems are adopted across a business, the data they rely on is accurate and consistent.
This ideology should also extend outside the company. As I explained at the beginning of this blog, in the world of modern product development an increasing amount of Master Data is being shared between businesses and their partners. Similarly, data that might not be thought of as Master Data internally (such as style codes, product codes, country codes, colour codes, material codes and foreign exchange rates, to name only a few) can comprise the core of your data-sharing relationship with your partners.
I cannot emphasise enough that transitioning from traditional methods to the Master Data ideology requires serious education (both internally and across your extended supply chain), supported by a methodology, processes and tools that aid in the creation and unification of Apparel PLM Master Data.
Given the scale of the data gathering and unification work, the development of uploading tools (managed by the PLM vendors) emerges as a key driver of speed and value. As with the core functionality of the PLM solution itself, the aim of establishing reliable, sustainable, accurate and secure Master Data is to help businesses achieve real value from their investments, and vendors should ease this process wherever possible.
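As a sketch of what such uploading tools might check before data ever reaches the PLM system, the example below validates records against simple field rules. The field names, length limits and `validate_row` helper are illustrative assumptions, not any vendor’s actual schema:

```python
def validate_row(row: dict, rules: dict) -> list:
    """Return a list of problems found in one record before upload.
    `rules` maps each required field name to its maximum length."""
    problems = []
    for field, max_len in rules.items():
        value = row.get(field, "").strip()
        if not value:
            problems.append(f"{field}: missing")
        elif len(value) > max_len:
            problems.append(f"{field}: exceeds {max_len} characters")
    return problems

# Illustrative limits, echoing the field-length mismatches described earlier:
rules = {"style_name": 10, "supplier_code": 8}
row = {"style_name": "CLASSIC TEE 2024", "supplier_code": ""}
issues = validate_row(row, rules)
```

Catching problems like these at upload time, rather than discovering them months later inside the live PLM system, is precisely where vendor tooling can ease the transition.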
For those businesses considering PLM, though, it’s time to clean up and unify your Master Data if you want to derive the optimum benefit from your PLM implementation!
Mark Harrop is a leading Apparel PLM expert with more than 37 years’ experience in the industry. He is the Managing Director of The Product Development Partnership and a recognised apparel industry thought leader.