Today, E-Spec shares another exclusive article with WhichPLM. In this instalment, President Dan Hudson explores the transition of PDM to PLM and the change in working this brought; he questions whether cloud is the answer, and offers ideas on how businesses can BYOPLM (Build Your Own PLM).
If you make or sell products, you already have a product lifecycle management system. It may be completely manual, it may be spreadsheets and e-mails, it may be a combination of off-the-shelf software packages; but it is ‘a system’. You take your products and move them through their lifecycle. PLM software may make your PLM system more effective – if it is implemented properly. If it is implemented in such a way that users work around the software rather than embrace it, it might not achieve the results or the ROI you were expecting.
PLM vendors talk about their software reaching across the entire lifecycle of a product – from concept to manufacture – but there are additional aspects of the product lifecycle that are typically not included. CAD, marketing, sourcing, sales, e-commerce and financial departments all typically have their own software systems. Even when the PLM vendor supplies functionality in these areas, many companies retain their previous solutions (like ERP systems). All of this means that PLM software does not include the entire product lifecycle; you are still managing your products across and between multiple systems. Integrations are used, manual re-entry of data is still required; lots of images, PDFs, spreadsheets and documents are still in use.
I have been involved in the Apparel/Consumer Goods PLM industry since its inception – back when it was called PDM. In the beginning we were trying to convince users to put their information into a computer. They were carrying around folders of paper: handwritten notes, printouts from Word and Excel, hand-drawn sketches, fabric swatches, etc. Our pitch was simple: only the person with the folder could actually do any work. This made the workflow single-threaded, but by storing the information electronically multiple people could access the data and perform tasks in parallel. At the time, networks and e-mail were not standardized so, in reality, the data was only shared within the department.
By implementing a PDM system, the data could be shared across departments (partially because the PDM system required a standardized network). We began pitching the idea of keeping all of the information in the system, allowing multiple users to access a “single source of truth”. This might have worked if every department used the system, but at this time each department already had their own systems in place (some on mainframes or AS400s). The PDM system had to be integrated with these other systems in some way – the most common way was the exchange of files, mostly via FTP.
People were creating files and uploading them into the PDM system, files from the PDM system were transferred to the ERP system, then to a sourcing system, then to a logistics system and finally to a financial system. This is when PLM came along and said “let’s put all this data in one big database that the entire enterprise can access”. This was possible because the technology existed to support the required communication. The internet had replaced the FTP file transferring sites, browsers could now access the systems and databases, and remote access was possible. What we found, though, was that users would still create files before entering the data into the system; users would still create PDFs and send them via e-mail. Other systems still existed (PLM had not become the entire enterprise as proposed by the vendors) and files were still used between these systems.
PLM software has of course evolved over the last two decades. As technology has advanced, so has the PLM software running on it. From WYSIWYG forms to browser-based user interfaces, from proprietary databases to relational databases, from LAN to WAN and now the “cloud”, these advances have allowed PLM to grow but have also made it more complicated to implement and maintain.
Is cloud the answer?
Now it seems as though the “cloud” is the answer: all our data will be in the cloud so we can all access the “single source of truth”. But all of the systems are either not in the cloud or not in the same cloud, resulting in files still being downloaded and transferred. Talk has already shifted to “hybrid” implementations – combining the cloud with network systems. Why? Mainly because files are still the common denominator; images, documents, spreadsheets, PDFs and JPGs are still what drive the business processes.
Technology has made another change in this discussion. Our argument against file-based systems was that with multiple people working on different copies of the files, no one could be sure they had the latest information. Google, Dropbox, Box and other file syncing technologies enabled by the advent of the “cloud” destroy this argument. It is now easy to work together on the same file.
While the management team (and the PLM vendor) will tout the success of these systems, most users know the truth – users are doing their jobs in spite of these systems. They perform tasks outside of the system and backfill the required data in the PLM system after the fact. It is not that the PLM software doesn’t provide benefits, it does – it just isn’t the end-to-end “single source of truth” Holy Grail solution.
What’s old is new again
The original Apparel PDM system (created by Microdynamics, purchased by Gerber) is now referred to as ‘Classic PDM’. Classic PDM was a file-based system. Forms (or templates) were created for each “page”, and these forms all had the same header. These header fields were the metadata. Each page was stored as a file on the network. A database associated files into folders based on the metadata, and no files were contained in the database. The folders (groups of files) represented a single style. This group of associated files generated PDFs that functioned as “tech packs”. This approach can easily be replicated with today’s technology. If your files all have the same metadata, a database (DAM system) can provide the relationship between the files. One measurement file or BOM file can be linked to multiple styles (folders). If PDFs are used as the standard, Adobe provides PDF Binders to collect multiple PDFs to create the “folders” for the users. The technology now supports the use of a file-based system. What is old is new again.
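The Classic PDM model described above – files carrying a shared metadata header, with a database associating them into per-style “folders” – can be sketched in a few lines. This is a minimal illustration, not Classic PDM’s actual schema; the file names, style numbers and metadata fields are all hypothetical.

```python
# Sketch of a Classic-PDM-style catalog: every file carries the same
# metadata header (here, the style numbers it belongs to), and a small
# index groups the files into virtual "folders" per style. No file
# lives inside the database -- only the associations do.
from collections import defaultdict

def build_style_index(assets):
    """Map each style number to the files tagged with it.
    One file (e.g. a shared BOM or measurement page) may appear
    under several styles -- the many-to-many link noted above."""
    index = defaultdict(list)
    for filename, meta in assets.items():
        for style in meta["styles"]:
            index[style].append(filename)
    return dict(index)

# Hypothetical assets: the measurement and BOM pages are shared
# by two styles, so they appear in both "folders".
assets = {
    "sketch_1001.pdf":   {"styles": ["ST-1001"]},
    "measurements.pdf":  {"styles": ["ST-1001", "ST-1002"]},
    "bom_basic_tee.pdf": {"styles": ["ST-1001", "ST-1002"]},
}

index = build_style_index(assets)
print(index["ST-1002"])  # the virtual "folder" for style ST-1002
```

Collecting the files in one style’s “folder” and binding their PDFs together is then all that is needed to regenerate the tech pack.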
Whether you have PLM software or not, there are tons of files floating around in your IT infrastructure. These files are digital assets and need to be managed. Rather than blame the PLM software for not being a panacea, I am suggesting we embrace the idea that these “external” files will continue to be part of the lifecycle process, and help users manage their files (digital assets) so that they work with the PLM software and other systems. Bring these workarounds formally into the workflow. The solution is Digital Asset Management (DAM) and metadata.
B.Y.O.PLM (Build Your Own PLM)
What if you don’t have PLM software? Well, you still have all of these files being used to create your product workflow. By applying structured metadata and using a DAM, you can formalize your processes into a more structured product workflow. You can use your files and a DAM system to create a “Poor Man’s PLM”.
Ways companies can B.Y.O.PLM
Save Illustrator files as PDFs; save Excel measurement charts, Excel BOMs and Word files as PDFs; use Google docs to allow users to collaborate on the specs; tag the PDFs with metadata and catalog the files with the DAM. Users can now search and find the images, measurements, BOM and details/notes they need for a particular style or project. From the DAM they can place these files in a “dropbox” which will route and share the files to their destinations (overseas office or vendor).
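The tag-catalog-search-route loop described above can be sketched with plain files. This is an illustrative stand-in, not a real DAM: it uses JSON sidecar files as the metadata (a production DAM would read tags embedded in the assets themselves), and all paths, field names and the “dropbox” folder are hypothetical.

```python
# Sketch of the B.Y.O.PLM loop: tag a PDF with metadata, search the
# catalog by metadata, and route the hits to a shared "dropbox" folder.
import json
import shutil
from pathlib import Path

def tag(pdf_path, **metadata):
    """Write a .json sidecar next to the PDF holding its metadata."""
    Path(str(pdf_path) + ".json").write_text(json.dumps(metadata))

def search(folder, **query):
    """Return PDFs whose sidecar matches every key/value in the query."""
    hits = []
    for sidecar in Path(folder).glob("*.pdf.json"):
        meta = json.loads(sidecar.read_text())
        if all(meta.get(k) == v for k, v in query.items()):
            hits.append(sidecar.with_suffix(""))  # strip the ".json"
    return hits

def route(files, dropbox):
    """Copy the matched files into a shared folder for the
    overseas office or vendor to pick up."""
    Path(dropbox).mkdir(parents=True, exist_ok=True)
    for f in files:
        shutil.copy(f, dropbox)
```

A user would call `tag(path, style="ST-1001", season="FW25")` when saving a spec as PDF, then `route(search(folder, style="ST-1001"), dropbox)` to gather and share everything for that style.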
Use InDesign to pull images and data from the DAM system to create working line sheets, collections and sales pitch sheets. Add a workflow application (either included with the DAM or supplied by another vendor) and you have your approval processes. Most common PLM features can be handled using a file-based approach.
With or without PLM software, your company has a growing issue with files (assets) that continue to multiply at an accelerating rate. This can get out of hand and create communication problems and costly errors in the lifecycle. It can be solved effectively and inexpensively by implementing metadata. Adobe provides the industry standard, XMP. Use XMP metadata to embrace user-generated files.
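To make the XMP idea concrete, here is a minimal sketch that builds an XMP-style sidecar packet with standard Dublin Core fields using only the Python standard library. It is deliberately simplified: a full XMP packet adds the `x:xmpmeta` wrapper and `xpacket` processing instructions, and a production workflow would embed the packet in the asset itself (for example with Adobe’s XMP Toolkit); the title and keywords below are hypothetical.

```python
# Build a minimal XMP-style RDF fragment: dc:title for the asset name
# and dc:subject (an rdf:Bag of keywords) for the searchable tags a
# DAM system would index.
import xml.etree.ElementTree as ET

RDF = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
DC = "http://purl.org/dc/elements/1.1/"

def xmp_fragment(title, keywords):
    ET.register_namespace("rdf", RDF)
    ET.register_namespace("dc", DC)
    rdf = ET.Element(f"{{{RDF}}}RDF")
    desc = ET.SubElement(rdf, f"{{{RDF}}}Description")
    ET.SubElement(desc, f"{{{DC}}}title").text = title
    # dc:subject holds the keyword "bag" -- style number, season, type.
    bag = ET.SubElement(ET.SubElement(desc, f"{{{DC}}}subject"),
                        f"{{{RDF}}}Bag")
    for kw in keywords:
        ET.SubElement(bag, f"{{{RDF}}}li").text = kw
    return ET.tostring(rdf, encoding="unicode")

packet = xmp_fragment("Basic Tee Tech Pack",
                      ["ST-1001", "FW25", "tech pack"])
```

Because the keywords travel with the file, any XMP-aware tool – Bridge, a DAM, or a script like the search sketch above – can find the asset without consulting a central system.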