
Teaching PLM to Become Intelligent


Here, WhichPLM’s Founder & CEO, Mark Harrop, explores the changing world of Artificial Intelligence as it relates to PLM – and, more specifically, how PLM can harness AI going forward.

It’s hard to imagine that PLM has been around in the fashion and textiles sector for the last 20 years. [WhichPLM actually played a big part in its origins, during the initial blueprint and design phase in 1999.] During these last two decades, we’ve witnessed the growth of RFA PLM, with multiple vendors competing on new, advanced modules and processes that have helped to automate the journey of bringing a product from concept to consumer.

During the same period, these vendors have also developed broadly similar modules to one another (at least on the surface), ultimately culminating in the Tech-Pack: product data management (PDM) encompassing graphical images that support trends, materials, colourways, silhouettes, pattern shapes, sizing data, points of measure (POMs), how-to-measure guides, size charts and grading increments, bills of materials (BOM), costing management, sourcing and supplier information, corporate social responsibility (CSR) and more. Beyond basic Tech-Pack functionality, vendors moved toward the front end of design (creative development and merchandise planning) and are now moving upstream into manufacturing, using the IoT (Internet of Things) and Industry 4.0 to make the connections needed to join the digital value-chain.

In our 2020 PLM Buyer’s Guide, I suggested that we had arrived at a new era in which PLM will be driven by Artificial Intelligence. Today, AI and ML (machine learning) are often used interchangeably, so for the purposes of this article I’ll use the umbrella term AI for both.

Since introducing PLM to our industry, both vendors and the user community have gone on to develop a large number of libraries within each PLM solution, containing enormous amounts of what we call ‘Master Datasets’, primarily made up of text and numbers. Like most things in life, concepts develop and become more complex, and nowadays there are many different multimedia data types used within PLM, such as audio (used when taking sample measurements to autofill size charts) and the 2D and 3D graphics and videos often used in “how-to-measure” guides. Ultimately, all of these data types are stored as binary digits, “0” (zero) and “1” (one), and for each data type there are very specific techniques to convert between the binary language of computers and how we interpret data with our senses, such as sight and sound. Designers, developers and manufacturers (in fact, everyone involved in the PLM value-chain) enter data of some form into a PLM solution every minute of every day to create and develop PLM Tech-Packs, each following very similar methods of inputting and outputting this data. As a former work-study engineer, I tend to look for patterns in the way we perform our daily tasks, and PLM is no exception: it involves many repetitive tasks that can be automated by AI to simplify the process of creating, sharing, informing, finding and even suggesting your every next step.
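To make that point concrete, here is a minimal sketch (the POM label and value are invented for illustration) of how a single piece of Tech-Pack data moves between its human-readable form and the binary form in which a computer ultimately stores everything:

```python
import struct

# A point-of-measure (POM) as a designer enters it: text plus a number.
pom_label = "Chest width, 1cm below armhole"
pom_value_cm = 52.5

# Text is encoded to bytes (here UTF-8); numbers to a fixed binary layout.
label_bytes = pom_label.encode("utf-8")
value_bytes = struct.pack("<d", pom_value_cm)  # 8-byte little-endian float

# The same techniques run in reverse to turn stored bits back into
# data we can read on screen.
assert label_bytes.decode("utf-8") == pom_label
assert struct.unpack("<d", value_bytes)[0] == pom_value_cm
```

Every data type in a PLM library (text, numbers, images, audio, video) relies on conversion rules like these, just at far greater scale and complexity.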

AI is nothing new; it’s been around for the last 60+ years. In its simplest form, it’s the way that computer scientists and developers use algorithms (step-by-step, repeatable procedures for solving logical and mathematical problems) to train computers to undertake human tasks, especially those that are repetitive. As I say, AI is not new, and we’ve been using it for decades: anyone who has used automatic marker making since it first came to market in the 1980s has been using AI. When you navigate the internet on your computer, you’re almost certainly using AI to locate your answers; if you ask Google to play your favourite music track, or use maps to get from point A to point B, it’s all being powered by AI.

What’s also helping to propel AI in 2020 is the continued advance of GPUs (graphics processing units), which are accelerating general-purpose scientific and engineering computing. Over the next decade we can expect GPUs to become even more powerful, augmented by an increasingly diverse range of dedicated accelerators delivering dramatic performance gains that will empower AI to amazing levels. Added to this, we need to keep in mind that these same GPUs will be available at the edge, meaning that our devices (cameras, for example) will compute from within, without data having to travel across the network to the servers and back to deliver answers; they will be able to make decisions on the fly (at the edge!) and will benefit further from 5G when transmitting back to the servers. In other words, processing time will be reduced dramatically.

The process of taking a design concept to a sample prototype and on to production has, in the main, been a manual process delivered by fashion ‘artisans’. Going forward, we can expect a dramatic shift in how the entire end-to-end workflow is automated and digitised, with AI enabling better specialisation by automating the low-value tasks while making the PLM process much more efficient. PLM vendors need to harness the power that AI can bring to fashion: they need to re-examine each of their modules, processes and datasets and put the right system design in place, using data scientists to focus on productive experimentation and model prototyping. PLM won’t be able to deliver the total answer on its own, so data engineers and machine-learning operations teams will need to focus on delivering a streamlined data-integration pipeline, along the lines of an API, but this time focused on shared data, gathering the insights and intelligence coming from third-party applications that, operating together, can support the end-to-end design and development workflow.

It’s taken PLM 20 years to reach maturity levels and best practices that are, today, broadly similar across vendors (in my opinion). So, just as with PLM, we will need to begin cautiously with AI and focus on automating the low-value, repetitive tasks I’ve alluded to. Further down the chain, retailers and brands will use AI to forecast demand curves, as well as to drive materials orders and maximise production throughput, eventually reshaping the logistics linked to the connected smart factories of the future.

While it might take several years or more for AI to be fully integrated into fashion’s end-to-end value-chain, the good news is that many PLM vendors have already started to scope, and even take their first steps into, understanding the bigger picture of AI’s potential to improve our industry.

So, expect to see vendors coming to market with new visual search engines that use real-world fashion images (storyboards, Illustrator sketches, internet images, or trend photographs) as the stimuli for searches within the PLM database or supporting solutions. These visual search technologies use AI to understand the content and context of an image, returning a list of related results. It’s also worth noting that the algorithms can, and will, be refined on an ongoing basis to increase the speed and accuracy of each search.
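A minimal sketch of the idea behind such a visual search, assuming a neural network has already reduced each image to an embedding vector (the three-dimensional vectors, style names and query below are all invented for illustration): searching then becomes a matter of ranking stored styles by similarity to the query image.

```python
import math

def cosine_similarity(a, b):
    # Standard cosine similarity: 1.0 means identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical index: in a real system, a trained model would map each
# stored PLM style image to a high-dimensional vector like these.
plm_image_index = {
    "STYLE-001 denim jacket": [0.9, 0.1, 0.3],
    "STYLE-002 silk blouse":  [0.1, 0.8, 0.2],
    "STYLE-003 denim jeans":  [0.8, 0.2, 0.4],
}

def visual_search(query_vector, index, top_k=2):
    # Rank stored styles by how close their embedding is to the query's.
    ranked = sorted(index.items(),
                    key=lambda item: cosine_similarity(query_vector, item[1]),
                    reverse=True)
    return [name for name, _ in ranked[:top_k]]

# A trend photograph of denim would embed close to the denim styles.
print(visual_search([0.85, 0.15, 0.35], plm_image_index))
```

Refining the model that produces the embeddings is exactly the ongoing improvement in speed and accuracy described above: better vectors mean better-ranked results without changing the search itself.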

We can also expect to see AI used to automate the creation of new Tech-Packs or template styles based upon common data (e.g. product name, gender, date/year, block pattern, main body materials, colours), enabling strings of data to come together to drive development and even the workflow process.
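As a hypothetical sketch of that kind of automation (the field names, style-code format and workflow steps below are assumptions for illustration, not any vendor’s actual schema), the common data can seed a Tech-Pack skeleton and its workflow:

```python
def create_tech_pack(product_name, gender, year, block_pattern,
                     main_material, colours):
    # Derive a style code from the common fields, then build the
    # skeleton sections a Tech-Pack template would start from.
    style_code = f"{gender[:1].upper()}{year}-{block_pattern.upper()}"
    return {
        "style_code": style_code,
        "product_name": product_name,
        "bill_of_materials": [
            {"placement": "main body", "material": main_material},
        ],
        "colourways": [{"colour": c} for c in colours],
        "workflow": ["design", "prototype", "fit approval", "production"],
    }

pack = create_tech_pack("Denim jacket", "Womens", 2020, "jkt01",
                        "12oz indigo denim", ["indigo", "black"])
print(pack["style_code"])  # → W2020-JKT01
```

The value here is not the template itself but the chaining: each string of common data automatically populates the next section, so the operator starts from a pre-filled pack rather than a blank one.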

Back in the late ‘90s we attempted to use speech recognition software; unfortunately it didn’t work, for a variety of reasons. But we’ve come a long way since those days, and speech recognition is very accurate today, so why not harness its power to help operators become more efficient? The aim of AI is not to replace humans, but to help them become more efficient and productive.
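As a sketch of one such use, assume a speech-recognition engine has already transcribed an operator reading sample measurements aloud; a small parser (the transcript format and POM names here are invented for illustration) can then autofill the size chart without the operator touching a keyboard:

```python
import re

def transcript_to_size_chart(transcript):
    # Pull "measurement-name number" pairs out of the transcript,
    # e.g. "chest 52.5" -> {"chest": 52.5}. Values assumed to be in cm.
    chart = {}
    for pom, value in re.findall(r"([a-z ]+?)\s+([\d.]+)", transcript.lower()):
        chart[pom.strip()] = float(value)
    return chart

chart = transcript_to_size_chart("Chest 52.5 waist 44 sleeve length 61")
print(chart)  # → {'chest': 52.5, 'waist': 44.0, 'sleeve length': 61.0}
```

A production system would need validation against the POM library and a confirmation step, but the pattern shows how a repetitive data-entry task becomes a hands-free one.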

We already know that AI has truly arrived in the retail and brand e-commerce world; it’s now time to connect AI into the design, development and manufacturing world.

Mark Harrop is the founder and Managing Director of WhichPLM. During a career that has spanned more than four decades, Mark has worked tirelessly to further the cause of PLM, providing the unbiased, expert advice that has enabled some of the world’s best-known retailers, brands and manufacturers to achieve efficiency savings across their entire supply chains through informed technology investments.