Peter Bambridge is an Independent Industry Analyst and Consultant, who has worked for 30 years creating, selling and implementing software and services solutions for the retail and consumer goods industry; he is also one of WhichPLM’s resident authors. Peter explores here the need for scalability in PLM.
Across the retail, footwear and apparel industry, PLM solutions have at times struggled to meet the scalability needs of growing customer bases. This has on occasion resulted in a number of undesirable consequences:
- Users waiting unacceptable amounts of time for system response.
- Companies being unable to add more users to their PLM solution.
- Requests timing out without ever delivering the required information.
The modern user experience demands not only responsive performance and efficient interaction, but also the flexibility to easily change the scope of the application and the size of the user community. Scalability is therefore becoming increasingly important.
When PDM solutions were focused on a handful of processes and users operating in local workgroups, scalability was not a major concern. Nowadays, with global PLM implementations covering a broad range of processes and large communities of supply chain users in addition to hundreds of internal users, some implementations exceed 10,000 users.
Solution vendors need to be aware that prospects are looking more closely at scalability, and it is becoming a critical factor in the selection process. This is becoming a real differentiator between solutions, especially at the enterprise level.
Prospects considering potential PLM solutions need to probe beyond the vendor’s slick sales PowerPoint showing company logos, and understand which customers similar to their own business have actually gone live with full-scale roll-outs and are available as references.
Successful PLM solutions typically expand their internal and external user base over time, as more people see the benefits of a well implemented PLM solution. The potential increased ROI available through a wider user community must not be compromised by solution scalability concerns.
“Scalability” is defined in Wikipedia as the ability of a system, network or process to handle a growing amount of work in a capable manner, or its ability to be enlarged to accommodate that growth.
In the context of PLM for retail, footwear and apparel (RFA), it comes down to how the system performs as processes and users are added. A scalable system would be expected to add capacity proportionally as extra hardware and bandwidth are added.
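As a rough illustration of why capacity rarely grows perfectly in proportion to added hardware or users, one widely used model (not taken from this article) is Gunther's Universal Scalability Law, which discounts ideal linear scaling by contention and coherency penalties. The coefficient values below are illustrative assumptions only:

```python
# Illustrative sketch: modelling how relative throughput scales as
# nodes/users are added. A perfectly "proportional" system has zero
# contention and zero coherency cost; real systems fall short.

def usl_throughput(n, alpha, beta, base_rate=1.0):
    """Universal Scalability Law: relative throughput at n nodes/users.

    alpha -- contention (serialisation) penalty, 0..1
    beta  -- coherency (crosstalk) penalty, 0..1
    """
    return base_rate * n / (1 + alpha * (n - 1) + beta * n * (n - 1))

# An ideal system (alpha = beta = 0) delivers 16x throughput on 16
# nodes; with even modest contention, the gain is roughly halved.
ideal = usl_throughput(16, 0.0, 0.0)       # 16.0 -> perfectly proportional
real = usl_throughput(16, 0.05, 0.001)     # ~8.04 -> contention eats the gain
print(ideal, round(real, 2))
```

The practical point for PLM selection is that doubling servers or bandwidth for a poorly architected solution may deliver far less than double the usable capacity.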
There are many potential causes of scalability problems, but they can be grouped as follows:
- Technical infrastructure limitations (such as network bandwidth, latency, clustering, load balancing)
- Storage capacity limitations (such as available disk space, memory)
- Data schema limitations (such as sub-optimal database design)
- Solution design limitations (such as concurrent access, mass editing)
- User interface limitations (such as multi-window updating)
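One concrete instance of the "solution design" limitation above is mass editing performed row by row rather than as a set-based operation. A minimal sketch using Python's built-in sqlite3 module, with an invented `styles` table for illustration:

```python
# Hypothetical schema: 1,000 style records carrying a season code.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE styles (id INTEGER PRIMARY KEY, season TEXT)")
conn.executemany("INSERT INTO styles (id, season) VALUES (?, ?)",
                 [(i, "SS14") for i in range(1000)])

# Anti-pattern: one statement per row -- round trips grow linearly
# with data volume, which is where scalability problems surface.
for (style_id,) in conn.execute("SELECT id FROM styles").fetchall():
    conn.execute("UPDATE styles SET season = ? WHERE id = ?",
                 ("AW14", style_id))

# Scalable alternative: a single set-based statement.
conn.execute("UPDATE styles SET season = ?", ("SS15",))
count = conn.execute(
    "SELECT COUNT(*) FROM styles WHERE season = 'SS15'").fetchone()[0]
print(count)  # 1000
```

At workgroup scale both approaches feel instant; at enterprise scale, the per-row version is the kind of design decision that surfaces as unacceptable response times.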
These causes typically stem from earlier product management and product development decisions, made as versions of the core solution evolved over the years. For example, some PLM solutions were created by adding browser-based workflow, collaboration and BI capabilities to older DOS- or Windows-based PDM solutions. If the opportunity was not taken to redesign the underlying data schema and application to take advantage of technology advancements and the evolved scope, the resulting solution is likely to be sub-optimal and to suffer performance issues, especially as the user base scales up.
The recent introduction of in-memory databases by ERP vendors and database providers may enable new depths of analytics in areas such as consumer sentiment, but it is not anticipated to have a significant impact on the performance and scalability of real-world PLM solutions beyond improvements in business intelligence and analytics.
In its extensive market analysis, the recently published WhichPLM Annual Report 2014 identified that the licensed users currently in use globally across the leading PLM vendors are weighted significantly toward internal rather than external users. However, recent sales show an increasing proportion of licensed software addressing the needs of external supply chain users. Taken together, this suggests that future roll-outs will increasingly bring supply chain trading partners into the PLM system, alongside the internal user community.
If not properly addressed, scalability issues will become a growing concern as wider communities of global users gain access to the PLM solution. Unfortunately, it is often the globally dispersed supply chain partners who have the least IT infrastructure and bandwidth available. As a result, external users at trading partners will be the most vulnerable to performance and scalability problems.
Scalability is increasingly listed as a key criterion, with a higher importance weighting, in PLM solution selection processes. This is a direct reaction to reports of recent customer experiences in which desired roll-outs could not proceed under such performance constraints.
As a consequence, there is increased focus on volume testing and large-scale user testing in pre-selection sandbox environments. Some vendors have started to seek third-party validation of their scalability (e.g. through working with companies such as AppLabs), in order to reassure prospects that current and future plans can be accommodated within performance expectations.
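A minimal sketch of the kind of pre-selection volume test described above: ramp up concurrent "users" and record latency percentiles. The `fake_request` function is purely hypothetical and stands in for a real PLM API call; an actual test would issue authenticated HTTP requests against the sandbox:

```python
# Simulated load test: N concurrent users each issuing several
# requests, reporting median and 95th-percentile response times.
import concurrent.futures
import random
import time

def fake_request():
    """Stand-in for a PLM API call (simulated 1-5 ms service time)."""
    start = time.perf_counter()
    time.sleep(random.uniform(0.001, 0.005))
    return time.perf_counter() - start

def percentile(samples, p):
    ordered = sorted(samples)
    k = max(0, min(len(ordered) - 1,
                   int(round(p / 100 * (len(ordered) - 1)))))
    return ordered[k]

def load_test(concurrent_users, requests_per_user):
    with concurrent.futures.ThreadPoolExecutor(
            max_workers=concurrent_users) as pool:
        futures = [pool.submit(fake_request)
                   for _ in range(concurrent_users * requests_per_user)]
        latencies = [f.result() for f in futures]
    return percentile(latencies, 50), percentile(latencies, 95)

p50, p95 = load_test(concurrent_users=20, requests_per_user=5)
print(f"p50={p50 * 1000:.1f}ms  p95={p95 * 1000:.1f}ms")
```

The useful discipline is not the tooling but the method: rerun the same scenario at 2x and 4x the user count and watch whether the 95th percentile degrades gracefully or collapses.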
Some vendors have understood this significant change and, as a result, are investing heavily in re-architected solutions that take advantage of next-generation technical platforms such as the cloud. Adopting a cloud-based model will generally help performance; however, there is no magical answer when hundreds of megabytes of AI (Adobe Illustrator), CAD and high-resolution image files are needed on the other side of the world. The management of the underlying databases and the global synchronisation of attachments still need careful design and configuration, and there will always be trade-offs between efficient design, technical feasibility and cost effectiveness. Furthermore, additional hardware may be sought to help manage the impact of storing and transferring native files of hundreds of megabytes through smart compression. Customers should ask their suppliers for evidence that they understand these challenges, and for details of what they can offer as part of their overall solution to the issue of sub-optimal performance.
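The trade-off discussed above can be made concrete with a back-of-the-envelope estimate: how long a multi-hundred-megabyte design file takes to reach a remote trading partner, with and without compression. All of the numbers below (file size, bandwidth, compression ratio, latency) are illustrative assumptions, not figures from the article:

```python
# Rough transfer-time estimate: compressed size over usable bandwidth,
# plus one round-trip of network latency.

def transfer_seconds(file_mb, bandwidth_mbps, compression_ratio=1.0,
                     latency_ms=0.0):
    """Seconds to move file_mb megabytes over bandwidth_mbps megabits/s,
    assuming the file shrinks by compression_ratio before transfer."""
    effective_mb = file_mb / compression_ratio
    return (effective_mb * 8) / bandwidth_mbps + latency_ms / 1000

# A hypothetical 300 MB CAD file over a 10 Mbps partner link:
raw = transfer_seconds(300, 10, latency_ms=250)
compressed = transfer_seconds(300, 10, compression_ratio=3, latency_ms=250)
print(round(raw), round(compressed))  # ~240 s raw vs ~80 s at 3:1 compression
```

Even a 3:1 compression ratio leaves the partner waiting over a minute, which is why regional caching and synchronisation of attachments matter as much as compression itself.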
While many RFA PLM solutions have reached a level of core capability at which they can claim feature parity in the core processes, few have delivered the next generation of scalable, complete solutions that the market requires and is now demanding.
Vendors who invest in next-generation solutions and certified third-party scalability validation will be well positioned to satisfy the needs of the growing number of PLM prospects. Those that do not will find it increasingly difficult to differentiate and to compete in an ever more competitive marketplace.