Regular readers will have noticed our new Verified and Non-Verified stamps alongside the listings for different suppliers, but they may not realise the exhaustive work that goes into the benchmarking process behind them. These stamps are intended to separate those suppliers whose marketing claims have been independently verified by WhichPLM from those who have opted not to submit their solution for verification. While we make every effort to include as many suppliers as possible in our communications and in the results of our Comparison Engine, the suppliers themselves must approach us to request an independent benchmark. Given the exacting detail and steadfast honesty that our benchmarking process involves, it is not a decision to be taken lightly, or one to make before a solution is truly ready.
In order for a solution to achieve WhichPLM verification, it must undergo a rigorous evaluation consisting of an 860-point questionnaire and a three-day “face to face” examination of every aspect of its functionality. This process is designed to mirror the buying experience of a prospective PLM customer, meaning that our readers can, at a glance, be certain that a supplier’s claims are substantiated by the actual solution and that it will meet their needs. Vendors who have approached WhichPLM and allowed their results to be published have every faith in their solution, and we would always recommend that companies looking for a PLM solution begin their search with those that have been impartially assessed rather than trusting promotions to tell the truth.
If a solution has not undergone the WhichPLM benchmarking process, a Non-Verified stamp will tell you so. The vendors of these solutions may have provided the data that allows our Comparison Engine to compile its results (and we do attempt to check this data against our own industry insight), but that data forms only a fraction of the information collected during a full evaluation, and has not been subjected to our stringent verification criteria. We would always advise customers to conduct their own research into the capabilities of these solutions, as their functionality may not stand up to their marketing materials.
Once a supplier has submitted their solution for the benchmarking process, they are sent a questionnaire consisting of 860 questions prior to a three-day “face to face” meeting in which our Managing Director and Director cross-examine the supplier’s Retail, Footwear and Apparel (RFA) team, using the completed pre-benchmark questionnaire as a starting point. This process is intended to act as a realistic simulation of a prospective client scenario – one in which a potential customer would invite a PLM supplier to tender for their project, would provide sufficient time to compile a comprehensive response, but would still have a definite cut-off date.
The questionnaire itself is broken down into several categories, including Commercial, Functional, Technical, Training, Support, and Ethical and Social Compliance. The answers provided in these key areas of examination form the basis for the three-day site visit, which is conducted at either the supplier’s place of business or an agreed location.
The supplier’s team is asked to demonstrate their answers to each of the listed questions using the latest released version of their software, which our team of experts evaluates in every last detail, scoring it according to an established method that recreates the buying experience of a prospective customer. This way, the solution is assessed on its “true” ability to deliver on the full range of its advertised capabilities.
Scores range from 0 to 5, with 5 being the highest. Each point corresponds to a 20% increment: a score of 0 equals 0%, a score of 1 equals 20%, and so on up to a score of 5, which equals a perfect 100%.
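For readers who like to see the arithmetic spelled out, the score-to-percentage mapping can be sketched in a few lines of Python. The function name and the use of a simple average to aggregate per-question results are illustrative assumptions for this sketch, not WhichPLM’s actual tooling:

```python
def score_to_percentage(score: int) -> int:
    """Map a 0-5 benchmark score to a percentage in 20% increments."""
    if not 0 <= score <= 5:
        raise ValueError("score must be between 0 and 5")
    return score * 20

# Hypothetical per-question scores, aggregated with a plain average.
scores = [5, 4, 3, 5]
overall = sum(score_to_percentage(s) for s in scores) / len(scores)
print(overall)  # 85.0
```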
Both of our examiners mark each of the answers provided separately to ensure fairness and accuracy. After careful examination of both sets of data, the overall results are submitted to the supplier, together with supporting comments, and are then signed off by both parties before being released onto the WhichPLM site where they take the form of both a high-level statistical overview (as seen in the example below) and a longer-form written evaluation.
All things considered, the average benchmark takes ten days, comprising off-site documentation sharing, question-and-answer sessions, and in-person evaluation scoring meetings. Any benchmark represents a considerable investment of time on both our part and the supplier’s, so our readers can be confident that any solution bearing the WhichPLM Verified stamp has truly been put through its paces.
If you would like to learn more about the benchmarking process or to request that the WhichPLM team undertake an evaluation of your solution, please contact us.