A full digital design and development workflow is one of fashion’s most sought-after goals in 2021, but the benefits of having engineering-grade 3D assets to use upstream and downstream remain out of reach for many brands and retailers. And as well as ensuring that future styles are born digital, luxury and heritage businesses have the added complication of figuring out how to digitise their back catalogues.
Swiss company SO REAL is taking a unique approach to solving both the forward-facing and the rearview-mirror problems of creating digital twins of footwear, garments, and accessories at scale, translating technologies from other sectors to fashion.
The Interline’s Mark Harrop spent some time talking to SO REAL CEO Ian Ravenshaw Bland about the why, the how, and the when of their vision to automate a process that’s currently at or near the top of many brands’ and retailers’ agendas.
Mark Harrop: SO REAL is one of a growing number of businesses that have road-tested and refined their technology in other industries before bringing their potential to bear in fashion. Can you tell us a little about SO REAL’s own history, and where you feel this trend comes from – why fashion is now looking beyond its own borders for new opportunities and innovations?
Ian Ravenshaw Bland: To understand the roots of crossover technologies, I think it’s important to recognise what fashion has in common with other industries, and where it’s unique. And the most obvious area where fashion shares the same challenges and opportunities as other industries is the need for change: that’s universal, and the speed at which every industry needs to evolve has obviously picked up dramatically as a result of the pandemic. The ways in which fashion needs to change differ from, say, the consumer products sector, but one thing is clear for everyone: things cannot remain the same in the face of a shifting world, and against a backdrop of rapid digitisation.
Personally, I’ve been interested – and invested – in helping to realise cross-industry change in what I see as the three most important areas: cleaning up the planet, making it smaller, and accelerating the pace of innovation to keep up with circumstances that are changing more quickly than almost anyone predicted. With SO REAL, I and my Co-Founder, Charles Flükiger, have built on more than two decades’ worth of cross-industry innovation, in sectors like food safety and airport security, to build a solution that tackles all three of those priorities, and we’re now ready to bring its proven results over to fashion.
Mark Harrop: Do you have any examples of where other industries have embraced new technologies in a way that fashion could emulate?
Ian Ravenshaw Bland: The long journey that X-Ray inspection, which is the foundation of SO REAL’s fashion solutions, has been on is a perfect example. Decades ago, everything was 2D, analogue, offline, and slow. Then the process went digital, then 3D, then inline. I’m sure this evolution is starting to sound familiar to fashion readers.
Today, when you check your bag at an airport, it’s scanned by a Computed Tomography (CT) scanner that can handle a throughput of 1,800 bags per hour. That’s a massive acceleration, but it wasn’t caused by the availability of the technology, but rather the necessity for change; security concerns and spiralling passenger volumes conspired to make rapid, high-resolution scanning a requirement, and the spark came when the technology was applied to the problem.
This is a very similar journey to one that almost every industry has undergone – from 2D to 3D – because, as I mentioned, there’s that common requirement for change that leads people to re-evaluate how technologies and processes that work in one industry can be translated to work in another.
It’s also a journey that fashion is on right now, moving from disconnected 2D design, development, and selling, to fully connected 3D workflows. And the benefits are clear: it adds a new dimension not just to the way garments and footwear are visualised, but to the way brands, retailers, and their suppliers think, plan, and design.
From our perspective, the demand for fashion to take the next step on that journey is clear, and the technology to make it possible already exists. The tools to do everything in 3D, to create a “mirror world” of digital twins of physical products, have already demonstrated maturity in other industries, and the way forward now lies in figuring out how to apply them to the unique challenges that fashion faces.
And those challenges are not going away without digitisation: sustainability, material waste, labour practices, and of course the decimation of physical retail and the destabilisation of supply chains wrought by COVID. Right now many industries are struggling for survival, but fashion retail has been particularly hard-hit, and where digitisation was previously seen as a far-off destination on the roadmap, it’s now something that has to happen as soon as possible. And that timeline can be brought forward if we in fashion look over our garden walls and embrace innovation from outside the industry.
Mark Harrop: Can you tell us more about how SO REAL’s technology works, how the service around it is structured, and how you’ve worked to translate those two things into specific use cases for the fashion industry?
Ian Ravenshaw Bland: For us, the key to unlocking that mirror world I mentioned is having the right input for converting the physical to digital. We use the same CT scanning technology that has been used in medicine for quite a while, except that where a patient passes through a ring in a hospital and X-Ray light passes through their body, we’ve reconfigured the same process for industrial purposes, to create 1:1 scans of physical objects with exacting external and internal detail. This is fundamentally different from other methods of scanning, which rely on optical sensors and photogrammetry.
The fine, critical detail is where the difference really shows. If there are holes in an object, like the eyelets of a shoe, those holes are present in the digital version of the object as well – not as surface textures, but as actual geometry. Optical scanning struggles with these fine details, as well as being confounded by reflective surfaces and transparent materials, and requiring more manual intervention to deliver acceptable visual results.
And another key difference is that X-Ray CT is the only tool that can scan and allow us to recreate the internal volume and inner components of an object, such as the toe space of a shoe. And when you combine that internal view with the level of fine detail we’re talking about, and apply some proprietary machine learning, you also gain the ability to “explode” or segment the shoe, so that a footwear manufacturer could drill down into just the toe cap of a steel-toe boot, for instance.
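To make the idea of segmentation concrete: purely as an illustrative sketch (this is an invented schema, not SO REAL’s actual data model), a segmented digital twin could be represented as a tree of named components, each carrying its own geometry reference and metadata, so that a user can “drill down” to a single part such as a steel toe cap:

```python
from dataclasses import dataclass, field

@dataclass
class Component:
    """One segmented part of a scanned object (illustrative schema only)."""
    name: str
    mesh_file: str                          # reference to the part's geometry
    metadata: dict = field(default_factory=dict)
    children: list = field(default_factory=list)

    def find(self, name):
        """Depth-first search for a named sub-component."""
        if self.name == name:
            return self
        for child in self.children:
            found = child.find(name)
            if found is not None:
                return found
        return None

# A boot segmented into parts, roughly as a CT scan might expose them
boot = Component("boot", "boot.obj", children=[
    Component("upper", "upper.obj", {"material": "leather"}),
    Component("toe_cap", "toe_cap.obj", {"material": "steel"}),
])

print(boot.find("toe_cap").metadata["material"])  # steel
```

The tree shape mirrors the “explode” interaction described above: selecting any node surfaces just that part’s geometry and metadata, without losing its place in the whole.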
We’ve already applied this technology to other industries, and we’ve developed both a B2B and B2C business around it. Brands send us their objects and we scan them, interpret them with artificial intelligence, and then deliver the digital twins back to them. And on the consumer side we’re applying the same principles to the huge demand for photorealistic 3D objects in other areas such as gaming and virtual events. This can unlock the ability for brands to sell digital versions of physical products, which is a revenue stream that a large number of brands are investigating right now, as the crossover between fashion and different media reaches new milestones on a regular basis.
Specifically for fashion, we’ve also identified real potential upstream, where having incredibly accurate digital twins of shoes or garments can be a significant accelerator for the relationship between brand and manufacturer – allowing companies and their suppliers to build and share digital libraries not just of entire products, but their segmented components. Those libraries can then be adjusted, interpreted, and used for rapid iteration and innovation without the need to develop physical samples.
What’s crucial, for me, is that designers and development teams can access the tools to experiment in this way without needing to learn to use complex, engineering-oriented 3D CAD environments. What we’re aiming to create is a simple tool for rapid creative iteration and collaboration, but one that doesn’t sacrifice technical accuracy, because its components have deeply scientific roots.
Mark Harrop: Digital product creation, sampling, and development is one of the fashion industry’s busiest investment areas today, with different technology vendors (and different end users) approaching it from different angles. Can you explain how the way SO REAL approaches digitisation differs from others?
Ian Ravenshaw Bland: For many people, digitisation starts with imagination, and then becomes a question of how to tie that imagination down to reality, with digital technology as the interpreting layer. In practice, this often leads to a familiar conflict between what can be dreamed and what can actually be made – the perennial see-saw between the creative and the commercial.
Because SO REAL starts with the physical, we approach digitisation from a place of fact. Working with a library of 3D components created through our process allows creative teams to experiment and innovate in a way that bridges design-oriented thinking with engineering detail. Which means that, from the design stage to the deliverable, everyone involved in the development of a garment or a piece of footwear or an accessory can collaborate with confidence that the elements they’re using are not just technically feasible, but technically accurate to an extremely fine degree.
And because we begin with the physical object, every complete style and every individual component digitised using SO REAL technology inherits a huge amount of extremely granular metadata – from the material to the stitching to potential patterns of wear. That metadata is what allows our digital twins to stand in for the real article everywhere from a bill of materials to a videogame-engine-powered eCommerce experience – with total consistency and constant accuracy, and all in real-time.
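As a rough sketch of what “standing in for the real article in a bill of materials” might mean in practice – with entirely invented field names, not SO REAL’s actual schema – the per-component metadata a twin carries could be flattened into one BOM row per part, dropping fields that were never captured:

```python
# Illustrative only: invented field names, not SO REAL's actual schema.
# Each segmented component of a digital twin carries granular metadata,
# which can be flattened into bill-of-materials rows for downstream use.
twin = {
    "style": "wingtip brogue",
    "components": [
        {"name": "upper",   "material": "calf leather", "stitching": "Goodyear welt"},
        {"name": "outsole", "material": "rubber",       "stitching": None},
    ],
}

def to_bom(twin):
    """One BOM row per component, omitting fields that were never captured."""
    return [
        {key: value for key, value in comp.items() if value is not None}
        for comp in twin["components"]
    ]

for row in to_bom(twin):
    print(row)
```

The point of the sketch is that the same metadata store can feed very different consumers – a BOM export here, a real-time rendering engine elsewhere – without re-digitising the product.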
Mark Harrop: As well as different technologies and approaches, we’re also talking here about different types of users than we typically might be with a more technical, engineering-oriented 3D solution. Can you explain your vision for the sort of digital twins SO REAL creates, and for the way fashion interacts with and creates 3D assets as a whole?
Ian Ravenshaw Bland: I believe we are creating future-proof 3D assets – capturing the essence of fashion in a way that’s infinitely reusable, and completely non-destructive. As we mentioned in the article that SO REAL and The Interline wrote together towards the end of last year, our process can even be used to bring pre-digital, archival products back to life with the same level of detail and segmentation as they would have if they had been born digital to begin with.
Whether it’s for archival purposes or for something more immediate, such as virtual photography, our vision is to free the fashion industry from having to invest and reinvest in 3D every time a new use case for 3D assets emerges. Take photoshoots as an example: why reshoot every time a lifestyle campaign demands a new backdrop, or when a regional regulatory change means that product pairings or colourways have to change? With genuinely reusable, technically accurate 3D assets, photoshoots can be composed and re-composed on the fly – and, crucially, nobody needs to learn complex engineering-focused CAD solutions to make that possible.
Or consider that a brand’s marketing team wants an exploded view of a new shoe – one that incorporates a particularly novel construction technique or material, but which wasn’t created in 3D to begin with. You could digitise it the traditional way, with compromises to detail and interior volume, or you could reuse a digital twin that incorporates all that information from the outset.
Mark Harrop: To tease that thread even further, your new Virtual Sampler is different to other 3D solutions in that it’s been developed and deployed in a game engine, Unity. That’s a strategy that other industries have already used to build responsive, real-time tools that allow a wider range of users than ever before to become part of digital product creation. Can you explain the thinking behind that, and where you see this convergence of fashion and gaming going from here?
Ian Ravenshaw Bland: This is another prime example of cross-industry innovation. We have a number of former game developers and videogame industry veterans on our team, and when we sat down to work out how to apply our technology and our proprietary process to fashion, they quickly hit on the realisation that games have been making complex objects and actions accessible and intuitive for non-technical people for decades. And this was precisely the same problem that fashion was facing, where 3D objects needed to hold value not just for engineers, but for consumers, designers, executives – essentially anyone who might touch on a product and want to see how it works or to make a confident decision based on how it was put together.
So the reasoning was that if a game like Tom Clancy’s Ghost Recon, or Call Of Duty, can make something as complex as a weapon tweakable, intelligible, and accessible to people who will probably never fire a gun in their lives, then fashion should be able to do the same for its products – and then use the results to create meaningful, engaging experiences for people from all walks of life, upstream and downstream. And from that point of view, videogames have been the gold standard for ongoing customer engagement, loyalty, interactive storytelling, real-time visuals, and much more. There’s a lot that every industry can learn from gaming.
That thought process was what led to the creation of SO REAL’s Virtual Sampler, which is designed to be as immediate, as intuitive, and as powerful as anything you’ll find in blockbuster videogames, but carefully tailored for users across the fashion value chain. And in fact one of the two versions of Virtual Sampler that we currently offer actually runs in a game engine, Unity, while the other uses web-native technology. Both make the products look great, and both are stable.
Within Virtual Sampler, your designs are visible in the main-screen inventory. Users select an object with a tap, then open it to reveal its component parts. For the purposes of this conversation, we’re looking at a brogue shoe. Using your fingers, you can zoom in and take a look at the punching: is it accurate? Does it meet the agreed criteria? The shoe will also rotate, giving you the full view of your design. Then, by selecting the upper, inner, or outer (or any segment of the object, for that matter), you can explore every detail – from aglet to colourways, from bill of materials to glue measurements, and from punch to lace size. It’s all there, and it accurately reflects reality in every way. Our vision is for Virtual Sampler to be simple, on-the-go, and cost-effective – which matters if you’re used to shipping samples around the globe. But the real value? By using it, you can load any object up with every single piece of metadata that you’d usually grab from a spreadsheet or from your PLM system, which is where that bridge between the creative and the commercial comes back into the conversation again.
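The interaction described above – tap a segment, read off its specification metadata – can be sketched in a few lines. This is purely illustrative, with an invented schema rather than Virtual Sampler’s real data model or API:

```python
# Illustrative sketch only: invented schema, not Virtual Sampler's API.
# Each segment of a scanned brogue carries its own specification metadata.
brogue = {
    "upper": {"punch_pattern": "wingtip", "material": "calf leather"},
    "inner": {"lining": "pigskin"},
    "outer": {"sole": "leather", "glue_spread_g": 4.2},
    "laces": {"length_mm": 750, "aglet": "metal"},
}

def inspect(shoe, segment):
    """Return the spec metadata for one selected segment, if present."""
    return shoe.get(segment, {})

print(inspect(brogue, "laces")["length_mm"])  # 750
```

In the real tool that metadata would be populated from the CT scan and from systems like PLM, rather than hand-written – the sketch only shows the shape of the lookup.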
Mark Harrop: Investing in new technology is a fraught discussion for any brand or retailer right now, given the challenges that the ongoing pandemic has introduced. What is your take on how a solution like SO REAL’s Virtual Sampler can help fashion businesses to keep working in an unprecedented situation?
Ian Ravenshaw Bland: Unprecedented is a bit of an overused word these days, but there really isn’t any other word for it. The world today is very much altered compared to how it was this time last year, and that change is affecting everything. We recently set up a new office, and it took weeks longer than expected because of disruptions to supply chains, distribution networks, infrastructure, and a host of other things. And we know those are challenges that the fashion industry is feeling very keenly as well, so we’re working right now on making sure that brands and their suppliers can use the Virtual Sampler as a tool for remote creativity and collaboration.
But it’s important to remember that big changes were afoot before COVID. Early last year we were working with a running shoe brand who told us they had a mandate from the very top to drastically reduce the number of samples per season by thousands. And this was before the pandemic made sample production, shipping, and tracking completely unpredictable.
So we see a simple solution: don’t ship anything at all. Produce a single sample, have it digitised at source and turned into a digital twin, and then load it into Virtual Sampler and invite all kinds of users to check it out. Designers can test it against their vision. QC can check the dimensions and the fit, relying on technical accuracy and internal volume measurements. The eCommerce team can even start building online experiences with it. The old mantra of “create once and reuse” has taken on a whole new life in this pandemic, and we’re doing everything we can to make that philosophy accessible to everyone.
Mark Harrop: Finally, can you tell us what you see as the future of CT scanning and machine learning for fashion? It seems as though the Virtual Sampler might be just a slice of what’s potentially possible.
Ian Ravenshaw Bland: We like to think big, because I believe sweeping change is the one thing that fashion and other industries have in common. Iterative improvements to existing processes won’t get us to where we need to be; a process that takes ten days today needs to take one day tomorrow. And that’s the thinking behind our strategy of putting CT scanning locations near suppliers as they come on board, so that their brand and retail customers can have digital versions of their samples the same day.
Today our Virtual Sampler is already well beyond the proof of concept stage, and we’re ready to start signing on users in larger volumes to help us perfect the interface and functionality, after which we’re going to make it more widely available as part of our scanning-and-twinning subscription model.
For SO REAL itself, the future is all about speed. Virtual Sampling is just the beginning, and while we’re investing the bulk of our R&D budget in further automating the scanning and twinning process, using deep learning, we’re also looking at bigger, bolder moves such as CT metrology-enabled virtual fitting and automated production of technical specifications.
For the first time, I think it’s safe to say that an all-digital, end-to-end sustainable fashion industry is within reach. I’m proud to be in a position to play a role in bringing it closer.