
Integrations; what you need to ask/know (part three – moving to the cloud)


Integration and migration specialist, Kevin Ryder, shares the final instalment in his exclusive three-part series today. Having already covered the business case for PLM, technology and architecture, key stakeholders, timelines, and test environments, he now explores the recent surge in moving to the cloud, and what this means for integration.

PLM systems are now moving to the cloud. This in itself represents a big shift, and one that has a significant impact on integrations.

Addressing batch extractions first: most cloud-hosted PLM systems will not have the ETL tools currently in use hosted alongside them, so the question becomes what to do. There are plenty of companies offering ETL solutions, and trying to cover them all in this piece would be impossible. PLM providers need to assess which offers the best option for them, in terms of how the tools are hosted, the flexibility they provide (as discussed previously), and the cost.

PLM systems are now being pushed to offer fully loaded API suites (if they don’t already), and these are needed more than ever to allow continuous integration to and from PLM in the cloud. The integration options have multiplied: some businesses still run PLM on a server but have another business system in the cloud, or vice versa, with PLM in the cloud and the other system still onsite. That other system may well be a very old legacy one that cannot deal with APIs directly; in the past it would have relied on an ETL tool such as SSIS to extract the data to a fixed file format, so an intermediate tool will now be required.
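To make the role of that intermediate tool concrete, here is a minimal sketch in Python of the kind of bridge described above: it calls a hypothetical PLM REST endpoint and writes the result to the fixed-format file the legacy system already expects. The URL, token and field names are illustrative assumptions, not any particular vendor’s API.

```python
import csv
import json
import urllib.request

# Hypothetical endpoint, token and field names; a real PLM API will differ.
PLM_API_URL = "https://plm.example.com/api/v1/styles"
API_TOKEN = "replace-with-real-token"


def extract_styles_to_csv(output_path: str) -> None:
    """Pull style records from the PLM API and write them to a fixed-format
    CSV that a legacy system can pick up from a shared drive or FTP site."""
    request = urllib.request.Request(
        PLM_API_URL,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
    )
    with urllib.request.urlopen(request) as response:
        styles = json.loads(response.read())

    with open(output_path, "w", newline="") as handle:
        writer = csv.writer(handle)
        writer.writerow(["style_code", "description", "season"])  # fixed column order
        for style in styles:
            writer.writerow([style["code"], style["description"], style["season"]])


if __name__ == "__main__":
    extract_styles_to_csv("styles_extract.csv")
```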

The final option is cloud to cloud, and again this presents its own issues. If your PLM system offers an API to get data out (or in), that API still has to be executed by another piece of code, which circles back to the security and support issue: you must use a validated portal. You now have two systems, both with an in/out API; but how do you connect them? You still need an intermediate piece to tie everything together. Unless the two systems ship a standard connector/API as part of their offering, you are back to developing a bespoke piece of code to connect the two, or you create a hub.
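A point-to-point connector of that kind is typically little more than a fetch, a field mapping and a push. The sketch below (Python, with hypothetical endpoints and field names) shows how small the bespoke piece can be, and also hints at why it multiplies quickly once more than two systems are involved.

```python
import json
import urllib.request

# Both endpoints and payload shapes are hypothetical; real systems will each
# have their own authentication, pagination and field names.
SOURCE_API = "https://plm.example.com/api/v1/materials"
TARGET_API = "https://erp.example.com/api/v1/materials"


def fetch_materials() -> list:
    """Read material records out of the source system's API."""
    with urllib.request.urlopen(SOURCE_API) as response:
        return json.loads(response.read())


def push_material(record: dict) -> None:
    """Write a single mapped record into the target system's API."""
    payload = json.dumps({
        "code": record["material_code"],   # the field mapping is the bespoke part
        "name": record["material_name"],
    }).encode("utf-8")
    request = urllib.request.Request(
        TARGET_API,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(request)


if __name__ == "__main__":
    for material in fetch_materials():
        push_material(material)
```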

Given the nature of the cloud and its benefits, a PLM vendor will not want to host bespoke custom software there. The ideal solution would be for the two interacting systems to offer a generic connector. In reality, however, clients use many different systems, and there are so many available today that building and maintaining connectors for all of them is not cost effective. So instead you create your own hub.

There are ETL tools that can manage APIs, so you can pull data from one system, transform it and push it to another. This option is a good choice if you are going to have multiple systems integrating, or several integration points and flows of data (colours, materials and styles, for example). In effect you are building your own internal hub, similar in concept to a bus messaging system. That said, you could always employ an actual bus/messaging system; the options are there, but choosing the correct approach will take some consultation in itself. Areas to consider are the number of systems you have; the overlap of data between them; your options to integrate in the future; and, if several integration points are a possibility, building for that even if you are not working on them in the short term. It is better to build the foundations correctly the first time, with the option to expand, than to re-engineer further down the line.
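To illustrate the “build your own hub” idea, here is a minimal sketch in Python, with invented flow names and stubbed endpoints: each flow (colours, materials, styles) registers its own extract, transform and load steps against one shared framework, so adding a future integration point becomes an addition rather than a re-engineering exercise.

```python
from typing import Callable, Dict, List

# A minimal internal hub: each data flow registers an extract, a transform
# and a load step. Flow names, record shapes and endpoints are hypothetical;
# the point is the structure, not the detail.

Record = dict
Step = Callable[[List[Record]], List[Record]]


class IntegrationHub:
    def __init__(self) -> None:
        self._flows: Dict[str, Dict[str, Callable]] = {}

    def register_flow(self, name: str, extract: Callable[[], List[Record]],
                      transform: Step, load: Callable[[List[Record]], None]) -> None:
        """Adding a new integration point is a registration, not a rebuild."""
        self._flows[name] = {"extract": extract, "transform": transform, "load": load}

    def run(self, name: str) -> None:
        flow = self._flows[name]
        records = flow["extract"]()
        records = flow["transform"](records)
        flow["load"](records)


# Example wiring for a 'colours' flow with stubbed endpoints.
hub = IntegrationHub()
hub.register_flow(
    "colours",
    extract=lambda: [{"code": "RED01", "hex": "#c1121f"}],          # would call the PLM API
    transform=lambda rows: [{**r, "code": r["code"].lower()} for r in rows],
    load=lambda rows: print(f"pushed {len(rows)} colour records"),  # would call the target API
)
hub.run("colours")
```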

The traditional and original method of installing a PLM system was on a local server. This expanded as companies began offering hosting, but the concept was the same: access to a network. For integration, APIs or web services allowed direct access; legacy systems, which tended to be batch-based, would have access to a network drive where an extracted file could be uploaded or picked up, or an FTP site would be used and the file made accessible to the ETL tool. If each PLM system had its own cloud, then in theory this process could still work, with an API used to grab the file and the cloud hosting the ETL tool; in practice, though, this will almost certainly not happen. The benefit of the cloud is for one system to be hosted, with multiple clients on the same cloud all using the same version of the software and their data separated by company. This in turn means the API suite has to be able to differentiate data belonging to different clients, which adds complexity if you were to upload a flat file, for example. The methodology of using an intermediate tool to deal with the file first, and then using the APIs, therefore proves more appealing.
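As a sketch of what that extra complexity looks like from the integration side, the snippet below (hypothetical endpoint, header and field names) shows a tenant-aware call: every request has to carry a company identifier so the shared API suite can separate one client’s data from another’s.

```python
import json
import urllib.request

# Sketch of tenant-aware access to a shared (multi-tenant) cloud PLM.
# The endpoint, header name and credentials are assumptions for illustration.
PLM_API_URL = "https://plm.example.com/api/v1/styles"


def fetch_styles_for_company(company_id: str, token: str) -> list:
    """Every call identifies which client's data it is asking for, since all
    companies share the same hosted version of the software."""
    request = urllib.request.Request(
        PLM_API_URL,
        headers={
            "Authorization": f"Bearer {token}",
            "X-Company-Id": company_id,  # hypothetical tenant header
        },
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())
```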

[As a footnote: I’ve been involved in a lot of report writing for PLM systems in the past, and I know clients like to create their own reports, either via services from the provider or via their own IT team. Unless the PLM system allows the upload of a report and stored procedure (which raises its own security issues, since it means pushing custom code to a hosted cloud system serving multiple clients), you need the option to extract the data and report on it yourself. This fits perfectly with the solution of an intermediate tool that connects via the APIs: extract the data, manipulate it, and then create your reports as you normally would.]
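A minimal sketch of that reporting pattern follows (Python, with a hypothetical endpoint and field names): the data is extracted through the API, summarised locally, and the report produced outside the hosted system, so no stored procedures or custom code need to be uploaded to the cloud.

```python
import csv
import json
import urllib.request
from collections import Counter

# Hypothetical endpoint and fields; a real PLM style record will differ,
# but the pattern (extract, manipulate, report) is the same.
PLM_API_URL = "https://plm.example.com/api/v1/styles"


def styles_per_season_report(output_path: str) -> None:
    """Extract style data over the API, summarise it locally and write a
    simple report, without pushing any custom code into the hosted PLM."""
    with urllib.request.urlopen(PLM_API_URL) as response:
        styles = json.loads(response.read())

    counts = Counter(style["season"] for style in styles)

    with open(output_path, "w", newline="") as handle:
        writer = csv.writer(handle)
        writer.writerow(["season", "style_count"])
        for season, count in sorted(counts.items()):
            writer.writerow([season, count])


if __name__ == "__main__":
    styles_per_season_report("styles_per_season.csv")
```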

Cloud PLM is the way forward for PLM providers. They are taking advantage of the benefits that come with cloud technology, such as rapid deployment and continuous updates, allowing them to focus on further enhancements rather than supporting onsite installations. With this comes the realisation that, to take full advantage of those benefits, users of PLM systems will want fully connected environments. Achieving this takes serious planning, to ensure you develop the correct internal framework: one that can be built upon to connect all your systems, not just on an individual, point-to-point basis.

*This piece concludes Kevin Ryder’s three-part series on integrations. Stay tuned for further content from Kevin later this year, focusing on migrations.
