Data consistency from development through production to end-of-life recycling remains a dream of the future. (Photo: Adobe Stock, Yingyaipumi)
The Digital Twin (DT) – the supreme discipline of comprehensive data handling – is still in its infancy, according to experts: "Today, partial solutions for the digital twin are being created in individual domains, but this is only a first step towards integrated solutions," says Petra Foith-Förster, Head of the Automotive Business Unit at the Fraunhofer Institute for Manufacturing Engineering and Automation IPA. Consistency from development through production and the use phase to recycling – allowing data from all phases of the life cycle to be used profitably – remains a dream of the future.
A true digital twin is not in sight
Continuity between systems is still poor. "Data will continue to reside in existing systems in the future; it doesn't make sense to introduce a completely new IT system. Instead, cloud platforms can be layered underneath the existing systems," says Foith-Förster. DT projects struggle, however, because the ROI is not easily calculated. "There is no such thing as an omnipotent digital twin, only individual manifestations," agrees Walter Heeby, partner at MHP. He sees positive examples of use primarily on the engineering side, for example in simulation.
Moving toward production, things become more difficult. Here the DT can be used to standardize the description of systems and system states. An example: every robot and every machine consumes energy, but this data is named differently in each case, which makes consistent access difficult. "Although we have figured out how to get at the data in production, we don't always know what it means. This requires an abstraction layer to process the data in a unified way in the cloud," says Heeby.
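The abstraction layer Heeby describes can be pictured as a simple mapping step before cloud ingestion. The following is a minimal sketch, not anyone's actual implementation; all vendor field names are hypothetical examples of the naming divergence the article describes.

```python
# Minimal sketch of an abstraction layer: vendor-specific signal names
# are mapped onto one unified schema before the data reaches the cloud.
# All field names below are hypothetical.

VENDOR_ALIASES = {
    "EnergyConsumption_kWh": "energy_kwh",  # robot vendor A (hypothetical)
    "pwr_total_kwh": "energy_kwh",          # machine vendor B (hypothetical)
    "Wirkenergie_kWh": "energy_kwh",        # German-labelled PLC tag (hypothetical)
}

def normalize(raw: dict) -> dict:
    """Translate raw machine telemetry into the unified schema,
    dropping fields that have no known mapping."""
    return {VENDOR_ALIASES[k]: v for k, v in raw.items() if k in VENDOR_ALIASES}

unified = normalize({"pwr_total_kwh": 12.5, "cycle_id": 7})
print(unified)  # {'energy_kwh': 12.5}
```

In practice such a mapping table would itself have to be maintained per machine type, which is exactly why the standardization efforts discussed below matter.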
Data consistency is a long time coming
Consistency here means that not only access to the data but also its interpretation can be standardized. The Industrial Digital Twin Association is currently promoting a standard based on the OPC UA Companion Specifications. "This semantic extension will ensure, for example, that an energy value always looks the same," according to Walter Heeby.
This also includes defining the unit and the frequency at which the value is measured. It should not be long before the Asset Administration Shell is used in practice. "In the next year or two, we will see concrete approaches; otherwise data usage, especially in the cloud, will become a problem, and even the best data lake cannot prevent that," Heeby says. How quickly corresponding solutions will be rolled out across factories is difficult to estimate. But the development is necessary, the expert believes: "The pain point is the same everywhere: you need to describe this data."
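The idea of a semantically described value can be sketched as follows. This is only an illustration in the spirit of an Asset Administration Shell submodel element, not the actual AAS data model; the semantic identifier is made up.

```python
from dataclasses import dataclass

# Illustrative sketch: a value carried together with its semantics,
# loosely in the spirit of an Asset Administration Shell submodel element.
# The semantic_id below is a made-up identifier, not a real dictionary entry.

@dataclass(frozen=True)
class SemanticValue:
    semantic_id: str          # reference to a shared concept dictionary
    value: float
    unit: str                 # the unit is part of the definition
    sample_interval_s: float  # how often the value is measured

energy = SemanticValue(
    semantic_id="0173-1#02-XXX000#001",  # hypothetical identifier
    value=12.5,
    unit="kWh",
    sample_interval_s=60.0,
)
```

The point of the standard is that the `unit` and sampling frequency travel with the value, so a consumer in the cloud never has to guess what "12.5" means.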
Audi relies on virtual models
Audi, for example, is already working on the digital twin in several projects. "In planning, we conduct online process workshops using prototypes and virtual production cells," reports Jürgen Galb, responsible for data-driven production and supply chain at the Ingolstadt OEM. To this end, production lines and hall infrastructure are captured as part of the digital twin using 3D scans, and virtual prototypes from PLM are used. "As a basis for consistency, we implement projects on the DPP, the Digital Production Platform. The DPP is a highly scalable Volkswagen AG cloud infrastructure that links development, planning, production and logistics," Galb says. This enables end-to-end processes, such as handing over work sequences from process workshops to production control.
"Production results are passed on to subsequent processes and serve as the basis for data analytics," Galb further explains. Tier 1 supplier Bosch also relies on the DT. For Bosch this is also part of its sustainability strategy, as the company is confident that production systems can be planned, developed and tested in a more resource-efficient way on the basis of digital twins. But the question of which systems digital twins should be assigned to has not yet been answered: "There is competition from PLM and large platform providers, among others, and small manufacturers and start-ups are bringing partial solutions to market," says Petra Foith-Förster. There will probably never be a single solution. In her view, the biggest obstacle remains the lack of standards and rules; there is also no agreement on standardized interfaces between systems.
Is predictive maintenance already in use?
Virtual commissioning is currently one of the most popular application scenarios in production, reports Petra Foith-Förster. With the DT, for example, system set-up can be simulated digitally to keep start-up times as short as possible. This also speeds up tool changes. A pre-trained AI can verify in a simulated environment that a robot has already mastered its task before it has to perform it. Other data-based methods, for example around AR/VR, are also on the rise, because more and more practical and reliable solutions for remote maintenance are becoming available. In the event of a problem, particularly in maintenance, external experts are called in via data glasses and an application, or 3D CAD plans are used for guidance.
Predictive maintenance in particular, perhaps the best-known use case of AI in production, has so far not progressed beyond lighthouse projects and proofs of concept. Determining the ROI is not easy. "Companies find it hard to scale use cases; as soon as a solution moves to a different machine, everything is different. That makes it difficult to roll solutions out," Heeby reports. Here, too, the semantic description plays an important role. To perform AI analytics at the edge, specific hardware and gateway solutions are also required. The computing power this requires at the edge is expensive and not worthwhile unless the benefit is clearly demonstrated. So far, this has been the case primarily in visual quality inspection.
Electromobility can drive the digital twin
Image processing and artificial intelligence based on the DT are also useful when working with components of varying shapes. In the automotive industry, the topic is of particular interest to operations and start-ups dealing with a high degree of variance, says the Fraunhofer expert: "Until now, it has not been the norm for robots to interact with changing components. However, this could change as more alternative drive types are introduced." For example, researchers are working on a DT for battery cell production. The focus is on mapping the data chain into the PLM system so that the data can also be used for recycling – a complex challenge.
Walter Heeby argues that companies still need to catch up on data strategy in production: what matters, here and beyond, is to capture data and its semantics cleanly, drawing on the knowledge of production experts, and to build up more experience with data. "Only with practical knowledge can data be contextualized and categorized: a time series alone is useless; it must also be clear what is happening in the process," says Walter Heeby. This preparatory work is absolutely necessary.
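The contextualization Heeby calls for can be sketched in a few lines: pairing each sample of a time series with the process step that produced it turns raw numbers into answerable questions. Step names and readings below are invented for illustration.

```python
# Sketch of contextualizing a time series with process knowledge.
# Readings and step labels are invented for illustration.

readings = [4.1, 4.0, 9.8, 9.9, 4.2]                    # e.g. spindle current in A
steps = ["idle", "idle", "cutting", "cutting", "idle"]  # process context per sample

contextualized = list(zip(steps, readings))

# With context attached, domain questions become answerable,
# e.g. the average load during the cutting step:
cutting = [v for step, v in contextualized if step == "cutting"]
avg_cutting_load = sum(cutting) / len(cutting)
print(avg_cutting_load)  # 9.85
```

Without the `steps` column, the same series of numbers would be exactly the "useless time series" the quote warns about.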