Different network architectures are being used to build remote laboratories. Historically, it has been difficult to integrate industrial control systems with higher-level IT systems such as enterprise resource planning (ERP), manufacturing execution systems (MES), and manufacturing operations management (MOM). Getting these systems to communicate with one another has proven difficult due to the absence of shared protocols between them. The Open Platform Communications Unified Architecture (OPC UA) protocol was introduced as a remedy for this issue and is gaining popularity, but what if open-source protocols that are widely used in the IT industry could be used instead? This paper presents the development of an IT architecture for a cyber-physical industrial control systems laboratory that enables seamless interconnection and integration of its elements. The architecture utilises Node-RED, an open-source programming platform developed by IBM that focuses on making it simple to link physical components, APIs, and web services. The cyber-physical laboratory is used for teaching the principles of an industrial cascaded process control factory. Finally, this text also discusses future work relating to digital twins (DT). A coupled tank system is selected as the teaching factory to illustrate a range of fluid control applications found in a typical chemical process plant.
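As a hedged illustration of the cascaded control principle such a coupled-tank teaching factory conveys (not the laboratory's actual code), the sketch below simulates two linearised tanks in series: an outer PI loop on the level of tank 2 sets the level setpoint of tank 1, and an inner P loop on tank 1 drives the pump flow. All plant parameters and controller gains are invented assumptions.

```python
# Hedged sketch of cascade level control for a coupled two-tank system.
# Tank areas, outflow coefficients and gains are illustrative, not measured.

def simulate(steps=60000, dt=0.01, sp2=1.0):
    A1 = A2 = 1.0                 # tank cross-sections (m^2), assumed
    c1 = c2 = 0.5                 # linearised outflow coefficients, assumed
    kp_out, ki_out = 1.0, 0.2     # outer PI gains, assumed
    kp_in = 3.0                   # inner P gain, assumed
    h1 = h2 = 0.0                 # tank levels (m)
    integ = 0.0                   # outer-loop integral state
    for _ in range(steps):
        e2 = sp2 - h2
        integ += e2 * dt
        sp1 = kp_out * e2 + ki_out * integ    # outer loop: setpoint for tank 1
        q_in = max(0.0, kp_in * (sp1 - h1))   # inner loop: pump flow (>= 0)
        q12 = c1 * h1                         # tank 1 -> tank 2 (linearised)
        h1 += dt * (q_in - q12) / A1          # explicit Euler integration
        h2 += dt * (q12 - c2 * h2) / A2
    return h1, h2

h1, h2 = simulate()   # both levels settle near the 1.0 m setpoint
```

The integral action in the outer loop removes the steady-state offset that a pure P cascade would leave; the `max(0, …)` clip models that the pump cannot pull water back out of the tank.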
Impact of a large distribution network on radiation characteristics of planar spiral antenna arrays
(2023)
Designing antenna arrays with a central feed point has gained ground in antenna engineering. This approach, usually chosen for reasons of manufacturing cost, is difficult to realise and leads to a large feeding network, whose impact is numerically investigated in the present work. A comparison of three different antennas shows that the enlarged feed network strongly affects both the antenna's overall dimensions and its radiation characteristics. The antenna with the plug-in solution is not only smaller but also performs better than the antennas with a central feed point. Considering the high effort of designing a feed network with a central point and the influence of the resulting enlarged network on the dimensions and radiation characteristics of the antenna, the cost saving in production can be put into perspective.
Advancing mental health diagnostics: AI-based method for depression detection in patient interviews
(2023)
In this paper, we present a novel artificial intelligence (AI) application for depression detection, using advanced transformer networks to analyse clinical interviews. By incorporating simulated data to enhance traditional datasets, we overcome limitations in data protection and privacy, consequently improving the model’s performance. Our methodology employs BERT-based models, GPT-3.5, and ChatGPT-4, demonstrating state-of-the-art results in detecting depression from linguistic patterns and contextual information that significantly outperform previous approaches. Utilising the DAIC-WOZ and Extended-DAIC datasets, our study showcases the potential of the proposed application in revolutionising mental health care through early depression detection and intervention. Empirical results from various experiments highlight the efficacy of our approach and its suitability for real-world implementation. Furthermore, we acknowledge the ethical, legal, and social implications of AI in mental health diagnostics. Ultimately, our study underscores the transformative potential of AI in mental health diagnostics, paving the way for innovative solutions that can facilitate early intervention and improve patient outcomes.
In recent years, the demand for accurate and efficient 3D body scanning technologies has increased, driven by the growing interest in personalised textile development and health care. This position paper presents the implementation of a novel 3D body scanner that integrates multiple RGB cameras and image stitching techniques to generate detailed point clouds and 3D mesh models. Our system significantly enhances the scanning process, achieving higher resolution and fidelity while reducing the cost, time and effort required for data acquisition and processing. Furthermore, we evaluate the potential use cases and applications of our 3D body scanner, focusing on the textile technology and health sectors. In textile development, the 3D scanner contributes to bespoke clothing production, allowing designers to construct made-to-measure garments, thus minimising waste and enhancing customer satisfaction through well-fitting clothing. In mental health care, the 3D body scanner can be employed as a tool for body image analysis, providing valuable insights into the psychological and emotional aspects of self-perception. By exploring the synergy between the 3D body scanner and these fields, we aim to foster interdisciplinary collaborations that drive advancements in personalisation, sustainability, and well-being.
Patterns are virtually simulated in 3D CAD programs before production to check the fit. However, achieving lifelike representations of human avatars, especially regarding soft-tissue dynamics, remains challenging. This is mainly because conventional avatars in garment CAD programs are simulated with a continuous hard surface and do not correspond to the physical and mechanical properties of human soft tissue. In the real world, the human body's natural shape is affected by the contact pressure of tight-fitting textiles. To verify the fit of a simulated garment, the interactions between the individual body shape and the garment must be considered. This paper introduces an innovative approach to digitising the softness of human tissue using 4D scanning technology. The primary objective of this research is to explore the interactions between tissue softness and different compression levels of apparel, exerting pressure on the tissue to capture the changes in the natural shape. To generate data and model an avatar with soft-body physics, it is essential to capture the deformability and elasticity of the soft tissue and map it into the modification options of a simulation. To achieve this, various methods from different fields were researched and compared, and 4D scanning was evaluated as the most suitable method for capturing tissue deformability in vivo. In particular, it should be considered that the human body has different deformation capabilities depending on age, muscle mass and body fat. In addition, different tissue zones have different mechanical properties, so it is essential to identify and classify them in order to provide these properties to the simulation. It has been shown that by digitising the data obtained at the different defined pressure levels, a prediction of the tissue deformation of that exact person becomes possible.
As technology advances and data sets grow, this approach has the potential to reshape how we verify fit digitally with soft avatars and leverage their realistic soft tissue properties for various practical purposes.
Analog integrated circuit sizing still relies heavily on human expert knowledge, as previous automation approaches have not found widespread acceptance in industry. One strand, optimization-based automation, is often discarded due to inflated constraint setups, infeasible results or excessive run times. To address these deficits, this work proposes an alternative optimization flow that captures a designer's intuition for feasible design spaces by integrating expert knowledge based on the gm/ID method. Moreover, the extensive run times of simulation-based optimization flows are overcome by incorporating computationally efficient machine learning methods. Neural network surrogate models predicting eleven performance parameters increase the evaluation speed by 3400× on average compared to a simulator. They also enable the use of optimization algorithms that depend on automatic differentiation, which would otherwise be unavailable in this field. First, an up to 4× more efficient way of sampling training data based on the aforementioned design space is detailed. After presenting the architecture and training effort of the surrogate models, they are employed as part of the objective function for sizing three operational amplifiers with three different optimization algorithms. Finally, the benefits of the gm/ID method become evident when considering technology migration, as previously found solutions may be reused for other technologies.
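To illustrate the gm/ID method the abstract builds on, the following toy sketch performs the core lookup step: a table maps the chosen inversion level gm/ID to a current density ID/W for a hypothetical technology, from which drain current and transistor width follow directly. All table values are invented for demonstration and are not taken from the paper.

```python
from bisect import bisect_left

# Toy gm/ID sizing step. The lookup table below is a placeholder for a
# characterisation sweep of a real technology; its values are invented.
GM_ID = [5.0, 10.0, 15.0, 20.0, 25.0]   # gm/ID in 1/V (assumed table)
ID_W = [80.0, 20.0, 5.0, 1.0, 0.2]      # ID/W in A/m (assumed table)

def current_density(gm_id):
    """Linearly interpolate ID/W at the requested gm/ID."""
    j = bisect_left(GM_ID, gm_id)
    if j < len(GM_ID) and GM_ID[j] == gm_id:
        return ID_W[j]
    if j == 0 or j == len(GM_ID):
        raise ValueError("gm/ID outside table range")
    t = (gm_id - GM_ID[j - 1]) / (GM_ID[j] - GM_ID[j - 1])
    return ID_W[j - 1] + t * (ID_W[j] - ID_W[j - 1])

def size_transistor(gm_target, gm_id):
    """Return (drain current ID in A, width W in m) for a target gm in S."""
    i_d = gm_target / gm_id              # ID = gm / (gm/ID)
    return i_d, i_d / current_density(gm_id)
```

Because the table, not the simulator, answers each sizing query, such lookups are cheap, and swapping the table for another technology's characterisation is what makes the migration argument in the abstract plausible.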
This article presents a modified method of performing power flow calculations as an alternative to purely energy-based simulations of off-grid hybrid systems. The enhancement consists of transforming the scenario-based power flow method into a discrete, time-dependent algorithm that includes bus and controller dynamics.
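The step from a scenario-based balance to a discrete time-dependent one can be sketched as follows. This is a minimal single-bus illustration, not the article's algorithm: the bus power balance is advanced step by step, and a first-order lag stands in for the controller dynamics of the battery converter. All profiles and parameters are invented.

```python
# Minimal time-stepped power balance for one off-grid bus. The generation
# and load profiles and the converter time constant are illustrative only.

def run(dt=1.0, steps=6):
    pv = [0.0, 2.0, 4.0, 4.0, 2.0, 0.0]     # kW, assumed generation profile
    load = [1.0, 1.0, 3.0, 5.0, 3.0, 1.0]   # kW, assumed load profile
    tau = 2.0        # controller time constant (s), assumed
    p_bat = 0.0      # battery power actually delivered (kW)
    soc = 5.0        # battery state of charge (kWh)
    trace = []
    for k in range(steps):
        p_ref = load[k] - pv[k]              # power the bus is short of
        p_bat += dt / tau * (p_ref - p_bat)  # first-order converter lag
        soc -= p_bat * dt / 3600.0           # discharging reduces the SOC
        trace.append((p_bat, soc))
    return trace

trace = run()
```

The point of the lag term is exactly what the abstract calls "controller dynamics": the delivered battery power chases the reference rather than jumping to it, which a purely energy-based simulation cannot represent.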
Most question-answering (QA) systems rely on training data to reach their optimal performance. However, acquiring training data for supervised systems is both time-consuming and resource-intensive. To address this, we propose TFCSG, an unsupervised similar-question retrieval approach that leverages pre-trained language models and multi-task learning. First, topic keywords are extracted sequentially from question sentences by a latent topic-filtering algorithm to construct an unsupervised training corpus. Then, multi-task learning is used to build the question retrieval model. Three tasks are designed: the first is a short-sentence contrastive learning task; the second judges the similarity between a question sentence and its corresponding topic sequence; the third generates the corresponding topic sequence from a question sentence. The three tasks are used to train the language model in parallel. Finally, similar questions are obtained by computing the cosine similarity between sentence vectors. Comparison experiments on public question datasets show that TFCSG outperforms the unsupervised baseline methods, and no manual labelling is required, which greatly saves human resources.
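The final retrieval step described above, ranking by cosine similarity between sentence vectors, can be sketched in a few lines. The three-dimensional vectors below are toy placeholders standing in for the trained encoder's output; real sentence embeddings would have hundreds of dimensions.

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

def retrieve(query_vec, corpus):
    """Rank (question, vector) pairs by similarity to the query vector."""
    return sorted(corpus, key=lambda item: cosine(query_vec, item[1]),
                  reverse=True)

# Toy corpus: the vectors are invented stand-ins for encoder output.
corpus = [
    ("How do I reset my password?", [0.9, 0.1, 0.0]),
    ("What is the refund policy?", [0.1, 0.8, 0.3]),
    ("I forgot my login password.", [0.7, 0.3, 0.2]),
]
ranked = retrieve([0.85, 0.15, 0.05], corpus)
```

With a well-trained encoder, the two password questions end up close in the vector space and are returned ahead of the unrelated refund question, which is the behaviour the contrastive training objective targets.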
The presented research estimates the correlation between the share of renewable energy sources and the costs of congestion management in the electric networks of selected European countries. Data from six countries in the North-West European area (Italy, Spain, Germany, France, Poland and Austria) were investigated. Factors considered include grid congestion costs (both re-dispatching and countertrading costs), gross electricity generation, installed capacity of electric generating facilities, installed capacity of non-dispatchable renewable energy sources and total electricity consumption. Special attention is paid to the share of renewable energy sources. It is found that the grid congestion costs are not clearly affected by the penetration of non-dispatchable renewables in all the analysed countries, and a clear mathematical correlation therefore cannot be extrapolated from the available data. Overall, the results show a loose dependency of the grid congestion costs on the penetration of renewables and a strong dependency on the total electricity consumption of the country.
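The kind of check behind that finding can be sketched with a Pearson correlation coefficient. The figures below are invented placeholders, not the study's data; they are merely constructed so that congestion cost tracks consumption closely while being essentially unrelated to the renewables share.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented per-country figures for six hypothetical countries.
consumption = [300, 450, 500, 600, 150, 70]          # TWh, assumed
res_share = [0.20, 0.40, 0.10, 0.35, 0.50, 0.15]     # share, assumed
congestion = [310, 460, 490, 620, 140, 80]           # M EUR, assumed

r_cons = pearson(consumption, congestion)   # strong positive correlation
r_res = pearson(res_share, congestion)      # near zero for this toy data
```

A coefficient near ±1 indicates a tight linear relationship; a value near zero, as for the renewables share here, is what "a clear mathematical correlation cannot be extrapolated" amounts to numerically.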
The efficient production and utilization of green hydrogen is vital to the global effort towards a sustainable future. To provide the necessary amount of green hydrogen, a large number of electrolyzers will be connected to the grid as decentralized power consumers, while a large number of decentralized renewable power sources provide the energy. In such a system, a control method is needed to dispatch the available power as efficiently as possible; in particular, the shutdown of renewable energy sources due to temporary overproduction must be avoided. This paper presents a decentralized tertiary control algorithm, creating a flexible, robust and easily scalable system. The operation of each participant in this grid-connected microgrid is optimized for maximum financial profit while minimizing the power exchange with the mains grid and reducing the shutdown of renewable power sources.
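The dispatch objective described above can be illustrated with a simple centralized merit-order sketch; the paper's actual algorithm is decentralized, and the offers, powers and prices below are invented. Renewable offers are taken first (zero marginal price) so that curtailment is avoided, and only the residual shortfall is drawn from the mains grid.

```python
# Illustrative merit-order dispatch: cheapest offers are accepted first,
# and whatever demand remains uncovered is imported from the mains grid.

def dispatch(offers, demand):
    """offers: list of (name, available_kw, price_per_kwh) tuples.

    Returns (plan, grid_import_kw), where plan lists the accepted
    (name, dispatched_kw) pairs in merit order.
    """
    plan, remaining = [], demand
    for name, avail, price in sorted(offers, key=lambda o: o[2]):
        take = min(avail, remaining)
        if take > 0:
            plan.append((name, take))
            remaining -= take
    return plan, remaining   # remaining shortfall = mains-grid import

# Invented example: two renewables at zero marginal price and one CHP unit.
plan, grid_kw = dispatch(
    [("pv", 30, 0.0), ("wind", 20, 0.0), ("chp", 40, 0.12)], demand=60)
```

Because the zero-price renewables are exhausted before any dispatchable or grid power is used, no renewable output is curtailed, which is precisely the behaviour the tertiary control layer is meant to guarantee in a decentralized fashion.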