Context: Companies operating in the software-intensive business face high market dynamics, rapidly evolving technologies, and fast-changing customer behavior. Traditional product roadmapping practices, such as fixed time-based charts with detailed planned features, products, or services, typically fail in such environments. Until now, the underlying reasons for the failure of product roadmaps in a dynamic and uncertain market environment have not been widely analyzed and understood.
Objective: This paper aims to identify current challenges and pitfalls practitioners face when developing and handling product roadmaps in a dynamic and uncertain market environment.
Method: To reach our objective we conducted a grey literature review (GLR).
Results: Overall, we identified 40 relevant papers, from which we extracted 11 challenges in applying product roadmapping in a dynamic and uncertain market environment. The analysis of the articles showed that the major challenges for practitioners lie in overcoming a feature-driven mindset, avoiding excessive detail in the product roadmap, and ensuring that the content of the roadmap is not driven solely by management or expert opinion.
Enterprises and societies currently face crucial challenges. Society 5.0 can contribute to a super-smart society, especially in manufacturing and healthcare, while Industry 4.0 is becoming important in the global manufacturing industry. Smart energy digital platforms are architected to manage energy supply efficiently. Furthermore, such digital platforms are expected to collect various kinds of data and analyze Big Data for trends in the sharing economy within ecosystems. The adaptive integrated digital architecture framework (AIDAF) for the Design Thinking Approach with Risk Management is expected to align with digital IT strategy. In this paper, we propose that various energy management systems and related digital platforms be designed and implemented in alignment with a digital IT strategy for the sharing economy toward Society 5.0, using the AIDAF framework for the Design Thinking Approach with Risk Management. The vision of AIDAF applications enabling the sharing economy and digital platforms is explained and extended in the context of Society 5.0. In addition, challenges and future activities in this area are discussed, covering the directions of smart energy for Society 5.0.
To evaluate the performance of different stapes prosthesis types, a coupled finite element (FE) model of the human ear was developed. First, the middle-ear FE model was developed and validated using middle-ear transfer function measurements available in the literature, including pathological cases. Then, the inner-ear FE model was developed and validated using tonotopy, impedance, and cochlear amplification level curves from the literature. Both models are based on pre-existing research with some improvements and were combined into one coupled FE model. The stapes in the coupled FE ear model was replaced with a model of a stapes prosthesis to create a reconstructed ear model that can be used to estimate how different types of prostheses perform relative to each other as well as to the natural ear. This will help in designing new, innovative types of stapes prostheses, or any other type of middle-ear prosthesis, and in improving those already available on the market.
Verification of an active time constant tuning technique for continuous-time delta-sigma modulators
(2022)
In this work we present a technique to compensate the effects of RC and gm-C time-constant (TC) errors due to process variation in continuous-time delta-sigma modulators. Local TC error compensation factors are shifted around the modulator loop to positions where they can be implemented efficiently with finely tunable circuit structures, such as current-steering digital-to-analog converters (DACs). We apply our technique to a third-order, single-bit, low-pass continuous-time delta-sigma modulator in a cascaded integrator feedback structure, implemented in a 0.35-μm CMOS process. A tuning scheme for the reference currents of the feedback DACs is derived as a function of the individual TC errors and verified by circuit simulations. We confirm the tuning technique experimentally on the fabricated circuit over a TC parameter variation range of ±20%. Stable modulator operation is achieved for all parameter sets. The measured performance satisfies the expectations from our theoretical calculations and circuit-level simulations.
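The tuning idea can be illustrated with a deliberately simplified first-order model (an assumption for illustration, not the paper's exact derivation): if an integrator's time constant deviates by a relative error ε, its effective loop coefficient scales by 1/(1+ε), so scaling the corresponding feedback-DAC reference current by (1+ε) restores the nominal coefficient. All function names and the coefficient model below are assumptions.

```python
# Illustrative sketch: each loop coefficient is modeled as
# k = k_nom * (I_dac / I_nom) / (1 + eps), so choosing
# I_dac = I_nom * (1 + eps) cancels the TC error exactly.

def compensated_dac_current(i_nominal, tc_error):
    """Scale the DAC reference current to cancel a relative TC error."""
    return i_nominal * (1.0 + tc_error)

def effective_coefficient(k_nominal, tc_error, i_dac, i_nominal):
    """Simplified loop-coefficient model (assumption, see lead-in)."""
    return k_nominal * (i_dac / i_nominal) / (1.0 + tc_error)

k_nom, i_nom = 0.5, 10e-6
for eps in (-0.2, 0.0, 0.2):          # +/-20 % TC variation, as in the paper
    i_tuned = compensated_dac_current(i_nom, eps)
    k_eff = effective_coefficient(k_nom, eps, i_tuned, i_nom)
    assert abs(k_eff - k_nom) < 1e-12  # nominal coefficient restored
```

In the real modulator the compensation factors are shifted to the DAC because its current is finely tunable, whereas resistors and capacitors are not.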
Today, companies face increasing market dynamics, rapidly evolving technologies, and rapid changes in customer behavior. Traditional approaches to product development typically fail in such environments and require companies to transform their often feature-driven mindset into a product-led one. A promising first step on the way to a product-led company is a better understanding of how product planning, in the sense of product roadmapping, can be adapted to the requirements of an increasingly dynamic and uncertain market environment. The authors developed the DEEP product roadmap assessment tool to help companies evaluate their current product roadmap practices and identify appropriate actions for transitioning to a more product-led company. Objective: The goal of this paper is to gain insight into the applicability and usefulness of version 1.1 of the DEEP model. In addition, the benefits and implications of using the DEEP model in corporate contexts are explored. Method: We conducted a multiple case study in which participants were observed using the DEEP model. We then interviewed each participant to understand their perceptions of the DEEP model. In addition, we conducted interviews with each company's product management department to learn how the application of the DEEP model influenced their attitudes toward product roadmapping. Results: The study showed that by applying the DEEP model, participants better understood which artifacts and methods were critical to product roadmapping success in a dynamic and uncertain market environment. In addition, the application of the DEEP model helped convince management and other stakeholders of the need to change current product roadmapping practices. The application also proved to be a suitable starting point for the transformation in the participating companies.
Industrial practice is characterized by random events, also referred to as internal and external turbulences, which disturb the target-oriented planning and execution of production and logistics processes. Methods of probabilistic forecasting, in contrast to single-value predictions, estimate the probability of various future outcomes of a random variable in the form of a probability density function instead of predicting a specific single outcome. Probabilistic forecasting methods, embedded into the analytics process to gain insights into the future based on historical data, therefore offer great potential for incorporating uncertainty into planning and control in industrial environments. In order to familiarize students with these potentials, a training module on the application of probabilistic forecasting methods in production and intralogistics was developed in the learning factory 'Werk150' of the ESB Business School (Reutlingen University). The theoretical introduction to analytics, probabilistic forecasting methods, and the transition to the application domain of intralogistics draws on examples from other disciplines, such as weather forecasting and energy consumption forecasting. In addition, datasets from the learning factory are used to familiarize the students with the steps of the analytics process in a practice-oriented manner. The students are then given the task of identifying the influencing factors and the information required to capture intralogistics turbulences based on defined turbulence scenarios (e.g. failure of a logistical resource) in the learning factory. Within practical production scenario runs, the students apply probabilistic forecasting, using and comparing different probabilistic forecasting methods.
The graduate training module allows students to experience the potential of using probabilistic forecasting methods to improve production and intralogistics processes in the context of turbulences and to build up corresponding professional and methodological competencies.
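The core contrast the module teaches, a single-value prediction versus a distribution of outcomes, can be sketched in a few lines. The synthetic demand data and the chosen quantile levels below are illustrative assumptions, not data from the 'Werk150' learning factory.

```python
import random
import statistics

# A probabilistic forecast replaces one point estimate with empirical
# quantiles of the historical observations of a logistics quantity.
random.seed(42)
history = [random.gauss(100.0, 15.0) for _ in range(500)]  # e.g. daily picks

point_forecast = statistics.mean(history)
deciles = statistics.quantiles(history, n=10)   # 9 cut points: q10 .. q90
q10, q50, q90 = deciles[0], deciles[4], deciles[8]

# A planner can now size buffers against the 90 % quantile instead of
# planning only with the mean - this is where uncertainty becomes actionable.
print(f"point: {point_forecast:.1f}, 80% interval: [{q10:.1f}, {q90:.1f}]")
```

Comparing the interval width across turbulence scenarios (e.g. a failed logistical resource) is exactly the kind of exercise the module describes.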
Theoretical foundation, effectiveness, and design artefact for machine learning service repositories
(2022)
Machine learning (ML) has played an important role in research in recent years. For companies that want to use ML, finding the algorithms and models that fit their business is tedious. A review of the available literature on this problem reveals only a few research papers. Given this gap, the aim of this paper is to design an effective and easy-to-use ML service repository. The corresponding research is based on a multi-vocal literature analysis combined with design science research, addressing three research questions: (1) How is the current white and gray literature on ML services structured with respect to repositories? (2) Which features are relevant for an effective ML service repository? (3) How can a prototype for an effective ML service repository be conceptualized? The findings are relevant for explaining user acceptance of ML repositories, which is essential for corporate practice in order to create and use ML repositories effectively.
The time has come: application of artificial intelligence in small- and medium-sized enterprises
(2022)
Artificial intelligence (AI) is not yet widely used in small- and medium-sized industrial enterprises (SMEs). The reasons for this are manifold, ranging from poorly understood use cases and too few trained employees to too little data. This article presents a successful design-oriented case study at a medium-sized company where these conditions are present. In this study, future demand forecasts are generated from historical demand data for products at the material-number level using a gradient boosting machine (GBM). An improvement of 15% over the status quo (measured by the root mean squared error) could be achieved with rather simple techniques. The motivation, the method, and the first results are presented. Finally, challenges are addressed from which practitioners can derive lessons and impulses for their own projects.
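A down-scaled sketch of the approach can make it concrete: forecast demand from lag features with gradient boosting and compare the RMSE against a naive baseline. The series, features, and hyperparameters below are illustrative assumptions, not the company's data; a minimal boosting loop over depth-1 stumps stands in for a production GBM library.

```python
import math
import random

# Synthetic daily demand with weekly seasonality (assumption for illustration).
random.seed(1)
T = 300
demand = [100 + 20 * math.sin(2 * math.pi * t / 7) + random.gauss(0, 5)
          for t in range(T)]
X = [[demand[t - 1], t % 7] for t in range(1, T)]   # lag-1 and day-of-week
y = demand[1:]
split = 250
Xtr, ytr, Xte, yte = X[:split], y[:split], X[split:], y[split:]

def fit_stump(X, r):
    """Best single-feature threshold split minimizing squared error of r."""
    best = None
    for j in range(len(X[0])):
        for thr in sorted({x[j] for x in X}):
            left = [ri for xi, ri in zip(X, r) if xi[j] <= thr]
            right = [ri for xi, ri in zip(X, r) if xi[j] > thr]
            if not left or not right:
                continue
            lm, rm = sum(left) / len(left), sum(right) / len(right)
            sse = (sum((ri - lm) ** 2 for ri in left)
                   + sum((ri - rm) ** 2 for ri in right))
            if best is None or sse < best[0]:
                best = (sse, j, thr, lm, rm)
    _, j, thr, lm, rm = best
    return lambda x: lm if x[j] <= thr else rm

def fit_gbm(X, y, rounds=30, lr=0.3):
    """Gradient boosting for squared loss: fit stumps to residuals."""
    f0 = sum(y) / len(y)
    stumps, pred = [], [f0] * len(y)
    for _ in range(rounds):
        r = [yi - pi for yi, pi in zip(y, pred)]
        s = fit_stump(X, r)
        stumps.append(s)
        pred = [pi + lr * s(xi) for pi, xi in zip(pred, X)]
    return lambda x: f0 + lr * sum(s(x) for s in stumps)

def rmse(pred, true):
    return math.sqrt(sum((p - t) ** 2 for p, t in zip(pred, true)) / len(true))

model = fit_gbm(Xtr, ytr)
rmse_gbm = rmse([model(x) for x in Xte], yte)
rmse_mean = rmse([sum(ytr) / len(ytr)] * len(yte), yte)  # naive baseline
assert rmse_gbm < rmse_mean    # boosting beats the status-quo point estimate
```

The article's 15% improvement refers to its real data; here the RMSE comparison only demonstrates the evaluation method.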
Compared to the automotive sector, where automation is the rule, in many other, less standardized sectors automation is still the exception. This could soon hurt the productivity of industrialized countries, where unemployment is low and the population is aging. Phenomena like the recent downturn in productivity due to lockdowns and social distancing during the COVID-19 pandemic only add to the problem. For these reasons, the relevance of, motivation for, and intention toward more automation in less standardized sectors have probably never been higher. However, available statistics show that providers and users of technologies struggle to bring more automation into action in automation-unfriendly sectors. In this paper, we present a decision support method for investment in automation that tackles this problem: the STIC analysis. The method takes a holistic and quantitative approach, tying together technological, context-related, and economic input parameters and synthesizing them into a final economic indicator. Thanks to the modelling of these parameters, it is possible to identify the technological and/or process adjustments that would have the highest impact on the efficiency of the automation, thereby delivering value for both technology users and technology providers.
The digital twin concept has long been known for asset monitoring in industry, a clear example being the automotive sector. Recently, there has also been significant interest in the application of digital twins in healthcare, especially in genomics, in what is known as precision medicine. This work focuses on another medical specialty where digital twins can be applied: sleep medicine. However, there is still great controversy about the fundamentals of digital twins, such as what the concept is based on and how it can be included in healthcare effectively and sustainably. This article reviews digital twins and their role so far in what is known as personalized medicine. In addition, a series of steps is presented for a possible implementation of a digital twin for a patient suffering from sleep disorders. For this, artificial intelligence techniques, clinical data management, and possible solutions for explaining the results derived from artificial intelligence models are addressed.
With significant advancements in digital technologies, firms find themselves competing in an increasingly dynamic business environment. Therefore, the logic of business decisions is based on the agility to respond to emerging trends in a proactive way. By contrast, traditional IT governance (ITG) frameworks rely on hierarchy and standardized mechanisms to ensure better business/IT alignment. This conflict leads to a call for an ambidextrous governance, in which firms alternate between stability and agility in their ITG mechanisms. Accordingly, this research aims to explore how agility might be integrated in ITG. A quantitative research strategy is implemented to explore the impact of agility on the causal relationship among ITG, business/IT alignment, and firm performance. The results show that the integration of agile ITG mechanisms contributes significantly to the explanation of business/IT alignment. As such, firms need to develop a dual governance model powered by traditional and agile ITG mechanisms.
Since project managers still face problems in managing interorganizational R&D projects, managing these projects in a project-culture-aware manner is a promising approach. However, an important prerequisite for project-culture-aware management is that the individual organizations involved pursue a collaborative strategy. Therefore, our article provides a conceptual approach, including a new tool, the Collaborative Iron Triangle, which supports both project sponsors and managers in different phases of the collaboration process in pursuing a collaborative strategy in interorganizational R&D projects.
This article explores the question of how sustainability and labour law are interrelated. The modern world of work is characterised by the growing social and environmental responsibility of companies. Especially in the post-COVID era, sustainability also plays an increasingly important role in the corporate context, which is also noticeable in the so-called ‘war for talent’. Achieving personal career goals is no longer enough for employees today. Corporate values and in particular the so-called ESG criteria (Environment, Social, Governance) are thus also becoming increasingly important in the employment relationship and in corporate reporting requirements. In terms of social sustainability, labour law instruments can, for example, promote the creation of a discrimination-free working environment, the introduction of flexible working time models or the protection of whistleblowers. From an ecological perspective, labour regulations are also suitable for implementing ‘green mobility’ and other measures to reduce companies’ ecological footprints. Working from home, which experienced a huge boom during the COVID-19 pandemic, is also sustainable, especially from an ecological point of view. Appropriate consideration of these sustainable work tools in future corporate social responsibility (CSR) strategies not only creates a competitive advantage but can also be beneficial in recruitment.
Database management systems and K/V-stores operate on updatable datasets that massively exceed the size of available main memory. Tree-based K/V storage management structures have become particularly popular in storage engines. B+-Trees [1, 4] allow constant search performance, but write-heavy workloads result in inefficient write patterns to secondary storage devices and poor performance characteristics. LSM-Trees [16, 23] overcome this issue by horizontally partitioning fractions of data small enough to fully reside in main memory, but they require frequent maintenance to sustain search performance.
Firstly, we propose Multi-Version Partitioned B-Trees (MV-PBT) as the sole storage and index management structure in key-sorted storage engines like K/V-stores. Secondly, we compare MV-PBT against LSM-Trees. The logical horizontal partitioning in MV-PBT allows leveraging recent advances in modern B+-Tree techniques in a small, transparent, memory-resident portion of the structure. Its structural properties sustain steady read performance, yield efficient write patterns, and reduce write amplification.
We integrated MV-PBT into the WiredTiger [15] K/V storage engine. MV-PBT offers up to 2× higher steady throughput compared to LSM-Trees, and several orders of magnitude higher compared to B+-Trees, in a YCSB [5] workload.
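The shared idea behind LSM-Trees and the memory-resident portion of MV-PBT, writes absorbed by a small mutable partition that is later sealed as an immutable sorted run, can be sketched generically. This toy store is an illustration of horizontal partitioning with multi-version shadowing, not the MV-PBT data structure itself; capacities and names are assumptions.

```python
class PartitionedStore:
    """Toy horizontally partitioned K/V store (illustrative only)."""

    def __init__(self, partition_capacity=4):
        self.capacity = partition_capacity
        self.active = {}          # small, memory-resident mutable partition
        self.sealed = []          # immutable sorted runs, newest first

    def put(self, key, value):
        self.active[key] = value
        if len(self.active) >= self.capacity:
            # Sealing produces one sequential write pattern instead of
            # many random in-place updates (the B+-Tree pain point).
            self.sealed.insert(0, sorted(self.active.items()))
            self.active = {}

    def get(self, key):
        if key in self.active:    # newest version always wins
            return self.active[key]
        for run in self.sealed:
            for k, v in run:
                if k == key:
                    return v
        return None

store = PartitionedStore()
for i in range(10):
    store.put(f"k{i}", i)
store.put("k1", 111)              # newer version shadows the sealed one
assert store.get("k1") == 111
assert store.get("k9") == 9
```

The maintenance cost the abstract mentions shows up here as the ever-growing list of sealed runs a read must probe; MV-PBT's contribution is sustaining read performance without LSM-style compaction pressure.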
Personalized remote healthcare monitoring is in continuous development due to technological improvements in sensors and wearable electronic systems. This work presents a state of the art of research on wearable sensors for healthcare applications. Furthermore, it surveys the wearable devices, chest and wrist bands, and smartwatches available on the market for health and sport monitoring. Many activity trackers are commercially available; their prices are continuously falling and their performance is improving, but commercial devices do not provide raw data and are therefore not useful for research purposes.
The respiratory rate is a vital sign that can indicate breathing illness. To obtain it, the mechanical oscillations of the patient's body arising from chest movements must be analyzed. An inappropriate holder on which the sensor is mounted, or an inappropriate sensor position, are among the external factors that should be minimized during signal registration. This paper considers a non-invasive device placed under the bed mattress that evaluates the respiratory rate. The aim of the work is the development of an accelerometer sensor holder for this system. Normal and deep breathing signals were analyzed, corresponding to the relaxed state and to taking deep breaths. The evaluation criterion for the holder's model is its influence on the patient's respiratory signal amplitude in each state. As a result, we offer a non-invasive system for respiratory rate detection, including a mechanical component that provides the most accurate values of the respiratory rate.
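The evaluation step can be sketched as follows: the under-mattress accelerometer yields an oscillating breathing component, and the respiratory rate can be estimated by counting oscillation cycles with a hysteresis threshold so sensor noise does not produce spurious counts. Sampling rate, breathing frequency, noise level, and the threshold value are illustrative assumptions, not the paper's measured setup.

```python
import math
import random

# 60 s of synthetic breathing signal: 0.25 Hz sinusoid (15 breaths/min)
# plus small sensor noise.
fs = 50.0                       # sampling rate in Hz (assumption)
f_breath = 0.25                 # ground truth: 15 breaths per minute
random.seed(0)
signal = [math.sin(2 * math.pi * f_breath * n / fs) + random.gauss(0, 0.05)
          for n in range(int(fs * 60))]

def respiratory_rate_bpm(x, fs, hyst=0.2):
    """Count rising threshold crossings of the mean-removed signal."""
    mean = sum(x) / len(x)
    below, breaths = True, 0
    for s in (v - mean for v in x):
        if below and s > hyst:      # rising edge: one breath counted
            breaths += 1
            below = False
        elif not below and s < -hyst:
            below = True            # re-arm only after a full downswing
    return breaths / (len(x) / fs / 60.0)

rate = respiratory_rate_bpm(signal, fs)
assert 13.0 <= rate <= 17.0     # close to the true 15 breaths per minute
```

A holder that damps the signal amplitude shrinks the margin between the oscillation and the hysteresis band, which is one way the paper's evaluation criterion (amplitude influence) translates into estimation accuracy.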
A premise guaranteeing successful interdisciplinary teamwork in product design is a mutual understanding, in both the professional and academic communities, of the different design disciplines and the roles they play in the process. It appears that the open compound word 'industrial design' is open to interpretation in European education. This ambiguity has had a negative impact on the labour policies of some European countries, which have labelled some professions with incorrect names. This terminological inconsistency therefore calls for clarification within the design community. This work analyses the term 'industrial design', presents historical developments in European industrial design education, in particular in Germany and in the Netherlands, and discusses how education for the industrial design profession was positioned towards product development. This paper suggests that the causes of the observed lack of clarity about the meaning of the term 'industrial design' are of an etymological and disciplinary kind. In order to act as a bridge between the professional and academic communities, universities should create the premises for interdisciplinary collaboration between designers and engineers through standardized communication, ultimately contributing to a sustainable future in both design and engineering education.
For collision and obstacle avoidance as well as trajectory planning, robots usually generate and use a simple 2D costmap without any semantic information about the detected obstacles. A robot's path planning will thus simply adhere to an arbitrarily large safety margin around obstacles. A more optimal approach is to adjust this safety margin according to the class of an obstacle. For class prediction, an image-processing convolutional neural network can be trained. One of the problems in the development and training of any neural network is the creation of a training dataset. The first part of this work describes methods and free open-source software that allow fast generation of annotated datasets. Our pipeline can be applied to various objects and environment settings and makes it extremely easy for anyone to synthesize training data from 3D source data. We create a fully synthetic industrial environment dataset with 10k physically based rendered images and annotations. Our dataset and sources are publicly available at https://github.com/LJMP/synthetic-industrial-dataset. Subsequently, we train a convolutional neural network with our dataset for costmap safety class prediction. We analyse different class combinations and show that learning the safety classes end-to-end directly with a small dataset, instead of using a class lookup table, improves the quantity and precision of the predictions.
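The class-dependent safety margin idea can be sketched on a grid costmap: instead of one global inflation radius, each predicted obstacle class gets its own radius when blocked cells are inflated. The class names and radii below are illustrative assumptions, not the classes from the paper's dataset.

```python
# Safety margin in grid cells per predicted obstacle class (assumptions).
MARGIN = {"static_shelf": 1, "pallet_truck": 2, "human": 3}

def inflate(obstacles, width, height):
    """obstacles: list of (x, y, cls). Returns the set of blocked cells."""
    blocked = set()
    for ox, oy, cls in obstacles:
        r = MARGIN[cls]          # class-specific inflation radius
        for x in range(max(0, ox - r), min(width, ox + r + 1)):
            for y in range(max(0, oy - r), min(height, oy + r + 1)):
                blocked.add((x, y))
    return blocked

grid = inflate([(5, 5, "human"), (15, 15, "static_shelf")], 20, 20)
assert (8, 8) in grid        # human: generous 3-cell margin
assert (9, 9) not in grid    # just outside the human margin
assert (16, 16) in grid      # shelf: tight 1-cell margin
assert (17, 17) not in grid
```

The paper's contribution is upstream of this step: predicting the safety class end-to-end from images rather than via a fixed class-to-margin lookup table.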
Motivation: The aim of this project is the automatic classification of total hip endoprosthesis (THEP) components in 2D X-ray images. Revision surgeries of total hip arthroplasty (THA) are common procedures in orthopedics and trauma surgery; currently, around 400,000 procedures per year are performed in the United States (US) alone. To achieve the best possible result, preoperative planning is crucial, especially if parts of the current THEP system are to be retained.
Methods: First, a ground truth based on 76 X-ray images was created. We used an image-processing pipeline consisting of a segmentation step performed by a convolutional neural network and a classification step performed by a support vector machine (SVM). In total, 11 classes (5 cups and 6 shafts) were to be classified.
Results: The generated ground truth was of good quality even though the initial segmentation was performed by technicians. The best segmentation results were achieved using a U-Net architecture. For classification, SVM architectures performed much better than additional neural networks.
Conclusions: The overall image processing pipeline performed well, but the ground truth needs to be extended to include a broader variability of implant types and more examples per training class.
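The two-stage structure, segmentation mask reduced to features, then an SVM classifier, can be sketched at toy scale. The synthetic two-class features below (standing in for shape descriptors derived from a segmentation mask) and the minimal linear SVM trained by sub-gradient descent on the hinge loss are illustrative assumptions, not the paper's 11-class setup or its SVM implementation.

```python
import random

random.seed(0)

def make_sample(cls):
    """Synthetic shape features (e.g. width, length) per component class."""
    if cls == 0:   # compact, cup-like component
        return [random.gauss(4.0, 0.5), random.gauss(1.0, 0.2)], -1
    return [random.gauss(2.0, 0.5), random.gauss(5.0, 0.5)], +1  # shaft-like

data = [make_sample(i % 2) for i in range(200)]
train, test = data[:150], data[150:]

# Linear SVM: minimize lam/2*|w|^2 + mean hinge loss by sub-gradient descent.
w, b, lam, lr = [0.0, 0.0], 0.0, 0.01, 0.05
for _ in range(200):
    for x, y in train:
        margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
        grad_w = [lam * wi for wi in w]          # regularization term
        if margin < 1:                           # hinge is active
            grad_w = [g - y * xi for g, xi in zip(grad_w, x)]
            b += lr * y
        w = [wi - lr * g for wi, g in zip(w, grad_w)]

correct = sum(1 for x, y in test
              if y * (sum(wi * xi for wi, xi in zip(w, x)) + b) > 0)
assert correct / len(test) > 0.9   # separable toy classes are learned easily
```

In the real pipeline the U-Net supplies the mask and the features would be richer, but the division of labor, a network for pixels, an SVM for the final label, is the same.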
What might the attendee be able to do after being in your session?
Our work shows how to connect intra-operative devices via IEEE 11073 Service-oriented Device Connectivity (SDC).
Description of the Problem or Gap
Standardized device communication is essential for interoperability, availability of device data, and therefore for the intelligent operating room (OR) and arising solutions. The SDC standard was developed to make information from medical devices available in a uniform manner and enable interoperability. Existing devices are rarely SDC-capable and need interfaces to be interoperable via SDC.
Methods: What did you do to address the problem or gap?
We conceived an SDC-based architecture consisting of a service provider and a service consumer. In our concept, the service provider is connected to the medical device and capable of translating the proprietary protocol of the device into SDC and vice versa. The service consumer is used to request or send information via the SDC protocol to the service provider and can function as a uniform bidirectional interface (e.g. for displaying or controlling). This concept was demonstrated exemplarily with the Philips MX800 patient monitor, retrieving the device data (e.g. vital parameters) via SDC, and partly with the marLED X operating light of KLS Martin Group.
Results: What was the outcome(s) of what you did to address the problem or gap?
The patient monitor MX800 was connected via LAN to a Raspberry Pi (RPi), on which the service provider runs. The Python script on the RPi establishes a connection to the monitor and translates incoming and outgoing messages between the proprietary protocol and SDC, to and from the service consumer. The service consumer runs on a laptop and acts as a simulation of the different kinds of systems that want to obtain vital parameters or other information from the patient monitor. The operating light marLED X was connected to an RPi via USB-to-RS232. A Python script on the RPi establishes a connection to the light and makes it possible, via proprietary commands, to obtain information about the light (e.g. its status) and to control it (e.g. toggle the light, increase the intensity). A translation to SDC is not yet integrated.
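The provider/consumer split described above can be sketched structurally. This is NOT real IEEE 11073 SDC messaging: the frame format, metric names, and class names below are invented stand-ins that only illustrate where the protocol translation lives, namely entirely inside the provider, so the consumer never touches a vendor protocol.

```python
def parse_proprietary_frame(frame):
    """Translate a made-up vendor frame, e.g. b'HR=72;SPO2=98', to a dict."""
    out = {}
    for part in frame.decode().split(";"):
        key, value = part.split("=")
        out[key] = float(value)
    return out

class ServiceProvider:
    """Wraps the device protocol and exposes a uniform representation."""

    def __init__(self, read_frame):
        self.read_frame = read_frame   # callable standing in for LAN/RS232 I/O

    def get_metrics(self):
        return parse_proprietary_frame(self.read_frame())

class ServiceConsumer:
    """Talks only to the uniform interface, never to the device protocol."""

    def __init__(self, provider):
        self.provider = provider

    def vital(self, name):
        return self.provider.get_metrics()[name]

monitor = ServiceProvider(lambda: b"HR=72;SPO2=98")   # simulated device feed
consumer = ServiceConsumer(monitor)
assert consumer.vital("HR") == 72.0
```

Swapping the lambda for a real socket or serial reader, and the dict for SDC metric reports, is exactly the extension point the RPi scripts occupy.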
Discussion of Results
Our practical implementation shows that medical devices can be accessed via external connections to obtain device data and control the device via commands. The example SDC implementation for the patient monitor MX800 makes it possible to request its data via the standardized SDC communication protocol. This is also possible for the operating light marLED X once its proprietary protocol has been analyzed so that it can be translated to and from SDC. This would allow the device to be controlled from an external system, or automatically depending on the status of the ongoing procedure. The advantage is that existing intra-operative devices can be extended by a service provider capable of translating the proprietary protocol of the device into SDC and vice versa. This enables interoperability and an intelligent OR that, for example, is aware of all devices, their status, and their data, and can use this information to optimally support the surgeons and their team (e.g. provision of information, automated documentation). This interoperability means that future innovations merely need to understand the SDC protocol instead of all vendor-dependent communication protocols.
Conclusion
Standardized device communication is essential to achieve interoperability and therefore intelligent ORs. Our contribution addresses the possibility of subsequently making medical devices SDC-capable. This may eliminate the need to understand all the different proprietary protocols when developing new innovative solutions for the OR.
Recently, blockchain-based tokens have earned an important role in fields such as the art market and online gaming. First approaches exist that adopt the potential of blockchain tokens in supply chain management to increase the transparency, visibility, automation, and disintermediation of supply chains. In this context, the tokenization of assets in supply chains refers to the practice of creating virtual representations of physical assets on the blockchain. Solutions in supply chain management based on the tokenization of assets vary in terms of application objectives, token types, asset characteristics, and the complexity of the supply chain events to be mapped on the blockchain. Currently, however, no review exists that summarizes the characteristics of blockchain-based tokens and their scope of application. This paper provides a clear terminological distinction between existing blockchain token types, distinguishing between fungible tokens, non-fungible tokens, smart non-fungible tokens, and dynamic smart non-fungible tokens. Subsequently, the token types are classified regarding their traceability, modifiability, and authorization to evaluate their suitability for mapping assets in supply chains. Given the potential of blockchain in supply chain management, the results of the review serve as a foundation for a practical guide supporting the selection of suitable token types for industrial applications.
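The review's classification can be pictured as a lookup over the three axes it names. The four token types follow the abstract; the boolean values assigned to each are illustrative simplifications of such a classification, not the paper's actual findings.

```python
# Token types from the abstract; property values are assumed for illustration.
TOKEN_TYPES = {
    "fungible":           dict(traceable=False, modifiable=False, authz=False),
    "non_fungible":       dict(traceable=True,  modifiable=False, authz=False),
    "smart_non_fungible": dict(traceable=True,  modifiable=True,  authz=False),
    "dynamic_smart_nft":  dict(traceable=True,  modifiable=True,  authz=True),
}

def suitable_for_asset_tracking(token_type):
    """Illustrative rule: a physical asset needs individual traceability."""
    return TOKEN_TYPES[token_type]["traceable"]

assert not suitable_for_asset_tracking("fungible")
assert suitable_for_asset_tracking("dynamic_smart_nft")
```

A practical guide of the kind the paper proposes would replace the single rule with requirements matching over all three axes.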
Current data-intensive systems suffer from poor scalability as they transfer massive amounts of data to the host DBMS to process it there. Novel near-data processing (NDP) DBMS architectures and smart storage can provably reduce the impact of raw data movement. However, transferring the result set of an NDP operation may itself increase data movement and thus the performance overhead. In this paper, we introduce a set of in-situ NDP result-set management techniques, such as spilling, materialization, and reuse. Our evaluation indicates a performance improvement of 1.13× to 400×.
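Two of the named techniques, materialization up to a memory budget with spilling beyond it, plus reuse of a completed result set, can be sketched generically. The budget, row format, and class below are illustrative assumptions, not the paper's storage-side implementation.

```python
import tempfile

class ResultSet:
    """Toy in-situ result set: materialize in memory, spill past a budget."""

    def __init__(self, memory_budget=3):
        self.budget = memory_budget
        self.rows = []            # materialized (in-memory) portion
        self.spill = None         # spilled portion on storage

    def append(self, row):
        if len(self.rows) >= self.budget and self.spill is None:
            self.spill = tempfile.TemporaryFile("w+")
        if self.spill is not None:
            self.spill.write(row + "\n")
        else:
            self.rows.append(row)

    def read_all(self):
        """Reuse: any later consumer reads the full, already-computed set."""
        out = list(self.rows)
        if self.spill is not None:
            self.spill.seek(0)
            out += [line.rstrip("\n") for line in self.spill]
        return out

rs = ResultSet(memory_budget=3)
for i in range(5):
    rs.append(f"row{i}")
assert rs.read_all() == [f"row{i}" for i in range(5)]
assert len(rs.rows) == 3          # only the budgeted rows stay in memory
```

Keeping the spilled portion on the device side is what makes the technique "in-situ": the full result set never has to cross to the host unless a consumer actually asks for it.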
Global, competitive markets characterised by mass customisation and rapidly changing customer requirements force major changes in production styles and the configuration of manufacturing systems. As a result, factories may need to be regularly adapted and optimised to meet short-term requirements. One way to optimise the production process is to adapt the plant layout to the current or expected order situation. To determine whether a layout change is reasonable, a model of the current layout is needed; it is used to perform simulations and, in the case of a layout change, serves as a basis for the reconfiguration process. To aid the selection of possible measurement systems, a requirements analysis was carried out to identify the important parameters for the creation of a digital shadow of a plant layout. Based on these parameters, a method is proposed for defining limit values and specifying exclusion criteria. The paper thus contributes to the development and application of systems that enable automatic synchronisation of the real layout with the digital layout.
Perforations of the tympanic membrane (TM) can occur as a result of injury or inflammation of the middle ear. These perforations can lead to conductive hearing loss (HL), where in some cases the magnitude of HL exceeds that attributable to the observed TM perforation alone. We aim with this study to better understand the effects of location and size of TM perforations on the sound transmitting properties of the middle ear.
The middle ear transfer functions (METF) of six human temporal bones (TB; freshly frozen specimens of body donors) were compared before and after perforation of the TM at different locations (anterior or posterior lower quadrant) and with different sizes (1 mm, ¼ of the TM, ½ of the TM, and full ablation). The METF were correlated with a finite element (FE) model of the middle ear, in which similar alterations were simulated.
The measured and simulated METFs exhibited frequency- and perforation-size-dependent amplitude losses at all locations and severities. In direct comparison, posterior TM perforations affected the transmission properties to a larger degree than perforations of the anterior quadrant. This could be caused by an asymmetry of the TM, where the malleus-incus complex rotates and produces larger deflections in the posterior TM half than in the anterior half. The FE model of the TM with a sealed cavity suggests that small perforations result in a decrease of TM rigidity and thus in an increase in the oscillation amplitude of the TM, mostly above 1 kHz.
The location and size of TM perforations influence the METF in a reproducible way. Correlating our data with the FE model could help to better understand the pathologic mechanisms of middle-ear diseases. If small TM perforations with uncharacteristically large HL are observed in daily clinical practice, additional middle-ear pathologies should be considered. Further investigations of the loss of TM pretension due to perforations may be informative.
Additive manufacturing has developed considerably in recent years, with process technology, machinery, and materials all being optimised. However, automated solutions covering the entire process chain are still lacking for industrial applications, even at larger production volumes in flexible manufacturing. This paper presents tools and technology for the cleaning of internal structural elements.
We present a multitask network that supports various deep neural network based pedestrian detection functions. Besides 2D and 3D human pose, it also supports body and head orientation estimation based on full-body bounding box input. This eliminates the need for explicit face recognition. We show that the performance of 3D human pose estimation and orientation estimation is comparable to the state of the art. Since very few data sets exist for 3D human pose, and in particular for body and head orientation estimation based on full-body data, we further show the benefit of simulation data for training the network. The network architecture is relatively simple, yet powerful, and easily adaptable for further research and applications.
The vast majority of state-of-the-art integrated circuits are mixed-signal chips. While the design of the digital parts of these ICs is highly automated, the design of the analog circuitry is still largely done manually; it is very time-consuming and prone to error. Among the reasons commonly given for this is the attitude of analog designers: many are convinced that human experience and intuition are needed for good analog design, which is why they distrust automated synthesis tools. This observation is accurate, but it is only a symptom of the real problem. This paper shows that the phenomenon is caused by concrete technical (and thus entirely rational) issues, which lie in the mode of operation of the optimization processes typically employed for synthesis tasks. I will show that the dilemma these optimizers create in analog design is the root cause of the low level of automation in analog design. The paper concludes with a review of proposals for automating analog design.
The objective of the project presented here is to develop and optimise an intelligent control algorithm for biogas combined heat and power plants (biogas CHPs). This is followed by a test phase at a real biogas plant, for which the algorithm is implemented in the plant control system. To assess the extent to which the control algorithm can help relieve power grids, the experiments consider not only the electrical demand of the farm where the plant is located, but also the residual load of the neighbouring power grid. The latter is based on data from the nearest substation, scaled to represent a settlement that can be co-supplied by the plant's biogas CHP. The control algorithm is integrated into the plant control system via a communication structure with a database as the central interface. A first series of experiments, in which the biogas CHP is operated according to the schedules of the intelligent control algorithm, shows promising results. Throughout the entire test series, the control algorithm reliably computes new schedules, which the CHP for the most part implements very well. In addition, it can be demonstrated that the use of the algorithm relieves the upstream power grid.
On the influence of ground and substrate on the radiation characteristics of planar spiral antennas
(2022)
The unidirectional radiation of spiral antennas mounted on a substrate requires the presence of a ground plane. In this work, we successively illustrate the impact of the dielectric material and the ground plane on the key metrics of a planar equiangular spiral antenna (PESA). For this purpose, a PESA mounted on several substrates with different dielectric properties and thicknesses is modeled and simulated. We also introduce the tertiary current that flows on the spiral arms when the antenna is backed by a ground plane.
Sleep analysis using a polysomnography system is difficult and expensive, which is why we suggest a non-invasive and unobtrusive measurement. Very few people want cables or devices attached to their bodies during sleep; the proposed approach is therefore a monitoring system that does not bother the subject. The result is a non-invasive monitoring system based on detecting pressure distribution. This system should be able to measure, through the mattress, the pressure differences that occur during a single heartbeat and during breathing. The system consists of two blocks: signal acquisition and signal processing. The whole technology should be economical enough to be affordable for every user. As a result, preprocessed data is obtained for further detailed analysis, using different filters for heartbeat and respiration detection. In the initial filtering stage, Butterworth filters are used.
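The initial Butterworth filtering stage can be sketched as follows; the sampling rate, band edges, and synthetic pressure signal are illustrative assumptions, not values from the study:

```python
# Sketch: separating a heartbeat component from a mattress-pressure signal
# with a Butterworth band-pass filter. All parameters are hypothetical.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 50.0                               # assumed sampling rate in Hz
t = np.arange(0, 30, 1 / fs)

# Synthetic pressure signal: static offset + respiration (~0.25 Hz)
# + a much weaker heartbeat component (~1.2 Hz).
signal = (2.0
          + 0.5 * np.sin(2 * np.pi * 0.25 * t)
          + 0.1 * np.sin(2 * np.pi * 1.2 * t))

# Band-pass 0.8-2.0 Hz to isolate the heartbeat; filtfilt runs the filter
# forward and backward, giving zero phase distortion.
b, a = butter(2, [0.8, 2.0], btype="bandpass", fs=fs)
heartbeat = filtfilt(b, a, signal)
```

In a full pipeline, the same design with a lower pass band (e.g. 0.1-0.5 Hz) would plausibly isolate the respiration component.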
Multi-versioning and MVCC are the foundations of many modern DBMSs. Under mixed workloads and large datasets, the creation of the transactional snapshot can become very expensive, as long-running analytical transactions may request old versions, residing on cold storage, for reasons of transactional consistency. Furthermore, analytical queries operate on cold data, stored on slow persistent storage. Due to the poor data locality, snapshot creation may cause massive data transfers and thus lower performance. Given the current trend towards computational storage and near-data processing, it has become viable to perform such operations in-storage to reduce data transfers and improve scalability. neoDBMS is a DBMS designed for near-data processing and computational storage. In this paper, we demonstrate how neoDBMS performs snapshot computation in-situ. We showcase different interactive scenarios, where neoDBMS outperforms PostgreSQL 12 by up to 5×.
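As a rough illustration of why snapshot creation involves version visibility checks at all, here is a minimal PostgreSQL-style visibility rule; the names and tuple layout are hypothetical simplifications, not the neoDBMS implementation:

```python
# Minimal sketch of MVCC snapshot visibility. A snapshot records the next
# transaction id at snapshot time and the set of transactions still active.

def committed_in_snapshot(txid, snapshot):
    """True if txid committed before the snapshot was taken: it started
    earlier than the snapshot horizon and is not still active."""
    return txid < snapshot["next_txid"] and txid not in snapshot["active"]

def version_visible(xmin, xmax, snapshot):
    """A row version is visible if its creating transaction (xmin) committed
    before the snapshot and it was not deleted (xmax) by a transaction that
    is itself visible to the snapshot."""
    if not committed_in_snapshot(xmin, snapshot):
        return False
    return xmax is None or not committed_in_snapshot(xmax, snapshot)

snap = {"next_txid": 10, "active": {7}}   # snapshot taken while txn 7 runs
```

Long-running analytical transactions keep such snapshots open, which is why they may still need old versions from cold storage; evaluating this predicate in-storage is what avoids the data transfer.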
Simulation models of the middle ear have rarely been used for diagnostic purposes due to their limited predictive ability with respect to pathologies. One big challenge is the large uncertainty and ambiguity in the choice of material parameters of the model.
Typically, the model parameters are determined by fitting simulation results to validation measurements. In a previous study, it was shown that fitting the parameters of a finite-element model using the middle-ear transfer function and various other measurable output variables from normal ears alone is not sufficient to obtain good predictive ability for pathological middle-ear conditions. However, including validation measurements of one pathological case resulted in very good predictive ability for other pathological cases as well. Although the parameter set found was plausible in all aspects, it was not yet possible to draw conclusions about its uniqueness, accuracy, or uncertainty.
To answer these questions, statistical solution approaches are used in this study. Using the Monte Carlo method, a large number of plausible model parameter sets are generated that correctly represent the normal and pathological middle-ear characteristics in terms of various output variables such as impedance, reflectance, and the umbo and stapes transfer functions. Subsequent principal component analyses (PCA) allow conclusions to be drawn about correlations, quantitative limits, and the statistical density of parameter values.
Furthermore, applying inverse PCA yields numerous plausible parameterizations of the middle-ear model. These can be used for data augmentation and for training a neural network capable of distinguishing between a normal middle ear and pathologies such as otosclerosis, malleus fixation, and disarticulation, based on objectively measurable quantities such as impedance, reflectance, and umbo velocity.
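The inverse-PCA augmentation step can be sketched with plain SVD-based PCA; the data, dimensions, and sampling strategy below are synthetic assumptions for illustration:

```python
# Sketch: fit PCA on plausible parameter sets via SVD, draw new points in
# component space, and map them back (inverse PCA) to parameter space.
import numpy as np

rng = np.random.default_rng(0)

# 200 hypothetical middle-ear parameter sets with correlated entries.
latent = rng.normal(size=(200, 2))
mixing = np.array([[1.0, 0.5, 0.2],
                   [0.0, 1.0, -0.3]])
X = latent @ mixing + np.array([3.0, 1.0, 0.5])

# PCA via SVD of the centered data matrix.
mean = X.mean(axis=0)
Xc = X - mean
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T            # coordinates in principal-component space

# Sample augmented points uniformly within the observed score ranges, then
# apply the inverse transform to obtain new plausible parameter sets.
lo, hi = scores.min(axis=0), scores.max(axis=0)
new_scores = rng.uniform(lo, hi, size=(1000, scores.shape[1]))
X_aug = new_scores @ Vt + mean
```

A real augmentation scheme would likely sample from the estimated density of the scores rather than a uniform box, but the forward/inverse mapping is the same.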
Mobile apps for sustainability in grocery shopping: increasing acceptance through gamification
(2022)
Sustainability has become an important topic in social sciences research as well as in the societal debate. Research generally indicates a high sensitivity to sustainability issues in broad parts of society; however, a change of consumption habits can hardly be observed. It can be argued that technology such as mobile apps can play an important role in fostering more sustainable behaviors and consumption habits, as such apps facilitate these behaviors, bring transparency to an opaque field, and reduce complexity. Our research hence addresses an important research gap, especially as currently existing apps lack functionalities and UX. Using a Design Science Research (DSR) approach and applying Chou's Octalysis framework, we systematically analyzed eight apps in the field of sustainability and two general gamification apps as reference points, complementing our findings with issues discussed in the literature, and identified a broad range of functionalities. This comprehensive analysis allowed us to develop an initial mockup of a potential app, which was then tested with a group of ten users in semi-structured interviews. Our findings contribute to knowledge by highlighting the importance of user experience for the acceptance of mobile apps, as well as by showcasing how gamification can contribute to sustained use of mobile apps in this specific context.
An autonomous vehicle is a robotic vehicle with decision and action capability, able to perform assigned tasks without or with minimal human intervention. Autonomous cars have been in development for many years. In 2014, the Society of Automotive Engineers (SAE International) published a classification with five levels of driving automation, level 0 corresponding to completely manual driving and level 5 to the ideal of a vehicle able to navigate entirely autonomously for all missions and in all environments. This work addresses the navigation of autonomous vehicles in general, focusing on one of the most complex scenarios of the road network: the crossing of road intersections. In this paper, the critical features of autonomous intelligent vehicles are reviewed, the associated problems are presented, and the most advanced solutions are discussed. This article aims to allow a novice in this field to understand the different facets of the localization and perception problems for autonomous vehicles.
In recent decades, tourism volumes have shown a steady, stable increase. To offer travel opportunities to all groups, it is also necessary to prepare offers for people in need of long-term care or people with disabilities. One way to improve accessibility could be digital technologies, which can help in planning as well as in carrying out trips. In the work presented, a study of barriers was first conducted, which after analysis led to the selection of technologies for a test setup. The main focus was on a mobile app with travel information and 360° tours. The evaluation results showed that both technologies can increase accessibility, but some essential aspects (such as usability, completeness, and relevance) need to be considered when implementing them.
The hearing contact lens® (HCL) is a new type of hearing aid device. One of its main components is a piezoelectric actuator. In order to evaluate and maximize the HCL's performance, a model of the HCL coupled to the middle ear was developed using a finite-element approach. The model was validated step by step, starting with the HCL alone. To validate the HCL model, vibrational measurements on the HCL were performed using a Laser Doppler vibrometer (LDV). Then, a silicone cap was placed onto the HCL to provide an interface between the HCL and the tympanic membrane of the middle-ear model, and additional LDV measurements on temporal bones were performed to validate the coupled model. The coupled model was used to evaluate the equivalent sound pressure of the HCL. Moreover, a deeper insight was gained into the contact between the HCL and the tympanic membrane and its effects on the HCL's performance. The model can be used to investigate the sensitivity of geometrical and material parameters with respect to performance measures of the HCL and to evaluate the feedback behavior.
This contribution has shown exemplary possibilities of combining different technologies to increase accuracy and efficiency in machining. This requires knowledge from different fields: machining and process technology; the design of machines, fixtures, and tools; and measurement and control technology. In addition, new business models and technologies for using and making available data and information are required.
This year's motto is "Zukunft mIT gestalten" ("Shaping the future with IT"). The contributions reflect the human-centered role of computer science in today's world. Among other things, they present research in artificial intelligence, human-machine interaction, and mixed reality, with applications in medicine, business, and society. A special highlight of the conference is the closing guest lecture by Prof. Dr. Claudia Müller-Birn on "Human-Centered Data Science".
For more than 13 years, Informatics Inside has been a fixture of the academic year at the Faculty of Informatics of Reutlingen University. The conference is organized independently by students of the master's program Human-Centered Computing and forms an important part of their scientific education. The students chose their topics themselves, and these are often questions that have accompanied them throughout their studies. They prepare the topics in the format of a scientific paper, in which content, completeness, and traceability are decisive factors. The results of this in-depth engagement with relevant application topics of computer science can be read in these proceedings. The application domains range from medicine to business to the media, addressing current questions of the human-centered use of artificial intelligence, software engineering, data analysis and communication, and digital transformation. It becomes clear that the benefit of IT solutions for people is at the heart of the event. The event's motto, "IT's Future", says it all, highlighting the relevance of computer science for all areas of life as well as for the future innovativeness and competitiveness of industry and research.
The imparting of knowledge and skills in STEM education, especially under the influence of the Covid-19 pandemic, is increasingly taking place online and through digital formats. Partially asynchronous instruction eliminates, on the one hand, the social relations in the learning process and, on the other hand, direct experience with physical objects. Current digital learning systems provide learning tools and controls that support the learning process only on a general basis, and existing methods for simulating physical objects (digital twins) are used only to a minimal extent. The following approach presents a learning system framework that enables individualized learning, including all dimensions (social and physical). Implementing a concept that uses a personalized assistance system to orchestrate the individual learning steps enables efficient and effective learning. The application of the learning system framework is exemplified by STEM education at Reutlingen University in the logistics learning factory Werk150.
Organizations that operate under uncertainty need to cultivate their ability to manage their primary resource, knowledge, accordingly. Under such conditions, organizations are required to harvest knowledge from two sources: to explore knowledge that is to be found outside the organization as well as to exploit knowledge that is contained within. In a knowledge management context, these exploitation and exploration activities have been conceptualized as knowledge ambidexterity. While ambidexterity has been studied extensively in contexts such as manufacturing or IT, the notion of knowledge ambidexterity remains scarcely addressed in current knowledge management research. This study illustrates knowledge ambidexterity and elaborates its positive impact on organizational performance. Our study furthermore answers the question of how the use of enterprise social media (ESM) can facilitate the performance effects of knowledge ambidexterity. Drawing on the theory of communication visibility, we argue that ESM (e.g., Microsoft Teams, Slack, etc.) allow employees to communicate unhindered while making these communications visible. This allows for capturing the tacit knowledge within these communications; this form of knowledge is generally hard to codify and can be a source of competitive edge. With respect to knowledge ambidexterity, ESM use can capture tacit knowledge originating from inside and outside the organization, which fosters the development of a competitive advantage and thus supports its positive effect on organizational performance. This paper contributes to IT-enabled ambidexterity research in two aspects: (1) it sheds light on knowledge ambidexterity and thereby addresses a major practical challenge for knowledge-intensive organizations, and (2) it elaborates on the effects that ESM use can have on the relationship between knowledge ambidexterity and organizational performance.
This work-in-progress paper offers a better understanding of the phenomenon of ambidexterity in a knowledge context, while providing insights on the facilitating role of ESM. Our research serves as a foundation for future empirical examinations of the concept of knowledge ambidexterity.
This paper proposes a design method for an active damping approach for LC output filters in a power stage for motor control with continuous output voltage. The power stage uses GaN-HEMTs and operates at switching frequencies between 500 kHz and 1 MHz. The active damping of the output filter is achieved by feeding back the filter inductor current through a high-pass structure. The paper discusses the impact of this feedback on the system behavior and presents the resulting design method.
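A common textbook way to formalize such inductor-current feedback, offered here only as a sketch of the general "virtual resistor" technique and not as the specific design method of the paper, is:

```latex
\[
G_{\mathrm{fb}}(s) \;=\; R_v \,\frac{s}{s + \omega_{hp}}
\]
```

For frequencies well above the high-pass corner \(\omega_{hp}\), the feedback acts like a series damping resistor \(R_v\), so the LC filter's characteristic polynomial becomes approximately

```latex
\[
L C s^2 + R_v C s + 1 = 0,
\qquad
\zeta \;\approx\; \frac{R_v}{2}\sqrt{\frac{C}{L}},
\]
```

while at low frequencies the high-pass removes the feedback and thus avoids affecting the steady-state output.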
In many cases, continuous monitoring of vital signals is required, and low intrusiveness is an important requirement. Incorporating monitoring systems into hospital or home beds could benefit patients and caregivers. The objective of this work is the definition of a measurement protocol and the creation of a data set of measurements using commercial devices and low-cost prototypes to estimate heart rate and breathing rate. The experimental data will be used to compare the results achieved by the devices and to develop algorithms for feature extraction from vital signals.
The majority of people in sub-Saharan Africa (SSA) rely on so-called "paratransit" for their mobility needs. The term refers to a large informal transport sector that runs independently of government, of which 83% comprises minibus taxis (MBTs). MBT technology is often old and contributes significantly to climate change through high carbon dioxide (CO2) emissions. Although issues related to sustainability and climate change are becoming more important worldwide, hardly any attention is given to MBTs. Converting MBTs from internal combustion engines (ICEs) to electric motors could be a possible solution. However, the existing power grid in SSA is largely based on fossil power plants and is unstable, as frequent local power blackouts show. To avoid further strain on the existing grid, it would therefore make sense to charge the electric minibus taxis (eMBTs) through a grid consisting of renewable energies. A mobility map is created via simulations with collected data points of the MBTs, and from this map the energy demand of the eMBTs is calculated. Furthermore, a region-specific photovoltaic (PV) and wind simulation is realised based on existing weather data, and once all data has been collected, a tool is developed to size the supply system for charging the eMBTs. With the help of this work, it can be determined to what extent renewable energies such as PV and wind power can support the transition from ICEs to electric engines in the MBT sector.
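The core sizing arithmetic of such a tool reduces to dividing daily charging demand by the region's specific PV yield; all numbers below are hypothetical placeholders, not results from the paper:

```python
# Back-of-the-envelope sizing of the PV capacity needed to charge a small
# eMBT fleet. Fleet size, consumption, yield, and losses are assumptions.
daily_km_per_taxi = 150      # assumed daily mileage per minibus taxi, km
kwh_per_km = 0.4             # assumed eMBT energy consumption, kWh/km
fleet_size = 10
specific_yield = 5.5         # assumed regional PV yield, kWh per kWp per day
system_losses = 0.8          # assumed derating for charging/inverter losses

daily_demand_kwh = daily_km_per_taxi * kwh_per_km * fleet_size
required_kwp = daily_demand_kwh / (specific_yield * system_losses)
```

In the actual tool, the demand side would come from the simulated mobility map and the yield side from the region-specific weather-data simulation rather than from fixed constants.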
This workshop addressed scientific research and development to acquire physiological signals, process them, and extract relevant data for further analysis. The application domains are very diverse. For example, tiredness and drowsiness are responsible for a significant percentage of road accidents. There are different approaches to monitoring driver drowsiness, ranging from the driver's steering behavior to in-depth analysis of the driver, e.g., eye tracking, blinking, yawning, or the electrocardiogram (ECG). One of the leading causes of road accidents in Egypt is that trucks, buses, cars, motorcycles, and pedestrians all share the same infrastructure. The result is more than 12,000 fatalities in road accidents every year; thousands are injured, and some suffer long-term disabilities. A similar effect can be observed in Germany for all types of vehicles. According to the Federal Statistical Office, a high percentage of accidents involving personal injury are directly or indirectly caused by drowsiness.
A different application domain is sleep monitoring: Healthy and sound sleep is a prerequisite for a rested mind and body. Both form the basis for physical and mental health. Healthy sleep is counteracted by sleep disorders, the medically diagnosed frequency of which increases sharply from the age of 40. Increasing acceptance can be promoted by monitoring vital signs during sleep over long periods through the exclusive use of noninvasive technologies. In the case of objective measurement, the vital signs are measured to calculate the sleep phases or sleep efficiency and, after applying the appropriate algorithms, to record the sleep quality. About a quarter of all Germans have the feeling of sleeping poorly. The disruptive factors include problems falling asleep or the subjective feeling that sleep is not restful. About half of those subjectively affected have consulted a doctor. Older people and people living alone are particularly affected. There is no doubt that sleep abnormalities can lead to poor performance throughout the day, physical/somatic illnesses, psychological problems, or even premature death. Prevention, early detection, and therapy support are relevant factors impacting the personal quality of life.
The presented approaches have different application domains but share common methodologies and technologies. Cross-domain thinking and application are essential to successful data acquisition and processing, whether with traditional or cutting-edge approaches.
Startups play a key role in software-based innovation. They make an important contribution to an economy’s ability to compete and innovate, and their importance will continue to grow due to increasing digitalization. However, the success of a startup depends primarily on market needs and the ability to develop a solution that is attractive enough for customers to choose. A sophisticated technical solution is usually not critical, especially in the early stages of a startup. It is not necessary to be an experienced software engineer to start a software startup. However, this can become problematic as the solution matures and software complexity increases. Based on a proposed solution for systematic software development for early-stage startups, in this paper, we present the key findings of a survey study to identify the methodological and technical priorities of software startups. Among other things, we found that requirements engineering and architecture pose challenges for startups. In addition, we found evidence that startups’ software development approaches do not tend to change over time. An early investment in a more scalable development approach could help avoid long-term software problems. To support such an investment, we propose an extended model for Entrepreneurial Software Engineering that provides a foundation for future research.
Generating synthetic data is a relevant topic in the machine learning community. As accessible data is limited, the generation of synthetic data plays a significant role in protecting patients' privacy and in providing more possibilities to train a model for classification or other machine learning tasks. In this work, some generative adversarial network (GAN) variants are discussed, and an overview is given of how GANs can be used for data generation in different fields. In addition, some common problems of GANs and possibilities for avoiding them are shown. Different methods for evaluating the generated data are also described.
Gamification is a recognized method of motivating people in various life processes and has spread to many spheres of life, including healthcare. This article proposes a system design for long-term care patients using this method. The proposed system aims to increase patient engagement in the treatment and rehabilitation process via gamification. Literature research on available and earlier proposed systems was conducted to develop a suitable system design. The primary target group includes bedridden patients and patients with a sedentary lifestyle (predominantly lying in bed). One of the main criteria for selecting a suitable option was its contactless realization for these target groups in long-term care cases. As a result, we developed a system design for hardware and software that could prevent bedsores and other health problems caused by low activity. The proposed design can be tested in hospitals, nursing homes, and rehabilitation centers.
In times of climate change and growing urbanization, the way food is produced and consumed also changes. Meanwhile, digitization is transforming farming practices, which also applies to the domestic growing of crops. More and more so-called smart home farms (SHF) are finding their way into private households. This paper conceptualizes the unique nature of enabled smart services and their underlying technology. Following an inductive interpretive approach, this study explores the antecedents of smart home farming practices. Our sample consists of eleven actual smart home farmers. We found six constructs to be of salient importance: expected outcomes related to harvesting, positive feelings, and sustainability; a combination of one's affinity for green and novel technologies; and the smartness and visibility of the enabled services. In the outlook, we present some preliminary thoughts for testing our qualitative findings.
Purpose
Artificial intelligence (AI), in particular deep learning (DL), has achieved remarkable results for medical image analysis in several applications. Yet the lack of human-like explanations of such systems is considered the principal restriction before utilizing these methods in clinical practice (Yang, Ye, & Xia, 2022).
Methods
Explainable Artificial Intelligence (XAI) provides a human-explainable and interpretable description of the “black-box” nature of DL (Gulum, Trombley, & Kantardzic, 2021). An effective XAI diagnosis generator, namely NeuroXAI (refer to Fig. 1), has been developed to extract 3D explanations from convolutional neural networks (CNN) models of brain gliomas (Zeineldin et al., 2022). By providing visual justification maps, NeuroXAI can help make DL models transparent and thus increase the trust of medical experts.
Results
NeuroXAI has been applied to two of the most widely investigated problems in brain imaging analysis, i.e., image classification and segmentation using magnetic resonance imaging (MRI). Visual attention maps of multiple XAI methods have been generated and compared for both applications, which could help to provide transparency about the performance of DL systems.
Conclusion
NeuroXAI helps to understand the prediction process of 3D CNN networks for brain glioma using human-understandable explanations. The results revealed that the investigated DL models behave in a logical, human-like manner and can systematically improve the analysis of MRI images. Due to its open architecture, ease of implementation, and scalability to new XAI methods, NeuroXAI could be utilized to assist medical professionals in the detection and diagnosis of brain tumors. The NeuroXAI code is publicly accessible at https://github.com/razeineldin/NeuroXAI.
This paper presents a toolbox in Matlab/Octave for the procedural design of analog integrated circuits. The toolbox contains all native functions required by analog designers (namely schematic generation, simulation setup and execution, integrated look-up tables, and functions for design space exploration) to capture an entire design strategy in an executable script. This script, which we call an Expert Design Plan (EDP), is capable of executing an analog circuit design fully automatically. The toolbox is integrated into an existing design flow. A bandgap reference voltage circuit is designed with this tool in less than 15 minutes.
In this work, a brushless, harmonic-excited wound-rotor synchronous machine without any auxiliary windings which can provide full torque at startup is investigated experimentally. The excitation power is transferred inductively by superimposing an additional harmonic field of different pole-pair number on top of the airgap field. This is achieved by feeding the parallel paths of the stator and rotor winding separately. A prototype for the harmonic-excited synchronous machine has been constructed and experimental results are presented to verify the concept. The main loss contributors are identified and the importance of considering core losses under harmonic excitation is discussed. A general analytical model for harmonic excited synchronous machines is proposed which enables a quick estimation of the iron core flux densities and the core losses generated by the additional harmonic currents.
We address the problem of 3D face recognition based on either 3D sensor data or on a 3D face reconstructed from a 2D face image. We focus on 3D shape representation in terms of a mesh of surface normal vectors. The first contribution of this work is an evaluation of eight different 3D face representations and their multiple combinations. An important contribution of the study is the proposed implementation, which allows these representations to be computed directly from 3D meshes instead of point clouds, enhancing their computational efficiency. Motivated by the results of the comparative evaluation, we propose a 3D face shape descriptor, named Evolutional Normal Maps, that assimilates and optimises a subset of six of these approaches. The proposed shape descriptor can be modified and tuned to suit different tasks. It is used as input for a deep convolutional network for 3D face recognition. An extensive experimental evaluation using the Bosphorus 3D Face, CASIA 3D Face and JNU-3D Face datasets shows that, compared to state-of-the-art methods, the proposed approach is better in terms of both computational cost and recognition accuracy.
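Computing normal-based descriptors directly from a mesh starts with per-face surface normals, which can be obtained from the vertex and face arrays alone; the toy geometry below is purely illustrative:

```python
# Sketch: per-face unit normals of a triangle mesh via cross products,
# computed directly from vertex and face arrays (no point cloud needed).
import numpy as np

vertices = np.array([[0.0, 0.0, 0.0],
                     [1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0],
                     [0.0, 0.0, 1.0]])
faces = np.array([[0, 1, 2],      # triangle in the xy-plane
                  [0, 1, 3]])     # triangle in the xz-plane

v0, v1, v2 = (vertices[faces[:, i]] for i in range(3))
normals = np.cross(v1 - v0, v2 - v0)                        # face normals
normals /= np.linalg.norm(normals, axis=1, keepdims=True)   # unit length
```

Per-vertex normals, as used by mesh-of-normals representations, would then follow by averaging the normals of the faces incident to each vertex.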
When wearing compressive garments, the tissue of the human body is altered in relation to its natural shape by the properties of the applied material and by the pattern construction used.
To check the fit of garments, both construction and selected materials can be virtually simulated in 3D on avatars in corresponding CAD programs before fabrication.
The software Blender allows the modelling of an avatar and to generate in respective to the different tissue zones with their specific properties to adjust them with soft body physics according to the testing of real soft tissue but the models in Blender are mainly using linear springs.
The importance of sleep for human life is enormous. It affects physical, mental, and psychological health. Therefore, it is vital to recognise sleep disorders in a timely manner in order to be able to initiate therapy. There are two methods for measuring sleep-related parameters - objective and subjective. Whether the substitution of a subjective method for an objective one is possible is investigated in this paper. Such replacement may bring several advantages, including increased comfort for the user. To answer this research question, a study was conducted in which 75 overnight recordings were evaluated. The primary purpose of this study was to compare both ways of measurement for total sleep time and sleep efficiency, which are essential parameters for, e.g., insomnia diagnosis and treatment. The evaluation results demonstrated that, on average, there are 32 minutes of difference between the two measurement methods when total sleep time is analysed. In contrast, on average, both measurement methods differ by 7.5% for sleep efficiency measurement. It should also be noted that people typically overestimate total sleep time and efficiency with the subjective method, where the perceived values are measured.
Evaluation of human-robot order picking systems considering the evolution of object detection
(2022)
The automation of intralogistic processes is a major trend, but order picking, one of the core and most cost-intensive tasks in this field, remains mostly manual due to the flexibility required during picking. Reacting to its hard physical and ergonomic strain, the automation of this process is however highly relevant. Robotic picking system would enable the automation of this process from a technical point of view, but the necessity for the system to evolve in time, due to dynamics of logistic environments, faces operations with new challenges that are hardly treated in literature. This unknown scares potential investors, hindering the application of technically feasible solutions. In this paper, a model for the evaluation of the additional cost of training of automated systems during operations is presented, that also considers the savings enabled by the system after its evolution. The proposed approach, that considers different parameters such as capacity, ergonomics and cost, is validated with a case study and discussed.
Recognition of sleep and wake states is one of the relevant parts of sleep analysis. Performing this measurement in a contactless way increases comfort for the users. We present an approach evaluating only movement and respiratory signals to achieve recognition, which can be measured non-obtrusively. The algorithm is based on multinomial logistic regression and analyses features extracted out of mentioned above signals. These features were identified and developed after performing fundamental research on characteristics of vital signals during sleep. The achieved accuracy of 87% with the Cohen’s kappa of 0.40 demonstrates the appropriateness of a chosen method and encourages continuing research on this topic.
The purpose of this paper is to examine the effects of perceived stress on traffic and road safety. One of the leading causes of stress among drivers is the feeling of having a lack of control during the driving process. Stress can result in more traffic accidents, an increase in driver errors, and an increase in traffic violations. To study this phenomenon, the Stress Perceived Questionnaire (PSQ) was used to evaluate the perceived stress while driving in a simulation. The study was conducted with participants from Germany, and they were grouped into different categories based on their emotional stability. Each participant was monitored using wearable devices that measured their instantaneous heart rate (HR). The preference for wearable devices was due to their non-intrusive and portable nature. The results of this study provide an overview of how stress can affect traffic and road safety, which can be used for future research or to implement strategies to reduce road accidents and promote traffic safety.
Glioblastomas are the most aggressive fast-growing primary brain cancer which originate in the glial cells of the brain. Accurate identification of the malignant brain tumor and its sub-regions is still one of the most challenging problems in medical image segmentation. The Brain Tumor Segmentation Challenge (BraTS) has been a popular benchmark for automatic brain glioblastomas segmentation algorithms since its initiation. In this year, BraTS 2021 challenge provides the largest multi-parametric (mpMRI) dataset of 2,000 pre-operative patients. In this paper, we propose a new aggregation of two deep learning frameworksnamely, DeepSeg and nnU-Net for automatic glioblastoma recognition in pre-operative mpMRI. Our ensemble method obtains Dice similarity scores of 92.00, 87.33, and 84.10 and Hausdorff Distances of 3.81, 8.91, and 16.02 for the enhancing tumor, tumor core, and whole tumor regions, respectively, on the BraTS 2021 validation set, ranking us among the top ten teams. These experimental findings provide evidence that it can be readily applied clinically and thereby aiding in the brain cancer prognosis, therapy planning, and therapy response monitoring. A docker image for reproducing our segmentation results is available online at (https://hub.docker.com/r/razeineldin/deepseg21).
Digital assistants like Alexa, Google Assistant or Siri have seen a large adoption over the past years. Using artificial intelligence (AI) technologies, they provide a vocal interface to physical devices as well as to digital services and have spurred an entire new ecosystem. This comprises the big tech companies themselves, but also a strongly growing community of developers that make these functionalities available via digital platforms. At present, only few research is available to understand the structure and the value creation logic of these AI-based assistant platforms and their ecosystem. This research adopts ecosystem intelligence to shed light on their structure and dynamics. It combines existing data collection methods with an automated approach that proves useful in deriving a network-based conceptual model of Amazon’s Alexa assistant platform and ecosystem. It shows that skills are a key unit of modularity in this ecosystem, which is linked to other elements such as service, data, and money flows. It also suggests that the topology of the Alexa ecosystem may be described using the criteria reflexivity, symmetry, variance, strength, and centrality of the skill coactivations. Finally, it identifies three ways to create and capture value on AI-based assistant platforms. Surprisingly only a few skills use a transactional business model by selling services and goods but many skills are complementary and provide information, configuration, and control services for other skill provider products and services. These findings provide new insights into the highly relevant ecosystems of AI-based assistant platforms, which might serve enterprises in developing their strategies in these ecosystems. They might also pave the way to a faster, data-driven approach for ecosystem intelligence.
Early exposure makes the entrepreneur: how economics education in school influences entrepreneurship
(2022)
Many countries that seek to boost their economy share the goal of promoting entrepreneurship. Whereas there is ample research on the predictors of entrepreneurship during adulthood, we know little about how pre-adulthood experience influences entrepreneurship later in life. Using a natural experiment, this paper examines whether introducing economics classes in school enhances entrepreneurial behavior in adulthood. Our difference-in-differences approach exploits curricula reforms across German states that introduced compulsory economics education classes in secondary schools. Using information on school and labor market careers for more than 10,000 individuals from 1984 to 2019, we find that the reform increases students’ entrepreneurial activities by three percentage points. Examining gender differences, we find that economics classes equally benefit female and male students. Our results advance our understanding of how pre-adulthood experiences shape individuals’ entrepreneurial behavior.
Digital twins: a meta-review on their conceptualization, application, and reference architecture
(2022)
The concept of digital twins (DTs) is receiving increasing attention in research and management practice. However, various facets around the concept are blurry, including conceptualization, application areas, and reference architectures for DTs. A review of preliminary results regarding the emerging research output on DTs is required to promote further research and implementation in organizations. To do so, this paper asks four research questions: (1) How is the concept of DTs defined? (2) Which application areas are relevant for the implementation of DTs? (3) How is a reference architecture for DTs conceptualized? and (4) Which directions are relevant for further research on DTs? With regard to research methods, we conduct a meta-review of 14 systematic literature reviews on DTs. The results yield important insights for the current state of conceptualization, application areas, reference architecture, and future research directions on DTs.
Process risks are omnipresent in the corporate world and repeatedly present organizations with the challenge of how to deal with these risks. Efforts in trying to analyze and prevent these risks are costly and require many resources, which do not always bring the desired added value. The goal of this work is to determine how a benefit-oriented resource allocation can be made for risk-oriented process management. For this purpose, the following research question is posed: "How can systematic prioritization decisions regarding risk-oriented process management be made?” To answer it, an evaluation procedure is developed which assesses processes based on their characteristics regarding potential risk disposition as well as entrepreneurial relevance. For this purpose, requirements for such a procedure are first collected and used to define selection criteria for it. After the detailed analysis of known selection and evaluation procedures, one of them is selected and used for further development. Next steps include the definition of relevant criteria for the evaluation of the processes by examining process characteristics regarding their suitability for process evaluation. The focus here lies on characteristics that provide indications of the risk disposition and business relevance of processes. The result of this approach is a scoring model with a criteria catalog consisting of 15 criteria according to which a process is evaluated. The evaluation result is presented both numerically and in a matrix. This enables the comparison of several processes and a derived prioritization of those for a more in-depth risk analysis. The application of this approach will ensure a benefit-oriented allocation of resources in the management of process risks and increased process reliability.
Especially, if the potential of technical and organizational measures for ergonomic workplace design is limited, exoskeletons can be considered as innovative ergonomic aids to reduce the physical workload of workers. Recent scientific findings from ergonomic analyses with and without exoskeletons are indicating that strain reduction can be achieved, particularly at workplaces with lifting, holding, and carrying processes. Currently, a work system design method is under development incorporating criteria and characteristics for the design of work systems in which a human worker is supported by an exoskeleton. Based on the properties of common passive and active exoskeletons, factors influencing the human on which an exoskeleton can have a positive or negative effect (e.g. additional weight) were derived. The method will be validated by the conceptualization and setup of several work system demonstrators at Werk150, the factory of ESB Business School on campus of Reutlingen University, to prove the positive ergonomic effect on humans and the supporting process to choose the suitable exoskeleton. The developed method and demonstrators enable the user to experience the positive ergonomic effects of exoskeletal support in lifting, holding and carrying processes in logistics and production. The new work system design method will contribute to the fact that employees can pursue their professional activity longer without substantial injuries or can be used more flexibly at different work stations. Also new work concepts, strategies and scenarios are opened up to reduce the risk of occupational accidents and to promote the compatibility of work for employees. A training module is being developed and evaluated with participants from industry and master students to build up competence.
Determination of accelerometer sensor position for respiration rate detection: initial research
(2022)
Continuous monitoring of a patient's vital signs is essential in many chronic illnesses. The respiratory rate (RR) is one of the vital signs indicating breathing diseases. This article proposes the initial investigation for determining the accelerometric sensor position of a non-invasive and unobtrusive respiratory rate monitoring system. This research aims to determine the sensor position in relation to the patient, which can provide the most accurate values of the mentioned physiological parameter. In order to achieve the result, the particular system setup, including a mechanical sensor holder construction was used. The breathing signals from 5 participants were analyzed corresponding to the relaxed state. The main criterion for selecting a suitable sensor position was each patient's average acceleration amplitude excursion, which corresponds to the respiratory signal. As a result, we provided one more defined important parameter for the considered system, which was not determined before.
Sleep is essential to existence, much like air, water, and food, as we spend nearly one-third of our time sleeping. Poor sleep quality or disturbed sleep causes daytime solemnity, which worsens daytime activities' mental and physical qualities and raises the risk of accidents. With advancements in sensor and communication technology, sleep monitoring is moving out of specialized clinics and into our everyday homes. It is possible to extract data from traditional overnight polysomnographic recordings using more basic tools and straightforward techniques. Ballistocardiogram is an unobtrusive, non-invasive, simple, and low-cost technique for measuring cardiorespiratory parameters. In this work, we present a sensor board interface to facilitate the communication between force sensitive resistor sensor and an embedded system to provide a high-performing prototype with an efficient signal-to-noise ratio. We have utilized a multi-physical-layer approach to locate each layer on top of another, yet supporting a low-cost, compact design with easy deployment under the bed frame.
Der relative Vorteil von Heim- gegenüber Auswärtsteams im Sport - der sogenannte Heimvorteil - ist in mehreren Studien belegt (z.B. Nevill et al., 2002; Jamieson, 2010). Als theoretisch dem Heimvorteil zugrundeliegende Faktoren gelten u.a. folgende: die Zuschauer (durch ihre motivierende Wirkung auf Spieler oder beeinflussende Wirkung auf Schiedsrichter), Reisefaktoren (z.B. die Entfernung bzw. Dauer der Reise und die damit einhergehende Erschöpfung der Spieler) und die Vertrautheit der Heimmannschaft mit der Umgebung (z.B. die Vertrautheit mit dem Stadion und dem Spieluntergrund) (Courneya & Carron, 1992; Nevill et al., 2002). Durch die während der COVID-19-Pandemie stattfindenden Spiele ohne Zuschauer (Geisterspiele) lässt sich erstmals durch ein natürliches Experiment der Einfluss von Zuschauern auf den Heimvorteil betrachten. Ein Überblick über die Studien, die den Heimvorteil in verschiedenen Fußballligen während der pandemiebedingten Geisterspiele untersuchen, findet sich in Leitner et al. (2022).
There have been substantial research efforts for algorithms to improve continuous and automated assessment of various health-related questions in recent years. This paper addresses the deployment gap between those improving algorithms and their usability in care and mobile health applications. In practice, most algorithms require significant and founded technical knowledge to be deployed at home or support healthcare professionals. Therefore, the digital participation of persons in need of health care professionals lacks a usable interface to use the current technological advances. In this paper, we propose applying algorithms taken from research as web-based microservices following the common approach of a RESTful service to bridge the gap and make algorithms accessible to caregivers and patients without technical knowledge and extended hardware capabilities. We address implementation details, interpretation and realization of guidelines, and privacy concerns using our self-implemented example. Also, we address further usability guidelines and our approach to those.
Production systems are becoming increasingly complex, which means that the main task of industrial maintenance, ensuring the technical availability of a production system, is also becoming increasingly difficult. The previous focus of maintenance efforts on individual machines must give way to a holistic view encompassing the whole production system. Against this background, the technical availability of a production system must be redefined. The aim of this publication is to present different definition approaches of production systems’ availability and to demonstrate the effects of random machine failures on the key figures considering the complexity of the production system using a discrete event simulation.
There is still a great reliance on human expert knowledge during the analog integrated circuit sizing design phase due to its complexity and scale, with the result that there is a very low level of automation associated with it. Current research shows that reinforcement learning is a promising approach for addressing this issue. Similarly, it has been shown that the convergence of conventional optimization approaches can be improved by transforming the design space from the geometrical domain into the electrical domain. Here, this design space transformation is employed as an alternative action space for deep reinforcement learning agents. The presented approach is based entirely on reinforcement learning, whereby agents are trained in the craft of analog circuit sizing without explicit expert guidance. After training and evaluating agents on circuits of varying complexity, their behavior when confronted with a different technology, is examined, showing the applicability, feasibility as well as transferability of this approach.
The Fourteenth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2022), held between May 22 – 26, 2022, continued a series of international events covering a large spectrum of topics related to advances in fundamentals on databases, evolution of relation between databases and other domains, data base technologies and content processing, as well as specifics in applications domains databases.
Advances in different technologies and domains related to databases triggered substantial improvements for content processing, information indexing, and data, process and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of the XML adoption.
High-speed communications and computations, large storage capacities, and load-balancing for distributed databases access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms and advanced indexing methods.
Evolution on e-business, ehealth and telemedicine, bioinformatics, finance and marketing, geographical positioning systems put pressure on database communities to push the ‘de facto’ methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
Context: Nowadays the market environment is characterized by high uncertainties due to high market dynamics, confronting companies with new challenges in creating and updating product roadmaps. Most companies are still using traditional approaches which typically fail in such environments. Therefore, companies are seeking opportunities for new product roadmapping approaches.
Objective: This paper presents good practices to support companies better understand what factors are required to conduct a successful product roadmapping in a dynamic and uncertain market environment.
Method: Based on a grey literature review, essential aspects for conducting product roadmapping in a dynamic and uncertain market environment were identified. Expert workshops were then held with two researchers and three practitioners to develop best practices and the proposed approach for an outcome-driven roadmap. These results were then given to another set of practitioners and their perceptions were gathered through interviews.
Results: The study results in the development of 9 good practices that provide practitioners with insights into what aspects are crucial for product roadmapping in a dynamic and uncertain market environment. Moreover, we propose an approach to product roadmapping that includes providing a flexible structure and focusing on delivering value to the customer and the business. To ensure the latter, this approach consists of the main items outcome hypothesis, validated outcomes, and discovered outputs.
The blockchain technology represents a decentralized database that stores information securely in immutable data blocks. Regarding supply chain management, these characteristics offer potentials in increasing supply chain transparency, visibility, automation, and efficiency. In this context, first token-based mapping approaches exist to transfer certain manufacturing processes to the blockchain, such as the creation or assembly of parts as well as their transfer of ownership. However, the decentralized and immutable structure of blockchain technology also creates challenges when applying these token-based approaches to dynamic manufacturing processes. As a first step, this paper investigates existing mapping approaches and exemplifies weaknesses regarding their suitability for products with changeable configurations. Secondly, a concept is proposed to overcome these weaknesses by introducing logically coupled tokens embedded into a flexible smart contract structure. Finally, a concept for a token-based architecture is introduced to map manufacturing processes of products with changeable configurations.
Public transport causes in rural areas high costs per passenger and kilometer as the frequency of scheduled busses is low and therefore, many people avoid using public transport. With the trend of moving from urban regions to countryside individual traffic will further increase. To tackle issues of emissions, mobility for young and elderly people and provide economically meaningful public transport a new concept was elaborated in Germany. This consists of (partly) autonomous shuttle busses which are remote controlled. For implementation rural districts of Germany have worked together and set up a three-phase plan consisting of a project with public funding, a highly frequent used pilot region and industrial partners with the commitment and possibilities for necessary investments. The concept promises economical value with respect to installation, service and maintaining costs, it leads to lower barriers for public transport of young and elderly people and ultimately reduces emissions and congestions.
In this paper we presented the results of the workshop with the topic: Co-creation in citizen science (CS) for the development of climate adaptation measurements - Which success factors promote, and which barriers hinder a fruitful collaboration and co-creation process between scientists and volunteers? Under consideration of social, motivational, technical/technological and legal factors., which took place at the CitSci2022. We underlined the mentioned factors in the work with scientific literature. Our findings suggest that a clear communication strategy of goals and how citizen scientists can contribute to the project are important. In addition, they have to feel include and that the contribution makes a difference. To achieve this, it is critical to present the results to the citizen scientists. Also, the relationship between scientist and citizen scientists are essential to keep the citizen scientists engaged. Notification of meetings and events needs to be made well in advance and should be scheduled on the attendees' leisure time. The citizen scientists should be especially supported in technical questions. As a result, they feel appreciated and remain part of the project. For legal factors the current General Data Protection Regulation was considered important by the participants of the workshop. For the further research we try to address the individual points and first of all to improve our communication with the citizen scientist about the project goals and how they can contribute. In addition, we should better share the achieved results.
Class Phi2 amplifier using GaN HEMTs at 13.56MHz with tuned transformer for wireless power transfer
(2022)
This paper discusses a design procedure of a wireless power transfer system at a RF switching frequency of 13.56MHz. The wireless power transfer amplifier uses GaN HEMTs in aClass phi2 topology and is designed in order to achieve high efficiency and high power density. A design method for the load over a certain bandwidth is presented for a transformer with its tuning network.
Even though near-data processing (NDP) can provably reduce data transfers and increase performance, current NDP is solely utilized in read-only settings. Slow or tedious to implement synchronization and invalidation mechanisms between host and smart storage make NDP support for data-intensive update operations difficult. In this paper, we introduce a low-latency cache-coherent shared lock table for update NDP settings in disaggregated memory environments. It utilizes the novel CCIX interconnect technology and is integrated in neoDBMS, a near-data processing DBMS for smart storage. Our evaluation indicates end-to-end lock latencies of ∼80-100ns and robust performance under contention.
The energy turnaround, digitalization and decreasing revenues forces enterprises in the energy domain to develop new business models. Following a Design Science Research approach, we showed in two action research projects that businesses models in the energy domain result in complex ecosystems with multiple actors. Additionally, we identified that municipal utilities have problems with the systematic development of business models. In order to solve the problem, we captured together with the partners of the enterprises the requirements in a second phase. Further we developed a method which consist of the following components: Method for the creative development of a new business model in form of a Business Model Canvas (BMC). A mapping between the e3Value ontology and the BMC for modelling a business ecosystem. The Business Model Configurator (BMConfig) prototype for modelling and simulating the e3Value-Ontology. The Business model can be quantified and analyzed for its viability. We demonstrate the feasibility of our approach in business model of a power community.
Job advertisements are important means of communicating role expectations for management accountants to the labor market. They provide information about which roles of management accountants are sought by companies or which roles are expected. However, which roles are communicated in job advertisements is unknown so far. Using a large sample of 889 job ads and a text-mining approach, we show an apparent mix of different role types with a strong focus on a rather classic role: the watchdog role. However, individuals with business partner characteristics are more often sought for leadership positions or in family businesses and small and medium-sized enterprises (SMEs). The results challenge the current role discussion for management accountants as business partners in practice and some academic fields.
Die bedarfsgerechte Steuerung dezentraler thermischer Energiesysteme, wie Kraft-Wärme-Kopplungs- (KWK-) Anlagen und Wärmepumpen, kann einen entscheidenden Beitrag zur Deckung bzw. Reduktion der Residuallast leisten und so für eine Verringerung der konventionellen Reststromversorgung und den damit einhergehenden Treibhausgasemissionen sorgen. Dafür wurde an der Hochschule Reutlingen in mehrjähriger Forschungsarbeit ein prognosebasierter Steuerungsalgorithmus entwickelt. Gegenstand dieses Beitrags bilden neben der Vorstellung eben jenes Steuerungsalgorithmus auch dessen praktische Umsetzungsvarianten: Eine auf einer speicherprogrammierbaren Steuerung (SPS) rein lokal ausführbare Version sowie eine Webservice-Anwendung für den parallelen Betrieb mehrerer Anlagen – ausgehend von einem zentralen Server. Erprobungen am KWK-Prüfstand der Hochschule Reutlingen bestätigen die zuverlässige Funktionsweise des Algorithmus in den verschiedenen Umsetzungsvarianten. Gleichzeitig wird der Vorteil der bedarfsgerechten Steuerung gegenüber dem, insbesondere im Mikro-KWK-Bereich standardmäßig vorliegenden, wärmegeführten Betrieb in Form einer Steigerung der Eigenstromdeckung von bis zu 27 % aufgezeigt. Neben der bedarfsgerechten Steuerung bedient der entwickelte Algorithmus zudem noch ein weiteres Anwendungsgebiet: Den vorhersagbaren KWK-Betrieb, der beispielsweise in Form täglicher Einspeiseprognose im Rahmen des Redispatch 2.0 eingefordert wird. Die Vorhersage des KWK-Betriebs ist dabei auf zwei Weisen möglich: Als erste Option kann der wärmegeführte Betrieb direkt über den Algorithmus abgebildet und prognostiziert werden. Eine andere Möglichkeit stellt wiederum die bedarfsgerechte Steuerung der Anlage dar; der berechnete optimale Fahrplan entspricht dabei gleichzeitig der Betriebsprognose des KWK-Geräts. Damit ist der entwickelte Steuerungsalgorithmus in der Lage, auf unterschiedliche Weisen zum Gelingen der Energiewende beizutragen.
According to several surveys and statistics, the great majority of companies previously not accustomed to automation are piloting solutions to automate business processes. Those accustomed to automation also attempt to introduce more of it, focusing on automation-unfriendly processes that remained manual. However, when the decision on what and whether to automate is not trivial for evident reasons, even industry leaders may get stuck on an overwhelming question: where to begin automating? The question remains too often unanswered as state-of-the-art methods fail to consider the whole picture. This paper introduces a holistic approach to the decision-making for investments in automation. The method supports the iterative analysis and evaluation of operative processes, providing tools for a quantitative approach to the decision-making. Thanks to the method, a large pool of processes can be first considered and then filtered out in order to select the one that yields the best value for the automation in the specific context. After introducing the method, a case study is reported for validation before the discussion.
Switched reluctance motors are particularly attractive due to their simple structure. The control of this machine type requires knowledge of the instants at which to switch the currents in the motor phases in an appropriate sequence. These switching instants are determined either from a position sensor or from signals generated by a sensorless method. A very simple sensorless method uses the switching frequency of the hysteresis controllers employed for phase current control. This paper presents, first, an automatic commissioning procedure for this sensorless method and, second, a startup procedure, thus advancing the approach toward industrial application.
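The physical effect behind the sensorless method can be illustrated with a toy model: a hysteresis (bang-bang) current controller switches faster when the phase inductance is low, so the switching frequency carries position information. The R-L phase model and all numeric values below are simplifying assumptions, not the commissioning method from the paper.

```python
# Toy hysteresis current controller on a series R-L phase model:
# count how often the controller toggles within t_end seconds.
# A lower inductance (unaligned rotor position) yields faster current
# slopes and therefore a higher switching frequency.

def hysteresis_switch_count(inductance, i_ref=5.0, band=0.5,
                            v_dc=24.0, r=1.0, dt=1e-5, t_end=0.05):
    i, on, switches = 0.0, True, 0
    for _ in range(int(t_end / dt)):
        v = v_dc if on else 0.0
        i += (v - r * i) / inductance * dt   # forward-Euler R-L dynamics
        if on and i >= i_ref + band:
            on, switches = False, switches + 1
        elif not on and i <= i_ref - band:
            on, switches = True, switches + 1
    return switches
```

Running the model with a small versus a large inductance shows the frequency difference that the sensorless scheme exploits.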
Physicians in interventional radiology are exposed to high physical stress. To avoid negative long-term effects resulting from unergonomic working conditions, we demonstrated the feasibility of a system that, based on the Azure Kinect camera, gives feedback about unergonomic situations arising during an intervention. The overall feasibility of the approach could be shown.
Forecasting intermittent demand time series is a challenging business problem, and companies have difficulties with this particular form of demand pattern. On the one hand, it is characterized by many non-demand periods, so classical statistical forecasting algorithms, such as ARIMA, work only to a limited extent. On the other hand, companies often cannot meet the requirements of good forecasting models, such as providing sufficient training data. The recent major advances of artificial intelligence in applications are largely based on transfer learning. In this paper, we investigate whether this method, originating from computer vision, can improve the forecasting quality of intermittent demand time series using deep learning models. Our empirical results show that, in total, transfer learning can reduce the mean squared error by 65 percent. We also show that especially short (65 percent reduction) and medium-long (91 percent reduction) time series benefit from this approach.
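The transfer-learning workflow — pre-train on a long, data-rich series, then fine-tune on a short target series — can be sketched with a tiny linear autoregressive model in place of a deep network. The data, lag count, and learning rate are illustrative assumptions, not the paper's models or datasets.

```python
# Conceptual transfer-learning sketch with a linear AR model trained
# by gradient descent. w0 carries pre-trained weights into fine-tuning.

def fit_ar(series, lags=4, epochs=200, lr=0.01, w0=None):
    rows = [series[i:i + lags] for i in range(len(series) - lags)]
    targets = series[lags:]
    w = list(w0) if w0 else [0.0] * lags
    n = len(targets)
    for _ in range(epochs):
        grad = [0.0] * lags
        for x, y in zip(rows, targets):
            err = sum(wi * xi for wi, xi in zip(w, x)) - y
            for j in range(lags):
                grad[j] += err * x[j] / n
        w = [wi - lr * g for wi, g in zip(w, grad)]
    return w

# Intermittent demand: one spike every four periods.
source = [5.0 if t % 4 == 0 else 0.0 for t in range(200)]   # data-rich
w_src = fit_ar(source)                                      # pre-training

target = [4.0 if t % 4 == 0 else 0.0 for t in range(24)]    # data-scarce
w_tuned = fit_ar(target, epochs=20, w0=w_src)               # fine-tuning
```

Because the spike period transfers between source and target, the fine-tuning step needs only a fraction of the epochs that training from scratch would require — the effect the paper measures at scale with deep models.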
The early involvement of insights gained through intelligence and data analysis is becoming increasingly important for developing new products, leading to a fundamentally different conception of product creation, development, and engineering processes that exploits the advantages a digital twin offers. We introduce a novel stage-gate process, holistically anchored in learning factories, that adopts idea generation and idea screening at an early stage, beta testing of first prototypes, technical implementation in real production scenarios, business analysis, market evaluation, pricing, service models, as well as innovative social media portals. Corresponding product modeling in the sense of sustainability, circular economy, and data analytics forecasts the product's market performance both before and after launch, with data interpretation interlinked in near real time. The digital twin represents the link between the digital model and the digital shadow. Additionally, connecting the digital twin with the product provides constantly updated operating status and process data as well as a mapping of technical properties and real-world behavior. A future networked product, able through embedded information technology to initiate and carry out its own further development, can interact with people and environments and is thus relevant to the way of life of future generations. In today's development work on this new product-creation approach, the "Werk150" is both the object of the development itself and the validation environment. In a next step, new learning modules and scenarios for master-level trainings will be derived from these findings.
Today, many scientific works use deep learning algorithms on time series to detect physiological events of interest. In sleep medicine, this is particularly relevant in detecting sleep apnea, specifically obstructive sleep apnea events. Deep learning algorithms with different architectures are used to achieve decent results in accuracy, sensitivity, etc. Although there are models that can reliably determine apnea and hypopnea events, another essential aspect to consider is the explainability of these models, i.e., why a model makes a particular decision. Another critical factor is how these deep learning models determine how severe obstructive sleep apnea is in patients based on the apnea-hypopnea index (AHI). Deep learning models trained by two approaches for AHI determination are presented in this work. The approaches vary in the data format the models are fed: full time series and window-based time series.
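The difference between the two input formats can be sketched in a few lines: the full-time-series approach feeds the whole night-long signal to the model, while the window-based approach slices it into fixed-length segments first; detected events are then aggregated into the AHI. Window length, stride, and the helper names below are assumed values for illustration only.

```python
# Sketch of window-based preprocessing and AHI aggregation.

def make_windows(signal, win_len=30, stride=15):
    """Slice a 1-D physiological signal into overlapping fixed-length
    windows, as fed to a window-based deep learning model."""
    return [signal[i:i + win_len]
            for i in range(0, len(signal) - win_len + 1, stride)]

def estimate_ahi(events_per_window, hours_of_sleep):
    """Apnea-hypopnea index: detected events per hour of sleep."""
    return sum(events_per_window) / hours_of_sleep
```

In the full-time-series variant, the model would instead consume the unsliced signal and output the event count (or AHI) directly.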
Data analysis is becoming increasingly important for pursuing organizational goals, especially in the context of Industry 4.0, where a wide variety of data is available. Numerous challenges arise here, especially when using unstructured data. However, this subject has received little attention in research so far. This research paper addresses this gap, which is relevant to both science and practice. In a study, three major challenges of using unstructured data have been identified: analytical know-how, data issues, and variety. Additionally, measures to improve the analysis of unstructured data in the Industry 4.0 context are described. The paper thereby provides empirical insights into the challenges and potential measures when analyzing unstructured data. The findings are also consolidated in a framework. Hence, the next steps of the research project and future research directions become apparent.
This paper presents a compact four-arm spiral antenna, which may be used in direction-finding applications as well as in mobile communication systems. The antenna is fed sequentially at its outer ends using a sequential phase network embedded in grounded multilayer dielectric media. Sequential rotation is applied to generate both the axial mode M1 and the conical mode M2 in the same frequency band. The antenna exhibits good radiation characteristics in the frequency band of interest.
For a long time, most discrete accelerators have been attached to host systems using various generations of the PCI Express interface. However, with its lack of support for coherency between accelerator and host caches, fine-grained interactions require frequent cache flushes, or even the use of inefficient uncached memory regions. The Cache Coherent Interconnect for Accelerators (CCIX) was the first multi-vendor standard for enabling cache-coherent host-accelerator attachments, and is already indicative of the capabilities of upcoming standards such as Compute Express Link (CXL). In our work, we compare and contrast the use of CCIX with PCIe when interfacing an ARM-based host with two generations of CCIX-enabled FPGAs. We provide both low-level throughput and latency measurements for accesses and address translation, and examine an application-level use case of CCIX for fine-grained synchronization in an FPGA-accelerated database system. We can show that especially smaller reads from the FPGA to the host can benefit from CCIX, with roughly 33% shorter latency than PCIe. Small writes to the host, though, have a latency roughly 32% higher than PCIe, since they carry a higher coherency overhead. For the database use case, the use of CCIX allowed a constant synchronization latency to be maintained even under heavy host-FPGA parallelism.
We propose a novel technique to compensate the effects of R-C / gm-C time-constant (TC) errors due to process variation in continuous-time delta-sigma modulators. Local TC error compensation factors are shifted around in the modulator loop to positions where they can be implemented efficiently with tunable circuit structures, such as current-steering digital-to-analog converters (DAC). This approach constitutes an alternative or supplement to existing compensation techniques, including capacitor or gm tuning. We apply the proposed technique to a third-order, single-bit, low-pass continuous-time delta-sigma modulator in cascaded integrator feedback structure. A feedback path tuning scheme is derived analytically and confirmed numerically using behavioral simulations. The modulator circuit was implemented in a 0.35-μm CMOS process using an active feedback coefficient tuning structure based on current-steering DACs. Post-layout simulations show that with this tuning structure, constant performance and stable operation can be obtained over a wide range of TC variation.
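The core idea of shifting compensation factors can be illustrated with a behavioral sketch of a single loop coefficient: if the R-C product deviates by a factor (1 + eps), scaling the tunable feedback DAC coefficient by the same factor restores the nominal loop gain. This first-order model and its numbers are illustrative assumptions, not the third-order modulator from the paper.

```python
# Behavioral sketch: loop gain of one integrator/feedback pair.
# A time-constant (TC) error (1 + eps) in the RC product is compensated
# by re-tuning the feedback DAC coefficient by the same factor.

def loop_gain(rc_nominal, eps, dac_coeff):
    rc_actual = rc_nominal * (1 + eps)   # process-varied time constant
    return dac_coeff / rc_actual

nominal = loop_gain(1e-6, 0.0, 1.0)      # design-value loop gain
detuned = loop_gain(1e-6, 0.3, 1.0)      # 30 % TC error, untuned
retuned = loop_gain(1e-6, 0.3, 1.3)      # DAC coefficient scaled up
```

In the paper's modulator, the corresponding factors are propagated analytically through all loop positions so that the tuning lands where current-steering DACs can implement it efficiently.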
There is a growing consensus in research and practice that value-creating networks and ecosystems are supplementing the traditional distinction between the internal firm and market perspectives. To achieve joint value in ecosystems, it is crucial to align the various interests of independently acting ecosystem actors and create a common vision. In this paper, we argue that the ecosystem-wide use of product roadmaps may help with this. To get a better understanding of how roadmapping is conducted in the dynamic ecosystem environment, we systematize the main characteristics of product roadmaps and perform a conceptual comparison with the known challenges of ecosystem management. Comparing the two concepts of ecosystems and product roadmaps, we highlight the fit between the characteristics and objectives of the roadmaps and the challenges of ecosystem management. Hence, we propose to experiment with the ecosystem-wide use of product roadmaps as well as the empirical study of the challenges emerging in the process and the associated redesign of the roadmaps.
Providing a digital infrastructure, platform technologies foster interfirm collaboration between loosely coupled companies, enabling the formation of ecosystems and building the organizational structure for value co-creation. Despite the known potential, the development of platform ecosystems creates new sources of complexity and uncertainty due to the involvement of various independent actors. For a platform ecosystem to succeed, it is essential that the platform ecosystem participants are aligned, coordinated, and given a common direction. Traditionally, product roadmaps have served these purposes during product development. A systematic mapping study was conducted to better understand how product roadmapping could be used in the dynamic environment of platform ecosystems. One result of the study is that there are hardly any concrete approaches for product roadmapping in platform ecosystems so far. However, many challenges on the topic are described in the literature from different perspectives. Based on the results of the systematic mapping study, a research agenda for product roadmapping in platform ecosystems is derived and presented.
Blockchain is a technology for the secure processing and verification of data transactions based on a distributed peer-to-peer network that uses cryptographic processes, consensus algorithms, and backward-linked blocks to make transactions virtually immutable. Within supply chain management, blockchain technology offers potential for increasing supply chain transparency, visibility, automation, and efficiency. However, its complexity requires future employees to have comprehensive knowledge of the functionality of blockchain-based applications in order to be able to apply their benefits to scenarios in supply chains and production. Learning factories represent a suitable environment that allows learners to experience new technologies and to apply them to virtual and physical processes throughout value chains. This paper presents a concept to practically transfer knowledge about the technical functionality of blockchain technology to future engineers and software developers working within supply chains and production operations, sensitizing them to the advantages of decentralized applications. First, the concept proposes methods to playfully convey immutable backward-linked blocks and the embedding of blockchain smart contracts. Subsequently, the students use this knowledge to develop blockchain-based application scenarios by means of an exemplary product in a learning factory environment. Finally, the developed solutions are implemented with the help of a prototypical decentralized application, which enables a holistic mapping of supply chain events.
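The backward-linking that the learning module conveys can be shown in a minimal sketch: each block stores the hash of its predecessor, so tampering with any earlier block invalidates every later link. The field names are assumptions for illustration; real blockchains additionally involve consensus and distribution.

```python
import hashlib
import json

# Minimal hash-linked chain: each block references its predecessor's
# SHA-256 digest, making past entries virtually immutable.

def block_hash(block):
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, data):
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": prev})
    return chain

def verify(chain):
    """True iff every block's prev_hash matches its predecessor."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))
```

Appending a few supply chain events (e.g. "shipment created", "goods received") and then modifying one of them makes `verify` fail, which is exactly the immutability property the learning concept demonstrates.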
A single-phase fixed-frequency operated power factor correction circuit with reduced switching losses is proposed. The circuit uses the combination of a boost converter with an added clamp-switch, a pulse wave shaping circuit, and a standard control IC to discharge the transistor's output capacitance prior to its turn-on. In this way, a very low-complexity control circuit implementation to reduce switching losses or even achieve complete zero-voltage switching without additional sensors is possible. Moreover, this operation method is achieved at a constant switching frequency, possibly simplifying the design of the EMI filter and the converter's inductor. Experimental test results for a 100 W prototype converter are presented to validate the feasibility of the proposed operating method and corresponding circuit structure.
Data governance has been relevant for companies for a long time. Yet, in the broad discussion on smart cities, research on data governance in particular is scant, even though data governance plays an essential role in an environment with multiple stakeholders, complex IT structures, and heterogeneous processes. Indeed, not only can a city benefit from the existing body of knowledge on data governance, but it can also make the appropriate adjustments for its digital transformation. Therefore, this literature review aims to spark research on urban data governance by providing an initial perspective for future studies. It provides a comprehensive overview of data governance and the relevant facets embedded in this strand of research. Furthermore, it provides a fundamental basis for future research on the development of an urban data governance framework.