Moving towards a sustainable future requires looking beyond the system boundaries of a single manufacturing company to promote meaningful collaborations based on circular economy principles. In this context, digital data processing technologies that connect potential collaborators are seen as enablers for making use of proven collaborative circular business models (CCBMs). Since most such data processing technologies rely on features to describe the entities involved, it is essential to provide guidance for identifying and selecting the most relevant and appropriate ones. Defining critical success factors (CSFs) is considered a suitable instrument for describing these decisive factors. A systematic literature review (SLR), followed by a qualitative synthesis, investigates two scientific fields of work: (1) the generally relevant features of CCBMs and (2) methodologies for determining CSFs. The result is a conceptual framework that provides guidance for digital applications performing further digital processing based on the CSFs relevant to a specific CCBM.
Artificial intelligence (AI) is one of the most promising technologies of the post-pandemic era. Cloud computing technology can simplify the process of developing AI applications by offering a variety of services, including ready-to-use tools to train machine learning (ML) algorithms. However, comparing the vast number of services offered by different providers and selecting a suitable cloud service can be a major challenge for many firms. In academia as well, suitable criteria to evaluate this type of service remain largely unclear. Therefore, the overall aim of this work has been to develop a framework to evaluate cloud-based ML services. We use Design Science Research as our methodology and conduct a hermeneutic literature review, a vendor analysis, as well as expert interviews. Based on our research, we present a novel framework for the evaluation of cloud-based ML services consisting of six categories and 22 criteria that are operationalized with the help of various metrics. We believe that our results will help organizations by providing specific guidance on how to compare and select service providers from the vast number of potential suppliers.
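As a hedged illustration only (the framework's actual six categories, 22 criteria, and metrics are not reproduced here), a criteria-based provider comparison can be sketched as a weighted score over normalized criterion values:

```python
def weighted_score(scores, weights):
    """Aggregate normalized criterion scores (0-1) into one provider score.
    The category names and weights used below are illustrative assumptions,
    not the framework's actual criteria or weighting scheme."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 1
    return sum(weights[c] * scores[c] for c in weights)

# Hypothetical categories and weights for one provider
weights = {"cost": 0.3, "performance": 0.3, "usability": 0.2, "support": 0.2}
provider_a = {"cost": 0.8, "performance": 0.6, "usability": 0.9, "support": 0.7}
score = weighted_score(provider_a, weights)
```

Such a score only supports ranking once the per-criterion metrics have been normalized onto a common scale.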
Condition monitoring supported with artificial intelligence, cloud computing, and industrial internet of things (IIoT) technologies increases the feasibility of predictive maintenance. However, the cost of traditional sensors, data acquisition systems, and the required information technology expert-knowledge challenge the industry. This paper presents a hybrid condition monitoring system (CMS) architecture consisting of a distributed, low-cost IIoT-sensor solution. The CMS uses micro-electro-mechanical system (MEMS) microphones for data acquisition, edge computing for signal preprocessing, and cloud computing, including artificial neural networks (ANN) for higher-level information processing. The system's feasibility is validated using a testbed for reciprocating linear-motion axes.
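A minimal sketch of the kind of edge-side preprocessing such a system might perform (an assumed illustration, not the paper's actual pipeline): extracting simple spectral features from one microphone frame before forwarding them to the cloud for higher-level processing.

```python
import numpy as np

def spectral_features(signal, fs):
    """Compute simple spectral features from one microphone frame,
    as an edge device might do before forwarding data to the cloud."""
    window = np.hanning(len(signal))                 # reduce spectral leakage
    power = np.abs(np.fft.rfft(signal * window)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    centroid = (freqs * power).sum() / power.sum()   # spectral centroid (Hz)
    peak = freqs[np.argmax(power)]                   # dominant frequency (Hz)
    return centroid, peak

# Synthetic check: a 1 kHz tone sampled at 16 kHz should peak at 1 kHz
fs = 16000
t = np.arange(fs) / fs
sig = np.sin(2 * np.pi * 1000.0 * t)
centroid, peak = spectral_features(sig, fs)
```

Compact features like these keep the data volume sent over the IIoT link small while preserving condition-relevant information.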
The members of the European TRIZ Campus (ETC) have been learning from and working together with many honorable members of MATRIZ Official for many years and feel very connected to the official International TRIZ Association.
To further spread the TRIZ methodology and TRIZ teaching in the European area, the ETC has put a lot of thought over the past 12 months into making TRIZ accessible to a broader audience; getting more professionals in touch with the methodology was one of the focal points.
To this end, we have developed new formats such as the "Trainer Day" to support trainers on their way into practice. We have drawn up detailed quality guidelines for the teaching of the TRIZ methodology, which are intended to provide orientation for the design of training classes and documentation. We strive for exchange with representatives of "neighbouring" methods such as Six Sigma, Lean, DFMA and Design Thinking to indicate synergies and added value among methods and approaches of different kinds. We are testing formats for community building in order to connect users everywhere more strongly with the TRIZ methodology through communication and information offers. If TRIZ users feel alone in their organizations, exchange outside their organization helps them to keep up with the TRIZ methodology. Moreover, the ETC strives to improve the ability to communicate the benefits of TRIZ usage inside organizations. We discuss how to reach teachers and students of all ages in order to make this unique way of inventive thinking accessible to them.
In our paper, we want to give fellow MATRIZ Official members insights and share our experiences and best practices with them.
Parallel grippers offer multiple applications thanks to their flexibility. Their application field ranges from aerospace and automotive to medicine and communication technologies. However, grippers exhibit wear and errors during the execution of their operations, which affects their performance. In this context, the remaining useful life (RUL) defines the remaining lifespan of an asset until failure at a particular point in its operation. Since the exact lifespan of an asset is uncertain, the RUL model and estimation must be derived from available sources of information. This paper presents a method for estimating the RUL of a two-jaw parallel gripper. After an introduction to the topic, an overview of the existing literature and RUL methods is presented. Subsequently, the method for estimating the RUL of grippers is explained. Finally, the results are summarized and discussed before the outlook and further challenges are presented.
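The general idea of deriving an RUL estimate from available degradation information can be sketched with a generic linear-degradation illustration (this is a textbook-style example, not the paper's method):

```python
import numpy as np

def estimate_rul(cycles, wear, failure_threshold):
    """Generic RUL sketch: fit a linear degradation trend to observed wear
    measurements and extrapolate to the failure threshold. All numbers and
    the linear model are illustrative assumptions."""
    slope, intercept = np.polyfit(cycles, wear, 1)
    if slope <= 0:
        return float("inf")  # no measurable degradation trend yet
    cycles_at_failure = (failure_threshold - intercept) / slope
    return max(cycles_at_failure - cycles[-1], 0.0)

# Example: wear grows ~0.001 units per cycle; failure assumed at 1.0
cycles = np.array([0.0, 100.0, 200.0, 300.0, 400.0])
wear = np.array([0.0, 0.1, 0.2, 0.3, 0.4])
rul = estimate_rul(cycles, wear, failure_threshold=1.0)
```

Real RUL models typically replace the linear trend with stochastic or data-driven degradation models and report uncertainty alongside the point estimate.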
This article presents a modified method of performing power flow calculations as an alternative to pure energy-based simulations of off-grid hybrid systems. The enhancement consists in transforming the scenario-based power flow method into a discrete time-dependent algorithm with the inclusion of bus and controller dynamics.
During the first years of the last decade, Egypt faced recurrent electricity cut-offs in summer. In the past few years, the electricity tariff increased dramatically. Radiative cooling to the clear night sky is a renewable energy source that represents a partial solution. The dry desert climate promotes nocturnal radiative cooling applications. This study investigates the potential of nocturnal radiative cooling systems (RCSs) to reduce the energy consumption of the residential building sector in Egypt. The system technology proposed in this work is based on uncovered solar thermal collectors integrated into the building hydronic system. By implementing different control strategies, the same system could be used for both cooling and heating applications. The goal of this paper is to analyze the performance of RCSs in residential buildings in Egypt. The dynamic simulation program TRNSYS was used to simulate the thermal behavior of the system. The relevant issues of Egypt as a case study are first reviewed. The paper then introduces the work done to develop a building model that represents a typical residential apartment in Egypt. Typical occupancy profiles were developed to define the internal thermal gains. The control strategy adopted to optimize the system operation is presented as well. To fully understand and hence evaluate the operation of the proposed RCS, four simulation cases were considered: 1. a reference case (fully passive), 2. the stand-alone operation of the RCS, 3. ideal heating & cooling operation (fully active), and 4. hybrid operation (when the active cooling system is supported by the proposed RCS). The analysis considered the three main distinct climates in Egypt, represented by the cities of Alexandria, Cairo and Asyut. The hotter and drier weather conditions resulted in a higher cooling potential and larger temperature differences. The simulated cooling power in Asyut was 28.4 W/m² for a 70 m² absorber field.
For a smaller field area of 10 m², the cooling power reached 109 W/m², but with modest temperature differences. To meet rigorous thermal comfort conditions, the proposed sensible RCS cannot fully replace conventional air-conditioning units, especially in humid areas like Alexandria. When working in a hybrid system, a 10% reduction in the active cooling energy demand could be achieved in Asyut while keeping the cooling set-point at 24 °C. This percentage reduction nearly doubled when the thermal comfort set-point was raised by two degrees (26 °C). In a sensitivity analysis, external shading devices as a passive measure as well as the implementation of the Egyptian code for buildings (ECP306/1–2005) were also investigated. The analysis of this study raised other relevant aspects to discuss, e.g. system sizing, environmental effects, limitations and recommendations.
Using predictive maintenance, more efficient processes can be implemented, leading to lower maintenance costs and increased availability. The development of a predictive maintenance solution currently requires considerable time and capacity as well as, in many cases, interdisciplinary cooperation. This paper presents a standardized model to describe a predictive maintenance use case. The description model is used to collect, present, and document the information required for the implementation of predictive maintenance use cases by and for different stakeholders. Based on this model, predictive maintenance solutions can be introduced more efficiently. The method is validated across departments in the automotive sector.
The development of automatic solutions for the detection of physiological events of interest is booming. Improvements in the collection and storage of large amounts of healthcare data allow access to these data faster and more efficiently. As a result, the development of artificial intelligence models for the detection and monitoring of a large number of pathologies is becoming increasingly common in the medical field. In particular, developing deep learning models for detecting obstructive sleep apnea (OSA) events is at the forefront. Numerous scientific studies focus on the architecture of the models and the results that these models can provide in terms of OSA classification and Apnea-Hypopnea Index (AHI) calculation. However, little focus is put on other aspects of great relevance that are crucial for the training and performance of the models, such as the set of physiological signals used and the preprocessing tasks performed prior to model training. This paper covers the essential requirements that must be considered before training a deep learning model for obstructive sleep apnea detection, and reviews solutions that currently exist in the scientific literature by analyzing the preprocessing tasks performed prior to training.
Introduction: Telemedicine reduces greenhouse gas emissions (CO2eq); however, study results vary considerably depending on the setting. This is the first study to focus on the effects of telemedicine on the CO2 footprint of primary care.
Methods: We conducted a comprehensive retrospective study to analyze the total CO2eq emissions of the kilometers (km) saved by telemedical consultations. We categorized prevented and provoked patient journeys, including pharmacy visits. We calculated the CO2eq emission savings achieved through primary care telemedical consultations in comparison to the emissions that would have occurred without telemedicine. We used a comprehensive footprint approach, including all telemedical cases and the CO2eq emissions of the telemedicine center infrastructure. To determine the net CO2eq emissions avoided by the telemedical center, we calculated the emissions associated with the provision of telemedical consultations (including the total consumption of physicians' workstations) and subtracted them from the total avoided CO2eq emissions. Furthermore, we also included patient cases that required an in-person visit after the telemedical consultation. We calculated the savings taking into account the source of the consumed energy (renewable or not).
Results: 433 890 telemedical consultations overall helped save 1 800 391 km in travel. On average, 1 telemedical consultation saved 4.15 km of individual transport and consumed 0.15 kWh. We detected savings in almost every cluster of patients. After subtracting the CO2eq emissions caused by the telemedical center, the data reveal savings of 247.1 net tons of CO2eq emissions in total and of 0.57 kg CO2eq per telemedical consultation. The comprehensive footprint approach thus indicated a reduced footprint due to telemedicine in primary care.
Discussion: Integrating a telemedical center into the health care system reduces the CO2 footprint of primary care medicine; this is true even in a densely populated country with comparatively low car use, such as Switzerland. The insights of this study complement previous studies that focused on narrower aspects of telemedical consultations.
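The reported per-consultation averages follow directly from the totals given in the results:

```python
# Totals as reported in the study
consultations = 433_890
km_saved = 1_800_391          # km of travel avoided
net_savings_t = 247.1         # net tons of CO2eq after subtracting center emissions

km_per_consultation = km_saved / consultations               # ≈ 4.15 km
kg_per_consultation = net_savings_t * 1000 / consultations   # ≈ 0.57 kg CO2eq
```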
This project aims to evaluate existing big data infrastructures for their applicability in the operating room to support medical staff with context-sensitive systems. Requirements for the system design were generated. The project compares different data mining technologies, interfaces, and software system infrastructures with a focus on their usefulness in the peri-operative setting. The lambda architecture was chosen for the proposed system design, which will provide data for both postoperative analysis and real-time support during surgery.
Sleep is an essential part of human existence, as we spend approximately a third of our lives in this state. Sleep disorders are common conditions that can affect many aspects of life. They are diagnosed in specialized laboratories with a polysomnography system, a costly procedure that demands considerable effort from the patient. Several systems have been proposed to address this situation, including performing the examination and analysis at the patient's home, using sensors whose physiological signals are automatically analysed by algorithms. This work aims to evaluate the use of a contactless respiratory recording system based on an accelerometer sensor for sleep apnea detection. For this purpose, an installation mounted under the bed mattress records the oscillations caused by chest movements during breathing. The presented processing algorithm filters the obtained signals and detects the presence of apnea events. The performance of the developed system and apnea event detection algorithm (average accuracy, specificity and sensitivity of 94.6%, 95.3%, and 93.7%, respectively) confirms the suitability of the proposed method and system for further ambulatory and in-home use.
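The reported performance measures are the standard binary detection metrics; with illustrative confusion-matrix counts (chosen for this sketch, not taken from the study) they are computed as:

```python
def detection_metrics(tp, tn, fp, fn):
    """Standard binary classification metrics as reported for apnea
    event detection (per-event true/false positives/negatives)."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)   # true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    return accuracy, sensitivity, specificity

# Illustrative counts only, roughly in the range of the reported values
acc, sens, spec = detection_metrics(tp=937, tn=953, fp=47, fn=63)
```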
Purpose
In recognising the key role of business intelligence and big data analytics in influencing companies’ decision-making processes, this paper aims to codify the main phases through which companies can approach, develop and manage big data analytics.
Design/methodology/approach
By adopting a research strategy based on case studies, this paper depicts the main phases and challenges that companies “live” through in approaching big data analytics as a way to support their decision-making processes. The analysis of case studies has been chosen as the main research method because it offers the possibility for different data sources to describe a phenomenon and subsequently to develop and test theories.
Findings
This paper provides a possible depiction of the main phases and challenges through which the approach(es) to big data analytics can emerge and evolve over time with reference to companies’ decision-making processes.
Research limitations/implications
This paper recalls the attention of researchers in defining clear patterns through which technology-based approaches should be developed. In its depiction of the main phases of the development of big data analytics in companies’ decision-making processes, this paper highlights the possible domains in which to define and renovate approaches to value. The proposed conceptual model derives from the adoption of an inductive approach. Despite its validity, it is discussed and questioned through multiple case studies. In addition, its generalisability requires further discussion and analysis in the light of alternative interpretative perspectives.
Practical implications
The reflections herein offer practitioners interested in company management the possibility to develop performance measurement tools that can evaluate how each phase can contribute to companies’ value creation processes.
Originality/value
This paper contributes to the ongoing debate about the role of digital technologies in influencing managerial and social models. This paper provides a conceptual model that is able to support both researchers and practitioners in understanding through which phases big data analytics can be approached and managed to enhance value processes.
Advancing mental health diagnostics: AI-based method for depression detection in patient interviews
(2023)
In this paper, we present a novel artificial intelligence (AI) application for depression detection, using advanced transformer networks to analyse clinical interviews. By incorporating simulated data to enhance traditional datasets, we overcome limitations in data protection and privacy, consequently improving the model’s performance. Our methodology employs BERT-based models, GPT-3.5, and ChatGPT-4, demonstrating state-of-the-art results in detecting depression from linguistic patterns and contextual information that significantly outperform previous approaches. Utilising the DAIC-WOZ and Extended-DAIC datasets, our study showcases the potential of the proposed application in revolutionising mental health care through early depression detection and intervention. Empirical results from various experiments highlight the efficacy of our approach and its suitability for real-world implementation. Furthermore, we acknowledge the ethical, legal, and social implications of AI in mental health diagnostics. Ultimately, our study underscores the transformative potential of AI in mental health diagnostics, paving the way for innovative solutions that can facilitate early intervention and improve patient outcomes.
Large critical systems, such as those created in the space domain, are usually developed by a large number of organizations and, furthermore, they have to comply with standards. Yet, the different stakeholders often do not have a common understanding of the needed quality of requirements specifications. Achieving such a common understanding is a laborious process that is currently not sufficiently supported. Moreover, such a common understanding must be aligned with the standards. In this paper, we present an approach that can be used to align the different stakeholder perceptions regarding the quality of requirements specifications. Existing quality models for requirements specifications are analyzed for equivalences, and transferred into a common representation, the so-called Aligned Quality Map (AQM). Furthermore, a process is defined that supports the alignment of different stakeholder perspectives with regard to the quality of requirements specifications using AQM, which is validated in a case study in the context of European space projects. AQM has been created and populated with an initial set of quality models. It is designed in such a way that it can be extended to include further quality models. The case study has shown that an alignment of different stakeholder perspectives and the quality model of the European Cooperation for Space Standardization using AQM is feasible. The approach allows for aligning different stakeholder perspectives for a common understanding of the quality of requirements specifications in the context of standards. Furthermore, AQM supports the assessment of requirements specifications.
Global trade is plagued by slow and inefficient manual processes associated with physical documents. Firms are constantly looking for new ways to improve transparency and increase the resilience of their supply chains. This can be solved by the digitalisation of supply chains and the automation of document- and information-sharing processes. Blockchain is touted as a solution to these issues due to its unique combination of features, such as immutability, decentralisation and transparency. A lack of business cases that quantify the costs and benefits causes uncertainty regarding the truth of these claims. This paper explores how the costs and benefits of a blockchain-based solution for digitalising and automating documentation flows in cross-border supply chains compare to a conventional centralised relational database solution. The research described in this paper uses primary data collected through semi-structured interviews with industry experts, as well as secondary data from literature. Two models based on existing services were developed and the costs and benefits compared and then analysed using the Architecture Trade-off Analysis Method (ATAM) and the Analytic Network Process (ANP). Findings from the analysis show that a consortium blockchain solution like TradeLens is the favourable solution for digitalising and automating information flows in cross-border supply chains.
Intelligent Tutoring Systems (ITSs) are increasingly used in modern education to automatically give students individual feedback on their performance. The advantage for students is fast individual feedback on their answers to asked questions, while lecturers benefit from considerable time savings and easy delivery of educational material. Of course, it is important that the provided feedback is as effective as direct feedback from the lecturer. However, in digital teaching, lecturers cannot assess the student’s knowledge precisely but can only provide information on which questions were answered correctly and incorrectly. Therefore, this paper presents a concept for integrating ITS elements into the gamified e-learning platform IT-REX so that the feedback quality can be improved to support students in the best possible way.
Research question: The clinical standard procedure and reference for sleep measurement and the classification of individual sleep stages is polysomnography (PSG). Alternative approaches to this elaborate procedure could offer several advantages if the measurements can be performed in a more comfortable way. The main goal of this research study is to develop an algorithm for the automatic classification of sleep stages that uses only movement and respiration signals [1].
Patients and methods: After analyzing current research, we chose multinomial logistic regression as the basis of our approach [2]. To increase the accuracy of the evaluation, four features derived from movement and respiration signals were developed. For the evaluation, the overnight recordings of 35 subjects provided by Charité-Universitätsmedizin Berlin were used. The average age of the participants was 38.6 +/- 14.5 years and the average BMI was 24.4 +/- 4.9 kg/m². Since the algorithm works with three stages, the stages N1, N2 and N3 were merged into a single NREM stage. The available data set was strictly split into a training set of about 100 h and a test set of about 160 h of overnight recordings. Both data sets had a similar ratio of men to women, and the average BMI showed no significant deviation.
Results: The algorithm was implemented and delivered successful results: the accuracy of Wake/NREM/REM detection is 73%, with a Cohen's kappa of 0.44 for the 19,324 analyzed sleep epochs of 30 s each. The observed overestimation of the NREM stage can partly be explained by its prevalence in a typical sleep pattern. Even the use of a balanced training data set could not fully solve this problem.
Conclusions: The achieved results have confirmed the suitability of the approach in principle. Its advantage is that only movement and respiration signals are used, which can be recorded with less effort and more comfortably for users than, e.g., cardiac or EEG signals. The new system therefore represents a clear improvement over existing approaches. Merging the described algorithmic software with the hardware system for measuring respiration and body movement signals described in [1] into an autonomous, contactless system for continuous sleep monitoring is a possible direction for future work.
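Cohen's kappa, the chance-corrected agreement measure reported in the results, can be computed from a stage confusion matrix; the 3×3 matrix below is a toy example, not the study's data:

```python
import numpy as np

def cohens_kappa(confusion):
    """Cohen's kappa: agreement between predicted and reference classes,
    corrected for the agreement expected by chance."""
    confusion = np.asarray(confusion, dtype=float)
    total = confusion.sum()
    observed = np.trace(confusion) / total
    expected = (confusion.sum(axis=0) * confusion.sum(axis=1)).sum() / total**2
    return (observed - expected) / (1 - expected)

# Toy 3x3 matrix (rows: reference Wake/NREM/REM, columns: predicted)
cm = [[50, 10, 5],
      [15, 200, 20],
      [5, 25, 70]]
kappa = cohens_kappa(cm)
```

Because NREM dominates a typical night, plain accuracy can look favorable even when rarer stages are missed, which is why kappa is reported alongside it.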
While driving, stress is caused by situations in which drivers judge their ability to manage the driving demands as insufficient or lose the capability to handle the situation. This leads to increased numbers of driver mistakes and traffic violations. Additional stress factors are time pressure, road conditions, or dislike for driving. Stress therefore affects driver and road safety. Stress is classified into two categories depending on its duration and its effects on body and psyche: short-term eustress and constantly present distress, which causes degenerative effects. In this work, we focus on distress. Wearable sensors are handy tools for collecting biosignals like heart rate and activity; their easy installation and non-intrusive nature make them convenient for estimating stress. This study investigates stress and its implications by analyzing stress within a select group of individuals from both Spain and Germany. The primary objective is to examine the influence of recognized psychological factors, including personality traits such as neuroticism, extroversion and psychoticism, on stress and road safety. Stress levels were estimated by collecting physiological parameters (R-R intervals) with a Polar H10 chest strap. We observed that personality traits such as extroversion exhibited similar trends during relaxation, with an average heart rate 6% higher in Spain and 3% higher in Germany. While driving, however, introverts on average experienced more stress, with rates 4% and 1% lower than extroverts in Spain and Germany, respectively.
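A minimal sketch (an assumed illustration, not the study's analysis code) of deriving mean heart rate and a standard heart-rate-variability measure from recorded R-R intervals:

```python
import numpy as np

def hr_and_rmssd(rr_ms):
    """Mean heart rate (bpm) and RMSSD (ms) from R-R intervals in
    milliseconds. RMSSD is a common short-term HRV measure; lower
    values are generally associated with higher stress."""
    rr = np.asarray(rr_ms, dtype=float)
    hr = 60000.0 / rr.mean()                 # beats per minute
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))
    return hr, rmssd

# Toy R-R series (ms), e.g. from a chest-strap sensor such as the Polar H10
rr = [800, 810, 790, 805, 795, 800]
hr, rmssd = hr_and_rmssd(rr)
```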
Modern wide-bandgap power devices promise higher power conversion performance if the device can be operated reliably. As switching speed increases, the effects of parasitic ringing become more prominent, causing potentially damaging overvoltages during device turn-off. Estimating the expected additional voltage caused by such ringing enables more reliable designs. In this paper, we present an analytical expression to calculate the expected overvoltage caused by parasitic ringing based on parasitic element values and operating point parameters. Simulations and measurements confirm that the expression can be used to find the smallest rise time of the switches' drain-source voltage for minimum overvoltage. The given expression also allows predicting the trade-off in overvoltage amplitude when faster rise times are required.
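The paper's analytical expression is not reproduced here, but the underlying relationship can be sketched under simple series-RLC assumptions: the loop inductance and device output capacitance set a characteristic impedance, and the first voltage peak after an abrupt current cut-off is roughly the undamped overshoot attenuated by loop damping.

```python
import math

def ringing_overvoltage(i_off, L, C, R):
    """Rough first-peak overvoltage of a series-RLC ringing loop after an
    abrupt current cut-off. A generic textbook-style estimate made for
    illustration, not the paper's analytical expression."""
    z0 = math.sqrt(L / C)                 # characteristic impedance (ohm)
    zeta = R / (2.0 * z0)                 # damping ratio of the loop
    if zeta >= 1.0:
        return 0.0                        # overdamped: no ringing overshoot
    # Undamped peak i_off * z0, attenuated by damping over the quarter
    # ring period until the first voltage maximum.
    return i_off * z0 * math.exp(-zeta * (math.pi / 2) / math.sqrt(1 - zeta ** 2))

# Illustrative values: 20 A cut off against 10 nH loop inductance,
# 500 pF output capacitance and 0.5 ohm loop resistance
dv = ringing_overvoltage(i_off=20.0, L=10e-9, C=500e-12, R=0.5)
```

Even this simple estimate shows why small parasitic inductances matter: the overvoltage scales with sqrt(L/C) times the interrupted current.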
Military organizations have special features, such as following different organizational laws in times of peace and war, and their specific embeddedness in society and politics. Especially the latter aspect has made the military an important object of study since the beginnings of modern sociology. In the wake of establishing specific sociological accounts, military sociology has been developed, dedicated to the different facets of the military. This research is based on different theoretical perspectives but has hardly embraced the frameworks of the economics and sociology of conventions (EC/SC) so far. The aim of the chapter is to explore and demonstrate the potentials of this approach. In a first step, the state of the art of military sociology research is outlined, and potential avenues for analyzing military forces based on EC/SC are identified. It is argued that especially the connection to organizational theory (the military as organization) and civil-military relations, including leadership and professionalism, offer starting points. After introducing existing studies addressing military-related topics with reference to EC/SC, relevant concepts and approaches of convention theory that prove particularly enriching for military research are discussed. An outlook on possible further fields and topics of research is given to concretize what an inclusion of the EC/SC perspective could look like.
Cotton contamination by honeydew is considered one of the significant problems for quality in textiles as it causes stickiness during manufacturing. Therefore, millions of dollars in losses are attributed to honeydew contamination each year. This work presents the use of UV hyperspectral imaging (225–300 nm) to characterize honeydew contamination on raw cotton samples. As reference samples, cotton samples were soaked in solutions containing sugar and proteins at different concentrations to mimic honeydew. Multivariate techniques such as a principal component analysis (PCA) and partial least squares regression (PLS-R) were used to predict and classify the amount of honeydew at each pixel of a hyperspectral image of raw cotton samples. The results show that the PCA model was able to differentiate cotton samples based on their sugar concentrations. The first two principal components (PCs) explain nearly 91.0% of the total variance. A PLS-R model was built, showing a performance with a coefficient of determination for the validation (R2cv) = 0.91 and root mean square error of cross-validation (RMSECV) = 0.036 g. This PLS-R model was able to predict the honeydew content in grams on raw cotton samples for each pixel. In conclusion, UV hyperspectral imaging, in combination with multivariate data analysis, shows high potential for quality control in textiles.
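A minimal sketch of the PCA step (a generic example on synthetic data, not the study's spectra or model): the explained-variance ratio of the first components quantifies how much of the spectral variation they capture, analogous to the ~91% reported for the first two PCs.

```python
import numpy as np

def pca_explained_variance(X, n_components=2):
    """PCA via SVD on mean-centred data; returns the fraction of total
    variance explained by the first n components."""
    Xc = X - X.mean(axis=0)
    _, s, _ = np.linalg.svd(Xc, full_matrices=False)
    var = s ** 2
    return var[:n_components].sum() / var.sum()

# Synthetic "spectra": two dominant latent directions plus small noise
rng = np.random.default_rng(0)
scores = rng.normal(size=(100, 2)) * [5.0, 2.0]   # two latent components
loadings = rng.normal(size=(2, 30))               # 30 "wavelength" channels
X = scores @ loadings + 0.01 * rng.normal(size=(100, 30))
ratio = pca_explained_variance(X, 2)
```

In the hyperspectral setting, each row would be one pixel's spectrum, and a PLS-R model would then regress the reference honeydew amounts onto the spectra.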
Patterns are virtually simulated in 3D CAD programs before production to check the fit. However, achieving lifelike representations of human avatars, especially regarding soft tissue dynamics, remains challenging. This is mainly because conventional avatars in garment CAD programs are simulated with a continuous hard surface that does not correspond to the physical and mechanical properties of human soft tissue. In the real world, the human body's natural shape is affected by the contact pressure of tight-fitting textiles. To verify the fit of a simulated garment, the interactions between the individual body shape and the garment must be considered. This paper introduces an innovative approach to digitising the softness of human tissue using 4D scanning technology. The primary objective of this research is to explore the interactions between tissue softness and different compression levels of apparel, exerting pressure on the tissue to capture the changes in the natural shape. To generate data and model an avatar with soft-body physics, it is essential to capture the deformability and elasticity of the soft tissue and map it into the modification options for a simulation. To this end, various methods from different fields were researched and compared, and 4D scanning was evaluated as the most suitable method for capturing tissue deformability in vivo. In particular, it should be considered that the human body has different deformation capabilities depending on age and the amount of muscle and body fat. In addition, different tissue zones have different mechanical properties, so it is essential to identify and classify them in order to capture these properties for the simulation. It has been shown that by digitising the obtained data for the different defined applied pressure levels, a prediction of the tissue deformation of the exact person becomes possible.
As technology advances and data sets grow, this approach has the potential to reshape how we verify fit digitally with soft avatars and leverage their realistic soft tissue properties for various practical purposes.
Companies are currently transforming their strategy, their processes, and their information systems in order to increase their degree of digitalization. The potential of the Internet and related digital technologies, such as the Internet of Things, services computing, cloud computing, artificial intelligence, big data with analytics, mobile systems, collaboration networks, and cyber-physical systems, drives and enables new business models. Digitalization leads to a profound disruption of existing businesses, technologies, and economies and promotes the architecture of digital environments with many rather small and distributed structures. This has a strong impact on new opportunities for value creation and on the design of digital services and products, which are guided by the use of a service-dominant logic. The main result of this book chapter extends methods for integral digital strategies with value-oriented models for digital products and services, which are defined within a multi-perspective digital enterprise architecture reference model.
Student-faculty interactions that promote learning are essential contributors to student retention, academic success and satisfaction. But the factors that causally initiate and frame these interactions are not well understood. Only if students evaluate these interactions as positive will they seek them. We conducted a survey experiment with students (n = 375) from a tuition-fee-free German business school, using conditional process analysis to assess which factors frame effective interactions. We focus on out-of-classroom standard and non-standard requests that students make to faculty, then investigate how faculty and student gender and students’ academic entitlement influence the interaction. Our study examines how students evaluate the interaction with faculty: when they seek interaction, their expectations of getting their requests approved, and their disappointment when their requests are declined. We find a significant influence of the request type along with moderating effects of faculty gender, student gender and student entitlement, particularly for non-standard work requests. We conclude with policy implications for university management: developing target-group-specific measures that facilitate the desired and positively evaluated student-faculty interactions might benefit all university stakeholders.
Artificial Intelligence (AI) in der Markenführung: Künstliche Neuronale Netze zur Markenimagemessung
(2023)
Because artificial neural networks make it possible to model nonlinear and multilayered relationships, this contribution examines their potential for the methodologically demanding analysis and measurement of brand image. To illustrate the conceptual approach, a multilayer artificial neural network between the ratings of specific brand attributes and the overall rating of the brand is built, using the sporting goods manufacturer adidas as an empirical example. Based on an analysis of the connection weights of the artificial neural network, the importance of individual brand attributes for the overall brand evaluation is measured, from which concrete implications for brand management practice can be derived.
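Connection-weight analysis of this kind is commonly operationalized with Garson's algorithm; whether this study uses exactly that variant is an assumption. A pure-Python sketch for a single-hidden-layer network, with illustrative weights:

```python
def garson_importance(w_ih, w_ho):
    """Relative importance of each input from absolute connection weights
    (Garson's algorithm, single hidden layer).

    w_ih: one row per hidden neuron, holding the weights from every input
          into that neuron; w_ho: hidden-to-output weights.
    """
    n_inputs = len(w_ih[0])
    contrib = [0.0] * n_inputs
    for j, row in enumerate(w_ih):
        row_sum = sum(abs(w) for w in row)
        for i, w in enumerate(row):
            # share of hidden neuron j's signal attributable to input i
            contrib[i] += (abs(w) / row_sum) * abs(w_ho[j])
    total = sum(contrib)
    return [c / total for c in contrib]

# Two brand attributes, two hidden neurons (hypothetical weights)
print(garson_importance([[0.8, 0.2], [0.5, 0.5]], [1.0, 1.0]))
```

The returned shares sum to 1 and rank the attributes by their contribution to the overall brand evaluation.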
In order to ensure sufficient recovery of the human body and brain, healthy sleep is indispensable. For this purpose, appropriate therapy should be initiated at an early stage in the case of sleep disorders. For some sleep disorders (e.g., insomnia), a sleep diary is essential for diagnosis and therapy monitoring. However, subjective measurement with a sleep diary has several disadvantages, requiring regular action from the user and leading to decreased comfort and potential data loss. To automate sleep monitoring and increase user comfort, one could consider replacing a sleep diary with an automatic measurement, such as a smartwatch, which would not disturb sleep. To obtain accurate results on the evaluation of the possibility of such a replacement, a field study was conducted with a total of 166 overnight recordings, followed by an analysis of the results. In this evaluation, objective sleep measurement with a Samsung Galaxy Watch 4 was compared to a subjective approach with a sleep diary, which is a standard method in sleep medicine. The focus was on comparing four relevant sleep characteristics: falling asleep time, waking up time, total sleep time (TST), and sleep efficiency (SE). After evaluating the results, it was concluded that a smartwatch could replace subjective measurement to determine falling asleep and waking up time, considering some level of inaccuracy. In the case of SE, substitution also proved to be possible. However, some individual recordings showed a higher discrepancy in results between the two approaches. In contrast, the evaluation of the TST measurement currently does not allow us to recommend substituting the measurement method for this sleep parameter. The appropriateness of replacing sleep diary measurement with a smartwatch depends on the acceptable levels of discrepancy. We propose four levels of similarity of results, defining ranges of absolute differences between objective and subjective measurements.
By considering the values in the provided table and knowing the required accuracy, it is possible to determine the suitability of substitution in each individual case. The introduction of a “similarity level” parameter increases the adaptability and reusability of study findings in individual practical cases.
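The proposed "similarity level" parameter amounts to binning the absolute difference between the objective and subjective values. A sketch with hypothetical thresholds (the study's actual ranges are defined in its table):

```python
# Hypothetical thresholds in minutes; the study defines its own ranges.
LEVELS = [(5, "very high"), (15, "high"), (30, "moderate"), (float("inf"), "low")]

def similarity_level(objective, subjective):
    """Map the absolute difference of two measurements to a similarity level."""
    diff = abs(objective - subjective)
    for bound, label in LEVELS:
        if diff <= bound:
            return label

# e.g. smartwatch TST of 412 min vs. sleep-diary TST of 420 min
print(similarity_level(412, 420))
```

Knowing the accuracy required in a given practical case, one checks whether the resulting level is acceptable before substituting the measurement method.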
The influence of sleep on human health is enormous. Accordingly, sleep disorders can have a negative impact on it. To avoid this, they should be identified and treated in time. For this purpose, objective (with an appropriate device) or subjective (based on perceived values) measurement methods are used for sleep analysis to understand the problem. The aim of this work is to find out whether an exchange of the two methods is possible and can provide reliable results. In accordance with this goal, a study was conducted with people aged over 65 (a total of 154 night-time recordings) in which both measurement methods were compared. Sleep questionnaires and electronic devices for sleep assessment placed under the mattress were applied to achieve the study aims. The obtained results indicated that a correlation between both measurement methods could be observed for sleep characteristics such as total sleep time, total time in bed and sleep efficiency. However, there are also significant differences in the absolute values of the two measurement approaches for some subjects/nights, which leads us to conclude that substitution is more likely to be suitable for long-term monitoring, where the trends are of greater importance than the absolute values of individual nights.
Assistant platforms
(2023)
Many assistant systems have evolved toward assistant platforms. These platforms combine a range of resources from various actors via a declarative and generative interface. Among the examples are voice-oriented assistant platforms like Alexa and Siri, as well as text-oriented assistant platforms like ChatGPT and Bard. They have emerged as valuable tools for handling tasks without requiring deeper domain expertise and have received considerable attention with the recent advances in generative artificial intelligence. In view of their growing popularity, this Fundamental outlines the key characteristics and capabilities that define assistant platforms. The former comprise a multi-platform architecture, a declarative interface, and a multi-platform ecosystem, while the latter include capabilities for composition, integration, prediction, and generativity. Based on this framework, a research agenda is proposed along the capabilities and affordances for assistant platforms.
Purpose – This paper explores which employer attractiveness attributes Generation Z (Gen Z) talents prioritize. Comparing the findings for female and male participants, this study examines whether gender-specific work value orientations prevail among Gen Z talents and impact their expectations toward employers.
Design/methodology/approach – A survey was conducted among 308 students of business, economics and management in Germany. Data were collected using the employer attractiveness scale of Berthon and colleagues (2005) complemented by an additional dimension focusing on work–life balance.
Findings – Findings indicate that Gen Z talents primarily expect a fun work environment, a positive team atmosphere and supportive relations with colleagues and superiors. Application aspects and work–life balance enabling services are expected the least. Expectations of four of the six attributes measured differ significantly among women and men, indicating that traditional gender assumptions continue to be reflected in the work value orientations of Gen Z talents.
Research limitations/implications – The sample was limited to business, economics and management students in Germany. Additional research should include a wider variety of respondents of different disciplines and countries.
Practical implications – Practical implications refer to emphasizing the social value of employment in the employee value proposition and customizing employer branding activities by gender.
Originality/value – This study contributes to the literature by empirically determining which employer attractiveness attributes Gen Z talents expect and whether and how these expectations vary by gender.
Business process management (BPM) is a central component for process-oriented companies because of its importance and the resulting requirements regarding internal operational organization and audits. However, introducing and maintaining BPM involves considerable effort, since processes must be captured, modelled, and kept up to date. Empirical evidence shows that successful process modelling is a particular challenge that often cannot be achieved in a satisfactory, sustainable manner. A key success factor for sustainable process orientation in companies is therefore consistent and up-to-date process modelling, as well as its adaptation to external and internal changes. A literature review identifies the relevant dimensions of sustainable process orientation based on process modelling. On this basis, an adaptive, action-oriented framework for practical application in companies is derived.
Auf dem Weg zu einer neuen Normalität in Schule und Bildung?! : Empfehlungen der Beitragenden
(2023)
The studies and findings presented in this volume show the deep marks the pandemic has left on schools and education. In light of these experiences, numerous researchers and experts, but also committed parents, children, and adolescents, wish for a "new" normality for schools and education: a normality in which educational inequity is countered more effectively, which is more digital, ... What could the path towards it look like?
Wear on cutting tools with geometrically defined cutting edges leads to poor surface quality, increased forces, dimensional deviations, and breakage. Until now, this wear has been measured outside the machine or indirectly (e.g., via the diameter). Tools are exchanged after a certain number of workpieces, a certain time, or a certain cutting distance. This contribution presents a novel system for directly measuring flank wear inside the working area of a machining centre. A protected, integrated industrial camera with a lens is installed in the working area, and the machine axes or the machining spindle position the tool in front of it. After a measurement that takes only a few seconds, the wear is evaluated in parallel with machining.
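Once the camera image of the flank face is segmented into worn and unworn pixels, the wear land width can be estimated column-wise. A simplified sketch of that evaluation step (the binary mask and pixel scale are illustrative, not the system's actual processing pipeline):

```python
def flank_wear_width_um(mask, pixel_size_um):
    """Largest vertical extent of worn pixels (1s) over all columns, in µm.

    mask: 2-D list of 0/1 values from a thresholded tool image; an
    illustrative stand-in for the segmentation step of the camera system.
    """
    widest = 0
    for col in range(len(mask[0])):
        worn_rows = [r for r, row in enumerate(mask) if row[col]]
        if worn_rows:
            widest = max(widest, worn_rows[-1] - worn_rows[0] + 1)
    return widest * pixel_size_um

mask = [
    [0, 1, 0],
    [0, 1, 1],
    [0, 1, 0],
]
print(flank_wear_width_um(mask, 5))   # column 1 spans 3 rows of worn pixels
```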
Purpose
Job advertisements are important means of communicating role expectations for management accountants to the labor market. They provide information about which roles are sought and expected. However, which roles are communicated in job advertisements is unknown so far.
Design/methodology/approach
With a text-mining approach on a large sample of 889 job ads, the authors extract information on roles, type of firm and hierarchical position of the management accountant sought.
Findings
The results indicate an apparent mix of different role types with a strong focus on a classic watchdog role. However, the business partner role is more often sought for leadership positions or in family businesses and small- and medium-sized enterprises (SME).
Research limitations/implications
The main limitation is the lack of an agreed-upon measurement instrument for roles in job offers. The study results imply that corporate practice is not as theory-driven as is postulated and communicated in the management accounting community. This indicates the existence of a research-practice gap and tensions between different actors in the management accounting field.
Practical implications
The results challenge the current role discussion of professional organizations for management accountants as business partners.
Originality/value
The authors contribute the first study that explicitly analyzes the communication of roles in job offers for management accountants. It indicates a discrepancy between the scholarly discussion on roles and management accountants' work from an employer's perspective.
The coronavirus pandemic has held Germany firmly in its grip since spring 2020. From the beginning, one of the central measures to slow the spread of the coronavirus was the closure of schools. A first study quantified the learning-time losses caused by the pandemic-related school closures in spring 2020 (Wößmann, Freundl, Lergetporer, Grewenig, Werner & Zierow, 2020). It showed that students' learning time was cut in half by the school closures and that the losses were particularly large among lower-achieving students. In spring 2020, the reduction in learning time was not compensated by the schools: only a small share of students had regular distance teaching and daily contact with teachers during this phase. During the summer and autumn months after the first phase of school closures, school administrations, schools, and teachers had time to adjust to distance teaching and digital teaching methods in order to counteract learning losses during any renewed school closures. Whether this actually led students to spend more time learning during the school closures at the beginning of 2021 than in spring 2020 has, however, remained largely unknown.
To find out how schoolchildren spent the period of school closures at the beginning of 2021, another Germany-wide survey was conducted, this time among more than 2,000 parents of schoolchildren. The results provide comprehensive insights into the everyday life of schoolchildren, parents, and schools during the school closures in early 2021. They show how many hours the children spent learning and on other creative and passive activities during this phase, which concrete measures the schools took to keep school operations running, how effective learning at home was, and how parents assess the home learning environment. We compare the activities during the school closures in early 2021 with the activities during the first pandemic-related school closures in spring 2020 as well as with the activities before the pandemic (cf. Wößmann et al., 2020). We also report results on the socio-emotional well-being of the children after one year of the pandemic and on parents' assessments of the broader effects of the school closures on various areas of their children's lives. The survey thus provides new empirical evidence on the possible consequences of the coronavirus crisis for the educational success of children in Germany. In doing so, we also examine how the effects of the school closures differ between higher- and lower-achieving students as well as between children of academics and non-academics.
Silicon neurons (SiNs) represent different levels of biological detail and accuracy as a trade-off between complexity and power consumption. With respect to this trade-off and their high similarity to neuron behaviour models, relaxation-type oscillator circuits often yield a good compromise for emulating neurons. In this chapter, two exemplary relaxation-type silicon neurons are presented that emulate neural behaviour with an energy consumption below the scale of nJ/spike. The first proposed fully CMOS relaxation SiN is based on the mathematical Izhikevich model and can mimic a broad range of physiologically observable spike patterns. Results for various biologically plausible output patterns and for the coupling process of two SiNs are presented in 0.35 μm CMOS technology. The second type is a novel ultra-low-frequency hybrid CMOS-memristive SiN based on relaxation oscillators and analog memristive devices. The hybrid SiN directly emulates neuron behaviour in the range of physiological spiking frequencies (less than 100 Hz). The relaxation oscillator is implemented and fabricated in 0.13 μm CMOS technology. An autonomous neuronal synchronization process is demonstrated in measurements with two relaxation oscillators coupled by an analog memristive device, emulating the synchronous behaviour of spiking neurons.
Supply chains have evolved into dynamic, interconnected supply networks, which increases the complexity of achieving end-to-end traceability of object flows and their experienced events. With its capability of ensuring a secure, transparent, and immutable environment without relying on a trusted third party, the emerging blockchain technology shows strong potential to enable end-to-end traceability in such complex multitiered supply networks. This paper aims to overcome the limitations of existing blockchain-based traceability architectures regarding their object-related event mapping ability, which involves mapping the creation and deletion of objects, their aggregation and disaggregation, transformation, and transaction, in one holistic architecture. Therefore, this paper proposes a novel ‘blueprint-based’ token concept, which allows clients to group tokens into different types, where tokens of the same type are non-fungible. Furthermore, blueprints can include minting conditions, which, for example, are necessary when mapping assembly processes. In addition, the token concept contains logic for reflecting all conducted object-related events in an integrated token history. Finally, for validation purposes, this article implements the architecture’s components in code and proves its applicability based on the Ethereum blockchain. As a result, the proposed blockchain-based traceability architecture covers all object-related supply chain events and proves its general-purpose end-to-end traceability capabilities of object flows.
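The core of the blueprint idea, token types whose minting conditions consume component tokens while carrying their history forward, can be sketched off-chain in a few lines. Names and structure here are simplifications for illustration, not the paper's on-chain implementation:

```python
class Blueprint:
    """A token type; `required` states which component tokens minting consumes."""
    def __init__(self, name, required):
        self.name = name
        self.required = required          # e.g. {"wheel": 2} for an assembly step

class Ledger:
    def __init__(self):
        self.tokens = {}                  # token_id -> (blueprint_name, history)
        self._next_id = 0

    def mint(self, blueprint, consume_ids=()):
        consumed = [self.tokens[i] for i in consume_ids]
        counts = {}
        for name, _ in consumed:
            counts[name] = counts.get(name, 0) + 1
        if counts != blueprint.required:  # minting condition check
            raise ValueError("minting condition not satisfied")
        for i in consume_ids:
            del self.tokens[i]            # components are consumed (deleted)
        # integrated token history: component events plus the new minting event
        history = [e for _, h in consumed for e in h] + [f"minted {blueprint.name}"]
        token_id = self._next_id
        self._next_id += 1
        self.tokens[token_id] = (blueprint.name, history)
        return token_id
```

Minting a "bike" from two "wheel" tokens deletes the wheels and yields one token whose history still records both wheel mintings, mirroring the aggregation/transformation events the architecture traces.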
We introduce bloomRF as a unified method for approximate membership testing that supports both point and range queries. As a first core idea, bloomRF introduces novel prefix hashing to efficiently encode range information in the hash code of the key itself. As a second key concept, bloomRF proposes novel piecewise-monotone hash functions that preserve local order and support fast range lookups with fewer memory accesses. bloomRF has near-optimal space complexity and constant query complexity. Although bloomRF is designed for integer domains, it supports floating-point values and can serve as a multi-attribute filter. The evaluation in RocksDB and in a standalone library shows that it is more efficient and outperforms existing point-range filters by up to 4× across a range of settings and distributions, while keeping the false-positive rate low.
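The prefix idea, encoding every dyadic prefix of a key so that a range probe only has to check the blocks covering the interval, can be illustrated with an exact set standing in for the Bloom bit array. This is a toy with 8-bit keys, not bloomRF itself (which hashes the prefixes and therefore admits false positives):

```python
BITS = 8   # toy key width; bloomRF targets full integer domains

def prefixes(key):
    """All dyadic prefixes of a key: (level, key >> level) for each level."""
    return {(level, key >> level) for level in range(BITS + 1)}

class PrefixRangeFilter:
    def __init__(self):
        self._stored = set()              # stands in for the Bloom bit array

    def add(self, key):
        self._stored |= prefixes(key)

    def may_contain_range(self, lo, hi):
        """Probe the greedy dyadic decomposition of [lo, hi]."""
        while lo <= hi:
            level = 0                     # grow the block while aligned and in range
            while level < BITS and lo % (2 << level) == 0 and lo + (2 << level) - 1 <= hi:
                level += 1
            if (level, lo >> level) in self._stored:
                return True               # no false negatives by construction
            lo += 1 << level
        return False
```

A key in [lo, hi] always lies in one of the probed blocks, so range lookups never miss; point lookups are the special case lo == hi.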
Non-fungible tokens (NFTs) are unique digital assets that have recently gained significant popularity, particularly in the digital art sector. The success of NFTs and other blockchain-based innovations depends on their acceptance and use by consumers. This study aims to understand the impact of moral values on the acceptance of NFTs. Based on a quantitative survey with over 800 complete responses, the analysis shows that moral aspects of NFTs are indeed important for potential users. However, there is an attitude-behavior gap, as the positive impact of moral values on the intention to use NFTs is not reflected in the respondents' actual current usage of NFTs. This study contributes to knowledge by providing new empirical data on the acceptance of NFTs and highlighting the role of moral values in the acceptance decision.
In the past, plant layouts were regarded as highly static structures. With increasing internal and external factors causing turbulence in operations, it has become more necessary for companies to adapt to new conditions in order to maintain optimal performance. One possible way for such an adaptation is the adjustment of the plant layout by rearranging the individual facilities within the plant. Since the information about the plant layout is considered as master data and changes have a considerable impact on interconnected processes in production, it is essential that this data remains accurate and up-to-date. This paper presents a novel approach to create a digital shadow of the plant layout, which allows the actual state of the physical layout to be continuously represented in virtual space. To capture the spatial positions and orientations of the individual facilities, a pan-tilt-zoom camera in combination with fiducial markers is used. With the help of a prototypically implemented system, the real plant layout was captured and converted into different data formats for further use in exemplary external software systems. This enabled the automatic updating of the plant layout for simulation, analysis and routing tasks in a case study and showed the benefits of using the proposed system for layout capturing in terms of accuracy and effort reduction.
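Geometrically, updating the digital shadow comes down to transforming each marker pose observed in the camera frame into plant coordinates. A 2-D sketch of that transform (camera pose and marker values are illustrative; the real system works with full 3-D poses from the pan-tilt-zoom camera):

```python
import math

def marker_to_plant(marker_pose, camera_pose):
    """Convert a 2-D marker pose (x, y, heading) from the camera frame into
    plant coordinates, given the camera's own pose in the plant frame."""
    cx, cy, ct = camera_pose
    mx, my, mt = marker_pose
    return (
        cx + mx * math.cos(ct) - my * math.sin(ct),   # rotate, then translate
        cy + mx * math.sin(ct) + my * math.cos(ct),
        (ct + mt) % (2 * math.pi),                    # compose the headings
    )

# camera at the plant origin, rotated 90°; marker 2 m ahead of the camera
x, y, rot = marker_to_plant((2.0, 0.0, 0.0), (0.0, 0.0, math.pi / 2))
```

The resulting facility positions can then be serialized into the various data formats consumed by the simulation, analysis, and routing systems.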
Polyurethane thermosets have a wide range of applications. In this study, alternative raw materials were used to enhance sustainability. In two newly developed biobased polyurethanes (PUs), the cross-linker content was varied, which caused phase separation and therefore affected the turbidity. To investigate this phenomenon, UV–Vis–NIR spectroscopy was utilized. Spectra were recorded from 200 to 2500 nm in transmittance mode, and multivariate data analysis was applied to the three UV, Vis, and NIR sections separately. For the two different PU classes, each with five different cross-linker contents, classification by principal component analysis combined with linear or quadratic discriminant analysis was possible with an accuracy between 93% and nearly 100%. The best separation was achieved in the NIR range. Partial least-squares regression models were determined to predict the cross-linker content. As mentioned, the model for the NIR range is the most suitable, with the highest R2 (validation) of 0.99 for PU1 and 0.98 for PU2. The corresponding root-mean-square error of prediction values of the external validation was the lowest, with 0.82% (PU1) and 1.25% (PU2). Therefore, UV–Vis–NIR absorbance spectroscopy, especially NIR, is a suitable tool for monitoring the appropriate material composition of turbid PU thermosets in line.
The chemical recycling of used motor oil via catalytic cracking to convert it into secondary diesel-like fuels is a sustainable and technically attractive solution for managing the environmental concerns associated with traditional disposal. In this context, this study was conducted to screen basic and acidic aluminum silicate catalysts doped with different metals, including Mg, Zn, Cu, and Ni. The catalysts were thoroughly characterized using various techniques such as N2 adsorption–desorption isotherms, FT-IR spectroscopy, and TG analysis. The liquid and gaseous products were identified using GC, and their characteristics were compared with acceptable ranges from ASTM characterization methods for diesel fuel. The results showed that metal doping improved the performance of the catalysts, resulting in higher conversion rates of up to 65%, compared to thermal cracking (15%) and undoped aluminum silicates (≈20%). Among all catalysts, basic aluminum silicates doped with Ni showed the best catalytic performance, with conversions and yields three times higher than aluminum silicate catalysts. These findings significantly contribute to developing efficient and eco-friendly processes for the chemical recycling of used motor oil. This study highlights the potential of basic aluminum silicates doped with Ni as a promising catalyst for catalytic cracking and encourages further research in this area.
Artificial intelligence is considered to be a significant technology for driving the future evolution of smart manufacturing environments. At the same time, automated guided vehicles (AGVs) play an essential role in manufacturing systems due to their potential to improve internal logistics by increasing production flexibility. Thereby, the productivity of the entire system relies on the quality of the schedule, which can achieve production cost savings by minimizing delays and the total makespan. However, traditional scheduling algorithms often have difficulties in adapting to changing environmental conditions, and the performance of a selected algorithm depends on the individual scheduling problem. Therefore, this paper aimed to analyze the scheduling problem classes of AGVs by applying design science research to develop an algorithm selection approach. The designed artifact addressed a catalogue of characteristics that used several machine learning algorithms to find the optimal solution strategy for the intended scheduling problem. The contribution of this paper is the creation of an algorithm selection method that automatically selects a scheduling algorithm, depending on the problem class and the algorithm space. In this way, production efficiency can be increased by dynamically adapting the AGV schedules. A computational study with benchmark instances from the literature demonstrated the successful implementation of constraint programming solvers for solving JSSP and FJSSP scheduling problems and of machine learning algorithms for predicting the most promising solver. The performance of the solvers strongly depended on the given problem class and the problem instance. Consequently, the overall production performance increased by selecting the algorithms per instance. A field experiment in the learning factory at Reutlingen University enabled the validation of the approach within a running production scenario.
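The selection step, predicting the most promising solver from instance features, can be sketched with a nearest-neighbour stand-in for the paper's trained models. Features, labels, and training points below are invented for illustration:

```python
# Hypothetical features: (number of jobs, number of machines, flexibility ratio)
TRAINING = [
    ((10, 5, 0.0), "cp_jssp"),
    ((20, 10, 0.0), "cp_jssp"),
    ((10, 5, 0.6), "cp_fjssp"),
    ((15, 8, 0.8), "cp_fjssp"),
]

def select_solver(features, training=TRAINING):
    """1-nearest-neighbour stand-in for the trained selection model:
    return the solver label of the closest known instance."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(training, key=lambda sample: sq_dist(sample[0], features))[1]

print(select_solver((12, 6, 0.7)))
```

In the paper's setting, each new AGV scheduling instance would be featurized the same way and dispatched to the predicted constraint programming solver.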
Project managers still face management problems in interorganizational research and development (R&D) projects due to their limited authority. Fostering a project culture that is conducive to cooperation and innovation in interorganizational R&D project management demands the commitment of individual project members and thus balances this limited authority. However, the relational collaboration level at which project culture manifests itself is not addressed by current project management approaches, or is addressed only at a late stage. Consequently, project culture develops within a predefined framework of project organization and organized contents and is not actively shaped. A shift of focus towards project culture therefore becomes necessary; this can be achieved through project-culture-aware management. The method CLIPS actively supports interorganizational project members in this kind of management. It should be integrable into common project management approaches so that, with its application, all collaboration levels are addressed in interorganizational R&D project management. The goal of this paper is to demonstrate the integrability of the method CLIPS and to show how it can be integrated into common project management approaches. This enriches interorganizational R&D project management with a project culture focus.
The relevance of Robotic Process Automation (RPA) has increased over the last few years. Combining RPA with Artificial Intelligence (AI) can further enhance the business value of the technology. The aim of this research was to analyze applications, terminology, benefits, and challenges of combining the two technologies. A total of 60 articles were analyzed in a systematic literature review to evaluate the aforementioned areas. The results show that by adding AI, RPA applications can be used in more complex contexts, it is possible to minimize the human factor during the development process, and AI-based decision-making can be integrated into RPA routines. This paper also presents a current overview of the terminology in use. Moreover, it shows that integrating AI can introduce previously unseen challenges into RPA projects, but also brings many new benefits. Based on the outcome, it is concluded that the topic offers a lot of potential, but further research and development is required. The results of this study help researchers to gain an overview of the state of the art in combining RPA and AI.
Mobile assistance systems (MAS) promise to overcome personnel shortages in operating theatres worldwide. A literature review inspired by the PRISMA 2020 method determines the state of the art of MAS, and identifies a lack of application areas for MAS in the operating theatre. Interviews with subject-matter experts aim to investigate application areas for MAS. The results show that most operational tasks refer to material management and patient management. MAS, with their potential to reduce the time needed for material and patient management, and the physical and mental strain of patient management, have great potential in the operating theatre.
Healthy sleep is one of the prerequisites for a good human body and brain condition, including general well-being. Unfortunately, there are several sleep disorders that can negatively affect this. One of the most common is sleep apnoea, in which breathing is impaired. Studies have shown that this disorder often remains undiagnosed. To avoid this, developing a system that can be widely used in a home environment to detect apnoea and monitor the changes once therapy has been initiated is essential. The conceptualisation of such a system is the main aim of this research. After a thorough analysis of the available literature and state of the art in this area of knowledge, a concept of the system was created, which includes the following main components: data acquisition (including two parts), storage of the data, apnoea detection algorithm, user and device management, data visualisation. The modules are interchangeable, and interfaces have been defined for data transfer, most of which operate using the MQTT protocol. System diagrams and detailed component descriptions, including signal requirements and visualisation mockups, have also been developed. The system's design includes the necessary concepts for the implementation and can be realised in a prototype in the next phase.
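Since most of the interfaces operate over MQTT, wiring the interchangeable modules together largely comes down to topic design. A sketch of MQTT's subscription wildcard rules, with hypothetical topic names for the apnoea system (not taken from the concept itself):

```python
def topic_matches(pattern, topic):
    """Check an MQTT topic against a subscription pattern with + and # wildcards."""
    p_segs, t_segs = pattern.split("/"), topic.split("/")
    for i, seg in enumerate(p_segs):
        if seg == "#":                      # multi-level wildcard: matches the rest
            return True
        if i >= len(t_segs):
            return False
        if seg not in ("+", t_segs[i]):     # '+' matches exactly one topic level
            return False
    return len(p_segs) == len(t_segs)

# e.g. the detection module subscribing to SpO2 data from every device
print(topic_matches("apnea/+/spo2", "apnea/device1/spo2"))
```

With such patterns, the visualisation or detection component can subscribe to `apnea/#` without knowing the device set in advance, which supports the interchangeability of the modules.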