Evaluation of a contactless accelerometer sensor system for heart rate monitoring during sleep
(2024)
The monitoring of a patient's heart rate (HR) is critical in the diagnosis of diseases and also plays an important role in the detection of sleep disorders. Several techniques have been proposed, including the use of sensors to record physiological signals that are automatically examined and analysed. This work aims to evaluate the use of a contactless HR monitoring system based on an accelerometer sensor during sleep. For this purpose, the oscillations caused by chest movements during heart contractions are recorded by an installation mounted under the bed mattress. The processing algorithm presented in this paper filters the signals and determines the HR. An average error of about 5 bpm has been documented, i.e., the system can be considered suitable for the intended application domain.
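The filtering-and-counting pipeline the abstract describes might be sketched as follows; the smoothing window, threshold, and synthetic test signal are illustrative assumptions, not details from the paper.

```python
def moving_average(signal, window):
    # Centered moving average as a crude low-pass filter.
    half = window // 2
    out = []
    for i in range(len(signal)):
        chunk = signal[max(0, i - half):i + half + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def estimate_heart_rate(signal, fs, threshold):
    # Smooth the raw accelerometer trace, then count upward
    # threshold crossings and convert the beat count to bpm.
    smoothed = moving_average(signal, window=5)
    beats, above = 0, False
    for s in smoothed:
        if s > threshold and not above:
            beats, above = beats + 1, True
        elif s <= threshold:
            above = False
    duration_min = len(signal) / fs / 60.0
    return beats / duration_min
```

On a synthetic 100 Hz trace with one pulse per second, this yields 60 bpm, matching the idea of counting cardiac oscillations over the recording duration.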
Business Process Management (BPM) is a central component for process-oriented companies because of its importance and the resulting requirements regarding internal operational organization and audits. However, introducing and maintaining BPM involves considerable effort, since processes must be captured, modeled, and kept up to date. Empirical evidence shows that successful process modeling is a particular challenge that often does not succeed in a satisfactorily sustainable way. A key success factor for sustainable process orientation in companies is therefore consistent and up-to-date process modeling, as well as its adaptation to external and internal changes. By means of a literature review, the relevant dimensions of sustainable process orientation based on process modeling are identified. On this basis, an adaptive, action-oriented framework for practical application in companies is derived.
Menopause is the permanent cessation of menstruation occurring naturally in women's aging. The most frequent symptoms associated with menopausal phases are mucosal dryness, increased weight and body fat, and changes in sleep patterns. Oral symptoms in menopause derived from saliva flow reduction can lead to dry mouth, ulcers, and alterations of taste and swallowing patterns. However, the oral health phenotype of postmenopausal women has not been characterized. The aim of the study was to determine postmenopausal women's oral phenotype, including medical history, lifestyle, and oral assessment, through artificial intelligence algorithms. One hundred postmenopausal women attending the Dental School of the University of Seville were enrolled in the study. We collected an extensive questionnaire covering lifestyle, medication, and medical history. We used an unsupervised k-means algorithm to cluster the data following standard features for data analysis. Our results showed that the main oral symptoms in our postmenopausal cohort were reduced salivary flow and periodontal disease. Relying on a classical assessment of the collected data alone might yield a biased evaluation of postmenopausal women. We therefore used artificial intelligence analysis to evaluate our data, obtaining the main features and providing a reduced feature set that defines the oral health phenotype. We found 6 clusters with similar features, with medication affecting salivation and smoking as essential features distinguishing the phenotypes. Thus, with an integrative approach, we could obtain the main features characterizing differential oral health phenotypes of postmenopausal women, providing new tools to assess women in the dental clinic.
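The unsupervised k-means clustering step could look like this minimal 1-D sketch; the data values and the number of clusters are illustrative, not the study's actual questionnaire features.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    # Minimal 1-D k-means: returns final centroids and cluster labels.
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k), key=lambda c: abs(p - centroids[c]))
            clusters[idx].append(p)
        # Recompute each centroid as its cluster mean (keep old one if empty).
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    labels = [min(range(k), key=lambda c: abs(p - centroids[c])) for p in points]
    return centroids, labels
```

With two well-separated groups of values, the algorithm converges to the two group means and assigns each point to its group.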
Acting like a startup - using corporate startup structures to manage the digital transformation
(2023)
Digital transformation is proving to be a significant challenge for firms when it comes to maintaining their market position. It is evident that many companies are struggling to find their particular way through this transformation. A corporate startup structure is one way to find a suitable solution quickly. Therefore, we present a model for corporate startup activities, which we instantiate in an appropriate tool to support the management of corporate startups by their parent firms. We have derived the first requirements and design principles from a comprehensive problem analysis and literature study. In addition, we present a first artifact, which realizes the design principles by implementing a practical tool. A cooperation with an automotive firm has enabled us to gain access to real-world data for the design and evaluation of the artifact.
Framework for integrating intelligent product structures into a flexible manufacturing system
(2023)
Increasing individualisation of products with a high variety and shorter product lifecycles result in smaller lot sizes, increasing order numbers, and rising data and information processing for manufacturing companies. To cope with these trends, integrated management of the products and manufacturing information is necessary through a “product-driven” manufacturing system. Intelligent products that are integrated as an active element within the controlling and planning of the manufacturing process can represent flexibility advantages for the system. However, there are still challenges regarding system integration and evaluation of product intelligence structures. In light of these trends, this paper proposes a conceptual framework for defining, analysing, and evaluating intelligent products using the example of an assembly system. This paper begins with a classification of the existing problems in the assembly and a definition of the intelligence level. In contrast to previous approaches, the analysis of products is expanded to five dimensions. Based on this, a structured evaluation method for a use case is presented. The structure of solving the assembly problem is provided by the use case-specific ontology model. Results are presented in terms of an assignment of different application areas, linking the problem with the target intelligence class and, depending on the intelligence class of the product, suggesting requirements for implementation. The conceptual framework is evaluated by utilising a case study in a learning factory. Here, the model-mix assembly is controlled actively by the workpiece carrier in terms of transferring the variant-specific work instructions to the operator and the collaborative robot (cobot) at the workstations. The resulting system thus enables better exploitation of the potentials through less frequent errors and shorter search times. Such an implementation has demonstrated that the intelligent workpiece carrier represents an additional part for realising a cyber-physical production system (CPPS).
The strong demand to transform the textile and fashion industry towards sustainability requires continuous implementation of the Education for Sustainable Development (ESD) mission statement in education and industry. To achieve this goal, the European research project "Fashion DIET - Sustainable Fashion Curriculum at Textile Universities in Europe. Development, Implementation and Evaluation of a Teaching Module for Educators", co-funded by the Erasmus+ programme of the European Union (2020-1-DE01-KA203-005657), aims to create an ESD module for university lecturers and research-based teaching and learning materials delivered through an e-learning portal. First, an online questionnaire was rolled out to assess university faculty attitudes toward and needs for ESD content and methods. The feedback questionnaire enabled the selection of the most relevant data for the elaboration of an action and research-oriented professional development module for ESD in textile education, which will be accessible through an information & e-learning portal. The e-learning portal can be used as a web-based tool to apply and evaluate the project outcomes, e.g. the further education module and the teaching and learning materials for educators, such as manuals, broadcasts and the provision of interactive and physical materials. It thus ensures that the teaching materials can be used sustainably in the classroom. It also provides country-specific data for the fashion and textile industry and its market, taking into account the different perspectives of universities and schools. In any case, the portal represents (1) the web-based platform to support the dissemination of ESD as a guiding principle and (2) a central contact point for the target group to obtain relevant information on ESD. Fashion DIET explores the use of e-learning to improve teaching and learning on ESD, by training educators and empowering them as multipliers for a sustainable textile and fashion industry. 
At a higher level, the European project strengthens the quality and relevance of learning provision in education towards the latest developments in textile research and innovation in terms of a more sustainable fashion.
Because of high product and technology complexity, companies involve external partners in their research and development (R&D) processes. The result is interorganizational projects, which represent temporary organizations in which heterogeneous organizations work closely together. Since project work is always teamwork, these projects face, due to their characteristics, major challenges at the organizational, relational, and content-related levels of collaboration. Thus, this paper raises the following research question: “How can a project team be supported on an organizational, relational, and content-related level in an interorganizational new product development setting?” To answer this research question, an explorative expert study was set up with two digital workshops using the interactive presentation tool Mentimeter. The results show that a cooperative innovation culture could support project teams on an organizational and relational level in minimizing predominant problems in the future. Moreover, it supports project teams, for example, in functional communication. Furthermore, 18 values of a cooperative innovation culture emerge, for example openness and transparency, risk and failure tolerance, and respect. On a content-related level, the results show that an adaptable tool which promotes creativity and collaboration methods, as well as content-related input support, could be beneficial for problem-solving in an interorganizational new product development setting, because such a tool can guide product developers through the process with suitable creativity and collaboration methods, give content-related input, and enable interactive interchange on a table-top. Future research could focus mainly on the connection between the cooperative innovation culture and the tool, since these potentially influence each other.
In a recently developed study programme at Reutlingen University, which focuses on practical orientation, an innovative product with solid company references is to be defined and realised by student teams. On the basis of this product, all subjects of the business engineering study programme “Sustainable Production and Business” are taught. By focusing on three main paths of future skills, developed by NextSkills to analyse upcoming social changes, global challenges, and fields of work that are innovation-driven and agile, the new study programme aims to create responsible leaders who will shape global businesses respectfully. Different TRIZ tools thereby help students to develop their own products with a focus on sustainability and contribute to the enhancement of future skills. Further, students get to know TRIZ tools in an unbiased way, unburdened by too much theory, and are thus continuously supported in the ongoing product development process that accompanies their studies. Hence, students perceive TRIZ as a method to develop sustainable products on the one hand and to find sustainable solutions for everyday problems on the other. The knowledge and positive experiences gained in this way should then arouse curiosity for the TRIZ class at the end of the study programme, where students can graduate with a TRIZ Level 1 certificate. In this way, as many students as possible are introduced to the TRIZ methods, and TRIZ is spread widely.
There are indicators that we are entering a new era of MTM research, moving beyond the structural approach that has characterized MTM research to date to focus on important and under-researched issues, such as the nature of employees’ experiences in an MTM context. Although team research suggests that the experiences of members impact team functioning, these lines of reasoning have not, until recently, made their way into MTM research. To overcome this limitation, this symposium showcases five papers that use a variety of theoretical perspectives, research designs (i.e., qualitative, quantitative), contexts (e.g., healthcare, automotive manufacturing, online panels), methodologies, and analytical methods (i.e., meta-analysis, content/thematic analysis). The symposium focuses on surfacing and advancing unanswered questions that extend theory and can offer fruitful directions for MTM research by examining critical individual- and team-level outcomes (e.g., individual/team performance, individual counterproductive and organizational citizenship behavior, individual learning, individual turnover intentions, organizational commitment) in the experiences of MTM employees across their teams (e.g., goals, functions, roles). We hope to provide a forum to advance unanswered questions that offer fruitful directions for MTM research.
Application systems often need to be deployed in different variants if requirements that influence their implementation, hosting, and configuration differ between customers. Therefore, deployment technologies, such as Ansible or Terraform, support a certain degree of variability modeling. However, modern application systems typically consist of various software components deployed using multiple deployment technologies, each supporting only its proprietary, non-interoperable variability modeling concepts. The Variable Deployment Metamodel (VDMM) manages the deployment variability across heterogeneous deployment technologies based on a single variable deployment model. However, VDMM currently only supports modeling conditional components and their relations, which is sometimes too coarse-grained, since it requires modeling entire components, including their implementation and deployment configuration, for each different component variant. Therefore, we extend VDMM by a more fine-grained approach for managing the variability of component implementations and their deployment configurations, e.g., if a cheap version of a SaaS deployment provides only a community edition of the software and not the enterprise edition, which has additional analytical reporting functionalities built in. We show that our extended VDMM can be used to realize variable deployments across different individual deployment technologies using a case study and our prototype OpenTOSCA Vintner.
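A much-simplified view of resolving conditional component variants against a deployment context, in the spirit of the community/enterprise-edition example; the `resolve_variants` function and its data shapes are assumptions made for illustration, not VDMM's actual API.

```python
def resolve_variants(components, context):
    # For every component, pick the single variant whose condition
    # holds in the given deployment context.
    resolved = {}
    for name, variants in components.items():
        active = [v for v, cond in variants.items() if cond(context)]
        if len(active) != 1:
            raise ValueError(f"{name}: expected exactly one active variant, got {active}")
        resolved[name] = active[0]
    return resolved
```

For a context `{"tier": "cheap"}`, a component offering a community and an enterprise variant resolves to the community edition.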
With increasing dynamics in the research environment – the digitalization of product development – the technical requirements for future decision-making processes grow alongside the complexity. The introduction of new IT systems for automating decisions entails adaptations to companies' current business processes. For a successful implementation of new IT information tools, possible effects on existing user systems must be examined in advance. New technologies, AI information systems, and new knowledge in general often arise in science through the interpretation and synthesis of existing knowledge. For this reason, the quality of literature analyses is becoming increasingly relevant in engineering and computer science. Along with the number of publications, the effort required for a structured literature analysis (SLA) is also growing. In this paper, the authors present the research process and the results of an SLA. This work aims to determine the current state of research on decision support in product development at small and medium-sized enterprises as well as large companies in the automotive industry and, after analysis and evaluation, to identify possible research gaps concerning automated decision support systems (aEUS).
In the era of digital transformation, the notion of software quality transcends its traditional boundaries, necessitating an expansion to encompass value creation for customers and the business. Merely optimizing technical aspects of software quality can result in diminishing returns. Product discovery techniques can be seen as a powerful mechanism for crafting products that align with this expanded concept of quality, one that incorporates value creation. Previous research has shown that companies struggle to determine appropriate product discovery techniques for generating, validating, and prioritizing ideas for new products or features to ensure they meet the needs and desires of the customers and the business. For this reason, we conducted a grey literature review to identify various techniques for product discovery. First, the article provides an overview of different techniques and assesses how frequently they are mentioned in the literature. Second, we mapped these techniques to an existing product discovery process from previous research to provide concrete guidelines for organizations establishing product discovery. The analysis shows, among other things, the increasing importance of techniques for structuring the problem exploration process and the product strategy process. The results are interpreted with regard to the importance of the techniques for practical applications and recognizable trends.
Gamification has been increasingly applied to software engineering education in the past. The approaches vary from applying game elements at a conceptual level in the course to using specific tools to better engage the students and support their learning goals. However, existing tools usually offer game elements, such as quizzes or challenges, but do not provide a more computer-game-like experience. Therefore, we aim to raise the gamified learning experience to another level by proposing Gamify-IT, a Unity- and web-based game platform intended to help students learn software engineering. It follows the characteristics of an immersive role-playing game in which students explore a world, find and solve minigames, and clear dungeons with SE tasks. Lecturers can configure the worlds, e.g., to add content hints. Furthermore, they can add and configure minigames and dungeons to include exercises in a fully gamified way. Thereby, they customize their course in Gamify-IT to adapt the world very precisely to other materials such as lectures or exercises. Results of an evaluation of our initial prototype show that (i) students like to engage with the platform, (ii) students are motivated to learn when using Gamify-IT, and (iii) the minigames support students in understanding the learning objectives.
Impact of a large distribution network on radiation characteristics of planar spiral antenna arrays
(2023)
Designing antenna arrays with a central feed point has gained ground in antenna engineering. This approach, which is usually chosen to reduce manufacturing costs, is difficult to realize and leads to a large feeding network, the impact of which is numerically investigated in the present work. Upon comparing three different antennas, it is shown that the enlargement of the feed strongly affects the antenna's overall dimensions and its radiation characteristics. The antenna with the plug-in solution is not only smaller but also performs better than antennas with a central feed point. Considering the high effort involved in designing a feed network with a central point and the influence of the resulting enlarged network on the dimensions and radiation characteristics of the antenna, the cost saving in production can be put into perspective.
Advancing mental health diagnostics: AI-based method for depression detection in patient interviews
(2023)
In this paper, we present a novel artificial intelligence (AI) application for depression detection, using advanced transformer networks to analyse clinical interviews. By incorporating simulated data to enhance traditional datasets, we overcome limitations in data protection and privacy, consequently improving the model’s performance. Our methodology employs BERT-based models, GPT-3.5, and ChatGPT-4, demonstrating state-of-the-art results in detecting depression from linguistic patterns and contextual information that significantly outperform previous approaches. Utilising the DAIC-WOZ and Extended-DAIC datasets, our study showcases the potential of the proposed application in revolutionising mental health care through early depression detection and intervention. Empirical results from various experiments highlight the efficacy of our approach and its suitability for real-world implementation. Furthermore, we acknowledge the ethical, legal, and social implications of AI in mental health diagnostics. Ultimately, our study underscores the transformative potential of AI in mental health diagnostics, paving the way for innovative solutions that can facilitate early intervention and improve patient outcomes.
This research evaluates current measurement scales for ambidexterity and proposes a new approach for the measurement of this important construct. We argue that current measurement approaches may be unsuitable to capture the concept of ambidexterity. Through a systematic scale development process, we derive a measurement scale with dual items that simultaneously refer to both dimensions, exploitation and exploration, thus reflecting the true nature of ambidexterity. An extensive pre-test with 39 executives suggests that our scale is suitable for capturing ambidexterity. Our measurement model enhances conceptual clarity of ambidexterity and can serve as a base for future investigations of the concept.
Human pose estimation (HPE) is integral to scene understanding in numerous safety-critical domains involving human-machine interaction, such as autonomous driving or semi-automated work environments. Avoiding costly mistakes is synonymous with anticipating failure in model predictions, which necessitates meta-judgments on the accuracy of the applied models. Here, we propose a straightforward human pose regression framework to examine the behavior of two established methods for simultaneous aleatoric and epistemic uncertainty estimation: maximum a-posteriori (MAP) estimation with Monte-Carlo variational inference and deep evidential regression (DER). First, we evaluate both approaches on the quality of their predicted variances and whether these truly capture the expected model error. The initial assessment indicates that both methods exhibit the overconfidence issue common in deep probabilistic models. This observation motivates our implementation of an additional recalibration step to extract reliable confidence intervals. We then take a closer look at deep evidential regression, which, to our knowledge, is applied comprehensively for the first time to the HPE problem. Experimental results indicate that DER behaves as expected in challenging and adverse conditions commonly occurring in HPE and that the predicted uncertainties match their purported aleatoric and epistemic sources. Notably, DER achieves smooth uncertainty estimates without the need for a costly sampling step, making it an attractive candidate for uncertainty estimation on resource-limited platforms.
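The recalibration step mentioned above could be sketched as a simple conformal-style scaling of predicted standard deviations on held-out data; this is an illustrative assumption, not the paper's exact procedure.

```python
def recalibration_factor(errors, sigmas, target_coverage=0.9):
    # Conformal-style scale: choose s so that |error| <= s * sigma
    # holds for roughly the target fraction of held-out samples.
    # Recalibrated intervals are then +/- s * sigma.
    ratios = sorted(abs(e) / s for e, s in zip(errors, sigmas))
    idx = min(len(ratios) - 1, int(target_coverage * len(ratios)))
    return ratios[idx]
```

An overconfident model (errors systematically larger than its predicted sigmas) yields a factor greater than 1, widening the intervals until the requested empirical coverage is reached.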
Analog integrated circuit sizing still relies heavily on human expert knowledge, as previous automation approaches have not found widespread acceptance in industry. One strand, optimization-based automation, is often discarded due to inflated constraining setups, infeasible results, or excessive run times. To address these deficits, this work proposes an alternative optimization flow that incorporates a designer's intuition for feasible design spaces through the integration of expert knowledge based on the gm/ID method. Moreover, the extensive run times of simulation-based optimization flows are overcome by incorporating computationally efficient machine learning methods. Neural network surrogate models predicting eleven performance parameters increase the evaluation speed by 3 400× on average compared to a simulator. Additionally, they enable the use of optimization algorithms that depend on automatic differentiation and would otherwise be unavailable in this field. First, an up to 4× more efficient way of sampling training data based on the aforementioned space is detailed. After presenting the architecture and training effort of the surrogate models, they are employed as part of the objective function for sizing three operational amplifiers with three different optimization algorithms. Additionally, the benefits of using the gm/ID method become evident when considering technology migration, as previously found solutions may be reused for other technologies.
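The gm/ID method sizes devices from pre-characterized lookup tables; a minimal sketch, assuming an illustrative table of current density ID/W versus gm/ID (the values are invented, not from any real technology):

```python
def interp(x, xs, ys):
    # Piecewise-linear interpolation over ascending breakpoints xs.
    if x <= xs[0]:
        return ys[0]
    if x >= xs[-1]:
        return ys[-1]
    for i in range(1, len(xs)):
        if x <= xs[i]:
            t = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
            return ys[i - 1] + t * (ys[i] - ys[i - 1])

def width_from_gmid(gm_id, i_d, gmid_pts, id_per_w_pts):
    # Look up the current density ID/W at the chosen gm/ID operating
    # point, then size the device: W = ID / (ID/W).
    return i_d / interp(gm_id, gmid_pts, id_per_w_pts)
```

Because the table characterizes a technology rather than a circuit, re-extracting it for a new process is what makes the technology migration mentioned above cheap.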
Facing ever-looming climate change, studying the drivers for individuals' Information Systems (IS) Use to reduce environmental harm gains momentum. While extant research on the antecedents of sustainable IS Use has focused on specific theories, interventions, contexts, and technologies, a holistic understanding has become increasingly elusive, with a synthesis remaining absent. We employ a systematic literature review methodology to shed light on the driving antecedents for sustainable IS Use among individual consumers. Our results build on findings of 29 empirical studies drawn from 598 articles retrieved from our premier outlets and a forward/backward search. The analysis reveals six salient complementary antecedents: Relief, Empowerment, Default, User-centricity, Salience, and Encouragement. We recommend considering these concepts when developing, deploying, promoting, or regulating digital technologies to mitigate individual consumers' emissions. Along with memorable and implementable concepts, our theoretical framework offers a novel conceptualization and four promising avenues for researchers on sustainable IS Use.
Smart cities can be considered data factories that generate an enormous amount of data from various sources. In fact, data is the backbone of any smart service. Therefore, the strategically beneficial handling of this digital capital is crucial for cities. Some smart city pioneers have already written down their approach to data in the form of data strategies, but what should a city's data strategy include, and how can the goals and measures defined in the strategies be operationalized? This paper addresses these questions by looking closely at the data strategies of cities in Germany and the top three countries in the EU Digital Economy and Society Index. The in-depth analysis of 8 city data strategies has yielded 11 dimensions that cities should consider in their data strategy: relevance of data, principles, methods, data sharing, technology, data culture, data ethics, organizational structure, data security and privacy, collaborations, and data literacy. In addition, data governance is a concept for putting these 11 strategic dimensions into practice through standardization measures, training programs, and defining roles and responsibilities, e.g., by developing a data catalog.
The proliferation of smart technologies transforms the way individual consumers perform tasks. Considerable research suggests that smart technologies are often related to domestic energy consumption. However, it remains unclear how such technologies transform tasks and thereby impact our planet. We explore the role of technological smartness in personal day-to-day tasks that help create a more sustainable future. In the absence of theory, but facing extensive changes in everyday life enabled by smart technologies, we draw on phenomenon-based theorizing (PBT) guidelines. As an anchor, we refer to task endogeneity related to task-technology fit theory (TTF). As an infusion, we employ theory on public goods. Our model proposes novel relations between the concepts of smart autonomy and smart transparency and sustainable task outcomes, mediated by task convenience and task significance. We discuss implications, limitations, and future research opportunities.
Most question-answering (QA) systems rely on training data to reach their optimal performance. However, acquiring training data for supervised systems is both time-consuming and resource-intensive. To address this, we propose TFCSG, an unsupervised similar-question retrieval approach that leverages pre-trained language models and multi-task learning. First, topic keywords in question sentences are extracted sequentially based on a latent topic-filtering algorithm to construct an unsupervised training corpus. Then, multi-task learning is used to build the question retrieval model. Three tasks are designed: the first is a short-sentence contrastive learning task; the second judges the similarity between a question sentence and its corresponding topic sequence; the third generates the corresponding topic sequence from a question sentence. The three tasks are used to train the language model in parallel. Finally, similar questions are obtained by calculating the cosine similarity between sentence vectors. Comparison experiments on public question datasets show that TFCSG outperforms the unsupervised baseline methods, and no manual labelling is needed, which greatly saves human resources.
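The final retrieval step, ranking candidate questions by cosine similarity of sentence vectors, can be sketched directly; the toy 2-D vectors below stand in for real sentence embeddings.

```python
import math

def cosine(u, v):
    # Cosine similarity of two dense vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def retrieve_similar(query_vec, corpus_vecs, top_k=2):
    # Rank corpus sentence vectors by cosine similarity to the query
    # and return the indices of the top_k most similar ones.
    order = sorted(range(len(corpus_vecs)),
                   key=lambda i: cosine(query_vec, corpus_vecs[i]),
                   reverse=True)
    return order[:top_k]
```

In practice the vectors would come from the multi-task-trained encoder; only the ranking logic is shown here.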
The market for indoor positioning systems for a variety of applications has grown strongly in recent years. A wide range of systems is available, varying considerably in terms of accuracy, price and technology used. The suitability of the systems is highly dependent on the intended application. This paper presents a concept to use a single low-cost PTZ camera in combination with fiducial markers for indoor position and orientation determination. The intended use case is to capture a plant layout consisting of position, orientation and unique identity of individual facilities. Important factors to consider for the selection of a camera have been identified and the transformation of the marker pose in camera coordinates into a selectable plant coordinate system is described. The concept is illustrated by an exemplary practical implementation and its results.
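The described transformation of a marker pose from camera coordinates into a selectable plant coordinate system reduces, in the planar case, to composing 2-D rigid transforms; this sketch assumes poses given as (x, y, theta) tuples, which is an illustrative simplification of the full camera model.

```python
import math

def camera_to_plant(marker_in_cam, cam_in_plant):
    # Compose 2-D rigid transforms: the camera's pose in plant
    # coordinates followed by the marker pose measured in camera
    # coordinates yields the marker pose in plant coordinates.
    cx, cy, ct = cam_in_plant
    mx, my, mt = marker_in_cam
    x = cx + mx * math.cos(ct) - my * math.sin(ct)
    y = cy + mx * math.sin(ct) + my * math.cos(ct)
    return x, y, (ct + mt) % (2 * math.pi)
```

A camera at (10, 5) rotated 90° that sees a marker 2 m straight ahead places that facility at (10, 7) in plant coordinates, with its orientation rotated accordingly.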
Measuring cardiorespiratory parameters during sleep using non-contact sensors and the ballistocardiography technique has received much attention because the method is low-cost, unobtrusive, and non-invasive. Designing a system that is user-friendly, simple to use, easy to deploy, and less error-prone remains an open challenge due to the complex morphology of the signal. In this work, using four force-sensitive resistor sensors, we conducted a study with four sensor distributions in order to simplify the complexity of the system by identifying the region of interest for heartbeat and respiration measurement. The sensors are deployed under the mattress and attached to the bed frame without any interference with the subjects. The four distributions comprise two linear horizontal, one linear vertical, and one square arrangement, covering the region that influences cardiorespiratory activities. We recruited 4 subjects and acquired data in four regular sleeping positions, each for a duration of 80 seconds. The signal processing was performed using the discrete wavelet transform (bior3.9) with a smoothing level of 4 as well as bandpass filtering. The results indicate that we achieved a mean absolute error of 2.35 and 4.34 for respiration and heartbeat, respectively. The results suggest the efficiency of a triangle-shaped structure of three sensors for measuring heartbeat and respiration parameters in all four regular sleeping positions.
Large critical systems, such as those created in the space domain, are usually developed by a large number of organizations and, furthermore, they have to comply with standards. Yet, the different stakeholders often do not have a common understanding of the needed quality of requirements specifications. Achieving such a common understanding is a laborious process that is currently not sufficiently supported. Moreover, such a common understanding must be aligned with the standards. In this paper, we present an approach that can be used to align the different stakeholder perceptions regarding the quality of requirements specifications. Existing quality models for requirements specifications are analyzed for equivalences, and transferred into a common representation, the so-called Aligned Quality Map (AQM). Furthermore, a process is defined that supports the alignment of different stakeholder perspectives with regard to the quality of requirements specifications using AQM, which is validated in a case study in the context of European space projects. AQM has been created and populated with an initial set of quality models. It is designed in such a way that it can be extended to include further quality models. The case study has shown that an alignment of different stakeholder perspectives and the quality model of the European Cooperation for Space Standardization using AQM is feasible. The approach allows for aligning different stakeholder perspectives for a common understanding of the quality of requirements specifications in the context of standards. Furthermore, AQM supports the assessment of requirements specifications.
The efficient production and utilization of green hydrogen is vital to succeed in the global pursuit of a sustainable future. To provide the necessary amount of green hydrogen, a large number of electrolyzers will be connected to the grid as decentralized power consumers. A large number of decentralized renewable power sources will provide the energy. In such a system, a control method is necessary to dispatch the available power most efficiently. In particular, the shutdown of renewable energy sources due to temporary overproduction must be avoided. This paper presents a decentralized tertiary control algorithm that provides a new decentralized control approach, thus creating a flexible, robust and easily scalable system. The operation of each grid participant within this grid-connected microgrid is optimized for maximum financial profit, while minimizing the exchange of power with the mains grid and reducing the shutdown of renewable power sources.
In the context of digital transformation, having a data-driven organizational culture has been recognized as an important factor for data analytics capabilities, innovativeness and competitive advantage of firms. However, the current literature on data-driven culture (DDC) is fragmented, lacking both a synthesis of findings and a theoretical foundation. Therefore, the aim of this work has been to develop a comprehensive framework for understanding DDC and the mechanisms that can be used to embed such a culture in organizations, as well as to structure prior dispersed findings on the topic. Based on the foundation of organizational culture theory, we employed a Design Science Research (DSR) approach using a systematic literature review and expert interviews to build and evaluate a transformation-oriented framework. This research contributes to knowledge by synthesizing previously dispersed knowledge in a holistic framework, as well as by providing a conceptual framework to guide the transformation towards a DDC.
The performance and scalability of modern data-intensive systems are limited by massive data movement of growing datasets across the whole memory hierarchy to the CPUs. Such traditional processor-centric DBMS architectures are bandwidth- and latency-bound. Processing-in-Memory (PIM) designs seek to overcome these limitations by integrating memory and processing functionality on the same chip. PIM targets near- or in-memory data processing, leveraging the greater in-situ parallelism and bandwidth.
In this paper, we introduce pimDB and provide an initial comparison of processor-centric and PIM-DBMS approaches under different aspects, such as scalability and parallelism, cache-awareness, or PIM-specific compute/bandwidth tradeoffs. The evaluation is performed end-to-end on a real PIM hardware system from UPMEM.
Software development teams have to face stress caused by deadlines, staff turnover, or individual differences in commitment, expertise, and time zones. While students are typically taught the theory of software project management, their exposure to such stress factors is usually limited. However, preparing students for the stress they will have to endure once they work in project teams is important for their own sake, as well as for the sake of team performance in the face of stress. Team performance has been linked to the diversity of software development teams, but little is known about how diversity influences the stress experienced in teams. In order to shed light on this aspect, we provided students with the opportunity to self-experience the basics of project management in self-organizing teams, and studied the impact of six diversity dimensions on team performance, coping with stressors, and positive perceived learning effects. Three controlled experiments at two universities with a total of 65 participants suggest that social background has the strongest impact on the perceived stressors, while age and work experience have the highest impact on the perceived learning effects. Most diversity dimensions have a medium correlation with the quality of work, yet no significant relation to team performance. This lays the foundation to improve students’ training for software engineering teamwork based on their diversity-related needs and to create diversity-sensitive awareness among educators, employers and researchers.
For large-scale processes as implemented in organizations that develop software in regulated domains, comprehensive software process models are implemented, e.g., for compliance requirements. Creating and evolving such processes is demanding and requires software engineers to have substantial modeling skills to create consistent and certifiable processes. While teaching process engineering to students, we observed issues in providing and explaining models. In this paper, we present an exploratory study in which we aim to shed light on the challenges students face when it comes to modeling. Our findings show that students are capable of doing basic modeling tasks, yet fail to utilize models correctly. We conclude that the required skills, notably abstraction and solution development, are underdeveloped due to missing practice and routine. Since modeling is key to many software engineering disciplines, we advocate for intensifying modeling activities in teaching.
Ecuador, traditionally an agriculture-based economy, has great potential for valorizing its industrial residues. This study presents a techno-economic analysis of applying a novel biomass oxidation method to produce formic and acetic acids from coffee husk residues in Machala, Ecuador. The analysis determined that the payback period is below 5 years, making this project economically feasible when producing approx. 1000 tons of formic acid per year, which is enough to supply the Ecuadorian market. This production would reduce import costs and develop the chemical industry in the country.
Near-Data Processing (NDP) is a key computing paradigm for reducing the ever growing time and energy costs of data transport versus computations. With their flexibility, FPGAs are an especially suitable compute element for NDP scenarios. Even more promising is the exploitation of novel and future non-volatile memory (NVM) technologies for NDP, which aim to achieve DRAM-like latencies and throughputs, while providing large capacity non-volatile storage.
Experimentation in using FPGAs in such NVM-NDP scenarios has been hindered, though, by the fact that the NVM devices/FPGA boards are still very rare and/or expensive. It thus becomes useful to emulate the access characteristics of current and future NVMs using off-the-shelf DRAMs. If such emulation is sufficiently accurate, the resulting FPGA-based NDP computing elements can be used for actual full-stack hardware/software benchmarking, e.g., when employed to accelerate a database.
For this use, we present NVMulator, an open-source easy-to-use hardware emulation module that can be seamlessly inserted between the NDP processing elements on the FPGA and a conventional DRAM-based memory system. We demonstrate that, with suitable parametrization, the emulated NVM can come very close to the performance characteristics of actual NVM technologies, specifically Intel Optane. We achieve 0.62% and 1.7% accuracy for cache line sized accesses for read and write operations, while utilizing only 0.54% of LUT logic resources on a Xilinx/AMD AU280 UltraScale+ FPGA board. We consider both file-system as well as database access patterns, examining the operation of the RocksDB database when running on real or emulated Optane-technology memories.
The increase in distributed energy generation, such as photovoltaic systems (PV) or combined heat and power plants (CHP), poses new challenges to almost every distribution network operator (DNO). In the low-voltage (LV) grids, where installed PV capacity approaches the magnitude of household load, reverse power flow occurs at the secondary substations. High PV penetration leads to voltage rise, flicker and loading problems. These problems have been addressed by the application of various techniques, amongst which is the deployment of step voltage regulators (SVR). SVR can solve the voltage problem, but do not prevent or reduce reverse power flows. Therefore, the application of SVR in low-voltage grids can result in significant power losses upstream. In this paper we present part of a research project investigating the application of remote-controlled cable cabinets (CC) with metering units in a low-voltage network as a possible alternative to SVR. A new generation of custom-made remote-controlled cable cabinets has been deployed and dynamic network reconfigurations (NR) have been realized with the following objectives: (i) reduction of reverse power flow through the secondary substation to the upstream network and therefore a reduction of upstream losses, (ii) reduction of the voltage rise caused by distributed energy resources and (iii) load balancing in the low-voltage grid. Secondary objectives are to improve the DNO's insight into the state of the network and to provide further information on future smart grid integration.
Machine tools constitute the largest sector of the machinery and plant engineering industry, and substantial shares of gross value added in companies from other sectors (e.g. automotive, aerospace) are also generated with them (Destatis, 2022). The dynamic behavior of machine tools decisively influences the productivity of the production plant and the quality of the workpieces produced on it. Both externally excited vibrations (e.g. imbalance, pulsation, periodically fluctuating process forces) and self-excited vibrations (e.g. chatter) lead to poor quality of the manufactured parts. The dynamic behavior of machine tools is influenced by the mass, damping and stiffness of the individual components (e.g. machine bed, column, slide) as well as of the joints in the force flow (e.g. guideways, drives). This contribution examines in more detail the effects of design modifications of the damping in structural frame components on the dynamic behavior at the cutting point.
OpenAPI, WADL, RAML, and API Blueprint are popular formats for documenting Web APIs. Although these formats are in general both human- and machine-readable, only the part of the format describing the syntax of a Web API is machine-understandable. Descriptions, which explain the meaning and purpose of Web API elements, are embedded as natural language text snippets into documents and target human readers but not machines. To enable machines to read and process this state-of-practice Web API documentation, we propose a Transformer model that solves the generic task of identifying a Web API element within a syntax structure that matches a natural language query. For our first prototype, we focus on the Web API integration task of matching output with input parameters and fine-tuned a pre-trained CodeBERT model on the downstream task of question answering with samples from 2,321 OpenAPI documents. We formulate the original question answering problem as a multiple choice task: given a semantic natural language description of an output parameter (question) and the syntax of the input schema (paragraph), the model chooses the input parameter (answer) in the schema that best matches the description. The paper describes the data preparation, tokenization, and fine-tuning process as well as discusses possible applications of our model as part of a recommender system. Furthermore, we evaluate the generalizability and the robustness of our fine-tuned model, which achieves an accuracy of 81.46% correctly chosen parameters.
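The multiple choice formulation can be illustrated with a toy scorer: each candidate input parameter is scored against the output-parameter description and the argmax is chosen. The token-overlap score below is only a stand-in for the fine-tuned CodeBERT logits, and the schema is invented:

```python
def overlap_score(query, candidate):
    """Toy relevance score: count of shared lowercase tokens (stand-in for a model logit)."""
    q = set(query.lower().replace("_", " ").split())
    c = set(candidate.lower().replace("_", " ").split())
    return len(q & c)

def choose_parameter(description, input_parameters):
    """Multiple choice: pick the input parameter that best matches the description."""
    return max(input_parameters, key=lambda p: overlap_score(description, p))

# Invented example schema with three candidate input parameters.
schema = ["user_id", "order_date", "shipping_address"]
print(choose_parameter("unique identifier of the user", schema))  # user_id
```

In the actual approach, the description and the serialized input schema are tokenized jointly and the fine-tuned Transformer scores each candidate, but the selection step is the same argmax.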
As fuel prices climb and the global automotive sector migrates to more sustainable vehicle technologies, the future of South Africa’s minibus taxis is in flux. The authors’ previous research has found that battery electric technology struggles to meet all the mobility requirements of minibus taxis. They investigate the technical feasibility of powering taxis with hydrogen fuel cells instead. The following results are projected using a custom-built simulator and tracking data of taxis based in Stellenbosch, South Africa. Each taxi requires around 12 kg of hydrogen gas per day to travel an average distance of 360 km. 465 kWh of electricity, or 860 m2 of solar panels, would electrolyse the required green hydrogen. An economic analysis was conducted on the capital and operational expenses of a system of ten hydrogen taxis and an electrolysis plant. Such a pilot project requires a minimum investment of € 3.8 million (R 75 million) over a 20-year period. Although such a small-scale roll-out is technically feasible and would meet taxis’ performance requirements, the investment cost is too high, making it financially unfeasible. They conclude that a large-scale solution would need to be investigated to improve financial feasibility; however, South Africa’s limited electrical generation capacity poses a threat to its technical feasibility. The simulator is available at: https://gitlab.com/eputs/ev-fleet-sim-fcv-model.
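The ratios implied by these figures can be checked with a few lines of arithmetic (the derived quantities, such as kWh per kg, are computed from the stated numbers, not reported in the abstract):

```python
# Figures stated in the abstract, per taxi per day.
h2_per_day_kg = 12
daily_distance_km = 360
electrolysis_energy_kwh = 465
solar_area_m2 = 860

# Derived ratios (not stated in the abstract).
kwh_per_kg = electrolysis_energy_kwh / h2_per_day_kg        # ~38.8 kWh per kg of H2
kg_per_100km = 100 * h2_per_day_kg / daily_distance_km      # ~3.3 kg per 100 km
kwh_per_m2_day = electrolysis_energy_kwh / solar_area_m2    # ~0.54 kWh per m2 per day

print(kwh_per_kg, kg_per_100km, kwh_per_m2_day)
```

The implied electrolysis figure of about 39 kWh/kg sits at the optimistic end of published electrolyser efficiencies, which is consistent with a projection study.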
Modern wide bandgap power devices promise higher power conversion performance if the device can be operated reliably. As switching speed increases, the effects of parasitic ringing become more prominent, causing potentially damaging overvoltages during device turn-off. Estimating the expected additional voltage caused by such ringing enables more reliable designs. In this paper, we present an analytical expression to calculate the expected overvoltage caused by parasitic ringing based on parasitic element values and operating point parameters. Simulations and measurements confirm that the expression can be used to find the smallest rise time of the switches’ drain-source voltage for minimum overvoltage. The given expression also allows the prediction of the trade-off in overvoltage amplitude when faster rise times are required.
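The paper's analytical expression is not reproduced in the abstract; as a generic first-order sketch of the underlying effect, the turn-off overvoltage of a weakly damped parasitic L-C loop can be estimated from its characteristic impedance (this is the textbook energy-transfer estimate, not the paper's formula, and all parameter values below are illustrative assumptions):

```python
import math

def ringing_overvoltage(i_off, l_par, c_oss, r_damp=0.0):
    """First-order estimate of turn-off overvoltage from parasitic L-C ringing.

    The load current i_off commutating into the loop inductance l_par rings
    with the switch output capacitance c_oss; with no damping the peak voltage
    rises by i_off * sqrt(l_par / c_oss), attenuated by series resistance r_damp.
    """
    z0 = math.sqrt(l_par / c_oss)   # characteristic impedance of the loop
    zeta = r_damp / (2.0 * z0)      # damping ratio of the series RLC loop
    if zeta >= 1.0:
        return 0.0                  # overdamped: no overshoot
    # Standard second-order overshoot attenuation factor.
    return i_off * z0 * math.exp(-zeta * math.pi / math.sqrt(1.0 - zeta * zeta))

# 20 A turned off against 10 nH of loop inductance and 1 nF of C_oss:
dv = ringing_overvoltage(20.0, 10e-9, 1e-9)  # ~63 V on top of the bus voltage
```

The paper's contribution is a more detailed expression that also ties this overshoot to the drain-source voltage rise time, which the undamped estimate above cannot capture.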
The relevance of Robotic Process Automation (RPA) has increased over the last few years. Combining RPA with Artificial Intelligence (AI) can further enhance the business value of the technology. The aim of this research was to analyze applications, terminology, benefits, and challenges of combining the two technologies. A total of 60 articles were analyzed in a systematic literature review to evaluate the aforementioned areas. The results show that by adding AI, RPA applications can be used in more complex contexts, the human factor during the development process can be minimized, and AI-based decision-making can be integrated into RPA routines. This paper also presents a current overview of the terminology used. Moreover, it shows that integrating AI can introduce previously unseen challenges in RPA projects, but also brings many new benefits. Based on the outcome, it is concluded that the topic offers a lot of potential, but further research and development are required. The results of this study help researchers to gain an overview of the state of the art in combining RPA and AI.
We present the results of an extensive characterization of the performance and stability of a third-order continuous-time delta-sigma modulator with active coefficient error compensation. Using our previously published coefficient tuning technique, process variation induced R-C time-constant (TC) errors in the forward signal path can be compensated indirectly using continuously tunable DACs in the feedback path. To validate our technique experimentally with a range of real TC variations, we designed a modulator with discretely configurable integration capacitor arrays in a 0.35-μm CMOS process. We configured the capacitors of the fabricated device for a range of total TC variations from -28.4 % to +19.3 % and measured the signal-to-noise ratio (SNR) as a function of the input amplitude before and after compensating the variations electrically using the feedback DACs. The results show that our tuning technique is capable of restoring the desired nominal modulator performance over the entire parameter variation range, including the system’s nominal maximum stable amplitude (MSA).
Governments and public institutions increasingly embrace digital opportunities to involve citizens in public issues and decision making. While public participation is generally seen as an important and promising venture, the design of the participation processes and the utilized digital infrastructure poses challenges, especially to the public sector. Instead of limiting conceptual guidance and exchange to one domain, we therefore develop a taxonomy for digital involvement projects that unites the domains of e-participation, citizen science and crowd-X. Embedded in a design science research approach, we follow an iterative design process to elaborate the key characteristics of a digital involvement project based on the participation process, its individuals and digital infrastructure. Through evaluating the artifact in a focus group with domain practitioners, we find support for the usefulness of our taxonomy and its ability to provide guidance and a basis for discussion of digital involvement projects across domains.
What might the attendee be able to do after being in your session?
Our work shows how to connect intra-operative devices via IEEE 11073 Service-oriented Device Connectivity (SDC).
Description of the Problem or Gap
Standardized device communication is essential for interoperability, availability of device data, and therefore for the intelligent operating room (OR) and arising solutions. The SDC standard was developed to make information from medical devices available in a uniform manner and enable interoperability. Existing devices are rarely SDC-capable and need interfaces to be interoperable via SDC.
Methods: What did you do to address the problem or gap?
We conceived an SDC-based architecture consisting of a service provider and a service consumer. In our concept, the service provider is connected to the medical device and capable of translating the proprietary protocol of the device into SDC and vice versa. The service consumer is used to request or send information via the SDC protocol to the service provider and can function as a uniform bidirectional interface (e.g. for displaying or controlling). This concept was demonstrated exemplarily with the Philips patient monitor MX800 to retrieve the device data (e.g. vital parameters) via SDC, and partly with the operating light marLED X from KLS Martin Group.
Results: What was the outcome(s) of what you did to address the problem or gap?
The patient monitor MX800 was connected to a Raspberry Pi (RPi) via LAN, on which the service provider is running. The Python script on the RPi establishes a connection to the monitor and translates incoming and outgoing messages between the proprietary protocol and SDC, to/from the service consumer. The service consumer runs on a laptop and acts as a simulation for different kinds of systems that want to obtain vital parameters or other information from the patient monitor. The operating light marLED X was connected to an RPi via USB-to-RS232. A Python script on the RPi establishes a connection to the light and makes it possible, via proprietary commands, to retrieve information from the light (e.g. status) and to control it (e.g. toggle the light, increment the intensity). A translation to SDC is not yet integrated.
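The provider/consumer split described above can be sketched as a thin translation layer. All class and field names below are hypothetical stand-ins; a real implementation would use the BICEPS/SDC data model and discovery mechanisms rather than plain dictionaries:

```python
UNITS = {"HR": "bpm", "SpO2": "%"}  # units for the illustrative vital parameters

class ProprietaryMonitor:
    """Hypothetical stand-in for a vendor protocol (e.g. a patient monitor's LAN interface)."""
    def read_vitals(self):
        # In reality this would parse the monitor's proprietary message format.
        return {"HR": 72, "SpO2": 98}

class SdcServiceProvider:
    """Translates proprietary readings into a uniform, SDC-like metric report."""
    def __init__(self, device):
        self.device = device

    def metric_report(self):
        vitals = self.device.read_vitals()
        return [{"handle": name, "value": value, "unit": UNITS[name]}
                for name, value in vitals.items()]

provider = SdcServiceProvider(ProprietaryMonitor())
print(provider.metric_report())
```

A service consumer would then subscribe to such reports over the SDC protocol without any knowledge of the vendor-specific format, which is exactly the interoperability benefit the architecture targets.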
Discussion of Results
Our practical implementation shows that medical devices can be accessed via external connections to obtain device data and control the device via commands. The example SDC implementation for the patient monitor MX800 makes it possible to request its data via the standardized communication protocol SDC. This is also possible for the operating light marLED X once its proprietary protocol has been analyzed so that it can be translated to/from SDC. This would allow the device to be controlled from an external system, or automatically depending on the status of the ongoing procedure. The advantage is that existing intra-operative devices can be extended by a service provider capable of translating the proprietary protocol of the device into SDC and vice versa. This enables interoperability and an intelligent OR that, for example, is aware of all devices, their status, and their data, and can use this information to optimally support the surgeons and their team (e.g. provision of information, automated documentation). This interoperability means that future innovations merely need to understand the SDC protocol instead of all vendor-dependent communication protocols.
Conclusion
Standardized device communication is essential to reach interoperability, and therefore intelligent ORs. Our contribution addresses the possibility of subsequently making medical devices SDC-capable. This may eliminate the need to understand all the different proprietary protocols when developing new innovative solutions for the OR.
In recent years, 3D facial reconstructions from single images have garnered significant interest. Most of the approaches are based on 3D Morphable Model (3DMM) fitting to reconstruct the 3D face shape. Concurrently, the adoption of Generative Adversarial Networks (GAN) has been gaining momentum to improve the texture of reconstructed faces. In this paper, we propose a fundamentally different approach to reconstructing the 3D head shape from a single image by harnessing the power of GAN. Our method predicts three maps of normal vectors of the head’s frontal, left, and right poses. We are thus presenting a model-free method that does not require any prior knowledge of the object’s geometry to be reconstructed.
The key advantage of our proposed approach is the substantial improvement in reconstruction quality compared to existing methods, particularly in the case of facial regions that are self-occluded in the input image. Our method is not limited to 3D face reconstruction. It is generic and applicable to multiple kinds of 3D objects. To illustrate the versatility of our method, we demonstrate its efficacy in reconstructing the entire human body.
By delivering a model-free method capable of generating high-quality 3D reconstructions, this paper not only advances the field of 3D facial reconstruction but also provides a foundation for future research and applications spanning multiple object types. The implications of this work have the potential to extend far beyond facial reconstruction, paving the way for innovative solutions and discoveries in various domains.
The aim of this work is the development of an artificial intelligence (AI) application to support the recruiting process, elevating the domain of human resource management by advancing its capabilities and effectiveness. This affects recruiting processes and includes solutions for active sourcing, i.e. active recruitment, pre-sorting, evaluating structured video interviews, and discovering internal training potential. This work highlights four novel approaches to ethical machine learning. The first is precise machine learning for ethically relevant properties in image recognition, which focuses on accurately detecting and analysing these properties. The second is the detection of bias in training data, allowing for the identification and removal of distortions that could skew results. The third is minimising bias, which involves actively working to reduce bias in machine learning models. Finally, an unsupervised architecture is introduced that can learn fair results even without ground truth data. Together, these approaches represent important steps forward in creating ethical and unbiased machine learning systems.
The 17 SDGs, as agreed upon by the international community, are designed to be implemented across all levels of human activity. Alongside the level of international politics, this also includes the local levels, national politics, wider society, and the economic sphere. Many channels are called on to further implementation, including the transfer of technology to developing and emerging countries. As companies are the patent holders, this must include their active participation. While the literature examines the important role of technology transfer in North-South business-to-business (B2B) partnerships, studies on the technology transfer between European and African companies are scarce. Therefore, in this study we use original data from 26 interviews conducted with managers engaged in sales partnerships between German manufacturers and their distributors in African markets to examine the existence and forms of technology transfer. We find that training and marketing excellence are the predominant forms of technology transfer and, based on this, suggest a refinement of established frameworks on B2B technology transfer.
Automatic segmentation is essential for brain tumor diagnosis, disease prognosis, and follow-up therapy of patients with gliomas. Still, accurate detection of gliomas and their sub-regions in multimodal MRI is very challenging due to the variety of scanners and imaging protocols. Over the last years, the BraTS Challenge has provided a large number of multi-institutional MRI scans as a benchmark for glioma segmentation algorithms. This paper describes our contribution to the BraTS 2022 Continuous Evaluation challenge. We propose a new ensemble of multiple deep learning frameworks, namely DeepSeg, nnU-Net, and DeepSCAN, for automatic glioma boundary detection in pre-operative MRI. It is worth noting that our ensemble models took first place in the final evaluation on the BraTS testing dataset with Dice scores of 0.9294, 0.8788, and 0.8803, and Hausdorff distances of 5.23, 13.54, and 12.05, for the whole tumor, tumor core, and enhancing tumor, respectively. Furthermore, the proposed ensemble method ranked first in the final ranking on another unseen test dataset, namely the Sub-Saharan Africa dataset, achieving mean Dice scores of 0.9737, 0.9593, and 0.9022, and HD95 of 2.66, 1.72, 3.32 for the whole tumor, tumor core, and enhancing tumor, respectively.
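The Dice scores reported above measure volumetric overlap between predicted and ground-truth tumor masks; a minimal reference implementation on binary masks (the example arrays are invented for illustration):

```python
import numpy as np

def dice(pred, truth):
    """Dice similarity coefficient: 2|A∩B| / (|A| + |B|) on binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    denom = pred.sum() + truth.sum()
    if denom == 0:
        return 1.0  # both masks empty: perfect agreement by convention
    return 2.0 * np.logical_and(pred, truth).sum() / denom

# Toy 2D masks: prediction covers 4 pixels, ground truth 6, with 4 overlapping.
a = np.zeros((4, 4)); a[1:3, 1:3] = 1
b = np.zeros((4, 4)); b[1:3, 1:4] = 1
print(dice(a, b))  # 0.8
```

In the challenge setting this is computed per tumor sub-region (whole tumor, tumor core, enhancing tumor) on 3D volumes, but the formula is identical.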
Enterprises and societies currently face essential challenges, and digital transformation can contribute to their resolution. Enterprise architecture (EA) is useful for promoting digital transformation in global companies and information societies covering ecosystem partners. The advancement of new business models can be promoted with digital platforms and architectures for Industry 4.0 and Society 5.0. Products from sectors such as healthcare, manufacturing and energy can therefore increase in value. The adaptive integrated digital architecture framework (AIDAF) for Industry 4.0, together with the design thinking approach, is expected to promote and implement digital platforms and digital products for healthcare, manufacturing and energy communities more efficiently. In this paper, we propose various cases of digital transformation in which digital platforms and products are designed and evaluated for digital IT, digital manufacturing and digital healthcare with Industry 4.0 and Society 5.0. The vision of AIDAF applications to perform digital transformation in global companies is explained and referenced, extended toward digitalized ecosystems such as Society 5.0 and Industry 4.0.
Current advances in Artificial Intelligence (AI) combined with other digitalization efforts are changing the role of technology in service ecosystems. Human-centered intelligent systems and services are the target of many current digitalization efforts and part of a massive digital transformation based on digital technologies. Artificial intelligence, in particular, is having a powerful impact on new opportunities for shared value creation and the development of smart service ecosystems. Motivated by experiences and observations from digitalization projects, this paper presents new methodological experiences from academia and practice on a joint view of digital strategy and architecture of intelligent service ecosystems and explores the impact of digitalization based on real case study results. Digital enterprise architecture models serve as an integral representation of business, information, and technology perspectives of intelligent service-based enterprise systems to support management and development. This paper focuses on the novel aspect of closely aligned digital strategy and architecture models for intelligent service ecosystems and highlights the fundamental business mechanism of AI-based value creation, the corresponding digital architecture, and management models. We present key strategy-oriented architecture model perspectives for intelligent systems.
In today’s education, healthcare, and manufacturing sectors, organizations and information societies are discussing new enhancements to corporate structure and process efficiency using digital platforms. These enhancements can be achieved using digital tools. Industry 5.0 and Society 5.0 offer several potentials for businesses to enhance the adaptability and efficacy of their industrial processes, paving the way for developing new business models facilitated by digital platforms. Society 5.0 can contribute to a super-intelligent society that includes the healthcare industry. In the past decade, the Internet of Things, Big Data Analytics, Neural Networks, Deep Learning, and Artificial Intelligence (AI) have revolutionized our approach to various job sectors, from manufacturing and finance to consumer products. AI is developing quickly and efficiently. ChatGPT, the latest artificial intelligence chatbot created by OpenAI, has taken the internet by storm. We tested the effectiveness of this large language model on four critical questions concerning “Society 5.0”, “Healthcare 5.0”, “Industry,” and “Future Education” from the perspective of Age 5.0.
Artificial Intelligence (AI) in brand management: artificial neural networks for brand image measurement
(2023)
Because artificial neural networks can model nonlinear and multilayered relationships, this contribution examines their potential applications for the methodologically demanding analysis and measurement of brand image. To illustrate the conceptual approach, a multilayer artificial neural network between the ratings of specific brand attributes and the overall rating of the brand is built on the empirical example of the sporting goods manufacturer adidas. Based on an analysis of the connection weights of the artificial neural network, the importance of the various brand attributes for the overall brand evaluation is measured, from which concrete implications for brand management practice can be derived.
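Connection-weight analyses of this kind are commonly computed with Garson-style algorithms; a minimal sketch for a one-hidden-layer network (the weights below are invented for illustration, and the paper's exact procedure may differ):

```python
import numpy as np

def connection_weight_importance(W_hidden, W_output):
    """Garson-style attribute importance from the weights of a one-hidden-layer net.

    W_hidden: (n_inputs, n_hidden) input-to-hidden weights
    W_output: (n_hidden,) hidden-to-output weights
    Returns relative importances that sum to 1.
    """
    contrib = np.abs(W_hidden) * np.abs(W_output)           # |w_ih| * |w_ho| per path
    contrib /= contrib.sum(axis=0, keepdims=True)           # each input's share per hidden unit
    importance = contrib.sum(axis=1)                        # aggregate over hidden units
    return importance / importance.sum()                    # normalize to 100 %

# Illustrative weights for three brand attributes and two hidden units.
W_hidden = np.array([[2.0, 0.5],
                     [1.0, 0.5],
                     [1.0, 1.0]])
W_output = np.array([1.0, 1.0])
print(connection_weight_importance(W_hidden, W_output))  # [0.375 0.25 0.375]
```

Applied to a trained brand-image network, the resulting shares rank the attributes by their contribution to the overall brand evaluation.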
The dawn of the 21st Century has witnessed a tremendous increase in trade pacts among nations, resulting in renewed hopes for sustainable enterprise development in emerging economies worldwide. Ghana and other sub-Saharan African (SSA) countries have signed onto several North-South and South-South free trade agreements in the hope of strengthening their presence in the international trade arena and promoting economic growth in SSA. For over two decades, however, very little has changed, and many high hopes have been dashed as enterprises continue to struggle in SSA. Not even the African Continental Free Trade Agreement (AfCFTA) could renew the hopes of sceptics. Several studies have argued that enterprises in SSA could improve their domestic and international competitiveness by establishing mutually beneficial partnerships with their counterparts from the Global North and South. This study delved into the issues that affect North-South and South-South business collaborations and recommends key success factors that could help promote mutually beneficial cross-border business partnerships. The research includes both literature and empirical information on the key success factors of business partnerships between African enterprises as well as between African enterprises and firms from the Global North. We approached the study qualitatively using a phenomenological research design. Research participants included important stakeholders in the international trade and sustainable enterprise development ecosystems of Africa and Europe. The study identified several challenges with the current business collaborations and recommended new ways of making such partnerships more beneficial.
This article proposes several modified quasi-Z-source dc/dc boost converters. These can achieve soft switching by using a clamp-switch network consisting of an active switch and a diode in parallel with a capacitor, connected across one of the inductors of the Z-source network. In this way, ringing at the transistor switching node is mitigated, and the voltage at the turn-on of the transistor is reduced. Even zero-voltage switching (ZVS) of the main transistor is possible if the capacitor in the clamp-switch network is adequately chosen. The proposed circuit structure and operating mode are described and validated through simulations and measurements on a low-power prototype.
Non-fungible tokens (NFTs) are unique digital assets that have recently gained significant popularity, particularly in the digital art sector. The success of NFTs and other blockchain-based innovations depends on their acceptance and use by consumers. This study aims to understand the impact of moral values on the acceptance of NFTs. Based on a quantitative survey with over 800 complete responses, the analysis shows that moral aspects of NFTs are indeed important for potential users. However, there is an attitude-behavior gap, as the positive impact of moral values on the intention to use NFTs is not reflected in the actual current usage of NFTs by the respondents. This study contributes to knowledge by providing new empirical data on the acceptance of NFTs and highlighting the role of moral values in the acceptance decision.
The volume includes papers presented at the International KES Conference on Human Centred Intelligent Systems 2023 (KES HCIS 2023), held in Rome, Italy on June 14–16, 2023. This book highlights new trends and challenges in intelligent systems, which play an important part in the digital transformation of many areas of science and practice. It includes papers offering a deeper understanding of the human-centred perspective on artificial intelligence, of intelligent value co-creation, ethics, value-oriented digital models, transparency, and intelligent digital architectures and engineering to support digital services and intelligent systems, the transformation of structures in digital businesses and intelligent systems based on human practices, as well as the study of interaction and the co-adaptation of humans and systems.
Motivation
In order to enable context-aware behavior of surgical assistance systems, the acquisition of various information about the current intraoperative situation is crucial. To achieve this, the complex task of situation recognition can be delegated to a specialized system. Consequently, a standardized interface is required for the seamless transfer of the recognized contextual information to the assistance systems, enabling them to adapt accordingly.
Methods
Our group analyzed four medical interface standards to determine their suitability for exchanging intraoperative contextual information. The assessment was based on a harmonized data and service model derived from the requirements of expected context-aware use cases. Digital Imaging and Communications in Medicine (DICOM) and IEEE 11073 Service-oriented Device Connectivity (SDC) were identified as the most appropriate standards.
Results
We specified how DICOM Unified Procedure Steps (UPS) can be used to effectively communicate contextual information. We proposed the inclusion of attributes to formalize different granularity levels of the surgical workflow.
Conclusions
DICOM UPS SOP classes can be used for the exchange of intraoperative contextual information between a situation recognition system and surgical assistance systems. This can pave the way for vendor-independent context awareness in the OR, leading to targeted assistance of the surgical team and an improvement of the surgical workflow.
The fifth mobile communications generation (5G) can lead to a substantial change in companies by enabling the full capability of wireless industrial communication. 5G, with its key features of Enhanced Mobile Broadband, Ultra-Reliable and Low-Latency Communication, and Massive Machine Type Communication, will support the implementation of Industry 4.0 applications. In particular, the possibility to set up Non-Public Networks provides the opportunity for 5G communication in factories and ensures sole access to the 5G infrastructure, offering companies new opportunities to implement innovative mobile applications. Currently, various concepts, ideas, and projects for 5G applications in an industrial environment exist. However, the global rollout of 5G systems is a continuous process based on various stages defined by the 3rd Generation Partnership Project, the global initiative that develops and specifies the 5G telecommunication standard. Accordingly, some services are currently still far from their final performance capability or are not yet implemented. Additionally, research has yet to clarify the general suitability of 5G for frequently mentioned 5G use cases. This paper identifies relevant 5G use cases for intralogistics and evaluates their technical requirements regarding practical feasibility throughout the upcoming 5G specifications.
Identification of sleep and wake states by evaluating respiratory and movement signals
(2021)
Early exposure makes the entrepreneur: how economics education in school influences entrepreneurship
(2022)
Many countries that seek to boost their economy share the goal of promoting entrepreneurship. Whereas there is ample research on the predictors of entrepreneurship during adulthood, we know little about how pre-adulthood experience influences entrepreneurship later in life. Using a natural experiment, this paper examines whether introducing economics classes in school enhances entrepreneurial behavior in adulthood. Our difference-in-differences approach exploits curricula reforms across German states that introduced compulsory economics education classes in secondary schools. Using information on school and labor market careers for more than 10,000 individuals from 1984 to 2019, we find that the reform increases students’ entrepreneurial activities by three percentage points. Examining gender differences, we find that economics classes equally benefit female and male students. Our results advance our understanding of how pre-adulthood experiences shape individuals’ entrepreneurial behavior.
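The difference-in-differences logic behind the reform evaluation can be illustrated with a toy 2x2 comparison; the rates below are invented numbers for illustration, not the paper's data (which come from individual-level panel regressions):

```python
# Toy difference-in-differences: entrepreneurship rates (hypothetical values)
# for reform ("treated") vs. non-reform ("control") states, before vs. after.
rates = {
    ("treated", "before"): 0.050, ("treated", "after"): 0.085,
    ("control", "before"): 0.048, ("control", "after"): 0.053,
}

# Change over time within each group
treated_change = rates[("treated", "after")] - rates[("treated", "before")]
control_change = rates[("control", "after")] - rates[("control", "before")]

# DiD estimate: treated change net of the common trend (parallel-trends assumption)
did = treated_change - control_change
print(f"DiD estimate: {did:.3f}")  # about 0.030, i.e., three percentage points
```

The control group's change absorbs the common time trend, so the remaining difference is attributed to the curriculum reform, provided both groups would have trended in parallel without it.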
Context: Companies that operate in the software-intensive business are confronted with high market dynamics, rapidly evolving technologies, and fast-changing customer behavior. Traditional product roadmapping practices, such as fixed-time-based charts with detailed planned features, products, or services, typically fail in such environments. Until now, the underlying reasons for the failure of product roadmaps in a dynamic and uncertain market environment have not been widely analyzed and understood.
Objective: This paper aims to identify current challenges and pitfalls practitioners face when developing and handling product roadmaps in a dynamic and uncertain market environment.
Method: To reach our objective we conducted a grey literature review (GLR).
Results: Overall, we identified 40 relevant papers, from which we could extract 11 challenges in the application of product roadmapping in a dynamic and uncertain market environment. The analysis of the articles showed that the major challenges for practitioners lie in overcoming a feature-driven mindset, avoiding excessive detail in the product roadmap, and ensuring that the content of the roadmap is not driven solely by management or expert opinion.
Providing a digital infrastructure, platform technologies foster interfirm collaboration between loosely coupled companies, enabling the formation of ecosystems and building the organizational structure for value co-creation. Despite the known potential, the development of platform ecosystems creates new sources of complexity and uncertainty due to the involvement of various independent actors. For a platform ecosystem to succeed, it is essential that the platform ecosystem participants are aligned, coordinated, and given a common direction. Traditionally, product roadmaps have served these purposes during product development. A systematic mapping study was conducted to better understand how product roadmapping could be used in the dynamic environment of platform ecosystems. One result of the study is that there are hardly any concrete approaches for product roadmapping in platform ecosystems so far. However, many challenges on the topic are described in the literature from different perspectives. Based on the results of the systematic mapping study, a research agenda for product roadmapping in platform ecosystems is derived and presented.
Context: Nowadays, the market environment is characterized by high uncertainty due to high market dynamics, confronting companies with new challenges in creating and updating product roadmaps. Most companies are still using traditional approaches, which typically fail in such environments. Therefore, companies are seeking new product roadmapping approaches.
Objective: This paper presents good practices to support companies better understand what factors are required to conduct a successful product roadmapping in a dynamic and uncertain market environment.
Method: Based on a grey literature review, essential aspects for conducting product roadmapping in a dynamic and uncertain market environment were identified. Expert workshops were then held with two researchers and three practitioners to develop best practices and the proposed approach for an outcome-driven roadmap. These results were then given to another set of practitioners and their perceptions were gathered through interviews.
Results: The study resulted in the development of nine good practices that provide practitioners with insights into which aspects are crucial for product roadmapping in a dynamic and uncertain market environment. Moreover, we propose an approach to product roadmapping that provides a flexible structure and focuses on delivering value to the customer and the business. To ensure the latter, this approach consists of three main items: outcome hypotheses, validated outcomes, and discovered outputs.
Theoretical foundation, effectiveness, and design artefact for machine learning service repositories
(2022)
Machine learning (ML) has played an important role in research in recent years. For companies that want to use ML, finding the algorithms and models that fit their business is tedious. A review of the available literature on this problem reveals only a few research papers. Given this gap, the aim of this paper is to design an effective and easy-to-use ML service repository. The corresponding research is based on a multi-vocal literature analysis combined with design science research, addressing three research questions: (1) How is current white and gray literature on ML services structured with respect to repositories? (2) Which features are relevant for an effective ML service repository? (3) How is a prototype for an effective ML service repository conceptualized? The findings are relevant for explaining user acceptance of ML repositories, which is essential for corporate practice in order to create and use ML repositories effectively.
Today, companies face increasing market dynamics, rapidly evolving technologies, and rapid changes in customer behavior. Traditional approaches to product development typically fail in such environments and require companies to transform their often feature-driven mindset into a product-led mindset. A promising first step on the way to a product-led company is a better understanding of how product planning can be adapted to the requirements of an increasingly dynamic and uncertain market environment in the sense of product roadmapping. The authors developed the DEEP product roadmap assessment tool to help companies evaluate their current product roadmap practices and identify appropriate actions to transition to a more product-led company. Objective: The goal of this paper is to gain insight into the applicability and usefulness of version 1.1 of the DEEP model. In addition, the benefits and implications of using the DEEP model in corporate contexts are explored. Method: We conducted a multiple case study in which participants were observed while using the DEEP model. We then interviewed each participant to understand their perceptions of the DEEP model. In addition, we conducted interviews with each company's product management department to learn how the application of the DEEP model influenced their attitudes toward product roadmapping. Results: The study showed that by applying the DEEP model, participants better understood which artifacts and methods were critical to product roadmapping success in a dynamic and uncertain market environment. In addition, the application of the DEEP model helped convince management and other stakeholders of the need to change current product roadmapping practices. The application also proved to be a suitable starting point for the transformation in the participating companies.
The rapid development and growth of knowledge has resulted in a rich stream of literature on various topics. Information systems (IS) research is becoming increasingly extensive, complex, and heterogeneous. Therefore, a proper understanding and timely analysis of the existing body of knowledge are important to identify emerging topics and research gaps. Despite the advances of information technology in the context of big data, machine learning, and text mining, the implementation of systematic literature reviews (SLRs) is in most cases still a purely manual task. This might lead to serious shortcomings of SLRs in terms of quality and time. The outlined approach in this paper supports the process of SLRs with machine learning techniques. For this purpose, we develop a framework with embedded steps of text mining, cluster analysis, and network analysis to analyze and structure a large amount of research literature. Although the framework is presented using IS research as an example, it is not limited to the IS field but can also be applied to other research areas.
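The text-mining step of such a framework can be sketched with a simple TF-IDF weighting and cosine similarity (the framework itself combines this with cluster and network analysis); the document texts below are invented for illustration:

```python
import math
from collections import Counter

# Toy corpus of paper abstracts (illustrative, not real IS literature)
docs = {
    "d1": "machine learning literature review text mining",
    "d2": "text mining cluster analysis literature",
    "d3": "power converter switching losses",
}

def tfidf(docs):
    """Compute sparse TF-IDF vectors (term -> weight) per document."""
    n = len(docs)
    df = Counter(t for d in docs.values() for t in set(d.split()))
    vecs = {}
    for name, text in docs.items():
        tf = Counter(text.split())
        vecs[name] = {t: c * math.log(n / df[t]) for t, c in tf.items()}
    return vecs

def cosine(a, b):
    """Cosine similarity between two sparse vectors."""
    dot = sum(a[t] * b[t] for t in a.keys() & b.keys())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

vecs = tfidf(docs)
# Thematically related abstracts score higher, unrelated ones near zero
print(cosine(vecs["d1"], vecs["d2"]), cosine(vecs["d1"], vecs["d3"]))
```

Pairwise similarities like these feed the cluster analysis (grouping papers into topics) and the network analysis (linking papers above a similarity threshold) that the framework embeds.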
With significant advancements in digital technologies, firms find themselves competing in an increasingly dynamic business environment. Therefore, the logic of business decisions is based on the agility to respond to emerging trends in a proactive way. By contrast, traditional IT governance (ITG) frameworks rely on hierarchy and standardized mechanisms to ensure better business/IT alignment. This conflict leads to a call for ambidextrous governance, in which firms alternate between stability and agility in their ITG mechanisms. Accordingly, this research aims to explore how agility might be integrated into ITG. A quantitative research strategy is implemented to explore the impact of agility on the causal relationship among ITG, business/IT alignment, and firm performance. The results show that the integration of agile ITG mechanisms contributes significantly to the explanation of business/IT alignment. As such, firms need to develop a dual governance model powered by traditional and agile ITG mechanisms.
Startups play a key role in software-based innovation. They make an important contribution to an economy’s ability to compete and innovate, and their importance will continue to grow due to increasing digitalization. However, the success of a startup depends primarily on market needs and the ability to develop a solution that is attractive enough for customers to choose. A sophisticated technical solution is usually not critical, especially in the early stages of a startup. It is not necessary to be an experienced software engineer to start a software startup. However, this can become problematic as the solution matures and software complexity increases. Based on a proposed solution for systematic software development for early-stage startups, in this paper, we present the key findings of a survey study to identify the methodological and technical priorities of software startups. Among other things, we found that requirements engineering and architecture pose challenges for startups. In addition, we found evidence that startups’ software development approaches do not tend to change over time. An early investment in a more scalable development approach could help avoid long-term software problems. To support such an investment, we propose an extended model for Entrepreneurial Software Engineering that provides a foundation for future research.
Organizations that operate under uncertainty need to cultivate their ability to manage their primary resource, knowledge, accordingly. Under such conditions, organizations are required to harvest knowledge from two sources: to explore knowledge that is to be found outside the organization as well as exploit knowledge that is contained within. In a knowledge management context, these exploitation and exploration activities have been conceptualized as knowledge ambidexterity. While ambidexterity has been studied extensively in contexts such as manufacturing or IT, the notion of knowledge ambidexterity remains scarce in current knowledge management research. This study illustrates knowledge ambidexterity and elaborates its positive impact on organizational performance. Our study furthermore answers the question of how the use of enterprise social media (ESM) can facilitate the performance effects of knowledge ambidexterity. Drawing on the theory of communication visibility, we argue that ESM (e.g., Microsoft Teams, Slack, etc.) allow employees to communicate unhindered while making these communications visible. This allows for capturing the tacit knowledge within these communications; this form of knowledge is generally hard to codify and can be a source of competitive edge. With respect to knowledge ambidexterity, ESM use can capture tacit knowledge originating from inside and outside the organization, which fosters the development of a competitive advantage and thus supports its positive effect on organizational performance. This paper contributes to IT-enabled ambidexterity research in two aspects: (1) it sheds light on knowledge ambidexterity and thereby addresses a major practical challenge for knowledge-intensive organizations, and (2) it elaborates on the effects that ESM use can have on the relationship between knowledge ambidexterity and organizational performance.
This work-in-progress paper offers a better understanding of the phenomenon of ambidexterity in a knowledge context, while providing insights on the facilitating role of ESM. Our research serves as a foundation for future empirical examinations of the concept of knowledge ambidexterity.
The respiratory rate is a vital sign indicating respiratory illness. To determine it, the mechanical oscillations of the patient's body arising from chest movements must be analyzed. An unsuitable holder on which the sensor is mounted, or an unsuitable sensor position, are external factors whose influence should be minimized during signal registration. This paper considers a non-invasive device placed under the bed mattress to evaluate the respiratory rate. The aim of the work is the development of an accelerometer sensor holder for this system. Normal and deep breathing signals were analyzed, corresponding to the relaxed state and to taking deep breaths. The evaluation criterion for the holder's design is its influence on the amplitude of the patient's respiratory signal in each state. As a result, we offer a non-invasive system for respiratory rate detection, including a mechanical component that provides the most accurate respiratory rate values.
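The basic signal chain (low-pass filtering, then rate estimation) can be sketched on a synthetic chest-movement signal; the sampling rate, noise level, and moving-average filter here are assumptions for illustration, not the paper's actual pipeline:

```python
import numpy as np

fs = 50                        # sampling rate in Hz (assumed)
t = np.arange(0, 60, 1 / fs)   # one minute of signal

# Synthetic chest-movement signal: 15 breaths/min (0.25 Hz) plus sensor noise
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 0.25 * t) + 0.1 * rng.standard_normal(t.size)

# One-second moving-average filter to suppress high-frequency noise
kernel = np.ones(fs) / fs
smooth = np.convolve(signal, kernel, mode="same")

# Count positive-going zero crossings over the minute -> breaths per minute
crossings = int(np.sum((smooth[:-1] < 0) & (smooth[1:] >= 0)))
print(f"estimated respiratory rate: {crossings} breaths/min")
```

On this synthetic input the estimate lands close to the true 15 breaths/min; a real system would use a properly designed band-pass filter matched to the respiratory band.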
Enterprises and societies currently face crucial challenges, while Society 5.0 can contribute to a super-smart society, especially for manufacturing and healthcare, and Industry 4.0 is becoming important in the global manufacturing industry. Smart energy digital platforms are architected to manage energy supply efficiently. Furthermore, these digital platforms are expected to collect various kinds of data and analyze Big Data for trends in the sharing economy within ecosystems. The adaptive integrated digital architecture framework (AIDAF) for the Design Thinking Approach with Risk Management is expected to align with the digital IT strategy. In this paper, we propose that various energy management systems and related digital platforms be designed and implemented in alignment with the digital IT strategy for the sharing economy toward Society 5.0, using the AIDAF framework for the Design Thinking Approach with Risk Management. The vision of AIDAF applications to enable the sharing economy and digital platforms is explained and extended in the context of Society 5.0. In addition, challenges and future activities for this area are discussed, covering the directions of smart energy for Society 5.0.
An autonomous vehicle is a robotic vehicle with decision-making and action capabilities, able to perform assigned tasks without or with minimal human intervention. Autonomous cars have been in development for many years. In 2014, the Society of Automotive Engineers (SAE International) published a classification of five levels of driving automation, with level 0 corresponding to completely manual driving and level 5 to the ideal of a vehicle able to navigate entirely autonomously for all missions and in all environments. This work addresses the navigation of an autonomous vehicle in general. We focus on one of the most complex scenarios of the road network: the crossing of road intersections. In this paper, the critical features of autonomous intelligent vehicles are reviewed. Furthermore, the associated problems are presented, and the most advanced solutions are derived. This article aims to allow a novice in this field to understand the different facets of localization and perception problems for autonomous vehicles.
Verification of an active time constant tuning technique for continuous-time delta-sigma modulators
(2022)
In this work we present a technique to compensate the effects of R-C / gm-C time-constant (TC) errors due to process variation in continuous-time delta-sigma modulators. Local TC error compensation factors are shifted around in the modulator loop to positions where they can be implemented efficiently with finely tunable circuit structures, such as current-steering digital-to-analog converters (DAC). We apply our technique to a third-order, single-bit, low-pass continuous-time delta-sigma modulator in cascaded integrator feedback structure, implemented in a 0.35-μm CMOS process. A tuning scheme for the reference currents of the feedback DACs is derived as a function of the individual TC errors and verified by circuit simulations. We confirm the tuning technique experimentally on the fabricated circuit over a TC parameter variation range of ±20%. Stable modulator operation is achieved for all parameter sets. The measured performances satisfy the expectations from our theoretical calculations and circuit-level simulations.
A single-phase fixed-frequency operated power factor correction circuit with reduced switching losses is proposed. The circuit uses the combination of a boost converter with an added clamp-switch, a pulse wave shaping circuit, and a standard control IC to discharge the transistor's output capacitance prior to its turn-on. In this way, a very low-complexity control circuit implementation to reduce switching losses or even achieve complete zero-voltage switching without additional sensors is possible. Moreover, this operation method is achieved at a constant switching frequency, possibly simplifying the design of the EMI filter and the converter's inductor. Experimental test results for a 100 W prototype converter are presented to validate the feasibility of the proposed operating method and corresponding circuit structure.
On the influence of ground and substrate on the radiation characteristics of planar spiral antennas
(2022)
The unidirectional radiation of spiral antennas mounted on a substrate requires the presence of a ground plane. In this work, we successively illustrate the impact of the dielectric material and the ground plane on the key metrics of a planar equiangular spiral antenna (PESA). For this purpose, a PESA mounted on several substrates with different dielectric properties and thicknesses is modeled and simulated. We introduce the tertiary current that flows on the spiral arms when the antenna is backed by a ground plane.
This paper presents a compact four-arm spiral antenna that may be used in direction-finding applications as well as in mobile communication systems. The antenna is fed sequentially at its outer ends using a sequential phase network embedded in grounded multilayer dielectric media. Sequential rotation is applied to generate the axial mode M1 as well as the conical mode M2 in the same frequency band. The antenna exhibits good radiation characteristics in the frequency band of interest.
Process risks are omnipresent in the corporate world and repeatedly present organizations with the challenge of how to deal with them. Efforts to analyze and prevent these risks are costly and require many resources, which do not always bring the desired added value. The goal of this work is to determine how resources can be allocated in a benefit-oriented way for risk-oriented process management. For this purpose, the following research question is posed: "How can systematic prioritization decisions regarding risk-oriented process management be made?" To answer it, an evaluation procedure is developed that assesses processes based on their characteristics regarding potential risk disposition as well as entrepreneurial relevance. First, requirements for such a procedure are collected and used to define selection criteria. After a detailed analysis of known selection and evaluation procedures, one of them is selected and used for further development. The next steps include defining relevant criteria for the evaluation of processes by examining process characteristics regarding their suitability for process evaluation. The focus lies on characteristics that provide indications of the risk disposition and business relevance of processes. The result of this approach is a scoring model with a catalog of 15 criteria against which a process is evaluated. The evaluation result is presented both numerically and in a matrix. This enables the comparison of several processes and a derived prioritization for a more in-depth risk analysis. The application of this approach ensures a benefit-oriented allocation of resources in the management of process risks and increases process reliability.
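A weighted scoring model of this kind can be sketched as follows; the criteria, weights, processes, and ratings are illustrative stand-ins for the paper's 15-criteria catalog:

```python
# Hypothetical criteria catalog with weights summing to 1 (illustrative, not the
# paper's 15 criteria). Each process is rated 1-5 per criterion.
criteria = {
    "failure history": 0.40,
    "regulatory exposure": 0.35,
    "strategic relevance": 0.25,
}

def score(process_ratings):
    """Weighted sum of criterion ratings -> numeric prioritization score."""
    return sum(criteria[c] * r for c, r in process_ratings.items())

processes = {
    "order-to-cash": {"failure history": 4, "regulatory exposure": 5, "strategic relevance": 3},
    "onboarding":    {"failure history": 2, "regulatory exposure": 2, "strategic relevance": 4},
}

# Rank processes by score; the top entries are candidates for in-depth risk analysis
ranking = sorted(processes, key=lambda p: score(processes[p]), reverse=True)
print(ranking)
```

The numeric scores correspond to the model's numerical output; plotting risk-disposition scores against business-relevance scores yields the matrix view the abstract mentions.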
Class Phi2 amplifier using GaN HEMTs at 13.56 MHz with tuned transformer for wireless power transfer
(2022)
This paper discusses a design procedure for a wireless power transfer system at an RF switching frequency of 13.56 MHz. The wireless power transfer amplifier uses GaN HEMTs in a Class Φ2 topology and is designed to achieve high efficiency and high power density. A design method for the load network over a certain bandwidth, using a transformer with its tuning network, is presented.
Switched reluctance motors are particularly attractive due to their simple structure. Controlling this machine type requires the instants at which to switch the currents in the motor phases in an appropriate sequence. These switching instants are determined either with a position sensor or from signals generated by a sensorless method. A very simple sensorless method uses the switching frequency of the hysteresis controllers used for phase current control. This paper presents, first, an automatic commissioning method for this sensorless method and, second, a startup procedure, thus advancing the approach towards industrial application.
The majority of people in sub-Saharan Africa (SSA) rely on so-called “paratransit” for their mobility needs. The term refers to a large informal transport sector that runs independent of government, of which 83% comprises minibus taxis (MBT). MBT technology is often old and contributes significantly to climate change through high carbon dioxide (CO2) emissions. Issues related to sustainability and climate change are becoming more important worldwide, yet hardly any attention is given to MBTs. Converting the MBTs from internal combustion engines (ICEs) to electric motors could be a possible solution. However, the existing power grid in SSA is largely based on fossil power plants and is unstable, as frequent local power blackouts show. To avoid further strain on the existing power grid, it would therefore make sense to charge the electric minibus taxis (eMBTs) through a grid consisting of renewable energies. A mobility map is created via simulations with collected data points of the MBTs. Using this mobility map, the energy demand of the eMBTs is calculated. Furthermore, a region-specific photovoltaic (PV) and wind simulation is realised based on existing weather data, and a tool to size the supply system for charging the eMBTs is developed once all data has been collected. With the help of this work, it can be determined to what extent renewable energies such as PV and wind power can support the transition from ICEs to electric engines in the MBT sector.
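The energy-demand and PV-sizing steps can be illustrated with a back-of-the-envelope calculation; every figure below is an assumption for illustration, not a value from the study:

```python
# Back-of-the-envelope sizing for an electric minibus taxi (eMBT) fleet.
# All figures are assumptions for illustration, not values from the study.
daily_km_per_taxi = 250        # typical MBT daily mileage (assumed)
consumption_kwh_per_km = 0.35  # eMBT energy use per km (assumed)
fleet_size = 100               # number of taxis served by the charging system
charging_efficiency = 0.9      # grid-to-battery efficiency (assumed)

# Daily electrical energy that must be supplied at the charger input
daily_demand_kwh = (fleet_size * daily_km_per_taxi
                    * consumption_kwh_per_km / charging_efficiency)

# Simple PV sizing via the peak-sun-hours method
peak_sun_hours = 5.5           # typical for many sub-Saharan sites (assumed)
pv_kwp_needed = daily_demand_kwh / peak_sun_hours

print(f"daily demand: {daily_demand_kwh:.0f} kWh, PV array: {pv_kwp_needed:.0f} kWp")
```

A real sizing tool, as described in the abstract, would replace the flat mileage figure with the simulated mobility map and the peak-sun-hours constant with region-specific weather data.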
The current paper proposes a design method for an active damping approach for LC output filters in a power stage for motor control with continuous output voltage. The power stage uses GaN-HEMTs and operates at switching frequencies in a range between 500 kHz and 1MHz. The active damping of the output filter is achieved here by a feedback of the filter inductor current using a high-pass structure. The paper discusses the impact of this feedback on the system behavior and proposes a design method.
In rural areas, public transport causes high costs per passenger and kilometer, as scheduled buses run infrequently; therefore, many people avoid using public transport. With the trend of moving from urban regions to the countryside, individual traffic will further increase. To tackle issues of emissions, provide mobility for young and elderly people, and offer economically meaningful public transport, a new concept was elaborated in Germany. It consists of (partly) autonomous, remote-controlled shuttle buses. For implementation, rural districts in Germany have worked together and set up a three-phase plan consisting of a publicly funded project, a highly frequented pilot region, and industrial partners with the commitment and means for the necessary investments. The concept promises economic value with respect to installation, service, and maintenance costs; it lowers barriers to public transport for young and elderly people and ultimately reduces emissions and congestion.
Data analysis is becoming increasingly important for pursuing organizational goals, especially in the context of Industry 4.0, where a wide variety of data is available. Numerous challenges arise here, especially when using unstructured data. However, this subject has received little attention in research so far. This paper addresses this gap, which is of interest to science and practice alike. In a study, three major challenges of using unstructured data were identified: analytical know-how, data issues, and variety. Additionally, measures to improve the analysis of unstructured data in the Industry 4.0 context are described. The paper thus provides empirical insights into challenges and potential measures when analyzing unstructured data. The findings are also presented in a framework. Hence, the next steps of the research project and future research points become apparent.
There is still a great reliance on human expert knowledge during the analog integrated circuit sizing design phase due to its complexity and scale, with the result that the associated level of automation is very low. Current research shows that reinforcement learning is a promising approach for addressing this issue. Similarly, it has been shown that the convergence of conventional optimization approaches can be improved by transforming the design space from the geometrical domain into the electrical domain. Here, this design space transformation is employed as an alternative action space for deep reinforcement learning agents. The presented approach is based entirely on reinforcement learning, whereby agents are trained in the craft of analog circuit sizing without explicit expert guidance. After training and evaluating agents on circuits of varying complexity, their behavior when confronted with a different technology is examined, showing the applicability, feasibility, and transferability of this approach.
The imparting of knowledge and skills in STEM education, especially under the influence of the Covid-19 pandemic, is increasingly taking place online and through digital formats. Partially asynchronous instruction eliminates, on the one hand, the social relations in the learning process and, on the other hand, direct experience with physical objects. Existing digital learning systems provide learning tools and controls to support the learning process only on a general basis, and existing methods for simulating physical objects (digital twins) are likewise used only to a minimal extent. The following approach presents a learning system framework that enables individualized learning, including all dimensions (social, physical). Implementing a concept that uses a personalized assistance system to orchestrate the individual learning steps enables efficient and effective learning. The application of the learning system framework is exemplified by STEM education at Reutlingen University in the logistics learning factory Werk150.
In this work, a brushless, harmonic-excited wound-rotor synchronous machine without any auxiliary windings which can provide full torque at startup is investigated experimentally. The excitation power is transferred inductively by superimposing an additional harmonic field of different pole-pair number on top of the airgap field. This is achieved by feeding the parallel paths of the stator and rotor winding separately. A prototype for the harmonic-excited synchronous machine has been constructed and experimental results are presented to verify the concept. The main loss contributors are identified and the importance of considering core losses under harmonic excitation is discussed. A general analytical model for harmonic excited synchronous machines is proposed which enables a quick estimation of the iron core flux densities and the core losses generated by the additional harmonic currents.
In this work, a comparison between different brushless harmonic-excited wound-rotor synchronous machines is performed. The general idea of all topologies is the elimination of the slip rings and auxiliary windings by using the already existing stator and rotor winding for field excitation. This is achieved by injecting a harmonic airgap field with the help of power electronics. This harmonic field does not interact with the fundamental field, it just transfers the excitation power across the airgap. Alternative methods with varying number of phases, different pole-pair combinations, and winding layouts are covered and compared with a detailed Finite-Element-parameterized model. Parasitic effects due to saturation and coupling between the harmonic and main windings are considered.
Personalized remote healthcare monitoring is in continuous development due to technological improvements in sensors and wearable electronic systems. This paper presents a state of the art of research on wearable sensors for healthcare applications, as well as a state of the art of the wearable devices available on the market for health and sport monitoring: chest and wrist bands and smartwatches. Many activity trackers are commercially available; their prices are continuously decreasing and their performance is improving, but commercial devices do not provide raw data and are therefore not useful for research purposes.
Gamification is a recognized method of motivating people in various life processes, and it has spread to many spheres of life, including healthcare. This article proposes a system design for long-term care patients using this method. The proposed system aims to increase patient engagement in the treatment and rehabilitation process via gamification. Literature research on available and previously proposed systems was conducted to develop a suitable system design. The primary target group includes bedridden patients and patients with a sedentary lifestyle (predominantly lying in bed). One of the main criteria for selecting a suitable option was its contactless realization for the mentioned target groups in long-term care. As a result, we developed a system design for hardware and software that could prevent bedsores and other health problems caused by low activity. The proposed design can be tested in hospitals, nursing homes, and rehabilitation centers.
In recent decades, a steady increase in the volume of tourism has been a stable trend. To offer travel opportunities to all groups, it is also necessary to prepare offers for people in need of long-term care or people with disabilities. One way to improve accessibility could be digital technologies, which could help both in planning and in carrying out trips. In the work presented, a study of barriers was first conducted, which, after analysis, led to the selection of technologies for a test setup. The main focus was on a mobile app with travel information and 360° tours. The evaluation results showed that both technologies could increase accessibility, but some essential aspects (such as usability, completeness, and relevance) need to be considered when implementing them.
The digital twin concept has long been widely known for asset monitoring in industry, a clear example being the automotive industry. Recently, there has also been significant interest in the application of digital twins in healthcare, especially in genomics in what is known as precision medicine. This work focuses on another medical speciality where digital twins can be applied: sleep medicine. However, there is still great controversy about the fundamentals of digital twins, such as what the concept is based on and how it can be included in healthcare effectively and sustainably. This article reviews digital twins and their role so far in what is known as personalized medicine. In addition, a series of steps is presented for a possible implementation of a digital twin for a patient suffering from sleep disorders. For this, artificial intelligence techniques, clinical data management, and possible approaches to explaining the results derived from artificial intelligence models are addressed.
In many cases, continuous monitoring of vital signs is required, and low intrusiveness is an important requirement. Incorporating monitoring systems into the hospital or home bed could benefit patients and caregivers. The objective of this work is the definition of a measurement protocol and the creation of a data set of measurements, using commercial devices and low-cost prototypes, to estimate heart rate and breathing rate. The experimental data will be used to compare the results achieved by the devices and to develop algorithms for feature extraction from vital signs.
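A heart-rate feature extractor of the kind such a data set enables can be sketched very simply. This is an illustrative sketch under the assumption of a clean, threshold-crossing periodic signal, not the project's algorithm:

```python
# Illustrative sketch (not the project's algorithm): estimating heart
# rate from a sampled signal by counting local maxima above a threshold.

import math

def estimate_bpm(signal, fs_hz, threshold=0.5):
    """Count strict local maxima above `threshold`, convert to bpm."""
    peaks = 0
    for i in range(1, len(signal) - 1):
        if signal[i] > threshold and signal[i - 1] < signal[i] >= signal[i + 1]:
            peaks += 1
    duration_s = len(signal) / fs_hz
    return 60.0 * peaks / duration_s

# Synthetic 1.2 Hz (= 72 bpm) oscillation sampled at 50 Hz for 10 s.
fs = 50.0
sig = [math.sin(2 * math.pi * 1.2 * n / fs) for n in range(int(10 * fs))]
print(round(estimate_bpm(sig, fs)))  # 72
```

Real accelerometer or radar data is far noisier, which is why the work collects experimental data to develop and compare more robust extraction algorithms.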
There have been substantial research efforts in recent years on algorithms to improve the continuous and automated assessment of various health-related questions. This paper addresses the deployment gap between those improving algorithms and their usability in care and mobile health applications. In practice, most algorithms require significant technical knowledge to be deployed at home or to support healthcare professionals. As a result, persons in need of care and healthcare professionals lack a usable interface to current technological advances. In this paper, we propose deploying algorithms taken from research as web-based microservices following the common approach of a RESTful service, in order to bridge this gap and make algorithms accessible to caregivers and patients without technical knowledge or extended hardware capabilities. We address implementation details, the interpretation and realization of guidelines, and privacy concerns using our self-implemented example. We also address further usability guidelines and our approach to them.
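The wrapping idea can be sketched as follows. The algorithm shown (a toy resting-heart-rate classifier) and the route name are illustrative assumptions, not the paper's service; the wiring into an actual HTTP server (e.g. via the standard library's `http.server` or a WSGI framework) is omitted.

```python
# Minimal sketch of exposing a research algorithm behind a REST-style
# endpoint. Route, payload shape, and the classifier are hypothetical.

import json

def classify_resting_hr(bpm):
    """Toy stand-in for a published assessment algorithm."""
    if bpm < 60:
        return "low"
    if bpm <= 100:
        return "normal"
    return "elevated"

def handle_request(method, path, body):
    """Dispatch one request; returns (status_code, JSON string)."""
    if method == "POST" and path == "/v1/resting-hr":
        payload = json.loads(body)
        result = classify_resting_hr(payload["bpm"])
        return 200, json.dumps({"classification": result})
    return 404, json.dumps({"error": "unknown endpoint"})

status, reply = handle_request("POST", "/v1/resting-hr", '{"bpm": 72}')
print(status, reply)  # 200 {"classification": "normal"}
```

Keeping the algorithm behind a plain JSON interface is what lets caregivers and patients consume it without technical knowledge or special hardware.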
This paper presents a toolbox in Matlab/Octave for the procedural design of analog integrated circuits. The toolbox contains all native functions required by analog designers (namely, schematic generation, simulation setup and execution, integrated look-up tables, and functions for design space exploration) to capture an entire design strategy in an executable script. This script - which we call an Expert Design Plan (EDP) - is capable of executing an analog circuit design fully automatically. The toolbox is integrated into an existing design flow. A bandgap reference voltage circuit is designed with this tool in less than 15 minutes.
A premise guaranteeing successful interdisciplinary teamwork in product design is a mutual understanding, by both the professional and academic communities, of the different types of design expertise and the roles they play in the process. It appears that the open compound word industrial design is open to interpretation in European education. This ambiguity has had a negative impact on the labour policies of some European countries, which have labelled some professions with incorrect names. This terminological inconsistency therefore calls for clarification within the design community. This work analyses the term industrial design, presents historical developments in European industrial design education, in particular in Germany and the Netherlands, and discusses how education for the industrial design profession was positioned towards product development. The paper suggests that the causes of the observed lack of clarity about the meaning of the term industrial design are of an etymological and disciplinary kind. In order to act as a bridge between the professional and academic communities, universities should create the premises for interdisciplinary collaboration between designers and engineers through standardized communication, ultimately contributing to a sustainable future in both design and engineering education.
Multi-versioning and MVCC are the foundations of many modern DBMSs. Under mixed workloads and large datasets, the creation of the transactional snapshot can become very expensive, as long-running analytical transactions may request old versions, residing on cold storage, for reasons of transactional consistency. Furthermore, analytical queries operate on cold data, stored on slow persistent storage. Due to the poor data locality, snapshot creation may cause massive data transfers and thus lower performance. Given the current trend towards computational storage and near-data processing, it has become viable to perform such operations in-storage to reduce data transfers and improve scalability. neoDBMS is a DBMS designed for near-data processing and computational storage. In this paper, we demonstrate how neoDBMS performs snapshot computation in-situ. We showcase different interactive scenarios, where neoDBMS outperforms PostgreSQL 12 by up to 5×.
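The per-version visibility check at the heart of snapshot creation can be sketched as follows. This is an illustrative simplification in the style of PostgreSQL-like xmin/xmax versioning, not neoDBMS code; field and function names are assumptions.

```python
# Illustrative sketch: the basic MVCC visibility rule a snapshot must
# evaluate for every record version. Long-running analytical queries may
# need old versions residing on cold storage, which is the work that
# near-data processing can push down to computational storage.

def visible(version, snapshot_xid, active_xids):
    """A version is visible if created by a transaction committed
    before the snapshot and not deleted from the snapshot's view."""
    created_ok = (version["xmin"] <= snapshot_xid
                  and version["xmin"] not in active_xids)
    deleted = (version["xmax"] is not None
               and version["xmax"] <= snapshot_xid
               and version["xmax"] not in active_xids)
    return created_ok and not deleted

versions = [
    {"xmin": 5, "xmax": 9, "value": "old"},    # superseded at xid 9
    {"xmin": 9, "xmax": None, "value": "new"},
]
# A snapshot taken at xid 7 still sees the old version ...
print([v["value"] for v in versions if visible(v, 7, set())])   # ['old']
# ... while a snapshot at xid 10 sees the new one.
print([v["value"] for v in versions if visible(v, 10, set())])  # ['new']
```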
In times of climate change and growing urbanization, the way food is produced and consumed also changes. Meanwhile, digitization is transforming farming practices, which also applies to the domestic growing of crops. More and more so-called smart home farms (SHF) are finding their way into private households. This paper conceptualizes the unique nature of enabled smart services and their underlying technology. Following an inductive interpretive approach, this study explores the antecedents of smart home farming practices. Our sample consists of eleven actual smart home farmers. We found six constructs to be of salient importance: expected outcomes related to harvesting, positive feelings, and sustainability; a combination of one's affinity for green and novel technologies; and the smartness and visibility of the enabled services. In the outlook, we present some preliminary thoughts for testing our qualitative findings.
The relative advantage of home over away teams in sport, the so-called home advantage, has been documented in several studies (e.g., Nevill et al., 2002; Jamieson, 2010). The factors theoretically underlying the home advantage include, among others, the spectators (through their motivating effect on players or their influence on referees), travel factors (e.g., the distance or duration of travel and the resulting fatigue of the players), and the home team's familiarity with the environment (e.g., familiarity with the stadium and the playing surface) (Courneya & Carron, 1992; Nevill et al., 2002). The matches played without spectators during the COVID-19 pandemic ("ghost games") provide, for the first time, a natural experiment for examining the influence of spectators on the home advantage. An overview of the studies examining the home advantage in various football leagues during the pandemic-related ghost games can be found in Leitner et al. (2022).
For a long time, most discrete accelerators have been attached to host systems using various generations of the PCI Express interface. However, with its lack of support for coherency between accelerator and host caches, fine-grained interactions require frequent cache flushes, or even the use of inefficient uncached memory regions. The Cache Coherent Interconnect for Accelerators (CCIX) was the first multi-vendor standard for enabling cache-coherent host-accelerator attachments, and is already indicative of the capabilities of upcoming standards such as Compute Express Link (CXL). In our work, we compare and contrast the use of CCIX with PCIe when interfacing an ARM-based host with two generations of CCIX-enabled FPGAs. We provide both low-level throughput and latency measurements for accesses and address translation, and examine an application-level use case of CCIX for fine-grained synchronization in an FPGA-accelerated database system. We show that especially smaller reads from the FPGA to the host can benefit from CCIX, with roughly 33% shorter latency than PCIe. Small writes to the host, though, have roughly 32% higher latency than PCIe, since they carry a higher coherency overhead. For the database use case, CCIX allowed a constant synchronization latency to be maintained even with heavy host-FPGA parallelism.
Glioblastomas are the most aggressive, fast-growing primary brain cancers, originating in the glial cells of the brain. Accurate identification of the malignant brain tumor and its sub-regions is still one of the most challenging problems in medical image segmentation. The Brain Tumor Segmentation Challenge (BraTS) has been a popular benchmark for automatic brain glioblastoma segmentation algorithms since its initiation. This year, the BraTS 2021 challenge provides the largest multi-parametric MRI (mpMRI) dataset, with 2,000 pre-operative patients. In this paper, we propose a new aggregation of two deep learning frameworks, namely DeepSeg and nnU-Net, for automatic glioblastoma recognition in pre-operative mpMRI. Our ensemble method obtains Dice similarity scores of 92.00, 87.33, and 84.10 and Hausdorff distances of 3.81, 8.91, and 16.02 for the enhancing tumor, tumor core, and whole tumor regions, respectively, on the BraTS 2021 validation set, ranking us among the top ten teams. These experimental findings provide evidence that our method can be readily applied clinically, thereby aiding brain cancer prognosis, therapy planning, and therapy response monitoring. A docker image for reproducing our segmentation results is available online at https://hub.docker.com/r/razeineldin/deepseg21.
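The Dice similarity coefficient used to score the segmentations above has a compact definition. A minimal worked sketch on flat binary masks (the mask values are invented for illustration):

```python
# Dice = 2|A∩B| / (|A| + |B|); 1.0 means perfect overlap between the
# predicted segmentation A and the ground-truth mask B.

def dice(pred, truth):
    """Dice coefficient of two flat 0/1 masks of equal length."""
    intersection = sum(1 for p, t in zip(pred, truth) if p and t)
    total = sum(pred) + sum(truth)
    return 2.0 * intersection / total if total else 1.0

pred  = [1, 1, 0, 1, 0, 0]
truth = [1, 0, 0, 1, 1, 0]
print(round(dice(pred, truth), 2))  # 0.67
```

In BraTS this is computed per region (enhancing tumor, tumor core, whole tumor) over 3D voxel masks rather than toy lists.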
Database management systems and K/V-Stores operate on updatable datasets – massively exceeding the size of available main memory. Tree-based K/V storage management structures have become particularly popular in storage engines. B+-Trees [1, 4] allow constant search performance; however, write-heavy workloads yield inefficient write patterns to secondary storage devices and poor performance characteristics. LSM-Trees [16, 23] overcome this issue by horizontally partitioning fractions of data – small enough to fully reside in main memory – but require frequent maintenance to sustain search performance.
Firstly, we propose Multi-Version Partitioned BTrees (MV-PBT) as the sole storage and index management structure in key-sorted storage engines like K/V-Stores. Secondly, we compare MV-PBT against LSM-Trees. The logical horizontal partitioning in MV-PBT allows leveraging recent advances in modern B+-Tree techniques in a small, transparent and memory-resident portion of the structure. Structural properties sustain steady read performance, yielding efficient write patterns and reducing write amplification.
We integrated MV-PBT in the WiredTiger [15] K/V storage engine. In a YCSB [5] workload, MV-PBT offers up to 2× higher steady throughput than LSM-Trees, and several orders of magnitude higher throughput than B+-Trees.
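The shared idea behind LSM-Trees and MV-PBT's horizontal partitioning can be sketched as a toy store that buffers writes in a small memory-resident partition and flushes it as a sorted run. This is an assumed simplification for illustration, not MV-PBT itself (it omits versioning, merging, and on-disk layout entirely):

```python
# Toy sketch: buffer writes in a small sorted in-memory partition and
# flush it sequentially once full, so random writes never hit
# secondary storage directly.

class PartitionedStore:
    def __init__(self, partition_limit=2):
        self.memtable = {}        # memory-resident partition
        self.partitions = []      # flushed "on-disk" sorted runs
        self.partition_limit = partition_limit

    def put(self, key, value):
        self.memtable[key] = value
        if len(self.memtable) >= self.partition_limit:
            # one sequential write of a sorted run
            self.partitions.append(sorted(self.memtable.items()))
            self.memtable = {}

    def get(self, key):
        if key in self.memtable:
            return self.memtable[key]
        # search newest partition first, mirroring version ordering
        for run in reversed(self.partitions):
            for k, v in run:
                if k == key:
                    return v
        return None

s = PartitionedStore()
for k, v in [("a", 1), ("b", 2), ("a", 3)]:
    s.put(k, v)
print(s.get("a"), len(s.partitions))  # 3 1
```

The maintenance cost LSM-Trees pay to sustain read performance comes from repeatedly merging such runs; MV-PBT's contribution is keeping the partitioning logical inside a single B+-Tree-like structure.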