Public transport maps are typically designed to support route-finding tasks for passengers while also providing an overview of stations, metro lines, and city-specific attractions. Most of these maps are static representations, perhaps displayed in a metro station or printed in a travel guide. In this paper, we describe a dynamic, interactive public transport map visualization enhanced by additional views of dynamic passenger data at different levels of temporal granularity. Moreover, we provide extra statistical information in the form of density plots, calendar-based visualizations, and line graphs. All this information is linked to the contextual metro map to give a viewer insights into the relations between time points and typical routes taken by passengers. We also integrated a graph-based view of user-selected routes, a way to interactively compare those routes, an attribute- and property-driven automatic computation of specific routes for one map as well as for all available maps in our repertoire, and finally, the most important sights in each city as extra information that can be included in a user-selected route. We illustrate the usefulness of our interactive visualization and map navigation system by applying it to the railway system of Hamburg in Germany while also taking into account the extra passenger data. As another indication of the usefulness of the interactively enhanced metro maps, we conducted a controlled user experiment with 20 participants.
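The calendar-based and density views described above rest on a simple aggregation of passenger events by time cell. A minimal sketch of such an aggregation (timestamps and field layout are illustrative assumptions, not taken from the paper):

```python
from collections import defaultdict
from datetime import datetime

def hourly_counts(check_ins):
    """Aggregate raw passenger check-in timestamps into counts per
    (weekday, hour) cell, the basis of a calendar-style heat map."""
    counts = defaultdict(int)
    for ts in check_ins:
        t = datetime.fromisoformat(ts)
        counts[(t.strftime("%A"), t.hour)] += 1
    return dict(counts)

# hypothetical check-in events for one station
sample = ["2024-05-06T08:15:00", "2024-05-06T08:40:00", "2024-05-07T17:05:00"]
print(hourly_counts(sample))  # → {('Monday', 8): 2, ('Tuesday', 17): 1}
```

Each cell of the resulting dictionary maps directly to one tile of a weekday-by-hour calendar view.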
Framework for integrating intelligent product structures into a flexible manufacturing system
(2023)
Increasing individualisation of products with high variety and shorter product lifecycles results in smaller lot sizes, increasing order numbers, and rising data and information processing demands for manufacturing companies. To cope with these trends, integrated management of product and manufacturing information is necessary through a “product-driven” manufacturing system. Intelligent products that are integrated as an active element within the controlling and planning of the manufacturing process can offer flexibility advantages for the system. However, there are still challenges regarding system integration and the evaluation of product intelligence structures. In light of these trends, this paper proposes a conceptual framework for defining, analysing, and evaluating intelligent products using the example of an assembly system. The paper begins with a classification of the existing problems in assembly and a definition of the intelligence level. In contrast to previous approaches, the analysis of products is expanded to five dimensions. Based on this, a structured evaluation method for a use case is presented. The structure for solving the assembly problem is provided by the use-case-specific ontology model. Results are presented in terms of an assignment of different application areas, linking the problem with the target intelligence class and, depending on the intelligence class of the product, suggesting requirements for implementation. The conceptual framework is evaluated using a case study in a learning factory. Here, the model-mix assembly is controlled actively by the workpiece carrier, which transfers the variant-specific work instructions to the operator and the collaborative robot (cobot) at the workstations. The resulting system thus enables better exploitation of the potentials through less frequent errors and shorter search times.
Such an implementation demonstrates that the intelligent workpiece carrier represents an additional building block for realising a cyber-physical production system (CPPS).
This study describes a non-contact measuring and system identification procedure for evaluating inhomogeneous stiffness and damping characteristics of the annular ligament in the physiological amplitude and frequency range without the application of large static external forces that can cause unnatural displacements of the stapes. To verify the procedure, measurements were first conducted on a steel beam. Then, measurements on an individual human cadaveric temporal bone sample were performed. The estimated results support the inhomogeneous stiffness and damping distribution of the annular ligament and are in good agreement with the multiphoton microscopy results, which show that the posterior-inferior corner of the stapes footplate is the stiffest region of the annular ligament.
A configuration-management-database driven approach for fabric-process specification and automation
(2014)
In this paper, we describe an approach that integrates a Configuration-Management-Database into fabric-process specification and automation in order to consider different conditions with regard to cloud services. By implementing our approach, the complexity of fabric processes is reduced. We developed a prototype using formal prototyping principles as the research method and integrated the Configuration-Management-Database Command into the Workflow-Management-System Activiti. We used this prototype to evaluate our approach: we implemented three different fabric processes and show that using our approach reduces the complexity of all three.
This article describes the brand management of professional football clubs through the use of social media. To become somewhat independent of unplannable sporting success, football clubs should position themselves as brands. Traditionally, however, only a small marketing budget is available to them for this purpose. Social media offers football clubs the opportunity to build and maintain their own brand relatively inexpensively and effectively. In this regard, the article explains the necessity of systematic brand management, addresses the particularities of marketing a professional football club, and uses examples to show how social media can be used to build and maintain a brand.
Digital transformation is the dominant business transformation of today, strongly influencing how digital services and products are designed in a service-dominant way. A popular underlying theory of value creation and economic exchange, known as service-dominant (S-D) logic, can be linked to many successful digital business models. However, S-D logic itself is abstract; companies cannot readily use it as an instrument for business model innovation and design. To change this, a comprehensive ideation method based on S-D logic is proposed, called service-dominant design (SDD). SDD aims to support companies in transitioning to a service- and value-oriented perspective. The method offers a simplified way of structuring the ideation process around four model components. Each component consists of practical implications, guiding questions, and visualization techniques, derived from a literature review, a digital-mobility use case evaluation, and a focus group discussion. SDD is a first step toward a toolset that can support established companies in service and value orientation as part of their digital transformation.
Werttreiber Lean Production
(2013)
Do companies that use lean production methods increase their enterprise value, and if so, by how much? The team of authors at Reutlingen University examined the interplay of the management concepts working capital management and value orientation and presents the encouraging results using one scenario each for a large company and an SME.
Under the Covid-19 restrictions, suppliers in the automotive sector were hit considerably harder than vehicle manufacturers. The development of working capital in the first year of the pandemic proved especially critical. This article gives an overview of possible solutions for stable supply chain financing that is more advantageous to all parties in future crises.
The digitization of factories will be a significant issue for the 2020s. New scenarios are emerging to increase the efficiency of production lines inside the factory, based on a new generation of robots’ collaborative functions. Manufacturers are moving towards data-driven ecosystems by leveraging product lifecycle data from connected goods. Energy-efficient communication schemes, as well as scalable data analytics, will support these various data collection scenarios. With augmented reality, new remote services are emerging that facilitate the efficient sharing of knowledge in the factory. Future communication solutions should ensure transparent, real-time, and secure connectivity between the various production sites spread worldwide and new players in the value chain (e.g., suppliers, logistics). Industry 4.0 brings more intelligence and flexibility to production, resulting in more lightweight equipment and thus better ergonomics. 5G will guarantee real-time transmissions with latencies of less than 1 ms, providing manufacturers with new possibilities to collect data and trigger actions automatically.
Silicon neurons (SiNs) represent different levels of biological detail and accuracy as a trade-off between complexity and power consumption. With respect to this trade-off and their high similarity to neuron behaviour models, relaxation-type oscillator circuits often yield a good compromise for emulating neurons. In this chapter, two exemplary relaxation-type silicon neurons are presented that emulate neural behaviour with an energy consumption below the scale of nJ/spike. The first proposed fully CMOS relaxation SiN is based on the mathematical Izhikevich model and can mimic a broad range of physiologically observable spike patterns. Results for several kinds of biologically plausible output patterns and the coupling process of two SiNs are presented in 0.35 μm CMOS technology. The second type is a novel ultra-low-frequency hybrid CMOS-memristive SiN based on relaxation oscillators and analog memristive devices. The hybrid SiN directly emulates neuron behaviour in the range of physiological spiking frequencies (less than 100 Hz). The relaxation oscillator is implemented and fabricated in 0.13 μm CMOS technology. An autonomous neuronal synchronization process is demonstrated in measurements with two relaxation oscillators coupled by an analog memristive device, emulating the synchronous behaviour of spiking neurons.
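For readers unfamiliar with the Izhikevich model mentioned above, a forward-Euler sketch shows the dynamics the first SiN emulates in silicon. The parameter values are the commonly published regular-spiking set, assumed here for illustration; the step size and input current are likewise illustrative:

```python
def izhikevich(a=0.02, b=0.2, c=-65.0, d=8.0, I=10.0, dt=0.5, t_max=500.0):
    """Forward-Euler integration of the Izhikevich neuron model
    (v' = 0.04v^2 + 5v + 140 - u + I,  u' = a(bv - u)).
    Returns the number of spikes fired under constant input current I."""
    v, u = c, b * c          # start at the reset potential
    spikes = 0
    for _ in range(int(t_max / dt)):
        v += dt * (0.04 * v * v + 5 * v + 140 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:        # spike: reset membrane and bump recovery variable
            v, u = c, u + d
            spikes += 1
    return spikes

print(izhikevich())  # regular-spiking parameters fire repeatedly under I = 10
```

Changing (a, b, c, d) reproduces the different physiologically observable spike patterns the chapter refers to, e.g. bursting or fast spiking.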
IOS 2.0: new aspects on inter-organizational integration through enterprise 2.0 technologies
(2015)
This special theme of "Electronic Markets" focuses on research concerned with the use of social technologies and "2.0" principles in the interaction between organizations (i.e., with "inter-organizational systems (IOS) 2.0"). This theme falls within the larger space of Enterprise 2.0 research, but focuses in particular on inter-organizational use (between enterprises), not intra-organizational use (in a single enterprise). While there is great interest in practice regarding the use of 2.0 technologies to support intra-organizational communication, collaboration, and interaction, information systems (IS) research has largely been oblivious to this important use of social technologies.
Introduction to the special issue on self‑managing and hardware‑optimized database systems 2022
(2023)
Data management systems have evolved in terms of functionality, performance characteristics, complexity, and variety during the last 40 years. In particular, relational database management systems and big data systems (e.g., key-value stores, document stores, graph stores and graph computation systems, Spark, MapReduce/Hadoop, or data stream processing systems) have evolved with novel additions and extensions. However, systems administration tasks have become highly complex and expensive, especially given the simultaneous and rapid hardware evolution in processors, memory, storage, and networking. These developments present new open problems and challenges to data management systems as well as new opportunities.
The SMDB (International Workshop on Self-Managing Database Systems) and HardBD&Active (Joint International Workshop on Big Data Management on Emerging Hardware and Data Management on Virtualized Active Systems) workshops organized in conjunction with the IEEE ICDE (International Conference on Data Engineering) offered two distinct platforms for examining the above system-related challenges from different perspectives. The SMDB workshop looks into developing autonomic or self-* features in database and data management systems to tackle complex administrative tasks, while the HardBD&Active workshop focuses on harnessing hardware technologies to enhance efficiency and performance of data processing and management tasks. As a result of these workshops, we are delighted to present the third special issue of DAPD titled “Self-Managing and Hardware-Optimized Database Systems 2022,” which showcases the best contributions from the SMDB 2021/2022 and HardBD&Active 2021/2022 workshops.
The aim of this work is the development of an artificial intelligence (AI) application to support the recruiting process, elevating the domain of human resource management by advancing its capabilities and effectiveness. This affects recruiting processes and includes solutions for active sourcing (i.e., active recruitment), pre-sorting, evaluating structured video interviews, and discovering internal training potential. This work highlights four novel approaches to ethical machine learning. The first is precise machine learning for ethically relevant properties in image recognition, which focuses on accurately detecting and analysing these properties. The second is the detection of bias in training data, allowing for the identification and removal of distortions that could skew results. The third is minimising bias, which involves actively working to reduce bias in machine learning models. Finally, an unsupervised architecture is introduced that can learn fair results even without ground truth data. Together, these approaches represent important steps forward in creating ethical and unbiased machine learning systems.
In recent years, 3D facial reconstructions from single images have garnered significant interest. Most of the approaches are based on 3D Morphable Model (3DMM) fitting to reconstruct the 3D face shape. Concurrently, the adoption of Generative Adversarial Networks (GAN) has been gaining momentum to improve the texture of reconstructed faces. In this paper, we propose a fundamentally different approach to reconstructing the 3D head shape from a single image by harnessing the power of GAN. Our method predicts three maps of normal vectors of the head’s frontal, left, and right poses. We are thus presenting a model-free method that does not require any prior knowledge of the object’s geometry to be reconstructed.
The key advantage of our proposed approach is the substantial improvement in reconstruction quality compared to existing methods, particularly for facial regions that are self-occluded in the input image. Our method is not limited to 3D face reconstruction; it is generic and applicable to multiple kinds of 3D objects. To illustrate its versatility, we demonstrate its efficacy in reconstructing the entire human body.
By delivering a model-free method capable of generating high-quality 3D reconstructions, this paper not only advances the field of 3D facial reconstruction but also provides a foundation for future research and applications spanning multiple object types. The implications of this work have the potential to extend far beyond facial reconstruction, paving the way for innovative solutions and discoveries in various domains.
Sleep is an important aspect of the life of every human being. The average sleep duration for an adult is approximately 7 h per day. Sleep is necessary to regenerate the physical and psychological state of a human, and poor sleep quality has a major impact on health status and can lead to various diseases. This paper presents an approach that uses long-term monitoring of vital data, gathered by a body sensor during the day and night and supported by a mobile application connected to an analysis system, to estimate the sleep quality of its user and give real-time recommendations to improve it. Actimetry and historical data are used to improve the individual recommendations, based on common techniques from machine learning and big data analysis.
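The paper's estimator is learned from actimetry and historical data; as a purely illustrative stand-in, a rule-based score shows the kind of mapping from nightly vital data to a sleep-quality value (all weights and parameter names below are invented, not from the paper):

```python
def sleep_quality_score(avg_night_hr, resting_hr, movements_per_hour, sleep_hours):
    """Illustrative 0-100 sleep-quality score combining three signals:
    distance of the nightly heart rate from the resting baseline,
    restlessness (actimetry), and deviation from ~7 h of sleep.
    Weights are made-up placeholders for a learned model."""
    hr_penalty = max(0.0, avg_night_hr - resting_hr) * 2.0
    movement_penalty = movements_per_hour * 1.5
    duration_penalty = abs(sleep_hours - 7.0) * 10.0
    return max(0.0, min(100.0, 100.0 - hr_penalty - movement_penalty - duration_penalty))

print(sleep_quality_score(58, 55, 4, 6.5))  # → 83.0
```

In the proposed system, a score like this would be replaced by a model trained per user, and low scores would trigger the real-time recommendations mentioned above.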
Stress is becoming an important topic in modern life. The influence of stress results in a higher rate of health disorders such as burnout, heart problems, obesity, asthma, diabetes, depression, and many others. Furthermore, individuals' behavior and capabilities can be directly affected, leading to altered cognition and impaired decision-making and problem-solving skills. In a dynamic and unpredictable environment, such as the automotive one, this can result in a higher risk of accidents. Several papers have addressed the estimation and prediction of drivers' stress levels while driving. Another important question concerns not only the stress level of the driver himself, but also the influence on and of a group of other drivers in the nearby area. This paper proposes a system that determines groups of drivers in a nearby area as clusters and derives their individual stress levels. This information is analyzed to generate a stress map, a graphical view of road sections with higher stress influence. Aggregated data can be used to generate navigation routes with lower stress influence, reducing stress-influenced driving and improving road safety.
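The cluster-and-aggregate step behind such a stress map can be pictured with a minimal grid-bucketing sketch; the cell size, coordinates, and stress values below are illustrative assumptions, not the paper's method:

```python
from collections import defaultdict

def stress_map(readings, cell_size=0.01):
    """Bucket (lat, lon, stress) readings into grid cells and return the
    mean stress per cell -- a minimal sketch of a stress-map aggregation."""
    cells = defaultdict(list)
    for lat, lon, stress in readings:
        key = (int(lat / cell_size), int(lon / cell_size))  # grid cell index
        cells[key].append(stress)
    return {key: sum(v) / len(v) for key, v in cells.items()}

# hypothetical readings: two drivers on the same road section, one elsewhere
readings = [(48.492, 9.204, 0.8), (48.492, 9.204, 0.6), (48.501, 9.210, 0.2)]
print(stress_map(readings))
```

A router could then penalize edges crossing high-mean cells to produce the lower-stress navigation routes the abstract mentions.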
Because of high product and technology complexity, companies involve external partners in their research and development (R&D) processes. The result is interorganizational projects, which represent temporary organizations in which heterogeneous organizations work closely together. Since project work is always teamwork, these projects, due to their characteristics, face major challenges on an organizational, relational, and content-related collaboration level. Thus, this paper raises the following research question: “How can a project team be supported on an organizational, relational, and content-related level in an interorganizational new product development setting?” To answer this research question, an explorative expert study was set up with two digital workshops using the interactive presentation tool Mentimeter. The results show that a cooperative innovation culture could support project teams on an organizational and relational level in minimizing predominant problems, for example by supporting functional communication. Furthermore, 18 values of a cooperative innovation culture emerge, for example openness and transparency, risk and failure tolerance, and respect. On a content-related level, the results show that an adaptable tool which promotes creativity and collaboration methods as well as content-related input could be beneficial for problem-solving in an interorganizational new product development setting, because such a tool can guide product developers through the process with suitable creativity and collaboration methods, give content-related input, and enable interactive interchange on a table-top. Future research could focus on the connection between the cooperative innovation culture and the tool, since these potentially influence each other.
The fifth mobile communications generation (5G) can lead to a substantial change in companies by enabling the full capability of wireless industrial communication. 5G, with its key features of Enhanced Mobile Broadband, Ultra-Reliable and Low-Latency Communication, and Massive Machine Type Communication, will support the implementation of Industry 4.0 applications. In particular, the possibility to set up Non-Public Networks enables 5G communication in factories and ensures sole access to the 5G infrastructure, offering new opportunities for companies to implement innovative mobile applications. Currently, various concepts, ideas, and projects exist for 5G applications in an industrial environment. However, the global rollout of 5G systems is a continuous process based on various stages defined by the 3rd Generation Partnership Project, the global initiative that develops and specifies the 5G telecommunication standard. Accordingly, some services are currently still far from their final performance capability or not yet implemented. Additionally, research has yet to clarify the general suitability of 5G for frequently mentioned 5G use cases. This paper aims to identify relevant 5G use cases for intralogistics and evaluates their technical requirements regarding their practical feasibility throughout the upcoming 5G specifications.
Blockchain technology represents a decentralized database that stores information securely in immutable data blocks. For supply chain management, these characteristics offer potential for increasing supply chain transparency, visibility, automation, and efficiency. In this context, first token-based mapping approaches exist to transfer certain manufacturing processes to the blockchain, such as the creation or assembly of parts as well as their transfer of ownership. However, the decentralized and immutable structure of blockchain technology also creates challenges when applying these token-based approaches to dynamic manufacturing processes. As a first step, this paper investigates existing mapping approaches and exemplifies weaknesses regarding their suitability for products with changeable configurations. Secondly, a concept is proposed to overcome these weaknesses by introducing logically coupled tokens embedded in a flexible smart contract structure. Finally, a concept for a token-based architecture is introduced to map manufacturing processes of products with changeable configurations.
Blockchain technology represents a decentralised database that stores information securely in immutable data blocks. For supply chain management, these characteristics offer potential for increasing supply chain transparency, visibility, automation, and efficiency. In this context, first token-based mapping approaches exist to transfer certain manufacturing processes to the blockchain, such as the creation or assembly of parts as well as their transfer of ownership. This paper proposes a prototypical blockchain application that adopts an authority concept and a concept of smart non-fungible tokens. The application enables the mapping of complex products in dynamic supply chains that require the auditability of changeable assembly processes on the blockchain. Finally, the paper demonstrates the practical feasibility of the proposed application based on a prototypical implementation created on the Ethereum blockchain.
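To make the idea of logically coupled tokens for changeable configurations concrete, here is a minimal off-chain sketch in Python (the papers target smart contracts on a blockchain; the class and method names here are invented stand-ins):

```python
class TokenRegistry:
    """Minimal in-memory sketch of logically coupled tokens: component
    tokens stay alive and are linked to a product token on assembly, so
    the configuration remains changeable while every step stays auditable."""
    def __init__(self):
        self.tokens = {}   # token_id -> {"owner": ..., "parts": set()}
        self.log = []      # append-only audit trail (stands in for the chain)

    def mint(self, token_id, owner):
        self.tokens[token_id] = {"owner": owner, "parts": set()}
        self.log.append(("mint", token_id, owner))

    def assemble(self, product_id, part_id):
        self.tokens[product_id]["parts"].add(part_id)
        self.log.append(("assemble", product_id, part_id))

    def disassemble(self, product_id, part_id):
        self.tokens[product_id]["parts"].remove(part_id)
        self.log.append(("disassemble", product_id, part_id))

reg = TokenRegistry()
reg.mint("car-1", "plant-A"); reg.mint("battery-7", "supplier-B")
reg.assemble("car-1", "battery-7")      # couple component token to product token
reg.disassemble("car-1", "battery-7")   # reconfiguration, still fully logged
print(len(reg.log), reg.tokens["car-1"]["parts"])
```

Unlike burning and re-minting a product token on every change, the coupling link can be added and removed while both tokens, and the full history, persist.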
Distributed ledger technologies such as the blockchain technology offer an innovative solution to increase visibility and security to reduce supply chain risks. This paper proposes a solution to increase the transparency and auditability of manufactured products in collaborative networks by adopting smart contract-based virtual identities. Compared with existing approaches, this extended smart contract-based solution offers manufacturing networks the possibility of involving privacy, content updating, and portability approaches to smart contracts. As a result, the solution is suitable for the dynamic administration of complex supply chains.
Companies are becoming aware of the potential risks arising from sustainability aspects in supply chains. These risks can affect ecological, economic or social aspects. One important element in managing those risks is improved transparency in supply chains by means of digital transformation. Innovative technologies like blockchain technology can be used to enforce transparency. In this paper, we present a smart contract-based Supply Chain Control Solution to reduce risks. Technological capabilities of the solution will be compared to a similar technology approach and evaluated regarding their benefits and challenges within the framework of supply chain models. As a result, the proposed solution is suitable for the dynamic administration of complex supply chains.
The critical process parameters cell density and viability during mammalian cell cultivation are assessed by UV/VIS spectroscopy in combination with multivariate data analysis methods. This direct optical detection technique uses a commercial optical probe to acquire spectra in a label-free way without signal enhancement. For the cultivation, an inverse cultivation protocol is applied, which simulates the exponential growth phase by exponentially replacing cells and metabolites of a growing Chinese hamster ovary cell batch with fresh medium. For the simulation of the death phase, a batch of growing cells is progressively replaced by a batch of completely starved cells. Thus, the most important parts of an industrial batch cultivation are easily imitated. The cell viability was determined by the well-established method of partial least squares regression (PLS). To further improve process knowledge, the viability has also been determined from the spectra based on a multivariate curve resolution (MCR) model. With this approach, the progress of the cultivations can be continuously monitored solely based on a UV/VIS sensor. Thus, the monitoring of critical process parameters, especially the viable cell density, is possible inline within a mammalian cell cultivation process. In addition, the beginning of cell death can be detected by this method, which allows the cell viability to be determined with acceptable error. The combination of inline UV/VIS spectroscopy with multivariate curve resolution generates additional process knowledge complementary to PLS and is considered a suitable process analytical tool for monitoring industrial cultivation processes.
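PLS operates on full spectra; as a minimal single-wavelength stand-in, an ordinary least-squares calibration line conveys the basic idea of mapping an optical signal to a cell-density estimate (all numbers below are invented, not from the study):

```python
def fit_line(x, y):
    """Ordinary least-squares fit y ≈ a*x + b -- a single-wavelength
    stand-in for the multivariate PLS calibration used in the study."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

# hypothetical absorbance readings vs. off-line viable cell density (1e6 cells/mL)
absorbance = [0.10, 0.20, 0.30, 0.40]
cell_density = [1.1, 2.0, 3.1, 4.0]
a, b = fit_line(absorbance, cell_density)
print(round(a, 2), round(b, 2))  # → 9.8 0.1
```

PLS generalizes this to hundreds of correlated wavelengths by regressing on a few latent components, which is what makes the label-free inline monitoring described above feasible.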
Digitalization and mediatization shape society and also adult and continuing education. This article explores the question of how digitalization succeeds in adult and continuing education offerings, thereby placing a focus on the use of digital media. To this end, programme development for addressees and participants, media-related content, teaching and learning arrangements with digital media, the use of digital media, and the accessibility of teaching and learning materials are identified as relevant characteristics. Overall, the analyzed interview data show that the use of digital media in such offerings represents an expansion of didactic tasks, since offerings with digital media must be precisely tailored to the needs and capabilities of addressees and participants.
Ecuador, traditionally an agriculture-based economy, has great potential for valorizing its industrial residues. This study presents a techno-economic analysis of applying a novel biomass oxidation method to produce formic and acetic acids from coffee husk residues in Machala, Ecuador. The analysis determined that the return-on-investment time was lower than 5 years, making this project economically feasible when producing approximately 1000 tons of formic acid per year, which is enough to supply the Ecuadorian market. This production would reduce import costs and develop the chemical industry in the country.
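The feasibility criterion in such an analysis reduces to a simple undiscounted payback check; the figures below are invented placeholders, not the study's numbers:

```python
def payback_years(investment, annual_net_cash_flow):
    """Simple (undiscounted) payback period, a first-pass feasibility check
    before a full discounted-cash-flow analysis."""
    return investment / annual_net_cash_flow

# hypothetical figures: plant cost vs. yearly margin from ~1000 t formic acid
print(payback_years(4_000_000, 900_000) < 5)  # → True, i.e. feasible by the <5-year rule
```

A complete techno-economic analysis would discount the cash flows and add sensitivity to feedstock cost and acid price, but the go/no-go threshold works the same way.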
Programmable nano-bio interfaces driven by tuneable vertically configured nanostructures have recently emerged as a powerful tool for cellular manipulations and interrogations. Such interfaces have strong potential for ground-breaking advances, particularly in cellular nanobiotechnology and mechanobiology. However, the opaque nature of many nanostructured surfaces makes non-destructive, live-cell characterization of cellular behavior on vertically aligned nanostructures challenging to observe. Here, a new nanofabrication route is proposed that enables harvesting of vertically aligned silicon (Si) nanowires and their subsequent transfer onto an optically transparent substrate, with high efficiency and without artefacts. We demonstrate the potential of this route for efficient live-cell phase contrast imaging and subsequent characterization of cells growing on vertically aligned Si nanowires. This approach provides the first opportunity to understand dynamic cellular responses to a cell-nanowire interface, and thus has the potential to inform the design of future nanoscale cellular manipulation technologies.
Empirical software engineering experts on the use of students and professionals in experiments
(2018)
Using students as participants remains a valid simplification of reality needed in laboratory contexts. It is an effective way to advance software engineering theories and technologies but, like any other aspect of study settings, should be carefully considered during the design, execution, interpretation, and reporting of an experiment. The key is to understand which developer population portion is being represented by the participants in an experiment. Thus, a proposal for describing experimental participants is put forward.
The relative pros and cons of using students or practitioners in experiments in empirical software engineering have been discussed for a long time and continue to be an important topic. Following the recent publication of “Empirical software engineering experts on the use of students and professionals in experiments” by Falessi, Juristo, Wohlin, Turhan, Münch, Jedlitschka, and Oivo (EMSE, February 2018) we received a commentary by Sjøberg and Bergersen. Given that the topic is of great methodological interest to the community and requires nuanced treatment, we invited two editorial board members, Martin Shepperd and Per Runeson, respectively, to provide additional views.
The analysis of exhaled metabolites has become a promising field of research in recent decades. Several volatile organic compounds reflecting metabolic disturbance and nutrition status have been reported. These are particularly important for long-term measurements, as needed in medical research for detection of disease progression and therapeutic efficacy. In this context, it has become urgent to investigate the effect of fasting and glucose treatment for breath analysis. In the present study, we used a model of ventilated rats that fasted for 12 h prior to the experiment. Ten rats per group were randomly assigned for continuous intravenous infusion without glucose or an infusion including 25 mg glucose per 100 g per hour during an observation period of 12 h. Exhaled gas was analysed using multicapillary column ion-mobility spectrometry. Analytes were identified by the BS-MCC/IMS database (version 1209; B & S Analytik, Dortmund, Germany). Glucose infusion led to a significant increase in blood glucose levels (p<0.05 at 4 h and thereafter) and cardiac output (p<0.05 at 4 h and thereafter). During the observation period, 39 peaks were found collectively. There were significant differences between groups in the concentration of ten volatile organic compounds: p<0.001 at 4 h and thereafter for isoprene, cyclohexanone, acetone, p-cymol, 2-hexanone, phenylacetylene, and one unknown compound, and p<0.001 at 8 h and thereafter for 1-pentanol, 1-propanol, and 2-heptanol. Our results indicate that for long-term measurement, fasting and the withholding of glucose could contribute to changes of volatile metabolites in exhaled air.
Social and environmental risk management in supply chains: a survey in the clothing industry
(2015)
Almost daily, news indicates that there are environmental and social problems in globally fragmented supply chains. Even though conceptualisations of sustainable supply chain management suggest supplier-related risk management for sustainable products and processes as substantial for companies, research on how risk management for environmental and social issues in supply chains is performed has so far been neglected. This study aims at analysing both why companies in the clothing industry are performing management of social and environmental risks in their supply chain and what kind of action they are taking. Based on the literature on sustainable supply chain management and supply chain risk management as well as 10 expert interviews, a conceptual model for risk management in sustainable supply chains was developed. This model was tested in an empirical study in the clothing industry. The data were analysed by structural equation modelling. Results of the research show high statistical significance for the respective conceptual model. The main drivers for performing risk management in environmental and social affairs are pressures and incentives from stakeholders. While companies’ corporate orientation mainly drives social actions, top management drives environmental affairs in order to differentiate the company from competitors.
While the recently emerged microservices architectural style is widely discussed in literature, it is difficult to find clear guidance on the process of refactoring legacy applications. The importance of the topic is underpinned by high costs and effort of a refactoring process which has several other implications, e.g. overall processes (DevOps) and team structure. Software architects facing this challenge are in need of selecting an appropriate strategy and refactoring technique. One of the most discussed aspects in this context is finding the right service granularity to fully leverage the advantages of a microservices architecture. This study first discusses the notion of architectural refactoring and subsequently compares 10 existing refactoring approaches recently proposed in academic literature. The approaches are classified by the underlying decomposition technique and visually presented in the form of a decision guide for quick reference. The review yielded a variety of strategies to break down a monolithic application into independent services. With one exception, most approaches are only applicable under certain conditions. Further concerns are the significant amount of input data some approaches require as well as limited or prototypical tool support.
Sustainable management models are geared towards fulfilling the triple bottom line: companies address energy and CO2 efficiency (ecological), occupational safety or unemployment rates (social), as well as potential growth opportunities achievable through sustainability, in order to secure the company's own survival in the market (economic). In addition, there are the 17 Sustainable Development Goals (SDGs), which serve as a global guideline for sustainable business until 2030 and have been transposed into national legislation. This contribution develops a management model that supports companies in identifying relevant SDGs and deriving recommendations for action. Building on a sustainable supply chain, the model assigns the SDGs to the dimensions of the triple bottom line in order to briefly compile a checklist of possible sustainability measures to be considered in the context of the behaviour change model. Drawing on the recommendations of the United Nations, a sustainable management approach is introduced that enables companies to implement governance, transparency, and engagement in their supply chain.
Children undergoing systemic chemotherapy often suffer from severe immunosuppression, usually associated with severe neutropenia (neutrophils < 0.5 x 109/l). Clinical courses during those periods range from asymptomatic to septic general conditions. The development of septic symptoms can be very fast and life-threatening. Swift detection of risk factors in those patients is therefore needed. So far, no early, rapid and reliable marker or tool exists. Ion-mobility spectrometry coupled with a multi-capillary column (IMS-MCC) can analyze more than 600 volatile components from exhaled air within a few minutes and is hence a potential rapid detection tool. As a proof of concept, we measured the exhaled breath of 11 patients with neutropenia and 10 healthy controls aged 3 to 18 years at the time of measurement. Ten-milliliter breath samples were taken at the outpatient clinic and analyzed with an onsite IMS-MCC (BreathDiscovery, B&S Analytik, Dortmund, Germany). Dead-space volume was adapted to two groups (small 250 ml, large 500 ml). In total, 59 differing peaks were measured. Eleven were significantly different (p ≤ 0.05), three of which were highly significant (p ≤ 0.01) in Mann-Whitney rank-sum testing. The analytes used in the decision tree are 2-Propanol, D-Limonene and Acetone. The analytes identified with the lowest rank sum are 2-Hexanone, Iso-Propylamine and 1-Butanol. Ultimately, we were able to derive a three-step decision tree, which correctly classified all 21 samples except one from each group. Sensitivity was 90 % and specificity was 91 %. Naturally, these findings need further confirmation in a larger population. Our pilot study shows that ion-mobility spectrometry coupled with a multi-capillary column is a feasible rapid diagnostic tool in the setting of a pediatric oncology outpatient clinic for patients 3 years and older.
Our first results furthermore encourage additional analysis as to whether patients at risk for septic events during immunosuppression can be diagnosed in advance by rapidly assessing risk factors such as neutropenia in exhaled breath.
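The three-step decision tree described in this abstract can be pictured as a chain of single-analyte threshold tests. The sketch below is purely illustrative: the threshold values and the order of the tests are invented, since the abstract names only the analytes (2-Propanol, D-Limonene, Acetone), not the actual cut-offs.

```python
# Hypothetical sketch of a three-step threshold decision tree over the three
# analytes named in the abstract. All threshold values are invented for
# illustration; the study's real cut-offs are not given here.

def classify(sample, t_propanol=0.5, t_limonene=0.3, t_acetone=0.8):
    """Classify a breath sample as 'neutropenic' or 'control'.

    `sample` maps analyte names to normalised peak intensities.
    Each step inspects one analyte, mirroring a three-level tree.
    """
    if sample["2-Propanol"] > t_propanol:   # step 1
        return "neutropenic"
    if sample["D-Limonene"] > t_limonene:   # step 2
        return "neutropenic"
    if sample["Acetone"] > t_acetone:       # step 3
        return "neutropenic"
    return "control"

# Example with invented intensities:
print(classify({"2-Propanol": 0.7, "D-Limonene": 0.1, "Acetone": 0.2}))
```

In the study, such a tree would be fitted to the measured peak intensities; here it only illustrates how three sequential tests can separate the two groups.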
Sleep quality and, in general, behavior in bed can be detected using sleep state analysis. These results can help a subject to regulate sleep and recognize different sleeping disorders. In this work, a sensor grid for pressure and movement detection supporting sleep phase analysis is proposed. In comparison to the leading standard measuring system, polysomnography (PSG), the system proposed in this project is a non-invasive sleep monitoring device. For continuous analysis or home use, PSG or wearable actigraphy devices tend to be uncomfortable. They are also very expensive. The system presented in this work classifies respiration and body movement with only one type of sensor, and in a non-invasive way. The sensor used is a pressure sensor, which is low cost and can be used for commercial purposes. The system was tested in an experiment that recorded the sleep process of a subject. These recordings showed the potential for classification of breathing rate and body movements. Although previous research shows the use of pressure sensors in recognizing posture and breathing, the sensors have mostly been positioned between the mattress and the bedsheet. This project, however, shows an innovative way to position the sensors under the mattress.
The main aim of the research presented in this manuscript is to compare the results of objective and subjective measurement of sleep quality for older adults (65+) in the home environment. A total of 73 nights was evaluated in this study. A device placed under the mattress was used to obtain the objective measurement data, and a common question on perceived sleep quality was asked to collect the subjective sleep quality level. The achieved results confirm the correlation between objective and subjective measurement of sleep quality, with an average standard deviation of 2 out of 10 possible quality points.
Identification of sleep and wake states through the evaluation of respiratory and movement signals
(2021)
Research question: The clinical standard procedure and reference for sleep measurement and the classification of individual sleep stages is polysomnography (PSG). Alternative approaches to this elaborate procedure could offer several advantages if the measurements are performed in a more comfortable way. The main goal of this research study is to develop an algorithm for the automatic classification of sleep stages that uses only movement and respiratory signals [1].
Patients and methods: After analysing the current research, we chose multinomial logistic regression as the basis for the approach [2]. To increase the accuracy of the evaluation, four features derived from movement and respiratory signals were developed. For the evaluation, nocturnal recordings of 35 subjects provided by Charité-Universitätsmedizin Berlin were used. The average age of the participants was 38.6 +/- 14.5 years and the average BMI was 24.4 +/- 4.9 kg/m2. Since the algorithm works with three stages, stages N1, N2 and N3 were merged into the NREM stage. The available data set was strictly split into a training data set of about 100 h and a test data set of about 160 h of nocturnal recordings. Both data sets had a similar ratio of men to women, and the average BMI showed no significant deviation.
Results: The algorithm was implemented and delivered successful results: the accuracy of detecting wake/NREM/REM phases is 73 %, with a Cohen's kappa of 0.44 for the 19,324 analysed sleep epochs of 30 s each. The observed overestimation of the NREM phase can partly be explained by its prevalence in a typical sleep pattern. Even the use of a balanced training data set could not fully resolve this problem.
Conclusions: The achieved results confirmed the suitability of the approach in principle. It has the advantage of using only movement and respiratory signals, which can be recorded with less effort and more comfortably for users than, for example, cardiac or EEG signals. The new system thus represents a clear improvement over existing approaches. Merging the described algorithmic software with the hardware system for measuring respiratory and body movement signals described in [1] into an autonomous, contactless system for continuous sleep monitoring is a possible direction for future work.
Recognition of sleep and wake states is one of the relevant parts of sleep analysis. Performing this measurement in a contactless way increases comfort for the users. We present an approach that evaluates only movement and respiratory signals, which can be measured non-obtrusively, to achieve this recognition. The algorithm is based on multinomial logistic regression and analyses features extracted from the above-mentioned signals. These features were identified and developed after fundamental research on the characteristics of vital signals during sleep. The achieved accuracy of 87% with a Cohen's kappa of 0.40 demonstrates the appropriateness of the chosen method and encourages continuing research on this topic.
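The classification step used in these sleep-staging abstracts can be illustrated by a minimal multinomial (softmax) logistic-regression predictor over a handful of features. The weight values below are invented placeholders, not the fitted model from the studies; in practice the weights would be estimated on the training recordings.

```python
import math

# Minimal sketch of a multinomial (softmax) logistic-regression classifier for
# three sleep states, assuming four features derived from movement and
# respiratory signals. All weights are hypothetical illustrations.

CLASSES = ["Wake", "NREM", "REM"]

# One weight vector (bias followed by 4 feature weights) per class.
WEIGHTS = {
    "Wake": [0.2,  1.5, -0.8,  0.1,  0.3],
    "NREM": [0.5, -1.2,  0.9, -0.4,  0.2],
    "REM":  [0.1, -0.3, -0.1,  0.6, -0.5],
}

def predict(features):
    """Return (label, class probabilities) for one 4-feature sleep epoch."""
    scores = {}
    for c in CLASSES:
        w = WEIGHTS[c]
        scores[c] = w[0] + sum(wi * xi for wi, xi in zip(w[1:], features))
    m = max(scores.values())                       # for numerical stability
    exp = {c: math.exp(s - m) for c, s in scores.items()}
    z = sum(exp.values())
    probs = {c: e / z for c, e in exp.items()}
    return max(probs, key=probs.get), probs

# Example: a high value of the first (e.g. movement-related) feature
# pushes this epoch towards "Wake" under the invented weights.
label, probs = predict([2.0, 0.1, 0.0, 0.5])
```

The softmax turns the per-class linear scores into probabilities that sum to one, so the argmax gives the predicted stage for each 30-second epoch.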
The scoring of sleep stages is one of the essential tasks in sleep analysis. Since a manual procedure requires considerable human and financial resources, and incorporates some subjectivity, an automated approach could result in several advantages. There have been many developments in this area, and in order to provide a comprehensive overview, it is essential to review relevant recent works and summarise the characteristics of the approaches, which is the main aim of this article. To achieve it, we examined articles published between 2018 and 2022 that dealt with the automated scoring of sleep stages. In the final selection for in-depth analysis, 125 articles were included after reviewing a total of 515 publications. The results revealed that automatic scoring demonstrates good quality (with Cohen's kappa up to over 0.80 and accuracy up to over 90%) in analysing EEG/EEG + EOG + EMG signals. At the same time, it should be noted that there has been no breakthrough in the quality of results using these signals in recent years. Systems involving other signals that could potentially be acquired more conveniently for the user (e.g. respiratory, cardiac or movement signals) remain more challenging in the implementation with a high level of reliability but have considerable innovation capability. In general, automatic sleep stage scoring has excellent potential to assist medical professionals while providing an objective assessment.
Broad acceptance of finite-element-based analysis of structural problems and the increased availability of CAD-systems for structural tasks, which help to generate meshes of non-trivial geometries, have been setting a standard for the evaluation of designs in mechanical engineering in the last few decades. The development of automated or semi-automated optimizers, integrated into the Computer-Aided Engineering (CAE)-packages or working as outer loop machines, requiring the solver to do the analysis of the specific designs, has been accepted by most advanced users of the simulation community as well. The availability and inexpensive processing power of computers is increasing without any limitations foreseen in the coming years. There is little doubt that virtual product development will continue using the tools that have proved to be so successful and so easy to handle.
Virtual prototyping of integrated mixed-signal smart sensor systems requires high-performance co-simulation of analog frontend circuitry with complex digital controller hardware and embedded real-time software. We use SystemC/TLM 2.0 in conjunction with a cycle-count accurate temporal decoupling approach (TD) to simulate digital components and firmware code execution at high speed while preserving clock-cycle accuracy and, thus, real-time behavior at time quantum boundaries. Optimal time quanta ensuring real-time capability can be calculated and set automatically during simulation if the simulation engine has access to exact timing information about upcoming inter-process communication events. These methods fail in the case of non-deterministic, asynchronous events, resulting in potentially invalid simulation results. In this paper, we propose an extension to the case of asynchronous events generated by blackbox sources from which a priori event timing information is not available, such as coupled analog simulators or hardware in the loop. Additional event processing latency or rollback effort caused by temporal decoupling is minimized by calculating optimal time quanta dynamically in a SystemC model using a linear prediction scheme. We analyze the theoretical performance of the presented predictive temporal decoupling approach (PTD) by deriving a cost model that expresses the expected simulation effort in terms of key parameters such as time quantum size and CPU time per simulation cycle. For an exemplary smart-sensor system model, we show that quasi-periodic events that trigger activities in TD processes are handled accurately after the predictor has settled.
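The idea of calculating time quanta dynamically via linear prediction can be sketched as follows. This is a simplified illustration under stated assumptions, not the paper's PTD implementation: the next inter-event interval of a black-box source is extrapolated from recent observed intervals, and the quantum is capped so a temporally decoupled process does not run far past the predicted event. The class name and the first-order, two-tap predictor are illustrative choices.

```python
# Illustrative sketch: estimate the next time quantum for a temporally
# decoupled simulation process from observed inter-event intervals of a
# black-box event source (e.g. a coupled analog simulator).

class QuantumPredictor:
    def __init__(self, default_quantum):
        self.default = default_quantum   # upper bound on the quantum (cycles)
        self.intervals = []              # recently observed inter-event gaps

    def observe(self, interval):
        """Record the interval between the last two asynchronous events."""
        self.intervals.append(interval)
        if len(self.intervals) > 8:      # keep only a short history window
            self.intervals.pop(0)

    def next_quantum(self):
        """First-order linear extrapolation of the next interval."""
        h = self.intervals
        if len(h) < 2:
            return self.default          # no history yet: fall back
        predicted = h[-1] + (h[-1] - h[-2])
        # Never exceed the default quantum; never go below one cycle.
        return max(1, min(self.default, predicted))

qp = QuantumPredictor(default_quantum=500)
for gap in (100, 100, 100):              # quasi-periodic event source
    qp.observe(gap)
```

For quasi-periodic events the prediction settles on the observed period, so the decoupled process is stopped close to each upcoming event and little rollback or extra latency is incurred.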
Indoor localization systems are becoming more and more important with the digitalization of the industrial sector. Sensor data such as the current position of machines, transport vehicles, goods or tools represent an essential component of cyber-physical production systems (CPPS). However, due to the high costs of these sensors, they are not widespread and are used mainly in special scenarios. Optical indoor positioning systems (OIPS) based on cameras, in particular, have certain advantages due to their technological specifications. In this paper, the application scenarios and requirements as well as their characteristics are presented, and a classification approach for OIPS is introduced.
A 3D face modelling approach for pose-invariant face recognition in a human-robot environment
(2017)
Face analysis techniques have become a crucial component of human-machine interaction in the fields of assistive and humanoid robotics. However, the variations in head pose that arise naturally in these environments are still a great challenge. In this paper, we present a real-time capable 3D face modelling framework for 2D in-the-wild images that is applicable for robotics. The fitting of the 3D Morphable Model is based exclusively on automatically detected landmarks. After fitting, the face can be corrected in pose and transformed back to a frontal 2D representation that is more suitable for face recognition. We conduct face recognition experiments with non-frontal images from the MUCT database and uncontrolled, in-the-wild images from the PaSC database, the most challenging face recognition database to date, showing improved performance. Finally, we present our SCITOS G5 robot system, which incorporates our framework as a means of image pre-processing for face analysis.
Based on well-established robotic concepts of autonomous localization and navigation, we present a system prototype, implemented in the Robot Operating System (ROS), that assists camera-based indoor navigation for humans. Our prototype takes advantage of state-of-the-art computer vision and robotic methods. The system is designed for assistive indoor guidance. We employ a vibro-tactile belt as a guiding device to render derived motion suggestions to the user via vibration patterns. We evaluated the effectiveness of a variety of vibro-tactile feedback patterns for the guidance of blindfolded users. Our prototype demonstrates that a vision-based system can support human navigation and may also assist the visually impaired in a human-centered way.
Software startups often make assumptions about the problems and customers they are addressing as well as the market and the solutions they are developing. Testing the right assumptions early is a means to mitigate risks. Approaches such as Lean Startup foster this kind of testing by applying experimentation as part of a constant build-measure-learn feedback loop. The existing research on how software startups approach experimentation is very limited. In this study, we focus on understanding how software startups approach experimentation and identify challenges and advantages with respect to conducting experiments. To achieve this, we conducted a qualitative interview study. The initial results show that startups often spend a disproportionate amount of time on creating solutions without testing critical assumptions. The main reasons are a lack of awareness that these assumptions can be tested early, and a lack of knowledge and support on how to identify, prioritize and test them. However, startups understand the need for testing risky assumptions and are open to conducting experiments.
In the era of precision medicine, digital technologies and artificial intelligence, drug discovery and development face unprecedented opportunities for product and business model innovation, fundamentally changing the traditional approach of how drugs are discovered, developed and marketed. Critical to this transformation is the adoption of new technologies in the drug development process, catalyzing the transition from serendipity-driven to data-driven medicine. This paradigm shift comes with a need for both translation and precision, leading to a modern Translational Precision Medicine approach to drug discovery and development. Key components of Translational Precision Medicine are multi-omics profiling, digital biomarkers, model-based data integration, artificial intelligence, biomarker-guided trial designs and patient-centric companion diagnostics. In this review, we summarize and critically discuss the potential and challenges of Translational Precision Medicine from a cross-industry perspective.
In recent years, the parallel computing community has shown increasing interest in leveraging cloud resources for executing parallel applications. Clouds exhibit several fundamental features of economic value, like on-demand resource provisioning and a pay-per-use model. Additionally, several cloud providers offer their resources at significant discounts, albeit with limited availability. Such volatile resources are an auspicious opportunity to reduce the costs arising from computations, thus achieving higher cost efficiency. In this paper, we propose a cost model for quantifying the monetary costs of executing parallel applications in cloud environments, leveraging volatile resources. Using this cost model, one is able to determine a configuration of a cloud-based parallel system that minimizes the total costs of executing an application.
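The trade-off such a cost model captures can be illustrated with a deliberately simple sketch: discounted volatile instances are cheaper per hour, but revocations cost recomputation time. The formula and all parameter values below are assumptions for illustration, not the model proposed in the paper.

```python
# Hedged sketch of a simple monetary cost model for cloud execution:
# total cost = instances * price per hour * effective runtime, where
# revocations of volatile instances add lost work to the runtime.
# Formula and prices are illustrative assumptions only.

def execution_cost(runtime_h, n_instances, price_per_h,
                   revocations=0, recompute_h_per_revocation=0.0):
    """Return the total monetary cost of one application run."""
    effective_runtime = runtime_h + revocations * recompute_h_per_revocation
    return n_instances * price_per_h * effective_runtime

# On-demand: 4 instances for 10 h at $0.40/h.
on_demand = execution_cost(10, 4, 0.40)
# Volatile (spot-like) instances at a 70% discount, but with two
# revocations each costing 1 h of recomputation.
volatile = execution_cost(10, 4, 0.12,
                          revocations=2, recompute_h_per_revocation=1.0)
```

Comparing such cost estimates across candidate configurations (instance counts, resource types, expected revocation rates) is the kind of minimisation the abstract describes; the volatile run remains cheaper here despite the lost work.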