Exogenous factors of influence on exhaled breath analysis by ion-mobility spectrometry (MCC/IMS)
(2019)
The interpretation of exhaled breath analysis must account for the influence of exogenous factors, especially the transfer of confounding analytes by the test persons. A test person who had been exposed to a disinfectant underwent exhaled breath analysis by MCC/IMS (Bioscout®) after different time intervals. Additionally, a new sampling method with inhalation of synthetic air before breath analysis was tested. After exposure to the disinfectant, the 3-pentanone monomer, dimer and trimer, hexanal, 2-propanamine, 1-propanol, benzene and nonanal showed significantly higher intensities in exhaled breath and in the air of the examination room compared to the corresponding baseline measurements. Only one ingredient of the disinfectant (1-propanol) was among these eight analytes. Prolonging the interval between exposure and breath analysis led to a decrease of their intensities, although the half-life of the decrease differed between analytes. Inhaling synthetic air reduced the exogenous and also relevant endogenous analytes more effectively than airing the examination room with fresh air, lowering and even reversing the polarity of the alveolar gradient. Interpreting exhaled breath therefore requires further knowledge about where the proband previously stayed and about the likelihood and relevance of the inhalation of local, site-specific and confounding exogenous analytes. Such inhalation facilitates their transfer to the examination room and the detection of high concentrations in room air and exhaled breath, but also the exhalation of new analytes. This may lead to a misinterpretation of these analytes as endogenous or disease-specific ones.
Standardisation of breath sampling is important for the application of breath analysis in clinical settings. By studying the effect of room airing on indoor and breath analytes and by generating time series of room air with different sampling intervals, we sought to gain further insights into room air metabolism, to determine the relevance of exogenous VOCs and to draw conclusions about their consideration in the interpretation of exhaled breath. Room air and exhaled breath of a healthy subject were analysed before and after room airing. Furthermore, a time series of room air with doors and windows closed was taken over 84 h, with automatic sampling every 180 min. A second time series studied room air analytes over 70 h, with samples taken every 16.5 min. For breath and room air measurements, an IMS coupled to a multi-capillary column (IMS/MCC) [Bio-Scout® - B&S Analytik GmbH, Dortmund, Germany] was used. The peaks were characterised using the software Visual Now (B&S Analytik, Dortmund, Germany) and identified using the software package MIMA (version 1.1, provided by the Max Planck Institute for Informatics, Saarbrücken, Germany) and the database 20160426_SubstanzDbNIST_122 (B&S Analytik GmbH, Dortmund, Germany). In the morning, four analytes (decamethylcyclopentasiloxane [541-02-6]; pentan-2-one [107-87-9] dimer; hexan-1-al [66-25-1]; pentan-2-one [107-87-9] monomer) showed high intensities in the room air and exhaled breath. They were significantly, but not equally, reduced by room airing. The 84 h time series showed a time-dependent decrease of some analytes (limonene monomer and dimer; decamethylcyclopentasiloxane; butan-1-ol) as well as an increase of others (pentan-2-one [107-87-9] dimer). Shorter sampling intervals revealed circadian variations in the concentrations of many analytes. Breath sampling in the morning requires room airing beforehand; the variation of the intensity of indoor analytes can then be kept small.
The time series of indoor analytes show that their intensities behave differently, with time-dependent declines, constant increases and circadian variations, depending on room airing. This has implications for the breath sampling procedure and the interpretation of exhaled breath.
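The detection of circadian variations depends on the sampling interval, as the two time series (180 min vs. 16.5 min) illustrate. As a minimal sketch, not the study's analysis pipeline, a dominant periodicity in an evenly sampled analyte-intensity series can be estimated with a discrete Fourier transform; all signal values below are synthetic and purely illustrative.

```python
import numpy as np

def dominant_period_hours(intensities, interval_minutes):
    """Estimate the dominant period (hours) of an evenly sampled series."""
    x = np.asarray(intensities, dtype=float)
    x = x - x.mean()                                  # drop the constant offset
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(x.size, d=interval_minutes * 60.0)  # Hz
    k = np.argmax(spectrum[1:]) + 1                   # skip the zero-frequency bin
    return 1.0 / freqs[k] / 3600.0                    # period in hours

# Synthetic 70 h series sampled every 16.5 min containing a 24 h circadian cycle
interval = 16.5                                       # minutes
t = np.arange(0, 70 * 60, interval)                   # time axis in minutes
rng = np.random.default_rng(0)
signal = 1.0 + 0.3 * np.sin(2 * np.pi * t / (24 * 60)) + 0.05 * rng.normal(size=t.size)
period = dominant_period_hours(signal, interval)      # close to 24 h
```

With the coarser 180 min interval only about 23 samples per day remain, so the frequency resolution over a 70 h window is much poorer and weak circadian components are easily missed.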
This study describes a non-contact measuring and system identification procedure for evaluating the inhomogeneous stiffness and damping characteristics of the annular ligament in the physiological amplitude and frequency range without the application of large static external forces, which can cause unnatural displacements of the stapes. To verify the procedure, measurements were first conducted on a steel beam. Then, measurements on an individual human cadaveric temporal bone sample were performed. The estimated results support the inhomogeneous stiffness and damping distribution of the annular ligament and are in good agreement with the multiphoton microscopy results, which show that the posterior-inferior corner of the stapes footplate is the stiffest region of the annular ligament.
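The basic idea of identifying stiffness and damping from measured motion can be illustrated by a minimal sketch, assuming a single-degree-of-freedom model with known mass and synthetic data. This is not the authors' non-contact procedure, and every numerical value below (mass, stiffness, damping, drive frequency) is hypothetical.

```python
import numpy as np

# Minimal illustration: identify stiffness k and damping c of a
# single-degree-of-freedom model  F = m*x'' + c*x' + k*x  by linear
# least squares, given sampled displacement, velocity, acceleration
# and force. All parameter values are purely illustrative.
rng = np.random.default_rng(1)
m_true, c_true, k_true = 2e-6, 0.02, 1.2e3    # kg, N*s/m, N/m (hypothetical)

t = np.linspace(0, 0.05, 2000)                # 50 ms record, 40 kHz sampling
omega = 2 * np.pi * 800.0                     # 800 Hz harmonic drive (hypothetical)
x = 1e-6 * np.sin(omega * t)                  # displacement (m)
v = 1e-6 * omega * np.cos(omega * t)          # velocity (m/s)
a = -1e-6 * omega**2 * np.sin(omega * t)      # acceleration (m/s^2)
F = m_true * a + c_true * v + k_true * x      # force that produced this motion
F = F + rng.normal(scale=1e-5, size=F.size)   # measurement noise

# Regression: F - m*a = [v  x] @ [c, k]^T  (mass assumed known)
A = np.column_stack([v, x])
c_est, k_est = np.linalg.lstsq(A, F - m_true * a, rcond=None)[0]
```

Repeating such an estimate at several excitation points over the footplate would yield a spatial map of stiffness and damping, which is the kind of inhomogeneous distribution the study reports.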
After the initiator of the ESB Logistics Learning Factory, Prof. Vera Hummel, had gained experience in developing and implementing a concept for a Learning Factory for Advanced Industrial Engineering (aIE) at the Institute IFF of the University of Stuttgart between 2005 and 2008, she was appointed full professor at ESB Business School, a faculty of Reutlingen University, in March 2010. Lacking a realistic, hands-on learning and teaching environment of industrial scale for its industrial engineering students, the school drafted first ideas in 2012 for a Learning Factory that would focus strongly on all aspects of production logistics. Already back then, a close integration of the virtual and the physical factory was desired: while the Learning Factory itself would be physical, neighboring partners along the supply chain, such as suppliers or distribution warehouses, could be added in a fully virtual way. As the implementation of the ESB Logistics Learning Factory was considered a strategic initiative of the university, initial funding was provided by the faculty, ESB Business School, itself. Following its own creed of providing future-oriented training for the region, primarily local suppliers and manufacturers were selected as equipment providers for the new Learning Factory. During the initialization phase in 2014, a total of three researchers and nine students worked approximately four months to set up a first assembly line, storage racks, AGVs and pick-by-light systems in conjunction with the underlying didactical concept. Since then, several hundred students have participated in trainings and lectures held in the ESB Logistics Learning Factory, several research projects have been carried out, and multiple high-level politicians and industry executives have toured the shop floor.
In addition, more than EUR 2 million in research and infrastructure funds has been secured for expansion and upgrades, allowing the ESB Logistics Learning Factory today to represent many core aspects of an Industrie 4.0 production environment.
Recently, practitioners have begun appraising an effective customer journey design (CJD) as an important source of customer value in increasingly complex and digitalized consumer markets. Research, however, has neither investigated what constitutes the effectiveness of CJD from a consumer perspective nor empirically tested how it affects important variables of consumer behavior. The authors define an effective CJD as the extent to which consumers perceive multiple brand-owned touchpoints as designed in a thematically cohesive, consistent, and context-sensitive way. Analyzing consumer data from studies in two countries (4814 consumers in total), they provide evidence of the positive influence of an effective CJD on customer loyalty through brand attitude — over and above the effects of brand experience. Importantly, an effective CJD more strongly influences utilitarian brand attitudes, while brand experience more strongly affects hedonic brand attitudes. These underlying mechanisms are also prevalent when testing for the contingency factors services versus goods, perceived switching costs, and brand involvement.
Context: Organizations are increasingly challenged by high market dynamics, rapidly evolving technologies and shifting user expectations. In consequence, many organizations are struggling with their ability to provide reliable product roadmaps by applying traditional roadmapping approaches. Currently, many companies are seeking opportunities to improve their product roadmapping practices and strive for new roadmapping approaches. A typical first step towards advancing the roadmapping capabilities of an organization is to assess the current situation. Therefore, the so-called maturity model DEEP for assessing the product roadmapping capabilities of companies operating in dynamic and uncertain environments has been developed and published by the authors.
Objective: The aim of this article is to conduct an initial validation of the DEEP model in order to understand its applicability better and to see if important concepts are missing. In addition, the aim of this article is to evolve the model based on the findings from the initial validation.
Method: The model was given to practitioners such as product managers, who were asked to perform a self-assessment of the current product roadmapping practices in their company. Afterwards, interviews were conducted with each participant in order to gain deeper insights.
Results: The initial validation revealed that some stages of the model need to be rearranged, and minor usability issues were found. The overall structure of the model was well received. The study resulted in version 1.1 of the DEEP product roadmap maturity model, which is also presented in this article.
Context: Organizations are increasingly challenged by dynamic and technical market environments. Traditional product roadmapping practices, such as detailed and fixed long-term planning, typically fail in such environments. Therefore, companies are actively seeking ways to improve their product roadmapping approach. Goal: This paper aims at identifying problems and challenges with respect to product roadmapping. In addition, it aims at understanding how companies succeed in improving their roadmapping practices in their respective company contexts. The study focuses on mid-sized and large companies developing software-intensive products in dynamic and technical market environments. Method: We conducted semi-structured expert interviews with 15 experts from 13 German companies and performed a thematic data analysis. Results: The analysis showed that a significant number of companies are still struggling with traditional feature-based product roadmapping and opinion-based prioritization of features. The most promising areas for improvement are stating the outcomes a company is trying to achieve and making them part of the roadmap, sharing or co-developing the roadmap with stakeholders, and establishing discovery activities.
Efficient and robust 3D object reconstruction based on monocular SLAM and CNN semantic segmentation
(2019)
Various applications implement SLAM technology, especially in the field of robot navigation. We show the advantage of SLAM technology for independent 3D object reconstruction. To obtain a point cloud of every object of interest, free of its environment, we leverage deep learning. We utilize recent CNN deep learning research for accurate semantic segmentation of objects. In this work, we propose two fusion methods combining CNN-based semantic segmentation and SLAM for the 3D reconstruction of objects of interest in order to improve robustness and efficiency. As a major novelty, we introduce a CNN-based masking that focuses SLAM only on feature points belonging to each single object. Noisy, complex or even non-rigid features in the background are filtered out, improving the estimation of the camera pose and the 3D point cloud of each object. Our experiments are constrained to the reconstruction of industrial objects. We present an analysis of the accuracy and performance of each method and compare the two methods, describing their pros and cons.
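The core of the masking idea, restricting the SLAM front end to feature points that lie on the segmented object, can be sketched in a few lines. This is an illustrative toy, not the paper's pipeline: the keypoint coordinates and the segmentation mask below are made up.

```python
import numpy as np

def filter_keypoints(keypoints, mask):
    """Keep only feature points that fall on the segmented object.

    keypoints: (N, 2) array of (x, y) pixel coordinates from a SLAM front end.
    mask:      (H, W) boolean array from a CNN semantic segmentation,
               True where the object of interest is.
    """
    pts = np.asarray(keypoints)
    h, w = mask.shape
    x = pts[:, 0].astype(int)
    y = pts[:, 1].astype(int)
    inside = (x >= 0) & (x < w) & (y >= 0) & (y < h)   # drop out-of-image points
    keep = np.zeros(len(pts), dtype=bool)
    keep[inside] = mask[y[inside], x[inside]]          # look up mask at each point
    return pts[keep]

# Toy example: a 100x100 image whose object occupies the left half
mask = np.zeros((100, 100), dtype=bool)
mask[:, :50] = True
kps = np.array([[10, 20], [70, 20], [49, 99], [120, 5]])
filtered = filter_keypoints(kps, mask)   # keeps [10, 20] and [49, 99]
```

Pose estimation and triangulation then run on the filtered points only, so background clutter and non-rigid motion cannot contaminate the per-object point cloud.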
Presently, many companies are transforming their strategy and product base, as well as their culture, processes and information systems, to become more digital or to strive for digital leadership. In recent years, new business opportunities have appeared that use the potential of the Internet and related digital technologies, such as the Internet of Things, services computing, cloud computing, edge and fog computing, social networks, big data with analytics, mobile systems, collaboration networks, and cyber-physical systems. Digitization fosters the development of IT environments with many rather small and distributed structures, such as the Internet of Things, microservices, or other micro-granular elements. This has a strong impact on architecting digital services and products. The change from a closed-world modeling perspective to a more flexible open-world composition and evolution of micro-granular system architectures defines the moving context for adaptable systems. We focus on a continuous bottom-up integration of micro-granular architectures for a huge number of dynamically growing systems and services, as part of a new digital enterprise architecture for service-dominant digital products.
The investigation of stress requires distinguishing between stress caused by physical activity and stress caused by psychosocial factors. The behaviour of the heart in response to stress and to physical activity is very similar when the set of monitored parameters is reduced to a single one. Currently, this differentiation remains difficult, and methods that rely on heart rate alone cannot differentiate between stress and physical activity without additional sensor input. Our approach focuses on methods that generate signals providing characteristics useful for detecting stress, physical activity, inactivity and relaxation.
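The point that a single heart parameter is ambiguous can be illustrated with a minimal sketch, not the authors' method: from a series of RR intervals one can derive both the mean heart rate and RMSSD, a common short-term heart-rate-variability measure, giving two characteristics instead of one. The interval values below are fabricated for illustration.

```python
import numpy as np

def heart_rate_bpm(rr):
    """Mean heart rate in beats per minute from RR intervals in seconds."""
    return 60.0 / np.mean(rr)

def rmssd_ms(rr):
    """RMSSD: root mean square of successive RR differences, in milliseconds."""
    diffs = np.diff(np.asarray(rr) * 1000.0)   # successive differences in ms
    return float(np.sqrt(np.mean(diffs ** 2)))

rr_rest = [0.85, 0.88, 0.82, 0.90, 0.86]   # varied intervals: slow beat, high HRV
rr_load = [0.50, 0.51, 0.50, 0.49, 0.50]   # regular intervals: fast beat, low HRV
```

Heart rate alone rises under both psychosocial stress and exercise; adding derived signals such as RMSSD (and, in practice, further features) is what makes a separation of stress, physical activity, inactivity and relaxation feasible.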