Context: Organizations are increasingly challenged by dynamic and technical market environments. Traditional product roadmapping practices such as detailed and fixed long-term planning typically fail in such environments. Therefore, companies are actively seeking ways to improve their product roadmapping approach. Goal: This paper aims at identifying problems and challenges with respect to product roadmapping. In addition, it aims at understanding how companies succeed in improving their roadmapping practices in their respective company contexts. The study focuses on mid-sized and large companies developing software-intensive products in dynamic and technical market environments. Method: We conducted semi-structured expert interviews with 15 experts from 13 German companies and performed a thematic data analysis. Results: The analysis showed that a significant number of companies are still struggling with traditional feature-based product roadmapping and opinion-based prioritization of features. The most promising areas for improvement are stating the outcomes a company is trying to achieve and making them part of the roadmap, sharing or co-developing the roadmap with stakeholders, and establishing discovery activities.
Among the multitude of software development processes available, hardly any is used by the book. Regardless of company size or industry sector, a majority of project teams and companies use customized processes that combine different development methods, so-called hybrid development methods. Even though such hybrid development methods are highly individualized, a common understanding of how to systematically construct synergetic practices is missing. In this paper, we make a first step towards devising such guidelines. Grounded in 1,467 data points from a large-scale online survey among practitioners, we study the current state of practice in process use to answer the question: What are hybrid development methods made of? Our findings reveal that only eight methods and few practices build the core of modern software development. This small set allows for statistically constructing hybrid development methods. Using an 85% agreement level in the participants’ selections, we provide two examples illustrating how hybrid development methods are characterized by the practices they are made of. Our evidence-based analysis approach lays the foundation for devising hybrid development methods.
Water jacket systems are routinely used to control the temperature of Petri dish cell culture chambers. Despite their widespread use, the thermal characteristics of such systems have not been fully investigated. In this study, we conducted a comprehensive set of theoretical, numerical and experimental analyses to investigate the thermal characteristics of Petri dish chambers under stable and transient conditions. In particular, we investigated the temperature gradient along the radial axis of the Petri dish under stable conditions, and the transition period under transient conditions. Our studies indicate a radial temperature gradient of 3.3 °C along with a transition period of 27.5 min when increasing the sample temperature from 37 to 45 °C for a standard 35 mm diameter Petri dish. We characterized the temperature gradient and transition period under various operational, geometric, and environmental conditions. Under stable conditions, reducing the diameter of the Petri dish and incorporating a heater underneath the Petri dish can effectively reduce the temperature gradient across the sample. In comparison, under transient conditions, reducing the diameter of the Petri dish, reducing sample volume, and using glass Petri dish chambers can reduce the transition period.
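The 27.5 min transition period reported above is consistent with simple first-order thermal behaviour. As a rough illustration only (this is not the authors' model, and the thermal time constant used here is entirely hypothetical), a lumped-capacitance sketch:

```python
import math

def transient_temperature(t_s, t0, t_target, tau):
    """First-order (lumped-capacitance) step response: sample temperature
    at time t_s when the setpoint steps from t0 to t_target, with thermal
    time constant tau (same time units as t_s)."""
    return t_target - (t_target - t0) * math.exp(-t_s / tau)

def transition_period(t0, t_target, tau, tolerance=0.5):
    """Time until the sample is within `tolerance` degrees of the target."""
    return tau * math.log(abs(t_target - t0) / tolerance)

# Hypothetical time constant of ~10 min for a 37 -> 45 degC step.
tau = 10.0  # minutes (assumed, not measured)
period = transition_period(37.0, 45.0, tau)
temp = transient_temperature(period, 37.0, 45.0, tau)
assert 25 < period < 30                    # same order as the reported 27.5 min
assert abs(45.0 - temp) <= 0.5 + 1e-9      # within tolerance at that time
```

With the assumed 10 min time constant, the 37 to 45 °C step settles to within 0.5 °C in roughly 28 min, the same order of magnitude as the transition period reported in the abstract.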
In this paper we describe an interactive web-based visual analysis tool for Formula One races. It first provides an overview of all races on a yearly basis in a calendar-like representation. From this starting point, races can be selected and visually inspected in detail. We support a dynamic race position diagram as well as a more detailed lap times line plot for showing the drivers’ lap times in comparison. Many interaction techniques are supported, such as selection, filtering, highlighting, color coding, and details-on-demand. We illustrate the usefulness of our visualization tool by applying it to a Formula One dataset, describing the different dynamic visual racing patterns for a number of selected races and drivers.
Private equity (PE) firms are investment firms that acquire equity shares in companies. The goal of PE firms is to exit the investment after a few years with a substantial increase in value. PE firms often claim to outperform the market, i.e., to create alpha.
The overall aim of this paper is to unravel the mystery of value creation in the PE industry. First, the author presents a conceptual framework for value creation in the PE industry based on a multiple valuation model that breaks down value creation into different elements. Second, the paper evaluates whether PE firms really create value by analysing and combining results from prior empirical studies based on the conceptual framework.
The results show that existing empirical evidence is mixed but that there is indeed a tendency toward positive evidence that PE firms create economic value on average. However, there are methodological difficulties in measuring value creation, and studies are often subject to bias. Finally, it is pointed out that the question of whether PE firms really create value has to be viewed from different perspectives, such as those of the PE firm, the investors and the portfolio companies.
The desire to combine advanced user-friendly interfaces with a product personality communicating environmental friendliness to customers poses new challenges for car interior designers, as little research has been carried out in this field to date. In this paper, the creation of three personas aimed at defining key German car users with pro-environmental behaviour is presented. After collecting ethnographic data on potential drivers through a literature review, information about generation and Euro car segment led to the definition of three key user groups. The resulting personas were applied to determine the most important interaction points in the car interior. Finally, present design cues of eco-friendly product personality developed in the field of automotive design were explored. Our work presents three strategic directions for the design development of future in-car user interfaces: a) foster multimodal mobility; b) emphasize the interlinkage between the economy and sustainable driving; and c) highlight new technological developments. The presented results are meant as an impulse for developers to meet the needs of green customers and drivers when designing user-friendly HMI components.
Digital light microscopy techniques are among the most widely used methods in cell biology and medical research. Despite that, the automated classification of objects such as cells or specific parts of tissues in images is difficult. We present an approach to classify confluent cell layers in microscopy images by learned deep correlation features using deep neural networks. These deep correlation features are generated through the use of Gram-based correlation features and are input to a neural network for learning the correlation between them. In this work we investigated whether such a representation of cell data is suitable for classification, as has been done for artworks with respect to their artistic period. The method generates images that contain recognizable characteristics of a specific cell type, for example, the average size and the ordered pattern.
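In their simplest form, Gram-based correlation features reduce to pairwise inner products between channel activations of a convolutional layer. A minimal sketch, using toy activations in place of a real CNN layer (the channel values below are illustrative placeholders, not the authors' data):

```python
def gram_matrix(feature_maps):
    """Compute the Gram (correlation) matrix of channel activations.

    feature_maps: list of channels, each a flat list of activation values
    (e.g. one convolutional layer's response to a microscopy image).
    Returns a channels x channels matrix of pairwise inner products,
    normalized by the number of spatial positions.
    """
    n = len(feature_maps[0])
    return [[sum(a * b for a, b in zip(fi, fj)) / n for fj in feature_maps]
            for fi in feature_maps]

# Toy example: three hypothetical channels over four spatial positions.
maps = [[1.0, 0.0, 1.0, 0.0],
        [0.0, 1.0, 0.0, 1.0],
        [1.0, 1.0, 1.0, 1.0]]
g = gram_matrix(maps)
assert len(g) == 3 and len(g[0]) == 3
assert g[0][1] == 0.0              # orthogonal channels are uncorrelated
assert g[0][2] == g[2][0] == 0.5   # the matrix is symmetric
```

The resulting matrix discards spatial layout and keeps only how channels co-vary, which is what makes it usable as a texture-like descriptor for classification.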
Software evolvability is an important quality attribute, yet one difficult to grasp. A certain base level of it is allegedly provided by service- and microservice-based systems, but many software professionals lack a systematic understanding of the reasons and preconditions for this. We address this issue via the proxy of architectural modifiability tactics. By qualitatively mapping principles and patterns of Service-Oriented Architecture (SOA) and microservices onto tactics and analyzing the results, we can not only generate insights into service-oriented evolution qualities, but also provide a modifiability comparison of the two popular service-based architectural styles. The results suggest that both SOA and microservices possess several inherent qualities beneficial for software evolution. While both focus strongly on loose coupling and encapsulation, there are also differences in the way they strive for modifiability (e.g. governance vs. evolutionary design). To leverage the insights of this research, however, it is necessary to find practical ways to incorporate the results as guidance into the software development process.
The coupling of the heat and power sectors is required as supply and demand in the German electricity mix drift further and further apart with a high percentage of renewable energy. Heat pumps in combination with thermal energy storage systems can be a useful way to couple the heat and power sectors. This paper presents a hardware-in-the-loop test bench for the experimental investigation of optimized control strategies for heat pumps. 24-hour experiments are carried out to test whether the heat pump is able to serve optimized schedules generated by a MATLAB algorithm. The results show that the heat pump is capable of following the generated schedules, and the maximum deviation of the operational time between schedule and experiment is only 3%. Additionally, the system can serve the demand for space heating and domestic hot water (DHW) at any time.
A large body of literature is concerned with models of presence, the sensory illusion of being part of a virtual scene, but there is still no general agreement on how to measure it objectively and reliably. For the presented study, we applied contemporary theory to measure presence in virtual reality. Thirty-seven participants explored an existing commercial game in order to complete a collection task. Two startle events were naturally embedded in the game progression to evoke physical reactions, and head tracking data was collected in response to these events. Subjective presence was recorded using a post-study questionnaire and real-time assessments. Our novel implementation of behavioral measures led to insights which could inform future presence research: We propose a measure in which startle reflexes are evoked through specific events in the virtual environment, and head tracking data is compared to the range and speed of baseline interactions.
Continuous refactoring is necessary to maintain source code quality and to cope with technical debt. Since manual refactoring is inefficient and error-prone, various solutions for automated refactoring have been proposed in the past. However, empirical studies have shown that these solutions are not widely accepted by software developers, and most refactorings are still performed manually. For example, developers reported that refactoring tools should support functionality for reviewing changes. They also criticized that introducing such tools would require substantial effort for configuration and integration into the current development environment.
In this paper, we present our work towards the Refactoring-Bot, an autonomous bot that integrates into the team like a human developer via the existing version control platform. The bot automatically performs refactorings to resolve code smells and presents the changes to a developer for asynchronous review via pull requests. This way, developers are not interrupted in their workflow and can review the changes at any time with familiar tools. Proposed refactorings can then be integrated into the code base via the push of a button. We elaborate on our vision, discuss design decisions, describe the current state of development, and give an outlook on planned development and research activities.
While the concepts of object-oriented antipatterns and code smells are prevalent in scientific literature and have been popularized by tools like SonarQube, the research field for service-based antipatterns and bad smells is not as cohesive and organized. The description of these antipatterns is distributed across several publications with no holistic schema or taxonomy. Furthermore, there is currently little synergy between documented antipatterns for the architectural styles SOA and Microservices, even though several antipatterns may hold value for both. We therefore conducted a Systematic Literature Review (SLR) that identified 14 primary studies. 36 service-based antipatterns were extracted from these studies and documented with a holistic data model. We also categorized the antipatterns with a taxonomy and implemented relationships between them. Lastly, we developed a web application for convenient browsing and implemented a GitHub-based repository and workflow for the collaborative evolution of the collection. Researchers and practitioners can use the repository as a reference, for training and education, or for quality assurance.
Many start-ups are in search of cooperation partners to develop their innovative business models. In response, incumbent firms are introducing increasingly more cooperation systems to engage with start-ups. However, many of these cooperations end in failure. Although qualitative studies on cooperation models have tried to improve the effectiveness of incumbent start-up strategies, only a few have empirically examined start-up cooperation behavior. Considering the lack of adequate measurement models in current research, this paper focuses on developing a multi-item scale on cooperation behavior of start-ups, drawing from a series of qualitative and quantitative studies. The resultant scale contributes to recent research on start-up cooperation and provides a framework to add an empirical perspective to current research.
Standardisation of breath sampling is important for the application of breath analysis in clinical settings. By studying the effect of room airing on indoor and breath analytes and by generating time series of room air with different sampling intervals, we sought to gain further insights into room air metabolism, to detect the relevance of exogenous VOCs and to draw conclusions about their consideration in the interpretation of exhaled breath. Room air and exhaled breath of a healthy subject were analysed before and after room airing. Furthermore, a time series of room air with doors and windows closed was taken over 84 h by automatic sampling every 180 min. A second time series studied room air analytes over 70 h with samples taken every 16.5 min. For breath and room air measurements, an ion mobility spectrometer coupled to a multi-capillary column (IMS/MCC) [Bio-Scout® - B&S Analytik GmbH, Dortmund, Germany] was used. The peaks were characterized using the software Visual Now (B&S Analytik, Dortmund, Germany) and identified using the software package MIMA (version 1.1, provided by the Max Planck Institute for Informatics, Saarbrücken, Germany) and the database 20160426_SubstanzDbNIST_122 (B&S Analytik GmbH, Dortmund, Germany). In the morning, 4 analytes (Decamethylcyclopentasiloxane [541-02-6]; Pentan-2-one [107-87-9] – Dimer; Hexan-1-al [66-25-1]; Pentan-2-one [107-87-9] – Monomer) showed high intensities in the room air and exhaled breath. They were significantly but not equally reduced by room airing. The time series over 84 h showed a time-dependent decrease of analytes (limonene monomer and dimer; Decamethylcyclopentasiloxane, Butan-1-ol) as well as an increase (Pentan-2-one [107-87-9] – Dimer). Shorter sampling intervals exhibited circadian variations of concentrations for many analytes. Breath sampling in the morning requires room airing before starting. Then the variation of the intensity of indoor analytes can be kept small.
The time series of indoor analytes show that their intensities behave differently, with time-dependent declines, constant increases and circadian variations, depending on room airing. This has implications for the breath sampling procedure and the interpretation of exhaled breath.
Due to the rising need for palliative care in Russia, it is crucial to provide timely and high-quality solutions for patients, relatives, and caregivers. A methodology for remote monitoring of patients in need of palliative care will be developed, along with the requirements for a hardware-software complex for remote monitoring of patients' health at home.
The relevance of technology knowledge in digital transformation, especially in small and medium-sized enterprises (SMEs) that are still largely dependent on physical human capital, has become increasingly obvious. This is due to the rapid revolution in the business environment coupled with a growing number of real-world examples of firms disrupted by advances in technological knowledge. Consequently, we find it progressively vital for SMEs to spot and mitigate threats and to take advantage of opportunities arising from the dynamism of digital transformation.
Our study aims at exploring the relevance of technology knowledge in SMEs for digital transformation to uncover the opportunities, roadmaps, and models that SMEs can take advantage of in the digital transformation and gain a competitive edge.
We conclude that despite the relevance of technology knowledge for digital transformation, coupled with its low costs and accessibility, SMEs are yet to realize the full potential of technological knowledge. This is mainly because technologies appear, change and vanish so rapidly in the digital age that gaining a proper understanding without dedicated resources is extremely difficult for SMEs, making them less competitive than incumbent large firms in the market.
Purpose – The purpose of this paper is to examine the mediating effect of psychological contract breach on the relationship between job insecurity and counterproductive workplace behavior (CWB) and the moderating effect of employment status in this relationship.
Design/methodology/approach – Data were collected from 212 supervisor–subordinate dyads in a large Chinese state-owned air transportation group. AMOS 17.0 software was used to examine the hypothesized predictions and the theoretical model.
Findings – The results showed that psychological contract breach partially mediates the effect of job insecurity on CWB, including organizational counterproductive workplace behavior and interpersonal counterproductive workplace behavior. In addition, the relationships between job insecurity, psychological contract breach and CWB differ significantly between permanent workers and contract workers.
Originality/value – The present study provides a new insight into explaining the linkage between job insecurity and negative work behaviors as well as suggestions to managers on minimizing the harmful effects of job insecurity.
In standardized sectors such as the automotive industry, the cost-benefit ratio of automation solutions is high, as they contribute to increasing capacity, decreasing costs and improving product quality. In less standardized application fields, the contribution of automation to improvements in capacity, cost and quality blurs. The automation of complex and unstructured tasks requires sophisticated, expensive and low-performing systems, whose impact on product quality is oftentimes not directly perceived by customers. As a result, the full automation of process chains in the general manufacturing or logistics sectors is often a suboptimal solution. Moving away from the false idea that a process should be either fully automated or fully manual, this paper presents a novel heuristic method for the design of lean human-robot interaction, the Quality Interaction Function Deployment, with the objective of finding the “right level of automation”. Functions are divided among human and automated agents, and several automation scenarios are created and evaluated with respect to their compliance with the requirements of all process stakeholders. As a result, synergies among operators (manual tasks) and machines (automated tasks) are improved, thus reducing time losses and increasing productivity.
Context: Organizations are increasingly challenged by high market dynamics, rapidly evolving technologies and shifting user expectations. In consequence, many organizations are struggling with their ability to provide reliable product roadmaps by applying traditional roadmapping approaches. Currently, many companies are seeking opportunities to improve their product roadmapping practices and strive for new roadmapping approaches. A typical first step towards advancing the roadmapping capabilities of an organization is to assess the current situation. Therefore, the so-called maturity model DEEP for assessing the product roadmapping capabilities of companies operating in dynamic and uncertain environments has been developed and published by the authors.
Objective: The aim of this article is to conduct an initial validation of the DEEP model in order to understand its applicability better and to see if important concepts are missing. In addition, the aim of this article is to evolve the model based on the findings from the initial validation.
Method: The model was given to practitioners such as product managers with the request to perform a self-assessment of the current product roadmapping practices in their company. Afterwards, interviews with each participant were conducted in order to gain insights.
Results: The initial validation revealed that some of the stages of the model need to be rearranged, and minor usability issues were found. The overall structure of the model was well received. The study resulted in the development of version 1.1 of the DEEP product roadmap maturity model, which is also presented in this article.
The paper describes a new stimulus using learning factories and an academic research programme - an M.Sc. in Digital Industrial Management and Engineering (DIME) comprising a double degree - to enhance international collaboration between four partner universities. The programme will be structured in such a way as to maintain or improve the level of innovation at the learning factories of each partner. The partners agreed to use Learning Factory focus areas along with DIME learning modules to stimulate international collaboration. Furthermore, they identified several research areas within the framework of the DIME program to encourage horizontal and vertical collaboration. Vertical collaboration connects faculty expertise across the Learning Factory network to advance knowledge in one of the focus areas, while horizontal collaboration connects knowledge and expertise across multiple focus areas. Together they offer a platform for students to develop disciplinary and cross-disciplinary applied research skills necessary for addressing the complex challenges faced by industry. Hence, the university partners have the opportunity to develop the learning factory capabilities in alignment with the smart manufacturing concept. The learning factory is thus an important pillar in this venture. While postgraduate students/researchers in the DIME program are the enablers to ensure the success of entire projects, the learning factory provides a learning environment which is entirely conducive to fostering these successful collaborations. Ultimately, the partners are focussed on utilising smart technologies in line with the digitalization of the production process.
The digital age makes it possible to be globally networked at any time. Digital communication is therefore an important aspect of today’s world. Hence, the further development and expansion of this is becoming increasingly important. Even within a wireless system, copper channels are important as part of the overall network. Given the need to keep pushing at the current limitations, careful design of the cables in connection with an adapted coding of the bits is essential to transmit more and more data.
One of the most popular and widespread cabling technologies is symmetrical copper cabling [1, pp. 8-15]. It is also known as Twisted Pair and it is of immense importance for the cabling of communication networks.
At the time of writing this thesis, data rates of up to 10 GBit/s over a transmission distance of 100 m and 40 GBit/s over a transmission distance of 30 m are standardized for symmetrical copper cabling [2]. Other lengths are not standardized. Short lengths in particular are of great interest for copper cables, because copper cables are usually used for short distances, such as between computers and the campus network or within data centres.
This work has focused on the transmission of higher-order Pulse Amplitude Modulation and the associated transmission performance. The central research question is: “How well can we optimize the transmission technique in order to be able to maximise the data bandwidth over Ethernet cable and, given that remote powering is also a significant application of these cables, how much will the resulting heating affect this transmission and what can be done to mitigate that?”
To answer this question, the cable parameters are first examined. A series of spectral measurements, such as Insertion Loss, Return Loss, Near End Crosstalk and Far End Crosstalk, provide information about the electromagnetic interference and the influence of the ohmic resistance on the signal. Based on these findings, the first theoretical statements and calculations can be made. In the next step, data transmissions over different transmission lengths are realized. The examination of the eye diagrams of the different transmission approaches ultimately provides information about the signal quality of the transmissions. An overview of the maximum transmission rate depending on the transmission distance shows the potential for different applications.
Furthermore, the simultaneous transmission of energy and data is a significant advantage of copper. However, the resulting heat development has an influence on the data transmission. Therefore, the influence of the ambient temperature of cables is investigated in the last part and changes in the signal quality are clarified.
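The basic trade-off of the higher-order PAM studied in this thesis can be illustrated with a small sketch (not tied to the actual measurement setup): more amplitude levels carry more bits per symbol, but the spacing between adjacent levels, and hence the vertical eye opening, shrinks.

```python
def pam_levels(order):
    """Evenly spaced amplitude levels for PAM of the given order,
    normalized to [-1, 1] (e.g. PAM-4 -> -1, -1/3, 1/3, 1)."""
    return [2 * k / (order - 1) - 1 for k in range(order)]

def bits_per_symbol(order):
    """log2(order) bits are carried per PAM symbol (power-of-two orders)."""
    return order.bit_length() - 1

# PAM-4 doubles the bit rate of binary PAM-2 at the same symbol rate...
levels = pam_levels(4)
assert bits_per_symbol(4) == 2
assert max(abs(a - b) for a, b in zip(levels, [-1.0, -1/3, 1/3, 1.0])) < 1e-12

# ...but the spacing between adjacent levels shrinks with the order,
# so the eye diagram closes faster under noise and crosstalk.
eye_pam2 = 2 / (2 - 1)   # full swing between the two PAM-2 levels
eye_pam4 = 2 / (4 - 1)   # one third of the swing between adjacent PAM-4 levels
assert eye_pam4 < eye_pam2
```

This level-spacing penalty is why the eye diagrams examined in the thesis become the decisive signal-quality criterion when pushing to higher modulation orders over longer or warmer cables.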
The paper studies the deciding parameters that influence business students' selection of internships in Germany. The findings are based on literature research and a survey amongst students and company representatives, which asked respondents to rate the importance of 24 different aspects of internships. The benefits and negative impacts of internships on students, companies and universities are discussed in detail. The results of different demographic groups are compared.
In two research studies, the influence of technologies on sleep was analyzed. The first one concerns the effect of light on the circadian rhythm and, as a consequence, on the sleep quality of persons in a vegetative state. The second one, which is still running, surveys the influence of several technical tools on the sleep of elderly people living in a nursing home.
With the capability of employing virtually unlimited compute resources, the cloud evolved into an attractive execution environment for applications from the High Performance Computing (HPC) domain. By means of elastic scaling, compute resources can be provisioned and decommissioned at runtime. This gives rise to a new concept in HPC: Elasticity of parallel computations. However, it is still an open research question to which extent HPC applications can benefit from elastic scaling and how to leverage elasticity of parallel computations. In this paper, we discuss how to address these challenges for HPC applications with dynamic task parallelism and present TASKWORK, a cloud-aware runtime system based on our findings. TASKWORK enables the implementation of elastic HPC applications by means of higher-level development frameworks and solves corresponding coordination problems based on Apache ZooKeeper. For evaluation purposes, we discuss a development framework for parallel branch-and-bound based on TASKWORK, show how to implement an elastic HPC application, and report on measurements with respect to parallel efficiency and elastic scaling.
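The dynamic task parallelism underlying this approach can be illustrated with a minimal work-queue sketch. This is a local, single-process stand-in: the distributed coordination that TASKWORK delegates to Apache ZooKeeper is replaced here by a thread-safe in-memory queue, and the workload is hypothetical.

```python
import queue
import threading

def run_elastic(tasks, n_workers):
    """Minimal sketch of dynamic task parallelism: workers pull tasks
    from a shared queue, so the number of workers can be chosen (or, in
    a real elastic system, changed at runtime) without repartitioning
    the workload among them.
    """
    q = queue.Queue()
    for t in tasks:
        q.put(t)
    results = []
    lock = threading.Lock()

    def worker():
        while True:
            try:
                t = q.get_nowait()
            except queue.Empty:
                return                    # no work left: worker retires
            r = t()                       # execute the task
            with lock:
                results.append(r)

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return results

# Hypothetical workload: square ten numbers using four workers.
tasks = [lambda i=i: i * i for i in range(10)]
results = run_elastic(tasks, 4)
assert sorted(results) == [i * i for i in range(10)]
```

Because no task is bound to a particular worker, the same queue-based decoupling is what makes elastic scaling possible: adding or removing workers only changes how fast the queue drains, not the correctness of the result.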
This paper summarises the experiences with sustainability reporting in a very broad sense at Universities of Applied Sciences (UoAS). It focuses on the communication of sustainability aspects and activities of universities. It provides a recommendation, a model for communicating the sustainability activities of universities, and emphasises the value of this approach. This paper aims to find the most effective ways to convey education for sustainable development to a broad public and to initiate communication about sustainability aspects with society.
The paper is based on action research done at two universities about the ways in which academic institutions can communicate with their stakeholders in order to report about their own role as a responsible university and also to make an impact on the sustainable development on a local and global scale.
Research is focussed on experiences at Universities of Applied Sciences with their strong focus on applied research, education and transfer. However, these results can be helpful for any academic institution that wants to make a positive impact on society. The concept which we present focusses on the possible impact which universities can generate.
As a contribution to the research field of sustainability reporting, the paper points out that a continuous qualitative reporting process with a focus on education for SD is an adequate and efficient approach to sustainability reporting for universities and an effective way to reach a broad public.
We show that there are several efficient methods of communication, ranging from the traditional sustainability report to publications which address the public and to more innovative methods using Web 2.0. We show and argue that for universities, alternative ways of sustainability communication may be more effective in achieving the sustainability mission.
The concept which we present gives the universities a broader impact on society and helps them to support sustainable development in an efficient way.
In vitro, hydrogel-based ECMs for functionalizing surfaces of various materials have played an essential role in mimicking the native tissue matrix. Polydimethylsiloxane (PDMS) is widely used to build microfluidic or organ-on-chip devices compatible with cells due to its easy handling in cast replication. Despite such advantages, the limitation of PDMS is its hydrophobic surface property. To improve the wettability of PDMS-based devices, alginate, a naturally derived polysaccharide, was covalently bound to the PDMS surface. This alginate then crosslinked further hydrogel onto the PDMS surface in the desired layer thickness. Hydrogel-modified PDMS was used for coating a topography chip system and for in vitro investigation of cell growth on the surfaces. Moreover, such hydrophilic hydrogel-coated PDMS was utilized in a microfluidic device to prevent unspecific absorption of organic solutions. Hence, in both exemplary studies, PDMS surface properties were modified, leading to improved devices.
The purpose of this paper is to assess whether the strategy development of the fashion industry is oriented to the long or short term. Following the theory of dynamic capabilities, this paper argues that a long-term strategic orientation can be observed in corporate foresight activities. A multi-methodological research approach is chosen to answer the research question. The findings suggest that the fashion industry is lagging behind other industries in terms of future orientation and therefore long-term strategy development, even though the challenges in the business environment are not perceived as less relevant.
Small and Medium Enterprises (SMEs), which play a substantial role in the development of any economy, have been on the rise in recent years. At the same time, these enterprises face a myriad of challenges which could potentially be solved through the adoption of technology. Nonetheless, it has been observed that new technological uptake among SMEs remains limited, with the majority of them opting to maintain the status quo with regard to technology awareness and innovation strategies.
In a literature review, this paper explores three major dynamics curtailing the adoption of new technologies by SMEs in the manufacturing sector: knowledge absorptive capacity and management factors, organisational structures, and technological awareness. Firstly, with regard to knowledge absorptive capacity and management factors, this study shows how these factors drive innovation potentials in SMEs.
Secondly, with regard to technological awareness factors, this study documents how perceived usefulness, costs, network and infrastructure, education and skills, training and attitude, as well as knowledge influence the adoption of new technologies among SMEs worldwide. Lastly, the study concludes by analysing how organisational structures drive the innovation potentials of SMEs in the wake of swift and profound technological changes in the market.
Software process improvement (SPI) has been around for decades, but it remains a critically discussed topic. In several waves, different aspects of SPI have been discussed in the past, e.g., large-scale company-level SPI programs, maturity models, success factors, and in-project SPI. It is hard to find new streams or a consensus in the community, but there is a trend coming along with agile and lean software development. Apparently, practitioners reject extensive and prescriptive maturity models and move towards smaller, faster, and continuous project-integrated SPI. Based on data from two survey studies conducted in Germany (2012) and Europe (2016), we analyze the process customization for projects and the practices for implementing SPI in the participating companies. Our findings indicate that, even in regulated industry sectors, companies increasingly adopt in-project SPI activities, primarily with the goal to continuously optimize specific processes. Therefore, with this paper, we want to stimulate a discussion on how to evolve traditional SPI towards a continuous learning environment.
A distinctive highlight of the dissertation at hand is the investigation of multiple apparel supply chain actors incorporating the views of a global apparel retailer in Europe and multiple suppliers in Vietnam and Indonesia.
More specifically, the dissertation presents a coherent investigation starting with the depiction of a conceptual framework for social management strategies as a means for social risk management (SRM), aimed exclusively at the apparel industry. In accordance with the identified research gaps and suggested research directions from the conceptual framework, the role of the apparel sourcing agent for social management strategies was analysed by conducting a multiple case study approach with evidence from Vietnam and Europe, ultimately suggesting ten propositions. A further multiple case study data collection in Vietnam, Indonesia and Europe allowed for the investigation of buyer-supplier relationships with regard to social compliance strategies, using core tenets of agency theory to interpret the findings and outline ten further propositions. Based on the development of a conceptual framework on social SSCM in the apparel industry, the formulation of 20 related propositions with evidence from crucial developing (apparel sourcing) countries, and the application of agency theory, which has been declared a shortfall in this context, this thesis provides further grounding for SSCM theory and substantially contributes to the debate by addressing numerous research gaps.
We present a compact battery charger topology for weight- and cost-sensitive applications with an average output current of 9A, targeted at 36V batteries commonly found in electric bicycles. Instead of using a conventional boost converter with large DC-link capacitors, we accomplish PFC functionality by shaping the charging current into a sin² shape. In addition, a novel control scheme without input-current sensing is introduced. A priori knowledge is used to implement a feed-forward control in combination with a closed-loop output current control to maintain the target current. The use of a full-bridge/half-bridge LLC converter enables operation in a wide input-voltage range.
A fully featured prototype has been built with a peak output power of 1050W. An average output power of 400W was measured, resulting in a power density of 1.8 kW/dm³. At 9A charging current, a power factor of 0.96 was measured and the efficiency exceeds 93% on average with passive rectification.
The impact of pulse charging has been evaluated on a 400Wh battery which was charged with the proposed converter as well as CC-CV-charging for reference. Both charging schemes show similar battery surface temperatures.
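The sin²-shaped charging-current reference described above can be illustrated with a short sketch. This is a hedged reconstruction: the 9A average current and mains frequency come from the abstract, but the function itself is an illustrative assumption, not the authors' control firmware.

```python
import math

def sin2_current_reference(t, i_avg=9.0, f_mains=50.0):
    """Charging-current reference shaped as sin^2 of the mains phase.

    The time average of sin^2 over a full period is 1/2, so scaling the
    peak to 2 * i_avg keeps the mean charging current at i_avg while the
    drawn current follows the mains voltage shape (PFC behaviour).
    """
    return 2.0 * i_avg * math.sin(2 * math.pi * f_mains * t) ** 2
```

Because the shape is known a priori, such a reference can drive a feed-forward controller without measuring the input current directly.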
Recognizing human actions is a core challenge for autonomous systems as they directly share the same space with humans. Systems must be able to recognize and assess human actions in real-time. To train the corresponding data-driven algorithms, a significant amount of annotated training data is required. We demonstrate a pipeline to detect humans, estimate their pose, track them over time and recognize their actions in real-time with standard monocular camera sensors. For action recognition, we transform noisy human pose estimates into an image-like format we call Encoded Human Pose Image (EHPI). This encoded information can then be classified using standard methods from the computer vision community. With this simple procedure, we achieve competitive state-of-the-art performance in pose-based action detection and can ensure real-time performance. In addition, we show a use case in the context of autonomous driving to demonstrate how such a system can be trained to recognize human actions using simulation data.
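The pose-to-image idea can be sketched as follows. The exact channel layout of the authors' EHPI is not given in the abstract, so this is a minimal illustrative variant: the joint count, temporal width and padding scheme are assumptions.

```python
import numpy as np

def encode_ehpi(poses, frames=32):
    """Encode normalized 2D joint positions as an image-like tensor.

    poses: array of shape (T, J, 2) with x/y coordinates in [0, 1].
    Output: (J, frames, 2) -- joints as rows, time steps as columns and
    x/y as channels, padded or truncated to a fixed temporal width so a
    standard CNN image classifier can consume it.
    """
    poses = np.asarray(poses, dtype=np.float32)
    t, j, _ = poses.shape
    img = np.zeros((j, frames, 2), dtype=np.float32)
    n = min(t, frames)
    img[:, :n, :] = poses[-n:].transpose(1, 0, 2)  # keep most recent frames
    return img
```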
Serverless computing is an emerging cloud computing paradigm with the goal of freeing developers from resource management issues. As of today, serverless computing platforms are mainly used to process computations triggered by events or user requests that can be executed independently of each other. These workloads benefit from on-demand and elastic compute resources as well as per-function billing. However, it is still an open research question to which extent parallel applications, which most often comprise complex coordination and communication patterns, can benefit from serverless computing.
In this paper, we introduce serverless skeletons for parallel cloud programming to free developers from both parallelism and resource management issues. In particular, we investigate the well-known and widely used farm skeleton, which supports the implementation of a wide range of applications. To evaluate our concepts, we present a prototypical development and runtime framework and implement two applications based on our framework: numerical integration and hyperparameter optimization, a commonly applied technique in machine learning. We report on performance measurements for both applications and discuss the usefulness of our approach.
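As a rough illustration of the farm skeleton, the sketch below scatters independent numerical-integration tasks to workers and gathers the partial results. A local thread pool stands in for serverless function invocations; the framework's actual API is not shown in the abstract, so all names here are hypothetical.

```python
import math
from concurrent.futures import ThreadPoolExecutor  # local stand-in for FaaS calls

def integrate_chunk(args):
    """Worker: trapezoidal integration of sin(x) over [a, b] with n steps."""
    a, b, n = args
    h = (b - a) / n
    s = 0.5 * (math.sin(a) + math.sin(b))
    for i in range(1, n):
        s += math.sin(a + i * h)
    return s * h

def farm(worker, tasks, max_workers=4):
    """Farm skeleton: apply one worker function to independent tasks."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(worker, tasks))

# split the integral of sin over [0, pi] into four independent chunks
chunks = [(i * math.pi / 4, (i + 1) * math.pi / 4, 10000) for i in range(4)]
total = sum(farm(integrate_chunk, chunks))  # ≈ 2.0
```

In a serverless setting, each task would be billed and scaled per invocation, which is what makes such embarrassingly parallel workloads a natural fit.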
Semi-automated image data labelling using AprilTags as a pre-processing step for machine learning
(2019)
Data labelling is a pre-processing step to prepare data for machine learning. There are many ways to collect and prepare this data, but these are usually associated with considerable effort. This paper presents an approach to semi-automated image data labelling using AprilTags. The AprilTags attached to the object, which contain a unique ID, make it possible to link the object surfaces to a particular class. This approach will be implemented and used to label data of a stackable box.
The data is evaluated by training a You Only Look Once (YOLO) net, with a subsequent evaluation of the detection results. These results show that the semi-automatically collected and labelled data can certainly be used for machine learning. However, if concise features of an object surface are covered by the AprilTag, there is a risk that the affected class will not be recognized. It can be assumed that the labelled data can be used not only for YOLO but also for other machine learning approaches.
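The step from a detected tag to a training label can be sketched as follows. Tag detection itself (corner extraction, ID decoding) is assumed to have happened already, and the box-scaling factor is a hypothetical per-object calibration, not a value from the paper.

```python
def yolo_label_from_tag(tag_id, tag_center, tag_size, img_w, img_h, scale=4.0):
    """Derive a YOLO-format label line from a detected AprilTag.

    The tag ID maps to the object class; the object bounding box is
    approximated by scaling the tag footprint (scale is a hypothetical
    per-object calibration). YOLO label format:
    'class x_center y_center width height', all normalized to [0, 1].
    """
    cx, cy = tag_center
    w = min(tag_size * scale, img_w)
    h = min(tag_size * scale, img_h)
    return (f"{tag_id} {cx / img_w:.6f} {cy / img_h:.6f} "
            f"{w / img_w:.6f} {h / img_h:.6f}")
```

A tag of side 40 px centred in a 640x480 image would, for example, yield a centred box covering a quarter of the image width.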
The persistent development towards decreasing batch sizes due to ongoing product individualization, as well as increasingly dynamic market and competitive conditions, leads to new changeability requirements in production environments. Since each of the individualized products might require different base materials or components and manufacturing resources, the paths of the products going through the factory as well as the required internal transport and material supply processes are going to differ for every product. Conventional planning and control systems, which rely on predefined processes and central decision-making, are not capable of dealing with the arising system complexity along the dimensions of changing goods, layouts and throughput requirements. The concepts of "self-organization" in combination with "autonomous control" provide promising solutions to these new requirements by using, among other things, the potential of autonomous, decentralized and target-optimized logistical objects (e.g. smart products, bins and conveyor systems) which are able to communicate and interact with each other as well as with human workers. To investigate the potential of automation and human-robot collaboration for intralogistics, a research project for the development of a collaborative tugger train has been started at the ESB Logistics Learning Factory, in line with various student projects in neighboring research areas. This collaborative tugger train system, in combination with other manual (e.g. handcarts) and (semi-)automated conveyor systems (e.g. automated guided forklifts), will be integrated into a dynamic, self-organized scenario with varying production batch sizes to develop a method for target-oriented self-organization and autonomous control of intralogistics systems. For a structured investigation of self-organized scenarios, a generic intralogistics model as well as a criteria catalogue has been developed.
The ESB Logistics Learning Factory will serve as a practice-oriented research, validation and demonstration environment for these purposes.
RoPose-Real: real world dataset acquisition for data-driven industrial robot arm pose estimation
(2019)
It is necessary to employ smart sensory systems in dynamic and mobile workspaces where industrial robots are mounted on mobile platforms. Such systems should be aware of flexible and non-stationary workspaces and able to react autonomously to changing situations. Building upon our previously presented RoPose system, which employs a convolutional neural network architecture trained on purely synthetic data to estimate the kinematic chain of an industrial robot arm, we now present RoPose-Real. RoPose-Real extends the prior system with a comfortable and targetless extrinsic calibration tool, to allow for the production of automatically annotated datasets for real robot systems. Furthermore, we use the novel datasets to train the estimation network with real-world data. The extracted pose information is used to automatically estimate the observing sensor pose relative to the robot system. Finally, we evaluate the performance of the presented subsystems in a real-world robotic scenario.
This paper studies option pricing based on a reverse engineering (RE) approach. We utilize artificial intelligence in order to numerically compute the prices of options. The data consist of more than 5000 call and put options from the German stock market. First, we find that option pricing under reverse engineering obtains a smaller root mean square error relative to market prices. Second, we show that the reverse engineering model is reliant on training data. In general, the novel idea of reverse engineering is a rewarding direction for future research. It circumvents the limitations of finance theory, among others the strong assumptions and numerical approximations under the Black–Scholes model.
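Since the approach is benchmarked against the Black–Scholes model via the root mean square error to market prices, both reference quantities can be sketched compactly. This is illustrative only; the paper's dataset and trained model are not reproduced here.

```python
import math

def bs_call(s, k, t, r, sigma):
    """Black-Scholes price of a European call (the classical benchmark).

    s: spot, k: strike, t: time to maturity in years,
    r: risk-free rate, sigma: volatility.
    """
    d1 = (math.log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # standard normal CDF
    return s * phi(d1) - k * math.exp(-r * t) * phi(d2)

def rmse(model_prices, market_prices):
    """Root mean square error of model prices against market prices."""
    n = len(market_prices)
    return math.sqrt(sum((m - p) ** 2 for m, p in zip(model_prices, market_prices)) / n)
```

A data-driven model "wins" in the paper's sense when its RMSE against observed market prices is below that of the Black–Scholes benchmark.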
This study is about estimating the reproducibility of finding palpation points of three different anatomical landmarks in the human body (the xiphoid process and the two hip crests) to support a navigated ultrasound application. On 6 test subjects with different body mass indices, the three palpation points were each located five times by two examiners. The deviation from the target position was calculated and correlated with the fat thickness above each palpation point. The reproducibility of the measurements had a mean error of ≈13.5 mm ± 4 mm, which seems to be sufficient for the desired application field.
In this paper, we introduce an approach showing how reinforcement learning can be used to achieve interoperability between heterogeneous Internet of Things (IoT) components. More specifically, we model an HTTP REST service as a Markov Decision Process and adapt Q-Learning to the properties of REST so that an agent in the role of an HTTP REST client can learn the semantics of the service and, in particular, an optimal sequence of service calls to achieve an application-specific goal. With our approach, we want to open up and facilitate a discussion in the community, as we see the utilization of artificial intelligence techniques as the key to achieving interoperability in the IoT.
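A minimal tabular sketch of the idea: states are resource URLs, actions are HTTP calls, and a simulated two-step service stands in for real requests. The service, rewards and hyperparameters are illustrative assumptions, not the paper's setup.

```python
import random

def q_learning(step, states, actions, start, episodes=300,
               alpha=0.5, gamma=0.9, eps=0.2):
    """Tabular Q-learning: learn an optimal sequence of service calls.

    step(state, action) -> (next_state, reward, done) simulates the
    service; in practice it would issue the actual HTTP request.
    """
    q = {(s, a): 0.0 for s in states for a in actions}
    for _ in range(episodes):
        s, done = start, False
        for _ in range(50):  # cap episode length
            if done:
                break
            a = (random.choice(actions) if random.random() < eps
                 else max(actions, key=lambda x: q[(s, x)]))
            s2, r, done = step(s, a)
            best = 0.0 if done else max(q[(s2, x)] for x in actions)
            q[(s, a)] += alpha * (r + gamma * best - q[(s, a)])
            s = s2
    return q

def fake_service(state, action):
    """Hypothetical two-step service: add an item, then check out."""
    if state == "/cart" and action == "POST /items":
        return "/order", 0.0, False
    if state == "/order" and action == "POST /checkout":
        return "/order", 1.0, True
    return state, -0.1, False  # wrong call: small penalty, no progress
```

After training, the greedy policy reproduces the intended call sequence: first add the item, then check out.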
This document presents an algorithm for the nonobtrusive recognition of sleep/wake states using signals derived from ECG, respiration, and body movement captured while lying in bed. Multinomial logistic regression was chosen as the mathematical core of the system's data analytics. Derived parameters of the three signals serve as the input to the proposed method. The overall achieved accuracy rate is 84% for wake/sleep stages, with a Cohen's kappa of 0.46. The presented algorithm should support experts in analyzing sleep quality in more detail. The results confirm the potential of this method and disclose several ways for its improvement.
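The classification core can be sketched as a softmax over per-class scores computed from the derived signal parameters. The coefficients below are illustrative placeholders, not the fitted model from the study.

```python
import math

def softmax(z):
    """Numerically stable softmax over class scores."""
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def predict_stage(features, weights, bias):
    """Multinomial logistic regression over derived ECG/respiration/
    movement features; returns per-class probabilities (e.g. Wake, Sleep).

    weights: one row of coefficients per class; bias: one value per class.
    Both are hypothetical here -- in the study they would be fitted to
    annotated recordings.
    """
    scores = [b + sum(w * x for w, x in zip(row, features))
              for row, b in zip(weights, bias)]
    return softmax(scores)
```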
Most antimicrobial peptides (AMPs) and their synthetic mimics (SMAMPs) are thought to act by permeabilizing cell membranes. For antimicrobial therapy, selectivity for pathogens over mammalian cells is a key requirement. Understanding membrane selectivity is thus essential for designing AMPs and SMAMPs to complement classical antibiotics in the future. This study focuses on membrane permeabilization induced by SMAMPs and their selectivity for membranes with different lipid compositions. We measure the release and fluorescence lifetime of a self-quenching dye in lipid vesicles. Apart from the dose-response, we quantify the strength of individual leakage events and, employing cumulative kinetics, categorize permeabilization behavior. We propose that differing selectivities in a series of SMAMPs arise from a combination of the effect of the antimicrobial agent and the susceptibility of the membrane (with a given lipid composition) to certain types of leakage behavior. The unselective and hemolytic SMAMP is found to act mainly by the asymmetry stress mechanism, mediated by hydrophobic insertion of SMAMPs into lipid layers. The more selective SMAMPs induced leakage events that occurred stochastically over several hours. Lipid-intrinsic properties might additionally amplify the efficiency of leakage events. Leakage behavior changes with both the design of the SMAMP and the lipid composition of the membrane. Understanding how leakage behavior contributes to the selectivity and activity of antimicrobial agents will aid the design and screening of antimicrobials. An understanding of the underlying processes facilitates the comparison of membrane permeabilization across in vitro and in vivo assays.
Vitamin E (VitE) additives are important in treating osteoarthritis, including cartilage regeneration, due to their antioxidant and anti-inflammatory properties. The present research study focuses on the ability of the biological antioxidant VitE (alpha-tocopherol isoform) to reduce or minimize oxidative degradation of soft implantable polyurethane (PU) elastomers after extended periods of time (5 months) in vitro. The effect of the oxidation storage media on the morphology of the segmented PUs was evaluated by mechanical softening as well as the crystallization and melting behavior of both soft and hard segments (SS, HS) using dynamic mechanical analysis (DMA). Bulk mechanical properties of the potential implant materials during ageing were predicted from comprehensive mechanical testing of the biomaterials under tension and compression cyclic loads. Five-month in vitro data suggest that the prepared siloxane-poly(carbonate urethane) formulations have sufficient resistance against degradation to be suitable materials for long-term bio-stable chondral implants. Most importantly, the positive effect of incorporating VitE (0.5 or 1.0% w/w) as bio-antioxidant and lubricant on the bio-stability was observed for all PU types. VitE additives protected the surface layer from erosion and cracking during chemical oxidation in vitro as well as from thermal oxidation during extrusion re-processing.
Context: Companies in highly dynamic markets increasingly struggle with their ability to plan product development and to create reliable roadmaps. A main reason is the decreasing predictability of markets, technologies, and customer behaviors. New approaches for product roadmapping seem to be necessary in order to cope with today's highly dynamic conditions. Little research is available with respect to such new approaches. Objective: In order to better understand the state of the art and to identify research gaps, this article presents a review of the scientific literature with respect to product roadmapping. Method: We performed a systematic literature review (SLR) to identify papers in the field of computer science. Results: After filtering, the search resulted in a set of 23 relevant papers. The identified papers focus on different aspects such as roadmap types, processes for creating and updating roadmaps, problems and challenges with roadmapping, approaches to visualize roadmaps, generic frameworks, and specific aspects such as the combination of roadmaps with business modeling. Overall, the scientific literature covers many important aspects of roadmapping but provides only little knowledge on how to create product roadmaps under highly dynamic conditions. Research gaps concern, for instance, the inclusion of goals or outcomes in product roadmaps, the alignment of a roadmap with a product vision, and the inclusion of product discovery activities in product roadmaps. In addition, the transformation from traditional roadmapping processes to new ways of roadmapping is not sufficiently addressed in the scientific literature.
Digital technologies are moving into physical products. Smart cars, connected lightbulbs and data-generating tennis rackets are examples of previously “pure” physical products that turned into “digitized products”. Digitizing products offers many use cases for consumers that will hopefully persuade them to buy these products. Yet, as revenues from selling digitized products will remain small in the near future, digitized product manufacturers have to look for other sources of benefits. Producer-side use cases describe how manufacturers can benefit internally from the digitized products they produce. Our article identifies three categories of such use cases: product-, service-, and process-related ones.
Additive manufacturing (AM) is a promising manufacturing method for many industrial sectors. For this application, industrial requirements such as high production volumes and coordinated implementation must be taken into account. These internal planning and scheduling tasks of production facilities are carried out by the Production Planning and Control (PPC) information system. A key factor in planning and scheduling is the exact calculation of manufacturing times. For this purpose, we investigate the use of Machine Learning (ML) for the prediction of manufacturing times of AM facilities.
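As a minimal illustration of predicting manufacturing times from job features, a one-feature least-squares fit (e.g. build time over layer count) could look like the sketch below. The feature choice, the sample data and the closed-form fit are assumptions for illustration, not the ML models evaluated in the paper.

```python
def fit_linear(xs, ys):
    """Ordinary least squares for a single feature, closed form.

    Returns (intercept, slope) minimizing the squared prediction error.
    """
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope

def predict_time(intercept, slope, layers):
    """Predicted manufacturing time for a new AM job (hypothetical feature)."""
    return intercept + slope * layers

# hypothetical past jobs: layer counts and measured build times in hours
a, b = fit_linear([100, 200, 300], [3.0, 5.0, 7.0])
```

A PPC system could use such a predictor to schedule AM jobs before the slicer reports exact build times; in practice, richer features and non-linear models would be candidates.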
Potentials of smart contracts-based disintermediation in additive manufacturing supply chains
(2019)
We investigate which potentials are created by using smart contracts for disintermediation in supply chains for additive manufacturing. Using a qualitative, critical realist research approach, we analyzed three case studies with companies active in additive manufacturing. Based on interviews with experts from these companies, we identified eight key requirements for disintermediation and four associated potentials of smart contracts-based disintermediation.
We report on the reflectance, transmittance and fluorescence spectra (λ = 200–1200 nm) of four types of chicken eggshells (white, brown, light green, dark green) measured in situ without pretreatment and after ablation of 20–100 μm of the outer shell regions. The color pigment protoporphyrin IX (PPIX) is embedded in the protein phase of all four shell types as highly fluorescent monomers, in the white and light green shells additionally as non-fluorescent dimers, and in the brown and dark green shells mainly as non-fluorescent poly-aggregates. The green shell colors are formed from an approximately equimolar mixture of PPIX and biliverdin. The axial distributions of protein and color pigments were evaluated from the combined reflectances of both the outer and inner shell surfaces, as well as from the transmittances. For the data generation we used the radiative transfer model in the random-walk and Kubelka-Munk approaches.
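For reference, the standard Kubelka-Munk relation underlying one of the approaches named above links the diffuse reflectance of an optically thick layer, R∞, to the absorption coefficient K and the scattering coefficient S. This is the textbook form, not the paper's full radiative transfer model:

```latex
\frac{K}{S} = \frac{(1 - R_\infty)^2}{2\,R_\infty}
```

Measuring R∞ thus gives access to the ratio of absorption to scattering, which is what allows pigment distributions to be inferred from reflectance spectra.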
Assistive environments are entering our homes faster than ever. However, there are still various barriers to be broken. One of the crucial points is the personalization of offered services and the integration of assistive technologies into common objects and therefore into the regular daily routine. Recognition of sleep patterns for a preliminary sleep study is one of the health services that could be performed in an unobtrusive way. This article proposes a hardware system for the measurement of bio-vital signals necessary for an initial sleep study in a nonobtrusive way. The first results confirm the potential of measuring breathing and movement signals with the proposed system.
Nowadays, the demand for a MEMS development/design kit (MDK) is more in focus than ever before. In order to achieve high quality and cost-effectiveness in the development process for automotive and consumer applications, an advanced design flow for the MEMS (micro-electro-mechanical systems) element is urgently required. In this paper, such a development methodology and flow for the parasitic extraction of active semiconductor devices is presented. The methodology considers geometrical extraction, links the electrically active pn junctions to SPICE standard library models, and subsequently extracts the netlist. An example for a typical pressure sensor is presented and discussed. Finally, the results of the parasitic extraction are compared with fabricated devices in terms of accuracy and capability.