Refine
Year of publication
- 2019 (223)
Document Type
- Conference proceeding (114)
- Journal article (87)
- Book (9)
- Book chapter (9)
- Doctoral Thesis (3)
- Anthology (1)
Language
- English (223)
Is part of the Bibliography
- yes (223)
Institute
- Informatik (88)
- ESB Business School (52)
- Technik (35)
- Life Sciences (34)
- Texoversum (14)
Publisher
- IEEE (35)
- Elsevier (28)
- Springer (27)
- Hochschule Reutlingen (19)
- De Gruyter (7)
- Stellenbosch University (7)
- MDPI (6)
- Wiley (5)
- Fac. of Organization & Informatics, Univ. of Zagreb (4)
- SCITEPRESS (4)
This study estimates the reproducibility of locating palpation points at three different anatomical landmarks of the human body (the xiphoid process and the two hip crests) to support a navigated ultrasound application. In six test subjects with different body mass indices, the three palpation points were each located five times by two examiners. The deviation from the target position was calculated and correlated with the fat thickness above each palpation point. The measurements were reproducible with a mean error of ≈13.5 mm ± 4 mm, which appears sufficient for the intended application field.
Polyurethane-based block copolymers (TPCUs) contain systematically varied soft and hard segments and have been suggested as materials for chondral implants in joint regeneration. Such applications may require the adhesion of chondrocytes to the implant surface, facilitating cell growth while preserving their phenotype. The aims of this work were therefore (1) to modify the surface of soft, biostable polyurethane-based model implants (TPCU and TSiPCU) with high-molecular-weight hyaluronic acid (HA) using an optimized multistep immobilization strategy, and (2) to evaluate the bioactivity of the modified TPCUs in vitro. Our results show no cytotoxic potential of the TPCUs. Bioactive HA molecules (Mw = 700 kDa) were immobilized onto the polyurethane surface via polyethylenimine (PEI) spacers, and the modifications were confirmed by several characterization methods. Tests with porcine chondrocytes indicated the potential of TPCU-HA to induce enhanced cell proliferation.
Digitalization is changing manufacturing dramatically. In view of employees' demands, global trends, and the technological vision of future factories, automotive manufacturing faces a large number of diverse challenges. Current research focuses on the technological aspects of future factories in terms of digitalization; new ways of working and new organizational models for future factories have not yet been described. There are assumptions about how to develop the organization of work in a future factory, but the literature still lacks scientifically substantiated answers in this research area. Consequently, the objective of this paper is to present an approach to work organization design for automotive Industry 4.0 manufacturing. Future requirements were analyzed and distilled into criteria that determine future agile organization design. These criteria were then transformed into functional mechanisms, which define the approach for shopfloor organization design.
The present publication reports the purification effort of two natural bone blocks, that is, an allogeneic bone block (maxgraft®, botiss biomaterials GmbH, Zossen, Germany) and a xenogeneic block (SMARTBONE®, IBI S.A., Mezzovico Vira, Switzerland), in addition to previously published results based on histology. Furthermore, specialized scanning electron microscopy (SEM) and in vitro analyses (XTT, BrdU, LDH) for testing cytocompatibility based on ISO 10993-5/-12 have been conducted. The microscopic analyses showed that both bone blocks possess a trabecular structure with a lamellar subarrangement. In the case of the xenogeneic bone block, only minor remnants of collagenous structures were found, while in contrast high amounts of collagen were found associated with the allogeneic bone matrix. Furthermore, in the case of the xenogeneic bone substitute, only island-like remnants of the polymer coating were detectable. Finally, no remaining cells or cellular remnants were found in either bone block. The in vitro analyses showed that both bone blocks are biocompatible. Altogether, the purification level of both bone blocks seems favorable for bone tissue regeneration without the risk of inflammatory responses or graft rejection. Moreover, the analysis of the maxgraft® bone block showed that the underlying purification process preserves not only the calcified bone matrix but also high amounts of the intertrabecular collagen matrix.
Changing requirements and qualification profiles of employees, increasingly complex digital systems up to artificial intelligence, missing standards for the seamless embedding of existing resources, and unpredictable returns on investment are just a few examples of the challenges an SME faces in the age of digitalisation. In most cases there is a lack of suitable tools and methods to support companies in the digital transformation of their value creation processes, as well as a lack of training and learning materials. A European research project (BITTMAS - Business Transformation towards Digitalisation and Smart systems, ERASMUS+, 2016-1 DE02-KA202-003437) with international partners from science, associations, and industry has addressed this issue and developed various methods and instruments to support SMEs. A literature search identified 16 suitable digitalisation concepts for production and logistics. Subsequently, a learning platform was created to provide the user with consolidated and structured specialist knowledge: it comprises a literature database with multivariable sorting options by branch and digitalisation keyword, a video gallery with basic and advanced knowledge, and a glossary. The 16 identified concepts for transforming value-added processes in the context of digitalisation were transferred to the learning platform as online course modules, including test questions, using learning paths developed for coaching and training. A maturity model was developed and implemented in a self-assessment tool that identifies the potential of digitalisation in production and logistics relative to the company's current technological digitalisation level. As a result, the user is presented with one or more of the 16 potential digitalisation concepts, or the delta of the necessary, not yet available enabler technologies is shown as a spider diagram.
For a successful implementation of the identified suitable digitalisation concepts in production and logistics, a further tool was developed to identify supplementary requirements for all company divisions and stakeholders in relation to the "digital transformation" in the form of a self-evaluation. This paper presents the methods and tools developed, the accompanying learning materials and the learning platform.
Additive manufacturing (AM) is a promising manufacturing method for many industrial sectors. For this application, industrial requirements such as high production volumes and coordinated implementation must be taken into account. These tasks of the internal handling of production facilities are carried out by the Production Planning and Control (PPC) information system. A key factor in the planning and scheduling is the exact calculation of manufacturing times. For this purpose we investigate the use of Machine Learning (ML) for the prediction of manufacturing times of AM facilities.
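As a rough illustration of the idea (not the paper's actual model), manufacturing times of past AM jobs can be fit with least squares and used to predict new jobs. The features and all numbers below are invented for the sketch:

```python
import numpy as np

# Hypothetical training data: each row = (part height in mm, part volume in cm^3),
# target = observed build time in hours. All values are illustrative only.
X = np.array([[40, 10], [80, 25], [120, 50], [60, 18], [100, 40]], dtype=float)
y = np.array([3.1, 6.4, 11.2, 4.8, 9.0])

# Fit a linear model  t ≈ w0 + w1*height + w2*volume  via least squares
A = np.hstack([np.ones((len(X), 1)), X])
w, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict_build_time(height_mm: float, volume_cm3: float) -> float:
    """Predict the build time (hours) of a new part from its features."""
    return float(w @ [1.0, height_mm, volume_cm3])
```

In a PPC context, such a predictor would feed estimated job durations into planning and scheduling; a production-grade version would use richer features (layer count, material, nesting) and a more expressive ML model.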
The promise of immutable documents, which make it easier and less expensive for consumers and producers to collaborate in a verifiable way, would represent enormous progress, especially as companies strive to establish service contracts based on the flow of many small transactions using machine-to-machine communication. Blockchain technology logs these data, verifies their authenticity, and makes them available for service offers. This work deals with an architecture that enables order processing between consumers and producers using blockchain. In this way, the technical feasibility is shown, and the special characteristics of blockchain production networks are discussed.
There is no denying that organizations, whether domestic or global, whether educational, governmental, or business, are undergoing rapid transformation. However, what is causing it? Prompted by the need to remain relevant and competitive, organizations constantly try to reinvent themselves. Those that do not, according to the laws of economics, will simply serve no purpose and will eventually cease to exist. Regardless of sector or industry, an organization's success pivots around its human talent. Hence, it is crucial to manage it and cultivate certain traits, knowledge, and skills. In today's global economy, organizations are more interconnected than ever before, and thus the challenges they face require that employees possess not only expert knowledge and problem-solving, cross-cultural, and cross-functional teaming skills, but also good communication skills and agile thinking.
Many researchers have explored the phenomenon of intercultural communication since Edward T. Hall first brought it to light in the late 1950s. Although the literature is quite extensive, the ongoing sociopolitical struggles are evidence that even in the twenty-first century, society has limited intercultural as well as intracultural communication competence. This limited understanding continues to bring about discord in every facet of life, including work.
The modern workforce is expected to possess certain knowledge, skills, and attitudes that are inherently different from those expected from previous generations. Due to globalization, intercultural competence and highly effective communication skills are at the top of the list - a working knowledge of English as the lingua franca of today's business world can be considered as a first step.
This paper investigates the possibility to effectively monitor and control the respiratory action using a very simple and non invasive technique based on a single lightweight reduced-size wireless surface electromyography (sEMG) sensor placed below the sternum. The captured sEMG signal, due to the critical sensor position, is characterized by a low energy level and it is affected by motion artifacts and cardiac noise. In this work we present a preliminary study performed on adults for assessing the correlation of the spirometry signal and the sEMG signal after the removal of the superimposed heart signal. This study and the related findings could be useful in respiratory monitoring of preterm infants.
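The separation step described above can be illustrated with a synthetic sketch; the sampling rate, frequencies, and the simple moving-average suppression are assumptions for illustration, not the paper's actual signal-processing method:

```python
import numpy as np

fs = 1000                       # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)

# Synthetic stand-ins: a slow respiratory component plus a faster cardiac artifact
respiration = np.sin(2 * np.pi * 0.25 * t)   # ~15 breaths/min
cardiac = 0.5 * np.sin(2 * np.pi * 1.2 * t)  # ~72 beats/min artifact
semg = respiration + cardiac

# Crude cardiac suppression: a moving average whose window (~0.8 s) spans
# roughly one cardiac cycle averages the artifact out while the slow
# respiratory component passes almost unchanged
win = int(0.8 * fs)
kernel = np.ones(win) / win
cleaned = np.convolve(semg, kernel, mode="same")

# Correlate the cleaned sEMG with the reference (spirometry stand-in)
r = np.corrcoef(cleaned, respiration)[0, 1]
```

In the real setting, the reference would be the spirometry recording, and more robust cardiac-artifact removal (e.g. template subtraction) would likely be required given the low energy level of the sensor position.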
Mystery shopping (MS) is a widely used tool to monitor the quality of service and personal selling. In consultative retail settings, assessments of mystery shoppers are supposed to capture the most relevant aspects of salespeople's service and sales behavior. Given the important conclusions managers draw from MS results, the standard assumption seems to be that assessments of mystery shoppers are strongly related to customer satisfaction and sales performance. However, surprisingly scant empirical evidence supports this assumption. We test the relationship between MS assessments and customer evaluations and sales performance with large-scale data from three service retail chains. Surprisingly, we do not find a substantial correlation. The results show that mystery shoppers are not good proxies for real customers. While MS assessments are not related to sales, our findings confirm the established correlation between customer satisfaction measurements and sales results.
Due to the consequential impact of technological breakdowns, companies have to be prepared to deal with breakdowns or, even better, prevent them. In today's information technology, several methods and tools exist to mitigate this concern. This paper therefore deals with the initial determination of a resilient enterprise architecture supporting predictive maintenance in the information technology domain and, furthermore, considers several mechanisms for reactively and proactively securing the state of resiliency on several abstraction levels. The objective of this paper is to give an overview of existing mechanisms for resiliency and to describe the foundation of an optimized approach combining infrastructure and process mining techniques.
While the concepts of object-oriented antipatterns and code smells are prevalent in scientific literature and have been popularized by tools like SonarQube, the research field for service-based antipatterns and bad smells is not as cohesive and organized. The description of these antipatterns is distributed across several publications with no holistic schema or taxonomy. Furthermore, there is currently little synergy between documented antipatterns for the architectural styles SOA and Microservices, even though several antipatterns may hold value for both. We therefore conducted a Systematic Literature Review (SLR) that identified 14 primary studies. 36 service-based antipatterns were extracted from these studies and documented with a holistic data model. We also categorized the antipatterns with a taxonomy and implemented relationships between them. Lastly, we developed a web application for convenient browsing and implemented a GitHub-based repository and workflow for the collaborative evolution of the collection. Researchers and practitioners can use the repository as a reference, for training and education, or for quality assurance.
Microservices are a topic driven mainly by practitioners, and academia is only starting to investigate them. Hence, there is no clear picture of the usage of Microservices in practice. In this paper, we contribute a qualitative study with insights into industry adoption and implementation of Microservices. Contrary to existing quantitative studies, we conducted interviews to gain a more in-depth understanding of the current state of practice. During 17 interviews with software professionals from 10 companies, we analyzed 14 service-based systems. The interviews focused on applied technologies, Microservices characteristics, and the perceived influence on software quality. We found that companies generally rely on well established technologies for service implementation, communication, and deployment. Most systems, however, did not exhibit a high degree of technological diversity as commonly expected with Microservices. Decentralization and product character were different for systems built for external customers. Applied DevOps practices and automation were still on a mediocre level, and only very few companies strictly followed the "you build it, you run it" principle. The impact of Microservices on software quality was mainly rated as positive. While maintainability received the most positive mentions, some major issues were associated with security. We present a description of each case and summarize the most important findings of companies across different domains and sizes. Researchers may build upon our findings and take them into account when designing industry-focused methods.
While Microservices promise several beneficial characteristics for sustainable long-term software evolution, little empirical research covers what concrete activities industry applies for the evolvability assurance of Microservices and how technical debt is handled in such systems. Since insights into the current state of practice are very important for researchers, we performed a qualitative interview study to explore applied evolvability assurance processes, the usage of tools, metrics, and patterns, as well as participants’ reflections on the topic. In 17 semi-structured interviews, we discussed 14 different Microservice-based systems with software professionals from 10 companies and how the sustainable evolution of these systems was ensured. Interview transcripts were analyzed with a detailed coding system and the constant comparison method.
We found that especially systems for external customers relied on central governance for the assurance. Participants saw guidelines like architectural principles as important to ensure a base consistency for evolvability. Interviewees also valued manual activities like code review, even though automation and tool support was described as very important. Source code quality was the primary target for the usage of tools and metrics. Despite most reported issues being related to Architectural Technical Debt (ATD), our participants did not apply any architectural or service-oriented tools and metrics. While participants generally saw their Microservices as evolvable, service cutting and finding an appropriate service granularity with low coupling and high cohesion were reported as challenging. Future Microservices research in the areas of evolution and technical debt should take these findings and industry sentiments into account.
While several service-based maintainability metrics have been proposed in the scientific literature, reliable approaches to automatically collect these metrics are lacking. Since static analysis is complicated for decentralized and technologically diverse microservice-based systems, we propose a dynamic approach to calculate such metrics from runtime data via distributed tracing. The approach focuses on simplicity, extensibility, and broad applicability. As a first prototype, we implemented a Java application with a Zipkin integrator, 23 different metrics, and five export formats. We demonstrated the feasibility of the approach by analyzing the runtime data of an example microservice-based system. During an exploratory study with six participants, 14 of the 18 services were invoked via the system's web interface. For these services, all metrics were calculated correctly from the generated traces.
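The core idea of deriving maintainability metrics from traces can be sketched in a few lines: caller/callee pairs extracted from spans yield per-service dependency counts. Service names and the metric choice below are illustrative, not the prototype's 23 metric definitions:

```python
from collections import defaultdict

# Simplified trace data: (caller, callee) pairs extracted from spans
calls = [
    ("gateway", "orders"), ("gateway", "users"),
    ("orders", "payments"), ("orders", "users"),
    ("payments", "users"),
]

def coupling_metrics(calls):
    """Compute simple afferent/efferent coupling per service from call pairs."""
    out_deps = defaultdict(set)   # distinct services each service calls
    in_deps = defaultdict(set)    # distinct services calling each service
    for caller, callee in calls:
        out_deps[caller].add(callee)
        in_deps[callee].add(caller)
    services = set(out_deps) | set(in_deps)
    return {s: {"out": len(out_deps[s]), "in": len(in_deps[s])} for s in services}

metrics = coupling_metrics(calls)
```

A real implementation would parse Zipkin span data instead of hard-coded pairs, but the aggregation principle is the same: runtime observations replace static dependency analysis.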
Software evolvability is an important quality attribute, yet one difficult to grasp. A certain base level of it is allegedly provided by service- and microservice-based systems, but many software professionals lack a systematic understanding of the reasons and preconditions for this. We address this issue via the proxy of architectural modifiability tactics. By qualitatively mapping principles and patterns of Service Oriented Architecture (SOA) and microservices onto tactics and analyzing the results, we can not only generate insights into service-oriented evolution qualities, but can also provide a modifiability comparison of the two popular service-based architectural styles. The results suggest that both SOA and microservices possess several inherent qualities beneficial for software evolution. While both focus strongly on loose coupling and encapsulation, there are also differences in the way they strive for modifiability (e.g. governance vs. evolutionary design). To leverage the insights of this research, however, it is necessary to find practical ways to incorporate the results as guidance into the software development process.
Background: Design patterns are supposed to improve various quality attributes of software systems. However, there is controversial quantitative evidence of this impact. Especially for younger paradigms such as service- and microservice-based systems, there is a lack of empirical studies.
Objective: In this study, we focused on the effect of four service-based patterns - namely process abstraction, service façade, decomposed capability, and event-driven messaging - on the evolvability of a system from the viewpoint of inexperienced developers.
Method: We conducted a controlled experiment with Bachelor students (N = 69). Two functionally equivalent versions of a service-based web shop - one with patterns (treatment group), one without (control group) - had to be changed and extended in three tasks. We measured evolvability by the effectiveness and efficiency of the participants in these tasks. Additionally, we compared both system versions with nine structural maintainability metrics for size, granularity, complexity, cohesion, and coupling.
Results: Both experiment groups were able to complete a similar number of tasks within the allowed 90 min. Median effectiveness was 1/3. Mean efficiency was 12% higher in the treatment group, but this difference was not statistically significant. Only for the third task did we find statistical support for the alternative hypothesis that the pattern version led to higher efficiency. In the metric analysis, the pattern version had worse measurements for size and granularity while simultaneously having slightly better values for coupling metrics. Complexity and cohesion were not impacted.
Interpretation: For the experiment, our analysis suggests that the difference in efficiency is stronger with more experienced participants and increased from task to task. With respect to the metrics, the patterns introduce additional volume in the system, but also seem to decrease coupling in some areas.
Conclusions: Overall, there was no clear evidence for a decisive positive effect of using service-based patterns, neither for the student experiment nor for the metric analysis. This effect might only be visible in an experiment setting with higher initial effort to understand the system or with more experienced developers.
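The group comparison described above can be illustrated with a simple one-sided permutation test on invented efficiency scores; the data and the test procedure below are assumptions for the sketch, not the study's actual data or analysis:

```python
import random
from statistics import mean

# Illustrative per-participant efficiency scores (tasks solved per hour)
treatment = [0.9, 1.1, 1.3, 0.8, 1.2, 1.0, 1.4, 0.7]
control = [0.8, 0.9, 1.0, 0.7, 1.1, 0.9, 1.0, 0.6]

def permutation_p_value(a, b, n_iter=10_000, seed=42):
    """One-sided permutation test: probability that a random relabeling of
    participants yields a mean difference at least as large as observed."""
    rng = random.Random(seed)
    observed = mean(a) - mean(b)
    pooled = a + b
    hits = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        if mean(pooled[:len(a)]) - mean(pooled[len(a):]) >= observed:
            hits += 1
    return hits / n_iter

p = permutation_p_value(treatment, control)
```

With small groups like the experiment's per-task subsets, such nonparametric tests avoid normality assumptions; a p-value above the chosen significance level would mean the efficiency difference could plausibly be due to chance.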
In standardized sectors such as automotive, the cost-benefit ratio of automation solutions is high, as they contribute to increasing capacity, decreasing costs, and improving product quality. In less standardized application fields, the contribution of automation to improvements in capacity, cost, and quality blurs. The automation of complex and unstructured tasks requires sophisticated, expensive, and low-performing systems, whose impact on product quality is often not directly perceived by customers. As a result, the full automation of process chains in the general manufacturing or logistics sectors is often a suboptimal solution. Departing from the false idea that a process should be either fully automated or fully manual, this paper presents a novel heuristic method for the design of lean human-robot interaction, the Quality Interaction Function Deployment, with the objective of finding the "right level of automation". Functions are divided among human and automated agents, and several automation scenarios are created and evaluated with respect to their compliance with the requirements of all process stakeholders. As a result, synergies between operators (manual tasks) and machines (automated tasks) are improved, reducing time losses and increasing productivity.
Powered by e-commerce and vital to the manufacturing industry, intralogistics has become an increasingly important and labour-intensive process. In highly standardized, automation-friendly environments such as the automotive sector, most efficiently automatable intralogistics tasks have already been automated. Due to the aging population in the EU and ergonomic regulations, the urge to automate intralogistics tasks has become consistent even where product and process standardization is lower. That is the case for the supply of material to production lines or cells, where an increasing number of product variants and individually customized products, combined with the necessary ability to react to changes in market conditions, has led to smaller and more frequent replenishment of the points of use in the production plant and to the chaotic addition of production cells to the shop floor layout. This in turn led to inevitable traffic growth, with unforeseeable delays and an increased level of safety threats and accidents. In this paper, we use the structured approach of the Quality Interaction Function Deployment to analyse the supply process of assembly lines, seeking the most efficient combination of automation and manual labour that satisfies all stakeholders' requirements. Results are presented and discussed.
»Flexible Arbeitspraktiken: Eine Analyse aus pragmatischer Perspektive«. Traditional human resource management (HRM) research can hardly relate to today's developments in the world of work. Organizational boundaries are blurred because of the complexity due to globalization, digitalization, and demographic changes. In practice, new ways of organizing work can be found that depend on the specifics of the work situation. In this paper, we build on the economics of convention (EC) to elaborate on the current challenges HRM scholarship is confronted with and provide a theoretical lens that goes beyond the tension between market and bureaucracy principles in actual employment settings. We apply EC’s situationalist methodology to examples of the challenging coordination of flexibility in the workplace. We explain two hybrid forms of coordination – compromises and local arrangements – and highlight the dynamics of employment practices in organizations related to these forms. Thereby, we show that different modes of coordination in employment are applied in a fluctuating manner that depends on the specific situations. In doing so, we further seek to remind HRM scholars of the fruitfulness of the pragmatist perspective in analyzing work practices, as well as extending its conceptual toolkit for future analysis.
Theory predicts that market‐timing activities bias Jensen's alpha (JA). However, empirical studies have failed to find consistent evidence of this bias. We tackle this puzzle in a nested model analysis and show that the bias contains an exogenous market component that is unrelated to market‐timing skill. In a comprehensive empirical analysis of US mutual funds, we find that the timing‐induced bias in JA is mainly driven by this market component, which is uncorrelated with measured timing activities. Measures of total performance that allow for timing activities are virtually identical to JA, even if timing activities are present in the evaluated fund. Hence, we conclude that JA is a sufficient measure of total performance.
This study describes a non-contact measuring and system identification procedure for evaluating inhomogeneous stiffness and damping characteristics of the annular ligament in the physiological amplitude and frequency range without the application of large static external forces that can cause unnatural displacements of the stapes. To verify the procedure, measurements were first conducted on a steel beam. Then, measurements on an individual human cadaveric temporal bone sample were performed. The estimated results support the inhomogeneous stiffness and damping distribution of the annular ligament and are in a good agreement with the multiphoton microscopy results which show that the posterior-inferior corner of the stapes footplate is the stiffest region of the annular ligament.
This study describes a non-contact measuring and parameter identification procedure designed to evaluate inhomogeneous stiffness and damping characteristics of the annular ligament in the physiological amplitude and frequency range without the application of large static external forces that can cause unnatural displacements of the stapes. To verify the procedure, measurements were first conducted on a steel beam. Then, measurements on an individual human cadaveric temporal bone sample were performed. The estimated results support the inhomogeneous stiffness and damping distribution of the annular ligament and are in a good agreement with the multiphoton microscopy results which show that the posterior-inferior corner of the stapes footplate is the stiffest region of the annular ligament. This method can potentially help to establish a correlation between stiffness and damping characteristics of the annular ligament and inertia properties of the stapes and, thus, help to reduce the number of independent parameters in the model-based hearing diagnosis.
This paper presents the preliminary results of a set of research projects being developed at the distributed resources laboratory at the University of Reutlingen. The main aim of these projects is to couple distributed ledger technologies (DLTs) with the distributed control of microgrids. Firstly, a DLT-based solution for a local market platform has been developed. This enables end customers to participate in new local micro-energy markets by providing them with a distributed, decentralized, transparent, and secure peer-to-peer (P2P) payment system. Secondly, this solution has been integrated with an autonomous (agent-based) grid management. The integrated solution of both the market platform and the agent-based control has been implemented and tested in a real microgrid with different distributed components, such as a PV system, a CHP unit, and different kinds of controllable loads. This microgrid is located in the distributed energy resources laboratory at the University of Reutlingen. Thirdly, the resulting solution is being implemented as an easy-to-customize market solution by AC2SG Software Oy, a Finland-based software company developing solutions for the Indian market. In a next phase, the solution will be tested in a real environment in off-grid systems in India.
Semi-automated image data labelling using AprilTags as a pre-processing step for machine learning
(2019)
Data labelling is a pre-processing step to prepare data for machine learning. There are many ways to collect and prepare such data, but they usually involve considerable effort. This paper presents an approach to semi-automated image data labelling using AprilTags. The AprilTags attached to the object, each containing a unique ID, make it possible to link the object surfaces to a particular class. This approach is implemented and used to label data of a stackable box.
The data is evaluated by training a You Only Look Once (YOLO) net, with a subsequent evaluation of the detection results. These results show that the semi-automatically collected and labelled data can certainly be used for machine learning. However, if concise features of an object surface are covered by the AprilTag, there is a risk that the affected class will not be recognized. It can be assumed that the labelled data can be used not only for YOLO but also for other machine learning approaches.
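The tag-to-class linking can be sketched as follows; the ID mapping, bounding boxes, and class names are hypothetical, and only the normalized YOLO annotation layout (class index plus centre/size fractions) is taken as given:

```python
# Hypothetical mapping from AprilTag IDs to object-surface classes
TAG_TO_CLASS = {0: "box_front", 1: "box_side", 2: "box_top"}
CLASS_INDEX = {name: i for i, name in enumerate(TAG_TO_CLASS.values())}

def yolo_label(tag_id, bbox, img_w, img_h):
    """Convert a detected tag's object bounding box (x, y, w, h in pixels)
    into one YOLO annotation line: 'class cx cy w h', all normalized."""
    x, y, w, h = bbox
    cx, cy = (x + w / 2) / img_w, (y + h / 2) / img_h
    cls = CLASS_INDEX[TAG_TO_CLASS[tag_id]]
    return f"{cls} {cx:.6f} {cy:.6f} {w / img_w:.6f} {h / img_h:.6f}"

line = yolo_label(1, bbox=(100, 200, 300, 400), img_w=640, img_h=480)
```

In the semi-automated pipeline, the tag detector would supply the tag ID and the object's bounding box per frame, so thousands of annotation lines can be generated without manual labelling.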
The paper describes a new stimulus using learning factories and an academic research programme - an M.Sc. in Digital Industrial Management and Engineering (DIME) comprising a double degree - to enhance international collaboration between four partner universities. The programme will be structured in such a way as to maintain or improve the level of innovation at the learning factories of each partner. The partners agreed to use learning factory focus areas along with DIME learning modules to stimulate international collaboration. Furthermore, they identified several research areas within the framework of the DIME programme to encourage horizontal and vertical collaboration. Vertical collaboration connects faculty expertise across the learning factory network to advance knowledge in one of the focus areas, while horizontal collaboration connects knowledge and expertise across multiple focus areas. Together they offer a platform for students to develop the disciplinary and cross-disciplinary applied research skills necessary for addressing the complex challenges faced by industry. Hence, the university partners have the opportunity to develop the learning factory capabilities in alignment with the smart manufacturing concept. The learning factory is thus an important pillar in this venture. While postgraduate students/researchers in the DIME programme are the enablers who ensure the success of entire projects, the learning factory provides a learning environment which is entirely conducive to fostering these successful collaborations. Ultimately, the partners are focussed on utilising smart technologies in line with the digitalization of the production process.
Indicators of disruption potentials - analysis of the blockchain technology’s potential impact
(2019)
The goal of this paper was to answer the question whether blockchain has the potential to become a disruption according to Clayton Christensen’s disruption theory. Therefore, the theory and the five characteristics that define the process of disruption were outlined in the first part of the paper. That and the following explanation of the blockchain technology served as the basis for the analysis and evaluation in chapters four to seven. For the analysis, three applications of the DLT, namely payment methods, intermediaries, as well as data storage and transfer, were considered. The fulfillment of the five characteristics of disruption was assessed using an example for each of the three applications.
Additionally, the paper might serve as a basis for future research on the topic, once the technology develops further, since it is generally hard to tell whether the fourth and fifth characteristics are fulfilled by blockchain at this point. Therefore, the results of the paper also back criticism of Christensen’s theory regarding its usefulness for predictions.
This paper suggests that, in the financial services industry, too, the impact of blockchain will be significant. However, given the manifold services that make up the industry, it cannot be concluded in general whether the DLT will disrupt the industry. For example, in services related to payment methods, blockchain is unlikely to follow a disruptive pattern, despite the recent hype surrounding blockchain-based cryptocurrencies. Regarding data storage and transfer, however, the technology may well follow a disruptive pattern in the financial services industry, just as blockchain solutions have been doing in the healthcare industry.
After defining podcasts and outlining their history, this book maps out the role of podcasts in a communication strategy. Podcast production, podcast types, podcast structures, and podcast advertising are explained, and podcast audiences as well as podcasts in the fashion industry are introduced.
In a thorough explorative analysis, the podcast offering of the fashion sector was surveyed. A detailed analysis of selected podcasts with evaluation and conclusions, including a discussion of the future use of podcasts, closes the book.
This paper discusses the optimal control problem of increasing the energy efficiency of induction machines in dynamic operation, including the field-weakening regime. In an offline procedure, optimal current and flux trajectories are determined such that the copper losses are minimized during transient operations. These trajectories are useful for a subsequent online implementation.
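The idea of loss-minimizing trajectories can be illustrated on a strongly simplified steady-state model (a sketch under assumed parameters, not the paper's actual optimization): with copper losses P = R(i_d² + i_q²) and torque proportional to i_d·i_q, the losses for a given torque are minimized when both current components are equal.

```python
import numpy as np

# Simplified steady-state machine model (illustrative parameters, NOT from
# the paper): copper losses P = R*(i_d^2 + i_q^2), torque T = k*i_d*i_q.
R = 0.5       # stator resistance [ohm] (assumed)
k = 1.2       # torque constant (assumed)
T_ref = 10.0  # demanded torque [Nm] (assumed)

# Sweep the d-current and compute the q-current required to reach T_ref.
i_d = np.linspace(0.5, 10.0, 1000)
i_q = T_ref / (k * i_d)
P_cu = R * (i_d**2 + i_q**2)

i_d_opt = i_d[np.argmin(P_cu)]
# Analytically, the minimum lies at i_d = i_q = sqrt(T_ref / k).
print(i_d_opt, (T_ref / k) ** 0.5)
```

The sweep recovers the analytical optimum; the paper's contribution is the dynamic (transient) extension of this kind of trade-off, which requires solving an optimal control problem rather than a pointwise minimization.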
Urban platforms are essential for smart and sustainable city planning and operation. Today they are mostly designed to handle and connect large urban data sets from very different domains. Modelling and optimisation functionalities are usually not part of a city's software infrastructure. However, they are considered crucial for developing transformation scenarios and for optimised smart city operation. This work discusses software architecture concepts for such urban platforms and presents case study results on building sector modelling, including urban data analysis and visualisation. Results from a case study in New York demonstrate the implementation status.
Learning to translate between real world and simulated 3D sensors while transferring task models
(2019)
Learning-based vision tasks are usually specialized on the sensor technology for which data has been labeled. The knowledge of a learned model is simply useless when it comes to data that differs from the data on which the model was initially trained, or when the model should be applied to a totally different imaging or sensor source. New labeled data has to be acquired on which a new model can be trained. Depending on the sensor, this becomes even more complicated when the sensor data is abstract and hard for humans to interpret and label. Enabling the reuse of models trained for a specific task across different sensors minimizes the data acquisition effort. Therefore, this work focuses on learning sensor models and translating between them, thus aiming for sensor interoperability. We show that even for the complex task of human pose estimation from 3D depth data recorded with different sensors, i.e. a simulated and a Kinect 2TM depth sensor, accuracy can greatly improve by translating between sensor models without modifying the original task model. This process especially benefits sensors and applications for which labels and models are difficult, if at all possible, to retrieve from raw sensor data.
In a time of upheaval and digitalization, new business models play an important role for companies. Decentralized power generation and energy efficiency targets to achieve climate goals and reduce global warming are currently forcing energy companies to develop new business models. In recent years, many methods of business model development have been introduced to create new business ideas. But what are the obstacles to implementing these business models in the energy sector, and what challenges do companies face in this respect? To answer these questions, a systematic literature review was conducted. As a result, eight categories were identified which summarise the main barriers to the implementation of new business models in the energy domain.
The energy turnaround, digitalization, and decreasing revenues force enterprises in the energy domain to develop new business models. Business models for renewable energy are built on a different logic than business models for large-scale power plants. Following a design science research approach, we first examined the business models of three enterprises in the energy domain. We identified that these business models result in complex ecosystems with multiple actors and difficult relationships between them. One cause is the fast-changing and complicated state regulation in Germany. To address this problem, we captured the requirements together with the enterprise partners in a second phase. We then developed the prototype Business Model Configurator (BMConfig), based on the e3Value ontology, on the metamodelling platform ADOxx. We demonstrate the feasibility of our approach with the business model of an energy efficiency service based on smart meter data.
The purpose of this research paper is to illuminate the subject of assortment policy in the German fashion e‐commerce market. A short literature review is conducted in order to set up a system of characteristics for contemplating assortments on a strategic level. In a second step, structured observations are conducted to quantitatively analyze and compare the assortments of the leading online fashion retailers within Germany. Based on the literature, the following characteristics for a classification of assortments can be identified: assortment structure, assortment size, assortment width, assortment depth, assortment consistency and rotation, price level, quality mix, fashion degree, as well as the mix of private labels and manufacturer brands. Furthermore, the results of the empirical analysis show that there are currently five leaders within the analyzed market: Amazon, Otto, Zalando, Baur and About You. Among these five market leaders, Amazon positions itself as a retailer that not only offers an enormous assortment size, but also the lowest entry prices as well as the broadest price dispersion. Through the development of the system of characteristics for assortment analysis and the examination of the current market environment, the findings of this paper contribute to the current state of the art in both theoretical and practical aspects.
To remain competitive in a fast changing environment, many companies have started to migrate their legacy applications towards a Microservices architecture. Such extensive migration processes require careful planning and consideration of implications and challenges alike. In this regard, hands-on experiences from industry practice are still rare. To fill this gap in the scientific literature, we contribute a qualitative study on intentions, strategies, and challenges in the context of migrations to Microservices. We investigated the migration process of 14 systems across different domains and sizes by conducting 16 in-depth interviews with software professionals from 10 companies. Along with a summary of the most important findings, we present a separate discussion of each case. As primary migration drivers, maintainability and scalability were identified. Due to the high complexity of their legacy systems, most companies preferred a rewrite using current technologies over splitting up existing code bases. This was often caused by the absence of a suitable decomposition approach. As such, finding the right service cut was a major technical challenge, next to building the necessary expertise with new technologies. Organizational challenges were especially related to large, traditional companies that simultaneously established agile processes. Initiating a mindset change and ensuring smooth collaboration between teams were crucial for them. Future research on the evolution of software systems can in particular profit from the individual cases presented.
While the recently emerged microservices architectural style is widely discussed in literature, it is difficult to find clear guidance on the process of refactoring legacy applications. The importance of the topic is underpinned by high costs and effort of a refactoring process which has several other implications, e.g. overall processes (DevOps) and team structure. Software architects facing this challenge are in need of selecting an appropriate strategy and refactoring technique. One of the most discussed aspects in this context is finding the right service granularity to fully leverage the advantages of a microservices architecture. This study first discusses the notion of architectural refactoring and subsequently compares 10 existing refactoring approaches recently proposed in academic literature. The approaches are classified by the underlying decomposition technique and visually presented in the form of a decision guide for quick reference. The review yielded a variety of strategies to break down a monolithic application into independent services. With one exception, most approaches are only applicable under certain conditions. Further concerns are the significant amount of input data some approaches require as well as limited or prototypical tool support.
This document presents an algorithm for non-obtrusive recognition of Sleep/Wake states using signals derived from ECG, respiration, and body movement captured while lying in a bed. Multinomial logistic regression techniques were chosen as the mathematical core of the system's data analytics. Derived parameters of the three signals are used as the input for the proposed method. The overall achieved accuracy is 84% for Wake/Sleep stages, with a Cohen's kappa of 0.46. The presented algorithm should support experts in analyzing sleep quality in more detail. The results confirm the potential of this method and disclose several ways for its improvement.
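The classification core described above can be sketched as follows. This is a minimal illustration only: it trains a logistic regression (the two-class case of the multinomial model) on synthetic per-epoch features standing in for the derived ECG, respiration, and movement parameters; all feature names and numbers are assumptions, not the paper's data.

```python
import numpy as np

# Synthetic epoch features (assumed, for illustration): Wake epochs have
# higher heart rate and movement than Sleep epochs.
rng = np.random.default_rng(0)
n = 200
wake  = np.column_stack([rng.normal(75, 5, n),      # heart rate [bpm]
                         rng.normal(16, 2, n),      # respiration rate [1/min]
                         rng.normal(0.6, 0.2, n)])  # movement index
sleep = np.column_stack([rng.normal(58, 5, n),
                         rng.normal(13, 2, n),
                         rng.normal(0.1, 0.1, n)])
X = np.vstack([wake, sleep])
X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize features
X = np.column_stack([np.ones(len(X)), X])  # prepend bias term
y = np.array([1.0] * n + [0.0] * n)        # 1 = Wake, 0 = Sleep

# Fit logistic regression by plain gradient descent on the log-loss.
w = np.zeros(X.shape[1])
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-X @ w))       # predicted P(Wake)
    w -= 0.1 * X.T @ (p - y) / len(y)

acc = np.mean(((X @ w) > 0) == (y == 1))
print(f"training accuracy: {acc:.2f}")
```

In the full multinomial case (more than two sleep stages), the sigmoid is replaced by a softmax over one weight vector per class; the fitting procedure is otherwise analogous.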
Many start-ups are in search of cooperation partners to develop their innovative business models. In response, incumbent firms are introducing increasingly more cooperation systems to engage with start-ups. However, many of these cooperations end in failure. Although qualitative studies on cooperation models have tried to improve the effectiveness of incumbent start-up strategies, only a few have empirically examined start-up cooperation behavior. Considering the lack of adequate measurement models in current research, this paper focuses on developing a multi-item scale on cooperation behavior of start-ups, drawing from a series of qualitative and quantitative studies. The resultant scale contributes to recent research on start-up cooperation and provides a framework to add an empirical perspective to current research.
Purpose – Many start-ups are in search of cooperation partners to develop their innovative business models. In response, incumbent firms are introducing increasingly more cooperation systems to engage with start-ups. However, many of these cooperations end in failure. Although qualitative studies on cooperation models have tried to improve the effectiveness of incumbent start-up strategies, only a few have empirically examined start-up cooperation behavior. The paper aims to discuss these issues.
Design/methodology/approach – The scale is developed in a series of qualitative and quantitative studies. The scale dimensions are identified in an interview-based qualitative study. Subsequent workshops and questionnaire-based studies identify factors and rank them. These ranked factors are then used to build a measurement scale that is integrated in a standardized online questionnaire addressing start-ups. The gathered data are then analyzed using PLS-SEM.
Findings – The research was able to build a multi-item scale for start-ups' cooperation behavior that can be used in future research. The paper also provides a causal analysis of the impact of cooperation behavior on start-up performance. The research finds that the identified dimensions are suitable for measuring cooperation behavior, and that cooperation behavior has a minor positive effect on start-up performance.
Originality/value – The research fills the gap in empirical research on the cooperation between start-ups and established firms. Moreover, most past studies focus on organizational structures and their performance when addressing these cooperations. Although past studies identified start-up behavior as a relevant factor, no empirical research has been conducted on the topic yet.
Design thinking is inherently and invariably oriented towards the future in that all design is for products, services or events that will exist in the future, and be used by people in the future. This creates an overlap between the domains of design thinking and strategic foresight. A small but significant literature has grown up in the strategic foresight field as to how design thinking may be used to improve its processes. This paper considers the other side of the relationship: how methods from the strategic foresight field may advance design thinking, improving insight into the needs and preferences of users of tomorrow, including how contextual change may suddenly and fundamentally reshape these. A side-by-side comparison of representative models from each field is presented, and it is shown how these may be assembled together to create foresight-informed design-based innovation.
A new two-dimensional fluorescence sensor system was developed for in-line monitoring of mammalian cell cultures. Fluorescence spectroscopy allows for the detection and quantification of naturally occurring intra- and extracellular fluorophores in the cell broth. The fluorescence signals correlate with the cells' current redox state and other relevant process parameters. Cell culture pretests with twelve different excitation wavelengths showed that only three wavelengths account for the vast majority of spectral variation. Accordingly, the newly developed device utilizes three high-power LEDs as excitation sources in combination with a back-thinned CCD spectrometer for fluorescence detection.
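The reduction from twelve candidate excitation wavelengths to three dominant ones is the kind of finding a principal component analysis of the pretest spectra would yield. The sketch below illustrates this on synthetic data whose variation is driven by three latent factors; the dimensions and numbers are assumptions, not the paper's measurements.

```python
import numpy as np

# Synthetic pretest data: 100 spectra over 12 excitation wavelengths,
# constructed so that 3 latent factors drive almost all variation.
rng = np.random.default_rng(1)
n_samples, n_excitation = 100, 12
latent = rng.normal(size=(n_samples, 3))
loadings = rng.normal(size=(3, n_excitation))
spectra = latent @ loadings + 0.01 * rng.normal(size=(n_samples, n_excitation))

# PCA via SVD of the mean-centred data matrix.
centred = spectra - spectra.mean(axis=0)
_, s, _ = np.linalg.svd(centred, full_matrices=False)
explained = s**2 / np.sum(s**2)          # variance explained per component
print("variance explained by first 3 components:", explained[:3].sum())
```

When the first few components explain nearly all variance, the remaining excitation wavelengths add little information, which motivates a hardware design with only three LED excitation sources.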
Indoor localization systems are becoming more and more important with the digitalization of the industrial sector. Sensor data such as the current position of machines, transport vehicles, goods or tools represent an essential component of cyber-physical production systems (CPPS). However, due to the high costs of these sensors, they are not widespread and are used mainly in special scenarios. Optical indoor positioning systems (OIPS) based on cameras, in particular, have certain advantages due to their technological specifications. In this paper, the application scenarios and requirements as well as their characteristics are presented, and a classification approach for OIPS is introduced.
Fatigue and drowsiness are responsible for a significant percentage of road traffic accidents. There are several approaches to monitoring driver drowsiness, ranging from the driver's steering behavior to the analysis of the driver, e.g. eye tracking, blinking, yawning, or electrocardiogram (ECG). This paper describes the development of a low-cost ECG sensor to derive heart rate variability (HRV) data for drowsiness detection. The work includes hardware and software design. The hardware was implemented on a printed circuit board (PCB) designed so that the board can be used as an extension shield for an Arduino. The PCB contains a double, inverted ECG channel including low-pass filtering and provides two analog outputs to the Arduino, which combines them and performs the analog-to-digital conversion. The digital ECG signal is transferred to an NVidia embedded PC where the processing takes place, including QRS-complex, heart rate, and HRV detection as well as visualization features. The resulting compact sensor provides good results in extracting the main ECG parameters. The sensor is being used in a larger framework, where facial-recognition-based drowsiness detection is combined with ECG-based detection to improve the recognition rate under unfavorable light or occlusion conditions.
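Once QRS complexes are detected, the heart rate and standard HRV parameters follow directly from the R-to-R intervals. A minimal sketch of that final step, with made-up RR values for illustration (the paper does not specify which HRV metrics its pipeline computes; SDNN and RMSSD are common choices):

```python
import numpy as np

# RR intervals between successive detected R-peaks, in milliseconds
# (illustrative values, not measured data).
rr_ms = np.array([812, 798, 805, 790, 820, 808, 795, 802])

heart_rate = 60000.0 / rr_ms.mean()           # mean heart rate [bpm]
sdnn = rr_ms.std(ddof=1)                      # SDNN: overall variability [ms]
rmssd = np.sqrt(np.mean(np.diff(rr_ms)**2))   # RMSSD: short-term variability [ms]

print(f"HR {heart_rate:.1f} bpm, SDNN {sdnn:.1f} ms, RMSSD {rmssd:.1f} ms")
```

Lowered short-term variability over a sliding window is one of the HRV cues typically associated with increasing drowsiness, which is why such features are useful inputs for the detection stage.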
It is expected that ongoing digitalisation will drive the merger between the manufacturing world and the internet world, possibly leading to a next industrial revolution, currently called “Industry 4.0”. The driving forces behind this development are new business opportunities and competition advantages arising from mass production customisation as well as rapid individual product development and manufacturing. Key factors of the development towards Industry 4.0 are discussed, along with the threats and opportunities these developments pose for future production. Actual examples from real-time customized manufacturing of consumer products are given. As mechatronic systems and industrial robots are widely used in manufacturing, and in particular in assembly, it is discussed how they can be connected to and used in digitalised industrial systems. Different examples of remote-controlled systems are presented, such as a remote-controlled KUKA robot for handling and quality control, PLC-controlled equipment, drive systems, a FESTO handling system and others. The architecture of an assembly cell is presented, where industrial robots are set up for batch-one production or can directly receive control / production information on-line and in real time over the factory network. Methods for remote maintenance and monitoring of systems over the internet as well as production operator support over the internet are presented.
The use of gamification in workplace learning to encourage employee motivation and engagement
(2019)
When we think about playing a game, be it a card game, board game, sport, or video game, we generally associate the act of playing with a positive experience like having fun, enjoying the interaction with others, or feeling a greater motivation to reach a certain goal. By contrast, workplace learning is often perceived as being dull. Employees are likely at some point in their career to find themselves stuck in a rigidly defined seminar for a long period of time or in front of their computer navigating through a mandatory e-learning course on a dry topic such as standards of business conduct or safety policies.
In recent years, organizations have tried to leverage the motivating quality of games for more serious learning contexts. Gamification entails transferring those elements and principles from games to non-gaming contexts that improve user experience and engagement. In this chapter, we specifically focus on the context of workplace learning.
Digital light microscopy techniques are among the most widely used methods in cell biology and medical research. Despite that, the automated classification of objects such as cells or specific parts of tissues in images is difficult. We present an approach to classify confluent cell layers in microscopy images by learned deep correlation features using deep neural networks. These deep correlation features are generated through the use of gram-based correlation features and are input to a neural network that learns the correlations between them. In this work we wanted to determine whether such a representation of cell data is suitable for classification, as has been done for artworks with respect to their artistic period. The method generates images that contain recognizable characteristics of a specific cell type, for example, the average size and the ordered pattern.
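The gram-based correlation features mentioned above can be sketched as follows: for a convolutional feature map of shape (channels, height, width), the Gram matrix records which channel activations co-occur, discarding the spatial layout. This is the same construction used in artistic-style work; the dimensions below are toy assumptions, not the network used in the paper.

```python
import numpy as np

# Toy stand-in for one conv-layer activation: 64 channels of 32x32 maps.
rng = np.random.default_rng(2)
features = rng.normal(size=(64, 32, 32))

C, H, W = features.shape
F = features.reshape(C, H * W)     # flatten spatial dimensions per channel
gram = F @ F.T / (H * W)           # (C, C) channel-correlation matrix

print(gram.shape)                  # symmetric, spatial layout discarded
```

Because the Gram matrix is invariant to where in the image a texture appears, it captures texture-like statistics (such as average cell size and ordering) rather than the position of individual cells, which is what makes it suitable for classifying confluent cell layers.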
RoPose-Real: real world dataset acquisition for data-driven industrial robot arm pose estimation
(2019)
It is necessary to employ smart sensory systems in dynamic and mobile workspaces where industrial robots are mounted on mobile platforms. Such systems should be aware of flexible and non-stationary workspaces and able to react autonomously to changing situations. Building upon our previously presented RoPose system, which employs a convolutional neural network architecture trained on purely synthetic data to estimate the kinematic chain of an industrial robot arm, we now present RoPose-Real. RoPose-Real extends the prior system with a comfortable and targetless extrinsic calibration tool that allows for the production of automatically annotated datasets for real robot systems. Furthermore, we use the novel datasets to train the estimation network with real-world data. The extracted pose information is used to automatically estimate the pose of the observing sensor relative to the robot system. Finally, we evaluate the performance of the presented subsystems in a real-world robotic scenario.
Based on a survey among customers of seven German municipal utilities, we estimate hierarchical multiple regression models to identify consumer motivations for participating in P2P electricity trading and develop implications for marketing strategies for this currently relatively unknown product. Our results show a low importance of socio-demographics in explaining differences between consumer groups, but a high influence of attitudes, knowledge, and the likelihood of purchasing related products. The most valuable target groups, at which municipal utilities' marketing strategies for P2P electricity trading should first and foremost aim, are innovators, especially prosumers. They are well informed about and open-minded concerning electricity sharing and highly environmentally aware. They ask for transparency and are willing to purchase related products. They are attracted by the ability to share generation and consumption and, to a lesser extent, by economic reasons. Our results indicate that marketing efforts should take peer effects into account to a special degree, as these are found to wield great influence on general openness towards and purchase intention for P2P electricity products. Finally, municipal utilities should build on consumers' high level of satisfaction and trust and use P2P electricity trading as a measure to keep customers and win those willing to change their supplier.
Improved inductive feed-forward for fast turn-on of power semiconductors during hard switching
(2019)
A transformer is used to increase the gate voltage during turn-on, thus reducing the necessary bias voltage of the gate driver. By counteracting the voltage dependency of the gate capacitance of high-voltage power devices, faster transitions are possible. The additional transformer only slightly increases the over-voltage during turn-off.