The purpose of this paper is to study the impact of transparency on the political budget cycle (PBC) over time and across countries. So far, the literature on electoral cycles finds evidence that cycles depend on the stage of an economy. However, the author shows, for the first time, that the budget cycle also depends on transparency. The author uses a new data set consisting of 99 developing and 34 Organisation for Economic Co-operation and Development (OECD) countries. First, the author develops a model and demonstrates that transparency mitigates political cycles. Second, the author confirms this proposition through an econometric assessment. Using time series data from 1970 to 2014, the author finds smaller cycles in countries with higher transparency, especially G8 countries.
This study examines the relevance of integrated reporting quality (IRQ) to capital markets. We investigate whether IRQ benefits capital market participants by improving a firm's information environment, using analyst earnings forecast accuracy as a proxy. Our study focuses specifically on companies that publish integrated reports on a voluntary basis. Based on a scoring model, we assess IRQ and its effects with data from 2015 to 2019 of 101 companies. The results indicate no significant relationship between IRQ and analyst earnings forecast accuracy. Thus, IRQ does not appear to improve a firm's information environment, at least not currently in a voluntary setting. Drawing on previous literature in the field, this study further concludes that integrated reporting (IR) in general has not yet reached its full potential in benefitting capital markets. Potential implications of our results are that the standard setters should work to improve the specificity and rigor of their guidelines, and analysts should become more involved in developing IR guidelines to make them more relevant to their information needs. IR seems to unfold its benefits better in mandatory settings, which could call for regulators to make IR mandatory.
This paper studies the impact of governmental transparency on the political business cycle. The literature on electoral cycles finds evidence that cycles depend on the stage of the economy. However, we show that the cycle also depends on transparency. We use data for G7 countries and compare them with less developed OECD countries. Our theory states that transparency reduces political cycles through peer pressure and the voting out of incumbents. We confirm the theory with an econometric assessment of 34 countries from 1970 to 2012. We find smaller cycles in countries with higher transparency, especially in G7 countries.
The halo effect is a cognitive bias known from social psychology. A halo effect occurs when a global impression or information about a salient characteristic shapes the evaluation of other characteristics. In a sports-related context, the halo effect has hardly been researched so far, although such research could contribute significantly to understanding the thinking and behavior of sports fans. In this research paper, the following questions are investigated: Is there a halo effect in soccer? Does the sporting success or failure of a club outshine other sporting aspects? Does sporting success or failure possibly even distort fans' perception of non-sporting aspects? The research paper reflects the current state of halo research and presents the results of an empirical study in which fans of soccer clubs from the German Bundesliga were interviewed. The results of the analyses substantiate that the sporting success or failure of fans' favorite club distorts their perception of a very diverse range of aspects.
Context
Web APIs are one of the most used ways to expose application functionality on the Web, and their understandability is important for efficiently using the provided resources. While many API design rules exist, empirical evidence for the effectiveness of most rules is lacking.
Objective
We therefore wanted to study (1) the impact of RESTful API design rules on understandability, (2) whether rule violations are also perceived as more difficult to understand, and (3) whether demographic attributes like REST-related experience have an influence on this.
Method
We conducted a controlled Web-based experiment with 105 participants, from both industry and academia and with different levels of experience. Based on a hybrid between a crossover and a between-subjects design, we studied 12 design rules using API snippets in two complementary versions: one that adhered to a rule and one that was a violation of this rule. Participants answered comprehension questions and rated the perceived difficulty.
Results
For 11 of the 12 rules, we found that the violation variant performed significantly worse than the rule-adherent variant in the comprehension tasks. Regarding the subjective ratings, we found significant differences for 9 of the 12 rules, meaning that most violations were also subjectively rated as more difficult to understand. Demographics played no role in comprehension performance for the violation variants.
Conclusions
Our results provide first empirical evidence for the importance of following design rules to improve the understandability of Web APIs, which is important for researchers, practitioners, and educators.
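To make the rule-versus-violation contrast concrete, the following sketch checks two common kinds of REST design rules; the specific rules and checks are illustrative assumptions, not the experiment's actual instrument:

```python
# Illustrative sketch of two kinds of REST URI design rules:
# "no CRUD verbs in URIs" and "lowercase letters in paths".
# The rule selection and checks are assumptions for illustration.
import re

def check_uri(uri: str) -> list[str]:
    """Return the names of the rules a given endpoint path violates."""
    violations = []
    if re.search(r"/(get|create|update|delete)[A-Za-z]*", uri, re.IGNORECASE):
        violations.append("verb in URI")          # rule: use nouns, not CRUD verbs
    if uri != uri.lower():
        violations.append("uppercase in path")    # rule: lowercase letters in paths
    return violations

print(check_uri("/customers/42/orders"))    # adheres to both rules -> []
print(check_uri("/GetCustomerOrders/42"))   # violates both rules
```

In the study's terms, the first path would be a rule-adherent snippet and the second its violation counterpart.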
This article explores the determinants of people’s growth prospects in survey data as well as the impact of the European recovery fund on future growth. The focus is on the aftermath of the Corona pandemic, which places a natural limit on the sample size. We use Eurobarometer survey data and macroeconomic variables, such as GDP, unemployment, public deficit, inflation, bond yields, and fiscal spending data. We estimate a variety of panel regression models and develop a new simulation-regression methodology to cope with the limited sample size. We find that the major determinant of people’s growth prospects is domestic GDP per capita, while European fiscal aid does not matter significantly. In addition, the simulation-regression method yields novel scientific insights, significant outcomes, and a policy conclusion.
Mystery shopping (MS) is a widely used tool to monitor the quality of service and personal selling. In consultative retail settings, assessments of mystery shoppers are supposed to capture the most relevant aspects of salespeople’s service and sales behavior. Given the important conclusions drawn by managers from MS results, the standard assumption seems to be that assessments of mystery shoppers are strongly related to customer satisfaction and sales performance. However, surprisingly scant empirical evidence supports this assumption. We test the relationship between MS assessments and customer evaluations and sales performance with large-scale data from three service retail chains. Surprisingly, we do not find a substantial correlation. The results show that mystery shoppers are not good proxies for real customers. While MS assessments are not related to sales, our findings confirm the established correlation between customer satisfaction measurements and sales results.
Do Chinese subordinates trust their German supervisors? A model of inter-cultural trust development
(2023)
In this qualitative study based on 95 interviews with Chinese subordinates and their German supervisors, we inductively develop a model which advances theoretical understanding by showing how inter-cultural trust development in hierarchical relationships is the result of six distinct elements: the subordinate trustor’s cultural profile (cosmopolitans, hybrids, culturally bounds), the psychological mechanisms operating within the trustor (role expectations and cultural accommodation), and contextual moderators (e.g., country context, time spent in foreign culture, and third-party influencers), which together influence the trust forms (e.g., presumptive trust, relational trust) and trust dynamics (e.g., trust breakdown and repair) within relationship phases over time (initial contact, trust continuation, trust disillusionment, separation, and acculturation). Our findings challenge the assumption that cultural differences result in low levels of initial trust and highlight the strong role the subordinate’s cultural profile can have on the dynamics and trajectory of trust in hierarchical relationships. Our model highlights that inter-cultural trust development operates as a variform universal, following the combined universalistic-particularistic paradigm in cross-cultural management, with both culturally generalizable etic dynamics as well as culturally specific emic manifestations.
Our paper investigates the response of acquiring firms’ stock returns around the announcement date in cross-border mergers and acquisitions (M&A) between listed Chinese acquirers and German targets. We apply an event study methodology to examine the shareholder value effect based on a sample of M&A deals over the most recent period of 2012-2018. We apply a market model event study based on the argumentation of Brown and Warner (1985) and use short-term observation periods according to Andrade, Mitchell, and Stafford (2001) as well as Hackbarth and Morellec (2008). The results indicate that the announcement of M&A involving German targets results in a positive cumulative abnormal return of 2.18% on average for Chinese acquirers’ shareholders in a five-day symmetric event window. Furthermore, we found slight indications of possible information leakage prior to the formal announcement. Although the size of acquiring firms is not necessarily correlated with the positive abnormal returns in the short run, this study suggests that Chinese acquirers’ shareholders gain higher abnormal returns when the German targets are non-listed companies.
This article presents a two-level optimisation approach for the management of controllable and distributed converters with storage systems across different energy sectors. It aims at the reduction of electrical peak load and at the economic optimisation of the electrical energy exchange with the grid, based on a dynamic external incentive, e.g. through dynamic energy price tariffs. By means of secure, standardised and lean communication with two different internal price signals, an optimal flexibility provision is to be achieved. The two-level optimisation approach consists of a centralised entity and several distributed decentralised entities. At the centralised level, the distributed flexibilities are invoked for optimal scheduling on the basis of an internal price algorithm that stimulates the decentralised entities. Based on that internal incentive and on the expected demands for electricity, heating and cooling, the decentralised optimisation algorithms provide optimal generation schedules for the energy converters. The suggested interaction between the centralised and decentralised entities was successfully tested, and the principal potential for peak shaving and the adaptation to dynamic energy-related market prices could be demonstrated and compared to different energy management strategies such as the standard heat-led operation. Further, variations of the system parameters, such as load shifting potential, installed capacity and system diversification, are evaluated against the cost saving potential for the energy supply and overall system performance.
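The interaction of an internal price signal with a decentralised scheduling decision can be sketched in a few lines; the prices, the amount of shiftable energy, and the greedy scheduling rule below are made-up illustrative assumptions, not the paper's algorithm:

```python
# Toy sketch of the two-level idea: a central entity broadcasts an
# internal price signal, and a decentralised unit schedules a shiftable
# load into the cheapest hours. All numbers are illustrative.
prices = [0.30, 0.28, 0.12, 0.10, 0.11, 0.25]  # internal price signal per hour
shiftable_kwh = 3.0                             # flexible energy that must run sometime
hours_needed = 3                                # hours the flexible load occupies

# Decentralised optimisation: pick the cheapest hours for the flexible load.
cheapest = sorted(range(len(prices)), key=lambda h: prices[h])[:hours_needed]
schedule = [shiftable_kwh / hours_needed if h in cheapest else 0.0
            for h in range(len(prices))]

# Cost of the resulting schedule under the broadcast price signal.
cost = sum(p * e for p, e in zip(prices, schedule))
print(sorted(cheapest), round(cost, 3))
```

By shaping the price signal, the central level can steer many such units towards peak shaving without dictating their individual schedules.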
Software development consists to a large extent of human-based processes with continuously increasing demands regarding interdisciplinary team work. Understanding the dynamics of software teams can be seen as highly important to successful project execution. Hence, for future project managers, knowledge about non-technical processes in teams is significant. In this paper, we present a course unit that provides an environment in which students can learn and experience the role of different communication patterns in distributed agile software development. In particular, students gain awareness about the importance of communication by experiencing the impact of limitations of communication channels and the effects on collaboration and team performance. The course unit presented uses a controlled experiment to provide the basic organization of a small software project carried out in virtual teams. We provide a detailed design of the course unit to allow for implementation in further courses. Furthermore, we report experiences obtained from implementing this course unit with 16 graduate students. We observed students struggling with technical aspects and team coordination in general, while not realizing the importance of communication channels (or their absence). Furthermore, we could show the students that a lack of communication protocols impacts team coordination and performance regardless of the communication channels used.
Distributed Ledger Technologies for the energy sector: facilitating interoperability analysis
(2023)
The use of distributed data storage and management structures, such as Distributed Ledger Technologies (DLT), in the energy sector has gained great interest in recent times. This opens up new possibilities in, for example, microgrid management, aggregation of distributed resources, peer-to-peer trading, integration of electromobility, or proof-of-origin strategies. However, in order to benefit from those new possibilities, new challenges have to be overcome. This work focuses on one of these challenges: the need to ensure interoperability when integrating DLT-enabled devices in energy use cases. First, the use of DLTs in the energy sector is analyzed and the main use cases are presented. A classification of DLT-energy use cases is then proposed. Second, the need for a common reference architecture framework to analyze those use cases with a focus on interoperability is discussed, and the current activities in research and standardization in this field are presented. Finally, a new common reference architecture framework based on current activities in standardization is presented.
Cell-cell and cell-extracellular matrix (ECM) adhesion regulates fundamental cellular functions and is crucial for cell-material contact. Adhesion is influenced by many factors, like affinity and specificity of the receptor-ligand interaction or overall ligand concentration and density. To investigate molecular details of cell-ECM and cadherin (cell-cell) interactions in vascular cells, functional nanostructured surfaces were used. Ligand-functionalized gold nanoparticles (AuNPs) with 6-8 nm diameter are precisely immobilized on a surface and separated by non-adhesive regions so that individual integrins or cadherins can specifically interact with the ligands on the AuNPs. Using 40 nm and 90 nm distances between the AuNPs, functionalized either with peptide motifs of the extracellular matrix (RGD or REDV) or vascular endothelial cadherins (VEC), we investigated the influence of distance and ligand specificity on spreading and adhesion of endothelial cells (ECs) and smooth muscle cells (SMCs). We demonstrate that RGD-dependent adhesion of vascular cells is similar to that of other cell types and that the distance dependence for integrin binding to ECM peptides is also valid for the REDV motif. VEC ligands decrease adhesion significantly at the tested ligand distances. These results may be helpful for future improvements in vascular tissue engineering and for the development of implant surfaces.
Businesses need to cope with myriad challenges including increasingly competitive markets and rapid developments in digital technology. The overall aim of the research described in this paper is to generate fresh insights into the impacts of digitalisation on the design and management of global supply chains. It focuses on understanding the current adoption rate of new technologies in global supply chains, identifying perceived opportunities and challenges and clarifying the critical factors driving (and inhibiting) their deployment. The authors administered an online survey with a global sample of respondents from various supply chain functions, resulting in a sample of 142 responses. Significant differences emerged in adoption patterns between companies of different sizes. Moreover, the study pointed to a widening gap (or a ‘digital divide’) between leaders and laggards in terms of technology adoption. Perceived benefits and challenges also differ notably between companies of varying sizes. Adoption patterns are very diverse across specific technologies. The results further suggest that there is a significant correlation between adoption of digital technologies and different dimensions of company performance.
The field of breath analysis has developed to be of growing interest in medical diagnosis and patient monitoring. The main advantages are that it is noninvasive, painless, and repeatable in flexible cycles. Even though breath analysis has been researched for a couple of decades, there are still many unanswered questions. Human breath contains volatile organic compounds which are emitted from inside the body. Some of these compounds can be assigned to specific sources, such as inflammation or cancer, but also to non-health-related origins. This paper gives an overview of breath analysis for the purpose of disease diagnosis and health monitoring. To this end, literature regarding breath analysis in the medical field has been analyzed, from its early stages to the present. As a result, this paper gives an outline of the topic of breath analysis.
Enterprise Governance, Risk and Compliance (GRC) systems are key to managing risks threatening modern enterprises from many different angles. A key constituent of GRC systems is the definition of controls that are implemented on the different layers of an Enterprise Architecture (EA). Controls become part of a “concern” of the EA, which allows the use of an EA viewpoint to cover control compliance assessments. In this article we explore this relationship further, derive a metamodel linking controls and EA, and elicit how this linkage gives rise to a hierarchical understanding of the viewpoint concept for EAs. We complement these considerations with an expository instantiation in a cockpit for control compliance applied in an international enterprise in the insurance industry.
The critical process parameters cell density and viability during mammalian cell cultivation are assessed by UV/VIS spectroscopy in combination with multivariate data analytical methods. This direct optical detection technique uses a commercial optical probe to acquire spectra in a label-free way without signal enhancement. For the cultivation, an inverse cultivation protocol is applied, which simulates the exponential growth phase by exponentially replacing cells and metabolites of a growing Chinese hamster ovary cell batch with fresh medium. For the simulation of the death phase, a batch of growing cells is progressively replaced by a batch of completely starved cells. Thus, the most important parts of an industrial batch cultivation are easily imitated. The cell viability was determined by the well-established method of partial least squares regression (PLS). To further improve process knowledge, the viability has also been determined from the spectra based on a multivariate curve resolution (MCR) model. With this approach, the progress of the cultivations can be continuously monitored solely based on a UV/VIS sensor. Thus, the monitoring of critical process parameters, especially the viable cell density, is possible inline within a mammalian cell cultivation process. In addition, the beginning of cell death can be detected by this method, which allows us to determine the cell viability with acceptable error. The combination of inline UV/VIS spectroscopy with multivariate curve resolution generates additional process knowledge complementary to PLS and is considered a suitable process analytical tool for monitoring industrial cultivation processes.
Direct observation of structural heterogeneity and tautomerization of single hypericin molecules
(2021)
Tautomerization is a fundamental chemical reaction which involves the relocation of a proton in the reactants. Studying the optical properties of tautomeric species is challenging because of ensemble averaging. Many molecules, such as porphines, porphycenes, or phenanthroperylene quinones, exhibit a reorientation of the transition dipole moment (TDM) during tautomerization, which can be directly observed in single-molecule experiments. Here, we study single molecules of hypericin, a prominent phenanthroperylene quinone with antiviral, antidepressive, and photodynamic properties. Observing abrupt flipping of the image pattern, combined with time-dependent density functional theory calculations, allows drawing conclusions about the coexistence of four tautomers and their conversion path. This approach allows the unambiguous assignment of a TDM orientation to a specific tautomer and enables the determination of the chemical structure in situ. Our approach can be applied to other molecules showing TDM reorientation during tautomerization, helping to gain a deeper understanding of this important process.
Recent digital technologies like the Internet of Things and Augmented Reality have brought IT into companies’ core products. What were previously purely physical products are becoming hybrid or digitized. Despite receiving a lot of recent attention, digitized products have only seen a slow uptake in businesses so far. In this paper, we study the challenges that keep companies from realizing the desired impacts of digitized products and the practices they employ to address these challenges. To do so, we looked at companies from a set of industries that are highly affected by digital transformation, but at the same time hesitant to move to a more digitized world: the creative industries. Based on a literature review and twelve interviews in creative industries, we developed a conceptual model that can serve as a basis for formulating testable hypotheses for further research in this area.
Digitization in the energy sector is a necessity to enable energy savings and energy efficiency potentials. Managing decentralized corporate energy systems is hindered by the lack of established management methods. The required integration of energy objectives into business strategy creates difficulties, resulting in inefficient decisions. To improve this, practice-proven methods such as the Balanced Scorecard, Enterprise Architecture Management and the Value Network approach are transferred to the energy domain. The methods are evaluated based on a case study. Managing multi-dimensionality, high complexity and multiple actors are the main drivers for an effective and efficient energy management system. The underlying basis for gaining the positive impacts of these methods on decentralized corporate energy systems is the digitization of energy data and processes.
Digitization is more than using digital technologies to transfer data and perform computations and tasks. Digitization embraces disruptive effects of digital technologies on economy and society. To capture these effects, two perspectives are introduced: the product perspective and the value-creation perspective. In the product perspective, digitization enables the transition from material, static products to interactive and configurable services. In the value-creation perspective, digitization facilitates the transition from centralized, isolated models of value creation to bidirectional, co-creation-oriented approaches.
Logistics has undergone tremendous changes over the past few decades. Above all with the advent of the digital age, we have witnessed the significant impact of new technologies on supply chains in terms of business transformation, increased agility and performance. However, many businesses have yet to harness the full potential of these technologies to create further value (Bughin et al., 2017). High investment costs, fears for cyber security, a lack of expertise in the workforce and insufficient awareness of the concrete benefits of these technologies are just some of the factors hampering the decision to adopt digital technologies.
The following chapter draws on the findings of both recent quantitative and qualitative research conducted by practitioners and academics.
A holistic approach to digitization enables decision-makers to achieve new efficiency in corporate performance management. Digitalization improves the quality, validity and speed of information retrieval and processing. At present, most corporations are confronted with the problem of not being able to organize, categorize and visualize decision-relevant information. To meet the challenges of information management, the Management Cockpit provides an information center for managers. In accordance with the specific working environment of the executives, the Management Cockpit offers a quick and comprehensive overview of the company's situation. Today, the current situation of a company is no longer only influenced by internal factors, but also by its public image. Social media monitoring and analysis is therefore a crucial component for the external factors of successful management. Real-time monitoring of the emotions and behaviors of consumers and customers thus contributes to effective controlling of all business areas. Intelligent factories promise to collect data for internal factors, but the current reality in manufacturing looks different. Production often consists of a large number of different machines, with varying degrees of digitization and limited sensor data availability. In order to close this gap, we developed a compact sensor board with network components, which allows a flexible design with different sensors for a wide variety of applications. The sensor data enable decision-makers to adapt the supply chain based on their internal and external observations in the Management Cockpit. Due to the real-time and long-term monitoring and analytic possibilities, the Management Cockpit provides a multi-dimensional view of the company and supports a holistic Corporate Performance Management.
Digitalization and enterprise architecture management: a perspective on benefits and challenges
(2023)
Many companies digitally transform their business models, processes, and services. They have also been using Enterprise Architecture Management approaches for a long time to synchronize corporate strategy and information technology. Such digitalization projects bring different challenges for Enterprise Architecture Management. Without understanding and addressing them, Enterprise Architecture Management projects will fail or not deliver the expected value. Since existing research has not yet addressed these challenges, they were investigated based on a qualitative expert study with leading industry experts from Europe. Furthermore, potential benefits of digitalization projects for Enterprise Architecture Management were researched. Our results provide a theoretical framework consisting of five identified challenges, their triggers, and a number of benefits. Furthermore, we discuss in what ways digitalization and EAM are a promising topic for future research.
Digitization will require companies to fundamentally reengineer their sales processes. Adapting the concept of value selling to the digital age will enable them to deliver superior value to their customers. Specifically, social selling will provide them with an answer to the ever-increasing complexity of customer journeys. This article, based on a survey among 235 German companies, assesses the status quo and outlines opportunities. Moreover, it introduces a novel approach for developing well-grounded social selling metrics.
Digital twins: a meta-review on their conceptualization, application, and reference architecture
(2022)
The concept of digital twins (DTs) is receiving increasing attention in research and management practice. However, various facets around the concept are blurry, including conceptualization, application areas, and reference architectures for DTs. A review of preliminary results regarding the emerging research output on DTs is required to promote further research and implementation in organizations. To do so, this paper asks four research questions: (1) How is the concept of DTs defined? (2) Which application areas are relevant for the implementation of DTs? (3) How is a reference architecture for DTs conceptualized? and (4) Which directions are relevant for further research on DTs? With regard to research methods, we conduct a meta-review of 14 systematic literature reviews on DTs. The results yield important insights for the current state of conceptualization, application areas, reference architecture, and future research directions on DTs.
What does the factory of tomorrow have to offer for companies? This question and its aspects are the focus of many current articles and publications. According to Gartner, digital twins, one of the strategic technology trends for 2017, will play a big role in the future of manufacturing. At the moment, digital twins are gaining more importance for industrial applications. If companies want to be competitive in the future, they have to implement the digital twin in the factories of today. This paper therefore provides a basic overview of the concept of the smart factory and its requirements. In addition, digital twins are identified as a necessary concept for the evolution of the factory of today.
Digital twins deployed in production are important in practice and interesting for research. Currently, mostly structured data, e.g. from sensors and timestamps of related stations, are integrated into digital twins. However, semi- and unstructured data are also important to display the current status of a digital twin (e.g., of a machine or produced good). Process Mining and Text Mining in combination can be used to exploit log file data to understand the current state of the process as well as highlight issues. As a result, issue-related reactions can be taken more quickly and in a targeted, cost-oriented way. Applying a design science research approach, a prototype is developed as an artefact based on derived requirements. This prototype helps to understand and to clarify the possibilities of Process Mining and Text Mining based on log data for production-related digital twins. Contributions for practice and research are described. Furthermore, limitations of the research and future opportunities are pointed out.
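The combination of the two techniques on log data can be sketched in miniature; the log format, station names, and checks below are illustrative assumptions, not the prototype described above:

```python
# Minimal sketch: text mining extracts events from unstructured log lines,
# and a directly-follows analysis (a basic process mining step) reveals
# the process flow and error hotspots. Log format is an assumption.
import re
from collections import Counter

log_lines = [
    "2024-01-05 08:00:12 station=press  event=start  item=42",
    "2024-01-05 08:01:03 station=press  event=done   item=42",
    "2024-01-05 08:01:10 station=weld   event=start  item=42",
    "2024-01-05 08:02:45 station=weld   event=error  item=42 msg='torch overheated'",
    "2024-01-05 08:03:30 station=weld   event=done   item=42",
    "2024-01-05 08:03:40 station=paint  event=start  item=42",
]

pattern = re.compile(r"station=(\w+)\s+event=(\w+)")

# Text mining step: turn unstructured lines into (station, event) pairs.
events = [m.groups() for line in log_lines if (m := pattern.search(line))]

# Process mining step: directly-follows counts between stations.
follows = Counter()
for (s1, _), (s2, _) in zip(events, events[1:]):
    if s1 != s2:
        follows[(s1, s2)] += 1

# Highlight issues: error events per station.
errors = Counter(s for s, e in events if e == "error")

print(dict(follows))
print(dict(errors))
```

Fed into a digital twin, such derived relations can keep the twin's process view current and flag the stations that need attention.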
A digital twin - a replica of energy devices - was established in the computing environment of MATLAB and Simulink. It continuously simulates their operation and is time-synchronized and connected to the central energy management and control system of a virtual power plant. The model can be used as a platform for testing device performance under various conditions, working schedules and new optimization options.
Technologies for mapping the “digital twin“ have been under development for approximately 20 years. Nowadays, increasingly intelligent, individualized products encourage companies to respond innovatively to customer requirements and to handle the rising number of product variations quickly.
An integrated engineering network, spanning the entire value chain, is operated to intelligently connect various company divisions and to generate a business ecosystem for products, services and communities. This determines the conditions for the digital twin, in which the digital world can be fed into the real world, and the real world back into the digital, so that such intelligent products with rising variations can be handled.
The term digital twin can be described as a digital copy of a real factory, machine, worker etc. that is created and can be independently expanded, automatically updated and globally available in real time. Every real product and production site is permanently accompanied by a digital twin. First prototypes of such digital twins already exist in the ESB Logistics Learning Factory, built on cloud- and app-based software that rests on a dynamic, multidimensional data and information model. A standardized language of the robot control systems via software agents and positioning systems has to be integrated. The continuity of the real factory in the digital factory, as an economical means of ensuring that digital models remain up to date, serves as the basis of changeability.
For indoor localization, sensor combinations should be used that, in addition to the hardware, already contain the software required for sensor data fusion. Processing systems, live scenario simulations and digital shop floor management result in a mandatory procedural combination. Essential to the digital twin is the ability to consistently provide all subsystems with the latest state of all required information, methods and algorithms.
In 2017, Philips' goal was to use innovation to improve the lives of three billion people a year by 2025. To achieve that, the company was shifting from selling medical products in a transactional manner to providing integrated healthcare solutions based on digital health technology. Based on our interviews with 23 executives at Philips, the case examines the two directions of the transformation required by this shift: externally, Philips worked on transforming how healthcare was conducted. Healthcare professionals would have to change the way they worked and reimbursement schemes needed to change to incentivize payers, providers, and patients in vastly different ways. Internally, Philips needed to redesign how its employees worked. The company componentized its business, introduced digital platforms, and co-created integrated solutions with the various stakeholders of the healthcare industry. In other words: Philips was transforming itself in order to reinvent healthcare in the digital age.
By 2019, Germany-based Kärcher, “the world’s leading provider of cleaning technology,” had turned its professional cleaning devices into IoT products. The data generated by these IoT-connected cleaning devices formed a key ingredient in the company’s ongoing strategic shift in its B2B business: Kärcher was transforming from a seller of cleaning devices to a provider of consulting services in order to help professional cleaning companies improve their cleaning processes. Based on interviews with seven IT- and non-IT executives, the case illustrates how the company learned to generate value from IoT products. And it demonstrates how a family-owned company transformed its organization in order to be able to more effectively develop and provide IoT products, while adding roles, developing technology platforms, and changing organizational structures and ways of working.
The disruptive potential of digital transformation (DT) has been widely discussed in scholarly literature and practitioner-oriented discourses. The management control (MC) function is an important corporate function, as it provides transparency on the economic situation of a firm. DT challenges MC in a two-fold and reciprocal way, as it (i) changes the MC function itself as well as (ii) the entire firm and its business models, which needs to be accompanied by the MC function. Given the complexity and variety of phenomena within the developments in the context of DT, a comprehensive management approach is essential. Surprisingly, few convincing approaches exist that support a comprehensive management of DT. The objectives of this paper are therefore to discuss the impact of DT on MC and to develop a framework to control the DT of an organization from an MC perspective. Based on a literature review and conceptual research, our study contributes to knowledge by proposing an initial, preliminary conceptual framework to manage DT from an MC perspective. The framework highlights important dimensions that should be considered in the management of DT, for example related to processes and MC instruments.
Established companies are facing two transformations involving digital technologies: becoming digitized and becoming digital. The platforms enabling these transformations are fundamentally different in their purpose, target state, success metrics, and especially in the key responsibilities of senior leaders. Because of these differences, companies will need to apply new rules (new roles, processes, metrics, and norms) to the new digital platform. To develop new rules, leaders should (1) separate the teams working on the digital platform, (2) allow digital platform leaders to experiment with new rules, and (3) identify new leaders and coach them to succeed with new rules. Given the time it takes to establish new rules, companies need to start breaking old rules now.
Current advances in Artificial Intelligence (AI) combined with other digitalization efforts are changing the role of technology in service ecosystems. Human-centered intelligent systems and services are the target of many current digitalization efforts and part of a massive digital transformation based on digital technologies. Artificial intelligence, in particular, is having a powerful impact on new opportunities for shared value creation and the development of smart service ecosystems. Motivated by experiences and observations from digitalization projects, this paper presents new methodological experiences from academia and practice on a joint view of digital strategy and architecture of intelligent service ecosystems and explores the impact of digitalization based on real case study results. Digital enterprise architecture models serve as an integral representation of business, information, and technology perspectives of intelligent service-based enterprise systems to support management and development. This paper focuses on the novel aspect of closely aligned digital strategy and architecture models for intelligent service ecosystems and highlights the fundamental business mechanism of AI-based value creation, the corresponding digital architecture, and management models. We present key strategy-oriented architecture model perspectives for intelligent systems.
Prior to the introduction of AI-based forecast models in the procurement department of an industrial retail company, we assessed the digital skills of the procurement employees and surveyed their attitudes toward a new digital technology. The aim of the survey was to ascertain important contextual factors which are likely to influence the acceptance and the successful use of the new forecast tool. What we find is that the digital skills of the employees show an intermediate level and that their attitudes toward key aspects of new digital technologies are largely positive. Thus, the conditions for high acceptance and the successful use of the models are good, as evidenced by the high intention of the procurement staff to use the models. In line with previous research, we find that the perceived usefulness of a new technology and the perceived ease of use are significant drivers of the willingness to use the new forecast tool.
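The drivers reported above (perceived usefulness and perceived ease of use predicting intention to use) correspond to the classic Technology Acceptance Model regression. The following is a minimal sketch of such an estimation with made-up Likert-style responses, not the study's data; the intention values are constructed here as the exact average of the two predictors so the coefficients come out cleanly:

```python
import numpy as np

# Hypothetical Likert-scale responses (1-7), NOT the study's data:
# perceived usefulness (PU), perceived ease of use (PEOU), and
# intention to use (ITU), constructed as the PU/PEOU average.
pu   = np.array([6, 5, 7, 4, 6, 3, 5, 7, 6, 4], dtype=float)
peou = np.array([5, 5, 6, 3, 6, 2, 4, 7, 5, 3], dtype=float)
itu  = (pu + peou) / 2

# Ordinary least squares for ITU ~ b0 + b1*PU + b2*PEOU
X = np.column_stack([np.ones_like(pu), pu, peou])
(b0, b1, b2), *_ = np.linalg.lstsq(X, itu, rcond=None)

# Positive b1 and b2 mirror the finding that both TAM constructs
# drive the willingness to use the forecast tool.
print(round(b1, 2), round(b2, 2))  # → 0.5 0.5
```

With real survey data the coefficients would of course not be exactly equal; the point is only that significance of b1 and b2 is what "perceived usefulness and ease of use are significant drivers" operationalizes.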
The rise of digital technologies has become an important driver for change in multiple industries. Therefore, firms need to develop digital capabilities to manage the transformation process successfully. Prior research assumes that the development of a specific set of digital capabilities leads to higher digital maturity. However, a measurement framework for digital maturity does not exist in scholarly work. Therefore, this paper develops a conceptualization and measurement model for digital maturity.
Digitalization increases the pressure for companies to innovate. While current research on digital transformation mostly focuses on technological and management aspects, less attention has been paid to organizational culture and its influence on digital innovations. The purpose of this paper is to identify the characteristics of organizational culture that foster digital innovations. Based on a systematic literature review on three scholarly databases, we initially found 778 articles that were then narrowed down to a total number of 23 relevant articles through a methodical approach. After analyzing these articles, we determine nine characteristics of organizational culture that foster digital innovations: corporate entrepreneurship, digital awareness and necessity of innovations, digital skills and resources, ecosystem orientation, employee participation, agility and organizational structures, error culture and risk-taking, internal knowledge sharing and collaboration, customer and market orientation as well as open-mindedness and willingness to learn.
Sustainable technologies are being increasingly used in various areas of human life. While they have a multitude of benefits, they are particularly useful in health monitoring, especially for certain groups of people such as the elderly. However, several issues still need to be addressed before their use becomes widespread. This work aims to clarify the aspects that are of great importance for increasing the acceptance of the use of this type of technology among the elderly. In addition, we aim to clarify whether the technologies that are already available can ensure acceptable accuracy and whether they could replace some of the manual approaches currently in use. A two-week study with people 65 years of age and over was conducted to address the questions posed here, and the results were evaluated. It was demonstrated that simplicity of use and automatic functioning play a crucial role. It was also concluded that technology cannot yet completely replace traditional methods such as questionnaires in some areas. Although the technologies that were tested were classified as being “easy to use”, the elderly participants in the current study indicated that they were not sure they would use these technologies regularly in the long term because, among other issues, the added value is not always clear. Therefore, awareness-raising must take place in parallel with the development of technologies and services.
The third Digital Enterprise Computing Conference DEC 17 at the Herman Hollerith Center in Böblingen brings together students, researchers, and practitioners to discuss solutions, experiences, and future developments for the digital transformation. Digitization of business and IT defines the conference agenda: digital models & architecture, digital marketing, agility & innovation.
The second Digital Enterprise Computing Conference DEC 16 at the Herman Hollerith Center in Böblingen brings together students, researchers, and practitioners to discuss solutions, experiences, and future developments for the digital transformation. Digitization of business and IT defines the conference agenda: technology acceptance, digital transformation, digital business & administration, digital process challenges, analytics, and big data & data processing.
The digitization of our society changes the way we live, work, learn, communicate, and collaborate. This disruptive change interacts with all information processes and systems, which have been important business enablers in the context of digitization for years. Our aim is to support flexibility and agile transformations for both business domains and related information technology through more flexible enterprise information systems, based on the adaptation and evolution of digital enterprise architectures. The present research paper investigates the continuous bottom-up integration of micro-granular architectures for a huge amount of dynamically growing systems and services, like microservices and the Internet of Things, as part of a new digital enterprise architecture. To integrate micro-granular architecture models into living architectural model versions, we extend traditional enterprise architecture reference models with state-of-the-art elements for agile architectural engineering to support the digitization of products, services, and processes.
Digital enterprise architecture management in tourism : state of the art and future directions
(2018)
The advance of information technology impacts tourism more than many other industries, due to the service character of its products. Most offerings in tourism are immaterial in nature and challenging to coordinate. Therefore, the alignment of IT, strategy and digitization is of crucial importance to enterprises in tourism. To cope with the resulting challenges, methods for the management of enterprise architectures are necessary. We therefore scrutinize approaches for managing enterprise architectures based on a literature review. We found many areas for future research on the use of enterprise architecture in tourism.
Excellence in IT is both a driver and a key enabler of the digital transformation. The digital transformation changes the way we live, work, learn, communicate, and collaborate. The Internet of Things (IoT) fundamentally influences today’s digital strategies with disruptive business operating models and fast-changing markets. New business information systems are integrating emerging Internet of Things infrastructures and components. With the huge diversity of Internet of Things technologies and products, organizations have to leverage and extend previous Enterprise Architecture efforts to enable business value by integrating Internet of Things architectures. Both architecture engineering and management of current information systems and business models are complex and currently integrate, besides the Internet of Things, synergistic subjects such as Enterprise Architecture in the context of services & cloud computing, semantic-based decision support through ontologies and knowledge-based systems, big data management, as well as mobility and collaboration networks. To provide adequate decision support for complex business/IT environments, we have to make the impact of business and IT changes transparent across the integral landscape of affected architectural capabilities, like directly and transitively impacted IoT objects, business categories, processes, applications, services, platforms and infrastructures. The paper describes a new metamodel-based approach for integrating Internet of Things architectural objects, which are semi-automatically federated into a holistic Digital Enterprise Architecture environment.
Salivary gland tumors (SGTs) are a relevant, highly diverse subgroup of head and neck tumors whose entity determination can be difficult. Confocal Raman imaging in combination with multivariate data analysis may possibly support their correct classification. For the analysis of the translational potential of Raman imaging in SGT determination, a multi-stage evaluation process is necessary. By measuring a sample set of Warthin tumor, pleomorphic adenoma and non-tumorous salivary gland tissue, Raman data were obtained and a thorough Raman band analysis was performed. This evaluation revealed highly overlapping Raman patterns with only minor spectral differences. Consequently, a principal component analysis (PCA) was calculated and further combined with a discriminant analysis (DA) to enable the best possible distinction. The PCA-DA model was characterized by accuracy, sensitivity, selectivity and precision values above 90% and validated by predicting model-unknown Raman spectra, of which 93% were classified correctly. Thus, we consider our PCA-DA model suitable for the discrimination and prediction of parotid tumors and non-tumorous salivary gland tissue. For evaluation of the translational potential, further validation steps are necessary.
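The PCA-plus-discriminant pipeline described in this abstract can be sketched on synthetic data. The spectra below are random stand-ins (not Raman measurements), and the discriminant step is simplified to nearest class centroid in PC space rather than the paper's actual DA; all dimensions and class shifts are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for Raman spectra: rows are spectra, columns are
# wavenumber channels; three "tissue classes" differ only in a few
# band intensities (values are illustrative, not measured data).
n_per_class, n_channels = 30, 50
labels = np.repeat([0, 1, 2], n_per_class)
shifts = np.zeros((3, n_channels))
shifts[0, 0:5] = 2.5    # class-specific band regions
shifts[1, 5:10] = 2.5
shifts[2, 10:15] = 2.5
X = rng.normal(size=(3 * n_per_class, n_channels)) + shifts[labels]

# PCA via SVD on mean-centred data; keep the first k components.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 5
scores = Xc @ Vt[:k].T

# Discriminant step in PC space, here simplified to nearest class
# centroid (the paper combines PCA with a proper discriminant analysis).
centroids = np.stack([scores[labels == c].mean(axis=0) for c in range(3)])
dists = ((scores[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
accuracy = (dists.argmin(axis=1) == labels).mean()
print(accuracy)
```

The dimensionality reduction before the discriminant step is the key design choice: with more channels than samples, a discriminant fitted on raw spectra would overfit, while the PC scores keep only the directions carrying most variance.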
The complexity of supply chains increases, especially due to the geographical spread of supplier and customer networks. In the connected and automated supply chains of Industry 4.0, even more nodes are incorporated into supply chains. This paper discusses the possible improvement of process quality in Industry 4.0 through different blockchain and distributed ledger technologies. We derived hypotheses from a literature review and asked German blockchain experts from industry to validate and discuss the hypotheses. We find that the different blockchain technologies and consensus algorithms have different strengths with regard to quality improvement. One central finding is that IOTA, developed especially for the IoT and deemed the ’next evolutionary step’, is scalable and hence may increase process efficiency, but at the same time is more vulnerable than other blockchain implementations, which again may reduce overall process quality.
Analog integrated circuit sizing still relies heavily on human expert knowledge, as previous automation approaches have not found widespread acceptance in industry. One strand, optimization-based automation, is often discarded due to inflated constraining setups, infeasible results or excessive run times. To address these deficits, this work proposes an alternative optimization flow that captures a designer's intuition for feasible design spaces by integrating expert knowledge based on the gm/ID-method. Moreover, the extensive run times of simulation-based optimization flows are overcome by incorporating computationally efficient machine learning methods. Neural network surrogate models predicting eleven performance parameters increase the evaluation speed by 3 400× on average compared to a simulator. Additionally, they enable the use of optimization algorithms dependent on automatic differentiation that would otherwise be unavailable in this field. First, an up to 4× more efficient way of sampling training data based on the aforementioned space is detailed. After presenting the architecture and training effort of the surrogate models, they are employed as part of the objective function for sizing three operational amplifiers with three different optimization algorithms. Additionally, the benefits of using the gm/ID-method become evident when considering technology migration, as previously found solutions may be reused for other technologies.
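The gm/ID-method this flow builds on ties a target transconductance to a bias current and a device width through technology lookup tables. A toy sizing step illustrates the idea; the current-density value `id_density` is a made-up number standing in for a real technology table, not data from any PDK:

```python
# gm/ID sizing sketch with hypothetical lookup values.
gm_target = 1e-3      # required transconductance [S]
gm_over_id = 15.0     # chosen inversion level [1/V], moderate inversion
id_density = 4e-6     # assumed ID/W at this gm/ID [A/um] (illustrative)

# The chosen gm/ID fixes the bias current for the target gm,
# and the technology's current density then fixes the width.
i_d = gm_target / gm_over_id   # bias current [A]
width = i_d / id_density       # device width [um]

print(round(i_d * 1e6, 1), round(width, 1))  # → 66.7 16.7
```

Because the design point is expressed as gm/ID rather than as absolute geometry, the same chosen inversion level can be re-evaluated against another technology's lookup table, which is what makes the technology migration mentioned in the abstract straightforward.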
The self-healing effect of melamine-based surfaces, triggered by temperature, was investigated. The temperature-triggered reversible healing chemistry on which the self-healing effect is based was the Diels-Alder (DA) reaction between furan and maleimide groups. Melamine-furan-containing building blocks were connected by a multi-functional maleimide crosslinker via a Diels-Alder (DA) reaction to give a DA adduct. The DA adduct was then reacted with formaldehyde to form a network by a conventional condensation reaction of melamine amino groups with formaldehyde. The obtained resin was characterised and used for the impregnation of paper. Impregnated papers and neat resin were used to perform scratch-healing tests and mechanical analysis of the novel coating system.
A device including a first and second monitoring unit, the first monitoring unit detecting a first voltage potential and the second monitoring unit detecting a second voltage potential, the monitoring units comparing the first voltage potential and the second voltage potential to the value of the supply voltage and activate a control unit as a function of the comparisons, the control unit determining a switching point in time of a second power transistor, and an arrangement being present which generates current when the second power transistor is being switched on, the current changing the first voltage potential, and the control unit activates a first power transistor when the first voltage potential has the same value as the supply voltage, so that the first power transistor is de-energized.
Powder coatings provide several advantages over traditional coatings: environmental friendliness, freedom of design, robustness and resistance of surfaces, the possibility of seamless all-around coating, a fast production process, and cost-effectiveness. In recent years these benefits of powder coating technology have been transferred from metal to heat-sensitive natural fibre/wood-based substrates (especially medium-density fibre boards, MDF) used for interior furniture applications. Powder-coated MDF furniture parts are already gaining market share in classic furniture applications such as kitchens, bathrooms, living rooms and offices. The acceptance of this product is increasing, as reflected by excellent growth rates and an increasing customer base. Current efforts of the powder coating industry to develop new powders with higher reactivity (i.e. lower curing temperatures and shorter curing times; e.g. 120°C/5 min) will enable the powder coating of other heat-sensitive substrates like natural fibre composites, wood plastic composites, lightweight panels and different plastics in the future. The coating could be applied and cured by the conventional powder coating process (electrostatic application, and melting and curing in an IR oven) or by a new powder coating procedure based on the in-mould coating (IMC) technique, which is already established in the plastics industry. Extra value could be added in the future by functional powder toner printing of powder-coated substrates using electrophotographic printing technology, meeting the future demand for both individualization of the furniture part surface, by applying functional 3D textures and patterns as well as individually created coloured images, and shorter delivery times for these individualized parts. The paper describes the distinctiveness of powder coating on natural fibre/wood-based substrates, the requirements of the substrate and the coating powder.
Increasing concerns regarding the world's natural resources and sustainability continue to be a major issue for global development. As a result, several political initiatives and strategies for green or resource-efficient growth on both national and international levels have been proposed. A core element of these initiatives is the promotion of an increase in resource or material productivity. This dissertation examines material productivity developments in the OECD and BRICS countries between 1980 and 2008. By applying the concept of convergence stemming from economic growth theory to material productivity, the analysis provides insights into both aspects: material productivity developments in general as well as potentials for accelerated improvements in material productivity, which consequently may allow a reduction of material use globally. The results of the convergence analysis underline the importance of policy-making with regard to technology and innovation policy enabling the production of resource-efficient products and services as well as technology transfer and diffusion.
Long-term stability of membranes in membrane distillation operation is currently a problem that prevents the industrial breakthrough of this separation process. Fouling or slow pore wetting are the basic reasons for this.
Membrane distillation membranes were made by the NIPS process, rendering the membrane asymmetric to achieve low permeation resistance and pores that can be overcoated with polyelectrolyte polymers, thus leading to thermopervaporation membranes. These membranes prevent pore wetting and may strongly reduce the resorption of organic substances on the hydrophobic surfaces typically used for membrane distillation, thus leading to long-term operational stability in dewatering, including stable membrane cleaning.
Asymmetric PVDF membranes have been coated with a cation-exchange polyelectrolyte, leading to a very thin, defect-free layer that has a high permeation rate for water due to the domain structure of phase-separated hydrophilic and hydrophobic three-dimensional structures.
One of the strategically important issues of energy security for Ukraine and the countries of Europe today is to reduce the consumption of natural gas. This task is particularly relevant in winter, when a significant amount of natural gas is consumed for heating premises. Therefore, one can predict that in the near future, premises in Ukraine and European countries will be heated more frequently by electrical energy.
A massive transition to electric heating of premises, under the conditions of implementing the national objectives of Ukraine and the countries of Europe related to a significant reduction in energy consumption, necessitates rethinking the process of controlling electric heating of premises. The algorithms that control the power supply to premises are required to include mechanisms for planning the amount of electric energy consumed by an individual. This is especially true of energy-intensive processes such as heating premises.
Therefore, it is an important task for Ukraine and the countries of Europe to work out an approach for creating systems to control electric heating of premises in a house or apartment that would take into consideration not only information about the desired temperature regime, but also information on the desired amount of electricity needed for heating.
Drug-induced liver toxicity is one of the most common reasons for the failure of drugs in clinical trials and frequent withdrawal from the market. Reasons for such failures include the low predictive power of in vivo studies, which is mainly caused by metabolic differences between humans and animals, and intraspecific variances. In addition to factors such as age and genetic background, changes in drug metabolism can also be caused by disease-related changes in the liver. Such metabolic changes have also been observed in clinical settings, for example, in association with a change in liver stiffness, a major characteristic of an altered fibrotic liver. For mimicking these changes in an in vitro model, this study aimed to develop scaffolds that represent the rigidity of healthy and fibrotic liver tissue. We observed that liver cells plated on scaffolds representing the stiffness of healthy livers showed a higher metabolic activity compared to cells plated on stiffer scaffolds. Additionally, we detected a positive effect of a scaffold pre-coated with fetal calf serum (FCS)-containing media. This pre-incubation resulted in increased cell adherence during cell seeding onto the scaffolds. In summary, we developed a scaffold-based 3D model that mimics liver stiffness-dependent changes in drug metabolism and may more easily predict drug interactions in diseased livers.
Health monitoring in a home environment can have broader use, since it may provide continuous control of health parameters with relatively minor intrusiveness into regular life. This work aims to verify whether it is possible to replace the subjective questioning typical in some areas of sleep medicine with objective measurement using electronic devices. For this purpose, a study was conducted with ten subjects, in which objective and subjective measurement of relevant sleep parameters took place. The results of both measurement methods were evaluated and analyzed. The results showed that while for some measures, such as Total Time in Bed, there is a high agreement between objective and subjective measurements, for others, such as sleep quality, there are significant differences. For this reason, a combination of both measurement methods may currently be beneficial and provide the most detailed results, while a partial replacement can already reduce the number of questions in the subjective assessment by measuring through electronic devices.
Development of an IoT-based inventory management solution and training module using smart bins
(2023)
Flexibility, transparency and changeability of warehouse environments play an increasingly important role in achieving cost-efficient production of small batch sizes. This results in increasing requirements for warehouses in terms of flexibility, scalability, reconfigurability and transparency of material and information flows, in order to deal with a large number of different components and with variable material and information flows due to small batch sizes. Therefore, an IoT-based inventory management solution and training module has been developed, implemented and validated at Werk150 – the Factory on campus of the ESB Business School. Key elements of the developed solution are smart bins using weight mats to track the bin's content as well as additional sensors and buttons, which are connected to an IoT hub to collect data on material consumption and manual handling operations. The use of weight mats for the smart bins offers the possibility to measure the container content independently of the specific component geometry, and thus for a variety of components, based on the specific component weights. The developed solution enables focusing on key success elements of the system to synchronize the flow of materials and information, resulting in an increase in flexibility and significantly higher transparency of the material flow. AI-based algorithms are applied to analyse the gathered data and to initiate process optimizations by providing logistics decision makers a profound and transparent basis for decision making. In order to provide students and industry visitors of the learning factory with the necessary competences and to support the transfer into practice, a training module on IoT-based inventory management was developed and implemented.
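The geometry-independent counting via weight mats described above reduces, at its core, to dividing the net bin weight by the per-part weight. A minimal sketch of that step; the function name, tare and unit weights are hypothetical, not taken from the Werk150 implementation:

```python
def bin_quantity(total_weight_g: float, tare_g: float, unit_weight_g: float) -> int:
    """Estimate the part count in a smart bin from a weight-mat reading.

    Rounding absorbs small mat noise; the component geometry is
    irrelevant, only the per-part weight matters.
    """
    if unit_weight_g <= 0:
        raise ValueError("unit weight must be positive")
    # Net weight = gross reading minus empty-bin tare; clamp at zero
    # so a slightly negative noisy reading never yields a count.
    return max(0, round((total_weight_g - tare_g) / unit_weight_g))

# Hypothetical reading: 1262 g gross, 150 g bin tare, 13.9 g per screw
print(bin_quantity(1262.0, 150.0, 13.9))  # → 80
```

In a real deployment the per-part weight would come from a master-data record per component, and consecutive readings would be smoothed before counting, so that a hand briefly resting on the bin does not register as a stock change.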
Manufacturing companies are confronted with external (e.g. short-term change of product configuration by the customer) and internal (e.g. production process deviations) turbulences which affect the performance of production. Predefined, centrally controlled logistics processes limit the possibilities of production to initiate countermeasures and react in an optimized way to these turbulences. The autonomous control of intralogistics offers great potential to cope with these turbulences by using the respective flexibility corridors of production systems and applying intelligent logistic objects with decentralized decision and process execution capabilities to maintain a target-optimized production. A method for AI-based storage-location and material-handling optimization has been developed to achieve a performance-optimized intralogistics system through continuous monitoring of performance-relevant parameters and influencing factors by using AI (e.g. for pattern recognition). To provide the basis to investigate and demonstrate the potentials of autonomously controlled intralogistics in connection with turbulences of production and in combination with AI, an intelligent warehouse involving an indoor localization system, smart bins, and manual, semi-automated/collaborative and autonomous transport systems has been developed and implemented at Werk150, the factory on campus of ESB Business School (Reutlingen University). This scenario, which has been integrated into graduate training modules, allows the analysis and demonstration of different measures of intralogistics to cope with turbulences in production involving, amongst others, storage and material provision processes. The target fulfilment of the applied intralogistics measures to master arising turbulences is assessed based on the overall performance of production, considering lead times and adherence to delivery dates.
By applying artificial intelligence (AI) algorithms the intelligent logistical objects (smart bin, transport systems, etc.) as well as the entire logistics system should be enabled to improve their decision and process execution capabilities to master short-term turbulences in the production system autonomously.
Development of an indoor positioning system to create a digital shadow of production plant layouts
(2023)
The objective of this dissertation is to develop an indoor positioning system that allows the creation of a digital shadow of the plant layout in order to continuously represent the actual state of the physical layout in virtual space. To define the requirements for such a system, potential stakeholders who could benefit from a digital shadow in the context of the plant layout were analysed, and the requirements were derived from their perspective in order to generate added value for their work. As the core of an indoor positioning system is the sensory capture of the physical layout parameters, different potential technologies were compared and evaluated in terms of their suitability for this particular application. Derived from this analysis, the selected concept is based on the use of a pan-tilt-zoom (PTZ) camera in combination with fiducial markers. To determine specific camera parameters, a series of experiments was conducted, which was necessary to develop the measurement method as well as the mathematical calculation method and coordinate transformation for the determination of poses (positions and angular orientations) of the respective facilities in the plant. In addition, an experimental validation was performed to ensure that the limit values for individual parameters determined in the requirements analysis can be met.
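The pose determination described in the abstract rests on chaining coordinate transformations from the camera frame into the plant coordinate system. The following is a minimal illustrative sketch, not the dissertation's actual method: it assumes an idealized PTZ camera whose pan rotates about the vertical z-axis and whose tilt rotates about the camera's x-axis, and transforms a marker position measured in camera coordinates into plant coordinates.

```python
import math

def rot_z(a):
    """Rotation matrix about the vertical (pan) axis."""
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def rot_x(a):
    """Rotation matrix about the camera's x (tilt) axis."""
    c, s = math.cos(a), math.sin(a)
    return [[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]]

def mat_vec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def camera_to_plant(p_cam, pan, tilt, cam_pos):
    """Transform a marker position from camera to plant coordinates.

    Applies the tilt rotation, then the pan rotation, then translates
    by the camera's mounting position in the plant frame.
    """
    p = mat_vec(rot_x(tilt), p_cam)
    p = mat_vec(rot_z(pan), p)
    return [p[i] + cam_pos[i] for i in range(3)]
```

For example, a marker seen 1 m along the camera's x-axis by a camera panned 90° and mounted 2 m above the plant origin maps to the point one metre along the plant's y-axis at that height.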
Development of an expert system to overpass citizens technological barriers on smart home and living
(2023)
Adopting new technologies can be overwhelming, even for people with experience in the field. For the general public, learning about new implementations, releases, brands, and enhancements can cause them to lose interest. There is a clear need to create point sources and platforms that provide helpful information about novel and smart technologies, assisting users, technicians, and providers with products and technologies. The purpose of these platforms is twofold, as they can gather and share information on interests common to manufacturers and vendors. This paper presents the "Finde-Dein-SmartHome" tool, developed in association with the Smart Home & Living competence center [5] to help users learn about, understand, and purchase available technologies that meet their home automation needs. This tool aims to lower the usability barrier and guide potential customers to clear their doubts about privacy and pricing. Communities can use the information provided by this tool to identify market trends that could eventually lower costs for providers and incentivize access to innovative home technologies and devices supporting long-term care.
Process risks are omnipresent in the corporate world and repeatedly present organizations with the challenge of how to deal with these risks. Efforts in trying to analyze and prevent these risks are costly and require many resources, which do not always bring the desired added value. The goal of this work is to determine how a benefit-oriented resource allocation can be made for risk-oriented process management. For this purpose, the following research question is posed: "How can systematic prioritization decisions regarding risk-oriented process management be made?" To answer it, an evaluation procedure is developed which assesses processes based on their characteristics regarding potential risk disposition as well as entrepreneurial relevance. For this purpose, requirements for such a procedure are first collected and used to define selection criteria for it. After the detailed analysis of known selection and evaluation procedures, one of them is selected and used for further development. Next steps include the definition of relevant criteria for the evaluation of the processes by examining process characteristics regarding their suitability for process evaluation. The focus here lies on characteristics that provide indications of the risk disposition and business relevance of processes. The result of this approach is a scoring model with a criteria catalog consisting of 15 criteria according to which a process is evaluated. The evaluation result is presented both numerically and in a matrix. This enables the comparison of several processes and a derived prioritization of those for a more in-depth risk analysis. The application of this approach will ensure a benefit-oriented allocation of resources in the management of process risks and increased process reliability.
Development of an easy teaching and simulation solution for an autonomous mobile robot system
(2019)
With mass-customized production becoming the mainstream, industries are shifting from large-scale manufacturing to flexible and customized production of small batch sizes. Agile manufacturing strategies adopted by SMEs are driving the usage of collaborative robots in today's factories. Major challenges in the adoption of cobots in the industry are the lack of a highly trained workforce to program the robot to perform complex tasks and the integration of robot systems with other smart devices in the factory. In addition, teaching and simulation by non-robotics experts is a major challenge for many industrial collaborative robot systems like the KUKA LBR iiwa, since these systems are designed to be programmed by robot experts and not by shop floor workers or other non-experts. This paper describes the research and development activities undertaken to reduce the barriers in operation and ensure holistic integration of the LBR iiwa cobot in assembly, using the example of the ESB Logistics Learning Factory. These include a visual programming solution for the easy teaching of various tasks. Robotic tasks are classified based on common robotics applications, and application-specific blocks abstracting specific actions are implemented. A factory worker with no programming competency could create robot programs by combining these blocks using a graphical user interface. In addition, a simulation solution was developed to visualize, analyse, and optimize robotic workflows before deployment. An autonomous mobile robot is integrated with the LBR iiwa to improve reconfigurability and thus also productivity. The system as a whole is controlled using an event-driven distributed control system. Finally, the capabilities of the system are analysed based on the design principles of Industrie 4.0, and potential future research ideas are discussed to further improve the system.
The process for the production of customized bras is highly challenging. Although the need is very clear, the lingerie industry is currently facing a lack of data, knowledge and expertise for the realization of an automated process chain. Different studies and surveys have shown that the majority of women wear the incorrect bra size. In addition to aesthetic problems, health risks such as headaches, back problems or digestive problems can result from this for the wearers. An important prerequisite for improvements is basic knowledge about the female breast, both in terms of body measurements and different breast shapes. The current size systematics for bras define a bra size only by the relation between bust girth and underbust girth, and standardized cup forms do not do justice to the high variability of the human body. As the bra type shapes the female breast, basic knowledge about the relation of measurements and shapes between the clothed and the unclothed breast is missing.
In the present project, studies are conducted to explore the female breast and to derive new breast-specific body measurements, different breast shapes and deformation knowledge using existing bras.
Furthermore, an innovative process is being developed that leads from 3D scanning to individual and interactive pattern construction, which allows an automatic pattern creation based on individual body measurements and the influence of different material parameters.
In the course of the presentation, the current project status will be shown and the future developments and project steps will be introduced.
Circular economy aims to support reuse and extend product life cycles through repair, remanufacturing, upgrades and retrofits, as well as closing material cycles through recycling. To successfully manage the necessary transformation processes towards a circular economy, manufacturing enterprises rely on the competency of their employees. The definition of competency requirements for circular economy-oriented production networks will contribute to the operationalization of circular economy. The International Association of Learning Factories (IALF) states in its mission the development of learning systems addressing these challenges for the training of students and the further education of industry employees. To identify the required competencies for circular economy, the major changes of the product life cycle phases have been investigated based on the state of the science and compared to the socio-technical infrastructure and thematic fields of the learning factories considered in this paper. To operationalize the circular economy approach in the product design and production phase in learning factories, an approach for a cross-learning-factory network (the so-called "Cross Learning Factory Product Production System" (CLFPPS)) has been developed. The proposed CLFPPS represents a network along the design dimensions of learning factories. This approach contributes to the promotion of circular economy in learning factories, as it makes use of and combines the focus areas of different learning factories. This enables the CLFPPS to offer a holistic view of the product life cycle in production networks.
Recent advances in artificial intelligence have enabled promising applications in neurosurgery that can enhance patient outcomes and minimize risks. This paper presents a novel system that utilizes AI to aid neurosurgeons in precisely identifying and localizing brain tumors. The system was trained on a dataset of brain MRI scans and utilized deep learning algorithms for segmentation and classification. Evaluation of the system on a separate set of brain MRI scans demonstrated an average Dice similarity coefficient of 0.87. The system was also evaluated through a user experience test involving the Department of Neurosurgery at the University Hospital Ulm, with results showing significant improvements in accuracy, efficiency, and reduced cognitive load and stress levels. Additionally, the system has demonstrated adaptability to various surgical scenarios and provides personalized guidance to users. These findings indicate the potential for AI to enhance the quality of neurosurgical interventions and improve patient outcomes. Future work will explore integrating this system with robotic surgical tools for minimally invasive surgeries.
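The Dice similarity coefficient reported above (0.87) is a standard overlap measure between a predicted and a reference segmentation. As a hedged illustration, not the paper's implementation, it can be computed over binary masks represented as sets of voxel indices:

```python
def dice_coefficient(pred, truth):
    """Dice similarity between two binary segmentations,
    given as sets of voxel indices; 1.0 means perfect overlap,
    0.0 means no overlap at all."""
    if not pred and not truth:
        return 1.0  # convention: two empty masks agree perfectly
    return 2 * len(pred & truth) / (len(pred) + len(truth))
```

For example, masks {1, 2, 3} and {2, 3, 4} share two of their three voxels each, giving a Dice score of 2/3.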
Especially where the potential of technical and organizational measures for ergonomic workplace design is limited, exoskeletons can be considered as innovative ergonomic aids to reduce the physical workload of workers. Recent scientific findings from ergonomic analyses with and without exoskeletons indicate that strain reduction can be achieved, particularly at workplaces with lifting, holding, and carrying processes. Currently, a work system design method is under development, incorporating criteria and characteristics for the design of work systems in which a human worker is supported by an exoskeleton. Based on the properties of common passive and active exoskeletons, factors influencing the human on which an exoskeleton can have a positive or negative effect (e.g. additional weight) were derived. The method will be validated by the conceptualization and setup of several work system demonstrators at Werk150, the factory of ESB Business School on the campus of Reutlingen University, to prove the positive ergonomic effect on humans and to support the process of choosing a suitable exoskeleton. The developed method and demonstrators enable the user to experience the positive ergonomic effects of exoskeletal support in lifting, holding and carrying processes in logistics and production. The new work system design method will contribute to employees being able to pursue their professional activity longer without substantial injuries and to being deployed more flexibly at different workstations. New work concepts, strategies and scenarios are also opened up to reduce the risk of occupational accidents and to promote the compatibility of work for employees. A training module is being developed and evaluated with participants from industry and master's students to build up competence.
The increasing urban population growth leads to challenges in cities in many aspects: Urbanisation problems such as excessive environmental pollution or increasing urban traffic demand new and innovative solutions. In this context, the concept of smart cities is discussed. An enabling element of the smart city concept is applying information technology (IT) to improve administrative efficiency and quality of life while reducing costs and resource consumption and ensuring greater citizen participation in administrative and urban development issues. While these smart city services are technologically studied and implemented, government officials, citizens or businesses are often unaware of the large variety of smart city service solutions. Therefore, this work deals with developing a smart city services catalogue that documents best practice services to create a platform that brings citizens, city government, and businesses together. Although the concept of IT service catalogues is not new and guidelines and recommendations for the design and development of service catalogues already exist in the corporate context, there is little work on smart city service catalogues. Therefore, approaches from agile software development and pattern research were adapted to develop the smart city service catalogue platform in this work.
The Circular Economy aims to reintroduce the value of products back into the economic cycle at the same value chain level. While the activities of the Circular Economy are already well-defined, there exists a gap in how returned products are treated by the industry. This study aims to examine how a process should be designed to handle returned products in the context of the Circular Economy. To achieve this, a machine learning-based algorithm is used to classify data and extract relevant information throughout the product life cycle. The focus of this research is limited to land transportation systems within the Sharing Economy sector.
Endogenous electrical fields play an important role in various physiological and pathological events. Yet the effects of electrical cues on processes such as wound healing, tumor development or metastasis are still rarely investigated, though it is known that direct current electrical fields can alter cell migration or proliferation in vitro. Several 2D experimental models for studying cell responses to direct current electrical fields have been presented and characterized, but suitable experimental models for electrotaxis studies in 3D are rare. Here we present a novel, easy-to-produce, multi-well-based galvanotactic chamber for use in 2D and 3D cell experiments for investigations on the influence of electrical fields on tumor cell migration and tumor spheroid growth. Our presented system allows the simultaneous application of electrical fields to cells in four chambers, either cultured on the bottom of the culture plate (2D) or embedded in hydrogel-filled channels (3D). The set-up is also suitable for live-cell imaging. Validation tests show stable electrical fields and high cell viabilities inside the channel. Tumor spheroids of various diameters can be exposed to direct current electrical fields for up to one week.
The focus of the developed maturity model was set on processes. The concept of the widespread CMM and its practices has been transferred to the perioperative domain and the concept of the new maturity model. Additional optimization goals and technological as well as networking-specific aspects enable a process- and object-focused view of the maturity model in order to ensure broad coverage of different subareas. The evaluation showed that the model is applicable to the perioperative field. Adjustments and extensions of the maturity model are future steps to improve the rating and classification of the new maturity model.
The Industry 4.0 paradigm requires concepts for integrating intelligent/smart IoT solutions into manufacturing. Such intelligent solutions are envisioned to increase flexibility and adaptability in smart factories. Especially autonomous cobots capable of adapting to changing conditions are a key enabler for changeable factory concepts. However, identifying the requirements and solution scenarios incorporating intelligent products challenges the manufacturing industry, especially in the SME sector. In pick and place scenarios, changing coordinate systems of workpiece carriers cause placing process errors. Using the IPIDS framework, this paper describes the development of a tool-center-point positioning method to improve the process stability of a collaborative robot in a changeable assembly workstation. Applying the framework identifies the requirement for an intelligent workpiece carrier as a part of the solution. Implementing and evaluating the solution within a changeable factory validates the IPIDS framework.
Future intralogistics systems need to adapt flexibly to changing material flow requirements in line with future versatile factory environments, producing personalized products under the performance and cost conditions of today's mass production. Small batch sizes down to a batch size of "1" lead to a high complexity in the design and economical manufacturing of these customized products. Intralogistics systems are integrated into higher-level areas (segment level) as well as into upstream and downstream performance units (system-wide areas). This includes the logistic activities relevant for the system (organized according to storage, picking, transport) such as transportation or storage tasks of tools, semi-finished products, components, assemblies, containers, and waste. Today's centralized material flow control systems, which work based on predefined processes, are neither capable of nor suitable for dealing with the arising complexity of changeable intralogistics systems. Autonomous, decentralized material flow control systems distribute the required decision-making and control processes onto intelligent logistic entities. A major step in the development of an autonomous control method for hybrid intralogistics systems (manual, semi-automated and automated) is the development of a generic archetype for intralogistics systems regarding the system boundaries, elements and relations, resulting in a descriptive model taking into account, amongst others, the time of demand, availability of resources, economic efficiency and technical performance parameters. The ESB Logistics Learning Factory at ESB Business School (Reutlingen University) serves as a close-to-reality development and validation environment for this.
The global demand for individualized products, leading to decreasing production batch sizes, requires innovative approaches to organizing production and logistics systems in a dynamic manner. Current material flow systems mainly rely on predefined system structures and processes, which result in a huge increase of complexity and effort for system and process changes to realize an optimized production and material provision of individualized products. Autonomous production and logistics entities in combination with intelligent products or logistic load carriers following the vision of the “Internet of Things” offer a promising solution for mastering this complexity based on autonomous, decentralized and target-optimized decision making and structure formation without the need for predefined processes and central decision-making bodies. Customer orders are going to prioritize themselves and communicate directly with the required production and logistics resources. Bins containing the required materials are going to communicate with the conveyors or workers of the respective intralogistics system, organizing and controlling the material flow to the autonomously selected workstation. A current research project is the development of a collaborative tugger train combining the potential of automation and human-robot collaboration in intralogistics. This tugger train is going to be integrated into a self-organized intralogistics scenario involving individualized customer orders (low to high batch sizes). To classify the application of self-organization within intralogistics systems, a criteria catalogue has been developed. The application of this criteria catalogue is demonstrated on the example of a self-organization scenario involving the collaborative tugger train and an intelligent bin system.
The approach of self-organized and autonomous controlled systems offers great potential to meet new requirements for the economical production of customized products with small batch sizes based on a distributed, flexible management of dynamics and complexity within the production and intralogistics system. To support the practical application of self-organization for intralogistics systems, a catalogue of criteria for the evaluation of the self-organization of flexible logistics systems has been developed and validated, which enables the classification of logistics systems as well as the identification and evaluation of corresponding potentials that can be achieved by increasing the degree of self-organization.
Supply chains have evolved into dynamic, interconnected supply networks, which increases the complexity of achieving end-to-end traceability of object flows and their experienced events. With its capability to ensure a secure, transparent, and immutable environment without relying on a trusted third party, the emerging blockchain technology shows strong potential to enable end-to-end traceability in such complex multitiered supply networks. However, as the dissertation’s systematic literature review reveals, the currently available blockchain-based traceability solutions lack the ability to map object-related supply chain events holistically, which involves mapping objects’ creation and deletion, aggregation and disaggregation, transformation, and transaction. Therefore, this dissertation proposes a novel blockchain-based traceability architecture that integrates governance and token concepts to overcome the limitations of existing architectures. While the governance concept manages the supply chain structure on an application level, the token concept includes all functions to conduct object-related supply chain events. For this to be possible, this dissertation’s token concept introduces token ‘blueprints’, which allow clients to group tokens into different types, where tokens of the same type are non-fungible. Furthermore, blueprints can include minting conditions, which are, for example, necessary when mapping assembly or delivery processes. In addition, the token concept contains logic for reflecting all conducted object-related events in an integrated token history. This ultimately leads to end-to-end traceability of tokens and their physical or abstract representatives on the blockchain. For validation purposes, this dissertation implements the architecture’s components and their update and request relationships in code and proves its applicability based on the Ethereum blockchain. 
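The blueprint and token-history concepts above can be illustrated with a small, purely hypothetical in-memory sketch (the dissertation's actual implementation targets Ethereum smart contracts, and all names below are illustrative assumptions): blueprints type the tokens, a minting condition consumes the required input tokens (mapping a transformation or assembly event), and every token accumulates an event history that yields end-to-end traceability.

```python
class TokenRegistry:
    """Toy in-memory stand-in for the on-chain token concept."""

    def __init__(self):
        self.blueprints = {}  # name -> sorted list of required input types
        self.tokens = {}      # id -> {"type", "owner", "consumed", "history"}
        self._next_id = 0

    def add_blueprint(self, name, requires=()):
        self.blueprints[name] = sorted(requires)

    def mint(self, name, owner, inputs=()):
        # Minting condition: the input tokens' types must match the blueprint.
        input_types = sorted(self.tokens[t]["type"] for t in inputs)
        if input_types != self.blueprints[name]:
            raise ValueError("minting condition not satisfied")
        for t in inputs:                      # transformation event: inputs
            self.tokens[t]["consumed"] = True  # are consumed, but their
            self.tokens[t]["history"].append("consumed")  # history is kept
        tid = self._next_id
        self._next_id += 1
        self.tokens[tid] = {"type": name, "owner": owner, "consumed": False,
                            "history": [("minted", list(inputs))]}
        return tid

    def transfer(self, tid, new_owner):       # transaction event
        if self.tokens[tid]["consumed"]:
            raise ValueError("token no longer active")
        self.tokens[tid]["owner"] = new_owner
        self.tokens[tid]["history"].append(("transferred", new_owner))
```

Minting, say, a "bike" token from two "wheel" tokens marks the wheels as consumed while preserving their histories, so the full object flow remains traceable from the finished token back through its inputs.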
Finally, this dissertation provides a scenario-based evaluation based on two industrial case studies from a manufacturing and logistics perspective to validate the architecture’s capabilities when applied in real-world industrial settings. The proposed blockchain-based traceability architecture thus covers all object-related supply chain events derived from the two industrial case studies and therefore proves its general-purpose end-to-end traceability capabilities of object flows.
Today's pattern-making methods for industrial purposes include construction principles based on mathematical formulas and sizing charts. The results are two-dimensional flats, which can be converted into a three-dimensional garment. Because of their high linearity, those patterns are incapable of recreating the complexity of the human body, which results in insufficient fit. Subsequent changes to the pattern require a high degree of experience and lead to an inefficient product development process. It is known that draping allows the development of more complex and demanding patterns, which correspond more closely to the actual body shape. Therefore, this method is used in custom tailoring and haute couture to achieve perfect garment fit, but it is also associated with considerable time expenditure.
The challenge is therefore to improve the fit of garments and to speed up production while maintaining good value for money. Reutlingen University is working on the development of 3D-modelled body shapes for 3D draping, considering different layers of clothing, such as jackets or coats. For this purpose, 3D modelling is used to develop 3D bodies that correspond to the finished dimensions of the garment. By flattening the modelled body, it is then possible to obtain an optimal 2D pattern of the body. The comparison of the conventional method and the developed method is done by 3D simulation.
Finally, the visual fit test of the simulated basic cuts demonstrates that the newly developed methodology achieves a significantly better body wrapping. Unlike the basic cuts created using classical design principles, only a few adjustments are necessary to obtain an optimized basic cut. When considering the body distance, it is also shown that the newly developed basic patterns provide a more even enclosure of the body.
This paper presents the preliminary results of a set of research projects being developed at the distributed resources laboratory at the University of Reutlingen. The main aim of these projects is to couple distributed ledger technologies (DLTs) with distributed control of microgrids. Firstly, a DLT-based solution for a local market platform has been developed. This enables end customers to participate in new local micro-energy markets by providing them with a distributed, decentralized, transparent and secure peer-to-peer (P2P) payment system. Secondly, this solution has been integrated with an autonomous (agent-based) grid management. The integrated solution of both the market platform and the agent-based control has been implemented and tested in a real microgrid with different distributed components such as a PV system, CHP and different kinds of controllable loads. This microgrid is located in the distributed energy resources laboratory at the University of Reutlingen. Thirdly, the resulting solution is being implemented as an easy-to-customize market solution by AC2SG Software Oy, a Finland-based software company developing solutions for the Indian market. In a next phase, the solution is going to be tested in a real environment in off-grid systems in India.
Cloud resources can be dynamically provisioned according to application-specific requirements and are paid for on a per-use basis. This gives rise to a new concept for parallel processing: elastic parallel computations. However, it is still an open research question to which extent parallel applications can benefit from elastic scaling, which requires resource adaptation at runtime and corresponding coordination mechanisms. In this work, we analyze how to address these system-level challenges in the context of developing and operating elastic parallel tree search applications. Based on our findings, we discuss the design and implementation of TASKWORK, a cloud-aware runtime system specifically designed for elastic parallel tree search, which enables the implementation of elastic applications by means of higher-level development frameworks. We show how to implement an elastic parallel branch-and-bound application based on an exemplary development framework and report on our experimental evaluation that also considers several benchmarks for parallel tree search.
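Branch-and-bound, the application class named above, explores a search tree while pruning subtrees whose optimistic bound cannot beat the incumbent solution; every unexplored subtree is exactly the kind of task an elastic runtime such as TASKWORK could hand to another worker. A minimal sequential sketch for the 0/1 knapsack problem (an illustrative stand-in, not TASKWORK code):

```python
def branch_and_bound_knapsack(values, weights, capacity):
    """0/1 knapsack via depth-first branch-and-bound with a
    fractional-relaxation upper bound; returns the best total value."""
    # Sort items by value density so the fractional bound is tight.
    order = sorted(range(len(values)),
                   key=lambda i: values[i] / weights[i], reverse=True)
    v = [values[i] for i in order]
    w = [weights[i] for i in order]
    best = 0

    def bound(i, value, cap):
        # Optimistic bound: greedily fill remaining capacity, allowing
        # a fraction of the first item that no longer fits.
        while i < len(v) and w[i] <= cap:
            value += v[i]
            cap -= w[i]
            i += 1
        if i < len(v):
            value += v[i] * cap / w[i]
        return value

    def search(i, value, cap):
        nonlocal best
        if value > best:
            best = value
        if i == len(v) or bound(i, value, cap) <= best:
            return  # prune: this subtree cannot beat the incumbent
        if w[i] <= cap:
            search(i + 1, value + v[i], cap - w[i])  # include item i
        search(i + 1, value, cap)                    # exclude item i

    search(0, 0, capacity)
    return best
```

In an elastic setting, the two recursive calls would instead be enqueued as tasks, with the shared incumbent `best` maintained by the coordination layer.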
Rapidly changing market conditions and global competition are leading to an increasing complexity of logistics systems and require innovative approaches with respect to the organisation and control of these systems. In scientific research, concepts of autonomously controlled logistics systems show a promising approach to meet the increasing requirements for flexible and efficient order processing. In this context, this work aims to introduce a system that is able to adjust order processing dynamically, and optimise intralogistics transportation regarding various generic intralogistics target criteria. The logistics system under consideration consists of various means of transport for autonomous decision-making and fulfilment of transport orders with defined source-sink relationships. The context of this work is set by introducing the Learning Factory Werk 150 with its existing hardware and software infrastructure and its defined target figures to measure the performance of the system. Specifically, the important target figures cost and performance are considered for the transportation system. The core idea of the system’s logic is to solve the problem of order allocation to specific means of transport by linking a Genetic Algorithm with a Multi-Agent System. The implementation of the developed system is described in an application scenario at the learning factory.
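The core idea above, linking a genetic algorithm to the order-allocation problem, can be sketched as follows. This is an illustrative toy under stated assumptions (the cost matrix, operators and parameters are all hypothetical, not the Werk150 implementation): a genome assigns each transport order to one means of transport, and selection, one-point crossover and mutation search for a low-cost assignment.

```python
import random

def ga_assign(costs, generations=100, pop_size=20, seed=42):
    """Assign each transport order to one means of transport.

    costs[i][j] = cost of executing order i with transport means j
    (a stand-in for the cost/performance targets named in the abstract).
    Returns the best assignment found and its total cost.
    """
    rng = random.Random(seed)
    n_orders, n_means = len(costs), len(costs[0])

    def total(assign):
        return sum(costs[i][j] for i, j in enumerate(assign))

    # Random initial population of assignments.
    pop = [[rng.randrange(n_means) for _ in range(n_orders)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=total)
        survivors = pop[:pop_size // 2]           # selection (elitist)
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_orders)      # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:                # mutation
                child[rng.randrange(n_orders)] = rng.randrange(n_means)
            children.append(child)
        pop = survivors + children
    best = min(pop, key=total)
    return best, total(best)
```

In the described system, evaluating `total` would be delegated to the agents of the multi-agent system, each reporting the cost of executing its candidate transport orders.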
There is no denying that organizations, whether domestic or global, whether educational, governmental, or business, are undergoing rapid transformation. However, what is causing it? Prompted by the need to remain relevant and competitive, organizations constantly try to reinvent themselves. Those that do not, according to the laws of economics, will simply serve no purpose and will eventually cease to exist. Regardless of sector or industry, an organization's success pivots around its human talent. Hence, it is crucial to manage it and cultivate certain traits, knowledge, and skills. In today's global economy, organizations are more interconnected than ever before and thus the challenges they face require that employees possess not only expert knowledge, problem-solving, cross-cultural, and cross-functional teaming skills, but also good communications skills and agile thinking.
The purpose of this paper was to develop a collaborative framework that provides wine bottling facilities, wine cellars and their direct supply chain partners with guidelines to facilitate a collaborative partnership, aiming to aid responsive decision making and improve reliability. The framework was developed using a triangulation approach, consisting of an in-depth literature review, 14 semi-structured interviews with industry experts and a theoretical case study. The developed framework was presented to wine bottling facilities and their supply chain stakeholders. Indications are that the proposed wine industry collaborative framework should enhance supply chain collaboration and will contribute towards the guidance and facilitation of developing collaboration platforms to align supply chain operations, while improving bottling responsiveness and meeting demand requirements.
This study determines the correlation between industry-specific success patterns of Germany’s engineering industry and the business models applied within. In order to identify this correlation, the following objectives are addressed within the framework of this paper: (1) identification and description of business models used by Germany’s engineering industry; (2) analysis of industry-specific success patterns of Germany’s engineering industry by the usage of Key-Performance-Indicators (KPIs); and (3) determination of correlation between the KPIs and Germany’s engineering industry’s business models’ effectiveness. These objectives are mainly achieved by literature research and expert surveys. The findings highlight the KPIs (overall 41) that are relevant for the respective business models. This enables a better understanding of the interrelationships of the business model, in order to derive relevant conclusions. The paper contributes to the literature as it advances this field of research in Germany, and it is one of the first studies to examine the relationship between business models and industry-specific success patterns with relevant KPIs.
Purpose – This paper aims to determine the affecting factors of the brand authenticity of startups in social media.
Design/methodology/approach – Using a qualitative method based on a grounded theory approach, this research specifies and classifies the affecting factors of brand authenticity of startups in social media through in-depth semi-structured interviews.
Findings – Multiple factors affecting the brand authenticity of startups in social media are determined and categorized as indexical, iconic and existential cues through this research. Connection to heritage and having credible support are determined as indexical cues. Founder intellectuality, brand intellectuality, commitment toward customers and proactive, clear and interesting communications are identified as iconic cues. Having self-confidence and self-satisfaction, having intimacy with the brand and a joyful feeling for interactions with the community around the brand are determined as existential cues in this research. This research furthers previous arguments on the multiplicity of brand authenticity by shedding light on the relationship between the different aspects of authenticity and the way the different affecting factors can be organized together. Consumers ultimately form a strengthened perception of brand authenticity through existential cues that reflect the cues of the other aspects (iconic and indexical) once these have passed through the goal-based assessment and self-authentication filter.
Research limitations/implications – The research sampling population could be more diversified in terms of sociodemographic attributes. Given the qualitative methodology of this research, assessment of the findings through quantitative methods can be considered in future research.
Practical implications – Using the findings of this research, startup managers can properly build a perception of authenticity in their consumers’ minds by using alternative factors when major indexical cues such as heritage are lacking. This research helps startup businesses design their brand communications better to convey their authenticity to their audiences.
Originality/value – This research determines the factors affecting the authenticity of startup brands in social media. It also defines the process of authenticity perception through different aspects of brand authenticity.
Selecting a suitable development method for a specific project context is one of the most challenging activities in process design. Every project is unique and, thus, many context factors have to be considered. Recent research took some initial steps towards statistically constructing hybrid development methods, yet paid little attention to the peculiarities of context factors influencing method and practice selection. In this paper, we utilize exploratory factor analysis and logistic regression analysis to learn such context factors and to identify methods that are correlated with these factors. Our analysis is based on 829 data points from the HELENA dataset. We provide five base clusters of methods consisting of up to 10 methods that lay the foundation for devising hybrid development methods. The analysis of the five clusters using trained models reveals only a few context factors, e.g., project/product size and target application domain, that seem to significantly influence the selection of methods. An extended descriptive analysis of these practices in the context of the identified method clusters also suggests a consolidation of the relevant practice sets used in specific project contexts.
Selecting a suitable development method for a specific project context is one of the most challenging activities in process design. To extend the so far statistical construction of hybrid development methods, we analyze 829 data points to investigate which context factors influence the choice of methods or practices. Using exploratory factor analysis, we derive five base clusters consisting of up to 10 methods. Logistic regression analysis then reveals which context factors have an influence on the integration of methods from these clusters in the development process. Our results indicate that only a few context factors including project/product size and target application domain significantly influence the choice. This summary refers to the paper “Determining Context Factors for Hybrid Development Methods with Trained Models”. This paper was published in the proceedings of the International Conference on Software and System Process in 2020.
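As a minimal illustration of the second analysis step described above, the sketch below fits a logistic regression relating hypothetical context factors (project size, target domain) to the use of a method cluster. The data is synthetic and stands in for the HELENA sample; the coefficient values and factor choices are illustrative assumptions, not the paper's results.

```python
import numpy as np

# Hypothetical illustration: logistic regression relating context factors
# (project size, target domain) to whether a method cluster is used.
rng = np.random.default_rng(0)
n = 829  # number of data points, matching the HELENA sample size

# Synthetic context factors: standardized project size, binary domain flag.
size = rng.normal(size=n)
domain = rng.integers(0, 2, size=n)
X = np.column_stack([np.ones(n), size, domain])

# Synthetic ground truth: larger projects favor this cluster (assumed).
logits = -0.5 + 1.2 * size + 0.8 * domain
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(float)

# Fit by plain gradient descent on the average log-loss.
w = np.zeros(3)
for _ in range(5000):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - y) / n

# Positive coefficients indicate factors that raise the odds of selection.
print(np.round(w, 2))
```

A significantly positive coefficient would mark a context factor as influential for integrating methods from that cluster, mirroring the kind of conclusion drawn for project/product size and application domain.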
Determination of the gel point of formaldehyde-based wood adhesives by using a multiwave technique
(2023)
Determining the instant of gelation of formaldehyde-based wood adhesives as an assessment parameter for their curing rate is important for optimizing the curing behavior. Due to the stoichiometrically imbalanced networks of formaldehyde-based adhesives, the crossover point of the storage modulus G′ and the loss modulus G″ cannot unconditionally be assumed as the gel point in oscillatory time sweeps, as the material response is frequency-dependent. This study aims to determine the gel point of selected adhesives by the isothermal multiwave oscillatory shear test. A thorough comparison between the gel point and the crossover point of G′ and G″ is performed. Rheokinetic analysis showed no significant difference between the activation energies calculated at the gel point determined by a multiwave test and at the crossover point obtained by the time sweep test. Hence, for resins with similar curing reactions, a reliable determination of the gel point by applying a multiwave test is needed for a comparison of their reactivity.
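A multiwave gel-point determination is commonly based on the criterion that at gelation the loss tangent tan(δ) becomes independent of frequency (the Winter–Chambon criterion). The sketch below illustrates that detection step on synthetic curves; the curve shapes, frequencies and gel time are invented for illustration and are not the paper's data.

```python
import numpy as np

# Illustrative sketch (synthetic data): in a multiwave test the gel point
# is taken as the time at which tan(delta) is the same at all superimposed
# frequencies (frequency-independent loss tangent).
t = np.linspace(0, 600, 601)          # curing time in seconds
freqs = [1.0, 2.0, 4.0, 8.0]          # superimposed test frequencies in Hz
t_gel_true = 300.0                    # assumed gel time of the model resin

# Synthetic tan(delta) curves that intersect at the true gel time.
tan_delta = np.array([1.0 + 0.002 * (f - 4.0) * (t - t_gel_true)
                      for f in freqs])

# Gel point estimate: the time of minimal spread across frequencies.
spread = tan_delta.max(axis=0) - tan_delta.min(axis=0)
t_gel = t[np.argmin(spread)]
print(t_gel)  # → 300.0
```

In contrast, the G′/G″ crossover used in a single-frequency time sweep would shift with the chosen frequency, which is why it cannot unconditionally be equated with the gel point.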
Determination of accelerometer sensor position for respiration rate detection: initial research
(2022)
Continuous monitoring of a patient's vital signs is essential in many chronic illnesses. The respiratory rate (RR) is one of the vital signs indicating breathing diseases. This article presents an initial investigation for determining the accelerometer sensor position of a non-invasive and unobtrusive respiratory rate monitoring system. The research aims to determine the sensor position on the patient that provides the most accurate values of this physiological parameter. To achieve this, a dedicated system setup, including a mechanical sensor holder construction, was used. Breathing signals from 5 participants in a relaxed state were analyzed. The main criterion for selecting a suitable sensor position was each participant's average acceleration amplitude excursion, which corresponds to the respiratory signal. As a result, we determined one more important parameter for the considered system that had not been defined before.
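The selection criterion described above can be sketched as follows on synthetic signals: for each candidate sensor position, compute the average peak-to-peak acceleration excursion of the breathing-band signal and pick the position with the largest excursion. The position names, amplitudes and sampling parameters are hypothetical, not taken from the study.

```python
import numpy as np

# Hypothetical sketch of the selection criterion on synthetic signals.
rng = np.random.default_rng(1)
fs, duration = 50, 60                  # sampling rate in Hz, duration in s
t = np.arange(fs * duration) / fs
rr_hz = 15 / 60                        # assumed 15 breaths per minute

def excursion(amplitude, noise):
    """Average peak-to-peak excursion over breath-length windows."""
    sig = amplitude * np.sin(2 * np.pi * rr_hz * t) \
          + noise * rng.normal(size=t.size)
    win = int(fs / rr_hz)              # samples per breathing cycle
    chunks = sig[: len(sig) // win * win].reshape(-1, win)
    return float(np.mean(chunks.max(axis=1) - chunks.min(axis=1)))

# Invented candidate positions with assumed signal amplitudes.
positions = {"sternum": excursion(1.0, 0.05),
             "abdomen": excursion(1.4, 0.05),
             "shoulder": excursion(0.3, 0.05)}
best = max(positions, key=positions.get)
print(best)  # → abdomen
```

The position with the largest average excursion carries the strongest respiratory component and is therefore the preferred mounting point under this criterion.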
A lens-based Raman spectrometer is characterized by studying the optical elements in the optical path and by measuring aberration and diffraction effects. This is achieved by measuring the spectral resolution (SR), thus encompassing almost all optical elements of a spectrometer that are mostly responsible for such effects. An equation for SR is used to determine the quality factor Q, which measures aberration/diffraction effects occurring in a spectrometer. We show how the quality factor changes with different spectrometer parameters such as grating groove density, the wavelength of excitation, pinhole width, charge-coupled device (CCD) pixel density, etc. This work provides insight into the quality of a spectrometer and helps to monitor its performance over a certain period. Commercially available and home-built spectrometers alike are prone to misalignment of optical elements and can benefit from this work, which allows maintaining the overall quality of the setup. Performing such experiments regularly helps to minimize the aberration/diffraction effects that arise over time and to maintain the quality of measurements.
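The paper's SR equation and the definition of Q are not reproduced here. As a generic illustration of the measurement step it builds on, the spectral resolution of a spectrometer is often estimated from the full width at half maximum (FWHM) of an isolated calibration line recorded on the CCD; broadening beyond the design-limited width then indicates aberration or misalignment. The sketch below uses a synthetic Gaussian line.

```python
import numpy as np

# Generic sketch: estimate spectral resolution as the FWHM of a synthetic
# Gaussian calibration line on a pixel axis (data and widths are invented).
pixels = np.arange(200)
center, sigma = 100.0, 3.0
line = np.exp(-0.5 * ((pixels - center) / sigma) ** 2)

# FWHM from the half-maximum crossings (sub-pixel interpolation omitted).
above = pixels[line >= 0.5 * line.max()]
fwhm = above[-1] - above[0]   # pixel-quantized estimate of 2.355 * sigma
print(fwhm)
```

Tracking this width over time for the same calibration line is one simple way to monitor whether the setup's optical quality is degrading.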
Determinants of customer recovery in retail banking - lessons from a German banking case study
(2023)
Due to the increased willingness of retail banking customers to switch and churn their banking relationships, a question arises: Is it possible to win back lost customers, and if so, is such a possibility even desirable after all economic factors have been considered? To answer these questions, this paper examines selected determinants for the recovery of terminated customer–bank relationships from the perspective of former customers. This study therefore evaluates for the first time, empirically and systematically with reference to a German Sparkasse as a case-study setting, whether lost customers have a sufficient general willingness to return (GWR) to a retail banking relationship. Our results show a correlation between the GWR to a banking relationship and some specific determinants: variety seeking, attractiveness of alternatives and customer satisfaction with the former business relationship. In addition, we show that a customer’s GWR varies depending on the reason for churn and is surprisingly greater when the customer defected for reasons that lie within the customer’s own sphere. Despite the case-study character, our results provide relevant insights for other banks; in particular, this applies to countries with a comparable banking system.
Omnichannel retailing and sustainability are two important challenges for the fast fashion industry. However, the sustainable behavior of fast fashion consumers in an omnichannel environment has not received much attention from researchers. This paper aims to examine the factors that determine consumers’ willingness to participate in fast fashion brands’ used clothes recycling plans in an omnichannel retail environment. In particular, we examine the impact of individual consumer characteristics (environmental attitudes, consumer satisfaction), organizational arrangements constitutive for omnichannel retailing (channel integration), and their interplay (brand identification, impulsive consumption). A conceptual model was developed based on findings from previous research and tested on data that were collected online from Chinese fast fashion consumers. Findings suggest that consumers’ intentions for clothes recycling are mainly determined by individual factors, such as environmental attitudes and consumer satisfaction. Organizational arrangements (perceived channel integration) showed smaller effects. This study contributes to the literature on omnichannel (clothing) retail, as well as on sustainability in the clothing industry, by elucidating individual and organizational determinants of consumers’ recycling intentions for used clothes in an omnichannel environment. It helps retailers to organize used clothes recycling plans in an omnichannel environment and to motivate consumers to participate in them.
The proposed approach applies current unsupervised clustering approaches in a different, dynamic manner. Instead of taking all the data as input and finding clusters among them, the given approach clusters Holter ECG data (long-term electrocardiography data from a Holter monitor) on a given interval, which enables a dynamic clustering approach (DCA). To this end, advanced clustering techniques based on the well-known Dynamic Time Warping algorithm are used. Having clusters, e.g. on a daily basis, the clusters can be compared by defining cluster shape properties. This gives a measure for variation in unsupervised cluster shapes and may reveal unknown changes in health status. Embedding this approach into wearable devices offers advantages over current techniques. On the one hand, users get feedback if their ECG data characteristics change unforeseeably over time, which makes early detection possible. On the other hand, cluster properties such as the biggest or smallest cluster may help a doctor in making diagnoses or observing several patients. Furthermore, known processing techniques such as stress detection or arrhythmia classification may be applied to the found clusters.
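The Dynamic Time Warping distance underlying the clustering step can be sketched in a few lines. The example below compares synthetic beat-like signals (invented, not Holter data): DTW tolerates a small temporal shift between two beats of the same morphology but still separates a beat of different morphology, which is exactly the property that makes it suitable for grouping ECG intervals.

```python
import numpy as np

# Minimal DTW distance by dynamic programming (absolute-difference cost).
def dtw(a, b):
    n, m = len(a), len(b)
    d = np.full((n + 1, m + 1), np.inf)
    d[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i, j] = cost + min(d[i - 1, j], d[i, j - 1], d[i - 1, j - 1])
    return d[n, m]

# Synthetic stand-ins for beat templates from different days.
t = np.linspace(0, 1, 50)
normal = np.sin(2 * np.pi * t)
shifted = np.sin(2 * np.pi * (t - 0.05))   # same shape, slightly delayed
deviant = np.sin(6 * np.pi * t)            # different morphology

# DTW absorbs the time shift but flags the morphological change.
print(dtw(normal, shifted) < dtw(normal, deviant))  # → True
```

Pairwise DTW distances like these can feed any standard clustering routine (e.g. hierarchical clustering) to obtain the per-interval clusters whose shape properties are then compared.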
Uncontrolled movements of laparoscopic instruments can lead to inadvertent injury of adjacent structures. The risk becomes evident when the dissecting instrument is located outside the field of view of the laparoscopic camera. Technical solutions to ensure patient safety are appreciated. The present work evaluated the feasibility of an automated binary classification of laparoscopic image data using Convolutional Neural Networks (CNN) to determine whether the dissecting instrument is located within the laparoscopic image section. A unique record of images was generated from six laparoscopic cholecystectomies in a surgical training environment to configure and train the CNN. By using a temporary version of the neural network, the annotation of the training image files could be automated and accelerated. A combination of oversampling and selective data augmentation was used to enlarge the fully labelled image data set and prevent loss of accuracy due to imbalanced class volumes. Subsequently, the same approach was applied to the comprehensive, fully annotated Cholec80 database. The described process led to the generation of extensive and balanced training image data sets. The performance of the CNN-based binary classifiers was evaluated on separate test records from both databases. On our recorded data, an accuracy of 0.88 with regard to the safety-relevant classification was achieved. The subsequent evaluation on the Cholec80 data set yielded an accuracy of 0.84. The presented results demonstrate the feasibility of a binary classification of laparoscopic image data for the detection of adverse events in a surgical training environment using a specifically configured CNN architecture.
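The class-balancing step described above (oversampling combined with augmentation) can be sketched generically: the minority class is resampled, with an augmentation hook applied to the duplicated samples, until both classes match in size. The function, sample values and augmentation placeholder below are illustrative assumptions, not the paper's implementation.

```python
import random

# Hypothetical sketch: oversample the minority class, passing duplicates
# through an augmentation hook, until all classes reach the majority size.
def balance(samples, labels, augment=lambda x: x, seed=0):
    rng = random.Random(seed)
    by_class = {}
    for s, y in zip(samples, labels):
        by_class.setdefault(y, []).append(s)
    target = max(len(v) for v in by_class.values())
    out = []
    for y, group in by_class.items():
        out.extend((s, y) for s in group)
        # Duplicate (augmented) minority samples up to the target count.
        while sum(1 for _, yy in out if yy == y) < target:
            out.append((augment(rng.choice(group)), y))
    return out

# Toy data: five "instrument visible" images vs. one "not visible" image.
data = balance(["a", "b", "c", "d", "e", "f"], [1, 1, 1, 1, 1, 0])
counts = {y: sum(1 for _, yy in data if yy == y) for y in (0, 1)}
print(counts)  # → {0: 5, 1: 5}
```

In practice the `augment` hook would apply image transformations (flips, crops, brightness shifts) so the duplicated minority samples are not exact copies.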
Detecting the adherence of driving rules in an energy-efficient, safe and adaptive driving system
(2016)
An adaptive and rule-based driving system is being developed that tries to improve driving behavior in terms of energy efficiency and safety by giving recommendations. To do so, the driving system has to monitor adherence to driving rules by matching the rules against the driving behavior. However, existing rule matching algorithms are not sufficient, as the data within a driving system changes frequently. In this paper a rule matching algorithm is introduced that is able to handle frequently changing data within the context of the driving system. 15 journeys were used to evaluate the performance of the rule matching algorithms. The results showed that the introduced algorithm outperforms existing algorithms in the context of the driving system. Thus, the introduced algorithm is suited for matching frequently changing data against rules with higher performance, which is why it will be used in the driving system for the detection of broken energy-efficiency- or safety-relevant driving rules.
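One way to match rules against frequently changing data, sketched below, is to re-evaluate only the rules that reference a signal that actually changed, rather than all rules on every update. The rule names, signals and thresholds are invented for illustration; the paper's actual algorithm is not reproduced here.

```python
# Hypothetical sketch: rules are predicates over named driving signals;
# on each update only the rules touching a changed signal are re-checked.
rules = {
    "speed_limit": (("speed",), lambda d: d["speed"] <= 120),
    "smooth_braking": (("brake",), lambda d: d["brake"] <= 0.6),
}

state = {"speed": 100, "brake": 0.2}   # current driving data

def on_update(changed: dict) -> list:
    """Apply a data update and return the rules broken by it."""
    state.update(changed)
    broken = []
    for name, (signals, check) in rules.items():
        # Skip rules whose signals did not change in this update.
        if any(s in changed for s in signals) and not check(state):
            broken.append(name)
    return broken

print(on_update({"speed": 130}))  # → ['speed_limit']
print(on_update({"brake": 0.3}))  # → []
```

Indexing rules by the signals they reference keeps the per-update work proportional to the changed data rather than to the full rule set, which is the property needed when driving data arrives continuously.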
We present an approach for segmenting individual cells and lamellipodia in epithelial cell clusters using fully convolutional neural networks. The method will form the basis for measuring cell cluster dynamics and expansion to improve the investigation of collective cell migration phenomena. The fully learning-based front-end avoids classical feature engineering, yet the network architecture needs to be designed carefully. Our network predicts how likely each pixel belongs to one of the classes and, thus, is able to segment the image. Besides characterizing segmentation performance, we discuss how the network will be further employed.
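The prediction step described above can be sketched generically: a fully convolutional network outputs one score map per class, a per-pixel softmax turns the scores into class probabilities, and the per-pixel argmax yields the segmentation mask. The scores below are random stand-ins for a trained network's logits, and the class set is an assumption.

```python
import numpy as np

# Generic sketch of per-pixel classification from FCN-style score maps.
H, W, C = 4, 4, 3                       # e.g. background, cell, lamellipodium
rng = np.random.default_rng(2)
scores = rng.normal(size=(H, W, C))     # stand-in for the network's logits

# Numerically stable softmax over the class axis.
e = np.exp(scores - scores.max(axis=-1, keepdims=True))
probs = e / e.sum(axis=-1, keepdims=True)

# Segmentation mask: the most likely class at each pixel.
mask = probs.argmax(axis=-1)
print(mask.shape)  # → (4, 4)
```

Each pixel's probabilities sum to one, so `probs` can also be thresholded or post-processed instead of taking the hard argmax.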
Some widely used optical measurement systems require a scan in wavelength or in one spatial dimension to measure the topography in all three dimensions. Novel hyperspectral sensors based on an extended Bayer pattern have a high potential to solve this issue as they can measure three dimensions in a single shot. This paper presents a detailed examination of a hyperspectral sensor including a description of the measurement setup. The evaluated sensor (Ximea MQ022HG-IM-SM5X5-NIR) offers 25 channels based on Fabry–Pérot filters. The setup illuminates the sensor with discrete wavelengths under a specified angle of incidence. This allows characterization of the spatial and angular response of every channel of each macropixel of the tested sensor to the illumination. The results of the characterization form the basis for a spectral reconstruction of the signal, which is essential to obtain an accurate spectral image. It turned out that irregularities of the signal response for the individual filters are present across the whole sensor.
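The spectral reconstruction that the characterization enables can be sketched as a linear inverse problem: given the measured per-channel filter responses assembled into a matrix R, the raw macropixel reading m = R s is inverted (e.g. by least squares) to recover the spectrum s. The Gaussian-shaped responses and dimensions below are synthetic illustrations, not the measured responses of the Ximea sensor.

```python
import numpy as np

# Hypothetical sketch: invert synthetic filter responses to reconstruct a
# spectrum from one macropixel's 25 channel readings.
rng = np.random.default_rng(3)
n_channels, n_bins = 25, 25

# Synthetic, overlapping Fabry-Perot-like channel responses (invented).
centers = np.linspace(0, n_bins - 1, n_channels)
bins = np.arange(n_bins)
R = np.exp(-0.5 * ((bins[None, :] - centers[:, None]) / 1.5) ** 2)

s_true = np.zeros(n_bins)
s_true[10] = 1.0                 # a single spectral line
m = R @ s_true                   # simulated macropixel reading

# Least-squares spectral reconstruction from the channel readings.
s_rec, *_ = np.linalg.lstsq(R, m, rcond=None)
print(int(np.argmax(s_rec)))  # → 10
```

This is why the measured irregularities matter: if R is taken as nominally identical for every macropixel while the real responses vary across the sensor, the reconstructed spectra inherit those errors.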
Hyperspectral imaging opens up a wide field of applications. It is a well-established technique in agriculture, medicine, mineralogy and many other fields. Most commercial hyperspectral sensors are able to record spectral information along one spatial dimension in a single acquisition; for the second spatial dimension a scan is required. Besides those systems there is a novel technique that allows sensing a two-dimensional scene and its spectral information within one shot. This increases the speed of hyperspectral imaging, which is interesting for metrology tasks under rough environmental conditions. In this article we present a detailed characterization of such a snapshot sensor for later use in a snapshot full-field chromatic confocal system. The sensor (Ximea MQ022HG-IM-SM5X5-NIR) is based on the so-called snapshot mosaic technique, which offers 25 bands mapped to one so-called macropixel. The different bands are realized by a spatially repeating pattern of Fabry–Pérot filters, which are monolithically fabricated on the camera chip.
The intelligent recycling of plastics waste is a major concern. Because of the widespread use of polyethylene terephthalate (PET), considerable amounts of PET waste are generated that are ideally re-introduced into the material cycle by generating second-generation products without loss of materials performance. Chemical recycling methods are often expensive and entail environmentally hazardous by-products. Established mechanical methods generally provide materials of reduced quality, leading to products of lower quality. These drawbacks can be avoided by the development of new recycling methods that provide materials of high quality in every step of the production cycle. In the present work, oligomeric ethylene terephthalate with defined degrees of polymerization and defined molecular weight is produced by melt-mixing PET with different quantities of adipic acid as an alternative to conventional PET recycling pathways, offering ecological and economic advantages. Additionally, block copolyesters of defined block length are designed from the oligomeric products.
The proliferation and convergence of SMACIT digital technologies (social, mobile, analytics, cloud, and Internet of Things) has created significant threats and opportunities for established companies. Business leaders must rethink their business strategies and develop what we refer to as a digital strategy. Our research shows four keys to successfully defining and executing a digital strategy:
1. zeroing in on a customer engagement or digitized solutions strategy to guide the transformation,
2. building operational excellence,
3. creating a powerful digital services backbone to facilitate rapid innovation and responsiveness, and
4. ensuring ongoing organizational redesign.
A list of publications from the research is provided at the end of this document.