Refine
Year of publication
- 2019 (248)
Document Type
- Conference proceeding (110)
- Journal article (98)
- Book chapter (26)
- Book (3)
- Doctoral Thesis (3)
- Report (3)
- Issue of a journal (2)
- Working Paper (2)
- Anthology (1)
Has full text
- yes (248)
Is part of the Bibliography
- yes (248)
Institute
- ESB Business School (90)
- Informatik (85)
- Technik (35)
- Life Sciences (25)
- Texoversum (11)
Publisher
- Hochschule Reutlingen (45)
- IEEE (35)
- Springer (28)
- Elsevier (16)
- Deutsche Gesellschaft für Computer- und Roboterassistierte Chirurgie e.V. (7)
- Stellenbosch University (7)
- Wiley (7)
- MDPI (6)
- MIM, Marken-Institut München (6)
- De Gruyter (5)
The future viability of human resource management can be measured by whether an organization has sufficient personnel, in both quality and quantity, to fulfil its purpose in dynamic environments. An important starting point is making staffing more flexible and opening organizations institutionally and structurally towards greater agility. Building on this, HR management itself must become more innovative through new ways of working and practices, developing a second, agile operating system in addition to its stable core. The temporally and structurally coordinated interplay of the stable and the agile operating system then enables the simultaneous use of exploitative and explorative practices. To advance the agility agenda of HR management further, a systematic approach is needed to the significance of different dimensions of agility, to the development of instruments, and to the objectives that an agile HR management should pursue.
This paper makes temporal predictions of earthquakes. To this end, Convolutional Neural Networks (CNNs) are trained on a dataset of laboratory earthquakes. The trained networks make predictions by classifying a window of seismic input data; through this classification, the CNN predicts the time remaining until the next earthquake. Two approaches are compared. In the first, the raw data are fed into a CNN. In the second, the data are preprocessed with Mel Frequency Cepstral Coefficients (MFCC) before the CNN. Both approaches allow good classification, with the combination of MFCC and CNN delivering the better quantitative results at an accuracy of 65%.
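The MFCC preprocessing step mentioned above can be sketched with a minimal, numpy-only implementation; the frame length, hop size and sampling rate below are hypothetical placeholders, since the abstract does not specify the settings used:

```python
import numpy as np

def mfcc_features(signal, sr=4000, n_fft=256, hop=128, n_mels=20, n_mfcc=12):
    """Compute a simple MFCC matrix (frames x coefficients) from a 1-D signal.
    All parameter values are illustrative, not taken from the paper."""
    # cut the signal into overlapping, windowed frames
    n_frames = 1 + (len(signal) - n_fft) // hop
    frames = np.stack([signal[i * hop:i * hop + n_fft] for i in range(n_frames)])
    frames = frames * np.hanning(n_fft)
    # power spectrum per frame
    spec = np.abs(np.fft.rfft(frames, axis=1)) ** 2
    # triangular mel filter bank
    def hz_to_mel(f): return 2595.0 * np.log10(1.0 + f / 700.0)
    def mel_to_hz(m): return 700.0 * (10.0 ** (m / 2595.0) - 1.0)
    mel_pts = mel_to_hz(np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2), n_mels + 2))
    bins = np.floor((n_fft + 1) * mel_pts / sr).astype(int)
    fbank = np.zeros((n_mels, n_fft // 2 + 1))
    for m in range(1, n_mels + 1):
        l, c, r = bins[m - 1], bins[m], bins[m + 1]
        fbank[m - 1, l:c] = (np.arange(l, c) - l) / max(c - l, 1)
        fbank[m - 1, c:r] = (r - np.arange(c, r)) / max(r - c, 1)
    logmel = np.log(spec @ fbank.T + 1e-10)
    # DCT-II over the mel bands yields the cepstral coefficients
    n = np.arange(n_mels)
    dct = np.cos(np.pi / n_mels * (n[None, :] + 0.5) * np.arange(n_mfcc)[:, None])
    return logmel @ dct.T

rng = np.random.default_rng(0)
feats = mfcc_features(rng.standard_normal(4000))  # stand-in for a seismic window
print(feats.shape)
```

The resulting frames-by-coefficients matrix is what a CNN classifier would consume in place of the raw waveform.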
Digital transformation is changing business models across industries and is thus a key to long-term corporate success. In practice, however, it frequently fails for lack of proper implementation. The Digital Business Excellence approach is intended to help companies shape digital change successfully.
Context: Organizations are increasingly challenged by dynamic and technical market environments. Traditional product roadmapping practices such as detailed and fixed long-term planning typically fail in such environments. Therefore, companies are actively seeking ways to improve their product roadmapping approach. Goal: This paper aims at identifying problems and challenges with respect to product roadmapping. In addition, it aims at understanding how companies succeed in improving their roadmapping practices in their respective company contexts. The study focuses on mid-sized and large companies developing software-intensive products in dynamic and technical market environments. Method: We conducted semi-structured expert interviews with 15 experts from 13 German companies and performed a thematic data analysis. Results: The analysis showed that a significant number of companies are still struggling with traditional feature-based product roadmapping and opinion-based prioritization of features. The most promising areas for improvement are stating the outcomes a company is trying to achieve and making them part of the roadmap, sharing or co-developing the roadmap with stakeholders, and establishing discovery activities.
Among the multitude of software development processes available, hardly any is used by the book. Regardless of company size or industry sector, a majority of project teams and companies use customized processes that combine different development methods, so-called hybrid development methods. Even though such hybrid development methods are highly individualized, a common understanding of how to systematically construct synergetic practices is missing. In this paper, we make a first step towards devising such guidelines. Grounded in 1,467 data points from a large-scale online survey among practitioners, we study the current state of practice in process use to answer the question: what are hybrid development methods made of? Our findings reveal that only eight methods and a few practices build the core of modern software development. This small set allows for statistically constructing hybrid development methods. Using an 85% agreement level in the participants' selections, we provide two examples illustrating how hybrid development methods are characterized by the practices they are made of. Our evidence-based analysis approach lays the foundation for devising hybrid development methods.
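The 85% agreement level can be illustrated with a toy selection matrix; the practice names and selection probabilities below are invented for the example, and only the participant count and threshold come from the abstract:

```python
import numpy as np

# Hypothetical survey matrix: rows = participants, columns = practices,
# True where a participant's team uses that practice.
rng = np.random.default_rng(1)
practices = ["code review", "daily standup", "pair programming", "release planning"]
usage = rng.random((1467, len(practices))) < np.array([0.95, 0.97, 0.4, 0.92])

# A practice enters the statistically constructed hybrid method once at
# least 85 % of participants selected it (the paper's agreement level).
agreement = usage.mean(axis=0)
core = [p for p, a in zip(practices, agreement) if a >= 0.85]
print(core)
```

With these illustrative probabilities, only the widely used practices survive the threshold and form the constructed hybrid method.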
Water jacket systems are routinely used to control the temperature of Petri dish cell culture chambers. Despite their widespread use, the thermal characteristics of such systems have not been fully investigated. In this study, we conducted a comprehensive set of theoretical, numerical and experimental analyses to investigate the thermal characteristics of Petri dish chambers under stable and transient conditions. In particular, we investigated the temperature gradient along the radial axis of the Petri dish under stable conditions, and the transition period under transient conditions. Our studies indicate a radial temperature gradient of 3.3 °C along with a transition period of 27.5 min when increasing the sample temperature from 37 to 45 °C for a standard 35 mm diameter Petri dish. We characterized the temperature gradient and transition period under various operational, geometric, and environmental conditions. Under stable conditions, reducing the diameter of the Petri dish and incorporating a heater underneath the Petri dish can effectively reduce the temperature gradient across the sample. In comparison, under transient conditions, reducing the diameter of the Petri dish, reducing sample volume, and using glass Petri dish chambers can reduce the transition period.
Only at first glance does it seem incongruous to connect managers' decision-making with whitewater kayaking. The author knows both sides: he has been a whitewater kayaker for decades and, before his appointment as a professor, worked in the management of mid-sized companies and large corporations, most recently as managing director in plant engineering and in the energy industry. Even then, the chaotic, turbulent whitewater reminded him of corresponding processes within companies.
One hundred years after the founding of the Bauhaus, I enter the Bauhaus-Universität Weimar and wonder what it must have been like back then. A century ago, Gropius, Muche, Mies van der Rohe, Schlemmer and Itten lectured here, and now it is my turn. We are making history! Their spirit still pervades the original, preserved buildings and rooms. A closer look at the Bauhaus, which achieved world renown, has much to teach; for example, it is worth considering what marketing can learn from the Bauhaus.
For 26 years, Michael Wörz held the office of commissioner for engineering and business ethics at the Universities of Applied Sciences of the state of Baden-Württemberg. The interdisciplinarity this task demands was written into his professional record from the start: an engineer who studied philosophy and earned his doctorate in it, and who moreover pursued his scholarly work from the perspective of a sociologist's theories, had learned to think and work across disciplines. It fell to him to carry the cross-disciplinary knowledge necessary for the survival of future generations into the academic education of the Universities of Applied Sciences, and that is what Michael Wörz truly did throughout his tenure. Beyond these duties of a committed and innovative university teacher, he was also a contentious advocate of ethics and sustainable development in politics, helping to prepare the way for those teaching at the universities.
In this paper we describe an interactive web-based visual analysis tool for Formula One races. It first provides an overview of all races on a yearly basis in a calendar-like representation. From this starting point, races can be selected and visually inspected in detail. We support a dynamic race position diagram as well as a more detailed lap-times line plot for comparing the drivers' lap times. Many interaction techniques are supported, such as selection, filtering, highlighting, color coding, and details-on-demand. We illustrate the usefulness of our visualization tool by applying it to a Formula One dataset, describing the dynamic visual racing patterns for a number of selected races and drivers.
One of the main tasks of ethics as a scientific discipline is to provide decision-making aids for situations in which every conceivable course of action has advantages and/or disadvantages that cannot readily be weighed against one another. Such a situation is called a "dilemma".
Dilemma situations arise whenever the moral-ethical assessment of a situation differs from the assessment prescribed by the correct application of the legal rules. People in public administration are therefore particularly exposed to the risk of such conflicting objectives.
Private equity (PE) firms are investment firms that acquire equity shares in companies. The goal of PE firms is to exit the investment after a few years with a substantial increase in value. PE firms often claim to outperform the market, i.e. to create alpha.
The overall aim of this paper is to unravel the mystery of value creation in the PE industry. First, the author presents a conceptual framework for value creation in the PE industry based on a multiple valuation model that breaks down value creation into different elements. Second, the paper evaluates whether PE firms really create value by analysing and combining results from prior empirical studies based on the conceptual framework.
The results show that the existing empirical evidence is mixed, but that there is indeed a tendency toward positive evidence that PE firms create economic value on average. However, there are methodological difficulties in measuring value creation, and studies are often subject to bias. Finally, it is pointed out that the question of whether PE firms really create value has to be viewed from different perspectives, such as those of the PE firm, the investors and the portfolio companies.
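A multiple-based decomposition of value creation, as described in the framework above, can be illustrated numerically; the figures and the three classic levers below are hypothetical and are not taken from the paper:

```python
# Illustrative value-creation bridge for a PE deal (all numbers invented).
entry_ebitda, exit_ebitda = 100.0, 140.0      # operating earnings at entry/exit
entry_multiple, exit_multiple = 8.0, 9.0      # EV/EBITDA valuation multiples
net_debt_entry, net_debt_exit = 500.0, 300.0  # deleveraging over the holding period

equity_entry = entry_ebitda * entry_multiple - net_debt_entry
equity_exit = exit_ebitda * exit_multiple - net_debt_exit

# Decompose the equity uplift into three elements of value creation.
ebitda_growth = (exit_ebitda - entry_ebitda) * entry_multiple
multiple_expansion = (exit_multiple - entry_multiple) * exit_ebitda
deleveraging = net_debt_entry - net_debt_exit

total = equity_exit - equity_entry
print(total, ebitda_growth + multiple_expansion + deleveraging)  # both 660.0
```

With this split, earnings growth, multiple expansion and deleveraging sum exactly to the total equity uplift, which is what makes such a bridge useful for attributing value creation.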
The desire to combine advanced user-friendly interfaces with a product personality communicating environmental friendliness to customers poses new challenges for car interior designers, as little research has been carried out in this field to date. In this paper, the creation of three personas aimed at defining key German car users with pro-environmental behaviour is presented. After collecting ethnographic data on potential drivers through a literature review, information about generation and Euro car segment led to the definition of three key user groups. The resulting personas were applied to determine the most important interaction points in the car interior. Finally, present design cues of eco-friendly product personality developed in the field of automotive design were explored. Our work presents three strategic directions for the design development of future in-car user interfaces: a) foster multimodal mobility; b) emphasize the interlinkage between economy and sustainable driving; and c) highlight new technological developments. The presented results are meant as an impulse for developers to meet the needs of green customers and drivers when designing user-friendly HMI components.
Digital light microscopy techniques are among the most widely used methods in cell biology and medical research. Despite that, the automated classification of objects such as cells or specific parts of tissues in images is difficult. We present an approach to classify confluent cell layers in microscopy images using deep correlation features learned by deep neural networks. These deep correlation features are generated through Gram-based correlation features and are input to a neural network that learns the correlations between them. In this work we wanted to test whether a representation of cell data based on this is suitable for classification, as has been done for artworks with respect to their artistic period. The method generates images that contain recognizable characteristics of a specific cell type, for example the average size and the ordered pattern.
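The Gram-based correlation features mentioned above can be sketched as follows; the channel count and map size are hypothetical, and the computation mirrors the standard Gram matrix from style-transfer work rather than the authors' exact pipeline:

```python
import numpy as np

def gram_features(feature_maps):
    """Gram-based correlation features: pairwise channel correlations of CNN
    feature maps. Shapes are illustrative, not taken from the paper."""
    c, h, w = feature_maps.shape
    f = feature_maps.reshape(c, h * w)  # flatten spatial dimensions per channel
    return (f @ f.T) / (h * w)          # c x c symmetric correlation (Gram) matrix

rng = np.random.default_rng(0)
maps = rng.standard_normal((16, 8, 8))  # e.g. 16 channels of 8x8 activations
g = gram_features(maps)
print(g.shape)
```

Because the Gram matrix discards spatial layout and keeps only channel co-occurrence, it captures texture-like statistics, which is why the same idea transfers from artistic-period classification to cell-layer classification.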
Software evolvability is an important quality attribute, yet one difficult to grasp. A certain base level of it is allegedly provided by service- and microservice-based systems, but many software professionals lack a systematic understanding of the reasons and preconditions for this. We address this issue via the proxy of architectural modifiability tactics. By qualitatively mapping principles and patterns of Service-Oriented Architecture (SOA) and microservices onto tactics and analyzing the results, we can not only generate insights into service-oriented evolution qualities, but can also provide a modifiability comparison of the two popular service-based architectural styles. The results suggest that both SOA and microservices possess several inherent qualities beneficial for software evolution. While both focus strongly on loose coupling and encapsulation, there are also differences in the way they strive for modifiability (e.g. governance vs. evolutionary design). To leverage the insights of this research, however, it is necessary to find practical ways to incorporate the results as guidance into the software development process.
The coupling of the heat and power sectors is required as supply and demand in the German electricity mix drift further and further apart with a high percentage of renewable energy. Heat pumps in combination with thermal energy storage systems can be a useful way to couple the heat and power sectors. This paper presents a hardware-in-the-loop test bench for the experimental investigation of optimized control strategies for heat pumps. 24-hour experiments are carried out to test whether the heat pump is able to serve optimized schedules generated by a MATLAB algorithm. The results show that the heat pump is capable of following the generated schedules, and the maximum deviation of the operational time between schedule and experiment is only 3%. Additionally, the system can serve the demand for space heating and domestic hot water (DHW) at any time.
One main goal of the project was to develop a valid test method, based on existing standards, that realistically reproduces the degradation of cleanable filter media occurring in industrial practice (high temperatures, aggressive chemical atmospheres). The method was also to account for the mechanical aging of the media caused by dust loading and cleaning pressure pulses (DIN ISO 11057). Within the project, extensive practical experience was gained in commissioning and operating a temperature-controlled test chamber charged with corrosive gas for the chemical aging of filter media, following the specifications of DIN EN ISO 16891. If comparable test data are to be determined reliably for several samples, extensive boundary conditions must accordingly be observed in the investigations. In particular, the investigations revealed the considerable technical effort required to perform the filter tests, which, not least because of the necessary safety equipment and long test durations, makes implementation difficult for economic reasons, especially for SMEs. It was further shown that combining chemical-thermal and mechanical(-thermal) aging by using different test facilities is feasible in principle. The test method developed in the project, chemical aging of the filter matrices by gas-phase exposure in a pressure chamber, allows shorter exposure periods with a reduced amount of corrosive gas to be handled and can thus enable the economical operation of a corresponding test rig. Combined with external mechanical aging by dust loading and the option of simultaneously imposing temperature on several filter-media samples in accordance with EN ISO 16891, the thermally, chemically and mechanically induced degradation behaviour of filter media may be transferred into a test specification realistically and at economically justifiable cost. The corresponding validation work is part of a recently started follow-up project.
The second main goal of the project was to develop finishes that lead to improved resistance to aggressive components. The results showed that the sol-gel process can durably apply mechanically stable coatings to nonwoven fabric which can reduce, in particular, the chemically induced degradation of aramids. Aramids are relatively expensive high-performance materials known to have low resistance both to UV radiation and to various corrosive gases. Finishes that improve the durability of these materials therefore represent an important development for companies seeking more durable aramid-based products. Fluorocarbon finishes, organic-inorganic hybrids based on GPTMS, and zirconium-containing finishes proved particularly suitable.
Which trends are shaping the customer journey in B2B, and what will it look like in the near future? This article looks at upcoming trends that will substantially influence the B2B customer journey. Using artificial intelligence as an example, it shows how the future customer journey can be designed and optimized through lead profiling, with new technologies putting data to profitable use. It also focuses on customer-journey transformation, for it is precisely this complex transformation that companies frequently neglect.
A large body of literature is concerned with models of presence, the sensory illusion of being part of a virtual scene, but there is still no general agreement on how to measure it objectively and reliably. For the presented study, we applied contemporary theory to measure presence in virtual reality. Thirty-seven participants explored an existing commercial game in order to complete a collection task. Two startle events were naturally embedded in the game progression to evoke physical reactions, and head-tracking data was collected in response to these events. Subjective presence was recorded using a post-study questionnaire and real-time assessments. Our novel implementation of behavioral measures led to insights which could inform future presence research: we propose a measure in which startle reflexes are evoked through specific events in the virtual environment, and head-tracking data is compared to the range and speed of baseline interactions.
Continuous refactoring is necessary to maintain source code quality and to cope with technical debt. Since manual refactoring is inefficient and error-prone, various solutions for automated refactoring have been proposed in the past. However, empirical studies have shown that these solutions are not widely accepted by software developers, and most refactorings are still performed manually. For example, developers reported that refactoring tools should support functionality for reviewing changes. They also criticized that introducing such tools would require substantial effort for configuration and integration into the current development environment.
In this paper, we present our work towards the Refactoring-Bot, an autonomous bot that integrates into the team like a human developer via the existing version control platform. The bot automatically performs refactorings to resolve code smells and presents the changes to a developer for asynchronous review via pull requests. This way, developers are not interrupted in their workflow and can review the changes at any time with familiar tools. Proposed refactorings can then be integrated into the code base via the push of a button. We elaborate on our vision, discuss design decisions, describe the current state of development, and give an outlook on planned development and research activities.
While the concepts of object-oriented antipatterns and code smells are prevalent in scientific literature and have been popularized by tools like SonarQube, the research field for service-based antipatterns and bad smells is not as cohesive and organized. The description of these antipatterns is distributed across several publications with no holistic schema or taxonomy. Furthermore, there is currently little synergy between documented antipatterns for the architectural styles SOA and Microservices, even though several antipatterns may hold value for both. We therefore conducted a Systematic Literature Review (SLR) that identified 14 primary studies. 36 service-based antipatterns were extracted from these studies and documented with a holistic data model. We also categorized the antipatterns with a taxonomy and implemented relationships between them. Lastly, we developed a web application for convenient browsing and implemented a GitHub-based repository and workflow for the collaborative evolution of the collection. Researchers and practitioners can use the repository as a reference, for training and education, or for quality assurance.
Many start-ups are in search of cooperation partners to develop their innovative business models. In response, incumbent firms are introducing increasingly more cooperation systems to engage with start-ups. However, many of these cooperations end in failure. Although qualitative studies on cooperation models have tried to improve the effectiveness of incumbent start-up strategies, only a few have empirically examined start-up cooperation behavior. Considering the lack of adequate measurement models in current research, this paper focuses on developing a multi-item scale on cooperation behavior of start-ups, drawing from a series of qualitative and quantitative studies. The resultant scale contributes to recent research on start-up cooperation and provides a framework to add an empirical perspective to current research.
Standardisation of breath sampling is important for the application of breath analysis in clinical settings. By studying the effect of room airing on indoor and breath analytes and by generating time series of room air with different sampling intervals, we sought to gain further insights into room air metabolism, to detect the relevance of exogenous VOCs and to draw conclusions about their consideration in the interpretation of exhaled breath. Room air and the exhaled breath of a healthy subject were analysed before and after room airing. Furthermore, a time series of room air with doors and windows closed was taken over 84 h by automatic sampling every 180 min. A second time series studied room air analytes over 70 h with samples taken every 16.5 min. For breath and room air measurements, an IMS coupled to a multi-capillary column (IMS/MCC) [Bio-Scout® - B&S Analytik GmbH, Dortmund, Germany] was used. The peaks were characterized using the software Visual Now (B&S Analytik, Dortmund, Germany) and identified using the software package MIMA (version 1.1, provided by the Max Planck Institute for Informatics, Saarbrücken, Germany) and the database 20160426_SubstanzDbNIST_122 (B&S Analytik GmbH, Dortmund, Germany). In the morning, 4 analytes (Decamethylcyclopentasiloxane [541-02-6]; Pentan-2-one [107-87-9] dimer; Hexan-1-al [66-25-1]; Pentan-2-one [107-87-9] monomer) showed high intensities in the room air and exhaled breath. They were significantly, but not equally, reduced by room airing. The time series over 84 h showed a time-dependent decrease of analytes (limonene monomer and dimer; Decamethylcyclopentasiloxane; Butan-1-ol) as well as an increase (Pentan-2-one [107-87-9] dimer). Shorter sampling intervals exhibited circadian variations of analyte concentrations for many analytes. Breath sampling in the morning requires room airing beforehand; then the variation of the intensity of indoor analytes can be kept small.
The time series of indoor analytes show that their intensities behave differently, with time-dependent declines, constant increases and circadian variations, depending on room airing. This has implications for the breath sampling procedure and the interpretation of exhaled breath.
Due to the rising need for palliative care in Russia, it is crucial to provide timely and high-quality solutions for patients, relatives, and caregivers. A methodology for the remote monitoring of patients in need of palliative care will be developed, together with the requirements for a hardware-software system for remotely monitoring patients' health at home.
The relevance of technology knowledge in digital transformation, especially in small and medium-sized enterprises (SMEs) that are still largely dependent on physical human capital, has become increasingly obvious. This is due to the rapid revolution in the business environment, coupled with a growing number of living examples of firms disrupted by advances in technological knowledge. Consequently, we find it increasingly vital for SMEs both to spot and mitigate threats and to take advantage of opportunities arising from the dynamism of digital transformation.
Our study aims at exploring the relevance of technology knowledge in SMEs for digital transformation, in order to uncover the opportunities, roadmaps, and models that SMEs can take advantage of to gain a competitive edge.
We conclude that, despite the relevance of technology knowledge for digital transformation and its low costs and accessibility, SMEs have yet to realize the full potential of technological knowledge. This is mainly because technologies appear, change and vanish so rapidly in the digital age that gaining a proper understanding without dedicated resources is extremely difficult for SMEs, making them less competitive than incumbent large firms in the market.
Purpose – The purpose of this paper is to examine the mediating effect of psychological contract breach on the relationship between job insecurity and counterproductive workplace behavior (CWB) and the moderating effect of employment status in this relationship.
Design/methodology/approach – Data were collected from 212 supervisor–subordinate dyads in a large Chinese state-owned air transportation group. AMOS 17.0 software was used to examine the hypothesized predictions and the theoretical model.
Findings – The results showed that psychological contract breach partially mediates the effect of job insecurity on CWB, including organizational counterproductive workplace behavior and interpersonal counterproductive workplace behavior. In addition, the relationships between job insecurity, psychological contract breach and CWB differ significantly between permanent workers and contract workers.
Originality/value – The present study provides a new insight into explaining the linkage between job insecurity and negative work behaviors as well as suggestions to managers on minimizing the harmful effects of job insecurity.
In standardized sectors such as the automotive industry, the cost-benefit ratio of automation solutions is high, as they contribute to increasing capacity, decreasing costs and improving product quality. In less standardized application fields, the contribution of automation to improvements in capacity, cost and quality blurs. The automation of complex and unstructured tasks requires sophisticated, expensive and low-performing systems, whose impact on product quality is oftentimes not directly perceived by customers. As a result, the full automation of process chains in the general manufacturing or logistics sectors is often a suboptimal solution. Departing from the false idea that a process should be either fully automated or fully manual, this paper presents a novel heuristic method for the design of lean human-robot interaction, the Quality Interaction Function Deployment, with the objective of finding the "right level of automation". Functions are divided among human and automated agents, and several automation scenarios are created and evaluated with respect to their compliance with the requirements of all process stakeholders. As a result, synergies between operators (manual tasks) and machines (automated tasks) are improved, thus reducing time losses and increasing productivity.
Context: Organizations are increasingly challenged by high market dynamics, rapidly evolving technologies and shifting user expectations. In consequence, many organizations are struggling with their ability to provide reliable product roadmaps by applying traditional roadmapping approaches. Currently, many companies are seeking opportunities to improve their product roadmapping practices and strive for new roadmapping approaches. A typical first step towards advancing the roadmapping capabilities of an organization is to assess the current situation. The authors have therefore developed and published the DEEP maturity model for assessing the product roadmapping capabilities of companies operating in dynamic and uncertain environments.
Objective: The aim of this article is to conduct an initial validation of the DEEP model in order to understand its applicability better and to see if important concepts are missing. In addition, the aim of this article is to evolve the model based on the findings from the initial validation.
Method: The model was given to practitioners such as product managers with the request to perform a self-assessment of the current product roadmapping practices in their company. Afterwards, interviews were conducted with each participant in order to gain deeper insights.
Results: The initial validation revealed that some of the stages of the model needed to be rearranged, and minor usability issues were found. The overall structure of the model was well received. The study resulted in version 1.1 of the DEEP product roadmap maturity model, which is also presented in this article.
The paper describes a new stimulus using learning factories and an academic research programme - an M.Sc. in Digital Industrial Management and Engineering (DIME) comprising a double degree - to enhance international collaboration between four partner universities. The programme will be structured in such a way as to maintain or improve the level of innovation at the learning factories of each partner. The partners agreed to use learning factory focus areas along with DIME learning modules to stimulate international collaboration. Furthermore, they identified several research areas within the framework of the DIME programme to encourage horizontal and vertical collaboration. Vertical collaboration connects faculty expertise across the learning factory network to advance knowledge in one of the focus areas, while horizontal collaboration connects knowledge and expertise across multiple focus areas. Together they offer a platform for students to develop the disciplinary and cross-disciplinary applied research skills necessary for addressing the complex challenges faced by industry. Hence, the university partners have the opportunity to develop the learning factory capabilities in alignment with the smart manufacturing concept. The learning factory is thus an important pillar in this venture. While postgraduate students and researchers in the DIME programme are the enablers who ensure the success of the projects, the learning factory provides a learning environment entirely conducive to fostering these successful collaborations. Ultimately, the partners are focussed on utilising smart technologies in line with the digitalisation of the production process.
The digital age makes it possible to be globally networked at any time. Digital communication is therefore an important aspect of today’s world. Hence, the further development and expansion of this is becoming increasingly important. Even within a wireless system, copper channels are important as part of the overall network. Given the need to keep pushing at the current limitations, careful design of the cables in connection with an adapted coding of the bits is essential to transmit more and more data.
One of the most popular and widespread cabling technologies is symmetrical copper cabling [1, pp. 8-15]. It is also known as Twisted Pair and it is of immense importance for the cabling of communication networks.
At the time of writing this thesis, data rates of up to 10 GBit/s over a transmission distance of 100 m and 40 GBit/s over a transmission distance of 30 m are standardized for symmetrical copper cabling [2]. Other lengths are not standardized. Short lengths in particular are of great interest for copper cables, because copper cables are usually used for short distances, such as between computers and the campus network or within data centres.
This work has focused on the transmission of higher-order pulse amplitude modulation (PAM) and the associated transmission performance. The central research question is: "How well can the transmission technique be optimized in order to maximise the data bandwidth over Ethernet cable and, given that remote powering is also a significant application of these cables, how much will the resulting heating affect this transmission and what can be done to mitigate it?"
To answer this question, the cable parameters are first examined. A series of spectral measurements, such as Insertion Loss, Return Loss, Near End Crosstalk and Far End Crosstalk, provide information about the electromagnetic interference and the influence of the ohmic resistance on the signal. Based on these findings, the first theoretical statements and calculations can be made. In the next step, data transmissions over different transmission lengths are realized. The examination of the eye diagrams of the different transmission approaches ultimately provides information about the signal quality of the transmissions. An overview of the maximum transmission rate depending on the transmission distance shows the potential for different applications.
Furthermore, the simultaneous transmission of energy and data is a significant advantage of copper. However, the resulting heat development has an influence on the data transmission. Therefore, the influence of the ambient temperature of cables is investigated in the last part and changes in the signal quality are clarified.
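The relationship between modulation order and data rate that motivates higher-order PAM can be sketched in a few lines; the 5 GBd symbol rate below is a hypothetical illustration value, not a figure from the thesis:

```python
import math

def pam_bit_rate(baud_rate_hz: float, levels: int) -> float:
    """Gross bit rate of a PAM-N signal: symbol rate times bits per symbol."""
    return baud_rate_hz * math.log2(levels)

# Hypothetical 5 GBd link: more amplitude levels per symbol raise the
# bit rate without raising the symbol rate (at the cost of eye opening).
print(pam_bit_rate(5e9, 2) / 1e9)  # PAM-2 (NRZ): 5.0 GBit/s
print(pam_bit_rate(5e9, 4) / 1e9)  # PAM-4:      10.0 GBit/s
print(pam_bit_rate(5e9, 8) / 1e9)  # PAM-8:      15.0 GBit/s
```

The trade-off measured via eye diagrams in this work follows directly from this: each doubling of levels adds one bit per symbol but shrinks the vertical eye opening.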
The paper studies the deciding parameters that influence business students' selection of internships in Germany. The findings are based on literature research and on a survey among students and company representatives, who were asked to rate the importance of 24 different aspects of internships. The benefits and negative impacts of internships on students, companies and universities are discussed in detail, and the results of different demographic groups are compared.
In two research studies, the influence of technology on sleep was analyzed. The first concerns the effect of light on the circadian rhythm and, as a consequence, on the sleep quality of persons in a vegetative state. The second, which is still ongoing, surveys the influence of several technical tools on the sleep of elderly people living in a nursing home.
With the capability of employing virtually unlimited compute resources, the cloud evolved into an attractive execution environment for applications from the High Performance Computing (HPC) domain. By means of elastic scaling, compute resources can be provisioned and decommissioned at runtime. This gives rise to a new concept in HPC: Elasticity of parallel computations. However, it is still an open research question to which extent HPC applications can benefit from elastic scaling and how to leverage elasticity of parallel computations. In this paper, we discuss how to address these challenges for HPC applications with dynamic task parallelism and present TASKWORK, a cloud-aware runtime system based on our findings. TASKWORK enables the implementation of elastic HPC applications by means of higher level development frameworks and solves corresponding coordination problems based on Apache ZooKeeper. For evaluation purposes, we discuss a development framework for parallel branch-and-bound based on TASKWORK, show how to implement an elastic HPC application, and report on measurements with respect to parallel efficiency and elastic scaling.
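TASKWORK itself is not sketched here, but the underlying idea of dynamic task parallelism, workers pulling from a shared pool into which running tasks spawn further tasks, can be illustrated with a minimal, hypothetical branch-and-bound-style search (a toy knapsack enumeration; all names and numbers are illustrative, and a local thread pool stands in for elastically provisioned cloud nodes):

```python
from queue import Queue
from threading import Thread, Lock

# Toy search: maximize the value of chosen items under a weight cap.
items = [(4, 5), (3, 4), (2, 3), (1, 2)]  # (weight, value) pairs
capacity = 6

best = {"value": 0}
lock = Lock()
tasks: Queue = Queue()
tasks.put((0, 0, 0))  # (next item index, current weight, current value)

def worker():
    while True:
        task = tasks.get()
        if task is None:  # poison pill: shut this worker down
            break
        i, w, v = task
        with lock:  # record the best value found so far
            if v > best["value"]:
                best["value"] = v
        if i < len(items):
            wi, vi = items[i]
            if w + wi <= capacity:           # branch: take item i
                tasks.put((i + 1, w + wi, v + vi))
            tasks.put((i + 1, w, v))         # branch: skip item i
        tasks.task_done()

threads = [Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
tasks.join()              # wait until all dynamically spawned tasks finish
for t in threads:
    tasks.put(None)
for t in threads:
    t.join()
print(best["value"])
```

The point of the sketch is that the task count is unknown up front, exactly the situation in which elastic scaling of workers pays off.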
Systemtheorie für die Praxis : ein transdisziplinäres Modell systemischen Handelns mit Anwendungen
(2019)
Our transdisciplinary model of systemic action links core ideas of sociological systems theories with core ideas of neighboring disciplines in such a way that it can be applied to social systems of all kinds. The range extends from two-person systems through profit and non-profit organizations, states and international organizations to world society. Such systems interact with one another as well as with natural and technical systems, changing both themselves and their environment in the process.
The action of any social system can be described, explained and shaped with the help of the six dimensions of our model. This is already evident in the guiding question into which we condense these dimensions: Who is where (system with subsystems), how present (interpretations), has/can (resources) and may/should (institutions) transform what and how (processes) and interact with whom and how (interactions)? This level of abstraction allows the model to be applied to all systems, in combination with suitable concretizations. We illustrate this primarily using the example of the higher education system.
Telemetry and home monitoring are already used successfully in many areas of healthcare. Through telemetric data transmission, modern pacemakers enable patients and physicians to monitor current health and device status data at home. A fundamental understanding of the requirements for and the architecture of such systems is necessary for the further development of existing products. So far, no manufacturer-independent analyses of these systems exist. Using SysML as a semi-formal notation language, the combined pacemaker and home monitoring system is modelled. The requirements for such a system can be derived from existing products. The present work describes the system architecture of such systems, which is used to show the connection to information systems via the home monitoring system and the functions implemented thereby.
A hallmark of Reutlingen University is the close cooperation between researchers, lecturers, practitioners and students. In addition to state-of-the-art management knowledge, professors at Reutlingen University offer systemic structural constellations as "complementary medicine" for executives.
This paper summarises the experiences with sustainability reporting, in a very broad sense, at Universities of Applied Sciences (UoAS). It focuses on the communication of the sustainability aspects and activities of universities. It provides a recommendation, a model for communicating the sustainability activities of universities, and emphasises the value of this approach. This paper aims to find the most effective ways to convey education for sustainable development to a broad public and to initiate communication about sustainability aspects with society.
The paper is based on action research done at two universities about the ways in which academic institutions can communicate with their stakeholders in order to report about their own role as a responsible university and also to make an impact on the sustainable development on a local and global scale.
Research is focussed on experiences at Universities of Applied Sciences with their strong focus on applied research, education and transfer. However, these results can be helpful for any academic institution that wants to make a positive impact on society. The concept which we present focusses on the possible impact which universities can generate.
Seen as a contribution to the research field of sustainability reporting, the paper points out that a continuous qualitative reporting process with a focus on education for SD is an adequate and efficient approach to sustainability reporting for universities and an effective way to reach a broad public.
We show that there are several efficient methods of communication, ranging from the traditional sustainability report to publications which address the public and to more innovative methods using the web 2.0. We show and argue that for universities, alternative ways of sustainability communication may be more effective for achieving the sustainability mission.
The concept which we present gives the universities a broader impact on society and helps them to support sustainable development in an efficient way.
In vitro, hydrogel-based ECMs for functionalizing the surfaces of various materials have played an essential role in mimicking the native tissue matrix. Polydimethylsiloxane (PDMS) is widely used to build microfluidic or organ-on-chip devices compatible with cells due to its easy handling in cast replication. Despite such advantages, a limitation of PDMS is its hydrophobic surface. To improve the wettability of PDMS-based devices, alginate, a naturally derived polysaccharide, was covalently bound to the PDMS surface. This alginate then served to crosslink further hydrogel onto the PDMS surface in the desired layer thickness. Hydrogel-modified PDMS was used for coating a topography chip system and for the in vitro investigation of cell growth on the surfaces. Moreover, such hydrophilic hydrogel-coated PDMS was utilized in a microfluidic device to prevent unspecific absorption of organic solutions. Hence, in both exemplary studies, PDMS surface properties were modified, leading to improved devices.
Special-purpose machine building is characterized by a high variety of variants and complex material flows (Reinhart, Bredow & Pohl, 2009, p. 131). This publication presents the approaches to material flow optimization applied in the course of a practical project in special-purpose machine building, and works out the lessons learned and obstacles encountered.
This report summarizes the main work and results carried out and achieved in the joint project "GalvanoFlex_BW" in the calendar year 2018. The acquisition and evaluation of measurement data has been completed. Several measurement campaigns were carried out at NovoPlan. At C&C Bark, existing data could partly be reused and was supplemented by additional measurements where necessary. At Hartchrom, no measurements could be carried out due to staff shortages. The recorded data was transferred into an efficiency assessment, from which general conclusions are to be derived. To this end, a simulation program has been set up that is able to model and optimize process chains in terms of energy. In addition, improved heat demand profiles for the companies are to be developed from the measurement data and subsequently made available for CHP optimization. In the course of developing and evaluating electricity-optimized CHP strategies, an existing simulation model has been extended accordingly. Specifically, the model was supplemented with an improved load forecast for electricity and heat in industrial companies, and the optimization procedure was extended by a second dimension. While previously only the optimization of self-consumption was possible, with a limit on CHP unit starts as a constraint, the capping of the electrical peak load is now additionally integrated into the objective function. Especially for industrial companies, this allows a further, in some cases considerable, reduction in energy costs, which is confirmed by initial calculations for the three companies represented in the living lab. The results are discussed under work package 8 (implementation). The dialogue with further companies and institutions outside the project was continued via the industry platform.
In 2018, two events of this kind were held, and a further workshop on this topic will take place in spring 2019. The accompanying social science research was also continued as planned with the second phase of the company surveys. With regard to the implementation of a CHP concept, two important points emerged: First, the implementing company must have a certain "energy efficiency maturity", reflected among other things in experience with carrying out energy efficiency measures, since installing a CHP unit is a highly complex measure. Second, other company-specific contextual factors must be present, such as construction work required for other reasons, so that windows of opportunity arise in which the implementation of CHP measures makes sense.
The purpose of this paper is to assess if the strategy development of the fashion industry is oriented to the long or short term. Following the theory of dynamic capabilities, this paper argues that a long term strategic orientation can be observed in corporate foresight activities. A multi methodological research approach is chosen to answer the research question. The findings suggest that the fashion industry is lagging behind other industries in terms of future orientation and therefore long-term strategy development, even though the challenges in the business environment are not perceived as less relevant.
Small and medium enterprises (SMEs), which play a substantial role in the development of any economy, have been on the rise in recent years. At the same time, these enterprises face a myriad of challenges that could potentially be solved through the adoption of technology. Nonetheless, it has been observed that the uptake of new technology among SMEs remains limited, with the majority opting to maintain the status quo with regard to technology awareness and innovation strategies.
In a literature review, this paper explores three major dynamics curtailing the adoption of new technologies by SMEs in manufacturing: knowledge absorptive capacity and management factors, organisational structures, and technological awareness. Firstly, with regard to knowledge absorptive capacity and management factors, this study shows how these factors drive innovation potential in SMEs.
Secondly, with regards to technological awareness factors, this study documents how perceived usefulness, costs, network and infrastructure, education and skills, training and attitude as well as knowledge influence adoption of new technologies among SMEs in the world. Lastly, the study concludes by analysing how organisational structures drive innovation potentials of SMEs in the wake of swift and profound technological changes in the market.
The effects of the digital revolution are manifold. Ever more sophisticated options now exist for entering into dialogue with customers. Companies therefore face the challenge of systematically managing the various channels and touchpoints with their customers. At the centre of this is the customer's information and decision-making process: the customer journey. In marketing it is considered the supreme discipline, the question being how best to capture the "customer's journey" in order to address target groups at exactly the right point and to channel budgets specifically into particular channels. Marketing and sales managers must therefore ask themselves: What does the end customer expect at the various touchpoints and what does their customer journey look like? How can companies meet these requirements?
After the euro and economic crisis of the years since 2010, the European Union (EU) is currently recording a solid economic upswing in all member states. Europe's share of the world economy amounts to around 30 percent. At 2.1 percent, European economic growth in 2018 was even higher than in Germany, at 1.6 percent. An analysis of the duration of upswing phases shows that Europe is, unexpectedly, even the front-runner compared with the world economy. Since the 1970s, the average duration of a European economic upswing has exceeded thirty quarters and is thus considerably longer than in the USA and Japan.
Software process improvement (SPI) has been around for decades, but it remains a critically discussed topic. In several waves, different aspects of SPI have been discussed in the past, e.g., large-scale company-level SPI programs, maturity models, success factors, and in-project SPI. It is hard to find new streams or a consensus in the community, but there is a trend coming along with agile and lean software development. Apparently, practitioners reject extensive and prescriptive maturity models and move towards smaller, faster, and continuous project-integrated SPI. Based on data from two survey studies conducted in Germany (2012) and Europe (2016), we analyze the process customization for projects and the practices for implementing SPI in the participating companies. Our findings indicate that, even in regulated industry sectors, companies increasingly adopt in-project SPI activities, primarily with the goal of continuously optimizing specific processes. Therefore, with this paper, we want to stimulate a discussion on how to evolve traditional SPI towards a continuous learning environment.
Social selling, an innovative sales approach that applies the principles of digital marketing to sales, is receiving increasing attention in corporate practice. Research, especially on how to design social selling, is still in its infancy. Using data from the Facebook and LinkedIn channels of two industrial goods companies, an exploratory study finds that a direct connection request is an efficient way to expand the network, and that social selling posts published at the beginning and end of a week, primarily in the morning and in a visual format (photos, videos), are the most promising in terms of clicks, likes, shares and comments.
A distinctive highlight of the dissertation at hand is the investigation of multiple apparel supply chain actors incorporating the views of a global apparel retailer in Europe and multiple suppliers in Vietnam and Indonesia.
More specifically, the dissertation presents a coherent investigation, starting with the depiction of a conceptual framework for social management strategies as a means of social risk management (SRM), aimed exclusively at the apparel industry. In accordance with the identified research gaps and suggested research directions from the conceptual framework, the role of the apparel sourcing agent in social management strategies was analysed through a multiple case study with evidence from Vietnam and Europe, ultimately suggesting ten propositions. A further multiple case study data collection in Vietnam, Indonesia and Europe allowed for the investigation of buyer-supplier relationships with regard to social compliance strategies, using core tenets of agency theory to interpret the findings and outline ten further propositions. Based on the development of a conceptual framework on social SSCM in the apparel industry, the formulation of the 20 related propositions with evidence from crucial developing (apparel sourcing) countries, and the application of agency theory, which had previously been identified as a shortfall in this context, this thesis provides further grounding for SSCM theory and substantially contributes to the debate by addressing numerous research gaps.
We present a compact battery charger topology for weight- and cost-sensitive applications with an average output current of 9 A, targeted at 36 V batteries commonly found in electric bicycles. Instead of using a conventional boost converter with large DC-link capacitors, we accomplish PFC functionality by shaping the charging current into a sin² shape. In addition, a novel control scheme without input-current sensing is introduced. A priori knowledge is used to implement a feed-forward control in combination with a closed-loop output current control to maintain the target current. The use of a full-bridge/half-bridge LLC converter enables operation over a wide input-voltage range.
A fully featured prototype has been built with a peak output power of 1050W. An average output power of 400W was measured, resulting in a power density of 1.8 kW/dm³. At 9A charging current, a power factor of 0.96 was measured and the efficiency exceeds 93% on average with passive rectification.
The impact of pulse charging has been evaluated on a 400Wh battery which was charged with the proposed converter as well as CC-CV-charging for reference. Both charging schemes show similar battery surface temperatures.
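The benefit of the sin²-shaped charging current described above can be checked numerically: if the instantaneous power follows a sin² envelope while the input voltage is sinusoidal, the drawn current is itself sinusoidal and the power factor approaches unity. The sketch below assumes hypothetical values (230 V mains, 400 W average power) and simply evaluates the power-factor definition PF = P_avg / (V_rms · I_rms):

```python
import math

# Hypothetical operating point: 230 V mains, 400 W average charging power.
V_PEAK = 230 * math.sqrt(2)
P_AVG = 400.0

N = 100_000
dt = 2 * math.pi / N
v, i = [], []
for k in range(N):
    theta = k * dt
    vt = V_PEAK * math.sin(theta)
    # Shaping the instantaneous power as sin² makes the drawn current
    # proportional to sin(theta) as well: i = p / v.
    pt = 2 * P_AVG * math.sin(theta) ** 2
    it = pt / vt if abs(vt) > 1e-9 else 0.0
    v.append(vt)
    i.append(it)

p_mean = sum(va * ia for va, ia in zip(v, i)) / N
v_rms = math.sqrt(sum(x * x for x in v) / N)
i_rms = math.sqrt(sum(x * x for x in i) / N)
print(round(p_mean / (v_rms * i_rms), 3))  # power factor: 1.0
```

This is only the idealized arithmetic behind the shaping idea; the measured 0.96 in the prototype reflects real-world distortion and rectification effects.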
Recognizing human actions is a core challenge for autonomous systems as they directly share the same space with humans. Systems must be able to recognize and assess human actions in real-time. To train the corresponding data-driven algorithms, a significant amount of annotated training data is required. We demonstrate a pipeline to detect humans, estimate their pose, track them over time and recognize their actions in real-time with standard monocular camera sensors. For action recognition, we transform noisy human pose estimates into an image-like format we call the Encoded Human Pose Image (EHPI). This encoded information can then be classified using standard methods from the computer vision community. With this simple procedure, we achieve competitive state-of-the-art performance in pose-based action detection and can ensure real-time performance. In addition, we show a use case in the context of autonomous driving to demonstrate how such a system can be trained to recognize human actions using simulation data.
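The exact EHPI layout is defined in the paper; as a rough, hypothetical sketch of the idea, 2D pose estimates can be arranged into an image-like array (joints along one axis, time along the other, normalized coordinates as channels) so that standard CNN tooling applies:

```python
# Joint and frame counts below are arbitrary illustration values,
# not the configuration used in the paper.
NUM_JOINTS, NUM_FRAMES = 15, 32

def encode_pose_sequence(frames):
    """frames: list of NUM_FRAMES poses, each a list of (x, y) in [0, 1].
    Returns a NUM_JOINTS x NUM_FRAMES grid with the normalized x/y
    coordinates as two 'color' channels per cell."""
    image = [[[0.0, 0.0] for _ in range(NUM_FRAMES)] for _ in range(NUM_JOINTS)]
    for t, pose in enumerate(frames):
        for j, (x, y) in enumerate(pose):
            image[j][t] = [x, y]  # coordinates become channel values
    return image

# One static dummy pose repeated over all frames:
dummy_pose = [(j / NUM_JOINTS, 0.5) for j in range(NUM_JOINTS)]
ehpi_like = encode_pose_sequence([dummy_pose] * NUM_FRAMES)
print(len(ehpi_like), len(ehpi_like[0]))  # 15 32
```

The payoff of such an encoding is that a temporal sequence of skeletons becomes a fixed-size tensor, which off-the-shelf image classifiers can consume directly.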
Serverless computing is an emerging cloud computing paradigm with the goal of freeing developers from resource management issues. As of today, serverless computing platforms are mainly used to process computations triggered by events or user requests that can be executed independently of each other. These workloads benefit from on-demand and elastic compute resources as well as per-function billing. However, it is still an open research question to which extent parallel applications, which most often comprise complex coordination and communication patterns, can benefit from serverless computing.
In this paper, we introduce serverless skeletons for parallel cloud programming to free developers from both parallelism and resource management issues. In particular, we investigate the well-known and widely used farm skeleton, which supports the implementation of a wide range of applications. To evaluate our concepts, we present a prototypical development and runtime framework and implement two applications based on our framework: numerical integration and hyperparameter optimization, a commonly applied technique in machine learning. We report on performance measurements for both applications and discuss the usefulness of our approach.
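The farm skeleton named above has a simple core: apply an independent worker function to many tasks in parallel and collect the results. The sketch below is not the paper's serverless framework; it illustrates the pattern with a local thread pool standing in for cloud functions, using the numerical-integration use case:

```python
from concurrent.futures import ThreadPoolExecutor

def farm(worker, tasks, num_workers=4):
    """Farm skeleton sketch: apply an independent worker function to
    each task in parallel and collect the results in input order."""
    with ThreadPoolExecutor(max_workers=num_workers) as pool:
        return list(pool.map(worker, tasks))

def integrate_chunk(bounds, n=1000):
    """Midpoint-rule integration of f(x) = x^2 over one subinterval."""
    a, b = bounds
    h = (b - a) / n
    return sum(((a + (k + 0.5) * h) ** 2) * h for k in range(n))

# Usage: integrate x^2 over [0, 1] as eight independent farm tasks.
chunks = [(k / 8, (k + 1) / 8) for k in range(8)]
total = sum(farm(integrate_chunk, chunks))
print(round(total, 4))  # 0.3333 (exact value is 1/3)
```

In a serverless setting, each `integrate_chunk` call would map onto a function invocation, which is why per-function billing and on-demand scaling fit this pattern well.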
Semi-automated image data labelling using AprilTags as a pre-processing step for machine learning
(2019)
Data labelling is a pre-processing step to prepare data for machine learning. There are many ways to collect and prepare this data, but they are usually associated with considerable effort. This paper presents an approach to semi-automated image data labelling using AprilTags. The AprilTags attached to the object, which contain a unique ID, make it possible to link the object surfaces to a particular class. This approach will be implemented and used to label data of a stackable box.
The data is evaluated by training a You Only Look Once (YOLO) net, with a subsequent evaluation of the detection results. These results show that the semi-automatically collected and labelled data can certainly be used for machine learning. However, if concise features of an object surface are covered by the AprilTag, there is a risk that the concerned class will not be recognized. It can be assumed that the labelled data can not only be used for YOLO, but also for other machine learning approaches.
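One concrete step in such a pipeline, turning a detected object surface into a training annotation, can be sketched for the YOLO label format (class ID plus normalized box centre and size). The tag-to-class mapping and box values below are hypothetical illustration values:

```python
def to_yolo_label(class_id, box, img_w, img_h):
    """Convert a pixel bounding box (x_min, y_min, x_max, y_max) into a
    YOLO annotation line: class, centre x/y and width/height, all
    normalized to the image dimensions."""
    x_min, y_min, x_max, y_max = box
    cx = (x_min + x_max) / 2 / img_w
    cy = (y_min + y_max) / 2 / img_h
    w = (x_max - x_min) / img_w
    h = (y_max - y_min) / img_h
    return f"{class_id} {cx:.6f} {cy:.6f} {w:.6f} {h:.6f}"

# Hypothetical detection: tag ID 7 maps to class 2; the box would come
# from the tag-derived surface localisation.
tag_id_to_class = {7: 2}
label = to_yolo_label(tag_id_to_class[7], (100, 150, 300, 350), 640, 480)
print(label)  # 2 0.312500 0.520833 0.312500 0.416667
```

Automating exactly this step for every frame is what removes the manual labelling effort described above.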
The persistent trend towards decreasing batch sizes due to ongoing product individualization, as well as increasingly dynamic market and competitive conditions, leads to new changeability requirements in production environments. Since each of the individualized products might require different base materials or components and manufacturing resources, the paths of the products going through the factory as well as the required internal transport and material supply processes are going to differ for every product. Conventional planning and control systems, which rely on predefined processes and central decision-making, are not capable of dealing with the arising system complexity along the dimensions of changing goods, layouts and throughput requirements. The concepts of "self-organization" in combination with "autonomous control" provide promising solutions to meet these new requirements by using, among other things, the potential of autonomous, decentralized and target-optimized logistical objects (e.g. smart products, bins and conveyor systems) which are able to communicate and interact with each other as well as with human workers. To investigate the potential of automation and human-robot collaboration for intralogistics, a research project for the development of a collaborative tugger train has been started at the ESB Logistics Learning Factory, in line with various student projects in neighboring research areas. This collaborative tugger train system, in combination with other manual (e.g. handcarts) and (semi-)automated conveyor systems (e.g. an automated guided forklift), will be integrated into a dynamic, self-organized scenario with varying production batch sizes to develop a method for target-oriented self-organization and autonomous control of intralogistics systems. For a structured investigation of self-organized scenarios, a generic intralogistics model as well as a criteria catalogue has been developed.
The ESB Logistics Learning Factory will serve as a practice-oriented research, validation and demonstration environment for these purposes.
RoPose-Real: real world dataset acquisition for data-driven industrial robot arm pose estimation
(2019)
It is necessary to employ smart sensory systems in dynamic and mobile workspaces where industrial robots are mounted on mobile platforms. Such systems should be aware of flexible and non-stationary workspaces and able to react autonomously to changing situations. Building upon our previously presented RoPose-system, which employs a convolutional neural network architecture that has been trained on pure synthetic data to estimate the kinematic chain of an industrial robot arm system, we now present RoPose-Real. RoPose-Real extends the prior system with a comfortable and targetless extrinsic calibration tool, to allow for the production of automatically annotated datasets for real robot systems. Furthermore, we use the novel datasets to train the estimation network with real world data. The extracted pose information is used to automatically estimate the observing sensor pose relative to the robot system. Finally we evaluate the performance of the presented subsystems in a real world robotic scenario.
This paper studies option pricing based on a reverse engineering (RE) approach. We utilize artificial intelligence in order to numerically compute the prices of options. The data consist of more than 5000 call and put options from the German stock market. First, we find that option pricing under reverse engineering achieves a smaller root mean square error relative to market prices. Second, we show that the reverse engineering model is reliant on training data. In general, the novel idea of reverse engineering is a rewarding direction for future research. It circumvents the limitations of finance theory, among others the strong assumptions and numerical approximations under the Black–Scholes model.
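The paper's network and data are not reproduced here, but the two ingredients it compares against, Black–Scholes prices as the benchmark and the root mean square error to market prices as the metric, can be sketched as follows (the market quotes are hypothetical illustration values):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(s, k, t, r, sigma):
    """Black–Scholes price of a European call: spot s, strike k,
    maturity t (years), risk-free rate r, volatility sigma."""
    d1 = (math.log(s / k) + (r + 0.5 * sigma**2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    return s * norm_cdf(d1) - k * math.exp(-r * t) * norm_cdf(d2)

def rmse(model_prices, market_prices):
    """Root mean square error between model and observed prices."""
    n = len(model_prices)
    return math.sqrt(sum((m - p) ** 2 for m, p in zip(model_prices, market_prices)) / n)

# Hypothetical market quotes for three strikes; a learned model would be
# scored against such quotes with the same RMSE metric.
market = [10.45, 6.04, 3.25]
model = [bs_call(100, k, 1.0, 0.05, 0.2) for k in (100, 110, 120)]
print(f"RMSE vs market: {rmse(model, market):.2f}")
```

A reverse-engineered pricing model is "better" in the paper's sense exactly when its RMSE against such market quotes is smaller than the Black–Scholes benchmark's.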
This study estimates the reproducibility of finding palpation points of three different anatomical landmarks in the human body (the xiphoid process and the two hip crests) to support a navigated ultrasound application. On six test subjects with different body mass indices, the three palpation points were each located five times by two examiners. The deviation from the target position was calculated and correlated with the fat thickness above each palpation point. The reproducibility of the measurements had a mean error of ≈13.5 mm ± 4 mm, which seems sufficient for the desired application field.
In this paper, an approach is introduced that shows how reinforcement learning can be used to achieve interoperability between heterogeneous Internet of Things (IoT) components. More specifically, we model an HTTP REST service as a Markov decision process and adapt Q-learning to the properties of REST so that an agent in the role of an HTTP REST client can learn the semantics of the service and, in particular, an optimal sequence of service calls to achieve an application-specific goal. With our approach, we want to open up and facilitate a discussion in the community, as we see the key to achieving interoperability in IoT in the utilization of artificial intelligence techniques.
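A minimal sketch of the idea, not the paper's actual implementation: a tiny deterministic MDP stands in for a REST service, and tabular Q-learning with an epsilon-greedy policy learns the call sequence that reaches the goal. All endpoint names, rewards, and hyperparameters are illustrative assumptions:

```python
import random

# Toy MDP standing in for an HTTP REST service: states are workflow
# stages, actions are hypothetical service calls (names illustrative).
ACTIONS = ["GET /items", "POST /cart", "POST /checkout"]
GOAL_STATE = 3

def step(state, action):
    """Deterministic toy transitions: the call whose index matches the
    current stage advances the workflow, any other call is a no-op."""
    if action == state:
        new_state = state + 1
        return new_state, (10.0 if new_state == GOAL_STATE else 0.0)
    return state, -1.0  # wrong call: small penalty

random.seed(0)
q = [[0.0] * len(ACTIONS) for _ in range(GOAL_STATE + 1)]
alpha, gamma, epsilon = 0.5, 0.9, 0.2

for _ in range(500):  # training episodes
    s = 0
    while s != GOAL_STATE:
        if random.random() < epsilon:  # epsilon-greedy exploration
            a = random.randrange(len(ACTIONS))
        else:
            a = max(range(len(ACTIONS)), key=lambda i: q[s][i])
        s2, r = step(s, a)
        # Standard Q-learning update
        q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
        s = s2

policy = [ACTIONS[max(range(len(ACTIONS)), key=lambda i: q[s][i])]
          for s in range(GOAL_STATE)]
print(policy)  # greedy call sequence learned from interaction alone
```

The agent never sees the service semantics up front; it infers the correct call order purely from rewards, which is the interoperability argument made above.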
This document presents an algorithm for the nonobtrusive recognition of sleep/wake states using signals derived from ECG, respiration, and body movement captured while lying in bed. Multinomial logistic regression was chosen as the mathematical core of the system's data analytics. Derived parameters of the three signals serve as the input for the proposed method. The overall achieved accuracy is 84% for wake/sleep stages, with a Cohen's kappa of 0.46. The presented algorithm should support experts in analyzing sleep quality in more detail. The results confirm the potential of this method and disclose several ways for its improvement.
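Multinomial logistic regression, the method named above, scores each class with a linear function of the features and normalizes the scores with a softmax. A minimal prediction sketch (the weights and feature values are hand-picked illustration values, not the study's fitted model):

```python
import math

def softmax(z):
    """Turn raw class scores into probabilities that sum to 1."""
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    s = sum(exps)
    return [e / s for e in exps]

def predict(weights, features):
    """Multinomial logistic regression: one linear score per class
    (bias first in each weight row), normalized via softmax."""
    scores = [row[0] + sum(w * x for w, x in zip(row[1:], features))
              for row in weights]
    return softmax(scores)

# Hypothetical weights; features: [HRV parameter, respiration rate
# parameter, movement count], all illustrative.
weights = [
    [-1.0, 0.5, 0.2, 2.0],    # class 0: Wake (movement raises the score)
    [1.0, -0.5, -0.2, -2.0],  # class 1: Sleep
]
p_wake, p_sleep = predict(weights, [0.3, 0.4, 1.5])
print(round(p_wake, 3), round(p_sleep, 3))
```

In the study, such class probabilities per epoch would be compared against expert-scored stages, which is where the 84% accuracy and kappa of 0.46 come from.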
More than ever, a glance at current societal debates shows how important applied research is: it provides well-founded facts and insights for political discourse. This is how we understand our research mission: we help society and business develop solutions to the pressing questions of our time. How do we do that? The following pages give you a brief insight into the current research projects at the university. You will find that this time the focus is entirely on people. Good research is also the basis for training the highly qualified graduates that companies so urgently need in order to remain competitive in the long term. We prepare students for the "world out there" with special laboratory projects (page 26), exciting doctoral research (page 10) and support for spin-offs (page 18).
Most antimicrobial peptides (AMPs) and their synthetic mimics (SMAMPs) are thought to act by permeabilizing cell membranes. For antimicrobial therapy, selectivity for pathogens over mammalian cells is a key requirement. Understanding membrane selectivity is thus essential for designing AMPs and SMAMPs to complement classical antibiotics in the future. This study focuses on membrane permeabilization induced by SMAMPs and their selectivity for membranes with different lipid compositions. We measure release and fluorescence lifetime of a self-quenching dye in lipid vesicles. Apart from the dose-response, we quantify the strength of individual leakage events, and, employing cumulative kinetics, categorize permeabilization behavior. We propose that differing selectivities in a series of SMAMPs arise from a combination of the effect of the antimicrobial agent and the susceptibility of the membrane (with a given lipid composition) for certain types of leakage behavior. The unselective and hemolytic SMAMP is found to act mainly by the asymmetry stress mechanism, mediated by hydrophobic insertion of SMAMPs into lipid layers. The more selective SMAMPs induced leakage events occurring stochastically over several hours. Lipid intrinsic properties might additionally amplify the efficiency of leakage events. Leakage behavior changes with both the design of the SMAMP and the lipid composition of the membrane. Understanding how leakage behavior contributes to the selectivity and activity of antimicrobial agents will aid the design and screening of antimicrobials. An understanding of the underlying processes facilitates the comparison of membrane permeabilization across in vitro and in vivo assays.
Vitamin E (VitE) additives are important in treating osteoarthritis, including cartilage regeneration, due to their antioxidant and anti-inflammatory properties. The present research study focuses on the ability of the biological antioxidant VitE (alpha-tocopherol isoform) to reduce or minimize oxidative degradation of soft implantable polyurethane (PU) elastomers after extended periods of time (5 months) in vitro. The effect of the oxidation storage media on the morphology of the segmented PUs was evaluated via mechanical softening and the crystallization and melting behavior of both soft and hard segments (SS, HS) using dynamic mechanical analysis (DMA). Bulk mechanical properties of the potential implant materials during ageing were predicted from comprehensive mechanical testing of the biomaterials under cyclic tension and compression loads. Five-month in vitro data suggest that the prepared siloxane-poly(carbonate urethane) formulations have sufficient resistance against degradation to be suitable materials for long-term bio-stable chondral implants. Most importantly, the positive effect of incorporating VitE (0.5 or 1.0% w/w) as bio-antioxidant and lubricant on the bio-stability was observed for all PU types. VitE additives protected the surface layer from erosion and cracking during chemical oxidation in vitro as well as from thermal oxidation during extrusion re-processing.
Context: Companies in highly dynamic markets increasingly struggle with their ability to plan product development and to create reliable roadmaps. A main reason is the decreasing predictability of markets, technologies, and customer behaviors. New approaches for product roadmapping seem to be necessary in order to cope with today's highly dynamic conditions. Little research is available with respect to such new approaches. Objective: In order to better understand the state of the art and to identify research gaps, this article presents a review of the scientific literature on product roadmapping. Method: We performed a systematic literature review (SLR) to identify papers in the field of computer science. Results: After filtering, the search resulted in a set of 23 relevant papers. The identified papers focus on different aspects such as roadmap types, processes for creating and updating roadmaps, problems and challenges with roadmapping, approaches to visualize roadmaps, generic frameworks, and specific aspects such as the combination of roadmaps with business modeling. Overall, the scientific literature covers many important aspects of roadmapping but provides only little knowledge on how to create product roadmaps under highly dynamic conditions. Research gaps concern, for instance, the inclusion of goals or outcomes in product roadmaps, the alignment of a roadmap with a product vision, and the inclusion of product discovery activities in product roadmaps. In addition, the transformation from traditional roadmapping processes to new ways of roadmapping is not sufficiently addressed in the scientific literature.
Digital technologies are moving into physical products. Smart cars, connected lightbulbs and data-generating tennis rackets are examples of previously “pure” physical products that turned into “digitized products”. Digitizing products offers many use cases for consumers that will hopefully persuade them to buy these products. Yet, as revenues from selling digitized products will remain small in the near future, digitized product manufacturers have to look for other sources of benefits. Producer-side use cases describe how manufacturers can benefit internally from the digitized products they produce. Our article identifies three categories of such use cases: product-, service-, and process-related ones.
Additive manufacturing (AM) is a promising manufacturing method for many industrial sectors. For this application, industrial requirements such as high production volumes and coordinated implementation must be taken into account. These tasks of the internal handling of production facilities are carried out by the Production Planning and Control (PPC) information system. A key factor in the planning and scheduling is the exact calculation of manufacturing times. For this purpose we investigate the use of Machine Learning (ML) for the prediction of manufacturing times of AM facilities.
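A minimal sketch of what such a manufacturing-time prediction could look like, using a least-squares baseline. The single feature (part volume) and all numbers are hypothetical illustrations; the paper does not prescribe a specific ML model or feature set.

```python
# Hedged sketch: ML-style prediction of AM build times from one
# hypothetical feature (part volume). Illustrative values only.

def fit_line(xs, ys):
    """Ordinary least-squares fit: y ≈ intercept + slope * x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical training data: part volume in cm^3 vs. measured build
# time in minutes on an AM facility (not data from the paper).
volumes = [10.0, 25.0, 40.0, 60.0, 85.0]
build_minutes = [52.0, 95.0, 140.0, 200.0, 276.0]

slope, intercept = fit_line(volumes, build_minutes)

def predict_build_time(volume_cm3):
    """Predicted build time in minutes for a given part volume."""
    return intercept + slope * volume_cm3

print(round(predict_build_time(50.0), 1))
```

In a PPC context, such predictions would feed directly into scheduling; a production-grade model would of course use more features (layer count, material, machine) and a richer learner.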
This article examines the possibilities of the trending topic of chatbots and at the same time offers an orientation framework for companies. In order to remain competitive in the digital age, companies must respond to the growing demands of their customers. Chatbots are an ideal way to make customer communication more efficient, more individual, and more cost-effective. In the future, chatbots could develop into the most important interface between a company and its customers. However, the expert interviews and tests conducted conclude that the potential of chatbots is not yet fully exploited. Current chatbots reach their limits particularly in the areas of natural language processing, intelligence, and automation, and therefore find only limited acceptance in society.
Potentials of smart contracts-based disintermediation in additive manufacturing supply chains
(2019)
We investigate which potentials are created by using smart contracts for disintermediation in supply chains for additive manufacturing. Using a qualitative, critical realist research approach, we analyzed three case studies with companies active in additive manufacturing. Based on interviews with experts from these companies, we identified eight key requirements for disintermediation and four potentials of smart contracts-based disintermediation.
We report on the reflectance, transmittance and fluorescence spectra (λ = 200–1200 nm) of four types of chicken eggshells (white, brown, light green, dark green) measured in situ without pretreatment and after ablation of 20–100 μm of the outer shell regions. The color pigment protoporphyrin IX (PPIX) is embedded in the protein phase of all four shell types as highly fluorescent monomers, in the white and light green shells additionally as non-fluorescent dimers, and in the brown and dark green shells mainly as non-fluorescent poly-aggregates. The green shell colors are formed from an approximately equimolar mixture of PPIX and biliverdin. The axial distributions of protein and color pigments were evaluated from the combined reflectances of both the outer and inner shell surfaces, as well as from the transmittances. For the data generation we used the radiative transfer model in the random walk and Kubelka-Munk approaches.
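The Kubelka-Munk approach mentioned above relates the diffuse reflectance R of an optically thick, scattering layer to the ratio of absorption to scattering coefficient via the remission function K/S = (1 - R)^2 / (2R). A minimal sketch with illustrative reflectance values (not the measured eggshell data):

```python
def kubelka_munk(reflectance):
    """Kubelka-Munk remission function K/S = (1 - R)^2 / (2 R) for the
    diffuse reflectance R of an optically thick scattering layer."""
    if not 0.0 < reflectance <= 1.0:
        raise ValueError("reflectance must lie in (0, 1]")
    return (1.0 - reflectance) ** 2 / (2.0 * reflectance)

# Illustrative values: higher pigment concentration lowers R and
# raises K/S. These are not eggshell measurements.
for r in (0.8, 0.5, 0.2):
    print(f"R = {r:.1f} -> K/S = {kubelka_munk(r):.3f}")
```

K/S is approximately proportional to pigment concentration in a non-absorbing matrix, which is what makes the function useful for evaluating pigment distributions from reflectance data.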
Assistive environments are entering our homes faster than ever. However, there are still various barriers to be broken. One of the crucial points is the personalization of offered services and the integration of assistive technologies into common objects, and therefore into the regular daily routine. Recognition of sleep patterns for a preliminary sleep study is one of the health services that could be performed in an unobtrusive way. This article proposes a hardware system for the measurement of the bio-vital signals necessary for an initial sleep study in an unobtrusive way. The first results confirm the potential of measuring breathing and movement signals with the proposed system.
Nowadays, the demand for a MEMS development/design kit (MDK) is greater than ever before. In order to achieve high quality and cost effectiveness in the development process for automotive and consumer applications, an advanced design flow for the MEMS (micro-electro-mechanical systems) element is urgently required. In this paper, such a development methodology and flow for the parasitic extraction of active semiconductor devices is presented. The methodology considers geometrical extraction, links the electrically active pn junctions to SPICE standard library models, and subsequently extracts the netlist. An example of a typical pressure sensor is presented and discussed. Finally, the results of the parasitic extraction are compared with fabricated devices in terms of accuracy and capability.
This study describes a non-contact measuring and parameter identification procedure designed to evaluate inhomogeneous stiffness and damping characteristics of the annular ligament in the physiological amplitude and frequency range without the application of large static external forces that can cause unnatural displacements of the stapes. To verify the procedure, measurements were first conducted on a steel beam. Then, measurements on an individual human cadaveric temporal bone sample were performed. The estimated results support the inhomogeneous stiffness and damping distribution of the annular ligament and are in good agreement with the multiphoton microscopy results, which show that the posterior-inferior corner of the stapes footplate is the stiffest region of the annular ligament. This method can potentially help to establish a correlation between stiffness and damping characteristics of the annular ligament and inertia properties of the stapes and, thus, help to reduce the number of independent parameters in model-based hearing diagnosis.
Organisationslernen
(2019)
Through organizational learning, organizations adapt to changing environmental demands (digitalization, political reforms, etc.). Organizations can increase their learning capacity by strengthening their dynamic capabilities through a low division of labor, by scrutinizing their knowledge absorption process, and by creating structural and temporal ambidexterity. They can orient themselves toward the model of the learning organization and promote flat organizational structures and teamwork. Organizational learning offers useful starting points in particular for public administrations, which currently lack sufficient learning capacity.
This paper discusses the optimal control problem of increasing the energy efficiency of induction machines in dynamic operation, including the field-weakening regime. In an offline procedure, optimal current and flux trajectories are determined such that the copper losses are minimized during transient operation. These trajectories are useful for a subsequent online implementation.
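As background on where such loss-minimal trajectories come from: under strongly simplifying stationary assumptions (rotor-flux orientation, torque proportional to the product of the d- and q-axis currents, equal effective resistances, no saturation or iron losses), minimizing the copper losses i_d^2 + i_q^2 for a demanded torque yields the well-known equal-current split. The sketch below illustrates that textbook special case only; it is not the paper's dynamic optimization procedure.

```python
import math

def loss_minimal_currents(torque, k_t=1.0):
    """Stationary copper-loss-minimal current split under the simplifying
    assumptions T = k_t * i_d * i_q and losses ~ i_d^2 + i_q^2.
    The Lagrangian optimum is i_d = i_q = sqrt(T / k_t)."""
    if torque < 0 or k_t <= 0:
        raise ValueError("torque must be >= 0 and k_t > 0")
    i = math.sqrt(torque / k_t)
    return i, i

# For a demanded torque of 4 (units where k_t = 1), the optimum is
# i_d = i_q = 2; any other split producing the same torque has higher
# copper losses (e.g. i_d = 1, i_q = 4 gives losses 17 instead of 8).
print(loss_minimal_currents(4.0))
```

In dynamic operation and in the field-weakening regime, voltage and flux constraints shift this optimum, which is precisely why offline trajectory optimization as in the paper becomes necessary.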
A clinically useful system for individual continuous health data monitoring needs an architecture that takes into account all relevant medical and technical conditions. The requirements for a health app to support such a system are collected, and a vendor independent architecture is designed that allows the collection of vital data from arbitrary wearables using a smartphone. A prototypical implementation for the main scenario shows the feasibility of the approach.
In summary, we believe that current "sleep monitoring" consumer devices on the market must undergo a more robust validation process before being made available and distributed to the general public. This is especially noteworthy as there have been first reports in the literature that inaccurate feedback from such consumer devices can worry subjects and may even lead to compromised well-being of the user.
Background: Design patterns are supposed to improve various quality attributes of software systems. However, there is controversial quantitative evidence of this impact. Especially for younger paradigms such as service- and microservice-based systems, there is a lack of empirical studies.
Objective: In this study, we focused on the effect of four service-based patterns - namely process abstraction, service façade, decomposed capability, and event-driven messaging - on the evolvability of a system from the viewpoint of inexperienced developers.
Method: We conducted a controlled experiment with Bachelor students (N = 69). Two functionally equivalent versions of a service-based web shop - one with patterns (treatment group), one without (control group) - had to be changed and extended in three tasks. We measured evolvability by the effectiveness and efficiency of the participants in these tasks. Additionally, we compared both system versions with nine structural maintainability metrics for size, granularity, complexity, cohesion, and coupling.
Results: Both experiment groups were able to complete a similar number of tasks within the allowed 90 min. Median effectiveness was 1/3. Mean efficiency was 12% higher in the treatment group, but this difference was not statistically significant. Only for the third task did we find statistical support for accepting the alternative hypothesis that the pattern version led to higher efficiency. In the metric analysis, the pattern version had worse measurements for size and granularity while simultaneously having slightly better values for coupling metrics. Complexity and cohesion were not impacted.
Interpretation: For the experiment, our analysis suggests that the difference in efficiency is stronger with more experienced participants and increased from task to task. With respect to the metrics, the patterns introduce additional volume in the system, but also seem to decrease coupling in some areas.
Conclusions: Overall, there was no clear evidence for a decisive positive effect of using service-based patterns, neither for the student experiment nor for the metric analysis. This effect might only be visible in an experiment setting with higher initial effort to understand the system or with more experienced developers.
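The significance checks described in the results can be illustrated with a distribution-free resampling test on the efficiency difference between the two groups. The scores below are made up for illustration (they are not the experiment's data), and the permutation test shown is one standard choice, not necessarily the test used in the study.

```python
import random
from statistics import mean

# Hypothetical efficiency scores (e.g. tasks solved per hour) for a
# control and a treatment group; illustrative values only.
control = [1.8, 2.0, 2.2, 1.9, 2.1, 2.0]
treatment = [2.1, 2.3, 2.2, 2.4, 2.0, 2.5]

def permutation_p_value(a, b, n_resamples=10_000, seed=42):
    """Two-sided permutation test on the difference of means: shuffle the
    pooled scores and count how often a difference at least as large as
    the observed one occurs by chance."""
    rng = random.Random(seed)
    observed = abs(mean(a) - mean(b))
    pooled = a + b
    hits = 0
    for _ in range(n_resamples):
        rng.shuffle(pooled)
        diff = abs(mean(pooled[:len(a)]) - mean(pooled[len(a):]))
        if diff >= observed:
            hits += 1
    return hits / n_resamples

print(permutation_p_value(control, treatment))
```

Permutation tests are attractive for small experiment groups because they make no normality assumption about the efficiency scores.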
Digitalization of products and services commonly causes substantial changes in business models, operations, organization structures and IT infrastructures of enterprises. Motivated by experiences and observations from digitalization projects, the paper investigates the effects of digitalization on enterprise architectures (EA). EA models serve as representation of business, information system and technical aspects of an enterprise to support management and development. By comparing EA models before and after digitalization, the paper analyzes the kinds of changes visible in the EA model. The most important finding is that newly created digitized products and the associated (product)- and enterprise architecture are no longer properly integrated into the overall architecture and even exist in parallel. Thus, the focus of this work is on showing these parallel architectures and proposing derivations for a better integration.
Novel design for a coreless printed circuit board transformer realizing high bandwidth and coupling
(2019)
Rogowski coils offer galvanic isolation and can measure alternating currents with a high bandwidth. Coreless printed circuit board (PCB) transformers have been used as an alternative to limit the additional stray inductance if a Rogowski coil can not be attached to the circuit. A new PCB transformer layout is proposed to reduce cost, decrease additional stray inductance, increase the bandwidth of current measurements and simplify the integration into existing designs.
With the revision of the DIN EN 50173 (VDE 0800-173) series, the optical channel classes were, among other things, removed without replacement. To close the resulting gap, the German committee DKE GUK 715.3 "Information technology cabling of building complexes" developed new classes, which were published in June 2019 in DIN VDE 0800-173-100 "Classification of optical fibre links". The standard classifies optical fibre links for application-neutral communication cabling systems in accordance with DIN EN 50173-1.
It helps users to enable a broad range of applications, to simplify the selection of a cabling system, to generate a future-proof classification of optical fibre cabling, and to describe system requirements.
The classes defined in the standard describe the requirements for the links and are based on a maximum permissible insertion loss in dB for maximum link lengths, with the bandwidth-length product additionally taken into account.
This article provides an overview of the standard and presents application examples.
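The classification criterion just described (a maximum permissible insertion loss for a maximum link length, plus the bandwidth-length product) amounts to a simple budget calculation. The per-event losses and fibre parameters below are typical catalogue values for illustration, not the limits defined in DIN VDE 0800-173-100:

```python
def channel_insertion_loss(length_km, atten_db_per_km, n_connectors,
                           connector_loss_db=0.75, n_splices=0,
                           splice_loss_db=0.3):
    """Insertion-loss budget of an optical channel: fibre attenuation
    plus connector and splice contributions. The per-event losses here
    are typical values; the binding limits come from the standard."""
    return (length_km * atten_db_per_km
            + n_connectors * connector_loss_db
            + n_splices * splice_loss_db)

def modal_bandwidth_mhz(blp_mhz_km, length_km):
    """Effective bandwidth derived from the bandwidth-length product."""
    return blp_mhz_km / length_km

# Example: 300 m of OM3 multimode fibre at 850 nm (about 3.5 dB/km,
# effective modal bandwidth about 2000 MHz*km) with two connectors.
loss = channel_insertion_loss(0.3, 3.5, 2)
bw = modal_bandwidth_mhz(2000, 0.3)
print(round(loss, 2), round(bw))
```

A link then satisfies a given class if its computed loss stays below the class's permissible insertion loss at the class's maximum length and the available bandwidth covers the intended application.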
Data analytics tasks on large datasets are computationally intensive and often demand the compute power of cluster environments. Yet, data cleansing, preparation, dataset characterization, and statistics or metrics computation steps are frequent. These are mostly performed ad hoc, in an explorative manner, and mandate low response times. However, such steps are I/O-intensive and typically very slow due to low data locality and inadequate interfaces and abstractions along the stack. They typically result in prohibitively expensive scans of the full dataset and transformations at interface boundaries.
In this paper, we examine R as analytical tool, managing large persistent datasets in Ceph, a wide-spread cluster file-system. We propose nativeNDP – a framework for Near Data Processing that pushes down primitive R tasks and executes them in-situ, directly within the storage device of a cluster-node. Across a range of data sizes, we show that nativeNDP is more than an order of magnitude faster than other pushdown alternatives.
In the present tutorial we perform a cross-cut analysis of database storage management from the perspective of modern storage technologies. We argue that neither the design of modern DBMS, nor the architecture of modern storage technologies are aligned with each other. Moreover, the majority of the systems rely on a complex multi-layer and compatibility oriented storage stack. The result is needlessly suboptimal DBMS performance, inefficient utilization, or significant write amplification due to outdated abstractions and interfaces. In the present tutorial we focus on the concept of native storage, which is storage operated without intermediate abstraction layers over an open native storage interface and is directly controlled by the DBMS.
In recent years Indonesia has been confronted with an excessive generation of municipal solid waste (MSW), predominantly present in the form of organic refuse. While moving towards integrated solid waste management (ISWM) is an important strategy used to control its generation, it is also now recognized that economic approaches need to be promoted as well in order to tackle the problem concertedly. In this case study, empirical approaches are developed to understand how market instruments could be introduced into environmental services and how to apply a co-benefit approach in a green economy paradigm for Indonesia. We investigate the feasibility of introducing market instruments in Indonesia by applying local co-benefit initiatives adapted from German experiences in integrating market instruments into MSW management practices. Currently, co-benefit activities are undertaken in the Sukunan village (Yogjakarta) to promote waste composting using market incentives in the framework of community-based solid waste management (CBSWM). This scheme aims at reducing MSW generation at its source and mobilizing people to be involved in waste separation (organic and non-organic) at the household level. As a result, about 200,000 t of CO2 emissions could be successfully reduced annually. By integrating market instruments into waste management practices, the results of our studies suggest that Indonesia could make positive changes to its environmental policy and regulation of MSW at local levels. The country's policymakers have played important roles in promoting the effectiveness of urban development with co-benefit approaches to facilitate its transition towards a green economy.
Enterprises are transforming their strategy, culture, processes, and information systems to enlarge their digitalization efforts or to strive for digital leadership. The digital transformation profoundly disrupts existing enterprises and economies. In recent times, many new business opportunities have appeared that use the potential of the Internet and related digital technologies: the Internet of Things, services computing, cloud computing, artificial intelligence, big data with analytics, mobile systems, collaboration networks, and cyber-physical systems. Digitization fosters the development of IT environments with many rather small and distributed structures, like the Internet of Things, microservices, or other micro-granular elements. Architecting micro-granular structures has a substantial impact on architecting digital services and products. The change from a closed-world modeling perspective to a more flexible open world of living software and system architectures defines the context for flexible and evolutionary software approaches, which are essential to enable the digital transformation. In this paper, we reveal multiple perspectives of digital enterprise architecture and decisions to effectively support value- and service-oriented software systems for intelligent digital services and products.
Collaborative apparel consumption is proposed as a more sustainable alternative to conventional consumption. The purpose of this study is the exploration of consumers' motives to participate in collaborative apparel consumption. Findings suggest that consumers' intention to participate in collaborative apparel consumption is mainly influenced by financial benefits, convenience, and sustainability awareness.
Due to the large inter-individual variances and the poor optical accessibility of the ear, the specificity of hearing diagnostics today is severely restricted with respect to identifying a certain clinical picture and providing a quantitative assessment. Often only a yes-or-no decision is possible, which depends strongly on the subjective assessment of the ENT physician. A novel approach, in which objectively obtainable, non-invasive audiometric measurements are evaluated using a numerical middle-ear model, makes it possible to make the hidden middle-ear properties visible and quantifiable. The central topic of this paper is a novel parameter identification algorithm that combines inverse fuzzy arithmetic with an artificial neural network in order to achieve a coherent overall diagnostic picture when comparing model and measurement. Its usage is demonstrated on a pathological pattern called malleus fixation, in which the upper ligament of the malleus is pathologically stiffened.
Model-based hearing diagnosis based on wideband tympanometry measurements utilizing fuzzy arithmetic
(2019)
Today's audiometric methods for the diagnosis of middle ear disease are often based on a comparison of measurements with standard curves that represent the statistical range of normal hearing responses. Because of large inter-individual variances in the middle ear, especially in wideband tympanometry (WBT), specificity and quantitative evaluation are greatly restricted. A new model-based approach could transform today's predominantly qualitative hearing diagnostics into a quantitative and tailored, patient-specific diagnosis, by evaluating WBT measurements with the aid of a middle-ear model. For this particular investigation, a finite element model of a human ear was used. It consisted of an acoustic ear canal and a tympanic cavity model, a middle ear with detailed nonlinear models of the tympanic membrane and annular ligament, and a simplified inner-ear model. This model has made it possible to identify pathologies from measurements, by analyzing the parameters through sensitivity studies and parameter clustering. Uncertainties due to lack of knowledge, subjectivity in numerical implementation, and model simplification are taken into account by the application of fuzzy arithmetic. The most confident parameter set can be determined by applying an inverse fuzzy method to the measurement data. The principle and the benefits of this model-based approach are illustrated by the example of a two-mass oscillator, and also by the simulation of the energy absorbance of an ear with malleus fixation, where the parameter changes that are introduced can be determined quantitatively through the system identification.
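The fuzzy arithmetic referred to here propagates parameter uncertainty through the model: at each alpha-cut, a fuzzy number reduces to an interval, and operations on fuzzy numbers follow classical interval arithmetic. A minimal sketch with hypothetical parameter intervals (not the model's actual parameters):

```python
def alpha_cut_mul(a, b):
    """Multiplication of two uncertain parameters at a single alpha-cut
    of a fuzzy number, each represented as an interval (lo, hi). The
    result interval is the min/max over all endpoint products: standard
    interval arithmetic, the building block of fuzzy arithmetic."""
    products = [x * y for x in a for y in b]
    return min(products), max(products)

# Illustrative: an uncertain stiffness factor in [0.9, 1.2] combined
# with an uncertain geometry factor in [2.0, 3.0]; hypothetical values.
print(alpha_cut_mul((0.9, 1.2), (2.0, 3.0)))
```

Evaluating such interval operations across a stack of alpha-cuts yields the fuzzy-valued model output; the inverse method mentioned in the abstract then searches for the parameter set whose fuzzy output best matches the measured data.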
An important shift in software delivery is the definition of a cloud service as an independently deployable unit by following the microservices architectural style. Container virtualization facilitates development and deployment by ensuring independence from the runtime environment. Thus, cloud services are built as container-based systems - a set of containers that control the lifecycle of software and middleware components. However, using containers leads to a new paradigm for service development and operation: self-service environments enable software developers to deploy and operate container-based systems on their own - you build it, you run it. Following this approach, more and more operational aspects are transferred to the responsibility of software developers. In this work, we propose a concept for self-adaptive cloud services based on container virtualization in line with the microservices architectural style and present a model-based approach that assists software developers in building these services. Based on operational models specified by developers, the mechanisms required for self-adaptation are automatically generated. As a result, each container automatically adapts itself in a reactive, decentralized manner. We evaluate a prototype which leverages the emerging TOSCA standard to specify operational behavior in a portable manner.
Parallel applications are the computational backbone of major industry trends and grand challenges in science. Whereas these applications are typically constructed for dedicated High Performance Computing clusters and supercomputers, the cloud emerges as attractive execution environment, which provides on-demand resource provisioning and a pay-per-use model. However, cloud environments require specific application properties that may restrict parallel application design. As a result, design trade-offs are required to simultaneously maximize parallel performance and benefit from cloud-specific characteristics.
In this paper, we present a novel approach to assess the cloud readiness of parallel applications based on the design decisions made. By discovering and understanding the implications of these parallel design decisions on an application's cloud readiness, our approach supports the migration of parallel applications to the cloud. We introduce an assessment procedure, its underlying meta model, and a corresponding instantiation to structure this multi-dimensional design space. For evaluation purposes, we present an extensive case study comprising three parallel applications and discuss their cloud readiness based on our approach.
To remain competitive in a fast changing environment, many companies started to migrate their legacy applications towards a Microservices architecture. Such extensive migration processes require careful planning and consideration of implications and challenges likewise. In this regard, hands-on experiences from industry practice are still rare. To fill this gap in scientific literature, we contribute a qualitative study on intentions, strategies, and challenges in the context of migrations to Microservices. We investigated the migration process of 14 systems across different domains and sizes by conducting 16 in-depth interviews with software professionals from 10 companies. Along with a summary of the most important findings, we present a separate discussion of each case. As primary migration drivers, maintainability and scalability were identified. Due to the high complexity of their legacy systems, most companies preferred a rewrite using current technologies over splitting up existing code bases. This was often caused by the absence of a suitable decomposition approach. As such, finding the right service cut was a major technical challenge, next to building the necessary expertise with new technologies. Organizational challenges were especially related to large, traditional companies that simultaneously established agile processes. Initiating a mindset change and ensuring smooth collaboration between teams were crucial for them. Future research on the evolution of software systems can in particular profit from the individual cases presented.
Microservices are a topic driven mainly by practitioners and academia is only starting to investigate them. Hence, there is no clear picture of the usage of Microservices in practice. In this paper, we contribute a qualitative study with insights into industry adoption and implementation of Microservices. Contrary to existing quantitative studies, we conducted interviews to gain a more in-depth understanding of the current state of practice. During 17 interviews with software professionals from 10 companies, we analyzed 14 service-based systems. The interviews focused on applied technologies, Microservices characteristics, and the perceived influence on software quality. We found that companies generally rely on well established technologies for service implementation, communication, and deployment. Most systems, however, did not exhibit a high degree of technological diversity as commonly expected with Microservices. Decentralization and product character were different for systems built for external customers. Applied DevOps practices and automation were still on a mediocre level and only very few companies strictly followed the you build it, you run it principle. The impact of Microservices on software quality was mainly rated as positive. While maintainability received the most positive mentions, some major issues were associated with security. We present a description of each case and summarize the most important findings of companies across different domains and sizes. Researchers may build upon our findings and take them into account when designing industry-focused methods.
Changing requirements and qualification profiles of employees, increasingly complex digital systems up to artificial intelligence, missing standards for the seamless embedding of existing resources, and unpredictable returns on investment are just a few examples of the challenges an SME faces in the age of digitalisation. In most cases there is a lack of suitable tools and methods to support companies in the digital transformation of their value creation processes, but also a lack of training and learning materials. A European research project (BITTMAS - Business Transformation towards Digitalisation and Smart systems, ERASMUS+, 2016-1 DE02-KA202-003437) with international partners from science, associations, and industry has addressed this issue and developed various methods and instruments to support SMEs. Within the scope of a literature search, 16 suitable digitalisation concepts for production and logistics were identified. Subsequently, a learning platform was created to provide the user with consolidated and structured specialist knowledge, comprising a literature database with multi-variable sorting options by branch and digitalisation keyword, a video gallery with basic and advanced knowledge, and a glossary. The 16 identified concepts for transforming value-added processes in the context of digitalisation were transferred, via learning paths developed for coaching and training, into online course modules including test questions. A maturity model was developed and implemented in a self-assessment tool that analyses the potential of digitalisation in production and logistics in relation to the company's current technological digitalisation level. As a result, the user is presented with one or more of the 16 potential digitalisation concepts, or the delta of necessary enabler technologies not yet available is shown as a spider diagram.
For a successful implementation of the identified suitable digitalisation concepts in production and logistics, a further tool was developed to identify supplementary requirements for all company divisions and stakeholders in relation to the "digital transformation" in the form of a self-evaluation. This paper presents the methods and tools developed, the accompanying learning materials and the learning platform.
Contemporary public enterprises differ from their forebears. Today, they are more similar to private enterprises, receiving far more attention than previously, when privatization processes all over the world were in the spotlight. Furthermore, the broad research stream of entrepreneurship has so far neglected the consideration of public enterprises. To set a future research agenda, the author examines the dispersed literature using an integrative and organizing framework to identify major topics and research findings. This paper reviews articles that investigate the entrepreneurship in contemporary public enterprises. Despite the growing scholarly interest globally, this systematic literature review indicates there is no more than a loose connection between the literature streams of public entrepreneurship and corporate entrepreneurship. Specifically, the review shows that the multidimensional concept of entrepreneurial orientation has thus far been ignored, although autonomy plays a significant role in the literature review, namely in the context of the interference of the public owner. It also reveals other essential research gaps, such as the development of a modern theory of public enterprises. The linked research stream of public-sector corporate entrepreneurship offers a broad area of scholarly research and should encourage further investigation.