Purpose: The purpose of this paper is to describe and discuss the current state of fashion business academic education worldwide. This is motivated by the wish to develop recommendations for the fashion business bachelor program of Reutlingen University.
Design/methodology/approach: This paper is based on a systematic review of relevant fashion business academic programs. A qualitative comparison is conducted through a categorization of the programs’ content and a score system evaluating the programs’ concepts.
Findings: Key findings were that several factors ensure successful fashion business education: industry connections, international networks, project-based work, personalized career services and innovative approaches in teaching that include all steps along the fashion value chain.
Research limitations/implications: The research was primarily limited by the small number of schools assessed. As a result of the restricted time frame, the schools presented could only be analyzed with respect to a few aspects. Future research should focus on a more in-depth analysis and further-reaching comparisons, e.g. comparisons with teaching concepts outside the fashion business area or with requirements of fashion companies.
This paper presents the vision of the Internet of Things (IoT) and examines both its potential uses and the security risks it poses to users. In particular, the smart home use case is considered in more detail, and severe weaknesses of such devices are demonstrated using ZigBee as an example.
Industrie 4.0 enables the individualized production of small batch sizes at low cost. To this end, all plants must be networked with each other in order to exchange data and communicate. This networking can give rise to new risks and threats. In this paper, IT security in Industrie 4.0 is evaluated on the basis of possible threat scenarios, challenges, and countermeasures. It examines what options industrial companies have to prevent hacker attacks, and whether already established security concepts can simply be adopted for industrial plants.
The goal of this paper is to examine the security of the infrastructure of modern vehicle-to-vehicle communication. To this end, the security standards for radio communication are described in detail and then tested against possible attack models. With the presented knowledge of the VANET architecture, various attacks become easier to understand. This exposes the vulnerabilities and highlights countermeasures at suitable points in the architecture.
This paper deals with the new German electronic identity card. On the one hand, it explains the security goals of the identity card and the technical implementation of its architecture and protocols. The process of an online identification with the card is illustrated from the user's perspective. Risks and vulnerabilities of the technology in both software and hardware are discussed, and hacking attacks that have already occurred are presented. The paper sets out ways in which users can protect themselves against attacks. It gives the reasons why the new identity card has found little acceptance online, and why raising awareness of the available applications, reducing the price of card readers, and the eIDAS regulation issued by the European Parliament and the Council will not help to promote its use. These results are based on a user study. On the other hand, ideas are presented for how the use of the card's electronic functions could be promoted instead.
As part of this in-depth scientific study, IT risk management is evaluated on the basis of existing approaches. The question of the extent to which IT risk management can support a company is addressed and then illustrated using two case studies.
The use of technical aids for analysis purposes in sports is now an integral part of the daily training routine of coaches and athletes. In almost every sport, video recordings are used to document and analyse the execution of movements. However, recordings from a single static location are often no longer sufficient. Virtual reality (VR) can offer a solution to this problem. VR adds a further layer to the recorded scene, allowing movement sequences to be assessed anew and in greater detail. To reproduce movements in a virtual environment, they must be recorded using motion capturing (MoCap). The goal of this paper is to find out whether the MoCap system Perception Neuron is capable of capturing high-speed movements.
In recent years, researchers and car manufacturers have been working on the prerequisites for the introduction of autonomous driving. The reliability and quality of digital data transmission generally play a decisive role for innovations and business models in the field of intelligent mobility, but also within the digital value chain. Before autonomous driving is fully introduced, the requirements placed on the digital infrastructure must be determined; at the same time, the threat landscape for autonomous driving must be analysed.
This paper analyses these requirements and threats and proposes general recommendations for action.
A heavily researched area of computer vision is facial feature detection, i.e. the detection of salient facial points such as the corners of the mouth or the chin. Accordingly, a large number of published methods can be found, which nevertheless differ considerably in terms of detection accuracy, robustness, and speed. Many methods are only conditionally real-time capable, or deliver satisfactory results only with high-resolution image sources. In recent years, methods have therefore been developed that attempt to solve these problems. This paper examines three of these state-of-the-art methods, Constrained Local Neural Fields (CLNF), Discriminative Response Map Fitting (DRMF), and Structured Output SVM (SO-SVM), as well as their implementations. An empirical comparison of their detection accuracy is carried out.
Background. The application of lean management is standard in many companies all over the world. It is used to continuously optimise existing production processes and to reduce the complexity of administrative processes. Unfortunately, in higher education, the awareness of lean management as a highly effective methodology is quite low.
Research aims. The research aim is to show how the lean strategy can be applied in university environments. Finally, this paper addresses the question of why it is so difficult to implement lean in a university environment and how an institution of higher education can move forward towards becoming a lean university.
Methodology. Based on a literature review, five key lean principles are presented and examples of their implementation are discussed using short case studies from our own institution. We also compare our findings with those in the literature.
Key findings. Lean offers the chance to improve the management of higher education institutions. This requires a commitment by the university's top management to convincing all stakeholders that a culture of lean helps the institution adapt to the rapidly changing environment of higher education.
The following article deals with wearables for horses. The goal is to increase the animals' safety when they break out of a pasture, and thereby minimise personal injury and property damage. To this end, the state of the art in outdoor positioning is compiled, and a classification of the different approaches is used to determine which positioning method appears suitable for horses. In addition, a questionnaire is to be designed to determine the characteristics and functionalities of a prototype.
Various maturity models exist in medicine that can support the digitalisation of hospitals. The requirements for a maturity model for this purpose cover aspects from both general and specific areas of the hospital. An analysis of the maturity models HIN, CCMM, EMRAM, and O-EMRAM reveals large gaps in the operating room area as well as missing aspects in the emergency department. No comprehensive maturity model was found. A combination of HIN and CCMM could cover almost all areas sufficiently. Additions from specialised maturity models, or even the development of a comprehensive maturity model, would be worthwhile.
This paper presents the possibilities of 3D controllers for use in interventional radiology and, in particular, for controlling real-time magnetic resonance imaging (MRI). This is of interest with regard to controlled navigation towards a target tissue. Real-time imaging allows the interventionalist to follow the course of the procedure; however, they cannot yet control the MRI scanner themselves while performing the procedure, as this is done by an assistant in the adjacent room. Given the high noise level, communication is very difficult. This paper addresses this issue and analyses the suitability of 3D controllers for real-time control of an MRI scanner. Both tracking-based and tracking-free devices were considered. The result was that tracking-based devices are less suitable, owing to insufficient interpretation of the inputs. The tracking-free devices, in contrast, are suitable, thanks to the correct interpretation of all inputs and their intuitive operation.
Defining and shaping the digital future is on everyone's lips, in industry and in teaching, and it is also the focus of this year's Informatics Inside conference. This includes, on the one hand, the opportunities that digitalisation brings, described for example in the hospital environment or in horse breeding, and on the other hand the interface between the real and the virtual world, illustrated by examples of face and motion recognition. It is striking that the students are also increasingly focusing on the security and privacy of personal data in a digital world. This includes fundamental security analyses of selected domains, e.g. Industrie 4.0 or the smart home, as well as the examination of concrete application scenarios such as autonomous driving, vehicle-to-vehicle communication, and the new electronic identity card. In addition, the students present their master's projects in short papers.
The participants not only meet the requirement of presenting the results of their work clearly in written form, but also of defending and discussing them interactively in front of their audience. Informatics Inside thus offers students a forum in which to present the results of their work professionally to an interested audience during their studies, to pick up ideas from other areas of specialisation, and also to critically question the work of others.
Reconstructing 3D face shape from a single 2D photograph as well as from video is an inherently ill-posed problem with many ambiguities. One way to resolve some of the ambiguities is using a 3D face model to aid the task. 3D morphable face models (3DMMs) are among the state-of-the-art methods for 3D face reconstruction, so-called 3D model fitting. However, currently existing methods have severe limitations, and most of them have not been trialled on in-the-wild data. Current analysis-by-synthesis methods form complex non-linear optimisation processes, and optimisers often get stuck in local optima. Further, most existing methods are slow, requiring in the order of minutes to process one photograph.
This thesis presents an algorithm to reconstruct 3D face shape from a single image as well as from sets of images or video frames in real time. We introduce a solution for linear fitting of a PCA shape identity model and expression blendshapes to 2D facial landmarks. To improve the accuracy of the shape, a fast face contour fitting algorithm is introduced. These components are run in iteration, resulting in a fast, linear shape-to-landmarks fitting algorithm. The algorithm is specifically designed to fit landmarks obtained from in-the-wild images, tackling imaging conditions that occur in such images, like facial expressions and the mismatch of 2D–3D contour correspondences. It achieves the shape reconstruction accuracy of much more complex, non-linear state-of-the-art methods while being multiple orders of magnitude faster.
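The core of such a linear shape-to-landmarks fit can be written as a regularised least-squares problem: find PCA coefficients that, after projection, best reproduce the observed 2D landmarks. The sketch below is a minimal illustration under simplifying assumptions (a fixed linear projection matrix and a Tikhonov weight `lam`), not the thesis's actual implementation:

```python
import numpy as np

def fit_shape_to_landmarks(mean_shape, basis, landmarks, projection, lam=1.0):
    """Least-squares fit of PCA shape coefficients to 2D landmarks.

    mean_shape: (3N,) mean 3D landmark positions, flattened
    basis:      (3N, K) PCA shape basis
    landmarks:  (2N,) observed 2D landmark positions, flattened
    projection: (2N, 3N) linear camera/projection matrix
    lam:        Tikhonov regularisation weight on the coefficients
    """
    A = projection @ basis                      # project basis into 2D
    b = landmarks - projection @ mean_shape     # residual w.r.t. mean shape
    K = A.shape[1]
    # Regularised normal equations: (A^T A + lam*I) alpha = A^T b
    alpha = np.linalg.solve(A.T @ A + lam * np.eye(K), A.T @ b)
    return alpha
```

Because the problem is linear, one solve suffices per iteration; alternating this with contour correspondence updates gives the iterative scheme described above.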
Second, we address the problem of fitting to sets of multiple images of the same person, as well as monocular video sequences. We extend the proposed shape-to-landmarks fitting to multiple frames by using the knowledge that all images are from the same identity. To recover facial texture, the approach uses texture from the original images, instead of employing the often-used PCA albedo model of a 3DMM. We employ an algorithm that merges texture from multiple frames in real-time based on a weighting of each triangle of the reconstructed shape mesh.
Last, we make the proposed real-time 3D morphable face model fitting algorithm available as open-source software. In contrast to the ubiquitously available 2D face models and code, there is a general lack of software for 3D morphable face model fitting, hindering widespread adoption. The library thus constitutes a significant contribution to the community.
Thematic issue on human-centred ambient intelligence: cognitive approaches, reasoning and learning
(2017)
This editorial presents advances in human-centred Ambient Intelligence applications which take cognitive issues into account when modelling users (e.g. stress, attention disorders), and which learn users' activities and preferences and adapt to them (e.g. at home, while driving a car). These papers also show AmI applications in health and education, which makes them even more valuable for society at large.
The main challenge when driving heat pumps with PV electricity is balancing differing electrical and thermal demands. In this article, a heuristic method for the optimal operation of a heat pump driven by a maximum share of PV electricity is presented. For this purpose, the thermal storages for domestic hot water (DHW) are activated in order to shift the operation of the heat pump to times of PV generation. The system under consideration refers to the thermal and electrical demands of a single-family house. It consists of a heat pump, a thermal energy storage for DHW, and a grid-connected PV system. For heating and the generation of domestic hot water, the heat pump runs with two different supply temperatures, thereby achieving a maximum overall COP. Within the optimisation algorithm, a set of heuristic rules is developed in such a way that the operational characteristics of the heat pump in terms of minimum running and stopping times are met, as well as the limiting constraints of upper and lower bounds on room temperature and storage energy content. Based on the electricity generated, a varying number of heat pump schedules fulfilling the boundary conditions are created. Finally, the schedule offering the maximum on-site utilization of PV electricity with a minimum number of heat pump starts, which serves as a secondary condition, is selected. Yearly simulations of this combination have been carried out. Initial results of this method indicate a significant rise in the on-site consumption of PV electricity and in the fulfilment of the heating demand by renewable electricity, with no need for a massive TES for the heating system in the form of a big water tank.
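The schedule-selection idea can be illustrated with a much-simplified sketch: enumerate candidate run blocks that respect a minimum running time, and keep the one covering the most forecast PV generation. The single-contiguous-block rule and all names below are illustrative assumptions standing in for the paper's full heuristic rule set:

```python
def schedule_heat_pump(pv_forecast, run_hours, min_block=2):
    """Pick heat pump run hours to maximise on-site PV use (illustrative).

    pv_forecast: hourly PV generation forecast (kW)
    run_hours:   total number of hours the heat pump must run
    min_block:   minimum contiguous running time in hours
    Returns a boolean run/idle flag per hour.
    """
    n = len(pv_forecast)
    block = max(run_hours, min_block)
    best_start, best_sum = 0, float("-inf")
    # Evaluate every feasible contiguous run block and keep the one
    # that coincides with the most PV generation.
    for start in range(n - block + 1):
        covered = sum(pv_forecast[start:start + block])
        if covered > best_sum:
            best_start, best_sum = start, covered
    return [best_start <= h < best_start + block for h in range(n)]
```

In the paper's setting, the candidate set would additionally be filtered by room temperature and storage energy constraints before the PV-coverage criterion is applied.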
This paper examines the efficacy of social media systems in customer complaint handling. The emergence of social media, as a useful complement and (possibly) a viable alternative to the traditional channels of service delivery, motivates this research. The theoretical framework, developed from the literature on social media and complaint handling, is tested against data collected from two different channels (hotline and social media) of a German telecommunication services provider, in order to gain insights into channel efficacy in complaint handling. We contribute to the understanding of firms' technology usage for complaint handling in two ways:
(a) by conceptualizing and evaluating complaint handling quality across traditional and social media channels and (b) by comparing the impact of complaint handling quality on key performance outcomes such as customer loyalty, positive word-of-mouth, and cross-purchase intentions across traditional and social media channels.
Characterisation of porous knitted titanium for replacement of intervertebral disc nucleus pulposus
(2017)
Effective restoration of human intervertebral disc degeneration is challenged by numerous limitations of the currently available spinal fusion and arthroplasty treatment strategies. Consequently, the use of artificial biomaterial implants is gaining attention as a potential therapeutic strategy. Our study is aimed at investigating and characterizing a novel knitted titanium (Ti6Al4V) implant for the replacement of the nucleus pulposus to treat early stages of chronic intervertebral disc degeneration. A specific knitted geometry of the scaffold with a porosity of 67.67 ± 0.824% was used to overcome tissue integration failures. Furthermore, to improve the wear resistance without impairing the original mechanical strength, an electro-polishing step was employed. The electro-polishing treatment reduced the surface roughness from 15.22 ± 3.28 to 4.35 ± 0.87 μm without affecting its wettability, which remained at 81.03 ± 8.5°. Subsequently, the cellular responses of human mesenchymal stem cells (SCP1 cell line) and human primary chondrocytes were investigated, which showed positive responses in terms of adherence and viability. The surface wettability was further enhanced to a superhydrophilic state by oxygen plasma treatment, which caused a substantial increase in the proliferation of SCP1 cells and primary chondrocytes. Our study implies that, owing to the scaffold's physicochemical and biocompatible properties, it could improve the clinical performance of nucleus pulposus replacement.
A wide variety of cell types exhibit substrate topography-based behavior, also known as contact guidance. However, the precise cellular mechanisms underlying this process are still unknown. In this study, we investigated contact guidance by studying the reaction of human endothelial cells (ECs) to well-defined microgroove topographies, both during and after initial cell spreading. As the cytoskeleton plays a major role in cellular adaptation to topographical features, two methods were used to perturb cytoskeletal structures. Inhibition of actomyosin contractility with the chemical inhibitor blebbistatin demonstrated that initial contact guidance events are independent of traction force generation. However, cell alignment to the grooved substrate was altered at later time points, suggesting an initial ‘passive’ phase of contact guidance, followed by a contractility-dependent ‘active’ phase that relies on mechanosensitive feedback. The actin cytoskeleton was also perturbed in an indirect manner by culturing cells upside down, resulting in decreased levels of contact guidance and suggesting that a possible loss of contact between the actin cytoskeleton and the substrate could lead to cytoskeleton impairment. The process of contact guidance at the microscale was found to be primarily lamellipodia driven, as no bias in filopodia extension was observed on micron-scale grooves.
Intermediate filament reorganization dynamically influences cancer cell alignment and migration
(2017)
The interactions between a cancer cell and its extracellular matrix (ECM) have been the focus of an increasing amount of investigation. The role of the intermediate filament keratin in cancer has also been coming into focus of late, but more research is needed to understand how this piece fits in the puzzle of cytoskeleton-mediated invasion and metastasis. In Panc-1 invasive pancreatic cancer cells, keratin phosphorylation in conjunction with actin inhibition was found to be sufficient to reduce cell area below either treatment alone. We then analyzed intersecting keratin and actin fibers in the cytoskeleton of cyclically stretched cells and found no directional correlation. The role of keratin organization in Panc-1 cellular morphological adaptation and directed migration was then analyzed by culturing cells on cyclically stretched polydimethylsiloxane (PDMS) substrates, nanoscale grates, and rigid pillars. In general, the reorganization of the keratin cytoskeleton allows the cell to become more ‘mobile’, exhibiting faster and more directed migration and orientation in response to external stimuli. By combining keratin network perturbation with a variety of physical ECM signals, we demonstrate the interconnected nature of the architecture inside the cell and the scaffolding outside of it, and highlight the key elements facilitating cancer cell-ECM interactions.
Cell-cell and cell-extracellular matrix (ECM) adhesion regulates fundamental cellular functions and is crucial for cell-material contact. Adhesion is influenced by many factors, like the affinity and specificity of the receptor-ligand interaction or the overall ligand concentration and density. To investigate the molecular details of cell-ECM and cadherin (cell-cell) interaction in vascular cells, functional nanostructured surfaces were used. Ligand-functionalized gold nanoparticles (AuNPs) with 6-8 nm diameter are precisely immobilized on a surface and separated by non-adhesive regions, so that individual integrins or cadherins can specifically interact with the ligands on the AuNPs. Using 40 nm and 90 nm distances between the AuNPs, functionalized either with peptide motifs of the extracellular matrix (RGD or REDV) or with vascular endothelial cadherin (VEC), the influence of distance and ligand specificity on the spreading and adhesion of endothelial cells (ECs) and smooth muscle cells (SMCs) was investigated. We demonstrate that RGD-dependent adhesion of vascular cells is similar to that of other cell types, and that the distance dependence for integrin binding to ECM peptides is also valid for the REDV motif. VEC ligands decrease adhesion significantly at the tested ligand distances. These results may be helpful for future improvements in vascular tissue engineering and for the development of implant surfaces.
This work presents a fully integrated GaN gate driver in a 180nm HV BCD technology that utilizes high-voltage energy storing (HVES) in an on-chip resonant LC tank, without the need for any external capacitor. It delivers up to 11nC gate charge at a 5V GaN gate, which exceeds prior art by a factor of 45-83, supporting a broad range of GaN transistor types. The stacked LC tank covers an area of only 1.44mm², which corresponds to a superior value of 7.6nC/mm².
In recent years, significant progress was made on switched-capacitor DC-DC converters as they enable fully integrated on-chip power management. New converter topologies overcame the fixed input-to-output voltage limitation and achieved high efficiency at high power densities. SC converters are attractive not only for mobile handheld devices with small input and output voltages, but also for power conversion in IoT, industrial and automotive applications, etc. Such applications need to be capable of handling high input voltages of more than 10V. This talk highlights the challenges of the required supporting circuits and high-voltage techniques which arise for high-Vin SC converters. It includes level shifters, charge pumps and back-to-back switches. High-Vin conversion is demonstrated in a 4:1 SC DC-DC converter with an input voltage as high as 17V with a peak efficiency of 45%, and a buck-boost SC converter with an input voltage range from 2V up to 13V, which utilizes a total of 17 ratios and achieves a peak efficiency of 81.5%. Furthermore, a highly integrated micro power supply approach is introduced, which is connected directly to the 120/230 Vrms mains, with an output power of 3mW, resulting in a power density >390μW/mm², which exceeds prior art by a factor of 11.
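The benefit of having many conversion ratios can be seen from the well-known intrinsic efficiency bound of an ideal SC converter: for an N:1 ratio the unloaded output is Vin/N, and regulating below that point costs efficiency like a linear regulator, so eta ≤ Vout/(Vin/N). The sketch below illustrates this bound under idealised assumptions (no switching or conduction losses); the function name is illustrative:

```python
def sc_converter_ideal_efficiency(vin, vout, ratios):
    """Upper bound on ideal SC converter efficiency.

    For each available conversion ratio N:1, the unloaded output is
    vin/N; regulating down to vout limits efficiency to vout/(vin/N).
    Returns the bound for the best usable ratio at this operating point.
    """
    best = 0.0
    for n in ratios:
        v_noload = vin / n
        if v_noload >= vout:             # this ratio can reach the target
            best = max(best, vout / v_noload)
    return best
```

For example, with vin=17V, vout=4V and ratios 1:1 through 4:1, the 4:1 ratio gives the tightest fit (4.25V unloaded), illustrating why a multi-ratio converter can stay efficient over a wide input range.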
Managing decentralized corporate energy systems is a challenging task for enterprises. However, the integration of energy objectives into business strategy creates difficulties resulting in inefficient decisions. To improve this, practice-proven methods such as the balanced scorecard and enterprise architecture management are transferred to the energy domain. The methods are evaluated based on a case study. Managing multi-dimensionality and high complexity are the main drivers for an effective and efficient energy management system. Both methods show a positive impact on managing decentralized corporate energy systems and are adaptable to the energy domain.
Multilevel-cell (MLC) flash is commonly deployed in today’s high density NAND memories, but low latency and high reliability requirements make it barely used in automotive embedded flash applications. This paper presents a time domain voltage sensing scheme that applies a dynamic voltage ramp at the cells’ control gate (CG) in order to achieve fast and reliable sensing suitable for automotive applications.
This publication gives a short introduction to and overview of the European project SCOUT, and introduces a methodology for a holistic approach to recording the state of the art in technical enablers (vehicle and connectivity, human factors on the physiologic and ergonomic level) and non-technical enablers (societal, economic, legal, regulatory and policy level) of connected and automated driving in Europe. Besides the technical topics of environmental perception, E/E architecture, actuators and security, the paper addresses the state of the art of the legal framework in the context of connected and automated driving.
The European Economic and Monetary Union (EMU) has been in turmoil for more than six years. The present governance rules do not seem to solve the problems either permanently or effectively. There is no vision for the future of Europe in the 21st century. This article describes a realignment of economic governance which does not necessarily lead to a transfer union or political union, yet solves the current and future challenges. In fact, the redesign of the present rules is the most likely, as well as legally and economically feasible, option today. The key idea is the detachment from the compulsive notion of an ever closer union. However, this vision requires boldness towards greater flexibility, together with an exit clause or a state insolvency procedure for non-compliant member states.
This paper models the political budget cycle with stochastic differential equations. The paper highlights the development of future volatility of the budget cycle. In fact, I confirm the proposition of a less volatile budget cycle in future. Moreover, I show that this trend is even amplified due to higher transparency. These findings are new evidence in the literature on electoral cycles. I calibrate a rigorous stochastic model on public deficit-to-GDP data for several countries from 1970 to 2012.
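A political budget cycle modelled with stochastic differential equations can be sketched with a generic mean-reverting process whose volatility declines over time, matching the paper's finding of a less volatile cycle in future. The equation, parameter values, and function name below are illustrative assumptions, not the paper's calibrated model:

```python
import numpy as np

def simulate_deficit(x0=3.0, mu=2.0, theta=0.5, sigma0=1.0, decay=0.05,
                     years=40, steps_per_year=12, seed=0):
    """Euler-Maruyama simulation of a mean-reverting deficit-to-GDP
    process with decaying volatility (illustrative sketch):
        dX = theta * (mu - X) dt + sigma0 * exp(-decay * t) dW
    """
    rng = np.random.default_rng(seed)
    dt = 1.0 / steps_per_year
    n = years * steps_per_year
    x = np.empty(n + 1)
    x[0] = x0
    for i in range(n):
        t = i * dt
        sigma = sigma0 * np.exp(-decay * t)     # declining volatility
        dw = rng.normal(0.0, np.sqrt(dt))       # Brownian increment
        x[i + 1] = x[i] + theta * (mu - x[i]) * dt + sigma * dw
    return x
```

Calibration against deficit-to-GDP data, as done in the paper, would amount to estimating theta, mu, and the volatility trend from the 1970-2012 country series.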
The business landscape is changing radically because of software. Companies in all industry sectors are continuously finding new flexibilities in this programmable world. They are able to deliver new functionalities even after the product is already in the customer's hands. But success is far from guaranteed if they cannot validate their assumptions about what their customers actually need. A competitor with better knowledge of customer needs can disrupt the market in an instant.
This book introduces continuous experimentation, an approach to continuously and systematically test assumptions about the company's product or service strategy and verify customers' needs through experiments. By observing how customers actually use the product or early versions of it, companies can make better development decisions and avoid potentially expensive and wasteful activities. The book explains the cycle of continuous experimentation, demonstrates its use through industry cases, provides advice on how to conduct experiments with recipes, tools, and models, and lists some common pitfalls to avoid. Use it to get started with continuous experimentation and make better product and service development decisions that are in line with your customers' needs.
Due to rapidly changing technologies and business contexts, many products and services are developed under high uncertainties. It is often impossible to predict customer behaviors and outcomes upfront. Therefore, product and service developers must continuously find out what customers want, requiring a more experimental mode of management and appropriate support for continuously conducting experiments. We have analytically derived an initial model for continuous experimentation from prior work and matched it against empirical case study findings from two startup companies. We examined the preconditions for setting up an experimentation system for continuous customer experiments. The resulting RIGHT model for Continuous Experimentation (Rapid Iterative value creation Gained through High-frequency Testing) illustrates the building blocks required for such a system and the necessary infrastructure. The major findings are that a suitable experimentation system requires the ability to design, manage, and conduct experiments, create so-called minimum viable products or features, link experiment results with a product roadmap, and manage a flexible business strategy. The main challenges are proper, rapid design of experiments, advanced instrumentation of software to collect, analyse, and store relevant data, and integration of experiment results in the product development cycle, software development process, and business strategy. This summary refers to the article The RIGHT Model for Continuous Experimentation, published in the Journal of Systems and Software [Fa17].
The ability to develop and deploy high-quality software at a high speed is increasingly relevant for the competitiveness of car manufacturers. Agile practices have shown benefits such as faster time to market in several application domains. Therefore, it seems promising to carefully adopt agile practices in the automotive domain as well. This article presents findings from an interview-based qualitative survey. It aims at understanding perceived forces that support agile adoption. Particularly, it focuses on embedded software development for electronic control units in the automotive domain.
Within the scope of the present cumulative doctoral thesis, six scientific papers were published which illustrate that modern reaction-model-free (i.e. isoconversional) kinetic analysis (ICKA) methods represent a universal and effective tool for the controlled processing of thermosetting materials. In order to demonstrate the universal applicability of ICKA methods, the thermal cure of different thermosetting materials covering a very broad range of chemical compositions (melamine-formaldehyde resins, epoxy resins, polyester-epoxy resins, and acrylate/epoxy resins) was analyzed and mathematically modelled. Some of the materials were based on renewable resources (an epoxy resin was made from hempseed oil; linseed oil was modified into an acrylate/epoxy resin). With the aid of ICKA methods, not only single-step but also complex multi-step reactions were modelled precisely. The analyzed thermosetting materials were combined with wood, wood-based products, paper, and plant fibers, which were processed into various final products. Some of the thermosetting materials were applied as coatings (in the form of impregnated décor papers, or powder and wet coatings, respectively) on wood substrates, and the epoxy resin from hempseed oil was mixed with plant fibers and processed into bio-based composites for lightweight applications. Mechanical, thermal, and surface properties of the final products were determined. The activation energy as a function of cure conversion derived from ICKA methods was utilized to accurately predict the thermal curing over the course of time for arbitrary cure conditions. Furthermore, the cure models were used to establish correlations between the cross-linking during processing and the properties of the final products. It was thereby possible to derive the process time and temperature that guarantee optimal cross-linking as well as optimal product properties.
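The isoconversional idea can be illustrated with the Friedman method: at a fixed conversion level, ln(dα/dt) is linear in 1/T with slope -Eα/R, so regressing the measured rates from runs at different heating rates yields the activation energy at that conversion. The sketch below is a minimal single-conversion-level illustration, not the thesis's full modelling pipeline:

```python
import numpy as np

R = 8.314  # universal gas constant, J/(mol*K)

def friedman_activation_energy(rates, temps):
    """Friedman isoconversional analysis at one fixed conversion level.

    rates: reaction rates d(alpha)/dt measured at the same conversion
           in runs with different heating rates
    temps: corresponding absolute temperatures (K)
    Fits ln(d(alpha)/dt) = const - E/(R*T) and returns E in J/mol.
    """
    x = 1.0 / np.asarray(temps, dtype=float)
    y = np.log(np.asarray(rates, dtype=float))
    slope, _ = np.polyfit(x, y, 1)          # linear fit in 1/T
    return -slope * R
```

Repeating this fit over a grid of conversion levels produces the activation energy as a function of conversion, which is the quantity used above to predict curing under arbitrary conditions.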
To assess the quality of a person's sleep, it is essential to examine sleep behaviour by identifying the several sleep stages, their durations, and the sleep cycles. The established gold-standard procedure for sleep stage scoring is overnight polysomnography (PSG) with the Rechtschaffen and Kales (R&K) method. Unfortunately, conducting a PSG is time-consuming and unfamiliar to the subjects, which might have an impact on the recorded data. To avoid the disadvantages of PSG, it is important to investigate low-cost home diagnostic systems. For this purpose it is necessary to find suitable vital parameters for classifying sleep stages without causing any physical impairment. Due to the promising results in several publications, we analyse existing methods for sleep stage classification based on the parameters body movement, heartbeat, and respiration. Our aim was to find different behaviour patterns in the several sleep stages. Therefore, the average values of 15 whole-night PSG recordings, obtained from the 'DREAMS Subjects Database', were analysed with respect to heartbeat, body movement, and respiration using 10 different methods.
To evaluate the quality of sleep, it is important to determine how much time was spent in each sleep stage during the night. The gold standard in this domain is an overnight polysomnography (PSG). However, the recording of the necessary electrophysiological signals is extensive and complex, and the environment of the sleep laboratory, which is unfamiliar to the patient, might lead to distorted results. In this paper, a sleep stage detection algorithm is proposed that uses only the heart rate signal, derived from the electrocardiogram (ECG), as a discriminator. This would make it possible for sleep analysis to be performed at home, saving a lot of effort and money. From the heart rate, three parameters were calculated using the fast Fourier transform (FFT) in order to distinguish between the different sleep stages. ECG data, along with a hypnogram scored by professionals, were taken from the PhysioNet database, making the results easy to compare. With an agreement rate of 41.3%, this approach is a good foundation for future research.
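The FFT-based parameter extraction described above can be sketched as follows. This is an illustrative re-creation, not the authors' exact algorithm: the frequency-band limits, the sampling rate, and the synthetic input signal are all assumptions chosen only to demonstrate the general idea of deriving spectral features from a heart rate series.

```python
import numpy as np

def spectral_features(rr_series, fs=4.0):
    """Illustrative spectral features from an evenly resampled
    RR-interval series (milliseconds). Returns power in the LF
    (0.04-0.15 Hz) and HF (0.15-0.40 Hz) bands plus their ratio,
    features commonly used to discriminate sleep stages."""
    x = np.asarray(rr_series, dtype=float)
    x = x - x.mean()                              # remove DC component
    power = np.abs(np.fft.rfft(x)) ** 2           # one-sided power spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    lf = power[(freqs >= 0.04) & (freqs < 0.15)].sum()
    hf = power[(freqs >= 0.15) & (freqs < 0.40)].sum()
    return lf, hf, lf / hf if hf > 0 else float("inf")

# Synthetic series with a dominant 0.25 Hz (respiration-like) component
t = np.arange(0, 60, 1 / 4.0)
rr = 800 + 50 * np.sin(2 * np.pi * 0.25 * t)
lf, hf, ratio = spectral_features(rr)
```

Because the synthetic oscillation sits in the HF band, the HF power dominates and the LF/HF ratio is small; on real data these band powers would shift between sleep stages.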
By 2030, 46 percent of jobs in the automotive industry will be threatened by automation and digitalization: these tasks will then no longer be performed by humans but by intelligent robots and systems. This is the central finding of our study "Digitale Transformation – Der Einfluss der Digitalisierung auf die Workforce in der Automobilindustrie", which we conducted together with the Herman Hollerith Lehr- und Forschungszentrum at Reutlingen University.
IT platforms as the foundation of digitized processes and products are vital in a digital economy. However, many companies’ platforms are liabilities, not strategic assets because of their complexity. Consequently, companies initiate IT complexity reduction programs. But these technology-centric programs at best provide temporary relief. Soon after, companies’ platforms become just as complex as before. Based on four case studies, we identify three non-technical drivers of platform complexity: (1) Lacking awareness of consequences business decisions have on platform complexity, (2) Lacking motivation to avoid platform complexity, (3) Lacking authority to protect platforms from complexity. We propose measures to address these drivers that can help achieve more sustainable impact on platform complexity: (1) Removing information asymmetries between those creating complexity and those dealing with complexity, (2) Redefining incentives to include long-term effects on platform complexity, (3) Redressing power imbalances between those who create complexity and those who have to manage it.
In the present paper we demonstrate a novel approach to handling small updates on Flash called In-Place Appends (IPA). It allows the DBMS to revisit the traditional write behavior on Flash. Instead of writing whole database pages upon an update in an out-of-place manner on Flash, we transform those small updates into update deltas and append them to a reserved area on the very same physical Flash page. In doing so we utilize the commonly ignored fact that under certain conditions Flash memories can support in-place updates to Flash pages without a preceding erase operation.
The approach was implemented under Shore-MT and evaluated on real hardware. Under standard update-intensive workloads we observed 67% fewer page invalidations, resulting in 80% lower garbage collection overhead, which yields a 45% increase in transactional throughput while doubling Flash longevity at the same time. IPA outperforms In-Page Logging (IPL) by more than 50%.
We showcase a Shore-MT-based prototype of the above approach, operating on real Flash hardware, the OpenSSD Flash research platform. During the demonstration we allow users to interact with the system and gain hands-on experience of its performance under different demonstration scenarios, involving various workloads such as TPC-B, TPC-C, or TATP.
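The in-place append behavior can be sketched with a simplified model. This is illustrative Python only, not the actual Shore-MT/OpenSSD implementation; the class, its names, and the size of the reserved area are assumptions that capture just the core idea: small updates become deltas appended to the same page, and a full out-of-place rewrite happens only when the reserved area overflows.

```python
class FlashPage:
    """Simplified model of In-Place Appends (IPA), illustrative only.

    A page holds its base content plus a small reserved append area.
    Small updates become in-page deltas; a whole-page rewrite (which
    invalidates the page on real Flash) occurs only on overflow."""

    def __init__(self, data, reserved_slots=4):
        self.data = dict(data)        # base page content
        self.deltas = []              # reserved in-page append area
        self.reserved_slots = reserved_slots
        self.out_of_place_writes = 0  # page invalidations on real Flash

    def update(self, key, value):
        if len(self.deltas) < self.reserved_slots:
            self.deltas.append((key, value))   # in-place append, no erase
        else:
            self._apply_deltas()
            self.data[key] = value
            self.out_of_place_writes += 1      # whole page rewritten

    def _apply_deltas(self):
        for k, v in self.deltas:
            self.data[k] = v
        self.deltas = []

    def read(self, key):
        for k, v in reversed(self.deltas):     # newest delta wins
            if k == key:
                return v
        return self.data[key]

page = FlashPage({"a": 1, "b": 2})
for i in range(4):
    page.update("a", i)    # four small updates absorbed in place
```

After these four updates the page has not been invalidated once; a fifth update would trigger the single deferred rewrite, which is the source of the reduced garbage collection overhead reported above.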
The European Economic and Monetary Union (EMU) requires further stabilization, since its institutional rules do not exert sufficient binding force on the member states in the long run. The challenge is to regain the credibility of the regulatory framework that was lost in the course of the European sovereign debt crisis from 2010 onwards. To preserve the monetary union, the "no bailout" clause of Art. 125 TFEU must be credibly applicable in primary law, and the rules of secondary law (including the Stability and Growth Pact, the Fiscal Compact, and the European Semester) must be enforced in a legally binding manner, more independently and more quickly. The "sovereign insolvency mechanism" proposed here, cleverly fitted into the European framework and combined with a legally binding "exit clause" as ultima ratio, would be one possible solution. A failure of the EMU can be averted, but the lack of will to reform could pave the way for the disintegration of the monetary union.
The research project investigated the possibilities and limits of using sol-gel finishes to improve the abrasion resistance of woven fabrics made from different fiber materials. The focus was on textiles for clothing and workwear as well as upholstery fabrics (furniture, automotive, passenger transport).
Software engineering education is under constant pressure to provide students with industry-relevant knowledge and skills. Educators must address issues beyond exercises and theories that can be directly rehearsed in small settings. Industry training has similar requirements of relevance, as companies seek to keep their workforce up to date with technological advances. Real-life software development often deals with large, software-intensive systems and is influenced by the complex effects of teamwork and distributed software development, which are hard to demonstrate in an educational environment. A way to experience such effects and to increase the relevance of software engineering education is to apply empirical studies in teaching. In this paper, we show how different types of empirical studies can be used for educational purposes in software engineering. We give examples illustrating how to utilize empirical studies, discuss challenges, and derive an initial guideline that supports teachers in including empirical studies in software engineering courses. Furthermore, we give examples that show how empirical studies contribute to high-quality learning outcomes, to student motivation, and to the awareness of the advantages of applying software engineering principles. With awareness, experience, and an understanding of the actions required, students are more likely to apply such principles under real-life constraints in their working life.
Curriculum design for the German language class in the double-degree programme business engineering
(2017)
This paper aims to give an overview of how German is taught as a foreign language to students enrolled in the Bachelor of Business Engineering, a double-degree programme offered at Universiti Malaysia Pahang. The double-degree students have the opportunity to complete their first two years of study in Malaysia and their last two years in Germany. Taking the TestDaF examination is compulsory for double-degree students. Hence, the German language curriculum has been meticulously planned to ensure that the students become competent in the language. The settings of the language class are therefore discussed thoroughly in this paper. Additionally, it discusses the challenges faced in teaching German as a foreign language. The paper ends with some suggestions for improvement.
Decreasing batch sizes in production in line with Industrie 4.0 will lead to tremendous changes in the control of logistic processes in future production systems. Intelligent bins are crucial enablers for establishing decentrally controlled material flow systems in value chain networks as well as at the intralogistics level. These intelligent bins have to be integrated into an overall decentralized monitoring and control approach and have to interact with humans and other entities just like other cyber-physical systems (CPS) within the cyber-physical production system (CPPS). To realize a decentralized material supply following the overall aim of decentralized control of all production and logistics processes, an intelligent bin system is currently being developed at the ESB Logistics Learning Factory. This intelligent bin system will be integrated into the self-developed, cloud-based, and event-oriented SES (the so-called "Self Execution System"), which goes beyond the common functionalities and capabilities of traditional manufacturing execution systems (MES).
To ensure a holistic integration of the intelligent bin for different material types into the SES framework, the required hardware and software components of the decentrally controlled bin system will be split into a common and an adaptable part. The common part comprises the localization and network layer, which is identical for every bin, whereas the adaptable part can be customized to different requirements, such as the specific characteristics of the parts.
Close and safe interaction of humans and robots in joint production environments is technically feasible; however, it should not be implemented as an end in itself but to deliver improvement in one of a production system's target dimensions. Firstly, this paper shows that an essential challenge for system integrators during the design of HRC applications is to identify a suitable distribution of the available tasks between a robotic and a human resource. Secondly, it proposes an approach to determine task allocation by considering the actual capabilities of both human and robot in order to improve work quality. It matches those capabilities with the given requirements of a certain task in order to identify the maximum congruence as the basis for the allocation decision. The approach is based on a study and subsequent generic description of human and robotic capabilities as well as a heuristic procedure that facilitates the decision-making process.
Technologies for mapping the "digital twin" have been under development for approximately 20 years. Nowadays, increasingly intelligent, individualized products encourage companies to respond innovatively to customer requirements and to handle the rising number of product variants quickly.
An integrated engineering network spanning the entire value chain is operated to intelligently connect various company divisions and to generate a business ecosystem for products, services, and communities. This establishes the conditions for the digital twin, in which the digital world can be fed into the real world, and the real world back into the digital, so that such intelligent products with their rising number of variants can be handled.
The term digital twin describes a digital copy of a real factory, machine, worker, etc., which is created and can be independently expanded, is automatically updated, and is globally available in real time. Every real product and production site is permanently accompanied by a digital twin. First prototypes of such digital twins already exist in the ESB Logistics Learning Factory, based on cloud- and app-based software that builds on a dynamic, multidimensional data and information model. A standardized language for the robot control systems, via software agents and positioning systems, still has to be integrated. The continuity of the real factory in the digital factory, as an economical means of ensuring that digital models remain continuously up to date, is regarded as the basis of changeability.
For indoor localization, sensor combinations should be used that, in addition to the hardware, already contain the software required for sensor data fusion. Combining processing systems, live scenario simulations, and digital shop floor management is procedurally mandatory. Essential to the digital twin is the ability to consistently provide all subsystems with the latest state of all required information, methods, and algorithms.
Together with partners from industry and politics, the ESB Business School of Reutlingen University, Offenburg University of Applied Sciences, and the Fachhochschule Nordwestschweiz (FHNW) are investigating, within an Interreg project, ways to reduce climate- and health-damaging emissions from cross-border commuter traffic in the Hochrhein region. Electromobility and carpooling are being promoted in a pilot project and their effects analyzed. First results show that today's electric cars are suitable for cross-border commuting under certain conditions.
In this paper we build on our research on data management on native Flash storage. In particular, we demonstrate the advantages of intelligent data placement strategies. To effectively manage physical Flash space and organize the data on it, we utilize novel storage structures such as regions and groups. These are coupled to common DBMS logical structures and thus require no extra overhead for the DBA. The experimental results indicate an improvement of up to 2x, which doubles the longevity of the Flash SSD. During the demonstration, the audience can experience the advantages of the proposed approach on real Flash hardware.
Real estate markets are known to fluctuate. The real estate market in Stuttgart, Germany, has been booming for more than a decade: square-meter prices have hit record levels and real estate agents claim that market prices will continue to increase. In this paper, we test this market understanding by developing and analyzing a system dynamics model that depicts the Stuttgart real estate market. Simulating the model explains oscillating behavior arising from significant time delays and endogenous feedback structures, and not necessarily from oscillating interest rates, as market experts assume. Scenarios provide insights into the system's behavior in reaction to changes exogenous to the model. The first scenario tests the market development under increasing interest rates. The other scenario deals with possible effects on the real estate market if the regional automotive economy suffers from intense competition with new market players entering with alternative fuel vehicles and new technologies. With a policy run we test changes to the market structure intended to eliminate cyclical effects. The paper confirms that the business cycle in the Stuttgart real estate market arises from within the system's underlying structure, thus emphasizing the importance of understanding feedback structures.
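The endogenous oscillation mechanism described above can be illustrated with a minimal stock-and-flow sketch. All parameters, variable names, and functional forms here are invented for illustration; this is not the authors' calibrated Stuttgart model. It only demonstrates the qualitative point: a construction delay between price-driven building starts and completions is, by itself, enough to produce damped price cycles.

```python
# Minimal system dynamics sketch: a construction delay between
# price-driven building starts and completions produces price cycles.
# All parameters are illustrative, not calibrated to any real market.

def simulate(delay_years=3.0, years=60, dt=0.25):
    demand = 100.0
    stock = 95.0      # housing stock starts below demand
    pipeline = 0.0    # units under construction
    prices = []
    for _ in range(int(years / dt)):
        price = demand / stock                          # scarcity drives price
        starts = max(50.0 * (price - 1.0) + 2.0, 0.0)   # developers react to price
        completions = pipeline / delay_years            # delayed supply response
        pipeline += (starts - completions) * dt
        stock += (completions - 2.0) * dt               # 2.0 = demolition rate
        prices.append(price)
    return prices

prices = simulate()
```

With these toy parameters the price overshoots its equilibrium of 1.0 and then cycles with decaying amplitude: the oscillation comes from the delay and feedback structure alone, with no exogenous driver such as interest rates.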
Strategic alliances have become important strategic options for firms to achieve competitive advantage. Yet, there are many examples of alliance failures. Scholars have studied this phenomenon and identified many reasons for alliance failure, including lack of trust between the partnering firms. Paradoxically, the concept of trust is still not fully understood, specifically how and under what conditions trust comes to break down within the broader process of alliance building. We synthesize a process model that describes the "alliance capability", including trust, openness, partner contributions, and relational rents. We then translate this framework into a formal simulation model and analyze it thoroughly. In analyzing trust dynamics we identify and explore a tipping boundary separating a regime of alliance failures from one of successes. We apply our core findings to openness strategies, i.e., decisions about how much knowledge to share with partners. Our analyses reveal that strategies informed by a static mental model of trust, contributions, and openness undervalue openness. Further, too little openness risks early failure due to being trapped in a vicious cycle of trust depletion.
Marketing with YouTube
(2017)
The video platform YouTube is one of the most visited websites worldwide. YouTube stars reach large audiences with their videos and can serve as multipliers for advertising messages. They are often opinion leaders who enjoy a high level of trust, which makes them very well suited for influencer marketing. Marketing with YouTube enables open, multi-directional, fast, and cost-effective communication with customers, especially younger ones. Companies can take advantage of this by using YouTube not only as a platform for classic commercials but also for product placements, video sponsoring, or branded entertainment in cooperation with YouTubers.
This thesis first introduces the video platform YouTube and outlines the basic idea of influencer marketing. From this, various options for cooperating with YouTubers for marketing purposes are derived, described, and illustrated with numerous examples. The thesis concludes with a critical appraisal of the results and recommendations for using YouTube as a marketing instrument with regard to influencer marketing.
Based on identity-oriented brand management, this article uses the six components of brand identity to examine the role of Thomas Müller for the FC Bayern München brand. The conclusion: as an identification figure, Thomas Müller is of priceless value to the FC Bayern brand.
Clinical reading centers provide expertise for consistent, centralized analysis of medical data gathered in a distributed context. Accordingly, appropriate software solutions are required for the involved communication and data management processes. In this work, an analysis of general requirements and essential architectural and software design considerations for reading center information systems is provided. The identified patterns have been applied to the implementation of the reading center platform which is currently operated at the Center of Ophthalmology of the University Hospital of Tübingen.
The wet chemical deposition of solution-processed transparent conducting oxides (TCOs) provides an alternative, low-cost, and economical deposition technique to realize large areas of conducting films. Since the price of the most common TCO, indium tin oxide, has risen enormously, aluminum zinc oxide (AZO) is attracting more and more interest as an alternative TCO. The optoelectronic properties of nanoparticle coatings strongly depend not only on the porosity of the coating but also on the shape and size of the particles used. By using bigger or rod-shaped particles it is possible to minimize the number of grain boundaries, resulting in an improvement of the electrical properties, whereas particles bigger than 100 nm should not be used if highly transparent coatings are required, as such big particles scatter visible light and lower the transmittance of the coatings. In this work we present a simple method to synthesize AZO particles of different shapes and sizes but with comparable electronic properties. We use a simple, well-reproducible polyol method for synthesis and influence the shape and size of the particles by adding different amounts of water to the precursor solution. We can show that the addition of aluminum as a dopant strongly hinders crystal growth, but the addition of water counteracts this, so that both spherical and rod-shaped particles can be obtained.
Digitization will require companies to fundamentally reengineer their sales processes. Adapting the concept of value selling to the digital age will enable them to deliver superior value to their customers. Specifically, social selling will provide them with an answer to the ever-increasing complexity of customer journeys. This article, based on a survey among 235 German companies, assesses the status quo and outlines opportunities. Moreover, it introduces a novel approach for developing well-grounded social selling metrics.
In an exploratory study of the online communication of large and medium-sized B2B companies from the German state of Baden-Württemberg, the message content communicated via their websites and the websites' appeal to international prospects were analyzed. The study revealed that many basic content items were absent, making the sites less attractive for further exploration and making it difficult for international prospects to enter into a dialog, become leads, and eventually customers. A subsequent survey elicited organizational backgrounds, available resources, and objectives for online communication. The deficiencies could be traced back to a lack of understanding of the importance of digital communication for lead generation and the customer journey in general, the absence of a communication strategy, a lack of urgency, and a lack of resources to implement the desired changes and additions to communication content.
Electronic word-of-mouth (eWoM) communication plays an increasingly important role in modern business. The underlying concept of word-of-mouth (WoM) communication is well researched and has proved highly significant with respect to its impact on customers' purchase behavior. However, due to the advent of digital technologies, decision-making among customers is progressively shifting to the online world. Consequently, eWoM has received a lot of attention from the academic community. As multiple research papers focus on specific facets of eWoM, there is a need to integrate current research results systematically. Thus, this paper presents a scientific literature analysis in order to determine the current state of the art in the field of eWoM. Five main research areas were analyzed, supporting the need for further eWoM studies and providing a structured overview of existing results.
This paper investigates the impact of dynamic capabilities (DC) on brand love. From a resource-based view, there is little clarity vis-à-vis the specific capabilities that drive the ability to create brand love. This paper focuses on three research questions: Firstly, which dynamic capabilities are relevant for brand love? Secondly, how strong is the impact of certain dynamic capabilities on brand love? Thirdly, which conditions mediate and moderate the impact of specific dynamic capabilities on brand love? Data from a multi-method research approach were used to identify the specific capabilities that corporations need to enhance brand love. Furthermore, a standardized online survey was conducted among marketing executives and evaluated by structural equation modeling. The results indicate that customer expertise plays a major role in the relationship between dynamic capabilities and brand love. Furthermore, this relationship is more important in markets that have a low competitive differentiation in products and services.
Pokémon Go was the first mobile Augmented Reality (AR) game to reach the top of the mobile application download charts. However, very little is known about this new generation of mobile online AR games. Existing media usage and technology acceptance theories provide limited applicability to the understanding of its users. Against this background, this research provides a comprehensive framework that incorporates findings from uses & gratifications theory (U&G), technology acceptance and risk research, as well as flow theory. The proposed framework aims at explaining the drivers of attitudinal and intentional reactions, such as continuance in gaming or willingness to conduct in-app purchases. A survey among 642 Pokémon Go players provides insights into the psychological drivers of mobile AR games. Results show that hedonic, emotional, and social benefits as well as social norms drive consumer reactions, whereas physical risks (but not privacy risks) hinder them. However, the importance of these drivers differs between different forms of user behavior.
The increasing number of connected mobile devices such as fitness trackers and smartphones generates new data for health insurers, enabling them to gain deeper insights into the health of their customers. These additional data sources, plus the trend towards an interconnected health community including doctors, hospitals, and insurers, lead to challenges regarding data filtering, organization, and dissemination. First, we analyze what kind of information is relevant for a digital health insurance. Second, we define functional and non-functional requirements for storing and managing health data in an interconnected environment. Third, we propose a data architecture for a digitized health insurance, consisting of a data model and an application architecture.
Digitization in the energy sector is a necessity to unlock energy savings and energy efficiency potentials. Managing decentralized corporate energy systems is hindered by the lack of suitable management methods. The required integration of energy objectives into business strategy creates difficulties, resulting in inefficient decisions. To improve this, practice-proven methods such as the Balanced Scorecard, Enterprise Architecture Management, and the Value Network approach are transferred to the energy domain. The methods are evaluated based on a case study. Managing multi-dimensionality, high complexity, and multiple actors are the main drivers for an effective and efficient energy management system. The underlying basis for gaining the positive impacts of these methods on decentralized corporate energy systems is the digitization of energy data and processes.
Smart meter based business models for the electricity sector: a systematical literature research
(2017)
The Act on the Digitization of the Energy Transition forces German industries and households to introduce smart meters in order to save energy, to obtain individualized electricity tariffs, and to digitize the energy data flow. Smart meters can be regarded as the advancement of the traditional meter. Utilizing this new technology enables a wide range of innovative business models that provide additional value for electricity suppliers as well as for their customers. In this study, we followed a two-step approach. First, we provide a state-of-the-art comparison of the business models found in the literature and identify structural differences in the way they add value to the offered products and services. Second, the business models are grouped into categories with respect to customer segments and the added value to the smart grid. Findings indicate that most business models focus on the end customer as their main customer.
In recent times, enterprises have increasingly been dealing with the use of social media in internal communication and collaboration. In particular, so-called Enterprise Social Networks (ESN) promise meaningful benefits for the nature of work in corporations. However, these platforms often suffer from poor degrees of use. This raises the question of what initiatives enterprises can launch in order to stimulate the vitality of ESN. Since the use of ESN is often voluntary, individual adoption by employees needs to be examined to find an answer. Therefore, the Unified Theory of Acceptance and Use of Technology (UTAUT) model was selected as the theoretical foundation of this paper. Following a qualitative research approach, we provide an analysis of expert interviews on specific ESN implementation strategies and the factors they include. In order to extensively conceptualize and generalize these strategic considerations, we conducted an inductive coding process. The results reveal that ESN implementation strategies can be understood as a multi-level construct (individual vs. group vs. organizational level) containing different factors depending on the degree of documentation and intensity. This research in progress describes a qualitative evaluation as a preliminary study for a further quantitative analysis of an ESN adoption model.
The steadily growing volume of research material in a variety of databases, repositories, and clouds makes academic content harder than ever to discover. Finding adequate material for one's own research, however, is essential for every researcher. Based on recent developments in the field of artificial intelligence and the identified digital capabilities of future universities, a change in the basic work of academic research is predicted. This study outlines how artificial intelligence could simplify academic research at a digital university. Today's studies in the field of AI showcase its true potential and its far-reaching impact on academic research.
The digital transformation of our society changes the way we live, work, learn, communicate, and collaborate. This disruptive change drives current and future information processes and systems, which have been important business enablers in the context of digitization for years. Our aim is to support flexibility and agile transformations for both business domains and the related information technology with more flexible enterprise information systems through the adaptation and evolution of digital architectures. The present research paper investigates the continuous bottom-up integration of micro-granular architectures for a huge number of dynamically growing systems and services, such as microservices and the Internet of Things, as part of a newly composed digital architecture. To integrate micro-granular architecture models into living architectural model versions, we extend enterprise architecture reference models with state-of-the-art elements for agile architectural engineering to support digital products, services, and processes.
With the Internet of Things being one of the most discussed trends in the computing world lately, many organizations find themselves struggling with this great paradigm shift and thus with the implementation of IoT on a strategic level. The Ignite methodology, as part of the Enterprise-IoT project, promises to support organizations with these strategic issues, as it combines best practices with expert knowledge from diverse industries, helping to create a better understanding of how to transform into an IoT-driven business. A framework introduced in the context of IoT business model development is the Bosch IoT Business Model Builder. In this study, this framework is compared to the Osterwalder Business Model Canvas and the St. Gallen Business Model Navigator, the most commonly used and referenced frameworks according to a quantitative literature analysis.
This thesis provides a concept design that ensures the integration of various systems with process-relevant clinical services. Surgical procedures are modelled as processes. The choice of notation and the way these processes are modelled play a central role in current research in this field. Once modelled, these processes can be executed automatically in a workflow engine. In the development of a workflow management system, the question arises of how this workflow engine should be connected to other systems. In this thesis, interfaces are defined abstractly in the Web Services Description Language (WSDL), from which artifacts are generated automatically. The integration of the systems is carried out on the basis of these artifacts. The workflow engine communicates with the respective systems via SOAP (Simple Object Access Protocol) messages. This approach was implemented and validated with a prototype.
This paper contributes to the automatic detection of perioperative workflow by developing a binary endoscope localization. Automated situation recognition in the context of an intelligent operating room requires the automatic conversion of low-level cues into more abstract high-level information. Imagery from a laparoscope delivers rich content that is easy to obtain but hard to process. We introduce a system which detects, based on the endoscope video, whether the endoscope's distal tip is inside or outside the patient. This information can be used as one parameter in a situation recognition pipeline. Our localization performs in real time at a video resolution of 1280x720, and 5-fold cross-validation yields mean F1-scores of up to 0.94 on videos of 7 laparoscopies.
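To illustrate the idea of turning a low-level image cue into a binary in/out decision, here is a deliberately naive sketch. It is not the paper's method: it uses mean frame brightness as a stand-in feature (frames inside the patient often show a dark border around a confined bright center), and the threshold value is an arbitrary assumption for illustration only.

```python
def endoscope_inside(frame, brightness_threshold=80):
    """Toy in/out classifier on a single grayscale frame.

    `frame` is a 2-D list of 0-255 gray values. We classify a frame as
    "inside the patient" if its mean brightness falls below a threshold,
    a purely illustrative proxy for the low-level cues a real
    situation-recognition pipeline would extract.
    """
    pixels = [p for row in frame for p in row]
    mean = sum(pixels) / len(pixels)
    return mean < brightness_threshold
```

A real system would replace this single scalar cue with learned features and validate the classifier per video, as the cross-validation in the abstract suggests.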
New digital technologies present both game-changing opportunities for—and existential threats to—companies whose success was built in the pre-digital economy. This article describes our findings from a study of 25 companies that were embarking on digital transformation journeys. We identified two digital strategies—customer engagement and digitized solutions—that provide direction for a digital transformation. Two technology-enabled assets are essential for executing those strategies: an operational backbone and a digital services platform. We describe how a big old company can combine these elements to navigate its digital transformation.
The best fully automated analysis process achieves even better classification results than the established manual process. The best algorithms for the three analysis steps are (i) SGLTR (Savitzky-Golay Laplace operator filter thresholding regions) and LM (Local Maxima) for automated peak identification, (ii) EM clustering (Expectation Maximization) and DBSCAN (Density-Based Spatial Clustering of Applications with Noise) for the clustering step, and (iii) RF (Random Forest) for multivariate classification. Thus, automated methods can replace the manual steps in the analysis process and enable an unbiased, high-throughput use of the technology.
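The simplest of the named peak-identification baselines, LM (Local Maxima), can be sketched in a few lines. This is a generic illustration of the idea, not the paper's exact implementation; the signal values and height threshold are made up for the example.

```python
def local_maxima_peaks(signal, min_height=0.0):
    """Identify peaks as strict local maxima above a height threshold.

    Returns the indices of samples that exceed both neighbors and the
    threshold, which is the core idea behind LM-style peak picking.
    """
    peaks = []
    for i in range(1, len(signal) - 1):
        if signal[i] > signal[i - 1] and signal[i] > signal[i + 1] \
                and signal[i] >= min_height:
            peaks.append(i)
    return peaks

spectrum = [0.1, 0.3, 0.9, 0.4, 0.2, 0.6, 1.2, 0.5, 0.1]
print(local_maxima_peaks(spectrum, min_height=0.5))  # [2, 6]
```

The SGLTR variant adds smoothing and second-derivative filtering before thresholding, which makes the detection robust to noise that would fool this bare local-maximum test.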
Data integration of heterogeneous data sources relies either on periodically transferring large amounts of data to a physical Data Warehouse or on retrieving data from the sources on request only. The latter results in what is referred to as a virtual Data Warehouse, which is preferable when use of the latest data is paramount. The downside, however, is added network traffic and degraded performance when the amount of data is high. In this paper, we propose the use of a readCheck validator to ensure the timeliness of the queried data and to reduce data traffic. We further show that the readCheck allows transactions to update data in the data sources while obeying full Atomicity, Consistency, Isolation, and Durability (ACID) properties.
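The validator idea can be sketched as a cheap version-token check: the virtual warehouse caches query results and refetches only when the source reports a newer version. This is an assumed, simplified reading of the mechanism (class names and the per-key counter are illustrative inventions), not the paper's actual protocol.

```python
class Source:
    """A data source that tracks a version counter per key (assumed API)."""
    def __init__(self):
        self._data, self._version = {}, {}

    def write(self, key, value):
        self._data[key] = value
        self._version[key] = self._version.get(key, 0) + 1

    def read_check(self, key):
        # Cheap validator call: returns only the version token, not the data.
        return self._version.get(key, 0)

    def read(self, key):
        return self._data.get(key), self._version.get(key, 0)

class VirtualWarehouse:
    """Caches query results; refetches only when the readCheck token
    shows the source has changed since the cached read."""
    def __init__(self, source):
        self.source, self.cache = source, {}

    def query(self, key):
        cached = self.cache.get(key)
        if cached and cached[1] == self.source.read_check(key):
            return cached[0]          # still fresh: no bulk data transfer
        value, version = self.source.read(key)
        self.cache[key] = (value, version)
        return value
```

Only the small token crosses the network on a cache hit, which is how the approach can cut traffic while still guaranteeing that stale data is never served.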
The Ninth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2017), held May 21-25, 2017 in Barcelona, Spain, continued a series of international events covering a large spectrum of topics related to advances in database fundamentals, the evolving relation between databases and other domains, database technologies and content processing, as well as the specifics of databases in application domains.
Advances in different technologies and domains related to databases have triggered substantial improvements in content processing, information indexing, and data, process, and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the widespread adoption of XML.
High-speed communications and computations, large storage capacities, and load balancing for distributed database access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms, and advanced indexing methods.
The evolution of e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems puts pressure on database communities to push the 'de facto' methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
Despite the growing consensus that intuition should be implemented as a valuable complement to the rationally dominated decision-making culture in sales, uncertainty and a lack of knowledge about how to handle intuition properly still seem to prevail among employees. Systematically legitimizing intuition in sales can counteract this.
At major sporting events such as football World Cups and European Championships or the Olympic Games, millions are at stake for the associations and official sponsors, and they defend their advertising rights correspondingly fiercely. Burger King shows how this "monopoly" can be circumvented creatively. The following article presents two exemplary ambush marketing activities by Burger King during the 2016 UEFA European Championship. As a non-sponsor, Burger King deployed ambush marketing deliberately and creatively during the tournament to score points against the official UEFA sponsor and competitor McDonald's.
B-to-B sales has changed greatly through the latest information technologies and has become considerably more complex. At the same time, the opportunities have also improved dramatically. Internationally operating, virtually collaborating teams have shifted decision-making processes and competencies. Social selling now offers a way to better master the newly arisen challenges in sales.
The modern industrial corporation encompasses a myriad of different software applications, each of which must work in concert to deliver functionality to end-users. However, the increasingly complex and dynamic nature of competition in today’s product-markets dictates that this software portfolio be continually evolved and adapted, in order to meet new business challenges. This ability – to rapidly update, improve, remove, replace, and reimagine the software applications that underpin a firm’s competitive position – is at the heart of what has been called IT agility. Unfortunately, little work has examined the antecedents of IT agility, with respect to the choices a firm makes when designing its “Software Portfolio Architecture.”
We address this gap in the literature by exploring the relationship between software portfolio architecture and IT agility at the level of the individual applications in the architecture. In particular, we draw from modular systems theory to develop a series of hypotheses about how different types of coupling impact the ability to update, remove or replace the software applications in a firm’s portfolio. We test our hypotheses using longitudinal data from a large financial services firm, comprising over 1,000 applications and over 3,000 dependencies between them. Our methods allow us to disentangle the effects of different types and levels of coupling.
Our analysis reveals that applications with higher levels of coupling cost more to update, are harder to remove, and are harder to replace, than those with lower coupling. The measures of coupling that best explain differences in IT agility include all indirect dependencies between software applications (i.e., they include coupling and dependency relationships that are not easily visible to the system architect). Our results reveal the critical importance of software portfolio design decisions, in developing a portfolio of applications that can evolve and adapt over time.
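A coupling measure that includes all indirect dependencies, as described above, amounts to counting the nodes reachable from an application in the dependency graph. The sketch below is a generic illustration of that idea, not the study's actual metric; the portfolio data is invented.

```python
from collections import deque

def total_coupling(deps, app):
    """Count direct and indirect dependencies of `app` by breadth-first
    traversal of the dependency graph (single-source transitive closure).
    `deps` maps each application to the applications it calls directly."""
    seen, queue = set(), deque(deps.get(app, []))
    while queue:
        d = queue.popleft()
        if d not in seen:
            seen.add(d)
            queue.extend(deps.get(d, []))
    return len(seen)

# Hypothetical portfolio: A -> B -> C, so A has 1 direct and 1 indirect dependency.
deps = {"A": ["B"], "B": ["C"], "C": []}
print(total_coupling(deps, "A"))  # 2
```

The point of the traversal is exactly what the findings stress: B's dependency on C is invisible if the architect only inspects A's direct links, yet it still constrains how easily A can be changed or replaced.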
We present a topology of MIMO arrays of inductive antennas exhibiting inherent high crosstalk cancellation capabilities. A single-layer PCB is etched into a 3-channel array of emitting/receiving antennas. Once coupled with another similar 3-channel emitter/receiver, we measured an Adjacent Channel Rejection Ratio (ACRR) as high as 70 dB from 150 Hz to 150 kHz. Another primitive device, made of copper wires wound around PVC tubes to form a 2-channel "non-contact slip ring", exhibited 22 dB to 47 dB of ACRR up to 15 MHz. In this paper we introduce the underlying theoretical model behind the crosstalk suppression capabilities of these so-called "Pie-Chart antennas": an extension of the mutual inductance compensation method to a higher number of channels using symmetries. We detail the simple iterative building process of these antennas, illustrate it with numerical analysis, and evaluate their effectiveness via real experiments on the 3-channel PCB array and the 2-channel rotary array up to the limit of our test setup. The Pie-Chart design is primarily, but not exclusively, intended as an alternative to costly electronic filters or cumbersome EM shields in both wireless and wired applications.
A breathless chorus of whistles: why Helene Fischer was mercilessly whistled off at the DFB Cup final
(2017)
The merciless whistling directed at Helene Fischer during her performance in Berlin's Olympiastadion, at halftime of the 2017 DFB Cup final between Borussia Dortmund and Eintracht Frankfurt, reverberated in the media for some time afterwards. Many people, football fans, Schlager fans, and complete bystanders alike, asked: did she really deserve that? This article presents and discusses four explanatory approaches for why Helene Fischer was whistled off at the cup final and why the Schlager star's performance was out of place there.
A masked ball with Aubameyang
(2017)
Pierre-Emerick Aubameyang, striker for Borussia Dortmund, has a Nike swoosh dyed into his hairstyle and plays with it in early March 2017 in the Bundesliga match against Bayer Leverkusen. Only a few weeks later, in early April 2017, Aubameyang grabs a Nike mask after scoring against Schalke 04, puts it on, and celebrates intensely and conspicuously in front of the crowd. Shortly before, this mask had appeared as a prop in a Nike YouTube advertising video featuring Aubameyang.
The piquant detail is that Borussia Dortmund is outfitted and sponsored by Puma, while Aubameyang is a Nike testimonial. Here, a star who is a role model for many young people pushes his personal sponsor and thereby harms his club's outfitter. Is that ethically acceptable? It is obvious that BVB sponsor Puma is not amused. Beyond that, it is understandable when football fans complain ever more loudly that commercialization in football is slowly getting out of hand.
The sporting goods manufacturer Nike generates almost twice as much revenue as its competitor adidas. The number two seems shaken off. Nevertheless, fierce competition continues on the sporting goods market, and it is not always clean sport. Nike's methods are controversial, as three examples discussed in this article show.
Digitisation forms part of Industrie 4.0 and is both a threat and an opportunity for transforming business as we know it; it can make entire business models redundant. Although companies might realise the need to digitise, many are unsure of how to start this digital transformation. This paper addresses the problems and challenges faced in digitisation, and develops a model for initialising digital transformation in enterprises. The model is based on a continuous improvement cycle, and also includes triggers for innovative and digital thinking within the enterprise. The model was successfully validated in the German service sector.
Purpose: The purpose of this paper is to examine the service of the new business model Curated Shopping in the fashion industry and to analyze whether the service provides higher customer added value compared to traditional services in retail stores and on e-commerce platforms. It gives curated shop operators implications for optimizing the service in each stage of the customer buying process.
Design/methodology/approach: The research methodology applied is an empirical study that uses the principle of mystery shopping in order to investigate the services provided during the selling process.
Findings: The study showed that information about the customer should be collected carefully and as holistically as possible in order to assemble a suitable outfit. The consumer benefits from the service by saving time and enjoying a stress-free way of shopping. Nevertheless, the physical distance to the customer limits the curator's ability to give individual and inspiring personal advice.
Research limitations: The survey was conducted among 10 mystery shoppers and 4 curated shop operators in Germany, limiting the findings to these mystery shoppers and operators.
Practical implications: One implication for the shop operators is to collect consumer information carefully and to expand the assortment and brand portfolio in order to provide fashion goods that inspire the consumer. The shop operators are on the right track, but there is still huge potential to provide a more shopper-oriented service.
Purpose: The purpose of this study was to investigate the value of the web representation of certain fashion hot spots and how these results can be shown on fashion maps in an illustrated way.
Design/methodology/approach: A new ranking was created and evaluated with a self-constructed index to obtain solid results. Numbers were collected from Google, Instagram, Facebook, Twitter and web.alert.io. Additionally, fashion maps were created for an illustrative visualization of the results.
Findings: Compared with the ranking of a trend forecasting agency called Global Language Monitor, which compiled a ranking of non-virtual fashion cities, the web representation, and therefore the ranking of this research project, differs mainly in the position of the cities within the top 10, i.e. the rank at which a city appears, and less in the actual cities mentioned.
Research limitations: The research was limited by the subjective analysis of data, leading to partly subjective results, as well as by the selected number of social media platforms used.
Originality/value: This is the first study to explore the web representation value of fashion metropolises in comparison to their non-virtual ranking. The results build in part on existing findings concerning the transformation of fashion cities or, more generally, which cities hold the status of a fashion city.
There is no doubt that the amplification of channel integration towards an omnichannel structure is a powerful idea whose time has finally come. The digitally cross-linked world postulates all-encompassing, ubiquitous, and unobtrusive future services. In the concomitant, increasingly competitive market, retailers are starting to lay the foundation for omnichannel, meeting the expectations of a digitally cunning audience wanting their shopping experience to be as seamless and uncomplicated as possible. Nevertheless, recent research shows that there are still enough avenues for further research on omnichannel. Until now, the performance of companies has been considered solely from a suppliers' point of view. It would be rather interesting to find out whether the desire to meet increased customer expectations is also recognized by the customers themselves. This paper seeks to answer how purchasing behavior has changed and what customers demand. In addition, it elaborates the opportunities that are promoted by omnichannel. Having examined these effects, the paper proceeds to a final step, showing how the omnichannel performance of fashion and lifestyle retailers can be measured from a consumers' perspective by developing an exclusive index. The study is confined to four fashion and lifestyle retailers: Hugo Boss AG, Levi Strauss & Co, Pull and Bear, as well as COS. Using the scientific method of mystery shopping and a multi-item checklist including 54 key performance indicators, the paper aims to examine to what extent the four selected retailers provide a seamless customer journey, according to the five decision-making phases.
This research is about Omnichannel Retailing and addresses the question of how the omnichannel capability of retailers in the fashion market can be measured. Our sources include books, interviews, newspapers, and scientific databases.
Omnichanneling is a current topic in the fashion market; retailers all over the world face the question of how to adapt to the challenges Omnichannel Retailing poses. We first define what omnichanneling is by explaining the differences between Multiple-, Multi-, Cross- and Omnichannel Retailing. Having defined omnichanneling, we evaluate a set of 26 retailers regarding their omnichannel capabilities and then create an index with criteria that measure the omnichannel capability of each retailer.
The Omnichannel Score is based on 31 criteria, which analyze the retailers in offline, online, mobile, and social aspects and make differences between retailers visible. We found that retailers in the US fashion market are more advanced in Omnichannel Retailing than retailers in the German fashion market. Our top three omnichannel retailers were Sears with an Omnichannel Score of 91, followed by KOHL'S and Marks&Spencer, both with an Omnichannel Score of 88. The best omnichannel retailer from Germany was Adidas in fourth place with an Omnichannel Score of 81.
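One simple way such a criteria-based index can be computed is as the fraction of fulfilled criteria scaled to 0-100. This is an illustrative assumption; the study's actual scoring or weighting scheme is not specified here.

```python
def omnichannel_score(criteria_met, total_criteria=31):
    """Illustrative index: share of fulfilled criteria, scaled to 0-100.

    `criteria_met` counts how many of the (here: 31) offline, online,
    mobile, and social criteria a retailer satisfies. The real study's
    weighting may differ from this equal-weight sketch.
    """
    return round(100 * criteria_met / total_criteria)

print(omnichannel_score(28))  # 90
```

Equal weighting keeps the index transparent and comparable across retailers; a refinement would weight criteria by their importance to the customer journey.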
Business models in the energy industry: a compendium from methodology to application
(2017)
Whether student or employee, researcher or entrepreneur, politician or lecturer, whether in a start-up or in the long-established "energy utility", today supposedly nobody gets by without a good business model. Why is that? What makes business models the workhorses not only of business administration but also of engineers, economists, and computer scientists? A business model describes the principle by which an organization creates, delivers, and captures value. Through this simplification and structuring, it enables easier communication and analysis of the overall construct or its components. It serves as a planning instrument that helps identify innovations more efficiently and in a more targeted manner. Business models can be developed at the level of companies or of individual business units. This compendium serves students and practitioners in the energy industry as a methodological basis for developing business models on their own. Chapter 1 therefore derives from science and research what a business model is and how it is applied. Chapter 2 describes the challenges of the energy industry. The sector has been in transition for decades. New technologies for (decentralized) generation, digitization, changing political goals and instruments (liberalization, the nuclear phase-out, the energy transition, ...), and new customer needs require companies, large and small, established and new providers, in public and in private ownership, to seek promising paths into the future amid eroding margins and increasing competition. The very term "business model" is nowadays invested with the hope of a savior in this thicket, a hope that a structuring instrument (which is, after all, all a business model is) naturally cannot fulfill.
Chapter 3 describes business models of the energy industry that are known in principle and differentiates their patterns, drawing on other industries. This should ease the relative newcomer's entry into the sector and offer those searching for new business models a basis for their own innovation. Chapter 4 describes business models for virtual power plants; this example is also used to show how the business models of partners along the value chain must interlock. The final Chapter 5 addresses success factors for developing and implementing business models.
Colors surround people every day and influence our mood and behavior, partly consciously, partly unconsciously. This fact prompts marketing, too, to engage with the effects of colors in order to apply them deliberately. The right use of colors in marketing can support the (advertising) message and the desired effect of an activity or brand, and can also generate attention among consumers. Insights into the effects of colors in marketing are thus decisive for consumer perception and for the success of a company's marketing. This paper first describes colors and their effects with regard to color symbolism, color combinations, and color shades. It then addresses the use of colors in marketing. Using examples from marketing practice, colors are presented as an element of corporate identity and of the brand, and the use of colors in advertising is analyzed. To illustrate the effect of colors in marketing, a mood board is designed for a promotional item of the discounter ALDI Süd. Finally, the findings are summarized and a recommendation for the use of colors in marketing is given.
Marketing with Instagram
(2017)
In a consumer society flooded with stimuli, it is not always easy to reach potential customers through the right channel. For brand positioning, growing budgets must be invested in the production of high-quality content in order to preserve authenticity and relevance for consumers amid growing digital competition. The focus has long since shifted from the pure advertising message to "storytelling". The experiential quality of a brand moves to the foreground, because the target group wants not only to learn about the objective added value but also to hear an exciting story that they can experience themselves with the product or service. With the rise of mobile content consumption, social networks such as Instagram are receiving increasing attention in the planning of a company's marketing activities. In textile retail this insight has been actively exploited for some time, but not in German food retail. Against this background, the question arises whether an aesthetic staging of food on Instagram even resonates with recipients. The answer is clear: American companies such as Wholefoods lead the way. With more than 1.9 million followers and almost 2,500 posts (as of February 2017), the company profile of the Texas-based food retailer enjoys great popularity. Marketing with Instagram and food can thus be combined successfully, but how? This is examined in the present paper.
Social sustainable supply chain management in the textile and apparel industry : a literature review
(2017)
Over the last decade, academics have conducted a vast number of studies on sustainability in supply chain management. Nevertheless, socially related aspects are still neglected in the related discussion. The primary motivation of the present literature review arises from this shortcoming; the key purpose of this study is thus to enrich the discussion by providing a state-of-the-art review focusing exclusively on social issues in sustainable supply chain management (SSCM), with the textile/apparel sector as the field of application. The authors conduct a literature review, including a content analysis covering 45 articles published in English peer-reviewed journals, and propose a comprehensive map that integrates the latest findings on socially related practices in the textile/apparel industry with the dominant conceptualization in order to reveal potential research areas in the field. The results show an ongoing lack of investigation regarding the social dimension of the triple bottom line in SSCM. Findings indicate that a company's internal orientation is the main assisting factor in sustainable supply chain management practices. Further, supplier collaboration and assessment can be interpreted as an offer to suppliers deriving from stakeholders and from a focal company's management of social risk. Nevertheless, suppliers also face, or even create, huge barriers to improving their social performance. This calls for more empirical research and qualitative or quantitative survey methods, especially at the level of suppliers located in developing countries.
Never call them seniors!
(2017)