The motto of the Informatics Inside 2020 autumn conference is KInside. Once again, students look inside and take a closer look at methods, applications and interrelationships. The contributions are diverse and, in keeping with the degree programme, human-centered. The aspiration is that the topics revolve around people's needs and that the methods employed are not an end in themselves but are measured by their benefit to people.
Steady-state efficiency optimization techniques for induction motors are state of the art, and various methods have already been developed. This paper provides new insights into efficiency-optimized operation in the dynamic regime. It proposes an anticipative flux modification in order to decrease losses during torque and speed transients. The resulting trajectories are analyzed in a numerical study for different motors, and measurement results for one motor are given as well.
Sol-gel-based flame retardants are a promising approach for textiles, particularly as replacements for the currently established halogenated flame retardants. The latter have come under criticism due to their toxicological concerns and, in some cases, bioaccumulative properties. This research project therefore investigated how flame retardancy based on nitrogen and/or phosphorus could be realized via a sol-gel approach as a halogen-free alternative. The sol-gel layer served on the one hand as a non-flammable binder; on the other hand, flame-retardant active groups could be incorporated directly by introducing appropriate functional side chains. Several approaches were pursued. In particular, by using additive systems, i.e. sol-gel layers with added nitrogen- and/or phosphorus-containing compounds, flame retardancy according to DIN EN ISO 15025 (protective clothing: protection against heat and flame) could be achieved. Using a model system in which a functionalized sol-gel layer was applied first and a phosphorus compound was then applied in a second step, the advantages of sol-gel-based flame retardancy were demonstrated. Among other things, it was shown that a mechanism based on the formation of a protective layer is mainly responsible for the flame retardancy. This result should not be underestimated for the future optimization of such finishes. Finishing trials on a semi-industrial scale further showed that, in principle, nothing stands in the way of large-scale implementation of the applied finishes. To date, compromises must be made only with regard to wash stability.
Although the sol-gel layers generally withstood typical washing processes, permanent flame retardancy of the additive systems was achieved only in individual cases. Based on these results, a new approach going beyond the one underlying this work was presented: producing sol-gel layers from newly synthesized silanes carrying nitrogen and phosphorus groups, which show promising behavior. Here, retention of the improved flame resistance could be demonstrated even after initial wash tests. Overall, the research project showed that sol-gel-based flame retardancy can be achieved for textiles. Furthermore, the mechanism on which this flame retardancy is based could be explained, as well as how the currently still insufficient wash permanence can be improved.
Driven by digital transformation, manufacturing systems are heading towards autonomy. The implementation of autonomous elements in manufacturing systems is still a big challenge. Especially small and medium sized enterprises (SME) often lack experience to assess the degree of Autonomous Production. Therefore, a description model for the assessment of stages for Autonomous Production has been identified as a core element to support such a transformation process. In contrast to existing models, the developed SME-tailored model comprises different levels within a manufacturing system, from single manufacturing cells to the factory level. Furthermore, the model has been validated in several case studies.
Process quality has reached a high level in mass production, utilizing well-known methods such as Design of Experiments (DoE). The drawback of the underlying statistical methods is the need for tests under real production conditions, which cause high costs due to the lost output. Research over the last decade has led to methods for correcting a process by using in-situ data to adjust the process parameters, but a lot of pre-production is still necessary to get this working. This paper presents a new approach to improving product quality in process chains by using context data, which in part are gathered by Industry 4.0 devices, to reduce the necessary pre-production.
In recent years, machine learning algorithms have seen major advances in performance and applicability in industry and especially in maintenance. Their application enables predictive maintenance and thus offers efficiency increases. However, a successful implementation of such solutions still requires high effort in data preparation to obtain the right information, interdisciplinarity in teams, as well as good communication with employees. Here, small and medium-sized enterprises (SMEs) often lack experience, competence and capacity. This paper presents a systematic and practice-oriented method for the implementation of machine learning solutions for predictive maintenance in SMEs, which has already been validated.
The chemical synthesis of polysiloxanes from monomeric starting materials involves a series of hydrolysis, condensation and modification reactions with complex monomeric and oligomeric reaction mixtures. Real-time monitoring and precise process control of the synthesis process is of great importance to ensure reproducible intermediates and products and can readily be performed by optical spectroscopy. In chemical reactions involving rapid and simultaneous functional group transformations and complex reaction mixtures, however, the spectroscopic signals are often ambiguous due to overlapping bands, shifting peaks and changing baselines. The univariate analysis of individual absorbance signals is hence often only of limited use. In contrast, batch modelling based on the multivariate analysis of the time course of principal components (PCs) derived from the reaction spectra provides a more efficient tool for real time monitoring. In batch modelling, not only single absorbance bands are used but information over a broad range of wavelengths is extracted from the evolving spectral fingerprints and used for analysis. Thereby, process control can be based on numerous chemical and morphological changes taking place during synthesis. “Bad” (or abnormal) batches can quickly be distinguished from “normal” ones by comparing the respective reaction trajectories in real time. In this work, FTIR spectroscopy was combined with multivariate data analysis for the in-line process characterization and batch modelling of polysiloxane formation. The synthesis was conducted under different starting conditions using various reactant concentrations. The complex spectral information was evaluated using chemometrics (principal component analysis, PCA). Specific spectral features at different stages of the reaction were assigned to the corresponding reaction steps. Reaction trajectories were derived based on batch modelling using a wide range of wavelengths. 
Subsequently, complexity was reduced again to the most relevant absorbance signals in order to derive a concept for a low-cost process spectroscopic set-up which could be used for real-time process monitoring and reaction control.
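As a minimal illustration of the batch-modelling idea described above (not the authors' actual chemometrics pipeline), the following sketch derives a reaction trajectory in principal-component space from a synthetic series of spectra; the band positions, reaction profile and noise level are invented for this example:

```python
import numpy as np

def batch_trajectory(spectra, n_components=2):
    """Project a time series of reaction spectra onto their leading
    principal components (the 'reaction trajectory' used in batch modelling).

    spectra: (n_times, n_wavelengths) array of absorbance spectra
    """
    centered = spectra - spectra.mean(axis=0)     # mean-center over time
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_components].T         # PC scores per time point

# Synthetic "reaction": an educt band decays while a product band grows
rng = np.random.default_rng(0)
wl = np.linspace(1000.0, 1200.0, 200)                    # wavenumber axis
t = np.linspace(0.0, 1.0, 50)                            # reaction time
educt = np.exp(-(wl - 1050.0) ** 2 / 50.0)
product = np.exp(-(wl - 1150.0) ** 2 / 50.0)
spectra = np.outer(1.0 - t, educt) + np.outer(t, product)
spectra += 0.001 * rng.standard_normal(spectra.shape)    # measurement noise

traj = batch_trajectory(spectra)
print(traj.shape)  # (50, 2)
```

Comparing such trajectories of a running batch against those of known "normal" batches is what allows abnormal batches to be flagged in real time.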
Demands on energy utilities will grow: in the future, tasks such as the development of digitalized products and services as well as ecological activities will gain relevance. This is shown by Hochschule Reutlingen in its current survey of supervisory board members, managing directors and executives. Despite the expected changes, supervisory boards are aware of the pressure to professionalize, but currently appear only moderately prepared for the future challenges facing their companies. Particularly relevant: the professionalization of board work in municipal energy utilities enables higher perceived corporate success. So concludes the study by the Reutlinger Energiezentrum at Hochschule Reutlingen, commissioned by five companies in the sector.
nKV in action: accelerating KV stores on native computational storage with near-data processing
(2020)
Massive data transfers in modern data intensive systems resulting from low data-locality and data-to-code system design hurt their performance and scalability. Near-data processing (NDP) designs represent a feasible solution, which although not new, has yet to see widespread use.
In this paper we demonstrate various NDP alternatives in nKV, which is a key/value store utilizing native computational storage and near-data processing. We showcase the execution of classical operations (GET, SCAN) and complex graph-processing algorithms (Betweenness Centrality) in-situ, with 1.4x-2.7x better performance due to NDP. nKV runs on real hardware - the COSMOS+ platform.
Massive data transfers in modern key/value stores resulting from low data-locality and data-to-code system design hurt their performance and scalability. Near-data processing (NDP) designs represent a feasible solution, which although not new, have yet to see widespread use.
In this paper we introduce nKV, which is a key/value store utilizing native computational storage and near-data processing. On the one hand, nKV can directly control the data and computation placement on the underlying storage hardware. On the other hand, nKV propagates the data formats and layouts to the storage device where, software and hardware parsers and accessors are implemented. Both allow NDP operations to execute in host-intervention-free manner, directly on physical addresses and thus better utilize the underlying hardware. Our performance evaluation is based on executing traditional KV operations (GET, SCAN) and on complex graph-processing algorithms (Betweenness Centrality) in-situ, with 1.4×-2.7× better performance on real hardware – the COSMOS+ platform.
Massive data transfers in modern data-intensive systems resulting from low data-locality and data-to-code system design hurt their performance and scalability. Near-data processing (NDP) and a shift to code-to-data designs may represent a viable solution, as packaging combinations of storage and compute elements on the same device has become feasible.
The shift towards NDP system architectures calls for a revision of established principles. Abstractions such as data formats and layouts typically spread across multiple layers in a traditional DBMS, and the way they are processed is encapsulated within these layers of abstraction. NDP-style processing requires an explicit definition of cross-layer data formats and accessors to ensure in-situ execution that optimally utilizes the properties of the underlying NDP storage and compute elements. In this paper, we make the case for such data format definitions and investigate the performance benefits under NoFTL-KV and the COSMOS hardware platform.
The tale of 1000 cores: an evaluation of concurrency control on real(ly) large multi-socket hardware
(2020)
In this paper, we set out the goal to revisit the results of "Staring into the Abyss [...] of Concurrency Control with [1000] Cores" and analyse in-memory DBMSs on today's large hardware. Despite the original assumption of the authors, today we do not see single-socket CPUs with 1000 cores. Instead, multi-socket hardware made its way into production data centres. Hence, we follow up on this prior work with an evaluation of the characteristics of concurrency control schemes on real production multi-socket hardware with 1568 cores. To our surprise, we made several interesting findings which we report on in this paper.
In this paper, we present a new approach for achieving robust performance of data structures making it easier to reuse the same design for different hardware generations but also for different workloads. To achieve robust performance, the main idea is to strictly separate the data structure design from the actual strategies to execute access operations and adjust the actual execution strategies by means of so-called configurations instead of hard-wiring the execution strategy into the data structure. In our evaluation we demonstrate the benefits of this configuration approach for individual data structures as well as complex OLTP workloads.
Modern mixed (HTAP) workloads execute fast update transactions and long-running analytical queries on the same dataset and system. In multi-version (MVCC) systems, such workloads result in many short-lived versions and long version chains, as well as increased and frequent maintenance overhead.
Consequently, the index pressure increases significantly. Firstly, the frequent modifications cause frequent creation of new versions, yielding a surge in index maintenance overhead. Secondly and more importantly, index scans incur extra I/O overhead to determine which of the resulting tuple versions are visible to the executing transaction (visibility check), as current designs only store version/timestamp information in the base table, not in the index. Such an index-only visibility check is critical for HTAP workloads on large datasets.
In this paper we propose the Multi Version Partitioned B-Tree (MV-PBT) as a version-aware index structure, supporting index-only visibility checks and flash-friendly I/O patterns. The experimental evaluation indicates a 2x improvement for analytical queries and 15% higher transactional throughput under HTAP workloads. MV-PBT offers 40% higher tx. throughput compared to WiredTiger’s LSM-Tree implementation under YCSB.
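The index-only visibility check at the heart of this design can be illustrated with a toy sketch (a hypothetical data layout, not the actual MV-PBT structure): each index entry carries the version's begin/end timestamps, so a scan can filter versions without touching the base table.

```python
from dataclasses import dataclass

INF = float("inf")

@dataclass
class IndexEntry:
    key: str
    tuple_id: int
    begin_ts: float   # timestamp of the transaction that created this version
    end_ts: float     # timestamp that invalidated it (INF while current)

def visible(entry, tx_ts):
    """Index-only visibility check: no base-table I/O is needed because the
    version timestamps live in the index entry itself."""
    return entry.begin_ts <= tx_ts < entry.end_ts

def index_scan(entries, lo, hi, tx_ts):
    """Return tuple ids of the versions in [lo, hi] visible to tx_ts."""
    return [e.tuple_id for e in entries
            if lo <= e.key <= hi and visible(e, tx_ts)]

# Two versions of key "a": tuple 100 valid for ts [1, 5), tuple 101 from ts 5 on
entries = [IndexEntry("a", 100, 1, 5),
           IndexEntry("a", 101, 5, INF),
           IndexEntry("b", 200, 2, INF)]
print(index_scan(entries, "a", "b", tx_ts=3))   # [100, 200]
print(index_scan(entries, "a", "b", tx_ts=7))   # [101, 200]
```

In a design without in-index timestamps, the same scan would have to fetch every candidate version from the base table just to discard the invisible ones.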
Customer orientation should be the core engine of every organisation while IT can be considered as the enabler to generate competitive advantages along customer processes in marketing, sales and service. Research shows that customer relationship management (CRM) enables organisations to perform better and experience indicates that organisations that focus on customer orientation are more successful. With marketplace organisations such as Amazon, Alibaba or Conrad shaping the future of customer centricity and information technology, German B2B organisations need to shift their value contribution from product-centric to customer-centric. While these organisations are currently attempting to implement CRM software and putting their customers more into focus, the question remains how organisations are approaching the implementation of CRM and whether these attempts are paying off in terms of business performance.
This article provides an overview of the different options for accounting for an initial coin offering (ICO) on the issuer's liabilities side under IFRS. The aim is to discuss the accounting classification of different types of tokens and to support issuers in structuring the tokens and in their subsequent accounting. The results show that the existing standards are sufficient for the accounting classification of ICO tokens, but that a wide range of accounting treatments must be considered, so that detailed regulation through a dedicated IFRS appears difficult.
Value engineering in customer communication is a structured method for improving communication processes between companies. The concept takes up proven elements of technical value analysis and overhead value analysis and transfers them to customer communication. The approach offers a systematic procedure for examining and redesigning communication processes between supplier and customer. Value engineering in customer communication thus creates competitive advantages by optimizing communication.
The article studies a novel approach to inflation modeling in economics. We utilize a stochastic differential equation (SDE) of the form dX_t = a X_t dt + b X_t dB_t^H, where B_t^H is a fractional Brownian motion, in order to model inflationary dynamics. Standard economic models do not capture the stochastic nature of inflation in the Eurozone. Thus, we develop a new stochastic approach and take into consideration fractional Brownian motions as well as Lévy processes. The benefit of these stochastic processes is the modeling of interdependence and jumps, which is equally confirmed by empirical inflation data. The article defines and introduces the rules for stochastic and fractional processes and elucidates the stochastic simulation output.
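The SDE dX_t = a X_t dt + b X_t dB_t^H can be simulated numerically; the following sketch uses Cholesky sampling of fractional Brownian motion and an Euler scheme. Both are standard techniques assumed here for illustration, as are the parameter values; the article's own simulation method may differ.

```python
import numpy as np

def fbm(n, H, T=1.0, seed=0):
    """Sample a fractional Brownian motion B^H on [0, T] at n grid points
    via a Cholesky factorisation of its covariance
    Cov(B_s, B_t) = 0.5 * (s^(2H) + t^(2H) - |t - s|^(2H))."""
    rng = np.random.default_rng(seed)
    t = np.linspace(T / n, T, n)
    s, u = np.meshgrid(t, t)
    cov = 0.5 * (s ** (2 * H) + u ** (2 * H) - np.abs(s - u) ** (2 * H))
    path = np.linalg.cholesky(cov) @ rng.standard_normal(n)
    return np.concatenate(([0.0], path)), np.concatenate(([0.0], t))

def euler(x0, a, b, H, n=300):
    """Euler scheme for dX_t = a X_t dt + b X_t dB_t^H."""
    B, t = fbm(n, H)
    dt = t[1] - t[0]
    x = np.empty(n + 1)
    x[0] = x0
    for i in range(n):
        x[i + 1] = x[i] + a * x[i] * dt + b * x[i] * (B[i + 1] - B[i])
    return x

# Hypothetical inflation-index dynamics with persistence (H > 0.5)
path = euler(x0=2.0, a=0.02, b=0.05, H=0.7)
print(len(path), path[0])
```

The Hurst parameter H > 0.5 encodes the long-range dependence ("interdependence") of the increments; jumps, as in a Lévy-driven variant, are not modelled in this sketch.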
Resilience and stability? Setting the course for the banking and financial system in the Corona pandemic
(2020)
Since the global financial crisis of 2008/2009, there has been no challenge to the financial and banking system comparable to that during the Corona crisis.
Weak profitability, unresolved regulatory challenges and increasing competition in the digital sector pose further challenges for banks.
The stability of the financial system and access to financial markets was not at risk during the pandemic. Through joint efforts and better bank capitalisation, the financial system is now more resilient than during the financial crisis.
Provided that grants and loans in the "Next Generation EU" fund are well targeted for structural reforms and investments in the future, this should boost confidence and growth.
However, further improvements in financial stability, such as increased capital requirements, regulation of shadow banks or reforms in financial supervision, are needed.
The rapid development of consumer sensor technology suggests a clinical benefit of the available decentrally collected data from patients' everyday lives for monitoring individual health status. To test this assumption, a corresponding platform must be made available in everyday clinical practice. To this end, the bwHealthApp is being developed, which can map both the current range and the evolution of sensor technology onto clinical applications. The flexible design allows the clinical benefit for personalized medicine to be evaluated. In addition, the bwHealthApp offers a feasibility-oriented contribution to the discussion of open legal, regulatory and ethical questions of digitalization in medicine in Germany.
The livestock sector is growing steadily and is responsible for around 18% of global greenhouse gas emissions, which is more than the global transport sector (Steinfeld et al. 2006). This paper examines the potential of social marketing to reduce meat consumption. The aim is to understand consumers' motivation in diet choices and to learn what opportunities social marketing can provide to counteract negative environmental and health trends. The authors believe that research to answer this question should start in metropolitan areas, because measures should be especially effective there. Based on the Theory of Planned Behaviour (TPB, Ajzen 1991) and the Technology Acceptance Model by Huijts et al. (2012), an online study with participants from the metropolitan region (n = 708) was conducted in which central socio-psychological constructs for a reduction of meat consumption were examined. It was shown that attitude, personal norm and habit have a critical influence on the intention to reduce meat consumption. A segmentation of consumers based on these factors led to three consumer clusters: vegetarians/flexitarians, potential flexitarians and convinced meat eaters. Potential flexitarians are an especially relevant target group for the development of social marketing measures to reduce meat consumption. In co-creation workshops with potential flexitarians from the metropolitan region, barriers and benefits of reducing meat consumption were identified. The factors of environmental protection, animal welfare and desire for variety turn out to be the most relevant motivational factors. Based on these factors, consumers proposed a variety of social marketing measures, such as applications and labels to inform about the environmental impact of meat products.
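As an illustration of the segmentation step, the sketch below clusters synthetic respondents by their attitude, personal-norm and habit scores. The abstract does not name the clustering algorithm; plain k-means and the score distributions are assumptions of this example.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means: assign each point to its nearest center, then move
    each center to the mean of its assigned points."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                            else centers[j] for j in range(k)])
    return labels, centers

rng = np.random.default_rng(1)
# Synthetic respondents, columns: attitude, personal norm, habit (1-5 scale)
veg = rng.normal([4.5, 4.5, 1.5], 0.3, size=(40, 3))    # vegetarians/flexitarians
pot = rng.normal([3.5, 3.0, 3.0], 0.3, size=(40, 3))    # potential flexitarians
meat = rng.normal([1.5, 1.5, 4.5], 0.3, size=(40, 3))   # convinced meat eaters
X = np.vstack([veg, pot, meat])
labels, centers = kmeans(X, k=3)
print(np.bincount(labels, minlength=3))
```

In the study, the middle cluster (potential flexitarians) is the target group for which social marketing measures were then co-created.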
Our paper investigates the response of acquiring firms' stock returns around the announcement date in cross-border mergers and acquisitions (M&A) between listed Chinese acquirers and German targets. We apply an event study methodology to examine the shareholder value effect based on a sample of M&A deals over the most recent period of 2012-2018. We apply a market model event study based on the argumentation of Brown and Warner (1985) and use short-term observation periods according to Andrade, Mitchell, and Stafford (2001) as well as Hackbarth and Morellec (2008). The results indicate that the announcement of M&A involving German targets results in a positive cumulative abnormal return averaging 2.18% for Chinese acquirers' shareholders in a five-day symmetric event window. Furthermore, we found slight indications of possible information leakage prior to the formal announcement. Although the size of acquiring firms is not necessarily correlated with the positive abnormal returns in the short run, this study suggests that Chinese acquirers' shareholders gain higher abnormal returns when the German targets are non-listed companies.
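The market-model event study described above can be sketched as follows. The return data, window positions and effect size are synthetic and purely illustrative; the paper's actual windows follow Andrade, Mitchell, and Stafford (2001) and Hackbarth and Morellec (2008).

```python
import numpy as np

def car(stock_ret, market_ret, est_win, event_win):
    """Cumulative abnormal return from a market-model event study:
    fit R_stock = alpha + beta * R_market over the estimation window,
    then cumulate actual-minus-expected returns over the event window."""
    beta, alpha = np.polyfit(market_ret[est_win], stock_ret[est_win], 1)
    abnormal = stock_ret[event_win] - (alpha + beta * market_ret[event_win])
    return abnormal.sum()

rng = np.random.default_rng(1)
m = rng.normal(0.0, 0.01, 130)                        # daily market returns
s = 0.0002 + 1.2 * m + rng.normal(0.0, 0.005, 130)    # acquirer returns
s[125:128] += 0.007                                   # announcement effect
# Estimation window: days 0-119; five-day event window around "day 125"
print(car(s, m, slice(0, 120), slice(123, 128)))
```

A positive cumulative abnormal return over the event window, as reported in the paper (on average 2.18%), indicates that the announcement created shareholder value beyond what the market model predicts.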
This paper studies the impact of financial liquidity on the macro-economy. We extend a classic macroeconomic model and compute numerical simulations. The model confirms that persistently low inflation can occur despite a high degree of financial liquidity due to a reallocation of cash, normal and risk-free bonds. In that regard, our model uncovers an explanation of a flat Phillips curve. Overall, our approach contributes to a rather disregarded matter in macroeconomic theory.
This article studies the current debate on Coronabonds and the idea of European public debt in the aftermath of the Corona pandemic. According to the EU-Treaty economic and fiscal policy remains in the sovereignty of Member States. Therefore, joint European debt instruments are risky and trigger moral hazard and free-riding in the Eurozone. We exhibit that a mixture of the principle of liability and control impairs the present fiscal architecture and destabilizes the Eurozone. We recommend that Member States ought to utilize either the existing fiscal architecture available or establish a political union with full sovereignty in Europe. This policy conclusion is supported by the PSPP-judgement of the Federal Constitutional Court of Germany on 5 May 2020. This ruling initiated a lively debate about the future of the Eurozone and Europe in general.
Since Adam Smith, the “homo oeconomicus” is the behavioural model in economics. Commonly this model characterizes a selfish individual, a kind of ruthless type, whose greed for profit seems to take precedence over moral values. Already 100 years ago, Max Weber provided a modernization of the model concerning the methodological individualism. Recent research in cognitive sciences reveals a further modernization of this standard model in economics. Neuro-economics, a highly interdisciplinary research field, is building a new behavioural consensus. This article examines the new properties of the “neuro-homo oeconomicus”. We show that the new behavioural model is rather similar to the long-standing economic prototype. To that extent, the neuro-model is more hype than hope. In principle, this article considers an ancient philosophical question about the nature of humans in general.
Energy efficient electric control of drives is more and more important for electric mobility and manufacturing industries. Online dynamic optimization of induction machines is challenging due to the computational complexity involved and the variable power losses during dynamic operation of induction machines. This paper proposes a simple technique for sub-optimal online loss optimization using rotor flux linkage templates for energy efficient dynamic operation of induction machines. Such a rotor flux linkage template is given by a rotor flux linkage trajectory which is optimal for a specific scenario. This template is calculated in an offline optimization process. For a specific scenario during real time operation the rotor flux linkage is calculated by appropriately scaling the given template.
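The template-scaling idea above can be illustrated with a toy sketch. The square-root-of-torque scaling law used here is a common steady-state heuristic (loss-optimal flux grows roughly with the square root of torque) and is an assumption of this example, not necessarily the paper's scaling rule; the template shape is likewise invented.

```python
import numpy as np

def scale_template(template_flux, template_torque, target_torque):
    """Scale an offline-optimised rotor-flux template to a new torque level
    for use in real-time operation."""
    return template_flux * np.sqrt(target_torque / template_torque)

# Hypothetical template: loss-optimal flux ramp for a 10 Nm torque transient
t = np.linspace(0.0, 0.2, 100)                       # time in s
template = 0.6 + 0.3 * (1.0 - np.exp(-t / 0.05))     # rotor flux linkage in Wb
scaled = scale_template(template, template_torque=10.0, target_torque=5.0)
print(scaled[0])
```

The point of the approach is that only this cheap scaling runs online, while the expensive trajectory optimisation that produced the template runs offline.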
In this work, a brushless, harmonic-excited wound-rotor synchronous machine is investigated which utilizes special stator and rotor windings. The windings magnetically decouple the fundamental torque-producing field from the harmonic field required for the inductive power transfer to the field coil. In contrast to conventional harmonic-excited synchronous machines, the whole winding is utilized for both torque production and harmonic excitation such that no additional copper for auxiliary windings is needed. Different rotor topologies using rotating power electronic components are investigated and their efficiencies have been compared based on Finite-Element calculation and circuit analysis.
Companies are becoming aware of the potential risks arising from sustainability aspects in supply chains. These risks can affect ecological, economic or social aspects. One important element in managing those risks is improved transparency in supply chains by means of digital transformation. Innovative technologies like blockchain technology can be used to enforce transparency. In this paper, we present a smart contract-based Supply Chain Control Solution to reduce risks. Technological capabilities of the solution will be compared to a similar technology approach and evaluated regarding their benefits and challenges within the framework of supply chain models. As a result, the proposed solution is suitable for the dynamic administration of complex supply chains.
Globalisation, shorter product life cycles, and increasing product varieties have led to complex supply chains. At the same time, there is a growing interest of customers and governments in having a greater transparency of brands, manufacturers, and producers throughout the supply chain. Due to the complex structure of collaborative manufacturing networks, the increase of supply chain transparency is a challenge for manufacturing companies. The blockchain technology offers an innovative solution to increase the transparency, security, authenticity, and auditability of products. However, there are still uncertainties when applying the blockchain technology to manufacturing scenarios and thus enable all stakeholders to trace back each component of an assembled product. This paper proposes a framework design to increase the transparency and auditability of products in collaborative manufacturing networks by adopting the blockchain technology. In this context, each component of a product is marked with a unique identification number generated by blockchain-based smart contracts. In this way, a transparent auditability of assembled products and their components can be achieved for all stakeholders, including the customer.
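The idea of unique, traceable component identifiers can be illustrated with a hash-based sketch. This is conceptual only: in the proposed framework the identifiers are generated and stored by on-chain smart contracts, whereas here a plain SHA-256 over the component attributes and its sub-component ids stands in for that step, and all part data are invented.

```python
import hashlib
import json

def component_id(attributes, part_ids):
    """Derive a deterministic, tamper-evident identification number for a
    component by hashing its attributes together with the ids of its
    sub-components; any change in a sub-part changes the product id."""
    record = json.dumps({"attributes": attributes, "parts": sorted(part_ids)},
                        sort_keys=True)
    return hashlib.sha256(record.encode()).hexdigest()

# Trace an assembled product back through its components
motor = component_id({"part": "motor", "supplier": "A", "batch": 7}, [])
frame = component_id({"part": "frame", "supplier": "B", "batch": 3}, [])
bike = component_id({"part": "e-bike", "line": 2}, [motor, frame])
print(bike[:16])  # every stakeholder can recompute and verify this id
```

Because each id commits to the ids of all sub-components, auditing an assembled product reduces to recomputing the hash chain from the recorded part data.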
Customer foresight is a relatively new research field. We introduce the customer foresight territory by discussing its localization between customer research and foresight research. For this purpose, we look at a variety of methods that help to understand customers and future realities. On this basis we provide an overview of customer foresight methods and outline an ideal-typical research journey.
Some widely used optical measurement systems require a scan in wavelength or in one spatial dimension to measure the topography in all three dimensions. Novel hyperspectral sensors based on an extended Bayer pattern have a high potential to solve this issue as they can measure three dimensions in a single shot. This paper presents a detailed examination of a hyperspectral sensor including a description of the measurement setup. The evaluated sensor (Ximea MQ022HG-IM-SM5X5-NIR) offers 25 channels based on Fabry–Pérot filters. The setup illuminates the sensor with discrete wavelengths under a specified angle of incidence. This allows characterization of the spatial and angular response of every channel of each macropixel of the tested sensor on the illumination. The results of the characterization form the basis for a spectral reconstruction of the signal, which is essential to obtain an accurate spectral image. It turned out that irregularities of the signal response for the individual filters are present across the whole sensor.
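The spectral reconstruction mentioned above can be sketched as a least-squares inversion of the characterised channel responses. The response matrix below is a hypothetical stand-in with narrow, Fabry-Pérot-like peaks and cross-talk; a real reconstruction for the MQ022HG-IM-SM5X5-NIR would use the measured responses and typically add regularisation.

```python
import numpy as np

def reconstruct(channels, response):
    """Least-squares spectral reconstruction: the measured channel values
    are modelled as c = R @ s, with R the per-channel spectral response
    from the characterisation; solve for the incident spectrum s."""
    s, *_ = np.linalg.lstsq(response, channels, rcond=None)
    return s

n_channels, n_bands = 25, 20
bands = np.arange(n_bands, dtype=float)
centers = np.linspace(0.0, n_bands - 1.0, n_channels)
# Hypothetical narrow channel responses with some spectral cross-talk
R = np.exp(-(bands[None, :] - centers[:, None]) ** 2 / 0.5)
rng = np.random.default_rng(0)
true = rng.random(n_bands)           # unknown incident spectrum
est = reconstruct(R @ true, R)       # recover it from the 25 channel values
print(float(np.abs(est - true).max()))
```

Irregularities in the per-macropixel responses, as found in the characterisation, would translate into a position-dependent R, which is why the spatial and angular characterisation is a prerequisite for accurate spectral images.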
A systemic analysis of the therapeutic robot Paro in comparison with the pet robot AIBO
(2020)
Nowadays, robots are found not only in industry but are increasingly used in private areas of life. One example is the social therapy robot Paro. It is modeled on the behavior and appearance of a young seal, expresses emotions and is used especially in nursing homes, where it shows positive effects on the well-being of people in need of care. This work presents the robot Paro in a systemic analysis, considering system context, use cases, requirements and structure. This is followed by an analysis of the pet robot AIBO, which resembles a puppy and primarily serves to entertain private individuals. Similarities and differences between the two systems are worked out. It becomes apparent that both systems primarily provide companionship to the user, but have different requirements and are used in different application domains. In addition, AIBO has more diverse capabilities and a higher degree of mobility than Paro, which is reflected in a more complex hardware structure.
Wie kann die Digitalisierung in der Bauzulieferbranche erfolgreich gemeistert werden? Die Fülle und Komplexität der Fragen dazu lassen sich auf zwei zentrale Kernfragen reduzieren: Was sind die richtigen Inhalte und wesentlichen Werttreiber der Digitalisierung? Und wie muss zukünftig mit der steigenden Informationsflut, der rasant wachsenden Komplexität und der abnehmenden Planbarkeit umgegangen werden?
This contribution presents a framework that helps construction suppliers derive their digital target vision and its value drivers systematically from customer value. The framework takes the particularities of the construction supply industry into account but can, with slight adaptations, also be applied to other industries. Building on the target vision, companies can define which technical, personnel, and organizational changes are required for its implementation. To deal flexibly with the dynamic changes in their ecosystem and with cultural challenges, five influencing factors are identified that companies must consider when developing the evolutionary competence this requires.
In the following, the term innovation enabling denotes a concept for the holistic support of interdisciplinary teams in creative and innovative problem solving. The concept supports moderators and participants alike, and a system realized with it stays in the background for the user thanks to implicit interaction. A central role is played by the concept of the awareness pipeline, presented in this article, for implementing implicit interaction on the basis of a sensor-actuator system. Support for accompanying moderation and administration tasks, such as automated documentation of the session, is intended to offer clear added value over a classic brainstorming session in the future.
The generous feed-in tariffs (FiTs) introduced in Germany, which resulted in major growth in decentralized solar photovoltaic (PV) systems, will phase out in the coming years, leaving many of the existing distributed generation assets stranded. This challenge creates an opportunity for community-focused energy utilities, such as Elektrizitätswerke Schönau eG (EWS) based in Schönau, Germany, to try a new approach that assists their customers in making the transition to a more sustainable future. This chapter describes how EWS is developing products and offering community-based solutions, including peer-to-peer trading using automated platforms. Such innovative offerings may lead to successful differentiation in a competitive and highly decentralized future.
Based on a survey among customers of seven German municipal utilities, we estimate two regression models to identify the most promising customer segments and their preferences and motivations for participating in peer-to-peer (P2P) electricity trading, and we develop implications for decision-makers in the energy sector and policy-makers for this currently relatively unknown product. Our results show a large general openness of private households towards P2P electricity trading, which is also the main predictor of respondents' intention to participate. It is mainly influenced by individuals' environmental attitude, technical interest, and independence aspiration. Respondents with the highest willingness to participate in P2P electricity trading are mainly motivated by the ability to share electricity, and to a lesser extent by economic reasons. They also have stronger preferences for innovative pricing schemes (service bundles, time-of-use tariffs). Differences between individuals can be observed depending on their current ownership (prosumers) or installation probability of a microgeneration unit (consumers, planners). Rather than current prosumers, it is planners willing to install microgeneration in the foreseeable future who are considered the most promising target group for P2P electricity trading. Finally, our results indicate that P2P electricity trading could be a promising niche option in the German energy transition.
Public transport maps are typically designed to support route-finding tasks for passengers while also providing an overview of stations, metro lines, and city-specific attractions. Most of these maps are static representations, perhaps placed in a metro station or printed in a travel guide. In this paper we describe a dynamic, interactive public transport map visualization enhanced by additional views of dynamic passenger data at different levels of temporal granularity. Moreover, we also provide extra statistical information in the form of density plots, calendar-based visualizations, and line graphs. All this information is linked to the contextual metro map to give a viewer insights into the relations between time points and typical routes taken by the passengers. We illustrate the usefulness of our interactive visualization by applying it to the railway system of Hamburg, Germany, taking into account the extra passenger data. As another indication of the usefulness of the interactively enhanced metro maps, we conducted a user experiment with 20 participants.
This book describes the current state of the art in integrated ring resonators, covering more than two decades in the development of this exciting device. It discusses in depth one of the most fascinating and versatile integrated optical filters, providing readers with a panoramic view spanning from design and simulation to implementation in various material systems. Written by authors with extensive experience in both academia and industry, this second edition offers a much-needed, major update as interest in integrated ring resonators undergoes a global revival. The new edition includes a comprehensive technological update, and a timely discussion of recent advances in new application areas, such as optofluidics and microfluidics, telecom operations and biosensors. This aptly named compendium is the ideal guide for researchers and engineers looking to review the field as a whole while exploring several of its possible and exciting future trajectories.
The objective of the project presented here is to develop an intelligent control algorithm for an energy system consisting of a biogas CHP (combined heat and power) unit, various storage technologies such as thermal energy storages (TES) and gas storages, and other renewable energy sources such as photovoltaics. A corresponding algorithm based on the Monte Carlo method has already been developed at Reutlingen University for CHP units running on natural gas and for heat pumps. The project presented here concentrates on the further development of this algorithm for application to biogas CHP units. In this context, an adequate implementation of the gas storage is of primary importance, as it mainly determines the flexibility of the plant. In the course of the validation of the new optimization algorithm, simulations were carried out based on data from the Lower Lindenhof, an agricultural experimental station of the University of Hohenheim. Both an optimization with regard to on-site electricity utilization and an optimization driven by residual load were investigated. Preliminary results show that the optimization algorithm can improve the operation of the biogas CHP unit depending on the selected target function.
Azide-bearing cell-derived extracellular matrices ("clickECMs") have emerged as a highly exciting new class of biomaterials. They conserve substantial characteristics of the natural extracellular matrix (ECM) while simultaneously offering small abiotic functional groups that enable bioorthogonal bioconjugation reactions. Despite their attractiveness, investigation of their biomolecular composition is very challenging due to the insoluble and highly complex nature of cell-derived matrices (CDMs). Yet, thorough qualitative and quantitative analysis of the overall material composition, organisation, localisation, and distribution of typical ECM-specific biomolecules is essential for consistent advancement of CDMs and the understanding of the prospective functions of the developed biomaterial. In this study, we evaluated frequently used methods for the analysis of complex CDMs. Sodium dodecyl sulphate polyacrylamide gel electrophoresis (SDS-PAGE) and (immuno)histochemical staining methods in combination with several microscopic techniques were found to be well suited. Commercially available colorimetric protein assays turned out to deliver inaccurate information on CDMs. In contrast, we determined the nitrogen content of CDMs by elemental analysis and converted it into total protein content using conversion factors calculated from matching amino acid compositions. The amount of insoluble collagens was assessed based on the hydroxyproline content. The Sircol™ assay was identified as a suitable method to quantify soluble collagens, while the Blyscan™ assay was found to be well suited for the quantification of sulphated glycosaminoglycans (sGAGs). Eventually, we propose a series of suitable methods to reliably characterise the biomolecular composition of fibroblast-derived clickECM.
By 2019, German-based Kärcher, "the world's leading provider of cleaning technology", had turned its professional cleaning devices into digital offerings. The data generated by these connected cleaning devices formed a key ingredient in the company's ongoing strategic shift in its B2B business: Kärcher was transforming from a seller of cleaning devices to a provider of consulting services in order to help professional cleaning companies improve their cleaning processes.
The case illustrates how the company learned to generate value from digital offerings. And it demonstrates how a family-owned company transformed its organization in order to be able to more effectively develop and provide digital offerings, while adding roles and developing technology platforms, as well as changing structures and ways of working.
Checklists are a valuable tool to ensure process quality and quality of care. To ensure proper integration in clinical processes, it would be desirable to generate checklists directly from formal process descriptions. Those checklists could also be used for user interaction in context-aware surgical assist systems. We built a tool to automatically convert Business Process Model and Notation (BPMN) process models to checklists displayed as HTML websites. Gateways representing decisions are mapped to checklist items that trigger dynamic content loading based on the placed checkmark. The usability of the resulting system was positively evaluated regarding comprehensibility and end-user friendliness.
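A minimal sketch of the conversion idea described above, assuming a simplified process model: tasks become plain checklist items, and an exclusive gateway becomes a decision item whose branch targets are encoded for dynamic content loading. The element names, attributes, and the `loadBranch` handler are invented for illustration and are not the authors' actual tool.

```python
# Map a minimal BPMN-like element sequence to an HTML checklist. An exclusive
# gateway (a decision) becomes a checklist item carrying data-* attributes that
# a client-side script could use to load branch-specific content dynamically.
def bpmn_to_checklist(elements):
    items = []
    for el in elements:
        if el["type"] == "task":
            items.append(f'<li><label><input type="checkbox"> {el["name"]}</label></li>')
        elif el["type"] == "exclusiveGateway":
            branches = " ".join(
                f'data-branch-{i}="{b}"' for i, b in enumerate(el["branches"])
            )
            items.append(
                f'<li class="decision" {branches}>'
                f'<label><input type="checkbox" onchange="loadBranch(this)"> '
                f'{el["question"]}</label></li>'
            )
    return "<ul>\n" + "\n".join(items) + "\n</ul>"

# Hypothetical two-element process: one task, one decision gateway.
model = [
    {"type": "task", "name": "Verify patient identity"},
    {"type": "exclusiveGateway", "question": "Allergy documented?",
     "branches": ["allergy-protocol", "standard-protocol"]},
]
html = bpmn_to_checklist(model)
```

A real converter would parse BPMN 2.0 XML and handle sequence flows, but the task/gateway-to-item mapping shown here is the core of the idea.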
Intraoperative brain deformation, so-called brain shift, affects the applicability of preoperative magnetic resonance imaging (MRI) data for assisting intraoperative ultrasound (iUS) guidance during neurosurgery. This paper proposes a deep learning-based approach for fast and accurate deformable registration of preoperative MRI to iUS images to correct brain shift. Based on a 3D convolutional neural network architecture, the proposed deep MRI-iUS registration method has been successfully tested and evaluated on the retrospective evaluation of cerebral tumors (RESECT) dataset. This study showed that our proposed method outperforms the registration methods of previous studies, with an average mean squared error (MSE) of 85. Moreover, this method can register three 3D MRI-US pairs in less than a second, improving the expected outcomes of brain surgery.
In networked operating room environments, there is an emerging trend towards standardized non-proprietary communication protocols which make it possible to build new integration solutions and flexible human-machine interaction concepts. The most prominent endeavor is the IEEE 11073 SDC protocol. For some use cases, it would be helpful if not just medical devices could be controlled based on SDC, but also building automation systems such as lights, shutters, air conditioning, etc. For those systems, the KNX protocol is widely used. We built an SDC-to-KNX gateway which allows the SDC protocol to be used for sending commands to connected KNX devices. The first prototype system was successfully implemented in the demonstration operating room at Reutlingen University. This is a first step toward the integration of a broader variety of KNX devices.
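The gateway principle can be pictured as a mapping table from SDC device handles to KNX group addresses: an incoming SDC set-value request is translated into a write on the mapped group address. All identifiers below are illustrative assumptions, and the real SDC and KNX protocol stacks are replaced by a stub transport.

```python
# Hedged sketch of an SDC-to-KNX gateway: not the actual prototype, just the
# translation idea with invented handles and group addresses.
class SdcToKnxGateway:
    def __init__(self, transport):
        self.transport = transport
        # SDC metric handle -> KNX group address (example mapping)
        self.mapping = {
            "or.light.main": "1/0/1",
            "or.shutter.window": "1/0/2",
        }

    def on_sdc_set(self, handle, value):
        """Translate an SDC set-value operation into a KNX group write."""
        address = self.mapping.get(handle)
        if address is None:
            raise KeyError(f"no KNX mapping for SDC handle {handle!r}")
        self.transport.group_write(address, value)
        return address

class StubKnxTransport:
    """Stand-in for a real KNX bus connection; records what would be sent."""
    def __init__(self):
        self.sent = []
    def group_write(self, address, value):
        self.sent.append((address, value))

bus = StubKnxTransport()
gw = SdcToKnxGateway(bus)
gw.on_sdc_set("or.light.main", 1)   # e.g. switch on the OR main light
```

In a real deployment the stub would be replaced by a KNX/IP tunneling connection, and the mapping would be configured per installation.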
We investigate the toxicity of different types and sizes of microplastic particles (0.3–4 mm) under different conditions (new particles, aged particles with biofilm, and particles with adsorbed tributyltin) on the freshwater amphipod Gammarus fossarum in 3-week exposures. All types of plastic particles were taken up randomly and only to a small extent; the particles taken up were mostly polyphenylene oxide, polybutylene terephthalate and polypropylene, with sizes < 1 mm. Plastic particles did not affect the feeding and locomotory behaviour of gammarids, and there was no strong difference between pristine plastic particles and aged particles with biofilm. Mortality tended to be higher compared with the control. Tributyltin hydride (TBTH) adsorbed to microplastic particles had no effect on uptake, survival, feeding and locomotory behaviour during the 3 weeks of exposure. Dissolved TBTH, however, was already highly toxic after a few days of exposure (LC50-96h < 1 ng l–1).
Instagram fashion videos
(2020)
Instagram is one of the most used social media platforms for sharing photos and videos. It therefore offers companies a helpful opportunity to use the platform as a marketing tool and spread information to a wide range of potential customers. Ever since its launch, Instagram has been strongly connected to fashion, which makes the platform particularly interesting for fashion brands. According to the screened literature, most brands use Instagram for marketing purposes, and the use of videos plays a decisive role. This raises the question of how brands use videos on Instagram for marketing purposes. This chapter therefore aims to investigate the extent to which brands make use of videos on Instagram, what the goals of the videos are, and which videos are the most effective in terms of user engagement. More specifically, this chapter includes an empirical study that examines the Instagram profiles of nine selected brands from the categories lifestyle, luxury and fashion, and sportswear with regard to the underlying research question. A subsequent evaluation and discussion of the results depicts differences and similarities within and between the categories. All in all, the results of the study show that fashion brands use films as a marketing tool on Instagram. The content and types of films thereby heavily depend on the brand category.
This chapter looks at image films produced by fashion brands and the way the brands present themselves in them. It focuses on analyzing important film parameters, the content, and the way it can influence brand image. A list of 70 fashion brands from different categories was gathered through a survey and confirmed by comparing the results with relevant literature. All 70 brands were screened for relevant self-referencing films. The films had to be produced by the brands themselves; videos made for advertising or for promoting collections were excluded. In total, 22 films from 17 brands were analyzed. The results show that most brands seem to have recognized videos as a powerful marketing tool in the social media age. Many brands, however, seem to struggle with compliance with certain parameters such as length and the use of the brand logo. In general, the content of the videos centers on the four topics recruitment, value, history and behind the brand. As for intent, the videos can be classified into the three categories learning, emotion and doing something. This chapter not only analyzes this special film category but also gives recommendations for improving the videos.
Hip-hop culture defines itself through four central pillars: DJing, MCing, breakdancing and graffiti; a fifth, fashion, may be emerging. Hip-hop has become the most popular music genre, and its influence on society is undeniable. As hip-hop artists increasingly underpin their music with visual components like music videos, the question arises whether this influences the fashion industry. This chapter clarifies which factors may determine an impact on the fashion business and discusses differences between mainstream hip-hop artists and those who are also active in the fashion industry. The focus lies on how, and to what extent, fashion is presented in the music videos. 24 music videos were analyzed: 15 popular records from the past three years and nine by artists already considered influential in fashion. Additionally, a fashion influence index was created to compare the degree of fashion between the music videos. The numbers of styles, recognized brands, fashion-related song verses, fashion-related description box mentions, and articles about the fashion in the music video were noted. The findings reveal that the number of outfits shown in a video had no direct link to the amount of traffic it produced in fashion media. Artists considered influential in the fashion industry name brands in their song lyrics more often and show brand logos more frequently in their music videos than others. Over the observed years, however, a rise in fashion awareness can be seen for mainstream hip-hop artists, reflected in a higher number of styles, recognizable brands and fashion-related verses in the lyrics.
Fashion show films
(2020)
Due to technological developments, fashion show films offer fashion brands the opportunity to communicate their brand concepts, attract attention and gain brand awareness by publishing the films on the Internet. The purpose of this research paper is to investigate how fashion brands communicate their brand concept and personality through fashion show films. For this purpose, ten fashion show films of brands from the categories luxury, premium, high-street and active wear are investigated. The results indicate that the investigated brands use different ways to attract attention and to communicate their brand concept and personality. The design of the setting, the presentation of the collection, the visualization of the brand concept through the brand name, logo, colors or symbols, and the camera work all play an important role in creating an effective and exciting fashion show film that communicates the brand concept and promotes the brand image. Mainly luxury and premium brands use fashion show films for branding; for high-street and active wear brands, the analysis indicates that fashion show films are less important. The main limitation of this research is that only ten fashion show films are analyzed, which gives an overview but cannot provide a comprehensive treatment of the topic.
An event film is a successful marketing and communication instrument that companies can use on social media. By reaching the target group and potential customers, companies can benefit from increased brand awareness. Strikingly, there is a lack of information about how event films are used with regard to showing fashion. To establish the subject further, the purpose of this paper is to enrich the existing findings and analyze the influence event films have. In an empirical study, the performance of two events and the two related fast fashion retailers H&M and Zara on Instagram and YouTube is analyzed with regard to event films and fashion-connected films. Stylistic elements of event fashion identified in the films are then traced in the retailers' online shops. Since event films are particularly good at conveying emotions, there is an indication that they contribute to the shaping of fashion trends.
The purpose of this paper is to investigate how motion pictures are currently used for the product presentation of fashion articles. An explorative approach was chosen for the literature section. This study shows that moving images can be used for the presentation of fashion articles in online shops in numerous different ways. To use product presentation videos meaningfully, one should consider exactly what their purpose is, since different goals require different means. Retailers should also obtain enough information in advance to assess whether they can afford the production and post-processing of these videos.
The purpose of this paper is to investigate how motion pictures are currently used for the product presentation of fashion articles in online shops in the German, American and British markets. This study shows that the use of moving images for the presentation of fashion articles in online shops is underutilized. With the amount of data that was manageable within the scope of this chapter, no valid generalizations can be made; all described results must be understood as indications. To use product presentation videos meaningfully, one should consider in advance exactly what their purpose is, since different goals require different means. Retailers should also obtain enough information in advance to assess whether they can afford the production and post-processing of these videos.
Today, digitalization is firmly anchored in society and business, and it is recognized to have a significant impact on the retailing sector. The in-store display of moving images has so far, however, gained little attention from researchers. The aim of this research is to provide a first estimate of the current distribution of moving images in stationary retail stores. A store check was the basis for analysis and evaluation: in sum, 152 stores were analyzed in Stuttgart, Germany, of which 62 showed 177 moving images. Detailed analyses of the content, mood, color and actors of the motion pictures showed that all aspects are very well harmonized with the target group of the store. The chapter provides a basic estimate of the in-store diffusion of moving images and thereby opens avenues for further research.
This chapter provides insights into the future of fashion film with respect to augmented reality and virtual reality technologies. It therefore considers the question: how do augmented reality and virtual reality influence the future of fashion film? It is important to analyze the influence of these technologies on fashion films in order to assess their potential for fashion retailers and, in the best case, gain first-mover advantages. To answer the stated research question, a literature review was conducted to gain insights about the topic and its influence on fashion filming. Augmented reality and virtual reality are explained, implications for fashion films in the retail sector are discussed, and examples of companies already using this approach are compiled. Furthermore, an empirical part was conducted using an online survey. The questionnaire is based on what was revealed in the literature, in order to gain in-depth insights and validation. The data indicate that augmented reality and virtual reality influence the future of fashion film in various ways. The findings highlight how important these technologies can be for enhancing customer experience and engagement. Regarding the research question, the conclusion can be drawn that it is highly important for fashion managers to take future developments like augmented reality and virtual reality into account in order to stay competitive and satisfy the requirements of modern consumers.
This book explores various aspects of the use of moving images by fashion retail and fashion apparel companies, in-store and online. The use of moving images is growing in number and in relevance for consumers. Films can be used by fashion businesses in various forms, in traditional media like cinema or TV and in modern forms like social media or moving images in high street stores.
The book provides a data-oriented analysis of the state of the art with selected future outlooks. Additional areas in which fashion appears in moving images, such as ‘fashion company identity films’ or ‘fashion and music videos’, are also covered in order to provide a more complete analysis from a consumer-influenced perspective.
The purpose of this paper is to give an overview of the links between fashion businesses and film from a fashion business perspective. It focuses on the idea that digitalization has greatly increased the use of film in the fashion industry and that this development has only just begun. This change also has a profound impact on the fashion industry, as fashion companies nowadays are content producers with films, too. The resulting closer connection with viewers via social media exposes fashion companies but, on the other hand, gives new influence potential to the fashion system. In-depth future research on the fashion and film system is therefore required to develop answers to the current situation. This article should be read as a personal viewpoint of the author on this topic rather than as a research paper based on the usual methodological criteria.
The connection between fashion and film seems symbiotic at first sight, and the two influence each other. Yet there are differences, including a different understanding of clothing by costume designers and fashion businesses. This article focuses on two successful movies, "The Hunger Games" and "The Great Gatsby", in order to explore the role of film in fashion and vice versa. The findings suggest that there are various collections in the fashion world based on both movies; movies therefore indeed influence the development of seasonal fashion. However, this connection is not natural but artificially created by both industries. Through today's organized cooperation, the lines between costume designers and fashion designers become blurred. Furthermore, fashion today does not trickle down to an audience naturally but is promoted through the film and its broad reach.
This chapter discusses German television as a platform for fashion content and, in that context, streaming services as possible alternatives. Three German television channels were monitored over a period of one month, and the two most popular streaming services in Germany and the online media library of one German television channel over six months, with regard to length, fashion connection, transmission time and success. Additionally, fashion advertisement was analyzed for three channels. Broadcasting the most fashion-related contributions in one month, VOX was the most fashionable channel. Contributions about fashion on television aim mainly to entertain; informative contributions form a minority. Streaming services offer more flexibility, which users are asking for. All three television stations show fashion brand spots during prime time. ProSieben and sixx in particular cooperate closely with several fashion brands. Fashion advertising therefore seems to be preferentially placed in fashion-related series.
Based on new ways of watching series via streaming platforms and a change in buying behavior, advertising needs to focus on new strategies. Branded entertainment gives brands the opportunity to integrate their product placements more deeply into television show plots, which, from a managerial perspective, increases advertising effectiveness. The series ‘Sex and the City’ exemplifies successful branded entertainment and shows how series influence fashion nowadays. The placements are outstanding when it comes to storytelling around the brand or product, setting trends, and creating a character connection plus a desire through identification. This chapter shows success factors and opportunities of placements for the fashion industry.
YouTube fashion videos
(2020)
YouTube is the most widely adopted and successful video sharing platform. It works as a marketing instrument and money-making tool for companies while reaching the target group. A review of the relevant literature on YouTube reveals a striking lack of information about YouTube's benefits as a video marketing instrument for fashion brands. To establish this subject further, the purpose of this study is to enrich the existing findings on social video marketing on YouTube in the apparel industry. The findings indicate the importance of YouTube as a social network for fashion marketers. The second part is an empirical study that discusses the YouTube channel performance of nine fashion brands: three brands each from the lifestyle, sports and luxury sectors are analyzed comparatively. Accordingly, the differences and similarities within and between the sectors are analyzed and evaluated.
Impregnated paper-based decorative laminates prepared from lignin-substituted phenolic resins
(2020)
High Pressure Laminate (HPL) panels consist of stacks of self-gluing paper sheets soaked with phenol-formaldehyde (PF) resins. An important requirement for such PFs is that they must rapidly penetrate and saturate the paper pores. Partially substituting phenol with bio-based phenolic chemicals like lignin changes the physico-chemical properties of the resin and affects its ability to penetrate the paper. In this study, PF formulations containing different proportions of lignosulfonate and kraft lignin were used to prepare paper-based laminates. The penetration of a kraft paper sheet was characterized by a recently introduced device measuring the conductivity between both sides of the paper sheet after a drop of resin was placed on the surface and allowed to penetrate the sheet. The main target value measured was the time required for a specific resin to completely penetrate the defined paper sample ("penetration time"). This penetration time generally depends on the molecular weight distribution, the flow behavior and the polarity of the resin, which in turn depend on the manufacturing conditions of the resin. In the present study, the influences of three process factors, (1) the type of lignin material used for substitution, (2) lignin modification by phenolation and (3) the degree of phenol substitution, on the penetration times of various lignin-phenolic hybrid impregnation resins were studied using a complete two-level, three-factorial experimental design. Thin laminates made with the resins diluted in methanol were mechanically tested in terms of tensile and flexural strains, and their cross-sections were studied by light microscopy.
In recent years, the cloud has become an attractive execution environment for parallel applications, which introduces novel opportunities for versatile optimizations. Particularly promising in this context is the elasticity characteristic of cloud environments. While elasticity is well established for client-server applications, it is a fundamentally new concept for parallel applications, and existing elasticity mechanisms for client-server applications can be applied to parallel applications only to a limited extent. Efficient exploitation of elasticity for parallel applications requires novel mechanisms that take into account the particular runtime characteristics and resource requirements of this application type. To tackle this issue, we propose an elasticity description language. This language enables users to define elasticity policies, which specify the elasticity behavior at both the cloud infrastructure level and the application level. Elasticity at the application level is supported by an adequate programming and execution model, as well as abstractions that comply with the dynamic availability of resources. We present the underlying concepts and mechanisms, as well as the architecture and a prototypical implementation. Furthermore, we illustrate the capabilities of our approach through real-world scenarios.
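As a rough illustration of what such an elasticity policy might express, a policy can pair metric conditions with scaling actions that a runtime evaluates against current monitoring data. The rule structure, metric names, and actions below are invented for this sketch and are not the paper's actual description language.

```python
# Illustrative elasticity policy: each rule maps a metric condition to a
# scaling action at infrastructure level (all names are assumptions).
POLICY = [
    {"metric": "task_queue_length", "op": ">", "threshold": 100, "action": ("scale_out", 2)},
    {"metric": "task_queue_length", "op": "<", "threshold": 10,  "action": ("scale_in", 1)},
]

OPS = {">": lambda a, b: a > b, "<": lambda a, b: a < b}

def evaluate_policy(policy, metrics):
    """Return the scaling actions triggered by the current monitoring metrics."""
    return [rule["action"]
            for rule in policy
            if OPS[rule["op"]](metrics[rule["metric"]], rule["threshold"])]

# With a long task queue, the policy requests two additional workers.
actions = evaluate_policy(POLICY, {"task_queue_length": 250})
```

Application-level elasticity would additionally require the runtime to notify the application so it can redistribute work to the changed set of resources, which is what the programming model abstractions mentioned above are for.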
Cloud resources can be dynamically provisioned according to application-specific requirements and are paid on a per-use basis. This gives rise to a new concept for parallel processing: elastic parallel computations. However, it is still an open research question to what extent parallel applications can benefit from elastic scaling, which requires resource adaptation at runtime and corresponding coordination mechanisms. In this work, we analyze how to address these system-level challenges in the context of developing and operating elastic parallel tree search applications. Based on our findings, we discuss the design and implementation of TASKWORK, a cloud-aware runtime system specifically designed for elastic parallel tree search, which enables the implementation of elastic applications by means of higher-level development frameworks. We show how to implement an elastic parallel branch-and-bound application based on an exemplary development framework and report on our experimental evaluation, which also considers several benchmarks for parallel tree search.
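The kind of computation such a runtime targets can be illustrated with a minimal branch-and-bound for the 0/1 knapsack problem, organized as a pool of independent subproblem tasks. In an elastic runtime these tasks would be distributed among workers that join and leave at runtime; the hedged sketch below simply processes them sequentially and is not TASKWORK's actual API.

```python
from collections import deque

def knapsack_bnb(values, weights, capacity):
    """Branch-and-bound for 0/1 knapsack over a pool of subproblem tasks."""
    n = len(values)
    best = 0
    tasks = deque([(0, 0, 0)])  # (next item index, value so far, weight so far)
    while tasks:
        i, val, wt = tasks.pop()
        if val > best:
            best = val
        if i == n:
            continue
        # Optimistic bound: even taking all remaining items cannot beat best.
        if val + sum(values[i:]) <= best:
            continue
        if wt + weights[i] <= capacity:            # branch: take item i
            tasks.append((i + 1, val + values[i], wt + weights[i]))
        tasks.append((i + 1, val, wt))             # branch: skip item i
    return best

best = knapsack_bnb([60, 100, 120], [10, 20, 30], 50)
```

Two coordination problems an elastic runtime must solve are visible even in this toy: distributing the task pool among a changing worker set, and propagating the shared incumbent `best` so that pruning stays effective across workers.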
Errors, manipulation, and rationality: how reporting influences the behavior of decision-makers
(2020)
The purpose of management reporting is to satisfy the information needs of executives. However, both the producers and the users of reports act only with bounded rationality. Reports therefore do not have a precisely targeted effect but instead trigger a variety of unwanted reactions among those involved. This article explains how "the human factor" affects the creation and use of management reports and how effective and efficient management reporting can minimize undesirable effects.
The approach of self-organized and autonomously controlled systems offers great potential to meet new requirements for the economical production of customized products with small batch sizes, based on a distributed, flexible management of dynamics and complexity within the production and intralogistics system. To support the practical application of self-organization in intralogistics systems, a catalogue of criteria for evaluating the self-organization of flexible logistics systems has been developed and validated. It enables the classification of logistics systems as well as the identification and evaluation of the potentials that can be achieved by increasing the degree of self-organization.
The planning and control of intralogistics systems in line with the versatile production systems of smart factories requires new approaches and methods to cope with changing requirements within future factories. The planning of intralogistics can no longer follow a static, sequential approach as in the past, since planning assumptions are going to change at high frequency. Reasons for these constant changes are, among others, external turbulences, such as rapidly changing market conditions and decreasing batch sizes down to customer-specific products with a batch size of one, as well as internal turbulences (such as production and logistics resource breakdowns) affecting the production system. This paper gives an insight into research approaches and results on how the capabilities of intelligent logistical objects (intelligent bins, autonomous transport systems, etc.) can be used to achieve a self-organized, cost- and performance-optimized intralogistics system with autonomously controlled process execution within versatile production environments. A first consistent method has been developed, validated, and implemented within a scenario at the pilot factory Werk150 at the ESB Business School (Reutlingen University). Based on the incoming production orders, the method of the Extended Profitability Appraisal (EPA), which covers the work system value, is applied to define the most effective work system for order fulfilment. To derive the appropriate intralogistics processes, an autonomous control method involving principles of decentralized and target-oriented decision-making (e.g. intelligent bins interacting with autonomously controlled transport systems to fulfil material orders of assembly workstations) has been developed and applied to achieve a target-optimized process execution.
The results of the first-stage research using predefined material sources and sinks described in this paper lay the foundation for the further development of a self-organized and autonomously controlled method for intralogistics systems that considers dynamic source and sink relations. By allowing dynamic shifts of production orders in the sense of dynamic source and sink relations, the cost and performance aims of the intralogistics system can be directly aligned with those of the entire versatile production system in the sense of self-organized and autonomously controlled systems.
The shift of populations to cities is creating challenges in many respects, thus leading to increasing demand for smart solutions of urbanization problems. Smart city applications range from technical and social to economic and ecological. The main focus of this work is to provide a systematic literature review of smart city research to answer two main questions: (1) How is current research on smart cities structured? And (2) What directions are relevant for future research on smart cities? To answer these research questions, a text-mining approach is applied to a large number of publications. This provides an overview and gives insights into relevant dimensions of smart city research. Although the main dimensions of research are already described in the literature, an evaluation of the relevance of such dimensions is missing. Findings suggest that the dimensions of environment and governance are popular, while the dimension of economy has received only limited attention.
IT governance: current state of and future perspectives on the concept of agility in IT governance
(2020)
Digital transformation has changed corporate reality and, with that, corporates’ IT environments and IT governance (ITG). As such, the perspective of ITG has shifted from the design of a relatively stable, closed and controllable system of a self-sufficient enterprise to a relatively fluid, open, agile and transformational system of networked co-adaptive entities. Related to the paradigm shift in ITG, this thesis aims to conceptualize a framework to integrate the concept of agility into the traditional ITG framework and to test the effects of such an extended ITG framework on corporate performance.
To do so, the thesis uses literature research and a mixed method design by blending both qualitative and quantitative research methods. Given the poorly understood situation of the agile mechanisms within the ITG framework, the building process of this thesis’ research model requires an adaptive and flexible approach which involves four different research phases. The initial a priori research model based on a comprehensive review of the extant literature is critically examined and refined at the end of each research phase, which later forms the basis of a subsequent research phase. As a result, the final research model provides guidance on how the conceptualized framework leads to better business/IT alignment as well as how business/IT alignment can mediate the effectiveness of such an extended ITG framework on corporate performance.
The first research phase explores the current state of literature with a focus on the ITG-corporate performance association. This analysis identifies five perspectives with respect to the relationship between ITG and corporate performance. The main variables lead to the perspectives of business/IT alignment, IT leadership, IT capability and process performance, resource relatedness and culture. Furthermore, the analysis presents core aspects explored within the identified perspectives that could act as potential mediators or moderators in the relationship between ITG and corporate performance.
The second research phase investigates the agile aspect of an effective ITG framework in the dynamic contemporary world through a qualitative study. Drawing on 46 semi-structured interviews with governance experts across various industries, the study identifies 25 agile ITG mechanisms and 22 traditional ITG mechanisms that corporations use to master digital transformation projects. Moreover, the research reveals two key patterns pointing to a need for ambidextrous ITG, with corporations alternating between stability and agility in their ITG mechanisms.
In research phase three, a scale development process is conducted in order to operationalize the agile items explored in research phase two. Through 56 qualitative interviews with professionals, the evaluation uncovers 46 agile governance mechanisms. These mechanisms are then rated by 29 experts to identify the most effective ones, leading to the identification of six structural elements, eight processes, and eight relational mechanisms.
Finally, in research phase four a quantitative research approach through a survey of 400 respondents is established to test and predict the formulated relationships by using the partial least squares structural equation modelling (PLS-SEM) method. The results provide evidence for a strong causal relationship among an expanded ITG concept, business/IT alignment, and corporate performance. These findings reveal that the agile ITG mechanisms within an effective ITG framework seem critical in today’s digital age.
This research is unique in exploring the combination of traditional and agile ITG mechanisms. It contributes to the theoretical base by integrating and extending the literature on ITG, business/IT alignment, ambidexterity and agility, all of which have long been recognized as critical for achieving organizational goals. In summary, this work presents an original analysis of an effective ITG framework for digital transformation by including the agile aspect within the ITG construct. It highlights that it is not enough to apply only traditional mechanisms to achieve effective business/IT alignment in today’s digital age; agile ITG mechanisms are also needed. Therefore, a novel ITG framework following an ambidextrous approach is provided, consisting of traditional ITG mechanisms as well as newly developed agile ITG practices. This thesis also demonstrates that agile ITG mechanisms can be measured independently of traditional ITG mechanisms within one causal model. This is an important theoretical outcome that allows the current state of ITG to be assessed in two distinct dimensions, offering various pathways for further research on the different antecedents and effects of traditional and agile ITG mechanisms. Furthermore, this thesis makes practical contributions by highlighting the need to develop a basic governance framework powered by traditional ITG mechanisms and simultaneously increase agility in ITG mechanisms. The results imply that corporations might be even more successful if they include both traditional and agile mechanisms in their ITG framework. In this way, the uncovered agile ITG practices may provide a template for CIOs to derive their own mechanisms in following an ambidextrous approach that is suitable for their corporation.
Product individualization, digitalization, and the automation of production require constant adaptation of production and intralogistics processes. Reference models support production and factory planners with standards, tools, and much more. A market survey of reference models reveals considerable gaps in content and methodology. A recommendation for action derived from this survey for the construction of an intralogistics reference model is presented.
Factory planning processes are increasingly carried out by spatially and temporally distributed teams practicing agile project management. A prerequisite for success is the use of Digital Factory planning systems as well as modern groupware for communication, coordination, and cooperation in the agile project groups of each planning phase. A concept with implementation guidance for a future-proof factory planning process with digital systems is presented.
The pharmaceutical packaging industry is shaped by extensive regulation and is therefore somewhat constrained in its pace of innovation. A six-month project on developing future scenarios for pharmaceutical packaging showed that although new technologies, such as e-labels or child-resistant closures, have reached market maturity or will reach it shortly, new requirements will call for further development in the foreseeable future. The pharmaceutical packaging industry must engage more closely and intensively with its customers and technology suppliers in order to launch the next packaging generation, Smart Packaging 2.0.
In the absence of end-to-end data standards for Digital Factory planning systems, system-specific data exchange solutions must be implemented. To support planning, an integrated factory planning process with built-in route planning is required in terms of both process and systems. To this end, a factory planning system and a route planning system are examined for compatibility as an example, the necessary requirements are derived, and a data exchange option for the user is demonstrated.
For educational excellence, the training of future academics must bridge the gap between science and practice. Simple internships and other classical approaches are not sufficient for educational excellence modeled on the requirements of operational excellence. This article presents the cooperative model of educational excellence practiced for many years in the industrial engineering education for production and logistics at the ESB Business School of Reutlingen University, within a project-based master's program involving many industry partners.
Hardly any software development process is used as prescribed by authors or standards. Regardless of company size or industry sector, a majority of project teams and companies use hybrid development methods (short: hybrid methods) that combine different development methods and practices. Even though such hybrid methods are highly individualized, a common understanding of how to systematically construct synergetic practices is missing. In this article, we make a first step towards a statistical construction procedure for hybrid methods. Grounded in 1467 data points from a large‐scale practitioner survey, we study the question: What are hybrid methods made of and how can they be systematically constructed? Our findings show that only eight methods and few practices build the core of modern software development. Using an 85% agreement level in the participants' selections, we provide examples illustrating how hybrid methods can be characterized by the practices they are made of. Furthermore, using this characterization, we develop an initial construction procedure, which allows for defining a method frame and enriching it incrementally to devise a hybrid method using ranked sets of practices.
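The construction idea summarized above, defining a method frame from practices with high selection agreement and enriching it from a ranked list, can be illustrated with a small sketch. The practice names and agreement values below are hypothetical placeholders, not the survey's actual data, and the function is only an illustration of the thresholding step, not the paper's full procedure.

```python
def build_method_frame(practice_agreement, threshold=0.85):
    """Rank practices by agreement level and keep those at or above the threshold.

    practice_agreement: dict mapping practice name -> share of participants
    (0.0-1.0) who selected that practice in their hybrid method.
    """
    ranked = sorted(practice_agreement.items(), key=lambda kv: kv[1], reverse=True)
    return [name for name, agreement in ranked if agreement >= threshold]

# Hypothetical agreement data for illustration only.
agreement = {
    "Code Review": 0.92,
    "Daily Standup": 0.88,
    "Pair Programming": 0.61,
    "Iteration Planning": 0.86,
}
frame = build_method_frame(agreement)
# frame == ['Code Review', 'Daily Standup', 'Iteration Planning']
```

Lowering the threshold would incrementally enrich the frame with the next practices in the ranked list, which mirrors the incremental enrichment step described in the article.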
Selecting a suitable development method for a specific project context is one of the most challenging activities in process design. Every project is unique and, thus, many context factors have to be considered. Recent research took some initial steps towards statistically constructing hybrid development methods, yet, paid little attention to the peculiarities of context factors influencing method and practice selection. In this paper, we utilize exploratory factor analysis and logistic regression analysis to learn such context factors and to identify methods that are correlated with these factors. Our analysis is based on 829 data points from the HELENA dataset. We provide five base clusters of methods consisting of up to 10 methods that lay the foundation for devising hybrid development methods. The analysis of the five clusters using trained models reveals only a few context factors, e.g., project/product size and target application domain, that seem to significantly influence the selection of methods. An extended descriptive analysis of these practices in the context of the identified method clusters also suggests a consolidation of the relevant practice sets used in specific project contexts.
This book discusses important topics for engineering and managing software startups, such as how technical and business aspects are related, which complications may arise and how they can be dealt with. It also addresses the use of scientific, engineering, and managerial approaches to successfully develop software products in startup companies.
The book covers a wide range of software startup phenomena, and includes the knowledge, skills, and capabilities required for startup product development; team capacity and team roles; technical debt; minimal viable products; startup metrics; common pitfalls and patterns observed; as well as lessons learned from startups in Finland, Norway, Brazil, Russia and USA. All results are based on empirical findings, and the claims are backed by evidence and concrete observations, measurements and experiments from qualitative and quantitative research, as is common in empirical software engineering.
The book helps entrepreneurs and practitioners to become aware of various phenomena, challenges, and practices that occur in real-world startups, and provides insights based on sound research methodologies presented in a simple and easy-to-read manner. It also allows students in business and engineering programs to learn about the important engineering concepts and technical building blocks of a software startup. It is also suitable for researchers at different levels in areas such as software and systems engineering, or information systems who are studying advanced topics related to software business.
Universities are part of the innovation ecosystem: in a cooperative exchange relationship, they promote the regional economy and societal development. Promoting innovation, creativity, and entrepreneurial thinking is therefore an important task. As early as 2005, the European Commission defined entrepreneurial thinking and acting as a key competence for the 21st century: "Entrepreneurial competence is the ability to turn ideas into action" (European Commission, 2005, p. 21). Entrepreneurship education is booming, and the promotion of entrepreneurial competencies at universities is being driven forward; promoting a start-up culture is thus not only part of business education but should rather be understood as a cross-cutting task. The entrepreneurial mission is changing the teaching and learning culture at universities. On the one hand, the goal is to anchor entrepreneurship broadly across universities: entrepreneurial thinking and acting is a core competence. On the other hand, start-up education at universities actively promotes entrepreneurial talent and spin-offs.
The project "Spinnovation" is a joint project of Hochschule Reutlingen, Hochschule Aalen, and Hochschule der Medien, funded by the Ministry of Science, Research and the Arts of Baden-Württemberg under the call "Gründungskultur in Studium und Lehre" (start-up culture in study and teaching). Since 2016, numerous new offerings for students have been developed at the participating universities to integrate entrepreneurship education into the curriculum and to bring about a shift in mindset towards entrepreneurship and innovation. Based on the experience and results from the Spinnovation joint project, concrete recommendations for entrepreneurship education at universities can be derived.
To remain relevant and mitigate disruption, traditional companies have to engage in multiple fast-paced experiments in digital offerings—revenue-generating solutions to what customers want and are willing to pay for, inspired by what is possible with digital technologies. After launching several digital offering initiatives, reinsurance giant Munich Re noticed that many experienced similar challenges. This case describes how Munich Re addressed these common challenges by building a foundation to help its digital offerings succeed. The foundation provided prioritized and staged funding; dedicated, hands-on expertise; and a digital platform of shared services. By 2020, this foundation was helping to support over seventy initiatives, including several that were in the market generating new sources of revenue for the company by enabling its clients—insurance companies—to better service their own customers.
Product roadmaps are an important tool in product development. They provide direction, enable consistent development in relation to a product vision and support communication with relevant stakeholders. There are many different formats for product roadmaps, but they are often based on the assumption that the future is highly predictable. However, especially software-intensive businesses are faced with increasing market dynamics, rapidly evolving technologies and changing user expectations. As a result, many organizations are wondering what roadmap format is appropriate for them and what components it should have to deal with an unpredictable future. Objectives: To gain a better understanding of the formats of product roadmaps and their components, this paper aims to identify suitable formats for the development and handling of product roadmaps in dynamic and uncertain markets. Method: We performed a grey literature review (GLR) according to the guidelines by Garousi et al. Results: A Google search identified 426 articles, 25 of which were included in this study. First, various components of the roadmap were identified, especially the product vision, themes, goals, outcomes and outputs. In addition, various product roadmap formats were discovered, such as feature-based, goal-oriented, outcome-driven and theme-based roadmaps. The roadmap components were then assigned to the various product roadmap formats. This overview aims at providing initial decision support for companies to select a suitable product roadmap format and adapt it to their own needs.
In recent years, companies have faced challenges posed by high market dynamics, rapidly evolving technologies and shifting user expectations. Together with the adoption of lean and agile practices, it is increasingly difficult to predict upfront which products, features or services will satisfy the needs of the customers and the organization. Currently, many new products fail to produce a significant financial return. One reason is that companies are not doing enough product discovery activities. Product discovery aims at tackling the various risks before the implementation of a product starts. The academic literature provides only little guidance for conducting product discovery in practice. Objective: In order to gain a better understanding of product discovery activities in practice, this paper aims at identifying motivations, approaches, challenges, risks, and pitfalls of product discovery reported in the grey literature. Method: We performed a grey literature review (GLR) according to the guidelines by Garousi et al. Results: The study shows that the main motivation for conducting product discovery activities is to reduce the uncertainty to a level that makes it possible to start building a solution that provides value for the customers and the business. Several product discovery approaches are reported in the grey literature which include different phases such as alignment, problem exploration, ideation, and validation. Main challenges are, among others, the lack of clarity of the problem to be solved, the prescription of concrete solutions through management or experts, and the lack of cross-functional collaboration.
A fast way to test business ideas and to explore customer problems and needs is to talk to them. Customer interviews help to understand what solutions customers will pay for before investing valuable resources to develop solutions. Customer interviews are a good way to gain qualitative insights. However, conducting interviews can be a difficult procedure and requires specific skills. The current ways of teaching interview skills have significant deficiencies. They especially lack guidance and opportunities to practice. Objective: The goal of this work is to develop and validate a workshop format to teach interview skills for conducting good customer interviews in a practical manner. Method: The research method is based on design science research which serves as a framework. A game-based workshop format was designed to teach interview skills. The approach consists of a half-day, hands-on workshop and is based on an analysis of necessary interview skills. The approach has been validated in several workshops and improved based on learnings from those workshops. Results: Results of the validation show that participants could significantly improve their interview skills while enjoying the game-based exercises. The game-based learning approach supports learning and practicing customer interview skills with playful and interactive elements that encourage greater motivation among participants to conduct interviews.
Regardless of company size or industry sector, a majority of project teams and companies use customized processes that combine different development methods, so-called hybrid development methods. Even though such hybrid development methods are highly individualized, a common understanding of how to systematically construct synergetic practices is missing. Based on 1,467 data points from a large-scale online survey among practitioners, we study the current state of practice in process use to answer the question: What are hybrid development methods made of? Our findings reveal that only eight methods and few practices build the core of modern software development. This small set allows for statistically constructing hybrid development methods.
Context: A product roadmap is an important tool in product development. It sets the strategic direction in which the product is to be developed to achieve the company’s vision. However, for product roadmaps to be successful, it is essential that all stakeholders agree with the company’s vision and objectives and are aligned and committed to a common product plan.
Objective: In order to gain a better understanding of product roadmap alignment, this paper aims at identifying measures, activities and techniques in order to align the different stakeholders around the product roadmap.
Method: We conducted a grey literature review according to the guidelines by Garousi et al.
Results: Several approaches to gain alignment were identified such as defining and communicating clear objectives based on the product vision, conducting cross-functional workshops, shuttle diplomacy, and mission briefing. In addition, our review identified the “Behavioural Change Stairway Model” that suggests five steps to gain alignment by building empathy and a trustful relationship.
In previous studies, we used a method for detecting stress based exclusively on heart rate and ECG to differentiate between situations such as mental stress, physical activity, relaxation, and rest. As a response of the heart to these situations, we observed different behavior in the Root Mean Square of Successive Differences (RMSSD) of heartbeat intervals. This study aims to analyze virtual reality, presented via a headset, as an effective stressor for future work. The RMSSD value is an important marker of the parasympathetic effect on the heart and can provide information about stress. For these measurements, the RR intervals were collected using a chest belt; no additional sensors were used for the analysis. We conducted experiments with ten subjects who had to drive a simulator for 25 minutes using monitors and for 25 minutes using a virtual reality headset. Before starting and after finishing each simulation, the subjects completed a survey describing their mental state. The experimental results show that driving with a virtual reality headset has some influence on heart rate and RMSSD, but it does not significantly increase the stress of driving.
Methods based exclusively on heart rate hardly allow differentiation between physical activity, stress, relaxation, and rest, which is why an additional sensor, such as an activity/movement sensor, is added for detection and classification. The heart's response to physical activity, stress, relaxation, and inactivity can be very similar. In this study, we observe the influence of induced stress and analyze which metrics could be considered for its detection. Changes in the Root Mean Square of Successive Differences (RMSSD) provide information about physiological changes. A set of measurements collecting the RR intervals was taken; the intervals are used as a parameter to distinguish four different stages. Parameters such as skin conductivity or skin temperature were not used, because the main aim is to keep the number of sensors and devices to a minimum and thereby increase wearability in the future.
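For reference, RMSSD is the standard heart-rate-variability metric named in both abstracts: the square root of the mean of the squared differences between successive RR intervals. A minimal sketch of the computation (the sample intervals are illustrative, not study data):

```python
import math

def rmssd(rr_intervals_ms):
    """Root Mean Square of Successive Differences, in milliseconds.

    rr_intervals_ms: consecutive RR (beat-to-beat) intervals in ms,
    e.g. as recorded by a chest-belt heart rate sensor.
    """
    if len(rr_intervals_ms) < 2:
        raise ValueError("need at least two RR intervals")
    # Differences between each pair of successive intervals.
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Steady intervals yield a low RMSSD; more variable intervals a higher one.
steady = [800, 802, 798, 801, 799]
variable = [800, 760, 840, 770, 830]
assert rmssd(steady) < rmssd(variable)
```

Because RMSSD is driven by beat-to-beat differences rather than the mean heart rate, it tracks the short-term (parasympathetically mediated) variability that the studies use as a stress marker.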
This work is a comparative study of survey tools intended to help developers select a suitable tool for use in an AAL (ambient assisted living) environment. The first step was to identify the basic functionality required of survey tools used with AAL technologies and to compare these tools by their functionality and intended uses. The comparison was derived from the data obtained, previous literature studies, and further technical data. A list of requirements was compiled and ordered by relevance to the target application domain. Using an integrated assessment method, a generalized estimate value was calculated, and the result is explained. Finally, the planned application of this tool in a running project is described.
Background
A central task of electrocardiographic examinations is to increase the reliability of diagnosing the condition of the heart. Within the framework of this task, an important direction is the solution of the inverse problem of electrocardiography, based on processing the electrocardiographic signals of multichannel cardio leads with known electrode coordinates in these leads (Titomir et al., Noninvasive electrocardiotopography, 2003), (Macfarlane et al., Comprehensive Electrocardiology, 2nd ed. (Chapter 9), 2011).
Results
In order to obtain more detailed information about the electrical activity of the heart, we reconstruct the distribution of equivalent electrical sources on the heart surface. We perform this reconstruction of the equivalent sources over the cardiac cycle at relatively low hardware cost. ECG maps of the electrical potentials on the surface of the torso (TSPM) and of the electrical sources on the surface of the heart (HSSM) were studied at different times of the cardiac cycle. We carried out a visual and quantitative comparison of these maps in the presence of pathological regions of different localization. For this purpose, we used a model of the heart's electrical activity based on cellular automata.
Conclusions
The model of cellular automata allows us to consider the processes of heart excitation in the presence of pathological regions of various sizes and localization. It is shown that changes in the distribution of electrical sources on the surface of the epicardium in the presence of pathological areas with disturbed conduction of heart excitation are much more noticeable than changes in the ECG maps on the torso surface.
Autonomous driving is becoming the next big digital disruption in the automotive industry. However, the possibility of integrating autonomous driving vehicles into current transportation systems not only involves technological issues but also requires the acceptance and adoption of users. Therefore, this paper develops a conceptual model for user acceptance of autonomous driving vehicles. The corresponding model is tested through a standardized survey of 470 respondents in Germany. Finally, the findings are discussed in relation to the current developments in the automotive industry, and recommendations for further research are given.
The advent of chatbots in customer service solutions received increasing attention by research and practice throughout the last years. However, the relevant dimensions and features for service quality and service performance for chatbots remain quite unclear. Therefore, this research develops and tests a conceptual model for customer service quality and customer service performance in the context of chatbots. Additionally, the impact of the developed service dimensions on different customer relationship metrics is measured across different service channels (hotline versus chatbots). Findings of six independent studies indicate a strong main effect of the conceptualized service dimensions on customer satisfaction, service costs, intention to service reusage, word-of-mouth, and customer loyalty. However, different service dimensions are relevant for chatbots compared to a traditional service hotline.
With significant advancements in digital technologies, firms find themselves competing in an increasingly dynamic business environment. It is of paramount importance that organizations undertake proper governance mechanisms with respect to their business and IT strategies. Therefore, IT governance (ITG) has become an important factor for firm performance. In recent years, agility has evolved into a core concept for governance, especially in the area of software development. However, the impact of agility on ITG and firm performance has not yet been broadly analyzed by the scientific community. This paper focuses on the question of how the concept of agility affects the ITG–firm performance relationship. The conceptual model for this question was tested in a quantitative research process with 400 executives responding to a standardized survey. Findings show that adopting agile principles, values, and best practices in the context of ITG leads to meaningful results for governance, business/IT alignment, and firm performance.
Learning factories can complement each other by training different competencies in the field of digitalisation and Industry 4.0. They depict diverse sections of the product development process and focus on various technologies. Within the framework of the International Association of Learning Factories (IALF), the operating organisations of learning factories exchange information on research, training and education. One of the aims is to develop joint projects. The article presents different concepts of cooperation between learning factories, focusing on improving the development of learners' competencies, e.g. through a broader range of topics. A concept for a joint course between the learning factories in Bochum, Reutlingen and Darmstadt is explained in detail. The three learning factories are examined with regard to their similarities and differences. The joint course focuses on the target group of students and on the topic of digitalisation in the development and production of products. The course and its contents are explained in detail. The new learning approach is evaluated on the basis of feedback from the participants. Finally, challenges resulting from cooperation between learning factories at different locations and with different operating models are discussed.
Rapidly changing market conditions and global competition are leading to an increasing complexity of logistics systems and require innovative approaches to the organisation and control of these systems. In scientific research, concepts of autonomously controlled logistics systems show a promising approach to meeting the increasing requirements for flexible and efficient order processing. In this context, this work aims to introduce a system that is able to adjust order processing dynamically and to optimise intralogistics transportation with respect to various generic intralogistics target criteria. The logistics system under consideration consists of various means of transport that autonomously make decisions and fulfil transport orders with defined source-sink relationships. The context of this work is set by introducing the Learning Factory Werk 150 with its existing hardware and software infrastructure and the target figures defined to measure the system's performance. Specifically, the target figures of cost and performance are considered for the transportation system. The core idea of the system's logic is to solve the problem of allocating orders to specific means of transport by linking a Genetic Algorithm with a Multi-Agent System. The implementation of the developed system is described in an application scenario at the learning factory.
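The order-allocation idea can be sketched with a toy genetic algorithm. This is an illustration only, not the Werk 150 implementation: the cost matrix, population size, and operators are hypothetical, and the multi-agent side (where vehicles would negotiate or report costs) is reduced to a fixed matrix.

```python
# Illustrative sketch only: a tiny genetic algorithm that assigns transport
# orders to vehicles. The cost matrix below is invented for demonstration;
# in the paper's setting such costs would come from the multi-agent system.
import random

random.seed(42)

# cost[v][o]: cost of vehicle v executing order o (hypothetical values)
cost = [
    [4, 9, 3, 7, 2],
    [6, 1, 8, 2, 5],
    [3, 7, 4, 6, 9],
]
n_vehicles, n_orders = len(cost), len(cost[0])

def fitness(assignment):
    """Total transport cost of an order-to-vehicle assignment (lower is better)."""
    return sum(cost[v][o] for o, v in enumerate(assignment))

def evolve(pop_size=30, generations=100, mutation_rate=0.1):
    # a chromosome maps each order index to a vehicle index
    pop = [[random.randrange(n_vehicles) for _ in range(n_orders)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[:pop_size // 2]           # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, n_orders)   # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(n_orders):             # per-gene mutation
                if random.random() < mutation_rate:
                    child[i] = random.randrange(n_vehicles)
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
print("assignment:", best, "cost:", fitness(best))
```

A real system would add capacity and timing constraints and let each vehicle agent evaluate its own portion of the chromosome; the GA shown here only captures the encoding and search loop.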
In recent years, the development and application of decellularized extracellular matrices (ECMs) for use as biomaterials have grown rapidly. These cell-derived matrices (CDMs) represent highly bioactive and biocompatible materials consisting of a complex assembly of biomolecules. Even though CDMs mimic the natural microenvironment of cells in vivo very closely, they still lack specifically addressable functional groups, which are often required to tailor a biomaterial's functionality by bioconjugation. To overcome this limitation, metabolic glycoengineering has emerged as a powerful tool to equip CDMs with chemical groups such as azides. These small chemical handles are known for their ability to undergo bioorthogonal click reactions, which represent a desirable reaction type for bioconjugation. However, ECM insolubility makes its processing very challenging. In this contribution, we isolated both the unmodified ECM and azide-modified clickECM by osmotic lysis. In a first step, these matrices were concentrated to remove excessive water from the decellularization step. Next, the hydrogel-like ECM and clickECM films were mechanically fragmentized, resulting in easy-to-pipette suspensions with fragment sizes ranging from 7.62 to 31.29 μm (as indicated by the mean d90 and d10 values). The biomolecular composition was not impaired, as proven by immunohistochemistry. The suspensions were used for the reproducible generation of surface coatings, which proved to be homogeneous in terms of ECM fragment sizes and coating thicknesses (the mean coating thickness was found to be 33.2 ± 7.3 μm). Furthermore, they were stable against fluid-mechanical abrasion in a laminar flow cell. When primary human fibroblasts were cultured on the coated substrates, an increased bioactivity was observed. By conjugating the azides within the clickECM coatings with alkyne-coupled biotin molecules, a bioconjugation platform was obtained in which the biotin–streptavidin interaction could be used. Its applicability was demonstrated by equipping the bioactive clickECM coatings with horseradish peroxidase as a model enzyme.
The extracellular matrix (ECM) naturally surrounds cells in humans and therefore represents an ideal biomaterial for tissue engineering. ECMs from different tissues exhibit different compositions and physical characteristics. Thus, the ECM provides not only physical support but also contains crucial biochemical signals that influence cell adhesion, morphology, proliferation and differentiation. Besides native ECM from mature tissue, ECM can also be obtained from the in vitro culture of cells. In this study, we aimed to highlight the supporting effect of cell-derived ECM (cdECM) on adipogenic differentiation. Adipose-derived stem cells (ASCs) were seeded on top of cdECM from ASCs (scdECM) or pre-adipocytes (acdECM). The impact of the ECM on cellular activity was determined by LDH, WST I and BrdU assays. The supporting effect of cdECM substrates on adipogenic differentiation was determined by oil red O staining and subsequent quantification. Results revealed no effect of cdECM substrates on cellular activity. Regarding adipogenic differentiation, a supporting effect of cdECM substrates was observed compared to the control. With these results, we confirm cdECM as a promising biomaterial for adipose tissue engineering.
Different types of raw cotton were investigated with a commercial ultraviolet-visible/near-infrared (UV-Vis/NIR) spectrometer (210–2200 nm) as well as with a home-built setup for NIR hyperspectral imaging (NIR-HSI) in the range 1100–2200 nm. UV-Vis/NIR reflection spectroscopy reveals the dominant role that proteins, hydrocarbons and hydroxyl groups play in the structure of cotton. NIR-HSI shows a similar result. The experimentally obtained data in combination with principal component analysis (PCA) provide a general differentiation of cotton types. For UV-Vis/NIR spectroscopy, the first two principal components (PCs) represent 82 % and 78 % of the total data variance for the UV-Vis and NIR regions, respectively. For NIR-HSI, due to the large amount of data acquired, two data-processing methodologies were applied, at low and at high lateral resolution: in the first, the spectra of each sample were averaged; in the second, the spectra of each pixel were used. Both methods explain ≥90 % of the total variance with the first two PCs. The results show that it is possible to distinguish between different cotton types based on a few selected wavelength ranges. The combination of HSI and multivariate data analysis has strong potential for industrial applications due to its short acquisition time and low development cost. This study opens a novel possibility for further development of this technique towards real large-scale processes.
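The PCA step used in the study can be sketched as follows. The spectra below are synthetic stand-ins for the measured cotton reflectance data (two invented "cotton types" differing in one absorption band); only the decomposition itself, mean-centring followed by SVD, mirrors the described analysis.

```python
# Sketch of PCA on reflectance spectra. The data are synthetic stand-ins
# for the measured cotton spectra (hypothetical band at ~1500 nm).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "spectra": 40 samples x 200 wavelengths, two cotton types that
# differ in the depth of one broad absorption band.
wavelengths = np.linspace(1100, 2200, 200)
band = np.exp(-((wavelengths - 1500) / 80.0) ** 2)
type_a = 0.6 + 0.30 * band[None, :] + 0.01 * rng.standard_normal((20, 200))
type_b = 0.6 + 0.05 * band[None, :] + 0.01 * rng.standard_normal((20, 200))
X = np.vstack([type_a, type_b])

# PCA: mean-centre, then SVD; the principal components are the right
# singular vectors, and the singular values give the explained variance.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                      # projection onto first two PCs
explained = (s ** 2) / (s ** 2).sum()       # variance fraction per PC

print(f"PC1+PC2 explain {100 * explained[:2].sum():.1f} % of total variance")
```

Because the band-depth difference dominates the variance, the two types separate along PC1; this is the same mechanism by which the study's first two PCs capture most of the spectral variance between cotton types.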
This book presents an empirical investigation of the efforts that multinational pharmaceutical companies take in order to find a business model that allows for a profitable access to the Bottom of the Pyramid (BoP) markets. The Bottom of the Pyramid in Africa is frequently mentioned as an attractive market due to its sheer size. Yet most companies struggle to access it because of the low price level, difficult physical market access and challenges when it comes to payment.
More specifically, the book investigates the following business model-related questions: Do pharmaceutical companies provide products that meet the needs of the BoP? What characterizes the value generation of the company? What revenue model leads to a profitable business, and what role does a network of partners play in the business model?
Findings reveal that there is no 'one-size-fits-all' answer to these questions. Continuous availability, affordability combined with good quality of goods and services, creating health awareness, and localizing business to achieve a level of inclusiveness are essential prerequisites for success. In the last chapter, the book provides a business model prototype that accounts for these key success factors for business at the Bottom of the Pyramid and points to further research topics.