To remain relevant and mitigate disruption, traditional companies have to engage in multiple fast-paced experiments in digital offerings—revenue-generating solutions to what customers want and are willing to pay for, inspired by what is possible with digital technologies. After launching several digital offering initiatives, reinsurance giant Munich Re noticed that many experienced similar challenges. This case describes how Munich Re addressed these common challenges by building a foundation to help its digital offerings succeed. The foundation provided prioritized and staged funding; dedicated, hands-on expertise; and a digital platform of shared services. By 2020, this foundation was helping to support over seventy initiatives, including several that were in the market generating new sources of revenue for the company by enabling its clients—insurance companies—to better service their own customers.
The digitalization of the working world and the New Work movement are changing the way we collaborate. Accordingly, there is currently much talk about leadership, possibly too much. The central thesis of this article is that we can better understand how leaders and employees will collaborate in a future-oriented way if we talk less directly about leadership and instead start by thinking from the situation in which actors coordinate. Leadership is then considered not in isolation but together with various forms of coordination (such as autonomy, self-organization, management and, indeed, leadership). In practice, this can help to shape the transformation of collaboration, including the new kind of leadership, in a more reflective and goal-oriented way.
In modern working environments, workplace-related digital technologies are increasingly being used. While this offers numerous opportunities, it can also have negative consequences for employees' health. For many companies, these challenges are currently being exacerbated by the corona crisis. Stress that arises directly or indirectly from the use of technologies is referred to as "technostress". Important levers for avoiding it include the design of the technologies themselves and the consideration of various individual and situational factors during technological change processes.
Errors, manipulation and rationality: how reporting influences the behavior of decision-makers
(2020)
The purpose of management reporting is to satisfy the information needs of executives. However, both the producers and the users of reports act only in a boundedly rational way. Reports therefore do not work with pinpoint accuracy but trigger a variety of unintended reactions among those involved. This article explains how the "human factor" affects the preparation and use of management reports and how effective and efficient management reporting can minimize undesirable effects.
Problem: More and more companies are introducing lean principles but find that their requirements for suitable cost information are not adequately met by traditional cost accounting.
Objective: A cost accounting approach oriented towards lean thinking incorporates new cost allocation objects and provides cost information that has so far been neglected.
Method: Common cost accounting approaches are contrasted with a coherent "accounting for lean" approach, and commonalities and overlaps are identified.
Cloud resources can be dynamically provisioned according to application-specific requirements and are paid for on a per-use basis. This gives rise to a new concept for parallel processing: elastic parallel computations. However, it is still an open research question to what extent parallel applications can benefit from elastic scaling, which requires resource adaptation at runtime and corresponding coordination mechanisms. In this work, we analyze how to address these system-level challenges in the context of developing and operating elastic parallel tree search applications. Based on our findings, we discuss the design and implementation of TASKWORK, a cloud-aware runtime system specifically designed for elastic parallel tree search, which enables the implementation of elastic applications by means of higher-level development frameworks. We show how to implement an elastic parallel branch-and-bound application based on an exemplary development framework and report on our experimental evaluation, which also considers several benchmarks for parallel tree search.
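The following minimal Python sketch illustrates the execution style such runtime systems build on; it is not TASKWORK's actual API. Each task in a shared pool represents a subtree of the search, so the number of workers can change at runtime without restructuring the computation. The knapsack instance, the simple bound, and the timeout-based termination are simplifying assumptions.

```python
# Task-pool-based parallel branch-and-bound (a sketch, not the TASKWORK API).
import threading
import queue

items = [(60, 10), (100, 20), (120, 30)]  # (value, weight) for 0/1 knapsack
capacity = 50

tasks = queue.Queue()
tasks.put((0, 0, 0))          # (next item index, value so far, weight so far)
best = {"value": 0}
lock = threading.Lock()

def worker():
    while True:
        try:
            i, value, weight = tasks.get(timeout=0.1)
        except queue.Empty:
            return  # simplified termination; real runtimes detect it globally
        # Bound: remaining total value cannot lift this node above the incumbent.
        remaining = sum(v for v, _ in items[i:])
        with lock:
            if value + remaining <= best["value"]:
                continue  # prune this subtree
            best["value"] = max(best["value"], value)
        if i < len(items):
            v, w = items[i]
            if weight + w <= capacity:
                tasks.put((i + 1, value + v, weight + w))  # branch: take item i
            tasks.put((i + 1, value, weight))              # branch: skip item i

# "Elastic" scaling here is simply starting more workers on the same pool.
threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(best["value"])  # 220 for this instance
```

A cloud-aware runtime would additionally migrate tasks across virtual machines and replace the queue timeout with proper distributed termination detection.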
In recent years, the cloud has become an attractive execution environment for parallel applications, which introduces novel opportunities for versatile optimizations. Particularly promising in this context is the elasticity characteristic of cloud environments. While elasticity is well established for client-server applications, it is a fundamentally new concept for parallel applications. However, existing elasticity mechanisms for client-server applications can be applied to parallel applications only to a limited extent. Efficient exploitation of elasticity for parallel applications requires novel mechanisms that take into account the particular runtime characteristics and resource requirements of this application type. To tackle this issue, we propose an elasticity description language. This language enables users to define elasticity policies, which specify the elasticity behavior at both the cloud infrastructure level and the application level. Elasticity at the application level is supported by an adequate programming and execution model, as well as abstractions that comply with the dynamic availability of resources. We present the underlying concepts and mechanisms, as well as the architecture and a prototypical implementation. Furthermore, we illustrate the capabilities of our approach through real-world scenarios.
High Performance Computing (HPC) enables significant progress in both science and industry. Whereas traditionally parallel applications have been developed to address the grand challenges in science, as of today, they are also heavily used to speed up the time-to-result in the context of product design, production planning, financial risk management, medical diagnosis, as well as research and development efforts. However, purchasing and operating HPC clusters to run these applications requires huge capital expenditures as well as operational knowledge and thus is reserved to large organizations that benefit from economies of scale. More recently, the cloud evolved into an alternative execution environment for parallel applications, which comes with novel characteristics such as on-demand access to compute resources, pay-per-use, and elasticity. Whereas the cloud has been mainly used to operate interactive multi-tier applications, HPC users are also interested in the benefits offered. These include full control of the resource configuration based on virtualization, fast setup times by using on-demand accessible compute resources, and eliminated upfront capital expenditures due to the pay-per-use billing model. Additionally, elasticity allows compute resources to be provisioned and decommissioned at runtime, which allows fine-grained control of an application's performance in terms of its execution time and efficiency as well as the related monetary costs of the computation. Whereas HPC-optimized cloud environments have been introduced by cloud providers such as Amazon Web Services (AWS) and Microsoft Azure, existing parallel architectures are not designed to make use of elasticity. This thesis addresses several challenges in the emergent field of High Performance Cloud Computing. In particular, the presented contributions focus on the novel opportunities and challenges related to elasticity. First, the principles of elastic parallel systems as well as related design considerations are discussed in detail. On this basis, two exemplary elastic parallel system architectures are presented, each of which includes (1) an elasticity controller that controls the number of processing units based on user-defined goals, (2) a cloud-aware parallel execution model that handles coordination and synchronization requirements in an automated manner, and (3) a programming abstraction to ease the implementation of elastic parallel applications. To automate application delivery and deployment, novel approaches are presented that generate the required deployment artifacts from developer-provided source code in an automated manner while considering application-specific non-functional requirements. Throughout this thesis, a broad spectrum of design decisions related to the construction of elastic parallel system architectures is discussed, including proactive and reactive elasticity control mechanisms as well as cloud-based parallel processing with virtual machines (Infrastructure as a Service) and functions (Function as a Service). To evaluate these contributions, extensive experimental evaluations are presented.
The objective of the project presented here is to develop an intelligent control algorithm for an energy system consisting of a biogas CHP (combined heat and power) unit, various storage technologies, such as thermal energy storages (TES) and gas storages, and other renewable energy sources, such as photovoltaics. A corresponding algorithm based on the Monte-Carlo method has already been developed at Reutlingen University for CHP units running on natural gas and for heat pumps. The project presented here concentrates on the further development of this algorithm for application to biogas CHP units. In this context, an adequate implementation of the gas storage is of primary importance, as it largely determines the flexibility of the plant. In the course of the validation of the new optimization algorithm, simulations were carried out based on data from the Lower Lindenhof, an agricultural experimental station of the University of Hohenheim. Both an optimization with regard to on-site electricity utilization and an optimization driven by residual load were investigated. Preliminary results show that the optimization algorithm can improve the operation of the biogas CHP unit depending on the selected target function.
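As a rough illustration of the Monte-Carlo idea described above (not the actual Reutlingen algorithm), the sketch below samples random on/off schedules for the CHP, simulates the gas storage to rule out infeasible schedules, and keeps the schedule that best tracks an assumed residual load. All plant parameters are invented placeholders.

```python
# Monte-Carlo schedule search for a biogas CHP with gas storage (illustrative).
import random

HOURS = 24
GAS_IN = 40.0          # constant biogas production per hour (kWh gas)
GAS_PER_HOUR = 80.0    # gas consumed by the CHP per hour of operation
STORE_MAX = 400.0      # gas storage capacity (kWh gas)
P_CHP = 30.0           # electrical output when running (kW)
residual_load = [20 if 8 <= h < 20 else 5 for h in range(HOURS)]  # kW

def evaluate(schedule):
    """Return the load-tracking error, or None if storage limits are violated."""
    store, error = STORE_MAX / 2, 0.0
    for h, on in enumerate(schedule):
        store += GAS_IN - (GAS_PER_HOUR if on else 0.0)
        if store < 0 or store > STORE_MAX:
            return None  # infeasible: storage runs empty or would overflow
        error += abs(residual_load[h] - (P_CHP if on else 0.0))
    return error

best_schedule, best_error = None, float("inf")
for _ in range(20000):  # Monte-Carlo sampling of candidate schedules
    candidate = [random.random() < 0.5 for _ in range(HOURS)]
    err = evaluate(candidate)
    if err is not None and err < best_error:
        best_schedule, best_error = candidate, err

print(best_error, best_schedule)
```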
Under the term innovation enabling, this article presents a concept for the holistic support of interdisciplinary teams in creative and innovative problem solving. The concept supports moderators and participants alike, and a system built on it remains in the background for the user through implicit interaction. A central role is played by the concept of the awareness pipeline for implementing implicit interaction on the basis of a sensor-actuator system, which is presented in this article. Support for the accompanying moderation and administration tasks, such as automated documentation of the session, is intended to offer clear added value over a classic brainstorming session in the future.
The Twelfth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2020) continued a series of events covering a large spectrum of topics related to advances in database fundamentals, the evolving relation between databases and other domains, database technologies and content processing, as well as the specifics of databases in application domains. Advances in different technologies and domains related to databases have triggered substantial improvements in content processing, information indexing, and data, process and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of XML adoption. High-speed communications and computations, large storage capacities, and load balancing for distributed database access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms and advanced indexing methods. Developments in e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems put pressure on database communities to push the 'de facto' methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
Entrepreneurship education is becoming increasingly important in higher education and also drives the development of innovative teaching formats, which can increase student engagement. It does, however, need greater international focus to become more attractive for both domestic and international students. This paper presents the examination and course design of two case studies, which promote entrepreneurship education for domestic and international students. These examples show that entrepreneurship courses are attractive due to their focus on interdisciplinarity, experience-based learning, and project-based work. Following a design-based research approach, this paper provides a practical contribution by offering a detailed overview of course design principles, classroom practice and presents reflections and learnings from an iterative development process.
A research team from the Pädagogische Hochschule Freiburg and Hochschule Reutlingen, with expertise in communication design and in the aesthetic-cultural subject didactics of primary school education, is investigating the extent to which the iterative process and principles of design thinking are suited to fostering creativity, problem-solving skills, and collaborative work among primary school children. The considerations are based on the process-oriented competencies of the subjects art/crafts and general studies as defined in the current curriculum of Baden-Württemberg. Following preliminary studies with teachers and teacher trainers, a teaching unit was designed in which third-grade children were to realize the perfect reading place by means of design thinking.
To remain relevant and mitigate disruption, traditional companies have to engage in multiple fast-paced experiments in digital offerings: revenue-generating solutions that leverage digital technologies to address customer needs. After launching several digital offering initiatives, reinsurance giant Munich Re noticed that many experienced similar challenges. This briefing describes how Munich Re addressed these common challenges by building a foundation for experimenting more systematically and successfully with digital offerings. The foundation has enabled Munich Re to become a serial innovator of digital offerings.
Times of crisis have become the norm for the economy as cycles grow ever shorter. In the new millennium, the global economy has already faced several severe crises: the bursting of the New Economy bubble at the beginning of the millennium and the financial and economic crisis of 2008. These macroeconomic crises are currently compounded by industry and corporate crises, often caused by the disruptive force of digital transformation or by management failures. Corporate crises are thus, in a sense, the normal case of typical corporate development and occur in every company sooner or later. Accordingly, some models of the organizational life cycle (Ringlstetter & Kaiser, 2004) suggest viewing corporate crises as unexceptional, that is, as a normal and permanent concomitant of entrepreneurial activity (in particular Greiner, 1972). A crisis can generally be understood as the abrupt break in a hitherto continuous development (Krystek, 1987, p. 3). This break marks a turning point in corporate development whose concrete outcome cannot be foreseen and which, moreover, can endanger the entire enterprise or its dominant goals. It is therefore justified to point to the ambivalent development possibilities inherent in a crisis.
Lean management has found its way into many companies. Lean concepts place new demands on the type and structure of the cost information required, demands that traditional cost accounting systems do not directly fulfill. Proponents of "lean accounting" therefore propose partly radical changes and a simplification of cost accounting. This article discusses the limitations of traditional cost accounting in implementing lean management and presents selected "accounting for lean" approaches. The analysis shows that lean accounting approaches are too narrowly focused and cannot adequately reflect the plurality of cost accounting functions found in practice. A radical redesign of existing cost accounting systems is therefore rejected as unrealistic and unfounded. The article develops alternative proposals for how lean management concepts and the cost information they require can be integrated into traditional cost accounting systems.
Industrial production facilities account for a significant share, around 40%, of Germany's total energy demand. They have therefore been, and continue to be, optimized both technologically and energetically. Techno-economic optimization frequently goes hand in hand with a reduction in energy and material consumption. In addition, the expansion of renewable energy sources is making energy generation increasingly volatile, so that not only reducing absolute energy consumption but also achieving greater flexibility (controlling power over time) is becoming increasingly attractive. This often changes the installed power as well as the design of heat dissipation, which influences the dimensioning of equipment such as metal-cutting machine tools.
A holistic approach to digitization enables decision-makers to achieve new efficiency in corporate performance management. Digitalization improves the quality, validity and speed of information retrieval and processing. At present, most corporations are confronted with the problem of not being able to organize, categorize and visualize decision-relevant information. To meet the challenges of information management, the Management Cockpit provides an information center for managers. In accordance with the specific working environment of executives, the Management Cockpit offers a quick and comprehensive overview of the company's situation. Today, the current situation of a company is no longer influenced only by internal factors but also by its public image. Social media monitoring and analysis is therefore a crucial component for the external factors of successful management. Real-time monitoring of the emotions and behaviors of consumers and customers thus contributes to effective controlling of all business areas. Intelligent factories promise to collect data for internal factors, but the current reality in manufacturing looks different. Production often consists of a large number of different machines with varying degrees of digitization and limited sensor data availability. In order to close this gap, we developed a compact sensor board with network components, which allows a flexible design with different sensors for a wide variety of applications. The sensor data enable decision-makers to adapt the supply chain based on their internal and external observations in the Management Cockpit. Due to the real-time and long-term monitoring and analytic possibilities, the Management Cockpit provides a multi-dimensional view of the company and supports a holistic corporate performance management.
The article studies a novel approach to inflation modeling in economics. We utilize a stochastic differential equation (SDE) of the form $dX_t = a X_t \, dt + b X_t \, dB_t^H$, where $B_t^H$ is a fractional Brownian motion, in order to model inflationary dynamics. Standard economic models do not capture the stochastic nature of inflation in the Eurozone. Thus, we develop a new stochastic approach and take into consideration fractional Brownian motions as well as Lévy processes. The benefits of those stochastic processes are the modeling of interdependence and jumps, which is equally confirmed by empirical inflation data. The article defines and introduces the rules for stochastic and fractional processes and elucidates the stochastic simulation output.
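To make the model concrete, the following sketch simulates one path of the stated SDE: fBm increments are drawn exactly from their covariance via a Cholesky factorization, and X follows by an Euler scheme. The parameter values (a, b, H, horizon, starting level) are illustrative assumptions, not values from the article.

```python
# Simulating dX = a X dt + b X dB^H with a fractional Brownian driver.
import numpy as np

a, b, H = 0.02, 0.1, 0.7   # drift, volatility, Hurst exponent (H > 0.5: persistence)
T, n = 1.0, 250
t = np.linspace(T / n, T, n)
dt = T / n

# Covariance of fBm: E[B_s B_t] = 0.5 * (s^{2H} + t^{2H} - |t - s|^{2H})
s, u = np.meshgrid(t, t)
cov = 0.5 * (s**(2 * H) + u**(2 * H) - np.abs(s - u)**(2 * H))
L = np.linalg.cholesky(cov + 1e-12 * np.eye(n))  # jitter for numerical stability

rng = np.random.default_rng(0)
B = L @ rng.standard_normal(n)                   # one fBm path sampled at t
dB = np.diff(np.concatenate(([0.0], B)))         # fBm increments

X = np.empty(n + 1)
X[0] = 100.0                                     # e.g. an index level
for k in range(n):                               # Euler scheme
    X[k + 1] = X[k] + a * X[k] * dt + b * X[k] * dB[k]

print(X[-1])
```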
Resilience and stability? Course-setting in the banking and financial system during the corona pandemic
(2020)
Since the global financial crisis of 2008/2009, there has been no challenge to the financial and banking system comparable to that during the Corona crisis.
Weak profitability, unresolved regulatory challenges and increasing competition in the digital sector pose further challenges for banks.
The stability of the financial system and access to financial markets were not at risk during the pandemic. Through joint efforts and better bank capitalisation, the financial system is now more resilient than during the financial crisis.
Provided that grants and loans in the "Next Generation EU" fund are well targeted at structural reforms and investments in the future, this should boost confidence and growth.
However, further improvements in financial stability, such as increased capital requirements, regulation of shadow banks or reforms in financial supervision, are needed.
Since Adam Smith, the “homo oeconomicus” has been the behavioural model in economics. Commonly, this model characterizes a selfish individual, a kind of ruthless type whose greed for profit seems to take precedence over moral values. As early as 100 years ago, Max Weber modernized the model with regard to methodological individualism. Recent research in the cognitive sciences suggests a further modernization of this standard model in economics. Neuro-economics, a highly interdisciplinary research field, is building a new behavioural consensus. This article examines the new properties of the “neuro-homo oeconomicus”. We show that the new behavioural model is rather similar to the long-standing economic prototype. To that extent, the neuro-model is more hype than hope. In principle, this article considers an ancient philosophical question about the nature of humans in general.
This paper studies the impact of financial liquidity on the macro-economy. We extend a classic macroeconomic model and compute numerical simulations. The model confirms that persistently low inflation can occur despite a high degree of financial liquidity due to a reallocation of cash, normal and risk-free bonds. In that regard, our model uncovers an explanation of a flat Phillips curve. Overall, our approach contributes to a rather disregarded matter in macroeconomic theory.
Businesses need to cope with myriad challenges including increasingly competitive markets and rapid developments in digital technology. The overall aim of the research described in this paper is to generate fresh insights into the impacts of digitalisation on the design and management of global supply chains. It focuses on understanding the current adoption rate of new technologies in global supply chains, identifying perceived opportunities and challenges and clarifying the critical factors driving (and inhibiting) their deployment. The authors administered an online survey with a global sample of respondents from various supply chain functions, resulting in a sample of 142 responses. Significant differences emerged in adoption patterns between companies of different sizes. Moreover, the study pointed to a widening gap (or a ‘digital divide’) between leaders and laggards in terms of technology adoption. Perceived benefits and challenges also differ notably between companies of varying sizes. Adoption patterns are very diverse across specific technologies. The results further suggest that there is a significant correlation between adoption of digital technologies and different dimensions of company performance.
With significant advancements in digital technologies, firms find themselves competing in an increasingly dynamic business environment. It is of paramount importance that organizations undertake proper governance mechanisms with respect to their business and IT strategies. Therefore, IT governance (ITG) has become an important factor for firm performance. In recent years, agility has evolved as a core concept for governance, especially in the area of software development. However, the impact of agility on ITG and firm performance has not been analyzed by the broad scientific community. This paper focuses on the question, how the concept of agility affects the ITG–firm performance relationship. The conceptual model for this question was tested by a quantitative research process with 400 executives responding to a standardized survey. Findings show that the adoption of agile principles, values, and best practices to the context of ITG leads to meaningful results for governance, business/IT alignment, and firm performance.
The advent of chatbots in customer service solutions received increasing attention by research and practice throughout the last years. However, the relevant dimensions and features for service quality and service performance for chatbots remain quite unclear. Therefore, this research develops and tests a conceptual model for customer service quality and customer service performance in the context of chatbots. Additionally, the impact of the developed service dimensions on different customer relationship metrics is measured across different service channels (hotline versus chatbots). Findings of six independent studies indicate a strong main effect of the conceptualized service dimensions on customer satisfaction, service costs, intention to service reusage, word-of-mouth, and customer loyalty. However, different service dimensions are relevant for chatbots compared to a traditional service hotline.
Autonomous driving is becoming the next big digital disruption in the automotive industry. However, the possibility of integrating autonomous driving vehicles into current transportation systems not only involves technological issues but also requires the acceptance and adoption of users. Therefore, this paper develops a conceptual model for user acceptance of autonomous driving vehicles. The corresponding model is tested through a standardized survey of 470 respondents in Germany. Finally, the findings are discussed in relation to the current developments in the automotive industry, and recommendations for further research are given.
The shift of populations to cities is creating challenges in many respects, thus leading to increasing demand for smart solutions of urbanization problems. Smart city applications range from technical and social to economic and ecological. The main focus of this work is to provide a systematic literature review of smart city research to answer two main questions: (1) How is current research on smart cities structured? And (2) What directions are relevant for future research on smart cities? To answer these research questions, a text-mining approach is applied to a large number of publications. This provides an overview and gives insights into relevant dimensions of smart city research. Although the main dimensions of research are already described in the literature, an evaluation of the relevance of such dimensions is missing. Findings suggest that the dimensions of environment and governance are popular, while the dimension of economy has received only limited attention.
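A minimal sketch of the kind of text-mining pipeline such reviews rely on is given below: TF-IDF vectorization of abstracts followed by topic extraction, here with NMF. The four-document corpus and the topic count are placeholders; the study's actual corpus and method details may differ.

```python
# Topic extraction over publication abstracts (illustrative data).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

abstracts = [
    "smart governance participation open data city services",
    "urban environment air quality sensors energy efficiency",
    "smart economy innovation startups digital business models",
    "mobility traffic public transport routing congestion",
]

tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(abstracts)

nmf = NMF(n_components=2, random_state=0)
W = nmf.fit_transform(X)                 # document-topic weights
terms = tfidf.get_feature_names_out()
for k, comp in enumerate(nmf.components_):
    top = [terms[i] for i in comp.argsort()[-4:][::-1]]
    print(f"topic {k}: {', '.join(top)}")  # candidate 'smart city dimensions'
```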
Our paper investigates the response of acquiring firms’ stock returns around the announcement date in cross-border mergers and acquisitions (M&A) between listed Chinese acquirers and German targets. We apply an event study methodology to examine the shareholder value effect based on a sample of M&A deals over the recent period of 2012-2018, using a market model event study following the argumentation of Brown and Warner (1985) and short-term observation periods according to Andrade, Mitchell, and Stafford (2001) as well as Hackbarth and Morellec (2008). The results indicate that the announcement of M&A involving German targets results in a positive cumulative abnormal return of, on average, 2.18% for Chinese acquirers’ shareholders in a five-day symmetric event window. Furthermore, we found slight indications of possible information leakage prior to the formal announcement. Although the size of the acquiring firm is not necessarily correlated with the positive abnormal returns in the short run, the study suggests that Chinese acquirers’ shareholders gain higher abnormal returns when the German targets are non-listed companies.
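The following sketch illustrates the market-model event-study mechanics described above: alpha and beta are estimated by OLS on a pre-event window, and abnormal returns are cumulated over the five-day symmetric event window. The return series are synthetic; the window lengths and the announcement effect are assumptions for illustration.

```python
# Market-model event study with a five-day symmetric window (synthetic data).
import numpy as np

rng = np.random.default_rng(1)
market = rng.normal(0.0004, 0.01, 130)             # daily market returns
stock = 0.0002 + 1.1 * market + rng.normal(0, 0.01, 130)
stock[125:130] += 0.004                            # stylized announcement effect

est_m, est_s = market[:120], stock[:120]           # estimation window
beta, alpha = np.polyfit(est_m, est_s, 1)          # OLS market model

event_m, event_s = market[123:128], stock[123:128] # days -2 .. +2 around day 125
abnormal = event_s - (alpha + beta * event_m)      # abnormal returns
car = abnormal.sum()                               # cumulative abnormal return
print(f"CAR over [-2, +2]: {car:.4f}")
```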
As part of this research project, finishing agents and processes were developed for the preventive protection of textiles (particularly floor coverings) against soiling. The process involves the combined finishing of textiles with fluorinated polymers containing incorporated nanoparticles (primarily SiO2) to increase surface roughness. Commercially available hydrophobizing agents (fluorocarbon- or hydrocarbon-based polymers) combined with SiO2 nanoparticles were applied to carpets and examined with regard to soiling, e.g., by coffee, KoolAid, red wine, AATCC standard soil, and black shoe polish. For this purpose, the shear-sensitive dispersions of the hydrophobizing agents were mixed with newly developed, adapted dispersions of SiO2 nanoparticles. The SiO2 nanoparticles were synthesized with systematically varied sizes of 10-1,000 nm, comprehensively characterized, and stabilized with newly developed fluoromethacrylate copolymers bearing reactive groups (maleic, itaconic, or citraconic anhydride) and hydrophilic modifiers (alcohol or amine groups). The resulting polymer-particle dispersions could be applied to textiles (PA, PES, or WO carpets and fabrics) from aqueous or ethanolic-aqueous solutions. The newly developed fluorocarbon polymers were also tested with regard to their application. In soiling tests, the carpets finished in this way showed less soiling by standard soil than reference materials. The durability of the finish under mechanical stress was improved by crosslinking the polymers on the textile material. For PA 6 and PA 6.6 carpets, the best results regarding reduced soiling by water-soluble stains (coffee, red wine, KoolAid) compared with untreated carpets were obtained when the finish was applied with fluoropolymer-stabilized SiO2 nanoparticles or with a combined dispersion of SiO2 particles and fluorocarbon resins. Less pronounced soiling by AATCC standard soil (DIN EN ISO 11378-2) compared with untreated carpets was found for PA 6 carpets treated with SiO2 particles. Hydrophobic stains (e.g., black shoe polish) were best removed from carpets finished with fluorocarbon polymers. The combination of SiO2 particles with fluorocarbon polymers usually proved more favorable than treatment with fluorocarbon resins alone. A correlation between nanoparticle size, abrasion resistance, and cleaning properties was established, and it was shown that FC-nanoparticle composites improve the latter. The mechanical durability of the anti-soiling finish with SiO2 nanoparticles and fluorocarbon polymers on polyamide carpets was tested, e.g., by hexapod drum testing (according to ISO 10361). SEM, IR spectroscopy, and the water droplet test showed that the coating was still intact after 4,000 and even after 12,000 revolutions. Crosslinkers that crosslink the polymer itself, the polymer with the particles, and/or the substrate surface improved abrasion resistance in some cases (more suitable crosslinkers may need to be identified here).
The finishing of textiles with sol-gel coatings has been pursued intensively for several years. A large number of known as well as new finishing effects can be realized via this approach. The sol-gel technique is particularly interesting because of the possibility of synthesizing multifunctional finishes. A problem in many cases is the low durability of such finishes, especially towards washing processes. The aim of the project was therefore to develop pretreatment strategies for textile fiber materials, based on synthetic polymers or natural fibers, that improve the durability of sol-gel-based finishes. In the course of the work, functional groups were established on the polymers via suitable anchors, adapted to the respective fiber polymers (polyethylene terephthalate, polyamide, polypropylene, and cotton), which are capable of forming covalent bonds to sol-gel-based coating systems. Trialkoxysilanes were primarily used as anchors, which additionally bear, e.g., epoxy-, isocyanato-, azido-, or amino-functional residues. Through these residues, the anchors can be covalently bonded to the polymers. Most sol-gel-based systems contain, at least to a certain extent, SiOx and/or MexOy clusters. The alkoxysilanes used for surface functionalization can generally be bound to such systems/clusters by condensation and therefore serve for the effective attachment of a wide variety of functional sol-gel layers. Substrates prefunctionalized in this way were subsequently coated with exemplarily selected sol-gel finishes. Hydrophobizing sols were applied for the majority of the investigations. Their advantage is that the finishing effect achieved, as well as its durability, can be characterized with comparatively manageable effort via wettability studies (DuPont ratings, contact angles, drop sink times). The effectiveness of the pretreatments was then verified primarily by studies of the wash durability of the finishes. The work showed that establishing suitable anchors can improve the durability of sol-gel finishes and of the resulting effects. At the same time, it became apparent that the improvements achieved depend very strongly on the respective sol; improvements achieved with one sol cannot necessarily be transferred to another. Analytical characterizations indicate that in many cases the durability of the coating networks themselves has a far greater influence than the attachment to the substrate. Various investigations showed that the add-on of the sol-gel coating decreases significantly, especially after a first wash but also beyond, yet often without the effect achieved by the finish being lost. This points to a (partial) dissolution of the coating matrices, against which the anchors cannot protect, since their effect is limited to the interface with the substrate. In addition to the hydrophobizing finishes, antibacterial finishes were also applied exemplarily after the corresponding pretreatments. Here, too, improvements in the durability of the effect were achieved.
Finally, it was investigated to what extent the pretreatments, compared with simple finishing, adversely affect the textile products. For this purpose, relevant textile parameters such as maximum tensile forces, degrees of whiteness, stiffness, and air permeability were determined. In the vast majority of pretreatments, these parameters were not affected or only slightly affected.
This book describes the current state of the art in integrated ring resonators, covering more than two decades in the development of this exciting device. It discusses in depth one of the most fascinating and versatile integrated optical filters, providing readers with a panoramic view spanning from design and simulation to implementation in various material systems. Written by authors with extensive experience in both academia and industry, this second edition offers a much-needed, major update as interest in integrated ring resonators undergoes a global revival. The new edition includes a comprehensive technological update, and a timely discussion of recent advances in new application areas, such as optofluidics and microfluidics, telecom operations and biosensors. This aptly named compendium is the ideal guide for researchers and engineers looking to review the field as a whole while exploring several of its possible and exciting future trajectories.
Are textiles better in composites? This has long ceased to be a question among experts; textile composite materials can offer many advantages. It is well known that they are often superior to non-textile alternatives, and the examples are manifold. Innovative developments include not only the much-noticed textile-reinforced concrete, which was awarded the Deutscher Zukunftspreis, but also many perhaps less noticed or less spectacular products based on fiber-reinforced plastics.
This book discusses important topics for engineering and managing software startups, such as how technical and business aspects are related, which complications may arise and how they can be dealt with. It also addresses the use of scientific, engineering, and managerial approaches to successfully develop software products in startup companies.
The book covers a wide range of software startup phenomena, and includes the knowledge, skills, and capabilities required for startup product development; team capacity and team roles; technical debt; minimal viable products; startup metrics; common pitfalls and patterns observed; as well as lessons learned from startups in Finland, Norway, Brazil, Russia and USA. All results are based on empirical findings, and the claims are backed by evidence and concrete observations, measurements and experiments from qualitative and quantitative research, as is common in empirical software engineering.
The book helps entrepreneurs and practitioners to become aware of various phenomena, challenges, and practices that occur in real-world startups, and provides insights based on sound research methodologies presented in a simple and easy-to-read manner. It also allows students in business and engineering programs to learn about the important engineering concepts and technical building blocks of a software startup. It is also suitable for researchers at different levels in areas such as software and systems engineering, or information systems who are studying advanced topics related to software business.
Product roadmaps are an important tool in product development. They provide direction, enable consistent development in relation to a product vision and support communication with relevant stakeholders. There are many different formats for product roadmaps, but they are often based on the assumption that the future is highly predictable. However, especially software-intensive businesses are faced with increasing market dynamics, rapidly evolving technologies and changing user expectations. As a result, many organizations are wondering what roadmap format is appropriate for them and what components it should have to deal with an unpredictable future. Objectives: To gain a better understanding of the formats of product roadmaps and their components, this paper aims to identify suitable formats for the development and handling of product roadmaps in dynamic and uncertain markets. Method: We performed a grey literature review (GLR) according to the guidelines from Garousi. Results: A Google search identified 426 articles, 25 of which were included in this study. First, various components of the roadmap were identified, especially the product vision, themes, goals, outcomes and outputs. In addition, various product roadmap formats were discovered, such as feature-based, goal-oriented, outcome-driven and a theme-based roadmap. The roadmap components were then assigned to the various product roadmap formats. This overview aims at providing initial decision support for companies to select a suitable product roadmap format and adapt it to their own needs.
Selecting a suitable development method for a specific project context is one of the most challenging activities in process design. Every project is unique and, thus, many context factors have to be considered. Recent research took some initial steps towards statistically constructing hybrid development methods, yet, paid little attention to the peculiarities of context factors influencing method and practice selection. In this paper, we utilize exploratory factor analysis and logistic regression analysis to learn such context factors and to identify methods that are correlated with these factors. Our analysis is based on 829 data points from the HELENA dataset. We provide five base clusters of methods consisting of up to 10 methods that lay the foundation for devising hybrid development methods. The analysis of the five clusters using trained models reveals only a few context factors, e.g., project/product size and target application domain, that seem to significantly influence the selection of methods. An extended descriptive analysis of these practices in the context of the identified method clusters also suggests a consolidation of the relevant practice sets used in specific project contexts.
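As a hedged illustration of this analysis style (with invented data, not the HELENA dataset), the sketch below regresses a binary "method used" indicator on two context factors, mirroring how factor-method correlations can be learned.

```python
# Logistic regression of method selection on context factors (synthetic data).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 200
project_size = rng.integers(0, 3, n)       # 0 = small, 1 = medium, 2 = large
safety_critical = rng.integers(0, 2, n)    # target application domain proxy

# Synthetic ground truth: larger / safety-critical projects use Scrum less.
logit = 1.0 - 0.8 * project_size - 1.2 * safety_critical
uses_scrum = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([project_size, safety_critical])
model = LogisticRegression().fit(X, uses_scrum)
print(dict(zip(["project_size", "safety_critical"], model.coef_[0])))
```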
Hochschulen sind Teil des Innovationsökosystems: in einer kooperativen Austauschbeziehung fördern sie die regionale Wirtschaft und die gesellschaftliche Entwicklung. Deshalb ist die Förderung von Innovation, Kreativität und unternehmerischem Denken eine wichtige Aufgabe. Die Europäische Kommission hat bereits 2005 unternehmerisches Denken und Handeln als Schlüsselkompetenz für das 21. Jahrhundert definiert: „Unternehmerische Kompetenz ist die Fähigkeit, Ideen in die Tat umzusetzen“ (Europäische Kommission, 2005, S. 21). Entrepreneurship Education boomt und die Förderung von unternehmerischen Kompetenzen an Hochschulen wird vorangetrieben – damit ist die Förderung von Gründungskultur nicht nur Teil der Wirtschaftsbildung sondern vielmehr als Querschnittsaufgabe zu verstehen. Die Entrepreneurial Mission verändert die Lehr- und Lern kultur an den Hochschulen. Zum einen ist es Ziel, Entrepreneurship in der Breite an den Hochschulen zu verankern: Unternehmerisches Denken und Handeln ist eine Kernkompetenz. Zum anderen fördert die Start-up Education an Hochschulen aktiv Unternehmertalente und Ausgründungen.
Das Projekt “Spinnovation” ist ein Verbundprojekt der Hochschule Reutlingen, der Hochschule Aalen und der Hochschule der Medien und wird vom Ministerium für Wissenschaft, Forschung und Kunst Baden-Württemberg in der Ausschreibung „Gründungskultur in Studium und Lehre“ gefördert. Seit 2016 wurden dazu an den beteiligten Hochschulen zahlreiche neue Angebote für Studierende entwickelt, um das Thema Entrepreneurship Education curricular zu integrieren und eine Änderung des Mindsets in Richtung Entrepreneurship und Innovation zu bewirken. Basierend auf den Erfahrungen und Ergebnissen aus dem Verbundprojekt Spinnovation können konkrete Handlungsempfehlungen für die Entrepreneurship Education an Hochschulen abgeleitet werden.
With the expansion of cyber-physical systems (CPSs) across critical and regulated industries, systems must be continuously updated to remain resilient. At the same time, they should be extremely secure and safe to operate and use. The DevOps approach caters to business demands of more speed and smartness in production, but it is extremely challenging to implement DevOps due to the complexity of critical CPSs and requirements from regulatory authorities. In this study, expert opinions from 33 European companies expose the gap in the current state of practice on DevOps-oriented continuous development and maintenance. The study contributes to research and practice by identifying a set of needs. Subsequently, the authors propose a novel approach called Secure DevOps and provide several avenues for further research and development in this area. The study shows that, because security is a cross-cutting property in complex CPSs, its proficient management requires system-wide competencies and capabilities across the CPSs development and operation.
Hardly any software development process is used as prescribed by authors or standards. Regardless of company size or industry sector, a majority of project teams and companies use hybrid development methods (short: hybrid methods) that combine different development methods and practices. Even though such hybrid methods are highly individualized, a common understanding of how to systematically construct synergetic practices is missing. In this article, we make a first step towards a statistical construction procedure for hybrid methods. Grounded in 1467 data points from a large-scale practitioner survey, we study the question: What are hybrid methods made of, and how can they be systematically constructed? Our findings show that only eight methods and a few practices build the core of modern software development. Using an 85% agreement level in the participants' selections, we provide examples illustrating how hybrid methods can be characterized by the practices they are made of. Furthermore, using this characterization, we develop an initial construction procedure, which allows for defining a method frame and enriching it incrementally to devise a hybrid method using ranked sets of practices.
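The 85% agreement level can be made concrete with a small sketch: a practice belongs to a hybrid method's characterizing practice set if at least 85% of the respondents using that method selected it. The survey matrix and practice names below are invented for illustration.

```python
# Agreement-based selection of characterizing practices (illustrative data).
import numpy as np

practices = ["code review", "daily standup", "pair programming", "backlog"]
# Rows: respondents using a given method; columns: practice selected (1) or not.
selections = np.array([
    [1, 1, 0, 1],
    [1, 1, 1, 1],
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 0, 1],
])

agreement = selections.mean(axis=0)        # share of respondents per practice
core = [p for p, a in zip(practices, agreement) if a >= 0.85]
print(core)  # practices that characterize the hybrid method at the 85% level
```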
Tissue constructs of physiologically relevant scale require a vascular system to maintain cell viability. However, in vitro vascularization of engineered tissues is still a major challenge. Successful approaches are based on a feeder layer (FL) to support vascularization. Here, we investigated whether the supporting effect on the self‐assembled formation of prevascular‐like structures by microvascular endothelial cells (mvECs) originates from the FL itself or from its extracellular matrix (ECM). Therefore, we compared the influence of ECM, either derived from adipose‐derived stem cells (ASCs) or adipogenically differentiated ASCs, with the classical cell‐based FL. All cell‐derived ECM (cdECM) substrates enabled mvEC growth with high viability. Prevascular‐like structures were visualized by immunofluorescence staining of endothelial surface protein CD31 and could be observed on all cdECM and FL substrates but not on control substrate collagen I. On adipogenically differentiated ECM, longer and higher branched structures could be found compared with stem cell cdECM. An increased concentration of proangiogenic factors was found in cdECM substrates and FL approaches compared with controls. Finally, the expression of proteins associated with tube formation (E‐selectin and thrombomodulin) was confirmed. These results highlight cdECM as promising biomaterial for adipose tissue engineering by inducing the spontaneous formation of prevascular‐like structures by mvECs.
Based on a survey among customers of seven German municipal utilities, we estimate two regression models to identify the most prospective customer segments and their preferences and motivations for participating in peer-to-peer (P2P) electricity trading and develop implications for decision-makers in the energy sector and policy-makers for this currently relatively unknown product. Our results show a large general openness of private households towards P2P electricity trading, which is also the main predictor of respondents' intention to participate. It is mainly influenced by individuals’ environmental attitude, technical interest, and independence aspiration. Respondents with the highest willingness to participate in P2P electricity trading are mainly motivated by the ability to share electricity, and to a lesser extent by economic reasons. They also have stronger preferences for innovative pricing schemes (service bundles, time-of-use tariffs). Differences between individuals can be observed depending on their current ownership (prosumers) or installation probability of a microgeneration unit (consumers, planners). Rather than current prosumers, especially planners willing to install microgeneration in the foreseeable future are considered to be the most promising target group for P2P electricity trading. Finally, our results indicate that P2P electricity trading could be a promising niche option in the German energy transition.
In digitally transformed working environments, employees organize their working time, their place of work, and the way they complete tasks to a greater extent themselves. Companies that want to increase the degree of self-organization in the course of the transformation process face a complex challenge. Self-organization affects numerous elements of the organization, such as work tasks and roles, leadership, rules, and competencies. On the basis of an empirically developed frame of reference, the Digitalisierungsatlas (digitalization atlas), the various elements can be considered in an integrative way and the interactions between the dimensions can be examined. If self-organization is considered in terms of employees' autonomy to shape work tasks and their own role in the organization themselves, the interactions between the organizational dimensions and leadership are particularly relevant. The tensions between these dimensions are examined in more detail. Overall, the article shows that self-organization cannot be understood as an independent phenomenon but always interacts with other dimensions.
Elasticity is considered to be the most beneficial characteristic of cloud environments, which distinguishes the cloud from clusters and grids. Whereas elasticity has become mainstream for web-based, interactive applications, it is still a major research challenge how to leverage elasticity for applications from the high-performance computing (HPC) domain, which heavily rely on efficient parallel processing techniques. In this work, we specifically address the challenges of elasticity for parallel tree search applications. Well-known meta-algorithms based on this parallel processing technique include branch-and-bound and backtracking search. We show that their characteristics render static resource provisioning inappropriate and the capability of elastic scaling desirable. Moreover, we discuss how to construct an elasticity controller that reasons about the scaling behavior of a parallel system at runtime and dynamically adapts the number of processing units according to user-defined cost and efficiency thresholds. We evaluate a prototypical elasticity controller based on our findings by employing several benchmarks for parallel tree search and discuss the applicability of the proposed approach. Our experimental results show that, by means of elastic scaling, the performance can be controlled according to user-defined thresholds, which cannot be achieved with static resource provisioning.
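The following schematic control loop conveys the idea of such an elasticity controller; the thresholds, the cost model, and the measured_efficiency() probe are placeholders rather than the controller evaluated in the paper.

```python
# Reactive elasticity control loop with efficiency and cost thresholds (sketch).
import random

MIN_WORKERS, MAX_WORKERS = 2, 64
EFFICIENCY_MIN = 0.7          # user-defined efficiency threshold
COST_PER_WORKER_HOUR = 0.05   # assumed cost model
BUDGET_PER_HOUR = 2.0         # user-defined cost threshold

def measured_efficiency(workers):
    # Placeholder for a runtime probe; efficiency typically decays with scale.
    return max(0.0, 1.0 - 0.01 * workers + random.uniform(-0.02, 0.02))

workers = 4
for step in range(10):                            # one iteration per interval
    eff = measured_efficiency(workers)
    cost = workers * COST_PER_WORKER_HOUR
    if eff >= EFFICIENCY_MIN and cost < BUDGET_PER_HOUR:
        workers = min(MAX_WORKERS, workers * 2)   # scale out
    elif eff < EFFICIENCY_MIN:
        workers = max(MIN_WORKERS, workers // 2)  # scale in
    print(step, workers, round(eff, 2))
```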
Different types of raw cotton were investigated using a commercial ultraviolet-visible/near-infrared (UV-Vis/NIR) spectrometer (210–2200 nm) as well as a home-built setup for NIR hyperspectral imaging (NIR-HSI) in the range 1100–2200 nm. UV-Vis/NIR reflection spectroscopy reveals the dominant role that proteins, hydrocarbons and hydroxyl groups play in the structure of cotton. NIR-HSI shows a similar result. The experimentally obtained data in combination with principal component analysis (PCA) provide a general differentiation of different cotton types. For UV-Vis/NIR spectroscopy, the first two principal components (PCs) represent 82% and 78% of the total data variance for the UV-Vis and NIR regions, respectively. For NIR-HSI, due to the large amount of data acquired, two methodologies for data processing were applied, at low and at high lateral resolution. In the first method, the average of the spectra from one sample was calculated, and in the second method the spectra of each pixel were used. Both methods are able to explain ≥90% of the total variance with the first two PCs. The results show that it is possible to distinguish between different cotton types based on a few selected wavelength ranges. The combination of HSI and multivariate data analysis has strong potential for industrial applications due to its short acquisition time and low development cost. This study opens a novel possibility for further development of this technique towards real large-scale processes.
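A compact sketch of the chemometric step is given below: PCA applied to a matrix of spectra (rows: samples, columns: wavelengths), reporting the variance captured by the first two components. The spectra are synthetic stand-ins for the measured cotton data.

```python
# PCA on a spectral matrix (synthetic NIR-like spectra for two sample types).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
wavelengths = np.linspace(1100, 2200, 300)
# Two synthetic cotton types: a shared baseline plus type-specific absorption.
base = np.exp(-((wavelengths - 1600) / 300) ** 2)
spectra = np.vstack(
    [base + 0.10 * np.exp(-((wavelengths - 1450) / 40) ** 2)
     + 0.01 * rng.standard_normal(300) for _ in range(10)]
    + [base + 0.10 * np.exp(-((wavelengths - 1940) / 40) ** 2)
       + 0.01 * rng.standard_normal(300) for _ in range(10)]
)

pca = PCA(n_components=2)
scores = pca.fit_transform(spectra)       # PC1 scores separate the two types
print(pca.explained_variance_ratio_)      # variance captured by PC1 and PC2
```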
The data presented in this article characterize the thermomechanical and microhardness properties of a novel melamine-formaldehyde resin (MF) intended for the use as a self-healing surface coating. The investigated MF resin is able to undergo reversible crosslinking via Diels Alder reactive groups. The microhardness data were obtained from nanoindentation measurements performed on solid resin film samples at different stages of the self-healing cycle. Thermomechanical analysis was performed under dynamic load conditions. The data provide supplemental material to the manuscript published by Urdl et al. 2020 (https://doi.org/10.1016/j.eurpolymj.2020.109601) on the self-healing performance of this resin, where a more thorough discussion on the preparation, the properties of this coating material and its application in impregnated paper-based decorative laminates can be found.
Thermoplastic polymers like ethylene-octene copolymer (EOC) may be grafted with silanes via reactive extrusion to enable subsequent crosslinking for advanced biomaterials manufacture. However, this reactive extrusion process is difficult to control and it is still challenging to reproducibly arrive at well-defined products. Moreover, high grafting degrees require a considerable excess of grafting reagent. A large proportion of the silane passes through the process without reacting and needs to be removed at great expense by subsequent purification. This results in unnecessarily high consumption of chemicals and a rather resource-inefficient process. It is thus desired to be able to define desired grafting degrees with optimum grafting efficiency by means of suitable process control. In this study, the continuous grafting of vinyltrimethoxysilane (VTMS) on ethylene-octene copolymer (EOC) via reactive extrusion was investigated. Successful grafting was verified and quantified by 1H-NMR spectroscopy. The effects of five process parameters and their synergistic interactions on grafting degree and grafting efficiency were determined using a face-centered experimental design (FCD). Response surface methodology (RSM) was applied to derive a causal process model and define process windows yielding arbitrary grafting degrees between <2 and >5% at a minimum waste of grafting agent. It was found that the reactive extrusion process was strongly influenced by several second-order interaction effects making this process difficult to control. Grafting efficiencies between 75 and 80% can be realized as long as grafting degrees <2% are admitted.
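The sketch below illustrates the response-surface step under simplifying assumptions: a quadratic model with interaction terms is fitted to coded factor levels. For brevity it uses a full 3x3 grid of two factors instead of the study's five-factor face-centered design, and the response values are synthetic.

```python
# Quadratic response-surface fit on coded factor levels (illustrative).
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Coded factor levels (-1, 0, +1): e.g. silane dosage and screw speed.
levels = [-1, 0, 1]
X = np.array([(a, b) for a in levels for b in levels])
# Synthetic grafting degree with curvature and an interaction effect.
y = 3.0 + 1.2 * X[:, 0] + 0.5 * X[:, 1] - 0.4 * X[:, 0] * X[:, 1] - 0.3 * X[:, 0] ** 2

poly = PolynomialFeatures(degree=2, include_bias=False)
Xq = poly.fit_transform(X)                # x1, x2, x1^2, x1*x2, x2^2
model = LinearRegression().fit(Xq, y)
names = poly.get_feature_names_out(["dosage", "speed"])
print(dict(zip(names, model.coef_.round(2))))  # recovered effect sizes
```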
The judgement of the German Federal Constitutional Court (BVerfG) of 5 May 2020 marks the end point, and at the same time a new beginning, of a years-long constitutional and economic dispute. In essence, it concerns the constituent principles of the Eurozone and the mandate of the European Central Bank (ECB). The EU Treaty defines the guardrails of the Economic and Monetary Union (EMU) in the field of tension between Articles 119, 123 and 125 of the Treaty on the Functioning of the European Union (TFEU). Accordingly, sovereignty over economic policy rests, following the principle of liability and control, solely with the Member States. The institutions of the European Union (EU) and the Court of Justice of the European Union (CJEU), however, regularly interpret these guardrails with wide discretion, guided by the notion of an "ever closer union" in Article 1 of the Treaty on European Union (TEU).
Some widely used optical measurement systems require a scan in wavelength or in one spatial dimension to measure the topography in all three dimensions. Novel hyperspectral sensors based on an extended Bayer pattern have high potential to solve this issue, as they can measure three dimensions in a single shot. This paper presents a detailed examination of such a hyperspectral sensor, including a description of the measurement setup. The evaluated sensor (Ximea MQ022HG-IM-SM5X5-NIR) offers 25 channels based on Fabry–Pérot filters. The setup illuminates the sensor with discrete wavelengths under a specified angle of incidence. This allows the spatial and angular response to the illumination to be characterized for every channel of each macropixel of the tested sensor. The results of the characterization form the basis for a spectral reconstruction of the signal, which is essential to obtain an accurate spectral image. It turned out that irregularities in the signal response of the individual filters are present across the whole sensor.
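The spectral reconstruction mentioned above can be pictured as a linear inverse problem: each macropixel yields 25 channel readings whose (measured) spectral responses form a matrix, and the scene spectrum is recovered by regularized least squares. The sketch below assumes synthetic responses and an arbitrary ridge parameter; it is illustrative only, not the paper's method.

```python
# Hedged sketch of spectral reconstruction as regularized least squares.
# Channel responses, spectrum and noise are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(3)
n_channels, n_bands = 25, 60
R = np.abs(rng.normal(size=(n_channels, n_bands)))             # assumed channel responses
true_spectrum = np.exp(-((np.arange(n_bands) - 30) / 10.0) ** 2)
signal = R @ true_spectrum + rng.normal(0, 1e-3, n_channels)   # simulated sensor readout

# Ridge regularization, since 25 channels underdetermine 60 spectral bands
lam = 1e-2
estimate = np.linalg.solve(R.T @ R + lam * np.eye(n_bands), R.T @ signal)
print(np.round(estimate[28:33], 3))                            # values near the peak
```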
Here, the effects of substituting portions of fossil-based phenol in phenol-formaldehyde resin with renewable lignin from two different sources are investigated using a factorial screening experimental design. Among the resins consumed by the wood-based industry, phenolics, prepared by condensing phenol with formaldehyde (PF), are one of the most important types used for impregnation, coating or gluing purposes. One major use of PF is as matrix polymer for decorative laminates in exterior cladding and wet-room applications. Important requirements for such PFs are favorable flow properties (low viscosity), rapid curing behavior (high reactivity) and sufficient self-adhesion capacity (high residual curing potential). Partially substituting phenol in PF with bio-based phenolic co-reagents like lignin modifies the physicochemical properties of the resulting resin. In this study, phenol-formaldehyde formulations were synthesized in which either 30 % or 50 % (by weight) of the phenol monomer was substituted with either sodium lignosulfonate or kraft lignin. The effect of modifying the lignin by phenolation before incorporation into the resin synthesis was also investigated. The resins so obtained were characterized by Fourier transform infrared (FTIR) spectroscopy, size exclusion chromatography (SEC), differential scanning calorimetry (DSC), rheology, and measurements of contact angle and surface tension using the Wilhelmy plate method and drop shape analysis.
Improvement of a three-layered in vitro skin model for topical application of irritating substances
(2020)
In the field of skin tissue engineering, the development of physiologically relevant in vitro skin models comprising all skin layers, namely epidermis, dermis and subcutis, is a great challenge. Increasing regulatory requirements and the ban on animal experiments for substance testing demand the development of reliable and in vivo-like test systems that enable high-throughput screening of substances. So far, however, the reproducibility and applicability of in vitro testing have been insufficient due to fibroblast-mediated contraction. To overcome this pitfall, an advanced three-layered skin model was developed. While the epidermis of standard skin models showed an 80% contraction, the initial epidermal area of our advanced skin models was maintained. The improved barrier function of the advanced models was quantified by an indirect barrier function test and a permeability assay. Histochemical and immunofluorescence staining of the advanced model showed well-defined epidermal layers, a dermal part with distributed human dermal fibroblasts, and a subcutis with round-shaped adipocytes. The successful response of these advanced three-layered models in skin irritation testing demonstrated their suitability as an in vitro model for these clinical tests: only the advanced model classified irritative and non-irritative substances correctly. These results indicate that the advanced setup of the three-layered in vitro skin model maintains skin barrier function, making it more suitable for irritation testing.
Impregnated paper-based decorative laminates prepared from lignin-substituted phenolic resins
(2020)
High Pressure Laminate (HPL) panels consist of stacks of self-gluing paper sheets soaked with phenol-formaldehyde (PF) resins. An important requirement for such PFs is that they must rapidly penetrate and saturate the paper pores. Partially substituting phenol with bio-based phenolic chemicals like lignin changes the physico-chemical properties of the resin and affects its ability to penetrate the paper. In this study, PF formulations containing different proportions of lignosulfonate and kraft lignin were used to prepare paper-based laminates. The penetration of a kraft paper sheet was characterized with a recently introduced device that measures the conductivity between both sides of the paper sheet after a drop of resin is placed on the surface and allowed to penetrate. The main target value was the time required for a specific resin to completely penetrate the defined paper sample ("penetration time"). This penetration time generally depends on the molecular weight distribution, the flow behavior and the polarity of the resin, which in turn depend on the manufacturing conditions of the resin. In the present study, the influences of three process factors, (1) the type of lignin used for substitution, (2) lignin modification by phenolation and (3) the degree of phenol substitution, on the penetration times of various lignin-phenolic hybrid impregnation resins were studied using a complete two-level, three-factor experimental design. Thin laminates made with the resins diluted in methanol were mechanically tested in terms of tensile and flexural strain, and their cross-sections were studied by light microscopy.
Here, we report the continuous peroxide-initiated grafting of vinyltrimethoxysilane (VTMS) onto a standard polyolefin by means of reactive extrusion to produce a functionalized liquid ethylene-propylene copolymer (EPM). The effects of the process parameters governing the grafting reaction and their synergistic interactions are identified, quantified and used in a mathematical model of the extrusion process. The VTMS and peroxide concentrations and the extruder temperature setting were systematically studied as process variables for their influence on grafting and on the relative grafting degree using a face-centered central composite design (FCD). The grafting degree was quantified by 1H NMR spectroscopy. Response surface methodology (RSM) was used to determine the most efficient grafting process in terms of chemical usage and graft yield. Within the defined processing window, it was possible to make precise predictions of the grafting degree while at the same time achieving the highest possible relative degree of grafting.
This book presents an empirical investigation of the efforts that multinational pharmaceutical companies undertake in order to find a business model that allows for profitable access to Bottom of the Pyramid (BoP) markets. The Bottom of the Pyramid in Africa is frequently mentioned as an attractive market due to its sheer size. Yet most companies struggle to access it because of the low price level, difficult physical market access and challenges when it comes to payment.
More specifically, the book investigates the following business model-related questions: Do pharmaceutical companies provide products that meet the needs of the BoP? What characterizes the value generation of the company? What revenue model leads to a profitable business, and what role does a network of partners play in the business model?
Findings reveal that there is no ‘one-size-fits-all’ answer to these questions. Providing continuous availability, affordability combined with good quality of goods and services, creating health awareness, and localizing business to achieve a level of inclusiveness are essential prerequisites for success. In its last chapter, the book provides a business model prototype that accounts for these key success factors for business at the Bottom of the Pyramid and points to further research topics.
For businesses, it is important to know within which time limits a buyer must assert its rights in the case of material defects. If the business is the seller, it can only relax once the limitation period has expired and be certain that no further warranty claims can be brought against it. In domestic transactions, sellers and buyers have become accustomed to the standard two-year period of the German Civil Code (BGB). Which periods apply in cross-border business, by contrast, is often unclear, because national limitation periods frequently differ: in Europe alone, limitation periods for warranty claims under sales law range from six months to six years.
Bone tissue is highly vascularized. The crosstalk of vascular and osteogenic cells is responsible not only for the formation of these strongly divergent tissue types but also for their physiological maintenance and repair. Extrusion-based bioprinting presents a promising fabrication method for bone replacement, as it allows the production of large-volume constructs that can be tailored to individual tissue defect geometries. In this study, we used the all-gelatin-based toolbox of methacryl-modified gelatin (GM), non-modified gelatin (G) and acetylated GM (GMA) to tailor, by simple blending, both the properties of the bioink towards improved printability and the properties of the crosslinked hydrogel towards enhanced support of vascular network formation. The vasculogenic behavior of human dermal microvascular endothelial cells (HDMECs) and human adipose-derived stem cells (ASCs) was evaluated in the different hydrogel formulations for 14 days. Co-culture constructs including a vascular component and an osteogenic component (i.e. a bone bioink based on GM, hydroxyapatite and ASCs) were fabricated via extrusion-based bioprinting. The bioprinted co-culture constructs exhibited functional tissue-specific cells whose interplay positively affected the formation and maintenance of vascular-like structures. The setup further enabled the deposition of bone matrix-associated proteins such as collagen type I, fibronectin and alkaline phosphatase over the 30-day culture.
Azide-bearing cell-derived extracellular matrices (“clickECMs”) have emerged as a highly exciting new class of biomaterials. They conserve substantial characteristics of the natural extracellular matrix (ECM) while simultaneously offering small abiotic functional groups that enable bioorthogonal bioconjugation reactions. Despite their attractiveness, investigating their biomolecular composition is very challenging due to the insoluble and highly complex nature of cell-derived matrices (CDMs). Yet thorough qualitative and quantitative analysis of the overall material composition, organisation, localisation and distribution of typical ECM-specific biomolecules is essential for the consistent advancement of CDMs and for understanding the prospective functions of the developed biomaterial. In this study, we evaluated frequently used methods for the analysis of complex CDMs. Sodium dodecyl sulphate polyacrylamide gel electrophoresis (SDS-PAGE) and (immuno)histochemical staining methods in combination with several microscopic techniques were found to be well suited. Commercially available colorimetric protein assays turned out to deliver inaccurate information on CDMs. In contrast, we determined the nitrogen content of CDMs by elemental analysis and converted it into total protein content using conversion factors calculated from matching amino acid compositions. The amount of insoluble collagens was assessed based on the hydroxyproline content. The Sircol™ assay was identified as a suitable method to quantify soluble collagens, while the Blyscan™ assay was found to be well suited for the quantification of sulphated glycosaminoglycans (sGAGs). Finally, we propose a series of suitable methods to reliably characterise the biomolecular composition of fibroblast-derived clickECM.
The promise of EVs is twofold: first, rejuvenating a transport sector that still heavily depends on fossil fuels, and second, integrating intermittent renewable energies into the power mix. However, it is still not clear how electricity networks will cope with the predicted increase in EVs and their charging demand, especially in combination with conventional energy demand. This paper proposes a methodology for predicting the impact of EV charging behavior on the electricity grid. The model simulates the driving and charging behavior of heterogeneous EV drivers who differ in their mobility patterns, decision-making heuristics and charging strategies. The simulations show that uncoordinated charging results in clustering of the charging load. In contrast, decentralized coordination makes it possible to fill the valleys of the conventional load curve and to integrate EVs without a costly expansion of the electricity grid.
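A toy version of the contrast between uncoordinated and coordinated (valley-filling) charging can convey the mechanism; the load shape, fleet size and charger ratings below are assumptions, and the paper's actual agent-based model is far richer.

```python
# Illustrative toy model (not the paper's simulation): aggregate charging load
# of an EV fleet under uncoordinated vs. valley-filling charging.
import numpy as np

rng = np.random.default_rng(1)
hours = np.arange(24)
base_load = 40 + 20 * np.sin((hours - 6) / 24 * 2 * np.pi)   # assumed conventional load, MW

n_evs, p_charge, need_h = 2000, 0.011, 3                     # 11 kW chargers, 3 h demand each
arrivals = rng.normal(18, 1.5, n_evs).astype(int) % 24       # most drivers plug in after work

uncoordinated = np.zeros(24)
for a in arrivals:                                           # charge immediately on arrival
    for h in range(need_h):
        uncoordinated[(a + h) % 24] += p_charge

coordinated = np.zeros(24)
for a in arrivals:                                           # pick the lowest-load overnight hours
    window = [(a + h) % 24 for h in range(12)]
    best = sorted(window, key=lambda h: base_load[h] + coordinated[h])[:need_h]
    for h in best:
        coordinated[h] += p_charge

print("peak, uncoordinated:", (base_load + uncoordinated).max())
print("peak, coordinated:  ", (base_load + coordinated).max())
```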
Machine learning (ML) techniques are rapidly evolving, both in academia and in practice. However, enterprises show different maturity levels in successfully implementing ML techniques. We therefore review the state of ML adoption in enterprises. We find that ML technologies are being increasingly adopted, but that small and medium-sized enterprises (SMEs) struggle with their introduction in comparison to larger enterprises. In order to identify enablers and success factors, we conducted a qualitative empirical study with 18 companies in different industries. The results show that SMEs in particular fail to apply ML technologies due to insufficient ML know-how. However, partners and appropriate tools can compensate for this lack of resources. We discuss approaches to bridge the gap for SMEs.
The extracellular matrix (ECM) naturally surrounds cells in humans and therefore represents the ideal biomaterial for tissue engineering. ECMs from different tissues exhibit different compositions and physical characteristics. Thus, the ECM provides not only physical support but also crucial biochemical signals that influence cell adhesion, morphology, proliferation and differentiation. Besides native ECM from mature tissue, ECM can also be obtained from the in vitro culture of cells. In this study, we aimed to highlight the supporting effect of cell-derived ECM (cdECM) on adipogenic differentiation. ASCs were seeded on top of cdECM from ASCs (scdECM) or pre-adipocytes (acdECM). The impact of the ECM on cellular activity was determined by LDH, WST I and BrdU assays. The supporting effect of cdECM substrates on adipogenic differentiation was determined by oil red O staining and subsequent quantification. The results revealed no effect of cdECM substrates on cellular activity. Regarding adipogenic differentiation, however, a supporting effect of cdECM substrates was observed compared to the control. With these results, we confirm cdECM as a promising biomaterial for adipose tissue engineering.
The generous feed-in tariffs (FiTs) introduced in Germany, which resulted in major growth in decentralized solar photovoltaic (PV) systems, will phase out in the coming years, leaving many of the existing distributed generation assets stranded. This challenge creates an opportunity for community-focused energy utilities, such as Elektrizitätswerke Schönau eG (EWS) based in Schönau, Germany, to try a new approach to assist their customers in making the transition to a more sustainable future. This chapter describes how EWS is developing products and offering community-based solutions, including peer-to-peer trading using automated platforms. Such innovative offerings may lead to successful differentiation in a competitive and highly decentralized future.
Public transport maps are typically designed to support route-finding tasks for passengers while also providing an overview of stations, metro lines and city-specific attractions. Most such maps are designed as static representations, perhaps placed in a metro station or printed in a travel guide. In this paper we describe a dynamic, interactive public transport map visualization enhanced by additional views of dynamic passenger data at different levels of temporal granularity. Moreover, we provide extra statistical information in the form of density plots, calendar-based visualizations and line graphs. All this information is linked to the contextual metro map to give the viewer insights into the relations between points in time and the typical routes taken by passengers. We illustrate the usefulness of our interactive visualization by applying it to the railway system of Hamburg, Germany, taking the extra passenger data into account. As a further indication of the usefulness of the interactively enhanced metro maps, we conducted a user experiment with 20 participants.
JumpAR combines the world of augmented reality (AR) with the world-famous jump 'n' run genre in a mobile game. Players create an individual course in their real surroundings and steer their character through it across virtual platforms. The JumpAR prototype, developed with Unity, was evaluated in a user test after the core functions and mechanics had been implemented. Integrating real objects from the player's surroundings into the game flow tightly links the virtual and the real world, which constitutes a new form of AR interaction for mobile games.
Systemische Betrachtung des therapeutischen Roboters Paro im Vergleich zu dem Haustierroboter AIBO
(2020)
Today, robots are no longer found only in industry but are increasingly used in private spheres of life. One example is the social therapy robot Paro. Modelled on the behaviour and appearance of a young seal, it expresses emotions and is used primarily in nursing homes, where it has positive effects on the well-being of people in need of care. This paper presents a systemic analysis of Paro, examining its system context, use cases, requirements and structure. It then analyzes the pet robot AIBO, which resembles a puppy and mainly serves to entertain private users. Similarities and differences between the two systems are worked out. It becomes apparent that both systems primarily keep their users company, but that they have different requirements and are used in different application domains. AIBO also has more diverse capabilities and a higher degree of mobility than Paro, which is reflected in a more complex hardware structure.
A methodology for designing planar spiral antennas with a feeding network embedded within a dielectric is presented. To avoid a purely academic exercise that could not be manufactured with available standard technologies, the approach takes manufacturing process requirements into account through the choice of materials used in the simulation. General design rules are provided. They encompass, among others, selection criteria for the dielectric material, aspects to consider when sketching the radiating element, and guidelines for implementing the feeding network. A rule of thumb was found that may be helpful in determining the height of the antenna's supporting substrate. The appeal of the method lies in the fact that it simplifies the design process and helps to minimize errors, saving time and money. The approach also enables the design of a compact, small-size spiral antenna as an antenna-in-package (AiP) and provides the opportunity to assemble the antenna with other RF components/systems on the same layer stack or on the same integration platform.
The rapid development of consumer sensor technology suggests that the decentrally collected data available from patients' everyday lives could be of clinical benefit for monitoring individual health status. Verifying this assumption requires providing a suitable platform in everyday clinical practice. To this end, the bwHealthApp is being developed, which maps both the current range and the ongoing evolution of sensor technology onto clinical use. Its flexible design makes it possible to evaluate the clinical benefit for personalized medicine. In addition, the bwHealthApp offers a feasibility-oriented contribution to the discussion of open legal, regulatory and ethical questions of digitalization in medicine in Germany.
The livestock sector is growing steadily and is responsible for around 18% of global greenhouse gas emissions, which is more than the global transport sector (Steinfeld et al. 2006). This paper examines the potential of social marketing to reduce meat consumption. The aim is to understand consumers' motivation in diet choices and to learn what opportunities social marketing can provide to counteract negative environmental and health trends. The authors believe that research to answer this question should start in metropolitan areas, because measures should be especially effective there. Based on the Theory of Planned Behaviour (TPB, Ajzen 1991) and the Technology Acceptance Model by Huijts et al. (2012), an online study with participants from the metropolitan region (n = 708) was conducted in which central socio-psychological constructs for a reduction in meat consumption were examined. It was shown that attitude, personal norm and habit have a critical influence on the intention to reduce meat consumption. A segmentation of consumers based on these factors led to three consumer clusters: vegetarians/flexitarians, potential flexitarians and convinced meat eaters. Potential flexitarians are an especially relevant target group for the development of social marketing measures to reduce meat consumption. In co-creation workshops with potential flexitarians from the metropolitan region, barriers to and benefits of reducing meat consumption were identified. Environmental protection, animal welfare and the desire for variety turned out to be the most relevant motivational factors. Based on these factors, consumers proposed a variety of social marketing measures, such as applications and labels informing about the environmental impact of meat products.
How can digitalization in the construction supply industry be mastered successfully? The abundance and complexity of the questions involved can be reduced to two core issues: What are the right contents and the essential value drivers of digitalization? And how should companies deal in future with the growing flood of information, rapidly increasing complexity and decreasing predictability?
This article presents a framework that helps construction suppliers derive their digital target vision and its value drivers systematically from customer value. The framework takes the particularities of the construction supply industry into account but can, with slight adaptations, be applied to other industries as well. Building on the target vision, companies can define which technological, personnel and organizational changes are required to realize it. To deal flexibly with the dynamic changes in their ecosystem and with cultural challenges, five influencing factors are identified that companies must consider when developing the evolutionary competence this requires.
This article presents a new approach that permits gravity-reduced navigation within a VR environment, for example a simulated moonwalk. The Cyberith Virtualizer is used for navigation in the VR environment. Gravity is simulated by means of an adjustable harness suspended from elastic ropes, allowing graduated levels of gravity compensation. A spaceship scenario and a lunar surface were created as environments, in which the current application allows simple interactions. Following existing gravity offload systems, the solution is called ViRGOS. ViRGOS has already been deployed at various visitor appointments and university events, so initial user feedback could be gathered.
Here, we report the mechanical and water sorption properties of a green composite based on Typha latifolia fibres. The composite was prepared either completely binderless or bonded with 10 % (w/w) of a bio-based resin consisting of a mixture of an epoxidized linseed oil and a tall-oil-based polyamide. The flexural modulus of elasticity, the flexural strength and the water absorption of hot-pressed Typha panels were measured, and the influence of pressing time and panel density on these properties was investigated. The cure kinetics of the bio-based resin were analyzed by differential scanning calorimetry (DSC) in combination with the iso-conversional kinetic analysis method of Vyazovkin to derive the curing conditions required for a completely cured resin. For the binderless Typha panels, the best technological properties were achieved at high panel density. Adding 10 % of the binder resin significantly improved the flexural strength and especially the water absorption.
In recent years, the development and application of decellularized extracellular matrices (ECMs) for use as biomaterials have grown rapidly. These cell-derived matrices (CDMs) represent highly bioactive and biocompatible materials consisting of a complex assembly of biomolecules. Even though CDMs mimic the natural microenvironment of cells in vivo very closely, they still lack specifically addressable functional groups, which are often required to tailor a biomaterial's functionality by bioconjugation. To overcome this limitation, metabolic glycoengineering has emerged as a powerful tool to equip CDMs with chemical groups such as azides. These small chemical handles are known for their ability to undergo bioorthogonal click reactions, a desirable reaction type for bioconjugation. However, the insolubility of the ECM makes its processing very challenging. In this contribution, we isolated both the unmodified ECM and azide-modified clickECM by osmotic lysis. In a first step, these matrices were concentrated to remove excess water from the decellularization step. Next, the hydrogel-like ECM and clickECM films were mechanically fragmentized, resulting in easy-to-pipette suspensions with fragment sizes ranging from 7.62 to 31.29 μm (as indicated by the mean d90 and d10 values). The biomolecular composition was not impaired, as proven by immunohistochemistry. The suspensions were used for the reproducible generation of surface coatings, which proved to be homogeneous in terms of ECM fragment sizes and coating thicknesses (the mean coating thickness was 33.2 ± 7.3 μm). Furthermore, the coatings were stable against fluid-mechanical abrasion in a laminar flow cell. When primary human fibroblasts were cultured on the coated substrates, increased bioactivity was observed. By conjugating the azides within the clickECM coatings with alkyne-coupled biotin molecules, a bioconjugation platform was obtained in which the biotin–streptavidin interaction could be used. Its applicability was demonstrated by equipping the bioactive clickECM coatings with horseradish peroxidase as a model enzyme.
Today's industrial pattern-making methods rely on construction principles based on mathematical formulas and sizing charts. The result is two-dimensional flat patterns that can be converted into a three-dimensional garment. Because of their high linearity, these patterns are incapable of recreating the complexity of the human body, which results in insufficient fit. Subsequent pattern alterations require a high degree of experience and lead to an inefficient product development process. Draping, by contrast, is known to allow the development of more complex and demanding patterns that correspond more closely to the actual body shape. This method is therefore used in custom tailoring and haute couture to achieve perfect garment fit, but it is also time-consuming.
The challenge, then, is to improve the fit of garments and speed up production while maintaining good value for money. Reutlingen University is therefore working on the development of 3D-modelled body shapes for 3D draping, considering different layers of clothing such as jackets or coats. For this purpose, 3D modelling is used to develop 3D bodies that correspond to the finished dimensions of the garment. By flattening the modelled body, it is then possible to obtain an optimal 2D pattern. The conventional method and the newly developed method are compared by means of 3D simulation.
Finally, the visual fit test on the simulated basic patterns demonstrates that the newly developed methodology achieves a significantly better wrapping of the body. Unlike the basic patterns created with classical construction principles, only a few adjustments are necessary to obtain an optimized basic pattern. The analysis of body distance likewise shows that the newly developed basic patterns provide a more even enclosure of the body.
The process of producing customized bras is highly challenging. Although the need is very clear, the lingerie industry currently faces a lack of data, knowledge and expertise for realizing an automated process chain. Various studies and surveys have shown that the majority of women wear an incorrect bra size. Beyond aesthetic problems, this can entail health risks for wearers such as headaches, back problems or digestive problems. An important prerequisite for improvement is basic knowledge about the female breast, both in terms of body measurements and in terms of different breast shapes. The current bra sizing system defines a bra size solely by the relation between bust girth and underbust girth, and standardized cup forms do not do justice to the high variability of the human body. Since the bra type shapes the female breast, basic knowledge about the relation between the measurements and shapes of the clothed and the unclothed breast is missing.
In the present project, studies are conducted to explore the female breast and to derive new breast-specific body measurements, different breast shapes, and knowledge about deformation using existing bras.
Furthermore, an innovative process is being developed that leads from 3D scanning to individual and interactive pattern construction, allowing automatic pattern creation based on individual body measurements and the influence of different material parameters.
In the course of the presentation, the current project status will be shown, and future developments and project steps will be introduced.
In contrast to classical advertising, for instance, event marketing is a dynamic communication instrument that continuously brings forth trends and innovations. Its diverse applications and potentials make it possible to reach relevant target groups in keeping with the current zeitgeist, to create brand-relevant realities and experience worlds, to generate emotions and sympathy, and in this way to build a bond between the brand or company and its audience.
Zielgenau aus dem Hinterhalt
(2020)
Ambush marketing usually triggers strong reactions, among supporters and opponents alike. The idea of ambush marketing is to profit from the successes of sponsorship without taking on the obligations of an official sponsor. Ambushers hold no marketing rights to an event, yet their marketing activities nevertheless establish a connection to it. The line between infringing sponsors' rights and creative, innovative communication policy is often very thin.
Is there a buy button in the consumer's brain? And if so, how is it pressed? Neuromarketing may provide the answers. Neuromarketing is part of neuroeconomics and a relatively young discipline at the intersection of cognitive science, neuroscience and market research. Thanks to technological progress, neuroscience can deliver important insights for marketing, particularly for explaining consumer behaviour. By looking into the customer's brain, retail companies, for example, can address their customers in a more targeted way and thereby gain an advantage over competitors.
This article studies the current debate on Coronabonds and the idea of European public debt in the aftermath of the coronavirus pandemic. According to the EU Treaty, economic and fiscal policy remains within the sovereignty of the Member States. Joint European debt instruments are therefore risky and trigger moral hazard and free-riding in the Eurozone. We show that a blurring of the principle of liability and control impairs the present fiscal architecture and destabilizes the Eurozone. We recommend that Member States either make use of the existing fiscal architecture or establish a political union with full sovereignty in Europe. This policy conclusion is supported by the PSPP judgement of the Federal Constitutional Court of Germany of 5 May 2020, which initiated a lively debate about the future of the Eurozone and of Europe in general.
Purpose: Despite growing interest in the intersection of supply chain management (SCM) and management accounting (MA) in the academic debate, there is a lack of understanding regarding both the content and the delimitation of this topic. As of today, no common conceptualization of supply chain management accounting (SCMA) exists. The purpose of this study is to provide an overview of the research foci of SCMA in the scholarly debate of the past two decades. Additionally, it analyzes whether and to what extent the academic discourse on MA in supply chains has already found its way into SCM and MA higher education.
Design/methodology/approach: A content analysis of 114 higher education textbooks written in English or German is conducted.
Findings: The study finds that SC-specific concepts of MA are seldom covered in current textbooks of either discipline. The authors conclude that although there is an extensive body of scholarly research on SCMA concepts, there is a significant discrepancy with what is taught in higher education textbooks.
Practical implications: There is a large discrepancy between the extensive knowledge available in scholarly research and what is taught in both disciplines. This implies that graduates of both disciplines lack important knowledge and skills in controlling and accounting for supply chains. To bring about the necessary change, MA and SCM higher education must become more integrated.
Originality/value: To the best of the authors' knowledge, this study is the first of its kind to comprise a large textbook sample in both English and German. It is the first substantiated assessment of the current state of integration between SCM and MA in higher education.
In Germany, mobility is currently in a state of flux. Since June 2019, electric kick scooters (e-scooters) have been permitted on the roads, and this market is booming. This study employs a user survey to generate new data, supplemented by expert interviews, to determine whether such e-scooters are a climate-friendly means of transport. The environmental impacts are quantified using a life cycle assessment, resulting in a very accurate picture of e-scooters in Germany. The global warming potential of an e-scooter calculated in this study is 165 g CO2-eq./km, mostly due to materials and production (together accounting for 73% of the impact). By switching to e-scooters with swappable batteries, the global warming potential can be reduced by 12%. The lowest value of 46 g CO2-eq./km is reached if all possibilities are exploited and the life span of e-scooters is increased to 15 months. Compared with the emissions of the transport modes they replace, e-scooters are at best 8% above the modal split value of 39 g CO2-eq./km.
In this work, a brushless, harmonic-excited wound-rotor synchronous machine with special stator and rotor windings is investigated. The windings magnetically decouple the fundamental torque-producing field from the harmonic field required for the inductive power transfer to the field coil. In contrast to conventional harmonic-excited synchronous machines, the whole winding is utilized for both torque production and harmonic excitation, so no additional copper for auxiliary windings is needed. Different rotor topologies using rotating power electronic components are investigated, and their efficiencies are compared based on finite-element calculations and circuit analysis.
Energy-efficient electric control of drives is increasingly important for electric mobility and the manufacturing industries. Online dynamic optimization of induction machines is challenging due to the computational complexity involved and the variable power losses during dynamic operation. This paper proposes a simple technique for sub-optimal online loss optimization using rotor flux linkage templates for energy-efficient dynamic operation of induction machines. Such a template is a rotor flux linkage trajectory that is optimal for a specific scenario and is calculated in an offline optimization process. For a given scenario during real-time operation, the rotor flux linkage is then obtained by appropriately scaling the stored template.
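The template idea can be sketched in a few lines: an offline-optimized flux trajectory is stored and rescaled at run time for the current scenario. The square-root-of-torque scaling used below is a common loss-minimization heuristic for induction machines, assumed here purely for illustration; the trajectory and numbers are invented.

```python
# Minimal sketch of rescaling a stored rotor-flux template to a new scenario.
# Trajectory shape, torque values and the sqrt scaling law are assumptions.
import numpy as np

t = np.linspace(0.0, 0.5, 251)                     # time grid of the template, s
psi_template = 0.6 + 0.3 * (1 - np.exp(-t / 0.1))  # offline-optimized trajectory, Wb (assumed)
T_template, T_actual = 10.0, 16.0                  # template vs. current torque demand, Nm

# Loss-optimal flux grows roughly with the square root of torque (heuristic)
psi_scaled = psi_template * np.sqrt(T_actual / T_template)
print(psi_scaled[:5])                              # first samples of the rescaled trajectory
```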
Steady-state efficiency optimization techniques for induction motors are state of the art, and various methods have already been developed. This paper provides new insights into efficiency-optimized operation in the dynamic regime. It proposes an anticipative flux modification in order to decrease losses during torque and speed transients. The resulting trajectories are analyzed in a numerical study for different motors, and measurement results for one motor are given as well.
The motto of the Informatics Inside 2020 autumn conference is KInside. Once again, students look inside, examining methods, applications and interrelations more closely. The contributions are diverse and, in keeping with the degree programme, human-centered. The aspiration is that the topics revolve around people's needs and that the methods applied are not an end in themselves but are measured by their benefit to people.
Innovative capacity is one of the key success factors of the future and will strongly influence the difference between successful and failing companies (PWC, 2015). Young companies and start-ups in particular are known for their high capability to innovate. Established companies, by contrast, score less with new ideas but instead offer resources critical to innovation, established routines and economies of scale. An approach of steadily growing popularity that combines the capabilities and resources of established companies with the innovative power of start-ups is "intrapreneurship".
Value engineering in customer communication is a structured method for improving communication processes between companies. The concept takes up proven elements of technical value analysis and overhead value analysis and transfers them to customer communication. The approach offers a systematic procedure for scrutinizing and redesigning communication processes between supplier and customer. Value engineering in customer communication thus creates competitive advantages by optimizing communication.
This article provides an overview of the various options for accounting for an initial coin offering (ICO) on the issuer's equity and liabilities side under IFRS. The aim is to discuss the balance sheet classification of different types of tokens and to support issuers in designing tokens and in the subsequent accounting. The results show that while the existing standards suffice for classifying ICO tokens, a wide range of accounting treatments has to be considered, so detailed regulation in a dedicated IFRS appears difficult.
Customer orientation should be the core engine of every organisation, while IT can be considered the enabler for generating competitive advantages along customer processes in marketing, sales and service. Research shows that customer relationship management (CRM) enables organisations to perform better, and experience indicates that organisations focusing on customer orientation are more successful. With marketplace organisations such as Amazon, Alibaba or Conrad shaping the future of customer centricity and information technology, German B2B organisations need to shift their value contribution from product-centric to customer-centric. While these organisations are currently attempting to implement CRM software and to put their customers more into focus, the question remains how organisations approach the implementation of CRM and whether these attempts pay off in terms of business performance.
Here, we study resin cure and network formation of a solid melamine-formaldehyde pre-polymer over a large temperature range via dynamic temperature curing profiles. Real-time infrared spectroscopy is used to analyze the chemical changes during network formation and network hardening. By applying chemometrics (multivariate curve resolution, MCR), the essential chemical functionalities that constitute the network at a given stage of curing are mathematically extracted and tracked over time. The three spectral components identified by MCR were methylol-rich, ether-linkage-rich and methylene-linkage-rich resin entities. Based on the dynamic changes of their characteristic spectral patterns as a function of temperature, curing is divided into five phases: (I) a stationary phase with free methylols as the main chemical feature, (II) formation of a flexible network cross-linked by ether linkages, (III) formation of a rigid, ether-cross-linked network, (IV) further hardening via transformation of methylols and ethers into methylene cross-links, and (V) network consolidation via transformation of ether into methylene bridges. The presented spectroscopic/chemometric approach can be used as a methodological basis for the functionality design of MF-based surface films at the stage of laminate pressing, i.e., for tailoring the technological property profile of cured MF films using a causal understanding of the underlying chemistry based on molecular markers and spectroscopic fingerprints.
Modern mixed (HTAP) workloads execute fast update transactions and long-running analytical queries on the same dataset and system. In multi-version concurrency control (MVCC) systems, such workloads result in many short-lived versions and long version chains, as well as increased and frequent maintenance overhead.
Consequently, the pressure on the index increases significantly. Firstly, the frequent modifications cause frequent creation of new versions, yielding a surge in index maintenance overhead. Secondly, and more importantly, index scans incur extra I/O overhead to determine which of the resulting tuple versions are visible to the executing transaction (the visibility check), as current designs store version/timestamp information only in the base table, not in the index. Such an index-only visibility check is critical for HTAP workloads on large datasets.
In this paper we propose the Multi-Version Partitioned B-Tree (MV-PBT) as a version-aware index structure supporting index-only visibility checks and flash-friendly I/O patterns. The experimental evaluation indicates a 2x improvement for analytical queries and 15% higher transactional throughput under HTAP workloads. MV-PBT offers 40% higher transaction throughput compared to WiredTiger's LSM-Tree implementation under YCSB.
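Conceptually, an index-only visibility check means the index entry itself carries version timestamps, so a scan can filter versions against the transaction's snapshot without fetching the base table. The following sketch illustrates that idea under snapshot isolation; the record layout and names are assumptions, not the MV-PBT implementation.

```python
# Conceptual sketch (not MV-PBT): an index record carrying version timestamps
# so a scan can decide visibility without a base-table lookup.
from dataclasses import dataclass

@dataclass
class IndexEntry:
    key: int
    begin_ts: int          # timestamp of the transaction that created this version
    end_ts: int            # timestamp of the superseding version (or "infinity")

def visible(entry: IndexEntry, snapshot_ts: int) -> bool:
    """Snapshot-isolation style check: the version existed at snapshot time."""
    return entry.begin_ts <= snapshot_ts < entry.end_ts

INF = 2**63 - 1
chain = [IndexEntry(42, 10, 25), IndexEntry(42, 25, INF)]   # two versions of key 42
print([visible(e, snapshot_ts=20) for e in chain])          # [True, False]
```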
In this paper, we present a new approach for achieving robust performance of data structures, making it easier to reuse the same design across hardware generations as well as across workloads. The main idea is to strictly separate the data structure design from the strategies used to execute access operations, and to adjust these execution strategies by means of so-called configurations instead of hard-wiring an execution strategy into the data structure. In our evaluation we demonstrate the benefits of this configuration approach for individual data structures as well as for complex OLTP workloads.
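A minimal sketch of this separation, with all names invented: the structure stores the data, while the lookup strategy is passed in as a configuration and can be swapped without touching the structure itself.

```python
# Hedged sketch of the stated idea (names invented): the data structure is
# fixed, while the access-execution strategy is injected as a "configuration".
import bisect
from typing import Callable, List

LookupConfig = Callable[[List[int], int], bool]

def linear_lookup(data: List[int], key: int) -> bool:
    return key in data                      # robust for tiny data sets

def binary_lookup(data: List[int], key: int) -> bool:
    i = bisect.bisect_left(data, key)       # data is kept sorted by insert()
    return i < len(data) and data[i] == key

class ConfigurableSet:
    def __init__(self, lookup: LookupConfig):
        self.data: List[int] = []
        self.lookup = lookup                # chosen per workload/hardware, swappable

    def insert(self, key: int) -> None:
        bisect.insort(self.data, key)

    def contains(self, key: int) -> bool:
        return self.lookup(self.data, key)

s = ConfigurableSet(binary_lookup)          # reconfigure with linear_lookup for small sets
s.insert(7); s.insert(3)
print(s.contains(7), s.contains(5))         # True False
```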
The tale of 1000 cores: an evaluation of concurrency control on real(ly) large multi-socket hardware
(2020)
In this paper, we set out to revisit the results of "Staring into the Abyss [...] of Concurrency Control with [1000] Cores" and analyse in-memory DBMSs on today's large hardware. Contrary to the original authors' assumption, we do not see single-socket CPUs with 1000 cores today. Instead, multi-socket hardware has made its way into production data centres. Hence, we follow up on this prior work with an evaluation of the characteristics of concurrency control schemes on real production multi-socket hardware with 1568 cores. To our surprise, we made several interesting findings, which we report on in this paper.
Massive data transfers in modern data-intensive systems, resulting from low data locality and data-to-code system designs, hurt their performance and scalability. Near-data processing (NDP) and a shift to code-to-data designs may represent a viable solution, as packaging combinations of storage and compute elements on the same device has become feasible.
The shift towards NDP system architectures calls for a revision of established principles. Abstractions such as data formats and layouts typically spread across multiple layers in traditional DBMSs, and the way they are processed is encapsulated within these layers of abstraction. NDP-style processing requires an explicit definition of cross-layer data formats and accessors to ensure in-situ execution that optimally utilizes the properties of the underlying NDP storage and compute elements. In this paper, we make the case for such data format definitions and investigate the performance benefits under NoFTL-KV and the COSMOS hardware platform.
Massive data transfers in modern key/value stores, resulting from low data locality and data-to-code system designs, hurt their performance and scalability. Near-data processing (NDP) designs represent a feasible solution which, although not new, has yet to see widespread use.
In this paper we introduce nKV, a key/value store utilizing native computational storage and near-data processing. On the one hand, nKV can directly control data and computation placement on the underlying storage hardware. On the other hand, nKV propagates the data formats and layouts to the storage device, where software and hardware parsers and accessors are implemented. Both allow NDP operations to execute in a host-intervention-free manner, directly on physical addresses, and thus to better utilize the underlying hardware. Our performance evaluation is based on executing traditional KV operations (GET, SCAN) and complex graph-processing algorithms (Betweenness Centrality) in-situ, with 1.4x-2.7x better performance on real hardware, the COSMOS+ platform.
nKV in action: accelerating KV-stores on native computational storage with near-data processing
(2020)
Massive data transfers in modern data-intensive systems, resulting from low data locality and data-to-code system designs, hurt their performance and scalability. Near-data processing (NDP) designs represent a feasible solution which, although not new, has yet to see widespread use.
In this paper we demonstrate various NDP alternatives in nKV, a key/value store utilizing native computational storage and near-data processing. We showcase the execution of classical operations (GET, SCAN) and complex graph-processing algorithms (Betweenness Centrality) in-situ, with 1.4x-2.7x better performance due to NDP. nKV runs on real hardware, the COSMOS+ platform.
Despite strong political efforts in Europe, industrial small and medium-sized enterprises (SMEs) seem to neglect adopting practices for energy efficiency. Taking a cultural perspective, this study investigated what drives the establishment of energy efficiency and corresponding practices in SMEs. Based on 10 ethnographic case studies and a quantitative survey among 500 manufacturing SMEs, the results indicate the importance of everyday employee behavior in achieving energy savings. The studied enterprises consider behavior-related measures to be as important as technical measures. Raising awareness of energy issues within the organization therefore constitutes an essential leadership task, one that is often perceived as challenging and frustrating. It was concluded that embedding energy efficiency in corporate strategy, using a broad spectrum of different practices, and empowering and involving employees serve as major drivers in establishing energy efficiency within SMEs. Moreover, the findings reveal institutional influences that shape the meaning of energy efficiency for SMEs by raising attention to it within the enterprises and making energy efficiency decisions more likely. The main contribution of the paper is to offer an alternative perspective on energy efficiency in SMEs beyond the mere adoption of energy-efficient technology.
The automation of work by means of disruptive technologies such as Artificial Intelligence (AI) and Robotic Process Automation (RPA) is currently intensely discussed in business practice and academia. Recent studies indicate that many tasks conducted manually by humans today will no longer be in the future. In a similar vein, it is expected that new roles will emerge. The aim of this study is to analyze prospective employment opportunities in the context of RPA in order to foster our understanding of the pivotal qualifications, expertise and skills necessary to find an occupation in a completely changing world of work. The study is based on an explorative content analysis of 119 job advertisements related to RPA in Germany. The data was collected from major German online job platforms, qualitatively coded, and subsequently analyzed quantitatively. The research indicates that there are indeed employment opportunities, especially in the consulting sector. The positions require varied technological expertise, such as specific programming languages and knowledge of statistics. The results provide guidance for organizations and individuals on reskilling requirements for future employment. As many of the positions require profound IT expertise, the generally accepted view that existing employees affected by automation can simply be retrained for the emerging positions must be seen extremely critically. This paper contributes to the body of knowledge by providing a novel perspective on the ongoing discussion of employment opportunities and the reskilling demands of the existing workforce in the context of recent technological developments and automation.
The chemical synthesis of polysiloxanes from monomeric starting materials involves a series of hydrolysis, condensation and modification reactions with complex monomeric and oligomeric reaction mixtures. Real-time monitoring and precise process control of the synthesis are of great importance to ensure reproducible intermediates and products and can readily be performed by optical spectroscopy. In chemical reactions involving rapid and simultaneous functional group transformations and complex reaction mixtures, however, the spectroscopic signals are often ambiguous due to overlapping bands, shifting peaks and changing baselines. The univariate analysis of individual absorbance signals is hence often of limited use. In contrast, batch modelling based on the multivariate analysis of the time course of principal components (PCs) derived from the reaction spectra provides a more efficient tool for real-time monitoring. In batch modelling, not just single absorbance bands are used; instead, information over a broad range of wavelengths is extracted from the evolving spectral fingerprints and used for analysis. Process control can thereby be based on the numerous chemical and morphological changes taking place during synthesis, and "bad" (abnormal) batches can quickly be distinguished from normal ones by comparing the respective reaction trajectories in real time. In this work, FTIR spectroscopy was combined with multivariate data analysis for the in-line process characterization and batch modelling of polysiloxane formation. The synthesis was conducted under different starting conditions using various reactant concentrations. The complex spectral information was evaluated using chemometrics (principal component analysis, PCA). Specific spectral features at different stages of the reaction were assigned to the corresponding reaction steps. Reaction trajectories were derived by batch modelling using a wide range of wavelengths. Subsequently, the complexity was reduced again to the most relevant absorbance signals in order to derive a concept for a low-cost process-spectroscopic setup suitable for real-time process monitoring and reaction control.
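As an illustration of trajectory-based batch monitoring, the sketch below projects the spectra of a running batch into the PC space of a reference batch and flags large deviations; the spectra, noise level and threshold are invented, and real batch models use more elaborate statistics.

```python
# Illustrative sketch of batch trajectory monitoring with PCA. All data,
# noise levels and the flagging threshold are synthetic assumptions.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
reference = rng.random((100, 300))            # 100 time points x 300 wavenumbers ("good" batch)
running = reference + rng.normal(0, 0.05, reference.shape)   # new batch, slightly perturbed

pca = PCA(n_components=2).fit(reference)
ref_traj = pca.transform(reference)           # reference reaction trajectory in PC space
new_traj = pca.transform(running)             # trajectory of the running batch

# Flag the batch if any time point strays far from the reference trajectory
deviation = np.linalg.norm(new_traj - ref_traj, axis=1)
print("batch flagged abnormal:", bool((deviation > 3 * deviation.std()).any()))
```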
Expectations placed on energy utilities will grow: in the future, tasks such as developing digital products and services as well as ecological activities will gain relevance. This is shown by Reutlingen University's current survey of supervisory board members, managing directors and executives. Despite the expected changes, supervisory boards are aware of the pressure to professionalize but currently appear only moderately equipped for the future challenges facing their companies. Particularly relevant: professionalizing board work in municipal utilities enables higher perceived corporate success. So concludes the study by the Reutlinger Energiezentrum at Reutlingen University, commissioned by five companies in the sector.