Learning factories present a promising environment for education, training and research, especially in manufacturing-related areas, which are a main driver of wealth creation in any nation. While numerous learning factories have been built in industry and academia in the last decades, a comprehensive scientific overview of the topic is still missing. This paper intends to close this gap by establishing the state of the art of learning factories. The motivations, historic background, and didactic foundations of learning factories are outlined. Definitions of the term learning factory and a corresponding morphological model are provided. An overview of existing learning factory approaches in industry and academia shows the broad range of different applications and varying contents. The state of the art of learning factory curricula design and their use to enhance learning and research, as well as potentials and limitations, are presented. Conclusions and an outlook on further research priorities are offered.
The fashion industry is well documented for causing significant environmental impact. Product-service systems (PSS) present a promising way to address this challenge. PSS shift the focus toward complementary service offers, which decouples customer satisfaction from material consumption and entails dematerialization. However, PSS are not eco-efficient by nature but need to be accompanied by corporate environmental management (CEM) practices. The objective of this article is to examine the potential of PSS to contribute to the environmental sustainability of today's fashion industry by investigating whether fashion firms with a positive attitude toward PSS implementation also pursue goals related to the ecological environment. For this purpose, analysis of variance (ANOVA) is conducted to analyze data of 102 fashion firms. Results reveal that the diffusion of PSS in today's fashion industry is low and few firms consider implementing PSS. Results furthermore demonstrate that PSS implementation is positively related to CEM. This indicates that existing structures of CEM favor PSS implementation and unlock the eco-efficient potential of implemented PSS in the fashion industry.
We report the temperature dependence of metal-enhanced fluorescence (MEF) of individual photosystem I (PSI) complexes from Thermosynechococcus elongatus (T. elongatus) coupled to gold nanoparticles (AuNPs). A strong temperature dependence of shape and intensity of the emission spectra is observed when PSI is coupled to AuNPs. For each temperature, the enhancement factor (EF) is calculated by comparing the intensity of individual AuNP-coupled PSI to the mean intensity of ‘uncoupled’ PSI. At cryogenic temperature (1.6 K) the average EF was 4.3-fold. Upon increasing the temperature to 250 K the EF increases to 84-fold. Single complexes show even higher EFs up to 441.0-fold. At increasing temperatures the different spectral pools of PSI from T. elongatus become distinguishable. These pools are affected differently by the plasmonic interactions and show different enhancements. The remarkable increase of the EFs is explained by a rate model including the temperature dependence of the fluorescence yield of PSI and the spectral overlap between absorption and emission spectra of AuNPs and PSI, respectively.
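The enhancement factor described in the abstract is a simple ratio, which can be sketched as follows; the function name and the intensity values are invented placeholders for illustration, not data from the paper:

```python
# Illustrative sketch (not the authors' code): the per-complex
# enhancement factor (EF) is the intensity of an individual AuNP-coupled
# PSI complex divided by the mean intensity of uncoupled PSI complexes
# measured at the same temperature.

def enhancement_factor(coupled_intensity, uncoupled_intensities):
    """EF = I(coupled complex) / mean I(uncoupled reference pool)."""
    mean_uncoupled = sum(uncoupled_intensities) / len(uncoupled_intensities)
    return coupled_intensity / mean_uncoupled

# Hypothetical emission intensities of uncoupled PSI complexes:
reference = [100.0, 110.0, 90.0]           # mean = 100.0
ef = enhancement_factor(430.0, reference)  # -> 4.3
```

Averaging EFs of many individual complexes then yields a temperature-dependent mean enhancement, as reported in the abstract.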
Management virtueller, internationaler Engineering-Prozesse. - (Berichte aus dem Maschinenbau)
(2017)
The internationalization of business processes has led to work in engineering processes, e.g. product development and production planning, increasingly being carried out in virtual, international teams. However, the processes and methods have often not been developed further accordingly, so that virtual, international engineering processes are not executed effectively and efficiently. A questionnaire-based company survey in Germany examines how companies deal with key questions in designing virtual, international engineering processes and which procedures, methods and tools, in particular information and communication technologies, they use. Besides the benefits, the survey also addresses the risks, which according to the new DIN EN ISO 9001:2015 must receive greater attention in companies.
There are many, and in part very different, approaches for planning material flow systems in production logistics. A specific procedure model for planning production supply in assembly in the narrower sense does not exist. In the following, such a procedure model is presented, which focuses on the planning of production supply.
The new DIN EN ISO 9001:2015 is a milestone in the development of standards, as it merges the different structures of various management standards, which are now all structured according to the same system or will be structured this way in the future. In addition, it specifies requirements in some areas considerably more precisely and comprehensively than before. Accordingly, companies must develop their organization further for a new accreditation or re-accreditation. The following article gives an overview of the essential changes and how to deal with them.
Purpose: The purpose of this study was to investigate the value of the web representation of certain fashion hot spots and how these results can be shown on fashion maps in an illustrated way.
Design/methodology/approach: A new ranking was created and evaluated with a self-constructed index to obtain solid results. Numbers were collected from Google, Instagram, Facebook, Twitter and web.alert.io. Additionally, fashion maps were created for an illustrative visualization of the results.
Findings: Compared with the ranking of the trend forecasting agency Global Language Monitor, which conceived a ranking of non-virtual fashion cities, the web representation, and therefore the ranking of the research project, differs mainly in the position of the cities among the first 10, viz. the rank at which a city occurs, but less in the actual cities mentioned.
Research limitations: The research was limited by the subjective analysis of the data, leading to partly subjective results, as well as by the selected number of social media platforms that were used.
Originality/value: This is the first study to explore the web representation value of fashion metropolises in comparison with their non-virtual ranking. The results build in part on existing findings concerning the transformation of fashion cities or, more generally, which cities hold the status of a fashion city.
Cell-cell and cell-extracellular matrix (ECM) adhesion regulates fundamental cellular functions and is crucial for cell-material contact. Adhesion is influenced by many factors, like the affinity and specificity of the receptor-ligand interaction or the overall ligand concentration and density. To investigate molecular details of cell-ECM and cadherin (cell-cell) interactions in vascular cells, functional nanostructured surfaces were used. Ligand-functionalized gold nanoparticles (AuNPs) with 6-8 nm diameter are precisely immobilized on a surface and separated by non-adhesive regions so that individual integrins or cadherins can specifically interact with the ligands on the AuNPs. Using 40 nm and 90 nm distances between the AuNPs, functionalized with either peptide motifs of the extracellular matrix (RGD or REDV) or vascular endothelial cadherins (VEC), the influence of distance and ligand specificity on spreading and adhesion of endothelial cells (ECs) and smooth muscle cells (SMCs) was investigated. We demonstrate that RGD-dependent adhesion of vascular cells is similar to that of other cell types and that the distance dependence for integrin binding to ECM peptides is also valid for the REDV motif. VEC ligands decrease adhesion significantly at the tested ligand distances. These results may be helpful for future improvements in vascular tissue engineering and for the development of implant surfaces.
Wege der Gewinnermittlung
(2017)
If a company makes a profit, this does not necessarily mean that everything is settled. The decisive question is how the profit was determined, because only the right method provides the appropriate perspective: on the success of an individual transaction, on the profit of a period, on the operating assets, on liquidity, or on the balance sheet.
EBIT & Co.
(2017)
A whole range of key figures is used in business administration to determine and manage corporate profit. But not all of them are suitable for the same purpose. Depending on the question at hand, different key figures should be used. Their interpretation must, not least, also be industry-specific.
Smart meter based business models for the electricity sector : a systematical literature research
(2017)
The Act on the Digitization of the Energy Transition forces German industries and households to introduce smart meters in order to save energy, to gain individually based electricity tariffs and to digitize the energy data flow. Smart meters can be regarded as the advancement of the traditional meter. Utilizing this new technology enables a wide range of innovative business models that provide additional value for the electricity suppliers as well as for their customers. In this study, we followed a two-step approach. First, we provide a state-of-the-art comparison of the business models found in the literature and identify structural differences in the way they add value to the offered products and services. Second, the business models are grouped into categories with respect to customer segments and the value added to the smart grid. Findings indicate that most business models focus on the end customer as their main customer.
This research is about Omnichannel Retailing and addresses the question of how the omnichannel capability of retailers in the fashion market can be measured. Our sources include books, interviews, newspapers and scientific databases.
Omnichanneling is a current topic in the fashion market; retailers all over the world face the question of how to adapt to the challenges Omnichannel Retailing poses. We first define what omnichanneling is by explaining the differences between Multiple-, Multi-, Cross- and Omnichannel Retailing. Having defined omnichanneling, we take a set of 26 retailers to evaluate with regard to their Omnichannel capabilities and create an index with criteria that can measure the Omnichannel capability of each retailer.
The Omnichannel Score is based on 31 criteria, which analyze the retailers in offline, online, mobile and social aspects and make differences between retailers visible. We found that retailers in the US fashion market are more advanced in Omnichannel Retailing than retailers in the German fashion market. Our top three Omnichannel retailers were Sears with an Omnichannel Score of 91, followed by KOHL'S and Marks&Spencer, both with an Omnichannel Score of 88. The best Omnichannel retailer from Germany was Adidas in fourth place with an Omnichannel Score of 81.
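A criteria-based score of this kind can be sketched as summed points over fulfilled criteria; the criteria names and point weights below are invented for illustration and are not the study's 31 actual criteria:

```python
# Hedged sketch: a possible shape of a criteria-based Omnichannel Score.
# Each fulfilled criterion contributes its point weight to the total.

def omnichannel_score(fulfilled, weights):
    """Sum the point values of all fulfilled criteria."""
    return sum(weights[c] for c in fulfilled)

weights = {  # hypothetical criteria -> points (not from the study)
    "click_and_collect": 5,
    "in_store_returns_of_online_orders": 5,
    "mobile_app": 3,
    "live_stock_info_per_store": 4,
}

score = omnichannel_score({"click_and_collect", "mobile_app"}, weights)  # 8
```

A retailer fulfilling every criterion would reach the maximum score, which makes scores directly comparable across retailers.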
Together with partners from industry and politics, ESB Business School at Reutlingen University, Offenburg University and the University of Applied Sciences and Arts Northwestern Switzerland (FHNW) are investigating, in an Interreg project, ways to reduce climate- and health-damaging emissions in cross-border traffic in the Hochrhein region. To this end, electric mobility and carpooling are being promoted in a pilot project and their effects analyzed. First results show that today's electric cars are suitable for cross-border commuting under certain conditions.
Towards a practical maintainability quality model for service- and microservice-based systems
(2017)
Although current literature mentions a lot of different metrics related to the maintainability of service-based systems (SBSs), there is no comprehensive quality model (QM) with automatic evaluation and a practical focus. To fill this gap, we propose a Maintainability Model for Services (MM4S), a layered maintainability QM consisting of service properties (SPs) associated with automatically collectable Service Metrics (SMs). This research artifact, created within an ongoing Design Science Research (DSR) project, is the first version ready for detailed evaluation and critical feedback. The goal of MM4S is to serve as a simple and practical tool for basic maintainability estimation and control in the context of SBSs and their specialization, microservice-based systems (μSBSs).
In a time of digital transformation, the ability to quickly and efficiently adapt software systems to changed business requirements becomes more important than ever. Measuring the maintainability of software is therefore crucial for the long-term management of such products. With service-based systems (SBSs) being a very important form of enterprise software, we present a holistic overview of maintainability metrics specifically designed for this type of system, since traditional metrics, e.g. object-oriented ones, are not fully applicable in this case. The selected metric candidates from the literature review were mapped to 4 dominant design properties: size, complexity, coupling, and cohesion. Microservice-based systems (μSBSs) emerge as an agile and fine-grained variant of SBSs. While the majority of identified metrics are also applicable to this specialization (with some limitations), the large number of services in combination with technological heterogeneity and decentralization of control significantly impacts automatic metric collection in such a system. Our research therefore suggests that specialized tool support is required to guarantee the practical applicability of the presented metrics to μSBSs.
Anforderungen an die Mensch-Maschine-Schnittstelle im Automobil auf dem Weg zum autonomen Fahren
(2017)
In recent decades, more and more driver assistance systems have found their way into the automobile, paving the way for the fully autonomous vehicles of the future. Many manufacturers already offer equipment variants of their vehicles that are prepared for the transition to a fully autonomous future. To take people along on this path, a number of requirements are placed on the automobile's human-machine interface (HMI). For the semi-autonomous vehicles of the next generation, the handover between manual and autonomous driving must be designed as well as possible for humans. This work looks at selected approaches for future HMI systems and evaluates them based on the handover times between human and machine. A transformation of the automotive HMI is recommended in order to familiarize people with the new technologies.
This paper introduces a novel placement methodology for a common-centroid (CC) pattern generator. It can be applied to various integrated circuit (IC) elements, such as transistors, capacitors, diodes, and resistors. The proposed method consists of a constructive algorithm which generates an initial, close to the optimum, solution, and an iterative algorithm which is used subsequently, if the output of constructive algorithm does not satisfy the desired criteria. The outcome of this work is an automatic CC placement algorithm for IC element arrays. Additionally, the paper presents a method for the CC arrangement evaluation. It allows for evaluating the quality of an array, and a comparison of different placement methods.
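The paper's placement algorithm is not reproduced here, but the common-centroid property it targets can be illustrated with the classic 2x2 cross-coupled placement of two matched devices, under the assumption of unit elements on a regular grid:

```python
# Illustrative sketch (not the paper's algorithm): in a common-centroid
# placement, the unit elements of matched devices are arranged so that
# every device shares the same geometric centroid, which cancels linear
# process gradients across the array.

def centroid(cells):
    """Geometric centroid of a list of (x, y) unit-cell positions."""
    xs = [x for x, y in cells]
    ys = [y for x, y in cells]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# Cross-coupled 2x2 placement:  A B
#                               B A
placement = {"A": [(0, 0), (1, 1)], "B": [(0, 1), (1, 0)]}
assert centroid(placement["A"]) == centroid(placement["B"]) == (0.5, 0.5)
```

An evaluation method such as the one the paper proposes can then score how well larger, less regular arrays approximate this coincidence of centroids.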
In this contribution, we examined the limits of performance-based fairness in personnel appraisals. Based on the economics of convention (EC), we argued that the parties involved interpret performance-related appraisal criteria in such a way that they are compatible with their sense of justice. This argument suggests that in many organizations it is not possible to implement the performance principle strictly. Where performance-based fairness collides with other notions of justice in organizations, the actors will strive to establish local arrangements or compromises within conventions. We also argued that the decoupling of the organizational subsystems involved in the appraisal system leads to the formation of local conventions and prevents consistent control by higher authorities.
Technologies for mapping the "digital twin" have been under development for approximately 20 years. Nowadays, increasingly intelligent, individualized products encourage companies to respond innovatively to customer requirements and to handle the rising number of product variants quickly.
An integrated engineering network spanning the entire value chain is operated to intelligently connect various company divisions and to generate a business ecosystem for products, services and communities. This establishes the conditions for the digital twin, in which the digital world can be fed into the real world and the real world back into the digital, in order to handle such intelligent products with their rising number of variants.
The term digital twin can be described as a digital copy of a real factory, machine, worker, etc., that is created and can be independently expanded, automatically updated and globally available in real time. Every real product and production site is permanently accompanied by a digital twin. First prototypes of such digital twins already exist in the ESB Logistics Learning Factory, based on cloud- and app-based software that builds on a dynamic, multidimensional data and information model. A standardized language for the robot control systems via software agents and positioning systems has to be integrated. The continuity of the real factory in the digital factory, as an economical means of ensuring the continuous actuality of digital models, is seen as the basis of changeability.
For indoor localization, sensor combinations should be used that, in addition to the hardware, already contain the software required for sensor data fusion. Processing systems, live scenario simulations and digital shop floor management result in a mandatory procedural combination. Essential to the digital twin is the ability to consistently provide all subsystems with the latest state of all required information, methods and algorithms.
A realistic risk assessment is the basis of responsible corporate decisions. But how can risks be assessed correctly? Various risk management instruments make it possible to systematically identify, quantify, evaluate and document risks.
Risks are not bad per se, as long as the return achieved is appropriate for the risk taken. However, this relationship is not always understood, which was one of the reasons for the financial crisis of 2008/09. The key figures presented in this contribution show how risks can be set in relation to achieved or potential returns.
The purpose of this paper is to determine the relevance of social media for luxury brand management. It employs a multi-methodological approach: after analyzing the online performance of the three luxury brands Burberry, Louis Vuitton and Gucci, the empirical research includes a survey as well as an eye-tracking test executed with Tobii Studio. The findings reveal that online and social media have given luxury fashion businesses the opportunity to establish a sustainable interaction with their customers and distinguish themselves from the competition. Still, the online business holds many challenges for luxury companies to overcome. This paper gives instructions as to how social media can be effectively incorporated into a luxury company.
Painting galleries typically provide a wealth of data composed of several data types. These multivariate data are too complex for laymen like museum visitors who first want to get an overview of all paintings and then look for specific categories. Ultimately, the goal is to guide the visitor to a specific painting that he wishes to look at more closely. In this paper we describe an interactive visualization tool that first provides such an overview and lets people experiment with the more than 41,000 paintings collected in the Web Gallery of Art. To build such an interactive tool, our technique is composed of different steps, like data handling, algorithmic transformations, visualizations, interactions, and the human user working with the tool with the goal of detecting insights in the provided data. We illustrate the usefulness of the visualization tool by applying it to such characteristic data and show how one can get from an overview of all paintings to specific paintings.
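The overview-to-detail drill-down such a tool supports can be sketched as successive filtering of a record-per-painting dataset; the records and field names below are invented stand-ins, not the Web Gallery of Art schema:

```python
# Minimal sketch of overview-to-detail navigation: start from all
# records (overview), narrow by a category, then arrive at a single
# painting. All data below is made up for illustration.

paintings = [
    {"author": "Vermeer",  "title": "The Milkmaid",  "type": "genre"},
    {"author": "Vermeer",  "title": "View of Delft", "type": "landscape"},
    {"author": "Ruisdael", "title": "The Windmill",  "type": "landscape"},
]

def drill_down(records, **filters):
    """Keep only records matching every given field=value filter."""
    return [r for r in records if all(r[k] == v for k, v in filters.items())]

landscapes = drill_down(paintings, type="landscape")                  # 2 hits
one = drill_down(paintings, type="landscape", author="Vermeer")       # 1 hit
```

In the actual tool, each filtering step would be triggered interactively and reflected in the visualization rather than in code.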
The association of companies in supplier networks based on digital platforms offers one way to meet the demand for flexibility in Industrie 4.0. Based on the characterization of a real supplier network, use cases for supplier integration are derived. These serve as a basis for discussing the potentials and challenges of such integration, raising the question of the optimal depth of integration. For this purpose, a user-oriented decision model was derived.
The diversity of energy prosumer types makes it difficult to create appropriate incentive mechanisms that satisfy both prosumers and energy system operators alike. Meanwhile, European energy suppliers buy guarantees of origin (GoO), which allow them to sell green energy at premium prices while in reality delivering grey energy to their customers. Blockchain technology has proven itself to be a robust payment system in which users transact money without the involvement of a third party. Blockchain tokens can be used to represent a unit of energy and, just as GoOs, be submitted to the market. This paper focuses on simulating a marketplace using the Ethereum blockchain and smart contracts, where prosumers can sell tokenized GoOs to consumers willing to subsidize renewable energy producers. Such markets bypass energy providers by allowing consumers to obtain tokenized GoOs directly from the producers, which in turn benefit directly from the earnings. Two market strategies in which tokens are sold as GoOs have been simulated. In the Fix Price Strategy, prosumers sell their tokens at the average GoO price of 2014. The Variable Price Strategy focuses on selling tokens at a price range defined by the difference between grey and green energy. The study finds that the Ethereum blockchain is robust enough to function as a platform for tokenized GoO trading. The simulation results have been compared, and they indicate that prosumers earn significantly more money by following the Variable Price Strategy.
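The two pricing strategies can be sketched as follows; the function names, price levels and payout logic are assumptions for illustration only, not the study's simulation code:

```python
# Hedged sketch of the two simulated market strategies: tokens
# representing guarantees of origin (GoO) are sold either at a fixed
# reference price or at a price derived from the spread between grey
# and green energy prices. All numbers are invented.

def fix_price_earnings(tokens_sold, fixed_price):
    """Fix Price Strategy: every token sells at one reference price."""
    return tokens_sold * fixed_price

def variable_price_earnings(tokens_sold, grey_price, green_price, share=1.0):
    """Variable Price Strategy: sell at a share of the grey/green spread."""
    spread = green_price - grey_price
    return tokens_sold * spread * share

fixed = fix_price_earnings(100, 0.5)                 # 100 tokens at 0.5 each
variable = variable_price_earnings(100, 28.0, 30.0)  # spread of 2.0 per token
```

With a spread larger than the fixed reference price, the variable strategy pays out more per token, consistent with the study's finding that prosumers earn more under the Variable Price Strategy.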
Clinical reading centers provide expertise for consistent, centralized analysis of medical data gathered in a distributed context. Accordingly, appropriate software solutions are required for the involved communication and data management processes. In this work, an analysis of general requirements and essential architectural and software design considerations for reading center information systems is provided. The identified patterns have been applied to the implementation of the reading center platform which is currently operated at the Center of Ophthalmology of the University Hospital of Tübingen.
Kauf- und Vertragsrecht sind oft Schwerpunkt von Rechtsvorlesungen in betriebswirtschaftlichen Studiengängen. Hier spielen nicht nur nationale, sondern immer öfter auch grenzüberschreitende Transaktionen eine Rolle. In diesem Buch werden nationale und internationale Regelungen im Kauf- und Vertragsrecht miteinander verglichen. Neben dem UN-Kaufrecht werden rechtsvergleichend auch allgemeine rechtliche Fragen behandelt, wie etwa das Verhältnis vertraglicher und außervertraglicher Ansprüche und Rechtsbehelfe, Einbeziehung und Gültigkeit Allgemeiner Geschäftsbedingungen, vertragliche Haftungsbeschränkungen und Vertragsstrafen oder Unterschiede im allgemeinen Schadensrecht.
Der Verfasser macht deutlich, dass Regelungen, die in der eigenen Rechtsordnung als selbstverständlich erscheinen, sich zum Teil von denen in anderen Rechtsordnungen erheblich unterscheiden oder dort sogar unbekannt sein können. Studierende sollen erkennen, warum es diese Unterschiede gibt. Das erlaubt ihnen später bei Vertragsverhandlungen, Vorschläge ausländischer Partner besser zu verstehen und angemessen auf sie zu reagieren.
Entwicklung eines nicht vergilbenden, faserbasierten BH's mittels innovativer FIM-Technologie
(2017)
Data integration of heterogeneous data sources relies either on periodically transferring large amounts of data to a physical Data Warehouse or on retrieving data from the sources on request only. The latter results in the creation of what is referred to as a virtual Data Warehouse, which is preferable when the use of the latest data is paramount. However, the downside is that it adds network traffic and suffers from performance degradation when the amount of data is high. In this paper, we propose the use of a readCheck validator to ensure the timeliness of the queried data and to reduce data traffic. It is further shown that the readCheck allows transactions to update data in the data sources while obeying full Atomicity, Consistency, Isolation, and Durability (ACID) properties.
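A minimal sketch of the readCheck idea as we read it from the abstract; the class, its interface and the version-based check are our assumptions, not the paper's design:

```python
# Assumed mechanics (not the paper's API): before answering a query
# from a local copy, a cheap validity check against the source decides
# whether the cached data is still current, so the expensive full
# fetch only happens when the source actually changed.

class ReadCheckCache:
    def __init__(self, source):
        self.source = source          # object with .version and .read()
        self.cached_version = None
        self.cached_rows = None

    def query(self):
        if self.cached_version != self.source.version:  # cheap readCheck
            self.cached_rows = self.source.read()       # expensive fetch
            self.cached_version = self.source.version
        return self.cached_rows

class FakeSource:
    """Stand-in data source counting how often it is actually read."""
    def __init__(self):
        self.version, self.rows, self.reads = 0, ["row1", "row2"], 0
    def read(self):
        self.reads += 1
        return list(self.rows)

src = FakeSource()
cache = ReadCheckCache(src)
cache.query()
cache.query()   # second call is served from the cache, no re-fetch
```

This captures the traffic-reduction argument: repeated queries against an unchanged source cost only the validity check, not a full transfer.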
Integrated circuits (ICs) are an integral part of many devices, such as smartphones, computers and televisions. More and more functions are being integrated on these circuits. To be able to manage this work within the given time in the future as well, a means for simultaneous collaboration between designers is needed. Under the working title eCEDA (eCollaboration for Electronic Design Automation), a concept for a web application is being developed that is intended to enable real-time collaboration of designers in chip design. This concept, as well as various aspects of collaboration, is addressed in this work.
For more than 50 years, neoclassical capital market theory has dominated our understanding of how financial markets work. It has produced a multitude of theories and concepts (e.g., portfolio theory, the Capital Asset Pricing Model or Value-at-Risk) and is based on the assumption of a strictly rational Homo oeconomicus.
This book aims to open the door for practitioners to an emerging behavioral view of financial markets, in which a more realistic Homo oeconomicus humanus acts in the markets. He uses boundedly rational heuristics in decision-making and is guided by emotional influences.
The authors first span the arc from the neoclassical view of financial markets to behavioral finance. Then, speculative bubbles, from the tulip mania to the subprime mortgage bubble, are presented in detail as signs of bounded rationality in financial markets. Next, the focus is on heuristics in investment decisions in securities markets. The resulting biases are classified according to their harmfulness to risk and return within the RRS-Index®. Finally, examples of applying behavioral finance insights in wealth management and corporate governance are discussed, and a look is taken at current developments in neuro-finance and emotional finance.
New in this edition is financial nudging, a particularly promising application of behavioral finance insights.
Speculative bubbles run like a common thread through the history of financial markets. The first significant and documented bubble emerged in 17th-century Holland as the widely known tulip mania. Others were to follow in subsequent centuries. But how do these mostly euphoric and ultimately panic-like market developments arise? And how could it happen again and again that market professionals failed to recognize them in time? This book focuses on speculative bubbles as signs of recurring and persistent market anomalies. In the first part, Rolf J. Daxhammer and Máté Facsar explain the emergence and causes of speculative bubbles as well as their different phases and types. It is particularly important to them to show that bubbles have both positive and negative effects on economies. The authors then present the most important speculative bubbles, starting with the tulip mania, through the South Sea Bubble, up to current developments in financial and real estate markets. The reader is thus able to understand typical characteristics of capital markets and to explain and apply the development of historical speculative bubbles on the basis of the five-phase model.
The increasing number of connected mobile devices such as fitness trackers and smartphones provides new data for health insurers, enabling them to gain deeper insights into the health of their customers. These additional data sources, plus the trend towards an interconnected health community including doctors, hospitals and insurers, lead to challenges regarding data filtering, organization and dissemination. First, we analyze what kind of information is relevant for a digital health insurer. Second, functional and non-functional requirements for storing and managing health data in an interconnected environment are defined. Third, we propose a data architecture for a digitized health insurer, consisting of a data model and an application architecture.
We present a topology of MIMO arrays of inductive antennas exhibiting inherently high crosstalk cancellation capabilities. A single-layer PCB is etched into a 3-channel array of emitting/receiving antennas. Once coupled with another similar 3-channel emitter/receiver, we measured an Adjacent Channel Rejection Ratio (ACRR) as high as 70 dB from 150 Hz to 150 kHz. Another primitive device, made out of copper wires wound around PVC tubes to form a 2-channel "non-contact slip ring", exhibited 22 dB to 47 dB of ACRR up to 15 MHz. In this paper we introduce the underlying theoretical model behind the crosstalk suppression capabilities of these so-called "Pie-Chart antennas": an extension of the mutual inductance compensation method to a higher number of channels using symmetries. We detail the simple iterative building process of these antennas, illustrate it with numerical analysis, and evaluate their effectiveness via real experiments on the 3-channel PCB array and the 2-channel rotary array up to the limit of our test setup. The Pie-Chart design is primarily, but not exclusively, intended as an alternative to costly electronic filters or cumbersome EM shields in both wireless and wired applications.
Gallium nitride high electron mobility transistors (GaN-HEMTs) have low capacitances and can achieve low switching losses in applications where hard turn-on is required. Low switching losses imply fast switching; consequently, fast voltage and current transients occur. However, these transients can be limited by package and layout parasitics, even for highly optimized systems. Furthermore, fast switching requires a fast charging of the input capacitance and hence a high gate current.
In this paper, the switching speed limitations of GaN-HEMTs due to the common source inductance and the gate driver supply voltage are discussed. The turn-on behavior of a GaN-HEMT is simulated, and the impact of the parasitics and the gate driver supply voltage on the switching losses is described in detail. Furthermore, measurements are performed with an optimized layout at a drain-source voltage of 500 V and a drain-source current of up to 60 A.
Modern power semiconductor devices have low capacitances and can therefore achieve very fast switching transients under hard-switching conditions. However, these transients are often limited by parasitic elements, especially by the source inductance and the parasitic capacitances of the power semiconductor. These limitations cannot be compensated by conventional gate drivers. To overcome this, a novel gate driver approach for power semiconductors was developed. It uses a transformer which accelerates the switching by transferring energy from the source path to the gate path.
Experimental results of the novel gate driver approach show a turn-on energy reduction of 78% (from 80 μJ down to 17 μJ) at a drain-source voltage of 500 V and a drain current of 60 A. Furthermore, the efficiency improvement is demonstrated for a hard-switching boost converter. At a switching frequency of 750 kHz with an input voltage of 230 V and an output voltage of 400 V, it was possible to extend the output power range by 35% (from 2.3 kW to 3.1 kW) due to the reduction of the turn-on losses, which also lowers the junction temperature of the GaN-HEMT.
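As a quick sanity check (not part of the paper), the quoted gate-driver figures can be reproduced with a few lines of arithmetic; all input values are taken directly from the abstract above.

```python
# Arithmetic check of the quoted gate-driver figures:
# turn-on energy drops from 80 uJ to 17 uJ at 750 kHz switching frequency.
E_on_conventional = 80e-6  # J, conventional gate driver
E_on_novel = 17e-6         # J, transformer-assisted gate driver
f_sw = 750e3               # Hz, boost converter switching frequency

relative_reduction = 1 - E_on_novel / E_on_conventional
power_saved = (E_on_conventional - E_on_novel) * f_sw  # avg. turn-on loss saved

print(f"turn-on energy reduction: {relative_reduction:.1%}")  # ~79%, quoted as 78%
print(f"turn-on loss saved at 750 kHz: {power_saved:.0f} W")  # ~47 W
```

The ~47 W of avoided turn-on loss per device is consistent with the reported drop in junction temperature and the extended output power range.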
A gate driver approach is presented for the reduction of turn-on losses in hard-switching applications. A significant turn-on loss reduction of up to 55% has been observed for SiC MOSFETs. The gate driver approach uses a transformer which couples energy from the power path back into the gate path during switching events, providing increased gate driver current and thereby faster switching speed.
The gate driver approach was tested on a boost converter running at a switching frequency of up to 300 kHz. With an input voltage of 300 V and an output voltage of 600 V, it was possible to reduce the converter losses by 8% at full load. Moreover, the output power range could be extended by 23% (from 2.75 kW to 3.4 kW) due to the reduction of the turn-on losses.
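The reported power-range extension for this SiC converter can be verified in the same back-of-the-envelope way, again using only the output power figures quoted above.

```python
# Check of the reported output-power extension for the SiC boost converter.
P_out_before = 2.75e3  # W, maximum output power with a conventional driver
P_out_after = 3.4e3    # W, maximum output power with the new gate driver

extension = P_out_after / P_out_before - 1
print(f"output power extension: {extension:.1%}")  # 23.6%, quoted as 23%
```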
Steadily growing volumes of research material in a variety of databases, repositories and clouds make academic content harder than ever to discover. Finding adequate material for one's own research, however, is essential for every researcher. Based on recent developments in the field of artificial intelligence and the identified digital capabilities of future universities, a change in the basic practice of academic research is predicted. This study outlines how artificial intelligence could simplify academic research at a digital university. Today's studies in the field of AI showcase its true potential and its profound impact on academic research.
Software engineering education is under constant pressure to provide students with industry-relevant knowledge and skills. Educators must address issues beyond exercises and theories that can be directly rehearsed in small settings. Industry training has similar requirements of relevance as companies seek to keep their workforce up to date with technological advances. Real-life software development often deals with large, software-intensive systems and is influenced by the complex effects of teamwork and distributed software development, which are hard to demonstrate in an educational environment. A way to experience such effects and to increase the relevance of software engineering education is to apply empirical studies in teaching. In this paper, we show how different types of empirical studies can be used for educational purposes in software engineering. We give examples illustrating how to utilize empirical studies, discuss challenges, and derive an initial guideline that supports teachers in including empirical studies in software engineering courses. Furthermore, we give examples that show how empirical studies contribute to high-quality learning outcomes, to student motivation, and to the awareness of the advantages of applying software engineering principles. Having awareness, experience, and understanding of the actions required, students are more likely to apply such principles under real-life constraints in their working life.
Context: Development of software intensive products and services increasingly occurs by continuously deploying product or service increments, such as new features and enhancements, to customers. Product and service developers must continuously find out what customers want by direct customer feedback and usage behaviour observation. Objective: This paper examines the preconditions for setting up an experimentation system for continuous customer experiments. It describes the RIGHT model for Continuous Experimentation (Rapid Iterative value creation Gained through High-frequency Testing), illustrating the building blocks required for such a system. Method: An initial model for continuous experimentation is analytically derived from prior work. The model is matched against empirical case study findings from two startup companies and further developed. Results: Building blocks for a continuous experimentation system and infrastructure are presented. Conclusions: A suitable experimentation system requires at least the ability to release minimum viable products or features with suitable instrumentation, design and manage experiment plans, link experiment results with a product roadmap, and manage a flexible business strategy. The main challenges are proper, rapid design of experiments, advanced instrumentation of software to collect, analyse, and store relevant data, and the integration of experiment results in both the product development cycle and the software development process.
Thematic issue on human-centred ambient intelligence: cognitive approaches, reasoning and learning
(2017)
This editorial presents advances in human-centred Ambient Intelligence applications which take cognitive issues into account when modelling users (e.g. stress, attention disorders) and learn users' activities and preferences and adapt to them (e.g. at home, while driving a car). These papers also show AmI applications in health and education, which makes them even more valuable for society at large.
To power a wireless sensor for measuring wind speed via energy harvesting, it is natural to use the measured quantity itself as the energy source. With optimized radio transmission and energy management, a self-sufficient wireless wind-speed sensor can be realized that measures wind speeds from 2 m/s upwards and transmits the readings wirelessly.
A heavily researched area of computer vision is the detection of salient facial landmarks (facial feature detection), such as the corners of the mouth or the chin. Accordingly, a large number of published methods can be found, which nevertheless differ considerably in detection accuracy, robustness and speed. Many methods are only partially real-time capable or deliver satisfactory results only with high-resolution image sources. In recent years, methods have therefore been developed that attempt to solve these problems. This work examines three of these state-of-the-art methods: Constrained Local Neural Fields (CLNF), Discriminative Response Map Fitting (DRMF) and Structured Output SVM (SO-SVM), as well as their implementations. For this purpose, an empirical comparison of detection accuracy is carried out.
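A common accuracy metric in such facial-landmark comparisons (a general sketch, not necessarily the exact protocol of this work) is the mean point-to-point error normalized by the inter-ocular distance:

```python
import numpy as np

def normalized_mean_error(pred, gt, left_eye_idx, right_eye_idx):
    """Mean point-to-point error between predicted and ground-truth landmarks,
    normalized by the inter-ocular distance.

    pred, gt: (N, 2) arrays of landmark coordinates.
    """
    inter_ocular = np.linalg.norm(gt[left_eye_idx] - gt[right_eye_idx])
    errors = np.linalg.norm(pred - gt, axis=1)
    return errors.mean() / inter_ocular

# Tiny synthetic example: every predicted landmark is off by one pixel
# vertically and the eyes are 10 pixels apart, so the NME is 0.1.
gt = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 5.0]])
pred = gt + np.array([0.0, 1.0])
print(normalized_mean_error(pred, gt, 0, 1))  # 0.1
```

Normalizing by the inter-ocular distance makes the score independent of image resolution, which is what allows detectors evaluated on differently sized face images to be compared fairly.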
The use of technical aids for analysis purposes in sports is now an established part of the daily training routine of coaches and athletes. In almost every sport, video recordings are used to document and analyze the execution of movements. However, recordings from a single static location are often no longer sufficient. This is where Virtual Reality (VR) can offer a solution: with VR, a further layer can be added to the recorded scene, and movement sequences can be assessed anew and in more detail. To reproduce movements in a virtual environment, they must be recorded by means of motion capturing (MoCap). The goal of this work is to find out whether the MoCap system Perception Neuron is capable of capturing movements at high speed.
Purpose: The purpose of this paper is to examine the service of the new business model Curated Shopping in the fashion industry and to analyze whether the service provides higher customer added value in comparison to traditional services in retail stores and e-commerce platforms. It provides implications for curated shopping operators on how to optimize the service in each stage of the customer buying process.
Design/methodology/approach: The research methodology applied is an empirical study that uses the principle of mystery shopping in order to investigate the services provided during the selling process.
Findings: The study showed that information about the customer should be collected carefully and as holistically as possible in order to put together a suitable outfit. The consumer is able to benefit from the service by saving time and enjoying a stress-free way of shopping. Nevertheless, the personal service is limited: the physical distance to the customer makes it harder for the curator to give individual and inspiring advice.
Research limitations: The survey was conducted among 10 mystery shoppers and 4 curated shopping operators in Germany, limiting findings to these mystery shoppers and operators.
Practical implications: One implication for shop operators is to collect consumer information carefully and to expand the assortment and brand portfolio in order to provide fashion goods that inspire the consumer. The shop operators are on the right track, but there is still huge potential to provide a more shopper-oriented service.