Extended workbench, global sourcing, low-cost-country potential, outsourcing: for years these buzzwords have been used inflationarily in boardrooms. The market for professional services has long since ceased to be limited to the strategy consulting industry. By now there is hardly any business process in a manufacturing company that has not been examined for its potential to be outsourced to consultants or service providers. The author describes why professional service providers are increasingly being commissioned and, using a service performance study as an example, identifies the success factors for collaboration. Using a collaboration checklist, she sets out what matters in day-to-day work. Finally, the risks and opportunities of using such services are examined from both the customer's and the supplier's perspective.
This thesis presents a model that optimizes the planning of direct reuse in the rental of mobile, durable capital goods in closed-loop supply chains. The focus is on developing planning algorithms that improve the predictability of future returns, and on their economic impact for companies. The optimization model considers the company's position both internally and externally and provides the decision basis for corresponding strategic initiatives.
This thesis examines possible input devices for VR applications viewed with HMDs. It evaluates whether basic interactions such as navigating through space, text entry, and object selection can be implemented with the devices under test. The devices examined are the Leap Motion Controller, the Kinect 2, the Myo armband, the Xbox controller, and the Razer Hydra.
This report addresses the current state of digitization in the textile industry. It serves as the foundation for a master's thesis and aims to answer the question of whether an information system accompanying the textile process chain is needed. To this end, the individual process steps are briefly explained. The report also examines the connection between the textile industry and the new possibilities offered by the Internet of Things.
This thesis collects and evaluates various approaches to constructing display walls in digital showrooms. Of particular interest is a digital approach in which the output devices are decoupled from the playback devices. This solution promises great flexibility and easy extensibility compared to conventional approaches. To verify these claims, a functional prototype is to be developed based on the results.
Within this scientific specialization, risk management is to be planned and carried out on the basis of an existing usability analysis of a mobile application. The application is part of an in vitro diagnostic device intended to support transplant patients in everyday life in assessing their blood values and state of health, and in taking the required medication at the correct dosage.
This report addresses the question of how interactive systems can be used within a historical exhibition context to promote and support the methodical communication of information. Aulendorf Castle serves as the use case.
The broad range of cloud platforms makes it difficult to select the right platform for a given use case. The question frequently arises of how the individual cloud platforms differ and which properties and advantages each one offers. This article therefore first introduces the principles of cloud computing. It then examines the platforms Amazon Web Services, Microsoft Azure, Pivotal Cloud Foundry, and OpenStack in more detail, with a focus on scaling and load balancing.
The conference topics range from the presentation of historical content with interactive systems (in museums), through the creation of a risk management file in the medical field, to interaction devices for VR applications. In this edition of Informatics Inside, the inherently interdisciplinary character of computer science is more palpable than ever, for computer science is also "inside" art, medicine, chemistry, and textiles.
In recent years robotic systems have matured enough to perform simple home or office tasks, guide visitors in environments such as museums or stores, and aid people in their daily life. To make the interaction with service and even industrial robots as fast and intuitive as possible, researchers strive to create transparent interfaces close to human-human interaction. As facial expressions play a central role in human-human communication, robot faces have been implemented with varying degrees of human-likeness and expressiveness. We propose an emotion model to parameterize a screen-based facial animation via inter-process communication. The software animates transitions and adds supplementary animations to make a digital face appear "alive" and to equip a robotic system with a virtual face. The result is an inviting appearance intended to motivate potential users to seek interaction with the robot.
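The paper's emotion model is not specified in this abstract; the following is a minimal sketch of how such a parameterization could be animated and serialized for inter-process communication, assuming a simple two-dimensional valence/arousal representation (all names, ranges, and message fields are hypothetical):

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class EmotionState:
    """Hypothetical two-dimensional emotion parameterization."""
    valence: float  # -1.0 (negative) .. 1.0 (positive)
    arousal: float  # 0.0 (calm) .. 1.0 (excited)

def blend(current: EmotionState, target: EmotionState, t: float) -> EmotionState:
    """Linear interpolation, used to animate a smooth transition between states."""
    return EmotionState(
        valence=current.valence + (target.valence - current.valence) * t,
        arousal=current.arousal + (target.arousal - current.arousal) * t,
    )

def to_message(state: EmotionState) -> str:
    """Serialize a state for inter-process communication, e.g. over a local socket."""
    return json.dumps({"type": "emotion", "params": asdict(state)})

# Halfway through a transition from a neutral to a happy expression:
neutral = EmotionState(valence=0.0, arousal=0.2)
happy = EmotionState(valence=0.8, arousal=0.6)
midpoint = blend(neutral, happy, 0.5)
msg = to_message(midpoint)
```

An animation process receiving such messages would map the two parameters onto facial features (e.g. mouth curvature, eye openness) each frame.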
This textbook and workbook introduces the essential fundamentals of strength of materials. It presents the most important concepts and workflows of an engineering strength assessment. Particular emphasis is placed on conveying the material vividly from the engineer's perspective. For the sake of clarity, mathematical derivations are therefore largely omitted in favor of a materials-science viewpoint. This is documented by extensive tables of materials and characteristic values.
To stand out in the competition for attention and to minimize wastage, companies increasingly turn to so-called "non-classical" communication instruments. Sponsorship is a promising approach here, as it takes place in an attractive, emotionally charged, non-commercial environment. Given consumers' growing sensory overload, however, achieving sponsorship goals through mere visibility no longer appears satisfactory. This contribution addresses current trends in sports sponsorship. An analysis of recent developments shows that the preconditions for effective sponsorship have changed over time. New and innovative activation measures are needed to overcome consumers' sensory overload and to exploit the potential of sponsorship. Practical examples from sports marketing show that the actors involved have recognized the new challenges of sponsorship. Current developments regarding digitization, internationalization, professionalization, and unconventional activation are presented.
Instead of waiting for and constantly adapting to the details of political interventions, utilities need to view their environment from a holistic perspective. The unique position of the company, be it a local utility, a bigger player, or an international utility specializing in specific segments, has to be the basis of its goals and strategies. But without consistent translation of these goals and strategies into processes, structures, and company culture, a strategy remains pure theory. Companies need to engage in a continuing learning process. This means being willing to pass on strategies, to slow down or speed up, to work from a different angle, and so on.
This article is a review of the book "Brain computation as hierarchical abstraction" by Dana H. Ballard, published by MIT Press in 2015. The book series Computational Neuroscience familiarizes the reader with the computational aspects of brain functions based on neuroscientific evidence. The book provides an excellent introduction to the functioning, i.e. the structure, the network, and the routines, of the brain in our daily life. The final chapters even discuss behavioral elements such as decision-making, emotions, and consciousness. These topics are of high relevance in other sciences such as economics and philosophy. Overall, Ballard's book stimulates a scientifically well-founded debate and, more importantly, reveals the need for an interdisciplinary dialogue towards the social sciences.
In this note we look at anisotropic approximation of smooth functions on bounded domains with tensor product splines. The main idea is to extend such functions and then use known approximation techniques on R^d. We prove an error estimate for domains for which bounded extension operators exist. This obvious approach has some limitations: it is not applicable without restrictions on the chosen coordinate degree, even if the domain is as simple as the unit disk. Further, for approximation on R^d there are error estimates in which the grid widths and directional derivatives are paired in an interesting way. It seems impossible to maintain this property using extension operators.
Optimization-based analog layout automation has not yet found evident acceptance in industry due to the complexity of the design problem. This paper presents Self-organized Wiring and Arrangement of Responsive Modules (SWARM), an approach able to consider crucial design constraints both implicitly and explicitly. The flexibility of algorithmic methods and the expert knowledge captured in PCells combine into a flow of supervised module interaction. This novel approach targets the creation of constraint-compliant layout blocks that fit into a specified zone. By provoking a synergetic self-organization, even optimal layout solutions can emerge from the interaction. Various examples demonstrate the power of this new concept and its potential for future developments.
Owing to its great social significance, its emotional appeal, and its above-average media reach, sport has become one of the most important communication platforms. Companies use sports sponsorship to increase their visibility in the highly emotional environment of sport and to profile products and brands through image transfer. Sports sponsorship offers an attractive way to counter today's communication challenges of rising advertising pressure, increased sensory overload, and the declining efficiency of classical communication instruments.
This contribution analyzes best-practice examples of the most important organizational forms of sponsorship using a specially developed research design. The case studies show that the achievement of targeted sponsorship goals depends on internal and external factors. Since the mere visibility of a sponsorship is not in itself effective, the key is rather to publicize the sponsorship partnership through systematic activation concepts and holistic integration. The case studies examined provide creative solutions and allow conclusions to be drawn about the success factors of sports sponsorship.
At major sporting events such as this year's European Football Championship or the Olympic Summer Games, millions are at stake for associations and official sponsors, and they defend their advertising rights accordingly fiercely. Burger King shows how this "monopoly" can be circumvented creatively. This contribution presents two examples of ambush marketing by Burger King during the 2016 European Football Championship. Although not a sponsor, Burger King deployed ambush marketing in a targeted and creative way during the tournament to score points against the official UEFA sponsor and market leader McDonald's.
The superior electrical and thermal properties of silicon carbide (SiC) allow further shrinking of the active area of future power semiconductor devices. A lower bound on the die size can be obtained from the thermal impedance required to withstand the high power dissipation during a short-circuit event. However, this assumes that the power distribution is homogeneous and that no current filamentation has to be considered. This work therefore investigates this assumption by evaluating the stability of a SiC MOSFET over a wide range of operating conditions through measurements up to destruction, thermal simulations, and high-temperature characterization.
This paper addresses the turn-on switching process of insulated-gate bipolar transistor (IGBT) modules with anti-parallel free-wheeling diodes (FWD) used in inductive load switching power applications. An increase in efficiency, i.e. a decrease in switching losses, calls for a fast switching process of the IGBT, but this commonly implies high values of the reverse-recovery current overshoot. To overcome this undesired behaviour, a solution was proposed that achieves independent control of the collector current slope and the peak reverse-recovery current by applying a gate current that is briefly turned negative during the turn-on process. The feasibility of this approach has already been shown; however, a sophisticated control method is required to apply it in applications with varying currents, temperatures, and device parameters. In this paper a solution based on an adaptive, iterative closed-loop control is proposed. Its effectiveness is demonstrated by experimental results from a 1200 V / 200 A IGBT power module for different load currents and reverse-recovery current overshoots.
Branding in sports
(2016)
Brands are ubiquitous in the sports business. The significance of the brand is fuelled not only by the various functions that a brand performs for providers and consumers in sports, but also by the monetary value that brands have come to represent for sporting organizations. As part of the commercialization and professionalization of sports, a uniform brand presence is becoming increasingly important for sporting organizations. The implication is the need for systematic and integral brand management. This chapter initially examines the key features of sports from the marketing perspective and the most important fundamentals of sport marketing. Based on this, we demonstrate specifically how brands in sports are established and cultivated.
Entwicklung eines Portfolios von Energieeffizienzdienstleistungen für kommunale EVU. - Kurzfassung
(2016)
The digitization of our society changes the way we live, work, learn, communicate, and collaborate. The Internet of Things, enterprise social networks, adaptive case management, mobility systems, analytics for big data, and cloud services environments are emerging to support smart connected products and services and the digital transformation. Biological metaphors of living and adaptable ecosystems provide the logical foundation for self-optimizing and resilient run-time environments for intelligent business services and service-oriented enterprise architectures. Our aim is to support flexibility and agile transformation for both business domains and related information technology. The present research paper investigates mechanisms for decision analytics in the context of multi-perspective explorations of enterprise services and their digital enterprise architectures, extending original architecture reference models with state-of-the-art elements for agile architectural engineering for digitization and collaborative architectural decision support. The paper focuses on digital transformations of business and IT and integrates fundamental mappings between adaptable digital enterprise architectures and service-oriented information systems. We put a spotlight on an example domain, the Internet of Things.
The composition of vascularized adipose tissue is still an ongoing challenge, as no culture medium is available to supply adipocytes and endothelial cells appropriately. Endothelial cell medium is typically supplemented with epidermal growth factor (EGF) as well as hydrocortisone (HC). The effect of EGF on adipocytes is discussed controversially: some studies report that it inhibits adipocyte differentiation, while others report improved adipocyte lipogenesis. HC is known to have lipolytic activities, which might result in mature adipocyte dedifferentiation. In this study, we evaluated the influence of EGF and HC on the co-culture of endothelial cells and mature adipocytes regarding their cell morphology and functionality. We showed in mono-culture that high levels of HC promoted dedifferentiation and proliferation of mature adipocytes, whereas EGF seemed to have no negative influence. Endothelial cells kept their typical cobblestone morphology and showed a proliferation rate comparable to the control, independent of EGF and HC concentration. In co-culture, HC promoted dedifferentiation of mature adipocytes, which was shown by a higher glycerol release. EGF had no negative impact on adipocyte morphology. No negative impact on endothelial cell morphology and functionality could be seen with reduced EGF and HC supplementation in co-culture with mature adipocytes. Taken together, our results demonstrate that reduced levels of HC are needed for co-culturing mature adipocytes and endothelial cells. In co-culture, EGF had no influence on mature adipocytes. Therefore, for the composition of vascularized adipose tissue constructs, media with low levels of HC and either high or low levels of EGF can be used.
Large, deep full-thickness skin wounds from high-graded burns or trauma are not able to reepithelialize sufficiently, resulting in scar formation, mobility limitations, and cosmetic deformities. In vitro-constructed tissue replacements are therefore needed. Furthermore, such full-skin equivalents would be helpful as in vivo-like test systems for toxicity, cosmetic, and pharmaceutical testing. To date, no skin equivalent is available that contains the underlying subcutaneous fatty tissue. In this study, we composed a full-skin equivalent and evaluated three different media for the coculture of mature adipocytes, fibroblasts, and keratinocytes. To this end, adipocyte medium was supplemented with ascorbyl-2-phosphate and calcium chloride, which are important for successful epidermal stratification (Air medium). This medium was further supplemented with two commercially available factor combinations often used for the in vitro culture of keratinocytes (Air-HKGS and Air-KGM medium). We showed that in all media, keratinocytes differentiated successfully to build a stratified epidermal layer and expressed cytokeratin 10 and 14. Perilipin A-positive adipocytes could be found in all tissue models for up to 14 days, whereas adipocytes in the Air-HKGS and Air-KGM medium appeared to be smaller. Adipocytes in all tissue models were able to release adipocyte-specific factors, whereas the supplementation of keratinocyte-specific factors had a slightly negative effect on adipocyte functionality. The permeability of the epidermis was comparable across all models, since they were able to withstand deep penetration of cytotoxic Triton X in the same manner. Taken together, we were able to compose functional three-layered full-skin equivalents using the Air medium.
Blood vessel reconstruction is still an elusive goal for the development of in vitro models as well as artificial vascular grafts. In this study, we used a novel photo curable cytocompatible polyacrylate material (PA) for freeform generation of synthetic vessels. We applied stereolithography for the fabrication of arbitrary 3D tubular structures with total dimensions in the centimeter range, 300 µm wall thickness, inner diameters of 1 to 2 mm and defined pores with a constant diameter of approximately 100 µm or 200 µm. We established a rinsing protocol to remove remaining cytotoxic substances from the photo-cured PA and applied thio-modified heparin and RGDC-peptides to functionalize the PA surface for enhanced endothelial cell adhesion. A rotating seeding procedure was introduced to ensure homogenous endothelial monolayer formation at the inner luminal tube wall. We showed that endothelial cells stayed viable and adherent and aligned along the medium flow under fluid-flow conditions comparable to native capillaries. The combined technology approach comprising of freeform additive manufacturing (AM), biomimetic design, cytocompatible materials which are applicable to AM, and biofunctionalization of AM constructs has been introduced as BioRap® technology by the authors.
The development of in vitro adipose tissue constructs is highly desired to cope with the increased demand for substitutes to replace damaged soft tissue after high graded burns, deformities or tumor removal. To achieve clinically relevant dimensions, vascularization of soft tissue constructs becomes inevitable but still poses a challenge. Adipose-derived stem cells (ASCs) represent a promising cell source for the setup of vascularized fatty tissue constructs as they can be differentiated into adipocytes and endothelial cells in vitro and are thereby available in sufficiently high cell numbers.
This review summarizes the currently known characteristics of ASCs and achievements in adipogenic and endothelial differentiation in vitro. Further, the interdependency of adipogenesis and angiogenesis based on the crosstalk of endothelial cells, stem cells and adipocytes is addressed at the molecular level. Finally, achievements and limitations of current co-culture conditions for the construction of vascularized adipose tissue are evaluated.
In bioprinting approaches, the choice of bioink plays an important role since it must be processable with the selected printing method, but also cytocompatible and biofunctional. Therefore, a crosslinkable gelatin-based ink was modified with hydroxyapatite (HAp) particles, representing the composite buildup of natural bone. The inks’ viscosity was significantly increased by the addition of HAp, making the material processable with extrusion-based methods. The storage moduli of the formed hydrogels rose significantly, depicting improved mechanical properties. A cytocompatibility assay revealed suitable ranges for photoinitiator and HAp concentrations. As a proof of concept, the modified ink was printed together with cells, yielding stable three-dimensional constructs containing a homogeneously distributed mineralization and viable cells.
Adapting the characteristics of biomaterials specifically for in vitro and in vivo applications is becoming increasingly important in order to control interactions between materials and biological systems. These complex interactions are influenced by surface properties such as chemical composition, charge, and mechanical and topographic attributes. In many cases it is not useful, or even possible, to alter the base material, but changing the surface, to improve biocompatibility or to make surfaces bioactive, may be achieved by thin coatings. An established method is coating with polyelectrolyte multilayers (PEM). To adjust adhesion and proliferation and to improve the vitality of certain cell types, we modified the roughness of PEM coatings. We incorporated different types of nanoparticles (NPs) at different concentrations into PEM coatings to control surface roughness. The surface properties were characterized, and the reaction of three different cell types to these coatings was tested.
Thermopervaporation uses the same economically favorable driving force as membrane distillation, i.e., a temperature difference between feed and permeate for the transport, but with non-porous thin-film composite membranes. Membrane pores cannot be wetted, and long-term operational stability can be achieved with an appropriate coating layer, though normally at the cost of a lower flux compared to membrane distillation with porous hydrophobic membranes.
Porous asymmetric PVDF membranes were made to achieve low permeation resistance and pores which could be overcoated with polyelectrolyte polymers. This coating prohibits pore wetting and strongly reduces adsorption of organic substances.
These membranes showed a high permeation rate for water due to a structure of phase-separated hydrophilic and hydrophobic three-dimensional domains. The water permeation rates of these composite membranes are between 6 and 12 l/(h m²) at a feed temperature of 60 °C and a permeate temperature of 40 °C for a 2% saline solution feed, depending on the operational parameters. This is only a slight reduction of 10-15% in permeation rate compared to membrane distillation with porous hydrophobic membranes.
In a whey dewatering experiment, this membrane showed constant performance over four days of intermittent operation and remained stable during cleaning with a strongly alkaline solution.
This book showcases new and innovative approaches to biometric data capture and analysis, focusing especially on those that are characterized by non-intrusiveness, reliable prediction algorithms, and high user acceptance. It comprises the peer-reviewed papers from the international workshop on the subject that was held in Ancona, Italy, in October 2014 and featured sessions on ICT for health care, biometric data in automotive and home applications, embedded systems for biometric data analysis, biometric data analysis: EMG and ECG, and ICT for gait analysis. The background to the book is the challenge posed by the prevention and treatment of common, widespread chronic diseases in modern, aging societies. Capture of biometric data is a cornerstone for any analysis and treatment strategy. The latest advances in sensor technology allow accurate data measurement in a non-intrusive way, and in many cases it is necessary to provide online monitoring and real-time data capturing to support a patient’s prevention plans or to allow medical professionals to access the patient’s current status. This book will be of value to all with an interest in this expanding field.
Besides the optimisation of the car, energy efficiency and safety can also be increased by optimising driving behaviour. Based on this fact, a driving system is in development whose goal is to educate the driver in energy-efficient and safe driving. It monitors the driver, the car, and the environment and gives recommendations relevant to energy efficiency and safety. However, the driving system tries not to distract or bother the driver with recommendations, for example during stressful driving situations or when the driver is not interested in a recommendation. Therefore, the driving system monitors the driver's stress level as well as the driver's reaction to a given recommendation and decides whether or not to give a recommendation. This allows recommendations to be suppressed when needed and, thus, increases road safety and the user acceptance of the driving system.
A lot of people need help in their daily life to wash, select, and manage their clothing. The goal of this work is to design an assistant system (eKlarA) that supports the user with recommendations for choosing clothing combinations, finding the clothing, and washing the clothing. The idea behind eKlarA is a system that uses sensors to identify items of clothing and their state in the clothing cycle. The clothing cycle consists of the stations closet, laundry basket, and washing machine, in one or several places. The system uses information about the clothing, the weather, and the calendar to support the user in the different steps of the clothing cycle. The first prototype of this system has been developed and tested, and the test results are presented in this work.
Stress is recognized as a predominant disease with rising costs for rehabilitation and treatment. There are currently several approaches for determining and calculating stress levels, usually divided into two categories. Methods in the first category do not require any special equipment for measuring stress; they use the variations in behaviour patterns that occur under stress. The core disadvantage of this category is its limitation to specific use cases. The second category uses laboratory instruments and biological sensors. These methods measure stress precisely and proficiently, but at the same time they are neither mobile nor portable and do not support real-time feedback. This work presents a mobile system that provides the calculation of stress: the data of a mobile ECG sensor is analysed, processed, and visualised on a mobile device such as a smartphone. This work also explains the stress measurement algorithm used. The result is a portable system that uses a mobile device such as a smartphone as the visual interface for reporting the current stress level.
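The abstract does not name the algorithm used; a common ECG-based approach derives a stress score from heart-rate variability, e.g. the RMSSD of successive RR intervals, with lower variability read as higher stress. A minimal sketch under that assumption (the mapping thresholds are hypothetical):

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences of RR intervals (ms),
    a standard short-term heart-rate-variability measure."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def stress_level(rr_intervals_ms, low=20.0, high=60.0):
    """Map RMSSD onto a 0..1 stress score: low variability -> high stress.
    The thresholds low/high are illustrative, not taken from the paper."""
    value = min(max(rmssd(rr_intervals_ms), low), high)
    return (high - value) / (high - low)

calm = [800, 840, 790, 850, 810]      # irregular beat intervals -> high HRV
stressed = [700, 702, 701, 703, 702]  # very regular intervals -> low HRV
```

In a mobile system, such a score computed over a sliding window of the ECG-derived RR intervals could be visualised directly on the smartphone.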
Stress is becoming an important topic in modern life. Its influence results in a higher rate of health disorders such as burnout, heart problems, obesity, asthma, diabetes, and depression, among others. Furthermore, an individual's behavior and capabilities can be directly affected, leading to altered cognition and impaired decision-making and problem-solving skills. In a dynamic and unpredictable environment such as driving, this can result in a higher risk of accidents. Several papers have addressed the estimation and prediction of drivers' stress levels while driving. An important question, however, concerns not only the stress level of the individual driver, but also the mutual influence within a group of nearby drivers. This paper proposes a system that clusters the drivers in a nearby area and derives their individual stress levels. This information is analyzed to generate a stress map, a graphical view of road sections with higher stress. The aggregated data can be used to generate navigation routes with lower stress, in order to decrease stress-influenced driving and to improve road safety.
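The aggregation step described above can be sketched as follows, assuming per-driver stress readings in the range 0..1 attributed to road segments (segment ids, values, and function names are hypothetical):

```python
from collections import defaultdict

def stress_map(readings):
    """Aggregate (segment_id, stress) readings from many drivers
    into a mean stress value per road segment."""
    by_segment = defaultdict(list)
    for segment, stress in readings:
        by_segment[segment].append(stress)
    return {seg: sum(vals) / len(vals) for seg, vals in by_segment.items()}

def route_stress(route, smap, default=0.0):
    """Total aggregated stress along a candidate route (list of segment ids)."""
    return sum(smap.get(seg, default) for seg in route)

readings = [("A", 0.8), ("A", 0.6), ("B", 0.2), ("C", 0.3), ("C", 0.5)]
smap = stress_map(readings)
# Pick the lower-stress of two candidate routes:
calmer = min([["A", "B"], ["C", "B"]], key=lambda r: route_stress(r, smap))
```

A navigation system could then weight its routing cost function with these per-segment values rather than distance alone.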
Knee osteoarthritis is a common complication and can lead to total loss of joint function in patients. Treatment by either partial or total knee replacement with appropriate UHMWPE-based implants is highly invasive, may cause complications, and may show unsatisfying results. Alternatively, treatment may be done by insertion of an elastic interpositional knee spacer with optimized material characteristics.
We report the development of high-performance polyurethane-based polymers modified with bioactive molecules for the fabrication of such knee spacers. In order to tailor the mechanical and tribological properties and to improve resistance to enzymatic degradation, we propose a core-shell model for the spacer with specifically adapted properties.
Efficiency in supply chain risk management (SCRM) is a major topic in industries with serial production and complex supply chains, owing to limited management and financial resources. A high number of possible risk situations and intertwined processes makes resource allocation challenging. Managers cannot perform SCRM in all possible supply chain areas and therefore have to decide where the available resources should be deployed for the highest possible risk reduction. It is thus important to evaluate, quickly and systematically, the input and output relationships among risk mitigation actions in order to determine which actions should be deployed first for efficient risk reduction. This paper introduces a new SCRM method based on the failure mode and effects analysis (FMEA) that performs an efficiency-oriented prioritisation of risk actions. By weighing the cost-benefit of the identified mitigation actions for each assessed risk and by determining the implementation effort of each action (i.e. the cost of realising it), the method identifies the risks and risk mitigation actions that are most efficient for risk reduction and should be implemented first in the process of risk steering.
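The method itself is not given in the abstract; the core idea of an efficiency-oriented prioritisation can be sketched as follows, assuming FMEA-style risk priority numbers (RPN) before and after each action, with all action names and figures hypothetical:

```python
from dataclasses import dataclass

@dataclass
class MitigationAction:
    name: str
    rpn_before: int   # FMEA risk priority number before the action
    rpn_after: int    # expected RPN after implementing the action
    effort: float     # implementation cost/effort, arbitrary units

    @property
    def efficiency(self) -> float:
        """Risk reduction achieved per unit of implementation effort."""
        return (self.rpn_before - self.rpn_after) / self.effort

def prioritise(actions):
    """Order actions so the most efficient risk reductions come first."""
    return sorted(actions, key=lambda a: a.efficiency, reverse=True)

actions = [
    MitigationAction("dual sourcing", rpn_before=240, rpn_after=120, effort=60.0),
    MitigationAction("safety stock", rpn_before=180, rpn_after=90, effort=30.0),
    MitigationAction("supplier audit", rpn_before=200, rpn_after=180, effort=10.0),
]
ranked = prioritise(actions)
```

With limited resources, actions would then be implemented in ranked order until the available budget or effort is exhausted.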
Gamification, the use of game elements for non-gaming purposes, may just make a huge impact on education, a contribution that the world in general, and South Africa in particular, desperately needs. In today’s fast-paced work environment, there is not only a severe skills shortage, but also a great need for graduates with practical knowledge - students that are not purely “book smart”. Didactic teaching habits have created an education realm in which reciting facts is more often than not what gets students to pass. Learning factories are physical, operational factories that serve as exemplary and realistic hands-on learning environments and provide an important step towards more industry-prepared graduates. Top universities around the world are establishing such environments and are showing superb results. This paper explores the potential benefit of applying gamification in such a setting to enhance the learning environment even further and to provide opportunities for training otherwise difficult-to-teach topics, such as shop floor management.
During the first years of their employment, graduates are a liability to industry. The employer goes the extra mile to bridge the gap between university exit and profitable employment of engineering graduates. Unfortunately, some cannot take this risk. Given this scenario, this paper presents a learning factory approach as a platform for the application of knowledge, so as to develop the required engineering competences in South African engineering graduates before they enter the labour market. It spells out the components of a Stellenbosch University Learning Factory geared towards producing engineering graduates with the required industrial skills. It elaborates on the didactics embedded in the learning factory environment, tailor-made to produce engineers who can productively contribute to the growth of the industry upon exiting the university.
Increasingly volatile market conditions and manufacturing environments, combined with a rising demand for highly personalized products, the emergence of new technologies like cyber-physical systems and additive manufacturing, as well as an increasing cross-linking of different entities (Industrie 4.0), will result in fundamental changes to future work and logistics systems. The place of production, the logistical network and the respective production system will be subject to constant change, and therefore sources and sinks of logistical networks have to match the versatility of (cyber-physical) production systems. To cope with the complexity of controlling and monitoring changeable production and logistics systems, decentralized control systems are the means of choice, since centralized systems are pushed to their limits in this regard. This paradigm shift will affect the overall concept under which production and logistics are planned, managed and controlled, and how companies interact and collaborate within the emerging value chains by using dynamic methods to generate and execute the created network and to allocate available resources to fulfill the demand for customized products. In this field of research, learning factories, like the ESB Logistics Learning Factory at ESB Business School (Reutlingen University), provide great potential as a risk-free test bed to develop new methods and technical solutions, to investigate new technologies regarding their practical use, and to transfer the latest state of knowledge and specific competences into the training of students and professionals.
In keeping with these guiding principles, ESB Business School is transferring its existing production system into a cyber-physical production system to investigate innovative solutions for the design of human-machine collaboration and technical assistance systems, as well as to develop decentralized control methods for intralogistics systems following the requirements of changeable work systems, including the respective design of dynamic inbound and outbound logistics networks.
The fast-moving process of digitization demands flexibility in order to adapt to rapidly changing business requirements and newly emerging business opportunities. New features have to be developed and deployed to the production environment much faster. To cope with this increased velocity and pressure, many software-developing companies have switched to a Microservice Architecture (MSA) approach. Applications built this way consist of several fine-grained and heterogeneous services that are independently scalable and deployable. However, the technological and business architectural impacts of microservice-based applications directly affect their integration into the digital enterprise architecture. As a consequence, traditional Enterprise Architecture Management (EAM) approaches are not able to handle the extreme distribution, diversity, and volatility of micro-granular systems and services. We are therefore researching mechanisms for dynamically integrating large numbers of microservices into an adaptable digital enterprise architecture.
Disclosed is an electronic drive circuit and a drive method. The drive circuit includes an output; a first output transistor comprising a control node and a load path, wherein the load path is coupled between the output and a first supply node; a voltage regulator configured to control a voltage across the load path of the first output transistor; and a first driver configured to drive the first output transistor based on a first control signal.
This paper is concerned with the study, optimization and control of the moisture sorption kinetics of agricultural products at temperatures typically found in processing and storage. A nonlinear autoregressive with exogenous inputs (NARX) neural network was developed to predict moisture sorption kinetics, and consequently equilibrium moisture contents, of shiitake mushrooms (Lentinula edodes (Berk.) Pegler) over a wide range of relative humidity and different temperatures. Sorption kinetic data of mushroom caps was separately generated using a continuous, gravimetric dynamic vapour sorption analyser at temperatures of 25-40 °C over a stepwise variation of relative humidity ranging from 0 to 85%. The predictive power of the neural network was based on physical data, namely relative humidity and temperature. The model was fed with a total of 4500 data points by dividing them into three subsets: 70% of the data was used for training, 15% for testing and 15% for validation, randomly selected from the whole dataset. The NARX neural network was capable of precisely simulating equilibrium moisture contents of mushrooms derived from the dynamic vapour sorption kinetic data throughout the entire range of relative humidity.
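The structure of such a NARX model can be sketched in a strongly simplified form. The study fits a neural network; the sketch below substitutes a linear autoregression with an exogenous humidity input, fitted by least squares on synthetic relaxation data, so the dynamics, constants and noise level are invented for illustration and are not the mushroom data from the paper:

```python
import numpy as np

# Synthetic sorption kinetics: moisture m relaxes toward an equilibrium
# that depends on relative humidity rh (invented dynamics and constants).
rng = np.random.default_rng(0)
rh = np.repeat([0.20, 0.50, 0.85], 100)        # stepwise RH programme
m = np.empty(rh.size)
m[0] = 0.05
for t in range(rh.size - 1):
    m_eq = 0.3 * rh[t]                         # equilibrium moisture at this RH
    m[t + 1] = m[t] + 0.1 * (m_eq - m[t]) + rng.normal(0.0, 1e-4)

# Linear NARX form m[t+1] ~ a*m[t] + b*rh[t] + c, fitted by least squares.
X = np.column_stack([m[:-1], rh[:-1], np.ones(m.size - 1)])
coef, *_ = np.linalg.lstsq(X, m[1:], rcond=None)

rmse = float(np.sqrt(np.mean((X @ coef - m[1:]) ** 2)))
print(coef, rmse)
```

The fitted coefficients recover the generating dynamics (a close to 0.9, b close to 0.03), and the one-step prediction error stays at the injected noise level; a neural NARX replaces the linear map with a learned nonlinear one but keeps the same lagged-input structure.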
This case study describes the emerging customized omnichannel loyalty solution of Marc O’Polo from a customer’s perspective. After an introduction to Marc O’Polo and its general omnichannel strategy, the loyalty program is described in detail, including Marc O’Polo for members as well as the mobile app, social media, direct mail and in-store capabilities. A discussion chapter closes the case study with research implications and open questions for Marc O’Polo.
Loyalty programs are becoming more important in the omnichannel environment of the fashion retail business. After the definition of customer loyalty and loyalty programs, the main characteristics of omnichannel loyalty programs are described. Mobile, social media, direct mail and in-store capabilities are detailed as touchpoints of omnichannel loyalty programs. A discussion chapter closes with recommendations for fashion retailers.
This case study of Breuninger aims to analyze how Breuninger adapts to the emerging omnichannel environment in the fashion business. From a consumer’s perspective, Breuninger and its general omnichannel strategy are explained, before the loyalty program of Breuninger is analyzed in detail. Key factors such as the mobile app and the mobile Breuninger card, social media, direct mail and in-store capabilities are described. A discussion chapter finalizes the case.
In times of e-commerce and digitalization, new markets are opening, young companies have the opportunity to grow, and new perspectives arise in terms of customer relationships. Customers demand more options for personalization. At the same time, companies have access to new and, above all, more information about the customer. This could be a mutually beneficial development were it not for privacy issues. Vast amounts of data about consumers are collected in Big Data warehouses, to be analyzed via predictive analytics, with customers classified by algorithms such as clustering models, propensity models or collaborative filtering. All these subjects are growing in importance, as they are shaping the global marketing landscape. Marketers, together with IT scientists, develop new ways of analyzing customer databases and benefit from more accurate segmentation methods than those used until now. The following paper provides a literature review on new methods of consumer segmentation in light of the high inflow of new information via e-commerce. It introduces readers to the subject of predictive analytics and discusses several predictive models. The paper is not based on the author's own empirical research, but is intended as a reference text for further research. A conclusion completes the paper.
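One of the clustering models mentioned can be sketched minimally. The "customer" data below (purchase frequency vs. average basket value) and the two-segment structure are synthetic and purely illustrative, and the k-means loop is a bare-bones stand-in for the library implementations a marketer would actually use:

```python
import numpy as np

# Synthetic "customers" in (purchase frequency, average basket value) space,
# forming two behavioural groups -- invented stand-ins for e-commerce data.
rng = np.random.default_rng(1)
casual = rng.normal([2, 20], [1, 5], size=(50, 2))
loyal = rng.normal([12, 80], [2, 10], size=(50, 2))
X = np.vstack([casual, loyal])

# Minimal k-means, one of the clustering models mentioned above (k = 2).
centers = X[rng.choice(len(X), 2, replace=False)]
for _ in range(20):
    labels = np.argmin(((X[:, None] - centers) ** 2).sum(axis=2), axis=1)
    centers = np.array([X[labels == k].mean(axis=0) if np.any(labels == k)
                        else centers[k] for k in range(2)])

# Each synthetic segment should fall almost entirely into one cluster.
print(np.bincount(labels[:50], minlength=2), np.bincount(labels[50:], minlength=2))
```

On well-separated groups like these, the recovered clusters coincide with the generating segments; real customer data is far messier, which is why the predictive models surveyed in the paper go beyond plain distance-based clustering.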
Like many others, fashion companies have to deal with a global and very competitive environment. Thus, companies rely on accurate sales forecasts as a key success factor of efficient supply chain management. However, forecasters have to take into account some specificities of the fashion industry. To respond to these constraints, a variety of different forecasting methods exists, including new, computer-based predictive analytics. After the evaluation of different methods, their application to the fashion industry is investigated through semi-structured expert interviews. Despite several benefits, predictive analytics is not yet widely used in practice. This research not only reflects an industry profile, but also gives important insights about the future potential and obstacles of predictive analytics.
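As a concrete baseline, one of the classical forecasting methods that computer-based predictive analytics competes against can be written in a few lines. The weekly sales figures and smoothing factor below are invented for illustration:

```python
# Weekly sales of one article (invented figures) and a fixed smoothing factor.
sales = [120, 132, 101, 134, 90, 230, 210, 120, 132, 161]
alpha = 0.4

# Simple exponential smoothing: each new observation pulls the running
# forecast toward itself by the factor alpha.
forecast = sales[0]
for s in sales:
    forecast = alpha * s + (1 - alpha) * forecast

print(round(forecast, 1))  # next-period forecast
```

Such smoothing methods are cheap and robust but ignore the exogenous drivers (weather, trends, promotions) that the predictive-analytics approaches discussed in the interviews try to exploit.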
To ensure a smooth production flow and to avoid production standstills, continuous material availability is required. When selecting material supply principles, company-specific conditions must be taken into account. Using the example of ERBE Elektromedizin GmbH, Tübingen, this article presents one possible approach.
Broad acceptance of finite-element-based analysis of structural problems and the increased availability of CAD-systems for structural tasks, which help to generate meshes of non-trivial geometries, have set the standard for the evaluation of designs in mechanical engineering over the last few decades. The development of automated or semi-automated optimizers, integrated into Computer-Aided Engineering (CAE) packages or working as outer-loop machines that require the solver to analyse the specific designs, has been accepted by most advanced users of the simulation community as well. The availability of inexpensive computer processing power keeps increasing, with no limits foreseen in the coming years. There is little doubt that virtual product development will continue using the tools that have proved to be so successful and so easy to handle.
Current fields of interest
(2016)
If we review the research done in the field of optimization, the following topics appear to be the focus of current development:
– Optimization under uncertainties, taking into account the inevitable scatter of parts, external effects and internal properties. Reliability and robustness both have to be taken into account when running optimizations, so the name Robust Design Optimization (RDO) came into use.
– Multi-Objective Optimization (MOO) handles situations in which different participants in the development process are developing in different directions. Typically we think of commercial and engineering aspects, but other constellations have to be looked at as well, such as comfort and performance or price and consumption.
– Process development of the entire design process, including optimization from early stages, might help avoid inefficient efforts. Here the management of virtual development has to be re-designed to fit into a coherent scheme.
...
There are many other fields where interesting progress is being made. We limit our discussion to the first three questions.
To illustrate the power and the pitfalls of Bionic Optimization, we will show some examples spanning classes of applications and employing various strategies. These applications cover a broad range of engineering tasks. Nevertheless, there is no guarantee that our experiences and our examples will be sufficient to deal with all questions and issues in a comprehensive way. As a general rule, for each class of problems, novices should begin with a learning phase. So, in this introductory phase, we use simple and quick examples, e.g., small FE-models, linear load cases, short time intervals and simple material models. Here, beginners in the Bionic Optimization community can learn which parameter combinations to use. In Sect. 3.3 we discuss strategies for accelerating optimization studies. Using these parameters as starting points is one way to set the specific ranges, e.g., numbers of parents and kids, crossing, mutation radii, and numbers of generations. On the other hand, these trial runs will doubtless indicate that Bionic Optimization needs large numbers of individual designs, and considerable time and computing power. We recommend investing enough time in preparing each task in order to avoid the frustration should large jobs fail after long calculation times.
Application to CAE systems
(2016)
Due to the broad acceptance of CAD-systems based on 3D solids, the geometric data of all common CAE (Computer-Aided Engineering) software, at least in mechanical engineering, are based on these solids. We use solid models, where the space filled by material is defined in a simple and easily usable way. Solid models allow for the development of automated meshers that transform solid volumes into finite elements. Even after some unacceptable initial trials, users are able to generate meshes of non-trivial geometries within minutes to hours, instead of days or weeks. Once meshing was no longer the cost-limiting factor of finite element studies, numerical simulation became a tool for smaller industries as well.
In the early days of automated meshing development, there were discussions over the use of tetrahedral (Fig. 4.1) or hexahedral meshes. But after a short period of time it became evident that there were, and will always be, many problems using automated meshers to generate hexahedral elements. So today nearly all automated 3D-meshing systems use tetrahedral elements.
We have seen that bionic optimization can be a powerful tool when applied to problems with non-trivial landscapes of goals and restrictions. This, in turn, led us to a discussion of useful methodologies for applying this optimization to real problems. On the other hand, it must be stated that every optimization is a time-consuming process as soon as the problem expands beyond a small number of free parameters with simple parabolic responses. Bionic optimization is not a quick approach to solving complex questions within short times. In some cases it has the potential to fail entirely, either by sticking to local maxima or by random exploration of the parameter space without finding any promising solutions. The following sections present some remarks on the efficiency and limitations users must be aware of. They aim to broaden the knowledge base for using bionic optimization, but they should not discourage potential users from this promising field of powerful strategies for finding good or even the best possible designs.
In this chapter we introduce methods to improve mechanical designs by bionic methods. In most cases we assume that a general idea of the part or system is given by a set of data or parameters. Our task is to modify these free parameters so that a given goal or objective is optimized without violation of any of the existing restrictions.
Motivation
(2016)
Since human beings started to work consciously with their environment, they have tried to improve the world they were living in. Early use of tools, increasing quality of these tools, use of new materials, fabrication of clay pots, and heat treatment of metals: all these were early steps of optimization. But even on lower levels of life than human beings or human society, we find optimization processes. The organization of a herd of buffalos to face their enemies, the coordinated strategies of these enemies to isolate some of the herd’s members, and the organization of bird swarms on their long flights to their winter quarters: all these social interactions are optimized strategies of long learning processes, most of them the result of a kind of collective intelligence acquired during long selection periods.
The primary aim and task of this work is ... the development of a new recycling method for PET that avoids the disadvantages of previous recovery methods and, while largely preserving the synthesis effort already invested, yields defined oligomers. From these, high-quality products can subsequently be manufactured.
Disclosed are an electronic drive circuit and a drive method. The drive circuit comprises an output; a first output transistor with a control node and a load path, wherein the load path is connected between the output and a first supply node; a voltage regulator configured to control a voltage across the load path of the first output transistor; and a first driver configured to drive the first output transistor as a function of a first control signal.
The present invention relates to a method for controlling a dead time in a synchronous converter (100), in which a control switch (2) and a synchronous switch (3) are switched cyclically, the control switch (2) being switched by means of a first switching signal (S1) and the synchronous switch (3) by means of a second switching signal (S2). The method comprises capturing and holding a voltage value describing a voltage (VSW) across the synchronous switch (3) at a specific point in time, and adapting the first and/or second switching signal (S1, S2) for a following cycle based on the held voltage value.
The invention relates to an energy transmitter (100) for the inductive transfer of energy from a primary circuit (10) of the energy transmitter (100) to a first (5) and a second (15) voltage domain of a secondary circuit (20) of the energy transmitter (100), and for the transfer of information from the secondary circuit (20) to the primary circuit (10). The energy transmitter (100) comprises: – a transformer (30), via which the primary circuit (10) and the secondary circuit (20) are inductively coupled to one another and via which both the energy transfer and the information transfer take place; and – an amplitude modulation module (50) for modulating the current and/or voltage amplitude in the secondary circuit (20) by means of an amplitude modulation switch (55), wherein the amplitude modulation switch (55) is arranged between the first (5) and second (15) voltage domain of the secondary circuit (20) and is designed to change the current and/or voltage amplitude in the primary circuit (10) by opening and closing the amplitude modulation switch (55), thereby transferring information from the secondary circuit (20) to the primary circuit (10). The present invention further relates to a gate driver for switching a power switch (500) and to a method for the inductive transfer of energy with combined information transfer.
Today the optimization of metal forming processes is done using advanced simulation tools in a virtual process, e.g. FEM studies. The modification of the free parameters represents the different variants to be analysed. Experienced engineers may thus derive useful proposals in acceptable time if good initial proposals are available. As soon as the number of free parameters grows, or the total process takes a long time and uses several successive forming steps, it might be quite difficult to find promising initial ideas. In metal forming another problem has to be considered: optimization using a series of local improvements, often called a gradient approach, may find a local optimum, but this could be far away from a satisfactory solution. Therefore non-deterministic approaches, e.g. Bionic Optimization, have to be used. Approaches like Evolutionary Optimization or Particle Swarm Optimization are capable of covering a large range of high-dimensional optimization spaces and of discovering many local optima. The chance of including the global optimum thus increases when using such non-deterministic methods. Unfortunately, these bionic methods require large numbers of studies of different variants of the process to be optimized. The number of studies tends to increase exponentially with the number of free parameters of the forming process. As the time for one single study may not be small either, the total time demand becomes unacceptable, taking weeks to months even if high-performance computing is used. Therefore the optimization process needs to be accelerated. Among the many ideas to reduce the time and computing power requirements, Meta- and Hybrid Optimization seem to produce the most efficient results. Hybrid Optimization often consists of global searches for promising regions within the parameter space. As soon as the studies indicate that there could be a local optimum, a deterministic study tries to identify this local region.
If it shows better performance than other optima found so far, it is preserved for a more detailed analysis. If it performs worse than other optima, the region is excluded from further search. Meta-Optimization is often understood as the derivation of Response Surfaces as functions of the free parameters. Once enough studies have been performed, the optimization is done using the Response Surfaces as representatives, e.g. of the goal and the restrictions of the optimization problem. Having found regions where interesting solutions are to be expected, the studies available so far are used to define the Response Surfaces. In many cases low-degree polynomials are used, their coefficients defined by least-squares methods. Both proposals, Hybrid Optimization and Meta-Optimization, sometimes used in combination, often help to reduce the total optimization process by large numbers of variants to be studied. Consequently, they are highly recommended when dealing with time-consuming optimization studies.
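A minimal Particle Swarm Optimization loop illustrates the non-deterministic global search described above. The objective here is an invented multi-modal function standing in for an expensive forming-process simulation, and all parameter choices (swarm size, inertia, acceleration factors) are illustrative rather than taken from the text:

```python
import math
import random

def objective(p):
    # Invented multi-modal goal with many local optima, a cheap stand-in
    # for a forming-process FEM study; the global optimum lies at (3, -1).
    x, y = p
    return ((x - 3) ** 2 + (y + 1) ** 2
            + 3 * (2 - math.cos(2 * math.pi * (x - 3))
                     - math.cos(2 * math.pi * (y + 1))))

random.seed(42)
n_particles, n_iters = 30, 200
pos = [[random.uniform(-10, 10) for _ in range(2)] for _ in range(n_particles)]
vel = [[0.0, 0.0] for _ in range(n_particles)]
pbest = [p[:] for p in pos]                 # each particle's best position
gbest = min(pbest, key=objective)[:]        # swarm's best position

for _ in range(n_iters):
    for i in range(n_particles):
        for d in range(2):
            vel[i][d] = (0.7 * vel[i][d]    # inertia term
                         + 1.5 * random.random() * (pbest[i][d] - pos[i][d])
                         + 1.5 * random.random() * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if objective(pos[i]) < objective(pbest[i]):
            pbest[i] = pos[i][:]
            if objective(pbest[i]) < objective(gbest):
                gbest = pbest[i][:]

print(gbest, objective(gbest))
```

Every swarm iteration evaluates the objective for all particles, which makes plain the cost problem discussed above: with a real simulation behind `objective`, each of these thousands of evaluations would take minutes to hours, motivating Hybrid and Meta-Optimization as accelerators.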
In this article an energy harvesting system for measuring wind speeds starting from 2 m/s (about 2 Bft) is presented, which uses the same source for measurement and power supply (energy-autarkic operation). Using the same source for measurement and power supply increases the number of potential applications, since the needed power is present together with the measuring signal. For measuring wind velocity, one might consider applications in tunnels, tubes, pipelines, air conditioning, or for monitoring the clogging of filters. Bluetooth Low Energy (BLE) is chosen as the radio technology, since it allows a unidirectional communication requiring only a single telegram (advertising telegram) for sending the measured value. The more complex establishment of communication required by WLAN or 6LoWPAN can thus be avoided, significantly reducing the overall energy consumption. Since the advertising telegram makes no provision for security, or against hacking in general, a security concept is presented which includes encryption and resilience against replay attacks in a unidirectional communication system.
To facilitate the presented concepts beyond wind sensors, the system is divided into three major modules: the generator-sensor module, the radio module and the energy management module. Whereas the first two might be changed for different applications, the energy management module could be reused in many applications. It supplies and stores the needed energy and switches power on and off in a deterministic way to ensure that the radio module can operate correctly.
We present a fully automatic approach to real-time 3D face reconstruction from monocular in-the-wild videos. We use a 3D morphable face model to obtain a semi-dense shape and combine it with a fast median-based super-resolution technique to obtain a high-fidelity textured 3D face model. Our system does not need prior training and is designed to work in uncontrolled scenarios.
Operating combined heat and power (CHP) plants can save a considerable amount of primary energy. For this reason, CHP plants are subsidised under various laws and directives. For the economic operation of a CHP plant, it is necessary to either consume the largest possible share of the generated electricity on site or to sell it to third parties (tenants, apartment owners, ...). With the KWKG 2016, larger CHP plants become attractive, and plants with shorter annual running times can even prove more economical than pure base-load plants.
Due to the complexity of assembly processes, a high ratio of tasks is still performed by human workers. Short-cyclically changing work contents due to smaller lot sizes, especially in varied series assembly, increase both the need for information support and the risk of rising physical and psychological stress. The use of technical and digital assistance systems can counter these challenges. Through the integration of information and communication technology as well as collaborative assembly technologies, hybrid cyber-physical assembly systems will emerge. Widely established assembly planning approaches for digital and technical support systems in cyber-physical assembly systems are outlined and discussed with regard to synergies and delimitations of planning perspectives.
Data collected from internet applications are mainly stored in the form of transactions. All transactions of one user form a sequence, which shows the user's behaviour on the site. Nowadays, it is important to be able to classify this behaviour in real time for various reasons: e.g. to increase the conversion rate of customers while they are in the store, or to prevent fraudulent transactions before they are placed. However, this is difficult due to the complex structure of the data sequences (i.e. a mix of categorical and continuous data types, constant data updates) and the large amounts of data that are stored. Therefore, this thesis studies the classification of complex data sequences. It surveys the fields of time series analysis (temporal data mining), sequence data mining and standard classification algorithms. It turns out that these algorithms are either difficult to apply to data sequences or do not deliver a classification: time series methods need a predefined model and are not able to handle complex data types; sequence classification algorithms such as the apriori algorithm family are not able to utilize the time aspect of the data. The strengths and weaknesses of the candidate algorithms are identified and used to build a new approach to the problem of classifying complex data sequences. The problem is thereby solved by a two-step process. First, feature construction is used to create and discover suitable features in a training phase. Then, the blueprints of the discovered features are used in a formula during the classification phase to perform the real-time classification. The features are constructed by combining and aggregating the original data over the span of the sequence, including the elapsed time by using a calculated time axis. Additionally, a combination of features and feature selection are used to simplify complex data types. This allows catching behavioural patterns that occur in the course of time.
This newly proposed approach combines techniques from several research fields. Part of the algorithm originates from the field of feature construction and is used to reveal behaviour over time and express this behaviour in the form of features. A combination of the features is used to highlight relations between them. The blueprints of these features can then be used to achieve classification in real time on an incoming data stream. An automated framework is presented that allows the features to adapt iteratively to a change in underlying patterns in the data stream. This core feature of the presented work is achieved by separating the feature application step from the computationally costly feature construction step and by iteratively restarting the feature construction step on the new incoming data. The algorithm and the corresponding models are described in detail and applied to three case studies (customer churn prediction, bot detection in computer games, credit card fraud detection). The case studies show that the proposed algorithm is able to find distinctive information in data sequences and use it effectively for classification tasks. The promising results indicate that the suggested approach can be applied to a wide range of other application areas that incorporate data sequences.
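The feature construction step for a single user's transaction sequence might, in a strongly simplified form, look as follows. The transaction fields and the chosen aggregates are invented for illustration and are not the thesis's actual feature set:

```python
from datetime import datetime

# Invented transaction sequence for one user: (timestamp, amount, category).
sequence = [
    (datetime(2016, 5, 1, 10, 0), 19.99, "shoes"),
    (datetime(2016, 5, 1, 10, 5), 49.50, "coats"),
    (datetime(2016, 5, 3, 18, 30), 5.00, "socks"),
]

def construct_features(seq):
    """Aggregate a raw sequence into fixed-size features, including the
    elapsed time, so that a standard classifier can consume them."""
    times, amounts, cats = zip(*seq)
    span_hours = (max(times) - min(times)).total_seconds() / 3600.0
    return {
        "n_transactions": len(seq),
        "mean_amount": sum(amounts) / len(seq),
        "span_hours": span_hours,                     # calculated time axis
        "rate_per_hour": len(seq) / span_hours if span_hours else float(len(seq)),
        "n_categories": len(set(cats)),               # categorical type simplified
    }

features = construct_features(sequence)
print(features)
```

Once such a blueprint is fixed, applying it to an incoming stream is cheap, which mirrors the separation between the costly construction phase and the real-time application phase described above.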
The Eighth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2016), held between June 26 - 30, 2016 in Lisbon, Portugal, continued a series of international events covering a large spectrum of topics related to advances in database fundamentals, the evolving relation between databases and other domains, database technologies and content processing, as well as specifics of databases in application domains. Advances in different technologies and domains related to databases triggered substantial improvements in content processing, information indexing, and data, process and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of XML adoption. High-speed communications and computations, large storage capacities, and load balancing for distributed database access allow new approaches to content processing with incomplete patterns, advanced ranking algorithms and advanced indexing methods. Evolution in e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems puts pressure on database communities to push the 'de facto' methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
Combining software product lines and agile software development in the automotive industry is promising. The aim is to obtain both the benefits of agile methods, such as short development cycles, and the benefits of systematic reuse, such as effective variant management. However, the combination also poses challenges and requires a suitable introduction or transformation strategy. Based on findings from an interview study and from existing product line developments, challenges and solution ideas are presented.
This article studies the development of e-governance over time and across countries. We use a large data sample consisting of 99 developing and 34 OECD countries. Firstly, we study the development of e-governance. Secondly, we estimate models to identify the determining factors of e-governance over time and across countries. The study reveals that the level of e-governance is determined by the degree of e-participation, online access, and GDP per capita.
This article contributes to marketing research by laying foundations for the young but increasingly relevant research stream on customer experience management (CEM). On the one hand, the identified framework shows that CEM goes beyond individual corporate capabilities, such as the design of service experiences, that have dominated CEM research so far. On the other hand, the framework contributes to the synthesis of fragmented but interrelated streams of literature in marketing research ...
The book provides suggestions on how to start using bionic optimization methods, including pseudo-code examples of each of the important approaches and outlines of how to improve them. The most efficient methods for accelerating the studies are discussed. These include the selection of size and generations of a study’s parameters, modification of these driving parameters, switching to gradient methods when approaching local maxima, and the use of parallel working hardware.
Bionic optimization means finding the best solution to a problem using methods found in nature. As evolutionary strategies and particle swarm optimization seem to be the most important methods for structural optimization, we primarily focus on them. Other methods such as neural nets or ant colonies are more suited to control or process studies, so their basic ideas are outlined in order to motivate readers to start using them.
A set of sample applications shows how bionic optimization works in practice. From academic studies on simple frames made of rods to earthquake-resistant buildings, readers follow the lessons learned, difficulties encountered and effective strategies for overcoming them. For the problem of tuned mass dampers, which play an important role in dynamic control, changing the goal and restrictions paves the way for multi-objective-optimization. As most structural designers today use commercial software such as FE-Codes or CAE systems with integrated simulation modules, ways of integrating bionic optimization into these software packages are outlined and examples of typical systems and typical optimization approaches are presented.
The closing section focuses on an overview and outlook on reliable and robust as well as on multi-objective optimization, including discussions of current and upcoming research topics in the field concerning a unified theory for handling stochastic design processes.
The most important objective of event marketing is to improve the image of a brand or a company. The paper presents an image transfer model for event marketing. Based on current research, an image transfer model for event marketing is developed and the conditions required for an image transfer to take place from an event to a brand or a company are explained. Depending on which conditions are met, there are different consequences with regard to the image transfer from the event to the brand or company that are structured and characterized in detail. The image transfer model is developed against the backdrop of selected event types often used in actual practice. The focus of its application lies mainly in brand-oriented leisure and infotainment events directed towards external target groups. The model provides a discussion and analysis of the impact category of the image transfer in event marketing. The paper explains that the possibility of an attitude change is given within the context of event marketing. The presented model serves to structure the image transfer in event marketing. It is intended to illustrate the steps that are involved in the emergence of an image transfer as well as the resulting alternative consequences.
Sport marketing is the specific application of marketing principles and processes to sports products and services. In 2014 the biggest sports event in the world, the FIFA World Cup, took place in Brazil. Billions of spectators around the world saw Germany win the trophy in Rio de Janeiro for the fourth time in history. Yet unlike in previous World Cups, conversation was not only taking place at the numerous public viewings held in venues like bars and restaurants. Throughout the tournament, social media like Facebook and Twitter played a dominant role in all aspects. With 672 million tweets on Twitter and three billion conversations on Facebook, this was the most social World Cup as well as the most social mega sports event so far. Whether users, athletes or companies, everyone was trying to join the conversation to stay informed or to inform others about their opinion or the latest news. This paper analyzes the implementation of social media marketing during mega sports events with a focus on Adidas' and Nike's social media campaigns in the frame of the FIFA World Cup 2014 in Brazil. The analysis shows that social media marketing in the frame of mega sports events is gaining importance. Companies that find topics related to their products that affect people personally achieve success through social media marketing.
Analysis of multicellular patterns is required to understand tissue organizational processes. By using a multi-scale, object-oriented image processing method, the spatial information of cells can be extracted automatically. Instead of manual segmentation or indirect measurements, such as the general distribution of contrast or flow, the orientation and distribution of individual cells are extracted for quantitative analysis. Relevant objects are identified by feature queries, and no low-level knowledge of image processing is required.
It is assumed that more education leads to a better understanding of complex systems. Some researchers, however, find indications that simple mechanisms like stocks and flows are not well understood even by people who have completed higher education. In this paper, we test people's understanding of complex systems with the widely studied stock-and-flow (SF) tasks (Booth Sweeney and Sterman 2000). SF tasks assess people's understanding of the interplay between stocks and flows. We investigate SF failure of domain experts and novices in different knowledge domains. In particular, we compare performance on the original study's Bathtub task with the square wave pattern (Booth Sweeney and Sterman 2000) against two alternative cover stories from the engineering and business domains, across different groups of business and engineering students from different semesters. Further, we show that, while engineering students perform better than business students, as they progress through higher education, students seem to lose the capability of dealing with simple SF tasks from domains other than their own field. We thus find hints of déformation professionnelle in higher education.
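For readers unfamiliar with SF tasks, the underlying arithmetic is plain accumulation: the stock changes only by inflow minus outflow. The sketch below (all numbers invented) traces a bathtub's stock under a square-wave inflow against a constant outflow:

```python
def simulate_stock(initial, inflows, outflows, dt=1.0):
    """Accumulate net flow into the stock, period by period."""
    stock, trajectory = initial, [initial]
    for q_in, q_out in zip(inflows, outflows):
        stock += (q_in - q_out) * dt   # stock change = inflow - outflow
        trajectory.append(stock)
    return trajectory

# Square-wave inflow (75, 75, 25, 25, ...) against a constant outflow of 50.
inflow = [75, 75, 25, 25] * 2
print(simulate_stock(100, inflow, [50] * 8))
# -> [100, 125, 150, 125, 100, 125, 150, 125, 100]
```

The correct answer is a triangle-wave stock, which is exactly the pattern many SF-task participants fail to produce.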
The financial crisis of 2007-2010 was probably one of the greatest, most illustrious black-swan events that people of our generation(s) will experience, and at its heart it was a dynamic phenomenon. The vision of the System Dynamics Society states that we aspire to transform society by influencing decision-making. Yet it seems as if system dynamics did not play any significant role in this crisis: we did not examine the markets, we did not provide insights to banks, and we did not warn governments or the people. In our presentation we describe the dynamics involved in a housing bubble, and describe what made the last one different. With the insights gained from this exercise we conclude that, from a system dynamics perspective, the dimension of the financial crisis of 2007-2009 was eminently foreseeable, which leads us to pose the following question: where were we as a field while this crisis was unfolding, and why were we not active players? We present a range of potential answers to this question, hoping to provoke some reflection... and maybe some (re)action.
Wasted paradise – imagining the Maldives without the garbage island of Thilafushi : Version 1.2
(2016)
To address the high level of waste production in the Maldives, the local government decided in 1992 to transform the coral island of Thilafushi into an immense waste dump. Today, 330 tons of waste are ferried to Thilafushi each day. The policy had the positive consequence of relieving the garbage burden in Malé, the main island, and the surrounding tourist atolls. However, it can also lead to serious environmental and economic damage in the long run. First, the garbage is within visual range of one of the most prominent tourist destinations. Second, if the wind blows a certain way, unfiltered fumes from burning waste travel to the tourist atolls. Third, water quality can erode as hazardous waste from batteries and other toxic waste floats in the ocean. Over time, these effects can accumulate and significantly reduce the number of tourists that travel to the Maldives, one of the state's main sources of income. In our paper, we lay out the situation in more detail and translate it into a simulation model. We test different policies in order to propose to the Maldives government how to better solve the waste problem.
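A minimal stock-and-flow sketch of the waste problem might look as follows; the daily inflow of 330 tons is taken from the abstract, while the incineration rate is an invented policy parameter, not a figure from the paper:

```python
def waste_stock(days, inflow_t_per_day=330, burn_t_per_day=250, initial=0.0):
    """Accumulated waste stock (tons) on the island after `days` days."""
    stock = initial
    for _ in range(days):
        stock += inflow_t_per_day                # daily ferry deliveries
        stock -= min(stock, burn_t_per_day)      # cannot burn more than exists
    return stock

print(waste_stock(365))                          # net +80 t/day accumulates
print(waste_stock(365, burn_t_per_day=330))      # a balancing policy: stock stays at 0
```

Policy tests in the paper's simulation model would compare such trajectories under different disposal rates and side effects.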
Sleep is an important aspect in the life of every human being. The average sleep duration for an adult is approximately 7 h per day. Sleep is necessary to regenerate the physical and psychological state of a human. Poor sleep quality has a major impact on health status and can lead to various diseases. In this paper an approach is presented which uses long-term monitoring of vital data, gathered by a body sensor during the day and the night and supported by a mobile application connected to an analysis system, to estimate the sleep quality of its user and to give recommendations for improving it in real time. Actimetry and historical data are used to improve the individual recommendations, based on common techniques from the areas of machine learning and big data analysis.
This article highlights three major outcomes from global employability surveys about the topic of gender diversity. Students and graduates of two master programs at ESB Business School of Reutlingen University in Germany were asked about their study programs, their expected and their realized career paths, and their individual well-being. This article highlights selected gender differences that were discovered in the analysis and underlines results on specific gender issues. The three major outcomes are: firstly, men and women work in different industries, functions, and leadership positions; secondly, there is a potential for unfulfilled expectations of young managers regarding their achievement of certain positions and the realization of their private goals; thirdly, by looking at the graduates’ career paths in combination with their well-being, a low level of satisfaction with work-life balance and high levels of stress could be identified. The results give valuable insights into the conceptual world of students at the beginning of their career and as future managers. Looking at gender differences and gender issues leads to interesting findings which can be used for further research and discussions at ESB Business School. By contrasting the outcomes of the alumni survey with outcomes of the student survey, significant differences between the awareness of students and the reality of the graduates concerning gender diversity issues were discovered. The disclosed gap between students’ expectations and the real-life situations of the alumni indicates further areas for discussion. One major question is how students can cope with these challenges and issues of gender diversity management in future management positions as (female) managers while taking corporate social responsibility into consideration.
The growing penetration of cyber-physical systems and their interconnection into cyber-physical production systems (CPPS) is fundamentally changing future assembly, manufacturing and logistics systems, which require innovative methods for planning, controlling and monitoring adaptable production systems. Future logistics systems will be subject to the demands of high-frequency change and reconfiguration, triggered by adaptable production systems for individualized products and small lot sizes. Decentralized control systems, in which the complex planning, control and monitoring processes are distributed across numerous nodes and entities of the emerging control system, offer great potential to meet the requirements of cyber-physical logistics systems. A central challenge is the real-time control and reconfiguration of so-called hybrid logistics systems, which are characterized by, among other things, human-machine collaboration and the combination of heterogeneous conveyor equipment and heterogeneous control architectures, and which moreover rely on hybrid decision-making processes that synergistically exploit the capabilities of humans and (cyber-physical) systems.
Learning factories, such as the ESB Logistik-Lernfabrik at ESB Business School (Reutlingen University), offer far-reaching opportunities to develop these innovative methods, systems and technical solutions in an industry-like and risk-free factory environment, and to transfer them into the education of students and the continuing training of participants from industry. To expand research, teaching and training in the field of future assembly, manufacturing and logistics systems, the existing production system of the ESB Logistik-Lernfabrik is being gradually transformed, within a variety of research and student projects, into a decentrally controlled cyber-physical production system based on an event-oriented, cloud-based and decentralized control architecture.
An integrated synchronous buck converter with a high-resolution dead time control for input voltages up to 48 V and 10 MHz switching frequency is presented. The benefit of an enhanced dead time control at light loads, enabling zero voltage switching at both the high-side and the low-side switch, is studied. This way, compact multi-MHz DC-DC converters can be implemented with high efficiency over a wide load current range. The concept also eliminates body diode forward conduction losses and minimizes reverse recovery losses. A dead time resolution of 125 ps is realized by an 8-bit differential delay chain. A further efficiency enhancement by soft switching at the high-side switch at light load is achieved with a voltage boost of the switching node by dead time control in forced continuous conduction mode. The monolithic converter is implemented in a 180 nm high-voltage BiCMOS technology. At VIN = 48 V, VOUT = 5 V, 50 mA load, 10 MHz switching frequency and 500 nH output inductance, the efficiency is measured to increase by 14.4% compared to a conventional predictive dead time control. A peak efficiency of 80.9% is achieved at 12 V input.
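As a quick plausibility check of the figures quoted above: an 8-bit delay chain with a 125 ps step covers dead-time codes 0 to 255, spanning roughly 32 ns. A trivial sketch of that code-to-delay mapping (the control-word encoding is assumed, not taken from the paper):

```python
STEP_PS = 125  # dead time resolution from the abstract

def dead_time_ps(code):
    """Dead time in picoseconds for an 8-bit delay-chain control word."""
    assert 0 <= code <= 0xFF, "8-bit control word"
    return code * STEP_PS

print(dead_time_ps(255))  # -> 31875 ps, i.e. a full range of ~32 ns
```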
The growing share of renewable electricity generation calls for efforts to counteract the associated supply fluctuations and the additional load on the grid. Decentralized, demand-oriented electricity generation by means of combined heat and power (CHP) can make a substantial contribution to ensuring a secure and constant electricity supply and to relieving the grids. For this purpose, however, a control system is required that enables CHP units both to keep meeting the heat demand of the building and to generate electrical energy exactly when it is needed. Electricity generation can be decoupled from heat demand via the thermal storage tank that is fitted as standard. The storage tank is thus the central element of the overall plant, for which the control system for optimizing self-consumption was developed and tested within the research project.
In recent years, significant progress has been made on switched-capacitor (SC) DC-DC converters, as they enable fully integrated on-chip power management. New converter topologies have overcome the fixed input-to-output voltage limitation and achieved high efficiency at high power densities. SC converters are attractive not only for mobile handheld devices with small input and output voltages, but also for power conversion in IoE, industrial and automotive applications. Such applications need to be capable of handling widely varying input voltages of more than 10 V, which requires a large number of conversion ratios. The goal is to achieve fine granularity with the smallest number of flying capacitors. In [1] an SC converter was introduced that achieves these goals at low input voltage VIN ≤ 2.5 V. [2] shows good efficiency up to VIN = 8 V, while its conversion ratio is restricted to ≤ 1/2 with a limited, non-equidistant number of conversion steps. A particular challenge arises with increasing input voltage, as several loss mechanisms such as parasitic bottom-plate losses and gate-charge losses of high-voltage transistors become significant. High input voltages require supporting circuits like level shifters and auxiliary supply rails, which allocate additional area and add losses [2-5]. The combination of increasing voltage and conversion ratios (VCR) lowers the efficiency and the achievable output power of SC converters. [3] and [5] use external capacitors to enable higher output power, especially for higher VIN. However, this contradicts the goal of a fully integrated power supply.
A highly integrated synchronous buck converter with a predictive dead time control for input voltages >18 V at 10 MHz switching frequency is presented. A high-resolution dead time of ~125 ps allows dead-time-dependent losses to be reduced without requiring body diode conduction to evaluate the dead time. The high resolution is achieved by frequency-compensated sampling of the switching node and by an 8-bit differential delay chain. Dead time parameters are derived in a comprehensive study of dead-time-dependent losses. This way, the efficiency of fast switching DC-DC converters can be optimized by eliminating body diode forward conduction losses, minimizing reverse recovery losses and achieving zero voltage switching. High-speed circuit blocks for fast switching operation are presented, including a level shifter, gate driver and PWM generator. The converter has been implemented in a 180 nm high-voltage BiCMOS technology.
The power supply is one of the major challenges for applications like the Internet of Things (IoT) and smart home. The maintenance issue of batteries and the limited power level of energy harvesting are addressed by the integrated micro power supply presented in this paper. Connected to the 120/230 Vrms mains, which is one of the most reliable energy sources and available anywhere indoors, it provides a 3.3 V DC output voltage. The micro power supply consists of a fully integrated AC-DC and DC-DC converter with one external low-voltage SMD buffer capacitor. It is fabricated in a low-cost 0.35 μm 700 V CMOS technology and occupies a die size of 7.7 mm². The use of only one external low-voltage SMD capacitor results in an extremely compact form factor. The AC-DC stage is a direct-coupled full-wave rectifier with a subsequent bipolar shunt regulator, which provides an output voltage of around 17 V. The DC-DC stage is a fully integrated 4:1 SC DC-DC converter with an input voltage as high as 17 V and a peak efficiency of 45%. The power supply achieves an overall output power of 3 mW, resulting in a power density of 390 μW/mm². This exceeds prior art by a factor of 11.
The thermal storage tank of a CHP plant can be used to shift the operation of the CHP unit into the periods of electricity consumption. An ad-hoc activation function improves the result compared with a schedule created on the basis of forecasts. However, an increased number of CHP unit starts and increased heat losses at the storage tank have to be taken into account. By far the best results are achieved for CHP units with power modulation.
3D morphable face models are a powerful tool in computer vision. They consist of a PCA model of face shape and colour information and allow a 3D face to be reconstructed from a single 2D image. 3D morphable face models are used for 3D head pose estimation, face analysis, face recognition, and, more recently, facial landmark detection and tracking. However, they are not as widely used as 2D methods, as the process of building and using a 3D model is much more involved.
In this paper, we present the Surrey Face Model, a multi-resolution 3D morphable model that we make available to the public for non-commercial purposes. The model contains different mesh resolution levels and landmark point annotations as well as metadata for texture remapping. Accompanying the model is a lightweight open-source C++ library designed with simplicity and ease of integration as its foremost goals. In addition to basic functionality, it contains pose estimation and face frontalisation algorithms. With the tools presented in this paper, we aim to close two gaps. First, by offering different model resolution levels and fast fitting functionality, we enable the use of a 3D morphable model in time-critical applications like tracking. Second, the software library makes it easy for the community to adopt the 3D morphable face model in their research, and it offers a public place for collaboration.
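At the core of such a morphable model is a linear PCA generator: an instance is the mean shape plus a linear combination of principal components. The numpy sketch below uses made-up data and is not the paper's C++ library API; it only illustrates the generator equation:

```python
import numpy as np

rng = np.random.default_rng(0)
mean_shape = rng.normal(size=9)        # 3 vertices x (x, y, z), flattened
components = rng.normal(size=(9, 2))   # two invented principal components

def reconstruct(coefficients):
    """Model instance = mean + components @ coefficients."""
    return mean_shape + components @ np.asarray(coefficients)

face = reconstruct([0.5, -1.0])
print(face.shape)  # -> (9,)
```

Fitting a 2D image then amounts to searching for coefficients (plus pose and camera parameters) whose projection best matches the image.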
Nowadays there is a rich diversity of sleep monitoring systems available on the market. They promise to offer information about the sleep quality of the user by recording a limited number of vital signals, mainly heart rate and body movement. Typically, fitness trackers, smart watches, smart shirts, smartphone applications or patches do not provide access to the raw sensor data. Moreover, the sleep classification algorithm and its agreement ratio with the gold standard, polysomnography (PSG), are not disclosed. Some commercial systems record and store the data on the wearable device, but the user needs to transfer and import it into specialised software applications or return it to the doctor for clinical evaluation of the data set. Thus, an immediate feedback mechanism and the possibility of remote control and supervision are lacking. Furthermore, many such systems only distinguish between sleep and wake states, or between wake, light sleep and deep sleep. It is not always clear how these stages are mapped to the four known sleep stages: REM, NREM1, NREM2, NREM3-4 [1]. The goal of this research is to find a reduced-complexity method to process a minimum number of vital bio-signals while providing accurate sleep classification results. The model we propose offers remote control and real-time supervision capabilities by using Internet of Things (IoT) technology. This paper focuses on the data processing method and the sleep classification logic. The body sensor network representing our data acquisition system will be described in a separate publication. Our solution showed promising results and a good potential to overcome the limitations of existing products. Further improvements will be made, and subjects of different ages and health conditions will be tested.
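A reduced-complexity stager of the kind described could, in the simplest case, be a per-epoch decision rule on heart rate and movement. The sketch below is not the paper's classification logic; the thresholds and stage labels are invented for illustration only:

```python
def stage_epoch(heart_rate_bpm, movement_count):
    """Classify one 30-second epoch from two signals (toy thresholds)."""
    if movement_count > 10:      # strong actimetry signal -> awake
        return "wake"
    if heart_rate_bpm < 55:      # low resting heart rate -> deep sleep
        return "deep"
    return "light"               # otherwise: light sleep

epochs = [(72, 15), (60, 2), (50, 0)]  # (heart rate, movement) per epoch
print([stage_epoch(hr, mv) for hr, mv in epochs])  # -> ['wake', 'light', 'deep']
```

A real system would validate such rules epoch by epoch against PSG annotations to report the agreement ratio the abstract calls for.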
In the last decades, several driving systems have been developed to improve driving behaviour in terms of energy efficiency or safety. However, these driving systems cover either energy efficiency or safety. Furthermore, they do not consider the stress level of the driver when showing a recommendation, although stress can lead to unsafe or inefficient driving behaviour. In this paper, an approach is presented that takes the driver's stress level into account in a driving system for safe and energy-efficient driving behaviour. The driving system suppresses a recommendation when the driver is under stress, in order not to stress the driver additionally in a stressful driving situation. This can increase road safety and the user acceptance of the driving system, as the driver is not bothered or stressed by the driving system.
The evaluation of the approach showed that the driving system is able to show recommendations to the driver, while also reacting to a high stress level by suppressing recommendations in order not to stress the driver additionally.
Detecting the adherence of driving rules in an energy-efficient, safe and adaptive driving system
(2016)
An adaptive and rule-based driving system is being developed that tries to improve driving behavior in terms of energy efficiency and safety by giving recommendations. To this end, the driving system has to monitor adherence to driving rules by matching the rules to the driving behavior. However, existing rule matching algorithms are not sufficient, as the data within a driving system changes frequently. In this paper, a rule matching algorithm is introduced that is able to handle frequently changing data within the context of the driving system. 15 journeys were used to evaluate the performance of the rule matching algorithms. The results showed that the introduced algorithm outperforms existing algorithms in the context of the driving system. Thus, the introduced algorithm is suited for matching frequently changing data against rules with higher performance, which is why it will be used in the driving system for the detection of broken energy-efficiency or safety-relevant driving rules.
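The paper's algorithm is not reproduced here, but the basic shape of matching driving rules against a frequently changing fact base can be sketched as follows; the rule names and thresholds are hypothetical:

```python
# Each rule is a predicate over the current snapshot of driving data.
RULES = {
    "harsh_braking": lambda s: s["decel_mps2"] > 4.0,
    "speeding":      lambda s: s["speed_kmh"] > s["limit_kmh"],
}

def broken_rules(state):
    """Re-evaluate all rules against the latest data snapshot."""
    return sorted(name for name, check in RULES.items() if check(state))

state = {"decel_mps2": 5.2, "speed_kmh": 48, "limit_kmh": 50}
print(broken_rules(state))  # -> ['harsh_braking']
```

The naive version above re-checks every rule on every update; the paper's contribution is precisely to avoid this cost when data changes many times per second.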
Recycling of poly(ethylene terephthalate) (PET) is of crucial importance, since worldwide amounts of PET waste are increasing rapidly due to its widespread applications. Hence, several methods have been developed, such as energetic, material, thermo-mechanical and chemical recycling of PET. Most frequently, PET waste is incinerated for energy recovery, used as an additive in concrete composites, or glycolysed to yield mixtures of monomers and undefined oligomers. While energetic and thermo-mechanical recycling entail downcycling of the material, chemical recycling requires considerable amounts of chemicals and demanding processing steps, entailing toxicological and ecological issues. This review provides a thorough survey of PET recycling, including energetic, material, thermo-mechanical and chemical methods. It focuses on chemical methods, describing important reaction parameters and yields of the obtained reaction products. While most methods yield monomers, only a few yield undefined low molecular weight oligomers for less demanding applications (dispersants or plasticizers). Further, the present work presents an alternative chemical recycling method for PET in comparison to existing chemical methods.
This thesis identifies and presents approaches for nudging people towards better decision-making with regard to financial products and services. To this end, so-called nudges for loans, credit cards, mortgages, retirement provision and stocks/bonds are discussed. The thesis begins with a brief introduction to decision theory. It then briefly outlines neoclassical capital market theory, which has dominated for decades, and draws the line to the young discipline of behavioral finance. Subsequently, biases and heuristics along the decision-making process are identified and explained. The next chapter, "Libertarian Paternalism", provides the theoretical framework for nudging. The final chapter presents nudging approaches for loans, credit cards, mortgages, retirement provision and stocks/bonds.
In a recent publication Novy-Marx (2013) finds evidence that the variable gross profitability has a strong statistical influence on the common variation of stock returns. He also points out that there is common variation in stock returns related to firm profitability that is not captured by the three-factor model of Fama and French (1993). Thus, this thesis augments the three-factor model by the factor gross profitability and examines whether a profitability-based four-factor model is able to better explain monthly portfolio excess returns on the German stock market compared to the three-factor model of Fama and French (1993) and the Capital Asset Pricing Model (CAPM). Based on monthly stock returns of the CDAX over the period July 2008 to June 2014 this thesis documents four main findings. First, a significant positive market risk premium and a significant positive value premium can be identified. No evidence is found for a size or a profitability effect. Second, all included factors have a strong significant effect on monthly portfolio excess returns. Third, the four-factor model clearly outperforms both the three-factor model of Fama and French (1993) and the CAPM in capturing the common variation in monthly portfolio excess returns. The CAPM performs worst. Finally, the results indicate that the three-factor model of Fama and French (1993) is somewhat better in explaining the cross-section of portfolio excess returns than the four-factor model. Again, the CAPM performs worst. Nevertheless, the four-factor model is considered to be an improvement over the three-factor model of Fama and French (1993) and the CAPM in determining stock returns on the German stock market.
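Estimating such a profitability-augmented four-factor model, R_p - R_f = a + b·MKT + s·SMB + h·HML + p·PMU + e, is an ordinary least squares regression of portfolio excess returns on the factor returns. The sketch below uses synthetic data (the factor values and betas are made up, not results from the thesis):

```python
import numpy as np

rng = np.random.default_rng(1)
T = 72                                     # July 2008 - June 2014, monthly
factors = rng.normal(size=(T, 4))          # columns: MKT, SMB, HML, PMU
true_betas = np.array([1.1, 0.2, 0.4, 0.3])
excess_ret = factors @ true_betas + rng.normal(scale=0.01, size=T)

X = np.column_stack([np.ones(T), factors])          # prepend intercept (alpha)
coef, *_ = np.linalg.lstsq(X, excess_ret, rcond=None)
print(np.round(coef[1:], 1))   # recovered betas, close to the assumed values
```

Model comparison as in the thesis would then contrast the fit (e.g. adjusted R²) of this regression with the three-factor and CAPM specifications.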
A high-voltage replica-based current sensor is presented, along with challenges and design techniques that have rarely been discussed in the literature so far. The performance is evaluated by detailed small-signal and large-signal analysis. By dedicated placement of high-voltage cascode devices, while keeping as many low-voltage devices as possible, a high gain-bandwidth product is achieved. A decoupling and biasing circuit is introduced which improves the response time of the current sensor at on/off transitions by a factor of five. The current sensor is implemented in a 180 nm HV BiCMOS technology. The sensor achieves a DC loop gain of 83 dB and a gain-bandwidth product of 7 MHz. With the proposed techniques, the gain-bandwidth product is increased by a factor of six. The measurable current range is between 60 mA and 1.5 A. The performance is demonstrated in a 500 kHz buck converter at an input voltage of 40 V. The overall circuit concept is suitable for 100 V and beyond, enabling high-performance power management designs including switched-mode power supplies and motor applications.