Autonomous navigation is one of the main areas of research in mobile robots and intelligent connected vehicles. In this context, we are interested in presenting a general view of robotics, the progress of research, and advanced methods related to this field to improve the localization of autonomous robots. We seek to evaluate algorithms and techniques that give robots the ability to move safely and autonomously in a complex and dynamic environment. Under these constraints, we focused our work in this paper on a specific problem: evaluating a simple, fast, and lightweight SLAM algorithm that can minimize localization errors. We presented and validated a FastSLAM 2.0 system combining scan matching and loop closure detection. To allow the robot to perceive the environment and detect objects, we studied one of the best-performing deep learning techniques, convolutional neural networks (CNNs), and validated our approach using the YOLOv3 algorithm.
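A core post-processing step in YOLOv3-style detectors is non-maximum suppression, which removes redundant overlapping detections of the same object. The sketch below illustrates that step only; the box coordinates, scores, and the 0.5 IoU threshold are made-up values for illustration, not data from the paper:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def nms(boxes, scores, thresh=0.5):
    """Greedy non-maximum suppression: return indices of surviving boxes."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    for i in order:
        # keep a box only if it does not overlap any already-kept box too much
        if all(iou(boxes[i], boxes[j]) < thresh for j in keep):
            keep.append(i)
    return keep

# two overlapping detections of one object, plus a distant one (made-up values)
boxes = [(0, 0, 10, 10), (1, 1, 11, 11), (50, 50, 60, 60)]
scores = [0.9, 0.8, 0.7]
kept = nms(boxes, scores)  # the lower-scoring overlapping box is suppressed
```

The greedy form shown here is the variant most detector implementations use; production pipelines typically apply it per object class.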
A behavior marker for measuring non-technical skills of software professionals : an empirical study
(2015)
Managers recognize that software development teams need to be developed. Although technical skills are necessary, non-technical (NT) skills are equally, if not more, necessary for project success. Currently, there are no proven tools to measure the NT skills of software developers or software development teams. Behavioral markers (observable behaviors that have positive or negative impacts on individual or team performance) are successfully used by the airline and medical industries to measure NT skill performance. This research developed and validated a behavior marker tool used to rate video clips of software development teams. The initial results show that the behavior marker tool can be used reliably with minimal training.
The rapid development and growth of knowledge has resulted in a rich stream of literature on various topics. Information systems (IS) research is becoming increasingly extensive, complex, and heterogeneous. Therefore, a proper understanding and timely analysis of the existing body of knowledge are important to identify emerging topics and research gaps. Despite the advances of information technology in the context of big data, machine learning, and text mining, the implementation of systematic literature reviews (SLRs) is in most cases still a purely manual task. This might lead to serious shortcomings of SLRs in terms of quality and time. The outlined approach in this paper supports the process of SLRs with machine learning techniques. For this purpose, we develop a framework with embedded steps of text mining, cluster analysis, and network analysis to analyze and structure a large amount of research literature. Although the framework is presented using IS research as an example, it is not limited to the IS field but can also be applied to other research areas.
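A common first step in the text-mining stage of such a framework is TF-IDF weighting, which down-weights terms that occur across many documents so that clustering focuses on distinctive vocabulary. A minimal sketch (the toy corpus and tokenization are illustrative assumptions, not the paper's data):

```python
import math
from collections import Counter

def tfidf(docs):
    """TF-IDF weights per document; docs is a list of token lists."""
    n = len(docs)
    # document frequency: in how many documents each term appears
    df = Counter(term for doc in docs for term in set(doc))
    weights = []
    for doc in docs:
        tf = Counter(doc)
        weights.append(
            {t: (tf[t] / len(doc)) * math.log(n / df[t]) for t in tf}
        )
    return weights

# toy corpus of tokenized abstracts (illustrative only)
docs = [
    ["cloud", "elastic", "cloud"],
    ["slr", "text", "mining"],
    ["cloud", "mining"],
]
weights = tfidf(docs)
```

Terms unique to one document (like "slr") receive higher weight than terms shared across documents (like "cloud"), which is exactly the property cluster analysis exploits.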
This paper studies whether a monetary union needs a fiscal union, in particular in the Eurozone. On 1 January 1999, despite controversial debates, the rule-based Economic and Monetary Union (EMU) started without a fiscal union. I show that there has been only weak economic convergence in the EMU over the past 18 years. In addition, I argue that a fiscal union does not solve the past disintegration failures.
I demonstrate that the major flaws are domestic policy failures and not institutional failures in the euro area. Consequently, establishing a monetary union without having a political union is a risky strategy. Indeed, the rule-based architecture of Maastricht is not solely to blame for the crisis. The root causes are the political flaws combined with the rather weak enforcement of the rules. I propose a genuine redesign of the rule-based paradigm without a fiscal union. A monetary union without a fiscal union can work effectively if the enforcement of the rules is more automatic and independent of domestic and European policy-making.
Motivation
In order to enable context-aware behavior of surgical assistance systems, the acquisition of diverse information about the current intraoperative situation is crucial. To achieve this, the complex task of situation recognition can be delegated to a specialized system. Consequently, a standardized interface is required for the seamless transfer of the recognized contextual information to the assistance systems, enabling them to adapt accordingly.
Methods
Our group analyzed four medical interface standards to determine their suitability for exchanging intraoperative contextual information. The assessment was based on a harmonized data and service model derived from the requirements of expected context-aware use cases. The Digital Imaging and Communications in Medicine (DICOM) and IEEE 11073 for Service-oriented Device Connectivity (SDC) were identified as the most appropriate standards.
Results
We specified how DICOM Unified Procedure Steps (UPS) can be used to effectively communicate contextual information. We proposed the inclusion of attributes to formalize different granularity levels of the surgical workflow.
Conclusions
DICOM UPS SOP classes can be used for the exchange of intraoperative contextual information between a situation recognition system and surgical assistance systems. This can pave the way for vendor-independent context awareness in the OR, leading to targeted assistance of the surgical team and an improvement of the surgical workflow.
Many scientific reports have warned about the catastrophic consequences of unchecked climate change, with the latest international report calling for emissions of climate pollutants to reach net zero by around 2050 (IPCC, 2018). Limiting warming to 1.5°C could save more than 100 million people from water shortages, as many as 2 billion people from dangerous heatwaves, and the majority of species from climate change extinction risks (IPCC, 2018; Warren et al., 2018). The actions taken to achieve these climate outcomes would generate benefits of more than $20 trillion while easing global economic inequality (Burke et al., 2018). Scientists make it clear that it is physically possible to meet these goals using today’s technologies (Holz et al., 2018). Yet emissions of climate pollutants continue to grow, reaching a new record high in 2018 (Jackson et al., 2018). Clearly, scientific evidence has failed to spark needed climate action. The question now is: what can?
While academia and industry see large potential for human-robot collaboration (HRC), only a small number of realized HRC applications are currently found in industry. To gather more data about current hindrances to a wider implementation of collaborative robots, a study among 15 robot manufacturers and 14 system integrators of collaborative robot technology was conducted using a predesigned questionnaire. Additionally, five industrial users of human-robot collaboration were interviewed on the main challenges they experienced during the initial implementation process. The quantitative data were analyzed using the Wilcoxon signed-rank test. According to the study participants, the main challenges in the implementation are currently the identification of HRC-suitable processes, the application of relevant safety norms (such as ISO 10218 and ISO/TS 15066), and the application-specific risk assessment.
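The Wilcoxon signed-rank statistic used for such paired questionnaire data can be computed by ranking the absolute paired differences, with average ranks for ties. A minimal sketch (the sample scores below are hypothetical, not data from the study):

```python
def wilcoxon_w(x, y):
    """Wilcoxon signed-rank statistic W = min(W+, W-) for paired samples."""
    # paired differences; zero differences are dropped by convention
    d = [a - b for a, b in zip(x, y) if a != b]
    # rank absolute differences, assigning average ranks to ties
    order = sorted(range(len(d)), key=lambda i: abs(d[i]))
    ranks = [0.0] * len(d)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and abs(d[order[j + 1]]) == abs(d[order[i]]):
            j += 1
        avg_rank = (i + j) / 2 + 1  # average of 1-based positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    w_plus = sum(r for r, v in zip(ranks, d) if v > 0)
    w_minus = sum(r for r, v in zip(ranks, d) if v < 0)
    return min(w_plus, w_minus)

# hypothetical paired ratings (e.g. two conditions rated by the same expert)
before = [1, 2, 3, 4, 5]
after = [2, 1, 1, 1, 1]
w = wilcoxon_w(before, after)
```

In practice one would use a library routine (e.g. `scipy.stats.wilcoxon`) that also supplies the p-value; the sketch only shows how the statistic itself arises from the ranks.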
For many companies, it is major international sporting events (in particular the Football World Cup or the Olympic Games) that constitute the ideal platform for the integration of their target group-specific marketing communication into an attractive sports environment. Sports event organizers sell exclusive marketing rights for their events to official sponsors, who, in return, acquire exclusive options to utilize the event for their own advertising purposes. Ambush marketing is the method used by companies that do not hold marketing rights to an event, but still use their marketing activities in diverse ways to establish a connection to it. There is still widespread debate and confusion about the topic. Ambush marketing is often defined in different ways, by different people, according to their position as either supporters or opponents of the practice.
Size and function of bioartificial tissue models are still limited due to the lack of blood vessels and dynamic perfusion for nutrient supply. In this study, we evaluated the use of cytocompatible methacryl-modified gelatin for the fabrication of a hydrogel-based tube by dip-coating and subsequent photo-initiated cross-linking. The wall thickness of the tubes and the diameter were tuned by the degree of gelatin methacryl-modification and the number of dipping cycles. The dipping temperature of the gelatin solution was adjusted to achieve low-viscosity fluids of approximately 0.1 Pa·s and differed for gelatin derivatives with different modification degrees. A versatile perfusion bioreactor for the supply of surrounding tissue models was developed, which can be adapted to several geometries and sizes of blood-vessel-mimicking tubes. The manufactured bendable gelatin tubes were permeable to water and dissolved substances, like Nile Blue and serum albumin. As a proof of concept, human fibroblasts in a three-dimensional collagen tissue model were successfully supplied with nutrients via the central gelatin tube under dynamic conditions for 2 days. Moreover, the tubes could be used as scaffolds to build up a functional and viable endothelial layer. Hence, the presented tools can contribute to solving current challenges in tissue engineering.
The demonstration project Virtual Power Plant Neckar-Alb is constructing a Virtual Power Plant (VPP) demonstration site at the Reutlingen University campus. The VPP demonstrator integrates a heterogeneous set of distributed energy resources (DERs) which are connected to control the infrastructure and an energy management system. This paper describes the components and the architecture of the demonstrator and presents strategies for demonstration of multiple optimization and control systems with different control paradigms.
In this work, a web-based software architecture and framework for the management and diagnosis of large amounts of medical data in an ophthalmologic reading center is proposed. Data management for multi-center studies requires merging of master data and repeatedly gathered clinical evidence such as vital signs and raw data. If ophthalmologic questions are involved, the data acquisition is often performed by non-medical staff at the point of care or a study center, whereas the medical finding is mostly provided by an ophthalmologist in a specialized reading center. The study data such as participants, cohorts, and measured values are administrated at a single data center for the entire study. Since a specialized reading center maintains several studies, the medical staff must learn a different data administration system for each data center. With respect to the increasing number and size of clinical studies, two aspects must be considered. First, an efficient software framework is required to support the data management, processing, and diagnosis by medical experts at the reading center. Second, this software needs a standardized user interface that does not have to be trained/tailored/adapted for each new study. Furthermore, different aspects of quality and security controls have to be included. Therefore, the objective of this work is to establish a multi-purpose ophthalmologic reading center, which can be connected to different data centers via configurable data interfaces in order to treat various topics simultaneously.
The influence of trust on the adherence to investment recommendations in the context of robo-advisors is under-researched. This relationship needs to be better understood because robo-advice lacks a critical element of trust: human interaction. Theory suggests that ability, integrity, and benevolence are key factors in building trust in human advisors. Using an experimental study design, our research examines the relationship between a robo-advisor's trust attributes and the acceptance of its investment advice. The results show that trust in a robo-advisor increases the propensity to follow its recommendations. While ability and integrity are significant, benevolence is not. The study contributes to the research on technology acceptance, trust, and the adoption of technology-based recommendations by improving the understanding of the relationship between trust and the acceptance of automated investment recommendations.
The pH value of the human skin is not in the neutral range but is slightly acidic, with values of 3.5 to 6 depending on the body part. This provides a suitable habitat for the commensal skin flora but has a killing effect on some pathogenic micro-organisms and an inactivating effect on some viruses. This protective acid mantle of the skin thus represents a first external protective layer against infestation by pathogens. An appropriate surface pH on textiles can help to minimize the transmission of pathogens through the clothing of healthcare workers while at the same time not exerting a negative influence on the skin’s own flora. In addition, the colonization of e.g. bed linen by pathogenic microorganisms can be reduced. This can also have a positive influence on bacteria-associated odor formation on functional clothing.
Acting like a startup - using corporate startup structures to manage the digital transformation
(2023)
Digital transformation is proving to be a significant challenge for firms and companies when it comes to maintaining their market position. It is evident that many companies are struggling to find their particular way through this transformation. A corporate startup structure is one way to find a suitable solution quickly. Therefore, we are presenting a model for corporate startup activities, which we will instantiate in an appropriate tool to support the management of corporate startups by their parent firms. We have derived the first requirements and design principles from a comprehensive problem analysis and literature study. In addition to this, we are presenting a first artifact, which should realize the design principles by implementing a practical tool. Forming a cooperation with an automotive firm has enabled us to gain access to real-world data for the design and evaluation of the artifact.
The spreading area of cells has been shown to play a central role in the determination of cell fate and tissue morphogenesis; however, a clear understanding of how spread cell area is determined is still lacking. The observation that cell area and force generally increase with substrate rigidity suggests that cell area is dictated mechanically, by means of a force balance between the cell and the substrate. A simple mechanical model, corroborated by experimental measurements of cell area and force, is presented to analyze the temporal force balance between the cell and the substrate during spreading. The cell is modeled as a thin elastic disc that is actively pulled by lamellipodia protrusions at the cell front. The essential molecular mechanisms of the motor activity at the cell front, including actin polymerization, adhesion kinetics, and the actin retrograde flow, are accounted for and used to predict the dynamics of cell spreading on elastic substrates; simple, closed-form expressions for the evolution of cell size and force are derived. Time-resolved traction force microscopy, combined with measurements of cell area, is performed to investigate the simultaneous variations of cell size and force. We find that cell area and force increase simultaneously during spreading, but the force develops with an apparent delay relative to the increase in cell area. We demonstrate that this may reflect the strain-stiffening property of the cytoskeleton. We further demonstrate that the radial cell force is a concave function of spreading speed and that this may reflect the strengthening of cell–substrate adhesions during spreading.
Propofol is an intravenous anesthetic. Currently, it is not possible to routinely measure the blood concentration of the drug in real time. However, multi-capillary column ion-mobility spectrometry of exhaled gas can estimate blood propofol concentration. Unfortunately, adhesion of volatile propofol to plastic materials complicates measurements. Therefore, it is necessary to consider the extent to which volatile propofol adheres to various plastics used in sampling tubing. Perfluoroalkoxy (PFA), polytetrafluoroethylene (PTFE), polyurethane (PUR), silicone, and Tygon tubing were investigated in an experimental setting using a calibration gas generator (HovaCAL). Propofol gas was measured for one hour at 26 °C, 50 °C, and 90 °C tubing temperature. Test tubing segments were then flushed with N2 to quantify desorption. PUR and Tygon sample tubing absorbed all volatile propofol. The silicone tubing reached the maximum propofol concentration after 119 min, which was 29 min after propofol gas exposure stopped. The use of PFA or PTFE tubing produced comparable and reasonably accurate propofol measurements. The desaturation time for PFA was 10 min shorter at 26 °C than for PTFE. PFA tubing thus seems most suitable for the measurement of volatile propofol, with PTFE as an alternative.
Ambush marketing in sports
(2014)
A sports event organizer sells exclusive marketing rights for its event to official sponsors, who, in return, acquire exclusive options to utilize the event for their own advertising purposes. Ambush marketing is the practice by companies of using their own marketing, particularly marketing communications activities, to create an impression of an association with the event among the event audience, although the companies in question hold no legal marketing rights for this event sponsored by third parties, or only subordinate or non-exclusive ones. Thus, the objective of ambush marketing is to benefit from the success of sports sponsorship without bearing the duties of an official sponsor.
It is a fine line between creative marketing communication and infringing on sponsorship rights. From the perspective of event organizers and sports sponsors, ambush marketing represents an understandable threat, while from the perspective of the ambushers it offers the opportunity to reach the target audience in an attractive environment and at affordable cost. The paper defines and structures the phenomenon of ambush marketing and analyses the impacts of ambush marketing in sports. The results of an empirical study on the effects of ambush marketing in the context of the FIFA Soccer World Cup are presented and discussed.
Ambush marketing in sports
(2013)
Ambush marketing is a strategy by which a company or organisation uses their marketing communications to associate themselves with an event without being an official sponsor or authorised partner or licensee. It has become a particular concern in the marketing of major sports events, with international sponsorship and branding properties worth many millions of dollars. Ambush Marketing in Sports is the first book to offer comprehensive analysis of the theoretical and practical implications of ambush marketing.
Drawing on cutting-edge empirical research data, the book outlines an innovative model for understanding ambush marketing and offers practical advice for all stakeholders, from sponsors and event organisers to media organisations. The book examines the opportunities and the risks of ambush marketing, assesses the legal, ethical and business dimensions, and offers advice for preventing ambush marketing in a range of contexts. Fully supported throughout with examples and cases from major international sports events, such as the FIFA World Cup and the Olympic Games, this book is important reading for any student, researcher or practitioner with an interest in sport marketing, sport business or event management.
In recent years, the cloud has become an attractive execution environment for parallel applications, which introduces novel opportunities for versatile optimizations. Particularly promising in this context is the elasticity characteristic of cloud environments. While elasticity is well established for client-server applications, it is a fundamentally new concept for parallel applications. However, existing elasticity mechanisms for client-server applications can be applied to parallel applications only to a limited extent. Efficient exploitation of elasticity for parallel applications requires novel mechanisms that take into account the particular runtime characteristics and resource requirements of this application type. To tackle this issue, we propose an elasticity description language. This language enables users to define elasticity policies, which specify the elasticity behavior at both the cloud infrastructure level and the application level. Elasticity at the application level is supported by an adequate programming and execution model, as well as abstractions that comply with the dynamic availability of resources. We present the underlying concepts and mechanisms, as well as the architecture and a prototypical implementation. Furthermore, we illustrate the capabilities of our approach through real-world scenarios.
IGBT modules with anti-parallel FWDs are widely used in inductive load switching power applications, such as motor drive applications. Nowadays there is a continuous effort to increase the efficiency of such systems by decreasing their switching losses. This paper addresses the problems arising in the turn-on process of an IGBT working in hard-switching conditions. A method is proposed which achieves – contrary to most other approaches – a high switching speed and, at the same time, a low peak reverse-recovery current. This is done by applying an improved gate current waveform that is briefly lowered during the turn-on process. The proposed method achieves low switching losses. Its effectiveness is demonstrated by experimental results with IGBT modules for 600V and 1200V.
Due to digitalization, constant technological progress, and ever shorter product life cycles, enterprises are currently facing major challenges. In order to succeed in the market, business models have to be adapted more often and more quickly to changing market conditions than in the past. Fast adaptability, also called agility, is a decisive competitive factor in today’s world. Because of the ever-growing IT share of products and the fact that they are manufactured using IT, changing the business model has a major impact on the enterprise architecture (EA). However, developing EAs is a very complex task, because many stakeholders with conflicting interests are involved in the decision-making process. Therefore, a lot of collaboration is required. To support organizations in developing their EA, this article introduces a novel integrative method that systematically integrates stakeholder interests into decision-making activities. By using the method, collaboration between the stakeholders involved is improved by identifying points of contact between them. Furthermore, standardized activities make decision-making more transparent and comparable without limiting creativity.
An apparatus and method for analyzing a flow of material having an inlet region, a measurement range and an outlet region, and having a first diverter and a second diverter, and a deflection area, wherein in a first state of operation, the two diverters form a continuous first material flow space from the inlet region via the first diverter through the measurement range, via the second diverter to the outlet region, and in a second state of operation, form a continuous second material flow space from the inlet region via the first diverter through the deflection area, via the second diverter to the outlet region.
Enterprises and societies currently face essential challenges, and digital transformation can contribute to their resolution. Enterprise architecture (EA) is useful for promoting digital transformation in global companies and information societies covering ecosystem partners. The advancement of new business models can be promoted with digital platforms and architectures for Industry 4.0 and Society 5.0. As a result, products from sectors such as healthcare, manufacturing, and energy can increase in value. The adaptive integrated digital architecture framework (AIDAF) for Industry 4.0, together with the design thinking approach, is expected to promote and implement digital platforms and digital products for healthcare, manufacturing, and energy communities more efficiently. In this paper, we propose various cases of digital transformation where digital platforms and products are designed and evaluated for digital IT, digital manufacturing, and digital healthcare with Industry 4.0 and Society 5.0. The vision of AIDAF applications to perform digital transformation in global companies is explained and referenced, extended toward digitalized ecosystems such as Society 5.0 and Industry 4.0.
Enterprises and societies currently face crucial challenges, while Industry 4.0 is becoming ever more important in the global manufacturing industry. Industry 4.0 offers a range of opportunities for companies to increase the flexibility and efficiency of production processes. The development of new business models can be promoted with digital platforms and architectures for Industry 4.0. As a result, products from the healthcare sector can increase in value. The adaptive integrated digital architecture framework (AIDAF) for Industry 4.0 is expected to promote and implement digital platforms and robotics for healthcare and medical communities efficiently. In this paper, we propose that various digital platforms and robotics be designed and evaluated for digital healthcare as well as for the manufacturing industry with Industry 4.0. We argue that the design of an open healthcare platform “Open Healthcare Platform 2030 - OHP2030” for medical product design and robotics can be developed with AIDAF. The vision of AIDAF applications to enable Industry 4.0 in the OHP2030 research initiative is explained and referenced, extended in the context of Society 5.0.
Enterprises are currently transforming their strategy, processes, and their information systems to extend their degree of digitalization. The potential of the Internet and related digital technologies, like Internet of Things, services computing, cloud computing, artificial intelligence, big data with analytics, mobile systems, collaboration networks, and cyber physical systems both drives and enables new business designs. Digitalization deeply disrupts existing businesses, technologies and economies and fosters the architecture of digital environments with many rather small and distributed structures. This has a strong impact for new value producing opportunities and architecting digital services and products guiding their design through exploiting a Service-Dominant Logic. The main result of the book chapter extends methods for integral digital strategies with value-oriented models for digital products and services which are defined in the framework of a multi-perspective digital enterprise architecture reference model.
Our paper gives first answers to a fundamental question: how can the design of architectures of intelligent digital systems and services be accomplished methodologically? Intelligent systems and services are the goals of many current digitalization efforts and part of massive digital transformation efforts based on digital technologies. Digital systems and services are the foundation of digital platforms and ecosystems. Digitalization disrupts existing businesses, technologies, and economies and promotes the architecture of open environments. This has a strong impact on new value-added opportunities and the development of intelligent digital systems and services. Digital technologies such as artificial intelligence, the Internet of Things, services computing, cloud computing, big data with analytics, mobile systems, and social enterprise network systems are important enablers of digitalization. The current publication presents our research on the architecture of intelligent digital ecosystems, products, and services influenced by the service-dominant logic. We present original methodological extensions and a new reference model for digital architectures with an integral service and value perspective to model intelligent systems and services that effectively align digital strategies and architectures, with artificial intelligence as a main element to support intelligent digitalization.
This chapter presents an introduction to the emerging trends for architecting the digital transformation having a strong focus on digital products, intelligent services, and related systems together with methods, models and architectures. The primary aim of this book is to highlight some of the most recent research results in the field. We are providing a focused set of brief descriptions of the chapters included in the book.
This research-oriented book presents key contributions on architecting the digital transformation. Its 20 chapters are organized into the following main sections: Digital Transformation, Digital Business, Digital Architecture, Decision Support, and Digital Applications. Focusing on digital architectures for smart digital products and services, it is a valuable resource for researchers, doctoral students, postgraduates, graduates, undergraduates, academics, and practitioners interested in digital transformation.
The promise of EVs is twofold: first, rejuvenating a transport sector that still heavily depends on fossil fuels, and second, integrating intermittent renewable energies into the power mix. However, it is still not clear how electricity networks will cope with the predicted increase in EVs and their charging demand, especially in combination with conventional energy demand. This paper proposes a methodology for predicting the impact of EV charging behavior on the electricity grid. The model simulates the driving and charging behavior of heterogeneous EV drivers who differ in their mobility patterns, decision-making heuristics, and charging strategies. The simulations show that uncoordinated charging results in charging load clustering. In contrast, decentralized coordination makes it possible to fill the valleys of the conventional load curve and to integrate EVs without a costly expansion of the electricity grid.
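The valley-filling effect of coordinated charging can be illustrated with a greedy toy scheduler that places each EV charging increment into the hour with the lowest current total load. This is a sketch of the general idea only; the base-load values, step size, and rate limit are illustrative assumptions, not parameters from the paper's simulation model:

```python
def valley_fill(base_load, ev_energy, max_rate, step=0.5):
    """Greedily assign EV charging increments (kWh) to the hours with the
    lowest total load, respecting a per-hour charging rate limit."""
    load = list(base_load)          # total load per hour, starts at base load
    ev = [0.0] * len(load)          # EV charging assigned per hour
    remaining = ev_energy
    while remaining > 1e-9:
        # hours where the charger still has headroom below its rate limit
        candidates = [i for i in range(len(load)) if ev[i] + step <= max_rate]
        i = min(candidates, key=lambda h: load[h])  # cheapest (lowest-load) hour
        load[i] += step
        ev[i] += step
        remaining -= step
    return ev

# toy base load with peaks at the edges and a night-time valley (made-up kWh values)
profile = valley_fill([5.0, 1.0, 1.0, 5.0], ev_energy=2.0, max_rate=3.0)
```

All charging lands in the two low-load hours, flattening the aggregate curve, which is the qualitative behavior the decentralized coordination in the paper aims for.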
Based on a survey among customers of seven German municipal utilities, we estimate two regression models to identify the most prospective customer segments and their preferences and motivations for participating in peer-to-peer (P2P) electricity trading, and we develop implications for decision-makers in the energy sector and policy-makers regarding this currently relatively unknown product. Our results show a large general openness of private households towards P2P electricity trading, which is also the main predictor of respondents' intention to participate. This openness is mainly influenced by individuals' environmental attitude, technical interest, and aspiration for independence. Respondents with the highest willingness to participate in P2P electricity trading are mainly motivated by the ability to share electricity and, to a lesser extent, by economic reasons. They also have stronger preferences for innovative pricing schemes (service bundles, time-of-use tariffs). Differences between individuals can be observed depending on their current ownership of a microgeneration unit (prosumers) or their probability of installing one (consumers, planners). Rather than current prosumers, it is planners willing to install microgeneration in the foreseeable future who are considered the most promising target group for P2P electricity trading. Finally, our results indicate that P2P electricity trading could be a promising niche option in the German energy transition.
Based on a survey among customers of seven German municipal utilities, we estimate hierarchical multiple regression models to identify consumer motivations for participating in P2P electricity trading and develop implications for marketing strategies for this currently relatively unknown product. Our results show a low importance of socio-demographics in explaining differences between consumer groups, but a high influence of attitudes, knowledge, and the likelihood of purchasing related products. The most valuable target groups, at which the P2P electricity trading marketing strategies of municipal utilities should aim first and foremost, are innovators, especially prosumers. They are well informed about and open minded concerning electricity sharing and are highly environmentally aware. They ask for transparency and are willing to purchase related products. They are attracted by the ability to share generation and consumption and, to a lesser extent, by economic reasons. Our results indicate that marketing efforts should pay particular attention to peer effects, as these are found to wield great influence on general openness towards, and purchase intention for, P2P electricity products. Finally, municipal utilities should build on the high level of consumer satisfaction and trust and use P2P electricity trading as a measure to retain customers and to win over customers willing to change their supplier.
We report an investigation into the distribution of copper oxidation states in oxide films formed on the surfaces of technical copper. The oxide films were grown by thermal annealing under ambient conditions and studied using Auger depth profiling and UV–Vis spectroscopy. Both Auger and UV–Vis data were evaluated applying multivariate curve resolution (MCR). Both experimental techniques revealed that the growth of Cu2O dominates the initial ca. 40 nm of oxide films grown at 175 °C, while further oxide growth is dominated by CuO formation. The largely coincident results from both experimental approaches demonstrate the huge benefit of applying UV–Vis spectroscopy in combination with MCR analysis, which provides access to information on chemical state distributions without the need for destructive sample analysis. Both approaches are discussed in detail.
Data collected from internet applications are mainly stored in the form of transactions. All transactions of one user form a sequence, which shows the user's behaviour on the site. Nowadays, it is important to be able to classify this behaviour in real time for various reasons: e.g. to increase the conversion rate of customers while they are in the store, or to prevent fraudulent transactions before they are placed. However, this is difficult due to the complex structure of the data sequences (i.e. a mix of categorical and continuous data types, constant data updates) and the large amounts of data that are stored. Therefore, this thesis studies the classification of complex data sequences. It surveys the fields of time series analysis (temporal data mining), sequence data mining, and standard classification algorithms. It turns out that these algorithms are either difficult to apply to data sequences or do not deliver a classification: time series need a predefined model and cannot handle complex data types; sequence classification algorithms such as the apriori algorithm family cannot utilize the time aspect of the data. The strengths and weaknesses of the candidate algorithms are identified and used to build a new approach to the classification of complex data sequences. The problem is solved by a two-step process. First, feature construction is used to create and discover suitable features in a training phase. Then, the blueprints of the discovered features are used in a formula during the classification phase to perform the real-time classification. The features are constructed by combining and aggregating the original data over the span of the sequence, including the elapsed time by means of a calculated time axis. Additionally, a combination of features and feature selection are used to simplify complex data types. This allows behavioural patterns that occur in the course of time to be captured.
This new proposed approach combines techniques from several research fields. Part of the algorithm originates from the field of feature construction and is used to reveal behaviour over time and express this behaviour in the form of features. A combination of the features is used to highlight relations between them. The blueprints of these features can then be used to achieve classification in real time on an incoming data stream. An automated framework is presented that allows the features to adapt iteratively to a change in the underlying patterns of the data stream. This core feature of the presented work is achieved by separating the feature application step from the computationally costly feature construction step and by iteratively restarting the feature construction step on the new incoming data. The algorithm and the corresponding models are described in detail and applied to three case studies (customer churn prediction, bot detection in computer games, credit card fraud detection). The case studies show that the proposed algorithm is able to find distinctive information in data sequences and use it effectively for classification tasks. The promising results indicate that the suggested approach can be applied to a wide range of other application areas that incorporate data sequences.
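The two-step idea of the thesis — construct aggregate features from a transaction sequence in training, then re-apply the feature blueprint in real time — can be sketched minimally as follows. The feature names, the fraud-style rule, and all values are illustrative assumptions, not the framework developed in the thesis.

```python
# Minimal sketch: aggregate a user's transaction sequence into a fixed-size
# feature vector, then apply a simple rule on those features. All names,
# thresholds, and the rule itself are illustrative assumptions.
from datetime import datetime

def build_features(transactions):
    """Aggregate a sequence of transactions into count, total amount,
    and mean inter-event time (the elapsed-time aspect of the sequence)."""
    amounts = [t["amount"] for t in transactions]
    times = sorted(t["time"] for t in transactions)
    gaps = [(b - a).total_seconds() for a, b in zip(times, times[1:])]
    return {
        "n_events": len(transactions),
        "total_amount": sum(amounts),
        "mean_gap_s": sum(gaps) / len(gaps) if gaps else 0.0,
    }

seq = [
    {"time": datetime(2024, 1, 1, 12, 0), "amount": 10.0},
    {"time": datetime(2024, 1, 1, 12, 1), "amount": 250.0},
    {"time": datetime(2024, 1, 1, 12, 2), "amount": 300.0},
]
f = build_features(seq)
# A feature blueprint like build_features can be re-applied cheaply to an
# incoming stream; e.g. large amounts arriving in short gaps may flag fraud.
suspicious = f["total_amount"] > 500 and f["mean_gap_s"] < 120
```

The point of the separation is visible even here: `build_features` is the blueprint discovered (expensively) in training, while evaluating it on a new sequence is cheap enough for real-time use.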
Automated stabilization of loading capacity of coal shearer screw with controlled cutting drive
(2015)
A solution to the topical scientific problem of increasing coal shearer output while minimizing the specific power required for coal cutting, transportation, and loading in thin seams is proposed. The solution is based on the previously proposed screw-gumming criterion for the optimum ratio of cutting velocity to coal shearer feed rate, in the context of increased screw rotation achieved by raising the phase voltage frequency. Simulation results for an automated control system for coal shearer operation with a frequency-controlled cutting drive in thin seams confirm the efficiency of the system, which uses the proposed algorithm for smart analysis of the coal shearer power signal.
Automatic classification of rotating machinery defects using Machine Learning (ML) algorithms
(2020)
Electric machines and motors have been the subject of enormous development. New concepts in design and control allow their applications to expand into different fields. Vast amounts of data have been collected in almost every domain of interest. These data can be static; that is to say, they represent real-world processes at a fixed point in time. Vibration analysis and vibration monitoring, including the detection and monitoring of anomalies in vibration data, are widely used techniques for predictive maintenance in high-speed rotating machines. However, accurately identifying the presence of a bearing fault can be challenging in practice, especially when the failure is still at its incipient stage and the signal-to-noise ratio of the monitored signal is small. The main objective of this work is to design a system that analyzes the vibration signals of a rotating machine in the time/frequency domain, based on recorded sensor data. Given this substantial interest, there has been a dramatic increase in the application of Machine Learning (ML) algorithms to this task. An ML system is used to classify and detect abnormal behavior and to recognize the different levels of machine operation modes. The proposed solution can be deployed as predictive maintenance for Industry 4.0.
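The kind of time-domain analysis the abstract describes can be sketched with two classic bearing-fault indicators, RMS and crest factor (peak-to-RMS ratio), which rise when sparse impacts from an incipient fault appear in the signal. The synthetic signals, feature choice, and threshold below are assumptions for illustration, not the paper's ML system.

```python
# Illustrative sketch: time-domain vibration features (RMS, crest factor)
# with a toy threshold classifier. Signals, features, and the threshold are
# assumptions, not the classification system proposed in the paper.
import math
import random

def rms(x):
    return math.sqrt(sum(v * v for v in x) / len(x))

def crest_factor(x):
    """Peak-to-RMS ratio; impulsive bearing faults raise this value."""
    return max(abs(v) for v in x) / rms(x)

random.seed(0)
n = 2048
# Healthy machine: dominant rotation frequency plus mild sensor noise
healthy = [math.sin(2 * math.pi * 50 * t / n) + 0.05 * random.gauss(0, 1)
           for t in range(n)]
# Incipient fault: the same signal plus sparse, sharp impacts
faulty = list(healthy)
for k in range(0, n, 256):
    faulty[k] += 4.0

def classify(signal, threshold=2.5):
    return "fault" if crest_factor(signal) > threshold else "ok"
```

In a full ML pipeline such features (together with frequency-domain ones) would feed a trained classifier rather than a fixed threshold, but the sketch shows why incipient faults with small energy can still be separable: they change the signal's shape more than its RMS.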
To remain relevant and mitigate disruption, traditional companies have to engage in multiple fast-paced experiments in digital offerings: revenue-generating solutions that leverage digital technologies to address customer needs. After launching several digital offering initiatives, reinsurance giant Munich Re noticed that many of them faced similar challenges. This briefing describes how Munich Re addressed these common challenges by building a foundation for experimenting more systematically and successfully with digital offerings. The foundation has enabled Munich Re to become a serial innovator of digital offerings.
Over the last 50 years, neoclassical financial theory has dominated our perception of what happens in financial markets. It has spurred numerous valuable theories and concepts, all based on the concept of Homo Economicus, the strictly rational economic man. However, humans do not always act in a strictly rational manner. For students and practitioners alike, our book aims at opening the door to another perspective on financial markets: a behavioral perspective based on a Homo Oeconomicus Humanus. This agent acts with limited rationality when making decisions, uses heuristics and shortcuts, and is prone to the influence of emotions. This sounds familiar in real life and can be transferred to what happens in financial markets, too.
Since its early beginnings in the form of correspondence schools, e-learning has generally sought to provide flexibility and high-quality education. While these are indeed noble intentions, the reality of today's connected world demands that such programs focus on a different purpose. As the main purpose of e-learning shifts, so must the design approaches.
Rethinking e-learning requires open-mindedness on the part of academia, designers, cyber educators, legislators, IT staff, and administrators, but also the learners themselves. All who are involved in or affected by e-learning programs must speak up and finally share their perspectives, but who will be listening? The key to rethinking e-learning lies in the ability of the stakeholders to listen to each other and make decisions that are in the best interest of the learner.
This chapter will propose a new purpose for e-learning and explore promising possibilities for learner-centered design. The future of e-learning can be shaped by the decisions made today, but before any decisions can be made, one must acknowledge e-learning's successes as well as its shortcomings. The purpose of this chapter is to encourage those who are impacted by e-learning to think about the future.
The intention of this paper is to show that the statistical approach to risk is not enough to explain the behavior of investors. It furthermore proposes ideas and alternative approaches for dealing with risk. Psychological findings are of particular interest as they might enhance our understanding of risk perception and assessment. The chapter “From the normal distribution to fat tails” starts with the rejection of the normal distribution as a simplifying basis for risk and return. This rejection is supported by several empirical observations, such as the clustering of volatility and fat tails. This leads to a two-step approach for modeling risk and return based on the distinction between conditional and unconditional changes. Conditional time series models (ARMA, ARCH, GARCH) and alternative distributions (Stable Paretian, Student's t, EVT) are presented as ways to improve the art of risk and return modeling beyond the normal distribution assumption. The chapter ends with the conclusion that each model is only a statistical approximation and never encompasses the unpredictability of black swans and the nature of human behavior in the financial markets. After having discussed the limitations of the purely statistical approach to risk and return, the paper goes beyond the standard theory of finance for two purposes. Firstly, behavioral finance provides some arguments for the limitations of statistics in assessing risk. Secondly, an alternative approach to risk perception is presented. This alternative is called Prospect Theory, a psychology-based approach that uses preferences to explain investors' actions by human behavior in decision-making processes. The starting point is the utility function and the value function, followed by a description of the two phases: framing and evaluation. The value function is then clearly distinguished from the utility function by elaborating certain effects such as reference points, loss aversion, and the weighting function.
In this section the paper enters the arena of human risk perception which is far from being monetarily rational in the sense of the homo oeconomicus. With Cumulative Prospect Theory there exists an extension to multiple outcome scenarios where risk does not necessarily have to be known. In such a situation, besides risk, there also exists immeasurable uncertainty. Current research confirms and rejects parts of (Cumulative) Prospect Theory which is not necessarily a bad sign as human behavior is rarely exactly replicable and the complexity does not really allow generalizations. Therefore, even if the theory is not completely correct it still enhances our understanding of risk perception and human decision making which can be a very valuable input for agent-based models. The next chapter analyses in more detail possible distortions from psychological biases in the assessment of risk. In this context the law of small numbers, overconfidence and feelings/experience are discussed. Knowing these biases complicates the idea of developing a risk model even further. However, this is again another step to better understand the underlying processes and motives of decision making in the context of financial markets. The last chapter is an attempt to link the different aspects to get a holistic view on risk behavior. Two possibilities are discussed: Hedonic psychology, with the distinction between blow up and bleeding strategy, and heuristic-based explanations for real observations like clustering of expectations and trust in experts. This leaves space for further research as we do not have a tool that is based on current findings and can actually help us in explaining and predicting behavior in financial markets. One possibility would be to link all these aspects in the approach of computational finance to develop agent-based models in which market observations, psychological findings and the situational context can be integrated.
This paper contributes to the automatic detection of perioperative workflow by developing a binary endoscope localization. Automated situation recognition in the context of an intelligent operating room requires the automatic conversion of low-level cues into more abstract high-level information. Imagery from a laparoscope delivers rich content that is easy to obtain but hard to process. We introduce a system that detects whether the endoscope's distal tip is inside or outside the patient based on the endoscope video. This information can be used as one parameter in a situation recognition pipeline. Our localization performs in real time at a video resolution of 1280x720, and 5-fold cross-validation yields mean F1-scores of up to 0.94 on videos of 7 laparoscopies.
Today, fiber-reinforced plastics (FRP) are well established in manifold technical applications because they provide advantages such as low weight, high stiffness, high strength, and chemical resistance. The broad range of production methods extends from cost-effective mass production to the manufacturing of ultra-lightweight composite parts.
Biological materials are also usually composite materials: higher plants or the bones of higher animals are hierarchically organized and are composed of only a few materials such as lignin, cellulose, apatite, and collagen. The large variety and the mechanical properties of natural tissues result primarily from an optimized fiber lay-up adapted to the mechanical requirements of the respective “installation circumstances”.
Advanced lightweight technical solutions need strong materials and structurally optimized designs. In many industries, structural optimization through an appropriate fiber lay-up has become an important method to save weight. Corresponding software tools help to optimize topology and shape (e.g. Mattheck: CAO/SKO; Altair: Optistruct), mainly using finite element analysis technology.
The combination of strong lightweight materials, optimized topology and sophisticated fiber lay-up is also present in many bio-mineralized planktonic shells — for instance diatoms and radiolaria—but also in glass sponges.
In the following, it is shown how the high weight-related mechanical properties of plankton are biomimetically transferred into ultra-lightweight technical structures.
The book provides suggestions on how to start using bionic optimization methods, including pseudo-code examples of each of the important approaches and outlines of how to improve them. The most efficient methods for accelerating the studies are discussed. These include the selection of size and generations of a study’s parameters, modification of these driving parameters, switching to gradient methods when approaching local maxima, and the use of parallel working hardware.
Bionic optimization means finding the best solution to a problem using methods found in nature. As evolutionary strategies and particle swarm optimization seem to be the most important methods for structural optimization, we primarily focus on them. Other methods such as neural nets or ant colonies are more suited to control or process studies, so their basic ideas are outlined in order to motivate readers to start using them.
A set of sample applications shows how bionic optimization works in practice. From academic studies on simple frames made of rods to earthquake-resistant buildings, readers follow the lessons learned, difficulties encountered and effective strategies for overcoming them. For the problem of tuned mass dampers, which play an important role in dynamic control, changing the goal and restrictions paves the way for multi-objective-optimization. As most structural designers today use commercial software such as FE-Codes or CAE systems with integrated simulation modules, ways of integrating bionic optimization into these software packages are outlined and examples of typical systems and typical optimization approaches are presented.
The closing section focuses on an overview and outlook on reliable and robust as well as on multi-objective optimization, including discussions of current and upcoming research topics in the field concerning a unified theory for handling stochastic design processes.
Film formation of the self-synthesized polymer EPM–g–VTMDS (ethylene–propylene rubber, EPM, grafted with vinyltetramethyldisiloxane, VTMDS) was studied with regard to bonding to the adhesion promoter vinyltrimethoxysilane (VTMS) on oxidized 18/10 chromium/nickel steel (V2A) stainless steel surfaces. Polymer films of different mixed solutions, including a commercial siloxane and silicone crosslinker (dimethyl, vinyl group terminated; HANSA SFA 42100, CAS# 68083-19-2, 0.35 mmol vinyl/g) and Karstedt's catalyst (platinum, 1,3-diethenyl-1,1,3,3-tetramethyldisiloxane complex; ALPA–KAT 1, CAS# 68478-92-2), were spin-coated onto V2A stainless steel surfaces with adsorbed VTMS thin layers in order to analyze the film formation of EPM–g–VTMDS at early stages. The surface topography and chemical bonding of the high-performance polymers on differently oxidized V2A surfaces were investigated with X-ray photoelectron spectroscopy (XPS), atomic force microscopy (AFM), scanning electron microscopy (SEM), and surface-enhanced Raman spectroscopy (SERS). AFM and SEM as well as XPS results indicated that the formation of the polymer film proceeds via the growth of polymer islands. Chemical signatures of the essential polymer contributions, linkers and polymer backbones, could be identified using XPS core level peak shape analysis and also SERS. The appearance of signals related to Si–O–Si can be seen as a clear indication of lateral crosslinking and silica network formation in the films on the V2A surface.
Though bioprinting is a forward-looking approach in bone tissue engineering, the development of bioinks which are on the one hand processable with the chosen printing technique, and on the other hand possess the relevant mechanical as well as osteoconductive features, remains a challenge. In the present study, polymer solutions based on methacrylated gelatin and methacrylated hyaluronic acid modified with hydroxyapatite (HAp) particles (5 wt%) were prepared. Encapsulation of primary human adipose-derived stem cells in the HAp-containing gels and culture for 28 d resulted in a storage modulus significantly increased to 126% ± 9.6% compared to the value on day 1 by the sole influence of the HAp. Additional use of osteogenic media components resulted in an increase of the storage modulus up to 199% ± 27.8%. Similarly, the loss modulus was increased to 370% ± 122.1% under the influence of osteogenic media components and HAp. These changes in rheological material characteristics indicate a distinct change in the elastic and viscous hydrogel properties and are attributed to extensive matrix production in the hydrogels by the encapsulated cells, which could also be shown by staining of bone matrix components such as collagen I, fibronectin, alkaline phosphatase, and osteopontin. When the cell-laden polymer solutions were used as bioinks to build up relevant geometries, the ink showed excellent printability and the printed grid structure's integrity remained intact over a culture time of 28 d. Again, intense matrix formation as well as upregulation of osteogenic markers by the encapsulated cells could be shown.
In conclusion, we demonstrated that our HAp-containing bioinks and hydrogels based on methacrylated gelatin and hyaluronic acid are, on the one hand, highly suitable for building up relevant three-dimensional geometries with microextrusion bioprinting and, on the other hand, exhibit a significant positive effect on bone matrix development and remodeling in the hydrogels, as indicated by rheological measurements and staining of bone components. This makes the developed composite hydrogels an excellent material for bone bioprinting approaches.
Coupling the electricity and heat sectors is one of the most necessary actions for a successful energy transition. Efficient electrification of space heating and domestic hot water generation is needed for buildings that are not connected to any district heating network, as distributed heating demand is currently largely met by fossil fuels. Hence, hybrid energy systems will play a pivotal role in the energy transition in buildings. Heat pumps running on PV electricity are one of the most widely discussed combinations for this purpose. In this paper, a heuristic optimization method for the operation of a heat pump, driven by the objective of maximum onsite PV electricity utilization, is presented. In this context, the thermal flexibility of the building and a thermal energy storage (TES) for the generation of domestic hot water (DHW) are activated in order to shift the operation of the heat pump to times of PV generation. Yearly simulations are carried out for a system consisting of a heat pump, PV modules, a building with floor heating, and a TES for DHW generation. Variation parameters for the simulation include the room temperature amplitude (0.5, 1, 1.5, and 2 K) around the mean room temperature (21 °C), the PV capacity (4, 6, 8, and 10 kW), and the type of heat pump (ground-source and air-source). The yearly energy balances show that buildings offer significant thermal storage capacity, avoiding an additional large TES for space heating and improving the share of onsite PV electricity utilization. With the introduction of a battery, which has also been analyzed for different sizes (1.9, 4.8, 7.7, and 10.6 kWh), the share of onsite PV electricity utilization can be improved even further. However, for a bigger battery, the thermal flexibility provided by the varying room temperature amplitude does not further improve the share of onsite PV electricity utilization.
Nevertheless, even with a battery, not more than 50% of the electrical load, including operation of the heat pump, can be covered by PV electricity for the specific system under investigation. This is noteworthy on the one hand because it indicates that a hybrid heating system consisting of a heat pump and PV cannot solely cover the heat demand of residential buildings. On the other hand, it emphasizes the necessity of including further renewable sources such as wind power in order to draw the complete picture. This, however, is beyond the scope of this paper, which mainly focuses on the introduction and verification of the novel control method with regard to a practical building.
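The core idea of shifting heat pump operation into PV hours by exploiting the allowed room-temperature band can be sketched as a simple rule. This is a minimal illustration under assumed values (21 °C mean, 1 K amplitude), not the heuristic optimization method actually developed in the paper.

```python
# Minimal rule-based sketch: run the heat pump preferentially when PV surplus
# is available, using the room-temperature band as thermal storage. The
# temperatures, band, and hysteresis rule are illustrative assumptions only.

T_SET = 21.0      # assumed mean room temperature in degrees Celsius
AMPLITUDE = 1.0   # assumed allowed deviation (room temperature amplitude) in K

def heat_pump_on(room_temp, pv_surplus_kw):
    """Heat when the lower comfort bound is reached (regardless of PV), and
    pre-heat toward the upper bound whenever onsite PV surplus is available."""
    if room_temp <= T_SET - AMPLITUDE:
        return True   # comfort constraint: heating is mandatory
    if pv_surplus_kw > 0 and room_temp < T_SET + AMPLITUDE:
        return True   # PV surplus: charge the building thermally
    return False      # otherwise stay off and drift within the band
```

The varied room temperature amplitude in the simulations corresponds to widening or narrowing this band: a larger AMPLITUDE lets more heat pump operation migrate into PV hours at the cost of larger room-temperature swings.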