Some widely used optical measurement systems require a scan in wavelength or in one spatial dimension to measure the topography in all three dimensions. Novel hyperspectral sensors based on an extended Bayer pattern have a high potential to solve this issue, as they can measure three dimensions in a single shot. This paper presents a detailed examination of a hyperspectral sensor, including a description of the measurement setup. The evaluated sensor (Ximea MQ022HG-IM-SM5X5-NIR) offers 25 channels based on Fabry–Pérot filters. The setup illuminates the sensor with discrete wavelengths under a specified angle of incidence. This allows the spatial and angular response of every channel of each macropixel of the tested sensor to the illumination to be characterized. The results of the characterization form the basis for a spectral reconstruction of the signal, which is essential to obtain an accurate spectral image. The characterization revealed that irregularities of the signal response of the individual filters are present across the whole sensor.
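To illustrate the spectral reconstruction step mentioned above, the following is a minimal linear-inversion sketch: assuming the characterization yields a response matrix per macropixel (25 channel responses at a set of calibration wavelengths), a regularized least-squares fit recovers an estimate of the incident spectrum from the 25 raw channel values. The response matrix, noise level and regularization weight below are synthetic placeholders, not the calibration data of the evaluated sensor.

```python
import numpy as np

def reconstruct_spectrum(raw_counts, response_matrix, reg=1e-3):
    """Estimate the incident spectrum of one macropixel from its 25 raw channel
    values via Tikhonov-regularized least squares.

    raw_counts      : (25,)  raw sensor counts of one macropixel
    response_matrix : (25, n_wavelengths) characterized channel responses
    reg             : regularization weight (placeholder value)
    """
    R = response_matrix
    # Solve (R^T R + reg*I) s = R^T c for the spectrum s.
    lhs = R.T @ R + reg * np.eye(R.shape[1])
    rhs = R.T @ raw_counts
    return np.linalg.solve(lhs, rhs)

# Synthetic example: 25 channels, 40 calibration wavelengths (placeholder data).
rng = np.random.default_rng(0)
R = np.abs(rng.normal(size=(25, 40)))          # stand-in for measured filter responses
true_spectrum = np.exp(-0.5 * ((np.arange(40) - 20) / 5) ** 2)
counts = R @ true_spectrum + rng.normal(scale=0.01, size=25)
estimate = reconstruct_spectrum(counts, R)
print(np.round(estimate[:5], 3))
```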
Hyperspectral imaging opens a wide field of applications. It is a well-established technique in agriculture, medicine, mineralogy and many other fields. Most commercial hyperspectral sensors are able to record spectral information along one spatial dimension in a single acquisition; for the second spatial dimension, a scan is required. Besides those systems, there is a novel technique that allows sensing a two-dimensional scene and its spectral information within one shot. This increases the speed of hyperspectral imaging, which is interesting for metrology tasks under rough environmental conditions. In this article we present a detailed characterization of such a snapshot sensor for later use in a snapshot full-field chromatic confocal system. The sensor (Ximea MQ022HG-IM-SM5X5-NIR) is based on the so-called snapshot mosaic technique, which offers 25 bands mapped to one so-called macropixel. The different bands are realized by a spatially repeating pattern of Fabry–Pérot filters. Those filters are monolithically fabricated on the camera chip.
The intelligent recycling of plastics waste is a major concern. Because of the widespread use of polyethylene terephthalate (PET), considerable amounts of PET waste are generated that are ideally re-introduced into the material cycle by generating second-generation products without loss of material performance. Chemical recycling methods are often expensive and entail environmentally hazardous by-products. Established mechanical methods generally provide materials of reduced quality, leading to products of lower quality. These drawbacks can be avoided by the development of new recycling methods that provide materials of high quality in every step of the production cycle. In the present work, oligomeric ethylene terephthalate with defined degrees of polymerization and defined molecular weight is produced by melt-mixing PET with different quantities of adipic acid, offering an ecofriendly and economical alternative pathway for recycling PET compared with conventional methods. Additionally, block copolyesters of defined block length are designed from the oligomeric products.
The proliferation and convergence of digital SMACIT technologies (social, mobile, analytics, cloud, and Internet of Things) has created significant threats and opportunities for established companies. Business leaders must rethink their business strategies and develop what we refer to as a digital strategy. Our research shows four keys to successfully defining and executing a digital strategy:
1. zeroing in on a customer engagement or digitized solutions strategy to guide the transformation,
2. building operational excellence,
3. creating a powerful digital services backbone to facilitate rapid innovation and responsiveness, and
4. ensuring ongoing organizational redesign.
A list of publications from the research is provided at the end of this document.
The digital economy poses existential threats to — and game-changing opportunities for — companies that were successful in the pre-digital economy. What will distinguish those companies that successfully transform from those that become historical footnotes? This is the question a group of six researchers and consultants from Boston Consulting Group set out to examine. The team conducted in-depth interviews with senior executives at twenty-seven companies in different industries to explore the strategies and organizational initiatives they relied on to seize the opportunities associated with new, readily accessible digital technologies. This paper summarizes findings from this research and offers recommendations to business leaders responsible for digital business success.
The modern industrial corporation encompasses a myriad of different software applications, each of which must work in concert to deliver functionality to end-users. However, the increasingly complex and dynamic nature of competition in today’s product-markets dictates that this software portfolio be continually evolved and adapted, in order to meet new business challenges. This ability – to rapidly update, improve, remove, replace, and reimagine the software applications that underpin a firm’s competitive position – is at the heart of what has been called IT agility. Unfortunately, little work has examined the antecedents of IT agility, with respect to the choices a firm makes when designing its “Software Portfolio Architecture.”
We address this gap in the literature by exploring the relationship between software portfolio architecture and IT agility at the level of the individual applications in the architecture. In particular, we draw from modular systems theory to develop a series of hypotheses about how different types of coupling impact the ability to update, remove or replace the software applications in a firm’s portfolio. We test our hypotheses using longitudinal data from a large financial services firm, comprising over 1,000 applications and over 3,000 dependencies between them. Our methods allow us to disentangle the effects of different types and levels of coupling.
Our analysis reveals that applications with higher levels of coupling cost more to update, are harder to remove, and are harder to replace than those with lower coupling. The measures of coupling that best explain differences in IT agility include all indirect dependencies between software applications (i.e., they include coupling and dependency relationships that are not easily visible to the system architect). Our results reveal the critical importance of software portfolio design decisions in developing a portfolio of applications that can evolve and adapt over time.
Sleep is essential to existence, much like air, water, and food, as we spend nearly one-third of our time sleeping. Poor sleep quality or disturbed sleep causes daytime somnolence, which impairs mental and physical performance in daytime activities and raises the risk of accidents. With advancements in sensor and communication technology, sleep monitoring is moving out of specialized clinics and into our everyday homes. It is possible to extract data comparable to traditional overnight polysomnographic recordings using more basic tools and straightforward techniques. The ballistocardiogram is an unobtrusive, non-invasive, simple, and low-cost technique for measuring cardiorespiratory parameters. In this work, we present a sensor board interface that facilitates the communication between a force-sensitive resistor sensor and an embedded system to provide a high-performing prototype with an efficient signal-to-noise ratio. We have utilized a multi-physical-layer approach that stacks the layers on top of one another while supporting a low-cost, compact design with easy deployment under the bed frame.
"Designed for digital" offers practical advice on digital transformation, with examples that include Amazon, BNY Mellon, DBS Bank, LEGO, Philips, Schneider Electric, USAA, and many other global organizations. Drawing on five years of research and in-depth case studies, the book is an essential guide for companies that want to disrupt rather than be disrupted in the new digital landscape.
Clinical reading centers provide expertise for consistent, centralized analysis of medical data gathered in a distributed context. Accordingly, appropriate software solutions are required for the involved communication and data management processes. In this work, an analysis of general requirements and essential architectural and software design considerations for reading center information systems is provided. The identified patterns have been applied to the implementation of the reading center platform which is currently operated at the Center of Ophthalmology of the University Hospital of Tübingen.
Nowadays, robust, energy-efficient multisensor microsystems often come with heavily restricted power budgets and the characteristic of remaining in certain states for a longer period of time. During this time frame, no continuous clock signal is required, which gives the opportunity to suspend the clock until a new transition is requested. In this paper, we present a new topology for on-demand locally clocked finite state machines. The architecture combines a local adaptive clocking approach with synchronous and asynchronous components, forming a quasi-synchronous system. Using adaptive and local clocking comes with the advantage of reducing the power consumption while saving design effort, since no global clock tree is needed. Combining synchronous and asynchronous components is beneficial compared to previous fully asynchronous approaches concerning the design restrictions. The developed topology is verified by the implementation and simulation of a temperature-ADC sensor system realized in a 180 nm process.
In this study, a novel strategy has been developed for the assembly of polyelectrolyte multilayers (PEM) on CaCO3 templates in acidic pH solutions, where consecutive polyelectrolyte layers (heparin/poly(allylamine hydrochloride) or heparin/chitosan) were deposited on PEM hollow microcapsules established previously on CaCO3 templates. The PEM build-up, the characterization of the hollow capsules and the successful encapsulation of fluorescein 5(6)-isothiocyanate (FITC)-Dextran by coprecipitation with CaCO3 are demonstrated. The removal of the CaCO3 core was improved during the depositions. The release profile showed a high retardation of the encapsulated FITC-Dextran. The combined-shell capsule system is a significant trait with potential use in tailoring functional layer-by-layer capsules as intelligent drug delivery vehicles; preliminary in vitro tests showed their responsiveness to enzymes.
Heat pumps in combination with a photovoltaic system are a very promising option for the transformation of the energy system. By using such a system for coupling the electricity and heat sectors, buildings can be heated sustainably and with low greenhouse gas emissions. This paper presents a method for dimensioning a suitable system of heat pump and photovoltaics (PV) for residential buildings in order to achieve a high level of PV self-consumption. This is accomplished by utilizing a thermal energy storage (TES) for shifting the operation of the heat pump to times of high PV power production by means of an intelligent control algorithm, which yields a high portion of PV power directly utilized by the heat pump. In order to cover the existing building stock, four reference buildings with different years of construction are introduced for both single- and multi-family residential buildings. By this means, older buildings with radiator heating as well as new buildings with floor heating systems are included. The simulations for evaluating the performance of a heat pump/PV system controlled by the novel algorithm for each type of building were carried out in MATLAB-Simulink® 2017a. The results show that 25.3% to 41.0% of the buildings' electricity consumption including the heat pump can be covered directly from the PV installation per year. Evidently, the characteristics of the heating system significantly influence the results: new buildings with floor heating and low supply temperatures yield a higher level of PV self-consumption due to a higher efficiency of the heat pump compared to buildings with radiator heating and higher supply temperatures. In addition, the effect of adding a battery to the system was studied for two building types. It is shown that the degree of PV self-consumption increases when a battery is present. However, due to the high investment costs of batteries, they do not pay off within a reasonable period.
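The paper's control algorithm is not reproduced here; the following is only a minimal rule-based sketch of the underlying idea, namely shifting heat pump operation into hours of PV surplus by charging a thermal energy storage. All power ratings, storage sizes and profiles are illustrative assumptions, not the values of the reference buildings in the study.

```python
# Minimal rule-based sketch of PV-led heat pump operation with a thermal store.
# All values (powers, storage size, profiles) are illustrative assumptions only.

HP_EL_POWER = 2.0      # kW electrical power of the heat pump when running
COP = 3.5              # assumed coefficient of performance
TES_CAPACITY = 20.0    # kWh thermal storage capacity
TES_MIN = 4.0          # kWh minimum thermal reserve kept for comfort

def simulate(pv_kw, heat_demand_kwh, household_kw):
    """One pass over hourly profiles; returns the PV share of the heat pump's
    electricity consumption."""
    tes = TES_MIN
    hp_el_total = hp_el_from_pv = 0.0
    for pv, demand, base_load in zip(pv_kw, heat_demand_kwh, household_kw):
        tes = max(tes - demand, 0.0)                 # heat drawn from the store
        surplus = max(pv - base_load, 0.0)           # PV left after the base load
        run = (surplus >= HP_EL_POWER and tes < TES_CAPACITY - HP_EL_POWER * COP) \
              or tes < TES_MIN                       # forced run to keep comfort
        if run:
            tes = min(tes + HP_EL_POWER * COP, TES_CAPACITY)
            hp_el_total += HP_EL_POWER
            hp_el_from_pv += min(surplus, HP_EL_POWER)
    return hp_el_from_pv / hp_el_total if hp_el_total else 0.0

# Toy 24-hour profiles (placeholders, not the reference buildings of the study).
pv = [0]*7 + [0.5, 1.5, 3, 4, 4.5, 4.5, 4, 3, 1.5, 0.5] + [0]*7
heat = [1.2]*24
base = [0.4]*24
print(f"PV share of heat pump electricity: {simulate(pv, heat, base):.0%}")
```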
In this paper we describe the design and development process of an electromagnetic picker for rivets. These rivets are used in a production process of leather or textile design objects like riveted waist belts or purses. The picker is designed such that it replaces conventional mechanical pickers thus avoiding mechanical wear problems and increasing the process quality. The paper illustrates the challenges in the design process of this mechatronic system. The design process was based on both simulation and experiments leading to a prototype that satisfies the requirements.
Additive Manufacturing (AM) is increasingly used in the industrial sector as a result of continuous development. Within the Production Planning and Control (PPC) system, AM enables an agile response in the area of detailed and process planning, especially for a large number of plants. For this purpose, a concept for a PPC system for AM is presented which takes into account the requirements for integration into the operational enterprise software system. The technical applicability is demonstrated by individual implemented sections. The presented solution approach promises more efficient utilization of the plants and more flexible use.
The world is becoming increasingly digital. People have become used to learning and interacting with the world around them through technology, a trend accelerated even further by the Covid-19 pandemic. This is especially relevant to the generation currently entering education systems and the workforce. Considering digital aids and methods of learning is therefore important for future learning. The increasing need for online learning makes the case for integrating digital learning aspects such as serious gaming within education and training systems. Learning factories are among the education and training systems that can benefit from integration with digital learning extensions. Digital capabilities such as digital twins and models further enable the exploration of integrating digital serious games as an extension of learning factories. Since learning factories are meant for a range of different learning, training, and research purposes, such serious games need to be adaptable across stakeholder perspectives to maximize the value gained from the time and cost invested into their design and development. Research into the development of adaptive serious games for multiple stakeholder perspectives must first determine whether such games can be developed in a way that reaches the objectives set for the different stakeholder perspectives included. The purpose of this research is to investigate this by means of the practical development of a digital adaptive serious game for multiple stakeholder perspectives.
The design process for a single-phase, smart, universal charger for light electric vehicles is presented. It consists of a step-up power factor correction circuit followed by a phase-shifted full-bridge converter with synchronous rectification on the secondary side. Due to the resistor-capacitor-diode snubber on the secondary side, the current peak at the start of power transfer leads to false triggering during light-load control with peak current mode control. The solution developed for light loads is to change from peak current control to voltage control. This is achieved by limiting the maximum phase shift instead of changing the reference value. For the power factor correction stage, measured and calculated efficiencies are compared as a function of the output power. The voltage and current waveforms are shown for the power factor correction circuit, and for the phase-shifted bridge the measured current waveform is compared with simulation.
Normal breathing during sleep is essential for people's health and well-being. Therefore, it is crucial to diagnose apnoea events at an early stage and apply appropriate therapy. Detection of sleep apnoea is a central goal of the system design described in this article. To develop a correctly functioning system, it is first necessary to clearly define the requirements, which are outlined in this manuscript. Furthermore, the selection of an appropriate technology for the measurement of respiration is of great importance. Therefore, after performing initial literature research, we analysed three different methods in detail and selected a suitable one according to the determined requirements. After considering all the advantages and disadvantages of the three approaches, we decided to use the impedance measurement-based one. As a next step, an initial conceptual design of the algorithm for detecting apnoea events was created. As a result, we developed an activity diagram on which the main system components and data flows are visually represented.
Usually battery chargers have two stages, and a DC charging current is considered to be necessary for proper charging. To decrease the charger volume, a single-stage LLC battery charger is investigated in this paper. The PFC stage is eliminated, so no bulky capacitor is necessary anymore, and the battery is charged with a sinusoidal-like charging current. However, previous studies show that such a pulsating charging current has only minimal impact on battery life and efficiency. Design considerations of the resonant tank and the optimal transformer design are presented. A 360 W single-stage LLC converter prototype for an e-bike charger achieves a power factor of 0.98, an efficiency of 0.93 and a power density of 1.8 kW/dm³.
In this work, design rules for a novel brushless excitation system for externally excited synchronous machines are discussed. The concept replaces slip rings with a full-bridge active rectifier and a controller mounted on the rotor. An AC signal induced from the stator is used to charge the rotor DC link. The DC current for the rotor excitation is provided from this DC link. A finite element analysis of an existing machine is used to analyze the practicability of the excitation system.
This paper presents the design and simulation processes of an Equiangular Spiral Antenna for the extremely high frequencies between 65 GHz and 170 GHz. A new approach for the analysis of the antenna's electrical parameters is described. This approach is based on the formalism proposed by Rumsey to determine the EM field produced by an equiangular spiral antenna. Analytical expressions of the electrical parameters, such as the gain or the directivity, are then calculated using well-founded mathematical approximations. The comparison of the obtained results with those from numerical integration methods shows good agreement.
An operating room is a stressful work environment. Nevertheless, all involved persons have to work safely, as there is no room for mistakes. To ensure a high level of concentration and seamless interaction, all involved persons have to know their own tasks and the tasks of their colleagues. The entire team must work synchronously at all times. However, the operating room (OR) is a noisy environment and the actors have to keep their focus on their work. To optimize the overall workflow, a task manager supporting the team was developed. Each actor is equipped with a client terminal showing a summary of their own tasks. Moreover, a big screen displays all tasks of all actors. The architecture is a distributed system based on a communication framework that supports the interaction of all clients with the task manager. A prototype of the task manager and several clients have been developed and implemented. The system represents a proof of concept for further development. This paper describes the concept of the task manager.
An operating room is a stressful work environment. Nevertheless, all involved persons have to work safely as there is no space for mistakes. To ensure a high level of concentration and seamless interaction, all involved persons have to know their own tasks and the tasks of their colleagues. The entire team must work synchronously at all times. To optimize the overall workflow, a task manager supporting the team was developed. In parallel, a common conceptual design of a business process visualization was developed, which makes all relevant information accessible in real-time during a surgery. In this context an overview of all processes in the operating room was created and different concepts for the graphical representation of these user-dependent processes were developed. This paper describes the concept of the task manager as well as the general concept in the field of surgery.
This paper presents a permanent magnet tubular linear generator system for powering passive sensors by harvesting vertical vibration energy. The system consists of a permanent magnet tubular linear vibration generator and electric circuits. By using mechanically resonant movers, the generator is capable of converting low-frequency, small-amplitude vertical vibration energy into more regular sinusoidal electrical energy. The distribution of the magnetic field and the electromotive force are calculated by finite element analysis. The characteristics of the linear vibration generator system are observed. The experimental results show the generator can produce about 0.4 W to 1.6 W of electrical power when the vibration source's amplitude is fixed at 2 mm and the frequencies are between 13 Hz and 22 Hz.
This paper presents a description model for smart, connected devices used in a manufacturing context. Similar to the widespread adoption of smart products for personal and private usage, recent developments have led to a plethora of devices offering a variety of features and capabilities. Manufacturing companies undergoing digital transformation demand guidance with respect to the systematic introduction of smart, connected devices. The introduction of smart, connected devices constitutes a strategic decision due to the high committed future costs of introducing and maintaining a smart device fleet from a vendor. This paper aims to support the introduction efforts by classifying the devices and thus helping companies identify their specific requirements for smart, connected devices before initiating widespread procurement. Mapping the features of these devices based on various attributes allows the clustering of smart, connected devices, including a requirement list for their implementation on the shopfloor. Four individual commercially available smart, connected devices were analyzed using the description model.
Driven by digital transformation, manufacturing systems are heading towards autonomy. The implementation of autonomous elements in manufacturing systems is still a big challenge. Especially small and medium-sized enterprises (SMEs) often lack the experience to assess the degree of Autonomous Production. Therefore, a description model for the assessment of stages of Autonomous Production has been identified as a core element to support such a transformation process. In contrast to existing models, the developed SME-tailored model comprises different levels within a manufacturing system, from single manufacturing cells to the factory level. Furthermore, the model has been validated in several case studies.
The aim of this work was to investigate the mean fill weight control of a continuous capsule-filling process and whether it is possible to derive controller settings from an accompanying process model. To that end, a system composed of a fully automated capsule filler and an online gravimetric scale was used to control the filled weight. This setup makes it possible to examine challenges associated with continuous manufacturing processes, such as variations in the amount of active pharmaceutical ingredient (API) in the mixture due to fluctuations of the feeders or due to altered excipient batch qualities. Two types of controllers were investigated: a feedback control and a combination of feedback and feedforward control. Although both of these are common in the industry, determining the optimal parameter settings remains an issue. In this study, we developed a method to derive the control parameters based on process models in order to obtain optimal control for each filled product. Determined via rapid automated process development (RAPD), this method is an effective and fast way of determining control parameters. The method allowed us to optimize the weight control for three pharmaceutical excipients. By conducting experiments, we verified the feasibility of the proposed method and studied the dynamics of the controlled system. Our work provides important basic data on how capsule fillers can be implemented into continuous manufacturing systems.
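As a rough illustration of combining feedback and feedforward control of the fill weight, the sketch below applies a PI correction to the measured weight error plus a feedforward term on a known disturbance (e.g. a changed excipient batch). The plant model, gains and disturbance values are placeholder assumptions, not the RAPD-derived settings of the study.

```python
# Minimal sketch of a combined feedforward/feedback fill-weight controller.
# Plant model, gains and disturbance are illustrative assumptions, not the
# RAPD-derived settings reported in the paper.

TARGET_MG = 250.0          # target fill weight in mg
KP, KI = 0.4, 0.2          # PI gains (placeholders)
FF_GAIN = 1.0              # feedforward gain on the measured disturbance

def simulate(n_capsules=60):
    dosing = 250.0          # manipulated variable: dosing setting (mg nominal)
    integral = 0.0
    weights = []
    for k in range(n_capsules):
        disturbance = -15.0 if k >= 30 else 0.0      # e.g. a new excipient batch
        measured = dosing + disturbance              # trivial plant + disturbance
        weights.append(measured)
        error = TARGET_MG - measured
        integral += error
        feedback = KP * error + KI * integral        # PI feedback on the weight error
        feedforward = -FF_GAIN * disturbance         # compensate the known disturbance
        dosing = 250.0 + feedback + feedforward
    return weights

w = simulate()
print(f"mean weight after the step change: {sum(w[30:]) / 30:.1f} mg")
```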
Plasma polymerization is used for the modification and control of the surface properties of a highly transparent, thermoplastic elastomeric silicone copolymer, GENIOMER® 80 (G80). PEG-like diglyme plasma polymer films were deposited with ether retentions varying between 20% and 70%, as measured by X-ray photoelectron spectroscopy analysis, which did not affect the transparency of the substrate. Films with ether retentions of greater than 70% inhibit protein binding (bovine serum albumin and fibrinogen) and cell proliferation. A short oxygen plasma pretreatment enhances the adhesion and stability of the film, as shown by protein binding and cell adhesion experiments. The transparency of the material and the stability of the coating make this material a versatile bulk material for technical (e.g., lab-on-a-chip) and biomedical (e.g., intraocular lens) applications. The G80/plasma polymer composite is stable against vigorous washing and storage over 5 months and, therefore, offers an attractive alternative to poly(dimethylsiloxane).
There have been substantial research efforts in recent years on algorithms to improve the continuous and automated assessment of various health-related questions. This paper addresses the deployment gap between those improving algorithms and their usability in care and mobile health applications. In practice, most algorithms require significant and well-founded technical knowledge to be deployed at home or to support healthcare professionals. Therefore, the digital participation of persons in need of health care lacks a usable interface to the current technological advances. In this paper, we propose providing algorithms taken from research as web-based microservices following the common approach of a RESTful service to bridge the gap and make algorithms accessible to caregivers and patients without technical knowledge and extended hardware capabilities. We address implementation details, the interpretation and realization of guidelines, and privacy concerns using our self-implemented example. We also address further usability guidelines and our approach to them.
Forecasting intermittent and lumpy demand is challenging. Demand occurs only sporadically and, when it does, it can vary considerably. Forecast errors are costly, resulting in obsolescent stock or unmet demand. Methods from statistics, machine learning and deep learning have been used to predict such demand patterns. Traditional accuracy metrics are often employed to evaluate the forecasts; however, these come with major drawbacks such as not taking horizontal and vertical shifts over the forecasting horizon into account, or indeed stock-keeping or opportunity costs. This results in a disadvantageous selection of methods in the context of intermittent and lumpy demand forecasts. In our study, we compare methods from statistics, machine learning and deep learning by applying a novel metric called Stock-keeping-oriented Prediction Error Costs (SPEC), which overcomes the drawbacks associated with traditional metrics. Taking the SPEC metric into account, the Croston algorithm achieves the best result, just ahead of a Long Short-Term Memory Neural Network.
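Croston's method, which performed best under the SPEC metric, smooths the non-zero demand sizes and the intervals between demands separately; the per-period forecast is the ratio of the two. The sketch below uses an assumed smoothing parameter, not a value from the study.

```python
def croston_forecast(demand, alpha=0.1):
    """Croston's method for intermittent demand.

    Smooths non-zero demand sizes and the intervals between them separately;
    the per-period forecast is size / interval. alpha is an assumed smoothing
    parameter, not a value from the study.
    """
    size = interval = None
    periods_since_demand = 1
    forecasts = []
    for d in demand:
        if size is None:
            forecasts.append(0.0)                     # no demand observed yet
        else:
            forecasts.append(size / interval)         # held constant between demands
        if d > 0:
            if size is None:                          # initialise on the first demand
                size, interval = float(d), float(periods_since_demand)
            else:
                size += alpha * (d - size)
                interval += alpha * (periods_since_demand - interval)
            periods_since_demand = 1
        else:
            periods_since_demand += 1
    return forecasts

history = [0, 0, 5, 0, 0, 0, 7, 0, 2, 0, 0, 6]
print([round(f, 2) for f in croston_forecast(history)])
```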
Delphi Markets
(2023)
Delphi markets refer to approaches and implementations of integrating prediction markets and Delphi studies (Real-time Delphi). The combination of the two methods for producing forecasts can potentially compensate for each other's weaknesses. For example, prediction markets can be used to select participants with expertise and also motivate long-term participation through their gamified approach and incentive mechanisms. In this paper, two potentials for prediction markets and four potentials for Delphi studies, which are made possible by integration, are derived theoretically. Subsequently, three different integration approaches are presented, on the basis of which the integration on user, market and Delphi-question level is exemplified, and it is shown that, depending on the approach, not all potentials can be achieved. Finally, recommendations for the use of Delphi markets are derived, and existing limitations for Delphi markets as well as future developments are pointed out.
Introduction: Bioresorbable collagenous barrier membranes are used to prevent premature soft tissue ingrowth and to allow bone regeneration. For volume-stable indications, only non-absorbable synthetic materials are available. This study investigates a new bioresorbable hydrofluoric acid (HF)-treated magnesium (Mg) mesh in a native collagen membrane for volume-stable situations. Materials and Methods: HF-treated and untreated Mg were compared in direct and indirect cytocompatibility assays. In vivo, 18 New Zealand White rabbits each received four 8 mm calvarial defects and were divided into four groups: (a) HF-treated Mg mesh/collagen membrane, (b) untreated Mg mesh/collagen membrane, (c) collagen membrane and (d) sham operation. After 6, 12 and 18 weeks, Mg degradation and bone regeneration were measured using radiological and histological methods. Results: In vitro, HF-treated Mg showed higher cytocompatibility. Histopathologically, HF-Mg prevented gas cavities and was degraded by mononuclear cells via phagocytosis up to 12 weeks. Untreated Mg showed in part significantly more gas cavities and a fibrous tissue reaction. Bone regeneration was not significantly different between the groups. Discussion and Conclusions: HF-Mg meshes embedded in native collagen membranes represent a volume-stable and biocompatible alternative to the non-absorbable synthetic materials. HF-Mg shows less corrosion and is degraded by phagocytosis. However, the application of the membranes did not result in higher bone regeneration.
Production systems are becoming increasingly complex, which means that the main task of industrial maintenance, ensuring the technical availability of a production system, is also becoming increasingly difficult. The previous focus of maintenance efforts on individual machines must give way to a holistic view encompassing the whole production system. Against this background, the technical availability of a production system must be redefined. The aim of this publication is to present different definition approaches of production systems’ availability and to demonstrate the effects of random machine failures on the key figures considering the complexity of the production system using a discrete event simulation.
Defining the antecedents of experience co-creation as applied to alternative consumption models
(2019)
Purpose – The purpose of this paper is to propose a conceptual framework of experience co-creation that captures the multi-dimensionality of this construct, as well as a research process for defining the antecedents of experience co-creation.
Design/methodology/approach – The framework of experience co-creation was conceptualized by means of a literature review. Subsequently, this framework was used as the conceptual basis for a qualitative content analysis of 66 empirical papers investigating alternative consumption models (ACMs), such as renting, remanufacturing, and second-hand models.
Findings – The qualitative content analysis resulted in 12 categories related to the consumer and 9 related to the ACM offerings that represent the antecedents of experience co-creation. These categories provide evidence that, to a large extent, the developed conceptual framework allows one to capture the multi-dimensionality of the experience co-creation construct.
Research limitations/implications – This study underscores the understanding of experience co-creation as a function of the characteristics of the offering – which are, in turn, a function of the consumers’ motives as determined by their lifeworlds – as well as to service design as an iterative approach to finding, creating and refining service offerings.
Practical implications – The investigation of the antecedents of experience co-creation can enable service providers to determine significant consumer market conditions for forecasting the suitability and viability of their offerings and to adjust their service designs accordingly.
Originality/value – This paper provides a step toward the operationalization of the dimension-related experience co-creation construct and presents an approach to defining the antecedents of experience co-creation by considering different research perspectives that can enhance service design research.
Purpose: Gliomas are the most common and aggressive type of brain tumors due to their infiltrative nature and rapid progression. The process of distinguishing tumor boundaries from healthy cells is still a challenging task in the clinical routine. Fluid attenuated inversion recovery (FLAIR) MRI modality can provide the physician with information about tumor infiltration. Therefore, this paper proposes a new generic deep learning architecture, namely DeepSeg, for fully automated detection and segmentation of the brain lesion using FLAIR MRI data.
Methods: The developed DeepSeg is a modular decoupling framework. It consists of two connected core parts based on an encoding and decoding relationship. The encoder part is a convolutional neural network (CNN) responsible for spatial information extraction. The resulting semantic map is inserted into the decoder part to get the full-resolution probability map. Based on modified U-Net architecture, different CNN models such as residual neural network (ResNet), dense convolutional network (DenseNet), and NASNet have been utilized in this study.
Results: The proposed deep learning architectures have been successfully tested and evaluated online based on MRI datasets of the brain tumor segmentation (BraTS 2019) challenge, including 336 cases as training data and 125 cases as validation data. The Dice and Hausdorff distance scores of the obtained segmentation results are about 0.81 to 0.84 and 9.8 to 19.7, respectively.
Conclusion: This study showed successful feasibility and comparative performance of applying different deep learning models in a new DeepSeg framework for automated brain tumor segmentation in FLAIR MR images. The proposed DeepSeg is open source and freely available at https://github.com/razeineldin/DeepSeg/.
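For illustration only, the sketch below shows the encoder-decoder coupling described above (a convolutional encoder for spatial feature extraction, a decoder restoring a full-resolution probability map) as a tiny U-Net-style network. It is not the released DeepSeg code, which is available at the repository linked above; channel counts and input sizes are arbitrary placeholders.

```python
# Minimal PyTorch illustration of the encoder-decoder coupling described above.
# This is NOT the released DeepSeg code (see the linked repository); shapes and
# channel counts are arbitrary placeholders.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )

class TinyEncoderDecoder(nn.Module):
    """Encoder extracts spatial features; decoder restores full resolution
    and emits a per-pixel tumour probability map."""
    def __init__(self):
        super().__init__()
        self.enc1 = conv_block(1, 16)          # FLAIR input -> feature maps
        self.enc2 = conv_block(16, 32)
        self.pool = nn.MaxPool2d(2)
        self.up = nn.ConvTranspose2d(32, 16, 2, stride=2)
        self.dec1 = conv_block(32, 16)         # skip connection: cat(enc1, upsampled)
        self.head = nn.Conv2d(16, 1, 1)        # 1x1 conv -> probability map

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))          # semantic map at half resolution
        d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))
        return torch.sigmoid(self.head(d1))

model = TinyEncoderDecoder()
flair = torch.randn(1, 1, 128, 128)            # dummy single-slice FLAIR input
print(model(flair).shape)                      # -> torch.Size([1, 1, 128, 128])
```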
The Internet of Things (IoT) is characterized by many different standards, protocols, and data formats that are often not compatible with each other. Thus, the integration of different heterogeneous IoT components into a uniform IoT setup can be a time-consuming manual task. This lacking interoperability between IoT components has been addressed with different approaches in the past. However, only very few of these approaches rely on Machine Learning techniques. In this work, we present a new way towards IoT interoperability based on Deep Reinforcement Learning (DRL). In detail, we demonstrate that DRL algorithms, which use network architectures inspired by Natural Language Processing (NLP), can be applied to learn to control an environment by merely taking raw JSON or XML structures, which reflect the current state of the environment, as input. Applied to IoT setups, where the current state of a component is often reflected by features embedded into JSON or XML structures and exchanged via messages, our NLP DRL approach eliminates the need for feature engineering and manually written code for pre-processing of data, feature extraction, and decision making.
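As a rough illustration of the idea of consuming raw JSON state descriptions with an NLP-style network, the sketch below tokenizes a JSON document at character level and feeds it through an embedding, an LSTM and an action head. This is an assumption-laden toy, not the architecture or training setup used in the paper.

```python
# Minimal sketch of consuming a raw JSON state with an NLP-style network:
# character-level embedding + LSTM + action head. Illustration only; it is not
# the architecture or the DRL training setup of the paper.
import json
import torch
import torch.nn as nn

class JsonPolicy(nn.Module):
    def __init__(self, vocab_size=128, embed_dim=32, hidden=64, n_actions=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_actions)

    def forward(self, token_ids):
        x = self.embed(token_ids)              # (batch, seq, embed)
        _, (h, _) = self.lstm(x)               # final hidden state summarizes the state
        return self.head(h[-1])                # action preferences (logits)

def encode_state(state_dict, max_len=256):
    """Serialize an IoT state to JSON and map characters to token ids."""
    text = json.dumps(state_dict, sort_keys=True)[:max_len]
    ids = [min(ord(c), 127) for c in text]
    return torch.tensor(ids).unsqueeze(0)      # add a batch dimension

state = {"sensor": {"temperature": 23.4, "unit": "C"}, "actuator": {"fan": "off"}}
policy = JsonPolicy()
logits = policy(encode_state(state))
print("chosen action:", int(logits.argmax()))
```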
There is still a great reliance on human expert knowledge during the analog integrated circuit sizing design phase due to its complexity and scale, with the result that there is a very low level of automation associated with it. Current research shows that reinforcement learning is a promising approach for addressing this issue. Similarly, it has been shown that the convergence of conventional optimization approaches can be improved by transforming the design space from the geometrical domain into the electrical domain. Here, this design space transformation is employed as an alternative action space for deep reinforcement learning agents. The presented approach is based entirely on reinforcement learning, whereby agents are trained in the craft of analog circuit sizing without explicit expert guidance. After training and evaluating agents on circuits of varying complexity, their behavior when confronted with a different technology is examined, showing the applicability, feasibility and transferability of this approach.
Intracranial brain tumors are one of the ten most common malignant cancers and account for substantial morbidity and mortality. The largest histological category of primary brain tumors is the gliomas, which occur with a highly heterogeneous appearance and can be challenging to discern radiologically from other brain lesions. Neurosurgery is mostly the standard of care for newly diagnosed glioma patients and may be followed by radiation therapy and adjuvant temozolomide chemotherapy.
However, brain tumor surgery faces fundamental challenges in achieving maximal tumor removal while avoiding postoperative neurologic deficits. Two of these neurosurgical challenges are presented as follows. First, manual glioma delineation, including its sub-regions, is considered difficult due to its infiltrative nature and the presence of heterogeneous contrast enhancement. Second, the brain deforms its shape, called “brain shift,” in response to surgical manipulation, swelling due to osmotic drugs, and anesthesia, which limits the utility of pre-operative imaging data for guiding the surgery.
Image-guided systems provide physicians with invaluable insight into anatomical or pathological targets based on modern imaging modalities such as magnetic resonance imaging (MRI) and Ultrasound (US). The image-guided toolkits are mainly computer-based systems, employing computer vision methods to facilitate the performance of peri-operative surgical procedures. However, surgeons still need to mentally fuse the surgical plan from pre-operative images with real-time information while manipulating the surgical instruments inside the body and monitoring target delivery. Hence, the need for image guidance during neurosurgical procedures has always been a significant concern for physicians.
This research aims to develop a novel peri-operative image-guided neurosurgery (IGN) system, namely DeepIGN, that can achieve the expected outcomes of brain tumor surgery, thus maximizing the overall survival rate and minimizing post-operative neurologic morbidity. In the scope of this thesis, novel methods are first proposed for the core parts of the DeepIGN system of brain tumor segmentation in MRI and multimodal pre-operative MRI to the intra-operative US (iUS) image registration using the recent developments in deep learning. Then, the output prediction of the employed deep learning networks is further interpreted and examined by providing human-understandable explainable maps. Finally, open-source packages have been developed and integrated into widely endorsed software, which is responsible for integrating information from tracking systems, image visualization, image fusion, and displaying real-time updates of the instruments relative to the patient domain.
The components of DeepIGN have been validated in the laboratory and evaluated in the simulated operating room. For the segmentation module, DeepSeg, a generic decoupled deep learning framework for automatic glioma delineation in brain MRI, achieved an accuracy of 0.84 in terms of the dice coefficient for the gross tumor volume. Performance improvements were observed when employing advancements in deep learning approaches such as 3D convolutions over all slices, region-based training, on-the-fly data augmentation techniques, and ensemble methods.
To compensate for brain shift, an automated, fast, and accurate deformable approach, iRegNet, is proposed for registering pre-operative MRI to iUS volumes as part of the multimodal registration module. Extensive experiments have been conducted on two multi-location databases: the BITE and the RESECT. Two expert neurosurgeons conducted additional qualitative validation of this study through overlaying MRI-iUS pairs before and after the deformable registration. Experimental findings show that the proposed iRegNet is fast and achieves state-of-the-art accuracies. Furthermore, the proposed iRegNet can deliver competitive results, even in the case of non-trained images, as proof of its generality and can therefore be valuable in intra-operative neurosurgical guidance.
For the explainability module, the NeuroXAI framework is proposed to increase the trust of medical experts in applying AI techniques and deep neural networks. The NeuroXAI includes seven explanation methods providing visualization maps to help make deep learning models transparent. Experimental findings showed that the proposed XAI framework achieves good performance in extracting both local and global contexts in addition to generating explainable saliency maps to help understand the prediction of the deep network. Further, visualization maps are obtained to realize the flow of information in the internal layers of the encoder-decoder network and understand the contribution of MRI modalities in the final prediction. The explainability process could provide medical professionals with additional information about tumor segmentation results and therefore aid in understanding how the deep learning model is capable of processing MRI data successfully.
Furthermore, an interactive neurosurgical display has been developed for interventional guidance, which supports the available commercial hardware such as iUS navigation devices and instrument tracking systems. The clinical environment and technical requirements of the integrated multi-modality DeepIGN system were established with the ability to incorporate: (1) pre-operative MRI data and associated 3D volume reconstructions, (2) real-time iUS data, and (3) positional instrument tracking. This system's accuracy was tested using a custom agar phantom model, and its use in a pre-clinical operating room is simulated. The results of the clinical simulation confirmed that system assembly was straightforward, achievable in a clinically acceptable time of 15 min, and performed with a clinically acceptable level of accuracy.
In this thesis, a multimodality IGN system has been developed using the recent advances in deep learning to accurately guide neurosurgeons, incorporating pre- and intra-operative patient image data and interventional devices into the surgical procedure. DeepIGN is developed as open-source research software to accelerate research in the field, enable ease of sharing between multiple research groups, and continuous developments by the community. The experimental results hold great promise for applying deep learning models to assist interventional procedures - a crucial step towards improving the surgical treatment of brain tumors and the corresponding long-term post-operative outcomes.
Deep learning-based EEG detection of mental alertness states from drivers under ethical aspects
(2021)
One of the most critical factors for a successful road trip is a high degree of alertness while driving. Even a split second of inattention or sleepiness in a crucial moment can make the difference between life and death. Several prestigious car manufacturers are currently pursuing the aim of automated drowsiness identification to resolve this problem. The path between neuroscientific research in connection with artificial intelligence and the preservation of the dignity of the human individual and its inviolability is very narrow. The key contribution of this work is a system of data analysis for EEGs during a driving session, which draws on previous studies analyzing heart rate (ECG), brain waves (EEG), and eye function (EOG). The gathered data is treated as sensitively as possible, taking ethical regulations into consideration. Obtaining evaluable signs of evolving exhaustion involves techniques that extract sleep-stage frequencies; the correlated interferences in the signal are problematic here. This research focuses on a processing chain for EEG band splitting that involves band-pass filtering, principal component analysis (PCA), independent component analysis (ICA) with automatic artefact removal, and the fast Fourier transform (FFT). The classification is based on a step-by-step adaptive deep learning analysis that detects theta rhythms as a drowsiness predictor in the pre-processed data. It was possible to obtain an offline detection rate of 89% and an online detection rate of 73%. The method is linked to the simulated driving scenario for which it was developed. This leaves space for more optimization of laboratory methods and data collection during wakefulness-dependent operations.
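The described pre-processing chain can be illustrated on synthetic data: band-pass filtering, PCA, ICA and an FFT-based estimate of relative theta-band power as a drowsiness indicator. Sampling rate, filter order, component counts and the synthetic signal are illustrative assumptions, not the settings of the study.

```python
# Minimal sketch of the described processing chain on synthetic multi-channel
# "EEG": band-pass filtering, PCA, ICA, FFT, then theta-band (4-8 Hz) power as
# a drowsiness indicator. All parameters are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.decomposition import PCA, FastICA

FS = 250                                         # sampling rate in Hz (assumed)
rng = np.random.default_rng(1)
t = np.arange(0, 10, 1 / FS)
theta = np.sin(2 * np.pi * 6 * t)                # 6 Hz "drowsiness" rhythm
eeg = rng.normal(scale=0.5, size=(8, t.size)) + theta   # 8 synthetic channels

# 1) Band-pass filter 1-30 Hz
b, a = butter(4, [1, 30], btype="bandpass", fs=FS)
filtered = filtfilt(b, a, eeg, axis=1)

# 2) PCA for dimensionality reduction, 3) ICA to separate (artefact) sources
pcs = PCA(n_components=4).fit_transform(filtered.T)       # (samples, 4)
sources = FastICA(n_components=4, random_state=0).fit_transform(pcs).T

# 4) FFT and relative theta-band power of each independent component
freqs = np.fft.rfftfreq(sources.shape[1], 1 / FS)
power = np.abs(np.fft.rfft(sources, axis=1)) ** 2
theta_ratio = power[:, (freqs >= 4) & (freqs <= 8)].sum(axis=1) / power.sum(axis=1)
print("relative theta power per component:", np.round(theta_ratio, 2))
```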
In recent years, both fields, Artificial Intelligence (AI) and Variable Renewable Energy (VRE), have received increasing attention in scientific research. Thus, this article's purpose is to investigate the potential of Deep Learning (DL)-based applications for VRE and, as such, provide an introduction to and structured overview of the field. First, we conduct a systematic literature review of the application of AI, especially DL, to the integration of VRE. Subsequently, we provide a comprehensive overview of specific DL-based solution approaches and evaluate their applicability, including a survey of the most applied and best-suited DL architectures. We identify ten DL-based approaches to support the integration of VRE in modern power systems. We find (I) solar PV and wind power generation forecasting, (II) system scheduling and grid management, and (III) intelligent condition monitoring to be three high-potential application areas.
Intraoperative imaging can assist neurosurgeons in delineating brain tumours and other surrounding brain structures. Interventional ultrasound (iUS) is a convenient modality with fast scan times. However, iUS data may suffer from noise and artefacts which limit their interpretation during brain surgery. In this work, we use two deep learning networks, namely UNet and TransUNet, to perform automatic and accurate segmentation of the brain tumour in iUS data. Experiments were conducted on a dataset of 27 iUS volumes. The outcomes show that using a transformer with UNet is advantageous, providing efficient segmentation by modelling long-range dependencies within each iUS image. In particular, the enhanced TransUNet was able to predict cavity segmentation in iUS data with an inference rate of more than 125 FPS. These promising results suggest that deep learning networks can be successfully deployed to assist neurosurgeons in the operating room.
Fault diagnosis of rolling bearings is an essential process for improving the reliability and safety of rotating machinery. It is always a major challenge to ensure fault diagnosis accuracy, in particular under severe working conditions. In this article, a deep adversarial domain adaptation (DADA) model is proposed for rolling bearing fault diagnosis. This model constructs an adversarial adaptation network to solve the commonly encountered problem in numerous real applications: the source domain and the target domain are inconsistent in their distribution. First, a deep stack autoencoder (DSAE) is combined with representative feature learning for dimensionality reduction, and such a combination provides an unsupervised learning method to effectively acquire fault features. Meanwhile, domain adaptation and recognition classification are implemented using a Softmax classifier to augment classification accuracy. Second, the effects of the number of hidden layers in the stack autoencoder network, the number of neurons in each hidden layer, and the hyperparameters of the proposed fault diagnosis algorithm are analyzed. Third, a comprehensive analysis is performed on real data to validate the performance of the proposed method; the experimental results demonstrate that the new method outperforms the existing machine learning and deep learning methods in terms of classification accuracy and generalization ability.
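A minimal sketch of the stacked-autoencoder feature extractor with a softmax classification head described above is given below; the adversarial domain-adaptation branch is omitted and all layer sizes are placeholders rather than the hyperparameters analyzed in the article.

```python
# Minimal PyTorch sketch of a stacked autoencoder feature extractor with a
# softmax classification head. The adversarial domain-adaptation branch of the
# DADA model is omitted; layer sizes are placeholders.
import torch
import torch.nn as nn

class StackedAutoencoderClassifier(nn.Module):
    def __init__(self, n_features=512, hidden=(128, 32), n_classes=4):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_features, hidden[0]), nn.ReLU(),
            nn.Linear(hidden[0], hidden[1]), nn.ReLU(),
        )
        self.decoder = nn.Sequential(                # used for reconstruction (pre-)training
            nn.Linear(hidden[1], hidden[0]), nn.ReLU(),
            nn.Linear(hidden[0], n_features),
        )
        self.classifier = nn.Linear(hidden[1], n_classes)   # softmax via CrossEntropyLoss

    def forward(self, x):
        code = self.encoder(x)
        return self.decoder(code), self.classifier(code)

model = StackedAutoencoderClassifier()
signal_features = torch.randn(16, 512)               # batch of vibration features (dummy)
reconstruction, logits = model(signal_features)
recon_loss = nn.functional.mse_loss(reconstruction, signal_features)
print(reconstruction.shape, logits.shape, float(recon_loss))
```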
Through increasing market dynamics, rapidly evolving technologies and shifting user expectations, coupled with the adoption of lean and agile practices, companies are struggling with their ability to provide reliable product roadmaps by applying traditional approaches. Currently, most companies are seeking opportunities to improve their product roadmapping practices. As a first challenge, they have to assess their current product roadmapping capabilities in order to better understand how to improve their practices and how to switch to a new approach. The aim of this article is to provide an initial maturity model for product roadmapping practices that is especially suited for assessing the roadmapping capabilities of companies operating in dynamic and uncertain market environments. Based on interviews with 15 experts from 13 different companies, the current state of practice regarding product roadmapping was identified. Afterwards, the model development was conducted in the context of expert workshops with Robert Bosch GmbH and researchers. The study results in the so-called DEEP 1.0 product roadmap maturity model, which allows companies to conduct a self-assessment of their product roadmapping practice.
In the context of Industry 4.0, intralogistics faces an increasingly complex and dynamic environment driven by a high level of product customisation and complex manufacturing processes. One approach to deal with these changing conditions is the decentralised and intelligent connectivity of intralogistics systems. However, wireless connectivity presents a major challenge in the industry due to strict requirements such as safety and real-time data transmission. In this context, the fifth generation of mobile communications (5G) is a promising technology to meet the requirements of safety-critical applications, particularly since 5G offers the possibility of establishing private 5G networks, also referred to as standalone non-public networks. Through their isolation from public networks, private 5G networks provide exclusive coverage for private organisations, offering them high intrinsic network control and data security. However, 5G is still under development and is being gradually introduced in a continuous release process. This process lacks transparency regarding the performance of 5G in individual releases, complicating the successful adoption of 5G as an industrial communication technology. Additionally, the evaluation of 5G against the specified target performance is insufficient due to the impact of the environment and external interfering factors on 5G in the industrial environment. Therefore, this paper aims to develop a technical decision-support framework that takes a holistic approach to evaluating the practicality of 5G for intralogistics use cases by considering two fundamental stages. The first stage analyses technical parameters and characteristics of the use case to evaluate the theoretical feasibility of 5G. The second stage investigates the application's environment, which substantially impacts the practicality of 5G, for instance, the influence of surrounding materials. Finally, a case study validates the proposed framework by means of an autonomous mobile robot. As a result, the validation proves the proposed framework's applicability and shows the practicality of the autonomous mobile robot when integrating it into a private 5G network testbed.
Enterprises are presently transforming their strategy, culture, processes, and information systems to become more digital. The digital transformation deeply disrupts existing enterprises and economies. Digitization fosters the development of IT systems with many rather small and distributed structures, like the Internet of Things or mobile systems. For years, a host of new business opportunities have appeared that use the potential of the Internet and related digital technologies, like the Internet of Things, services computing, cloud computing, big data with analytics, mobile systems, collaboration networks, and cyber-physical systems. This has a strong impact on architecting digital services and products. The change from a closed-world modeling perspective to a more flexible open-world composition and evolution of system architectures defines the moving context for adaptable systems, which are essential to enable the digital transformation. In this paper, we focus on a decision-oriented architectural composition approach to support the transformation towards digital services and products.
Digitization of societies changes the way we live, work, learn, communicate, and collaborate. In the age of digital transformation, IT environments with a large number of rather small structures like the Internet of Things (IoT), microservices, or mobility systems are emerging to support flexible and agile digitized products and services. Adaptable ecosystems with service-oriented enterprise architectures are the foundation for self-optimizing, resilient run-time environments and distributed information systems. The resulting business disruptions affect almost all new information processes and systems in the context of digitization. Our aim is a more flexible and agile transformation of both business and information technology domains with more flexible enterprise information systems through adaptation and evolution of digital enterprise architectures. The present research paper investigates mechanisms for decision-controlled digitization architectures for the Internet of Things and microservices by evolving enterprise architecture reference models and state-of-the-art elements of architectural engineering for micro-granular systems.
Digitization fosters the development of IT environments with many rather small structures, like the Internet of Things (IoT), microservices, or mobility systems. They are needed to support flexible and agile digitized products and services. The goal is to create service-oriented enterprise architectures (EA) that are self-optimizing and resilient. The present research paper investigates methods for decision-making concerning digitization architectures for the Internet of Things and microservices. They are based on evolving enterprise architecture reference models and state-of-the-art elements of architectural engineering for micro-granular systems. Decision analytics in this field becomes increasingly complex, and decision support, particularly for the development and evolution of sustainable enterprise architectures, is sorely needed. The challenging decision processes can be supported in a more flexible and intuitive way by an architecture management cockpit.
The Internet of Things (IoT), enterprise social networks, adaptive case management, mobility systems, analytics for big data, and cloud services environments are emerging to support smart connected products and services and the digital transformation. Biological metaphors of living and adaptable ecosystems with service-oriented enterprise architectures provide the foundation for self-optimizing and resilient run-time environments for intelligent business services and related distributed information systems. We are investigating mechanisms for flexible adaptation and evolution of the next digital enterprise architecture systems in the context of the digital transformation. Our aim is to support flexibility and agile transformation for both business and related enterprise systems through adaptation and dynamic evolution of digital enterprise architectures. The present research paper investigates mechanisms for decision case management in the context of multi-perspective explorations of enterprise services and Internet of Things architectures by extending original enterprise architecture reference models with state-of-the-art elements of architectural engineering for digitization and architectural decision support.
The efficient production and utilization of green hydrogen is vital for succeeding in the global effort towards a sustainable future. To provide the necessary amount of green hydrogen, a large number of electrolyzers will be connected to the grid as decentralized power consumers. A large amount of decentralized renewable power sources will provide the energy. In such a system, a control method is necessary to dispatch the available power most efficiently. In particular, the shutdown of renewable energy sources due to temporary overproduction must be avoided. This paper presents a decentralized tertiary control algorithm that provides a new decentralized control approach, thus creating a flexible, robust and easily scalable system. The operation of each grid participant within this grid-connected microgrid is optimized for maximum financial profit, while minimizing the exchange of power with the mains grid and reducing the shutdown of renewable power sources.
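To make the dispatch idea concrete, the following minimal Python sketch (our illustration, not the paper's control algorithm; all unit names, capacities and prices are hypothetical) allocates locally available renewable power to electrolyzers in order of assumed marginal profit before any surplus is exported or, as a last resort, curtailed:

    # Minimal illustrative sketch (not the paper's algorithm): a merit-order style
    # dispatch step in which locally available renewable power is allocated to
    # electrolyzers by their assumed marginal profit before any power is exchanged
    # with the mains grid or curtailed. All names and numbers are hypothetical.

    def dispatch_step(renewable_kw, electrolyzers):
        """electrolyzers: list of dicts with 'name', 'max_kw', 'profit_per_kwh'."""
        remaining = renewable_kw
        plan = {}
        # Serve the most profitable consumers first so that curtailment is minimized.
        for unit in sorted(electrolyzers, key=lambda u: u["profit_per_kwh"], reverse=True):
            allocated = min(unit["max_kw"], remaining)
            plan[unit["name"]] = allocated
            remaining -= allocated
        # Any remainder would have to be exported or, as a last resort, curtailed.
        plan["grid_export_or_curtailment_kw"] = max(remaining, 0.0)
        return plan

    if __name__ == "__main__":
        units = [
            {"name": "ely_A", "max_kw": 500.0, "profit_per_kwh": 0.09},
            {"name": "ely_B", "max_kw": 300.0, "profit_per_kwh": 0.12},
        ]
        print(dispatch_step(renewable_kw=700.0, electrolyzers=units))

A real decentralized tertiary controller would negotiate such setpoints among the grid participants themselves rather than computing them centrally; the sketch only illustrates the allocation objective.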
The increasing emergence of cyber-physical systems (CPS) and the global crosslinking of these CPS into cyber-physical production systems (CPPS) are leading to fundamental changes in future work and logistic systems, requiring innovative methods to plan, control and monitor changeable production systems and new forms of human-machine collaboration. Logistic systems in particular have to accommodate the versatility of CPPS and will be transformed into so-called cyber-physical logistic systems, since the logistical networks will be subject to constant changes initiated by changeable production systems. This development is driven and enhanced by increasingly volatile and globalized market and manufacturing environments combined with a high demand for individualized products and services. Moreover, the centralized control systems predominantly used today are pushed to their limits regarding their ability to deal with the arising complexity of planning, controlling and monitoring changeable work and logistic systems. Decentralized control systems bear the potential to cope with these challenges by distributing the required operations across the various nodes of the resulting decentralized control system.
Learning factories, like the ESB Logistics Learning Factory at ESB Business School (Reutlingen University), provide a wide range of possibilities to develop new methods and innovative technical solutions in a risk-free and close-to-reality factory environment and to transfer knowledge as well as specific competences into the training of students and professionals. To intensify the research and training activities in the field of future work and logistics systems, ESB Business School is transferring its existing production system into a CPPS involving decentralized planning, control and monitoring methods and systems, human-machine-collaboration as well as technical assistance systems for changeable work and logistics systems.
The establishment of adipose tissue test systems is still a major challenge in the investigation of cellular and molecular interactions responsible for the pathogenesis of inflammatory diseases involving adipose tissue. Mature adipocytes are mainly involved in these pathologies, but rarely used in vitro, due to the lack of an appropriate culture medium which inhibits dedifferentiation and maintains adipocyte functionality. In our study, we showed that Dulbecco's Modified Eagle's Medium/Ham's F-12 with 10% fetal calf serum (FCS) reported for the culture of mature adipocytes favors dedifferentiation, which was accompanied by a high glycerol release, a decreasing release of leptin, and a low expression of the adipocyte marker perilipin A, but high expression of CD73 after 21 days. Optimized media containing FCS, biotin, pantothenate, insulin, and dexamethasone decelerated the dedifferentiation process. These cells showed a lower lipolysis rate, a high level of leptin release, as well as a high expression of perilipin A. CD73-positive dedifferentiated fat cells were only found in low quantity. In this work, we showed that mature adipocytes when cultured under optimized conditions could be highly valuable for adipose tissue engineering in vitro.
As fuel prices climb and the global automotive sector migrates to more sustainable vehicle technologies, the future of South Africa’s minibus taxis is in flux. The authors’ previous research has found that battery electric technology struggles to meet all the mobility requirements of minibus taxis. They investigate the technical feasibility of powering taxis with hydrogen fuel cells instead. The following results are projected using a custom-built simulator and tracking data of taxis based in Stellenbosch, South Africa. Each taxi requires around 12 kg of hydrogen gas per day to travel an average distance of 360 km. Producing the required green hydrogen by electrolysis would take 465 kWh of electricity, corresponding to roughly 860 m² of solar panels. An economic analysis was conducted on the capital and operational expenses of a system of ten hydrogen taxis and an electrolysis plant. Such a pilot project requires a minimum investment of € 3.8 million (R 75 million) over a 20-year period. Although such a small-scale roll-out is technically feasible and would meet taxis’ performance requirements, the investment cost is too high, making it financially unfeasible. They conclude that a large-scale solution would need to be investigated to improve financial feasibility; however, South Africa’s limited electrical generation capacity poses a threat to its technical feasibility. The simulator is available at: https://gitlab.com/eputs/ev-fleet-sim-fcv-model.
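The reported figures can be roughly reproduced with a back-of-envelope calculation; the electrolyser specific energy and the daily solar yield used below are our assumptions, not values taken from the study:

    # Back-of-envelope check of the reported figures (assumed inputs, not the authors' model):
    # ~12 kg of hydrogen per taxi per day and an assumed electrolyser demand of
    # ~39 kWh per kg H2 give roughly the 465 kWh/day quoted in the abstract.

    H2_PER_DAY_KG = 12.0                # from the abstract
    KWH_PER_KG_H2 = 39.0                # assumed electrolyser specific energy (approx. HHV of H2)
    SOLAR_YIELD_KWH_PER_M2_DAY = 0.54   # assumed average daily PV yield per m^2 of panel area

    daily_energy_kwh = H2_PER_DAY_KG * KWH_PER_KG_H2               # ~468 kWh
    panel_area_m2 = daily_energy_kwh / SOLAR_YIELD_KWH_PER_M2_DAY  # ~870 m^2

    print(f"electrolysis energy ~ {daily_energy_kwh:.0f} kWh/day, panel area ~ {panel_area_m2:.0f} m^2")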
In this presentation the audience will be: (a) introduced to the aims and objectives of the DBTechNet initiative, (b) briefed on the DBTech EXT virtual laboratory workshops (VLW), i.e. the educational and training (E&T) content which is freely available over the internet and includes vendor-neutral hands-on laboratory training sessions on key database technology topics, and (c) informed about some of the practical problems encountered and the way they have been addressed. Last but not least, the audience will be invited to consider incorporating some or all of the DBTech EXT VLW content into their higher education (HE), vocational education and training (VET), and/or lifelong learning/training type course curricula. This comes at no cost and with no commitment on the part of the teacher/trainer; the latter is only expected to provide his/her feedback on the pedagogical value and the quality of the E&T content received/used.
In the present tutorial we perform a cross-cut analysis of database systems from the perspective of modern storage technology, namely Flash memory. We argue that neither the design of modern DBMS nor the architecture of flash storage technologies is aligned with the other. The result is needlessly suboptimal DBMS performance and inefficient flash utilisation, as well as low flash storage endurance and reliability. We showcase new DBMS approaches with improved algorithms and leaner architectures, designed to leverage the properties of modern storage technologies. We cover the area of transaction management and multi-versioning, putting a special emphasis on: (i) version organisation models and invalidation mechanisms in multi-versioning DBMS; (ii) Flash storage management, especially append-based storage at tuple granularity; (iii) Flash-friendly buffer management; as well as (iv) improvements in the searching and indexing models. Furthermore, we present our NoFTL approach to native Flash access that integrates parts of the flash-management functionality into the DBMS, yielding a significant performance increase and a simplification of the I/O stack. In addition, we cover the basics of building large Flash storage for DBMS and revisit some of the RAID techniques and principles.
The Fifteenth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2023), held between March 13 – 17, 2023, continued a series of international events covering a large spectrum of topics related to advances in fundamentals on databases, evolution of relation between databases and other domains, data base technologies and content processing, as well as specifics in applications domains databases.
Advances in different technologies and domains related to databases triggered substantial improvements for content processing, information indexing, and data, process and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of the XML adoption.
High-speed communications and computations, large storage capacities, and load-balancing for distributed databases access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms and advanced indexing methods.
Evolution on e-business, ehealth and telemedicine, bioinformatics, finance and marketing, geographical positioning systems put pressure on database communities to push the ‘de facto’ methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
The Fourteenth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2022), held between May 22 – 26, 2022, continued a series of international events covering a large spectrum of topics related to advances in fundamentals on databases, evolution of relation between databases and other domains, data base technologies and content processing, as well as specifics in applications domains databases.
Advances in different technologies and domains related to databases triggered substantial improvements for content processing, information indexing, and data, process and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of the XML adoption.
High-speed communications and computations, large storage capacities, and load-balancing for distributed databases access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms and advanced indexing methods.
Evolution on e-business, ehealth and telemedicine, bioinformatics, finance and marketing, geographical positioning systems put pressure on database communities to push the ‘de facto’ methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
The Thirteenth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2021), held between May 30 – June 3rd, 2021, continued a series of international events covering a large spectrum of topics related to advances in fundamentals on databases, evolution of relation between databases and other domains, data base technologies and content processing, as well as specifics in applications domains databases.
Advances in different technologies and domains related to databases triggered substantial improvements for content processing, information indexing, and data, process and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of the XML adoption.
High-speed communications and computations, large storage capacities, and load-balancing for distributed databases access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms and advanced indexing methods.
Evolution on e-business, ehealth and telemedicine, bioinformatics, finance and marketing, geographical positioning systems put pressure on database communities to push the ‘de facto’ methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
The Twelfth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2020) continued a series of events covering a large spectrum of topics related to advances in fundamentals on databases, evolution of relation between databases and other domains, data base technologies and content processing, as well as specifics in applications domains databases. Advances in different technologies and domains related to databases triggered substantial improvements for content processing, information indexing, and data, process and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of the XML adoption. High-speed communications and computations, large storage capacities, and load-balancing for distributed databases access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms and advanced indexing methods. Evolution on e-business, ehealth and telemedicine, bioinformatics, finance and marketing, geographical positioning systems put pressure on database communities to push the ‘de facto’ methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
The Eleventh International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2019), held between June 02 - 06, 2019 in Athens, Greece, continued a series of international events covering a large spectrum of topics related to advances in fundamentals on databases, evolution of relation between databases and other domains, data base technologies and content processing, as well as specifics in applications domains databases.
Advances in different technologies and domains related to databases triggered substantial improvements for content processing, information indexing, and data, process and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of the XML adoption.
High-speed communications and computations, large storage capacities, and load-balancing for distributed databases access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms and advanced indexing methods.
Evolution on e-business, ehealth and telemedicine, bioinformatics, finance and marketing, geographical positioning systems put pressure on database communities to push the ‘de facto’ methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
We welcomed academic, research and industry contributions. The conference had the following tracks:
Knowledge and decision base
Databases technologies
Data management
GraphSM: Large-scale Graph Analysis, Management and Applications
The Tenth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2018), held between May 20 - 24, 2018 - Nice, France, continued a series of international events covering a large spectrum of topics related to advances in fundamentals on databases, evolution of relation between databases and other domains, data base technologies and content processing, as well as specifics in applications domains databases.
Advances in different technologies and domains related to databases triggered substantial improvements for content processing, information indexing, and data, process and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of the XML adoption.
High-speed communications and computations, large storage capacities, and load-balancing for distributed databases access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms and advanced indexing methods.
Evolution on e-business, ehealth and telemedicine, bioinformatics, finance and marketing, geographical positioning systems put pressure on database communities to push the ‘de facto’ methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
The Ninth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2017), held between May 21 - 25, 2017 - Barcelona, Spain, continued a series of international events covering a large spectrum of topics related to advances in fundamentals on databases, evolution of relation between databases and other domains, data base technologies and content processing, as well as specifics in applications domains databases.
Advances in different technologies and domains related to databases triggered substantial improvements for content processing, information indexing, and data, process and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of the XML adoption.
High-speed communications and computations, large storage capacities, and load-balancing for distributed databases access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms and advanced indexing methods.
Evolution on e-business, ehealth and telemedicine, bioinformatics, finance and marketing, geographical positioning systems put pressure on database communities to push the ‘de facto’ methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
The Eighth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2016), held between June 26 - 30, 2016 - Lisbon, Portugal, continued a series of international events covering a large spectrum of topics related to advances in fundamentals on databases, evolution of relation between databases and other domains, data base technologies and content processing, as well as specifics in applications domains databases. Advances in different technologies and domains related to databases triggered substantial improvements for content processing, information indexing, and data, process and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of the XML adoption. High-speed communications and computations, large storage capacities, and load-balancing for distributed databases access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms and advanced indexing methods. Evolution on e-business, ehealth and telemedicine, bioinformatics, finance and marketing, geographical positioning systems put pressure on database communities to push the ‘de facto’ methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
The Seventh International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2015), held between May 24-29, 2015 in Rome, Italy, continued a series of international events covering a large spectrum of topics related to advances in fundamentals on databases, evolution of relation between databases and other domains, data base technologies and content processing, as well as specifics in applications domains databases. Advances in different technologies and domains related to databases triggered substantial improvements for content processing, information indexing, and data, process and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of the XML adoption. High-speed communications and computations, large storage capacities, and load-balancing for distributed databases access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms and advanced indexing methods. Evolution on e-business, e-health and telemedicine, bioinformatics, finance and marketing, geographical positioning systems put pressure on database communities to push the ‘de facto’ methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
The Sixth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2014), held between April 20 - 24, 2014 in Chamonix, France, continued a series of international events covering a large spectrum of topics related to advances in fundamentals on databases, evolution of relation between databases and other domains, data base technologies and content processing, as well as specifics in applications domains databases. Advances in different technologies and domains related to databases triggered substantial improvements for content processing, information indexing, and data, process and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of the XML adoption. High-speed communications and computations, large storage capacities, and load-balancing for distributed databases access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms and advanced indexing methods. Evolution on e-business, ehealth and telemedicine, bioinformatics, finance and marketing, geographical positioning systems put pressure on database communities to push the ‘de facto’ methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
The Fifth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2013), held between January 27th - February 1st, 2013 in Seville, Spain, continued a series of international events covering a large spectrum of topics related to advances in fundamentals on databases, evolution of relation between databases and other domains, data base technologies and content processing, as well as specifics in applications domains databases. Advances in different technologies and domains related to databases triggered substantial improvements for content processing, information indexing, and data, process and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of the XML adoption. High-speed communications and computations, large storage capacities, and load-balancing for distributed databases access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms and advanced indexing methods. Evolution on e-business, ehealth and telemedicine, bioinformatics, finance and marketing, geographical positioning systems put pressure on database communities to push the ‘de facto’ methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology. We take here the opportunity to warmly thank all the members of the DBKDA 2013 Technical Program Committee, as well as the numerous reviewers. The creation of such a high quality conference program would not have been possible without their involvement. We also kindly thank all the authors who dedicated much of their time and efforts to contribute to DBKDA 2013. We truly believe that, thanks to all these efforts, the final conference program consisted of top quality contributions. Also, this event could not have been a reality without the support of many individuals, organizations, and sponsors. We are grateful to the members of the DBKDA 2013 organizing committee for their help in handling the logistics and for their work to make this professional meeting a success. We hope that DBKDA 2013 was a successful international forum for the exchange of ideas and results between academia and industry and for the promotion of progress in the fields of databases, knowledge and data applications. We are convinced that the participants found the event useful and communications very open. We also hope the attendees enjoyed the charm of Seville, Spain.
The Fourth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2012), held between February 29th and March 5th, 2012 in Saint Gilles, Reunion Island, continued a series of international events covering a large spectrum of topics related to advances in fundamentals on databases, evolution of relation between databases and other domains, data base technologies and content processing, as well as specifics in applications domains databases. Advances in different technologies and domains related to databases triggered substantial improvements for content processing, information indexing, and data, process and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of the XML adoption. High-speed communications and computations, large storage capacities, and load-balancing for distributed databases access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms and advanced indexing methods. Evolution on e-business, e-health and telemedicine, bioinformatics, finance and marketing, geographical positioning systems put pressure on database communities to push the ‘de facto’ methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology. We take here the opportunity to warmly thank all the members of the DBKDA 2012 Technical Program Committee, as well as the numerous reviewers. The creation of such a broad and high quality conference program would not have been possible without their involvement. We also kindly thank all the authors who dedicated much of their time and efforts to contribute to DBKDA 2012. We truly believe that, thanks to all these efforts, the final conference program consisted of top quality contributions. Also, this event could not have been a reality without the support of many individuals, organizations, and sponsors. We are grateful to the members of the DBKDA 2012 organizing committee for their help in handling the logistics and for their work to make this professional meeting a success. We hope that DBKDA 2012 was a successful international forum for the exchange of ideas and results between academia and industry and for the promotion of progress in the fields of databases, knowledge, and data applications. We are convinced that the participants found the event useful and communications very open. We also hope the attendees enjoyed the charm of Saint Gilles, Reunion Island.
The Third International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2011) held on January 23-27, 2011 in St. Maarten, The Netherlands Antilles, continued a series of international events covering a large spectrum of topics related to advances in fundamentals on databases, evolution of relation between databases and other domains, data base technologies and content processing, as well as specifics in applications domains databases. Advances in different technologies and domains related to databases triggered substantial improvements for content processing, information indexing, and data, process and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of the XML adoption. High-speed communications and computations, large storage capacities, and load-balancing for distributed databases access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms and advanced indexing methods. Evolution on e-business, ehealth and telemedicine, bioinformatics, finance and marketing, geographical positioning systems put pressure on database communities to push the ‘de facto’ methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology. We take this opportunity to thank all the members of the DBKDA 2011 Technical Program Committee as well as the numerous reviewers. The creation of such a broad and high-quality conference program would not have been possible without their involvement. We also kindly thank all the authors who dedicated much of their time and efforts to contribute to the DBKDA 2011. We truly believe that, thanks to all these efforts, the final conference program consists of top quality contributions. This event could also not have been a reality without the support of many individuals, organizations, and sponsors. We are grateful to the members of the DBKDA 2011 organizing committee for their help in handling the logistics and for their work to make this professional meeting a success. We hope that DBKDA 2011 was a successful international forum for the exchange of ideas and results between academia and industry and for the promotion of progress in database research. We are convinced that the participants found the event useful and communications very open. The beautiful places of St. Maarten surely provided a pleasant environment during the conference and we hope you had a chance to visit the surroundings.
This paper reviews suggestions for changes to database technology coming from the work of many researchers, particularly those working with evolving big data. We discuss new approaches to remote data access and standards that better provide for durability and auditability in settings including business and scientific computing. We propose ways in which the language standards could evolve, with proof-of-concept implementations on GitHub.
Over the last decades, a tremendous shift toward using information technology in almost every routine of our daily lives can be perceived in our society, entailing an incredible growth of data collected day by day in Web, IoT, and AI applications.
At the same time, magneto-mechanical HDDs are being replaced by semiconductor storage such as SSDs, equipped with modern Non-Volatile Memories, like Flash, which yield significantly lower access latencies and higher levels of parallelism. Likewise, the execution speed of processing units has increased considerably, as nowadays server architectures comprise up to multiple hundreds of independently working CPU cores along with a variety of specialized computing co-processors such as GPUs or FPGAs.
However, the burden of moving the continuously growing data to the best fitting processing unit is inherently linked to today’s computer architecture that is based on the data-to-code paradigm. In the light of Amdahl's Law, this leads to the conclusion that even with today's powerful processing units, the speedup of systems is limited since the fraction of parallel work is largely I/O-bound.
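This limitation can be illustrated with Amdahl's Law, which bounds the overall speedup by the non-accelerated (here, I/O-bound) fraction of the work; the fractions and core counts below are illustrative only, not measurements from the dissertation:

    # Amdahl's Law: with a parallelisable fraction p accelerated by a factor s,
    # the overall speedup is 1 / ((1 - p) + p / s). If the remaining fraction is
    # dominated by I/O (data movement), adding compute power yields little benefit.
    # The fraction and factors below are illustrative, not measured values.

    def amdahl_speedup(p, s):
        return 1.0 / ((1.0 - p) + p / s)

    for cores in (4, 64, 1024):
        # e.g. only 60% of the work scales with cores because the rest waits on I/O
        print(cores, "cores ->", round(amdahl_speedup(p=0.6, s=cores), 2), "x overall speedup")

Even with effectively unlimited cores, the overall speedup in this example saturates at 1/(1 - p) = 2.5x, which is why reducing data movement itself becomes the decisive lever.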
Therefore, throughout this cumulative dissertation, we investigate the paradigm shift toward code-to-data, formally known as Near-Data Processing (NDP), which relieves the contention on the I/O bus by offloading processing to intelligent computational storage devices, where the data is originally located.
Firstly, we identified Native Storage Management as the essential foundation for NDP due to its direct control of physical storage management within the database. Upon this, the interface is extended to propagate address mapping information and to invoke NDP functionality on the storage device. As the former can become very large, we introduce Physical Page Pointers as one novel NDP abstraction for self-contained immutable database objects.
Secondly, the on-device navigation and interpretation of data are elaborated. To this end, we introduce cross-layer Parsers and Accessors as another NDP abstraction that can be executed on the heterogeneous processing capabilities of modern computational storage devices. The compute placement and resource configuration per NDP request are thereby identified as major performance criteria. Our experimental evaluation shows an improvement in execution times of 1.4x to 2.7x compared to traditional systems. Moreover, we propose a framework for the automatic generation of Parsers and Accessors on FPGAs to ease their application in NDP.
Thirdly, we investigate the interplay of NDP and modern workload characteristics like HTAP. To this end, we present different offloading models and focus on an intervention-free execution. By propagating the Shared State with the latest modifications of the database to the computational storage device, it is able to process data with transactional guarantees. Thus, we extend the design space of HTAP with NDP by providing a solution that optimizes for performance isolation, data freshness, and the reduction of data transfers. In contrast to traditional systems, we experience no significant drop in performance when an OLAP query is invoked, but instead a steady throughput that is 30% higher.
Lastly, in-situ result-set management and consumption as well as NDP pipelines are proposed to achieve flexibility in processing data on heterogeneous hardware. As those produce final and intermediary results, we continue investigating their management and identify that an on-device materialization comes at a low cost but enables novel consumption modes and reuse semantics. Thereby, we achieve significant performance improvements of up to 400x by reusing once-materialized results multiple times.
Production planning and control are characterized by unplanned events or so-called turbulences. Turbulences can be external, originating outside the company (e.g., delayed delivery by a supplier), or internal, originating within the company (e.g., failures of production and intralogistics resources). Turbulences can have far-reaching consequences for companies and their customers, such as delivery delays due to process delays. For target-optimized handling of turbulences in production, forecasting methods incorporating process data in combination with the use of existing flexibility corridors of flexible production systems offer great potential. Probabilistic, data-driven forecasting methods allow determining the corresponding probabilities of potential turbulences. However, a parallel application of different forecasting methods is required to identify an appropriate one for the specific application. This requires a large database, which often is unavailable and, therefore, must be created first. A simulation-based approach to generate synthetic data is used and validated to create the necessary database of input parameters for the prediction of internal turbulences. To this end, a minimal system for conducting simulation experiments on turbulence scenarios was developed and implemented. A multi-method simulation of the minimal system synthetically generates the required process data, using agent-based modeling for the autonomously controlled system elements and event-based modeling for the stochastic turbulence events. Based on this generated synthetic data and the variation of the input parameters in the forecast, a comparative study of data-driven probabilistic forecasting methods was conducted using a data analytics tool. Forecasting methods of different types (including regression, Bayesian models, nonlinear models, decision trees, ensembles, deep learning) were analyzed in terms of prediction quality, standard deviation, and computation time. This resulted in the identification of appropriate forecasting methods and the required input parameters for the considered turbulences.
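A generic sketch of such a parallel comparison of probabilistic forecasting methods is shown below; it uses scikit-learn on synthetic stand-in data and is not the study's actual tool chain, feature set, or parameterization:

    # Generic sketch (not the study's tool chain): comparing probabilistic classifiers
    # for predicting a turbulence event from synthetic process features, using
    # cross-validated log loss as a stand-in for prediction quality.

    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.linear_model import LogisticRegression
    from sklearn.naive_bayes import GaussianNB
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.ensemble import RandomForestClassifier

    # Hypothetical synthetic stand-in for simulation-generated process data.
    X, y = make_classification(n_samples=2000, n_features=12, n_informative=6, random_state=0)

    models = {
        "logistic_regression": LogisticRegression(max_iter=1000),
        "naive_bayes": GaussianNB(),
        "decision_tree": DecisionTreeClassifier(max_depth=6, random_state=0),
        "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    }

    for name, model in models.items():
        scores = cross_val_score(model, X, y, cv=5, scoring="neg_log_loss")
        print(f"{name:20s} mean log loss = {-scores.mean():.3f}")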
In the context of digital transformation, having a data-driven organizational culture has been recognized as an important factor for the data analytics capabilities, innovativeness and competitive advantage of firms. However, the current literature on data-driven culture (DDC) is fragmented, lacking both a synthesis of findings and a theoretical foundation. Therefore, the aim of this work has been to develop a comprehensive framework for understanding DDC and the mechanisms that can be used to embed such a culture in organizations, as well as to structure prior dispersed findings on the topic. Based on the foundation of organizational culture theory, we employed a Design Science Research (DSR) approach using a systematic literature review and expert interviews to build and evaluate a transformation-oriented framework. This research contributes to knowledge by synthesizing previously dispersed knowledge in a holistic framework, as well as by providing a conceptual framework to guide the transformation towards a DDC.
In various German cities, free-floating e-scooter sharing is an emerging trend in e-mobility. Trends such as climate change, urbanization, and demographic change, among others, are forcing society to develop new mobility solutions. In contrast to the more thoroughly researched car sharing, the usage patterns and behaviors of e-scooter sharing customers still need to be analyzed. This presumably enables better addressing of customers as well as adaptations of the business model to increase scooter utilization and therefore the profit of e-scooter providers. The customer journey is digitally traceable from registration to scooter reservation and the ride itself. These data enable the identification of customer needs and motivations. We analyzed a dataset from 2017 to 2019 of an e-scooter sharing provider operating in a large German city. Based on this dataset, we propose a customer clustering that identifies three different customer segments, enabling multiple conclusions to be drawn for business development and improving the problem-solution fit of the e-scooter sharing model.
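As an illustration of the clustering step, the following sketch segments customers with k-means on standardised usage features; the features and the number of observations are hypothetical and do not reflect the provider's data:

    # Generic sketch (hypothetical features, not the provider's data set): clustering
    # e-scooter customers into three segments with k-means on standardised usage features.

    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    # columns: rides per month, mean ride distance (km), share of weekday rides
    usage = rng.random((500, 3)) * np.array([30.0, 5.0, 1.0])

    features = StandardScaler().fit_transform(usage)
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)

    for segment in range(3):
        print("segment", segment, "mean profile:", usage[labels == segment].mean(axis=0).round(2))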
Data integration of heterogeneous data sources relies either on periodically transferring large amounts of data to a physical Data Warehouse or on retrieving data from the sources on request only. The latter results in the creation of what is referred to as a virtual Data Warehouse, which is preferable when the use of the latest data is paramount. However, the downside is that it adds network traffic and suffers from performance degradation when the amount of data is high. In this paper, we propose the use of a readCheck validator to ensure the timeliness of the queried data and to reduce data traffic. It is further shown that the readCheck allows transactions to update data in the data sources while obeying full Atomicity, Consistency, Isolation, and Durability (ACID) properties.
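The idea behind such a read validation can be sketched as follows; the class and method names are ours for illustration and do not correspond to the paper's actual interface:

    # Illustrative sketch of a read-validation step in a virtual data warehouse:
    # cached source extracts are served only if a cheap version check ("readCheck")
    # confirms they are still current; otherwise they are re-fetched.
    # Class and method names are hypothetical, not the paper's interface.

    class SourceConnector:
        def __init__(self):
            self._version = 0
            self._rows = []

        def current_version(self):          # cheap metadata query
            return self._version

        def fetch_rows(self):               # expensive full read
            return list(self._rows), self._version

        def update(self, rows):             # a transaction committed at the source
            self._rows = rows
            self._version += 1


    class VirtualWarehouse:
        def __init__(self, source):
            self.source = source
            self._cache, self._cached_version = None, None

        def query(self):
            # readCheck idea: transfer data only if the cached extract is stale.
            if self._cache is None or self._cached_version != self.source.current_version():
                self._cache, self._cached_version = self.source.fetch_rows()
            return self._cache


    src = SourceConnector()
    vdw = VirtualWarehouse(src)
    src.update([("order-1", 42)])
    print(vdw.query())   # fetches fresh data
    print(vdw.query())   # served from cache; only the version check hits the source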
Human recognition is an important part of perception systems, such as those used in autonomous vehicles or robots. These systems often use deep neural networks for this purpose, which rely on large amounts of data that ideally cover various situations, movements, visual appearances, and interactions. However, obtaining such data is typically complex and expensive. In addition to raw data, labels are required to create training data for supervised learning. Thus, manual annotation of bounding boxes, keypoints, orientations, or actions performed is frequently necessary. This work addresses whether the laborious acquisition and creation of data can be simplified through targeted simulation. If data are generated in a simulation, information such as positions, dimensions, orientations, surfaces, and occlusions are already known, and appropriate labels can be generated automatically. A key question is whether deep neural networks trained with simulated data can be applied to real data. This work explores the use of simulated training data using examples from the field of pedestrian detection for autonomous vehicles. On the one hand, it is shown how existing systems can be improved by targeted retraining with simulation data, for example to better recognize corner cases. On the other hand, the work focuses on the generation of data that occur rarely or not at all in real standard datasets. It is demonstrated how training data can be generated by the targeted acquisition and combination of motion data and 3D models, which contain finely graded action labels to recognize even complex pedestrian situations. Through the diverse annotation data that simulations provide, it becomes possible to train deep neural networks for a wide variety of tasks with one dataset. In this work, such simulated data are used to train a novel deep multitask network that brings together diverse, previously mostly independently considered but related tasks such as 2D and 3D human pose recognition and body and orientation estimation.
This article contains data on the synthesis and mechanical characterization of polysiloxane-based urea elastomers (PSUs) and is related to the research article entitled “Influence of PDMS molecular weight on transparency and mechanical properties of soft polysiloxane-urea-elastomers for intraocular lens application” (Riehle et al., 2018) [1]. These elastomers were prepared by a two-step polyaddition using the aliphatic diisocyanate 4,4′-methylenebis(cyclohexyl isocyanate) (H12MDI), the siloxane-based chain extender 1,3-Bis(3-aminopropyl)-1,1,3,3-tetramethyldisiloxane (APTMDS) and amino-terminated polydimethylsiloxanes (PDMS) or polydimethyl-methyl-phenyl-siloxane copolymers (PDMS-Me,Ph), respectively. (More details about the synthesis procedure and the reaction scheme can be found in the related research article (Riehle et al., 2018) [1]).
Amino-terminated polydimethylsiloxanes with varying molecular weights and PDMS-Me,Ph copolymers were prepared beforehand by a base-catalyzed ring-chain equilibration of a cyclic siloxane and the endblocker APTMDS. This DiB article contains a procedure for the synthesis of the base catalyst tetramethylammonium-3-aminopropyl-dimethylsilanolate and a generic synthesis procedure for the preparation of a PDMS having a targeted number-average molecular weight of 3000 g mol−1. Molecular weights and the amount of methyl-phenyl-siloxane within the polysiloxane copolymers were determined by 1H NMR and 29Si NMR spectroscopy. The corresponding NMR spectra and data are described in this article.
Additionally, this DiB article contains processed data on in-line and off-line FTIR-ATR spectroscopy, which was used to follow the reaction progress of the polyaddition by showing the conversion of the diisocyanate. All relevant IR band assignments of a polydimethylsiloxane-urea spectrum are described in this article.
Finally, data on the tensile properties and the mechanical hysteresis behaviour at 100% elongation of PDMS-based polyurea elastomers are shown as a function of the PDMS molecular weight.
The data presented in this article afford insights into the characterization of a newly described bifunctional furan-melamine monomer, which is used for the production of monodisperse, furan-functionalized melamine formaldehyde particles. Data interpretations can be found in the related research article (Urdl et al., 2019). The furan functionalization of the particles is necessary to perform reversible Diels-Alder reactions with a maleimide (BMI) crosslinker to form thermoreversible network systems. To understand the reaction conditions of the Diels-Alder (DA) reaction of a Fu-Mel monomer with a maleimide crosslinker, model DA reactions were performed and evaluated using dynamic FT-IR measurements. During retro Diels-Alder (rDA) reactions of the monomer system, it was found that a side reaction occurred at elevated temperatures. The data evaluating this side reaction are described in one part of this manuscript. Additional high-resolution SEM images of Fu-Mel particles and of thermoreversible particle networks with BMI2 are shown. Data on different Fu-Mel particle networks with maleimide crosslinkers are presented. For this purpose, maleimide crosslinkers with different spacer lengths were synthesized, and the resulting networks were analyzed by ATR-FT-IR, SEM and DSC.
The increasing number of connected mobile devices such as fitness trackers and smartphones provides new data sources for health insurers, enabling them to gain deeper insights into the health of their customers. These additional data sources, plus the trend towards an interconnected health community including doctors, hospitals and insurers, lead to challenges regarding data filtering, organization and dissemination. First, we analyze what kind of information is relevant for a digital health insurer. Second, functional and non-functional requirements for storing and managing health data in an interconnected environment are defined. Third, we propose a data architecture for a digitized health insurer, consisting of a data model and an application architecture.
In a networked world, companies depend on fast and smart decisions, especially when it comes to reacting to external change. With the wealth of data available today, smart decisions can increasingly be based on data analysis and be supported by IT systems that leverage AI. A global pandemic brings external change to an unprecedented level of unpredictability and severity of impact. Resilience therefore becomes an essential factor in most decisions when aiming at making and keeping them smart. In this chapter, we study the characteristics of resilient systems and test them with four use cases in a wide-ranging set of application areas. In all use cases, we highlight how AI can be used for data analysis to make smart decisions and contribute to the resilience of systems.
The Third International Conference on Data Analytics (DATA ANALYTICS 2014), held on August 24 - 28, 2014 - Rome, Italy, continued the inaugural event on fundamentals in supporting data analytics, special mechanisms and features of applying principles of data analytics, application oriented analytics, and target-area analytics.
Processing terabytes to petabytes of data, or incorporating non-structured data and multi-structured data sources and types, requires advanced analytics and data science mechanisms for both raw and partially processed information. Despite considerable advancements in high performance, large storage, and high computation power, there are challenges in identifying, clustering, classifying, and interpreting a large spectrum of information.
Cyanate esters
(2014)
Cyanate ester resins are an important class of thermosetting compounds that have experienced an ever-increasing interest as matrix systems for advanced polymer composite materials, which, among other applications, are especially suitable for highly demanding functions in the aerospace or microelectronics industries. Other names for cyanate ester resins are cyanate resins, cyanic esters, or triazine resins. The various types of cyanate ester monomers share the –OCN functional group that trimerizes in the course of resin formation to yield a highly branched heterocyclic polymeric network based on the substituted triazine core structure. The basic reaction sequence leading to the typical cyanate ester polymer molecule is depicted in Figure 11.1. The curing reaction may take place with or without catalyst.
Cyanate ester resins
(2022)
Cyanate ester resins are an important class of thermosetting compounds that experience an ever-increasing interest as matrix systems for advanced polymer composite materials, which among other application fields are especially suitable for highly demanding applications in the aerospace or microelectronics industries. Other names for cyanate ester resins are cyanate resins, cyanic esters, or triazine resins. The various types of cyanate ester monomers share the –OCN functional group that trimerizes in the course of resin formation to yield a highly branched heterocyclic polymeric network based on the substituted triazine core structure.
Customer Success Management is the next evolution in complex sales that drives growth. Moreover, Customer Success Management is a modern holistic sales philosophy and part of a professional customer experience management strategy. The following conceptual paper discusses fundamental thoughts based on value-based selling, customer success focus, and a clear view on a perspective beyond selling that will gain importance in the future.
Customer services in the digital transformation: social media versus hotline channel performance
(2015)
Due to the digital transformation, online service strategies have gained prominence in practice as well as in the theory of service management. This study examines the efficacy of different types of service channels in customer complaint handling. The theoretical framework, developed using complaint handling and social media literature, is tested against data collected from two different channels (hotline and social media) of a German telecommunication service provider. We contribute to the understanding of firms’ multichannel distribution strategies in two ways: a) by conceptualizing and evaluating complaint handling quality across traditional and social media channels, and b) by testing the impact of complaint handling quality on key performance outcomes like customer loyalty, positive word-of-mouth, and cross-purchase intentions.
This paper addresses the following four research questions: 1. How should customer service quality in social media channels be conceptualized on multiple levels? 2. Which aspects of customer service quality are important in enhancing customer satisfaction? 3. What outcomes are affected by customer service quality and customer satisfaction? 4. How effective are customer services delivered through social media channels (as compared to customer services delivered through other channels)?
A limited focus on particular research designs, data analysis methods, and research objects frequently characterises customer research projects. However, standard practice in researching certain phenomena is not always correct and, in many cases, can provide misleading results. In this paper, we call for a more holistic approach to customer research, one that considers the entire research design and data analysis toolbox while also recognising the importance of consumer groups other than customers. At the same time, we call for using simple data analysis methods, which often suffice to show relevant effects, instead of overemphasising method complexity as is often the case in top-tier journals. Based on our discussion, we offer researchers and practitioners concrete recommendations for advancing their research designs and data analyses.
The generous feed-in tariffs (FiTs) introduced in Germany, which resulted in major growth in decentralized solar photovoltaic (PV) systems, will phase out in the coming years, leaving many of the existing distributed generation assets stranded. This challenge creates an opportunity for community-focused energy utilities, such as Elektrizitätswerke Schönau eG (EWS) based in Schönau, Germany, to try a new approach to assisting its customers in making the transition to a more sustainable future. This chapter describes how EWS is developing products and offering community-based solutions, including peer-to-peer trading using automated platforms. Such innovative offerings may lead to successful differentiation in a competitive and highly decentralized future.
Loyalty programs are becoming more important in the omnichannel environment of the fashion retail business. After defining customer loyalty and loyalty programs, the main characteristics of omnichannel loyalty programs are described. Mobile, social media, direct mail and in-store capabilities are detailed as touchpoints of omnichannel loyalty programs. A discussion chapter closes with recommendations for fashion retailers.
For the widespread establishment of a circular economy, the acceptance of used products among consumers is a prerequisite. This paper investigates the customer experience of product service systems related to used products (PSSuP), such as renting, remanufacturing, and second-hand models, and aims to point out the offering characteristics that affect customer response and customer engagement. This study was conducted by means of a content analysis-based literature review of 69 empirical PSSuP studies. A frequency analysis of the categories that determine customer experience creation was conducted, as well as a contingency analysis to reveal the interrelationship between these categories. On this basis, the different PSSuP types were compared, and four strategic orientations of customer experience creation in PSSuP are pointed out: price, confidence, convenience, and delight orientation. For each of these strategic orientations, supportive PSSuP offering characteristics are specified. Building on the findings of this study, theoretical and managerial implications for product–service systems marketing are pointed out, and the need for research on the role of information and communication technology as an enabler of customer experience creation in PSSuP is highlighted.
Surface topographies are often discussed as an important parameter influencing basic cell behavior. Whereas most in vitro studies deal with microstructures with sharp edges, smooth, curved microscale topographies might be more relevant to in vivo situations. Addressing the lack of highly defined surfaces with varying curvature, we present a topography chip system with 3D curved features of varying spacing and curvature radii as well as varying overall dimensions of the curved surfaces. The CurvChip is produced by low-cost photolithography with thermal reflow, subsequent (repetitive) PDMS molding and hot embossing. The platform facilitates the systematic in vitro investigation of the impact of substrate curvature on cell types like epithelial, endothelial, smooth muscle cells, or stem cells. Such investigations will not only help to further understand the mechanism of curvature sensation but may also contribute to optimizing cell-material interactions in the field of regenerative medicine.
At Reutlingen University in Germany, students from different countries and disciplines can learn business English within the framework of a theatre production. In the "Business English Theatre" they work in an international project team staging a play with a business focus and thus improve their language, social and professional skills.
Curriculum design for the German language class in the double-degree programme business engineering
(2017)
This paper aims to give an overview on how German is taught as a foreign language to students enrolled in the Bachelor of Business Engineering, a double-degree programme offered in Universiti Malaysia Pahang. The double degree students have the opportunity to complete their first two years of study in Malaysia and their last two years in Germany. Taking the TestDaF examination is compulsory for double-degree students. Hence, the German Language curriculum has been meticulously planned to ensure the students would be competent in the language. As such, the settings of the language class are discussed thoroughly in this paper. Additionally, it also discusses the challenges faced in teaching German as foreign language. This paper ends with some suggestions for improvement.
The purpose of this paper is to investigate how motion pictures are currently used for the product presentation of fashion articles in online shops in the German, American and British markets. This study shows that the use of moving images for the presentation of fashion articles in online shops is underutilized. With the amount of data that was manageable within the scope of this chapter, no valid generalizations can be made; all described results must be understood as indications. In order to use product presentation videos meaningfully, one should consider in advance exactly what the purpose of these videos is. Different goals require different means. However, retailers should obtain enough information in advance to assess whether they can afford the production and post-processing of these videos.
The scoring of sleep stages is one of the essential tasks in sleep analysis. Since a manual procedure requires considerable human and financial resources and involves some subjectivity, an automated approach could offer several advantages. There have been many developments in this area, and in order to provide a comprehensive overview it is essential to review relevant recent work and summarise the characteristics of the approaches, which is the main aim of this article. To this end, we examined articles published between 2018 and 2022 that dealt with the automated scoring of sleep stages. After reviewing a total of 515 publications, 125 articles were included in the final selection for in-depth analysis. The results revealed that automatic scoring achieves good quality (with Cohen's kappa of over 0.80 and accuracy of over 90%) when analysing EEG or EEG + EOG + EMG signals. At the same time, it should be noted that there has been no breakthrough in the quality of results using these signals in recent years. Systems involving other signals that could potentially be acquired more conveniently for the user (e.g. respiratory, cardiac or movement signals) remain more challenging to implement with a high level of reliability but offer considerable potential for innovation. In general, automatic sleep stage scoring has excellent potential to assist medical professionals while providing an objective assessment.
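As a side note to the agreement figures quoted above, Cohen's kappa corrects raw accuracy for chance agreement between two scorers. The snippet below is a minimal, self-contained sketch using toy labels, not data from any reviewed study.

```python
# Hypothetical illustration: agreement between a manual and an automatic hypnogram,
# reported as accuracy and Cohen's kappa (the metrics cited in the review above).
from collections import Counter

def cohens_kappa(manual, auto):
    """Cohen's kappa for two equally long sequences of sleep-stage labels."""
    assert len(manual) == len(auto)
    n = len(manual)
    observed = sum(m == a for m, a in zip(manual, auto)) / n
    # Expected agreement by chance, from the marginal label frequencies.
    pm, pa = Counter(manual), Counter(auto)
    expected = sum(pm[s] * pa[s] for s in set(manual) | set(auto)) / (n * n)
    return (observed - expected) / (1 - expected)

# 30-second epochs scored with the AASM stages W, N1, N2, N3, REM (toy data).
manual = ["W", "N1", "N2", "N2", "N3", "N3", "REM", "REM", "N2", "W"]
auto   = ["W", "N2", "N2", "N2", "N3", "N2", "REM", "REM", "N2", "W"]
print(f"accuracy = {sum(m == a for m, a in zip(manual, auto)) / len(manual):.2f}")
print(f"kappa    = {cohens_kappa(manual, auto):.2f}")
```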
This work reports on practical experiences with interoperability in German practice management systems (PMSs), gathered in an ongoing clinical trial on teledermatology, the TeleDerm project. A proprietary, established web platform for store-and-forward telemedicine is integrated with the IT in the GPs’ offices for the automatic exchange of basic patient data. Most of the 19 different PMSs included in the study sample lack support for modern health data exchange standards; therefore, the relatively old but widely available German health data exchange interface “Gerätedatentransfer” (GDT) is used. Owing to the lack of enforcement and regulation of the GDT standard, several obstacles to interoperability were encountered. As a partial but reusable working solution to these issues, we present a custom middleware that is used in conjunction with GDT. We describe the design, the technical implementation and the hindrances observed with the existing infrastructure, and close with a discussion of health care interfacing standards and the current state of interoperability in German PMS software.
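For readers unfamiliar with GDT: it belongs to the line-oriented xDT family of interfaces, in which each line carries a three-digit length, a four-digit field identifier and the field content. The sketch below is a minimal illustration under that assumption, not the TeleDerm middleware; the field identifiers, sample record and CP437 encoding are common xDT conventions used here purely as examples.

```python
# Minimal sketch (not the project's middleware): reading one record of an
# xDT-style file such as GDT. Each line is assumed to carry a 3-digit length,
# a 4-digit field identifier and the field content.
def parse_xdt_lines(raw: bytes, encoding: str = "cp437") -> list[tuple[str, str]]:
    """Return (field_id, content) pairs; malformed lines are skipped, not rejected."""
    fields = []
    for line in raw.splitlines():
        text = line.decode(encoding, errors="replace")
        if len(text) < 7:
            continue  # too short to hold length + field identifier
        field_id, content = text[3:7], text[7:]
        fields.append((field_id, content))
    return fields

# Invented example record; 3101/3102 are the usual xDT identifiers for last/first name.
record = b"01380006302\r\n0193101Mustermann\r\n0143102Erika\r\n"
for field_id, content in parse_xdt_lines(record):
    if field_id in ("3101", "3102"):
        print(field_id, content)
```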
Electronic word-of-mouth (eWoM) communication has received a lot of attention from the academic community. As multiple research papers focus on specific facets of eWoM, there is a need to integrate current research results systematically. Thus, this paper presents a scientific literature analysis in order to determine the current state-of-the-art in the field of eWoM.
Current fields of interest
(2016)
If we review the research done in the field of optimization, the following topics appear to be the focus of current development:
– Optimization under uncertainties, which accounts for the inevitable scatter of parts, external effects and internal properties. Both reliability and robustness have to be considered when running such optimizations, which is why the term Robust Design Optimization (RDO) came into use.
– Multi-Objective Optimization (MOO) handles situations in which different participants in the development process pull in different directions. Typically we think of commercial versus engineering aspects, but other constellations have to be considered as well, such as comfort versus performance or price versus consumption (a minimal Pareto sketch follows after this list).
– Development of the entire design process, with optimization included from the early stages, might help to avoid inefficient effort. Here the management of virtual development has to be redesigned to fit into a coherent scheme.
...
There are many other fields where interesting progress is being made. We limit our discussion to the first three topics.
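As an aside, the core idea behind MOO can be illustrated in a few lines: with two conflicting objectives such as price and consumption, the interesting candidates are those not dominated by any other design. The sketch below uses invented numbers purely for illustration.

```python
# Illustrative only (not from the text): the Pareto-optimal candidates for two
# conflicting objectives (lower is better for both), as they appear in MOO.
candidates = [
    {"design": "A", "price": 100, "consumption": 8.0},
    {"design": "B", "price": 120, "consumption": 6.5},
    {"design": "C", "price": 150, "consumption": 6.6},  # worse than B in both objectives
    {"design": "D", "price": 180, "consumption": 5.0},
]

def dominates(a, b):
    """a dominates b if it is no worse in both objectives and better in at least one."""
    return (a["price"] <= b["price"] and a["consumption"] <= b["consumption"]
            and (a["price"] < b["price"] or a["consumption"] < b["consumption"]))

pareto_front = [c for c in candidates
                if not any(dominates(other, c) for other in candidates if other is not c)]
print([c["design"] for c in pareto_front])  # -> ['A', 'B', 'D']; C is dominated by B
```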
We report on the cure characterization, based on inline monitoring of the dielectric parameters, of a commercially available epoxy phenol resin molding compound with a high glass transition temperature (>195 °C), which is suitable for the direct packaging of electronic components. The resin was cured under isothermal temperatures close to general process conditions (165–185 °C). The material conversion was determined by measuring the ion viscosity. The change of the ion viscosity as a function of time and temperature was used to characterize the cross-linking behavior, following two separate approaches (model-based and isoconversional). The determined kinetic parameters are in good agreement with those reported in the literature for epoxy molding compounds (EMCs) and lead to accurate cure predictions under process-near conditions. Furthermore, the kinetic models based on dielectric analysis (DEA) were compared with standard offline differential scanning calorimetry (DSC) models, which were based on dynamic measurements. Many of the determined kinetic parameters had similar values for the different approaches. Major deviations were found for the parameters linked to the end of the reaction, where vitrification phenomena occur under process-related conditions. The glass transition temperature of the inline-molded parts was determined via thermomechanical analysis (TMA) to confirm the vitrification effect. The similarities and differences between the resulting kinetic models of the two measurement techniques are presented, and it is shown how dielectric analysis can be of high relevance for the characterization of the curing reaction under conditions close to series production.
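For illustration only, the model-based branch of such a cure analysis typically fits an Arrhenius-type rate law to the conversion data and then predicts isothermal cure curves. The sketch below uses a generic autocatalytic (Kamal-type) rate expression with invented parameter values; it is not the model identified in the paper.

```python
# Hedged sketch (model form and parameter values are assumptions, not the paper's):
# isothermal cure prediction with an autocatalytic rate law,
# d(alpha)/dt = A * exp(-E/RT) * alpha^m * (1-alpha)^n, integrated with an Euler step.
import math

A, E = 5.0e6, 70e3         # pre-exponential factor [1/s] and activation energy [J/mol], assumed
m, n, R = 0.3, 1.5, 8.314  # assumed reaction orders and the gas constant

def cure_curve(T_celsius, t_end=600.0, dt=0.1, alpha0=1e-3):
    """Return (time, conversion) samples for an isothermal hold at T_celsius."""
    T = T_celsius + 273.15
    k = A * math.exp(-E / (R * T))
    alpha, t, out = alpha0, 0.0, []
    while t <= t_end:
        out.append((t, alpha))
        alpha = min(1.0, alpha + dt * k * alpha**m * (1 - alpha)**n)
        t += dt
    return out

for T in (165, 175, 185):  # the isothermal temperatures mentioned above
    t50 = next(t for t, a in cure_curve(T) if a >= 0.5)
    print(f"{T} °C: 50 % conversion after ~{t50:.0f} s (with the assumed parameters)")
```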
Within the scope of the present cumulative doctoral thesis, six scientific papers were published which illustrate that modern reaction model-free (isoconversional) kinetic analysis (ICKA) methods represent a universal and effective tool for the controlled processing of thermosetting materials. To demonstrate the universal applicability of ICKA methods, the thermal cure of thermosetting materials covering a very broad range of chemical compositions (melamine-formaldehyde resins, epoxy resins, polyester-epoxy resins, and acrylate/epoxy resins) was analyzed and mathematically modelled. Some of the materials were based on renewable resources (an epoxy resin was made from hempseed oil; linseed oil was modified into an acrylate/epoxy resin). With the aid of ICKA methods, not only single-step but also complex multi-step reactions were modelled precisely. The analyzed thermosetting materials were combined with wood, wood-based products, paper, and plant fibers and processed into various final products. Some of the thermosetting materials were applied as coatings on wood substrates (in the form of impregnated décor papers, or as powder and wet coatings, respectively), and the epoxy resin from hempseed oil was mixed with plant fibers and processed into bio-based composites for lightweight applications. Mechanical, thermal, and surface properties of the final products were determined. The activation energy as a function of cure conversion derived from the ICKA methods was used to accurately predict the thermal cure over time for arbitrary cure conditions. Furthermore, the cure models were used to establish correlations between the cross-linking during processing and the properties of the final products. This made it possible to derive process times and temperatures that guarantee optimal cross-linking as well as optimal product properties.
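The model-free prediction step referred to here is commonly written as follows in the isoconversional literature (a standard formulation, not quoted from the thesis): with the activation energy E_alpha known for each conversion level alpha, the time to reach alpha under an arbitrary temperature program T(t) follows from a reference measurement T_ref(t).

```latex
% Standard isoconversional (model-free) prediction:
\[
  \int_0^{t_\alpha} \exp\!\left(-\frac{E_\alpha}{R\,T(t)}\right)\mathrm{d}t
  \;=\;
  \int_0^{t_{\alpha,\mathrm{ref}}} \exp\!\left(-\frac{E_\alpha}{R\,T_{\mathrm{ref}}(t)}\right)\mathrm{d}t ,
\]
% which for two isothermal holds reduces to
% t_\alpha(T) = t_\alpha(T_{\mathrm{ref}}) \exp\!\left[\frac{E_\alpha}{R}\left(\frac{1}{T}-\frac{1}{T_{\mathrm{ref}}}\right)\right].
```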
This booklet will give you an overview of the development of CSR from a (brief) historical point of view and will examine the underlying concepts and research. Furthermore, examples of contemporary CSR management will be explored to show how companies interpret the issue and how they face the challenges of managing the new demands placed upon them. Business, in the end, comes down to figures and numbers, which give management, shareholders and stakeholders a chance to measure a company’s success. Therefore, modern methods and approaches for measuring, rating and ranking a company’s CSR management will be presented. Finally, an attempt will be made to evaluate CSR as a tool for increasing global welfare and as a business and management strategy for companies and entrepreneurs.
The diversity of energy prosumer types makes it difficult to create appropriate incentive mechanisms that satisfy both prosumers and energy system operators alike. Meanwhile, European energy suppliers buy guarantees of origin (GoO), which allow them to sell green energy at premium prices while in reality delivering grey energy to their customers. Blockchain technology has proven itself to be a robust payment system in which users transact money without the involvement of a third party. Blockchain tokens can be used to represent a unit of energy and, just like GoOs, be submitted to the market. This paper focuses on simulating a marketplace on the Ethereum blockchain using smart contracts, where prosumers can sell tokenized GoOs to consumers willing to subsidize renewable energy producers. Such markets bypass energy providers by allowing consumers to obtain tokenized GoOs directly from the producers, which in turn benefit directly from the earnings. Two market strategies in which tokens are sold as GoOs have been simulated. In the Fix Price Strategy, prosumers sell their tokens at the average GoO price of 2014. The Variable Price Strategy focuses on selling tokens at a price within the range defined by the difference between grey and green energy prices. The study finds that the Ethereum blockchain is robust enough to function as a platform for tokenized GoO trading. The simulation results have been compared, and they indicate that prosumers earn significantly more money by following the Variable Price Strategy.
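As a toy illustration of the two market strategies (all prices are placeholders; neither the 2014 average GoO price nor the study's grey/green price gap is reproduced here):

```python
# Placeholder comparison of the two strategies; none of these numbers come from the paper.
def fixed_price_revenue(energy_mwh, goo_price_eur=0.35):
    """Fix Price Strategy: every token is sold at one flat GoO price per MWh (assumed value)."""
    return energy_mwh * goo_price_eur

def variable_price_revenue(energy_mwh, grey_price_eur=38.0, green_price_eur=41.0,
                           share_of_gap=0.5):
    """Variable Price Strategy: the token price lies inside the grey/green price gap (assumed values)."""
    token_price = share_of_gap * (green_price_eur - grey_price_eur)
    return energy_mwh * token_price

produced = 12.0  # MWh fed in by a hypothetical prosumer
print(f"fixed:    {fixed_price_revenue(produced):7.2f} EUR")
print(f"variable: {variable_price_revenue(produced):7.2f} EUR")
```

With these invented numbers the Variable Price Strategy pays more, which matches the direction of the paper's finding, but the magnitudes themselves carry no meaning.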