The goal of a current research project at Hochschule Reutlingen, carried out jointly with the engineering firm Ingenieurbüro Ganssloser and the Universität Tübingen, is to identify and exploit flexibilities in companies that act together as a virtual power plant on the electricity market. To this end, a control box for industrial and commercial enterprises is to be developed that communicates with the central control room of the virtual power plant on the one hand and, on the other, controls the company's installations so that the available flexibilities are used as optimally as possible. Within the project, Hochschule Reutlingen focuses on the detection and description of flexibilities in companies.
Forecasting intermittent and lumpy demand is challenging. Demand occurs only sporadically and, when it does, it can vary considerably. Forecast errors are costly, resulting in obsolescent stock or unmet demand. Methods from statistics, machine learning and deep learning have been used to predict such demand patterns. Traditional accuracy metrics are often employed to evaluate the forecasts; however, these come with major drawbacks, such as not taking into account horizontal and vertical shifts over the forecasting horizon, nor stock-keeping or opportunity costs. This results in a disadvantageous selection of methods in the context of intermittent and lumpy demand forecasts. In our study, we compare methods from statistics, machine learning and deep learning by applying a novel metric called Stock-keeping-oriented Prediction Error Costs (SPEC), which overcomes the drawbacks associated with traditional metrics. Under the SPEC metric, the Croston algorithm achieves the best result, just ahead of a Long Short-Term Memory neural network.
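Croston's method, the best performer above, is simple enough to sketch: it smooths non-zero demand sizes and inter-demand intervals separately and forecasts their ratio. Below is a minimal illustrative implementation; the initialisation and the smoothing constant are common textbook choices, not necessarily those used in the study.

```python
def croston(demand, alpha=0.1):
    """Croston's method for intermittent demand (minimal sketch).

    Smooths non-zero demand sizes (z) and inter-demand intervals (p)
    separately; the per-period forecast is z / p.
    """
    z = p = None      # smoothed demand size, smoothed interval
    q = 1             # periods since the last non-zero demand
    forecasts = []
    for d in demand:
        # Forecast is made before observing the current period's demand.
        forecasts.append(z / p if z is not None else None)
        if d > 0:
            if z is None:                 # initialise on first demand
                z, p = d, q
            else:                         # exponential smoothing updates
                z = z + alpha * (d - z)
                p = p + alpha * (q - p)
            q = 1
        else:
            q += 1
    return forecasts
```

Note that the forecast only changes in periods following a demand occurrence, which is exactly the property that makes Croston-style methods suited to sporadic demand.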
Delphi Markets
(2023)
Delphi markets refer to approaches and implementations that integrate prediction markets and Delphi studies (real-time Delphi). Combining the two forecasting methods can potentially compensate for each other's weaknesses. For example, prediction markets can be used to select participants with expertise and to motivate long-term participation through their gamified approach and incentive mechanisms. In this paper, two potentials for prediction markets and four potentials for Delphi studies, which are made possible by integration, are derived theoretically. Subsequently, three different integration approaches are presented, on the basis of which integration at the user, market, and Delphi-question level is illustrated, and it is shown that, depending on the approach, not all potentials can be achieved. Finally, recommendations for the use of Delphi markets are derived, and existing limitations of Delphi markets as well as future developments are pointed out.
Introduction: Bioresorbable collagenous barrier membranes are used to prevent premature soft tissue ingrowth and to allow bone regeneration. For volume-stable indications, only non-absorbable synthetic materials are available. This study investigates a new bioresorbable hydrofluoric acid (HF)-treated magnesium (Mg) mesh in a native collagen membrane for volume-stable situations. Materials and Methods: HF-treated and untreated Mg were compared in direct and indirect cytocompatibility assays. In vivo, 18 New Zealand White rabbits each received four 8 mm calvarial defects and were divided into four groups: (a) HF-treated Mg mesh/collagen membrane, (b) untreated Mg mesh/collagen membrane, (c) collagen membrane, and (d) sham operation. After 6, 12 and 18 weeks, Mg degradation and bone regeneration were measured using radiological and histological methods. Results: In vitro, HF-treated Mg showed higher cytocompatibility. Histopathologically, HF-Mg prevented gas cavities and was degraded by mononuclear cells via phagocytosis up to 12 weeks. Untreated Mg showed, in part, significantly more gas cavities and a fibrous tissue reaction. Bone regeneration was not significantly different between the groups. Discussion and Conclusions: HF-Mg meshes embedded in native collagen membranes represent a volume-stable and biocompatible alternative to non-absorbable synthetic materials. HF-Mg shows less corrosion and is degraded by phagocytosis. However, the application of the membranes did not result in higher bone regeneration.
Production systems are becoming increasingly complex, which means that the main task of industrial maintenance, ensuring the technical availability of a production system, is also becoming increasingly difficult. The previous focus of maintenance efforts on individual machines must give way to a holistic view encompassing the whole production system. Against this background, the technical availability of a production system must be redefined. The aim of this publication is to present different definition approaches of production systems’ availability and to demonstrate the effects of random machine failures on the key figures considering the complexity of the production system using a discrete event simulation.
Defining the antecedents of experience co-creation as applied to alternative consumption models
(2019)
Purpose – The purpose of this paper is to propose a conceptual framework of experience co-creation that captures the multi-dimensionality of this construct, as well as a research process for defining of the antecedents of experience co-creation.
Design/methodology/approach – The framework of experience co-creation was conceptualized by means of a literature review. Subsequently, this framework was used as the conceptual basis for a qualitative content analysis of 66 empirical papers investigating alternative consumption models (ACMs), such as renting, remanufacturing, and second-hand models.
Findings – The qualitative content analysis resulted in 12 categories related to the consumer and 9 related to the ACM offerings that represent the antecedents of experience co-creation. These categories provide evidence that, to a large extent, the developed conceptual framework allows one to capture the multi-dimensionality of the experience co-creation construct.
Research limitations/implications – This study underscores the understanding of experience co-creation as a function of the characteristics of the offering – which are, in turn, a function of the consumers’ motives as determined by their lifeworlds – as well as the role of service design as an iterative approach to finding, creating and refining service offerings.
Practical implications – The investigation of the antecedents of experience co-creation can enable service providers to determine significant consumer market conditions for forecasting the suitability and viability of their offerings and to adjust their service designs accordingly.
Originality/value – This paper provides a step toward the operationalization of the dimension-related experience co-creation construct and presents an approach to defining the antecedents of experience co-creation by considering different research perspectives that can enhance service design research.
Purpose: Gliomas are the most common and aggressive type of brain tumors due to their infiltrative nature and rapid progression. The process of distinguishing tumor boundaries from healthy cells is still a challenging task in the clinical routine. Fluid attenuated inversion recovery (FLAIR) MRI modality can provide the physician with information about tumor infiltration. Therefore, this paper proposes a new generic deep learning architecture, namely DeepSeg, for fully automated detection and segmentation of the brain lesion using FLAIR MRI data.
Methods: The developed DeepSeg is a modular decoupling framework. It consists of two connected core parts based on an encoding and decoding relationship. The encoder part is a convolutional neural network (CNN) responsible for spatial information extraction. The resulting semantic map is inserted into the decoder part to get the full-resolution probability map. Based on modified U-Net architecture, different CNN models such as residual neural network (ResNet), dense convolutional network (DenseNet), and NASNet have been utilized in this study.
Results: The proposed deep learning architectures have been successfully tested and evaluated online on the MRI datasets of the brain tumor segmentation (BraTS 2019) challenge, including 336 cases of training data and 125 cases of validation data. The Dice and Hausdorff distance scores of the obtained segmentation results are about 0.81 to 0.84 and 9.8 to 19.7, respectively.
Conclusion: This study showed successful feasibility and comparative performance of applying different deep learning models in a new DeepSeg framework for automated brain tumor segmentation in FLAIR MR images. The proposed DeepSeg is open source and freely available at https://github.com/razeineldin/DeepSeg/.
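The Dice coefficient reported above measures the overlap between a predicted and a reference segmentation mask. As a minimal sketch, on flat binary masks it reduces to a few lines (the function name here is illustrative, not from the DeepSeg code):

```python
def dice_coefficient(pred, truth):
    """Dice similarity between two binary masks given as flat 0/1 lists:
    2 * |intersection| / (|pred| + |truth|)."""
    intersection = sum(p * t for p, t in zip(pred, truth))
    total = sum(pred) + sum(truth)
    # Convention: two empty masks are a perfect match.
    return 2.0 * intersection / total if total else 1.0
```

A score of 1.0 means perfect overlap; the ~0.84 reported above therefore indicates substantial but not complete agreement with the reference delineation.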
The Internet of Things (IoT) is characterized by many different standards, protocols, and data formats that are often not compatible with each other. Thus, the integration of different heterogeneous IoT components into a uniform IoT setup can be a time-consuming manual task. This lacking interoperability between IoT components has been addressed with different approaches in the past. However, only very few of these approaches rely on machine learning techniques. In this work, we present a new way towards IoT interoperability based on Deep Reinforcement Learning (DRL). In detail, we demonstrate that DRL algorithms, which use network architectures inspired by Natural Language Processing (NLP), can be applied to learn to control an environment by merely taking raw JSON or XML structures, which reflect the current state of the environment, as input. Applied to IoT setups, where the current state of a component is often reflected by features embedded in JSON or XML structures and exchanged via messages, our NLP DRL approach eliminates the need for feature engineering and manually written code for pre-processing of data, feature extraction, and decision making.
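Feeding raw JSON state structures to an NLP-style network presupposes some serialisation of the JSON tree into a token sequence the encoder can embed. A hypothetical minimal sketch of such a flattening (an assumption for illustration, not the paper's exact scheme):

```python
import json

def json_to_tokens(raw):
    """Flatten a raw JSON document into path=value tokens that an
    NLP-style encoder could embed, without hand-crafted features."""
    def walk(node, prefix):
        if isinstance(node, dict):
            for key, value in node.items():
                yield from walk(value, prefix + [key])
        elif isinstance(node, list):
            for i, value in enumerate(node):
                yield from walk(value, prefix + [str(i)])
        else:
            # Leaf: emit the path through the tree plus the value.
            yield "/".join(prefix) + "=" + str(node)
    return list(walk(json.loads(raw), []))
```

Because the tokenisation is generic over any JSON shape, the same agent input pipeline works for heterogeneous IoT components without per-device pre-processing code.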
There is still a great reliance on human expert knowledge during the analog integrated circuit sizing design phase due to its complexity and scale, with the result that there is a very low level of automation associated with it. Current research shows that reinforcement learning is a promising approach for addressing this issue. Similarly, it has been shown that the convergence of conventional optimization approaches can be improved by transforming the design space from the geometrical domain into the electrical domain. Here, this design space transformation is employed as an alternative action space for deep reinforcement learning agents. The presented approach is based entirely on reinforcement learning, whereby agents are trained in the craft of analog circuit sizing without explicit expert guidance. After training and evaluating agents on circuits of varying complexity, their behavior when confronted with a different technology is examined, demonstrating the applicability, feasibility, and transferability of this approach.
Intracranial brain tumors are one of the ten most common malignant cancers and account for substantial morbidity and mortality. The largest histological category of primary brain tumors is the gliomas, which occur with a highly heterogeneous appearance and can be challenging to discern radiologically from other brain lesions. Neurosurgery is mostly the standard of care for newly diagnosed glioma patients and may be followed by radiation therapy and adjuvant temozolomide chemotherapy.
However, brain tumor surgery faces fundamental challenges in achieving maximal tumor removal while avoiding postoperative neurologic deficits. Two of these neurosurgical challenges are presented as follows. First, manual glioma delineation, including its sub-regions, is considered difficult due to its infiltrative nature and the presence of heterogeneous contrast enhancement. Second, the brain deforms its shape, called “brain shift,” in response to surgical manipulation, swelling due to osmotic drugs, and anesthesia, which limits the utility of pre-operative imaging data for guiding the surgery.
Image-guided systems provide physicians with invaluable insight into anatomical or pathological targets based on modern imaging modalities such as magnetic resonance imaging (MRI) and Ultrasound (US). The image-guided toolkits are mainly computer-based systems, employing computer vision methods to facilitate the performance of peri-operative surgical procedures. However, surgeons still need to mentally fuse the surgical plan from pre-operative images with real-time information while manipulating the surgical instruments inside the body and monitoring target delivery. Hence, the need for image guidance during neurosurgical procedures has always been a significant concern for physicians.
This research aims to develop a novel peri-operative image-guided neurosurgery (IGN) system, namely DeepIGN, that can achieve the expected outcomes of brain tumor surgery, thus maximizing the overall survival rate and minimizing post-operative neurologic morbidity. In the scope of this thesis, novel methods are first proposed for the core parts of the DeepIGN system of brain tumor segmentation in MRI and multimodal pre-operative MRI to the intra-operative US (iUS) image registration using the recent developments in deep learning. Then, the output prediction of the employed deep learning networks is further interpreted and examined by providing human-understandable explainable maps. Finally, open-source packages have been developed and integrated into widely endorsed software, which is responsible for integrating information from tracking systems, image visualization, image fusion, and displaying real-time updates of the instruments relative to the patient domain.
The components of DeepIGN have been validated in the laboratory and evaluated in the simulated operating room. For the segmentation module, DeepSeg, a generic decoupled deep learning framework for automatic glioma delineation in brain MRI, achieved an accuracy of 0.84 in terms of the dice coefficient for the gross tumor volume. Performance improvements were observed when employing advancements in deep learning approaches such as 3D convolutions over all slices, region-based training, on-the-fly data augmentation techniques, and ensemble methods.
To compensate for brain shift, an automated, fast, and accurate deformable approach, iRegNet, is proposed for registering pre-operative MRI to iUS volumes as part of the multimodal registration module. Extensive experiments have been conducted on two multi-location databases: the BITE and the RESECT. Two expert neurosurgeons conducted additional qualitative validation of this study through overlaying MRI-iUS pairs before and after the deformable registration. Experimental findings show that the proposed iRegNet is fast and achieves state-of-the-art accuracies. Furthermore, the proposed iRegNet can deliver competitive results, even in the case of non-trained images, as proof of its generality and can therefore be valuable in intra-operative neurosurgical guidance.
For the explainability module, the NeuroXAI framework is proposed to increase the trust of medical experts in applying AI techniques and deep neural networks. The NeuroXAI includes seven explanation methods providing visualization maps to help make deep learning models transparent. Experimental findings showed that the proposed XAI framework achieves good performance in extracting both local and global contexts in addition to generating explainable saliency maps to help understand the prediction of the deep network. Further, visualization maps are obtained to realize the flow of information in the internal layers of the encoder-decoder network and understand the contribution of MRI modalities in the final prediction. The explainability process could provide medical professionals with additional information about tumor segmentation results and therefore aid in understanding how the deep learning model is capable of processing MRI data successfully.
Furthermore, an interactive neurosurgical display has been developed for interventional guidance, which supports the available commercial hardware such as iUS navigation devices and instrument tracking systems. The clinical environment and technical requirements of the integrated multi-modality DeepIGN system were established with the ability to incorporate: (1) pre-operative MRI data and associated 3D volume reconstructions, (2) real-time iUS data, and (3) positional instrument tracking. The system's accuracy was tested using a custom agar phantom model, and its use in a pre-clinical operating room was simulated. The results of the clinical simulation confirmed that system assembly was straightforward, achievable in a clinically acceptable time of 15 min, and performed with a clinically acceptable level of accuracy.
In this thesis, a multimodality IGN system has been developed using the recent advances in deep learning to accurately guide neurosurgeons, incorporating pre- and intra-operative patient image data and interventional devices into the surgical procedure. DeepIGN is developed as open-source research software to accelerate research in the field, enable ease of sharing between multiple research groups, and continuous developments by the community. The experimental results hold great promise for applying deep learning models to assist interventional procedures - a crucial step towards improving the surgical treatment of brain tumors and the corresponding long-term post-operative outcomes.
Deep learning-based EEG detection of mental alertness states from drivers under ethical aspects
(2021)
One of the most critical factors for a successful road trip is a high degree of alertness while driving. Even a split second of inattention or sleepiness in a crucial moment can make the difference between life and death. Several prestigious car manufacturers are currently pursuing the aim of automated drowsiness identification to resolve this problem. The path between neuro-scientific research in connection with artificial intelligence and the preservation of the dignity and inviolability of the human individual is very narrow. The key contribution of this work is a system of data analysis for EEGs during a driving session, which draws on previous studies analyzing heart rate (ECG), brain waves (EEG), and eye function (EOG). The gathered data is treated as sensitively as possible, taking ethical regulations into consideration. Obtaining evaluable signs of evolving exhaustion involves techniques that extract sleep-stage frequencies; the correlated interferences in the signal are problematic here. This research focuses on a processing chain for EEG band splitting that involves band-pass filtering, principal component analysis (PCA), independent component analysis (ICA) with automatic artefact removal, and the fast Fourier transform (FFT). The classification is based on a step-by-step adaptive deep learning analysis that detects theta rhythms as a drowsiness predictor in the pre-processed data. It was possible to obtain an offline detection rate of 89% and an online detection rate of 73%. The method is linked to the simulated driving scenario for which it was developed. This leaves space for more optimization of laboratory methods and data collection during wakefulness-dependent operations.
In recent years, both Artificial Intelligence (AI) and Variable Renewable Energy (VRE) have received increasing attention in scientific research. Thus, this article's purpose is to investigate the potential of Deep Learning (DL)-based applications for VRE and, as such, to provide an introduction to and structured overview of the field. First, we conduct a systematic literature review of the application of AI, especially DL, to the integration of VRE. Subsequently, we provide a comprehensive overview of specific DL-based solution approaches and evaluate their applicability, including a survey of the most applied and best suited DL architectures. We identify ten DL-based approaches to support the integration of VRE in modern power systems. We find (I) solar PV and wind power generation forecasting, (II) system scheduling and grid management, and (III) intelligent condition monitoring to be three high-potential application areas.
Intraoperative imaging can assist neurosurgeons in delineating brain tumours and surrounding brain structures. Interventional ultrasound (iUS) is a convenient modality with fast scan times. However, iUS data may suffer from noise and artefacts which limit their interpretation during brain surgery. In this work, we use two deep learning networks, namely UNet and TransUNet, to perform automatic and accurate segmentation of the brain tumour in iUS data. Experiments were conducted on a dataset of 27 iUS volumes. The outcomes show that combining a transformer with UNet is advantageous, providing efficient segmentation by modelling long-range dependencies within each iUS image. In particular, the enhanced TransUNet was able to predict cavity segmentation in iUS data with an inference rate of more than 125 FPS. These promising results suggest that deep learning networks can be successfully deployed to assist neurosurgeons in the operating room.
Fault diagnosis of rolling bearings is an essential process for improving the reliability and safety of rotating machinery. It is always a major challenge to ensure fault diagnosis accuracy, in particular under severe working conditions. In this article, a deep adversarial domain adaptation (DADA) model is proposed for rolling bearing fault diagnosis. This model constructs an adversarial adaptation network to solve the commonly encountered problem in numerous real applications: the source domain and the target domain are inconsistent in their distribution. First, a deep stack autoencoder (DSAE) is combined with representative feature learning for dimensionality reduction, and such a combination provides an unsupervised learning method to effectively acquire fault features. Meanwhile, domain adaptation and recognition classification are implemented using a Softmax classifier to augment classification accuracy. Second, the effects of the number of hidden layers in the stack autoencoder network, the number of neurons in each hidden layer, and the hyperparameters of the proposed fault diagnosis algorithm are analyzed. Third, comprehensive analysis is performed on real data to validate the performance of the proposed method; the experimental results demonstrate that the new method outperforms the existing machine learning and deep learning methods, in terms of classification accuracy and generalization ability.
Through increasing market dynamics, rapidly evolving technologies and shifting user expectations, coupled with the adoption of lean and agile practices, companies are struggling with their ability to provide reliable product roadmaps by applying traditional approaches. Currently, most companies are seeking opportunities to improve their product roadmapping practices. As a first challenge, they have to assess their current product roadmapping capabilities in order to better understand how to improve their practices and how to switch to a new approach. The aim of this article is to provide an initial maturity model for product roadmapping practices that is especially suited for assessing the roadmapping capabilities of companies operating in dynamic and uncertain market environments. Based on interviews with 15 experts from 13 different companies, the current state of practice regarding product roadmapping was identified. Afterwards, the model development was conducted in the context of expert workshops with the Robert Bosch GmbH and researchers. The study results in the so-called DEEP 1.0 product roadmap maturity model, which allows companies to conduct a self-assessment of their product roadmapping practice.
In the context of Industry 4.0, intralogistics faces an increasingly complex and dynamic environment driven by a high level of product customisation and complex manufacturing processes. One approach to dealing with these changing conditions is the decentralised and intelligent connectivity of intralogistics systems. However, wireless connectivity presents a major challenge in the industry due to strict requirements such as safety and real-time data transmission. In this context, the fifth generation of mobile communications (5G) is a promising technology for meeting the requirements of safety-critical applications, particularly since 5G offers the possibility of establishing private 5G networks, also referred to as standalone non-public networks. Through their isolation from public networks, private 5G networks provide exclusive coverage for private organisations, offering them high intrinsic network control and data security. However, 5G is still under development and is being gradually introduced in a continuous release process. This process lacks transparency regarding the performance of 5G in individual releases, complicating the successful adoption of 5G as an industrial communication technology. Additionally, the evaluation of 5G against the specified target performance is insufficient due to the impact of the environment and external interfering factors on 5G in the industrial environment. Therefore, this paper aims to develop a technical decision-support framework that takes a holistic approach to evaluating the practicality of 5G for intralogistics use cases by considering two fundamental stages. The first analyses technical parameters and characteristics of the use case to evaluate the theoretical feasibility of 5G. The second stage investigates the application's environment, which substantially impacts the practicality of 5G, for instance, the influence of surrounding materials.
Finally, a case study validates the proposed framework by means of an autonomous mobile robot. As a result, the validation proves the proposed framework's applicability and shows the practicality of the autonomous mobile robot when integrated into a private 5G network testbed.
Enterprises are presently transforming their strategy, culture, processes, and information systems to become more digital. The digital transformation deeply disrupts existing enterprises and economies. Digitization fosters the development of IT systems with many rather small and distributed structures, like the Internet of Things or mobile systems. For years, many new business opportunities have appeared that use the potential of the Internet and related digital technologies, like the Internet of Things, services computing, cloud computing, big data with analytics, mobile systems, collaboration networks, and cyber-physical systems. This has a strong impact on architecting digital services and products. The change from a closed-world modeling perspective to more flexible open-world composition and evolution of system architectures defines the moving context for adaptable systems, which are essential to enable the digital transformation. In this paper, we focus on a decision-oriented architectural composition approach to support the transformation for digital services and products.
Digitization of societies changes the way we live, work, learn, communicate, and collaborate. In the age of digital transformation, IT environments with a large number of rather small structures like the Internet of Things (IoT), microservices, or mobility systems are emerging to support flexible and agile digitized products and services. Adaptable ecosystems with service-oriented enterprise architectures are the foundation for self-optimizing, resilient run-time environments and distributed information systems. The resulting business disruptions affect almost all new information processes and systems in the context of digitization. Our aim is to enable more flexible and agile transformations of both business and information technology domains, with more flexible enterprise information systems through adaptation and evolution of digital enterprise architectures. The present research paper investigates mechanisms for decision-controlled digitization architectures for the Internet of Things and microservices by evolving enterprise architecture reference models and state-of-the-art elements of architectural engineering for micro-granular systems.
Digitization fosters the development of IT environments with many rather small structures, like the Internet of Things (IoT), microservices, or mobility systems. They are needed to support flexible and agile digitized products and services. The goal is to create service-oriented enterprise architectures (EA) that are self-optimizing and resilient. The present research paper investigates methods for decision-making concerning digitization architectures for the Internet of Things and microservices. They are based on evolving enterprise architecture reference models and state-of-the-art elements of architectural engineering for micro-granular systems. Decision analytics in this field becomes increasingly complex, and decision support, particularly for the development and evolution of sustainable enterprise architectures, is sorely needed. These challenging decision processes can be supported in a more flexible and intuitive way by an architecture management cockpit.
The Internet of Things (IoT), enterprise social networks, adaptive case management, mobility systems, analytics for big data, and cloud services environments are emerging to support smart connected products and services and the digital transformation. Biological metaphors of living and adaptable ecosystems with service-oriented enterprise architectures provide the foundation for self-optimizing and resilient run-time environments for intelligent business services and related distributed information systems. We are investigating mechanisms of flexible adaptation and evolution for the next digital enterprise architecture systems in the context of the digital transformation. Our aim is to support flexibility and agile transformation for both business and related enterprise systems through adaptation and dynamic evolution of digital enterprise architectures. The present research paper investigates mechanisms for decision case management in the context of multi-perspective explorations of enterprise services and Internet of Things architectures by extending original enterprise architecture reference models with state-of-the-art elements of architectural engineering for digitization and architectural decision support.
The efficient production and utilization of green hydrogen is vital to succeeding in the global effort toward a sustainable future. To provide the necessary amount of green hydrogen, a high number of electrolyzers will be connected to the grid as decentralized power consumers. A large number of decentralized renewable power sources will provide the energy. In such a system, a control method is necessary to dispatch the available power most efficiently. In particular, the shutdown of renewable energy sources due to temporary overproduction must be avoided. This paper presents a decentralized tertiary control algorithm that provides a new decentralized control approach, thus creating a flexible, robust and easily scalable system. The operation of each grid participant within this grid-connected microgrid is optimized for maximum financial profit, while minimizing the exchange of power with the mains grid and reducing the shutdown of renewable power sources.
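The dispatch objective described above — absorb local renewable generation with the electrolyzers before exchanging power with the mains grid — can be illustrated with a toy merit-order allocation. This is a hypothetical sketch for intuition only; the paper's actual algorithm is decentralized and profit-optimizing, which this centralized toy is not.

```python
def dispatch(renewable_kw, electrolyzer_caps_kw):
    """Toy allocation: feed available renewable power to electrolyzers
    in order, up to each unit's capacity.

    Returns per-unit setpoints (kW) and the remaining surplus, which a
    real controller would export to the mains grid instead of curtailing
    the renewable sources.
    """
    setpoints = []
    remaining = renewable_kw
    for cap in electrolyzer_caps_kw:
        p = min(cap, remaining)   # never exceed unit capacity
        setpoints.append(p)
        remaining -= p
    return setpoints, remaining
```

In the decentralized scheme of the paper, each participant would reach an equivalent allocation through local decisions and price signals rather than a central loop like this one.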
The increasing emergence of cyber-physical systems (CPS) and their global interconnection into cyber-physical production systems (CPPS) are leading to fundamental changes in future work and logistic systems, requiring innovative methods to plan, control, and monitor changeable production systems, as well as new forms of human-machine collaboration. Logistic systems in particular must match the versatility of CPPS and will be transformed into so-called cyber-physical logistic systems, since logistical networks will be subject to the constant changes initiated by changeable production systems. This development is driven and reinforced by increasingly volatile and globalized market and manufacturing environments, combined with high demand for individualized products and services. Moreover, the centralized control systems predominantly used today are being pushed to their limits in dealing with the complexity of planning, controlling, and monitoring changeable work and logistic systems. Decentralized control systems have the potential to cope with these challenges by distributing the required operations across the various nodes of the resulting decentralized control system.
Learning factories, like the ESB Logistics Learning Factory at ESB Business School (Reutlingen University), provide a wide range of possibilities to develop new methods and innovative technical solutions in a risk-free and close-to-reality factory environment and to transfer knowledge as well as specific competences into the training of students and professionals. To intensify the research and training activities in the field of future work and logistics systems, ESB Business School is transferring its existing production system into a CPPS involving decentralized planning, control and monitoring methods and systems, human-machine-collaboration as well as technical assistance systems for changeable work and logistics systems.
The establishment of adipose tissue test systems is still a major challenge in the investigation of the cellular and molecular interactions responsible for the pathogenesis of inflammatory diseases involving adipose tissue. Mature adipocytes are the main cells involved in these pathologies, but they are rarely used in vitro due to the lack of an appropriate culture medium that inhibits dedifferentiation and maintains adipocyte functionality. In our study, we showed that Dulbecco's Modified Eagle's Medium/Ham's F-12 with 10% fetal calf serum (FCS), as reported for the culture of mature adipocytes, favors dedifferentiation, accompanied by high glycerol release, decreasing leptin release, low expression of the adipocyte marker perilipin A, and high expression of CD73 after 21 days. Optimized media containing FCS, biotin, pantothenate, insulin, and dexamethasone decelerated the dedifferentiation process. These cells showed a lower lipolysis rate, a high level of leptin release, and high expression of perilipin A; CD73-positive dedifferentiated fat cells were found only in small quantities. In this work, we showed that mature adipocytes, when cultured under optimized conditions, could be highly valuable for adipose tissue engineering in vitro.
As fuel prices climb and the global automotive sector migrates to more sustainable vehicle technologies, the future of South Africa's minibus taxis is in flux. The authors' previous research found that battery electric technology struggles to meet all the mobility requirements of minibus taxis. Here they investigate the technical feasibility of powering taxis with hydrogen fuel cells instead. The following results are projected using a custom-built simulator and tracking data from taxis based in Stellenbosch, South Africa. Each taxi requires around 12 kg of hydrogen gas per day to travel an average distance of 360 km. 465 kWh of electricity, or 860 m² of solar panels, would electrolyse the required green hydrogen. An economic analysis was conducted on the capital and operational expenses of a system of ten hydrogen taxis and an electrolysis plant. Such a pilot project requires a minimum investment of €3.8 million (R 75 million) over a 20-year period. Although such a small-scale roll-out is technically feasible and would meet taxis' performance requirements, the investment cost is too high, making it financially unfeasible. The authors conclude that a large-scale solution would need to be investigated to improve financial feasibility; however, South Africa's limited electrical generation capacity poses a threat to its technical feasibility. The simulator is available at: https://gitlab.com/eputs/ev-fleet-sim-fcv-model.
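The abstract's headline figures can be cross-checked with simple arithmetic. A minimal, illustrative sketch follows; reading the 465 kWh as a per-taxi, per-day value is an assumption made here, not a statement from the paper:

```python
# Back-of-the-envelope check of the figures quoted above (illustrative only).
# Assumption: 12 kg H2, 360 km, and 465 kWh are all per taxi, per day.
H2_PER_DAY_KG = 12.0            # daily hydrogen demand per taxi
DISTANCE_KM_PER_DAY = 360.0     # average daily distance per taxi
ELECTROLYSIS_KWH_PER_DAY = 465.0  # electricity to electrolyse the daily hydrogen

# Derived consumption and electrolysis intensity
consumption_kg_per_100km = H2_PER_DAY_KG / DISTANCE_KM_PER_DAY * 100
electricity_kwh_per_kg = ELECTROLYSIS_KWH_PER_DAY / H2_PER_DAY_KG

print(f"{consumption_kg_per_100km:.2f} kg H2 per 100 km")  # ≈ 3.33
print(f"{electricity_kwh_per_kg:.1f} kWh per kg of H2")    # ≈ 38.8
```

The roughly 39 kWh/kg implied by these numbers is an electrolysis-side figure; whether it includes compression and storage losses is not stated in the abstract.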
Debiasing as a management tool
(2021)
Companies exist by making one decision after another: Should we invest here, or rather there? Should this employee or that one be promoted to the leadership position? And what price is appropriate for the new service? How well a company fares therefore depends on the quality of its numerous decisions: companies that consistently decide more wisely than their competitors will hold their own in the market.
In this presentation the audience will be: (a) introduced to the aims and objectives of the DBTechNet initiative; (b) briefed on the DBTech EXT virtual laboratory workshops (VLW), i.e. the educational and training (E&T) content that is freely available over the internet and includes vendor-neutral, hands-on laboratory training sessions on key database technology topics; and (c) informed about some of the practical problems encountered and the way they have been addressed. Last but not least, the audience will be invited to consider incorporating some or all of the DBTech EXT VLW content into their higher education (HE), vocational education and training (VET), and/or lifelong learning/training course curricula. This comes at no cost and with no commitment on the part of the teacher/trainer; the latter is only expected to provide feedback on the pedagogical value and quality of the E&T content received/used.
In this tutorial we perform a cross-cut analysis of database systems from the perspective of modern storage technology, namely Flash memory. We argue that the design of modern DBMS and the architecture of Flash storage technologies are not aligned with each other. The result is needlessly suboptimal DBMS performance, inefficient Flash utilisation, and low Flash storage endurance and reliability. We showcase new DBMS approaches with improved algorithms and leaner architectures, designed to leverage the properties of modern storage technologies. We cover the area of transaction management and multi-versioning, putting a special emphasis on: (i) version organisation models and invalidation mechanisms in multi-versioning DBMS; (ii) Flash storage management, especially append-based storage at tuple granularity; (iii) Flash-friendly buffer management; as well as (iv) improvements in searching and indexing models. Furthermore, we present our NoFTL approach to native Flash access, which integrates parts of the Flash-management functionality into the DBMS, yielding a significant performance increase and a simplification of the I/O stack. In addition, we cover the basics of building large Flash storage for DBMS and revisit some RAID techniques and principles.
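As a concrete illustration of the version-organisation topic in (i), a minimal, hypothetical sketch of a newest-to-oldest version chain is shown below. It is not the tutorial's actual implementation; names and timestamps are invented for illustration:

```python
# Hypothetical sketch of newest-to-oldest version chains in a
# multi-versioning store: an update installs a new head version and
# implicitly invalidates its predecessor for later snapshots.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Version:
    value: object
    begin_ts: int                     # timestamp of the creating transaction
    prev: Optional["Version"] = None  # older version in the chain

class VersionedRecord:
    def __init__(self, value, ts: int):
        self.head = Version(value, ts)

    def update(self, value, ts: int) -> None:
        # New head; the old head stays reachable for older snapshots.
        self.head = Version(value, ts, prev=self.head)

    def read(self, snapshot_ts: int):
        # Visible version = newest version created at or before the snapshot.
        v = self.head
        while v is not None and v.begin_ts > snapshot_ts:
            v = v.prev
        return None if v is None else v.value

rec = VersionedRecord("a", ts=1)
rec.update("b", ts=5)
print(rec.read(3))  # older snapshot still sees "a"
```

The invalidation mechanisms discussed in the tutorial concern exactly this question of when an old version such as `"a"` can be garbage-collected once no snapshot needs it.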
The Fifteenth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2023), held March 13–17, 2023, continued a series of international events covering a large spectrum of topics related to advances in the fundamentals of databases, the evolution of the relation between databases and other domains, database technologies and content processing, as well as specifics of application-domain databases.
Advances in different technologies and domains related to databases triggered substantial improvements for content processing, information indexing, and data, process and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of the XML adoption.
High-speed communications and computations, large storage capacities, and load-balancing for distributed databases access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms and advanced indexing methods.
Evolution of e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems puts pressure on database communities to push the ‘de facto’ methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
The Fourteenth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2022), held May 22–26, 2022, continued a series of international events covering a large spectrum of topics related to advances in the fundamentals of databases, the evolution of the relation between databases and other domains, database technologies and content processing, as well as specifics of application-domain databases.
The Thirteenth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2021), held May 30 – June 3, 2021, continued a series of international events covering a large spectrum of topics related to advances in the fundamentals of databases, the evolution of the relation between databases and other domains, database technologies and content processing, as well as specifics of application-domain databases.
The Twelfth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2020) continued a series of events covering a large spectrum of topics related to advances in the fundamentals of databases, the evolution of the relation between databases and other domains, database technologies and content processing, as well as specifics of application-domain databases.
The Eleventh International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2019), held June 2–6, 2019, in Athens, Greece, continued a series of international events covering a large spectrum of topics related to advances in the fundamentals of databases, the evolution of the relation between databases and other domains, database technologies and content processing, as well as specifics of application-domain databases.
We welcomed academic, research and industry contributions. The conference had the following tracks:
Knowledge and decision base
Databases technologies
Data management
GraphSM: Large-scale Graph Analysis, Management and Applications
The Tenth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2018), held May 20–24, 2018, in Nice, France, continued a series of international events covering a large spectrum of topics related to advances in the fundamentals of databases, the evolution of the relation between databases and other domains, database technologies and content processing, as well as specifics of application-domain databases.
The Ninth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2017), held May 21–25, 2017, in Barcelona, Spain, continued a series of international events covering a large spectrum of topics related to advances in the fundamentals of databases, the evolution of the relation between databases and other domains, database technologies and content processing, as well as specifics of application-domain databases.
The Eighth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2016), held June 26–30, 2016, in Lisbon, Portugal, continued a series of international events covering a large spectrum of topics related to advances in the fundamentals of databases, the evolution of the relation between databases and other domains, database technologies and content processing, as well as specifics of application-domain databases.
The Seventh International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2015), held May 24–29, 2015, in Rome, Italy, continued a series of international events covering a large spectrum of topics related to advances in the fundamentals of databases, the evolution of the relation between databases and other domains, database technologies and content processing, as well as specifics of application-domain databases.
The Sixth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2014), held April 20–24, 2014, in Chamonix, France, continued a series of international events covering a large spectrum of topics related to advances in the fundamentals of databases, the evolution of the relation between databases and other domains, database technologies and content processing, as well as specifics of application-domain databases.
The Fifth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2013), held January 27 – February 1, 2013, in Seville, Spain, continued a series of international events covering a large spectrum of topics related to advances in the fundamentals of databases, the evolution of the relation between databases and other domains, database technologies and content processing, as well as specifics of application-domain databases. We take here the opportunity to warmly thank all the members of the DBKDA 2013 Technical Program Committee, as well as the numerous reviewers. The creation of such a high-quality conference program would not have been possible without their involvement. We also kindly thank all the authors who dedicated much of their time and effort to contribute to DBKDA 2013. We truly believe that, thanks to all these efforts, the final conference program consisted of top-quality contributions. Also, this event could not have been a reality without the support of many individuals, organizations, and sponsors.
We are grateful to the members of the DBKDA 2013 organizing committee for their help in handling the logistics and for their work to make this professional meeting a success. We hope that DBKDA 2013 was a successful international forum for the exchange of ideas and results between academia and industry and for the promotion of progress in the fields of databases, knowledge and data applications. We are convinced that the participants found the event useful and communications very open. We also hope the attendees enjoyed the charm of Seville, Spain.
The Fourth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2012), held between February 29 and March 5, 2012, in Saint Gilles, Reunion Island, continued a series of international events covering a large spectrum of topics related to advances in the fundamentals of databases, the evolution of the relation between databases and other domains, database technologies and content processing, as well as specifics of application-domain databases. We take here the opportunity to warmly thank all the members of the DBKDA 2012 Technical Program Committee, as well as the numerous reviewers. The creation of such a broad and high-quality conference program would not have been possible without their involvement. We also kindly thank all the authors who dedicated much of their time and effort to contribute to DBKDA 2012. We truly believe that, thanks to all these efforts, the final conference program consisted of top-quality contributions. Also, this event could not have been a reality without the support of many individuals, organizations, and sponsors.
We are grateful to the members of the DBKDA 2012 organizing committee for their help in handling the logistics and for their work to make this professional meeting a success. We hope that DBKDA 2012 was a successful international forum for the exchange of ideas and results between academia and industry and for the promotion of progress in the fields of databases, knowledge, and data applications. We are convinced that the participants found the event useful and communications very open. We also hope the attendees enjoyed the charm of Saint Gilles, Reunion Island.
The Third International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2011), held January 23–27, 2011, in St. Maarten, The Netherlands Antilles, continued a series of international events covering a large spectrum of topics related to advances in the fundamentals of databases, the evolution of the relation between databases and other domains, database technologies and content processing, as well as specifics of application-domain databases. We take this opportunity to thank all the members of the DBKDA 2011 Technical Program Committee as well as the numerous reviewers. The creation of such a broad and high-quality conference program would not have been possible without their involvement. We also kindly thank all the authors who dedicated much of their time and effort to contribute to DBKDA 2011. We truly believe that, thanks to all these efforts, the final conference program consisted of top-quality contributions. This event could also not have been a reality without the support of many individuals, organizations, and sponsors.
We are grateful to the members of the DBKDA 2011 organizing committee for their help in handling the logistics and for their work to make this professional meeting a success. We hope that DBKDA 2011 was a successful international forum for the exchange of ideas and results between academia and industry and for the promotion of progress in database research. We are convinced that the participants found the event useful and communications very open. The beautiful places of St. Maarten surely provided a pleasant environment during the conference and we hope you had a chance to visit the surroundings.
In contrast to classical advertising, event marketing is a dynamic communication instrument that continuously brings new trends and innovations with it. The diverse applications and potentials of event marketing make it possible to reach relevant target groups in keeping with the current zeitgeist, to create brand-relevant realities and experiential worlds, to generate emotions and sympathy, and in this way to establish a bond between the brand or company and its recipients.
Local clothing retailers face ever-increasing competitive pressure from mail-order companies. In addition, legacy architectures create a number of obstacles to growth. This paper therefore presents several approaches to designing data-centric enterprise architectures for clothing retail. They are based on the use of RFID to obtain customer profiles in branch stores and on big-data-based evaluation and analysis mechanisms. With the concepts presented, clothing retailers can develop individualized customer approaches and offers, much like mail-order companies.
Big data is currently discussed as one of the main trends in the IT industry. Big data means making decisions in real time, or predictively, on the basis of large volumes of heterogeneously structured data. High-performance, rapidly available forecasting methods are expected to minimize the risk of business decisions in highly volatile markets.
The use of data science in production enables a new kind of optimization of processes and systems. Data-driven production optimization is becoming increasingly important in the manufacturing industry. In contrast to conventional approaches such as lean management, this ongoing trend is based on the growing availability of data in the course of the digital transformation. Small and medium-sized enterprises in particular face the challenge of weighing which measures should be taken and which potential benefits result. This paper presents a structured guideline for conducting data analysis projects, applied to a specific use case in the context of early fault detection and prevention.
This paper reviews suggestions for changes to database technology coming from the work of many researchers, particularly those working with evolving big data. We discuss new approaches to remote data access and standards that better provide for durability and auditability in settings including business and scientific computing. We propose ways in which the language standards could evolve, with proof-of-concept implementations on GitHub.
Over the last decades, our society has undergone a tremendous shift toward using information technology in almost every daily routine, entailing an incredible growth of the data collected day by day by Web, IoT, and AI applications.
At the same time, electromechanical HDDs are being replaced by semiconductor storage such as SSDs, equipped with modern non-volatile memories like Flash, which yield significantly lower access latencies and higher levels of parallelism. Likewise, the execution speed of processing units has increased considerably, as today's server architectures comprise up to several hundred independently working CPU cores along with a variety of specialized computing co-processors such as GPUs or FPGAs.
However, the burden of moving the continuously growing data to the best fitting processing unit is inherently linked to today’s computer architecture that is based on the data-to-code paradigm. In the light of Amdahl's Law, this leads to the conclusion that even with today's powerful processing units, the speedup of systems is limited since the fraction of parallel work is largely I/O-bound.
Therefore, throughout this cumulative dissertation, we investigate the paradigm shift toward code-to-data, formally known as Near-Data Processing (NDP), which relieves the contention on the I/O bus by offloading processing to intelligent computational storage devices, where the data is originally located.
Firstly, we identified Native Storage Management as the essential foundation for NDP due to its direct control of physical storage management within the database. Building on this, the interface is extended to propagate address mapping information and to invoke NDP functionality on the storage device. As the former can become very large, we introduce Physical Page Pointers as a novel NDP abstraction for self-contained immutable database objects.
Secondly, the on-device navigation and interpretation of data are elaborated. To this end, we introduce cross-layer Parsers and Accessors as another NDP abstraction that can be executed on the heterogeneous processing capabilities of modern computational storage devices. Here, the compute placement and resource configuration per NDP request are identified as major performance criteria. Our experimental evaluation shows an improvement in execution times of 1.4x to 2.7x compared to traditional systems. Moreover, we propose a framework for the automatic generation of Parsers and Accessors on FPGAs to ease their application in NDP.
Thirdly, we investigate the interplay of NDP and modern workload characteristics like HTAP. To this end, we present different offloading models and focus on an intervention-free execution. By propagating the Shared State with the latest modifications of the database to the computational storage device, the device is able to process data with transactional guarantees. We thus extend the design space of HTAP with NDP by providing a solution that optimizes for performance isolation, data freshness, and the reduction of data transfers. In contrast to traditional systems, we observe no significant drop in performance when an OLAP query is invoked, but a steady throughput that is 30% higher.
Lastly, in-situ result-set management and consumption as well as NDP pipelines are proposed to achieve flexibility in processing data on heterogeneous hardware. As these produce final and intermediary results, we investigate their management and find that on-device materialization comes at a low cost while enabling novel consumption modes and reuse semantics. By reusing once-materialized results multiple times, we achieve significant performance improvements of up to 400x.
Production planning and control are characterized by unplanned events or so-called turbulences. Turbulences can be external, originating outside the company (e.g., delayed delivery by a supplier), or internal, originating within the company (e.g., failures of production and intralogistics resources). Turbulences can have far-reaching consequences for companies and their customers, such as delivery delays due to process delays. For target-optimized handling of turbulences in production, forecasting methods incorporating process data in combination with the use of existing flexibility corridors of flexible production systems offer great potential. Probabilistic, data-driven forecasting methods allow the corresponding probabilities of potential turbulences to be determined. However, a parallel application of different forecasting methods is required to identify an appropriate one for the specific application. This requires a large database, which often is unavailable and, therefore, must be created first. A simulation-based approach to generate synthetic data is used and validated to create the necessary database of input parameters for the prediction of internal turbulences. To this end, a minimal system for conducting simulation experiments on turbulence scenarios was developed and implemented. A multi-method simulation of the minimal system synthetically generates the required process data, using agent-based modeling for the autonomously controlled system elements and event-based modeling for the stochastic turbulence events. Based on this generated synthetic data and the variation of the input parameters in the forecast, a comparative study of data-driven probabilistic forecasting methods was conducted using a data analytics tool. Forecasting methods of different types (including regression, Bayesian models, nonlinear models, decision trees, ensemble, deep learning) were analyzed in terms of prediction quality, standard deviation, and computation time.
This resulted in the identification of appropriate forecasting methods and of the required input parameters for the considered turbulences.
In the context of digital transformation, having a data-driven organizational culture has been recognized as an important factor for the data analytics capabilities, innovativeness, and competitive advantage of firms. However, the current literature on data-driven culture (DDC) is fragmented, lacking both a synthesis of findings and a theoretical foundation. Therefore, the aim of this work has been to develop a comprehensive framework for understanding DDC and the mechanisms that can be used to embed such a culture in organizations, as well as to structure prior dispersed findings on the topic. Based on the foundation of organizational culture theory, we employed a Design Science Research (DSR) approach using a systematic literature review and expert interviews to build and evaluate a transformation-oriented framework. This research contributes to knowledge by synthesizing previously dispersed knowledge into a holistic framework and by providing a conceptual framework to guide the transformation towards a DDC.
In various German cities, free-floating e-scooter sharing is an emerging trend in e-mobility. Developments such as climate change, urbanization, and demographic change are forcing society to develop new mobility solutions. In contrast to the more extensively researched car sharing, the usage patterns and behaviors of e-scooter sharing customers still need to be analyzed. This would presumably enable better targeting of customers as well as adaptations of the business model to increase scooter utilization and therefore the profit of e-scooter providers. The customer journey is digitally traceable from registration to scooter reservation and the ride itself. These data make it possible to identify customer needs and motivations. We analyzed a dataset from 2017 to 2019 of an e-scooter sharing provider operating in a large German city. Based on this dataset, we propose a customer clustering that identifies three different customer segments, allowing multiple conclusions to be drawn for business development and for improving the problem-solution fit of the e-scooter sharing model.
Data integration of heterogeneous data sources relies either on periodically transferring large amounts of data to a physical Data Warehouse or on retrieving data from the sources on request only. The latter results in the creation of what is referred to as a virtual Data Warehouse, which is preferable when the use of the latest data is paramount. However, the downside is that it adds network traffic and suffers from performance degradation when the amount of data is high. In this paper, we propose the use of a readCheck validator to ensure the timeliness of the queried data and to reduce data traffic. It is further shown that the readCheck allows transactions to update data in the data sources while obeying the full Atomicity, Consistency, Isolation, and Durability (ACID) properties.
Human recognition is an important part of perception systems, such as those used in autonomous vehicles or robots. These systems often use deep neural networks for this purpose, which rely on large amounts of data that ideally cover various situations, movements, visual appearances, and interactions. However, obtaining such data is typically complex and expensive. In addition to raw data, labels are required to create training data for supervised learning. Thus, manual annotation of bounding boxes, keypoints, orientations, or performed actions is frequently necessary. This work addresses whether the laborious acquisition and creation of data can be simplified through targeted simulation. If data are generated in a simulation, information such as positions, dimensions, orientations, surfaces, and occlusions is already known, and appropriate labels can be generated automatically. A key question is whether deep neural networks trained with simulated data can be applied to real data. This work explores the use of simulated training data using examples from the field of pedestrian detection for autonomous vehicles. On the one hand, it is shown how existing systems can be improved by targeted retraining with simulation data, for example to better recognize corner cases. On the other hand, the work focuses on the generation of data that occur rarely or not at all in real standard datasets. It is demonstrated how training data containing finely graded action labels can be generated by the targeted acquisition and combination of motion data and 3D models, enabling the recognition of even complex pedestrian situations. Through the diverse annotation data that simulations provide, it becomes possible to train deep neural networks for a wide variety of tasks with one dataset.
In this work, such simulated data are used to train a novel deep multitask network that brings together diverse, related tasks that were previously mostly considered independently, such as 2D and 3D human pose recognition as well as body and orientation estimation.
This article contains data on the synthesis and mechanical characterization of polysiloxane-based urea-elastomers (PSUs) and is related to the research article entitled “Influence of PDMS molecular weight on transparency and mechanical properties of soft polysiloxane-urea-elastomers for intraocular lens application” (Riehle et al., 2018) [1]. These elastomers were prepared by a two-step polyaddition using the aliphatic diisocyanate 4,4′-Methylenbis(cyclohexylisocyanate) (H12MDI), a siloxane-based chain extender 1,3-Bis(3-aminopropyl)-1,1,3,3-tetramethyldisiloxane (APTMDS) and amino-terminated polydimethylsiloxanes (PDMS) or polydimethyl-methyl-phenyl-siloxane-copolymers (PDMS-Me,Ph), respectively. (More details about the synthesis procedure and the reaction scheme can be found in the related research article (Riehle et al., 2018) [1]).
Amino-terminated polydimethylsiloxanes with varying molecular weights and PDMS-Me,Ph copolymers were prepared beforehand by a base-catalyzed ring-chain equilibration of a cyclic siloxane and the endblocker APTMDS. This DiB article contains a procedure for the synthesis of the base catalyst tetramethylammonium-3-aminopropyl-dimethylsilanolate and a generic synthesis procedure for the preparation of a PDMS with a targeted number-average molecular weight of 3000 g mol−1. Molecular weights and the amount of methyl-phenyl-siloxane within the polysiloxane copolymers were determined by 1H NMR and 29Si NMR spectroscopy. The corresponding NMR spectra and data are described in this article.
Additionally, this DiB article contains processed data on in-line and off-line FTIR-ATR spectroscopy, which was used to follow the reaction progress of the polyaddition by showing the conversion of the diisocyanate. All relevant IR band assignments of a polydimethylsiloxane-urea spectrum are described in this article.
Finally, data on the tensile properties and the mechanical hysteresis behaviour at 100% elongation of PDMS-based polyurea elastomers are shown as a function of the PDMS molecular weight.
The data presented in this article afford insights into the characterization of a newly described bifunctional furan-melamine monomer, which is used for the production of monodisperse, furan-functionalized melamine formaldehyde particles. Data interpretations can be found in the related research article (Urdl et al., 2019). The furan functionalization of the particles is necessary to perform reversible Diels-Alder reactions with maleimide (BMI) crosslinkers to form thermoreversible network systems. To understand the reaction conditions of the Diels-Alder (DA) reaction with a Fu-Mel monomer and a maleimide crosslinker, model DA reactions were performed and evaluated using dynamic FT-IR measurements. During retro-Diels-Alder (rDA) reactions of the monomer system, it was found that side reactions occurred at elevated temperatures. The data evaluating these side reactions are described in one part of this manuscript. Additional high-resolution SEM images of Fu-Mel particles and of thermoreversible particle networks with BMI2 are shown. The data of different Fu-Mel particle networks with maleimide crosslinkers are presented. To this end, maleimide crosslinkers with different spacer lengths were synthesized, and the resulting networks were analyzed by ATR-FT-IR, SEM, and DSC.
The increasing number of connected mobile devices such as fitness trackers and smartphones provides new data for health insurers, enabling them to gain deeper insights into the health of their customers. These additional data sources, plus the trend towards an interconnected health community including doctors, hospitals, and insurers, lead to challenges regarding data filtering, organization, and dissemination. First, we analyze what kind of information is relevant for a digital health insurance. Second, functional and non-functional requirements for storing and managing health data in an interconnected environment are defined. Third, we propose a data architecture for a digitized health insurance, consisting of a data model and an application architecture.
In a networked world, companies depend on fast and smart decisions, especially when it comes to reacting to external change. With the wealth of data available today, smart decisions can increasingly be based on data analysis and be supported by IT systems that leverage AI. A global pandemic brings external change to an unprecedented level of unpredictability and severity of impact. Resilience therefore becomes an essential factor in most decisions when aiming at making and keeping them smart. In this chapter, we study the characteristics of resilient systems and test them with four use cases in a wide-ranging set of application areas. In all use cases, we highlight how AI can be used for data analysis to make smart decisions and contribute to the resilience of systems.
The Third International Conference on Data Analytics (DATA ANALYTICS 2014), held on August 24 - 28, 2014 - Rome, Italy, continued the inaugural event on fundamentals in supporting data analytics, special mechanisms and features of applying principles of data analytics, application oriented analytics, and target-area analytics.
Processing terabytes to petabytes of data, or incorporating non-structural and multistructured data sources and types, requires advanced analytics and data science mechanisms for both raw and partially processed information. Despite considerable advancements in high performance, large storage, and high computation power, there are challenges in identifying, clustering, classifying, and interpreting a large spectrum of information.
The ZD.BB - a digital hub for small and medium-sized enterprises in the Stuttgart region
(2020)
Digital transformation is one of the most discussed topics in today's business world. Many companies, especially small and medium-sized enterprises (SMEs), find it difficult to assess the opportunities and risks of digitalization. With all the possibilities and opportunities that digitalization holds, companies that close themselves off from these developments risk losing their market and competitive position. The Digital Hub ZD.BB (Zentrum Digitalisierung), opened in February 2019, provides the Stuttgart region with a new, central point of contact for all questions concerning digitalization. At the ZD.BB, small and medium-sized enterprises (SMEs) and startups receive competent advice and support for their digital transformation processes, ranging from raising awareness and analysis to developing solutions for digital processes. With the help of a digital qualification offensive and SME-oriented methods for business model development, companies are comprehensively supported at the ZD.BB in their digitalization projects. To this end, different competencies, disciplines, ideas, technologies, and creativity are brought together in innovation labs, coworking spaces, and at events, thereby producing digital innovations.
Body-worn devices, so-called wearables, usually communicate with the smartphone via Bluetooth Low Energy (BLE). Many applications, particularly in the fields of health and ambient assisted living (AAL), are based on the cooperation of wearables with smart home devices. This work presents the definition and implementation of a new BLE profile for ECG that supports streaming the signal to the smartphone as well as streaming several such biosignals in parallel. The data architecture of the app allows a configurable synchronization of the signals with the smart home.
The energy transition is on everyone's lips. It is intended to enable a secure, environmentally sound, and economically successful future. One approach is decentralized energy supply close to the consumer. The trend is moving away from conventional power plants and toward combined heat and power as well as renewable energies. For the foreseeable future, the task is also to combine central and decentralized elements in a sensible way. The Reutlingen Energy Center for Decentralized Energy Systems and Energy Efficiency addresses, in teaching and research, the question of how energy systems must be adapted and combined in order to optimize the energy balance, at the national level as well as for companies and private individuals. It is the combination of engineering and business administration, of optimization at the level of the individual firm and of the overall view, that characterizes the Reutlingen Energy Center. The focus areas of the research team are presented below.
Social customer relationship management is characterized above all by the possibility of a central, supra-regional customer dialogue with the option of content-based segmentation. Although researchers have long emphasized the advantages of social CRM as a holistic marketing strategy, only few companies seriously attempt to establish it. Yet this approach is particularly suitable for larger firms that sell a range of different products supra-regionally under one brand name. Here, social CRM could be a useful addition to the CRM toolbox and, depending on the type of products sold, could also contribute to quality management.
The annexation of Crimea, the warfare in Syria, the financial engagement in Cyprus, the tug-of-war over Ukraine and Belarus, and the naming of the vaccine against the corona epidemic as Sputnik V are clear evidence of Russia's current striving for power and its expansionist policy. It is therefore of interest to ask what view Friedrich List (1789-1846) held of Russia, especially since this view, laid out in his writings, appears as topical today as it did 180 to 190 years ago. This essay examines and comprehensively presents this view for the first time.
In the Learning Center of Reutlingen University, the services of the university library and of the computing and media center are offered under one roof. The Learning Center is a place for learning and collaborative work in excellently equipped rooms. The spectrum ranges from individual workstations, group rooms, booths, and work islands to PC rooms, a lounge, and a cafeteria.
With information, advisory, and training services, support staff from various departments of the university help students, lecturers, researchers, candidates, and employees to master their everyday study and working lives.
The automotive industry faces major challenges, particularly in research and development. A clear development toward system innovations is emerging in order to meet the increased demands of the market. A prerequisite for this is cooperation between companies within the value chain. In this paper, suitable cooperation models are first selected on a theoretical basis and then, in a second step, evaluated by means of a utility analysis. The evaluation is based on theoretical considerations that are validated through expert interviews. The analysis shows that both the research campus and the industry cluster are best suited. Finally, the findings are applied to a practical case.
By differentiating its products according to different customer groups, the company was able to expand its market share despite the saturated market. In addition to the first-class quality of its products, the company now also distinguishes itself from no-name competitors and manufacturers from low-wage countries through its customer-group-specific development and marketing. This securing of its market position more than outweighs the costs of the necessary flexibilization. With its new and innovative products, the company is better equipped for the future. Thanks to its excellent contacts with its customers, it can react quickly and purposefully to the changing wishes and requirements of its diverse clientele.
Decentralized power generation plants, energy storage systems, and control equipment for producers and consumers are the basic building blocks of a virtual power plant, which plays an important role in the power grid of the future, the smart grid. As part of the demonstration project Virtuelles Kraftwerk Neckar-Alb, a demonstration plant is to be built at Reutlingen University that networks and functionally integrates these basic building blocks. This creates a flexible test environment for research and teaching in which the interaction of the components can be studied. In addition, it provides an opportunity for interested companies to visit, thereby promoting acceptance and understanding of the topic.
Da sind wir dabei, das ist prima! A reflection on membership in modern organizations
(2015)
Organizations are changing. In times of virtual enterprises, network organizations, open source projects, and freelancers, their boundaries are becoming ever more permeable. Flexibility and agility are central goals. Using the handling of knowledge held by external workers and Wikipedia's volunteer model as examples, the authors show how notions of membership in an organization are changing and what consequences this also has for change management.
As machines and plants have become technically more complex in recent years, maintenance is growing in importance as a key factor in securing their availability. Essential starting points for improvement here are the availability of information, predictive maintenance strategies, and improved information provision. On a technical level, these can be realized by specialized cyber-physical systems. This article gives an overview of the essential building blocks, consisting of smart components, smart planning systems, and smart user interfaces, that are necessary for a successful implementation.
Cyanate esters
(2014)
Cyanate ester resins are an important class of thermosetting compounds that have experienced an ever-increasing interest as matrix systems for advanced polymer composite materials, which, among other applications, are especially suitable for highly demanding functions in the aerospace or microelectronics industries. Other names for cyanate ester resins are cyanate resins, cyanic esters, or triazine resins. The various types of cyanate ester monomers share the –OCN functional group that trimerizes in the course of resin formation to yield a highly branched heterocyclic polymeric network based on the substituted triazine core structure. The basic reaction sequence leading to the typical cyanate ester polymer molecule is depicted in Figure 11.1. The curing reaction may take place with or without catalyst.
Cyanate ester resins
(2022)
Cyanate ester resins are an important class of thermosetting compounds that experience an ever-increasing interest as matrix systems for advanced polymer composite materials, which among other application fields are especially suitable for highly demanding applications in the aerospace or microelectronics industries. Other names for cyanate ester resins are cyanate resins, cyanic esters, or triazine resins. The various types of cyanate ester monomers share the –OCN functional group that trimerizes in the course of resin formation to yield a highly branched heterocyclic polymeric network based on the substituted triazine core structure.
The Internet of Things is changing the customer experience lastingly, for example through new services that relieve consumers cognitively. Management should use the new possibilities for interacting with consumers, develop powerful user interfaces, and build up analytical know-how and partnerships.
Customer Success Management is the next evolution in complex sales that drives growth. Moreover, Customer Success Management is a modern holistic sales philosophy and part of a professional customer experience management strategy. The following conceptual paper discusses fundamental thoughts based on value-based selling, customer success focus, and a clear view on a perspective beyond selling that will gain importance in the future.
This study examines the diffusion of customer success management among small and medium-sized enterprises in German-speaking countries and the question of how a successful implementation can be carried out there. The results show that, prior to the actual customer success management process, internal and external prerequisites must be created in German-speaking SMEs in order to ensure a sustainable implementation. These include the transformation from a pure product focus to a customer- and service-centric corporate strategy. Prerequisites for this are a higher degree of digitalization of products and internal processes as well as active change management.
Customer services in the digital transformation: social media versus hotline channel performance
(2015)
Due to the digital transformation, online service strategies have gained prominence in practice as well as in the theory of service management. This study examines the efficacy of different types of service channels in customer complaint handling. The theoretical framework, developed using the complaint handling and social media literature, is tested against data collected from two different channels (hotline and social media) of a German telecommunication service provider. We contribute to the understanding of firms' multichannel distribution strategies in two ways: a) by conceptualizing and evaluating complaint handling quality across traditional and social media channels, and b) by testing the impact of complaint handling quality on key performance outcomes like customer loyalty, positive word-of-mouth, and cross-purchase intentions.
This paper addresses the following four research questions: 1. How should customer service quality in social media channels be conceptualized on multiple levels? 2. Which aspects of customer service quality are important in enhancing customer satisfaction? 3. What outcomes are affected by customer service quality and customer satisfaction? 4. How effective are customer services delivered through social media channels (as compared to customer services delivered through other channels)?
A limited focus on particular research designs, data analysis methods, and research objects frequently characterises customer research projects. However, standard practice in researching certain phenomena is not always correct and, in many cases, could provide misleading results. In this paper, we call for a more holistic approach to customer research, one that considers the entire research design and data analysis toolbox while also recognising the importance of consumer groups other than customers. At the same time, we call for using simple data analysis methods, which often suffice to show relevant effects, instead of overemphasising method complexity as is often the case in top-tier journals. Based on our discussion, we offer researchers and practitioners concrete recommendations for advancing their research designs and data analyses.
The generous feed-in tariffs (FiTs) introduced in Germany, which resulted in major growth in decentralized solar photovoltaic (PV) systems, will phase out in the coming years, leaving many of the existing distributed generation assets stranded. This challenge creates an opportunity for community-focused energy utilities, such as Elektrizitätswerke Schönau eG (EWS) based in Schönau, Germany, to try a new approach to assist their customers in making the transition to a more sustainable future. This chapter describes how EWS is developing products and offering community-based solutions, including peer-to-peer trading using automated platforms. Such innovative offerings may lead to successful differentiation in a competitive and highly decentralized future.
Loyalty programs are becoming more important in the omnichannel environment of the fashion retail business. After defining customer loyalty and loyalty programs, the main characteristics of omnichannel loyalty programs are described. Mobile, social media, direct mail, and in-store capabilities are detailed as touchpoints of omnichannel loyalty programs. A discussion chapter closes with recommendations for fashion retailers.
This article contributes to marketing research by fundamentally developing the young but increasingly relevant research stream on the topic of CEM. On the one hand, the identified framework shows that CEM goes beyond individual organizational capabilities, such as the design of service experiences, that have dominated CEM research to date. On the other hand, the concept contributes to the synthesis of fragmented but interrelated streams of literature in marketing research ...
For the widespread establishment of a circular economy, the acceptance of used products among consumers is a prerequisite. This paper investigates the customer experience of product service systems related to used products (PSSuP), such as renting, remanufacturing, and second-hand models, and aims to identify the offering characteristics that affect customer response and customer engagement. This study was conducted by means of a content analysis-based literature review of 69 empirical PSSuP studies. A frequency analysis of the categories that determine customer experience creation was conducted, as well as a contingency analysis to reveal the interrelationships between these categories. On this basis, the different PSSuP types were compared, and four strategic orientations of customer experience creation in PSSuP are pointed out: price, confidence, convenience, and delight orientation. For each of these strategic orientations, supportive PSSuP offering characteristics are specified. Building on the findings of this study, theoretical and managerial implications for product–service systems marketing are derived, and the need for research on the role of information and communication technology as an enabler of customer experience creation in PSSuP is highlighted.
Surface topographies are often discussed as an important parameter influencing basic cell behavior. Whereas most in vitro studies deal with microstructures with sharp edges, smooth, curved microscale topographies may be more relevant to in vivo situations. Addressing the lack of highly defined surfaces with varying curvature, we present a topography chip system with 3D curved features of varying spacing and curvature radii as well as varying overall dimensions of the curved surfaces. The CurvChip is produced by low-cost photolithography with thermal reflow, subsequent (repetitive) PDMS molding and hot embossing. The platform facilitates the systematic in vitro investigation of the impact of substrate curvature on cell types such as epithelial, endothelial, smooth muscle, or stem cells. Such investigations will not only help to further understand the mechanism of curvature sensation but may also contribute to optimizing cell-material interactions in the field of regenerative medicine.
At Reutlingen University in Germany, students from different countries and disciplines can learn business English within the framework of a theatre production. In the "Business English Theatre" they work in an international project team staging a play with a business focus and thus improve their language, social and professional skills.
Curriculum design for the German language class in the double-degree programme business engineering
(2017)
This paper aims to give an overview of how German is taught as a foreign language to students enrolled in the Bachelor of Business Engineering, a double-degree programme offered at Universiti Malaysia Pahang. The double-degree students have the opportunity to complete their first two years of study in Malaysia and their last two years in Germany. Taking the TestDaF examination is compulsory for double-degree students. Hence, the German language curriculum has been meticulously planned to ensure the students are competent in the language. As such, the settings of the language class are discussed thoroughly in this paper. Additionally, it discusses the challenges faced in teaching German as a foreign language. This paper ends with some suggestions for improvement.
The purpose of this paper is to investigate how motion pictures are currently used for the product presentation of fashion articles in online shops in the German, American and British markets. This study shows that the use of moving images for the presentation of fashion articles in online shops is underutilized. With the amount of data that was manageable within the scope of this chapter, no valid generalizations can be made; all described results must be understood as indications. In order to use product presentation videos meaningfully, retailers should first consider exactly what the purpose of these videos is. Different goals require different means. However, retailers should obtain enough information in advance to assess whether they can afford the production and post-processing of these videos.
The scoring of sleep stages is one of the essential tasks in sleep analysis. Since a manual procedure requires considerable human and financial resources, and incorporates some subjectivity, an automated approach could offer several advantages. There have been many developments in this area, and in order to provide a comprehensive overview, it is essential to review relevant recent works and summarise the characteristics of the approaches, which is the main aim of this article. To achieve this, we examined articles published between 2018 and 2022 that dealt with the automated scoring of sleep stages. In the final selection for in-depth analysis, 125 articles were included after reviewing a total of 515 publications. The results revealed that automatic scoring demonstrates good quality (with Cohen's kappa above 0.80 and accuracy above 90%) in analysing EEG/EEG + EOG + EMG signals. At the same time, it should be noted that there has been no breakthrough in the quality of results using these signals in recent years. Systems involving other signals that could potentially be acquired more conveniently for the user (e.g. respiratory, cardiac or movement signals) remain more challenging to implement with a high level of reliability but have considerable innovation capability. In general, automatic sleep stage scoring has excellent potential to assist medical professionals while providing an objective assessment.
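Cohen's kappa, the agreement metric cited above, corrects raw epoch-by-epoch agreement for agreement expected by chance. The sketch below is illustrative and not taken from any of the reviewed systems; the five-epoch hypnograms are invented.

```python
from collections import Counter

def cohens_kappa(manual, automatic):
    """Chance-corrected agreement between two sleep-stage scorings."""
    assert len(manual) == len(automatic)
    n = len(manual)
    # Observed agreement: fraction of epochs scored identically
    observed = sum(m == a for m, a in zip(manual, automatic)) / n
    # Expected agreement: product of each scorer's stage frequencies
    freq_m, freq_a = Counter(manual), Counter(automatic)
    expected = sum(freq_m[s] * freq_a[s] for s in freq_m) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical 30-second epochs scored by a technician and a classifier
manual    = ["W", "N1", "N2", "N2", "REM"]
automatic = ["W", "N1", "N2", "N3", "REM"]
print(cohens_kappa(manual, automatic))  # ≈ 0.75
```

With four of five epochs matching and an expected chance agreement of 0.2, kappa lands at about 0.75, which illustrates why kappa values above 0.80 indicate strong scorer agreement.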
This work is a report on practical experiences with the issue of interoperability in German practice management systems (PMSs) from an ongoing clinical trial on teledermatology, the TeleDerm project. A proprietary and established web-platform for store-and-forward telemedicine is integrated with the IT in the GPs’ offices for automatic exchange of basic patient data. Most of the 19 different PMSs included in the study sample lack support of modern health data exchange standards, therefore the relatively old but widely available German health data exchange interface “Gerätedatentransfer” (GDT) is used. Due to the lack of enforcement and regulation of the GDT standard, several obstacles to interoperability are encountered. As a partial, but reusable working solution to cope with these issues, we present a custom middleware which is used in conjunction with GDT. We describe the design, technical implementation and observed hindrances with the existing infrastructure. A discussion on health care interfacing standards and the current state of interoperability in German PMS software is given.
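The GDT interface mentioned above is a simple line-oriented key-value format. As a hedged sketch (the field identifiers and sample record are illustrative; real-world files may use other encodings such as CP437, and the standard defines many more rules than shown here), each line carries a 3-digit length field counting the entire line including the trailing CRLF, a 4-digit field identifier (Feldkennung), and the value:

```python
def parse_gdt(data: bytes, encoding: str = "latin-1"):
    """Parse GDT-style lines into (field_id, value) pairs.

    Assumed line layout: 3-digit total length (incl. CRLF)
    + 4-digit field identifier + value.
    """
    records = []
    for raw in data.split(b"\r\n"):
        if not raw:
            continue
        declared = int(raw[:3])
        if declared != len(raw) + 2:  # +2 for the stripped CRLF
            raise ValueError(f"length mismatch in line {raw!r}")
        line = raw.decode(encoding)
        records.append((line[3:7], line[7:]))
    return records

# Hypothetical two-line record: a record-type field and a patient surname
sample = b"01380006310\r\n0193101Mustermann\r\n"
print(parse_gdt(sample))  # [('8000', '6310'), ('3101', 'Mustermann')]
```

The redundant length prefix is one of the few consistency checks the format offers, which is why a middleware layer that validates it before handing data to the PMS is useful in practice.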
Electronic word-of-mouth (eWoM) communication has received a lot of attention from the academic community. As multiple research papers focus on specific facets of eWoM, there is a need to integrate current research results systematically. Thus, this paper presents a scientific literature analysis in order to determine the current state-of-the-art in the field of eWoM.
Current fields of interest
(2016)
If we review the research done in the field of optimization, the following topics appear to be the focus of current development:
– Optimization under uncertainties, taking into account the inevitable scatter of parts, external effects and internal properties. Reliability and robustness both have to be taken into account when running optimizations, so the name Robust Design Optimization (RDO) came into use.
– Multi-Objective Optimization (MOO) handles situations in which different participants in the development process are developing in different directions. Typically we think of commercial and engineering aspects, but other constellations have to be looked at as well, such as comfort and performance or price and consumption.
– Process development of the entire design process, including optimization from early stages, might help avoid inefficient efforts. Here the management of virtual development has to be re-designed to fit into a coherent scheme.
...
There are many other fields where interesting progress is being made. We limit our discussion to the first three topics.
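The core of Multi-Objective Optimization is Pareto dominance: a design is kept only if no other design is at least as good in every objective and strictly better in at least one. A minimal sketch with invented two-objective data (both objectives minimized, e.g. price and consumption):

```python
def dominates(a, b):
    """a dominates b if a is no worse in all objectives and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated (Pareto-optimal) points."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical designs: (price, consumption), both to be minimized
designs = [(1, 5), (2, 4), (3, 3), (4, 6), (5, 1)]
print(pareto_front(designs))  # (4, 6) is dominated by (3, 3)
```

The surviving points form the trade-off curve from which the different participants in the development process then negotiate a compromise.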
We report on the cure characterization, based on inline monitoring of the dielectric parameters, of a commercially available epoxy phenol resin molding compound with a high glass transition temperature (>195 °C), which is suitable for the direct packaging of electronic components. The resin was cured under isothermal temperatures close to general process conditions (165–185 °C). The material conversion was determined by measuring the ion viscosity. The change of the ion viscosity as a function of time and temperature was used to characterize the cross-linking behavior, following two separate approaches (model based and isoconversional). The determined kinetic parameters are in good agreement with those reported in the literature for EMCs and lead to accurate cure predictions under process-near conditions. Furthermore, the kinetic models based on dielectric analysis (DEA) were compared with standard offline differential scanning calorimetry (DSC) models, which were based on dynamic measurements. Many of the determined kinetic parameters had similar values for the different approaches. Major deviations were found for the parameters linked to the end of the reaction where vitrification phenomena occur under process-related conditions. The glass transition temperature of the inline molded parts was determined via thermomechanical analysis (TMA) to confirm the vitrification effect. The similarities and differences between the resulting kinetics models of the two different measurement techniques are presented and it is shown how dielectric analysis can be of high relevance for the characterization of the curing reaction under conditions close to series production.
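Cure prediction of the kind described above can be illustrated with a generic nth-order kinetic model (not the model or parameters fitted in the study; A, Ea and n below are invented): the conversion α follows dα/dt = A·exp(−Ea/RT)·(1−α)^n, which is easy to integrate numerically for an isothermal hold.

```python
import math

def isothermal_cure(T_celsius, A, Ea, n, t_end, dt=0.1):
    """Euler integration of nth-order cure kinetics da/dt = k(T)*(1-a)^n."""
    R, T = 8.314, T_celsius + 273.15
    k = A * math.exp(-Ea / (R * T))     # Arrhenius rate constant [1/s]
    alpha, t = 0.0, 0.0
    while t < t_end:
        alpha += dt * k * (1.0 - alpha) ** n
        t += dt
    return min(alpha, 1.0)

# Invented parameters; higher mold temperature -> faster conversion
a_165 = isothermal_cure(165, A=1e8, Ea=8e4, n=1.5, t_end=300)
a_185 = isothermal_cure(185, A=1e8, Ea=8e4, n=1.5, t_end=300)
print(a_165, a_185)  # conversion after 300 s at 165 °C and 185 °C
```

A model like this deliberately ignores vitrification, which is exactly the regime where the abstract reports the largest deviations between the DEA-based and DSC-based models.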
Within the scope of the present cumulative doctoral thesis, six scientific papers were published which illustrate that modern reaction-model-free (isoconversional) kinetic analysis (ICKA) methods represent a universal and effective tool for the controlled processing of thermosetting materials. In order to demonstrate the universal applicability of ICKA methods, the thermal cure of different thermosetting materials with a very broad range of chemical compositions (melamine-formaldehyde resins, epoxy resins, polyester-epoxy resins, and acrylate/epoxy resins) was analyzed and mathematically modelled. Some of the materials were based on renewable resources (an epoxy resin was made from hempseed oil; linseed oil was modified into an acrylate/epoxy resin). With the aid of ICKA methods, not only single-step but also complex multi-step reactions were modelled precisely. The analyzed thermosetting materials were combined with wood, wood-based products, paper, and plant fibers and processed into various final products. Some of the thermosetting materials were applied as coatings (in the form of impregnated décor papers or powder and wet coatings, respectively) on wood substrates, and the epoxy resin from hempseed oil was mixed with plant fibers and processed into bio-based composites for lightweight applications. Mechanical, thermal, and surface properties of the final products were determined. The activation energy as a function of cure conversion derived from ICKA methods was utilized to accurately predict the thermal curing over the course of time for arbitrary cure conditions. Furthermore, the cure models were used to establish correlations between the cross-linking during processing and the properties of the final products. Therewith it was possible to derive the process time and temperature that guarantee optimal cross-linking as well as optimal product properties.
Purpose: The purpose of this paper is to examine the service of the new business model Curated Shopping in the fashion industry and to analyze whether the service provides a higher customer added value in comparison to traditional services in retail stores and e-commerce platforms. It gives implications to curated shop operators on how to optimize the service in each stage of the customer buying process.
Design/methodology/approach: The research methodology applied is an empirical study that uses the principle of mystery shopping in order to investigate the provided services during the selling process.
Findings: The study showed that information about the customer should be collected carefully and as holistically as possible in order to assemble a suitable outfit. The consumer benefits from the service by saving time and enjoying a stress-free way of shopping. Nevertheless, the physical distance to the customer limits the curator's ability to give individual and inspiring advice.
Research limitations: The survey was conducted among 10 mystery shoppers and 4 curated shop operators in Germany, limiting the findings to these mystery shoppers and operators.
Practical implications: One implication for shop operators is to collect consumer information carefully and to expand the assortment and brand portfolio in order to provide fashion goods that inspire the consumer. The shop operators are on the right track, but there is still huge potential to provide a more shopper-oriented service.
This booklet will give you an overview of the development of CSR from a (brief) historic point of view and will examine the underlying concepts and research. Furthermore, examples of contemporary CSR management will be explored to show how companies interpret the issue and how they face the challenges of managing the new demands placed upon them. Business, in the end, comes down to figures and numbers which give management, shareholders and stakeholders a chance to measure a company’s success. Therefore, modern methods and approaches for measuring, rating and ranking a company’s CSR management will be presented. Finally, an attempt will be made to evaluate CSR as a tool for increasing global welfare and as a business and management strategy for companies and entrepreneurs.
The diversity of energy prosumer types makes it difficult to create appropriate incentive mechanisms that satisfy both prosumers and energy system operators alike. Meanwhile, European energy suppliers buy guarantees of origin (GoO) which allow them to sell green energy at premium prices while in reality delivering grey energy to their customers. Blockchain technology has proven itself to be a robust payment system in which users transact money without the involvement of a third party. Blockchain tokens can be used to represent a unit of energy and, just like GoOs, be submitted to the market. This paper focuses on simulating a marketplace using the Ethereum blockchain and smart contracts, where prosumers can sell tokenized GoOs to consumers willing to subsidize renewable energy producers. Such markets bypass energy providers by allowing consumers to obtain tokenized GoOs directly from the producers, who in turn benefit directly from the earnings. Two market strategies in which tokens are sold as GoOs have been simulated. In the Fix Price Strategy, prosumers sell their tokens at the average GoO price of 2014. The Variable Price Strategy focuses on selling tokens at a price range defined by the difference between grey and green energy. The study finds that the Ethereum blockchain is robust enough to function as a platform for tokenized GoO trading. The simulation results have been compared, and they indicate that prosumers earn significantly more money by following the Variable Price Strategy.
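The two pricing strategies can be contrasted with a toy off-chain simulation (all prices and volumes below are invented and are not the study's 2014 market data, and no smart-contract mechanics are modelled): a fixed per-MWh GoO price versus a variable price drawn from the spread between grey and green energy.

```python
import random

def earnings_fixed(volumes_mwh, goo_price):
    """Each tokenized GoO sells at one fixed price per MWh."""
    return sum(v * goo_price for v in volumes_mwh)

def earnings_variable(volumes_mwh, grey_price, green_price, rng):
    """Each GoO sells somewhere inside the grey/green price spread."""
    lo, hi = sorted((grey_price, green_price))
    return sum(v * rng.uniform(lo, hi) for v in volumes_mwh)

rng = random.Random(42)                                 # reproducible toy run
volumes = [rng.uniform(0.5, 2.0) for _ in range(100)]   # MWh per prosumer

fixed = earnings_fixed(volumes, goo_price=0.5)          # EUR/MWh, invented
variable = earnings_variable(volumes, grey_price=3.0, green_price=8.0, rng=rng)
print(fixed < variable)  # True: the spread-based price exceeds the fixed one
```

Whenever the grey/green spread sits above the historical GoO price, as in this toy setup, the variable strategy pays prosumers more, which mirrors the study's qualitative finding.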
The purpose of this paper is to identify key success factors of crowdfunding in the music business in order to discuss their applicability to the fashion industry. The research methodology applied is a literature review examining academic and non-academic references. Key research findings include four main success factors. The first is the innovative and adaptive nature of the music industry, which results from its historical evolution. Second, strong commitment and connection to the fan base is identified as a success factor. Third, a manageable effort for realisation on a large scale reduces the risk of failure. The last success factor describes the successful implementation of campaign-specific aspects. The discussion finally shows that three of the four success factors can be adapted to the fashion business. Due to the limited scientific research in the field of crowdfunding in the music business, the success factors were worked out independently, based on general literature. Accordingly, quantitative testing and further analysis are recommended.
Crosslinked thermoplastics
(2014)
Cross-linked thermoplastics represent an important class of materials for numerous applications such as heat-shrinkable tubing, rotational molded parts, and polyolefin foams. By cross-linking olefins, their mechanical performance can be significantly enhanced. This chapter covers the three main methods for the cross-linking of thermoplastics: radiation cross-linking, chemical cross-linking with organic peroxides, and cross-linking using silane-grafting agents. It also considers the major effects of the cross-linking procedure on the performance of the thermoplastic materials discussed.