With the expansion of cyber-physical systems (CPSs) across critical and regulated industries, systems must be continuously updated to remain resilient. At the same time, they must be extremely secure and safe to operate and use. The DevOps approach caters to business demands for faster and smarter production, but implementing DevOps is extremely challenging due to the complexity of critical CPSs and the requirements of regulatory authorities. In this study, expert opinions from 33 European companies expose the gap in the current state of practice on DevOps-oriented continuous development and maintenance. The study contributes to research and practice by identifying a set of needs. Subsequently, the authors propose a novel approach called Secure DevOps and provide several avenues for further research and development in this area. The study shows that, because security is a cross-cutting property in complex CPSs, its proficient management requires system-wide competencies and capabilities across CPS development and operation.
Different types of raw cotton were investigated with a commercial ultraviolet-visible/near infrared (UV-Vis/NIR) spectrometer (210–2200 nm) as well as with a home-built setup for NIR hyperspectral imaging (NIR-HSI) in the range 1100–2200 nm. UV-Vis/NIR reflection spectroscopy reveals the dominant role that proteins, hydrocarbons and hydroxyl groups play in the structure of cotton. NIR-HSI shows a similar result. The experimentally obtained data, combined with principal component analysis (PCA), enable a general differentiation of cotton types. For UV-Vis/NIR spectroscopy, the first two principal components (PCs) represent 82 % and 78 % of the total data variance for the UV-Vis and NIR regions, respectively. For NIR-HSI, due to the large amount of data acquired, two data-processing methodologies were applied, at low and high lateral resolution: in the first, the average spectrum of each sample was calculated; in the second, the spectra of individual pixels were used. Both methods explain ≥90 % of the total variance with the first two PCs. The results show that it is possible to distinguish between different cotton types based on a few selected wavelength ranges. The combination of HSI and multivariate data analysis has strong potential for industrial applications due to its short acquisition time and low-cost development. This study opens up the possibility of developing this technique further towards real large-scale processes.
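As a rough illustration of the kind of dimensionality reduction described above, the following sketch projects two simulated "cotton types" onto their first two principal components. The spectra here are entirely synthetic stand-ins (Gaussian absorption bands plus noise), not the study's measurements:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
wavelengths = np.linspace(1100, 2200, 200)  # nm, NIR range as in the study

# Two hypothetical "cotton types": a shared baseline plus a type-specific band
base = np.exp(-((wavelengths - 1500) / 300) ** 2)
type_a = base + 0.3 * np.exp(-((wavelengths - 1940) / 50) ** 2)
type_b = base + 0.3 * np.exp(-((wavelengths - 2100) / 50) ** 2)
spectra = np.vstack([
    type_a + 0.01 * rng.standard_normal((25, 200)),  # 25 noisy samples each
    type_b + 0.01 * rng.standard_normal((25, 200)),
])

pca = PCA(n_components=2)
scores = pca.fit_transform(spectra)  # one (PC1, PC2) point per spectrum
print(pca.explained_variance_ratio_.sum())  # variance captured by two PCs
```

In such a setting the first two PCs capture almost all of the variance, and the score plot separates the types, mirroring the differentiation reported in the abstract.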
Controlling the surface properties and structure of thin nanosized coatings is of primary importance in diverse engineering and medical applications. Here we report on how the nanostructure, growth mechanism, thickness, roughness, and hydrophilicity of nanocomposites composed of weak natural or strong synthetic polyelectrolytes (PE) can be tailored by graphene oxide (GO) doping. GO reverses the build‐up mechanism, affecting the internal structure and the hydrophilicity in a way that depends on the type of the PE‐matrix. The extent of GO‐adsorption and its impact on the surface morphology was found to be independent of the type of the underlying PE‐matrix. The nanostructure of the hybrid films is not significantly altered when a single surface‐exposed GO‐layer is deposited, while increasing the number of embedded GO‐layers leads to pronounced surface heterogeneity. These results are expected to have valuable impact on the construction strategies of coatings with tunable surface properties.
Thermoplastic polycarbonate urethane elastomers (TPCU) are potential implant materials for treating degenerative joint diseases thanks to their adjustable rubber-like properties, their toughness, and their durability. We developed a water-containing, high-molecular-weight sulfated hyaluronic acid coating to improve the interaction of TPCU with the synovial fluid. It is suggested that trapped synovial fluid can act as a lubricant that reduces friction forces and thus provides enhanced abrasion resistance of TPCU implants. The aims of this work were (i) the development of a coating method for novel soft TPCU with high-molecular-weight sulfated hyaluronic acid to increase biocompatibility and (ii) the in vitro validation of the functionalized TPCUs in cell culture experiments.
The tale of 1000 cores: an evaluation of concurrency control on real(ly) large multi-socket hardware
(2020)
In this paper, we set out to revisit the results of “Staring into the Abyss [...] of Concurrency Control with [1000] Cores” and analyse in-memory DBMSs on today’s large hardware. Contrary to the original authors’ assumption, today we do not see single-socket CPUs with 1000 cores. Instead, multi-socket hardware has made its way into production data centres. Hence, we follow up on this prior work with an evaluation of the characteristics of concurrency control schemes on real production multi-socket hardware with 1568 cores. To our surprise, we made several interesting findings, which we report on in this paper.
In this paper, we present a new approach for achieving robust performance of data structures, making it easier to reuse the same design not only for different hardware generations but also for different workloads. The main idea is to strictly separate the data structure design from the actual strategies used to execute access operations, and to adjust the execution strategies by means of so-called configurations instead of hard-wiring the execution strategy into the data structure. In our evaluation, we demonstrate the benefits of this configuration approach for individual data structures as well as for complex OLTP workloads.
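A minimal sketch of the configuration idea, using a toy sorted key-value map (the paper's actual data structures and strategies are not shown here): the strategy that executes lookups is a pluggable object supplied as configuration, rather than being hard-wired into the structure:

```python
import bisect

class SequentialScan:
    """Execution strategy: linear scan over the entries."""
    def lookup(self, entries, key):
        return next((v for k, v in entries if k == key), None)

class SortedBinarySearch:
    """Execution strategy: binary search, exploiting the sorted order."""
    def lookup(self, entries, key):
        keys = [k for k, _ in entries]
        i = bisect.bisect_left(keys, key)
        return entries[i][1] if i < len(entries) and keys[i] == key else None

class ConfigurableMap:
    """Data structure design (a sorted list of key-value pairs) kept strictly
    separate from the strategy executing access operations; the strategy can
    be swapped per hardware generation or workload."""
    def __init__(self, strategy):
        self.strategy = strategy
        self.entries = []  # kept sorted by key

    def insert(self, key, value):
        i = bisect.bisect_left([k for k, _ in self.entries], key)
        self.entries.insert(i, (key, value))

    def lookup(self, key):
        return self.strategy.lookup(self.entries, key)

m = ConfigurableMap(SequentialScan())
for k in (5, 1, 3):
    m.insert(k, k * 10)
m.strategy = SortedBinarySearch()  # reconfigure without rebuilding the data
```

The design choice is that callers never see which strategy runs; only the configuration changes, which is the reuse property the abstract describes.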
Introduction: Bioresorbable collagenous barrier membranes are used to prevent premature soft tissue ingrowth and to allow bone regeneration. For volume-stable indications, only non-absorbable synthetic materials are available. This study investigates a new bioresorbable hydrofluoric acid (HF)-treated magnesium (Mg) mesh in a native collagen membrane for volume-stable situations. Materials and Methods: HF-treated and untreated Mg were compared in direct and indirect cytocompatibility assays. In vivo, 18 New Zealand White rabbits each received four 8 mm calvarial defects and were divided into four groups: (a) HF-treated Mg mesh/collagen membrane, (b) untreated Mg mesh/collagen membrane, (c) collagen membrane and (d) sham operation. After 6, 12 and 18 weeks, Mg degradation and bone regeneration were measured using radiological and histological methods. Results: In vitro, HF-treated Mg showed higher cytocompatibility. Histopathologically, HF-Mg prevented gas cavities and was degraded by mononuclear cells via phagocytosis up to 12 weeks. Untreated Mg showed, in part significantly, more gas cavities and a fibrous tissue reaction. Bone regeneration did not differ significantly between the groups. Discussion and Conclusions: HF-Mg meshes embedded in native collagen membranes represent a volume-stable and biocompatible alternative to non-absorbable synthetic materials. HF-Mg shows less corrosion and is degraded by phagocytosis. However, the application of the membranes did not result in higher bone regeneration.
Machine learning (ML) techniques are rapidly evolving, both in academia and in practice. However, enterprises show different maturity levels in successfully implementing ML techniques. Thus, we review the state of ML adoption in enterprises. We find that ML technologies are being increasingly adopted, but that small and medium-sized enterprises (SMEs) are struggling with their introduction in comparison to larger enterprises. In order to identify enablers and success factors, we conduct a qualitative empirical study with 18 companies in different industries. The results show that SMEs in particular fail to apply ML technologies due to insufficient ML know-how. However, partners and appropriate tools can compensate for this lack of resources. We discuss approaches to bridge the gap for SMEs.
Additive manufacturing (AM) is increasingly used in the industrial sector as a result of continuous development. Within the production planning and control (PPC) system, AM enables an agile response in the area of detailed and process planning, especially for a large number of plants. For this purpose, a concept for a PPC system for AM is presented which takes into account the requirements for integration into the operational enterprise software system. The technical applicability is demonstrated by individual implemented sections. The presented solution approach promises more efficient utilization of the plants and more flexible use.
Development work within an experimental environment, in which certain properties are investigated and optimized, requires many test runs and is therefore often associated with long execution times, costs and risks. This affects product, material and technology development in industry and research. New digital technologies offer the possibility to automate complex manual work steps in a cost-effective way, to increase the relevance of the results and to accelerate the processes many times over. In this context, this article presents a low-cost, modular and open-source machine vision system for test execution and evaluates it on the basis of a real industrial application. For this purpose, a methodology is presented for the automated execution of load intervals, for process documentation and for the evaluation of the generated data by means of machine learning to classify wear levels. The software and the mechanical structure are designed to be adaptable to different conditions and components and to a variety of tasks in industry and research. The mechanical structure is required for tracking the test object and represents a motion platform positioned independently by machine vision operators or machine learning. The state of the test object is evaluated by transfer learning after the initial documentation run. The manual procedure for classifying the visually recorded data on the state of the test object is described for the training material. This leads to increased resource efficiency on both the material and the personnel side: on the one hand, the significance of the tests performed is increased by continuous documentation, and on the other hand, the responsible experts can be assigned time-efficiently. The presence and know-how of the experts are therefore only required for defined and decisive events during the execution of the experiments.
Furthermore, the generated data are suitable for later use as an additional source of data for predictive maintenance of the developed object.
The design process for a single-phase, smart, universal charger for light electric vehicles is presented. It consists of a step-up power factor correction circuit followed by a phase-shifted full-bridge converter with synchronous rectification on the secondary side. Due to the resistor-capacitor-diode snubber on the secondary side, the current peak at the start of power transfer leads to false triggering during light-load operation with peak current mode control. The solution developed for light loads is to change from peak current control to voltage control, achieved by limiting the maximum phase shift instead of changing the reference value. For the power factor correction stage, measured and calculated efficiencies are compared as a function of output power. Voltage and current waveforms are shown for the power factor correction circuit, and for the phase-shifted bridge the measured current waveform is compared with simulation.
The evaluation of the effectiveness of different machine learning algorithms on a publicly available database of signals derived from wearable devices is presented, with the goal of optimizing human activity recognition and classification. From the wide range of body signals, we chose two that are easy to acquire simultaneously with widespread commercial devices (e.g. smartwatches) as well as with custom wearable wireless devices designed for sport, healthcare or clinical purposes: photoplethysmographic signals (optically detected subcutaneous blood volume) and tri-axis acceleration signals. To this end, two widely used algorithms (decision tree and k-nearest neighbor) were tested, and their performance was compared to that of two more recent algorithms (particle Bernstein and a Monte Carlo-based regression), both in terms of accuracy and processing time. A data preprocessing phase was also included to improve the performance of the machine learning procedures and to reduce the problem size, and a detailed analysis of the compression strategy and its results is presented.
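The comparison of the two classic classifiers can be sketched as follows. The data are synthetic stand-ins for windowed PPG/acceleration features (the real database is not used here), and the study's two newer algorithms (particle Bernstein, Monte Carlo-based regression) are omitted since no standard implementation is assumed:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Hypothetical feature windows: 12 features per window, 4 activity classes
X, y = make_classification(n_samples=600, n_features=12, n_informative=8,
                           n_classes=4, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=42, stratify=y)

results = {}
for name, clf in [("decision tree", DecisionTreeClassifier(random_state=42)),
                  ("k-NN", KNeighborsClassifier(n_neighbors=5))]:
    results[name] = clf.fit(X_tr, y_tr).score(X_te, y_te)  # test accuracy
print(results)
```

In practice one would also time the `fit`/`predict` calls, since the abstract compares processing time as well as accuracy.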
Context: Fast moving markets and the age of digitization require that software can be quickly changed or extended with new features. The associated quality attribute is referred to as evolvability: the degree of effectiveness and efficiency with which a system can be adapted or extended. Evolvability is especially important for software with frequently changing requirements, e.g. internet-based systems. Several evolvability-related benefits were arguably gained with the rise of service-oriented computing (SOC) that established itself as one of the most important paradigms for distributed systems over the last decade. The implementation of enterprise-wide software landscapes in the style of service-oriented architecture (SOA) prioritizes loose coupling, encapsulation, interoperability, composition, and reuse. In recent years, microservices quickly gained in popularity as an agile, DevOps-focused, and decentralized service-oriented variant with fine-grained services. A key idea here is that small and loosely coupled services that are independently deployable should be easy to change and to replace. Moreover, one of the postulated microservices characteristics is evolutionary design.
Problem Statement: While these properties provide a favorable theoretical basis for evolvable systems, they offer no concrete and universally applicable solutions. As with each architectural style, the implementation of a concrete microservice-based system can be of arbitrary quality. Several studies also report that software professionals trust in the foundational maintainability of service orientation and microservices in particular. A blind belief in these qualities without appropriate evolvability assurance can lead to violations of important principles and therefore negatively impact software evolution. In addition to this, very little scientific research has covered the areas of maintenance, evolution, or technical debt of microservices.
Objectives: To address this, the aim of this research is to support developers of microservices with appropriate methods, techniques, and tools to evaluate or improve evolvability and to facilitate sustainable long-term development. In particular, we want to provide recommendations and tool support for metric-based as well as scenario-based evaluation. In the context of service-based evolvability, we furthermore want to analyze the effectiveness of patterns and collect relevant antipatterns. Methods: Using empirical methods, we analyzed the industry state of the practice and the academic state of the art, which helped us to identify existing techniques, challenges, and research gaps. Based on these findings, we then designed new evolvability assurance techniques and used additional empirical studies to demonstrate and evaluate their effectiveness. Applied empirical methods were for example surveys, interviews, (systematic) literature studies, or controlled experiments.
Contributions: In addition to our analyses of industry practice and scientific literature, we provide contributions in three different areas. With respect to metric-based evolvability evaluation, we identified a set of structural metrics specifically designed for service orientation and analyzed their value for microservices. Subsequently, we designed tool-supported approaches to automatically gather a subset of these metrics from machine-readable RESTful API descriptions and via a distributed tracing mechanism at runtime. In the area of scenario-based evaluation, we developed a tool-supported lightweight method to analyze the evolvability of a service-based system based on hypothetical evolution scenarios. We evaluated the method with a survey (N=40) as well as hands-on interviews (N=7) and improved it further based on the findings. Lastly, with respect to patterns and antipatterns, we collected a large set of service-based patterns and analyzed their applicability for microservices. From this initial catalogue, we synthesized a set of candidate evolvability patterns via the proxy of architectural modifiability tactics. The impact of four of these patterns on evolvability was then empirically tested in a controlled experiment (N=69) and with a metric-based analysis. The results suggest that both the additional structural complexity introduced by the patterns and developers' pattern knowledge influence their effectiveness. As a last contribution, we created a holistic collection of service-based antipatterns for both SOA and microservices and published it in a collaborative repository.
Conclusion: Our contributions provide first foundations for a holistic view on the evolvability assurance of microservices and address several perspectives. Metric- and scenario-based evaluation as well as service-based antipatterns can be used to identify "hot spots" while service-based patterns can remediate them and provide means for systematic evolvability construction. All in all, researchers and practitioners in the field of microservices can use our artifacts to analyze and improve the evolvability of their systems as well as to gain a conceptual understanding of service-based evolvability assurance.
While many maintainability metrics have been explicitly designed for service-based systems, tool-supported approaches to automatically collect these metrics are lacking. Especially in the context of microservices, decentralization and technological heterogeneity may pose challenges for static analysis. We therefore propose the modular and extensible RAMA approach (RESTful API Metric Analyzer) to calculate such metrics from machine-readable interface descriptions of RESTful services. We also provide prototypical tool support, the RAMA CLI, which currently parses the formats OpenAPI, RAML, and WADL and calculates 10 structural service-based metrics proposed in scientific literature. To make RAMA measurement results more actionable, we additionally designed a repeatable benchmark for quartile-based threshold ranges (green, yellow, orange, red). In an exemplary run, we derived thresholds for all RAMA CLI metrics from the interface descriptions of 1,737 publicly available RESTful APIs. Researchers and practitioners can use RAMA to evaluate the maintainability of RESTful services or to support the empirical evaluation of new service interface metrics.
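The quartile-based threshold banding can be illustrated with a small sketch. The metric values below and the assumption that lower values are better are illustrative, not taken from the RAMA benchmark:

```python
import numpy as np

def quartile_thresholds(values):
    """Derive green/yellow/orange/red bands from the distribution of a
    metric across many API descriptions (assuming lower is better)."""
    q1, q2, q3 = np.percentile(values, [25, 50, 75])
    return {"green": (float("-inf"), q1), "yellow": (q1, q2),
            "orange": (q2, q3), "red": (q3, float("inf"))}

# Hypothetical values, e.g. number of operations per API across a benchmark
ops_per_api = [3, 5, 8, 8, 12, 15, 20, 24, 30, 55]
bands = quartile_thresholds(ops_per_api)
print(bands)
```

Each quartile of the benchmark distribution becomes one color band, so a newly measured API can be rated by the band its metric value falls into.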
Scenario-based analysis is a comprehensive technique to evaluate software quality and can provide more detailed insights than e.g. maintainability metrics. Since such methods typically require significant manual effort, we designed a lightweight scenario-based evolvability evaluation method. To increase efficiency and to limit assumptions, the method exclusively targets service- and microservice-based systems. Additionally, we implemented web-based tool support for each step. Method and tool were also evaluated with a survey (N=40) that focused on change effort estimation techniques and hands-on interviews (N=7) that focused on usability. Based on the evaluation results, we improved method and tool support further. To increase reuse and transparency, the web-based application as well as all survey and interview artifacts are publicly available on GitHub. In its current state, the tool-supported method is ready for first industry case studies.
Planning of available resources considering ergonomics under deterministic highly variable demand
(2020)
In this paper, a method for hybrid short- to long-term planning of available resources for operations is presented, which is based on a known or deterministically forecasted but highly variable demand. The method considers quantitative measures such as the performance and the availability of resources, ergonomically relevant KPIs and ultimately process costs in order to serve as a pragmatic planning tool for operations managers in SMEs. Specifically, the method enables exploiting the ergonomic advantages of available flexible automation technology (e.g. AGVs or picking robots), while assuring that these do not represent a capacity bottleneck. After presenting the method along with the necessary assumptions, mainly concerning the availability of data for the calculations, we report a case study that quantifies the impact of throughput variability on the selection of different process alternatives, where different teams of resources are used.
The shift of populations to cities is creating challenges in many respects, thus leading to increasing demand for smart solutions to urbanization problems. Smart city applications range from technical and social to economic and ecological. The main focus of this work is to provide a systematic literature review of smart city research to answer two main questions: (1) How is current research on smart cities structured? and (2) What directions are relevant for future research on smart cities? To answer these research questions, a text-mining approach is applied to a large number of publications. This provides an overview and gives insights into relevant dimensions of smart city research. Although the main dimensions of research are already described in the literature, an evaluation of the relevance of such dimensions is missing. Findings suggest that the dimensions of environment and governance are popular, while the dimension of economy has received only limited attention.
The process for producing customized bras is challenging. Although the need is very clear, the lingerie industry currently faces a lack of data, knowledge and expertise for the realization of an automated process chain. Different studies and surveys have shown that the majority of women wear the incorrect bra size. In addition to aesthetic problems, this can result in health risks for the wearers, such as headaches, back problems or digestive problems. An important prerequisite for improvements is basic knowledge about the female breast, both in terms of body measurements and different breast shapes. The current bra sizing system defines a bra size only by the relation between bust girth and underbust girth, and standardized cup forms do not do justice to the high variability of the human body. Since the bra type shapes the female breast, basic knowledge about the relationship between the measurements and shapes of the clothed and the unclothed breast is missing.
In the present project, studies are conducted to explore the female breast and to derive new breast-specific body measurements, different breast shapes and deformation knowledge using existing bras.
Furthermore, an innovative process is being developed that leads from 3D scanning to individual and interactive pattern construction, which allows an automatic pattern creation based on individual body measurements and the influence of different material parameters.
In the course of the presentation, the current project status will be shown and the future developments and project steps will be introduced.
Zero- or plus-energy office buildings must meet very high building standards and require highly efficient energy supply systems due to space limitations for renewable installations. Conventional solar cooling systems use photovoltaic electricity or thermal energy to run either a compression cooling machine or an absorption cooling machine to produce cooling energy during the daytime, while using electricity from the grid for the nightly cooling energy demand. With a hybrid photovoltaic-thermal (PVT) collector, electricity and thermal energy can be produced at the same time. These collectors can also produce cooling energy at nighttime by longwave radiation exchange with the night sky and convection losses to the ambient air. Such a renewable trigeneration system offers new fields of application. However, the technical, ecological and economic aspects of such systems are still largely unexplored.
In this work, the potential of a PVT system to heat and cool office buildings in three different climate zones is investigated. In the investigated system, PVT collectors act as a heat source and heat sink for a reversible heat pump. Due to the reduced electricity consumption (from the grid) for heat rejection, the overall efficiency and economics improve compared to a conventional solar cooling system using a reversible air-to-water heat pump as heat and cold source.
A parametric simulation study was carried out to evaluate the system design with different PVT surface areas and storage tank volumes in order to optimize the system for three different climate zones and two different building standards. It is shown that such systems are technically feasible today. With a maximum utilization of PV electricity for heating, ventilation, air conditioning and other electricity demand such as lighting and plug loads, high solar fractions and primary energy savings can be achieved.
The annual costs of such a system are comparable to those of conventional solar thermal and solar electrical cooling systems. Nevertheless, the economic feasibility strongly depends on country-specific energy prices and energy policy. However, even in countries without compensation schemes for renewably produced energy, this system can still be economically viable today. It could be shown that, at each of the investigated locations worldwide, a specific system dimensioning can be found for economically and ecologically sound operation of an office building with PVT technologies in different system designs.
The purpose of this paper is to give an overview of the links between fashion businesses and film from a fashion business perspective. It focuses on the idea that digitalization has brought much more film use to the fashion industry and that this development has only just begun. This change also has an intense impact on the fashion industry, as fashion companies nowadays are content producers with films, too. The resulting closer connection with viewers via social media exposes fashion companies but, on the other hand, gives the fashion system new potential for influence. In-depth future research on the fashion and film system is therefore required to develop answers to the current situation. This article should be interpreted more as a personal viewpoint of the author on this topic than as a research paper based on the usual methodological criteria.
Today, digitalization is firmly anchored in society and business. It is also recognized to have a significant impact on the retailing sector. The in-store display of moving images has so far, however, gained little attention from researchers. The aim of this research is to provide a first estimation of the current state of moving image distribution in stationary retail stores. A store check was the basis for the analysis and evaluation. In sum, 152 stores were analyzed in Stuttgart, Germany. Of the 152 observed stores, 62 showed a total of 177 moving images. Detailed analyses of the content, mood, color and actors of the motion pictures showed that all aspects are very well harmonized with the target group of the store. The chapter provides a basic estimation of the in-store diffusion of moving images and thereby opens up avenues for further research.
This chapter provides insights into the future of fashion film with respect to augmented reality and virtual reality technologies. The question "How do augmented reality and virtual reality influence the future of fashion film?" is therefore considered. It is important to analyze the influence of those technologies on fashion films to assess the potential for fashion retailers and, in the best case, gain first-mover advantages. To answer the stated research question, a literature review was conducted to gain insights into the topic and its influence on fashion filming. Augmented reality and virtual reality are explained, as are their implications for fashion films in the retail sector. Moreover, examples of companies already using this approach have been compiled. Furthermore, an empirical study was conducted using an online survey. The questionnaire is based on what has been revealed in the literature in order to gain in-depth insights and validation. The data indicate that augmented reality and virtual reality influence the future of fashion film in various ways. The findings highlight how important those technologies can be for enhancing customer experience and engagement. Regarding the research question, the conclusion can be drawn that it is highly important for fashion managers to take future developments like augmented reality and virtual reality into account to stay competitive and satisfy the requirements of modern consumers.
This chapter discusses German television as a platform for fashion content and, in that context, streaming services as possible alternatives. Three German television channels were monitored over the period of one month, as were the two most popular streaming services in Germany and the online media library of one German television channel over six months, with regard to length, fashion connection, transmission time and success. Additionally, fashion advertisement was analyzed for three channels. Broadcasting the most contributions with a fashion connection in one month, VOX was the most fashion-oriented channel. Television contributions about fashion mainly aim to entertain, while informative contributions form a minority. Streaming services offer more of the flexibility that users are asking for. All three television stations show fashion brand spots during prime time. ProSieben and sixx in particular cooperate closely with several fashion brands. Fashion advertising therefore seems to be preferentially inserted into fashion-related series.
Based on new ways of watching series via streaming platforms and changing buying behavior, advertising needs to focus on new strategies. Branded entertainment gives brands the opportunity to integrate their product placements more deeply into television show plots. From a managerial perspective, this increases advertising effectiveness. The series 'Sex and the City' exemplifies successful branded entertainment and shows how series influence fashion today. The placements stand out when it comes to storytelling around the brand or product, setting trends and creating a character connection as well as desire through identification. This chapter shows success factors and opportunities of placements for the fashion industry.
Hip-hop culture defines itself through four central pillars: DJing, MCing, breakdancing and graffiti; a fifth one, fashion, may be emerging. Hip-hop has become the most popular music genre, and its influence on society is undeniable. As hip-hop artists increasingly underpin their music with visual components such as music videos, the question arises whether this influences the fashion industry. This chapter clarifies which factors may determine an impact on the fashion business and discusses differences between mainstream hip-hop artists and those who are also active in the fashion industry. The focus lies on how, and to what extent, fashion is presented in the music videos. 24 music videos were analyzed: 15 popular records from the past three years and nine by artists already considered influential in fashion. Additionally, a fashion influence index was created to compare the degree of fashion between the music videos. The number of styles, recognized brands, fashion-related song verses, fashion-related mentions in the description box and articles about the fashion in the music video were recorded. Findings reveal that the number of outfits shown in a video has no direct link to the amount of traffic it produces in fashion media. The artists considered influential in the fashion industry name brands in their lyrics more often and show brand logos more frequently in their music videos than others. Over the observed years, however, a rise in fashion awareness can be seen among mainstream hip-hop artists, reflected in a higher number of styles, recognizable brands and fashion-related verses in the lyrics.
An event film is a successful marketing and communication instrument that companies can use via social media. By reaching the target group and potential customers, companies can benefit from increasing brand awareness. Strikingly, there is a lack of information about how event films are used to show fashion. To develop the subject further, the purpose of this paper is to enrich the existing findings and analyze the influence event films have. In an empirical study, the performance of two events and the two related fast fashion retailers H&M and Zara on Instagram and YouTube regarding event- and fashion-connected films is analyzed. Identified stylistic elements of event fashion are then searched for and found in their online shops. Since emotions are transferred especially well through event films, there is an indication that they contribute to the shaping of fashion trends.
Instagram fashion videos
(2020)
Instagram is one of the most used social media platforms for sharing photos and videos. It therefore offers companies a helpful opportunity to use the platform as a marketing tool and spread information to a wide range of potential customers. Ever since its launch, Instagram has been strongly connected to fashion, which makes the platform particularly interesting for fashion brands. According to the screened literature, most brands use Instagram for marketing purposes, and the use of videos plays a decisive role. This raises the question of how brands use videos on Instagram for marketing purposes. This chapter therefore aims to investigate the extent to which brands make use of videos on Instagram, what the goals of the videos are and which videos are most effective in terms of user engagement. More specifically, the chapter includes an empirical study that examines the Instagram profiles of nine selected brands from the categories lifestyle, luxury fashion and sportswear with regard to the underlying research question. A subsequent evaluation and discussion of the results depicts differences and similarities within and between the categories. All in all, the results of the study show that fashion brands use films as a marketing tool on Instagram. The content and types of films thereby depend heavily on the brand category.
The purpose of this paper is to investigate how motion pictures are currently used for the product presentation of fashion articles. An explorative approach was chosen for the literature section. This study shows that moving images can be used for the presentation of fashion articles in online shops in numerous different ways. In order to use product presentation videos meaningfully, one should consider exactly what the purpose of these videos is. Different goals require different means. However, retailers should obtain enough information in advance to assess whether they can afford the production and post-processing of these videos.
The purpose of this paper is to investigate how motion pictures are currently used for the product presentation of fashion articles in online shops in the German, American and British markets. This study shows that the use of moving images for the presentation of fashion articles in online shops is underutilized. With the amount of data that was manageable within the scope of this chapter, no valid generalizations can be made; all described results must be understood as indications. In order to use product presentation videos meaningfully, one should first consider exactly what the purpose of these videos is. Different goals require different means. However, retailers should obtain enough information in advance to assess whether they can afford the production and post-processing of these videos.
This chapter looks at image films produced by fashion brands about themselves. It focuses on analyzing important film parameters, the content and the way such films can influence brand image. A list of 70 fashion brands from different categories was gathered through a survey and confirmed by comparing the results with relevant literature. All 70 brands were examined for relevant self-referencing films. The films had to be produced by the brands themselves; advertising videos and videos promoting collections were excluded. In total, 22 films from 17 brands were analyzed. Results show that most brands seem to have recognized videos as a powerful marketing tool in the social media age. However, many brands seem to struggle with certain parameters such as length and the use of the brand logo. In general, the content of the videos centers on four topics: recruitment, values, history and behind the brand. As for the intent, the videos can be classified into three categories: learning, emotion and doing something. This paper not only analyzes this special film category but also gives recommendations for improving the videos.
The connection between fashion and film seems symbiotic at first sight, and the two influence each other. Yet there are differences, including a different understanding of clothing by costume designers and fashion businesses. This article focuses on two successful movies, "The Hunger Games" and "The Great Gatsby", in order to explore the role of film in fashion and vice versa. The findings suggest that various collections in the fashion world are based on both movies. Movies therefore indeed influence the development of seasonal fashion. However, this connection is not natural but rather artificially created by both industries. Through today's organized co-operation, the lines between costume designers and fashion designers become blurred. Furthermore, fashion today does not trickle down to an audience naturally but is promoted using film and its broad reach.
Fashion show films
(2020)
Due to technological developments, fashion show films provide fashion brands with the opportunity to communicate their brand concepts, attract attention and gain more brand awareness by publishing them on the Internet. The purpose of this research paper is to investigate how fashion brands communicate their brand concept and personality through fashion show films. For this purpose, ten fashion show films from brands in the categories luxury, premium, high street and active wear are investigated. The results indicate that the investigated brands use different ways to attract attention and to communicate their brand concept and personality. The design of the setting, the presentation of the collection and the visualization of the brand concept through the brand name, logo, colors or symbols, as well as the camera work, play an important role in creating an effective and exciting fashion show film that communicates the brand concept and promotes the brand image. Mainly luxury and premium brands use fashion show films for branding; for high street and active wear brands, the analysis indicates that fashion show films are less important. The limitations of this research relate to the fact that only ten fashion show films were analyzed. This gives an overview but cannot provide a comprehensive breakdown of the topic.
YouTube fashion videos
(2020)
YouTube is the most widely adopted and successful video sharing platform. It works as a marketing instrument and money-making tool for companies while reaching the target group. A review of the relevant literature on YouTube reveals a striking lack of information about YouTube's benefits as a video marketing instrument for fashion brands. To develop this subject further, the purpose of this study is to enrich the existing findings on social video marketing on YouTube in the apparel industry. The findings indicate the importance of YouTube as a social network for fashion marketers. The second part presents an empirical study that makes the YouTube channel performance of nine fashion brands the subject of discussion. Three brands each from the lifestyle, sports and luxury sectors are analyzed through comparative aspects. Accordingly, the differences and similarities within and between the sectors are analyzed and evaluated.
Public transport maps are typically designed to support route-finding tasks for passengers while also providing an overview of stations, metro lines, and city-specific attractions. Most such maps are static representations, perhaps placed in a metro station or printed in a travel guide. In this paper, we describe a dynamic, interactive public transport map visualization enhanced by additional views of dynamic passenger data at different levels of temporal granularity. Moreover, we also provide extra statistical information in the form of density plots, calendar-based visualizations, and line graphs. All this information is linked to the contextual metro map to give the viewer insights into the relations between time points and typical routes taken by passengers. We illustrate the usefulness of our interactive visualization by applying it to the railway system of Hamburg, Germany, while also taking into account the extra passenger data. As a further indication of the usefulness of the interactively enhanced metro maps, we conducted a user experiment with 20 participants.
Here, we report the continuous peroxide-initiated grafting of vinyltrimethoxysilane (VTMS) onto a standard polyolefin by means of reactive extrusion to produce a functionalized liquid ethylene propylene copolymer (EPM). The effects of the process parameters governing the grafting reaction and their synergistic interactions are identified, quantified and used in a mathematical model of the extrusion process. As process variables, the VTMS and peroxide concentrations and the extruder temperature setting were systematically studied for their influence on the grafting and the relative grafting degree using a face-centered central composite design (FCD). The grafting degree was quantified by 1H NMR spectroscopy. Response surface methodology (RSM) was used to determine the most efficient grafting process in terms of chemical usage and graft yield. Within the defined processing window, it was possible to make precise predictions about the grafting degree while at the same time achieving the highest possible relative degree of grafting.
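To illustrate the design-of-experiments approach named above, a face-centered central composite design with a least-squares quadratic response-surface fit can be sketched as follows. For brevity this sketch uses two coded factors and a hypothetical noise-free response, not the paper's three-factor grafting data:

```python
import numpy as np
from itertools import product

# Face-centered central composite design for two coded factors in [-1, 1]:
# 4 factorial corners, 4 face-centered axial points, 3 center replicates
corners = list(product([-1.0, 1.0], repeat=2))
axial = [(-1.0, 0.0), (1.0, 0.0), (0.0, -1.0), (0.0, 1.0)]
design = np.array(corners + axial + [(0.0, 0.0)] * 3)

# Hypothetical noise-free response (e.g. grafting degree) with an interior optimum
true_response = lambda x1, x2: 5 + 2*x1 + x2 - 1.5*x1**2 - x2**2 + 0.5*x1*x2
x1, x2 = design[:, 0], design[:, 1]
y = true_response(x1, x2)

# Full quadratic response-surface model fitted by ordinary least squares
A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
```

With a noise-free quadratic response, the fit recovers the generating coefficients exactly; with real measurements, the same fit yields the response surface on which the most efficient operating conditions are located.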
The supply of customer-specific products is leading to increasing technical complexity of machines and plants in the manufacturing process. Maintenance is an essential factor in ensuring the availability of these machines and plants. The application of cyber-physical systems enables this complexity to be mastered by improving the availability of information, implementing predictive maintenance strategies and providing all relevant information in real time. The present research project deals with the development of a cost-effective and retrofittable smart maintenance system for the application of ultraviolet (UV) lamps. UV lamps are used in a variety of applications, such as curing of materials and water disinfection, where they are still used instead of UV LEDs due to their higher effectiveness. The smart maintenance system enables continuous condition monitoring of the UV lamp through the integration of sensors. The data obtained are compared with data from existing lifetime models of UV lamps to provide information about the remaining useful lifetime of the UV lamp. This ensures needs-based maintenance measures and more efficient use of UV lamps. Accurate information on the remaining useful lifetime of a UV lamp is important, as the unplanned breakdown of a UV lamp can have far-reaching consequences. The key element is the functional model of the envisioned cyber-physical system, describing the dependencies between the sensors and actuator, the condition monitoring system and the IoT platform. Based on the developed requirements and the functional model, the necessary hardware and software are selected. Finally, the system is developed and retrofitted to a simulated curing process of a 3D printer to validate its functional capability. The developed system leads to improved availability of information on the condition of UV lamps, predictive maintenance measures and context-related provision of information.
Urgent action is needed to keep the chance of limiting global warming to 1.5°C or even 2.0°C. Current outlooks by the IPCC and many other organisations forecast that this will be impossible at the current pace of emission 'reductions' – Germany has already hit 1.5° of warming this year. Across 2019, particularly during the UN Climate Summit in New York, numerous organisations declared their ambition to become net carbon neutral. Amongst these were investors and companies, including quite a number of German ones.
We apply a mixed-methods approach, utilising data gathered from approx. 900 companies after Climate Week in the context of the Energy Efficiency Index of German Industry (EEI), along with media research focusing on announced decarbonisation plans and initiatives pledging climate action.
With this, we analyse how German companies in the manufacturing sectors react to rising societal pressure and emerging policies, in particular what measures they have taken or plan to implement to reduce the footprint of their company, their products and their supply chain. We particularly analyse whether and in what way energy and resource consumption, as well as carbon emissions, are considered in the development and lifecycle of manufactured goods. This is highly relevant, as these goods determine the future footprint of buildings, vehicles and industry.
Regarding the supply chain, current articles indicate that small and medium-sized enterprises (SMEs) are particularly challenged by increasing demands from their large corporate clients and an alleged lack of preparedness to take and afford prompt decarbonisation action themselves (Buchenau et al. 2019). Notably, the automotive industry recently announced new models that will be 100% carbon neutral throughout (ibid.). We thus analyse if and how factors such as company size, energy intensity and sector affiliation influence a company's plan to fully decarbonise. Ownership structure and corporate culture, it appears, significantly affect the degree of decarbonisation action underway.
Power line communications (PLC) reuse the existing power-grid infrastructure for the transmission of data signals. As power line communication technology does not require a dedicated network setup, it can be used to connect a multitude of sensors and Internet of Things (IoT) devices. These IoT devices could be deployed in homes, streets, or industrial environments for sensing and control applications. The key challenge faced by future IoT-oriented narrowband PLC networks is to provide a high quality of service (QoS), as the power line channel has traditionally been considered too hostile. Combined with the scarcity of spectrum and interference from other users, this requirement calls for means to radically increase spectral efficiency and to improve link reliability. Nevertheless, the research activities carried out in the last decade have shown that PLC is a suitable technology for a large number of applications. Motivated by the relevant impact of PLC on IoT, this paper proposes a cooperative spectrum allocation scheme for IoT-oriented narrowband PLC networks using an iterative water-filling algorithm.
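As an illustrative aside, the water-filling principle underlying such spectrum allocation can be sketched as follows. This is a minimal single-user sketch with hypothetical subchannel values, not the cooperative iterative algorithm proposed in the paper:

```python
import numpy as np

def water_filling(noise_over_gain, total_power, tol=1e-9):
    """Classic water-filling: find the water level mu by bisection so that
    the allocation p_i = max(0, mu - n_i/g_i) spends the whole power budget."""
    n = np.asarray(noise_over_gain, dtype=float)
    lo, hi = 0.0, n.max() + total_power        # bounds on the water level
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        if np.maximum(0.0, mu - n).sum() > total_power:
            hi = mu                            # too much power spent, lower the level
        else:
            lo = mu
    return np.maximum(0.0, 0.5 * (lo + hi) - n)

# Hypothetical example: three subchannels with rising noise-to-gain ratios
p = water_filling([0.1, 0.5, 1.0], total_power=1.0)
```

The best subchannel receives the most power, and subchannels whose noise-to-gain ratio lies above the water level receive none; a cooperative variant would iterate such allocations across users until convergence.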
Automatic anode rod inspection in aluminum smelters using deep-learning techniques: a case study
(2020)
Automatic fault detection using machine learning has become an exciting and promising area of research, because it provides an accurate and timely way to manage and classify faults with minimal human effort. In the computer vision community, deep-learning methods have become the most suitable approaches for this task. Anodes are large carbon blocks that are used to conduct electricity during the aluminum reduction process. The most basic function of anode rod inspection is to prevent a situation where the anode rod will not fit into the stub-holes of a new anode, as would be the case for a rod with severe toe-in, missing stubs, or a retained thimble on one or more stubs. In this work, to improve the accuracy of shape defect inspection for an anode rod, we use the Fast Region-based Convolutional Network (Fast R-CNN) model. To train the detection model, we collected an image dataset composed of multiple classes of anode rod defects with annotated labels. Our model is trained using a small number of samples, an essential requirement in industry, where the number of available defective samples is limited. It can simultaneously detect multiple classes of anode rod defects in near real time.
This paper presents a solution that enables end customers of the energy system to participate in new local micro-energy markets by providing them with a distributed, decentralized, transparent and secure peer-to-peer (P2P) payment system, which functions automatically by applying new concepts of machine-to-machine (M2M) communication technologies. This work was performed within the German project VK_2G, funded by the DBU. The key results were: providing means to perform microtransactions in a P2P fashion between end consumers and prosumers in local communities at low cost in a transparent and secure manner; developing a platform with pre-defined smart contracts that can easily be tailored to different end customers' needs; and integrating both the market platform and the local control of generation and loads. This solution has been developed, integrated and tested in a laboratory prototype. This paper discusses the solution and presents the results of the first test.
Ballistocardiography is a technique that measures the heart rate from the mechanical vibrations of the body caused by the heart's movement. In this work, a novel noninvasive device placed under the mattress of a bed estimates the heart rate using ballistocardiography. Different algorithms for heart rate estimation have been developed.
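To illustrate the kind of processing involved, here is a minimal sketch of one possible heart rate estimation step on a synthetic BCG-like signal. The sampling rate, pulse shape and thresholds are assumptions for illustration, not the algorithms developed in the work:

```python
import numpy as np

fs = 100.0                                  # assumed sampling rate in Hz
bpm_true = 72.0                             # assumed heart rate of the test signal
t = np.arange(0, 30, 1 / fs)                # 30 s of synthetic data

# Synthetic BCG-like trace: one sharp pulse per heartbeat plus sensor noise
phase = (t * bpm_true / 60.0) % 1.0
signal = np.exp(-((phase - 0.1) / 0.02) ** 2)
signal += 0.05 * np.random.default_rng(0).normal(size=t.size)

# Simple peak picking: local maxima above a threshold, with a refractory period
thr, refractory = 0.5, int(0.4 * fs)        # assumed tuning values
peaks, i = [], 1
while i < len(signal) - 1:
    if signal[i] > thr and signal[i] >= signal[i - 1] and signal[i] >= signal[i + 1]:
        peaks.append(i)
        i += refractory                     # skip ahead to avoid double detection
    else:
        i += 1

intervals = np.diff(peaks) / fs             # beat-to-beat intervals in seconds
bpm_est = 60.0 / intervals.mean()
```

On this synthetic trace the estimate recovers the simulated heart rate closely; real BCG signals additionally require filtering to suppress respiration and movement artefacts.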
Globalization has increased the number of road trips and vehicles. The result has been an intensification of traffic accidents, which are becoming one of the most important causes of death worldwide. Traffic accidents are often due to human error, the probability of which increases when the cognitive ability of the driver decreases. Cognitive capacity is closely related to the driver’s mental state, as well as other external factors such as the CO2 concentration inside the vehicle. The objective of this work is to analyze how these elements affect driving. We have conducted an experiment with 50 drivers who have driven for 25 min using a driving simulator. These drivers completed a survey at the start and end of the experiment to obtain information about their mental state. In addition, during the test, their stress level was monitored using biometric sensors and the state of the environment (temperature, humidity and CO2 level) was recorded. The results of the experiment show that the initial level of stress and tiredness of the driver can have a strong impact on stress, driving behavior and fatigue produced by the driving test. Other elements such as sadness and the conditions of the interior of the vehicle also cause impaired driving and affect compliance with traffic regulations.
Traditional communication of research on climate change fails to encourage individual, corporate, and political leaders to take appropriate action. We argue that this problem is based on an overly simplistic unidirectional model of science communication. Conversely, theory shows that active learning processes are better suited to initiate and mobilize engagement among all stakeholders. Here, we integrate theoretical insights on active learning with empirical evidence from serious gaming: communication should be understood as an integral design feature that relates active learning on climate change to tangible action.
At DBKDA 2019, we demonstrated that StrongDBMS, with simple but rigorous optimistic algorithms, provides better performance in situations of high concurrency than major commercial database management systems (DBMS). The demonstration was convincing, but the reasons for its success were not fully analysed; there is a brief account of the results below. In this short contribution, we discuss the reasons for these results. The analysis leads to a strong criticism of all DBMS algorithms based on locking and, based on these results, it is not fanciful to suggest that it is time to re-engineer existing DBMS.
3D assisted 2D face recognition involves the process of reconstructing 3D faces from 2D images and solving the problem of face recognition in 3D. To facilitate the use of deep neural networks, a 3D face, normally represented as a 3D mesh of vertices and its corresponding surface texture, is remapped to image-like square isomaps by a conformal mapping. Based on previous work, we assume that face recognition benefits more from texture. In this work, we focus on the surface texture and its discriminatory information content for recognition purposes. Our approach is to prepare a 3D mesh, the corresponding surface texture and the original 2D image as triple input for the recognition network, to show that 3D data is useful for face recognition. Texture enhancement methods to control the texture fusion process are introduced and we adapt data augmentation methods. Our results show that texture-map-based face recognition can not only compete with state-of-the-art systems under the same preconditions but also outperforms standard 2D methods from recent years.
Appropriate mechanical properties and fast endothelialization of synthetic grafts are key to ensure long-term functionality of implants. We used a newly developed biostable polyurethane elastomer (TPCU) to engineer electrospun vascular scaffolds with promising mechanical properties (E-modulus: 4.8 ± 0.6 MPa, burst pressure: 3326 ± 78 mmHg), which were biofunctionalized with fibronectin (FN) and decorin (DCN). Neither uncoated nor biofunctionalized TPCU scaffolds induced major adverse immune responses except for minor signs of polymorph nuclear cell activation. The in vivo endothelial progenitor cell homing potential of the biofunctionalized scaffolds was simulated in vitro by attracting endothelial colony-forming cells (ECFCs). Although DCN coating did attract ECFCs in combination with FN (FN + DCN), DCN-coated TPCU scaffolds showed a cell-repellent effect in the absence of FN. In a tissue-engineering approach, the electrospun and biofunctionalized tubular grafts were cultured with primary-isolated vascular endothelial cells in a custom-made bioreactor under dynamic conditions with the aim to engineer an advanced therapy medicinal product. Both FN and FN + DCN functionalization supported the formation of a confluent and functional endothelial layer.
Purpose. To improve the efficiency of the closed-cycle operation of the field-oriented induction machine in dynamic behavior when load conditions change, considering the nonlinearities of the main inductance.
Methodology. The optimal control problem is defined as the minimization of the time integral of the energy losses. The algorithm described in this paper uses Matlab/Simulink, the dSPACE real-time interface, and the C language. Real-time applications are handled in the ControlDesk experiment software for seamless ECU development.
Findings. A discrete-time model with an integrated predictive control scheme, in which the optimization is performed online at every sampling step, has been developed. The optimal field-producing current trajectory is determined so that the copper losses are minimized over a wide operational range. Additionally, a comparison of measurement results with conventional methods is provided, which validates the advantages and performance of the control scheme.
Originality. To solve the given problem, the information vector on the current state of the coordinates of the electromechanical system is used to form a controlling influence in the dynamic mode of operation. For the first time, the formation of the controls considers both the current state and the desired future state of the system in the real-time domain.
Practical value. A predictive iterative approach to the optimal flux level of an induction machine is important to generate the required electromagnetic torque and reduce power losses simultaneously.
Rising consumption due to a growing world population and increasing prosperity, combined with a linear economic system, has led to a sharp increase in waste, general pollution of the environment and the threat of resource scarcity. At the same time, the perception of environmental protection is becoming more sensitive as the consequences of neglecting sustainable business and eco-efficiency become more visible. The Circular Economy (CE) could reduce waste production and is able to decouple economic growth from resource consumption, but most products currently in use are not designed for the reuse forms of the CE. In addition, the decision-making process for End-of-Usage (EoU) products regarding the next steps has further weaknesses in terms of economic attractiveness for the participants, which leads to low return rates; disposal is thus often the only alternative.
This paper proposes a model of the decision-making process that uses machine learning. For this purpose, a Machine Learning (ML) classification is created by applying the waterfall model. An artificial neural network (ANN) uses information about the model, the use phase and the obvious symptoms of the product to predict the condition of individual components. The resulting information can be used in a downstream economic and ecological evaluation to assess the possible next steps. To test this process, comprehensive training data is simulated to train the ANN. The decentralized implementation, cost savings and the possibility of an incentive system for the return of end-of-usage products could lead to increased return rates. Since electronic devices in particular are attractive for the CE, laptops are the reference object of this work. However, the findings are easily applicable to other electronic devices.
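The core step of training a small network on simulated product data can be sketched as follows. This is a deliberately tiny, self-contained stand-in with invented feature names and synthetic labels, not the ANN architecture or training data of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for the described setup: three input features (think usage
# hours, symptom score, device age -- hypothetical names) -> component state
X = rng.uniform(0.0, 1.0, size=(400, 3))
y = (X[:, 0] + 2.0 * X[:, 1] < 1.2).astype(float)   # synthetic ground truth

# A one-hidden-layer network trained by plain batch gradient descent
W1 = rng.normal(0.0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(2000):
    h = np.tanh(X @ W1 + b1)                 # hidden activations
    p = sigmoid(h @ W2 + b2).ravel()         # predicted P(component OK)
    d2 = (p - y)[:, None] / len(X)           # cross-entropy gradient at output
    d1 = (d2 @ W2.T) * (1.0 - h ** 2)        # backprop through tanh
    W2 -= h.T @ d2; b2 -= d2.sum(0)
    W1 -= X.T @ d1; b1 -= d1.sum(0)

p = sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2).ravel()
acc = float(((p > 0.5) == (y > 0.5)).mean())
```

In practice such a classifier would be trained per component class on the simulated training data, and its predictions fed into the downstream economic and ecological evaluation.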
Globalisation, shorter product life cycles, and increasing product varieties have led to complex supply chains. At the same time, there is a growing interest of customers and governments in greater transparency of brands, manufacturers, and producers throughout the supply chain. Due to the complex structure of collaborative manufacturing networks, increasing supply chain transparency is a challenge for manufacturing companies. Blockchain technology offers an innovative solution to increase the transparency, security, authenticity, and auditability of products. However, there are still uncertainties when applying blockchain technology to manufacturing scenarios so as to enable all stakeholders to trace back each component of an assembled product. This paper proposes a framework design to increase the transparency and auditability of products in collaborative manufacturing networks by adopting blockchain technology. In this context, each component of a product is marked with a unique identification number generated by blockchain-based smart contracts. In this way, a transparent auditability of assembled products and their components can be achieved for all stakeholders, including the customer.
Companies are becoming aware of the potential risks arising from sustainability aspects in supply chains. These risks can affect ecological, economic or social aspects. One important element in managing those risks is improved transparency in supply chains by means of digital transformation. Innovative technologies like blockchain technology can be used to enforce transparency. In this paper, we present a smart contract-based Supply Chain Control Solution to reduce risks. Technological capabilities of the solution will be compared to a similar technology approach and evaluated regarding their benefits and challenges within the framework of supply chain models. As a result, the proposed solution is suitable for the dynamic administration of complex supply chains.
Detecting semantic similarities between sentences is still a challenge today due to the ambiguity of natural languages. In this work, we propose a simple approach to identifying semantically similar questions by combining the strengths of word embeddings and Convolutional Neural Networks (CNNs). In addition, we demonstrate how the cosine similarity metric can be used to effectively compare feature vectors. Our network is trained on the Quora dataset, which contains over 400k question pairs. We experiment with different embedding approaches such as Word2Vec, Fasttext, and Doc2Vec and investigate the effects these approaches have on model performance. Our model achieves competitive results on the Quora dataset and complements the well-established evidence that CNNs can be utilized for paraphrase recognition tasks.
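The cosine similarity comparison of feature vectors described above can be sketched as follows; the vectors are hypothetical stand-ins for the CNN-pooled question embeddings:

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine of the angle between two feature vectors (1 = same direction)."""
    u, v = np.asarray(u, dtype=float), np.asarray(v, dtype=float)
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Hypothetical feature vectors for three questions; in the paper these would
# come from the CNN applied to Word2Vec/Fasttext/Doc2Vec embeddings
q1 = [0.2, 0.9, 0.1, 0.4]
q2 = [0.25, 0.8, 0.05, 0.5]   # paraphrase of q1
q3 = [-0.7, 0.1, 0.9, -0.3]   # unrelated question

sim_dup = cosine_similarity(q1, q2)     # high -> likely duplicates
sim_other = cosine_similarity(q1, q3)   # low -> likely different
```

A threshold on this score (or a learned decision layer over it) then classifies a question pair as duplicate or not.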
Medical implants play a central role in modern medicine, and both naturally derived and synthetic materials have been explored as biomaterials for such devices. However, when implanted into living tissue, most materials initiate a host response. In addition, implants often cause bacterial infections, leading to complications. Polyelectrolyte multilayer (PEM) coatings can be used for the functionalization of medical implants, improving implant integration and reducing foreign body reactions. Some PEMs are also known to show antibacterial properties. We developed a PEM coating that we suggest can decrease the risk of bacterial infections occurring after implantation while being highly biocompatible. We applied two different standard tests for evaluating the PEM's antibacterial properties, the ISO standard test (ISO 22196) and the ASTM standard test (ASTM E2180). We found a reduction of bacterial growth on the PEM, but to a different degree depending on the testing method. This result demonstrates the need to define a proper method for evaluating the antibacterial properties of surface coatings.
Steady-state efficiency optimization techniques for induction motors are state of the art, and various methods have already been developed. This paper provides new insights into efficiency-optimized operation in the dynamic regime. It proposes an anticipative flux modification in order to decrease losses during torque and speed transients. The resulting trajectories are analyzed based on a numerical study for different motors. Measurement results for one motor are given as well.
Energy efficient electric control of drives is more and more important for electric mobility and manufacturing industries. Online dynamic optimization of induction machines is challenging due to the computational complexity involved and the variable power losses during dynamic operation of induction machines. This paper proposes a simple technique for sub-optimal online loss optimization using rotor flux linkage templates for energy efficient dynamic operation of induction machines. Such a rotor flux linkage template is given by a rotor flux linkage trajectory which is optimal for a specific scenario. This template is calculated in an offline optimization process. For a specific scenario during real time operation the rotor flux linkage is calculated by appropriately scaling the given template.
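The template-scaling step can be sketched as follows; the square-root torque scaling and the flux limits are illustrative assumptions for this sketch, not the paper's actual scaling law:

```python
def scale_flux_template(template, torque_ref, torque_template,
                        flux_min=0.2, flux_max=1.2):
    """Scale a precomputed (offline-optimal) rotor-flux trajectory to a
    new torque reference. Assumption: flux scales with the square root
    of the torque ratio, as in steady-state loss minimisation, and is
    clamped to machine limits."""
    k = (torque_ref / torque_template) ** 0.5
    return [min(max(f * k, flux_min), flux_max) for f in template]

# Offline-optimised template for a 10 Nm torque step (values invented).
template = [0.40, 0.55, 0.70, 0.85, 0.90]
scaled = scale_flux_template(template, torque_ref=20.0, torque_template=10.0)
```

The offline optimization that produces the template itself is the computationally expensive part; the scaling above is cheap enough for real-time use, which is the point of the template approach.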
The critical process parameters cell density and viability during mammalian cell cultivation are assessed by UV/VIS spectroscopy in combination with multivariate data analysis methods. This direct optical detection technique uses a commercial optical probe to acquire spectra in a label-free way without signal enhancement. For the cultivation, an inverse cultivation protocol is applied, which simulates the exponential growth phase by exponentially replacing cells and metabolites of a growing Chinese hamster ovary cell batch with fresh medium. For the simulation of the death phase, a batch of growing cells is progressively replaced by a batch with completely starved cells. Thus, the most important parts of an industrial batch cultivation are easily imitated. The cell viability was determined by the well-established method of partial least squares regression (PLS). To further improve process knowledge, the viability has been determined from the spectra based on a multivariate curve resolution (MCR) model. With this approach, the progress of the cultivations can be continuously monitored solely based on a UV/VIS sensor. Thus, the monitoring of critical process parameters is possible inline within a mammalian cell cultivation process, especially the viable cell density. In addition, the beginning of cell death can be detected by this method, which allows us to determine the cell viability with acceptable error.
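The curve-resolution idea can be illustrated with a two-component linear unmixing: given pure-component spectra for viable and starved cells, the fractions follow from linear least squares. Note the assumptions: real MCR estimates the pure spectra from the data rather than taking them as known, and all numbers below are invented.

```python
def unmix(spectrum, s_viable, s_dead):
    """Solve spectrum ~ c1*s_viable + c2*s_dead by linear least squares
    (2x2 normal equations). Returns the component weights (c1, c2)."""
    a11 = sum(x * x for x in s_viable)
    a12 = sum(x * y for x, y in zip(s_viable, s_dead))
    a22 = sum(y * y for y in s_dead)
    b1 = sum(x * z for x, z in zip(s_viable, spectrum))
    b2 = sum(y * z for y, z in zip(s_dead, spectrum))
    det = a11 * a22 - a12 * a12
    c1 = (a22 * b1 - a12 * b2) / det
    c2 = (a11 * b2 - a12 * b1) / det
    return c1, c2

# Invented pure-component spectra (3 wavelengths) and a 70/30 mixture.
s_viable = [1.0, 0.5, 0.2]
s_dead = [0.2, 0.6, 1.0]
mix = [0.7 * a + 0.3 * b for a, b in zip(s_viable, s_dead)]
c1, c2 = unmix(mix, s_viable, s_dead)
viability = c1 / (c1 + c2)  # fraction of the viable component
```

With noisy spectra and more components the same normal-equation structure applies, which is why the approach extends naturally to PLS and MCR models built on full spectra.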
On the design of an urban data and modeling platform and its application to urban district analyses
(2020)
An integrated urban platform is the essential software infrastructure for smart, sustainable and resilient city planning, operation and maintenance. Today such platforms are mostly designed to handle and analyze large and heterogeneous urban data sets from very different domains. Modeling and optimization functionalities are usually not part of the software concepts. However, such functionalities are considered crucial by the authors to develop transformation scenarios and to optimize smart city operation. An urban platform needs to handle multiple scales in the time and spatial domains, ranging from long-term population and land use change to hourly or sub-hourly matching of renewable energy supply and urban energy demand.
Customer foresight is a relatively new research field. We introduce the customer foresight territory by discussing its localization between customer research and foresight research. For this purpose, we look at a variety of methods that help to understand customers and future realities. On this basis we provide an overview of customer foresight methods and outline an ideal-typical research journey.
Learning factories can complement each other by training different competencies in the field of digitalisation and Industry 4.0. They depict diverse sections of the product development process and focus on various technologies. Within the framework of the International Association of Learning Factories (IALF), the operating organisations of learning factories exchange information on research, training and education. One of the aims is to develop joint projects. The article presents different concepts of cooperation between learning factories while focusing on improving the development of learners' competencies, e.g. with a broader range of topics. A concept for a joint course between the learning factories in Bochum, Reutlingen and Darmstadt is explained in detail. The three learning factories are examined with regard to their similarities and differences. The joint course focuses on the target group of students and the topic of digitalisation in the development and production of products. The course and its contents are explained in detail. The new learning approach is evaluated on the basis of feedback from the participants. Finally, challenges resulting from the cooperation between learning factories at different locations and with different operating models are discussed.
To remain relevant and mitigate disruption, traditional companies have to engage in multiple fast-paced experiments in digital offerings—revenue-generating solutions to what customers want and are willing to pay for, inspired by what is possible with digital technologies. After launching several digital offering initiatives, reinsurance giant Munich Re noticed that many experienced similar challenges. This case describes how Munich Re addressed these common challenges by building a foundation to help its digital offerings succeed. The foundation provided prioritized and staged funding; dedicated, hands-on expertise; and a digital platform of shared services. By 2020, this foundation was helping to support over seventy initiatives, including several that were in the market generating new sources of revenue for the company by enabling its clients—insurance companies—to better service their own customers.
To remain relevant and mitigate disruption, traditional companies have to engage in multiple fast-paced experiments in digital offerings: revenue-generating solutions that leverage digital technologies to address customer needs. After launching several digital offering initiatives, reinsurance giant Munich Re noticed that many experienced similar challenges. This briefing describes how Munich Re addressed these common challenges by building a foundation for experimenting more systematically and successfully with digital offerings. The foundation has enabled Munich Re to become a serial innovator of digital offerings.
In an increasingly competitive environment, suppliers are now seen as an important source of innovation. Long-term partnerships enable companies to access the knowledge of suppliers to optimize their business. "Procurement 4.0" is one of the concepts that come to the fore when talking about the digitalization of business processes. The major aim of this research is to discuss a conceptual model of "Procurement 4.0" and its potential to rethink the management of supplier relationships, which will be one of the main future tasks of procurement. The paper is based on a factual-analytical research approach that serves to continuously specify and supplement the elements of the frame of reference: two challenging concepts, "Procurement 4.0" and Supplier Relationship Management, are merged to show that purchasing is indispensable as an "interface" within a global supply chain for reaping the benefits of digitalization. The factors that prove to be obstacles to digital supplier relationship management along the digital supplier journey - e.g. a lack of guidelines, approaches or tools and a lack of understanding of the importance of long-term relationships - are reflected within the identified technologies of digital transformation. A comprehensive analysis of the current situation of digital supplier relationship management in Germany is provided. The most important digital supplier touchpoints are discussed in order to develop a traditional supplier relationship towards digital relationship management. Thus, this paper illustrates how the innovative concept of a supplier journey can be implemented in practice to address future entrepreneurial challenges.
This document presents a new complete standalone system for the recognition of sleep apnea using signals from pressure sensors placed under the mattress. The developed hardware part of the system filters and amplifies the signal; the software part performs more accurate signal filtering and identification of apnea events. The overall achieved accuracy of recognizing apnea occurrence is 91%, with an average measured recognition delay of about 15 seconds, which confirms the suitability of the proposed method for future employment. The main aim of the presented approach is to support the healthcare system with a cost-efficient tool for the recognition of sleep apnea in the home environment.
The recovery of our body and brain from fatigue directly depends on the quality of sleep, which can be determined from the results of a sleep study. The classification of sleep stages is the first step of this study and includes the measurement of vital data and their further processing. The non-invasive sleep analysis system is based on a hardware sensor network of 24 pressure sensors providing sleep phase detection. The pressure sensors are connected to an energy-efficient microcontroller via a system-wide bus. A significant difference between this system and other approaches is the innovative way in which the sensors are placed under the mattress. This feature facilitates the continuous use of the system without any noticeable influence on the sleeping person. The system was tested by conducting experiments that recorded the sleep of various healthy young people. Results indicate the potential to capture respiratory rate and body movement.
Comparison of sleep characteristics measurements: a case study with a population aged 65 and above
(2020)
Good sleep is crucial for a healthy life of every person. Unfortunately, its quality often decreases with aging. A common approach to measuring sleep characteristics is based on interviews with the subjects or letting them fill in a daily questionnaire and afterward evaluating the obtained data. However, this method imposes time and personnel costs on the interviewer and the evaluator of responses. Therefore, it would be important to perform the collection and evaluation of sleep characteristics automatically. To do that, it is necessary to investigate the level of agreement between measurements performed in a traditional way using questionnaires and measurements obtained using electronic monitoring devices. The study presented in this manuscript performs this investigation, comparing such sleep characteristics as "time going to bed", "total time in bed", "total sleep time" and "sleep efficiency". A total of 106 night records of elderly persons (aged 65+) were analyzed. The results achieved so far reveal that the degree of agreement between the two measurement methods varies substantially across characteristics, from 31 minutes of mean difference for "time going to bed" to 77 minutes for "total sleep time". For this reason, a direct exchange of objective and subjective measuring methods is currently not possible.
Concrete is a key construction material. A problem in its application is the appearance of cracks, which reduce its strength. Autogenous crack-healing mechanisms based on bacteria have received increasing attention in recent years. The bacteria are able to form calcium carbonate (CaCO3) precipitates under suitable conditions to protect and reinforce the concrete. However, a large number of spores are crushed in aged specimens, resulting in a loss of viability. A new kind of hydrogel crosslinked by alginate, chitosan and calcium ions was introduced in this study. It was observed that the addition of chitosan improved the swelling properties of calcium alginate. An opposite pH response to that of calcium alginate was observed when the chitosan content in the solution reached 1.0%. With an addition of 1.0% chitosan in the hydrogel beads, a 10.28% increase in compressive strength and a 13.79% increase in flexural strength relative to the control were observed. The results reveal the self-healing properties of the concretes. Healing of a crack 4 cm long and 1 mm wide was observed when using cement PO325 with the addition of bacterial spores (2.54–3.07 × 105/cm3 concrete) encapsulated by hydrogel containing no chitosan.
Autonomous driving is becoming the next big digital disruption in the automotive industry. However, the possibility of integrating autonomous driving vehicles into current transportation systems not only involves technological issues but also requires the acceptance and adoption of users. Therefore, this paper develops a conceptual model for user acceptance of autonomous driving vehicles. The corresponding model is tested through a standardized survey of 470 respondents in Germany. Finally, the findings are discussed in relation to the current developments in the automotive industry, and recommendations for further research are given.
We investigate the toxicity of different types and sizes of microplastic particles (0.3–4 mm) under different conditions (new particles, aged particles with biofilm, and particles with adsorbed tributyltin) on the freshwater amphipod Gammarus fossarum in 3-week exposures. All types of plastic particles were taken up randomly and only to a small extent; the ingested particles were mostly polyphenylene oxide, polybutylene terephthalate and polypropylene, with sizes < 1 mm. Plastic particles did not affect the feeding and locomotory behaviour of the gammarids, and there was no strong difference between pristine plastic particles and aged particles with biofilm. Mortality tended to be higher compared with the control. Tributyltin hydride (TBTH) adsorbed to microplastic particles had no effect on uptake, survival, feeding or locomotory behaviour during the 3 weeks of exposure. Dissolved TBTH, however, was already very toxic after a few days of exposure (LC50-96h < 1 ng l–1).
“I have never seen one who loves virtue as much as he loves beauty,” Confucius once said. If beauty is more important than goodness, it becomes clear why people invest so much effort in their first impression. The aesthetics of faces has many aspects, and there is a strong correlation with all characteristics of humans, like age and gender. Often, research on aesthetics by social and ethics scientists lacks sufficient labelled data and the support of machine vision tools. In this position paper we propose the Aesthetic-Faces dataset, containing training data which is labelled by Chinese and German annotators. As a combination of three image subsets, the AF-dataset consists of European, Asian and African people. The research communities in machine learning, aesthetics and social ethics can benefit from our dataset and our toolbox. The toolbox provides many functions for machine learning with state-of-the-art CNNs and an Extreme-Gradient-Boosting regressor, but also 3D Morphable Model technologies for face shape evaluation, and we discuss how to train an aesthetic estimator considering culture and ethics.
Rapidly changing market conditions and global competition are leading to an increasing complexity of logistics systems and require innovative approaches with respect to the organisation and control of these systems. In scientific research, concepts of autonomously controlled logistics systems show a promising approach to meet the increasing requirements for flexible and efficient order processing. In this context, this work aims to introduce a system that is able to adjust order processing dynamically, and optimise intralogistics transportation regarding various generic intralogistics target criteria. The logistics system under consideration consists of various means of transport for autonomous decision-making and fulfilment of transport orders with defined source-sink relationships. The context of this work is set by introducing the Learning Factory Werk 150 with its existing hardware and software infrastructure and its defined target figures to measure the performance of the system. Specifically, the important target figures cost and performance are considered for the transportation system. The core idea of the system’s logic is to solve the problem of order allocation to specific means of transport by linking a Genetic Algorithm with a Multi-Agent System. The implementation of the developed system is described in an application scenario at the learning factory.
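The order-allocation step can be sketched with a toy genetic algorithm; the cost model, operators and parameters below are illustrative assumptions, and the system described above additionally couples the GA with a Multi-Agent System for decentralised decision-making:

```python
import random

random.seed(42)
ORDERS = [3, 1, 4, 2, 5]     # transport effort per order (invented)
VEHICLES = [1.0, 1.5]        # cost rate per unit effort (invented)

def cost(assignment):
    """Total transport cost plus a makespan-like penalty on the
    busiest vehicle, standing in for the cost/performance targets."""
    loads = [0.0] * len(VEHICLES)
    for order, v in zip(ORDERS, assignment):
        loads[v] += order * VEHICLES[v]
    return sum(loads) + max(loads)

def evolve(pop_size=30, generations=60):
    """Elitist GA: chromosome i maps order i to a vehicle index."""
    pop = [[random.randrange(len(VEHICLES)) for _ in ORDERS]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[: pop_size // 2]          # keep the best half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(ORDERS))
            child = a[:cut] + b[cut:]             # one-point crossover
            if random.random() < 0.2:             # mutation
                child[random.randrange(len(ORDERS))] = \
                    random.randrange(len(VEHICLES))
            children.append(child)
        pop = survivors + children
    return min(pop, key=cost)

best = evolve()
```

In the described system, the fitness evaluation would be informed by agents representing the individual means of transport, rather than by a fixed global cost function as in this sketch.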
This article examines centralised and decentralised approaches towards managing internationalisation by means of a case study. Reutlingen University (Hochschule Reutlingen), a university of applied sciences in Southern Germany, has three decades of experience in managing internationalisation. Its strongly integrated and hybrid approach combines centralised and decentralised strategies with the aim of achieving responsiveness, innovation, transparency, quality, and goal alignment. Centralisation and decentralisation are manifested on two levels: university versus schools, and school versus individual programmes. Since internationalisation is embedded in virtually all areas of the university’s operations, examples will be provided ranging from administration and marketing to research, international programme management and curricula.
Simple MOSFET models intended for hand analysis are inaccurate in deep sub-micrometer process technologies and in the moderate inversion region of device operation. Accurate models, such as the Berkeley BSIM6 model, are too complex for use in hand analysis and are intended for circuit simulators. Artificial neural networks (ANNs) are efficient at capturing both linear and non-linear multivariate relationships. In this work, a straightforward modeling technique is presented using ANNs to replace the BSIM model equations. Existing open-source libraries are used to quickly build models with error rates generally below 3%. When combined with a novel approach, such as the gm/Id systematic design method, the presented models are sufficiently accurate for use in the initial sizing of analog circuit components without simulation.
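A minimal sketch of the idea, fitting a smooth device characteristic with a small neural network, is shown below. The toy tanh target stands in for BSIM-generated training data, and the paper uses existing open-source ANN libraries rather than this hand-rolled net:

```python
import math
import random

random.seed(0)
H = 8  # hidden units
w1 = [random.uniform(-1, 1) for _ in range(H)]
b1 = [random.uniform(-1, 1) for _ in range(H)]
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def forward(x):
    """One-hidden-layer tanh network: scalar input, scalar output."""
    hidden = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    return sum(w2[j] * h for j, h in enumerate(hidden)) + b2, hidden

def train(samples, lr=0.05, epochs=2000):
    """Plain stochastic gradient descent on squared error."""
    global b2
    for _ in range(epochs):
        for x, y in samples:
            out, hidden = forward(x)
            err = out - y
            for j in range(H):
                grad_h = err * w2[j] * (1.0 - hidden[j] ** 2)
                w2[j] -= lr * err * hidden[j]
                w1[j] -= lr * grad_h * x
                b1[j] -= lr * grad_h
            b2 -= lr * err

# Toy "device curve" standing in for BSIM-simulated Id(Vgs) samples.
target = lambda v: math.tanh(2.0 * v - 1.0)
data = [(v / 10.0, target(v / 10.0)) for v in range(11)]
train(data)
max_err = max(abs(forward(x)[0] - y) for x, y in data)
```

Real device models map several inputs (widths, lengths, bias voltages) to several outputs (currents, transconductances), but the fitting principle is the same single regression problem scaled up.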
Based on a survey among customers of seven German municipal utilities, we estimate two regression models to identify the most prospective customer segments and their preferences and motivations for participating in peer-to-peer (P2P) electricity trading and develop implications for decision-makers in the energy sector and policy-makers for this currently relatively unknown product. Our results show a large general openness of private households towards P2P electricity trading, which is also the main predictor of respondents' intention to participate. It is mainly influenced by individuals’ environmental attitude, technical interest, and independence aspiration. Respondents with the highest willingness to participate in P2P electricity trading are mainly motivated by the ability to share electricity, and to a lesser extent by economic reasons. They also have stronger preferences for innovative pricing schemes (service bundles, time-of-use tariffs). Differences between individuals can be observed depending on their current ownership (prosumers) or installation probability of a microgeneration unit (consumers, planners). Rather than current prosumers, especially planners willing to install microgeneration in the foreseeable future are considered to be the most promising target group for P2P electricity trading. Finally, our results indicate that P2P electricity trading could be a promising niche option in the German energy transition.
Some widely used optical measurement systems require a scan in wavelength or in one spatial dimension to measure the topography in all three dimensions. Novel hyperspectral sensors based on an extended Bayer pattern have a high potential to solve this issue as they can measure three dimensions in a single shot. This paper presents a detailed examination of a hyperspectral sensor including a description of the measurement setup. The evaluated sensor (Ximea MQ022HG-IM-SM5X5-NIR) offers 25 channels based on Fabry–Pérot filters. The setup illuminates the sensor with discrete wavelengths under a specified angle of incidence. This allows characterization of the spatial and angular response of every channel of each macropixel of the tested sensor on the illumination. The results of the characterization form the basis for a spectral reconstruction of the signal, which is essential to obtain an accurate spectral image. It turned out that irregularities of the signal response for the individual filters are present across the whole sensor.
Hyperspectral imaging opens a wide field of applications. It is a well-established technique in agriculture, medicine, mineralogy and many other fields. Most commercial hyperspectral sensors are able to record spectral information along one spatial dimension in a single acquisition; for the second spatial dimension a scan is required. Besides those systems, there is a novel technique that allows sensing a two-dimensional scene and its spectral information within one shot. This increases the speed of hyperspectral imaging, which is interesting for metrology tasks under rough environmental conditions. In this article we present a detailed characterization of such a snapshot sensor for later use in a snapshot full-field chromatic confocal system. The sensor (Ximea MQ022HG-IM-SM5X5-NIR) is based on the so-called snapshot mosaic technique, which offers 25 bands mapped to one so-called macropixel. The different bands are realized by a spatially repeating pattern of Fabry–Pérot filters. These filters are monolithically fabricated on the camera chip.
In recent years, the cloud has become an attractive execution environment for parallel applications, which introduces novel opportunities for versatile optimizations. Particularly promising in this context is the elasticity characteristic of cloud environments. While elasticity is well established for client-server applications, it is a fundamentally new concept for parallel applications. However, existing elasticity mechanisms for client-server applications can be applied to parallel applications only to a limited extent. Efficient exploitation of elasticity for parallel applications requires novel mechanisms that take into account the particular runtime characteristics and resource requirements of this application type. To tackle this issue, we propose an elasticity description language. This language enables users to define elasticity policies, which specify the elasticity behavior at both the cloud infrastructure level and the application level. Elasticity at the application level is supported by an adequate programming and execution model, as well as abstractions that comply with the dynamic availability of resources. We present the underlying concepts and mechanisms, as well as the architecture and a prototypical implementation. Furthermore, we illustrate the capabilities of our approach through real-world scenarios.
Process quality has reached a high level in mass production, utilizing well-known methods like Design of Experiments (DoE). The drawback of the underlying statistical methods is the need for tests under real production conditions, which cause high costs due to lost output. Research over the last decade has led to methods for correcting a process by using in-situ data to adjust the process parameters, but a lot of pre-production is still necessary to get this working. This paper presents a new approach to improving product quality in process chains by using context data - which in part are gathered by Industry 4.0 devices - to reduce the necessary pre-production.
The article studies a novel approach to inflation modeling in economics. We utilize a stochastic differential equation (SDE) of the form dX_t = a X_t dt + b X_t dB_t^H, where B_t^H is a fractional Brownian motion, in order to model inflationary dynamics. Standard economic models do not capture the stochastic nature of inflation in the Eurozone. Thus, we develop a new stochastic approach and take into consideration fractional Brownian motions as well as Lévy processes. The benefits of these stochastic processes are the modeling of interdependence and jumps, which is equally confirmed by empirical inflation data. The article defines and introduces the rules for stochastic and fractional processes and elucidates the stochastic simulation output.
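A sketch of simulating an SDE of this form is given below; the fractional Gaussian noise is generated via Cholesky factorisation of its exact autocovariance (practical only for short paths), and all parameter values are illustrative, not calibrated to Eurozone inflation data:

```python
import math
import random

def fbm_increments(n, hurst, dt, seed=1):
    """Fractional Gaussian noise via Cholesky factorisation of the
    exact autocovariance (fine for small n; illustrative only)."""
    random.seed(seed)
    gamma = lambda k: 0.5 * (abs(k - 1) ** (2 * hurst)
                             - 2 * abs(k) ** (2 * hurst)
                             + abs(k + 1) ** (2 * hurst))
    cov = [[gamma(i - j) for j in range(n)] for i in range(n)]
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):                       # Cholesky: cov = L L^T
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = (math.sqrt(cov[i][i] - s) if i == j
                       else (cov[i][j] - s) / L[j][j])
    z = [random.gauss(0, 1) for _ in range(n)]
    scale = dt ** hurst                      # self-similarity scaling
    return [scale * sum(L[i][k] * z[k] for k in range(i + 1))
            for i in range(n)]

def simulate_inflation(x0, a, b, hurst, n=50, dt=0.1):
    """Euler scheme for dX_t = a X_t dt + b X_t dB_t^H."""
    db = fbm_increments(n, hurst, dt)
    path = [x0]
    for k in range(n):
        x = path[-1]
        path.append(x + a * x * dt + b * x * db[k])
    return path

path = simulate_inflation(x0=2.0, a=0.05, b=0.1, hurst=0.7)
```

A Hurst parameter above 0.5, as used here, produces positively correlated increments, which is the persistence property that motivates fractional Brownian motion for inflation series.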
Since the global financial crisis of 2008/2009, there has been no challenge to the financial and banking system comparable to that during the Corona crisis.
Weak profitability, unresolved regulatory challenges and increasing competition in the digital sector pose further challenges for banks.
The stability of the financial system and access to financial markets were not at risk during the pandemic. Through joint efforts and better bank capitalisation, the financial system is now more resilient than during the financial crisis.
Provided that grants and loans in the “next generation EU” fund are well targeted for structural reforms and investments in the future, this should boost confidence and growth.
However, further improvements in financial stability, such as increased capital requirements, regulation of shadow banks or reforms in financial supervision, are needed.
This paper studies the impact of financial liquidity on the macro-economy. We extend a classic macroeconomic model and compute numerical simulations. The model confirms that persistently low inflation can occur despite a high degree of financial liquidity due to a reallocation of cash, normal and risk-free bonds. In that regard, our model uncovers an explanation of a flat Phillips curve. Overall, our approach contributes to a rather disregarded matter in macroeconomic theory.
Polycaprolactone (PCL) was electrospun with the addition of arginine (Arg), an α-amino acid that accelerates the healing process. The efficient needleless electrospinning technique was used for the fabrication of the nanofibrous layers. The materials produced consisted mainly of fibers with diameters between 200 and 400 nm. Moreover, both microfibers and beads were present within the layers. Larger bead sizes were observed with the increased addition of arginine.
This study investigates empirically the development of working capital management and its impact on profitability and shareholder value in Germany. We analyse panel data of 115 firms listed on the German Prime Standard, covering the period from 2011 to 2017. The results provide evidence that efficient working capital management, indicated by a shorter cash conversion cycle, deteriorated over time, but that a shorter cash conversion cycle has a positive impact on profitability and shareholder value. The findings highlight the need for managers to give greater priority to working capital optimization, even in a low-interest environment. The paper contributes to the literature by advancing this research area in Germany, and it is the first study to investigate the relationship between shareholder value and working capital management, including all its determinants.
Entrepreneurship education is becoming increasingly important in higher education and also drives the development of innovative teaching formats, which can increase student engagement. It does, however, need greater international focus to become more attractive for both domestic and international students. This paper presents the examination and course design of two case studies, which promote entrepreneurship education for domestic and international students. These examples show that entrepreneurship courses are attractive due to their focus on interdisciplinarity, experience-based learning, and project-based work. Following a design-based research approach, this paper provides a practical contribution by offering a detailed overview of course design principles, classroom practice and presents reflections and learnings from an iterative development process.
Hypothesis
The origin of the negative surface charge at the water/air interface is still not clear. The most probable origin is specific adsorption of OH− ions. From the diffuse layer potential, we can evaluate the surface density of ions in the Stern layer, which serves as a measure of the specific adsorption of ions and determines whether the surface charge is due solely to the specific adsorption of OH− ions.
Experiments
The equilibrium thickness of foam films of pure water and of aqueous solutions of NaCl, HCl, and NaOH was measured as a function of disjoining pressure for water and as a function of concentration for the aqueous solutions at 298.15 K. Quartz-glass cells, thoroughly cleaned and immersed in pure water before use, were used for the measurements.
Findings
Application of a modified Poisson-Boltzmann equation to the equilibrium film thickness gave the diffuse layer potential and the surface density of ions in the Stern layer. From the concentration dependence of the surface density, it was concluded that not only OH− ions but also Cl− ions and HCO3− and/or CO32− ions adsorb specifically at the water/air interface.
Deep learning-based fabric defect detection methods have been widely investigated to improve production efficiency and product quality. Although deep learning-based methods have proved to be powerful tools for classification and segmentation, some key issues remain to be addressed in real applications. Firstly, the actual fabric production conditions of factories necessitate high real-time performance. Moreover, fabric defects, as abnormal samples, are very rare compared with normal samples, which results in data imbalance and makes model training based on deep learning challenging. To solve these problems, an extremely efficient convolutional neural network, Mobile-Unet, is proposed to achieve end-to-end defect segmentation. The median frequency balancing loss function is used to overcome the challenge of sample imbalance. Additionally, Mobile-Unet introduces depth-wise separable convolution, which dramatically reduces the computational cost and model size of the network. It comprises two parts: encoder and decoder. The MobileNetV2 feature extractor is used as the encoder, and five deconvolution layers are added as the decoder. Finally, a softmax layer generates the segmentation mask. The performance of the proposed model has been evaluated on public and self-built fabric datasets. In comparison with other methods, the experimental results demonstrate that the proposed method achieves state-of-the-art segmentation accuracy and detection speed.
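The class weights used by median frequency balancing can be sketched as follows; the pixel counts are invented, and the frequency definition is simplified here (the original formulation normalises by the pixels of images that actually contain each class):

```python
import statistics

def median_frequency_weights(pixel_counts):
    """Median frequency balancing: weight_c = median_freq / freq_c,
    where freq_c is the pixel frequency of class c. Rare classes
    (defects) receive large weights in the loss."""
    total = sum(pixel_counts.values())
    freqs = {c: n / total for c, n in pixel_counts.items()}
    med = statistics.median(freqs.values())
    return {c: med / f for c, f in freqs.items()}

# Fabric segmentation is heavily imbalanced: defect pixels are rare.
weights = median_frequency_weights({"background": 990_000,
                                    "defect": 10_000})
```

Multiplying each class's per-pixel loss term by its weight counteracts the imbalance, so the network is not dominated by the abundant background class.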
The use of learning factories for education in maintenance concepts is limited, despite the important role maintenance plays in the effective operation of organizational assets. A training programme in a learning factory environment is presented where a combination of gamification, classroom training and learning factory applications is used to introduce students to the concepts of maintenance plan development, asset failure characteristics and the costs associated with maintenance decision-making. The programme included a practical task to develop a maintenance plan for different advanced manufacturing machines in a learning factory setting. The programme stretched over a four-day period and demonstrated how learning factories can be effectively utilized to teach management related concepts in an interdisciplinary team context, where participants had no, or very limited, previous exposure to these concepts.
Strong optical mode coupling between two adjacent λ/2 Fabry-Pérot microresonators consisting of three parallel silver mirrors is investigated experimentally and theoretically as a function of their detuning and coupling strength. Mode coupling can be precisely controlled by tuning the mirror spacing of one resonator with respect to the other by piezoelectric actuators. Mode splitting, anti-crossing and asymmetric modal damping are observed and theoretically discussed for the symmetric and antisymmetric supermodes of the coupled system. The spectral profile of the supermodes is obtained from the Fourier transform of the numerically calculated time evolution of the individual resonator modes, taking into account their resonance frequencies, damping and coupling constants, and is in excellent agreement with the experiments. Our microresonator design has potential applications for energy transfer between spatially separated quantum systems in micro optoelectronics and for the emerging field of polaritonic chemistry.
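The supermode behaviour described above can be reproduced with a minimal coupled-mode model: the complex eigenfrequencies of two coupled lossy modes are the eigenvalues of a 2×2 non-Hermitian matrix. This is a generic textbook-style sketch, not the paper's numerical treatment; the symbols w, kappa and g (resonance frequency, damping rate, coupling constant) are my own notation.

```python
import numpy as np

def supermodes(w1, w2, kappa1, kappa2, g):
    """Complex eigenfrequencies of two coupled lossy resonator modes.

    Real part: supermode resonance frequency; -imag part: damping rate.
    Returned sorted by resonance frequency (lower supermode first).
    """
    m = np.array([[w1 - 1j * kappa1, g],
                  [g,                w2 - 1j * kappa2]])
    return sorted(np.linalg.eigvals(m), key=lambda z: z.real)
```

On resonance (w1 = w2) the two supermodes split by 2g (mode splitting); sweeping the detuning w1 - w2 traces out the anti-crossing, and unequal damping rates kappa1 ≠ kappa2 yield the asymmetric modal damping of the symmetric and antisymmetric supermodes.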
In Germany, mobility is currently in a state of flux. Since June 2019, electric kick scooters (e-scooters) have been permitted on public roads, and the market is booming. This study employs a user survey to generate new data, supplemented by expert interviews, to determine whether e-scooters are a climate-friendly means of transport. The environmental impacts are quantified using a life cycle assessment, yielding a very accurate picture of e-scooters in Germany. The global warming potential of an e-scooter calculated in this study is 165 g CO2-eq./km, mostly due to materials and production (which together account for 73% of the impact). By switching to e-scooters with swappable batteries, the global warming potential can be reduced by 12%. The lowest value of 46 g CO2-eq./km is reached if all possibilities are exploited and the life span of e-scooters is increased to 15 months. Compared with the emissions of the replaced modal split, e-scooters are at best 8% above the modal split value of 39 g CO2-eq./km.
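The per-kilometre figures above follow from amortizing the fixed production burden over the vehicle's lifetime mileage and adding the distance-dependent operation share. A minimal sketch of that arithmetic (all numbers in the test are illustrative placeholders, not values from the study):

```python
def gwp_per_km(production_g_co2e, lifetime_km, operation_g_per_km):
    """Life-cycle global warming potential per kilometre.

    Amortize the one-off production burden (g CO2-eq.) over the lifetime
    distance and add the distance-dependent operation share (g CO2-eq./km).
    """
    return production_g_co2e / lifetime_km + operation_g_per_km
```

Extending the service life raises the lifetime distance and thereby shrinks the dominant production term, which is why the study's 15-month life-span scenario reaches the lowest per-kilometre value.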
High Performance Computing (HPC) enables significant progress in both science and industry. Whereas parallel applications have traditionally been developed to address the grand challenges in science, today they are also heavily used to speed up the time-to-result in the context of product design, production planning, financial risk management, medical diagnosis, as well as research and development efforts. However, purchasing and operating HPC clusters to run these applications requires huge capital expenditures as well as operational knowledge and is thus reserved for large organizations that benefit from economies of scale. More recently, the cloud has evolved into an alternative execution environment for parallel applications, which comes with novel characteristics such as on-demand access to compute resources, pay-per-use, and elasticity. Whereas the cloud has mainly been used to operate interactive multi-tier applications, HPC users are also interested in the benefits it offers. These include full control of the resource configuration based on virtualization, fast setup times through on-demand accessible compute resources, and the elimination of upfront capital expenditures thanks to the pay-per-use billing model. Additionally, elasticity allows compute resources to be provisioned and decommissioned at runtime, enabling fine-grained control of an application's performance in terms of its execution time and efficiency as well as the related monetary costs of the computation. Whereas HPC-optimized cloud environments have been introduced by cloud providers such as Amazon Web Services (AWS) and Microsoft Azure, existing parallel architectures are not designed to make use of elasticity. This thesis addresses several challenges in the emergent field of High Performance Cloud Computing. In particular, the presented contributions focus on the novel opportunities and challenges related to elasticity.
First, the principles of elastic parallel systems as well as related design considerations are discussed in detail. On this basis, two exemplary elastic parallel system architectures are presented, each of which includes (1) an elasticity controller that controls the number of processing units based on user-defined goals, (2) a cloud-aware parallel execution model that handles coordination and synchronization requirements in an automated manner, and (3) a programming abstraction to ease the implementation of elastic parallel applications. To automate application delivery and deployment, novel approaches are presented that automatically generate the required deployment artifacts from developer-provided source code while considering application-specific non-functional requirements. Throughout this thesis, a broad spectrum of design decisions related to the construction of elastic parallel system architectures is discussed, including proactive and reactive elasticity control mechanisms as well as cloud-based parallel processing with virtual machines (Infrastructure as a Service) and functions (Function as a Service). These contributions are assessed by means of extensive experimental evaluations.
Elasticity is considered to be the most beneficial characteristic of cloud environments, distinguishing the cloud from clusters and grids. Whereas elasticity has become mainstream for web-based, interactive applications, leveraging elasticity for applications from the high-performance computing (HPC) domain, which heavily rely on efficient parallel processing techniques, remains a major research challenge. In this work, we specifically address the challenges of elasticity for parallel tree search applications. Well-known meta-algorithms based on this parallel processing technique include branch-and-bound and backtracking search. We show that their characteristics render static resource provisioning inappropriate and make elastic scaling desirable. Moreover, we discuss how to construct an elasticity controller that reasons about the scaling behavior of a parallel system at runtime and dynamically adapts the number of processing units according to user-defined cost and efficiency thresholds. We evaluate a prototypical elasticity controller based on our findings by employing several benchmarks for parallel tree search and discuss the applicability of the proposed approach. Our experimental results show that, by means of elastic scaling, the performance can be controlled according to user-defined thresholds, which cannot be achieved with static resource provisioning.
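A reactive controller of the kind described can be sketched as a threshold rule on measured parallel efficiency and monetary cost rate. The thresholds, the one-unit step size and all parameter names below are assumptions for illustration, not the evaluated prototype:

```python
def adapt_units(n_units, speedup, cost_rate,
                eff_low=0.6, eff_high=0.85, budget_rate=12.0):
    """One reactive scaling decision for an elastic parallel run.

    speedup:    measured speedup relative to one processing unit
    cost_rate:  current monetary cost per hour for n_units
    Thresholds (illustrative): scale in below eff_low or over budget,
    scale out above eff_high while within budget, else hold.
    Returns the new number of processing units.
    """
    efficiency = speedup / n_units
    if efficiency < eff_low or cost_rate > budget_rate:
        return max(1, n_units - 1)   # diminishing returns: release a unit
    if efficiency > eff_high and cost_rate < budget_rate:
        return n_units + 1           # still scaling well: add a unit
    return n_units                   # within both thresholds: hold steady
```

Invoked periodically against runtime measurements, such a rule keeps execution between the user-defined efficiency and cost thresholds, which a statically provisioned resource set cannot do when the workload of a tree search is irregular.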
Cloud resources can be dynamically provisioned according to application-specific requirements and are paid for on a per-use basis. This gives rise to a new concept for parallel processing: elastic parallel computations. However, it is still an open research question to what extent parallel applications can benefit from elastic scaling, which requires resource adaptation at runtime and corresponding coordination mechanisms. In this work, we analyze how to address these system-level challenges in the context of developing and operating elastic parallel tree search applications. Based on our findings, we discuss the design and implementation of TASKWORK, a cloud-aware runtime system specifically designed for elastic parallel tree search, which enables the implementation of elastic applications by means of higher-level development frameworks. We show how to implement an elastic parallel branch-and-bound application based on an exemplary development framework and report on our experimental evaluation, which also considers several benchmarks for parallel tree search.
Azide-bearing cell-derived extracellular matrices (“clickECMs”) have emerged as a highly exciting new class of biomaterials. They conserve substantial characteristics of the natural extracellular matrix (ECM) and simultaneously offer small abiotic functional groups that enable bioorthogonal bioconjugation reactions. Despite their attractiveness, investigation of their biomolecular composition is very challenging due to the insoluble and highly complex nature of cell-derived matrices (CDMs). Yet, thorough qualitative and quantitative analysis of the overall material composition, organisation, localisation, and distribution of typical ECM-specific biomolecules is essential for the consistent advancement of CDMs and the understanding of the prospective functions of the developed biomaterial. In this study, we evaluated frequently used methods for the analysis of complex CDMs. Sodium dodecyl sulphate polyacrylamide gel electrophoresis (SDS-PAGE) and (immuno)histochemical staining methods in combination with several microscopic techniques were found to be highly suitable. Commercially available colorimetric protein assays turned out to deliver inaccurate information on CDMs. In contrast, we determined the nitrogen content of CDMs by elemental analysis and converted it into total protein content using conversion factors calculated from matching amino acid compositions. The amount of insoluble collagens was assessed based on the hydroxyproline content. The Sircol™ assay was identified as a suitable method to quantify soluble collagens, while the Blyscan™ assay was found to be well-suited for the quantification of sulphated glycosaminoglycans (sGAGs). Eventually, we propose a series of suitable methods to reliably characterise the biomolecular composition of fibroblast-derived clickECM.
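The nitrogen-to-protein conversion mentioned above reduces to multiplying the measured nitrogen fraction by a composition-specific factor. The sketch below uses the classical generic factor 6.25 as a placeholder; the study instead derived sample-specific factors from matching amino acid compositions.

```python
def protein_from_nitrogen(nitrogen_mass_fraction, conversion_factor=6.25):
    """Estimate total protein mass fraction from elemental nitrogen content.

    6.25 is the classical generic Kjeldahl-style factor (assumes proteins
    contain roughly 16 % nitrogen); the study computed matrix-specific
    factors from amino acid compositions rather than using this default.
    """
    return nitrogen_mass_fraction * conversion_factor
```

A matrix-specific factor matters because the amino acid composition of collagen-rich CDMs deviates from the average protein assumed by the generic value.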
In recent years, the development and application of decellularized extracellular matrices (ECMs) for use as biomaterials have grown rapidly. These cell-derived matrices (CDMs) represent highly bioactive and biocompatible materials consisting of a complex assembly of biomolecules. Even though CDMs mimic the natural microenvironment of cells in vivo very closely, they still lack specifically addressable functional groups, which are often required to tailor a biomaterial functionality by bioconjugation. To overcome this limitation, metabolic glycoengineering has emerged as a powerful tool to equip CDMs with chemical groups such as azides. These small chemical handles are known for their ability to undergo bioorthogonal click reactions, which represent a desirable reaction type for bioconjugation. However, ECM insolubility makes its processing very challenging. In this contribution, we isolated both the unmodified ECM and azide-modified clickECM by osmotic lysis. In a first step, these matrices were concentrated to remove excessive water from the decellularization step. Next, the hydrogel-like ECM and clickECM films were mechanically fragmentized, resulting in easy-to-pipette suspensions with fragment sizes ranging from 7.62 to 31.29 μm (as indicated by the mean d90 and d10 values). The biomolecular composition was not impaired, as proven by immunohistochemistry. The suspensions were used for the reproducible generation of surface coatings, which proved to be homogeneous in terms of ECM fragment sizes and coating thicknesses (the mean coating thickness was found to be 33.2 ± 7.3 μm). Furthermore, they were stable against fluid-mechanical abrasion in a laminar flow cell. When primary human fibroblasts were cultured on the coated substrates, an increased bioactivity was observed. By conjugating the azides within the clickECM coatings with alkyne-coupled biotin molecules, a bioconjugation platform was obtained that exploits the biotin–streptavidin interaction. Its applicability was demonstrated by equipping the bioactive clickECM coatings with horseradish peroxidase as a model enzyme.
Heat pumps in combination with a photovoltaic system are a very promising option for the transformation of the energy system. By using such a system for coupling the electricity and heat sectors, buildings can be heated sustainably and with low greenhouse gas emissions. This paper presents a method for dimensioning a suitable system of heat pump and photovoltaics (PV) for residential buildings in order to achieve a high level of PV self-consumption. This is accomplished by utilizing a thermal energy storage (TES) to shift the operation of the heat pump to times of high PV power production by means of an intelligent control algorithm, which yields a high portion of PV power directly utilized by the heat pump. In order to cover the existing building stock, four reference buildings with different years of construction are introduced for both single- and multi-family residential buildings. In this way, older buildings with radiator heating as well as new buildings with floor heating systems are included. The simulations for evaluating the performance of a heat pump/PV system controlled by the novel algorithm for each type of building were carried out in MATLAB-Simulink® 2017a. The results show that 25.3% up to 41.0% of a building's electricity consumption, including the heat pump, can be covered directly from the PV installation per year. Evidently, the characteristics of the heating system significantly influence the results: new buildings with floor heating and low supply temperatures yield a higher level of PV self-consumption due to a higher efficiency of the heat pump compared to buildings with radiator heating and higher supply temperatures. In addition, the effect of adding a battery to the system was studied for two building types. The results show that the degree of PV self-consumption increases when a battery is present. However, due to the high investment costs of batteries, they do not pay off within a reasonable period.
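The control idea, shifting heat pump operation into PV-rich hours via the thermal store, can be sketched as a greedy schedule over an hourly surplus forecast. All parameter names and numbers below are illustrative assumptions; the paper's MATLAB-Simulink controller is considerably more elaborate:

```python
def schedule_heat_pump(pv_surplus_kw, heat_demand_kwh, hp_el_kw=3.0, cop=3.5):
    """Greedily pick the hours with the largest PV surplus and run the heat
    pump there until the thermal store covers the day's heat demand.

    pv_surplus_kw:   forecast PV surplus per hour (kW), one entry per hour
    heat_demand_kwh: thermal energy (kWh) the store must hold for the day
    hp_el_kw, cop:   illustrative heat pump electrical power and COP
    Returns (sorted list of on-hours, heat charged into the store in kWh).
    """
    ranked = sorted(range(len(pv_surplus_kw)),
                    key=lambda h: pv_surplus_kw[h], reverse=True)
    on_hours, stored = [], 0.0
    for h in ranked:
        if stored >= heat_demand_kwh:
            break
        on_hours.append(h)
        stored += hp_el_kw * cop  # heat delivered to the store in one hour
    return sorted(on_hours), stored
```

Because each heat pump hour in a sunny slot is powered largely by on-site PV, concentrating operation there raises the directly self-consumed share; buildings with floor heating benefit twice, since their lower supply temperatures raise the COP and thus the heat stored per PV-powered hour.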
Customer orientation should be the core engine of every organisation. Information technology can be considered the enabler for generating competitive advantages through customer processes in marketing, sales and service. At the same time, information technology represents both the biggest risk and a huge opportunity for any organisation. Research shows that Customer Relationship Management (CRM) enables organisations to perform better and focus more on their customers (e.g. the market capitalisation of Amazon). While global enterprises are shaping the future of customer centricity and information technology, the question arises how German B2B organisations can shift their value contribution from product-centric to customer-centric. These organisations are therefore attempting to implement CRM software and to put their customers more into focus. However, the question remains how organisations approach the implementation of CRM and whether these attempts pay off in terms of business performance.
Contributing to this highly topical discussion, this thesis adds to the body of knowledge on the implementation of CRM in the German B2B sector and its impact on business performance. First, theoretical frameworks were developed based on an extensive literature review. Different aspects of CRM are worked out and mapped against three dimensions of business performance, namely process efficiency, customer satisfaction and financial performance. Based on the theory, a conceptual framework was developed to test the relationships between CRM and business performance (BP). To this end, a survey with 500 participants was conducted, on the basis of which a measurement model was developed to test five main hypotheses.
The findings suggest that the implementation of CRM positively impacts business performance. Specifically, the usage of analytical CRM and the establishment of a dedicated CRM success measurement correlate with the performance of German B2B organisations. In addition to these main findings, various key statements could be derived from the research, and a measurement model was developed that can be used to assess BP across different organisational characteristics. As a result, CRM implementations can be enhanced and business performance improved.
Customer orientation should be the core engine of every organisation, while IT can be considered the enabler to generate competitive advantages along customer processes in marketing, sales and service. Research shows that customer relationship management (CRM) enables organisations to perform better, and experience indicates that organisations that focus on customer orientation are more successful. With marketplace organisations such as Amazon, Alibaba or Conrad shaping the future of customer centricity and information technology, German B2B organisations need to shift their value contribution from product-centric to customer-centric. While these organisations are currently attempting to implement CRM software and putting their customers more into focus, the question remains how organisations are approaching the implementation of CRM and whether these attempts are paying off in terms of business performance.
Documentation of clinical processes, especially in the perioperative area, is a basic requirement for quality of service. Nonetheless, documentation is a burden for the medical staff, since it distracts from the clinical core process. An intuitive and user-friendly documentation system could increase documentation quality and reduce the documentation workload. The optimal system would know what happened, so that the person documenting a step would only need a single “confirm” button. In many cases, such a linear flow of activities is given as long as only one profession (e.g. anaesthesiology, scrub nurse) is considered, but even then there might be deviations from the linear process flow, and further interaction is required.
Whether diversity enhances or impedes team creativity remains an issue of scholarly debate. Explanations of this ambiguity often lie in how diversity is both operationalized and measured. Eschewing the popular approach of using differences in objective criteria to signal diversity, a deep-level approach that focuses on differences in personal values is taken in this study. Value diversity is measured in the two forms of variety and separation and their associations with team creativity are explored. The investigation is augmented by considering the mediating role of team communication in these associations. The analysis was conducted on a sample of 98 teams, using both subjective and objective measures. The findings reveal that when considering value diversity in terms of variety, there is a positive association between diversity and team creativity. However, when the separation dimension of value diversity is considered, a negative association between diversity and team creativity is identified. Complex pathways pertaining to the role of communication within these relationships are also uncovered. In moving beyond rudimentary categories and measurement of diversity, this study further elucidates the complexity of the diversity–creativity relationship. Conclusions are drawn and implications for further research and managerial practice are derived.
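The two diversity forms examined above map onto standard operationalizations in the diversity literature: variety is commonly captured by Blau's index over category memberships, separation by the spread of members' positions on a value continuum. The sketch below follows that convention as an illustration; it is not taken from the study's measurement model.

```python
import statistics
from collections import Counter

def variety(categories):
    """Blau's index: 1 - sum of squared category proportions.

    0 means all team members share one category; values approach 1 as
    members spread evenly over many categories.
    """
    n = len(categories)
    return 1.0 - sum((c / n) ** 2 for c in Counter(categories).values())

def separation(value_scores):
    """Spread of members' positions on a value continuum (population SD).

    0 means full agreement; large values indicate polarized positions.
    """
    return statistics.pstdev(value_scores)
```

The study's contrasting findings fit this distinction: a team can score high on variety (many different value categories represented) while scoring low on separation (positions cluster together), and the two indices can relate to creativity in opposite directions.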