Today's pattern making methods for industrial purposes include construction principles based on mathematical formulas and sizing charts. The result is two-dimensional flats that can be converted into a three-dimensional garment. Because of their high linearity, such patterns are incapable of recreating the complexity of the human body, which results in insufficient fit. Subsequent changes to the pattern require a high degree of experience and lead to an inefficient product development process. Draping is known to allow the development of more complex and demanding patterns that correspond more closely to the actual body shape. This method is therefore used in custom tailoring and haute couture to achieve perfect garment fit, but it is also time-consuming.
There is thus a strong incentive to improve the fit of garments and to speed up production while maintaining good value for money. Reutlingen University is therefore working on the development of 3D-modelled body shapes for 3D draping, considering different layers of clothing such as jackets or coats. For this purpose, 3D modelling is used to develop 3D bodies that correspond to the finished dimensions of the garment. By flattening the modelled body, it is then possible to obtain an optimal 2D pattern of the body. The conventional method and the developed method are compared by means of 3D simulation.
Finally, the visual fit test on the simulated basic cuts demonstrates that significantly better body wrapping is achieved with the newly developed methodology. Unlike the basic cuts created according to classical design principles, only a few adjustments are necessary to obtain an optimized basic cut. The analysis of body distance also shows that the newly developed basic patterns provide a more even enclosure of the body.
The production of customized bras is highly challenging. Although the need is very clear, the lingerie industry currently faces a lack of data, knowledge, and expertise for realizing an automated process chain. Various studies and surveys have shown that the majority of women wear an incorrect bra size. In addition to aesthetic problems, this can cause health risks for the wearers such as headaches, back problems, or digestive problems. An important prerequisite for improvement is basic knowledge about the female breast, both in terms of body measurements and of different breast shapes. The current sizing system for bras defines a bra size only by the relation between bust girth and underbust girth, and standardized cup forms do not do justice to the high variability of the human body. Since the bra type shapes the female breast, basic knowledge about the relation between measurements and shapes of the clothed and the unclothed breast is missing.
In the present project, studies are conducted to explore the female breast and to derive new breast-specific body measurements, different breast shapes and deformation knowledge using existing bras.
Furthermore, an innovative process is being developed that leads from 3D scanning to individual and interactive pattern construction, which allows an automatic pattern creation based on individual body measurements and the influence of different material parameters.
In the course of the presentation, the current project status will be shown and the future developments and project steps will be introduced.
In contrast to classical advertising, event marketing is a dynamic communication instrument that continuously brings forth trends and innovations. The diverse applications and potentials of event marketing make it possible to reach relevant target groups in line with the current zeitgeist, to generate brand-relevant realities and experiential worlds, to create emotions and sympathy, and in this way to establish a bond between the brand or company and its recipients.
Right on target from ambush
(2020)
Ambush marketing usually triggers strong reactions, from supporters and opponents alike. The idea of ambush marketing is to profit from the successes of sponsorship without taking on the obligations of an official sponsor. Ambushers hold no marketing rights to an event but nevertheless build a connection to it through their marketing activities. The line between infringing sponsors' rights and creative, innovative communication policy is often very fine.
Is there a buy button in the consumer's brain? And if so, how is it pressed? Neuromarketing may provide the answers to these questions. Neuromarketing is part of neuroeconomics and a relatively young discipline at the intersection of cognitive science, neuroscience, and market research. Thanks to technological progress, the neurosciences can deliver important insights for marketing, in particular for explaining consumer behavior. By looking into the customer's brain, retail companies, for example, can address their customers in a more targeted way and thus gain an advantage over competitors.
This article studies the current debate on Coronabonds and the idea of European public debt in the aftermath of the Corona pandemic. According to the EU Treaty, economic and fiscal policy remains within the sovereignty of the Member States. Joint European debt instruments are therefore risky and trigger moral hazard and free-riding in the Eurozone. We show that mixing the principles of liability and control impairs the present fiscal architecture and destabilizes the Eurozone. We recommend that Member States either utilize the existing fiscal architecture or establish a political union with full sovereignty in Europe. This policy conclusion is supported by the PSPP judgement of the Federal Constitutional Court of Germany of 5 May 2020, a ruling that initiated a lively debate about the future of the Eurozone and of Europe in general.
Purpose
Despite growing interest in the intersection of supply chain management (SCM) and management accounting (MA) in the academic debate, there is a lack of understanding regarding both the content and the delimitation of this topic. As of today, no common conceptualization of supply chain management accounting (SCMA) exists. The purpose of this study is to provide an overview of the research foci of SCMA in the scholarly debate of the past two decades. Additionally, it analyzes whether and to what extent the academic discourse on MA in SCs has already found its way into both SCM and MA higher education.
Design/methodology/approach
A content analysis of 114 higher education textbooks written in English or German is conducted.
Findings
The study finds that SC-specific concepts of MA are seldom covered in current textbooks of both disciplines. The authors conclude that although there is an extensive body of scholarly research about SCMA concepts, there is a significant discrepancy with what is taught in higher education textbooks.
Practical implications
There is a large discrepancy between the extensive knowledge available in scholarly research and what we teach in both disciplines. This implies that graduates of both disciplines lack important knowledge and skills in controlling and accounting for SCs. To bring about the necessary change, MA and SCM in higher education must be more integrative.
Originality/value
To the best of the authors' knowledge, this study is the first of its kind comprising a large textbook sample in both English and German. It is the first substantiated assessment of the current state of integration between SCM and MA in higher education.
In Germany, mobility is currently in a state of flux. Since June 2019, electric kick scooters (e-scooters) have been permitted on the roads, and this market is booming. This study employs a user survey to generate new data, supplemented by expert interviews, to determine whether such e-scooters are a climate-friendly means of transport. The environmental impacts are quantified using a life cycle assessment, resulting in a very accurate picture of e-scooters in Germany. The global warming potential of an e-scooter calculated in this study is 165 g CO2-eq./km, mostly due to materials and production (together accounting for 73% of the impact). By switching to e-scooters with swappable batteries, the global warming potential can be reduced by 12%. The lowest value of 46 g CO2-eq./km is reached if all possibilities are exploited and the life span of e-scooters is increased to 15 months. Compared with the emissions of the replaced modal split, e-scooters are at best 8% above the modal-split value of 39 g CO2-eq./km.
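The per-kilometre figure follows from a simple life cycle allocation: the fixed emissions from material and production are spread over the distance travelled during the scooter's life span, and per-kilometre use-phase emissions are added. The sketch below uses hypothetical inventory numbers, not the study's data, to illustrate why extending the life span lowers the result so strongly:

```python
# Illustrative life cycle allocation of an e-scooter's global warming
# potential (GWP). All inventory numbers are hypothetical placeholders,
# not the study's actual life cycle inventory.

def gwp_per_km(production_g: float, lifetime_km: float,
               use_phase_g_per_km: float) -> float:
    """Spread fixed production emissions over the lifetime distance and
    add per-km use-phase emissions (charging, collection logistics, ...)."""
    return production_g / lifetime_km + use_phase_g_per_km

# Hypothetical scenario: 200 kg CO2-eq for material and production,
# 1500 km driven over the scooter's life, 30 g/km in the use phase.
baseline = gwp_per_km(200_000, 1_500, 30)      # ~163 g CO2-eq./km

# A longer life span dilutes the dominant fixed production share.
longer_life = gwp_per_km(200_000, 4_500, 30)   # ~74 g CO2-eq./km
```

Because the production share dominates, measures that extend service life or reuse the hardware (such as battery swapping) have a far larger effect than use-phase improvements.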
In this work, a brushless harmonic-excited wound-rotor synchronous machine that utilizes special stator and rotor windings is investigated. The windings magnetically decouple the fundamental torque-producing field from the harmonic field required for the inductive power transfer to the field coil. In contrast to conventional harmonic-excited synchronous machines, the whole winding is utilized for both torque production and harmonic excitation, so no additional copper for auxiliary windings is needed. Different rotor topologies using rotating power electronic components are investigated, and their efficiencies are compared based on finite-element calculations and circuit analysis.
Energy-efficient control of electric drives is increasingly important for electric mobility and the manufacturing industries. Online dynamic optimization of induction machines is challenging due to the computational complexity involved and the variable power losses during dynamic operation. This paper proposes a simple technique for sub-optimal online loss minimization using rotor flux linkage templates for energy-efficient dynamic operation of induction machines. Such a template is a rotor flux linkage trajectory that is optimal for a specific scenario and is calculated in an offline optimization process. During real-time operation, the rotor flux linkage for a specific scenario is obtained by appropriately scaling the given template.
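The template idea can be sketched as follows: one offline-optimized flux trajectory is stored for a reference scenario, and at run time it is rescaled to the current torque demand. The square-root scaling law below is an illustrative assumption (motivated by the common steady-state heuristic that optimal flux grows roughly with the square root of torque), not the paper's exact rule:

```python
import numpy as np

# Offline-computed rotor flux linkage template: the loss-optimal flux
# trajectory (in Wb) for one reference torque step. Hypothetical data.
template_flux = np.array([0.40, 0.55, 0.70, 0.80, 0.85, 0.85])
template_torque = 20.0  # N*m, scenario the template was optimized for

def scaled_flux(template: np.ndarray, t_ref: float, t_now: float) -> np.ndarray:
    """Rescale the stored template to the current torque demand.
    Assumes flux ~ sqrt(torque); the paper's scaling may differ."""
    return template * np.sqrt(t_now / t_ref)

# Online: a 45 N*m transient reuses the same trajectory shape, scaled up,
# avoiding a costly online re-optimization.
flux_now = scaled_flux(template_flux, template_torque, 45.0)
```

The expensive optimization thus runs once offline, while the online step reduces to a cheap element-wise scaling.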
Steady-state efficiency optimization techniques for induction motors are state of the art, and various methods have already been developed. This paper provides new insights into efficiency-optimized operation in the dynamic regime. It proposes an anticipative flux modification to decrease losses during torque and speed transients. These trajectories are analyzed in a numerical study for different motors, and measurement results for one motor are given as well.
The aim of this work was to investigate mean fill weight control of a continuous capsule-filling process and whether controller settings can be derived from an accompanying process model. To that end, a system composed of a fully automated capsule filler and an online gravimetric scale was used to control the fill weight. This setup makes it possible to examine challenges associated with continuous manufacturing processes, such as variations in the amount of active pharmaceutical ingredient (API) in the mixture due to feeder fluctuations or altered excipient batch qualities. Two types of controllers were investigated: a feedback control and a combination of feedback and feedforward control. Although both are common in industry, determining the optimal parameter settings remains an issue. In this study, we developed a method to derive the control parameters from process models in order to obtain optimal control for each filled product. Determined via rapid automated process development (RAPD), the parameters can be obtained effectively and quickly. The method allowed us to optimize the weight control for three pharmaceutical excipients. By conducting experiments, we verified the feasibility of the proposed method and studied the dynamics of the controlled system. Our work provides important basic data on how capsule fillers can be integrated into continuous manufacturing systems.
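The combination of the two controller types can be illustrated with a toy example: the feedback path (here a PI law) corrects deviations measured by the gravimetric scale, while the feedforward path pre-compensates a measurable upstream disturbance. The gains and structure below are hypothetical placeholders; the paper derives its parameters from an identified process model:

```python
# Toy sketch of combined feedback (PI) and feedforward control of
# capsule fill weight. Gains, units, and the disturbance model are
# hypothetical; they stand in for the model-derived parameters.

class FillWeightController:
    def __init__(self, kp: float, ki: float, kff: float, target_mg: float):
        self.kp, self.ki, self.kff = kp, ki, kff
        self.target = target_mg
        self.integral = 0.0

    def update(self, measured_mg: float, feed_disturbance: float) -> float:
        """Return a correction to the dosing setting.
        feed_disturbance is a measurable upstream deviation (e.g. a
        feeder fluctuation) handled by the feedforward path."""
        error = self.target - measured_mg          # feedback on scale reading
        self.integral += error
        feedback = self.kp * error + self.ki * self.integral
        feedforward = -self.kff * feed_disturbance  # cancel the known upset
        return feedback + feedforward

ctrl = FillWeightController(kp=0.4, ki=0.05, kff=1.0, target_mg=250.0)
correction = ctrl.update(measured_mg=245.0, feed_disturbance=0.0)
```

The feedforward term acts before the error reaches the scale, while the integral term removes the remaining steady-state offset.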
Most antimicrobial peptides (AMPs) and their synthetic mimics (SMAMPs) are thought to act by permeabilizing cell membranes. For antimicrobial therapy, selectivity for pathogens over mammalian cells is a key requirement. Understanding membrane selectivity is thus essential for designing AMPs and SMAMPs to complement classical antibiotics in the future. This study focuses on membrane permeabilization induced by SMAMPs and their selectivity for membranes with different lipid compositions. We measure release and fluorescence lifetime of a self-quenching dye in lipid vesicles. Apart from the dose-response, we quantify the strength of individual leakage events, and, employing cumulative kinetics, categorize permeabilization behavior. We propose that differing selectivities in a series of SMAMPs arise from a combination of the effect of the antimicrobial agent and the susceptibility of the membrane (with a given lipid composition) for certain types of leakage behavior. The unselective and hemolytic SMAMP is found to act mainly by the asymmetry stress mechanism, mediated by hydrophobic insertion of SMAMPs into lipid layers. The more selective SMAMPs induced leakage events occurring stochastically over several hours. Lipid intrinsic properties might additionally amplify the efficiency of leakage events. Leakage behavior changes with both the design of the SMAMP and the lipid composition of the membrane. Understanding how leakage behavior contributes to the selectivity and activity of antimicrobial agents will aid the design and screening of antimicrobials. An understanding of the underlying processes facilitates the comparison of membrane permeabilization across in vitro and in vivo assays.
Digitalization is changing manufacturing dramatically. In view of employees' demands, global trends, and the technological vision of future factories, automotive manufacturing faces a large number of diverse challenges. Current research focuses on the technological aspects of future factories in terms of digitalization; new ways of working and new organizational models for future factories have not yet been described. There are assumptions on how to develop the organization of work in a future factory, but to date the literature shows deficits in scientifically substantiated answers in this research area. Consequently, the objective of this paper is to present an approach to work organization design for automotive Industry 4.0 manufacturing. Future requirements were analyzed and distilled into criteria that determine future agile organization design. These criteria were then transformed into functional mechanisms, which define the approach for shopfloor organization design.
In spite of many studies, knowledge about the fundamental factors influencing adhesion between addition curing silicones and aluminum substrates is very limited. The aim of this publication is to evaluate the influence of the formulation and of the surface state of the adherend on bond strength. For this purpose, the composition of an addition curing silicone was systematically varied and the effects on both material and bond properties were examined. Additionally, the influence of surface aging at different humidities (0% r. h., 34% r. h., 82% r. h.) of acid-etch pretreated aluminum substrates was considered. It is shown that the mechanical properties of the silicone material can be easily adjusted over a wide range by changing the formulation. Although high tensile strengths of up to 9.2 MPa can be achieved for the silicone material, lap-shear strengths remain moderate at approximately 3.5 MPa. Predominant adhesive failures show that the basic formulation without additives reaches only a limited adhesive strength, so adhesion promoters are required to further improve adhesion. The humidity at which the pretreated substrates are stored has an overall minor influence on bond strength. Surprisingly, bond strength tends to increase with the storage time of the aluminum substrates despite lower surface energies compared to freshly pretreated substrates. All in all, the storage conditions of the aluminum had a rather small influence on adhesion, whereas the composition of the silicone adhesive strongly influences bond strength.
In addition to increased safety through the detection of possible overload, continuous component monitoring by sensor integration makes the use of fiber reinforced plastics more cost-effective: since the components are continuously monitored, one can switch from time-based to condition-based maintenance. However, the integration of conventional sensor components creates weak points, as foreign objects are inserted into the reinforcing structure. In this paper, we examine the use of the textile reinforcement as a sensor in itself. We describe how bending sensors can be formed by slightly modifying the composite's reinforcement structure. Two different sensor principles were investigated: (1) the integration of textile plate capacitors into the structure; (2) the construction of textile piezo elements as part of the reinforcing structure. The bending test results reveal that textile plate capacitors show a load-dependent signal output. The samples with textile piezo elements show a significant increase in signal strength.
The motto of the Informatics Inside 2020 autumn conference is KInside. Once again, students look inside and take a closer look at methods, applications, and interrelations. The contributions are diverse and, in keeping with the degree program, human-centered. The aspiration is that the topics revolve around people's needs and that the methods employed are not an end in themselves but are measured by their benefit to people.
In the spring of 1817, Friedrich List, then a professor at the University of Tübingen, traveled to Frankfurt am Main, where the famous Easter Fair was taking place at the time. There he met the leaders of the merchants, who complained that the tentative economic development was suffering badly under the many customs barriers and the cheap imports from England. They therefore demanded the abolition of internal tariffs and the formation of an economic union. On behalf of the merchants, List wrote his now-famous petition to the Federal Assembly, the loose representative body of the German Confederation in Frankfurt. When the petition was received with great applause, List, elated by his success, spontaneously founded the "Allgemeiner Deutscher Handels- und Gewerbsverein", the first interest group of German merchants. He thereby laid the foundation for the political process leading to the founding of the Zollverein (customs union) of 1834, which in turn was the precursor to the founding of the German Reich in 1871. List's demands of that time are highly topical again today.
Exogenous factors of influence on exhaled breath analysis by ion-mobility spectrometry (MCC/IMS)
(2019)
The interpretation of exhaled breath analysis needs to address the influence of exogenous factors, especially the transfer of confounding analytes by the test persons. A test person who was exposed to a disinfectant underwent exhaled breath analysis by MCC/IMS (Bioscout®) after different time intervals. Additionally, a new sampling method with inhalation of synthetic air before breath analysis was tested. After exposure to the disinfectant, 3-pentanone monomer, 3-pentanone dimer, hexanal, 3-pentanone trimer, 2-propanamine, 1-propanol, benzene, and nonanal showed significantly higher intensities in exhaled breath and in the air of the examination room compared to the corresponding baseline measurements. Only one ingredient of the disinfectant (1-propanol) was identical to the 8 analytes. Prolonging the time interval between exposure and breath analysis led to a decrease of their intensities; however, the half-time of the decrease differed between analytes. The inhalation of synthetic air reduced the exogenous and also relevant endogenous analytes more than consistently airing the examination room with fresh air did, leading to a reduction and even a change of polarity of the alveolar gradient. The interpretation of exhaled breath requires further knowledge about the former whereabouts of the proband and about the likelihood and relevance of the inhalation of local, site-specific, and confounding exogenous analytes. Their inhalation facilitates a transfer into the examination room and the detection of high concentrations in room air and exhaled breath, but also the exhalation of new analytes. This may lead to a misinterpretation of these analytes as endogenous or disease-specific ones.
Standardisation of breath sampling is important for the application of breath analysis in clinical settings. By studying the effect of room airing on indoor and breath analytes and by generating time series of room air with different sampling intervals, we sought to gain further insights into room air metabolism, to determine the relevance of exogenous VOCs, and to draw conclusions about their consideration in the interpretation of exhaled breath. Room air and exhaled breath of a healthy subject were analysed before and after room airing. Furthermore, a time series of room air with doors and windows closed was taken over 84 h by automatic sampling every 180 min. A second time series studied room air analytes over 70 h with samples taken every 16.5 min. For breath and room air measurements, an IMS coupled to a multi-capillary column (IMS/MCC) [Bio-Scout®, B&S Analytik GmbH, Dortmund, Germany] was used. The peaks were characterized using the software Visual Now (B&S Analytik, Dortmund, Germany) and identified using the software package MIMA (version 1.1, provided by the Max Planck Institute for Informatics, Saarbrücken, Germany) and the database 20160426_SubstanzDbNIST_122 (B&S Analytik GmbH, Dortmund, Germany). In the morning, 4 analytes (decamethylcyclopentasiloxane [541-02-6]; pentan-2-one [107-87-9], dimer; hexan-1-al [66-25-1]; pentan-2-one [107-87-9], monomer) showed high intensities in the room air and in exhaled breath. They were significantly but not equally reduced by room airing. The 84 h time series showed a time-dependent decrease of some analytes (limonene monomer and dimer; decamethylcyclopentasiloxane; butan-1-ol) as well as an increase of others (pentan-2-one [107-87-9], dimer). Shorter sampling intervals exhibited circadian variations in the concentrations of many analytes. Breath sampling in the morning requires room airing beforehand; then the variation in the intensity of indoor analytes can be kept small.
The time series of indoor analytes show that their intensities behave differently, with time-dependent declines, constant increases, and circadian variations, depending on room airing. This has implications for the breath sampling procedure and the interpretation of exhaled breath.
Woven piezoelectric sensors as part of the textile reinforcement of fiber reinforced plastics
(2019)
Sensor integration in fiber reinforced plastic (FRP) structures enables online process and structural health monitoring (SHM). This paper describes the development and application of woven fabric-based piezoelectric impact and bending sensors for integration into FRP. The work focuses on design and characterization of woven piezoelectric sensors, especially as a part of the reinforcement structure. The reinforcement of the component acts as a sensor in itself and therefore no additional external objects in the form of sensor components or sensor fibers, which could create unwanted weak points within the FRP, are added. The bending test results reveal a direct relationship between the applied load and the sensor signal. Furthermore, the appropriate sensor position in the component cross section was determined and the influence of thermal polarization on the sensor properties was investigated.
Innovative capacity is one of the key success factors of the future and will strongly determine the difference between successful and failing companies (PWC, 2015). Young companies and start-ups in particular are known for their high capacity for innovation. Established companies, by contrast, score less with new ideas but instead with innovation-critical resources, routines, and economies of scale. An approach that is steadily gaining popularity for combining the capabilities and resources of established companies with the innovative power of start-ups is "intrapreneurship".
Value engineering in customer communication is a structured method for improving communication processes between companies. The concept takes up proven elements of technical value analysis and of overhead value analysis and transfers them to customer communication. The approach offers a systematic procedure for examining and redesigning communication processes between supplier and customer. Value engineering in customer communication thus creates competitive advantages by optimizing communication.
This article provides an overview of the various options for accounting for an initial coin offering (ICO) on the liabilities side of the issuer's balance sheet under IFRS. The aim is to discuss the accounting classification of different types of tokens and to support issuers in designing the tokens and in the subsequent accounting. The results show that the existing standards are sufficient for the accounting classification of ICO tokens, but that a wide range of accounting treatments must be considered, so that detailed regulation in a dedicated IFRS appears difficult.
Customer orientation should be the core engine of every organisation, while IT can be considered the enabler for generating competitive advantages along customer processes in marketing, sales, and service. Research shows that customer relationship management (CRM) enables organisations to perform better, and experience indicates that organisations that focus on customer orientation are more successful. With marketplace organisations such as Amazon, Alibaba, or Conrad shaping the future of customer centricity and information technology, German B2B organisations need to shift their value contribution from product-centric to customer-centric. While these organisations are currently attempting to implement CRM software and to put their customers more into focus, the question remains how organisations are approaching the implementation of CRM and whether these attempts are paying off in terms of business performance.
Here, we study resin cure and network formation of solid melamine formaldehyde pre-polymer over a large temperature range via dynamic temperature curing profiles. Real-time infrared spectroscopy is used to analyze the chemical changes during network formation and network hardening. By applying chemometrics (multivariate curve resolution, MCR), the essential chemical functionalities that constitute the network at a given stage of curing are mathematically extracted and tracked over time. The three spectral components identified by MCR were methylol-rich, ether linkages-rich, and methylene linkages-rich resin entities. Based on the dynamic changes of their characteristic spectral patterns as a function of temperature, curing is divided into five phases: (I) stationary phase with free methylols as the main chemical feature, (II) formation of a flexible network cross-linked by ether linkages, (III) formation of a rigid, ether-cross-linked network, (IV) further hardening via transformation of methylols and ethers into methylene cross-linkages, and (V) network consolidation via transformation of ether into methylene bridges. The presented spectroscopic/chemometric approach can be used as a methodological basis for the functionality design of MF-based surface films at the stage of laminate pressing, i.e., for tailoring the technological property profile of cured MF films using a causal understanding of the underlying chemistry based on molecular markers and spectroscopic fingerprints.
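Conceptually, MCR factors the measured (time x wavenumber) spectral matrix D into non-negative concentration profiles C and pure-component spectra S, with D ≈ C S. The sketch below illustrates this decomposition on synthetic data using plain multiplicative-update NMF as a simple stand-in for the MCR-ALS algorithm actually used in such studies:

```python
import numpy as np

# MCR resolves a matrix D of time-resolved IR spectra into concentration
# profiles C (time x component) and pure-component spectra S (component x
# wavenumber), D ~= C @ S, under non-negativity constraints. Here a basic
# Lee-Seung NMF serves as a stand-in for MCR-ALS; all data are synthetic.

rng = np.random.default_rng(0)
n_t, n_wn, k = 50, 200, 3            # time points, wavenumbers, components
S_true = rng.random((k, n_wn))       # hypothetical "pure" component spectra
C_true = rng.random((n_t, k))        # hypothetical concentration profiles
D = C_true @ S_true                  # simulated measured spectra

# Multiplicative updates: keep C and S non-negative while reducing ||D - CS||.
C = rng.random((n_t, k)) + 0.1
S = rng.random((k, n_wn)) + 0.1
for _ in range(500):
    S *= (C.T @ D) / (C.T @ C @ S + 1e-12)
    C *= (D @ S.T) / (C @ S @ S.T + 1e-12)

rel_err = np.linalg.norm(D - C @ S) / np.linalg.norm(D)
```

In the study, the rows of S correspond to the methylol-rich, ether-rich, and methylene-rich spectral components, and the columns of C track their relative weight over the curing profile.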
Modern mixed (HTAP) workloads execute fast update transactions and long-running analytical queries on the same dataset and system. In multi-version (MVCC) systems, such workloads result in many short-lived versions and long version chains, as well as in increased and frequent maintenance overhead.
Consequently, the index pressure increases significantly. Firstly, the frequent modifications cause frequent creation of new versions, yielding a surge in index maintenance overhead. Secondly, and more importantly, index scans incur extra I/O overhead to determine which of the resulting tuple versions are visible to the executing transaction (visibility check), as current designs store version/timestamp information only in the base table, not in the index. Such an index-only visibility check is critical for HTAP workloads on large datasets.
In this paper we propose the Multi-Version Partitioned B-Tree (MV-PBT), a version-aware index structure supporting index-only visibility checks and flash-friendly I/O patterns. The experimental evaluation indicates a 2x improvement for analytical queries and 15% higher transactional throughput under HTAP workloads. MV-PBT offers 40% higher transactional throughput than WiredTiger's LSM-Tree implementation under YCSB.
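The index-only visibility check can be illustrated with a toy MVCC snapshot test: if each index entry carries the creating and invalidating transaction timestamps, a scan can decide visibility without fetching the base tuple. This is a conceptual sketch, not MV-PBT's actual record layout:

```python
from dataclasses import dataclass
from typing import Optional

# Toy model of an index-only MVCC visibility check: version timestamps
# live in the index entry itself, so no base-table I/O is needed during
# a scan. Field names and layout are illustrative only.

@dataclass
class IndexEntry:
    key: int
    created_ts: int             # commit timestamp of the creating tx
    deleted_ts: Optional[int]   # commit timestamp of the invalidating tx

def visible(entry: IndexEntry, snapshot_ts: int) -> bool:
    """Visible if created at or before the snapshot and not yet deleted,
    or deleted only after the snapshot was taken."""
    return (entry.created_ts <= snapshot_ts and
            (entry.deleted_ts is None or entry.deleted_ts > snapshot_ts))

index = [IndexEntry(1, created_ts=5, deleted_ts=None),
         IndexEntry(1, created_ts=2, deleted_ts=5),   # superseded version
         IndexEntry(2, created_ts=9, deleted_ts=None)]

snapshot = 7
hits = [e for e in index if e.key == 1 and visible(e, snapshot)]
# only the version of key 1 created at ts=5 qualifies under snapshot 7
```

Without timestamps in the index, deciding between the two versions of key 1 would require fetching both base tuples, which is exactly the extra I/O the index-only check avoids.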
In this paper, we present a new approach for achieving robust performance of data structures making it easier to reuse the same design for different hardware generations but also for different workloads. To achieve robust performance, the main idea is to strictly separate the data structure design from the actual strategies to execute access operations and adjust the actual execution strategies by means of so-called configurations instead of hard-wiring the execution strategy into the data structure. In our evaluation we demonstrate the benefits of this configuration approach for individual data structures as well as complex OLTP workloads.
The tale of 1000 cores: an evaluation of concurrency control on real(ly) large multi-socket hardware
(2020)
In this paper, we set out to revisit the results of "Staring into the Abyss [...] of Concurrency Control with [1000] Cores" and analyse in-memory DBMSs on today's large hardware. Contrary to the original assumption of the authors, today we do not see single-socket CPUs with 1000 cores. Instead, multi-socket hardware has made its way into production data centres. Hence, we follow up on this prior work with an evaluation of the characteristics of concurrency control schemes on real production multi-socket hardware with 1568 cores. To our surprise, we made several interesting findings, which we report on in this paper.
Massive data transfers in modern data-intensive systems, resulting from low data locality and data-to-code system design, hurt their performance and scalability. Near-data processing (NDP) and a shift to code-to-data designs may represent a solution, now that packaging combinations of storage and compute elements on the same device has become viable.
The shift towards NDP system architectures calls for a revision of established principles. Abstractions such as data formats and layouts typically span multiple layers in a traditional DBMS, and the way they are processed is encapsulated within these layers of abstraction. NDP-style processing requires an explicit definition of cross-layer data formats and accessors to ensure in-situ execution that optimally utilizes the properties of the underlying NDP storage and compute elements. In this paper, we make the case for such data format definitions and investigate the performance benefits under NoFTL-KV and the COSMOS hardware platform.
Massive data transfers in modern key/value stores, resulting from low data locality and data-to-code system design, hurt their performance and scalability. Near-data processing (NDP) designs represent a feasible solution which, although not new, has yet to see widespread use.
In this paper we introduce nKV, a key/value store utilizing native computational storage and near-data processing. On the one hand, nKV can directly control data and computation placement on the underlying storage hardware. On the other hand, nKV propagates the data formats and layouts to the storage device, where software and hardware parsers and accessors are implemented. Both allow NDP operations to execute in a host-intervention-free manner, directly on physical addresses, and thus to better utilize the underlying hardware. Our performance evaluation is based on executing traditional KV operations (GET, SCAN) and complex graph-processing algorithms (Betweenness Centrality) in-situ, with 1.4×-2.7× better performance on real hardware, the COSMOS+ platform.
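The code-to-data idea behind NDP can be illustrated with a toy simulation: instead of shipping every record to the host and filtering there, the predicate is pushed down to the storage side, and only matches cross the interface. This is not nKV's real interface; all names are invented for illustration.

```python
# Toy simulation of near-data processing: compare a host-side scan that
# transfers all records with a pushed-down scan that filters on the
# simulated device and returns only matching records.
from typing import Callable, Dict, List, Tuple

Record = Tuple[int, str]  # (key, value)

class SimulatedNDPDevice:
    """Storage that can run a scan with an on-device predicate."""
    def __init__(self) -> None:
        self._blocks: Dict[int, List[Record]] = {}

    def put(self, block: int, rec: Record) -> None:
        self._blocks.setdefault(block, []).append(rec)

    def host_scan(self) -> List[Record]:
        # Classic data-to-code path: transfer everything to the host.
        return [r for blk in self._blocks.values() for r in blk]

    def ndp_scan(self, pred: Callable[[Record], bool]) -> List[Record]:
        # Code-to-data path: only matching records cross the interface.
        return [r for blk in self._blocks.values() for r in blk if pred(r)]

dev = SimulatedNDPDevice()
for i in range(100):
    dev.put(block=i // 10, rec=(i, f"v{i}"))

# The host-side path moves 100 records; the pushed-down scan moves 10.
hot = dev.ndp_scan(lambda r: r[0] % 10 == 0)
```

The reduction in transferred records is the mechanism to which the abstract attributes its performance gains; the real system additionally needs device-resident parsers for the propagated data formats.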
nKV in action: accelerating KV-stores on native computational storage with near-data processing
(2020)
Massive data transfers in modern data-intensive systems, resulting from low data locality and data-to-code system design, hurt their performance and scalability. Near-data processing (NDP) designs represent a feasible solution which, although not new, has yet to see widespread use.
In this paper we demonstrate various NDP alternatives in nKV, a key/value store utilizing native computational storage and near-data processing. We showcase the execution of classical operations (GET, SCAN) and complex graph-processing algorithms (Betweenness Centrality) in-situ, with 1.4×-2.7× better performance due to NDP. nKV runs on real hardware, the COSMOS+ platform.
Despite strong political efforts in Europe, industrial small- and medium-sized enterprises (SMEs) seem to neglect adopting practices for energy efficiency. By taking a cultural perspective, this study investigated what drives the establishment of energy efficiency and corresponding practices in SMEs. Based on 10 ethnographic case studies and a quantitative survey among 500 manufacturing SMEs, the results indicate the importance of everyday employee behavior in achieving energy savings. The studied enterprises value behavior-related measures as similarly important as technical measures. Raising awareness for energy issues within the organization, therefore, constitutes an essential leadership task that is oftentimes perceived as challenging and frustrating. It was concluded that the embedding of energy efficiency in corporate strategy, the use of a broad spectrum of different practices, and the empowerment and involvement of employees serve as major drivers in establishing energy efficiency within SMEs. Moreover, the findings reveal institutional influences on shaping the meanings of energy efficiency for the SMEs by raising attention for energy efficiency in the enterprises and making energy efficiency decisions more likely. The main contribution of the paper is to offer an alternative perspective on energy efficiency in SMEs beyond the mere adoption of energy-efficient technology.
The automation of work by means of disruptive technologies such as Artificial Intelligence (AI) and Robotic Process Automation (RPA) is currently intensely discussed in business practice and academia. Recent studies indicate that many tasks manually conducted by humans today will no longer be in the future. In a similar vein, it is expected that new roles will emerge. The aim of this study is to analyze prospective employment opportunities in the context of RPA in order to foster our understanding of the pivotal qualifications, expertise and skills necessary to find an occupation in a completely changing world of work. This study is based on an explorative content analysis of 119 job advertisements related to RPA in Germany. The data was collected from major German online job platforms, qualitatively coded, and subsequently analyzed quantitatively. The research indicates that there indeed are employment opportunities, especially in the consulting sector. The positions require different technological expertise, such as specific programming languages and knowledge in statistics. The results of this study provide guidance for organizations and individuals on reskilling requirements for future employment. As many of the positions require profound IT expertise, the generally accepted view that existing employees affected by automation can be retrained to work in the emerging positions has to be viewed extremely critically. This paper contributes to the body of knowledge by providing a novel perspective on the ongoing discussion of employment opportunities and reskilling demands of the existing workforce in the context of recent technological developments and automation.
The chemical synthesis of polysiloxanes from monomeric starting materials involves a series of hydrolysis, condensation and modification reactions with complex monomeric and oligomeric reaction mixtures. Real-time monitoring and precise process control of the synthesis process is of great importance to ensure reproducible intermediates and products and can readily be performed by optical spectroscopy. In chemical reactions involving rapid and simultaneous functional group transformations and complex reaction mixtures, however, the spectroscopic signals are often ambiguous due to overlapping bands, shifting peaks and changing baselines. The univariate analysis of individual absorbance signals is hence often only of limited use. In contrast, batch modelling based on the multivariate analysis of the time course of principal components (PCs) derived from the reaction spectra provides a more efficient tool for real time monitoring. In batch modelling, not only single absorbance bands are used but information over a broad range of wavelengths is extracted from the evolving spectral fingerprints and used for analysis. Thereby, process control can be based on numerous chemical and morphological changes taking place during synthesis. “Bad” (or abnormal) batches can quickly be distinguished from “normal” ones by comparing the respective reaction trajectories in real time. In this work, FTIR spectroscopy was combined with multivariate data analysis for the in-line process characterization and batch modelling of polysiloxane formation. The synthesis was conducted under different starting conditions using various reactant concentrations. The complex spectral information was evaluated using chemometrics (principal component analysis, PCA). Specific spectral features at different stages of the reaction were assigned to the corresponding reaction steps. Reaction trajectories were derived based on batch modelling using a wide range of wavelengths. 
Subsequently, complexity was reduced again to the most relevant absorbance signals in order to derive a concept for a low-cost process spectroscopic set-up which could be used for real-time process monitoring and reaction control.
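The batch-modelling idea described above can be sketched numerically: spectra recorded over the course of a reaction are mean-centred and projected onto principal components, and the time course of the first PC score serves as a reaction trajectory against which abnormal batches can be compared. The sketch below uses synthetic spectra, since no data from the study are available.

```python
# Minimal PCA batch-trajectory sketch on synthetic reaction spectra.
import numpy as np

rng = np.random.default_rng(0)
n_times, n_wavelengths = 50, 200

# Synthetic batch: one Gaussian absorbance band grows as the reaction
# proceeds, on top of small measurement noise.
base = rng.normal(0.0, 0.01, size=(n_times, n_wavelengths))
band = np.exp(-0.5 * ((np.arange(n_wavelengths) - 80) / 5.0) ** 2)
progress = np.linspace(0.0, 1.0, n_times)
spectra = base + np.outer(progress, band)

# PCA via SVD of the mean-centred data matrix; scores = U * S.
centred = spectra - spectra.mean(axis=0)
U, S, Vt = np.linalg.svd(centred, full_matrices=False)
scores = U * S                      # PC scores, one row per time point

# Time course of the first PC score = the batch trajectory. A "bad"
# (abnormal) batch would leave the envelope spanned by the trajectories
# of reference batches.
trajectory = scores[:, 0]
```

Because the growing band dominates the variance, PC1 tracks reaction progress here, which is the behaviour the abstract exploits: the trajectory condenses information from a broad wavelength range into a single monitorable curve.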
The demands on energy providers will grow: in the future, tasks such as the development of digitalized products and services as well as ecological activities will gain particular relevance. This is shown by Reutlingen University in its current survey among supervisory board members, managing directors and executives. Despite the expected changes, supervisory boards are aware of the pressure towards greater professionalization, yet currently appear only moderately prepared for the future challenges facing their companies. Particularly relevant: the professionalization of board work in municipal energy utilities enables higher perceived corporate success. These are the findings of the study by the Reutlinger Energiezentrum at Reutlingen University, commissioned by five companies in the sector.
In recent years, machine learning algorithms have seen major advances in performance and applicability in industry, and especially in maintenance. Their application enables predictive maintenance and thus offers efficiency gains. However, a successful implementation of such solutions still requires high effort in data preparation to obtain the right information, interdisciplinary teams, as well as good communication with employees. Here, small and medium-sized enterprises (SME) often lack experience, competence and capacity. This paper presents a systematic and practice-oriented method for implementing machine learning solutions for predictive maintenance in SME, which has already been validated.
Process quality has reached a high level in mass production, utilizing well-known methods such as DoE. The drawback of the underlying statistical methods is the need for tests under real production conditions, which cause high costs due to lost output. Research over the last decade has led to methods for correcting a process by using in-situ data to adjust the process parameters, but a lot of pre-production is still necessary to get this working. This paper presents a new approach to improving product quality in process chains by using context data, which in part are gathered using Industry 4.0 devices, to reduce the necessary pre-production.
Driven by digital transformation, manufacturing systems are heading towards autonomy. The implementation of autonomous elements in manufacturing systems is still a big challenge. Especially small and medium sized enterprises (SME) often lack experience to assess the degree of Autonomous Production. Therefore, a description model for the assessment of stages for Autonomous Production has been identified as a core element to support such a transformation process. In contrast to existing models, the developed SME-tailored model comprises different levels within a manufacturing system, from single manufacturing cells to the factory level. Furthermore, the model has been validated in several case studies.
Matthias Varga von Kibéd and Insa Sparrer distinguish between three different constellation methods (Sparrer and Varga von Kibéd, n.d.): the specific (concrete), the virtual, and the prototypical constellation. In specific constellations, a concrete concern of a client is examined. In contrast, virtual constellations create a practice environment in which constellation techniques and intervention methods can be rehearsed. In prototypical structural constellations, topics are bundled that concern several participants in a seminar or that may recur in their everyday lives. Such a topic is worked on like a specific constellation, but without a concrete concern at hand. Examples of prototypical structural constellations come from many areas, e.g. everyday leadership, team development, conflict management, conversation techniques, and time and self-management.
Sol-gel-based flame retardants represent a promising approach for textiles, particularly as a replacement for the currently established halogen-containing flame retardants. The latter have come under criticism due to toxicological concerns and their sometimes bioaccumulative properties. This research project therefore investigated how flame protection based on nitrogen and/or phosphorus could be realized via a sol-gel approach as a halogen-free alternative. The sol-gel layer acted on the one hand as a non-combustible binder; on the other hand, groups active in flame protection could be incorporated directly by introducing appropriate functional side chains. Various approaches were pursued. Above all, by using additive-containing systems, i.e. sol-gel layers with additions of nitrogen- and/or phosphorus-containing compounds, flame protection according to DIN EN ISO 15025 (protective clothing, protection against heat and flames) was achieved. Using a model system in which a functionalized sol-gel layer was applied first and a phosphorus compound in a second step, the advantages of sol-gel-based flame protection were demonstrated. Among other things, it was shown that a mechanism based on the formation of a protective layer is mainly responsible for the flame protection. This result should not be underestimated for the future further optimization of corresponding finishes. Finishing trials on a semi-industrial scale further showed that, in principle, nothing stands in the way of a large-scale implementation of the applied finishes. To date, compromises only have to be made with regard to wash stability.
While the sol-gel layers generally survived typical washing processes, permanence of the flame resistance of additive-containing systems was achieved only in individual cases. Based on these results, a new approach was presented that goes beyond the one underlying this project: producing sol-gel layers from newly synthesized silanes bearing nitrogen and phosphorus groups, which show promising behaviour. Here, retention of the improved flame resistance could be demonstrated even after initial wash tests. Overall, the research project showed that sol-gel-based flame protection can be obtained for textiles. Moreover, it could be explained which mechanism underlies this flame protection and how the currently still insufficient wash permanence can be improved.
Customer foresight is a relatively new research field. We introduce the customer foresight territory by discussing its localization between customer research and foresight research. For this purpose, we look at a variety of methods that help to understand customers and future realities. On this basis we provide an overview of customer foresight methods and outline an ideal-typical research journey.
This study investigates how integrated reporting (IR) creates value for investors. It examines how providers of financial capital benefit from an improved firm information environment provided by IR. Specifically, this study investigates the effect of voluntary IR disclosure on analyst earnings forecast accuracy as well as on firm value. To do so, we use an international sample of 167 listed companies that voluntarily publish an integrated report. Our analysis shows no significant effect of a voluntary IR publication on analyst earnings forecast accuracy and no significant effect on firm value. We thus do not find evidence for the fulfillment of IR's promises regarding improved information environment and value creation of voluntary adopters. We conclude that such companies might already have a relatively high level of transparency leading to an absent additional effect of IR disclosure. Positive effects of IR appear to be more relevant in environments where IR is mandatory.
The key aim of Open Strategy is to open up the process of strategy development to larger groups within and even outside an organization. Furthermore, Open Strategy aims to include broad groups of stakeholders in the various steps of the strategy process. The question at hand is: how can Open Strategy be achieved, and what approaches can be used? Scenario planning and business wargaming are perceived as relevant tools in the field of strategy and strategic foresight and in the context of Open Strategy because of their participative nature. The aim of this article is to assess to what degree scenario planning and business wargaming can be used in the context of Open Strategy. While these approaches are suitable, their current application limits the number of potential participants. Further research and experimentation in practice with larger groups and/or online approaches, or a combination of both, are needed to explore the potential of scenario planning and business wargaming as tools for Open Strategy.
Globalisation, shorter product life cycles, and increasing product varieties have led to complex supply chains. At the same time, there is a growing interest of customers and governments in having greater transparency of brands, manufacturers, and producers throughout the supply chain. Due to the complex structure of collaborative manufacturing networks, increasing supply chain transparency is a challenge for manufacturing companies. Blockchain technology offers an innovative solution to increase the transparency, security, authenticity, and auditability of products. However, there are still uncertainties when applying blockchain technology to manufacturing scenarios and thus enabling all stakeholders to trace back each component of an assembled product. This paper proposes a framework design to increase the transparency and auditability of products in collaborative manufacturing networks by adopting blockchain technology. In this context, each component of a product is marked with a unique identification number generated by blockchain-based smart contracts. In this way, a transparent auditability of assembled products and their components can be achieved for all stakeholders, including the customer.
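The core mechanism, unique and tamper-evident component identifiers that an assembled product links back to, can be illustrated with a toy hash-chained ledger standing in for the blockchain. The structure and all names below are our own invention for illustration, not the paper's implementation.

```python
# Toy hash-chained ledger: each component registration yields a unique ID
# (the hash of the entry chained to its predecessor), and a product entry
# references the IDs of its components, giving auditable traceability.
import hashlib
import json
from typing import List

class ComponentLedger:
    def __init__(self) -> None:
        self.chain: List[dict] = []

    def _append(self, payload: dict) -> str:
        prev = self.chain[-1]["id"] if self.chain else "GENESIS"
        body = json.dumps(payload, sort_keys=True)
        entry_id = hashlib.sha256((prev + body).encode()).hexdigest()
        self.chain.append({"id": entry_id, "prev": prev, "payload": payload})
        return entry_id

    def register_component(self, maker: str, part: str) -> str:
        # Stand-in for a smart contract issuing a unique component ID.
        return self._append({"type": "component", "maker": maker,
                             "part": part})

    def register_product(self, component_ids: List[str]) -> str:
        return self._append({"type": "product", "components": component_ids})

    def verify(self) -> bool:
        # Tampering with any earlier entry invalidates every later hash.
        prev = "GENESIS"
        for e in self.chain:
            body = json.dumps(e["payload"], sort_keys=True)
            expected = hashlib.sha256((prev + body).encode()).hexdigest()
            if e["prev"] != prev or e["id"] != expected:
                return False
            prev = e["id"]
        return True

ledger = ComponentLedger()
parts = [ledger.register_component("SupplierA", "gear"),
         ledger.register_component("SupplierB", "shaft")]
product = ledger.register_product(parts)
```

A real blockchain adds consensus and distribution on top, but the audit property shown here (any retroactive change to a component record breaks verification) is what gives all stakeholders, including the customer, trustworthy traceability.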
Companies are becoming aware of the potential risks arising from sustainability aspects in supply chains. These risks can affect ecological, economic or social aspects. One important element in managing those risks is improved transparency in supply chains by means of digital transformation. Innovative technologies like blockchain technology can be used to enforce transparency. In this paper, we present a smart contract-based Supply Chain Control Solution to reduce risks. Technological capabilities of the solution will be compared to a similar technology approach and evaluated regarding their benefits and challenges within the framework of supply chain models. As a result, the proposed solution is suitable for the dynamic administration of complex supply chains.
We propose a method for recognizing dynamic gestures using a 3D sensor. New aspects of the developed system include problem-adapted data conversion and compression as well as automatic detection of different variants of the same gesture via clustering with a suitable metric inspired by the Jaccard metric. The combination of Hidden Markov Models and clustering leads to robust detection of different executions based on a small set of training data. We achieved a 5% increase in recognition rate compared to regular Hidden Markov Models. The system has been used for human-machine interaction and might serve as an assistive system in physiotherapy and in neurological or orthopedic diagnosis.
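The variant-detection idea can be sketched as follows: each gesture execution is reduced to a set of discretized feature codes, executions are compared with the Jaccard distance, and a simple threshold-based grouping collects variants of the same gesture. The feature encoding and the greedy clustering below are invented for illustration; the paper's actual metric is only described as "inspired by" the Jaccard metric.

```python
# Jaccard-distance clustering sketch for grouping gesture executions.
from typing import List, Set

def jaccard_distance(a: Set[int], b: Set[int]) -> float:
    # 0.0 for identical sets, 1.0 for disjoint sets.
    if not a and not b:
        return 0.0
    return 1.0 - len(a & b) / len(a | b)

def cluster(executions: List[Set[int]], threshold: float) -> List[List[int]]:
    """Greedy leader clustering: join the first cluster whose leader is
    within `threshold`, otherwise start a new cluster (a new variant)."""
    clusters: List[List[int]] = []
    for i, ex in enumerate(executions):
        for c in clusters:
            if jaccard_distance(executions[c[0]], ex) <= threshold:
                c.append(i)
                break
        else:
            clusters.append([i])
    return clusters

# Three executions of one gesture (overlapping code sets) and one
# clearly different gesture.
runs = [{1, 2, 3, 4}, {1, 2, 3, 5}, {2, 3, 4, 5}, {9, 10, 11}]
groups = cluster(runs, threshold=0.5)
```

In the described system, each discovered variant group could then train its own Hidden Markov Model, which is how clustering and HMMs combine to cover different executions with little training data.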