This paper studies whether a monetary union can be managed solely by a rule-based approach. The Five Presidents’ Report of the European Union rejects this idea and instead suggests a centralisation of powers. We analyse the philosophy of policy rules from the vantage point of the German economic school of thought. There is evidence that a monetary union consisting of sovereign states is well organised by rules, together with the principle of subsidiarity. The root cause of the euro crisis is rather the weak enforcement of rules, compounded by structural problems. We therefore suggest a genuine rule-based paradigm for a stable future of the Economic and Monetary Union.
Under update-intensive workloads (TPC, LinkBench), small updates dominate the write behavior; e.g., 70% of all updates change less than 10 bytes across all TPC OLTP workloads. These are typically performed as in-place updates and result in random writes at page granularity, causing major write overhead on Flash storage, write amplification of several hundred times, and lower device longevity.
In this paper we propose an approach that transforms those small in-place updates into small update deltas that are appended to the original page. We utilize the commonly ignored fact that modern Flash memories (SLC, MLC, 3D NAND) can handle appends to already programmed physical pages by using various low-level techniques such as ISPP to avoid expensive erases and page migrations. Furthermore, we extend the traditional NSM page-layout with a delta-record area that can absorb those small updates. We propose a scheme to control the write behavior as well as the space allocation and sizing of database pages.
The proposed approach has been implemented under Shore-MT and evaluated on real Flash hardware (OpenSSD) and a Flash emulator. Compared to In-Page Logging it performs up to 62% fewer reads and writes and up to 74% fewer erases on a range of workloads. The experimental evaluation indicates: (i) a significant reduction of erase operations, resulting in twice the longevity of Flash devices under update-intensive workloads; (ii) 15%-60% lower read/write I/O latencies; (iii) up to 45% higher transactional throughput; (iv) a 2x to 3x reduction in overall write amplification.
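The delta-append idea described above can be illustrated with a minimal sketch. All names, sizes, and the 4-byte delta header are assumptions for illustration, not the paper's actual implementation: a page reserves a small tail area that absorbs small updates as appended deltas, deferring the expensive page rewrite.

```python
# Minimal, illustrative sketch of delta-append pages (assumed layout, not the
# paper's implementation): small updates become (offset, payload) deltas that
# are appended to a reserved area instead of rewriting the whole page.

PAGE_SIZE = 4096
DELTA_AREA = 256  # assumed size of the reserved delta-record area

class Page:
    def __init__(self):
        self.base = bytearray(PAGE_SIZE - DELTA_AREA)  # regular NSM record area
        self.deltas = []       # appended (offset, payload) update deltas
        self.delta_bytes = 0

    def update(self, offset, payload):
        """Absorb a small update as a delta; rewrite the page when full."""
        if self.delta_bytes + len(payload) + 4 > DELTA_AREA:
            self.materialize()  # fold deltas in; next write is a full page write
            return "page-rewrite"
        self.deltas.append((offset, bytes(payload)))
        self.delta_bytes += len(payload) + 4  # 4 bytes assumed header overhead
        return "in-place append"  # no erase, no out-of-place page write

    def materialize(self):
        """Replay all deltas into the base record area and clear the delta log."""
        for off, data in self.deltas:
            self.base[off:off + len(data)] = data
        self.deltas.clear()
        self.delta_bytes = 0

p = Page()
assert p.update(100, b"x" * 8) == "in-place append"
p.materialize()
assert bytes(p.base[100:108]) == b"x" * 8
```

The point of the sketch is the control-flow split: as long as the reserved area has room, an update costs only a small append to the already-programmed Flash page; only when the area overflows does the page get rewritten.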
In the present paper we demonstrate a novel technique applying the recently proposed approach of In-Place Appends (IPA) – overwrites on Flash without a prior erase operation. IPA can be applied selectively: only to DB objects that have frequent and relatively small updates. To do so we couple IPA with the concept of NoFTL regions, allowing the DBA to place update-intensive DB objects into special IPA-enabled regions. The decision about the region configuration can be (semi-)automated by an advisor analyzing DB log files in the background.
We showcase a Shore-MT based prototype of the above approach, operating on real Flash hardware. During the demonstration we allow the users to interact with the system and gain hands-on experience under different demonstration scenarios.
In the present paper we demonstrate a novel approach to handling small updates on Flash called In-Place Appends (IPA). It allows the DBMS to revisit the traditional write behavior on Flash. Instead of writing whole database pages upon an update in an out-of-place manner on Flash, we transform those small updates into update deltas and append them to a reserved area on the very same physical Flash page. In doing so we utilize the commonly ignored fact that under certain conditions Flash memories can support in-place updates to Flash pages without a preceding erase operation.
The approach was implemented under Shore-MT and evaluated on real hardware. Under standard update-intensive workloads we observed 67% fewer page invalidations, resulting in 80% lower garbage collection overhead, which yields a 45% increase in transactional throughput while doubling Flash longevity at the same time. IPA outperforms In-Page Logging (IPL) by more than 50%.
We showcase a Shore-MT based prototype of the above approach, operating on real Flash hardware – the OpenSSD Flash research platform. During the demonstration we allow the users to interact with the system and gain hands-on experience of its performance under different demonstration scenarios. These involve various workloads such as TPC-B, TPC-C, and TATP.
This article is a review of the book "Brain Computation as Hierarchical Abstraction" by Dana H. Ballard, published by MIT Press in 2015. The book series Computational Neuroscience familiarizes the reader with the computational aspects of brain functions based on neuroscientific evidence. The book provides an excellent introduction to the functioning of the brain in our daily life, i.e. its structure, its networks, and its routines. The final chapters even discuss behavioral elements such as decision-making, emotions, and consciousness. These topics are of high relevance in other sciences such as economics and philosophy. Overall, Ballard’s book stimulates a scientifically well-founded debate and, more importantly, reveals the need for an interdisciplinary dialogue with the social sciences.
The purpose of this paper is to study the impact of transparency on the political budget cycle (PBC) over time and across countries. So far, the literature on electoral cycles finds evidence that cycles depend on the stage of an economy. However, the author shows – for the first time – a reliance of the budget cycle on transparency. The author uses a new data set consisting of 99 developing and 34 Organization for Economic Cooperation and Development countries. First, the author develops a model and demonstrates that transparency mitigates the political cycles. Second, the author confirms the proposition through the econometric assessment. The author uses time series data from 1970 to 2014 and discovers smaller cycles in countries with higher transparency, especially G8 countries.
The European Economic and Monetary Union (EMU) requires further stabilisation, since its institutional rules do not exert sufficient binding force on the member states in the long run. The challenge is to regain the credibility of the rulebook that was lost in the course of the European sovereign debt crisis since 2010. To preserve the monetary union, the "no bailout" clause of Art. 125 TFEU must be credibly applicable in primary law, and the rules of secondary law – including the Stability and Growth Pact, the Fiscal Compact, and the European Semester – must be enforced with binding legal effect more independently and more quickly. The "sovereign insolvency mechanism" proposed here, carefully fitted into the European framework and combined with an exit clause that is legally binding as ultima ratio, would be one possible solution. A failure of the EMU can be averted, but the lacking will to reform could pave the way for the break-up of the monetary union.
A key goal of the new developments bundled under the term Industrie 4.0 is the networking of intelligent components in industrial plants in order to make processes more transparent and efficient. A further goal is condition monitoring, i.e. monitoring the state of components at run time and estimating their remaining useful life, so that the full service life of a component can be exploited and maintenance intervals can be planned better. The component state is assessed on the basis of measured quantities that are either acquired by sensors additionally introduced into the process or taken from process data available in the control and regulation systems. These measurement data are evaluated and the result is presented to the user.
This article gives a brief overview of the measured quantities and evaluation methods in use. In addition, a method is described that avoids the difficulties of interpreting the commonly used frequency spectra.
This paper describes a new method for condition monitoring of a roller chain. In contrast to conventional methods, no additional accelerometers are used to measure and interpret frequency spectra; instead, the chain condition is evaluated with an easy-to-interpret similarity measure based on correlation functions of the driving motor torque. An additional clustering of current data and reference measurements yields an easy-to-understand representation of the chain condition.
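A correlation-based similarity measure of the kind mentioned above can be sketched as follows. This is an illustrative assumption, not the paper's exact method: a current torque signal is compared against a healthy reference via the Pearson correlation coefficient, mapped to [0, 1] so that values near 1 indicate a chain in good condition.

```python
# Hedged sketch of a correlation-based condition indicator (illustrative only;
# the paper's actual similarity measure and clustering step are not reproduced).
import math

def similarity(reference, current):
    """Pearson correlation of two equal-length signals, mapped to [0, 1]."""
    n = len(reference)
    mr = sum(reference) / n
    mc = sum(current) / n
    cov = sum((r - mr) * (c - mc) for r, c in zip(reference, current))
    var_r = sum((r - mr) ** 2 for r in reference)
    var_c = sum((c - mc) ** 2 for c in current)
    rho = cov / math.sqrt(var_r * var_c)
    return (rho + 1) / 2  # map [-1, 1] -> [0, 1]

# Hypothetical signals: a healthy reference torque and a "worn" signal with an
# additional vibration component, as might appear with a degrading chain.
ref = [math.sin(0.1 * i) for i in range(200)]
worn = [math.sin(0.1 * i) + 0.4 * math.sin(0.9 * i) for i in range(200)]
assert abs(similarity(ref, ref) - 1.0) < 1e-9
assert similarity(ref, worn) < 1.0
```

The appeal of such a measure over a frequency spectrum is that a single bounded number can be tracked over time and thresholded, without requiring the operator to interpret spectral peaks.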
Using the damage area to quantify the Martindale test is a promising way to compare textile finishes without the need to test to full destruction. In addition, it could be shown that the results of Martindale tests performed with different pressure loads can be scaled to an identical functional shape. If these results can be verified, this method would simplify abrasion testing for different application areas.
In the wake of REACH, common highly effective halogenated flame retardants have been banned because they are suspected of being carcinogenic, mutagenic, and teratogenic. Suitable alternatives are currently lacking. The DTNW is therefore developing new environmentally friendly, halogen-free flame retardants based on phosphorus and nitrogen compounds to provide adequate flame protection. New possibilities are presented in this article.
A method for quantifying fabric damage in Martindale flat abrasion testing is presented, offering an interesting way to perform comparative studies even when the fabrics are not abraded until a thread breaks. The investigations also indicate that abrasion tests at different contact pressures can be scaled. Should this observation be confirmed in further studies, it would fundamentally simplify abrasion testing for various applications.
In this paper we describe the design and development process of an electromagnetic picker for rivets. These rivets are used in the production of leather or textile design objects such as riveted waist belts or purses. The picker is designed to replace conventional mechanical pickers, thus avoiding mechanical wear problems and increasing process quality. The paper illustrates the challenges in the design process of this mechatronic system. The design process was based on both simulation and experiments, leading to a prototype that satisfies the requirements.
In retail environments, consumers commonly evaluate products while standing on some type of flooring and concurrently being exposed to music; however, no study has examined the interaction of these two atmospheric cues. To bridge this gap, this research examines whether retailers can benefit from creating multisensory atmospheric congruent rather than incongruent retail environments of flooring and music. The results of an experiment in a real retail store reveal positive effects of multisensory congruent retail environments (e.g., soft music combined with soft flooring) on product evaluations. This study provides a new process explanation, with consumers’ purchase-related self-confidence mediating these effects. Specifically, consumers in congruent rather than incongruent retail environments experience more purchase-related self-confidence, which in turn leads to more favorable product evaluations. Furthermore, this study shows that consumers with a low rather than a high preference for haptic information are influenced more by multisensory atmospheric congruence when evaluating a product haptically.
As the market penetration of alternative fuel vehicles is still uncertain, defining green design cues for their design is of specific relevance for targeting environmentally conscious customers. This paper reviews the existing literature, aiming to summarize the market penetration scenarios of alternative fuel vehicles over the coming years, consumer demand for sustainable materials, and present methodologies for representing characteristics of eco-friendly mobility in the interior of alternative fuel vehicles. In particular, present attempts to correlate materials with green design cues are explored. Finally, projections for the future of the field are suggested, posing compelling research questions to further unify the field of environmentally conscious design with the domain of product personality.
Materials science is in most cases taught purely as lecture-style instruction. Supplementary laboratory classes on materials science take place in some cases, but the experiments carried out there mostly cover very few subfields, such as the tensile test. In the curriculum of some degree programmes, no time is allotted for such laboratory experiments at all; the knowledge must be conveyed entirely in class.
The goal of ExperiMat is to describe practical demonstration experiments for teaching materials science.
This book is a comprehensive treatment of insights gathered over many years in the field of weft thread tension measurement. For weft insertion with rapiers, projectiles, and air jet, formulas for calculating weft thread tensions have been derived that enable even the weaving practitioner to predict the maximum weft thread tensions with good accuracy. Based on these results, a comparison with the weak-point level of the weft yarn allows a decision on whether the weft yarn can be used on the intended weaving machine, or whether the intended weaving machine itself can be used at the desired speed.
This pocketbook summarises the known calculation formulas and findings from industrial practice and scientific studies in the field of weaving preparation and weaving. It is intended to facilitate the decision-making processes required in fabric production.
This collection of formulas not only allows optimal production specifications for fabrics to be drawn up in a practice-oriented way; the most important technical and physical fundamentals of mill operation are also presented in due brevity.
- Increasing digitization confronts both companies and their employees with many new challenges, while at the same time offering great potential.
- The chance of realising the potential associated with digitization depends on how consistently digitization is implemented.
- A prerequisite for successful implementation is a digitization strategy that makes the path and the goals transparent, comprehensible, and worth striving for to everyone involved.
- A digitization strategy that is lived and supported by all implies a cultural change within the company.
Once companies have a clear picture of customer behaviour and its determinants, the second step is to work out a strategy that places the best possible satisfaction of customer needs at the centre of their actions. The better companies succeed at this, the better they can achieve their own goals. The first task here is to develop a marketing plan.
Marketing is everywhere! You encounter it in the supermarket, in online shops, and on social media. But what exactly is behind marketing, and how do companies do it successfully? The book addresses these and other questions in detail. It begins with the fundamentals of consumer behaviour, the buying process, and the individual purchase decision. Against this background it explains the goals and measures of strategic marketing planning. Building on this, it presents aspects of operational marketing planning and discusses brand, product, distribution, communication, and pricing policy in depth. Marketing professionals provide insights into practice, and a best-practice example makes the material quickly tangible.
Radiofrequency ablation is an ablation technique to treat tumors with focused heat. Computed tomography, ultrasound, and magnetic resonance imaging (MRI) are imaging modalities that can be used for image-guided procedures. MRI offers several advantages over the other modalities, such as radiation-free fluoroscopic imaging, temperature mapping, high soft-tissue contrast, and free selection of imaging planes. This work addresses the use of 3D controllers for controlling interventional, fluoroscopic MR sequences in the scenario of MR-guided radiofrequency ablation of hepatic malignancies. During this procedure, the interventionalist can monitor the targeting of the tumor with near-real-time fluoroscopic sequences. In general, adjustments of the imaging planes are necessary during tumor targeting; these are performed by an assistant in the control room. Therefore, communication between the interventionalist in the scanner room and the assistant in the control room is essential. However, verbal communication is impaired by the loud scanning noise. Alternatively, non-verbal communication between the two persons is possible, but it is limited to a few gestures and susceptible to misunderstandings. This work analyzes different 3D controllers to enable the interventionalist to control interventional MR sequences directly during MR-guided procedures. Leap Motion, Wii Remote, SpaceNavigator, Phantom Omni, and a foot switch were selected. A simulation was built in C++ with VTK to mimic the real scenario for test purposes. Previous results showed that the Leap Motion is not suitable for this application, while the Wii Remote and the foot switch are possible input devices. The final evaluation showed a general reduction in time with the use of 3D controllers. The best results were reached with the Wii Remote, at 34 seconds. Handheld input devices like the Wii Remote have further potential for integration into the real environment to reduce intervention time.
Digitization of societies changes the way we live, work, learn, communicate, and collaborate. In the age of digital transformation, IT environments with a large number of rather small structures, like the Internet of Things (IoT), microservices, or mobility systems, are emerging to support flexible and agile digitized products and services. Adaptable ecosystems with service-oriented enterprise architectures are the foundation for self-optimizing, resilient run-time environments and distributed information systems. The resulting business disruptions affect almost all new information processes and systems in the context of digitization. Our aim is a more flexible and agile transformation of both business and information technology domains, with more flexible enterprise information systems through adaptation and evolution of digital enterprise architectures. The present research paper investigates mechanisms for decision-controlled digitization architectures for the Internet of Things and microservices by evolving enterprise architecture reference models and state-of-the-art elements of architectural engineering for micro-granular systems.
Digitization fosters the development of IT environments with many rather small structures, like the Internet of Things (IoT), microservices, or mobility systems. They are needed to support flexible and agile digitized products and services. The goal is to create service-oriented enterprise architectures (EA) that are self-optimizing and resilient. The present research paper investigates methods for decision-making concerning digitization architectures for the Internet of Things and microservices. They are based on evolving enterprise architecture reference models and state-of-the-art elements of architectural engineering for micro-granular systems. Decision analytics in this field becomes increasingly complex, and decision support, particularly for the development and evolution of sustainable enterprise architectures, is sorely needed. The challenging decision processes can be supported in a more flexible and intuitive way by an architecture management cockpit.
Digital enterprise architecture management in tourism: state of the art and future directions
(2018)
The advance of information technology impacts tourism more than many other industries, due to the service character of its products. Most offerings in tourism are immaterial in nature and challenging to coordinate. Therefore, the alignment of IT, strategy, and digitization is of crucial importance to enterprises in tourism. To cope with the resulting challenges, methods for the management of enterprise architectures are necessary. We therefore scrutinize approaches for managing enterprise architectures based on a literature review. We found many areas for future research on the use of enterprise architecture in tourism.
Digitization transforms business process models and processes in many enterprises. However, many of them need guidance on how digitization impacts the design of their information systems. This paper therefore investigates the influence of digitization on information system design. We apply a two-phase research method combining a literature review and an exploratory case study. The case study took place at the IT service provider of a large insurance enterprise. The study’s results suggest that a number of areas of information system design are affected, such as architecture, processes, data, and services.
In a time of digital transformation, the ability to quickly and efficiently adapt software systems to changed business requirements becomes more important than ever. Measuring the maintainability of software is therefore crucial for the long-term management of such products. With service-based systems (SBSs) being a very important form of enterprise software, we present a holistic overview of maintainability metrics specifically designed for this type of system, since traditional metrics – e.g. object-oriented ones – are not fully applicable in this case. The metric candidates selected from the literature review were mapped to four dominant design properties: size, complexity, coupling, and cohesion. Microservice-based systems (μSBSs) emerge as an agile and fine-grained variant of SBSs. While the majority of the identified metrics are also applicable to this specialization (with some limitations), the large number of services in combination with technological heterogeneity and decentralization of control significantly impacts automatic metric collection in such a system. Our research therefore suggests that specialized tool support is required to guarantee the practical applicability of the presented metrics to μSBSs.
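Two of the design properties named above, size and coupling, can be sketched on a toy model of a service-based system. The service model and metric definitions below are illustrative assumptions, not the specific metrics surveyed in the paper:

```python
# Hedged sketch of automatic metric collection for a service-based system:
# a hypothetical model of services (operations exposed, services called) and
# two simple design-property metrics (size, coupling) computed from it.

services = {
    "orders":  {"operations": ["create", "cancel", "status"], "calls": ["billing", "stock"]},
    "billing": {"operations": ["invoice"],                    "calls": []},
    "stock":   {"operations": ["reserve", "release"],         "calls": []},
}

def size(name):
    """Size: number of operations a service exposes."""
    return len(services[name]["operations"])

def coupling(name):
    """Coupling: number of distinct services this service depends on."""
    return len(set(services[name]["calls"]))

def system_coupling():
    """System-level view: average outgoing dependencies per service."""
    return sum(coupling(s) for s in services) / len(services)

assert size("orders") == 3
assert coupling("orders") == 2
assert abs(system_coupling() - 2 / 3) < 1e-9
```

In a real microservice-based system the `services` model would have to be extracted automatically, e.g. from API descriptions and call traces, which is exactly where the technological heterogeneity mentioned above makes tool support necessary.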
Towards a practical maintainability quality model for service- and microservice-based systems
(2017)
Although current literature mentions many different metrics related to the maintainability of service-based systems (SBSs), there is no comprehensive quality model (QM) with automatic evaluation and a practical focus. To fill this gap, we propose a Maintainability Model for Services (MM4S), a layered maintainability QM consisting of service properties (SPs) associated with automatically collectable service metrics (SMs). This research artifact, created within an ongoing Design Science Research (DSR) project, is the first version ready for detailed evaluation and critical feedback. The goal of MM4S is to serve as a simple and practical tool for basic maintainability estimation and control in the context of SBSs and their specialization, microservice-based systems (μSBSs).
The research project set out to investigate the possibilities and limits of using sol-gel finishing to improve the scuff/abrasion resistance of fabrics made from different fibre materials. The focus was on textiles for clothing/workwear as well as upholstery fabrics (furniture, automotive, passenger transport).
Kleider machen Drummer
(2018)
It is not only songs that make musicians famous, but also their look: David Bowie, Kurt Cobain, Madonna, and Elvis Presley shaped entire generations through their music and their style of dress. Many drummers place great value on technique, but anyone who wants to succeed in the music business should also be aware of the impression they make.
Structural and functional thermosetting composite materials are exposed to different kinds of stress which can damage the polymer matrix, thus impairing the intended properties. Therefore, self-healing materials have attracted the attention of many research groups over the last decades in order to provide satisfactory material properties and outstanding product durability. The present article provides a critical overview of promising self-healing strategies for crosslinked thermoset polymers. It is organized in two parts: an overview about the different approaches to self-healing is given in the first part, whereas the second part focuses on the specific chemistries of the main strategies to achieve self-healing through crosslinking. It is attempted to provide a comprehensive discussion of different approaches which are described in the scientific literature. By comparison of the advantages and disadvantages, the authors wish to provide helpful insights on the assessment of the potential to transfer the extensive present knowledge about self-healing materials and methods to surface varnishing thermoset coatings.
This article provides a general overview of the most promising candidates among bio-based materials and deals with the most important issues concerning their incorporation into PF resins. Owing to the abundance of lignin on Earth, much knowledge of lignin-based materials has already been gained, and uses of lignin in PF resins have been studied for many decades. Other natural polyphenols that are less frequently considered for impregnation are covered as well, as they also possess some potential for PF substitution.
High-quality decorative laminate panels typically consist of two major types of components: the surface layers, comprising décor and overlay papers impregnated with melamine-based resins, and the core, which is made of stacks of kraft papers impregnated with phenolic (PF) resin. The PF-impregnated layers impart superior hydrolytic stability, mechanical strength, and fire resistance to the composite. The manufacturing involves a complex interplay between resin, paper, and the impregnation/drying processes. Changes in the input variables cause significant alterations in the process characteristics, and adaptations of the used materials and specific process conditions may, in turn, be required. This review summarizes the main variables influencing both the processability and the technological properties of phenolic-resin-impregnated papers and laminates produced therefrom. It aims to present the main influences of the involved components (resin and paper), how these may be controlled during the respective process steps (resin preparation and paper production), how they influence the impregnation and lamination conditions, how they affect specific aspects of paper and laminate performance, and how they interact with each other (synergies).
A large number of individual HR management activities are already established in research institutions. Within a sustainability management framework, however, a systematic and strategic approach to the design, planning, and implementation of HR management activities is necessary. HR management oriented towards sustainability aligns HR measures with the strategy and the organisational goals of the research institution. Sustainable HR management in research organisations means that researchers can develop and apply their creative scientific potential; the expertise of the research-support staff is an important partner and companion in this.
Despite the often difficult conditions in the markets of Sub-Saharan Africa, it is possible to build a successful business, provided one chooses an adequate market-entry mode and is also willing to adapt one's business model to local conditions. One challenge is the political, economic, and cultural heterogeneity of Sub-Saharan Africa, which makes it impossible to treat the continent as a single market. However, regional integration agreements have existed for decades, although they have only been revived and advanced in recent years, such as ECOWAS (Economic Community of West African States), the EAC (East African Community), and SADC (Southern African Development Community). These initiatives now function well enough that some foreign companies are beginning to organise parts of their business regionally rather than nationally, i.e. to treat the countries as a common market. As a rule, the cultural differences between neighbouring countries are not that large. For these reasons, there is growth potential in commercially attractive markets in Sub-Saharan Africa.
Context: Development of software-intensive products and services increasingly occurs by continuously deploying product or service increments, such as new features and enhancements, to customers. Product and service developers must continuously find out what customers want through direct customer feedback and observation of usage behaviour. Objective: This paper examines the preconditions for setting up an experimentation system for continuous customer experiments. It describes the RIGHT model for Continuous Experimentation (Rapid Iterative value creation Gained through High-frequency Testing), illustrating the building blocks required for such a system. Method: An initial model for continuous experimentation is analytically derived from prior work. The model is matched against empirical case study findings from two startup companies and further developed. Results: Building blocks for a continuous experimentation system and infrastructure are presented. Conclusions: A suitable experimentation system requires at least the ability to release minimum viable products or features with suitable instrumentation, to design and manage experiment plans, to link experiment results with a product roadmap, and to manage a flexible business strategy. The main challenges are the proper, rapid design of experiments; advanced instrumentation of software to collect, analyse, and store relevant data; and the integration of experiment results into both the product development cycle and the software development process.
Software engineering education is under constant pressure to provide students with industry-relevant knowledge and skills. Educators must address issues beyond exercises and theories that can be directly rehearsed in small settings. Industry training has similar requirements of relevance as companies seek to keep their workforce up to date with technological advances. Real-life software development often deals with large, software-intensive systems and is influenced by the complex effects of teamwork and distributed software development, which are hard to demonstrate in an educational environment. A way to experience such effects and to increase the relevance of software engineering education is to apply empirical studies in teaching. In this paper, we show how different types of empirical studies can be used for educational purposes in software engineering. We give examples illustrating how to utilize empirical studies, discuss challenges, and derive an initial guideline that supports teachers to include empirical studies in software engineering courses. Furthermore, we give examples that show how empirical studies contribute to high-quality learning outcomes, to student motivation, and to the awareness of the advantages of applying software engineering principles. Having awareness, experience, and understanding of the actions required, students are more likely to apply such principles under real-life constraints in their working life.
In vitro engineered vascularized adipose tissue is, and will continue to be, in great demand, e.g. for the treatment of extensive high-grade burns or the replacement of tissue after tumor removal. To date, the lack of adequate culture conditions, primarily a suitable culture medium, has slowed further progress. In our study, we evaluated the influence of epidermal growth factor (EGF) and hydrocortisone (HC), often supplemented in endothelial cell (EC) specific media, on the co-culture of adipogenically differentiated adipose-derived stem cells (ASCs) and microvascular endothelial cells (mvECs). In ASCs, EGF and HC are thought to inhibit adipogenic differentiation and to have lipolytic activity. Our results showed that in indirect co-culture for 14 days, adipogenically differentiated ASCs further incorporated lipids and partly gained a univacuolar morphology when kept in media with low levels of EGF and HC. In media with high EGF and HC levels, cells did not incorporate further lipids; on the contrary, cells without lipid droplets appeared. Glycerol release, measured as an indicator of lipolysis, also increased with elevated amounts of EGF and HC in the culture medium. Adipogenically differentiated ASCs were able to release leptin in all setups. MvECs were functional and expressed the cell-specific markers CD31 and von Willebrand factor (vWF) independent of the EGF and HC content, as long as further EC-specific factors were present. Taken together, our study demonstrates that adipogenically differentiated ASCs can be successfully co-cultured with mvECs in a culture medium containing low or no amounts of EGF and HC, as long as further endothelial cell and adipocyte specific factors are available.
Internationality must be the hallmark and an essential component of a university's mission statement. Strategy development and implementation require the necessary structures within a university as well as networking with further national and international partners. No university leadership would question this requirement. And yet rectorates and presidencies still underestimate this "permanent marathon". An internationalisation strategy is frequently, and therefore incorrectly, compared to a perpetual motion machine: once adopted, it simply keeps running, with no standstill. But the more fitting image, also for Reutlingen University, is that of a strategy resembling a "continuous improvement process".
Curriculum design for the German language class in the double-degree programme business engineering
(2017)
This paper aims to give an overview of how German is taught as a foreign language to students enrolled in the Bachelor of Business Engineering, a double-degree programme offered at Universiti Malaysia Pahang. The double-degree students have the opportunity to complete their first two years of study in Malaysia and their last two years in Germany. Taking the TestDaF examination is compulsory for double-degree students. Hence, the German language curriculum has been meticulously planned to ensure that the students become competent in the language. The setting of the language class is therefore discussed thoroughly in this paper. Additionally, the paper discusses the challenges faced in teaching German as a foreign language and ends with some suggestions for improvement.
Decreasing batch sizes in production in line with Industrie 4.0 will lead to tremendous changes in the control of logistic processes in future production systems. Intelligent bins are crucial enablers for establishing decentrally controlled material flow systems in value chain networks as well as at the intralogistics level. These intelligent bins have to be integrated into an overall decentralized monitoring and control approach and have to interact with humans and other entities just like other cyber-physical systems (CPS) within the cyber-physical production system (CPPS). To realize a decentralized material supply following the overall aim of decentralized control of all production and logistics processes, an intelligent bin system is currently being developed at the ESB Logistics Learning Factory. This intelligent bin system will be integrated into the self-developed, cloud-based and event-oriented SES system (the so-called "Self Execution System"), which goes beyond the common functionalities and capabilities of traditional manufacturing execution systems (MES).
To ensure a holistic integration of the intelligent bin for different material types into the SES framework, the required hardware and software components for the decentrally controlled bin system will be split into a common and an adaptable component. The common component represents the localization and network layer, which is identical for every bin, whereas the adaptable component can be customized to different requirements, such as the specific characteristics of the parts.
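The split into a common and an adaptable component can be sketched in code. The following is a hypothetical illustration only; all class and method names are assumptions and do not reflect the actual SES implementation.

```python
# Sketch of the component split: a common layer (localization and
# networking, identical for every bin) and an adaptable, part-specific
# layer. All names here are illustrative assumptions.

class CommonBinLayer:
    """Localization and network functions shared by every intelligent bin."""

    def __init__(self, bin_id: str):
        self.bin_id = bin_id
        self.position = (0.0, 0.0)

    def update_position(self, x: float, y: float) -> None:
        self.position = (x, y)

    def report(self) -> dict:
        return {"bin_id": self.bin_id, "position": self.position}


class AdaptableBin(CommonBinLayer):
    """Part-specific extension, customizable to the material's characteristics."""

    def __init__(self, bin_id: str, part_type: str, fill_threshold: int):
        super().__init__(bin_id)
        self.part_type = part_type
        self.fill_threshold = fill_threshold
        self.count = 0

    def needs_replenishment(self) -> bool:
        # Decentralized control: the bin itself decides when to request material.
        return self.count < self.fill_threshold


bin1 = AdaptableBin("B-001", "screws M6", fill_threshold=20)
bin1.count = 5
print(bin1.needs_replenishment())  # True
```

A bin for a different material type would subclass `CommonBinLayer` with its own replenishment logic, while the localization layer stays unchanged.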
Close and safe interaction of humans and robots in joint production environments is technically feasible; however, it should not be implemented as an end in itself but to deliver improvement in one of a production system's target dimensions. Firstly, this paper shows that an essential challenge for system integrators during the design of HRC applications is to identify a suitable distribution of the available tasks between a robotic and a human resource. Secondly, it proposes an approach to determine task allocation by considering the actual capabilities of both human and robot in order to improve work quality. It matches those capabilities against the given requirements of a certain task in order to identify the maximum congruence as the basis for the allocation decision. The approach is based on a study and subsequent generic description of human and robotic capabilities as well as a heuristic procedure that facilitates the decision-making process.
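The capability-requirement matching idea can be sketched as follows. This is a minimal illustration under assumed names and a simple scoring rule; the paper's actual heuristic and capability model are not reproduced here.

```python
# Hypothetical sketch: each task lists required capability levels, each
# resource (human or robot) lists actual capability levels, and the task
# is allocated to the resource with the highest congruence score.
# The scoring rule and all names are illustrative assumptions.

def congruence(requirements: dict, capabilities: dict) -> float:
    """Average degree to which the required capabilities are met (0..1)."""
    scores = []
    for cap, required in requirements.items():
        actual = capabilities.get(cap, 0.0)
        scores.append(min(actual / required, 1.0) if required > 0 else 1.0)
    return sum(scores) / len(scores)


def allocate(task_requirements: dict, resources: dict) -> str:
    """Assign the task to the resource with maximum congruence."""
    return max(resources, key=lambda r: congruence(task_requirements, resources[r]))


task = {"payload_kg": 5, "dexterity": 0.8, "repeatability": 0.3}
resources = {
    "human": {"payload_kg": 10, "dexterity": 1.0, "repeatability": 0.5},
    "robot": {"payload_kg": 20, "dexterity": 0.4, "repeatability": 1.0},
}
print(allocate(task, resources))  # human
```

Here the robot's low dexterity score pulls its congruence below the human's, so the task is allocated to the human despite the robot's superior payload and repeatability.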
Technologies for mapping the "digital twin" have been under development for approximately 20 years. Nowadays, increasingly intelligent, individualized products encourage companies to respond innovatively to customer requirements and to handle the rising number of product variants quickly.
An integrated engineering network, spanning the entire value chain, is operated to intelligently connect various company divisions and to generate a business ecosystem for products, services and communities. This establishes the conditions for the digital twin, in which the digital world can be fed into the real world and the real world back into the digital, in order to handle such intelligent products with their rising number of variants.
The term digital twin describes a digital copy of a real factory, machine, worker, etc., that is created once and can then be independently expanded, automatically updated, and made globally available in real time. Every real product and production site is permanently accompanied by a digital twin. First prototypes of such digital twins already exist in the ESB Logistics Learning Factory, running on cloud- and app-based software that builds on a dynamic, multidimensional data and information model. A standardized language for the robot control systems, via software agents and positioning systems, still has to be integrated. The continuity of the real factory in the digital factory, as an economical means of keeping the digital models permanently up to date, is regarded as the basis of changeability.
For indoor localization, sensor combinations should be used that, in addition to the hardware, already contain the software required for sensor data fusion. Processing systems, live scenario simulations and digital shop floor management must necessarily be combined procedurally. Essential to the digital twin is the ability to consistently provide all subsystems with the latest state of all required information, methods and algorithms.
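The requirement that all subsystems consistently receive the latest state can be illustrated with a minimal publish/subscribe sketch. This is an assumption-laden toy model, not the actual SES or digital twin software; all names are invented for illustration.

```python
# Toy publish/subscribe model of a digital twin pushing every state
# change to all registered subsystems. Names are illustrative assumptions.

class DigitalTwin:
    def __init__(self):
        self.state = {}        # latest known state of the real factory
        self.subscribers = []  # subsystems that need to stay up to date

    def subscribe(self, callback) -> None:
        self.subscribers.append(callback)

    def update(self, key, value) -> None:
        self.state[key] = value
        for notify in self.subscribers:  # push the change to every subsystem
            notify(key, value)


twin = DigitalTwin()
log = []
twin.subscribe(lambda k, v: log.append((k, v)))
twin.update("robot_position", (1.2, 3.4))
print(log)  # [('robot_position', (1.2, 3.4))]
```

Each subsystem (simulation, shop floor management, processing systems) would register a callback and thus always work on the current state rather than a stale copy.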
This paper presents a novel multi-modal CNN architecture that exploits complementary input cues in addition to sole color information. The joint model implements a mid-level fusion that allows the network to exploit cross-modal interdependencies already at a medium feature level. The benefit of the presented architecture is shown for the RGB-D image understanding task. So far, state-of-the-art RGB-D CNNs have used network weights trained on color data. In contrast, a superior initialization scheme is proposed to pre-train the depth branch of the multi-modal CNN independently. In an end-to-end training, the network parameters are optimized jointly using the challenging Cityscapes dataset. Thorough experiments show the effectiveness of the proposed model. Both the RGB GoogLeNet and further RGB-D baselines are outperformed by a significant margin on two different tasks: semantic segmentation and object detection. For the latter, this paper shows how to extract object-level ground truth from the instance-level annotations in Cityscapes in order to train a powerful object detector.
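The mid-level fusion idea, separate modality branches concatenated at a medium feature level before joint layers, can be sketched with NumPy. The shapes, layer count, and names below are illustrative assumptions, not the paper's actual GoogLeNet-based architecture.

```python
# Minimal sketch of mid-level fusion for RGB-D input: each modality is
# processed by its own branch up to a medium feature level, then the
# features are concatenated so joint layers can learn cross-modal
# interdependencies. Shapes and names are illustrative assumptions.
import numpy as np

def branch(x, w):
    """Stand-in for a modality-specific layer stack (linear + ReLU)."""
    return np.maximum(x @ w, 0.0)

rng = np.random.default_rng(0)
rgb = rng.normal(size=(1, 16))    # color features
depth = rng.normal(size=(1, 16))  # depth features

w_rgb = rng.normal(size=(16, 8))    # RGB branch weights
w_depth = rng.normal(size=(16, 8))  # depth branch weights (would be
                                    # pre-trained independently per the paper)
w_joint = rng.normal(size=(16, 4))  # joint layers after fusion

# Mid-level fusion: concatenate mid-level feature maps of both branches.
mid = np.concatenate([branch(rgb, w_rgb), branch(depth, w_depth)], axis=1)
out = branch(mid, w_joint)
print(out.shape)  # (1, 4)
```

Early fusion would instead concatenate `rgb` and `depth` before any branch, and late fusion only after each branch has produced its final prediction; mid-level fusion sits between these extremes.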
Rational strain engineering requires solid testing of phenotypes, including productivity, and thereby ideally contributes directly to our understanding of the genotype-phenotype relationship. In practice, the test step of the strain engineering cycle is becoming the limiting step, as ever-advancing tools for generating genetic diversity exist. Here, we briefly define the challenge one faces in quantifying phenotypes and summarize existing analytical techniques that partially overcome this challenge. We argue that the evolution of volatile metabolites can be used as a proxy for cellular metabolism. In the simplest case, the product of interest is itself a volatile (e.g., from bulk alcohols to special fragrances) that is directly quantified over time. But nonvolatile products (e.g., from bulk long-chain fatty acids to natural products) also require major flux rerouting that potentially results in altered volatile production. While alternative techniques for volatile determination exist, rather few are suitable for the medium- to high-throughput analysis required for phenotype testing. Here, we contribute a detailed protocol for an ion mobility spectrometry (IMS) analysis that allows volatile metabolite quantification down to the ppb range. This sensitivity can be exploited for small-scale fermentation monitoring. The insights shared might contribute to a more frequent use of IMS in biotechnology, while the experimental aspects are of general use for researchers interested in volatile monitoring.
Despite the significant potential offered by the powder coating process for finishing wood-based materials, until now it has been used almost exclusively for coating Medium Density Fiber Board (MDF). A research project aims to develop processes and substrate materials that will allow lightweight boards to be powder coated.