It is common knowledge that electricity generation is in future to be based on renewable energies, and thus primarily on solar and wind power plants. This goal, formulated under the heading "Energiewende" (energy transition), is widely accepted, and various scenarios now exist that set out the timetable for it.
For Baden-Württemberg, the Ministry of the Environment has drawn up the "50-80-90" strategy: by 2050, energy consumption is to be reduced by 50%, 80% of the required energy is to be generated from renewable sources, and 90% of greenhouse gas emissions are to be saved.
Demand-driven control of decentralized thermal energy systems, such as combined heat and power (CHP) units and heat pumps, can make a decisive contribution to covering or reducing the residual load and thus lower the conventional residual electricity supply and the greenhouse gas emissions associated with it. To this end, a forecast-based control algorithm was developed at Reutlingen University over several years of research. This contribution presents that control algorithm together with its practical implementation variants: a version executable purely locally on a programmable logic controller (PLC), and a web-service application for operating several units in parallel from a central server. Tests on the CHP test bench at Reutlingen University confirm the reliable operation of the algorithm in the different implementation variants. At the same time, the advantage of demand-driven control over heat-led operation, which is the default particularly for micro-CHP units, is demonstrated in the form of an increase in self-consumption of generated electricity of up to 27%. Beyond demand-driven control, the developed algorithm also serves a further application area: predictable CHP operation, as required, for example, in the form of daily feed-in forecasts under Redispatch 2.0. The CHP operation can be predicted in two ways: first, heat-led operation can be modeled and forecast directly by the algorithm; alternatively, the unit can be operated under demand-driven control, in which case the computed optimal schedule simultaneously constitutes the operating forecast of the CHP unit. The developed control algorithm is thus able to contribute to the success of the energy transition in several different ways.
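The demand-driven dispatch idea can be illustrated with a minimal sketch. All names, units, and the simple storage model below are our own simplifying assumptions, not the algorithm developed at Reutlingen University: the CHP unit is switched on in hours of positive residual load, provided the thermal storage can still absorb its heat output.

```python
def chp_schedule(residual_load, heat_demand, storage_max, chp_th=2.5):
    """Greedy on/off schedule for a CHP unit (illustrative sketch):
    run it when the residual (electric) load is positive and the thermal
    storage has room for the unit's heat output chp_th.
    All quantities are in kWh per time step."""
    storage = 0.0
    schedule = []
    for load, heat in zip(residual_load, heat_demand):
        # serve the heat demand from storage first (auxiliary boiler assumed
        # to cover any shortfall)
        storage = max(0.0, storage - heat)
        run = load > 0 and storage + chp_th <= storage_max
        if run:
            storage += chp_th
        schedule.append(1 if run else 0)
    return schedule
```

A real forecast-based controller would optimize the schedule over predicted load and heat-demand profiles rather than deciding greedily per time step.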
Eliciting needs in a targeted way
(2021)
Makerspaces are an element of open innovation and offer the chance to bring the "classic inventor" and tinkerer out of the garage, basement, or workshop. The aim is to provide them with a professional and capable environment for realizing their ideas, to bring them into exchange with like-minded people, and to build a platform for exploiting the ideas and prototypes they develop. These options should also be made accessible to small and medium-sized enterprises, giving them the opportunity, with the means at their disposal, to approach innovation and cooperation in a similar way to large companies.
This is exactly where the present study comes in, examining the requirements that small and medium-sized enterprises place on makerspaces.
To remain relevant and mitigate disruption, traditional companies have to engage in multiple fast-paced experiments in digital offerings: revenue-generating solutions that leverage digital technologies to address customer needs. After launching several digital offering initiatives, reinsurance giant Munich Re noticed that many experienced similar challenges. This briefing describes how Munich Re addressed these common challenges by building a foundation for experimenting more systematically and successfully with digital offerings. The foundation has enabled Munich Re to become a serial innovator of digital offerings.
Bauspar contracts (building savings contracts) are combined savings and financing instruments designed for the general population. In 2020, around 25 million Bauspar contracts existed in Germany. A major part of their attractiveness for customers lies in the high flexibility of these financial products, which allows flexible adaptation to individual financing conditions over the life of the contract. In the savings phase, this includes, in particular, options to increase, reduce, or split contracts, as well as relatively flexible adjustment of the savings rate. Once a contract is ready for allocation, the savings phase can be continued within certain time limits. In the loan phase, flexible unscheduled repayments are possible at any time and without prepayment penalty.
The many embedded options influence one another and must always be considered and managed as a whole. Empirical experience of recent decades shows option-exercise behavior by customers that is oriented towards financial-mathematical considerations but does not proceed in a fully financially rational manner.
It is assumed that more education leads to better understanding of complex systems. Some researchers, however, find indications that simple mechanisms like stocks and flows are not well understood even by people who have completed higher education. In this paper, we test people’s understanding of complex systems with the widely studied stock-and-flow (SF) tasks (Booth Sweeney and Sterman 2000). SF tasks assess people’s understanding of the interplay between stocks and flows. We investigate SF failure of domain experts and novices in different knowledge domains. In particular, we compare performance on the original study’s Bathtub task with the square wave pattern (Booth Sweeney and Sterman 2000) with two alternative cover stories from the engineering and business domains, on different groups of business and engineering students from different semesters. Further, we show that, while engineering students perform better than business students, as they progress in higher education, students seem to lose the capability of dealing with simple SF tasks from domains other than their own field. We thus find hints of déformation professionnelle in higher education.
It is assumed that more education leads to better understanding of complex systems. Some researchers, however, find indications that simple mechanisms like stocks and flows are not well understood even by people who have completed higher education. In this paper, we test people’s understanding of complex systems with the widely studied stock-and-flow (SF) tasks. SF tasks assess people’s understanding of the interplay between stocks and flows. We investigate SF failure of domain experts and novices in different knowledge domains. In particular, we compare performance on the original study’s bathtub task with the square wave pattern with two alternative cover stories from the engineering and business domains, on different groups of business and engineering students from different semesters. Further, we show that, while engineering students perform better than business students, as they progress in higher education, students may lose the capability of dealing with simple SF tasks. We thus find hints of déformation professionnelle in higher education.
SF-failure, the inability of people to correctly determine the behavior of simple stock-and-flow structures, has been the subject of a long research stream. SF-failure can be attributed to different causes, one of them being a lack of domain-specific experience, i.e., familiarity with the problem context. In this article we present a continuation of an experiment examining the role of educational background in SF-performance. We base the question set on the Bathtub Dynamics tasks introduced by Booth Sweeney and Sterman (2000) and vary the cover stories. In this paper we describe how we developed and tested a new cover story for the engineering domain and implemented the recommendations from a prior study. We test three sets of questions with engineering students, which enables us to compare the results to a previous study in which we tested the questions with business students. The results mainly support our hypothesis that context familiarity increases SF-performance. With our findings we further develop the methodology of research on SF-failure.
Prior studies ascribed people’s poor performance in dealing with basic systems concepts to different causes. While results indicate that, among other things, domain specific experience and familiarity with the problem context play a role in this stock-flow-(SF-)performance, this has not yet been fully clarified. In this article, we present an experiment that examines the role of educational background in SF-performance. We hypothesize that SF-performance increases when the problem context is embedded in the problem solver’s knowledge domain, indicated by educational background. Using the square wave pattern and the sawtooth pattern tasks from the initial study by Booth Sweeney and Sterman (2000), we design two additional cover stories for the former, the Vehicle story from the engineering domain and the Application story from the business domain, next to the original Bathtub story. We then test the three sets of questions on business students. Results mainly support our hypothesis. Interestingly, participants even do better on a more complex behavioral pattern from their knowledge domain than on a simpler pattern from more distant domains. Although these findings have to be confirmed by further studies, they contribute both to the methodology of future surveys and the context familiarity discussion.
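The stock-and-flow logic behind the bathtub-style tasks discussed above can be stated in a few lines (a generic illustration, not material from the studies): the stock accumulates the net flow, so it peaks exactly where the inflow drops below the outflow.

```python
def simulate_stock(initial, inflow, outflow):
    """Integrate a stock over time: each step adds the net flow
    (inflow minus outflow) to the previous stock value."""
    stock = [initial]
    for i, o in zip(inflow, outflow):
        stock.append(stock[-1] + i - o)
    return stock

# A square-wave inflow around a constant outflow of 50 produces a
# triangular (rising, then falling) stock trajectory:
print(simulate_stock(100, [75, 75, 25, 25], [50, 50, 50, 50]))
# → [100, 125, 150, 125, 100]
```

The common SF-failure is to expect the stock to mirror the inflow pattern (a square wave) instead of integrating it.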
This article adopts a qualitative comparative causal mapping approach to extend knowledge of the interrelated barriers to public entrepreneurship and the outcomes of such entrepreneurship. The results highlight marked differences between the sales segment and the distribution grid segment of German public enterprises that should prompt a refined perspective on public entrepreneurship. Notably, besides intra-organizational barriers and those interfering from the external environment, results also show that a public enterprise’s supervisory board can hinder its progress. This study thus contributes to recent discussion on governance and entrepreneurship by revealing a feature that could distinguish public from private enterprises.
The present publication reports the purification effort of two natural bone blocks, that is, an allogeneic bone block (maxgraft®, botiss biomaterials GmbH, Zossen, Germany) and a xenogeneic block (SMARTBONE®, IBI S.A., Mezzovico Vira, Switzerland), in addition to previously published results based on histology. Furthermore, specialized scanning electron microscopy (SEM) and in vitro analyses (XTT, BrdU, LDH) for testing of the cytocompatibility based on ISO 10993-5/-12 have been conducted. The microscopic analyses showed that both bone blocks possess a trabecular structure with a lamellar subarrangement. In the case of the xenogeneic bone block, only minor remnants of collagenous structures were found, while in contrast high amounts of collagen were found associated with the allogeneic bone matrix. Furthermore, in the case of the xenogeneic bone substitute, only island-like remnants of the polymer coating seemed to be detectable. Finally, no remaining cells or cellular remnants were found in either bone block. The in vitro analyses showed that both bone blocks are biocompatible. Altogether, the purification level of both bone blocks seems to be favorable for bone tissue regeneration without the risk of inflammatory responses or graft rejection. Moreover, the analysis of the maxgraft® bone block showed that the underlying purification process allows for preserving not only the calcified bone matrix but also high amounts of the intertrabecular collagen matrix.
Back to the future: origins and directions of the “Agile Manifesto” – views of the originators
(2018)
In 2001, seventeen professionals drew up the manifesto for agile software development. They wanted to define values and basic principles for better software development. Beyond its original focus, the manifesto has been widely adopted by developers, in software-developing organizations, and outside the world of IT. Agile principles and their implementation in practice have paved the way for radically new and innovative ways of software and product development. In parallel, the understanding of the manifesto’s underlying principles has evolved over time. This, in turn, may affect current and future applications of agile principles. This article presents results from a survey and an interview study conducted in collaboration with the original contributors of the manifesto for agile software development. Furthermore, it comprises the results from a workshop with one of the original authors. This publication focuses on the origins of the manifesto, the contributors’ views from today’s perspective, and their outlook on future directions. We evaluated 11 responses from the survey and 14 interviews to understand the viewpoint of the contributors. They emphasize that agile methods need to be carefully selected and that agile should not be seen as a silver bullet. They underline the importance of considering the variety of different practices and methods that had an influence on the manifesto. Furthermore, they mention that people should question their current understanding of "agile" and recommend reconsidering the core ideas of the manifesto.
Due to the growing importance of videos for B2B sales outreach, this study proposes an extended model based on the Corporate-Video Model by Büsching and Meidel (2016) and a qualitative survey with high-ranking company representatives. The findings comprise seven complementary categories: structure, communication, product display, information content, unique selling proposition, value-based selling, and dramaturgy of product videos. The model extension aids practitioners in analyzing and conceptualizing compelling B2B product videos.
Although the advantages of value-based pricing have been known for years, it is gaining ground only slowly. The first study of pricing behavior by business type shows that 35 years of pricing research and consulting have, for the first time, weakened the dominance of cost-based pricing. The author ventures an explanation and encourages greater market orientation.
Gelatin is one of the most prominent biopolymers in biomedical material research and development. It is frequently used in hybrid hydrogels, which combine the advantageous properties of bio‐based and synthetic polymers. To prevent the biological component from leaching out of the hydrogel, the biomolecules can be equipped with azides. Those groups can be used to immobilize gelatin covalently in hydrogels by the highly selective and specific azide–alkyne cycloaddition. In this contribution, we functionalized gelatin with azides at its lysine residues by diazo transfer, which offers the great advantage of only minimal side‐chain extension. Approximately 84–90% of the amino groups are modified as shown by 1H‐NMR spectroscopy, 2,4,6‐trinitrobenzenesulfonic acid assay as well as Fourier‐transform infrared spectroscopy, rheology, and the determination of the isoelectric point. Furthermore, the azido‐functional gelatin is incorporated into hydrogels based on poly(ethylene glycol) diacrylate (PEG‐DA) at different concentrations (0.6, 3.0, and 5.5%). All hydrogels were classified as noncytotoxic with significantly enhanced cell adhesion of human fibroblasts on their surfaces compared to pure PEG‐DA hydrogels. Thus, the new gelatin derivative is found to be a very promising building block for tailoring the bioactivity of materials.
In recent years, the development and application of decellularized extracellular matrices (ECMs) for use as biomaterials have grown rapidly. These cell-derived matrices (CDMs) represent highly bioactive and biocompatible materials consisting of a complex assembly of biomolecules. Even though CDMs mimic the natural microenvironment of cells in vivo very closely, they still lack specifically addressable functional groups, which are often required to tailor a biomaterial functionality by bioconjugation. To overcome this limitation, metabolic glycoengineering has emerged as a powerful tool to equip CDMs with chemical groups such as azides. These small chemical handles are known for their ability to undergo bioorthogonal click reactions, which represent a desirable reaction type for bioconjugation. However, ECM insolubility makes its processing very challenging. In this contribution, we isolated both the unmodified ECM and azide-modified clickECM by osmotic lysis. In a first step, these matrices were concentrated to remove excessive water from the decellularization step. Next, the hydrogel-like ECM and clickECM films were mechanically fragmentized, resulting in easy to pipette suspensions with fragment sizes ranging from 7.62 to 31.29 μm (as indicated by the mean d90 and d10 values). The biomolecular composition was not impaired as proven by immunohistochemistry. The suspensions were used for the reproducible generation of surface coatings, which proved to be homogeneous in terms of ECM fragment sizes and coating thicknesses (the mean coating thickness was found to be 33.2 ± 7.3 μm). Furthermore, they were stable against fluid-mechanical abrasion in a laminar flow cell. When primary human fibroblasts were cultured on the coated substrates, an increased bioactivity was observed. By conjugating the azides within the clickECM coatings with alkyne-coupled biotin molecules, a bioconjugation platform was obtained, where the biotin–streptavidin interaction could be used. Its applicability was demonstrated by equipping the bioactive clickECM coatings with horseradish peroxidase as a model enzyme.
Virtual Reality (VR) technology has the potential to support knowledge communication in several sectors. Still, when educators use immersive VR technology to present their knowledge, their audience in the same room may no longer be able to see them because they are wearing head-mounted displays (HMDs). In this paper, we propose the Avatar2Avatar system and design, which augments the visual aspect during such a knowledge presentation. Avatar2Avatar enables users to see both a realistic representation of their respective counterpart and the virtual environment at the same time. We point out several design aspects of such a system and address design challenges and possibilities that arose during implementation. We specifically explore opportunities of a system design for integrating 2D video-avatars in existing roomscale VR setups. An additional user study indicates a positive impact on spatial presence when using Avatar2Avatar.
The EU-funded project RobLog recently developed a system able to autonomously unload coffee sacks from a standard container. Being the first of its kind, the system requires further development in order to be competitive with manual labor. Financing this development entails a risk, and hence a justified skepticism, which can be overcome by a far-sighted view of the existing market potential. This paper presents a method to estimate the market potential of autonomous unloading systems for heavy deformable goods. Starting from an analysis of the coffee trade, the current coffee traffic is first investigated in order to calculate the number of autonomous systems needed to handle the imported sacks; the results are validated, and the method is extended to calculate the potential of other market segments where the same unloading technology can be applied.
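The core of such a market-potential estimate reduces to a capacity calculation; the sketch below is a back-of-the-envelope illustration with hypothetical figures, not RobLog project data:

```python
import math

def systems_needed(annual_sacks, sacks_per_hour, hours_per_year, utilization=0.85):
    """Number of autonomous unloading systems required to handle a yearly
    sack volume, given per-system throughput and an assumed utilization
    factor (illustrative assumption, not project data)."""
    capacity = sacks_per_hour * hours_per_year * utilization
    return math.ceil(annual_sacks / capacity)
```

For example, at a hypothetical 600 sacks per hour and 2000 operating hours per year, 2.5 million imported sacks would call for three systems.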
Autonomization of shop floor management: the path from analog to autonomous shop floor management
(2021)
New technologies of digitalization, networking, and artificial intelligence will increasingly find their way into shop floor management (SFM). This contribution describes, in four stages, how classic SFM could evolve via digital SFM towards smart and autonomous SFM. Building on this, it discusses what effects the use of these new technologies would have on the operational design of SFM and what consequences this would entail for employees and managers.
Distribution grid operators must take various measures to meet the challenges posed by the increasing installation of decentralized generation plants. Although most of these measures keep voltages within their limits, they do not solve the problem of reverse power flow into the superordinate grid level and the associated power losses. In the project "Demo-rONT-Alternative", a prototype of a remotely controllable cable distribution cabinet was developed in order to automate the relocation of grid disconnection points.
The success of the energy transition in Germany requires an increasing number of decentralized electrical generation plants. These decentralized plants, such as photovoltaic systems or combined heat and power units, pose major challenges for distribution grid operators. Within the funded research project "Demonstrator Automatisierte Kabelverteil (KV) als Alternative zum regelbaren Ortsnetztransformator (DEMO rONT-Alternative)", a new approach to solving this problem was investigated: actively changing the topology of grid areas depending on the electrical load and PV feed-in (relocation of disconnection points).
Unlike digital ICs, which can be designed in a highly automated fashion, the design of analog ICs remains manual work to this day. Common optimization-based automation approaches fail. A research project has now investigated the causes in order to derive new approaches to design automation for analog ICs.
The provisioning tool automaIT was prototypically extended with a data discovery capability, with the goal of connecting and controlling systems not managed by automaIT. Data from the discovery step are collected with the tool Facter and can be dynamically integrated into executable automaIT models and evaluated. This allows subsequent provisioning steps to be controlled without the need for manual intervention.
Wear on cutting tools with a geometrically defined cutting edge leads to poor surface quality, increased forces, dimensional deviations, and breakage. Until now, this wear has been measured outside the machine or indirectly (e.g., via the diameter). Tools are exchanged after a certain number of workpieces, a certain time, or a certain cutting distance. This contribution presents a novel system for the direct measurement of flank wear inside the working area of a machining center. A protected, integrated industrial camera with lens is installed in the working area, and the machine axes or the machining spindle position the tool in front of it. After a measurement lasting only a few seconds, the wear evaluation takes place in parallel with machining.
According to several surveys and statistics, the great majority of companies previously not accustomed to automation are piloting solutions to automate business processes. Those accustomed to automation also attempt to introduce more of it, focusing on automation-unfriendly processes that have remained manual. However, since the decision on what and whether to automate is not trivial, for evident reasons, even industry leaders may get stuck on an overwhelming question: where to begin automating? The question too often remains unanswered, as state-of-the-art methods fail to consider the whole picture. This paper introduces a holistic approach to decision-making for investments in automation. The method supports the iterative analysis and evaluation of operative processes, providing tools for a quantitative approach to decision-making. Thanks to the method, a large pool of processes can first be considered and then filtered in order to select the one that yields the best value for automation in the specific context. After the method is introduced, a case study is reported for validation, followed by a discussion.
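The filtering step of such a quantitative approach could, for instance, score each candidate process on weighted criteria and rank the pool. The function, criteria names, and weights below are hypothetical illustrations, not the paper's actual method:

```python
def rank_processes(processes, weights):
    """Rank candidate processes by a weighted sum of criterion scores,
    so a large pool can be narrowed down to the best automation target.
    Each process is a dict with a "scores" mapping of criterion -> score."""
    def score(p):
        return sum(weights[c] * p["scores"][c] for c in weights)
    return sorted(processes, key=score, reverse=True)

candidates = [
    {"name": "invoicing", "scores": {"feasibility": 3, "value": 5}},
    {"name": "archiving", "scores": {"feasibility": 5, "value": 1}},
]
best = rank_processes(candidates, {"feasibility": 0.4, "value": 0.6})[0]
```

An iterative use of such a ranking, with re-scored criteria after each evaluation round, matches the spirit of the holistic approach described above.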
Purpose: To develop a method for synthesizing a fuzzy automatic control system for a shearer drum in terms of coal seam hypsometry, based on the information criterion of the beginning of rock cutting-off by the drum, in order to reduce the ash content of the extracted coal.
Methodology: Taking into consideration the peculiarities of determining a distinct information criterion of the beginning of rock cutting-off by the drum and the regularities of its variation during shearer operation, a fuzzy inference algorithm is developed for a system of fuzzy automatic drum control in terms of seam hypsometry. In this context, the rules of fuzzy productions, the parameters of the membership functions of the terms of the output linguistic variable, and the fuzzy operations are substantiated according to the recommendations of the classic Mamdani fuzzy inference algorithm. Studies are carried out to analyze the efficiency of the proposed fuzzy inference algorithm, based on the introduced relative parameter of the number of effective control actions formed by the fuzzy control system. Simulation modeling makes it possible to perform a comparative analysis of the efficiency of the drum control.
Findings: In the course of the research, an algorithm for fuzzy control of the shearer’s upper drum in terms of coal seam hypsometry has been developed, based on the determination of the direct and inverse transition from coal breaking near the seam roof by the shearer drum to rock breaking, with the help of statistical analysis of the stator power of the cutting drive motor.
Originality: For the first time, a method for synthesizing fuzzy automatic control of the drum in terms of seam hypsometry has been proposed.
Practical value: The proposed method is the theoretical basis for solving the important scientific and applied problem of automating the coal shearer drum in terms of seam hypsometry in order to reduce the ash content of the produced coal.
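A classic Mamdani inference step, as referenced in the methodology above, can be sketched as follows. This is a generic textbook implementation with triangular membership functions, not the authors' concrete rule base or parameters:

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def mamdani_centroid(x, rules, universe):
    """Minimal Mamdani inference: clip each rule's output set at the rule's
    firing strength (min), aggregate rules with max, and defuzzify by the
    centroid of the aggregated set over a discretized output universe.
    Each rule is a pair (strength, out_set) of membership functions."""
    num = den = 0.0
    for y in universe:
        mu = max(min(strength(x), out_set(y)) for strength, out_set in rules)
        num += y * mu
        den += mu
    return num / den if den else 0.0
```

With a single symmetric rule, the centroid lands at the peak of its output set, which is a quick sanity check for the implementation.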
The high system flexibility necessary for the full automation of complex and unstructured tasks leads to increased technological complexity and thus to higher costs and lower performance. In this paper, after an introduction to the different dimensions of flexibility, a method for the flexible modular configuration and evaluation of systems of systems is introduced. The method starts from process requirements and, considering factors such as feasibility, development costs, market potential, and effective impact on current processes, enables the evaluation of a flexible system of systems equipped with the needed functionalities before its actual development. This allows setting the focus on those aspects of flexibility that add market value to the system, thus promoting the efficient development of systems addressed to interested customers in intralogistics. An example application of the method is given and discussed.
Physical analog IC design has not been automated to the same degree as digital IC design. This shortfall is primarily rooted in the analog IC design problem itself, which is considerably more complex even for small problem sizes. Significant progress has been made in analog automation in several R&D target areas in recent years. Constraint engineering and generator-based module approaches are among the innovations that have emerged. Our paper will first present a brief review of the state of the art of analog layout automation. We will then introduce active and open research areas and present two visions – a “continuous layout design flow” and a “bottom-up meets top-down design flow” – which could significantly push analog design automation towards its goal of analog synthesis.
In a time of digital transformation, the ability to quickly and efficiently adapt software systems to changed business requirements becomes more important than ever. Measuring the maintainability of software is therefore crucial for the long-term management of such products. With service-based systems (SBSs) being a very important form of enterprise software, we present a holistic overview of maintainability metrics specifically designed for this type of system, since traditional metrics – e.g. object-oriented ones – are not fully applicable in this case. The metric candidates selected from the literature review were mapped to four dominant design properties: size, complexity, coupling, and cohesion. Microservice-based systems (μSBSs) emerge as an agile and fine-grained variant of SBSs. While the majority of the identified metrics are also applicable to this specialization (with some limitations), the large number of services in combination with technological heterogeneity and decentralization of control significantly impacts automatic metric collection in such a system. Our research therefore suggests that specialized tool support is required to guarantee the practical applicability of the presented metrics to μSBSs.
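Coupling metrics of the kind surveyed above can be computed directly from a service dependency graph. The sketch below counts, per service, how many distinct services call it and how many it calls, echoing common importance/dependence coupling metrics from the service-metrics literature; it is our own simplified illustration, not code from the survey:

```python
def coupling_metrics(deps):
    """Compute two simple coupling indicators per service from a
    dependency graph (deps maps service name -> list of called services):
    - importance: how many distinct other services call this service
    - dependence: how many distinct other services this service calls"""
    importance = {s: 0 for s in deps}
    for s, targets in deps.items():
        for t in set(targets):
            if t != s:  # ignore self-calls
                importance[t] = importance.get(t, 0) + 1
    dependence = {s: len({t for t in targets if t != s})
                  for s, targets in deps.items()}
    return importance, dependence
```

Services with high importance are risky to change (many callers), while services with high dependence are fragile (many upstream failure sources), which is why both directions of coupling are usually tracked.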
DMOS transistors often suffer from substantial self-heating during high power dissipation, which can lead to thermal destruction if the device temperature reaches excessive values. A successfully demonstrated method to reduce the peak temperature is the redistribution of power dissipation density from the hotter to the cooler device areas by careful layout modification. However, this is very tedious and time-consuming if complex-shaped devices as often found in industrial applications are considered.
This paper presents an approach to fully automatic layout optimization which requires only a few hours of processing time. The approach is applied to complex-shaped test structures which are investigated by measurements and electro-thermal simulations. Results show a significantly lower peak temperature and an energy capability gain of 84 %, offering potential for an 18 % size reduction of the active area.
Annotations of character IDs in news images are critical as ground truth for news retrieval and recommendation systems. Universality and accuracy optimization of deep neural network models constitutes the key technology to improve the precision and computing efficiency of automatic news character identification, which is attracting increased attention globally. This paper explores the optimized deep neural network model for automatic focus personage identification in multi-lingual news. First, the face model of the focus personage is trained by using the corresponding face images from German news as positive samples. Next, the scheme of Recurrent Convolutional Neural Network (RCNN) + Bi-directional Long-Short Term Memory (Bi-LSTM) + Conditional Random Field (CRF) is utilized to label the focus name, and the RCNN-RCNN encoder–decoder is applied to translate names of people into multiple languages. Third, face features are described by combining the advantages of Local Gabor Binary Pattern Histogram Sequence (LGBPHS) and RCNN, and iterative quantization (ITQ) is used to binarize codes. Finally, a name semantic network is built for different domains. Experiments are performed on a dataset which comprises approximately 100,000 news images. The experimental results demonstrate that the proposed method achieves a significant improvement over other algorithms.
Checklists are a valuable tool to ensure process quality and quality of care. To ensure proper integration in clinical processes, it would be desirable to generate checklists directly from formal process descriptions. Those checklists could also be used for user interaction in context-aware surgical assist systems. We built a tool to automatically convert Business Process Model and Notation (BPMN) process models to checklists displayed as HTML websites. Gateways representing decisions are mapped to checklist items that trigger dynamic content loading based on the placed checkmark. The usability of the resulting system was positively evaluated regarding comprehensibility and end-user friendliness.
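The gateway-to-checklist mapping described above can be sketched roughly as follows. This is a minimal illustration assuming a simplified BPMN XML input; the item/decision dictionaries are hypothetical stand-ins for the tool's actual HTML output model:

```python
import xml.etree.ElementTree as ET

NS = {"bpmn": "http://www.omg.org/spec/BPMN/20100524/MODEL"}

def bpmn_to_checklist(xml_text):
    """Map BPMN tasks to checklist items and exclusive gateways to
    decision items whose checkmark would trigger dynamic content loading."""
    root = ET.fromstring(xml_text)
    items = []
    # every task becomes a plain checklist entry
    for task in root.iter(f"{{{NS['bpmn']}}}task"):
        items.append({"type": "item", "label": task.get("name")})
    # every exclusive gateway becomes a decision with one option per outgoing flow
    for gw in root.iter(f"{{{NS['bpmn']}}}exclusiveGateway"):
        options = [flow.text.strip() if flow.text else flow.get("id")
                   for flow in gw.findall("bpmn:outgoing", NS)]
        items.append({"type": "decision", "label": gw.get("name"),
                      "options": options})
    return items
```

In a real implementation the decision options would link to the follow-up content fragments that are loaded when the corresponding checkmark is placed.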
Automatic content creation system for augmented reality maintenance applications for legacy machines
(2024)
Augmented reality (AR) applications have great potential to assist maintenance workers in their operations. However, creating AR solutions is time-consuming and laborious, which limits their widespread adoption in the industry. It therefore often happens that even with latest-generation machines, instead of an AR solution, the user only receives an electronic manual for equipment operation and maintenance. This is commonplace with legacy machines. For this reason, solutions are required that simplify the creation of such AR solutions. This paper presents an approach that uses an electronic manual as a basis to create fast and cost-effective AR solutions for maintenance. As part of the approach, an application was developed to automatically identify and subdivide the chapters of electronic manuals via the bookmarks in the table of contents. The contents are then automatically uploaded to a central server and indexed with a suitable marker to make the data retrievable. The prepared content can then be accessed via the marker for creating context-related AR instructions. The application is characterized by the fact that no developers or experts are required to prepare the information. In addition to complying with common design criteria, the clear presentation of the contents and the intuitive use of the system offer added value for the performance of maintenance tasks. Together, these two elements form a novel way to retrofit legacy machines with AR maintenance instructions. The practical validation of the system took place in a factory environment. For this purpose, content was created for a filter change on a CNC milling machine. The results show that inexperienced users can extract appropriate content with the software application. Furthermore, it is shown that maintenance workers can access the content with an AR application developed for the Microsoft HoloLens 2 and complete simple tasks provided in the manufacturer's electronic manual.
Switched reluctance motors are particularly attractive due to their simple structure. The control of this machine type requires knowledge of the instants at which to switch the currents in the motor phases in an appropriate sequence. These switching instants are determined either based on a position sensor or on signals generated by a sensorless method. A very simple sensorless method uses the switching frequency of the hysteresis controllers used for phase current control. This paper first presents an automatic commissioning method for this sensorless approach and second a startup procedure, thus enhancing the approach towards industrial application.
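The effect that this sensorless method exploits can be illustrated with a minimal simulation. The sketch below is a hedged illustration under simplified assumptions (constant DC-link voltage, plain R-L phase model, forward-Euler integration), not the commissioning method itself; it merely shows how the phase inductance, which varies with rotor position in a switched reluctance motor, determines the switching frequency of a hysteresis current controller:

```python
def hysteresis_switch_count(L, R=1.0, V=24.0, i_ref=2.0, band=0.1,
                            dt=1e-6, t_end=0.02):
    """Simulate a hysteresis current controller driving an R-L phase and
    count switching events.  A smaller inductance L lets the current
    traverse the hysteresis band faster, yielding more switching events
    (i.e. a higher switching frequency) in the same time window."""
    i, on, switches = 0.0, True, 0
    for _ in range(int(t_end / dt)):
        v = V if on else -V
        i += dt * (v - R * i) / L          # di/dt = (v - R*i) / L
        if on and i >= i_ref + band:       # upper band limit hit: switch off
            on, switches = False, switches + 1
        elif not on and i <= i_ref - band: # lower band limit hit: switch on
            on, switches = True, switches + 1
    return switches
```

Evaluating the switch count for two inductance values (e.g. aligned vs. unaligned rotor position) reproduces the frequency difference the sensorless method measures.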
Automatic classification of rotating machinery defects using Machine Learning (ML) algorithms
(2020)
Electric machines and motors have been the subject of enormous development. New concepts in design and control allow expanding their applications in different fields. Vast amounts of data have been collected in almost any domain of interest. They can be static; that is to say, they represent real-world processes at a fixed point in time. Vibration analysis and vibration monitoring, including the detection and monitoring of anomalies in vibration data, are widely used techniques for predictive maintenance in high-speed rotating machines. However, accurately identifying the presence of a bearing fault can be challenging in practice, especially when the failure is still at its incipient stage and the signal-to-noise ratio of the monitored signal is small. The main objective of this work is to design a system that analyzes the vibration signals of a rotating machine in the time/frequency domain, based on recorded sensor data. Owing to this substantial interest, there has been a dramatic increase in the application of Machine Learning (ML) algorithms to this task. An ML system is used to classify and detect abnormal behavior and to recognize the different levels of machine operation modes. The proposed solution can be deployed as predictive maintenance for Industry 4.0.
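As an illustration of this kind of pipeline, the sketch below extracts common time-domain vibration features (RMS, kurtosis, crest factor) and classifies them with a simple nearest-centroid rule. It is a simplified stand-in for the ML system described above; the feature set and classifier choice are illustrative assumptions, not the paper's actual design:

```python
import math

def features(signal):
    """Time-domain features commonly used for bearing-fault detection:
    impulsive faults raise kurtosis and crest factor markedly."""
    n = len(signal)
    mean = sum(signal) / n
    rms = math.sqrt(sum(x * x for x in signal) / n)
    var = sum((x - mean) ** 2 for x in signal) / n
    kurt = (sum((x - mean) ** 4 for x in signal) / n) / (var ** 2)
    crest = max(abs(x) for x in signal) / rms
    return [rms, kurt, crest]

def nearest_centroid(train, labels, sample):
    """Assign the label of the closest class centroid in feature space."""
    cents = {}
    for f, y in zip(train, labels):
        cents.setdefault(y, []).append(f)
    def centroid(fs):
        return [sum(c) / len(fs) for c in zip(*fs)]
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min((dist(sample, centroid(fs)), y) for y, fs in cents.items())[1]
```

A practical system would add frequency-domain features (e.g. spectral energy around bearing defect frequencies) and a stronger classifier, but the structure stays the same: feature extraction followed by supervised classification.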
Automatic anode rod inspection in aluminum smelters using deep-learning techniques: a case study
(2020)
Automatic fault detection using machine learning has become an exciting and promising area of research, because it provides an accurate and timely way to manage and classify faults with minimal human effort. In the computer vision community, deep-learning methods have become the most suitable approaches for this task. Anodes are large carbon blocks that are used to conduct electricity during the aluminum reduction process. The most basic function of anode rod inspection is to prevent a situation where the anode rod will not fit into the stub-holes of a new anode. This would be the case for a rod containing either severe toe-in, missing stubs, or a retained thimble on one or more stubs. In this work, to improve the accuracy of shape defect inspection for an anode rod, we use the Fast Region-based Convolutional Network (Fast R-CNN) model. To train the detection model, we collect an image dataset comprising multiple classes of anode rod defects with annotated labels. Our model is trained using a small number of samples, an essential requirement in the industry, where the number of available defective samples is limited. It can simultaneously detect multiple classes of anode rod defects in nearly real time.
Automated stabilization of loading capacity of coal shearer screw with controlled cutting drive
(2015)
A solution to the topical scientific problem of increasing coal shearer output while providing minimum specific power consumption for coal cutting, transportation, and loading in thin seams is proposed. The solution is based on the previously proposed criterion of screw gumming for the optimum ratio of cutting velocity to coal shearer feed rate, in the context of increased screw rotation achieved by increasing the phase voltage frequency. Simulation results for the automated control system of coal shearer operations with a frequency-controlled cutting drive in thin seams have confirmed the efficiency of the system using the proposed algorithm for smart analysis of the coal shearer power signal.
The recent years and especially the Internet have changed the ways in which data is stored. It is now common to store data in the form of transactions, together with their creation time-stamps. These transactions can often be attributed to logical units, e.g., all transactions that belong to one customer. These groups, which we refer to as data sequences, have a more complex structure than tuple-based data. This makes it more difficult to find discriminatory patterns for classification purposes. However, the complex structure potentially enables us to track behaviour and its change over the course of time. This is quite interesting, especially in the e-commerce area, in which classification of a sequence of customer actions is still a challenging task for data miners. However, before standard algorithms such as Decision Trees, Neural Nets, Naive Bayes or Bayesian Belief Networks can be applied to sequential data, preparations are required in order to capture the information stored within the sequences. Therefore, this work presents a systematic approach on how to reveal sequence patterns in data and how to construct powerful features out of the primitive sequence attributes. This is achieved by sequence aggregation and the incorporation of the time dimension into the feature construction step. The proposed algorithm is described in detail and applied to a real-life data set, which demonstrates its ability to boost the classification performance of well-known data mining algorithms for binary classification tasks.
Data collected from internet applications are mainly stored in the form of transactions. All transactions of one user form a sequence, which shows the user's behaviour on the site. Nowadays, it is important to be able to classify this behaviour in real time for various reasons: e.g. to increase the conversion rate of customers while they are in the store or to prevent fraudulent transactions before they are placed. However, this is difficult due to the complex structure of the data sequences (i.e. a mix of categorical and continuous data types, constant data updates) and the large amounts of data that are stored. Therefore, this thesis studies the classification of complex data sequences. It surveys the fields of time series analysis (temporal data mining), sequence data mining, and standard classification algorithms. It turns out that these algorithms are either difficult to apply to data sequences or do not deliver a classification: Time series need a predefined model and are not able to handle complex data types; sequence classification algorithms such as the apriori algorithm family are not able to utilize the time aspect of the data. The strengths and weaknesses of the candidate algorithms are identified and used to build a new approach to solve the problem of classification of complex data sequences. The problem is thereby solved by a two-step process. First, feature construction is used to create and discover suitable features in a training phase. Then, the blueprints of the discovered features are used in a formula during the classification phase to perform the real-time classification. The features are constructed by combining and aggregating the original data over the span of the sequence, including the elapsed time by using a calculated time axis. Additionally, a combination of features and feature selection are used to simplify complex data types. This makes it possible to capture behavioural patterns that emerge over the course of time.
The newly proposed approach combines techniques from several research fields. Part of the algorithm originates from the field of feature construction and is used to reveal behaviour over time and express this behaviour in the form of features. A combination of the features is used to highlight relations between them. The blueprints of these features can then be used to achieve classification in real time on an incoming data stream. An automated framework is presented that allows the features to adapt iteratively to a change in the underlying patterns of the data stream. This core feature of the presented work is achieved by separating the feature application step from the computationally costly feature construction step and by iteratively restarting the feature construction step on the new incoming data. The algorithm and the corresponding models are described in detail and applied to three case studies (customer churn prediction, bot detection in computer games, credit card fraud detection). The case studies show that the proposed algorithm is able to find distinctive information in data sequences and use it effectively for classification tasks. The promising results indicate that the suggested approach can be applied to a wide range of other application areas that incorporate data sequences.
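The feature-construction idea outlined above, aggregating a raw transaction sequence over its time span into fixed-size features, can be sketched roughly as follows. The field names (`ts`, `amount`) and the chosen aggregates are hypothetical illustrations, not the thesis's actual feature set:

```python
from datetime import datetime, timedelta

def construct_features(transactions, now):
    """Aggregate a customer's transaction sequence into fixed-size features,
    incorporating elapsed time so that behavioural change becomes visible.
    Each transaction is a dict with hypothetical fields 'ts' and 'amount'."""
    if not transactions:
        return {"count": 0, "total": 0.0, "recent_share": 0.0, "tempo": 0.0}
    txs = sorted(transactions, key=lambda t: t["ts"])
    total = sum(t["amount"] for t in txs)
    recent = [t for t in txs if now - t["ts"] <= timedelta(days=7)]
    span_days = (txs[-1]["ts"] - txs[0]["ts"]).total_seconds() / 86400 or 1.0
    return {
        "count": len(txs),
        "total": total,
        "recent_share": len(recent) / len(txs),  # drift towards recent activity
        "tempo": len(txs) / span_days,           # transactions per day
    }
```

The resulting fixed-size vector can then be fed to any standard classifier, and the "blueprint" (the aggregation recipe) can be re-applied cheaply to a live data stream, mirroring the separation of feature construction and feature application described above.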
The state of the art proposes the microservices architectural style to build applications. Additionally, container virtualization and container management systems evolved into the perfect fit for developing, deploying, and operating microservices in line with the DevOps paradigm. Container virtualization facilitates deployment by ensuring independence from the runtime environment. However, microservices store their configuration in the environment. Therefore, software developers have to wire their microservice implementation with technologies provided by the target runtime environment such as configuration stores and service registries. These technological dependencies counteract the portability benefit of using container virtualization. In this paper, we present AUTOGENIC - a model-based approach to assist software developers in building microservices as self configuring containers without being bound to operational technologies. We provide developers with a simple configuration model to specify configuration operations of containers and automatically generate a self-configuring microservice tailored for the targeted runtime environment. Our approach is supported by a method, which describes the steps to automate the generation of self-configuring microservices. Additionally, we present and evaluate a prototype, which leverages the emerging TOSCA standard.
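The idea of generating runtime-specific self-configuration logic from an abstract configuration model might be sketched as follows. The model format, target names, and generated script are simplified assumptions for illustration, not the actual AUTOGENIC model or its TOSCA artifacts:

```python
def generate_entrypoint(model, target="env"):
    """Render a self-configuring container entrypoint script from an
    abstract configuration model (a hypothetical, simplified stand-in).
    'target' selects the runtime-specific configuration mechanism, so the
    microservice implementation stays free of such dependencies."""
    lines = ["#!/bin/sh"]
    for op in model["operations"]:
        if target == "env":
            # plain environment variable with a default fallback
            lines.append(f'export {op["key"]}="${{{op["key"]}:-{op["default"]}}}"')
        elif target == "consul":
            # resolve the value from a Consul KV store instead of the env
            lines.append(f'export {op["key"]}="$(consul kv get app/{op["key"]})"')
    lines.append('exec "$@"')  # hand over to the actual service process
    return "\n".join(lines)
```

The key design point mirrored here is that the same abstract model yields different generated artifacts per target environment, keeping the container image itself portable.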
In the machining of metallic workpieces, tool wear is decisive for the quality of the produced component surfaces and for the cost of the tools used. Tool wear and its causes are therefore a key design criterion for machining processes. Wear is influenced by a large number of different parameters. Besides the workpiece material, the cutting material and coating, and the tool geometry, the cutting speed, the feed rate, the radial and axial depths of cut, and the process media are decisive for tool life. Further effects arise from the properties of the workpiece, the tool, and the machine; these are responsible for wear caused by instabilities. In the following, the methodology for investigating the effect of differently additivated cutting fluids on wear is presented.
Economical operation of CHP plants is achievable when units with good electrical efficiency and low acquisition and maintenance costs are used and the largest possible share of the electricity generated in the CHP unit is consumed within the building. The buffer storage tank of a CHP plant should be dimensioned sufficiently large (flexibility, optimization of self-consumption, ...). A larger CHP unit is not automatically less economical because of its shorter operating time; on the contrary, it offers greater potential for demand-oriented electricity feed-in to the grid.
Education in acoustics
(2022)
The science of acoustics, with its diversity and interdisciplinarity, offers excellent opportunities in professional fields of activity and has captivated many of us. Education in acoustics means more than merely imparting knowledge and skills to students. In fact, the learning process is not complete after graduation; rather, as many acousticians would say, it only truly begins then. Designing an excellent education requires not only personal role models but also teaching formats, methods, and tools. The following six short contributions are examples of successful measures in acoustics education and are intended to encourage continuous improvement of teaching quality.
In the fight against the spread of the coronavirus, nationwide school closures have been a central political measure since the outbreak of the pandemic. As a result, considerable losses of learning time occurred among all pupils, from which children and adolescents from disadvantaged backgrounds in particular are still suffering. The effects of the corona pandemic on the German education system are discussed, and measures for their mitigation are set out.
Several studies have shown that perceiving one's own body in a virtual environment has a positive effect on the perception of the environment as a whole. In these studies, a person's body, or parts of it, was displayed as an animated avatar from a first-person perspective. In the context of the computer science conference Informatics Inside 2014 at Reutlingen University, this work investigates a different form of representation. In a prototypical augmented virtuality application, the virtual environment is to be extended with real content. A person should be enabled to perceive parts of their own body not as an avatar but as a realistic representation based on a camera image. The paper describes the objectives as well as the design and functionality of the prototypical application and its current state.
We report an investigation into the distribution of copper oxidation states in oxide films formed on the surfaces of technical copper. The oxide films were grown by thermal annealing at ambient conditions and studied using Auger depth profiling and UV–Vis spectroscopy. Both Auger and UV–Vis data were evaluated applying multivariate curve resolution (MCR). Both experimental techniques revealed that the growth of Cu2O dominates the initial ca. 40 nm of oxide films grown at 175 °C, while further oxide growth is dominated by CuO formation. The largely coincident results from both experimental approaches demonstrate the huge benefit of applying UV–Vis spectroscopy in combination with MCR analysis, which provides access to information on chemical state distributions without the need for destructive sample analysis. Both approaches are discussed in detail.
This collection of exercises contains problems covered in the Fluid Mechanics course (4 semester hours per week, including tutorials) in the third semester of the bachelor's degree program in mechanical engineering at the Faculty of Engineering of Reutlingen University. The bachelor's program in mechanical engineering does not provide for specialization, and teaching therefore covers content from all essential areas of mechanical engineering.