The aim of this paper is to show to what extent Artificial Intelligence can be used to optimize forecasting capability in procurement and to compare AI with traditional statistical methods. At the same time, this article presents the status quo of the research project ANIMATE. The project applies Artificial Intelligence to forecast customer orders in medium-sized companies.
Precise forecasts are essential for companies' planning, decision making and controlling. Forecasts are applied, for example, in the areas of supply chain, production and purchasing. Medium-sized companies face major challenges in finding suitable methods to improve their forecasting ability.
Companies often use proven methods of classical statistics, such as the ARIMA algorithm. However, simple statistical methods often fail when applied to complex, non-linear prediction tasks.
Initial results show that even a simple MLP ANN produces better results than traditional statistical methods. Furthermore, the company's own baseline (Implicit Sales Expectation) was used to compare performance. This comparison also shows that the proposed AI method is superior.
Before the developed method can become part of corporate practice, it must be further optimized. The model has difficulties with strong declines, for example due to holidays. The authors are confident that the model can be further improved, for example through more advanced methods such as a FilterNet, but also through more data, such as external data on holiday periods.
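The comparison described above can be sketched in miniature: a tiny one-hidden-layer MLP trained on lagged values of a synthetic demand series, measured against a last-value baseline (standing in for a company-side expectation). The series, architecture and hyperparameters below are invented for illustration and are not those of the ANIMATE project.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "customer order" series: two superimposed seasonal components.
t = np.arange(400)
y = np.sin(0.3 * t) + 0.5 * np.sin(0.05 * t)

# Lagged feature windows: predict y[i] from the 8 preceding values.
LAGS = 8
X = np.stack([y[i - LAGS:i] for i in range(LAGS, len(y))])
target = y[LAGS:]

split = 300
X_tr, X_te = X[:split], X[split:]
y_tr, y_te = target[:split], target[split:]

# One-hidden-layer MLP trained by full-batch gradient descent on MSE.
H, lr = 16, 0.05
W1 = rng.normal(0, 0.3, (LAGS, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.3, (H, 1));    b2 = np.zeros(1)
for _ in range(5000):
    h = np.tanh(X_tr @ W1 + b1)                 # hidden activations
    pred = (h @ W2 + b2).ravel()
    g_pred = (2 * (pred - y_tr) / len(y_tr))[:, None]  # dMSE/dpred
    gW2 = h.T @ g_pred;          gb2 = g_pred.sum(0)
    g_h = g_pred @ W2.T * (1 - h ** 2)          # backprop through tanh
    gW1 = X_tr.T @ g_h;          gb1 = g_h.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mlp_pred = (np.tanh(X_te @ W1 + b1) @ W2 + b2).ravel()
naive_pred = X_te[:, -1]                        # "last observed value" baseline

mae_mlp = np.abs(mlp_pred - y_te).mean()
mae_naive = np.abs(naive_pred - y_te).mean()
print(f"MLP MAE={mae_mlp:.3f}  naive MAE={mae_naive:.3f}")
```

On a smooth, deterministic series like this one the MLP comfortably beats the last-value baseline; real order data with holiday-driven drops is, as noted above, much harder.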
Ever since the 1980s, researchers in computer science and robotics have been working on making autonomous cars. Due to recent breakthroughs in research and development, such as the Bertha Benz Project [ZBS+14], the goal of fully autonomous vehicles seems closer than ever before. Yet a lot of questions remain unanswered. Especially now that the automotive industry is moving towards autonomous systems in series production vehicles, the task of precise localization has to be solved with automotive-grade sensors while keeping memory and processing consumption at a minimum. This thesis investigates the Simultaneous Localization and Mapping (SLAM) problem for autonomous driving scenarios on a parking lot using low-cost automotive sensors. The main focus is hereby devoted to the RAdio Detection And Ranging (RADAR) sensor, which has not been widely analyzed in an autonomous driving scenario so far, even though it is abundant in the automotive industry for applications such as Adaptive Cruise Control (ACC). Due to its high noise floor, the radar sensor has widely been disregarded in the Intelligent Transportation Systems and Robotics communities with regard to SLAM applications. However, in this thesis it is shown that the RADAR sensor proves to be an affordable, robust and precise sensor when its physical properties are modeled correctly. In this regard, a GraphSLAM-based framework is introduced, which extracts features from the RADAR sensor and generates an optimized map of the surroundings using the RADAR sensor alone. This framework is used to enable crowd-based localization, which is not limited to the RADAR sensor alone. By integrating an automotive Light Detection and Ranging (LiDAR) and a stereo camera sensor, a robust and precise localization system can be built that is suitable for autonomous driving even in complex parking lot scenarios. It is thereby shown that the RADAR sensor contributes strongly to obtaining good results in a sensor fusion setup.
These results were obtained on an extensive dataset on a parking lot, recorded over the course of several months. It contains different weather conditions, different configurations of parked cars and a multitude of different trajectories to validate the approaches described in this thesis and to support the conclusion that the RADAR sensor is a reliable sensor for series autonomous driving systems, both in a multi-sensor framework and as a single component for localization.
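The graph-based optimization at the heart of such a framework can be illustrated in one dimension: poses connected by noisy odometry edges plus a single loop-closure edge, solved as a linear least-squares problem. All measurements below are made up, and a real GraphSLAM system works on SE(2)/SE(3) poses with information matrices rather than this toy scalar case.

```python
import numpy as np

# Five 1D poses; each edge (i, j, z) says "pose j is z metres ahead of pose i".
edges = [(0, 1, 1.10), (1, 2, 0.90), (2, 3, 1.05), (3, 4, 0.95),  # odometry
         (0, 4, 3.80)]                                            # loop closure
n = 5
A = np.zeros((len(edges) + 1, n))
b = np.zeros(len(edges) + 1)
for k, (i, j, z) in enumerate(edges):
    A[k, i], A[k, j], b[k] = -1.0, 1.0, z   # residual: (x_j - x_i) - z
A[-1, 0] = 100.0                            # strong prior pinning x_0 near 0

x, *_ = np.linalg.lstsq(A, b, rcond=None)
# The 0.2 m conflict between the odometry chain (sums to 4.0) and the loop
# closure (3.8) is spread evenly over the four odometry edges.
print(np.round(x, 3))
```

With equal edge weights the optimum shortens each odometry edge by 0.04 m, so the last pose lands at 3.84 m.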
The digital age makes it possible to be globally networked at any time. Digital communication is therefore an important aspect of today's world, and its further development and expansion is becoming increasingly important. Even within a wireless system, copper channels are important parts of the overall network. Given the need to keep pushing current limitations, careful design of the cables in combination with an adapted coding of the bits is essential to transmit ever more data.
One of the most popular and widespread cabling technologies is symmetrical copper cabling [1, pp. 8-15]. It is also known as Twisted Pair and it is of immense importance for the cabling of communication networks.
At the time of writing this thesis, data rates of up to 10 GBit/s over a transmission distance of 100 m and 40 GBit/s over a transmission distance of 30 m are standardized for symmetrical copper cabling [2]. Other lengths are not standardized. Short lengths in particular are of great interest for copper cables, because copper cables are usually used for short distances, such as between computers and the campus network or within data centres.
This work has focused on the transmission of higher-order Pulse Amplitude Modulation and the associated transmission performance. The central research question is: "How well can we optimize the transmission technique in order to maximise the data bandwidth over Ethernet cable and, given that remote powering is also a significant application of these cables, how much will the resulting heating affect this transmission and what can be done to mitigate that?"
To answer this question, the cable parameters are first examined. A series of spectral measurements, such as Insertion Loss, Return Loss, Near End Crosstalk and Far End Crosstalk, provides information about the electromagnetic interference and the influence of the ohmic resistance on the signal. Based on these findings, the first theoretical statements and calculations can be made. In the next step, data transmissions over different transmission lengths are realized. The examination of the eye diagrams of the different transmission approaches ultimately provides information about the signal quality of the transmissions. An overview of the maximum transmission rate depending on the transmission distance shows the potential for different applications.
Furthermore, the simultaneous transmission of energy and data is a significant advantage of copper. However, the resulting heat generation has an influence on the data transmission. Therefore, the influence of the ambient temperature of the cables is investigated in the last part, and the resulting changes in signal quality are clarified.
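The higher-order modulation discussed above can be sketched with Gray-coded PAM-4, where each pair of bits is carried by one of four amplitude levels, doubling the bits per symbol over binary signalling. The mapping and levels below are a common textbook choice, not necessarily the scheme used in this work.

```python
# Gray-coded PAM-4: adjacent amplitude levels differ in exactly one bit,
# so a single-level decision error corrupts only one bit.
LEVELS = {(0, 0): -3, (0, 1): -1, (1, 1): +1, (1, 0): +3}
DEMAP = {v: k for k, v in LEVELS.items()}

def pam4_modulate(bits):
    assert len(bits) % 2 == 0
    return [LEVELS[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

def pam4_demodulate(symbols):
    out = []
    for s in symbols:
        # decision thresholds halfway between levels: -2, 0, +2
        level = -3 if s < -2 else -1 if s < 0 else +1 if s < 2 else +3
        out.extend(DEMAP[level])
    return out

bits = [0, 1, 1, 1, 1, 0, 0, 0]
syms = pam4_modulate(bits)
print(syms)                      # [-1, 1, 3, -3]
assert pam4_demodulate(syms) == bits
```

The eye diagrams examined in this work show exactly how much noise margin remains between those decision thresholds as cable length and temperature increase.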
Frost reduction in mechanical balanced ventilation by efficient means of preheating cold supply air
(2019)
This study has focused on evaluating the financial potential of wastewater and geothermal heat recovery systems in a multi-family building. The recovered heat was used to improve the performance of a mechanical ventilation with heat recovery (MVHR) system during the coldest days in central Sweden. The main issue targeted with these solutions was to reduce frost formation in the system and hence increase its thermal efficiency. The observed systems were evaluated economically by looking at the life cycle cost over a lifespan of 20 years. Furthermore, statistical analyses were carried out to counter the uncertainty that comes with the calculation. It was found that the studied wastewater systems have a high probability of generating savings in this period, while the one fed by geothermal energy is less likely to compensate for its high initial cost. All designed systems, however, managed to reduce operational cost by 35-45% due to lower energy usage.
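The life cycle cost comparison described above can be sketched as the initial investment plus a discounted sum of annual operating costs. All figures and the discount rate below are illustrative, not taken from the study.

```python
# Life cycle cost over a fixed horizon with a constant discount rate.
def life_cycle_cost(investment, annual_cost, years=20, discount_rate=0.03):
    return investment + sum(annual_cost / (1 + discount_rate) ** y
                            for y in range(1, years + 1))

# Hypothetical comparison: reference MVHR with no preheating versus a
# wastewater heat recovery variant with higher capex but ~40% lower opex.
base = life_cycle_cost(investment=0,      annual_cost=10_000)
ww   = life_cycle_cost(investment=25_000, annual_cost=6_000)
print(ww < base)   # True: the operating savings outweigh the investment
```

The study's Monte-Carlo-style statistical analyses effectively repeat this calculation over distributions of the inputs instead of single point values.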
The aim of this work was to investigate mean fill weight control of a continuous capsule-filling process and whether it is possible to derive controller settings from an accompanying process model. To that end, a system composed of a fully automated capsule filler and an online gravimetric scale was used to control the filled weight. This setup makes it possible to examine challenges associated with continuous manufacturing processes, such as variations in the amount of active pharmaceutical ingredient (API) in the mixture due to fluctuations of the feeders or due to altered excipient batch qualities. Two types of controllers were investigated: a feedback control and a combination of feedback and feedforward control. Although both of these are common in the industry, determining the optimal parameter settings remains an issue. In this study, we developed a method to derive the control parameters from process models in order to obtain optimal control for each filled product. Determined via rapid automated process development (RAPD), this method is an effective and fast way of determining control parameters. The method allowed us to optimize the weight control for three pharmaceutical excipients. By conducting experiments, we verified the feasibility of the proposed method and studied the dynamics of the controlled system. Our work provides important basic data on how capsule fillers can be integrated into continuous manufacturing systems.
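The benefit of combining feedback with feedforward control, as investigated above, can be shown with a toy discrete-time simulation: a PI loop on the measured fill weight, plus a feedforward term that cancels a measured disturbance before it reaches the output. The gains, first-order plant and disturbance below are invented for illustration and are not the study's process models.

```python
def run(kff, steps=300, setpoint=100.0):
    """Simulate fill weight under PI feedback plus optional feedforward."""
    w, integ = 0.0, 0.0
    hist = []
    for k in range(steps):
        d = 20.0 if k >= 150 else 0.0          # measured disturbance, e.g.
                                               # a shift in powder density
        e = setpoint - w
        integ += e
        u = 2.0 * e + 0.1 * integ - kff * d    # PI feedback + feedforward
        w = 0.8 * w + 0.2 * (u + d)            # first-order plant response
        hist.append(w)
    return hist

# Worst deviation from setpoint after the disturbance hits:
dev_ff = max(abs(w - 100.0) for w in run(kff=1.0)[150:])   # FB + FF
dev_fb = max(abs(w - 100.0) for w in run(kff=0.0)[150:])   # FB only
print(f"FB+FF: {dev_ff:.2f}   FB only: {dev_fb:.2f}")
```

With the feedforward term the disturbance is cancelled at the plant input, so the weight barely moves; pure feedback must first observe the error before correcting it.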
Most antimicrobial peptides (AMPs) and their synthetic mimics (SMAMPs) are thought to act by permeabilizing cell membranes. For antimicrobial therapy, selectivity for pathogens over mammalian cells is a key requirement. Understanding membrane selectivity is thus essential for designing AMPs and SMAMPs to complement classical antibiotics in the future. This study focuses on membrane permeabilization induced by SMAMPs and their selectivity for membranes with different lipid compositions. We measure release and fluorescence lifetime of a self-quenching dye in lipid vesicles. Apart from the dose-response, we quantify the strength of individual leakage events and, employing cumulative kinetics, categorize permeabilization behavior. We propose that differing selectivities in a series of SMAMPs arise from a combination of the effect of the antimicrobial agent and the susceptibility of the membrane (with a given lipid composition) to certain types of leakage behavior. The unselective and hemolytic SMAMP is found to act mainly by the asymmetry stress mechanism, mediated by hydrophobic insertion of SMAMPs into lipid layers. The more selective SMAMPs induced leakage events that occurred stochastically over several hours. Intrinsic lipid properties might additionally amplify the efficiency of leakage events. Leakage behavior changes with both the design of the SMAMP and the lipid composition of the membrane. Understanding how leakage behavior contributes to the selectivity and activity of antimicrobial agents will aid the design and screening of antimicrobials. An understanding of the underlying processes facilitates the comparison of membrane permeabilization across in vitro and in vivo assays.
Digitalization is changing manufacturing dramatically. With regard to employees' demands, global trends and the technological vision of future factories, automotive manufacturing faces a huge number of diverse challenges. Currently, research focuses on technological aspects of future factories in terms of digitalization. New ways of working and new organizational models for future factories have not been described yet. There are assumptions on how to develop the organization of work in a future factory, but up to now the literature shows deficits in scientifically substantiated answers in this research area. Consequently, the objective of this paper is to present an approach to work organization design for automotive Industry 4.0 manufacturing. Future requirements were analyzed and condensed into criteria that determine future agile organization design. These criteria were then transformed into functional mechanisms, which define the approach for shopfloor organization design.
In the spring of 1817, Friedrich List, then a professor at the University of Tübingen, travelled to Frankfurt am Main, where the famous Easter Fair was taking place at the time. There he met the leaders of the merchants, who complained that the tentative economic recovery was suffering badly under the many customs barriers and the cheap imports from England. They therefore demanded the abolition of internal tariffs and the formation of an economic union. On behalf of the merchants, List wrote his now-famous petition to the Federal Assembly, the loose representative body of the German Confederation in Frankfurt. When the petition was received with great acclaim, List, elated by his success, spontaneously founded the "Allgemeiner Deutscher Handels- und Gewerbsverein", the first interest group of German merchants. He thereby laid the foundation for the political process that led to the founding of the Zollverein of 1834, which in turn was the precursor to the founding of the German Reich in 1871. List's demands of that time are highly topical once again.
Exogenous factors of influence on exhaled breath analysis by ion-mobility spectrometry (MCC/IMS)
(2019)
The interpretation of exhaled breath analysis needs to address the influence of exogenous factors, especially the transfer of confounding analytes by the test persons. A test person who had been exposed to a disinfectant underwent exhaled breath analysis by MCC/IMS (Bioscout®) after different time intervals. Additionally, a new sampling method with inhalation of synthetic air before breath analysis was tested. After exposure to the disinfectant, 3-Pentanone monomer, 3-Pentanone dimer, Hexanal, 3-Pentanone trimer, 2-Propanamine, 1-Propanol, Benzene and Nonanal showed significantly higher intensities in exhaled breath and in the air of the examination room, compared to the corresponding baseline measurements. Only one ingredient of the disinfectant (1-Propanol) was identical to one of the 8 analytes. Prolonging the time intervals between exposure and breath analysis showed a decrease of their intensities; however, the half-time of the decrease differed between analytes. The inhalation of synthetic air - more than consistently airing the examination room with fresh air - reduced the exogenous and also relevant endogenous analytes, leading to a reduction and even a change of polarity of the alveolar gradient. The interpretation of exhaled breath requires further knowledge about the previous whereabouts of the test person and the likelihood and relevance of the inhalation of local, site-specific and confounding exogenous analytes. Their inhalation facilitates a transfer to the examination room and the detection of high concentrations in room air and exhaled breath, but also the exhalation of new analytes. This may lead to a misinterpretation of these analytes as endogenous or disease-specific ones.
Standardisation of breath sampling is important for the application of breath analysis in clinical settings. By studying the effect of room airing on indoor and breath analytes and by generating time series of room air with different sampling intervals, we sought to gain further insights into room air metabolism, to detect the relevance of exogenous VOCs and to draw conclusions about their consideration in the interpretation of exhaled breath. Room air and exhaled breath of a healthy subject were analysed before and after room airing. Furthermore, a time series of room air with doors and windows closed was taken over 84 h by automatic sampling every 180 min. A second time series studied room air analytes over 70 h with samples taken every 16.5 min. For breath and room air measurements, an IMS coupled to a multi-capillary column (IMS/MCC) [Bio-Scout® - B&S Analytik GmbH, Dortmund, Germany] was used. The peaks were characterized using the software Visual Now (B&S Analytik, Dortmund, Germany) and identified using the software package MIMA (version 1.1, provided by the Max Planck Institute for Informatics, Saarbrücken, Germany) and the database 20160426_SubstanzDbNIST_122 (B&S Analytik GmbH, Dortmund, Germany). In the morning, 4 analytes (Decamethylcyclopentasiloxane [541-02-6]; Pentan-2-one [107-87-9] dimer; Hexan-1-al [66-25-1]; Pentan-2-one [107-87-9] monomer) showed high intensities in the room air and exhaled breath. They were significantly but not equally reduced by room airing. The time series over 84 h showed a time-dependent decrease of some analytes (limonene monomer and dimer; Decamethylcyclopentasiloxane; Butan-1-ol) as well as an increase of others (Pentan-2-one [107-87-9] dimer). Shorter sampling intervals exhibited circadian variations of analyte concentrations for many analytes. Breath sampling in the morning requires room airing beforehand; then the variation of the intensity of indoor analytes can be kept small.
The time series of indoor analytes show that their intensities behave differently, with time-dependent declines, constant increases and circadian variations, depending on room airing. This has implications for the breath sampling procedure and the interpretation of exhaled breath.
We propose a method for recognizing dynamic gestures using a 3D sensor. New aspects of the developed system include problem-adapted data conversion and compression as well as automatic detection of different variants of the same gesture via clustering with a suitable metric inspired by the Jaccard metric. The combination of Hidden Markov Models and clustering leads to robust detection of different executions based on a small set of training data. We achieved a 5% increase in recognition rate compared to regular Hidden Markov Models. The system has been used for human-machine interaction and might serve as an assistive system in physiotherapy and in neurological or orthopedic diagnosis.
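The Jaccard-inspired clustering of gesture variants can be sketched as follows; the set encoding of an execution and the greedy threshold clustering are illustrative stand-ins, not the paper's actual metric or algorithm.

```python
# Jaccard distance between two gesture executions, each reduced here
# (hypothetically) to a set of discretized pose symbols.
def jaccard_distance(a, b):
    a, b = set(a), set(b)
    if not a and not b:
        return 0.0
    return 1.0 - len(a & b) / len(a | b)

# Greedy threshold clustering: an execution joins the first cluster whose
# representative (first member) is close enough, else it starts a new cluster.
def cluster(executions, threshold=0.5):
    clusters = []
    for ex in executions:
        for c in clusters:
            if jaccard_distance(ex, c[0]) <= threshold:
                c.append(ex)
                break
        else:
            clusters.append([ex])
    return clusters

gestures = [{1, 2, 3}, {1, 2, 4}, {7, 8, 9}, {7, 9}]
print(len(cluster(gestures)))  # 2 variants detected
```

Each discovered cluster can then be given its own Hidden Markov Model, which is the combination the abstract credits for the recognition gain.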
Der Halo-Effekt im Fußball
(2019)
Der Halo-Effekt ist eine aus der Sozialpsychologie bekannte kognitive Verzerrung. Ein Halo-Effekt tritt dann auf, wenn ein globaler Eindruck oder eine Information über ein hervorstechendes Merkmal die Beurteilung anderer Eigenschaften prägt. Im vorliegenden Beitrag wird der Frage nachgegangen: Gibt es einen Halo-Effekt im Fußball? Überstrahlt der sportliche Erfolg bzw. Misserfolg die Wahrnehmung der Fans womöglich sogar hinsichtlich nicht-sportlicher Aspekte? Der Beitrag gibt den aktuellen Stand zur Halo-Forschung wider und präsentiert die Ergebnisse einer empirischen Untersuchung, in deren Rahmen Fans von Vereinen aus der deutschen Fußball-Bundesliga befragt werden.
Additive manufacturing (AM) is a promising manufacturing method for many industrial sectors. For this application, industrial requirements such as high production volumes and coordinated implementation must be taken into account. These tasks of internally managing production facilities are carried out by the Production Planning and Control (PPC) information system. A key factor in planning and scheduling is the exact calculation of manufacturing times. For this purpose, we investigate the use of Machine Learning (ML) for the prediction of manufacturing times of AM facilities.
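As a minimal illustration of ML-based manufacturing time prediction, a least-squares regression on synthetic build-job features can serve as a baseline; the features, coefficients and data below are invented for illustration and do not come from this project.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Hypothetical AM job features: layer count, part volume, number of parts.
layers = rng.uniform(100, 2000, n)
volume = rng.uniform(5, 500, n)
parts = rng.integers(1, 20, n)

# Synthetic ground truth: per-layer recoating time + per-volume exposure
# time + per-part setup time, plus measurement noise.
time_h = 0.01 * layers + 0.02 * volume + 0.1 * parts + rng.normal(0, 0.5, n)

# Ordinary least squares with an intercept column.
X = np.column_stack([layers, volume, parts, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, time_h, rcond=None)
pred = X @ coef
mae = np.abs(pred - time_h).mean()
print(f"MAE = {mae:.2f} h")
```

A PPC system would use such predictions for scheduling; in practice, nonlinear models and richer geometry features replace this linear baseline.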
The strategy professor Michael Porter was one of the first scholars to combine the insights of industrial economics, originating in the field of economics, with concepts of corporate strategy in order to gain a more precise understanding of the influence of industry competition on corporate success and of competitive decisions. At the core of all his work is the idea of generating value through the choice of a suitable strategy and thereby achieving high competitiveness and profitability. Porter's topics cover a wide range of subjects, from digitalization and competition in politics to societal progress. Typical of Porter is a holistic research approach that devotes itself to the comprehensive consideration of complex systems rather than to individual components of a system. From this, drawing on numerous case studies and practical examples, he derives models that serve managers as rules of the game for competition. With his works he not only provides impulses for academia; above all, he seeks to exert a lasting influence on the thinking and actions of business practice, politics and society. His models, such as the five forces model, are known, with regard to the development of organizations, for enabling managers to better understand the market-side influences on their company's competitive situation and to make their strategic decisions accordingly.
This study describes a non-contact measuring and system identification procedure for evaluating inhomogeneous stiffness and damping characteristics of the annular ligament in the physiological amplitude and frequency range without the application of large static external forces that can cause unnatural displacements of the stapes. To verify the procedure, measurements were first conducted on a steel beam. Then, measurements on an individual human cadaveric temporal bone sample were performed. The estimated results support the inhomogeneous stiffness and damping distribution of the annular ligament and are in good agreement with the multiphoton microscopy results, which show that the posterior-inferior corner of the stapes footplate is the stiffest region of the annular ligament.
This report summarizes the main work carried out and the results achieved in the joint project "GalvanoFlex_BW" in the calendar year 2018. First of all, the acquisition and evaluation of measurement data has been completed. Several measurement campaigns were carried out at NovoPlan. At C&C Bark, existing data could partly be reused and was supplemented by further measurements where needed. At Hartchrom, no measurements could be carried out due to staff shortages. The recorded data was transferred into an efficiency assessment, from which general conclusions are to be derived. To this end, a simulation program has been set up that is able to model and optimize process chains energetically. In addition, improved heat demand profiles for the companies are to be developed from the measurement data and then made available to the CHP optimization. In the course of developing and evaluating electricity-optimized CHP strategies, an existing simulation model has been extended accordingly. Specifically, the model was supplemented with an improved load forecast for electricity and heat in industrial companies, and the optimization procedure was extended by a second dimension. While previously only the optimization of self-supplied electricity was possible, with a limit on CHP unit starts as a constraint, the capping of the electrical peak load is now additionally integrated into the objective function. Especially for industrial companies, this allows a further, in some cases considerable, reduction in energy costs, which is confirmed by the first calculations for the three companies represented in the real-world laboratory. The results are discussed under WP 8 (implementation). The dialogue with further companies and institutions outside the project was continued via the industry platform.
In 2018, two events of this kind were held, and a further workshop on this topic will take place in spring 2019. The accompanying social-science research was also continued as planned with the second phase of the company surveys. With regard to the implementation of a CHP concept, two important points have emerged: First, the implementing company must possess a certain "energy-efficiency maturity", reflected among other things in its experience with carrying out energy-efficiency measures, since the installation of a CHP unit is an extremely complex measure. Second, other company-specific contextual factors must be present, such as construction work that has to be carried out for other reasons, so that certain windows of opportunity arise in which the implementation of CHP measures makes sense.
IC layout automation with self-organized wiring and arrangement of responsive modules (SWARM)
(2019)
Focused on automating analog IC layout, the multi-agent system Self-organized Wiring and Arrangement of Responsive Modules (SWARM) combines the powers of procedural generators and algorithmic optimization into a novel bottom-up meets top-down flow of supervised layout module interaction. By provoking self-organization via the effect of emergence, SWARM is shown in examples to find even optimal placement solutions and to produce constraint-compliant layout blocks that fit into a specified zone.
As a consequence of the ongoing digitalization of the manufacturing industry, applications and services are being developed with potentially positive effects on factors such as effectiveness and quality of work. Gamification can be a suitable approach for strengthening motivating aspects in the work context. This paper presents the initial design and evaluation of a gamification approach for users of an AI service for machine optimization and extracts possible requirements for a concept to increase motivation.
This paper presents a temporal prediction of earthquakes. For this purpose, Convolutional Neural Networks (CNN) are trained on a dataset of laboratory earthquakes. The trained networks make predictions by classifying an input of seismic data; through this classification, the CNN can predict the time remaining until the next earthquake. Two approaches are compared. In the first approach, the raw data is fed into a CNN. In the second approach, the data is preprocessed with Mel Frequency Cepstral Coefficients (MFCC) before the CNN. It turns out that good classification is possible with both approaches. The combination of MFCC and CNN delivers the better quantitative results, achieving an accuracy of 65%.
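The MFCC preprocessing used in the second approach can be sketched from scratch: frame the signal, take the power spectrum, apply a triangular mel filterbank, and decorrelate the log energies with a DCT. The sampling rate, frame sizes and filter counts below are illustrative and not the parameters used in this work.

```python
import numpy as np

def mfcc(signal, sr=4000, n_fft=256, hop=128, n_mels=20, n_coeff=13):
    """Simplified MFCC: frames -> power spectrum -> mel filterbank -> log -> DCT."""
    # Overlapping Hann-windowed frames.
    frames = np.array([signal[i:i + n_fft]
                       for i in range(0, len(signal) - n_fft + 1, hop)])
    frames = frames * np.hanning(n_fft)
    power = np.abs(np.fft.rfft(frames, axis=1)) ** 2

    # Triangular filters spaced evenly on the mel scale.
    def hz_to_mel(f): return 2595 * np.log10(1 + f / 700)
    def mel_to_hz(m): return 700 * (10 ** (m / 2595) - 1)
    mel_pts = mel_to_hz(np.linspace(0, hz_to_mel(sr / 2), n_mels + 2))
    bins = np.floor((n_fft + 1) * mel_pts / sr).astype(int)
    fb = np.zeros((n_mels, n_fft // 2 + 1))
    for m in range(1, n_mels + 1):
        l, c, r = bins[m - 1], bins[m], bins[m + 1]
        fb[m - 1, l:c] = (np.arange(l, c) - l) / max(c - l, 1)
        fb[m - 1, c:r] = (r - np.arange(c, r)) / max(r - c, 1)

    logmel = np.log(power @ fb.T + 1e-10)
    # DCT-II basis keeps the first n_coeff decorrelated coefficients.
    basis = np.cos(np.pi / n_mels * (np.arange(n_mels)[:, None] + 0.5)
                   * np.arange(n_coeff)[None, :])
    return logmel @ basis   # shape: (n_frames, n_coeff)

sig = np.sin(2 * np.pi * 50 * np.arange(4000) / 4000)
feats = mfcc(sig)
print(feats.shape)
```

The resulting (frames x coefficients) matrix is a compact 2D representation that a CNN can classify far more easily than the raw seismic waveform.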