Refine
Year of publication
- 2019 (131)
Document Type
- Conference proceeding (131)
Is part of the Bibliography
- yes (131)
Institute
- Informatik (86)
- Technik (27)
- ESB Business School (15)
- Texoversum (3)
Publisher
- IEEE (32)
- Springer (21)
- Hochschule Reutlingen (17)
- Deutsche Gesellschaft für Computer- und Roboterassistierte Chirurgie e.V. (7)
- Stellenbosch University (7)
- Fac. of Organization & Informatics, Univ. of Zagreb (4)
- SCITEPRESS (4)
- VDE Verlag GmbH (4)
- ACM (3)
- Association for Information Systems (AIS) (3)
The aim of this paper is to show to what extent Artificial Intelligence can be used to optimize forecasting capability in procurement and to compare AI with traditional statistical methods. At the same time, the article presents the status quo of the research project ANIMATE, which applies Artificial Intelligence to forecast customer orders in medium-sized companies.
Precise forecasts are essential for companies' planning, decision-making and controlling. Forecasts are applied, for example, in supply chain management, production and purchasing. Medium-sized companies face major challenges in selecting suitable methods to improve their forecasting capability.
Companies often rely on proven methods from classical statistics, such as the ARIMA algorithm. However, simple statistical models often fail when applied to complex non-linear prediction tasks.
Initial results show that even a simple multilayer perceptron (MLP) artificial neural network produces better results than traditional statistical methods. Furthermore, a company baseline (the Implicit Sales Expectation) was used to compare performance. This comparison also shows that the proposed AI method is superior.
Before the developed method can become part of corporate practice, it must be further optimized. The model has difficulties with strong declines in demand, for example those caused by holidays. The authors are confident that the model can be further improved, for example through more advanced methods such as a FilterNet, but also through additional data, such as external data on holiday periods.
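A central preprocessing step when applying an MLP to order forecasting is turning the order history into supervised learning pairs. The following is an illustrative sketch (not the ANIMATE project code, and the order figures are hypothetical): each training example consists of a window of past values and the next value to forecast.

```python
# Illustrative sketch: converting an order time series into
# (lag-features, target) pairs, the typical supervised framing
# before training an MLP forecaster.

def make_windows(series, n_lags):
    """Return (X, y): each row of X holds n_lags past values,
    the matching y entry is the value to forecast."""
    X, y = [], []
    for i in range(n_lags, len(series)):
        X.append(series[i - n_lags:i])
        y.append(series[i])
    return X, y

orders = [120, 135, 128, 150, 160, 155, 170]  # hypothetical monthly orders
X, y = make_windows(orders, n_lags=3)
print(X[0], y[0])  # [120, 135, 128] 150
```

The resulting pairs can be fed to any regressor; an ARIMA baseline would instead be fitted directly on the raw series, which is what makes the two approaches directly comparable on the same data.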
We propose a method for recognizing dynamic gestures using a 3D sensor. New aspects of the developed system include problem-adapted data conversion and compression as well as automatic detection of different variants of the same gesture via clustering with a suitable metric inspired by the Jaccard metric. The combination of Hidden Markov Models and clustering leads to robust detection of different executions based on a small set of training data. We achieved an increase of 5% in recognition rate compared to regular Hidden Markov Models. The system has been used for human-machine interaction and might serve as an assistive system in physiotherapy and neurological or orthopedic diagnosis.
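The clustering step relies on a set-based distance. As a hedged illustration, here is the textbook Jaccard distance between two gesture observations represented as sets of discretized sensor symbols; the paper's actual metric is only described as "inspired by" Jaccard, so details will differ.

```python
# Sketch of a Jaccard distance between two gesture observations,
# each given as a set of quantized sensor symbols (hypothetical encoding).

def jaccard_distance(a, b):
    """1 - |A ∩ B| / |A ∪ B|; 0.0 means identical symbol sets."""
    a, b = set(a), set(b)
    if not a and not b:
        return 0.0
    return 1.0 - len(a & b) / len(a | b)

g1 = {"up", "right", "hold"}
g2 = {"up", "right", "down"}
print(jaccard_distance(g1, g2))  # 0.5
```

A distance like this lets a clustering algorithm group different executions of the same gesture, so that each cluster can be modelled by its own Hidden Markov Model.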
Urban platforms are essential for smart and sustainable city planning and operation. Today they are mostly designed to handle and connect large urban data sets from very different domains. Modelling and optimisation functionalities are usually not part of the cities software infrastructure. However, they are considered crucial for transformation scenario development and optimised smart city operation. The work discusses software architecture concepts for such urban platforms and presents case study results on the building sector modelling, including urban data analysis and visualisation. Results from a case study in New York are presented to demonstrate the implementation status.
Additive manufacturing (AM) is a promising manufacturing method for many industrial sectors. For this application, industrial requirements such as high production volumes and coordinated implementation must be taken into account. The internal management of production facilities is handled by the Production Planning and Control (PPC) information system. A key factor in planning and scheduling is the exact calculation of manufacturing times. For this purpose, we investigate the use of Machine Learning (ML) to predict the manufacturing times of AM facilities.
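The simplest form of such a prediction is a regression from part properties to build time. A minimal sketch under assumed conditions (the volumes and times below are hypothetical; a real PPC integration would use richer features such as layer count, orientation and nesting):

```python
# Sketch: fitting a linear build-time model via ordinary least squares,
# predicting AM manufacturing time from part volume (hypothetical data).

def fit_line(xs, ys):
    """Return (slope, intercept) of the least-squares line through the data."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

volumes = [10, 20, 30, 40]      # part volume in cm^3 (hypothetical)
hours   = [2.0, 3.0, 4.0, 5.0]  # measured build times in hours
slope, intercept = fit_line(volumes, hours)
print(slope, intercept)  # ~0.1 hours per cm^3 plus ~1.0 h setup offset
```

More capable ML models (gradient boosting, neural networks) follow the same pattern: features of the build job in, predicted manufacturing time out.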
This study describes a non-contact measurement and system identification procedure for evaluating the inhomogeneous stiffness and damping characteristics of the annular ligament in the physiological amplitude and frequency range, without applying large static external forces that can cause unnatural displacements of the stapes. To verify the procedure, measurements were first conducted on a steel beam. Then, measurements were performed on an individual human cadaveric temporal bone sample. The estimated results support an inhomogeneous stiffness and damping distribution of the annular ligament and are in good agreement with the multiphoton microscopy results, which show that the posterior-inferior corner of the stapes footplate is the stiffest region of the annular ligament.
This report summarizes the main work and results carried out and achieved in the joint project "GalvanoFlex_BW" during the 2018 calendar year. First of all, the acquisition and evaluation of measurement data has been completed. Several measurement campaigns were conducted at NovoPlan. At C&C Bark, existing data could partly be reused and was supplemented by further measurements where needed. At Hartchrom, no measurements could be performed due to staff shortages. The recorded data was transferred into an efficiency assessment, from which general conclusions are to be derived. For this purpose, a simulation program has been set up that can model and optimize process chains in energy terms. In addition, improved heat-demand profiles for the companies are to be developed from the measurement data and then made available for the optimization of combined heat and power (CHP) operation. In the course of developing and evaluating electricity-optimized CHP strategies, an existing simulation model has been extended accordingly. Specifically, the model was supplemented with an improved load forecast for electricity and heat in industrial companies, and the optimization procedure was extended by a second dimension. Whereas previously only the self-consumption of generated electricity could be optimized, with a limit on CHP unit starts as a constraint, the capping of the electrical peak load is now additionally integrated into the objective function. Especially for industrial companies, this allows a further, in some cases considerable, reduction of energy costs, which is confirmed by initial calculations for the three companies represented in the real-world laboratory. The results are discussed under work package 8 (implementation). The dialogue with further companies and institutions outside the project was continued via the sector platform.
In 2018, two events of this kind were held, and a further workshop on this topic will take place in spring 2019. The accompanying social-science research was also continued as planned with the second phase of the company surveys. With regard to implementing a CHP concept, two important points emerged: First, the implementing company must possess a certain "energy-efficiency maturity", reflected among other things in experience with carrying out energy-efficiency measures, since installing a CHP unit is a highly complex undertaking. Second, further company-specific context factors must be present, such as construction work required for other reasons, so that windows of opportunity arise in which implementing CHP measures makes sense.
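The two-dimensional objective described in the report — maximizing self-consumption while capping the electrical peak — can be sketched as two metrics evaluated over a load profile. A minimal illustration with hypothetical quarter-hour values and a CHP unit at constant output:

```python
# Sketch (hypothetical data): evaluating a CHP schedule against the two
# objectives named above: (1) share of electricity demand covered by own
# generation, (2) the resulting peak load drawn from the grid.

def evaluate_schedule(load, chp_output):
    """Return (self_consumption_share, grid_peak) for a load profile in kW
    and a CHP output profile of the same length."""
    covered = sum(min(l, c) for l, c in zip(load, chp_output))
    self_consumption = covered / sum(load)
    grid_peak = max(l - c for l, c in zip(load, chp_output))
    return self_consumption, grid_peak

load = [80, 120, 200, 150]  # kW per interval (hypothetical)
chp  = [50, 50, 50, 50]     # CHP running at a constant 50 kW
share, peak = evaluate_schedule(load, chp)
print(round(share, 3), peak)  # 0.364 150
```

An optimizer would shift CHP output toward the high-load intervals, trading off the two metrics in a combined objective function as described above.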
IC layout automation with self-organized wiring and arrangement of responsive modules (SWARM)
(2019)
Focused on automating analog IC layout, the multi-agent system Self-organized Wiring and Arrangement of Responsive Modules (SWARM) combines the powers of procedural generators and algorithmic optimization into a novel bottom-up-meets-top-down flow of supervised layout module interaction. Provoking self-organization via the effect of emergence, examples show SWARM finding even optimal placement solutions and producing constraint-compliant layout blocks which fit into a specified zone.
As a consequence of the ongoing digitalization of the manufacturing industry, applications and services are being developed with potentially positive effects on factors such as effectiveness and quality of work. Gamification can be a suitable approach for strengthening motivating aspects in the work context. This paper presents the initial design and evaluation of a gamification approach for users of an AI service for machine optimization, and derives possible requirements for a concept to increase motivation.
This paper presents a temporal prediction of earthquakes. For this purpose, Convolutional Neural Networks (CNNs) are trained on a dataset of laboratory earthquakes. The trained networks make predictions by classifying an input of seismic data; through this classification, the CNN can predict the time remaining until the next earthquake. Two approaches are compared. In the first approach, the raw data is fed into a CNN. In the second approach, the data is preprocessed with Mel Frequency Cepstral Coefficients (MFCC) before the CNN. Both approaches allow good classification; the combination of MFCC and CNN yields the better quantitative results, achieving an accuracy of 65%.
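Framing "time to the next quake" as classification requires discretizing the continuous time-to-failure into classes. An illustrative sketch (not the thesis code; bin width and class count are hypothetical):

```python
# Sketch: binning a continuous time-to-failure value (seconds) into
# discrete classes, as the CNN classification framing above requires.

def to_class(time_to_failure, bin_width=2.0, n_classes=5):
    """Map a remaining time in seconds to a class index 0..n_classes-1;
    all times beyond the last bin fall into the final class."""
    return min(int(time_to_failure // bin_width), n_classes - 1)

print([to_class(t) for t in [0.5, 3.1, 7.9, 12.0]])  # [0, 1, 3, 4]
```

The CNN then predicts one of these class indices per seismic input window, and the predicted class maps back to a time interval.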
Semi-automated image data labelling using AprilTags as a pre-processing step for machine learning
(2019)
Data labelling is a pre-processing step to prepare data for machine learning. There are many ways to collect and prepare this data, but these usually involve considerable effort. This paper presents an approach to semi-automated image data labelling using AprilTags. The AprilTags attached to the object, each containing a unique ID, make it possible to link the object's surfaces to a particular class. This approach is implemented and used to label data of a stackable box.
The data is evaluated by training a You Only Look Once (YOLO) net, with a subsequent evaluation of the detection results. These results show that the semi-automatically collected and labelled data can certainly be used for machine learning. However, if distinctive features of an object surface are covered by the AprilTag, there is a risk that the corresponding class will not be recognized. It can be assumed that the labelled data can be used not only for YOLO but also for other machine learning approaches.
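The core of such a pipeline is turning a detected tag into a training label. As a hedged sketch (the tag-to-class mapping and image coordinates are hypothetical, and the tag detection itself, which would come from an AprilTag library, is not shown), here is the conversion of a pixel bounding box into the YOLO label format:

```python
# Sketch: converting a tag detection's pixel bounding box into a YOLO
# label line: "class x_center y_center width height", all normalized
# to the image size.

TAG_TO_CLASS = {42: 0}  # hypothetical mapping: tag ID 42 -> class "box"

def yolo_label(tag_id, x1, y1, x2, y2, img_w, img_h):
    cls = TAG_TO_CLASS[tag_id]
    xc = (x1 + x2) / 2 / img_w
    yc = (y1 + y2) / 2 / img_h
    w = (x2 - x1) / img_w
    h = (y2 - y1) / img_h
    return f"{cls} {xc:.4f} {yc:.4f} {w:.4f} {h:.4f}"

print(yolo_label(42, 100, 200, 300, 400, 640, 480))
# 0 0.3125 0.6250 0.3125 0.4167
```

One such line per object per image, written to a text file alongside the image, is exactly what YOLO-style trainers consume.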
The student conference Informatics Inside is now taking place for the eleventh time. As part of the master's programme Human-Centered Computing, master's students independently organize a full-fledged scientific conference. Computer science is still subject to constant change. Our students contribute to this change by solving current problems with innovative concepts in their scientific specialization. By now, however, computer science is not always immediately visible; we notice it whenever something does not work as intended. This year's motto of Informatics Inside is experience(IT);, disguised as a function call.
In this paper, we address the novel EDP (Expert Design Plan) principle for procedural design automation of analog integrated circuits, which captures the knowledge-based design strategy of human circuit designers in an executable script, making it reusable. We present the EDP Player, which enables the creation and execution of EDPs for arbitrary circuits in the Cadence® Virtuoso® Design Environment. The tool provides a generic version of an instruction set, called EDPL (EDPLanguage), enabling emulation of a typical manual analog sizing flow. To automate the design of a Miller Operational Amplifier and to create variants of a Smart Power IC, several EDPs were implemented using this tool. Employing these EDPs leads to a strong reduction of design time without compromising design quality or reliability.
Serverless computing is an emerging cloud computing paradigm with the goal of freeing developers from resource management issues. As of today, serverless computing platforms are mainly used to process computations triggered by events or user requests that can be executed independently of each other. These workloads benefit from on-demand and elastic compute resources as well as per-function billing. However, it is still an open research question to what extent parallel applications, which most often comprise complex coordination and communication patterns, can benefit from serverless computing.
In this paper, we introduce serverless skeletons for parallel cloud programming to free developers from both parallelism and resource management issues. In particular, we investigate the well-known and widely used farm skeleton, which supports the implementation of a wide range of applications. To evaluate our concepts, we present a prototypical development and runtime framework and implement two applications based on it: numerical integration and hyperparameter optimization, a commonly applied technique in machine learning. We report on performance measurements for both applications and discuss the usefulness of our approach.
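The farm skeleton's structure is simple: one function applied independently to many tasks. The following local sketch illustrates the pattern, with plain threads standing in for serverless function instances (this is an analogy, not the paper's framework):

```python
# Sketch of a farm skeleton: a pool of workers applies the same function
# independently to each task, as a serverless farm would dispatch one
# function invocation per task. Threads stand in for function instances.

from concurrent.futures import ThreadPoolExecutor

def farm(worker, tasks, n_workers=4):
    """Apply worker to each task in parallel; return results in task order."""
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return list(pool.map(worker, tasks))

# Example task: numerical integration of x^2 over [0, 1], split into
# independent sub-intervals evaluated with the midpoint rule.
def integrate_chunk(bounds, steps=1000):
    a, b = bounds
    h = (b - a) / steps
    return sum((a + (i + 0.5) * h) ** 2 for i in range(steps)) * h

chunks = [(i / 4, (i + 1) / 4) for i in range(4)]
print(round(sum(farm(integrate_chunk, chunks)), 4))  # 0.3333
```

In the serverless setting, each chunk becomes a separate function invocation billed individually, which is what makes per-function billing attractive for this class of workload.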
The Virtual Power Plant Neckar-Alb is a demonstration platform for the operation, optimization and control of distributed energy resources, which are able to produce, store or consume electric energy. A heterogeneous set of distributed energy devices has been installed at the campus of Reutlingen University by the Reutlingen Energy Centre (REZ) of the School of Engineering. The distributed energy devices have been combined into local microgrids and connected to an operative central power plant with additional participants. The demonstration platform serves students, researchers and industry experts for education and for investigating new technologies, devices and software.
A digital twin, a replica of the energy devices, was established in the computing environment of MATLAB and Simulink. It continuously simulates their operation and is time-synchronized and connected to the central energy management and control system of a virtual power plant. The model can be used as a platform for testing device performance under various conditions, working schedules and new optimization options.
Companies are continuously changing their strategy, processes, and information systems to benefit from the digital transformation. Controlling the digital architecture and its governance is the fundamental goal. Enterprise Governance, Risk and Compliance (GRC) systems are vital for managing digital risks threatening modern enterprises from many different angles. The most significant constituent of GRC systems is the definition of controls, which are implemented on different layers of a digital Enterprise Architecture (EA). As part of the compliance aspect of GRC, the effectiveness of these controls is assessed and reported to relevant management bodies within the enterprise. In this paper, we present a metamodel which links controls to the affected elements of a digital EA and supplies a way of expressing associated assessment techniques and results. We complement the metamodel with an expository instantiation of a control compliance cockpit in an international insurance enterprise.
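The essential relationships such a metamodel expresses can be sketched as plain data structures. A minimal illustration with hypothetical names (this is not the paper's metamodel, only the control-to-element-to-assessment linkage it describes):

```python
# Sketch: linking a GRC control to the EA elements it affects and
# recording assessment results per control (hypothetical names).

from dataclasses import dataclass, field

@dataclass
class EAElement:
    name: str
    layer: str  # e.g. "business", "application", "technology"

@dataclass
class Control:
    control_id: str
    elements: list = field(default_factory=list)     # affected EA elements
    assessments: list = field(default_factory=list)  # (technique, effective?)

    def is_effective(self):
        """A control counts as effective only if every assessment passed."""
        return all(ok for _, ok in self.assessments)

crm = EAElement("CRM system", "application")
c1 = Control("AC-01", elements=[crm])
c1.assessments.append(("access review", True))
print(c1.is_effective())  # True
```

A compliance cockpit would aggregate `is_effective` across all controls per EA layer and report the result to the responsible management body.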
Business process models provide a considerable number of benefits for enterprises and organizations, but the creation of such models is costly and time-consuming, which slows down the organizational adoption of business process modeling. Social paradigms pave new ways for business process modeling by integrating stakeholders and leveraging knowledge sources. However, empirical research about the impact of social paradigms on the costs of business process modeling is sparse. A better understanding of their impact could help to reduce the cost of business process modeling and improve decision-making on BPM activities. The paper contributes to this field by reporting on an empirical investigation, via survey research, of the perceived influence of different cost factors among experts. Our results indicate that different cost components, as well as the use of social paradigms, influence costs.
Due to the consequential impact of technological breakdowns, companies have to be prepared to deal with breakdowns or, even better, prevent them. In today's information technology, several methods and tools exist to mitigate this risk. This paper therefore deals with the initial determination of a resilient enterprise architecture supporting predictive maintenance in the information technology domain, and furthermore considers several mechanisms for reactively and proactively securing the state of resiliency on several abstraction levels. The objective of this paper is to give an overview of existing mechanisms for resiliency and to describe the foundation of an optimized approach combining infrastructure and process mining techniques.
This book contains the proceedings of the KES International conferences on Innovation in Medicine and Healthcare (KES-InMed-19) and Intelligent Interactive Multimedia Systems and Services (KES-IIMSS-19), held on 17–19 June 2019 and co-located in St. Julians, on the island of Malta, as part of the KES Smart Digital Futures 2019 multi-theme conference.
The major areas covered by KES-InMed-19 include: Digital IT Architecture in Healthcare; Advanced ICT for Medical and Healthcare; Biomedical Engineering, Trends, Research and Technologies and Healthcare Support System. The major areas covered by KES-IIMSS-19 were: Interactive Technologies; Artificial Intelligence and Data Analytics; Intelligent Services and Architectures and Applications.
This book is of use to researchers in these vibrant areas, managers, industrialists and anyone wishing to gain an overview of the latest research in these fields.