The aim of this paper is to show to what extent Artificial Intelligence can be used to optimize forecasting capability in procurement and to compare AI with traditional statistical methods. At the same time, this article presents the status quo of the research project ANIMATE, which applies Artificial Intelligence to forecast customer orders in medium-sized companies.
Precise forecasts are essential for companies' planning, decision making and controlling. Forecasts are applied, for example, in the areas of supply chain, production or purchasing. Medium-sized companies face major challenges in finding suitable methods to improve their forecasting ability.
Companies often use proven methods from classical statistics, such as the ARIMA algorithm. However, simple statistical methods often fail when applied to complex, non-linear predictions.
Initial results show that even a simple MLP ANN produces better results than traditional statistical methods. Furthermore, a baseline of the company (Implicit Sales Expectation) was used to compare performance; this comparison also shows that the proposed AI method is superior.
Before the developed method can become part of corporate practice, it must be optimized further. The model has difficulties with strong declines, for example due to holidays. The authors are confident that the model can be improved further, for example through more advanced methods such as a FilterNet, but also through more data, such as external data on holiday periods.
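As an illustration of the kind of comparison the abstract describes, the sketch below pits a simple linear autoregressive forecaster (a stand-in for ARIMA-style methods) against a naive last-value baseline on synthetic seasonal demand data. The series, lag length, and split point are invented for the example; this is not the project's data or model.

```python
import numpy as np

def make_windows(series, lag):
    # Build supervised pairs: each row of X holds `lag` past values, y the next value.
    X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
    y = series[lag:]
    return X, y

# Synthetic monthly demand: seasonal pattern plus noise (invented for the example).
rng = np.random.default_rng(0)
t = np.arange(300)
series = 50 + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1, len(t))

lag = 12
X, y = make_windows(series, lag)
split = 250
Xtr, ytr, Xte, yte = X[:split], y[:split], X[split:], y[split:]

# Linear autoregressive forecaster fitted by least squares (ARIMA-style stand-in).
A = np.c_[np.ones(len(Xtr)), Xtr]
coef, *_ = np.linalg.lstsq(A, ytr, rcond=None)
pred_ar = np.c_[np.ones(len(Xte)), Xte] @ coef

# Naive baseline: tomorrow's forecast is today's value.
pred_naive = Xte[:, -1]

mae_ar = np.mean(np.abs(pred_ar - yte))
mae_naive = np.mean(np.abs(pred_naive - yte))
```

On this seasonal series the fitted model clearly beats the naive baseline, which mirrors the kind of baseline comparison reported for the project.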
We propose a method for recognizing dynamic gestures using a 3D sensor. New aspects of the developed system include problem-adapted data conversion and compression as well as automatic detection of different variants of the same gesture via clustering with a suitable metric inspired by the Jaccard metric. The combination of Hidden Markov Models and clustering leads to robust detection of different executions based on a small set of training data. We achieved an increase of 5% in recognition rate compared to regular Hidden Markov Models. The system has been used for human-machine interaction and might serve as an assistive system in physiotherapy and neurological or orthopedic diagnosis.
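The clustering step relies on a metric inspired by the Jaccard metric. A minimal sketch of such a distance on binary occupancy vectors follows; the vectors and the exact feature representation are invented for illustration and are not the paper's actual metric or data.

```python
import numpy as np

def jaccard_distance(a, b):
    # Jaccard distance on binary vectors: 1 - |intersection| / |union|.
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    union = np.logical_or(a, b).sum()
    if union == 0:
        return 0.0
    return 1.0 - np.logical_and(a, b).sum() / union

# Two executions of the "same" gesture (small differences in occupied cells) ...
g1 = [1, 1, 1, 0, 0, 0, 1, 0]
g2 = [1, 1, 0, 0, 0, 0, 1, 0]
# ... versus a different gesture occupying other cells entirely.
g3 = [0, 0, 0, 1, 1, 1, 0, 1]

d_same = jaccard_distance(g1, g2)   # small distance: variants of one gesture
d_diff = jaccard_distance(g1, g3)   # large distance: different gestures
```

Clustering with such a distance groups variant executions of one gesture together while keeping distinct gestures apart, which is the effect the abstract describes.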
Urban platforms are essential for smart and sustainable city planning and operation. Today, they are mostly designed to handle and connect large urban data sets from very different domains. Modelling and optimisation functionalities are usually not part of a city's software infrastructure, yet they are considered crucial for developing transformation scenarios and for optimised smart city operation. This work discusses software architecture concepts for such urban platforms and presents case study results on building sector modelling, including urban data analysis and visualisation. Results from a case study in New York are presented to demonstrate the implementation status.
Additive manufacturing (AM) is a promising manufacturing method for many industrial sectors. For industrial application, requirements such as high production volumes and coordinated implementation must be taken into account. The internal scheduling of production facilities is handled by the Production Planning and Control (PPC) information system. A key factor in planning and scheduling is the exact calculation of manufacturing times. For this purpose, we investigate the use of Machine Learning (ML) to predict the manufacturing times of AM facilities.
This study describes a non-contact measuring and system identification procedure for evaluating inhomogeneous stiffness and damping characteristics of the annular ligament in the physiological amplitude and frequency range without the application of large static external forces that can cause unnatural displacements of the stapes. To verify the procedure, measurements were first conducted on a steel beam. Then, measurements on an individual human cadaveric temporal bone sample were performed. The estimated results support the inhomogeneous stiffness and damping distribution of the annular ligament and are in good agreement with the multiphoton microscopy results, which show that the posterior-inferior corner of the stapes footplate is the stiffest region of the annular ligament.
This report summarizes the main work carried out and the results achieved in the joint project "GalvanoFlex_BW" in the calendar year 2018. First of all, the acquisition and evaluation of measurement data has been completed. Several measurement campaigns were carried out at NovoPlan. At C&C Bark, existing data could partly be drawn upon and was supplemented by further measurements where needed. At Hartchrom, no measurements could be carried out due to staff shortages. The recorded data was transferred into an efficiency assessment, from which general conclusions are to be derived in the following. For this purpose, a simulation program has been set up that is able to model process chains energetically and optimize them. In addition, improved heat demand profiles for the companies are to be developed from the measurement data and then made available for the CHP optimization. In the course of developing and evaluating electricity-optimized CHP strategies, an existing simulation model has been extended accordingly. Specifically, the model was supplemented by an improved load forecast for electricity and heat in industrial companies, and the optimization procedure was extended by a second dimension. While previously only the optimization of self-consumption coverage was possible, with a limit on the number of CHP unit starts as a constraint, the capping of the electrical peak load is now additionally integrated into the objective function. Particularly for industrial companies, this allows a further, in some cases considerable, reduction in energy costs, which is confirmed by initial calculations for the three companies represented in the real-world laboratory. The results are discussed under WP 8 (Implementation). The dialogue with further companies and institutions outside the project was continued via the industry platform.
In 2018, two events of this kind were held, and a further workshop on this topic will take place in spring 2019. The accompanying social science research was likewise continued as planned with the second phase of the company surveys. With regard to the implementation of a CHP concept, two important points emerged: First, the implementing company must possess a certain "energy efficiency maturity", reflected among other things in its experience with carrying out energy efficiency measures, since the installation of a CHP unit is a highly complex measure. Second, further company-specific context factors must be present, such as construction work that has to be carried out for other reasons, so that certain windows of opportunity arise in which the implementation of CHP measures makes sense.
IC layout automation with self-organized wiring and arrangement of responsive modules (SWARM)
(2019)
Focused on automating analog IC layout, the multi-agent system Self-organized Wiring and Arrangement of Responsive Modules (SWARM) combines the powers of procedural generators and algorithmic optimization into a novel bottom-up meets top-down flow of supervised layout module interaction. Provoking self-organization via the effect of emergence, examples show SWARM finding even optimal placement solutions and producing constraint-compliant layout blocks which fit into a specified zone.
As a consequence of the ongoing digitalization of the manufacturing industry, applications and services with potentially positive effects on factors such as effectiveness and work quality are being developed. Gamification can be a suitable approach for strengthening motivational aspects in the work context. This paper presents the initial design and evaluation of a gamification approach for users of an AI service for machine optimization and extracts possible requirements for a concept to increase motivation.
This paper presents a temporal prediction of earthquakes. For this purpose, Convolutional Neural Networks (CNNs) are trained on a dataset of laboratory earthquakes. The trained networks make predictions by classifying a segment of seismic input data; through this classification, the CNN can predict the time remaining until the next earthquake. Two approaches are compared. In the first approach, the raw data is fed into a CNN. In the second approach, the data is pre-processed with Mel Frequency Cepstral Coefficients (MFCC) before being passed to the CNN. It turns out that good classification is possible with both approaches. The combination of MFCC and CNN delivers the better quantitative results, achieving an accuracy of 65%.
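Treating the forecast as classification requires discretizing the continuous time-to-next-quake into class bins. A minimal sketch of that binning step follows; the bin edges and units below are invented for illustration, as the paper's actual class boundaries are not given here.

```python
import numpy as np

def to_class(time_to_failure, edges):
    # Map a continuous time-to-next-quake onto a class index via bin edges,
    # so a CNN can predict it as a classification problem.
    return int(np.digitize(time_to_failure, edges))

edges = [2.0, 4.0, 8.0]  # assumed boundaries (e.g. in seconds): four classes 0..3
labels = [to_class(t, edges) for t in (0.5, 3.1, 9.7)]
```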
Semi-automated image data labelling using AprilTags as a pre-processing step for machine learning
(2019)
Data labelling is a pre-processing step to prepare data for machine learning. There are many ways to collect and prepare this data, but they are usually associated with considerable effort. This paper presents an approach to semi-automated image data labelling using AprilTags. The AprilTags attached to the object, which contain a unique ID, make it possible to link the object surfaces to a particular class. This approach is implemented and used to label data of a stackable box.
The data is evaluated by training a You Only Look Once (YOLO) net, with a subsequent evaluation of the detection results. These results show that the semi-automatically collected and labelled data can certainly be used for machine learning. However, if concise features of an object surface are covered by the AprilTag, there is a risk that the concerned class will not be recognized. It can be assumed that the labelled data can not only be used for YOLO, but also for other machine learning approaches.
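The link from a decoded AprilTag ID to a training label can be sketched as follows. The tag-to-class mapping, image size, and box coordinates are invented for illustration; the text-label format shown (normalized center and size per line) is the commonly used Darknet YOLO convention, not necessarily the paper's exact pipeline.

```python
# Hypothetical mapping from decoded tag IDs to class indices, e.g. one class
# per object surface of the stackable box.
TAG_TO_CLASS = {17: 0}  # assumed: tag 17 -> class 0

def to_yolo_label(tag_id, box, img_w, img_h):
    """Convert a pixel box (x_min, y_min, x_max, y_max) into a YOLO label line:
    '<class> <x_center> <y_center> <width> <height>' with coordinates in [0, 1]."""
    x_min, y_min, x_max, y_max = box
    xc = (x_min + x_max) / 2 / img_w
    yc = (y_min + y_max) / 2 / img_h
    w = (x_max - x_min) / img_w
    h = (y_max - y_min) / img_h
    cls = TAG_TO_CLASS[tag_id]
    return f"{cls} {xc:.6f} {yc:.6f} {w:.6f} {h:.6f}"

# One detected tag on a 640x480 frame yields one label line for the image.
label = to_yolo_label(17, (100, 200, 300, 400), 640, 480)
```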
The student conference Informatics Inside is now taking place for the eleventh time. As part of the Master's program Human-Centered Computing, Master's students independently organize a full-fledged scientific conference. Computer science is still subject to constant change, and our students contribute to this change by solving current problems with innovative concepts in their scientific specialization. By now, however, computer science is not always immediately visible; we notice it whenever something does not work as intended. This year's motto of Informatics Inside is experience (IT);, disguised as a function call :).
In this paper, we address the novel EDP (Expert Design Plan) principle for procedural design automation of analog integrated circuits, which captures the knowledge-based design strategy of human circuit designers in an executable script, making it reusable. We present the EDP Player, which enables the creation and execution of EDPs for arbitrary circuits in the Cadence® Virtuoso® Design Environment. The tool provides a generic version of an instruction set, called EDPL (EDPLanguage), enabling emulation of a typical manual analog sizing flow. To automate the design of a Miller Operational Amplifier and to create variants of a Smart Power IC, several EDPs were implemented using this tool. Employing these EDPs leads to a strong reduction of design time without compromising design quality or reliability.
Serverless computing is an emerging cloud computing paradigm with the goal of freeing developers from resource management issues. As of today, serverless computing platforms are mainly used to process computations triggered by events or user requests that can be executed independently of each other. These workloads benefit from on-demand and elastic compute resources as well as per-function billing. However, it is still an open research question to which extent parallel applications, which most often comprise complex coordination and communication patterns, can benefit from serverless computing.
In this paper, we introduce serverless skeletons for parallel cloud programming to free developers from both parallelism and resource management issues. In particular, we investigate the well-known and widely used farm skeleton, which supports the implementation of a wide range of applications. To evaluate our concepts, we present a prototypical development and runtime framework and implement two applications based on our framework: numerical integration and hyperparameter optimization, a commonly applied technique in machine learning. We report on performance measurements for both applications and discuss the usefulness of our approach.
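A farm skeleton distributes independent tasks over workers and collects the results. The local sketch below uses a thread pool as a stand-in for per-invocation serverless workers and applies it to a numerical-integration example; the integrand, interval, and chunking are invented for illustration and are not the paper's framework.

```python
from concurrent.futures import ThreadPoolExecutor

def farm(worker, tasks, max_workers=4):
    # Farm skeleton: apply an independent worker to each task in parallel,
    # gathering results in task order (Executor.map preserves input order).
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(worker, tasks))

def integrate_chunk(bounds, n=10_000):
    # Midpoint-rule integral of f(x) = x**2 over one sub-interval.
    a, b = bounds
    h = (b - a) / n
    return sum((a + (i + 0.5) * h) ** 2 for i in range(n)) * h

# Split [0, 1] into four chunks and farm them out; the partial sums add up.
chunks = [(i / 4, (i + 1) / 4) for i in range(4)]
total = sum(farm(integrate_chunk, chunks))
```

On a serverless platform, each `integrate_chunk` call would map to one function invocation, billed per execution, which is exactly the workload shape the farm skeleton targets.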
The Virtual Power Plant Neckar-Alb is a demonstration platform for operation, optimization and control of distributed energy resources, which are able to produce, store or consume electric energy. A heterogeneous set of distributed energy devices has been installed at the campus of Reutlingen University by the Reutlingen Energy Centre (REZ) of the School of Engineering. The distributed energy devices have been combined into local microgrids and connected to an operative central power plant with additional participants. The demonstration platform serves students, researchers and industry experts for education and investigation of new technologies, devices and software.
A digital twin, a replica of the energy devices, was established in the computing environment of MATLAB and Simulink. It continuously simulates their operation and is time-synchronized and connected to the central energy management and control system of a virtual power plant. The model can be used as a platform for testing device performance under various conditions, working schedules and new optimization options.
Companies are continuously changing their strategy, processes, and information systems to benefit from the digital transformation. Controlling the digital architecture and governance is the fundamental goal. Enterprise Governance, Risk and Compliance (GRC) systems are vital for managing digital risks that threaten modern enterprises from many different angles. The most significant constituent of GRC systems is the definition of controls that are implemented on different layers of a digital Enterprise Architecture (EA). As part of the compliance aspect of GRC, the effectiveness of these controls is assessed and reported to the relevant management bodies within the enterprise. In this paper, we present a metamodel which links controls to the affected elements of a digital EA and supplies a way of expressing associated assessment techniques and results. We complement the metamodel with an expository instantiation of a control compliance cockpit in an international insurance enterprise.
Business process models provide a considerable number of benefits for enterprises and organizations, but the creation of such models is costly and time-consuming, which slows down the organizational adoption of business process modeling. Social paradigms pave new ways for business process modeling by integrating stakeholders and leveraging knowledge sources. However, empirical research about the impact of social paradigms on the costs of business process modeling is sparse. A better understanding of their impact could help to reduce the cost of business process modeling and improve decision-making on BPM activities. The paper contributes to this field by reporting on an empirical investigation, conducted via survey research, of the perceived influence of different cost factors among experts. Our results indicate that different cost components, as well as the use of social paradigms, influence cost.
Due to the consequential impact of technological breakdowns, companies have to be prepared to deal with breakdowns or, even better, prevent them. In today's information technology, several methods and tools exist to mitigate this concern. This paper therefore deals with the initial determination of a resilient enterprise architecture supporting predictive maintenance in the information technology domain and, furthermore, discusses several mechanisms for reactively and proactively securing the state of resiliency on several abstraction levels. The objective of this paper is to give an overview of existing mechanisms for resiliency and to describe the foundation of an optimized approach combining infrastructure and process mining techniques.
This book contains the proceedings of the KES International conferences on Innovation in Medicine and Healthcare (KES-InMed-19) and Intelligent Interactive Multimedia Systems and Services (KES-IIMSS-19), held on 17–19 June 2019 and co-located in St. Julians, on the island of Malta, as part of the KES Smart Digital Futures 2019 multi-theme conference.
The major areas covered by KES-InMed-19 include: Digital IT Architecture in Healthcare; Advanced ICT for Medical and Healthcare; Biomedical Engineering, Trends, Research and Technologies and Healthcare Support System. The major areas covered by KES-IIMSS-19 were: Interactive Technologies; Artificial Intelligence and Data Analytics; Intelligent Services and Architectures and Applications.
This book is of use to researchers in these vibrant areas, managers, industrialists and anyone wishing to gain an overview of the latest research in these fields.
The rise of digital technologies has become an important driver for change in multiple industries. Therefore, firms need to develop digital capabilities to manage the transformation process successfully. Prior research assumes that the development of a specific set of digital capabilities leads to higher digital maturity. However, a measurement framework for digital maturity does not exist in scholarly work. Therefore, this paper develops a conceptualization and measurement model for digital maturity.
In 2017, Philips' goal was to use innovation to improve the lives of three billion people a year by 2025. To achieve that, the company was shifting from selling medical products in a transactional manner to providing integrated healthcare solutions based on digital health technology. Based on our interviews with 23 executives at Philips, the case examines the two directions of the transformation required by this shift: externally, Philips worked on transforming how healthcare was conducted. Healthcare professionals would have to change the way they worked, and reimbursement schemes needed to change to incentivize payers, providers, and patients in vastly different ways. Internally, Philips needed to redesign how its employees worked. The company componentized its business, introduced digital platforms, and co-created integrated solutions with the various stakeholders of the healthcare industry. In other words: Philips was transforming itself in order to reinvent healthcare in the digital age.
Autism spectrum disorders (ASD) affect a large number of children both in the Russian Federation and in Germany. Early diagnosis is key for these children: the sooner parents notice such disorders in a child and the rehabilitation and treatment program starts, the higher the likelihood of the child's social adaptation. The difficulties of raising such a child lie in the complexity of educating the child outside of children's groups and the complexity of providing medical care. In this regard, the development of digital applications that facilitate medical care and education of such children at home is important and relevant. The purpose of the project is to improve the availability and quality of healthcare and social adaptation at home of children with ASD through the use of digital technologies.
Continuous refactoring is necessary to maintain source code quality and to cope with technical debt. Since manual refactoring is inefficient and error prone, various solutions for automated refactoring have been proposed in the past. However, empirical studies have shown that these solutions are not widely accepted by software developers and most refactorings are still performed manually. For example, developers reported that refactoring tools should support functionality for reviewing changes. They also criticized that introducing such tools would require substantial effort for configuration and integration into the current development environment.
In this paper, we present our work towards the Refactoring-Bot, an autonomous bot that integrates into the team like a human developer via the existing version control platform. The bot automatically performs refactorings to resolve code smells and presents the changes to a developer for asynchronous review via pull requests. This way, developers are not interrupted in their workflow and can review the changes at any time with familiar tools. Proposed refactorings can then be integrated into the code base via the push of a button. We elaborate on our vision, discuss design decisions, describe the current state of development, and give an outlook on planned development and research activities.
The Eleventh International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2019), held on June 02–06, 2019 in Athens, Greece, continued a series of international events covering a large spectrum of topics related to advances in database fundamentals, the evolving relationship between databases and other domains, database technologies and content processing, as well as the specifics of databases in application domains.
Advances in different technologies and domains related to databases triggered substantial improvements for content processing, information indexing, and data, process and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the widespread adoption of XML.
High-speed communications and computations, large storage capacities, and load balancing for distributed database access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms and advanced indexing methods.
Evolution in e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems puts pressure on database communities to push the 'de facto' methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
We welcomed academic, research and industry contributions. The conference had the following tracks:
Knowledge and decision base
Databases technologies
Data management
GraphSM: Large-scale Graph Analysis, Management and Applications
This paper presents an approach for the implementation of a modular and scalable power electronics device for controlling electric drives in the field of electric vehicles using wide bandgap semiconductor devices. The main idea is to achieve the required output currents or voltages by connecting adequately designed hardware modules in parallel or in series. This particular design is based on the fact that the single modules generate a continuous and specified output voltage from a given DC voltage, e.g. an intermediate circuit or battery voltage. The main benefit is that different current or voltage requirements can be satisfied based on a single module, thus decreasing development and production costs. The current paper focuses on the parallel connection of such modules. A control architecture is illustrated and a first proof of concept is given.
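The scaling idea, meeting different current requirements with one module design, reduces to a simple sizing rule for the parallel case, since parallel-connected module currents add. The current values below are invented for illustration and do not come from the paper.

```python
import math

def modules_needed(required_current_a, module_current_a):
    # Parallel connection: output currents add up, so the module count is
    # the ceiling of the required current over the per-module current.
    return math.ceil(required_current_a / module_current_a)

# Hypothetical example: a drive needing 450 A served by 100 A modules.
n = modules_needed(450, 100)
```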
Creativity, problem-solving skills and the ability for collaborative work are considered key competences for facing the challenges of the 21st century. Children are born with an inherent creativity that decreases throughout their school careers. A research team of designers and educators investigates whether the implementation of Design Thinking (DT) in textile education in German elementary schools is a suitable method to preserve children's creativity. Initial surveys with teachers and pilot studies in elementary schools showed high motivation and open-mindedness towards DT in the classroom. The challenge will be to develop suitable teaching modules for elementary schools in the federal state of Baden-Württemberg.
Creativity, problem-solving skills and collaborative work are defined as key competences of the 21st century in numerous international studies as well as by the OECD (2017). Nevertheless, many teaching and learning methods are still oriented towards conveying predefined solution paths. Studies at the secondary level in the USA, Germany and Asia show that Design Thinking, through its creative and collaborative elements, can lead to more sustainable learning success among learners and to higher satisfaction among teachers when conveying the content.
The core elements of Design Thinking are: the iterative process with its phases of understanding, observing, defining points of view, finding ideas, building prototypes and testing; work in multidisciplinary teams; and user orientation in defining the task (Brown, 2009). The phases of the iterative process show a high congruence with the process-oriented competences of the subjects Art/Crafts and General Studies according to the curriculum for elementary schools (Ministerium für Kultus, Jugend und Sport Baden-Württemberg, 2016). As part of an interdisciplinary doctoral project at the PH Freiburg, a qualitative research design will be used to investigate the extent to which Design Thinking is suitable for fostering creativity, problem-solving skills and collaborative work among elementary school children in Art/Crafts and General Studies from the perspective of teachers. Preliminary studies with teachers and teacher trainers, in which questionnaires were administered after participation in a Design Thinking workshop, as well as two pilot teaching units at elementary schools with participant observation, expert interviews and interviews with children in small groups, show first results.
To remain competitive in a fast changing environment, many companies started to migrate their legacy applications towards a Microservices architecture. Such extensive migration processes require careful planning and consideration of implications and challenges likewise. In this regard, hands-on experiences from industry practice are still rare. To fill this gap in scientific literature, we contribute a qualitative study on intentions, strategies, and challenges in the context of migrations to Microservices. We investigated the migration process of 14 systems across different domains and sizes by conducting 16 in-depth interviews with software professionals from 10 companies. Along with a summary of the most important findings, we present a separate discussion of each case. As primary migration drivers, maintainability and scalability were identified. Due to the high complexity of their legacy systems, most companies preferred a rewrite using current technologies over splitting up existing code bases. This was often caused by the absence of a suitable decomposition approach. As such, finding the right service cut was a major technical challenge, next to building the necessary expertise with new technologies. Organizational challenges were especially related to large, traditional companies that simultaneously established agile processes. Initiating a mindset change and ensuring smooth collaboration between teams were crucial for them. Future research on the evolution of software systems can in particular profit from the individual cases presented.
While Microservices promise several beneficial characteristics for sustainable long-term software evolution, little empirical research covers what concrete activities industry applies for the evolvability assurance of Microservices and how technical debt is handled in such systems. Since insights into the current state of practice are very important for researchers, we performed a qualitative interview study to explore applied evolvability assurance processes, the usage of tools, metrics, and patterns, as well as participants’ reflections on the topic. In 17 semi-structured interviews, we discussed 14 different Microservice-based systems with software professionals from 10 companies and how the sustainable evolution of these systems was ensured. Interview transcripts were analyzed with a detailed coding system and the constant comparison method.
We found that especially systems for external customers relied on central governance for the assurance. Participants saw guidelines like architectural principles as important to ensure a base consistency for evolvability. Interviewees also valued manual activities like code review, even though automation and tool support was described as very important. Source code quality was the primary target for the usage of tools and metrics. Despite most reported issues being related to Architectural Technical Debt (ATD), our participants did not apply any architectural or service-oriented tools and metrics. While participants generally saw their Microservices as evolvable, service cutting and finding an appropriate service granularity with low coupling and high cohesion were reported as challenging. Future Microservices research in the areas of evolution and technical debt should take these findings and industry sentiments into account.
Small and Medium Enterprises (SMEs), which play a substantial role in the development of any economy, have been on the rise in recent periods. Consequently, these enterprises are faced with a myriad of challenges which could potentially be solved through the adoption of technology. Nonetheless, it has been observed that new technological uptake among SMEs remains limited, with the majority of them opting to maintain the status quo with regard to technology awareness and innovation strategies.
In a literature review, this paper explores three major dynamics curtailing the adoption of new technologies by SMEs in manufacturing: knowledge absorptive capacity and management factors, organisational structures, and technological awareness. Firstly, with regard to knowledge absorptive capacity and management factors, this study shows how these factors drive innovation potential in SMEs.
Secondly, with regard to technological awareness factors, this study documents how perceived usefulness, costs, network and infrastructure, education and skills, training and attitude, as well as knowledge influence the adoption of new technologies among SMEs around the world. Lastly, the study concludes by analysing how organisational structures drive the innovation potential of SMEs in the wake of swift and profound technological changes in the market.
The relevance of technology knowledge in digital transformation, especially in small and medium-sized enterprises (SMEs) that are still largely dependent on physical human capital, has become increasingly obvious. This is due to the rapid revolution in the business environment, coupled with an increasing number of real-world examples of firms disrupted by advances in technological knowledge. Consequently, we find it increasingly vital for SMEs to spot and mitigate threats as well as take advantage of opportunities arising from the dynamism of digital transformation.
Our study aims at exploring the relevance of technology knowledge in SMEs for digital transformation to uncover the opportunities, roadmaps, and models that SMEs can take advantage of in the digital transformation and gain a competitive edge.
We conclude that, irrespective of the relevance of technology knowledge for digital transformation and its low costs and accessibility, SMEs have yet to realize the full potential of technological knowledge. This is mainly because technologies appear, change and vanish so rapidly in the digital age that gaining a proper understanding without dedicated resources is utterly difficult for SMEs, making them less competitive than incumbent large firms in the market.
The energy turnaround, digitalization and decreasing revenues force enterprises in the energy domain to develop new business models. Business models for renewable energy are built on a different logic than business models for large-scale power plants. Following a design science research approach, we first examined the business models of three enterprises in the energy domain. We identified that these business models result in complex ecosystems with multiple actors and difficult relationships between them. One cause is the fast-changing and complicated state regulation in Germany. To address this problem, we captured the requirements together with partners from the enterprises in a second phase. We then developed the prototype Business Model Configurator (BMConfig), based on the e3Value Ontology, on the metamodelling platform ADOxx. We demonstrate the feasibility of our approach with the business model of an energy efficiency service based on smart meter data.
In a time of upheaval and digitalization, new business models for companies play an important role. Decentralized power generation and energy efficiency targets to achieve climate goals and to reduce global warming are currently forcing energy companies to develop new business models. In recent years, many methods of business model development have been introduced to create new business ideas. But what obstacles stand in the way of implementing these business models in the energy sector to develop new business opportunities? And what challenges do companies face in this respect? To answer these questions, a systematic literature review was conducted in this paper. As a result, eight categories were identified which summarise the main barriers for the implementation of new business models in the energy domain.
We introduce IPA-IDX, an approach that handles index modifications on modern storage technologies (NVM, flash) as physical in-place appends, using simplified physiological log records. IPA-IDX provides similar performance and longevity advantages for indexes as basic IPA [5] does for tables. The selective application of IPA-IDX and basic IPA to certain regions and objects lowers the GC overhead by over 60% while keeping the total space overhead at 2%. The combined effect of IPA and IPA-IDX increases performance by 28%.
Because of saturated markets and low profit margins in car sales, car manufacturers focus more and more on profitable product-related services. This paper deals with the question of how to classify product-related services in the automotive industry and which characteristic product-related services are offered to end users (consumers) in a standardized format. Two research studies on the product-related services provided in 2010 and 2017 by 15 car manufacturers and 20 exemplary automotive brands in Germany revealed that the degree of application by the OEMs (original equipment manufacturers) increased considerably over these years. While in 2010 the average range of services amounted to only 33%, the value in the automotive industry had increased to 57% by 2017.
With the revision of the DIN EN 50173 (VDE 0800-173) series, the optical transmission link classes were, among other changes, removed without replacement. To close the resulting gap, the German committee DKE GUK 715.3 "Informationstechnische Verkabelung von Gebäudekomplexen" developed new classes, which were published in June 2019 in DIN VDE 0800-173-100 "Klassifizierung von Lichtwellenleiter-Übertragungsstrecken". The standard classifies optical fibre transmission links for application-neutral communication cabling systems according to DIN EN 50173-1.
It helps users to enable a broad range of applications, facilitates the selection of the cabling system, provides a future-proof classification of optical fibre cabling, and serves to describe system requirements.
The classes defined in the standard describe the requirements for the transmission links and are based on a maximum permissible insertion loss in dB for maximum link lengths, additionally taking the bandwidth-length product into account.
The article provides an overview of the standard and presents application examples.
Workflow-driven support systems in the peri-operative area have the potential to optimize clinical processes and to enable new situation-adaptive support systems. We started to develop a workflow management system that supports all actors involved in the operating theatre, with the goal of synchronizing the tasks of the different stakeholders by giving relevant information to the right team members. Using the OMG standards BPMN, CMMN and DMN gives us the opportunity to bring established methods from other industries into the medical field. The system shows each addressed actor their information in the right place at the right time, so that every member can execute their task in time and ensure a smooth workflow; the system maintains the overall view of all tasks. Accordingly, we use a workflow management system consisting of the Camunda BPM workflow engine to run the models, a middleware to connect different systems to the workflow engine, and graphical user interfaces to show necessary information and to interact with the system. The complete pipeline is implemented as a RESTful web service. The system is designed to integrate other systems such as a hospital information system (HIS) via the RESTful web service easily and without loss of data. A first prototype is implemented and will be extended.
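The core idea of routing each task to the stakeholder who has to act on it can be illustrated with a minimal sketch. The task identifiers and roles below are invented for illustration and are not taken from the described system:

```python
# Illustrative sketch (not the authors' implementation): each workflow
# task carries the stakeholder role that must act on it, and every actor
# only sees the tasks addressed to their role, in workflow order.
TASKS = [
    {"id": "prepare_instruments", "role": "scrub_nurse"},
    {"id": "induce_anaesthesia",  "role": "anaesthetist"},
    {"id": "start_incision",      "role": "surgeon"},
]

def worklist(role, tasks):
    """Return the task ids a given actor role should see."""
    return [t["id"] for t in tasks if t["role"] == role]

print(worklist("surgeon", TASKS))       # ['start_incision']
print(worklist("anaesthetist", TASKS))  # ['induce_anaesthesia']
```

In the actual system this filtering would be performed by the workflow engine, with the user interfaces querying it over the RESTful web service.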
This paper discusses the optimal control problem of increasing the energy efficiency of induction machines in dynamic operation, including the field-weakening regime. In an offline procedure, optimal current and flux trajectories are determined such that the copper losses are minimized during transient operation. These trajectories are useful for a subsequent online implementation.
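For context, such an offline trajectory optimization can be stated in a standard rotor-flux-oriented formulation (the symbols below are generic textbook quantities, not taken from the paper): minimize the copper-loss integral subject to the rotor-flux dynamics and a demanded torque profile, with field weakening adding an upper bound on the flux at high speed.

```latex
\min_{i_{sd},\,i_{sq}} \; \int_0^{T} \tfrac{3}{2}\Big( R_s \big(i_{sd}^2 + i_{sq}^2\big) + R_r\, i_{rq}^2 \Big)\, \mathrm{d}t
\quad \text{s.t.} \quad
\tau_r\, \dot{\psi}_r + \psi_r = L_m\, i_{sd}, \qquad
T_e = \tfrac{3}{2}\, p\, \frac{L_m}{L_r}\, \psi_r\, i_{sq} = T_e^{*}(t),
\qquad
\psi_r \le \psi_{r,\max}(\omega)
```

Here $R_s$, $R_r$ are the stator and rotor resistances, $\tau_r = L_r/R_r$ the rotor time constant, $\psi_r$ the rotor flux, and $T_e^{*}(t)$ the demanded torque; the flux bound $\psi_{r,\max}(\omega)$ captures the field-weakening regime.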
In this work, design rules for a novel brushless excitation system for externally excited synchronous machines are discussed. The concept replaces slip rings with a full-bridge active rectifier and a controller mounted on the rotor. An AC signal induced from the stator is used to charge the rotor DC link, from which the DC current for the rotor excitation is provided. Finite element analysis of an existing machine is used to analyze the practicability of the excitation system.
A novel brushless excitation concept for synchronous machines with a rotating power converter is proposed in this paper. The concept does not need an auxiliary winding or any other modification to the machine structure apart from an inverter with a DC link capacitor and a controller on the rotor. The power required for the rotor excitation is provided by injecting harmonics into the stator winding, thereby inducing a voltage in the field coil. The rotor inverter is controlled such that the alternating current charges the DC link capacitor, while at the same time the inverter supplies the DC field current to the field coil. The excitation concept is first developed in theory, then presented using an analytical model and FEA, and lastly investigated with a preliminary experimental setup.
Digital technologies are moving into physical products. Smart cars, connected lightbulbs and data-generating tennis rackets are examples of previously “pure” physical products that turned into “digitized products”. Digitizing products offers many use cases for consumers that will hopefully persuade them to buy these products. Yet, as revenues from selling digitized products will remain small in the near future, digitized product manufacturers have to look for other sources of benefits. Producer-side use cases describe how manufacturers can benefit internally from the digitized products they produce. Our article identifies three categories of such use cases: product-, service-, and process-related ones.
The goal of the presented project is to develop the concept of home e-health centers for barrier-free and cross-border telemedicine. AAL technologies are already on the market, but there is still a gap to close before they can serve ordinary patient needs. The general idea needs to be accompanied by new services, which should be brought together to provide full service coverage for the users. Sleep and stress were chosen because of their predominant influence on the population. A scientific study of available home devices for sleep analysis provided the basis necessary to select appropriate devices. The first choice for the project implementation is the EMFIT QS+ device. This equipment provides part of a complete system that a home telemedical hospital can offer, at a suitable level of precision and with communication with internal and/or external health services.
A virtual power plant is a network of energy plants coordinated by a common control system in order to make better use of weather-dependent energy sources or to enable the joint marketing of the generated electricity. The demonstrator Virtuelles Kraftwerk Neckar-Alb is a demonstration platform for research and teaching that integrates plants on the campus of Hochschule Reutlingen and distributed plants in the Neckar-Alb region.
Novel design for a coreless printed circuit board transformer realizing high bandwidth and coupling
(2019)
Rogowski coils offer galvanic isolation and can measure alternating currents with a high bandwidth. Coreless printed circuit board (PCB) transformers have been used as an alternative to limit the additional stray inductance when a Rogowski coil cannot be attached to the circuit. A new PCB transformer layout is proposed to reduce cost, decrease additional stray inductance, increase the bandwidth of current measurements and simplify the integration into existing designs.
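As background, the measurement principle is the textbook relation (not specific to the proposed layout): the coil outputs a voltage proportional to the time derivative of the primary current via the mutual inductance $M$, so an integrator recovers the current.

```latex
v_{\mathrm{coil}}(t) = M\,\frac{\mathrm{d}i(t)}{\mathrm{d}t}
\;\;\Rightarrow\;\;
i(t) = \frac{1}{M}\int v_{\mathrm{coil}}(t)\,\mathrm{d}t
```

The usable bandwidth is limited by the coil's self-inductance and parasitic capacitance, which is why stray inductance is a central design concern in the proposed layout.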
Improved inductive feed-forward for fast turn-on of power semiconductors during hard switching
(2019)
A transformer is used to increase the gate voltage during turn-on, thus reducing the necessary bias voltage of the gate driver. By counteracting the voltage dependency of the gate capacitance of high-voltage power devices, faster transitions are possible. The additional transformer only slightly increases the over-voltage during turn-off.
In this paper, we introduce an approach for using reinforcement learning to achieve interoperability between heterogeneous Internet of Things (IoT) components. More specifically, we model an HTTP REST service as a Markov decision process and adapt Q-learning to the properties of REST, so that an agent in the role of an HTTP REST client can learn the semantics of the service and, in particular, an optimal sequence of service calls to achieve an application-specific goal. With our approach, we want to open up and facilitate a discussion in the community, as we see the key to achieving interoperability in IoT in the utilization of artificial intelligence techniques.
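The idea of learning a valid call sequence can be sketched with tabular Q-learning against a tiny simulated service. The endpoints, states and reward values below are illustrative assumptions, not taken from the paper; a real agent would issue HTTP requests instead of calling a simulator:

```python
import random

# Toy REST service: state advances only when calls arrive in a valid
# order (create -> add items -> submit). Endpoint names are invented.
ACTIONS = ["POST /orders", "PUT /orders/1/items", "POST /orders/1/submit"]
GOAL_STATE = 3  # order submitted

def step(state, action_idx):
    """Simulated service: a call succeeds only in the matching state."""
    if action_idx == state:                # correct next call (2xx)
        new_state = state + 1
        reward = 10 if new_state == GOAL_STATE else 0
        return new_state, reward
    return state, -1                       # wrong call (4xx): penalty

def train(episodes=500, alpha=0.5, gamma=0.9, epsilon=0.2, seed=0):
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in range(GOAL_STATE) for a in range(len(ACTIONS))}
    for _ in range(episodes):
        state = 0
        while state != GOAL_STATE:
            if rng.random() < epsilon:     # explore
                action = rng.randrange(len(ACTIONS))
            else:                          # exploit
                action = max(range(len(ACTIONS)), key=lambda a: q[(state, a)])
            new_state, reward = step(state, action)
            future = 0.0 if new_state == GOAL_STATE else max(
                q[(new_state, a)] for a in range(len(ACTIONS)))
            q[(state, action)] += alpha * (reward + gamma * future
                                           - q[(state, action)])
            state = new_state
    return q

q = train()
# The greedy policy recovers the valid call order.
policy = [max(range(len(ACTIONS)), key=lambda a: q[(s, a)])
          for s in range(GOAL_STATE)]
print([ACTIONS[a] for a in policy])
```

The negative reward for rejected calls and the terminal reward for reaching the goal are what let the agent discover the service's implicit call-order semantics without prior knowledge of the API.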
Interoperability is an important topic in the Internet of Things (IoT), because this domain incorporates diverse and heterogeneous objects, communication protocols and data formats. Many models and classification schemes have been proposed to make the degree of interoperability measurable, however only on the basis of a hierarchical scale. In this paper we introduce a novel approach to measure the degree of interoperability as a metrically scaled quantity. We consider IoT as a distributed system in which interoperable objects exchange messages with each other. Under this premise, we interpret messages as operation calls and formalize this view as a causal model. The analysis of this model enables us to quantify the interoperable behavior of communicating objects.
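To make the "messages as operation calls" view concrete, one could quantify interoperability as the fraction of sent calls the receiving object can actually execute. This is a hypothetical sketch of such a metric, not the authors' causal model; the operation names and parameters are invented:

```python
def interoperability_degree(sent_calls, supported_ops):
    """Hypothetical metric sketch: interpret each message as an operation
    call and return the fraction the receiver can execute, i.e. a value
    on a metric scale in [0, 1] rather than a hierarchical level."""
    if not sent_calls:
        return 0.0
    understood = sum(
        1 for op, params in sent_calls
        if op in supported_ops and set(params) <= set(supported_ops[op]))
    return understood / len(sent_calls)

# A sensor sends three calls to an actuator that supports two operations:
supported = {"setTemp": ["value", "unit"], "getStatus": []}
calls = [("setTemp", ["value", "unit"]),  # fully understood
         ("setTemp", ["value", "fmt"]),   # unknown parameter
         ("reboot", [])]                  # unknown operation
print(interoperability_degree(calls, supported))  # 1/3
```

Unlike a hierarchical level ("syntactic", "semantic", ...), such a ratio can be compared and aggregated arithmetically across object pairs.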
Increasing flexibility, greater transparency and faster adaptability play a key role in the development of future intralogistics. Ever-changing environmental conditions require easy extensibility and modifiability of existing bin systems. This research project explores approaches to transfer the Internet of Things (IoT) paradigm to intralogistics, allowing a synchronization of the material and information flow. Adequate hardware and software components enable the bin to capture, store, process and forward data to selected system subscribers. Monitoring the processes in intralogistics by means of the smart bin system ensures the implementation of appropriate actions in case of defined deviations. Through exploratory expert interviews with representatives from the automotive and pharmaceutical industries, seven practical application scenarios were defined. On this basis, the requirements of smart bin systems were examined. For each individual application case, a system model was created in order to obtain an overview of the system components and thus reveal similarities and differences. Based on the similarities of the system models, a general requirement profile was derived. After the hardware components of the bin system had been determined, a utility analysis was carried out to find the adequate IoT software. The utility analysis was conducted with a focus on data acquisition and data transfer, data storage, data analysis, data presentation as well as authorization management and data security. The results show that there is great interest in easily expandable and modifiable bin systems, as in all cases the necessary information flow in the existing bin system has to be improved by means of new IoT hardware and software components.
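A utility analysis of this kind is typically a weighted-sum scoring over the evaluation criteria. The weights and scores below are invented for illustration, not results from the study:

```python
# Hedged sketch of a weighted-sum utility analysis over the criteria
# named above; weights and candidate scores are illustrative only.
CRITERIA = {
    "data_acquisition_and_transfer": 0.25,
    "data_storage":                  0.15,
    "data_analysis":                 0.20,
    "data_presentation":             0.15,
    "authorization_and_security":    0.25,
}

def utility(scores, weights=CRITERIA):
    """Weighted-sum utility of one IoT software candidate (scores 1-5)."""
    return sum(weights[c] * scores[c] for c in weights)

candidate_a = {"data_acquisition_and_transfer": 4, "data_storage": 3,
               "data_analysis": 5, "data_presentation": 3,
               "authorization_and_security": 4}
print(round(utility(candidate_a), 2))  # 3.9
```

Ranking candidates by this score makes the trade-off between, say, strong data analysis and weaker security support explicit and reproducible.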
The global demand for individualized products, leading to decreasing production batch sizes, requires innovative approaches to organizing production and logistics systems in a dynamic manner. Current material flow systems mainly rely on predefined system structures and processes, which results in a huge increase of complexity and effort for system and process changes to realize an optimized production and material provision of individualized products. Autonomous production and logistics entities, in combination with intelligent products or logistic load carriers following the vision of the "Internet of Things", offer a promising solution for mastering this complexity based on autonomous, decentralized and target-size-optimized decision making and structure formation without the need for predefined processes and central decision-making bodies. Customer orders are going to prioritize themselves and communicate directly with the required production and logistics resources. Bins containing the required materials are going to communicate with the conveyors or workers of the respective intralogistics system, organizing and controlling the material flow to the autonomously selected workstation. A current research project is the development of a collaborative tugger train combining the potential of automation and human-robot collaboration in intralogistics. This tugger train is going to be integrated into a self-organized intralogistics scenario involving individualized customer orders (low to high batch sizes). To classify the application of self-organization within intralogistics systems, a criteria catalogue has been developed. The application of this criteria catalogue will be demonstrated on the example of a self-organization scenario involving the collaborative tugger train and an intelligent bin system.