Informatik
Cardiovascular diseases are directly or indirectly responsible for up to 38.5% of all deaths in Germany and thus represent the most frequent cause of death. At present, heart diseases are mainly discovered by chance during routine visits to the doctor or when acute symptoms occur. However, there is no practical method to proactively detect diseases or abnormalities of the heart in the daily environment and to take preventive measures for the person concerned. Long-term ECG devices, as currently used by physicians, are simply too expensive, impractical, and not widely available for everyday use. This work aims to develop an ECG device suitable for everyday use that can be worn directly on the body. For this purpose, an already existing hardware platform will be analyzed, and the corresponding potential for improvement will be identified. A precise picture of the existing data quality is obtained by metrological examination, and corresponding requirements are defined. Based on these identified optimization potentials, a new ECG device is developed. The revised ECG device is characterized by a high integration density and combines all components directly on one board except the battery and the ECG electrodes. The compact design allows the device to be attached directly to the chest. An integrated microcontroller allows digital signal processing without the need for an additional computer. Central features of the evaluation are a peak detection for detecting R-peaks and a calculation of the current heart rate based on the RR interval. To ensure the validity of the detected R-peaks, a model of the anatomical conditions is used. Thus, unrealistic RR-intervals can be excluded. The wireless interface allows continuous transmission of the calculated heart rate. Following the development of hardware and software, the results are verified, and appropriate conclusions about the data quality are drawn. 
As a result, a very compact, wearable ECG device with multiple wireless technologies, data storage, and RR-interval evaluation was developed. Some tests yielded runtimes of up to 24 hours with wireless LAN activated and streaming enabled.
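The plausibility check described above can be sketched as follows. This is a minimal illustration, not the device's firmware: the threshold values are assumed for the example, standing in for the anatomical model the authors use.

```python
# Illustrative plausibility filter for RR intervals (assumed bounds, not the
# device's actual parameters): intervals outside an anatomically plausible
# range are discarded before the heart rate is derived from them.

PLAUSIBLE_RR_MS = (300.0, 2000.0)  # roughly 30-200 bpm; assumed bounds

def filter_rr(rr_intervals_ms):
    """Keep only RR intervals within the plausible physiological range."""
    lo, hi = PLAUSIBLE_RR_MS
    return [rr for rr in rr_intervals_ms if lo <= rr <= hi]

def heart_rate_bpm(rr_intervals_ms):
    """Current heart rate from the mean of the accepted RR intervals."""
    valid = filter_rr(rr_intervals_ms)
    if not valid:
        return None
    mean_rr = sum(valid) / len(valid)
    return 60000.0 / mean_rr  # 60,000 ms per minute divided by mean RR in ms
```

For example, `heart_rate_bpm([800, 790, 50, 810])` excludes the 50 ms artifact and computes the rate from the three remaining intervals.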
Documentation of clinical processes, especially in the perioperative area, is a basic requirement for quality of care. Nonetheless, documentation is a burden for the medical staff since it distracts from the clinical core process. An intuitive and user-friendly documentation system could increase documentation quality and reduce documentation workload. The optimal system would know what happened, and the person documenting the step would only need a single "confirm" button. In many cases, such a linear flow of activities is given as long as only one profession (e.g. anaesthesiology, scrub nurse) is considered, but even in such cases, there may be deviations from the linear process flow that require further interaction.
With significant advancements in digital technologies, firms find themselves competing in an increasingly dynamic business environment. It is of paramount importance that organizations establish proper governance mechanisms for their business and IT strategies. Therefore, IT governance (ITG) has become an important factor for firm performance. In recent years, agility has evolved as a core concept for governance, especially in the area of software development. However, the impact of agility on ITG and firm performance has not been analyzed by the broad scientific community. This paper focuses on the question of how the concept of agility affects the ITG–firm performance relationship. The conceptual model for this question was tested in a quantitative research process with 400 executives responding to a standardized survey. Findings show that adopting agile principles, values, and best practices in the context of ITG leads to meaningful results for governance, business/IT alignment, and firm performance.
In recent years, the cloud has become an attractive execution environment for parallel applications, which introduces novel opportunities for versatile optimizations. Particularly promising in this context is the elasticity characteristic of cloud environments. While elasticity is well established for client-server applications, it is a fundamentally new concept for parallel applications. However, existing elasticity mechanisms for client-server applications can be applied to parallel applications only to a limited extent. Efficient exploitation of elasticity for parallel applications requires novel mechanisms that take into account the particular runtime characteristics and resource requirements of this application type. To tackle this issue, we propose an elasticity description language. This language enables users to define elasticity policies, which specify the elasticity behavior at both the cloud infrastructure level and the application level. Elasticity at the application level is supported by an adequate programming and execution model, as well as abstractions that comply with the dynamic availability of resources. We present the underlying concepts and mechanisms, as well as the architecture and a prototypical implementation. Furthermore, we illustrate the capabilities of our approach through real-world scenarios.
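To make the idea of an elasticity policy concrete, the following sketch shows one hypothetical adaptation rule of the kind such a policy could express. All names and thresholds are invented for illustration; they are not part of the proposed description language.

```python
# Hypothetical elasticity policy sketch: scale the worker count of a parallel
# application up or down based on the per-worker load. Names and thresholds
# are illustrative assumptions, not the paper's actual language constructs.

from dataclasses import dataclass

@dataclass
class ElasticityPolicy:
    min_workers: int
    max_workers: int
    scale_out_load: float  # add a worker above this per-worker load
    scale_in_load: float   # remove a worker below this per-worker load

def decide(policy, current_workers, pending_tasks):
    """Return the worker count for the next adaptation step."""
    load = pending_tasks / current_workers
    if load > policy.scale_out_load and current_workers < policy.max_workers:
        return current_workers + 1
    if load < policy.scale_in_load and current_workers > policy.min_workers:
        return current_workers - 1
    return current_workers
```

A real policy would additionally account for the application-level abstractions mentioned above, e.g. whether the programming model allows workers to join or leave mid-computation.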
Requirements engineering (RE) comprises all systematic steps in the development of a system to satisfy the needs of its users and the requirements placed on it. The RE of a selected vendor of clinical information systems (CIS) was examined and found to be non-transparent and in part insufficient. The extent to which systematic procedures and RE methods are used at the selected CIS vendor was analyzed. The analysis shows that RE is widely practiced there, but in varying ways.
The goal of this work is to determine the state of the art of RE for CIS development. Important RE factors for the development of CIS are described. The results of this work will serve as a first step toward optimizing the RE of the selected CIS vendor.
This work is a comparative study of survey tools intended to help developers select a suitable tool for use in an AAL environment. The first step was to identify the basic functionality required of survey tools used with AAL technologies and to compare these tools by functionality and purpose. The comparison was derived from the collected data, previous literature studies, and further technical data. A list of requirements was compiled and ordered by relevance to the target application domain. With the help of an integrated assessment method, a generalized score was calculated for each tool, and the result is explained. Finally, the planned use of the selected tool in a running project is described.
Development work within an experimental environment, in which certain properties are investigated and optimized, requires many test runs and is therefore often associated with long execution times, costs, and risks. This affects product, material, and technology development in industry and research. New digital technologies offer the possibility to automate complex manual work steps cost-effectively, to increase the relevance of the results, and to accelerate the processes many times over. In this context, this article presents a low-cost, modular, open-source machine vision system for test execution and evaluates it on a real industrial application. For this purpose, a methodology is presented for the automated execution of the load intervals, for process documentation, and for the evaluation of the generated data by means of machine learning to classify wear levels. The software and the mechanical structure are designed to be adaptable to different conditions and components and to a variety of tasks in industry and research. The mechanical structure is required for tracking the test object and represents a motion platform positioned independently by machine vision operators or machine learning. The state of the test object is evaluated by transfer learning after the initial documentation run. The manual procedure for classifying the visually recorded data on the state of the test object is described for the training material. This increases resource efficiency on both the material and the personnel side: the continuous documentation increases the significance of the tests performed, and the responsible experts can be assigned time-efficiently. The presence and know-how of the experts are therefore only required for defined, decisive events during the execution of the experiments.
Furthermore, the generated data are suitable for later use as an additional source of data for predictive maintenance of the developed object.
Our paper gives first answers to a fundamental question: how can the design of architectures of intelligent digital systems and services be accomplished methodologically? Intelligent systems and services are the goals of many current digitalization efforts today and part of massive digital transformation efforts based on digital technologies. Digital systems and services are the foundation of digital platforms and ecosystems. Digitalization disrupts existing businesses, technologies, and economies and promotes the architecture of open environments. This has a strong impact on new value-added opportunities and the development of intelligent digital systems and services. Digital technologies such as artificial intelligence, the Internet of Things, services computing, cloud computing, big data with analytics, mobile systems, and social enterprise network systems are important enablers of digitalization. The current publication presents our research on the architecture of intelligent digital ecosystems, products, and services influenced by the service-dominant logic. We present original methodological extensions and a new reference model for digital architectures with an integral service and value perspective to model intelligent systems and services that effectively align digital strategies and architectures, with artificial intelligence as a main element to support intelligent digitalization.
Automatic anode rod inspection in aluminum smelters using deep-learning techniques: a case study
(2020)
Automatic fault detection using machine learning has become an exciting and promising area of research because it provides an accurate and timely way to manage and classify faults with minimal human effort. In the computer vision community, deep-learning methods have become the most suitable approaches for this task. Anodes are large carbon blocks that conduct electricity during the aluminum reduction process. The most basic function of anode rod inspection is to prevent a situation where the anode rod will not fit into the stub-holes of a new anode, as would be the case for a rod with severe toe-in, missing stubs, or a thimble retained on one or more stubs. In this work, to improve the accuracy of shape defect inspection for anode rods, we use the Fast Region-based Convolutional Network (Fast R-CNN) model. To train the detection model, we collected an image dataset covering multiple classes of anode rod defects with annotated labels. Our model is trained using a small number of samples, an essential requirement in industry, where the number of available defective samples is limited. It can simultaneously detect multiple classes of anode rod defects in near real-time.
Automatic classification of rotating machinery defects using Machine Learning (ML) algorithms
(2020)
Electric machines and motors have been the subject of enormous development. New concepts in design and control allow their applications to expand into different fields. Vast amounts of data have been collected in almost every domain of interest. The data can be static, that is, they represent real-world processes at a fixed point in time. Vibration analysis and vibration monitoring, including the detection and monitoring of anomalies in vibration data, are widely used techniques for predictive maintenance of high-speed rotating machines. However, accurately identifying the presence of a bearing fault can be challenging in practice, especially when the failure is still at an incipient stage and the signal-to-noise ratio of the monitored signal is small. The main objective of this work is to design a system that analyzes the vibration signals of a rotating machine in the time/frequency domain, based on recorded sensor data. Reflecting this substantial interest, there has been a dramatic increase in applying Machine Learning (ML) algorithms to this task. An ML system is used to classify and detect abnormal behavior and to recognize the different levels of machine operation modes. The proposed solution can be deployed as predictive maintenance for Industry 4.0.
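The kind of time/frequency-domain analysis described above can be sketched in a few lines. This is a simplified illustration under assumed feature choices: two basic features are extracted from a vibration signal, and a toy threshold rule stands in for the trained ML classifier.

```python
# Minimal sketch of a vibration-analysis pipeline: RMS amplitude (time
# domain) and dominant frequency (frequency domain) as features, plus a toy
# rule standing in for the trained ML model. Thresholds are illustrative.

import numpy as np

def extract_features(signal, fs):
    """Return (RMS amplitude, dominant frequency in Hz) of a signal sampled at fs."""
    rms = np.sqrt(np.mean(signal ** 2))
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    dominant = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
    return rms, dominant

def classify(rms, rms_threshold=1.0):
    """Toy decision rule: unusually high vibration energy suggests a fault."""
    return "fault" if rms > rms_threshold else "normal"
```

For a pure 50 Hz sine wave sampled at 1 kHz for one second, `extract_features` yields an RMS near 0.707 and a dominant frequency of 50 Hz; a real system would feed such features into the ML classifier instead of a fixed threshold.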
Checklists are a valuable tool to ensure process quality and quality of care. To ensure proper integration in clinical processes, it would be desirable to generate checklists directly from formal process descriptions. Those checklists could also be used for user interaction in context-aware surgical assist systems. We built a tool to automatically convert Business Process Model and Notation (BPMN) process models to checklists displayed as HTML websites. Gateways representing decisions are mapped to checklist items that trigger dynamic content loading based on the placed checkmark. The usability of the resulting system was positively evaluated regarding comprehensibility and end-user friendliness.
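The core of the conversion described above can be illustrated with a small sketch. It makes simplifying assumptions (only tasks and exclusive gateways are handled, and checklist items are plain tuples rather than HTML); element names follow the BPMN 2.0 XML schema.

```python
# Simplified sketch of BPMN-to-checklist conversion: tasks become plain
# checklist items, exclusive gateways become decision items. Real BPMN
# models (and the HTML rendering) are considerably more involved.

import xml.etree.ElementTree as ET

BPMN_NS = "{http://www.omg.org/spec/BPMN/20100524/MODEL}"

def bpmn_to_checklist(xml_text):
    """Return checklist entries as (kind, label) tuples in document order."""
    root = ET.fromstring(xml_text)
    items = []
    for elem in root.iter():
        if elem.tag == BPMN_NS + "task":
            items.append(("check", elem.get("name")))
        elif elem.tag == BPMN_NS + "exclusiveGateway":
            items.append(("decision", elem.get("name")))
    return items
```

In the full tool, each "decision" entry would trigger dynamic loading of the checklist branch matching the placed checkmark.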
In cooperation with the medical device manufacturer ulrich medical, a user experience and usability study is being conducted on the software of the contrast agent injectors currently in use. The company wants to develop a new variant of a contrast agent injector based on an improved version of this software. User studies can be carried out with a wide variety of methods. A suitable procedure must be defined, and the test persons must be determined with respect to the chosen method. For medical devices, strict requirements laid down in standards and laws must additionally be observed. The basis for the method selection is research on usability and user experience requirements for medical devices. The study is evaluated using quantitative data from a usability test in the laboratory, user experience questionnaires, and qualitative post-test interviews. This study primarily serves to identify possible improvements, which will be deepened and implemented in the subsequent master's thesis.
In previous studies, we used a method for detecting stress that was based exclusively on heart rate and ECG for differentiation between such situations as mental stress, physical activity, relaxation, and rest. As a response of the heart to these situations, we observed different behavior in the Root Mean Square of Successive Differences (RMSSD) of heartbeat intervals. This study aims to analyze virtual reality, delivered via a virtual reality headset, as an effective stressor for future works. The RMSSD value is an important marker for the parasympathetic effect on the heart and can provide information about stress. For these measurements, the RR intervals were collected using a chest belt; no additional sensors were used for the analysis. We conducted experiments with ten subjects who had to drive a simulator for 25 minutes using monitors and 25 minutes using a virtual reality headset. Before starting and after finishing each simulation, the subjects had to complete a survey describing their mental state. The experiment results show that driving with a virtual reality headset has some influence on the heart rate and RMSSD, but it does not significantly increase the stress of driving.
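The RMSSD marker used above is a standard time-domain HRV measure and can be computed directly from the recorded RR intervals:

```python
# RMSSD: root mean square of successive RR-interval differences, a standard
# time-domain heart rate variability measure.

import math

def rmssd(rr_intervals_ms):
    """RMSSD in milliseconds from a list of RR intervals in milliseconds."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))
```

For example, `rmssd([800, 810, 790, 805])` averages the squared successive differences (10, -20, 15 ms) and takes the square root.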
While many maintainability metrics have been explicitly designed for service-based systems, tool-supported approaches to automatically collect these metrics are lacking. Especially in the context of microservices, decentralization and technological heterogeneity may pose challenges for static analysis. We therefore propose the modular and extensible RAMA approach (RESTful API Metric Analyzer) to calculate such metrics from machine-readable interface descriptions of RESTful services. We also provide prototypical tool support, the RAMA CLI, which currently parses the formats OpenAPI, RAML, and WADL and calculates 10 structural service-based metrics proposed in scientific literature. To make RAMA measurement results more actionable, we additionally designed a repeatable benchmark for quartile-based threshold ranges (green, yellow, orange, red). In an exemplary run, we derived thresholds for all RAMA CLI metrics from the interface descriptions of 1,737 publicly available RESTful APIs. Researchers and practitioners can use RAMA to evaluate the maintainability of RESTful services or to support the empirical evaluation of new service interface metrics.
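The quartile-based threshold derivation described above can be sketched as follows. This is an illustrative outline, not the RAMA CLI implementation; it assumes the common orientation that higher metric values are worse.

```python
# Sketch of quartile-based threshold bands: metric values from a benchmark
# corpus are split at the quartiles into four bands (green/yellow/orange/
# red). Orientation (higher = worse) is an assumption of this sketch.

import statistics

def threshold_bands(values):
    """Return the (q1, q2, q3) cut points separating the four bands."""
    q1, q2, q3 = statistics.quantiles(values, n=4)
    return q1, q2, q3

def rate(value, cuts):
    """Map a single metric value onto its color band."""
    q1, q2, q3 = cuts
    if value <= q1:
        return "green"
    if value <= q2:
        return "yellow"
    if value <= q3:
        return "orange"
    return "red"
```

In the paper's setting, `values` would be one metric collected from the 1,737 benchmarked API descriptions, and `rate` would classify the same metric for a service under evaluation.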
Detecting semantic similarities between sentences is still a challenge today due to the ambiguity of natural languages. In this work, we propose a simple approach to identifying semantically similar questions by combining the strengths of word embeddings and Convolutional Neural Networks (CNNs). In addition, we demonstrate how the cosine similarity metric can be used to effectively compare feature vectors. Our network is trained on the Quora dataset, which contains over 400k question pairs. We experiment with different embedding approaches such as Word2Vec, Fasttext, and Doc2Vec and investigate the effects these approaches have on model performance. Our model achieves competitive results on the Quora dataset and complements the well-established evidence that CNNs can be utilized for paraphrase recognition tasks.
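The cosine similarity metric mentioned above is straightforward to state; in the model it is applied to the CNN's feature vectors rather than the raw vectors shown here:

```python
# Cosine similarity between two feature vectors: the dot product divided by
# the product of the vector norms, i.e. the cosine of the angle between them.

import math

def cosine_similarity(a, b):
    """Cosine similarity of two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)
```

Identical directions score 1.0, orthogonal vectors 0.0; question pairs whose encodings score above a learned threshold would be judged semantically similar.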
Comparison of sleep characteristics measurements: a case study with a population aged 65 and above
(2020)
Good sleep is crucial for every person's health. Unfortunately, its quality often decreases with aging. A common approach to measuring sleep characteristics is based on interviews with the subjects or letting them fill in a daily questionnaire and afterward evaluating the obtained data. However, this method incurs time and personnel costs for the interviewer and the evaluator of responses. It would therefore be valuable to automate the collection and evaluation of sleep characteristics. To do that, it is necessary to investigate the level of agreement between measurements performed in the traditional way using questionnaires and measurements obtained using electronic monitoring devices. The study presented in this manuscript performs this investigation, comparing such sleep characteristics as "time going to bed", "total time in bed", "total sleep time", and "sleep efficiency". A total of 106 night records of elderly persons (aged 65+) were analyzed. The results achieved so far reveal that the degree of agreement between the two measurement methods varies substantially across characteristics, from 31 minutes of mean difference for "time going to bed" to 77 minutes for "total sleep time". For this reason, a direct exchange of objective and subjective measuring methods is currently not possible.
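One simple way to quantify the per-characteristic agreement reported above is the mean absolute per-night difference between the two methods; this sketch assumes paired per-night values in minutes (the study may additionally use other agreement statistics).

```python
# Illustrative agreement measure: mean absolute per-night difference, in
# minutes, between questionnaire-based and device-based values of one sleep
# characteristic. Assumes both lists are paired by night.

def mean_abs_difference(subjective, objective):
    """Mean absolute difference between paired measurements."""
    diffs = [abs(s - o) for s, o in zip(subjective, objective)]
    return sum(diffs) / len(diffs)
```

For example, two nights with questionnaire values of 400 and 420 minutes against device values of 430 and 410 minutes give a mean difference of 20 minutes.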
In cryosurgery, cold is used to destroy tumorous tissue. For this purpose, cryoprobes are inserted into the tumor and strongly cooled. This involves several challenges that can be addressed with computer assistance. This paper reports the results of a literature review on these challenges. The surveyed works deal with simulating the ice ball forming in the tumor, correctly positioning the cryoprobes in the tumor, monitoring the intervention, and developing simulations for training purposes. It becomes apparent that the use of computer-assisted solutions can improve cryosurgery for both surgeon and patient.
In networked operating room environments, there is an emerging trend toward standardized, non-proprietary communication protocols that allow building new integration solutions and flexible human-machine interaction concepts. The most prominent endeavor is the IEEE 11073 SDC protocol. For some use cases, it would be helpful if not just medical devices could be controlled via SDC, but also building automation systems such as lights, shutters, and air conditioning. For those systems, the KNX protocol is widely used. We built an SDC-to-KNX gateway that allows the SDC protocol to be used for sending commands to connected KNX devices. The first prototype was successfully implemented in the demonstration operating room at Reutlingen University. This is a first step toward the integration of a broader variety of KNX devices.
Background
The ongoing task of electrocardiographic examinations is to increase the reliability of diagnosing the condition of the heart. Within this task, an important direction is the solution of the inverse problem of electrocardiography, based on processing the electrocardiographic signals of multichannel cardio leads with known electrode coordinates in these leads (Titomir et al., Noninvasive Electrocardiotopography, 2003; Macfarlane et al., Comprehensive Electrocardiology, 2nd ed., Chapter 9, 2011).
Results
In order to obtain more detailed information about the electrical activity of the heart, we reconstruct the distribution of equivalent electrical sources on the heart surface. We perform this reconstruction over the cardiac cycle at relatively low hardware cost. ECG maps of the electrical potentials on the torso surface (TSPM) and of the electrical sources on the heart surface (HSSM) were studied at different times of the cardiac cycle. We carried out a visual and quantitative comparison of these maps in the presence of pathological regions of different localization. For this purpose, we used a model of the heart's electrical activity based on cellular automata.
Conclusions
The cellular automata model allows us to consider the processes of heart excitation in the presence of pathological regions of various sizes and localizations. It is shown that changes in the distribution of electrical sources on the epicardial surface caused by pathological areas with disturbed conduction of heart excitation are much more noticeable than the corresponding changes in ECG maps on the torso surface.
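The general idea of an excitable-medium cellular automaton, as used in such heart models, can be illustrated with a minimal one-dimensional sketch (this is an illustration of the technique, not the authors' model): resting cells become excited when a neighbor is excited, then pass through a refractory state, and a blocked cell models a conduction disturbance.

```python
# Minimal 1D excitable-medium cellular automaton (Greenberg-Hastings style):
# 0 = resting, 1 = excited, 2 = refractory. A cell listed in `blocked`
# never excites, modeling a local conduction disturbance.

def step(cells, blocked=frozenset()):
    """One synchronous update of the automaton."""
    nxt = []
    for i, state in enumerate(cells):
        if state == 1:    # excited cells become refractory
            nxt.append(2)
        elif state == 2:  # refractory cells return to rest
            nxt.append(0)
        else:             # resting cells are excited by an excited neighbor
            neighbors = [cells[j] for j in (i - 1, i + 1) if 0 <= j < len(cells)]
            nxt.append(1 if (i not in blocked and 1 in neighbors) else 0)
    return nxt
```

Starting from `[1, 0, 0, 0]`, repeated application of `step` propagates the excitation wave rightward, while adding cell 1 to `blocked` stops the wave at the disturbance, mirroring the conduction pathologies studied above.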
The ZD.BB - a digital hub for small and medium-sized enterprises in the Stuttgart region
(2020)
Digital transformation is one of the most discussed topics in today's business world. Many companies, above all small and medium-sized enterprises (SMEs), find it difficult to assess the opportunities and risks of digitalization. Given all the possibilities and opportunities that digitalization holds, companies that close themselves off from these developments risk losing their market and competitive position. The Digital Hub ZD.BB (Zentrum Digitalisierung), opened in February 2019, is a new central point of contact in the Stuttgart region for all questions related to digitalization. At the ZD.BB, small and medium-sized enterprises (SMEs) as well as startups receive competent consulting and support for their digital transformation processes, ranging from awareness-raising through analysis to the development of solutions for digital processes. With the help of a digital qualification initiative and SME-oriented methods for business model development, companies are comprehensively supported in their digitalization projects at the ZD.BB. To this end, different competencies, disciplines, ideas, technologies, and creativity are brought together in innovation labs, coworking spaces, and at events, thereby producing digital innovations.