Informatik
In both industrial and scientific institutions, the use of additive manufacturing is growing steadily and has become indispensable, particularly in prototyping. The tool-less production of parts allows production resources to be used dynamically right up to the start of manufacturing. This makes it possible, on the one hand, to react agilely to changes in detailed scheduling and sequencing and, on the other hand, to combine models from different production orders and thereby achieve high efficiency of the manufacturing equipment. When multiple machines are operated within one company or a network of partners, the prevailing lack of transparency confronts companies and company networks with many challenges. Blockchain technology enables a shared data basis between the participants: entries are logged and the authenticity of the participants is guaranteed. In the relationship between customers and producers, this leads to verifiable cooperation, since companies conclude service contracts that are based on a flow of many small transactions. This paper shows how available additive manufacturing resources can be detected and, using blockchain technology, offered in a decentralized production network and used by different actors.
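As a purely illustrative sketch (not from the paper), the shared, tamper-evident data basis can be pictured as an append-only, hash-chained log on which capacity offers and bookings are recorded as small transactions; all names below are hypothetical:

```python
import hashlib
import json
import time

# Minimal append-only, hash-chained log illustrating how offers of
# additive-manufacturing capacity and the resulting small transactions
# could be recorded on a shared, tamper-evident data basis.
class Ledger:
    def __init__(self):
        self.entries = []

    def append(self, payload: dict) -> dict:
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {"ts": time.time(), "payload": payload, "prev": prev_hash}
        body["hash"] = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append(body)
        return body

ledger = Ledger()
# A producer advertises free printer capacity; a customer books it.
ledger.append({"type": "offer", "machine": "SLS-01", "slot": "2024-05-01T08:00"})
ledger.append({"type": "booking", "machine": "SLS-01", "order": "ORD-4711"})
```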
The performance and scalability of modern data-intensive systems are limited by the massive movement of ever-growing datasets across the whole memory hierarchy to the CPUs. Such traditional processor-centric DBMS architectures are bandwidth- and latency-bound. Processing-in-Memory (PIM) designs seek to overcome these limitations by integrating memory and processing functionality on the same chip. PIM targets near- or in-memory data processing, leveraging the greater in-situ parallelism and bandwidth.
In this paper, we introduce pimDB and provide an initial comparison of processor-centric and PIM-DBMS approaches with respect to aspects such as scalability and parallelism, cache-awareness, and PIM-specific compute/bandwidth trade-offs. The evaluation is performed end-to-end on a real PIM hardware system from UPMEM.
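To make the processor-centric vs. PIM contrast tangible, here is a toy Python model that only counts the bytes a selection moves to the host; it is not pimDB, and real UPMEM DPUs are programmed in C:

```python
# Toy model contrasting processor-centric and PIM-style execution of a
# selection (filter) over a column. It only counts bytes moved to the
# host and ignores all other costs.
import random

column = [random.randint(0, 100) for _ in range(1_000_000)]
ELEM = 4  # assumed element size in bytes

def processor_centric(col, pred):
    moved = len(col) * ELEM          # entire column crosses the memory bus
    return [x for x in col if pred(x)], moved

def pim_style(col, pred, banks=64):
    step = len(col) // banks
    result, moved = [], 0
    for b in range(banks):           # each "DPU" filters its partition in place
        part = [x for x in col[b * step:(b + 1) * step] if pred(x)]
        moved += len(part) * ELEM    # only qualifying tuples reach the host
        result.extend(part)
    return result, moved

_, host_bytes = processor_centric(column, lambda x: x < 5)
_, pim_bytes = pim_style(column, lambda x: x < 5)
print(f"host-moved bytes: processor-centric={host_bytes}, PIM={pim_bytes}")
```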
Even though near-data processing (NDP) can provably reduce data transfers and increase performance, NDP is currently utilized solely in read-only settings. Synchronization and invalidation mechanisms between host and smart storage are slow or tedious to implement, which makes NDP support for data-intensive update operations difficult. In this paper, we introduce a low-latency, cache-coherent shared lock table for update NDP settings in disaggregated memory environments. It utilizes the novel CCIX interconnect technology and is integrated in neoDBMS, a near-data processing DBMS for smart storage. Our evaluation indicates end-to-end lock latencies of ∼80-100 ns and robust performance under contention.
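A heavily simplified sketch of what such a lock table boils down to: a fixed-size array of lock words indexed by hash, acquired with compare-and-swap semantics. The Python below emulates the CAS with a mutex; the real design relies on cache-coherent atomics over CCIX:

```python
# Minimal sketch of a shared lock table as a fixed-size hash table of
# lock words. This illustrates the general idea only, not the paper's
# hardware-level design.
import threading

class LockTable:
    FREE = 0

    def __init__(self, slots=1024):
        self.words = [self.FREE] * slots
        self.guard = threading.Lock()   # stand-in for per-word CAS

    def _slot(self, page_id: int) -> int:
        return hash(page_id) % len(self.words)

    def try_lock(self, page_id: int, owner: int) -> bool:
        s = self._slot(page_id)
        with self.guard:                # emulates CAS(FREE -> owner)
            if self.words[s] == self.FREE:
                self.words[s] = owner
                return True
            return False

    def unlock(self, page_id: int, owner: int) -> None:
        s = self._slot(page_id)
        with self.guard:
            if self.words[s] == owner:
                self.words[s] = self.FREE
```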
Multi-versioning and MVCC are the foundations of many modern DBMSs. Under mixed workloads and large datasets, the creation of the transactional snapshot can become very expensive, as long-running analytical transactions may request old versions, residing on cold storage, for reasons of transactional consistency. Furthermore, analytical queries operate on cold data, stored on slow persistent storage. Due to the poor data locality, snapshot creation may cause massive data transfers and thus lower performance. Given the current trend towards computational storage and near-data processing, it has become viable to perform such operations in-storage to reduce data transfers and improve scalability. neoDBMS is a DBMS designed for near-data processing and computational storage. In this paper, we demonstrate how neoDBMS performs snapshot computation in-situ. We showcase different interactive scenarios, where neoDBMS outperforms PostgreSQL 12 by up to 5×.
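For readers unfamiliar with multi-versioning, the following sketch shows the textbook visibility check behind snapshot computation: walk a tuple's version chain from newest to oldest until a version qualifies under the snapshot. This illustrates the general technique, not neoDBMS's in-storage implementation:

```python
# Textbook MVCC visibility check. A version is visible to a snapshot if
# its creating transaction committed before the snapshot was taken and
# it was not yet deleted at that point (in-progress sets omitted for
# brevity).
from dataclasses import dataclass
from typing import Optional

@dataclass
class Version:
    value: object
    xmin: int                         # id of creating transaction
    xmax: Optional[int] = None        # id of deleting transaction, if any
    prev: Optional["Version"] = None  # older version in the chain

def visible(v: Version, snapshot_xid: int, committed: set) -> bool:
    created = v.xmin in committed and v.xmin < snapshot_xid
    deleted = v.xmax is not None and v.xmax in committed and v.xmax < snapshot_xid
    return created and not deleted

def read(newest: Version, snapshot_xid: int, committed: set):
    v = newest                        # walk from newest to oldest version
    while v is not None:
        if visible(v, snapshot_xid, committed):
            return v.value
        v = v.prev                    # older versions may live on cold storage
    return None
```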
The Internet of Things (IoT) refers to the interconnectedness of physical objects, which are equipped with sensors and actuators as a means to connect to the internet. The number of connected things has increased threefold over the past five years. Consequently, firms expect the IoT to become a source of new technology-driven business models. However, only a few early adopters have started to install and use IoT appliances on a frequent basis, so it is still unclear which factors drive the technological acceptance of IoT appliances. Confronting this gap in current research, the present paper explores how IoT appliances are conceptually defined, which factors drive their technological acceptance, and how firms can use these results to improve the value propositions in corresponding business models. It is found that IoT appliance vendors need to support a broad focus, since potential buyers vary widely. Drawing conclusions from this insight, the paper illustrates some flexible marketing strategies.
This work is a report on practical experiences with the issue of interoperability in German practice management systems (PMSs) from an ongoing clinical trial on teledermatology, the TeleDerm project. An established proprietary web platform for store-and-forward telemedicine is integrated with the IT in the GPs’ offices for the automatic exchange of basic patient data. Most of the 19 different PMSs included in the study sample lack support for modern health data exchange standards; therefore, the relatively old but widely available German health data exchange interface “Gerätedatentransfer” (GDT) is used. Due to the lack of enforcement and regulation of the GDT standard, several obstacles to interoperability are encountered. As a partial but reusable working solution to these issues, we present a custom middleware that is used in conjunction with GDT. We describe the design, the technical implementation, and the hindrances observed with the existing infrastructure. A discussion of health care interfacing standards and the current state of interoperability in German PMS software is given.
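To illustrate why GDT integration is feasible yet brittle, here is a minimal sketch of the underlying xDT-style line format (3-digit length, 4-digit field identifier, then content); the actual middleware does considerably more, such as watching exchange directories and normalizing vendor-specific quirks:

```python
# Sketch of the file-based exchange GDT relies on: each line carries a
# 3-digit total length, a 4-digit field identifier and the field content.
# Field semantics below follow common xDT conventions (e.g. 3000 =
# patient number); this is not the paper's middleware.
def parse_gdt(text: str) -> dict:
    fields = {}
    for line in text.splitlines():
        if len(line) < 7:
            continue
        field_id, content = line[3:7], line[7:]
        fields.setdefault(field_id, []).append(content)
    return fields

def make_line(field_id: str, content: str) -> str:
    # total length = 3 (length digits) + 4 (field id) + content + 2 (CRLF)
    return f"{3 + 4 + len(content) + 2:03d}{field_id}{content}"

record = "\r\n".join([
    make_line("8000", "6301"),      # record type
    make_line("3000", "12345"),     # patient number
])
print(parse_gdt(record))
```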
Large critical systems, such as those created in the space domain, are usually developed by a large number of organizations and, furthermore, have to comply with standards. Yet, the different stakeholders often do not have a common understanding of the required quality of requirements specifications. Achieving such a common understanding is a laborious process that is currently not sufficiently supported; moreover, it must be aligned with the standards. In this paper, we present an approach that can be used to align the different stakeholder perceptions regarding the quality of requirements specifications. Existing quality models for requirements specifications are analyzed for equivalences and transferred into a common representation, the so-called Aligned Quality Map (AQM). Furthermore, a process is defined that supports the alignment of different stakeholder perspectives on the quality of requirements specifications using the AQM; it is validated in a case study in the context of European space projects. The AQM has been created and populated with an initial set of quality models, and it is designed in such a way that it can be extended to include further quality models. The case study has shown that aligning different stakeholder perspectives and the quality model of the European Cooperation for Space Standardization using the AQM is feasible. The approach allows different stakeholder perspectives to be aligned toward a common understanding of the quality of requirements specifications in the context of standards. Furthermore, the AQM supports the assessment of requirements specifications.
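Conceptually, such an aligned quality map can be pictured as linking each model-neutral quality concept to the factor names the individual quality models use for it; the sketch below uses invented model and factor names, not the actual AQM contents:

```python
# Illustrative data structure for an aligned quality map: quality factors
# from different models are linked to a shared, model-neutral concept.
from collections import defaultdict

aqm = defaultdict(dict)  # concept -> {model: factor name in that model}

def align(concept: str, model: str, factor: str) -> None:
    aqm[concept][model] = factor

align("unambiguity", "generic quality model A", "clarity")
align("unambiguity", "space standard B", "non-ambiguous requirement")
align("verifiability", "space standard B", "verifiable requirement")

# Which factor does each model use for a given shared concept?
print(dict(aqm["unambiguity"]))
```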
We present an evaluation of the effectiveness of different machine learning algorithms on a publicly available database of signals derived from wearable devices, with the goal of optimizing human activity recognition and classification. Among the wide range of body signals, we select two that are easy to acquire simultaneously using widespread commercial devices (e.g. smartwatches) as well as custom wearable wireless devices designed for sport, healthcare, or clinical purposes: photoplethysmographic signals (optically detected subcutaneous blood volume) and tri-axial acceleration signals. To this end, two widely used algorithms (decision tree and k-nearest neighbor) were tested, and their performance was compared to that of two recent algorithms (particle Bernstein and a Monte Carlo-based regression), both in terms of accuracy and processing time. A data preprocessing phase was also introduced to reduce the problem size and improve the performance of the machine learning procedures, and a detailed analysis of the compression strategy and its results is presented.
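The classical half of such a comparison is easy to reproduce; the sketch below benchmarks a decision tree and k-nearest neighbor on synthetic windowed features (the paper uses a public wearable-sensor database, and the particle-Bernstein and Monte Carlo methods have no off-the-shelf scikit-learn counterpart):

```python
# Comparison harness for decision tree vs. k-nearest neighbor on
# synthetic feature windows. On this random data accuracy is near
# chance; the point is the accuracy/processing-time comparison itself.
import time
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 12))     # e.g. mean/std/energy per axis + PPG
y = rng.integers(0, 5, size=2000)   # five assumed activity classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for clf in (DecisionTreeClassifier(), KNeighborsClassifier(n_neighbors=5)):
    t0 = time.perf_counter()
    clf.fit(X_tr, y_tr)
    acc = clf.score(X_te, y_te)
    print(f"{type(clf).__name__}: accuracy={acc:.2f}, "
          f"time={time.perf_counter() - t0:.3f}s")
```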
This paper investigates the possibility of effectively monitoring and controlling the respiratory action using a very simple and non-invasive technique based on a single lightweight, reduced-size wireless surface electromyography (sEMG) sensor placed below the sternum. Due to the critical sensor position, the captured sEMG signal is characterized by a low energy level and is affected by motion artifacts and cardiac noise. In this work, we present a preliminary study performed on adults to assess the correlation between the spirometry signal and the sEMG signal after removal of the superimposed heart signal. This study and the related findings could be useful for the respiratory monitoring of preterm infants.
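One common, simple way to suppress cardiac contamination in sEMG is a high-pass filter, since most ECG energy lies below roughly 30 Hz while sEMG extends much higher; the paper's actual removal method may differ, so the following is only a generic illustration on synthetic data:

```python
# Generic high-pass filtering of a synthetic sEMG-like signal to
# attenuate a slow cardiac component. Sampling rate and cutoff are
# assumptions, not taken from the paper.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000                                  # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)
ecg_like = 0.5 * np.sin(2 * np.pi * 1.2 * t)        # slow cardiac component
emg_like = 0.1 * np.random.default_rng(0).normal(size=t.size)
signal = emg_like + ecg_like

b, a = butter(4, 30 / (fs / 2), btype="highpass")   # 4th-order, 30 Hz cutoff
cleaned = filtfilt(b, a, signal)                    # zero-phase filtering
print(f"residual low-frequency power: {np.var(cleaned - emg_like):.4f}")
```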
Smart meter based business models for the electricity sector : a systematical literature research
(2017)
The Act on the Digitization of the Energy Transition forces German industries and households to introduce smart meters in order to save energy, to obtain individually based electricity tariffs, and to digitize the energy data flow. Smart meters can be regarded as the advancement of the traditional meter. Utilizing this new technology enables a wide range of innovative business models that provide additional value for electricity suppliers as well as for their customers. In this study, we followed a two-step approach. First, we provide a state-of-the-art comparison of the business models found in the literature and identify structural differences in the way they add value to the offered products and services. Second, the business models are grouped into categories with respect to customer segments and the value added to the smart grid. Findings indicate that most business models focus on the end-customer as their main customer.