Informatik
The massive use of patient data to train artificial intelligence algorithms is now common in medicine. In this work, a statistical analysis is performed on one of the datasets most widely used for training artificial intelligence models for the detection of sleep disorders: the Sleep Heart Health Study 2. The study focuses on determining whether patients' gender and age have an influence relevant enough to justify training artificial intelligence models on datasets differentiated by these variables.
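The subgroup comparison described in this abstract can be sketched with a two-sample test; the sleep metric, group means, and sample sizes below are invented for illustration and are not drawn from the Sleep Heart Health Study data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Illustrative synthetic data: an apnea-hypopnea-index-like metric
# for two patient groups (values are made up, not study data).
metric_group_a = rng.normal(loc=18.0, scale=8.0, size=500)
metric_group_b = rng.normal(loc=12.0, scale=7.0, size=500)

# Welch's t-test: is the group difference large enough to justify
# training separate models per subgroup?
t_stat, p_value = stats.ttest_ind(metric_group_a, metric_group_b,
                                  equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3g}")
```

A small p-value would indicate that the variable defines statistically distinct subpopulations, supporting differentiated training datasets.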
Accurate monitoring of a patient's heart rate is a key element of medical observation and health monitoring. In particular, its importance extends to the identification of sleep-related disorders. Various methods have been established that involve sensor-based recording of physiological signals followed by automated examination and analysis. This study evaluates the efficacy of a non-invasive heart rate monitoring framework based on an accelerometer sensor, specifically during sleep. To achieve this goal, the motion induced by thoracic movements during cardiac contractions is captured by a device installed under the mattress. The computational framework described in this article comprises signal filtering techniques and heart rate estimation using the Symlet 6 wavelet. Subsequent analysis indicates the potential applicability of this system in the prognostic domain, with an average error margin of approximately 3 beats per minute. The results represent a promising advancement in non-invasive heart rate monitoring during sleep, with potential implications for improved diagnosis and management of cardiovascular and sleep-related disorders.
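The processing chain summarized above (filter the raw motion signal, then estimate beats per minute) can be sketched in simplified form. This sketch substitutes a moving-average detrend and simple peak detection for the paper's Symlet 6 wavelet pipeline and runs on a synthetic ballistocardiogram-like trace; the sampling rate and signal shape are assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 100  # Hz, assumed sampling rate of the under-mattress sensor
t = np.arange(0, 30, 1 / fs)
true_bpm = 72

# Synthetic trace: narrow heartbeat pulses on a slow respiration
# baseline plus measurement noise.
beat_times = np.arange(0, 30, 60 / true_bpm)
signal = sum(np.exp(-((t - b) ** 2) / (2 * 0.02 ** 2)) for b in beat_times)
signal += 0.5 * np.sin(2 * np.pi * 0.25 * t)                     # respiration
signal += 0.05 * np.random.default_rng(1).normal(size=t.size)    # noise

# Remove the slow baseline with a 1-second moving average (a crude
# stand-in for the wavelet-based filtering described in the paper).
kernel = np.ones(fs) / fs
detrended = signal - np.convolve(signal, kernel, mode="same")

# Detect heartbeats: at most one peak per 0.5 s refractory window.
peaks, _ = find_peaks(detrended, distance=int(0.5 * fs),
                      height=0.5 * detrended.max())
bpm = 60 * (len(peaks) - 1) / (t[peaks[-1]] - t[peaks[0]])
print(f"estimated heart rate: {bpm:.1f} bpm")
```

On this clean synthetic input the estimate lands within the roughly 3 bpm error margin the study reports for real data.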
Software scripts for sensor data extraction in Raspberry Pi: user-space and kernel-space comparison
(2024)
This paper compares two popular scripting implementations for hardware prototyping: Python scripts executed from user space and C-based Linux driver processes executed from kernel space, providing information to researchers who have to choose between the two in their implementations. The conclusions show that deploying software scripts in kernel space makes it possible to guarantee a certain quality of sensor information using a Raspberry Pi without the need for an advanced real-time operating system.
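A minimal user-space sketch of the kind of measurement such a comparison rests on: sampling a sensor from a Python loop and recording the scheduling jitter of the sampling period. The read_sensor function is a placeholder for a real user-space read (e.g. via sysfs or I2C), not an interface from the paper.

```python
import time
import statistics

def read_sensor():
    # Placeholder for an actual user-space read, e.g. from /sys or smbus.
    return 0

period = 0.01  # target 100 Hz sampling period
timestamps = []
next_deadline = time.perf_counter()
for _ in range(100):
    read_sensor()
    timestamps.append(time.perf_counter())
    next_deadline += period
    # Deadline-based sleep avoids accumulating drift across iterations.
    time.sleep(max(0.0, next_deadline - time.perf_counter()))

intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
jitter = statistics.pstdev(intervals)
print(f"mean interval: {statistics.mean(intervals) * 1e3:.2f} ms, "
      f"jitter (stddev): {jitter * 1e6:.0f} us")
```

A kernel-space driver would typically show far lower jitter, since it is not subject to user-space scheduling; quantifying that gap is the kind of evidence the paper's comparison is built on.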
For half a decade, business firms have shown an increasing interest in Robotic Process Automation (RPA). Academic literature, however, paid little attention to RPA before adopting the topic to a larger extent. The aim of this study is to review and structure the latest state of scholarly research on RPA. This chapter is based on a systematic literature review that serves as a basis for developing a conceptual framework to structure the field. Our study shows that some areas of RPA, e.g. its potential benefits, have been extensively examined by many authors. Other categories, such as empirical studies on the adoption of RPA or organisational readiness models, have remained research gaps.
Higher education institutions (HEIs) rely heavily on information technology (IT) to create innovations. Therefore, IT governance (ITG) is essential for education activities, particularly during the ongoing COVID-19 pandemic. However, the traditional concept of ITG is not fully equipped to deal with the current changes occurring in the digital age. Today's ITG requires an agile approach that can respond to disruptions in the HEI environment. Consequently, universities increasingly need to adopt agile strategies to ensure superior performance. This research proposes a conceptualization comprising three agile dimensions within the ITG construct: structures, processes, and relational mechanisms. An extensive qualitative evaluation of industry practice uncovered 46 agile governance mechanisms. Moreover, 16 professors rated these elements to assess agile ITG in their HEIs and to determine the most effective ones. This led to the identification of four structure elements, seven processes, and seven relational mechanisms.
Health monitoring in a home environment can have broad use, since it may provide continuous control of health parameters with relatively minor intrusion into everyday life. This work aims to verify whether the subjective questioning typical of some areas of sleep medicine can be replaced by objective measurement using electronic devices. For this purpose, a study was conducted with ten subjects in which relevant sleep parameters were measured both objectively and subjectively. The results of both measurement methods were evaluated and analyzed. They showed that while some measures, such as Total Time in Bed, exhibit high agreement between objective and subjective measurement, others, such as sleep quality, differ significantly. For this reason, a combination of both measurement methods currently appears most beneficial and provides the most detailed results, while a partial replacement by electronic devices can already reduce the number of questions in the subjective measurement.
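One common way to quantify the agreement between objective and subjective measurements of a parameter like Total Time in Bed is the mean bias with Bland-Altman limits of agreement; the sketch below uses invented values for ten subjects, not the study's data.

```python
import numpy as np

# Illustrative Total Time in Bed values (minutes) for ten subjects;
# these numbers are made up, not the study's measurements.
objective = np.array([472, 455, 490, 430, 465, 480, 445, 500, 460, 475])
subjective = np.array([480, 450, 500, 420, 470, 475, 450, 495, 455, 480])

diff = subjective - objective
bias = diff.mean()                 # systematic over- or under-reporting
loa = 1.96 * diff.std(ddof=1)      # 95% limits of agreement (Bland-Altman)
print(f"bias: {bias:+.1f} min, limits of agreement: +/-{loa:.1f} min")
```

A near-zero bias with narrow limits would support replacing the questionnaire item with the electronic measurement, while wide limits (as the study found for sleep quality) argue for keeping both methods.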
Selecting a suitable development method for a specific project context is one of the most challenging activities in process design. To extend the hitherto statistical construction of hybrid development methods, we analyze 829 data points to investigate which context factors influence the choice of methods or practices. Using exploratory factor analysis, we derive five base clusters consisting of up to 10 methods. Logistic regression analysis then reveals which context factors influence the integration of methods from these clusters into the development process. Our results indicate that only a few context factors, including project/product size and target application domain, significantly influence the choice. This summary refers to the paper "Determining Context Factors for Hybrid Development Methods with Trained Models", published in the proceedings of the International Conference on Software and System Process in 2020.
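As an illustration of the statistical machinery named above, the following sketch fits a logistic regression (via plain gradient descent, no external ML library) on synthetic data in which a single context factor, project size, influences whether a method cluster is adopted; the coefficient values are assumptions, not the study's results.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 829  # same number of data points as the study; the content is synthetic
# Context factor: standardized project size; outcome: cluster adopted (0/1).
size = rng.normal(size=n)
p_true = 1 / (1 + np.exp(-(0.5 + 1.2 * size)))   # assumed true effect
adopted = rng.binomial(1, p_true)

# Plain gradient-descent logistic regression.
X = np.column_stack([np.ones(n), size])
w = np.zeros(2)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - adopted) / n   # mean log-loss gradient step

print(f"intercept: {w[0]:.2f}, size coefficient: {w[1]:.2f}")
```

A clearly positive size coefficient is the kind of evidence behind the paper's claim that project/product size significantly influences method choice.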
The livestock sector is growing steadily and is responsible for around 18% of global greenhouse gas emissions, more than the global transport sector (Steinfeld et al. 2006). This paper examines the potential of social marketing to reduce meat consumption. The aim is to understand consumers' motivation in diet choices and to learn what opportunities social marketing provides to counteract negative environmental and health trends. The authors believe that research to answer this question should start in metropolitan areas, because measures should be especially effective there. Based on the Theory of Planned Behaviour (TPB, Ajzen 1991) and the Technology Acceptance Model by Huijts et al. (2012), an online study with participants from the metropolitan region (n = 708) was conducted in which central socio-psychological constructs for a reduction of meat consumption were examined. It was shown that attitude, personal norm and habit have a critical influence on the intention to reduce meat consumption. A segmentation of consumers based on these factors led to three consumer clusters: vegetarians/flexitarians, potential flexitarians and convinced meat eaters. Potential flexitarians are an especially relevant target group for the development of social marketing measures to reduce meat consumption. In co-creation workshops with potential flexitarians from the metropolitan region, barriers and benefits of reducing meat consumption were identified. Environmental protection, animal welfare and desire for variety turn out to be the most relevant motivational factors. Based on these factors, consumers proposed a variety of social marketing measures, such as applications and labels informing about the environmental impact of meat products.
Due to digitalization, constant technological progress and ever shorter product life cycles, enterprises are currently facing major challenges. In order to succeed in the market, business models have to be adapted more often and more quickly to changing market conditions than they used to be. Fast adaptability, also called agility, is a decisive competitive factor in today’s world. Because of the ever-growing IT part of products and the fact that they are manufactured using IT, changing the business model has a major impact on the enterprise architecture (EA). However, developing EAs is a very complex task, because many stakeholders with conflicting interests are involved in the decision-making process. Therefore, a lot of collaboration is required. To support organizations in developing their EA, this article introduces a novel integrative method that systematically integrates stakeholder interests into decision-making activities. By using the method, collaboration between stakeholders involved is improved by identifying points of contact between them. Furthermore, standardized activities make decision-making more transparent and comparable without limiting creativity.
Enterprises are currently transforming their strategy, processes, and information systems to extend their degree of digitalization. The potential of the Internet and related digital technologies, like the Internet of Things, services computing, cloud computing, artificial intelligence, big data with analytics, mobile systems, collaboration networks, and cyber-physical systems, both drives and enables new business designs. Digitalization deeply disrupts existing businesses, technologies and economies and fosters the architecture of digital environments with many rather small and distributed structures. This has a strong impact on new value-producing opportunities and on architecting digital services and products, guiding their design through a Service-Dominant Logic. The main result of the book chapter extends methods for integral digital strategies with value-oriented models for digital products and services, defined in the framework of a multi-perspective digital enterprise architecture reference model.
This chapter presents an introduction to the emerging trends for architecting the digital transformation having a strong focus on digital products, intelligent services, and related systems together with methods, models and architectures. The primary aim of this book is to highlight some of the most recent research results in the field. We are providing a focused set of brief descriptions of the chapters included in the book.
The digital transformation is today's dominant business transformation, strongly influencing how digital services and products are designed in a service-dominant way. A popular underlying theory of value creation and economic exchange known as the service-dominant (S-D) logic can be connected to many successful digital business models. However, S-D logic by itself is abstract; companies cannot easily use it directly as an instrument for business model innovation and design. To address this, a comprehensive ideation method based on S-D logic is proposed, called service-dominant design (SDD). SDD is aimed at supporting firms in the transition to a service- and value-oriented perspective. The method provides a simplified way to structure the ideation process based on four model components. Each component consists of practical implications, auxiliary questions and visualization techniques that were derived from a literature review, a use case evaluation of digital mobility and a focus group discussion. SDD represents a first step towards a toolset that can support established companies in the process of service- and value-orientation as part of their digital transformation efforts.
Formula One races provide a wealth of data worth investigating. Although the time-varying data has a clear structure, it is pretty challenging to analyze it for further properties. Here the focus is on a visual classification for events, drivers, as well as time periods. As a first step, the Formula One data is visually encoded based on a line plot visual metaphor reflecting the dynamic lap times, and finally, a classification of the races based on the visual outcomes gained from these line plots is presented. The visualization tool is web-based and provides several interactively linked views on the data; however, it starts with a calendar-based overview representation. To illustrate the usefulness of the approach, the provided Formula One data from several years is visually explored while the races took place in different locations. The chapter discusses algorithmic, visual, and perceptual limitations that might occur during the visual classification of time-series data such as Formula One races.
The Internet of Things (IoT) provides a strong platform for connecting objects, devices, and people to the Internet to exchange and share information with each other. IoT is growing rapidly and is expected to be adopted in disciplines such as manufacturing, agriculture, healthcare, and robotics. Furthermore, a new IoT concept has been proposed especially for the robotics area: the Internet of Robotics Things (IoRT). IoRT is a mixed structure of diverse technologies such as cloud computing, artificial intelligence, and machine learning. However, to promote and realize IoRT, digitization and digital transformation must proceed and be implemented in the robotics enterprise. In this paper, we propose an architecture framework for IoRT-based digital platforms and verify it using a planned case in a global robotics enterprise. The associated challenges and future research directions in this field are also presented.
A transaction is a demarcated sequence of application operations, for which the following properties are guaranteed by the underlying transaction processing system (TPS): atomicity, consistency, isolation, and durability (ACID). Transactions are therefore a general abstraction, provided by TPS that simplifies application development by relieving transactional applications from the burden of concurrency and failure handling. Apart from the ACID properties, a TPS must guarantee high and robust performance (high transactional throughput and low response times), high reliability (no data loss, ability to recover last consistent state, fault tolerance), and high availability (infrequent outages, short recovery times).
The architectures and workhorse algorithms of a high-performance TPS are built around the properties of the underlying hardware. The introduction of nonvolatile memories (NVM) as novel storage technology opens an entire new problem space, with the need to revise aspects such as the virtual memory hierarchy, storage management and data placement, access paths, and indexing. NVM are also referred to as storage-class memory (SCM).
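Atomicity, the first of the ACID properties listed above, can be demonstrated with SQLite from the Python standard library: when one operation in the demarcated sequence fails, the TPS undoes the whole transaction. The account schema is invented for the example.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE account (id INTEGER PRIMARY KEY, "
            "balance INTEGER CHECK (balance >= 0))")
con.execute("INSERT INTO account VALUES (1, 100), (2, 50)")
con.commit()

# Transfer more than account 1 holds: the second UPDATE violates the
# CHECK constraint, so the whole transfer must be undone (atomicity).
try:
    with con:  # commits on success, rolls back on error
        con.execute("UPDATE account SET balance = balance + 200 WHERE id = 2")
        con.execute("UPDATE account SET balance = balance - 200 WHERE id = 1")
except sqlite3.IntegrityError:
    pass

balances = dict(con.execute("SELECT id, balance FROM account"))
print(balances)
```

After the failed transfer both rows are exactly as before: the application never observes the half-applied state, which is precisely the burden the TPS lifts from application code.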
Active storage
(2018)
In brief, Active Storage refers to an architectural hardware and software paradigm based on collocating storage and compute units. Ideally, it allows application-defined data ... to be executed within the physical data storage. Active Storage thus seeks to minimize expensive data movement, improving performance, scalability, and resource efficiency. The effective use of Active Storage mandates new architectures, algorithms, interfaces, and development toolchains.
Decentralized energy systems are characterized by ad hoc planning. The missing integration of energy objectives into business strategy creates difficulties, resulting in inefficient energy architectures and decisions. Practice-proven methods such as the balanced scorecard, enterprise architecture management and the value network approach support the transformation path towards an effective decentralized system. The methods are evaluated based on a case study. Managing multi-dimensionality, high complexity and multiple actors are the main drivers for an effective and efficient energy management system. The underlying basis for gaining the positive impacts of these methods on decentralized corporate energy systems is the digitization of energy data and processes.
Besides optimising the car itself, energy efficiency and safety can also be increased by optimising driving behaviour. Based on this fact, a driving system is in development whose goal is to educate the driver in energy-efficient and safe driving. It monitors the driver, the car and the environment and gives recommendations relevant to energy efficiency and safety. However, the driving system tries not to distract or bother the driver with recommendations, for example during stressful driving situations or when the driver is not interested in a recommendation. It therefore monitors the driver's stress level as well as the driver's reaction to a given recommendation and decides whether to give a recommendation or not. This makes it possible to suppress recommendations when needed and thus to increase road safety and the user acceptance of the driving system.
A lot of people need help in their daily life to wash, select and manage their clothing. The goal of this work is to design an assistant system (eKlarA) to support the user by giving recommendations to choose the clothing combinations, to find the clothing and to wash the clothing. The idea behind eKlarA is to generate a system that uses sensors to identify the clothing and their state in the clothing cycle. The clothing cycle consists of the stations: closets, laundry basket and washing machine in one or several places. The system uses the information about the clothing, weather and calendar to support the user in the different steps of the clothing cycle. The first prototype of this system has been developed and tested. The test results are presented in this work.
Stress is recognized as a predominant disease with rising costs for rehabilitation and treatment. Currently there are several different approaches for determining and calculating stress levels. Usually the methods for determining stress are divided into two categories. The first category does not require any special equipment for measuring stress; it uses the variation in behaviour patterns that occurs under stress. Its core disadvantage is the limitation to specific use cases. The second category uses laboratory instruments and biological sensors. It allows stress to be measured precisely and proficiently, but such setups are neither mobile nor transportable and do not support real-time feedback. This work presents a mobile system that provides the calculation of stress. To achieve this, the data of a mobile ECG sensor is analysed, processed and visualised on a mobile device such as a smartphone. This work also explains the stress measurement algorithm used. The result is a portable system that can use a mobile device such as a smartphone as the visual interface for reporting the current stress level.
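The paper's own stress algorithm is not reproduced here; as a stand-in, the sketch below computes RMSSD, a standard heart-rate-variability feature derivable from ECG RR intervals, where lower variability is commonly associated with higher stress. The RR values are invented.

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive RR-interval differences (ms).

    Lower RMSSD (reduced heart-rate variability) is commonly
    associated with higher stress levels."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

relaxed = [812, 790, 835, 805, 842, 798, 820]   # varied RR intervals
stressed = [655, 650, 658, 652, 654, 651, 656]  # rigid, fast rhythm
print(f"RMSSD relaxed:  {rmssd(relaxed):.1f} ms")
print(f"RMSSD stressed: {rmssd(stressed):.1f} ms")
```

A feature of this kind is cheap enough to compute on a smartphone, which is what makes the kind of portable real-time feedback described above feasible.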
Stress is becoming an important topic in modern life. Its influence results in a higher rate of health disorders such as burnout, heart problems, obesity, asthma, diabetes, depression and many others. Furthermore, an individual's behavior and capabilities can be directly affected, leading to altered cognition and impaired decision-making and problem-solving skills. In a dynamic and unpredictable environment, such as the automotive one, this can result in a higher risk of accidents. Several papers have addressed the estimation and prediction of drivers' stress levels during driving. Another important question concerns not only the stress level of the driver himself, but also the influence on and of a group of other drivers in the near area. This paper proposes a system that determines a group of drivers in a near area as clusters and derives their individual stress levels. This information is analyzed to generate a stress map, a graphical view of road sections with a higher stress influence. The aggregated data can be used to generate navigation routes with a lower stress influence, decreasing stress-influenced driving and improving road safety.
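The stress-map aggregation described above can be sketched as grid binning: individual stress readings are snapped to road-section-sized cells and averaged per cell. The coordinates, cell size, and stress values below are invented for illustration.

```python
from collections import defaultdict

# (latitude, longitude, stress level 0..1) readings from nearby drivers;
# the coordinates and values are invented for this sketch.
readings = [
    (48.9012, 9.1951, 0.8), (48.9014, 9.1949, 0.7),
    (48.9013, 9.1953, 0.9), (48.8950, 9.2100, 0.2),
    (48.8952, 9.2101, 0.3),
]

def cell(lat, lon, size=0.001):
    """Snap a coordinate onto a grid cell roughly a road section in size."""
    return (round(lat / size), round(lon / size))

cells = defaultdict(list)
for lat, lon, stress in readings:
    cells[cell(lat, lon)].append(stress)

# The stress map: average stress level per road-section cell.
stress_map = {c: sum(v) / len(v) for c, v in cells.items()}
for c, level in sorted(stress_map.items()):
    print(c, f"avg stress {level:.2f}")
```

A router could then penalize high-average cells when computing navigation routes, which is the low-stress routing idea the abstract closes with.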
The troubles began when Tom, the business analyst, asked the customer what he wants. The customer came up with good ideas for software features. Tom created a brilliant roadmap and defined the requirements for a new software product. Mary, the development team leader, was already eager to start developing and happy when she got the requirements. She and her team went ahead and created the software right away. Afterwards, Paul tested the software against the requirements. As soon as the software fulfilled the requirements, Linda, the product manager, deployed it to the customer. The customer did not like the software and ignored it. Ringo, the head of software development, was fired. How come? Nowadays, we have tremendous capabilities for creating nearly all kinds of software to fulfill the needs of customers. We can apply agile practices for reacting flexibly to changing requirements, we can use distributed development, open source, or other means for creating software at low cost, we can use cloud technologies for deploying software rapidly, and we can get enormous amounts of data showing us how customers actually use software products. However, the sad reality is that around 90% of products fail, and more than 60% of the features of a typical software product are rarely or never used. But there is a silver lining – an insight regarding successful features: Around 60% of the successes stem from a significant change of an initial idea. This gives us a hint on how to build the right software for users and customers.
The digital transformation of our society changes the way we live, work, learn, communicate, and collaborate. The digitization of software-intensive products and services is enabled basically by four megatrends: cloud computing, big data, mobile systems, and social technologies. This disruptive change interacts with all information processes and systems that are important business enablers for the current digital transformation. The Internet of Things, social collaboration systems for adaptive case management, and mobility systems and services for big data in cloud services environments are emerging to support intelligent user-centered and social community systems. Modern enterprises see themselves confronted with an ever-growing design space for engineering the business models of the future as well as their IT support. Decision analytics in this field becomes increasingly complex, and decision support, particularly for the development and evolution of sustainable enterprise architectures (EA), is duly needed. With the advent of intelligent user-centered and social community systems, the challenging decision processes can be supported in more flexible and intuitive ways. Tapping into these systems and techniques, the engineers and managers of the enterprise architecture become part of a viable enterprise, i.e. a resilient and continuously evolving system that develops innovative business models.
The evolution of Services Oriented Architectures (SOA) presents many challenges due to their complex, dynamic and heterogeneous nature. We describe how SOA design principles can facilitate SOA evolvability and examine several approaches to support SOA evolution. SOA evolution approaches can be classified based on the level of granularity they address, namely, service code level, service interaction level and model level. We also discuss emerging trends, such as microservices and knowledge-based support, which can enhance the evolution of future SOA systems.
Rapidly growing data volumes push today's analytical systems close to the feasible processing limit. Massive parallelism is one possible way to reduce the computational time of analytical algorithms. However, data transfer becomes a significant bottleneck, since it ties up system resources moving data to code. Technological advances allow compute units to be placed economically close to storage and data processing operations to be performed close to the data, minimizing data transfers and increasing scalability. Hence the principle of Near Data Processing (NDP) and the shift towards code-to-data. In the present paper we argue that the development of NDP system architectures will become an inevitable task in the future. Analytical DBMS like HPE Vertica have multiple points of impact with major advantages, which are presented within this paper.
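The data-to-code versus code-to-data contrast can be made concrete with a toy predicate-pushdown sketch, using the number of rows crossing the storage/compute boundary as a proxy for transfer cost; the class and numbers are invented for illustration, not an NDP API.

```python
class Storage:
    def __init__(self, rows):
        self.rows = rows
        self.rows_shipped = 0  # proxy for data-transfer cost

    def scan_all(self):
        # data-to-code: every row crosses the storage/compute boundary
        self.rows_shipped += len(self.rows)
        return list(self.rows)

    def scan_where(self, predicate):
        # code-to-data (NDP): the predicate runs next to the data;
        # only qualifying rows are moved.
        result = [r for r in self.rows if predicate(r)]
        self.rows_shipped += len(result)
        return result

storage = Storage(rows=list(range(1_000_000)))
hot = [r for r in storage.scan_all() if r % 1000 == 0]    # ships every row
before = storage.rows_shipped
hot_ndp = storage.scan_where(lambda r: r % 1000 == 0)     # ships matches only
print(f"data-to-code shipped {before} rows, "
      f"code-to-data shipped {storage.rows_shipped - before} rows")
```

Both scans return the same result, but the pushdown variant moves three orders of magnitude less data, which is the scalability argument behind NDP.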
Many organizations have identified the opportunity of big data analytics to support the business with problem-specific insights through the exploitation of generated data. Socio-technical solutions are developed in big data projects to reach competitive advantage. Although these projects are aligned to specific business needs, common architectural challenges are not addressed in a comprehensive manner. Enterprise architecture (EA) management is a holistic approach to tackling the complex business and IT architecture. The transformation of an organization's EA is influenced by big data projects and their data-driven approach on all layers. To enable strategy-oriented development of the EA, it is essential to synchronize these projects supported by EA management. In this paper, we conduct a systematic review of the big data literature to analyze which requirements are proposed for the EA management discipline. Thereby, a broad overview of existing research is presented to facilitate a more detailed exploration and to foster the evolution of the EA management discipline.
Nowadays almost every major company has a monitoring system and produces log data to analyse its systems. To perform analyses on the log data and to extract experience for future decisions, it is important to transform and synchronize different time series. Several methods for synchronizing multiple time series are provided, leading to a synchronized uniform time series. This is achieved using discretisation and approximation methods. Furthermore, discretisation through ticks is demonstrated, together with the respective illustrated results.
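One way to realize the discretisation and approximation steps described above is to resample each irregular series onto a shared uniform grid of ticks by linear interpolation; the timestamps and values below are invented for illustration.

```python
import numpy as np

# Two log-data time series with irregular, non-matching timestamps
# (values invented for this sketch).
t_a = np.array([0.0, 0.9, 2.1, 3.0, 4.2])
v_a = np.array([1.0, 1.2, 1.1, 1.4, 1.3])
t_b = np.array([0.3, 1.5, 2.8, 3.9])
v_b = np.array([10.0, 11.0, 10.5, 12.0])

# Discretisation: a shared uniform grid of ticks covering both series.
start = max(t_a[0], t_b[0])
end = min(t_a[-1], t_b[-1])
ticks = np.arange(start, end, 0.5)

# Approximation: linear interpolation of each series onto the shared
# ticks yields one synchronized, uniform multivariate time series.
a_sync = np.interp(ticks, t_a, v_a)
b_sync = np.interp(ticks, t_b, v_b)
print(np.column_stack([ticks, a_sync, b_sync]))
```

After this step both series share one timestamp column, so row-wise analyses (correlation, joins, anomaly detection) become straightforward.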
Reality mining refers to an application of data mining that uses sensor data to derive behavioral patterns in the real world. However, research in this field started a decade ago, when technology was far behind today's state of the art. This paper discusses which requirements are now posed to applications in the context of reality mining. A survey has shown which sensors are available in state-of-the-art smartphones and usable for gathering reality mining data. As another contribution, a reality mining application architecture is proposed to facilitate the implementation of such applications. A proof of concept verifies the assumptions made on reality mining and the presented architecture.
Digital companies need information systems to implement their business processes end-to-end. BPM systems are promising candidates for this, because they are highly adaptable due to their business process model-driven operation mode. End-to-end processes contain different types of sub-processes that are either procedural, data-driven or business rule-based. Modern BPM systems support modeling notations for all these types of sub-processes. Moreover, end-to-end processes contain parts of shadow processing, and consequently these must be supported in a performant way, too. BPMN seems to be the adequate notation for modeling these parts due to its procedural nature. Further, BPMN provides several elements that enable the modeling of parallel executions, which are very interesting for accelerating the shadow processing parts of a process. The present paper examines the limitations and potentials of BPM systems for a high-performance execution of BPMN models representing shadow processing parts of a business process.
Converting users into customers: the role of user profile information and customer journey analysis
(2016)
Due to the digital transformation, the importance of web analysis and user profiling for enterprises is increasing rapidly, as customers focus on digital channels to obtain information about products and brands. While a lot of research exists on these topics, only a minority of firms use it to their advantage. This study aims to tighten the link between research and business such that experimental methods can be used to improve communication strategies in practice. Therefore, a systematic literature analysis is conducted, workshops are observed and documented, and an empirical study is used to integrate the single steps into a framework for the practical usage of user profiling and customer journey analysis.
The acquisition of data for reality mining applications is a critical factor, since many mobile devices, e.g. smartphones, must be capable of capturing the required data; otherwise, only a small target group would be able to use the reality mining application. In the course of a survey, we have identified smartphone features which might be relevant for various reality mining applications. The survey classifies these features and shows how the support of each feature has changed over the years by analyzing 143 smartphones released between 2004 and 2015. All analyzed devices can be ranked by the number of features they provide. Furthermore, this paper deals with quality issues which occurred while carrying out the survey.
The Internet of Things (IoT) refers to the interconnectedness of physical objects, and works by equipping the latter with sensors and actuators as a means to connect to the internet. The number of connected things has increased threefold over the past five years. Consequently, firms expect the IoT to become a source of new business models driven by technology. However, only a few early adopters have started to install and use IoT appliances on a frequent basis, so it is still unclear which factors drive the technological acceptance of IoT appliances. Confronting this gap in current research, the present paper explores how IoT appliances are conceptually defined, which factors drive their technological acceptance, and how firms can use the results to improve value propositions in corresponding business models. It is discovered that IoT appliance vendors need to support a broad focus, as the potential buyers are highly varied. As conclusions from this insight, the paper illustrates some flexible marketing strategies.
The question of why individuals adopt information technology has been present in information systems research for the past quarter century. One of the models most used for predicting technology usage was introduced by Fred Davis: the Technology Acceptance Model (TAM). It describes the influence of perceived usefulness and perceived ease of use on attitude, behavioral intention and system usage. The first two factors are in turn influenced by external variables. Although a plethora of papers exists about the TAM, an extensive analysis of the role of the external variables in the model is still missing. This paper aims to give an overview of the most important variables. In an extensive literature review, we identified 763 relevant papers, found 552 unique single external variables, characterized the most important of them, and described the frequency of their appearance. Additionally, we grouped these variables into four categories (organizational characteristics, system characteristics, user personal characteristics, and other variables). Afterwards we discuss the results and show implications for theory and practice.
A configuration-management-database driven approach for fabric-process specification and automation
(2014)
In this paper, we describe an approach that integrates a Configuration-Management-Database into fabric-process specification and automation in order to take different conditions regarding cloud services into account. By implementing our approach, the complexity of fabric processes is reduced. Using formal prototyping principles as our research method, we developed a prototype that integrates the Configuration-Management-Database Command into the Workflow-Management-System Activiti, and we used this prototype to evaluate our approach. We implemented three different fabric processes and show that our approach reduces their complexity.
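The core idea described above, moving service-specific conditions out of the fabric process and into CMDB lookups, can be illustrated with a minimal sketch. The configuration items, attribute names, and step logic below are invented for illustration; the actual work uses the Command CMDB and Activiti.

```python
# Sketch: a fabric-process step queries a CMDB for the configuration
# item (CI) of the target cloud service and branches on its attributes,
# instead of hard-coding per-service conditions in the process model.
# CMDB contents and attribute names are assumptions for illustration.

CMDB = {
    "vm-42": {"type": "virtual-machine", "hypervisor": "kvm", "state": "running"},
    "db-7":  {"type": "database", "engine": "postgres", "state": "stopped"},
}

def provision_step(ci_name):
    """One CMDB-driven fabric-process step."""
    ci = CMDB.get(ci_name)
    if ci is None:
        return "error: unknown configuration item"
    if ci["state"] == "running":
        return f"skip: {ci_name} already running"
    return f"start {ci['type']} {ci_name}"
```

Because the branching data lives in the CMDB, the process definition itself needs only a single generic step per action, which is one way such an approach can reduce process complexity.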
The use of Wireless Sensor and Actuator Networks (WSAN) as an enabling technology for Cyber-Physical Systems has increased significantly in the recent past. The challenges that arise in the different application areas of Cyber-Physical Systems in general, and of WSAN in particular, are receiving attention from both academia and industry. Since the reliability of message delivery over wireless communication is of critical importance for certain safety-related applications, it is one of the areas that has received significant focus in the research community. Additionally, the diverse needs of different applications place different demands on the lower layers of the protocol stack, necessitating mechanisms in those layers that enable them to adapt dynamically. Another major issue in the realization of networked, wirelessly communicating cyber-physical systems in general, and WSAN in particular, is the lack of approaches that tackle the reliability, configurability, and application-awareness issues together. One could consider tackling these issues in isolation; however, their interplay creates challenges that force application developers to spend more time meeting them, often in suboptimal ways, than on solving the problems of the application being developed. Starting from fundamental concepts and general issues in cyber-physical systems, this chapter discusses energy efficiency, application awareness, and channel awareness for networked, wirelessly communicating cyber-physical systems. Additionally, the chapter describes a middleware approach called CEACH, an acronym for Configurable, Energy-efficient, Application- and Channel-aware clustering-based middleware service for cyber-physical systems.
The chapter describes the state of the art in the area of cyber-physical systems, with a special focus on communication reliability, configurability, and application and channel awareness, and shows how these features have been considered in the CEACH approach. Important node-level and network-level characteristics and their significance for the design of applications for cyber-physical systems are discussed, as is the issue of adaptively controlling the impact of these factors with respect to application demands and network conditions. The chapter also describes Fuzzy-CEACH, an extension of the CEACH middleware service that uses fuzzy logic principles, including the fuzzy descriptors used in the different stages of Fuzzy-CEACH. The fuzzy inference engine used in the Fuzzy-CEACH cluster-head election process is described in detail, and the rule bases used by the inference engine in the different stages of Fuzzy-CEACH are included to give an insightful description of the protocol. The chapter further discusses in detail the experimental results validating the concepts presented in the CEACH approach, as well as the applicability of the CEACH middleware service in different application scenarios in the domain of cyber-physical systems. It concludes by shedding light on publish-subscribe mechanisms in distributed event-based systems, showing how, if the WSAN is modeled as a distributed event-based system, they can use the CEACH middleware to reliably communicate detected events to the event consumers or actuators.
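A fuzzy cluster-head election step of the kind the abstract attributes to Fuzzy-CEACH can be sketched in a few lines. The input variables (residual energy and node degree), the membership functions, and the rule base below are assumptions chosen for illustration; the chapter defines its own fuzzy descriptors and rule bases.

```python
# Illustrative Mamdani-style inference for cluster-head election:
# fuzzify the inputs, fire a small rule base, and defuzzify with a
# weighted average of output singletons. Inputs are on a 0..1 scale.
# Variables and rules are assumptions, not the Fuzzy-CEACH originals.

def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def cluster_head_chance(energy, degree):
    """Return a 0..1 'chance' score for a node becoming cluster head."""
    e = {"low":  tri(energy, -0.5, 0.0, 0.5),
         "med":  tri(energy,  0.0, 0.5, 1.0),
         "high": tri(energy,  0.5, 1.0, 1.5)}
    d = {"low":  tri(degree, -0.5, 0.0, 0.5),
         "high": tri(degree,  0.0, 1.0, 2.0)}
    # Rule base (assumed): each rule is (firing strength, output singleton).
    rules = [
        (min(e["high"], d["high"]), 0.9),  # strong candidate
        (min(e["med"],  d["high"]), 0.6),
        (min(e["med"],  d["low"]),  0.4),
        (e["low"],                  0.1),  # spare nodes with little energy
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

The appeal of this style of election is that trade-offs such as "well connected but nearly drained" are expressed declaratively in the rule base rather than as hand-tuned thresholds.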
There are several intra-operative use cases that require the surgeon to interact with medical devices. We used the Leap Motion Controller as an input device and implemented two use cases: 2D interaction (e.g., advancing EPR data) and selection of a value (e.g., room illumination brightness). The gesture detection was successful, and we mapped its output to several devices and systems.
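The mapping from recognized gestures to the two described use cases can be sketched as a small dispatcher. The gesture names, the paging/brightness state, and the callback shape below are assumptions for illustration; the actual system consumes the Leap Motion Controller's recognizer output.

```python
# Sketch: dispatch recognized gestures to the two use cases described
# above. Swipes page through EPR data (2D interaction); a circle
# gesture adjusts a continuous value such as illumination brightness.
# Gesture names and value ranges are assumptions for illustration.

class GestureMapper:
    def __init__(self):
        self.epr_page = 0       # current EPR record page
        self.brightness = 50    # room illumination brightness, 0..100

    def on_gesture(self, gesture, amount=0):
        if gesture == "swipe_left":
            self.epr_page = max(0, self.epr_page - 1)
        elif gesture == "swipe_right":
            self.epr_page += 1
        elif gesture == "circle":
            # Circle direction/angle maps to a signed value change.
            self.brightness = min(100, max(0, self.brightness + amount))

mapper = GestureMapper()
mapper.on_gesture("swipe_right")          # advance one EPR page
mapper.on_gesture("circle", amount=20)    # raise brightness
```

Keeping the mapping in one place like this is what makes it easy to route the same gesture output to several devices and systems.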