Informatik
Functionally impaired people have problems choosing and finding the right clothing, so they need help in their daily life to wash and manage their clothing. The goal of this work is to support the user with recommendations on which clothing to choose, where to find it, and how to wash it. The idea behind eKlarA is to create a gateway-based system that uses sensors to identify the clothing and its state in the clothing cycle. The clothing cycle consists of one or more closets, laundry baskets and washing machines in one or several places. The gateway uses information about the clothing, the weather and the calendar to support the user in the different steps of the clothing cycle. This gives functionally impaired people more freedom in their daily life.
Entrepreneurs and small and medium enterprises usually have trouble developing new prototypes, generating new ideas or testing new techniques. To help them, academic Software Factories, a new concept of collaboration between universities and companies, have been developed in recent years. Software Factories provide a unique environment for students and companies. Students benefit from the possibility of working in a real work environment, learning how to apply the state of the art of existing techniques and showing their skills to entrepreneurs. Companies benefit from a risk-free, protected environment in which they can develop new ideas. Universities, finally, benefit from this setup as a perfect environment for empirical studies in an industry-like setting. In this paper, we present the network of academic Software Factories in Europe, showing how companies have already benefited from existing Software Factories and reporting success stories. The results of this paper can help grow the network of factories and help other universities and companies set up similar environments to boost the local economy.
Autonomous navigation is one of the main areas of research in mobile robots and intelligent connected vehicles. In this context, we are interested in presenting a general view of robotics, the progress of research, and advanced methods related to this field to improve autonomous robots’ localization. We seek to evaluate algorithms and techniques that give robots the ability to move safely and autonomously in a complex and dynamic environment. Under these constraints, we focused our work on a specific problem: evaluating a simple, fast and lightweight SLAM algorithm that can minimize localization errors. We present and validate a FastSLAM 2.0 system combining scan matching and loop closure detection. To allow the robot to perceive the environment and detect objects, we studied one of the best-performing deep learning techniques, convolutional neural networks (CNNs). We validated our tests using the YOLOv3 algorithm.
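As a point of reference for the filtering part described above, the following is a minimal sketch of the predict/weight/resample cycle at the core of FastSLAM-style particle filters. The noise parameters and the scan_match_likelihood callable are illustrative assumptions, not the system evaluated in the paper.

```python
import numpy as np

def systematic_resample(poses, weights, rng):
    """Systematic resampling: draw particles proportional to their weights."""
    w = weights / weights.sum()
    positions = (rng.random() + np.arange(len(w))) / len(w)
    idx = np.searchsorted(np.cumsum(w), positions)
    return poses[idx], np.full(len(w), 1.0 / len(w))

def fastslam_step(poses, weights, odometry, scan_match_likelihood, rng):
    """One predict/weight/resample cycle of a FastSLAM-style particle filter.

    poses:    (N, 3) array of [x, y, theta] hypotheses
    odometry: (dx, dy, dtheta) motion increment since the last step
    scan_match_likelihood: callable(pose) -> likelihood of the current scan
        given the particle's map (hypothetical stand-in for the scan matcher
        used in FastSLAM 2.0)
    """
    # Predict: apply the odometry with per-particle Gaussian noise.
    poses = poses + np.asarray(odometry) + rng.normal(scale=[0.05, 0.05, 0.01], size=poses.shape)
    # Correct: reweight each particle by its scan-match likelihood.
    weights = np.clip(weights * np.array([scan_match_likelihood(p) for p in poses]), 1e-12, None)
    # Resample when the effective sample size drops below half the particle count.
    if 1.0 / np.sum((weights / weights.sum()) ** 2) < 0.5 * len(weights):
        poses, weights = systematic_resample(poses, weights, rng)
    return poses, weights
```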
Software and system development is complex and diverse, and a multitude of development approaches are used and combined with each other to address the manifold challenges companies face today. To study the current state of practice and to build a sound understanding of the utility of different development approaches and their application to modern software system development, we launched the HELENA initiative in 2016. This paper introduces the 2nd HELENA workshop and provides an overview of the current project state. In the workshop, six teams present initial findings from their regions, impulse talks are given, and further steps of the HELENA roadmap are discussed.
Many companies have recently been looking into the use of social media for internal communication and collaboration. So-called Enterprise Social Networks offer integrated platforms with profiles, blogs, group and comment functions for company-internal use. Very often, substantial investments are involved. The budgets are spent primarily on IT, while "soft factors" are frequently left out. This is a serious mistake, as current market studies show. Many of these ambitious projects are therefore at risk of failure.
The digitization of factories will be a significant issue for the 2020s. New scenarios are emerging to increase the efficiency of production lines inside the factory, based on a new generation of robots’ collaborative functions. Manufacturers are moving towards data-driven ecosystems by leveraging product lifecycle data from connected goods. Energy-efficient communication schemes, as well as scalable data analytics, will support these various data collection scenarios. With augmented reality, new remote services are emerging that facilitate the efficient sharing of knowledge in the factory. Future communication solutions should ensure transparent, real-time, and secure connectivity between the various production sites spread worldwide and the new players in the value chain (e.g., suppliers, logistics). Industry 4.0 brings more intelligence and flexibility to production, resulting in more lightweight equipment and thus better ergonomics. 5G will guarantee real-time transmissions with latencies of less than 1 ms. This will provide manufacturers with new possibilities to collect data and trigger actions automatically.
A behavior marker for measuring non-technical skills of software professionals : an empirical study
(2015)
Managers recognize that software development teams need to be developed. Although technical skills are necessary, non-technical (NT) skills are equally, if not more, necessary for project success. Currently, there are no proven tools to measure the NT skills of software developers or software development teams. Behavioral markers (observable behaviors that have positive or negative impacts on individual or team performance) are successfully used by the airline and medical industries to measure NT skill performance. This research developed and validated a behavior marker tool that was used to rate video clips of software development teams. The initial results show that the behavior marker tool can be reliably used with minimal training.
Managers recognize that software development project teams need to be developed and guided. Although technical skills are necessary, non-technical (NT) skills are equally, if not more, necessary for project success. Currently, there are no proven tools to measure the NT skills of software developers or software development teams. Behavioral markers (observable behaviors that have positive or negative impacts on individual or team performance) are beginning to be successfully used by airline and medical industries to measure NT skill performance. The purpose of this research is to develop and validate the behavior marker system tool that can be used by different managers or coaches to measure the NT skills of software development individuals and teams. This paper presents an empirical study conducted at the Software Factory where users of the behavior marker tool rated video clips of software development teams. The initial results show that the behavior marker tool can be reliably used with minimal training.
How can the skin be protected from getting sunburnt? The sun can damage the skin, e.g. by causing skin cancer, but it also has positive effects on humans. Time in the sun and its intensity are the key values that separate enjoying a sunbath from harming the skin. A smart device such as a UV flower can help you enjoy the sunbath. It measures the UV index around you and sends this information to a smartphone app. The development steps of such a device are described in this paper. The UV flower is made of textile fabrics.
Telemedicine is becoming an increasingly important approach to diagnosing, treating or preventing diseases. However, the use of information and communication technologies in healthcare results in a considerable amount of data that must be transmitted efficiently and securely. Many manufacturers provide telemedicine platforms without regard for interoperability, mobility and collaboration. This paper describes a collaborative mobile telemonitoring platform that can use the IEEE 11073 and HL7 communication standards or adapt proprietary protocols. The proposed platform also covers security and modularity aspects. Furthermore, this work introduces an Android-based prototype implementation.
The main aim of the research presented in this manuscript is to compare the results of objective and subjective measurement of sleep quality for older adults (65+) in the home environment. A total of 73 nights were evaluated in this study. A device placed under the mattress was used to obtain objective measurement data, and a common question on perceived sleep quality was asked to collect the subjective sleep quality level. The results confirm the correlation between objective and subjective measurement of sleep quality, with an average standard deviation of 2 out of 10 possible quality points.
Assistant platforms are becoming a key element of the business model of many companies. They have evolved from assistance systems that provide support when using information (or other) systems into platforms in their own right. Alexa, Cortana or Siri may be used with literally thousands of services. Against this background, this paper develops the notion of assistant platforms and elaborates a conceptual model that supports businesses in developing appropriate strategies. The model consists of three main building blocks: an architecture that depicts the components as well as the possible layers of an assistant platform, the mechanism that determines value creation on assistant platforms, and the ecosystem with its network effects, which emerge from the multi-sided nature of assistant platforms. The model has been derived from a literature review and is illustrated with examples of existing assistant platforms. Its main purpose is to advance the understanding of assistant platforms and to trigger future research.
A configuration-management-database driven approach for fabric-process specification and automation
(2014)
In this paper we describe an approach that integrates a Configuration Management Database into fabric-process specification and automation in order to consider different conditions regarding cloud services. By implementing our approach, the complexity of fabric processes is reduced. We developed a prototype using formal prototyping principles as the research method and integrated the Configuration Management Database Command into the workflow management system Activiti. We used this prototype to evaluate our approach: we implemented three different fabric processes and show that our approach reduces their complexity.
Many modern DBMS architectures require transferring data from storage in order to process it afterwards. Given the continuously increasing amounts of data, data transfers quickly become a scalability-limiting factor. Near-Data Processing and smart/computational storage are emerging as promising trends that allow for decoupled in-situ operation execution, data transfer reduction and better bandwidth utilization. However, not every operation is suitable for in-situ execution, and careful placement and optimization are needed.
In this paper we present an NDP-aware cost model. It has been implemented in MySQL and evaluated with nKV. We make several observations underscoring the need for optimization.
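The abstract does not state the cost model itself; the following is a hedged toy sketch of what an NDP-aware placement decision can look like. All constants and parameter names are illustrative assumptions, not the model implemented in MySQL.

```python
def ndp_placement_cost(rows, row_bytes, selectivity,
                       host_scan_cost=1.0, ndp_scan_cost=3.0,
                       transfer_cost_per_byte=0.05):
    """Toy NDP-aware cost model (all parameters are illustrative assumptions).

    Host execution pays for transferring every row before filtering; in-situ
    (NDP) execution pays a higher per-row scan cost on the weaker storage-side
    processor but only transfers the qualifying rows.
    Returns the cheaper placement and both cost estimates.
    """
    host_cost = rows * host_scan_cost + rows * row_bytes * transfer_cost_per_byte
    ndp_cost = rows * ndp_scan_cost + selectivity * rows * row_bytes * transfer_cost_per_byte
    return ("NDP" if ndp_cost < host_cost else "host"), host_cost, ndp_cost

# With these toy constants, a highly selective scan favours in-situ execution,
# while a low-selectivity scan favours the host.
print(ndp_placement_cost(rows=1_000_000, row_bytes=128, selectivity=0.01))
print(ndp_placement_cost(rows=1_000_000, row_bytes=128, selectivity=0.9))
```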
Type 1 diabetes is a chronic and life-threatening disease: an adjusted treatment and proper management of the disease are crucial to prevent or delay the complications of diabetes. Although the development of the artificial pancreas has made great advances in diabetes care during the last decade, multiple daily injections therapy still represents the most widely used treatment option for type 1 diabetes. This work presents the proposal and first development stages of an application focused on guiding patients in using continuous glucose monitors and smart pens, together with insulin and carbohydrate recommendations. Our proposal aims to develop a platform that integrates a series of rigorously tested, innovative machine learning models and tools with the latest IoT devices to manage type 1 diabetes. The resulting system actually closes the loop, like the artificial pancreas, but in an intermittent way.
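The paper's machine-learning models are not described here; as a point of reference only, the conventional open-loop bolus calculation that such recommenders typically refine has roughly the following shape. All parameter values are illustrative assumptions and this is not dosing advice.

```python
def suggested_bolus(carbs_g, glucose_mgdl, target_mgdl=110,
                    carb_ratio=10.0, correction_factor=40.0, insulin_on_board=0.0):
    """Textbook open-loop bolus estimate (illustrative only, not medical advice).

    carb_ratio:        grams of carbohydrate covered by one unit of insulin
    correction_factor: mg/dL of glucose lowered by one unit of insulin
    """
    meal_bolus = carbs_g / carb_ratio
    correction_bolus = max(glucose_mgdl - target_mgdl, 0) / correction_factor
    return max(meal_bolus + correction_bolus - insulin_on_board, 0.0)

# e.g. 60 g of carbohydrates at 180 mg/dL with 1 U still active:
print(round(suggested_bolus(60, 180, insulin_on_board=1.0), 1))  # ~6.8 U
```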
Modern markets are very dynamic. This situation requires agile enterprises that are able to react quickly to market influences. An enterprise's IT is especially affected, because new or changed business models have to be realized. However, enterprise architectures (EA) are complex structures consisting of many artifacts and the relationships between them, so analyzing an EA becomes a complex task for stakeholders. In addition, many stakeholders are involved in decision-making processes, because Enterprise Architecture Management (EAM) aims to provide a holistic view of the enterprise. In this article we use concepts of Adaptive Case Management (ACM) to design a decision-making case consisting of a combination of different analysis techniques to support stakeholders in decision-making. We exemplify the case with a scenario of a fictitious enterprise.
Context:
Test-driven development (TDD) is an agile software development approach that has been widely claimed to improve software quality. However, the extent to which TDD improves quality appears to be largely dependent upon the characteristics of the study in which it is evaluated (e.g., the research method, participant type, programming environment, etc.). The particularities of each study make the aggregation of results untenable.
Objectives:
The goal of this paper is to: increase the accuracy and generalizability of the results achieved in isolated experiments on TDD, provide joint conclusions on the performance of TDD across different industrial and academic settings, and assess the extent to which the characteristics of the experiments affect the quality-related performance of TDD.
Method:
We conduct a family of 12 experiments on TDD in academia and industry. We aggregate their results by means of meta-analysis. We perform exploratory analyses to identify variables impacting the quality-related performance of TDD.
Results:
TDD novices achieve a slightly higher code quality with iterative test-last development (i.e., ITL, the reverse approach of TDD) than with TDD. The task being developed largely determines quality. The programming environment, the order in which TDD and ITL are applied, or the learning effects from one development approach to another do not appear to affect quality. The quality-related performance of professionals using TDD drops more than for students. We hypothesize that this may be due to their being more resistant to change and potentially less motivated than students.
Conclusion:
Previous studies seem to provide conflicting results on TDD performance (i.e., positive vs. negative, respectively). We hypothesize that these conflicting results may be due to different study durations, experiment participants being unfamiliar with the TDD process, or case studies comparing the performance achieved by TDD vs. the control approach (e.g., the waterfall model), each applied to develop a different system. Further experiments with TDD experts are needed to validate these hypotheses.
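The aggregation of the 12 experiments mentioned in the Method section is done by meta-analysis; a typical form of such an aggregation is the inverse-variance weighted estimate, quoted here as the standard technique rather than the exact model used in the paper:

```latex
\hat{\theta} \;=\; \frac{\sum_{i=1}^{k} w_i\,\hat{\theta}_i}{\sum_{i=1}^{k} w_i},
\qquad
w_i \;=\; \frac{1}{\widehat{\operatorname{Var}}\bigl(\hat{\theta}_i\bigr)},
```

where each effect size estimate (for example, the standardized quality difference between TDD and ITL) observed in experiment i of the family contributes with a weight inversely proportional to its estimated variance.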
Purpose – Many start-ups are in search of cooperation partners to develop their innovative business models. In response, incumbent firms are introducing increasingly more cooperation systems to engage with startups. However, many of these cooperations end in failure. Although qualitative studies on cooperation models have tried to improve the effectiveness of incumbent start-up strategies, only a few have empirically examined start-up cooperation behavior. The paper aims to discuss these issues.
Design/methodology/approach – The research draws on a series of qualitative and quantitative studies. The scale dimensions are identified in an interview-based qualitative study. Subsequent workshops and questionnaire-based studies identify factors and rank them. These ranked factors are then used to build a measurement scale that is integrated into a standardized online questionnaire addressing start-ups. The gathered data are then analyzed using PLS-SEM.
Findings – The research was able to build a multi-item scale for start-ups' cooperation behavior. This scale can be used in future research. The paper also provides a causal analysis of the impact of cooperation behavior on start-up performance. The research finds that the identified dimensions are suitable for measuring cooperation behavior. It also shows a minor positive effect of cooperation behavior on start-up performance.
Originality/value – The research fills the gap left by the lack of empirical research on cooperation between start-ups and established firms. Moreover, most past studies focus on organizational structures and their performance when addressing these cooperations. Although past studies identified start-ups' behavior as a relevant factor, no empirical research has been conducted on the topic yet.
In recent years, artificial intelligence (AI) has increasingly become a relevant technology for many companies. While there are a number of studies that highlight challenges and success factors in the adoption of AI, there is a lack of guidance for firms on how to approach the topic in a holistic and strategic way. The aim of this study is therefore to develop a conceptual framework for corporate AI strategy. To address this aim, a systematic literature review of a wide spectrum of AI-related research is conducted, and the results are analyzed based on an inductive coding approach. An important conclusion is that companies should consider diverse aspects when formulating an AI strategy, ranging from technological questions to corporate culture and human resources. This study contributes to knowledge by proposing a novel, comprehensive framework to foster the understanding of crucial aspects that need to be considered when using the emerging technology of AI in a corporate context.
The rapid development and growth of knowledge has resulted in a rich stream of literature on various topics. Information systems (IS) research is becoming increasingly extensive, complex, and heterogeneous. Therefore, a proper understanding and timely analysis of the existing body of knowledge are important to identify emerging topics and research gaps. Despite the advances of information technology in the context of big data, machine learning, and text mining, the implementation of systematic literature reviews (SLRs) is in most cases still a purely manual task. This might lead to serious shortcomings of SLRs in terms of quality and time. The outlined approach in this paper supports the process of SLRs with machine learning techniques. For this purpose, we develop a framework with embedded steps of text mining, cluster analysis, and network analysis to analyze and structure a large amount of research literature. Although the framework is presented using IS research as an example, it is not limited to the IS field but can also be applied to other research areas.
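A minimal sketch of the text-mining and cluster-analysis steps such a framework embeds, assuming scikit-learn is available; the choice of TF-IDF vectorization, the number of clusters, and the use of abstracts as input are illustrative assumptions rather than the framework's actual configuration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

def cluster_abstracts(abstracts, n_clusters=5, top_terms=8):
    """Group paper abstracts into topical clusters and label each cluster
    with its most characteristic TF-IDF terms."""
    vectorizer = TfidfVectorizer(stop_words="english", max_features=5000)
    X = vectorizer.fit_transform(abstracts)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X)
    terms = vectorizer.get_feature_names_out()
    cluster_labels = {}
    for c in range(n_clusters):
        center = km.cluster_centers_[c]
        cluster_labels[c] = [terms[i] for i in center.argsort()[::-1][:top_terms]]
    return km.labels_, cluster_labels
```

The per-cluster term lists can then serve as a starting point for the manual topic naming and research-gap analysis of the SLR.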
Purpose
Supporting the surgeon during surgery is one of the main goals of intelligent ORs. The OR-Pad project aims to optimize the information flow within the perioperative area. A shared information space should enable appropriate preparation and provision of relevant information at any time before, during, and after surgery.
Methods
Based on previous work on an interaction concept and system architecture for the sterile OR-Pad system, we designed a user interface for mobile and intraoperative (stationary) use, focusing on the most important functionalities like clear information provision to reduce information overload. The concepts were transferred into a high-fidelity prototype for demonstration purposes. The prototype was evaluated from different perspectives, including a usability study.
Results
The prototype’s central element is a timeline displaying all available case information chronologically, such as radiological images, laboratory findings, or notes. This information space can be adapted for individual purposes (e.g., highlighting a tumor, filtering for one's own material). With the mobile and intraoperative modes of the system, relevant information can be added, preselected, viewed, and extended during the perioperative process. Overall, the evaluation showed good results and confirmed the vision of the information system.
Conclusion
The high-fidelity prototype of the information system OR-Pad focuses on supporting the surgeon via a timeline making all available case information accessible before, during, and after surgery. The information space can be personalized to enable targeted support. Further development is reasonable to optimize the approach and address missing or insufficient aspects, like the holding arm and sterility concept or new desired features.
A hybrid deep registration of MR scans to interventional ultrasound for neurosurgical guidance
(2021)
Despite recent advances in image-guided neurosurgery, reliable and accurate estimation of brain shift still remains one of the key challenges. In this paper, we propose an automated multimodal deformable registration method using hybrid learning-based and classical approaches to improve neurosurgical procedures. Initially, the moving and fixed images are aligned using a classical affine transformation (MINC toolkit), and the result is then provided to a convolutional neural network, which predicts the deformation field and is trained using backpropagation. Subsequently, the moving image is transformed using the resulting deformation into a moved image. Our model was evaluated on two publicly available datasets: the retrospective evaluation of cerebral tumors (RESECT) and brain images of tumors for evaluation (BITE). The mean target registration errors were reduced from 5.35 ± 4.29 to 0.99 ± 0.22 mm on RESECT and from 4.18 ± 1.91 to 1.68 ± 0.65 mm on BITE. Experimental results show that our method improves on the state of the art in terms of both accuracy and runtime (170 ms on average). Hence, the proposed method provides a fast runtime for a 3D MRI to intra-operative US pair in a GPU-based implementation, which shows promise for assisting neurosurgical procedures by compensating for brain shift.
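A small sketch of how the reported mean target registration error (TRE) is typically computed from corresponding landmarks once the predicted deformation has been applied; the landmark and spacing inputs are assumptions about the evaluation setup, not the paper's exact pipeline.

```python
import numpy as np

def target_registration_error(warped_landmarks, fixed_landmarks, spacing_mm=(1.0, 1.0, 1.0)):
    """Mean and standard deviation of the target registration error in millimetres.

    warped_landmarks: (N, 3) voxel coordinates of MR landmarks after registration
    fixed_landmarks:  (N, 3) voxel coordinates of the corresponding iUS landmarks
    spacing_mm:       voxel spacing used to convert to physical distances
    """
    diff_mm = (np.asarray(warped_landmarks) - np.asarray(fixed_landmarks)) * np.asarray(spacing_mm)
    distances = np.linalg.norm(diff_mm, axis=1)
    return distances.mean(), distances.std()

# Errors are then reported as "mean ± std", as for RESECT and BITE above.
```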
Data governance has been relevant for companies for a long time. Yet, in the broad discussion on smart cities, research on data governance in particular is scant, even though data governance plays an essential role in an environment with multiple stakeholders, complex IT structures and heterogeneous processes. Indeed, not only can a city benefit from the existing body of knowledge on data governance, but it can also make the appropriate adjustments for its digital transformation. Therefore, this literature review aims to spark research on urban data governance by providing an initial perspective for future studies. It provides a comprehensive overview of data governance and the relevant facets embedded in this strand of research. Furthermore, it provides a fundamental basis for future research on the development of an urban data governance framework.
Enterprise Governance, Risk and Compliance (GRC) systems are key to managing risks threatening modern enterprises from many different angles. A key constituent of GRC systems is the definition of controls that are implemented on the different layers of an Enterprise Architecture (EA). As part of the compliance aspect of GRC, the effectiveness of these controls is assessed and reported to the relevant management bodies within the enterprise. In this paper we present a metamodel which links controls to the affected elements of an EA and supplies a way of expressing associated assessment techniques and results. We complement the metamodel with an expository instantiation in a cockpit for control compliance applied at an international enterprise in the insurance industry.
Introduction
Despite its high accuracy, polysomnography (PSG) has several drawbacks for diagnosing obstructive sleep apnea (OSA). Consequently, multiple portable monitors (PMs) have been proposed.
Objective
This systematic review aims to investigate the current literature to analyze the sets of physiological parameters captured by a PM to select the minimum number of such physiological signals while maintaining accurate results in OSA detection.
Methods
Inclusion and exclusion criteria for the selection of publications were established prior to the search. The evaluation of the publications was made based on one central question and several specific questions.
Results
The abilities to detect hypopneas, sleep time, or awakenings were some of the features studied to investigate the full functionality of the PMs to select the most relevant set of physiological signals. Based on the physiological parameters collected (one to six), the PMs were classified into sets according to the level of evidence. The advantages and the disadvantages of each possible set of signals were explained by answering the research questions proposed in the methods.
Conclusions
The minimum number of physiological signals detected by PMs for the detection of OSA depends mainly on the purpose and context of the sleep study. The set of three physiological signals showed the best results in the detection of OSA.
Model-guided Therapy and Surgical Workflow Systems are two interrelated research fields which have been developed separately in recent years. To make full use of both technologies, it is necessary to integrate them and connect them to Hospital Information Systems. We propose a framework for the integration of Model-guided Therapy into Hospital Information Systems based on the Electronic Medical Record and a task-based Workflow Management System that is suitable for clinical end users. Two prototypes - one based on Business Process Modeling Language, one based on a Scrum board - are presented. From the experience with these prototypes, we developed a novel personalized visualization system for Surgical Workflows and Model-guided Therapy. Key challenges for further development are automated situation detection and a common communication infrastructure.
While several service-based maintainability metrics have been proposed in the scientific literature, reliable approaches to automatically collect these metrics are lacking. Since static analysis is complicated for decentralized and technologically diverse microservice-based systems, we propose a dynamic approach that calculates such metrics from runtime data obtained via distributed tracing. The approach focuses on simplicity, extensibility, and broad applicability. As a first prototype, we implemented a Java application with a Zipkin integrator, 23 different metrics, and five export formats. We demonstrated the feasibility of the approach by analyzing the runtime data of an example microservice-based system. During an exploratory study with six participants, 14 of the 18 services were invoked via the system's web interface. For these services, all metrics were calculated correctly from the generated traces.
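A simplified sketch of how service-level coupling metrics can be derived from collected trace data; the span fields and the two metrics shown are assumptions about the kind of caller/callee information a Zipkin-style tracer exports, not the tool's actual 23 metrics.

```python
from collections import defaultdict

def coupling_metrics(spans):
    """Derive simple coupling metrics per service from a list of trace spans.

    Each span is assumed to look like
    {"service": "orders", "calls": "payments"}  # caller -> callee of one invocation
    Returns, per service, the number of distinct services it calls (efferent
    coupling) and the number of distinct services calling it (afferent coupling).
    """
    callees = defaultdict(set)
    callers = defaultdict(set)
    for span in spans:
        callees[span["service"]].add(span["calls"])
        callers[span["calls"]].add(span["service"])
    services = set(callees) | set(callers)
    return {s: {"efferent": len(callees[s]), "afferent": len(callers[s])}
            for s in services}

spans = [{"service": "web", "calls": "orders"},
         {"service": "web", "calls": "users"},
         {"service": "orders", "calls": "payments"}]
print(coupling_metrics(spans))
```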
Purpose: This paper aims to conceptualize and empirically test the determinants of service interaction quality (SIQ) as attitude, behavior and expertise of a service provider (SP). Further, the individual and simultaneous effects of SIQ and its dimensions on important marketing outcomes are tested.
Design/methodology/approach: The narrative review of extant research helps formulate a conceptual model of SIQ, which is investigated using univariate and multivariate meta-analysis.
Findings: There are interdependencies between drivers of SIQ that underline the need to conceptualize service interaction as a dyadic phenomenon; use contemporary multilevel models, dyadic models, non-linear structural equation modeling and process studies; and study new and diverse services contexts. The meta-analysis illustrates the relative importance of the three drivers of SIQ and, in turn, their impact on consumer satisfaction and loyalty.
Research limitations/implications: The meta-analysis is based on existing research, which, unfortunately, has not examined critical services or exigency situations where SIQ is of paramount importance. Future research will be tasked with diversifying to several important domains where SIQ is a critical aspect of perceived service quality.
Practical implications: This study emphasizes that, although the expertise of an SP is important, firms would be surprised to learn that the attitude and behavior of their employees are equally important antecedents. In fact, there is a delicate balance that needs to be found; otherwise, attitudinal factors can have an overall counterproductive effect on consumer satisfaction.
Originality/value: This paper provides an empirical synthesis of SIQ and opens up interesting areas for further research.
In this note we look at anisotropic approximation of smooth functions on bounded domains by tensor product splines. The main idea is to extend such functions and then use known approximation techniques on R^d. We prove an error estimate for domains for which bounded extension operators exist. This obvious approach has some limitations: it is not applicable without restrictions on the chosen coordinate degree, even if the domain is as simple as the unit disk. Furthermore, for approximation on R^d there are error estimates in which the grid widths and directional derivatives are paired in an interesting way. It seems impossible to maintain this property using extension operators.
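For orientation, the anisotropic estimates on R^d alluded to above typically pair each grid width with the directional derivative of matching order; the following is the standard textbook shape of such estimates, quoted as background rather than the precise statement proved in the note:

```latex
\operatorname{dist}\bigl(f, S_{\mathbf{m},\mathbf{h}}\bigr)_{L_p(\mathbb{R}^d)}
\;\le\; C \sum_{i=1}^{d} h_i^{\,m_i}\,
\bigl\| D_i^{\,m_i} f \bigr\|_{L_p(\mathbb{R}^d)},
```

where S_{m,h} denotes tensor product splines of coordinate order m_i (degree m_i - 1) on a grid with widths h_i.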
In the upcoming years, huge benefits are expected from Artificial Intelligence (AI). However, there are also risks involved in the technology, such as accidents of autonomous vehicles or discrimination by AI-based recruitment systems. This study aims to investigate public perception of these risks, focusing on realistic risks of Narrow AI, i.e., the type of AI that is already productive today. Based on perceived risk theory, several risk scenarios are examined using data from an exploratory survey. This research shows that AI is perceived positively overall. The participants, however, do evaluate AI critically when being confronted with specific risk scenarios. Furthermore, a strong positive relationship between knowledge about AI and perceived risk could be shown. This study contributes to knowledge by advancing our understanding of the awareness and evaluation of the risks by consumers and has important implications for product development, marketing and society.
Cardiovascular diseases are directly or indirectly responsible for up to 38.5% of all deaths in Germany and thus represent the most frequent cause of death. At present, heart diseases are mainly discovered by chance during routine visits to the doctor or when acute symptoms occur. However, there is no practical method to proactively detect diseases or abnormalities of the heart in the daily environment and to take preventive measures for the person concerned. Long-term ECG devices, as currently used by physicians, are simply too expensive, impractical, and not widely available for everyday use. This work aims to develop an ECG device suitable for everyday use that can be worn directly on the body. For this purpose, an already existing hardware platform will be analyzed, and the corresponding potential for improvement will be identified. A precise picture of the existing data quality is obtained by metrological examination, and corresponding requirements are defined. Based on these identified optimization potentials, a new ECG device is developed. The revised ECG device is characterized by a high integration density and combines all components except the battery and the ECG electrodes directly on one board. The compact design allows the device to be attached directly to the chest. An integrated microcontroller allows digital signal processing without the need for an additional computer. Central features of the evaluation are peak detection for detecting R-peaks and a calculation of the current heart rate based on the RR interval. To ensure the validity of the detected R-peaks, a model of the anatomical conditions is used; thus, unrealistic RR intervals can be excluded. The wireless interface allows continuous transmission of the calculated heart rate. Following the development of hardware and software, the results are verified, and appropriate conclusions about the data quality are drawn. As a result, a very compact and wearable ECG device with different wireless technologies, data storage, and evaluation of RR intervals was developed. Some tests yielded runtimes of up to 24 hours with wireless LAN activated and streaming.
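A compact sketch of the kind of on-device evaluation described above: threshold-based R-peak detection with a refractory period, followed by a physiological plausibility check on the RR intervals. Thresholds and limits are illustrative assumptions, not the firmware's actual parameters.

```python
import numpy as np

def detect_r_peaks(ecg, fs, threshold_factor=0.6, refractory_s=0.25):
    """Return sample indices of R-peaks found by simple thresholding.

    A candidate must exceed threshold_factor * max(|ecg|) and lie at least
    one refractory period after the previously accepted peak.
    """
    threshold = threshold_factor * np.max(np.abs(ecg))
    refractory = int(refractory_s * fs)
    peaks, last = [], -refractory
    for i in range(1, len(ecg) - 1):
        if ecg[i] > threshold and ecg[i] >= ecg[i - 1] and ecg[i] >= ecg[i + 1]:
            if i - last >= refractory:
                peaks.append(i)
                last = i
    return np.array(peaks)

def heart_rate(peaks, fs, rr_min_s=0.33, rr_max_s=2.0):
    """Mean heart rate in bpm from RR intervals, discarding implausible
    intervals (roughly 30-180 bpm), analogous to the anatomical plausibility
    model mentioned above."""
    rr = np.diff(peaks) / fs
    rr = rr[(rr >= rr_min_s) & (rr <= rr_max_s)]
    return 60.0 / rr.mean() if rr.size else float("nan")
```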
Documentation of clinical processes, especially in the perioperative area, is a basic requirement for quality of service. Nonetheless, documentation is a burden for the medical staff since it distracts from the clinical core process. An intuitive and user-friendly documentation system could increase documentation quality and reduce documentation workload. The optimal system solution would know what happened, and the person documenting the step would only need a single “confirm” button. In many cases, such a linear flow of activities is given as long as only one profession (e.g. anaesthesiology, scrub nurse) is considered, but even in such cases there might be deviations from the linear process flow, and further interaction is required.
A lot of people need help in their daily life to wash, select and manage their clothing. The goal of this work is to design an assistant system (eKlarA) to support the user by giving recommendations to choose the clothing combinations, to find the clothing and to wash the clothing. The idea behind eKlarA is to generate a system that uses sensors to identify the clothing and their state in the clothing cycle. The clothing cycle consists of the stations: closets, laundry basket and washing machine in one or several places. The system uses the information about the clothing, weather and calendar to support the user in the different steps of the clothing cycle. The first prototype of this system has been developed and tested. The test results are presented in this work.
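A minimal sketch of how the clothing cycle described above can be represented as a state machine driven by sensor events; the station names come from the abstract, while the intermediate "worn" state, the event names, and the transition rules are illustrative assumptions.

```python
# Stations of the clothing cycle; transitions are triggered by (hypothetical)
# sensor events such as a garment tag being read at a station.
CLOTHING_CYCLE = {
    ("closet", "taken_out"): "worn",
    ("worn", "put_in_basket"): "laundry_basket",
    ("laundry_basket", "loaded"): "washing_machine",
    ("washing_machine", "unloaded"): "closet",
}

def next_state(current_state, event):
    """Advance a garment through the clothing cycle; unknown events keep the state."""
    return CLOTHING_CYCLE.get((current_state, event), current_state)

# e.g. a shirt detected being loaded into the washing machine:
print(next_state("laundry_basket", "loaded"))  # washing_machine
```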
Sleep quality, and in general behavior in bed, can be detected using a sleep state analysis. These results can help a subject to regulate sleep and recognize different sleeping disorders. In this work, a sensor grid for pressure and movement detection supporting sleep phase analysis is proposed. In comparison to the leading standard measuring system, polysomnography (PSG), the system proposed in this project is a non-invasive sleep monitoring device. For continuous analysis or home use, PSG or wearable actigraphy devices tend to be uncomfortable. Besides this, they are also very expensive. The system presented in this work classifies respiration and body movement with only one type of sensor and in a non-invasive way. The sensor used is a low-cost pressure sensor that can also be used for commercial purposes. The system was tested in an experiment that recorded the sleep process of a subject. These recordings showed the potential for classification of breathing rate and body movements. Although previous research shows the use of pressure sensors for recognizing posture and breathing, the sensors have mostly been positioned between the mattress and the bedsheet. This project, however, shows an innovative way to position the sensors under the mattress.
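A sketch of the two classification targets named above (respiration and body movement) from a single pressure signal: breathing rate from the dominant spectral component in a typical respiratory band, movement from short-term variance. Band limits, window length, and thresholds are illustrative assumptions.

```python
import numpy as np

def breathing_rate_bpm(pressure, fs, f_low=0.1, f_high=0.5):
    """Estimate breaths per minute from the dominant frequency in the
    (assumed) respiratory band of 0.1-0.5 Hz."""
    signal = pressure - np.mean(pressure)
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    band = (freqs >= f_low) & (freqs <= f_high)
    if not band.any():
        return float("nan")
    return 60.0 * freqs[band][np.argmax(spectrum[band])]

def movement_mask(pressure, fs, window_s=2.0, factor=5.0):
    """Flag windows whose variance exceeds `factor` times the median window
    variance as body movement."""
    w = max(int(window_s * fs), 1)
    n = len(pressure) // w
    variances = np.array([np.var(pressure[i * w:(i + 1) * w]) for i in range(n)])
    return variances > factor * np.median(variances)
```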
Background and purpose: Transapical aortic valve replacement (TAVR) is a recent minimally invasive surgical treatment technique for elderly and high-risk patients with severe aortic stenosis. In this paper, a simple and accurate image-based method is introduced to aid the intra-operative guidance of the TAVR procedure under 2-D X-ray fluoroscopy.
Methods: The proposed method fuses a 3-D aortic mesh model and anatomical valve landmarks with live 2-D fluoroscopic images. The 3-D aortic mesh model and landmarks are reconstructed from an interventional X-ray C-arm CT system, and a target area for valve implantation is automatically estimated using these aortic mesh models. Based on a template-based tracking approach, the overlay of the visualized 3-D aortic mesh model, landmarks and target area of implantation is updated onto the fluoroscopic images by approximating the aortic root motion from the motion of a pigtail catheter without contrast agent. In addition, a rigid intensity-based registration algorithm is used to continuously track the aortic root motion in the presence of contrast agent. Furthermore, sensorless tracking of the aortic valve prosthesis is provided to guide the physician in placing the prosthesis appropriately within the estimated target area of implantation.
Results: Retrospective experiments were carried out on fifteen patient datasets from the clinical routine of TAVR. The maximum displacement errors were less than 2.0 mm for both the dynamic overlay of the aortic mesh models and the image-based tracking of the prosthesis, and within clinically accepted ranges. Moreover, the proposed method achieved high success rates, above 91.0%, for all tested patient datasets.
Conclusion: The results showed that the proposed method for computer-aided TAVR is potentially a helpful tool for physicians by automatically defining the accurate placement position of the prosthesis during the surgical procedure.
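A toy sketch of the template-based step described in the Methods: search a small window of the fluoroscopic frame for the pigtail-catheter template and shift the projected overlay by the estimated displacement. The SSD matching criterion and window size are simplifying assumptions, not the paper's actual tracker.

```python
import numpy as np

def track_template(frame, template, prev_pos, search_radius=20):
    """Locate `template` near `prev_pos` = (row, col) in `frame` by minimising
    the sum of squared differences; returns the new position."""
    th, tw = template.shape
    best, best_pos = np.inf, prev_pos
    for dr in range(-search_radius, search_radius + 1):
        for dc in range(-search_radius, search_radius + 1):
            r, c = prev_pos[0] + dr, prev_pos[1] + dc
            if r < 0 or c < 0 or r + th > frame.shape[0] or c + tw > frame.shape[1]:
                continue
            ssd = np.sum((frame[r:r + th, c:c + tw] - template) ** 2)
            if ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

def update_overlay(overlay_points_2d, old_pos, new_pos):
    """Shift the projected aortic mesh overlay by the catheter displacement."""
    shift = np.asarray(new_pos) - np.asarray(old_pos)
    return overlay_points_2d + shift
```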
The cloud has evolved into an attractive execution environment for parallel applications from the High Performance Computing (HPC) domain. Existing research has recognized that parallel applications require architectural refactoring to benefit from cloud-specific properties (most importantly elasticity). However, architectural refactoring comes with many challenges and cannot be applied to all applications due to fundamental performance issues. Thus, during the last years, different cloud migration strategies have been considered for different classes of parallel applications. In this paper, we provide a survey of HPC cloud migration research. We investigate the approaches applied and the parallel applications considered. Based on our findings, we identify and describe three cloud migration strategies.
The acquisition of data for reality mining applications is a critical factor, since many mobile devices, e.g. smartphones, must be capable of capturing the required data; otherwise, only a small target group would be able to use the reality mining application. In the course of a survey, we have identified smartphone features which might be relevant for various reality mining applications. The survey classifies these features and shows how the support of each feature has changed over the years by analyzing 143 smartphones released between 2004 and 2015. All analyzed devices can be ranked by the number of features they provide. Furthermore, this paper deals with quality issues which occurred while carrying out the survey.
Many organizations have identified the opportunities of big data analytics to support the business with problem-specific insights through the exploitation of generated data. Socio-technical solutions are developed in big data projects to achieve competitive advantage. Although these projects are aligned to specific business needs, common architectural challenges are not addressed in a comprehensive manner. Enterprise architecture (EA) management is a holistic approach to tackling the complex business and IT architecture. The transformation of an organization's EA is influenced by big data projects and their data-driven approach on all layers. To enable strategy-oriented development of the EA, it is essential to synchronize these projects, supported by EA management. In this paper, we conduct a systematic review of big data literature to analyze which requirements for the EA management discipline are proposed. Thereby, a broad overview of existing research is presented to facilitate a more detailed exploration and to foster the evolution of the EA management discipline.
Context: Many companies are facing an increasingly dynamic and uncertain market environment, making traditional product roadmapping practices no longer sufficiently applicable. As a result, many companies need to adapt their product roadmapping practices for continuing to operate successfully in today’s dynamic market environment. However, transforming product roadmapping practices is a difficult process for organizations. Existing literature offers little help on how to accomplish such a process.
Objective: The objective of this paper is to present a product roadmap transformation approach for organizations to help them identify appropriate improvement actions for their roadmapping practices using an analysis of their current practices.
Method: Based on an existing assessment procedure for evaluating product roadmapping practices, the first version of a product roadmap transformation approach was developed in workshops with company experts. The approach was then given to eleven practitioners and their perceptions of the approach were gathered through interviews.
Results: The result of the study is a transformation approach consisting of a process describing what steps are necessary to adapt the currently applied product roadmapping practice to a dynamic and uncertain market environment. It also includes recommendations on how to select areas for improvement and two empirically based mapping tables. The interviews with the practitioners revealed that the product roadmap transformation approach was perceived as comprehensible, useful, and applicable. Nevertheless, we identified potential for improvements, such as a clearer presentation of some processes and the need for more improvement options in the mapping tables. In addition, minor usability issues were identified.
This paper compares the influence that a video self-avatar and the lack of a visual representation of a body have on height estimation when standing at a virtual visual cliff. A height estimation experiment was conducted using a custom augmented reality Oculus Rift hardware and software prototype, also described in this paper. The results are consistent with previous research, demonstrating that the presence of a visual body influences height estimates, just as it has been shown to influence distance and affordance estimates.
Representing users within an immersive virtual environment is an essential functionality of a multi-person virtual reality system. Especially when communicative or collaborative tasks must be performed, there are challenges in realistically embodying and integrating such avatar representations. A shared comprehension of local space and non-verbal communication (like gesture, posture or self-expressive cues) can support these tasks. In this paper, we introduce a novel approach to creating realistic, video-texture-based avatars of co-located users in real time and integrating them into an immersive virtual environment. We show a straightforward and low-cost hardware and software solution for doing so. We discuss technical design problems that arose during implementation and present a qualitative analysis of the usability of the concept from a user study, applying it to a training scenario in the automotive sector.
In this work, a web-based software architecture and framework for the management and diagnosis of large amounts of medical data in an ophthalmologic reading center is proposed. Data management for multi-center studies requires merging standing data and repeatedly gathered clinical evidence such as vital signs and raw data. If ophthalmologic questions are involved, the data acquisition is often performed by non-medical staff at the point of care or a study center, whereas the medical findings are mostly provided by an ophthalmologist in a specialized reading center. The study data such as participants, cohorts and measured values are administrated at a single data center for the entire study. Since a specialized reading center maintains several studies, the medical staff must learn the different data administration procedures of the different data centers. With respect to the increasing number and size of clinical studies, two aspects must be considered. First, an efficient software framework is required to support data management, processing and diagnosis by medical experts at the reading center. Second, this software needs a standardized user interface that does not have to be trained/tailored/adapted for each new study. Furthermore, different aspects of quality and security controls have to be included. Therefore, the objective of this work is to establish a multi-purpose ophthalmologic reading center which can be connected to different data centers via configurable data interfaces in order to handle various topics simultaneously.
In this paper we describe an interactive web-based tool for visual analysis of Formula 1 data. A calendar-like representation provides an overview of all races on a yearly basis, either in absolute or normalized time. After selecting a particular race, more details about this race can be explored. Furthermore, it is possible to compare up to three different races. Besides visualizing details of particular races, it is also possible to analyse driver and team performance over time. A user study was conducted to get feedback about the usage of the application and to decide between different visualization options.
Workflow-driven support systems in the perioperative area have the potential to optimize clinical processes and to allow new situation-adaptive support systems. We have started to develop a workflow management system supporting all involved actors in the operating theatre, with the goal of synchronizing the tasks of the different stakeholders by giving relevant information to the right team members. Using the OMG standards BPMN, CMMN and DMN gives us the opportunity to bring established methods from other industries into the medical field. The system shows each addressed actor their information in the right place at the right time, making sure every member can execute their task in time to ensure a smooth workflow. The system has an overall view of all tasks. Accordingly, it comprises the Camunda BPM workflow engine to run the models, a middleware to connect different systems to the workflow engine, and graphical user interfaces to show necessary information and to interact with the system. The complete pipeline is implemented with a RESTful web service. The system is designed to integrate other systems, such as a hospital information system (HIS), via the RESTful web service easily and without loss of data. The first prototype has been implemented and will be expanded.
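A minimal example of how an external system could trigger one of the modelled workflows over a RESTful interface. It assumes Camunda 7's standard engine-rest endpoint and a hypothetical process key "perioperative-workflow"; the base URL, process key, and variables are assumptions and should be checked against the deployed engine version.

```python
import requests

CAMUNDA_REST = "http://localhost:8080/engine-rest"  # assumed default base URL

def start_perioperative_case(patient_id, intervention):
    """Start a new instance of the (hypothetical) 'perioperative-workflow'
    process definition and hand over case variables to the engine."""
    payload = {
        "variables": {
            "patientId": {"value": patient_id, "type": "String"},
            "intervention": {"value": intervention, "type": "String"},
        }
    }
    response = requests.post(
        f"{CAMUNDA_REST}/process-definition/key/perioperative-workflow/start",
        json=payload,
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["id"]  # id of the started process instance
```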
Information systems which support the workflow in the clinical area are currently limited to organizational processes. This work shows a first approach to an information system supporting all actors in the perioperative area. The first prototype and proof of concept was a task manager giving all actors information about their tasks and the tasks of all other actors during an intervention. Based on this initial task manager, we implemented an information system based on a workflow engine controlling all processes and all information necessary for the intervention. A second part was a perioperative process visualization, developed using a user-centered approach jointly with clinicians and OR members.
Reliable and accurate car driver head pose estimation is an important function for the next generation of advanced driver assistance systems that need to consider the driver state in their analysis. For optimal performance, head pose estimation needs to be non-invasive, calibration-free and accurate for varying driving and illumination conditions. In this pilot study we investigate a 3D head pose estimation system that automatically fits a statistical 3D face model to measurements of a driver's face, acquired with a low-cost depth sensor on challenging real-world data. We evaluate the results of our sensor-independent, driver-adaptive approach against those of a state-of-the-art camera-based 2D face tracking system as well as a non-adaptive 3D model, relative to our own ground-truth data, and compare to other 3D benchmarks. We find large accuracy benefits for the adaptive 3D approach.
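A condensed sketch of the rigid part of fitting a 3D face model to depth measurements: the closed-form least-squares rotation and translation between corresponding model and sensor points (Kabsch/Procrustes). Correspondence finding and the statistical shape adaptation described above are beyond this sketch.

```python
import numpy as np

def rigid_fit(model_pts, sensor_pts):
    """Least-squares rigid transform (R, t) mapping model_pts onto sensor_pts.

    model_pts, sensor_pts: (N, 3) arrays of corresponding 3D points.
    """
    mu_m, mu_s = model_pts.mean(axis=0), sensor_pts.mean(axis=0)
    H = (model_pts - mu_m).T @ (sensor_pts - mu_s)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_s - R @ mu_m
    return R, t  # head pose: rotation (yaw/pitch/roll from R) and position t
```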
Active storage
(2019)
In brief, Active Storage refers to an architectural hardware and software paradigm based on the collocation of storage and compute units. Ideally, it allows executing application-defined data ... within the physical data storage. Thus, Active Storage seeks to minimize expensive data movement, improving performance, scalability, and resource efficiency. The effective use of Active Storage mandates new architectures, algorithms, interfaces, and development toolchains.