Perceptual integration of kinematic components in the recognition of emotional facial expressions
(2018)
According to a long-standing hypothesis in motor control, complex body motion is organized in terms of movement primitives, massively reducing the dimensionality of the underlying control problems. For body movements, this low-dimensional organization has been convincingly demonstrated by learning low-dimensional representations from kinematic and EMG data. In contrast, the effective dimensionality of dynamic facial expressions is unknown, and dominant analysis approaches have been based on heuristically defined facial "action units," which reflect contributions of individual face muscles. We determined the effective dimensionality of dynamic facial expressions by learning a low-dimensional model from 11 facial expressions. We found a remarkably low dimensionality: only two movement primitives are sufficient to simulate these dynamic expressions with high accuracy. This low dimensionality is confirmed statistically by Bayesian comparison of models with different numbers of primitives, and by a psychophysical experiment demonstrating that expressions simulated with only two primitives are indistinguishable from natural ones.
In addition, we find statistically optimal integration of the emotion information specified by these primitives in visual perception. Taken together, our results indicate that facial expressions might be controlled by a very small number of independent control units, permitting a very low-dimensional parametrization of the associated facial expressions.
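The idea of recovering a handful of movement primitives from high-dimensional motion data can be illustrated with a dimensionality-reduction sketch. The paper's actual model and data differ; the synthetic trajectories, variable names, and the use of plain PCA below are illustrative assumptions only.

```python
import numpy as np

# Illustrative sketch: recover a small number of "movement primitives" from
# synthetic keypoint trajectories via PCA (a stand-in for the paper's model).
rng = np.random.default_rng(0)
n_frames, n_keypoints = 200, 30

# Two ground-truth primitives (smooth temporal profiles) mixed into the data
t = np.linspace(0, 1, n_frames)
primitives = np.stack([np.sin(np.pi * t), np.sin(2 * np.pi * t)])  # (2, T)
mixing = rng.normal(size=(n_keypoints, 2))                         # weights
X = (mixing @ primitives).T + 0.01 * rng.normal(size=(n_frames, n_keypoints))

# PCA: center the data, then take the singular value decomposition
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = S**2 / np.sum(S**2)

# Two components capture nearly all variance in this rank-2 signal
print(f"variance explained by 2 components: {explained[:2].sum():.3f}")
```

Because the synthetic data is built from exactly two temporal profiles plus small noise, the first two components dominate, mirroring the finding that two primitives suffice.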
A case study with four German fashion retail brands was conducted to measure the performance of their omnichannel services, specifically their Click & Collect service, one of the first omnichannel services introduced in fashion retailing. Omnichannel services integrate different sales and communication channels so that offline, online, and mobile-app touchpoints provide a seamless customer journey. The omnichannel performance of the four retailers Decathlon, Hunkemöller, Massimo Dutti, and Galeria Kaufhof was measured via mystery shopping. A seamless customer journey is not yet standard in German fashion retailing, and the four companies differ in many process details. The biggest market potential, and the recommendation for further research, lies in the deficits of the offline-store omnichannel customer experience, where all four case companies have room to improve. The best overall results regarding the integration of offline, online, and mobile shops were found with Hunkemöller, followed by Decathlon, Massimo Dutti, and Galeria Kaufhof.
Database management systems (DBMS) are critical performance components in large-scale applications under modern update-intensive workloads. Additional access paths accelerate look-up performance in a DBMS for frequently queried attributes, but the required maintenance slows down update performance. The ubiquitous B+-tree is a commonly used key-indexed access path that supports many required functionalities with logarithmic access time to requested records. Modern processing and storage technologies and their characteristics require a reconsideration of matured indexing approaches for today's workloads. Partitioned B-trees (PBTs) leverage the characteristics of modern hardware technologies and complex memory hierarchies, as well as high update rates and changing workloads, by maintaining partitions within one single B+-tree. This paper includes an experimental evaluation of the PBT's optimized write pattern and its performance improvements. With PBTs, transactional throughput under TPC-C increases by 30%, and PBTs produce beneficial sequential write patterns even in the presence of updates and maintenance operations.
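The core mechanism of a Partitioned B-tree is an artificial partition number prepended to every key, so that several logical partitions coexist inside one ordered structure and recent writes cluster in the newest partition. The sketch below illustrates that idea with a sorted list standing in for the B+-tree; the class and method names are hypothetical, not the paper's implementation.

```python
import bisect

class PartitionedTree:
    """Sketch of the PBT idea: composite (partition, key) entries in one
    ordered structure. A sorted list stands in for a real B+-tree."""

    def __init__(self):
        self._keys = []   # sorted list of (partition, key) composites
        self._vals = {}   # composite -> value

    def insert(self, partition, key, value):
        composite = (partition, key)
        if composite not in self._vals:
            bisect.insort(self._keys, composite)
        self._vals[composite] = value

    def lookup(self, key):
        # Probe partitions newest-first: recent writes land in newer
        # partitions, which therefore shadow older versions of a key.
        for p in sorted({p for p, _ in self._vals}, reverse=True):
            if (p, key) in self._vals:
                return self._vals[(p, key)]
        return None

    def scan(self, partition, lo, hi):
        # Range scan within one partition via the composite key order.
        i = bisect.bisect_left(self._keys, (partition, lo))
        j = bisect.bisect_right(self._keys, (partition, hi))
        return [(k, self._vals[(p, k)]) for p, k in self._keys[i:j]]

t = PartitionedTree()
t.insert(0, "a", 1)
t.insert(0, "b", 2)
t.insert(1, "b", 20)      # newer partition shadows the older value
print(t.lookup("b"))      # → 20
```

Because new entries go to the highest partition number, inserts append near one region of the tree, which is the source of the sequential write pattern the abstract describes.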
The characteristics of modern computing and storage technologies fundamentally differ from those of traditional hardware, and there is a need to optimally leverage their performance, endurance, and energy-consumption characteristics. Therefore, existing architectures and algorithms in modern high-performance database management systems have to be redesigned and advanced. Multi-Version Concurrency Control (MVCC) approaches in database management systems maintain multiple physically independent tuple versions. Snapshot-isolation approaches enable high parallelism and concurrency in workloads at an almost-serializable consistency level, and modern hardware technologies benefit from multi-version approaches. Indexing multi-version data on modern hardware, however, is still an open research area. In this paper, we provide a survey of popular multi-version indexing approaches and an extended scope of high-performance single-version approaches. An optimal multi-version index structure balances look-up efficiency for the tuple versions visible to transactions against the effort of index maintenance, for different workloads on modern hardware technologies.
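The visibility rule at the heart of MVCC under snapshot isolation can be sketched in a few lines: each tuple version carries the timestamps of the transactions that created and superseded it, and a reader sees the version whose lifetime contains its snapshot. The data-structure and field names below are illustrative assumptions, not any specific system's layout.

```python
from dataclasses import dataclass, field

@dataclass
class Version:
    value: object
    begin_ts: int                  # commit timestamp of the creating txn
    end_ts: float = float("inf")   # timestamp at which it was superseded

@dataclass
class VersionChain:
    versions: list = field(default_factory=list)  # newest first

    def write(self, value, ts):
        if self.versions:
            self.versions[0].end_ts = ts   # close the previous version
        self.versions.insert(0, Version(value, ts))

    def read(self, snapshot_ts):
        # Return the version visible to a snapshot taken at snapshot_ts.
        for v in self.versions:
            if v.begin_ts <= snapshot_ts < v.end_ts:
                return v.value
        return None

chain = VersionChain()
chain.write("v1", ts=10)
chain.write("v2", ts=20)
print(chain.read(15))  # a snapshot at ts=15 still sees "v1"
print(chain.read(25))  # a snapshot at ts=25 sees "v2"
```

The index-design tension the abstract names shows up here directly: a single-version index would point only at the newest entry, while a multi-version index must either store one entry per version or traverse such chains at read time.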
This book describes in detail the prerequisites and the process flow of business cases. In practice, these are the most important instrument for analyzing whether entrepreneurial decisions are advantageous. However, merely mastering the relevant methods of investment appraisal is not sufficient to create an adequate business case. Andreas Taschner provides helpful guidance and tips on choosing methods and presenting results, and discusses further questions such as accounting for uncertainty or including non-monetary factors. The orientation toward an idealized process helps readers develop their own business cases and provides a guideline for their first independent work. Application-oriented questions and answers deepen the material. "Business Cases" is aimed at practitioners in the areas of investment, controlling, planning, and corporate management. Business and economics students at universities of applied sciences and universities, especially those specializing in controlling and corporate management, benefit from the compact presentation of the subject.
The invention relates to a device (100) and a method for electrically connecting and disconnecting two electrical potentials (1, 2). The invention further relates to a use of the device (100). The device (100) comprises: – a first module comprising a first and a second transistor (10a, 10b), wherein the first transistor (10a) is connected anti-serially to the second transistor (10b); and – a second module comprising a third and a fourth transistor (10c, 10d), wherein the third transistor (10c) is connected anti-serially to the fourth transistor (10d); wherein the first module and the second module are connected in parallel.
Using measurement and simulation for understanding distributed development processes in the Cloud
(2017)
Organizations increasingly develop software in a distributed manner. The Cloud provides an environment to create and maintain software-based products and services. Currently, it is widely unknown which software processes are suited for Cloud-based development and what their effects are in specific contexts. This paper presents a process simulation to study distributed development in the Cloud. We contribute a simulation model that helps analyze different project parameters and their impact on projects carried out in the Cloud. The simulator reproduces activities, developers, issues, and events in the project, and it generates statistics, e.g., on throughput, total time, and lead and cycle time. The aim of this simulation model is to analyze the trade-offs regarding throughput, total time, project size, and team size. Furthermore, the modified simulation model aims to help project managers select the most suitable planning alternative. Based on observed projects in Finland and Spain, we simulated a distributed project using artificial and real data. In particular, we studied the variables project size, team size, throughput, and total project duration. A comparison of the real project data with the simulation results shows that the simulation produces results close to the real data, and we could successfully replicate a distributed software project. By improving the understanding of distributed development processes, our simulation model thus supports project managers in their decision-making.
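A simulation of this kind can be sketched as a small queueing model: issues arrive over time, a fixed team works them off, and throughput, lead time, and total duration fall out as statistics. The parameters, distributions, and function names below are illustrative assumptions, not the paper's actual model.

```python
import heapq
import random

def simulate(team_size, n_issues, mean_interarrival=1.0, mean_effort=2.5, seed=1):
    """Toy project simulation: exponential interarrival and effort times,
    developers modeled as a heap of 'free again at' timestamps."""
    rng = random.Random(seed)
    arrivals, t = [], 0.0
    for _ in range(n_issues):
        t += rng.expovariate(1.0 / mean_interarrival)
        arrivals.append(t)

    free_at = [0.0] * team_size   # when each developer next becomes free
    heapq.heapify(free_at)
    lead_times = []
    for arrival in arrivals:
        start = max(arrival, heapq.heappop(free_at))  # earliest free dev
        finish = start + rng.expovariate(1.0 / mean_effort)
        heapq.heappush(free_at, finish)
        lead_times.append(finish - arrival)

    total_time = max(free_at)     # completion time of the last issue
    return {
        "throughput": n_issues / total_time,
        "mean_lead_time": sum(lead_times) / n_issues,
        "total_time": total_time,
    }

small = simulate(team_size=2, n_issues=500)
large = simulate(team_size=5, n_issues=500)
print(small["mean_lead_time"], large["mean_lead_time"])
```

With these illustrative parameters the two-developer team is overloaded (work arrives faster than it can be served), so its lead times grow far beyond those of the five-developer team, which is the kind of trade-off between team size, throughput, and duration the model is meant to expose.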
The business landscape is changing radically because of software. Companies in all industry sectors are continuously finding new flexibilities in this programmable world. They are able to deliver new functionality even after the product is already in the customer's hands. But success is far from guaranteed if they cannot validate their assumptions about what their customers actually need. A competitor with better knowledge of customer needs can disrupt the market in an instant.
This book introduces continuous experimentation, an approach to continuously and systematically test assumptions about the company's product or service strategy and verify customers' needs through experiments. By observing how customers actually use the product or early versions of it, companies can make better development decisions and avoid potentially expensive and wasteful activities. The book explains the cycle of continuous experimentation, demonstrates its use through industry cases, provides advice on how to conduct experiments with recipes, tools, and models, and lists common pitfalls to avoid. Use it to get started with continuous experimentation and make product and service development decisions that are in line with your customers' needs.
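The statistical core of such a product experiment is often a comparison of conversion rates between a control and a treatment group. The sketch below shows a standard two-proportion z-test on made-up numbers; the book covers the broader experimentation cycle, and this particular test, the figures, and the threshold are illustrative assumptions only.

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test: did variant B convert differently
    from variant A? Returns the z statistic and p-value."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)            # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))   # standard error under H0
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: 120/1000 conversions for control,
# 150/1000 for the new feature variant.
z, p_value = two_proportion_z(conv_a=120, n_a=1000, conv_b=150, n_b=1000)
print(f"z = {z:.2f}, p = {p_value:.4f}")  # small p → evidence for a real difference
```

In a continuous-experimentation loop, a result like this feeds the decision to roll the change out, iterate on it, or abandon the underlying assumption.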