A device including a first and a second monitoring unit, the first monitoring unit detecting a first voltage potential and the second monitoring unit detecting a second voltage potential. The monitoring units compare the first and second voltage potentials to the value of the supply voltage and activate a control unit as a function of the comparisons. The control unit determines a switching point in time of a second power transistor, and an arrangement is present which generates a current when the second power transistor is switched on, the current changing the first voltage potential. The control unit activates a first power transistor when the first voltage potential has the same value as the supply voltage, so that the first power transistor is de-energized.
Sleep quality and, more generally, behavior in bed can be detected using a sleep state analysis. These results can help a subject regulate sleep and recognize different sleeping disorders. In this work, a sensor grid for pressure and movement detection supporting sleep phase analysis is proposed. In contrast to the leading standard measuring system, polysomnography (PSG), the system proposed in this project is a non-invasive sleep monitoring device. For continuous analysis or home use, PSG and wearable actigraphy devices tend to be uncomfortable; they are also very expensive. The system presented in this work classifies respiration and body movement with only one type of sensor, and does so non-invasively. The sensor used is a pressure sensor, which is low cost and can be used for commercial purposes. The system was tested in an experiment that recorded the sleep process of a subject. The recordings showed the potential for classifying breathing rate and body movements. Although previous research has used pressure sensors to recognize posture and breathing, the sensors have mostly been positioned between the mattress and the bedsheet. This project, however, shows an innovative way to position the sensors under the mattress.
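As a minimal illustration of how a breathing rate could be derived from such a pressure signal, the sketch below counts rising mean-crossings of the slow pressure oscillation. The function name, the crossing-counting heuristic, and the synthetic signal are illustrative assumptions, not the classification method or data used in the project.

```python
import numpy as np

def breathing_rate_bpm(signal, fs):
    """Estimate breaths per minute by counting rising mean-crossings
    of the (slow) pressure oscillation."""
    x = np.asarray(signal, dtype=float) - np.mean(signal)
    crossings = np.sum((x[:-1] < 0) & (x[1:] >= 0))  # one per breath cycle
    duration_min = len(x) / fs / 60.0
    return crossings / duration_min

# synthetic stand-in signal: a 0.25 Hz sine, i.e. 15 breaths/min,
# sampled for 60 s at 10 Hz (no real sensor data is used here)
fs = 10.0
t = np.arange(0.0, 60.0, 1.0 / fs)
sig = np.sin(2 * np.pi * 0.25 * t)
rate = breathing_rate_bpm(sig, fs)   # roughly 15 breaths/min
```

A real signal would first need band-pass filtering to separate respiration from body movement before any such counting is meaningful.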
To analyze human sleep, it is necessary to identify the sleep stages occurring during sleep, their durations, and the sleep cycles. The gold standard procedure for this is polysomnography (PSG), which classifies the sleep stages based on the Rechtschaffen and Kales (R-K) method. Besides advantages such as high accuracy, this method has several disadvantages: among others, it is time-consuming and uncomfortable for the patient. Therefore, the development of further methods for sleep classification in addition to PSG is a promising topic for investigation, and the aim of this work is to present possible ways and goals for this development.
We present a new methodology for automatic selection and sizing of analog circuits demonstrated on the OTA circuit class. The methodology consists of two steps: a generic topology selection method supported by a “part-sizing” process and subsequent final sizing. The circuit topologies provided by a reuse library are classified in a topology tree. The appropriate topology is selected by traversing the topology tree starting at the root node. The decision at each node is gained from the result of the part-sizing, which is in fact a node-specific set of simulations. The final sizing is a simulation-based optimization. We significantly reduce the overall simulation effort compared to a classical simulation-based optimization by combining the topology selection with the part-sizing process in the selection loop. The result is an interactive user friendly system, which eases the analog designer’s work significantly when compared to typical industrial practice in analog circuit design. The topology selection method and sizing process are implemented as a tool into a typical analog design environment. The design productivity improvement achievable by our method is shown by a comparison to other design automation approaches.
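The tree-guided selection loop described above can be sketched as follows. The class names, the OTA sub-topologies, and the score callback standing in for the node-specific part-sizing simulations are all hypothetical, not the tool's actual API.

```python
# Hypothetical sketch of traversing a topology tree from the root,
# choosing at each node the child favoured by the part-sizing result.
class TopologyNode:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

def select_topology(node, part_sizing_score):
    """Descend the tree; part_sizing_score abstracts the node-specific
    set of simulations that drives each decision."""
    while node.children:
        node = max(node.children, key=part_sizing_score)
    return node

root = TopologyNode("OTA", [
    TopologyNode("single-stage", [TopologyNode("telescopic"),
                                  TopologyNode("folded-cascode")]),
    TopologyNode("two-stage", [TopologyNode("Miller")]),
])
# toy scores: pretend the simulations favour the folded-cascode branch
scores = {"single-stage": 0.9, "two-stage": 0.4,
          "telescopic": 0.3, "folded-cascode": 0.8, "Miller": 0.5}
best = select_topology(root, lambda n: scores[n.name])   # folded-cascode
```

Because only the nodes along one root-to-leaf path are simulated, the simulation effort grows with the tree depth rather than with the library size, which is the source of the claimed savings over flat simulation-based optimization.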
Asymmetric read/write storage technologies such as Flash are becoming a dominant trend in modern database systems. They introduce hardware characteristics and properties which are fundamentally different from those of traditional storage technologies such as HDDs.

Multi-Versioning Database Management Systems (MV-DBMSs) and Log-based Storage Managers (LbSMs) are concepts that can effectively address the properties of these storage technologies but are designed for the characteristics of legacy hardware. A critical component of MV-DBMSs is the invalidation model: commonly, transactional timestamps are assigned to the old and the new version, resulting in two independent (physical) update operations. Those entail multiple random writes as well as in-place updates, which are sub-optimal for new storage technologies in terms of both performance and endurance. Traditional page-append LbSM approaches alleviate random writes and immediate in-place updates, hence reducing the negative impact of the Flash read/write asymmetry. Nevertheless, they entail significant mapping overhead, leading to write amplification.

In this work we present an approach called Snapshot Isolation Append Storage Chains (SIAS-Chains) that employs a combination of multi-versioning, append storage management in tuple granularity, and a novel singly-linked (chain-like) version organization. SIAS-Chains features simplified buffer management and multi-version indexing, and introduces read/write optimizations to data placement on modern storage media. SIAS-Chains algorithmically avoids small in-place updates caused by in-place invalidation and converts them into appends. Every modification operation is executed as an append, and recently inserted tuple versions are co-located.
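A minimal sketch of the chain-like version organization, assuming the paper's scheme only in outline: every modification is an append, and each new version carries a back-pointer to the version it supersedes, so no separate in-place invalidation write is needed. The class and method names are illustrative.

```python
# Toy append-only store with singly-linked version chains per tuple.
class AppendLog:
    def __init__(self):
        self.log = []      # append-only storage, tuple granularity
        self.head = {}     # tuple id -> offset of the newest version

    def write(self, tid, payload):
        prev = self.head.get(tid)            # back-pointer doubles as
        self.log.append((tid, payload, prev))  # implicit invalidation
        self.head[tid] = len(self.log) - 1

    def read(self, tid):
        """Newest version via one lookup; nothing was ever updated in place."""
        return self.log[self.head[tid]][1]

    def versions(self, tid):
        """Walk the chain from newest to oldest."""
        off, out = self.head.get(tid), []
        while off is not None:
            _, payload, prev = self.log[off]
            out.append(payload)
            off = prev
        return out

db = AppendLog()
db.write(1, "a0"); db.write(1, "a1"); db.write(2, "b0")
```

Note how consecutive writes land adjacently in the log, which mirrors the co-location of recently inserted tuple versions mentioned in the abstract.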
A 3D face modelling approach for pose-invariant face recognition in a human-robot environment
(2017)
Face analysis techniques have become a crucial component of human-machine interaction in the fields of assistive and humanoid robotics. However, the variations in head-pose that arise naturally in these environments are still a great challenge. In this paper, we present a real-time capable 3D face modelling framework for 2D in-the-wild images that is applicable for robotics. The fitting of the 3D Morphable Model is based exclusively on automatically detected landmarks. After fitting, the face can be corrected in pose and transformed back to a frontal 2D representation that is more suitable for face recognition. We conduct face recognition experiments with non-frontal images from the MUCT database and uncontrolled, in the wild images from the PaSC database, the most challenging face recognition database to date, showing an improved performance. Finally, we present our SCITOS G5 robot system, which incorporates our framework as a means of image pre-processing for face analysis.
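The 3D Morphable Model fitting itself is beyond a short sketch, but the underlying idea of landmark-driven alignment can be illustrated in 2D with a least-squares similarity transform (Umeyama's method). The landmark coordinates below are made up, and this is not the paper's fitting procedure.

```python
import numpy as np

def similarity_align(src, dst):
    """Return (scale, R, t) minimizing ||scale * R @ src_i + t - dst_i||^2
    over all landmark pairs (Umeyama's closed-form solution)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    mu_s, mu_d = src.mean(0), dst.mean(0)
    s, d = src - mu_s, dst - mu_d
    cov = d.T @ s / len(src)
    U, S, Vt = np.linalg.svd(cov)
    R = U @ Vt                                  # assumes det(R) > 0
    scale = S.sum() * len(src) / (s ** 2).sum()
    t = mu_d - scale * R @ mu_s
    return scale, R, t

# frontal "template" landmarks and the same landmarks seen rotated/scaled
frontal = np.array([[0., 0.], [1., 0.], [1., 1.], [0., 1.], [0.5, 0.5]])
th = 0.4
rot = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
detected = 1.7 * frontal @ rot.T + np.array([3.0, -2.0])

scale, R, t = similarity_align(detected, frontal)
frontalized = scale * detected @ R.T + t        # maps back onto the template
```

In the noiseless case above the recovered transform is exact; with real detected landmarks it is the least-squares best fit.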
In a digitally controlled slope shaping system, reliable detection of both the voltage and the current slope is required to enable closed-loop control for various power switches independent of system parameters. In most state-of-the-art works, this is realized by monitoring the absolute voltage and current values. Better accuracy at lower DC power loss is achieved by passive sensing techniques that avoid DC paths from the high-voltage network into the sensing network. Using a high-speed analog-to-digital converter, the whole waveform of the transient derivative can be stored digitally and prepared for a predictive cycle-by-cycle regulation, without requiring high-precision digital differentiation algorithms. To obtain an accurate representation of the voltage and current derivative waveforms, system parasitics are investigated and classified into three categories: (1) component parasitics, which are identified by S-parameter measurements and extraction of equivalent circuit models, (2) PCB design issues related to the sensing circuit, and (3) interconnections between adjacent boards.
The contribution of this paper is an optimized sensing network based on this experimental study, supporting fast transition slopes up to 100 V/ns and 1 A/ns and beyond, which makes the sensing technique attractive for slope shaping of fast-switching devices such as modern-generation IGBTs, CoolMOS™ transistors, and SiC MOSFETs. Measurements of the optimized dv/dt and di/dt setups are demonstrated for a hard-switched IGBT power stage.
A concept for a slope shaping gate driver IC is proposed, used to establish control over the slew rates of current and voltage during the turn-on and turn-off switching transients.
It combines the high speed and linearity of a fully integrated closed-loop analog gate driver, which is able to perform real-time regulation, with the advantages of digital control, such as flexibility and parameter independence, operating in a predictive cycle-by-cycle regulation. In this work, the analog gate driver integrated circuit is partitioned into functional blocks and modeled in the small-signal domain, including the non-linearity of parameters. An analytical stability analysis has been performed in order to ensure full functionality of the system when controlling a modern-generation IGBT and a superjunction MOSFET. Major parameters of influence, such as the gate resistor and the summing-node capacitance, are investigated to achieve stable control. The large-signal behavior, investigated by simulations of a transistor-level design, verifies the correct operation of the circuit. Hence, the gate driver can be designed for robust operation.
Software startups often make assumptions about the problems and customers they are addressing as well as the market and the solutions they are developing. Testing the right assumptions early is a means to mitigate risks. Approaches such as Lean Startup foster this kind of testing by applying experimentation as part of a constant build-measure-learn feedback loop. The existing research on how software startups approach experimentation is very limited. In this study, we focus on understanding how software startups approach experimentation and identify challenges and advantages with respect to conducting experiments. To achieve this, we conducted a qualitative interview study. The initial results show that startups often spend a disproportionate amount of time on creating solutions without testing critical assumptions. The main reasons are a lack of awareness that these assumptions can be tested early, and a lack of knowledge and support on how to identify, prioritize, and test them. However, startups understand the need for testing risky assumptions and are open to conducting experiments.
Most companies manage their operational marketing without quantifying the effectiveness and efficiency of the instruments they employ - a critical finding given the size of typical marketing budgets. How can this be remedied? A look at one of the key concepts of Industrie 4.0 points to a possible way forward.
As part of this in-depth scientific study, IT risk management is evaluated on the basis of existing approaches. The question of the extent to which IT risk management can support a company is addressed and then illustrated by means of two case studies.
While the topic of Customer Relationship Management (CRM) has generated an increasing amount of research attention in recent years, still lacking is a comprehensive overview that helps to explain how companies can implement CRM successfully. To address these issues, this article identifies and discusses factors that are associated with a greater degree of CRM success. More specifically, we identify and discuss determinants relating to strategy, human resources, information management, structure and processes, as well as specific factors within the implementation phase which help to improve CRM success. First, our results indicate that the implementation of CRM processes is associated with better company performance, especially at the relationship initiation and maintenance stage. Second, the findings emphasize a predominant influence of firm-based factors vis-à-vis structural, industry-, and customer-based factors. Furthermore, cross-functional CRM teams and a top management that feels responsible for CRM projects help to improve CRM success. In addition, internal processes which are related to customer contact points have to be redesigned to enhance the interaction between employees and customers. The current article sheds more light on what really drives CRM success.
Wissen schaffen für klimafreundliche Energietechnik : das Erfolgsgeheimnis der Kraft-Wärme-Kopplung
(2017)
As a highly efficient power generation technology, combined heat and power (CHP) plays an important role in the energy transition and in climate protection. For buildings and companies in particular, CHP can be an economical and environmentally friendly way to generate electricity and heat. Planning, implementing, and operating CHP plants requires qualified tradespeople and engineers. For this reason, the Ministry of the Environment, Climate Protection and the Energy Sector, together with the Handwerkstag Baden-Württemberg ... ran the qualification course "Kraft-Wärme-Kopplung - Kompetenz für den Wärme- und Energiemarkt von heute und morgen" in 2015 and 2016... Owing to the success of the seminar series, it will be continued in 2017 and beyond.
A new method for the analysis of movement-dependent parasitics in full custom designed MEMS sensors
(2017)
Due to the lack of sophisticated microelectromechanical systems (MEMS) component libraries, highly optimized MEMS sensors are currently designed using a polygon-driven design flow. The strength of this design flow is the accurate mechanical simulation of the polygons by finite element (FE) modal analysis. The result of the FE modal analysis is included in the system model together with the data of the (mechanical) static electrostatic analysis. However, the system model lacks the dynamic parasitic electrostatic effects arising from the electric coupling between the wiring and the moving structures. In order to include these effects in the system model, we present a method which enables quasi-dynamic parasitic extraction with respect to in-plane movements of the sensor structures. The method is embedded in the polygon-driven MEMS design flow using standard EDA tools. In order to take the influences of the fabrication process into account, such as etching process variations, the method combines the FE modal analysis and the fabrication process simulation data. This enables the analysis of dynamically changing electrostatic parasitic effects with respect to movements of the mechanical structures. Additionally, the result can be included in the system model, allowing the simulation of positive feedback of the electrostatic parasitic effects to the mechanical structures.
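To see why such parasitics are movement-dependent at all, consider the simplest possible model: the coupling between a fixed wire and a moving structure treated as a parallel-plate capacitor whose gap shrinks with in-plane displacement. The geometry values below are illustrative, not extracted from any real process.

```python
# Toy movement-dependent parasitic: parallel-plate capacitance whose
# gap varies with the displacement x of the moving structure.
EPS0 = 8.854e-12          # vacuum permittivity, F/m

def parasitic_cap(area_m2, gap_m, x_m):
    """Capacitance with the movable plate displaced x toward the wire."""
    return EPS0 * area_m2 / (gap_m - x_m)

gap = 2e-6                # 2 um nominal gap (illustrative)
area = 100e-6 * 10e-6     # 100 um x 10 um overlap (illustrative)
c_rest = parasitic_cap(area, gap, 0.0)
c_moved = parasitic_cap(area, gap, 0.5e-6)   # 0.5 um in-plane displacement
```

Even this crude model shows the nonlinearity that makes a quasi-dynamic extraction (capacitance evaluated at several displacement points) necessary, rather than a single static value.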
In the present paper we demonstrate a novel approach to handling small updates on Flash called In-Place Appends (IPA). It allows the DBMS to revisit the traditional write behavior on Flash. Instead of writing whole database pages upon an update in an out-of-place manner on Flash, we transform those small updates into update deltas and append them to a reserved area on the very same physical Flash page. In doing so we utilize the commonly ignored fact that under certain conditions Flash memories can support in-place updates to Flash pages without a preceding erase operation.
The approach was implemented under Shore-MT and evaluated on real hardware. Under standard update-intensive workloads we observed 67% less page invalidations resulting in 80% lower garbage collection overhead, which yields a 45% increase in transactional throughput, while doubling Flash longevity at the same time. The IPA outperforms In-Page Logging (IPL) by more than 50%.
We showcase a Shore-MT based prototype of the above approach, operating on real Flash hardware – the OpenSSD Flash research platform. During the demonstration we allow the users to interact with the system and gain hands-on experience of its performance under different demonstration scenarios. These involve various workloads such as TPC-B, TPC-C or TATP.
In the present paper we demonstrate a novel technique for applying the recently proposed approach of In-Place Appends (IPA) – overwrites on Flash without a prior erase operation. IPA can be applied selectively: only to DB-objects that have frequent and relatively small updates. To do so we couple IPA to the concept of NoFTL regions, allowing the DBA to place update-intensive DB-objects into special IPA-enabled regions. The decision about region configuration can be (semi-)automated by an advisor analyzing DB-log files in the background.
We showcase a Shore-MT based prototype of the above approach, operating on real Flash hardware. During the demonstration we allow the users to interact with the system and gain hands-on experience under different demonstration scenarios.
Under update-intensive workloads (TPC, LinkBench), small updates dominate the write behavior; e.g., 70% of all updates change fewer than 10 bytes across all TPC OLTP workloads. These are typically performed as in-place updates and result in random writes in page granularity, causing major write overhead on Flash storage, write amplification of several hundred times, and lower device longevity.
In this paper we propose an approach that transforms those small in-place updates into small update deltas that are appended to the original page. We utilize the commonly ignored fact that modern Flash memories (SLC, MLC, 3D NAND) can handle appends to already programmed physical pages by using various low-level techniques such as ISPP to avoid expensive erases and page migrations. Furthermore, we extend the traditional NSM page-layout with a delta-record area that can absorb those small updates. We propose a scheme to control the write behavior as well as the space allocation and sizing of database pages.
The proposed approach has been implemented under Shore-MT and evaluated on real Flash hardware (OpenSSD) and a Flash emulator. Compared to In-Page Logging it performs up to 62% less reads and writes and up to 74% less erases on a range of workloads. The experimental evaluation indicates: (i) significant reduction of erase operations resulting in twice the longevity of Flash devices under update-intensive workloads; (ii) 15%-60% lower read/write I/O latencies; (iii) up to 45% higher transactional throughput; (iv) 2x to 3x reduction in overall write amplification.
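The delta-append idea can be sketched as follows. The page layout, slot count, and rewrite bookkeeping are simplifying assumptions for illustration, not the actual Shore-MT implementation or Flash command sequence.

```python
# Sketch: a page with an NSM record area plus a reserved delta area.
# Small updates become appended deltas; only when the delta area is
# full does the page get rewritten (the expensive path on Flash).
class IpaPage:
    def __init__(self, records, delta_slots=4):
        self.records = dict(records)   # record area (written once)
        self.deltas = []               # reserved delta-record area
        self.delta_slots = delta_slots
        self.rewrites = 0              # stands in for erase/program cycles

    def update(self, key, value):
        if len(self.deltas) < self.delta_slots:
            self.deltas.append((key, value))   # in-page append, no erase
        else:
            self.apply_deltas()                # fold deltas into records
            self.records[key] = value
            self.rewrites += 1                 # full page rewrite

    def apply_deltas(self):
        for k, v in self.deltas:
            self.records[k] = v
        self.deltas = []

    def read(self, key):
        for k, v in reversed(self.deltas):     # newest delta wins
            if k == key:
                return v
        return self.records[key]

page = IpaPage({"a": 1, "b": 2})
for k, v in [("a", 10), ("a", 11), ("b", 20), ("b", 21)]:
    page.update(k, v)                          # four updates, zero rewrites
```

The point of the sketch is the ratio it makes visible: many logical updates per physical page rewrite, which is where the erase savings come from.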
In this paper we build on our research in data management on native Flash storage. In particular, we demonstrate the advantages of intelligent data placement strategies. To effectively manage physical Flash space and organize the data on it, we utilize novel storage structures such as regions and groups. These are coupled to common DBMS logical structures and thus require no extra overhead for the DBA. The experimental results indicate an improvement of up to 2x, which doubles the longevity of Flash SSDs. During the demonstration the audience can experience the advantages of the proposed approach on real Flash hardware.
Although the advantages of value-based pricing have been known for years, it is gaining ground only slowly. The first study of pricing behavior by business type shows that 35 years of pricing research and consulting have, for the first time, weakened the dominance of cost-based pricing. The author ventures an explanation and encourages greater market orientation.
In an exploratory study of the online communication of large and medium-sized B2B companies from the German state of Baden-Württemberg, the message content communicated via their websites and the websites' appeal to international prospects were analyzed. The study revealed that many basic content items were absent, making the sites less attractive for further exploration and making it difficult for international prospects to enter into a dialog, become leads, and eventually customers. The subsequent survey elicited organizational backgrounds, available resources, and objectives for online communication. It traced the deficiencies back to a lack of understanding of the importance of digital communication for lead generation and the customer journey in general, the absence of a communication strategy, a lack of urgency, and a lack of resources to implement desired changes and additions to communication content.
The demonstration project Virtual Power Plant Neckar-Alb is constructing a Virtual Power Plant (VPP) demonstration site at the Reutlingen University campus. The VPP demonstrator integrates a heterogeneous set of distributed energy resources (DERs), which are connected to a control infrastructure and an energy management system. This paper describes the components and the architecture of the demonstrator and presents strategies for demonstrating multiple optimization and control systems with different control paradigms.
The knee is the joint most frequently affected by injuries in skiing. By measuring relevant factors such as knee angle and muscle activity, the likelihood of an impending injury can be assessed. These data can serve as the basis for triggering an appropriate reaction of the ski's binding system. By automatically releasing the ski binding when limit values are exceeded, the knee is relieved in order to prevent possible injuries. Within a research project of TU München, a team from TU München and Reutlingen University developed ski tights with which the knee angle can be captured in real time and transmitted wirelessly to an evaluation unit.
The European Economic and Monetary Union (EMU) requires further stabilization, since its institutional rules do not exert sufficient binding force on the member states in the long run. The challenge is to regain the credibility of the framework that was lost in the course of the European sovereign debt crisis that began in 2010. To preserve the monetary union, the "no bailout" clause of Art. 125 TFEU must be credibly applicable in primary law, and the rules of secondary law - among them the Stability and Growth Pact, the Fiscal Compact, and the European Semester - must be enforced more independently and more quickly in a legally binding manner. The "sovereign insolvency mechanism" proposed here, prudently fitted into the European framework and combined with an "exit clause" that is legally binding as a last resort, would be one possible solution. A failure of the EMU can be averted, but the lack of will to reform could pave the way for the monetary union's disintegration.
The persistently high debt levels in some member states of the European Economic and Monetary Union continue to raise fears of sovereign insolvencies. To cope with the problems that have arisen, and to prevent such a situation from occurring in the first place, the author considers a sovereign insolvency regime necessary, with bailouts by the other member states only in emergencies. He proposes a resolution mechanism for over-indebted euro countries based on a 2016 concept of the German Council of Economic Experts (Sachverständigenrat).
The purpose of this paper is to study the impact of transparency on the political budget cycle (PBC) over time and across countries. So far, the literature on electoral cycles finds evidence that cycles depend on the stage of an economy. However, the author shows – for the first time – a dependence of the budget cycle on transparency. The author uses a new data set consisting of 99 developing and 34 Organization for Economic Cooperation and Development countries. First, the author develops a model and demonstrates that transparency mitigates the political cycles. Second, the author confirms the proposition through an econometric assessment. The author uses time series data from 1970 to 2014 and discovers smaller cycles in countries with higher transparency, especially G8 countries.
The European Economic and Monetary Union (EMU) has been in turmoil for more than six years. The present governance rules do not seem to solve the problems either permanently or effectively, and there is no vision for the future of Europe in the 21st century. This article describes a realignment of economic governance which does not necessarily lead to a transfer union or a political union, yet solves the current and future challenges. In fact, the redesign of the present rules is the most likely, as well as legally and economically viable, option today. The key idea is the detachment from the compulsive idea of an ever closer union. This vision, however, requires boldness towards greater flexibility, together with an exit clause or a state insolvency procedure for non-compliant member states.
This paper models the political budget cycle with stochastic differential equations. The paper highlights the development of future volatility of the budget cycle. In fact, I confirm the proposition of a less volatile budget cycle in future. Moreover, I show that this trend is even amplified due to higher transparency. These findings are new evidence in the literature on electoral cycles. I calibrate a rigorous stochastic model on public deficit-to-GDP data for several countries from 1970 to 2012.
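A toy version of such a model can be simulated with the Euler-Maruyama scheme. The mean-reverting form below and the way transparency damps the volatility term are illustrative assumptions for exposition, not the paper's calibrated model or parameter values.

```python
import numpy as np

def simulate_deficit(d0, mean, speed, sigma, tau, T=40.0, dt=0.25, seed=0):
    """Euler-Maruyama path of dD = speed*(mean - D) dt + sigma/(1+tau) dW,
    where tau is a (hypothetical) transparency index damping volatility."""
    rng = np.random.default_rng(seed)
    n = int(round(T / dt))
    d = np.empty(n + 1)
    d[0] = d0
    vol = sigma / (1.0 + tau)      # higher transparency -> smaller shocks
    for i in range(n):
        dw = rng.normal(0.0, np.sqrt(dt))          # Brownian increment
        d[i + 1] = d[i] + speed * (mean - d[i]) * dt + vol * dw
    return d

# same random draws, different transparency levels
low_t = simulate_deficit(3.0, 3.0, 0.5, 1.0, tau=0.0)
high_t = simulate_deficit(3.0, 3.0, 0.5, 1.0, tau=4.0)
```

With a shared seed, the high-transparency path is a scaled-down copy of the low-transparency one, making the volatility-damping effect directly visible.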
This paper studies whether a monetary union can be managed solely by a rule based approach. The Five Presidents’ Report of the European Union rejects this idea. It suggests a centralisation of powers. We analyse the philosophy of policy rules from the vantage point of the German economic school of thought. There is evidence that a monetary union consisting of sovereign states is well organised by rules, together with the principle of subsidiarity. The root cause of the euro crisis is rather the weak enforcement of rules, compounded by structural problems. Therefore, we suggest a genuine rule-based paradigm for a stable future of the Economic and Monetary Union.
Incubators in multinational corporations : development of a corporate incubator operator model
(2017)
This paper analyzes the components of a corporate incubator operator model in multinational companies. Three relevant phases were identified: pre-incubation, incubation, and exit. Each phase contains different criteria that represent critical success factors for a corporate incubator, based on theoretical findings and lessons learned from practice. During the pre-incubation phase, companies should define their need for a corporate incubator, the origin of ideas, and the selection criteria for incubator tenants. The actual incubation phase refers to the incubator program, which should be flexible with respect to each tenant. Furthermore, resource allocation plays an important role during the incubator program. Exit options after a successful incubation differ for internal ideas and external start-ups, as well as with the objective of the incubator. The research is based on a comprehensive screening of the existing incubator literature and a qualitative content analysis of statements from eight experts of international corporate incubators.
Newly developed active pharmaceutical ingredients (APIs) are often poorly soluble in water. As a result, the bioavailability of the API in the human body is reduced. One approach to overcome this restriction is the formulation of amorphous solid dispersions (ASDs), e.g., by hot-melt extrusion (HME). In this way, the poorly soluble crystalline form of the API is transferred into a more soluble amorphous form. To reach this aim in HME, the APIs are embedded in a polymer matrix. The resulting amorphous solid dispersions may contain small amounts of residual crystallinity and have a tendency to recrystallize. For the controlled release of the API in the final drug product, the amount of crystallinity has to be known. This review assesses the available analytical methods that have recently been used for the characterization of ASDs and the quantification of crystalline API content. Well-established techniques like near- and mid-infrared spectroscopy (NIR and MIR, respectively) and Raman spectroscopy, as well as emerging ones like UV/VIS, terahertz, and ultrasonic spectroscopy, are considered in detail. Furthermore, their advantages and limitations are discussed with regard to general practical applicability as process analytical technology (PAT) tools in industrial manufacturing. The review focuses on spectroscopic methods which have proven most suitable for in-line and on-line process analytics. A further aspect is spectroscopic techniques that have been or could be integrated into an extruder.
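As a minimal sketch of how such spectroscopic data could be turned into a crystallinity estimate, the example below fits a univariate calibration line relating a hypothetical crystallinity-sensitive band intensity to the known crystalline fraction of reference samples. All numbers are invented; real PAT applications typically use multivariate methods such as PLS rather than a single band.

```python
import numpy as np

# Hypothetical reference samples: known crystalline API fraction (%)
# and the measured intensity of a crystallinity-sensitive band (a.u.)
known_fraction = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
band_intensity = np.array([0.02, 0.13, 0.24, 0.46, 0.90])

# least-squares calibration line: intensity = slope * fraction + intercept
slope, intercept = np.polyfit(known_fraction, band_intensity, 1)

def predict_crystallinity(intensity):
    """Invert the calibration line for an unknown sample."""
    return (intensity - intercept) / slope

estimate = predict_crystallinity(0.35)   # % crystalline for a new reading
```

In practice the calibration would be validated against a reference method (e.g. XRPD or DSC) and checked for matrix effects of the polymer.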
The digital transformation of the automotive industry has a significant impact on how development processes need to be organized in the future. Dynamic market and technological environments require capabilities to react to changes and to learn fast. Agile methods are a promising approach to address these needs, but they are not tailored to specific characteristics of the automotive domain such as product line development. Although there have been efforts to apply agile methods in the automotive domain for many years, significant and widespread adoption has not yet taken place. The goal of this literature review is to gain an overview and a better understanding of agile methods for embedded software development in the automotive domain, especially with respect to product line development. A mapping study was conducted to analyze the relation between agile software development, embedded software development in the automotive domain, and software product line development. Three research questions were defined and 68 papers were evaluated. The study shows that agile and product line development approaches tailored for the automotive domain are not yet fully explored in the literature. In particular, literature on the combination of agile and product line development is rare. Most of the examined combinations are customizations of generic approaches or approaches stemming from other domains. Although only a few approaches for combining agile and software product line development in the automotive domain were found, these findings were valuable for identifying research gaps and provide insights into how existing approaches can be combined, extended, and tailored to suit the characteristics of the automotive domain.
Context: The current situation and future scenarios of the automotive domain require a new strategy to develop high quality software in a fast pace. In the automotive domain, it is assumed that a combination of agile development practices and software product lines is beneficial, in order to be capable to handle high frequency of improvements. This assumption is based on the understanding that agile methods introduce more flexibility in short development intervals. Software product lines help to manage the high amount of variants and to improve quality by reuse of software for long term development.
Goal: This study derives a better understanding of the expected benefits for a combination. Furthermore, it identifies the automotive specific challenges that prevent the adoption of agile methods within the software product line.
Method: A survey based on 16 semi-structured interviews from the automotive domain, an internal workshop with 40 participants, and a discussion round at the ESE Congress 2016. The results are analyzed by means of thematic coding.
The ability to develop and deploy high-quality software at high speed is increasingly relevant to the competitiveness of car manufacturers. Agile practices have shown benefits such as faster time to market in several application domains. It therefore seems promising to carefully adopt agile practices in the automotive domain as well. This article presents findings from an interview-based qualitative survey aimed at understanding the perceived forces that support agile adoption. In particular, it focuses on embedded software development for electronic control units in the automotive domain.
Intermediate filament reorganization dynamically influences cancer cell alignment and migration
(2017)
The interactions between a cancer cell and its extracellular matrix (ECM) have been the focus of an increasing amount of investigation. The role of the intermediate filament keratin in cancer has also been coming into focus of late, but more research is needed to understand how this piece fits in the puzzle of cytoskeleton-mediated invasion and metastasis. In Panc-1 invasive pancreatic cancer cells, keratin phosphorylation in conjunction with actin inhibition was found to be sufficient to reduce cell area below that of either treatment alone. We then analyzed intersecting keratin and actin fibers in the cytoskeleton of cyclically stretched cells and found no directional correlation. The role of keratin organization in Panc-1 cellular morphological adaptation and directed migration was then analyzed by culturing cells on cyclically stretched polydimethylsiloxane (PDMS) substrates, nanoscale grates, and rigid pillars. In general, the reorganization of the keratin cytoskeleton allows the cell to become more ‘mobile’, exhibiting faster and more directed migration and orientation in response to external stimuli. By combining keratin network perturbation with a variety of physical ECM signals, we demonstrate the interconnected nature of the architecture inside the cell and the scaffolding outside of it, and highlight the key elements facilitating cancer cell-ECM interactions.
There is no doubt that the amplification of channel integration towards an omnichannel structure is a powerful idea whose time has finally come. The digitally cross-linked world calls for all-encompassing, ubiquitous, and unobtrusive future services. In the concomitant, increasingly competitive market, retailers are starting to lay the foundation for omnichannel, meeting the expectations of a digitally savvy audience that wants its shopping experience to be as seamless and uncomplicated as possible. Nevertheless, recent research shows that there are still many avenues for further research on omnichannel. Until now, the performance of companies has been considered solely from a supplier's point of view. It is therefore interesting to find out whether the desire to meet increased customer expectations is also recognized by the customers themselves. This paper seeks to answer how purchasing behavior has changed and what customers demand. In addition, it elaborates on the opportunities promoted by omnichannel. Building on these effects, the paper finally shows how the omnichannel performance of fashion and lifestyle retailers can be measured from a consumers' perspective by developing a dedicated index. The study is confined to four fashion and lifestyle retailers: Hugo Boss AG, Levi Strauss & Co, Pull and Bear, and COS. Using the method of mystery shopping and a multi-item checklist comprising 54 key performance indicators, the paper examines to what extent the four selected retailers provide a seamless customer journey across the five decision-making phases.
The best fully automated analysis process achieves even better classification results than the established manual process. The best algorithms for the three analysis steps are (i) SGLTR (Savitzky-Golay Laplace operator filter thresholding regions) and LM (Local Maxima) for automated peak identification, (ii) EM (Expectation Maximization) clustering and DBSCAN (Density-Based Spatial Clustering of Applications with Noise) for the clustering step, and (iii) RF (Random Forest) for multivariate classification. Thus, automated methods can replace the manual steps in the analysis process and enable an unbiased, high-throughput use of the technology.
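As an illustrative sketch of the local-maxima (LM) peak identification step mentioned above: the paper's exact implementation is not given here, so the window size, threshold, and function name below are assumptions, not the authors' method.

```python
def local_maxima(signal, min_height=0.0, half_window=1):
    """Return indices of samples that are strict local maxima.

    A sample counts as a peak if it exceeds every neighbour within
    `half_window` positions and lies above `min_height`. Both
    parameters are illustrative assumptions.
    """
    peaks = []
    n = len(signal)
    for i in range(half_window, n - half_window):
        v = signal[i]
        if v < min_height:
            continue
        neighbours = signal[i - half_window:i] + signal[i + 1:i + 1 + half_window]
        if all(v > x for x in neighbours):
            peaks.append(i)
    return peaks

# toy spectrum with two prominent peaks
spectrum = [0.1, 0.3, 1.0, 0.4, 0.2, 0.5, 1.2, 0.6, 0.1]
print(local_maxima(spectrum, min_height=0.8))  # → [2, 6]
```

In a real pipeline, a smoothing step (such as the Savitzky-Golay filtering named in SGLTR) would typically precede this detection to suppress noise-induced maxima.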
Reconstructing 3D face shape from a single 2D photograph, as well as from video, is an inherently ill-posed problem with many ambiguities. One way to resolve some of the ambiguities is to use a 3D face model to aid the task. 3D morphable face models (3DMMs) are among the state-of-the-art methods for 3D face reconstruction, also called 3D model fitting. However, existing methods have severe limitations, and most of them have not been trialled on in-the-wild data. Current analysis-by-synthesis methods form complex nonlinear optimisation processes, and optimisers often get stuck in local optima. Further, most existing methods are slow, requiring on the order of minutes to process one photograph.
This thesis presents an algorithm to reconstruct 3D face shape from a single image, as well as from sets of images or video frames, in real-time. We introduce a solution for linear fitting of a PCA shape identity model and expression blendshapes to 2D facial landmarks. To improve the accuracy of the shape, a fast face contour fitting algorithm is introduced. These components are run iteratively, resulting in a fast, linear shape-to-landmarks fitting algorithm. The algorithm is specifically designed to fit landmarks obtained from in-the-wild images, tackling imaging conditions that occur in such images, like facial expressions and the mismatch of 2D–3D contour correspondences. It achieves the shape reconstruction accuracy of much more complex, nonlinear state-of-the-art methods, while being multiple orders of magnitude faster.
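The linear shape-to-landmarks fit described above is, at its core, an ordinary least-squares problem: find model coefficients a minimising ||landmarks - (mean + B·a)||². The toy below is a minimal pure-Python sketch under that assumption, with a hypothetical two-component basis; the thesis' actual model dimensionality, camera projection, and contour handling are omitted.

```python
def fit_pca_coefficients(landmarks, mean, basis):
    """Least-squares fit of a two-component linear (PCA-style) model
    to observed landmark coordinates: minimise ||landmarks - (mean + B a)||^2.

    `basis` holds two basis vectors (an illustrative model size); the
    normal equations (B^T B) a = B^T r are solved with an explicit 2x2
    inverse to stay dependency-free.
    """
    r = [l - m for l, m in zip(landmarks, mean)]  # residual after removing the mean
    b0, b1 = basis
    # entries of B^T B and B^T r
    a00 = sum(x * x for x in b0)
    a01 = sum(x * y for x, y in zip(b0, b1))
    a11 = sum(y * y for y in b1)
    g0 = sum(x * e for x, e in zip(b0, r))
    g1 = sum(y * e for y, e in zip(b1, r))
    det = a00 * a11 - a01 * a01
    return ((a11 * g0 - a01 * g1) / det,
            (a00 * g1 - a01 * g0) / det)

# with an orthonormal toy basis, the coefficients are recovered exactly
coeffs = fit_pca_coefficients([2.0, 0.5, 0.0, 0.0],
                              [0.0, 0.0, 0.0, 0.0],
                              ([1.0, 0.0, 0.0, 0.0], [0.0, 1.0, 0.0, 0.0]))
print(coeffs)  # → (2.0, 0.5)
```

A real fitter would additionally regularise the coefficients by the PCA eigenvalues and alternate this linear solve with pose and contour-correspondence updates, as the iterative scheme above suggests.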
Second, we address the problem of fitting to sets of multiple images of the same person, as well as monocular video sequences. We extend the proposed shape-to-landmarks fitting to multiple frames by using the knowledge that all images are from the same identity. To recover facial texture, the approach uses texture from the original images, instead of employing the often-used PCA albedo model of a 3DMM. We employ an algorithm that merges texture from multiple frames in real-time based on a weighting of each triangle of the reconstructed shape mesh.
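The per-triangle texture merging can be pictured as a weighted average across frames. The sketch below assumes one scalar texture value and one weight per triangle per frame; the thesis' actual weighting formula (e.g. based on triangle visibility or orientation) is not reproduced here, so the weights are purely illustrative.

```python
def merge_triangle_textures(frames, weights):
    """Blend per-triangle texture values from multiple frames.

    `frames`  : list of per-frame texture values, one value per triangle
    `weights` : matching per-frame, per-triangle weights (hypothetical;
                e.g. derived from how well each frame sees a triangle)
    Returns the weighted average per triangle; a triangle unseen in all
    frames gets 0.0.
    """
    n_tri = len(frames[0])
    merged = []
    for t in range(n_tri):
        num = sum(f[t] * w[t] for f, w in zip(frames, weights))
        den = sum(w[t] for w in weights)
        merged.append(num / den if den else 0.0)
    return merged

# two frames, three triangles; only the second frame sees triangle 2
frames  = [[0.2, 0.4, 0.9], [0.4, 0.4, 0.1]]
weights = [[1.0, 1.0, 0.0], [1.0, 1.0, 1.0]]
print(merge_triangle_textures(frames, weights))
```

Because each triangle is merged independently, this scheme can be updated incrementally as new frames arrive, which is what makes real-time operation plausible.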
Last, we make the proposed real-time 3D morphable face model fitting algorithm available as open-source software. In contrast to the ubiquitously available 2D-based face models and code, there is a general lack of software for 3D morphable face model fitting, hindering widespread adoption. The library thus constitutes a significant contribution to the community.
We present a fully automatic approach to real-time 3D face reconstruction from monocular in-the-wild videos. Using cascaded-regressor-based face tracking and 3D morphable face model shape fitting, we obtain a semi-dense 3D face shape. We further use the texture information from multiple frames to build a holistic 3D face representation from the video footage. Our system captures facial expressions and does not require any person-specific training. We demonstrate the robustness of our approach on the challenging 300 Videos in the Wild (300-VW) dataset. Our real-time fitting framework is available as an open-source library at http://4dface.org.
Most of the touch surfaces currently found in everyday life have been realized using complex and costly technologies. Especially for the application scenario of a touch floor, which usually calls for an exceptionally large touch surface, more cost-effective implementation options are sought. This paper serves as a starting point for implementing a low-cost touch floor intended to support the collaborative work of a project team. Based on an analysis of the state of the art of touch technologies and a subsequent evaluation, the touch technology best suited for realizing this low-cost touch floor is derived. The evaluation shows that optical touch technologies in particular, especially vision-based ones, are suitable for implementing cost-effective large touch surfaces.
Background. The application of lean management is standard in many companies all over the world. It is used to continuously optimise existing production processes and to reduce the complexity of administrative processes. Unfortunately, in higher education, the awareness of lean management as a highly effective methodology is quite low.
Research aims. The research aim is to show how the lean strategy can be applied in university environments. Finally, this paper addresses the question of why it is so difficult to implement lean in a university environment and how an institution of higher education can move towards becoming a lean university.
Methodology. Based on a literature review, five key lean principles are presented and examples of their implementation are discussed using short case studies from our own institution. We also compare our findings with those in the literature.
Key findings. Lean offers the chance to improve the management of higher education institutions. This requires a commitment on the part of the university's top management to convince all stakeholders that a culture of lean helps the institution adapt to the rapidly changing environment of higher education.
Whoever invests in a company does so in order to earn money in the future and expects a risk-adequate return. However, selecting the key figures that make this increase in value transparent is far from trivial, because they determine whether corporate goals are set correctly and whether the right incentives are given to management.
Revenue and profits stagnate at a high level, and yet the share price and earnings per share rise, a development that can be observed at Apple or eBay, for example. Shareholders should know what arithmetic lies behind such developments and which methods best allow them to determine a company's value.
In retail environments, consumers commonly evaluate products while standing on some type of flooring and concurrently being exposed to music; however, no study has examined the interaction of these two atmospheric cues. To bridge this gap, this research examines whether retailers can benefit from creating multisensory atmospheric congruent rather than incongruent retail environments of flooring and music. The results of an experiment in a real retail store reveal positive effects of multisensory congruent retail environments (e.g., soft music combined with soft flooring) on product evaluations. This study provides a new process explanation with consumers’ purchase-related self-confidence mediating these effects. Specifically, consumers in congruent rather than incongruent retail environments experience more purchase-related self-confidence, which in turn leads to more favorable product evaluations. Furthermore, this study shows that consumers with a low rather than a high preference for haptic information are influenced more by multisensory atmospheric congruence when evaluating a product haptically.
Despite the significant potential offered by the powder coating process for finishing wood-based materials, until now it has been used almost exclusively for coating Medium Density Fiber Board (MDF). A research project aims to develop processes and substrate materials that will allow lightweight boards to be powder coated.
In times of dynamic markets, enterprises have to be agile in order to react quickly to market influences. Due to the increasing digitization of products, enterprise IT is often affected when business models change. Enterprise Architecture Management (EAM) targets a holistic view of the enterprise's IT and its relations to the business. However, Enterprise Architectures (EA) are complex structures consisting of many layers, artifacts, and relationships between them. Thus, analyzing EAs is a very complex task for stakeholders. Visualizations are common vehicles to support analysis, but in practice visualization capabilities lack flexibility and interactivity. A solution to improve the support of stakeholders in analyzing EAs might be the application of visual analytics. Starting from a systematic literature review, this article investigates the features of visual analytics relevant in the context of EAM.
In medicine, various maturity models exist that can support the digitalization of hospitals. The requirements for a maturity model for this purpose cover aspects from both general and specific areas of the hospital. An analysis of the maturity models HIN, CCMM, EMRAM, and O-EMRAM reveals large gaps in the operating room area as well as missing aspects in the emergency department. No comprehensive maturity model was found. A combination of HIN and CCMM could cover almost all areas sufficiently. Additional supplements from specialized maturity models, or even the development of a comprehensive maturity model, would be worthwhile.