How can the increased share of fluctuating electricity generated by photovoltaics and wind power be balanced in the grid? Biogas and biomethane plants are interesting technological options for stabilizing the power grid. However, the conversion of biomass to methane is incomplete owing to a hydrogen deficit in the biogas reactor. Various approaches are therefore currently being pursued to increase the usable gas yield. This thesis examines which options exist for increasing the methane yield of biogas plants and how they compare ecologically and economically. First, the current state of research and development on increasing methane yield in biogas plants is presented, and the various processes for hydrogen production and methanation are described. A comparison identifies the most advantageous processes. These form the basis for the ecological and economic assessment of four selected scenarios. For the economically most advantageous scenario, the CO2 reduction potential is scaled to the entire German market. Finally, further research and development needs in this field are identified, and political framework conditions and their effects on biogas technology are critically examined.
Business process management and IT-supported processes are a topical subject. Finding a business process system that best implements one's processes is not easy and takes considerable time. This article provides a recommendation for an open source system. Four selected open source workflow management systems are tested and analyzed. The main evaluation criteria are compiled in a criteria catalogue and weighted by experts according to their importance. Finally, the systems are evaluated against these criteria, and the best-rated system is recommended.
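A weighted-scoring evaluation of the kind described above can be sketched as follows; the criteria, weights, and ratings below are invented for illustration and are not the catalogue used in the article:

```python
# Hypothetical sketch of a weighted criteria catalogue: each criterion has an
# expert-assigned importance weight, each system a rating per criterion.

def weighted_score(ratings, weights):
    """Sum of rating * weight over all criteria in the catalogue."""
    return sum(ratings[c] * weights[c] for c in weights)

def recommend(systems, weights):
    """Name of the system with the highest weighted score."""
    return max(systems, key=lambda name: weighted_score(systems[name], weights))

# Invented criteria, weights (summing to 1), and ratings on a 1-5 scale.
weights = {"usability": 0.4, "documentation": 0.3, "extensibility": 0.3}
systems = {
    "SystemA": {"usability": 4, "documentation": 3, "extensibility": 5},
    "SystemB": {"usability": 5, "documentation": 4, "extensibility": 2},
}

best = recommend(systems, weights)  # SystemA scores 4.0, SystemB 3.8
```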
Willingness-to-pay for alternative fuel vehicle characteristics : a stated choice study for Germany
(2016)
In the light of European energy efficiency and clean air regulations, as well as an ambitious electric mobility goal of the German government, we examine consumer preferences for alternative fuel vehicles (AFVs) based on a Germany-wide discrete choice experiment among 711 potential car buyers. We estimate consumers’ willingness-to-pay and compensating variation (CV) for improvements in vehicle attributes, also taking taste differences in the population into account by applying a latent class model with 6 distinct consumer segments. Our results indicate that about one third of consumers are oriented towards at least one AFV option, with almost half of them being AFV-affine, showing a high probability of choosing AFVs despite their current shortcomings. Our results suggest that German car buyers’ willingness-to-pay for improvements of the various vehicle attributes varies considerably across consumer groups and that vehicle features have to meet certain minimum requirements before AFVs are considered. The CV values show that decision-makers in administration and industry should focus on the most promising consumer group of ‘AFV aficionados’ and their needs. It also shows that some vehicle attribute improvements could increase the demand for AFVs cost-effectively, and that consumers would accept surcharges for some vehicle attributes at a level which could enable their private provision and economic operation (e.g. fast-charging infrastructure). Improvement of other attributes will need governmental subsidies to compensate for insufficient consumer valuation (e.g. battery capacity).
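Willingness-to-pay estimates of this kind are typically derived in a choice model as the ratio of an attribute coefficient to the price coefficient; the following sketch uses invented coefficients, not the paper's estimates:

```python
# In a conditional logit model, willingness-to-pay (WTP) for an attribute is
# the marginal rate of substitution between that attribute and price:
# WTP = -beta_attribute / beta_price.

def willingness_to_pay(beta_attr, beta_price):
    """beta_price is expected to be negative (price is a disutility)."""
    return -beta_attr / beta_price

# Invented coefficients for illustration: utility per extra 100 km of driving
# range and per additional 1000 EUR of purchase price.
beta_range = 0.8
beta_price = -0.4
wtp = willingness_to_pay(beta_range, beta_price)  # 2.0, i.e. 2000 EUR per 100 km
```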
Software development consists to a large extent of human-based processes with continuously increasing demands regarding interdisciplinary team work. Understanding the dynamics of software teams can be seen as highly important to successful project execution. Hence, for future project managers, knowledge about non-technical processes in teams is significant. In this paper, we present a course unit that provides an environment in which students can learn and experience the impact of group dynamics on project performance and quality. The course unit uses the Tuckman model as its theoretical framework, and borrows from controlled experiments to organize and implement its practical parts, in which students then experience the effects of, e.g., time pressure, resource bottlenecks, staff turnover, loss of key personnel, and other stress factors. We provide a detailed design of the course unit to allow for implementation in further software project management courses. Furthermore, we provide experiences obtained from two instances of this unit conducted in Munich and Karlskrona with 36 graduate students. We observed students building awareness of stress factors and developing countermeasures to reduce the impact of those factors. Moreover, students experienced what problems occur when teams work under stress and how to form a performing team despite exceptional situations.
The question of why individuals adopt information technology has been present in information systems research for the past quarter century. One of the most widely used models for predicting technology usage was introduced by Fred Davis: the Technology Acceptance Model (TAM). It describes the influence of perceived usefulness and perceived ease of use on attitude, behavioral intention, and system usage. The first two factors in turn are influenced by external variables. Although a plethora of papers exists about the TAM, an extensive analysis of the role of the external variables in the model is still missing. This paper aims to give an overview of the most important variables. In an extensive literature review, we identified 763 relevant papers, found 552 unique single external variables, characterized the most important of them, and described the frequency of their appearance. Additionally, we grouped these variables into four categories (organizational characteristics, system characteristics, user personal characteristics, and other variables). Afterwards, we discuss the results and show implications for theory and practice.
Wasted paradise – imagining the Maldives without the garbage island of Thilafushi : Version 1.2
(2016)
To address the high level of waste production in the Maldives, the local government decided in 1992 to transform the coral island of Thilafushi into an immense waste dump. Meanwhile, each day, 330 tons of waste is ferried to Thilafushi. The policy had the positive consequence of relieving the garbage burden in Malé, the main island, and surrounding tourist atolls. However, it can also lead to serious environmental and economic damage in the long run. First, the garbage is in visual range of one of the most prominent tourist destinations. Second, if the wind blows a certain way, unfiltered fumes from burning waste travel to tourist atolls. Third, water quality can deteriorate as hazardous waste from batteries and other toxic waste floats in the ocean. Over time, these effects can accumulate to significantly reduce the number of tourists that travel to the Maldives – one of the state’s main sources of financial income. In our paper, we lay out the situation in more detail and translate it into a simulation model. We test different policies to propose to the Maldives government how to better solve the waste problem.
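At its core, a simulation model of this kind rests on a stock-flow computation; the toy sketch below (the 330 t/day inflow is taken from the text, the processing capacity is invented) illustrates how unprocessed waste accumulates:

```python
# Toy stock-flow model: daily waste inflow vs. processing capacity. The
# 330 t/day inflow is from the text; the capacity figure is invented.

def simulate_waste(days, inflow_per_day=330.0, capacity_per_day=250.0, stock=0.0):
    """Accumulated unprocessed waste (tons) after the given number of days."""
    for _ in range(days):
        stock = max(0.0, stock + inflow_per_day - capacity_per_day)
    return stock

backlog = simulate_waste(365)  # 80 t/day shortfall -> 29200 t after one year
```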
Organizations are increasingly called upon to deliberately shape digital work environments as well. New technologies and digital work practices are shifting the place where a shared identity is formed into virtual spaces. So far, however, executives and change managers have focused too strongly on things they can touch and physically shape. The authors therefore discuss how companies can create and maintain organizational identity in virtual work environments too, thereby supporting change management.
With largely identical equipment and new demands on university computing centres, cooperation is increasingly moving into focus. For such cooperation, including across types of higher education institutions, the classical informal framework is often no longer sufficient. Successful collaboration requires certain preconditions to be met. Computing centres are taking on a new role as providers of services for users beyond their own institution. Equally, they may in future increasingly find themselves in the user's position. IT service units must become aware of their new role as both service providers and consumers of third-party services, and factor this into the design of new services.
The primary goal and task of this work is ... the development of a new recycling method for PET that avoids the drawbacks of existing recovery methods and, while largely preserving the synthesis effort already invested, yields defined oligomers. From these, high-value products can subsequently be manufactured.
This study compares typical, commercially available MDF boards for kitchen applications coated with different lacquer systems with respect to their emission behaviour and surface properties: waterborne, solvent-borne, and powder-coating-based surfaces. Powder coating is shown to yield higher-quality products overall, with regard to scratch resistance, adhesion, and resistance to moist heat, and in particular with regard to VOC emissions. The waterborne surfaces performed considerably better in terms of emissions than the solvent-based coating systems and, with one exception, showed comparable surface-property values.
Based on well-established robotic concepts of autonomous localization and navigation, we present a system prototype to assist camera-based indoor navigation for humans, implemented in the Robot Operating System (ROS). Our prototype takes advantage of state-of-the-art computer vision and robotic methods. Our system is designed for assistive indoor guidance. We employ a vibro-tactile belt as a guiding device to render derived motion suggestions to the user via vibration patterns. We evaluated the effectiveness of a variety of vibro-tactile feedback patterns for guidance of blindfolded users. Our prototype demonstrates that a vision-based system can support human navigation, and may also assist the visually impaired in a human-centered way.
Today, the optimization of metal forming processes is carried out with advanced simulation tools in a virtual process, e.g. FEM studies. Modifying the free parameters yields the different variants to be analysed. Experienced engineers can thus derive useful proposals in acceptable time if good initial proposals are available. As soon as the number of free parameters grows, or the total process takes a long time and involves several successive forming steps, it can be quite difficult to find promising initial ideas. In metal forming, another problem must be considered: optimization through a series of local improvements, often called a gradient approach, may find a local optimum, but this can be far from a satisfactory solution. Therefore non-deterministic approaches, e.g. Bionic Optimization, have to be used. Approaches such as Evolutionary Optimization or Particle Swarm Optimization are capable of covering large, high-dimensional optimization spaces and of discovering many local optima, so the chance of including the global optimum increases when such non-deterministic methods are used. Unfortunately, these bionic methods require large numbers of studies of different variants of the process to be optimized. The number of studies tends to grow exponentially with the number of free parameters of the forming process. As the time for a single study may itself be considerable, the total time demand becomes unacceptable, taking weeks to months even with high-performance computing. The optimization process therefore needs to be accelerated. Among the many ideas for reducing time and computing requirements, Meta- and Hybrid Optimization appear to produce the most efficient results. Hybrid Optimization often consists of global searches for promising regions within the parameter space. As soon as the studies indicate a possible local optimum, a deterministic study attempts to identify this local region.
If it shows better performance than the optima found so far, it is retained for more detailed analysis; if it performs worse, the region is excluded from further search. Meta-Optimization is often understood as the derivation of response surfaces of the functions of the free parameters. Once enough studies have been performed, the optimization uses the response surfaces as surrogates, e.g. for the goal and the constraints of the optimization problem. Having found regions where interesting solutions are to be expected, the studies available so far are used to define the response surfaces. In many cases low-degree polynomials are used, with their coefficients determined by least-squares methods. Both proposals, Hybrid Optimization and Meta-Optimization, sometimes used in combination, often reduce the total optimization process by large numbers of variants to be studied. They are therefore highly recommended for time-consuming optimization studies.
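A minimal Particle Swarm Optimization loop, one of the bionic methods named above, can be sketched on a toy objective as follows; all parameter values are illustrative:

```python
import random

# Minimal Particle Swarm Optimization on a toy 2-D objective. The inertia and
# acceleration coefficients (0.7, 1.4) are common textbook values.

def pso(f, dim=2, particles=20, iters=100, seed=1):
    rnd = random.Random(seed)
    xs = [[rnd.uniform(-5, 5) for _ in range(dim)] for _ in range(particles)]
    vs = [[0.0] * dim for _ in range(particles)]
    pbest = [list(x) for x in xs]          # best position seen by each particle
    gbest = list(min(pbest, key=f))        # best position seen by the swarm
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                r1, r2 = rnd.random(), rnd.random()
                vs[i][d] = (0.7 * vs[i][d]
                            + 1.4 * r1 * (pbest[i][d] - xs[i][d])
                            + 1.4 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = list(xs[i])
                if f(xs[i]) < f(gbest):
                    gbest = list(xs[i])
    return gbest

sphere = lambda x: sum(v * v for v in x)   # global optimum at the origin
best = pso(sphere)
```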
"If you don't think in diverse ways, you think one-dimensionally." With this slogan, students at Reutlingen University draw attention to the topic of diversity in a video clip. But what do we actually mean by it? Since September of last year, Professor Dr. Gabriela Tullius has been Vice President of Reutlingen University, and this area is among her responsibilities. GEA-Campus asked what diversity is about and why it concerns the university.
Marketing events create emotional engagement in audiences in a particular way. Changes in attitude and image improvements are therefore the central objectives of event marketing. Building on the essentials of event marketing, this chapter focuses on controlling in event marketing. It addresses performance measurement and impact research in event marketing. The conditions under which an image transfer from an event to a brand or a company occurs are explained, and methods for image measurement are presented.
The present invention relates to a method for regulating a dead time in a synchronous converter (100) in which a control switch (2) and a synchronous switch (3) are switched cyclically, the control switch (2) being switched by a first switching signal (S1) and the synchronous switch (3) by a second switching signal (S2). The method comprises acquiring and holding a voltage value describing a voltage (VSW) across the synchronous switch (3) at a specific point in time, and adapting the first and/or second switching signal (S1, S2) for a following cycle based on the held voltage value.
This practical guide for advanced students and decision-makers in the pharma and biotech industry presents key success factors in R&D along with value creators in pharmaceutical innovation. A team of editors and authors with extensive experience in academia and industry and at some of the most prestigious business schools in Europe discusses in detail the innovation process in pharma as well as common and new research and innovation strategies. In doing so, they cover collaboration and partnerships, open innovation, biopharmaceuticals, translational medicine, good manufacturing practice, regulatory affairs, and portfolio management. Each chapter covers controversial aspects of recent developments in the pharmaceutical industry, with the aim of stimulating productive debates on the most effective and efficient innovation processes. A must-have for young professionals and MBA students preparing to enter R&D in pharma or biotech as well as for students on a combined BA/biomedical and natural sciences program.
The ageing resistance of shielded cables and contact pairs is gaining relevance in view of the increasing use of electronic, software-based controls in vehicles. Electric and hybrid drives, as well as the implementation of driver assistance systems up to autonomous vehicles, lead to complex wiring harnesses with a large number of shielded cables and connectors.
To test the stability of contact systems, high-voltage cables from different manufacturers and of different dimensions for use in hybrid vehicles were examined. They were fitted with model connectors that contacted the shielding and ensured sufficient contact.
Impedance and shielding attenuation were examined on thermally stressed cables and contact pairs in comparison to new parts. The influence of thermal ageing, induced in accordance with LV 214, on the frequency-dependent transmission profile and the total scattering loss was determined by measurement. In addition, the ageing effects on the conductor materials were documented using modern imaging analytics.
Tumor cells on the move : a microsystem-based assay for investigating tumor cell migration
(2016)
The invasion of tumor cells into surrounding tissue and the formation of metastases transform a locally growing tumor into a systemic, life-threatening disease with a poor prognosis. Active tumor cell migration plays a decisive role in this process. Through active cell movement, tumor cells enter the lymphatic or blood system and spread through the body. When invading a new organ, the cells again migrate through the tissue in a complex manner and can ultimately form metastases there. Owing to the enormous medical relevance of tumor cell invasion, the movement of tumor cells has been studied extensively under laboratory conditions for decades and is an important marker of tumor cell aggressiveness. Several experimental and also commercially available in-vitro methods exist for motility analysis. The aim of the interdisciplinary project "MigChip" is the development, fabrication, and experimental validation of a microfluidic chip for improved, detailed in-vitro investigation of tumor cell migration.
Context: Companies need capabilities to evaluate the customer value of software-intensive products and services. One way of systematically acquiring data on customer value is running continuous experiments as part of the overall development process. Objective: This paper investigates the first steps of transitioning towards continuous experimentation in a large company, including the challenges faced. Method: We conduct a single-case study using participant observation, interviews, and qualitative analysis of the collected data. Results: Results show that continuous experimentation was well received by the practitioners, and practising experimentation helped them to enhance understanding of their product value and user needs. Although the complexities of a large multi-stakeholder business-to-business (B2B) environment presented several challenges such as inaccessible users, it was possible to address impediments and integrate an experiment in an ongoing development project. Conclusion: Developing the capability for continuous experimentation in large organisations is a learning process which can be supported by a systematic introduction approach with the guidance of experts. We gained experience by introducing the approach on a small scale in a large organisation, and one of the major steps for future work is to understand how this can be scaled up to the whole development organisation.
IT environments that consist of a very large number of rather small structures like microservices, Internet of Things (IoT) components, or mobility systems are emerging to support flexible and agile products and services in the age of digital transformation. Biological metaphors of living and adaptable ecosystems with service-oriented enterprise architectures provide the foundation for self-optimizing, resilient run-time environments and distributed information systems. We are extending Enterprise Architecture (EA) methodologies and models that cover a high degree of heterogeneity and distribution to support the digital transformation and related information systems with micro-granular architectures. Our aim is to support flexibility and agile transformation for both IT and business capabilities within adaptable digital enterprise architectures. The present research paper investigates mechanisms for integrating Microservice Architectures (MSA) by extending original enterprise architecture reference models with elements for more flexible architectural metamodels and EA-mini-descriptions.
Analysis is an important part of the enterprise architecture management process. Prior to decisions regarding transformation of the enterprise architecture, the current situation and the outcomes of alternative action plans have to be analysed. Many analysis approaches have been proposed by researchers, and current enterprise architecture management tools implement analysis functionalities. However, little work has been done on structuring and classifying enterprise architecture analysis approaches. This paper collects and extends existing classification schemes, presenting a framework for enterprise architecture analysis classification. For evaluation, a collection of enterprise architecture analysis approaches has been classified based on this framework. As a result, the description of these approaches has been assessed, a common set of important categories for enterprise architecture analysis classification has been derived, and suggestions for further development are drawn.
Instead of waiting for and constantly adapting to the details of political interventions, utilities need to view their environment from a holistic perspective. The unique position of the company - be it a local utility, a bigger player, or an international utility specializing in specific segments - has to be the basis of goals and strategies. But without consistent translation of these goals and strategies into processes, structures, and company culture, a strategy remains pure theory. Companies need to engage in a continuing learning process. This means being willing to pass on strategies, to slow down or speed up, to work from a different angle, and so on.
The reduced research and development (R&D) efficiency, strong competition from generics, increased cost pressure from payers, and an increased biological complexity of new target indications have resulted in a rethinking and a change from a traditional and more closed R&D model in the pharmaceutical industry toward the new paradigm of open innovation. In the past years, pharmaceutical companies have broadened their external networks toward research collaborations with academic institutes, technology providers, or codevelopment partners. To fulfill the demand to reduce timelines and costs, research-based pharmaceutical companies started to outsource R&D activities. In addition, internal R&D processes were adjusted to the more open R&D model and new processes such as alliance management were established. The corporate frontier of pharmaceutical companies became permeable and more open. As a result, the focus of pharmaceutical R&D expanded from a purely internal toward a mixed internal and external model. Today, the U.S. pharmaceutical company Eli Lilly may have established the most open model toward external innovation, as it has integrated its innovation processes with its business model. Other companies are following this more open R&D model with newer concepts such as new frontier sciences, drug discovery alliances, private public partnerships, innovation incubators, virtual R&D, crowdsourcing, open source innovation, and innovation camps.
During the first years of their employment, graduates are a liability to industry. The employer goes the extra mile to bridge the gap between exiting university and profitable employment of engineering graduates. Unfortunately, some cannot take this risk. Given this scenario, this paper presents a learning factory approach as a platform for the application of knowledge, so as to develop the required engineering competences in South African engineering graduates before they enter the labour market. It spells out the components of a Stellenbosch University Learning Factory geared towards the production of engineering graduates with the required industrial skills. It elaborates on the didactics embedded in the learning factory environment, tailor-made to produce engineers who can productively contribute to the growth of the industry upon exiting the university.
In a recent publication Novy-Marx (2013) finds evidence that the variable gross profitability has a strong statistical influence on the common variation of stock returns. He also points out that there is common variation in stock returns related to firm profitability that is not captured by the three-factor model of Fama and French (1993). Thus, this thesis augments the three-factor model by the factor gross profitability and examines whether a profitability-based four-factor model is able to better explain monthly portfolio excess returns on the German stock market compared to the three-factor model of Fama and French (1993) and the Capital Asset Pricing Model (CAPM). Based on monthly stock returns of the CDAX over the period July 2008 to June 2014, this thesis documents four main findings. First, a significant positive market risk premium and a significant positive value premium can be identified. No evidence is found for a size or a profitability effect. Second, all included factors have a strong significant effect on monthly portfolio excess returns. Third, the four-factor model clearly outperforms both the three-factor model of Fama and French (1993) and the CAPM in capturing the common variation in monthly portfolio excess returns. The CAPM performs worst. Finally, the results indicate that the three-factor model of Fama and French (1993) is somewhat better in explaining the cross-section of portfolio excess returns than the four-factor model. Again, the CAPM performs worst. Nevertheless, the four-factor model is considered to be an improvement over the three-factor model of Fama and French (1993) and the CAPM in determining stock returns on the German stock market.
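A four-factor time-series regression of this kind can be sketched with synthetic data; the factor returns and loadings below are invented and do not reproduce the thesis's estimates:

```python
import numpy as np

# Schematic four-factor time-series regression on synthetic data: excess
# returns regressed on market, size (SMB), value (HML), and a gross-
# profitability (GP) factor. All returns and loadings are invented.

rng = np.random.default_rng(0)
T = 72                                        # July 2008 - June 2014: 72 months
mkt, smb, hml, gp = rng.normal(0.0, 0.04, (4, T))   # synthetic factor returns
eps = rng.normal(0.0, 0.01, T)                      # idiosyncratic noise
r_excess = 0.001 + 1.1 * mkt + 0.3 * smb + 0.5 * hml + 0.4 * gp + eps

X = np.column_stack([np.ones(T), mkt, smb, hml, gp])
betas, *_ = np.linalg.lstsq(X, r_excess, rcond=None)
alpha, b_mkt, b_smb, b_hml, b_gp = betas     # recovered loadings
```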
Clinical development is historically the phase in which a potential new medicine is being tested in phase 2 and phase 3 patient trials to demonstrate the new molecules' efficacy and safety to support the regulatory approval of drugs by health authorities. This relatively focused approach has been considerably expanded by a number of forces from within the pharmaceutical industry and equally important by changes in the healthcare systems. The need to identify the optimal patient population, showstoppers leading to discontinuation of clinical programs, the silent but constant removal of surrogate endpoints for registration, and the increased demand for real-life data which are used to demonstrate the patients' benefit and which have an ever-increasing role for pricing and reimbursement negotiations are today an integral part of this phase.
This chapter will review both the nuts and bolts of clinical development and recent developments in this area which shape the environment, how the different players have reacted, and what options might need to be explored in the future.
Virtual prototyping of integrated mixed-signal smart sensor systems requires high-performance co-simulation of analog frontend circuitry with complex digital controller hardware and embedded real-time software. We use SystemC/TLM 2.0 in conjunction with a cycle-count accurate temporal decoupling approach (TD) to simulate digital components and firmware code execution at high speed while preserving clock-cycle accuracy and, thus, real-time behavior at time quantum boundaries. Optimal time quanta ensuring real-time capability can be calculated and set automatically during simulation if the simulation engine has access to exact timing information about upcoming inter-process communication events. These methods fail in the case of non-deterministic, asynchronous events, resulting in potentially invalid simulation results. In this paper, we propose an extension to the case of asynchronous events generated by black-box sources from which a priori event timing information is not available, such as coupled analog simulators or hardware in the loop. Additional event processing latency or rollback effort caused by temporal decoupling is minimized by calculating optimal time quanta dynamically in a SystemC model using a linear prediction scheme. We analyze the theoretical performance of the presented predictive temporal decoupling approach (PTD) by deriving a cost model that expresses the expected simulation effort in terms of key parameters such as time quantum size and CPU time per simulation cycle. For an exemplary smart-sensor system model, we show that quasi-periodic events that trigger activities in TD processes are handled accurately after the predictor has settled.
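The linear prediction of event timing can be illustrated with a simple first-order extrapolation of inter-event intervals; this is a schematic sketch, not the paper's actual predictor:

```python
# Schematic first-order linear prediction of the next inter-event interval:
# the next time quantum is extrapolated from the two most recent intervals.

def predict_next_interval(intervals):
    """Extrapolate the next interval; fall back for short histories."""
    if not intervals:
        return 0.0
    if len(intervals) < 2:
        return intervals[-1]
    a, b = intervals[-2], intervals[-1]
    return b + (b - a)   # linear extrapolation

# For a quasi-periodic event source the prediction settles on the period.
quantum = predict_next_interval([10.0, 10.0, 10.0])  # 10.0
```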
The Internet of Things (IoT) refers to the interconnectedness of physical objects, and works by equipping the latter with sensors and actuators as a means to connect to the internet. The number of connected things has increased threefold over the past five years. Consequently, firms expect the IoT to become a source of new business models driven by technology. However, only a few early adopters have started to install and use IoT appliances on a frequent basis. So it is still unclear which factors drive technological acceptance of IoT appliances. Confronting this gap in current research, the present paper explores how IoT appliances are conceptually defined, which factors drive technological acceptance of IoT appliances, and how firms can use the results to improve value propositions in corresponding business models. It is discovered that IoT appliance vendors need to support a broad focus, as potential buyers vary widely. Drawing on this insight, the paper illustrates some flexible marketing strategies.
Systemic Constellation describes an approach that enables practitioners to examine and address typical issues in diversity management from a different, relational perspective. Systemic Constellation utilizes the human ability to recognize the qualities of relationships between two or more people from their spatial alignment to each other (transverbal language) and the capability to illustrate inner pictures by placing humans or objects in a room as representatives (representative perception). Systemic Constellation originated in the field of family therapy and counseling, but through research, guidance work, and teaching activities over the last two decades, it has developed into a generic, structural constellation logic with multiple methods of application. It has been adapted to a variety of topics and issues, and a number of constellation formats. This article serves as a starting point for the transfer of Systemic Constellation into diversity management. It appears that conventional approaches taught in traditional management classes (such as focusing on tools, setting targets, planning measures, and offering incentives) are of limited use when trying to deal with problematic situations in diversity management. Preliminary trials show that new solutions and insights into deeper underlying dynamics can be gained on personal and institutional levels when applying Systemic Constellation. Participants find the application of the method very beneficial. Systemic Constellation is grounded in personal experience and particularly in a person’s own experience of the consistency of representative perception. This viewpoint can only be conveyed rudimentarily in a scientific article. Readers should feel encouraged to apply Systemic Constellations themselves and use it in their work, experimentally and professionally. To harness the full potential of Systemic Constellations in diversity management, further research needs to be done.
Optimization-based analog layout automation has not yet found evident acceptance in industry due to the complexity of the design problem. This paper presents Self-organized Wiring and Arrangement of Responsive Modules (SWARM), an approach able to consider crucial design constraints both implicitly and explicitly. The flexibility of algorithmic methods and the expert knowledge captured in PCells combine into a flow of supervised module interaction. This novel approach targets the creation of constraint-compliant layout blocks that fit into a specified zone. By provoking a synergetic self-organization, even optimal layout solutions can emerge from the interaction. Various examples demonstrate the power of this new concept and its potential for future developments.
Despite 30 years of Electronic Design Automation, analog IC layouts are still handcrafted in a laborious fashion today due to the complex challenge of considering all relevant design constraints. This paper presents Self-organized Wiring and Arrangement of Responsive Modules (SWARM), a novel approach addressing the problem with a multi-agent system: autonomous layout modules interact with each other to evoke the emergence of overall compact arrangements that fit within a given layout zone. SWARM's unique advantage over conventional optimization-based and procedural approaches is its ability to consider crucial design constraints both explicitly and implicitly. Several examples show that by inducing a synergistic flow of self-organization, remarkable layout results can emerge from SWARM's decentralized decision-making model.
The deterioration of the shielding performance of electromagnetic interference finger stock gaskets in a corrosive environment is investigated. Visualization of the real contact area shows a drastic reduction of the engaged active contact region between the fingers and their mating surfaces in the presence of corrosive residues. In fact, additional openings occur besides the “T-like” holes due to the porous nature of the gaskets. This leads to a strong degradation of the shielding effectiveness. A modified Bethe theory is used to estimate the equivalent circuit parameters, while the shielding effectiveness, expressed as the ratio between two transfer functions, is obtained by applying filter theory. Quantitative measurements carried out for different gasket types show good agreement with the calculated results, thus demonstrating the validity of the approach.
The growing share of renewable electricity generation calls for efforts to counteract the associated supply fluctuations and the additional load on the grid. Decentralized, demand-oriented electricity generation using combined heat and power (CHP) can make a substantial contribution here, ensuring a secure and constant power supply and relieving the grids. For this purpose, however, a control system is required that enables the CHP units both to keep covering the heat demand of the building and to generate electrical energy exactly at the times when it is needed. Electricity generation can be decoupled from heat demand coverage via the thermal storage tank that is installed as standard. The storage tank is thus the central element of the overall plant, for which the control system for optimizing self-consumption was developed and tested within the research project.
The thermal storage of a CHP plant can be used to shift the operation of the cogeneration unit into the times of electricity consumption. The ad-hoc activation function improves the result compared to a schedule created on the basis of forecasts. However, an increased number of unit starts and higher heat losses at the storage tank must be taken into account. By far the best results are achieved for cogeneration units with power modulation.
Besides optimising the car itself, energy efficiency and safety can also be increased by optimising driving behaviour. Based on this fact, a driving system is in development whose goal is to educate the driver in energy-efficient and safe driving. It monitors the driver, the car and the environment and gives energy-efficiency- and safety-relevant recommendations. However, the driving system tries not to distract or bother the driver by giving recommendations, for example, during stressful driving situations or when the driver is not interested in a recommendation. Therefore, the driving system monitors the stress level of the driver as well as the driver's reaction to a given recommendation and decides whether to give a recommendation or not. This makes it possible to suppress recommendations when needed and thus to increase road safety and user acceptance of the driving system.
Stress is becoming an important topic in modern life. The influence of stress results in a higher rate of health disorders such as burnout, heart problems, obesity, asthma, diabetes, depression and many others. Furthermore, an individual's behavior and capabilities can be directly affected, leading to altered cognition and impaired decision-making and problem-solving skills. In a dynamic and unpredictable environment such as the automotive one, this can result in a higher risk of accidents. Several papers have addressed the estimation as well as the prediction of drivers' stress levels while driving. Another important question concerns not only the stress level of the driver himself, but also the influence on and of a group of other drivers in the surrounding area. This paper proposes a system that identifies groups of nearby drivers as clusters and derives their individual stress levels. This information is analyzed to generate a stress map, a graphical view of road sections with higher stress influence. The aggregated data can be used to generate navigation routes with lower stress influence, in order to decrease stress-influenced driving and improve road safety.
Stress is recognized as a predominant disease with rising costs for rehabilitation and treatment. Currently there are several different approaches that can be used for determining and calculating stress levels. Usually the methods for determining stress are divided into two categories. The first category does not require any special equipment for measuring stress; it uses the variations in behaviour patterns that occur under stress. The core disadvantage of this category is its limitation to specific use cases. The second category uses laboratory instruments and biological sensors. This category allows stress to be measured precisely and proficiently, but at the same time these setups are neither mobile nor transportable and do not support real-time feedback. This work presents a mobile system that provides the calculation of stress. To achieve this, the data of a mobile ECG sensor is analysed, processed and visualised on a mobile device such as a smartphone. This work also explains the stress measurement algorithm used. The result of this work is a portable system that can be used with a mobile device such as a smartphone as a visual interface for reporting the current stress level.
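The abstract does not detail the algorithm used; a common approach in such ECG-based systems derives stress from heart-rate variability (HRV) computed over the R-R intervals. The sketch below follows that generic approach, not the paper's own implementation; the RMSSD measure is standard, but the threshold values are illustrative assumptions:

```python
# Hedged sketch: stress estimation from ECG R-R intervals via the
# RMSSD heart-rate-variability measure. The thresholds are illustrative
# assumptions, not values taken from the paper.
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between R-R intervals (ms)."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def stress_level(rr_intervals_ms, low=20.0, high=50.0):
    """Map RMSSD to a coarse stress label: lower HRV suggests higher stress."""
    value = rmssd(rr_intervals_ms)
    if value < low:
        return "high"
    if value < high:
        return "moderate"
    return "low"
```

A system of the kind described would compute such a value over a sliding window of sensor readings and push the resulting label to the smartphone interface.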
Sport marketing is the specific application of marketing principles and processes to sports products and services. In 2014 the biggest sports event in the world, the FIFA World Cup, took place in Brazil. Billions of spectators around the world saw Germany win the trophy in Rio de Janeiro for the fourth time in history. Yet unlike in previous World Cups, conversation was not only taking place at the numerous public viewings held in open spaces such as bars and restaurants. For the entire tournament, social media such as Facebook and Twitter played a dominant role in all aspects. With 672 million tweets on Twitter and three billion conversations on Facebook, this was the most social World Cup as well as the most social mega sports event so far. Whether they were users, athletes or companies, everyone was trying to join the conversation, to stay informed or to inform others about their opinions or the latest news. This paper analyzes the implementation of social media marketing during mega sports events, with a focus on Adidas' and Nike's social media campaigns in the frame of the FIFA World Cup 2014 in Brazil. The analysis shows that social media marketing in the frame of mega sports events is gaining importance. Those companies that find topics affecting people personally, with a relationship to their products, achieve success through social media marketing.
Information Systems in Distributed Environment (ISDE) is becoming a prominent standard in this era of globalization due to advances in information and communication technologies. The advent of the internet has supported Distributed Software Development (DSD) by introducing new concepts and opportunities, resulting in benefits such as scalability, flexibility, interdependence, reduced cost, resource pools, and usage tracking. The distributed development of information systems, as well as their deployment and operation in distributed environments, imposes new challenges on software organizations and can lead to business advantages. In distributed environments, business units collaborate across time zones, organizational boundaries, work cultures and geographical distances, which has ultimately led to increasing diversification and growing complexity of cooperation among units. The real-world practice of developing, deploying and operating information systems in globally distributed projects has been viewed from various perspectives, though technical and engineering viewpoints, in conjunction with managerial and organizational ones, have dominated researchers' attention so far. Successful participation in distributed environments, however, is ultimately a matter of the participants understanding and exploiting the particularities of their respective local contexts at specific points in time and exploring practical solutions through the local resources available.
This special issue of the Computer Standards & Interfaces journal therefore includes papers received from the public call for papers as well as extended and improved versions of papers selected from the best of the International Workshop on Information Systems in Distributed Environment (ISDE 2014). It aims to serve as a forum that brings together academics, researchers, practitioners and students in the field of distributed information systems, by presenting novel developments and lessons learned from real-world cases, and to promote the exchange of ideas, discussion and advancement in these areas.
Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out there? Are there new trends and emerging approaches? What are the open issues? Still, we struggle to answer these questions about the current state of SPI and related research. In this article, we present results from an updated systematic mapping study to shed light on the field of SPI, to develop a big picture of the state of the art, and to draw conclusions for future research directions. An analysis of 769 publications draws a big picture of SPI-related research of the past quarter-century. Our study shows a high number of solution proposals, experience reports, and secondary studies, but only few theories and models on SPI in general. In particular, standard SPI models like CMMI and ISO/IEC 15504 are analyzed, enhanced, and evaluated for applicability in practice, but these standards are also critically discussed, e.g., from the perspective of SPI in small-to-medium-sized companies, which leads to new specialized frameworks. New and specialized frameworks account for the majority of the contributions found (approx. 38%). Furthermore, we find a growing interest in success factors (approx. 16%) to aid companies in conducting SPI and in adapting agile principles and practices for SPI (approx. 10%). Beyond these specific topics, the results also show an increasing interest in secondary studies that aggregate and structure SPI-related knowledge. Finally, the present study helps direct future research by identifying under-researched topics awaiting further investigation.
Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out there? Are there new emerging approaches? What are the open issues? Still, we struggle to answer the question of what the current state of SPI and related research is. We present initial results from a systematic mapping study to shed light on the field of SPI and to draw conclusions for future research directions. An analysis of 635 publications draws a big picture of SPI-related research of the past 25 years. Our study shows a high number of solution proposals, experience reports, and secondary studies, but only few theories. In particular, standard SPI models are analyzed and evaluated for applicability, especially from the perspective of SPI in small-to-medium-sized companies, which leads to new specialized frameworks. Furthermore, we find a growing interest in success factors to aid companies in conducting SPI.
This summary refers to the paper Software process improvement : where is the evidence? [Ku15].
This paper was published as full research paper in the ICSSP’2015 proceedings.
Although still in the early stages of diffusion, smartwatches represent the most popular type of wearable device. Yet little is known about why some people are more likely to adopt smartwatches than others. To deepen the understanding of the underlying factors prompting adoption behavior, the authors develop a theoretical model grounded in the technology acceptance and social psychology literature. Empirical results reveal perceived usefulness and visibility as important factors that drive intention. The magnitude of these antecedents is influenced by whether an individual views smartwatches as a technology and/or as a fashion accessory. Theoretical and managerial implications are discussed.
Over several months, master's students from three continents portrayed each other via Skype. Using a special drawing technique, blind drawing, numerous portraits were created, whose effect in public space and in social networks was examined. These portraits form the basis for artistic works in all areas and media of the fine arts. The research project SkypeLab thus creates a link between traditional artistic techniques and current digital technologies.
The increasing penetration of cyber-physical systems and their interconnection into cyber-physical production systems (CPPS) leads to fundamental changes in future assembly, manufacturing and logistics systems, which require innovative methods for the planning, control and monitoring of adaptable production systems. Future logistics systems will be subject to the demands of high-frequency change and reconfiguration triggered by adaptable production systems for individualized products and small lot sizes. The use of decentralized control systems, in which the complex planning, control and monitoring processes are distributed across numerous nodes and entities of the emerging control system, offers great potential for meeting the requirements of cyber-physical logistics systems. A central challenge here is the real-time control and reconfiguration of so-called hybrid logistics systems, which are characterized by, among other things, the collaboration of humans and machines and the combination of different types of conveyors and different control architectures, and which moreover rely on hybrid decision-making processes that synergistically exploit the capabilities of humans and (cyber-physical) systems.
Learning factories, such as the ESB Logistik-Lernfabrik at ESB Business School (Reutlingen University), offer far-reaching opportunities to develop these innovative methods, systems and technical solutions in an industry-like, risk-free factory environment, and to transfer them into the education of students and the further training of participants from industry. To expand research, teaching and training in the field of future assembly, manufacturing and logistics systems, the existing production system of the ESB Logistik-Lernfabrik is being transformed step by step, within a wide range of research and student projects, into a decentrally controlled cyber-physical production system based on an event-driven, cloud-based and decentralized control architecture.
The book covers all questions and problems relevant in practice when designing and selecting switch-mode power supplies in the low to medium power range, i.e. from mW up to about 1 kW. It begins with an introduction to the classic converters and resonant converters. The power devices are then described and proven gate-drive circuits are presented. The final part deals with EMC aspects. Important circuit techniques proven in practice are presented and described. For technicians and engineers working in the field, the book summarizes all the topics needed for their daily work. Students can use it for self-study or to supplement course material.
Significant advances have been achieved in mobile robot localization and mapping in dynamic environments; however, these approaches are mostly incapable of dealing with the physical properties of automotive radar sensors. In this paper we present an accurate and robust solution to this problem by introducing a memory-efficient cluster map representation. Our approach is validated by experiments on a public parking space with pedestrians, moving cars, and different parking configurations, providing a challenging dynamic environment. The results prove its ability to reproducibly localize our vehicle within an error margin of below 1% with respect to ground truth using only point-based radar targets. A decay process enables our map representation to support local updates.
In this paper we present our work in progress on revisiting traditional DBMS mechanisms to manage space on native Flash and on how it is administered by the DBA. Our observations and initial results show that the standard logical database structures can be used for the physical organization of data on native Flash and that, at the same time, higher DBMS performance is achieved without incurring extra DBA overhead. An initial experimental evaluation indicates a 20% increase in transactional throughput under TPC-C by performing intelligent data placement on Flash, with fewer erase operations and thus better Flash longevity.
Using a Fabry-Pérot microresonator with controllable cavity lengths in the λ/2 regime, we show the controlled modification of the vibronic relaxation dynamics of a fluorescent dye molecule in the spectral and time domain. By altering the photonic mode density around the fluorophores we are able to shape the fluorescence spectrum and to specifically enhance the probability of the radiative transitions from the electronic excited state to distinct vibronic excited states of the electronic ground state. Analysis and correlation of the spectral and time-resolved measurements by a theoretical model and a global fitting procedure allow us to quantitatively reveal the spectrally distributed radiative and non-radiative relaxation dynamics of the respective dye molecule under ambient conditions at the ensemble level.
Here we report a simple way to enhance the resolution of a confocal scanning microscope under cryogenic conditions. Using a microscope objective (MO) with high numerical aperture (NA = 1.25) and 1-propanol as an immersion fluid with a low freezing temperature, we were able to reach an imaging resolution at 160 K comparable to ambient conditions. The MO and the sample were both placed inside the inner chamber of the cryostat to reduce distortions induced by temperature gradients. The image quality of our commercially available MO was further enhanced by scanning the sample (sample scanning) in contrast to beam scanning. The ease of the whole procedure marks an essential step towards the development of cryo high-resolution microscopy and correlative light and electron cryo microscopy (cryoCLEM).
As long as there have been professional sports, there have been relationships on different levels. For example, sponsorship (or patronage, as it was called in the early days) was mostly based on personal relations between local benefactors and their favourite sports club. Regarding media, clubs always maintained special relationships with selected journalists. The bond between fans and their clubs was always a close and mutually beneficial one. All these relationships existed from the start of the sports business. Therefore, relationship marketing is nothing new in the context of sports. Many sporting organisations have always known the value of a deep and good relationship with their stakeholders and practised relationship marketing without being aware of it. Successful sports managers, however, take this old wisdom and turn it into a modern relationship marketing approach by structuring the various relationships in order to make them more effective and profitable for their own sporting organisation and the various stakeholders. This chapter ... illustrates the many facets of relationship marketing and the possibilities it offers in the context of the sports business.
Herein, the optimization by Hofmeister anions of the physicochemical properties and surface biocompatibility of polyelectrolyte multilayers of the natural, biocompatible and biodegradable linear polysaccharides hyaluronan and chitosan was systematically investigated. We demonstrated that there is an interconnection between the bulk and surface properties of HA/Chi multilayers, both varying in accordance with the arrangement of the anions in the Hofmeister series. Kosmotropic anions increased the hydration, thickness, micro- and macro-roughness, and hydrophilicity of the films, and improved their biocompatibility by reducing the film stiffness by two orders of magnitude and rendering them completely anti-thrombogenic.
Recycling of poly(ethylene terephthalate) (PET) is of crucial importance, since worldwide amounts of PET waste are increasing rapidly due to its widespread applications. Hence, several methods have been developed, such as energetic, material, thermo-mechanical and chemical recycling of PET. Most frequently, PET waste is incinerated for energy recovery, used as an additive in concrete composites, or glycolysed to yield mixtures of monomers and undefined oligomers. While energetic and thermo-mechanical recycling entail downcycling of the material, chemical recycling requires considerable amounts of chemicals and demanding processing steps, entailing toxicological and ecological issues. This review provides a thorough survey of PET recycling, including energetic, material, thermo-mechanical and chemical methods. It focuses on chemical methods, describing important reaction parameters and the yields of the obtained reaction products. While most methods yield monomers, only a few yield undefined low-molecular-weight oligomers for less demanding applications (dispersants or plasticizers). Further, the present work presents an alternative chemical recycling method for PET in comparison to existing chemical methods.
Reality mining refers to an application of data mining that uses sensor data to derive behavioral patterns in the real world. However, research in this field started a decade ago, when technology was far behind today's state of the art. This paper discusses which requirements are now posed to applications in the context of reality mining. A survey shows which sensors are available in state-of-the-art smartphones and usable to gather data for reality mining. As another contribution of this paper, a reality mining application architecture is proposed to facilitate the implementation of such applications. A proof of concept verifies the assumptions made about reality mining and the presented architecture.
We present a fully automatic approach to real-time 3D face reconstruction from monocular in-the-wild videos. We use a 3D morphable face model to obtain a semi-dense shape and combine it with a fast median-based super-resolution technique to obtain a high-fidelity textured 3D face model. Our system does not need prior training and is designed to work in uncontrolled scenarios.
Context: An experiment-driven approach to software product and service development is gaining increasing attention as a way to channel limited resources to the efficient creation of customer value. In this approach, software capabilities are developed incrementally and validated in continuous experiments with stakeholders such as customers and users. The experiments provide factual feedback for guiding subsequent development.
Objective: This paper explores the state of the practice of experimentation in the software industry. It also identifies the key challenges and success factors that practitioners associate with the approach.
Method: A qualitative survey based on semi-structured interviews and thematic coding analysis was conducted. Ten Finnish software development companies, represented by thirteen interviewees, participated in the study.
Results: The study found that although the principles of continuous experimentation resonated with industry practitioners, the state of the practice is not yet mature. In particular, experimentation is rarely systematic and continuous. Key challenges relate to changing the organizational culture, accelerating the development cycle speed, and finding the right measures for customer value and product success. Success factors include a supportive organizational culture, deep customer and domain knowledge, and the availability of the relevant skills and tools to conduct experiments.
Conclusions: It is concluded that the major issues in moving towards continuous experimentation are on an organizational level; most significant technical challenges have been solved. An evolutionary approach is proposed as a way to transition towards experiment-driven development.
Pultrusion of braids
(2016)
Customer needs and requirements are becoming increasingly diverse, and consumers more and more want to express their individuality with the products they buy. Due to the emergence of the internet and the possibilities it offers, customers no longer play only a passive role but are actually enabled to determine what they are purchasing. Therefore, customisation or personalisation approaches like the miadidas concept from adidas, which provides customised performance shoes and sneakers, are more popular than ever. The prosumer concept already plays an important role in trying to satisfy the demands of customers in the future. As apparel for outdoor activities represents the largest and most important part of the sports goods market in Germany and is still expected to grow, the purpose of this study is, on the one hand, to identify the diverse prosumer concepts that exist and, on the other, to examine to what extent companies in the outdoor industry have already implemented prosumer concepts. A content analysis of the homepages and online shops of 30 different European and North American outdoor brands was conducted. Results show that companies in the outdoor industry have already implemented several prosumer concepts, but most of them concentrate mainly on one prosumer approach and on the involvement of professional users of their products.
We have seen that bionic optimization can be a powerful tool when applied to problems with non-trivial landscapes of goals and restrictions. This, in turn, led us to a discussion of useful methodologies for applying this optimization to real problems. On the other hand, it must be stated that every optimization is a time-consuming process as soon as the problem expands beyond a small number of free parameters related to simple parabolic responses. Bionic optimization is not a quick approach to solving complex questions in a short time. In some cases it has the potential to fail entirely, either by sticking to local maxima or by randomly exploring the parameter space without finding any promising solutions. The following sections present some remarks on the efficiency and limitations users must be aware of. They aim to increase the knowledge base for using bionic optimization, but they should not discourage potential users from this promising field of powerful strategies for finding good or even the best possible designs.
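The pitfalls mentioned above (sticking to local maxima, costly exploration of the parameter space) can be illustrated with a minimal evolutionary strategy. The sketch below is a generic illustration, not code from the book; the goal function, parameter values and greedy selection rule are all illustrative assumptions:

```python
# Hedged sketch of a simple (1+lambda) evolution strategy, a basic form
# of bionic optimization: mutate, evaluate, keep the best candidate.
# All parameter values are illustrative assumptions.
import random

def evolve(goal, start, sigma=0.5, offspring=10, generations=100, seed=1):
    """Maximize `goal` over a list of free parameters by random mutation."""
    random.seed(seed)
    best, best_fitness = list(start), goal(start)
    for _ in range(generations):
        for _ in range(offspring):
            child = [x + random.gauss(0.0, sigma) for x in best]
            fitness = goal(child)
            if fitness > best_fitness:        # greedy selection: this is why
                best, best_fitness = child, fitness  # the search can get stuck
    return best, best_fitness                 # in a local maximum

# Simple parabolic goal with its maximum (value 0) at (1, 2)
goal = lambda p: -((p[0] - 1.0) ** 2 + (p[1] - 2.0) ** 2)
solution, fitness = evolve(goal, [0.0, 0.0])
```

On such a trivial parabolic response the strategy converges quickly; on rugged, multi-modal landscapes the same greedy loop illustrates the failure modes the chapter warns about.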
To ensure a smooth production flow and avoid production standstills, continuous material availability is required. When selecting material supply principles, company-specific conditions must be taken into account. Using the example of ERBE Elektromedizin GmbH, Tübingen, this article shows one possible procedure.
Preface of IDEA 2015
(2016)
In times of e-commerce and digitalization, new markets are opening, young companies have the opportunity to grow, and new perspectives arise in terms of customer relationships. Customers demand more possibilities for personalization. At the same time, companies have access to new and, above all, more information about the customer. This looks like a relationship that could evolve greatly, were it not for privacy issues. Vast amounts of data about consumers are collected in Big Data warehouses. These can be analyzed via predictive analytics, and customers can be classified by algorithms such as clustering models, propensity models or collaborative filtering. All these subjects are growing in importance, as they are shaping the global marketing landscape. Marketers, together with IT scientists, develop new ways of analyzing customer databases and benefit from more accurate segmentation methods than those used until now. The following paper provides a literature review on new methods of consumer segmentation with regard to the high inflow of new information via e-commerce. It introduces readers to the subject of predictive analytics and discusses several predictive models. The paper is not based on original empirical research but is intended as a reference text for further research. A conclusion completes the paper.
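One of the clustering models discussed in such reviews can be sketched with a minimal k-means segmentation. This is a generic, pure-Python illustration, not code from the paper; the customer features (annual spend, orders per year) and the number of segments are illustrative assumptions:

```python
# Hedged sketch: k-means clustering for customer segmentation.
# The features (annual spend, orders per year) are illustrative assumptions.
import random

def kmeans(points, k=2, iterations=20, seed=0):
    random.seed(seed)
    centers = random.sample(points, k)
    for _ in range(iterations):
        # Assign each customer to the nearest segment center
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        # Move each center to the mean of its segment
        for i, cluster in enumerate(clusters):
            if cluster:
                centers[i] = tuple(sum(xs) / len(cluster) for xs in zip(*cluster))
    return centers, clusters

# Two obvious segments: occasional buyers vs. heavy spenders
customers = [(120, 2), (130, 3), (115, 2), (980, 14), (1005, 15), (990, 13)]
centers, segments = kmeans(customers)
```

In a real predictive-analytics pipeline the resulting segment centers would feed, for example, differentiated marketing campaigns per segment.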
An enormous amount of data in the context of business processes is stored as images. They contain valuable information for business process management. Until now, these data had to be integrated manually into the business process. Thanks to advances in capture technology, it is possible to extract information from an increasing number of images. We therefore systematically investigate the potential of Image Mining for business process management through a literature review and an in-depth analysis of the business process lifecycle. As a first step in evaluating our research, we developed a prototype for recovering process model information from drawings using RapidMiner.
Due to the complexity of assembly processes, a high proportion of tasks is still performed by human workers. Short-cyclically changing work contents due to smaller lot sizes, especially in varied series assembly, increase both the need for information support and the risk of rising physical and psychological stress. The use of technical and digital assistance systems can counter these challenges. Through the integration of information and communication technology as well as collaborative assembly technologies, hybrid cyber-physical assembly systems will emerge. Widely established assembly planning approaches for digital and technical support systems in cyber-physical assembly systems are outlined and discussed with regard to synergies and the delimitation of planning perspectives.
Sleep is an important aspect in the life of every human being. The average sleep duration for an adult is approximately 7 h per day. Sleep is necessary to regenerate a human's physical and psychological state. Poor sleep quality has a major impact on health status and can lead to various diseases. This paper presents an approach that uses long-term monitoring of vital data, gathered by a body sensor during the day and the night and supported by a mobile application connected to an analysis system, to estimate the user's sleep quality and give real-time recommendations to improve it. Actimetry and historical data are used to improve the individual recommendations, based on common techniques from machine learning and big data analysis.
The loss contribution of a 2.3 kW synchronous GaN-HEMT boost converter for an input voltage of 250 V and an output voltage of 500 V was analyzed. A simulation model which consists of two parts is introduced. First, a physics-based model is used to determine the switching losses. Then, a system simulation is applied to calculate the losses of the specific elements. This approach allows a fast and accurate system evaluation as required for further system optimization.
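The two-part loss evaluation described in this abstract can be sketched at a back-of-the-envelope level. The Python fragment below uses invented placeholder values (RMS current, on-resistance, switching energies), not the paper's data; it only illustrates how per-cycle switching energies and conduction losses combine into a system efficiency estimate.

```python
# Back-of-the-envelope sketch of a two-part converter loss evaluation.
# All component values are hypothetical placeholders for illustration.

def conduction_loss(i_rms, r_ds_on):
    """Conduction loss of a switch: P = I_rms^2 * R_ds(on)."""
    return i_rms ** 2 * r_ds_on

def switching_loss(e_on, e_off, f_sw):
    """Switching loss: per-cycle turn-on/off energies times switching frequency."""
    return (e_on + e_off) * f_sw

# Hypothetical operating point for a 2.3 kW boost stage.
p_cond = conduction_loss(i_rms=9.2, r_ds_on=0.05)            # ~4.2 W
p_sw = switching_loss(e_on=40e-6, e_off=20e-6, f_sw=100e3)   # ~6.0 W
p_total = p_cond + p_sw
efficiency = 2300 / (2300 + p_total)                          # > 0.99
```

With these placeholder numbers the sketch reproduces the order of magnitude reported in the abstract (efficiency above 99% at full load), but the actual figures depend entirely on the measured device parameters.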
In this work, a hard- and a zero-voltage turn-on switching converter are compared. Measurements were performed to verify the simulation model, showing good agreement. A peak efficiency of 99% was achieved for an output power of 1.4 kW. Even at output powers above 400 W, it was possible to obtain a system efficiency exceeding 98%.
Pendeln im Alter: eine Fallstudie zu transnationaler Migration zwischen Deutschland und der Türkei [Commuting in old age: a case study on transnational migration between Germany and Turkey]
(2016)
Age and migration have so far rarely been considered in relation to each other. How do migration-specific factors affect the everyday life and life planning of retirees? To what extent do age and aging influence commuting between the country of origin and the country of immigration? Felicia Sparacio answers these questions using the example of retirees who commute between Germany and Turkey. She examines their social networks, focuses on age in connection with health, asks about belonging, and works out the specific imprint left by transnational migration and age. The case study first outlines contexts and theories of migration and age(ing) before the two fields are analytically connected, supported by empirical findings.
This paper presents a concurrency control mechanism that does not follow a "one concurrency control mechanism fits all needs" strategy. With the presented mechanism, a transaction runs under several concurrency control mechanisms, and the appropriate one is chosen based on the accessed data. For this purpose, the data is divided into four classes based on its access type and usage (semantics). Class O (the optimistic class) implements a first-committer-wins strategy, class R (the reconciliation class) implements a first-n-committers-win strategy, class P (the pessimistic class) implements a first-reader-wins strategy, and class E (the escrow class) implements a first-n-readers-win strategy. Accordingly, the model is called O|R|P|E. The selected concurrency control mechanism may be automatically adapted at run-time according to the current load or a known usage profile. This run-time adaptation allows O|R|P|E to balance the commit rate and the response time even under changing conditions. O|R|P|E outperforms Snapshot Isolation concurrency control in terms of response time by a factor of approximately 4.5 under heavy transactional load (4000 concurrent transactions). As a consequence, the degree of concurrency is 3.2 times higher.
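Two of the four class strategies lend themselves to a compact illustration. The following is a minimal toy sketch in Python, not the authors' implementation; all class and method names are invented. It shows class O's first-committer-wins validation and class E's escrow-style first-n-readers-win reservation.

```python
# Toy sketch of two per-class concurrency strategies (hypothetical names).

class OptimisticCell:
    """Class O: validate at commit time; the first committer wins."""
    def __init__(self, value):
        self.value, self.version = value, 0

    def read(self):
        return self.value, self.version

    def commit(self, new_value, read_version):
        if read_version != self.version:       # another transaction committed first
            return False                       # this transaction must abort
        self.value, self.version = new_value, self.version + 1
        return True

class EscrowCell:
    """Class E: hold quantities in escrow; the first n readers win."""
    def __init__(self, quantity):
        self.available = quantity

    def reserve(self, amount):
        if amount > self.available:            # escrow exhausted
            return False
        self.available -= amount               # amount is now held in escrow
        return True

# First-committer-wins: both transactions read version 0; the second commit aborts.
cell = OptimisticCell(10)
_, v1 = cell.read()
_, v2 = cell.read()
assert cell.commit(11, v1) is True
assert cell.commit(12, v2) is False

# Escrow: reservations succeed until the quantity is exhausted.
stock = EscrowCell(5)
assert stock.reserve(3) is True
assert stock.reserve(3) is False               # only 2 left in escrow
assert stock.reserve(2) is True
```

The real mechanism additionally reconciles conflicting commits (class R) and blocks writers after the first reader (class P), and switches between all four strategies at run-time; this sketch only conveys the two extremes.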
Drawing on the theoretical perspective of the sociology of conventions (Économie des Conventions, EC), this contribution extends research on the pragmatic dimension of organizational memory. First, it argues that conventions can be understood as organizational memory in which knowledge is stored about how coordination problems can be solved successfully. Second, based on the EC's notion of actor status and the concept of regimes of engagement, it discusses how actors access stored knowledge. Third, it analyzes the hitherto neglected normative dimension of organizational memory, arguing that actors justify themselves with reference to conventions when they take up elements of organizational memory. Overall, the contribution helps to better understand the connection between collective memory and decision-making by viewing it, on the basis of the EC, as an interactionist, pragmatic, and normatively shaped negotiation of memories in concrete situations.
This work presents a model that optimizes the planning of direct reuse in the rental of mobile, durable capital goods in closed-loop supply chains. The focus is on the development of planning algorithms that improve the prediction probability of future returns and on their economic implications for companies. The optimization model considers the company's position both internally and externally and provides the decision basis for corresponding strategic initiatives.
One-pot synthesis of micron partly hollow anisotropic dumbbell shaped silica core-shell particles
(2016)
A facile method is described to prepare micron-sized, partly hollow dumbbell silica particles in a single step. The obtained particles consist of a large dense part and a small hollow lobe. Both the spherical dense core and the hollow lobe are covered by mesoporous channels. In the case of the smaller lobe, these channels are responsible for the permeability of the shell, which was demonstrated by confocal imaging and spectroscopy.
In many automotive applications, repetitive self-heating is the most critical operating condition for LDMOS transistors in smart power ICs. This is attributed to thermomechanical stress in the on-chip metallization, which results from the different thermal expansion coefficients of the metal and the intermetal dielectric. After many cycles, the accumulated strain in the metallization can lead to short circuits, thus limiting the lifetime. Increasing the LDMOS size can help to lower peak temperatures and therefore reduce the stress; the downside is higher cost. Hence, it has been suggested to use resilient systems that monitor the LDMOS metallization and lower the stress once a certain level of degradation is reached. Lifetime requirements can then be fulfilled without oversizing LDMOS transistors, even though a certain performance loss has to be accepted. Such systems require suitable sensors for metal degradation. This work proposes a floating metal line embedded in the LDMOS metallization. The suitability of this approach has been investigated experimentally using test structures and shown to be promising. The obtained results are explained by means of numerical thermo-mechanical simulations.
A software process is the game plan to organize project teams and run projects. Yet, it still is a challenge to select the appropriate development approach for the respective context. A multitude of development approaches compete for the users’ favor, but there is no silver bullet serving all possible setups. Moreover, recent research as well as experience from practice shows companies utilizing different development approaches to assemble the best-fitting approach for the respective company: a more traditional process provides the basic framework to serve the organization, while project teams embody this framework with more agile (and/or lean) practices to keep their flexibility. The paper at hand provides insights into the HELENA study with which we aim to investigate the use of “Hybrid dEveLopmENt Approaches in software systems development”. We present the survey design and initial findings from the survey’s test runs. Furthermore, we outline the next steps towards the full survey.
Software Process Improvement (SPI) programs have been implemented, inter alia, to improve quality and speed of software development. SPI addresses many aspects ranging from individual developer skills to entire organizations. It comprises, for instance, the optimization of specific activities in the software lifecycle as well as the creation of organizational awareness and project culture. In the course of conducting a systematic mapping study on the state-of-the-art in SPI from a general perspective, we observed Software Quality Management (SQM) being of certain relevance in SPI programs. In this paper, we provide a detailed investigation of those papers from the overall systematic mapping study that were classified as addressing SPI in the context of SQM (including testing). From the main study’s result set, 92 papers were selected for an in-depth systematic review to study the contributions and to develop an initial picture of how these topics are addressed in SPI. Our findings show a fairly pragmatic contribution set in which different solutions are proposed, discussed, and evaluated. Among others, our findings indicate a certain reluctance towards standard quality or (test) maturity models and a strong focus on custom review, testing, and documentation techniques, whereas a set of five selected improvement measures is almost equally addressed.
This article discusses the scientifically and industrially important problem of automating the process of unloading goods from standard shipping containers. We outline some of the challenges barring further adoption of robotic solutions to this problem, ranging from handling a vast variety of shapes, sizes, weights, appearances, and packing arrangements of the goods, through hard demands on unloading speed and reliability, to ensuring that fragile goods are not damaged. We propose a modular and reconfigurable software framework in an attempt to efficiently address some of these challenges. We also outline the general framework design and the basic functionality of the core modules developed. We present two instantiations of the software system on two different fully integrated demonstrators: 1) coping with an industrial scenario, i.e., the automated unloading of coffee sacks with an already economically interesting performance; and 2) a scenario used to demonstrate the capabilities of our scientific and technological developments in the context of medium- to long-term prospects of automation in logistics. We performed evaluations that allowed us to summarize several important lessons learned and to identify future directions of research on autonomous robots for the handling of goods in logistics applications.
The efficiency of pharmaceutical research and development (R&D), as reflected in increasing costs, long timelines, and low probabilities of technical and regulatory success, has decreased continuously in the past years. Today, the costs for discovering and developing a new drug are enormously high, at more than USD 2 billion per new molecular entity (NME), while the average overall chance that a research project delivers an NME is in the single-digit percentage range, and the total R&D timeline easily exceeds 10 years, calling the return on investment (ROI) of pharmaceutical R&D into question. As a consequence, and also driven by numerous patent expirations of blockbuster drugs that increased the pressure to return to an acceptable ROI, the pharmaceutical industry addressed this challenge and its causes and identified several actions that need to be taken to increase the output/input ratio of R&D. This book chapter reviews the pipeline sizes and R&D investments of multinational pharmaceutical companies, describes new processes that have been implemented to increase the reach and reduce the costs of pharmaceutical R&D, and illustrates new innovation models developed to increase R&D efficiency.
Rapidly growing data volumes push today's analytical systems close to the feasible processing limit. Massive parallelism is one possible way to reduce the computational time of analytical algorithms. However, data transfer becomes a significant bottleneck, since moving data to code blocks system resources. Technological advances make it economically feasible to place compute units close to storage and perform data processing operations close to the data, minimizing data transfers and increasing scalability. Hence the principle of Near Data Processing (NDP) and the shift towards code-to-data. In the present paper we argue that the development of NDP system architectures will become an inevitable task in the future. Analytical DBMSs like HPE Vertica offer multiple points of impact with major advantages, which are presented in this paper.
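The data-to-code versus code-to-data distinction can be illustrated with a toy in-process "storage node". This is a conceptual sketch only, with invented function names, and is unrelated to HPE Vertica's actual implementation; it merely shows why pushing a predicate down to storage shrinks the transfer volume.

```python
# Conceptual sketch: classic data-to-code vs. NDP-style code-to-data.
# The "storage node" is simulated by an in-process list, purely for illustration.

def data_to_code(storage, predicate):
    """Classic: ship ALL rows to the compute node, then filter there."""
    transferred = list(storage)                           # full transfer
    return [row for row in transferred if predicate(row)], len(transferred)

def code_to_data(storage, predicate):
    """NDP: push the predicate down to storage, ship only matching rows."""
    matches = [row for row in storage if predicate(row)]  # filtered at storage
    return matches, len(matches)

rows = list(range(1_000))
pred = lambda x: x % 100 == 0                             # selective predicate

r1, moved1 = data_to_code(rows, pred)
r2, moved2 = code_to_data(rows, pred)
assert r1 == r2                                           # same result set
assert moved1 == 1000 and moved2 == 10                    # 100x less transferred
```

Both paths compute the same result; the difference is entirely in how many rows cross the storage-to-compute boundary, which is the bottleneck the abstract identifies.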
This paper is concerned with the study, optimization and control of the moisture sorption kinetics of agricultural products at temperatures typically found in processing and storage. A nonlinear autoregressive with exogenous inputs (NARX) neural network was developed to predict moisture sorption kinetics and consequently equilibrium moisture contents of shiitake mushrooms (Lentinula edodes (Berk.) Pegler) over a wide range of relative humidity and different temperatures. Sorption kinetic data of mushroom caps was separately generated using a continuous, gravimetric dynamic vapour sorption analyser at temperatures of 25-40 °C over a stepwise variation of relative humidity ranging from 0 to 85%. The predictive power of the neural network was based on physical data, namely relative humidity and temperature. The model was fed with a total of 4500 data points divided into three subsets: 70% of the data was used for training, 15% for testing and 15% for validation, randomly selected from the whole dataset. The NARX neural network was capable of precisely simulating equilibrium moisture contents of mushrooms derived from the dynamic vapour sorption kinetic data throughout the entire range of relative humidity.
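The 70/15/15 random split mentioned in the abstract can be sketched as follows. The function and the dummy data are invented for illustration; they are not the study's code or measurements.

```python
# Hypothetical sketch of a random 70/15/15 train/test/validation split.
import random

def split_dataset(samples, seed=0):
    """Shuffle indices once, then carve out 70% / 15% / 15% partitions."""
    idx = list(range(len(samples)))
    random.Random(seed).shuffle(idx)          # random selection from whole set
    n = len(samples)
    n_train = int(0.70 * n)
    n_test = int(0.15 * n)
    train = [samples[i] for i in idx[:n_train]]
    test = [samples[i] for i in idx[n_train:n_train + n_test]]
    val = [samples[i] for i in idx[n_train + n_test:]]
    return train, test, val

# 4500 dummy (relative humidity, temperature) points, matching the study's count.
data = [(rh, t) for rh in range(90) for t in range(50)]
train, test, val = split_dataset(data)
assert len(train) == 3150 and len(test) == 675 and len(val) == 675
```

Seeding the shuffle makes the partition reproducible; the published work would pair each input point with its measured equilibrium moisture content as the NARX target.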
The extracellular environment of vascular cells in vivo is complex in its chemical composition, physical properties, and architecture. Consequently, it has been a great challenge to study vascular cell responses in vitro, either to understand their interaction with their native environment or to investigate their interaction with artificial structures such as implant surfaces. New procedures and techniques from materials science to fabricate bio-scaffolds and surfaces have enabled novel studies of vascular cell responses under well-defined, controllable culture conditions. These advancements are paving the way for a deeper understanding of vascular cell biology and materials–cell interaction. Here, we review previous work focusing on the interaction of vascular smooth muscle cells (SMCs) and endothelial cells (ECs) with materials having micro- and nanostructured surfaces. We summarize fabrication techniques for surface topographies, materials, geometries, biochemical functionalization, and mechanical properties of such materials. Furthermore, various studies on vascular cell behavior and their biological responses to micro- and nanostructured surfaces are reviewed. Emphasis is given to studies of cell morphology and motility, cell proliferation, the cytoskeleton and cell-matrix adhesions, and signal transduction pathways of vascular cells. We conclude with a short outlook on potentially interesting future studies.
The internet of things, enterprise social networks, adaptive case management, mobility systems, analytics for big data, and cloud environments are emerging to support smart connected, i.e. digital, products and services and the digital transformation. Biological metaphors of living and adaptable ecosystems currently provide the logical foundation for resilient run-time environments with service-oriented digitization architectures and for self-optimizing intelligent business services and related distributed information systems. We investigate mechanisms for the flexible adaptation and evolution of information systems with digital architectures in the context of the ongoing digital transformation. The goal is to support flexible and agile transformations of both the business and the related information systems through adaptation and dynamic evolution of their digital architectures. The present research paper investigates mechanisms of decision analytics for digitization architectures, putting a spotlight on internet of things micro-granular architectures, by extending original enterprise architecture reference models with digitization architectures and their multi-perspective architectural decision management.
Motivation
(2016)
Since human beings started to work consciously with their environment, they have tried to improve the world they were living in. Early use of tools, increasing quality of these tools, use of new materials, fabrication of clay pots, and heat treatment of metals: all these were early steps of optimization. But even on lower levels of life than human beings or human society, we find optimization processes. The organization of a herd of buffalos to face their enemies, the coordinated strategies of these enemies to isolate some of the herd’s members, and the organization of bird swarms on their long flights to their winter quarters: all these social interactions are optimized strategies of long learning processes, most of them the result of a kind of collective intelligence acquired during long selection periods.
Adapting the characteristics of biomaterials specifically for in vitro and in vivo applications is becoming increasingly important in order to control interactions between materials and biological systems. These complex interactions are influenced by surface properties such as chemical composition, charge, and mechanical and topographic attributes. In many cases it is not useful, or even not possible, to alter the base material, but changing the surface, to improve biocompatibility or to make surfaces bioactive, can be achieved with thin coatings. An already established method is coating with polyelectrolyte multilayers (PEM). To adjust adhesion and proliferation and to improve the vitality of certain cell types, we modified the roughness of PEM coatings. We included different types of nanoparticles (NPs) at different concentrations in PEM coatings to control surface roughness. Surface properties were characterized, and the reaction of three different cell types to these coatings was tested.
The standard approach frequently observed in customer research projects often yields flawed results. We therefore argue for a "step back" to allow a holistic view of the toolbox of customer research instruments. Building on the first article in WiSt issue no. 4/2016, pp. 188–193, which described the initial situation and discussed the first two dimensions of customer analysis (object of research and research design), this second part addresses aspects of data analysis.
Customer research projects are often characterized by a limited focus on particular objects of investigation, research designs, and data analysis methods. Unfortunately, the frequently observed standard approach is not always correct and in many cases even yields flawed results. The discussion of the optimal object of investigation and the appropriate research design is the subject of this first part of the article.
One of the challenges in condition monitoring systems is residual lifetime prediction. This prediction is based on statistical methods, on physical knowledge about the considered process, or on a combination of these approaches. Physical knowledge of the system results from the long-term experience of process operators. However, it can also be gained by analyzing appropriately designed process models. The additional benefit of such models is that particular effects and their impact on the process behavior can be analyzed in detail, without plant operation and in a shorter time. The present contribution, developed in the framework of the research project Model Based Hierarchic Condition Monitoring, presents such models for the condition monitoring of roller chains. First, existing high-order dynamic models of such chains, given by nonlinear differential equations, are extended to incorporate effects that occur due to a deterioration of the chain condition. Then, a simple model is developed and compared to the high-order model. Based on the two models, the change in process behavior due to a deterioration of the roller chain condition is analyzed, illustrating that these models can be used in future research within the above-mentioned project to better predict the residual lifetime of the considered roller chains.
This paper analyzes different government debt relief programs in the European Monetary Union. I build a model and study different options ranging from debt relief to the European Stability Mechanism (ESM). The analysis reveals the following: First, patient countries repay their debt, while impatient countries are more likely to consume and default. Second, without ESM loans, indebted countries default anyway. Third, if the probability of a government being impatient is high, then the supply of loans is constrained. In general, sustainable and unsustainable governments should be incentivized differently, especially in a supranational monetary union. Finally, I develop policy recommendations for the ongoing debate in the Eurozone.
This book showcases new and innovative approaches to biometric data capture and analysis, focusing especially on those that are characterized by non-intrusiveness, reliable prediction algorithms, and high user acceptance. It comprises the peer-reviewed papers from the international workshop on the subject that was held in Ancona, Italy, in October 2014 and featured sessions on ICT for health care, biometric data in automotive and home applications, embedded systems for biometric data analysis, biometric data analysis: EMG and ECG, and ICT for gait analysis. The background to the book is the challenge posed by the prevention and treatment of common, widespread chronic diseases in modern, aging societies. Capture of biometric data is a cornerstone for any analysis and treatment strategy. The latest advances in sensor technology allow accurate data measurement in a non-intrusive way, and in many cases it is necessary to provide online monitoring and real-time data capturing to support a patient’s prevention plans or to allow medical professionals to access the patient’s current status. This book will be of value to all with an interest in this expanding field.
The term value-based selling first came into fashion in Europe around the turn of the millennium. Yet value-oriented selling is not actually that new: every good salesperson has always made the customer's benefits sufficiently transparent, even if nobody used to call this value-based selling. However, customer-benefit-oriented wording in the sales conversation is only one side of the coin. The value-based selling approach goes far beyond that; it has more substance than is widely known.
Methacrylated gelatin and mature adipocytes are promising components for adipose tissue engineering
(2016)
In vitro engineering of autologous fatty tissue constructs is still a major challenge for the treatment of congenital deformities, tumor resections or high-grade burns. In this study, we evaluated the suitability of photo-crosslinkable methacrylated gelatin (GM) and mature adipocytes as components for the composition of three-dimensional fatty tissue constructs. Cytocompatibility evaluations of the GM and the photoinitiator lithium phenyl-2,4,6-trimethylbenzoylphosphinate (LAP) showed no cytotoxicity in the relevant range of concentrations. The matrix stiffness of cell-laden hydrogels was adjusted by tuning the degree of crosslinking and was shown to be comparable to that of native fatty tissue. Mature adipocytes were then cultured for 14 days within the GM, resulting in a fatty tissue construct loaded with viable cells expressing the cell markers perilipin A and laminin. This work demonstrates that mature adipocytes are a highly valuable cell source for the composition of fatty tissue equivalents in vitro. Photo-crosslinkable methacrylated gelatin is an excellent tissue scaffold and a promising bioink for new printing techniques due to its biocompatibility and tunable properties.
The MIT Center for Information Systems Research surveyed 255 executives in 2015 to investigate how companies are managing business complexity. This report details the findings from our analysis of the survey data:
1. Some product complexity adds value, some does not. Specifically, companies with more links (aka integration) in their product and service portfolio are higher performing. - 2. Product variety makes it more difficult for customers and employees to get things done. These customer and employee difficulties impair a company's performance. - 3. Companies that excel at making it easy for employees and customers to get things done differentiate themselves by applying a set of complexity management practices around enterprise architecture, role reconfiguration, and the use of metrics and incentive systems.
Based on these findings, we recommend that companies make product complexity a strategic choice, invest in the above-mentioned complexity management practices, and use customer and employee difficulties as key metrics for product innovation.
This contribution describes the brand management of professional football clubs through the use of social media. To become somewhat independent of unplannable sporting success, football clubs should position themselves as brands. Traditionally, however, only a small marketing budget is available for this. Social media offers football clubs the opportunity to build and maintain their own brand relatively inexpensively and effectively. In this regard, the article explains the necessity of systematic brand management, addresses the particularities of marketing a professional football club, and uses examples to show how social media can be used to build and maintain a brand.
Facebook is currently the most widely used social network in the world. It is therefore not surprising that more and more companies are using Facebook as part of their marketing. The integration of Facebook into brand management is increasingly becoming a success factor for innovative companies. Professional brand management with this social network offers the opportunity to generate sustainable added value. This article examines the role of Facebook in brand management. In contrast to the growing awareness of the benefits of marketing with Facebook, the risks of inadequate use often go unnoticed. A hasty and improper implementation of Facebook in the marketing mix can result in enormous economic damage as well as a loss of reputation for the brand. To minimize this risk, this article derives success factors for the use of Facebook in brand management, based on an analysis of successful marketing campaigns and best-practice examples.
To increase revenue sustainably, football clubs have to sell themselves emotionally. "Mia san mia" and "Echte Liebe": Bayern München and Borussia Dortmund lead the way. Other clubs are losing their scruples. This contribution discusses the fine line between tradition, professionalization, commercialization, and ripping off fans.
This case study describes the emerging customized omnichannel loyalty solution of Marc O’Polo from a customer’s perspective. After an introduction to Marc O’Polo and its general omnichannel strategy, the loyalty program is described in detail, including Marc O’Polo for members, the mobile app, social media, direct mail, and in-store capabilities. A discussion chapter closes the case study with research implications and open questions for Marc O’Polo.
Managing software process evolution : traditional, agile and beyond - how to handle process change
(2016)
This book focuses on the design, development, management, governance and application of evolving software processes that are aligned with changing business objectives, such as expansion to new domains or shifting to global production. In the context of an evolving business world, it examines the complete software process lifecycle, from the initial definition of a product to its systematic improvement. In doing so, it addresses difficult problems, such as how to implement processes in highly regulated domains or where to find a suitable notation system for documenting processes, and provides essential insights and tips to help readers manage process evolutions. And last but not least, it provides a wealth of examples and cases on how to deal with software evolution in practice.
Reflecting these topics, the book is divided into three parts. Part 1 focuses on software business transformation and addresses the questions of which process(es) to use and adapt, and how to organize process improvement programs. Subsequently, Part 2 mainly addresses process modeling. Lastly, Part 3 collects concrete approaches, experiences, and recommendations that can help to improve software processes, with a particular focus on specific lifecycle phases.
This book is aimed at anyone interested in understanding and optimizing software development tasks at their organization. While the experiences and ideas presented will be useful for both those readers who are unfamiliar with software process improvement and want to get an overview of the different aspects of the topic, and for those who are experts with many years of experience, it particularly targets the needs of researchers and Ph.D. students in the area of software and systems engineering or information systems who study advanced topics concerning the organization and management of (software development) projects and process improvements projects.
The digital economy has created intense demands for innovations. Companies are responding in part by creating new digital products and services to meet increasing customer expectations.
MIT CISR findings indicate that product variety is NOT directly related to firm performance, and IS related to increased difficulties for customers and employees.