Business process management and IT-supported processes are a topical subject. Finding a business process system that best implements one's processes is not easy and takes a lot of time. This article offers a recommendation for an open source system. Four selected open source workflow management systems are tested and analyzed. The main evaluation criteria are listed in a criteria catalogue and weighted by experts according to their importance. Finally, the systems are evaluated against these criteria, and the best-rated system is recommended.
Willingness-to-pay for alternative fuel vehicle characteristics : a stated choice study for Germany
(2016)
In the light of European energy efficiency and clean air regulations, as well as an ambitious electric mobility goal of the German government, we examine consumer preferences for alternative fuel vehicles (AFVs) based on a Germany-wide discrete choice experiment among 711 potential car buyers. We estimate consumers’ willingness-to-pay and compensating variation (CV) for improvements in vehicle attributes, also taking taste differences in the population into account by applying a latent class model with six distinct consumer segments. Our results indicate that about one third of the consumers are oriented towards at least one AFV option, with almost half of them being AFV-affine, i.e. showing a high probability of choosing AFVs despite their current shortcomings. Our results suggest that German car buyers’ willingness-to-pay for improvements of the various vehicle attributes varies considerably across consumer groups and that the vehicle features have to meet certain minimum requirements before AFVs are considered. The CV values show that decision-makers in administration and industry should focus on the most promising consumer group of ‘AFV aficionados’ and their needs. They also show that some vehicle attribute improvements could increase the demand for AFVs cost-effectively, and that consumers would accept surcharges for some vehicle attributes at a level which could enable their private provision and economic operation (e.g. fast-charging infrastructure). Improvement of other attributes will need governmental subsidies to compensate for insufficient consumer valuation (e.g. battery capacity).
Software development consists to a large extent of human-based processes with continuously increasing demands regarding interdisciplinary team work. Understanding the dynamics of software teams is highly important to successful project execution. Hence, for future project managers, knowledge about non-technical processes in teams is significant. In this paper, we present a course unit that provides an environment in which students can learn and experience the impact of group dynamics on project performance and quality. The course unit uses the Tuckman model as its theoretical framework, and borrows from controlled experiments to organize and implement its practical parts, in which students experience the effects of, e.g., time pressure, resource bottlenecks, staff turnover, loss of key personnel, and other stress factors. We provide a detailed design of the course unit to allow for implementation in further software project management courses. Furthermore, we report experiences obtained from two instances of this unit conducted in Munich and Karlskrona with 36 graduate students. We observed students building awareness of stress factors and developing countermeasures to reduce the impact of those factors. Moreover, students experienced which problems occur when teams work under stress and how to form a performing team despite exceptional situations.
The question of why individuals adopt information technology has been present in information systems research for the past quarter century. One of the most widely used models for predicting technology usage was introduced by Fred Davis: the Technology Acceptance Model (TAM). It describes the influence of perceived usefulness and perceived ease of use on attitude, behavioral intention and system usage. The first two factors are in turn influenced by external variables. Although a plethora of papers exists about the TAM, an extensive analysis of the role of the external variables in the model is still missing. This paper aims to give an overview of the most important variables. In an extensive literature review, we identified 763 relevant papers, found 552 unique external variables, characterized the most important of them, and described the frequency of their appearance. Additionally, we grouped these variables into four categories (organizational characteristics, system characteristics, user personal characteristics, and other variables). Afterwards we discuss the results and show implications for theory and practice.
Wasted paradise – imagining the Maldives without the garbage island of Thilafushi : Version 1.2
(2016)
To address the high level of waste production in the Maldives, the local government decided in 1992 to transform the coral island of Thilafushi into an immense waste dump. Meanwhile, 330 tons of waste are ferried to Thilafushi each day. The policy had the positive consequence of relieving the garbage burden in Malé, the main island, and the surrounding tourist atolls. However, it can also lead to serious environmental and economic damage in the long run. First, the garbage is within visual range of one of the most prominent tourist destinations. Second, if the wind blows a certain way, unfiltered fumes from burning waste travel to tourist atolls. Third, water quality can erode as hazardous waste from batteries and other toxic waste floats in the ocean. Over time, these effects can accumulate and significantly reduce the number of tourists that travel to the Maldives – one of the state’s main sources of income. In our paper, we lay out the situation in more detail and translate it into a simulation model. We test different policies in order to propose to the Maldives government how to better solve the waste problem.
Based on well-established robotic concepts of autonomous localization and navigation, we present a system prototype to assist camera-based indoor navigation for human use, implemented in the Robot Operating System (ROS). Our prototype takes advantage of state-of-the-art computer vision and robotic methods and is designed for assistive indoor guidance. We employ a vibro-tactile belt as a guiding device, rendering derived motion suggestions to the user via vibration patterns. We evaluated the effectiveness of a variety of vibro-tactile feedback patterns for the guidance of blindfolded users. Our prototype demonstrates that a vision-based system can support human navigation, and may also assist the visually impaired in a human-centered way.
Today, metal forming processes are optimized using advanced simulation tools in a virtual process, e.g. FEM studies. Modifying the free parameters yields the different variants to be analysed, so experienced engineers may derive useful proposals in acceptable time if good initial proposals are available. As soon as the number of free parameters grows, or the total process takes a long time and involves several successive forming steps, it can be quite difficult to find promising initial ideas. In metal forming a further problem has to be considered: optimization via a series of local improvements, often called a gradient approach, may find a local optimum, but this can be far from a satisfactory solution. Therefore non-deterministic approaches, e.g. Bionic Optimization, have to be used. Approaches such as Evolutionary Optimization or Particle Swarm Optimization are capable of covering a large range of high-dimensional optimization spaces and discovering many local optima, so the chance of including the global optimum increases when such non-deterministic methods are used. Unfortunately, these bionic methods require large numbers of studies of different variants of the process to be optimized. The number of studies tends to increase exponentially with the number of free parameters of the forming process. As the time for a single study may not be negligible either, the total time demand becomes unacceptable, taking weeks to months even if high-performance computing is used. Therefore the optimization process needs to be accelerated. Among the many ideas for reducing the time and computing power required, Meta- and Hybrid Optimization seem to produce the most efficient results. Hybrid Optimization often consists of global searches for promising regions within the parameter space. As soon as the studies indicate that there could be a local optimum, a deterministic study tries to identify this local region.
If this region shows better performance than the other optima found so far, it is preserved for a more detailed analysis; if it performs worse, it is excluded from further search. Meta-Optimization is often understood as the derivation of Response Surfaces as functions of the free parameters. Once enough studies have been performed, the optimization is done using the Response Surfaces as representatives, e.g. for the goal and the restrictions of the optimization problem. Having found regions where interesting solutions are to be expected, the studies available so far are used to define the Response Surfaces; in many cases low-degree polynomials are used, with their coefficients determined by least-squares methods. Both proposals, Hybrid Optimization and Meta-Optimization, sometimes used in combination, often help to reduce the total optimization process by large numbers of variants to be studied. Consequently, they are highly recommended when dealing with time-consuming optimization studies.
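The swarm-based search described in this abstract can be illustrated with a minimal Particle Swarm Optimization sketch. The objective below is a toy quadratic standing in for an expensive forming simulation, and the parameter values (inertia `w`, acceleration constants `c1`/`c2`) are generic textbook defaults, not values from the paper:

```python
import random

def pso(f, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimise f over a box via basic particle swarm optimisation."""
    dim = len(bounds)
    rnd = random.Random(42)
    pos = [[rnd.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]               # each particle's best position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # swarm's best so far
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rnd.random(), rnd.random()
                # velocity: inertia + pull towards personal and global best
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # move and clamp to the search box
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy stand-in for a forming simulation: a smooth 2D objective with optimum (1, -2).
best, val = pso(lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2, [(-5, 5), (-5, 5)])
```

In a real forming application each call to `f` would trigger a full FEM study, which is exactly why the number of evaluations, and hence the need for meta-models, dominates the total cost.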
This practical guide for advanced students and decision-makers in the pharma and biotech industry presents key success factors in R&D along with value creators in pharmaceutical innovation. A team of editors and authors with extensive experience in academia and industry and at some of the most prestigious business schools in Europe discusses in detail the innovation process in pharma as well as common and new research and innovation strategies. In doing so, they cover collaboration and partnerships, open innovation, biopharmaceuticals, translational medicine, good manufacturing practice, regulatory affairs, and portfolio management. Each chapter covers controversial aspects of recent developments in the pharmaceutical industry, with the aim of stimulating productive debates on the most effective and efficient innovation processes. A must-have for young professionals and MBA students preparing to enter R&D in pharma or biotech as well as for students on a combined BA/biomedical and natural sciences program.
Context: Companies need capabilities to evaluate the customer value of software-intensive products and services. One way of systematically acquiring data on customer value is running continuous experiments as part of the overall development process. Objective: This paper investigates the first steps of transitioning towards continuous experimentation in a large company, including the challenges faced. Method: We conduct a single-case study using participant observation, interviews, and qualitative analysis of the collected data. Results: The results show that continuous experimentation was well received by the practitioners, and that practising experimentation helped them to enhance their understanding of their product value and user needs. Although the complexities of a large multi-stakeholder business-to-business (B2B) environment presented several challenges, such as inaccessible users, it was possible to address impediments and integrate an experiment into an ongoing development project. Conclusion: Developing the capability for continuous experimentation in large organisations is a learning process which can be supported by a systematic introduction approach with the guidance of experts. We gained experience by introducing the approach on a small scale in a large organisation, and one of the major steps for future work is to understand how this can be scaled up to the whole development organisation.
IT environments that consist of a very large number of rather small structures like microservices, Internet of Things (IoT) components, or mobility systems are emerging to support flexible and agile products and services in the age of digital transformation. Biological metaphors of living and adaptable ecosystems with service-oriented enterprise architectures provide the foundation for self-optimizing, resilient run-time environments and distributed information systems. We are extending Enterprise Architecture (EA) methodologies and models that cover a high degree of heterogeneity and distribution to support the digital transformation and related information systems with micro-granular architectures. Our aim is to support flexibility and agile transformation for both IT and business capabilities within adaptable digital enterprise architectures. The present research paper investigates mechanisms for integrating Microservice Architectures (MSA) by extending original enterprise architecture reference models with elements for more flexible architectural metamodels and EA-mini-descriptions.
Analysis is an important part of the enterprise architecture management process. Prior to decisions regarding the transformation of the enterprise architecture, the current situation and the outcomes of alternative action plans have to be analysed. Many analysis approaches have been proposed by researchers, and current enterprise architecture management tools implement analysis functionalities. However, little work has been done on structuring and classifying enterprise architecture analysis approaches. This paper collects and extends existing classification schemes, presenting a framework for the classification of enterprise architecture analysis. For evaluation, a collection of enterprise architecture analysis approaches has been classified based on this framework. As a result, the description of these approaches has been assessed, a common set of important categories for enterprise architecture analysis classification has been derived, and suggestions for further development are made.
Instead of waiting for and constantly adapting to the details of political interventions, utilities need to view their environment from a holistic perspective. The unique position of the company - be it a local utility, a bigger player, or an international utility specializing in specific segments - has to be the basis of its goals and strategies. But without a consistent translation of these goals and strategies into processes, structures, and company culture, a strategy remains pure theory. Companies need to engage in a continuing learning process. This means being willing to pass on strategies, to slow down or speed up, to work from a different angle, etc.
The reduced research and development (R&D) efficiency, strong competition from generics, increased cost pressure from payers, and an increased biological complexity of new target indications have resulted in a rethinking and a change from a traditional and more closed R&D model in the pharmaceutical industry toward the new paradigm of open innovation. In the past years, pharmaceutical companies have broadened their external networks toward research collaborations with academic institutes, technology providers, or codevelopment partners. To fulfill the demand to reduce timelines and costs, research-based pharmaceutical companies started to outsource R&D activities. In addition, internal R&D processes were adjusted to the more open R&D model and new processes such as alliance management were established. The corporate frontier of pharmaceutical companies became permeable and more open. As a result, the focus of pharmaceutical R&D expanded from a purely internal toward a mixed internal and external model. Today, the U.S. pharmaceutical company Eli Lilly may have established the most open model toward external innovation, as it has integrated its innovation processes with its business model. Other companies are following this more open R&D model with newer concepts such as new frontier sciences, drug discovery alliances, public-private partnerships, innovation incubators, virtual R&D, crowdsourcing, open source innovation, and innovation camps.
During the first years of their employment, graduates are a liability to industry. The employer goes the extra mile to bridge the gap between exiting university and the profitable employment of engineering graduates. Unfortunately, some employers cannot take this risk. Given this scenario, this paper presents a learning factory approach as a platform for the application of knowledge, so as to develop the required engineering competences in South African engineering graduates before they enter the labour market. It spells out the components of a Stellenbosch University Learning Factory geared towards producing engineering graduates with the required industrial skills. It elaborates on the didactics embedded in the learning factory environment, tailor-made to produce engineers who can productively contribute to the growth of industry upon exiting university.
In a recent publication Novy-Marx (2013) finds evidence that the variable gross profitability has a strong statistical influence on the common variation of stock returns. He also points out that there is common variation in stock returns related to firm profitability that is not captured by the three-factor model of Fama and French (1993). Thus, this thesis augments the three-factor model by the factor gross profitability and examines whether a profitability-based four-factor model is able to better explain monthly portfolio excess returns on the German stock market compared to the three-factor model of Fama and French (1993) and the Capital Asset Pricing Model (CAPM). Based on monthly stock returns of the CDAX over the period July 2008 to June 2014 this thesis documents four main findings. First, a significant positive market risk premium and a significant positive value premium can be identified. No evidence is found for a size or a profitability effect. Second, all included factors have a strong significant effect on monthly portfolio excess returns. Third, the four-factor model clearly outperforms both the three-factor model of Fama and French (1993) and the CAPM in capturing the common variation in monthly portfolio excess returns. The CAPM performs worst. Finally, the results indicate that the three-factor model of Fama and French (1993) is somewhat better in explaining the cross-section of portfolio excess returns than the four-factor model. Again, the CAPM performs worst. Nevertheless, the four-factor model is considered to be an improvement over the three-factor model of Fama and French (1993) and the CAPM in determining stock returns on the German stock market.
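The four-factor model examined in this thesis reduces, per test portfolio, to an OLS regression of monthly excess returns on the factor premia. The sketch below uses synthetic data in place of the CDAX sample; the factor values, loadings, and sample length are illustrative assumptions, not figures from the study:

```python
import numpy as np

def estimate_factor_model(excess_returns, factors):
    """OLS of portfolio excess returns on factor premia; returns alpha and betas."""
    X = np.column_stack([np.ones(len(factors)), factors])  # intercept + factor columns
    coef, *_ = np.linalg.lstsq(X, excess_returns, rcond=None)
    return coef[0], coef[1:]  # intercept (alpha), factor loadings (betas)

# Hypothetical monthly data: market, SMB, HML and a gross-profitability factor
# over 72 months (matching the July 2008 - June 2014 horizon in length only).
rng = np.random.default_rng(0)
F = rng.normal(0, 0.04, size=(72, 4))
true_betas = np.array([1.1, 0.3, 0.4, 0.5])
r = F @ true_betas + rng.normal(0, 0.001, 72)  # portfolio with near-zero alpha
alpha, betas = estimate_factor_model(r, F)
```

A model "capturing the common variation" well corresponds to high R² and an alpha statistically indistinguishable from zero across the test portfolios.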
Clinical development is historically the phase in which a potential new medicine is being tested in phase 2 and phase 3 patient trials to demonstrate the new molecules' efficacy and safety to support the regulatory approval of drugs by health authorities. This relatively focused approach has been considerably expanded by a number of forces from within the pharmaceutical industry and equally important by changes in the healthcare systems. The need to identify the optimal patient population, showstoppers leading to discontinuation of clinical programs, the silent but constant removal of surrogate endpoints for registration, and the increased demand for real-life data which are used to demonstrate the patients' benefit and which have an ever-increasing role for pricing and reimbursement negotiations are today an integral part of this phase.
This chapter will review not only the nuts and bolts of clinical development but also recent developments in this area which shape the environment, how the different players have reacted, and what options might need to be explored in the future.
Virtual prototyping of integrated mixed-signal smart sensor systems requires high-performance co-simulation of analog frontend circuitry with complex digital controller hardware and embedded real-time software. We use SystemC/TLM 2.0 in conjunction with a cycle-count accurate temporal decoupling approach (TD) to simulate digital components and firmware code execution at high speed while preserving clock-cycle accuracy and, thus, real-time behavior at time quantum boundaries. Optimal time quanta ensuring real-time capability can be calculated and set automatically during simulation if the simulation engine has access to exact timing information about upcoming inter-process communication events. These methods fail in the case of non-deterministic, asynchronous events, resulting in potentially invalid simulation results. In this paper, we propose an extension to the case of asynchronous events generated by blackbox sources from which a priori event timing information is not available, such as coupled analog simulators or hardware in the loop. Additional event processing latency or rollback effort caused by temporal decoupling is minimized by calculating optimal time quanta dynamically in a SystemC model using a linear prediction scheme. We analyze the theoretical performance of the presented predictive temporal decoupling approach (PTD) by deriving a cost model that expresses the expected simulation effort in terms of key parameters such as time quantum size and CPU time per simulation cycle. For an exemplary smart-sensor system model, we show that quasi-periodic events that trigger activities in TD processes are handled accurately after the predictor has settled.
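The core quantum-sizing idea, ending the time quantum at the predicted time of the next asynchronous event, can be sketched with a simple moving-average predictor for a quasi-periodic source. This is a schematic illustration only, not the paper's actual linear prediction scheme, and the function names are invented for the example:

```python
def predict_next_event(event_times, history=4):
    """Predict the next event time of a quasi-periodic blackbox source
    from the mean of its most recent inter-event intervals."""
    if len(event_times) < 2:
        return None  # not enough history to predict anything
    recent = event_times[-(history + 1):]
    intervals = [b - a for a, b in zip(recent, recent[1:])]
    return event_times[-1] + sum(intervals) / len(intervals)

def next_quantum(now, predicted_event, max_quantum):
    """Size the temporally decoupled quantum so it ends at the predicted
    event (avoiding rollback), capped by the configured maximum."""
    if predicted_event is None:
        return max_quantum
    return max(1, min(max_quantum, predicted_event - now))

# Events observed so far at t = 0, 10, 20, 30 -> next one predicted at t = 40,
# so a process currently at t = 30 gets a quantum of 10 time units.
nxt = predict_next_event([0, 10, 20, 30])
q = next_quantum(30, nxt, max_quantum=100)
```

In a SystemC/TLM 2.0 model the resulting quantum would be handed to the quantum keeper; the prediction error then determines the rollback or latency cost the paper's cost model quantifies.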
The Internet of Things (IoT) refers to the interconnectedness of physical objects, and works by equipping the latter with sensors and actuators as a means to connect to the internet. The number of connected things has increased threefold over the past five years. Consequently, firms expect the IoT to become a source of new business models driven by technology. However, only a few early adopters have started to install and use IoT appliances on a frequent basis, so it is still unclear which factors drive the technological acceptance of IoT appliances. Confronting this gap in current research, the present paper explores how IoT appliances are conceptually defined, which factors drive their technological acceptance, and how firms can use the results to improve value propositions in corresponding business models. It is found that IoT appliance vendors need to support a broad focus, as potential buyers are highly diverse. Drawing conclusions from this insight, the paper illustrates some flexible marketing strategies.
Systemic Constellation describes an approach that enables practitioners to examine and address typical issues in diversity management from a different, relational perspective. Systemic Constellation utilizes the human ability to recognize the qualities of relationships between two or more people from their spatial alignment to each other (transverbal language) and the capability to illustrate inner pictures by placing humans or objects in a room as representatives (representative perception). Systemic Constellation originated in the field of family therapy and counseling, but through research, guidance work, and teaching activities over the last two decades, it has developed into a generic, structural constellation logic with multiple methods of application. It has been adapted to a variety of topics and issues, and to a number of constellation formats. This article serves as a starting point for the transfer of Systemic Constellation into diversity management. It appears that conventional approaches taught in traditional management classes (such as focusing on tools, setting targets, planning measures, and offering incentives) are of limited use when trying to deal with problematic situations in diversity management. Preliminary trials show that new solutions and insights into deeper underlying dynamics can be gained on personal and institutional levels when applying Systemic Constellation. Participants find the application of the model very beneficial. Systemic Constellation is grounded in personal experience, particularly in a person’s own experience of the consistency of representative perception. This viewpoint can only be conveyed rudimentarily in a scientific article. Readers should feel encouraged to apply Systemic Constellations themselves and use them in their work, experimentally and professionally. To harness the full potential of Systemic Constellations in diversity management, further research needs to be done.
Optimization-based analog layout automation has not yet found evident acceptance in industry due to the complexity of the design problem. This paper presents Self-organized Wiring and Arrangement of Responsive Modules (SWARM), which is able to consider crucial design constraints both implicitly and explicitly. The flexibility of algorithmic methods and the expert knowledge captured in PCells combine into a flow of supervised module interaction. This novel approach targets the creation of constraint-compliant layout blocks which fit into a specified zone. By provoking a synergetic self-organization, even optimal layout solutions can emerge from the interaction. Various examples depict the power of this new concept and its potential for future developments.
Despite 30 years of Electronic Design Automation, analog IC layouts are still handcrafted in a laborious fashion today due to the complex challenge of considering all relevant design constraints. This paper presents Self-organized Wiring and Arrangement of Responsive Modules (SWARM), a novel approach addressing the problem with a multi-agent system: autonomous layout modules interact with each other to evoke the emergence of overall compact arrangements that fit within a given layout zone. SWARM’s unique advantage over conventional optimization-based and procedural approaches is its ability to consider crucial design constraints both explicitly and implicitly. Several given examples show that by inducing a synergistic flow of self-organization, remarkable layout results can emerge from SWARM’s decentralized decision-making model.
The deterioration of the shielding performance of electromagnetic interference finger stock gaskets in a corrosive environment is investigated. Visualization of the real contact area shows a drastic reduction of the engaged active contact region between the fingers and their mating surfaces in the presence of corrosive residues. In fact, additional openings occur besides the “T-like” holes due to the porous nature of the gaskets. This leads to a strong degradation of the shielding effectiveness. A modified Bethe theory is used to estimate the equivalent circuit parameters, while the shielding effectiveness, expressed as the ratio between two transfer functions, is obtained by applying filter theory. Quantitative measurements carried out for different gasket types show good agreement with the calculated results, thus demonstrating the validity of the approach.
Besides optimising the car itself, energy efficiency and safety can also be increased by optimising driving behaviour. Based on this fact, a driving system is in development whose goal is to educate the driver in energy-efficient and safe driving. It monitors the driver, the car and the environment and gives recommendations relevant to energy efficiency and safety. However, the driving system tries not to distract or bother the driver with recommendations, for example during stressful driving situations or when the driver is not interested in a recommendation. Therefore, the driving system monitors the stress level of the driver as well as the driver's reaction to a given recommendation, and decides whether to give a recommendation or not. This allows recommendations to be suppressed when needed and, thus, increases the road safety and the user acceptance of the driving system.
Stress is becoming an important topic in modern life. The influence of stress results in a higher rate of health disorders such as burnout, heart problems, obesity, asthma, diabetes, depression and many others. Furthermore, an individual’s behavior and capabilities can be directly affected, leading to altered cognition and impaired decision-making and problem-solving skills. In a dynamic and unpredictable environment such as driving, this can result in a higher risk of accidents. Different papers have addressed the estimation as well as the prediction of drivers’ stress levels during driving. Another important question concerns not only the stress level of the driver himself, but also the influence on and of a group of other drivers nearby. This paper proposes a system which determines groups of nearby drivers as clusters and derives their individual stress levels. This information is analyzed to generate a stress map, a graphical view of road sections with a higher stress influence. The aggregated data can be used to generate navigation routes with a lower stress influence, in order to decrease stress-influenced driving and improve road safety.
Stress is recognized as a predominant disease with rising costs for rehabilitation and treatment. Currently there are several approaches for determining and calculating stress levels. The methods for determining stress are usually divided into two categories. The first category does not require any special equipment for measuring stress; it uses the variations in behaviour patterns that occur under stress. Its core disadvantage is its limitation to specific use cases. The second category uses laboratory instruments and biological sensors. It allows stress to be measured precisely and proficiently, but at the same time these setups are neither mobile nor transportable and do not support real-time feedback. This work presents a mobile system that provides the calculation of stress. To achieve this, the data of a mobile ECG sensor is analysed, processed and visualised on a mobile system such as a smartphone. This work also explains the stress measurement algorithm used. The result of this work is a portable system that can be used with a mobile device such as a smartphone as the visual interface for reporting the current stress level.
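ECG-based stress estimation typically derives a heart-rate-variability (HRV) measure from successive RR intervals. The sketch below uses RMSSD, a standard HRV statistic, as a stand-in; the abstract does not disclose the actual algorithm, so both functions and the rest-baseline comparison are illustrative assumptions:

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive RR-interval differences (a common HRV measure).
    Lower HRV is generally associated with higher sympathetic activation (stress)."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def stress_indicator(rr_intervals_ms, rest_rmssd):
    """Crude relative stress score in [0, 1]: how far current HRV has
    dropped below the person's resting baseline."""
    current = rmssd(rr_intervals_ms)
    return max(0.0, 1.0 - current / rest_rmssd)

# Hypothetical RR intervals in milliseconds from a mobile ECG sensor,
# compared against an assumed resting RMSSD of 40 ms.
score = stress_indicator([800, 810, 790, 805], rest_rmssd=40.0)
```

On a smartphone, such a score could be computed over a sliding window of RR intervals and rendered on the visual interface in near real time.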
Sport marketing is the specific application of marketing principles and processes to sports products and services. In 2014 the biggest sports event in the world, the FIFA World Cup, took place in Brazil. Billions of spectators around the world saw Germany win the trophy in Rio de Janeiro for the fourth time in history. Yet unlike in previous World Cups, conversation did not only take place at the numerous public viewings held in open spaces such as bars and restaurants. Throughout the tournament, social media such as Facebook and Twitter played a dominant role in all aspects. With 672 million tweets on Twitter and three billion conversations on Facebook, this was the most social World Cup as well as the most social mega sports event so far. Whether users, athletes or companies, everyone was trying to join the conversation to be informed or to inform others about their opinions or the latest news. This paper analyzes the implementation of social media marketing during mega sports events, focusing on Adidas’ and Nike’s social media campaigns around the FIFA World Cup 2014 in Brazil. The analysis shows that social media marketing around mega sports events is gaining importance. Those companies that find topics which affect people personally and relate to their products achieve success through social media marketing.
Information Systems in Distributed Environment (ISDE) is becoming a prominent standard in this era of globalization due to advances in information and communication technologies. The advent of the internet has supported Distributed Software Development (DSD) by introducing new concepts and opportunities, resulting in benefits such as scalability, flexibility, interdependence, reduced cost, resource pools, and usage tracking. The distributed development of information systems, as well as their deployment and operation in distributed environments, imposes new challenges on software organizations and can lead to business advantages. In distributed environments, business units collaborate across time zones, organizational boundaries, work cultures and geographical distances, which has ultimately led to increasing diversification and growing complexity of cooperation among units. The real-world practice of development, deployment and operation of information systems in globally distributed projects has been viewed from various perspectives, though technical and engineering viewpoints, in conjunction with managerial and organizational ones, have dominated researchers' attention so far. Successful participation in distributed environments, however, is ultimately a matter of the participants understanding and exploiting the particularities of their respective local contexts at specific points in time and exploring practical solutions through the local resources available.
This special issue of the Computer Standards & Interfaces journal therefore includes papers received from the public call for papers as well as extended and improved versions of papers selected from the best of the International Workshop on Information Systems in Distributed Environment (ISDE 2014). It aims to serve as a forum that brings together academics, researchers, practitioners and students in the field of distributed information systems, presenting novel developments and lessons learned from real-world cases, and promoting the exchange of ideas, discussion and advancement in these areas.
Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out there? Are there new trends and emerging approaches? What are open issues? Still, we struggle to answer these questions about the current state of SPI and related research. In this article, we present results from an updated systematic mapping study to shed light on the field of SPI, to develop a big picture of the state of the art, and to draw conclusions for future research directions. An analysis of 769 publications draws a big picture of SPI-related research of the past quarter-century. Our study shows a high number of solution proposals, experience reports, and secondary studies, but only few theories and models on SPI in general. In particular, standard SPI models like CMMI and ISO/IEC 15504 are analyzed, enhanced, and evaluated for applicability in practice, but these standards are also critically discussed, e.g., from the perspective of SPI in small-to-medium-sized companies, which leads to new specialized frameworks. New and specialized frameworks account for the majority of the contributions found (approx. 38%). Furthermore, we find a growing interest in success factors (approx. 16%) to aid companies in conducting SPI and in adapting agile principles and practices for SPI (approx. 10%). Beyond these specific topics, the study results also show an increasing interest in secondary studies with the purpose of aggregating and structuring SPI-related knowledge. Finally, the present study helps direct future research by identifying under-researched topics awaiting further investigation.
Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out there? Are there new emerging approaches? What are open issues? Still, we struggle to answer the question of what the current state of SPI and related research is. We present initial results from a systematic mapping study to shed light on the field of SPI and to draw conclusions for future research directions. An analysis of 635 publications draws a big picture of SPI-related research of the past 25 years. Our study shows a high number of solution proposals, experience reports, and secondary studies, but only few theories. In particular, standard SPI models are analyzed and evaluated for applicability, especially from the perspective of SPI in small-to-medium-sized companies, which leads to new specialized frameworks. Furthermore, we find a growing interest in success factors to aid companies in conducting SPI.
This summary refers to the paper Software process improvement : where is the evidence? [Ku15].
This paper was published as full research paper in the ICSSP’2015 proceedings.
Although still in the early stages of diffusion, smartwatches represent the most popular type of wearable device. Yet, little is known about why some people are more likely to adopt smartwatches than others. To deepen the understanding of the underlying factors prompting adoption behavior, the authors develop a theoretical model grounded in technology acceptance and social psychology literature. Empirical results reveal perceived usefulness and visibility as important factors that drive intention. The magnitude of these antecedents is influenced by an individual's perception of smartwatches as a technology and/or as a fashion accessory. Theoretical and managerial implications are discussed.
Significant advances have been achieved in mobile robot localization and mapping in dynamic environments; however, existing approaches are mostly incapable of dealing with the physical properties of automotive radar sensors. In this paper we present an accurate and robust solution to this problem by introducing a memory-efficient cluster map representation. Our approach is validated by experiments on a public parking space with pedestrians, moving cars, and different parking configurations, providing a challenging dynamic environment. The results prove its ability to reproducibly localize our vehicle within an error margin of below 1% with respect to ground truth using only point-based radar targets. A decay process enables our map representation to support local updates.
In this paper we present our work in progress on revisiting traditional DBMS mechanisms to manage space on native Flash and how it is administered by the DBA. Our observations and initial results show that the standard logical database structures can be used for the physical organization of data on native Flash, and that at the same time higher DBMS performance is achieved without incurring extra DBA overhead. An initial experimental evaluation indicates a 20% increase in transactional throughput under TPC-C by performing intelligent data placement on Flash, with fewer erase operations and thus better Flash longevity.
Using a Fabry-Pérot microresonator with controllable cavity lengths in the λ/2-regime, we show the controlled modification of the vibronic relaxation dynamics of a fluorescent dye molecule in the spectral and time domain. By altering the photonic mode density around the fluorophores we are able to shape the fluorescence spectrum and enhance specifically the probability of the radiative transitions from the electronic excited state to distinct vibronic excited states of the electronic ground state. Analysis and correlation of the spectral and time resolved measurements by a theoretical model and a global fitting procedure allows us to reveal quantitatively the spectrally distributed radiative and non-radiative relaxation dynamics of the respective dye molecule under ambient conditions at the ensemble level.
Here we report a simple way to enhance the resolution of a confocal scanning microscope under cryogenic conditions. Using a microscope objective (MO) with high numerical aperture (NA = 1.25) and 1-propanol as an immersion fluid with low freezing temperature we were able to reach an imaging resolution at 160 K comparable to ambient conditions. The MO and the sample were both placed inside the inner chamber of the cryostat to reduce distortions induced by temperature gradients. The image quality of our commercially available MO was further enhanced by scanning the sample (sample scanning) in contrast to beam scanning. The ease of the whole procedure marks an essential step towards the development of cryo high-resolution microscopy and correlative light and electron cryo microscopy (cryoCLEM).
As long as there have been professional sports, there have been relationships on different levels. For example, sponsorship (or patronage, as it was called in the early days) was mostly based on personal relations between local benefactors and their favourite sports club. Regarding media, clubs have always maintained special relationships with selected journalists. The bond between fans and their clubs has always been a close and mutually beneficial one. All these relationships existed from the start of the sports business. Therefore, relationship marketing is nothing new in the context of sports. Many sporting organisations have always known the value of a deep and good relationship with their stakeholders and practised relationship marketing without being aware of it. Successful sports managers, however, take this old wisdom and turn it into a modern relationship marketing approach by structuring the various relationships in order to make them more effective and profitable for their own sporting organisation and its various stakeholders. This chapter ... illustrates the many facets of relationship marketing and the possibilities it offers in the context of the sports business.
Herein, the optimization of the physicochemical properties and surface biocompatibility of polyelectrolyte multilayers of the natural, biocompatible and biodegradable linear polysaccharides hyaluronan and chitosan by Hofmeister anions was systematically investigated. We demonstrated that there is an interconnection between the bulk and surface properties of HA/Chi multilayers, both varying in accordance with the arrangement of the anions in the Hofmeister series. Kosmotropic anions increased the hydration, thickness, micro- and macro-roughness, and hydrophilicity of the films and improved their biocompatibility through a reduction of the films' stiffness by two orders of magnitude and complete anti-thrombogenicity.
Recycling of poly(ethylene terephthalate) (PET) is of crucial importance, since worldwide amounts of PET waste increase rapidly due to its widespread applications. Hence, several methods have been developed, like energetic, material, thermo-mechanical and chemical recycling of PET. Most frequently, PET waste is incinerated for energy recovery, used as an additive in concrete composites, or glycolysed to yield mixtures of monomers and undefined oligomers. While energetic and thermo-mechanical recycling entail downcycling of the material, chemical recycling requires considerable amounts of chemicals and demanding processing steps, entailing toxic and ecological issues. This review provides a thorough survey of PET recycling, including energetic, material, thermo-mechanical and chemical methods. It focuses on chemical methods, describing important reaction parameters and yields of the obtained reaction products. While most methods yield monomers, only a few yield undefined low-molecular-weight oligomers for less demanding applications (dispersants or plasticizers). Further, this review presents an alternative chemical recycling method for PET in comparison to existing chemical methods.
Reality mining refers to an application of data mining that uses sensor data to derive behavioral patterns in the real world. Research in this field started a decade ago, however, when technology was far behind today's state of the art. This paper discusses which requirements are now posed to applications in the context of reality mining. A survey shows which sensors are available in state-of-the-art smartphones and usable to gather data for reality mining. As another contribution of this paper, a reality mining application architecture is proposed to facilitate the implementation of such applications. A proof of concept verifies the assumptions made on reality mining and the presented architecture.
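As a toy illustration of the kind of behaviour-pattern extraction reality mining performs on smartphone sensor data, the sketch below labels windows of accelerometer samples by the variance of their magnitude. The `classify_activity` function, its threshold, and the sample data are invented for this example and are not taken from the paper's proposed architecture.

```python
from statistics import pvariance

def magnitude(sample):
    """Euclidean norm of one 3-axis accelerometer sample (x, y, z in g)."""
    x, y, z = sample
    return (x * x + y * y + z * z) ** 0.5

def classify_activity(window, threshold=0.2):
    """Label a window of accelerometer samples as 'active' or 'idle'
    using the variance of the acceleration magnitude.
    The threshold is an assumed, illustrative value."""
    magnitudes = [magnitude(s) for s in window]
    return "active" if pvariance(magnitudes) > threshold else "idle"

idle_window = [(0.0, 0.0, 1.0)] * 10  # phone resting flat: ~1 g, no variation
walking_window = [(0.0, 0.0, 1.0), (0.5, 0.3, 1.8), (0.1, 0.0, 0.4),
                  (0.6, 0.2, 2.0), (0.0, 0.1, 0.9), (0.4, 0.3, 1.7)]
print(classify_activity(idle_window), classify_activity(walking_window))
```

A full reality mining pipeline would combine such per-sensor features (location, accelerometer, call logs) over much longer time scales; this only shows the basic sensor-to-pattern step.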
We present a fully automatic approach to real-time 3D face reconstruction from monocular in-the-wild videos. We use a 3D morphable face model to obtain a semi-dense shape and combine it with a fast median-based super-resolution technique to obtain a high-fidelity textured 3D face model. Our system does not need prior training and is designed to work in uncontrolled scenarios.
Context: An experiment-driven approach to software product and service development is gaining increasing attention as a way to channel limited resources to the efficient creation of customer value. In this approach, software capabilities are developed incrementally and validated in continuous experiments with stakeholders such as customers and users. The experiments provide factual feedback for guiding subsequent development.
Objective: This paper explores the state of the practice of experimentation in the software industry. It also identifies the key challenges and success factors that practitioners associate with the approach.
Method: A qualitative survey based on semi-structured interviews and thematic coding analysis was conducted. Ten Finnish software development companies, represented by thirteen interviewees, participated in the study.
Results: The study found that although the principles of continuous experimentation resonated with industry practitioners, the state of the practice is not yet mature. In particular, experimentation is rarely systematic and continuous. Key challenges relate to changing the organizational culture, accelerating the development cycle speed, and finding the right measures for customer value and product success. Success factors include a supportive organizational culture, deep customer and domain knowledge, and the availability of the relevant skills and tools to conduct experiments.
Conclusions: It is concluded that the major issues in moving towards continuous experimentation are on an organizational level; most significant technical challenges have been solved. An evolutionary approach is proposed as a way to transition towards experiment-driven development.
Pultrusion of braids
(2016)
Customer needs and requirements are becoming increasingly diverse, and consumers more and more want to express their individuality with the products they buy. Due to the emergence of the internet and the possibilities it offers, customers no longer play only a passive role, but are actually enabled to determine what they are purchasing. Therefore, customisation or personalisation approaches like the miadidas concept from adidas, providing customised performance shoes or sneakers, are more popular than ever. The prosumer concept already plays an important role in trying to satisfy the demands of customers in the future. As apparel for outdoor activities represents the largest and most important part of the sporting goods market in Germany and is still expected to grow, the purpose of this study is, on the one hand, to identify the diverse prosumer concepts that exist and, on the other hand, to examine to what extent companies in the outdoor industry have already implemented prosumer concepts. A content analysis of the homepages and online shops of 30 different European and North American outdoor brands was conducted. Results show that companies in the outdoor industry have already implemented several prosumer concepts, but most of them concentrate mainly on one prosumer approach and on involving professional users of their products.
We have seen that bionic optimization can be a powerful tool when applied to problems with non-trivial landscapes of goals and restrictions. This, in turn, led us to a discussion of useful methodologies for applying this optimization to real problems. On the other hand, it must be stated that each optimization is a time-consuming process as soon as the problem expands beyond a small number of free parameters related to simple parabolic responses. Bionic optimization is not a quick approach to solving complex questions within a short time. In some cases it has the potential to fail entirely, either by becoming stuck at local maxima or by randomly exploring the parameter space without finding any promising solutions. The following sections present some remarks on the efficiency and limitations users must be aware of. They aim to increase the knowledge base for applying bionic optimization. But they should not discourage potential users from this promising field of powerful strategies for finding good or even the best possible designs.
Preface of IDEA 2015
(2016)
In times of e-commerce and digitalization, new markets are opening, young companies have the opportunity to grow, and new perspectives arise in terms of customer relationships. Customers demand more possibilities for personalization. At the same time, companies have access to new and, above all, more information about the customer. This combination could evolve into a great opportunity were it not for privacy issues. Vast amounts of data about consumers are collected in Big Data warehouses. These can be analyzed via predictive analytics, and customers can be classified by algorithms such as clustering models, propensity models or collaborative filtering. All these subjects are growing in importance, as they are shaping the global marketing landscape. Marketers, together with IT scientists, develop new ways of analyzing customer databases and benefit from more accurate segmentation methods than those used until now. The following paper provides a literature review on new methods of consumer segmentation in view of the high inflow of new information via e-commerce. It introduces readers to the subject of predictive analytics and discusses several predictive models. The paper is not based on original empirical research, but shall serve as a reference text for further research. A conclusion completes the paper.
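To illustrate the clustering models mentioned above, here is a minimal k-means segmentation in pure Python. The feature choice (annual spend, orders per year) and all data are invented for the example; a production system would typically use a library such as scikit-learn instead of this hand-rolled sketch.

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means clustering for customer segmentation.
    Each point is a feature tuple, e.g. (annual spend in EUR, orders per year)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # assign every point to its nearest center (squared Euclidean distance)
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centers[c])))
            clusters[nearest].append(p)
        # recompute each centroid; keep the old center for empty clusters
        new_centers = []
        for i, cluster in enumerate(clusters):
            if cluster:
                new_centers.append(tuple(sum(xs) / len(xs) for xs in zip(*cluster)))
            else:
                new_centers.append(centers[i])
        centers = new_centers
    return centers, clusters

# Invented example data: three occasional buyers, three heavy buyers
customers = [(100, 2), (120, 3), (110, 2), (900, 20), (950, 22), (880, 19)]
centers, segments = kmeans(customers, k=2)
print(segments)
```

On well-separated data like this the two segments recover the occasional and heavy buyers regardless of the random initialization; real customer data would need feature scaling before clustering.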
An enormous amount of data in the context of business processes is stored as images. These images contain valuable information for business process management, but up to now this data had to be integrated into the business process manually. Thanks to advances in image capturing, it is possible to extract information from an increasing number of images. Therefore, we systematically investigate the potential of Image Mining for business process management through a literature review and an in-depth analysis of the business process lifecycle. As a first step to evaluate our research, we developed a prototype for recovering process model information from drawings using Rapidminer.
Due to the complexity of assembly processes, a high ratio of tasks is still performed by human workers. Short-cyclically changing work contents due to smaller lot sizes, especially in varied series assembly, increase both the need for information support and the risk of rising physical and psychological stress. The use of technical and digital assistance systems can counter these challenges. Through the integration of information and communication technology as well as collaborative assembly technologies, hybrid cyber-physical assembly systems will emerge. Widely established assembly planning approaches for digital and technical support systems in cyber-physical assembly systems are outlined and discussed with regard to synergies and delimitations of planning perspectives.