Business process management and IT-supported processes are a current topic. Finding a business process management system that implements your processes in the best way is not easy and takes a lot of time. This article provides a recommendation for an open source system. Four selected open source workflow management systems are tested and analyzed. The main criteria for the evaluation are listed in a criteria catalogue and weighted by experts according to their importance. Finally, the systems are evaluated against the criteria and the best-rated system is recommended.
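An expert-weighted criteria catalogue of this kind reduces to a weighted-sum score per system. A minimal Python sketch, assuming hypothetical criteria, weights, and ratings (none of which are taken from the article):

```python
# Hypothetical weighted-sum evaluation of workflow management systems.
# Criteria, expert weights, and ratings are illustrative only.
criteria_weights = {"usability": 0.3, "documentation": 0.2,
                    "standards_support": 0.3, "community": 0.2}

ratings = {  # rating per system and criterion on a 1..5 scale (made up)
    "SystemA": {"usability": 4, "documentation": 3,
                "standards_support": 5, "community": 4},
    "SystemB": {"usability": 3, "documentation": 4,
                "standards_support": 3, "community": 5},
}

def weighted_score(system_ratings, weights):
    """Weighted sum of expert-weighted criterion ratings."""
    return sum(weights[c] * r for c, r in system_ratings.items())

scores = {name: weighted_score(r, criteria_weights) for name, r in ratings.items()}
best = max(scores, key=scores.get)
print(scores, "-> recommended:", best)
```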
Willingness-to-pay for alternative fuel vehicle characteristics : a stated choice study for Germany
(2016)
In the light of European energy efficiency and clean air regulations, as well as an ambitious electric mobility goal of the German government, we examine consumer preferences for alternative fuel vehicles (AFVs) based on a Germany-wide discrete choice experiment among 711 potential car buyers. We estimate consumers’ willingness-to-pay and compensating variation (CV) for improvements in vehicle attributes, also taking taste differences in the population into account by applying a latent class model with six distinct consumer segments. Our results indicate that about one third of the consumers are oriented towards at least one AFV option, with almost half of them being AFV-affine, showing a high probability of choosing AFVs despite their current shortcomings. Our results suggest that German car buyers’ willingness-to-pay for improvements of the various vehicle attributes varies considerably across consumer groups and that the vehicle features have to meet some minimum requirements for AFVs to be considered. The CV values show that decision-makers in administration and industry should focus on the most promising consumer group of ‘AFV aficionados’ and their needs. They also show that some vehicle attribute improvements could increase the demand for AFVs cost-effectively, and that consumers would accept surcharges for some vehicle attributes at a level which could enable their private provision and economic operation (e.g. fast-charging infrastructure). Improvement of other attributes will need governmental subsidies to compensate for insufficient consumer valuation (e.g. battery capacity).
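In discrete choice models of this kind, willingness-to-pay for an attribute is commonly derived as the negative ratio of the attribute coefficient to the price coefficient. A minimal sketch with purely illustrative coefficients (the paper estimates such coefficients per latent class):

```python
# Illustrative WTP derivation from multinomial-logit coefficients.
# All coefficient values are invented, not the paper's estimates.
beta = {
    "price_per_1000eur": -0.25,   # utility per 1000 EUR of purchase price
    "range_per_100km": 0.40,      # utility per additional 100 km of range
    "fast_charging": 0.30,        # utility of fast-charging availability
}

def wtp(attribute):
    """WTP in price units (here: 1000 EUR) per unit of the attribute."""
    return -beta[attribute] / beta["price_per_1000eur"]

print(f"WTP for +100 km range: {wtp('range_per_100km'):.2f} x 1000 EUR")
print(f"WTP for fast charging: {wtp('fast_charging'):.2f} x 1000 EUR")
```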
Software development consists to a large extent of human-based processes with continuously increasing demands regarding interdisciplinary team work. Understanding the dynamics of software teams can be seen as highly important to successful project execution. Hence, for future project managers, knowledge about non-technical processes in teams is significant. In this paper, we present a course unit that provides an environment in which students can learn and experience the impact of group dynamics on project performance and quality. The course unit uses the Tuckman model as its theoretical framework and borrows from controlled experiments to organize and implement its practical parts, in which students then experience the effects of, e.g., time pressure, resource bottlenecks, staff turnover, loss of key personnel, and other stress factors. We provide a detailed design of the course unit to allow for implementation in further software project management courses. Furthermore, we provide experiences obtained from two instances of this unit conducted in Munich and Karlskrona with 36 graduate students. We observed students building awareness of stress factors and developing countermeasures to reduce the impact of those factors. Moreover, students experienced which problems occur when teams work under stress and how to form a performing team despite exceptional situations.
The question of why individuals adopt information technology has been present in information systems research for the past quarter century. One of the most widely used models for predicting technology usage was introduced by Fred Davis: the Technology Acceptance Model (TAM). It describes the influence of perceived usefulness and perceived ease of use on attitude, behavioral intention, and system usage. The first two factors in turn are influenced by external variables. Although a plethora of papers exists about the TAM, an extensive analysis of the role of the external variables in the model is still missing. This paper aims to give an overview of the most important variables. In an extensive literature review, we identified 763 relevant papers, found 552 unique single external variables, characterized the most important of them, and described the frequency of their appearance. Additionally, we grouped these variables into four categories (organizational characteristics, system characteristics, user personal characteristics, and other variables). Afterwards, we discuss the results and show implications for theory and practice.
Wasted paradise – imagining the Maldives without the garbage island of Thilafushi : Version 1.2
(2016)
To address the high level of waste production in the Maldives, the local government decided in 1992 to transform the coral island of Thilafushi into an immense waste dump. Today, 330 tons of waste are ferried to Thilafushi each day. The policy had the positive consequence of relieving the garbage burden in Malé, the main island, and the surrounding tourist atolls. However, it can also lead to serious environmental and economic damage in the long run. First, the garbage is in visual range of one of the most prominent tourist destinations. Second, if the wind blows a certain way, unfiltered fumes from burning waste travel to tourist atolls. Third, water quality can deteriorate as hazardous waste from batteries and other toxic waste floats in the ocean. Over time, these effects can accumulate and significantly reduce the number of tourists that travel to the Maldives – one of the state’s main sources of income. In our paper, we lay out the situation in more detail and translate it into a simulation model. We test different policies in order to propose to the government of the Maldives how to better solve the waste problem.
Based on well-established robotic concepts of autonomous localization and navigation, we present a system prototype, implemented in the Robot Operating System (ROS), that assists camera-based indoor navigation for humans. Our prototype takes advantage of state-of-the-art computer vision and robotic methods and is designed for assistive indoor guidance. We employ a vibro-tactile belt as a guiding device that renders derived motion suggestions to the user via vibration patterns. We evaluated the effectiveness of a variety of vibro-tactile feedback patterns for the guidance of blindfolded users. Our prototype demonstrates that a vision-based system can support human navigation and may also assist the visually impaired in a human-centered way.
Today the optimization of metal forming processes is done using advanced simulation tools in a virtual process, e.g. FEM studies. The modification of the free parameters represents the different variants to be analysed. Experienced engineers may derive useful proposals in an acceptable time if good initial proposals are available. As soon as the number of free parameters grows, or the total process takes a long time and uses several successive forming steps, it may be quite difficult to find promising initial ideas. In metal forming another problem has to be considered: an optimization using a series of local improvements, often called a gradient approach, may find a local optimum, but this could be far away from a satisfactory solution. Therefore non-deterministic approaches, e.g. Bionic Optimization, have to be used. Approaches like Evolutionary Optimization or Particle Swarm Optimization are capable of covering a large range of high-dimensional optimization spaces and of discovering many local optima. So the chance of including the global optimum increases when using such non-deterministic methods. Unfortunately, these bionic methods require large numbers of studies of different variants of the process to be optimized. The number of studies tends to increase exponentially with the number of free parameters of the forming process. As the time for one single study may not be small either, the total time demand becomes unacceptable, taking weeks to months even if high-performance computing is used. Therefore the optimization process needs to be accelerated. Among the many ideas to reduce the time and computing power required, Meta- and Hybrid Optimization seem to produce the most efficient results. Hybrid Optimization often consists of global searches of promising regions within the parameter space. As soon as the studies indicate that there could be a local optimum, a deterministic study tries to identify this local region. If it shows better performance than the other optima found so far, it is preserved for a more detailed analysis. If it performs worse than other optima, the region is excluded from further search. Meta-Optimization is often understood as the derivation of Response Surfaces of the functions of the free parameters. Once enough studies have been performed, the optimization is done using the Response Surfaces as representatives, e.g. for the goal and the restrictions of the optimization problem. Having found regions where interesting solutions are to be expected, the studies available so far are used to define the Response Surfaces. In many cases low-degree polynomials are used, with their coefficients defined by least-squares methods. Both proposals, Hybrid Optimization and Meta-Optimization, sometimes used in combination, often help to shorten the total optimization process by large numbers of variants that need not be studied. In consequence, they are highly recommended when dealing with time-consuming optimization studies.
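Of the bionic methods named above, Particle Swarm Optimization is perhaps the easiest to sketch. The following generic Python sketch uses a cheap stand-in objective; in the forming context every call to it would be one expensive simulation run (all parameter values are illustrative):

```python
import random

# Minimal Particle Swarm Optimization sketch (illustrative objective).
def objective(x):  # stands in for one expensive forming-process simulation
    return sum((xi - 1.0) ** 2 for xi in x)

dim, n_particles, iters = 4, 20, 100
w, c1, c2 = 0.7, 1.5, 1.5  # inertia, cognitive and social weights

pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
vel = [[0.0] * dim for _ in range(n_particles)]
pbest = [p[:] for p in pos]
pbest_val = [objective(p) for p in pos]
gbest = min(pbest, key=objective)

for _ in range(iters):
    for i in range(n_particles):
        for d in range(dim):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (w * vel[i][d]
                         + c1 * r1 * (pbest[i][d] - pos[i][d])
                         + c2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        val = objective(pos[i])
        if val < pbest_val[i]:  # update personal and global best
            pbest[i], pbest_val[i] = pos[i][:], val
            if val < objective(gbest):
                gbest = pos[i][:]

print("best point:", gbest, "value:", objective(gbest))
```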
This practical guide for advanced students and decision-makers in the pharma and biotech industry presents key success factors in R&D along with value creators in pharmaceutical innovation. A team of editors and authors with extensive experience in academia and industry and at some of the most prestigious business schools in Europe discusses in detail the innovation process in pharma as well as common and new research and innovation strategies. In doing so, they cover collaboration and partnerships, open innovation, biopharmaceuticals, translational medicine, good manufacturing practice, regulatory affairs, and portfolio management. Each chapter covers controversial aspects of recent developments in the pharmaceutical industry, with the aim of stimulating productive debates on the most effective and efficient innovation processes. A must-have for young professionals and MBA students preparing to enter R&D in pharma or biotech as well as for students on a combined BA/biomedical and natural sciences program.
Context: Companies need capabilities to evaluate the customer value of software-intensive products and services. One way of systematically acquiring data on customer value is running continuous experiments as part of the overall development process. Objective: This paper investigates the first steps of transitioning towards continuous experimentation in a large company, including the challenges faced. Method: We conduct a single-case study using participant observation, interviews, and qualitative analysis of the collected data. Results: Results show that continuous experimentation was well received by the practitioners, and practising experimentation helped them to enhance their understanding of product value and user needs. Although the complexities of a large multi-stakeholder business-to-business (B2B) environment presented several challenges, such as inaccessible users, it was possible to address impediments and integrate an experiment into an ongoing development project. Conclusion: Developing the capability for continuous experimentation in large organisations is a learning process which can be supported by a systematic introduction approach with the guidance of experts. We gained experience by introducing the approach on a small scale in a large organisation, and one of the major steps for future work is to understand how this can be scaled up to the whole development organisation.
IT environments that consist of a very large number of rather small structures like microservices, Internet of Things (IoT) components, or mobility systems are emerging to support flexible and agile products and services in the age of digital transformation. Biological metaphors of living and adaptable ecosystems with service-oriented enterprise architectures provide the foundation for self-optimizing, resilient run-time environments and distributed information systems. We are extending Enterprise Architecture (EA) methodologies and models that cover a high degree of heterogeneity and distribution to support the digital transformation and related information systems with micro-granular architectures. Our aim is to support flexibility and agile transformation for both IT and business capabilities within adaptable digital enterprise architectures. The present research paper investigates mechanisms for integrating Microservice Architectures (MSA) by extending original enterprise architecture reference models with elements for more flexible architectural metamodels and EA-mini-descriptions.
Analysis is an important part of the enterprise architecture management process. Prior to decisions regarding transformation of the enterprise architecture, the current situation and the outcomes of alternative action plans have to be analysed. Many analysis approaches have been proposed by researchers, and current enterprise architecture management tools implement analysis functionalities. However, little work has been done on structuring and classifying enterprise architecture analysis approaches. This paper collects and extends existing classification schemes, presenting a framework for the classification of enterprise architecture analysis. For evaluation, a collection of enterprise architecture analysis approaches has been classified based on this framework. As a result, the description of these approaches has been assessed, a common set of important categories for enterprise architecture analysis classification has been derived, and suggestions for further development are drawn.
Instead of waiting for and constantly adapting to the details of political interventions, utilities need to look at their environment from a holistic perspective. The unique position of the company – be it a local utility, a bigger player, or an international utility specializing in specific segments – has to be the basis of goals and strategies. But without consistent translation of these goals and strategies into processes, structures, and company culture, a strategy remains pure theory. Companies need to engage in a continuing learning process. This means being willing to pass on strategies, to slow down or speed up, to work from a different angle, etc.
The reduced research and development (R&D) efficiency, strong competition from generics, increased cost pressure from payers, and an increased biological complexity of new target indications have resulted in a rethinking and a change from a traditional, more closed R&D model in the pharmaceutical industry toward the new paradigm of open innovation. In recent years, pharmaceutical companies have broadened their external networks toward research collaborations with academic institutes, technology providers, or codevelopment partners. To fulfill the demand to reduce timelines and costs, research-based pharmaceutical companies started to outsource R&D activities. In addition, internal R&D processes were adjusted to the more open R&D model, and new processes such as alliance management were established. The corporate frontier of pharmaceutical companies became permeable and more open. As a result, the focus of pharmaceutical R&D expanded from a purely internal toward a mixed internal and external model. Today, the U.S. pharmaceutical company Eli Lilly may have established the most open model toward external innovation, as it has integrated its innovation processes with its business model. Other companies are following this more open R&D model with newer concepts such as new frontier sciences, drug discovery alliances, public-private partnerships, innovation incubators, virtual R&D, crowdsourcing, open source innovation, and innovation camps.
During the first years of their employment, graduates are a liability to industry. The employer has to go the extra mile to bridge the gap between leaving university and the profitable employment of engineering graduates. Unfortunately, some cannot take this risk. Given this scenario, this paper presents a learning factory approach as a platform for the application of knowledge, so as to develop the required engineering competences in South African engineering graduates before they enter the labour market. It spells out the components of a Stellenbosch University Learning Factory geared towards producing engineering graduates with the required industrial skills. It elaborates on the didactics embedded in the learning factory environment, tailor-made to produce engineers who can productively contribute to the growth of the industry upon exiting the university.
In a recent publication, Novy-Marx (2013) finds evidence that the variable gross profitability has a strong statistical influence on the common variation of stock returns. He also points out that there is common variation in stock returns related to firm profitability that is not captured by the three-factor model of Fama and French (1993). Thus, this thesis augments the three-factor model by the factor gross profitability and examines whether a profitability-based four-factor model is able to better explain monthly portfolio excess returns on the German stock market compared to the three-factor model of Fama and French (1993) and the Capital Asset Pricing Model (CAPM). Based on monthly stock returns of the CDAX over the period July 2008 to June 2014, this thesis documents four main findings. First, a significant positive market risk premium and a significant positive value premium can be identified. No evidence is found for a size or a profitability effect. Second, all included factors have a strongly significant effect on monthly portfolio excess returns. Third, the four-factor model clearly outperforms both the three-factor model of Fama and French (1993) and the CAPM in capturing the common variation in monthly portfolio excess returns. The CAPM performs worst. Finally, the results indicate that the three-factor model of Fama and French (1993) is somewhat better at explaining the cross-section of portfolio excess returns than the four-factor model. Again, the CAPM performs worst. Nevertheless, the four-factor model is considered an improvement over the three-factor model of Fama and French (1993) and the CAPM in determining stock returns on the German stock market.
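Comparisons of this kind rest on time-series regressions of portfolio excess returns on the factor returns. A minimal numpy sketch, with randomly generated stand-in data rather than the thesis's CDAX-based factors (PMU here denotes the profitability factor):

```python
import numpy as np

# Illustrative four-factor time-series regression:
# R_p - R_f = a + b*MKT + s*SMB + h*HML + p*PMU + e
rng = np.random.default_rng(0)
T = 72  # e.g. monthly observations, July 2008 - June 2014

# Randomly generated stand-ins for the factor return series.
MKT, SMB, HML, PMU = rng.normal(0, 0.04, (4, T))
excess_returns = 0.002 + 1.1 * MKT + 0.3 * HML + rng.normal(0, 0.01, T)

X = np.column_stack([np.ones(T), MKT, SMB, HML, PMU])
coef, *_ = np.linalg.lstsq(X, excess_returns, rcond=None)

resid = excess_returns - X @ coef
r2 = 1 - resid.var() / excess_returns.var()
print("alpha, b, s, h, p:", np.round(coef, 3), " R^2:", round(r2, 3))
```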
Clinical development is historically the phase in which a potential new medicine is tested in phase 2 and phase 3 patient trials to demonstrate the new molecule's efficacy and safety and to support the regulatory approval of drugs by health authorities. This relatively focused approach has been considerably expanded by a number of forces from within the pharmaceutical industry and, equally importantly, by changes in the healthcare systems. The need to identify the optimal patient population, showstoppers leading to the discontinuation of clinical programs, the silent but constant removal of surrogate endpoints for registration, and the increased demand for real-life data, which are used to demonstrate the patients' benefit and which have an ever-increasing role in pricing and reimbursement negotiations, are today an integral part of this phase.
This chapter reviews not only the nuts and bolts of clinical development but also recent developments that shape the environment, how the different players have reacted, and what options might need to be explored in the future.
Virtual prototyping of integrated mixed-signal smart sensor systems requires high-performance co-simulation of analog frontend circuitry with complex digital controller hardware and embedded real-time software. We use SystemC/TLM 2.0 in conjunction with a cycle-count accurate temporal decoupling approach (TD) to simulate digital components and firmware code execution at high speed while preserving clock-cycle accuracy and, thus, real-time behavior at time quantum boundaries. Optimal time quanta ensuring real-time capability can be calculated and set automatically during simulation if the simulation engine has access to exact timing information about upcoming inter-process communication events. These methods fail in the case of non-deterministic, asynchronous events, resulting in potentially invalid simulation results. In this paper, we propose an extension to the case of asynchronous events generated by black-box sources from which a priori event timing information is not available, such as coupled analog simulators or hardware in the loop. Additional event processing latency or rollback effort caused by temporal decoupling is minimized by calculating optimal time quanta dynamically in a SystemC model using a linear prediction scheme. We analyze the theoretical performance of the presented predictive temporal decoupling approach (PTD) by deriving a cost model that expresses the expected simulation effort in terms of key parameters such as time quantum size and CPU time per simulation cycle. For an exemplary smart-sensor system model, we show that quasi-periodic events that trigger activities in TD processes are handled accurately after the predictor has settled.
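The core idea of predictive temporal decoupling can be pictured as a predictor over recent inter-event intervals that sizes the next time quantum. A heavily simplified Python sketch, where the class name, the averaging scheme, and all numbers are assumptions for illustration rather than the paper's SystemC implementation:

```python
from collections import deque

# Simplified prediction of the next asynchronous event time, used to
# size the temporal-decoupling quantum. Illustrative only.
class QuantumPredictor:
    def __init__(self, history=8, min_quantum=10, max_quantum=10_000):
        self.intervals = deque(maxlen=history)  # recent inter-event gaps (ns)
        self.last_event = None
        self.min_q, self.max_q = min_quantum, max_quantum

    def observe_event(self, t_ns):
        """Record an asynchronous event from a black-box source."""
        if self.last_event is not None:
            self.intervals.append(t_ns - self.last_event)
        self.last_event = t_ns

    def next_quantum(self, now_ns):
        """Quantum up to the predicted next event (linear extrapolation)."""
        if not self.intervals or self.last_event is None:
            return self.min_q  # no history yet: stay conservative
        predicted = self.last_event + sum(self.intervals) / len(self.intervals)
        return max(self.min_q, min(self.max_q, predicted - now_ns))

p = QuantumPredictor()
for t in (0, 1000, 2050, 2980):  # quasi-periodic events, ~1000 ns apart
    p.observe_event(t)
print("suggested quantum at t=3000 ns:", p.next_quantum(3000))
```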
The Internet of Things (IoT) refers to the interconnectedness of physical objects and works by equipping the latter with sensors and actuators as a means to connect to the internet. The number of connected things has increased threefold over the past five years. Consequently, firms expect the IoT to become a source of new business models driven by technology. However, only a few early adopters have started to install and use IoT appliances on a frequent basis, so it is still unclear which factors drive the technological acceptance of IoT appliances. Confronting this gap in current research, the present paper explores how IoT appliances are conceptually defined, which factors drive the technological acceptance of IoT appliances, and how firms can use the results to improve value propositions in corresponding business models. It is discovered that IoT appliance vendors need to support a broad focus, as the potential buyers are highly heterogeneous. Drawing on this insight, the paper illustrates some flexible marketing strategies.
Systemic Constellation describes an approach that enables practitioners to examine and address typical issues in diversity management from a different, relational perspective. Systemic Constellation utilizes the human ability to recognize the qualities of relationships between two or more people from their spatial alignment to each other (transverbal language) and the capability to illustrate inner pictures by placing humans or objects in a room as representatives (representative perception). Systemic Constellation originated in the field of family therapy and counseling, but through research, guidance work, and teaching activities over the last two decades, it has developed into a generic, structural constellation logic with multiple methods of application. It has been adapted to a variety of topics and issues and to a number of constellation formats. This article serves as a starting point for the transfer of Systemic Constellation into diversity management. It appears that conventional approaches taught in traditional management classes (such as focusing on tools, setting targets, planning measures, and offering incentives) are of limited use when trying to deal with problematic situations in diversity management. Preliminary trials show that new solutions and insights into deeper underlying dynamics can be gained on personal and institutional levels when applying Systemic Constellation. Participants find the application of the model very beneficial. Systemic Constellation is grounded in personal experience and particularly in a person’s own experience of the consistency of representative perception. This viewpoint can only be conveyed rudimentarily in a scientific article. Readers should feel encouraged to apply Systemic Constellations themselves and use them in their work, experimentally and professionally. To harness the full potential of Systemic Constellations in diversity management, further research needs to be done.
Optimization-based analog layout automation does not yet find evident acceptance in the industry due to the complexity of the design problem. This paper presents a Self-organized Wiring and Arrangement of Responsive Modules (SWARM), able to consider crucial design constraints both implicitly and explicitly. The flexibility of algorithmic methods and the expert knowledge captured in PCells combine into a flow of supervised module interaction. This novel approach targets the creation of constraint-compliant layout blocks which fit into a specified zone. Provoking a synergetic self-organization, even optimal layout solutions can emerge from the interaction. Various examples depict the power of that new concept and the potential for future developments.
Despite 30 years of Electronic Design Automation, analog IC layouts are still handcrafted in a laborious fashion today due to the complex challenge of considering all relevant design constraints. This paper presents Self-organized Wiring and Arrangement of Responsive Modules (SWARM), a novel approach addressing the problem with a multi-agent system: autonomous layout modules interact with each other to evoke the emergence of overall compact arrangements that fit within a given layout zone. SWARM's unique advantage over conventional optimization-based and procedural approaches is its ability to consider crucial design constraints both explicitly and implicitly. Several given examples show that by inducing a synergistic flow of self-organization, remarkable layout results can emerge from SWARM's decentralized decision-making model.
The deterioration of the shielding performance of electromagnetic interference finger stock gaskets in a corrosive environment is investigated. The visualization of the real contact area shows a drastic reduction of the engaged active contact region between fingers and their mating surfaces in the presence of corrosive residues. In fact, additional openings occur besides the “T-like” holes due to the porous nature of the gaskets. This leads to a strong degradation of the shielding effectiveness. Modified Bethe theory is used to estimate the equivalent circuit parameters, while the shielding effectiveness, in terms of the ratio between two transfer functions, is obtained by applying filter theory. Quantitative measurements carried out for different gasket types show a good agreement with the calculated results, thus demonstrating the validity of the approach.
Besides the optimisation of the car itself, energy efficiency and safety can also be increased by optimising driving behaviour. Based on this fact, a driving system is in development whose goal is to educate the driver in energy-efficient and safe driving. It monitors the driver, the car, and the environment and gives recommendations relevant to energy efficiency and safety. However, the driving system tries not to distract or bother the driver with recommendations, for example during stressful driving situations or when the driver is not interested in a recommendation. Therefore, the driving system monitors the stress level of the driver as well as the driver's reaction to a given recommendation and decides whether to give a recommendation or not. This makes it possible to suppress recommendations when needed and thus to increase road safety and user acceptance of the driving system.
Stress is becoming an important topic in modern life. The influence of stress results in a higher rate of health disorders such as burnout, heart problems, obesity, asthma, diabetes, depression, and many others. Furthermore, an individual's behavior and capabilities can be directly affected, leading to altered cognition and impaired decision-making and problem-solving skills. In a dynamic and unpredictable environment such as driving, this can result in a higher risk of accidents. Several papers have addressed the estimation as well as the prediction of drivers' stress levels during driving. Another important question concerns not only the stress level of the driver himself but also the influence on and of a group of other drivers in the surrounding area. This paper proposes a system which determines groups of drivers in a surrounding area as clusters and derives their individual stress levels. This information is analyzed to generate a stress map, which provides a graphical view of road sections with a higher stress influence. The aggregated data can be used to generate navigation routes with a lower stress influence, in order to decrease stress-influenced driving and improve road safety.
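Conceptually, such a stress map is an aggregation of per-driver stress samples over road sections. A minimal sketch, where the section-based binning and the sample data are illustrative assumptions rather than the paper's clustering method:

```python
from collections import defaultdict

# Illustrative stress-map aggregation: average driver stress per road section.
# Samples: (road_section_id, stress_level in [0, 1]); values are made up.
samples = [("A81-km12", 0.8), ("A81-km12", 0.6), ("B27-km3", 0.2),
           ("A81-km13", 0.7), ("B27-km3", 0.3)]

sums, counts = defaultdict(float), defaultdict(int)
for section, stress in samples:
    sums[section] += stress
    counts[section] += 1

stress_map = {s: sums[s] / counts[s] for s in sums}
# A routing layer could then penalize high-stress sections when planning routes.
for section, level in sorted(stress_map.items(), key=lambda kv: -kv[1]):
    print(f"{section}: mean stress {level:.2f}")
```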
Stress is recognized as a predominant disease with rising costs for rehabilitation and treatment. Currently there are several different approaches that can be used for determining and calculating stress levels. Usually, the methods for determining stress are divided into two categories. The first category does not require any special equipment for measuring stress; it uses the variations in behaviour patterns that occur under stress. The core disadvantage of this category is its limitation to specific use cases. The second category uses laboratory instruments and biological sensors. This category allows stress to be measured precisely and proficiently, but at the same time the instruments are neither mobile nor transportable and do not support real-time feedback. This work presents a mobile system that provides the calculation of stress. To achieve this, the signal of a mobile ECG sensor is analysed, processed, and visualised on a mobile device such as a smartphone. This work also explains the stress measurement algorithm used. The result of this work is a portable system that can be used with a mobile device such as a smartphone as the visual interface for reporting the current stress level.
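Stress estimation from a mobile ECG sensor typically derives heart-rate-variability (HRV) features from the RR intervals. The paper's actual algorithm is not spelled out in the abstract, so the following sketch uses a standard HRV metric (RMSSD) and a hypothetical threshold purely as an illustration:

```python
import math

# Illustrative HRV-based stress indicator from ECG RR intervals (ms).
# RMSSD is a standard HRV metric; low RMSSD is commonly associated with
# higher sympathetic activation, i.e. stress. All values are made up.
rr_intervals_ms = [812, 798, 805, 790, 783, 795, 801, 788]

def rmssd(rr):
    """Root mean square of successive RR-interval differences."""
    diffs = [(b - a) ** 2 for a, b in zip(rr, rr[1:])]
    return math.sqrt(sum(diffs) / len(diffs))

value = rmssd(rr_intervals_ms)
level = "high stress" if value < 20 else "normal"  # hypothetical threshold
print(f"RMSSD = {value:.1f} ms -> {level}")
```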
Sport marketing is the specific application of marketing principles and processes to sports products and services. In 2014 the biggest sports event in the world, the FIFA World Cup, took place in Brazil. Billions of spectators around the world saw Germany win the trophy in Rio de Janeiro for the fourth time in history. Yet unlike at previous World Cups, conversation was not only taking place at the numerous public viewings held in open spaces, bars, and restaurants. Throughout the tournament, social media such as Facebook and Twitter played a dominant role in all aspects. With 672 million tweets on Twitter and three billion conversations on Facebook, this was the most social World Cup as well as the most social mega sports event so far. Whether users, athletes, or companies, everyone was trying to join the conversation to be informed or to inform others about their opinions or the latest news. This paper analyzes the implementation of social media marketing during mega sports events, focusing on Adidas' and Nike's social media campaigns in the frame of the FIFA World Cup 2014 in Brazil. The analysis shows that social media marketing in the frame of mega sports events is gaining importance. Companies that find topics which affect people personally and have a relationship to their products achieve success through social media marketing.
Information Systems in Distributed Environment (ISDE) is becoming a prominent standard in this globalization era due to advancements in information and communication technologies. The advent of the internet has supported Distributed Software Development (DSD) by introducing new concepts and opportunities, resulting in benefits such as scalability, flexibility, interdependence, reduced cost, resource pools, and usage tracking. The distributed development of information systems as well as their deployment and operation in distributed environments impose new challenges on software organizations and can lead to business advantages. In distributed environments, business units collaborate across time zones, organizational boundaries, work cultures, and geographical distances, which has ultimately led to an increasing diversification and growing complexity of cooperation among units. The real-world practice of developing, deploying, and operating information systems in globally distributed projects has been viewed from various perspectives, though technical and engineering viewpoints, in conjunction with managerial and organizational ones, have dominated researchers' attention so far. Successful participation in distributed environments, however, is ultimately a matter of the participants understanding and exploiting the particularities of their respective local contexts at specific points in time and exploring practical solutions through the local resources available.
This special issue of the Computer Standards & Interfaces journal therefore includes papers received from the public call for papers as well as extended and improved versions of papers selected from the best of the International Workshop on Information Systems in Distributed Environment (ISDE 2014). It aims to serve as a forum that brings together academics, researchers, practitioners, and students in the field of distributed information systems, by presenting novel developments and lessons learned from real-world cases, and to promote the exchange of ideas, discussion, and advancement in these areas.
Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out there? Are there new trends and emerging approaches? What are the open issues? Still, we struggle to answer these questions about the current state of SPI and related research. In this article, we present results from an updated systematic mapping study to shed light on the field of SPI, to develop a big picture of the state of the art, and to draw conclusions for future research directions. An analysis of 769 publications draws a big picture of SPI-related research of the past quarter-century. Our study shows a high number of solution proposals, experience reports, and secondary studies, but only few theories and models on SPI in general. In particular, standard SPI models like CMMI and ISO/IEC 15504 are analyzed, enhanced, and evaluated for applicability in practice, but these standards are also critically discussed, e.g., from the perspective of SPI in small- to medium-sized companies, which leads to new specialized frameworks. New and specialized frameworks account for the majority of the contributions found (approx. 38%). Furthermore, we find a growing interest in success factors (approx. 16%) to aid companies in conducting SPI and in adapting agile principles and practices for SPI (approx. 10%). Beyond these specific topics, the study results also show an increasing interest in secondary studies with the purpose of aggregating and structuring SPI-related knowledge. Finally, the present study helps to direct future research by identifying under-researched topics awaiting further investigation.
Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out there? Are there new emerging approaches? What are the open issues? Still, we struggle to answer the question of what the current state of SPI and related research is. We present initial results from a systematic mapping study to shed light on the field of SPI and to draw conclusions for future research directions. An analysis of 635 publications draws a big picture of SPI-related research of the past 25 years. Our study shows a high number of solution proposals, experience reports, and secondary studies, but only few theories. In particular, standard SPI models are analyzed and evaluated for applicability, especially from the perspective of SPI in small- to medium-sized companies, which leads to new specialized frameworks. Furthermore, we find a growing interest in success factors to aid companies in conducting SPI.
This summary refers to the paper Software process improvement : where is the evidence? [Ku15].
This paper was published as full research paper in the ICSSP’2015 proceedings.
Although still in the early stages of diffusion, smartwatches represent the most popular type of wearable device. Yet, little is known about why some people are more likely to adopt smartwatches than others. To deepen the understanding of the underlying factors prompting adoption behavior, the authors develop a theoretical model grounded in the technology acceptance and social psychology literature. Empirical results reveal perceived usefulness and visibility as important factors that drive intention. The magnitude of these antecedents is influenced by an individual's perception of smartwatches as a technology and/or as a fashion accessory. Theoretical and managerial implications are discussed.
Significant advances have been achieved in mobile robot localization and mapping in dynamic environments; however, these are mostly incapable of dealing with the physical properties of automotive radar sensors. In this paper we present an accurate and robust solution to this problem by introducing a memory-efficient cluster map representation. Our approach is validated by experiments that took place in a public parking space with pedestrians, moving cars, and different parking configurations, providing a challenging dynamic environment. The results prove its ability to reproducibly localize our vehicle within an error margin of below 1% with respect to ground truth using only point-based radar targets. A decay process enables our map representation to support local updates.
In this paper we present our work in progress on revisiting traditional DBMS mechanisms to manage space on native Flash and on how it is administered by the DBA. Our observations and initial results show that the standard logical database structures can be used for the physical organization of data on native Flash, while at the same time higher DBMS performance is achieved without incurring extra DBA overhead. An initial experimental evaluation indicates a 20% increase in transactional throughput under TPC-C, achieved by performing intelligent data placement on Flash, with fewer erase operations and thus better Flash longevity.
Using a Fabry-Pérot microresonator with controllable cavity lengths in the λ/2 regime, we show the controlled modification of the vibronic relaxation dynamics of a fluorescent dye molecule in the spectral and time domain. By altering the photonic mode density around the fluorophores, we are able to shape the fluorescence spectrum and specifically enhance the probability of the radiative transitions from the electronic excited state to distinct vibronic excited states of the electronic ground state. Analysis and correlation of the spectral and time-resolved measurements by a theoretical model and a global fitting procedure allows us to quantitatively reveal the spectrally distributed radiative and non-radiative relaxation dynamics of the respective dye molecule under ambient conditions at the ensemble level.
Here we report a simple way to enhance the resolution of a confocal scanning microscope under cryogenic conditions. Using a microscope objective (MO) with high numerical aperture (NA = 1.25) and 1-propanol as an immersion fluid with a low freezing temperature, we were able to reach an imaging resolution at 160 K comparable to ambient conditions. The MO and the sample were both placed inside the inner chamber of the cryostat to reduce distortions induced by temperature gradients. The image quality of our commercially available MO was further enhanced by scanning the sample (sample scanning) instead of scanning the beam. The ease of the whole procedure marks an essential step towards the development of cryo high-resolution microscopy and correlative light and electron cryo microscopy (cryoCLEM).
As long as there have been professional sports, there have been relationships on different levels. For example, sponsorship (or patronage, as it was called in the early days) was mostly based on personal relations between local benefactors and their favourite sports club. Regarding the media, clubs have always maintained special relationships with selected journalists. The bond between fans and their clubs was always a close and mutually beneficial one. All these relationships existed from the start of the sports business. Therefore, relationship marketing is nothing new in the context of sports. Many sporting organisations always knew to value a deep and good relationship with their stakeholders and practised relationship marketing without being aware of it. Successful sports managers, however, take the old wisdom and turn it into a modern relationship marketing approach by structuring the various relationships in order to make them more effective and profitable for their own sporting organisation and the various stakeholders. This chapter ... illustrates the many facets of relationship marketing and the possibilities it offers in the context of the sports business.
Herein, the optimization by Hofmeister anions of the physicochemical properties and surface biocompatibility of polyelectrolyte multilayers of the natural, biocompatible and biodegradable, linear polysaccharides hyaluronan and chitosan was systematically investigated. We demonstrated that there is an interconnection between the bulk and surface properties of HA/Chi multilayers, both varying in accordance with the arrangement of the anions in the Hofmeister series. Kosmotropic anions increased the hydration, thickness, micro- and macro-roughness, and hydrophilicity of the films and improved their biocompatibility by reducing the films' stiffness by two orders of magnitude and rendering them completely anti-thrombogenic.
Recycling of poly(ethylene terephthalate) (PET) is of crucial importance, since worldwide amounts of PET waste are increasing rapidly due to its widespread applications. Hence, several methods have been developed, such as energetic, material, thermo-mechanical, and chemical recycling of PET. Most frequently, PET waste is incinerated for energy recovery, used as an additive in concrete composites, or glycolysed to yield mixtures of monomers and undefined oligomers. While energetic and thermo-mechanical recycling entail downcycling of the material, chemical recycling requires considerable amounts of chemicals and demanding processing steps, entailing toxicity and ecological issues. This review provides a thorough survey of PET recycling, including energetic, material, thermo-mechanical, and chemical methods. It focuses on chemical methods, describing important reaction parameters and the yields of the obtained reaction products. While most methods yield monomers, only a few yield undefined low molecular weight oligomers for impaired applications (dispersants or plasticizers). Furthermore, the present work presents an alternative chemical recycling method for PET in comparison to existing chemical methods.
Reality mining refers to an application of data mining that uses sensor data to derive behavioral patterns in the real world. However, research in this field started a decade ago, when technology was far behind today's state of the art. This paper discusses which requirements are now posed to applications in the context of reality mining. A survey has shown which sensors are available in state-of-the-art smartphones and usable for gathering data for reality mining. As another contribution of this paper, a reality mining application architecture is proposed to facilitate the implementation of such applications. A proof of concept verifies the assumptions made on reality mining and the presented architecture.
We present a fully automatic approach to real-time 3D face reconstruction from monocular in-the-wild videos. We use a 3D morphable face model to obtain a semi-dense shape and combine it with a fast median-based super-resolution technique to obtain a high-fidelity textured 3D face model. Our system does not need prior training and is designed to work in uncontrolled scenarios.
Context: An experiment-driven approach to software product and service development is gaining increasing attention as a way to channel limited resources to the efficient creation of customer value. In this approach, software capabilities are developed incrementally and validated in continuous experiments with stakeholders such as customers and users. The experiments provide factual feedback for guiding subsequent development.
Objective: This paper explores the state of the practice of experimentation in the software industry. It also identifies the key challenges and success factors that practitioners associate with the approach.
Method: A qualitative survey based on semi-structured interviews and thematic coding analysis was conducted. Ten Finnish software development companies, represented by thirteen interviewees, participated in the study.
Results: The study found that although the principles of continuous experimentation resonated with industry practitioners, the state of the practice is not yet mature. In particular, experimentation is rarely systematic and continuous. Key challenges relate to changing the organizational culture, accelerating the development cycle speed, and finding the right measures for customer value and product success. Success factors include a supportive organizational culture, deep customer and domain knowledge, and the availability of the relevant skills and tools to conduct experiments.
Conclusions: It is concluded that the major issues in moving towards continuous experimentation are on an organizational level; the most significant technical challenges have been solved. An evolutionary approach is proposed as a way to transition towards experiment-driven development.
Pultrusion of braids
(2016)
Customer needs and requirements are becoming increasingly diverse, and consumers more and more want to express their individuality with the products they buy. Due to the emergence of the internet and the possibilities it offers, customers no longer play only a passive role but are actually enabled to determine what they are purchasing. Therefore, customisation and personalisation approaches such as the miadidas concept from adidas, which provides customised performance shoes and sneakers, are more popular than ever. The prosumer concept already plays an important role in trying to satisfy the demands of customers in the future. As apparel for outdoor activities represents the largest and most important part of the sports goods market in Germany and is still expected to grow, the purpose of this study is, on the one hand, to identify the diverse prosumer concepts in existence and, on the other hand, to examine to what extent companies in the outdoor industry have already implemented prosumer concepts. A content analysis of the homepages and online shops of 30 different European and North American outdoor brands was conducted. The results show that companies in the outdoor industry have already implemented several prosumer concepts, but most of them concentrate mainly on one prosumer approach and on the involvement of professional users of their products.
We have seen that bionic optimization can be a powerful tool when applied to problems with non-trivial landscapes of goals and restrictions. This, in turn, led us to a discussion of useful methodologies for applying this optimization to real problems. On the other hand, it must be stated that each optimization is a time-consuming process as soon as the problem expands beyond a small number of free parameters related to simple parabolic responses. Bionic optimization is not a quick approach to solving complex questions within short times. In some cases it has the potential to fail entirely, either by sticking to local maxima or by randomly exploring the parameter space without finding any promising solutions. The following sections present some remarks on the efficiency and the limitations users must be aware of. They aim to increase the knowledge base for using bionic optimization, but they should not discourage potential users from this promising field of powerful strategies for finding good or even the best possible designs.
Preface of IDEA 2015
(2016)
In times of e-commerce and digitalization, new markets are opening, young companies have the possibility to grow, and new perspectives arise in terms of customer relationships. Customers demand more possibilities for personalization. At the same time, companies have access to new and, above all, more information about the customer. This combination could evolve greatly if there were no privacy issues. Vast amounts of data about consumers are collected in Big Data warehouses. These are analyzed via predictive analytics, and customers are classified by algorithms such as clustering models, propensity models, or collaborative filtering. All these subjects are growing in importance, as they are shaping the global marketing landscape. Marketers, together with IT scientists, develop new ways of analyzing customer databases and benefit from more accurate segmentation methods than those used until now. The following paper provides a literature review on new methods of consumer segmentation in view of the high inflow of new information via e-commerce. It introduces readers to the subject of predictive analytics and discusses several predictive models. The paper is not based on the authors' own empirical research but is intended to serve as a reference text for further research. A conclusion completes the paper.
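Clustering-based segmentation of the kind surveyed here can be illustrated with a small k-means example. The features and data are invented; real pipelines operate on large e-commerce customer databases:

```python
import random

# Tiny k-means sketch for customer segmentation (illustrative data).
# Features per customer: (orders per year, average basket value in EUR).
customers = [(2, 25), (3, 30), (24, 80), (20, 95), (1, 15), (22, 70)]

def kmeans(points, k=2, iters=20, seed=1):
    random.seed(seed)
    centers = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:  # assign each point to its nearest center
            i = min(range(k), key=lambda c: sum((a - b) ** 2
                                                for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        for i, cl in enumerate(clusters):  # recompute centroids
            if cl:
                centers[i] = tuple(sum(v) / len(cl) for v in zip(*cl))
    return centers, clusters

centers, clusters = kmeans(customers)
for c, cl in zip(centers, clusters):
    print(f"segment center {tuple(round(v, 1) for v in c)}: {cl}")
```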
An enormous amount of data in the context of business processes is stored as images. They contain valuable information for business process management. Up to now, this data had to be integrated manually into the business process. Thanks to advances in capturing technology, it is possible to extract information from an increasing number of images. Therefore, we systematically investigate the potentials of Image Mining for business process management by means of a literature review and an in-depth analysis of the business process lifecycle. As a first step to evaluate our research, we developed a prototype for recovering process model information from drawings using Rapidminer.
Due to the complexity of assembly processes, a high proportion of tasks is still performed by human workers. Short-cyclically changing work contents due to smaller lot sizes, especially in varied series assembly, increase both the need for information support and the risk of rising physical and psychological stress. The use of technical and digital assistance systems can counter these challenges. Through the integration of information and communication technology as well as collaborative assembly technologies, hybrid cyber-physical assembly systems will emerge. Widely established assembly planning approaches for digital and technical support systems in cyber-physical assembly systems will be outlined and discussed with regard to synergies and the delimitation of planning perspectives.
Sleep is an important aspect in the life of every human being. The average sleep duration for an adult is approximately 7 hours per day. Sleep is necessary to regenerate a person's physical and psychological state. Poor sleep quality has a major impact on health status and can lead to various diseases. In this paper an approach is presented which uses long-term monitoring of vital data, gathered by a body sensor during the day and the night and supported by a mobile application connected to an analysis system, to estimate the sleep quality of its user and to give recommendations for improving it in real time. Actimetry and historical data are used to improve the individual recommendations, based on common techniques from the areas of machine learning and big data analysis.
The loss contributions of a 2.3 kW synchronous GaN-HEMT boost converter for an input voltage of 250 V and an output voltage of 500 V were analyzed. A simulation model consisting of two parts is introduced. First, a physics-based model is used to determine the switching losses. Then, a system simulation is applied to calculate the losses of the specific elements. This approach allows a fast and accurate system evaluation as required for further system optimization.
In this work, a hard-switching and a zero-voltage turn-on switching converter are compared. Measurements were performed to verify the simulation model, showing good agreement. A peak efficiency of 99% was achieved for an output power of 1.4 kW. Even with an output power above 400 W, it was possible to obtain a system efficiency exceeding 98%.
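The relation between the simulated loss contributions and the reported system efficiency is straightforward. A small sketch, with loss values invented for illustration rather than taken from the paper:

```python
# Illustrative efficiency calculation from a loss breakdown.
# Loss values below are made up; the paper derives them from simulation.
p_out = 1400.0  # output power in W
losses_w = {"switching": 6.0, "conduction": 4.5, "inductor": 2.5, "other": 1.0}

p_loss = sum(losses_w.values())
efficiency = p_out / (p_out + p_loss)
print(f"total losses: {p_loss:.1f} W -> efficiency: {efficiency:.3%}")
```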
This paper presents a concurrency control mechanism that does not follow a "one concurrency control mechanism fits all needs" strategy. With the presented mechanism, a transaction runs under several concurrency control mechanisms, and the appropriate one is chosen based on the accessed data. For this purpose, the data is divided into four classes based on its access type and usage (semantics). Class O (the optimistic class) implements a first-committer-wins strategy, class R (the reconciliation class) implements a first-n-committers-win strategy, class P (the pessimistic class) implements a first-reader-wins strategy, and class E (the escrow class) implements a first-n-readers-win strategy. Accordingly, the model is called O|R|P|E. The selected concurrency control mechanism may be automatically adapted at run-time according to the current load or a known usage profile. This run-time adaptation allows O|R|P|E to balance the commit rate and the response time even under changing conditions. O|R|P|E outperforms Snapshot Isolation concurrency control in terms of response time by a factor of approximately 4.5 under heavy transactional load (4000 concurrent transactions). As a consequence, the degree of concurrency is 3.2 times higher.
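The class-based selection can be pictured as a dispatcher that maps each data item to one of the four concurrency control strategies. A schematic Python sketch, where the class assignments and the handler behavior are simplified illustrations of the O|R|P|E idea, not the paper's implementation:

```python
from enum import Enum

# Schematic O|R|P|E dispatch: each data item is assigned to a class,
# and accesses are handled by that class's concurrency control strategy.
class CCClass(Enum):
    O = "optimistic"      # first committer wins
    R = "reconciliation"  # first n committers win
    P = "pessimistic"     # first reader wins (locking)
    E = "escrow"          # first n readers win (quantity reservation)

# Hypothetical assignment of data items to classes based on usage semantics.
data_classes = {
    "customer_profile": CCClass.O,  # rarely conflicting updates
    "product_rating":   CCClass.R,  # concurrent updates reconcilable
    "order_state":      CCClass.P,  # conflicts must be prevented up front
    "stock_quantity":   CCClass.E,  # decrements bounded by available stock
}

def access(item, operation):
    """Route an access to the concurrency control mechanism of its class."""
    cc = data_classes[item]
    print(f"{operation} on {item!r} -> handled by {cc.value} control")

access("stock_quantity", "decrement")
access("order_state", "update")
```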
One-pot synthesis of micron partly hollow anisotropic dumbbell shaped silica core-shell particles
(2016)
A facile method is described to prepare micron-sized, partly hollow dumbbell silica particles in a single step. The obtained particles consist of a large dense part and a small hollow lobe. The spherical dense core as well as the hollow lobe are covered by mesoporous channels. In the case of a smaller lobe, these channels are responsible for the permeability of the shell, which was demonstrated by confocal imaging and spectroscopy.
In many automotive applications, repetitive self-heating is the most critical operating condition for LDMOS transistors in smart power ICs. This is attributed to thermo-mechanical stress in the on-chip metallization, which results from the different thermal expansion coefficients of the metal and the intermetal dielectric. After many cycles, the accumulated strain in the metallization can lead to short circuits, thus limiting the lifetime. Increasing the LDMOS size can help to lower peak temperatures and therefore to reduce the stress. The downside of this is higher cost. Hence, it has been suggested to use resilient systems that monitor the LDMOS metallization and lower the stress once a certain level of degradation is reached. Then, lifetime requirements can be fulfilled without oversizing LDMOS transistors, even though a certain performance loss has to be accepted. For such systems, suitable sensors for metal degradation are required. This work proposes a floating metal line embedded in the LDMOS metallization. The suitability of this approach has been investigated experimentally using test structures and shown to be promising. The obtained results are explained by means of numerical thermo-mechanical simulations.
A software process is the game plan to organize project teams and run projects. Yet, it still is a challenge to select the appropriate development approach for the respective context. A multitude of development approaches compete for the users’ favor, but there is no silver bullet serving all possible setups. Moreover, recent research as well as experience from practice shows companies utilizing different development approaches to assemble the best-fitting approach for the respective company: a more traditional process provides the basic framework to serve the organization, while project teams embody this framework with more agile (and/or lean) practices to keep their flexibility. The paper at hand provides insights into the HELENA study with which we aim to investigate the use of “Hybrid dEveLopmENt Approaches in software systems development”. We present the survey design and initial findings from the survey’s test runs. Furthermore, we outline the next steps towards the full survey.
Software Process Improvement (SPI) programs have been implemented, inter alia, to improve quality and speed of software development. SPI addresses many aspects ranging from individual developer skills to entire organizations. It comprises, for instance, the optimization of specific activities in the software lifecycle as well as the creation of organizational awareness and project culture. In the course of conducting a systematic mapping study on the state-of-the-art in SPI from a general perspective, we observed Software Quality Management (SQM) being of certain relevance in SPI programs. In this paper, we provide a detailed investigation of those papers from the overall systematic mapping study that were classified as addressing SPI in the context of SQM (including testing). From the main study’s result set, 92 papers were selected for an in-depth systematic review to study the contributions and to develop an initial picture of how these topics are addressed in SPI. Our findings show a fairly pragmatic contribution set in which different solutions are proposed, discussed, and evaluated. Among others, our findings indicate a certain reluctance towards standard quality or (test) maturity models and a strong focus on custom review, testing, and documentation techniques, whereas a set of five selected improvement measures is almost equally addressed.
This article discusses the scientifically and industrially important problem of automating the process of unloading goods from standard shipping containers. We outline some of the challenges barring further adoption of robotic solutions to this problem, ranging from handling a vast variety of shapes, sizes, weights, appearances, and packing arrangements of the goods, through hard demands on unloading speed and reliability, to ensuring that fragile goods are not damaged. We propose a modular and reconfigurable software framework in an attempt to efficiently address some of these challenges. We also outline the general framework design and the basic functionality of the core modules developed. We present two instantiations of the software system on two different fully integrated demonstrators: 1) coping with an industrial scenario, i.e., the automated unloading of coffee sacks with an already economically interesting performance; and 2) a scenario used to demonstrate the capabilities of our scientific and technological developments in the context of medium- to long-term prospects of automation in logistics. We performed evaluations that allowed us to summarize several important lessons learned and to identify future directions of research on autonomous robots for the handling of goods in logistics applications.
The efficiency of pharmaceutical research and development (R&D), as reflected by increasing costs of R&D, long timelines, and low probabilities of technical and regulatory success, has decreased continuously in the past years. Today, the costs for discovering and developing a new drug are enormously high, at more than USD 2 billion per new molecular entity (NME), while the average overall success rate of a research project to provide an NME is in the single-digit percentage range, and the total timelines of R&D easily exceed 10 years, calling into question the return on investment (ROI) of pharmaceutical R&D. As a consequence, and also driven by numerous patent expirations of blockbuster drugs that increased the pressure to return to an acceptable ROI, the pharmaceutical industry addressed this challenge and the related causes and identified several actions that need to be taken to increase the output/input ratio of R&D. This book chapter will review the pipeline sizes and the R&D investments of multinational pharmaceutical companies, describe new processes that have been implemented to increase the reach and to reduce the costs of pharmaceutical R&D, and illustrate new innovation models that were developed to increase R&D efficiency.
Rapidly growing data volumes push today's analytical systems close to the feasible processing limit. Massive parallelism is one possible solution to reduce the computational time of analytical algorithms. However, data transfer becomes a significant bottleneck, since moving data to code ties up system resources. Technological advances make it economical to place compute units close to storage and perform data processing operations close to the data, minimizing data transfers and increasing scalability. Hence the principle of Near Data Processing (NDP) and the shift towards code-to-data. In the present paper, we claim that the development of NDP system architectures will become an inevitable task in the future. Analytical DBMS like HPE Vertica offer multiple points of impact with major advantages, which are presented within this paper.
This paper is concerned with the study, optimization and control of the moisture sorption kinetics of agricultural products at temperatures typically found in processing and storage. A nonlinear autoregressive with exogenous inputs (NARX) neural network was developed to predict the moisture sorption kinetics, and consequently the equilibrium moisture contents, of shiitake mushrooms (Lentinula edodes (Berk.) Pegler) over a wide range of relative humidity and different temperatures. Sorption kinetic data of mushroom caps was separately generated using a continuous, gravimetric dynamic vapour sorption analyser at temperatures of 25-40 °C over a stepwise variation of relative humidity ranging from 0 to 85%. The predictive power of the neural network was based on physical data, namely relative humidity and temperature. The model was fed with a total of 4500 data points divided into three subsets: 70% of the data was used for training, 15% for testing and 15% for validation, randomly selected from the whole dataset. The NARX neural network was capable of precisely simulating the equilibrium moisture contents of mushrooms derived from the dynamic vapour sorption kinetic data throughout the entire range of relative humidity.
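The following sketch mirrors the described data handling: exogenous inputs (relative humidity, temperature) plus a lagged output form NARX-style features, and the 4500 points are randomly split 70/15/15 into training, testing, and validation sets. The synthetic data and scikit-learn's MLPRegressor are stand-ins; the paper's actual NARX network and measurements are not reproduced here.

```python
# NARX-style feature construction and a 70/15/15 random split, as a toy
# stand-in for the procedure described above.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 4500
rh = rng.uniform(0, 85, n)       # relative humidity (%), exogenous input
temp = rng.uniform(25, 40, n)    # temperature (degC), exogenous input
emc = 0.1 * rh / (1 + 0.01 * rh) + 0.002 * temp + rng.normal(0, 0.01, n)

# NARX idea: predict the next output from exogenous inputs plus lagged output.
X = np.column_stack([rh[1:], temp[1:], emc[:-1]])
y = emc[1:]

idx = rng.permutation(len(X))
n_tr, n_te = int(0.70 * len(X)), int(0.15 * len(X))
tr, te, va = idx[:n_tr], idx[n_tr:n_tr + n_te], idx[n_tr + n_te:]

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
model.fit(X[tr], y[tr])
print("test R^2:", model.score(X[te], y[te]),
      "validation R^2:", model.score(X[va], y[va]))
```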
The extracellular environment of vascular cells in vivo is complex in its chemical composition, physical properties, and architecture. Consequently, it has been a great challenge to study vascular cell responses in vitro, either to understand their interaction with their native environment or to investigate their interaction with artificial structures such as implant surfaces. New procedures and techniques from materials science to fabricate bio-scaffolds and surfaces have enabled novel studies of vascular cell responses under well-defined, controllable culture conditions. These advancements are paving the way for a deeper understanding of vascular cell biology and materials–cell interaction. Here, we review previous work focusing on the interaction of vascular smooth muscle cells (SMCs) and endothelial cells (ECs) with materials having micro- and nanostructured surfaces. We summarize fabrication techniques for surface topographies, materials, geometries, biochemical functionalization, and mechanical properties of such materials. Furthermore, various studies on vascular cell behavior and their biological responses to micro- and nanostructured surfaces are reviewed. Emphasis is given to studies of cell morphology and motility, cell proliferation, the cytoskeleton and cell-matrix adhesions, and signal transduction pathways of vascular cells. We conclude with a short outlook on potentially interesting future studies.
The internet of things, enterprise social networks, adaptive case management, mobility systems, analytics for big data, and cloud environments are emerging to support smart connected, i.e. digital, products and services and the digital transformation. Biological metaphors of living and adaptable ecosystems currently provide the logical foundation for resilient run-time environments with service-oriented digitization architectures and for self-optimizing intelligent business services and related distributed information systems. We investigate mechanisms for the flexible adaptation and evolution of information systems with digital architectures in the context of the ongoing digital transformation. The goal is to support flexible and agile transformations of both business and related information systems through the adaptation and dynamic evolution of their digital architectures. The present research paper investigates mechanisms of decision analytics for digitization architectures, putting a spotlight on internet of things micro-granular architectures, by extending original enterprise architecture reference models with digitization architectures and their multi-perspective architectural decision management.
Motivation
(2016)
Since human beings started to work consciously with their environment, they have tried to improve the world they were living in. Early use of tools, increasing quality of these tools, use of new materials, fabrication of clay pots, and heat treatment of metals: all these were early steps of optimization. But even on lower levels of life than human beings or human society, we find optimization processes. The organization of a herd of buffalos to face their enemies, the coordinated strategies of these enemies to isolate some of the herd’s members, and the organization of bird swarms on their long flights to their winter quarters: all these social interactions are optimized strategies of long learning processes, most of them the result of a kind of collective intelligence acquired during long selection periods.
Adapting the characteristics of biomaterials specifically for in vitro and in vivo applications is becoming increasingly important in order to control interactions between material and biological systems. These complex interactions are influenced by surface properties like chemical composition, charge, and mechanical and topographic attributes. In many cases it is not useful or even not possible to alter the base material, but changing the surface, to improve biocompatibility or to make surfaces bioactive, may be achieved by thin coatings. An already established method is coating with polyelectrolyte multilayers (PEM). To adjust adhesion and proliferation and to improve the vitality of certain cell types, we modified the roughness of PEM coatings. We included different types of nanoparticles (NPs) in different concentrations in PEM coatings to control surface roughness. Surface properties were characterized, and the response of three different cell types to these coatings was tested.
One of the challenges in condition monitoring systems is residual lifetime prediction. This prediction is based on statistical methods, on physical knowledge about the considered process, or on a combination of these approaches. Physical knowledge of the system is a result of long-term experience of process operators. However, it can also be gained by analyzing appropriately designed process models. The additional benefit of such models is that particular effects and their impact on the process behavior can be analyzed in detail, without plant operation and in a shorter time. The current contribution, developed in the framework of the research project Model Based Hierarchic Condition Monitoring, presents such models for the condition monitoring of roller chains. First, already existing high-order dynamic models of such chains, given by nonlinear differential equations, are extended to incorporate effects that occur due to a deterioration of the chain condition. Then, a simple model is developed and compared to the high-order model. Based on the two models, the change in the process behavior due to a deterioration of the roller chain condition is analyzed, illustrating that these models can be used in future research in the above-mentioned project to better predict the residual lifetime of the considered roller chains.
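As a purely illustrative counterpart to the simple model mentioned above, the following toy simulation shows how a wear parameter can be injected into a low-order dynamic model so that its effect on the process behavior becomes visible. The equations, parameter values, and wear coupling are invented for illustration and do not reproduce the paper's chain models.

```python
# Toy low-order model of one chain segment: a spring-mass-damper whose
# damping grows with an assumed wear level. All values are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

def chain_segment(t, x, wear):
    pos, vel = x
    k, m = 50.0, 1.0                # assumed stiffness and mass
    c = 0.5 + 2.0 * wear            # damping increases with deterioration
    return [vel, (-k * pos - c * vel) / m]

for wear in (0.0, 0.5, 1.0):        # new, used, worn chain condition
    sol = solve_ivp(chain_segment, (0.0, 5.0), [0.01, 0.0], args=(wear,))
    print(f"wear={wear}: final amplitude {abs(sol.y[0, -1]):.2e}")
```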
This paper analyzes different government debt relief programs in the European Monetary Union. I build a model and study different options, ranging from debt relief to the European Stability Mechanism (ESM). The analysis reveals the following: First, patient countries repay debt, while impatient countries are more likely to consume and default. Second, without ESM loans, indebted countries default anyway. Third, if the probability of being an impatient government is high, then the supply of loans is constrained. In general, sustainable and unsustainable governments should be incentivized differently, especially in a supranational monetary union. Finally, I develop policy recommendations for the ongoing debate in the Eurozone.
This book showcases new and innovative approaches to biometric data capture and analysis, focusing especially on those that are characterized by non-intrusiveness, reliable prediction algorithms, and high user acceptance. It comprises the peer-reviewed papers from the international workshop on the subject that was held in Ancona, Italy, in October 2014 and featured sessions on ICT for health care, biometric data in automotive and home applications, embedded systems for biometric data analysis, biometric data analysis: EMG and ECG, and ICT for gait analysis. The background to the book is the challenge posed by the prevention and treatment of common, widespread chronic diseases in modern, aging societies. Capture of biometric data is a cornerstone for any analysis and treatment strategy. The latest advances in sensor technology allow accurate data measurement in a non-intrusive way, and in many cases it is necessary to provide online monitoring and real-time data capturing to support a patient’s prevention plans or to allow medical professionals to access the patient’s current status. This book will be of value to all with an interest in this expanding field.
Methacrylated gelatin and mature adipocytes are promising components for adipose tissue engineering
(2016)
In vitro engineering of autologous fatty tissue constructs is still a major challenge for the treatment of congenital deformities, tumor resections or high-grade burns. In this study, we evaluated the suitability of photo-crosslinkable methacrylated gelatin (GM) and mature adipocytes as components for the composition of three-dimensional fatty tissue constructs. Cytocompatibility evaluations of the GM and the photoinitiator lithium phenyl-2,4,6-trimethylbenzoylphosphinate (LAP) showed no cytotoxicity in the relevant range of concentrations. The matrix stiffness of cell-laden hydrogels was adjusted by tuning the degree of crosslinking and shown to be comparable to that of native fatty tissue. Mature adipocytes were then cultured for 14 days within the GM, resulting in a fatty tissue construct loaded with viable cells expressing the cell markers perilipin A and laminin. This work demonstrates that mature adipocytes are a highly valuable cell source for the composition of fatty tissue equivalents in vitro. Photo-crosslinkable methacrylated gelatin is an excellent tissue scaffold and a promising bioink for new printing techniques due to its biocompatibility and tunable properties.
The MIT Center for Information Systems Research surveyed 255 executives in 2015 to investigate how companies are managing business complexity. This report details the findings from our analysis of the survey data:
1. Some product complexity adds value, some does not. Specifically, companies with more links (aka integration) in their product and service portfolio are higher performing.
2. Product variety makes it more difficult for customers and employees to get things done. These customer and employee difficulties impair a company's performance.
3. Companies that excel at making it easy for employees and customers to get things done differentiate themselves by applying a set of complexity management practices around enterprise architecture, role reconfiguration, and the use of metrics and incentive systems.
Based on these findings, we recommend that companies make product complexity a strategic choice, invest in the above-mentioned complexity management practices, and use customer and employee difficulties as key metrics for product innovation.
This case study describes the emerging customized omnichannel loyalty solution of Marc O’Polo from a customer’s perspective. After an introduction to Marc O’Polo and its general omnichannel strategy, the loyalty program is described in detail, covering Marc O’Polo for members as well as the mobile app, social media, direct mail, and in-store capabilities. A discussion chapter closes the case study with research implications and open questions for Marc O’Polo.
Managing software process evolution : traditional, agile and beyond - how to handle process change
(2016)
This book focuses on the design, development, management, governance and application of evolving software processes that are aligned with changing business objectives, such as expansion to new domains or shifting to global production. In the context of an evolving business world, it examines the complete software process lifecycle, from the initial definition of a product to its systematic improvement. In doing so, it addresses difficult problems, such as how to implement processes in highly regulated domains or where to find a suitable notation system for documenting processes, and provides essential insights and tips to help readers manage process evolutions. And last but not least, it provides a wealth of examples and cases on how to deal with software evolution in practice.
Reflecting these topics, the book is divided into three parts. Part 1 focuses on software business transformation and addresses the questions of which process(es) to use and adapt, and how to organize process improvement programs. Subsequently, Part 2 mainly addresses process modeling. Lastly, Part 3 collects concrete approaches, experiences, and recommendations that can help to improve software processes, with a particular focus on specific lifecycle phases.
This book is aimed at anyone interested in understanding and optimizing software development tasks at their organization. While the experiences and ideas presented will be useful for both those readers who are unfamiliar with software process improvement and want to get an overview of the different aspects of the topic, and for those who are experts with many years of experience, it particularly targets the needs of researchers and Ph.D. students in the area of software and systems engineering or information systems who study advanced topics concerning the organization and management of (software development) projects and process improvements projects.
The digital economy has created intense demands for innovations. Companies are responding in part by creating new digital products and services to meet increasing customer expectations.
MIT CISR findings indicate that product variety is NOT directly related to firm performance, and IS related to increased difficulties for customers and employees.
Management and cost accounting has been the basic toolbox in business administration for decades. Today it is an integral part of all curricula in business education and no student can afford not to be familiar with its basic concepts and instruments. At the same time, business in general, and management accounting in particular, is becoming more and more international. English clearly has evolved as the "lingua franca" of international business. Academics, students as well as practitioners exchange their views and ideas, discuss concepts and communicate with each other in English. This is certainly also true for cost accounting and management accounting.
Nowadays almost every major company operates a monitoring system and produces log data to analyse its systems. To perform analyses on the log data and to extract experience for future decisions, it is important to transform and synchronize different time series. Several methods for synchronizing multiple time series are presented, leading to a synchronized, uniform time series. This is achieved by using discretization and approximation methods. Furthermore, discretization through ticks is demonstrated, together with the respective illustrated results.
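A minimal sketch of that idea: two irregularly sampled series are mapped onto a common tick grid by linear interpolation (approximation) and then binned into discrete levels (discretization). The function names, tick width, and data below are illustrative assumptions, not the paper's method.

```python
# Synchronize two irregular time series onto one uniform tick grid,
# then discretize the values into levels (e.g., for log comparison).
import numpy as np

def discretize(values, n_bins=8):
    """Map continuous values to discrete levels (discretization step)."""
    edges = np.linspace(values.min(), values.max(), n_bins + 1)
    return np.digitize(values, edges[1:-1])

# Two irregularly sampled series, e.g., from different log sources.
t1, v1 = np.array([0.0, 1.3, 2.1, 4.0]), np.array([10.0, 12.0, 11.0, 15.0])
t2, v2 = np.array([0.5, 2.0, 3.5, 4.2]), np.array([1.0, 0.8, 1.1, 0.9])

tick = 0.5
grid = np.arange(0.5, 4.0 + tick, tick)   # shared uniform tick grid
u1 = np.interp(grid, t1, v1)              # approximation onto the grid
u2 = np.interp(grid, t2, v2)

print(discretize(u1), discretize(u2))     # synchronized, discretized series
```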
The paper studies liquidity management in the banking sector at the zero lower bound on interest rates implemented by central banks. The new era of monetary policy, with interest rates at zero and quantitative easing programs, raises questions about the effectiveness of central bank policy and its impact on the banking sector. I find that the zero lower bound reduces the liquidity reserves of banks and thus creates less credit supply. The T-LTRO program, developed by the European Central Bank, has helped to tackle this problem. However, the recently expanded asset purchase program reveals the opposite effect. Hence, the recent liquidity provisions by central banks have put incentives on de-leveraging rather than bank lending.
Digital companies need information systems to implement their business processes end-to-end. BPM systems are promising candidates for that, because they are highly adaptable due to their business process model-driven operation mode. End-to-end processes contain different types of sub-processes that are either procedural, data-driven or business rule-based. Modern BPM systems support modeling notations for all these types of sub-processes. Moreover, end-to-end processes contain shadow-processing parts, which consequently must be supported in a performant way, too. BPMN seems to be the adequate notation for modeling these parts due to its procedural nature. Further, BPMN provides several elements that enable the modeling of parallel executions, which is very interesting for accelerating the shadow-processing parts of a process. The present paper examines the limitations and potentials of BPM systems for a high-performance execution of BPMN models representing shadow-processing parts of a business process.
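As a rough analogy for what BPMN parallel gateways express, the sketch below fans out independent shadow-processing steps and joins on their completion. The step names are hypothetical, and a real BPM system would derive this behavior from the BPMN model rather than from hand-written code.

```python
# Stand-in for the BPMN parallel-gateway idea: a parallel split fans out
# independent shadow-processing steps; the join waits for all branches.
from concurrent.futures import ThreadPoolExecutor

def task(name):
    # Stand-in for one BPMN task in a shadow-processing branch.
    return f"{name}: done"

steps = ("validate", "enrich", "archive")                 # hypothetical steps
with ThreadPoolExecutor() as pool:                        # parallel split
    futures = [pool.submit(task, s) for s in steps]
    results = [f.result() for f in futures]               # synchronizing join
print(results)
```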
The digital transformation of our society changes the way we live, work, learn, communicate, and collaborate. The digitization of software-intensive products and services is enabled basically by four megatrends: cloud computing, big data, mobile systems, and social technologies. This disruptive change interacts with all information processes and systems that are important business enablers for the current digital transformation. The internet of things, social collaboration systems for adaptive case management, and mobility systems and services for big data in cloud service environments are emerging to support intelligent user-centered and social community systems. Modern enterprises see themselves confronted with an ever-growing design space for engineering the business models of the future as well as their IT support. The decision analytics in this field becomes increasingly complex, and decision support, particularly for the development and evolution of sustainable enterprise architectures (EA), is duly needed. With the advent of intelligent user-centered and social community systems, the challenging decision processes can be supported in more flexible and intuitive ways. Tapping into these systems and techniques, the engineers and managers of the enterprise architecture become part of a viable enterprise, i.e. a resilient and continuously evolving system that develops innovative business models.
On the way to achieving higher degrees of autonomy for vehicles in complicated, ever-changing scenarios, the localization problem plays a very important role. The Simultaneous Localization and Mapping (SLAM) problem in particular has been studied extensively in the past. For an autonomous system in the real world, we present a very cost-efficient, robust and very precise localization approach based on GraphSLAM and graph optimization using radar sensors. We show on a dynamically changing parking lot layout that both mapping and localization accuracy are very high. To evaluate the performance of the mapping algorithm, a highly accurate ground truth map generated with a total station was used. Localization results are compared to a high-precision DGPS/INS system. Utilizing these methods, we can show the strong performance of our algorithm.
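To make the graph-based formulation concrete, here is a deliberately tiny 1D pose-graph example: poses are nodes, relative measurements (including a loop-closure-like constraint) are edges, and the trajectory is the least-squares solution of all constraints. A real radar GraphSLAM system works with 2D/3D poses, landmark observations, and robust optimization; all values here are toy data.

```python
# Tiny 1D pose-graph least squares: solve for poses x that best satisfy
# relative constraints x_j - x_i = z, with the first pose fixed as gauge.
import numpy as np

# Edges: (from_pose, to_pose, measured_displacement)
edges = [(0, 1, 1.05), (1, 2, 0.98), (2, 3, 1.02),
         (0, 3, 3.00)]                       # loop-closure-style constraint

n = 4
A = np.zeros((len(edges) + 1, n))
b = np.zeros(len(edges) + 1)
for row, (i, j, z) in enumerate(edges):
    A[row, i], A[row, j], b[row] = -1.0, 1.0, z
A[-1, 0], b[-1] = 1.0, 0.0                   # gauge: fix first pose at origin

poses, *_ = np.linalg.lstsq(A, b, rcond=None)
print("optimized poses:", np.round(poses, 3))
```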
ITMA 2015, which took place from November 12-19, 2015, in Milan, Italy, showed new record results. The number of exhibitors rose in comparison to ITMA 2011 by 25% to nearly 1,700 exhibitors, and visitor numbers and exhibition space rose by 20%. A survey by the German Textile Machinery Association (VDMA) and interviews at the fair conducted by the author showed that the exhibiting companies were highly satisfied with the quality and the number of discussions and that many new customers could be acquired. Furthermore, the VDMA survey showed that 74% of the German companies already concluded negotiations and contracts at the show. Of particular note were the transactions of the Belgian weaving machinery manufacturer Picanol, which sold hundreds of machines directly at the fair. The expectations regarding after-sales business were also good to very good. This is a very welcome development in comparison to both preceding ITMA shows.
The use of digital, IT-based components in physical products is becoming increasingly relevant in practice. Surprisingly, the strategic impact of these "digitized products" has not received a lot of attention in IS research so far. Extant papers on the topic rely on ambiguous terminology (e.g., "smart products", "cyber-physical systems", "digital product-service systems") and underlying concepts differ widely. Based on an extensive literature review, this article provides an overview of the different terms and identifies five conceptual elements that form the building blocks of digitized products in research: "hybridity" (i.e., the combination of digital and physical components), connectivity, smartness, digitized product-service bundles (servitization of digitized products), and digitized product ecosystems. The implication for practitioners is that each element comes with different managerial challenges that companies need to address when incorporating the respective element in their products. The research implication is that each conceptual element is supported by different theoretical streams.
The focus of this work lies on teaching methods for product design to stimulate novelty within a multiple disciplinary educational context. To address this issue, the different types of multiple disciplinary approach are presented by reviewing existing literature. As the initial study involved looking at the correlation between disciplines and product features, the definition of product design and its relationship with industrial design and other adjacent domains are introduced. The structure of a newly developed interdisciplinary master in product design is presented and, within this program, an educational activity fostering creativity in heterogeneous multiple disciplinary environments is described. Inspired by the approach of industrial designers to generate creative solutions, it is conceived to help product design students to flexibly adapt the problem and the solution space together through an iterative process.
Large, deep full-thickness skin wounds from high-grade burns or trauma are not able to reepithelialize sufficiently, resulting in scar formation, mobility limitations, and cosmetic deformities. Thus, in vitro-constructed tissue replacements are needed. Furthermore, such full-skin equivalents would be helpful as in vivo-like test systems for toxicity, cosmetic, and pharmaceutical testing. To date, no skin equivalent is available that contains the underlying subcutaneous fatty tissue. In this study, we composed a full-skin equivalent and evaluated three different media for the coculture of mature adipocytes, fibroblasts, and keratinocytes. To this end, adipocyte medium was supplemented with ascorbyl-2-phosphate and calcium chloride, which are important for successful epidermal stratification (Air medium). This medium was further supplemented with two commercially available factor combinations often used for the in vitro culture of keratinocytes (Air-HKGS and Air-KGM medium). We showed that in all media, keratinocytes differentiated successfully to build a stratified epidermal layer and expressed cytokeratin 10 and 14. Perilipin A-positive adipocytes could be found in all tissue models for up to 14 days, whereas adipocytes in the Air-HKGS and Air-KGM media seemed to be smaller. Adipocytes in all tissue models were able to release adipocyte-specific factors, whereas the supplementation of keratinocyte-specific factors had a slightly negative effect on adipocyte functionality. The permeability of the epidermis was comparable across all models, since they were able to withstand deep penetration of cytotoxic Triton X in the same manner. Taken together, we were able to compose functional three-layered full-skin equivalents by using the Air medium.
Purpose: The purpose of this paper is to find out the influence of sustainability labels on fashion buying behaviour. Although key information about Fair Trade is provided in all stores of the sample company, customers seem not to be aware of the Fair Trade concept. Therefore, this paper aims to give recommendations for a fashion retailer in terms of customer education about Fair Trade by answering the following research questions: What influence do sustainability labels exert on customers' buying behaviour? Are consumers of textile products aware of the function and background of the Fair Trade label?
Design/methodology/approach: A paper-based questionnaire was administered to 128 customers of the German fashion retailer "Adler Modemärkte AG" in four city stores, of which 127 were correctly completed. Additionally, an adjusted self-completion questionnaire was administered to 50,000 customers online, of which a total of 1,712 were correctly completed. Descriptive analysis and cross tabulations were applied to abstract the main research findings and evaluate the hypotheses.
Findings: Key findings suggest that Adler should either enhance its communication strategy regarding Fair Trade or remove Fair Trade products from the assortment, as the majority of respondents are not aware of Adler's Fair Trade products. The Fair Trade label could be identified neither as a barrier to consumers nor as a sales driver. Further findings revealed that participants have more knowledge about Fair Trade than initially assumed.
Research limitations/implications: Predominantly women aged between 56 and 75 participated in the survey. Findings are limited by geography, the target group of the fashion retailer Adler, gender, age group, and the questionnaire research method.
In the last decades, several driving systems were developed to improve driving behaviour in terms of energy efficiency or safety. However, these driving systems cover either energy efficiency or safety. Furthermore, they do not consider the stress level of the driver when showing a recommendation, although stress can lead to unsafe or inefficient driving behaviour. In this paper, an approach is presented that considers the driver's stress level in a driving system for safe and energy-efficient driving behaviour. The driving system tries to suppress a recommendation when the driver is stressed, in order not to stress the driver additionally with recommendations in an already stressful driving situation. This can increase road safety and the user acceptance of the driving system, as the driver is not bothered or stressed by the driving system.
The evaluation of the approach showed that the driving system is able to show recommendations to the driver while also reacting to a high stress level by suppressing recommendations in order not to stress the driver additionally.
Influence of metallization layout on aging detector lifetime under cyclic thermo-mechanical stress
(2016)
The influence of the layout on early warning detectors in BCD technologies for metallization failure under cyclic thermo-mechanical stress was investigated. Different LDMOS transistors, with narrow or wide metal fingers and with or without embedded detectors, were used. The test structures were repeatedly stressed by pronounced self-heating until a failure (a short circuit) was detected. The results show that the layout of the on-chip metallization has a large impact on the lifetime. A significant influence of the detectors on the lifetime was also observed, in our case causing a reduction of more than a factor of two, but only for the test structure with narrow metal fingers. The experimental results are explained by an efficient numerical thermo-mechanical simulation approach, giving detailed insights into the strain distribution in the metal system. These results are important for aging detector design and, moreover, for LDMOS on-chip metal layout in general.
The composition of vascularized adipose tissue is still an ongoing challenge, as no culture medium is available to supply adipocytes and endothelial cells appropriately. Endothelial cell medium is typically supplemented with epidermal growth factor (EGF) as well as hydrocortisone (HC). The effect of EGF on adipocytes is discussed controversially: some studies report that it inhibits adipocyte differentiation, while others report improved adipocyte lipogenesis. HC is known to have lipolytic activity, which might result in mature adipocyte dedifferentiation. In this study, we evaluated the influence of EGF and HC on the co-culture of endothelial cells and mature adipocytes regarding their cell morphology and functionality. We showed in mono-culture that high levels of HC promoted dedifferentiation and proliferation of mature adipocytes, whereas EGF seemed to have no negative influence. Endothelial cells kept their typical cobblestone morphology and showed a proliferation rate comparable to the control, independent of EGF and HC concentration. In co-culture, HC promoted dedifferentiation of mature adipocytes, which was shown by a higher glycerol release. EGF had no negative impact on adipocyte morphology. No negative impact on endothelial cell morphology and functionality could be seen with reduced EGF and HC supplementation in co-culture with mature adipocytes. Taken together, our results demonstrate that reduced levels of HC are needed for co-culturing mature adipocytes and endothelial cells. In co-culture, EGF had no influence on mature adipocytes. Therefore, for the composition of vascularized adipose tissue constructs, media with low levels of HC and either high or low levels of EGF can be used.
Industrial hybrid systems with high pv penetration : performance, analysis and key success factors
(2016)
Since the first industrial-scale hybrid system was installed by SMA in 2012, information about the performance of several hybrid systems around the world has been monitored. This paper analyses the performance of SMA's largest industrial-scale PV-Diesel hybrid system, installed in Bolivia in 2014, and summarizes the lessons learned from managing this system with large-scale energy storage. The paper concludes with an outlook for future hybrid systems.
Purpose – The purpose of this paper is to discuss the applicability of current benchmarking proposals for small and medium-sized enterprises (SMEs) and to suggest a condensed process for logistics benchmarking in SMEs.
Design/methodology/approach – The paper starts by outlining why the logistics function is of increasing importance for SMEs. It discusses the benefit of logistics benchmarking and typical SME restrictions in benchmarking. Available approaches to benchmarking are discussed and their weaknesses when applied to SME logistics benchmarking are analyzed. The paper develops a new benchmarking process framework for SME logistics benchmarking and reports findings of a case application in three German SMEs.
Increasingly volatile market conditions and manufacturing environments, combined with a rising demand for highly personalized products, the emergence of new technologies like cyber-physical systems and additive manufacturing, as well as an increasing cross-linking of different entities (Industrie 4.0), will result in fundamental changes of future work and logistics systems. The place of production, the logistical network and the respective production system will be subject to constant change, and therefore the sources and sinks of logistical networks have to match the versatility of (cyber-physical) production systems. To cope with the arising complexity of controlling and monitoring changeable production and logistics systems, decentralized control systems are the means of choice, since centralized systems are pushed to their limits in this regard. This paradigm shift will affect the overall concept under which production and logistics are planned, managed and controlled, and how companies interact and collaborate within the emerging value chains by using dynamic methods to generate and execute the created network and to allocate available resources to fulfill the demand for customized products. In this field of research, learning factories like the ESB Logistics Learning Factory at ESB Business School (Reutlingen University) provide great potential as a risk-free test bed to develop new methods and technical solutions, to investigate new technologies regarding their practical use, and to transfer the latest state of knowledge and specific competences into the training of students and professionals. In keeping with these guiding principles, ESB Business School is transferring its existing production system into a cyber-physical production system to investigate innovative solutions for the design of human-machine collaboration and technical assistance systems, as well as to develop decentralized control methods for intralogistics systems following the requirements of changeable work systems, including the respective design of dynamic inbound and outbound logistics networks.
The purpose of this paper is to examine the relationship between consumers' perception of sustainability and the application of a QR code in stores, with a focus on information searching behavior regarding sustainable aspects. An online questionnaire was conducted with fashion students at Reutlingen University; in total, 65 students participated in the survey. Paired-samples t-tests and other statistical analyses were applied to test the research questions. Apart from this, the research paper is based on a literature review. Furthermore, the decision was taken to use a projective method in the form of a dummy fashion fTRACE website. Key findings of the survey are that participants give sustainable aspects a higher importance with a QR code than without one. Participants who prefer a product with detailed information experience a "positive shopping feeling" when provided with transparency via a QR code. "Origin", "production" and "quality" were rated of higher importance by those participants. These findings suggest that transparency provided through the application of a QR code in stores influences consumers' perception of sustainability. Due to the small sample size (65 participants), the findings of this research are not generalizable to a larger population. This paper focused on consumers' information searching behavior regarding sustainable aspects, limiting its findings to impacts on the perception of sustainability. Further research is therefore recommended.
Analysis of multicellular patterns is required to understand tissue organizational processes. By using a multi-scale, object-oriented image processing method, the spatial information of cells can be extracted automatically. Instead of manual segmentation or indirect measurements, such as the general distribution of contrast or flow, the orientation and distribution of individual cells are extracted for quantitative analysis. Relevant objects are identified by feature queries, and no low-level knowledge of image processing is required.
Current techniques for chromosome analysis need to be improved for rapid, economical identification of complex chromosomal defects by sensitive and selective visualisation. In this paper, we present a straightforward method for characterising unstained human metaphase chromosomes. Backscatter imaging in a dark-field setup combined with visible and short near-infrared spectroscopy is used to monitor morphological differences in the distribution of the chromosomal fine structure in human metaphase chromosomes. The reasons for the scattering centres in the fine structure are explained. Changes in the scattering centres during preparation of the metaphases are discussed. FDTD simulations are presented to substantiate the experimental findings. We show that local scattering features consisting of underlying spectral modulations of higher frequencies associated with a high variety of densely packed chromatin can be represented by their scatter profiles even on a sub-microscopic level. The result is independent of the chromosome preparation and structure size. This analytical method constitutes a rapid, cost-effective and label-free cytogenetic technique which can be used in a standard light microscope.
In bioprinting approaches, the choice of bioink plays an important role since it must be processable with the selected printing method, but also cytocompatible and biofunctional. Therefore, a crosslinkable gelatin-based ink was modified with hydroxyapatite (HAp) particles, representing the composite buildup of natural bone. The inks’ viscosity was significantly increased by the addition of HAp, making the material processable with extrusion-based methods. The storage moduli of the formed hydrogels rose significantly, depicting improved mechanical properties. A cytocompatibility assay revealed suitable ranges for photoinitiator and HAp concentrations. As a proof of concept, the modified ink was printed together with cells, yielding stable three-dimensional constructs containing a homogeneously distributed mineralization and viable cells.
Nowadays, software development plays an important role in the entire value chain of production machine and plant engineering. An important component for the rapid development of high-quality software is virtual commissioning. The real machine is described on the basis of simulation models. Therefore, the control software can be verified at an early stage using the simulation models. Since production machines are highly individualized or produced in very small series, the challenge of virtual commissioning is to reduce the effort involved in developing simulation models. Therefore, systematic reuse of the simulation models and the control software for different variants of a machine is essential for economic use. This necessarily requires a consideration of the variability which may occur between the production machines. This contribution analyzes the question of how to systematically deal with software-related variability in the context of virtual commissioning. For this purpose, first the characteristics of virtual commissioning and variability handling are considered. Subsequently, the requirements for a so-called variant infrastructure for virtual commissioning are analyzed and possible solutions are discussed.
Recent MIT CISR research found that an obsessive focus on innovation is a characteristic of CIOs of top-performing firms. There are now more ways than ever that a firm can be disrupted by, and disruptive with, digital innovations. Indeed, a growing number of firms and individuals are using increasingly powerful digital technologies and figuring out ways to develop better products and services, better customer and employee experiences, and new business models. The new digital imperative is to compete with more types of digital innovations - and IT units must refine approaches to producing them. Based on an in-depth case study, this briefing takes a look at how German car manufacturer AUDI AG has expanded its portfolio of digital innovations.
For decades, Software Process Improvement (SPI) programs have been implemented, inter alia, to improve the quality and speed of software development. To set up, guide, and carry out SPI projects, and to measure SPI state, impact, and success, a multitude of different SPI approaches and considerable experience are available. SPI addresses many aspects, ranging from individual developer skills to entire organizations. It comprises, for instance, the optimization of specific activities in the software lifecycle as well as the creation of organizational awareness and project culture. In the course of conducting a systematic mapping study on the state-of-the-art in SPI from a general perspective, we observed Global Software Engineering (GSE) becoming a topic of interest in recent years. Therefore, in this paper, we provide a detailed investigation of those papers from the overall systematic mapping study that were classified as addressing SPI in the context of GSE. From the main study’s result set, a set of 30 papers dealing with GSE was selected for an in-depth analysis using the systematic review instrument to study the contributions and to develop an initial picture of how GSE is considered from the perspective of SPI. Our findings show that the analyzed papers deliver a substantial discussion of cultural models and how such models can be used to better address and align SPI programs with multi-national environments. Furthermore, experience is shared on how agile approaches can be implemented in companies working at the global scale. Finally, success factors and barriers are studied to help companies implement SPI in a GSE context.
The effect of Hofmeister anions on the surface properties of polyelectrolyte multilayers built from hyaluronan and chitosan by layer-by-layer deposition is studied by ellipsometry and atomic force microscopy. The thickness, roughness and morphology of the resulting coatings were found to depend on the type of anion. A relationship between the surface properties and the biological response of the polyelectrolyte multilayers is established by assessing the degree of protein (albumin) adsorption.
The limited interfaces of today's IC design environments for editing PCell parameters hinder a solid advancement towards more complex analog PCell modules. This paper presents Hierarchical Instance Parameter Editing (HIPE), a highly flexible concept for the customization of PCell sub-instances. Introducing a new type of parameter, HIPE facilitates the dynamic creation of multi-level editing forms reflecting the actual contents of a PCell instance. This approach greatly improves a PCell's ease-of-use, substantially simplifies PCell development, and allows for a hierarchical execution of parameter validation callbacks. Our HIPE implementation has been integrated into a professional PCell development tool and represents a key enabling technology for upcoming generations of high-level hierarchical PCells.
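The following sketch illustrates the general idea of editing sub-instance parameters through a hierarchy with validation callbacks, as described above. The data structure, method names, and example cells are hypothetical illustrations, not HIPE's actual tool integration or API.

```python
# Conceptual sketch: a PCell exposes its sub-instances' parameters as a
# nested structure; setting a value runs validation callbacks from the
# edited sub-instance up to the root. All names here are hypothetical.

class PCell:
    def __init__(self, name, params=None, validate=None):
        self.name = name
        self.params = dict(params or {})
        self.children = {}              # sub-instances, editable in place
        self.validate = validate        # optional callback(params) -> None

    def add_child(self, cell):
        self.children[cell.name] = cell
        return cell

    def set_param(self, path, value):
        """Set e.g. ('diffpair', 'width') and validate along the hierarchy."""
        *prefix, key = path
        node, chain = self, [self]
        for name in prefix:
            node = node.children[name]
            chain.append(node)
        node.params[key] = value
        for cell in reversed(chain):    # hierarchical validation callbacks
            if cell.validate:
                cell.validate(cell.params)

def min_width(params):
    # Example validation callback for a hypothetical process rule.
    if params.get("width", 1.0) < 0.2:
        raise ValueError("width below process minimum")

top = PCell("opamp")
top.add_child(PCell("diffpair", {"width": 1.0}, validate=min_width))
top.set_param(("diffpair", "width"), 0.5)   # edits a sub-instance parameter
```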