In this presentation the audience will be: (a) introduced to the aims and objectives of the DBTechNet initiative, (b) briefed on the DBTech EXT virtual laboratory workshops (VLW), i.e. the education and training (E&T) content that is freely available over the internet and includes vendor-neutral, hands-on laboratory training sessions on key database technology topics, and (c) informed about some of the practical problems encountered and the way they have been addressed. Last but not least, the audience will be invited to consider incorporating some or all of the DBTech EXT VLW content into their higher education (HE), vocational education and training (VET), and/or lifelong learning curricula. This comes at no cost and with no commitment on the part of the teacher/trainer, who is only expected to provide feedback on the pedagogical value and quality of the E&T content received/used.
The Dow Jones Sustainability Indexes (DJSI) track the performance of companies that lead in corporate sustainability in their respective sectors or in the geographies in which they operate. Sustainable Asset Management (SAM) Indexes GmbH publishes and markets the indexes. All indexes of the DJSI family are assessed according to SAM's Corporate Sustainability Assessment™ methodology.
In the period from the 1950s to 2013, the American Food and Drug Administration (FDA) approved 1346 new molecular entities (NMEs) or new biologic entities (NBEs), an average approval rate of 20 NMEs per year. Over the past 40 years, the number of new drugs launched onto the market increased slightly, from 15 NMEs per year in the 1970s to 25–30 NMEs since the 1990s. The highest numbers of new drugs were approved by the FDA in 1996 and 1997, which might be related to the enactment of the Prescription Drug User Fee Act (PDUFA) in 1993.
The use of Wireless Sensor and Actuator Networks (WSAN) as an enabling technology for Cyber-Physical Systems has increased significantly in the recent past. The challenges that arise in the different application areas of Cyber-Physical Systems in general, and in WSAN in particular, are attracting the attention of both academia and industry. Since the reliability of message delivery in wireless communication is of critical importance for certain safety-related applications, it is one of the areas that has received significant focus in the research community. Additionally, the diverse needs of different applications place different demands on the lower layers of the protocol stack, necessitating mechanisms in the lower layers that enable them to adapt dynamically. Another major issue in the realization of networked, wirelessly communicating cyber-physical systems in general, and WSAN in particular, is the lack of approaches that tackle the reliability, configurability and application-awareness issues together. One could consider tackling these issues in isolation. However, their interplay creates challenges that force application developers to spend more time on meeting them, often in suboptimal ways, than on solving the problems of the application being developed. Starting from fundamental concepts, general issues and problems in cyber-physical systems, this chapter discusses issues such as energy efficiency and application and channel awareness for networked, wirelessly communicating cyber-physical systems. Additionally, the chapter describes a middleware approach called CEACH, an acronym for Configurable, Energy-efficient, Application- and Channel-aware clustering-based middleware service for cyber-physical systems.
The chapter describes the state of the art in the area of cyber-physical systems, with a special focus on communication reliability, configurability, and application- and channel-awareness, and shows how these features have been incorporated into the CEACH approach. Important node-level and network-level characteristics and their significance for the design of applications for cyber-physical systems are discussed, as is the issue of adaptively controlling the impact of these factors with respect to application demands and network conditions. The chapter also describes Fuzzy-CEACH, an extension of the CEACH middleware service that uses fuzzy logic principles. The fuzzy descriptors used in the different stages of Fuzzy-CEACH are described, and the fuzzy inference engine used in the Fuzzy-CEACH cluster-head election process is presented in detail, together with the rule bases the inference engine uses in the different stages, to give an insightful description of the protocol. The chapter further discusses in detail the experimental results validating the concepts presented in the CEACH approach, as well as the applicability of the CEACH middleware service in different application scenarios in the domain of cyber-physical systems. It concludes by shedding light on publish-subscribe mechanisms in distributed event-based systems and showing how, if the WSAN is modeled as a distributed event-based system, they can use the CEACH middleware to reliably communicate detected events to the event consumers or actuators.
The capability of immersion transmission ellipsometry (ITE) (Jung et al., Int. Patent WO 2004/109260) not only to determine three-dimensional refractive indices in anisotropic thin films (which was already possible in the past) but even their gradients along the z-direction (perpendicular to the film plane) is investigated in this paper. It is shown that the determination of orientation gradients in deep-sub-μm films becomes possible by applying ITE in combination with reflection ellipsometry. The technique is supplemented by atomic force microscopy for measuring the film thickness. For a photo-oriented thin film, no gradient was found, as expected. For a photo-oriented film which was subsequently annealed in a nematic liquid-crystalline phase, an order was found similar to the one applied in vertically aligned nematic displays, with a tilt angle varying along the z-direction. For fresh films, gradients were only detected for the refractive index perpendicular to the film plane, as expected.
Stent graft visualization and planning tool for endovascular surgery using finite element analysis
(2014)
Purpose: A new approach to optimizing stent graft selection for endovascular aortic repair is the use of finite element analysis. Once the finite element model is created and solved, a software module is needed to view the simulation results in the clinical work environment. A new tool for the interpretation of simulation results, named Medical Postprocessor, which enables the comparison of different stent graft configurations and products, was designed, implemented and tested.
Methods: Aortic endovascular stent graft ring forces and sealing states in the vessel landing zone of three different configurations were provided in a surgical planning software based on the Medical Imaging Interaction Toolkit (MITK). For data interpretation, software modules for 2D and 3D presentation were implemented. Ten surgeons evaluated the software features of the Medical Postprocessor, performed usability tests and answered questionnaires based on their experience with the system.
Results: The Medical Postprocessor visualization system enabled vascular surgeons to determine the configuration with the highest overall fixation force in 16 ± 6 s, the best proximal sealing in 56 ± 24 s, and the highest proximal fixation force in 38 ± 12 s. The majority considered the multi-format data provided helpful and found the Medical Postprocessor to be an efficient decision support system for stent graft selection. The evaluation of the user interface resulted in an ISONORM-conformant user interface (113.5 points).
Conclusion: The Medical Postprocessor visualization software tool for analyzing stent graft properties was evaluated by vascular surgeons. The results show that the software can assist in the interpretation of simulation results to optimize stent graft configuration and sizing.
Intra-operative fluoroscopy-guided assistance system for transcatheter aortic valve implantation
(2014)
A new surgical assistance system has been developed to support the correct positioning of the AVP during transapical TAVI. The assistance system automatically defines the target area for implanting the AVP under live 2-D fluoroscopy guidance. Moreover, it works with low levels of contrast agent for the final deployment of the AVP, thereby reducing long-term negative effects such as renal failure in elderly and high-risk patients.
Several diseases occur due to asbestos exposure, and asbestos-related mortality and morbidity are predicted to increase because of the long latency period. Currently, the methods used to investigate asbestos-related disease are mostly invasive. Therefore, the aim of the present paper was to investigate whether signals in human breath can be correlated with asbestos-related lung diseases, using a multi-capillary column (MCC) coupled to an ion mobility spectrometer (IMS) as a non-invasive method. Breath samples of 10 mL were taken from 25 patients suffering from asbestos-related diseases (BK4103). This group includes patients with asbestos-related pleural thickening with and without pulmonary fibrosis. Twelve healthy persons constituted the control group, and their breath samples were compared with those of the BK4103 patients. In total, 83 peaks were found in the IMS chromatogram. Discrimination was possible with p-values < 0.001 (99.9 %) for two peaks, < 0.01 (99 %) for 5 peaks and < 0.05 (95 %) for 17 peaks. The most discriminating peaks, alpha-pinene and 4-ethyltoluene, were identified, along with some others of lower significance. The corresponding box-and-whisker plots comparing both groups are presented. In addition, a decision tree including all peaks was created that differentiates, based on alpha-pinene, between the BK4103 (pleural plaques) group and the control group. The sensitivity was calculated as 96 %, specificity as 50 %, and positive and negative predictive values as 80 % and 86 %, respectively. Ion mobility spectrometry was thus introduced as a non-invasive method to separate the asbestos-related and healthy groups. Naturally, the findings need further confirmation in larger populations, but they encourage further investigation.
Children undergoing systemic chemotherapy often suffer from severe immunosuppression, usually associated with severe neutropenia (neutrophils < 0.5 × 10⁹/l). Clinical courses during these periods range from asymptomatic to septic general conditions. The development of septic symptoms can be very fast and life-threatening, so swift detection of risk factors in these patients is needed; so far, no early, rapid and reliable marker or tool exists. Ion mobility spectrometry coupled with a multi-capillary column (IMS-MCC) can analyze more than 600 volatile components in exhaled air within a few minutes and is hence a potential rapid detection tool. As a proof of concept, we measured the exhaled breath of 11 patients with neutropenia and 10 healthy controls, aged 3 to 18 years at the time of measurement. Ten-milliliter breath samples were taken at the outpatient clinic and analyzed with an on-site IMS-MCC device (BreathDiscovery, B&S Analytik, Dortmund, Germany). The dead-space volume was adapted in two groups (small: 250 ml, large: 500 ml). In total, 59 differing peaks were measured, eleven of which were significantly different (p ≤ 0.05) and three highly significantly different (p ≤ 0.01) in Mann-Whitney rank-sum testing. The analytes used in the decision tree are 2-propanol, D-limonene and acetone; the analytes identified with the lowest rank sums are 2-hexanone, isopropylamine and 1-butanol. Eventually, we were able to construct a three-step decision tree that assigns all but one of the 21 samples to the correct group. Sensitivity was 90 % and specificity 91 %. Naturally, these findings need further confirmation in a bigger population. Our pilot study shows that ion mobility spectrometry coupled with a multi-capillary column is a feasible rapid diagnostic tool in the setting of a pediatric oncology outpatient clinic for patients 3 years and older.
Our first results furthermore encourage additional analysis of whether patients at risk for septic events during immunosuppression can be identified in advance by rapidly assessing risk factors such as neutropenia in exhaled breath.
Ion mobility spectrometry coupled to multi-capillary columns (MCC/IMS) combines highly sensitive spectrometry with a rapid separation technique and is widely used for biomedical breath analysis. The identification of molecules in such a complex sample necessitates a reference database. The existing IMS reference databases are still in their infancy and do not yet allow all analytes to be identified. With a gas chromatograph coupled to a mass-selective detector (GC/MSD) set up in parallel to an MCC/IMS instrument, the accuracy of automatic analyte identification can be increased. To overcome the time-consuming manual evaluation and comparison of the results of both devices, we developed the software tool MIMA (MS-IMS-Mapper), which can computationally generate analyte layers for MCC/IMS spectra from the corresponding GC/MSD data. We demonstrate the power of our method by successfully identifying the analytes of a seven-component mixture. In conclusion, the main contribution of MIMA is a fast and easy computational method for assigning analyte names to as-yet unassigned signals in MCC/IMS data. We believe that this will greatly benefit modern MCC/IMS-based biomarker research by 'giving a name' to previously detected disease-specific molecules.
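The core mapping step — transferring analyte names from GC/MSD identifications to unassigned MCC/IMS peaks — can be illustrated with a simple retention-time alignment. This is a hedged sketch only: MIMA's actual layer-generation algorithm is not public in this abstract, and the function name, data layout and tolerance value are illustrative assumptions.

```python
# Illustrative sketch (not MIMA's actual algorithm): assign analyte
# names from GC/MSD identifications to unassigned MCC/IMS peaks by
# nearest retention time within a tolerance.

def map_peaks(gc_hits, ims_peaks, tolerance=2.0):
    """gc_hits:  list of (analyte_name, retention_time_s)
    ims_peaks:  list of (peak_id, retention_time_s)
    Returns a dict {peak_id: analyte_name} for peaks whose nearest
    GC/MSD identification lies within `tolerance` seconds."""
    mapping = {}
    for peak_id, rt in ims_peaks:
        # pick the GC/MSD identification closest in retention time
        best = min(gc_hits, key=lambda hit: abs(hit[1] - rt))
        if abs(best[1] - rt) <= tolerance:
            mapping[peak_id] = best[0]
    return mapping

gc_hits = [("acetone", 12.1), ("2-propanol", 18.4), ("limonene", 55.0)]
ims_peaks = [("P1", 12.5), ("P2", 54.2), ("P3", 30.0)]
# P3 has no GC/MSD hit within tolerance and stays unassigned.
mapping = map_peaks(gc_hits, ims_peaks)
```

In practice a simple linear mapping would be replaced by a calibrated retention-time model between the two instruments, but the nearest-match-within-tolerance idea stays the same.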
Online measurement of drug concentrations in a patient's breath is a promising approach to individualized dosage. A direct transfer from breath to blood concentrations is not possible: in non-steady-state situations, measured exhaled concentrations follow the blood concentration with a delay. It is therefore necessary to integrate the breath concentration into a pharmacological model. Two different approaches to pharmacokinetic modelling are presented. Usually, a 3-compartment model is used for pharmacokinetic calculations of blood concentrations. In the first approach, this 3-compartment model is extended with a 2-compartment model consisting of the first compartment of the 3-compartment model and a new lung compartment. The second approach is to calculate a time delay of changes in the concentration of the first compartment to describe the lung concentration. As an example, both approaches are used to model exhaled propofol. Based on a time series of exhaled propofol measurements taken with an ion mobility spectrometer every minute for 346 min, a correlation of the calculated plasma and breath concentrations was used for modelling, yielding interdependencies of R² = 0.99. With the time-delay modelling approach, the new compartment coefficient was calculated as ke0lung = 0.27 min⁻¹ with R² = 0.96. The described models are not limited to propofol; they can be used for any drug that is measurable in a patient's breath.
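The time-delay approach can be sketched as a first-order link between plasma and lung concentration, dC_lung/dt = ke0·(C_plasma − C_lung). The minimal sketch below uses the reported ke0lung = 0.27 min⁻¹; the constant plasma profile, the Euler step and the function name are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the time-delay (effect-site style) model:
# dC_lung/dt = ke0 * (C_plasma - C_lung), integrated with a simple
# Euler step. ke0 = 0.27 min^-1 is the value reported in the text;
# the plasma profile below is an arbitrary constant-infusion example.

def simulate_lung(c_plasma, ke0=0.27, dt=1.0):
    """c_plasma: plasma concentrations sampled every dt minutes.
    Returns the lung-compartment concentration at each sample."""
    c_lung = 0.0
    trace = []
    for cp in c_plasma:
        c_lung += dt * ke0 * (cp - c_lung)  # first-order lag
        trace.append(c_lung)
    return trace

# With a constant plasma level, the lung concentration approaches it
# with a characteristic lag of roughly 1/ke0 minutes.
trace = simulate_lung([2.0] * 60)
```

This makes the abstract's point concrete: in a non-steady state the breath-side concentration lags the plasma concentration, so the raw breath value cannot be read as a blood value without the model.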
Increasing concerns regarding the world's natural resources and sustainability continue to be a major issue for global development. As a result, several political initiatives and strategies for green or resource-efficient growth have been proposed at both national and international levels. A core element of these initiatives is the promotion of an increase in resource or material productivity. This dissertation examines material productivity developments in the OECD and BRICS countries between 1980 and 2008. By applying the concept of convergence, which stems from economic growth theory, to material productivity, the analysis provides insights into two aspects: material productivity developments in general, as well as potentials for accelerated improvements in material productivity, which may consequently allow a global reduction of material use. The results of the convergence analysis underline the importance of policy-making with regard to technology and innovation policy enabling the production of resource-efficient products and services, as well as technology transfer and diffusion.
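The convergence concept borrowed from growth theory can be illustrated with a minimal beta-convergence regression: each country's average productivity growth is regressed on the log of its initial productivity level, and a negative slope indicates that laggards catch up. The code and data below are a sketch under that textbook definition, not the dissertation's actual estimation or data.

```python
# Illustrative beta-convergence test: regress average growth of
# material productivity on the log of the initial level. A negative
# OLS slope suggests (absolute) convergence. Data are invented.
import math

def beta_convergence(initial, final, years):
    """Return the OLS slope b of growth_i = a + b * log(initial_i),
    where growth_i = (log(final_i) - log(initial_i)) / years."""
    x = [math.log(v) for v in initial]
    y = [(math.log(f) - math.log(i)) / years
         for i, f in zip(initial, final)]
    mx = sum(x) / len(x)
    my = sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    return cov / var

# Hypothetical material-productivity levels for five countries,
# 1980 vs. 2008 (28 years): low starters grow faster here, so the
# slope comes out negative (convergence).
b = beta_convergence([0.5, 1.0, 2.0, 3.0, 4.0],
                     [1.4, 2.2, 3.4, 4.4, 5.2], 28)
```

A slope near zero would instead indicate that productivity gaps persist, which is why the sign and size of this coefficient carry the policy message of the analysis.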
A configuration-management-database driven approach for fabric-process specification and automation
(2014)
In this paper we describe an approach that integrates a Configuration Management Database into fabric-process specification and automation in order to take different conditions regarding cloud services into account. By implementing our approach, the complexity of fabric processes is reduced. Using formal prototyping principles as our research method, we developed a prototype that integrates the Configuration-Management-Database command into the workflow management system Activiti, and used this prototype to evaluate our approach. We implemented three different fabric processes and show that our approach reduces their complexity.
The analysis of exhaled metabolites has become a promising field of research in recent decades. Several volatile organic compounds reflecting metabolic disturbance and nutrition status have been reported. These are particularly important for long-term measurements, as needed in medical research for the detection of disease progression and therapeutic efficacy. In this context, it has become urgent to investigate the effect of fasting and glucose treatment on breath analysis. In the present study, we used a model of ventilated rats that had fasted for 12 h prior to the experiment. Ten rats per group were randomly assigned to continuous intravenous infusion without glucose or to an infusion including 25 mg glucose per 100 g per hour during an observation period of 12 h. Exhaled gas was analysed using multi-capillary column ion mobility spectrometry. Analytes were identified with the BS-MCC/IMS database (version 1209; B & S Analytik, Dortmund, Germany). Glucose infusion led to a significant increase in blood glucose levels (p < 0.05 at 4 h and thereafter) and cardiac output (p < 0.05 at 4 h and thereafter). During the observation period, 39 peaks were found in total. There were significant differences between the groups in the concentrations of ten volatile organic compounds: p < 0.001 at 4 h and thereafter for isoprene, cyclohexanone, acetone, p-cymol, 2-hexanone, phenylacetylene and one unknown compound, and p < 0.001 at 8 h and thereafter for 1-pentanol, 1-propanol and 2-heptanol. Our results indicate that for long-term measurements, fasting and the withholding of glucose can contribute to changes of volatile metabolites in exhaled air.
Stress is recognized as a predominant factor in disease, and the costs of treating its consequences will increase in the future. The presented approach tries to detect stress in a very basic and easy-to-implement way, so that the cost of the device and the effort of wearing it remain low. The user benefits from an easy interface that reports the status of the body in real time. In parallel, the system provides interfaces to pass the obtained data on for further processing and (professional) analysis, if the user agrees. The system is designed to be used during everyday activities and is not restricted to laboratory use or environments. The implementation of the enhanced prototype shows that the detection and reporting of stress can be managed using correlation plots and automatic pattern recognition even on a very lightweight microcontroller platform.
An ongoing challenge today is to lower, through individual support, the impact of dysfunctionality on quality of life. Against the background of an aging society and continuously increasing costs of care, a holistic solution is needed. This solution must integrate individual needs and preferences, locally available possibilities, regional conditions, and professional and informal caregivers, and it must provide the flexibility to accommodate future requirements. The proposed model is the result of a common initiative to overcome the major obstacles and to center a solution on the individual needs caused by dysfunctionality.
New or adapted digital business models have a huge impact on Enterprise Architectures (EA) and require them to become more agile, flexible, and adaptable. These changes happen frequently and are currently not well documented. An EA consists of many elements with manifold relationships between them, so changing the business model may have multiple impacts on other architectural elements. The EA engineering process deals with the development, change and optimization of architectural elements and their dependencies. An EA thus provides a holistic view of both business and IT from the perspective of the many stakeholders involved in EA decision-making processes. Different stakeholders have specific concerns and today collaborate in often unclear decision-making processes. In our research, we investigate information from collaborative decision-making processes to support stakeholders in taking current decisions. In addition, we provide all the information necessary to understand how and why decisions were taken. We collect the decision-related information automatically to minimize manual, time-intensive work as much as possible. The core contribution of our research extends a decisional metamodel that links basic decisions with architectural elements and extends them with an associated decisional case context. Our aim is to support a new integral method for multi-perspective, collaborative decision-making processes. We illustrate this with a practice-relevant decision-making scenario for Enterprise Architecture engineering.
The proposed approach applies current unsupervised clustering approaches in a different, dynamic manner. Instead of taking all the data as input and finding clusters among them, the given approach clusters Holter ECG data (long-term electrocardiography data from a Holter monitor) on a given interval, which enables a dynamic clustering approach (DCA). To this end, advanced clustering techniques based on the well-known Dynamic Time Warping algorithm are used. Once clusters are available, e.g. on a daily basis, they can be compared by defining cluster-shape properties. This yields a measure of variation in unsupervised cluster shapes and may reveal unknown changes in health status. Embedding this approach into wearable devices offers advantages over current techniques. On the one hand, users get feedback if the characteristics of their ECG data change unexpectedly over time, which makes early detection possible. On the other hand, cluster properties such as the biggest or smallest cluster may help a doctor in making diagnoses or observing several patients. Furthermore, known processing techniques such as stress detection or arrhythmia classification may be applied to the clusters found.
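The distance measure underlying this clustering, Dynamic Time Warping, can be sketched in its textbook form (the paper's exact DTW variant and its cluster-shape properties are not reproduced here):

```python
# Textbook dynamic-time-warping (DTW) distance, the building block of
# the clustering approach described above.

def dtw(s, t):
    """DTW distance between two numeric sequences s and t."""
    n, m = len(s), len(t)
    INF = float("inf")
    # d[i][j] = minimal cost of aligning s[:i] with t[:j]
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(s[i - 1] - t[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # stretch t
                                 d[i][j - 1],      # stretch s
                                 d[i - 1][j - 1])  # step both
    return d[n][m]

# A time-stretched copy of a beat stays close under DTW even though
# the samples are misaligned in time:
a = [0, 1, 2, 1, 0]
b = [0, 1, 1, 2, 2, 1, 0]
```

Because DTW aligns sequences non-linearly in time, two heartbeats of different duration but similar shape receive a small distance, which is what makes it a natural basis for clustering ECG intervals.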
New business concepts such as Enterprise 2.0 foster the use of social software in enterprises. Social production in particular significantly increases the amount of data in the context of business processes. Unfortunately, these data are still an untapped treasure in many enterprises. Due to advances in data processing such as Big Data, the exploitation of context data is becoming feasible. To provide a foundation for the methodical exploitation of context data, this paper introduces a classification based on two classes: intrinsic and extrinsic data.
Modern enterprises are continuously reshaped and transformed by a multitude of management processes with different perspectives, ranging from business process management to IT service management and the management of information systems. Enterprise Architecture (EA) management seeks to provide such a perspective and to align the diverse management perspectives. To achieve and promote this alignment, EA management cannot rely on hierarchic management processes designed in a Tayloristic manner. It has, conversely, to apply bottom-up, information-centered coordination mechanisms to ensure that the different management processes are aligned with each other and with enterprise strategy. Social software provides such a bottom-up mechanism for support within EAM processes. Consequently, the challenges of EA management processes are investigated and the contributions of social software are presented. A cockpit provides interactive functions and visualization methods to cope with this complexity and to enable the practical use of social software in enterprise architecture management processes.
Leveraging textual information for improving decision making in the business process lifecycle
(2015)
Business process implementations fail because requirements are elicited incompletely. At the same time, a huge amount of unstructured data goes unused for decision-making during the business process lifecycle. Data from questionnaires and interviews are collected but not exploited, because the effort of doing so is too high. Therefore, this paper shows how to leverage textual information to improve decision-making in the business process lifecycle. To do so, text mining is used to analyze questionnaires and interviews.
Engineering large vascularized adipose tissue constructs is still a challenge for the treatment of extensive high-grade burns or the replacement of tissue after tumor removal. Communication between mature adipocytes and endothelial cells is important for homeostasis and the maintenance of adipose tissue mass but has, to date, been largely neglected in tissue engineering strategies. Thus, new coculture strategies are needed to integrate adipocytes and endothelial cells successfully into a functional construct. This review focuses on the cross-talk between mature adipocytes and endothelial cells and considers their influence on fatty acid metabolism and vascular tone. In addition, the properties of and challenges with these two cell types with regard to vascularized tissue engineering are highlighted.
Social and environmental risk management in supply chains : a survey in the clothing industry
(2015)
Almost daily, news reports indicate environmental and social problems in globally fragmented supply chains. Even though conceptualisations of sustainable supply chain management suggest that supplier-related risk management for sustainable products and processes is essential for companies, research on how risk management for environmental and social issues in supply chains is performed has so far been neglected. This study aims at analysing both why companies in the clothing industry perform management of social and environmental risks in their supply chains and what kind of action they take. Based on the literature on sustainable supply chain management and supply chain risk management, as well as 10 expert interviews, a conceptual model of risk management in sustainable supply chains was developed and tested in an empirical study in the clothing industry. The data were analysed by structural equation modelling. The results show high statistical significance for the conceptual model. The main driver for performing risk management in environmental and social affairs is pressure and incentives from stakeholders. While a company's corporate orientation mainly drives social actions, top management drives environmental affairs to differentiate the company from competitors.
Delivering value to customers in real time requires companies to deploy software in real time, exposing features to users faster and shortening the feedback loop. This allows for faster reaction and helps to ensure that development is focused on features providing real value. Continuous delivery is a development practice in which software functionality is deployed continuously to the customer environment. Although this practice is established in some domains, such as B2C mobile software, the B2B domain imposes specific challenges. This article presents a case study conducted in a medium-sized software company operating in the B2B domain. The objective of this study is to analyze the challenges and benefits of continuous delivery in this domain. The results suggest that technical challenges are only one part of the challenges a company encounters in this transition; the company must also address challenges related to customers and procedures. The core challenges are caused by having multiple customers with diverse environments and unique properties, whose business depends on the software product. Some customers require manual acceptance testing, while others are reluctant to adopt new versions. By utilizing continuous delivery, the case company can shorten feedback cycles, increase the reliability of new versions, and reduce the resources required for deploying and testing new releases.
Software development as an experiment system : a qualitative survey on the state of the practice
(2015)
An experiment-driven approach to software product and service development is gaining increasing attention as a way to channel limited resources into the efficient creation of customer value. In this approach, software functionalities are developed incrementally and validated in continuous experiments with stakeholders such as customers and users. The experiments provide factual feedback for guiding subsequent development. Although case studies on experimentation in industry exist, the understanding of the state of the practice and the obstacles encountered is incomplete. This paper presents an interview-based qualitative survey exploring the experimentation experiences of ten software development companies. The study found that although the principles of continuous experimentation resonated with industry practitioners, the state of the practice is not yet mature. In particular, experimentation is rarely systematic and continuous. Key challenges relate to changing organizational culture, accelerating development cycle speed, and measuring customer value and product success.
For years, agile methods have been considered the most promising route toward successful software development, and a considerable number of published studies report on the (successful) use of agile methods and the benefits companies gain from adopting them. Yet, since the world is not black and white, the question arises of what has happened to the traditional models. Are traditional models being replaced by agile methods? How is the transformation toward Agile managed and, moreover, where did it start? With this paper we close a gap in the literature by studying general process use over time to investigate how traditional and agile methods are used. Is there coexistence, or do agile methods accelerate the traditional processes' extinction? The findings of our literature study comprise two major results: First, studies and reliable numbers on general process model use are rare, i.e., we lack quantitative data on actual process use and thus often lack the ability to ground process-related research in practically relevant issues. Second, despite the assumed dominance of agile methods, our results clearly show that companies enact context-specific hybrid solutions in which traditional and agile development approaches are used in combination.
IOS 2.0 : new aspects on inter-organizational integration through enterprise 2.0 technologies
(2015)
This special theme of "Electronic Markets" focuses on research concerned with the use of social technologies and "2.0" principles in the interaction between organizations (i.e., with "inter-organizational systems (IOS) 2.0"). This theme falls within the larger space of Enterprise 2.0 research, but focuses in particular on inter-organizational use (between enterprises), not intra-organizational use (within a single enterprise). While there is great interest in practice regarding the use of 2.0 technologies to support inter-organizational communication, collaboration and interaction, information systems (IS) research has largely been oblivious to this important use of social technologies.
Excellence in IT is a key enabler for the digital transformation of enterprises. To realize the vision of digital enterprises, it is necessary to cope with changing business requirements and to align business and IT. In order to evaluate the contribution of enterprise architecture management (EAM) to these goals, our paper explores the impact of various factors on the perceived benefit of EAM in enterprises. Based on the literature, we build an empirical research model. It is tested with empirical data from European EAM experts using a structural equation modelling approach. It is shown that changing business requirements, IT-business alignment, the complexity of the information technology infrastructure, and the enterprise architecture knowledge of information technology employees are crucial impact factors on the perceived benefit of EAM in enterprises.
Today, fiber-reinforced plastics (FRP) are well established in manifold technical applications because they provide advantages such as low weight, high stiffness, high strength and chemical resistance. The broad range of production methods extends from cost-effective mass production to the manufacturing of ultra-lightweight composite parts.
Biological materials are also usually composite materials: higher plants or the bones of higher animals are hierarchically organized and are composed of only a few materials, such as lignin, cellulose, apatite and collagen. The large variety and the mechanical properties of natural tissues result primarily from an optimized fiber lay-up adapted to the mechanical requirements of the respective "installation circumstances".
Advanced lightweight technical solutions need strong materials and structurally optimized designs. In many industries, structural optimization by an appropriate fiber lay-up has become an important method to save further weight. Corresponding software tools help to optimize topology and shape (e.g. Mattheck: CAO/SKO; Altair: OptiStruct), mainly using finite element analysis technology.
The combination of strong lightweight materials, optimized topology and sophisticated fiber lay-up is also present in many bio-mineralized planktonic shells, for instance diatoms and radiolaria, but also in glass sponges.
In the following, it is shown how the high weight-related mechanical properties of plankton are biomimetically transferred into ultra-lightweight technical structures.
Soft, mechanically compliant robots are developed to safely interact with a "human environment". The use of textiles and fibrous (composite) materials for the fabrication of robots opens up new possibilities for "softness/compliance" and safety in human-robot interaction. Besides external motion monitoring systems, textiles allow on-board monitoring and early prediction, or detection, of robot-human contact. The use of soft fibers and textiles for robot skins can increase the acceptance of robots in human surroundings. Novel topology optimization tools, materials, processing technologies and biomimetic engineering allow the development of ultra-lightweight, multifunctional, and adaptive structures.
This book presents emerging trends in the evolution of service-oriented and enterprise architectures. New architectures and methods of both business and IT are integrating services to support mobility systems, the internet of things, ubiquitous computing, collaborative and adaptive business processes, big data, and cloud ecosystems. They inspire current and future digital strategies and create new opportunities for the digital transformation toward next-generation digital products and services. Service-Oriented Architectures (SOA) and Enterprise Architectures (EA) have emerged as useful frameworks for developing interoperable, large-scale systems, typically implementing various standards such as web services, REST, and microservices. Managing the adaptation and evolution of such systems presents a great challenge. Service-Oriented Architecture enables flexibility through loose coupling, both between the services themselves and between the IT organizations that manage them. Enterprises evolve continuously by transforming and extending their services, processes and information systems. Enterprise Architectures provide a holistic blueprint that helps define the structure and operation of an organization, with the goal of determining how an organization can most effectively achieve its objectives. The book proposes several approaches to address the challenges of the service-oriented evolution of digital enterprise and software architectures.
The evolution of Service-Oriented Architectures (SOA) presents many challenges due to their complex, dynamic and heterogeneous nature. We describe how SOA design principles can facilitate SOA evolvability and examine several approaches to support SOA evolution. SOA evolution approaches can be classified based on the level of granularity they address, namely the service code level, the service interaction level and the model level. We also discuss emerging trends, such as microservices and knowledge-based support, which can enhance the evolution of future SOA systems.
The digital transformation of our society changes the way we live, work, learn, communicate, and collaborate. The digitization of software-intensive products and services is enabled essentially by four megatrends: cloud computing, big data, mobile systems, and social technologies. This disruptive change interacts with all information processes and systems that are important business enablers for the current digital transformation. The internet of things, social collaboration systems for adaptive case management, and mobility systems and services for big data in cloud service environments are emerging to support intelligent user-centered and social community systems. Modern enterprises see themselves confronted with an ever-growing design space for engineering the business models of the future as well as their respective IT support. Decision analytics in this field becomes increasingly complex, and decision support, particularly for the development and evolution of sustainable enterprise architectures (EA), is urgently needed. With the advent of intelligent user-centered and social community systems, these challenging decision processes can be supported in more flexible and intuitive ways. Tapping into these systems and techniques, the engineers and managers of the enterprise architecture become part of a viable enterprise, i.e. a resilient and continuously evolving system that develops innovative business models.
The Internet of Things (IoT), enterprise social networks, adaptive case management, mobility systems, analytics for big data, and cloud services environments are emerging to support smart connected products and services and the digital transformation. Biological metaphors of living and adaptable ecosystems with service-oriented enterprise architectures provide the foundation for self-optimizing and resilient run-time environments for intelligent business services and related distributed information systems. We are investigating mechanisms for flexible adaptation and evolution of the next generation of digital enterprise architecture systems in the context of the digital transformation. Our aim is to support flexibility and agile transformation for both business and related enterprise systems through the adaptation and dynamic evolution of digital enterprise architectures. The present research paper investigates mechanisms for decision case management in the context of multi-perspective explorations of enterprise services and Internet of Things architectures by extending original enterprise architecture reference models with state-of-the-art elements for architectural engineering for digitization and architectural decision support.
An enormous amount of data in the context of business processes is stored as images. These images contain valuable information for business process management. Up to now, this data has had to be integrated manually into the business process. Advances in image capturing make it possible to extract information from an increasing number of images. Therefore, we systematically investigate the potential of Image Mining for business process management through a literature review and an in-depth analysis of the business process lifecycle. As a first step to evaluate our research, we developed a prototype for recovering process model information from drawings using RapidMiner.
Preface of IDEA 2015
(2016)
Digitization is more than using digital technologies to transfer data and perform computations and tasks. Digitization embraces the disruptive effects of digital technologies on economy and society. To capture these effects, two perspectives are introduced: the product perspective and the value-creation perspective. In the product perspective, digitization enables the transition from material, static products to interactive and configurable services. In the value-creation perspective, digitization facilitates the transition from centralized, isolated models of value creation to bidirectional, co-creation-oriented approaches of value creation.
The Internet of Things, enterprise social networks, adaptive case management, mobility systems, analytics for big data, and cloud services environments are emerging to support smart connected products and services and the digital transformation. Biological metaphors of living and adaptable ecosystems provide the logical foundation for self-optimizing and resilient run-time environments for intelligent business services and related distributed information systems with service-oriented enterprise architectures. We are investigating mechanisms for flexible adaptation and evolution of the next generation of digital enterprise architecture systems in the context of the digital transformation. Our aim is to support flexibility and agile transformation for both business and related enterprise systems through the adaptation and dynamic evolution of digital enterprise architectures. The present research paper investigates digital transformations of business and IT and integrates fundamental mappings between adaptable digital enterprise architectures and service-oriented information systems. We put a spotlight on an example domain: the Internet of Things.
Despite 30 years of Electronic Design Automation, analog IC layouts are still handcrafted in a laborious fashion today due to the complex challenge of considering all relevant design constraints. This paper presents Self-organized Wiring and Arrangement of Responsive Modules (SWARM), a novel approach addressing the problem with a multi-agent system: autonomous layout modules interact with each other to evoke the emergence of overall compact arrangements that fit within a given layout zone. SWARM's unique advantage over conventional optimization-based and procedural approaches is its ability to consider crucial design constraints both explicitly and implicitly. Several examples show that, by inducing a synergistic flow of self-organization, remarkable layout results can emerge from SWARM's decentralized decision-making model.
New drugs serving unmet medical needs are one of the key value drivers of research-based pharmaceutical companies. The efficiency of research and development (R&D), defined as the successful approval and launch of new medicines (output) relative to the monetary investments required for R&D (input), has been declining for decades. We aimed to identify, analyze and describe the factors that impact R&D efficiency. Based on publicly available information, we reviewed the R&D models of major research-based pharmaceutical companies and analyzed the key challenges and success factors of a sustainable R&D output. We calculated that the R&D efficiencies of major research-based pharmaceutical companies were in the range of USD 3.2–32.3 billion (2006–2014). As these numbers challenge the model of an innovation-driven pharmaceutical industry, we analyzed the concepts that companies are following to increase their R&D efficiency: (A) activities to reduce portfolio and project risk, (B) activities to reduce R&D costs, and (C) activities to increase the innovation potential. While category A comprises measures such as portfolio management and licensing, measures grouped in category B are outsourcing and risk-sharing in late-stage development. Companies have taken diverse steps to increase their innovation potential, and open innovation, exemplified by open source, innovation centers, or crowdsourcing, plays a key role in doing so. In conclusion, research-based pharmaceutical companies need to be aware of the key factors which impact the rate of innovation, R&D cost and probability of success. Depending on their company strategy and their R&D set-up, they can opt for one of the following open innovator roles: knowledge creator, knowledge integrator or knowledge leverager.
Current techniques for chromosome analysis need to be improved for the rapid, economical identification of complex chromosomal defects by sensitive and selective visualisation. In this paper, we present a straightforward method for characterising unstained human metaphase chromosomes. Backscatter imaging in a dark-field setup combined with visible and short near-infrared spectroscopy is used to monitor morphological differences in the distribution of the chromosomal fine structure in human metaphase chromosomes. The reasons for the scattering centres in the fine structure are explained. Changes in the scattering centres during preparation of the metaphases are discussed. FDTD simulations are presented to substantiate the experimental findings. We show that local scattering features consisting of underlying spectral modulations of higher frequencies associated with a high variety of densely packed chromatin can be represented by their scatter profiles even on a sub-microscopic level. The result is independent of the chromosome preparation and structure size. This analytical method constitutes a rapid, cost-effective and label-free cytogenetic technique which can be used in a standard light microscope.
The amount of image data has been rising exponentially over the last decades due to numerous trends such as social networks, smartphones, automotive applications, biology, medicine and robotics. Traditionally, file systems are used as storage. Although they are easy to use and can handle large data volumes, they are suboptimal for efficient sequential image processing because their data organisation is limited to single images. Database systems, and especially column stores, support more structured storage and access methods at the raw data level for entire image series.
In this paper we propose definitions of various layouts for the efficient storage of raw image data and metadata in a column store. These schemes are designed to improve the runtime behaviour of image processing operations. We present a tool called the column-store Image Processing Toolbox (cIPT), which allows data layouts and operations to be easily combined for different image processing scenarios.
The experimental evaluation of a classification task on a real-world image dataset indicates a performance increase of up to 15x on a column store compared to a traditional row store (PostgreSQL), while space consumption is reduced by a factor of 7. With these results, cIPT provides the basis for a future mature database feature.
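The layout idea behind a column store for image series can be sketched in a few lines. The following toy example is our own simplification (the names and the pure-Python "tables" are not cIPT's actual schema): a row layout keeps one record per image, whereas a column layout keeps all values of one pixel position for the whole series together, so a series-wide per-pixel operation becomes one contiguous scan per column.

```python
import random

# Toy illustration of row-wise vs. column-wise storage for an image
# series (our own simplification, not cIPT's actual schema).

def make_series(n_images=50, n_pixels=64, seed=0):
    """Generate a series of flattened grayscale 'images' (lists of ints)."""
    rng = random.Random(seed)
    return [[rng.randrange(256) for _ in range(n_pixels)]
            for _ in range(n_images)]

def to_columnar(images):
    # Column layout: columns[p][i] = value of pixel position p in image i.
    return [list(col) for col in zip(*images)]

def mean_per_pixel_rowwise(images):
    # Row layout: every record must be visited for each pixel position.
    n = len(images)
    return [sum(img[p] for img in images) / n for p in range(len(images[0]))]

def mean_per_pixel_columnar(columns):
    # Column layout: one contiguous scan per pixel column.
    return [sum(col) / len(col) for col in columns]

images = make_series()
columns = to_columnar(images)
```

Both layouts yield identical results; the difference lies purely in access locality, which is where the reported speedups for sequential image processing come from.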
Virtual prototyping of integrated mixed-signal smart sensor systems requires high-performance co-simulation of analog frontend circuitry with complex digital controller hardware and embedded real-time software. We use SystemC/TLM 2.0 in conjunction with a cycle-count accurate temporal decoupling approach (TD) to simulate digital components and firmware code execution at high speed while preserving clock-cycle accuracy and, thus, real-time behavior at time quantum boundaries. Optimal time quanta ensuring real-time capability can be calculated and set automatically during simulation if the simulation engine has access to exact timing information about upcoming inter-process communication events. These methods fail in the case of non-deterministic, asynchronous events, resulting in potentially invalid simulation results. In this paper, we propose an extension to the case of asynchronous events generated by black-box sources from which a priori event timing information is not available, such as coupled analog simulators or hardware in the loop. Additional event processing latency or rollback effort caused by temporal decoupling is minimized by calculating optimal time quanta dynamically in a SystemC model using a linear prediction scheme. We analyze the theoretical performance of the presented predictive temporal decoupling approach (PTD) by deriving a cost model that expresses the expected simulation effort in terms of key parameters such as time quantum size and CPU time per simulation cycle. For an exemplary smart-sensor system model, we show that quasi-periodic events that trigger activities in TD processes are handled accurately after the predictor has settled.
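The prediction step can be illustrated with a deliberately simplified sketch. This is our own construction, not the paper's SystemC implementation: estimate the gap to the next asynchronous event from the recent event history and use that estimate as the next temporal-decoupling quantum; for a quasi-periodic source the estimate settles on the period.

```python
# Simplified quantum prediction (our own construction, not the paper's
# SystemC/TLM code): predict the gap to the next asynchronous event as
# the average of the last k inter-event intervals.

def predict_next_quantum(event_times, k=4, default_quantum=10.0):
    """Return a predicted time quantum from past event timestamps."""
    if len(event_times) < 2:
        return default_quantum          # no usable history yet
    recent = event_times[-(k + 1):]     # k intervals need k+1 timestamps
    gaps = [b - a for a, b in zip(recent, recent[1:])]
    return sum(gaps) / len(gaps)

# A quasi-periodic black-box source with period ~5 time units:
times = [0.0, 5.1, 9.9, 15.0, 20.1]
quantum = predict_next_quantum(times)
```

In a real TD setup, the simulator would run the decoupled process for `quantum` time units before synchronizing, accepting a rollback or extra latency only when the prediction misses.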
Based on well-established robotic concepts of autonomous localization and navigation, we present a system prototype to assist camera-based indoor navigation for human use, implemented in the Robot Operating System (ROS). Our prototype takes advantage of state-of-the-art computer vision and robotic methods. Our system is designed for assistive indoor guidance. We employ a vibro-tactile belt as a guiding device to render derived motion suggestions to the user via vibration patterns. We evaluated the effectiveness of a variety of vibro-tactile feedback patterns for the guidance of blindfolded users. Our prototype demonstrates that a vision-based system can support human navigation, and may also assist the visually impaired in a human-centered way.
Analysis is an important part of the enterprise architecture management process. Prior to decisions regarding the transformation of the enterprise architecture, the current situation and the outcomes of alternative action plans have to be analysed. Many analysis approaches have been proposed by researchers, and current enterprise architecture management tools implement analysis functionalities. However, little work has been done on structuring and classifying enterprise architecture analysis approaches. This paper collects and extends existing classification schemes, presenting a framework for the classification of enterprise architecture analysis. For evaluation, a collection of enterprise architecture analysis approaches has been classified based on this framework. As a result, the description of these approaches has been assessed, a common set of important categories for enterprise architecture analysis classification has been derived, and suggestions for further development are drawn.
This paper provides an introduction to the topic of enterprise social networks (ESN) and illustrates possible applications, potentials, and challenges for future research. It outlines an analysis of research papers containing a literature overview in the field of ESN. Subsequently, single relevant research papers are analysed and further research potentials are derived therefrom. This yields seven promising areas for further research: (1) user behaviour; (2) effects of ESN usage; (3) management, leadership, and governance; (4) value assessment and success measurement; (5) cultural effects; (6) architecture and design of ESN; and (7) theories, research designs and methods. This paper characterises these areas and articulates further research directions.
The book provides suggestions on how to start using bionic optimization methods, including pseudo-code examples of each of the important approaches and outlines of how to improve them. The most efficient methods for accelerating the studies are discussed. These include the selection of size and generations of a study’s parameters, modification of these driving parameters, switching to gradient methods when approaching local maxima, and the use of parallel working hardware.
Bionic optimization means finding the best solution to a problem using methods found in nature. As evolutionary strategies and particle swarm optimization seem to be the most important methods for structural optimization, we primarily focus on them. Other methods such as neural nets or ant colonies are more suited to control or process studies, so their basic ideas are outlined in order to motivate readers to start using them.
A set of sample applications shows how bionic optimization works in practice. From academic studies on simple frames made of rods to earthquake-resistant buildings, readers follow the lessons learned, difficulties encountered and effective strategies for overcoming them. For the problem of tuned mass dampers, which play an important role in dynamic control, changing the goal and restrictions paves the way for multi-objective-optimization. As most structural designers today use commercial software such as FE-Codes or CAE systems with integrated simulation modules, ways of integrating bionic optimization into these software packages are outlined and examples of typical systems and typical optimization approaches are presented.
The closing section focuses on an overview and outlook on reliable and robust as well as on multi-objective optimization, including discussions of current and upcoming research topics in the field concerning a unified theory for handling stochastic design processes.
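To give a flavor of the pseudo-code examples the book announces, the following is a minimal particle swarm optimization sketch. It is our own toy example, not code from the book: a swarm of particles minimizes the sphere function f(x) = Σ xᵢ², each particle blending its momentum with attraction toward its personal best and the swarm-wide best position.

```python
import random

# Minimal particle swarm optimization (our own toy example, not from the
# book): minimize the sphere function f(x) = sum(x_i^2).

def pso(f, dim=2, n_particles=20, iters=200, seed=1,
        w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                   # personal best positions
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Inertia + cognitive pull (own best) + social pull (swarm best)
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, best_val = pso(lambda x: sum(v * v for v in x))
```

The inertia weight `w` and the acceleration coefficients `c1`, `c2` are the driving parameters whose selection and modification the book discusses; tightening or loosening them trades exploration against convergence speed.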
Motivation
(2016)
Since human beings started to work consciously with their environment, they have tried to improve the world they were living in. Early use of tools, increasing quality of these tools, use of new materials, fabrication of clay pots, and heat treatment of metals: all these were early steps of optimization. But even on lower levels of life than human beings or human society, we find optimization processes. The organization of a herd of buffalos to face their enemies, the coordinated strategies of these enemies to isolate some of the herd’s members, and the organization of bird swarms on their long flights to their winter quarters: all these social interactions are optimized strategies of long learning processes, most of them the result of a kind of collective intelligence acquired during long selection periods.