This paper studies option pricing based on a reverse engineering (RE) approach. We use artificial intelligence to numerically compute the prices of options. The data consist of more than 5000 call and put options from the German stock market. First, we find that option pricing under reverse engineering yields a smaller root mean square error relative to market prices. Second, we show that the reverse engineering model depends on its training data. In general, the novel idea of reverse engineering is a rewarding direction for future research: it circumvents the limitations of finance theory, among them the strong assumptions and numerical approximations of the Black–Scholes model.
Revenue management information systems are very important in the hospitality sector. Revenue decisions can be prepared better when they draw on information from different information systems and decision strategies. There is a lack of research on the usage of such systems in small and medium-sized hotels and on their architectural configurations. Our paper empirically shows the current state of development of revenue information systems. Furthermore, we define future developments and requirements to improve such systems and their architectural base.
On-chip metallization, especially in modern integrated BCD technologies, is often subject to high current densities and pronounced temperature cycles due to heat dissipation from power switches like LDMOS transistors. This paper continues the work on a sensor concept where small sense lines are embedded in the metallization layers above the active area of a switching LDMOS transistor. The sensors show a significant resistance change that correlates with the number of power cycles. Furthermore, influences of sense line layer, geometry and the dissipated energy are shown. In this paper, the focus lies on a more detailed analysis of the observed change in sense line resistance.
This study estimates the reproducibility of finding palpation points at three different anatomical landmarks of the human body (the xiphoid process and the two hip crests) to support a navigated ultrasound application. In six test subjects with different body mass indices, the three palpation points were each located five times by two examiners. The deviation from the target position was calculated and correlated with the fat thickness above each palpation point. The reproducibility of the measurements had a mean error of ≈13.5 mm ± 4 mm, which seems to be sufficient for the intended application field.
In this paper, an approach is introduced that shows how reinforcement learning can be used to achieve interoperability between heterogeneous Internet of Things (IoT) components. More specifically, we model an HTTP REST service as a Markov Decision Process and adapt Q-learning to the properties of REST so that an agent in the role of an HTTP REST client can learn the semantics of the service and, in particular, an optimal sequence of service calls to achieve an application-specific goal. With our approach, we want to open up and facilitate a discussion in the community, as we see the key to achieving interoperability in the IoT in the use of artificial intelligence techniques.
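As a rough illustration of the idea above (not the paper's actual implementation), the following Python sketch treats resource URLs as MDP states and HTTP calls as actions and learns a call sequence towards a goal state with tabular Q-learning; the service, its endpoints, and the reward values are hypothetical stand-ins, and a real client would issue the calls over HTTP instead of consulting a lookup table.

```python
import random
from collections import defaultdict

# Hypothetical REST service simulator: states are resource URLs, actions are HTTP calls.
TRANSITIONS = {
    ("/", "GET /orders"): "/orders",
    ("/orders", "POST /orders"): "/orders/new",
    ("/orders/new", "PUT /orders/new/confirm"): "/orders/confirmed",  # goal state
}
ACTIONS = ["GET /orders", "POST /orders", "PUT /orders/new/confirm"]
GOAL = "/orders/confirmed"

def step(state, action):
    """Apply an HTTP call; unknown calls leave the state unchanged and are penalised."""
    next_state = TRANSITIONS.get((state, action), state)
    reward = 10.0 if next_state == GOAL else -1.0
    return next_state, reward

q = defaultdict(float)                  # Q-table over (state, action) pairs
alpha, gamma, epsilon = 0.1, 0.9, 0.2   # learning rate, discount, exploration rate

for _ in range(2000):                   # training episodes
    state = "/"
    for _ in range(10):                 # bounded episode length
        if random.random() < epsilon:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: q[(state, a)])
        next_state, reward = step(state, action)
        best_next = max(q[(next_state, a)] for a in ACTIONS)
        q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
        state = next_state
        if state == GOAL:
            break

# Greedy rollout after training: the learned sequence of service calls towards the goal.
state = "/"
for _ in range(5):
    if state == GOAL:
        break
    action = max(ACTIONS, key=lambda a: q[(state, a)])
    print(state, "->", action)
    state, _ = step(state, action)
```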
This document presents an algorithm for a nonobtrusive recognition of Sleep/Wake states using signals derived from ECG, respiration, and body movement captured while lying in a bed. As a core mathematical base of system data analytics, multinomial logistic regression techniques were chosen. Derived parameters of the three signals are used as the input for the proposed method. The overall achieved accuracy rate is 84% for Wake/Sleep stages, with Cohen’s kappa value 0.46. The presented algorithm should support experts in analyzing sleep quality in more detail. The results confirm the potential of this method and disclose several ways for its improvement.
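A minimal sketch of the kind of classifier described above, assuming synthetic stand-in features (heart rate, respiration rate, movement count) rather than the study's actual derived parameters; it also reports accuracy and Cohen's kappa, the two metrics named in the abstract.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, cohen_kappa_score

# Synthetic epochs with three hypothetical features per epoch:
# [heart rate, respiration rate, movement count].
rng = np.random.default_rng(0)
n = 1000
wake = rng.normal([75, 16, 5], [8, 2, 2], size=(n, 3))   # hypothetical wake epochs
sleep = rng.normal([60, 13, 1], [6, 2, 1], size=(n, 3))  # hypothetical sleep epochs
X = np.vstack([wake, sleep])
y = np.array([1] * n + [0] * n)                           # 1 = Wake, 0 = Sleep

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)  # (multinomial) logistic regression

pred = clf.predict(X_test)
print("accuracy:", accuracy_score(y_test, pred))
print("Cohen's kappa:", cohen_kappa_score(y_test, pred))
```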
Most antimicrobial peptides (AMPs) and their synthetic mimics (SMAMPs) are thought to act by permeabilizing cell membranes. For antimicrobial therapy, selectivity for pathogens over mammalian cells is a key requirement. Understanding membrane selectivity is thus essential for designing AMPs and SMAMPs to complement classical antibiotics in the future. This study focuses on membrane permeabilization induced by SMAMPs and their selectivity for membranes with different lipid compositions. We measure release and fluorescence lifetime of a self-quenching dye in lipid vesicles. Apart from the dose-response, we quantify the strength of individual leakage events, and, employing cumulative kinetics, categorize permeabilization behavior. We propose that differing selectivities in a series of SMAMPs arise from a combination of the effect of the antimicrobial agent and the susceptibility of the membrane (with a given lipid composition) for certain types of leakage behavior. The unselective and hemolytic SMAMP is found to act mainly by the asymmetry stress mechanism, mediated by hydrophobic insertion of SMAMPs into lipid layers. The more selective SMAMPs induced leakage events occurring stochastically over several hours. Lipid intrinsic properties might additionally amplify the efficiency of leakage events. Leakage behavior changes with both the design of the SMAMP and the lipid composition of the membrane. Understanding how leakage behavior contributes to the selectivity and activity of antimicrobial agents will aid the design and screening of antimicrobials. An understanding of the underlying processes facilitates the comparison of membrane permeabilization across in vitro and in vivo assays.
Vitamin E (VitE) additives are important in treating osteoarthritis, including cartilage regeneration, due to their antioxidant and anti-inflammatory properties. The present study focuses on the ability of the biological antioxidant VitE (alpha-tocopherol isoform) to reduce or minimize oxidative degradation of soft implantable polyurethane (PU) elastomers over extended periods of time (5 months) in vitro. The effect of the oxidative storage media on the morphology of the segmented PUs was evaluated via mechanical softening and the crystallization and melting behavior of both soft and hard segments (SS, HS) using dynamic mechanical analysis (DMA). Bulk mechanical properties of the potential implant materials during ageing were predicted from comprehensive mechanical testing of the biomaterials under cyclic tension and compression loads. The 5-month in vitro data suggest that the prepared siloxane-poly(carbonate urethane) formulations have sufficient resistance against degradation to be suitable materials for long-term bio-stable chondral implants. Most importantly, the positive effect of incorporating VitE (0.5 or 1.0% w/w) as bio-antioxidant and lubricant on bio-stability was observed for all PU types. The VitE additives protected the surface layer from erosion and cracking during chemical oxidation in vitro as well as from thermal oxidation during extrusion re-processing.
Context: Companies in highly dynamic markets increasingly struggle with their ability to plan product development and to create reliable roadmaps. A main reason is the decreasing predictability of markets, technologies, and customer behaviors. New approaches to product roadmapping seem necessary in order to cope with today's highly dynamic conditions. Little research is available with respect to such new approaches. Objective: In order to better understand the state of the art and to identify research gaps, this article presents a review of the scientific literature on product roadmapping. Method: We performed a systematic literature review (SLR) to identify relevant papers in the field of computer science. Results: After filtering, the search resulted in a set of 23 relevant papers. The identified papers focus on different aspects such as roadmap types, processes for creating and updating roadmaps, problems and challenges with roadmapping, approaches to visualizing roadmaps, generic frameworks, and specific aspects such as the combination of roadmaps with business modeling. Overall, the scientific literature covers many important aspects of roadmapping but provides only little knowledge on how to create product roadmaps under highly dynamic conditions. Research gaps concern, for instance, the inclusion of goals or outcomes in product roadmaps, the alignment of a roadmap with a product vision, and the inclusion of product discovery activities in product roadmaps. In addition, the transformation from traditional roadmapping processes to new ways of roadmapping is not sufficiently addressed in the scientific literature.
Digital technologies are moving into physical products. Smart cars, connected lightbulbs and data-generating tennis rackets are examples of previously “pure” physical products that turned into “digitized products”. Digitizing products offers many use cases for consumers that will hopefully persuade them to buy these products. Yet, as revenues from selling digitized products will remain small in the near future, digitized product manufacturers have to look for other sources of benefits. Producer-side use cases describe how manufacturers can benefit internally from the digitized products they produce. Our article identifies three categories of such use cases: product-, service-, and process-related ones.
Additive manufacturing (AM) is a promising manufacturing method for many industrial sectors. For industrial application, requirements such as high production volumes and coordinated implementation must be taken into account. These tasks of internally managing production facilities are carried out by Production Planning and Control (PPC) information systems. A key factor in planning and scheduling is the exact calculation of manufacturing times. For this purpose, we investigate the use of Machine Learning (ML) for predicting the manufacturing times of AM facilities.
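A minimal sketch of such an ML-based prediction, using synthetic data and hypothetical job features (part volume, build height, layer thickness, number of parts); the study's actual features, data, and model are not reproduced here.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# Synthetic AM jobs with hypothetical features; the "true" build time below is an
# assumed relationship used only to generate example data.
rng = np.random.default_rng(1)
n = 500
volume = rng.uniform(10, 500, n)         # part volume [cm^3]
height = rng.uniform(5, 200, n)          # build height [mm]
layer = rng.choice([0.05, 0.1, 0.2], n)  # layer thickness [mm]
parts = rng.integers(1, 20, n)           # number of parts per build job
build_time = height / layer * 0.5 + volume * 2.0 + parts * 5.0 + rng.normal(0, 30, n)

X = np.column_stack([volume, height, layer, parts])
X_train, X_test, y_train, y_test = train_test_split(X, build_time, test_size=0.25, random_state=1)

# Regression model predicting manufacturing time from job features.
model = GradientBoostingRegressor().fit(X_train, y_train)
print("MAE [min]:", mean_absolute_error(y_test, model.predict(X_test)))
```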
Potentials of smart contracts-based disintermediation in additive manufacturing supply chains
(2019)
We investigate which potentials are created by using smart contracts for disintermediation in supply chains for additive manufacturing. Using a qualitative, critical realist research approach, we analyzed three case studies with companies active in additive manufacturing. Based on interviews with experts from these companies, we identified eight key requirements for disintermediation and four associated potentials of smart contracts-based disintermediation.
We report on the reflectance, transmittance and fluorescence spectra (λ = 200–1200 nm) of four types of chicken eggshells (white, brown, light green, dark green), measured in situ without pretreatment and after ablation of 20–100 μm of the outer shell regions. The color pigment protoporphyrin IX (PPIX) is embedded in the protein phase of all four shell types as highly fluorescent monomers, in the white and light green shells additionally as non-fluorescent dimers, and in the brown and dark green shells mainly as non-fluorescent poly-aggregates. The green shell colors are formed from an approximately equimolar mixture of PPIX and biliverdin. The axial distributions of protein and color pigments were evaluated from the combined reflectances of both the outer and inner shell surfaces, as well as from the transmittances. For the data generation we used the radiative transfer model in the random-walk and Kubelka-Munk approaches.
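For context, the standard Kubelka-Munk relation for an optically thick layer links the diffuse reflectance $R_\infty$ to the absorption and scattering coefficients $K$ and $S$; the specific model variants and parameters used in the paper are not reproduced here.

$$\frac{K}{S} = \frac{(1 - R_\infty)^2}{2\,R_\infty}$$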
Assistive environments are entering our homes faster than ever. However, there are still various barriers to be overcome. One of the crucial points is the personalization of offered services and the integration of assistive technologies into common objects and therefore into the regular daily routine. Recognition of sleep patterns for a preliminary sleep study is one of the health services that could be performed in an unobtrusive way. This article proposes a hardware system for the measurement of bio-vital signals necessary for an initial sleep study in a nonobtrusive way. The first results confirm the potential of measuring breathing and movement signals with the proposed system.
Nowadays, the demand for a MEMS development/design kit (MDK) is more in focus than ever before. In order to achieve high quality and cost effectiveness in the development process for automotive and consumer applications, an advanced design flow for MEMS (micro-electro-mechanical systems) elements is urgently required. In this paper, such a development methodology and flow for parasitic extraction of active semiconductor devices is presented. The methodology considers geometrical extraction, links the electrically active pn junctions to SPICE standard library models, and subsequently extracts the netlist. An example for a typical pressure sensor is presented and discussed. Finally, the results of the parasitic extraction are compared with fabricated devices in terms of accuracy and capability.
This study describes a non-contact measuring and parameter identification procedure designed to evaluate inhomogeneous stiffness and damping characteristics of the annular ligament in the physiological amplitude and frequency range without the application of large static external forces that can cause unnatural displacements of the stapes. To verify the procedure, measurements were first conducted on a steel beam. Then, measurements on an individual human cadaveric temporal bone sample were performed. The estimated results support the inhomogeneous stiffness and damping distribution of the annular ligament and are in a good agreement with the multiphoton microscopy results which show that the posterior-inferior corner of the stapes footplate is the stiffest region of the annular ligament. This method can potentially help to establish a correlation between stiffness and damping characteristics of the annular ligament and inertia properties of the stapes and, thus, help to reduce the number of independent parameters in the model-based hearing diagnosis.
Optofluidics
(2019)
This introduction into the multidisciplinary area of optofluidics offers the necessary foundations in photonics, polymer physics and process analytics to students, engineers and researchers to enter the field. All basic ingredients of a polymer-based platform as a foundation for quick and compact solutions for chemical, biological and medical sensing and manipulation are developed.
This paper generalizes the theory of policy uncertainty by drawing on the new literature on rational inattention. First, the model demonstrates that inattention depends on the signal variance and the policy parameter. Second, I discover a novel trade-off showing that a policy instrument mitigates attention. Third, the policy instrument is non-linear and reciprocal to both the size and the variance of the signal. The unifying theory creates new implications for economic theory and public policy alike.
This paper discusses the optimal control problem of increasing the energy efficiency of induction machines in dynamic operation, including the field-weakening regime. In an offline procedure, optimal current and flux trajectories are determined such that the copper losses are minimized during transient operation. These trajectories are useful for a subsequent online implementation.
A clinically useful system for individual continuous health data monitoring needs an architecture that takes into account all relevant medical and technical conditions. The requirements for a health app to support such a system are collected, and a vendor independent architecture is designed that allows the collection of vital data from arbitrary wearables using a smartphone. A prototypical implementation for the main scenario shows the feasibility of the approach.
In summary, we believe that current "sleep monitoring" consumer devices on the market must undergo a more robust validation process before being made available and distributed to the general public. This is especially noteworthy as there have been first reports in the literature that inaccurate feedback from such consumer devices can worry subjects and may even compromise the well-being of the user.
Background: Design patterns are supposed to improve various quality attributes of software systems. However, there is controversial quantitative evidence of this impact. Especially for younger paradigms such as service- and microservice-based systems, there is a lack of empirical studies.
Objective: In this study, we focused on the effect of four service-based patterns - namely process abstraction, service façade, decomposed capability, and event-driven messaging - on the evolvability of a system from the viewpoint of inexperienced developers.
Method: We conducted a controlled experiment with Bachelor students (N = 69). Two functionally equivalent versions of a service-based web shop - one with patterns (treatment group), one without (control group) - had to be changed and extended in three tasks. We measured evolvability by the effectiveness and efficiency of the participants in these tasks. Additionally, we compared both system versions with nine structural maintainability metrics for size, granularity, complexity, cohesion, and coupling.
Results: Both experiment groups were able to complete a similar number of tasks within the allowed 90 min. Median effectiveness was 1/3. Mean efficiency was 12% higher in the treatment group, but this difference was not statistically significant. Only for the third task, we found statistical support for accepting the alternative hypothesis that the pattern version led to higher efficiency. In the metric analysis, the pattern version had worse measurements for size and granularity while simultaneously having slightly better values for coupling metrics. Complexity and cohesion were not impacted.
Interpretation: For the experiment, our analysis suggests that the difference in efficiency is stronger with more experienced participants and increased from task to task. With respect to the metrics, the patterns introduce additional volume in the system, but also seem to decrease coupling in some areas.
Conclusions: Overall, there was no clear evidence for a decisive positive effect of using service-based patterns, neither for the student experiment nor for the metric analysis. This effect might only be visible in an experiment setting with higher initial effort to understand the system or with more experienced developers.
Digitalization of products and services commonly causes substantial changes in business models, operations, organization structures and IT infrastructures of enterprises. Motivated by experiences and observations from digitalization projects, the paper investigates the effects of digitalization on enterprise architectures (EA). EA models serve as representations of the business, information system and technical aspects of an enterprise to support management and development. By comparing EA models before and after digitalization, the paper analyzes the kinds of changes visible in the EA model. The most important finding is that newly created digitized products and the associated product and enterprise architectures are no longer properly integrated into the overall architecture and even exist in parallel. Thus, the focus of this work is on showing these parallel architectures and proposing derivations for better integration.
Urban platforms are essential for smart and sustainable city planning and operation. Today they are mostly designed to handle and connect large urban data sets from very different domains. Modelling and optimisation functionalities are usually not part of the cities' software infrastructure. However, they are considered crucial for developing transformation scenarios and for optimised smart city operation. This work discusses software architecture concepts for such urban platforms and presents case study results on building sector modelling, including urban data analysis and visualisation. Results from a case study in New York are presented to demonstrate the implementation status.
Novel design for a coreless printed circuit board transformer realizing high bandwidth and coupling
(2019)
Rogowski coils offer galvanic isolation and can measure alternating currents with a high bandwidth. Coreless printed circuit board (PCB) transformers have been used as an alternative to limit the additional stray inductance when a Rogowski coil cannot be attached to the circuit. A new PCB transformer layout is proposed to reduce cost, decrease additional stray inductance, increase the bandwidth of current measurements and simplify the integration into existing designs.
New approaches to respiratory assist: bioengineering an ambulatory, miniaturized bioartificial lung
(2019)
Although state-of-the-art treatments of respiratory failure have clearly made some progress in terms of survival in patients suffering from severe respiratory system disorders, such as acute respiratory distress syndrome (ARDS), they have failed to significantly improve the quality of life in patients with acute or chronic lung failure, including severe acute exacerbations of chronic obstructive pulmonary disease as well as ARDS. Limitations of standard treatment modalities, which largely rely on conventional mechanical ventilation, emphasize the urgent, unmet clinical need for developing novel (bio)artificial respiratory assist devices that provide extracorporeal gas exchange with a focus on direct extracorporeal CO2 removal from the blood. In this review, we discuss some of the novel concepts and critical prerequisites for such respiratory lung assist devices that can be used with an adequate safety profile in the intensive care setting as well as for long-term domiciliary therapy in patients with chronic ventilatory failure. Specifically, we describe some of the pivotal steps, such as device miniaturization, passivation of the blood-contacting surfaces by chemical surface modifications, or endothelial cell seeding, all of which are required for converting current lung assist devices into ambulatory lung assist devices for long-term use in critically ill patients. Finally, we also discuss some of the risks and challenges for the long-term use of ambulatory miniaturized bioartificial lungs.
Data analytics tasks on large datasets are computationally intensive and often demand the compute power of cluster environments. Yet, data cleansing, preparation, dataset characterization, and statistics or metrics computation are frequent steps. They are mostly performed ad hoc, in an explorative manner, and demand low response times, but such steps are I/O intensive and typically very slow due to low data locality and inadequate interfaces and abstractions along the stack. These shortcomings typically result in prohibitively expensive scans of the full dataset and transformations at interface boundaries.
In this paper, we examine R as analytical tool, managing large persistent datasets in Ceph, a wide-spread cluster file-system. We propose nativeNDP – a framework for Near Data Processing that pushes down primitive R tasks and executes them in-situ, directly within the storage device of a cluster-node. Across a range of data sizes, we show that nativeNDP is more than an order of magnitude faster than other pushdown alternatives.
In this tutorial we perform a cross-cut analysis of database storage management from the perspective of modern storage technologies. We argue that the design of modern DBMS and the architecture of modern storage technologies are not aligned with each other. Moreover, the majority of systems rely on a complex, multi-layer, compatibility-oriented storage stack. The result is needlessly suboptimal DBMS performance, inefficient utilization, or significant write amplification due to outdated abstractions and interfaces. We focus on the concept of native storage, that is, storage operated without intermediate abstraction layers over an open native storage interface and directly controlled by the DBMS.
In recent years, Indonesia has been confronted with an excessive generation of municipal solid waste (MSW), predominantly in the form of organic refuse. While moving towards integrated solid waste management (ISWM) is an important strategy used to control its generation, it is now recognized that economic approaches need to be promoted as well in order to tackle the problem concertedly. In this case study, empirical approaches are developed to understand how market instruments could be introduced into environmental services and how to apply a co-benefit approach within a green economy paradigm for Indonesia. We investigate the feasibility of introducing market instruments in Indonesia by applying local co-benefit initiatives adapted from German experiences in integrating market instruments into MSW management practices. Currently, co-benefit activities are undertaken in the Sukunan village (Yogyakarta) to promote waste composting using market incentives in the framework of community-based solid waste management (CBSWM). This scheme aims at reducing MSW generation at its source and mobilizing people to be involved in waste separation (organic and non-organic) at the household level. As a result, about 200,000 t of CO2 emissions could be reduced annually. By integrating market instruments into waste management practices, the results of our studies suggest that Indonesia could make positive changes to its environmental policy and regulation of MSW at local levels. The country's policymakers have played important roles in promoting the effectiveness of urban development with co-benefit approaches to facilitate its transition towards a green economy.
Enterprises are transforming their strategy, culture, processes, and information systems to enlarge their digitalization efforts or to strive for digital leadership. The digital transformation profoundly disrupts existing enterprises and economies. In recent times, many new business opportunities have appeared that use the potential of the Internet and related digital technologies: the Internet of Things, services computing, cloud computing, artificial intelligence, big data with analytics, mobile systems, collaboration networks, and cyber-physical systems. Digitization fosters the development of IT environments with many rather small and distributed structures, like the Internet of Things, microservices, or other micro-granular elements. Architecting micro-granular structures has a substantial impact on architecting digital services and products. The change from a closed-world modeling perspective to a more flexible open-world view of living software and system architectures defines the context for flexible and evolutionary software approaches, which are essential to enable the digital transformation. In this paper, we reveal multiple perspectives of digital enterprise architecture and decisions to effectively support value- and service-oriented software systems for intelligent digital services and products.
Collaborative apparel consumption is proposed as a more sustainable alternative to conventional consumption. The purpose of this study is to explore consumers' motives to participate in collaborative apparel consumption. Findings suggest that consumers' intention to participate in collaborative apparel consumption is mainly influenced by financial benefits, convenience and sustainability awareness.
The energy turnaround, digitalization and decreasing revenues force enterprises in the energy domain to develop new business models. Business models for renewable energy follow a different logic than business models for large-scale power plants. Following a design science research approach, we examined the business models of three enterprises in the energy domain in a first step. We identified that these business models result in complex ecosystems with multiple actors and difficult relationships between them. One cause is the fast-changing and complicated state regulation in Germany. In order to address the problem, we captured the requirements together with partners from the enterprises in a second phase. Furthermore, we developed the prototype Business Model Configurator (BMConfig) based on the e3Value ontology on the metamodelling platform ADOxx. We demonstrate the feasibility of our approach with the business model of an energy efficiency service based on smart meter data.
Companies are continuously changing their strategy, processes, and information systems to benefit from the digital transformation. Controlling the digital architecture and governance is a fundamental goal. Enterprise Governance, Risk and Compliance (GRC) systems are vital for managing the digital risks threatening modern enterprises from many different angles. The most significant constituent of GRC systems is the definition of controls that are implemented on different layers of a digital Enterprise Architecture (EA). As part of the compliance aspect of GRC, the effectiveness of these controls is assessed and reported to relevant management bodies within the enterprise. In this paper, we present a metamodel which links controls to the affected elements of a digital EA and provides a way of expressing associated assessment techniques and results. We complement the metamodel with an expository instantiation of a control compliance cockpit in an international insurance enterprise.
Due to the large interindividual variances and the poor optical accessibility of the ear, the specificity of today's hearing diagnostics with respect to a certain clinical picture and its quantitative assessment is severely restricted. Often only a yes-or-no decision is possible, which depends strongly on the subjective assessment of the ENT physician. A novel approach, in which objectively obtainable, non-invasive audiometric measurements are evaluated using a numerical middle ear model, makes it possible to make the hidden middle ear properties visible and quantifiable. The central topic of this paper is a novel parameter identification algorithm that combines inverse fuzzy arithmetic with an artificial neural network in order to achieve a coherent overall diagnostic picture in the comparison of model and measurement. Its usage is shown for a pathological pattern called malleus fixation, where the upper ligament of the malleus is pathologically stiffened.
Model-based hearing diagnosis based on wideband tympanometry measurements utilizing fuzzy arithmetic
(2019)
Today's audiometric methods for the diagnosis of middle ear disease are often based on a comparison of measurements with standard curves that represent the statistical range of normal hearing responses. Because of large inter-individual variances in the middle ear, especially in wideband tympanometry (WBT), specificity and quantitative evaluation are greatly restricted. A new model-based approach could transform today's predominantly qualitative hearing diagnostics into a quantitative and tailored, patient-specific diagnosis by evaluating WBT measurements with the aid of a middle-ear model. For this particular investigation, a finite element model of a human ear was used. It consisted of an acoustic ear canal and tympanic cavity model, a middle-ear model with detailed nonlinear models of the tympanic membrane and annular ligament, and a simplified inner-ear model. This model has made it possible to identify pathologies from measurements by analyzing the parameters through sensitivity studies and parameter clustering. Uncertainties due to lack of knowledge, subjectivity in numerical implementation, and model simplification are taken into account by the application of fuzzy arithmetic. The most confident parameter set can be determined by applying an inverse fuzzy method to the measurement data. The principle and the benefits of this model-based approach are illustrated by the example of a two-mass oscillator, and also by the simulation of the energy absorbance of an ear with malleus fixation, where the introduced parameter changes can be determined quantitatively through system identification.
An important shift in software delivery is the definition of a cloud service as an independently deployable unit following the microservices architectural style. Container virtualization facilitates development and deployment by ensuring independence from the runtime environment. Thus, cloud services are built as container-based systems, i.e. a set of containers that control the lifecycle of software and middleware components. However, using containers leads to a new paradigm for service development and operation: self-service environments enable software developers to deploy and operate container-based systems on their own (you build it, you run it). Following this approach, more and more operational aspects are transferred into the responsibility of software developers. In this work, we propose a concept for self-adaptive cloud services based on container virtualization in line with the microservices architectural style and present a model-based approach that assists software developers in building these services. Based on operational models specified by developers, the mechanisms required for self-adaptation are automatically generated. As a result, each container automatically adapts itself in a reactive, decentralized manner. We evaluate a prototype which leverages the emerging TOSCA standard to specify operational behavior in a portable manner.
Parallel applications are the computational backbone of major industry trends and grand challenges in science. Whereas these applications are typically constructed for dedicated High Performance Computing clusters and supercomputers, the cloud emerges as an attractive execution environment, which provides on-demand resource provisioning and a pay-per-use model. However, cloud environments require specific application properties that may restrict parallel application design. As a result, design trade-offs are required to simultaneously maximize parallel performance and benefit from cloud-specific characteristics.
In this paper, we present a novel approach to assess the cloud readiness of parallel applications based on the design decisions made. By discovering and understanding the implications of these parallel design decisions on an application's cloud readiness, our approach supports the migration of parallel applications to the cloud. We introduce an assessment procedure, its underlying meta model, and a corresponding instantiation to structure this multi-dimensional design space. For evaluation purposes, we present an extensive case study comprising three parallel applications and discuss their cloud readiness based on our approach.
To remain competitive in a fast-changing environment, many companies have started to migrate their legacy applications towards a Microservices architecture. Such extensive migration processes require careful planning and consideration of implications and challenges alike. In this regard, hands-on experiences from industry practice are still rare. To fill this gap in the scientific literature, we contribute a qualitative study on intentions, strategies, and challenges in the context of migrations to Microservices. We investigated the migration process of 14 systems across different domains and sizes by conducting 16 in-depth interviews with software professionals from 10 companies. Along with a summary of the most important findings, we present a separate discussion of each case. As primary migration drivers, maintainability and scalability were identified. Due to the high complexity of their legacy systems, most companies preferred a rewrite using current technologies over splitting up existing code bases. This was often caused by the absence of a suitable decomposition approach. As such, finding the right service cut was a major technical challenge, next to building the necessary expertise with new technologies. Organizational challenges were especially related to large, traditional companies that simultaneously established agile processes. Initiating a mindset change and ensuring smooth collaboration between teams were crucial for them. Future research on the evolution of software systems can particularly profit from the individual cases presented.
Microservices are a topic driven mainly by practitioners and academia is only starting to investigate them. Hence, there is no clear picture of the usage of Microservices in practice. In this paper, we contribute a qualitative study with insights into industry adoption and implementation of Microservices. Contrary to existing quantitative studies, we conducted interviews to gain a more in-depth understanding of the current state of practice. During 17 interviews with software professionals from 10 companies, we analyzed 14 service-based systems. The interviews focused on applied technologies, Microservices characteristics, and the perceived influence on software quality. We found that companies generally rely on well established technologies for service implementation, communication, and deployment. Most systems, however, did not exhibit a high degree of technological diversity as commonly expected with Microservices. Decentralization and product character were different for systems built for external customers. Applied DevOps practices and automation were still on a mediocre level and only very few companies strictly followed the you build it, you run it principle. The impact of Microservices on software quality was mainly rated as positive. While maintainability received the most positive mentions, some major issues were associated with security. We present a description of each case and summarize the most important findings of companies across different domains and sizes. Researchers may build upon our findings and take them into account when designing industry-focused methods.
Changing requirements and qualification profiles of employees, increasingly complex digital systems up to artificial intelligence, missing standards for the seamless embedding of existing resources, and unpredictable returns on investment are just a few examples of the challenges an SME faces in the age of digitalisation. In most cases, there is a lack of suitable tools and methods to support companies in the digital transformation of their value creation processes, but also a lack of training and learning materials. A European research project (BITTMAS - Business Transformation towards Digitalisation and Smart systems, ERASMUS+, 2016-1 DE02-KA202-003437) with international partners from science, associations and industry has addressed this issue and developed various methods and instruments to support SMEs. Within the scope of a literature search, 16 suitable digitalisation concepts for production and logistics were identified. Subsequently, a learning platform was created to provide the user with consolidated and structured specialist knowledge, comprising a literature database with multivariable sorting options by branches and digitalisation keywords, a video gallery with basic and advanced knowledge, and a glossary. The 16 identified concepts for transforming value-added processes in the context of digitalisation were transferred to the learning platform as online course modules, including test questions, using learning paths developed for coaching and training. A maturity model was developed and implemented in a self-assessment tool that identifies the potential of digitalisation in production and logistics in relation to the company's current technological digitalisation level. As a result, the user receives one or more of the 16 potential digitalisation concepts as suggestions, or the delta of the necessary, not yet available enabler technologies is presented as a spider diagram. For a successful implementation of the identified digitalisation concepts in production and logistics, a further tool was developed to identify supplementary requirements for all company divisions and stakeholders in relation to the "digital transformation" in the form of a self-evaluation. This paper presents the methods and tools developed, the accompanying learning materials, and the learning platform.
Contemporary public enterprises differ from their forebears. Today, they are more similar to private enterprises, receiving far more attention than previously, when privatization processes all over the world were in the spotlight. Furthermore, the broad research stream of entrepreneurship has so far neglected the consideration of public enterprises. To set a future research agenda, the author examines the dispersed literature using an integrative and organizing framework to identify major topics and research findings. This paper reviews articles that investigate the entrepreneurship in contemporary public enterprises. Despite the growing scholarly interest globally, this systematic literature review indicates there is no more than a loose connection between the literature streams of public entrepreneurship and corporate entrepreneurship. Specifically, the review shows that the multidimensional concept of entrepreneurial orientation has thus far been ignored, although autonomy plays a significant role in the literature review, namely in the context of the interference of the public owner. It also reveals other essential research gaps, such as the development of a modern theory of public enterprises. The linked research stream of public-sector corporate entrepreneurship offers a broad area of scholarly research and should encourage further investigation.
Learning to translate between real world and simulated 3D sensors while transferring task models
(2019)
Learning-based vision tasks are usually specialized to the sensor technology for which data has been labeled. The knowledge of a learned model is simply useless when it comes to data which differs from the data on which the model was initially trained, or if the model should be applied to a totally different imaging or sensor source. New labeled data has to be acquired on which a new model can be trained. Depending on the sensor, this can become even more complicated when the sensor data is more abstract and hard for humans to interpret and label. Enabling the reuse of models trained for a specific task across different sensors minimizes the data acquisition effort. Therefore, this work focuses on learning sensor models and translating between them, thus aiming for sensor interoperability. We show that even for the complex task of human pose estimation from 3D depth data recorded with different sensors, i.e. a simulated and a Kinect 2™ depth sensor, results can greatly improve by translating between sensor models without modifying the original task model. This process especially benefits sensors and applications for which labels and models are difficult, if at all possible, to retrieve from raw sensor data.
Powered by e-commerce and vital to the manufacturing industry, intralogistics has become an increasingly important and labour-intensive process. In highly standardized, automation-friendly environments, such as the automotive sector, most of the efficiently automatable intralogistics tasks have already been automated. Due to the aging population in the EU and ergonomic regulations, the pressure to automate intralogistics tasks has become consistent also where product and process standardization is lower. That is the case of the material supply process for production lines or cells, where an increasing number of product variants and individually customized products, combined with the necessary ability to react to changes in market conditions, has led to smaller and more frequent replenishment of the points of use in the production plant and to the chaotic addition of production cells to the shop floor layout. This has led in turn to inevitable traffic growth with unforeseeable delays and an increased level of safety threats and accidents. In this paper, we use the structured approach of Quality Interaction Function Deployment to analyse the process of supplying assembly lines, seeking the most efficient combination of automation and manual labour that satisfies all stakeholders' requirements. Results are presented and discussed.
Theory predicts that market-timing activities bias Jensen's alpha (JA). However, empirical studies have failed to find consistent evidence of this bias. We tackle this puzzle in a nested model analysis and show that the bias contains an exogenous market component that is unrelated to market-timing skill. In a comprehensive empirical analysis of US mutual funds, we find that the timing-induced bias in JA is mainly driven by this market component, which is uncorrelated with measured timing activities. Measures of total performance that allow for timing activities are virtually identical to JA, even if timing activities are present in the evaluated fund. Hence, we conclude that JA is a sufficient measure of total performance.
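For reference, Jensen's alpha is the intercept of the regression of excess fund returns on excess market returns, and a common timing specification (Treynor-Mazuy) adds a quadratic market term; the abstract does not state which timing model the authors use, so the Treynor-Mazuy form is shown only as a typical example.

$$R_{p,t} - R_{f,t} = \alpha_p + \beta_p \left(R_{m,t} - R_{f,t}\right) + \varepsilon_{p,t}$$

$$R_{p,t} - R_{f,t} = \alpha_{TM} + \beta_p \left(R_{m,t} - R_{f,t}\right) + \gamma_p \left(R_{m,t} - R_{f,t}\right)^2 + \varepsilon_{p,t}$$

A positive $\gamma_p$ indicates successful market timing, which connects to the abstract's point that timing activities can bias the estimate of $\alpha_p$.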
Information technology (IT) plays an essential role in organizational innovation adoption. As such, IT governance (ITG) is paramount in accompanying IT to allow innovation. However, the traditional concept of ITG to control the formulation and implementation of IT strategy is not fully equipped to deal with the current changes occurring in the digital age. Today’s ITG needs an agile approach that can respond to changing dynamics. Consequently, companies are relying heavily on agile strategies to secure better company performance. This paper aims to clarify how organizations can implement agile ITG. To do so, this study conducted 56 qualitative interviews with professionals from the banking industry to identify agile dimensions within the governance construct. The qualitative evaluation uncovered 46 agile governance dimensions. Moreover, these dimensions were rated by 29 experts to identify the most effective ones. This led to the identification of six structure elements, eight processes, and eight relational mechanisms.
Melamine-formaldehyde (MF) resins are widely used as adhesives and finishing materials in the wood industry. During resin cure, either methylene ether or methylene bridges are formed, leading to the formation of a three-dimensional resin network. Not only the curing degree, but also the chemical species present in the cured resin determine the quality of the final product. Analytical methods allowing a detailed investigation of network formation are of great benefit to manufacturers. In the present work, the cure of an MF precondensate is studied at different temperatures (100–200 °C) without considering the initial pH as a factor. Isoconversional kinetic analysis based on exothermal curing enthalpies enables calculation of the crosslinking degree at a given time/temperature regime. A semiquantitative determination of the chemical groups present is performed based on solid-state nuclear magnetic resonance data. Fourier transform infrared spectroscopy has been shown to be a fast and reliable analytical tool with high sensitivity toward functional groups and with great potential for at-line process control.
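As an illustration of the method class mentioned above (the abstract does not specify which isoconversional method was applied), the differential Friedman isoconversional relation reads

$$\ln\!\left(\frac{d\alpha}{dt}\right)_{\alpha} = \ln\!\left[A_{\alpha} f(\alpha)\right] - \frac{E_{\alpha}}{R\,T_{\alpha,i}}$$

where $\alpha$ is the degree of conversion (here the degree of cure), $E_{\alpha}$ and $A_{\alpha}$ the apparent activation energy and pre-exponential factor at conversion $\alpha$, $f(\alpha)$ the reaction model, $R$ the gas constant, and $T_{\alpha,i}$ the temperature at which conversion $\alpha$ is reached in run $i$.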
We introduce IPA-IDX, an approach that handles index modifications on modern storage technologies (NVM, Flash) as physical in-place appends, using simplified physiological log records. IPA-IDX provides similar performance and longevity advantages for indexes as basic IPA [5] does for tables. The selective application of IPA-IDX and basic IPA to certain regions and objects lowers the GC overhead by over 60%, while keeping the total space overhead at 2%. The combined effect of IPA and IPA-IDX increases performance by 28%.
This study describes a non-contact measuring and system identification procedure for evaluating inhomogeneous stiffness and damping characteristics of the annular ligament in the physiological amplitude and frequency range without the application of large static external forces that can cause unnatural displacements of the stapes. To verify the procedure, measurements were first conducted on a steel beam. Then, measurements on an individual human cadaveric temporal bone sample were performed. The estimated results support the inhomogeneous stiffness and damping distribution of the annular ligament and are in a good agreement with the multiphoton microscopy results which show that the posterior-inferior corner of the stapes footplate is the stiffest region of the annular ligament.
The aim of this study was to predefine the pore structure of β-tricalcium phosphate (β-TCP) scaffolds with different macro-pore sizes (500, 750, and 1000 µm), to characterize the β-TCP scaffolds, and to investigate the growth behavior of cells within these scaffolds. The lead structures for directional bone growth (sacrificial structures) were produced from polylactide (PLA) using the fused deposition modeling technique. The molds were then filled with β-TCP slurry and sintered at 1250 °C, whereby the lead structures (voids) were burnt out. The scaffolds were characterized mechanically (native and after incubation in simulated body fluid (SBF) for 28 d). In addition, biocompatibility was investigated by live/dead, cell proliferation, and lactate dehydrogenase assays.