The aim of this article is to establish a stochastic search algorithm for neural networks based on the fractional stochastic processes {B^H_t, t ≥ 0} with Hurst parameter H ∈ (0,1). We define and discuss the properties of fractional stochastic processes {B^H_t, t ≥ 0}, which generalize standard Brownian motion. Fractional stochastic processes capture useful yet distinct properties for simulating real-world phenomena. This approach provides new insights into stochastic gradient descent (SGD) algorithms in machine learning. We exhibit convergence properties for fractional stochastic processes.
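For illustration, here is a minimal sketch of how such a fractional driver can be sampled, using the Cholesky factor of the exact fBm covariance; the grid, Hurst value and seed are assumptions for demonstration, not the article's algorithm.

```python
# Hypothetical sketch: sample a fractional Brownian motion path B^H on an
# equidistant grid via the Cholesky factor of its exact covariance
# cov(t, s) = 0.5 * (t^{2H} + s^{2H} - |t - s|^{2H}).
import numpy as np

def fbm_path(n_steps, hurst, t_max=1.0, seed=0):
    rng = np.random.default_rng(seed)
    # start the grid above 0 so the covariance matrix stays positive definite
    t = np.linspace(t_max / n_steps, t_max, n_steps)
    cov = 0.5 * (t[:, None] ** (2 * hurst) + t[None, :] ** (2 * hurst)
                 - np.abs(t[:, None] - t[None, :]) ** (2 * hurst))
    L = np.linalg.cholesky(cov)           # O(n^3), acceptable for a sketch
    return L @ rng.standard_normal(n_steps)

path = fbm_path(500, hurst=0.7)           # H > 1/2 gives persistent increments
print(path[:5])
```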
This article provides a stochastic agent-based model to exhibit the role of aggregation metrics in mitigating polarization in a complex society. Our sociophysics model is based on interacting and nonlinear Brownian agents, which allow us to study the emergence of collective opinions. The opinion of an agent, x_i(t), is a continuous positive value in the interval [0, 1]. We find that (i) most agent metrics display similar outcomes; (ii) the middle metric and the noisy metric yield new opinion dynamics, either towards assimilation or fragmentation; (iii) a newly developed 2-stage metric provides new insights about convergence and equilibria. In summary, our simulation demonstrates the power of institutions, which affect the emergence of collective behavior. Consequently, opinion formation in a decentralized complex society is reliant on individual information processing and the rules of collective behavior.
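As an illustration of the kind of dynamics described, the following sketch lets agents relax toward an aggregation metric of the population plus Brownian noise; the coupling strength, noise level and the choice of the median as the "middle" metric are illustrative assumptions, not the paper's calibrated model.

```python
# Hedged sketch of an interacting Brownian-agent opinion model: each agent
# drifts toward an aggregation metric of all opinions, perturbed by noise.
import numpy as np

def simulate(n_agents=200, n_steps=1000, coupling=0.05, noise=0.02,
             metric=np.median, seed=1):
    rng = np.random.default_rng(seed)
    x = rng.uniform(0.0, 1.0, n_agents)          # opinions x_i(t) in [0, 1]
    for _ in range(n_steps):
        target = metric(x)                        # institutional aggregate
        x += coupling * (target - x) + noise * rng.standard_normal(n_agents)
        np.clip(x, 0.0, 1.0, out=x)               # keep opinions in [0, 1]
    return x

final = simulate()
print(f"opinion spread after simulation: {final.std():.3f}")
```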
The paper “focuses on the critique of economic rationality” (p. 2). The author analyses the work of Amartya Sen with a somewhat interdisciplinary approach and concludes that Sen has greatly shifted our paradigm of economic rationality. The nexus of ethics and economics, as well as the two types of rationality (consistency versus optimization), are major contributions of Sen, according to the author. In a nutshell, Sen’s work continues to reconfigure economic rationality to this day.
The purpose of this paper is to study the impact of transparency on the political budget cycle (PBC) over time and across countries. So far, the literature on electoral cycles finds evidence that cycles depend on the stage of an economy. However, the author shows, for the first time, that the budget cycle depends on transparency. The author uses a new data set consisting of 99 developing and 34 Organisation for Economic Co-operation and Development (OECD) countries. First, the author develops a model and demonstrates that transparency mitigates political cycles. Second, the author confirms the proposition through an econometric assessment, using time series data from 1970 to 2014, and discovers smaller cycles in countries with higher transparency, especially G8 countries.
This article explores the determinants of people’s growth prospects in survey data, as well as the impact of the European recovery fund on future growth. The focus is on the aftermath of the Corona pandemic, which naturally limits the sample size. We use Eurobarometer survey data and macroeconomic variables such as GDP, unemployment, public deficit, inflation, bond yields, and fiscal spending data. We estimate a variety of panel regression models and develop a new simulation-regression methodology to address the limited sample size. We find that the major determinant of people’s growth prospects is domestic GDP per capita, while European fiscal aid does not matter significantly. In addition, the simulation-regression method yields novel scientific insights, significant outcomes, and a policy conclusion alike.
The aim of this work is to establish and generalize a relationship between fractional partial differential equations (fPDEs) and stochastic differential equations (SDEs) to a wider class of stochastic processes, including fractional Brownian motions and sub-fractional Brownian motions with Hurst parameter H ∈ (1/2,1). We start by establishing the connection between an fPDE and an SDE via the Feynman-Kac theorem, which provides a stochastic representation of a general Cauchy problem. Subsequently, we extend this connection to SDEs driven by fractional and sub-fractional Brownian motions and prove the generalized Feynman-Kac formulas under a (sub-)fractional Brownian motion. An application of the theorem demonstrates, as a by-product, the solution of a fractional integral, which has relevance in probability theory.
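For reference, the classical Feynman-Kac representation in the standard Brownian case (H = 1/2), which the article's theorems generalize, can be stated as follows.

```latex
% Classical Feynman--Kac representation (standard Brownian motion, H = 1/2)
\[
\partial_t u + \mu(x,t)\,\partial_x u + \tfrac{1}{2}\,\sigma^2(x,t)\,\partial_{xx} u = 0,
\qquad u(x,T) = \psi(x),
\]
\[
u(x,t) = \mathbb{E}\left[\psi(X_T) \,\middle|\, X_t = x\right],
\qquad \mathrm{d}X_s = \mu(X_s,s)\,\mathrm{d}s + \sigma(X_s,s)\,\mathrm{d}W_s .
\]
```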
This article studies the hidden blemishes of two benchmark rulings of the European Court of Justice (ECJ). In 2015 and 2018, the ECJ approved two unconventional monetary instruments, among others ‘Outright Monetary Transactions’ and the ‘Public Sector Purchase Program’. Yet, there is a vigorous debate about both monetary operations in law and economics. In this interdisciplinary article, we address legal and economic arguments in order to offer insights to the legal community. In particular, we elaborate on the legal implications of a variety of concerning issues, such as public policy interference, effects on wealth redistribution, erosion of democratic legitimacy, and lack of effectiveness of monetary policy. These topics remain disregarded in the ECJ rulings. Consequently, the verdicts do not appropriately identify the economic boundaries of the European Central Bank’s mandate.
The European Economic and Monetary Union (EMU) has been in turmoil for more than six years. The present governance rules do not seem to solve the problems either permanently or effectively, and there is no vision for the future of Europe in the 21st century. This article describes a realignment of economic governance which does not necessarily lead to a transfer union or political union, yet solves the current and future challenges. In fact, the redesign of the present rules is the most likely option today, both legally and economically. The key idea is the detachment from the compulsive notion of an ever closer union. However, this vision requires boldness towards greater flexibility, together with an exit clause or a state insolvency procedure for non-compliant member states.
Whither the German Council of Economic Experts? The past and future of public economic advice
(2014)
The article discusses the development and impact of the German Council of Economic Experts (GCEE). Firstly, the author studies the historical origins and the institutional setup of the GCEE. In a second step, an analysis of the impact of the German Council’s annual reports is given, along with an international comparison with other advisory boards. Finally, the paper discusses the current economic challenges and the need to modernize the GCEE in particular and political advisory boards in general.
This paper models the political budget cycle with stochastic differential equations and highlights the development of the budget cycle’s future volatility. In fact, I confirm the proposition of a less volatile budget cycle in the future. Moreover, I show that this trend is even amplified by higher transparency. These findings are new evidence in the literature on electoral cycles. I calibrate a rigorous stochastic model on public deficit-to-GDP data for several countries from 1970 to 2012.
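To make the modeling idea concrete, here is a hedged sketch of simulating a mean-reverting SDE with time-damped volatility by Euler-Maruyama; all parameter values and the exponential damping are assumptions for demonstration, not the paper's calibrated model.

```python
# Illustrative sketch: mean-reverting dynamics for a deficit-to-GDP ratio
# with volatility that shrinks over time, simulated by Euler-Maruyama.
import numpy as np

def euler_maruyama(x0=-2.0, mean=-2.5, kappa=0.8, sigma0=1.0,
                   damping=0.02, years=40, dt=0.25, seed=7):
    rng = np.random.default_rng(seed)
    n = int(years / dt)
    x = np.empty(n + 1)
    x[0] = x0
    for k in range(n):
        t = k * dt
        sigma_t = sigma0 * np.exp(-damping * t)   # volatility damped over time
        dw = rng.standard_normal() * np.sqrt(dt)  # Brownian increment
        x[k + 1] = x[k] + kappa * (mean - x[k]) * dt + sigma_t * dw
    return x

print(euler_maruyama()[-5:])   # last few simulated deficit-to-GDP values
```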
This article is a review of the book "Brain computation as hierarchical abstraction" by Dana H. Ballard, published by MIT Press in 2015. The book series Computational Neuroscience familiarizes the reader with the computational aspects of brain functions based on neuroscientific evidence. The book provides an excellent introduction to the functioning, i.e. the structure, the network and the routines, of the brain in our daily life. The final chapters even discuss behavioral elements such as decision-making, emotions and consciousness. These topics are of high relevance in other sciences such as economics and philosophy. Overall, Ballard’s book stimulates a scientifically well-founded debate and, more importantly, reveals the need for an interdisciplinary dialogue towards the social sciences.
A major lesson of the recent financial crisis is that money market freezes have major macroeconomic implications. This paper develops a tractable model in which we analyze the microeconomic and macroeconomic implications of a systemic banking crisis. In particular, we consider how the systemic crisis affects the optimal allocation of funding for businesses. We show that a central bank should reduce the interest rate to manage a systemic shock and hence smooth the macroeconomic consequences. Moreover, the analysis offers insight into the rationale of bank behavior and the role of markets in a systemic crisis. We find that the failure to adopt the optimal policy can lead to economic fragility.
This article studies the renewed interest surrounding sustainable public finance and the topic of tax evasion as well as the new theory of information inattention. Extending a model of tax evasion with the notion of inattention reveals novel findings about policy instruments that can be used to mitigate tax evasion. We show that the attention parameters regarding tax rates, financial penalty schemes and income levels are as important as the level of the detection probability and the financial penalty incurred. Thus, our theory recommends the enhancement of sustainability in public policy, particularly in tax policy. Consequently, the paper contributes both to the academic and public policy debate.
In various German cities, free-floating e-scooter sharing is an upcoming trend in e-mobility. Trends such as climate change, urbanization and demographic change, among others, are forcing society to develop new mobility solutions. In contrast to the more extensively studied car sharing, the usage patterns and behaviors of e-scooter sharing customers still need to be analyzed. Such analysis presumably enables better targeting of customers as well as adaptations of the business model to increase scooter utilization and therefore the profit of the e-scooter providers. The customer journey is digitally traceable from registration to scooter reservation and the ride itself. These data make it possible to identify customer needs and motivations. We analyzed a dataset from 2017 to 2019 of an e-scooter sharing provider operating in a big German city. Based on this dataset, we propose a customer clustering that identifies three different customer segments, enabling multiple conclusions to be drawn for business development and for improving the problem-solution fit of the e-scooter sharing model.
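A minimal sketch of the kind of customer clustering described, assuming per-customer usage features (rides per month, mean ride duration, weekend share) and k = 3 to mirror the three reported segments; the synthetic data and feature choice are illustrative, not the provider's dataset.

```python
# Hedged sketch: k-means segmentation of e-scooter customers on synthetic
# usage features standing in for the real trip data.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
features = np.column_stack([
    rng.poisson(8, 500),        # rides per month (assumed feature)
    rng.normal(12, 4, 500),     # mean ride duration in minutes (assumed)
    rng.uniform(0, 1, 500),     # share of weekend rides (assumed)
])
X = StandardScaler().fit_transform(features)          # comparable scales
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(np.bincount(labels))                            # segment sizes
```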
Small and medium-sized enterprises (SMEs) play a fundamental role in the economic system of the European Union: SMEs represent over 99 percent of all companies and provide two-thirds of the jobs in the private sector. Their innovativeness and economic success have significant influence on growth, jobs and prosperity in Europe.
Information technologies are regarded as key drivers of innovation in small and medium-sized enterprises (SMEs). Modern information technologies (IT) today offer SMEs many opportunities to improve their competitiveness and market position: business processes can be designed efficiently, new market segments can be opened up, and innovation capacity can be strengthened significantly. However, many SMEs still have difficulties in utilizing these new technologies efficiently in order to foster process and product innovation. This is partly due to the fact that many SMEs do not use IT service management and waste resources on running basic IT functions such as the maintenance of printers, software or servers.
Information Technology Service Management (ITSM) is a discipline for managing IT systems centred on the customer’s perspective of IT’s contribution to the business. Thus, by strengthening the performance of SMEs’ IT departments, ITSM enables process innovations (e.g. eProcurement) and promotes product innovations (e.g. client services). The EU-funded project "IT Service Management for small and medium-sized Enterprises of the Danube Region" (ITSM4SME) aims to make SMEs in the Danube Region aware of the potential of ITSM, to inspire SMEs about the use of information technology and to facilitate IT-enabled innovations. The aims of the project have been achieved inter alia through a simplified method for IT service management for small IT organisations, practical case studies, a "do-it-yourself" service management modelling tool, an eLearning portal and the training of more than 300 participants from SMEs in pilot training courses in Bulgaria, Romania and Slovenia.
The proposed approach applies current unsupervised clustering approaches in a different, dynamic manner. Instead of taking all the data as input and finding clusters among them, the given approach clusters Holter ECG data (long-term electrocardiography data from a Holter monitor) on a given interval, which enables a dynamic clustering approach (DCA). To this end, advanced clustering techniques based on the well-known Dynamic Time Warping algorithm are used. Given clusters on, e.g., a daily basis, they can be compared by defining cluster shape properties. Doing so yields a measure of variation in unsupervised cluster shapes and may reveal unknown changes in health. Embedding this approach into wearable devices offers advantages over current techniques. On the one hand, users get feedback if the characteristics of their ECG data change unexpectedly over time, which makes early detection possible. On the other hand, cluster properties such as the biggest or smallest cluster may help a doctor in making diagnoses or observing several patients. Furthermore, known processing techniques such as stress detection or arrhythmia classification may be applied to the identified clusters.
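For context, the Dynamic Time Warping distance underlying the clustering can be computed with the classic O(nm) dynamic program; the toy signals below are illustrative stand-ins for ECG segments.

```python
# Sketch of the Dynamic Time Warping distance between two 1-D sequences.
import numpy as np

def dtw_distance(a, b):
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of match / insertion / deletion
            cost[i, j] = d + min(cost[i - 1, j - 1],
                                 cost[i - 1, j],
                                 cost[i, j - 1])
    return float(cost[n, m])

t = np.linspace(0, 2 * np.pi, 100)
print(dtw_distance(np.sin(t), np.sin(t + 0.3)))   # small: shapes align well
```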
Global, competitive markets which are characterised by mass customisation and rapidly changing customer requirements force major changes in production styles and the configuration of manufacturing systems. As a result, factories may need to be regularly adapted and optimised to meet short-term requirements. One way to optimise the production process is the adaptation of the plant layout to the current or expected order situation. To determine whether a layout change is reasonable, a model of the current layout is needed: it is used to perform simulations, and in the case of a layout change it serves as a basis for the reconfiguration process. Keeping this model up to date requires capturing the real layout with a suitable measurement system. To aid the selection of possible measurement systems, a requirements analysis was conducted to identify the important parameters for creating a digital shadow of a plant layout. Based on these parameters, a method is proposed for defining limit values and specifying exclusion criteria. The paper thus contributes to the development and application of systems that enable an automatic synchronisation of the real layout with the digital layout.
In the past, plant layouts were regarded as highly static structures. With increasing internal and external factors causing turbulence in operations, it has become more necessary for companies to adapt to new conditions in order to maintain optimal performance. One possible way for such an adaptation is the adjustment of the plant layout by rearranging the individual facilities within the plant. Since the information about the plant layout is considered as master data and changes have a considerable impact on interconnected processes in production, it is essential that this data remains accurate and up-to-date. This paper presents a novel approach to create a digital shadow of the plant layout, which allows the actual state of the physical layout to be continuously represented in virtual space. To capture the spatial positions and orientations of the individual facilities, a pan-tilt-zoom camera in combination with fiducial markers is used. With the help of a prototypically implemented system, the real plant layout was captured and converted into different data formats for further use in exemplary external software systems. This enabled the automatic updating of the plant layout for simulation, analysis and routing tasks in a case study and showed the benefits of using the proposed system for layout capturing in terms of accuracy and effort reduction.
The market for indoor positioning systems for a variety of applications has grown strongly in recent years. A wide range of systems is available, varying considerably in terms of accuracy, price and technology used. The suitability of the systems is highly dependent on the intended application. This paper presents a concept to use a single low-cost PTZ camera in combination with fiducial markers for indoor position and orientation determination. The intended use case is to capture a plant layout consisting of position, orientation and unique identity of individual facilities. Important factors to consider for the selection of a camera have been identified and the transformation of the marker pose in camera coordinates into a selectable plant coordinate system is described. The concept is illustrated by an exemplary practical implementation and its results.
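A minimal sketch of the described coordinate transformation: a marker pose measured in camera coordinates is chained with an assumed camera-to-plant calibration transform. The calibration values below are placeholders; in practice they come from registering the camera to the plant coordinate system.

```python
# Sketch: map a fiducial-marker pose from camera coordinates into an
# (assumed) plant coordinate system via homogeneous 4x4 transforms.
import numpy as np

def to_homogeneous(R, t):
    T = np.eye(4)
    T[:3, :3] = R          # rotation part
    T[:3, 3] = t           # translation part
    return T

# assumed calibration: camera rotated 90 deg about z, mounted 2 m above origin
Rz = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
T_plant_cam = to_homogeneous(Rz, np.array([0.0, 0.0, 2.0]))

# marker pose as measured in camera coordinates (identity orientation here)
T_cam_marker = to_homogeneous(np.eye(3), np.array([1.5, 0.4, 3.0]))

T_plant_marker = T_plant_cam @ T_cam_marker      # chain the transforms
print(T_plant_marker[:3, 3])                     # marker position, plant frame
```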
Development of an indoor positioning system to create a digital shadow of production plant layouts
(2023)
The objective of this dissertation is to develop an indoor positioning system that allows the creation of a digital shadow of the plant layout in order to continuously represent the actual state of the physical layout in virtual space. To define the requirements for such a system, potential stakeholders who could benefit from a digital shadow in the context of the plant layout were analysed, and the requirements were derived from their perspective in order to generate added value for their work. As the core of an indoor positioning system is the sensory aspect of capturing the physical layout parameters, different potential technologies were compared and evaluated in terms of their suitability for this particular application. Derived from this analysis, the selected concept is based on the use of a pan-tilt-zoom (PTZ) camera in combination with fiducial markers. To determine specific camera parameters, a series of experiments was conducted, which was necessary to develop the measurement method as well as the mathematical calculation method and coordinate transformation for determining the poses (positions and angular orientations) of the respective facilities in the plant. In addition, an experimental validation was performed to ensure that the limit values for individual parameters determined in the requirements analysis can be met.
This paper introduces a highly scalable heteromodular origami art technique for constructing 3D framework structures using elementary struts and connectors folded from uncut sheets of standard A4 office paper. The presented technique, named ZEBRA, allows the design of meter-scale architectural objects, such as truss bridges and towers, which are capable of bearing substantial mechanical loads. Moving parts, ranging from simple levers to complete multi-bar linkages, can be integrated into static frameworks using a set of kinematic extensions. An overview is given of how the ZEBRA system can be used to teach university students various theoretical and practical aspects of the engineering sciences in an entertaining and hands-on way.
Process quality has reached a high level in mass production, utilizing well-known methods such as design of experiments (DoE). The drawback of the underlying statistical methods is the need for tests under real production conditions, which cause high costs due to lost output. Research over the last decade has led to methods for correcting a process by using in-situ data to adjust the process parameters, but a lot of pre-production is still necessary to get this working. This paper presents a new approach to improving product quality in process chains by using context data, which in part are gathered using Industry 4.0 devices, to reduce the necessary pre-production.
In countries such as Germany, where municipalities have planning sovereignty, problems of urban sprawl often arise. As the dynamics of land development have not substantially subsided over the last years, the national government decided to test the instrument of ‘Tradable Planning Permits’ (TPP) in a nationwide field experiment with 87 municipalities involved. The field experiment was able to implement the key features of a TPP system in a laboratory setting with approximated real socioeconomic and planning conditions. In a TPP system, allocated planning permits must be used by municipalities for developing land. The permits can be traded between local jurisdictions, so that they have flexibility in deciding how to comply with the regulation. In order to evaluate the performance of such a system, specific field data about future building areas and their impact on community budgets for the period 2014–2028 were collected. The field experiment comprised several sessions with representatives of the municipalities and with students. The participants were confronted with two (municipalities) and four (students) schemes. The results show that a trading system can curb land development in an effective and also efficient manner. However, depending on the regulatory framework, the trading schemes show different price developments and distributional effects. The inexperienced representatives of the local authorities can easily handle the permits both administratively and in the established market. A trading scheme sets very high incentives to save open space and to direct development activities to areas within existing planning boundaries. It is therefore a promising instrument for Germany and also other regions or countries with an established land-use planning system.
In times of e-commerce and digitalization, new markets are opening, young companies have the possibility to grow, and new perspectives arise in terms of customer relationships. Customers demand more options for personalization. At the same time, companies have access to new and, above all, more information about their customers. This relationship could evolve greatly were it not for privacy issues. Vast amounts of consumer data are collected in Big Data warehouses, to be analyzed via predictive analytics so that customers can be classified by algorithms such as clustering models, propensity models or collaborative filtering. All these subjects are growing in importance, as they are shaping the global marketing landscape. Marketers, together with IT scientists, are developing new ways of analyzing customer databases and benefit from more accurate segmentation methods than those used until now. The following paper provides a literature review on new methods of consumer segmentation in light of the high inflow of new information via e-commerce. It introduces readers to the subject of predictive analytics and discusses several predictive models. The paper is not based on original empirical research but is intended as a reference text for further research. A conclusion completes the paper.
The increasing share of renewable energy with volatile production results in higher variability of prices for electrical energy. Optimized operating schedules, e.g., for industrial units, can yield a considerable reduction of energy costs by shifting processes with high power consumption to times with low energy prices. We present a distributed control architecture for virtual power plants (VPPs) where VPP participants benefit from flexible adaptation of schedules to price forecasts while maintaining control of their operating schedule. An aggregator trades at the energy market on behalf of the participants and benefits from more detailed and reliable load profiles within the VPP.
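A minimal sketch of the price-driven schedule shifting described above: a shiftable process of fixed duration is placed at the cheapest feasible start hour of a day-ahead price forecast. Prices and duration are illustrative assumptions.

```python
# Sketch: choose the start hour minimizing total energy cost for a process
# of fixed duration, given an hourly day-ahead price forecast (EUR/MWh).
import numpy as np

def cheapest_start(prices, duration):
    # sliding-window sums give the total price for each feasible start hour
    window_costs = np.convolve(prices, np.ones(duration), mode="valid")
    return int(window_costs.argmin())

forecast = np.array([42, 40, 35, 30, 28, 27, 33, 45, 60, 70, 65, 55,
                     50, 48, 46, 47, 52, 66, 72, 68, 58, 50, 45, 43])
start = cheapest_start(forecast, duration=3)
print(f"run the shiftable process from hour {start} to {start + 3}")
```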
The demonstration project Virtual Power Plant Neckar-Alb is constructing a Virtual Power Plant (VPP) demonstration site at the Reutlingen University campus. The VPP demonstrator integrates a heterogeneous set of distributed energy resources (DERs), which are connected to a control infrastructure and an energy management system. This paper describes the components and the architecture of the demonstrator and presents strategies for demonstrating multiple optimization and control systems with different control paradigms.
In an exploratory study of the online communication of large and medium-sized B2B companies from the German state of Baden-Württemberg, the message content communicated via their websites and the websites' appeal for international prospects were analyzed. The study revealed that many basic content items were absent, making the sites less attractive for further exploration and making it difficult for international prospects to enter into a dialog, become leads, and possibly customers. The subsequent survey elicited organizational backgrounds, available resources, and objectives for online communication. It traced the deficiencies back to a lack of understanding of the importance of digital communication for lead generation and the customer journey in general, the absence of a communication strategy, a lack of urgency, and a lack of resources to implement desired changes and additions to communication content.
The character of knowledge-intensive processes is that participants decide on the next process activities on the basis of the available information and their expert knowledge. The decisions of these knowledge workers are in general non-deterministic. It is not possible to model these processes in advance and to automate them using the process engine of a BPM system. Hence, in this context a process instance is called a case, because there is no predefined model that could be instantiated. Domain-specific or general case management systems are used to support the knowledge workers. These systems provide all case information and enable users to define the next activities, but they have no or only limited activity recommendation capabilities. In the following paper, we present a general concept for a self-learning system based on process mining that suggests the next best activity for a given case based on quantitative and qualitative data. As a proof of concept, it was applied to the area of insurance claims settlement.
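A hedged sketch of a frequency-based next-activity recommender mined from an event log, standing in for the process-mining approach; the toy insurance log and the first-order (Markov) view are illustrative assumptions.

```python
# Sketch: mine activity transition frequencies from traces and recommend
# the most frequent follow-up activities for a given case state.
from collections import Counter, defaultdict

log = [
    ["register claim", "assess damage", "approve", "pay"],
    ["register claim", "assess damage", "reject"],
    ["register claim", "request documents", "assess damage", "approve", "pay"],
]

transitions = defaultdict(Counter)
for trace in log:
    for current, nxt in zip(trace, trace[1:]):
        transitions[current][nxt] += 1

def recommend(activity, k=2):
    """Return the k most frequent follow-up activities seen in the log."""
    return transitions[activity].most_common(k)

print(recommend("assess damage"))   # [('approve', 2), ('reject', 1)]
```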
In recent years, the parallel computing community has shown increasing interest in leveraging cloud resources for executing parallel applications. Clouds exhibit several fundamental features of economic value, like on-demand resource provisioning and a pay-per-use model. Additionally, several cloud providers offer their resources with significant discounts, albeit with limited availability. Such volatile resources are a promising opportunity to reduce the costs arising from computations, thus achieving higher cost efficiency. In this paper, we propose a cost model for quantifying the monetary costs of executing parallel applications in cloud environments leveraging volatile resources. Using this cost model, one is able to determine a configuration of a cloud-based parallel system that minimizes the total costs of executing an application.
In this paper, we deal with optimizing the monetary costs of executing parallel applications in cloud-based environments. Specifically, we investigate how the scalability characteristics of parallel applications impact the total costs of computations. We focus on a specific class of irregularly structured problems, where the scalability typically depends on the input data. Consequently, dynamic optimization methods are required for minimizing the costs of computation. For quantifying the total monetary costs of individual parallel computations, the paper presents a cost model that considers the costs for the parallel infrastructure employed as well as the costs caused by delayed results. We discuss a method for dynamically finding the number of processors for which the total costs based on our cost model are minimal. Our extensive experimental evaluation gives detailed insights into the performance characteristics of our approach.
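A sketch of such a cost model under stated assumptions: infrastructure cost grows with processor count and runtime, delayed results incur a penalty, and an Amdahl-style speedup law stands in for the application's measured scalability. All prices, the deadline, and the speedup law are illustrative, not the paper's model.

```python
# Sketch: total cost = infrastructure cost + delay penalty; scan processor
# counts for the minimum, mirroring the idea of cost-optimal configuration.
def total_cost(p, t_seq=3600.0, serial_frac=0.05,
               price_per_cpu_hour=0.09, deadline=900.0, penalty_per_hour=2.0):
    runtime = t_seq * (serial_frac + (1 - serial_frac) / p)   # seconds
    infra = price_per_cpu_hour * p * runtime / 3600.0         # pay-per-use
    delay_hours = max(0.0, runtime - deadline) / 3600.0       # late result
    return infra + penalty_per_hour * delay_hours

costs = {p: total_cost(p) for p in (1, 2, 4, 8, 16, 32, 64)}
best = min(costs, key=costs.get)
print(best, round(costs[best], 4))   # cost-minimal processor count
```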
In recent years, the cloud has become an attractive execution environment for parallel applications, which introduces novel opportunities for versatile optimizations. Particularly promising in this context is the elasticity characteristic of cloud environments. While elasticity is well established for client-server applications, it is a fundamentally new concept for parallel applications, and existing elasticity mechanisms for client-server applications can be applied to parallel applications only to a limited extent. Efficient exploitation of elasticity for parallel applications requires novel mechanisms that take into account the particular runtime characteristics and resource requirements of this application type. To tackle this issue, we propose an elasticity description language. This language enables users to define elasticity policies, which specify the elasticity behavior at both the cloud infrastructure level and the application level. Elasticity at the application level is supported by an adequate programming and execution model, as well as abstractions that comply with the dynamic availability of resources. We present the underlying concepts and mechanisms, as well as the architecture and a prototypical implementation. Furthermore, we illustrate the capabilities of our approach through real-world scenarios.
Computational breath analysis is a growing research area aiming at identifying volatile organic compounds (VOCs) in human breath to assist medical diagnostics of the next generation. While inexpensive and non-invasive bioanalytical technologies for metabolite detection in exhaled air and bacterial/fungal vapor exist and the first studies on the power of supervised machine learning methods for profiling of the resulting data were conducted, we lack methods to extract hidden data features emerging from confounding factors. Here, we present Carotta, a new cluster analysis framework dedicated to uncovering such hidden substructures by sophisticated unsupervised statistical learning methods. We study the power of transitivity clustering and hierarchical clustering to identify groups of VOCs with similar expression behavior over most patient breath samples and/or groups of patients with a similar VOC intensity pattern. This enables the discovery of dependencies between metabolites. On the one hand, this allows us to eliminate the effect of potential confounding factors hindering disease classification, such as smoking. On the other hand, we may also identify VOCs associated with disease subtypes or concomitant diseases. Carotta is an open source software with an intuitive graphical user interface promoting data handling, analysis and visualization. The back-end is designed to be modular, allowing for easy extensions with plugins in the future, such as new clustering methods and statistics. It does not require much prior knowledge or technical skills to operate. We demonstrate its power and applicability by means of one artificial dataset. We also apply Carotta exemplarily to a real-world example dataset on chronic obstructive pulmonary disease (COPD). While the artificial data are utilized as a proof of concept, we will demonstrate how Carotta finds candidate markers in our real dataset associated with confounders rather than the primary disease (COPD) and bronchial carcinoma (BC). Carotta is publicly available at http://carotta.compbio.sdu.dk.
This paper investigates whether food retailing mobile applications from Germany, Austria, the USA and the United Kingdom are meant to stay a marginal topic in grocery shopping, or whether they have the potential to significantly shape the future of grocery retailing by serving as competitive advantages that can fulfil customer requirements and satisfaction. It has filtered out success factors in the form of functions of grocery apps and has extracted key competencies that can be used to create customer value. The Kano model can help select the right app functions. But there are other prerequisites, such as customers’ general attitude towards technology and their acceptance of any kind of apps, that play an important role when looking at the big picture of apps in grocery retailing. Nevertheless, this paper has contributed one vital part of giving more importance to apps in grocery retailing in the form of app functions that clearly deliver customer value. In short, apps that fit customers’ needs and that provide usability and convenience clearly have the potential to shape the future of grocery retailing, if key barriers towards app use are eliminated and if incentives are given that overcome scepticism.
Intralogistics operations in automotive OEMs increasingly confront problems of overcomplexity caused by a customer-centred production that requires customisation and, thus, high product variability, short-notice changes in orders and the handling of an overwhelming number of parts. To alleviate the pressure on intralogistics without sacrificing performance objectives, the speed and flexibility of logistical operations have to be increased. One approach to this is to utilise three-dimensional space through drone technology. This doctoral thesis aims at establishing a framework for implementing aerial drones in automotive OEM logistic operations.
As of yet, there is no research on implementing drones in automotive OEM logistic operations. To contribute to filling this gap, this thesis develops a framework for Drone Implementation in Automotive Logistics Operations (DIALOOP) that allows for a close interaction between the strategic and the operative level and can lead automotive companies through a decision and selection process regarding drone technology.
A preliminary version of the framework was developed on a theoretical basis and was then revised using qualitative-empirical data from semi-structured interviews with two groups of experts, i.e. drone experts and automotive experts. The drone expert interviews contributed a current overview of drone capabilities. The automotive expert interviews were used to identify intralogistics operations in which drones can be implemented, along with the performance measures that can be improved by drone usage.
Furthermore, all interviews explored developments and changes with a foreseeable influence on drone implementation.
The revised framework was then validated using participant validation interviews with automotive experts.
The finalised framework defines a step-by-step process leading from strategic decisions and considerations, via the identification of logistics processes suitable for drone implementation and the relevant performance measures, to the choice of appropriate drone types based on a drone classification developed specifically in this thesis for an automotive context.
In the era of precision medicine, digital technologies and artificial intelligence, drug discovery and development face unprecedented opportunities for product and business model innovation, fundamentally changing the traditional approach of how drugs are discovered, developed and marketed. Critical to this transformation is the adoption of new technologies in the drug development process, catalyzing the transition from serendipity-driven to data-driven medicine. This paradigm shift comes with a need for both translation and precision, leading to a modern Translational Precision Medicine approach to drug discovery and development. Key components of Translational Precision Medicine are multi-omics profiling, digital biomarkers, model-based data integration, artificial intelligence, biomarker-guided trial designs and patient-centric companion diagnostics. In this review, we summarize and critically discuss the potential and challenges of Translational Precision Medicine from a cross-industry perspective.
Flash SSDs are omnipresent as database storage. HDD replacement is seamless since Flash SSDs implement the same legacy hardware and software interfaces to enable backward compatibility. Yet, the price paid is high as backward compatibility masks the native behaviour, incurs significant complexity and decreases I/O performance, making it non-robust and unpredictable. Flash SSDs are black-boxes. Although DBMS have ample mechanisms to control hardware directly and utilize the performance potential of Flash memory, the legacy interfaces and black-box architecture of Flash devices prevent them from doing so.
In this paper we demonstrate NoFTL, an approach that enables native Flash access and integrates parts of the Flash-management functionality into the DBMS, yielding a significant performance increase and a simplification of the I/O stack. NoFTL is implemented on real hardware based on the OpenSSD research platform. The contributions of this paper include: (i) a description of the NoFTL native Flash storage architecture; (ii) its integration in Shore-MT; and (iii) a performance evaluation of NoFTL on a real Flash SSD and on an on-line data-driven Flash emulator under TPC-B, TPC-C, TPC-E and TPC-H workloads. The performance evaluation results indicate an improvement of at least 2.4x on real hardware over conventional Flash storage, as well as better utilisation of native Flash parallelism.
In the present paper we demonstrate a novel approach to handling small updates on Flash called In-Place Appends (IPA). It allows the DBMS to revisit the traditional write behavior on Flash. Instead of writing whole database pages upon an update in an out-of-place manner on Flash, we transform those small updates into update deltas and append them to a reserved area on the very same physical Flash page. In doing so we utilize the commonly ignored fact that under certain conditions Flash memories can support in-place updates to Flash pages without a preceding erase operation.
The approach was implemented under Shore-MT and evaluated on real hardware. Under standard update-intensive workloads we observed 67% fewer page invalidations, resulting in 80% lower garbage collection overhead, which yields a 45% increase in transactional throughput, while doubling Flash longevity at the same time. IPA outperforms In-Page Logging (IPL) by more than 50%.
We showcase a Shore-MT based prototype of the above approach, operating on real Flash hardware, the OpenSSD Flash research platform. During the demonstration we allow the users to interact with the system and gain hands-on experience of its performance under different demonstration scenarios. These involve various workloads such as TPC-B, TPC-C or TATP.
In the present paper we demonstrate a novel technique for applying the recently proposed approach of In-Place Appends, i.e. overwrites on Flash without a prior erase operation. IPA can be applied selectively: only to DB objects that have frequent and relatively small updates. To do so, we couple IPA to the concept of NoFTL regions, allowing the DBA to place update-intensive DB objects into special IPA-enabled regions. The decision about the region configuration can be (semi-)automated by an advisor analyzing DB log files in the background.
We showcase a Shore-MT based prototype of the above approach, operating on real Flash hardware. During the demonstration we allow the users to interact with the system and gain hands-on experience under different demonstration scenarios.
Under update-intensive workloads (TPC, LinkBench) small updates dominate the write behavior, e.g. 70% of all updates change less than 10 bytes across all TPC OLTP workloads. These are typically performed as in-place updates and result in random writes in page granularity, causing major write overhead on Flash storage, a write amplification of several hundred times, and lower device longevity.
In this paper we propose an approach that transforms those small in-place updates into small update deltas that are appended to the original page. We utilize the commonly ignored fact that modern Flash memories (SLC, MLC, 3D NAND) can handle appends to already programmed physical pages by using various low-level techniques such as ISPP to avoid expensive erases and page migrations. Furthermore, we extend the traditional NSM page-layout with a delta-record area that can absorb those small updates. We propose a scheme to control the write behavior as well as the space allocation and sizing of database pages.
The proposed approach has been implemented under Shore-MT and evaluated on real Flash hardware (OpenSSD) and a Flash emulator. Compared to In-Page Logging it performs up to 62% fewer reads and writes and up to 74% fewer erases on a range of workloads. The experimental evaluation indicates: (i) significant reduction of erase operations, resulting in twice the longevity of Flash devices under update-intensive workloads; (ii) 15%-60% lower read/write I/O latencies; (iii) up to 45% higher transactional throughput; (iv) 2x to 3x reduction in overall write amplification.
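A purely illustrative mock-up of the in-place append idea: a page keeps a reserved delta area, small updates are appended there instead of rewriting the page, and reads replay the deltas over the base record. Sizes and record layout are assumptions, not the Shore-MT implementation.

```python
# Sketch: a page with a reserved delta-record area that absorbs small
# updates, avoiding out-of-place page rewrites until the area is full.
class IPAPage:
    def __init__(self, records, delta_slots=4):
        self.records = dict(records)        # base page content (rid -> bytes)
        self.deltas = []                    # appended update deltas
        self.delta_slots = delta_slots      # capacity of the reserved area

    def update(self, rid, payload):
        """Append an update delta; False signals the area is full and the
        page would have to be rewritten out-of-place (normal write path)."""
        if len(self.deltas) >= self.delta_slots:
            return False
        self.deltas.append((rid, payload))
        return True

    def read(self, rid):
        value = self.records[rid]
        for delta_rid, payload in self.deltas:   # replay deltas in order
            if delta_rid == rid:
                value = payload
        return value

page = IPAPage({1: b"balance=100"})
page.update(1, b"balance=90")
print(page.read(1))    # b'balance=90' without invalidating the page
```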
In this paper we build on our research in data management on native Flash storage. In particular, we demonstrate the advantages of intelligent data placement strategies. To effectively manage physical Flash space and organize the data on it, we utilize novel storage structures such as regions and groups. These are coupled to common DBMS logical structures and thus require no extra overhead for the DBA. The experimental results indicate an improvement of up to 2x, which doubles the longevity of the Flash SSD. During the demonstration the audience can experience the advantages of the proposed approach on real Flash hardware.
In this paper we present our work in progress on revisiting traditional DBMS mechanisms for managing space on native Flash and how it is administered by the DBA. Our observations and initial results show that the standard logical database structures can be used for the physical organization of data on native Flash, while higher DBMS performance is achieved without incurring extra DBA overhead. Initial experimental evaluation indicates a 20% increase in transactional throughput under TPC-C through intelligent data placement on Flash, fewer erase operations and thus better Flash longevity.
We introduce IPA-IDX, an approach that handles index modifications on modern storage technologies (NVM, Flash) as physical in-place appends, using simplified physiological log records. IPA-IDX provides similar performance and longevity advantages for indexes as basic IPA [5] does for tables. The selective application of IPA-IDX and basic IPA to certain regions and objects lowers the GC overhead by over 60%, while keeping the total space overhead to 2%. The combined effect of IPA and IPA-IDX increases performance by 28%.
Through their procyclical behavior, loan loss provisions have been identified as one of the factors that contribute to financial instability during a crisis. IFRS 9 was introduced in 2018 with an expected credit loss model replacing the incurred loss model of IAS 39 to mitigate this effect in the future. Our study aims to analyze the loan loss provisions of major banks in the Eurozone to determine for the first time whether the implementation of IFRS 9, as intended by regulators, has a dampening effect on procyclicality, especially during the stressed situation under COVID-19. We analyze 51 banks from 12 countries of the European Monetary Union using 2856 firm-year observations. While no robust evidence of less procyclicality can be found after the implementation of IFRS 9 until the pandemic, we find evidence that loan loss provisions moved countercyclically during 2020, indicating an alleviating effect at the beginning of the exogenous shock.
Decision-making in the field of Enterprise Architecture (EA) is a complex task. Many organizations establish a set of complex processes and hierarchical structures to enable the strategy-driven development of their EA. This leads to slow and inefficient decision-making, entailing poor time-to-market and discontented stakeholders. Collaborative EA delineates a lightweight approach to enabling EA decisions but often neglects strategic alignment. In this paper, we present an approach that integrates the concepts of collaborative EA and goal-driven decision-making through collaborative modeling of goal-oriented information demands based on ArchiMate’s motivation extension, in order to achieve goal-oriented EA decision support in a collaborative EA environment.
Nowadays, the demand for a MEMS development/design kit (MDK) is more in focus than ever before. In order to achieve high quality and cost effectiveness in the development process for automotive and consumer applications, an advanced design flow for the MEMS (micro-electro-mechanical systems) element is urgently required. In this paper, such a development methodology and flow for the parasitic extraction of active semiconductor devices is presented. The methodology considers geometrical extraction, links the electrically active pn junctions to SPICE standard library models and subsequently extracts the netlist. An example of a typical pressure sensor is presented and discussed. Finally, the results of the parasitic extraction are compared with fabricated devices in terms of accuracy and capability.
In contrast to IC design, MEMS design still lacks sophisticated component libraries. Therefore, the physical design of MEMS sensors is mostly done by simply drawing polygons. Hence, the sensor structure is only given as plain graphic data which hinders the identification and investigation of topology elements such as spring, anchor, mass and electrodes. In order to solve this problem, we present a rule-based recognition algorithm which identifies the architecture and the topology elements of a MEMS sensor. In addition to graphic data, the algorithm makes use of only a few marking layers, as well as net and technology information. Our approach enables RC-extraction with commercial field solvers and a subsequent synthesis of the sensor circuit. The mapping of the extracted RC-values to the topology elements of the sensor enables a detailed analysis and optimization of actual MEMS sensors.
A new method for the analysis of movement-dependent parasitics in full-custom designed MEMS sensors
(2017)
Due to the lack of sophisticated microelectromechanical systems (MEMS) component libraries, highly optimized MEMS sensors are currently designed using a polygon driven design flow. The strength of this design flow is the accurate mechanical simulation of the polygons by finite element (FE) modal analysis. The result of the FE modal analysis is included in the system model together with the data of the (mechanical) static electrostatic analysis. However, the system model lacks the dynamic parasitic electrostatic effects arising from the electric coupling between the wiring and the moving structures. In order to include these effects in the system model, we present a method which enables the quasi-dynamic parasitic extraction with respect to in-plane movements of the sensor structures. The method is embedded in the polygon driven MEMS design flow using standard EDA tools. In order to take the influences of the fabrication process into account, such as etching process variations, the method combines the FE modal analysis and the fabrication process simulation data. This enables the analysis of dynamically changing electrostatic parasitic effects with respect to movements of the mechanical structures. Additionally, the result can be included in the system model, allowing the simulation of positive feedback of the electrostatic parasitic effects on the mechanical structures.
Due to the lack of sophisticated component libraries for microelectromechanical systems (MEMS), highly optimized MEMS sensors are currently designed using a polygon driven design flow. The advantage of this design flow is its accurate mechanical simulation, but it lacks a method for an efficient and accurate electrostatic analysis of parasitic effects of MEMS. In order to close this gap in the polygon-driven design flow, we present a customized electrostatic analysis flow for such MEMS devices. Our flow features a 2.5D fabrication-process simulation, which simulates the three typical MEMS fabrication steps (namely deposition of materials including topography, deep reactive-ion etching, and the release etch by vapor-phase etching) very fast and on an acceptable abstraction level. Our new 2.5D fabrication-process simulation can be combined with commercial field-solvers such as they are commonly used in the design of integrated circuits. The new process simulation enables a faster but nevertheless satisfactory analysis of the electrostatic parasitic effects, and hence simplifies the electrical optimization of MEMS.
Due to the lack of sophisticated component libraries for microelectromechanical systems (MEMS), highly optimized MEMS sensors are currently designed using a polygon driven design flow. The advantage of this design flow is its accurate mechanical simulation, but it lacks a method for analyzing the dynamic parasitic electrostatic effects arising from the electric coupling between (stationary) wiring and structures in motion. In order to close this gap, we present a method that enables the parasitics arising from in-plane, sensor-structure motion to be extracted quasi-dynamically. With the method's structural-recognition feature we can analyze and optimize dynamic parasitic electrostatic effects.
Rational strain engineering requires solid testing of phenotypes, including productivity, and thereby ideally contributes directly to our understanding of the genotype-phenotype relationship. In fact, the test step of the strain engineering cycle is becoming the limiting step, as ever-advancing tools for generating genetic diversity exist. Here, we briefly define the challenge one faces in quantifying phenotypes and summarize existing analytical techniques that partially overcome this challenge. We argue that the evolution of volatile metabolites can be used as a proxy for cellular metabolism. In the simplest case, the product of interest is a volatile (e.g., from bulk alcohols to special fragrances) that is directly quantified over time. But nonvolatile products (e.g., from bulk long-chain fatty acids to natural products) also require major flux rerouting that potentially results in altered volatile production. While alternative techniques for volatile determination exist, rather few can be envisaged for the medium- to high-throughput analysis required for phenotype testing. Here, we contribute a detailed protocol for an ion mobility spectrometry (IMS) analysis that allows volatile metabolite quantification down to the ppb range. This sensitivity can be exploited for small-scale fermentation monitoring. The insights shared might contribute to a more frequent use of IMS in biotechnology, while the experimental aspects are of general use for researchers interested in volatile monitoring.