Boost converters suffer from a bandwidth limitation caused by the right-half-plane zero (RHPZ) in the control-to-output transfer function. Yet many applications require superior dynamic behavior. Further, the size and cost of boost converter systems can be minimized by reduced voltage deviations and fast transient responses to large-signal load transients. The key idea of the proposed ΔV/Δt-intervention control concept is to adapt the controller output to its new steady-state value immediately after a load transient by predicting it from known parameters. The concept is implemented in a digital control circuit consisting of an ASIC in a 110 nm technology and a Xilinx Spartan-6 field-programmable gate array (FPGA). In a boost converter with 3.5 V input voltage, 6.3 V output voltage, 1.2 A load, and 500 kHz switching frequency, the output voltage deviations are 2.8x smaller, scaling down the output capacitor value by the same factor. The recovery times after large-signal load transients are 2.4x shorter with the proposed concept. The control is widely applicable, as it supports constant switching frequencies and allows for duty cycle and inductor current limitations. It also shows various advantages compared to conventional control and to selected adaptive control concepts.
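The prediction step can be illustrated with the ideal boost converter relation between input voltage, output voltage, and duty cycle. A minimal sketch (the function name and the loss-free assumption are mine, not taken from the paper):

```python
def predicted_duty_cycle(v_in, v_out):
    """Ideal (loss-free) boost converter steady-state duty cycle.

    In steady state V_out = V_in / (1 - D), hence D = 1 - V_in / V_out.
    """
    if not 0 < v_in < v_out:
        raise ValueError("boost operation requires 0 < V_in < V_out")
    return 1.0 - v_in / v_out

# Operating point from the paper: 3.5 V input, 6.3 V output
print(round(predicted_duty_cycle(3.5, 6.3), 3))  # -> 0.444
```

A real ΔV/Δt controller would additionally correct this prediction for conduction and switching losses, which raise the required duty cycle slightly.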
This article proposes several modified quasi Z-source dc/dc boost converters. These can achieve soft-switching by using a clamp-switch network comprised of an active switch and a diode in parallel with a capacitor connected across one of the inductors of the Z-source network. In this way, ringing at the transistor switching node is mitigated, and the voltage at the turn-on of the transistor is reduced. Even a zero voltage switching (ZVS) of the main transistor is possible if the capacitor in the clamp-switch network is adequately chosen. The proposed circuit structure and operating mode are described and validated through simulations and measurements on a low-power prototype.
YouTube fashion videos
(2020)
YouTube is the most widely adopted and successful video-sharing platform. It works as a marketing instrument and money-making tool for companies while reaching the target group. A review of the significant literature on YouTube reveals a striking lack of information about YouTube’s benefits as a video marketing instrument for fashion brands. To develop this subject further, the purpose of this study is to enrich the existing findings on social video marketing on YouTube in the apparel industry. The findings indicate the importance of YouTube as a social network for fashion marketers. The second part conducts an empirical study that makes the YouTube channel performance of nine fashion brands the subject of discussion. Three brands each from the lifestyle, sports, and luxury sectors are analyzed through comparative aspects. Accordingly, the differences and similarities within and between the sectors are analyzed and evaluated.
Database management systems (DBMS) are critical performance components in large-scale applications under modern update-intensive workloads. Additional access paths accelerate look-up performance in a DBMS for frequently queried attributes, but the required maintenance slows down update performance. The ubiquitous B+-tree is a commonly used key-indexed access path that supports many required functionalities with logarithmic access time to requested records. Modern processing and storage technologies and their characteristics require a reconsideration of matured indexing approaches for today's workloads. Partitioned B-trees (PBT) leverage the characteristics of modern hardware technologies and complex memory hierarchies, as well as high update rates and changes in workloads, by maintaining partitions within one single B+-tree. This paper includes an experimental evaluation of the PBT's optimized write patterns and performance improvements. With PBT, transactional throughput under TPC-C increases by 30%; PBT results in beneficial sequential write patterns even in the presence of updates and maintenance operations.
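The core PBT idea of keeping logical partitions inside one sorted key space can be sketched as follows. This toy model (names and structure are illustrative, not the actual PBT implementation) stands a sorted list in for the B+-tree:

```python
from bisect import insort

class PartitionedBTree:
    """Toy model: one sorted key space whose keys carry a partition
    number as prefix. New entries always go to the current (highest)
    partition, so random logical inserts become near-sequential
    physical writes; lookups probe partitions newest-first."""

    def __init__(self):
        self.keys = []        # sorted (partition, key) pairs: stand-in for a B+-tree
        self.values = {}
        self.current = 0      # current partition number

    def insert(self, key, value):
        insort(self.keys, (self.current, key))
        self.values[(self.current, key)] = value

    def new_partition(self):
        self.current += 1     # e.g. when a write buffer is flushed

    def lookup(self, key):
        for p in range(self.current, -1, -1):   # newest partition wins
            if (p, key) in self.values:
                return self.values[(p, key)]
        return None

t = PartitionedBTree()
t.insert("a", 1)
t.new_partition()
t.insert("a", 2)          # the update lands in the newer partition
print(t.lookup("a"))      # -> 2
```

Because updates always land in the highest partition, the write pattern stays largely sequential even under random update keys, which is the effect the evaluation measures.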
Workshops and tutorials
(2018)
The 19th International Conference on Product-Focused Software Process Improvement (PROFES 2018) hosted two workshops and three tutorials. The workshops and tutorials complemented and enhanced the main conference program, offering a wider knowledge perspective around the conference topics. The topics of the two workshops were Hybrid Development Approaches in Software Systems Development (HELENA) and Managing Quality in Agile & Rapid Software Development Processes (QUaSD). The topics of the tutorials were The human factor in agile transitions – using the personas concept in agile coaching, Process Management 4.0 – Best Practices, and Domain-specific languages for specification, development, and testing of autonomous systems.
Business process management and IT-supported processes are a topical subject. Finding the business process system that best implements one's processes is not easy and takes a lot of time. This article provides a recommendation for an open source system: four selected open source workflow management systems are tested and analyzed. The main criteria for the evaluation are listed in a criteria catalogue and rated by experts according to their importance. Finally, the systems are evaluated against these criteria, and the best-rated system is recommended.
Introduction: Even though there is a standard procedure for CI surgery, surgical steps, especially in pediatric surgery, often differ individually due to anatomical variations, malformations, or unforeseen events. This is why every surgical report must be created individually, which takes time and relies on the correct memory of the surgeon. A standardized recording of intraoperative data with subsequent storage and text processing would therefore be desirable, and it provides the basis for subsequent data processing, e.g., in the context of research or quality assurance.
Method: In cooperation with Reutlingen University, we conducted a workflow analysis of the prototype of a semi-automatic checklist tool. Based on checklists automatically generated from BPMN models, a prototype user interface was developed for an Android tablet. Functions such as uploading photos and files, manual user entries, the interception of foreseeable deviations from the normal course of an operation, and the automatic creation of surgery documentation could be implemented. The system was tested in a remote usability test on a petrous bone model.
Result: The user interface allows simple, intuitive handling that fits well into the intraoperative setting. Clinical data as well as surgical steps could be individually recorded and saved via DICOM. An automatic surgery report could be created and saved.
Summary: The use of a dynamic checklist tool facilitates the capture, storage and processing of surgical data. Further applications in clinical practice are pending.
Natural wood colors occur within a wide range from almost white (e.g., white poplar), through various yellowish, reddish, and brownish hues, to almost black (e.g., ebony). The intrinsic color of wood is basically defined by its chemical composition. However, other factors such as specific anatomical formations or physical properties further affect the optical impression. Starting with the chemical composition of wood and anatomical basics, wood color and its modifications are discussed in this chapter. The classic method of coloring or re-coloring wood-based material surfaces is the application of a coating containing appropriate dyes or pigments. Different concepts for wood coating and coloration are presented. Another method uses dyes to color the wood structure itself. As alternative techniques, physical methods, for example drying, steaming, ammoniation, bleaching, enzyme treatment, as well as treatment with electromagnetic irradiation (e.g., UV), are explained in this chapter.
Willingness-to-pay for alternative fuel vehicle characteristics : a stated choice study for Germany
(2016)
In the light of European energy efficiency and clean air regulations, as well as an ambitious electric mobility goal of the German government, we examine consumer preferences for alternative fuel vehicles (AFVs) based on a Germany-wide discrete choice experiment among 711 potential car buyers. We estimate consumers’ willingness-to-pay and compensating variation (CV) for improvements in vehicle attributes, also taking taste differences in the population into account by applying a latent class model with six distinct consumer segments. Our results indicate that about one third of the consumers are oriented towards at least one AFV option, with almost half of them being AFV-affine, showing a high probability of choosing AFVs despite their current shortcomings. Our results suggest that German car buyers’ willingness-to-pay for improvements of the various vehicle attributes varies considerably across consumer groups and that the vehicle features have to meet some minimum requirements before AFVs are considered. The CV values show that decision-makers in administration and industry should focus on the most promising consumer group of ‘AFV aficionados’ and their needs. They also show that some vehicle attribute improvements could increase the demand for AFVs cost-effectively, and that consumers would accept surcharges for some vehicle attributes at a level which could enable their private provision and economic operation (e.g. fast-charging infrastructure). Improvement of other attributes will need governmental subsidies to compensate for insufficient consumer valuation (e.g. battery capacity).
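In linear-in-attributes choice models of this kind, marginal willingness-to-pay is commonly derived as the negative ratio of an attribute coefficient to the price coefficient. A minimal sketch with invented coefficients (not the study's estimates):

```python
def willingness_to_pay(beta_attribute, beta_price):
    """Marginal WTP implied by a linear utility specification:
    WTP = -beta_attribute / beta_price (beta_price is negative,
    since a higher price lowers utility)."""
    return -beta_attribute / beta_price

# Invented coefficients, for illustration only:
beta_range = 0.004     # utility per extra km of driving range
beta_price = -0.0002   # utility per extra euro of purchase price
print(willingness_to_pay(beta_range, beta_price))  # euro per km of range
```

In a latent class model, this ratio is computed per class, which is how attribute valuations can differ across the six consumer segments.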
Will chatbots play a significant role for B2B marketing in the future? Chatbots in B2B businesses
(2022)
Digitalization has gained a foothold in our everyday lives. However, it remains to be seen what digital tools B2B companies can benefit from. During the last few years, chatbots have been on the rise and have played a more significant role in B2B marketing. Thus, this research follows a literature review to examine the current state of B2B chatbots. With this, the study will discover the buyer’s preferences for chatbots compared to sales agents and the role of chatbots in different stages of the B2B sales funnel.
Context: Companies that operate in the software-intensive business are confronted with high market dynamics, rapidly evolving technologies, as well as fast-changing customer behavior. Traditional product roadmapping practices, such as fixed-time-based charts including detailed planned features, products, or services, typically fail in such environments. Until now, the underlying reasons for the failure of product roadmaps in dynamic and uncertain market environments have not been widely analyzed and understood.
Objective: This paper aims to identify current challenges and pitfalls practitioners face when developing and handling product roadmaps in a dynamic and uncertain market environment.
Method: To reach our objective we conducted a grey literature review (GLR).
Results: Overall, we identified 40 relevant papers, from which we could extract 11 challenges of applying product roadmapping in a dynamic and uncertain market environment. The analysis of the articles showed that the major challenges for practitioners lie in overcoming a feature-driven mindset, avoiding too much detail in the product roadmap, and ensuring that the content of the roadmap is not driven solely by management or expert opinion.
Purpose
The purpose of this paper is to explore why men do not rent luxury fashion, in order to explain why the demand for men's luxury fashion rental services is so low. It contributes high-quality data to research on gender differences in barriers to renting fashion and on barriers to participating in luxury fashion rental in general, and it increases the amount of data on men's consumption behavior in fashion and luxury fashion research. Furthermore, this study aims not only to make a theoretical contribution, but also to provide practical implications for the luxury fashion rental industry.
Design/methodology/approach
To answer the research question, qualitative semi-structured interviews were conducted with seven men who are interested in fashion and spend at least 10% of their monthly net income on luxury fashion. Through a deductive-inductive, category-based qualitative content analysis of the interviews, supported by the software MAXQDA, the reasons why many men refuse to rent luxury fashion were identified. The analysis also revealed characteristics that make luxury fashion rental services more attractive to men, as well as two fashion segments and a product category in which men can imagine renting fashion or luxury fashion under certain circumstances.
Findings
Men reject the concept of renting primarily because of the absence of ownership, which is tied to loss of emotional value, loss of functional value, fear of social rejection, and identity concerns; other reasons include lack of individualism, lack of habit, and their own subjective standards. Except for two outliers, the remaining men surveyed could imagine using a luxury rental service under certain conditions. The most frequently mentioned features were an omnichannel approach, transparency of the entire rental process provided by reviews and feedback about both the borrower and the lender, information about the cleaning process, and proof of authenticity. Also mentioned were the maintenance of exclusivity and the fact that rental services should be offered directly by the company. In the convenience category, the purchase option and insurance were mentioned most often. In addition, some men could imagine renting event-related clothing, very trendy and expensive luxury clothing, and luxury watches. However, none of the respondents would give up owning clothes and primarily use a luxury fashion rental service (LFRS).
Value/Practical Implications
So that marketers do not have to discover by trial and error which of these characteristics works best for which male target group, this work develops five types that can be targeted with selected characteristics and matching marketing, and thus perhaps persuaded to participate in the LFRS. The social type needs exclusivity to be maintained; the emotional type needs the purchase option and an omnichannel experience; the flexibility type needs same-day delivery and free exchange options; the cost-benefit type needs analytical tools to maximize his rental income or to calculate whether buying or renting a particular item for a particular period is cheaper; and the rule-governed type needs added value beyond renting, such as excellent service.
Objective: This paper aims at getting an understanding of current problems and challenges with roadmapping processes in companies that are facing volatile markets with innovative products. It also aims at gathering ideas and attempts on how to react to those challenges.
Method: As an initial step towards the objective, a semi-structured expert interview study with a case company in the Smart Home domain was conducted. Four employees of the case company with different roles related to product roadmaps were interviewed, and a content analysis of the data was performed.
Results: The study shows a significant consensus among the interviewees about several major challenges and the necessity to change the traditional roadmapping process and format. The interviewees stated that, based on their experience, traditional feature-based product roadmaps are increasingly losing their benefits (such as good planning certainty) in volatile environments. Furthermore, the ability to understand customer needs and behaviors has become highly important for creating and adjusting product roadmaps. The interviewees see the need for both sufficiently stable goals on the roadmap and flexibility with respect to the products or features to be developed. To reach this target, the interviewees proposed to create roadmaps based on outcome goals instead of product features. In addition, it was proposed to decrease the level of detail of the roadmaps and to emphasize the long-term view. Decisions about which feature to develop should remain open as long as possible. Expected benefits of such a new way of product roadmapping are higher user centricity, a stable overall direction, more flexibility with respect to development decisions, and less breaking of commitments.
“I have never seen one who loves virtue as much as he loves beauty,” Confucius once said. If beauty is more important than goodness, it becomes clear why people invest so much effort in their first impression. The aesthetics of faces has many aspects, and there is a strong correlation to all characteristics of humans, like age and gender. Often, research on aesthetics by social scientists and ethicists lacks sufficient labelled data and the support of machine vision tools. In this position paper we propose the Aesthetic-Faces dataset, containing training data labelled by Chinese and German annotators. As a combination of three image subsets, the AF-dataset consists of European, Asian, and African people. The research communities in machine learning, aesthetics, and social ethics can benefit from our dataset and our toolbox. The toolbox provides many functions for machine learning with state-of-the-art CNNs and an Extreme-Gradient-Boosting regressor, but also 3D Morphable Model technologies for face shape evaluation, and we discuss how to train an aesthetic estimator considering culture and ethics.
Whither the German Council of Economic Experts? The past and future of public economic advice
(2014)
The article discusses the development and impact of the German Council of Economic Experts (GCEE). First, the author studies the historical origins and the institutional setup of the GCEE. In a second step, the impact of the council's annual reports is analyzed and compared internationally with that of other advisory boards. Finally, the paper discusses the current economic challenges and the need to modernize the GCEE in particular and political advisory boards in general.
This paper develops a new governance scheme for a stable and lasting European Monetary Union (EMU). I demonstrate that the existing economic governance is based on flawed incentives, especially due to insufficient macroeconomic coordination, failures of institutional enforcement, and animal spirits in financial markets. All this caused the European sovereign debt crisis in 2010. Consequently, the EMU crisis is not a conundrum at all, but rather a failure of national and supranational governance. To tackle this problem, I propose a return to flexible but compulsory rules driven by market forces. The new governance principles shall promote compliance with and effective enforcement of the rules.
Several studies analyzed existing Web APIs against the constraints of REST to estimate the degree of REST compliance among state-of-the-art APIs. These studies revealed that only a small number of Web APIs are truly RESTful. Moreover, identified mismatches between theoretical REST concepts and practical implementations lead us to believe that practitioners perceive many rules and best practices aligned with these REST concepts differently in terms of their importance and impact on software quality. We therefore conducted a Delphi study in which we confronted eight Web API experts from industry with a catalog of 82 REST API design rules. For each rule, we let them rate its importance and software quality impact. As consensus, our experts rated 28 rules with high, 17 with medium, and 37 with low importance. Moreover, they perceived usability, maintainability, and compatibility as the most impacted quality attributes. The detailed analysis revealed that the experts saw rules for reaching Richardson maturity level 2 as critical, while reaching level 3 was less important. As the acquired consensus data may serve as valuable input for designing a tool-supported approach for the automatic quality evaluation of RESTful APIs, we briefly discuss requirements for such an approach and comment on the applicability of the most important rules.
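A tool-supported quality evaluation could implement such design rules as simple automated checks. A minimal sketch of one hypothetical rule (lowercase, hyphen-separated URI path segments) — the rule wording and its importance rating are not taken from the actual catalog:

```python
import re

# One illustrative check for a single, hypothetical rule from such a
# catalog: URI paths should be lowercase, hyphen-separated segments.
LOWERCASE_PATH = re.compile(r"(/[a-z0-9-]+)+/?")

def follows_lowercase_rule(path):
    """Return True if the URI path uses only lowercase segments."""
    return LOWERCASE_PATH.fullmatch(path) is not None

print(follows_lowercase_rule("/users/42/orders"))   # -> True
print(follows_lowercase_rule("/Users/getOrders"))   # -> False
```

A full evaluation tool would weight each rule violation by the consensus importance rating the experts assigned to that rule.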
In order to explore an image, the human eye functions like a spotlight, scanning the content from one object to the next. This visual search behavior is implemented with the help of attention control. The following work surveys visual search behavior in "Wimmelpictures", a special type of busy picture. The research objective is to analyze different search strategies and to work out possible differences concerning age and gender. The university experiment is carried out with an eye tracker that records the fixations and saccades of the test persons. The results indicate three forms of search strategy: based on a pattern, based on feature selection, or a mixture of both. Our data show that searching for specific features of the target is the most successful strategy. Furthermore, there are no differences concerning gender, but some concerning age. All age groups need more time to locate the target as the number of distractors in the image increases. The size of the target is also relevant, as a larger target is found more quickly than a smaller one.
Software development consists to a large extent of human-based processes with continuously increasing demands regarding interdisciplinary team work. Understanding the dynamics of software teams can be seen as highly important for successful project execution. Hence, for future project managers, knowledge about non-technical processes in teams is significant. In this paper, we present a course unit that provides an environment in which students can learn and experience the impact of group dynamics on project performance and quality. The course unit uses the Tuckman model as its theoretical framework and borrows from controlled experiments to organize and implement its practical parts, in which students experience the effects of, e.g., time pressure, resource bottlenecks, staff turnover, loss of key personnel, and other stress factors. We provide a detailed design of the course unit to allow for implementation in further software project management courses. Furthermore, we provide experiences obtained from two instances of this unit conducted in Munich and Karlskrona with 36 graduate students. We observed students building awareness of stress factors and developing countermeasures to reduce the impact of those factors. Moreover, students experienced which problems occur when teams work under stress and how to form a performing team despite exceptional situations.
The implementation of human resource (HR) policies often proves troublesome due to the appearance, and stubborn persistence, of gaps in the process. Human resource management (HRM) scholars problematise these gaps and advocate tight implementation to reduce gaps and to ensure the desired impact of policies on organisational performance. Drawing on organisational institutionalism, we contend that gaps in implementing HR policies can actually be productive, as they secure organisational legitimacy, and thus enable organisations to operate viably within several institutional environments. We suggest that different approaches to implementation are needed, some of them premised on accepting sustained implementation gaps. We introduce minimum and moderate implementation approaches, rooted in the notion of decoupling, to complement approaches aimed at tight implementation. Our aim is to support the further development of research based on a richer interpretation of HRM implementation challenges and choices they present for HR managers.
Context: Organizations are increasingly challenged by dynamic and technical market environments. Traditional product roadmapping practices, such as detailed and fixed long-term planning, typically fail in such environments. Therefore, companies are actively seeking ways to improve their product roadmapping approach. Goal: This paper aims at identifying problems and challenges with respect to product roadmapping. In addition, it aims at understanding how companies succeed in improving their roadmapping practices in their respective company contexts. The study focuses on mid-sized and large companies developing software-intensive products in dynamic and technical market environments. Method: We conducted semi-structured expert interviews with 15 experts from 13 German companies and performed a thematic data analysis. Results: The analysis showed that a significant number of companies are still struggling with traditional feature-based product roadmapping and opinion-based prioritization of features. The most promising areas for improvement are stating the outcomes a company is trying to achieve and making them part of the roadmap, sharing or co-developing the roadmap with stakeholders, and establishing discovery activities.
Together with many success stories, promises such as an increase in production speed and improved stakeholder collaboration have contributed to making agile a transformation of the software industry in which many companies want to take part. However, driven either by a natural and expected evolution or by contextual factors that challenge the adoption of agile methods as prescribed by their creator(s), software processes in practice mutate into hybrids over time. Are these still agile? In this article, we investigate the question: What makes a software development method agile? We present an empirical study grounded in a large-scale international survey that aims to identify software development methods and practices that improve or tame agility. Based on 556 data points, we analyze the perceived degree of agility in the implementation of standard project disciplines and its relation to the development methods and practices used. Our findings suggest that only a small number of participants operate their projects in a purely traditional or agile manner (under 15%). That said, most project disciplines and most practices show a clear trend towards increasing degrees of agility. Compared to the methods used to develop software, the selection of practices has a stronger effect on the degree of agility of a given discipline. Finally, there are no methods or practices that explicitly guarantee or prevent agility. We conclude that agility cannot be defined solely at the process level. Additional factors need to be taken into account when trying to implement or improve agility in a software company. Finally, we discuss the field of software process-related research in the light of our findings and present a roadmap for future research.
The question of why individuals adopt information technology has been present in information systems research for the past quarter century. One of the most used models for predicting technology usage was introduced by Fred Davis: the Technology Acceptance Model (TAM). It describes the influence of perceived usefulness and perceived ease of use on attitude, behavioral intention, and system usage. The first two factors are in turn influenced by external variables. Although a plethora of papers exists about the TAM, an extensive analysis of the role of the external variables in the model is still missing. This paper aims to give an overview of the most important variables. In an extensive literature review, we identified 763 relevant papers, found 552 unique single external variables, characterized the most important of them, and described the frequency of their appearance. Additionally, we grouped these variables into four categories (organizational characteristics, system characteristics, user personal characteristics, and other variables). Afterwards, we discuss the results and show implications for theory and practice.
This study examines the underexplored areas of customer success management, focusing on the impact of leadership and companywide collaboration, and the role of customer success in overall firm performance. A qualitative research approach was utilized, which involved reviewing relevant literature and conducting an interview with the Vice President of Customer Success Management in B2B at a case company. Findings revealed that both leadership and pervasive collaboration greatly enhance the customer journey experience. Given that 75% of Annual Recurring Revenue is derived from existing customers, the substantial role of customer success in propelling business growth is affirmed. The study also demonstrated the importance of proactive customer engagement, assimilating customer feedback into products and services, and nurturing personal relationships with customers for fostering innovation. It further stressed the need for service provision and decision-making at various levels, as well as the implementation of a range of communication channels, to ensure customer success.
Among the multitude of software development processes available, hardly any is used by the book. Regardless of company size or industry sector, a majority of project teams and companies use customized processes that combine different development methods— so-called hybrid development methods. Even though such hybrid development methods are highly individualized, a common understanding of how to systematically construct synergetic practices is missing. In this paper, we make a first step towards devising such guidelines. Grounded in 1,467 data points from a large-scale online survey among practitioners, we study the current state of practice in process use to answer the question: What are hybrid development methods made of? Our findings reveal that only eight methods and few practices build the core of modern software development. This small set allows for statistically constructing hybrid development methods. Using an 85% agreement level in the participants’ selections, we provide two examples illustrating how hybrid development methods are characterized by the practices they are made of. Our evidence-based analysis approach lays the foundation for devising hybrid development methods.
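The 85% agreement construction can be pictured as a simple threshold over the practices respondents selected. A minimal sketch; the practice names and counts below are invented for illustration, not the survey's data:

```python
def core_practices(selections, n_respondents, agreement=0.85):
    """Keep only the practices that at least `agreement` of the
    respondents selected, mirroring the paper's 85% agreement level."""
    return sorted(p for p, c in selections.items()
                  if c / n_respondents >= agreement)

# Invented counts (n = 100 respondents), for illustration only:
survey = {"Code Review": 92, "Continuous Integration": 88,
          "Pair Programming": 40, "Daily Standup": 86}
print(core_practices(survey, 100))
# -> ['Code Review', 'Continuous Integration', 'Daily Standup']
```

Raising or lowering the agreement level trades a smaller, widely shared core against a larger but less consensual set of practices.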
Regardless of company size or industry sector, a majority of project teams and companies use customized processes that combine different development methods, so-called hybrid development methods. Even though such hybrid development methods are highly individualized, a common understanding of how to systematically construct synergetic practices is missing. Based on 1,467 data points from a large-scale online survey among practitioners, we study the current state of practice in process use to answer the question: What are hybrid development methods made of? Our findings reveal that only eight methods and a few practices build the core of modern software development. This small set allows for statistically constructing hybrid development methods.
Rapid prototyping platforms reduce development time by allowing a prototype idea to be realized quickly, leaving more time for actual application development with user interfaces. This approach has long been followed in technical platforms such as the Arduino. To transfer this form of prototyping to wearables, this paper presents WearIT. As a wearable prototyping platform, WearIT consists of four components: (1) a vest, (2) sensor and actuator shields, (3) its own library, and (4) a motherboard consisting of an Arduino, a Raspberry Pi, a board, and a GPS module. As a result, a wearable prototype can be quickly developed by attaching sensor and actuator shields to the WearIT vest. These sensor and actuator shields can then be programmed through the WearIT library. Via Virtual Network Computing (VNC) with a remote computer, the screen contents of the Raspberry Pi can be accessed and the Arduino can be programmed.
Water jacket systems are routinely used to control the temperature of Petri dish cell culture chambers. Despite their widespread use, the thermal characteristics of such systems have not been fully investigated. In this study, we conducted a comprehensive set of theoretical, numerical and experimental analyses to investigate the thermal characteristics of Petri dish chambers under stable and transient conditions. In particular, we investigated the temperature gradient along the radial axis of the Petri dish under stable conditions, and the transition period under transient conditions. Our studies indicate a radial temperature gradient of 3.3 °C along with a transition period of 27.5 min when increasing the sample temperature from 37 to 45 °C for a standard 35 mm diameter Petri dish. We characterized the temperature gradient and transition period under various operational, geometric, and environmental conditions. Under stable conditions, reducing the diameter of the Petri dish and incorporating a heater underneath the Petri dish can effectively reduce the temperature gradient across the sample. In comparison, under transient conditions, reducing the diameter of the Petri dish, reducing sample volume, and using glass Petri dish chambers can reduce the transition period.
Small Island Developing States (SIDS) face a tension between economic growth and environmental impact. Tourism fuels growth, but the resulting solid waste and other pollutants threaten the SIDS’ natural beauty, quality of life for residents, attractiveness to tourists, and economic success. We assess the tension between tourism-driven economic growth and environmental degradation from a limits-to-growth perspective, developing a generic system dynamics model of the problem using 38 years of data from the Maldives to estimate parameters and Monte Carlo methods to assess the sensitivity of results to uncertainty. We contrast development paths for the next three decades under three sets of policies focusing on promoting growth, managing the tourism demand–supply balance, and improving waste management. Findings are counterintuitive; policies focused on better waste management alone are self-defeating, because they increase tourism, growth, and waste generation, undermining attractiveness and growth later. Policies that limit tourism demand improve economic and environmental health.
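The limits-to-growth feedback described in this abstract can be sketched as a tiny stock-flow simulation. This is an illustrative toy, not the calibrated Maldives model; all parameter names and values are assumptions:

```python
# Toy limits-to-growth loop: tourism drives growth, growth drives waste,
# and accumulated waste erodes attractiveness and thus future growth.
def simulate(years=30, dt=0.25):
    tourists, waste = 100.0, 0.0          # stocks, arbitrary units
    history = []
    for _ in range(int(years / dt)):
        attractiveness = max(0.0, 1.0 - waste / 5000.0)  # waste erodes appeal
        inflow = 0.05 * tourists * attractiveness        # tourist growth
        generated = 2.0 * tourists                       # waste per tourist-year
        removed = 0.3 * waste                            # waste management
        tourists += inflow * dt
        waste += (generated - removed) * dt
        history.append((tourists, waste))
    return history

hist = simulate()
```

Raising only the removal rate in such a loop lets tourism, and hence waste generation, keep growing, which is the self-defeating dynamic the abstract points to.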
Wasted paradise – imagining the Maldives without the garbage island of Thilafushi : Version 1.2
(2016)
To address the high level of waste production in the Maldives, the local government decided in 1992 to transform the coral island of Thilafushi into an immense waste dump. Today, 330 tons of waste are ferried to Thilafushi each day. The policy had the positive consequence of relieving the garbage burden in Malé, the main island, and the surrounding tourist atolls. However, it can also lead to serious environmental and economic damage in the long run. First, the garbage is in visual range of one of the most prominent tourist destinations. Second, if the wind blows a certain way, unfiltered fumes from burning waste travel to tourist atolls. Third, water quality can erode as hazardous waste from batteries and other toxic waste floats in the ocean. Over time, these effects can accumulate and significantly reduce the number of tourists that travel to the Maldives – one of the state’s main sources of financial income. In our paper, we lay out the situation in more detail and translate it into a simulation model. We test different policies to suggest to the Maldives government how to better solve the waste problem.
This study investigates how integrated reporting (IR) creates value for investors. It examines how providers of financial capital benefit from an improved firm information environment provided by IR. Specifically, this study investigates the effect of voluntary IR disclosure on analyst earnings forecast accuracy as well as on firm value. To do so, we use an international sample of 167 listed companies that voluntarily publish an integrated report. Our analysis shows no significant effect of a voluntary IR publication on analyst earnings forecast accuracy and no significant effect on firm value. We thus do not find evidence for the fulfillment of IR's promises regarding improved information environment and value creation of voluntary adopters. We conclude that such companies might already have a relatively high level of transparency leading to an absent additional effect of IR disclosure. Positive effects of IR appear to be more relevant in environments where IR is mandatory.
Purpose: Human breath analysis is proposed with increasing frequency as a useful tool in clinical applications. We performed this study to find characteristic volatile organic compounds (VOCs) in the exhaled breath of patients with idiopathic pulmonary fibrosis (IPF) for discrimination from healthy subjects. Methods: VOCs in the exhaled breath of 40 IPF patients and 55 healthy controls were measured using a multi-capillary column and ion mobility spectrometer. The patients were examined by pulmonary function tests, blood gas analysis, and serum biomarkers of interstitial pneumonia. Results: We detected 85 VOC peaks in the exhaled breath of IPF patients and controls. IPF patients showed 5 significant VOC peaks: p-cymene, acetoin, isoprene, ethylbenzene, and an unknown compound. The VOC peak of p-cymene was significantly lower (p < 0.001), while the VOC peaks of acetoin, isoprene, ethylbenzene, and the unknown compound were significantly higher (p < 0.001 for all) compared with the peaks of controls. Comparing VOC peaks with clinical parameters, negative correlations with VC (r = −0.393, p = 0.013), %VC (r = −0.569, p < 0.001), FVC (r = −0.440, p = 0.004), %FVC (r = −0.539, p < 0.001), DLco (r = −0.394, p = 0.018), and %DLco (r = −0.413, p = 0.008) and a positive correlation with KL-6 (r = 0.432, p = 0.005) were found for p-cymene. Conclusion: We found 5 characteristic VOCs in the exhaled breath of IPF patients. Among them, the VOC peaks of p-cymene were related to the clinical parameters of IPF. These VOCs may be useful biomarkers of IPF.
Background: Multicapillary column ion-mobility spectrometry (MCC-IMS) may identify volatile components in exhaled gas. The authors therefore used MCC-IMS to evaluate exhaled gas in a rat model of sepsis, inflammation, and hemorrhagic shock.
Methods: Male Sprague-Dawley rats were anesthetized and ventilated via tracheostomy for 10 h or until death. Sepsis was induced by cecal ligation and incision in 10 rats; a sham operation was performed in 10 others. In 10 other rats, endotoxemia was induced by intravenous administration of 10 mg/kg lipopolysaccharide. In a final 10 rats, hemorrhagic shock was induced to a mean arterial pressure of 35 +/- 5 mmHg. Exhaled gas was analyzed with MCC-IMS, and volatile compounds were identified using the BS-MCC/IMS-analytes database (Version 1209; B&S Analytik, Dortmund, Germany).
Results: All sham animals survived the observation period, whereas mean survival time was 7.9 h in the septic animals, 9.1 h in the endotoxemic animals, and 2.5 h in the animals with hemorrhagic shock. Volatile compounds showed statistically significant differences in septic and endotoxemic rats compared with sham rats for 3-pentanone and acetone. Endotoxemic rats differed significantly from sham rats for 1-propanol, butanal, acetophenone, 1,2-butandiol, and 2-hexanone. Statistically significant differences were observed between septic and endotoxemic rats for butanal, 3-pentanone, and 2-hexanone. In the rats with hemorrhagic shock, 2-hexanone differed from all other groups.
Conclusions: Breath profiles of expired organic compounds differed significantly among septic, endotoxemic, and sham rats. MCC-IMS of exhaled breath deserves additional study as a noninvasive approach for distinguishing sepsis from inflammation.
Digital Enterprise Architecture allows multiple viewpoints on a company’s IT landscape. To gain valuable information from huge amounts of operational data, it is indispensable to have both an understanding of the operations architecture and an engine capable of managing Big Data. The mechanism for understanding huge amounts of data is based on three main steps: collect, process, and use. The main idea is focused on extracting valuable information out of Big Data to make better design decisions. The Elastic Stack is an open-source solution to comfortably and quickly handle Big Data scenarios.
Formula One races provide a wealth of data worth investigating. Although the time-varying data has a clear structure, it is quite challenging to analyze it for further properties. Here the focus is on a visual classification of events, drivers, and time periods. As a first step, the Formula One data is visually encoded based on a line plot visual metaphor reflecting the dynamic lap times; finally, a classification of the races based on the visual outcomes gained from these line plots is presented. The visualization tool is web-based and provides several interactively linked views on the data, starting with a calendar-based overview representation. To illustrate the usefulness of the approach, Formula One data from several years, covering races in different locations, is visually explored. The chapter discusses algorithmic, visual, and perceptual limitations that might occur during the visual classification of time-series data such as Formula One races.
Prominent theories of action recognition suggest that during the recognition of actions the physical pattern of the action is associated with only one action interpretation (e.g., a person waving his arm is recognized as waving). In contrast to this view, studies examining the visual categorization of objects show that objects are recognized in multiple ways (e.g., a VW Beetle can be recognized as a car or a beetle) and that categorization performance is based on the visual and motor movement similarity between objects. Here, we studied whether we find evidence for multiple levels of categorization for social interactions (physical interactions with another person, e.g., handshakes). To do so, we compared visual categorization of objects and social interactions (Experiments 1 and 2) in a grouping task and assessed the usefulness of motor and visual cues (Experiments 3, 4, and 5) for object and social interaction categorization. Additionally, we measured recognition performance associated with recognizing objects and social interactions at different categorization levels (Experiment 6). We found that basic level object categories were associated with a clear recognition advantage compared to subordinate recognition, but basic level social interaction categories provided only a little recognition advantage. Moreover, basic level object categories were more strongly associated with similar visual and motor cues than basic level social interaction categories. The results suggest that cognitive categories underlying the recognition of objects and social interactions are associated with different performances. These results are in line with the idea that the same action can be associated with several action interpretations (e.g., a person waving his arm can be recognized as waving or greeting).
In times of dynamic markets, enterprises have to be agile to be able to react quickly to market influences. Due to the increasing digitization of products, the enterprise IT is often affected when business models change. Enterprise Architecture Management (EAM) targets a holistic view of the enterprise's IT and its relations to the business. However, Enterprise Architectures (EA) are complex structures consisting of many layers, artifacts, and relationships between them. Thus, analyzing EAs is a very complex task for stakeholders. Visualizations are common vehicles to support analysis. However, in practice visualization capabilities lack flexibility and interactivity. A solution to improve the support of stakeholders in analyzing EAs might be the application of visual analytics. Starting from a systematic literature review, this article investigates the features of visual analytics relevant in the context of EAM.
In this paper we describe an interactive web-based visual analysis tool for Formula One races. It first provides an overview of all races on a yearly basis in a calendar-like representation. From this starting point, races can be selected and visually inspected in detail. We support a dynamic race position diagram as well as a more detailed lap times line plot for showing the drivers’ lap times in comparison. Many interaction techniques are supported, like selections, filtering, highlighting, color coding, or details-on-demand. We illustrate the usefulness of our visualization tool by applying it to a Formula One dataset, describing the different dynamic visual racing patterns for a number of selected races and drivers.
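A race position diagram of the kind mentioned above can be derived from per-driver lap times alone. A small sketch with hypothetical data and a hypothetical function name, not the tool's actual code:

```python
# Rank drivers by cumulative race time after each lap to obtain the
# per-lap positions that a race position diagram plots.
def positions_by_lap(lap_times):
    """lap_times: {driver: [lap 1 time, lap 2 time, ...]} in seconds."""
    n_laps = len(next(iter(lap_times.values())))
    positions = {driver: [] for driver in lap_times}
    for lap in range(1, n_laps + 1):
        cumulative = {d: sum(t[:lap]) for d, t in lap_times.items()}
        ranking = sorted(cumulative, key=cumulative.get)
        for place, driver in enumerate(ranking, start=1):
            positions[driver].append(place)
    return positions

times = {"HAM": [90.1, 89.8, 89.9], "VER": [90.5, 89.2, 89.6]}
pos = positions_by_lap(times)   # "VER" overtakes "HAM" on lap 2
```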
Based on well-established robotic concepts of autonomous localization and navigation, we present a system prototype to assist camera-based indoor navigation for human users, implemented in the Robot Operating System (ROS). Our prototype takes advantage of state-of-the-art computer vision and robotic methods. Our system is designed for assistive indoor guidance. We employ a vibro-tactile belt as a guiding device to render derived motion suggestions to the user via vibration patterns. We evaluated the effectiveness of a variety of vibro-tactile feedback patterns for the guidance of blindfolded users. Our prototype demonstrates that a vision-based system can support human navigation, and may also assist the visually impaired in a human-centered way.
This study examines the phenomenon of Virtual Influencer (VI) marketing and its impact on customer purchase behavior. The aim is to understand the scope and impact of VI marketing. The study compares VI marketing to traditional Human Influencer (HI) marketing and identifies the unique benefits and challenges associated with VIs. A survey was conducted to gain insight into consumer attitudes and behaviors toward VIs. Key findings reveal varying levels of trust and acceptance of VIs among consumers. While some participants expressed openness to buying products promoted by VIs, others had reservations about their authenticity. The study also explores the potential role of VIs in the metaverse, highlighting business opportunities and challenges in this evolving digital landscape. Overall, this research sheds light on the growing influence of VIs and the need for further research in the field of marketing.
Today the optimization of metal forming processes is done using advanced simulation tools in a virtual process, e.g. FEM studies. The modification of the free parameters represents the different variants to be analysed. Experienced engineers may derive useful proposals in an acceptable time if good initial proposals are available. As soon as the number of free parameters grows, or the total process takes a long time and uses several successive forming steps, it may be quite difficult to find promising initial ideas. In metal forming another problem has to be considered: optimization using a series of local improvements, often called a gradient approach, may find a local optimum, but this could be far from a satisfactory solution. Therefore non-deterministic approaches, e.g. Bionic Optimization, have to be used. Approaches such as Evolutionary Optimization or Particle Swarm Optimization are capable of covering a large range of high-dimensional optimization spaces and discovering many local optima, so the chance of including the global optimum increases when using such non-deterministic methods. Unfortunately these bionic methods require large numbers of studies of different variants of the process to be optimized. The number of studies tends to increase exponentially with the number of free parameters of the forming process. As the time for one single study may not be negligible either, the total time demand becomes unacceptable, taking weeks to months even if high-performance computing is used. Therefore the optimization process needs to be accelerated. Among the many ideas to reduce the time and computing power requirements, Meta- and Hybrid Optimization seem to produce the most efficient results. Hybrid Optimization often consists of global searches for promising regions within the parameter space. As soon as the studies indicate that there could be a local optimum, a deterministic study tries to identify this local region.
If it shows better performance than other optima found so far, it is preserved for a more detailed analysis. If it performs worse than other optima, the region is excluded from further search. Meta-Optimization is often understood as the derivation of Response Surfaces of the functions of the free parameters. Once enough studies have been performed, the optimization is done using the Response Surfaces as representatives, e.g. for the goal and the restrictions of the optimization problem. Having found regions where interesting solutions are to be expected, the studies available so far are used to define the Response Surfaces. In many cases low-degree polynomials are used, with their coefficients determined by least-squares methods. Both approaches, Hybrid Optimization and Meta-Optimization, sometimes used in combination, often help to shorten the total optimization process by large numbers of variants that need not be studied. In consequence they are highly recommended when dealing with time-consuming optimization studies.
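The Response Surface step can be illustrated in one dimension: fit a low-degree polynomial to the studies performed so far by least squares and take its stationary point as the next candidate. A sketch with an assumed quadratic goal function (the real forming processes are high-dimensional and far more expensive to evaluate):

```python
import numpy as np

# Sampled "studies" of an expensive goal function; here a noisy quadratic
# with its true minimum at x = 0.5, an assumption for illustration.
rng = np.random.default_rng(0)
x = np.linspace(-2.0, 2.0, 9)
y = (x - 0.5) ** 2 + 0.01 * rng.standard_normal(x.size)

# Least-squares quadratic Response Surface and its analytic minimum.
c2, c1, c0 = np.polyfit(x, y, 2)
x_min = -c1 / (2.0 * c2)        # candidate point for the next detailed study
```

The cheap surrogate minimum then replaces many direct simulation runs, which is exactly the saving the abstract describes.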
The present study investigated the possibilities and limitations of using a low-cost NIR spectrometer for the verification of the presence of the declared active pharmaceutical ingredients (APIs) in tablet formulations, especially for medicine screening studies in low-resource settings. Spectra from 950 to 1650 nm were recorded for 170 pharmaceutical products representing 41 different APIs, API combinations or placebos. Most of the products, including 20 falsified medicines, had been collected in medicine quality studies in African countries. After exploratory principal component analysis, models were built using data-driven soft independent modelling of class analogy (DD-SIMCA), a one-class classifier algorithm, for tablet products of penicillin V, sulfamethoxazole/trimethoprim, ciprofloxacin, furosemide, metronidazole, metformin, hydrochlorothiazide, and doxycycline. Spectra of amoxicillin and amoxicillin/clavulanic acid tablets were combined into a single model. Models were tested using Procrustes cross-validation and by projection of spectra of tablets containing the same or different APIs. Tablets containing no or different APIs could be identified with 100 % specificity in all models. A separation of the spectra of amoxicillin and amoxicillin/clavulanic acid tablets was achieved by partial least squares discriminant analysis. 15 out of 19 external validation products (79 %) representing different brands of the same APIs were correctly identified as members of the target class; three of the four rejected samples showed an API mass percentage of the total tablet weight that was out of the range covered in the respective calibration set. Therefore, in future investigations larger and more representative spectral libraries are required for model building. Falsified medicines containing no API, incorrect APIs, or grossly incorrect amounts of the declared APIs could be readily identified. 
Variation between different NIR-S-G1 spectroscopic devices led to a loss of accuracy if spectra recorded with different devices were pooled. Therefore, piecewise direct standardization was applied for calibration transfer. The investigated method is a promising tool for medicine screening studies in low-resource settings.
Verification of an active time constant tuning technique for continuous-time delta-sigma modulators
(2022)
In this work we present a technique to compensate the effects of R-C / gm-C time-constant (TC) errors due to process variation in continuous-time delta-sigma modulators. Local TC error compensation factors are shifted around in the modulator loop to positions where they can be implemented efficiently with finely tunable circuit structures, such as current-steering digital-to-analog converters (DAC). We apply our technique to a third-order, single-bit, low-pass continuous-time delta-sigma modulator in cascaded integrator feedback structure, implemented in a 0.35-μm CMOS process. A tuning scheme for the reference currents of the feedback DACs is derived as a function of the individual TC errors and verified by circuit simulations. We confirm the tuning technique experimentally on the fabricated circuit over a TC parameter variation range of ±20%. Stable modulator operation is achieved for all parameter sets. The measured performances satisfy the expectations from our theoretical calculations and circuit-level simulations.
Venture capital and the innovative power of a state : econometric study including Google data
(2015)
This article focuses on venture capital investments and the innovative power of a state defined by its public infrastructure. The economic implications are evaluated by estimating several panel regression models. The novelty is twofold: on the one hand the research approach and on the other hand the new data set. The data ranges from 1995 to 2014 and consists of 10 European countries plus the US and Canada. For the first time we include Google search data on Venture Capital. The results show that a significant increase in Venture Capital is mainly determined by economic conditions such as real GDP growth. The impact of the innovative power of a state is not significant. We find that Google data is also positively related and significant with respect to Venture Capital investments. Consequently, we confirm that private business investments cannot be created by government policy alone, but rather through solid macroeconomic conditions.
Redirected walking techniques allow people to walk in a larger virtual space than the physical extents of the laboratory. We describe two experiments conducted to investigate human sensitivity to walking on a curved path and to validate a new redirected walking technique. In a psychophysical experiment, we found that sensitivity to walking on a curved path was significantly lower for slower walking speeds (radius of 10 meters versus 22 meters). In an applied study, we investigated the influence of a velocity-dependent dynamic gain controller and an avatar controller on the average distance that participants were able to freely walk before needing to be reoriented. The mean walked distance was significantly greater in the dynamic gain controller condition, as compared to the static controller (22 meters versus 15 meters). Our results demonstrate that perceptually motivated dynamic redirected walking techniques, in combination with reorientation techniques, allow for unaided exploration of a large virtual city model.
Values Management System
(2022)
The Values Management System (VMS) is a management standard to “provide a sustainable safeguard of a firm and its development, in all dimensions (legal, economic, ecological, social)” (VMSZfW, p. 4). It includes a framework for values-driven governance through self-commitment and self-binding mechanisms. Values promote a sense of identity and give organizations guidance in decision-making. This is especially important in decision-making processes where topics are not clearly ruled by laws and regulations.
The VMSZfW must be embedded in the specific business strategy, structure, and culture of an organization. The following four steps describe the implementation of the Values Management System ZfW: (i) codify the core values of an organization, for instance with a “mission, vision and values statement” or a Code of Ethics, (ii) implement guidelines such as a Code of Conduct and specific policies and procedures, (iii) systematize these by establishing management systems such as Compliance and CSR management systems, and (iv) finally organize and establish structures to ensure the strategic direction, operational implementation, and review of these processes. The top management shows that values management is taken seriously by their self-commitment to the core values of the company.
Private equity (PE) firms are investment firms that acquire equity shares in companies. The goal of PE firms is to exit the investment after a few years with a substantial increase in value. PE firms often claim to outperform the market, i.e. to create alpha.
The overall aim of this paper is to unravel the mystery of value creation in the PE industry. First, the author presents a conceptual framework for value creation in the PE industry based on a multiple valuation model that breaks down value creation into different elements. Second, the paper evaluates whether PE firms really create value by analysing and combining results from prior empirical studies based on the conceptual framework.
The results show that existing empirical evidence is mixed, but that there is indeed a tendency toward positive evidence that PE firms create economic value on average. However, there are methodological difficulties in measuring value creation, and studies are often subject to bias. Finally, it is pointed out that the question of whether PE firms really create value has to be viewed from different perspectives, such as those of the PE firm, the investors, and the portfolio companies.
Artificial Intelligence enables innovative applications, and applications based on Artificial Intelligence are increasingly important for all aspects of the Digital Economy. However, the question of how AI resources such as tools and data can be linked to provide an AI capability and create business value is still open. Therefore, this paper identifies the value-creating mechanisms of connectionist artificial intelligence using a capability-oriented view and points out the connections to different kinds of business value. The analysis supports an agenda that identifies areas needing further research to understand the mechanisms of value creation in connectionist artificial intelligence.
This article investigates the fundamental value of digital platforms such as Facebook and Google. Despite the transformative nature of digital technologies, it is challenging to value digital services, given that their usage is free of charge. Applying the methodology of discrete choice experiments, we estimated the value of free digital goods. For the first time in the literature, we obtained data for the willingness-to-pay and willingness-to-accept, together with socio-economic variables. The customers' average valuation of free digital services is 121 € per week for Google and 28 € for Facebook.
Product engineering and subsequent phases of product lifecycles are predominantly managed in isolation. Companies therefore do not fully exploit the potential of using data from smart factories and product usage. The novel intelligent and integrated Product Lifecycle Management (i²PLM) describes an approach that uses these data for product engineering. This paper describes the i²PLM, shows the cause-and-effect relationships in this context, and presents in detail the validation of the approach. The i²PLM is applied and validated on a smart product in an industrial research environment. Here, the subsequent generation of a smart lunchbox is developed based on production and sensor data. The results of the validation give indications for further improvements of the i²PLM. This paper describes how to integrate the i²PLM into a learning factory.
Sleep disorders can impact daily life, affecting physical, emotional, and cognitive well-being. Due to the time-consuming, highly obtrusive, and expensive nature of standard approaches such as polysomnography, it is of great interest to develop a noninvasive and unobtrusive in-home sleep monitoring system that can reliably and accurately measure cardiorespiratory parameters while causing minimal discomfort to the user’s sleep. We developed a low-cost Out of Center Sleep Testing (OCST) system with low complexity to measure cardiorespiratory parameters. We tested and validated two force-sensitive resistor strip sensors under the bed mattress covering the thoracic and abdominal regions. Twenty subjects were recruited, including 12 males and 8 females. The ballistocardiogram signal was processed using the 4th smooth level of the discrete wavelet transform and a 2nd-order Butterworth bandpass filter to measure the heart rate and respiration rate, respectively. We reached a total error (with respect to the reference sensors) of 3.24 beats per minute for heart rate and 2.32 breaths per minute for respiration rate. For males and females, heart rate errors were 3.47 and 2.68, and respiration rate errors were 2.32 and 2.33, respectively. We developed and verified the reliability and applicability of the system. It showed only a minor dependency on sleeping position, one of the major confounders in sleep measurement. We identified the sensor under the thoracic region as the optimal configuration for cardiorespiratory measurement. Although testing the system with healthy subjects and regular patterns of cardiorespiratory parameters showed promising results, further investigation of the bandwidth frequency is required, together with validation of the system on larger groups of subjects, including patients.
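The respiration-rate path of such a pipeline can be sketched in a few lines. Here a frequency-domain band-pass stands in for the paper's 2nd-order Butterworth stage, and the signal, sampling rate, and cutoffs are assumptions for illustration, not the authors' settings:

```python
import numpy as np

fs = 100.0                                    # assumed sampling rate in Hz
t = np.arange(0, 60, 1 / fs)                  # one minute of samples
# Synthetic under-mattress signal: 0.25 Hz breathing (15 breaths/min) plus noise.
signal = np.sin(2 * np.pi * 0.25 * t)
signal += 0.3 * np.random.default_rng(1).standard_normal(t.size)

# Crude 0.1-0.5 Hz band-pass: zero all spectral bins outside the band.
spec = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(t.size, 1 / fs)
spec[(freqs < 0.1) | (freqs > 0.5)] = 0.0
filtered = np.fft.irfft(spec, n=t.size)       # recovered respiration waveform

# Respiration rate = dominant in-band frequency, in breaths per minute.
resp_rate = freqs[np.argmax(np.abs(spec))] * 60.0
```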
Hyperspectral imaging and reflectance spectroscopy in the range from 200–380 nm were used to rapidly detect and characterize copper oxidation states and their layer thicknesses on direct bonded copper in a non-destructive way. Single-point UV reflectance spectroscopy, as a well-established method, was utilized to compare the quality of the hyperspectral imaging results. For the laterally resolved measurements of the copper surfaces an UV hyperspectral imaging setup based on a pushbroom imager was used. Six different types of direct bonded copper were studied. Each type had a different oxide layer thickness and was analyzed by depth profiling using X-ray photoelectron spectroscopy. In total, 28 samples were measured to develop multivariate models to characterize and predict the oxide layer thicknesses. The principal component analysis models (PCA) enabled a general differentiation between the sample types on the first two PCs with 100.0% and 96% explained variance for UV spectroscopy and hyperspectral imaging, respectively. Partial least squares regression (PLS-R) models showed reliable performance with R2c = 0.94 and 0.94 and RMSEC = 1.64 nm and 1.76 nm, respectively. The developed in-line prototype system combined with multivariate data modeling shows high potential for further development of this technique towards real large-scale processes.
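The exploratory PCA step used in the study above can be reproduced with a plain SVD. The "spectra" here are synthetic stand-ins with two sample types separated by a mean offset, not the measured copper-oxide data:

```python
import numpy as np

rng = np.random.default_rng(0)
spectra = rng.standard_normal((28, 200))      # 28 samples x 200 spectral bands
spectra[14:] += 1.0                           # second sample type: offset baseline

X = spectra - spectra.mean(axis=0)            # mean-center each band
U, S, Vt = np.linalg.svd(X, full_matrices=False)
scores = U * S                                # PC scores for each sample
explained = S ** 2 / np.sum(S ** 2)           # explained variance per PC
```

With a real separation between sample types, the two groups fall on opposite sides of the first principal component, which is the kind of differentiation the PCA models above exploit.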
Applications often need to be deployed in different variants due to different customer requirements. However, since modern applications often need to be deployed using multiple deployment technologies in combination, such as Ansible and Terraform, the deployment variability must be considered in a holistic way. To tackle this, we previously developed Variability4TOSCA and the prototype OpenTOSCA Vintner, which is a TOSCA preprocessing and management layer that implements Variability4TOSCA. In this demonstration, we present a detailed case study that shows how to model a deployment using Variability4TOSCA, how to resolve the variability using Vintner, and how the result can be deployed.
Despite the unstoppable global drive towards electric mobility, the electrification of sub-Saharan Africa’s ubiquitous informal multi-passenger minibus taxis raises substantial concerns. This is due to a constrained electricity system, both in terms of generation capacity and distribution networks. Without careful planning and mitigation, the additional load of charging hundreds of thousands of electric minibus taxis during peak demand times could prove catastrophic. This paper assesses the impact of charging 202 of these taxis in Johannesburg, South Africa. The potential of using external stationary battery storage and solar PV generation is assessed to reduce both peak grid demand and total energy drawn from the grid. With the addition of stationary battery storage of an equivalent of 60 kWh/taxi and a solar plant of an equivalent of 9.45 kWpk/taxi, the grid load impact is reduced by 66%, from 12 kW/taxi to 4 kW/taxi, and the daily grid energy by 58% from 87 kWh/taxi to 47 kWh/taxi. Given the country’s dependence on coal to generate electricity, the reduced grid demand, together with the solar PV supply, also reduces greenhouse gas emissions by 58%.
The purpose of this paper is to determine the relevance of social media for luxury brand management. It employs a multi-methodological approach: after analyzing the online performance of the three luxury brands Burberry, Louis Vuitton, and Gucci, the empirical research includes a survey as well as an eye-tracking test conducted with Tobii Studio. The findings reveal that online and social media have given luxury fashion businesses the opportunity to establish sustainable interaction with their customers and distinguish themselves from the competition. Still, the online business holds many challenges for luxury companies to overcome. This paper provides guidance on how social media can be effectively incorporated into a luxury company.
Recognizing human actions, reliably inferring their meaning, and being able to exchange mutual social information are core challenges for autonomous systems that directly share space with humans. Today’s technical perception solutions have been developed and tested mostly on standard vision benchmark datasets, for which manual labeling of sensory ground truth is a tedious but necessary task. Furthermore, rarely occurring human activities are underrepresented in such data, leading to algorithms that fail to recognize these activities. For this purpose, we introduce a modular simulation framework that allows algorithms to be trained and validated under various environmental conditions. For this paper, we created a dataset containing rare human activities in urban areas, on which a current state-of-the-art pose estimation algorithm fails, and demonstrate how to train such rare poses with simulated data only.
Engineers in the research project “Digital Product Life-Cycle” use a graph-based design language to model all aspects of the product they are working on. This abstract model is the basis for all further investigations, developments, and implementations. Particularly at early stages of development, collaborative decision making is very important. We propose a semantically augmented knowledge space, realized by means of mixed reality technology, to support engineering teams. To this end, we present an interaction prototype consisting of a pico projector and a camera. In our usage scenario, engineers augment different artefacts in a virtual working environment. The concept of our prototype comprises both an interaction and a technical concept. To realise implicit and natural interactions, we conducted two prototype tests: (1) a test with a low-fidelity prototype and (2) a test using the Wizard of Oz method. As a result, we present a prototype with interaction selection using augmentation spotlighting and an interaction zoom as a semantic zoom.
The desire to combine advanced, user-friendly interfaces with a product personality communicating environmental friendliness to customers poses new challenges for car interior designers, as little research has been carried out in this field to date. In this paper, the creation of three personas aimed at defining key German car users with pro-environmental behaviour is presented. After collecting ethnographic data on potential drivers through a literature review, information about generation and Euro car segment led to the definition of three key user groups. The resulting personas were applied to determine the most important interaction points in the car interior. Finally, existing design cues of eco-friendly product personality developed in the field of automotive design were explored. Our work presents three strategic directions for the design development of future in-car user interfaces: a) foster multimodal mobility; b) emphasize the interlinkage between economy and sustainable driving; and c) highlight new technological developments. The presented results are meant as an impulse for developers to meet the needs of green customers and drivers when designing user-friendly HMI components.
Using measurement and simulation for understanding distributed development processes in the Cloud
(2017)
Organizations increasingly develop software in a distributed manner. The Cloud provides an environment to create and maintain software-based products and services. Currently, it is widely unknown which software processes are suited for Cloud-based development and what their effects in specific contexts are. This paper presents a process simulation to study distributed development in the Cloud. We contribute a simulation model that helps analyze different project parameters and their impact on projects carried out in the Cloud. The simulator reproduces activities, developers, issues, and events in the project, and it generates statistics, e.g., on throughput, total time, and lead and cycle time. The aim of this simulation model is thus to analyze the tradeoffs regarding throughput, total time, project size, and team size. Furthermore, the modified simulation model aims to help project managers select the most suitable planning alternative. Based on observed projects in Finland and Spain, we simulated a distributed project using artificial and real data. In particular, we studied the variables project size, team size, throughput, and total project duration. A comparison of the real project data with the results obtained from the simulation shows that the simulation produces results close to the real data, and we could successfully replicate a distributed software project. By improving the understanding of distributed development processes, our simulation model thus supports project managers in their decision-making.
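A drastically simplified sketch of such a process simulation might look like the following; the arrival and effort rates are invented, and the actual model covers activities, events, and much richer statistics:

```python
import random

def simulate(num_devs, num_issues, seed=0):
    # minimal sketch: developers pull issues from a shared backlog
    rng = random.Random(seed)
    free_at = [0.0] * num_devs      # time each developer becomes free
    lead_times, arrival = [], 0.0
    for _ in range(num_issues):
        arrival += rng.expovariate(1.0)          # ~1 issue/hour arrives
        dev = min(range(num_devs), key=lambda d: free_at[d])
        start = max(arrival, free_at[dev])       # wait for issue and developer
        free_at[dev] = start + rng.expovariate(1 / 3.0)  # ~3 h mean effort
        lead_times.append(free_at[dev] - arrival)
    total_time = max(free_at)
    return {"total_time": total_time,
            "throughput": num_issues / total_time,
            "mean_lead_time": sum(lead_times) / num_issues}

# larger teams shorten lead times once the backlog saturates
small = simulate(num_devs=2, num_issues=200)
large = simulate(num_devs=6, num_issues=200)
```

Sweeping team size and issue volume in this way is the kind of tradeoff analysis (throughput vs. total time vs. team size) the simulation model is built for.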
A sequence of transactions represents a complex and multidimensional type of data. Feature construction can be used to reduce the data's dimensionality to find behavioural patterns within such sequences. The patterns can be expressed using the blueprints of the constructed relevant features. These blueprints can then be used for real-time classification of other sequences.
Digital light microscopy techniques are among the most widely used methods in cell biology and medical research. Despite that, the automated classification of objects such as cells or specific parts of tissues in images is difficult. We present an approach to classify confluent cell layers in microscopy images by learned deep correlation features using deep neural networks. These deep correlation features are generated through the use of gram-based correlation features and are input to a neural network that learns the correlations between them. In this work, we investigate whether a representation of cell data based on these features is suitable for classification, as has been done for artworks with respect to their artistic period. The method generates images that contain recognizable characteristics of a specific cell type, for example, the average size and the ordered pattern.
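The gram-based correlation features mentioned above can be illustrated with a minimal sketch; the feature maps here are random stand-ins for real convolutional activations of a microscopy image:

```python
import numpy as np

def gram_features(feature_maps):
    # feature_maps: (channels, height, width) activations of one conv layer;
    # the gram matrix holds channel-to-channel correlations, normalized
    # by the number of spatial positions
    c, h, w = feature_maps.shape
    f = feature_maps.reshape(c, h * w)
    return (f @ f.T) / (h * w)

# random stand-in for real activations
fm = np.random.default_rng(0).normal(size=(4, 8, 8))
g = gram_features(fm)   # (4, 4), symmetric
```

Because the gram matrix discards spatial position and keeps only channel co-occurrence, it captures texture-like statistics, which is what makes it useful for style/period classification of artworks and, here, for cell-layer characteristics.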
Process quality has reached a high level in mass production, utilizing well-known methods such as design of experiments (DoE). The drawback of the underlying statistical methods is the need for tests under real production conditions, which cause high costs due to the lost output. Research over the last decade has led to methods for correcting a process by using in-situ data to adjust the process parameters, but a lot of pre-production is still necessary to get this working. This paper presents a new approach to improving product quality in process chains by using context data - which in part are gathered using Industry 4.0 devices - to reduce the necessary pre-production.
Software evolvability is an important quality attribute, yet one difficult to grasp. A certain base level of it is allegedly provided by service- and microservice-based systems, but many software professionals lack a systematic understanding of the reasons and preconditions for this. We address this issue via the proxy of architectural modifiability tactics. By qualitatively mapping principles and patterns of Service-Oriented Architecture (SOA) and microservices onto these tactics and analyzing the results, we not only generate insights into service-oriented evolution qualities but also provide a modifiability comparison of the two popular service-based architectural styles. The results suggest that both SOA and microservices possess several inherent qualities beneficial for software evolution. While both focus strongly on loose coupling and encapsulation, there are also differences in the way they strive for modifiability (e.g. governance vs. evolutionary design). To leverage the insights of this research, however, it is necessary to find practical ways to incorporate the results as guidance into the software development process.
Vehicles have so far been improved in terms of energy efficiency and safety mainly by optimising the engine and the power train. However, there are opportunities to increase energy efficiency and safety by adapting the individual driving behaviour to the given driving situation. In this paper, an improved rule match algorithm is introduced, which is used in the expert system of a human-centred driving system. The goal of the driving system is to optimise the driving behaviour in terms of energy efficiency and safety by giving recommendations to the driver. The improved rule match algorithm checks the incoming information against the driving rules to recognise any violation of a driving rule. The needed information is obtained by monitoring the driver, the current driving situation, and the car, using in-vehicle sensors and serial-bus systems. Based on the detected rule violations, the expert system creates individual recommendations in terms of energy efficiency and safety, which allow eliminating bad driving habits while considering the driver's needs.
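A rule match of this kind can be sketched as a simple condition check over monitored facts; the rule names, thresholds, and fact keys below are hypothetical, not taken from the paper:

```python
def check_rules(rules, facts):
    # return the names of all driving rules whose condition fires
    return [name for name, condition in rules if condition(facts)]

# hypothetical driving rules over monitored sensor facts
rules = [
    ("harsh_acceleration", lambda f: f["accel_mps2"] > 3.0),
    ("speeding",           lambda f: f["speed_kmh"] > f["limit_kmh"]),
    ("engine_lugging",     lambda f: f["rpm"] < 1200 and f["gear"] >= 4),
]
facts = {"accel_mps2": 3.5, "speed_kmh": 48, "limit_kmh": 50,
         "rpm": 2500, "gear": 3}
violated = check_rules(rules, facts)   # -> ["harsh_acceleration"]
```

Each detected violation would then be mapped to a driver recommendation (e.g. accelerate more gently), which is the expert system's output.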
The coupling of the heat and power sectors is required, as supply and demand in the German electricity mix drift further and further apart with a high percentage of renewable energy. Heat pumps in combination with thermal energy storage systems can be a useful way to couple the heat and power sectors. This paper presents a hardware-in-the-loop test bench for the experimental investigation of optimized control strategies for heat pumps. 24-hour experiments are carried out to test whether the heat pump is able to serve optimized schedules generated by a MATLAB algorithm. The results show that the heat pump is capable of following the generated schedules, and the maximum deviation of the operational time between schedule and experiment is only 3%. Additionally, the system can serve the demand for space heating and domestic hot water (DHW) at any time.
The paper explains a workflow to simulate the food energy water (FEW) nexus for an urban district combining various data sources such as 3D city models, particularly the City Geography Markup Language (CityGML) data model from the Open Geospatial Consortium, OpenStreetMap, and census data. A long-term vision is to extend the CityGML data model by developing a FEW Application Domain Extension (FEW ADE) to support future FEW simulation workflows such as the one explained in this paper. Together with the mentioned simulation workflow, this paper also identifies some necessary FEW-related parameters for the future development of a FEW ADE. Furthermore, relevant key performance indicators are investigated, and the datasets necessary to calculate these indicators are studied. Finally, different calculations are performed for the downtown borough Ville-Marie in the city of Montréal (Canada) for the domains of food waste (FW) and wastewater (WW) generation. For this study, a workflow is developed to calculate the energy generation from anaerobic digestion of FW and WW. In the first step, data collection and preparation were carried out: relevant data for georeferencing, for model set-up, and for creating the required usage libraries, such as food waste and wastewater generation per person, were collected. The next step was the data integration and calculation of the relevant parameters, and lastly, the results were visualized for analysis purposes. As a use case to support such calculations, the CityGML level of detail 2 model of Montréal is enriched with information such as building functions and building usages from OpenStreetMap. The calculation of the total residents based on the CityGML model as the main input for Ville-Marie results in a population of 72,606. The statistical value for 2016 was 89,170, which corresponds to a deviation of 15.3%.
The energy recovery potential of FW is about 24,024 GJ/year, and that of wastewater is about 1,629 GJ/year, adding up to 25,653 GJ/year. Relating these values to the calculated number of inhabitants in Ville-Marie results in 330.9 MJ/year for FW and 22.4 MJ/year for wastewater, respectively.
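The per-capita figures follow directly from dividing the annual totals by the calculated population; a quick check with the numbers stated above shows that the division yields values in megajoules per inhabitant and year:

```python
# annual energy recovery totals (GJ) and calculated population (study values)
fw_gj, ww_gj, population = 24024, 1629, 72606

total_gj = fw_gj + ww_gj                        # combined recovery potential
per_capita_fw_mj = fw_gj * 1000 / population    # GJ -> MJ, per inhabitant
per_capita_ww_mj = ww_gj * 1000 / population
```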
Avatars are used when interacting in virtual environments in different contexts: in collaborative work, in gaming, and in virtual meetings with friends. It is therefore important to understand how the relationship between user and avatar works. In this study, an online survey is used to determine how the perception of an avatar changes across contexts by relating it to existing avatar relationship typologies. Additionally, it is determined whether in each context a realistic, abstract, or comic-like representation is preferred by the participants. One result was a preference for low-poly representations in the work context, which are associated with the perception of the avatar as a tool. In the context of meeting friends, a realistic representation is perceived as more appropriate, which is perceived as an accurate self-representation. In the gaming context, the results are less clear, which can be attributed to different gaming preferences. Here, unlike in the other contexts, a comic-like representation is also perceived as appropriate, which is associated with the perception of the avatar as a friend. A symbiotic user-avatar relationship is not directly related to any form of representation but always falls in the middle range, which is attributed to the fact that it represents a whole spectrum between the other categories.
Going forward with the requirements of missions to the Moon and further into deep space, the European Space Agency is investigating new methods of astronaut training that can help accelerate learning, increase availability, and reduce complexity and cost in comparison to currently used methods. To achieve this, technologies such as virtual reality may be utilized. In this paper, the benefits of using virtual reality for extravehicular activity training are investigated in comparison to conventional training methods such as neutral buoyancy pools. To determine the requirements and current uses of virtual reality for extravehicular activity training, first-hand tests of currently available software as well as expert interviews are used. With this knowledge, a concept is developed that may be used to further advance training methods in virtual reality. The resulting concept is used as a basis for developing a prototype that showcases user interactions and locomotion in microgravity simulations.
The stimulation of user engagement has received significant attention in extant research. However, the theory of antecedents for user engagement with an initial electronic word-of-mouth (eWoM) communication is relatively less developed. In an investigation of 576 unique user postings across independent Facebook (FB) communities for two German firms, we contribute to the extant knowledge on user engagement in two different ways. First, we explicate senders’ prior usage experience and the extent of their acquaintance with other community members as the two key drivers of user engagement across a product and a service community. Second, we reveal that these main effects differ according to the type of community. In service communities, experience has a stronger impact on user engagement; whereas, in product communities, acquaintance is more important.
Managerial accountants spend a large part of their working time on more operational activities in cost accounting, reporting, and operational planning and budgeting. In all these areas, there has been increasing discussion in recent years, both in theory and practice, about using more digital technologies. For reporting, this means not only an intensified discussion of technologies such as RPA and AI but also more intensive changes to existing reporting systems. In particular, management information systems (MIS), which are maintained by managerial accountants and used by managers for corporate management, should be mentioned here. Based on an empirical survey in a large German company, this article discusses the requirements and assessments of users when switching from a regular MIS to a cloud-based system.
To correctly assess the cleanliness of technical surfaces in a production process, corresponding online monitoring systems must provide sufficient data. A promising method for fast, large-area, and non-contact monitoring is hyperspectral imaging (HSI), which was used in this paper for the detection and quantification of organic surface contaminations. Depending on the cleaning parameter constellation, different levels of organic residues remained on the surface. Afterwards, the cleanliness was determined via the carbon content in atomic percent on the sample surfaces, characterized by XPS and AES. The HSI data and the XPS measurements were correlated using machine learning methods to generate a predictive model for the carbon content of the surface. The regression algorithms elastic net, random forest regression, and support vector machine regression were used. Overall, the developed method was able to quantify organic contaminations on technical surfaces. The best regression model found was a random forest model, which achieved an R2 of 0.7 and an RMSE of 7.65 At.-% C. Due to the easy-to-use measurement and the fast evaluation by machine learning, the method seems suitable for an online monitoring system. However, the results also show that further experiments are necessary to improve the quality of the prediction models.
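The regression step can be sketched as follows. The dataset here is entirely synthetic (reflectance decreasing linearly with a simulated carbon content), so it only illustrates the fitting and evaluation workflow, not the real HSI/XPS correlation:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# hypothetical data: 120 spectra (50 bands) whose reflectance
# decreases with the simulated surface carbon content (At.-% C)
carbon = rng.uniform(0, 60, 120)
spectra = 1.0 - 0.01 * carbon[:, None] + 0.02 * rng.normal(size=(120, 50))

X_tr, X_te, y_tr, y_te = train_test_split(spectra, carbon, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)
r2 = r2_score(y_te, pred)
rmse = mean_squared_error(y_te, pred) ** 0.5   # in the same unit as carbon
```

Reporting R2 and RMSE on a held-out split, as done here, mirrors the evaluation the paper uses to compare elastic net, random forest, and SVM regression.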
This paper examines the efficacy of social media systems in customer complaint handling. The emergence of social media as a useful complement and (possibly) a viable alternative to the traditional channels of service delivery motivates this research. The theoretical framework, developed from the literature on social media and complaint handling, is tested against data collected from two different channels (hotline and social media) of a German telecommunication services provider, in order to gain insights into channel efficacy in complaint handling. We contribute to the understanding of firms' technology usage for complaint handling in two ways:
(a) by conceptualizing and evaluating complaint handling quality across traditional and social media channels and (b) by comparing the impact of complaint handling quality on key performance outcomes such as customer loyalty, positive word-of-mouth, and cross-purchase intentions across traditional and social media channels.
Unsaturated polyester resins (UPR) and vinyl ester resins (VER) are among the most commercially important thermosetting matrix materials for composites. Although comparatively low cost, their technological performance is suitable for a wide range of applications, such as fiber-reinforced plastics, artificial marble or onyx, polymer concrete, or gel coats. The main areas of UPR consumption include the wind energy, marine, pipe and tank, transportation, and construction industries. This chapter discusses basic UPR and VER chemistry and technology of manufacturing, and consequent applications. Some important properties and performance characteristics are discussed, such as shrinkage behavior, flame retardance, and property modification by nanoparticles. Also briefly introduced and described are the practical aspects of UPR and VER processing, with special emphasis on the most widely used technological approaches, such as hand and spray layup, resin infusion, resin transfer molding, sheet and bulk molding, pultrusion, winding, and centrifugal casting.
Here, we study resin cure and network formation of solid melamine formaldehyde pre-polymer over a large temperature range via dynamic temperature curing profiles. Real-time infrared spectroscopy is used to analyze the chemical changes during network formation and network hardening. By applying chemometrics (multivariate curve resolution, MCR), the essential chemical functionalities that constitute the network at a given stage of curing are mathematically extracted and tracked over time. The three spectral components identified by MCR were methylol-rich, ether-linkage-rich, and methylene-linkage-rich resin entities. Based on dynamic changes of their characteristic spectral patterns in dependence on temperature, curing is divided into five phases: (I) a stationary phase with free methylols as the main chemical feature, (II) formation of a flexible network cross-linked by ether linkages, (III) formation of a rigid, ether-cross-linked network, (IV) further hardening via transformation of methylols and ethers into methylene cross-linkages, and (V) network consolidation via transformation of ether into methylene bridges. The presented spectroscopic/chemometric approach can be used as a methodological basis for the functionality design of MF-based surface films at the stage of laminate pressing, i.e., for tailoring the technological property profile of cured MF films using a causal understanding of the underlying chemistry based on molecular markers and spectroscopic fingerprints.
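A bilinear decomposition of this kind can be sketched with non-negative matrix factorization as an MCR-like stand-in; the two Gaussian "component spectra" and the linearly crossing concentration profiles below are invented for illustration:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(3)
# hypothetical time-resolved spectra: 40 mixtures of two Gaussian
# "component spectra" with linearly crossing concentration profiles
axis = np.linspace(0, 1, 80)
pure = np.vstack([np.exp(-((axis - 0.3) / 0.05) ** 2),
                  np.exp(-((axis - 0.7) / 0.05) ** 2)])
conc = np.vstack([np.linspace(1, 0, 40), np.linspace(0, 1, 40)]).T
data = conc @ pure + 0.01 * rng.random((40, 80))

# non-negative bilinear decomposition: data ~= C @ S
model = NMF(n_components=2, init="nndsvd", random_state=0, max_iter=500)
C = model.fit_transform(data)   # concentration-like profiles over time
S = model.components_           # resolved component spectra
```

Tracking the rows of C over the curing run is the chemometric analogue of following the methylol-, ether-, and methylene-rich entities through the five phases described above.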
Unraveling the double-edged sword : effects of cultural diversity on creativity and innovativeness
(2014)
Cultural diversity is considered a “double-edged sword” (Kravitz, 2005), as research on its effects on team performance regularly delivers inconsistent and contradictory results. This paper attempts to unravel the double-edged sword by discerning different forms of cultural diversity: separation and variety (Harrison & Klein, 2007). Based on a review of the literature, a conceptual model is developed hypothesizing that cultural variety yields positive, while cultural separation yields negative, effects on team creativity and innovativeness. In addition, the effects of national diversity are contrasted to test whether national diversity can serve as a proxy for cultural diversity, as is often practiced. The model is tested on a sample of 113 student teams in entrepreneurship modules at 4 European universities. Cultural diversity is measured directly on the basis of individual team members’ cultural value orientations by means of the CPQ4 (Maznevski, DiStefano, Gomez, Noorderhaven & Wu, 2002). Data is analyzed using the PLS structural equation modeling technique. The results confirm the hypothesized impacts of cultural variety and separation on creativity but do not deliver evidence for impacts on innovativeness. The same is true for national diversity. Interestingly, national diversity does not show any relation to either form of cultural diversity.
Forecasting demand is challenging. Various products exhibit different demand patterns. While demand may be constant and regular for one product, it may be sporadic for another, and when demand does occur, it may fluctuate significantly. Forecasting errors are costly and result in obsolete inventory or unsatisfied demand. Methods from statistics, machine learning, and deep learning have been used to predict such demand patterns. Nevertheless, it is not clear which algorithm achieves the best forecast for which demand pattern. Therefore, even today, a large number of models are run on a test period, and the model with the best result on the test period is used for the actual forecast. This approach is computationally and time intensive and, in most cases, uneconomical. In our paper, we show the possibility of using a machine learning classification algorithm that predicts the best possible model based on the characteristics of a time series. The approach was developed and evaluated on a dataset from a B2B technical retailer. The machine learning classification algorithm achieves a mean ROC-AUC of 89%, which demonstrates the skill of the model.
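The idea of predicting the best model from time-series characteristics can be sketched as follows. The features, the two demand regimes, and the model labels ("ets" for smooth demand, "croston" for intermittent demand) are invented stand-ins, not the paper's actual feature set:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def series_features(y):
    # simple hypothetical time-series characteristics
    y = np.asarray(y, dtype=float)
    cov = y.std() / (abs(y.mean()) + 1e-9)          # variability
    zero_share = float((y == 0).mean())             # intermittency
    trend = float(np.polyfit(np.arange(len(y)), y, 1)[0])
    return [cov, zero_share, trend]

rng = np.random.default_rng(2)
X, labels = [], []
for _ in range(200):
    if rng.random() < 0.5:
        # smooth, regular demand -> smoothing-type models tend to win
        y = 50 + 5 * rng.normal(size=36)
        labels.append("ets")
    else:
        # sporadic, intermittent demand -> Croston-style methods tend to win
        y = rng.poisson(0.4, size=36) * rng.integers(5, 20, size=36)
        labels.append("croston")
    X.append(series_features(y))

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
```

A new series is then assigned a model family from its features alone, avoiding the expensive benchmark of every candidate model on a test period.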
The United Nations (UN) Global Compact is a call to companies to align their strategies and operations with ten universal principles in the areas of human rights, labor, environment, and anti-corruption, and to take actions that advance societal goals (UN Global Compact 2017, p. 3). The UN Global Compacts’ vision is “to mobilize a global movement of sustainable companies and stakeholder to create the world we want” (UN Global Compact 2021a). It is a global network with local presence all around the world.
Engineering of large vascularized adipose tissue constructs is still a challenge for the treatment of extensive high-degree burns or the replacement of tissue after tumor removal. Communication between mature adipocytes and endothelial cells is important for homeostasis and the maintenance of adipose tissue mass but, to date, is mainly neglected in tissue engineering strategies. Thus, new coculture strategies are needed to integrate adipocytes and endothelial cells successfully into a functional construct. This review focuses on the cross-talk of mature adipocytes and endothelial cells and considers their influence on fatty acid metabolism and vascular tone. In addition, the properties and challenges with regard to these two cell types for vascularized tissue engineering are highlighted.
Strategic alliances have become important strategic options for firms to achieve competitive advantage. Yet, there are many examples of alliance failures. Scholars have studied this phenomenon and identified many reasons for alliance failure, including lack of trust between the partnering firms. Paradoxically, the concept of trust is still not fully understood, specifically how and under what conditions trust breaks down within the broader process of alliance building. We synthesize a process model that describes the “alliance capability”, including trust, openness, partner contributions, and relational rents. We then translate this framework into a formal simulation model and analyze it thoroughly. In analyzing trust dynamics, we identify and explore a tipping boundary separating a regime of alliance failures from one of successes. We apply our core findings to openness strategies - decisions about how much knowledge to share with partners. Our analyses reveal that strategies informed by a static mental model of trust, contributions, and openness undervalue openness. Further, too little openness risks early failure due to being trapped in a vicious cycle of trust depletion.
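The tipping behavior can be illustrated with a deliberately minimal feedback sketch; the functional forms and coefficients are invented for illustration and are not taken from the model in the paper:

```python
def simulate_alliance(initial_trust, openness, steps=60):
    # minimal feedback loop: trust -> contributions -> relational rents -> trust
    trust = initial_trust
    for _ in range(steps):
        contributions = openness * trust   # partners share in proportion to trust
        rents = contributions - 0.25       # payoff net of a fixed cooperation cost
        trust = min(1.0, max(0.0, trust + 0.3 * rents))
    return trust

# sweeping initial trust exposes a tipping boundary: alliances starting above
# it lock into a virtuous cycle, those below are trapped in trust depletion
success = simulate_alliance(0.5, openness=0.8)   # converges towards 1.0
failure = simulate_alliance(0.2, openness=0.8)   # converges towards 0.0
```

Even this toy loop shows why a static view undervalues openness: raising openness lowers the trust level needed to cross the boundary, an effect only visible in the dynamics.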
This research addresses the question of why employees use enterprise social networks (ESN). Against the background of technology acceptance research, we propose an extended unified theory of acceptance and use of technology (UTAUT) model, adapt it to an ESN context, and test our model against data from ESN users of large and medium-sized enterprises. We use partial least squares structural equation modeling to gain insights into the determinants of ESN use. This paper contributes to ESN acceptance research by evaluating a model containing determinants of ESN use. It also examines the effects of determinants on five different usage dimensions of ESN. The results reveal that facilitating conditions are the main driver of ESN use while the impact of intention to use is comparably small. Implications for theory and practice are discussed.
The unprecedented acceleration in the dynamics of economic development and its dependence on global interactions makes predicting the future especially difficult. Nevertheless, an examination of long-term trends provides an opportunity to begin a discussion about what reality could await us tomorrow and how we want to deal with it. With this food-for-thought paper, the member institutes of the Fraunhofer Group for Innovation Research wish to present a selection of the trends that are destined to have a significant impact on innovation systems in the period leading up to 2030. Based on these trends, the paper derives theses for innovation in the year 2030 and describes the resulting tasks for business, politics, science and society.
The Principles for Responsible Investments (PRI) is “the world’s leading proponent of responsible investment” (PRI 2021a). With the development of six Principles for Responsible Investment, the PRI supports its international network of investor signatories in incorporating the environmental, social, and governance (ESG) factors into their investment and ownership decisions. The goal of PRI is to develop a more sustainable global financial system by encouraging “investors to use responsible investment to enhance returns and better manage risks” (PRI 2021a). This independent financial initiative is supported by the United Nations and linked to the United Nations Environmental Program Finance Initiative (UNEP FI 2021) and the United Nations Global Compact (UN Global Compact 2021).
Different types of raw cotton were investigated with a commercial ultraviolet-visible/near-infrared (UV-Vis/NIR) spectrometer (210–2200 nm) as well as with a home-built setup for NIR hyperspectral imaging (NIR-HSI) in the range 1100–2200 nm. UV-Vis/NIR reflection spectroscopy reveals the dominant role that proteins, hydrocarbons, and hydroxyl groups play in the structure of cotton; NIR-HSI shows a similar result. The experimentally obtained data, in combination with principal component analysis (PCA), provide a general differentiation of the cotton types. For UV-Vis/NIR spectroscopy, the first two principal components (PCs) represent 82 % and 78 % of the total data variance for the UV-Vis and NIR regions, respectively. For NIR-HSI, owing to the large amount of data acquired, two data-processing methodologies were applied, at low and at high lateral resolution: in the first, the average spectrum of each sample was calculated, and in the second, the spectra of all individual pixels were used. Both methods are able to explain ≥90 % of the total variance with the first two PCs. The results show that it is possible to distinguish between different cotton types based on a few selected wavelength ranges. The combination of HSI and multivariate data analysis has strong potential for industrial applications due to its short acquisition time and low development cost. This study opens a novel path for the further development of this technique towards real large-scale processes.
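The differentiation step can be sketched numerically. The snippet below is a minimal illustration, assuming entirely synthetic "spectra" of two hypothetical cotton types (Gaussian absorption bands plus noise, not measured data) and performing PCA via a singular value decomposition:

```python
import numpy as np

rng = np.random.default_rng(0)
wavelengths = np.linspace(1100, 2200, 300)       # nm, NIR range

def band(center, width):
    """Gaussian absorption band on the wavelength grid."""
    return np.exp(-((wavelengths - center) / width) ** 2)

# two hypothetical cotton types, 20 noisy spectra each
type_a = band(1450, 60) + 0.6 * band(1940, 80)
type_b = 0.7 * band(1450, 60) + band(2100, 70)
X = np.vstack([t + 0.02 * rng.standard_normal(wavelengths.size)
               for t in [type_a] * 20 + [type_b] * 20])

# PCA via SVD on the mean-centered spectra
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)
scores = Xc @ Vt[:2].T                           # first two PC scores

print(f"PC1+PC2 explain {100 * explained[:2].sum():.1f}% of variance")
print("type A mean PC1:", scores[:20, 0].mean().round(3))
print("type B mean PC1:", scores[20:, 0].mean().round(3))
```

With well-separated band structures, the two types fall on opposite sides of the first principal component, mirroring the kind of score-plot separation the study reports for real cotton spectra.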
Ultra wideband real-time locating system for tracking people and devices in the operating room
(2022)
Position tracking within the OR could be one possible input for intraoperative situation recognition. Our approach demonstrates a real-time locating system (RTLS) using ultra-wideband (UWB) technology to determine the position of people or objects. The UWB RTLS was integrated into the research OR at Reutlingen University, and the system’s settings were optimized with regard to four factors: accuracy, susceptibility to interference, range, and latency. To this end, different parameters were adapted and their effects on these factors were compared. Good tracking quality could be achieved under optimal settings. These results indicate that a UWB RTLS is well suited to determine the position of people and devices in our setting. The feasibility of the system still needs to be evaluated under real OR conditions.
In modern collaborative production environments, where industrial robots and humans are supposed to work hand in hand, it is mandatory to observe the robot’s workspace at all times. Such observation is even more crucial when the robot’s main position is itself dynamic, e.g., because the system is mounted on a movable platform. As current solutions, such as physically secured areas in which a robot can perform actions potentially dangerous to humans, become unfeasible in such scenarios, novel, more dynamic, and situation-aware safety solutions need to be developed and deployed.
This thesis mainly contributes to the bigger picture of such a collaborative scenario by presenting a data-driven, convolutional neural network-based approach to estimate the two-dimensional kinematic-chain configuration of industrial robot arms within raw camera images. This thesis also provides the information needed to generate and organize the mandatory data basis and presents the frameworks that were used to realize all involved subsystems. The robot arm’s extracted kinematic chain can also be used to estimate the extrinsic camera parameters relative to the robot’s three-dimensional origin. Further, a tracking system based on a two-dimensional kinematic-chain descriptor is presented, which allows for the accumulation of a proper movement history and thus enables the prediction of future target positions within the given image plane. The combination of the extracted robot pose with a simultaneous human pose estimation system delivers a consistent data flow that can be used in higher-level applications.
This thesis also provides a detailed evaluation of all involved subsystems and gives a broad overview of their particular performance, based on newly generated, semi-automatically annotated real datasets.
Motor-based theories of facial expression recognition propose that the visual perception of a facial expression is aided by sensorimotor processes that are also used for the production of the same expression. Accordingly, sensorimotor and visual processes should provide congruent emotional information about a facial expression. Here, we report evidence that challenges this view. Specifically, the repeated execution of facial expressions has the opposite effect on the recognition of a subsequent facial expression to that of the repeated viewing of facial expressions. Moreover, the findings of the motor condition, but not of the visual condition, were correlated with a nonsensory condition in which participants imagined an emotional situation. These results are well accounted for by the idea that facial expression recognition is not always mediated by motor processes but can also rely on visual information alone.
Online measurement of drug concentrations in a patient's breath is a promising approach for individualized dosing. A direct transfer from breath to blood concentrations is not possible: in non-steady-state situations, measured exhaled concentrations follow the blood concentration with a delay. It is therefore necessary to integrate the breath concentration into a pharmacological model. Two different approaches for pharmacokinetic modelling are presented. Usually, a 3-compartment model is used for pharmacokinetic calculations of blood concentrations. In the first approach, this 3-compartment model is extended with a 2-compartment model based on the first compartment of the 3-compartment model and a new lung compartment. The second approach is to calculate a time delay of concentration changes in the first compartment to describe the lung concentration. As an example, both approaches are applied to the modelling of exhaled propofol. Based on a time series of exhaled propofol measurements, acquired with an ion-mobility spectrometer every minute for 346 min, the correlation between calculated plasma and breath concentrations was used for model fitting, yielding R2 = 0.99. For the time-delay approach, the new compartment coefficient ke0lung was calculated as ke0lung = 0.27 min−1 with R2 = 0.96. The described models are not limited to propofol; they could be used for any drug that is measurable in a patient's breath.
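The time-delay approach can be sketched as a first-order lag of the central (plasma) compartment, using the reported ke0lung = 0.27 min−1; the plasma profile below is a hypothetical mono-exponential decay for illustration, not patient data:

```python
import numpy as np

# dC_lung/dt = ke0 * (C_plasma - C_lung), integrated by explicit Euler
ke0 = 0.27                            # min^-1, reported lung coefficient
dt = 1.0                              # min, one breath sample per minute
t = np.arange(0, 346, dt)             # 346 min of measurements, as in the study
c_plasma = 4.0 * np.exp(-0.01 * t)    # hypothetical plasma concentration

c_lung = np.zeros_like(t)
for i in range(1, t.size):
    c_lung[i] = c_lung[i - 1] + dt * ke0 * (c_plasma[i - 1] - c_lung[i - 1])

# the lung concentration lags behind plasma at first, then tracks it closely
print(f"t=10 min:  plasma {c_plasma[10]:.2f}, lung {c_lung[10]:.2f}")
print(f"t=300 min: plasma {c_plasma[300]:.2f}, lung {c_lung[300]:.2f}")
```

The lag visible in the first minutes is exactly the non-steady-state delay the abstract describes; once equilibrated, breath and plasma concentrations move together.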
Twitter and citations
(2023)
Social media, especially Twitter, plays an increasingly important role among researchers in showcasing and promoting their research. Does Twitter affect academic citations? Making use of Twitter activity about columns published on VoxEU, a renowned online platform for economists, we develop an instrumental variable strategy to show that Twitter activity about a research paper has a causal effect on the number of citations that this paper will receive. We find that the existence of at least one tweet, as opposed to none, increases citations by 16-25%. Doubling overall Twitter engagement boosts citations by up to 16%.
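The mechanics of such an instrumental-variable strategy can be sketched with two-stage least squares on fully synthetic data; the instrument z, endogenous regressor x (think tweet activity), and outcome y (think citations) below are illustrative stand-ins, not the paper's data or instrument:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
z = rng.standard_normal(n)                  # instrument (exogenous)
u = rng.standard_normal(n)                  # unobserved confounder
x = 0.8 * z + u + rng.standard_normal(n)    # endogenous regressor
y = 0.5 * x + u + rng.standard_normal(n)    # true causal effect: 0.5

def ols_slope(a, b):
    """Slope of an OLS regression of b on a (with intercept)."""
    A = np.column_stack([np.ones_like(a), a])
    return np.linalg.lstsq(A, b, rcond=None)[0][1]

naive = ols_slope(x, y)              # biased upward by the confounder u

# stage 1: fit x on z; stage 2: regress y on the fitted values
b1 = ols_slope(z, x)
x_hat = x.mean() + b1 * (z - z.mean())
iv = ols_slope(x_hat, y)             # consistent estimate of the effect

print(f"naive OLS slope: {naive:.2f} (biased)")
print(f"2SLS slope:      {iv:.2f} (close to the true 0.5)")
```

Because the instrument shifts x without touching the confounder, the second-stage slope recovers the causal effect that a naive regression overstates.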
Turning students into Industry 4.0 entrepreneurs: design and evaluation of a tailored study program
(2022)
Startups in the field of Industry 4.0 could be a huge driver of innovation for many industry sectors such as manufacturing. However, there is a lack of education programs to ensure a sufficient number of well-trained founders and thus a supply of such startups. Therefore, this study presents the design, implementation, and evaluation of a university course tailored to the characteristics of Industry 4.0 entrepreneurship. Educational design-based research was applied, with a focus on content and teaching concept. The study program was first implemented in 2021 at a German university of applied sciences with 25 students, of whom 22 participated in the evaluation. The evaluation was conducted with a pretest–posttest design targeting three areas: (1) knowledge about the application domain, (2) entrepreneurial intention, and (3) psychological characteristics. Entrepreneurial intention was measured based on the theory of planned behavior; for psychological characteristics, personality traits associated with entrepreneurship were used. Within the study context and given its limited external validity, the results show that a university course can improve participants' knowledge of this particular area. In addition, perceived behavioral control with regard to starting an Industry 4.0 startup was enhanced. However, the results showed no significant effects on psychological characteristics.
Strong optical mode coupling between two adjacent λ/2 Fabry-Pérot microresonators consisting of three parallel silver mirrors is investigated experimentally and theoretically as a function of their detuning and coupling strength. Mode coupling can be precisely controlled by tuning the mirror spacing of one resonator with respect to the other by piezoelectric actuators. Mode splitting, anti-crossing and asymmetric modal damping are observed and theoretically discussed for the symmetric and antisymmetric supermodes of the coupled system. The spectral profile of the supermodes is obtained from the Fourier transform of the numerically calculated time evolution of the individual resonator modes, taking into account their resonance frequencies, damping and coupling constants, and is in excellent agreement with the experiments. Our microresonator design has potential applications for energy transfer between spatially separated quantum systems in micro optoelectronics and for the emerging field of polaritonic chemistry.
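The described procedure of evolving the coupled-mode equations in time and Fourier-transforming the result can be sketched numerically; all parameter values below are illustrative choices, not the experimental ones:

```python
import numpy as np

# coupled-mode equations da/dt = M a for two damped resonator amplitudes
w1 = w2 = 2 * np.pi * 1.0      # identical resonance frequencies (zero detuning)
g1 = g2 = 0.02                 # modal damping rates
k = 2 * np.pi * 0.05           # coupling constant

dt, n = 1e-3, 2**18
t = np.arange(n) * dt
M = np.array([[1j * w1 - g1, 1j * k],
              [1j * k, 1j * w2 - g2]])

# exact time evolution via eigen-decomposition of the linear system
vals, vecs = np.linalg.eig(M)
c = np.linalg.solve(vecs, np.array([1.0, 0.0]))   # excite resonator 1 only
a1 = (vecs[0] * c * np.exp(np.outer(t, vals))).sum(axis=1)

spec = np.abs(np.fft.fft(a1)) ** 2
freqs = np.fft.fftfreq(n, dt)

f_modes = np.sort(vals.imag) / (2 * np.pi)        # supermode frequencies
print(f"mode splitting: {f_modes[1] - f_modes[0]:.3f} (expected {k / np.pi:.3f})")

def power_at(f):
    """Spectral power in the FFT bin closest to frequency f."""
    return spec[np.argmin(np.abs(freqs - f))]

# the spectrum shows two split peaks with a dip at the bare resonance
print(power_at(f_modes[1]) > 10 * power_at(1.0))
```

At zero detuning the symmetric and antisymmetric supermodes split by twice the coupling constant, and the FFT of the time trace reproduces the two-peaked spectral profile described in the abstract.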
Coopetitive endeavors offer valuable strategic options for firms. Yet, many of them are failure-prone, as partners must balance collective and private interests. While interpartner trust is considered central to alliance success, paradoxically, the role and dynamics of trust are still not well understood. We synthesize a computational model capturing the relational dynamics of an alliance, encompassing the coevolution of trust, partner contributions, and (relative) alliance interactions. Analyzing alliance dynamics by simulation, we find and explore a tipping boundary separating regimes of alliance failure and success. We identify implications for collaborative strategies (aspirations) and private strategies (openness). Our analyses reveal that strategies informed by a static mental model of partner trust, contributions, and openness tend to yield subpar alliance results and hidden failure risk. We discuss implications for management theory.
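The kind of tipping behavior described can be illustrated with a deliberately simple toy dynamical system in which trust and partner contributions reinforce each other; this is an intuition-building sketch only, not the authors' computational model:

```python
import numpy as np

def simulate(trust0, steps=200, alpha=0.1):
    """Evolve trust: contributions mirror trust, payoff rewards it above 0.5."""
    trust = trust0
    for _ in range(steps):
        contribution = trust             # partners contribute what they trust
        payoff = contribution - 0.5      # cooperation pays off above 0.5
        trust = float(np.clip(trust + alpha * payoff, 0.0, 1.0))
    return trust

low, high = simulate(0.45), simulate(0.55)
print(f"initial trust 0.45 -> final {low:.2f} (collapse)")
print(f"initial trust 0.55 -> final {high:.2f} (lock-in)")
```

The unstable fixed point at 0.5 acts as a tipping boundary: trajectories starting just below it spiral into failure, while those just above lock into cooperation, which is the qualitative regime separation the simulation analysis explores.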
Customer orientation should be the core engine of every organisation, while IT can be considered the enabler for generating competitive advantages along customer processes in marketing, sales, and service. Research shows that customer relationship management (CRM) enables organisations to perform better, and experience indicates that organisations that focus on customer orientation are more successful. With marketplace organisations such as Amazon, Alibaba, or Conrad shaping the future of customer centricity and information technology, German B2B organisations need to shift their value contribution from product-centric to customer-centric. While these organisations are currently attempting to implement CRM software and to put their customers more into focus, the question remains how organisations approach the implementation of CRM and whether these attempts pay off in terms of business performance.
Thin, flat textile roofing offers negligible heat insulation. In warm regions, such roofing membranes are therefore equipped with metallized surfaces that reflect solar heat radiation, thus reducing the warming inside a textile building. However, the heat-reflection effect of metallic coatings is always accompanied by shading, as metals are non-transparent to visible light (VIS). Transparent conductive oxides (TCOs), which are widely used in the display industry, are transparent to VIS and able to reflect heat radiation in the infrared. To achieve the perfect coatings needed for electronic devices, TCOs are commonly applied using costly vacuum processes at high temperatures; both the costs and the processing temperatures make such processes impractical for textiles. Accepting that heat-reflecting textile membranes demand less perfect coatings, a wet-chemical approach was followed here to produce transparent heat-reflecting coatings. Commercially available TCOs were employed as colloidal dispersions or nanopowders to prepare sol-gel-based coating systems. Such coatings were applied to membranes as used for architectural textiles, using simple coating techniques and moderate curing temperatures not exceeding 130 °C. The coatings achieved about 90% transmission in the VIS spectrum and reduced near-infrared transmission (at about 2.5 µm) to nearly zero while reflecting up to 25% of that radiation. Up to 35% reflection was realized in the far infrared, and emissivity values down to ε = 0.5777 were measured.
In the era of precision medicine, digital technologies and artificial intelligence, drug discovery and development face unprecedented opportunities for product and business model innovation, fundamentally changing the traditional approach of how drugs are discovered, developed and marketed. Critical to this transformation is the adoption of new technologies in the drug development process, catalyzing the transition from serendipity-driven to data-driven medicine. This paradigm shift comes with a need for both translation and precision, leading to a modern Translational Precision Medicine approach to drug discovery and development. Key components of Translational Precision Medicine are multi-omics profiling, digital biomarkers, model-based data integration, artificial intelligence, biomarker-guided trial designs and patient-centric companion diagnostics. In this review, we summarize and critically discuss the potential and challenges of Translational Precision Medicine from a cross-industry perspective.
Context: Companies need capabilities to evaluate the customer value of software-intensive products and services. One way of systematically acquiring data on customer value is running continuous experiments as part of the overall development process. Objective: This paper investigates the first steps of transitioning towards continuous experimentation in a large company, including the challenges faced. Method: We conduct a single-case study using participant observation, interviews, and qualitative analysis of the collected data. Results: The results show that continuous experimentation was well received by the practitioners, and practising experimentation helped them enhance their understanding of their product's value and user needs. Although the complexities of a large multi-stakeholder business-to-business (B2B) environment presented several challenges, such as inaccessible users, it was possible to address these impediments and integrate an experiment into an ongoing development project. Conclusion: Developing the capability for continuous experimentation in large organisations is a learning process that can be supported by a systematic introduction approach under the guidance of experts. We gained experience by introducing the approach on a small scale in a large organisation, and one of the major steps for future work is to understand how this can be scaled up to the whole development organisation.
Delivering value to customers in real time requires companies to deploy software in real time, exposing features to users faster and shortening the feedback loop. This allows for faster reaction and helps to ensure that development is focused on features providing real value. Continuous delivery is a development practice in which software functionality is deployed continuously to the customer environment. Although this practice is established in some domains, such as B2C mobile software, the B2B domain imposes specific challenges. This article presents a case study conducted in a medium-sized software company operating in the B2B domain. The objective of this study is to analyze the challenges and benefits of continuous delivery in this domain. The results suggest that technical challenges are only one part of the challenges a company encounters in this transition; the company must also address challenges related to the customer and to procedures. The core challenges are caused by having multiple customers with diverse environments and unique properties, whose business depends on the software product: some customers require manual acceptance testing, while others are reluctant to adopt new versions. By utilizing continuous delivery, it is possible for the case company to shorten feedback cycles, increase the reliability of new versions, and reduce the resources required for deploying and testing new releases.