This article proposes several modified quasi-Z-source dc/dc boost converters. These achieve soft switching by means of a clamp-switch network, consisting of an active switch and a diode in parallel with a capacitor, connected across one of the inductors of the Z-source network. In this way, ringing at the transistor switching node is mitigated and the voltage at transistor turn-on is reduced. Zero-voltage switching (ZVS) of the main transistor is even possible if the capacitor in the clamp-switch network is chosen adequately. The proposed circuit structure and operating mode are described and validated through simulations and measurements on a low-power prototype.
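As background for the voltage-boosting behavior described above, the ideal steady-state gain of a standard (unmodified) quasi-Z-source boost stage can be written in terms of the shoot-through duty ratio $D$; the paper's modified, soft-switched variants will deviate from this, so the relation is only an orienting sketch:

```latex
% Ideal boost factor of a standard quasi-Z-source dc/dc stage
% (background relation; D is the shoot-through duty ratio)
\[
  \frac{V_\mathrm{out}}{V_\mathrm{in}} \;=\; B \;=\; \frac{1}{1 - 2D},
  \qquad 0 \le D < \tfrac{1}{2}
\]
```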
Introduction: Although there is a standard procedure for cochlear implant (CI) surgery, surgical steps often differ individually, especially in pediatric surgery, due to anatomical variations, malformations, or unforeseen events. Every surgical report must therefore be created individually, which takes time and relies on the surgeon's memory. A standardized recording of intraoperative data, with subsequent storage and text processing, would therefore be desirable and would provide the basis for downstream data processing, e.g., in the context of research or quality assurance.
Method: In cooperation with Reutlingen University, we conducted a workflow analysis of the prototype of a semi-automatic checklist tool. Based on checklists automatically generated from BPMN models, a prototype user interface was developed for an Android tablet. Functions such as uploading photos and files, manual user entries, the interception of foreseeable deviations from the normal course of the operation, and the automatic creation of the surgical documentation could be implemented. The system was tested in a remote usability test on a petrous bone model.
Result: The user interface allows simple, intuitive handling that integrates well into the intraoperative setting. Clinical data as well as surgical steps could be individually recorded and saved via DICOM. An automatic surgery report could be created and saved.
Summary: The use of a dynamic checklist tool facilitates the capture, storage and processing of surgical data. Further applications in clinical practice are pending.
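To illustrate the checklist idea described in the Method section, the following sketch derives a flat, trackable checklist from an ordered list of BPMN task names, with a hook for intercepting foreseeable deviations. The task names, data layout, and deviation mechanism are invented for this example and are not the actual tool's implementation.

```python
# Hypothetical sketch: a linear checklist derived from an ordered list of
# BPMN task names; some steps carry a known deviation branch that the UI
# could intercept (all names invented for illustration).
bpmn_tasks = [
    ("incision", None),
    ("mastoidectomy", None),
    ("electrode_insertion", "anatomical_variation"),  # foreseeable deviation
    ("wound_closure", None),
]

def build_checklist(tasks):
    """Turn ordered (task, deviation) pairs into checklist items."""
    checklist = []
    for i, (name, deviation) in enumerate(tasks, start=1):
        item = {"step": i, "task": name, "done": False}
        if deviation:
            item["deviation_branch"] = deviation
        checklist.append(item)
    return checklist

for item in build_checklist(bpmn_tasks):
    print(item)
```

In a real tool, the checklist items would additionally carry attachments (photos, files) and feed the automatic report generation.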
Natural wood colors occur within a wide range from almost white (e.g., white poplar), through various yellowish, reddish, and brownish hues, to almost black (e.g., ebony). The intrinsic color of wood is basically defined by its chemical composition. However, other factors such as specific anatomical formations or physical properties further affect the optical impression. Starting with the chemical composition of wood and anatomical basics, wood color and its modifications are discussed in this chapter. The classic method of coloring or re-coloring wood-based material surfaces is the application of a coating containing appropriate dyes or pigments. Different concepts for wood coating and coloration are presented. Another method uses dyes to color the wood structure itself. As alternative techniques, physical methods, for example drying, steaming, ammoniation, bleaching, enzyme treatment, as well as treatment with electromagnetic irradiation (e.g., UV), are explained in this chapter.
Purpose
The purpose of this paper is to explore why men do not rent luxury fashion, in order to explain why demand for men's luxury fashion rental services is so low, and to contribute to research by collecting high-quality data on gender differences in barriers to renting fashion, on barriers to participating in luxury fashion rental in general, and on men's consumption behavior in fashion and luxury fashion research. Furthermore, this study aims not only to make a theoretical contribution but also to provide practical implications for the luxury fashion rental industry.
Design/methodology/approach
To answer the research question, qualitative semi-structured interviews were conducted with seven men who are interested in fashion and spend at least 10% of their monthly net income on luxury fashion. Through a deductive-inductive, category-based qualitative content analysis of the interviews, supported by the software MAXQDA, the study identified not only the reasons why many men refuse to rent luxury fashion, but also characteristics that make luxury fashion rental services more attractive to men, as well as two fashion segments and a product category in which men can imagine renting fashion or luxury fashion under certain circumstances.
Findings
Men reject the concept of renting primarily because of the absence of ownership, which is linked to loss of emotional value, loss of functional value, fear of social rejection, and identity concerns; other reasons include lack of individualism, lack of habit, and their own subjective standards. Except for two outliers, the men surveyed could imagine using a luxury fashion rental service (LFRS) under certain conditions. The most frequently mentioned features were an omnichannel approach; transparency of the entire rental process, provided by reviews and feedback about both the borrower and the lender; information about the cleaning process; and proof of authenticity. Also mentioned were the maintenance of exclusivity and the expectation that rental services be offered directly by the company. In the convenience category, a purchase option and insurance were mentioned most often. In addition, some men could imagine renting event-related clothing, very trendy and expensive luxury clothing, and luxury watches. However, none of the respondents would give up owning clothes and use the LFRS as their primary source of clothing.
Value/Practical Implications
To spare marketers trial and error in figuring out which of these characteristics works best for which male target group, this work developed five consumer types that can be targeted with selected characteristics and corresponding marketing, and thus potentially persuaded to participate in the LFRS. The social type needs exclusivity to be maintained; the emotional type needs a purchase option and an omnichannel experience; the flexibility type needs same-day delivery and free exchange options; the cost-benefit type needs analytical tools to maximize his rental income or to calculate whether it is cheaper to buy or rent a particular item for a particular period; and the rule-governed type needs added value beyond the rental itself, such as premium service.
This study examines the underexplored areas of customer success management, focusing on the impact of leadership and companywide collaboration, and the role of customer success in overall firm performance. A qualitative research approach was utilized, which involved reviewing relevant literature and conducting an interview with the Vice President of Customer Success Management in B2B at a case company. Findings revealed that both leadership and pervasive collaboration greatly enhance the customer journey experience. Given that 75% of Annual Recurring Revenue is derived from existing customers, the substantial role of customer success in propelling business growth is affirmed. The study also demonstrated the importance of proactive customer engagement, assimilating customer feedback into products and services, and nurturing personal relationships with customers for fostering innovation. It further stressed the need for service provision and decision-making at various levels, as well as the implementation of a range of communication channels, to ensure customer success.
This study examines the phenomenon of Virtual Influencer (VI) marketing and its impact on customer purchase behavior. The aim is to understand the scope and impact of VI marketing. The study compares VI marketing to traditional Human Influencer (HI) marketing and identifies the unique benefits and challenges associated with VIs. A survey was conducted to gain insight into consumer attitudes and behaviors toward VIs. Key findings reveal varying levels of trust and acceptance of VIs among consumers. While some participants expressed openness to buying products promoted by VIs, others had reservations about their authenticity. The study also explores the potential role of VIs in the metaverse, highlighting business opportunities and challenges in this evolving digital landscape. Overall, this research sheds light on the growing influence of VIs and the need for further research in the field of marketing.
The present study investigated the possibilities and limitations of using a low-cost NIR spectrometer for verifying the presence of the declared active pharmaceutical ingredients (APIs) in tablet formulations, especially for medicine screening studies in low-resource settings. Spectra from 950 to 1650 nm were recorded for 170 pharmaceutical products representing 41 different APIs, API combinations, or placebos. Most of the products, including 20 falsified medicines, had been collected in medicine quality studies in African countries. After exploratory principal component analysis, models were built using data-driven soft independent modelling of class analogy (DD-SIMCA), a one-class classifier algorithm, for tablet products of penicillin V, sulfamethoxazole/trimethoprim, ciprofloxacin, furosemide, metronidazole, metformin, hydrochlorothiazide, and doxycycline. Spectra of amoxicillin and amoxicillin/clavulanic acid tablets were combined into a single model. Models were tested using Procrustes cross-validation and by projection of spectra of tablets containing the same or different APIs. Tablets containing no or different APIs could be identified with 100% specificity in all models. A separation of the spectra of amoxicillin and amoxicillin/clavulanic acid tablets was achieved by partial least squares discriminant analysis. Fifteen of the 19 external validation products (79%), representing different brands of the same APIs, were correctly identified as members of the target class; three of the four rejected samples showed an API mass percentage of the total tablet weight outside the range covered in the respective calibration set. Therefore, larger and more representative spectral libraries are required for model building in future investigations. Falsified medicines containing no API, incorrect APIs, or grossly incorrect amounts of the declared APIs could be readily identified.
Variation between different NIR-S-G1 spectroscopic devices led to a loss of accuracy if spectra recorded with different devices were pooled. Therefore, piecewise direct standardization was applied for calibration transfer. The investigated method is a promising tool for medicine screening studies in low-resource settings.
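To illustrate the one-class idea behind SIMCA-style modelling on spectra (a simplified stand-in, not the authors' DD-SIMCA implementation; all data here are synthetic), one can build a PCA model of a calibration class and accept a new spectrum only if its reconstruction residual (Q statistic) stays below a cutoff derived from the calibration set:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a calibration set of NIR spectra of one API class:
# 40 "tablets", 64 wavelength channels, varying along one latent direction.
wavelengths = np.linspace(950, 1650, 64)
base = np.exp(-((wavelengths - 1300) / 150) ** 2)
calib = (base
         + 0.05 * rng.normal(size=(40, 1)) * np.sin(wavelengths / 90)
         + 0.01 * rng.normal(size=(40, 64)))

# SIMCA-style one-class model: PCA on the mean-centred calibration spectra,
# then accept a new spectrum if its residual (Q statistic) is below a cutoff.
mean = calib.mean(axis=0)
U, s, Vt = np.linalg.svd(calib - mean, full_matrices=False)
P = Vt[:2].T                       # loadings of the 2 retained components

def q_residual(spectrum):
    """Squared reconstruction error of a spectrum in the PCA model."""
    xc = spectrum - mean
    return float(np.sum((xc - P @ (P.T @ xc)) ** 2))

# Simple, generous cutoff from the calibration residuals (a real DD-SIMCA
# model derives it from fitted chi-squared distributions instead).
cutoff = 2 * max(q_residual(x) for x in calib)

def is_member(spectrum):
    return q_residual(spectrum) <= cutoff

# A spectrum like the calibration class is accepted; a different "API",
# with its absorption band elsewhere, is rejected.
same_api = base + 0.01 * rng.normal(size=64)
other_api = np.exp(-((wavelengths - 1100) / 80) ** 2)
print(is_member(same_api), is_member(other_api))
```

The same mechanism scales to one model per API class, mirroring the per-product models described above.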
Product engineering and the subsequent phases of the product lifecycle are predominantly managed in isolation. Companies therefore do not fully exploit the potential of data from smart factories and product usage. The novel intelligent and integrated Product Lifecycle Management (i²PLM) is an approach that uses these data for product engineering. This paper describes the i²PLM, shows the cause-and-effect relationships in this context, and presents the validation of the approach in detail. The i²PLM is applied and validated on a smart product in an industrial research environment: the subsequent generation of a smart lunchbox is developed based on production and sensor data. The results of the validation point to further improvements of the i²PLM. This paper also describes how to integrate the i²PLM into a learning factory.
Sleep disorders can impact daily life, affecting physical, emotional, and cognitive well-being. Because standard approaches such as polysomnography are time-consuming, highly obtrusive, and expensive, there is great interest in developing a noninvasive and unobtrusive in-home sleep monitoring system that can reliably and accurately measure cardiorespiratory parameters while causing minimal disturbance to the user's sleep. We developed a low-cost, low-complexity Out-of-Center Sleep Testing (OCST) system to measure cardiorespiratory parameters. We tested and validated two force-sensitive resistor strip sensors placed under the bed mattress, covering the thoracic and abdominal regions. Twenty subjects were recruited, including 12 males and 8 females. The ballistocardiogram signal was processed using the 4th smooth level of the discrete wavelet transform and a 2nd-order Butterworth bandpass filter to measure heart rate and respiration rate, respectively. We reached a total error (relative to the reference sensors) of 3.24 beats per minute for heart rate and 2.32 breaths per minute for respiration rate. For males and females, heart rate errors were 3.47 and 2.68 beats per minute, and respiration rate errors were 2.32 and 2.33 breaths per minute, respectively. We developed and verified the reliability and applicability of the system. It showed only a minor dependency on sleeping position, one of the major challenges in sleep measurement. We identified the sensor under the thoracic region as the optimal configuration for cardiorespiratory measurement. Although testing the system with healthy subjects and regular cardiorespiratory patterns showed promising results, further investigation of the filter bandwidth is required, along with validation of the system on larger groups of subjects, including patients.
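A minimal sketch of the respiration-rate branch of such a pipeline, assuming SciPy is available: a 2nd-order Butterworth bandpass isolates the respiration band of a synthetic ballistocardiogram-like signal, and peak counting yields breaths per minute. The signal, band edges, and rates below are invented for illustration and are not the study's actual parameters.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

fs = 100.0                      # assumed sampling rate of the force sensor (Hz)
t = np.arange(0, 60, 1 / fs)    # one minute of synthetic data

# Synthetic stand-in for a ballistocardiogram: a respiration component
# (15 breaths/min), a weaker cardiac component (72 beats/min), and noise.
rng = np.random.default_rng(1)
bcg = (1.0 * np.sin(2 * np.pi * 0.25 * t)      # respiration, 0.25 Hz
       + 0.4 * np.sin(2 * np.pi * 1.2 * t)     # heartbeat, 1.2 Hz
       + 0.1 * rng.normal(size=t.size))

# 2nd-order Butterworth bandpass around typical respiration frequencies,
# applied forward-backward (filtfilt) to avoid phase distortion.
b, a = butter(2, [0.1, 0.5], btype="bandpass", fs=fs)
resp = filtfilt(b, a, bcg)

# Respiration rate by peak counting over the 60 s window.
peaks, _ = find_peaks(resp, distance=fs)  # at most one peak per second
resp_rate = len(peaks)                    # breaths per minute
print(resp_rate)
```

The heart-rate branch would analogously isolate the cardiac band (the study uses the 4th smooth level of a discrete wavelet transform for that step).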
Applications often need to be deployed in different variants due to different customer requirements. However, since modern applications often need to be deployed using multiple deployment technologies in combination, such as Ansible and Terraform, the deployment variability must be considered in a holistic way. To tackle this, we previously developed Variability4TOSCA and the prototype OpenTOSCA Vintner, which is a TOSCA preprocessing and management layer that implements Variability4TOSCA. In this demonstration, we present a detailed case study that shows how to model a deployment using Variability4TOSCA, how to resolve the variability using Vintner, and how the result can be deployed.
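The variability-resolution step can be pictured with a toy model (structure, key names, and conditions are invented here; the real Variability4TOSCA condition language is richer): node templates carry conditions over deployment inputs, and resolution keeps only the nodes whose conditions hold.

```python
# Hypothetical, much-simplified variability model: each node template lists
# (input, required_value) conditions; an unconditional node is always kept.
template = {
    "node_templates": {
        "app":    {"conditions": []},
        "mysql":  {"conditions": [("env", "production")]},
        "sqlite": {"conditions": [("env", "development")]},
    }
}

def resolve(template, inputs):
    """Keep only the node templates whose conditions all hold."""
    return {
        name: {}
        for name, node in template["node_templates"].items()
        if all(inputs.get(key) == value for key, value in node["conditions"])
    }

print(sorted(resolve(template, {"env": "production"})))
print(sorted(resolve(template, {"env": "development"})))
```

The resolved, variability-free topology is what would then be handed to the deployment technologies (e.g., Ansible or Terraform) for execution.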
Managerial accountants spend a large part of their working time on more operational activities in cost accounting, reporting, and operational planning and budgeting. In all these areas, there has been increasing discussion in recent years, both in theory and practice, about using more digital technologies. For reporting, this means not only an intensified discussion of technologies such as RPA and AI but also more intensive changes to existing reporting systems. In particular, management information systems (MIS), which are maintained by managerial accountants and used by managers for corporate management, should be mentioned here. Based on an empirical survey in a large German company, this article discusses the requirements and assessments of users when switching from a regular MIS to a cloud-based system.
In modern collaborative production environments, where industrial robots and humans are supposed to work hand in hand, it is mandatory to observe the robot's workspace at all times. Such observation is even more crucial when the robot's main position is itself dynamic, e.g., because the system is mounted on a movable platform. As current solutions, such as physically secured areas in which a robot can perform actions potentially dangerous to humans, become infeasible in such scenarios, novel, more dynamic, and situation-aware safety solutions need to be developed and deployed.
This thesis contributes to the bigger picture of such a collaborative scenario mainly by presenting a data-driven, convolutional neural network-based approach to estimating the two-dimensional kinematic-chain configuration of industrial robot arms in raw camera images. It also describes how to generate and organize the required data basis and presents the frameworks used to realize all involved subsystems. The robot arm's extracted kinematic chain can also be used to estimate the extrinsic camera parameters relative to the robot's three-dimensional origin. Furthermore, a tracking system based on a two-dimensional kinematic-chain descriptor is presented, which accumulates a movement history and thereby enables the prediction of future target positions in the image plane. Combining the extracted robot pose with a simultaneous human pose estimation system yields a consistent data flow that can be used in higher-level applications.
This thesis also provides a detailed evaluation of all involved subsystems and a broad overview of their performance, based on newly generated, semi-automatically annotated, real datasets.
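One common way a pose network's output is turned into 2D joint positions is via per-joint heatmaps whose argmax gives the keypoint location. The following toy decoder (map sizes, joint count, and positions invented; not the thesis' actual architecture) illustrates that step:

```python
import numpy as np

# Toy setup: one heatmap per robot joint; the 2D joint position is taken as
# the argmax of each map. Sizes and "true" joint positions are invented.
H, W, J = 64, 64, 6
true_pts = [(10, 12), (20, 25), (30, 31), (40, 18), (50, 44), (55, 60)]

heatmaps = np.zeros((J, H, W))
yy, xx = np.mgrid[0:H, 0:W]
for j, (y, x) in enumerate(true_pts):
    # Gaussian blob centred on the joint, as a pose network would predict.
    heatmaps[j] = np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / (2 * 2.0 ** 2))

def decode(maps):
    """Return one (row, col) argmax per joint: the 2D kinematic-chain keypoints."""
    flat = maps.reshape(maps.shape[0], -1).argmax(axis=1)
    return [divmod(int(i), maps.shape[2]) for i in flat]

print(decode(heatmaps))
```

The decoded keypoint list is exactly the kind of 2D kinematic-chain descriptor the tracking system described above would consume frame by frame.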
Twitter and citations
(2023)
Social media, especially Twitter, plays an increasingly important role among researchers in showcasing and promoting their research. Does Twitter affect academic citations? Making use of Twitter activity about columns published on VoxEU, a renowned online platform for economists, we develop an instrumental variable strategy to show that Twitter activity about a research paper has a causal effect on the number of citations that this paper will receive. We find that the existence of at least one tweet, as opposed to none, increases citations by 16-25%. Doubling overall Twitter engagement boosts citations by up to 16%.
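The instrumental-variable logic can be sketched with simulated data (all numbers invented): unobserved paper quality biases a naive OLS of citations on tweets, while two-stage least squares with an instrument that shifts tweeting, but affects citations only through tweets, recovers the causal effect.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000

# Simulated setting: unobserved paper "quality" drives both tweets and
# citations, so OLS of citations on tweets is biased upward. The instrument
# z shifts tweeting but has no direct effect on citations.
quality = rng.normal(size=n)
z = rng.normal(size=n)                          # instrument
tweets = z + quality + rng.normal(size=n)
beta = 0.5                                      # true causal effect
cites = beta * tweets + quality + rng.normal(size=n)

def ols(y, x):
    """Least-squares (intercept, slope) for a single regressor."""
    X = np.column_stack([np.ones(len(x)), x])
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Naive OLS: biased by the omitted quality variable.
b_ols = ols(cites, tweets)[1]

# Two-stage least squares: first stage predicts tweets from the instrument,
# second stage regresses citations on the predicted tweets.
a0, a1 = ols(tweets, z)
tweets_hat = a0 + a1 * z
b_iv = ols(cites, tweets_hat)[1]

print(round(b_ols, 2), round(b_iv, 2))  # OLS overshoots; IV is near 0.5
```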
Thin, flat textile roofing offers negligible heat insulation. In warm areas, such roofing membranes are therefore equipped with metallized surfaces to reflect solar heat radiation, thus reducing the warming inside a textile building. Heat-reflection effects achieved by metallic coatings are always accompanied by shading effects, as the metals are non-transparent to visible light (VIS). Transparent conductive oxides (TCOs) are transparent to VIS and are able to reflect heat radiation in the infrared. TCOs are widely used, e.g., in the display industry. To achieve the near-perfect coatings needed for electronic devices, they are commonly applied using costly vacuum processes at high temperatures. Because of the high costs and processing temperatures involved, vacuum processes are unsuitable for applications involving textiles. Accepting that heat-reflecting textile membranes demand less perfect coatings, a wet-chemical approach was followed here to produce transparent heat-reflecting coatings. Commercially available TCOs were employed as colloidal dispersions or nanopowders to prepare sol-gel-based coating systems. Such coatings were applied to textile membranes of the kind used for architectural textiles, using simple coating techniques and moderate curing temperatures not exceeding 130 °C. The coatings achieved about 90% transmission in the VIS spectrum and reduced near-infrared transmission (at about 2.5 µm) to nearly zero while reflecting up to 25% of that radiation. Up to 35% reflection was realized in the far infrared, and emissivity values down to ε = 0.5777 were measured.
The COVID-19 pandemic necessitated significant changes in foreign language education, forcing teachers to reconstruct their identities and redefine their roles as language educators. To better understand these adaptations and perspectives, it is crucial to study how the pandemic has influenced teaching practices. This mixed-methods study focused on the less-explored aspects of foreign language teaching during the pandemic, specifically examining how language teachers adapted and perceived their practices, including rapport building and learner autonomy, during emergency remote teaching (ERT) in higher education institutions. It also explored teachers’ intentions for their teaching in the post-pandemic era. An online survey was conducted, involving 118 language educators primarily from Germany, with a smaller representation from New Zealand, the United States, and the United Kingdom. The analysis of participants’ responses revealed issues and opportunities regarding lesson formats, tool usage, rapport, and learner autonomy. Our findings offer insights into the desired changes participants envisioned for the post-pandemic era. The results highlight the opportunities ERT had created in terms of teacher development, and we offer suggestions to enhance professional development programmes based on these findings.
Blockchains have become increasingly important in recent years and have expanded their applicability to many domains beyond finance and cryptocurrencies. This adoption has particularly increased with the introduction of smart contracts, which are immutable, user-defined programs directly deployed on blockchain networks. However, many scenarios require business transactions to simultaneously access smart contracts on multiple, possibly heterogeneous blockchain networks while ensuring the atomicity and isolation of these transactions, which is not natively supported by current blockchain systems. Therefore, in this work, we introduce the Transactional Cross-Chain Smart Contract Invocation (TCCSCI) approach that supports such distributed business transactions while ensuring their global atomicity and serializability. The approach introduces the concept of Resource Manager Smart Contracts, and 2PC for Blockchains (2PC4BC), a client-driven Atomic Commit Protocol (ACP) specialized for blockchain-based distributed transactions. We validate our approach using a prototypical implementation, evaluate its introduced overhead, and prove its correctness.
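The atomic-commit idea behind a client-driven 2PC over per-chain resource managers can be sketched as follows (plain Python stand-ins, not the actual Resource Manager Smart Contracts or 2PC4BC protocol messages):

```python
class ResourceManager:
    """Stand-in for a 'Resource Manager Smart Contract' guarding one chain's state."""
    def __init__(self, name):
        self.name = name
        self.state = {}
        self._pending = None

    def prepare(self, updates):
        # Phase 1: tentatively stage the updates and vote.
        self._pending = dict(updates)
        return True   # vote "yes"; a real RM could vote "no" on conflicts

    def commit(self):
        # Phase 2a: make the staged updates durable.
        self.state.update(self._pending)
        self._pending = None

    def abort(self):
        # Phase 2b: discard the staged updates.
        self._pending = None


def two_phase_commit(managers, tx):
    """Client-driven 2PC: apply `tx` on all managers or on none."""
    voted = []
    for rm in managers:
        if not rm.prepare(tx.get(rm.name, {})):
            for v in voted:        # some RM voted no: roll everyone back
                v.abort()
            rm.abort()
            return False
        voted.append(rm)
    for rm in voted:               # all voted yes: commit everywhere
        rm.commit()
    return True


# Cross-chain transfer touching two "chains" atomically.
chain_a, chain_b = ResourceManager("A"), ResourceManager("B")
ok = two_phase_commit([chain_a, chain_b],
                      {"A": {"alice": -10}, "B": {"bob": 10}})
print(ok, chain_a.state, chain_b.state)
```

The real protocol additionally has to handle isolation between concurrent transactions and the trust and failure model of heterogeneous blockchains, which is where the blockchain-specialized ACP comes in.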
In countries such as Germany, where municipalities have planning sovereignty, problems of urban sprawl often arise. As the dynamics of land development have not substantially subsided over recent years, the national government decided to test the instrument of 'Tradable Planning Permits' (TPP) in a nationwide field experiment involving 87 municipalities. The field experiment implemented the key features of a TPP system in a laboratory setting with approximated real socioeconomic and planning conditions. In a TPP system, allocated planning permits must be used by municipalities for developing land. The permits can be traded between local jurisdictions, giving them flexibility in deciding how to comply with the regulation. To evaluate the performance of such a system, specific field data about future building areas and their impact on community budgets for the period 2014–2028 were collected. The field experiment comprised several sessions with representatives of the municipalities and with students. The participants were confronted with two (municipalities) or four (students) trading schemes. The results show that a trading system can curb land development in an effective and also efficient manner. However, depending on the regulatory framework, the trading schemes show different price developments and distributional effects. Even inexperienced representatives of the local authorities could easily handle the permits in administration and in the established market. A trading scheme sets strong incentives to save open space and to direct development activities to areas within existing planning boundaries. It is therefore a promising instrument for Germany as well as for other regions or countries with an established land-use planning system.
The aim of this work is the development of an artificial intelligence (AI) application to support the recruiting process, elevating the domain of human resource management by advancing its capabilities and effectiveness. This concerns recruiting processes and includes solutions for active sourcing (i.e., active recruitment), pre-sorting, evaluating structured video interviews, and discovering internal training potential. This work highlights four novel approaches to ethical machine learning. The first is precise machine learning for ethically relevant properties in image recognition, which focuses on accurately detecting and analysing these properties. The second is the detection of bias in training data, allowing for the identification and removal of distortions that could skew results. The third is minimising bias, which involves actively working to reduce bias in machine learning models. Finally, an unsupervised architecture is introduced that can learn fair results even without ground-truth data. Together, these approaches represent important steps toward ethical and unbiased machine learning systems.
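One simple, widely used bias check of the kind such systems rely on is the demographic-parity gap between groups' selection rates. A minimal sketch with invented screening outcomes (group names and data are hypothetical, not the work's actual method or data):

```python
# Hypothetical screening outcomes: 1 = invited to interview, 0 = rejected,
# for two applicant groups (names and data invented for illustration).
decisions = {
    "group_x": [1, 1, 0, 1, 1, 0, 1, 1],
    "group_y": [1, 0, 0, 1, 0, 0, 1, 0],
}

def selection_rate(outcomes):
    """Fraction of positive decisions in a group."""
    return sum(outcomes) / len(outcomes)

# Demographic-parity difference: the gap between the groups' selection rates.
rates = {group: selection_rate(out) for group, out in decisions.items()}
dp_gap = max(rates.values()) - min(rates.values())
print(rates, round(dp_gap, 3))
```

A gap near zero indicates parity on this metric; a large gap flags the kind of training-data or model bias the described approaches aim to detect and minimise.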
Smart cities can be considered data factories that generate enormous amounts of data from various sources. In fact, data is the backbone of any smart service. The strategic, beneficial handling of this digital capital is therefore crucial for cities. Some smart city pioneers have already written down their approach to data in the form of data strategies, but what should a city's data strategy include, and how can the goals and measures defined in these strategies be operationalized? This paper addresses these questions by looking closely at the data strategies of cities in Germany and in the top three countries of the EU Digital Economy and Society Index. The in-depth analysis of eight city data strategies yielded 11 dimensions that cities should consider in their data strategy: relevance of data, principles, methods, data sharing, technology, data culture, data ethics, organizational structure, data security and privacy, collaborations, and data literacy. In addition, data governance serves as a concept to put these 11 strategic dimensions into practice through standardization measures, training programs, the definition of roles and responsibilities, and the development of a data catalog.
The benefits of urban data cannot be realized without a political and strategic view of data use. A core concept within this view is data governance, which aligns strategy in data-relevant structures and entities with data processes, actors, architectures, and overall data management. Data governance is not a new concept and has long been addressed by scientists and practitioners from an enterprise perspective. In the urban context, however, data governance has only recently attracted increased attention, despite the unprecedented relevance of data in the advent of smart cities. Urban data governance can create semantic compatibility between heterogeneous technologies and data silos and connect stakeholders by standardizing data models, processes, and policies. This research provides a foundation for developing a reference model for urban data governance, identifies challenges in dealing with data in cities, and defines factors for the successful implementation of urban data governance. To obtain the best possible insights, the study carries out qualitative research following the design science research paradigm, conducting semi-structured expert interviews with 27 municipalities from Austria, Germany, Denmark, Finland, Sweden, and the Netherlands. The subsequent data analysis based on cognitive maps provides valuable insights into urban data governance. The interview transcripts were transferred and synthesized into comprehensive urban data governance maps to analyze entities and complex relationships with respect to the current state, challenges, and success factors of urban data governance. The findings show that each municipal department defines data governance separately, with no uniform approach. Given cultural factors, siloed data architectures have emerged in cities, leading to interoperability and integrability issues. 
A city-wide data governance entity in a cross-cutting function can be instrumental in breaking down silos in cities and creating a unified view of the city’s data landscape. The further identified concepts and their mutual interaction offer a powerful tool for developing a reference model for urban data governance and for the strategic orientation of cities on their way to data-driven organizations.