This paper explores the application of People Analytics in recruiting professors for universities of applied sciences. Using data-driven personas, the research project aims to identify and communicate the different paths and connections leading candidates to a professorship. The authors introduce the concept of personas, describe the underlying data source and derive an example for the current project.
Automatic content creation system for augmented reality maintenance applications for legacy machines
(2024)
Augmented reality (AR) applications have great potential to assist maintenance workers in their operations. However, creating AR solutions is time-consuming and laborious, which limits their widespread adoption in industry. As a result, even with latest-generation machines, the user often receives only an electronic manual for equipment operation and maintenance instead of an AR solution; this is commonplace with legacy machines. For this reason, solutions are required that simplify the creation of such AR applications. This paper presents an approach that uses an electronic manual as the basis for creating fast and cost-effective AR solutions for maintenance. As part of the approach, an application was developed to automatically identify and subdivide the chapters of electronic manuals via the bookmarks in the table of contents. The contents are then automatically uploaded to a central server and indexed with a suitable marker to make the data retrievable. The prepared content can then be accessed via the marker to create context-related AR instructions. The application is characterized by the fact that no developers or experts are required to prepare the information. In addition to complying with common design criteria, the clear presentation of the contents and the intuitive use of the system offer added value for the performance of maintenance tasks. Together, these two elements form a novel way to retrofit legacy machines with AR maintenance instructions. The practical validation of the system took place in a factory environment, where content was created for a filter change on a CNC milling machine. The results show that inexperienced users can extract appropriate content with the software application. Furthermore, it is shown that maintenance workers can access the content with an AR application developed for the Microsoft HoloLens 2 and complete simple tasks provided in the manufacturer's electronic manual.
Direct-to-customer sales (D-to-C) in the automotive industry are en vogue: according to Valtech (2023, p. 2), the transformation of sales models in this industry is inevitable. The Covid-19 pandemic has additionally acted as a catalyst for D-to-C, accelerating both the digital transformation and the acceptance of virtual sales processes.
Despite the unstoppable global drive towards electric mobility, the electrification of sub-Saharan Africa's ubiquitous informal multi-passenger minibus taxis raises substantial concerns. This is due to a constrained electricity system, both in terms of generation capacity and distribution networks. Without careful planning and mitigation, the additional load of charging hundreds of thousands of electric minibus taxis during peak demand times could prove catastrophic. This paper assesses the impact of charging 202 of these taxis in Johannesburg, South Africa. The potential of using external stationary battery storage and solar PV generation is assessed to reduce both peak grid demand and total energy drawn from the grid. With the addition of stationary battery storage equivalent to 60 kWh/taxi and a solar plant equivalent to 9.45 kWpk/taxi, the grid load impact is reduced by 66%, from 12 kW/taxi to 4 kW/taxi, and the daily grid energy by 58%, from 87 kWh/taxi to 47 kWh/taxi. Given the country's dependence on coal to generate electricity, the solar PV supply also reduces greenhouse gas emissions by 58%.
Purpose
As a response to the increased frequency of disruptive events and intense competition, organizational agility has become a key concept in organizational research. Fostering organizational agility requires leveraging knowledge that exists both outside (exploration) and inside (exploitation) the organization. This research tests the so-called ambidexterity hypothesis, which claims that a balance between exploration and exploitation leads to increased organizational outcomes, including the development of organizational agility. Complementing previously established measurement models on ambidexterity, this research proposes an alternative measurement model to analyze how ambidexterity can enhance organizational agility and, indirectly, performance, taking into consideration the moderating effect of environmental competitiveness.
Design/methodology/approach
A review of existing measurement models for ambidexterity shows that tension, a crucial aspect of ambidexterity, is often neglected. The authors, therefore, develop a new measurement model of ambidexterity to incorporate ambidexterity-induced tension. Using this measurement model, they examine the effect of ambidexterity on the development of entrepreneurial and adaptive agility as well as performance.
Findings
Ambidexterity positively influences both entrepreneurial and adaptive agility, indicating that a balance between exploration and exploitation has superior organizational effects. This finding confirms the ambidexterity hypothesis with respect to organizational agility. Furthermore, both entrepreneurial and adaptive agility drive organizational performance. These two indirect effects via agility fully mediate the impact of ambidexterity on organizational performance. Finally, environmental competitiveness positively moderates the relationship between ambidexterity and adaptive agility.
Originality/value
The findings extend research on ambidexterity by showing its positive effects on organizational agility. Furthermore, the study proposes an alternative operationalization to capture the ambidexterity construct that may lay the groundwork for further applications of the ambidexterity concept.
Tech hubs (THs) and cognate structures are nowadays ubiquitous in the innovation ecosystems of Sub-Saharan African (SSA) countries. However, the concept of THs is fuzzy due to the lack of a clear and universally accepted definition. This ambiguity is further compounded by the diverse range of organizations that self-identify as hubs or are categorized as such by others. As a result, research on THs in SSA has remained limited. Against the backdrop of established research on the interconnectedness of technology, innovation and entrepreneurship in different organizational forms, this paper provides fresh insights into the study of THs in SSA. To advance future research, it first reveals what is special about THs in SSA and how they relate to existing concepts; I argue in particular that they contour a fourth-wave model of incubation. Second, four main categories are developed to delineate THs in SSA, which form the cornerstone for future research.
Comparative analysis of the chemical and rheological curing kinetics of formaldehyde-based wood adhesives is crucial for assessing their respective performance. Differential scanning calorimetry (DSC) and rheometry are the conventional techniques for monitoring the curing processes leading to crosslinking polymerization of the adhesives. However, a direct comparison of these techniques is inappropriate due to the intrinsic differences in their underlying procedures. To address this challenge, the two adhesive samples were cured sequentially, first with rheometry and subsequently with DSC. The higher curing degree observed in the subsequent DSC procedure underpins the incomplete curing of the samples during the initial rheometry. Furthermore, the comparative assessment of the activation energies, molar ratios, and active groups of the two adhesives highlights the importance of the pre-exponential factor in addition to the activation energies, as it relates to the probability of active groups coinciding in the appropriate spatial arrangement.
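The role of the pre-exponential factor mentioned above follows from the standard Arrhenius description of curing kinetics (textbook relation, not taken from the paper's data):

```latex
k(T) = A \exp\!\left(-\frac{E_a}{RT}\right)
```

Here $k$ is the rate constant, $E_a$ the activation energy, $R$ the gas constant and $T$ the absolute temperature. Two adhesives with similar $E_a$ can still cure at very different rates if their pre-exponential factors $A$ differ, since $A$ reflects how frequently reactive groups encounter each other in a suitable spatial arrangement.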
Salivary gland tumors (SGTs) are a relevant, highly diverse subgroup of head and neck tumors whose entity determination can be difficult. Confocal Raman imaging in combination with multivariate data analysis may support their correct classification. To analyze the translational potential of Raman imaging in SGT determination, a multi-stage evaluation process is necessary. By measuring a sample set of Warthin tumor, pleomorphic adenoma and non-tumor salivary gland tissue, Raman data were obtained and a thorough Raman band analysis was performed. This evaluation revealed highly overlapping Raman patterns with only minor spectral differences. Consequently, a principal component analysis (PCA) was calculated and further combined with a discriminant analysis (DA) to enable the best possible distinction. The PCA-DA model was characterized by accuracy, sensitivity, selectivity and precision values above 90% and validated by predicting model-unknown Raman spectra, of which 93% were classified correctly. We therefore consider our PCA-DA model suitable for discriminating and predicting parotid tumor and non-tumor salivary gland tissue. For evaluation of the translational potential, further validation steps are necessary.
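As a rough illustration of the PCA-DA idea (dimensionality reduction followed by a discriminant classifier), the sketch below uses synthetic stand-in data; the component count, class setup and spectral dimensions are assumptions of this sketch, not the study's actual parameters.

```python
# Sketch: PCA followed by discriminant analysis on synthetic "spectra".
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# three tissue classes, 40 spectra each, 600 spectral channels,
# separated only by a small per-channel mean shift
X = np.vstack([rng.normal(loc=m, scale=1.0, size=(40, 600))
               for m in (0.0, 0.3, 0.6)])
y = np.repeat([0, 1, 2], 40)

# reduce to a few principal components, then classify in that subspace
model = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
model.fit(X, y)
```

On such well-separated synthetic classes the pipeline fits essentially perfectly; on real Raman spectra the number of components would be chosen by validation, as the study's multi-stage evaluation suggests.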
Im Team zum fliegenden Traum
(2024)
Purpose – This paper aims to determine the affecting factors of the brand authenticity of startups in social media.
Design/methodology/approach – Using a qualitative method based on a grounded theory approach, this research specifies and classifies the affecting factors of brand authenticity of startups in social media through in-depth semi-structured interviews.
Findings – Multiple factors affecting the brand authenticity of startups in social media are determined and categorized as indexical, iconic and existential cues. Connection to heritage and having credible support are identified as indexical cues. Founder intellectuality, brand intellectuality, commitment toward customers and proactive, clear and interesting communication are identified as iconic cues. Having self-confidence and self-satisfaction, having intimacy with the brand and a joyful feeling in interactions with the community around the brand are identified as existential cues. This research furthers previous arguments on the multiplicity of brand authenticity by shedding light on the relationship between the different aspects of authenticity and the way the different affecting factors can be organized together. Consumers ultimately form a strengthened perception of brand authenticity through existential cues, which reflect the cues of the other aspects (iconic and indexical) that have passed through the goal-based assessment and self-authentication filter.
Research limitations/implications – The research sampling population could be more diversified in terms of sociodemographic attributes. Due to the qualitative methodology of this research, assessment of the findings through quantitative methods can be considered in future research.
Practical implications – Using the findings of this research, startup managers can build a perception of authenticity in their consumers' minds by using alternative factors when major indexical cues such as heritage are lacking. This research helps startup businesses design their brand communications to better convey their authenticity to their audiences.
Originality/value – This research determines the factors affecting the authenticity of startup brands in social media. It also defines the process of authenticity perception through different aspects of brand authenticity.
Human Digital Twin
(2022)
Imagine being able to simulate the course of Bundesliga matches, or even entire World Cup games, with the support of artificial intelligence. Or imagine a coach selecting the line-up for a final based on data about the opponent, sending psychologically and physiologically different player types onto the pitch accordingly (cf. Jahn). Pure fiction? Not really. Already today, athletes' performances are increasingly analyzed and evaluated digitally. SAP, for example, has developed a platform that creates a digital data image of football players (cf. SAP). At the last World Cup, every player received precise statistics on his in-game performance via the new FIFA Player App shortly after the match (cf. FIFA). In the future, virtual replicas of the players, known as digital twins, are expected to deliver even better information. The necessary data are collected by sensors in the jersey, the boots or the ball. The captured movement and position data, together with ball contacts, yield a precise data image of the player. Such simulations, based on a model of the human being in the digital world, are currently attracting great attention in research and practice (cf. van der Valk et al.). Not only in the world of football, but also in medicine and in the context of Industry 4.0 and product design, human digital twins have the potential to become a key technology.
Plasmonics and nanophotonics both deal with the interaction of light with structures of typically sub-wavelength size in one or more dimensions. Over the past decade or two, interest in these topics has grown significantly. This includes basic research towards a detailed understanding of light-matter interaction and the manipulation of light on the nanometer scale, as well as the search for applications ranging from quantum information processing, data storage, solar cells, spectroscopy and microscopy to (bio-)sensors and biomedical devices. Key enablers for this development are advanced materials and the variety of techniques to structure them with nanometer precision on the one hand, and progress in the theoretical description and numerical implementations on the other. Besides the traditional metals Au, Ag, Al, and Cu, compounds such as refractory metal nitrides with much higher durability, as well as semiconductors, dielectrics and hybrid structures, have become of interest. Structuring techniques aim not only at the fabrication of individual elements with highest precision for detailed interaction analysis, but also at methods for large-scale, low-cost nanofabrication, mostly for sensor applications. In the former case, mostly electron beam lithography and focused ion beam milling are employed, while for high throughput various forms of nanoimprint and self-assembly based techniques are favored. Thin film deposition and pattern transfer techniques are mostly derived from those developed for nano-electronics; however, more recently methods such as electroless plating, atomic layer deposition or etching, and 3-D additive techniques are appearing. Thus, highly specialized expertise has been acquired in the different disciplines, and successful research and technology transfer will draw from this pool of knowledge.
Introduction to the special issue on self‑managing and hardware‑optimized database systems 2022
(2023)
Data management systems have evolved in terms of functionality, performance characteristics, complexity, and variety during the last 40 years. In particular, relational database management systems and big data systems (e.g., key-value stores, document stores, graph stores and graph computation systems, Spark, MapReduce/Hadoop, or data stream processing systems) have evolved with novel additions and extensions. However, system administration tasks have become highly complex and expensive, especially given the simultaneous and rapid hardware evolution in processors, memory, storage, and networking. These developments present new open problems and challenges to data management systems, as well as new opportunities.
The SMDB (International Workshop on Self-Managing Database Systems) and HardBD&Active (Joint International Workshop on Big Data Management on Emerging Hardware and Data Management on Virtualized Active Systems) workshops organized in conjunction with the IEEE ICDE (International Conference on Data Engineering) offered two distinct platforms for examining the above system-related challenges from different perspectives. The SMDB workshop looks into developing autonomic or self-* features in database and data management systems to tackle complex administrative tasks, while the HardBD&Active workshop focuses on harnessing hardware technologies to enhance efficiency and performance of data processing and management tasks. As a result of these workshops, we are delighted to present the third special issue of DAPD titled “Self-Managing and Hardware-Optimized Database Systems 2022,” which showcases the best contributions from the SMDB 2021/2022 and HardBD&Active 2021/2022 workshops.
Digitalization and enterprise architecture management: a perspective on benefits and challenges
(2023)
Many companies are digitally transforming their business models, processes, and services. They have also long been using Enterprise Architecture Management (EAM) approaches to synchronize corporate strategy and information technology. Such digitalization projects bring various challenges for EAM. Without understanding and addressing them, EAM projects will fail or not deliver the expected value. Since existing research has not yet addressed these challenges, they were investigated in a qualitative expert study with leading industry experts from Europe. Furthermore, potential benefits of digitalization projects for EAM were researched. Our results provide a theoretical framework consisting of five identified challenges, their triggers, and a number of benefits. Furthermore, we discuss in what ways digitalization and EAM are a promising topic for future research.
In recent years, both Artificial Intelligence (AI) and the integration of Variable Renewable Energy (VRE) have received increasing attention in scientific research. This article therefore investigates the potential of Deep Learning (DL)-based applications for VRE integration and provides an introduction to and structured overview of the field. First, we conduct a systematic literature review of the application of AI, especially DL, to the integration of VRE. Subsequently, we provide a comprehensive overview of specific DL-based solution approaches and evaluate their applicability, including a survey of the most widely applied and best-suited DL architectures. We identify ten DL-based approaches to support the integration of VRE in modern power systems. We find (I) solar PV and wind power generation forecasting, (II) system scheduling and grid management, and (III) intelligent condition monitoring to be three high-potential application areas.
Do Chinese subordinates trust their German supervisors? A model of inter-cultural trust development
(2023)
In this qualitative study based on 95 interviews with Chinese subordinates and their German supervisors, we inductively develop a model which advances theoretical understanding by showing how inter-cultural trust development in hierarchical relationships is the result of six distinct elements: the subordinate trustor's cultural profile (cosmopolitans, hybrids, culturally bounds), the psychological mechanisms operating within the trustor (role expectations and cultural accommodation), and contextual moderators (e.g., country context, time spent in foreign culture, and third-party influencers), which together influence the trust forms (e.g., presumptive trust, relational trust) and trust dynamics (e.g., trust breakdown and repair) within relationship phases over time (initial contact, trust continuation, trust disillusionment, separation, and acculturation). Our findings challenge the assumption that cultural differences result in low levels of initial trust and highlight the strong role the subordinate's cultural profile can have on the dynamics and trajectory of trust in hierarchical relationships. Our model highlights that inter-cultural trust development operates as a variform universal, following the combined universalistic-particularistic paradigm in cross-cultural management, with both culturally generalizable etic dynamics as well as culturally specific emic manifestations.
In the context of Industry 4.0, intralogistics faces an increasingly complex and dynamic environment driven by a high level of product customisation and complex manufacturing processes. One approach to dealing with these changing conditions is the decentralised and intelligent connectivity of intralogistics systems. However, wireless connectivity presents a major challenge in industry due to strict requirements such as safety and real-time data transmission. In this context, the fifth generation of mobile communications (5G) is a promising technology to meet the requirements of safety-critical applications, particularly since 5G offers the possibility of establishing private 5G networks, also referred to as standalone non-public networks. Through their isolation from public networks, private 5G networks provide exclusive coverage for private organisations, offering them high intrinsic network control and data security. However, 5G is still under development and is being introduced gradually in a continuous release process. This process lacks transparency regarding the performance of 5G in individual releases, complicating the successful adoption of 5G as an industrial communication technology. Additionally, the evaluation of 5G against the specified target performance is insufficient due to the impact of the environment and external interfering factors on 5G in industrial settings. Therefore, this paper develops a technical decision-support framework that takes a holistic approach to evaluating the practicality of 5G for intralogistics use cases in two fundamental stages. The first stage analyses technical parameters and characteristics of the use case to evaluate the theoretical feasibility of 5G. The second stage investigates the application's environment, which substantially impacts the practicality of 5G, for instance through the influence of surrounding materials.
Finally, a case study validates the proposed framework by means of an autonomous mobile robot. The validation demonstrates the framework's applicability and shows the practicality of operating the autonomous mobile robot within a private 5G network testbed.
Cyber-Physical Production Systems increasingly use semantic information to meet increased flexibility requirements. Ontologies are often used to represent and use this semantic information. Existing systems focus on mapping knowledge and less on the exchange with other relevant IT systems (e.g., ERP systems) in which crucial, often implicit, semantic information is contained. This article presents an approach that enables the exchange of semantic information via adapters. The approach is demonstrated by a use case utilizing an MES and an ERP system.
The COVID-19 pandemic necessitated significant changes in foreign language education, forcing teachers to reconstruct their identities and redefine their roles as language educators. To better understand these adaptations and perspectives, it is crucial to study how the pandemic has influenced teaching practices. This mixed-methods study focused on the less-explored aspects of foreign language teaching during the pandemic, specifically examining how language teachers adapted and perceived their practices, including rapport building and learner autonomy, during emergency remote teaching (ERT) in higher education institutions. It also explored teachers’ intentions for their teaching in the post-pandemic era. An online survey was conducted, involving 118 language educators primarily from Germany, with a smaller representation from New Zealand, the United States, and the United Kingdom. The analysis of participants’ responses revealed issues and opportunities regarding lesson formats, tool usage, rapport, and learner autonomy. Our findings offer insights into the desired changes participants envisioned for the post-pandemic era. The results highlight the opportunities ERT had created in terms of teacher development, and we offer suggestions to enhance professional development programmes based on these findings.
The present study investigated the possibilities and limitations of using a low-cost NIR spectrometer for the verification of the presence of the declared active pharmaceutical ingredients (APIs) in tablet formulations, especially for medicine screening studies in low-resource settings. Spectra from 950 to 1650 nm were recorded for 170 pharmaceutical products representing 41 different APIs, API combinations or placebos. Most of the products, including 20 falsified medicines, had been collected in medicine quality studies in African countries. After exploratory principal component analysis, models were built using data-driven soft independent modelling of class analogy (DD-SIMCA), a one-class classifier algorithm, for tablet products of penicillin V, sulfamethoxazole/trimethoprim, ciprofloxacin, furosemide, metronidazole, metformin, hydrochlorothiazide, and doxycycline. Spectra of amoxicillin and amoxicillin/clavulanic acid tablets were combined into a single model. Models were tested using Procrustes cross-validation and by projection of spectra of tablets containing the same or different APIs. Tablets containing no or different APIs could be identified with 100 % specificity in all models. A separation of the spectra of amoxicillin and amoxicillin/clavulanic acid tablets was achieved by partial least squares discriminant analysis. 15 out of 19 external validation products (79 %) representing different brands of the same APIs were correctly identified as members of the target class; three of the four rejected samples showed an API mass percentage of the total tablet weight that was out of the range covered in the respective calibration set. Therefore, in future investigations larger and more representative spectral libraries are required for model building. Falsified medicines containing no API, incorrect APIs, or grossly incorrect amounts of the declared APIs could be readily identified. 
Variation between different NIR-S-G1 spectroscopic devices led to a loss of accuracy if spectra recorded with different devices were pooled. Therefore, piecewise direct standardization was applied for calibration transfer. The investigated method is a promising tool for medicine screening studies in low-resource settings.
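The one-class classification idea behind the models above can be illustrated with a minimal SIMCA-style sketch: fit PCA on target-class spectra and reject any spectrum whose reconstruction residual exceeds a threshold learned from the training residuals. This is a simplified stand-in on synthetic data, not the DD-SIMCA algorithm or the study's actual models.

```python
# Sketch: one-class acceptance via PCA reconstruction residuals.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
target = rng.normal(0.0, 1.0, size=(60, 200))  # stand-in spectra of the target API class
other = rng.normal(3.0, 1.0, size=(20, 200))   # stand-in spectra of a different API

pca = PCA(n_components=5).fit(target)

def residual(X):
    # squared reconstruction error per spectrum after projecting onto the PCA subspace
    recon = pca.inverse_transform(pca.transform(X))
    return ((X - recon) ** 2).sum(axis=1)

# simple empirical cutoff at the 95th percentile of training residuals
threshold = np.quantile(residual(target), 0.95)
accepted_other = (residual(other) <= threshold).mean()  # should be near zero
```

Real DD-SIMCA uses a statistically grounded acceptance region rather than this ad-hoc quantile cutoff, and calibration transfer (such as the piecewise direct standardization used here) would be needed before pooling spectra from different devices.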
The Belt and Road Initiative (BRI) has reinforced China's business engagement in Sub-Saharan Africa (SSA). While previous international business research has focused on the internationalization and investments of Chinese companies, this viewpoint uncovers how both local African and international non-Chinese small and medium-sized enterprises (SMEs) may benefit from and participate in the BRI. The focus lies on the infrastructure sector, which accounts for the highest investments since the inception of the BRI in 2013. The motives of SMEs to participate in infrastructure project business in the context of the BRI are explored conceptually. By investigating the challenges of two large transport infrastructure projects, the business potentials for SMEs become visible. It is argued that SMEs find business potentials particularly as investors, sub-contractors and project management experts in the BRI in Sub-Saharan Africa.
The e-sports ecosystem comprises a multitude of actors who profit in different ways from the success of electronic sports: sponsors present themselves to an attractive target group, publishers promote more intensive use of their video games, and broadcasting platforms receive professionally organized content for distribution to a growing audience. Only the team organizations and event organizers have so far concentrated solely on generating revenue by competing in or organizing e-sports competitions. Driven by weak profitability, however, they are trying to reduce their dependence on sponsorship and to tap new revenue sources. Team organizations in particular see the opportunities for this outside the original e-sports business and are thus increasingly developing into marketing and entertainment companies.
The introduction of smart contracts has expanded the applicability of blockchains to many domains beyond finance and cryptocurrencies. Moreover, different blockchain technologies have evolved that target special requirements. As a result, in practice, often a combination of different blockchain systems is required to achieve an overall goal. However, due to the heterogeneity of blockchain protocols, the execution of distributed business transactions that span several blockchains leads to multiple interoperability and integration challenges. Therefore, in this article, we examine the domain of Cross-Chain Smart Contract Invocations (CCSCIs), which are distributed transactions that involve the invocation of smart contracts hosted on two or more blockchain systems. We conduct a systematic multi-vocal literature review to get an overview of the available CCSCI approaches. We select 20 formal literature studies and 13 high-quality gray literature studies, extract data from them, and analyze it to derive the CCSCI Classification Framework. With the help of the framework, we group the approaches into two categories and eight subcategories. The approaches differ in multiple characteristics, e.g., the mechanisms they follow, and the capabilities and transaction processing semantics they offer. Our analysis indicates that all approaches suffer from obstacles that complicate real-world adoption, such as the low support for handling heterogeneity and the need for trusted third parties.
Blockchains have become increasingly important in recent years and have expanded their applicability to many domains beyond finance and cryptocurrencies. This adoption has particularly increased with the introduction of smart contracts, which are immutable, user-defined programs directly deployed on blockchain networks. However, many scenarios require business transactions to simultaneously access smart contracts on multiple, possibly heterogeneous blockchain networks while ensuring the atomicity and isolation of these transactions, which is not natively supported by current blockchain systems. Therefore, in this work, we introduce the Transactional Cross-Chain Smart Contract Invocation (TCCSCI) approach that supports such distributed business transactions while ensuring their global atomicity and serializability. The approach introduces the concept of Resource Manager Smart Contracts, and 2PC for Blockchains (2PC4BC), a client-driven Atomic Commit Protocol (ACP) specialized for blockchain-based distributed transactions. We validate our approach using a prototypical implementation, evaluate its introduced overhead, and prove its correctness.
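The client-driven atomic commit idea behind 2PC4BC can be sketched as a classic two-phase commit over stubbed resource managers. Names and interfaces below are illustrative stand-ins, not the paper's actual smart contract API or on-chain protocol.

```python
# Sketch: client-driven two-phase commit over stand-in resource managers.
class ResourceManagerStub:
    """Stand-in for a resource-manager smart contract on one blockchain."""
    def __init__(self, name, will_prepare=True):
        self.name = name
        self.will_prepare = will_prepare
        self.state = "idle"

    def prepare(self):
        # Phase 1: lock resources and vote yes/no
        self.state = "prepared" if self.will_prepare else "aborted"
        return self.will_prepare

    def commit(self):
        assert self.state == "prepared"
        self.state = "committed"

    def abort(self):
        self.state = "aborted"

def two_phase_commit(participants):
    # Phase 1: collect votes from every participating chain
    votes = [p.prepare() for p in participants]
    # Phase 2: commit globally only if all voted yes, otherwise abort all
    if all(votes):
        for p in participants:
            p.commit()
        return "committed"
    for p in participants:
        p.abort()
    return "aborted"
```

In the actual setting the coordinator role is played by the client, the per-chain state transitions happen inside resource-manager smart contracts, and the protocol must additionally cope with blockchain-specific issues such as finality delays and the absence of a trusted coordinator.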
The Circular Economy aims to reintroduce the value of products into the economic cycle at the same value chain level. While the activities of the Circular Economy are already well-defined, there is a gap in how returned products are treated by industry. This study examines how a process should be designed to handle returned products in the context of the Circular Economy. To achieve this, a machine learning-based algorithm is used to classify data and extract relevant information throughout the product life cycle. The scope of this research is limited to land transportation systems within the Sharing Economy sector.
Why are organizations and markets slow to transform toward sustainability despite the abundant well-recognized opportunities it provides? An important subset of the phenomena this question addresses involves decision-makers recognizing the existence of opportunities but failing to undertake ambitious, effective, sufficient, or timely action. Building on existing research on capability traps, market formation, and managing sustainability, we focus on the forces constraining organizations from developing the capabilities and market infrastructures required for sustainability transformations. We characterize types of sustainability initiatives and, using causal loop diagramming, visualize structures that enable and constrain how organizations can navigate individually and collectively worse-before-better dynamics resulting from uncertain, nonlinear, and delayed returns. Being under day-to-day pressures and deeply intertwined within their environment, organizational actors find it difficult to recognize, undertake, maintain, and coordinate necessary efforts internally and externally. We discuss research implications and directions for future research on avoiding these traps and accelerating sustainability transformations.
Cyber-Physical Production Systems increasingly use semantic information to meet the grown flexibility requirements. Ontologies are often used to represent and use this semantic information. Existing systems focus on mapping knowledge and less on the exchange with other relevant IT systems (e.g., ERP systems) in which crucial semantic information, often implicit, is contained. This article presents an approach that enables the exchange of semantic information via adapters. The approach is demonstrated by a use case utilizing an MES system and an ERP system.
Wear on cutting tools with a geometrically defined cutting edge leads to poor surface quality, increased forces, dimensional deviations, and breakage. Until now, this wear has been measured outside the machine or indirectly (e.g., via the diameter). Tools are exchanged after a defined number of workpieces, a set time, or a cutting distance. This article presents a novel system for directly determining flank wear inside the working area of a machining center. A protected, integrated industrial camera with a lens is installed in the working area, and the machine axes or the machining spindle position the tool in front of it. After a measurement lasting only a few seconds, the wear is evaluated in parallel with machining time.
The EAT–Lancet planetary health diet (PHD) provides guidelines on a global scale and calls for red meat consumption to be halved. Operational PHD guidelines at country level have yet to be determined. Here we argue that the biological link between milk and bovine-meat production must be considered when operationalizing the globally calculated PHD to national contexts. Using a stylized computer simulation model rooted in a food system approach, we explore the impact of dietary scenarios on milk and bovine-meat production and show that ignoring this biological link can lead to substantial imbalances between national dietary guidelines and production outcomes and potentially lead to food waste. Furthermore, we assess current national dietary guidelines in Europe and find that most disregard this biological link and are incompatible with the PHD, with implications for policymakers and consumers to consider when adapting the PHD in national contexts.
Research question: The clinical standard procedure and reference for measuring sleep and classifying the individual sleep stages is polysomnography (PSG). Alternative approaches to this elaborate procedure could offer several advantages if the measurements were carried out in a more comfortable way. The main objective of this research study is to develop an algorithm for the automatic classification of sleep stages that uses movement and respiration signals only [1].
Patients and methods: After analyzing the current research, we chose multinomial logistic regression as the basis for the approach [2]. To increase the accuracy of the evaluation, four features derived from movement and respiration signals were developed. Nocturnal recordings of 35 subjects, provided by Charité-Universitätsmedizin Berlin, were used for the evaluation. The mean age of the participants was 38.6 +/- 14.5 years and the mean BMI was 24.4 +/- 4.9 kg/m2. Since the algorithm works with three stages, the stages N1, N2, and N3 were merged into one NREM stage. The available data set was strictly split into a training set of about 100 h and a test set of about 160 h of nocturnal recordings. Both sets had a similar ratio of men to women, and the mean BMI showed no significant deviation.
Results: The algorithm was implemented and delivered successful results: the accuracy of detecting wake/NREM/REM phases is 73%, with a Cohen's kappa of 0.44 for the 19,324 analyzed sleep epochs of 30 s each. The observed overestimation of the NREM phase can partly be explained by its prevalence in a typical sleep pattern. Even the use of a balanced training data set could not fully resolve this issue.
Conclusions: The achieved results confirmed the basic suitability of the approach. Its advantage is that only movement and respiration signals are used, which can be recorded with less effort and more comfortably for users than, e.g., cardiac or EEG signals. The new system therefore represents a clear improvement over existing approaches. Merging the described algorithmic software with the hardware system for measuring respiration and body-movement signals described in [1] into an autonomous, contactless system for continuous sleep monitoring is a possible direction for future work.
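As a rough, self-contained illustration of the classification approach described above (three-stage wake/NREM/REM staging via multinomial logistic regression, scored with Cohen's kappa), the following sketch trains such a classifier on purely synthetic stand-in features. It is not the authors' implementation, and all data is artificial.

```python
# Illustrative numpy-only sketch (NOT the study's code): multinomial logistic
# regression for 3-class sleep staging on synthetic per-epoch features,
# evaluated with Cohen's kappa as in the abstract.
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)          # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def cohen_kappa(y_true, y_pred, k=3):
    n = len(y_true)
    po = np.mean(y_true == y_pred)                # observed agreement
    pe = np.sum((np.bincount(y_true, minlength=k) / n)
                * (np.bincount(y_pred, minlength=k) / n))  # chance agreement
    return (po - pe) / (1 - pe)

rng = np.random.default_rng(0)
# Synthetic stand-in for four features per 30-s epoch; class labels
# 0=Wake, 1=NREM, 2=REM, with class-dependent feature means.
y = rng.integers(0, 3, size=600)
X = rng.normal(size=(600, 4)) + 2.0 * y[:, None]

# Train multinomial logistic regression by plain gradient descent.
W, b = np.zeros((4, 3)), np.zeros(3)
Y = np.eye(3)[y[:400]]
for _ in range(1000):
    P = softmax(X[:400] @ W + b)
    W -= 0.1 * X[:400].T @ (P - Y) / 400
    b -= 0.1 * (P - Y).mean(axis=0)

pred = np.argmax(softmax(X[400:] @ W + b), axis=1)
kappa = cohen_kappa(y[400:], pred)
```

On this deliberately separable toy data the kappa is high; the study's reported 0.44 on real recordings reflects the much harder real classification problem.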
Context
In a world of high dynamics and uncertainty, it is almost impossible to predict in the long term which products, services, or features will satisfy customer needs. To counter this situation, Continuous Improvement and Design Thinking are common approaches to product discovery. A major constraint in conducting product discovery activities is the high effort required to discover and validate features and requirements. In addition, companies struggle to integrate product discovery activities into their agile processes and iterations.
Objective
This paper suggests a supportive tool, the “Discovery Effort Worthiness (DEW) Index”, for product owners and agile teams to determine a suitable amount of effort to spend on Design Thinking activities. To operationalize DEW, proposals for practitioners are presented that can be used to integrate product discovery into product development and delivery.
Method
A case study was conducted for the development of the DEW index. In addition, we conducted an expert workshop to develop proposals for the integration of product discovery activities into the product development and delivery process.
Results
First, we present the "Discovery Effort Worthiness Index" in the form of a formula. Second, we identified requirements that must be fulfilled for the systematic integration of product discovery activities into product development and delivery. Third, from these requirements we derived proposals for integrating product discovery activities with a company's product development and delivery.
Conclusion
The developed "Discovery Effort Worthiness Index" provides a tool for companies and their product owners to determine how much effort they should spend on Design Thinking methods to discover and validate requirements. Integrating product discovery with product development and delivery should ensure that the results of product discovery are incorporated into product development. This aims to systematically analyze product risks to increase the chance of product success.
The Industry 4.0 paradigm requires concepts for integrating intelligent, smart IoT solutions into manufacturing. Such intelligent solutions are envisioned to increase flexibility and adaptability in smart factories. Especially autonomous cobots capable of adapting to changing conditions are a key enabler for changeable factory concepts. However, identifying the requirements and solution scenarios incorporating intelligent products challenges the manufacturing industry, especially in the SME sector. In pick-and-place scenarios, changing coordinate systems of workpiece carriers cause errors in the placing process. Using the IPIDS framework, this paper describes the development of a tool-center-point positioning method to improve the process stability of a collaborative robot in a changeable assembly workstation. Applying the framework identifies the requirement for an intelligent workpiece carrier as part of the solution. Implementing and evaluating the solution within a changeable factory validates the IPIDS framework.
Due to constantly changing conditions, demand, and technologies, companies increasingly seek flexibility. Productivity results from automation, improved working conditions, and a focus on people in production interacting with machines. Unfortunately, the human factor is often not considered when new concepts are used to increase flexibility and productivity. This work aims to develop a hybrid assistance system that allows a dynamic configuration of cyber-physical production systems, considering the current order situation and available resources, by utilizing simulation. The system also considers human factors in addition to economic factors, which contributes to an extended economic appraisal.
The hard template method for the preparation of monodisperse mesoporous silica microspheres (MPSMs) has been established in recent years. In this process, in situ-generated silica nanoparticles (SNPs) enter the porous organic template and control the size and pore parameters of the final MPSMs. Here, the sizes of the deposited SNPs are determined by the hydrolysis and condensation rates of different alkoxysilanes in a base catalyzed sol–gel process. Thus, tetramethyl orthosilicate (TMOS), tetraethyl orthosilicate (TEOS), tetrapropyl orthosilicate (TPOS) and tetrabutyl orthosilicate (TBOS) were sol–gel processed in the presence of amino-functionalized poly (glycidyl methacrylate-co-ethylene glycol dimethacrylate) (p(GMA-co-EDMA)) templates. The size of the final MPSMs covers a broad range of 0.5–7.3 µm and a median pore size distribution from 4.0 to 24.9 nm. Moreover, the specific surface area can be adjusted between 271 and 637 m2 g−1. Also, the properties and morphology of the MPSMs differ according to the SNPs. Furthermore, the combination of different alkoxysilanes allows the individual design of the morphology and pore parameters of the silica particles. Selected MPSMs were packed into columns and successfully applied as stationary phases in high-performance liquid chromatography (HPLC) in the separation of various water-soluble vitamins.
With the rapid development of globalization, the demand for translation between different languages is also increasing. Although pre-training has achieved excellent results in neural machine translation, existing neural machine translation models have almost no high-quality alignment information suitable for specific fields. Therefore, this paper proposes pre-training neural machine translation with alignment information via optimal transport. First, this paper narrows the representation gap between different languages by using OTAP to generate domain-specific data for information alignment, and learns richer semantic information. Second, this paper proposes a lightweight model, DR-Reformer, which uses Reformer as the backbone network, adds Dropout layers and Reduction layers, reduces model parameters without losing accuracy, and improves computational efficiency. Experiments on the Chinese and English datasets of AI Challenger 2018 and WMT-17 show that the proposed algorithm performs better than existing algorithms.
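The optimal-transport alignment idea can be illustrated with the standard entropic Sinkhorn iteration on a toy cost matrix. This is a generic textbook sketch of optimal transport, not the paper's OTAP method; the cost values are invented for the example.

```python
# Generic Sinkhorn iteration for entropic optimal transport, sketched only
# to illustrate the alignment idea behind optimal-transport methods.
import numpy as np

def sinkhorn(cost, reg=0.1, n_iter=200):
    """Return a transport plan whose marginals match uniform weights."""
    K = np.exp(-cost / reg)                      # Gibbs kernel
    n, m = cost.shape
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)
    u, v = np.ones(n), np.ones(m)
    for _ in range(n_iter):                      # alternating scaling
        u = a / (K @ v)
        v = b / (K.T @ u)
    return u[:, None] * K * v[None, :]

# Toy pairwise cost between 3 source and 3 target token embeddings.
cost = np.array([[0.0, 1.0, 2.0],
                 [1.0, 0.0, 1.0],
                 [2.0, 1.0, 0.0]])
plan = sinkhorn(cost)
```

The resulting plan concentrates mass on low-cost (well-aligned) pairs while respecting the source and target marginals.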
Film formation of the self-synthesized polymer EPM–g–VTMDS (ethylene–propylene rubber, EPM, grafted with vinyltetramethyldisiloxane, VTMDS) was studied regarding bonding to the adhesion promoter vinyltrimethoxysilane (VTMS) on oxidized 18/10 chromium/nickel–steel (V2A) stainless steel surfaces. Polymer films of different mixed solutions including commercial siloxane and silicone, dimethyl, vinyl group terminated crosslinker (HANSA SFA 42100, CAS# 68083-19-2, 0.35 mmol Vinyl/g) and platinum, 1,3-diethenyl-1,1,3,3-tetramethyldisiloxane complex Karstedt's catalyst (ALPA–KAT 1, CAS# 68478-92-2) were spin coated on V2A stainless steel surfaces with adsorbed VTMS thin layers in order to analyze the film formation of EPM–g–VTMDS at early stages. Surface topography and chemical bonding of the high-performance polymers on different oxidized V2A surfaces were investigated with X–ray photoelectron spectroscopy (XPS), atomic force microscopy (AFM), scanning electron microscopy (SEM) and surface enhanced Raman spectroscopy (SERS). AFM and SEM as well as XPS results indicated that the formation of the polymer film proceeds via growth of polymer islands. Chemical signatures of the essential polymer contributions, linker and polymer backbones, could be identified using XPS core level peak shape analysis and also SERS. The appearance of signals related to Si–O–Si can be seen as a clear indication of lateral crosslinking and silica network formation in the films on the V2A surface.
Project managers still face management problems in interorganizational Research and Development (R&D) projects due to their limited authority. Establishing a project culture conducive to cooperation and innovation in interorganizational R&D project management demands the commitment of individual project members and thus balances this limited authority. However, the relational collaboration level at which project culture manifests itself is not addressed by current project management approaches, or is addressed only at a late stage. Consequently, project culture develops within a predefined framework of project organization and organized content and thus is not actively shaped. A shift of focus towards project culture therefore becomes necessary, which can be achieved through project-culture-aware management. The method CLIPS actively supports interorganizational project members in this kind of management. It should be integrable into common project management approaches so that, with its application, all collaboration levels are addressed in interorganizational R&D project management. The goal of this paper is to demonstrate the integrability of the method CLIPS and to show how it can be integrated into common project management approaches. This enriches interorganizational R&D project management with a project culture focus.
The chemical recycling of used motor oil via catalytic cracking to convert it into secondary diesel-like fuels is a sustainable and technically attractive solution for managing environmental concerns associated with traditional disposal. In this context, this study was conducted to screen basic and acidic-aluminum silicate catalysts doped with different metals, including Mg, Zn, Cu, and Ni. The catalysts were thoroughly characterized using various techniques such as N2 adsorption–desorption isotherms, FT-IR spectroscopy, and TG analysis. The liquid and gaseous products were identified using GC, and their characteristics were compared with acceptable ranges from ASTM characterization methods for diesel fuel. The results showed that metal doping improved the performance of the catalysts, resulting in higher conversion rates of up to 65%, compared to thermal (15%) and aluminum silicates (≈20%). Among all catalysts, basic aluminum silicates doped with Ni showed the best catalytic performance, with conversions and yields three times higher than aluminum silicate catalysts. These findings significantly contribute to developing efficient and eco-friendly processes for the chemical recycling of used motor oil. This study highlights the potential of basic aluminum silicates doped with Ni as a promising catalyst for catalytic cracking and encourages further research in this area.
Polyurethane thermosets have a wide range of applications. In this study, alternative raw materials were used to enhance sustainability. In two newly developed biobased polyurethanes (PUs), the cross-linker content was varied, which caused phase separation and therefore affected the turbidity. To investigate this phenomenon, UV–Vis–NIR spectroscopy was utilized. Spectra were recorded from 200 to 2500 nm in transmittance mode, and multivariate data analysis was applied to the three UV, Vis, and NIR sections separately. For the two different PU classes, each with five different cross-linker contents, classification by principal component analysis combined with linear or quadratic discriminant analysis was possible with an accuracy between 93% and nearly 100%. The best separation was achieved in the NIR range. Partial least-squares regression models were determined to predict the cross-linker content. As mentioned, the model for the NIR range is the most suitable, with the highest R2 (validation) of 0.99 for PU1 and 0.98 for PU2. The corresponding root-mean-square error of prediction values of the external validation was the lowest, with 0.82% (PU1) and 1.25% (PU2). Therefore, UV–Vis–NIR absorbance spectroscopy, especially NIR, is a suitable tool for monitoring the appropriate material composition of turbid PU thermosets in line.
In clothing e-commerce, the challenge of optimally recommending clothing that suits a user’s unique characteristics remains a pressing issue. Many platforms simply recommend best-selling or popular clothing, without taking into account important attributes like the user’s face color, pupil color, face shape, or age. To solve this problem, this paper proposes a personalized clothing recommendation algorithm that incorporates the established 4-Season Color System and user-specific biological characteristics. First, the attributes and colors of clothing are classified by an Fnet network, which can learn disjoint label combinations and mitigate the issue of excessive labels. Second, on the basis of the 4-Season Color System, the user’s face color model is trained by a combined MobileNetV3_DTL, which ensures the model’s generalization and improves the training speed. Third, the user’s face shape and age are divided into different categories by an Inception network. Finally, according to the user’s face color, age, face shape, and other information, personalized clothing is recommended in a coarse-to-fine manner. Experiments on five datasets demonstrate that the algorithm proposed in this paper achieves state-of-the-art results.
Determination of the gel point of formaldehyde-based wood adhesives by using a multiwave technique
(2023)
Determining the instant of gelation of formaldehyde-based wood adhesives as an assessment parameter for their curing rate is important for optimizing the curing behavior. Due to the stoichiometrically imbalanced networks of formaldehyde-based adhesives, the crossover point of the storage modulus G′ and the loss modulus G″ cannot unconditionally be assumed to be the gel point in oscillatory time sweeps, as the material response is frequency-dependent. This study aims to determine the gel point of selected adhesives by the isothermal multiwave oscillatory shear test. A thorough comparison between the gel point and the crossover point of G′ and G″ is performed. Rheokinetic analysis showed no significant difference between the activation energies calculated at the gel point determined by a multiwave test and at the crossover point obtained by the time sweep test. Hence, for resins with similar curing reactions, a reliable determination of the gel point by applying a multiwave test is needed to compare their reactivity.
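Numerically, the G′/G″ crossover in a time sweep can be located by finding the sign change of G′ − G″ and interpolating linearly. The sketch below uses synthetic toy moduli curves, not multiwave data, and deliberately omits the frequency-independence check that distinguishes the multiwave gel-point criterion from the simple crossover.

```python
# Illustrative numeric sketch: locating the G'/G'' crossover time on
# synthetic moduli curves (toy data, not rheometer measurements).
import numpy as np

def crossover_time(t, g_storage, g_loss):
    """Return the time where G' first crosses G'' (linear interpolation)."""
    diff = g_storage - g_loss
    idx = np.where(np.diff(np.sign(diff)) != 0)[0][0]   # first sign change
    t0, t1 = t[idx], t[idx + 1]
    d0, d1 = diff[idx], diff[idx + 1]
    return t0 - d0 * (t1 - t0) / (d1 - d0)

t = np.linspace(0, 100, 501)          # curing time in s (toy)
g_loss = np.full_like(t, 50.0)        # flat loss modulus (toy)
g_storage = t                         # linearly growing storage modulus (toy)

t_gel = crossover_time(t, g_storage, g_loss)
```

For these toy curves the crossover lands at t = 50 s; in a multiwave test one would additionally require the loss tangent to be frequency-independent at the gel point.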
Sol-gel-controlled size and morphology of mesoporous silica microspheres using hard templates
(2023)
Mesoporous silica microspheres (MPSMs) represent a promising material as a stationary phase for HPLC separations. The use of hard templates provides a preparation strategy for producing such monodisperse silica microspheres. Here, 15 MPSMs were systematically synthesized by varying the sol–gel reaction parameters of water-to-precursor ratio and ammonia concentration in the presence of a porous p(GMA-co-EDMA) polymeric hard template. Changing the sol–gel process factors resulted in a wide range of MPSMs with particle sizes varying from smaller than one to several micrometers. The application of response surface methodology made it possible to derive quantitative predictive models based on the process factor effects on particle size, pore size, pore volume, and specific surface area of the MPSMs. A narrow size distribution of the silica particles was maintained over the entire experimental space. Two larger-scale batches of MPSMs were prepared, and the particles were functionalized with trimethoxy(octadecyl)silane for application as a stationary phase in reversed-phase liquid chromatography. The separation of proteins and amino acids was successfully accomplished, and the effect of the pore properties of the silica particles on the separation was demonstrated.
Mesoporous silica microspheres (MPSMs) find broad application as separation materials in high-performance liquid chromatography (HPLC). A promising preparation strategy uses p(GMA-co-EDMA) hard templates to control the pore properties and ensure a narrow size distribution of the MPSMs. Here, six hard templates were prepared that differ in their porosity and surface functionalization. This was achieved by altering the ratio of GMA to EDMA and by adjusting the proportion of monomer and porogen in the polymerization process. The various amounts of GMA incorporated into the polymer network of P1-6 lead to different amounts of tetraethylene pentamine in the p(GMA-co-EDMA) template. This was established by a partial least squares regression (PLS-R) model based on FTIR spectra of the templates. Deposition of silica nanoparticles (SNPs) into the template under Stoeber conditions and subsequent removal of the polymer by calcination result in MPSM1-6. The size of the SNPs and their incorporation depend on the pore parameters of the template and the degree of TEPA functionalization. Moreover, the incorporated SNPs construct the silica network and control the pore parameters of the MPSMs. Functionalization of the MPSMs with trimethoxy(octadecyl)silane allows their use as a stationary phase for the separation of biomolecules. The pore characteristics and the functionalization of the template determine the pore structure of the silica particles and, consequently, their separation properties.
External charging infrastructure can be supplied from the power grid of a public property in a legally compliant way. Until now, the requirement was to realize the supply via a separate (new) grid connection point. The solution presented here is ecologically, economically, and technically considerably more favorable and serves as a model for the further development of state-owned parking space throughout Baden-Württemberg. A virtual power plant enables community-oriented operation.
High-performance liquid chromatography is one of the most important analytical tools for the identification and separation of substances. The efficiency of this method is largely determined by the stationary phase of the columns. Although monodisperse mesoporous silica microspheres (MPSM) represent a commonly used material as stationary phase their tailored preparation remains challenging. Here we report on the synthesis of four MPSMs via the hard template method. Silica nanoparticles (SNPs) which form the silica network of the final MPSMs were generated in situ from tetraethyl orthosilicate (TEOS) in the presence of (3-aminopropyl) triethoxysilane (APTES) functionalized p(GMA-co-EDMA) as hard template. Methanol, ethanol, 2-propanol, and 1-butanol were applied as solvents to control the size of the SNPs in the hybrid beads (HB). After calcination, MPSMs with different sizes, morphology and pore properties were obtained and characterized by scanning electron microscopy, nitrogen adsorption and desorption measurements, thermogravimetric analysis, solid state NMR and DRIFT IR spectroscopy. Interestingly, the 29Si NMR spectra of the HBs show T and Q group species which suggests that there is no covalent linkage between the SNPs and the template. The MPSMs were functionalized with trimethoxy (octadecyl) silane and used as stationary phases in reversed-phase chromatography to separate a mixture of eleven different amino acids. The separation characteristics of the MPSMs strongly depend on their morphology and pore properties which are controlled by the solvent during the preparation of the MPSMs. Overall, the separation behavior of the best phases is comparable with those of commercially available columns. The phases even achieve faster separation of the amino acids without loss of quality.
The benefits of urban data cannot be realized without a political and strategic view of data use. A core concept within this view is data governance, which aligns strategy in data-relevant structures and entities with data processes, actors, architectures, and overall data management. Data governance is not a new concept and has long been addressed by scientists and practitioners from an enterprise perspective. In the urban context, however, data governance has only recently attracted increased attention, despite the unprecedented relevance of data in the advent of smart cities. Urban data governance can create semantic compatibility between heterogeneous technologies and data silos and connect stakeholders by standardizing data models, processes, and policies. This research provides a foundation for developing a reference model for urban data governance, identifies challenges in dealing with data in cities, and defines factors for the successful implementation of urban data governance. To obtain the best possible insights, the study carries out qualitative research following the design science research paradigm, conducting semi-structured expert interviews with 27 municipalities from Austria, Germany, Denmark, Finland, Sweden, and the Netherlands. The subsequent data analysis based on cognitive maps provides valuable insights into urban data governance. The interview transcripts were transferred and synthesized into comprehensive urban data governance maps to analyze entities and complex relationships with respect to the current state, challenges, and success factors of urban data governance. The findings show that each municipal department defines data governance separately, with no uniform approach. Given cultural factors, siloed data architectures have emerged in cities, leading to interoperability and integrability issues. 
A city-wide data governance entity in a cross-cutting function can be instrumental in breaking down silos in cities and creating a unified view of the city’s data landscape. The further identified concepts and their mutual interaction offer a powerful tool for developing a reference model for urban data governance and for the strategic orientation of cities on their way to data-driven organizations.
While driving, stress is caused by situations in which drivers estimate their ability to manage the driving demands as insufficient or lose the capability to handle the situation. This leads to increased numbers of driver mistakes and traffic violations. Additional stressing factors are time pressure, road conditions, or dislike of driving. Stress therefore affects driver and road safety. Stress is classified into two categories depending on its duration and its effects on the body and psyche: short-term eustress and constantly present distress, which causes degenerative effects. In this work, we focus on distress. Wearable sensors are handy tools for collecting biosignals like heart rate and activity, and their easy installation and non-intrusive nature make them convenient for estimating stress. This study focuses on the investigation of stress and its implications. Specifically, the research analyzes stress within a select group of individuals from both Spain and Germany. The primary objective is to examine the influence of recognized psychological factors, including personality traits such as neuroticism, extroversion, and psychoticism, on stress and road safety. Stress levels were estimated through the collection of physiological parameters (R-R intervals) using a Polar H10 chest strap. We observed that personality traits such as extroversion exhibited similar trends during relaxation, with an average heart rate 6% higher in Spain and 3% higher in Germany. However, while driving, introverts on average experienced more stress, with rates 4% and 1% lower than extroverts in Spain and Germany, respectively.
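Two standard quantities that can be derived from chest-strap R-R intervals are sketched below: mean heart rate and RMSSD, a common heart-rate-variability measure. These are generic textbook formulas offered for illustration, not necessarily the study's stress-estimation pipeline, and the interval values are invented.

```python
# Hedged sketch: generic heart-rate and HRV quantities from R-R intervals.
import numpy as np

def hr_and_rmssd(rr_ms):
    """Mean heart rate (bpm) and RMSSD (ms) from R-R intervals in ms."""
    rr = np.asarray(rr_ms, dtype=float)
    mean_hr = 60000.0 / rr.mean()               # beats per minute
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))  # root mean square of
    return mean_hr, rmssd                       # successive differences

# Toy recording: beats around 1000 ms with small variability.
hr, rmssd = hr_and_rmssd([1000, 980, 1020, 990, 1010])
```

Lower RMSSD under load is one commonly used physiological indicator of elevated stress.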
The fierce market competition environment makes employees feel insecure at work. While it is difficult for enterprises to provide employees with a sense of security, they have to rely on employees’ innovative behavior to seek competitive advantage. Therefore, this study focuses on how employees engage in innovative behavior when they face job insecurity.
Methods
Using a variable-centered approach, this study examines the mediating effects of intrinsic and impression management motivation in the relationship between quantitative and qualitative job insecurity and innovative behavior, including proactive and reactive innovative behavior. In addition, a person-centered approach is used to investigate whether it is possible to distinguish different combinations of quantitative and qualitative job insecurity, and to examine the effect of these job insecurity profiles on motivation and innovative behavior. We used 503 data sets collected via the Credamo platform in China in the data analysis.
Results
The study found that quantitative job insecurity affects proactive and reactive innovative behavior through impression management motivation, and that qualitative job insecurity affects proactive and reactive innovative behavior through intrinsic and impression management motivation. In addition, three job insecurity profiles were identified: balanced high job insecurity, balanced low job insecurity, and a profile dominated by high quantitative job insecurity, all of which have significantly different effects on motivation and innovative behavior.
Discussion
This study provides new insights into the relationship between job insecurity and innovative behavior and compensates for the limitation of the traditional variable-centered approach, which cannot capture heterogeneity within the workforce.
Project-based learning (PBL) is an ideal method for teaching students at universities practical project management skills. Even demanding projects become possible this way. However, balancing the intended learning objectives with the practical execution of the project is challenging in higher-education practice. With the help of the 'PBL Gold Standard', PBL projects can be designed in a targeted way and checked for effectiveness with regard to the learning objectives. The practical implementation of a PBL project is demonstrated using the example of the 'IP Plane' project at Reutlingen University, in which students build a powered aircraft.
Mobile assistance systems (MAS) promise to overcome personnel shortages in operating theatres worldwide. A literature review inspired by the PRISMA 2020 method determines the state of the art of MAS, and identifies a lack of application areas for MAS in the operating theatre. Interviews with subject-matter experts aim to investigate application areas for MAS. The results show that most operational tasks refer to material management and patient management. MAS, with their potential to reduce the time needed for material and patient management, and the physical and mental strain of patient management, have great potential in the operating theatre.
5G campus networks are promising environments for industrial applications in production and intralogistics. So far, however, they do not deliver the promised performance needed to offer intralogistics applications the full potential of 5G. The performance measurements collected in the 5G4KMU project and presented in this article serve to evaluate the current practical suitability of 5G campus networks.
In the past, plant layouts were regarded as highly static structures. With increasing internal and external factors causing turbulence in operations, it has become more necessary for companies to adapt to new conditions in order to maintain optimal performance. One possible way for such an adaptation is the adjustment of the plant layout by rearranging the individual facilities within the plant. Since the information about the plant layout is considered as master data and changes have a considerable impact on interconnected processes in production, it is essential that this data remains accurate and up-to-date. This paper presents a novel approach to create a digital shadow of the plant layout, which allows the actual state of the physical layout to be continuously represented in virtual space. To capture the spatial positions and orientations of the individual facilities, a pan-tilt-zoom camera in combination with fiducial markers is used. With the help of a prototypically implemented system, the real plant layout was captured and converted into different data formats for further use in exemplary external software systems. This enabled the automatic updating of the plant layout for simulation, analysis and routing tasks in a case study and showed the benefits of using the proposed system for layout capturing in terms of accuracy and effort reduction.
Gender marketing is gaining importance in both marketing theory and corporate practice. The difference between the sexes shows itself not only in different abilities and attitudes, but also in different needs and purchasing behavior. Many products are developed by men for men. Products aimed specifically at women are often brought to market according to the motto "pink it and shrink it". Successfully implementing gender aspects is an important marketing challenge for companies in the future.
Hybrid work models are considered the future of work. Accordingly, this study examines hybrid work models in German small and medium-sized enterprises (SMEs) compared with large companies. Using a multi-method study consisting of a survey and qualitative expert interviews, it evaluates to what extent hybrid work models are already established in SMEs and which challenges they must overcome. It also considers whether sociodemographic factors such as age, gender, or role in the company influence hybrid working. The results show that the establishment of hybrid work models is less advanced in SMEs than in large companies. SMEs face diverse challenges, attributable, for example, to insufficient digitalization or more traditional structures. Corporate culture, the role within the company, and the influence of managers play a particularly important part. Practical relevance: Most of the existing literature on New Work and hybrid work focuses on all company sizes together or on large companies. Because of specific characteristics such as limited access to resources, results from large companies can hardly be transferred to SMEs. This study therefore provides guidance on how hybrid work models can be implemented sensibly and profitably in SMEs and which challenges arise.
Assistant platforms
(2023)
Many assistant systems have evolved toward assistant platforms. These platforms combine a range of resources from various actors via a declarative and generative interface. Among the examples are voice-oriented assistant platforms like Alexa and Siri, as well as text-oriented assistant platforms like ChatGPT and Bard. They have emerged as valuable tools for handling tasks without requiring deeper domain expertise and have attracted considerable attention amid the recent advances in generative artificial intelligence. In view of their growing popularity, this Fundamental outlines the key characteristics and capabilities that define assistant platforms. The former comprise a multi-platform architecture, a declarative interface, and a multi-platform ecosystem, while the latter include capabilities for composition, integration, prediction, and generativity. Based on this framework, a research agenda is proposed along the capabilities and affordances of assistant platforms.
Determinants of customer recovery in retail banking - lessons from a German banking case study
(2023)
Due to the increased willingness of retail banking customers to switch and churn their banking relationships, a question arises: Is it possible to win back lost customers, and if so, is such a possibility even desirable after all economic factors have been considered? To answer these questions, this paper examines selected determinants for the recovery of terminated customer–bank relationships from the perspective of former customers. This study therefore evaluates for the first time, empirically and systematically with reference to a German Sparkasse as a case-study setting, whether lost customers have a sufficient general willingness to return (GWR) to a retail banking relationship. Our results show a correlation between the GWR and specific determinants: variety seeking, attractiveness of alternatives, and customer satisfaction with the former business relationship. In addition, we show that a customer's GWR varies depending on the reason for churn and is, surprisingly, greater when the customer defected for reasons that lie within the customer's own sphere. Despite the case-study character, our results provide relevant insights for other banks; in particular, this applies to countries with a comparable banking system.
Monitoring heart rate and breathing is essential in understanding the physiological processes for sleep analysis. Polysomnography (PSG) systems have traditionally been used for sleep monitoring, but alternative methods can help make sleep monitoring more portable in someone's home. This study conducted a series of experiments to investigate the use of pressure sensors placed under the bed as an alternative to PSG for monitoring heart rate and breathing during sleep. Subsequent sets of experiments involved the addition of small rubber domes - transparent and black - that were glued to the pressure sensor. The resulting data were compared with the PSG system to determine the accuracy of the pressure sensor readings. The study found that the pressure sensor provided reliable data for extracting heart rate and respiration rate, with mean absolute errors (MAE) of 2.32 and 3.24 for respiration and heart rate, respectively. However, the addition of small rubber hemispheres did not significantly improve the accuracy of the readings, with MAEs of 2.3 breaths per minute and 7.56 bpm for respiration rate and heart rate, respectively. The findings of this study suggest that pressure sensors placed under the bed may serve as a viable alternative to traditional PSG systems for monitoring heart rate and breathing during sleep. These sensors provide a more comfortable and non-invasive method of sleep monitoring. However, the addition of small rubber domes did not significantly enhance the accuracy of the readings, indicating that it may not be a worthwhile addition to the pressure sensor system.
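The comparison against the PSG reference above comes down to a mean absolute error over paired rate estimates. A minimal sketch of that computation, using illustrative numbers rather than the study's data:

```python
# Sketch: mean absolute error (MAE) between sensor-derived rates and a
# PSG reference, as used to compare the two measurement methods.
# The sample values below are illustrative, not the study's data.

def mean_absolute_error(estimates, reference):
    """MAE between paired rate estimates (e.g. breaths/min or bpm)."""
    assert len(estimates) == len(reference)
    return sum(abs(e - r) for e, r in zip(estimates, reference)) / len(estimates)

# Hypothetical per-epoch respiration rates (breaths per minute)
sensor_rr = [14.2, 15.1, 13.8, 16.0]
psg_rr = [14.0, 14.5, 14.2, 15.0]
print(round(mean_absolute_error(sensor_rr, psg_rr), 2))  # → 0.55
```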
The Covid-19 virus triggered a worldwide pandemic, and many employees were therefore required to work from home, which caused numerous challenges. With the Covid-19 pandemic now in its third year, several studies are already available on the subject of working from home. To investigate the impact of remote work on employee satisfaction and trust, this quantitative study reviews existing results and formulates hypotheses based on a conceptual model created through a qualitative study and an extensive literature review. The research question is as follows: Does working from home during Covid-19 affect employee satisfaction and trust? To test the hypotheses, a structural equation model was constructed and analyzed. A culture of trust and flexibility are identified as the biggest influencing factors in this study.
Sleep is an essential part of human existence, as we are in this state for approximately a third of our lives. Sleep disorders are common conditions that can affect many aspects of life. Sleep disorders are diagnosed in special laboratories with a polysomnography system, a costly procedure requiring much effort from the patient. Several systems have been proposed to address this situation, including performing the examination and analysis at the patient's home, using sensors to detect physiological signals automatically analysed by algorithms. This work aims to evaluate the use of a contactless respiratory recording system based on an accelerometer sensor in sleep apnea detection. For this purpose, an installation mounted under the bed mattress records the oscillations caused by the chest movements during the breathing process. The presented processing algorithm filters the obtained signals and determines the presence of apnea events. The performance of the developed system and apnea event detection algorithm (average values of accuracy, specificity and sensitivity are 94.6%, 95.3%, and 93.7%, respectively) confirms the suitability of the proposed method and system for further ambulatory and in-home use.
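The three performance figures reported for the detector are standard confusion-matrix metrics. A minimal sketch of how they are computed, with illustrative event counts (not the study's data):

```python
# Sketch: accuracy, sensitivity, and specificity for apnea event
# detection, computed from confusion-matrix counts.
# The counts below are illustrative, not taken from the paper.

def detection_metrics(tp, tn, fp, fn):
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)   # share of apnea events correctly detected
    specificity = tn / (tn + fp)   # share of normal breathing correctly rejected
    return accuracy, sensitivity, specificity

acc, sens, spec = detection_metrics(tp=93, tn=95, fp=5, fn=7)
print(f"accuracy={acc:.3f} sensitivity={sens:.3f} specificity={spec:.3f}")
```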
Healthy sleep is one of the prerequisites for a good human body and brain condition, including general well-being. Unfortunately, there are several sleep disorders that can negatively affect this. One of the most common is sleep apnoea, in which breathing is impaired. Studies have shown that this disorder often remains undiagnosed. To avoid this, developing a system that can be widely used in a home environment to detect apnoea and monitor the changes once therapy has been initiated is essential. The conceptualisation of such a system is the main aim of this research. After a thorough analysis of the available literature and state of the art in this area of knowledge, a concept of the system was created, comprising the following main components: data acquisition (in two parts), data storage, an apnoea detection algorithm, user and device management, and data visualisation. The modules are interchangeable, and interfaces have been defined for data transfer, most of which operate using the MQTT protocol. System diagrams and detailed component descriptions, including signal requirements and visualisation mockups, have also been developed. The system's design includes the necessary concepts for the implementation and can be realised in a prototype in the next phase.
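Since most module interfaces in the concept exchange data over MQTT, a message from the data-acquisition component could look roughly as follows. This is a purely illustrative sketch: the topic layout and payload fields are assumptions, not taken from the paper.

```python
import json

# Sketch: assembling a hypothetical MQTT message for the
# data-acquisition module. Topic names and payload fields are
# illustrative assumptions, not the system's actual interface.
def build_sample_message(device_id, samples, rate_hz):
    topic = f"apnea/{device_id}/respiration"
    payload = json.dumps({
        "device": device_id,
        "rate_hz": rate_hz,   # sampling rate of the sensor
        "samples": samples,   # raw sensor values for one window
    })
    return topic, payload

topic, payload = build_sample_message("bed-01", [0.12, 0.15, 0.11], 50)
print(topic)  # → apnea/bed-01/respiration
```

In a full implementation, the payload would be handed to an MQTT client library for publishing; the sketch only fixes the data contract between modules.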
The influence of sleep on human health is enormous. Accordingly, sleep disorders can have a negative impact on it. To avoid this, they should be identified and treated in time. For this purpose, objective (with an appropriate device) or subjective (based on perceived values) measurement methods are used for sleep analysis to understand the problem. The aim of this work is to find out whether an exchange of the two methods is possible and can provide reliable results. In accordance with this goal, a study was conducted with people aged over 65 (a total of 154 night-time recordings) in which both measurement methods were compared. Sleep questionnaires and electronic devices for sleep assessment placed under the mattress were applied to achieve the study aims. The obtained results indicated that a correlation between the two measurement methods could be observed for sleep characteristics such as total sleep time, total time in bed and sleep efficiency. However, there are also significant differences in absolute values between the two measurement approaches for some subjects and nights, which leads us to conclude that the substitution is more likely to be suitable for long-term monitoring, where trends matter more than the absolute values for individual nights.
Development of an expert system to overpass citizens technological barriers on smart home and living
(2023)
Adopting new technologies can be overwhelming, even for people with experience in the field. For the general public, keeping up with new implementations, releases, brands, and enhancements can cause them to lose interest. There is a clear need to create point sources and platforms that provide helpful information about novel and smart technologies, assisting users, technicians, and providers with products and technologies. The purpose of these platforms is twofold, as they can gather and share information on interests common to manufacturers and vendors. This paper presents the "Finde-Dein-SmartHome" tool, developed in association with the Smart Home & Living competence center [5] to help users learn about, understand, and purchase available technologies that meet their home automation needs. This tool aims to lower the usability barrier and guide potential customers to clear their doubts about privacy and pricing. Communities can use the information provided by this tool to identify market trends that could eventually lower costs for providers and incentivize access to innovative home technologies and devices supporting long-term care.
Managerial accountants spend a large part of their working time on more operational activities in cost accounting, reporting, and operational planning and budgeting. In all these areas, there has been increasing discussion in recent years, both in theory and practice, about using more digital technologies. For reporting, this means not only an intensified discussion of technologies such as RPA and AI but also more intensive changes to existing reporting systems. In particular, management information systems (MIS), which are maintained by managerial accountants and used by managers for corporate management, should be mentioned here. Based on an empirical survey in a large German company, this article discusses the requirements and assessments of users when switching from a regular MIS to a cloud-based system.
The influence of trust on the adherence to investment recommendations in the context of robo-advisors is under-researched. This relationship needs to be better understood because robo-advice lacks a critical element of trust: human interaction. Theory suggests that ability, integrity, and benevolence are key factors in building trust in human advisors. Using an experimental study design, our research examines the relationship between a robo-advisor's trust attributes and the acceptance of its investment advice. The results show that trust in a robo-advisor increases the propensity to follow its recommendations. While ability and integrity are significant, benevolence is not. The study contributes to the research on technology acceptance, trust, and the adoption of technology-based recommendations by improving the understanding of the relationship between trust and the acceptance of automated investment recommendations.
In increasingly complex production environments, tremendous efforts are being made to optimize the efficiency of a production system. An important efficiency factor is industrial maintenance, both influencing the cost and securing the technical availability of machines and components. Maintenance managers are required to deliver the necessary availability of the production system while minimizing the resources needed to do so. To make this possible, a method to evaluate the dependency between the technical availability of an entire production system and maintenance resources is necessary. This paper presents a systematic literature review of such methods. In order to assess the methods proposed in the literature, requirements are first developed, including a necessary focus on maintenance strategies within these methods. Including maintenance strategies is necessary since they provide the foundation for both the availability of a component and the maintenance resources needed. In total, 13 requirements are developed, and 21 different methods are evaluated. Only one of the proposed methods addresses all requirements, with others lacking possible combinations of maintenance strategies and the resulting influences on the production system.
Introduction: Telemedicine reduces greenhouse gas emissions (CO2eq); however, results of studies vary extremely in dependence of the setting. This is the first study to focus on effects of telemedicine on CO2 imprint of primary care.
Methods: We conducted a comprehensive retrospective study to analyze total CO2eq emissions of kilometers (km) saved by telemedical consultations. We categorized prevented and provoked patient journeys, including pharmacy visits. We calculated CO2eq emission savings through primary care telemedical consultations in comparison to those that would have occurred without telemedicine. We used the comprehensive footprint approach, including all telemedical cases and the CO2eq emissions by the telemedicine center infrastructure. In order to determine the net ratio of CO2eq emissions avoided by the telemedical center, we calculated the emissions associated with the provision of telemedical consultations (including also the total consumption of physicians’ workstations) and subtracted them from the total of avoided CO2eq emissions. Furthermore, we also considered patient cases in our calculation that needed to have an in-person visit after the telemedical consultation. We calculated the savings taking into account the source of the consumed energy (renewable or not).
Results: 433 890 telemedical consultations overall helped save 1 800 391 km in travel. On average, 1 telemedical consultation saved 4.15 km of individual transport and consumed 0.15 kWh. We detected savings in almost every cluster of patients. After subtracting the CO2eq emissions caused by the telemedical center, the data reveal savings of 247.1 net tons of CO2eq emissions in total and of 0.57 kg CO2eq per telemedical consultation. The comprehensive footprint approach thus indicated a reduced footprint due to telemedicine in primary care.
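The per-consultation averages reported in the results follow directly from the aggregate figures, as this short sketch reproduces (numbers taken from the text above):

```python
# Reproducing the reported per-consultation averages from the study's
# aggregate figures.
consultations = 433_890
km_saved_total = 1_800_391
net_savings_tons = 247.1  # net tons CO2eq after subtracting center emissions

km_per_consultation = km_saved_total / consultations
kg_co2eq_per_consultation = net_savings_tons * 1000 / consultations

print(round(km_per_consultation, 2))        # → 4.15 km
print(round(kg_co2eq_per_consultation, 2))  # → 0.57 kg
```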
Discussion: Integrating a telemedical center into the health care system reduces the CO2 footprint of primary care medicine; this is true even in a densely populated country with little use of cars like Switzerland. The insight of this study complements previous studies that focused on narrower aspects of telemedical consultations.
The development of automatic solutions for the detection of physiological events of interest is booming. Improvements in the collection and storage of large amounts of healthcare data allow access to these data faster and more efficiently. This means that the development of artificial intelligence models for the detection and monitoring of a large number of pathologies is becoming increasingly common in the medical field. In particular, developing deep learning models for detecting obstructive sleep apnea (OSA) events is at the forefront. Numerous scientific studies focus on the architecture of the models and the results that these models can provide in terms of OSA classification and Apnea-Hypopnea-Index (AHI) calculation. However, little focus is put on other aspects of great relevance that are crucial for the training and performance of the models. Among these aspects are the set of physiological signals used and the preprocessing tasks prior to model training. This paper covers the essential requirements that must be considered before training a deep learning model for obstructive sleep apnea detection, in addition to covering solutions that currently exist in the scientific literature by analyzing the preprocessing tasks prior to training.
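One typical preprocessing step in the literature the paper surveys is smoothing the raw physiological signal before model training. As a hedged illustration, a simple moving average stands in here for the band-pass filters commonly used; the signal values are made up for the example:

```python
# Sketch: smoothing a raw respiratory signal before model training.
# A plain moving average is used as a stand-in for the filters
# typically applied in OSA pipelines; values are illustrative.

def moving_average(signal, window):
    """Return the moving average of `signal` with the given window size."""
    out = []
    for i in range(len(signal) - window + 1):
        out.append(sum(signal[i:i + window]) / window)
    return out

raw = [0.0, 1.0, 0.0, -1.0, 0.0, 1.0, 0.0, -1.0]
print(moving_average(raw, 4))
```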
Through their procyclical behavior, loan loss provisions have been determined as one of the factors that contribute to financial instability during a crisis. IFRS 9 was introduced in 2018 with an expected credit loss model replacing the incurred loss model of IAS 39 to mitigate the effect in the future. Our study aims to analyze loan loss provisions of major banks in the Eurozone to determine for the first time if the implementation of IFRS 9, as intended by regulators, has a dampening effect on procyclicality, especially during the stressed situation under COVID‐19. We analyze 51 banks from 12 countries of the European Monetary Union using 2856 firm‐year observations. While no robust evidence of less procyclicality can be found after the implementation of IFRS 9 until the pandemic, we find evidence that loan loss provisions moved countercyclical during 2020, indicating an alleviating effect at the beginning of the exogenous shock.
During the first years of the last decade, Egypt used to face recurrent electricity cut-offs in summer. In the past few years, the electricity tariff dramatically increased. Radiative cooling to the clear night sky is a renewable energy source that represents a partial solution. The dry desert climate promotes nocturnal radiative cooling applications. This study investigates the potential of nocturnal radiative cooling systems (RCSs) to reduce the energy consumption of the residential building sector in Egypt. The system technology proposed in this work is based on uncovered solar thermal collectors integrated into the building hydronic system. By implementing different control strategies, the same system could be used for both cooling and heating applications. The goal of this paper is to analyze the performance of RCSs in residential buildings in Egypt. The dynamic simulation program TRNSYS was used to simulate the thermal behavior of the system. The relevant issues of Egypt as a case study are first reviewed. Then the paper introduces the work done to develop a building model that represents a typical residential apartment in Egypt. Typical occupancy profiles were developed to define the internal thermal gains. The adopted control strategy to optimize the system operation is presented as well. To fully understand and hence evaluate the operation of the proposed RCS, four simulation cases were considered: 1. a reference case (fully passive), 2. the stand-alone operation of the RCS, 3. ideal heating & cooling operation (fully active), and 4. the hybrid operation (when the active cooling system is supported by the proposed RCS). The analysis considered the three main distinct climates in Egypt, represented by the cities of Alexandria, Cairo and Asyut. The hotter and drier weather conditions resulted in a higher cooling potential and larger temperature differences. The simulated cooling power in Asyut was 28.4 W/m² for a 70 m² absorber field.
For a smaller field area of 10 m², the cooling power reached 109 W/m², but with modest temperature differences. To meet the rigorous thermal comfort conditions, the proposed sensible RCS cannot fully replace conventional air-conditioning units, especially in humid areas like Alexandria. When working in a hybrid system, a 10% reduction in the active cooling energy demand could be achieved in Asyut to keep the cooling set-point at 24 °C. This percentage reduction was nearly doubled when the thermal comfort set-point was increased by two degrees (26 °C). In a sensitivity analysis, external shading devices as a passive measure, as well as the implementation of the Egyptian code for buildings (ECP306/1–2005), were also investigated. The analysis of this study raised other relevant aspects to discuss, e.g. system sizing, environmental effects, limitations and recommendations.
This study introduces a straightforward approach to construct three-dimensional (3D) surface-enhanced Raman spectroscopy (SERS) substrates using chemically modified silica particles as microcarriers and by attaching metal nanoparticles (NPs) onto their surfaces. Tollens’ reagent and sputtering techniques are utilized to prepare the SERS substrates from mercapto-functionalized silica particles. Treatment with Tollens’ reagent generates a variety of silver NPs, ranging from approximately 10 to 400 nm, while sputtering with gold (Au) yields uniformly distributed NPs with an island-like morphology. Both substrates display wide plasmon resonances in the scattering spectra, making them effective for SERS in the visible spectral range, with enhancement factors (ratio of the analyte’s intensity at the hotspot compared to that on the substrate in the absence of metal nanoparticles) of up to 25. These 3D substrates have a significant advantage over traditional SERS substrates because their active surface area is not limited to a 2D surface but offers a much greater active surface due to the 3D arrangement of the NPs. This feature may enable achieving much higher SERS intensity from within streaming liquids or inside cells/tissues.
Thin, flat textile roofing offers negligible heat insulation. In warm areas, such roofing membranes are therefore equipped with metallized surfaces to reflect solar heat radiation, thus reducing the warming inside a textile building. Heat reflection effects achieved by metallic coatings are always accompanied by shading effects as the metals are non-transparent for visible light (VIS). Transparent conductive oxides (TCOs) are transparent for VIS and are able to reflect heat radiation in the infrared. TCOs are, e.g., widely used in the display industry. To achieve the perfect coatings needed for electronic devices, these are commonly applied using costly vacuum processes at high temperatures. Vacuum processes, on account of the high costs involved and high processing temperatures, are obstructive for an application involving textiles. Accepting that heat-reflecting textile membranes demand less perfect coatings, a wet chemical approach has been followed here when producing transparent heat-reflecting coatings. Commercially available TCOs were employed as colloidal dispersions or nanopowders to prepare sol-gel-based coating systems. Such coatings were applied to textile membranes as used for architectural textiles using simple coating techniques and at moderate curing temperatures not exceeding 130 °C. The coatings achieved about 90% transmission in the VIS spectrum and reduced near-infrared transmission (at about 2.5 µm) to nearly zero while reflecting up to 25% of that radiation. Up to 35% reflection has been realized in the far infrared, and emissivity values down to ε = 0.5777 have been measured.
Geopolitical risks have been highly relevant to the success and survival of companies, and not only since the outbreak of the war in Ukraine. Only by building methodological competence to identify these particular risks do companies create the necessary preconditions for successfully managing geopolitical events.
This study examines how companies can recruit Generation Z for sales roles. The results show that flexibility, development opportunities, an attractive base salary, and a pleasant working atmosphere are decisive factors in Generation Z's career decisions. The importance of meaningful work is also highlighted.
This article explores current debate on the use of soft power in international higher education, highlighting existing tensions between competing political and academic discourses. It draws on examples from practice and relevant insights in soft power scholarship to capture varying paradoxes and dilemmas that emerge as nations try to leverage the power of international tertiary education to enhance their brand and attract foreign audiences in the name of public diplomacy. Whilst exposing cases of hubris and hidden agendas, this study also addresses issues of inequality and responds to a growing call for knowledge diplomacy aimed at tackling common global problems.
Global trade is plagued by slow and inefficient manual processes associated with physical documents. Firms are constantly looking for new ways to improve transparency and increase the resilience of their supply chains. This can be solved by the digitalisation of supply chains and the automation of document- and information-sharing processes. Blockchain is touted as a solution to these issues due to its unique combination of features, such as immutability, decentralisation and transparency. A lack of business cases that quantify the costs and benefits causes uncertainty regarding the truth of these claims. This paper explores how the costs and benefits of a blockchain-based solution for digitalising and automating documentation flows in cross-border supply chains compare to a conventional centralised relational database solution. The research described in this paper uses primary data collected through semi-structured interviews with industry experts, as well as secondary data from literature. Two models based on existing services were developed and the costs and benefits compared and then analysed using the Architecture Trade-off Analysis Method (ATAM) and the Analytic Network Process (ANP). Findings from the analysis show that a consortium blockchain solution like TradeLens is the favourable solution for digitalising and automating information flows in cross-border supply chains.
"We would like to work together in a more agile and self-organized way in our leadership team. But what if none of it works? Or if nobody actually wants it?" So said a division head with whom we spoke about new ways of working. After we introduced the change process as a time-limited experiment, the reservations noticeably faded. Scientific experiments are central to our societal and technological progress. Through a targeted experimental setup (the experiment itself), they make it possible to test hypotheses and thereby generate new knowledge. But can this powerful scientific method also be harnessed for organizational development, and if so, how? The growing spread of all kinds of "labs", real-world laboratories, and experimentation spaces in organizations and society suggests so. The research team at ESB Business School can now look back on several years of experience with experimental organizational development. Among other things, in cooperation with Fraunhofer IAO and Harz University of Applied Sciences, several experimentation spaces in organizations were supported and evaluated. A refined intervention design emerged from this work. For even though experiments seem ubiquitous and easy to implement, our experience shows that a differentiated process model is needed in practical implementation to avoid numerous pitfalls.
Cotton contamination by honeydew is considered one of the significant problems for quality in textiles as it causes stickiness during manufacturing. Therefore, millions of dollars in losses are attributed to honeydew contamination each year. This work presents the use of UV hyperspectral imaging (225–300 nm) to characterize honeydew contamination on raw cotton samples. As reference samples, cotton samples were soaked in solutions containing sugar and proteins at different concentrations to mimic honeydew. Multivariate techniques such as a principal component analysis (PCA) and partial least squares regression (PLS-R) were used to predict and classify the amount of honeydew at each pixel of a hyperspectral image of raw cotton samples. The results show that the PCA model was able to differentiate cotton samples based on their sugar concentrations. The first two principal components (PCs) explain nearly 91.0% of the total variance. A PLS-R model was built, showing a performance with a coefficient of determination for the validation (R2cv) = 0.91 and root mean square error of cross-validation (RMSECV) = 0.036 g. This PLS-R model was able to predict the honeydew content in grams on raw cotton samples for each pixel. In conclusion, UV hyperspectral imaging, in combination with multivariate data analysis, shows high potential for quality control in textiles.
We study three-color Förster resonance energy transfer (triple FRET) between three spectrally distinct fluorescent dyes, a donor and two acceptors, which are embedded in a single polystyrene nanosphere. The presence of triple FRET energy transfer is confirmed by selective acceptor photobleaching. We show that the fluorescence lifetimes of the three dyes are selectively controlled using the Purcell effect by modulating the radiative rates and relative fluorescence intensities when the nanospheres are embedded in an optical Fabry–Pérot microcavity. The strongest fluorescence intensity enhancement for the second acceptor can be observed as a signature of the FRET process by tuning the microcavity mode to suppress the intermediate dye emission and transfer more energy from the donor to the second acceptor. Additionally, we show that the triple FRET process can be modeled by coupled rate equations, which allow the energy transfer rates between donor and acceptors to be estimated. This fundamental study has the potential to extend the classical FRET approach for investigating complex systems, e.g., optical energy switching, photovoltaic devices, light-harvesting systems, or in general interactions between more than two constituents.
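The coupled-rate-equation model mentioned in the abstract can be sketched as a three-state linear system for the excited-state populations of the donor and the two acceptors, linked by FRET rates. Only the cascade structure follows the abstract; all rate constants below are invented for illustration.

```python
import numpy as np

# Hypothetical rate constants in 1/ns; structure follows the donor -> acceptor 1
# -> acceptor 2 cascade, but the values are invented for illustration.
k_D, k_A1, k_A2 = 0.5, 0.6, 0.4   # intrinsic decay rates
k_D_A1, k_D_A2 = 0.8, 0.3         # FRET rates donor -> A1, donor -> A2
k_A1_A2 = 0.7                     # FRET rate A1 -> A2

def simulate(t_max=40.0, dt=1e-3):
    """Explicit-Euler integration of the coupled excited-state populations."""
    steps = int(t_max / dt)
    N = np.zeros((steps + 1, 3))
    N[0] = [1.0, 0.0, 0.0]        # donor initially excited
    for i in range(steps):
        d, a1, a2 = N[i]
        dd = -(k_D + k_D_A1 + k_D_A2) * d
        da1 = k_D_A1 * d - (k_A1 + k_A1_A2) * a1
        da2 = k_D_A2 * d + k_A1_A2 * a1 - k_A2 * a2
        N[i + 1] = N[i] + dt * np.array([dd, da1, da2])
    return np.linspace(0.0, t_max, steps + 1), N

t, N = simulate()
# The donor decays with the total rate k_D + k_D_A1 + k_D_A2 = 1.6/ns, so FRET
# shortens its lifetime from 1/k_D = 2 ns to 1/1.6 = 0.625 ns.
```

Fitting such a model to measured decay curves is how the transfer rates between donor and acceptors can be estimated.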
UV hyperspectral imaging (225 nm–410 nm) was used to identify and quantify the honeydew content of real cotton samples. Honeydew contamination causes losses of millions of dollars annually. This study presents the implementation and application of UV hyperspectral imaging as a non-destructive, high-resolution, and fast imaging modality. For this novel approach, a reference sample set consisting of sugar and protein solutions adapted to honeydew was set up. In total, 21 samples with different amounts of added sugars/proteins were measured to calculate multivariate models at each pixel of a hyperspectral image to predict and classify the amount of sugar and honeydew. The principal component analysis (PCA) models enabled a general differentiation between different concentrations of sugar and honeydew. A partial least squares regression (PLS-R) model was built based on the cotton samples soaked in different sugar and protein concentrations. The result showed a reliable performance with R2cv = 0.80 and a low RMSECV = 0.01 g for the validation. The PLS-R reference model was able to predict the honeydew content, laterally resolved in grams, on real cotton samples for each pixel with light, strong, and very strong honeydew contamination. Therefore, inline UV hyperspectral imaging combined with chemometric models can be an effective tool in the future for the quality control of industrial processing of cotton fibers.
This article examines the risks and societal costs associated with flexible average inflation targeting in the United States and symmetric inflation targeting in the Eurozone. Employing an empirical approach, we analyze monthly cumulative inflation gaps over a monetary policy horizon of 36 months. By investigating the trajectories of the cumulative inflation gaps, we find a heavy-tailed distribution and a 20 percent probability of over- and undershooting the inflation target. We show that the offsetting mechanism introduced in the revised monetary strategies lacks credibility in ensuring price stability during a period of persistent inflation. Consequently, the credibility of central banks may be compromised. The policy implications are the integration of an escape clause and prompt monetary corrections in cases where the inflation goal is not achieved. This study provides insights for policymakers and central banks, emphasizing challenges in maintaining credibility and price stability within the new monetary strategies.
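The notion of a cumulative inflation gap over a rolling 36-month policy horizon can be illustrated on synthetic data. The AR(1) inflation process and the tolerance band below are assumptions made for the sketch, not the article's estimates or results.

```python
import numpy as np

rng = np.random.default_rng(0)
target = 2.0        # inflation target in percent p.a.
n, H = 600, 36      # months of data; 36-month monetary policy horizon

# Synthetic monthly year-on-year inflation: AR(1) around the target
# (persistence and volatility are invented, not the article's estimates).
infl = np.empty(n)
infl[0] = target
for i in range(1, n):
    infl[i] = target + 0.95 * (infl[i - 1] - target) + rng.normal(0.0, 0.4)

# Cumulative inflation gap: sum of monthly target deviations over each
# rolling 36-month window.
gap = infl - target
cum_gap = np.convolve(gap, np.ones(H), mode="valid")

band = 6.0          # tolerated cumulative miss in percentage points (assumption)
p_over = float(np.mean(cum_gap > band))
p_under = float(np.mean(cum_gap < -band))
```

The empirical share of windows outside the band is the kind of over-/undershooting probability the article reports; with persistent inflation, cumulative gaps accumulate rather than average out.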
The aim of this article is to establish a stochastic search algorithm for neural networks based on the fractional stochastic processes {B^H_t, t ≥ 0} with the Hurst parameter H ∈ (0,1). We define and discuss the properties of fractional stochastic processes {B^H_t, t ≥ 0}, which generalize a standard Brownian motion. Fractional stochastic processes capture useful yet different properties in order to simulate real-world phenomena. This approach provides new insights into stochastic gradient descent (SGD) algorithms in machine learning. We establish convergence properties for fractional stochastic processes.
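A minimal way to sample the fractional process {B^H_t, t ≥ 0} on a grid is Cholesky factorization of its covariance function. This is the generic textbook construction, not the article's search algorithm; grid size and Hurst parameter are chosen arbitrarily.

```python
import numpy as np

def fbm_sample(n=200, T=1.0, H=0.7, rng=None):
    """Sample a fractional Brownian motion path on an n-point grid via
    Cholesky factorization of its covariance
        Cov(B^H_s, B^H_t) = 0.5 * (s**(2H) + t**(2H) - |t - s|**(2H)).
    Generic construction; not the article's algorithm."""
    rng = rng if rng is not None else np.random.default_rng(0)
    t = np.linspace(T / n, T, n)                 # grid without t=0 (B^H_0 = 0)
    s, u = np.meshgrid(t, t)
    cov = 0.5 * (s**(2*H) + u**(2*H) - np.abs(s - u)**(2*H))
    L = np.linalg.cholesky(cov + 1e-12 * np.eye(n))  # jitter for stability
    return t, L @ rng.standard_normal(n)

t, path = fbm_sample()
# For H > 1/2 the increments are positively correlated (persistent paths);
# for H = 1/2 this reduces to standard Brownian motion.
```

Such paths could serve as the noise source in an SGD-like update in place of standard Brownian increments, which is the connection the article explores.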
Offshore wind energy is being expanded ever more intensively worldwide. The German federal government, too, has raised its expansion targets to 30 GW of installed capacity by 2030, up from currently about 8 GW. How can the German offshore wind industry achieve this, and what does it mean for its suppliers and service providers? Four scenarios describe possible futures. Technological progress along the entire value chain, supply chain security, regulation, and the availability of skilled workers are the critical success factors.
Distributed Ledger Technologies for the energy sector: facilitating interoperability analysis
(2023)
The use of distributed data storage and management structures, such as Distributed Ledger Technologies (DLT), in the energy sector has gained great interest in recent times. This opens up new possibilities in, e.g., microgrid management, aggregation of distributed resources, peer-to-peer trading, integration of electromobility, or proof-of-origin strategies. However, in order to benefit from these new possibilities, new challenges have to be overcome. This work focuses on one of these challenges: the need to ensure interoperability when integrating DLT-enabled devices in energy use cases. Firstly, the use of DLTs in the energy sector is analyzed, the main use cases are presented, and a classification of DLT-Energy use cases is proposed. Secondly, the need for a common reference architecture framework to analyze those use cases with a focus on interoperability is discussed, and current activities in research and standardization in this field are presented. Finally, a new common reference architecture framework based on current standardization activities is presented.
In complex times, companies find it increasingly difficult to anticipate future developments, a circumstance that should also shape change management. Yet a methodologically sound way of dealing with non-knowledge is hardly established in change practice. Only recently has attention turned to how, for example, the experiment can be harnessed as a sustainable method. One approach is experimentation spaces, a particularly characteristic format of experimental organizational development.
Context
Web APIs are one of the most used ways to expose application functionality on the Web, and their understandability is important for efficiently using the provided resources. While many API design rules exist, empirical evidence for the effectiveness of most rules is lacking.
Objective
We therefore wanted to study 1) the impact of RESTful API design rules on understandability, 2) if rule violations are also perceived as more difficult to understand, and 3) if demographic attributes like REST-related experience have an influence on this.
Method
We conducted a controlled Web-based experiment with 105 participants, from both industry and academia and with different levels of experience. Based on a hybrid between a crossover and a between-subjects design, we studied 12 design rules using API snippets in two complementary versions: one that adhered to a rule and one that was a violation of this rule. Participants answered comprehension questions and rated the perceived difficulty.
Results
For 11 of the 12 rules, we found that violation performed significantly worse than rule for the comprehension tasks. Regarding the subjective ratings, we found significant differences for 9 of the 12 rules, meaning that most violations were subjectively rated as more difficult to understand. Demographics played no role in the comprehension performance for violation.
Conclusions
Our results provide first empirical evidence for the importance of following design rules to improve the understandability of Web APIs, which is important for researchers, practitioners, and educators.
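The rule-versus-violation comparison described above can be illustrated with a simple two-sample test. The comprehension scores below are simulated, not the experiment's data, and the permutation test is a generic stand-in for whatever statistical procedure the study actually applied.

```python
import numpy as np

rng = np.random.default_rng(42)
# Illustrative comprehension scores (fraction of correct answers) for the two
# snippet variants of one design rule; simulated, not the study's data.
rule = rng.normal(0.85, 0.10, 50).clip(0.0, 1.0)
violation = rng.normal(0.70, 0.15, 55).clip(0.0, 1.0)

def perm_test(a, b, n_perm=10_000, rng=rng):
    """Two-sided permutation test on the difference of group means."""
    observed = a.mean() - b.mean()
    pooled = np.concatenate([a, b])
    hits = 0
    for _ in range(n_perm):
        shuffled = rng.permutation(pooled)
        diff = shuffled[:a.size].mean() - shuffled[a.size:].mean()
        if abs(diff) >= abs(observed):
            hits += 1
    return (hits + 1) / (n_perm + 1)   # add-one smoothing

p_value = perm_test(rule, violation)
```

A permutation test makes no normality assumption, which suits bounded per-participant scores like these.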
Rapid and robust quality monitoring of the composition of meat pastes is of fundamental importance in processing meat and sausage products. Here, an in-line near-infrared spectroscopy/micro-electro-mechanical-system-(MEMS)-based approach, combined with multivariate data analysis, was used for measuring the constituents fat, protein, water, and salt in meat pastes within a typical range of meat paste recipes. The meat pastes were spectroscopically characterized in-line with a novel process analyzer prototype. By integrating salt content in the calibration set, robust predictive PLSR models of high accuracy (R2 > 0.81) were obtained that take interfering matrix effects of the minor and NIR-inactive meat paste recipe component “salt” into account as well. The nonlinear blending behavior of salt concentration on the spectral features of meat pastes is discussed based on a designed mixture experiment with four systematically varied components.
The use of game elements is becoming increasingly important in B-to-B. The present study examines the use of gamification elements in B-to-B marketing and sales, specifically in the German construction industry. It shows that gamification is used more frequently toward employees than toward customers. However, with upcoming customer generations in mind, the potential of gamification for lead generation and for supporting the omni-channel strategy is growing.
In recent years, the trend toward digitalization and connectivity has changed customer expectations of B2B customer service. This article pursues two clear study objectives: it examines, first, the role of IoT (Internet of Things) and cybersecurity as success factors for business-to-business (B2B) customer service and, second, how secure integration can contribute to a competitive advantage in the German market. Using a qualitative approach based on 20 interviews, it was examined whether IoT and cybersecurity can be regarded as success factors for German B2B customer service. As a result, this study delivers five core statements (hypotheses) derived from the qualitative interviews. In addition to discussing general success factors and their influence, the role of IoT in optimizing B2B customer service is examined. Potential security risks associated with the service models, necessary cybersecurity requirements, and data collection are also addressed. Finally, a model was developed that shows internal and external aspects which help IoT and cybersecurity to be experienced as success factors along the customer activity chain in the pre-sales, sales, and after-sales phases.
This practice-oriented, cross-industry article thus provides insights based on qualitative findings for further theoretical research and enables organizations to view the topic holistically.
The release of ChatGPT-3 in November 2022 and ChatGPT-4 in March 2023 promises to automate cognitive tasks previously reserved for humans in numerous fields, from medicine to law. The present study puts this promise to the test by feeding 200 cases from the field of business law into the currently most capable chatbots. A nuanced picture emerges: while it becomes apparent that the human expert remains superior, chatbots can nevertheless achieve surprisingly good results when solving simple cases of low complexity.
In the last few years, business firms have invested substantially in artificial intelligence (AI) technology. However, according to several studies, a significant percentage of AI projects fail or do not deliver business value. Due to the specific characteristics of AI projects, the existing body of knowledge about success and failure of information systems (IS) projects in general may not be transferable to the context of AI. Therefore, the objective of our research has been to identify factors that can lead to AI project failure. Based on interviews with AI experts, this article identifies and discusses 12 factors that can lead to project failure. The factors can be further classified into five categories: unrealistic expectations, use case related issues, organizational constraints, lack of key resources, and technological issues. This research contributes to knowledge by providing new empirical data and synthesizing the results with related findings from prior studies. Our results have important managerial implications for firms that aim to adopt AI, helping organizations to anticipate and actively manage risks in order to increase the chances of project success.
With the move toward more intensive energy generation from renewable sources, an increased number of energy storage systems is required. In addition to the widespread means of storing electric energy, storing energy thermally can contribute significantly. However, limited research exists on the behaviour of thermal energy storages (TES) in practical operation. While the physical processes are well known, it is often not possible to adequately evaluate a TES's performance with respect to the quality of thermal stratification inside the tank, which is crucial for its thermodynamic effectiveness. The behaviour of a TES is experimentally investigated in cyclic charging and discharging operation in interaction with a cogeneration (CHP) unit at a laboratory test rig. From the measurements, the quality of thermal stratification is evaluated under varying conditions using different metrics such as the normalised stratification factor, modified MIX number, exergy number, and exergy efficiency, which extends the state of the art for CHP applications. The results show that the positioning of the temperature sensors for turning the CHP unit on and off has a significant influence on both the effective capacity of a TES and the quality of thermal stratification inside the tank. It is also revealed that positioning at least one of these sensors outside the storage tank, i.e. in the return line to the CHP unit, prevents deterioration of thermal stratification, thereby enhancing thermodynamic effectiveness. Furthermore, the effects of thermal load and thermal load profile on effective capacity and thermal stratification are discussed, even though these are much smaller than the effect of the temperature sensor positioning.
This paper aims to model wind speed time series at multiple sites. The five-parameter Johnson distribution is deployed to relate the wind speed at each site to a Gaussian time series, and the resultant m-dimensional Gaussian stochastic vector process Z(t) is employed to model the temporal-spatial correlation of wind speeds at m different sites. In general, it is computationally tedious to obtain the autocorrelation functions (ACFs) and cross-correlation functions (CCFs) of Z(t), which differ from those of the wind speed time series. In order to circumvent this correlation-distortion problem, the rank ACF and rank CCF are introduced to characterize the temporal-spatial correlation of wind speeds, whereby the ACFs and CCFs of Z(t) can be obtained analytically. Then, Fourier transformation is applied to establish the cross-spectral density matrix of Z(t), and an analytical approach is proposed to generate samples of wind speeds at m different sites. Finally, simulation experiments are performed to validate the proposed methods, and the results verify that the five-parameter Johnson distribution accurately matches the distribution functions of wind speeds, and that the spectral representation method reproduces the temporal-spatial correlation of wind speeds well.
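The correlation-distortion argument rests on the fact that a strictly monotone marginal transform leaves rank correlations unchanged. The sketch below uses a four-parameter Johnson SU transform as a stand-in for the paper's five-parameter variant, with invented parameters and a single correlated Gaussian pair in place of the full vector process Z(t).

```python
import numpy as np

def johnson_su(z, gamma=-1.0, delta=1.5, xi=6.0, lam=3.0):
    """Johnson SU transform of a standard Gaussian variable:
    X = xi + lam * sinh((Z - gamma) / delta).  Four-parameter stand-in for the
    paper's five-parameter Johnson distribution; parameters are invented."""
    return xi + lam * np.sinh((z - gamma) / delta)

rng = np.random.default_rng(0)
rho = 0.8                            # Gaussian correlation between two sites
cov = np.array([[1.0, rho], [rho, 1.0]])
Z = rng.multivariate_normal([0.0, 0.0], cov, size=20_000)
wind = johnson_su(Z)                 # elementwise, strictly increasing map

def rank_corr(x, y):
    """Spearman rank correlation computed from ranks (numpy only)."""
    rx, ry = x.argsort().argsort(), y.argsort().argsort()
    return np.corrcoef(rx, ry)[0, 1]

# A strictly monotone marginal transform preserves ranks, so the rank CCF of
# the wind speeds equals that of the underlying Gaussian process.
r_gauss = rank_corr(Z[:, 0], Z[:, 1])
r_wind = rank_corr(wind[:, 0], wind[:, 1])
```

This invariance is exactly why characterizing the wind speeds by rank ACF/CCF lets the Gaussian process's correlations be obtained analytically.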
Voice assistants are becoming increasingly important as everyday helpers. They influence how people complete tasks and interact with companies. A qualitative study examined users' willingness to purchase high-involvement products via voice assistants. In the future, voice commerce could gain relevance for these products.
Most digital innovations fail when they transition from the exploring to the scaling stage. We describe how freeyou, the digital innovation spinoff of a major German insurer, successfully scaled online-only car insurance, focusing particularly on how it managed the IT-related challenges. The stark differences between the stages required very different approaches to application development, IT organization and data analytics. Based on freeyou’s experience, we provide recommendations for successful transitioning from exploring to scaling.
Recent advances in artificial intelligence have enabled promising applications in neurosurgery that can enhance patient outcomes and minimize risks. This paper presents a novel system that utilizes AI to aid neurosurgeons in precisely identifying and localizing brain tumors. The system was trained on a dataset of brain MRI scans and utilized deep learning algorithms for segmentation and classification. Evaluation of the system on a separate set of brain MRI scans demonstrated an average Dice similarity coefficient of 0.87. The system was also evaluated through a user experience test involving the Department of Neurosurgery at the University Hospital Ulm, with results showing significant improvements in accuracy, efficiency, and reduced cognitive load and stress levels. Additionally, the system has demonstrated adaptability to various surgical scenarios and provides personalized guidance to users. These findings indicate the potential for AI to enhance the quality of neurosurgical interventions and improve patient outcomes. Future work will explore integrating this system with robotic surgical tools for minimally invasive surgeries.
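The reported Dice similarity coefficient of 0.87 quantifies the overlap between predicted and ground-truth segmentation masks. A minimal implementation on toy binary masks (the masks are invented; the metric itself is standard):

```python
import numpy as np

def dice(pred, truth, eps=1e-8):
    """Dice similarity coefficient 2|A ∩ B| / (|A| + |B|) for binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    return 2.0 * inter / (pred.sum() + truth.sum() + eps)

# Toy 2D "segmentations": 4 foreground pixels each, 2 of them overlapping.
pred = np.zeros((4, 4), int); pred[0, :2] = 1; pred[1, :2] = 1
truth = np.zeros((4, 4), int); truth[1, :2] = 1; truth[2, :2] = 1
score = dice(pred, truth)   # 2*2 / (4+4) = 0.5
```

The epsilon guards against division by zero when both masks are empty; for 3D MRI volumes the same function applies unchanged to voxel arrays.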
Purpose
For the modeling, execution, and control of complex, non-standardized intraoperative processes, a modeling language is needed that reflects the variability of interventions. As the established Business Process Model and Notation (BPMN) reaches its limits in terms of flexibility, the Case Management Model and Notation (CMMN) was considered as it addresses weakly structured processes.
Methods
To analyze the suitability of the modeling languages, BPMN and CMMN models of a Robot-Assisted Minimally Invasive Esophagectomy and Cochlea Implantation were derived and integrated into a situation recognition workflow. Test cases were used to contrast the differences and compare the advantages and disadvantages of the models concerning modeling, execution, and control. Furthermore, the impact on transferability was investigated.
Results
Compared to BPMN, CMMN allows flexibility for modeling intraoperative processes while remaining understandable. Although more effort and process knowledge are needed for execution and control within a situation recognition system, CMMN enables better transferability of the models and therefore the system. Concluding, CMMN should be chosen as a supplement to BPMN for flexible process parts that can only be covered insufficiently by BPMN, or otherwise as a replacement for the entire process.
Conclusion
CMMN offers the flexibility for variable, weakly structured process parts, and is thus suitable for surgical interventions. A combination of both notations could allow optimal use of their advantages and support the transferability of the situation recognition system.