This paper presents the first part of a research project conducted at the University of Applied Sciences Stuttgart (HFT Stuttgart). The aim of the research was to investigate the potential of low-cost renewable energy systems to reduce the energy demand of the building sector in hot and dry areas. Radiative cooling to the night sky represents a low-cost renewable energy source, and dry desert climate conditions favour radiative cooling applications. The system technology adopted in this work is based on uncovered solar thermal collectors integrated into the building's hydronic system. By implementing different control strategies, the same system can be used for cooling as well as for heating. This paper focuses on identifying the collector parameters required as coefficients to configure such an unglazed collector and to calibrate its mathematical model within the simulation environment. Parameter identification entails testing the collector for its thermal performance. The paper provides insight into the dynamic testing of uncovered solar thermal collectors (absorbers), taking into account their prospective nighttime operation for radiative cooling applications. The main parameters characterizing absorber performance for radiative cooling applications are identified and obtained from a standardized testing protocol. To this end, a number of plastic solar absorbers of different designs were tested on the outdoor test-stand facility at HFT Stuttgart to characterize their thermal performance. The testing was based on the quasi-dynamic test method of the international standard for solar thermal collectors, EN ISO 9806. The test database was then used within a mathematical optimization tool (GenOpt) to determine the optimal parameter settings of each absorber under test. These performance parameters made it possible to compare the thermal performance of the tested absorbers.
The identified parameters (coefficients) were then used to plot the thermal efficiency curves of all absorbers for both the heating and cooling modes of operation. Based on the intended main scope of system utilization (heating or cooling), the tested absorbers could be benchmarked. One of the absorbers was then selected for the subsequent simulation phase planned in the research project.
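To make the parameter-identification step concrete, here is a minimal sketch: synthetic test points (all numbers invented) are fitted to a deliberately simplified steady-state absorber model, q = η0·G − a1·(Tm − Ta), using an ordinary least-squares solve in place of GenOpt. The full EN ISO 9806 quasi-dynamic model additionally includes wind-speed and sky-temperature terms.

```python
import numpy as np

# Synthetic quasi-dynamic test points (invented for illustration):
# G  = solar irradiance [W/m²], dT = mean fluid minus ambient temperature [K],
# q  = measured specific power [W/m²]; the negative-dT rows mimic nighttime
# operation, where the absorber delivers cooling power.
G  = np.array([800.0, 600.0, 400.0, 200.0, 0.0, 0.0])
dT = np.array([10.0, 15.0, 20.0, 10.0, -5.0, -10.0])
q  = np.array([530.0, 330.0, 130.0, 60.0, 35.0, 70.0])

# The simplified model q = eta0*G - a1*dT is linear in its coefficients,
# so the optimal parameters follow from a single least-squares solve.
X = np.column_stack([G, -dT])
(eta0, a1), *_ = np.linalg.lstsq(X, q, rcond=None)
print(f"eta0 = {eta0:.3f}, a1 = {a1:.2f} W/(m²·K)")
```

With the coefficients in hand, the heating-mode efficiency curve η(ΔT/G) can be plotted directly, and the cooling branch follows from evaluating the same model at negative temperature differences.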
In the early years of the last decade, Egypt faced recurrent electricity cut-offs in summer, and in the past few years the electricity tariff has increased dramatically. Radiative cooling to the clear night sky is a renewable energy source that offers a partial solution, and the dry desert climate favours nocturnal radiative cooling applications. This study investigates the potential of nocturnal radiative cooling systems (RCSs) to reduce the energy consumption of the residential building sector in Egypt. The system technology proposed in this work is based on uncovered solar thermal collectors integrated into the building hydronic system. By implementing different control strategies, the same system can be used for both cooling and heating. The goal of this paper is to analyze the performance of RCSs in residential buildings in Egypt. The dynamic simulation program TRNSYS was used to simulate the thermal behavior of the system. The relevant aspects of Egypt as a case study are first reviewed. The paper then introduces the development of a building model representing a typical residential apartment in Egypt. Typical occupancy profiles were developed to define the internal thermal gains, and the control strategy adopted to optimize the system operation is presented as well. To fully understand and evaluate the operation of the proposed RCS, four simulation cases were considered: 1. a reference case (fully passive), 2. stand-alone operation of the RCS, 3. ideal heating and cooling operation (fully active), and 4. hybrid operation (the active cooling system supported by the proposed RCS). The analysis considered the three main climates of Egypt, represented by the cities of Alexandria, Cairo and Asyut. Hotter and drier weather conditions resulted in a higher cooling potential and larger temperature differences. The simulated cooling power in Asyut was 28.4 W/m² for a 70 m² absorber field.
For a smaller field area of 10 m², the cooling power reached 109 W/m², but with modest temperature differences. To meet rigorous thermal comfort conditions, the proposed sensible RCS cannot fully replace conventional air-conditioning units, especially in humid areas such as Alexandria. In a hybrid system, a 10% reduction in the active cooling energy demand could be achieved in Asyut at a cooling set-point of 24 °C. This reduction nearly doubled when the thermal comfort set-point was raised by two degrees (to 26 °C). In a sensitivity analysis, external shading devices as a passive measure and the implementation of the Egyptian building code (ECP306/1–2005) were also investigated. The analysis raised further relevant aspects for discussion, e.g. system sizing, environmental effects, limitations and recommendations.
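The trade-off between field size and specific cooling power reported above can be checked with simple arithmetic; the figures are those quoted for Asyut:

```python
# Specific cooling powers reported for Asyut: absorber area [m²] -> [W/m²]
fields = {70: 28.4, 10: 109.0}

# Total cooling power of each field is area times specific power.
totals = {area: area * specific for area, specific in fields.items()}
for area, total in sorted(totals.items()):
    print(f"{area:>3} m² field: {total:6.0f} W total cooling power")
```

The larger field delivers nearly 2 kW in total despite its much lower per-area yield, while the small field's high specific power comes with only modest temperature differences, as noted in the abstract.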
As fuel prices climb and the global automotive sector migrates to more sustainable vehicle technologies, the future of South Africa's minibus taxis is in flux. The authors' previous research found that battery electric technology struggles to meet all the mobility requirements of minibus taxis. They therefore investigate the technical feasibility of powering taxis with hydrogen fuel cells instead. The results are projected using a custom-built simulator and tracking data from taxis based in Stellenbosch, South Africa. Each taxi requires around 12 kg of hydrogen gas per day to travel an average distance of 360 km. Electrolysing the required green hydrogen would take 465 kWh of electricity, corresponding to 860 m² of solar panels. An economic analysis was conducted on the capital and operational expenses of a system of ten hydrogen taxis and an electrolysis plant. Such a pilot project requires a minimum investment of €3.8 million (R75 million) over a 20-year period. Although such a small-scale roll-out is technically feasible and would meet taxis' performance requirements, the investment cost is too high, making it financially unfeasible. The authors conclude that a large-scale solution would need to be investigated to improve financial feasibility; however, South Africa's limited electrical generation capacity poses a threat to its technical feasibility. The simulator is available at: https://gitlab.com/eputs/ev-fleet-sim-fcv-model.
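The headline figures can be cross-checked with back-of-the-envelope arithmetic. The sketch below reads the 465 kWh as the daily electrolysis demand of a single taxi (the reading consistent with the per-taxi hydrogen figure); the derived values are illustrations, not outputs of the study's simulator.

```python
# Per-taxi, per-day figures quoted in the study:
H2_KG_PER_DAY = 12      # hydrogen demand [kg]
KM_PER_DAY = 360        # average daily distance [km]
KWH_PER_DAY = 465       # electricity to electrolyse that hydrogen [kWh]
FLEET_SIZE = 10         # taxis in the proposed pilot project

kwh_per_kg = KWH_PER_DAY / H2_KG_PER_DAY          # implied electrolyser demand
kg_per_100km = 100 * H2_KG_PER_DAY / KM_PER_DAY   # implied fuel consumption
fleet_kwh = KWH_PER_DAY * FLEET_SIZE              # daily electricity, pilot fleet

print(f"{kwh_per_kg:.1f} kWh per kg H2")
print(f"{kg_per_100km:.2f} kg H2 per 100 km")
print(f"{fleet_kwh} kWh/day for the ten-taxi pilot")
```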
Most question-answering (QA) systems rely on training data to reach optimal performance. However, acquiring training data for supervised systems is both time-consuming and resource-intensive. To address this, we propose TFCSG, an unsupervised similar-question retrieval approach that leverages pre-trained language models and multi-task learning. First, topic keywords in question sentences are extracted sequentially with a latent topic-filtering algorithm to construct an unsupervised training corpus. Then, multi-task learning is used to build the question retrieval model. Three tasks are designed: a short-sentence contrastive learning task; a task judging the similarity between a question sentence and its corresponding topic sequence; and a task generating the corresponding topic sequence from a question sentence. The three tasks train the language model in parallel. Finally, similar questions are retrieved by computing the cosine similarity between sentence vectors. Comparison experiments on public question datasets show that TFCSG outperforms the unsupervised baseline methods, and no manual labeling is required, which greatly reduces the annotation effort.
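The final retrieval step, ranking candidate questions by the cosine similarity of their sentence vectors, can be sketched as follows. The embeddings here are tiny made-up vectors; in TFCSG they would come from the multi-task-trained encoder.

```python
from math import sqrt

def cosine_sim(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

# Made-up sentence vectors standing in for encoder outputs.
corpus = {
    "How do I reset my password?": [0.9, 0.1, 0.0],
    "What are the opening hours?": [0.0, 1.0, 0.2],
    "Where can I park my car?":    [0.5, 0.5, 0.0],
}
query_vec = [0.9, 0.1, 0.05]  # embedding of "I forgot my login credentials"

# Retrieve the most similar stored question for the query.
best = max(corpus, key=lambda q: cosine_sim(query_vec, corpus[q]))
print(best)
```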
UV hyperspectral imaging (225 nm–410 nm) was used to identify and quantify the honeydew content of real cotton samples. Honeydew contamination causes losses of millions of dollars annually. This study presents the implementation and application of UV hyperspectral imaging as a non-destructive, high-resolution, and fast imaging modality. For this novel approach, a reference sample set consisting of sugar and protein solutions adapted to honeydew was set up. In total, 21 samples with different amounts of added sugars/proteins were measured to calculate multivariate models at each pixel of a hyperspectral image to predict and classify the amount of sugar and honeydew. The principal component analysis (PCA) models enabled a general differentiation between different concentrations of sugar and honeydew. A partial least squares regression (PLS-R) model was built based on the cotton samples soaked in different sugar and protein concentrations. The results showed reliable performance, with R²cv = 0.80 and a low RMSECV = 0.01 g for the validation. The PLS-R reference model was able to predict the honeydew content in grams, laterally resolved for each pixel, on real cotton samples with light, strong, and very strong honeydew contamination. Therefore, inline UV hyperspectral imaging combined with chemometric models can become an effective tool for quality control in the industrial processing of cotton fibers.
Flame-retardant finishing of cotton fabrics using DOPO functionalized alkoxy- and amido alkoxysilane
(2023)
In the present study, a DOPO-based alkoxysilane (DOPO-ETES) and an amido alkoxysilane (DOPO-AmdPTES) were synthesized as halogen-free flame retardants in a one-step reaction without by-products. The flame retardants were applied to cotton fabric using the sol–gel method and a pad-dry-cure finishing process. The flame retardancy, thermal stability and combustion behaviour of the treated cotton were evaluated by surface and bottom-edge ignition flame tests (according to EN ISO 15025), thermogravimetric analysis (TGA) and micro-scale combustion calorimetry (MCC). Unlike the CO/DOPO-ETES sample, cotton treated with DOPO-AmdPTES nanosols exhibits self-extinguishing behaviour with a high char residue, an improved LOI value and a significant reduction of PHRR, HRC and THR compared to pristine cotton. Cotton finished with DOPO-AmdPTES proves semi-durable, keeping its flame-retardant properties unchanged after ten laundering cycles. According to the results obtained from TGA-FTIR, Py-GC/MS and XPS, the flame retardant acts mainly in the condensed phase via catalytically induced char formation as a physical barrier, along with gas-phase activity derived mainly from the dilution effect. The early degradation of CO/DOPO-AmdPTES compared to CO/DOPO-ETES, triggered by the cleavage of the weak bond between P and C=O, as the DFT study indicated, explains the beneficial effect of this flame retardant on the fire resistance of cellulose.
The chemical recycling of used motor oil via catalytic cracking into secondary diesel-like fuels is a sustainable and technically attractive solution for managing the environmental concerns associated with traditional disposal. In this context, this study screened basic and acidic aluminum silicate catalysts doped with different metals, including Mg, Zn, Cu, and Ni. The catalysts were thoroughly characterized using techniques such as N2 adsorption–desorption isotherms, FT-IR spectroscopy, and TG analysis. The liquid and gaseous products were identified using GC, and their characteristics were compared with the acceptable ranges of ASTM characterization methods for diesel fuel. The results showed that metal doping improved catalyst performance, yielding conversion rates of up to 65%, compared to thermal cracking (15%) and undoped aluminum silicates (≈20%). Among all catalysts, basic aluminum silicates doped with Ni showed the best catalytic performance, with conversions and yields three times higher than those of the undoped aluminum silicate catalysts. These findings contribute to the development of efficient and eco-friendly processes for the chemical recycling of used motor oil and highlight Ni-doped basic aluminum silicates as a promising catalyst for catalytic cracking, encouraging further research in this area.
In the face of ever-looming climate change, research on the drivers of individuals' Information Systems (IS) use to reduce environmental harm is gaining momentum. While extant research on the antecedents of sustainable IS use has focused on specific theories, interventions, contexts, and technologies, a holistic understanding has become increasingly elusive, and a synthesis is lacking. We employ a systematic literature review to shed light on the antecedents driving sustainable IS use among individual consumers. Our results build on the findings of 29 empirical studies drawn from 598 articles retrieved from premier outlets and a forward/backward search. The analysis reveals six salient complementary antecedents: Relief, Empowerment, Default, User-centricity, Salience, and Encouragement. We recommend considering these concepts when developing, deploying, promoting, or regulating digital technologies to mitigate individual consumers' emissions. Along with memorable and implementable concepts, our theoretical framework offers a novel conceptualization and four promising avenues for research on sustainable IS use.
The proliferation of smart technologies is transforming the way individual consumers perform tasks. Considerable research suggests that smart technologies often shape domestic energy consumption. However, it remains unclear how such technologies transform tasks and thereby impact our planet. We explore the role of technological smartness in the personal day-to-day tasks that help create a more sustainable future. In the absence of established theory, but facing extensive changes in everyday life enabled by smart technologies, we draw on phenomenon-based theorizing (PBT) guidelines. As an anchor, we refer to task endogeneity from task-technology fit (TTF) theory. As an infusion, we employ theory on public goods. Our model proposes novel relations between the concepts of smart autonomy and smart transparency and sustainable task outcomes, mediated by task convenience and task significance. We discuss implications, limitations, and future research opportunities.
The properties of polyelectrolyte multilayers are governed by the process parameters employed during self-assembly. This is the first study in which a design-of-experiments approach was used to validate and control the production of ultrathin polyelectrolyte multilayer coatings by identifying the ranges of the critical process parameters (polyelectrolyte concentration, ionic strength and pH) within which coatings with reproducible properties (thickness, refractive index and hydrophilicity) are obtained. Mathematical models describing the combined impact of the key process parameters on coating properties were developed, demonstrating that only ionic strength and pH affect coating thickness, but not polyelectrolyte concentration. While the electrolyte concentration had a linear effect, the pH contribution was described by a quadratic polynomial. A significant contribution of this study is a new approach to estimating the thickness of polyelectrolyte multilayer nanofilms by quantitative rhodamine B staining, which may be useful whenever ellipsometry is not feasible due to the shape complexity or small size of the coated substrate. The novel approach overcomes the limitations of known methods, as it offers a low spatial sampling size and the ability to analyse a wide area without restrictions on the chemical composition and shape of the substrate.
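The model structure described above (linear in ionic strength, quadratic in pH) corresponds to an ordinary least-squares fit of a small design matrix. The sketch below uses invented DoE-style data, not the study's measurements:

```python
import numpy as np

# Invented design-of-experiments data: ionic strength I [arb. units], pH,
# and the resulting coating thickness t [nm].
I  = np.array([0.1, 0.1, 0.5, 0.5, 0.3, 0.3])
pH = np.array([4.0, 8.0, 4.0, 8.0, 6.0, 6.0])
t  = np.array([12.0, 20.0, 22.0, 30.0, 19.0, 19.0])

# Response-surface model: t = b0 + b1*I + b2*pH + b3*pH^2
# (linear effect of ionic strength, quadratic contribution of pH).
X = np.column_stack([np.ones_like(I), I, pH, pH**2])
coef, *_ = np.linalg.lstsq(X, t, rcond=None)
print(np.round(coef, 2))   # [b0, b1, b2, b3]
```

In a real design-of-experiments workflow, the significance of each coefficient would then be tested to decide which process parameters actually affect the response.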
In this work, requirements for a digital reference model of the cell and gene therapy (CGT) supply chain are derived and explained by means of a systematic literature review, partially applying the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) 2020 method. The results of the literature review underline that the CGT supply chain requires standardized and automated processes, must meet certain transport requirements, and must be able to guarantee complete traceability. The requirements for the reference model are partly based on those of the classic Supply Chain Operations Reference (SCOR) model, but need to be modified and further developed to account for the specific characteristics of the CGT supply chain. Based on a reference model for the CGT supply chain that incorporates the requirements identified in this work, an overarching management platform can be built. With the digital mapping and networking of all activities, the foundation is laid for integration into an enterprise resource planning (ERP) system for effective data and process mining. With increasingly better data quality and quantity along the processes of the CGT supply chain, more information about the processes themselves can be generated, from which further improvement approaches emerge. A CGT management platform thus forms the basis for a continuous improvement process across all processes within the CGT supply chain.
Monitoring heart rate and breathing is essential for understanding the physiological processes underlying sleep analysis. Polysomnography (PSG) systems have traditionally been used for sleep monitoring, but alternative methods can make sleep monitoring more portable in someone's home. This study conducted a series of experiments investigating pressure sensors placed under the bed as an alternative to PSG for monitoring heart rate and breathing during sleep. Subsequent experiments involved adding small rubber domes, transparent and black, glued to the pressure sensor. The resulting data were compared with the PSG system to determine the accuracy of the pressure sensor readings. The study found that the pressure sensor provided reliable data for extracting heart rate and respiration rate, with mean absolute errors (MAE) of 2.32 and 3.24 for respiration and heart rate, respectively. However, the addition of the small rubber hemispheres did not significantly improve the accuracy of the readings, with MAEs of 2.3 breaths per minute and 7.56 bpm for respiration rate and heart rate, respectively. These findings suggest that pressure sensors placed under the bed may serve as a viable alternative to traditional PSG systems for monitoring heart rate and breathing during sleep, providing a more comfortable and non-invasive method of sleep monitoring. The addition of small rubber domes, however, did not significantly enhance accuracy, indicating that it may not be a worthwhile addition to the pressure sensor system.
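The reported accuracy figures are mean absolute errors between sensor-derived and PSG-reference rates, which can be computed as below (the sample values are invented, not data from the study):

```python
def mae(pred, ref):
    """Mean absolute error between estimated and reference rates."""
    assert len(pred) == len(ref)
    return sum(abs(p - r) for p, r in zip(pred, ref)) / len(pred)

# Invented per-epoch heart-rate estimates vs. PSG reference [bpm].
hr_sensor = [62, 65, 70, 68, 64]
hr_psg    = [60, 66, 73, 65, 66]

print(f"heart-rate MAE: {mae(hr_sensor, hr_psg):.2f} bpm")
```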
Cyber-Physical Production Systems increasingly use semantic information to meet growing flexibility requirements. Ontologies are often used to represent and use this semantic information. Existing systems focus on representing knowledge and less on the exchange with other relevant IT systems (e.g., ERP systems) in which crucial, often implicit, semantic information is contained. This article presents an approach that enables the exchange of semantic information via adapters. The approach is demonstrated by a use case involving an MES and an ERP system.
Most innovation projects in companies fail not for lack of ideas, creativity, or the will to execute, but because of many small hurdles that massively slow the projects down. Initiatives thus lose the momentum that ensures quick successes. One field in which results are achieved unconventionally, agilely, and quickly is guerrilla marketing. What can innovation, research, and project leaders learn from its toolbox of methods? How can concrete tactics from marketing also give innovation projects more virality and momentum, making the initiatives' own dynamic "unstoppable"? This essential provides the answers.
Hybrid work models are considered the future of work. Accordingly, this research examines hybrid work models in German small and medium-sized enterprises (SMEs) compared to large companies. Using a multi-method study consisting of a survey and qualitative expert interviews, we evaluate to what extent hybrid work models are already established in SMEs and which challenges they must overcome. In addition, we examine whether sociodemographic factors such as age, gender, or role in the company influence hybrid working. The results show that the establishment of hybrid work models in SMEs lags behind that in large companies. SMEs face diverse challenges, attributable, for example, to insufficient digitalization or more traditional structures. In particular, corporate culture, the role within the company, and the influence of managers play an important part. Practical relevance: Most of the existing literature on new work and hybrid work focuses on all company sizes combined or on large companies. Owing to specific characteristics of SMEs, such as limited access to resources, findings from large companies can hardly be transferred to them. This work therefore provides guidance on how hybrid work models can be implemented sensibly and profitably in SMEs and which challenges arise.
To remain sustainably competitive in a fast-moving global market, innovative approaches are needed to make products visible. Pioneers such as Apple and Microsoft stand for a new way of thinking with their marketing strategies and product presentations. But how can a small or medium-sized enterprise (SME) compete with such strategies and successfully position itself and its products in the market? This article shows how a market launch concept can be designed in a modular and scalable way using the design thinking approach, based on customer needs, so that it can be adapted to the specific requirements of the product being launched.
Military organizations have special features, such as operating under different organizational laws in times of peace and war, and a specific embeddedness in society and politics. Especially the latter aspect has made the military an important object of study since the beginnings of modern sociology. In the wake of establishing specific sociological accounts, military sociology has developed, dedicated to the different facets of the military. This research draws on different theoretical perspectives, but has so far hardly embraced the frameworks of the economics and sociology of conventions (EC/SC). The aim of this chapter is to explore and demonstrate the potential of this approach. In a first step, the state of the art of military sociology research is outlined, and potential avenues for analyzing military forces based on EC/SC are identified. It is argued that especially the connection to organizational theory (the military as organization) and civil-military relations, including leadership and professionalism, offer starting points. After introducing existing studies that address military-related topics with reference to EC/SC, relevant concepts and approaches of convention theory that prove particularly enriching for military research are discussed. An outlook on possible further fields and topics of research concretizes what an inclusion of the EC/SC perspective could look like.
The performance and scalability of modern data-intensive systems are limited by massive data movement of growing datasets across the whole memory hierarchy to the CPUs. Such traditional processor-centric DBMS architectures are bandwidth- and latency-bound. Processing-in-Memory (PIM) designs seek to overcome these limitations by integrating memory and processing functionality on the same chip. PIM targets near- or in-memory data processing, leveraging the greater in-situ parallelism and bandwidth.
In this paper, we introduce pimDB and provide an initial comparison of processor-centric and PIM-DBMS approaches under different aspects, such as scalability and parallelism, cache-awareness, or PIM-specific compute/bandwidth tradeoffs. The evaluation is performed end-to-end on a real PIM hardware system from UPMEM.
Motivation
In order to enable context-aware behavior of surgical assistance systems, the acquisition of various information about the current intraoperative situation is crucial. To achieve this, the complex task of situation recognition can be delegated to a specialized system. Consequently, a standardized interface is required for the seamless transfer of the recognized contextual information to the assistance systems, enabling them to adapt accordingly.
Methods
Our group analyzed four medical interface standards to determine their suitability for exchanging intraoperative contextual information. The assessment was based on a harmonized data and service model derived from the requirements of expected context-aware use cases. The Digital Imaging and Communications in Medicine (DICOM) and IEEE 11073 for Service-oriented Device Connectivity (SDC) were identified as the most appropriate standards.
Results
We specified how DICOM Unified Procedure Steps (UPS) can be used to effectively communicate contextual information. We proposed the inclusion of attributes to formalize different granularity levels of the surgical workflow.
Conclusions
DICOM UPS SOP classes can be used for the exchange of intraoperative contextual information between a situation recognition system and surgical assistance systems. This can pave the way for vendor-independent context awareness in the OR, leading to targeted assistance of the surgical team and an improvement of the surgical workflow.
Large critical systems, such as those created in the space domain, are usually developed by a large number of organizations and, furthermore, have to comply with standards. Yet the different stakeholders often lack a common understanding of the required quality of requirements specifications. Achieving such a common understanding is a laborious process that is currently not sufficiently supported; moreover, it must be aligned with the standards. In this paper, we present an approach that can be used to align the different stakeholder perceptions of the quality of requirements specifications. Existing quality models for requirements specifications are analyzed for equivalences and transferred into a common representation, the so-called Aligned Quality Map (AQM). Furthermore, a process is defined that supports the alignment of different stakeholder perspectives on the quality of requirements specifications using the AQM; it is validated in a case study in the context of European space projects. The AQM has been created and populated with an initial set of quality models and is designed in such a way that it can be extended with further quality models. The case study has shown that aligning different stakeholder perspectives with the quality model of the European Cooperation for Space Standardization using the AQM is feasible. The approach allows different stakeholder perspectives to be aligned towards a common understanding of the quality of requirements specifications in the context of standards. Furthermore, the AQM supports the assessment of requirements specifications.
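A minimal way to picture the Aligned Quality Map is as a mapping from aligned quality concepts to the equivalent terms of each source quality model. The concept and model names below are illustrative placeholders, not the AQM's actual content:

```python
# Hypothetical AQM fragment: aligned concept -> {source model: native term}.
aligned_quality_map = {
    "unambiguity": {
        "QualityModelA": "unambiguous",
        "QualityModelB": "single interpretation",
    },
    "verifiability": {
        "QualityModelA": "verifiable",
        "QualityModelB": "testable",
    },
}

def equivalents(concept):
    """Return the per-model terms aligned to one AQM concept."""
    return aligned_quality_map.get(concept, {})

print(equivalents("unambiguity"))
```

Under this reading, extending the AQM with a further quality model amounts to adding one more key to each concept's mapping.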
The interview explores the question: do new business models change corporate management control? Using the automotive industry as an example, the discussion partners draw attention to manifold changes and developments. Central is the emergence of new business models intended to shape the way companies operate and present themselves to the market in a timely and economically successful manner. Equally important is the consolidation of the manifold steering activities into a management control model and its continuous alignment with the steering requirements arising from the respective business model.
It is widely recognized that Education for Sustainable Development (ESD) plays a critical role in creating a more sustainable world by fostering the development of the knowledge, skills, understanding, values, and actions necessary for such change (UNESCO, 2020). In this context, ESD represents a holistic approach that focuses on lifelong learning to create informed people who can make decisions today and in the future. Related to the textile and fashion industry, ESD is an appropriate approach to continuously implement sustainability aspects in education and training. To achieve this goal, the European project "Sustainable Fashion Curriculum at Textile Universities in Europe - Development, Implementation and Evaluation of a Teaching Module for Educators" (Fashion DIET) has developed a digital teaching module in a partnership between a University of Education and universities with textile departments. The main objective of the project is to elaborate an ESD module for university lecturers in order to introduce a sustainable fashion curriculum in textile universities in Europe and implement it in educational systems. The project therefore aims to train educators along the textile supply chain, to inform the young generation about the latest aspects of sustainability and raise awareness by implementing ESD in textile education. This paper presents the learning outcomes of the modules on sustainable fashion design and related production technologies developed by the technical university partners, as part of the total of 42 courses covering didactic-methodological approaches and the sustainable orientation of the fashion market, offered at the consortium level. The project content is made available as Open Educational Resources through Glocal Campus, an open-access e-learning platform that enables virtual collaboration between universities.
Context
Web APIs are one of the most widely used ways to expose application functionality on the Web, and their understandability is important for using the provided resources efficiently. While many API design rules exist, empirical evidence for the effectiveness of most rules is lacking.
Objective
We therefore wanted to study 1) the impact of RESTful API design rules on understandability, 2) whether rule violations are also perceived as more difficult to understand, and 3) whether demographic attributes like REST-related experience have an influence on this.
Method
We conducted a controlled Web-based experiment with 105 participants from both industry and academia and with different levels of experience. Based on a hybrid between a crossover and a between-subjects design, we studied 12 design rules using API snippets in two complementary versions: one that adhered to a rule and one that violated it. Participants answered comprehension questions and rated the perceived difficulty.
Results
For 11 of the 12 rules, we found that the violation version performed significantly worse than the rule-adhering version in the comprehension tasks. Regarding the subjective ratings, we found significant differences for 9 of the 12 rules, meaning that most violations were subjectively rated as more difficult to understand. Demographics played no role in comprehension performance for violations.
Conclusions
Our results provide first empirical evidence for the importance of following design rules to improve the understandability of Web APIs, which is important for researchers, practitioners, and educators.
Sleep is extremely important for physical and mental health. Although polysomnography is an established approach in sleep analysis, it is quite intrusive and expensive. Consequently, developing a non-invasive and non-intrusive home sleep monitoring system with minimal influence on patients that can reliably and accurately measure cardiorespiratory parameters is of great interest. The aim of this study is to validate a non-invasive and unobtrusive cardiorespiratory parameter monitoring system based on an accelerometer sensor. This system includes a special holder to install the system under the bed mattress. A further aim is to determine the optimum relative position of the system (in relation to the subject) at which the most accurate and precise values of the measured parameters can be achieved. The data were collected from 23 subjects (13 males and 10 females). The obtained ballistocardiogram signal was sequentially processed using a sixth-order Butterworth bandpass filter and a moving average filter. As a result, an average error (compared to reference values) of 2.24 beats per minute for heart rate and 1.52 breaths per minute for respiratory rate was achieved, regardless of the subject’s sleep position. For males and females, the errors were 2.28 and 2.19 beats per minute for heart rate and 1.41 and 1.30 breaths per minute for respiratory rate, respectively. We determined that placing the sensor and system at chest level is the preferred configuration for cardiorespiratory measurement. Despite the promising results of the current tests in healthy subjects, further studies of the system’s performance in larger groups of subjects are required.
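The filtering chain described in the abstract (a sixth-order Butterworth bandpass followed by a moving average) can be sketched as follows. This is a minimal illustration only: the sampling rate, passband edges, and window length below are assumed values for demonstration, not parameters taken from the study.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def extract_band(signal, fs, low_hz, high_hz, ma_window):
    """Bandpass-filter a BCG signal, then smooth it with a moving average.

    fs             -- sampling rate in Hz
    low_hz/high_hz -- passband edges in Hz (illustrative, not from the study)
    ma_window      -- moving-average length in samples
    """
    # Sixth-order Butterworth bandpass, applied forward-backward
    # (filtfilt) to avoid phase distortion.
    b, a = butter(N=6, Wn=[low_hz, high_hz], btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, signal)
    # Moving-average smoothing.
    kernel = np.ones(ma_window) / ma_window
    return np.convolve(filtered, kernel, mode="same")

# Example: isolate an assumed heart-rate band (~0.8-3 Hz) from a
# synthetic signal containing a 1.2 Hz "cardiac" and a 0.25 Hz
# "respiratory" component.
fs = 100
t = np.arange(0, 10, 1 / fs)
bcg = np.sin(2 * np.pi * 1.2 * t) + 0.5 * np.sin(2 * np.pi * 0.25 * t)
heart_component = extract_band(bcg, fs, 0.8, 3.0, ma_window=5)
```

Heart rate and respiratory rate would then be estimated from the peaks of the respective filtered components.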
Sleep is an essential part of human existence, as we spend approximately a third of our lives in this state. Sleep disorders are common conditions that can affect many aspects of life. They are diagnosed in special laboratories with a polysomnography system, a costly procedure that demands considerable effort from the patient. Several systems have been proposed to address this situation by performing the examination and analysis at the patient's home, using sensors to detect physiological signals that are automatically analysed by algorithms. This work aims to evaluate a contactless respiratory recording system based on an accelerometer sensor for sleep apnea detection. For this purpose, an installation mounted under the bed mattress records the oscillations caused by chest movements during breathing. The presented processing algorithm filters the obtained signals and determines the presence of apnea events. The performance of the developed system and apnea event detection algorithm (average accuracy, specificity, and sensitivity of 94.6%, 95.3%, and 93.7%, respectively) confirms the suitability of the proposed method and system for further ambulatory and in-home use.
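The reported accuracy, specificity, and sensitivity relate to confusion-matrix counts in the standard way; a minimal sketch for reference (the counts below are hypothetical and chosen for illustration, not taken from the study):

```python
def detection_metrics(tp, fp, tn, fn):
    """Accuracy, sensitivity, and specificity from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)   # share of true apnea events detected
    specificity = tn / (tn + fp)   # share of normal segments correctly kept
    return accuracy, sensitivity, specificity

# Hypothetical counts for illustration only:
acc, sens, spec = detection_metrics(tp=93, fp=5, tn=95, fn=7)
```

With these made-up counts, accuracy is 188/200, sensitivity 93/100, and specificity 95/100.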
Sleep is essential to physical and mental health. However, the traditional approach to sleep analysis, polysomnography (PSG), is intrusive and expensive. Therefore, there is great interest in the development of non-contact, non-invasive, and non-intrusive sleep monitoring systems and technologies that can reliably and accurately measure cardiorespiratory parameters with minimal impact on the patient. This has led to the development of other relevant approaches that allow greater freedom of movement and do not require direct contact with the body, i.e., that are non-contact. This systematic review discusses the relevant methods and technologies for non-contact monitoring of cardiorespiratory activity during sleep. Taking into account the current state of the art in non-intrusive technologies, we identify the methods of non-intrusive monitoring of cardiac and respiratory activity, the technologies and types of sensors used, and the physiological parameters available for analysis. To do this, we conducted a literature review and summarised current research on the use of non-contact technologies for non-intrusive monitoring of cardiac and respiratory activity. The inclusion and exclusion criteria for the selection of publications were established prior to the start of the search. Publications were assessed using one main question and several specific questions. We obtained 3774 unique articles from four literature databases (Web of Science, IEEE Xplore, PubMed, and Scopus) and checked them for relevance, resulting in 54 articles that were analysed in a structured way using terminology. The result was 15 different types of sensors and devices (e.g., radar, temperature sensors, motion sensors, cameras) that can be installed in hospital wards and departments or in the environment.
The ability to detect heart rate, respiratory rate, and sleep disorders such as apnoea was among the characteristics examined to investigate the overall effectiveness of the systems and technologies considered for cardiorespiratory monitoring. In addition, the advantages and disadvantages of the considered systems and technologies were identified by answering the research questions. The results obtained allow us to determine current trends and the direction of development of medical technologies in sleep medicine for future research.
The aim of this contribution is to work out sensory-aesthetic modes of accessing the world for pedagogical contexts, using design as an example. At the same time, it seeks to prompt reflection on the potential that transdisciplinary design processes hold for learning and teaching. The transdisciplinary nature of design is therefore first derived historically, and various design processes are examined. The following chapter then works out, under the notion of Gestaltung, those characteristics and qualities of professional design that underlie all such processes. Finally, the concept of design thinking is discussed against the background of experience-based learning in school education and substantiated with examples from empirical studies.
The article makes the case for Education for Sustainable Development (ESD) in the textile and fashion sector and shows how it can be implemented from elementary school to higher education and vocational training. It begins by highlighting the non-sustainable practices and deficits found in the fashion and textile sector worldwide and explains the sustainability goals in the context of the UN Roadmap ESD for 2030. Education is needed to raise awareness of sustainability and implement these goals. The article introduces the concept of ESD as a guiding principle with design competence as its core element, implemented through the interdisciplinary method of Design Thinking (DT). Successfully teaching the ESD-relevant design competence requires various didactic principles, which turn out to be very similar to the principles and phases of DT. Within a research project, DT and its potential for implementing ESD have been investigated in teaching-learning situations at elementary schools as well as in an interdisciplinary seminar for student teachers. These findings have been transferred to the EU project Fashion DIET, which pursues the goal of implementing ESD in the textile and fashion sector. By means of an online pilot workshop, the methods and principles of DT were presented and explained to lecturers, teachers, and educators, who gave their feedback on the potential of DT as a method for implementing ESD as a guiding principle in their curricula.
The increasing complexity and availability requirements of automated guided vehicles (AGVs) pose challenges to companies, leading to a focus on new maintenance strategies. In this paper, a smart maintenance architecture based on a digital twin is presented to optimize the technical and economic effectiveness of AGV maintenance activities. To realize this, a literature review was conducted to identify the requirements for smart maintenance and digital twins. The identified requirements were combined into modules and then integrated into an architecture. The architecture was evaluated on a real AGV, using the battery as one of its critical components.
Smart cities are considered data factories that generate an enormous amount of data from various sources. In fact, data is the backbone of any smart service. Therefore, the strategic, beneficial handling of this digital capital is crucial for cities. Some smart city pioneers have already written down their approach to data in the form of data strategies, but what should a city's data strategy include, and how can the goals and measures defined in the strategies be operationalized? This paper addresses these questions by looking closely at the data strategies of cities in Germany and in the top three countries of the EU Digital Economy and Society Index. The in-depth analysis of 8 city data strategies yielded 11 dimensions that cities should consider in their data strategy: relevance of data, principles, methods, data sharing, technology, data culture, data ethics, organizational structure, data security and privacy, collaborations, and data literacy. In addition, data governance is a concept for putting these 11 strategic dimensions into practice through standardization measures, training programs, the definition of roles and responsibilities, and the development of a data catalog.
The benefits of urban data cannot be realized without a political and strategic view of data use. A core concept within this view is data governance, which aligns strategy in data-relevant structures and entities with data processes, actors, architectures, and overall data management. Data governance is not a new concept and has long been addressed by scientists and practitioners from an enterprise perspective. In the urban context, however, data governance has only recently attracted increased attention, despite the unprecedented relevance of data in the advent of smart cities. Urban data governance can create semantic compatibility between heterogeneous technologies and data silos and connect stakeholders by standardizing data models, processes, and policies. This research provides a foundation for developing a reference model for urban data governance, identifies challenges in dealing with data in cities, and defines factors for the successful implementation of urban data governance. To obtain the best possible insights, the study carries out qualitative research following the design science research paradigm, conducting semi-structured expert interviews with 27 municipalities from Austria, Germany, Denmark, Finland, Sweden, and the Netherlands. The subsequent data analysis based on cognitive maps provides valuable insights into urban data governance. The interview transcripts were transferred and synthesized into comprehensive urban data governance maps to analyze entities and complex relationships with respect to the current state, challenges, and success factors of urban data governance. The findings show that each municipal department defines data governance separately, with no uniform approach. Given cultural factors, siloed data architectures have emerged in cities, leading to interoperability and integrability issues. 
A city-wide data governance entity in a cross-cutting function can be instrumental in breaking down silos in cities and creating a unified view of the city’s data landscape. The further identified concepts and their mutual interaction offer a powerful tool for developing a reference model for urban data governance and for the strategic orientation of cities on their way to data-driven organizations.
Patterns are virtually simulated in 3D CAD programs before production to check the fit. However, achieving lifelike representations of human avatars, especially regarding soft tissue dynamics, remains challenging. This is mainly because conventional avatars in garment CAD programs are simulated with a continuous hard surface that does not correspond to the physical and mechanical properties of human soft tissue. In the real world, the natural shape of the human body is affected by the contact pressure of tight-fitting textiles. To verify the fit of a simulated garment, the interactions between the individual body shape and the garment must be considered. This paper introduces an innovative approach to digitising the softness of human tissue using 4D scanning technology. The primary objective of this research is to explore the interactions between tissue softness and different compression levels of apparel, which exert pressure on the tissue, in order to capture the changes in the natural shape. To generate data and model an avatar with soft body physics, it is essential to capture the deformability and elasticity of the soft tissue and map it into the modification options for a simulation. To achieve this, various methods from different fields were researched and compared, and 4D scanning was evaluated as the most suitable method for capturing tissue deformability in vivo. In particular, it should be considered that the human body has different deformation capabilities depending on age and the amount of muscle and body fat. In addition, different tissue zones have different mechanical properties, so it is essential to identify and classify them in order to store these properties for the simulation. It has been shown that, by digitising the data obtained at the different defined pressure levels, a prediction of the deformation of that specific person's tissue becomes possible.
As technology advances and data sets grow, this approach has the potential to reshape how we verify fit digitally with soft avatars and leverage their realistic soft tissue properties for various practical purposes.
This study examines the short-term influence of family daycare (Tagespflege) on child development compared to care in a daycare centre (Kita). International studies suggest that attending family daycare tends to have more negative effects on children than attending a daycare centre. Using the newborn cohort of the NEPS, we can evaluate whether this also holds in the German context. We use two different methodological approaches to estimate the effect of family daycare. Our results show that, for the majority of the developmental indicators examined, family daycare has no statistically significantly worse influence on child development, except in the area of habituation.
Project-based learning (PBL) is an ideal method for teaching university students practical project management skills, making even demanding projects feasible. However, balancing the intended learning objectives with the practical execution of the project is challenging in higher education practice. With the help of the 'PBL Gold Standard', PBL projects can be designed in a targeted way and checked for effectiveness with respect to the learning objectives. The practical implementation of a PBL project is demonstrated using the example of the 'IP Plane' project at Reutlingen University, in which students build a powered aircraft.
Geopolitical risks have been highly relevant to the success and survival of companies since well before the outbreak of the war in Ukraine. Only by building the methodological competence to identify these particular risks can companies create the necessary preconditions for successfully managing geopolitical events.
Human recognition is an important part of perception systems, such as those used in autonomous vehicles or robots. These systems often use deep neural networks for this purpose, which rely on large amounts of data that ideally cover various situations, movements, visual appearances, and interactions. However, obtaining such data is typically complex and expensive. In addition to raw data, labels are required to create training data for supervised learning. Thus, manual annotation of bounding boxes, keypoints, orientations, or performed actions is frequently necessary. This work addresses whether the laborious acquisition and creation of data can be simplified through targeted simulation. If data are generated in a simulation, information such as positions, dimensions, orientations, surfaces, and occlusions is already known, and appropriate labels can be generated automatically. A key question is whether deep neural networks trained with simulated data can be applied to real data. This work explores the use of simulated training data using examples from the field of pedestrian detection for autonomous vehicles. On the one hand, it is shown how existing systems can be improved by targeted retraining with simulation data, for example to better recognize corner cases. On the other hand, the work focuses on the generation of data that rarely or never occur in real standard datasets. It is demonstrated how training data containing finely graded action labels can be generated by the targeted acquisition and combination of motion data and 3D models, enabling even complex pedestrian situations to be recognized. Through the diverse annotation data that simulations provide, it becomes possible to train deep neural networks for a wide variety of tasks with one dataset.
In this work, such simulated data is used to train a novel deep multitask network that brings together diverse, previously mostly independently considered but related, tasks such as 2D and 3D human pose recognition and body and orientation estimation.
The Industry 4.0 paradigm requires concepts for integrating intelligent/smart IoT solutions into manufacturing. Such intelligent solutions are envisioned to increase flexibility and adaptability in smart factories. In particular, autonomous cobots capable of adapting to changing conditions are a key enabler for changeable factory concepts. However, identifying the requirements and solution scenarios incorporating intelligent products challenges the manufacturing industry, especially in the SME sector. In pick-and-place scenarios, changing coordinate systems of workpiece carriers cause errors in the placing process. Using the IPIDS framework, this paper describes the development of a tool-center-point positioning method to improve the process stability of a collaborative robot in a changeable assembly workstation. Applying the framework identifies the requirement for an intelligent workpiece carrier as part of the solution. Implementing and evaluating the solution within a changeable factory validates the IPIDS framework.
Framework for integrating intelligent product structures into a flexible manufacturing system
(2023)
Increasing individualisation of products, with high variety and shorter product lifecycles, results in smaller lot sizes, increasing order numbers, and rising data and information processing for manufacturing companies. To cope with these trends, integrated management of product and manufacturing information is necessary through a "product-driven" manufacturing system. Intelligent products that are integrated as an active element within the controlling and planning of the manufacturing process can offer flexibility advantages for the system. However, there are still challenges regarding system integration and the evaluation of product intelligence structures. In light of these trends, this paper proposes a conceptual framework for defining, analysing, and evaluating intelligent products using the example of an assembly system. The paper begins with a classification of the existing problems in assembly and a definition of the intelligence level. In contrast to previous approaches, the analysis of products is expanded to five dimensions. Based on this, a structured evaluation method for a use case is presented. The structure for solving the assembly problem is provided by the use case-specific ontology model. Results are presented in terms of an assignment of different application areas, linking the problem with the target intelligence class and, depending on the intelligence class of the product, suggesting requirements for implementation. The conceptual framework is evaluated by means of a case study in a learning factory. Here, the model-mix assembly is actively controlled by the workpiece carrier, which transfers the variant-specific work instructions to the operator and the collaborative robot (cobot) at the workstations. The resulting system thus enables better exploitation of the potentials through less frequent errors and shorter search times.
Such an implementation has demonstrated that the intelligent workpiece carrier represents an additional part for realising a cyber-physical production system (CPPS).
The Covid-19 virus triggered a worldwide pandemic, and many employees were therefore required to work from home, which caused numerous challenges. With the Covid-19 pandemic now in its third year, several studies on working from home are already available. To investigate the impact of remote work on employee satisfaction and trust, this quantitative study reviews existing results and formulates hypotheses based on a conceptual model created through a qualitative study and an extensive literature review. The research question is as follows: does working from home during Covid-19 affect employee satisfaction and trust? To test the hypotheses, a structural equation model was constructed and analyzed. A culture of trust and flexibility are identified as the biggest influencing factors in this study.
Do Chinese subordinates trust their German supervisors? A model of inter-cultural trust development
(2023)
In this qualitative study based on 95 interviews with Chinese subordinates and their German supervisors, we inductively develop a model which advances theoretical understanding by showing how inter-cultural trust development in hierarchical relationships results from six distinct elements: the subordinate trustor’s cultural profile (cosmopolitans, hybrids, culturally bounds), the psychological mechanisms operating within the trustor (role expectations and cultural accommodation), and contextual moderators (e.g., country context, time spent in the foreign culture, and third-party influencers), which together influence the trust forms (e.g., presumptive trust, relational trust) and trust dynamics (e.g., trust breakdown and repair) within relationship phases over time (initial contact, trust continuation, trust disillusionment, separation, and acculturation). Our findings challenge the assumption that cultural differences result in low levels of initial trust and highlight the strong role the subordinate’s cultural profile can play in the dynamics and trajectory of trust in hierarchical relationships. Our model highlights that inter-cultural trust development operates as a variform universal, following the combined universalistic-particularistic paradigm in cross-cultural management, with both culturally generalizable etic dynamics and culturally specific emic manifestations.
Introduction and overview
(2023)
This introductory chapter first presents the aim of the "Sportmanagement" book series published by Erich Schmidt Verlag and sets out the motives for publishing the present volume "Nachhaltigkeitsmanagement in Sport und Kultur". It then offers a first approach to the topic by describing the sports, culture, and events business as the context of this edited volume and establishing its connection to sustainability and sustainability management. In addition, current scientific contributions, textbooks, and educational programmes on sustainability management in sports and culture are presented. Finally, the underlying structure of the volume and its individual chapters are introduced, along with useful guidance for reading the book.
More than a decade ago, the authors of this contribution posed the following thought experiment:
“Imagine the business of sports without fans. No spectators at sports matches, no buyers of merchandising, no potential customers for sponsoring companies, no recipients for the sports media. Such a scenario would be unthinkable.“ (Bühler & Nufer, 2010, S. 63)
During the Corona pandemic of 2020/21, the unthinkable became reality when spectators around the world were no longer allowed to attend sporting events. The world's biggest sporting event, the Tokyo 2020 Olympic Games, had to be postponed and took place a year later, under conditions that were not really any better, in front of almost empty stands. The same applied to UEFA EURO 2020, which also had to be postponed by a year but could then at least take place with reduced spectator admission (with a few exceptions, such as the final at Wembley). Behind the deliberations of both the International Olympic Committee and UEFA was the fear that their respective premium products would suffer without fans in the stadiums. Of course, there were still millions of people who watched live streams of sporting events or bought all kinds of merchandise of their favourite teams during those difficult pandemic times. But the pandemic once again confirmed the basic rule of the sports business: the sports market in general, and professional sports organisations in particular, need fans who are willing to invest their time, their emotions, and their money in their favourite sport and their favourite teams. Spectators are the primary, and arguably most important, customers of a sports enterprise. It is therefore essential for every professional sports organisation to build and maintain a sustainable relationship with its fans and to engage them in every possible way. Against this background, the importance of fan engagement becomes clear.
Digital transformation is the dominant business transformation of our time and strongly influences how digital services and products are designed in a service-dominant way. A popular underlying theory of value creation and economic exchange, known as service-dominant (S-D) logic, can be linked to many successful digital business models. However, S-D logic is abstract in itself; companies cannot readily use it as an instrument for business model innovation and design. To change this, a comprehensive ideation method based on S-D logic is proposed, referred to as service-dominant design (SDD). SDD aims to support companies in transitioning to a service- and value-oriented perspective. The method offers a simplified way of structuring the ideation process based on four model components. Each component consists of practical implications, guiding questions, and visualisation techniques derived from a literature review, a use-case evaluation in digital mobility, and a focus group discussion. SDD is a first step towards a toolset that can support established companies in adopting a service and value orientation as part of their digital transformation.
Purpose
In recognising the key role of business intelligence and big data analytics in influencing companies’ decision-making processes, this paper aims to codify the main phases through which companies can approach, develop and manage big data analytics.
Design/methodology/approach
By adopting a research strategy based on case studies, this paper depicts the main phases and challenges that companies “live” through in approaching big data analytics as a way to support their decision-making processes. The analysis of case studies has been chosen as the main research method because it offers the possibility for different data sources to describe a phenomenon and subsequently to develop and test theories.
Findings
This paper provides a possible depiction of the main phases and challenges through which the approach(es) to big data analytics can emerge and evolve over time with reference to companies’ decision-making processes.
Research limitations/implications
This paper recalls the attention of researchers in defining clear patterns through which technology-based approaches should be developed. In its depiction of the main phases of the development of big data analytics in companies’ decision-making processes, this paper highlights the possible domains in which to define and renovate approaches to value. The proposed conceptual model derives from the adoption of an inductive approach. Despite its validity, it is discussed and questioned through multiple case studies. In addition, its generalisability requires further discussion and analysis in the light of alternative interpretative perspectives.
Practical implications
The reflections herein offer practitioners interested in company management a basis for developing performance measurement tools that can evaluate how each phase contributes to companies’ value creation processes.
Originality/value
This paper contributes to the ongoing debate about the role of digital technologies in influencing managerial and social models. This paper provides a conceptual model that is able to support both researchers and practitioners in understanding through which phases big data analytics can be approached and managed to enhance value processes.
Artificial intelligence (AI) is one of the most promising technologies of the post-pandemic era. Cloud computing can simplify the development of AI applications by offering a variety of services, including ready-to-use tools for training machine learning (ML) algorithms. However, comparing the vast number of services offered by different providers and selecting a suitable cloud service can be a major challenge for many firms; in academia, too, suitable criteria for evaluating this type of service remain largely unclear. The overall aim of this work has therefore been to develop a framework to evaluate cloud-based ML services. We use Design Science Research as our methodology and conduct a hermeneutic literature review, a vendor analysis, and expert interviews. Based on our research, we present a novel framework for the evaluation of cloud-based ML services consisting of six categories and 22 criteria that are operationalized with the help of various metrics. We believe that our results will help organizations by providing specific guidance on how to compare and select service providers from the vast number of potential suppliers.
Twitter and citations
(2023)
Social media, especially Twitter, plays an increasingly important role among researchers in showcasing and promoting their research. Does Twitter affect academic citations? Making use of Twitter activity about columns published on VoxEU, a renowned online platform for economists, we develop an instrumental variable strategy to show that Twitter activity about a research paper has a causal effect on the number of citations that this paper will receive. We find that the existence of at least one tweet, as opposed to none, increases citations by 16-25%. Doubling overall Twitter engagement boosts citations by up to 16%.
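The instrumental-variable logic behind this identification strategy can be sketched on synthetic data. The variable names, the single instrument, and the two-stage least-squares estimator below are illustrative assumptions, not the authors’ actual specification: an unobserved confounder biases the naive regression of citations on Twitter activity, while an instrument that shifts Twitter activity but affects citations only through it recovers the causal effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Synthetic setup: an unobserved confounder (e.g. paper quality) drives
# both Twitter activity (x) and citations (y); the instrument z shifts x
# but affects y only through x (exclusion restriction).
confounder = rng.normal(size=n)
z = rng.normal(size=n)                            # instrument
x = z + confounder + rng.normal(size=n)           # endogenous regressor
beta_true = 0.2
y = beta_true * x + confounder + rng.normal(size=n)

def tsls(y, x, z):
    """Two-stage least squares with a single instrument."""
    Z = np.column_stack([np.ones_like(z), z])
    # Stage 1: project the endogenous regressor on the instrument.
    x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    # Stage 2: regress the outcome on the fitted values.
    X_hat = np.column_stack([np.ones_like(x_hat), x_hat])
    return np.linalg.lstsq(X_hat, y, rcond=None)[0][1]

beta_ols = np.linalg.lstsq(
    np.column_stack([np.ones_like(x), x]), y, rcond=None)[0][1]
beta_iv = tsls(y, x, z)
# OLS is biased upward by the confounder; 2SLS recovers beta_true.
```

The comparison of `beta_ols` and `beta_iv` shows why a naive correlation between tweets and citations overstates the effect and why the instrument is needed.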
Silicon neurons (SiNs) represent different levels of biological detail and accuracy as a trade-off between complexity and power consumption. With respect to this trade-off, and given their close resemblance to neuron behaviour models, relaxation-type oscillator circuits often yield a good compromise for emulating neurons. In this chapter, two exemplary relaxation-type silicon neurons are presented that emulate neural behaviour with an energy consumption below the nJ/spike scale. The first proposed fully CMOS relaxation SiN is based on the mathematical Izhikevich model and can mimic a broad range of physiologically observable spike patterns. Results for various biologically plausible output patterns and for the coupling of two SiNs are presented in 0.35 μm CMOS technology. The second type is a novel ultra-low-frequency hybrid CMOS-memristive SiN based on relaxation oscillators and analog memristive devices. The hybrid SiN directly emulates neuron behaviour in the range of physiological spiking frequencies (less than 100 Hz). The relaxation oscillator is implemented and fabricated in 0.13 μm CMOS technology. An autonomous neuronal synchronization process is demonstrated in measurements, with two relaxation oscillators coupled by an analog memristive device to emulate the synchronous behaviour of spiking neurons.
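The Izhikevich model that the first SiN is based on can be sketched in a few lines of simulation code. This is a minimal software illustration of the model’s dynamics, not the chapter’s circuit; the parameter values are the standard “regular spiking” set from the model’s literature, and the simple Euler scheme and step size are assumptions for the sketch.

```python
def izhikevich(a, b, c, d, I, t_ms=1000.0, dt=0.5):
    """Euler integration of the Izhikevich neuron model.

    dv/dt = 0.04 v^2 + 5 v + 140 - u + I
    du/dt = a (b v - u),  with reset v <- c, u <- u + d when v >= 30 mV.
    Returns the list of spike times in ms.
    """
    v = -65.0           # membrane potential (mV)
    u = b * v           # recovery variable
    spikes = []
    for step in range(int(t_ms / dt)):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:   # spike detected: record time and reset
            spikes.append(step * dt)
            v, u = c, u + d
    return spikes

# "Regular spiking" parameter set with a constant input current.
spike_times = izhikevich(a=0.02, b=0.2, c=-65.0, d=8.0, I=10.0)
```

Changing `(a, b, c, d)` reproduces other firing regimes (e.g. bursting or fast spiking), which is the property the chapter exploits to mimic a broad range of physiologically observable spike patterns in hardware.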