Military organizations have special features, such as following different organizational laws in times of peace and war, and their specific embeddedness in society and politics. Especially the latter aspect has made the military an important object of study since the beginnings of modern sociology. In the wake of establishing specific sociological accounts, military sociology was developed, dedicated to the different facets of the military. This research is based on different theoretical perspectives, but has hardly embraced the frameworks from the economics and sociology of conventions (EC/SC) so far. The aim of the chapter is to explore and demonstrate the potential of this approach. In a first step, the state of the art of military sociology research is outlined, and potential avenues for analyzing military forces based on EC/SC are identified. It is argued that especially the connection to organizational theory (the military as an organization) and civil-military relations, including leadership and professionalism, offers starting points. After introducing existing studies addressing military-related topics with reference to EC/SC, relevant concepts and approaches of convention theory that prove particularly enriching for military research are discussed. An outlook on possible further fields and topics of research is given to concretize what an inclusion of the EC/SC perspective could look like.
Traditional organizations are transforming into complex value-creation systems with increasingly decentralized and digitalized forms of work organization. As notions of membership change and work shifts into digitalized spaces, the boundaries of organizations dissolve. This contribution argues that power dynamics change in such boundary-dissolved organizations. Two dynamics are examined as examples: first, those resulting from the diminishing effect of bureaucratic structures as power resources when the forms of organizational membership and affiliation change (e.g., freelancers, hybrid work). Second, the interfaces between humans and intelligent technologies in digital workspaces and the power relations shifting as a result are considered. The contribution aims to make the changed power dynamics more visible and thus to enable a reflective handling of power in digitally transformed organizations.
Near-Data Processing (NDP) is a key computing paradigm for reducing the ever-growing time and energy costs of data transport versus computation. With their flexibility, FPGAs are an especially suitable compute element for NDP scenarios. Even more promising is the exploitation of novel and future non-volatile memory (NVM) technologies for NDP, which aim to achieve DRAM-like latencies and throughputs while providing large-capacity non-volatile storage.
Experimentation in using FPGAs in such NVM-NDP scenarios has been hindered, though, by the fact that the NVM devices/FPGA boards are still very rare and/or expensive. It thus becomes useful to emulate the access characteristics of current and future NVMs using off-the-shelf DRAMs. If such emulation is sufficiently accurate, the resulting FPGA-based NDP computing elements can be used for actual full-stack hardware/software benchmarking, e.g., when employed to accelerate a database.
For this use, we present NVMulator, an open-source, easy-to-use hardware emulation module that can be seamlessly inserted between the NDP processing elements on the FPGA and a conventional DRAM-based memory system. We demonstrate that, with suitable parametrization, the emulated NVM can come very close to the performance characteristics of actual NVM technologies, specifically Intel Optane. We achieve 0.62% and 1.7% accuracy for cache-line-sized accesses for read and write operations, while utilizing only 0.54% of LUT logic resources on a Xilinx/AMD AU280 UltraScale+ FPGA board. We consider both file-system as well as database access patterns, examining the operation of the RocksDB database when running on real or emulated Optane-technology memories.
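The underlying emulation idea of inserting a configurable extra delay on top of DRAM accesses can be sketched as follows; the clock frequency, latency figures, and the helper name `emulation_delay` are illustrative assumptions, not NVMulator's actual calibration:

```python
# Toy model of NVM emulation on DRAM: stall each access for extra
# cycles so the effective latency matches a target NVM device.
# All numbers below are illustrative assumptions.
def emulation_delay(target_ns, dram_ns, clock_mhz=300.0):
    """Extra cycles to stall on the memory side (assumed clock)."""
    clock_ns = 1000.0 / clock_mhz          # cycle time in ns
    extra_ns = max(target_ns - dram_ns, 0.0)
    return round(extra_ns / clock_ns)

# e.g. emulate ~350 ns Optane-like reads on top of ~100 ns DRAM reads
stall_cycles = emulation_delay(350.0, 100.0)
```

Calibrating such a delay separately for reads and writes is what lets a DRAM-backed system mimic the asymmetric latencies of real NVM devices.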
The performance and scalability of modern data-intensive systems are limited by massive data movement of growing datasets across the whole memory hierarchy to the CPUs. Such traditional processor-centric DBMS architectures are bandwidth- and latency-bound. Processing-in-Memory (PIM) designs seek to overcome these limitations by integrating memory and processing functionality on the same chip. PIM targets near- or in-memory data processing, leveraging the greater in-situ parallelism and bandwidth.
In this paper, we introduce pimDB and provide an initial comparison of processor-centric and PIM-DBMS approaches under different aspects, such as scalability and parallelism, cache-awareness, or PIM-specific compute/bandwidth tradeoffs. The evaluation is performed end-to-end on a real PIM hardware system from UPMEM.
The increase in distributed energy generation, such as photovoltaic systems (PV) or combined heat and power plants (CHP), poses new challenges to almost every distribution network operator (DNO). In the low-voltage (LV) grids, where installed PV capacity approaches the magnitude of household load, reverse power flow occurs at the secondary substations. High PV penetration leads to voltage rise, flicker and loading problems. These problems have been addressed by the application of various techniques amongst which is the deployment of step voltage regulators (SVR). SVR can solve the voltage problem, but do not prevent or reduce reverse power flows. Therefore, the application of SVR in low voltage grids can result in significant power losses upstream. In this paper we present part of a research project investigating the application of remote-controlled cable cabinets (CC) with metering units in a low-voltage network as a possible alternative for SVR. A new generation of custom-made remote-control cable cabinets has been deployed and dynamic network reconfigurations (NR) have been realized with the following objectives: (i) reduction of reverse power flow through the secondary substation to the upstream network and therefore a reduction of upstream losses, (ii) reduction of the voltage rise caused by distributed energy resources and (iii) load balancing in the low-voltage grid. Secondary objectives are to improve the DNO's insight into the state of the network and to provide further information on future smart grid integration.
UV hyperspectral imaging (225 nm–410 nm) was used to identify and quantify the honeydew content of real cotton samples. Honeydew contamination causes losses of millions of dollars annually. This study presents the implementation and application of UV hyperspectral imaging as a non-destructive, high-resolution, and fast imaging modality. For this novel approach, a reference sample set, which consists of sugar and protein solutions that were adapted to honeydew, was set up. In total, 21 samples with different amounts of added sugars/proteins were measured to calculate multivariate models at each pixel of a hyperspectral image to predict and classify the amount of sugar and honeydew. The principal component analysis (PCA) models enabled a general differentiation between different concentrations of sugar and honeydew. A partial least squares regression (PLS-R) model was built based on the cotton samples soaked in different sugar and protein concentrations. The result showed a reliable performance with R2cv = 0.80 and a low RMSECV = 0.01 g for the validation. The PLS-R reference model was able to predict the honeydew content, laterally resolved in grams, on real cotton samples for each pixel with light, strong, and very strong honeydew contaminations. Therefore, inline UV hyperspectral imaging combined with chemometric models can be an effective tool in the future for the quality control of industrial processing of cotton fibers.
We study three-color Förster resonance energy transfer (triple FRET) between three spectrally distinct fluorescent dyes, a donor and two acceptors, which are embedded in a single polystyrene nanosphere. The presence of triple FRET energy transfer is confirmed by selective acceptor photobleaching. We show that the fluorescence lifetimes of the three dyes are selectively controlled using the Purcell effect by modulating the radiative rates and relative fluorescence intensities when the nanospheres are embedded in an optical Fabry–Pérot microcavity. The strongest fluorescence intensity enhancement for the second acceptor can be observed as a signature of the FRET process by tuning the microcavity mode to suppress the intermediate dye emission and transfer more energy from donor to the second acceptor. Additionally, we show that the triple FRET process can be modeled by coupled rate equations, which allow estimation of the energy transfer rates between the donor and acceptors. This fundamental study has the potential to extend the classical FRET approach for investigating complex systems, e.g., optical energy switching, photovoltaic devices, light-harvesting systems, or in general interactions between more than two constituents.
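The coupled rate equations mentioned above can be sketched as a simple three-level excited-state population model; all rate constants below are invented for illustration and are not values from the study:

```python
# Illustrative coupled rate equations for triple FRET: donor D transfers
# to acceptor A1 (and weakly, directly, to A2); A1 transfers onward to
# A2. All rate constants are made-up illustration values (1/ns).
k_D, k_A1, k_A2 = 0.5, 0.4, 0.3        # intrinsic decay rates
k_DA1, k_A1A2, k_DA2 = 1.0, 0.8, 0.2   # FRET transfer rates

def simulate(t_max=20.0, dt=1e-3):
    """Integrate excited-state populations with explicit Euler steps."""
    n_D, n_A1, n_A2 = 1.0, 0.0, 0.0    # only the donor starts excited
    for _ in range(int(t_max / dt)):
        d_D  = -(k_D + k_DA1 + k_DA2) * n_D
        d_A1 = k_DA1 * n_D - (k_A1 + k_A1A2) * n_A1
        d_A2 = k_DA2 * n_D + k_A1A2 * n_A1 - k_A2 * n_A2
        n_D  += dt * d_D
        n_A1 += dt * d_A1
        n_A2 += dt * d_A2
    return n_D, n_A1, n_A2

final = simulate()   # all populations decay toward zero at long times
```

Fitting such a model to measured lifetime traces is one way the transfer rates between donor and acceptors could be extracted.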
Cotton contamination by honeydew is considered one of the significant problems for quality in textiles as it causes stickiness during manufacturing. Therefore, millions of dollars in losses are attributed to honeydew contamination each year. This work presents the use of UV hyperspectral imaging (225–300 nm) to characterize honeydew contamination on raw cotton samples. As reference samples, cotton samples were soaked in solutions containing sugar and proteins at different concentrations to mimic honeydew. Multivariate techniques such as a principal component analysis (PCA) and partial least squares regression (PLS-R) were used to predict and classify the amount of honeydew at each pixel of a hyperspectral image of raw cotton samples. The results show that the PCA model was able to differentiate cotton samples based on their sugar concentrations. The first two principal components (PCs) explain nearly 91.0% of the total variance. A PLS-R model was built, showing a performance with a coefficient of determination for the validation (R2cv) = 0.91 and root mean square error of cross-validation (RMSECV) = 0.036 g. This PLS-R model was able to predict the honeydew content in grams on raw cotton samples for each pixel. In conclusion, UV hyperspectral imaging, in combination with multivariate data analysis, shows high potential for quality control in textiles.
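The per-pixel chemometric workflow described in the two abstracts above can be sketched in broad strokes; plain PCA plus least squares stands in here for the PLS-R actually used, and the synthetic cube and reference values are purely illustrative:

```python
import numpy as np

# Sketch of the per-pixel workflow: flatten a hyperspectral cube to
# (pixels x bands), compute principal components via SVD, then fit a
# linear model from PC scores to a reference quantity. PCA + least
# squares stands in for PLS-R; the data below is synthetic.
rng = np.random.default_rng(0)
cube = rng.normal(size=(8, 8, 30))           # 8x8 pixels, 30 UV bands
sugar = rng.uniform(0.0, 0.05, size=64)      # reference sugar mass (g)

X = cube.reshape(-1, cube.shape[-1])         # (64 pixels, 30 bands)
Xc = X - X.mean(axis=0)                      # mean-center the spectra
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                       # project onto first 2 PCs

explained = (S[:2] ** 2).sum() / (S ** 2).sum()  # variance captured
coef, *_ = np.linalg.lstsq(scores, sugar - sugar.mean(), rcond=None)
pred = scores @ coef + sugar.mean()          # per-pixel prediction (g)
```

Applied to a real calibration set, the same projection and regression step would yield a laterally resolved contamination map, one predicted value per pixel.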
This study introduces a straightforward approach to construct three-dimensional (3D) surface-enhanced Raman spectroscopy (SERS) substrates using chemically modified silica particles as microcarriers and by attaching metal nanoparticles (NPs) onto their surfaces. Tollens’ reagent and sputtering techniques are utilized to prepare the SERS substrates from mercapto-functionalized silica particles. Treatment with Tollens’ reagent generates a variety of silver NPs, ranging from approximately 10 to 400 nm, while sputtering with gold (Au) yields uniformly distributed NPs with an island-like morphology. Both substrates display wide plasmon resonances in the scattering spectra, making them effective for SERS in the visible spectral range, with enhancement factors (ratio of the analyte’s intensity at the hotspot compared to that on the substrate in the absence of metal nanoparticles) of up to 25. These 3D substrates have a significant advantage over traditional SERS substrates because their active surface area is not limited to a 2D surface but offers a much greater active surface due to the 3D arrangement of the NPs. This feature may enable achieving much higher SERS intensity from within streaming liquids or inside cells/tissues.
The use of game-like elements is becoming increasingly important in B-to-B. This study examines the use of gamification elements in B-to-B marketing and sales, specifically in the German construction industry. It shows that gamification is used more frequently toward employees than toward customers. Yet, also in view of upcoming customer generations, the potential of gamification for lead generation and for supporting the omni-channel strategy is growing.
In the last few years, business firms have substantially invested into the artificial intelligence (AI) technology. However, according to several studies, a significant percentage of AI projects fail or do not deliver business value. Due to the specific characteristics of AI projects, the existing body of knowledge about success and failure of information systems (IS) projects in general may not be transferable to the context of AI. Therefore, the objective of our research has been to identify factors that can lead to AI project failure. Based on interviews with AI experts, this article identifies and discusses 12 factors that can lead to project failure. The factors can be further classified into five categories: unrealistic expectations, use case related issues, organizational constraints, lack of key resources, and technological issues. This research contributes to knowledge by providing new empirical data and synthesizing the results with related findings from prior studies. Our results have important managerial implications for firms that aim to adopt AI by helping the organizations to anticipate and actively manage risks in order to increase the chances of project success.
Modern wide bandgap power devices promise higher power conversion performance if the device can be operated reliably. As switching speed increases, the effects of parasitic ringing become more prominent, causing potentially damaging overvoltages during device turn-off. Estimating the expected additional voltage caused by such ringing enables more reliable designs. In this paper, we present an analytical expression to calculate the expected overvoltage caused by parasitic ringing based on parasitic element values and operating point parameters. Simulations and measurements confirm that the expression can be used to find the smallest rise time of the switches’ drain-source voltage for minimum overvoltage. The given expression also allows predicting the trade-off in overvoltage amplitude when faster rise times are required.
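The abstract does not reproduce the analytical expression itself; as a rough stand-in, the classic series-RLC step response gives a comparable back-of-the-envelope estimate (all component values below are illustrative, not from the paper):

```python
import math

# Textbook estimate of turn-off overvoltage from parasitic ringing,
# modeled as the first peak of a damped series-RLC step response.
# This is a generic stand-in, not the paper's derived expression.
def ringing_overvoltage(i_load, l_par, c_oss, r_damp):
    """Peak voltage overshoot above the bus voltage, in volts."""
    z0 = math.sqrt(l_par / c_oss)      # characteristic impedance (ohm)
    zeta = r_damp / (2.0 * z0)         # damping ratio
    if zeta >= 1.0:
        return 0.0                      # overdamped: no overshoot
    decay = math.exp(-zeta * math.pi / math.sqrt(1.0 - zeta ** 2))
    return i_load * z0 * decay         # damped amplitude of first peak

# Example: 20 A switched off against 30 nH loop inductance and
# 200 pF output capacitance, with 2 ohm of damping resistance.
dv = ringing_overvoltage(20.0, 30e-9, 200e-12, 2.0)
```

The sketch already shows the qualitative trade-off from the abstract: lower loop inductance or more damping shrinks the overshoot, while faster current turn-off (effectively higher `i_load` slew) increases it.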
This chapter explains the fundamentals of sustainable human resource management (HRM) and transfers them to the sport and culture sectors. After a brief introduction to the topic and its relevance, various approaches to sustainable HRM are first described. Subsequently, both the HR-related particularities in sport and culture and the challenges in both sectors are addressed. On the one hand, it is described how sustainability can be anchored in the HR strategy in order to attract suitable human resources; here, HR planning, staffing, the recruitment and selection process, and the subsequent onboarding are examined in particular. On the other hand, suggestions are given for sustainable employee retention measures in the areas of training, development, and performance management. The chapter concludes with an outlook on the future.
The relevance of Robotic Process Automation (RPA) has increased over the last few years. Combining RPA with Artificial Intelligence (AI) can further enhance the business value of the technology. The aim of this research was to analyze applications, terminology, benefits, and challenges of combining the two technologies. A total of 60 articles were analyzed in a systematic literature review to evaluate the aforementioned areas. The results show that by adding AI, RPA applications can be used in more complex contexts, it is possible to minimize the human factor during the development process, and AI-based decision-making can be integrated into RPA routines. This paper also presents a current overview of the terminology in use. Moreover, it shows that integrating AI can raise previously unseen challenges in RPA projects, but also brings many new benefits. Based on the outcome, it is concluded that the topic offers a lot of potential, but further research and development is required. The results of this study help researchers gain an overview of the state of the art in combining RPA and AI.
This study examines how companies can recruit Generation Z for sales. The results show that flexibility, development opportunities, an attractive base salary, and a pleasant working atmosphere are decisive factors in Generation Z's career decision-making. In addition, the importance of meaningful work is highlighted.
In recent years, the trend toward digitalization and connectivity has changed customer expectations of B2B customer service. This article pursues two clear study objectives: first, it examines the role of IoT (Internet of Things) and cybersecurity as success factors for business-to-business (B2B) customer service; second, it investigates how secure integration can contribute to a competitive advantage in the German market. Using a qualitative approach with 20 interviews, it was found that IoT and cybersecurity can be regarded as success factors for German B2B customer service. As a result, this study delivers five core statements (hypotheses) from the qualitative interviews. In addition to discussing general success factors and their influence, the role of IoT in optimizing B2B customer service is examined. Furthermore, potential security risks associated with the service models, necessary cybersecurity requirements, and data collection are discussed. Finally, a model was developed that shows the internal and external aspects which help IoT and cybersecurity to be experienced as success factors along the customer's activity chain in the pre-sales, sales, and after-sales phases.
This practice-oriented, cross-industry article thus provides insights based on qualitative findings for further theoretical research and enables organizations to consider the topic holistically.
It is widely recognized that Education for Sustainable Development (ESD) plays a critical role in creating a more sustainable world by fostering the development of the knowledge, skills, understanding, values, and actions necessary for such change (UNESCO, 2020). In this context, ESD represents a holistic approach that focuses on lifelong learning to create informed people who can make decisions today and in the future. Related to the textile and fashion industry, ESD is an appropriate approach to continuously implement sustainability aspects in education and training. To achieve this goal, the European project "Sustainable Fashion Curriculum at Textile Universities in Europe - Development, Implementation and Evaluation of a Teaching Module for Educators" (Fashion DIET) has developed a digital teaching module in a partnership between a University of Education and universities with textile departments. The main objective of the project is to elaborate an ESD module for university lecturers in order to introduce a sustainable fashion curriculum in textile universities in Europe and implement it in educational systems. The project therefore aims to train educators along the textile supply chain, to inform the young generation about the latest aspects of sustainability and raise awareness by implementing ESD in textile education. This paper presents the learning outcomes of the modules on sustainable fashion design and related production technologies developed by the technical university partners, as part of the total of 42 courses covering didactic-methodological approaches and the sustainable orientation of the fashion market, offered at the consortium level. The project content is made available as Open Educational Resources through Glocal Campus, an open-access e-learning platform that enables virtual collaboration between universities.
Thin, flat textile roofing offers negligible heat insulation. In warm areas, such roofing membranes are therefore equipped with metallized surfaces to reflect solar heat radiation, thus reducing the warming inside a textile building. Heat reflection effects achieved by metallic coatings are always accompanied by shading effects as the metals are non-transparent for visible light (VIS). Transparent conductive oxides (TCOs) are transparent for VIS and are able to reflect heat radiation in the infrared. TCOs are, e.g., widely used in the display industry. To achieve the perfect coatings needed for electronic devices, these are commonly applied using costly vacuum processes at high temperatures. Vacuum processes, on account of the high costs involved and high processing temperatures, are obstructive for an application involving textiles. Accepting that heat-reflecting textile membranes demand less perfect coatings, a wet chemical approach has been followed here when producing transparent heat-reflecting coatings. Commercially available TCOs were employed as colloidal dispersions or nanopowders to prepare sol-gel-based coating systems. Such coatings were applied to textile membranes as used for architectural textiles using simple coating techniques and at moderate curing temperatures not exceeding 130 °C. The coatings achieved about 90% transmission in the VIS spectrum and reduced near-infrared transmission (at about 2.5 µm) to nearly zero while reflecting up to 25% of that radiation. Up to 35% reflection has been realized in the far infrared, and emissivity values down to ε = 0.5777 have been measured.
During the first years of the last decade, Egypt used to face recurrent electricity cut-offs in summer. In the past few years, the electricity tariff dramatically increased. Radiative cooling to the clear night sky is a renewable energy source that represents a relative solution. The dry desert climate promotes nocturnal radiative cooling applications. This study investigates the potential of nocturnal radiative cooling systems (RCSs) to reduce the energy consumption of the residential building sector in Egypt. The system technology proposed in this work is based on uncovered solar thermal collectors integrated into the building hydronic system. By implementing different control strategies, the same system could be used for both cooling and heating applications. The goal of this paper is to analyze the performance of RCSs in residential buildings in Egypt. The dynamic simulation program TRNSYS was used to simulate the thermal behavior of the system. The relevant issues of Egypt as a case study are first reviewed. Then the paper introduces the work done to develop a building model that represents a typical residential apartment in Egypt. Typical occupancy profiles were developed to define the internal thermal gains. The adopted control strategy to optimize the system operation is presented as well. To fully understand and hence evaluate the operation of the proposed RCS, four simulation cases were considered: 1. a reference case (fully passive), 2. the stand-alone operation of the RCS, 3. ideal heating & cooling operation (fully-active), and 4. the hybrid operation (when the active cooling system is supported by the proposed RCS). The analysis considered the three main distinct climates in Egypt, represented by the cities of Alexandria, Cairo and Asyut. The hotter and drier weather conditions resulted in a higher cooling potential and larger temperature differences. The simulated cooling power in Asyut was 28.4 W/m² for a 70 m² absorber field.
For a smaller field area of 10 m², the cooling power reached 109 W/m² but with modest temperature differences. To meet the rigorous thermal comfort conditions, the proposed sensible RCS cannot fully replace conventional air-conditioning units, especially in humid areas like Alexandria. When working in a hybrid system, a 10% reduction in the active cooling energy demand could be achieved in Asyut to keep the cooling set-point at 24 °C. This percentage reduction was nearly doubled when the thermal comfort set-point was increased by two degrees (26 °C). In a sensitivity analysis, external shading devices as a passive measure as well as the implementation of the Egyptian code for buildings (ECP306/1–2005) were also investigated. The analysis of this study raised other relevant aspects to discuss, e.g. system sizing, environmental effects, limitations and recommendations.
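The basic physics behind such radiative cooling figures can be sketched with a simple steady-state energy balance; the sky temperature, emissivity, and convection coefficient below are illustrative assumptions, not the TRNSYS model used in the study:

```python
# Steady-state estimate of net nocturnal radiative cooling power for a
# flat absorber: sky radiation out, convective gain from warmer air in.
# All parameter values are illustrative assumptions.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant (W/m^2/K^4)

def net_cooling_power(t_surf, t_air, t_sky, emissivity=0.9, h_conv=5.0):
    """Net cooling power in W/m^2 (positive = heat rejected). Kelvin in."""
    radiative = emissivity * SIGMA * (t_surf ** 4 - t_sky ** 4)
    convective_gain = h_conv * (t_air - t_surf)   # air warms the absorber
    return radiative - convective_gain

# Desert night: surface at 20 degC, air at 25 degC, clear sky at 5 degC
p = net_cooling_power(293.15, 298.15, 278.15)
```

Even this crude balance reproduces the trends in the abstract: drier (colder effective sky) conditions raise the cooling potential, which is why Asyut outperforms the humid Alexandria climate.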
The presented research estimates the correlation between the share of renewable energy sources and the costs of congestion management in the electric networks of selected European countries. Data from six countries in the North-West European area (Italy, Spain, Germany, France, Poland and Austria) were investigated. Factors considered included grid congestion costs (both re-dispatching and countertrading costs), gross electricity generation, installed capacity of electric generating facilities, installed capacity of non-dispatchable renewable energy sources and total electricity consumption. Special attention is paid to the share of renewable energy sources. It is found that the grid congestion costs are not clearly affected by the penetration of non-dispatchable renewables in all the analysed countries, and therefore a clear mathematical correlation cannot be extrapolated from the available data. The results of this research show in general a loose dependency of the grid congestion costs on the penetration of renewables and a strong dependency on the total electrical consumption of the country.
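The correlation check described above can be sketched with a simple Pearson analysis; the six data points below are invented placeholders, not the study's data:

```python
import numpy as np

# Pearson correlation between country-level indicators and congestion
# costs, one value per country. All six data rows are invented
# placeholders for illustration only.
renewable_share = np.array([0.18, 0.25, 0.33, 0.21, 0.12, 0.29])  # share
congestion_cost = np.array([210., 480., 1350., 300., 90., 240.])  # M EUR
consumption     = np.array([300., 250., 530., 470., 170., 70.])   # TWh

r_share = np.corrcoef(renewable_share, congestion_cost)[0, 1]
r_load  = np.corrcoef(consumption, congestion_cost)[0, 1]
```

Comparing such coefficients across indicators is one simple way to show which factor tracks congestion costs more closely across countries.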
Distributed Ledger Technologies for the energy sector: facilitating interoperability analysis
(2023)
The use of distributed data storage and management structures, such as Distributed Ledger Technologies (DLT), in the energy sector has gained great interest in recent times. This opens up new possibilities in e.g. microgrid management, aggregation of distributed resources, peer-to-peer trading, integration of electromobility or proof-of-origin strategies. However, in order to benefit from those new possibilities, new challenges have to be overcome. This work focuses on one of these challenges, which is the need to ensure interoperability when integrating DLT-enabled devices in energy use cases. Firstly, the use of DLTs in the energy sector will be analyzed and the main use cases will be presented. Then, a classification of DLT-Energy use cases will be proposed. Secondly, the need for a common reference architecture framework to analyze those use cases with a focus on interoperability will be discussed and the current activities in research and standardization in this field will be presented. Finally, a new common reference architecture framework based on current activities in standardization will be presented.
This article presents a modified method of performing power flow calculations as an alternative to pure energy-based simulations of off-grid hybrid systems. The enhancement consists in transforming the scenario-based power flow method into a discrete time-dependent algorithm with the inclusion of bus and controller dynamics.
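The idea of turning a scenario-based power balance into a discrete time-dependent algorithm can be sketched as a minimal time-stepped loop; the profiles, battery limits, and function name are illustrative assumptions, not the article's actual algorithm:

```python
# Minimal time-stepped power balance for an off-grid hybrid system:
# at each step, generation serves the load and a battery absorbs the
# residual, clamped to its capacity. All values are illustrative.
def simulate(load, pv, soc=5.0, capacity=10.0, dt=1.0):
    """Step through hourly balances; returns the SOC trace in kWh."""
    trace = []
    for p_load, p_pv in zip(load, pv):
        residual = p_pv - p_load                  # surplus (+) / deficit (-)
        soc = min(max(soc + residual * dt, 0.0), capacity)
        trace.append(soc)
    return trace

# Four hours of load (kW) against a PV profile (kW)
trace = simulate(load=[1.0, 2.0, 1.5, 1.0], pv=[0.0, 3.0, 2.5, 0.5])
```

Extending each step with bus voltage and controller state, as the article proposes, is what distinguishes a power flow calculation from this purely energy-based bookkeeping.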
Wear on cutting tools with a geometrically defined cutting edge leads to poor surface quality, increased forces, dimensional deviations, and breakage. So far, this wear has been measured outside the machine or indirectly (e.g., via the diameter). Tools are exchanged after a certain number of workpieces, a certain time, or a certain tool travel. This contribution presents a novel system for directly measuring flank wear inside the working space of a machining center. A protected, integrated industrial camera with lens is installed in the working space, and the machine axes or the machining spindle position the tool in front of it. After a measurement lasting only a few seconds, the wear is evaluated in parallel with machining.
Large critical systems, such as those created in the space domain, are usually developed by a large number of organizations and, furthermore, they have to comply with standards. Yet, the different stakeholders often do not have a common understanding of the needed quality of requirements specifications. Achieving such a common understanding is a laborious process that is currently not sufficiently supported. Moreover, such a common understanding must be aligned with the standards. In this paper, we present an approach that can be used to align the different stakeholder perceptions regarding the quality of requirements specifications. Existing quality models for requirements specifications are analyzed for equivalences and transferred into a common representation, the so-called Aligned Quality Map (AQM). Furthermore, a process is defined that supports the alignment of different stakeholder perspectives with regard to the quality of requirements specifications using AQM, which is validated in a case study in the context of European space projects. AQM has been created and populated with an initial set of quality models. It is designed in such a way that it can be extended to include further quality models. The case study has shown that an alignment of different stakeholder perspectives and the quality model of the European Cooperation for Space Standardization using AQM is feasible. The approach allows for aligning different stakeholder perspectives for a common understanding of the quality of requirements specifications in the context of standards. Furthermore, AQM supports the assessment of requirements specifications.
Software development teams have to face stress caused by deadlines, staff turnover, or individual differences in commitment, expertise, and time zones. While students are typically taught the theory of software project management, their exposure to such stress factors is usually limited. However, preparing students for the stress they will have to endure once they work in project teams is important for their own sake, as well as for the sake of team performance in the face of stress. Team performance has been linked to the diversity of software development teams, but little is known about how diversity influences the stress experienced in teams. In order to shed light on this aspect, we provided students with the opportunity to self-experience the basics of project management in self-organizing teams, and studied the impact of six diversity dimensions on team performance, coping with stressors, and positive perceived learning effects. Three controlled experiments at two universities with a total of 65 participants suggest that the social background impacts the perceived stressors the most, while age and work experience have the highest impact on perceived learnings. Most diversity dimensions have a medium correlation with the quality of work, yet no significant relation to the team performance. This lays the foundation to improve students’ training for software engineering teamwork based on their diversity-related needs and to create diversity-sensitive awareness among educators, employers and researchers.
For large-scale processes as implemented in organizations that develop software in regulated domains, comprehensive software process models are implemented, e.g., for compliance requirements. Creating and evolving such processes is demanding and requires software engineers with substantial modeling skills to create consistent and certifiable processes. While teaching process engineering to students, we observed issues in providing and explaining models. In this paper, we present an exploratory study in which we aim to shed light on the challenges students face when it comes to modeling. Our findings show that students are capable of doing basic modeling tasks, yet fail to utilize models correctly. We conclude that the required skills, notably abstraction and solution development, are underdeveloped due to missing practice and routine. Since modeling is key to many software engineering disciplines, we advocate for intensifying modeling activities in teaching.
The world is becoming increasingly digital. People have become used to learning and interacting with the world around them through technology, a development accelerated even further by the Covid-19 pandemic. This is especially relevant to the generation currently entering education systems and the workforce. Digital aids and methods of learning are therefore important considerations for future learning. The growing need for online learning makes the case for integrating digital learning aspects such as serious gaming into education and training systems. Learning factories are among the education and training systems that can benefit from integration with digital learning extensions. Digital capabilities such as digital twins and models further enable the exploration of integrating digital serious games as an extension of learning factories. Since learning factories are meant for a range of different learning, training, and research purposes, such serious games need to be adaptable across stakeholder perspectives to maximize the value gained from the time and cost invested into their design and development. Research into the development of adaptive serious games for multiple stakeholder perspectives must first determine whether such a game can be developed in a way that reaches the objectives set for the different stakeholder perspectives included. The purpose of this research is to investigate this through the practical development of a digital adaptive serious game for multiple stakeholder perspectives.
Product engineering and subsequent phases of product lifecycles are predominantly managed in isolation. Companies therefore do not fully exploit potentials through using data from smart factories and product usage. The novel intelligent and integrated Product Lifecycle Management (i²PLM) is an approach that uses these data for product engineering. This paper describes the i²PLM, shows the cause-and-effect relationships in this context and presents in detail the validation of the approach. The i²PLM is applied and validated on a smart product in an industrial research environment. Here, the subsequent generation of a smart lunchbox is developed based on production and sensor data. The results of the validation give indications for further improvements of the i²PLM. This paper describes how to integrate the i²PLM into a learning factory.
Circular economy aims to support reuse and extends the product life cycles through repair, remanufacturing, upgrades and retrofits, as well as closing material cycles through recycling. To successfully manage the necessary transformation processes to circular economy, manufacturing enterprises rely on the competency of their employees. The definition of competency requirements for circular economy-oriented production networks will contribute to the operationalization of circular economy. The International Association of Learning Factories (IALF) states in its mission the development of learning systems addressing these challenges for the training of students and further education of industry employees. To identify the required competencies for circular economy, the major changes of the product life cycle phases have been investigated based on the state of the science and compared to the socio-technical infrastructure and thematic fields of the learning factories considered in this paper. To operationalize the circular economy approach in the product design and production phase in learning factories, an approach for a cross learning factory network (so called "Cross Learning Factory Product Production System (CLFPPS)") has been developed. The proposed CLFPPS represents a network on the design dimensions of learning factories. This approach contributes to the promotion of circular economy in learning factories as it makes use of and combines the focus areas of different learning factories. This enables the CLFPPS to offer a holistic view on the product life cycle in production networks.
Development of an IoT-based inventory management solution and training module using smart bins
(2023)
Flexibility, transparency and changeability of warehouse environments are playing an increasingly important role to achieve a cost-efficient production of small batch sizes. This results in increasing requirements for warehouses in terms of flexibility, scalability, reconfigurability and transparency of material and information flows to deal with a large number of different components and variable material and information flows due to small batch sizes. Therefore, an IoT-based inventory management solution and training module has been developed, implemented and validated at Werk150 – the Factory on campus of the ESB Business School. Key elements of the developed solution are smart bins using weight mats to track the bin’s content and additional sensors and buttons which are connected to an IoT hub to collect data on material consumption and manual handling operations. The use of weight mats for the smart bins offers the possibility to measure the container content independent of the specific component geometry and thus for a variety of components based on the specific component weights. The developed solution enables focusing on key success elements of the system to provide synchronization of the flow of materials and information, resulting in an increase in flexibility and significantly higher transparency of the material flow. AI-based algorithms are applied to analyse the gathered data and to initiate process optimizations by providing the logistics decision makers a profound and transparent basis for decision making. In order to provide students and industry visitors of the learning factory with the necessary competences and to support the transfer into practice, a training module on IoT-based inventory management was developed and implemented.
Since its first publication in 2015, the learning factory morphology has been frequently used to design new learning factories and to classify existing ones. The structuring supports the concretization of ideas and promotes exchange between stakeholders.
However, since the implementation of the first learning factories, the learning factory concept has constantly evolved.
Therefore, in the Working Group "Learning Factory Design" of the International Association of Learning Factories, the existing morphology has been revised and extended based on an analysis of the trends observed in the evolution of learning factory concepts. On the one hand, new design elements were added to the previous seven design dimensions, and on the other hand, new design dimensions were introduced. The revised version of the morphology thus provides even more targeted support in the design of new learning factories in the future.
Production planning and control are characterized by unplanned events or so-called turbulences. Turbulences can be external, originating outside the company (e.g., delayed delivery by a supplier), or internal, originating within the company (e.g., failures of production and intralogistics resources). Turbulences can have far-reaching consequences for companies and their customers, such as delivery delays due to process delays. For target-optimized handling of turbulences in production, forecasting methods incorporating process data in combination with the use of existing flexibility corridors of flexible production systems offer great potential. Probabilistic, data-driven forecasting methods allow determining the corresponding probabilities of potential turbulences. However, a parallel application of different forecasting methods is required to identify an appropriate one for the specific application. This requires a large database, which is often unavailable and, therefore, must be created first. A simulation-based approach to generate synthetic data is used and validated to create the necessary database of input parameters for the prediction of internal turbulences. To this end, a minimal system for conducting simulation experiments on turbulence scenarios was developed and implemented. A multi-method simulation of the minimal system synthetically generates the required process data, using agent-based modeling for the autonomously controlled system elements and event-based modeling for the stochastic turbulence events. Based on this generated synthetic data and the variation of the input parameters in the forecast, a comparative study of data-driven probabilistic forecasting methods was conducted using a data analytics tool. Forecasting methods of different types (including regression, Bayesian models, nonlinear models, decision trees, ensemble, deep learning) were analyzed in terms of prediction quality, standard deviation, and computation time.
This resulted in the identification of appropriate forecasting methods and of the required input parameters for the considered turbulences.
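The comparative evaluation described above can be illustrated in miniature as follows. This is a hedged sketch with invented variables (machine load, queue length, delay) and only two simple candidate predictors, not the authors' simulation model or data analytics tool:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "turbulence" data: process delay driven by utilisation and queue length
# (all variable names and coefficients are illustrative).
n = 200
load = rng.uniform(0.2, 1.0, n)                            # machine utilisation
queue = rng.poisson(3, n)                                  # jobs waiting
delay = 2.0 * load + 0.5 * queue + rng.normal(0, 0.3, n)   # observed process delay

def rmse(y, y_hat):
    # prediction quality metric used for the comparison
    return float(np.sqrt(np.mean((y - y_hat) ** 2)))

# Candidate 1: linear regression via least squares on the process features
X = np.column_stack([np.ones(n), load, queue])
beta, *_ = np.linalg.lstsq(X, delay, rcond=None)
pred_lin = X @ beta

# Candidate 2: naive mean predictor as a baseline
pred_mean = np.full(n, delay.mean())

# Compare candidates on the same synthetic data and pick the best
scores = {"linear": rmse(delay, pred_lin), "mean": rmse(delay, pred_mean)}
best = min(scores, key=scores.get)
print(best, round(scores[best], 3))
```

A real study would run many method families (Bayesian, tree-based, deep learning) and also record standard deviation and computation time; the selection logic stays the same.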
Smart factories, driven by the integration of automation and digital technologies, have revolutionized industrial production by enhancing efficiency, productivity, and flexibility. However, the optimization and continuous improvement of these complex systems present numerous challenges, especially when real-world data collection is time-consuming, expensive, or limited. In this paper, we propose a novel method for semi-automated improvement of smart factories using synthetic data and cause-effect relations, while incorporating the aspect of self-organization. The method leverages the power of synthetic data generation techniques to create representative datasets that mimic the behaviour of real-world manufacturing systems. Together with the cause-and-effect relationships, these synthetic datasets serve as a valuable resource for factory optimization, as they enable extensive experimentation and analysis without the constraints of limited or costly real-world data. Furthermore, the method embraces the concept of self-organization within smart factories. By allowing the system to adapt and optimize itself based on feedback from the synthetic data and the cause-effect relationships, the factory can dynamically reconfigure and adjust its processes. To facilitate the improvement process, the method integrates the synthetic data with advanced analytics and machine learning algorithms as well as the cause-and-effect relationships. This synergy between human expertise and technological advancements represents a compelling path towards a truly optimized smart factory of the future.
This project aims to evaluate existing big data infrastructures for their applicability in the operating room to support medical staff with context-sensitive systems. Requirements for the system design were generated. The project compares different data mining technologies, interfaces, and software system infrastructures with a focus on their usefulness in the peri-operative setting. The lambda architecture was chosen for the proposed system design, which will provide data for both postoperative analysis and real-time support during surgery.
Introduction: Even if there is a standard procedure for CI surgery, surgical steps often differ individually, especially in pediatric surgery, due to anatomical variations, malformations, or unforeseen events. This is why every surgical report must be created individually, which takes time and relies on the surgeon's correct memory. A standardized recording of intraoperative data with subsequent storage and text processing would therefore be desirable and would provide the basis for subsequent data processing, e.g., in the context of research or quality assurance.
Method: In cooperation with Reutlingen University, we conducted a workflow analysis of the prototype of a semi-automatic checklist tool. Based on checklists automatically generated from BPMN models, a prototype user interface was developed for an Android tablet. Functions such as uploading photos and files, manual user entries, the interception of foreseeable deviations from the normal course of operations, and the automatic creation of surgery documentation could be implemented. The system was tested in a remote usability test on a petrous bone model.
Result: The user interface allows simple, intuitive handling that works well in the intraoperative setting. Clinical data as well as surgical steps could be individually recorded and saved via DICOM. An automatic surgery report could be created and saved.
Summary: The use of a dynamic checklist tool facilitates the capture, storage and processing of surgical data. Further applications in clinical practice are pending.
Recent advances in artificial intelligence have enabled promising applications in neurosurgery that can enhance patient outcomes and minimize risks. This paper presents a novel system that utilizes AI to aid neurosurgeons in precisely identifying and localizing brain tumors. The system was trained on a dataset of brain MRI scans and utilized deep learning algorithms for segmentation and classification. Evaluation of the system on a separate set of brain MRI scans demonstrated an average Dice similarity coefficient of 0.87. The system was also evaluated through a user experience test involving the Department of Neurosurgery at the University Hospital Ulm, with results showing significant improvements in accuracy and efficiency as well as reduced cognitive load and stress levels. Additionally, the system has demonstrated adaptability to various surgical scenarios and provides personalized guidance to users. These findings indicate the potential for AI to enhance the quality of neurosurgical interventions and improve patient outcomes. Future work will explore integrating this system with robotic surgical tools for minimally invasive surgeries.
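The Dice similarity coefficient reported above is a standard overlap metric for segmentation masks. A generic sketch of its computation on toy binary masks (not the system's actual pipeline) follows:

```python
import numpy as np

def dice(a, b):
    # Dice similarity coefficient for two binary segmentation masks:
    # 2 * |A ∩ B| / (|A| + |B|); 1.0 means perfect overlap.
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    denom = a.sum() + b.sum()
    return 2.0 * inter / denom if denom else 1.0

# Toy 8x8 masks: a ground-truth "tumor" region and a slightly shifted prediction
gt = np.zeros((8, 8), dtype=int); gt[2:6, 2:6] = 1   # 16 ground-truth pixels
pr = np.zeros((8, 8), dtype=int); pr[3:7, 2:6] = 1   # prediction, shifted one row

print(round(dice(gt, pr), 3))
```

Here the two 16-pixel squares overlap in 12 pixels, giving a Dice score of 2·12/32 = 0.75; a reported average of 0.87 thus indicates substantially better overlap than this toy case.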
Most digital innovations fail when they transition from the exploring to the scaling stage. We describe how freeyou, the digital innovation spinoff of a major German insurer, successfully scaled online-only car insurance, focusing particularly on how it managed the IT-related challenges. The stark differences between the stages required very different approaches to application development, IT organization and data analytics. Based on freeyou’s experience, we provide recommendations for successful transitioning from exploring to scaling.
Governments and public institutions increasingly embrace digital opportunities to involve citizens in public issues and decision making. While public participation is generally seen as an important and promising venture, the design of the participation processes and the utilized digital infrastructure poses challenges, especially to the public sector. Instead of limiting conceptual guidance and exchange to one domain, we therefore develop a taxonomy for digital involvement projects that unites the domains of e-participation, citizen science and crowd-X. Embedded in a design science research approach, we follow an iterative design process to elaborate the key characteristics of a digital involvement project based on the participation process, its individuals and digital infrastructure. Through evaluating the artifact in a focus group with domain practitioners, we find support for the usefulness of our taxonomy and its ability to provide guidance and a basis for discussion of digital involvement projects across domains.
This paper aims to model wind speed time series at multiple sites. The five-parameter Johnson distribution is deployed to relate the wind speed at each site to a Gaussian time series, and the resultant m-dimensional Gaussian stochastic vector process Z(t) is employed to model the temporal-spatial correlation of wind speeds at m different sites. In general, it is computationally tedious to obtain the autocorrelation functions (ACFs) and cross-correlation functions (CCFs) of Z(t), which differ from those of the wind speed time series. In order to circumvent this correlation distortion problem, the rank ACF and rank CCF are introduced to characterize the temporal-spatial correlation of wind speeds, whereby the ACFs and CCFs of Z(t) can be analytically obtained. Then, Fourier transformation is implemented to establish the cross-spectral density matrix of Z(t), and an analytical approach is proposed to generate samples of wind speeds at m different sites. Finally, simulation experiments are performed to check the proposed methods, and the results verify that the five-parameter Johnson distribution can accurately match distribution functions of wind speeds, and the spectral representation method can well reproduce the temporal-spatial correlation of wind speeds.
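The core translation idea, mapping a non-Gaussian marginal to a standard Gaussian while preserving rank order, can be sketched as follows. This is a simplified illustration using a rank-based probability integral transform on a Weibull-like sample; the paper instead fits a five-parameter Johnson distribution, and all numbers here are invented:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(1)
# Illustrative wind speeds with a skewed (Weibull-shaped) marginal
v = 8.0 * rng.weibull(2.0, 1000)

# Rank-based probability integral transform: empirical CDF values in (0, 1)
u = (np.argsort(np.argsort(v)) + 0.5) / len(v)

# Inverse standard normal CDF maps the uniform ranks to a Gaussian sample z;
# z is approximately N(0, 1) and has exactly the same rank order as v, so
# rank correlations (rank ACF / rank CCF) computed on v carry over to z.
z = np.array([NormalDist().inv_cdf(p) for p in u])

print(round(float(z.mean()), 3), round(float(z.std()), 3))
```

In the paper's setting this monotone link between each site's wind speed and a Gaussian series is what allows the correlation structure of Z(t) to be obtained analytically from the rank correlations of the wind speeds.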
In the course of a more intensive energy generation from regenerative sources, an increased number of energy storages is required. In addition to the widespread means of storing electric energy, storing energy thermally can contribute significantly. However, limited research exists on the behaviour of thermal energy storages (TES) in practical operation. While the physical processes are well known, it is nevertheless often not possible to adequately evaluate their performance with respect to the quality of thermal stratification inside the tank, which is crucial for the thermodynamic effectiveness of the TES. The behaviour of a TES is experimentally investigated in cyclic charging and discharging operation in interaction with a cogeneration (CHP) unit at a test rig in the lab. From the measurements the quality of thermal stratification is evaluated under varying conditions using different metrics such as normalised stratification factor, modified MIX number, exergy number and exergy efficiency, which extends the state of the art for CHP applications. The results show that the positioning of the temperature sensors for turning the CHP unit on and off has a significant influence on both the effective capacity of a TES and the quality of thermal stratification inside the tank. It is also revealed that the positioning of at least one of these sensors outside the storage tank, i.e. in the return line to the CHP unit, prevents deterioration of thermal stratification, thereby enhancing thermodynamic effectiveness. Furthermore, the effects of thermal load and thermal load profile on effective capacity and thermal stratification are discussed, even though these are much smaller compared to the effect of positioning the temperature sensors.
Despite the unstoppable global drive towards electric mobility, the electrification of sub-Saharan Africa’s ubiquitous informal multi-passenger minibus taxis raises substantial concerns. This is due to a constrained electricity system, both in terms of generation capacity and distribution networks. Without careful planning and mitigation, the additional load of charging hundreds of thousands of electric minibus taxis during peak demand times could prove catastrophic. This paper assesses the impact of charging 202 of these taxis in Johannesburg, South Africa. The potential of using external stationary battery storage and solar PV generation is assessed to reduce both peak grid demand and total energy drawn from the grid. With the addition of stationary battery storage of an equivalent of 60 kWh/taxi and a solar plant of an equivalent of 9.45 kWpk/taxi, the grid load impact is reduced by 66%, from 12 kW/taxi to 4 kW/taxi, and the daily grid energy by 58% from 87 kWh/taxi to 47 kWh/taxi. The country’s dependence on coal to generate electricity, including the solar PV supply, also reduces greenhouse gas emissions by 58%.
Menopause is the permanent cessation of menstruation occurring naturally in women's aging. The most frequent symptoms associated with menopausal phases are mucosal dryness, increased weight and body fat, and changes in sleep patterns. Oral symptoms in menopause derived from saliva flow reduction can lead to dry mouth, ulcers, and alterations of taste and swallowing patterns. However, the oral health phenotype of postmenopausal women has not been characterized. The aim of the study was to determine postmenopausal women's oral phenotype, including medical history, lifestyle, and oral assessment, through artificial intelligence algorithms. We enrolled 100 postmenopausal women attending the Dental School of the University of Seville. We collected an extensive questionnaire, including lifestyle, medication, and medical history. We used an unsupervised k-means algorithm to cluster the data following standard features for data analysis. Our results showed the main oral symptoms in our postmenopausal cohort were reduced salivary flow and periodontal disease. Relying on the classical assessment of the collected data, we might have a biased evaluation of postmenopausal women. We therefore used artificial intelligence analysis to evaluate our data, obtaining the main features and providing a reduced feature set defining the oral health phenotype. We found 6 clusters with similar features, including medication affecting salivation or smoking as essential features to obtain different phenotypes. Thus, we could obtain main features describing differential oral health phenotypes of postmenopausal women with an integrative approach, providing new tools for assessing women in the dental clinic.
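The unsupervised k-means step can be sketched in minimal form. This toy example invents a tiny feature matrix and two well-separated synthetic groups with a deterministic initialisation; the study clustered a much richer questionnaire and oral-assessment dataset into six clusters using a standard implementation:

```python
import numpy as np

def kmeans(X, centers, iters=100):
    # Minimal Lloyd-style k-means: alternate assignment and centroid update
    centers = centers.copy()
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        new = np.array([X[labels == j].mean(0) for j in range(len(centers))])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# Toy features: [salivary flow, smoking, medication count] for two synthetic groups
rng = np.random.default_rng(42)
reduced = rng.normal([0.2, 1.0, 3.0], 0.1, (20, 3))  # reduced flow, smokers, medicated
typical = rng.normal([0.8, 0.0, 0.5], 0.1, (20, 3))  # unremarkable oral phenotype
X = np.vstack([reduced, typical])

# Seed one centre per group so this sketch converges deterministically
labels, _ = kmeans(X, X[[0, 20]])
```

In practice features would be standardized first and the number of clusters (here 2, in the study 6) chosen via internal validation criteria.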
Rare but extreme events, such as pandemics, terror attacks, and stock market collapses, pose a risk that could undermine cooperation in societies and groups. We extend the public goods game (PGG) to investigate the relationship between rare but extreme external risks and cooperation in a laboratory experiment. By incorporating risk as an external random variable in the PGG, independent of the participants’ contributions, we preserve the economic equilibrium of non-cooperation in the original game. Furthermore, we examine whether cooperation can be restored by the relatively simple intervention of informing about countermeasures while keeping the actual risk constant. Our experimental results reveal that on average extreme risks indeed decrease contributions by about 20%; however, countermeasure information increases contributions by about 10%. Specifically, in the first interactions, cooperation levels can even reach those observed in the riskless baseline. Our results suggest that countermeasure information could help reinforce social cohesion and resilience in the face of rare but extreme risks.
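The payoff structure of the extended game can be sketched as follows. Parameter values here are illustrative, not those of the experiment: with a marginal per-capita return below 1, zero contribution remains the individual equilibrium, and because the shock is an external random variable independent of contributions, that equilibrium is preserved:

```python
import random

def pgg_payoffs(contribs, endowment=20, mpcr=0.4, risk_p=0.05, loss=0.5, rng=None):
    # Linear public goods game: each player keeps (endowment - contribution)
    # and receives mpcr * total pot. With probability risk_p an external
    # shock reduces everyone's payoff by the fraction `loss`, independent
    # of what anyone contributed.
    rng = rng or random.Random(0)
    pot = sum(contribs)
    base = [endowment - c + mpcr * pot for c in contribs]
    hit = rng.random() < risk_p
    return [p * (1 - loss) if hit else p for p in base]

print(pgg_payoffs([20, 10, 0, 5]))
```

Note that the free rider (contribution 0) earns the most regardless of whether the shock occurs, which is what makes the observed contribution levels a measure of cooperation rather than self-interest.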
The introduction of smart contracts has expanded the applicability of blockchains to many domains beyond finance and cryptocurrencies. Moreover, different blockchain technologies have evolved that target special requirements. As a result, in practice, often a combination of different blockchain systems is required to achieve an overall goal. However, due to the heterogeneity of blockchain protocols, the execution of distributed business transactions that span several blockchains leads to multiple interoperability and integration challenges. Therefore, in this article, we examine the domain of Cross-Chain Smart Contract Invocations (CCSCIs), which are distributed transactions that involve the invocation of smart contracts hosted on two or more blockchain systems. We conduct a systematic multi-vocal literature review to get an overview of the available CCSCI approaches. We select 20 formal literature studies and 13 high-quality gray literature studies, extract data from them, and analyze it to derive the CCSCI Classification Framework. With the help of the framework, we group the approaches into two categories and eight subcategories. The approaches differ in multiple characteristics, e.g., the mechanisms they follow, and the capabilities and transaction processing semantics they offer. Our analysis indicates that all approaches suffer from obstacles that complicate real-world adoption, such as the low support for handling heterogeneity and the need for trusted third parties.
In this paper, the essential sponsorship basics are presented and the communication instrument of sports sponsorship is illustrated. Building on this, both the perspectives of sponsors and sponsees are examined in detail. In addition, the special features of sports event sponsorships are highlighted. Finally, current developments in sports sponsorship in the context of the FIFA Soccer World Cup 2022 in Qatar and the UEFA European Soccer Championship 2024 in Germany are compared and discussed.
Wave-like differential equations occur in many engineering applications. Here the engineering setup is embedded into the framework of functional analysis of modern mathematical physics. After an overview, the Hilbert space approach to free Euler–Bernoulli bending vibrations of a beam in one spatial dimension is investigated. We analyze in detail the corresponding positive, self-adjoint differential operators of fourth order associated with the boundary conditions in statics. A comparison with the free vibration of a string is outlined.
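In generic notation (assuming a uniform beam; the symbols are illustrative and not necessarily the chapter's), the operator in question has the form

```latex
u_{tt}(x,t) + A\,u(x,t) = 0, \qquad A\,u = \frac{EI}{\mu}\,\partial_x^4 u ,
```

where $EI$ is the bending stiffness and $\mu$ the mass per unit length. With clamped boundary conditions $u(0)=u_x(0)=u(L)=u_x(L)=0$, $A$ is a positive, self-adjoint fourth-order operator on an $L^2$ Hilbert space, in contrast to the second-order operator $-c^2\,\partial_x^2$ governing string waves, which is the source of the dispersive character of beam vibrations.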
Comparative analysis of the chemical and rheological curing kinetics of formaldehyde-based wood adhesives is crucial for assessing their respective performance. Differential scanning calorimetry (DSC) and rheometry are the conventional techniques used for monitoring the curing processes leading to crosslinking polymerization of the adhesives. However, the direct comparison of these techniques is inappropriate due to the intrinsic differences in their underlying procedures. To address this challenge, two adhesive samples were cured sequentially, first by rheometry and then by DSC. The higher curing degree observed in the subsequent DSC procedure underpins the incomplete curing of the samples during initial rheometry. Furthermore, the comparative assessment of the activation energies, molar ratios, and active groups of the two adhesives highlights the importance of the pre-exponential factor in addition to the activation energies, as it accounts for the probability of active groups coinciding in the appropriate spatial arrangement.
Tech hubs (THs) and cognate structures are nowadays ubiquitous in the innovation ecosystem of Sub-Saharan African (SSA) countries. However, the concept of THs is fuzzy due to the lack of a clear and universally accepted definition. This ambiguity is further compounded by the diverse range of organizations that self-identify as hubs, or are categorized as such by others. As a result, research on THs in SSA has remained limited. Against the backdrop of established research on the interconnectedness of technology, innovation and entrepreneurship in different organizational forms, this paper is meant to provide fresh insights into the study of THs in SSA. To advance future research, first, it reveals what is special about THs in SSA and how they are related to existing concepts. I particularly argue that they contour a fourth-wave model of incubation. Second, four main categories are unfolded to delineate THs in SSA, which form the cornerstone for future research.
Salivary gland tumors (SGTs) are a relevant, highly diverse subgroup of head and neck tumors whose entity determination can be difficult. Confocal Raman imaging in combination with multivariate data analysis may possibly support their correct classification. For the analysis of the translational potential of Raman imaging in SGT determination, a multi-stage evaluation process is necessary. By measuring a sample set of Warthin tumor, pleomorphic adenoma and non-tumor salivary gland tissue, Raman data were obtained and a thorough Raman band analysis was performed. This evaluation revealed highly overlapping Raman patterns with only minor spectral differences. Consequently, a principal component analysis (PCA) was calculated and further combined with a discriminant analysis (DA) to enable the best possible distinction. The PCA-DA model was characterized by accuracy, sensitivity, selectivity and precision values above 90% and validated by predicting model-unknown Raman spectra, of which 93% were classified correctly. Thus, we consider our PCA-DA model suitable for discriminating and predicting parotid tumor and non-tumor salivary gland tissue. For evaluation of the translational potential, further validation steps are necessary.
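The PCA-DA pipeline can be sketched in a minimal form. Here toy "spectra" with a small class shift stand in for the Raman data, and a nearest-centroid rule in principal-component space stands in for the full discriminant analysis; all values are invented:

```python
import numpy as np

rng = np.random.default_rng(3)
# Toy spectra: two classes sharing the same band structure plus a small offset
# (mimicking "highly overlapping patterns with only minor spectral differences")
base = np.sin(np.linspace(0, 6, 50))
A = base + rng.normal(0, 0.05, (30, 50)) + 0.1          # class A, e.g. tumor
B = base + rng.normal(0, 0.05, (30, 50)) - 0.1          # class B, e.g. non-tumor
X = np.vstack([A, B]); y = np.array([0] * 30 + [1] * 30)

# PCA: project mean-centred data onto the leading principal components
Xc = X - X.mean(0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:3].T          # first 3 PC scores per spectrum

# Minimal discriminant step: assign each spectrum to the nearest class
# centroid in PC space (a simplification of the DA used in the abstract)
cents = np.array([scores[y == c].mean(0) for c in (0, 1)])
pred = np.argmin(((scores[:, None] - cents[None]) ** 2).sum(-1), axis=1)
acc = float((pred == y).mean())
print(round(acc, 2))
```

A proper validation would, as in the abstract, predict spectra unseen during model building rather than re-classify the training data.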
Purpose
As a response to the increased frequency of disruptive events and intense competition, organizational agility has become a key concept in organizational research. Fostering organizational agility requires leveraging knowledge that exists both outside (exploration) and inside (exploitation) the organization. This research tests the so-called ambidexterity hypothesis, which claims that a balance between exploration and exploitation leads to increased organizational outcomes, including the development of organizational agility. Complementing previously established measurement models on ambidexterity, this research proposes an alternative measurement model to analyze how ambidexterity can enhance organizational agility and, indirectly, performance, taking into consideration the moderating effect of environmental competitiveness.
Design/methodology/approach
A review of existing measurement models for ambidexterity shows that tension, a crucial aspect of ambidexterity, is often neglected. The authors, therefore, develop a new measurement model of ambidexterity to incorporate ambidexterity-induced tension. Using this measurement model, they examine the effect of ambidexterity on the development of entrepreneurial and adaptive agility as well as performance.
Findings
Ambidexterity positively influences both entrepreneurial and adaptive agility, indicating that a balance between exploration and exploitation has superior organizational effects. This finding confirms the ambidexterity hypothesis with respect to organizational agility. Furthermore, both entrepreneurial and adaptive agility drive organizational performance. These two indirect effects via agility fully mediate the impact of ambidexterity on organizational performance. Finally, environmental competitiveness positively moderates the relationship between ambidexterity and adaptive agility.
Originality/value
The findings extend research on ambidexterity by showing its positive effects on organizational agility. Furthermore, the study proposes an alternative operationalization to capture the ambidexterity construct that may lay the groundwork for further applications of the ambidexterity concept.