The strong demand to transform the textile and fashion industry towards sustainability requires continuous implementation of the Education for Sustainable Development (ESD) mission statement in education and industry. To achieve this goal, the European research project "Fashion DIET - Sustainable Fashion Curriculum at Textile Universities in Europe. Development, Implementation and Evaluation of a Teaching Module for Educators", co-funded by the Erasmus+ programme of the European Union (2020-1-DE01-KA203-005657), aims to create an ESD module for university lecturers and research-based teaching and learning materials delivered through an e-learning portal. First, an online questionnaire was rolled out to assess university faculty attitudes toward and needs for ESD content and methods. The feedback questionnaire enabled the selection of the most relevant data for the elaboration of an action and research-oriented professional development module for ESD in textile education, which will be accessible through an information & e-learning portal. The e-learning portal can be used as a web-based tool to apply and evaluate the project outcomes, e.g. the further education module and the teaching and learning materials for educators, such as manuals, broadcasts and the provision of interactive and physical materials. It thus ensures that the teaching materials can be used sustainably in the classroom. It also provides country-specific data for the fashion and textile industry and its market, taking into account the different perspectives of universities and schools. In any case, the portal represents (1) the web-based platform to support the dissemination of ESD as a guiding principle and (2) a central contact point for the target group to obtain relevant information on ESD. Fashion DIET explores the use of e-learning to improve teaching and learning on ESD, by training educators and empowering them as multipliers for a sustainable textile and fashion industry. 
At a higher level, the European project strengthens the quality and relevance of learning provision in education towards the latest developments in textile research and innovation in terms of a more sustainable fashion.
Purpose – This paper aims to determine the affecting factors of the brand authenticity of startups in social media.
Design/methodology/approach – Using a qualitative method based on a grounded theory approach, this research specifies and classifies the affecting factors of brand authenticity of startups in social media through in-depth semi-structured interviews.
Findings – Multiple factors affecting the brand authenticity of startups in social media are determined and categorized as indexical, iconic and existential cues through this research. Connection to heritage and having credible support are determined as indexical cues. Founder intellectuality, brand intellectuality, commitment toward customers and proactive, clear and interesting communications are identified as iconic cues. Having self-confidence and self-satisfaction, having intimacy with the brand and a joyful feeling for interactions with the community around the brand are determined as existential cues in this research. This research furthers previous arguments on the multiplicity of brand authenticity by shedding light on the relationship between the different aspects of authenticity and the way different affecting factors can be organized together. Consumers ultimately form a strengthened perception of brand authenticity through existential cues that reflect the cues of the other aspects (iconic and indexical) which have passed through the goal-based assessment and self-authentication filter.
Research limitations/implications – The research sampling population can be more diversified in terms of sociodemographic attributes. Due to the qualitative methodology of this research, assessment of the findings through quantitative methods can be considered in future research.
Practical implications – Using the findings of this research, startup managers can properly build a perception of authenticity in their consumers’ minds by using alternate factors while lacking major indexical cues such as heritage. This research helps startup businesses to design their brand communications better to convey their authenticity to their audiences.
Originality/value – This research determines the factors affecting the authenticity of startup brands in social media. It also defines the process of authenticity perception through different aspects of brand authenticity.
Academic research is vital for innovation and industrial growth. However, a potential burden of processing ever more knowledge could be affecting research output and researchers’ careers. We look at a dataset of researchers who have published in journals in the field of economics during a period of 45 years. For a subset of these researchers, we amass data from journals listed in the EconLit database, supplemented with years of birth from public sources. Our results show an increase in the age of researchers at their first publication, in the number of articles referenced in debut articles, and in the number of co-authors. Simultaneously, we observe a decline in the probability of researchers changing research fields. Our findings extend earlier findings on patents and hint at a burden of knowledge pervading different areas of human progress. Moreover, our results indicate that researchers develop strategies of specialisation to deal with this challenge.
We analyze economics PhDs’ collaborations in peer-reviewed journals from 1990 to 2014 and investigate such collaborations’ quality in relation to each co-author’s research quality, field and specialization. We find that a greater overlap between co-authors’ previous research fields is significantly related to a greater publication success of co-authors’ joint work and this is robust to alternative specifications. Co-authors that engage in a distant collaboration are significantly more likely to have a large research overlap, but this significance is lost when co-authors’ social networks are accounted for. High quality collaboration is more likely to emerge as a result of an interaction between specialists and generalists with overlapping fields of expertise. Regarding interactions across subfields of economics (interdisciplinarity), it is more likely conducted by co-authors who already have interdisciplinary portfolios than by co-authors who are specialized or starred in different subfields.
COVID-19 and educational inequality: How school closures affect low- and high-achieving students
(2021)
In spring 2020, governments around the globe shut down schools to mitigate the spread of the novel coronavirus. We argue that low-achieving students may be particularly affected by the lack of educator support during school closures. We collect detailed time-use information on students before and during the school closures in a survey of 1099 parents in Germany. We find that while students on average reduced their daily learning time of 7.4 h by about half, the reduction was significantly larger for low-achievers (4.1 h) than for high-achievers (3.7 h). Low-achievers disproportionately replaced learning time with detrimental activities such as TV or computer games rather than with activities more conducive to child development. The learning gap was not compensated by parents or schools who provided less support for low-achieving students.
Worldwide and in Germany, the topic of inflation is reaching new highs in public attention (Google Trends 2022). Following a widely noted online Christmas lecture from 2020 that has been viewed millions of times, the economics professor Hans-Werner Sinn has published a book of the same title, "Die wundersame Geldvermehrung". Once again, the author may well succeed in shaking up the political public with it.
Identification of sleep and wake states through the evaluation of respiratory and movement signals
(2021)
In the current age of innovative business financing opportunities available from fintech apps, social media crowdfunding sites such as Kickstarter, Indiegogo, and RocketHub, and friends-and-family private equity investors, start-up firms can strategically source their venture capital funds from many globally dispersed organizations and individuals. As the firm in this case learned, the benefit of alternative investing sources comes with a critical hidden risk for corporate governance. After a financial restructuring, a typical Silicon Valley software start-up found itself with close to 300 external individual shareholders, some of whom had not been documented as accredited investors. The regulatory agency could decide that the prior actions of the founders and the decisions of the board had been prejudicial to the interests of the minority investors. The management of this small private company faced an atypical investor relations dilemma before its initial public offering (IPO).
In this work, a comparison between different brushless harmonic-excited wound-rotor synchronous machines is performed. The general idea of all topologies is the elimination of the slip rings and auxiliary windings by using the already existing stator and rotor windings for field excitation. This is achieved by injecting a harmonic airgap field with the help of power electronics. This harmonic field does not interact with the fundamental field; it just transfers the excitation power across the airgap. Alternative methods with varying numbers of phases, different pole-pair combinations, and winding layouts are covered and compared with a detailed finite-element-parameterized model. Parasitic effects due to saturation and coupling between the harmonic and main windings are considered.
Massive data transfers in modern data-intensive systems, resulting from low data locality and data-to-code system designs, hurt their performance and scalability. Near-data processing (NDP) and a shift to code-to-data designs may represent a viable solution, as packaging combinations of storage and compute elements on the same device has become feasible. The shift towards NDP system architectures calls for a revision of established principles. Abstractions such as data formats and layouts typically span multiple layers in traditional DBMSs, and the way they are processed is encapsulated within these layers of abstraction. NDP-style processing requires an explicit definition of cross-layer data formats and accessors to ensure in-situ execution that optimally utilizes the properties of the underlying NDP storage and compute elements. In this paper, we make the case for such data format definitions and investigate the performance benefits under RocksDB and the COSMOS hardware platform.
Continuous monitoring of individual vital parameters can provide information for the assessment of one’s health and indications of medical problems in the context of personalized medicine. Correlations between parameters and health issues are to be evaluated. As one project in this topic area, a telemedicine platform is being implemented to gather data from outpatients via wearables and accumulate them for physicians and researchers to review. This work extracts requirements, draws up use case scenarios, and shows the current system architecture, consisting of a patient application, a physician application with a web server, and a backend server application. In further work, the prototype will assist in developing a vendor-free and open monitoring solution. Functionality and usability will be evaluated in an imminent first study.
As the impacts of the climate crisis remain a notable threat and consumer awareness of them continues to grow, businesses are searching for new models to improve their sustainability profile. As a result, the implementation of a company’s sustainability vision following the SDGs has to be linked closely to the integration of customers into strategic action. One success factor is the management of customers over their entire life cycle. The Customer Journey serves as a model to systematise this approach, by designing touchpoints throughout the purchasing process in order to motivate consumers to act sustainably. Based on behaviour models, the authors develop recommendations for the food industry on designing a sustainable Customer Journey that helps reduce the share of consumers who report positive attitudes towards sustainable products yet do not exhibit the corresponding behaviour.
Imagine a world in which the search for tomorrow's trends of (software) products is not subject to a long and laborious data search but is possible with a single mouse click. Through the use of artificial intelligence (AI), this reality is made possible and is to be further advanced through research. The study therefore aims to provide an initial overview of the young research field. Based on desk research, expert interviews, and company and student surveys, current application possibilities of AI in the innovation process (defined as Smart Innovation) and existing challenges that slow down further development are discussed in more detail, and future application possibilities are presented. Finally, a recommendation for action is made for business, politics and science to help overcome the current obstacles together and thus drive the future of Smart Innovation.
In recent years, the Graph Model has become increasingly popular, especially in the application domain of social networks. The model has been semantically augmented with properties and labels attached to the graph elements. Because the model does not require a schema, it is difficult to ensure data quality for the properties and the data structure. In this paper, we propose a schema-bound Typed Graph Model with properties and labels. These enhancements improve not only data quality but also the quality of graph analysis. The power of this model comes from hyper-nodes and hyper-edges, which allow data structures to be presented on different abstraction levels. We prove that the model is at least equivalent in expressive power to the most popular data models. Therefore, it can be used as a supermodel for model management and data integration. We illustrate by example the superiority of this model over the property graph data model of Hidders and other prevalent data models, namely the relational, object-oriented, and XML models, and RDF Schema.
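As a rough illustration of what schema binding buys over a schema-free property graph, the following sketch validates node properties and edge endpoints against declared types. The `TypedGraph` class, its labels, and the property types are invented for illustration; it does not reproduce the paper's formal model (hyper-nodes and hyper-edges in particular are omitted).

```python
# Minimal sketch of a schema-bound ("typed") property graph: node and edge
# types declare required property types and endpoint labels, and every
# insertion is validated against them. All names here are illustrative.
class TypedGraph:
    def __init__(self, node_types, edge_types):
        self.node_types = node_types      # label -> {property: python type}
        self.edge_types = edge_types      # label -> (source label, target label)
        self.nodes, self.edges = {}, []

    def add_node(self, nid, label, **props):
        schema = self.node_types[label]
        # Property names must match the schema exactly...
        assert set(props) == set(schema), f"properties must match schema of {label}"
        # ...and every value must have the declared type.
        for key, value in props.items():
            assert isinstance(value, schema[key]), f"{key} must be {schema[key].__name__}"
        self.nodes[nid] = (label, props)

    def add_edge(self, src, dst, label):
        src_label, dst_label = self.edge_types[label]
        # Endpoint labels must match the edge type's declaration.
        assert self.nodes[src][0] == src_label and self.nodes[dst][0] == dst_label
        self.edges.append((src, dst, label))

g = TypedGraph({"Person": {"name": str}, "Post": {"text": str}},
               {"WROTE": ("Person", "Post")})
g.add_node(1, "Person", name="Ada")
g.add_node(2, "Post", text="hello")
g.add_edge(1, 2, "WROTE")
```

An ill-typed insertion, such as a `Person` node whose `name` is an integer, is rejected at insertion time rather than surfacing later during analysis.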
Deep learning-based EEG detection of mental alertness states from drivers under ethical aspects
(2021)
One of the most critical factors for a successful road trip is a high degree of alertness while driving. Even a split second of inattention or sleepiness at a crucial moment can make the difference between life and death. Several prestigious car manufacturers are currently pursuing the aim of automated drowsiness identification to resolve this problem. The path between neuroscientific research combined with artificial intelligence and the preservation of the dignity and inviolability of the human individual is very narrow. The key contribution of this work is a system for analysing EEG data during a driving session, which draws on previous studies analyzing heart rate (ECG), brain waves (EEG), and eye function (EOG). The gathered data is treated as sensitively as possible, taking ethical regulations into consideration. Obtaining evaluable signs of evolving exhaustion involves techniques that extract sleep-stage frequencies; a problem here is the correlated interference in the signal. This research focuses on a processing chain for EEG band splitting that involves band-pass filtering, principal component analysis (PCA), independent component analysis (ICA) with automatic artefact removal, and the fast Fourier transform (FFT). The classification is based on a step-by-step adaptive deep learning analysis that detects theta rhythms as a drowsiness predictor in the pre-processed data. An offline detection rate of 89% and an online detection rate of 73% were obtained. The method is tied to the simulated driving scenario for which it was developed, which leaves room for further optimization of laboratory methods and data collection during wakefulness-dependent operations.
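The processing chain described above (band-pass filtering, PCA, ICA, FFT, theta-band extraction) can be sketched roughly as follows. This is a minimal illustration on synthetic data: the filter order, band edges, component counts, and sampling rate are assumptions rather than the values used in the study, and the automatic artefact removal and the deep learning classifier are omitted.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.decomposition import PCA, FastICA

def theta_power(eeg, fs=256.0):
    """Band-pass filter, reduce with PCA, unmix with ICA, then estimate
    theta-band (4-8 Hz) power via FFT. Parameters are illustrative only."""
    # 1. Band-pass 1-40 Hz to remove drift and high-frequency noise
    b, a = butter(4, [1.0 / (fs / 2), 40.0 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, eeg, axis=0)
    # 2. PCA for dimensionality reduction, ICA to separate sources
    comps = PCA(n_components=min(4, eeg.shape[1])).fit_transform(filtered)
    sources = FastICA(n_components=comps.shape[1], random_state=0).fit_transform(comps)
    # 3. FFT on each source, average power over the theta band
    freqs = np.fft.rfftfreq(sources.shape[0], d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(sources, axis=0)) ** 2
    theta = (freqs >= 4.0) & (freqs <= 8.0)
    return spectrum[theta].mean()

# Synthetic 8-channel signal with an embedded 6 Hz (theta) component
fs = 256.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
eeg = rng.normal(size=(t.size, 8)) + np.outer(np.sin(2 * np.pi * 6 * t), np.ones(8))
print(f"theta-band power: {theta_power(eeg, fs):.3f}")
```

In the actual system, the resulting theta-band features would feed the adaptive deep learning classifier rather than being thresholded directly.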
For more than twelve years now, Informatics Inside has been held as a computer science conference at Reutlingen University, this year for the second time on a half-yearly cycle, i.e. also in autumn. This scientific conference of the master's programme Human-Centered Computing is organized and run by the students themselves. During their master's studies, they are given the opportunity to explore a specialist topic of their own choosing in depth. This can be carried out at the university, in a company, at a research institute or abroad. It is precisely this flexible design of the module "Wissenschaftliche Vertiefung" (scientific specialization) that leads to the very broad range of topics the students work on. Besides the actual specialist work, presenting and defending scientific results also plays an important role, well beyond the degree itself. Preparing and communicating a chosen subject area in a generally accessible way, so that it becomes understandable to non-specialists, is always a particular challenge. The students take on this challenge at the autumn conference on scientific specialization on 24 November 2021. For the fourth time already, the event will take place in an online mode, including a virtual accompanying programme.
The range of topics at this year's autumn conference is once again very diverse and wide-ranging. Among other things, you can expect contributions from the health sector, machine learning, AI and VR, as well as marketing and e-learning. Common to all of them is a very strong connection to innovative computer science approaches, which is also reflected in the conference's pun and motto "RockIT Science". Computer science permeates almost all professional and private areas of application and has an ever greater influence on our daily lives. This can trigger concern on the one hand and enthusiasm on the other. It is precisely the latter that the students want to achieve with their contributions, letting it "rock" in the computer science sector for once.
Context-aware systems to support actors in the operating room depending on the status of the intervention require knowledge about the current situation in the intra-operative area. In literature, solutions to achieve situation awareness already exist for specific use cases, but applicability and transferability to other conditions are less addressed. It is assumed that a unified solution that can be adapted to different processes and sensors would allow for greater flexibility, applicability, and thus transferability to different applications. To enable a flexible and intervention-independent system, this work proposes a concept for an adaptable situation recognition system. The system consists of four layers with several modular components for different functionalities. The feasibility is demonstrated via prototypical implementation and functional evaluation of a first basic framework prototype. Further development goal is the stepwise extension of the prototype.
Silicon photonic micro-ring resonators (MRR) developed on the silicon-on-insulator (SOI) platform, owing to their high sensitivity and small footprint, show great potential for many chemical and biological sensing applications such as label-free detection in environmental monitoring, biomedical engineering, and food analysis. In this tutorial, we provide the theoretical background and give design guidelines for SOI-based MRR as well as examples of surface functionalization procedures for label-free detection of molecules. After introducing the advantages and perspectives of MRR, the fundamentals of MRR are described in detail, followed by an introduction to the fabrication methods, which are based on complementary metal-oxide semiconductor (CMOS) technology. Optimization of MRR for chemical and biological sensing is provided, with special emphasis on the optimization of waveguide geometry. At this point, the difference between chemical bulk sensing and label-free surface sensing is explained, and definitions such as waveguide sensitivity, ring sensitivity, overall sensitivity, and the limit of detection (LoD) of MRR are introduced. Further, we show and explain chemical bulk sensing of sodium chloride (NaCl) in water and provide a recipe for label-free surface sensing.
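For bulk sensing, the quantities defined above relate in a simple way: the overall sensitivity S is the resonance shift per refractive-index change, and the limit of detection is the spectral noise of the readout divided by S. A small worked example, with all numbers invented for illustration rather than taken from the tutorial:

```python
# Illustrative bulk-sensing arithmetic (numbers are assumed, not measured):
resonance_shift_nm = 0.35      # resonance shift for a given NaCl solution
index_change_riu = 0.005       # refractive-index change of that solution (RIU)
wavelength_noise_nm = 0.002    # 3-sigma spectral noise of the readout

# Overall sensitivity: resonance shift per refractive-index unit
sensitivity = resonance_shift_nm / index_change_riu   # nm/RIU
# Limit of detection: smallest resolvable index change
lod = wavelength_noise_nm / sensitivity               # RIU
print(f"S = {sensitivity:.0f} nm/RIU, LoD = {lod:.1e} RIU")
```

With these assumed numbers, a 70 nm/RIU sensitivity and 2 pm of spectral noise yield an LoD on the order of 10⁻⁵ RIU; real devices differ depending on waveguide geometry and readout.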
Together with many success stories, promises such as increased production speed and improved collaboration among stakeholders have contributed to making agile a transformation in the software industry in which many companies want to take part. However, driven either by a natural and expected evolution or by contextual factors that challenge the adoption of agile methods as prescribed by their creator(s), software processes in practice mutate into hybrids over time. Are these still agile? In this article, we investigate the question: what makes a software development method agile? We present an empirical study grounded in a large-scale international survey that aims to identify software development methods and practices that improve or tame agility. Based on 556 data points, we analyze the perceived degree of agility in the implementation of standard project disciplines and its relation to the development methods and practices used. Our findings suggest that only a small number of participants operate their projects in a purely traditional or agile manner (under 15%). That said, most project disciplines and most practices show a clear trend towards increasing degrees of agility. Compared to the methods used to develop software, the selection of practices has a stronger effect on the degree of agility of a given discipline. Finally, there are no methods or practices that explicitly guarantee or prevent agility. We conclude that agility cannot be defined solely at the process level. Additional factors need to be taken into account when trying to implement or improve agility in a software company. Finally, we discuss the field of software process-related research in the light of our findings and present a roadmap for future research.
Prior to the introduction of AI-based forecast models in the procurement department of an industrial retail company, we assessed the digital skills of the procurement employees and surveyed their attitudes toward a new digital technology. The aim of the survey was to ascertain important contextual factors which are likely to influence the acceptance and the successful use of the new forecast tool. What we find is that the digital skills of the employees show an intermediate level and that their attitudes toward key aspects of new digital technologies are largely positive. Thus, the conditions for high acceptance and the successful use of the models are good, as evidenced by the high intention of the procurement staff to use the models. In line with previous research, we find that the perceived usefulness of a new technology and the perceived ease of use are significant drivers of the willingness to use the new forecast tool.
Forecasting demand is challenging. Various products exhibit different demand patterns. While demand may be constant and regular for one product, it may be sporadic for another, as well as when demand occurs, it may fluctuate significantly. Forecasting errors are costly and result in obsolete inventory or unsatisfied demand. Methods from statistics, machine learning, and deep learning have been used to predict such demand patterns. Nevertheless, it is not clear for what demand pattern, which algorithm would achieve the best forecast. Therefore, even today a large number of models are used to forecast on a test period. The model with the best result on the test period is used for the actual forecast. This approach is computationally and time intensive and, in most cases, uneconomical. In our paper we show the possibility to use a machine learning classification algorithm, which predicts the best possible model based on the characteristics of a time series. The approach was developed and evaluated on a dataset from a B2B-technical-retailer. The machine learning classification algorithm achieves a mean ROC-AUC of 89%, which emphasizes the skill of the model.
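A rough sketch of the idea, model selection cast as classification over time-series characteristics: demand features such as the average inter-demand interval (ADI) and the squared coefficient of variation (CV²) feed a classifier that predicts which forecasting model to use. The feature set, the model labels ("ets", "croston"), and the synthetic data are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def demand_features(series):
    """Characterize a demand series: average inter-demand interval (ADI),
    squared coefficient of variation of non-zero demand (CV^2), mean, std."""
    nonzero = series[series > 0]
    adi = len(series) / max(len(nonzero), 1)
    cv2 = (nonzero.std() / nonzero.mean()) ** 2 if len(nonzero) > 1 else 0.0
    return [adi, cv2, series.mean(), series.std()]

rng = np.random.default_rng(1)
# Synthetic training set: smooth series labelled "ets",
# intermittent series labelled "croston" (labels are invented here)
X, y = [], []
for _ in range(200):
    if rng.random() < 0.5:
        s = rng.poisson(20, 36).astype(float)                  # regular demand
        label = "ets"
    else:
        s = (rng.poisson(2, 36) * rng.binomial(1, 0.3, 36)).astype(float)  # sporadic
        label = "croston"
    X.append(demand_features(s))
    y.append(label)

# The classifier learns which model fits which demand pattern
clf = RandomForestClassifier(random_state=0).fit(X, y)
sporadic = (rng.poisson(2, 36) * rng.binomial(1, 0.3, 36)).astype(float)
print(clf.predict([demand_features(sporadic)])[0])
```

The appeal of this approach, as the abstract notes, is that a single cheap classification replaces fitting every candidate model on a test period before each forecast.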
At literally the last minute, the English government and the European Union agreed on a comprehensive treaty to prevent an unregulated Brexit. After years of tough marathon negotiations, the rejoicing is muted, yet there is relief on both sides of the English Channel because a modus vivendi has been found on which future relations can be built and continued. Whether the English pipe dreams attached to Brexit will be fulfilled remains to be seen.
The strategy and tactics of the English governments regarding Brexit and in the withdrawal negotiations mirror the experiences Friedrich List had to undergo exactly 175 years ago in his efforts to forge a German-English alliance. Because of the insular and trade supremacy that England strictly pursued even then, he had to admit that England would defend this position tenaciously, and so, frustrated and disillusioned, he abandoned his plans. He therefore placed his hopes in a "continental alliance" of the European nations, such as has now emerged following the United Kingdom's withdrawal from the European Union. Perhaps we will now have to get used to the term "continental alliance" and be reminded of Friedrich List's foresight.
On the other hand, the motto of List's second Paris prize essay also applies to English politics: "Le monde marche - the world moves on", though under completely different circumstances than 175 years ago: the axis of world trade has shifted from the western to the eastern hemisphere; the British Empire is history; the pace of global change has accelerated dramatically; and despite the lingua franca, England appears, especially from an Asian perspective, as no more than a small dot on the world map. Should the Scottish government push through its intention and achieve independence from the United Kingdom, Brexit would prove to be a fateful boomerang.
Digitalization and mediatization are shaping society and also adult and continuing education. This article explores the question of how digitalization succeeds in adult and continuing education programmes, thereby focusing on the use of digital media. To this end, programme development for target groups and participants, media-related content, teaching and learning arrangements with digital media, the use of digital media, and the accessibility of teaching and learning materials are identified as relevant characteristics. Overall, the analysed interview data show that the use of digital media in such programmes extends the didactic tasks involved, since programmes with digital media must be tailored precisely to the needs and capabilities of target groups and participants.
The hearing contact lens (HCL) is a new type of hearing aid device. One of its main components is a piezoelectric actuator (PEA). In order to evaluate and maximize the HCL's performance, a model of the HCL coupled to the middle ear was developed using a finite element (FE) approach. To validate the model, vibrational measurements on the HCL and on temporal bones were performed using a laser Doppler vibrometer (LDV). The model was validated step by step, starting with the HCL only. Then a silicone cap was fitted onto the HCL to provide an interface between the HCL and the tympanic membrane. The HCL was placed on the tympanic membrane and additional measurements were performed to validate the coupled model. The model was used to evaluate the sensitivity of geometrical and material parameters with respect to performance measures of the HCL. Moreover, deeper insight was gained into the feedback behavior, which causes whistling sounds, and into the contact between the HCL and the tympanic membrane.
This paper presents a permanent magnet tubular linear generator system for powering passive sensors by harvesting vertical vibration energy. The system consists of a permanent magnet tubular linear vibration generator and electric circuits. By using mechanically resonant movers, the generator is capable of converting low-frequency, small-amplitude vertical vibration energy into more regular sinusoidal electrical energy. The distribution of the magnetic field and the electromotive force are calculated by finite element analysis, and the characteristics of the linear vibration generator system are observed. The experimental results show the generator can produce about 0.4 W to 1.6 W of electrical power when the vibration source's amplitude is fixed at 2 mm and the frequency is between 13 Hz and 22 Hz.
So-called cloud-based management information systems are a fairly new phenomenon in management accounting. Quite a few companies (and especially their business managers and management accountants) do not always work via the cloud but rather with hybrid or on-premise solutions of ERP software such as SAP or Oracle, and often still with "manual" solutions such as Microsoft Excel.
This paper takes a holistic view of an IP-traceability process in interorganizational R&D projects, a particular open innovation mode. It aims to show the different technologies that can be used in the front end and back end of a traceability process and to discuss their suitability for data from creativity processes in these projects. To achieve this goal, a two-stage literature review on different technologies in the context of traceability was conducted. Criteria were then derived from the characteristics of data from creativity processes and of interorganizational R&D projects, against which the resulting technologies were discussed. Finally, recommendations regarding suitable technologies for tracing individual creativity artifacts in interorganizational R&D projects are given.
This study shows that the topic of smart innovation (the use of AI systems in the innovation process) is highly relevant and that there is approval for the use of AI in the innovation process. Both companies and students cite increased efficiency, faster processing of large volumes of data, improved competitiveness, and cost savings as reasons for using AI in the innovation process. In Germany, AI technologies are already being applied selectively in the innovation process, independent of industry. Influencing factors such as university cooperations, innovation departments, and open innovation can promote their use. SMEs from the early phases of industrialization in particular should take advantage of this. The secret to an innovation process that is as efficient as possible lies in the interplay between human expertise and the fast, precise data processing of AI. It becomes clear that various influencing factors are required to make the application of smart innovation practicable. First, the technical prerequisites of a functioning IT infrastructure must be met. Equally important are open questions regarding data availability, data ownership, and data security. Without a legal framework, few actors are willing to share their data and make it accessible. The use of AI is further hampered by the national shortage of IT specialists: both companies and students see the greatest obstacle in the lack of AI-relevant know-how. On the one hand, this inhibits research; on the other, companies lack the specialists required to introduce AI. It is nevertheless necessary to convey the potential and opportunities of smart innovation to companies by presenting application examples.
Application-oriented research must be promoted and a smooth transfer to industry ensured. This exchange of knowledge also requires a greater entrepreneurial willingness to take risks, and the need to design company-specific AI strategies is growing. Technologies are developing rapidly, so companies must adapt to this progress in order not to fall behind and to safeguard their competitiveness. The greatest challenge lies in the fundamental transformation of business models, as the value creation of successful companies is increasingly based on digital assets. Data is generally regarded as the new resource, the raw material, for smart innovations as well. The importance of smart innovation will continue to grow. In the short and medium term, weak AI will primarily support data collection and analysis, process automation, and the identification of needs and trends. Furthermore, incremental improvements in innovation management are expected through simulations and the random combination of technologies. In the long term, stronger AI will be able to partially replace humans in the innovation process. Whether autonomous innovation will be possible in the future depends first on the degree of novelty of an innovation, but above all on the possibility of a creative AI. It can be assumed that advances in AI will not only enable radical innovations but also lead to a structural change in our current understanding of innovation management.
The Friedrich-List-Gesellschaft (FLG) was founded in 1925 under the most adverse economic and political conditions and continued its work until 1934. Its primary purpose was to collect the widely scattered, hard-to-access, and in many cases unknown writings, speeches, and letters of Friedrich List (1789–1846) and to publish them in the form of a complete edition.
Neither this ten- or twelve-volume complete edition nor the names of its editors have received the appreciation and attention they deserve in economics. The present article repays this long-overdue debt of gratitude after nearly 100 years. Without the dedicated and courageous commitment of the editors, in particular Edgar Salin, List research would be inconceivable and German economics would be poorer by a glorious chapter.
For five decades, research into the life, work, and legacy of Friedrich List (1789–1846) has been at the center of Eugen Wendler's scholarly work. Over this time, around 30 monographs and a considerable number of scientific essays and journalistic articles have emerged. In doing so, Eugen Wendler built on the invaluable preparatory work of the editors of the complete edition of List's works from 1925 to 1935.
This essay provides an overview of Eugen Wendler's book publications on List research. With his impressive oeuvre, he acknowledges himself to be the last living fossil in the succession of the FLG, thereby paying the editors the due and long-overdue appreciation and respect.
In various German cities, free-floating e-scooter sharing is an emerging trend in e-mobility. Trends such as climate change, urbanization, and demographic change are forcing society to develop new mobility solutions. In contrast to the more thoroughly researched field of car sharing, the usage patterns and behaviors of e-scooter sharing customers still need to be analyzed. This presumably enables better targeting of customers as well as adaptations of the business model to increase scooter utilization and therefore the profit of e-scooter providers. The customer journey is digitally traceable from registration to scooter reservation and the ride itself, and these data make it possible to identify customer needs and motivations. We analyzed a dataset from 2017 to 2019 of an e-scooter sharing provider operating in a big German city. Based on this dataset, we propose a customer clustering that identifies three different customer segments, allowing multiple conclusions to be drawn for business development and for improving the problem-solution fit of the e-scooter sharing model.
Forecasting intermittent and lumpy demand is challenging. Demand occurs only sporadically and, when it does, it can vary considerably. Forecast errors are costly, resulting in obsolescent stock or unmet demand. Methods from statistics, machine learning and deep learning have been used to predict such demand patterns. Traditional accuracy metrics are often employed to evaluate the forecasts; however, these come with major drawbacks, such as not taking horizontal and vertical shifts over the forecasting horizon into account, or indeed stock-keeping or opportunity costs. This results in a disadvantageous selection of methods in the context of intermittent and lumpy demand forecasts. In our study, we compare methods from statistics, machine learning and deep learning by applying a novel metric called Stock-keeping-oriented Prediction Error Costs (SPEC), which overcomes the drawbacks associated with traditional metrics. Taking the SPEC metric into account, the Croston algorithm achieves the best result, just ahead of a Long Short-Term Memory Neural Network.
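For orientation, a minimal sketch of Croston's method, which the study found to perform best under the SPEC metric; the smoothing constant and the demand series below are illustrative, not taken from the paper:

```python
def croston(demand, alpha=0.1):
    """Croston's method for intermittent demand: smooth the nonzero
    demand size and the inter-demand interval separately; the forecast
    per period is smoothed size divided by smoothed interval."""
    z = None  # smoothed demand size (None until first demand occurs)
    p = None  # smoothed inter-demand interval
    q = 1     # periods elapsed since the last nonzero demand
    forecasts = []
    for d in demand:
        # Forecast for the current period, made before observing d
        forecasts.append(0.0 if z is None else z / p)
        if d > 0:
            if z is None:
                z, p = float(d), float(q)  # initialize on first demand
            else:
                z = z + alpha * (d - z)
                p = p + alpha * (q - p)
            q = 1
        else:
            q += 1
    return forecasts
```

Updates happen only in periods with nonzero demand, which is what distinguishes Croston's method from plain exponential smoothing on an intermittent series.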
Omnichannel retailing and sustainability are two important challenges for the fast fashion industry. However, the sustainable behavior of fast fashion consumers in an omnichannel environment has not received much attention from researchers. This paper aims to examine the factors that determine consumers’ willingness to participate in fast fashion brands’ used clothes recycling plans in an omnichannel retail environment. In particular, we examine the impact of individual consumer characteristics (environmental attitudes, consumer satisfaction), organizational arrangements constitutive for omnichannel retailing (channel integration), and their interplay (brand identification, impulsive consumption). A conceptual model was developed based on findings from previous research and tested on data that were collected online from Chinese fast fashion consumers. Findings suggest that consumers’ intentions for clothes recycling are mainly determined by individual factors, such as environmental attitudes and consumer satisfaction. Organizational arrangements (perceived channel integration) showed smaller effects. This study contributes to the literature on omnichannel (clothing) retail, as well as on sustainability in the clothing industry, by elucidating individual and organizational determinants of consumers’ recycling intentions for used clothes in an omnichannel environment. It helps retailers to organize used clothes recycling plans in an omnichannel environment and to motivate consumers to participate in them.
Digitalization increases the pressure for companies to innovate. While current research on digital transformation mostly focuses on technological and management aspects, less attention has been paid to organizational culture and its influence on digital innovations. The purpose of this paper is to identify the characteristics of organizational culture that foster digital innovations. Based on a systematic literature review on three scholarly databases, we initially found 778 articles that were then narrowed down to a total number of 23 relevant articles through a methodical approach. After analyzing these articles, we determine nine characteristics of organizational culture that foster digital innovations: corporate entrepreneurship, digital awareness and necessity of innovations, digital skills and resources, ecosystem orientation, employee participation, agility and organizational structures, error culture and risk-taking, internal knowledge sharing and collaboration, customer and market orientation as well as open-mindedness and willingness to learn.
The early detection of head and neck cancer remains a challenging task. It requires a precise and accurate identification of tissue alterations as well as a distinct discrimination of cancerous from healthy tissue areas. A novel approach for this purpose uses microspectroscopic techniques with a special focus on hyperspectral imaging (HSI) methods. Our proof-of-principle study presents the implementation and application of darkfield elastic light scattering spectroscopy (DF ELSS) as a non-destructive, high-resolution, and fast imaging modality to distinguish lingual healthy from altered tissue regions in a mouse model. The main aspect of our study is the comparison of two different HSI detection principles, point-by-point and line scanning imaging, and whether one is more appropriate for differentiating several tissue types. Statistical models are formed by deploying a principal component analysis (PCA) with Bayesian discriminant analysis (DA) on the elastic light scattering (ELS) spectra. Overall accuracy, sensitivity, and precision values of 98% are achieved for both models, whereas the overall specificity reaches 99%. An additional classification of model-unknown ELS spectra is performed. The predictions are verified with histopathological evaluations of identical HE-stained tissue areas to prove the model's capability of tissue distinction. In the context of our proof-of-principle study, we assess the Pushbroom PCA-DA model to be more suitable for tissue type differentiation and thus tissue classification. In addition to HE-based examination in head and neck cancer diagnosis, the use of HSI-based statistical models might be conceivable in daily clinical routine.
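The dimensionality-reduction step underlying such PCA-DA models can be sketched in a few lines; the data layout (rows = spectra, columns = wavelengths) and the component count are illustrative assumptions, not details from the study:

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project spectra onto their first principal components.
    Centers the data, then uses SVD: the right singular vectors
    of the centered matrix are the principal axes."""
    Xc = X - X.mean(axis=0)                      # center each wavelength channel
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T              # scores in component space
```

The low-dimensional scores would then be fed to a discriminant classifier (Bayesian DA in the paper) instead of the raw high-dimensional spectra.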
Gold bipyramids (AuBPs) attract significant attention due to the large enhancement of the electric field around their sharp tips and well-defined tunability of their plasmon resonances. Excitation patterns of single AuBPs are recorded using raster-scanning confocal microscopy combined with radially and azimuthally polarized laser beams. Photoluminescence spectra (PL) and excitation patterns of the same AuBPs are acquired with three different excitation wavelengths. The isotropic excitation patterns suggest that the AuBPs are mainly excited by interband transitions with 488/530 nm radiation, while excitation patterns created with a 633 nm laser exhibit a double-lobed shape that indicates a single-dipole excitation process associated with the longitudinal plasmon resonance mode. We are able to determine the three-dimensional orientation of single AuBPs nonperturbatively by comparing experimental patterns with theoretical simulations. The asymmetric patterns show that the AuBPs are lying on the substrate with an out-of-plane tilt angle of around 10–15°.
Today, many industrial tasks are not automated and still require human intervention. One of these tasks is the unloading of overseas containers: after arriving at the sorting center, the containers must be unloaded manually so that the parcels can be forwarded to the recipients. Robot-based automatic unloading of containers has therefore been researched. However, the promising results of the system developed in these projects could not be commercialized due to problems with its reliability. Mechanical, algorithmic or other limitations are possible causes of the observed errors. To analyze the errors, it is necessary to evaluate the results of the robot's work without complicating the existing system by adding new sensors. This paper presents a reference system based on machine learning to evaluate the robot's grasps of parcels. It analyzes two states of the container: before and after picking up one box. The states are represented as point clouds received from a laser scanner. The proposed system evaluates the success of transferring a box from an overseas container to the sorting line by supervised learning, using convolutional neural networks (CNN) and manual labeling of the data. The process of obtaining a working model using a hyperband model search, with a maximum classification error of 3.9%, is also described.
Focal adhesion clusters (FAC) are dynamic and complex structures that help cells to sense the physicochemical properties of their environment. Research in biomaterials, cell adhesion or cell migration often involves the visualization of FAC by fluorescence staining and microscopy, which necessitates quantitative analysis of FAC and other cell features in microscopy images using image processing. Fluorescence microscopy images of human umbilical vein endothelial cells (HUVEC) obtained at 63x magnification were quantitatively analysed using ImageJ software. A generalised algorithm for selective segmentation and morphological analysis of FAC, nucleus and cell morphology is implemented. Further, a mask-based method is implemented to discriminate FAC near the nucleus from those around the cell periphery. Our algorithm is able to effectively quantify different morphological characteristics of cell components and shows a high sensitivity and specificity while providing a modular software implementation.
This paper presents a modular and scalable power electronics concept for motor control with continuous output voltage. In contrast to multilevel concepts, modules with continuous output voltage are connected in series. The continuous output voltage of each module is obtained by using gallium nitride (GaN) high-electron-mobility transistors (HEMTs) as switches inside the modules, with a switching frequency in the range between 500 kHz and 1 MHz. Due to this high switching frequency, an LC filter is integrated into the module, resulting in a continuous output voltage. A main topic of the paper is the active damping of this LC output filter for each module and the analysis of the damping behaviour of the series connection. The results are illustrated with simulations and measurements.
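For such a filter to yield a continuous output voltage, its corner frequency must sit well below the 500 kHz–1 MHz switching range. A quick sanity check of this design constraint; the component values below are hypothetical, not taken from the paper:

```python
import math

def lc_filter(L, C):
    """Resonant (corner) frequency in Hz and characteristic impedance
    in ohms of an LC low-pass output filter."""
    f0 = 1.0 / (2.0 * math.pi * math.sqrt(L * C))
    z0 = math.sqrt(L / C)
    return f0, z0

# Hypothetical values: 10 uH and 1 uF give a corner near 50 kHz,
# roughly a decade below a 500 kHz switching frequency.
f0, z0 = lc_filter(10e-6, 1e-6)
```

The characteristic impedance z0 matters for damping: a lightly loaded LC filter resonates at f0, which is why the paper's active damping of each module's filter is needed.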
Context: The manufacturing industry is facing a transformation with regard to Industry 4.0 (I4). A transformation towards full automation of production including a multitude of innovations is necessary. Startups and entrepreneurial processes can support such a transformation as has been shown in other industries. However, I4 has some specifics, so it is unclear how entrepreneurship can be adapted in I4. Understanding these specifics is important to develop suitable training programs for I4 startups and to accelerate the transformation.
Objective: This study identifies and outlines the essential characteristics and constraints of entrepreneurial processes in I4.
Method: 14 semi-structured interviews were conducted with experts in the field of I4 entrepreneurship. The interviews were analysed and categorized using qualitative analysis.
Results: The interviews revealed several characteristics of I4 that have a significant impact on the various phases of the entrepreneurial process. Examples of such specifics include the difficult access to customers, the necessary deep understanding of the customer and the domain, the difficulty of testing risky assumptions, and the complex development and productization of solutions. The complexity of hardware and software components, cost structures, and necessary customer-specific customizations affect the scalability of I4 startups. These essential characteristics also require specialised skills and resources from I4 startups.
Soft lithography, a tool widely applied in biology and life sciences with numerous applications, uses the soft molding of photolithography-generated master structures by polymers. The central part of a photolithography set-up is a mask aligner, mostly based on a high-pressure mercury lamp as an ultraviolet (UV) light source. This type of light source requires a high level of maintenance and shows a decreasing intensity over its lifetime, influencing the lithography outcome. In this paper, we present a low-cost, bench-top photolithography tool based on ninety-eight 375 nm light-emitting diodes (LEDs). At approx. 10 W, the presented lithography set-up requires only a fraction of the energy of a conventional lamp; the LEDs have a guaranteed lifetime of 1000 h, which translates into at least 2.5 to 15 times more exposure cycles compared to a standard light source; and with costs of less than €850 it is very affordable. Such a set-up is attractive not only to small academic and industrial fabrication facilities that want to work with photolithography but cannot afford a conventional set-up; microfluidic teaching laboratories and microfluidic research and development laboratories in general could also benefit from this cost-effective alternative. With our self-built photolithography system, we were able to produce structures from 6 μm to 50 μm in height and 10 μm to 200 μm in width. As an optional feature, we present a scaled-down laminar flow hood to enable a dust-free working environment for the photolithography process.
This book provides an introduction to the fundamentals of software engineering. Its focus lies on systematic, model-based software and systems development, but also on the use of agile methods. The authors place particular emphasis on treating practical aspects and the underlying theories equally, which makes the book suitable both as a reference and as a textbook. Software engineering is described comprehensively within a systematic framework; selected, mutually aligned concepts and methods are presented consistently and in an integrated manner.
Software is an integral part of new features in the automotive sector. To determine software quality, car manufacturers in the Hersteller Initiative Software (HIS) consortium defined a set of metrics. Yet, problems with assigning metrics to quality attributes often occur in practice. The specified boundary values lead to discussions between contractors and clients, as different standards and metric sets are used. This paper studies metrics used in the automotive sector and the quality attributes they address. The HIS, ISO/IEC 25010:2011, and ISO/IEC 26262:2018 are utilized to draw a big picture illustrating (i) which metrics and boundary values are reported in literature, (ii) how the metrics match the standards, (iii) which quality attributes are addressed, and (iv) how the metrics are supported by tools. Our findings from analyzing 38 papers include a catalog of 112 metrics, of which 17 define boundary values and 48 are supported by tools. Most of the metrics are concerned with source code, are generic, and are not specifically designed for automotive software development. We conclude that many metrics exist, but a clear definition of the metrics' context, notably regarding the construction of flexible and efficient measurement suites, is missing.
Context:
Test-driven development (TDD) is an agile software development approach that has been widely claimed to improve software quality. However, the extent to which TDD improves quality appears to be largely dependent upon the characteristics of the study in which it is evaluated (e.g., the research method, participant type, programming environment, etc.). The particularities of each study make the aggregation of results untenable.
Objectives:
The goal of this paper is to: increase the accuracy and generalizability of the results achieved in isolated experiments on TDD, provide joint conclusions on the performance of TDD across different industrial and academic settings, and assess the extent to which the characteristics of the experiments affect the quality-related performance of TDD.
Method:
We conduct a family of 12 experiments on TDD in academia and industry. We aggregate their results by means of meta-analysis. We perform exploratory analyses to identify variables impacting the quality-related performance of TDD.
Results:
TDD novices achieve a slightly higher code quality with iterative test-last development (i.e., ITL, the reverse approach of TDD) than with TDD. The task being developed largely determines quality. The programming environment, the order in which TDD and ITL are applied, or the learning effects from one development approach to another do not appear to affect quality. The quality-related performance of professionals using TDD drops more than for students. We hypothesize that this may be due to their being more resistant to change and potentially less motivated than students.
Conclusion:
Previous studies seem to provide conflicting results on TDD performance (i.e., positive vs. negative, respectively). We hypothesize that these conflicting results may be due to different study durations, experiment participants being unfamiliar with the TDD process, or case studies comparing the performance achieved by TDD vs. the control approach (e.g., the waterfall model), each applied to develop a different system. Further experiments with TDD experts are needed to validate these hypotheses.
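The aggregation step in the Method section can be illustrated with standard inverse-variance fixed-effect pooling; the exact meta-analytic model and the effect sizes below are illustrative assumptions, not figures from the family of experiments:

```python
import math

def fixed_effect_meta(effects, variances):
    """Pool per-experiment effect sizes y_i with inverse-variance
    weights w_i = 1 / v_i:
        pooled = sum(w_i * y_i) / sum(w_i)
    The standard error of the pooled effect is sqrt(1 / sum(w_i))."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * y for w, y in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se
```

More precise experiments (smaller variances) receive larger weights, which is how a family of heterogeneous TDD experiments can still yield one joint estimate.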
Fault diagnosis of rolling bearings is an essential process for improving the reliability and safety of rotating machinery. It is always a major challenge to ensure fault diagnosis accuracy, in particular under severe working conditions. In this article, a deep adversarial domain adaptation (DADA) model is proposed for rolling bearing fault diagnosis. This model constructs an adversarial adaptation network to solve a problem commonly encountered in numerous real applications: the source domain and the target domain are inconsistent in their distribution. First, a deep stacked autoencoder (DSAE) is combined with representative feature learning for dimensionality reduction; this combination provides an unsupervised learning method to effectively acquire fault features, while domain adaptation and recognition classification are implemented using a softmax classifier to augment classification accuracy. Second, the effects of the number of hidden layers in the stacked autoencoder network, the number of neurons in each hidden layer, and the hyperparameters of the proposed fault diagnosis algorithm are analyzed. Third, a comprehensive analysis is performed on real data to validate the performance of the proposed method; the experimental results demonstrate that the new method outperforms existing machine learning and deep learning methods in terms of classification accuracy and generalization ability.
In the era of precision medicine, digital technologies and artificial intelligence, drug discovery and development face unprecedented opportunities for product and business model innovation, fundamentally changing the traditional approach of how drugs are discovered, developed and marketed. Critical to this transformation is the adoption of new technologies in the drug development process, catalyzing the transition from serendipity-driven to data-driven medicine. This paradigm shift comes with a need for both translation and precision, leading to a modern Translational Precision Medicine approach to drug discovery and development. Key components of Translational Precision Medicine are multi-omics profiling, digital biomarkers, model-based data integration, artificial intelligence, biomarker-guided trial designs and patient-centric companion diagnostics. In this review, we summarize and critically discuss the potential and challenges of Translational Precision Medicine from a cross-industry perspective.
By 2019, Germany-based Kärcher, “the world’s leading provider of cleaning technology,” had turned its professional cleaning devices into IoT products. The data generated by these IoT-connected cleaning devices formed a key ingredient in the company’s ongoing strategic shift in its B2B business: Kärcher was transforming from a seller of cleaning devices to a provider of consulting services in order to help professional cleaning companies improve their cleaning processes. Based on interviews with seven IT- and non-IT executives, the case illustrates how the company learned to generate value from IoT products. And it demonstrates how a family-owned company transformed its organization in order to be able to more effectively develop and provide IoT products, while adding roles, developing technology platforms, and changing organizational structures and ways of working.