The strong demand for a transformation of the textile and fashion industry towards sustainability requires the continuous implementation of the guiding principle of Education for Sustainable Development (ESD) in education and industry [1, 2]. As a first step of the European research project "Sustainable fashion curriculum at textile Universities in Europe - Development, Implementation and Evaluation of a Teaching Module for Educators" (Fashion DIET), a continuing education module is being developed to implement ESD as a guiding principle in university teaching. The research-based teaching and learning materials are delivered through an e-learning portal.
External charging infrastructure can be supplied from the power grid of a public property in compliance with the law. Until now, the requirement was to realize the supply via a separate (new) grid connection point. The solution presented here is considerably more favorable ecologically, economically, and technically, and serves as a model for the further development of state-owned parking space throughout Baden-Württemberg. A virtual power plant enables community-oriented operation.
We address the problem of 3D face recognition based on either 3D sensor data, or on a 3D face reconstructed from a 2D face image. We focus on 3D shape representation in terms of a mesh of surface normal vectors. The first contribution of this work is an evaluation of eight different 3D face representations and their multiple combinations. An important contribution of the study is the proposed implementation, which allows these representations to be computed directly from 3D meshes, instead of point clouds. This enhances their computational efficiency. Motivated by the results of the comparative evaluation, we propose a 3D face shape descriptor, named Evolutional Normal Maps, that assimilates and optimises a subset of six of these approaches. The proposed shape descriptor can be modified and tuned to suit different tasks. It is used as input for a deep convolutional network for 3D face recognition. An extensive experimental evaluation using the Bosphorus 3D Face, CASIA 3D Face and JNU-3D Face datasets shows that, compared to state-of-the-art methods, the proposed approach is better in terms of both computational cost and recognition accuracy.
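The underlying building block of such mesh-based representations is the surface normal field. As a minimal illustration (not the paper's Evolutional Normal Maps descriptor, just the per-vertex normal computation it builds on), area-weighted vertex normals can be computed directly from a triangle mesh:

```python
import numpy as np

def vertex_normals(vertices, faces):
    """Area-weighted per-vertex normals for a triangle mesh.

    vertices: (V, 3) float array of 3D positions
    faces:    (F, 3) int array of vertex indices per triangle
    """
    v = vertices[faces]  # (F, 3, 3): the three corners of each triangle
    # The cross product of two edges is a face normal scaled by twice the
    # triangle area, so summing raw cross products area-weights automatically.
    face_n = np.cross(v[:, 1] - v[:, 0], v[:, 2] - v[:, 0])
    normals = np.zeros_like(vertices)
    for i in range(3):
        # unbuffered accumulation handles repeated vertex indices correctly
        np.add.at(normals, faces[:, i], face_n)
    lengths = np.linalg.norm(normals, axis=1, keepdims=True)
    return normals / np.clip(lengths, 1e-12, None)

# Toy mesh: a unit tetrahedron
verts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
faces = np.array([[0, 2, 1], [0, 1, 3], [0, 3, 2], [1, 2, 3]])
n = vertex_normals(verts, faces)
print(n.shape)  # (4, 3)
```

Computing normals per mesh face rather than estimating them from an unstructured point cloud avoids a costly neighborhood search, which is the efficiency argument the abstract refers to.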
The Fourteenth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2022), held between May 22 – 26, 2022, continued a series of international events covering a broad spectrum of topics related to advances in database fundamentals, the evolving relationship between databases and other domains, database technologies and content processing, as well as the specifics of databases in application domains.
Advances in different technologies and domains related to databases triggered substantial improvements in content processing, information indexing, and data, process, and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the widespread adoption of XML.
High-speed communications and computations, large storage capacities, and load balancing for distributed database access allow new approaches to content processing with incomplete patterns, advanced ranking algorithms, and advanced indexing methods.
Evolution in e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems puts pressure on database communities to push the 'de facto' methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
Determining the entity and dignity of salivary gland tumors (ST) pre-, intra-, and postoperatively on the basis of histomorphological criteria alone is often associated with great uncertainty.
The spectra of Raman spectroscopy (RS) and infrared spectroscopy (IR) contain information on the molecular composition of the examined tissue. The aim of this work was to establish a tissue-processing workflow and to analyze the influence of fixation on the spectral bio-information. In addition, an overview of the use of RS and IR spectroscopy in the head and neck region is given.
Consecutive 10 mm thick cryo-, formalin-, and paraffin-fixed ST tissue sections of cystadenolymphomas (n=5) and pleomorphic adenomas (n=4) were examined with RS and IR spectroscopy, and the data were analyzed multivariately. The measurements were correlated with histomorphology via a corresponding HE-stained section, both in tumor tissue and in healthy salivary gland tissue.
The mean-spectra analysis showed a clear paraffin signature, whereas formalin fixation had no substantial influence. This was confirmed by principal component analysis (PCA). Discrimination of tumor and non-tumor tissue by PCA and coupled discriminant analysis was likewise possible with both spectroscopic methods at high sensitivity.
For the translation of spectral methods into practice, knowledge of how tissue processing and fixation influence the spectral bio-information is indispensable. Spectral methods can be integrated additively into existing workflows. The influence of formalin fixation on the spectral bio-information is small. The bioinformatic analysis of the extensive datasets remains challenging.
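The PCA-plus-discriminant-analysis workflow described above can be sketched in a few lines. The spectra below are synthetic stand-ins (the study's Raman/IR data are not public); the class-dependent band at an assumed wavenumber bin is purely illustrative:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic spectra: 60 samples x 500 wavenumber bins. Class 1 ("tumor")
# carries an extra band near bin 300 on top of a shared baseline peak.
n_samples, n_bins = 60, 500
y = rng.integers(0, 2, n_samples)
bins = np.arange(n_bins)
base = np.exp(-((bins - 250) / 40.0) ** 2)
X = base + 0.05 * rng.standard_normal((n_samples, n_bins))
X[y == 1] += 0.1 * np.exp(-((bins - 300) / 30.0) ** 2)

# PCA compresses the high-dimensional spectra; LDA then discriminates
# the two tissue classes in the reduced space.
clf = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
scores = cross_val_score(clf, X, y, cv=5)
print(round(scores.mean(), 2))
```

Coupling PCA with a discriminant step is a common remedy when the number of spectral bins far exceeds the number of tissue samples, which is exactly the situation the abstract describes.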
IZKF Würzburg
This paper reviews suggestions for changes to database technology coming from the work of many researchers, particularly those working with evolving big data. We discuss new approaches to remote data access and standards that better provide for durability and auditability in settings including business and scientific computing. We propose ways in which the language standards could evolve, with proof-of-concept implementations on GitHub.
We study whether compulsory religious education in schools affects students' religiosity as adults. We exploit the staggered termination of compulsory religious education across German states in models with state and cohort fixed effects. Using three different datasets, we find that abolishing compulsory religious education significantly reduced religiosity of affected students in adulthood. It also reduced the religious actions of personal prayer, church-going, and church membership. Beyond religious attitudes, the reform led to more equalized gender roles, fewer marriages and children, and higher labor-market participation and earnings. The reform did not affect ethical and political values or non-religious school outcomes.
The wage gap between women and men (the so-called gender pay gap) is usually studied in populations that have already completed their educational careers. In this article, we examine an earlier phase of employment by analyzing the gender pay gap among students who work alongside their studies. Using data from five cohorts of a student survey in Germany, we describe the gender pay gap and discuss possible explanations. The results show that female students earn on average about 6% less than male students. After accounting for various pay-relevant factors, the gap narrows to 4.1%. One of the main reasons for the difference in pay is the different jobs that male and female students hold.
Being exposed to compulsory religious education in school can have long-run consequences for students’ lives. At different points in time since the 1970s, German states terminated compulsory religious education in public schools and replaced it by a choice between ethics classes and religious education. This article shows that the reform not only led to reduced religiosity in students’ later life, but also eroded traditional attitudes towards gender roles and increased labor-market participation and earnings.
Gender pay gaps are commonly studied in populations with already completed educational careers. We focus on an earlier stage by investigating the gender pay gap among university students working alongside their studies. With data from five cohorts of a large-scale student survey from Germany, we use regression and wage decomposition techniques to describe gender pay gaps and potential explanations. We find that female students earn about 6% less on average than male students, which reduces to 4.1% when accounting for a rich set of explanatory variables. The largest explanatory factor is the type of jobs male and female students pursue.
With the digital transformation, companies experience a change that focuses on shaping the organization into an agile organizational form. In today's competitive and fast-moving business environment, it is necessary to react quickly to changing market conditions. Agility represents a promising option for overcoming these challenges. The path to an agile organization is a development process that requires consideration of numerous levels of the enterprise. This paper examines the impact of digital transformation on agile working practices and the benefits that can be achieved through technology. To cope with today's so-called VUCA (volatility, uncertainty, complexity, and ambiguity) world, agile ways of working can be applied, but project management requires adaptation. In the qualitative study, expert interviews were conducted and analyzed using the grounded theory method. As a result, a model is presented that shows the influencing factors and potentials of agile management in the context of the digital transformation of medium-sized companies.
Uncontrolled movement of instruments in laparoscopic surgery can lead to inadvertent tissue damage, particularly when the dissecting or electrosurgical instrument is located outside the field of view of the laparoscopic camera. The incidence and relevance of such events are currently unknown. The present work aims to identify and quantify potentially dangerous situations using the example of laparoscopic cholecystectomy (LC). Twenty-four final year medical students were prompted to each perform four consecutive LC attempts on a well-established box trainer in a surgical training environment following a standardized protocol in a porcine model. The following situation was defined as a critical event (CE): the dissecting instrument was inadvertently located outside the laparoscopic camera’s field of view. Simultaneous activation of the electrosurgical unit was defined as a highly critical event (hCE). The primary endpoint was the incidence of CEs. While performing 96 LCs, 2895 CEs were observed. Of these, 1059 (36.6%) were hCEs. The median number of CEs per LC was 20.5 (range: 1–125; IQR: 33) and the median number of hCEs per LC was 8.0 (range: 0–54, IQR: 10). Mean total operation time was 34.7 min (range: 15.6–62.5 min, IQR: 14.3 min). Our study demonstrates the significance of CEs as a potential risk factor for collateral damage during LC. Further studies are needed to investigate the occurrence of CEs in clinical practice, not just for laparoscopic cholecystectomy but also for other procedures. Systematic training of future surgeons as well as technical solutions could address this safety issue.
Industrial practice is characterized by random events, also referred to as internal and external turbulences, which disturb the target-oriented planning and execution of production and logistics processes. Methods of probabilistic forecasting, in contrast to single value predictions, allow an estimation of the probability of various future outcomes of a random variable in the form of a probability density function instead of predicting the probability of a specific single outcome. Probabilistic forecasting methods, which are embedded into the analytics process to gain insights for the future based on historical data, therefore offer great potential for incorporating uncertainty into planning and control in industrial environments. In order to familiarize students with these potentials, a training module on the application of probabilistic forecasting methods in production and intralogistics was developed in the learning factory 'Werk150' of the ESB Business School (Reutlingen University). The theoretical introduction to the topic of analytics, probabilistic forecasting methods and the transition to the application domain of intralogistics is done based on examples from other disciplines such as weather forecasting and energy consumption forecasting. In addition, data sets of the learning factory are used to familiarize the students with the steps of the analytics process in a practice-oriented manner. After this, the students are given the task of identifying the influencing factors and required information to capture intralogistics turbulences based on defined turbulence scenarios (e.g. failure of a logistical resource) in the learning factory. Within practical production scenario runs, the students apply probabilistic forecasting using and comparing different probabilistic forecasting methods. 
The graduate training module allows the students to experience the potentials of using probabilistic forecasting methods to improve production and intralogistics processes in context with turbulences and to build up corresponding professional and methodological competencies.
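The contrast between a single-value prediction and a probabilistic forecast that the module teaches can be made concrete with a small sketch. The throughput-time data below are synthetic (the learning factory's real datasets are not public), and the empirical-quantile method is just one simple way to express a forecast as a distribution:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic historical order throughput times (minutes), right-skewed as is
# typical when turbulences (e.g. a failed logistical resource) delay orders.
history = rng.lognormal(mean=3.0, sigma=0.4, size=500)

def probabilistic_forecast(samples, quantiles=(0.1, 0.5, 0.9)):
    """Empirical-quantile forecast: instead of one point estimate, report
    the range of plausible outcomes with their probabilities."""
    return {q: float(np.quantile(samples, q)) for q in quantiles}

point = history.mean()                  # classic single-value prediction
dist = probabilistic_forecast(history)  # probabilistic counterpart

print(f"point forecast: {point:.1f} min")
for q, v in dist.items():
    print(f"P{int(q * 100)}: {v:.1f} min")
```

The spread between the P10 and P90 values is exactly the uncertainty information that a single point forecast discards, and that planning and control can exploit when scheduling around turbulences.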
Bioactive cations, including calcium, copper and magnesium, have shown the potential to become the alternative to protein growth factor-based therapeutics for bone healing. Ion substitutions are less costly, more stable, and more effective at low concentrations. Although they have been shown to be effective in providing bone grafts with more biological functions, the precise control of ion release kinetics is still a challenge. Moreover, the synergistic effect of three or more metal ions on bone regeneration has rarely been studied. In this study, vaterite-calcite CaCO3 particles were loaded with copper (Cu2+) and magnesium (Mg2+). The polyelectrolyte multilayer (PEM) was deposited on CaCuMg-CO3 particles via the layer-by-layer technique to further improve the stability and biocompatibility of the particles and to enable controlled release of multiple metal ions. The PEM-coated microcapsules were successfully combined with collagen at the outermost layer, providing a further stimulating microenvironment for bone regeneration. The in vitro release studies showed remarkably stable release of Cu2+ over 2 months without initial burst release. Mg2+ was released in relatively low concentration in the first 7 days. Cell culture studies showed that CaCuMg-PEM-Col microcapsules stimulated cell proliferation, extracellular maturation and mineralization more effectively than the blank control and other microcapsules without collagen adsorption (Ca-PEM, CaCu-PEM, CaMg-PEM, CaCuMg-PEM). In addition, the CaCuMg-PEM-Col microcapsules showed positive effects on osteogenesis and angiogenesis in gene expression studies. The results indicate that such a functional and controllable delivery system of multiple bioactive ions might be a safer, simpler and more efficient alternative to protein growth factor-based therapeutics for bone regeneration. It also provides an effective method for functionalizing bone grafts for bone tissue engineering.
Context: Companies that operate in the software-intensive business are confronted with high market dynamics, rapidly evolving technologies as well as fast-changing customer behavior. Traditional product roadmapping practices, such as fixed-time-based charts including detailed planned features, products, or services typically fail in such environments. Until now, the underlying reasons for the failure of product roadmaps in a dynamic and uncertain market environment are not widely analyzed and understood.
Objective: This paper aims to identify current challenges and pitfalls practitioners face when developing and handling product roadmaps in a dynamic and uncertain market environment.
Method: To reach our objective we conducted a grey literature review (GLR).
Results: Overall, we identified 40 relevant papers, from which we could extract 11 challenges in the application of product roadmapping in a dynamic and uncertain market environment. The analysis of the articles showed that the major challenges for practitioners lie in overcoming a feature-driven mindset, limiting the level of detail in the product roadmap, and ensuring that the content of the roadmap is not driven solely by management or expert opinion.
Providing a digital infrastructure, platform technologies foster interfirm collaboration between loosely coupled companies, enabling the formation of ecosystems and building the organizational structure for value co-creation. Despite the known potential, the development of platform ecosystems creates new sources of complexity and uncertainty due to the involvement of various independent actors. For a platform ecosystem to succeed, it is essential that the platform ecosystem participants are aligned, coordinated, and given a common direction. Traditionally, product roadmaps have served these purposes during product development. A systematic mapping study was conducted to better understand how product roadmapping could be used in the dynamic environment of platform ecosystems. One result of the study is that there are hardly any concrete approaches for product roadmapping in platform ecosystems so far. However, many challenges on the topic are described in the literature from different perspectives. Based on the results of the systematic mapping study, a research agenda for product roadmapping in platform ecosystems is derived and presented.
A future-oriented alignment of business processes with the principles of sustainable management increases a company's competitiveness, innovative strength, and credibility with all stakeholder groups. Practice also shows that companies can thereby not only address ecological and social aspects but are also better positioned economically, for example through resource savings, higher acceptance in the market and in society, or better employee motivation. VDI 4070 Part 1 provided guidance for action and described a structured procedure for systematically introducing businesses to sustainable management. Complementing this, Part 2 presents exemplary methods as well as proven and innovative instruments, and provides practical application aids and examples. The guideline is aimed at public authorities, consulting firms, and small and medium-sized enterprises.
Cell migration plays an essential role in wound healing and inflammatory processes inside the human body. Peripheral blood neutrophils, a type of polymorphonuclear leukocyte (PMN), are the first cells to be activated during inflammation and subsequently migrate toward an injured tissue or infection site. This response is dependent on both biochemical signaling and the extracellular environment, one aspect of which includes increased temperature in the tissues surrounding the inflammation site. In our study, we analyzed temperature-dependent neutrophil migration using differentiated HL-60 cells. The migration speed of differentiated HL-60 cells was found to correlate positively with temperature from 30 to 42 °C, with higher temperatures inducing a concomitant increase in cell detachment. The migration persistence time of differentiated HL-60 cells was higher at lower temperatures (30–33 °C), while the migration persistence length stayed constant throughout the temperature range. Coupled with the increased speed observed at high temperatures, this suggests that neutrophils are primed to migrate more effectively at the elevated temperatures characteristic of inflammation. Temperature gradients exist on both cell and tissue scales. Taking this into consideration, we also investigated the ability of differentiated HL-60 cells to sense and react to the presence of temperature gradients, a process known as thermotaxis. Using a two-dimensional temperature gradient chamber with a range of 27–43 °C, we observed a migration bias parallel to the gradient, resulting in both positive and negative thermotaxis. To better mimic the extracellular matrix (ECM) environment in vivo, a three-dimensional collagen temperature gradient chamber was constructed, allowing observation of biased neutrophil-like differentiated HL-60 migration toward the heat source.
Context: Nowadays, the market environment is characterized by high uncertainty driven by strong market dynamics, confronting companies with new challenges in creating and updating product roadmaps. Most companies still use traditional approaches, which typically fail in such environments. Therefore, companies are seeking new product roadmapping approaches.
Objective: This paper presents good practices to help companies better understand which factors are required for successful product roadmapping in a dynamic and uncertain market environment.
Method: Based on a grey literature review, essential aspects for conducting product roadmapping in a dynamic and uncertain market environment were identified. Expert workshops were then held with two researchers and three practitioners to develop best practices and the proposed approach for an outcome-driven roadmap. These results were then given to another set of practitioners and their perceptions were gathered through interviews.
Results: The study resulted in 9 good practices that provide practitioners with insights into which aspects are crucial for product roadmapping in a dynamic and uncertain market environment. Moreover, we propose an approach to product roadmapping that provides a flexible structure and focuses on delivering value to the customer and the business. To ensure the latter, the approach consists of the main items: outcome hypotheses, validated outcomes, and discovered outputs.
There is a growing consensus in research and practice that value-creating networks and ecosystems are supplementing the traditional distinction between the internal firm and market perspectives. To achieve joint value in ecosystems, it is crucial to align the various interests of independently acting ecosystem actors and create a common vision. In this paper, we argue that the ecosystem-wide use of product roadmaps may help with this. To get a better understanding of how roadmapping is conducted in the dynamic ecosystem environment, we systematize the main characteristics of product roadmaps and perform a conceptual comparison with the known challenges of ecosystem management. Comparing the two concepts of ecosystems and product roadmaps, we highlight the fit between the characteristics and objectives of the roadmaps and the challenges of ecosystem management. Hence, we propose to experiment with the ecosystem-wide use of product roadmaps as well as the empirical study of the challenges emerging in the process and the associated redesign of the roadmaps.
Advancements in Internet of Things (IoT), cloud and mobile computing have fostered the digital enrichment—or “digitization”—of physical products, which are gaining increasing relevance in practice. According to recent studies, global IoT spending will exceed USD 1 Trillion by 2021 and there will be over 25 billion IoT connections (KPMG, 2018). Porter and Heppelmann (2014) state that IT is “revolutionizing products [as …] IT is becoming an integral part of the product itself.” Senior business executives like GE’s former CEO Jeff Immelt (2015) are even proposing that “every industrial company in the coming age is also going to become a software and analytics company.” This reflects the increasing relevance of IT components’ (i.e., software, data analytics, cloud computing) integration into previously purely physical products. We call IT-enriched physical products, “digitized” products to differentiate them from purely intangible “digital” products, such as digital music, e-books, and software. Examples of digitized products include the Philips Hue smartphone-controllable lightbulb, Audi Connect internet-connected cars, or Rolls-Royce’s sensor-enabled pay per use jet engines.
Digitized products provide their producers with a wide range of opportunities to offer new functionality and product capabilities (e.g., autonomy) that traditional, physical products do not exhibit (Porter and Heppelmann, 2014). In addition, the digitization of products allows producers to continuously repurpose their offerings, by extending and/or changing the product functionality and, thus, enabling new value creation opportunities. Based on their re-programmability and connectivity, digitized products “remain essentially incomplete […] throughout their lifetime as users continue to add and delete […] and change […] functional capabilities” (Yoo, 2013). For instance, the Philips Hue connected lightbulb enables remote control of basic functions (e.g., switching on and off the light) as well as setting more advanced light scenes for day-to-day tasks (e.g., relax, read) via Amazon’s Alexa artificial intelligence assistant (Signify, 2019), offerings that were not intended use cases when Signify (previously known as Philips Lighting) created Hue in 2012. Thus, digitized products present limitless potentials for new functionality and unforeseen use cases, which provides them with a huge innovation capacity.
Despite the limitless potentials offered by digitized products, there has been a slow uptake of digitized products by businesses so far (Jernigan et al., 2016; Mocker et al., 2019). According to a 2016 MIT Sloan Management Review report (Jernigan et al., 2016) only 24% of the investigated firms were actively using IoT technologies – a key technology for digitized products. In a more recent research study Mocker et al. (2019) found that the median revenue share from digital offerings (i.e., solutions based on IT enriched products) in large companies only accounted for 5% of the total revenue of the investigated companies.
The slow uptake of digitized products might be explained by the challenges that firms face regarding the changing nature of digitized products. Pervasive digital technologies (such as IoT) change the nature of products by adding new functionality that was previously not part of the value proposition of the products/services (e.g., a pair of shoes embedded with sensors and connectivity allows joggers to have access to data regarding their run distance, speed, etc.) (Yoo et al., 2012). The addition of new functionality and use cases of digitized products makes it harder for producers to design and develop relevant products (Hui 2014). As described in the paper ‘Do Your Customers Actually Want a “Smart” Version of Your Product?’, “just because [firms] can make something with IoT technology doesn’t mean people will want it.” (Smith, 2017).
The shift in digitized products’ nature poses new challenges for producers along the entire product development process (Porter and Heppelmann, 2015; Yoo et al., 2012) and creates a paradox in product digitization, described by Yoo et al. (2012) as the paradox of pace: while technology accelerates the rate of innovation, companies need to spend more time digitizing their products, extending time to market. The production of these digitized products also becomes more challenging, e.g., as companies need to deal with the different clock speeds of software and hardware development (Porter and Heppelmann, 2015). The above-mentioned challenges suggest that producers need to better understand how they can generate value from their digitized products’ generative potentials.
The body of literature on digitized products has been growing in recent years. For instance, Herterich et al. (2016) investigate how digitized product affordances (i.e., potentials) enable industrial service innovation; Nicolescu et al. (2018) explore the emerging meanings of value associated with IoT; and Benbunan-Fich (2019) studies the impact of basic wearable sensors on the quality of the user experience. However, it remains unclear what it takes for firms to generate value with their digitized product potentials. This dissertation investigates this research gap.
Context: Today, companies face increasing market dynamics, rapidly evolving technologies, and rapid changes in customer behavior. Traditional approaches to product development typically fail in such environments and require companies to transform their often feature-driven mindset into a product-led mindset. A promising first step on the way to a product-led company is a better understanding of how product planning can be adapted to the requirements of an increasingly dynamic and uncertain market environment in the sense of product roadmapping. The authors developed the DEEP product roadmap assessment tool to help companies evaluate their current product roadmap practices and identify appropriate actions for transitioning to a more product-led company.
Objective: The goal of this paper is to gain insight into the applicability and usefulness of version 1.1 of the DEEP model. In addition, the benefits and implications of using the DEEP model in corporate contexts are explored.
Method: We conducted a multiple case study in which participants were observed using the DEEP model. We then interviewed each participant to understand their perceptions of the DEEP model. In addition, we conducted interviews with each company's product management department to learn how the application of the DEEP model influenced their attitudes toward product roadmapping.
Results: The study showed that by applying the DEEP model, participants better understood which artifacts and methods were critical to product roadmapping success in a dynamic and uncertain market environment. In addition, the application of the DEEP model helped convince management and other stakeholders of the need to change current product roadmapping practices. The application also proved to be a suitable starting point for the transformation in the participating companies.
The energy turnaround, digitalization, and decreasing revenues force enterprises in the energy domain to develop new business models. Following a Design Science Research approach, we showed in two action research projects that business models in the energy domain result in complex ecosystems with multiple actors. Additionally, we identified that municipal utilities have problems with the systematic development of business models. To solve this problem, we captured the requirements together with the enterprises' partners in a second phase. We then developed a method that consists of the following components: a method for the creative development of a new business model in the form of a Business Model Canvas (BMC); a mapping between the e3Value ontology and the BMC for modelling a business ecosystem; and the Business Model Configurator (BMConfig) prototype for modelling and simulating the e3Value ontology. The business model can be quantified and analyzed for its viability. We demonstrate the feasibility of our approach with the business model of a power community.
Turning students into Industry 4.0 entrepreneurs: design and evaluation of a tailored study program
(2022)
Startups in the field of Industry 4.0 could be a huge driver of innovation for many industry sectors such as manufacturing. However, there is a lack of education programs to ensure a sufficient number of well-trained founders and thus a supply of such startups. Therefore, this study presents the design, implementation, and evaluation of a university course tailored to the characteristics of Industry 4.0 entrepreneurship. Educational design-based research was applied with a focus on content and teaching concept. The study program was first implemented in 2021 at a German university of applied sciences with 25 students, of which 22 participated in the evaluation. The evaluation of the study program was conducted with a pretest–posttest design targeting three areas: (1) knowledge about the application domain, (2) entrepreneurial intention, and (3) psychological characteristics. Entrepreneurial intention was measured based on the theory of planned behavior. For measuring psychological characteristics, personality traits associated with entrepreneurship were used. Considering the study context and the limited external validity of the study, the results show that a university course can improve participants' knowledge of this particular area. In addition, perceived behavioral control of starting an Industry 4.0 startup was enhanced. However, the results showed no significant effects on psychological characteristics.
Job advertisements are important means of communicating role expectations for management accountants to the labor market. They provide information about which roles of management accountants are sought by companies or which roles are expected. However, which roles are communicated in job advertisements is unknown so far. Using a large sample of 889 job ads and a text-mining approach, we show an apparent mix of different role types with a strong focus on a rather classic role: the watchdog role. However, individuals with business partner characteristics are more often sought for leadership positions or in family businesses and small and medium-sized enterprises (SMEs). The results challenge the current role discussion for management accountants as business partners in practice and some academic fields.
Data governance has been relevant for companies for a long time. Yet, in the broad discussion on smart cities, research on data governance in particular is scant, even though data governance plays an essential role in an environment with multiple stakeholders, complex IT structures, and heterogeneous processes. Indeed, not only can a city benefit from the existing body of knowledge on data governance, but it can also make the appropriate adjustments for its digital transformation. Therefore, this literature review aims to spark research on urban data governance by providing an initial perspective for future studies. It provides a comprehensive overview of data governance and the relevant facets embedded in this strand of research. Furthermore, it provides a fundamental basis for future research on the development of an urban data governance framework.
Hybrid organic/inorganic nanocomposites combine the distinct properties of the organic polymer and the inorganic filler, resulting in overall improved system properties. Monodisperse porous hybrid beads consisting of tetraethylene pentamine functionalized poly(glycidyl methacrylate-co-ethylene glycol dimethacrylate) particles and silica nanoparticles (SNPs) were synthesized under Stoeber sol-gel process conditions. A wide range of hybrid organic/silica nanocomposite materials with different material properties was generated. The effects of n(H2O)/n(TEOS) and c(NH3) on the hybrid bead properties (particle size, SiO2 content, median pore size, specific surface area, pore volume, and size of the SNPs) were studied. Quantitative models with high robustness and predictive power were established using a statistical and systematic approach based on response surface methodology. It was shown that the material properties depend in a complex way on the process factor settings and exhibit non-linear behavior as well as partly synergistic interactions between the process factors. Thus, the silica content, median pore size, specific surface area, pore volume, and size of the SNPs depend non-linearly on the water-to-precursor ratio. This is attributed to the effect of the water-to-precursor ratio on the hydrolysis and condensation rates of TEOS. A possible mechanism of SNP incorporation into the porous polymer network is discussed.
Startups play a key role in software-based innovation. They make an important contribution to an economy’s ability to compete and innovate, and their importance will continue to grow due to increasing digitalization. However, the success of a startup depends primarily on market needs and the ability to develop a solution that is attractive enough for customers to choose. A sophisticated technical solution is usually not critical, especially in the early stages of a startup. It is not necessary to be an experienced software engineer to start a software startup. However, this can become problematic as the solution matures and software complexity increases. Based on a proposed solution for systematic software development for early-stage startups, in this paper, we present the key findings of a survey study to identify the methodological and technical priorities of software startups. Among other things, we found that requirements engineering and architecture pose challenges for startups. In addition, we found evidence that startups’ software development approaches do not tend to change over time. An early investment in a more scalable development approach could help avoid long-term software problems. To support such an investment, we propose an extended model for Entrepreneurial Software Engineering that provides a foundation for future research.
Organizations that operate under uncertainty need to cultivate their ability to manage their primary resource, knowledge, accordingly. Under such conditions, organizations are required to harvest knowledge from two sources: to explore knowledge that is to be found outside the organization as well as to exploit knowledge that is contained within. In a knowledge management context, these exploitation and exploration activities have been conceptualized as knowledge ambidexterity. While ambidexterity has been studied extensively in contexts such as manufacturing or IT, knowledge ambidexterity remains scarcely covered in current knowledge management research. This study illustrates knowledge ambidexterity and elaborates its positive impact on organizational performance. Our study furthermore answers the question of how the use of enterprise social media (ESM) can facilitate the performance effects of knowledge ambidexterity. Drawing on the theory of communication visibility, we argue that ESM (e.g., Microsoft Teams, Slack) allow employees to communicate unhindered while making these communications visible. This allows for capturing tacit knowledge within these communications; this form of knowledge is generally hard to codify and can be a source of competitive edge. With respect to knowledge ambidexterity, ESM use can capture tacit knowledge aspects originating from inside and outside the organization, which fosters the development of a competitive advantage and, thus, supports its positive effect on organizational performance. This paper contributes to IT-enabled ambidexterity research in two aspects: (1) it sheds light on knowledge ambidexterity and, thereby, addresses a major practical challenge for knowledge-intensive organizations, and (2) it elaborates on the effects that ESM use can have on the relationship between knowledge ambidexterity and organizational performance.
This work-in-progress paper offers a better understanding of the phenomenon of ambidexterity in a knowledge context, while providing insights on the facilitating role of ESM. Our research serves as a foundation for future empirical examinations of the concept of knowledge ambidexterity.
Digital twins: a meta-review on their conceptualization, application, and reference architecture
(2022)
The concept of digital twins (DTs) is receiving increasing attention in research and management practice. However, various facets around the concept are blurry, including conceptualization, application areas, and reference architectures for DTs. A review of preliminary results regarding the emerging research output on DTs is required to promote further research and implementation in organizations. To do so, this paper asks four research questions: (1) How is the concept of DTs defined? (2) Which application areas are relevant for the implementation of DTs? (3) How is a reference architecture for DTs conceptualized? and (4) Which directions are relevant for further research on DTs? With regard to research methods, we conduct a meta-review of 14 systematic literature reviews on DTs. The results yield important insights for the current state of conceptualization, application areas, reference architecture, and future research directions on DTs.
Literature reviews are essential for any scientific work, both as part of a dissertation and as stand-alone work. Scientists benefit from the fact that more and more literature is available in electronic form, and finding and accessing relevant literature has become easier through scientific databases. However, the traditional literature review method is characterized by a highly manual process, while technologies and methods in big data, machine learning, and text mining have advanced. Especially in areas where research streams are rapidly evolving and topics are becoming more comprehensive, complex, and heterogeneous, it is challenging to provide a holistic overview and identify research gaps manually. Therefore, we have developed a framework that supports the traditional approach of conducting a literature review using machine learning and text mining methods. The framework is particularly suitable in cases where a large amount of literature is available and a holistic understanding of the research area is needed. The framework consists of several steps in which the critical mind of the scientist is supported by machine learning. The unstructured text data is transformed into a structured form through data preparation realized with text mining, making it applicable for various machine learning techniques. A concrete example in the field of smart cities makes the framework tangible.
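The data preparation step described in the abstract above (turning unstructured text into a structured, weighted form suitable for machine learning) can be sketched in a few lines. This is a minimal, self-contained TF-IDF illustration, not the framework's actual implementation; the toy documents and helper names are invented for the example:

```python
import math
from collections import Counter

def tokenize(text):
    # Crude tokenizer: lowercase, strip punctuation, drop short words
    return [w.lower().strip(".,;:") for w in text.split() if len(w) > 3]

def tfidf(docs):
    """Weight each term per document: frequent locally, rare globally."""
    tokenized = [tokenize(d) for d in docs]
    n = len(docs)
    df = Counter()                       # document frequency per term
    for toks in tokenized:
        df.update(set(toks))
    weights = []
    for toks in tokenized:
        tf = Counter(toks)
        total = len(toks)
        weights.append({t: (c / total) * math.log(n / df[t])
                        for t, c in tf.items()})
    return weights

docs = [
    "smart city data governance and urban platforms",
    "data governance frameworks for enterprises",
    "urban mobility and smart city sensors",
]
w = tfidf(docs)
top = max(w[0], key=w[0].get)  # most distinctive term of the first document
```

On these toy documents the highest-weighted term of the first document is the one unique to it ("platforms"); on a real corpus the same structured representation would feed clustering or topic models in later steps of such a framework.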
This research briefing describes the organizational capability of scaling at scale, which we define as enabling multiple digital innovation initiatives to realize bottom-line value from their innovation by leveraging shared resources. We illustrate this concept with a case study from global multi-energy company Repsol, which implemented scaling at scale to cultivate a portfolio of more than 450 initiatives and helped over seventy percent of initiatives to reach the scale-up stage. As a result, over five years Repsol realized €800 million of bottom-line value from digital innovations.
To generate greater value faster from digital innovation, many companies are increasing how much they learn from their own innovation efforts. However, in many companies, these changes are limited to one stakeholder group: innovation teams. Two other stakeholder groups, senior executives and experts from corporate functions, also need to learn from digital innovation initiatives. We have defined three learning imperatives that address a company’s needs to learn continually about building (1) a successful innovation, (2) a portfolio of initiatives that realizes strategic objectives faster, and (3) shared resources that propel multiple initiatives. All three imperatives involve collecting data regularly from digital innovation initiatives. In this research briefing we outline the three learning imperatives and provide examples of how companies are pursuing them to achieve strategic objectives more effectively and efficiently.
The euphoria around microservices has decreased over the years, but the trend of modernizing legacy systems to this novel architectural style is unbroken to date. A variety of approaches have been proposed in academia and industry, aiming to structure and automate the often long-lasting and cost-intensive migration journey. However, our research shows that there is still a need for more systematic guidance. While grey literature is dominant for knowledge exchange among practitioners, academia has contributed a significant body of knowledge as well, catching up on its initial neglect. A vast number of studies on the topic yielded novel techniques, often backed by industry evaluations. However, practitioners hardly leverage these resources. In this paper, we report on our efforts to design an architecture-centric methodology for migrating to microservices. As its main contribution, a framework provides guidance for architects during the three phases of a migration. We refer to methods, techniques, and approaches based on a variety of scientific studies that have not been made available in a similarly comprehensible manner before. Through an accompanying tool to be developed, architects will be in a position to systematically plan their migration, make better informed decisions, and use the most appropriate techniques and tools to transition their systems to microservices.
Especially when the potential of technical and organizational measures for ergonomic workplace design is limited, exoskeletons can be considered as innovative ergonomic aids to reduce the physical workload of workers. Recent scientific findings from ergonomic analyses with and without exoskeletons indicate that strain reduction can be achieved, particularly at workplaces with lifting, holding, and carrying processes. Currently, a work system design method is under development that incorporates criteria and characteristics for the design of work systems in which a human worker is supported by an exoskeleton. Based on the properties of common passive and active exoskeletons, factors influencing the human on which an exoskeleton can have a positive or negative effect (e.g. additional weight) were derived. The method will be validated by the conceptualization and setup of several work system demonstrators at Werk150, the factory of ESB Business School on the campus of Reutlingen University, to prove the positive ergonomic effect on humans and to support the process of choosing a suitable exoskeleton. The developed method and demonstrators enable users to experience the positive ergonomic effects of exoskeletal support in lifting, holding, and carrying processes in logistics and production. The new work system design method will help employees pursue their professional activity longer without substantial injuries and be deployed more flexibly at different work stations. It also opens up new work concepts, strategies, and scenarios to reduce the risk of occupational accidents and to promote the compatibility of work for employees. A training module is being developed and evaluated with participants from industry and master's students to build up competence.
The early integration of insights gained through intelligence and data analysis is becoming increasingly important for developing new products, leading to a fundamentally different conception of product creation, development, and engineering processes that exploits the advantages a digital twin entails. A novel stage-gate process is introduced and holistically anchored in learning factories, covering idea generation and idea screening at an early stage, beta testing of first prototypes, technical implementation in real production scenarios, business analysis, market evaluation, pricing, service models, as well as innovative social media portals. Corresponding product modelling in the sense of sustainability, circular economy, and data analytics forecasts the product on the market both before and after market launch, with near-real-time interlinking of data interpretation. The digital twin represents the link between the digital model and the digital shadow. Additionally, connecting the digital twin with the product provides constantly updated operating status and process data as well as a mapping of technical properties and real-world behavior. A future networked product with embedded information technology, able to initiate and carry out its own further development, can interact with people and environments and is thus relevant to the way of life of future generations. In today's development work for this new product creation approach, "Werk150" is both the object of the development itself and the validation environment. In a next step, new learning modules and scenarios for training at master's level will be derived from these findings.
Over the last decades, a tremendous shift toward using information technology in almost every daily routine of our lives can be perceived in our society, entailing an incredible growth of data collected day by day in Web, IoT, and AI applications.
At the same time, magneto-mechanical HDDs are being replaced by semiconductor storage such as SSDs, equipped with modern Non-Volatile Memories, like Flash, which yield significantly lower access latencies and higher levels of parallelism. Likewise, the execution speed of processing units has increased considerably, as nowadays server architectures comprise up to multiple hundreds of independently working CPU cores along with a variety of specialized computing co-processors such as GPUs or FPGAs.
However, the burden of moving the continuously growing data to the best fitting processing unit is inherently linked to today’s computer architecture that is based on the data-to-code paradigm. In the light of Amdahl's Law, this leads to the conclusion that even with today's powerful processing units, the speedup of systems is limited since the fraction of parallel work is largely I/O-bound.
Therefore, throughout this cumulative dissertation, we investigate the paradigm shift toward code-to-data, formally known as Near-Data Processing (NDP), which relieves the contention on the I/O bus by offloading processing to intelligent computational storage devices, where the data is originally located.
Firstly, we identified Native Storage Management as the essential foundation for NDP due to its direct control of physical storage management within the database. Upon this, the interface is extended to propagate address mapping information and to invoke NDP functionality on the storage device. As the former can become very large, we introduce Physical Page Pointers as one novel NDP abstraction for self-contained immutable database objects.
Secondly, the on-device navigation and interpretation of data are elaborated. To this end, we introduce cross-layer Parsers and Accessors as another NDP abstraction that can be executed on the heterogeneous processing capabilities of modern computational storage devices. Thereby, the compute placement and resource configuration per NDP request are identified as a major performance criterion. Our experimental evaluation shows an improvement in execution duration of 1.4x to 2.7x compared to traditional systems. Moreover, we propose a framework for the automatic generation of Parsers and Accessors on FPGAs to ease their application in NDP.
Thirdly, we investigate the interplay of NDP and modern workload characteristics like HTAP. To this end, we present different offloading models and focus on an intervention-free execution. By propagating the Shared State with the latest modifications of the database to the computational storage device, it is able to process data with transactional guarantees. Thus, we extend the design space of HTAP with NDP by providing a solution that optimizes for performance isolation, data freshness, and the reduction of data transfers. In contrast to traditional systems, we experience no significant drop in performance when an OLAP query is invoked, but rather a steady throughput that is 30% faster.
Lastly, in-situ result-set management and consumption as well as NDP pipelines are proposed to achieve flexibility in processing data on heterogeneous hardware. As these produce final and intermediary results, we further investigate their management and identify that on-device materialization comes at a low cost but enables novel consumption modes and reuse semantics. Thereby, we achieve significant performance improvements of up to 400x by reusing once-materialized results multiple times.
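The Amdahl's-Law argument made earlier in this abstract can be illustrated numerically. A minimal sketch with invented fractions and acceleration factors (illustrative numbers, not measurements from the dissertation):

```python
def amdahl_speedup(parallel_fraction, s):
    """Overall speedup when only `parallel_fraction` of the work
    is accelerated by factor `s` (Amdahl's Law)."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / s)

# If 60% of a query's runtime is I/O-bound data movement that faster
# compute cannot touch, even near-infinite acceleration caps out:
compute_bound = amdahl_speedup(0.4, s=1e9)   # ~1.67x
# If offloading to the storage device also shrinks the serial I/O part,
# a far larger share of the work becomes accelerable:
ndp_bound = amdahl_speedup(0.9, s=10)        # ~5.26x
```

The point the dissertation makes follows directly: as long as the serial I/O fraction stays large, faster processing units alone barely help, which motivates moving the code to the data instead.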
The respiratory rate is a vital sign that indicates respiratory illness. To obtain it, the mechanical oscillations of the patient's body arising from chest movements must be analyzed. An unsuitable holder on which the sensor is mounted, or an unsuitable sensor position, are among the external factors that should be minimized during signal registration. This paper considers a non-invasive device placed under the bed mattress to estimate the respiratory rate. The aim of the work is the development of an accelerometer sensor holder for this system. Normal and deep breathing signals were analyzed, corresponding to the relaxed state and to taking deep breaths. The evaluation criterion for the holder's model is its influence on the patient's respiratory signal amplitude in each state. As a result, we offer a non-invasive system for respiratory rate detection, including the mechanical component that provides the most accurate values of the respiratory rate.
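The abstract does not spell out the signal processing behind the rate estimate; a common approach for such under-mattress accelerometer signals is to pick the dominant frequency within the plausible breathing band. The following sketch (pure-Python DFT on a synthetic signal; all parameters and names are assumptions, not the authors' method) illustrates the idea:

```python
import math

def respiratory_rate_bpm(signal, fs):
    """Estimate respiratory rate (breaths/min) as the dominant frequency
    of the signal, searched only in the plausible breathing band.
    Pure-Python DFT for illustration; a real system would first filter
    and window the raw accelerometer data."""
    n = len(signal)
    mean = sum(signal) / n
    centered = [x - mean for x in signal]   # remove DC offset
    best_k, best_power = 0, 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if not 0.1 <= f <= 0.7:             # ~6 to 42 breaths/min
            continue
        re = sum(x * math.cos(2 * math.pi * k * i / n)
                 for i, x in enumerate(centered))
        im = sum(x * math.sin(2 * math.pi * k * i / n)
                 for i, x in enumerate(centered))
        power = re * re + im * im
        if power > best_power:
            best_k, best_power = k, power
    return best_k * fs / n * 60

# Synthetic chest-movement signal: 0.25 Hz breathing (15 breaths/min),
# sampled at 10 Hz for 60 s
fs = 10.0
sig = [math.sin(2 * math.pi * 0.25 * i / fs) for i in range(600)]
rate = respiratory_rate_bpm(sig, fs)  # -> 15.0
```

The holder design studied in the paper matters precisely because it affects the amplitude of the oscillation this kind of analysis relies on.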
Digital assistants like Alexa, Google Assistant or Siri have seen large adoption over the past years. Using artificial intelligence (AI) technologies, they provide a vocal interface to physical devices as well as to digital services and have spurred an entirely new ecosystem. This comprises the big tech companies themselves, but also a strongly growing community of developers that make these functionalities available via digital platforms. At present, only little research is available to understand the structure and the value creation logic of these AI-based assistant platforms and their ecosystem. This research adopts ecosystem intelligence to shed light on their structure and dynamics. It combines existing data collection methods with an automated approach that proves useful in deriving a network-based conceptual model of Amazon’s Alexa assistant platform and ecosystem. It shows that skills are a key unit of modularity in this ecosystem, which is linked to other elements such as service, data, and money flows. It also suggests that the topology of the Alexa ecosystem may be described using the criteria reflexivity, symmetry, variance, strength, and centrality of the skill coactivations. Finally, it identifies three ways to create and capture value on AI-based assistant platforms. Surprisingly, only a few skills use a transactional business model by selling services and goods; many skills are complementary and provide information, configuration, and control services for other skill providers’ products and services. These findings provide new insights into the highly relevant ecosystems of AI-based assistant platforms, which might serve enterprises in developing their strategies in these ecosystems. They might also pave the way to a faster, data-driven approach for ecosystem intelligence.
Verification of an active time constant tuning technique for continuous-time delta-sigma modulators
(2022)
In this work we present a technique to compensate the effects of R-C / gm-C time-constant (TC) errors due to process variation in continuous-time delta-sigma modulators. Local TC error compensation factors are shifted around in the modulator loop to positions where they can be implemented efficiently with finely tunable circuit structures, such as current-steering digital-to-analog converters (DACs). We apply our technique to a third-order, single-bit, low-pass continuous-time delta-sigma modulator in cascaded integrator feedback structure, implemented in a 0.35-μm CMOS process. A tuning scheme for the reference currents of the feedback DACs is derived as a function of the individual TC errors and verified by circuit simulations. We confirm the tuning technique experimentally on the fabricated circuit over a TC parameter variation range of ±20%. Stable modulator operation is achieved for all parameter sets. The measured performances satisfy the expectations from our theoretical calculations and circuit-level simulations.
Glioblastoma WHO IV belongs to a group of brain tumors that are still incurable. A promising treatment approach applies photodynamic therapy (PDT) with hypericin as a photosensitizer. To generate a comprehensive understanding of the photosensitizer-tumor interactions, the first part of our study focuses on investigating the distribution and penetration behavior of hypericin in glioma cell spheroids by fluorescence microscopy. In the second part, fluorescence lifetime imaging microscopy (FLIM) was used to correlate fluorescence lifetime (FLT) changes of hypericin to environmental effects inside the spheroids. In this context, 3D tumor spheroids are an excellent model system since they consider 3D cell–cell interactions and their extracellular matrix is similar to tumors in vivo. Our analytical approach considers hypericin as a probe molecule for FLIM and as a photosensitizer for PDT at the same time, making it possible to directly draw conclusions about the state and location of the drug in a biological system. Knowing both the state and the location of hypericin makes a fundamental understanding of the impact of hypericin PDT in brain tumors possible. Following different incubation conditions, the hypericin distribution in peripheral and central cryosections of the spheroids was analyzed. Both fluorescence microscopy and FLIM revealed a hypericin gradient towards the spheroid core for short incubation periods or small concentrations. On the other hand, a homogeneous hypericin distribution is observed for long incubation times and high concentrations. The observed FLT change is especially crucial for the PDT efficiency, since the triplet yield, and hence the O2 activation, is directly proportional to the FLT. Based on the FLT increase inside spheroids, an incubation time of 30 min is required to achieve the most suitable conditions for an effective PDT.
Geometry of music perception
(2022)
Prevalent neuroscientific theories are combined with acoustic observations from various studies to create a consistent geometric model for music perception in order to rationalize, explain and predict psycho-acoustic phenomena. The space of all chords is shown to be a Whitney stratified space. Each stratum is a Riemannian manifold, which naturally yields a geodesic distance across strata. The resulting metric is compatible with voice leading and satisfies the triangle inequality. The geometric model allows for rigorous studies of psychoacoustic quantities such as roughness and harmonicity as height functions. In order to show how to use the geometric framework in psychoacoustic studies, concepts for the perception of chord resolutions are introduced and analyzed.
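To make the idea of a voice-leading-compatible distance concrete: a standard construction in the voice-leading literature (not necessarily the exact metric of this paper) takes the minimal total semitone displacement between two chords, minimised over all voice assignments. A small sketch:

```python
from itertools import permutations

def pc_distance(a, b):
    """Shortest distance between two pitch classes on the 12-tone circle."""
    d = abs(a - b) % 12
    return min(d, 12 - d)

def voice_leading_distance(chord_a, chord_b):
    """Minimal total semitone displacement moving chord_a to chord_b,
    minimised over all voice assignments (equal-cardinality chords)."""
    assert len(chord_a) == len(chord_b)
    return min(
        sum(pc_distance(x, y) for x, y in zip(chord_a, perm))
        for perm in permutations(chord_b)
    )

C_major = (0, 4, 7)   # C E G
A_minor = (9, 0, 4)   # A C E
dist = voice_leading_distance(C_major, A_minor)  # -> 2 (only G moves, to A)
```

This distance is symmetric and satisfies the triangle inequality, which is the property the abstract highlights for its geodesic metric.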
Moral change and the purchase-sales-relationship: critical analysis of German and Swiss companies
(2022)
This study examines the awareness and causes of moral change from the economic perspective in Germany and Switzerland. Based on an analysis of value research to date and interviews with experts in B2B sales, the manifestations of moral change are critically examined and recommendations for action are derived on an employee-specific and company-wide level.
A single-phase fixed-frequency operated power factor correction circuit with reduced switching losses is proposed. The circuit uses the combination of a boost converter with an added clamp-switch, a pulse wave shaping circuit, and a standard control IC to discharge the transistor's output capacitance prior to its turn-on. In this way, a very low-complexity control circuit implementation to reduce switching losses or even achieve complete zero-voltage switching without additional sensors is possible. Moreover, this operation method is achieved at a constant switching frequency, possibly simplifying the design of the EMI filter and the converter's inductor. Experimental test results for a 100 W prototype converter are presented to validate the feasibility of the proposed operating method and corresponding circuit structure.
In this paper we present the results of a workshop held at CitSci2022 on the topic: Co-creation in citizen science (CS) for the development of climate adaptation measures: Which success factors promote, and which barriers hinder, a fruitful collaboration and co-creation process between scientists and volunteers, considering social, motivational, technical/technological, and legal factors? We substantiated the factors identified in the workshop with scientific literature. Our findings suggest that clear communication of the project goals and of how citizen scientists can contribute is important. In addition, citizen scientists have to feel included and see that their contribution makes a difference. To achieve this, it is critical to present the results to the citizen scientists. The relationship between the scientists and the citizen scientists is also essential to keep the citizen scientists engaged. Notification of meetings and events needs to be given well in advance, and events should be scheduled during the attendees' leisure time. The citizen scientists should be especially supported in technical questions. As a result, they feel appreciated and remain part of the project. Regarding legal factors, the current General Data Protection Regulation was considered important by the participants of the workshop. In further research we will address the individual points, first of all improving our communication with the citizen scientists about the project goals and how they can contribute. In addition, we should share the achieved results more effectively.
For half a decade, there has been increasing interest in Robotic Process Automation (RPA) among business firms. Academic literature, however, initially paid little attention to RPA before adopting the topic to a larger extent. The aim of this study is to review and structure the latest state of scholarly research on RPA. This chapter is based on a systematic literature review that is used as a basis to develop a conceptual framework to structure the field. Our study shows that some areas of RPA have been extensively examined by many authors, e.g. the potential benefits of RPA. Other categories, such as empirical studies on the adoption of RPA or organisational readiness models, have remained research gaps.
On the influence of ground and substrate on the radiation characteristics of planar spiral antennas
(2022)
The unidirectional radiation of spiral antennas mounted on a substrate requires the presence of a ground plane. In this work, we successively illustrate the impact of the dielectric material and the ground plane on the key metrics of a planar equiangular spiral antenna (PESA). For this purpose, a PESA mounted on several substrates with different dielectric properties and thicknesses is modeled and simulated. We introduce the tertiary current that flows on the spiral arms when the antenna is backed by a ground plane.
Brand image is an important informational basis for strategic decisions in sports marketing, since it reflects the stakeholders' perspective on the brand. Analyzing brand image, however, is methodologically complex, which is why the use of artificial neural networks for this purpose is examined in more detail: this artificial intelligence technique enables the modelling of multilayered and non-linear cause-effect relationships. The conceptual approach is illustrated with the empirical practical example of the sporting goods manufacturer adidas, by modelling a multilayer artificial neural network between the ratings of specific brand attributes and the overall brand. By analyzing the connection weights of the network, the influence of the different brand attribute variables is measured, from which concrete implications for sports marketing practice are derived.