Additive manufacturing has developed considerably in recent years; process technology, machines, and materials have all been optimized. However, automated solutions covering the entire process chain are still missing for industrial use, especially for larger quantities in flexible production. This contribution presents tools and technologies for cleaning internal structural elements.
The functionality of existing cyber-physical production systems generally focuses on mapping technological specifications derived from production requirements. Consequently, such systems base their conception on a structurally mechanistic paradigm. Insofar as these approaches have considered humans, their conception is likewise based on the structurally identical paradigm. With the fundamental reorientation towards explicitly human-centered approaches, it is becoming increasingly apparent that essential aspects of the dimension "human" remain unconsidered by the previous paradigm. To overcome such limitations, mapping the "social" dimension requires a structurally different approach. In this paper, an anthropocentric approach is developed based on possible conceptions of the human being, enabling a structural integration of the human being in an extended dimension. Through the model, extending concepts for better integration of the human being in the sense of human-centered approaches, as envisioned in the Industrie 5.0 conception, becomes possible.
This article explores the question of how sustainability and labour law are interrelated. The modern world of work is characterised by the growing social and environmental responsibility of companies. Especially in the post-COVID era, sustainability plays an increasingly important role in the corporate context, which is noticeable in the so-called ‘war for talent’. Achieving personal career goals is no longer enough for employees today. Corporate values, and in particular the so-called ESG criteria (Environment, Social, Governance), are thus becoming increasingly important in the employment relationship and in corporate reporting requirements. In terms of social sustainability, labour law instruments can, for example, promote the creation of a discrimination-free working environment, the introduction of flexible working time models, or the protection of whistleblowers. From an ecological perspective, labour regulations are also suitable for implementing ‘green mobility’ and other measures to reduce companies’ ecological footprints. Working from home, which experienced a huge boom during the COVID-19 pandemic, is also sustainable, especially from an ecological point of view. Appropriate consideration of these sustainable work tools in future corporate social responsibility (CSR) strategies not only creates a competitive advantage but can also be beneficial in recruitment.
The characterization and description of the complex interactions at the cutting zone of a machining center influence the quality of the manufactured parts. In this contribution, the measurement and description of the natural frequencies of different machining centers, as a function of the tools and machining strategies used, is derived with regard to the effects on stability. For this purpose, the frame-side resonance frequencies are analyzed. The aim of the investigations is a description of the dynamic properties for optimizing NC programming.
Some ideas, experiences, and realities for students and citizens in Reutlingen. Compiled by 50 students in 2020/21 and from contributions by 40 institutions and companies in and around Reutlingen.
An attempt to explain, very concretely and close to the facts, what leads to more sustainability in Reutlingen. Inevitably, this also means pointing out weaknesses.
If students and citizens are consciously prepared to pursue more sustainability in the coming years, the ideas and realities in this project will put them on a good path.
Imagine a world in which the search for tomorrow's trends of (software) products is not subject to a long and laborious data search but is possible with a single mouse click. Through the use of artificial intelligence (AI), this reality is made possible and is to be further advanced through research. The study therefore aims to provide an initial overview of this young research field. Based on research, expert interviews, and company and student surveys, current applications of AI in the innovation process (defined as Smart Innovation) are outlined, existing challenges that slow down further development are discussed in more detail, and future application possibilities are presented. Finally, a recommendation for action is made for business, politics, and science to help overcome the current obstacles together and thus drive the future of Smart Innovation.
The objective of the project presented here is to develop and optimize an intelligent control algorithm for biogas combined heat and power (CHP) plants. This is followed by a test phase at a real biogas plant, for which the algorithm is implemented in the plant control system. To assess the extent to which the control algorithm can contribute to relieving power grids, the experiments consider, in addition to the electrical demand of the farm where the plant is located, the residual load of the neighboring power grid. The latter is based on data from the nearest substation, scaled so that it represents a settlement that can be partly supplied by the plant's biogas CHP unit. The control algorithm is integrated into the plant control system via a communication structure with a database as the central interface. A first series of experiments, in which the biogas CHP unit is operated according to the schedules of the intelligent control algorithm, shows promising results. Throughout the entire test series, the control algorithm reliably computes new schedules, which are for the most part implemented very well by the CHP unit. In addition, it can be demonstrated that the use of the algorithm relieves the upstream power grid.
Demand-based control of decentralized thermal energy systems, such as combined heat and power (CHP) plants and heat pumps, can make a decisive contribution to covering or reducing the residual load and thus lower conventional residual power generation and the associated greenhouse gas emissions. To this end, a forecast-based control algorithm was developed at Reutlingen University over several years of research. This contribution presents that control algorithm along with its practical implementation variants: a version that runs purely locally on a programmable logic controller (PLC) and a web service application for the parallel operation of several plants from a central server. Tests on the CHP test bench at Reutlingen University confirm the reliable operation of the algorithm in the different implementation variants. At the same time, the advantage of demand-based control over heat-led operation, which is the standard particularly for micro-CHP units, is demonstrated in the form of an increase in electricity self-sufficiency of up to 27%. Beyond demand-based control, the developed algorithm also serves a further application area: predictable CHP operation, as required, for example, in the form of daily feed-in forecasts under Redispatch 2.0. The CHP operation can be predicted in two ways: as a first option, heat-led operation can be modeled and forecast directly by the algorithm; alternatively, the plant can be controlled in a demand-based manner, in which case the computed optimal schedule simultaneously serves as the operating forecast of the CHP unit. The developed control algorithm is thus able to contribute to the success of the energy transition in several ways.
Values Management System
(2022)
The ValuesManagementSystem (VMS) is a management standard to “provide a sustainable safeguard of a firm and its development, in all dimensions (legal, economic, ecological, social)” (VMSZfW, p. 4). It includes a framework for values-driven governance through self-commitment and self-binding mechanisms. Values promote a sense of identity and give organizations guidance in decision-making. This is especially important in decision-making processes where topics are not clearly governed by laws and regulations.
The VMSZfW must be embedded in the specific business strategy, structure, and culture of an organization. The following four steps describe the implementation of the ValuesManagementSystemZfW: (i) codify the core values of an organization, for instance with a “mission, vision and values statement” or a Code of Ethics; (ii) implement guidelines such as a Code of Conduct and specific policies and procedures; (iii) systematize these by establishing management systems such as compliance and CSR management systems; and (iv) finally, organize and establish structures to ensure the strategic direction, operational implementation, and review of these processes. Top management shows that values management is taken seriously through its self-commitment to the core values of the company.
The Principles for Responsible Investment (PRI) is “the world’s leading proponent of responsible investment” (PRI 2021a). With the development of six Principles for Responsible Investment, the PRI supports its international network of investor signatories in incorporating environmental, social, and governance (ESG) factors into their investment and ownership decisions. The goal of the PRI is to develop a more sustainable global financial system by encouraging “investors to use responsible investment to enhance returns and better manage risks” (PRI 2021a). This independent financial initiative is supported by the United Nations and linked to the United Nations Environment Programme Finance Initiative (UNEP FI 2021) and the United Nations Global Compact (UN Global Compact 2021).
The United Nations (UN) Global Compact is a call to companies to align their strategies and operations with ten universal principles in the areas of human rights, labor, environment, and anti-corruption, and to take actions that advance societal goals (UN Global Compact 2017, p. 3). The UN Global Compact’s vision is “to mobilize a global movement of sustainable companies and stakeholders to create the world we want” (UN Global Compact 2021a). It is a global network with a local presence all around the world.
Bioenergy production is a new and promising industry in Ecuador. However, a confusing variety of laws, spread among different regulating institutions, governs the agricultural sector. This dispersion makes it difficult for farmers and businesses to understand the applicable rights, duties, regulations, and agricultural policies. Moreover, this rather young industry lacks important experience. In the first section of this work, the existing Ecuadorian legislation on bioenergy is presented and analyzed. Then, a brief but thorough analysis and comparison is carried out of experiences not only in developed countries but also in countries with similar cultural frameworks and comparable climatic conditions. The results are summarized as specific recommendations that have been handed over by academia to the National Agricultural Chamber of Ecuador for the proposal of a Unified Agricultural Code, established in the Ecuadorian legal hierarchy as an Organic Law.
This article explores the determinants of people’s growth prospects in survey data as well as the impact of the European recovery fund on future growth. The focus is on the aftermath of the Corona pandemic, which places a natural limit on the sample size. We use Eurobarometer survey data and macroeconomic variables, such as GDP, unemployment, public deficit, inflation, bond yields, and fiscal spending data. We estimate a variety of panel regression models and develop a new simulation-regression methodology due to the limited sample size. We find that the major determinant of people’s growth prospects is domestic GDP per capita, while European fiscal aid does not significantly matter. In addition, the simulation-regression method yields novel scientific insights, significant outcomes, and a policy conclusion alike.
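To make the estimation strategy concrete, here is a minimal sketch of a country fixed-effects panel regression with clustered standard errors, in the spirit of the models described above; the data, variable names, and panel dimensions are illustrative assumptions, not the study's dataset.

```python
# Hedged sketch: fixed-effects panel regression on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 27 * 8  # 27 countries x 8 survey waves (assumed panel dimensions)
df = pd.DataFrame({
    "country": np.repeat([f"c{i}" for i in range(27)], 8),
    "gdp_pc": rng.normal(30, 5, n),       # GDP per capita (placeholder units)
    "unemp": rng.normal(7, 2, n),         # unemployment rate (placeholder)
    "fiscal_aid": rng.normal(1, 0.3, n),  # European fiscal aid (placeholder)
})
df["growth_prospect"] = (0.05 * df["gdp_pc"] - 0.02 * df["unemp"]
                         + rng.normal(0, 0.5, n))

# C(country) adds country fixed effects; errors are clustered by country.
model = smf.ols(
    "growth_prospect ~ gdp_pc + unemp + fiscal_aid + C(country)", data=df
).fit(cov_type="cluster", cov_kwds={"groups": df["country"]})
print(model.params[["gdp_pc", "unemp", "fiscal_aid"]])
```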
The vast majority of state-of-the-art integrated circuits are mixed-signal chips. While the design of the digital parts of the ICs is highly automated, the design of the analog circuitry is largely done manually; it is very time-consuming and prone to error. Among the reasons generally listed for this is the attitude of the analog designer: many analog designers are convinced that human experience and intuition are needed for good analog design, which is why they distrust automated synthesis tools. This observation is quite correct, but it is only a symptom of the real problem. This paper shows that the phenomenon is caused by very concrete technical (and thus very rational) issues. These issues lie in the mode of operation of the typical optimization processes employed for the synthesis tasks. I will show that the dilemma that arises in analog design with these optimizers is the root cause of the low level of automation in analog design. The paper concludes with a review of proposals for automating analog design.
Recognition of sleep and wake states is one of the relevant parts of sleep analysis. Performing this measurement in a contactless way increases comfort for the users. We present an approach that evaluates only movement and respiratory signals, which can be measured non-obtrusively, to achieve recognition. The algorithm is based on multinomial logistic regression and analyzes features extracted from the signals mentioned above. These features were identified and developed after performing fundamental research on the characteristics of vital signals during sleep. The achieved accuracy of 87% with a Cohen's kappa of 0.40 demonstrates the appropriateness of the chosen method and encourages continuing research on this topic.
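As an illustration of the named method, the following sketch fits a multinomial logistic regression on placeholder per-epoch features and scores it with accuracy and Cohen's kappa; the feature matrix and labels are synthetic stand-ins, not the study's data.

```python
# Hedged sketch: sleep/wake recognition via multinomial logistic regression.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, cohen_kappa_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 6))  # per-epoch movement/respiration features (assumed)
y = (X[:, 0] + rng.normal(0, 1, 1000) > 0).astype(int)  # 0 = wake, 1 = sleep

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)  # lbfgs fits a multinomial model
pred = clf.predict(X_te)
print("accuracy:", accuracy_score(y_te, pred))
print("Cohen's kappa:", cohen_kappa_score(y_te, pred))
```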
Motivation: The aim of this project is the automatic classification of total hip endoprosthesis (THEP) components in 2D X-ray images. Revision surgeries of total hip arthroplasty (THA) are common procedures in orthopedics and trauma surgery. Currently, around 400,000 procedures per year are performed in the United States (US) alone. To achieve the best possible result, preoperative planning is crucial, especially if parts of the current THEP system are to be retained.
Methods: First, a ground truth based on 76 X-ray images was created. We used an image processing pipeline consisting of a segmentation step performed by a convolutional neural network and a classification step performed by a support vector machine (SVM). In total, 11 classes (5 cups and 6 shafts) were to be classified.
Results: The ground truth generated was of good quality even though the initial segmentation was performed by technicians. The best segmentation results were achieved using a U-net architecture. For classification, SVM architectures performed much better than additional neural networks.
Conclusions: The overall image processing pipeline performed well, but the ground truth needs to be extended to include a broader variability of implant types and more examples per training class.
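To illustrate the two-stage pipeline (CNN segmentation followed by SVM classification), the sketch below classifies implant components from simple shape features of binary segmentation masks; the masks, the feature choice, and the class count are illustrative assumptions, not the study's implementation.

```python
# Hedged sketch: SVM classification on features of segmented implant masks.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def region_features(mask: np.ndarray) -> np.ndarray:
    """Illustrative shape features from a binary implant mask."""
    ys, xs = np.nonzero(mask)
    area = len(xs)
    h = np.ptp(ys) + 1 if area else 1  # bounding-box height
    w = np.ptp(xs) + 1 if area else 1  # bounding-box width
    return np.array([area, h / w, area / (h * w)])

rng = np.random.default_rng(0)
masks = rng.integers(0, 2, size=(60, 64, 64))  # stand-ins for CNN segmentations
labels = rng.integers(0, 11, size=60)          # 11 component classes, as above

X = np.stack([region_features(m) for m in masks])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X, labels)
print(clf.predict(X[:5]))
```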
Informatics Inside has been a fixture of the academic year at the Faculty of Informatics at Reutlingen University for over 13 years. The conference is organized independently by students of the master's program Human-Centered Computing and forms an important part of their scientific education. The students chose their topics themselves; not infrequently, these are questions that have accompanied them throughout their studies. They prepare them in the format of a scientific paper, in which content, completeness, and comprehensibility are decisive factors. The results of this in-depth engagement with relevant application topics of computer science can be found in these proceedings. The application domains range from medicine and business to the media, covering current questions of the human-centered use of artificial intelligence, software engineering, data analysis and communication, as well as digital transformation. It becomes clear that the benefit of IT solutions for people is at the heart of the event. The event's motto, "IT's Future", is its program and highlights the relevance of computer science for all areas of life as well as the future capacity for innovation and competitiveness of industry and research.
The paradigmatic shift of production systems towards Cyber-Physical Production Systems (CPPSs) requires the development of flexible and decentralized approaches. In this way, such systems enable manufacturers to respond quickly and accurately to changing requirements. However, domain-specific applications require the use of suitable conceptualizations, and when various conceptualizations are used, the issue at hand is the interoperability of the different ontologies. Achieving flexibility and adaptability in CPPSs therefore requires overcoming interoperability issues within them. This paper presents an approach to increase flexibility and adaptability in CPPSs while addressing the interoperability issue. In this work, OWL ontologies conceptualize domain knowledge, and the Intelligent Manufacturing Knowledge Ontology Repository (IMKOR) connects the domain knowledge held in the different ontologies. It was tested whether adaptations in one ontology within the IMKOR provide knowledge to the whole IMKOR. The tests showed positive results: the repository makes the knowledge available to the whole CPPS, and an increase in flexibility and adaptability was observed.
For some years now, the global economic system, its markets, and its organizations have been undergoing a dynamic and complex process of change; in this context, the term globalization is one of the most frequently cited buzzwords. The globalization of markets is reflected in the expansion, intensification, and cross-border integration of economic transactions on a previously unknown scale. Hallmarks of economic globalization include the international financial markets and the creation of worldwide value chains through increasing cooperation among transnationally operating corporations, the so-called global players. The interdependencies within the financial industry and the emergence of global value chains raise a number of morally sensitive questions regarding the responsibility of companies in the globalized economy. A glance at the headlines at the beginning of the millennium gives an idea of the multitude of areas of conflict: collapsing financial constructs of investment banks, bribes paid to secure contracts, unlawful screening of employee data, and scandals involving child or slave labor in brickworks and coal mines are only a small sample of this reporting. In public perception, the reputation and credibility of entire industries are at stake. Trust built up over many years is thus destroyed within a very short time.
In Germany, more than 90,000 sports clubs make a remarkable contribution to the common good. With a broad range of competitive, grassroots, leisure, and health sports, based mostly on volunteer work, clubs are the first port of call for sports enthusiasts. Sports clubs that can draw on a repertoire of expertise and pedagogical knowledge are a highly popular leisure option not only for children and adolescents but also for their parents and caregivers as well as adults. In 2010, for example, around 24 million people were active in the regional sports federations organized within the German Olympic Sports Confederation.
Continuous manufacturing is becoming more important in the biopharmaceutical industry. This processing strategy is favorable, as it is more efficient, flexible, and has the potential to produce higher and more consistent product quality. At the same time, it faces some challenges, especially in cell culture. As a steady state has to be maintained over a prolonged time, it is unavoidable to implement advanced process analytical technologies to control the relevant process parameters in a fast and precise manner. One such analytical technology is Raman spectroscopy, which has proven its advantages for process monitoring and control mostly in (fed-)batch cultivations. In this study, an in-line flow cell for Raman spectroscopy is included in the cell-free harvest stream of a perfusion process. Quantitative models for glucose and lactate were generated based on five cultivations originating from varying bioreactor scales. After successfully validating the glucose model (Root Mean Square Error of Prediction (RMSEP) of ∼0.2 g/L), it was employed to control an external glucose feed in a cultivation with a glucose-free perfusion medium. The generated model was successfully applied to perform process control at 4 g/L and 1.5 g/L glucose over several days, respectively, with a variability of ±0.4 g/L. The results demonstrate the high potential of Raman spectroscopy for advanced process monitoring and control of a perfusion process with a bioreactor- and scale-independent measurement method.
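The following minimal sketch shows a PLS calibration of the kind named above, evaluated via RMSEP; the spectra and reference values are synthetic placeholders, and the number of latent variables is an assumption.

```python
# Hedged sketch: PLS regression for glucose from Raman spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
spectra = rng.normal(size=(200, 1500))  # placeholder Raman spectra
glucose = rng.uniform(0, 6, size=200)   # placeholder reference values in g/L

X_tr, X_te, y_tr, y_te = train_test_split(spectra, glucose, random_state=0)
pls = PLSRegression(n_components=8).fit(X_tr, y_tr)  # 8 latent variables (assumed)
pred = pls.predict(X_te).ravel()
rmsep = np.sqrt(np.mean((pred - y_te) ** 2))  # root mean square error of prediction
print(f"RMSEP: {rmsep:.2f} g/L")
```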
In recent years, the Graph Model has become increasingly popular, especially in the application domain of social networks. The model has been semantically augmented with properties and labels attached to the graph elements. Because the model does not require a schema, it is difficult to ensure data quality for the properties and the data structure. In this paper, we propose a schema-bound Typed Graph Model with properties and labels. These enhancements improve not only data quality but also the quality of graph analysis. The power of this model comes from using hyper-nodes and hyper-edges, which make it possible to present data structures on different abstraction levels. We prove that the model is at least equivalent in expressive power to the most popular data models. Therefore, it can be used as a supermodel for model management and data integration. We illustrate by example the superiority of this model over the property graph data model of Hidders and other prevalent data models, namely the relational, object-oriented, and XML models, and RDF Schema.
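As a rough illustration of a schema-bound typed graph with hyper-edges, the sketch below encodes node types that constrain properties and a hyper-edge connecting more than two nodes; this is our own simplified encoding for intuition, not the paper's formal model.

```python
# Hedged sketch: a tiny schema-bound typed graph with a hyper-edge.
from dataclasses import dataclass, field

@dataclass
class NodeType:
    name: str
    properties: dict[str, type]  # property name -> required type (the schema)

@dataclass
class Node:
    ntype: NodeType
    props: dict[str, object] = field(default_factory=dict)

    def __post_init__(self):
        # Enforce the schema: every declared property must exist and be typed.
        for key, typ in self.ntype.properties.items():
            if not isinstance(self.props.get(key), typ):
                raise TypeError(f"property {key!r} must be {typ.__name__}")

@dataclass
class HyperEdge:
    label: str
    members: list[Node]  # a hyper-edge may connect more than two nodes

person = NodeType("Person", {"name": str})
alice, bob, carol = (Node(person, {"name": n}) for n in ("Alice", "Bob", "Carol"))
team = HyperEdge("works_together", [alice, bob, carol])
print(team.label, [n.props["name"] for n in team.members])
```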
Monodisperse porous poly(glycidyl methacrylate-co-ethylene glycol dimethacrylate) particles are widely applied in different fields, as their pore properties can be influenced and the functionalization of the epoxy group is versatile. However, guidance on adjusting the parameters that control morphology and pore properties such as pore volume, pore size, and specific surface area is scarce. In this work, the effects of the process factors monomer:porogen ratio, GMA:EDMA ratio, and composition of the porogen mixture on the response variables pore volume, pore size, and specific surface area are investigated using a face-centered central composite design. Non-linear effects of the process factors and second-order interaction effects between them were identified. Despite the complex interplay of the process factors, targeted control of the pore properties was possible. For each response, a response surface model was derived with high predictive power (all predicted R2 > 0.85). All models were tested by four external validation experiments, and their validity and predictive power were demonstrated.
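To make the modelling step concrete, the following sketch fits a quadratic response-surface model (linear, squared, and two-factor interaction terms) on synthetic coded factors; the factor names, design size, and coefficients are illustrative assumptions.

```python
# Hedged sketch: quadratic response-surface fit on coded process factors.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
# Three coded factors in [-1, 1], e.g. monomer:porogen ratio, GMA:EDMA ratio,
# porogen composition, as in a face-centered central composite design.
X = rng.uniform(-1, 1, size=(20, 3))
y = (1.0 + 0.5 * X[:, 0] - 0.3 * X[:, 1] ** 2 + 0.2 * X[:, 0] * X[:, 2]
     + rng.normal(0, 0.05, size=20))  # placeholder response, e.g. pore volume

# degree=2 adds squared terms and two-factor interactions to the linear terms.
rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, y)
print("R^2 on training data:", rsm.score(X, y))
```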
Adoption of artificial intelligence (AI) has risen sharply in recent years, but many firms are not successful in realising the expected benefits or even terminate projects before completion. While a number of previous studies highlight challenges in AI projects, the critical factors that lead to project failure are mostly unknown. The aim of this study is therefore to identify distinct factors that are critical for the failure of AI projects. To address this, interviews with experts in the field of AI from different industries were conducted, and the results were analyzed using qualitative analysis methods. The results show that both organizational and technological issues can cause project failure. Our study contributes to knowledge by reviewing previously identified challenges in terms of their criticality for project failure based on new empirical data, as well as by identifying previously unknown factors.
Purpose
Interpretive research in management accounting and control provides rich insights from empirically based studies, but it has been criticised for lacking generalisability and for potential subjectivity. On the latter, triangulation is useful, and this paper aims to offer some insights on a triangulation technique thus far not commonly reported in management accounting/control research.
Design/methodology/approach
Drawing on a study of the roles of management accountants in performance management systems, this paper offers some insights from empirical experiences on the use of concept maps as a tool to assist triangulation and improve understanding of complex empirical phenomena.
Findings
The concept maps as utilised revealed additional insights which were not recounted by interviewees during the normal interview time. This is a potentially important finding for consideration by future researchers.
Practical implications
In this paper, how concept maps were used is detailed, and it is hoped that future researchers will find their use beneficial in interview settings.
Originality/value
Thus far, concept maps seem under-utilised in management accounting and control research. This paper gives some initial insights on how they may be used in case study settings.
One of the key challenges for automatic assistance is the support of actors in the operating room depending on the status of the procedure. Therefore, context information collected in the operating room is used to gain knowledge about the current situation. In the literature, solutions already exist for specific use cases, but it is doubtful to what extent these approaches can be transferred to other conditions. We conducted a comprehensive literature review on existing situation recognition systems for the intraoperative area, covering 274 articles and 95 cross-references published between 2010 and 2019. We contrasted and compared 58 identified approaches based on defined aspects such as the sensor data used or the application area. In addition, we discussed applicability and transferability. Most of the papers focus on video data for recognizing situations within laparoscopic and cataract surgeries. Not all of the approaches can be used online for real-time recognition. Using different methods, good results with recognition accuracies above 90% could be achieved. Overall, transferability is less addressed. The applicability of approaches to other circumstances seems possible only to a limited extent. Future research should place a stronger focus on adaptability. The literature review shows differences within existing approaches for situation recognition and outlines research trends. Applicability and transferability to other conditions are less addressed in current work.
Textbook on the CAD software Creo Parametric and on product data management with Windchill.
3D solid modeling, 3D surface modeling, sheet-metal modeling, assembly and drawing creation, definition of standard parts, creation of animations and dynamic analyses.
Methods for handling large assemblies and for flexible modeling, "top-down" and "bottom-up" design variants, organization of design projects using skeleton techniques.
New: design of and with multi-body objects, frame design in the profile environment (AFX), intelligent connections (IFX), live simulation, and generative design.
Acid mantle - finishes for protection against microbial attack - (DTNW Mitteilung Nr. 129)
(2022)
The aim of the research project was to mimic the effect of the acid mantle of human skin on textile surfaces using acid catalysts in order to develop novel antibacterial textiles. For this purpose, aqueous finishes were to be developed for the textile industry that can be applied via conventional finishing techniques such as padding. The finish was required to be active in a moist environment in order to guarantee an effect when wearing, for example, functional clothing or workwear in the medical sector.
To meet the project goals, various commercial polyoxometalates were used. In addition, polyoxometalates were synthesized and functionalized. These perform acid catalysis in aqueous media and are used as industrial catalysts bound to membranes. An activity screening of suitable candidates showed that aqueous application is possible and leads to antibacterial activity of the finished textiles.
The polyoxometalates could be immobilized on various textiles at laboratory scale by the sol-gel process using tetraethoxysilane, applied by padding. Scale-up to pilot-plant scale was also successful. The activity screening of the finishes showed that an acidic surface pH of ≤ 4 can be achieved with the developed finish and leads to an antibacterial effect. Abrasion resistance was given. After washing tests, however, the finishes partly lost their antibacterial effect.
Overall, the project provided insight into the usefulness of polyoxometalates as catalytically active substances suitable for textile finishing. Since the polyoxometalates synthesized in this research project exhibit no genotoxic or mutagenic activity, SMEs in the textile finishing sector can apply a new kind of antibacterial finish. To achieve a wash-stable finish, however, the functionalizations, and with them the bonding of the polyoxometalates to the finishing matrix, must be developed further.
The goals of the research project were achieved.
For collision and obstacle avoidance as well as trajectory planning, robots usually generate and use a simple 2D costmap without any semantic information about the detected obstacles. Thus a robot's path planning will simply adhere to an arbitrarily large safety margin around obstacles. A more optimal approach is to adjust this safety margin according to the class of an obstacle. For class prediction, an image processing convolutional neural network can be trained. One of the problems in the development and training of any neural network is the creation of a training dataset. The first part of this work describes methods and free open source software allowing the fast generation of annotated datasets. Our pipeline can be applied to various objects and environment settings and is extremely easy for anyone to use for synthesising training data from 3D source data. We create a fully synthetic industrial environment dataset with 10k physically-based rendered images and annotations. Our dataset and sources are publicly available at https://github.com/LJMP/synthetic-industrial-dataset. Subsequently, we train a convolutional neural network with our dataset for costmap safety class prediction. We analyse different class combinations and show that learning the safety classes end-to-end directly with a small dataset, instead of using a class lookup table, improves the quantity and precision of the predictions.
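As a sketch of the classification step, the code below defines a small CNN that maps an image to one of several costmap safety classes and runs one training step; the architecture, input size, and class count are illustrative assumptions, not the paper's network.

```python
# Hedged sketch: a small CNN for costmap safety class prediction (PyTorch).
import torch
import torch.nn as nn

class SafetyClassNet(nn.Module):
    def __init__(self, n_classes: int = 4):  # number of safety classes (assumed)
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, n_classes)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x))

model = SafetyClassNet()
images = torch.randn(8, 3, 128, 128)  # a batch of synthetic renderings
labels = torch.randint(0, 4, (8,))    # placeholder safety-class labels
loss = nn.CrossEntropyLoss()(model(images), labels)
loss.backward()                       # one illustrative training step
print(float(loss))
```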
The Dow Jones Sustainability Indexes (DJSI) track the performance of companies that lead in corporate sustainability in their respective sectors or in the geographies they operate in. The Sustainable Asset Management (SAM) Indexes GmbH publishes and markets the so-called Dow Jones Sustainability Indexes in collaboration with SAM. All indexes of the DJSI family are assessed according to SAM's Corporate Sustainability Assessment™ methodology.
Deep learning-based EEG detection of mental alertness states from drivers under ethical aspects
(2021)
One of the most critical factors for a successful road trip is a high degree of alertness while driving. Even a split second of inattention or sleepiness at a crucial moment can make the difference between life and death. Several prestigious car manufacturers are currently pursuing the aim of automated drowsiness identification to resolve this problem. The path between neuroscientific research in connection with artificial intelligence and the preservation of the dignity of human individuals and their inviolability is very narrow. The key contribution of this work is a system of data analysis for EEGs during a driving session, which draws on previous studies analyzing heart rate (ECG), brain waves (EEG), and eye function (EOG). The gathered data is treated as sensitively as possible, taking ethical regulations into consideration. Obtaining evaluable signs of evolving exhaustion involves techniques that capture sleeping-stage frequencies; a complication here is the correlated interference in the signal. This research focuses on a processing chain for EEG band splitting that involves band-pass filtering, principal component analysis (PCA), independent component analysis (ICA) with automatic artefact severance, and the fast Fourier transform (FFT). The classification is based on a step-by-step adaptive deep learning analysis that detects theta rhythms as a drowsiness predictor in the pre-processed data. It was possible to obtain an offline detection rate of 89% and an online detection rate of 73%. The method is linked to the simulated driving scenario for which it was developed. This leaves space for further optimization of laboratory methods and data collection during wakefulness-dependent operations.
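The following hedged sketch strings together the named stages (band-pass filtering, ICA, and FFT-based theta-band power) on a synthetic multichannel signal; the sampling rate, filter order, and band limits are assumptions, and the automatic artefact severance step is omitted.

```python
# Hedged sketch: band-pass -> ICA -> FFT theta-band power on synthetic EEG.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.decomposition import FastICA

fs = 256                            # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
# Eight synthetic EEG channels: a 6 Hz theta component plus noise.
eeg = np.sin(2 * np.pi * 6 * t)[None, :] + rng.normal(0, 1, (8, t.size))

# 1. Band-pass filter 1-30 Hz (zero-phase).
b, a = butter(4, [1, 30], btype="bandpass", fs=fs)
filtered = filtfilt(b, a, eeg, axis=1)

# 2. ICA to separate components (artefact rejection would drop some of them).
sources = FastICA(n_components=8, random_state=0).fit_transform(filtered.T).T

# 3. FFT: relative power in the theta band (4-8 Hz) of one component.
spectrum = np.abs(np.fft.rfft(sources[0])) ** 2
freqs = np.fft.rfftfreq(sources[0].size, 1 / fs)
theta = spectrum[(freqs >= 4) & (freqs < 8)].sum() / spectrum.sum()
print(f"relative theta power: {theta:.2f}")
```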
We propose a novel technique to compensate for the effects of R-C / gm-C time-constant (TC) errors due to process variation in continuous-time delta-sigma modulators. Local TC error compensation factors are shifted around in the modulator loop to positions where they can be implemented efficiently with tunable circuit structures, such as current-steering digital-to-analog converters (DACs). This approach constitutes an alternative or supplement to existing compensation techniques, including capacitor or gm tuning. We apply the proposed technique to a third-order, single-bit, low-pass continuous-time delta-sigma modulator in cascaded integrator feedback structure. A feedback path tuning scheme is derived analytically and confirmed numerically using behavioral simulations. The modulator circuit was implemented in a 0.35-μm CMOS process using an active feedback coefficient tuning structure based on current-steering DACs. Post-layout simulations show that with this tuning structure, constant performance and stable operation can be obtained over a wide range of TC variation.
IT governance: current state of and future perspectives on the concept of agility in IT governance
(2020)
Digital transformation has changed corporate reality and, with that, corporates’ IT environments and IT governance (ITG). As such, the perspective of ITG has shifted from the design of a relatively stable, closed and controllable system of a self-sufficient enterprise to a relatively fluid, open, agile and transformational system of networked co-adaptive entities. Related to the paradigm shift in ITG, this thesis aims to conceptualize a framework to integrate the concept of agility into the traditional ITG framework and to test the effects of such an extended ITG framework on corporate performance.
To do so, the thesis uses literature research and a mixed method design by blending both qualitative and quantitative research methods. Given the poorly understood situation of the agile mechanisms within the ITG framework, the building process of this thesis’ research model requires an adaptive and flexible approach which involves four different research phases. The initial a priori research model based on a comprehensive review of the extant literature is critically examined and refined at the end of each research phase, which later forms the basis of a subsequent research phase. As a result, the final research model provides guidance on how the conceptualized framework leads to better business/IT alignment as well as how business/IT alignment can mediate the effectiveness of such an extended ITG framework on corporate performance.
The first research phase explores the current state of literature with a focus on the ITG-corporate performance association. This analysis identifies five perspectives with respect to the relationship between ITG and corporate performance. The main variables lead to the perspectives of business/IT alignment, IT leadership, IT capability and process performance, resource relatedness and culture. Furthermore, the analysis presents core aspects explored within the identified perspectives that could act as potential mediators or moderators in the relationship between ITG and corporate performance.
The second research phase investigates the agile aspect of an effective ITG framework in the dynamic contemporary world through a qualitative study. Gleaned from 46 semi-structured interviews with governance experts across various industries, the study identifies 25 agile ITG mechanisms and 22 traditional ITG mechanisms that corporations use to master digital transformation projects. Moreover, the research offers two key patterns indicating a call for ambidextrous ITG, with corporations alternating between stability and agility in their ITG mechanisms.
In research phase three, a scale development process is conducted in order to develop the agile items explored in research phase two. Through 56 qualitative interviews with professionals, the evaluation uncovers 46 agile governance mechanisms. Moreover, these dimensions are rated by 29 experts to identify the most effective ones. This leads to the identification of six structure elements, eight processes, and eight relational mechanisms.
Finally, in research phase four a quantitative research approach through a survey of 400 respondents is established to test and predict the formulated relationships by using the partial least squares structural equation modelling (PLS-SEM) method. The results provide evidence for a strong causal relationship among an expanded ITG concept, business/IT alignment, and corporate performance. These findings reveal that the agile ITG mechanisms within an effective ITG framework seem critical in today’s digital age.
This research is unique in exploring the combination of traditional and agile ITG mechanisms. It contributes to the theoretical base by integrating and extending the literature on ITG, business/IT alignment, ambidexterity and agility, all of which have long been recognized as critical for achieving organizational goals. In summary, this work presents an original analysis of an effective ITG framework for digital transformation by including the agile aspect within the ITG construct. It highlights that it is not enough to apply only traditional mechanisms to achieve effective business/IT alignment in today's digital age; agile ITG mechanisms are also needed. Therefore, a novel ITG framework following an ambidextrous approach is provided, consisting of traditional ITG mechanisms as well as newly developed agile ITG practices. This thesis also demonstrates that agile ITG mechanisms can be measured independently of traditional ITG mechanisms within one causal model. This is an important theoretical outcome that allows the current state of ITG to be assessed in two distinct dimensions, offering various pathways for further research on the different antecedents and effects of traditional and agile ITG mechanisms. Furthermore, this thesis makes practical contributions by highlighting the need to develop a basic governance framework powered by traditional ITG mechanisms and simultaneously increase agility in ITG mechanisms. The results imply that corporations might be even more successful if they include both traditional and agile mechanisms in their ITG framework. In this way, the uncovered agile ITG practices may provide a template for CIOs to derive their own mechanisms in following an ambidextrous approach that is suitable for their corporation.
When Google began several years ago to analyze its vast stores of personnel data to find out which characteristics make good managers, it was breaking new ground. The results suggested using the data for other HR questions as well (cf. Garvin).
By now, it is not only technology companies like Google that engage with methods intensively discussed and researched under the heading People Analytics (also HR Analytics or Workforce Analytics). These methods systematically analyze the extensive stocks of employee-related data arising from recruitment, employee surveys, or performance appraisals and use them for predictions (cf. Marler/Boudreau, p. 15). The underlying assumption is that personnel decisions improve when they rest not only on intuition and experience but also on a solid data foundation.
Public transport maps are typically designed to support route-finding tasks for passengers while also providing an overview of stations, metro lines, and city-specific attractions. Most of these maps are designed as a static representation, perhaps placed in a metro station or printed in a travel guide. In this paper, we describe a dynamic, interactive public transport map visualization enhanced by additional views for the dynamic passenger data on different levels of temporal granularity. Moreover, we also allow extra statistical information in the form of density plots, calendar-based visualizations, and line graphs. All this information is linked to the contextual metro map to give a viewer insights into the relations between time points and typical routes taken by the passengers. We also integrated a graph-based view on user-selected routes, a way to interactively compare those routes, an attribute- and property-driven automatic computation of specific routes for one map as well as for all available maps in our repertoire, and, finally, the most important sights in each city as extra information to include in a user-selected route. We illustrate the usefulness of our interactive visualization and map navigation system by applying it to the railway system of Hamburg in Germany while also taking into account the extra passenger data. As another indication of the usefulness of the interactively enhanced metro maps, we conducted a controlled user experiment with 20 participants.
Commercially available homogenized cow- and plant-based milks were investigated by optical spectroscopy in the range of 400–1360 nm. Absorbance spectra, the effective scattering coefficient μs′, and the spectral absorption coefficient μa were recorded for 23 milk varieties and analyzed by multivariate data analysis. Cow- and plant-based milks were compared and discriminated using principal component analysis combined with a quadratic discriminant analysis. Furthermore, it was possible to discriminate the origin of plant-based milk by μa and the fat content in cow-based milk by μs′. Partial least squares regression models were developed to determine the fat content in cow-based milk. The model for μs′ proved to be the most efficient for this task with R2 = 0.98 and RMSEP = 0.19 g/100 mL for the external validation. Thus, optical spectroscopy together with multivariate data analysis is suitable for routine laboratory analysis or quality monitoring in the dairy production.
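As an illustration of the discrimination route, the sketch below chains PCA with quadratic discriminant analysis on synthetic spectra; the sample count, spectral dimension, and number of components are placeholder assumptions.

```python
# Hedged sketch: PCA + QDA to separate cow- and plant-based milk spectra.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
spectra = rng.normal(size=(46, 960))    # placeholder absorbance spectra
is_plant = rng.integers(0, 2, size=46)  # 0 = cow-based, 1 = plant-based

clf = make_pipeline(PCA(n_components=5), QuadraticDiscriminantAnalysis())
clf.fit(spectra, is_plant)
print("training accuracy:", clf.score(spectra, is_plant))
```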
A new planar compact antenna composed of two crossed Cornu spirals is presented. Each Cornu spiral is fed from the center of the linear part of the curve between the two spirals, which forms the clothoid. Sequential rotation is applied using a sequential phase network to obtain circular polarization and increase the effective bandwidth. Signal integrity issues have been addressed in the design to ensure high-quality signal propagation. As a result, the antenna shows good radiation characteristics in the bandwidth of interest. Compared to antennas of the same size in the literature, it is broadband and of high gain. Although the proposed antenna has been designed for K- and Ka-band operation, it can also be adapted for lower and higher frequencies because of the linearity of Maxwell's equations.
Purpose
The purpose of this study is to examine private households’ preferences for service bundles in the German energy market.
Design/methodology/approach
This investigation is based on survey data collected from 3,663 customers of seven mainly municipal energy suppliers in the German energy market. The data set was analyzed via a binary logistic regression model to identify the most prospective customers and their preferences regarding bundles of energy services.
Findings
The results indicate that potential adopters of energy-related service bundles have greater prior knowledge about service bundles; place higher importance on simplified handling, flat rates and long price guarantees; prefer to purchase a service bundle from an energy supplier; live in urban areas and have a gas tariff; are both less likely to have a green electricity tariff and to support the German energy transition; have a greater intention to purchase a smart home product; are less likely to already be prosumers; and prefer customer centers and social media as communication channels with energy providers.
Practical implications
This paper offers several implications for decision-makers in developing marketing strategies for bundled offerings in a highly competitive energy market.
Originality/value
This paper contributes to the sparse research on service bundles in the energy sector, despite the growing interest of energy suppliers and consumers in this topic. It expands the research focusing on the telecommunications sector.
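To make the methodology tangible, here is a minimal sketch of a binary logistic regression of bundle-adoption intent on a few hypothetical survey variables, reported as odds ratios; the column names and data are illustrative, not the survey's.

```python
# Hedged sketch: binary logistic regression on synthetic survey data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "adopter": rng.integers(0, 2, 500),       # 1 = potential bundle adopter
    "prior_knowledge": rng.normal(size=500),  # illustrative predictors
    "urban": rng.integers(0, 2, 500),
    "green_tariff": rng.integers(0, 2, 500),
})

logit = smf.logit("adopter ~ prior_knowledge + urban + green_tariff", df).fit()
print(np.exp(logit.params))  # odds ratios per predictor
```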
The food system represents a key industry for Europe and Germany in particular. However, it is also the single most significant contributor to climate and environmental change. A food system transformation is necessary to overcome the system's major and constantly increasing challenges in the upcoming decades. One possible facilitator of this transformation is the radical and disruptive innovation that start-ups develop. There are many challenges for start-ups in general and food start-ups in particular, so various support opportunities and resources are crucial to ensure the success of food start-ups. One aim of this study is to identify how the success of start-ups in the food system can be supported and further strengthened by actors in the innovation ecosystem in Germany. A successful innovation ecosystem is characterised by a well-organised, collaborative, and supportive environment with a vivid exchange between the members of the ecosystem. The interviewees confirmed this: although the different actors are already cooperating, there is still room for improvement toward a thriving innovation ecosystem. The most common recommendation for improving cooperation is to learn from other countries and bring the best to Germany.
Up to now, biorefinery concepts can hardly compete with the conventional production of fossil-based chemicals. On the one hand, conventional chemical production has been optimised over many decades in terms of energy, yield, and costs. Biorefineries, on the other hand, do not have the benefit of long-term experience and therefore have huge potential for optimisation. This study deals with the economic evaluation of a newly developed biorefinery concept based on superheated steam (SHS) torrefaction of biomass residues with the recovery of valuable platform chemicals. Two variants of the biorefinery were economically investigated. One variant supplies various platform chemicals and torrefied biomass. The second variant supplies thermal energy for external consumers in addition to platform chemicals. The results show that both variants can be operated profitably if the platform chemicals produced focus on high quality and thus on the higher-priced segment. The economic analysis gives clear indications of the most important financial influencing parameters. The economic impact of integration into existing industrial structures is positive. With this analysis, a viable business model can be developed. Based on the results of the present study, an open-innovation platform is recommended for the further development and commercialisation of the novel biorefinery.
The world population is growing, and alternative ways of satisfying the increasing demand for meat are being explored, such as using animal cells for the fabrication of cultured meat. Edible biomaterials are required as supporting structures. Hence, we chose agarose, gellan, and a xanthan-locust bean gum blend (XLB) as support materials with pea and soy protein additives and analyzed them regarding material properties and biocompatibility. We successfully built stable hydrogels containing up to 1% pea or soy protein. Higher amounts of protein resulted in poor handling properties and unstable gels. The gelation temperature range for agarose and gellan blends is between 23–30 °C, but for XLB blends it is above 55 °C. A change in viscosity and a decrease in the swelling behavior were observed in the polysaccharide-protein gels compared to the pure polysaccharide gels. None of the leachates of the investigated materials had cytotoxic effects on the myoblast cell line C2C12. All polysaccharide-protein blends evaluated turned out to be potential candidates for cultured meat. For cell-laden gels, the gellan blends were the most suitable in terms of processing and uniform distribution of cells, followed by agarose blends, whereas no stable cell-laden gels could be formed with XLB blends.
Due to its availability and minimally invasive harvesting, human adipose tissue-derived extracellular matrix (dECM) is often used as a biomaterial in various tissue engineering and healthcare applications. Next to dECM, cell-derived ECM (cdECM) can be generated by and isolated from in vitro cultured cells. So far, both types of ECM have been investigated extensively with regard to their application as (bio)materials in tissue engineering and healthcare. However, a systematic characterization and comparison of soft tissue dECM and cdECM is still missing. In this study, we characterized dECM from human adipose tissue, as well as cdECM from human adipose-derived stem cells, with respect to their molecular composition, structural characteristics, and biological purity. The dECM was found to exhibit higher levels of collagens and lower levels of sulfated glycosaminoglycans compared with cdECMs. Structural characteristics revealed an immature state of the fibrous part of cdECM samples. With the identified differences, we aim to support researchers in the selection of a suitable ECM-based biomaterial for their specific application and in the interpretation of the obtained results.
Within the last decade, research on torrefaction has gained increasing attention due to its ability to improve the physical properties and chemical composition of biomass residues for further energetic utilisation. While most research has focused on improving the energy density of the solid fraction to offer an ecological alternative to coal for energy applications, little attention has been paid to the valorisation of the condensable gases as platform chemicals and to their ecological relevance compared to conventional production processes. Therefore, the present study focuses on the ecological evaluation of an innovative biorefinery concept that includes superheated steam drying and the torrefaction of biomass residues at ambient pressure, the recovery of volatiles, and the valorisation/separation of several valuable platform chemicals. For a reference case and an alternative system design scenario, the ecological footprint was assessed, considering the use of different biomass residues. The results show that the newly developed process can compete with established bio-based and conventional production processes for furfural, 5-HMF, and acetic acid in terms of the assessed environmental performance indicators. The requirements for further research on the synthesis of other promising platform chemicals and the necessary economic evaluation of the process were elaborated.
The scoring of sleep stages is an essential part of sleep studies. The main objective of this research is to provide an algorithm for the automatic classification of sleep stages using signals that may be obtained in a non-obtrusive way. After reviewing the relevant research, the authors selected multinomial logistic regression as the basis for their approach. Several parameters were derived from movement and breathing signals, and their combinations were investigated to develop an accurate and stable algorithm. The implemented algorithm produced successful results: the accuracy of the recognition of Wake/NREM/REM stages is 73%, with a Cohen's kappa of 0.44 for the 19,324 analyzed sleep epochs of 30 seconds each. This approach has the advantage of using only movement and breathing signals, which can be recorded with less effort than heart or brainwave signals, and of requiring only four derived parameters for the calculations. Therefore, the new system is a significant improvement for non-obtrusive sleep stage identification compared to existing approaches.
From the perspective of manufacturing companies, the political, media, and economic discourse on decarbonisation in recent years manifests itself as an increasing societal expectation to act. In Germany in particular, this discourse is also being driven forward by powerful companies and sectors, most notably the automotive industry. Against this background, the present paper examines how German manufacturing companies react to rising societal pressure and emerging policies. It examines which measures the companies have taken or plan to take to reduce their carbon footprint, which aspirations are associated with this, and the structural characteristics (company size, energy intensity, and sector) by which these are influenced. A mixed-methods approach is applied, utilising data gathered from approx. 900 companies in the context of the Energy Efficiency Index of German Industry (EEI), along with media research focusing on announced decarbonisation plans and initiatives. We demonstrate that one-size-fits-all approaches are not suitable for decarbonising industry, as the situation and ambitions differ considerably depending on size, energy intensity, and sector. Even though the levels of ambition and urgency are high, micro and energy-intensive companies in particular are challenged. The present research uncovers a series of questions that call for attention in order to materialise the ambitions and address the challenges outlined.
The development of new materials that mimic cartilage and its function is an unmet need that would allow replacing only the damaged parts of joints instead of the whole joint. Polyvinyl alcohol (PVA) hydrogels have raised special interest for this application due to their biocompatibility, high swelling capacity and chemical stability. In this work, the effect of post-processing treatments (annealing, high hydrostatic pressure (HHP) and gamma-radiation) on the performance of PVA gels obtained by cast-drying was investigated, and their ability to serve as delivery vehicles for the anti-inflammatories diclofenac or ketorolac was evaluated. HHP damaged the hydrogels, breaking some bonds in the polymeric matrix, and therefore led to poor mechanical and tribological properties. The remaining treatments, in general, improved the performance of the materials by increasing their crystallinity. Annealing at 150 °C generated the best mechanical and tribological results: higher resistance to compressive and tensile loads, lower friction coefficients and the ability to support higher loads in sliding movement. This material was loaded with the anti-inflammatories, both without and with vitamin E (Vit.E) or Vit.E + cetalkonium chloride (CKC). Vit.E + CKC helped to control the release of the drugs, which occurred over 24 h. The material did not induce irritability or cytotoxicity and therefore shows high potential for use in cartilage replacement with a therapeutic effect in the immediate postoperative period.
This contribution provides an introduction to the research programme of governance ethics as a modern business ethics, presented by Josef Wieland in the 1990s and further developed since then, covering both its central line of argument and its practical implementation through value management systems (WerteManagementSysteme, WMS). The contribution is structured around answering the question formulated by Wieland, which in his view also characterises the entire problem of the modern debate on business ethics: how is "normativity that is oriented towards the whole of the societal system possible under the conditions of functional differentiation?"
Monitoring tautomerization of single hypericin molecules in a tunable optical λ/2 microcavity
(2022)
Hypericin tautomerization, which involves the migration of labile protons, is believed to be the primary photophysical process relevant to its light-activated antiviral activity. Although individual tautomers are difficult to isolate, tautomerization can be directly observed in single-molecule experiments. We show that the tautomerization of single hypericin molecules in free space is observed as an abrupt flipping of the image pattern accompanied by fluorescence intensity fluctuations, which are not correlated with lifetime changes. Moreover, the study can be extended to a λ/2 Fabry–Pérot microcavity. The modification of the local photonic environment by a microcavity is well captured by a theoretical model that shows good agreement with the experimental data. Inside a microcavity, the excited-state lifetime and fluorescence intensity of single hypericin molecules are correlated, and a distinct jump of the lifetime and fluorescence intensity reveals the temporal behavior of the tautomerization with high sensitivity and high temporal resolution. The observed changes are also consistent with time-dependent density functional theory calculations. Our approach paves the way to monitoring and even controlling reactions for a wider range of molecules at the single-molecule level.
On 1 November 2010, the guidance standard on the social responsibility of organizations, "Guidance on Social Responsibility" (ISO 26000:2010), was published. This standards document was developed over six years in a worldwide standardization process involving more than 400 experts from 99 countries, a process unique even for the International Organization for Standardization (ISO).
For more than twelve years now, Informatics Inside has been held as a computer science conference at Hochschule Reutlingen, this year for the second time on a semi-annual rhythm, i.e. also in autumn. This scientific conference of the master's programme Human-Centered Computing is organised and run by the students themselves. During their master's studies, they are given the opportunity to delve into a subject of their own choosing. This work can be carried out at the university, in a company, at a research institute or abroad. It is precisely this flexible design of the module "Wissenschaftliche Vertiefung" (scientific specialisation) that leads to the very broad range of topics the students work on. In addition to the subject-specific work itself, the presentation and defence of scientific results also plays an important role, far beyond the degree itself. Preparing and conveying a chosen subject area in such a generally comprehensible way that it becomes understandable for non-specialists, too, is a recurring and particular challenge. The students take on this challenge at the autumn conference on scientific specialisation on 24 November 2021. For the fourth time, the event will take place online, including a virtual supporting programme.
The range of topics at this year's autumn conference is once again very diverse and broad. Among other things, you can expect contributions from the health sector, machine learning, AI and VR, as well as marketing and e-learning. Common to all of them is a very strong connection to innovative computer science approaches, which is also reflected in the conference's pun and motto "RockIT Science". Computer science permeates almost all professional and private areas of application and has an ever-increasing influence on our daily lives. This can trigger concern on the one hand and enthusiasm on the other. It is precisely the latter that the students want to achieve with their contributions, letting things "rock" in the computer science sector for once.
Context-aware systems that support actors in the operating room depending on the status of the intervention require knowledge about the current situation in the intra-operative area. In the literature, solutions for achieving situation awareness already exist for specific use cases, but applicability and transferability to other conditions are less addressed. It is assumed that a unified solution that can be adapted to different processes and sensors would allow for greater flexibility, applicability, and thus transferability to different applications. To enable a flexible and intervention-independent system, this work proposes a concept for an adaptable situation recognition system. The system consists of four layers with several modular components for different functionalities. Feasibility is demonstrated via a prototypical implementation and functional evaluation of a first basic framework prototype. A further development goal is the stepwise extension of the prototype.
Silicon photonic micro-ring resonators (MRR) developed on the silicon-on-insulator (SOI) platform, owing to their high sensitivity and small footprint, show great potential for many chemical and biological sensing applications such as label-free detection in environmental monitoring, biomedical engineering, and food analysis. In this tutorial, we provide the theoretical background and give design guidelines for SOI-based MRR as well as examples of surface functionalization procedures for label-free detection of molecules. After introducing the advantages and perspectives of MRR, fundamentals of MRR are described in detail, followed by an introduction to the fabrication methods, which are based on a complementary metal-oxide semiconductor (CMOS) technology. Optimization of MRR for chemical and biological sensing is provided, with special emphasis on the optimization of waveguide geometry. At this point, the difference between chemical bulk sensing and label-free surface sensing is explained, and definitions like waveguide sensitivity, ring sensitivity, overall sensitivity as well as the limit of detection (LoD) of MRR are introduced. Further, we show and explain chemical bulk sensing of sodium chloride (NaCl) in water and provide a recipe for label-free surface sensing.
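The sensitivity and limit-of-detection terms named in this tutorial abstract follow widely used conventions for ring resonator sensing; the relations below are a hedged summary of those textbook definitions, not formulas quoted from the tutorial itself:

```latex
% Common textbook relations for MRR sensing (illustrative summary):
\begin{align}
  m\,\lambda_{\mathrm{res}} &= n_{\mathrm{eff}}\, 2\pi r
    && \text{resonance condition, mode order } m \in \mathbb{N} \\
  S_{\mathrm{wg}} &= \frac{\partial n_{\mathrm{eff}}}{\partial n_{\mathrm{clad}}}
    && \text{waveguide sensitivity} \\
  S_{\mathrm{ring}} &= \frac{\lambda_{\mathrm{res}}}{n_{g}}
    && \text{ring sensitivity } (n_{g}\text{: group index}) \\
  S &= S_{\mathrm{ring}}\, S_{\mathrm{wg}}, \qquad
  \mathrm{LoD} = \frac{R}{S}
    && R\text{: resolution of the read-out system}
\end{align}
```

Optimizing the waveguide geometry, as described above, essentially maximizes the waveguide sensitivity term while keeping the resonance sharp enough for a small system resolution R.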
Together with many success stories, promises such as the increase in production speed and the improvement in stakeholders' collaboration have contributed to making agile a transformation in the software industry in which many companies want to take part. However, driven either by a natural and expected evolution or by contextual factors that challenge the adoption of agile methods as prescribed by their creator(s), software processes in practice mutate into hybrids over time. Are these still agile? In this article, we investigate the question: what makes a software development method agile? We present an empirical study grounded in a large-scale international survey that aims to identify software development methods and practices that improve or tame agility. Based on 556 data points, we analyze the perceived degree of agility in the implementation of standard project disciplines and its relation to the development methods and practices used. Our findings suggest that only a small number of participants operate their projects in a purely traditional or agile manner (under 15%). That said, most project disciplines and most practices show a clear trend towards increasing degrees of agility. Compared to the methods used to develop software, the selection of practices has a stronger effect on the degree of agility of a given discipline. Finally, there are no methods or practices that explicitly guarantee or prevent agility. We conclude that agility cannot be defined solely at the process level. Additional factors need to be taken into account when trying to implement or improve agility in a software company. Finally, we discuss the field of software process-related research in the light of our findings and present a roadmap for future research.
Prior to the introduction of AI-based forecast models in the procurement department of an industrial retail company, we assessed the digital skills of the procurement employees and surveyed their attitudes toward a new digital technology. The aim of the survey was to ascertain important contextual factors that are likely to influence the acceptance and successful use of the new forecast tool. We find that the employees' digital skills are at an intermediate level and that their attitudes toward key aspects of new digital technologies are largely positive. Thus, the conditions for high acceptance and successful use of the models are good, as evidenced by the procurement staff's high intention to use them. In line with previous research, we find that the perceived usefulness of a new technology and its perceived ease of use are significant drivers of the willingness to use the new forecast tool.
Forecasting demand is challenging. Different products exhibit different demand patterns: while demand may be constant and regular for one product, it may be sporadic for another, and when demand does occur, it may fluctuate significantly. Forecasting errors are costly and result in obsolete inventory or unsatisfied demand. Methods from statistics, machine learning, and deep learning have been used to predict such demand patterns. Nevertheless, it is not clear which algorithm achieves the best forecast for a given demand pattern. Therefore, even today a large number of models are run over a test period, and the model with the best result on the test period is used for the actual forecast. This approach is computationally and time intensive and, in most cases, uneconomical. In our paper we show that a machine learning classification algorithm can predict the best possible model based on the characteristics of a time series. The approach was developed and evaluated on a dataset from a B2B technical retailer. The classification algorithm achieves a mean ROC-AUC of 89%, which emphasizes the skill of the model.
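A hedged sketch of the described idea: derive simple demand-pattern characteristics from each time series and train a classifier to predict which forecasting model will win the backtest. The feature set, the candidate model labels, and the random stand-in data are illustrative assumptions, not the paper's setup:

```python
# Hedged sketch: predict the best forecasting model for a series from simple
# demand-pattern characteristics (model selection as classification).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def ts_features(series: np.ndarray) -> list:
    """Average inter-demand interval (ADI), squared CV of demand sizes,
    mean demand, and share of zero-demand periods."""
    nonzero = series[series > 0]
    adi = len(series) / max(len(nonzero), 1)
    cv2 = float((nonzero.std() / nonzero.mean()) ** 2) if len(nonzero) else 0.0
    return [adi, cv2, float(series.mean()), float((series == 0).mean())]

rng = np.random.default_rng(1)
series_set = [rng.poisson(lam, size=104) for lam in rng.uniform(0.1, 5.0, 500)]
X = np.array([ts_features(s) for s in series_set])
# Label = index of the model that won the backtest (0 = Croston, 1 = ETS,
# 2 = LSTM); randomized here as a stand-in for the actual backtesting step.
y = rng.integers(0, 3, size=len(X))

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("mean ROC-AUC:",
      cross_val_score(clf, X, y, cv=5, scoring="roc_auc_ovr").mean())
```

Once trained, the classifier replaces the expensive run-every-model-on-a-test-period step with a single feature computation and prediction per series.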
At literally the last minute, the British government and the European Union agreed on a comprehensive treaty to prevent an unregulated Brexit. After years of tough marathon negotiations, the jubilation is muted; nevertheless, there is relief on both sides of the English Channel because a modus vivendi has been found on which future relations can be built and carried forward. Whether the blossoming British dreams attached to Brexit will be fulfilled, the future will show.
The strategy and tactics of the British governments regarding Brexit and the exit negotiations are mirrored in the experiences Friedrich List had exactly 175 years ago in his efforts to forge a German-English alliance. In view of the insular and trade supremacy that England strictly pursued even then, he had to concede that England would defend this position tenaciously, and, frustrated and disillusioned, he abandoned his plans. He therefore placed his hopes in a "continental alliance" of the European nations, such as has now come into being following the withdrawal of the United Kingdom from the European Union. Perhaps we will now have to get used to the term "continental alliance" and will be reminded of Friedrich List's foresight.
On the other hand, the motto of List's second Paris prize essay also applies to British policy: "Le monde marche - the world moves on", albeit under entirely different circumstances than 175 years ago: the axis of world trade has shifted from the western to the eastern hemisphere; the British Empire is history; the pace of global change has accelerated dramatically; and, despite the lingua franca, England appears, especially from an Asian perspective, as no more than a small speck on the world map. Should the Scottish government get its way and achieve independence from the United Kingdom, Brexit would prove to be a fateful boomerang.
Digitalisation and mediatisation shape society and also adult and continuing education. This contribution explores the question of how digitalisation succeeds in adult and continuing education programmes, with a focus on the use of digital media. To this end, programme development for target groups and participants, media-related content, teaching and learning arrangements with digital media, the use of digital media, and the accessibility of teaching and learning materials are identified as relevant characteristics. Overall, the analysed interview data show that the use of digital media in such programmes extends the didactic tasks involved, since programmes with digital media must be precisely tailored to the needs and capabilities of target groups and participants.
The hearing contact lens (HCL) is a new type of hearing aid. One of its main components is a piezoelectric actuator (PEA). In order to evaluate and maximize the HCL's performance, a model of the HCL coupled to the middle ear was developed using a finite element (FE) approach. To validate the model, vibrational measurements on the HCL and on temporal bones were performed using a Laser Doppler Vibrometer (LDV). The model was validated step by step, starting with the HCL only. Then a silicone cap was fitted onto the HCL to provide an interface between the HCL and the tympanic membrane. The HCL was placed on the tympanic membrane and additional measurements were performed to validate the coupled model. The model was used to evaluate the sensitivity of geometrical and material parameters with respect to performance measures of the HCL. Moreover, deeper insight was gained into the feedback behavior, which causes whistling sounds, and into the contact between the HCL and the tympanic membrane.
This paper presents a permanent magnet tubular linear generator system for powering passive sensors by harvesting vertical vibration energy. The system consists of a permanent magnet tubular linear vibration generator and electric circuits. By employing mechanically resonant movers, the generator is capable of converting low-frequency, small-amplitude vertical vibration energy into more regular sinusoidal electrical energy. The distribution of the magnetic field and the electromotive force are calculated by finite element analysis, and the characteristics of the linear vibration generator system are observed. The experimental results show the generator can produce about 0.4 W to 1.6 W of electrical power when the vibration source's amplitude is fixed at 2 mm and the frequency lies between 13 Hz and 22 Hz.
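Two textbook relations underlie such a resonant harvester design: Faraday's law for the EMF induced in the coil, and the mass-spring resonance of the mover, which is tuned so that the 13-22 Hz excitation band coincides with the resonance. These are general physics, not equations taken from the paper:

```latex
% General relations for a resonant electromagnetic vibration harvester:
\begin{align}
  e(t) &= -N\,\frac{\mathrm{d}\Phi}{\mathrm{d}t}
    && \text{EMF induced in a coil of } N \text{ turns} \\
  f_{\mathrm{res}} &= \frac{1}{2\pi}\sqrt{\frac{k}{m}}
    && \text{mover resonance; } k/m \text{ chosen to target 13--22 Hz}
\end{align}
```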
So-called cloud-based management information systems are a fairly new phenomenon in management accounting. Quite a few companies (and especially their business managers and management accountants) do not always work via the cloud, but rather with hybrid or on-premise ERP solutions such as SAP or Oracle, and often still with "manual" solutions such as Microsoft Excel.
This paper takes a holistic view of an IP-traceability process in interorganizational R&D projects, a particular open innovation mode. It aims to present different technologies that can be used in the front end and back end of a traceability process and to discuss these technologies in terms of their suitability for data from creativity processes in such projects. To achieve this goal, a two-stage literature review on different technologies in the context of traceability was conducted. Criteria were then derived from the characteristics of data from creativity processes and of interorganizational R&D projects, against which the resulting technologies were discussed. Finally, recommendations regarding suitable technologies for tracing individual creativity artifacts in interorganizational R&D projects are given.
The present study shows that the topic of smart innovation (the use of AI systems in the innovation process) is highly relevant and that there is support for the use of AI in the innovation process. Both companies and students see efficiency gains, faster processing of large data volumes, increased competitiveness and cost savings as reasons for using AI in the innovation process. In Germany, AI technologies are already being applied selectively and across industries in the innovation process. Enabling factors such as university cooperations, innovation departments and open innovation can foster adoption; SMEs in the early phases of industrialisation in particular should take advantage of this. The secret of success for a maximally efficient innovation process lies in the interplay of human expertise and the fast, precise data processing of AI. It becomes clear that several enabling factors are required to make the application of smart innovation practicable. First, the technical prerequisites of a functioning IT infrastructure must be met. Equally important are open questions regarding data availability, data ownership and data security: without a legal framework, few actors are willing to share their data and make it accessible. The use of AI is further hampered by the national shortage of IT specialists; both companies and students see the greatest obstacle in the lack of AI-relevant know-how. On the one hand this inhibits research; on the other, companies lack the specialists required to introduce AI. It is nevertheless necessary to convey the potential and opportunities of smart innovation to companies by presenting application examples. Application-oriented research must be promoted and a smooth transfer to industry ensured. This knowledge exchange also requires a greater entrepreneurial appetite for risk. The need to design company-specific AI strategies is growing. Technologies are evolving rapidly, so companies must adapt to this progress in order not to fall behind and to secure their competitiveness. The greatest challenge thus lies in the fundamental transformation of business models, since the value creation of successful companies is increasingly based on digital assets. Data is generally regarded as the new resource, the raw material, also for smart innovations. The importance of smart innovation will continue to grow. In the short and medium term, weak AI will mainly support data collection and analysis, process automation, and the identification of needs and trends. Furthermore, incremental improvements in innovation management are expected from simulations and the random combination of technologies. In the long term, stronger AI will be able to partially replace human involvement in the innovation process. Whether autonomous innovation will be possible in the future depends first on the degree of novelty of an innovation, but above all on the possibility of a creative AI.
It can be assumed that advances in AI will not only enable radical innovations but will also lead to a structural change in our current understanding of innovation management.
The Friedrich List Society (Friedrich-List-Gesellschaft, FLG) was founded in 1925 under the most adverse economic and political circumstances and was continued until 1934. Its principal purpose was to collect the widely scattered, hard-to-access and in many cases unknown writings, speeches and letters of Friedrich List (1789-1846) and to publish them in the form of a complete edition.
Neither this ten- or twelve-volume complete edition nor the names of its editors have received the appreciation and attention they deserve in economics. The present contribution repays this long-overdue debt of gratitude after almost 100 years. Without the dedicated and courageous efforts of the editors, in particular Edgar Salin, List scholarship would be unthinkable and German economics the poorer by a glorious chapter.
For five decades, research into the life, work and reception of Friedrich List (1789-1846) has been at the centre of Eugen Wendler's scholarly work. Over time, some 30 monographs and a large number of scholarly essays and journalistic articles have emerged. In doing so, Eugen Wendler built on the invaluable groundwork of the editors of the complete edition of List's works published from 1925 to 1935.
This essay provides an overview of Eugen Wendler's book publications on List research. With his impressive oeuvre he avows himself the last living fossil in the succession of the FLG, thereby paying the editors the due and long-overdue appreciation and respect.
In various German cities, free-floating e-scooter sharing is an upcoming trend in e-mobility. Trends such as climate change, urbanization and demographic change, among others, are forcing society to develop new mobility solutions. In contrast to the more thoroughly researched car sharing, the usage patterns and behaviors of e-scooter sharing customers still need to be analyzed. This would presumably enable better targeting of customers as well as adaptations of the business model to increase scooter utilization and therefore the profit of the e-scooter providers. The customer journey is digitally traceable from registration to scooter reservation and the ride itself, and these data make it possible to identify customer needs and motivations. We analyzed a dataset from 2017 to 2019 of an e-scooter sharing provider operating in a big German city. Based on this dataset, we propose a customer clustering that identifies three different customer segments, enabling multiple conclusions to be drawn for business development and for improving the problem-solution fit of the e-scooter sharing model.
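A minimal sketch of a three-segment customer clustering of the kind described, assuming illustrative per-customer usage features and synthetic data rather than the provider's real dataset:

```python
# Hedged sketch: k-means clustering of e-scooter customers into three segments.
# The four usage features and the synthetic data are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
# One row per customer: rides per month, mean ride distance (km),
# share of weekend rides, days since registration.
X = rng.normal(loc=[8.0, 2.5, 0.4, 300.0],
               scale=[4.0, 1.0, 0.2, 150.0],
               size=(1000, 4))

X_scaled = StandardScaler().fit_transform(X)  # put features on a common scale
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_scaled)
for k in range(3):
    seg = X[labels == k]
    print(f"segment {k}: n={len(seg)}, feature means={seg.mean(axis=0).round(2)}")
```

Inspecting the per-segment feature means is what allows conclusions such as "commuters vs. weekend leisure riders" to be drawn for business development.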
A closed-loop control for a cooperative innovation culture in interorganizational R&D projects
(2022)
Since project managers have only limited authority in interorganizational R&D projects, a cooperative innovation culture is essential for team cohesion and thus for achieving the project scope on time and within cost. Its development depends on different factors rooted in underlying values, and these factors must be learned iteratively by the project members so that they live the values of a cooperative innovation culture. Hence, this paper raises the following research question: "How to control living the values of a cooperative innovation culture in interorganizational R&D projects?" To answer this question, a closed-loop control for a cooperative innovation culture is developed. The developed closed-loop control system includes several functional units, which represent essential roles, and several variables, which show what to consider and design in the control system. In addition, the developed closed-loop control system is generalized to other types of projects, such as intraorganizational projects.
According to several surveys and statistics, the great majority of companies previously not accustomed to automation are piloting solutions to automate business processes. Those accustomed to automation also attempt to introduce more of it, focusing on automation-unfriendly processes that have remained manual. However, since the decision on what and whether to automate is not trivial, for evident reasons, even industry leaders may get stuck on an overwhelming question: where to begin automating? Too often the question remains unanswered, as state-of-the-art methods fail to consider the whole picture. This paper introduces a holistic approach to decision-making for investments in automation. The method supports the iterative analysis and evaluation of operative processes, providing tools for a quantitative approach to decision-making. Thanks to the method, a large pool of processes can first be considered and then filtered in order to select the one that yields the best value for automation in the specific context. After introducing the method, a case study is reported for validation, followed by a discussion.
Forecasting intermittent and lumpy demand is challenging. Demand occurs only sporadically and, when it does, it can vary considerably. Forecast errors are costly, resulting in obsolescent stock or unmet demand. Methods from statistics, machine learning and deep learning have been used to predict such demand patterns. Traditional accuracy metrics are often employed to evaluate the forecasts; however, these come with major drawbacks, such as not taking horizontal and vertical shifts over the forecasting horizon into account, nor stock-keeping or opportunity costs. This results in a disadvantageous selection of methods in the context of intermittent and lumpy demand forecasts. In our study, we compare methods from statistics, machine learning and deep learning by applying a novel metric called Stock-keeping-oriented Prediction Error Costs (SPEC), which overcomes the drawbacks associated with traditional metrics. Under the SPEC metric, the Croston algorithm achieves the best result, just ahead of a Long Short-Term Memory neural network.
Omnichannel retailing and sustainability are two important challenges for the fast fashion industry. However, the sustainable behavior of fast fashion consumers in an omnichannel environment has not received much attention from researchers. This paper aims to examine the factors that determine consumers’ willingness to participate in fast fashion brands’ used clothes recycling plans in an omnichannel retail environment. In particular, we examine the impact of individual consumer characteristics (environmental attitudes, consumer satisfaction), organizational arrangements constitutive for omnichannel retailing (channel integration), and their interplay (brand identification, impulsive consumption). A conceptual model was developed based on findings from previous research and tested on data that were collected online from Chinese fast fashion consumers. Findings suggest that consumers’ intentions for clothes recycling are mainly determined by individual factors, such as environmental attitudes and consumer satisfaction. Organizational arrangements (perceived channel integration) showed smaller effects. This study contributes to the literature on omnichannel (clothing) retail, as well as on sustainability in the clothing industry, by elucidating individual and organizational determinants of consumers’ recycling intentions for used clothes in an omnichannel environment. It helps retailers to organize used clothes recycling plans in an omnichannel environment and to motivate consumers to participate in them.
Evaluation of human-robot order picking systems considering the evolution of object detection
(2021)
The automation of intralogistic processes is a major trend, but order picking, one of the core and most cost-intensive tasks in this field, remains mostly manual due to the flexibility required during picking. Given its hard physical and ergonomic strain, however, automating this process is highly relevant. Robotic picking systems would enable the automation of this process from a technical point of view, but the necessity for such systems to evolve over time, owing to the dynamics of logistic environments, confronts operations with new challenges that are hardly treated in the literature. This unknown scares potential investors and hinders the application of technically feasible solutions. In this paper, a model for evaluating the additional cost of training automated systems during operations is presented that also considers the savings enabled by the system after its evolution. The proposed approach, which considers different parameters such as capacity, ergonomics and cost, is validated with a case study and discussed.
Compared to the automotive sector, where automation is the rule, in many other less standardized sectors automation is still the exception. This could soon hurt the productivity of industrialized countries, where unemployment is low and the population is aging. Phenomena like the recent drop in productivity, due to lockdowns and social distancing during the COVID-19 pandemic, only add to the problem. For these reasons, the relevance, motivation and intention for more automation in less standardized sectors have probably never been higher. However, available statistics show that providers and users of technologies struggle to bring more automation into action in automation-unfriendly sectors. In this paper, we present a decision support method for investment in automation that tackles this problem: the STIC analysis. The method takes a holistic and quantitative approach, tying together technological, context-related and economic input parameters and synthesizing them into a final economic indicator. Thanks to the modelling of these parameters, it is possible to identify the technological and/or process adjustments that would have the highest impact on the efficiency of the automation, thereby delivering value for both technology users and technology providers.
Digitalization increases the pressure on companies to innovate. While current research on digital transformation mostly focuses on technological and management aspects, less attention has been paid to organizational culture and its influence on digital innovations. The purpose of this paper is to identify the characteristics of organizational culture that foster digital innovations. Based on a systematic literature review across three scholarly databases, we initially found 778 articles, which were then narrowed down to 23 relevant articles through a methodical approach. After analyzing these articles, we determine nine characteristics of organizational culture that foster digital innovations: corporate entrepreneurship, digital awareness and necessity of innovations, digital skills and resources, ecosystem orientation, employee participation, agility and organizational structures, error culture and risk-taking, internal knowledge sharing and collaboration, and customer and market orientation as well as open-mindedness and willingness to learn.
Morphometry and stiffness of red blood cells—signatures of neurodegenerative diseases and aging
(2022)
Human red blood cells (RBCs) are unique cells with a remarkable ability to deform, which is crucial for their oxygen transport function and which can be significantly altered under pathophysiological conditions. Here we performed an ultrastructural analysis of RBCs as a peripheral cell model, looking for specific signatures of the neurodegenerative pathologies (NDDs) Parkinson's disease (PD), amyotrophic lateral sclerosis (ALS) and Alzheimer's disease (AD), utilizing atomic force (AFM) and conventional optical (OM) microscopy. We found significant differences in the morphology and stiffness of RBCs isolated from patients with the selected NDDs and those from healthy individuals. RBCs in neurodegenerative pathologies are characterized by a reduced abundance of the biconcave discoid shape, lower surface roughness and a higher Young's modulus compared to healthy cells. Although reduced, the biconcave form is still the predominant shape in ALS and AD cells, while the morphology of PD is dominated by crenate cells. The features of RBCs underwent a marked aging-induced transformation, which followed different aging pathways for NDDs and normal healthy states. The diameter, height and volume of the different cell shape types were found to differ between NDDs and healthy cells. Common and specific morphological signatures of the NDDs were identified.
The early detection of head and neck cancer is a long-standing, challenging task. It requires the precise and accurate identification of tissue alterations as well as a distinct discrimination of cancerous from healthy tissue areas. A novel approach for this purpose uses microspectroscopic techniques, with a special focus on hyperspectral imaging (HSI) methods. Our proof-of-principle study presents the implementation and application of darkfield elastic light scattering spectroscopy (DF ELSS) as a non-destructive, high-resolution and fast imaging modality to distinguish healthy from altered lingual tissue regions in a mouse model. The main aspect of our study is the comparison of two HSI detection principles, point-by-point and line scanning imaging, and whether one is more appropriate for differentiating several tissue types. Statistical models are formed by applying principal component analysis (PCA) with Bayesian discriminant analysis (DA) to the elastic light scattering (ELS) spectra. Overall accuracy, sensitivity and precision values of 98% are achieved for both models, whereas the overall specificity is 99%. An additional classification of model-unknown ELS spectra is performed. The predictions are verified with histopathological evaluations of identical HE-stained tissue areas to prove the models' capability of tissue distinction. In the context of our proof-of-principle study, we assess the pushbroom PCA-DA model to be more suitable for tissue differentiation and thus tissue classification. In addition to HE examination in head and neck cancer diagnosis, the use of HSI-based statistical models might be conceivable in daily clinical routine.
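The statistical pipeline named above, PCA followed by a Bayesian discriminant analysis on ELS spectra, can be sketched as follows; Gaussian quadratic discriminant analysis stands in for the Bayesian DA, and the spectra shapes and labels are synthetic placeholders:

```python
# Hedged sketch: PCA feature reduction followed by a Gaussian (Bayesian)
# discriminant classifier on elastic-light-scattering spectra.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)
X = rng.normal(size=(400, 512))   # 400 spectra, 512 wavelength bins (synthetic)
y = rng.integers(0, 2, size=400)  # 0 = healthy, 1 = altered tissue

# QDA applies Bayes' rule to Gaussian class models fitted on the PCA scores.
model = make_pipeline(PCA(n_components=10), QuadraticDiscriminantAnalysis())
print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())
```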
Gold bipyramids (AuBPs) attract significant attention due to the large enhancement of the electric field around their sharp tips and the well-defined tunability of their plasmon resonances. Excitation patterns of single AuBPs are recorded using raster-scanning confocal microscopy combined with radially and azimuthally polarized laser beams. Photoluminescence (PL) spectra and excitation patterns of the same AuBPs are acquired with three different excitation wavelengths. The isotropic excitation patterns suggest that the AuBPs are mainly excited via interband transitions with 488/530 nm radiation, while excitation patterns created with a 633 nm laser exhibit a double-lobed shape that indicates a single-dipole excitation process associated with the longitudinal plasmon resonance mode. We are able to determine the three-dimensional orientation of single AuBPs nonperturbatively by comparing experimental patterns with theoretical simulations. The asymmetric patterns show that the AuBPs lie on the substrate with an out-of-plane tilt angle of around 10–15°.
Allyls
(2022)
This chapter addresses the importance and usage of the commercially low-volume thermoset plastics group known as allyls. The three significant subelements of this group are poly(diallylphthalates), poly(diallylisophthalates), and poly(allyldiglycol carbonate). Chemistry, processing, and properties are also described. Allyl polymers are synthesized by radical polymerizations of allyl monomers that usually do not produce high-molecular-mass macromolecules. Therefore only a few specific monomers can produce thermosetting materials. Diallyldiglycolcarbonate (CR-39) and diallylphthalates are the most significant examples that have considerably improved our everyday life.
Unsaturated polyester resins (UPR) and vinyl ester resins (VER) are among the most commercially important thermosetting matrix materials for composites. Although comparatively low in cost, their technological performance is suitable for a wide range of applications, such as fiber-reinforced plastics, artificial marble or onyx, polymer concrete, or gel coats. The main areas of UPR consumption include the wind energy, marine, pipe and tank, transportation, and construction industries.
This chapter discusses basic UPR and VER chemistry and technology of manufacturing, and consequent applications. Some important properties and performance characteristics are discussed, such as shrinkage behavior, flame retardance, and property modification by nanoparticles. Also briefly introduced and described are the practical aspects of UPR and VER processing, with special emphasis on the most widely used technological approaches, such as hand and spray layup, resin infusion, resin transfer molding, sheet and bulk molding, pultrusion, winding, and centrifugal casting.
Cross-linked thermoplastics
(2022)
Cross-linked thermoplastics represent an important class of materials for numerous applications such as heat-shrinkable tubing, rotational molded parts, and polyolefin foams. By cross-linking olefins, their mechanical performance can be significantly enhanced. This chapter covers the three main methods for the cross-linking of thermoplastics: radiation cross-linking, chemical cross-linking with organic peroxides, and cross-linking using silane-grafting agents. It also considers the major effects of the cross-linking procedure on the performance of the thermoplastic materials discussed.
Silicones
(2022)
Silicones are found in a variety of applications with requirements that range from long life at elevated temperatures to fluidity at low temperatures. This chapter first considers silicone elastomers and their application in room temperature vulcanizing (RTV) and heat curing systems (HTV). Also, new technologies for UV curing are introduced. Coverage of RTVs includes both one-component and two-component systems and the different cure chemistries of each and is followed by a separate discussion of silicone laminates. Due to the high importance of silicone fluids, they are also discussed. Fluids include polishes, release agents, surfactants, and dielectric fluids.
Cyanate ester resins
(2022)
Cyanate ester resins are an important class of thermosetting compounds that experience an ever-increasing interest as matrix systems for advanced polymer composite materials, which among other application fields are especially suitable for highly demanding applications in the aerospace or microelectronics industries. Other names for cyanate ester resins are cyanate resins, cyanic esters, or triazine resins. The various types of cyanate ester monomers share the –OCN functional group that trimerizes in the course of resin formation to yield a highly branched heterocyclic polymeric network based on the substituted triazine core structure.
Process analysis and process control have attracted increasing interest in recent years. The development and application of process analytical methods are a prerequisite for the knowledge-based manufacturing of industrial goods and allow for the production of high-value products of defined, constantly good quality. Discussed in this chapter are the measurement principle and some relevant aspects and illustrative examples of online monitoring tools as the basis for process control in the manufacturing and processing of thermosetting resins. Optical spectroscopy is featured as one of the main process analytical methods applicable to, among other applications, online monitoring of resin synthesis. In combination with chemometric methods for multivariate data analysis, powerful process models can be generated within the framework of feedback and feed-forward control concepts. Other analytical methods covered in this chapter are those frequently used to control further processing of thermosets to the final parts, including dielectric analysis, ultrasonics, fiber optics, and Fiber Bragg Grating sensors.
Self-healing thermosets
(2022)
This chapter discusses the basic extrinsic, intrinsic, and combined extrinsic/intrinsic strategies for equipping thermosetting polymers with self-healing properties. The main focus is on a holistic optimization of thermosetting materials, that is, on the simultaneous optimization of both self-healing and other, specialized material properties. Due to their very rigid, highly cross-linked three-dimensional structure, thermosetting polymers require special chemical strategies to achieve self-healing properties. The main chemical strategies available for this are briefly outlined. The examples given illustrate interesting and/or typical procedures and serve as inspiration for finding solutions for one's own applications. They summarize important recent developments in research and technology aiming toward multifunctional, truly smart self-healing thermosetting materials. An important aspect in this topic area is also how precisely the self-healing effects are analytically verified, quantified, and evaluated. A range of measuring methods is available for this purpose. In this chapter, the most important analytical tools for testing self-healing properties are briefly introduced and highlighted with some illustrative examples.
Today, many industrial tasks are not automated and still require human intervention. One of these tasks is the unloading of oversea containers. After transportation to the sorting center, the containers must be unloaded manually so that the parcels can be sent on to their recipients. Robot-based automatic unloading of containers has therefore been researched. However, the promising results of the system developed in these projects could not be commercialized due to problems with its reliability. Mechanical, algorithmic or other limitations are possible causes of the observed errors. To analyze the errors, it is necessary to evaluate the results of the robot's work without complicating the existing system by adding new sensors. This paper presents a reference system based on machine learning to evaluate the robot's grasps of parcels. It analyzes two states of the container: before and after picking up one box. The states are represented as point clouds received from a laser scanner. The proposed system evaluates the success of transferring a box from an oversea container to the sorting line by supervised learning, using convolutional neural networks (CNNs) and manual labeling of the data. The process of obtaining a working model using a hyperband model search, with a maximum classification error of 3.9%, is also described.
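A hedged sketch of such a grasp-evaluation classifier: the before/after container states are assumed here to be rendered as two depth images stacked channel-wise, and a small CNN predicts whether the parcel transfer succeeded. The architecture and input shape are illustrative, not the system described in the paper:

```python
# Hedged sketch: binary CNN that judges a pick's success from the container
# state before and after the grasp, encoded as two stacked depth images.
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(128, 128, 2)),      # channel 0: before, channel 1: after
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # P(parcel transferred successfully)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()  # training would use manually labeled before/after scan pairs
```

A hyperparameter search such as hyperband, as mentioned in the abstract, would then tune the layer counts and sizes of such a network.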
Focal adhesion clusters (FAC) are dynamic and complex structures that help cells sense the physicochemical properties of their environment. Research in biomaterials, cell adhesion or cell migration often involves the visualization of FAC by fluorescence staining and microscopy, which necessitates the quantitative analysis of FAC and other cell features in microscopy images using image processing. Fluorescence microscopy images of human umbilical vein endothelial cells (HUVEC) obtained at 63x magnification were quantitatively analysed using the ImageJ software. A generalised algorithm for the selective segmentation and morphological analysis of FAC, nucleus and cell morphology is implemented. Further, a method for discriminating FAC near the nucleus from FAC around the periphery is implemented using masks. Our algorithm effectively quantifies different morphological characteristics of cell components and shows high sensitivity and specificity while providing a modular software implementation.
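A minimal sketch of the described quantification step, using scikit-image in place of the authors' ImageJ implementation; the threshold method, size filter and synthetic input are illustrative assumptions:

```python
# Hedged sketch: threshold-based segmentation of fluorescently labeled focal
# adhesions and simple per-object morphometry.
import numpy as np
from skimage import filters, measure, morphology

def quantify_fac(channel: np.ndarray) -> list:
    """Segment bright puncta and return per-object area and eccentricity."""
    thresh = filters.threshold_otsu(channel)                 # global threshold
    mask = morphology.remove_small_objects(channel > thresh, min_size=20)
    props = measure.regionprops(measure.label(mask))         # connected objects
    return [{"area": p.area, "eccentricity": p.eccentricity} for p in props]

# Synthetic stand-in for a 63x fluorescence image of the FAC channel.
img = np.random.default_rng(4).random((512, 512))
print(len(quantify_fac(img)), "objects detected")
```

The perinuclear/peripheral discrimination mentioned above would then amount to intersecting the object mask with dilated nucleus and cell-border masks.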
This paper presents a modular and scalable power electronics concept for motor control with continuous output voltage. In contrast to multilevel concepts, modules with continuous output voltage are connected in series. The continuous output voltage of each module is obtained by using gallium nitride (GaN) high electron mobility transistors (HEMTs) as switches inside the modules, with a switching frequency in the range between 500 kHz and 1 MHz. Due to this high switching frequency, an LC filter is integrated into each module, resulting in a continuous output voltage. A main topic of the paper is the active damping of this LC output filter for each module and the analysis of the damping behaviour of the series connection. The results are illustrated with simulations and measurements.
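The damping problem can be summarised with the textbook second-order model of an LC output filter: without damping, the filter resonates, and active damping emulates a series resistance R that sets the damping ratio. These relations are general circuit theory, not the paper's specific controller design:

```latex
% Second-order model of an LC low-pass filter with (emulated) series damping R:
\begin{align}
  H(s) &= \frac{1}{LC\,s^{2} + RC\,s + 1}, &
  f_{\mathrm{res}} &= \frac{1}{2\pi\sqrt{LC}}, &
  \zeta &= \frac{R}{2}\sqrt{\frac{C}{L}}
\end{align}
```

Because a physical resistor would dissipate power at these switching frequencies, the damping term is typically realised in the control loop rather than in hardware, which is what "active damping" refers to.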
Context: The manufacturing industry is facing a transformation with regard to Industry 4.0 (I4). A transformation towards the full automation of production, including a multitude of innovations, is necessary. Startups and entrepreneurial processes can support such a transformation, as has been shown in other industries. However, I4 has some specifics, so it is unclear how entrepreneurship can be adapted to I4. Understanding these specifics is important in order to develop suitable training programs for I4 startups and to accelerate the transformation.
Objective: This study identifies and outlines the essential characteristics and constraints of entrepreneurial processes in I4.
Method: 14 semi-structured interviews were conducted with experts in the field of I4 entrepreneurship. The interviews were analyzed and categorized using qualitative analysis.
Results: The interviews revealed several characteristics of I4 that have a significant impact on the various phases of the entrepreneurial process. Examples of such specifics include the difficult access to customers, the necessary deep understanding of the customer and the domain, the difficulty of testing risky assumptions, and the complex development and productization of solutions. The complexity of hardware and software components, cost structures, and necessary customer-specific customizations affect the scalability of I4 startups. These essential characteristics also require specialised skills and resources from I4 startups.
Soft lithography, a tool widely applied in biology and the life sciences, uses the soft molding of photolithography-generated master structures by polymers. The central part of a photolithography set-up is a mask aligner, mostly based on a high-pressure mercury lamp as the ultraviolet (UV) light source. This type of light source requires a high level of maintenance and shows decreasing intensity over its lifetime, influencing the lithography outcome. In this paper, we present a low-cost, bench-top photolithography tool based on ninety-eight 375 nm light-emitting diodes (LEDs). At approx. 10 W, our lithography set-up requires only a fraction of the energy of a conventional lamp. The LEDs have a guaranteed lifetime of 1000 h, which translates into at least 2.5 to 15 times more exposure cycles compared to a standard light source, and at a cost of less than €850 the set-up is very affordable. Such a set-up is attractive not only to small academic and industrial fabrication facilities that want to work with photolithography but cannot afford a conventional set-up; microfluidic teaching laboratories and microfluidic research and development laboratories in general could also benefit from this cost-effective alternative. With our self-built photolithography system, we were able to produce structures from 6 μm to 50 μm in height and 10 μm to 200 μm in width. As an optional feature, we present a scaled-down laminar flow hood to provide a dust-free working environment for the photolithography process.
This book provides an introduction to the fundamentals of software engineering. Its focus is on systematic, model-based software and systems development, but also on the use of agile methods. The authors place particular emphasis on treating practical aspects and the underlying theories on an equal footing, which makes the book equally suitable as a reference and as a textbook. Software engineering is described comprehensively within a systematic framework. Selected, mutually aligned concepts and methods are presented in an integrated, end-to-end manner.
Software is an integral part of new features in the automotive sector. To determine software quality, car manufacturers in the Hersteller Initiative Software (HIS) consortium defined a set of metrics. Yet, problems with assigning metrics to quality attributes often occur in practice. The specified boundary values lead to discussions between contractors and clients, as different standards and metric sets are used. This paper studies metrics used in the automotive sector and the quality attributes they address. The HIS, ISO/IEC 25010:2011, and ISO/IEC 26262:2018 are utilized to draw a big picture illustrating (i) which metrics and boundary values are reported in the literature, (ii) how the metrics match the standards, (iii) which quality attributes are addressed, and (iv) how the metrics are supported by tools. Our findings from analyzing 38 papers include a catalog of 112 metrics, of which 17 define boundary values and 48 are supported by tools. Most of the metrics are concerned with source code, are generic, and are not specifically designed for automotive software development. We conclude that many metrics exist, but a clear definition of the metrics' context, notably regarding the construction of flexible and efficient measurement suites, is missing.
Context:
Test-driven development (TDD) is an agile software development approach that has been widely claimed to improve software quality. However, the extent to which TDD improves quality appears to be largely dependent upon the characteristics of the study in which it is evaluated (e.g., the research method, participant type, programming environment, etc.). The particularities of each study make the aggregation of results untenable.
Objectives:
The goal of this paper is to: increase the accuracy and generalizability of the results achieved in isolated experiments on TDD, provide joint conclusions on the performance of TDD across different industrial and academic settings, and assess the extent to which the characteristics of the experiments affect the quality-related performance of TDD.
Method:
We conduct a family of 12 experiments on TDD in academia and industry. We aggregate their results by means of meta-analysis. We perform exploratory analyses to identify variables impacting the quality-related performance of TDD.
Results:
TDD novices achieve a slightly higher code quality with iterative test-last development (i.e., ITL, the reverse approach of TDD) than with TDD. The task being developed largely determines quality. The programming environment, the order in which TDD and ITL are applied, or the learning effects from one development approach to another do not appear to affect quality. The quality-related performance of professionals using TDD drops more than for students. We hypothesize that this may be due to their being more resistant to change and potentially less motivated than students.
Conclusion:
Previous studies seem to provide conflicting results on TDD performance (i.e., positive vs. negative, respectively). We hypothesize that these conflicting results may be due to different study durations, experiment participants being unfamiliar with the TDD process, or case studies comparing the performance achieved by TDD vs. the control approach (e.g., the waterfall model), each applied to develop a different system. Further experiments with TDD experts are needed to validate these hypotheses.
Fault diagnosis of rolling bearings is an essential process for improving the reliability and safety of rotating machinery. Ensuring fault diagnosis accuracy remains a major challenge, particularly under severe working conditions. In this article, a deep adversarial domain adaptation (DADA) model is proposed for rolling bearing fault diagnosis. The model constructs an adversarial adaptation network to solve a problem commonly encountered in numerous real applications: the source domain and the target domain are inconsistent in their distributions. First, a deep stacked autoencoder (DSAE) is combined with representative feature learning for dimensionality reduction; this combination provides an unsupervised learning method to effectively acquire fault features. Meanwhile, domain adaptation and recognition classification are implemented using a softmax classifier to augment classification accuracy. Second, the effects of the number of hidden layers in the stacked autoencoder network, the number of neurons in each hidden layer, and the hyperparameters of the proposed fault diagnosis algorithm are analyzed. Third, a comprehensive analysis is performed on real data to validate the performance of the proposed method; the experimental results demonstrate that the new method outperforms existing machine learning and deep learning methods in terms of classification accuracy and generalization ability.
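A hedged sketch of one building block of the described model: a stacked autoencoder pre-trained for unsupervised feature reduction, whose encoder is then reused under a softmax classifier for fault recognition. The adversarial domain-adaptation stage is omitted, and the layer sizes and class count are illustrative assumptions:

```python
# Hedged sketch: stacked autoencoder pre-training plus softmax fault classifier.
import tensorflow as tf
from tensorflow.keras import layers, models

inp = layers.Input(shape=(1024,))              # e.g. a vibration spectrum segment
h = layers.Dense(256, activation="relu")(inp)
code = layers.Dense(64, activation="relu")(h)  # low-dimensional fault features
dec = layers.Dense(256, activation="relu")(code)
out = layers.Dense(1024)(dec)

autoencoder = models.Model(inp, out)
autoencoder.compile(optimizer="adam", loss="mse")  # unsupervised pre-training

# Reuse the trained encoder and attach a softmax head for classification.
clf_out = layers.Dense(10, activation="softmax")(code)  # 10 bearing conditions
classifier = models.Model(inp, clf_out)
classifier.compile(optimizer="adam",
                   loss="sparse_categorical_crossentropy",
                   metrics=["accuracy"])
classifier.summary()
```

In the full DADA scheme, an additional domain discriminator would be trained adversarially against this encoder so that source- and target-domain features become indistinguishable.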
In the era of precision medicine, digital technologies and artificial intelligence, drug discovery and development face unprecedented opportunities for product and business model innovation, fundamentally changing the traditional approach of how drugs are discovered, developed and marketed. Critical to this transformation is the adoption of new technologies in the drug development process, catalyzing the transition from serendipity-driven to data-driven medicine. This paradigm shift comes with a need for both translation and precision, leading to a modern Translational Precision Medicine approach to drug discovery and development. Key components of Translational Precision Medicine are multi-omics profiling, digital biomarkers, model-based data integration, artificial intelligence, biomarker-guided trial designs and patient-centric companion diagnostics. In this review, we summarize and critically discuss the potential and challenges of Translational Precision Medicine from a cross-industry perspective.
By 2019, Germany-based Kärcher, “the world’s leading provider of cleaning technology,” had turned its professional cleaning devices into IoT products. The data generated by these IoT-connected cleaning devices formed a key ingredient in the company’s ongoing strategic shift in its B2B business: Kärcher was transforming from a seller of cleaning devices into a provider of consulting services that help professional cleaning companies improve their cleaning processes. Based on interviews with seven IT and non-IT executives, the case illustrates how the company learned to generate value from IoT products. It also demonstrates how a family-owned company transformed its organization to develop and provide IoT products more effectively, adding roles, developing technology platforms, and changing organizational structures and ways of working.
Intermittent time series forecasting is a challenging task that still needs particular attention from researchers. The more irregularly events occur, the more difficult it is to predict them. Croston’s approach of 1972 (Operational Research Quarterly 23(3):289–303) was the first to investigate the intermittence and the demand of a time series separately. He applies exponential smoothing to generate a forecast that corresponds to the average demand per period. Although this algorithm produces good results in the field of stock control, it does not capture the typical characteristics of intermittent time series in the final prediction. In this paper, we investigate a time series’ intermittence and demand individually, forecast the upcoming demand value and inter-demand interval length using recent machine learning algorithms such as long short-term memory networks (LSTMs) and light gradient boosting machines (LightGBM), and recombine both results to generate a prediction that preserves the characteristics of an intermittent time series. We compare the results against Croston’s approach as well as against recent forecasting procedures in which no split is performed.
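Croston's baseline itself fits in a few lines. The plain Python/NumPy sketch below smooths demand sizes and inter-demand intervals separately and returns their ratio as the average demand per period; the smoothing constant alpha and the initialization are common textbook choices, not values prescribed by this paper.

import numpy as np

def croston(y, alpha=0.1):
    # Per-period demand forecast for an intermittent series y (Croston 1972).
    y = np.asarray(y, dtype=float)
    nonzero = np.flatnonzero(y)
    if nonzero.size == 0:
        return 0.0
    z = y[nonzero[0]]       # smoothed demand size
    p = nonzero[0] + 1.0    # smoothed inter-demand interval
    q = 0                   # periods elapsed since the last demand
    for t in range(nonzero[0] + 1, len(y)):
        q += 1
        if y[t] > 0:        # both estimates are updated only when demand occurs
            z = alpha * y[t] + (1 - alpha) * z
            p = alpha * q + (1 - alpha) * p
            q = 0
    return z / p            # average demand per period

print(croston([0, 3, 0, 0, 2, 0, 4, 0, 0, 0, 5]))  # ~1.46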
Context: Agile practices as well as UX methods are nowadays well known and often adopted to develop complex software and products more efficiently and effectively. However, in the so-called VUCA (volatile, uncertain, complex, ambiguous) environment that many companies are confronted with, UX research alone is not sufficient to find the best solutions for customers. Implementing Design Thinking can support this process, but many companies and their product owners do not know how many resources they should spend on conducting Design Thinking.
Objective: This paper suggests a supporting tool, the “Discovery Effort Worthiness (DEW) Index”, that helps product owners and agile teams determine a suitable amount of effort to spend on Design Thinking activities.
Method: A case study was conducted to develop the DEW Index. Design Thinking was introduced into the regular development cycle of a Scrum team in industry. With the support of UX and Design Thinking experts, a formula was developed to determine the appropriate effort for Design Thinking.
Results: The developed “Discovery Effort Worthiness Index” provides an easy-to-use tool for companies and their product owners to determine how much effort they should spend on Design Thinking methods to discover and validate requirements. A company can map the corresponding Design Thinking methods to the results of the DEW Index calculation, and product owners can select the appropriate measures from this mapping. They can thereby optimize the effort spent on discovery and validation.
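The actual DEW formula and the method mapping are developed in the paper and are not reproduced in this abstract. Purely as a hypothetical illustration of the intended workflow (score an item, then look up discovery measures), the Python sketch below uses invented inputs, weights, and thresholds; none of them come from the study.

def dew_index(uncertainty: float, expected_value: float, capacity_days: float) -> float:
    # Hypothetical score: more uncertainty and more value at stake justify
    # more discovery effort, relative to the team's available capacity.
    return (uncertainty * expected_value) / max(capacity_days, 1.0)

def recommend_discovery(score: float) -> list[str]:
    # Hypothetical mapping table from score bands to Design Thinking measures.
    if score < 0.5:
        return ["desk research only"]
    if score < 1.5:
        return ["user interviews", "low-fidelity prototype"]
    return ["full Design Thinking cycle", "usability tests"]

print(recommend_discovery(dew_index(uncertainty=0.8, expected_value=5.0, capacity_days=3.0)))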
Context: The software-intensive business is characterized by increasing market dynamics, rapid technological changes, and fast-changing customer behaviors. Organizations face the challenge of moving away from traditional roadmap formats to an outcome-oriented approach that focuses on delivering value to the customer and the business. An important starting point and a prerequisite for creating such outcome-oriented roadmaps is the development of a product vision to which internal and external stakeholders can be aligned. However, the process of creating a product vision is little researched and understood.
Objective: The goal of this paper is to identify lessons learned from product vision workshops that were conducted to develop outcome-oriented product roadmaps.
Method: We conducted a multiple-case study consisting of two product vision workshops in two different corporate contexts.
Results: Our results show that conducting product vision workshops helps to create a common understanding among all stakeholders about the future direction of the products. In addition, we identified key organizational aspects that contribute to the success of product vision workshops, including the participation of employees from functionally different departments.
Context: Many companies are facing an increasingly dynamic and uncertain market environment in which traditional product roadmapping practices are no longer sufficient. As a result, many companies need to adapt their product roadmapping practices to continue operating successfully in today’s dynamic market environment. However, transforming product roadmapping practices is a difficult process for organizations, and existing literature offers little help on how to accomplish it.
Objective: The objective of this paper is to present a product roadmap transformation approach that helps organizations identify appropriate improvement actions for their roadmapping practices based on an analysis of their current practices.
Method: Based on an existing assessment procedure for evaluating product roadmapping practices, a first version of a product roadmap transformation approach was developed in workshops with company experts. The approach was then presented to eleven practitioners, whose perceptions of it were gathered through interviews.
Results: The result of the study is a transformation approach consisting of a process that describes the steps necessary to adapt a currently applied product roadmapping practice to a dynamic and uncertain market environment. It also includes recommendations on how to select areas for improvement, along with two empirically based mapping tables. The interviews with the practitioners revealed that the product roadmap transformation approach was perceived as comprehensible, useful, and applicable. Nevertheless, we identified potential for improvement, such as a clearer presentation of some process steps and the need for more improvement options in the mapping tables. In addition, minor usability issues were identified.