Year of publication
- 2021 (297)
Document Type
- Journal article (158)
- Conference proceeding (86)
- Book chapter (29)
- Book (7)
- Report (5)
- Doctoral Thesis (3)
- Anthology (3)
- Issue of a journal (2)
- Working Paper (2)
- Patent / Standard / Guidelines (1)
Is part of the Bibliography
- yes (297)
Institute
- ESB Business School (113)
- Informatik (83)
- Life Sciences (44)
- Technik (37)
- Texoversum (12)
- Zentrale Einrichtungen (5)
Publisher
- Springer (61)
- Elsevier (31)
- MDPI (24)
- IEEE (17)
- Wiley (11)
- De Gruyter (10)
- American Chemical Society (8)
- VDE Verlag (7)
- Hochschule Reutlingen (6)
- Association for Information Systems (5)
Effective risk management should take into account not only quantifiable, known risks but also events that have either already occurred in a similar form or are fundamentally conceivable. Identifying these "gray swans" requires establishing institutional and organizational preconditions and providing analytical and conceptual instruments.
This article develops a management model that supports companies in identifying relevant fields of action for the sustainable steering of consumers along their own customer journey. Building on the SHIFT model as a structural representation of sustainable buyer behavior, the customer journey is laid out along owned, paid, and earned touchpoints. Using the factual-analytical approach, which supports the integration of new insights into the research strategy, fields of action are identified that are intended, as a basic logic, to guide companies in adopting this structural grid when designing their own sustainable customer journey.
Problem: The Covid-19 pandemic is aggravating not only the economic but also the ecological and social conditions of many companies. Sustainable action is therefore more important than ever. Companies choose different ways to integrate sustainability into the management system of the top leadership level. This creates the opportunity to see sustainability not merely as a set of individual measures but as an element of strategy and organizational development. For a holistic view, the Common Good Balance Sheet (Gemeinwohlbilanz, GWB) and the Sustainability Balanced Scorecard (N-BSC) come into consideration, as shown by the examples of Vaude and Sparda Bank München, which use the GWB (see https://web.ecogood.org/de/die-bewegung/pionier-unternehmen/), as well as Alpha and Axel Springer, which integrate sustainability into their BSC (Hansen/Schaltegger, 2016, p. 207).
Objective: Discussion of the GWB and the N-BSC as options for integrating ecological and social aspects into the management system.
Method: Presentation of the essential features of the GWB and the N-BSC.
Textile construction has been a growing sector of the textile industry for many years. The use of textile materials opens up new design possibilities, not least for architecture, that cannot be realized with conventional building materials. Well-known examples of textile structures include large sports arenas, railway stations, and airports. On the one hand, lightweight construction and the at least partial transparency of these structures are outstanding properties; on the other hand, such buildings place special demands on climate and energy management. The interior can heat up considerably under solar irradiation, since, in addition to visible light, a large proportion of the infrared component of solar radiation can be transmitted. In conventional construction, high requirements already exist for the energy-related design of buildings, which are met, among other things, by efficient thermal insulation. This is usually achieved with voluminous, open-pored insulation materials. The primary aim is to reduce the loss of heat from the interior; at the same time, materials with low thermal conductivity, high mass, and high specific heat capacity can buffer temperature peaks in summer. Energy efficiency is an important aspect of textile construction as well. However, the use of heavy insulation materials contradicts the idea of flexible, lightweight textile construction.
Despite the low-interest-rate environment, working capital management remains an important driver of value metrics in companies and an important management instrument. Our results for 115 companies from the most important German indices in the years 2011 to 2017 show that effective working capital management can have a positive influence on profitability and company value. At the same time, our results also show that working capital management has recently received less attention and that digital innovations are presumably not yet being used to increase efficiency to the extent that appears possible. Even against the background of persistently low capital market interest rates, this must be viewed critically.
With this strategy paper, the university, state, and higher-education libraries of the state of Baden-Württemberg formulate what they regard as the central fields of development and challenges of the coming years. As academic and cultural institutions, the libraries and the Bibliotheksservice-Zentrum Baden-Württemberg (BSZ) jointly provide the academic information infrastructure. They embrace the challenges of digitization and actively shape this transformation in dialogue with researchers, teachers, and students.
This book is about the challenges that emerge for organizations from an ever faster changing world. While useful in their time, several management tools, including classic strategic planning processes, will no longer suffice to address these challenges in a timely and comprehensive fashion. While individual management tools are still valid for solving specific problems, they need to be employed based on a clear understanding of what the greater challenge is and how they need to be combined and prioritized with other approaches. In order to do so, companies can apply the military's clarity of thinking with regard to which leadership level is responsible for what and how these levels need to interact in order to produce a single, aligned response to an outside opportunity or threat. Finally, the tool of business wargaming, while known for some time, proves to be an ideal approach to quickly and effectively bring all leadership levels together, align them around a common objective, and lay the groundwork for effective implementation of targeted responses that will keep the organization competitive and in the game for the long run. The book offers a comprehensive introduction to business wargaming, including a historical account, a classification of different types of games, and a number of specific real-world examples. This book is targeted at practicing managers dealing with the aforementioned challenges, as well as at students of business and strategy at every level.
At literally the last minute, the British government and the European Union agreed on a comprehensive accord to prevent an unregulated Brexit. After years of tough marathon negotiations, the jubilation is muted; nevertheless, there is relief on both sides of the English Channel because a modus vivendi has been found on which future relations can be built and continued. Whether the English pipe dreams attached to Brexit will be fulfilled remains to be seen.
The strategy and tactics of the British governments regarding Brexit and the withdrawal negotiations mirror the experiences Friedrich List had exactly 175 years ago in his efforts to forge a German-English alliance. Because of the insular and trade supremacy that England strictly pursued even then, he had to concede that England would defend this position tenaciously, and, frustrated and disillusioned, he abandoned his plans. He therefore placed his hopes in a "continental alliance" of the European nations, such as has now emerged after the withdrawal of the United Kingdom from the European Union. Perhaps we will now have to get used to the term "continental alliance" and be reminded of Friedrich List's foresight.
On the other hand, the motto of List's second Paris prize essay also applies to English politics: "Le monde marche - the world moves on", albeit under completely different circumstances than 175 years ago: the axis of world trade has shifted from the western to the eastern hemisphere; the British Empire is history; the pace of global change has accelerated dramatically; and despite the lingua franca, England appears, especially from an Asian perspective, as no more than a small dot on the world map. Should the Scottish government push through its intention and achieve independence from the United Kingdom, Brexit would prove to be a fateful boomerang.
Several studies analyzed existing Web APIs against the constraints of REST to estimate the degree of REST compliance among state-of-the-art APIs. These studies revealed that only a small number of Web APIs are truly RESTful. Moreover, identified mismatches between theoretical REST concepts and practical implementations lead us to believe that practitioners perceive many rules and best practices aligned with these REST concepts differently in terms of their importance and impact on software quality. We therefore conducted a Delphi study in which we confronted eight Web API experts from industry with a catalog of 82 REST API design rules. For each rule, we let them rate its importance and software quality impact. As consensus, our experts rated 28 rules with high, 17 with medium, and 37 with low importance. Moreover, they perceived usability, maintainability, and compatibility as the most impacted quality attributes. The detailed analysis revealed that the experts saw rules for reaching Richardson maturity level 2 as critical, while reaching level 3 was less important. As the acquired consensus data may serve as valuable input for designing a tool-supported approach for the automatic quality evaluation of RESTful APIs, we briefly discuss requirements for such an approach and comment on the applicability of the most important rules.
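As an illustration of what a tool-supported rule check could look like, the following sketch (our own hypothetical example, not the authors' catalog or tool) tests one commonly cited REST design rule, namely that URI paths should not contain CRUD verbs:

```python
import re

# Illustrative sketch: checking one common REST design rule
# ("no CRUD verbs in URI paths") against candidate endpoints.
CRUD_VERBS = {"get", "create", "update", "delete", "add", "remove"}

def violates_verb_rule(path: str) -> bool:
    """Return True if any path segment is, or starts with, a CRUD verb."""
    segments = [s for s in path.lower().split("/") if s and not s.startswith("{")]
    return any(seg in CRUD_VERBS
               or re.match(r"^(get|create|update|delete)[a-z]", seg) is not None
               for seg in segments)

print(violates_verb_rule("/users/{id}/orders"))  # resource-oriented path: False
print(violates_verb_rule("/getUserOrders"))      # verb in path: True
```

A real quality-evaluation tool would of course need one such check per rule, plus weighting by the importance ratings the study elicited.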
Together with many success stories, promises such as the increase in production speed and the improvement in stakeholders' collaboration have contributed to making agile a transformation in the software industry in which many companies want to take part. However, driven either by a natural and expected evolution or by contextual factors that challenge the adoption of agile methods as prescribed by their creator(s), software processes in practice mutate into hybrids over time. Are these still agile? In this article, we investigate the question: what makes a software development method agile? We present an empirical study grounded in a large-scale international survey that aims to identify software development methods and practices that improve or tame agility. Based on 556 data points, we analyze the perceived degree of agility in the implementation of standard project disciplines and its relation to the used development methods and practices. Our findings suggest that only a small number of participants operate their projects in a purely traditional or agile manner (under 15%). That said, most project disciplines and most practices show a clear trend towards increasing degrees of agility. Compared to the methods used to develop software, the selection of practices has a stronger effect on the degree of agility of a given discipline. Finally, there are no methods or practices that explicitly guarantee or prevent agility. We conclude that agility cannot be defined solely at the process level. Additional factors need to be taken into account when trying to implement or improve agility in a software company. Finally, we discuss the field of software process-related research in the light of our findings and present a roadmap for future research.
The aim of the project is to improve the protective effect of welders' protective clothing. The focus was on the questions: Can a finishing treatment increase the resistance of the textiles to drops of liquid metal while at the same time providing better UV protection? These protective factors of welders' protective clothing depend strongly on the areal weight of the textile used. The higher the areal weight, the more resistant the clothing is to metal spatter and the less UV radiation passes through it. However, the higher the areal weight, the poorer the wearing comfort, since a high areal weight promotes sweating, among other things. Welders' protective clothing is divided into two classes. For Class 1 clothing, a temperature rise of 40 K on the back of the textile may occur only after the 15th drop of liquid iron has hit; for Class 2, only after 25 drops. As a starting point for this project, fabrics meeting Class 1 were selected. An attempt was made to finish these fabrics either with thermally conductive composites or with a nanostructuring ("lotus effect") so that the requirements for Class 2 are met. The thermally conductive composite finish was intended to guarantee rapid dissipation and distribution of the heat of the metal drops across the surface, thereby significantly slowing the heating of the back of the fabric. Class 2 could not be achieved with this finish; however, it did not impair the wearing comfort of the lighter fabric, and the transmission of harmful UV radiation was reduced. Nanostructuring was intended to achieve a "lotus effect" for small metal drops.
Owing to the nanostructuring, a metal drop first hits the surface of the nanoparticles, trapping insulating air between the drop and the fabric surface and thus protecting the fabric from the drop itself. This suggests that the effect can be tuned well via the applied amount of nanoparticles and binder. With binder concentrations between 1.25 and 2.5%, flexibility is only slightly impaired, and welding protection Class 2 can be achieved with different particles (SiO2, ZnO, AlOx, and TiO2). The wearing comfort of the fabrics is not affected. The process offers SMEs in the textile-finishing sector new, innovative products for the occupational safety sector. The use of lighter clothing in the area of PPE (personal protective equipment) increases its acceptance, since Class 1 welding protective clothing nanostructured with the process developed in the project offers significantly better wearing comfort than conventional Class 2 welding protective clothing. This can open up new, including international, markets for SMEs specializing in the PPE sector.
We all know the challenges of the new world of work. People have to learn to cope in the so-called VUCA world. The term is an acronym for volatility, uncertainty, complexity, and ambiguity. This requires rapid adaptation to changes in the work context. Agility is a central factor here: on the one hand very energizing, on the other also demanding, because habits and ways of working have to change.
Changes in the role of controllers in large corporations - results of an empirical survey
(2021)
The ongoing discussion about the role of management accountants (MAs) frequently leads to the business partner (BP) role being regarded as the role of choice. Yet many academics and practitioners seem to assume that this role is clear to managers and MAs, that it makes sense to them, and that all managers and MAs agree with it and implement it. Discrepancies between the actual, the perceived, and the expected role could lead to identity and role conflicts. This article is based on a quantitative empirical study conducted in 2019 in a large German high-tech company whose top management decided to introduce the BP role.
Tool wear in machining with geometrically defined cutting edges is an essential criterion for the quality of the machined workpieces, the reliability of the machining processes, and cost-effectiveness. The cost-effectiveness of machining is influenced above all by the number of workpieces that can be reliably machined with one tool. The tool life and cutting distance of the tools, as well as the applicable process parameters, depend on various factors: besides the tool and its engagement conditions (e.g., axial and radial depth of cut), they also depend on influences from the machine (e.g., stiffness, natural frequencies, torque), the workpiece (e.g., material, accuracies), and the machining process with its forces, torques, speeds, and feeds. Despite various efforts over the past two decades toward machining without cutting fluid or with minimum-quantity lubrication, numerous machining processes are still carried out with cutting fluid today. Owing to the lower thermal load on tool and workpiece, this allows in some cases considerably higher cutting conditions and/or tool lives.
Hyperspectral imaging and reflectance spectroscopy in the range of 200–380 nm were used to rapidly detect and characterize copper oxidation states and their layer thicknesses on direct bonded copper in a non-destructive way. Single-point UV reflectance spectroscopy, as a well-established method, was utilized to benchmark the quality of the hyperspectral imaging results. For the laterally resolved measurements of the copper surfaces, a UV hyperspectral imaging setup based on a pushbroom imager was used. Six different types of direct bonded copper were studied. Each type had a different oxide layer thickness and was analyzed by depth profiling using X-ray photoelectron spectroscopy. In total, 28 samples were measured to develop multivariate models to characterize and predict the oxide layer thicknesses. The principal component analysis (PCA) models enabled a general differentiation between the sample types on the first two PCs, with 100.0% and 96% explained variance for UV spectroscopy and hyperspectral imaging, respectively. Partial least squares regression (PLS-R) models showed reliable performance, with R2c = 0.94 and 0.94 and RMSEC = 1.64 nm and 1.76 nm, respectively. The developed in-line prototype system combined with multivariate data modeling shows high potential for further development of this technique towards real large-scale processes.
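The PCA step described above can be sketched as follows. This is a minimal, self-contained illustration on synthetic spectra; the wavelength grid, spectral shape, and sample counts are our assumptions, not the paper's measurements:

```python
import numpy as np

# Illustrative sketch: PCA of reflectance spectra via SVD, reporting explained
# variance per component, analogous to separating sample types by oxide thickness.
rng = np.random.default_rng(0)
wavelengths = np.linspace(200, 380, 50)          # nm, matching the study's UV range
base = np.exp(-((wavelengths - 290) / 60) ** 2)  # generic synthetic spectral shape

# Six synthetic "types" scale the spectrum, mimicking different oxide thicknesses.
X = np.vstack([base * (1 + 0.1 * t) + 0.01 * rng.standard_normal(50)
               for t in range(6) for _ in range(5)])

Xc = X - X.mean(axis=0)                          # mean-center before PCA
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)              # explained variance ratios
print(f"PC1 explains {explained[0]:.1%} of variance")
```

With near one-dimensional structure in the synthetic data, almost all variance lands on PC1, mirroring the clean type separation the study reports on the first two PCs.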
The paper explains a workflow to simulate the food-energy-water (FEW) nexus for an urban district, combining various data sources such as 3D city models (particularly the City Geography Markup Language, CityGML, data model from the Open Geospatial Consortium), OpenStreetMap, and census data. A long-term vision is to extend the CityGML data model by developing a FEW Application Domain Extension (FEW ADE) to support future FEW simulation workflows such as the one explained in this paper. Together with the mentioned simulation workflow, this paper also identifies some necessary FEW-related parameters for the future development of a FEW ADE. Furthermore, relevant key performance indicators are investigated, and the datasets necessary to calculate these indicators are studied. Finally, different calculations are performed for the downtown borough Ville-Marie in the city of Montréal (Canada) for the domains of food waste (FW) and wastewater (WW) generation. For this study, a workflow is developed to calculate the energy generation from anaerobic digestion of FW and WW. In the first step, data collection and preparation were carried out: relevant data for georeferencing, data for the model set-up, and data for creating the required usage libraries, such as food waste and wastewater generation per person, were collected. The next step was the data integration and the calculation of the relevant parameters; lastly, the results were visualized for analysis purposes. As a use case to support such calculations, the CityGML Level of Detail 2 (LoD2) model of Montréal is enriched with information such as building functions and building usages from OpenStreetMap. The calculation of the total residents based on the CityGML model as the main input for Ville-Marie results in a population of 72,606. The statistical value for 2016 was 89,170, which corresponds to a deviation of 15.3%.
The energy recovery potential of FW is about 24,024 GJ/year, and that of wastewater is about 1,629 GJ/year, adding up to 25,653 GJ/year. Relating values to the calculated number of inhabitants in Ville-Marie results in 330.9 kWh/year for FW and 22.4 kWh/year for wastewater, respectively.
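As a quick sanity check, the reported annual totals can be reproduced and converted to kWh. The GJ values are taken directly from the abstract; the conversion factor (1 GJ = 277.78 kWh) is the standard SI relation:

```python
# Arithmetic check of the reported energy recovery totals.
FW_GJ, WW_GJ = 24_024, 1_629       # GJ/year, food waste and wastewater (from abstract)
total_gj = FW_GJ + WW_GJ           # should reproduce the reported 25,653 GJ/year
kwh_per_gj = 1e9 / 3.6e6           # 1 GJ = 1e9 J, 1 kWh = 3.6e6 J
total_kwh = total_gj * kwh_per_gj
print(f"total: {total_gj} GJ/year = {total_kwh / 1e6:.2f} GWh/year")
```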
Avatars are used when interacting in virtual environments in different contexts: in collaborative work, in gaming, and in virtual meetings with friends. It is therefore important to understand how the relationship between user and avatar works. In this study, an online survey is used to determine how the perception of an avatar changes across contexts by relating it to existing avatar relationship typologies. Additionally, it is determined whether a realistic, abstract, or comic-like representation is preferred by the participants in each context. One result was a preference for low-poly representations in the work context, which are associated with the perception of the avatar as a tool. In the context of meeting friends, a realistic representation is perceived as more appropriate and as an accurate self-representation. In the gaming context, the results are less clear, which can be attributed to different gaming preferences. Here, unlike in the other contexts, a comic-like representation is also perceived as appropriate, which is associated with the perception of the avatar as a friend. A symbiotic user-avatar relationship is not directly related to any form of representation but always lies in the midfield, which is attributed to the fact that it represents a whole spectrum between the other categories.
Many of us have probably sat through a training event that seemed to deliver little. Continuing education succeeds when events are specifically aligned with and planned around the wishes and needs of the users. But how can the personal needs of users of continuing-education offerings be determined and systematically incorporated into the design of the offering?
The handbook helps to develop continuing-education events that motivate participants and ensure the transfer of knowledge and competencies. The authors have developed a set of methods that involves participants already in the development of educational offerings and thus enables an effective implementation of participant orientation in continuing education. In the book, they describe their approach and provide a concise set of instructions.
To correctly assess the cleanliness of technical surfaces in a production process, corresponding online monitoring systems must provide sufficient data. A promising method for fast, large-area, and non-contact monitoring is hyperspectral imaging (HSI), which was used in this paper for the detection and quantification of organic surface contaminations. Depending on the cleaning parameter constellation, different levels of organic residues remained on the surface. Afterwards, the cleanliness was determined by the carbon content in atom percent on the sample surfaces, characterized by XPS and AES. The HSI data and the XPS measurements were correlated using machine learning methods to generate a predictive model for the carbon content of the surface. The regression algorithms elastic net, random forest regression, and support vector machine regression were used. Overall, the developed method was able to quantify organic contaminations on technical surfaces. The best regression model found was a random forest model, which achieved an R2 of 0.7 and an RMSE of 7.65 At.-% C. Due to the easy-to-use measurement and the fast evaluation by machine learning, the method seems suitable for an online monitoring system. However, the results also show that further experiments are necessary to improve the quality of the prediction models.
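The two reported evaluation metrics, R2 and RMSE, can be computed as sketched below. The predicted and measured carbon contents here are hypothetical placeholders, not the paper's data:

```python
import numpy as np

# Sketch of the evaluation metrics reported in the paper (R^2 and RMSE),
# applied to hypothetical predicted vs. XPS-measured carbon contents in At.-% C.
def r2_score(y_true, y_pred):
    ss_res = np.sum((y_true - y_pred) ** 2)           # residual sum of squares
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)  # total sum of squares
    return float(1 - ss_res / ss_tot)

def rmse(y_true, y_pred):
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

y_true = np.array([5.0, 12.0, 20.0, 33.0, 41.0])  # hypothetical XPS values, At.-% C
y_pred = np.array([7.0, 10.0, 24.0, 30.0, 44.0])  # hypothetical model predictions
print(r2_score(y_true, y_pred), rmse(y_true, y_pred))
```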
Unprecedented formation of sterically stabilized phospholipid liposomes of cuboidal morphology
(2021)
Sterically stabilized phospholipid liposomes of unprecedented cuboid morphology are formed upon the introduction into the bilayer membrane of novel polymers based on polyglycidol bearing a lipid-mimetic residue. Strong hydrogen bonding in the polyglycidol sublayers creates attractive forces, which, facilitated by fluidization of the membrane, bring about the flattening of the bilayers and the formation of cuboid vesicles.
Forecasting demand is challenging. Different products exhibit different demand patterns: while demand may be constant and regular for one product, it may be sporadic for another, and even when demand occurs, it may fluctuate significantly. Forecasting errors are costly and result in obsolete inventory or unsatisfied demand. Methods from statistics, machine learning, and deep learning have been used to predict such demand patterns. Nevertheless, it is not clear which algorithm achieves the best forecast for which demand pattern. Therefore, even today a large number of models are fitted and compared on a test period, and the model with the best result on the test period is used for the actual forecast. This approach is computationally and time intensive and, in most cases, uneconomical. In our paper we show the possibility of using a machine learning classification algorithm that predicts the best possible model based on the characteristics of a time series. The approach was developed and evaluated on a dataset from a B2B technical retailer. The machine learning classification algorithm achieves a mean ROC-AUC of 89%, which emphasizes the skill of the model.
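One plausible way to derive "characteristics of a time series" for such a classifier (our illustration, not necessarily the paper's feature set) is the classic demand-pattern categorization based on the average inter-demand interval (ADI) and the squared coefficient of variation of demand sizes (CV^2):

```python
import numpy as np

# Sketch: demand-pattern features commonly used to categorize time series,
# which a classifier could map to the best-suited forecasting model.
def demand_features(series):
    series = np.asarray(series, dtype=float)
    nonzero = np.flatnonzero(series)
    adi = len(series) / len(nonzero)         # average inter-demand interval
    sizes = series[nonzero]
    cv2 = (sizes.std() / sizes.mean()) ** 2  # variability of demand sizes
    return adi, cv2

def categorize(adi, cv2):
    """Syntetos-Boylan style cut-offs (ADI 1.32, CV^2 0.49)."""
    if adi < 1.32:
        return "smooth" if cv2 < 0.49 else "erratic"
    return "intermittent" if cv2 < 0.49 else "lumpy"

adi, cv2 = demand_features([0, 3, 0, 0, 4, 0, 0, 0, 3])
print(categorize(adi, cv2))
```

In a full pipeline these features, possibly alongside many others, would form the input of the classification algorithm that selects the forecasting model.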
Intermittent time series forecasting is a challenging task that still needs particular attention from researchers. The more irregularly events occur, the more difficult they are to predict. With Croston's approach in 1972 (Operational Research Quarterly 23(3):289–303), the intermittence and the demand of a time series were for the first time investigated separately. He proposes exponential smoothing in his attempt to generate a forecast that corresponds to the average demand per period. Although this algorithm produces good results in the field of stock control, it does not capture the typical characteristics of intermittent time series within the final prediction. In this paper, we investigate a time series' intermittence and demand individually, forecast the upcoming demand value and inter-demand interval length using recent machine learning algorithms, such as long short-term memories (LSTMs) and light gradient-boosting machines (LightGBM), and reassemble both pieces of information to generate a prediction that preserves the characteristics of an intermittent time series. We compare the results against Croston's approach, as well as against recent forecast procedures where no split is performed.
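For reference, Croston's approach mentioned above can be sketched in a few lines. This is a minimal textbook variant; the smoothing constant and the example series are illustrative:

```python
# Minimal sketch of Croston's method: demand sizes and inter-demand intervals
# are smoothed separately with exponential smoothing, and the forecast is their
# ratio (i.e., the average demand per period).
def croston(series, alpha=0.1):
    z = None  # smoothed demand size
    p = None  # smoothed inter-demand interval
    q = 1     # periods since the last observed demand
    for y in series:
        if y > 0:
            if z is None:          # initialize on the first demand occurrence
                z, p = y, q
            else:
                z = alpha * y + (1 - alpha) * z
                p = alpha * q + (1 - alpha) * p
            q = 1
        else:
            q += 1
    return z / p if z is not None else 0.0

# A demand of 6 every third period averages out to 2 units per period.
print(round(croston([0, 0, 6, 0, 0, 6, 0, 0, 6], alpha=0.5), 3))
```

The smoothed ratio is exactly the "demand per period in average" that the abstract describes; the paper's contribution is to replace the two exponential smoothers with learned forecasts and to keep the two components separate until the final reassembly.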
Coopetitive endeavors offer valuable strategic options for firms. Yet many of them are failure-prone, as partners must balance collective and private interests. While interpartner trust is considered central for alliance success, paradoxically, the role and dynamics of trust are still not well understood. We synthesize a computational model capturing the relational dynamics of an alliance, encompassing the coevolution of trust, partner contributions, and (relative) alliance interactions. Analyzing alliance dynamics using simulation, we find and explore a tipping boundary separating a regime of alliance failure from one of success. We identify implications for collaborative strategies (aspirations) and private strategies (openness). Our analyses reveal that strategies informed by a static mental model of partner trust, contributions, and openness tend to yield subpar alliance results and a hidden risk of failure. We discuss implications for management theory.
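To make the idea of a tipping boundary concrete, here is a deliberately tiny toy model (our own simplification, far simpler than the authors' computational model) in which trust and contributions coevolve and the initial trust level decides between the failure and success regimes:

```python
# Toy illustration of a tipping boundary: a partner contributes in proportion
# to its trust (scaled by "openness" o), and trust grows or decays depending on
# whether the observed contribution meets the aspiration level a.
def simulate(trust0, openness=0.8, aspiration=0.4, rate=0.3, steps=200):
    trust = trust0
    for _ in range(steps):
        contribution = openness * trust        # observable partner input
        trust += rate * (contribution - aspiration)
        trust = min(max(trust, 0.0), 1.0)      # keep trust in [0, 1]
    return trust

# The unstable fixed point a/o = 0.5 acts as the tipping boundary:
# starting above it locks in success, starting below it leads to failure.
print(simulate(0.6), simulate(0.4))
```

Even this caricature shows why static mental models are risky: two alliances with almost identical starting conditions on opposite sides of the boundary end up in opposite regimes.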
In the era of precision medicine, digital technologies and artificial intelligence, drug discovery and development face unprecedented opportunities for product and business model innovation, fundamentally changing the traditional approach of how drugs are discovered, developed and marketed. Critical to this transformation is the adoption of new technologies in the drug development process, catalyzing the transition from serendipity-driven to data-driven medicine. This paradigm shift comes with a need for both translation and precision, leading to a modern Translational Precision Medicine approach to drug discovery and development. Key components of Translational Precision Medicine are multi-omics profiling, digital biomarkers, model-based data integration, artificial intelligence, biomarker-guided trial designs and patient-centric companion diagnostics. In this review, we summarize and critically discuss the potential and challenges of Translational Precision Medicine from a cross-industry perspective.
The ambitious goals set by the European Union strategy for reducing the emissions of multimodal logistic chains, together with new requirements for intermodal terminals arising from evolving customer needs, contribute to a shift in the driver of infrastructure development: from economy of scale to economy of density. This paper presents an innovative method for designing a process-oriented technology chain for intermodal terminals in order to fulfill these demanding new requirements. The results of the case study of the Zero Emission Logistic Terminal Reutlingen are presented, highlighting how this particular context enables the design and development of a modular concept, paving the way for generalizing the findings and transferring them to similar contexts in other European cities.
Towards Automated Surgical Documentation using automatically generated checklists from BPMN models
(2021)
The documentation of surgeries is usually created from memory only after the operation, which means additional effort for the surgeon and carries the risk of imprecise, shortened reports. Displaying process steps in the form of checklists and automatically creating the surgical documentation from the completed process steps could serve as a reminder, standardize the surgical procedure, and save time for the surgeon. Based on two works from Reutlingen University, which implemented the creation of dynamic checklists from Business Process Model and Notation (BPMN) models and the storage of the times at which a process step was completed, a prototype was developed for an Android tablet to extend the dynamic checklists with functions such as uploading photos and files, manual user entries, the interception of foreseeable deviations from the normal course of operations, and the automatic creation of OR documentation.
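The core idea of generating checklists from BPMN models can be sketched as follows. This simplified example (hypothetical process; tasks are taken in document order rather than by following sequence flows) extracts the tasks of a BPMN process into an ordered checklist:

```python
import xml.etree.ElementTree as ET

# Sketch: turning the tasks of a BPMN process model into a checklist.
BPMN_NS = {"bpmn": "http://www.omg.org/spec/BPMN/20100524/MODEL"}

# Hypothetical minimal BPMN model for illustration.
bpmn_xml = """<?xml version="1.0"?>
<definitions xmlns="http://www.omg.org/spec/BPMN/20100524/MODEL">
  <process id="surgery_demo">
    <task id="t1" name="Confirm patient identity"/>
    <task id="t2" name="Document incision site"/>
    <task id="t3" name="Record instrument count"/>
  </process>
</definitions>"""

def checklist_from_bpmn(xml_text):
    root = ET.fromstring(xml_text)
    tasks = root.findall(".//bpmn:task", BPMN_NS)
    # Each item starts unchecked; completion timestamps could be stored here
    # and later assembled into the OR documentation.
    return [{"name": t.get("name"), "done": False} for t in tasks]

for item in checklist_from_bpmn(bpmn_xml):
    print("[ ]", item["name"])
```

A real implementation would additionally follow sequence flows and gateways to handle deviations from the normal course of the operation, as the prototype described above does.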
While private homes have become increasingly digitized, only little has been done to understand these specific home technologies and how they serve consumers, among other issues. “Smart home technology” (SHT) refers to a wide range of artifacts, from cleaning aids to energy advisors. Given this breadth, clarity about the key characteristics and the multi-faceted impact of SHT is needed to conduct more directed research on SHT. We propose a taxonomy to help outline the salient intended outcomes of SHT. Through a process involving five iterations, we analyzed and classified 79 technologies (gathered from literature and industry reports). This uncovered seven dimensions encompassing 20 salient characteristics. We believe these dimensions and characteristics will help researchers and organizations better design and study the impacts of these technologies. Our long-term agenda is to use the proposed taxonomy for an exploratory inquiry into the tensions that occur when personal and sustainability-related outcomes compete.
In this work, a comparison between different brushless harmonic-excited wound-rotor synchronous machines is performed. The general idea of all topologies is the elimination of the slip rings and auxiliary windings by using the already existing stator and rotor windings for field excitation. This is achieved by injecting a harmonic airgap field with the help of power electronics. This harmonic field does not interact with the fundamental field; it merely transfers the excitation power across the airgap. Alternative methods with varying numbers of phases, different pole-pair combinations, and winding layouts are covered and compared using a detailed finite-element-parameterized model. Parasitic effects due to saturation and coupling between the harmonic and main windings are considered.
Distributed ledger technologies such as blockchain offer an innovative way to increase visibility and security and thus reduce supply chain risks. This paper proposes a solution that increases the transparency and auditability of manufactured products in collaborative networks by adopting smart contract-based virtual identities. Compared with existing approaches, this extended smart contract-based solution gives manufacturing networks the possibility of incorporating privacy, content updating, and portability into smart contracts. As a result, the solution is suitable for the dynamic administration of complex supply chains.
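As a loose illustration of the auditability idea (not the paper's actual smart-contract design), a product's virtual identity can be thought of as an append-only, hash-linked event log in which each entry commits to its predecessor, so later tampering with the production history is detectable. All names and the record layout are assumptions for this sketch.

```python
import hashlib
import json

def _digest(record):
    """Stable SHA-256 hash over a record's canonical JSON form."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

class VirtualIdentity:
    """Toy append-only, hash-linked event log for a manufactured product."""

    def __init__(self, product_id):
        self.product_id = product_id
        self.events = []

    def append(self, actor, action):
        """Add a production event that commits to the previous entry's hash."""
        prev = self.events[-1]["hash"] if self.events else "GENESIS"
        entry = {"actor": actor, "action": action, "prev": prev}
        entry["hash"] = _digest({k: entry[k] for k in ("actor", "action", "prev")})
        self.events.append(entry)

    def verify(self):
        """Recompute the chain; returns False if any entry was altered."""
        prev = "GENESIS"
        for e in self.events:
            expected = _digest({"actor": e["actor"], "action": e["action"],
                                "prev": e["prev"]})
            if e["prev"] != prev or e["hash"] != expected:
                return False
            prev = e["hash"]
        return True

vid = VirtualIdentity("PUMP-0042")
vid.append("supplier-A", "cast housing")
vid.append("plant-B", "assembled")
print(vid.verify())
```

A real smart-contract implementation would add access control, content updating, and portability on top of this basic tamper-evidence property.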
Successful transitions to a sustainable bioeconomy require novel technologies, processes, and practices as well as a general agreement about the overarching normative direction of innovation. Both requirements necessarily involve collective action by those individuals who purchase, use, and co-produce novelties: the consumers. Based on theoretical considerations borrowed from evolutionary innovation economics and consumer social responsibility, we explore to what extent consumers’ scope of action is addressed in the scientific bioeconomy literature. We do so by systematically reviewing bioeconomy-related publications according to (i) the extent to which consumers are regarded as passive vs. active, and (ii) different domains of consumer responsibility (depending on their power to influence economic processes). We find all aspects of active consumption considered to varying degrees but observe little interconnection between domains. In sum, our paper contributes to the bioeconomy literature by developing a novel coding scheme that allows us to pinpoint different aspects of consumer activity, which have been considered in a rather isolated and undifferentiated manner. Combined with our theoretical considerations, the results of our review reveal a central research gap which should be taken up in future empirical and conceptual bioeconomy research. The system-spanning nature of a sustainable bioeconomy demands an equally holistic exploration of the consumers’ prospective and shared responsibility for contributing to its coming of age, ranging from the procurement of information on bio-based products and services to their disposal.
Theory and practice of implementing a successful enterprise IoT strategy in the industry 4.0 era
(2021)
Since the arrival of the internet and affordable access to technologies, digital technologies have occupied a growing place in industry, propelling us towards a fourth industrial revolution: Industry 4.0. In today’s era of digital upheaval, enterprises are increasingly undergoing transformations that lead to their digitalization. The traditional manufacturing industry is in the throes of a digital transformation that is accelerated by exponentially growing technologies (e.g., intelligent robots, the Internet of Things, sensors, 3D printing). Around the world, enterprises are in a frantic race to implement IoT-based solutions to improve their productivity and innovation, reduce costs, and strengthen their position in international markets. Considering the immense transformative potential that the IoT and big data bring to the industrial sector, adopting the IoT across industrial systems is a challenge that must be met to remain competitive and thus transform the factory into a smart factory. This paper describes the innovation and digitalization process, following the Industry 4.0 paradigm, for implementing a successful enterprise IoT strategy.
Hypericin has large potential in modern medicine and exhibits fascinating structural dynamics, such as multiple conformations and tautomerization. However, it is difficult to study individual conformers/tautomers, as they cannot be isolated due to the similarity of their chemical and physical properties. One approach to overcoming this difficulty is to combine single-molecule experiments with theoretical studies. Time-dependent density functional theory (TD-DFT) calculations reveal that tautomerization of hypericin occurs via a two-step proton transfer with an energy barrier of 1.63 eV, whereas a direct single-step pathway has a large activation energy barrier of 2.42 eV. Tautomerization in hypericin is accompanied by a reorientation of the transition dipole moment, which can be directly observed as fluorescence intensity fluctuations. Quantitative tautomerization residence times can be obtained from the autocorrelation of the temporal emission behavior, revealing that hypericin stays in the same tautomeric state for several seconds, which can be influenced by the embedding matrix. Furthermore, replacing hydrogen with deuterium provides further evidence that the underlying process is based on proton tunneling. In addition, the tautomerization rate can be influenced by a λ/2 Fabry–Pérot microcavity, where the occupation of Raman-active vibrations can alter the tunneling rate.
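The autocorrelation analysis mentioned above can be sketched on a synthetic two-state emission trace: long dwells in one tautomeric state produce an autocorrelation that decays slowly with lag. The trace and all numbers below are illustrative stand-ins, not measurement data from the study.

```python
def autocorrelation(trace, max_lag):
    """Normalized autocorrelation of an intensity trace up to max_lag."""
    n = len(trace)
    mean = sum(trace) / n
    var = sum((x - mean) ** 2 for x in trace) / n
    acf = []
    for lag in range(max_lag + 1):
        cov = sum((trace[i] - mean) * (trace[i + lag] - mean)
                  for i in range(n - lag)) / (n - lag)
        acf.append(cov / var)
    return acf

# Synthetic two-state trace: tautomer A emits at 1.0, tautomer B at 0.2,
# with long (50-sample) dwells in each state.
trace = [1.0] * 50 + [0.2] * 50 + [1.0] * 50 + [0.2] * 50
acf = autocorrelation(trace, 30)
print(round(acf[0], 3), round(acf[1], 3))
```

The slow decay at small lags (acf[1] still close to 1) reflects the long residence times; in the experiment, fitting this decay yields the quantitative residence time of a tautomeric state.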
In recent years, the Graph Model has become increasingly popular, especially in the application domain of social networks. The model has been semantically augmented with properties and labels attached to the graph elements. Because the model does not require a schema, it is difficult to ensure data quality for the properties and the data structure. In this paper, we propose a schema-bound Typed Graph Model with properties and labels. These enhancements improve not only data quality but also the quality of graph analysis. The power of this model comes from the use of hyper-nodes and hyper-edges, which allow data structures to be represented at different abstraction levels. We prove that the model is at least equivalent in expressive power to the most popular data models. It can therefore be used as a supermodel for model management and data integration. We illustrate by example the superiority of this model over the property graph data model of Hidders and other prevalent data models, namely the relational, object-oriented, and XML models and RDF Schema.
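A minimal sketch of the schema-bound idea (illustrative only, not the paper's formal definition): node and edge types declare required property keys, inserts are validated against those declarations, and hyper-edges may connect any number of nodes.

```python
class TypedGraph:
    """Toy schema-bound property graph with hyper-edges."""

    def __init__(self):
        self.node_types = {}   # type name -> required property keys
        self.edge_types = {}   # type name -> required property keys
        self.nodes = {}        # node id -> (type name, properties)
        self.edges = []        # (type name, tuple of node ids, properties)

    def define_node_type(self, name, required):
        self.node_types[name] = set(required)

    def define_edge_type(self, name, required):
        self.edge_types[name] = set(required)

    def add_node(self, node_id, type_name, **props):
        missing = self.node_types[type_name] - props.keys()
        if missing:
            raise ValueError(f"{type_name} node missing properties {missing}")
        self.nodes[node_id] = (type_name, props)

    def add_hyper_edge(self, type_name, node_ids, **props):
        """A hyper-edge may connect any number of existing nodes."""
        missing = self.edge_types[type_name] - props.keys()
        if missing:
            raise ValueError(f"{type_name} edge missing properties {missing}")
        if not all(n in self.nodes for n in node_ids):
            raise ValueError("unknown endpoint")
        self.edges.append((type_name, tuple(node_ids), props))

g = TypedGraph()
g.define_node_type("Person", ["name"])
g.define_edge_type("CO_AUTHORS", ["year"])
g.add_node("p1", "Person", name="Ada")
g.add_node("p2", "Person", name="Bob")
g.add_node("p3", "Person", name="Cyd")
g.add_hyper_edge("CO_AUTHORS", ["p1", "p2", "p3"], year=2021)
print(len(g.edges))
```

Rejecting elements that violate their declared type is exactly the data-quality guarantee a schema-free property graph cannot give; hyper-nodes (nodes that contain subgraphs) would be a further extension of this sketch.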
Recently described rhizolutin and collinolactone, isolated from Streptomyces Gö 40/10, share the same novel carbon scaffold. Analyses by NMR and X-ray crystallography verify the structure of collinolactone and propose a revision of rhizolutin's stereochemistry. Isotope-labeled precursor feeding shows that collinolactone is biosynthesized via a type I polyketide synthase with a Baeyer–Villiger oxidation. CRISPR-based genetic strategies led to the identification of the biosynthetic gene cluster and a high-production strain. Chemical semisyntheses yielded collinolactone analogues with inhibitory effects on the L929 cell line. Fluorescence microscopy revealed that only particular analogues induce monopolar spindles impairing cell division in mitosis. Inspired by the Alzheimer-protective activity of rhizolutin, we investigated the neuroprotective effects of collinolactone and its analogues on glutamate-sensitive cells (HT22); indeed, natural collinolactone displays distinct neuroprotection against intracellular oxidative stress.
Artificial intelligence (AI) technologies, such as machine learning or deep learning, are predicted to have a major impact on future organizations and to radically change the way projects are managed. The Project Management Institute (PMI), the network of around 1.1 million certified project managers, ranked AI as one of the top three disruptors of the profession. In our own study on the effects of AI, we found that 37% of project management processes can be executed by machine learning and other AI technologies. In addition, Gartner recently postulated that 80% of the work of today's project managers may be eliminated by AI by 2030.
This editorial aims to outline today's project and portfolio management in the context of pharmaceutical research and development (R&D), followed by an AI vision and a more tangible mission, and to illustrate what the consequences of AI-enabled project and portfolio management could be for pharmaceutical R&D.
Purpose
Injury or inflammation of the middle ear often results in persistent tympanic membrane (TM) perforations, leading to conductive hearing loss (HL). In some cases, however, the magnitude of HL exceeds that attributable to the TM perforation alone. The aim of this study is to better understand the effects of the location and size of TM perforations on the sound transmission properties of the middle ear.
Methods
The middle ear transfer functions (METF) of six human temporal bones (TB) were compared before and after perforating the TM at different locations (anterior or posterior lower quadrant) and to different degrees (1 mm, ¼ of the TM, ½ of the TM, and full ablation). The sound-induced velocity of the stapes footplate was measured using single-point laser-Doppler-vibrometry (LDV). The METF were correlated with a Finite Element (FE) model of the middle ear, in which similar alterations were simulated.
Results
The measured and calculated METF showed frequency and perforation size dependent losses at all perforation locations. Starting at low frequencies, the loss expanded to higher frequencies with increased perforation size. In direct comparison, posterior TM perforations affected the transmission properties to a larger degree than anterior perforations. The asymmetry of the TM causes the malleus-incus complex to rotate and results in larger deflections in the posterior TM quadrants than in the anterior TM quadrants. Simulations in the FE model with a sealed cavity show that small perforations lead to a decrease in TM rigidity and thus to an increase in oscillation amplitude of the TM mainly above 1 kHz.
Conclusion
Size and location of TM perforations have a characteristic influence on the METF. The correlation of the experimental LDV measurements with an FE model contributes to a better understanding of the pathologic mechanisms of middle-ear diseases. If small perforations with significant HL are observed in daily clinical practice, additional middle ear pathologies should be considered. Further investigations on the loss of TM pretension due to perforations may be informative.
The physicochemical properties of synthetically produced bone substitute materials (BSM) have a major impact on biocompatibility. This affects bony tissue integration and osteoconduction, as well as the degradation pattern and the correlated inflammatory tissue responses, including macrophages and multinucleated giant cells (MNGCs). Thus, influencing factors such as size, special surface morphologies, porosity, and interconnectivity have been the subject of extensive research. In the present publication, the influence of the granule size of three identically manufactured bone substitute granules based on hydroxyapatite (HA)-forming calcium phosphate cements was investigated, including the inflammatory response in the surrounding tissue and especially the induction of MNGCs (as a parameter of material degradation). For the in vivo study, granules of three different size ranges (small = 0.355–0.5 mm; medium = 0.5–1 mm; large = 1–2 mm) were implanted in the subcutaneous connective tissue of 45 male BALB/c mice. At 10, 30, and 60 days post implantationem, the materials were explanted and histologically processed. The defect areas were initially examined histopathologically. Furthermore, pro- and anti-inflammatory macrophages were quantified histomorphometrically after their immunohistochemical detection. The number of MNGCs was quantified as well using a histomorphometrical approach. The results showed granule size-dependent integration behavior. In the groups with the two larger granule sizes, the surrounding granulation tissue had passivated by 60 days post implantationem, including fibrotic encapsulation, while granulation tissue was still present in the group with the small granules, indicating an ongoing cell-based degradation process. The histomorphometrical analysis showed that the number of proinflammatory macrophages was significantly increased for the small granules at 60 days post implantationem.
Similarly, a significant increase of MNGCs was detected in this group at 30 and 60 days post implantationem. Based on these data, it can be concluded that the integration and/or degradation behavior of synthetic bone substitutes can be influenced by granule size.
Electronic design automation approaches can roughly be divided into optimizers and procedures. While the former have enabled highly automated synthesis flows for digital integrated circuits, the latter play a vital (but mostly underestimated) role in the analog domain. This paper compares both automation strategies, identifying two fundamentally different automation paradigms that reflect the two basic design practices known as “top-down” and “bottom-up”. Then, with a focus on the latter, the history of procedural approaches is traced from their early beginnings to today’s developments and future prospects, to underline their practical importance and to accentuate their scientific value, both in itself and in the overall context of EDA.
Collagen-based barrier membranes are an essential component of Guided Bone Regeneration (GBR) procedures. They act as cell-occlusive devices that should maintain a micromilieu in which bone tissue can grow, which in turn provides a stable bed for prosthetic implantation. However, the standing time of collagen membranes remains a challenge, as native membranes are often prematurely resorbed. Therefore, consolidation techniques such as chemical cross-linking have been used to enhance the structural integrity of the membranes and, by consequence, their standing time. These techniques, however, have cytotoxic tendencies and can cause exaggerated inflammation and, in turn, premature resorption and material failure. Moreover, tissues from different extraction sites and animals are variably cross-linked. For the present in vivo study, a new collagen membrane based on bovine dermis was extracted and compared to a commercially available porcine-sourced collagen membrane extracted from the pericardium. The membranes were implanted in Wistar rats for up to 60 days. The analyses included well-established histopathological and histomorphometrical methods, including histochemical and immunohistochemical staining procedures, to detect M1 and M2 macrophages as well as blood vessels. Both membranes remained intact up to day 30, while the bovine membrane was fragmented at day 60, with granulation tissue infiltrating the implantation beds. In contrast, the porcine membrane remained stable without signs of material-dependent inflammatory processes. The bovine membrane thereby showed a special integration pattern, as the fragments were found to be overlapping, providing secondary porosity in combination with transmembraneous vascularization. Altogether, the bovine membrane showed results comparable to the porcine control group in terms of biocompatibility and standing time.
Moreover, blood vessels were found within the bovine membranes, which can potentially serve as an additional functionality of barrier membranes that conventional barrier membranes do not provide.
Study programs in higher education have to reflect important societal and industrial challenges to prepare the next generations of professionals for future tasks. The focus of this paper is the challenge of digitalization and digital transformation. The paper proposes the IS education profile of a Digital Business Architect (DBA). The study program emphasizes design thinking, model centricity, and capability thinking as a response to domain requirements from digital transformation and educational system and structure requirements. Experiences in implementing the DBA include the need for integrating deductive and inductive teaching, a strong basis in real-world cases, and collaborative learning approaches to develop adequate competences in business model management, enterprise modeling, enterprise architecture management, and capability management.
This paper covers the test and verification of a forecast-based Monte Carlo algorithm for the optimized, demand-oriented operation of combined heat and power (CHP) units using the hardware-in-the-loop approach. For this purpose, the optimization algorithm was implemented at a test bench at Reutlingen University to control a CHP unit in combination with a thermal energy storage, both in real hardware. In detail, the hardware-in-the-loop tests are intended to reveal the effects of demand forecasting accuracy, the impact of thermal energy storage capacity, and the influence of load profiles on the demand-oriented operation of CHP units. In addition, the paper focuses on the evaluation of the energy content of the thermal energy storage under practical conditions. It is shown that a 5-layer model makes it possible to determine the stored energy quite accurately, which is verified by experimental results. The hardware-in-the-loop tests show that demand forecasting accuracy, especially for electricity demand, as well as the load profiles strongly impact the potential for on-site CHP electricity utilization in demand-oriented mode. Moreover, it is shown that a larger effective capacity of the thermal energy storage positively affects demand-oriented operation. In the hardware-in-the-loop tests, the fraction of electricity generated by the CHP unit and utilized on-site could thus be increased by up to 27% compared to heat-led operation, which is still the most common modus operandi of small-scale CHP plants. Hence, the hardware-in-the-loop tests proved the significant impact of the proposed algorithm on the optimization of demand-oriented operation of CHP units.
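The layer-model evaluation of the storage's energy content can be sketched as follows: each of five equally sized layers contributes m_i · c_p · (T_i − T_ref) to the total. The tank mass and reference temperature below are assumed values for illustration, not the test-bench parameters.

```python
WATER_CP = 4.186          # kJ/(kg*K), specific heat of water
TANK_MASS_KG = 1000.0     # total water mass (assumed for illustration)
T_REFERENCE = 20.0        # degC, reference "empty" temperature (assumed)

def stored_energy_kwh(layer_temps, mass_kg=TANK_MASS_KG, t_ref=T_REFERENCE):
    """Energy content of a stratified storage from per-layer temperatures.

    Each equally sized layer contributes m_i * c_p * (T_i - t_ref);
    the sum in kJ is converted to kWh (1 kWh = 3600 kJ).
    """
    layer_mass = mass_kg / len(layer_temps)
    energy_kj = sum(layer_mass * WATER_CP * (t - t_ref) for t in layer_temps)
    return energy_kj / 3600.0

# Typical stratification: hot at the top of the tank, cooler at the bottom.
print(round(stored_energy_kwh([80, 75, 65, 50, 40]), 2))
```

Feeding the five measured layer temperatures into such a model is what lets the controller track the usable storage capacity in real time.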
This pocket reference compiles the established calculation formulas and findings from industrial practice and scientific research in the field of weaving preparation and weaving. It is intended to facilitate the decision-making processes required in fabric production.
This collection of formulas not only allows optimal, practice-oriented manufacturing specifications for fabrics to be drawn up; the most important technical and physical fundamentals of mill operation are also presented with due brevity.
Context: Agile practices as well as UX methods are nowadays well known and often adopted to develop complex software and products more efficiently and effectively. However, in the so-called VUCA environment that many companies are confronted with, the sole use of UX research is not sufficient to find the best solutions for customers. The implementation of Design Thinking can support this process, but many companies and their product owners do not know how many resources they should spend on conducting Design Thinking.
Objective: This paper proposes a supportive tool, the “Discovery Effort Worthiness (DEW) Index”, to help product owners and agile teams determine a suitable amount of effort to spend on Design Thinking activities.
Method: A case study was conducted for the development of the DEW index. Design Thinking was introduced into the regular development cycle of an industry Scrum team. With the support of UX and Design Thinking experts, a formula was developed to determine the appropriate effort for Design Thinking.
Results: The developed “Discovery Effort Worthiness Index” provides an easy-to-use tool for companies and their product owners to determine how much effort they should spend on Design Thinking methods to discover and validate requirements. A company can map the corresponding Design Thinking methods to the results of the DEW Index calculation, and product owners can select the appropriate measures from this mapping. Therefore, they can optimize the effort spent for discovery and validation.
We present the modification of ethylene-propylene rubber (EPM) with vinyltetramethyldisiloxane (VTMDS) via reactive extrusion to create a new silicone-based material with the potential for high-performance applications in the automotive, industrial, and biomedical sectors. The radical-initiated modification is achieved with a peroxide catalyst that starts the grafting reaction. The preparation process of the VTMDS-grafted EPM was systematically investigated using process analytical technology (in-line Raman spectroscopy) and statistical design of experiments (DoE). By applying an orthogonal factorial array based on a face-centered central composite experimental design, the effects of the process factors on the grafting result were identified, quantified, and modeled mathematically. Based on response surface models, process windows were defined that yield high grafting degrees and good grafting efficiency in terms of grafting agent utilization. To control the grafting process in terms of grafting degree and grafting efficiency, the chemical changes taking place during the modification procedure in the extruder were observed in real time using a spectroscopic in-line Raman probe inserted directly into the extruder. Successful grafting of the EPM was validated in the final product by 1H-NMR and FTIR spectroscopy.
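The face-centered central composite design mentioned above can be generated in coded units (−1, 0, +1) as a quick sketch: full factorial corner points, axial points on the faces, and a center point. The concrete factor levels, units, and replication scheme of the study are not reproduced here.

```python
from itertools import product

def ccf_design(n_factors):
    """Face-centered central composite design in coded units.

    Returns the corner points of a full two-level factorial, the axial
    (face-center) points at +/-1 on one factor at a time, and a single
    center point. Center-point replication is left out for brevity.
    """
    corners = [list(p) for p in product((-1, 1), repeat=n_factors)]
    axials = []
    for i in range(n_factors):
        for level in (-1, 1):
            pt = [0] * n_factors
            pt[i] = level
            axials.append(pt)
    center = [[0] * n_factors]
    return corners + axials + center

design = ccf_design(3)
print(len(design))  # 8 corners + 6 face centers + 1 center point
```

For three process factors this yields 15 distinct runs, enough to fit the quadratic response surface models the study relies on.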
This article studies the renewed interest surrounding sustainable public finance and the topic of tax evasion as well as the new theory of information inattention. Extending a model of tax evasion with the notion of inattention reveals novel findings about policy instruments that can be used to mitigate tax evasion. We show that the attention parameters regarding tax rates, financial penalty schemes and income levels are as important as the level of the detection probability and the financial penalty incurred. Thus, our theory recommends the enhancement of sustainability in public policy, particularly in tax policy. Consequently, the paper contributes both to the academic and public policy debate.
The technologies of digital transformation, such as the Internet of Things (IoT), artificial intelligence, or predictive maintenance, enable significant efficiency gains in industry and are becoming increasingly important as a competitive factor. However, their successful implementation and creative future application require the broad acceptance and knowledge of non-IT-related groups, such as production management students, engineers, or skilled workers, which is still lacking today. This paper presents a low-threshold training concept that brings IoT technologies and applications into manufacturing-related higher education and employee training. The concept addresses the relevant topics, from IoT basics to predictive maintenance, using mobile low-cost hardware and infrastructure.
This article studies the effects of reverse factoring in a supply chain in which the buyer company passes on its lower short-term borrowing rates to the supplier in return for extended payment terms. We explore the role of interest rate changes, rating changes, and the position in the business cycle on the cost-benefit trade-off from a supplier perspective. We use a combined empirical approach consisting of an event study in step 1 and a simulation model in step 2. The event study identifies the quantitative magnitude of central bank decisions and rating changes on the interest rate differential. The simulation computes, with a rolling-window methodology, the daily costs and benefits of reverse factoring from 2010 to 2018 under the assumption of the efficient market hypothesis. Our major finding is that changes in crucial financial variables such as interest rates, ratings, or news alerts can turn former win-win into win-lose situations for the supplier, contingent on the business cycle. Overall, our results exhibit sophisticated trade-offs under reverse factoring, which consequently require careful evaluation in managerial decisions.
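A stylized version of the supplier's trade-off can be written down as a simple-interest sketch: the benefit of financing the invoice at the buyer's lower rate for the original period versus the cost of financing the payment-term extension. The rates, day-count convention, and functional form are assumptions for illustration, not the paper's simulation model.

```python
def reverse_factoring_net_benefit(invoice, supplier_rate, buyer_rate,
                                  original_days, extended_days):
    """Supplier's net gain from reverse factoring (simple-interest sketch).

    Benefit: the invoice is financed at the buyer's (lower) short-term rate
    instead of the supplier's own rate over the original payment period.
    Cost: the supplier finances the payment-term extension at the buyer's
    rate. Rates are annualized; day count is 360.
    """
    benefit = invoice * (supplier_rate - buyer_rate) * original_days / 360
    cost = invoice * buyer_rate * (extended_days - original_days) / 360
    return benefit - cost

# A wide rate differential makes the arrangement a win for the supplier ...
print(round(reverse_factoring_net_benefit(1_000_000, 0.06, 0.02, 60, 90), 2))
# ... while a shrinking differential (e.g. after a central bank decision or
# rating change) can flip the sign and turn win-win into win-lose.
print(round(reverse_factoring_net_benefit(1_000_000, 0.025, 0.02, 60, 90), 2))
```

Evaluating this expression day by day over a rolling window is, in spirit, what the paper's simulation does with market-observed rates.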
Due to its high efficiency and flexibility, combined heat and power (CHP) generation has become indispensable for the successful implementation of the energy transition in Germany. To make optimal use of the available flexibility of a CHP plant while maintaining its high efficiency, a forecast-based control algorithm for CHP units in combination with thermal energy storage has been developed at Reutlingen University over several years of research.
In IGF project no. 19617 N, nitrogen- and phosphorus-substituted alkoxysilanes were synthesized and their flame-retardant properties for textiles investigated. The syntheses followed different strategies, such as click chemistry and the nucleophilic substitution of commercially available organophosphorus compounds with amine-based trialkoxysilanes and/or cyanuric chloride. These novel halogen- and aldehyde-free flame retardants were applied to fabrics made of cotton, polyethylene terephthalate (PET), polyamide (PA), and blends thereof, using the industrially established pad-dry-cure technique and the sol-gel process. The flame-retardant properties were evaluated with the test methods of EN ISO 15025 (protective clothing – protection against heat and flame – method of test for limited flame spread). Good flame resistance of the hybrid organic-inorganic materials was achieved at a low add-on of 3-5 wt% on cotton fabrics. Furthermore, the water solubility and wash fastness could be controlled via the functional groups bonded to the phosphorus atom and by optimizing the curing temperature. Overall, the research project showed that N-P-silanes are very good permanent flame retardants for textiles.
In the current age of innovative business financing opportunities available from fintech apps, social media crowdfunding sites such as Kickstarter, Indiegogo, and RocketHub, and friends-and-family private equity investors, start-up firms can strategically source their venture capital funds from many globally dispersed organizations and individuals. As the firm in this case learned, the benefit of alternative investing sources comes with a critical hidden risk for corporate governance. After a financial restructuring, a typical Silicon Valley software start-up found itself with close to 300 external individual shareholders, some of whom had not been documented as accredited investors. The regulatory agency could decide that the prior actions of the founders and the decisions of the board had been prejudicial to the interests of the minority investors. The management of this small private company faced an atypical investor relations dilemma before its initial public offering (IPO).
We analyze economics PhDs’ collaborations in peer-reviewed journals from 1990 to 2014 and investigate the quality of such collaborations in relation to each co-author’s research quality, field, and specialization. We find that a greater overlap between co-authors’ previous research fields is significantly related to greater publication success of co-authors’ joint work, and this is robust to alternative specifications. Co-authors who engage in a distant collaboration are significantly more likely to have a large research overlap, but this significance is lost when co-authors’ social networks are accounted for. High-quality collaboration is more likely to emerge as a result of an interaction between specialists and generalists with overlapping fields of expertise. Interaction across subfields of economics (interdisciplinarity) is more likely to be conducted by co-authors who already have interdisciplinary portfolios than by co-authors who are specialized in, or stars of, different subfields.
This paper explores why and how dominant international social standards used in the fashion industry are prone to implementation failures. A qualitative multiple-case study method was conducted, using purposive sampling to select 13 apparel supply chain actors. Data were collected through on-site semi-structured face-to-face interviews. The findings of the study are interpreted by using core tenets of agency theory. The case study findings clearly highlight why and how multi-tier apparel supply chains fail to implement social standards effectively. As a consequence of substantial goal conflicts and information asymmetries, sourcing agents and suppliers are driven to perform opportunistic behaviors in the form of hidden characteristics, hidden intentions, and hidden actions, which significantly harm social standards. Fashion retailers need to empower their corporate social responsibility (CSR) departments by awarding them an integrative role in sourcing decisions. Moreover, accurate calculation of orders, risk sharing, cost sharing, price premiums, and especially guaranteed order continuity for social compliance are critical to reducing opportunistic behaviors upstream of the supply chain. Further development of social standards is strongly suggested, e.g., by including novel metrics such as the assessment of buying practices or the evaluation of capacity planning at factories and the strict inclusion of subcontractors’ social performance. This paper presents evidence from multiple Vietnamese and Indonesian cases involving sourcing agents as well as Tier 1 and Tier 2 suppliers on a highly sensitive topic. With the development of the conceptual framework and the formulation of seven related novel propositions, this paper unveils the ineffectiveness of social standards, offers guidance for practitioners, and contributes to the neglected social dimension in sustainable supply chain management research and accountability literature.
Imagine a world in which the search for tomorrow's trends does not require a long and laborious data search but is possible with a single mouse click. The use of artificial intelligence (AI) makes this reality possible, and it is to be advanced further through research. This study therefore aims to provide an initial overview of the young research field. Based on research, expert interviews, and surveys of companies and students, current applications of AI in the innovation process (defined as Smart Innovation) and the existing challenges that slow down further development are discussed in more detail, and future applications are presented. Finally, a recommendation for action is made for business, politics, and science to help overcome the current obstacles together and thus drive the future of Smart Innovation.
Die vorliegende Studie zeigt, dass das Thema Smart Innovation (der Einsatz von KI-Systemen im Innovationsprozess) von hoher Relevanz ist und Zustimmung für den Einsatz von KI im Innovationsprozess besteht. Sowohl von den Unternehmen als auch von den Studierenden werden Effizienzsteigerung, schnellere Bearbeitung großer Datenmengen, die Steigerung der Wettbewerbsfähigkeit und Kosteneinsparungen als Gründe für den Einsatz von KI im Innovationsprozess gesehen. In Deutschland finden KI-Technologien bereits jetzt punktuell und branchenunabhängig Anwendung im Innovationsprozess. Einflussfaktoren, wie Hochschulkooperationen, Innovationsabteilungen und Open Innovation können den Einsatz fördern. Vor allem KMU aus den frühen Phasen der Industrialisierung sollten davon Gebrauch machen. In einem Zusammenspiel von menschlicher Expertise und der schnellen und präzisen Datenverarbeitung der KI liegt das Erfolgsgeheimnis eines möglichst effizienten Innovationsprozesses. Es wird deutlich, dass verschiedene Einflussfaktoren erforderlich sind, um die Anwendung von Smart Innovation praktikabel zu gestalten. So gilt es zunächst die technischen Voraussetzungen einer funktionierenden IT-Infrastruktur zu erfüllen. Gleichbedeutend sind offene Fragestellungen hinsichtlich der Datenverfügbarkeit, des Dateneigentums und der Datensicherheit. Ohne rechtlichen Rahmen sind kaum Akteure gewillt, ihre Daten zu teilen und zugänglich zu machen. Erschwert wird der Einsatz von KI durch den nationalen IT-Fachkräftemangel. So sehen sowohl Unternehmen als auch die Studierenden das größte Hindernis im Mangel von KI-relevantem Know-how. Dies hemmt einerseits die Forschung, andererseits fehlt es den Unternehmen an erforderlichen Fachkräften für eine Einführung von KI im Unternehmen. Es ist jedoch notwendig, den Unternehmen durch das Aufzeigen von Anwendungsbeispielen, die Potenziale und Chancen von Smart Innovation zu vermitteln. 
Application-oriented research must be promoted and a smooth transfer to industry ensured. This knowledge exchange also requires a greater entrepreneurial willingness to take risks. The need to design company-specific AI strategies is growing. The technologies are evolving rapidly, so companies too must adapt to this progress in order not to fall behind and to secure their competitiveness. The greatest challenge lies in the fundamental transformation of business models, since the value creation of successful companies is increasingly based on digital assets. Data are generally regarded as the new resource, the raw material, also for smart innovations. The importance of Smart Innovation will continue to grow. In the short and medium term, weak AI supports above all data collection and analysis, process automation, and the identification of needs and trends. Incremental improvements in innovation management are further expected from simulations and the random combination of technologies. In the long term, a stronger AI will be able to partially replace humans in the innovation process. Whether autonomous innovation will become possible depends first on the degree of novelty of an innovation, but above all on the possibility of a creative AI. It can be assumed that advances in AI will not only enable radical innovations but will also lead to a structural change in our current understanding of innovation management.
Imagine a world in which the search for tomorrow's trends of (software) products is not subject to a long and laborious data search but is possible with a single mouse click. The use of artificial intelligence (AI) makes this reality possible, and research aims to advance it further. This study therefore provides an initial overview of this young research field. Based on desk research, expert interviews, and surveys of companies and students, current applications of AI in the innovation process (defined as Smart Innovation) and the challenges that slow down its further development are discussed in more detail, and future application possibilities are presented. Finally, recommendations for action are made for business, politics, and science to help overcome the current obstacles together and thus drive the future of Smart Innovation.
Purpose
Computerized medical image processing assists neurosurgeons in localizing tumours precisely and plays a key role in recent image-guided neurosurgery. Hence, we developed a new open-source toolkit, namely Slicer-DeepSeg, for efficient and automatic brain tumour segmentation based on deep learning methodologies to aid clinical brain research.
Methods
Our developed toolkit consists of three main components. First, Slicer-DeepSeg extends the 3D Slicer application and thus provides support for multiple input/output data formats and 3D visualization libraries. Second, the Slicer core modules offer powerful image processing and analysis utilities. Third, the Slicer-DeepSeg extension provides a customized GUI for brain tumour segmentation using deep learning-based methods.
Results
The developed Slicer-DeepSeg was validated using a public dataset of high-grade glioma patients. The results showed that our proposed platform’s performance considerably outperforms other 3D Slicer cloud-based approaches.
Conclusions
The developed Slicer-DeepSeg allows the development of novel AI-assisted medical applications in neurosurgery. Moreover, it can enhance the outcomes of computer-aided diagnosis of brain tumours. The open-source Slicer-DeepSeg is available at github.com/razeineldin/Slicer-DeepSeg.
Metalworking fluids (MWFs) are widely used to cool and lubricate metal workpieces during processing to reduce heat and friction. Extending an MWF's service life is important from both an economic and an ecological point of view. Properly characterizing the aging phenomena requires knowledge of how processing conditions affect the aging behavior, as well as reliable analytical procedures. Whereas the literature has so far described only univariate estimations of aging effects on MWFs based on single-parameter measurements, the present study presents a simple spectroscopy-based set-up for the simultaneous monitoring of three MWF quality parameters, together with a mathematical model relating them to the most influential process factors relevant during use. For this purpose, the effects of MWF concentration, pH, and nitrite concentration on the droplet size during aging were investigated by means of a response surface modelling approach. Systematically varied model MWF fluids were characterized using simultaneous measurements of absorption coefficients µa and effective scattering coefficients µ's. Droplet size was determined via dynamic light scattering (DLS) measurements. Droplet size showed a non-linear dependence on MWF concentration and pH, whereas the nitrite concentration had no significant effect. pH and MWF concentration showed a strong synergistic effect, which indicates that MWF aging is a rather complex process. The observed effects were similar for the DLS and the µ's values, demonstrating the comparability of the methodologies. The correlations of the methods were R²c = 0.928 and R²p = 0.927, as calculated by a partial least squares regression (PLS-R) model. Furthermore, using µa, it was possible to generate a predictive PLS-R model for MWF concentration (R²c = 0.890, R²p = 0.924). Simultaneous determination of the pH based on µ's is possible with good accuracy (R²c = 0.803, R²p = 0.732).
With prior knowledge of the MWF concentration from the µa-PLS-R model, the predictive capability of the µ's-PLS-R model for pH was refined (10 wt%: R²c = 0.998, R²p = 0.997). This highlights the relevance of the combined measurement of µa and µ's. Recognizing the synergistic nature of the effects of MWF concentration and pH on the droplet size is an important prerequisite for extending the service life of an MWF in the metalworking industry. The presented method can be applied as an in-process analytical tool that allows aging effects to be compensated during use of the MWF by taking appropriate corrective measures, such as pH correction or adjustment of the concentration.
Durch das Verbot der ozonschädigenden Fluor-Chlorkohlenwasserstoffen als Kältemittel und der heute überwiegend eingesetzten Fluor-Kohlenwasserstoffe, welche sich negativ auf den Treibhauseffekt auswirken, gewinnt das umweltfreundlichere CO2 (Kohlendioxid) in der Verwendung als Kältemittel an Bedeutung. Ausgangspunkt dieser Arbeit sind ein Prototyp einer reversiblen CO2 Wärmepumpe und ein Simulationsmodell derselbigen. Ziel dieser Arbeit ist es das Simulationsmodell, anhand von realen Messergebnissen des Prototyps, zu verifizieren. Durch die Berechnung von Vergleichsparametern, das Festlegen von Randbedingungen und geeigneten Messpunkten am Prototyp wird die Simulation optimiert. Abschließend folgt die Bewertung der Ergebnisse im Hinblick auf die Funktionalität der Wärmepumpe und deren Abbild in der Simulation.
Silicon photonic micro-ring resonators (MRR) developed on the silicon-on-insulator (SOI) platform, owing to their high sensitivity and small footprint, show great potential for many chemical and biological sensing applications such as label-free detection in environmental monitoring, biomedical engineering, and food analysis. In this tutorial, we provide the theoretical background and give design guidelines for SOI-based MRR as well as examples of surface functionalization procedures for label-free detection of molecules. After introducing the advantages and perspectives of MRR, fundamentals of MRR are described in detail, followed by an introduction to the fabrication methods, which are based on a complementary metal-oxide semiconductor (CMOS) technology. Optimization of MRR for chemical and biological sensing is provided, with special emphasis on the optimization of waveguide geometry. At this point, the difference between chemical bulk sensing and label-free surface sensing is explained, and definitions like waveguide sensitivity, ring sensitivity, overall sensitivity as well as the limit of detection (LoD) of MRR are introduced. Further, we show and explain chemical bulk sensing of sodium chloride (NaCl) in water and provide a recipe for label-free surface sensing.
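The sensitivity and limit-of-detection definitions introduced in such tutorials are commonly summarized by two relations. The following is a hedged sketch using standard symbols (resonance wavelength λ_res, effective index n_eff, group index n_g, read-out wavelength resolution R), which may differ from the tutorial's exact notation:

```latex
% Overall bulk sensitivity: waveguide sensitivity times ring sensitivity.
S = \frac{\partial \lambda_{\mathrm{res}}}{\partial n}
  = \underbrace{\frac{\partial n_{\mathrm{eff}}}{\partial n}}_{\text{waveguide sensitivity}}
    \cdot
    \underbrace{\frac{\lambda_{\mathrm{res}}}{n_g}}_{\text{ring sensitivity}}
\qquad
\mathrm{LoD} = \frac{R}{S}
```

The factorization makes the optimization targets explicit: waveguide geometry governs ∂n_eff/∂n, while the ring's dispersion (via n_g) sets how an effective-index change translates into a resonance shift.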
The digital transformation is today's dominant business transformation, strongly influencing how digital services and products are designed in a service-dominant way. A popular underlying theory of value creation and economic exchange, known as the service-dominant (S-D) logic, can be connected to many successful digital business models. However, S-D logic by itself is abstract: companies cannot directly use it as an instrument for business model innovation and design. To address this, a comprehensive ideation method based on S-D logic is proposed, called service-dominant design (SDD). SDD is aimed at supporting firms in the transition to a service- and value-oriented perspective. The method provides a simplified way to structure the ideation process based on four model components. Each component consists of practical implications, auxiliary questions, and visualization techniques that were derived from a literature review, a use case evaluation of digital mobility, and a focus group discussion. SDD represents a first step towards a toolset that can support established companies in the process of service- and value-orientation as part of their digital transformation efforts.
The cloud evolved into an attractive execution environment for parallel applications, which make use of compute resources to speed up the computation of large problems in science and industry. Whereas Infrastructure as a Service (IaaS) offerings have been commonly employed, more recently, serverless computing emerged as a novel cloud computing paradigm with the goal of freeing developers from resource management issues. However, as of today, serverless computing platforms are mainly used to process computations triggered by events or user requests that can be executed independently of each other and benefit from on-demand and elastic compute resources as well as per-function billing. In this work, we discuss how to employ serverless computing platforms to operate parallel applications. We specifically focus on the class of parallel task farming applications and introduce a novel approach to free developers from both parallelism and resource management issues. Our approach includes a proactive elasticity controller that adapts the physical parallelism per application run according to user-defined goals. Specifically, we show how to consider a user-defined execution time limit after which the result of the computation needs to be present while minimizing the associated monetary costs. To evaluate our concepts, we present a prototypical elastic parallel system architecture for self-tuning serverless task farming and implement two applications based on our framework. Moreover, we report on performance measurements for both applications as well as the prediction accuracy of the proposed proactive elasticity control mechanism and discuss our key findings.
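The proactive elasticity controller idea, choosing the degree of physical parallelism so that a user-defined execution time limit is met at minimal monetary cost, can be sketched under strongly simplifying assumptions (perfectly uniform task runtimes, no scheduling overhead, linear scaling, cost proportional to invocations). The function below is illustrative, not the paper's actual controller:

```python
# Sketch of a proactive elasticity controller for serverless task farming:
# pick the smallest number of parallel function instances whose makespan
# still meets a user-defined time limit; under per-function billing this
# also minimizes cost. Assumptions (not the paper's model): uniform task
# runtimes, no scheduling overhead, perfect scaling.
import math

def choose_parallelism(n_tasks: int, task_runtime_s: float,
                       time_limit_s: float, max_workers: int) -> int:
    """Smallest worker count whose makespan fits within the time limit."""
    for workers in range(1, max_workers + 1):
        waves = math.ceil(n_tasks / workers)   # sequential "waves" of tasks
        if waves * task_runtime_s <= time_limit_s:
            return workers
    raise ValueError("time limit unreachable even at maximum parallelism")

# Example: 100 tasks of 30 s each, result needed within 5 minutes.
print(choose_parallelism(100, 30.0, 300.0, max_workers=512))  # → 10
```

A real controller, as the abstract notes, would additionally predict per-task runtimes rather than assume them known.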
Science-based analysis for climate action: how HSBC Bank uses the En-ROADS climate policy simulation
(2021)
In 2018, the Intergovernmental Panel on Climate Change (IPCC, 2018) found that rapid decarbonization and net negative greenhouse gas (GHG) emissions by mid-century are required to "hold the increase in global average temperature to well below 2°C above pre-industrial levels and pursue efforts to limit the temperature increase to 1.5°C," as stipulated by the Paris Agreement (UNFCCC, 2015, p. 2). Meeting these goals reduces physical climate-related risks from, for example, sea-level rise, ocean acidification, extreme weather, water shortages, declining crop yields, and other impacts. These impacts threaten our economy, security, health, and lives.
At the same time, policies to mitigate these harms by rapidly reducing GHG emissions can create transition risks for businesses - for example, stranded assets and loss of market value for fossil fuel producers and firms dependent on fossil energy (Carney, 2019). Rapid decarbonization requires an unprecedented energy transition (IEA, 2021a) driven by and affecting economic players including businesses, asset managers, and investors in all sectors and all countries (Kriegler et al., 2014).
However, GHG emissions are not falling rapidly enough to meet the goals of the Paris Agreement (Holz et al., 2018). The UNFCCC (2021) found that the emissions reductions pledged by all nations as of early 2021 "fall far short of what is required, demonstrating the need for Parties to further strengthen their mitigation commitments under the Paris Agreement" (2021, p. 5). Businesses are faring no better. Despite high-profile calls to action from influential firms such as BlackRock (Fink, 2018, 2021), corporate action to meet climate goals has thus far fallen short (e.g. the 2019 analysis of the German DAX 30 companies' emissions targets by the NGO "right."). Instead of implementing climate strategies that might mitigate the risks, managers are often caught up in "firefighting" and capability traps that erode the resources needed for ambitious climate action (Sterman, 2015). Firms may also exaggerate environmental accomplishments, leading to greenwashing (Lyon and Maxwell, 2011); implement policies that are vague, rely on unproven offsets, or are not climate neutral (e.g. Sterman et al., 2018); or simply take no action at all (Delmas and Burbano, 2011; Sterman, 2015).
Adding to the confusion are difficulties evaluating the effectiveness of different climate policies. Misperceptions include wait-and-see approaches (Dutt and Gonzalez, 2012; Sterman, 2008), underestimating time delays and ignoring the unintended consequences of policies (Sterman, 2008), and beliefs in "silver bullet" solutions (Gilbert, 2009; Kriegler et al., 2013; Shackley and Dütschke, 2012). These beliefs arise in part because the climate–energy system is a high-dimensional dynamic system characterized by long time delays, multiple feedback loops, and nonlinearities (Sterman, 2011), while even simple systems are difficult for people to understand (Booth Sweeney and Sterman, 2000; Cronin et al., 2009; Kapmeier et al., 2017). Although senior executives might receive briefings on climate change, simply providing more information does not necessarily lead to more effective action (Pearce et al., 2015; Sterman, 2011).
Alternatively, interactive approaches to learning about climate change and policies to mitigate it can trigger climate action (Creutzig and Kapmeier, 2020). Decision-makers require tools and methods grounded in science that enable them to learn for themselves how a low-carbon economy can be achieved and how climate policies condition physical and transition risks. The system dynamics climate–energy simulation En-ROADS (Energy-Rapid Overview and Decision Support; Jones et al., 2019b), codeveloped by the climate think-tank Climate Interactive and the MIT Sloan Sustainability Initiative, provides such a tool.
Here we show how En-ROADS helps HSBC Bank U.S.A., the American subsidiary of U.K.-based multinational financial services company HSBC Holdings plc, focus its global sustainability strategy on activities with higher impact and relevance, communicate and implement the strategy, understand transition risks, and better align the strategy with global climate goals. We show how the versatility and interactivity of En-ROADS increases its reach throughout the organization. Finally, we discuss challenges and lessons learned that may be helpful to other organizations.
Forecasting demand is challenging. Different products exhibit different demand patterns: while demand may be constant and regular for one product, it may be sporadic for another, and both the timing and the magnitude of demand may fluctuate significantly. Forecasting errors are costly and result in obsolete inventory or unsatisfied demand. Methods from statistics, machine learning, and deep learning have been used to predict such demand patterns. Nevertheless, it is not clear which algorithm achieves the best forecast for which demand pattern. Therefore, even today a large number of models are run on a test period, and the model with the best result on the test period is used for the actual forecast. This approach is computationally and time intensive and, in most cases, uneconomical. In our paper we show that a machine learning classification algorithm can predict the best possible model based on the characteristics of a time series. The approach was developed and evaluated on a dataset from a B2B technical retailer. The machine learning classification algorithm achieves a mean ROC-AUC of 89%, which emphasizes the skill of the model.
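The model-selection idea can be sketched as follows: compute per-series characteristics, then train a classifier that maps them to the best forecasting model. Everything below is synthetic and illustrative; the feature set, the labelling rule, and the use of scikit-learn's RandomForestClassifier are assumptions, not the paper's actual pipeline:

```python
# Sketch: predict the best forecasting model for a time series from simple
# series characteristics. Features, labels, and classifier choice are
# illustrative assumptions; the paper's B2B-retailer data are not used.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_series = 400

# Per-series features: mean demand, coefficient of variation (CV),
# average inter-demand interval (ADI), share of zero-demand periods.
X = np.column_stack([
    rng.uniform(1, 100, n_series),
    rng.uniform(0.1, 2.0, n_series),
    rng.uniform(1.0, 6.0, n_series),
    rng.uniform(0.0, 0.9, n_series),
])

# Toy labels inspired by the Syntetos-Boylan cutoffs (ADI > 1.32,
# CV^2 > 0.49): intermittent, variable series -> "croston", else "ets".
y = np.where((X[:, 2] > 1.32) & (X[:, 1] ** 2 > 0.49), "croston", "ets")

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"mean CV accuracy = {scores.mean():.2f}")
```

With real data the labels would come from backtesting each candidate model on a held-out period, and the evaluation metric would be ROC-AUC as in the paper.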
Rotating machinery occupies a predominant place in many industrial applications. However, rotating machines often suffer from severe vibration problems. Measuring these machines' vibration signals is of particular importance since it plays a crucial role in predictive maintenance. When vibrations are too high, they often cause fatigue failure; they announce an unexpected stop or breakdown and, consequently, a significant loss of productivity or a threat to personnel safety. Therefore, identifying faults at an early stage significantly enhances a machine's health and reduces maintenance costs. Although considerable efforts have been made to master the field of machine diagnostics, the usual signal processing methods still present several drawbacks. This paper examines rotating machinery condition monitoring in the time and frequency domains. It also provides a framework for the diagnosis process based on machine learning by analyzing the vibration signals.
Digitalization and sustainability will permanently and comprehensively change the expectations of and requirements for controllers. Teaching is highly relevant for this role change. An education aligned with the changed requirements gives companies the opportunity to recruit controllers with these new role profiles for their organizations. For graduates aiming at a career in controlling, the changed role profile secures their long-term employability. For the role change itself, teaching can be understood as a driver.
Despite the importance of teaching for the role change, there has so far been little research on how the roles are actually represented in teaching. The question therefore arises of how universities represent the roles in their degree programs, and with what intensity and in which combinations the roles are taught. This research question is evaluated and discussed on the basis of an analysis of controlling-specific master's programs and their module handbooks.
Overall, the role change in controlling education appears very heterogeneous. Teaching of the classical controller role dominates, followed by the business partner role. Course content related to the roles of the digital controller or the risk controller is weakly developed, and there is hardly any teaching offered for taking on a controller role in sustainability management. These results are intended to contribute to the discourse on the role change and the design of controlling education.
Supply chains have become increasingly complex, making it difficult to ensure transparency throughout the whole supply chain. In this context, first approaches have emerged that adopt the immutable, decentralised, and secure characteristics of blockchain technology to increase the transparency, security, authenticity, and auditability of assets in supply chains. This paper investigates recent publications combining blockchain technology and supply chain management and classifies them regarding the complexity to be mapped on the blockchain. As a result, increasing supply chain transparency is identified as the main objective of recent blockchain projects in supply chain management. Most of the recent publications deal with simple supply chains and products, and the few approaches dealing with complex parts map only sub-areas of supply chains. Currently no example exists that aims to increase the transparency of complex manufacturing supply chains and that enables the mapping of complex assembly processes, efficient auditability of all assets, and dynamic adjustments.
Gold bipyramids (AuBPs) attract significant attention due to the large enhancement of the electric field around their sharp tips and well-defined tunability of their plasmon resonances. Excitation patterns of single AuBPs are recorded using raster-scanning confocal microscopy combined with radially and azimuthally polarized laser beams. Photoluminescence spectra (PL) and excitation patterns of the same AuBPs are acquired with three different excitation wavelengths. The isotropic excitation patterns suggest that the AuBPs are mainly excited by interband transitions with 488/530 nm radiation, while excitation patterns created with a 633 nm laser exhibit a double-lobed shape that indicates a single-dipole excitation process associated with the longitudinal plasmon resonance mode. We are able to determine the three-dimensional orientation of single AuBPs nonperturbatively by comparing experimental patterns with theoretical simulations. The asymmetric patterns show that the AuBPs are lying on the substrate with an out-of-plane tilt angle of around 10–15°.
Kopainsky et al. (2020) examine intended and unintended transition effects of the Swiss food system on the system's structure and the environment. Their research draws on and is embedded in research streams in global health (Jamison et al., 2013) and sustainable food systems (Willett et al., 2019). It also addresses many of Steffen et al.'s (2015) planetary boundaries and the United Nations' (2015) sustainability goals (SDGs), and potentially could address how they are interrelated, following Randers et al. (2019). It is furthermore embedded in research on natural and human systems, particularly in the intertwined business, supply and demand, governance, ecological, and health feedback loops (Swinburn et al., 2019). This feedback view enhances understanding and assessment of drivers towards improving human and ecological health and mitigating climate change.
Why did we choose, of all things, a robot for our cover that strongly resembles Robbi from "Robbi, Tobbi und das Fliewatüüt"? A robot from the 80s as a symbol of the future of work? Not quite. Rather, it stands for the beginnings of automation, which was prophesied to bring the end of work. Today its modern, agile successor peeks cheekily around the corner. Today's "Robbi" has emerged from a continuous technological development that has changed our working world considerably and will continue to change it. As you will see, "Robbi" has been very adaptable. But what does that mean for us? What might it look like, the future of work? And what will it change for each of us? We put these questions to professors from all faculties. In their research they deal with digital work models and future-proof education concepts, with hospitals of the future and artificial intelligence. Much differs, yet much is also the same: it is about trust. What does increasing digitalization mean for our work culture? It is about responsibility, towards ourselves and others. It is about diversity. Who are they, the workers of tomorrow? It is about networking, which is omnipresent in a digital world. For us as a university, the "future of work" is a particularly important topic, because our students of today will move in and shape this new working world tomorrow. Which competencies must we teach them? That is a question we keep asking ourselves anew.
Comparative analysis of the R&D efficiency of 14 leading pharmaceutical companies for the years 1999–2018 shows that there is a close positive correlation between R&D spending and the two investigated R&D output parameters, approved new molecular entities (NMEs) and the cumulative impact factor of their publications. In other words, higher R&D investments (input) were associated with higher R&D output. Second, our analyses indicate that there are "economies of scale" (size) in pharmaceutical R&D.
Employing diffuse reflection ultraviolet-visible (UV-Vis) spectroscopy, we developed an approach that is capable of quantitatively determining flux residues on a technical copper surface. The technical copper surface was soldered with a no-clean flux system of organic acids. A post-solder cleaning step with different cleaning parameters produced various levels of residues. The surface was quantitatively and qualitatively characterized using X-ray photoelectron spectroscopy (XPS), Auger electron spectroscopy (AES), Fourier transform infrared spectroscopy (FTIR), and diffuse reflection UV-Vis spectroscopy. Using multivariate analysis (MVA), we examined the UV-Vis data to establish a correlation with the carbon content on the surface. The UV-Vis data could be discriminated for all groups by their level of organic residues. Combined with XPS, the data were evaluated by a partial least squares (PLS) regression to establish a model. Based on this predictive model, the carbon content was calculated with an absolute error of 2.7 at.%. Due to the high correlation of the predictive model, the easy-to-use measurement, and the evaluation by multivariate analysis, the developed method seems suitable for an online monitoring system. With this system, flux residues can be detected in a manufacturing cleaning process of technical surfaces after soldering.
Context: The software-intensive business is characterized by increasing market dynamics, rapid technological changes, and fast-changing customer behaviors. Organizations face the challenge of moving away from traditional roadmap formats to an outcome-oriented approach that focuses on delivering value to the customer and the business. An important starting point and a prerequisite for creating such outcome-oriented roadmaps is the development of a product vision to which internal and external stakeholders can be aligned. However, the process of creating a product vision has been little researched and is poorly understood.
Objective: The goal of this paper is to identify lessons learned from product vision workshops conducted to develop outcome-oriented product roadmaps.
Method: We conducted a multiple-case study consisting of two different product vision workshops in two different corporate contexts.
Results: Our results show that conducting product vision workshops helps to create a common understanding among all stakeholders about the future direction of the products. In addition, we identified key organizational aspects that contribute to the success of product vision workshops, including the participation of employees from functionally different departments.
In this book you will learn about a new and effective learning method for conveying content in a realistic and lasting way, inspiring participants, and setting yourself apart from other providers.
Inspire your participants and coachees with prototypical structural constellations in your trainings and consultations. With this modern method you achieve effective and more lasting learning progress among participants. The transfer into practice begins during the learning process itself.
Prototypical structural constellations simulate typical situations in organizations. Topics that affect several participants in their everyday working lives are taken up and set up as constellations. The method is used for a lively simulation in order to reveal connections and to try out and reflect on behaviors and options for action, much like in a flight simulator. Participants thus have the opportunity to learn actively and realistically. Insights and solutions are gained in a surprising and lasting way.
Product roadmaps in the new mobility domain: state of the practice and industrial experiences
(2021)
Context: The New Mobility industry is a young market characterized by high market dynamics and therefore associated with a high degree of uncertainty. Traditional product roadmapping approaches, such as detailed planning of features over a long time horizon, typically fail in such environments. For this reason, companies active in the field of New Mobility face the challenge of keeping their product roadmaps reliable for stakeholders while at the same time being able to react flexibly to changing market requirements.
Objective: The goal of this paper is to identify the state of practice regarding product roadmapping of New Mobility companies. In addition, the related challenges within the product roadmapping process as well as the success factors to overcome these challenges will be highlighted.
Method: We conducted semi-structured expert interviews with eight experts (seven from German companies and one from a Finnish company) from the field of New Mobility and performed a content analysis.
Results: Overall, the results of the study showed that the participating companies are aware of the requirements that the New Mobility sector entails. Accordingly, they exhibit a high level of maturity in terms of product roadmapping. Nevertheless, some aspects pose specific challenges for the participating companies. One major challenge, for example, is that New Mobility with public clients is often a tender business with non-negotiable product requirements; the product roadmap can thus be significantly influenced from outside. As factors for successful product roadmapping, mainly soft factors were mentioned, such as trust between all people involved in the product development process and transparency throughout the entire roadmapping process.
Context: Currently, most companies apply approaches for product roadmapping that are based on the assumption that the future is highly predictable. However, nowadays companies are facing the challenge of increasing market dynamics, rapidly evolving technologies, and shifting user expectations. Together with the adoption of lean and agile practices, this makes it increasingly difficult to plan and predict upfront which products, services, or features will satisfy customer needs. Companies are therefore struggling to provide product roadmaps that fit dynamic and uncertain market environments and that can be used together with lean and agile software development practices.
Objective: To gain a better understanding of modern product roadmapping processes, this paper aims to identify suitable processes for the creation and evolution of product roadmaps in dynamic and uncertain market environments.
Method: We performed a Grey Literature Review (GLR) according to the guidelines from Garousi et al.
Results: 32 approaches to product roadmapping were identified. Typical characteristics of these processes are a strong connection between the product roadmap and the product vision, an emphasis on stakeholder alignment, the definition of business and customer goals as part of the roadmapping process, a high degree of flexibility with respect to reaching these goals, and the inclusion of validation activities in the roadmapping process. An overall goal of nearly all approaches is to avoid waste by reducing development and business risks early. From the list of 32 approaches found, four representative roadmapping processes are described in detail.
Polymeric micelle-like nanoparticles have demonstrated effectiveness for the delivery of some poorly soluble or hydrophobic anticancer drugs. In this study, a hydrophobic moiety, deoxycholic acid (DCA), was first bonded to a polysaccharide, chitosan (CS), to prepare amphiphilic chitosan (CS-DCA), which was further modified with cationic glycidyltrimethylammonium chloride (GTMAC) to form a novel soluble chitosan derivative (HT-CS-DCA). The cationic amphiphilic HT-CS-DCA readily self-assembled into micelle-like nanoparticles of about 200 nm with a narrow size distribution (PDI 0.08–0.18). The zeta potential of the nanoparticles was in the range of 14 to 24 mV, indicating a high positive surface charge. Doxorubicin (DOX), an anticancer drug with poor solubility, was then entrapped in the HT-CS-DCA nanoparticles. The DOX release test, performed in PBS (pH 7.4) at 37 °C, showed no significant burst release in the first two hours, and the cumulative release increased steadily and slowly in the following hours. DOX-loaded HT-CS-DCA nanoparticles easily entered MCF-7 cells, as observed by confocal microscopy, and demonstrated significant inhibition of MCF-7 growth, whereas blank nanoparticles showed no obvious cellular toxicity. Therefore, these cationic HT-CS-DCA nanoparticles show great promise for the delivery of DOX in cancer therapy.
Preliminary results of homomorphic deconvolution application to surface EMG signals during walking
(2021)
Homomorphic deconvolution is applied to sEMG signals recorded during walking. Gastrocnemius lateralis and tibialis anterior signals were acquired according to the SENIAM recommendations. MUAP parameters such as amplitude and scale were estimated, whilst the MUAP shape parameter was kept fixed. This yields a useful time-frequency representation of the sEMG signal. The estimated MUAP scale parameter was verified by extracting the mean frequency of the filtered EMG signal, derived from the scale parameter estimated with two different MUAP shape values.
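As an illustration of the underlying technique (not the authors' implementation; the synthetic signal, sampling rate, and lifter cutoff below are assumptions), the cepstral core of homomorphic deconvolution can be sketched in a few lines:

```python
import numpy as np

def real_cepstrum(signal):
    """Real cepstrum: inverse FFT of the log magnitude spectrum."""
    log_mag = np.log(np.abs(np.fft.fft(signal)) + 1e-12)  # guard against log(0)
    return np.real(np.fft.ifft(log_mag))

def spectral_envelope(signal, cutoff):
    """Homomorphic deconvolution: low-pass 'liftering' in the cepstral
    domain separates the slowly varying spectral envelope (MUAP shape/scale)
    from the fine structure of the firing impulse train."""
    ceps = real_cepstrum(signal)
    lifter = np.zeros_like(ceps)
    lifter[:cutoff] = 1.0
    lifter[-(cutoff - 1):] = 1.0  # keep the symmetric low-quefrency part
    return np.exp(np.real(np.fft.fft(ceps * lifter)))

# Synthetic sEMG-like signal: a MUAP-shaped kernel convolved with an impulse train
fs = 1000  # Hz, assumed sampling rate
t = np.arange(0, 0.05, 1 / fs)
muap = t * np.exp(-t / 0.005)   # simple MUAP-like waveform
train = np.zeros(1024)
train[::100] = 1.0              # ~10 Hz firing rate
emg = np.convolve(train, muap)[:1024]
envelope = spectral_envelope(emg, cutoff=30)
print(envelope.shape)  # → (1024,)
```

The mean frequency of the estimated envelope can then be compared against that of the filtered EMG signal, analogous to the verification step described in the abstract.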
Given the increasing internationalisation of higher education, universities compete more and more not only for national but especially for international students. Selecting the best candidates from the pool of international applicants is a challenge. In our study, we analysed which criteria best predict the academic performance of students coming from different countries with different education systems, using different grade point average (GPA) standards. Using an administrative data set from an International Business programme at a German university of applied sciences, we explored the predictive power of adjusted high school GPA, IQ test result, interview score, and first-year grades in English, maths, and statistics.
Melamine-formaldehyde (MF) resins are widely used as surface finishes for engineered wood-based panels in decorative laminates. Since no additional glue is applied in lamination, the overall residual curing capacity of MF resins is of great technological importance. Residual curing capacity is measured by differential scanning calorimetry (DSC) as the exothermic curing enthalpy integral of the liquid resin. After resin synthesis is completed, the resulting pre-polymer has a defined chemical structure with a corresponding residual curing capacity. Predicting the residual curing capacity of a resin batch already at an early stage during synthesis would enable corrective measures to be taken by making adjustments while synthesis is still in progress. Thereby, discarding faulty batches could be avoided. Here, by using a batch modelling approach, it is demonstrated how quantitative predictions of MF residual curing capacity can be derived from inline Fourier Transform infrared (FTIR) spectra recorded during resin synthesis using partial least squares regression. Not only is there a strong correlation (R2 = 0.89) between the infrared spectra measured at the end of MF resin synthesis and the residual curing capacity. The inline reaction spectra obtained already at the point of complete dissolution of melamine upon methylolation during the initial stage of resin synthesis are also well suited for predicting final curing performance of the resin. Based on these IR spectra, a valid regression model (R2 = 0.85) can be established using information obtained at a very early stage of MF resin synthesis.
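The chemometric core of the approach, predicting a scalar property from spectra by partial least squares regression, can be sketched with a minimal PLS1 (NIPALS) implementation on synthetic data. The latent-factor structure, batch count, and noise levels below are illustrative assumptions, not values from the study:

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Minimal PLS1 via NIPALS; expects centered X and y,
    returns the regression coefficient vector."""
    W, P, q = [], [], []
    Xr, yr = X.copy(), y.copy()
    for _ in range(n_components):
        w = Xr.T @ yr
        w /= np.linalg.norm(w)      # weight vector
        t = Xr @ w                  # scores
        p = Xr.T @ t / (t @ t)      # X loadings
        qk = yr @ t / (t @ t)       # y loading
        Xr = Xr - np.outer(t, p)    # deflate X
        yr = yr - qk * t            # deflate y
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    return W @ np.linalg.inv(P.T @ W) @ q

rng = np.random.default_rng(42)
# Synthetic stand-in for inline FTIR spectra: 40 batches x 200 wavenumbers,
# driven by 3 latent chemical factors that also determine curing capacity
latent = rng.normal(size=(40, 3))
spectra = latent @ rng.normal(size=(3, 200)) + 0.05 * rng.normal(size=(40, 200))
capacity = latent @ np.array([2.0, -1.5, 1.0]) + 0.1 * rng.normal(size=40)

Xc = spectra - spectra.mean(axis=0)
yc = capacity - capacity.mean()
coefs = pls1_fit(Xc, yc, n_components=3)
predicted = Xc @ coefs + capacity.mean()
r2 = 1 - np.sum((capacity - predicted) ** 2) / np.sum((capacity - capacity.mean()) ** 2)
print(f"R^2 = {r2:.2f}")
```

In practice a dedicated chemometrics package would be used and the model validated on held-out batches; the sketch only shows why a few latent components can summarize hundreds of correlated wavenumber channels.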
To ease the transition from school to university, students in technical subjects often need to refresh their knowledge of mathematics and physics. An online learning system for physics can support students in engaging with physics content. In addition, a physics knowledge test can reveal gaps in a student's individual knowledge and motivate them to learn the missing topics. The working group "eLearning in der Physik" of the Hochschulföderation Süd-West (HfSW), consisting of the Baden-Württemberg universities of Aalen, Esslingen, Heilbronn, Mannheim, and Reutlingen, has compiled a pool of more than 200 physics exercises for first-semester students. They are available to students with solutions in learning management systems for self-study and now also in the "Zentrales Open Educational Resources Repositorium der Hochschulen in Baden-Württemberg" (ZOERR). This contribution reports on the use of the online exercises in 2020/2021, on the results of the knowledge tests, and on the eTutorials newly established during the Corona period.
A systematic study using a central composite design of experiments (DoE) was performed on the oxygen plasma surface modifications of two different polymers—Pellethane 2363-55DE, which is a polyurethane, and vinyltrimethoxysilane-grafted ethylene-propylene (EPR-g-VTMS), a cross-linked ethylene-propylene rubber. The impacts of four parameters—gas pressure, generator power, treatment duration, and process temperature—were assessed, with static contact angles and calculated surface free energies (SFEs) as the main responses in the DoE. The plasma effects on surface roughness and chemistry were determined using scanning electron microscopy (SEM) and X-ray photoelectron spectroscopy (XPS). Evaluation of the sufficiently accurate DoE model established oxygen gas pressure as the most impactful factor, with surface energy and polarity rising as oxygen pressure falls. Both polymers, though different in composition, exhibited similar trends of rising surface energy in the studied system. The SEM images showed a rougher surface topography after low-pressure plasma treatments. XPS and subsequent multivariate data analysis of the spectra established that more highly oxidized species were formed by plasma treatments at a low oxygen pressure of 0.2 mbar.
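The structure of a central composite design can be illustrated with a short sketch in coded units (a circumscribed, rotatable design is assumed here, which may differ from the exact design used in the study):

```python
import itertools
import numpy as np

def central_composite_design(n_factors, alpha=None):
    """Coded design points of a circumscribed CCD:
    2^k factorial corners, 2k axial (star) points, one center point."""
    if alpha is None:
        alpha = (2 ** n_factors) ** 0.25  # axial distance for rotatability
    corners = np.array(list(itertools.product([-1.0, 1.0], repeat=n_factors)))
    axial = []
    for i in range(n_factors):
        for a in (-alpha, alpha):
            pt = np.zeros(n_factors)
            pt[i] = a
            axial.append(pt)
    center = np.zeros((1, n_factors))
    return np.vstack([corners, np.array(axial), center])

# Four factors as in the study: pressure, power, duration, temperature
design = central_composite_design(4)
print(design.shape)  # → (25, 4): 16 corners + 8 star points + 1 center
```

Each coded row is then mapped to physical factor levels; replicated center points would normally be added to estimate pure error.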
The Corona pandemic has restricted everyday medical care. This is reflected, among other things, in sometimes considerable access restrictions to hospitals and practices with greatly reduced patient scheduling, in increased hygiene measures with correspondingly longer waiting times, in the ban on accompanying persons, and not least in the fear of many patients of becoming infected during a stay in medical facilities. As a consequence, a clearly perceptible decline in patient numbers in hospital outpatient clinics and practices was and is being recorded. Ophthalmology, as a discipline with a high proportion of outpatient and elective surgical procedures, was particularly affected.
Organisationale Identität in digitalisierten Arbeitswelten: Grundlagen für gelingende Kooperation
(2021)
Organizations form identities and in doing so answer the questions "Who are we? And who are we not?". Notions of organizational identity initially assume traditional organizations. Through digitalization, previously integrated tasks can be modularized more strongly, so that the coordination of the overall organizational task becomes more task-oriented and less person-oriented. In addition, organizational tasks can increasingly be organized in a project-oriented and virtual manner, so that external task carriers can be integrated more easily. Our notions of organizational boundaries and memberships change as a result. This raises the question of the extent to which a shared organizational identity develops in such task- and project-oriented organizations with dissolved boundaries. This contribution argues that the paths of identity development change, but that the functions of organizational identity for successful cooperation are preserved.
In this paper, we examine the political gridlock in reforming the Economic and Monetary Union. We utilize a two-stage game with imperfect information in order to study the optimal sequencing. The main results are: first, optimal sequencing requires a stage-two default option for non-compliant Member States, which in principle corresponds to today's fiscal architecture (EMU-I). Second, we show that compliant countries prefer a reform equilibrium today if and only if they have a free choice about the preferred fiscal architecture at the end, either EMU-II with binding European coordination or EMU-I related to Maastricht. Notably, our sequencing approach works for any design of the EMU-II architecture.
Programmable nano-bio interfaces driven by tuneable vertically configured nanostructures have recently emerged as a powerful tool for cellular manipulations and interrogations. Such interfaces have strong potential for ground-breaking advances, particularly in cellular nanobiotechnology and mechanobiology. However, the opaque nature of many nanostructured surfaces makes non-destructive, live-cell characterization of cellular behavior on vertically aligned nanostructures challenging. Here, a new nanofabrication route is proposed that enables harvesting of vertically aligned silicon (Si) nanowires and their subsequent transfer onto an optically transparent substrate, with high efficiency and without artefacts. We demonstrate the potential of this route for efficient live-cell phase contrast imaging and subsequent characterization of cells growing on vertically aligned Si nanowires. This approach provides the first opportunity to understand dynamic cellular responses to a cell-nanowire interface, and thus has the potential to inform the design of future nanoscale cellular manipulation technologies.
Through cyber-physical systems and application-oriented AI approaches, conventional production systems are increasingly evolving into "smart" production systems, which are characterized, among other things, by a high level of communication and integration of the individual components. The exchange of information between the systems is usually oriented only towards the data content, with semantics considered only implicitly. The adaptability required by external and internal influences demands the integration of new components or the redesign of existing ones. An open, application-oriented ontology extends the information and communication exchange with explicit semantic information. This enables better integration of new components and easier reconfiguration of existing ones. The developed ontology as well as the derived application and use of the semantic information are evaluated by means of a practical use case.
Massive data transfers in modern data-intensive systems, resulting from low data locality and data-to-code system designs, hurt their performance and scalability. Near-data processing (NDP) and a shift to code-to-data designs may represent a viable solution, as packaging combinations of storage and compute elements on the same device has become feasible. The shift towards NDP system architectures calls for a revision of established principles. Abstractions such as data formats and layouts typically span multiple layers in traditional DBMS, and the way they are processed is encapsulated within these layers of abstraction. NDP-style processing requires an explicit definition of cross-layer data formats and accessors to ensure in-situ execution that optimally utilizes the properties of the underlying NDP storage and compute elements. In this paper, we make the case for such data format definitions and investigate the performance benefits under RocksDB and the COSMOS hardware platform.
Respiratory diseases are leading causes of death and disability in the world. The recent COVID-19 pandemic also affects the respiratory system. Detecting and diagnosing respiratory diseases requires both medical professionals and a clinical environment. Most of the techniques used to date have also been invasive or expensive.
Some research groups are developing hardware devices and techniques to enable non-invasive or even remote respiratory sound acquisition. These sounds are then processed and analysed for clinical, scientific, or educational purposes.
We present the literature review of non-invasive sound acquisition devices and techniques.
The results cover a large number of digital tools, such as microphones, wearables, and Internet of Things devices, that can be used for this purpose.
Some interesting applications have been found. Some devices facilitate sound acquisition in a clinical environment, while others enable daily monitoring outside it. We aim to use some of these devices and include the non-invasively recorded respiratory sounds in a Digital Twin system for personalized health.
In the IGF project No. 19617 N, nitrogen- and phosphorus-substituted alkoxysilanes were prepared and their ability to inhibit fire growth and spread on fabrics was explored. To this end, a series of flame retardants was synthesized using different strategies, including click chemistry and nucleophilic substitution of commercial organophosphorus compounds with amino-based trialkoxysilanes and/or cyanuric chloride. The new halogen-free and aldehyde-free flame retardants were applied to different fabrics such as cotton (CO), polyethylene terephthalate (PET), polyamide (PA), and their blends using the well-known pad-dry-cure technique and the sol-gel method. The flame-retarding efficiencies were evaluated according to the EN ISO 15025 test method (protective clothing: protection against heat and flame; method of test for limited flame spread). Good flame retardancy of the hybrid organic-inorganic materials was achieved on cotton fabrics with the addition of as little as 3-5 wt.%. Moreover, the water solubility and the washing resistance could be controlled through the functional groups attached to the phosphorus atom or through optimization of the curing temperature. Overall, the research project demonstrated that N-P-silanes are very good permanent flame retardants for textiles.
Escherichia coli (E. coli) is considered the most common life-threatening infectious bacterium in daily life and poses a major challenge to human health. However, the frequent overuse and misuse of antibiotics has triggered increased multidrug resistance, which hinders therapeutic outcomes and causes higher mortality. Herein, we address near-infrared (NIR) laser-excited, human serum albumin (HSA) mediated graphene oxide loaded with palladium nano-dots (HSA-GO-Pd) that can effectively combat Gram-negative E. coli in vitro. Under NIR laser excitation, the designed hybrid material efficiently generates singlet oxygen and hydroxyl radicals, as shown by electron spin resonance (ESR) analysis. Transmission electron microscopy (TEM) images show small spherical PdNPs on the surface of the GO nanosheets. The zeta (ζ) potential study indicates that, in an aqueous medium, the average PdNP size is 5–8 nm and the surface charge, conferred by the capping serum protein (HSA), is +25 mV. The spectroscopic characterization reveals that, in the synthesized HSA-GO-Pd nanocomposite, the PdNPs are well dispersed on the surface of the graphene oxide. The as-synthesized HSA-GO-Pd shows excellent antibacterial activity against the Gram-negative pathogen, killing 95% of the bacteria within 5 h. HSA-GO-Pd is highly biocompatible and shows significant antibacterial activity. Owing to its intense photothermal conversion potential and low toxicity to normal cells, the hybrid HSA-GO-Pd combined with NIR irradiation offers a valuable route towards the effective ablation of pathogenic bacteria.
Nanocoatings based on sol-gel coatings are presented as a suitable tool to modify polymer-based materials. The main focus is on textiles as the most common polymer materials. It is presented which types of functionalization can be achieved by modified sol-gel processes. A suitable categorization of functions is also given and related to common applications. A special focus is placed on the functional properties antimicrobial, UV-protective, and flame-retardant. The concept of bifunctional coatings is discussed, and in particular the combination of water-repellent and antistatic properties is presented.
Business research in the textile and clothing industry has failed to develop solutions that solve the industry's sustainability problem. This is primarily due to the way research is conducted in our field. Given the problem at hand, the strong focus on empirical work can only provide limited help, because empirical research takes place within existing patterns of thought and tends to be oriented towards the present or the past. Solving future-oriented questions, however, requires entirely new parameters and logics. A rethink is needed.
During the curing of thermosetting resins, the technologically relevant properties of binders and coatings develop. However, curing is difficult to monitor due to the multitude of chemical and physical processes taking place, and precise prediction of specific technological properties from molecular properties is very difficult. In this study, the potential of principal component analysis (PCA) and principal component regression (PCR) in the analysis of Fourier transform infrared (FTIR) spectra is demonstrated using the example of melamine-formaldehyde (MF) resin curing in the solid state. FTIR/PCA-based reaction trajectories are used to visualize the influence of temperature on isothermal cure. An FTIR/PCR model for predicting the hydrolysis resistance of cured MF resins from their spectral fingerprints is presented, which illustrates the advantages of FTIR/PCR compared to the combination of differential scanning calorimetry and isoconversional kinetic analysis. The presented methodology is transferable to the curing reactions of any thermosetting resin and can be applied to model other technologically relevant final properties as well.
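A minimal sketch of such FTIR/PCA reaction trajectories on synthetic spectra (band positions, reaction kinetics, and noise level are illustrative assumptions, not data from the study):

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic stand-in for FTIR spectra recorded during an isothermal cure:
# 50 time points x 300 wavenumbers, two spectral components evolving over time
times = np.linspace(0, 1, 50)
comp_a = np.exp(-3 * times)    # educt-like band, decaying
comp_b = 1 - comp_a            # network-like band, growing
band_a = np.exp(-0.5 * ((np.arange(300) - 100) / 10) ** 2)
band_b = np.exp(-0.5 * ((np.arange(300) - 200) / 12) ** 2)
spectra = np.outer(comp_a, band_a) + np.outer(comp_b, band_b)
spectra += 0.01 * rng.normal(size=spectra.shape)

# PCA via SVD of the mean-centered spectra
centered = spectra - spectra.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
scores = U * S                           # time points projected onto the PCs
explained = S**2 / np.sum(S**2)          # explained variance ratios
print(f"PC1 explains {explained[0]:.1%} of the variance")
# The reaction trajectory is the curve traced over time by
# (scores[:, 0], scores[:, 1]) in the score plot.
```

Plotting the first two score columns against each other then visualizes how the cure progresses, and how trajectories recorded at different temperatures diverge.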
The isothermal curing of melamine resin is investigated by in-line infrared spectroscopy at different temperatures. The infrared spectra are decomposed into time courses of characteristic spectral patterns using Multivariate Curve Resolution (MCR). It was found that, depending on the applied curing temperature, melamine films with different spectral fingerprints and correspondingly different chemical network structures are formed. The network structures of fully cured resin films are specific to the applied curing temperature and cannot simply be compensated for by changes in the curing time. For industrial curing processes, this means that cure temperature is the main system-determining factor at constant M:F ratio. However, different MF resin networks can be specifically obtained from one and the same melamine resin by suitable selection of the curing time and temperature profiles to design resin functionality. The spectral fingerprints after short as well as long curing times reflect the fundamental differences between the thermoset networks that can be obtained with industrial short-cycle and multi-daylight presses.
The seamless fusion of the virtual world of information with the real physical world of things is considered the key to mastering the increasing complexity of production networks in the context of Industry 4.0. This fusion, widely referred to as the Internet of Things (IoT), is primarily enabled through the use of automatic identification (Auto-ID) technologies as an interface between the two worlds. Existing Auto-ID technologies almost exclusively rely on artificial features or identifiers that are attached to an object for the sole purpose of identification. In fact, using artificial features for the purpose of identification causes additional effort and is not always applicable. This paper, therefore, follows an approach of using multiple natural object features, defined by the technical product information from computer-aided design (CAD) models, for direct identification. By extending optical instance-level 3D object recognition by means of additional non-optical sensors, a multi-sensor automatic identification system (AIS) is realised that is capable of identifying unpackaged piece goods without the need for artificial identifiers. While the implementation of a prototype confirms the feasibility of the approach, first experiments show improved accuracy and distinctiveness in identification compared to optical instance-level 3D object recognition. This paper introduces the concept of multi-sensor identification and presents the prototype multi-sensor AIS.
Teaching at assembly workstations in production at SMEs (small and medium-sized enterprises) often does not take place at all or only insufficiently. In addition to the lack of technical content, people who are "untrained" from an ergonomic point of view usually automatically acquire incorrect movement sequences. An AI-based approach is used to analyze a defined workflow for a specific assembly scope with regard to the behavior of several employees. Based on these different behaviors, the AI gives feedback on the points in time, work steps, and movements at which particularly dangerous incorrect postures occur. Motion capturing and digital human model simulation, in combination with the results of the AI, define the optimized workflow. Individual employees can be trained directly because the AI identifies their most serious incorrect postures and provides them with a direct comparison of their "wrong" posture and an "easy-on-the-joints" posture. With the assistance of various test persons, the AI can conduct a study in which the most frequently occurring incorrect postures are identified. This can be done in general or tailored to specific groups of people (e.g. "People over 1.90 m tall must be particularly careful not to make the following mistake..."). The approach will be tested and validated at Werk150, the factory of the ESB Business School on the campus of Reutlingen University. The knowledge gained will subsequently be used for training in SMEs.
Software is an integral part of new features in the automotive sector. To determine software quality, car manufacturers in the Hersteller Initiative Software (HIS) consortium defined a set of metrics. Yet, problems with assigning metrics to quality attributes often occur in practice: the specified boundary values lead to discussions between contractors and clients, as different standards and metric sets are used. This paper studies metrics used in the automotive sector and the quality attributes they address. The HIS, ISO/IEC 25010:2011, and ISO/IEC 26262:2018 are utilized to draw a big picture illustrating (i) which metrics and boundary values are reported in the literature, (ii) how the metrics match the standards, (iii) which quality attributes are addressed, and (iv) how the metrics are supported by tools. Our findings from analyzing 38 papers include a catalog of 112 metrics, of which 17 define boundary values and 48 are supported by tools. Most of the metrics are concerned with source code, are generic, and are not specifically designed for automotive software development. We conclude that many metrics exist, but a clear definition of the metrics' context, notably regarding the construction of flexible and efficient measurement suites, is missing.
The present work proposes the use of modern ICT technologies such as smartphones, NFC, the internet, and web technologies to help patients carry out their therapies. The implemented system provides a calendar with reminders for medication intake, ensures drug identification through NFC, and allows remote assistance from healthcare staff and family members to check and manage the therapy in real time. The system also provides centralized information on the patient's therapeutic situation, helpful in choosing new compatible therapies.