The regulation of banks, financial institutions and rating agencies has returned to the focus of politics and academia, not least since the last financial crisis. Numerous banks are still under special supervision owing to government shareholdings, and particularly in the highly indebted European states, banks are assigned a share of the blame for the crisis. The rating agencies are also repeatedly in focus: through alleged misjudgments and opaque methods and models, they supposedly discriminate against companies and states with their credit ratings and narrow their financing options. The desire for stronger regulation and transparency was again palpable in recent weeks during the campaign for the European Parliament. The following article traces the historical development of the regulatory stages from Basel I to Basel III and introduces the relevant institutions and norms, in order to initiate, on the basis of this overview, a critical appraisal and discussion.
Notwithstanding the abundance described above, and the over-regulation of the financial market (particularly of banks and rating agencies) that might be inferred from it, the benefits of regulated creditworthiness and solvency assessment should not be disregarded. Besides the macroeconomic protection of creditors, there are also internal benefits for the debtors themselves. One internal benefit for a company is the information gained for its financial management: risks can be identified and eliminated. Optimizing a rating by improving key figures or remedying uncovered weaknesses can lead to better credit conditions. In a company valuation, a good rating justifies a higher enterprise value; this is already evident in the auditors' IDW S1 standard for determining a discounted cash flow as the basis of an enterprise value under the WACC-CAPM model. Ratings can act as an early-warning system, alerting in good time to impending financial distress and sensitizing management or owners to financial risks. Published creditworthiness and solvency ratings can create trust among customers, suppliers and all of a company's stakeholders, and may yield a better negotiating position. With a rating, a company can use alternative financing instruments, such as issuing its own bonds, and is not necessarily dependent on bank lending.
Ultimately, then, over-regulation deserves criticism, but the necessity of regulation for the complex creditworthiness and solvency assessment of debtors in the interest of creditors cannot be denied.
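The link between rating, WACC and enterprise value sketched in the abstract can be illustrated with a small calculation. This is a minimal sketch with hypothetical numbers, not the IDW S1 procedure itself: a better rating is assumed to lower the company's cost of debt, which lowers the WACC and raises the discounted cash flow value.

```python
# Illustrative sketch (hypothetical numbers): how a better rating, via a
# lower credit spread, feeds into the WACC and hence a DCF enterprise value.

def capm_cost_of_equity(risk_free, beta, market_premium):
    # CAPM: r_e = r_f + beta * (E[r_m] - r_f)
    return risk_free + beta * market_premium

def wacc(equity, debt, r_e, r_d, tax_rate):
    # Weighted average cost of capital with the tax shield on debt
    total = equity + debt
    return equity / total * r_e + debt / total * r_d * (1 - tax_rate)

def dcf_value(fcf, growth, discount):
    # Gordon-growth perpetuity of next year's free cash flow
    return fcf * (1 + growth) / (discount - growth)

r_e = capm_cost_of_equity(0.02, 1.1, 0.06)  # 8.6 % cost of equity
# good rating: 3 % debt cost; weak rating: 6 % debt cost (assumed spreads)
v_good = dcf_value(10.0, 0.01, wacc(60, 40, r_e, 0.03, 0.30))
v_weak = dcf_value(10.0, 0.01, wacc(60, 40, r_e, 0.06, 0.30))
print(v_good > v_weak)  # → True: the better rating implies a higher value
```

The direction of the effect, not the exact figures, is the point: the rating enters the valuation only through the cost of debt here.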
Lexikon der Betriebswirtschaft: 3000 grundlegende und aktuelle Begriffe für Studium und Beruf
(2015)
What is strategic planning, which tax changes are there, what is meant by break-even analysis, cash flow, activity-based costing or the balanced scorecard, and what are the particularities of consolidated financial reporting? An up-to-date reference work with numerous cross-references for students and practitioners.
Identifying and managing risks is becoming ever more important for companies in a turbulent and dynamic environment. New regulations such as Basel II and Solvency II and, above all, the current rules enacted by many states in response to the financial market crisis are leading to an increased use of risk management instruments outside banks and insurance companies as well. Knowledge of the legal requirements (Basel II, DRS, SolvV and MaRisk), of the theoretical foundations of risk, and of the associated measurement and early-warning methods (scenario analysis, Delphi) therefore remains vitally important for companies. Moving from the fundamentals of risk management to risk controlling and risk steering (i.e., the identification and measurement of risks), this practical guide also addresses risk provisioning and risk transfer through derivatives, a topic of enormous importance for planning risk management in a company. Finally, a case study of a successful risk management implementation is examined. Step by step, it demonstrates not only the concrete implementation but also that such an introduction is feasible and worthwhile.
Trust in our currencies is declining. Central banks are flooding the financial markets with cheap money. Germany's economy is booming, while other euro countries face high unemployment and the threat of sovereign default. Can a system of low interest rates, deflation risk and borrowed prosperity endure, or should we again consider backing currencies with real assets in order to stabilize the financial system? People have used money for millennia; it has always served as a unit of account and a medium of exchange. Yet people have also always used other exchange and financial systems, and today Miles & More points, real barter and digital currencies such as Bitcoin are already a reality. The systemic question also arises whether only central banks should issue money, or whether, in a market economy, the issuance of money could be open to everyone. The exchange of goods and time, regional currencies, demurrage money (Schwundgeld) and free money (Freigeld) are just some of the ideas that, as substitute or complementary currencies, might restore trust in the financial system. The aim of this book is a comprehensible overview of the current monetary system and its alternatives.
This thesis deals with the new German electronic identity card. First, the paper explains the security objectives of the identity card and the technical implementation of its architecture and protocols. It outlines the process of an online identification for a user with the card. Risks and weaknesses of the technology in both software and hardware are discussed, and the hacking attacks that have already occurred are presented. The thesis describes how users can protect themselves against attacks. It sets out the reasons why the new identity card has found only weak acceptance online, and why education about the available applications, a price reduction for card readers, and the eIDAS Regulation issued by the European Parliament and the Council will not help to drive adoption. A user study provides the evidence for this. Finally, ideas are presented for how the use of the card's electronic functions could instead be promoted.
This paper presents a novel multi-modal CNN architecture that exploits complementary input cues in addition to color information alone. The joint model implements a mid-level fusion that allows the network to exploit cross-modal interdependencies already at a medium feature level. The benefit of the presented architecture is shown for the RGB-D image understanding task. So far, state-of-the-art RGB-D CNNs have used network weights trained on color data. In contrast, a superior initialization scheme is proposed that pre-trains the depth branch of the multi-modal CNN independently. In an end-to-end training, the network parameters are optimized jointly using the challenging Cityscapes dataset. Thorough experiments show the effectiveness of the proposed model: both the RGB GoogLeNet and further RGB-D baselines are outperformed by a significant margin on two different tasks, semantic segmentation and object detection. For the latter, this paper shows how to extract object-level ground truth from the instance-level annotations in Cityscapes in order to train a powerful object detector.
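The mid-level fusion idea can be sketched structurally. This is a conceptual stand-in, not the authors' implementation: plain Python lists stand in for feature maps, and fusion is channel concatenation so that the layers after the fusion point see both modalities.

```python
# Conceptual sketch (not the paper's network): mid-level fusion of an RGB
# branch and an independently pre-trained depth branch, with Python lists
# standing in for feature maps.

def rgb_branch(rgb):        # early RGB convolutions (stand-in transform)
    return [x * 0.5 for x in rgb]

def depth_branch(depth):    # independently pre-trained depth branch (stand-in)
    return [x * 2.0 for x in depth]

def mid_fusion(rgb_feats, depth_feats):
    # Concatenate along the channel axis so later joint layers can learn
    # cross-modal interdependencies on a medium feature level.
    return rgb_feats + depth_feats

def head(fused):            # joint layers after the fusion point (stand-in)
    return sum(fused)

fused = mid_fusion(rgb_branch([1.0, 2.0]), depth_branch([0.1, 0.2]))
print(head(fused))  # a single joint prediction built from both modalities
```

The contrast with late fusion is that here the joint layers, not just a final classifier, operate on both modalities at once.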
In smart factories, maintenance remains an important aspect of safeguarding production performance. Especially in the case of machine component failures, diagnosis is a time-consuming task. This paper presents an approach for a cyber-physical failure management system, which uses information from machines, such as programmable logic controller (PLC) or sensor data, together with IT systems to support the diagnosis and repair process. Its key element is a model combining the different information sources to detect deviations and to determine the component that probably failed. The approach is prototypically implemented for leakage detection in compressed air networks.
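The deviation-detection core of such a system can be sketched in a few lines. This is a hedged illustration, not the paper's model: component names, pressure values and the threshold are hypothetical, and the probable failure is simply the component whose measurement deviates most from the model's expectation.

```python
# Hedged sketch of the deviation-detection idea: compare expected sensor
# values from a simple network model against measurements and report the
# component with the largest relative deviation as the probable failure.
# Component names, values (bar) and the threshold are hypothetical.

expected = {"compressor": 8.0, "valve_a": 7.6, "valve_b": 7.2}
measured = {"compressor": 7.9, "valve_a": 7.5, "valve_b": 6.1}

def probable_failure(expected, measured, threshold=0.05):
    deviations = {c: abs(expected[c] - measured[c]) / expected[c]
                  for c in expected}
    suspects = {c: d for c, d in deviations.items() if d > threshold}
    # Return the component deviating most, e.g. a leak downstream of it
    return max(suspects, key=suspects.get) if suspects else None

print(probable_failure(expected, measured))  # → valve_b
```

A real implementation would derive the expected values from the combined PLC/sensor/IT-system model rather than from a static table.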
In today’s marketplace, the consumption of luxury goods is at a peak due to increasing global wealth and low interest rates, resulting in a vast supply of goods and services for which customer experiences are more relevant than ever before. One of the most recent developments in this field shows that consumers no longer simply purchase a product or service based on its fact sheet; they are also interested in the experience around the product. Successful brands must develop and maintain individual images to sustain their competitive advantage and build brand equity that is beneficial for customers and firms. Ideally, these will lead to satisfaction and loyalty between a brand, its products, and its customers. Existing research on brand experience and brand equity has mainly focused on functional aspects, which seem to differ for high-value luxury goods. Most studies have focused on industries like retail and fashion brands, sampling university students or visitors to shopping malls, and some have even mixed different types of industries together. This underpins the need for research within a single luxury industry with actual luxury customers who have a solid background of brand experiences.
The purpose of this study was to explore the brand experience spectrum within the automotive industry in Germany, particularly in the affordable luxury sports car sector. Identifying the factors and components that constitute, influence, or drive a brand experience from the customers' perspective was a clear aim of the study. To achieve this, the study collected data from in-depth interviews with German respondents (n=60) who had experience with affordable luxury sports cars. The conceptual framework was based on two empirically tested models guiding this exploratory consumer research. The first model to build on was the consumer-based brand equity model, empirically tested by Çifci et al. (2016) and Nam et al. (2011). The second conceptual framework was Lemon and Verhoef’s (2016) customer journey model, consisting of relevant touchpoints along the following three stages: pre-purchase, purchase, and post-purchase.
The findings of the research demonstrate that, although the six brand equity concepts (brand awareness, physical quality, staff behaviour, self-congruence, brand identification, and lifestyle) are broadly applicable in understanding customer experience in the affordable luxury car industry, the content of these dimensions differs from that suggested by the previous authors. The research established that cognitive and affective (or symbolic) components build the foundation of customer brand experience, supporting the results of Çifci et al. (2016) and Nam et al. (2011). The study also identified brand trust as an important and highly relevant concept for customer brand experience in the luxury automotive industry. Brand trust influences customer satisfaction and loyalty, thereby improving and complementing the existing model. Furthermore, the study confirmed Lemon and Verhoef’s (2016) process model of the customer journey and experience; however, it suggested two different customer journeys depending on the customers’ previous experience (first-time and experienced buyers). The differences between the two groups and the relevance of the journey touchpoints within the three purchase stages vary significantly. Key touchpoints identified for both groups are contact with a dealer and online information gathering; differences were found in the length of the purchase stages and across the customer journey. The study highlights the importance of trust, identification, and product quality for customer brand experience. Moreover, the findings of this study complement the brand equity model of Çifci et al. (2016) by adding the new, highly relevant concept of trust. The current knowledge is complemented by a new understanding and mapping of the customer journey for luxury sports cars in Germany.
This study can assist practitioners and managers by providing a compass indicating which touchpoints are relevant to which customer group. Social value can be achieved by encouraging interactions between brand and consumer (e.g. central product launch events) and through brand-oriented interactions among consumers (e.g. dealer events, clubs, or communities). Customers are motivated to express their distinctiveness through product experience and brand identification (belonging/distinction) and to develop a loyal link to brands.
Affordable Luxury Sports Cars in Germany : Investigating the Determinants of Customer Experience
(2022)
The article discusses the factors affecting the customer experience when buying affordable luxury sports cars in Germany by identifying differences between first-time and experienced buyers. It emphasizes the need to create two different customer journeys based on distinct customer experience clusters; a touchpoint analysis from the customer perspective identified differences in the purchase stages, and staff behaviour and brand trust emerged as drivers of customer satisfaction and brand identification.
This pocket reference compiles the known calculation formulas and findings from industrial practice and from scientific studies in the fields of weaving preparation and weaving. It is intended to facilitate the decision-making processes required in fabric production.
This formula collection not only allows optimal manufacturing specifications for fabrics to be drawn up in a practice-oriented way; it also presents, in due brevity, the most important technical and physical fundamentals of mill operation.
This book is a comprehensive treatment of the findings on weft-thread tension measurement collected over many years. For the weft insertion techniques using rapiers, projectiles and air jet, formulas were developed for calculating weft-thread tensions that enable even the weaving practitioner to predict the maximum weft-thread tensions with good accuracy. Based on these results, a comparison with the weak-point level of the weft yarn allows a decision on whether the weft yarn can be used on the intended weaving machine, or whether the intended weaving machine itself can be used at the desired speed.
Purpose: The purpose of this paper is to examine whether video marketing enhances emotional involvement. To this end, a literature review is conducted in two parts. The first part reviews the development of marketing communication and of video marketing. The second part focuses on emotions themselves: how emotional involvement is generated and how emotions influence consumption behavior.
Findings: The key finding of this paper is that videos can enhance emotions efficiently through their multi-sensory character. Furthermore, it was found that viral videos in particular create emotional enhancement and fit the direct-marketing approach.
Understanding the factors that influence the accuracy of visual SLAM algorithms is very important for the future development of these algorithms, yet so far very few studies have addressed this. In this paper, a simulation model is presented and used to investigate the effect of the number of scene points tracked, the effect of the baseline length in triangulation, and the influence of image point location uncertainty. It is shown that the latter is very critical, while the others all play important roles. Experiments with a well-known semi-dense visual SLAM approach, used in a monocular visual odometry mode, are also presented. The experiments show that not including sensor bias and scale factor uncertainty is very detrimental to the accuracy of the simulation results.
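The baseline effect investigated in the paper can be made concrete with standard triangulation error propagation. This is a minimal sketch with illustrative numbers, not the paper's simulation model: for the stereo relation z = f·b/d, an image point (disparity) uncertainty of dd pixels propagates to a depth uncertainty of roughly z²/(f·b)·dd, so a longer baseline reduces the depth error.

```python
# Minimal sketch of the baseline effect in triangulation. With z = f*b/d,
# a disparity uncertainty dd propagates to dz = z**2 / (f*b) * dd, so the
# depth error shrinks as the baseline grows. Numbers are illustrative.

def depth_uncertainty(z, focal_px, baseline_m, disparity_err_px):
    return z ** 2 / (focal_px * baseline_m) * disparity_err_px

short = depth_uncertainty(10.0, 700.0, 0.1, 0.5)   # 10 cm baseline
long_ = depth_uncertainty(10.0, 700.0, 0.5, 0.5)   # 50 cm baseline
print(short > long_)  # → True: the longer baseline gives the smaller error
```

The quadratic growth of dz with z also shows why image point location uncertainty is so critical for distant scene points.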
In the period from the 1950s to 2013, the US Food and Drug Administration (FDA) approved 1346 new molecular entities (NMEs) or new biologic entities (NBEs). On average, the approval rate was 20 NMEs per year. In the past 40 years, the number of new drugs launched onto the market increased slightly, from 15 NMEs in the 1970s to 25–30 NMEs since the 1990s. The highest numbers of new drugs approved by the FDA were in 1996 and 1997, which might be related to the enactment of the Prescription Drug User Fee Act (PDUFA) in 1993.
The reduced research and development (R&D) efficiency, strong competition from generics, increased cost pressure from payers, and an increased biological complexity of new target indications have resulted in a rethinking and a change from a traditional and more closed R&D model in the pharmaceutical industry toward the new paradigm of open innovation. In the past years, pharmaceutical companies have broadened their external networks toward research collaborations with academic institutes, technology providers, or codevelopment partners. To fulfill the demand to reduce timelines and costs, research-based pharmaceutical companies started to outsource R&D activities. In addition, internal R&D processes were adjusted to the more open R&D model and new processes such as alliance management were established. The corporate frontier of pharmaceutical companies became permeable and more open. As a result, the focus of pharmaceutical R&D expanded from a purely internal toward a mixed internal and external model. Today, the U.S. pharmaceutical company Eli Lilly may have established the most open model toward external innovation, as it has integrated its innovation processes with its business model. Other companies are following this more open R&D model with newer concepts such as new frontier sciences, drug discovery alliances, private public partnerships, innovation incubators, virtual R&D, crowdsourcing, open source innovation, and innovation camps.
It is known that the costs associated with drug research and development (R&D) and the timelines to develop a new drug have increased over the past years. In parallel, the success rates of drug projects along the pharmaceutical R&D phases are still very low, and the outcome of all R&D efforts is stagnating. In consequence, the R&D efficiency, defined as the financial investment per drug, has been steadily decreasing. As innovation is the major growth driver of the pharmaceutical industry, reliable data on R&D efficiency and new concepts to overcome these challenges are of great interest for R&D managers and for the sustainability of the pharmaceutical industry as a whole. This book chapter reviews publications of the past years on R&D performance indicators, such as the success rates and timelines per phase. Additionally, it illustrates the factors that most influence the success rates, timelines, and costs of pharmaceutical R&D and, thus, the denominators of R&D efficiency.
The efficiency of pharmaceutical research and development (R&D), reflected in increasing R&D costs, long timelines, and low probabilities of technical and regulatory success, has decreased continuously in the past years. Today, the costs of discovering and developing a new drug are enormously high, at more than USD 2 billion per new molecular entity (NME), while the average overall success rate of a research project delivering an NME is in the single-digit percentage range, and the total R&D timelines easily exceed 10 years, calling into question the return on investment (ROI) of pharmaceutical R&D. As a consequence, and also driven by numerous patent expirations of blockbuster drugs that increased the pressure to return to an acceptable ROI, the pharmaceutical industry addressed this challenge and its causes and identified several actions that need to be taken to increase the output/input ratio of R&D. This book chapter will review the pipeline sizes and R&D investments of multinational pharmaceutical companies, describe new processes that have been implemented to increase the reach and reduce the costs of pharmaceutical R&D, and illustrate new innovation models that were developed to increase R&D efficiency.
New drugs serving unmet medical needs are one of the key value drivers of research-based pharmaceutical companies. The efficiency of research and development (R&D), defined as the successful approval and launch of new medicines (output) relative to the monetary investments required for R&D (input), has been declining for decades. We aimed to identify, analyze and describe the factors that impact R&D efficiency. Based on publicly available information, we reviewed the R&D models of major research-based pharmaceutical companies and analyzed the key challenges and success factors of a sustainable R&D output. We calculated that the R&D efficiencies of major research-based pharmaceutical companies were in the range of USD 3.2–32.3 billion (2006–2014). As these numbers challenge the model of an innovation-driven pharmaceutical industry, we analyzed the concepts that companies are following to increase their R&D efficiencies: (A) activities to reduce portfolio and project risk, (B) activities to reduce R&D costs, and (C) activities to increase the innovation potential. While category A comprises measures such as portfolio management and licensing, measures grouped in category B are outsourcing and risk-sharing in late-stage development. Companies have taken diverse steps to increase their innovation potential, and open innovation, exemplified by open source, innovation centers, or crowdsourcing, plays a key role in doing so. In conclusion, research-based pharmaceutical companies need to be aware of the key factors which impact the rate of innovation, R&D costs and the probability of success. Depending on their company strategy and their R&D set-up, they can opt for one of the following open innovator roles: knowledge creator, knowledge integrator or knowledge leverager.
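The efficiency figure used above is a simple ratio, and the arithmetic is worth making explicit. The numbers below are made-up placeholders, not company data from the study: R&D efficiency here is cumulative R&D spend divided by the number of approved NMEs over the same period.

```python
# Arithmetic behind the cited efficiency range: R&D efficiency as the
# cumulative R&D spend (input) per approved new molecular entity (output).
# The figures below are hypothetical placeholders, not real company data.

def rd_efficiency(rd_spend_busd, approved_nmes):
    # USD billion of R&D input per approved NME
    return rd_spend_busd / approved_nmes

print(rd_efficiency(64.0, 10))  # → 6.4 (billion USD per NME)
```

A company landing near the lower end of the reported USD 3.2–32.3 billion range converts its R&D spend into approvals far more efficiently than one near the upper end.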
Artificial intelligence (AI) technologies, such as machine learning or deep learning, have been predicted to strongly impact future organizations and radically change the way projects are managed. The Project Management Institute (PMI), the network of around 1.1 million certified project managers, ranked AI as one of the top three disruptors of the profession. In our own study on the effect of AI, 37% of the project management processes were found to be executable by machine learning and other AI technologies. In addition, Gartner recently postulated that 80% of the work of today's project managers may be eliminated by AI by 2030.
This editorial aims to outline today's project and portfolio management in the context of pharmaceutical research and development (R&D), followed by an AI vision and a more tangible mission, and to illustrate what the consequences of an AI-enabled project and portfolio management could be for pharmaceutical R&D.
Today, virtualizing pharma R&D is increasingly associated with data analytics and artificial intelligence (AI), technologies that have been developed by software companies outside the healthcare sector. The process of virtualizing pharma R&D is closely related to the technological advancements that result in the generation of large data sets, ranging from genomics, proteomics, metabolomics, medical imaging and IoT wearables to large clinical trials, making it necessary for pharma companies to find new ways to store and ultimately analyze information. As a consequence, pharma companies are experimenting with AI in R&D, from in-silico drug design to clinical trial participant identification and dosage error reduction.
Historically, research and development (R&D) in the pharmaceutical sector has predominantly been an in-house activity. To enable investments for game changing late-stage assets and to enable better and less costly go/no-go decisions, most companies have employed a fail early paradigm through the implementation of clinical proof-of-concept organizations. To fuel their pipelines, some pioneers started to complement their internal R&D efforts through collaborations as early as the 1990s. In recent years, multiple extrinsic and intrinsic factors induced an opening for external sources of innovation and resulted in new models for open innovation, such as open sourcing, crowdsourcing, public–private partnerships, innovations centres, and the virtualization of R&D. Three factors seem to determine the breadth and depth regarding how companies approach external innovation: (1) the company’s legacy, (2) the company’s willingness and ability to take risks and (3) the company’s need to control IP and competitors. In addition, these factors often constitute the major hurdles to effectively leveraging external opportunities and assets. Conscious and differential choices of the R&D and business models for different companies and different divisions in the same company seem to best allow a company to fully exploit the potential of both internal and external innovations.
We investigated the state of artificial intelligence (AI) in pharmaceutical research and development (R&D) and outline here a risk and reward perspective regarding digital R&D. Given the novelty of the research area, a combined qualitative and quantitative research method was chosen, including the analysis of annual company reports, investor relations information, patent applications, and scientific publications of 21 pharmaceutical companies for the years 2014 to 2019. As a result, we can confirm that the industry is in an ‘early mature’ phase of using AI in R&D. Furthermore, we can demonstrate that, despite the efforts that need to be managed, recent developments in the industry indicate that it is worthwhile to invest to become a ‘digital pharma player’.
Research and development (R&D) is crucial for the growth and future success of research-based pharma companies. To keep their R&D organisations efficient, pharmaceutical companies have started to harness the potential of open innovation to cut R&D costs and to access external knowledge. These new strategies can be divided into several categories: open source, innovation centres, crowdsourcing and virtual R&D.
Pharmaceutical companies are among the top investors into research and development (R&D) globally, as product innovation is still the main growth driver for the industry and because the related complexities necessitate enormous R&D investments. The market demand for new medicines to be more efficacious or to provide better safety than existing drugs and the regulatory need to prove superiority in clinical trials are reasons why drug R&D is increasingly expensive and pharmaceutical companies need to manage extraordinarily high costs per approved new compound.
Comparative analysis of the R&D efficiency of 14 leading pharmaceutical companies for the years 1999–2018 shows that there is a close positive correlation between R&D spending and the two investigated R&D output parameters, approved NMEs and the cumulative impact factor of their publications. In other words, higher R&D investments (input) were associated with higher R&D output. Second, our analyses indicate that there are "economies of scale" (size) in pharmaceutical R&D.
Circular economy aims to support reuse and to extend product life cycles through repair, remanufacturing, upgrades and retrofits, as well as to close material cycles through recycling. To successfully manage the necessary transformation processes towards circular economy, manufacturing enterprises rely on the competency of their employees. Defining the competency requirements for circular economy-oriented production networks will contribute to the operationalization of circular economy. The International Association of Learning Factories (IALF) states in its mission the development of learning systems addressing these challenges for the training of students and the further education of industry employees. To identify the required competencies for circular economy, the major changes of the product life cycle phases have been investigated based on the state of the science and compared to the socio-technical infrastructure and thematic fields of the learning factories considered in this paper. To operationalize the circular economy approach in the product design and production phase in learning factories, an approach for a cross learning factory network (the so-called "Cross Learning Factory Product Production System", CLFPPS) has been developed. The proposed CLFPPS represents a network along the design dimensions of learning factories. This approach contributes to the promotion of circular economy in learning factories, as it makes use of and combines the focus areas of different learning factories. This enables the CLFPPS to offer a holistic view of the product life cycle in production networks.
Production planning and control are characterized by unplanned events or so-called turbulences. Turbulences can be external, originating outside the company (e.g., delayed delivery by a supplier), or internal, originating within the company (e.g., failures of production and intralogistics resources). Turbulences can have far reaching consequences for companies and their customers, such as delivery delays due to process delays. For target-optimized handling of turbulences in production, forecasting methods incorporating process data in combination with the use of existing flexibility corridors of flexible production systems offer great potential. Probabilistic, data-driven forecasting methods allow determining the corresponding probabilities of potential turbulences. However, a parallel application of different forecasting methods is required to identify an appropriate one for the specific application. This requires a large database, which often is unavailable and, therefore, must be created first. A simulation-based approach to generate synthetic data is used and validated to create the necessary database of input parameters for the prediction of internal turbulences. To this end, a minimal system for conducting simulation experiments on turbulence scenarios was developed and implemented. A multi-method simulation of the minimal system synthetically generates the required process data, using agent-based modeling for the autonomously controlled system elements and event-based modeling for the stochastic turbulence events. Based on this generated synthetic data and the variation of the input parameters in the forecast, a comparative study of data-driven probabilistic forecasting methods was conducted using a data analytics tool. Forecasting methods of different types (including regression, Bayesian models, nonlinear models, decision trees, ensemble, deep learning) were analyzed in terms of prediction quality, standard deviation, and computation time. 
This resulted in the identification of appropriate forecasting methods and the required input parameters for the considered turbulences.
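The comparison of probabilistic forecasting methods described above can be illustrated with a minimal sketch (the delay distribution, the two candidate methods and all numbers here are hypothetical stand-ins, not the paper's actual data or tool chain): two simple probabilistic forecasts of a process delay are scored with the pinball (quantile) loss, a standard metric for comparing quantile forecasts.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(42)

# Synthetic "process delay" data standing in for the simulation-generated
# turbulence data (hypothetical gamma distribution, for illustration only).
delays = rng.gamma(shape=2.0, scale=3.0, size=500)
train, test = delays[:400], delays[400:]

def pinball_loss(y_true, q_pred, q):
    """Pinball (quantile) loss: standard score for quantile forecasts."""
    diff = y_true - q_pred
    return float(np.mean(np.maximum(q * diff, (q - 1) * diff)))

quantiles = [0.1, 0.5, 0.9]

# Method 1: empirical quantiles of the training data.
empirical = {q: float(np.quantile(train, q)) for q in quantiles}

# Method 2: Gaussian fit; quantiles from the fitted normal distribution.
nd = NormalDist(mu=float(np.mean(train)), sigma=float(np.std(train)))
gaussian = {q: nd.inv_cdf(q) for q in quantiles}

# Lower total pinball loss across quantiles = better probabilistic forecast.
score_emp = sum(pinball_loss(test, empirical[q], q) for q in quantiles)
score_gau = sum(pinball_loss(test, gaussian[q], q) for q in quantiles)
print(f"empirical: {score_emp:.3f}  gaussian: {score_gau:.3f}")
```

In the same spirit, the study's candidate methods (regression, Bayesian models, trees, ensembles, deep learning) would each produce quantile forecasts scored on the synthetic data, with prediction quality compared across methods.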
Decreasing batch sizes in production in line with Industrie 4.0 will lead to tremendous changes in the control of logistics processes in future production systems. Intelligent bins are crucial enablers for establishing decentrally controlled material flow systems in value chain networks as well as at the intralogistics level. These intelligent bins have to be integrated into an overall decentralized monitoring and control approach and have to interact with humans and other entities just like other cyber-physical systems (CPS) within the cyber-physical production system (CPPS). To realize a decentralized material supply following the overall aim of a decentralized control of all production and logistics processes, an intelligent bin system is currently being developed at the ESB Logistics Learning Factory. This intelligent bin system will be integrated into the self-developed, cloud-based and event-oriented SES system (the so-called “Self Execution System”), which goes beyond the common functionalities and capabilities of traditional manufacturing execution systems (MES).
To ensure a holistic integration of the intelligent bin for different material types into the SES framework, the required hardware and software components of the decentrally controlled bin system will be split into a common and an adaptable component. The common component represents the localization and network layer, which is identical for every bin, whereas the adaptable component will be customizable to different requirements, such as the specific characteristics of the parts.
The increasing emergence of cyber-physical systems (CPS) and the global crosslinking of these CPS into cyber-physical production systems (CPPS) are leading to fundamental changes in future work and logistics systems, requiring innovative methods to plan, control and monitor changeable production systems as well as new forms of human-machine collaboration. Logistics systems in particular have to accommodate the versatility of CPPS and will be transformed into so-called cyber-physical logistics systems, since the logistical networks will be subject to the constant changes initiated by changeable production systems. This development is driven and reinforced by increasingly volatile and globalized market and manufacturing environments combined with a high demand for individualized products and services. Moreover, the centralized control systems predominantly used today are being pushed to their limits in their ability to deal with the arising complexity of planning, controlling and monitoring changeable work and logistics systems. Decentralized control systems bear the potential to cope with these challenges by distributing the required operations across the various nodes of the resulting decentralized control system.
Learning factories, like the ESB Logistics Learning Factory at ESB Business School (Reutlingen University), provide a wide range of possibilities to develop new methods and innovative technical solutions in a risk-free and close-to-reality factory environment and to transfer knowledge as well as specific competences into the training of students and professionals. To intensify the research and training activities in the field of future work and logistics systems, ESB Business School is transferring its existing production system into a CPPS involving decentralized planning, control and monitoring methods and systems, human-machine-collaboration as well as technical assistance systems for changeable work and logistics systems.
The increasing penetration of cyber-physical systems and their interconnection into cyber-physical production systems (CPPS) is leading to fundamental changes in future assembly, manufacturing and logistics systems, which require innovative methods for planning, controlling and monitoring changeable production systems. Future logistics systems will be subject to the requirements of high-frequency change and reconfiguration triggered by changeable production systems for individualized products and small batch sizes. The use of decentralized control systems, in which the complex planning, control and monitoring processes are distributed across numerous nodes and entities of the resulting control system, offers great potential for meeting the requirements of cyber-physical logistics systems. A central challenge here is the real-time control and reconfiguration of so-called hybrid logistics systems, which are characterized, among other things, by human-machine collaboration, the combination of different types of conveyors and different control architectures, and which moreover rest on hybrid decision-making processes that synergistically exploit the capabilities of humans and (cyber-physical) systems.
Learning factories, such as the ESB Logistics Learning Factory at ESB Business School (Reutlingen University), offer extensive possibilities to develop these innovative methods, systems and technical solutions in a close-to-industry, risk-free factory environment and to transfer them into the education of students and the further training of participants from industry. To expand research, teaching and training in the field of future assembly, manufacturing and logistics systems, the existing production system of the ESB Logistics Learning Factory is being gradually transformed, within a wide range of research and student projects, into a decentrally controlled cyber-physical production system based on an event-oriented, cloud-based and decentralized control architecture.
Future intralogistics systems need to adapt flexibly to changing material flow requirements in line with future versatile factory environments, producing personalized products under the performance and cost conditions of today's mass production. Small batch sizes down to a batch size of "1" lead to a high complexity in the design and economical manufacturing of these customized products. Intralogistics systems are integrated into higher-level areas (segment level) as well as into upstream and downstream performance units (system-wide areas). This includes the logistics activities relevant for the system (organized according to storage, picking and transport), such as transportation or storage tasks for tools, semi-finished products, components, assemblies, containers and waste. Today's centralized material flow control systems, which operate on the basis of predefined processes, are neither capable of nor suited to dealing with the arising complexity of changeable intralogistics systems. Autonomous, decentralized material flow control systems distribute the required decision-making and control processes across intelligent logistics entities. A major step in the development of an autonomous control method for hybrid intralogistics systems (manual, semi-automated and automated) is the development of a generic archetype for intralogistics systems regarding the system boundaries, elements and relations, resulting in a descriptive model that takes into account, among other things, the time of demand, the availability of resources, economic efficiency and technical performance parameters. The ESB Logistics Learning Factory at ESB Business School (Reutlingen University) serves as a close-to-reality development and validation environment for this purpose.
Manufacturing companies are confronted with external (e.g. short-term change of the product configuration by the customer) and internal (e.g. production process deviations) turbulences which affect the performance of production. Predefined, centrally controlled logistics processes limit the possibilities of production to initiate countermeasures and react in an optimized way to these turbulences. The autonomous control of intralogistics offers great potential to cope with these turbulences by using the respective flexibility corridors of production systems and applying intelligent logistics objects with decentralized decision and process execution capabilities to maintain a target-optimized production. A method for AI-based storage-location and material-handling optimization has been developed to achieve a performance-optimized intralogistics system through continuous monitoring of performance-relevant parameters and influencing factors using AI (e.g. for pattern recognition). To provide the basis to investigate and demonstrate the potentials of autonomously controlled intralogistics in connection with turbulences of production and in combination with AI, an intelligent warehouse involving an indoor localization system, smart bins, and manual, semi-automated/collaborative and autonomous transport systems has been developed and implemented at Werk150, the factory on campus of ESB Business School (Reutlingen University). This scenario, which has been integrated into graduate training modules, allows the analysis and demonstration of different intralogistics measures to cope with turbulences in production, involving among others storage and material provision processes. The target fulfilment of the applied intralogistics measures to master arising turbulences is assessed based on the overall performance of production, considering lead times and adherence to delivery dates.
By applying artificial intelligence (AI) algorithms, the intelligent logistical objects (smart bins, transport systems, etc.) as well as the entire logistics system should be enabled to improve their decision and process execution capabilities in order to master short-term turbulences in the production system autonomously.
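The storage-location side of such an approach can be illustrated with a deliberately simplified greedy sketch (the item names, picking frequencies and location distances below are invented for illustration, not the AI method developed at Werk150): high-turnover items are assigned to storage locations with short travel distances, the baseline that continuously monitored demand data and pattern recognition would refine.

```python
# Hypothetical monitored picking frequencies per item (picks per shift).
pick_frequency = {"housing": 42, "gearbox": 7, "cover": 18, "shaft": 3}

# Hypothetical travel distances of free storage locations (metres to assembly).
locations = {"A1": 5.0, "A2": 8.0, "B1": 14.0, "B2": 20.0}

# Greedy assignment: most frequently picked item -> closest free location.
items_by_freq = sorted(pick_frequency, key=pick_frequency.get, reverse=True)
locs_by_dist = sorted(locations, key=locations.get)
assignment = dict(zip(items_by_freq, locs_by_dist))

# Expected round-trip travel per shift under this assignment.
travel = sum(2 * locations[assignment[i]] * pick_frequency[i]
             for i in pick_frequency)
print(assignment)
print(f"expected round-trip travel per shift: {travel:.0f} m")
```

An AI-based method would go beyond this static baseline by re-evaluating the assignment as monitored picking patterns drift, e.g. when turbulences shift demand between workstations.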
The global demand for individualized products, leading to decreasing production batch sizes, requires innovative approaches to organizing production and logistics systems in a dynamic manner. Current material flow systems mainly rely on predefined system structures and processes, which result in a huge increase in complexity and effort for system and process changes when realizing an optimized production and material provision of individualized products. Autonomous production and logistics entities in combination with intelligent products or logistic load carriers, following the vision of the "Internet of Things", offer a promising solution for mastering this complexity based on autonomous, decentralized and target-optimized decision-making and structure formation without the need for predefined processes and central decision-making bodies. Customer orders are going to prioritize themselves and communicate directly with the required production and logistics resources. Bins containing the required materials are going to communicate with the conveyors or workers of the respective intralogistics system, organizing and controlling the material flow to the autonomously selected workstation. A current research project is the development of a collaborative tugger train combining the potential of automation and human-robot collaboration in intralogistics. This tugger train is going to be integrated into a self-organized intralogistics scenario involving individualized customer orders (low to high batch sizes). To classify the application of self-organization within intralogistics systems, a criteria catalogue has been developed. The application of this criteria catalogue will be demonstrated using the example of a self-organization scenario involving the collaborative tugger train and an intelligent bin system.
The persistent development towards decreasing batch sizes due to ongoing product individualization, as well as increasingly dynamic market and competitive conditions, leads to new changeability requirements in production environments. Since each of the individualized products might require different base materials or components and manufacturing resources, the paths of the products going through the factory as well as the required internal transport and material supply processes are going to differ for every product. Conventional planning and control systems, which rely on predefined processes and central decision-making, are not capable of dealing with the arising system complexity along the dimensions of changing goods, layouts and throughput requirements. The concepts of "self-organization" in combination with "autonomous control" provide promising solutions to these new requirements by using, among other things, the potential of autonomous, decentralized and target-optimized logistical objects (e.g. smart products, bins and conveyor systems) which are able to communicate and interact with each other as well as with human workers. To investigate the potential of automation and human-robot collaboration for intralogistics, a research project for the development of a collaborative tugger train has been started at the ESB Logistics Learning Factory in line with various student projects in neighboring research areas. This collaborative tugger train system, in combination with other manual (e.g. handcarts) and (semi-)automated conveyor systems (e.g. an automated guided forklift), will be integrated into a dynamic, self-organized scenario with varying production batch sizes to develop a method for target-oriented self-organization and autonomous control of intralogistics systems. For a structured investigation of self-organized scenarios, a generic intralogistics model as well as a criteria catalogue has been developed.
The ESB Logistics Learning Factory will serve as a practice-oriented research, validation and demonstration environment for these purposes.
The approach of self-organized and autonomously controlled systems offers great potential to meet new requirements for the economical production of customized products with small batch sizes, based on a distributed, flexible management of dynamics and complexity within the production and intralogistics system. To support the practical application of self-organization in intralogistics systems, a catalogue of criteria for evaluating the self-organization of flexible logistics systems has been developed and validated, which enables the classification of logistics systems as well as the identification and evaluation of corresponding potentials that can be achieved by increasing the degree of self-organization.
The planning and control of intralogistics systems in line with the versatile production systems of smart factories requires new approaches and methods to cope with changing requirements within future factories. The planning of intralogistics can no longer follow a static, sequential approach as in the past, since the planning assumptions are going to change at a high frequency. Reasons for these constant changes are, among others, external turbulences such as rapidly changing market conditions and decreasing batch sizes down to customer-specific products with a batch size of one, as well as internal turbulences (such as breakdowns of production and logistics resources) affecting the production system. This paper gives an insight into research approaches and results on how the capabilities of intelligent logistical objects (intelligent bins, autonomous transport systems, etc.) can be used to achieve a self-organized, cost- and performance-optimized intralogistics system with autonomously controlled process execution within versatile production environments. A first consistent method has been developed, validated and implemented within a scenario at the pilot factory Werk150 at ESB Business School (Reutlingen University). Based on the incoming production orders, the method of the Extended Profitability Appraisal (EPA), covering the work system value to define the most effective work system for order fulfilment, is applied. To derive the appropriate intralogistics processes, an autonomous control method involving principles of decentralized and target-oriented decision-making (e.g. intelligent bins interacting with autonomously controlled transport systems to fulfil material orders of assembly workstations) has been developed and applied to achieve a target-optimized process execution.
The results of the first research stage described in this paper, which uses predefined material sources and sinks, set the basis for the further development of a self-organized and autonomously controlled method for intralogistics systems considering dynamic source and sink relations. By allowing dynamic shifts of production orders, i.e. dynamic source and sink relations, the cost and performance aims of the intralogistics system can be aligned directly with the aims of the entire versatile production system in the sense of self-organized and autonomously controlled systems.
Industrial practice is characterized by random events, also referred to as internal and external turbulences, which disturb the target-oriented planning and execution of production and logistics processes. Methods of probabilistic forecasting, in contrast to single-value predictions, allow an estimation of the probability of various future outcomes of a random variable in the form of a probability density function instead of predicting a specific single outcome. Probabilistic forecasting methods, which are embedded into the analytics process to gain insights into the future based on historical data, therefore offer great potential for incorporating uncertainty into planning and control in industrial environments. In order to familiarize students with these potentials, a training module on the application of probabilistic forecasting methods in production and intralogistics was developed in the learning factory 'Werk150' of the ESB Business School (Reutlingen University). The theoretical introduction to the topic of analytics, probabilistic forecasting methods and the transition to the application domain of intralogistics is based on examples from other disciplines such as weather forecasting and energy consumption forecasting. In addition, data sets of the learning factory are used to familiarize the students with the steps of the analytics process in a practice-oriented manner. After this, the students are given the task of identifying the influencing factors and the information required to capture intralogistics turbulences based on defined turbulence scenarios (e.g. failure of a logistical resource) in the learning factory. Within practical production scenario runs, the students apply probabilistic forecasting, using and comparing different probabilistic forecasting methods.
The graduate training module allows the students to experience the potential of using probabilistic forecasting methods to improve production and intralogistics processes in the context of turbulences and to build up corresponding professional and methodological competencies.
In recent years, artificial intelligence (AI) has increasingly become a relevant technology for many companies. While there are a number of studies that highlight challenges and success factors in the adoption of AI, there is a lack of guidance for firms on how to approach the topic in a holistic and strategic way. The aim of this study is therefore to develop a conceptual framework for corporate AI strategy. To address this aim, a systematic literature review of a wide spectrum of AI-related research is conducted, and the results are analyzed based on an inductive coding approach. An important conclusion is that companies should consider diverse aspects when formulating an AI strategy, ranging from technological questions to corporate culture and human resources. This study contributes to knowledge by proposing a novel, comprehensive framework to foster the understanding of crucial aspects that need to be considered when using the emerging technology of AI in a corporate context.
This paper showed how a known model predictive control method can be used to optimize the energy efficiency of an induction machine in dynamic operation. First, the relations for the power losses were derived, considering only the copper losses in dynamic operation. Based on this, the optimization problem was formulated, the influence of the parameters of the model predictive method on the optimization result was investigated, and recommended values for these parameters were derived. A comparison with two other methods, one without optimization and one with optimization only for stationary operating points, demonstrates the advantages of the model predictive approach.
This paper briefly introduced the two classes of methods for energy-efficient operation of induction machines in steady state, parameter-based methods and search methods, and showed the potential for reducing power losses at stationary operating points for two motors of different frame sizes. It became clear that a considerable reduction of the power losses can be achieved, particularly in the lower part-load range.
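The core idea of loss-optimal stationary operation can be sketched with a simplified copper-loss model (all coefficients and current values below are assumed for illustration, not the motors from the paper): for a given torque, the flux-forming current that minimizes the copper losses follows in closed form, and at partial load it lies well below the rated value, which is where the large savings in the lower part-load range come from.

```python
import math

# Illustrative copper-loss model for a rotor-flux-oriented induction machine
# (all parameter values hypothetical): P_cu = c_d*i_d**2 + c_q*i_q**2, with
# torque T = k * i_d * i_q. For a given torque, the loss-minimal flux-forming
# current follows from dP/di_d = 0 with i_q = T/(k*i_d).

c_d, c_q, k = 1.0, 1.2, 0.8   # loss coefficients and torque constant (assumed)
i_d_rated = 4.0               # rated flux-forming current (assumed)

def losses(i_d, torque):
    i_q = torque / (k * i_d)
    return c_d * i_d**2 + c_q * i_q**2

def i_d_optimal(torque):
    # Analytic minimizer: i_d* = sqrt( sqrt(c_q/c_d) * T / k )
    return math.sqrt(math.sqrt(c_q / c_d) * torque / k)

torque = 2.0                               # partial-load torque (assumed)
p_rated = losses(i_d_rated, torque)        # constant rated flux
p_opt = losses(i_d_optimal(torque), torque)  # loss-optimal flux
print(f"rated-flux losses: {p_rated:.2f}, optimized: {p_opt:.2f}")
```

Parameter-based methods evaluate such a closed-form optimum from a machine model, whereas search methods adjust the flux online until the measured power reaches a minimum; both target the same loss-optimal operating point.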
In this paper, a double hogger used in woodworking machines is considered. The machining tools are driven by induction machines operated by standard inverters. During production, the load of these motors changes periodically between low load and high load at a given speed. This paper investigates the reduction of power losses in such an application using an appropriate energy-efficient control strategy for the induction machines.
A key goal of the new developments bundled under the term Industrie 4.0 is the networking of intelligent components in industrial plants to make processes more transparent and efficient. A further goal is condition monitoring, i.e. monitoring the state of components during operation and estimating their remaining useful life, so that the full service life of a component can be exploited and maintenance intervals can be planned more effectively. The component state is assessed on the basis of measured quantities, captured either by additional sensors introduced into the process or by process data already available in the control and regulation equipment. These measurement data are evaluated and the result is presented to the user.
This paper gives a brief overview of the measured quantities and evaluation methods in use. In addition, a method is presented that avoids the difficulties encountered when assessing the commonly used frequency spectra.
This paper demonstrated the application of an Extended Kalman Filter for assessing the wear state of roller chains. In contrast to the commonly used signal-based methods, a model-based approach was chosen. The Extended Kalman Filter enables the estimation of the parameters of a reduced chain model that approximately reproduces the dynamics of the only measured quantity, the torque of the driving motor. In this paper, the method was applied to measurement data from four endurance tests on roller chains, showing that selected model parameters change as wear increases.
This procedure is a first approach that must be refined in further research. In future work, a prediction step will be carried out in addition to the parameter estimation in order to obtain an estimate of the remaining useful life. Approaches for this exist in the literature and need to be adapted to the specific problem. Furthermore, the modeling methodology must be extended so that knowledge about the process behavior is incorporated into the model, in order to improve the significance of the results as well as the robustness of the method with respect to operating parameters, environmental conditions and unit-to-unit variation.
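The joint state/parameter estimation idea behind the approach can be sketched with a minimal Extended Kalman Filter on a scalar toy system (the dynamics, noise levels and parameter below are illustrative, not the reduced chain model from the paper): the unknown parameter, standing in for a wear-dependent chain parameter, is appended to the state vector and estimated from a single noisy measurement.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy system: x[k+1] = a*x[k] + u[k] + w, measured with noise. The unknown
# parameter a is the quantity to be tracked (it would drift with chain wear).
a_true = 0.85
n = 300
u = np.sin(0.1 * np.arange(n))            # excitation (e.g. torque profile)
x = 0.0
ys = []
for k in range(n):
    x = a_true * x + u[k] + rng.normal(0, 0.01)
    ys.append(x + rng.normal(0, 0.05))     # noisy measurement

# EKF on the augmented state z = [x, a].
z = np.array([0.0, 0.5])                   # deliberately wrong initial a
P = np.diag([1.0, 1.0])
Q = np.diag([1e-4, 1e-6])                  # small drift on a allows tracking
R = 0.05**2
H = np.array([[1.0, 0.0]])                 # only x is measured

for k in range(n):
    # Predict: f(z) = [a*x + u, a]; Jacobian F = [[a, x], [0, 1]]
    F = np.array([[z[1], z[0]], [0.0, 1.0]])
    z = np.array([z[1] * z[0] + u[k], z[1]])
    P = F @ P @ F.T + Q
    # Update with the measurement ys[k]
    S = H @ P @ H.T + R
    K = (P @ H.T) / S
    z = z + (K * (ys[k] - z[0])).ravel()
    P = (np.eye(2) - K @ H) @ P

print(f"estimated a = {z[1]:.3f} (true {a_true})")
```

In the paper's setting, several parameters of the reduced chain model are estimated this way from the motor torque alone, and their drift over the endurance tests serves as the wear indicator.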
Whether in private or professional everyday life, digital media accompany us almost everywhere today. They serve not only for entertainment, but also help us carry out work processes more efficiently and productively. Yet human labor has by no means become superfluous. Due to rising requirements, the demand for qualified specialists is higher than ever. At the same time, employees must be able to keep pace with the rapid development of new products and technologies, making high-quality education and training indispensable. From building media literacy in schools to vocational education and professional development, handling digital technologies must be taught. Moreover, these technologies offer new potential for improving educational concepts and can also help to increase learning success.
This thesis deals with the evaluation of a VR-based learning environment and investigates possible effects of the embodied representation of a virtual instructor on learning success. For this purpose, a collaborative learning environment was technically implemented, with which a series of experiments with 16 participants was subsequently carried out. With regard to a possible increase in efficiency in the independent completion of assembly tasks after different types of instruction, no significant performance improvements were found.
A systemic view of the therapeutic robot Paro compared with the pet robot AIBO
(2020)
Today, robots are found not only in industry but are increasingly used in private areas of life. One example is the social therapy robot Paro. Modeled on the behavior and appearance of a young seal, it expresses emotions and is used particularly in nursing homes, where it shows positive effects on the well-being of people in need of care. This work presents the robot Paro in a systemic analysis, considering the system context, use cases, requirements and structure. This is followed by an analysis of the pet robot AIBO, which resembles a puppy and serves primarily to entertain private individuals. Commonalities and differences between the two systems are worked out. It becomes apparent that both systems primarily provide companionship to the user, but have different requirements and are used in different application domains. In addition, AIBO has more diverse capabilities and a higher degree of mobility than Paro, which is reflected in a more complex hardware structure.