Sustainable management models are geared towards fulfilling the triple bottom line: companies address energy and CO2 efficiency (ecological), occupational safety or unemployment rates (social), as well as growth potential achievable through sustainability, in order to secure the company's own survival in the market (economic). Alongside these stand the 17 Sustainable Development Goals (SDGs), which serve as a worldwide guideline for sustainable business until 2030 and have been transposed into national legislation. This contribution develops a management model that supports companies in identifying relevant SDGs and deriving recommendations for action. Building on a sustainable supply chain, the model assigns the SDGs to the dimensions of the triple bottom line in order to compile a concise checklist of sustainability measures to be considered in the context of the behaviour change model. Drawing on the recommendations of the United Nations, a sustainable management approach is introduced that enables companies to implement governance, transparency and engagement in their supply chain.
Sustainability is a development that meets the needs of the present without compromising the ability of future generations to meet their own needs.
Business Model is a plan for the successful operation of a business, identifying sources of revenue, the intended customer base, products, and details of financing.
Circular economy is an approach of how a company creates, captures and delivers value, with a value creation logic designed to improve resource efficiency through contributing to extending the useful life of products and parts (e.g., through long-life design, repair and remanufacturing) and closing material loops.
Children undergoing systemic chemotherapy often suffer from severe immunosuppression, usually associated with severe neutropenia (neutrophils < 0.5 × 10^9/l). Clinical courses during those periods range from asymptomatic to septic general conditions. Development of septic symptoms can be very fast and life-threatening. Swift detection of risk factors in those patients is therefore needed. So far no early, rapid and reliable marker or tool exists. Ion-Mobility-Spectrometry coupled with a Multi-Capillary-Column (IMS-MCC) can analyze more than 600 volatile components from exhaled air within a few minutes and hence is a potential rapid detection tool. As a proof of concept, we measured the exhaled breath of 11 patients with neutropenia and 10 healthy controls ranging from 3 to 18 years of age at the time of measurement. Ten-milliliter breath samples were taken at the outpatient clinic and analyzed with an onsite IMS-MCC (BreathDiscovery, B&S Analytik, Dortmund, Germany). Dead-space volume was adapted to two groups (small 250 ml, large 500 ml). Interestingly, 59 differing peaks were measured. Eleven were significantly different (p ≤ 0.05), three of which were highly significant (p ≤ 0.01) in Mann-Whitney rank-sum testing. The corresponding analytes used in the decision tree are 2-Propanol, D-Limonene and Acetone. The analytes identified with the lowest rank sum are 2-Hexanone, Iso-Propylamine and 1-Butanol. Eventually, we were able to construct a three-step decision tree which correctly discerns the 21 samples except for one from each group. Sensitivity was 90 % and specificity was 91 %. Naturally, these findings need further confirmation within a larger population. Our pilot study shows that Ion-Mobility-Spectrometry coupled with a Multi-Capillary-Column is a feasible rapid diagnostic tool in the setting of a pediatric oncology outpatient clinic for patients 3 years and older.
Our first results furthermore encourage additional analysis as to whether patients at risk for septic events during immunosuppression can be diagnosed in advance by rapidly assessing risk factors such as neutropenia in exhaled breath.
Sleep quality and, in general, behavior in bed can be detected using a sleep state analysis. These results can help a subject regulate sleep and recognize different sleeping disorders. In this work, a sensor grid for pressure and movement detection supporting sleep phase analysis is proposed. In comparison to the leading standard measuring system, polysomnography (PSG), the system proposed in this project is a non-invasive sleep monitoring device. For continuous analysis or home use, PSG or wearable actigraphy devices tend to be uncomfortable. Besides this fact, they are also very expensive. The system presented in this work classifies respiration and body movement with only one type of sensor, and also in a non-invasive way. The sensor used is a pressure sensor. This sensor is low cost and can be used for commercial purposes. The system was tested by carrying out an experiment that recorded the sleep process of a subject. These recordings showed the potential for classification of breathing rate and body movements. Although previous research shows the use of pressure sensors in recognizing posture and breathing, they have mostly been positioned between the mattress and bedsheet. This project, however, shows an innovative way to position the sensors under the mattress.
In many cases continuous monitoring of vital signals is required, and low intrusiveness is an important requirement. Incorporating monitoring systems in the hospital or home bed could have benefits for patients and caregivers. The objective of this work is the definition of a measurement protocol and the creation of a data set of measurements using commercial and low-cost prototype devices to estimate heart rate and breathing rate. The experimental data will be used to compare results achieved by the devices and to develop algorithms for feature extraction of vital signals.
The recovery of our body and brain from fatigue directly depends on the quality of sleep, which can be determined from the results of a sleep study. The classification of sleep stages is the first step of this study and includes the measurement of vital data and their further processing. The non-invasive sleep analysis system is based on a hardware sensor network of 24 pressure sensors providing sleep phase detection. The pressure sensors are connected to an energy-efficient microcontroller via a system-wide bus. A significant difference between this system and other approaches is the innovative way in which the sensors are placed under the mattress. This feature facilitates the continuous use of the system without any noticeable influence on the sleeping person. The system was tested by conducting experiments that recorded the sleep of various healthy young people. Results indicate the potential to capture respiratory rate and body movement.
The main aim of the research presented in this manuscript is to compare the results of objective and subjective measurement of sleep quality for older adults (65+) in the home environment. A total of 73 nights was evaluated in this study. A device placed under the mattress was used to obtain objective measurement data, and a common question on perceived sleep quality was asked to collect the subjective sleep quality level. The achieved results confirm the correlation between objective and subjective measurement of sleep quality, with an average standard deviation equal to 2 of 10 possible quality points.
Identification of sleep and wake states by evaluating respiratory and movement signals
(2021)
Research question: The clinical standard procedure and reference for sleep measurement and the classification of individual sleep stages is polysomnography (PSG). Alternative approaches to this elaborate procedure could offer several advantages if the measurements were performed in a more comfortable way. The main aim of this research study is to develop an algorithm for the automatic classification of sleep stages that uses only movement and respiratory signals [1].
Patients and methods: After analysing current research, we chose multinomial logistic regression as the basis for the approach [2]. To increase the accuracy of the evaluation, four features derived from movement and respiratory signals were developed. For the evaluation, the nocturnal recordings of 35 persons provided by Charité-Universitätsmedizin Berlin were used. The average age of the participants was 38.6 +/- 14.5 years and the average BMI was 24.4 +/- 4.9 kg/m2. Since the algorithm works with three stages, stages N1, N2 and N3 were merged into the NREM stage. The available data set was strictly split into a training data set of about 100 h and a test data set of about 160 h of nocturnal recordings. Both data sets had a similar ratio of men to women, and the average BMI showed no significant deviation.
Results: The algorithm was implemented and delivered successful results: the accuracy of detecting wake/NREM/REM phases is 73 %, with a Cohen's kappa of 0.44 for the 19,324 analysed sleep epochs of 30 s each. The observed overestimation of the NREM phase can be partly explained by its prevalence in a typical sleep pattern. Even the use of a balanced training data set could not completely solve this problem.
Conclusions: The results achieved have confirmed the suitability of the approach in principle. Its advantage is that only movement and respiratory signals are used, which can be recorded with less effort and more comfortably for users than, for example, cardiac or EEG signals. The new system therefore represents a clear improvement over existing approaches. Merging the algorithmic software described here with the hardware system for measuring respiratory and body movement signals described in [1] into an autonomous, contactless system for continuous sleep monitoring is a possible direction for future work.
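The classification step described above can be sketched as a multinomial (softmax) logistic regression over per-epoch features. The code below is an illustrative sketch only: the four features and all data are synthetic stand-ins, not the Charité recordings or the study's actual feature definitions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic per-epoch features (stand-ins for the four movement/
# respiration features in the study), one row per 30 s epoch.
X = rng.normal(size=(600, 4))
y = rng.integers(0, 3, size=600)          # 0=Wake, 1=NREM, 2=REM

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Multinomial logistic regression fitted by plain gradient descent
# on the cross-entropy loss.
W = np.zeros((4, 3))
b = np.zeros(3)
Y = np.eye(3)[y]                          # one-hot targets
for _ in range(300):
    P = softmax(X @ W + b)
    G = P - Y                             # gradient of cross-entropy
    W -= 0.1 * X.T @ G / len(X)
    b -= 0.1 * G.mean(axis=0)

pred = np.argmax(softmax(X @ W + b), axis=1)
accuracy = (pred == y).mean()
```

With real features, class imbalance (the prevalence of NREM noted above) would additionally call for a balanced training set or class weighting.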
Recognition of sleep and wake states is one of the relevant parts of sleep analysis. Performing this measurement in a contactless way increases comfort for the users. We present an approach that achieves this recognition by evaluating only movement and respiratory signals, which can be measured non-obtrusively. The algorithm is based on multinomial logistic regression and analyses features extracted from the signals mentioned above. These features were identified and developed after fundamental research on the characteristics of vital signals during sleep. The achieved accuracy of 87 % with a Cohen's kappa of 0.40 demonstrates the appropriateness of the chosen method and encourages continuing research on this topic.
The scoring of sleep stages is one of the essential tasks in sleep analysis. Since a manual procedure requires considerable human and financial resources, and incorporates some subjectivity, an automated approach could offer several advantages. There have been many developments in this area, and in order to provide a comprehensive overview, it is essential to review relevant recent works and summarise the characteristics of the approaches, which is the main aim of this article. To achieve this, we examined articles published between 2018 and 2022 that dealt with the automated scoring of sleep stages. In the final selection for in-depth analysis, 125 articles were included after reviewing a total of 515 publications. The results revealed that automatic scoring demonstrates good quality (with Cohen's kappa reaching over 0.80 and accuracy over 90 %) in analysing EEG/EEG + EOG + EMG signals. At the same time, it should be noted that there has been no breakthrough in the quality of results using these signals in recent years. Systems involving other signals that could potentially be acquired more conveniently for the user (e.g. respiratory, cardiac or movement signals) remain more challenging to implement with a high level of reliability, but have considerable innovation capability. In general, automatic sleep stage scoring has excellent potential to assist medical professionals while providing an objective assessment.
In recent decades, a steady increase in the volume of tourism has been a stable trend. To offer travel opportunities to all groups, it is also necessary to prepare offers for people in need of long-term care or people with disabilities. One way to improve accessibility could be digital technologies, which could help in planning as well as in carrying out trips. In the work presented, a study of barriers was first conducted, which after analysis led to the selection of technologies for a test setup. The main focus was on a mobile app with travel information and 360° tours. The evaluation results showed that both technologies could increase accessibility, but some essential aspects (such as usability, completeness, relevance, etc.) need to be considered when implementing them.
This book investigates and highlights the most critical challenges the pharmaceutical industry faces in an increasingly competitive environment of inflationary R&D investments and tightening cost control pressures. The authors present three sources of pharmaceutical innovation: new management methods in the drug development pipeline; new technologies as enablers for cutting-edge R&D; and new forms of cooperation and internationalization, such as open innovation in the early phases of R&D. New models and methods are illustrated with cases from Europe, the US, and Asia. This third fully revised edition was expanded to reflect the latest updates in open and collaborative innovation, the greater strategic importance of venture capital and early stage investments, and the new range of emerging technologies now being put to use in pharmaceutical innovation.
Broad acceptance of finite-element-based analysis of structural problems and the increased availability of CAD-systems for structural tasks, which help to generate meshes of non-trivial geometries, have been setting a standard for the evaluation of designs in mechanical engineering in the last few decades. The development of automated or semi-automated optimizers, integrated into the Computer-Aided Engineering (CAE)-packages or working as outer loop machines, requiring the solver to do the analysis of the specific designs, has been accepted by most advanced users of the simulation community as well. The availability and inexpensive processing power of computers is increasing without any limitations foreseen in the coming years. There is little doubt that virtual product development will continue using the tools that have proved to be so successful and so easy to handle.
Virtual prototyping of integrated mixed-signal smart sensor systems requires high-performance co-simulation of analog frontend circuitry with complex digital controller hardware and embedded real-time software. We use SystemC/TLM 2.0 in conjunction with a cycle-count accurate temporal decoupling approach (TD) to simulate digital components and firmware code execution at high speed while preserving clock-cycle accuracy and, thus, real-time behavior at time quantum boundaries. Optimal time quanta ensuring real-time capability can be calculated and set automatically during simulation if the simulation engine has access to exact timing information about upcoming inter-process communication events. These methods fail in the case of non-deterministic, asynchronous events, resulting in potentially invalid simulation results. In this paper, we propose an extension to the case of asynchronous events generated by blackbox sources from which a priori event timing information is not available, such as coupled analog simulators or hardware in the loop. Additional event processing latency or rollback effort caused by temporal decoupling is minimized by calculating optimal time quanta dynamically in a SystemC model using a linear prediction scheme. We analyze the theoretical performance of the presented predictive temporal decoupling approach (PTD) by deriving a cost model that expresses the expected simulation effort in terms of key parameters such as time quantum size and CPU time per simulation cycle. For an exemplary smart-sensor system model, we show that quasi-periodic events that trigger activities in TD processes are handled accurately after the predictor has settled.
Indoor localization systems are becoming increasingly important with the digitalization of the industrial sector. Sensor data such as the current position of machines, transport vehicles, goods or tools represent an essential component of cyber-physical production systems (CPPS). However, due to the high costs of these sensors, they are not widespread and are used mainly in special scenarios. Optical indoor positioning systems (OIPS) based on cameras, in particular, have certain advantages due to their technological specifications. In this paper, the application scenarios and requirements as well as their characteristics are presented, and a classification approach for OIPS is introduced.
Intermittent time series forecasting is a challenging task which still needs particular attention from researchers. The more irregularly events occur, the more difficult it is to predict them. With Croston's approach in 1972 (Operational Research Quarterly 23(3):289–303), the intermittence and demand of a time series were investigated separately for the first time. He proposes exponential smoothing in his attempt to generate a forecast which corresponds to the average demand per period. Although this algorithm produces good results in the field of stock control, it does not capture the typical characteristics of intermittent time series within the final prediction. In this paper, we investigate a time series' intermittence and demand individually, forecast the upcoming demand value and inter-demand interval length using recent machine learning algorithms, such as long short-term memories and light gradient-boosting machines, and reassemble both pieces of information to generate a prediction which preserves the characteristics of an intermittent time series. We compare the results against Croston's approach, as well as recent forecast procedures where no split is performed.
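The separation of demand size and inter-demand interval that Croston introduced can be sketched as follows. This is a minimal textbook version of Croston's smoothing for illustration, not the machine-learning pipeline compared against it in the paper:

```python
def croston(demand, alpha=0.1):
    """Croston's method: exponentially smooth the non-zero demand
    sizes (z) and the inter-demand intervals (p) separately;
    the per-period forecast is their ratio z / p."""
    z = None  # smoothed demand size
    p = None  # smoothed inter-demand interval
    q = 1     # periods elapsed since the last non-zero demand
    forecasts = []
    for d in demand:
        # Forecast for the current period, made before observing d.
        forecasts.append(z / p if z is not None else 0.0)
        if d > 0:
            if z is None:                      # initialise on first demand
                z, p = d, q
            else:
                z = alpha * d + (1 - alpha) * z
                p = alpha * q + (1 - alpha) * p
            q = 1
        else:
            q += 1
    return forecasts
```

For the intermittent series `[0, 3, 0, 0, 3, 0, 3]` with `alpha=0.5`, the forecast settles at the average demand per period rather than reproducing the zero/non-zero pattern, which is exactly the limitation the abstract points out.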
A 3D face modelling approach for pose-invariant face recognition in a human-robot environment
(2017)
Face analysis techniques have become a crucial component of human-machine interaction in the fields of assistive and humanoid robotics. However, the variations in head pose that arise naturally in these environments are still a great challenge. In this paper, we present a real-time capable 3D face modelling framework for 2D in-the-wild images that is applicable for robotics. The fitting of the 3D Morphable Model is based exclusively on automatically detected landmarks. After fitting, the face can be corrected in pose and transformed back to a frontal 2D representation that is more suitable for face recognition. We conduct face recognition experiments with non-frontal images from the MUCT database and uncontrolled, in-the-wild images from the PaSC database, the most challenging face recognition database to date, showing improved performance. Finally, we present our SCITOS G5 robot system, which incorporates our framework as a means of image pre-processing for face analysis.
Based on well-established robotic concepts of autonomous localization and navigation, we present a system prototype, implemented in the Robot Operating System (ROS), to assist camera-based indoor navigation for human users. Our prototype takes advantage of state-of-the-art computer vision and robotic methods, and is designed for assistive indoor guidance. We employ a vibro-tactile belt as a guiding device, rendering derived motion suggestions to the user via vibration patterns. We evaluated the effectiveness of a variety of vibro-tactile feedback patterns for guidance of blindfolded users. Our prototype demonstrates that a vision-based system can support human navigation, and may also assist the visually impaired in a human-centered way.
Software startups often make assumptions about the problems and customers they are addressing as well as the market and the solutions they are developing. Testing the right assumptions early is a means to mitigate risks. Approaches such as Lean Startup foster this kind of testing by applying experimentation as part of a constant build-measure-learn feedback loop. The existing research on how software startups approach experimentation is very limited. In this study, we focus on understanding how software startups approach experimentation and identify challenges and advantages with respect to conducting experiments. To achieve this, we conducted a qualitative interview study. The initial results show that startups often spend a disproportionate amount of time focusing on creating solutions without testing critical assumptions. The main reasons are a lack of awareness that these assumptions can be tested early, and a lack of knowledge and support on how to identify, prioritize and test them. However, startups understand the need for testing risky assumptions and are open to conducting experiments.
The aim of the present study was to analyse the relationship between the implementation of CRM processes and customer satisfaction. Our investigation is subject to some fundamental limitations. CRM is still a relatively young field of research, whose processes will most likely continue to evolve over time. Some practices will be identified as ineffective and discarded; other existing processes will be improved. It can also be expected that new processes and activities will be developed and introduced. As a consequence of these developments, it is possible that the effect of implementing CRM processes on customer satisfaction reported here will also change over time. An interesting research approach would therefore be to observe this evolution over time.
Furthermore, it must be noted that in this study customer satisfaction is merely a pre-economic objective of CRM. Individual investments in an existing business relationship must be made on the basis of the customer's value to the company. Moreover, if there are no alternatives to the current provider, it is not economically sensible to deploy resources to increase customer satisfaction, since a change of provider is unlikely.
Finally, for the present study we used scales capturing the companies' assessment of their customers' attitudes. Since this procedure requires assessments that are as accurate as possible, the data may exhibit certain biases. Future research could usefully extend the study with a complementary assessment by the customers for cross-validation.
While the topic of Customer Relationship Management (CRM) has generated an increasing amount of research attention in recent years, a comprehensive overview that helps to explain how companies can implement CRM successfully is still lacking. To address this issue, this article identifies and discusses factors that are associated with a greater degree of CRM success. More specifically, we identify and discuss determinants relating to strategy, human resources, information management, structure and processes, as well as specific factors within the implementation phase, which help to improve CRM success. First, our results indicate that the implementation of CRM processes is associated with better company performance, especially at the relationship initiation and maintenance stage. Second, the findings emphasise a predominant influence of firm-based factors vis-à-vis structural industry- and customer-based factors. Furthermore, cross-functional CRM teams and a top management that feels responsible for CRM projects help to improve CRM success. In addition, internal processes related to customer contact points have to be redesigned to enhance the interaction between employees and customers. The current article sheds more light on what really drives CRM success.
Rational strain engineering requires solid testing of phenotypes, including productivity, and thereby ideally contributes directly to our understanding of the genotype-phenotype relationship. In fact, the test step of the strain engineering cycle is becoming the limiting step, as ever-advancing tools for generating genetic diversity exist. Here, we briefly define the challenge one faces in quantifying phenotypes and summarize existing analytical techniques that partially overcome this challenge. We argue that the evolution of volatile metabolites can be used as a proxy for cellular metabolism. In the simplest case, the product of interest is a volatile (e.g., from bulk alcohols to special fragrances) that is directly quantified over time. But nonvolatile products (e.g., from bulk long-chain fatty acids to natural products) also require major flux rerouting that potentially results in altered volatile production. While alternative techniques for volatile determination exist, rather few can be envisaged for the medium- to high-throughput analysis required for phenotype testing. Here, we contribute a detailed protocol for an ion mobility spectrometry (IMS) analysis that allows volatile metabolite quantification down to the ppb range. The sensitivity can be exploited for small-scale fermentation monitoring. The insights shared might contribute to a more frequent use of IMS in biotechnology, while the experimental aspects are of general use for researchers interested in volatile monitoring.
In the era of precision medicine, digital technologies and artificial intelligence, drug discovery and development face unprecedented opportunities for product and business model innovation, fundamentally changing the traditional approach of how drugs are discovered, developed and marketed. Critical to this transformation is the adoption of new technologies in the drug development process, catalyzing the transition from serendipity-driven to data-driven medicine. This paradigm shift comes with a need for both translation and precision, leading to a modern Translational Precision Medicine approach to drug discovery and development. Key components of Translational Precision Medicine are multi-omics profiling, digital biomarkers, model-based data integration, artificial intelligence, biomarker-guided trial designs and patient-centric companion diagnostics. In this review, we summarize and critically discuss the potential and challenges of Translational Precision Medicine from a cross-industry perspective.
In recent years, the parallel computing community has shown increasing interest in leveraging cloud resources for executing parallel applications. Clouds exhibit several fundamental features of economic value, like on-demand resource provisioning and a pay-per-use model. Additionally, several cloud providers offer their resources at significant discounts, albeit with limited availability. Such volatile resources are an auspicious opportunity to reduce the costs arising from computations, thus achieving higher cost efficiency. In this paper, we propose a cost model for quantifying the monetary costs of executing parallel applications in cloud environments leveraging volatile resources. Using this cost model, one is able to determine a configuration of a cloud-based parallel system that minimizes the total costs of executing an application.
In this paper, we deal with optimizing the monetary costs of executing parallel applications in cloud-based environments. Specifically, we investigate how the scalability characteristics of parallel applications impact the total costs of computations. We focus on a specific class of irregularly structured problems, where the scalability typically depends on the input data. Consequently, dynamic optimization methods are required for minimizing the costs of computation. For quantifying the total monetary costs of individual parallel computations, the paper presents a cost model that considers the costs for the parallel infrastructure employed as well as the costs caused by delayed results. We discuss a method for dynamically finding the number of processors for which the total costs based on our cost model are minimal. Our extensive experimental evaluation gives detailed insights into the performance characteristics of our approach.
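The trade-off between infrastructure costs and the costs of delayed results can be sketched as follows. The Amdahl-style runtime model, prices and delay penalty below are illustrative assumptions, not the cost model actually proposed in the paper:

```python
def runtime(p, serial_fraction=0.05, work_seconds=3600.0):
    """Assumed Amdahl-style runtime (s) of an application on p processors."""
    return serial_fraction * work_seconds + (1 - serial_fraction) * work_seconds / p

def total_cost(p, price_per_cpu_hour=0.05, delay_penalty_per_hour=0.5):
    """Total monetary cost on p processors: infrastructure costs
    plus a penalty proportional to the time until results arrive."""
    hours = runtime(p) / 3600.0
    infra = p * hours * price_per_cpu_hour   # cloud resource costs
    delay = hours * delay_penalty_per_hour   # cost of delayed results
    return infra + delay

# Exhaustively find the processor count with minimal total cost.
best_p = min(range(1, 257), key=total_cost)
```

Under these particular constants the minimum lies at 14 processors: adding processors first pays off by shrinking the delay penalty, then the growing infrastructure bill dominates because the serial fraction caps the speedup.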
At Bayer AG, IBM Connections is used as the solution for the enterprise social network. Bayer pursues the goal of connecting employees worldwide, supporting communication across divisional boundaries, and providing a pool of knowledge and experts. As part of a relaunch in 2012, Connections@Bayer, which had previously been available only in parts of the group, was rolled out to the entire company. In a further relaunch in 2014, the company carried out an update to version 4.5 and an extensive communication campaign that created awareness of the communication platform among employees and aroused curiosity. Within this campaign, an analysis of the key benefits of using Connections was conducted, eight core messages were developed, and these were distributed via various communication channels throughout the company. In addition, the use of testimonials made it possible to illustrate the benefits for all employee groups. This relaunch was successful: user numbers grew and employee satisfaction increased. The present case study vividly demonstrates that a relaunch of an enterprise social network accompanied by an effective communication campaign can bring about lasting success.
The market for indoor positioning systems for a variety of applications has grown strongly in recent years. A wide range of systems is available, varying considerably in terms of accuracy, price and technology used. The suitability of the systems is highly dependent on the intended application. This paper presents a concept to use a single low-cost PTZ camera in combination with fiducial markers for indoor position and orientation determination. The intended use case is to capture a plant layout consisting of position, orientation and unique identity of individual facilities. Important factors to consider for the selection of a camera have been identified and the transformation of the marker pose in camera coordinates into a selectable plant coordinate system is described. The concept is illustrated by an exemplary practical implementation and its results.
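The transformation of a marker pose from camera coordinates into a plant coordinate system, as described above, can be sketched with homogeneous transforms. This is a generic sketch; the calibration and marker poses are placeholder values, not taken from the described implementation:

```python
import numpy as np

def homogeneous(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R
    and a 3-vector translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Assumed calibration: pose of the camera expressed in plant coordinates.
T_plant_cam = homogeneous(np.eye(3), np.array([2.0, 3.0, 4.0]))

# Marker pose as detected in camera coordinates (e.g. by a fiducial
# marker library); identity rotation here for simplicity.
T_cam_marker = homogeneous(np.eye(3), np.array([0.5, 0.0, 1.5]))

# Chaining the transforms yields the marker pose in plant coordinates.
T_plant_marker = T_plant_cam @ T_cam_marker
position_in_plant = T_plant_marker[:3, 3]
```

In a real PTZ setup, `T_plant_cam` would additionally depend on the current pan/tilt angles, i.e. it would be composed from the static mounting calibration and a rotation derived from the reported PTZ state.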
Global, competitive markets which are characterised by mass customisation and rapidly changing customer requirements force major changes in production styles and the configuration of manufacturing systems. As a result, factories may need to be regularly adapted and optimised to meet short-term requirements. One way to optimise the production process is the adaptation of the plant layout to the current or expected order situation. To determine whether a layout change is reasonable, a model of the current layout is needed. It is used to perform simulations and in the case of a layout change it serves as a basis for the reconfiguration process. To aid the selection of possible measurement systems, a requirements analysis was done to identify the important parameters for the creation of a digital shadow of a plant layout. Based on these parameters, a method is proposed for defining limit values and specifying exclusion criteria. The paper thus contributes to the development and application of systems that enable an automatic synchronisation of the real layout with the digital layout.
The proposed approach applies current unsupervised clustering approaches in a different, dynamic manner. Instead of taking all the data as input and finding clusters among them, the given approach clusters Holter ECG data (long-term electrocardiography data from a Holter monitor) on a given interval, which enables a dynamic clustering approach (DCA). For this, advanced clustering techniques based on the well-known Dynamic Time Warping algorithm are used. Having clusters, e.g., on a daily basis, clusters can be compared by defining cluster shape properties. Doing so yields a measure of the variation in unsupervised cluster shapes and may reveal unknown changes in healthiness. Embedding this approach into wearable devices offers advantages over current techniques. On the one hand, users get feedback if the characteristics of their ECG data change unforeseeably over time, which makes early detection possible. On the other hand, cluster properties such as the biggest or smallest cluster may help a doctor in making diagnoses or observing several patients. Furthermore, known processing techniques such as stress detection or arrhythmia classification may be applied to the clusters found.
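The core of the approach, pairwise DTW distances between ECG segments followed by clustering on the resulting distance matrix, can be sketched as follows. The data are synthetic stand-ins for daily Holter segments, and the clustering method (average-linkage hierarchical clustering via SciPy) is one plausible choice, not necessarily the one used by the authors:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def dtw_distance(a, b):
    """Classic O(n*m) Dynamic Time Warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Hypothetical daily ECG-derived segments (synthetic: two distinct waveform shapes)
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 50)
segments = [np.sin(t) + 0.05 * rng.standard_normal(50) for _ in range(3)] + \
           [np.sin(2 * t) + 0.05 * rng.standard_normal(50) for _ in range(3)]

# Pairwise DTW distance matrix, then agglomerative clustering on it
n = len(segments)
dist = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        dist[i, j] = dist[j, i] = dtw_distance(segments[i], segments[j])

labels = fcluster(linkage(squareform(dist), method="average"), t=2, criterion="maxclust")
print(labels)  # two clusters separating the two waveform shapes
```

Re-running this per day and comparing shape properties of the resulting clusters (size, spread, centroid) is then what gives the variation measure the abstract describes.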
The ZD.BB - a digital hub for small and medium-sized enterprises in the Stuttgart region
(2020)
Digital transformation is one of the most discussed topics in today's business world. Many companies, above all small and medium-sized enterprises (SMEs), struggle to assess the opportunities and risks of digitalization. For all the possibilities and opportunities that digitalization offers, companies that close themselves off from these developments risk losing their market and competitive position. With the digital hub ZD.BB (Zentrum Digitalisierung), opened in February 2019, the Stuttgart region has a new central point of contact for all questions concerning digitalization. At the ZD.BB, small and medium-sized enterprises (SMEs) and startups receive competent advice and support for their digital transformation processes, ranging from awareness raising through analysis to the development of solutions for digital processes. With a digital qualification initiative and business model development methods tailored to SMEs, the ZD.BB provides companies with comprehensive support for their digitalization projects. To this end, different competencies, disciplines, ideas, technologies, and creativity are brought together in innovation labs, coworking spaces, and at events, thereby generating digital innovations.
Big Data is currently being discussed as one of the main trends in the IT industry. Big Data means making decisions in real time, or predictively, on the basis of large volumes of differently structured data. High-performance, rapidly available forecasting methods are expected to minimize the risk of business decisions in highly volatile markets.
The persistently high levels of debt in some member states of the European Economic and Monetary Union continue to raise fears of sovereign insolvencies. To cope with the problems that have arisen, but also to prevent such a situation from occurring in the first place, the author considers a sovereign insolvency regime necessary, with bail-outs by the other member states only in emergencies. He proposes a resolution mechanism for over-indebted euro countries based on a 2016 concept of the German Council of Economic Experts (Sachverständigenrat).
Global economic growth in recent decades was shaped by the dynamics of digitalization and globalization in supply chains. The Corona pandemic exposed the dependency and vulnerability of these supply chains. Despite a multitude of binding standards, companies have also used digitalization and the division of labor for regulatory arbitrage. On the one hand, this increases the efficiency of the economy, which in turn conserves ecological resources; on the other hand, it undermines international standards. Globalization and digitalization are thus both a blessing and a curse.
The high levels of debt in some member states of the European Economic and Monetary Union continue to raise fears of sovereign insolvencies. To cope with the problems that have already arisen, but also to prevent such a situation from arising again in the future, a sovereign insolvency regime is required.
The phenomenon of populism has been discussed in a wide range of academic disciplines for decades. The prevailing narrative, however, is an economic one: (financial) globalization and technological progress alienate people and create insecurity. Moreover, citizens perceive the challenges of this transformation process more than its opportunities.
This paper studies whether a monetary union can be managed solely by a rule based approach. The Five Presidents’ Report of the European Union rejects this idea. It suggests a centralisation of powers. We analyse the philosophy of policy rules from the vantage point of the German economic school of thought. There is evidence that a monetary union consisting of sovereign states is well organised by rules, together with the principle of subsidiarity. The root cause of the euro crisis is rather the weak enforcement of rules, compounded by structural problems. Therefore, we suggest a genuine rule-based paradigm for a stable future of the Economic and Monetary Union.
Newly developed active pharmaceutical ingredients (APIs) are often poorly soluble in water. As a result the bioavailability of the API in the human body is reduced. One approach to overcome this restriction is the formulation of amorphous solid dispersions (ASDs), e.g., by hot-melt extrusion (HME). Thus, the poorly soluble crystalline form of the API is transferred into a more soluble amorphous form. To reach this aim in HME, the APIs are embedded in a polymer matrix. The resulting amorphous solid dispersions may contain small amounts of residual crystallinity and have the tendency to recrystallize. For the controlled release of the API in the final drug product the amount of crystallinity has to be known. This review assesses the available analytical methods that have been recently used for the characterization of ASDs and the quantification of crystalline API content. Well established techniques like near- and mid-infrared spectroscopy (NIR and MIR, respectively), Raman spectroscopy, and emerging ones like UV/VIS, terahertz, and ultrasonic spectroscopy are considered in detail. Furthermore, their advantages and limitations are discussed with regard to general practical applicability as process analytical technology (PAT) tools in industrial manufacturing. The review focuses on spectroscopic methods which have been proven as most suitable for in-line and on-line process analytics. Further aspects are spectroscopic techniques that have been or could be integrated into an extruder.
Back to the future: origins and directions of the “Agile Manifesto” – views of the originators
(2018)
In 2001, seventeen professionals drew up the manifesto for agile software development. They wanted to define values and basic principles for better software development. Beyond the attention it attracted, the manifesto has been widely adopted by developers, in software-developing organizations, and outside the world of IT. Agile principles and their implementation in practice have paved the way for radically new and innovative ways of software and product development. In parallel, the understanding of the manifesto's underlying principles has evolved over time. This, in turn, may affect current and future applications of agile principles. This article presents results from a survey and an interview study conducted in collaboration with the original contributors of the manifesto for agile software development. Furthermore, it comprises the results of a workshop with one of the original authors. This publication focuses on the origins of the manifesto, the contributors' views from today's perspective, and their outlook on future directions. We evaluated 11 responses from the survey and 14 interviews to understand the viewpoint of the contributors. They emphasize that agile methods need to be carefully selected and that agile should not be seen as a silver bullet. They underline the importance of considering the variety of different practices and methods that influenced the manifesto. Furthermore, they suggest that people should question their current understanding of "agile" and recommend reconsidering the core ideas of the manifesto.
Context: The current transformation of automotive development towards innovation, permanent learning, and adaptation to change is directing attention to the integration of agile methods. Although there have been efforts to apply agile methods in the automotive domain for many years, widespread adoption has not yet taken place.
Goal: This study aims to gain a better understanding of the forces that prevent the adoption of agile methods.
Method: Survey based on 16 semi-structured interviews from the automotive domain. The results are analyzed by means of thematic coding.
Results: Forces that prevent agile adoption are mainly of organizational, technical and social nature and address inertia, anxiety and context factors. Key challenges in agile adoption are related to transforming organizational structures and culture, achieving faster software release cycles without loss of quality, the importance of software reuse in combination with agile practices, appropriate quality assurance measures, and the collaboration with suppliers and other disciplines such as mechanics.
Conclusion: Significant challenges are imposed by specific characteristics of the automotive domain such as high quality requirements and many interfaces to surrounding rigid and inflexible processes. Several means are identified that promise to overcome these challenges.
Context: The current situation and future scenarios of the automotive domain require a new strategy to develop high-quality software at a fast pace. In the automotive domain, it is assumed that a combination of agile development practices and software product lines is beneficial in order to be capable of handling a high frequency of improvements. This assumption is based on the understanding that agile methods introduce more flexibility in short development intervals. Software product lines help to manage the large number of variants and to improve quality through reuse of software in long-term development.
Goal: This study derives a better understanding of the expected benefits of such a combination. Furthermore, it identifies the automotive-specific challenges that prevent the adoption of agile methods within the software product line.
Method: Survey based on 16 semi-structured interviews from the automotive domain, an internal workshop with 40 participants, and a discussion round at the ESE Congress 2016. The results are analyzed by means of thematic coding.
Context: Software product lines are widely used in automotive embedded software development. This software paradigm improves the quality of software variants by reuse. The combination of agile software development practices with software product lines promises a faster delivery of high-quality software. However, setting up an agile software product line is still challenging, especially in the automotive domain.
Goal: This publication aims to evaluate to what extent agility fits automotive product line engineering.
Method: Based on previous work and two workshops, agility is mapped to software product line concerns.
Results: This publication presents important principles of software product lines and examines how agile approaches fit those principles. Additionally, the principles are related to one of the four major concerns of software product line engineering: Business, Architecture, Process, and Organization.
Conclusion: Agile software product line engineering is promising and can add value to existing development approaches. The identified commonalities and hindering factors need to be considered when defining a combined agile product line engineering approach.
Stronger than they look
(2019)
Engineering of large vascularized adipose tissue constructs is still a challenge for the treatment of extensive high-graded burns or the replacement of tissue after tumor removal. Communication between mature adipocytes and endothelial cells is important for homeostasis and the maintenance of adipose tissue mass but, to date, is mainly neglected in tissue engineering strategies. Thus, new coculture strategies are needed to integrate adipocytes and endothelial cells successfully into a functional construct. This review focuses on the cross-talk of mature adipocytes and endothelial cells and considers their influence on fatty acid metabolism and vascular tone. In addition, the properties and challenges with regard to these two cell types for vascularized tissue engineering are highlighted.
After the initiator of the ESB Logistics Learning Factory, Prof. Vera Hummel, had gained experience in developing and implementing a concept for a Learning Factory for Advanced Industrial Engineering (aIE) at the University of Stuttgart, Institute IFF, between 2005 and 2008, she was appointed full professor at ESB Business School, a faculty of Reutlingen University, in March 2010. Lacking a realistic, hands-on learning and teaching environment of industrial scale for its industrial engineering students, the school drafted first ideas in 2012 for a Learning Factory that would strongly focus on all aspects of production logistics. Already back then, a strong integration of virtual and physical factory was desired: while the Learning Factory itself would be physical, the neighboring partners along the supply chain, such as suppliers or distribution warehouses, could be added in a fully virtual way. As the implementation of the ESB Logistics Learning Factory was considered a strategic initiative of the university, initial funding was provided by the ESB Business School itself. In keeping with its creed of providing future-oriented training for the region, primarily local suppliers and manufacturers were selected as equipment providers for the new Learning Factory. During the initialization phase in 2014, a total of three researchers and nine students worked approximately four months to set up a first assembly line, storage racks, AGVs, and pick-by-light systems in conjunction with the underlying didactical concept. Since then, several hundred students have participated in trainings and lectures held in the ESB Logistics Learning Factory, several research projects have been carried out, and numerous high-level politicians and industry executives have toured the shop floor.
Also, more than EUR 2 million in research and infrastructure funds could be secured for expansion and upgrade — allowing the ESB Logistics Learning Factory today to represent many core aspects of an Industrie 4.0 production environment.
This article shows that, and how, the ideas, tools, and solution approaches of lean management can be used in the sales environment. It demonstrates how the value-adding share of a company's own sales processes can be increased while simultaneously minimizing waste from the customer's perspective. A key tool for this is the value stream design method, which the author and his partners have adapted specifically to the particularities of sales processes. The focus here is on highlighting the many different types of waste within such processes in order to initiate a solution-finding process. The potential of this method is illustrated and its application explained. Finally, the article discusses how a culture of change can also be established within sales organizations and how the sustainability of changes can be promoted.
Anyone who invests in a company does so in order to earn money in the future and expects a risk-adequate return. The selection of the key figures that make this increase in value transparent, however, is far from trivial, for they determine whether corporate goals are set correctly and whether the right incentives are created for management.
Revenue and profits stagnate at a high level, and yet the share price and earnings per share rise, a development that can be observed at Apple or eBay, for example. Shareholders should know what arithmetic lies behind such developments and which methods are best suited for determining a company's value.
The aim of Julian Ilg's work is to develop a systematic suitability analysis for the use of additive manufacturing processes in a company-specific context. The author summarizes the most common additive manufacturing processes and provides an overview of their current applications as well as of the challenges involved in their use, using companies from the medical technology sector as examples. Based on these findings, the author succeeds in providing interested companies with a quantifying decision-making aid that takes into account both the engineering and the economic aspects of switching from conventional manufacturing processes to additive manufacturing.
This article examines who becomes a minister of education in Germany. To address this question, we compiled a dataset containing the biographical characteristics of all ministers of education of the German federal states between 1950 and 2020. As an example of how the dataset can be used, we examine the two characteristics gender and previous professional experience and link them to indicators of the size and development of the education budget and the length of time in office. We show that between 1950 and 2020 considerably more men than women were appointed minister of education, regardless of which parties the ministers belonged to. Furthermore, the majority of ministers of education have no previous professional experience as teachers when they take office. Most ministers of education, however, already have political experience when they assume their post. Our database, the first comprehensive collection of biographical characteristics of ministers of education in the German federal states, is available to all interested researchers.
Despite the significant potential offered by the powder coating process for finishing wood-based materials, until now it has been used almost exclusively for coating medium-density fiberboard (MDF). A research project aims to develop processes and substrate materials that will allow lightweight boards to be powder coated.
This article contributes to marketing research by fundamentally developing the young but increasingly relevant research stream on customer experience management (CEM). On the one hand, the identified framework shows that CEM goes beyond individual corporate capabilities such as the design of service experiences, which has dominated CEM research to date. On the other hand, the framework contributes to synthesizing fragmented but interrelated streams of literature in marketing research ...
Due to digitalization, constant technological progress and ever shorter product life cycles, enterprises are currently facing major challenges. In order to succeed in the market, business models have to be adapted more often and more quickly to changing market conditions than they used to be. Fast adaptability, also called agility, is a decisive competitive factor in today’s world. Because of the ever-growing IT part of products and the fact that they are manufactured using IT, changing the business model has a major impact on the enterprise architecture (EA). However, developing EAs is a very complex task, because many stakeholders with conflicting interests are involved in the decision-making process. Therefore, a lot of collaboration is required. To support organizations in developing their EA, this article introduces a novel integrative method that systematically integrates stakeholder interests into decision-making activities. By using the method, collaboration between stakeholders involved is improved by identifying points of contact between them. Furthermore, standardized activities make decision-making more transparent and comparable without limiting creativity.
In times of dynamic markets, enterprises have to be agile to be able to react quickly to market influences. Due to the increasing digitization of products, the enterprise IT is often affected when business models change. Enterprise Architecture Management (EAM) targets a holistic view of the enterprise's IT and its relations to the business. However, Enterprise Architectures (EA) are complex structures consisting of many layers, artifacts, and relationships between them. Thus, analyzing an EA is a very complex task for stakeholders. Visualizations are common vehicles to support analysis. In practice, however, visualization capabilities lack flexibility and interactivity. A solution to improve the support of stakeholders in analyzing EAs might be the application of visual analytics. Starting from a systematic literature review, this article investigates the features of visual analytics relevant in the context of EAM.
New or adapted digital business models have a huge impact on Enterprise Architectures (EA) and require them to become more agile, flexible, and adaptable. All these changes happen frequently and are currently not well documented. An EA consists of many elements with manifold relationships between them; changing the business model may thus have multiple impacts on other architectural elements. The EA engineering process deals with the development, change, and optimization of architectural elements and their dependencies. An EA thus provides a holistic view of both business and IT from the perspective of the many stakeholders who are involved in EA decision-making processes. Different stakeholders have specific concerns and today often collaborate in unclear decision-making processes. In our research, we investigate information from collaborative decision-making processes to support stakeholders in taking current decisions. In addition, we provide all the information necessary to understand how and why decisions were taken. We collect the decision-related information automatically to minimize time-intensive manual work as much as possible. The core contribution of our research extends a decisional metamodel, which links basic decisions with architectural elements and extends them with an associated decisional case context. Our aim is to support a new integral method for multi-perspective and collaborative decision-making processes. We illustrate this with a practice-relevant decision-making scenario for Enterprise Architecture Engineering.
Companies are continuously changing their strategy, processes, and information systems to benefit from the digital transformation. Controlling the digital architecture and its governance is the fundamental goal. Enterprise Governance, Risk and Compliance (GRC) systems are vital for managing the digital risks threatening modern enterprises from many different angles. The most significant constituent of GRC systems is the definition of controls that are implemented on different layers of a digital Enterprise Architecture (EA). As part of the compliance aspect of GRC, the effectiveness of these controls is assessed and reported to the relevant management bodies within the enterprise. In this paper, we present a metamodel that links controls to the affected elements of a digital EA and supplies a way of expressing associated assessment techniques and results. We complement the metamodel with an expository instantiation of a control compliance cockpit in an international insurance enterprise.
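A minimal reading of such a metamodel, with hypothetical names and simplified to three classes, might look as follows: a control references the EA elements it affects and aggregates its assessment results into an effectiveness verdict:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class EAElement:
    name: str
    layer: str  # e.g. "business", "application", "technology"

@dataclass
class Assessment:
    technique: str  # e.g. "log analysis", "interview"
    effective: bool

@dataclass
class Control:
    control_id: str
    affected_elements: List[EAElement] = field(default_factory=list)
    assessments: List[Assessment] = field(default_factory=list)

    def is_effective(self) -> bool:
        """A control counts as effective if it has assessments and all of them passed."""
        return bool(self.assessments) and all(a.effective for a in self.assessments)

# Example: one access control spanning two architecture layers
ctrl = Control("AC-01",
               affected_elements=[EAElement("Claims Portal", "application"),
                                  EAElement("Policy DB", "technology")],
               assessments=[Assessment("log analysis", True),
                            Assessment("interview", True)])
print(ctrl.is_effective())  # -> True
```

A compliance cockpit as described in the paper would then essentially aggregate such per-control verdicts across the EA layers for reporting.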
The capability of the method of immersion transmission ellipsometry (ITE) (Jung et al. Int Patent WO, 2004/109260) to determine not only three-dimensional refractive indices in anisotropic thin films (which was already possible in the past), but even their gradients along the z-direction (perpendicular to the film plane) is investigated in this paper. It is shown that the determination of orientation gradients in deep-sub-µm films becomes possible by applying ITE in combination with reflection ellipsometry. The technique is supplemented by atomic force microscopy for measuring the film thickness. For a photo-oriented thin film, no gradient was found, as expected. For a photo-oriented film that was subsequently annealed in a nematic liquid crystalline phase, an order was found similar to the one applied in vertically aligned nematic displays, with a tilt angle varying along the z-direction. For fresh films, gradients were only detected for the refractive index perpendicular to the film plane, as expected.
Purpose
Context awareness in the operating room (OR) is important to realize targeted assistance to support actors during surgery. A situation recognition system (SRS) is used to interpret intraoperative events and derive an intraoperative situation from these. To achieve a modular system architecture, it is desirable to de-couple the SRS from other system components. This leads to the need of an interface between such an SRS and context-aware systems (CAS). This work aims to provide an open standardized interface to enable loose coupling of the SRS with varying CAS to allow vendor-independent device orchestrations.
Methods
A requirements analysis investigated limiting factors that currently prevent the integration of CAS in today's ORs. These elicited requirements enabled the selection of a suitable base architecture. We examined how to specify this architecture with the constraints of an interoperability standard. The resulting middleware was integrated into a prototypic SRS and our system for intraoperative support, the OR-Pad, as exemplary CAS for evaluating whether our solution can enable context-aware assistance during simulated orthopedical interventions.
Results
The emerging Service-oriented Device Connectivity (SDC) standard series was selected to specify and implement a middleware for providing the interpreted contextual information while the SRS and CAS are loosely coupled. The results were verified within a proof of concept study using the OR-Pad demonstration scenario. The fulfillment of the CAS’ requirements to act context-aware, conformity to the SDC standard series, and the effort for integrating the middleware in individual systems were evaluated. The semantically unambiguous encoding of contextual information depends on the further standardization process of the SDC nomenclature. The discussion of the validity of these results proved the applicability and transferability of the middleware.
Conclusion
The specified and implemented SDC-based middleware shows the feasibility of loose coupling an SRS with unknown CAS to realize context-aware assistance in the OR.
The focus of the developed maturity model was set on processes. The concept of the widespread CMM and its practices has been transferred to the perioperative domain and the concept of the new maturity model. Additional optimization goals and technological as well as networking-specific aspects enable a process- and object-focused view of the maturity model in order to ensure broad coverage of different subareas. The evaluation showed that the model is applicable to the perioperative field. Adjustments and extensions of the maturity model are future steps to improve the rating and classification of the new maturity model.
One of the key challenges for automatic assistance is the support of actors in the operating room depending on the status of the procedure. Therefore, context information collected in the operating room is used to gain knowledge about the current situation. In literature, solutions already exist for specific use cases, but it is doubtful to what extent these approaches can be transferred to other conditions. We conducted a comprehensive literature research on existing situation recognition systems for the intraoperative area, covering 274 articles and 95 cross-references published between 2010 and 2019. We contrasted and compared 58 identified approaches based on defined aspects such as used sensor data or application area. In addition, we discussed applicability and transferability. Most of the papers focus on video data for recognizing situations within laparoscopic and cataract surgeries. Not all of the approaches can be used online for real-time recognition. Using different methods, good results with recognition accuracies above 90% could be achieved. Overall, transferability is less addressed. The applicability of approaches to other circumstances seems to be possible to a limited extent. Future research should place a stronger focus on adaptability. The literature review shows differences within existing approaches for situation recognition and outlines research trends. Applicability and transferability to other conditions are less addressed in current work.
Purpose
For the modeling, execution, and control of complex, non-standardized intraoperative processes, a modeling language is needed that reflects the variability of interventions. As the established Business Process Model and Notation (BPMN) reaches its limits in terms of flexibility, the Case Management Model and Notation (CMMN) was considered as it addresses weakly structured processes.
Methods
To analyze the suitability of the modeling languages, BPMN and CMMN models of a Robot-Assisted Minimally Invasive Esophagectomy and Cochlea Implantation were derived and integrated into a situation recognition workflow. Test cases were used to contrast the differences and compare the advantages and disadvantages of the models concerning modeling, execution, and control. Furthermore, the impact on transferability was investigated.
Results
Compared to BPMN, CMMN allows flexibility for modeling intraoperative processes while remaining understandable. Although more effort and process knowledge are needed for execution and control within a situation recognition system, CMMN enables better transferability of the models and therefore the system. In conclusion, CMMN should be chosen as a supplement to BPMN for flexible process parts that BPMN can cover only insufficiently, or otherwise as a replacement for the entire process.
Conclusion
CMMN offers the flexibility for variable, weakly structured process parts, and is thus suitable for surgical interventions. A combination of both notations could allow optimal use of their advantages and support the transferability of the situation recognition system.
The rapid development of consumer-grade sensor technology suggests a clinical benefit of the available, decentrally collected data from patients' everyday lives for monitoring individual health status. Verifying this assumption requires deploying a suitable platform into clinical routine. To this end, the bwHealthApp is being developed, which maps both the current range and the ongoing evolution of sensor technology onto clinical use. Its flexible design allows the clinical benefit for personalized medicine to be evaluated. In addition, the bwHealthApp offers a feasibility-oriented contribution to the discussion of open legal, regulatory, and ethical questions of digitalization in medicine in Germany.
The future world of work is characterized by various shifting boundaries, so that we can speak of fluid boundaries. Factors driving this development are technology, society, and organization. One example: digital technologies such as mobile devices, clouds, and social networks enable a temporal and spatial flexibilization of work that is welcomed and actively advanced by employees and organizations alike. However, this development also makes the new world of work more stressful for many employees, above all through the blurring of the boundary between private and professional life. The implications for leadership practice are discussed.
Traditional organizations are transforming into complex value-creation systems with increasingly decentralized and digitalized forms of work organization. As notions of membership change and work shifts into digitalized spaces, the boundaries of organizations dissolve. This contribution argues that power dynamics change in such boundary-dissolved organizations. Two dynamics are examined as examples: first, those arising from the diminishing effect of bureaucratic structures as power resources when forms of organizational membership and affiliation change (e.g., freelancers, hybrid work); second, the interfaces between humans and intelligent technologies in digital workspaces and the power relations shifting as a result. The contribution aims to make the changed power dynamics more visible and thus to enable a reflective handling of power in digitally transformed organizations.
The DigiTraIn 4.0 project developed and tested a consulting concept that supports companies in successfully digitalizing their world of work. The consulting concept is based on four application-oriented instruments: the Digitalisierungsatlas (digitalization atlas) maps the digitalization of the working world in all its dimensions and makes it possible to understand the necessity as well as the opportunities and risks of the changes. Building on this, companies can use the Digitalisierungsindex (digitalization index) to individually determine the current degree of digitalization of their working world. This individual degree of digitalization serves as the starting point for the Digitalisierungskompass (digitalization compass), which enables a company to illustrate its target vision of the digital working world and to derive a company-specific transformation agenda. This contribution presents the consulting process and the development of the central instruments.
The future viability of human resource management can be measured by whether the organization has sufficient personnel, in both quality and quantity, to fulfill its purpose in dynamic environments. An important starting point is the flexibilization of staffing as well as the institutional and structural opening of organizations toward greater agility. Building on this, HR management itself must become more innovative through new ways of working and new practices, and develop a second, agile operating system in addition to its stable core. The temporally and structurally aligned interplay of the stable and agile operating systems then enables the simultaneous use of exploitative and explorative practices. To further advance the agility agenda of HR management, a systematic approach to the significance of the different agility dimensions is needed, along with the development of instruments and of objectives that an agile HR management should pursue.
The digitalization of the working world and of leadership is one of the central current challenges for organizations. One result of the AK Unternehmensführung's engagement with this topic was the insight that a holistic and integrative perspective on it is required. With the configuration-theoretical approach proposed in this contribution, it will be easier in the future to understand the interrelations and interactions of the various dimensions and elements of the digital organization and their effects on work and leadership.
Intra-operative fluoroscopy-guided assistance system for transcatheter aortic valve implantation
(2014)
A new surgical assistance system has been developed to support the correct positioning of the AVP during transapical TAVI. The developed assistance system automatically defines the target area for implanting the AVP under live 2-D fluoroscopy guidance. Moreover, this surgical assistance system works with low levels of contrast agent for the final deployment of the AVP, thereby reducing long-term negative effects such as renal failure in elderly and high-risk patients.
Container virtualization has evolved into a key technology for deployment automation in line with the DevOps paradigm. Whereas container management systems facilitate the deployment of cloud applications by employing container-based artifacts, parts of the deployment logic have already been applied beforehand to build these artifacts. Current approaches do not integrate these two deployment phases in a comprehensive manner. Limited knowledge of the application software and middleware encapsulated in container-based artifacts leads to maintainability and configuration issues. In addition, the deployment of cloud applications is based on custom orchestration solutions, leading to lock-in problems. In this paper, we propose a two-phase deployment method based on the TOSCA standard. We present integration concepts for TOSCA-based orchestration and deployment automation using container-based artifacts. Our two-phase deployment method enables capturing and aligning all the deployment logic related to a software release, leading to better maintainability. Furthermore, we build a container management system, composed of a TOSCA-based orchestrator on Apache Mesos, to deploy container-based cloud applications automatically.
An important shift in software delivery is the definition of a cloud service as an independently deployable unit following the microservices architectural style. Container virtualization facilitates development and deployment by ensuring independence from the runtime environment. Thus, cloud services are built as container-based systems: a set of containers that control the lifecycle of software and middleware components. However, using containers leads to a new paradigm for service development and operation: self-service environments enable software developers to deploy and operate container-based systems on their own ("you build it, you run it"). Following this approach, more and more operational aspects are transferred into the responsibility of software developers. In this work, we propose a concept for self-adaptive cloud services based on container virtualization in line with the microservices architectural style and present a model-based approach that assists software developers in building these services. Based on operational models specified by developers, the mechanisms required for self-adaptation are automatically generated. As a result, each container automatically adapts itself in a reactive, decentralized manner. We evaluate a prototype that leverages the emerging TOSCA standard to specify operational behavior in a portable manner.
Parallel applications are the computational backbone of major industry trends and grand challenges in science. Whereas these applications are typically constructed for dedicated High Performance Computing clusters and supercomputers, the cloud has emerged as an attractive execution environment, providing on-demand resource provisioning and a pay-per-use model. However, cloud environments require specific application properties that may restrict parallel application design. As a result, design trade-offs are required to simultaneously maximize parallel performance and benefit from cloud-specific characteristics.
In this paper, we present a novel approach to assess the cloud readiness of parallel applications based on the design decisions made. By discovering and understanding the implications of these parallel design decisions on an application's cloud readiness, our approach supports the migration of parallel applications to the cloud. We introduce an assessment procedure, its underlying meta model, and a corresponding instantiation to structure this multi-dimensional design space. For evaluation purposes, we present an extensive case study comprising three parallel applications and discuss their cloud readiness based on our approach.
Elasticity is considered to be the most beneficial characteristic of cloud environments, which distinguishes the cloud from clusters and grids. Whereas elasticity has become mainstream for web-based, interactive applications, it is still a major research challenge how to leverage elasticity for applications from the high-performance computing (HPC) domain, which heavily rely on efficient parallel processing techniques. In this work, we specifically address the challenges of elasticity for parallel tree search applications. Well-known meta-algorithms based on this parallel processing technique include branch-and-bound and backtracking search. We show that their characteristics render static resource provisioning inappropriate and the capability of elastic scaling desirable. Moreover, we discuss how to construct an elasticity controller that reasons about the scaling behavior of a parallel system at runtime and dynamically adapts the number of processing units according to user-defined cost and efficiency thresholds. We evaluate a prototypical elasticity controller based on our findings by employing several benchmarks for parallel tree search and discuss the applicability of the proposed approach. Our experimental results show that, by means of elastic scaling, the performance can be controlled according to user-defined thresholds, which cannot be achieved with static resource provisioning.
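The control loop described above, an elasticity controller that adapts the number of processing units at runtime according to user-defined cost and efficiency thresholds, can be sketched roughly as follows. The threshold values, the cost model, and the one-unit scaling steps are illustrative assumptions, not the controller actually evaluated in the paper.

```python
# Minimal sketch of a reactive elasticity controller (illustrative only):
# it scales the number of processing units according to user-defined
# efficiency and cost thresholds, as the abstract describes.

def decide_scaling(workers, speedup, cost_per_worker,
                   min_efficiency=0.75, max_cost=100.0):
    """Return the worker count for the next control interval.

    speedup: measured speedup of the parallel system at `workers` units;
    parallel efficiency is speedup / workers.
    """
    efficiency = speedup / workers
    cost = workers * cost_per_worker
    if efficiency >= min_efficiency and cost + cost_per_worker <= max_cost:
        return workers + 1          # efficient and within budget: scale out
    if efficiency < min_efficiency and workers > 1:
        return workers - 1          # diminishing returns: scale in
    return workers                  # keep current provisioning

# Example control decisions
assert decide_scaling(4, speedup=3.6, cost_per_worker=1.0) == 5
assert decide_scaling(8, speedup=4.0, cost_per_worker=1.0) == 7
```

A real controller for parallel tree search would additionally have to estimate efficiency online, since the search tree's size is not known in advance.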
Cloud resources can be dynamically provisioned according to application-specific requirements and are paid on a per-use basis. This gives rise to a new concept for parallel processing: elastic parallel computations. However, it is still an open research question to which extent parallel applications can benefit from elastic scaling, which requires resource adaptation at runtime and corresponding coordination mechanisms. In this work, we analyze how to address these system-level challenges in the context of developing and operating elastic parallel tree search applications. Based on our findings, we discuss the design and implementation of TASKWORK, a cloud-aware runtime system specifically designed for elastic parallel tree search, which enables the implementation of elastic applications by means of higher-level development frameworks. We show how to implement an elastic parallel branch-and-bound application based on an exemplary development framework and report on our experimental evaluation, which also considers several benchmarks for parallel tree search.
The cloud evolved into an attractive execution environment for parallel applications, which make use of compute resources to speed up the computation of large problems in science and industry. Whereas Infrastructure as a Service (IaaS) offerings have been commonly employed, more recently, serverless computing emerged as a novel cloud computing paradigm with the goal of freeing developers from resource management issues. However, as of today, serverless computing platforms are mainly used to process computations triggered by events or user requests that can be executed independently of each other and benefit from on-demand and elastic compute resources as well as per-function billing. In this work, we discuss how to employ serverless computing platforms to operate parallel applications. We specifically focus on the class of parallel task farming applications and introduce a novel approach to free developers from both parallelism and resource management issues. Our approach includes a proactive elasticity controller that adapts the physical parallelism per application run according to user-defined goals. Specifically, we show how to consider a user-defined execution time limit after which the result of the computation needs to be present while minimizing the associated monetary costs. To evaluate our concepts, we present a prototypical elastic parallel system architecture for self-tuning serverless task farming and implement two applications based on our framework. Moreover, we report on performance measurements for both applications as well as the prediction accuracy of the proposed proactive elasticity control mechanism and discuss our key findings.
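The trade-off described above, meeting a user-defined execution time limit while minimizing cost under per-function billing, can be illustrated with a simple sizing sketch. The perfectly divisible task farm, the cost model, and all parameter names are simplifying assumptions, not the paper's actual proactive controller.

```python
# Illustrative sketch: pick the degree of parallelism for a task farm so
# that a user-defined time limit is met. Under idealized per-function
# billing, total cost depends only on total compute time, so the smallest
# parallelism that meets the deadline is already cost-minimal here.

import math

def choose_parallelism(num_tasks, secs_per_task, time_limit_s,
                       price_per_gb_s, mem_gb=0.5, max_parallel=1000):
    """Smallest worker count whose makespan fits the time limit."""
    for workers in range(1, max_parallel + 1):
        makespan = math.ceil(num_tasks / workers) * secs_per_task
        if makespan <= time_limit_s:
            billed = num_tasks * secs_per_task * mem_gb * price_per_gb_s
            return workers, makespan, billed
    raise ValueError("time limit unreachable within max_parallel")

workers, makespan, cost = choose_parallelism(
    num_tasks=400, secs_per_task=2.0, time_limit_s=60,
    price_per_gb_s=0.0000166667)
# 400 tasks * 2 s = 800 s of work; 14 workers -> ceil(400/14) = 29 tasks
# per worker -> 58 s makespan, which meets the 60 s limit.
```

In practice, per-task runtimes are not uniform and must be predicted, which is exactly where the paper's proactive elasticity control comes in.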
In recent years, the trend toward digitalization and connectivity has changed customer expectations of B2B customer service. This article pursues two clear study objectives: it examines, first, the role of the Internet of Things (IoT) and cybersecurity as success factors for business-to-business (B2B) customer service and, second, how secure integration can contribute to a competitive advantage in the German market. Using a qualitative approach based on 20 interviews, it was found that IoT and cybersecurity can be regarded as success factors for German B2B customer service. As a result, this study delivers five core statements (hypotheses) derived from the qualitative interviews. In addition to discussing general success factors and their influence, the role of IoT in optimizing B2B customer service was examined. Potential security risks associated with the service models, necessary cybersecurity requirements, and data collection are also discussed. Finally, a model was developed that highlights internal and external aspects which help IoT and cybersecurity to be experienced as success factors along the customer activity chain in the pre-sales, sales, and after-sales phases.
This practice-oriented, cross-industry article thus provides insights based on qualitative findings for further theoretical research and enables organizations to view the topic holistically.
Forecasting demand is challenging. Different products exhibit different demand patterns: while demand for one product may be constant and regular, for another it may be sporadic, and when demand does occur it may fluctuate significantly. Forecasting errors are costly and result in obsolete inventory or unsatisfied demand. Methods from statistics, machine learning, and deep learning have been used to predict such demand patterns. Nevertheless, it is not clear which algorithm achieves the best forecast for which demand pattern. Therefore, even today a large number of models are fitted and compared on a test period, and the model with the best result on the test period is used for the actual forecast. This approach is computationally and time intensive and, in most cases, uneconomical. In our paper we show that a machine learning classification algorithm can predict the best possible model based on the characteristics of a time series. The approach was developed and evaluated on a dataset from a B2B technical retailer. The machine learning classification algorithm achieves a mean ROC-AUC of 89%, which underlines the skill of the model.
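As a flavor of the kind of time-series characteristics such a classifier can work with, the classic Syntetos-Boylan scheme labels a demand pattern from two features: the average inter-demand interval (ADI) and the squared coefficient of variation of demand sizes (CV²). This fixed-cutoff scheme is only an illustration of characteristic-based model selection, not the paper's ML classifier.

```python
# Illustrative feature extraction for demand time series: the classic
# Syntetos-Boylan scheme labels a series from two characteristics,
# the average inter-demand interval (ADI) and the squared coefficient
# of variation of demand sizes (CV^2). The paper's approach instead
# trains an ML classifier on such characteristics.

def demand_pattern(series, adi_cut=1.32, cv2_cut=0.49):
    nonzero = [x for x in series if x > 0]
    adi = len(series) / len(nonzero)             # avg. inter-demand interval
    mean = sum(nonzero) / len(nonzero)
    var = sum((x - mean) ** 2 for x in nonzero) / len(nonzero)
    cv2 = var / mean ** 2                        # squared coeff. of variation
    if adi < adi_cut:
        return "smooth" if cv2 < cv2_cut else "erratic"
    return "intermittent" if cv2 < cv2_cut else "lumpy"

assert demand_pattern([10, 11, 10, 9, 10, 11]) == "smooth"
assert demand_pattern([0, 0, 5, 0, 0, 5, 0, 0, 4]) == "intermittent"
```

A classifier like the paper's would take many such features as input and output the forecasting model expected to perform best, skipping the expensive test-period tournament.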
Digitalization increases the pressure for companies to innovate. While current research on digital transformation mostly focuses on technological and management aspects, less attention has been paid to organizational culture and its influence on digital innovations. The purpose of this paper is to identify the characteristics of organizational culture that foster digital innovations. Based on a systematic literature review across three scholarly databases, we initially found 778 articles, which we narrowed down to 23 relevant articles through a methodical approach. After analyzing these articles, we determine nine characteristics of organizational culture that foster digital innovations: corporate entrepreneurship; digital awareness and the necessity of innovations; digital skills and resources; ecosystem orientation; employee participation; agility and organizational structures; error culture and risk-taking; internal knowledge sharing and collaboration; and customer and market orientation together with open-mindedness and willingness to learn.
Delphi Markets
(2023)
Delphi markets refer to approaches and implementations that integrate prediction markets and Delphi studies (real-time Delphi). Combining the two forecasting methods can potentially compensate for each other's weaknesses: prediction markets, for example, can be used to select participants with expertise and to motivate long-term participation through their gamified approach and incentive mechanisms. In this paper, two potentials for prediction markets and four potentials for Delphi studies that are made possible by integration are derived theoretically. Subsequently, three different integration approaches are presented, on the basis of which integration at the user, market, and Delphi-question level is exemplified, and it is shown that, depending on the approach, not all potentials can be achieved. Finally, recommendations for the use of Delphi markets are derived, and existing limitations as well as future developments are pointed out.
The Internet of Things (IoT) is shaped by many different standards, protocols, and data formats that are often not compatible with each other. Thus, integrating heterogeneous IoT components into a uniform IoT setup can be a time-consuming manual task. This lack of interoperability between IoT components has been addressed with different approaches in the past; however, only very few of them rely on machine learning techniques. In this work, we present a new way towards IoT interoperability based on Deep Reinforcement Learning (DRL). In detail, we demonstrate that DRL algorithms using network architectures inspired by Natural Language Processing (NLP) can learn to control an environment by merely taking raw JSON or XML structures, which reflect the current state of the environment, as input. Applied to IoT setups, where the current state of a component is often embedded in JSON or XML structures and exchanged via messages, our NLP DRL approach eliminates the need for feature engineering and manually written code for data pre-processing, feature extraction, and decision making.
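A minimal sketch of the input idea, assuming a simple path-style tokenization: a raw JSON document reflecting a component's state is flattened into a token sequence that an NLP-style network could consume. The exact tokenization used in the paper may differ.

```python
# Sketch of the input representation idea: a raw JSON state document is
# turned into a flat sequence of key/value tokens, so no hand-written
# feature extraction per device type is needed. The "path=value" token
# format is an illustrative assumption.

import json

def json_to_tokens(raw):
    """Flatten a JSON string into a sequence of key/value tokens."""
    def walk(node, prefix):
        if isinstance(node, dict):
            for key, value in node.items():
                yield from walk(value, prefix + [key])
        elif isinstance(node, list):
            for item in node:
                yield from walk(item, prefix)
        else:
            yield "/".join(prefix) + "=" + str(node)
    return list(walk(json.loads(raw), []))

state = '{"lamp": {"on": true, "brightness": 70}, "sensors": [{"temp": 21}]}'
assert json_to_tokens(state) == ["lamp/on=True", "lamp/brightness=70",
                                 "sensors/temp=21"]
```

Such token sequences could then be embedded and fed to a sequence model whose output is the agent's action, which is the part the DRL algorithm learns.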
The acceleration and reorientation of technological progress overwhelm even large companies caught between specialization and interdisciplinary convergence. Combining internal research and development with external knowledge, especially in high technologies, thus becomes a central prerequisite for long-term corporate success. In this context, this dissertation examines the potential of cooperative behavior between companies for coping with technological discontinuities, using the imminent paradigm shift in automotive propulsion as an example. Cooperation is identified as the superior strategy for stimulating the explorative innovation mode and is integrated into an overarching dynamic of coordination suitability over the course of technological progress. With regard to automotive propulsion, a sustainability-induced destabilization of the technological paradigm of the internal combustion engine can be observed, while its intensive possibilities are becoming exhausted. The consequence is increasing innovation pressure that, oriented toward consistency, forces a systemic transformation of power plant technology and the energy grid as well as a paradigm shift to electric drives. Due to the so-far low technological maturity and high costs of electric drive systems, however, a transition in the form of a gradual reconfiguration via a hybrid phase is emerging, whose dynamics depend largely on the development of the techno-economic key modules, the battery and the fuel cell. The technological transformation this requires harbors existential threats for the established companies of the automotive industry, which find themselves in an inferior starting position relative to their challengers with respect to exploration. Precisely here, extensive potentials for the cooperative exploration of electric drives arise at the behavioral, innovation-process, and knowledge levels.
Relative to these potentials, however, the actual level of cooperation appears low, volatile and, especially in Germany, excessively focused within single sectors.
These findings yield implications for corporate management, innovation policy, and research. On the management side, the central challenge is enabling the organization to dynamize knowledge and capabilities through the simultaneous, heterogeneous coordination of explorative and exploitative innovation streams. Tapping cooperative potentials in particular, however, requires the willingness to limit one's own independence and to deviate from proven behavioral patterns. In terms of innovation policy, the focus is on overcoming forces of inertia by adapting the socio-institutional framework and on promoting long-term, potential-driven cross-sectoral cooperation. With respect to research, combining innovation, sustainability, and coordination theory in particular opens up a better understanding of the drivers and dynamics of technological progress that should be deepened further.
Background: Internationally, teledermatology has proven to be a viable alternative to conventional physical referrals. Travel costs and referral times are reduced while patient safety is preserved. Patients from rural areas in particular benefit from this healthcare innovation. Despite these established facts and positive experiences from neighboring EU countries such as the Netherlands and the United Kingdom, Germany has not yet implemented store-and-forward teledermatology in routine care.
Methods: The TeleDerm study will implement and evaluate store-and-forward teledermatology in 50 general practitioner (GP) practices as an alternative to conventional referrals. TeleDerm aims to confirm that the availability of store-and-forward teledermatology in GP practices will lead to a 15% (n = 260) reduction in referrals in the intervention arm. The study uses a cluster-randomized controlled trial design, with randomization planned at the cluster level "county". The main observational unit is the GP practice. A Poisson distribution of referrals is assumed. The evaluation of secondary outcomes such as acceptance, enablers, and barriers uses a mixed-methods design with questionnaires and interviews.
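As a rough sanity-check sketch (not the study's actual power calculation), a textbook normal approximation for comparing two Poisson rates gives the number of observation units needed per arm; a real cluster-randomized design would additionally inflate this by a design effect. The example rates below are invented for illustration.

```python
# Rough sketch only: normal-approximation sample size per arm for
# detecting a difference between two Poisson rates. The study's own
# power calculation accounts for clustering and is not reproduced here.

import math

def poisson_n_per_arm(rate_control, rate_intervention,
                      z_alpha=1.96, z_beta=0.84):
    """Units per arm for comparing two Poisson rates (approximation)."""
    diff = rate_control - rate_intervention
    return math.ceil((z_alpha + z_beta) ** 2
                     * (rate_control + rate_intervention) / diff ** 2)

# e.g., detecting a 15% reduction from an assumed rate of 2.0 referrals
# per practice-period (hypothetical numbers)
n = poisson_n_per_arm(2.0, 1.7)
```

Small effect sizes like a 15% reduction drive the required sample size up quickly, which is one reason the study over-recruits GP practices.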
Discussion: Due to the heterogeneity of GP practice organization, patient management software, information technology service providers, GP personal technical affinity and training, we expect several challenges in implementing teledermatology in German GP routine care. Therefore, we plan to recruit 30% more GPs than required by the power calculation. The implementation design and accompanying evaluation is expected to deliver vital insights into the specifics of implementing telemedicine in German routine care.
Decentralized energy systems are characterized by ad hoc planning. The missing integration of energy objectives into business strategy creates difficulties, resulting in inefficient energy architectures and decisions. Practice-proven methods such as the balanced scorecard, enterprise architecture management, and the value network approach support the transformation path toward an effective decentralized system. The methods are evaluated based on a case study. Managing multi-dimensionality, high complexity, and multiple actors are the main drivers for an effective and efficient energy management system. The underlying basis for gaining the positive impacts of these methods on decentralized corporate energy systems is the digitization of energy data and processes.
Several studies analyzed existing Web APIs against the constraints of REST to estimate the degree of REST compliance among state-of-the-art APIs. These studies revealed that only a small number of Web APIs are truly RESTful. Moreover, identified mismatches between theoretical REST concepts and practical implementations lead us to believe that practitioners perceive many rules and best practices aligned with these REST concepts differently in terms of their importance and impact on software quality. We therefore conducted a Delphi study in which we confronted eight Web API experts from industry with a catalog of 82 REST API design rules. For each rule, we let them rate its importance and software quality impact. As consensus, our experts rated 28 rules with high, 17 with medium, and 37 with low importance. Moreover, they perceived usability, maintainability, and compatibility as the most impacted quality attributes. The detailed analysis revealed that the experts saw rules for reaching Richardson maturity level 2 as critical, while reaching level 3 was less important. As the acquired consensus data may serve as valuable input for designing a tool-supported approach for the automatic quality evaluation of RESTful APIs, we briefly discuss requirements for such an approach and comment on the applicability of the most important rules.
Hypermedia as the Engine of Application State (HATEOAS) is one of the core constraints of REST. It refers to the concept of embedding hyperlinks into the response of a queried or manipulated resource to show a client possible follow-up actions and transitions to related resources. This concept thus aims to provide a client with navigational support when interacting with a Web-based application. Although HATEOAS should be implemented by any Web-based API claiming to be RESTful, API providers tend to offer service descriptions in place of embedding hyperlinks into responses. Instead of relying on navigational support, a client developer has to read the service description and identify the resources and URIs relevant for interacting with the API. In this paper, we introduce an approach that identifies transitions between resources of a Web-based API by systematically analyzing the service description only. We devise an algorithm that automatically derives a URI model from the service description and then analyzes the payload schemas to identify feasible values for substituting path parameters in URI Templates. We implement this approach as a proxy application, which injects hyperlinks representing transitions into the response payload of a queried or manipulated resource. The result is HATEOAS-like navigational support through an API. Our first prototype operates on service descriptions in the OpenAPI format. We evaluate our approach using ten real-world APIs from different domains and discuss the results as well as the observations captured in these tests.
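The core substitution step, filling path parameters in URI Templates with values found in a payload, can be sketched as follows. Matching a parameter purely by its name is a simplifying assumption; the paper's algorithm analyzes payload schemas more systematically.

```python
# Illustrative core of the described idea: substitute path parameters in
# URI Templates with candidate values drawn from a response payload, so
# that concrete follow-up links can be injected. The name-based matching
# heuristic is a simplifying assumption.

import re

def expand_uri_template(template, payload):
    """Replace each {param} with a matching payload value, if present."""
    def substitute(match):
        name = match.group(1)
        value = payload.get(name)
        return str(value) if value is not None else match.group(0)
    return re.sub(r"\{(\w+)\}", substitute, template)

payload = {"orderId": 42, "status": "open"}
assert expand_uri_template("/orders/{orderId}/items", payload) \
       == "/orders/42/items"
assert expand_uri_template("/users/{userId}", payload) == "/users/{userId}"
```

A proxy as described in the abstract would run such expansions against each response and embed the resulting URIs as hyperlinks before forwarding the payload to the client.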
From the theoretical perspective of the sociology of conventions (économie des conventions, EC), this contribution extends research on the pragmatic dimension of organizational memory. It argues, first, that conventions can be understood as organizational memory in which knowledge of how coordination problems can be solved successfully is stored. Second, drawing on the EC's notion of actor status and the concept of regimes of engagement, it discusses how actors access stored knowledge. Third, it analyzes the hitherto neglected normative dimension of organizational memory, arguing that actors justify themselves with reference to conventions when they draw on elements of organizational memory. Overall, the contribution helps to better understand the connection between collective memory and decision-making by viewing it, on the basis of the EC, as an interactionist, pragmatic, and normatively shaped negotiation of memories in concrete situations.
The aim of this contribution is to show how the sociology of conventions (EC) can help us understand the phenomenon of organizational routines. After a brief introduction to current routine research and to the EC, two potential answers to this question are presented. First, the EC can enrich existing models and conceptualizations of organizational routines; in particular, it can capture justification processes in routine action that have not been considered so far. Second, the EC can form an independent, i.e., genuine, approach to observing organizational routines. Here we follow Brandl et al. (2014, p. 314) in noting that the partial adoption of individual ideas from the EC and their integration into other theoretical concepts (such as, here, the organizational routine) is hardly able to exploit the EC's full potential for explaining organizational phenomena. This contribution therefore lays out the essential elements from which a genuinely convention-based understanding of organizational routines could be developed. The contribution closes with a discussion and a conclusion.
Organisationale Identität in digitalisierten Arbeitswelten: Grundlagen für gelingende Kooperation
(2021)
Organizations form identities, answering the questions "Who are we? And who are we not?". Notions of organizational identity initially assume traditional organizations. Through digitalization, previously integrated tasks can be modularized more strongly, so that the coordination of the overall organizational task becomes more task-oriented and less person-oriented. In addition, organizational tasks can increasingly be mapped in project-oriented and virtual forms, so that external task holders can be integrated more easily. Our notions of organizational boundaries and membership change as a result. This raises the question of the extent to which a shared organizational identity forms in such task- and project-oriented, boundary-dissolved organizations. The contribution argues that the paths of identity development change, but that the functions of organizational identity for successful cooperation are preserved.
Arbeitswelten strategisch entwickeln: mit den DigiTraIn-Instrumenten zur digitalen Transformation
(2021)
The path into the digital working world is a challenging and complex transformation for many companies. To take this path successfully, companies need working management instruments. In the DigiTraIn 4.0 project, four instruments for a successful transformation to digital work were developed and tested in corporate practice. This contribution introduces these instruments, starting from the project's objectives. It also gives an overview of the further contributions in this book, in which the instruments are explained in detail and specific aspects of the shift to the digital working world are examined.
Organisationslernen
(2019)
Through organizational learning, organizations adapt to changing environmental demands (digitalization, political reforms, etc.). Organizations can increase their learning capacity by strengthening their dynamic capabilities through a low division of labor, by questioning how they absorb knowledge, and by creating structural and temporal ambidexterity. They can orient themselves toward the model of the learning organization and promote flat organizational structures and teamwork. Organizational learning offers useful starting points especially for public administrations, which currently lack sufficient learning capacity.
Menopause is the permanent cessation of menstruation occurring naturally as women age. The most frequent symptoms associated with the menopausal phases are mucosal dryness, increased weight and body fat, and changes in sleep patterns. Oral symptoms in menopause derived from reduced saliva flow can lead to dry mouth, ulcers, and alterations of taste and swallowing patterns. However, the oral health phenotype of postmenopausal women has not been characterized. The aim of the study was to determine postmenopausal women's oral phenotype, including medical history, lifestyle, and oral assessment, through artificial intelligence algorithms. We enrolled 100 postmenopausal women attending the Dental School of the University of Seville. We administered an extensive questionnaire covering lifestyle, medication, and medical history. We used an unsupervised k-means algorithm to cluster the data following standard features for data analysis. Our results showed that the main oral symptoms in our postmenopausal cohort were reduced salivary flow and periodontal disease. Relying solely on a classical assessment of the collected data might yield a biased evaluation of postmenopausal women. We therefore analyzed our data with artificial intelligence methods, extracting the main features and providing a reduced feature set defining the oral health phenotype. We found six clusters with similar features, with medication affecting salivation and smoking emerging as essential features distinguishing the phenotypes. Thus, this integrative approach allowed us to identify the main features defining differential oral health phenotypes of postmenopausal women, providing new tools for assessing these patients in the dental clinic.
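The unsupervised clustering step mentioned in the abstract can be illustrated with a minimal k-means sketch. This is purely illustrative: the study's actual features, preprocessing, initialization, and implementation are not given in the abstract, and the toy data below is invented.

```python
# Minimal k-means sketch (illustrative only; the study's actual
# feature set, preprocessing, and initialization are assumptions).
def kmeans(points, k, iters=50):
    # Simple deterministic initialization: first k points as centroids
    # (real applications would use a randomized or k-means++ init).
    centroids = [tuple(p) for p in points[:k]]
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda i: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[i])))
            clusters[j].append(p)
        # Update step: move each centroid to its cluster mean
        # (an empty cluster keeps its previous centroid).
        for i, c in enumerate(clusters):
            if c:
                centroids[i] = tuple(sum(col) / len(c) for col in zip(*c))
    return centroids, clusters
```

With k = 6, as in the study, each cluster would group participants with a similar feature profile (e.g. medication affecting salivation, smoking status).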
Several diseases are caused by asbestos exposure, and because of the long latency period, asbestos-related mortality and morbidity are predicted to keep increasing. The current methods for investigating asbestos-related disease are mostly invasive. The aim of the present paper was therefore to investigate whether signals in human breath can be correlated with asbestos-related lung diseases, using a multi-capillary column (MCC) coupled to an ion mobility spectrometer (IMS) as a non-invasive method. Breath samples of 10 mL were taken from 25 patients suffering from asbestos-related diseases. This group includes patients with asbestos-related pleural thickening with and without pulmonary fibrosis. Twelve healthy persons constituted the control group, and their breath samples were compared with those of the BK4103 patients. In total, 83 peaks were found in the IMS chromatogram. Discrimination was possible with p-values <0.001 (99.9 %) for two peaks, <0.01 (99 %) for 5 peaks, and <0.05 (95 %) for 17 peaks. The most discriminating peaks, alpha-pinene and 4-ethyltoluene, were identified, among some others with lower p-values. The corresponding box-and-whisker plots comparing both groups are presented. In addition, a decision tree including all peaks was created, which differentiates the BK4103 (pleural plaques) group from the control group using alpha-pinene. Sensitivity was calculated as 96 %, specificity as 50 %, and positive and negative predictive values as 80 % and 86 %, respectively. Ion mobility spectrometry was thus introduced as a non-invasive method to separate the asbestos-related and healthy groups. Naturally, the findings need further confirmation in larger population groups, but they encourage further investigation.
Companies are constantly changing their business process models. In team environments, different versions of a process model are created at the same time. These versions of a process model need to be merged from time to time to consolidate changes and create a new common version.
In this short paper, we propose a solution for modifying a merge result. The goal is to create a meaningful merge result by adding connector nodes to the model at specific locations. This increases the number of possible result models and reduces additional implementation effort.
Online measurement of drug concentrations in a patient's breath is a promising approach to individualized dosing. A direct transfer from breath to blood concentrations is not possible: in non-steady-state situations, measured exhaled concentrations follow the blood concentration with a delay. It is therefore necessary to integrate the breath concentration into a pharmacological model. Two approaches to pharmacokinetic modelling are presented. Usually, a 3-compartment model is used for pharmacokinetic calculations of blood concentrations. In the first approach, this 3-compartment model is extended with a 2-compartment model based on the first compartment of the 3-compartment model and a new lung compartment. The second approach calculates a time delay of changes in the concentration of the first compartment to describe the lung concentration. Both approaches are demonstrated for modelling exhaled propofol. Based on a time series of exhaled propofol measurements, taken with an ion mobility spectrometer every minute for 346 min, the calculated plasma concentration was correlated with the breath concentration, yielding R2 = 0.99. With the time-delay approach, the new compartment coefficient ke0lung was calculated as ke0lung = 0.27 min−1 with R2 = 0.96. The described models are not limited to propofol; they could be used for any drug that is measurable in a patient's breath.
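The time-delay idea can be sketched as an explicit Euler integration of a first-order delay, dClung/dt = ke0·(C1 − Clung), where the lung concentration lags behind the central-compartment (plasma) concentration C1. Only the rate constant ke0 = 0.27 min−1 is taken from the abstract; the plasma time course, function name, and step size below are illustrative assumptions.

```python
# Illustrative Euler integration of a first-order lung-delay model:
# the lung concentration approaches the plasma concentration C1 with
# rate constant ke0. Synthetic input; only ke0 = 0.27 min^-1 is taken
# from the abstract, everything else is an assumption.
def lung_concentration(c1_series, ke0=0.27, dt=1.0, c_lung0=0.0):
    """Propagate a per-minute plasma series through dC/dt = ke0*(C1 - C)."""
    c = c_lung0
    out = []
    for c1 in c1_series:
        c += ke0 * (c1 - c) * dt  # explicit Euler step over one minute
        out.append(c)
    return out
```

For a constant plasma level, the modelled breath signal rises toward that level with the characteristic delay; fitting ke0 to measured breath data is the kind of estimation behind the reported coefficient.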
This chapter outlines the current fields of empirical research in adult and continuing education. Since its beginnings in the 1960s, this research has dealt with participants and with teaching and learning processes, and it has since been developed and differentiated both in its theoretical foundations and in its subject matter. The various theoretical perspectives are drawn upon for the empirical investigation of these differentiated subject areas. Key findings of current empirical research provide knowledge not only about participants, but also about adult learning, about programs and offerings, institutions and organizations, and their embedding in state and societal systems and educational policy decisions.
Software and system development is complex and diverse, and a multitude of development approaches are used and combined with each other to address the manifold challenges companies face today. To study the current state of the practice and to build a sound understanding of the utility of different development approaches and their application to modern software system development, in 2016 we launched the HELENA initiative. This paper introduces the 2nd HELENA workshop and provides an overview of the current project state. In the workshop, six teams present initial findings from their regions, impulse talks are given, and further steps of the HELENA roadmap are discussed.