Since the beginning of the energy sector liberalization, the design of energy markets has become a prominent field of research. Markets nowadays facilitate efficient resource allocation in many fields of energy system operation, such as plant dispatch, control reserve provisioning, delimitation of related carbon emissions, grid congestion management, and, more recently, smart grid concepts and local energy trading. Therefore, good market designs play an important role in enabling the energy transition toward a more sustainable energy supply for all. In this chapter, we retrace how market engineering shaped the development of energy markets and how the research focus shifted from national wholesale markets to more decentralized and location-sensitive concepts.
The maintenance of railway infrastructure remains a challenge. Data acquisition technologies have evolved because of Industry 4.0, expanding the capabilities of predictive maintenance. Despite these advances, the potential of the emerging technologies has not been fully realised. This paper presents a technology selection framework in support of railway infrastructure predictive maintenance, based on qualitative methods. It consists of three stages: the mapping of the infrastructure characteristics to the identified technologies, the evaluation of the most appropriate technologies, and the sourcing thereof. Together, these stages form the collective decision-support output of the framework.
Human bestrophin-1 protein (hBest1) is a transmembrane channel associated with the calcium-dependent transport of chloride ions in the retinal pigment epithelium, as well as with the transport of glutamate and GABA in nerve cells. Interactions between hBest1, sphingomyelins, phosphatidylcholines and cholesterol are crucial for hBest1 association with cell membrane domains and its biological functions. As cholesterol plays a key role in the formation of lipid rafts, the motional ordering of lipids and the modeling/remodeling of the lateral membrane structure, we examined the effect of different cholesterol concentrations on the surface tension of hBest1/POPC (1-palmitoyl-2-oleoyl-sn-glycero-3-phosphocholine) and hBest1/SM Langmuir monolayers in the presence/absence of Ca2+ ions using surface pressure measurements and Brewster angle microscopy. Here, we report that cholesterol (1) has a negligible condensing effect on pure hBest1 monolayers, detected mainly in the presence of Ca2+ ions, and (2) induces a condensing effect on composite hBest1/POPC and hBest1/SM monolayers. These results offer evidence for the significance of intermolecular protein–lipid interactions for the conformational dynamics of hBest1 and its biological functions as a multimeric ion channel.
This paper presents the concept of the system architecture of a flexible cyber-physical factory control system. The system allows the automation of process structures using cyber-physical fractal nodes. These nodes have a functional and independent form and can be clustered to larger structures. This makes it possible to equip the factory with a flexible, freely scalable, modular system. The description of this system architecture and the associated rules and conditions is outlined in the concept.
Software is an integral part of new features within the automotive sector. To help car manufacturers assess software quality, the Hersteller Initiative Software (HIS) consortium defined a set of metrics. Yet, problems with assigning metrics to quality attributes often occur in practice. The specified boundary values lead to discussions between contractors and clients, as different standards and metric sets are used. This paper studies metrics used in the automotive sector and the quality attributes they address. The HIS metric set, ISO/IEC 25010:2011, and ISO/IEC 26262:2018 are used to draw a big picture illustrating (i) which metrics and boundary values are reported in the literature, (ii) how the metrics match the standards, (iii) which quality attributes are addressed, and (iv) how the metrics are supported by tools. Our findings from analyzing 38 papers include a catalog of 112 metrics, of which 17 define boundary values and 48 are supported by tools. Most of the metrics are concerned with source code, are generic, and are not specifically designed for automotive software development. We conclude that many metrics exist, but a clear definition of the metrics' context, notably regarding the construction of flexible and efficient measurement suites, is missing.
Product roadmaps in the new mobility domain: state of the practice and industrial experiences
(2021)
Context: The New Mobility industry is a young market characterized by high market dynamics and therefore associated with a high degree of uncertainty. Traditional product roadmapping approaches, such as the detailed planning of features over a long time horizon, typically fail in such environments. For this reason, companies that are active in the field of New Mobility face the challenge of keeping their product roadmaps reliable for stakeholders while remaining able to react flexibly to changing market requirements.
Objective: The goal of this paper is to identify the state of practice regarding product roadmapping of New Mobility companies. In addition, the related challenges within the product roadmapping process as well as the success factors to overcome these challenges will be highlighted.
Method: We conducted semi-structured interviews with eight experts (seven from German companies and one from a Finnish company) in the field of New Mobility and performed a content analysis.
Results: Overall, the results of the study show that the participating companies are aware of the requirements that the New Mobility sector entails and therefore exhibit a high level of maturity in terms of product roadmapping. Nevertheless, some aspects were revealed that pose specific challenges for the participating companies. One major challenge, for example, is that New Mobility with public clients is often a tender business with non-negotiable product requirements; the product roadmap can thus be significantly influenced from the outside. The success factors mentioned for product roadmapping were mainly soft factors, such as trust between all people involved in the product development process and transparency throughout the entire roadmapping process.
Monodisperse polystyrene spheres are functional materials with interesting properties, such as high cohesion strength, strong adsorptivity, and surface reactivity. They have shown a high application value in biomedicine, information engineering, chromatographic fillers, supercapacitor electrode materials, and other fields. To fully understand and tailor particle synthesis, the methods for characterization of their complex 3D morphological features need to be further explored. Here we present a chemical imaging study based on three-dimensional confocal Raman microscopy (3D-CRM), scanning electron microscopy (SEM), focused ion beam (FIB), diffuse reflectance infrared Fourier transform (DRIFT), and nuclear magnetic resonance (NMR) spectroscopy for individual porous swollen polystyrene/poly(glycidyl methacrylate-co-ethylene dimethacrylate) particles. Polystyrene particles were synthesized with different co-existing chemical entities, which could be identified and assigned to distinct regions of the same particle. The porosity was studied by a combination of SEM and FIB. Images of milled particles indicated a comparable porosity on the surface and in the bulk. The combination of standard analytical techniques such as DRIFT and NMR spectroscopies yielded new insights into the inner structure and chemical composition of these particles. This knowledge supports the further development of particle synthesis and the design of new strategies to prepare particles with complex hierarchical architectures.
Context: The manufacturing industry is facing a transformation with regard to Industry 4.0 (I4). A transformation towards full automation of production including a multitude of innovations is necessary. Startups and entrepreneurial processes can support such a transformation as has been shown in other industries. However, I4 has some specifics, so it is unclear how entrepreneurship can be adapted in I4. Understanding these specifics is important to develop suitable training programs for I4 startups and to accelerate the transformation.
Objective: This study identifies and outlines the essential characteristics and constraints of entrepreneurial processes in I4.
Method: 14 semi-structured interviews were conducted with experts in the field of I4 entrepreneurship. The interviews were analysed and categorized by qualitative analyses.
Results: The interviews revealed several characteristics of I4 that have a significant impact on the various phases of the entrepreneurial process. Examples of such specifics include the difficult access to customers, the necessary deep understanding of the customer and the domain, the difficulty of testing risky assumptions, and the complex development and productization of solutions. The complexity of hardware and software components, cost structures, and necessary customer-specific customizations affect the scalability of I4 startups. These essential characteristics also require specialised skills and resources from I4 startups.
Metalworking fluids (MWFs) are widely used to cool and lubricate metal workpieces during processing to reduce heat and friction. Extending an MWF's service life is important from both economic and ecological points of view. Knowledge about the effects of processing conditions on the aging behavior, as well as reliable analytical procedures, is required to properly characterize the aging phenomena. So far, no quantitative estimations of aging effects on MWFs have been described in the literature other than univariate ones based on single-parameter measurements. In the present study, we present a simple spectroscopy-based set-up for the simultaneous monitoring of three quality parameters of an MWF and a mathematical model relating them to the most influential process factors relevant during use. For this purpose, the effects of MWF concentration, pH and nitrite concentration on the droplet size during aging were investigated by means of a response surface modelling approach. Systematically varied model MWF fluids were characterized using simultaneous measurements of absorption coefficients µa and effective scattering coefficients µ's. Droplet size was determined via dynamic light scattering (DLS) measurements. Droplet size showed a non-linear dependence on MWF concentration and pH, but the nitrite concentration had no significant effect. pH and MWF concentration showed a strong synergistic effect, which indicates that MWF aging is a rather complex process. The observed effects were similar for the DLS and the µ's values, which shows the comparability of the methodologies. The correlations of the methods were R²c = 0.928 and R²p = 0.927, as calculated by a partial least squares regression (PLS-R) model. Furthermore, using µa, it was possible to generate a predictive PLS-R model for the MWF concentration (R²c = 0.890, R²p = 0.924). Simultaneous determination of the pH based on the µ's is possible with good accuracy (R²c = 0.803, R²p = 0.732).
With prior knowledge of the MWF concentration from the µa PLS-R model, the predictive capability of the µ's PLS-R model for pH was refined (10 wt%: R²c = 0.998, R²p = 0.997). This highlights the relevance of the combined measurement of µa and µ's. Recognizing the synergistic nature of the effects of MWF concentration and pH on the droplet size is an important prerequisite for extending the service life of an MWF in the metalworking industry. The presented method can be applied as an in-process analytical tool that allows one to compensate for aging effects during use of the MWF by taking appropriate corrective measures, such as pH correction or adjustment of the concentration.
Despite its success against cancer, photothermal therapy (PTT) (>50 °C) suffers from several limitations, such as triggering inflammation, facilitating immune escape and metastasis, and damaging surrounding normal cells. Mild-temperature PTT has been proposed to overcome these shortcomings. We developed a nanosystem using HepG2 cancer cell membrane-cloaked zinc glutamate-modified Prussian blue nanoparticles with triphenylphosphine-conjugated lonidamine (HmPGTL NPs). This approach achieved an efficient mild-temperature PTT effect by downregulating the production of intracellular ATP, which disrupts a subset of the heat shock proteins that cushion cancer cells against heat. The physicochemical properties, anti-tumor efficacy, and mechanisms of HmPGTL NPs were investigated both in vitro and in vivo. Moreover, the nanoparticles cloaked with the HepG2 cell membrane substantially prolonged the circulation time in vivo. Overall, the designed nanocomposites enhance the efficacy of mild-temperature PTT by disrupting ATP production in cancer cells. We therefore anticipate that this mild-temperature PTT nanosystem holds enormous potential for various biomedical applications.
The incudo-malleal joint (IMJ) in the human middle ear is a true diarthrodial joint, and it has been known that the flexibility of this joint does not contribute to better middle-ear sound transmission. Previous studies have proposed that a gliding motion between the malleus and the incus at this joint prevents the transmission of large displacements of the malleus to the incus and stapes and thus contributes to the protection of the inner ear as an immediate response to large static pressure changes. However, the dynamic behavior of this joint under static pressure changes has not been fully revealed. In this study, the effects of the flexibility of the IMJ on middle-ear sound transmission under a static pressure difference between the middle-ear cavity and the environment were investigated. Experiments were performed in human cadaveric temporal bones with static pressures in the range of +/- 2 kPa applied to the ear canal (relative to the middle-ear cavity). Vibrational motions of the umbo and the stapes footplate center in response to acoustic stimulation (0.2-8 kHz) were measured using a 3D laser Doppler vibrometer for (1) the natural IMJ and (2) the IMJ with experimentally reduced flexibility. With the natural condition of the IMJ, vibrations of the umbo and the stapes footplate center under static pressure loads were attenuated at low frequencies below the middle-ear resonance frequency, as observed in previous studies. After the flexibility of the IMJ was reduced, additional attenuations of vibrational motion were observed for the umbo under positive static pressures in the ear canal (EC) and for the stapes footplate center under both positive and negative static EC pressures. The additional attenuation reached 4-7 dB for the umbo under positive static EC pressures and the stapes footplate center under negative EC pressures, and 7-11 dB for the stapes footplate center under positive EC pressures.
The results of this study indicate an adaptive mechanism of the flexible IMJ in the human middle ear to changes in static EC pressure, reducing the attenuation of middle-ear sound transmission. These results are expected to be useful for the diagnosis of IMJ stiffening and applicable to the design of middle-ear prostheses.
This paper presents a generic method to enhance performance and incorporate temporal information for cardiorespiratory-based sleep stage classification with a limited feature set and limited data. The classification algorithm relies on random forests and a feature set extracted from long-time home monitoring for sleep analysis. Employing temporal feature stacking, the system could be significantly improved in terms of Cohen's κ and accuracy. By stacking features before and after the epoch to be classified, the detection performance could be improved for three classes of sleep stages (Wake, REM, Non-REM sleep), four classes (Wake, Non-REM light sleep, Non-REM deep sleep, REM sleep), and five classes (Wake, N1, N2, N3/4, REM sleep) from a κ of 0.44 to 0.58, 0.33 to 0.51, and 0.28 to 0.44, respectively. Further analysis was done to find the optimal length and combination method for this stacking approach. Overall, three methods and variable durations between 30 s and 30 min were analyzed. Overnight recordings of 36 healthy subjects from the Interdisciplinary Center for Sleep Medicine at Charité-Universitätsmedizin Berlin and leave-one-out cross-validation on a patient level were used to validate the method.
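Temporal feature stacking as described above simply concatenates each epoch's feature vector with those of its neighbouring epochs before classification. The following Python sketch illustrates the idea on synthetic data; the features, stage structure, and shuffled split are illustrative assumptions, not the study's recordings (which were validated with leave-one-out cross-validation on a patient level):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Synthetic cardiorespiratory features: 600 epochs x 4 features, with three
# hypothetical stages occurring in runs of 10 epochs (stages persist in time).
stages = np.repeat(rng.integers(0, 3, size=60), 10)
features = rng.normal(size=(600, 4)) + stages[:, None]

def stack_epochs(X, before=2, after=2):
    """Concatenate each epoch's features with its temporal neighbours."""
    shifted = [np.roll(X, s, axis=0) for s in range(before, -after - 1, -1)]
    return np.concatenate(shifted, axis=1)

X_stacked = stack_epochs(features)  # 4 * (2 + 1 + 2) = 20 features per epoch
X_tr, X_te, y_tr, y_te = train_test_split(
    X_stacked, stages, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"accuracy with stacking: {acc:.3f}")
```

Because neighbouring epochs usually share the same stage, the stacked copies act as additional noisy views of the current epoch, which is why stacking can raise κ as reported above.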
Near-Data Processing is a promising approach to overcome the limitations of slow I/O interfaces in the quest to analyze the ever-growing amount of data stored in database systems. Next to CPUs, FPGAs will play an important role in the realization of functional units operating close to data stored in non-volatile memories such as Flash. It is essential that the NDP device understands the formats and layouts of the persistent data in order to perform operations in-situ. To this end, carefully optimized format parsers and layout accessors are needed. However, designing such FPGA-based Near-Data Processing accelerators requires significant effort and expertise. To make FPGA-based Near-Data Processing accessible to non-FPGA experts, we present a framework for the automatic generation of FPGA-based accelerators capable of data filtering and transformation for key-value stores, based on simple data-format specifications. The evaluation shows that our framework is able to generate accelerators that are almost identical in performance to the manually optimized designs of prior work, while requiring little to no FPGA-specific knowledge and additionally providing improved flexibility and more powerful functionality.
Massive data transfers in modern data-intensive systems, resulting from low data locality and data-to-code system designs, hurt their performance and scalability. Near-Data Processing (NDP) and a shift to code-to-data designs may represent a viable solution, as packaging combinations of storage and compute elements on the same device has become feasible. The shift towards NDP system architectures calls for a revision of established principles. Abstractions such as data formats and layouts typically span multiple layers in traditional DBMSs, and the way they are processed is encapsulated within these layers of abstraction. NDP-style processing requires an explicit definition of cross-layer data formats and accessors to ensure in-situ execution that optimally utilizes the properties of the underlying NDP storage and compute elements. In this paper, we make the case for such data format definitions and investigate the performance benefits under RocksDB and the COSMOS hardware platform.
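To make the idea of an explicit cross-layer data format concrete, the sketch below defines a hypothetical fixed record layout for a key-value store and an accessor that parses and filters records in place. The layout (length-prefixed key and value) is an illustrative assumption for this sketch, not RocksDB's actual on-disk format:

```python
import struct

# Hypothetical on-device record layout: a 4-byte little-endian key length,
# the key bytes, a 4-byte value length, then the value bytes. An NDP
# accessor generated from such a declarative layout can parse and filter
# records in-situ, without shipping raw pages to the host.
RECORD_HEADER = struct.Struct("<I")

def pack_record(key: bytes, value: bytes) -> bytes:
    return (RECORD_HEADER.pack(len(key)) + key +
            RECORD_HEADER.pack(len(value)) + value)

def iter_records(buf: bytes):
    """Accessor: walk the buffer and yield (key, value) pairs."""
    off = 0
    while off < len(buf):
        (klen,) = RECORD_HEADER.unpack_from(buf, off); off += 4
        key = buf[off:off + klen]; off += klen
        (vlen,) = RECORD_HEADER.unpack_from(buf, off); off += 4
        value = buf[off:off + vlen]; off += vlen
        yield key, value

blob = pack_record(b"sensor:1", b"42") + pack_record(b"sensor:2", b"17")
filtered = {k: v for k, v in iter_records(blob) if v != b"17"}
print(filtered)  # {b'sensor:1': b'42'}
```

On an NDP device the accessor would be a generated hardware or firmware unit rather than Python, but the principle is the same: the format definition alone determines how records are located and filtered close to storage.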
The present study shows that the topic of smart innovation (the use of AI systems in the innovation process) is highly relevant and that there is broad approval for using AI in the innovation process. Both companies and students cite efficiency gains, faster processing of large data volumes, increased competitiveness, and cost savings as reasons for using AI in the innovation process. In Germany, AI technologies are already being applied selectively in the innovation process across industries. Enabling factors such as university cooperations, innovation departments, and open innovation can promote adoption; SMEs from the early phases of industrialization in particular should make use of them. The secret to an innovation process that is as efficient as possible lies in the interplay of human expertise and the fast, precise data processing of AI. It becomes clear that several enabling factors are required to make the application of smart innovation practicable. First, the technical prerequisite of a functioning IT infrastructure must be met. Equally important are open questions regarding data availability, data ownership, and data security: without a legal framework, few actors are willing to share their data and make it accessible. The use of AI is further hampered by the national shortage of IT specialists. Both companies and students see the greatest obstacle in the lack of AI-relevant know-how, which inhibits research on the one hand and, on the other, leaves companies without the specialists required to introduce AI. It is therefore necessary to convey the potential and opportunities of smart innovation to companies by presenting application examples.
Application-oriented research must be promoted and a smooth transfer into industry ensured. This knowledge exchange also requires a greater entrepreneurial willingness to take risks, and the need to devise company-specific AI strategies is growing. The technologies are evolving rapidly, so companies must adapt to this progress in order not to fall behind and to secure their competitiveness. The greatest challenge thus lies in the fundamental transformation of business models, because the value creation of successful companies is increasingly based on digital assets. Data are generally regarded as the new resource, the raw material, for smart innovations as well. The importance of smart innovation will continue to grow in the future. In the short and medium term, weak AI will mainly support data collection and analysis, process automation, and the identification of needs and trends. Incremental improvements in innovation management are further expected from simulations and the random combination of technologies. In the long term, stronger AI will be able to partially replace humans in the innovation process. Whether autonomous innovation will be possible in the future depends first on the degree of novelty of an innovation, but above all on the possibility of a creative AI. It can be assumed that advances in AI will not only enable radical innovations but also lead to a structural change in our current understanding of innovation management.
The isothermal curing of melamine resin is investigated by in-line infrared spectroscopy at different temperatures. The infrared spectra are decomposed into time courses of characteristic spectral patterns using Multivariate Curve Resolution (MCR). It was found that, depending on the applied curing temperature, melamine films with different spectral fingerprints, and correspondingly different chemical network structures, are formed. The network structures of fully cured resin films are specific to the applied curing temperature and cannot simply be compensated for by changes in the curing time. For industrial curing processes, this means that the curing temperature is the main system-determining factor at a constant M:F ratio. However, different MF resin networks can be specifically obtained from one and the same melamine resin by suitable selection of curing time and temperature profiles to design resin functionality. The spectral fingerprints after short curing times as well as after long curing times reflect the fundamental differences in the thermoset networks that can be obtained with industrial short-cycle and multi-daylight presses.
During the curing of thermosetting resins, the technologically relevant properties of binders and coatings develop. However, curing is difficult to monitor due to the multitude of chemical and physical processes taking place, and the precise prediction of specific technological properties based on molecular properties is very difficult. In this study, the potential of principal component analysis (PCA) and principal component regression (PCR) in the analysis of Fourier transform infrared (FTIR) spectra is demonstrated using the example of melamine-formaldehyde (MF) resin curing in the solid state. FTIR/PCA-based reaction trajectories are used to visualize the influence of temperature on isothermal cure. An FTIR/PCR model for predicting the hydrolysis resistance of cured MF resins from their spectral fingerprints is presented, which illustrates the advantages of FTIR/PCR compared to the combination of differential scanning calorimetry and isoconversional kinetic analysis. The presented methodology is transferable to the curing reactions of any thermosetting resin and can be applied to model other technologically relevant final properties as well.
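As a minimal illustration of a PCA-based reaction trajectory (on simulated spectra, not the study's FTIR measurements), the sketch below generates spectra that evolve from an "uncured" band to a "cured" band during isothermal cure and projects them onto two principal components; the band positions and the linear conversion profile are assumptions for this sketch:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)

# Simulated cure: spectra shift from an "uncured" band to a "cured" band
# as the conversion alpha goes from 0 to 1 (hypothetical band positions).
wavenumbers = np.linspace(0.0, 1.0, 200)
uncured = np.exp(-((wavenumbers - 0.3) ** 2) / 0.005)
cured = np.exp(-((wavenumbers - 0.7) ** 2) / 0.005)
alpha = np.linspace(0.0, 1.0, 50)  # conversion over time
spectra = np.outer(1 - alpha, uncured) + np.outer(alpha, cured)
spectra += rng.normal(scale=0.01, size=spectra.shape)  # spectral noise

pca = PCA(n_components=2)
scores = pca.fit_transform(spectra)  # one point per time step: the trajectory
explained = pca.explained_variance_ratio_.sum()
print(f"variance captured by 2 PCs: {explained:.3f}")
```

Plotting the score rows in order of time traces the reaction trajectory; runs at different cure temperatures would produce distinguishable trajectories in this score space.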
The annexation of Crimea, the warfare in Syria, the financial engagement in Cyprus, the tug-of-war over Ukraine and Belarus, and the naming of the vaccine against the corona epidemic "Sputnik V" are clear evidence of Russia's current striving for power and its expansionist policy. It is therefore of interest to ask what opinion Friedrich List (1789–1846) held of Russia, especially since that opinion, set out in his writings, appears as topical today as it did 180 to 190 years ago. It is examined and comprehensively presented for the first time in this essay.
For five decades, research into the life, work, and reception history of Friedrich List (1789–1846) has been at the center of Eugen Wendler's scholarly work. Over time, some 30 monographs and a large number of scholarly essays and journalistic articles have been produced. In doing so, Eugen Wendler built on the invaluable groundwork of the editors of the complete edition of List's works, published from 1925 to 1935.
This essay provides an overview of Eugen Wendler's book publications on List research. With his impressive oeuvre he avows himself to be the last living fossil in the succession of the FLG, thereby paying the editors the due and long-overdue appreciation and respect.
The Friedrich List Society (Friedrich-List-Gesellschaft, FLG) was founded in 1925 under the most adverse economic and political circumstances and conditions and was continued until 1934. Its primary purpose was to collect the widely scattered, hard-to-access, and in many cases unknown writings, speeches, and letters of Friedrich List (1789–1846) and to publish them in the form of a complete edition.
Neither this ten- or twelve-volume complete edition nor the names of its editors have received due appreciation and attention in economics. The long-overdue debt of gratitude is repaid in this contribution after nearly 100 years. Without the committed and courageous efforts of the editors, in particular Edgar Salin, List research would be unthinkable and German economics would be poorer by one glorious chapter.
At literally the last minute, the English government and the European Union agreed on a comprehensive treaty to prevent an unregulated Brexit. After the years-long, grinding negotiation marathon the jubilation is muted, yet there is relief on both sides of the English Channel because a modus vivendi has been found on which future relations can be built and continued. Whether the English pipe dreams attached to Brexit will be fulfilled remains to be seen.
The strategy and tactics of the English governments on Brexit and in the withdrawal negotiations mirror the experiences Friedrich List had to undergo exactly 175 years ago in his efforts to bring about a German-English alliance. Because of the insular and trade supremacy that England strictly pursued even then, he had to concede that England would defend this position tenaciously, and, frustrated and disillusioned, he abandoned his plans. He therefore placed his hope in a "continental alliance" of the European nations, such as has now come into being after the United Kingdom's withdrawal from the European Union. Perhaps we will now have to get used to the term "continental alliance" and be reminded of Friedrich List's foresight.
On the other hand, the motto of List's second Paris prize essay also applies to English policy: "Le monde marche - the world moves on", though under completely different auspices than 175 years ago: the axis of world trade has shifted from the western to the eastern hemisphere; the British Empire is history; the pace of global change has accelerated dramatically; and despite the lingua franca, England appears, especially from an Asian perspective, as no more than a small dot on the world map. Should the Scottish government prevail with its intention and achieve independence from the United Kingdom, Brexit would prove to be a fateful boomerang.
Successful transitions to a sustainable bioeconomy require novel technologies, processes, and practices as well as a general agreement about the overarching normative direction of innovation. Both requirements necessarily involve collective action by those individuals who purchase, use, and co-produce novelties: the consumers. Based on theoretical considerations borrowed from evolutionary innovation economics and consumer social responsibility, we explore to what extent consumers’ scope of action is addressed in the scientific bioeconomy literature. We do so by systematically reviewing bioeconomy-related publications according to (i) the extent to which consumers are regarded as passive vs. active, and (ii) different domains of consumer responsibility (depending on their power to influence economic processes). We find all aspects of active consumption considered to varying degrees but observe little interconnection between domains. In sum, our paper contributes to the bioeconomy literature by developing a novel coding scheme that allows us to pinpoint different aspects of consumer activity, which have been considered in a rather isolated and undifferentiated manner. Combined with our theoretical considerations, the results of our review reveal a central research gap which should be taken up in future empirical and conceptual bioeconomy research. The system-spanning nature of a sustainable bioeconomy demands an equally holistic exploration of the consumers’ prospective and shared responsibility for contributing to its coming of age, ranging from the procurement of information on bio-based products and services to their disposal.
Railway operators are challenged by increasing complexity and by the need to safeguard the availability of passenger rolling stock, bringing maintenance, and especially emerging technologies, into focus. This paper presents a model for the selection and implementation of Industry 4.0 technologies in rolling stock maintenance. The model consists of different stages and considers the main components of rolling stock, the related appropriate maintenance strategies, and Industry 4.0 technologies, taking into account the maturity level of the railway operator. Relevant criteria and the main prerequisites of the technologies were identified. The model proposes relevant activities and was validated by industry experts.
Accurate and safe neurosurgical intervention can be affected by intra-operative tissue deformation, known as brain shift. In this study, we propose an automatic, fast, and accurate deformable method, called iRegNet, for registering pre-operative magnetic resonance images to intra-operative ultrasound volumes to compensate for brain shift. iRegNet is a robust end-to-end deep learning approach for the non-linear registration of MRI-iUS images in the context of image-guided neurosurgery. The pre-operative MRI (as the moving image) and the iUS (as the fixed image) are first fed to our convolutional neural network, which estimates a non-rigid transformation field. The MRI image is then warped into the iUS coordinate system using the output displacement field. Extensive experiments were conducted on two multi-location databases, BITE and RESECT. Quantitatively, iRegNet reduced the mean landmark errors from pre-registration values of 4.18 ± 1.84 mm and 5.35 ± 4.19 mm to 1.47 ± 0.61 mm and 0.84 ± 0.16 mm for the BITE and RESECT datasets, respectively. Additional qualitative validation was conducted by two expert neurosurgeons, who overlaid MRI-iUS pairs before and after the deformable registration. Experimental findings show that the proposed iRegNet is fast and achieves state-of-the-art accuracy, outperforming existing approaches. Furthermore, iRegNet delivers competitive results even on images it was not trained on, demonstrating its generality, and can therefore be valuable in intra-operative neurosurgical guidance.
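The last step of such a pipeline, warping the moving image into the fixed-image coordinate system with a predicted displacement field, can be sketched as follows. This is a minimal 2D illustration using SciPy; iRegNet's actual network and learned field are not reproduced here, and the image and field below are toy data:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def warp_with_displacement(moving, disp):
    """Warp a 2D moving image with a dense displacement field.

    moving: (H, W) array, e.g. a pre-operative image slice.
    disp:   (2, H, W) array of per-pixel (dy, dx) displacements that map
            fixed-image coordinates into moving-image coordinates.
    """
    h, w = moving.shape
    yy, xx = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    coords = np.stack([yy + disp[0], xx + disp[1]])
    # Linear interpolation at the displaced sampling locations.
    return map_coordinates(moving, coords, order=1, mode="nearest")

# Sanity check: a pure translation by one pixel along x.
img = np.zeros((5, 5))
img[2, 3] = 1.0
disp = np.zeros((2, 5, 5))
disp[1] += 1.0  # sample one pixel to the right
warped = warp_with_displacement(img, disp)
```

Sampling one pixel to the right shifts image content one pixel to the left, which is the usual backward-warping convention in deformable registration.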
A hybrid deep registration of MR scans to interventional ultrasound for neurosurgical guidance
(2021)
Despite recent advances in image-guided neurosurgery, reliable and accurate estimation of brain shift remains one of the key challenges. In this paper, we propose an automated multimodal deformable registration method using hybrid learning-based and classical approaches to improve neurosurgical procedures. Initially, the moving and fixed images are aligned using a classical affine transformation (MINC toolkit), and the result is then provided to a convolutional neural network, which predicts the deformation field. Subsequently, the moving image is warped with the resultant deformation field into the moved image. Our model was evaluated on two publicly available datasets: the retrospective evaluation of cerebral tumors (RESECT) and brain images of tumors for evaluation (BITE). The mean target registration errors were reduced from 5.35 ± 4.29 to 0.99 ± 0.22 mm on RESECT and from 4.18 ± 1.91 to 1.68 ± 0.65 mm on BITE. Experimental results show that our method improved on the state of the art in terms of both accuracy and runtime (170 ms on average). Hence, the proposed method offers a fast runtime for a 3D MRI to intra-operative US pair in a GPU-based implementation, which shows promise for its applicability in assisting neurosurgical procedures by compensating for brain shift.
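The target registration error reported above is simply the mean Euclidean distance between corresponding landmarks in the fixed and registered images. A minimal sketch, using made-up 3D landmark coordinates rather than the RESECT/BITE data:

```python
import numpy as np

def mean_tre(landmarks_fixed, landmarks_moved):
    """Mean target registration error: the average Euclidean distance
    between corresponding landmark pairs (in mm)."""
    diffs = np.asarray(landmarks_fixed) - np.asarray(landmarks_moved)
    return float(np.mean(np.linalg.norm(diffs, axis=1)))

# Hypothetical landmarks before and after registration.
fixed = np.array([[10.0, 20.0, 30.0], [15.0, 25.0, 35.0]])
before = fixed + np.array([[3.0, 4.0, 0.0], [0.0, 0.0, 5.0]])  # 5 mm off each
after = fixed + np.array([[0.3, 0.4, 0.0], [0.0, 0.0, 0.5]])   # 0.5 mm off each

tre_before = mean_tre(fixed, before)  # 5.0 mm
tre_after = mean_tre(fixed, after)    # 0.5 mm
```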
Purpose
Computerized medical image processing assists neurosurgeons in localizing tumours precisely and plays a key role in modern image-guided neurosurgery. Hence, we developed a new open-source toolkit, Slicer-DeepSeg, for efficient and automatic brain tumour segmentation based on deep learning methodologies, to aid clinical brain research.
Methods
Our toolkit consists of three main components. First, Slicer-DeepSeg extends the 3D Slicer application and thus provides support for multiple input/output data formats and 3D visualization libraries. Second, Slicer core modules offer powerful image processing and analysis utilities. Third, the Slicer-DeepSeg extension provides a customized GUI for brain tumour segmentation using deep learning-based methods.
Results
The developed Slicer-DeepSeg was validated using a public dataset of high-grade glioma patients. The results showed that our proposed platform considerably outperforms other 3D Slicer cloud-based approaches.
Conclusions
The developed Slicer-DeepSeg allows the development of novel AI-assisted medical applications in neurosurgery. Moreover, it can enhance the outcomes of computer-aided diagnosis of brain tumours. The open-source Slicer-DeepSeg is available at github.com/razeineldin/Slicer-DeepSeg.
This paper presents a permanent magnet tubular linear generator system for powering passive sensors by harvesting vertical vibration energy. The system consists of a permanent magnet tubular linear vibration generator and electric circuits. By using a mechanically resonant mover design, the generator is capable of converting low-frequency, small-amplitude vertical vibration energy into more regular sinusoidal electrical energy. The distribution of the magnetic field and the electromotive force are calculated by finite element analysis. The characteristics of the linear vibration generator system are observed. The experimental results show that the generator can produce about 0.4 W to 1.6 W of electrical power when the vibration source's amplitude is fixed at 2 mm and the frequency lies between 13 Hz and 22 Hz.
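The underlying principle can be illustrated with Faraday's law: for a coil of N turns moving in a radial field B, the open-circuit EMF is e(t) = N B l v(t), and a sinusoidal vibration of amplitude A gives a peak velocity of A omega. This is only a back-of-the-envelope sketch, not the paper's FEA model, and all parameter values below (turns, field strength, coil length) are hypothetical; only the 2 mm amplitude and 13-22 Hz range come from the experiments:

```python
import math

def peak_emf(n_turns, b_field, coil_length, amplitude, freq_hz):
    """Peak open-circuit EMF of a coil moving sinusoidally in a radial
    magnetic field: e_peak = N * B * l * A * omega (Faraday's law)."""
    omega = 2 * math.pi * freq_hz  # angular frequency in rad/s
    return n_turns * b_field * coil_length * amplitude * omega

# Hypothetical generator: 200 turns, 0.5 T field, 50 mm active coil
# length, 2 mm vibration amplitude, at the two ends of the tested band.
e13 = peak_emf(200, 0.5, 0.05, 0.002, 13)
e22 = peak_emf(200, 0.5, 0.05, 0.002, 22)
```

Since the peak EMF scales linearly with frequency at fixed amplitude, the output grows across the 13-22 Hz band, consistent with the rising power reported in the experiments.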
Platforms and their surrounding ecosystems are becoming increasingly important components of many companies' strategies. Artificial Intelligence, in particular, has created new opportunities to create and develop ecosystems around a platform. However, there is not yet a methodology for systematically developing these new opportunities as part of an enterprise development strategy. Therefore, this paper aims to lay a foundation for the conceptualization of Artificial Intelligence-based service ecosystems building on a Service-Dominant Logic. The basis for this conceptualization is the study of value creation and, in particular, of effective network effects. This research investigates the fundamental idea of extending specific digital concepts, considering the influence of Artificial Intelligence on the design of intelligent services and on the architecture of digital platforms and ecosystems, to enable a smooth evolutionary path and adaptability for human-centric collaborative systems and services. The paper explores an extended digital enterprise conceptual model based on a combined, iterative, and continuous task of co-creating value between humans and intelligent systems, as part of a new idea of cognitively adapted intelligent services.
The current advancement of Artificial Intelligence (AI), combined with other digitalization efforts, significantly impacts service ecosystems. Artificial Intelligence opens up substantial new opportunities for the co-creation of value and the development of intelligent service ecosystems. Motivated by experiences and observations from digitalization projects, this paper presents new methodological perspectives and experiences from academia and practice on architecting intelligent service ecosystems, and explores the impact of Artificial Intelligence through real cases that support an ongoing validation. Digital enterprise architecture models serve as an integral representation of the business, information, and technological perspectives of intelligent service-based enterprise systems to support their management and development. The paper focuses on architectural models for intelligent service ecosystems, showing the fundamental business mechanism of AI-based value co-creation together with the corresponding digital architecture and management models, and presents the key architectural model perspectives for the development of such ecosystems.
The master's program Human-Centered Computing at Reutlingen University is shaped by the collaboration of humans and computers. Sensor technology plays an important role at this interface and lends this year's Informatics Inside conference its motto "perceive(it)". The wordplay on "it", standing both for information technology and for the English personal pronoun, highlights the duality of humans perceiving information technology and computers perceiving reality. The tension between artificial sensory extensions and the intelligent processing of sensor data enables countless applications in digital media, virtual worlds, realistic simulations, computer-supported work processes, and intelligent assistance systems in production, the household, medicine, and mobility. My attention here is devoted to practice-oriented research as the driver of this technical progress.
The Thirteenth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2021), held from May 30 to June 3, 2021, continued a series of international events covering a broad spectrum of topics related to advances in database fundamentals, the evolving relationship between databases and other domains, database technologies and content processing, as well as the specifics of application-domain databases.
Advances in different technologies and domains related to databases have triggered substantial improvements in content processing, information indexing, and data, process, and knowledge mining. The push came from web services, artificial intelligence, and agent technologies, as well as from the widespread adoption of XML.
High-speed communications and computation, large storage capacities, and load balancing for distributed database access allow new approaches to content processing with incomplete patterns, advanced ranking algorithms, and advanced indexing methods.
Evolution in e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems puts pressure on the database community to extend the de facto methods to support new requirements in terms of scalability, privacy, performance, indexing, and the heterogeneity of both content and technology.
Why, of all things, did we choose a robot for our cover that strongly resembles Robbi from "Robbi, Tobbi und das Fliewatüüt"? A robot from the 80s as a symbol for the future of work? Not quite. Rather, it stands for the beginnings of automation, which was prophesied to bring the end of work. Today, its modern, agile successor peeks cheekily around the corner. Today's "Robbi" has emerged from a continuous technological development that has changed our working world considerably and will continue to change it. As you will see, "Robbi" has been highly adaptable. But what does that mean for us? What might it look like, the future of work? And what will it change for each of us? We put these questions to professors from all faculties. In their research, they engage with digital work models and sustainable education concepts, with hospitals of the future and artificial intelligence. Much differs, yet much is also the same: it is about trust. What does increasing digitalization mean for our work culture? It is about responsibility, toward ourselves and toward others. It is about diversity. Who are they, the workers of tomorrow? It is about networking, which is omnipresent in a digital world. For us as a university, the "future of work" is a particularly important topic, because our students of today will move in and shape this new working world tomorrow. Which competencies must we teach them? That is a question we keep asking ourselves anew.
For more than 12 years, Informatics Inside has taken place as a computer science conference at Reutlingen University, this year for the second time on a semi-annual rhythm, i.e., also in autumn. This scientific conference of the master's program Human-Centered Computing is organized and run by the students themselves. During their master's studies, they are given the opportunity to specialize in a subject of their own choosing. This can be carried out at the university, in a company, at a research institute, or abroad. It is precisely this flexible design of the module "Wissenschaftliche Vertiefung" (scientific specialization) that leads to the very broad range of topics the students work on. Besides the subject-specific specialization itself, the presentation and defense of scientific results also play an important role, well beyond the studies themselves. Preparing and communicating a chosen subject area so that it becomes understandable even to non-specialists is always a particular challenge. The students take on this challenge at the autumn conference on scientific specialization on November 24, 2021. For the fourth time, the event will take place in an online mode, including a virtual accompanying program.
The range of topics at this year's autumn conference is once again very diverse and broad. Among others, you can expect contributions from the health sector, machine learning, AI and VR, as well as marketing and e-learning. Common to all of them is a very strong connection to innovative computer science approaches, which is also reflected in the conference's wordplay and motto "RockIT Science". Computer science permeates almost all professional and private areas of application and has an ever greater influence on our daily lives. This can trigger concern on the one hand and enthusiasm on the other. It is precisely the latter that the students want to achieve with their contributions, letting it "rock" in the computer science sector for once.