Organisations are increasingly expected to deliberately shape digital work environments as well. New technologies and digital work practices are shifting the place where a shared identity is formed more and more into virtual spaces. So far, however, leaders and change managers have focused too much on things they can touch and shape physically. The authors therefore discuss how companies can shape and maintain organisational identity in virtual work environments too, thereby supporting change management.
The powder coating of veneered particle boards by the sequence electrostatic powder application - powder curing via hot pressing is studied in order to create high gloss surfaces. To obtain an appealing aspect, veneer sheets were glued by heat and pressure on top of particle boards and the resulting surfaces were used as carrier substrates for powder coat finishing. Prior to the powder coating, the veneered particle board surfaces were pre-treated by sanding to obtain good uniformity, and the boards were stored in a climate chamber at controlled temperature and humidity conditions to adjust an appropriate electrical surface resistance. Characterization of the surface texture was done by 3D microscopy. The surface electrical resistance was measured for the six veneers before and after their application on the particle board surface. A transparent powder top-coat was applied electrostatically onto the veneered particle board surface. Curing of the powder was done using a heated press at 130 °C for 8 min, and a smooth, glossy coating was obtained on the veneered surfaces. By applying different amounts of powder the coating thickness could be varied, and the optimum amount of powder was determined for each veneer type.
In the powder coating of veneered particle boards, the highly reactive hybrid epoxy/polyester transparent powder Drylac 530 Series from TIGER Coatings GmbH & Co. KG, Wels, Austria was used. Curing is accelerated by a mixture of catalysts, reaching curing times of 3 min at 150 °C or 5 min at 135 °C, which allows for energy and time savings and makes Drylac Series 530 powder suitable for coating temperature-sensitive substrates such as MDF and wood.
Decorative laminates based on melamine formaldehyde (MF) resin impregnated papers are used to a great extent for surface finishing of engineered wood that is used for furniture, kitchen and working surfaces, flooring and exterior cladding. In all these applications, an optically flawless appearance is a major issue. The work described here is focused on enhancing the cleanability and antifingerprint properties of smooth, matt surface-finished melamine-coated particleboards for furniture fronts, without at the same time changing or deteriorating other important surface parameters such as hardness, roughness or gloss. In order to adjust the surface polarity of a low pressure melamine film, novel interface-active macromolecular compounds were prepared and tested for their suitability as an antifingerprint additive. Two hydroxy-functional surfactants (polydimethylsiloxane, PDMS-OH and perfluoroether, PF-OH) were oxidized under mild conditions to the corresponding aldehydes (PDMS-CHO and PF-CHO) using a pyridinium chlorochromate catalyst. With the most promising oxidized polymeric additive, PDMS-CHO, the contact angles against water, n-hexadecane, and squalene increased from 79.8°, 26.3° and 31.4° for the pure MF surface to 108.5°, 54.8°, and 59.3°, respectively, for the modified MF surfaces. While for the laminated MF surface based on the oxidized fluoroether the gloss values were much higher than required, for the surfaces based on oxidized polydimethylsiloxane the technological values as well as the lower gloss values were in agreement with the requirements and showed much improved surface cleanability, as was also confirmed by colorimetric measurements.
The tale of 1000 cores: an evaluation of concurrency control on real(ly) large multi-socket hardware
(2020)
In this paper, we set out to revisit the results of “Staring into the Abyss [...] of Concurrency Control with [1000] Cores” and analyse in-memory DBMSs on today’s large hardware. Contrary to the original assumption of the authors, today we do not see single-socket CPUs with 1000 cores. Instead, multi-socket hardware made its way into production data centres. Hence, we follow up on this prior work with an evaluation of the characteristics of concurrency control schemes on real production multi-socket hardware with 1568 cores. To our surprise, we made several interesting findings which we report on in this paper.
In this paper, we propose a radical new approach for scale-out distributed DBMSs. Instead of hard-baking an architectural model, such as a shared-nothing architecture, into the distributed DBMS design, we aim for a new class of so-called architecture-less DBMSs. The main idea is that an architecture-less DBMS can mimic any architecture on a per-query basis on-the-fly without any additional overhead for reconfiguration. Our initial results show that our architecture-less DBMS AnyDB can provide significant speedup across varying workloads compared to a traditional DBMS implementing a static architecture.
In our initial DaMoN paper, we set out to revisit the results of “Staring into the Abyss [...] of Concurrency Control with [1000] Cores” (Yu et al., Proc. VLDB Endow. 8: 209-220, 2014). Contrary to their assumption, today we do not see single-socket CPUs with 1000 cores. Instead, multi-socket hardware is prevalent today and in fact offers over 1000 cores. Hence, we evaluated concurrency control (CC) schemes on a real (Intel-based) multi-socket platform. To our surprise, we made interesting findings opposing the results of the original analysis, which we discussed in our initial DaMoN paper. In this paper, we broaden our analysis further, detailing the effect of hardware and workload characteristics via additional real hardware platforms (IBM Power8 and 9) and the full TPC-C transaction mix. Among others, we identified clear connections between the performance of the CC schemes and hardware characteristics, especially concerning NUMA and the CPU cache. Overall, we conclude that no CC scheme can efficiently make use of large multi-socket hardware in a robust manner and suggest several directions on how CC schemes and OLTP DBMSs overall should evolve in future.
In this paper, we present a new approach for achieving robust performance of data structures making it easier to reuse the same design for different hardware generations but also for different workloads. To achieve robust performance, the main idea is to strictly separate the data structure design from the actual strategies to execute access operations and adjust the actual execution strategies by means of so-called configurations instead of hard-wiring the execution strategy into the data structure. In our evaluation we demonstrate the benefits of this configuration approach for individual data structures as well as complex OLTP workloads.
This booklet will give you an overview of the development of CSR from a (brief) historic point of view and will examine the underlying concepts and research. Furthermore, examples of contemporary CSR management will be explored to show how companies interpret the issue and how they face the challenges of managing the new demands placed upon them. Business, in the end, comes down to figures and numbers, which give management, shareholders and stakeholders a chance to measure a company’s success. Therefore, modern methods and approaches for measuring, rating and ranking a company’s CSR management will be presented. Finally, an attempt will be made to evaluate CSR as a tool for increasing global welfare and as a business and management strategy for companies and entrepreneurs.
Public administration and the people living and working within it are frequently confronted with an institutionalised dilemma. The reason is that, with very few exceptions, public administrative action does not serve its own ends. Instead, it serves other functional systems and is observed, and inevitably evaluated, according to their binary (functional) codes. In a modern society characterised by extremely high complexity and (international) interdependence, the individual's scope to steer and, where necessary, intervene correctively is limited.
The introduction of CSR- and value-based corporate objectives and management methods is often perceived as overwhelming by executives and employees and triggers concerns and, in some cases, fears. This phenomenon can only be countered by successful training in the underlying theories and methods. The six-step programme proposed here raises awareness of the need for a paradigm shift and provides the individuals concerned with the knowledge and tools required to meet this challenge. In six steps, it outlines a path to successful training, from the phenomenology of today's corporate world through a theoretical analysis of the situation to the presentation of suitable tools and the possible risks.
The present publication reports on the purification of two natural bone blocks, that is, an allogeneic bone block (maxgraft®, botiss biomaterials GmbH, Zossen, Germany) and a xenogeneic block (SMARTBONE®, IBI S.A., Mezzovico Vira, Switzerland), in addition to previously published results based on histology. Furthermore, specialized scanning electron microscopy (SEM) and in vitro analyses (XTT, BrdU, LDH) for testing the cytocompatibility based on ISO 10993-5/-12 were conducted. The microscopic analyses showed that both bone blocks possess a trabecular structure with a lamellar subarrangement. In the case of the xenogeneic bone block, only minor remnants of collagenous structures were found, while in contrast high amounts of collagen were found associated with the allogeneic bone matrix. Furthermore, only island-like remnants of the polymer coating seemed detectable in the case of the xenogeneic bone substitute. Finally, no remaining cells or cellular remnants were found in either bone block. The in vitro analyses showed that both bone blocks are biocompatible. Altogether, the purification level of both bone blocks seems favorable for bone tissue regeneration without the risk of inflammatory responses or graft rejection. Moreover, the analysis of the maxgraft® bone block showed that the underlying purification process preserves not only the calcified bone matrix but also high amounts of the intertrabecular collagen matrix.
Introduction: Bioresorbable collagenous barrier membranes are used to prevent premature soft tissue ingrowth and to allow bone regeneration. For volume-stable indications, only non-absorbable synthetic materials are available. This study investigates a new bioresorbable hydrofluoric acid (HF)-treated magnesium (Mg) mesh in a native collagen membrane for volume-stable situations. Materials and Methods: HF-treated and untreated Mg were compared in direct and indirect cytocompatibility assays. In vivo, 18 New Zealand White rabbits each received four 8 mm calvarial defects and were divided into four groups: (a) HF-treated Mg mesh/collagen membrane, (b) untreated Mg mesh/collagen membrane, (c) collagen membrane and (d) sham operation. After 6, 12 and 18 weeks, Mg degradation and bone regeneration were measured using radiological and histological methods. Results: In vitro, HF-treated Mg showed higher cytocompatibility. Histopathologically, HF-Mg prevented gas cavities and was degraded by mononuclear cells via phagocytosis up to 12 weeks. Untreated Mg showed, in part significantly, more gas cavities and a fibrous tissue reaction. Bone regeneration was not significantly different between the groups. Discussion and Conclusions: HF-Mg meshes embedded in native collagen membranes represent a volume-stable and biocompatible alternative to the non-absorbable synthetic materials. HF-Mg shows less corrosion and is degraded by phagocytosis. However, the application of the membranes did not result in higher bone regeneration.
Analog/mixed-signal (AMS) design verification is one of the most challenging and time-consuming tasks in today's complex system-on-chip (SoC) designs. In contrast to digital system design, AMS designers have to deal with a continuous state space of conservative quantities, highly nonlinear relationships, non-functional influences, etc., enlarging the number of possibly critical scenarios to infinity. In this special session we demonstrate the verification of functional properties using simulative and formal methods. We combine different approaches including automated abstraction and refinement of mixed-level models, state-space discretization, as well as affine arithmetic. To reach sufficient verification coverage with reasonable time and effort, we use enhanced simulation schemes to avoid conventional simulation drawbacks.
An integrated synchronous buck converter with a high-resolution dead time control for input voltages up to 48 V and 10 MHz switching frequency is presented. The benefit of an enhanced dead time control at light loads to enable zero-voltage switching at both the high-side and the low-side switch at low output load is studied. This way, compact multi-MHz DC-DC converters can be implemented at high efficiency over a wide load current range. The concept also eliminates body diode forward conduction losses and minimizes reverse recovery losses. A dead time resolution of 125 ps is realized by an 8-bit differential delay chain. A further efficiency enhancement by soft switching at the high-side switch at light load is achieved with a voltage boost of the switching node by dead time control in forced continuous conduction mode. The monolithic converter is implemented in a 180 nm high-voltage BiCMOS technology. At VIN = 48 V, VOUT = 5 V, 50 mA load, 10 MHz switching frequency and 500 nH output inductance, the efficiency is measured to be 14.4% higher than with a conventional predictive dead time control. A peak efficiency of 80.9% is achieved at 12 V input.
Different sensor types using chemical and biochemical principles are described. The former are mainly gas sensors, the latter are applied especially to liquids. Those label-free direct detection methods are compared with applications where assays take advantage of labeled receptors.
Furthermore, selected applications in the area of gas sensors are discussed, and sensors for process control, point-of-care diagnostics, environmental analytics, and food analytics are reviewed. In addition, multiplexing approaches used in microplates and microarrays are described.
On account of the huge number of sensor types and the wide range of possible applications, only the most important ones are selected here.
The share of small and medium-sized enterprises that maintain sites abroad has been growing for several years. Foreign activities of this kind often take place in low-wage countries, where, among other things, the local infrastructure and the available human resources pose various challenges, in particular for measuring and assessing productivity within production. This contribution presents technologies suited to these challenges and a possible procedure for selecting them against the background of the country-specific conditions.
All DAX30 companies communicate their cost of capital, prompted on the one hand by IFRS requirements and on the other by their wish to demonstrate value-based performance measurement and management. When calculating the cost of capital, the companies generally use the WACC approach. The depth of disclosure ranges from merely stating a percentage to fully disclosing all input factors used in the calculation. The authors argue, however, that this transparency about the cost of capital creates little added value, since the parameters entering the calculation, such as the risk-free rate, the market return or the company-specific beta, fluctuate strongly or are determined almost arbitrarily. The form of transparency currently practised by the DAX30 groups therefore offers the addressees of the annual reports only limited insight.
An administrative act that replaces an integration agreement is unlawful if the statutorily prescribed period of validity is exceeded without any exercise of discretion.
Salivary gland tumors (SGTs) are a relevant, highly diverse subgroup of head and neck tumors whose entity determination can be difficult. Confocal Raman imaging in combination with multivariate data analysis may support their correct classification. To analyze the translational potential of Raman imaging in SGT determination, a multi-stage evaluation process is necessary. By measuring a sample set of Warthin tumor, pleomorphic adenoma and non-tumor salivary gland tissue, Raman data were obtained and a thorough Raman band analysis was performed. This evaluation revealed highly overlapping Raman patterns with only minor spectral differences. Consequently, a principal component analysis (PCA) was calculated and further combined with a discriminant analysis (DA) to enable the best possible distinction. The PCA-DA model was characterized by accuracy, sensitivity, selectivity and precision values above 90% and was validated by predicting model-unknown Raman spectra, of which 93% were classified correctly. We therefore consider our PCA-DA model suitable for discriminating and predicting parotid tumor and non-tumor salivary gland tissue. For evaluation of the translational potential, further validation steps are necessary.
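As a rough illustration of such a PCA-DA pipeline (this is not the authors' implementation; the synthetic "spectra", the class offsets, and the number of components are assumptions for the sketch), chaining PCA and a linear discriminant analysis in scikit-learn might look like this:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Synthetic stand-in for Raman spectra: three tissue classes with
# strongly overlapping band patterns and only small class-specific offsets.
n_per_class, n_channels = 60, 300
base = np.abs(np.sin(np.linspace(0.0, 12.0, n_channels)))
X, y = [], []
for label, offset in enumerate([0.00, 0.05, 0.10]):
    pattern = base + offset * rng.random(n_channels)
    X.append(pattern + 0.02 * rng.standard_normal((n_per_class, n_channels)))
    y += [label] * n_per_class
X, y = np.vstack(X), np.array(y)

# PCA compresses the high-dimensional spectra; the discriminant
# analysis then separates the classes in the reduced PC space.
model = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
model.fit(X_tr, y_tr)
accuracy = accuracy_score(y_te, model.predict(X_te))
```

Held-out spectra play the role of the "model-unknown" spectra used for validation in the study.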
Glioblastoma WHO IV belongs to a group of brain tumors that are still incurable. A promising treatment approach applies photodynamic therapy (PDT) with hypericin as a photosensitizer. To generate a comprehensive understanding of the photosensitizer-tumor interactions, the first part of our study is focused on investigating the distribution and penetration behavior of hypericin in glioma cell spheroids by fluorescence microscopy. In the second part, fluorescence lifetime imaging microscopy (FLIM) was used to correlate fluorescence lifetime (FLT) changes of hypericin with environmental effects inside the spheroids. In this context, 3D tumor spheroids are an excellent model system since they include 3D cell–cell interactions and an extracellular matrix similar to tumors in vivo. Our analytical approach considers hypericin as a probe molecule for FLIM and as a photosensitizer for PDT at the same time, making it possible to directly draw conclusions about the state and location of the drug in a biological system. Knowing both state and location of hypericin makes a fundamental understanding of the impact of hypericin PDT in brain tumors possible. Following different incubation conditions, the hypericin distribution in peripheral and central cryosections of the spheroids was analyzed. Both fluorescence microscopy and FLIM revealed a hypericin gradient towards the spheroid core for short incubation periods or small concentrations. On the other hand, a homogeneous hypericin distribution is observed for long incubation times and high concentrations. In particular, the observed FLT change is crucial for the PDT efficiency, since the triplet yield, and hence the O2 activation, is directly proportional to the FLT. Based on the FLT increase inside the spheroids, an incubation time of 30 min is required to achieve the most suitable conditions for an effective PDT.
The early detection of head and neck cancer is a long-standing, challenging task. It requires a precise and accurate identification of tissue alterations as well as a distinct discrimination of cancerous from healthy tissue areas. A novel approach for this purpose uses microspectroscopic techniques with special focus on hyperspectral imaging (HSI) methods. Our proof-of-principle study presents the implementation and application of darkfield elastic light scattering spectroscopy (DF ELSS) as a non-destructive, high-resolution, and fast imaging modality to distinguish lingual healthy from altered tissue regions in a mouse model. The main aspect of our study is the comparison of two HSI detection principles, point-by-point and line scanning imaging, and whether one might be more appropriate for differentiating several tissue types. Statistical models are formed by deploying a principal component analysis (PCA) with Bayesian discriminant analysis (DA) on the elastic light scattering (ELS) spectra. Overall accuracy, sensitivity, and precision values of 98% are achieved for both models, while the overall specificity reaches 99%. An additional classification of model-unknown ELS spectra is performed. The predictions are verified with histopathological evaluations of identical HE-stained tissue areas to prove the model’s capability of tissue distinction. In the context of our proof-of-principle study, we find the Pushbroom PCA-DA model more suitable for tissue type differentiation and thus tissue classification. In addition to the HE-examination in head and neck cancer diagnosis, the usage of HSI-based statistical models might be conceivable in a daily clinical routine.
Hybrid working models are considered the future of work. This study therefore examines hybrid working models in German small and medium-sized enterprises (SMEs) in comparison with large companies. Using a multi-method study consisting of a survey and qualitative expert interviews, it evaluates to what extent hybrid working models are already established in SMEs and which challenges they have to overcome. It also considers whether sociodemographic factors such as age, gender or role in the company influence hybrid working. The results show that the establishment of hybrid working models in SMEs is less advanced than in large companies. SMEs face a variety of challenges, attributable, for example, to insufficient digitalisation or more traditional structures. Corporate culture, the role in the company and the influence of managers play a particularly important part. Practical relevance: most of the existing literature on New Work and hybrid work focuses on all company sizes taken together or on large companies. Owing to the specific characteristics of SMEs, such as limited access to resources, results from large companies can hardly be transferred to them. This study therefore provides guidance on how hybrid working models can be implemented in SMEs in a sensible and profitable way and which challenges arise.
Changing requirements and qualification profiles of employees, increasingly complex digital systems up to artificial intelligence, missing standards for the seamless embedding of existing resources, and unpredictable returns on investment are just a few examples of the challenges an SME faces in the age of digitalisation. In most cases there is a lack of suitable tools and methods to support companies in the digital transformation of their value creation processes, but also of training and learning materials. A European research project (BITTMAS - Business Transformation towards Digitalisation and Smart systems, ERASMUS+, 2016-1 DE02-KA202-003437) with international partners from science, associations and industry has addressed this issue and developed various methods and instruments to support SMEs. A literature search identified 16 suitable digitalisation concepts for production and logistics. Subsequently, a learning platform was created, comprising a literature database with multivariable sorting options by branch and digitalisation keyword, a video gallery with basic and advanced knowledge, and a glossary, in order to provide the user with consolidated and structured specialist knowledge. The 16 identified concepts for transforming value-added processes in the context of digitalisation were transferred to the learning platform, using learning paths developed in coaching and training, as online course modules including test questions. A maturity model was developed and implemented in a self-assessment tool that identifies the potential of digitalisation in production and logistics in relation to the company's current technological digitalisation level. As a result, the user receives one or more of the 16 potential digitalisation concepts as suggestions, or the delta of the necessary, not yet available enabler technologies is presented as a spider diagram.
For a successful implementation of the identified suitable digitalisation concepts in production and logistics, a further tool was developed to identify supplementary requirements for all company divisions and stakeholders in relation to the "digital transformation" in the form of a self-evaluation. This paper presents the methods and tools developed, the accompanying learning materials, and the learning platform.
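The gap analysis behind such a maturity self-assessment can be sketched as a per-dimension delta between the level a digitalisation concept requires and the company's current level (the dimension names and level values below are invented for illustration, not taken from the BITTMAS tool):

```python
# Hypothetical maturity self-assessment: the company rates its current
# level per dimension; the tool compares it against the level an
# identified digitalisation concept requires and reports the delta,
# i.e. the not-yet-available enabler technologies, per dimension.
required = {"data": 4, "automation": 3, "connectivity": 4, "skills": 3}
current = {"data": 2, "automation": 3, "connectivity": 1, "skills": 2}

delta = {dim: max(required[dim] - current[dim], 0) for dim in required}
missing_enablers = [dim for dim, gap in delta.items() if gap > 0]
```

The per-dimension `delta` values are exactly what a spider diagram would plot.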
Forecasting demand is challenging. Different products exhibit different demand patterns: while demand may be constant and regular for one product, it may be sporadic for another, and when demand does occur it may fluctuate significantly. Forecasting errors are costly and result in obsolete inventory or unsatisfied demand. Methods from statistics, machine learning, and deep learning have been used to predict such demand patterns. Nevertheless, it is not clear which algorithm achieves the best forecast for which demand pattern. Therefore, even today a large number of models is run on a test period, and the model with the best result on the test period is used for the actual forecast. This approach is computationally and time intensive and, in most cases, uneconomical. In our paper we show that a machine learning classification algorithm can predict the best possible model based on the characteristics of a time series. The approach was developed and evaluated on a dataset from a B2B technical retailer. The classification algorithm achieves a mean ROC-AUC of 89%, which underlines the skill of the model.
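A minimal sketch of this idea, under assumptions of ours (the features, the two demand patterns, and the mapping to forecasting methods are invented for illustration, not the paper's feature set): extract pattern descriptors from each series and train a classifier that predicts which forecasting family suits it.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

def demand_features(ts):
    """Descriptors of a demand series: mean level, squared coefficient
    of variation of the nonzero demand sizes, and the average
    inter-demand interval (ADI)."""
    nonzero = ts[ts > 0]
    adi = len(ts) / max(len(nonzero), 1)
    cv2 = (nonzero.std() / nonzero.mean()) ** 2 if len(nonzero) > 1 else 0.0
    return [ts.mean(), cv2, adi]

# Synthetic training data: label 0 = smooth demand (suits e.g. exponential
# smoothing), label 1 = intermittent demand (suits e.g. Croston-type methods).
X, y = [], []
for _ in range(200):
    smooth = rng.normal(100.0, 5.0, 52).clip(min=0.0)
    sparse = rng.binomial(1, 0.2, 52) * rng.normal(100.0, 30.0, 52).clip(min=0.0)
    X += [demand_features(smooth), demand_features(sparse)]
    y += [0, 1]

clf = RandomForestClassifier(n_estimators=50, random_state=0)
mean_cv_accuracy = cross_val_score(clf, X, y, cv=5).mean()
```

In the paper's setting, the labels would come from which forecasting model actually performed best on each series' test period, so the classifier replaces the exhaustive model tournament at prediction time.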
As part of the advanced research module at Reutlingen University, this work examines the requirements for, and feasibility of, computer-aided recognition of German Sign Language (DGS) and the German finger alphabet. The findings serve as a basis for developing a system that translates DGS signs or finger-alphabet signs into written German. First, basic information on the history, structure and grammar of DGS and the finger alphabet is presented. The signs are to be recognised by optical motion sensors; for this purpose, different sensor types are examined and compared. Subsequently, the user-specific and technical requirements are analysed. The former are based on a survey of a focus group of deaf and hearing people from the field of education for the deaf, hard of hearing and speech impaired. From the requirements analysis, the feasibility from a technical and user-specific point of view can be derived to a certain degree. Finally, the requirements placed on the system to be developed are summarised and a recommendation for the development of a prototype is given.
The unprecedented acceleration in the dynamics of economic development and its dependence on global interactions makes predicting the future especially difficult. Nevertheless, an examination of long-term trends provides an opportunity to begin a discussion about what reality could await us tomorrow and how we want to deal with it. With this food-for-thought paper, the member institutes of the Fraunhofer Group for Innovation Research wish to present a selection of the trends that are destined to have a significant impact on innovation systems in the period leading up to 2030. Based on these trends, the paper derives theses for innovation in the year 2030 and describes the resulting tasks for business, politics, science and society.
Purpose: The purpose of this study was to investigate the value of the web representation of certain fashion hot spots and how these results can be shown on fashion maps in an illustrated way.
Design/methodology/approach: A new ranking was created and evaluated with a self-constructed index to gain solid results. Numbers were collected from Google, Instagram, Facebook, Twitter and web.alert.io. Additionally, fashion maps were created for an illustrative visualization of the results.
Findings: Compared with the ranking of a trend forecasting agency called Global Language Monitor, which ranked non-virtual fashion cities, the web representation, and therefore the ranking of the research project, differs mainly in the position of the cities within the top 10, i.e. the rank at which a city occurs, but less in the actual cities mentioned.
Research limitations: The research was limited by the subjective analysis of the data, leading to partly subjective results, as well as by the selected number of social media platforms that were used.
Originality/value: This is the first study to explore the web representation value of fashion metropolises in comparison with their non-virtual ranking. The results partly build on existing findings concerning the transformation of fashion cities or, more generally, which cities hold the status of a fashion city.
The basic idea behind a wearable robotic grasp assistance system is to support people who suffer from severe motor impairments in daily activities. Such a system needs to act mostly autonomously and according to the user’s intent. Vision-based hand pose estimation could be an integral part of a larger control and assistance framework. In this paper we evaluate the performance of egocentric monocular hand pose estimation for a robot-controlled hand exoskeleton in a simulation. For hand pose estimation we adopt a convolutional neural network (CNN). We train and evaluate this network with computer graphics created by our own data generator. In order to guide further design decisions, our experiments focus on two egocentric camera viewpoints tested on synthetic data with the help of a 3D-scanned hand model, with and without an exoskeleton attached to it. We observe that hand pose estimation with a wrist-mounted camera performs more accurately than with a head-mounted camera in the context of our simulation. Further, a grasp assistance system attached to the hand alters the visual appearance and can improve hand pose estimation. Our experiment provides useful insights for the integration of sensors into a context-sensitive analysis framework for intelligent assistance.
Ion mobility spectrometry is a gas-analytical method positioned analytically between sensors on the one side and spectrometers on the other. Its advantage is that it can successfully analyze gas mixtures more complex than sensors can handle, on-site and online as well as at the bedside in hospitals. Examples from the field of bio- and process analytics illustrate the respective approaches and the potential, from the initial question through the specific solution to the result at the process. Both the analytical perspective and the market perspective are addressed.
Background: Conventional methods for lung cancer detection, including computed tomography (CT) and bronchoscopy, are expensive and invasive. Thus, there is still a need for an optimal lung cancer detection technique. Methods: The exhaled breath of 50 patients with lung cancer histologically proven by bronchoscopic biopsy samples (32 adenocarcinomas, 10 squamous cell carcinomas, 8 small cell carcinomas) was analyzed using ion mobility spectrometry (IMS) and compared with that of 39 healthy volunteers. As a secondary assessment, we compared adenocarcinoma patients with and without epidermal growth factor receptor (EGFR) mutation. Results: A decision tree algorithm could separate patients with lung cancer, including adenocarcinoma, squamous cell carcinoma and small cell carcinoma. One hundred fifteen separated volatile organic compound (VOC) peaks were analyzed. Peak-2, identified as n-Dodecane using the IMS database, was able to separate the groups with a sensitivity of 70.0% and a specificity of 89.7%. Incorporating a decision tree algorithm starting with n-Dodecane, a sensitivity of 76% and a specificity of 100% were achieved. Comparing VOC peaks between adenocarcinoma patients and healthy subjects, n-Dodecane was able to separate the groups with a sensitivity of 81.3% and a specificity of 89.7%. The 14 patients positive for EGFR mutation displayed significantly higher n-Dodecane levels than the 14 patients negative for EGFR (p<0.01), with a sensitivity of 85.7% and a specificity of 78.6%. Conclusion: In this prospective study, VOC peak patterns analyzed with a decision tree algorithm were useful in the detection of lung cancer. Moreover, n-Dodecane analysis in adenocarcinoma patients might be useful to discriminate the EGFR mutation.
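The single-peak separation described above boils down to thresholding one VOC peak intensity and reporting sensitivity and specificity. The following is an illustrative sketch only, not the study's actual data or decision tree; the peak values and the threshold are invented:

```python
# Illustrative sketch: classify subjects as "cancer" when a single VOC peak
# intensity (e.g. the n-Dodecane peak) exceeds a threshold, then compute
# sensitivity and specificity of that rule. All numbers are hypothetical.

def sensitivity_specificity(values, labels, threshold):
    """labels: True = cancer, False = healthy; predict cancer if value > threshold."""
    tp = sum(1 for v, l in zip(values, labels) if l and v > threshold)
    fn = sum(1 for v, l in zip(values, labels) if l and v <= threshold)
    tn = sum(1 for v, l in zip(values, labels) if not l and v <= threshold)
    fp = sum(1 for v, l in zip(values, labels) if not l and v > threshold)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical peak intensities for 5 cancer patients and 5 healthy volunteers
values = [0.9, 0.8, 0.75, 0.4, 0.85, 0.3, 0.2, 0.35, 0.25, 0.6]
labels = [True] * 5 + [False] * 5
sens, spec = sensitivity_specificity(values, labels, threshold=0.5)
print(sens, spec)  # 0.8 0.8
```

A real decision tree, as in the study, would chain several such threshold tests on different peaks.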
To remain sustainably competitive in a fast-moving, global market, innovative approaches are needed to make products visible. Pioneers such as Apple and Microsoft stand for a new way of thinking with their marketing strategies and product presentations. But how can a small or medium-sized enterprise (SME) compete with such strategies and successfully position itself and its products in the market? This article shows how a market launch concept can be designed in a modular and scalable way using the design thinking approach, based on customer needs, so that it can be adapted to the specific requirements of the product to be launched.
Additive manufacturing (AM) is increasingly used in the industrial sector as a result of continuous development. Within the production planning and control (PPC) system, AM enables an agile response in the areas of detailed and process planning, especially when a large number of machines is involved. For this purpose, a concept for a PPC system for AM is presented that takes into account the requirements for integration into the operational enterprise software system. The technical applicability is demonstrated by individually implemented sections. The presented solution approach promises more efficient utilization of the machines and more flexible use.
Development work within an experimental environment, in which certain properties are investigated and optimized, requires many test runs and is therefore often associated with long execution times, costs and risks. This affects product, material and technology development in industry and research. New digital technologies offer the possibility of automating complex manual work steps in a cost-effective way, increasing the relevance of the results and accelerating the processes many times over. In this context, this article presents a low-cost, modular and open-source machine vision system for test execution and evaluates it on the basis of a real industrial application. For this purpose, a methodology is presented for the automated execution of the load intervals, for process documentation, and for the evaluation of the generated data by means of machine learning to classify wear levels. The software and the mechanical structure are designed to be adaptable to different conditions, components and a variety of tasks in industry and research. The mechanical structure is required for tracking the test object and represents a motion platform positioned independently by machine vision operators or machine learning. The state of the test object is evaluated by transfer learning after the initial documentation run. The manual procedure for classifying the visually recorded data on the state of the test object is described for the training material. This leads to increased resource efficiency on the material as well as on the personnel side: on the one hand, the significance of the tests performed is increased by the continuous documentation; on the other hand, the responsible experts can be assigned time-efficiently. The presence and know-how of the experts are therefore only required for defined and decisive events during the execution of the experiments.
Furthermore, the generated data are suitable for later use as an additional source of data for predictive maintenance of the developed object.
Blockchain technology enables a common data basis among the participants: entries are logged and the authenticity of the participants is guaranteed. In a relationship between customers and producers, this leads to verifiable cooperation, which would be a major step as companies enter into service contracts based on the flow of many small transactions through communication. This paper proposes an architecture that enables the creation and processing of orders between customers and producers via a blockchain-based production network. The handling of larger files that remain traceable via the blockchain is shown, and the use of a public or permissioned blockchain for an application case is also considered.
The use of additive manufacturing technologies for industrial production is constantly growing. This technology differs from known production procedures. Scheduling, detailed and sequence planning are particularly important for additive production due to the long print times and the flexible use of the build area. Therefore, production-relevant variables are considered and used for the production planning and control (PPC) of additive manufacturing machines. For this purpose, an optimization model is presented that provides a time-oriented build space utilization. In the implementation, a nesting algorithm is used to check the combinability of different models for each individual print job.
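The combinability check at the heart of such a nesting step can be sketched in a strongly simplified form. This is not the paper's algorithm: real nesting packs 3D geometries, whereas the sketch below approximates each model by its footprint area and greedily fills print jobs up to a build plate's usable area; all names and numbers are illustrative:

```python
# Hypothetical simplification of a nesting/combinability check: assign models
# to print jobs greedily (first-fit-decreasing), treating each model as its
# footprint area and the build plate as a single area capacity.

def plan_print_jobs(footprints_mm2, plate_area_mm2):
    """Greedy first-fit-decreasing assignment of models to print jobs."""
    jobs = []  # each job is a list of footprint areas
    for area in sorted(footprints_mm2, reverse=True):
        for job in jobs:
            if sum(job) + area <= plate_area_mm2:
                job.append(area)  # model fits into an existing print job
                break
        else:
            jobs.append([area])  # open a new print job
    return jobs

jobs = plan_print_jobs([5000, 3000, 2000, 7000, 1000], plate_area_mm2=8000)
print(len(jobs))  # 3 jobs under these illustrative numbers
```

Combining models into fewer jobs directly raises the time-oriented build space utilization that the optimization model targets.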
The promise of immutable documents making it easier and less expensive for consumers and producers to collaborate in a verifiable way would represent enormous progress, especially as companies strive to establish service contracts based on the flow of many small transactions using machine-to-machine communication. Blockchain technology logs these data, verifies their authenticity and makes them available for service offers. This work presents an architecture that enables order processing between consumers and producers using a blockchain. In this way, the technical feasibility is shown, and the special characteristics of blockchain production networks are discussed.
Additive manufacturing (AM) is a promising manufacturing method for many industrial sectors. For this application, industrial requirements such as high production volumes and coordinated implementation must be taken into account. These tasks of the internal handling of production facilities are carried out by the production planning and control (PPC) information system. A key factor in planning and scheduling is the exact calculation of manufacturing times. For this purpose, we investigate the use of machine learning (ML) for the prediction of manufacturing times of AM facilities.
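The core idea, predicting a build time from job features, can be sketched with the simplest possible model. This is not the paper's model: the feature (layer count), the training data and the closed-form linear fit are invented for illustration; a real system would use richer features (part volume, support volume, slicing parameters) and a trained ML model:

```python
# Minimal sketch of build-time prediction: fit build time (hours) against a
# single job feature (number of layers) with closed-form least squares.
# All data points are hypothetical.

def fit_linear(xs, ys):
    """Simple linear regression y = a + b*x via closed-form least squares."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Hypothetical training data: number of layers -> observed build time (hours)
layers = [100, 200, 300, 400]
hours  = [2.0, 3.5, 5.0, 6.5]
a, b = fit_linear(layers, hours)
print(round(a + b * 250, 2))  # predicted hours for a 250-layer job -> 4.25
```

Accurate time predictions of this kind feed directly into the PPC scheduling decisions discussed above.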
The use of additive manufacturing is growing steadily in both industrial and scientific institutions and has become indispensable, particularly in prototype development. Tool-free production of parts allows production resources to be used dynamically right up to the start of manufacturing. This makes it possible, on the one hand, to react agilely to changes in detailed scheduling and sequence planning and, on the other hand, to combine models from different production orders in order to achieve high utilization of the manufacturing machines. When multiple machines are used within a company or a network of partners, the existing lack of transparency confronts companies and company networks with many challenges. Blockchain technology enables a common data basis among the participants: entries are logged and the authenticity of the participants is guaranteed. In the relationship between customers and producers, this leads to verifiable cooperation, since companies conclude service contracts based on the flow of many small transactions. This paper describes how available additive manufacturing resources can be identified and, using blockchain technology, offered in a decentralized production network and used by different actors.
Pre-clinical evaluation of advanced nerve guide conduits using a novel 3D in vitro testing model
(2018)
Autografts are the current gold standard for large peripheral nerve defects in clinics despite frequently occurring side effects like donor site morbidity. Hollow nerve guidance conduits (NGC) are proposed alternatives to autografts, but have failed to bridge gaps exceeding 3 cm in humans. Internal NGC guidance cues like microfibres are believed to enhance hollow NGCs by giving additional physical support for directed regeneration of Schwann cells and axons. In this study, we report a new 3D in vitro model that allows the evaluation of different intraluminal fibre scaffolds inside a complete NGC. The performance of electrospun polycaprolactone (PCL) microfibres inside 5 mm long polyethylene glycol (PEG) conduits was investigated in neuronal cell and dorsal root ganglion (DRG) cultures in vitro. Z-stack confocal microscopy revealed the aligned orientation of neuronal cells along the fibres throughout the whole NGC length and depth. The number of living cells in the centre of the scaffold was not significantly different from the tissue culture plastic (TCP) control. For ex vivo analysis, DRGs were placed on top of fibre-filled NGCs to simulate the proximal nerve stump. In 21 days of culture, Schwann cells and axons infiltrated the conduits along the microfibres by 2.2 ± 0.37 mm and 2.1 ± 0.33 mm, respectively. We conclude that this in vitro model can help define internal NGC scaffolds in the future by comparing different fibre materials, composites and dimensions in one setup prior to animal testing.
Thermopervaporation uses the same economically favorable driving force as membrane distillation, i.e., a temperature difference between feed and permeate for the transport, but with non-porous thin-film composite membranes. The membrane pores cannot be wetted, and long-term operational stability can be achieved with an appropriate coating layer, though normally at the cost of a lower flux compared with membrane distillation using porous hydrophobic membranes.
Porous asymmetric PVDF membranes were made to achieve low permeation resistance and pores which could be overcoated with polyelectrolyte polymers. This coating prohibits pore wetting and strongly reduces adsorption of organic substances.
Those membranes showed a high permeation rate for water due to a structure of phase-separated hydrophilic and hydrophobic three-dimensional domains. The water permeation rates of these composite membranes are between 6 and 12 l/(h m²), depending on the operational parameters, for a 2% saline solution feed at 60 °C with permeate at 40 °C. This is only a slight reduction of 10–15% in permeation rate compared to membrane distillation with porous hydrophobic membranes.
In a whey dewatering experiment, this membrane showed constant performance over 4 days in intermittent operation mode and stability during cleaning with strong alkaline solution.
A vapor permeation process for the separation of aromatic compounds from aliphatic compounds
(2014)
A number of rubbery and glassy membranes have been prepared and evaluated in vapor permeation experiments for separation of aromatic/aliphatic mixtures, using 5/95 (wt:wt) toluene/methylcyclohexane (MCH) as a model solution. Candidate membranes that met the required toluene/MCH selectivity of ≥ 10 were identified. The stability of the candidate membranes was tested by cycling the experiment between higher toluene concentrations and the original 5 wt% level. The best membrane produced has a toluene permeance of 280 gpu and a toluene/MCH selectivity of 13 when tested with a vapor feed of the model mixture at its boiling point and at atmospheric pressure. When a series of related membrane materials are compared, there is a sharp trade-off between membrane permeance and membrane selectivity. A process design study based on the experimental results was conducted. The best preliminary membrane design uses 45% of the energy of a conventional distillation process.
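Since selectivity is defined as the ratio of the two permeances, the reported toluene permeance and toluene/MCH selectivity together imply the MCH permeance. A back-of-envelope check using the figures quoted above:

```python
# Selectivity = permeance(toluene) / permeance(MCH), so the MCH permeance
# implied by the reported best membrane is simply 280 gpu / 13.

toluene_permeance_gpu = 280.0
selectivity = 13.0
mch_permeance_gpu = toluene_permeance_gpu / selectivity
print(round(mch_permeance_gpu, 1))  # ~21.5 gpu
```

The sharp permeance/selectivity trade-off mentioned in the abstract means that membranes with higher toluene permeance would let through proportionally more MCH.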
The finishing of textiles with sol-gel coatings has been pursued intensively for several years. A large number of known as well as new finishing effects can be realized via this approach. The sol-gel technique is particularly interesting because of the possibility of synthesizing multifunctional finishes. A problem in many cases is the low durability of such finishes, especially towards washing processes. The aim of the project was therefore to develop pre-treatment strategies for textile fibre materials, based on synthetic polymers or natural fibres, that improve the durability of sol-gel-based finishes. In the course of the work, functional groups were established on the polymers via suitable anchors, adapted to the respective fibre polymers (polyethylene terephthalate, polyamide, polypropylene and cotton), which are able to form covalent bonds to sol-gel-based coating systems. Trialkoxysilanes were primarily used as anchors, which additionally carry e.g. epoxy-, isocyanato-, azido- or amino-functional residues. Via these residues the anchors can be covalently bonded to the polymers. Most sol-gel-based systems contain, at least to a certain extent, SiOx and/or MexOy clusters. The alkoxysilanes used to functionalize the surfaces can generally be bound to such systems/clusters by condensation and therefore serve for the effective attachment of a wide variety of functional sol-gel layers. Correspondingly pre-functionalized substrates were subsequently coated with exemplarily selected sol-gel finishes; hydrophobizing sols were applied for the majority of the investigations.
An advantage is that the finishing effect achieved with hydrophobizing sols, as well as its durability, can be characterized with comparatively manageable effort via wettability measurements (DuPont grades, contact angles, drop penetration times). The effectiveness of the pre-treatments was then verified primarily by investigating the wash resistance of the finishes. The work showed that the durability of sol-gel finishes, or of the resulting effects, can be improved by establishing suitable anchors. At the same time, the improvements achieved depend strongly on the respective sol, i.e. improvements achieved cannot necessarily be transferred to other sols. Analytical characterizations indicate that in many cases the durability of the coating networks themselves has a far greater influence than the bonding to the substrate. Various investigations show that the add-on of the sol-gel coating decreases significantly, above all after a first wash but also beyond, often without the effect achieved by the finishing being lost. This points to a (partial) dissolution of the coating matrices, against which the anchors cannot protect, since their action is limited to the interface with the substrate. In addition to the hydrophobizing finishes, antibacterial finishes were also applied exemplarily after the corresponding pre-treatments; here too, improvements in the durability of the effect were achieved. Finally, it was investigated to what extent the pre-treatments, compared with simple finishing, negatively affect the textile products. For this purpose, relevant textile parameters such as maximum tensile forces, degrees of whiteness, stiffness or air permeability were determined.
In the vast majority of pre-treatments, these parameters were not affected, or only slightly.
Military organizations have special features, such as following different organizational laws in times of peace and war, and a specific embeddedness in society and politics. Especially the latter aspect has made the military an important object of study since the beginnings of modern sociology. In the wake of establishing specific sociological accounts, military sociology has been developed, dedicated to the different facets of the military. This research is based on different theoretical perspectives, but has hardly embraced the frameworks of the economics and sociology of conventions (EC/SC) so far. The aim of the chapter is to explore and demonstrate the potentials of this approach. In a first step, the state of the art of military sociology research is outlined, and potential avenues for analyzing military forces based on EC/SC are identified. It is argued that especially the connection to organizational theory (the military as organization) and civil-military relations, including leadership and professionalism, offers starting points. After introducing existing studies addressing military-related topics with reference to EC/SC, relevant concepts and approaches of convention theory that prove particularly enriching for military research are discussed. An outlook on possible further fields and topics of research is given to concretize what an inclusion of the EC/SC perspective could look like.
The performance and scalability of modern data-intensive systems are limited by massive data movement of growing datasets across the whole memory hierarchy to the CPUs. Such traditional processor-centric DBMS architectures are bandwidth- and latency-bound. Processing-in-Memory (PIM) designs seek to overcome these limitations by integrating memory and processing functionality on the same chip. PIM targets near- or in-memory data processing, leveraging the greater in-situ parallelism and bandwidth.
In this paper, we introduce pimDB and provide an initial comparison of processor-centric and PIM-DBMS approaches under different aspects, such as scalability and parallelism, cache-awareness, or PIM-specific compute/bandwidth tradeoffs. The evaluation is performed end-to-end on a real PIM hardware system from UPMEM.
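The bandwidth argument behind PIM can be illustrated with a back-of-envelope model. The figures below are illustrative assumptions, not UPMEM's actual specifications or pimDB's measured results: for a purely bandwidth-bound scan, runtime is roughly data volume divided by the bandwidth the compute units can draw on, and PIM units see their local banks' bandwidth in aggregate:

```python
# Hedged back-of-envelope model of a bandwidth-bound table scan.
# All bandwidth numbers are illustrative assumptions, not measured values.

def scan_seconds(data_gb, bandwidth_gb_s):
    """Lower-bound runtime of a scan limited only by memory bandwidth."""
    return data_gb / bandwidth_gb_s

data_gb = 64.0
cpu_mem_bw = 50.0                    # assumed host DRAM bandwidth, GB/s
pim_units, per_unit_bw = 2048, 0.5   # assumed PIM config: 1024 GB/s aggregate

t_cpu = scan_seconds(data_gb, cpu_mem_bw)
t_pim = scan_seconds(data_gb, pim_units * per_unit_bw)
print(round(t_cpu / t_pim, 1))  # aggregate-bandwidth speedup under these assumptions
```

The model ignores the compute limits and host-to-PIM transfer costs that a real end-to-end evaluation such as the one in this paper has to account for, which is precisely why the compute/bandwidth tradeoffs are measured on real hardware.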