The critical process parameters cell density and viability during mammalian cell cultivation are assessed by UV/VIS spectroscopy in combination with multivariate data analysis methods. This direct optical detection technique uses a commercial optical probe to acquire spectra in a label-free way without signal enhancement. For the cultivation, an inverse cultivation protocol is applied, which simulates the exponential growth phase by exponentially replacing cells and metabolites of a growing Chinese hamster ovary cell batch with fresh medium. For the simulation of the death phase, a batch of growing cells is progressively replaced by a batch of completely starved cells. Thus, the most important parts of an industrial batch cultivation are easily imitated. The cell viability was determined by the well-established partial least squares regression (PLS) method. To further improve process knowledge, the viability was also determined from the spectra based on a multivariate curve resolution (MCR) model. With this approach, the progress of the cultivations can be continuously monitored solely based on a UV/VIS sensor. Thus, the monitoring of critical process parameters, especially the viable cell density, is possible inline within a mammalian cell cultivation process. In addition, the beginning of cell death can be detected by this method, which allows the cell viability to be determined with acceptable error. The combination of inline UV/VIS spectroscopy with multivariate curve resolution generates additional process knowledge complementary to PLS and is considered a suitable process analytical tool for monitoring industrial cultivation processes.
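As an illustration of the chemometric step, the following is a minimal sketch, assuming spectra arranged as rows of a matrix X and offline viability reference values y, of how a PLS regression model could be calibrated and evaluated. The data, component count and error metric are placeholders, not values from the study.

```python
# Minimal sketch (not the authors' pipeline): predicting viability from
# UV/VIS spectra with PLS regression, assuming spectra in a matrix X
# (rows = measurements, columns = wavelengths) and reference viability
# values y from an offline assay.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 300))          # placeholder spectra (120 samples, 300 wavelengths)
y = rng.uniform(60, 100, size=120)       # placeholder viability in percent

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

pls = PLSRegression(n_components=5)      # number of latent variables is a tuning choice
pls.fit(X_train, y_train)
y_pred = pls.predict(X_test).ravel()
print("MAE [% viability]:", mean_absolute_error(y_test, y_pred))
```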
Purpose: Gliomas are the most common and aggressive type of brain tumors due to their infiltrative nature and rapid progression. Distinguishing tumor boundaries from healthy cells is still a challenging task in the clinical routine. The fluid-attenuated inversion recovery (FLAIR) MRI modality can provide the physician with information about tumor infiltration. Therefore, this paper proposes a new generic deep learning architecture, namely DeepSeg, for fully automated detection and segmentation of brain lesions using FLAIR MRI data.
Methods: The developed DeepSeg is a modular decoupling framework. It consists of two connected core parts based on an encoding and decoding relationship. The encoder part is a convolutional neural network (CNN) responsible for spatial information extraction. The resulting semantic map is inserted into the decoder part to obtain the full-resolution probability map. Based on a modified U-Net architecture, different CNN models such as the residual neural network (ResNet), the dense convolutional network (DenseNet), and NASNet have been utilized in this study.
Results: The proposed deep learning architectures have been successfully tested and evaluated online on the MRI datasets of the brain tumor segmentation (BraTS 2019) challenge, comprising 336 cases as training data and 125 cases as validation data. The Dice and Hausdorff distance scores of the obtained segmentation results are about 0.81 to 0.84 and 9.8 to 19.7, respectively.
Conclusion: This study demonstrated the feasibility and comparative performance of applying different deep learning models in a new DeepSeg framework for automated brain tumor segmentation in FLAIR MR images. The proposed DeepSeg is open source and freely available at https://github.com/razeineldin/DeepSeg/.
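To make the encoder-decoder idea concrete, the following is a minimal sketch, not the published DeepSeg code, of a small convolutional encoder-decoder that maps a single-channel FLAIR slice to a full-resolution lesion probability map. The layer sizes and input shape are illustrative assumptions; backbones such as ResNet or DenseNet would replace the toy encoder shown here.

```python
# Minimal sketch (not the published DeepSeg code): an encoder-decoder CNN for
# binary lesion segmentation of single-channel FLAIR slices. The encoder
# extracts spatial features; the decoder upsamples back to a full-resolution
# probability map.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
    )

class TinySegNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc1 = conv_block(1, 16)
        self.enc2 = conv_block(16, 32)
        self.pool = nn.MaxPool2d(2)
        self.up = nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False)
        self.dec1 = conv_block(32, 16)
        self.head = nn.Conv2d(16, 1, 1)       # 1-channel logit map

    def forward(self, x):
        e1 = self.enc1(x)                     # full-resolution features
        e2 = self.enc2(self.pool(e1))         # half-resolution semantic map
        d1 = self.dec1(self.up(e2))           # back to full resolution
        return torch.sigmoid(self.head(d1))   # per-pixel lesion probability

model = TinySegNet()
probs = model(torch.randn(1, 1, 128, 128))    # dummy FLAIR slice
print(probs.shape)                            # torch.Size([1, 1, 128, 128])
```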
Elasticity is considered to be the most beneficial characteristic of cloud environments, which distinguishes the cloud from clusters and grids. Whereas elasticity has become mainstream for web-based, interactive applications, it is still a major research challenge how to leverage elasticity for applications from the high-performance computing (HPC) domain, which heavily rely on efficient parallel processing techniques. In this work, we specifically address the challenges of elasticity for parallel tree search applications. Well-known meta-algorithms based on this parallel processing technique include branch-and-bound and backtracking search. We show that their characteristics render static resource provisioning inappropriate and the capability of elastic scaling desirable. Moreover, we discuss how to construct an elasticity controller that reasons about the scaling behavior of a parallel system at runtime and dynamically adapts the number of processing units according to user-defined cost and efficiency thresholds. We evaluate a prototypical elasticity controller based on our findings by employing several benchmarks for parallel tree search and discuss the applicability of the proposed approach. Our experimental results show that, by means of elastic scaling, the performance can be controlled according to user-defined thresholds, which cannot be achieved with static resource provisioning.
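As a rough illustration of such a controller, the sketch below shows a threshold-based scaling decision driven by measured speedup, a user-defined efficiency bound and an hourly budget. The function names, thresholds and cost figures are hypothetical and not taken from the paper.

```python
# Minimal sketch (hypothetical names, not the authors' controller): a
# threshold-based elasticity decision that grows the worker pool while
# parallel efficiency stays above a user-defined bound and the projected
# cost stays within budget.
def parallel_efficiency(speedup, workers):
    """Efficiency = speedup / number of workers."""
    return speedup / workers

def decide_scaling(speedup, workers, cost_per_worker_h, budget_per_h,
                   min_efficiency=0.7):
    """Return +1 (scale out), -1 (scale in) or 0 (keep) for the next interval."""
    eff = parallel_efficiency(speedup, workers)
    projected_cost = (workers + 1) * cost_per_worker_h
    if eff >= min_efficiency and projected_cost <= budget_per_h:
        return +1          # still efficient and affordable: add a worker
    if eff < min_efficiency and workers > 1:
        return -1          # efficiency below threshold: release a worker
    return 0

# Example: speedup 11.2 measured on 16 workers at 0.5 $/h each, 10 $/h budget.
print(decide_scaling(speedup=11.2, workers=16, cost_per_worker_h=0.5, budget_per_h=10.0))
```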
Background
The current task of electrocardiographic examinations is to increase the reliability of diagnosing the condition of the heart. Within the framework of this task, an important direction is the solution of the inverse problem of electrocardiography, based on the processing of electrocardiographic signals from multichannel cardio leads with known electrode coordinates in these leads (Titomir et al., Noninvasive electrocardiotopography, 2003; Macfarlane et al., Comprehensive Electrocardiology, 2nd ed. (Chapter 9), 2011).
Results
In order to obtain more detailed information about the electrical activity of the heart, we carry out a reconstruction of the distribution of equivalent electrical sources on the heart surface. In this work, we perform the reconstruction of the equivalent sources during the cardiac cycle at relatively low hardware cost. ECG maps of electrical potentials on the surface of the torso (TSPM) and of electrical sources on the surface of the heart (HSSM) were studied at different times of the cardiac cycle. We carried out a visual and quantitative comparison of these maps in the presence of pathological regions of different localization. For this purpose, we used a model of the heart's electrical activity based on cellular automata.
Conclusions
The model of cellular automata allows us to consider the processes of heart excitation in the presence of pathological regions of various sizes and localization. It is shown that changes in the distribution of electrical sources on the surface of the epicardium in the presence of pathological areas with disturbances in the conduction of heart excitation are much more noticeable than changes in ECG maps on the torso surface.
Standardisation of breath sampling is important for the application of breath analysis in clinical settings. By studying the effect of room airing on indoor and breath analytes and by generating time series of room air with different sampling intervals, we sought to gain further insights into room air metabolism, to detect the relevance of exogenous VOCs, and to draw conclusions about their consideration in the interpretation of exhaled breath. Room air and exhaled breath of a healthy subject were analysed before and after room airing. Furthermore, a time series of room air with doors and windows closed was taken over 84 h by automatic sampling every 180 min. A second time series studied room air analytes over 70 h with samples taken every 16.5 min. For breath and room air measurements, an IMS coupled to a multi-capillary column (IMS/MCC) [Bio-Scout®, B&S Analytik GmbH, Dortmund, Germany] was used. The peaks were characterized using the software Visual Now (B&S Analytik, Dortmund, Germany) and identified using the software package MIMA (version 1.1, provided by the Max Planck Institute for Informatics, Saarbrücken, Germany) and the database 20160426_SubstanzDbNIST_122 (B&S Analytik GmbH, Dortmund, Germany). In the morning, 4 analytes (Decamethylcyclopentasiloxane [541-02-6]; Pentan-2-one [107-87-9] dimer; Hexan-1-al [66-25-1]; Pentan-2-one [107-87-9] monomer) showed high intensities in the room air and exhaled breath. They were significantly but not equally reduced by room airing. The time series over 84 h showed a time-dependent decrease of analytes (limonene monomer and dimer; Decamethylcyclopentasiloxane, Butan-1-ol, Butan-1-ol) as well as an increase (Pentan-2-one [107-87-9] dimer). Shorter sampling intervals exhibited circadian variations of analyte concentrations for many analytes. Breath sampling in the morning requires room airing before starting; then the variation of the intensity of indoor analytes can be kept small. The time series of indoor analytes show that their intensities behave differently, with time-dependent declines, constant increases and circadian variations, depending on room airing. This has implications for the breath sampling procedure and the interpretation of exhaled breath.
Exogenous factors of influence on exhaled breath analysis by ion-mobility spectrometry (MCC/IMS)
(2019)
The interpretation of exhaled breath analysis needs to address the influence of exogenous factors, especially the transfer of confounding analytes by the test persons. A test person who had been exposed to a disinfectant underwent exhaled breath analysis by MCC/IMS (Bioscout®) after different time intervals. Additionally, a new sampling method with inhalation of synthetic air before breath analysis was tested. After exposure to the disinfectant, 3-Pentanone monomer, 3-Pentanone dimer, Hexanal, 3-Pentanone trimer, 2-Propanamine, 1-Propanol, Benzene and Nonanal showed significantly higher intensities in exhaled breath and in the air of the examination room, compared to the corresponding baseline measurements. Only one ingredient of the disinfectant (1-Propanol) was among the 8 analytes. Prolonging the time interval between exposure and breath analysis showed a decrease of their intensities; however, the half-times of the decrease differed. The inhalation of synthetic air reduced the exogenous and also relevant endogenous analytes more than consistently airing the examination room with fresh air did, leading to a reduction and even a change of polarity of the alveolar gradient. The interpretation of exhaled breath requires further knowledge about the previous whereabouts of the proband and about the likelihood and relevance of the inhalation of local, site-specific and confounding exogenous analytes by the test person. Their inhalation facilitates a transfer to the examination room and the detection of high concentrations in room air and exhaled breath, but also the exhalation of new analytes. This may lead to a misinterpretation of these analytes as endogenous or disease-specific ones.
In the spring of 1817, Friedrich List, then a professor at the University of Tübingen, travelled to Frankfurt am Main, where the famous Easter fair was taking place at the time. There he met the leaders of the merchants, who complained that the tentative economic development was suffering severely from the many customs barriers and from cheap imports from England. They therefore demanded the abolition of internal tariffs and the formation of an economic union. On behalf of the merchants, List wrote his now-famous petition to the Federal Assembly, the loose representative body of the German Confederation in Frankfurt. When the petition was received with great acclaim, List, elated by his success, spontaneously founded the "Allgemeiner Deutscher Handels- und Gewerbsverein", the first interest group of German merchants. He thereby laid the foundation for the political process that led to the founding of the Zollverein (customs union) of 1834, which in turn was the precursor to the founding of the German Empire in 1871. List's demands of that time are highly topical once again.
Owing to the high complexity involved, many companies fail to exploit the opportunities of the digital transformation of the working world and to avoid its risks. To shape digitalization actively, the fields of action relevant to the respective digitalization initiatives must be identified. This is where the present research comes in. Based on a single case study in a medium-sized German insurance company, this article analyzes the concrete effects of the digital transformation on the employees involved and discusses the implications. For this purpose, a digitalization project was examined, namely the digitalization of previously paper-based analog business processes (electronic files). Based on the execution and analysis of 24 interviews, in which the direct effects of the change measure were captured and analyzed from the perspective of employees and managers, 10 fields of action were identified in which the working world of the company under study is being changed by the digitalization of the business process.
In this paper, we deal with optimizing the monetary costs of executing parallel applications in cloud-based environments. Specifically, we investigate how the scalability characteristics of parallel applications impact the total costs of computations. We focus on a specific class of irregularly structured problems, where the scalability typically depends on the input data. Consequently, dynamic optimization methods are required for minimizing the costs of computation. For quantifying the total monetary costs of individual parallel computations, the paper presents a cost model that considers the costs of the parallel infrastructure employed as well as the costs caused by delayed results. We discuss a method for dynamically finding the number of processors for which the total costs, based on our cost model, are minimal. Our extensive experimental evaluation gives detailed insights into the performance characteristics of our approach.
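To illustrate the kind of trade-off such a cost model captures, the following sketch combines an Amdahl-style runtime estimate with infrastructure costs and a penalty for delayed results, then scans processor counts for the cheapest configuration. The formulas and numbers are illustrative assumptions, not the paper's model.

```python
# Minimal sketch of a total-cost model of the kind described above (illustrative
# formulas, not the paper's exact model): infrastructure cost grows with the
# number of processors, while the penalty for delayed results shrinks as the
# runtime decreases with (imperfect) speedup.
def runtime_h(workload_h, processors, serial_fraction=0.05):
    """Amdahl-style runtime estimate for a given processor count."""
    return workload_h * (serial_fraction + (1 - serial_fraction) / processors)

def total_cost(processors, workload_h, price_per_proc_h, delay_penalty_per_h):
    t = runtime_h(workload_h, processors)
    infrastructure = processors * t * price_per_proc_h   # pay-per-use resources
    delay = t * delay_penalty_per_h                       # cost of waiting for results
    return infrastructure + delay

# Scan processor counts and pick the cheapest configuration.
candidates = range(1, 129)
best = min(candidates, key=lambda p: total_cost(p, workload_h=100,
                                                price_per_proc_h=0.10,
                                                delay_penalty_per_h=5.0))
print(best, total_cost(best, 100, 0.10, 5.0))
```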
Parallel applications are the computational backbone of major industry trends and grand challenges in science. Whereas these applications are typically constructed for dedicated High Performance Computing clusters and supercomputers, the cloud emerges as an attractive execution environment, which provides on-demand resource provisioning and a pay-per-use model. However, cloud environments require specific application properties that may restrict parallel application design. As a result, design trade-offs are required to simultaneously maximize parallel performance and benefit from cloud-specific characteristics.
In this paper, we present a novel approach to assess the cloud readiness of parallel applications based on the design decisions made. By discovering and understanding the implications of these parallel design decisions on an application's cloud readiness, our approach supports the migration of parallel applications to the cloud. We introduce an assessment procedure, its underlying meta model, and a corresponding instantiation to structure this multi-dimensional design space. For evaluation purposes, we present an extensive case study comprising three parallel applications and discuss their cloud readiness based on our approach.
Software evolvability is an important quality attribute, yet one difficult to grasp. A certain base level of it is allegedly provided by service- and microservice-based systems, but many software professionals lack a systematic understanding of the reasons and preconditions for this. We address this issue via the proxy of architectural modifiability tactics. By qualitatively mapping principles and patterns of Service Oriented Architecture (SOA) and microservices onto tactics and analyzing the results, we can not only generate insights into service-oriented evolution qualities, but also provide a modifiability comparison of the two popular service-based architectural styles. The results suggest that both SOA and microservices possess several inherent qualities beneficial for software evolution. While both focus strongly on loose coupling and encapsulation, there are also differences in the way they strive for modifiability (e.g. governance vs. evolutionary design). To leverage the insights of this research, however, it is necessary to find practical ways to incorporate the results as guidance into the software development process.
Artefaktkorrektur und verfeinerte Metriken für ein EEG-basiertes System zur Müdigkeitserkennung
(2019)
Objective: Fatigue is an often underestimated yet major problem in road traffic. Of around 2.5 million traffic accidents in Germany in 2015, 2898 accidents, with a total of 59 fatalities (~1.7% of road deaths), were attributable to overtiredness; estimates assume an unreported figure of up to 20%. In a first study of our own, we examined whether a mobile EEG can reliably detect fatigue states in a driving simulator. The detection rate was only 61%. The aim of this work is to improve the measurement system used. To this end, the accuracy is increased by artifact correction and by refined quality metrics. Detected overtiredness is then indicated to the driver in an appropriate way so that he or she can react accordingly.
Patients and methods: Independent component analysis (ICA) is a multivariate method for analyzing several random variables. To decide whether a driver is currently tired or awake, the feature vector created with ICA for each sequence is classified; for this purpose, a trained machine-learning algorithm is used that is able to assign even unknown data sets to classes. To obtain the required frequency values, a Fourier transform was computed for each EEG channel. In the next step, the resulting feature vector is classified by an artificial neural network. For training, previously created feature vectors are labeled with the classes "awake" and "tired". These data are randomly shuffled and split into a training set and a test set at a ratio of 2:1. The experiment was carried out with eight subjects, each completing two 45-minute test drives.
Results: The complete data set consists of 150,000 signal values, which are grouped into approximately 7000 sequences. After applying the quality metric, 4370 sequences remain for training. There are clear differences with respect to sequences that are invalid due to EEG artifacts: in the "awake" state, three times as many sequences are discarded as in the "tired" state. Overall, on average about 50% of the sequences of awake subjects are discarded, compared with only 25% for tired subjects. On average, the system achieves a detection rate of 73% for both states. If only the ratio of "awake" to "tired" is considered and "slight tiredness" is left out, the results exceed 90%.
Conclusions: The results show that attention decreases and fatigue increases during the experiment. This is illustrated, on the one hand, by subjective and objective observations of signs of fatigue; on the other hand, measurable and classifiable differences can be demonstrated in the EEG signal. The theta waves used as features showed a lower amplitude towards the end of the experiment. Extending the binary classification leads to a further stabilization of the results, and artifact correction and quality metrics further increase the quality of the data. The developed fatigue detection application identifies measurable signs of fatigue and can make a sound decision about fitness to drive.
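As a rough illustration of such a pipeline, the sketch below computes frequency-band features per EEG channel with an FFT and trains a small neural-network classifier using a 2:1 train/test split. The sampling rate, band limits, network size and the synthetic data are assumptions, not details taken from the study.

```python
# Minimal sketch (not the study's implementation): frequency-band features per
# EEG channel via FFT, followed by a small neural-network classifier for
# "awake" vs. "tired", with a random 2:1 train/test split as described above.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

FS = 256  # assumed sampling rate in Hz

def band_power(segment, fs, low, high):
    """Mean spectral power of one channel segment within [low, high] Hz."""
    spectrum = np.abs(np.fft.rfft(segment)) ** 2
    freqs = np.fft.rfftfreq(len(segment), d=1.0 / fs)
    mask = (freqs >= low) & (freqs < high)
    return spectrum[mask].mean()

def features(sequence, fs=FS):
    """Theta, alpha and beta band power for every channel of one sequence."""
    bands = [(4, 8), (8, 13), (13, 30)]
    return np.array([band_power(ch, fs, lo, hi) for ch in sequence for lo, hi in bands])

# Placeholder data: 300 sequences, 4 channels, 2 s each, with random labels.
rng = np.random.default_rng(1)
X = np.array([features(rng.normal(size=(4, 2 * FS))) for _ in range(300)])
y = rng.integers(0, 2, size=300)         # 0 = awake, 1 = tired (dummy labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=1/3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0).fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```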
Recently, practitioners have begun appraising an effective customer journey design (CJD) as an important source of customer value in increasingly complex and digitalized consumer markets. Research, however, has neither investigated what constitutes the effectiveness of CJD from a consumer perspective nor empirically tested how it affects important variables of consumer behavior. The authors define an effective CJD as the extent to which consumers perceive multiple brand-owned touchpoints as designed in a thematically cohesive, consistent, and context-sensitive way. Analyzing consumer data from studies in two countries (4814 consumers in total), they provide evidence of the positive influence of an effective CJD on customer loyalty through brand attitude — over and above the effects of brand experience. Importantly, an effective CJD more strongly influences utilitarian brand attitudes, while brand experience more strongly affects hedonic brand attitudes. These underlying mechanisms are also prevalent when testing for the contingency factors services versus goods, perceived switching costs, and brand involvement.
Empirical software engineering experts on the use of students and professionals in experiments
(2018)
Using students as participants remains a valid simplification of reality that is needed in laboratory contexts. It is an effective way to advance software engineering theories and technologies but, like any other aspect of study settings, should be carefully considered during the design, execution, interpretation, and reporting of an experiment. The key is to understand which portion of the developer population is represented by the participants in an experiment. To this end, a proposal for describing experimental participants is put forward.
The focus of the developed maturity model is on processes. The concept of the widespread CMM and its practices was transferred to the perioperative domain and into the concept of the new maturity model. Additional optimization goals as well as technological and networking-specific aspects enable a process- and object-focused view of the maturity model in order to ensure broad coverage of different subareas. The evaluation showed that the model is applicable to the perioperative field. Adjustments and extensions of the maturity model are future steps to improve the rating and classification of the new maturity model.
Background: Internationally, teledermatology has proven to be a viable alternative to conventional physical referrals. Travel cost and referral times are reduced while patient safety is preserved. Especially patients from rural areas benefit from this healthcare innovation. Despite these established facts and positive experiences from EU neighboring countries like the Netherlands or the United Kingdom, Germany has not yet implemented store-and-forward teledermatology in routine care.
Methods: The TeleDerm study will implement and evaluate store-and-forward teledermatology in 50 general practitioner (GP) practices as an alternative to conventional referrals. TeleDerm aims to confirm that offering store-and-forward teledermatology in GP practices leads to a 15% (n = 260) reduction in referrals in the intervention arm. The study uses a cluster-randomized controlled trial design. Randomization is planned at the cluster level "county". The main observational unit is the GP practice. A Poisson distribution of referrals is assumed. The evaluation of secondary outcomes such as acceptance, enablers, and barriers uses a mixed-methods design with questionnaires and interviews.
Discussion: Due to the heterogeneity of GP practice organization, patient management software, information technology service providers, GP personal technical affinity and training, we expect several challenges in implementing teledermatology in German GP routine care. Therefore, we plan to recruit 30% more GPs than required by the power calculation. The implementation design and accompanying evaluation is expected to deliver vital insights into the specifics of implementing telemedicine in German routine care.
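To illustrate the stated Poisson assumption and the 15% reduction target, the following sketch simulates referral counts per practice in a control and an intervention arm. The baseline rate and number of practices are placeholders; this is not the trial's actual power calculation.

```python
# Minimal sketch (illustrative only, not the study's power calculation):
# simulating yearly referral counts per GP practice under the stated Poisson
# assumption, comparing a control rate with a 15% lower intervention rate.
# The baseline rate of 35 referrals per practice is an assumed placeholder.
import numpy as np

rng = np.random.default_rng(0)
n_practices = 25                       # practices per arm (assumption)
baseline_rate = 35                     # mean referrals per practice (assumption)

control = rng.poisson(baseline_rate, size=n_practices)
intervention = rng.poisson(0.85 * baseline_rate, size=n_practices)

print("mean referrals, control:     ", control.mean())
print("mean referrals, intervention:", intervention.mean())
print("observed relative reduction: ", 1 - intervention.mean() / control.mean())
```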
Historically, research and development (R&D) in the pharmaceutical sector has predominantly been an in-house activity. To enable investments in game-changing late-stage assets and to allow better and less costly go/no-go decisions, most companies have adopted a fail-early paradigm through the implementation of clinical proof-of-concept organizations. To fuel their pipelines, some pioneers started to complement their internal R&D efforts through collaborations as early as the 1990s. In recent years, multiple extrinsic and intrinsic factors induced an opening towards external sources of innovation and resulted in new models for open innovation, such as open sourcing, crowdsourcing, public–private partnerships, innovation centres, and the virtualization of R&D. Three factors seem to determine the breadth and depth of how companies approach external innovation: (1) the company's legacy, (2) the company's willingness and ability to take risks and (3) the company's need to control IP and competitors. In addition, these factors often constitute the major hurdles to effectively leveraging external opportunities and assets. Conscious and differentiated choices of the R&D and business models for different companies, and for different divisions within the same company, seem to best allow a company to fully exploit the potential of both internal and external innovations.
Container virtualization has evolved into a key technology for deployment automation in line with the DevOps paradigm. Whereas container management systems facilitate the deployment of cloud applications by employing container-based artifacts, parts of the deployment logic have already been applied earlier, when these artifacts are built. Current approaches do not integrate these two deployment phases in a comprehensive manner. Limited knowledge of the application software and middleware encapsulated in container-based artifacts leads to maintainability and configuration issues. Besides, the deployment of cloud applications is based on custom orchestration solutions, leading to lock-in problems. In this paper, we propose a two-phase deployment method based on the TOSCA standard. We present integration concepts for TOSCA-based orchestration and deployment automation using container-based artifacts. Our two-phase deployment method enables capturing and aligning all deployment logic related to a software release, leading to better maintainability. Furthermore, we build a container management system, composed of a TOSCA-based orchestrator on top of Apache Mesos, to deploy container-based cloud applications automatically.
The relative pros and cons of using students or practitioners in experiments in empirical software engineering have been discussed for a long time and continue to be an important topic. Following the recent publication of “Empirical software engineering experts on the use of students and professionals in experiments” by Falessi, Juristo, Wohlin, Turhan, Münch, Jedlitschka, and Oivo (EMSE, February 2018) we received a commentary by Sjøberg and Bergersen. Given that the topic is of great methodological interest to the community and requires nuanced treatment, we invited two editorial board members, Martin Shepperd and Per Runeson, respectively, to provide additional views.
Back to the future: origins and directions of the “Agile Manifesto” – views of the originators
(2018)
In 2001, seventeen professionals set up the manifesto for agile software development. They wanted to define values and basic principles for better software development. Beyond the values and principles it brought into focus, the manifesto has been widely adopted by developers, in software-developing organizations, and outside the world of IT. Agile principles and their implementation in practice have paved the way for radically new and innovative ways of software and product development. In parallel, the understanding of the manifesto's underlying principles has evolved over time. This, in turn, may affect current and future applications of agile principles. This article presents results from a survey and an interview study conducted in collaboration with the original contributors of the manifesto for agile software development. Furthermore, it comprises the results from a workshop with one of the original authors. This publication focuses on the origins of the manifesto, the contributors' views from today's perspective, and their outlook on future directions. We evaluated 11 responses from the survey and 14 interviews to understand the viewpoint of the contributors. They emphasize that agile methods need to be carefully selected and that agile should not be seen as a silver bullet. They underline the importance of considering the variety of different practices and methods that influenced the manifesto. Furthermore, they mention that people should question their current understanding of "agile" and recommend reconsidering the core ideas of the manifesto.
An interactive clothing design and a personalized virtual display with the user's own face are presented in this paper to meet the requirements of personalized clothing customization. A customer-interactive clothing design approach based on genetic engineering ideas is analyzed, taking a suit as an example. Thus, customers can rearrange clothing style elements, choose from available colors and fabrics, and come up with their own personalized suit style. A web 3D customization prototype system for personalized clothing is developed based on Unity3D and VR technology. The layout of the system's structure and functions, together with its workflow, is given. Practical issues such as 3D face scanning, suit style design, fabric selection, and accessory choices are also addressed. Tests of the prototype system indicate that it can show realistic clothing and fabric effects and offer users an effective visual and customization experience.
The influence of turbidity on the Raman signal strengths of condensed matter is theoretically analyzed and measured with laboratory-scale equipment for remote sensing. The results show the quantitative dependence of back- and forward-scattered signals on the thickness and elastic-scattering properties of matter. In the extreme situation of thin, highly turbid layers, the measured Raman signal strengths exceed their transparent analogs by more than a factor of ten. The opposite behavior is found for thick layers of low turbidity, where the presence of a small amount of scatterers leads to a decrease of the measured signal. The wide range of turbidities appearing in nature is experimentally realized with stacked polymer layers and solid/liquid dispersions, and theoretically modeled by the equation of radiative transfer using the analytical diffusion approximation or random walk simulations.
Purpose: Human breath analysis is increasingly proposed as a useful tool in clinical applications. We performed this study to find characteristic volatile organic compounds (VOCs) in the exhaled breath of patients with idiopathic pulmonary fibrosis (IPF) that allow discrimination from healthy subjects. Methods: VOCs in the exhaled breath of 40 IPF patients and 55 healthy controls were measured using a multi-capillary column and ion mobility spectrometer. The patients were examined by pulmonary function tests, blood gas analysis, and serum biomarkers of interstitial pneumonia. Results: We detected 85 VOC peaks in the exhaled breath of IPF patients and controls. IPF patients showed 5 significant VOC peaks: p-cymene, acetoin, isoprene, ethylbenzene, and an unknown compound. The VOC peak of p-cymene was significantly lower (p < 0.001), while the VOC peaks of acetoin, isoprene, ethylbenzene, and the unknown compound were significantly higher (p < 0.001 for all) compared with the peaks of controls. Comparing VOC peaks with clinical parameters, negative correlations with VC (r = −0.393, p = 0.013), %VC (r = −0.569, p < 0.001), FVC (r = −0.440, p = 0.004), %FVC (r = −0.539, p < 0.001), DLco (r = −0.394, p = 0.018), and %DLco (r = −0.413, p = 0.008) and a positive correlation with KL-6 (r = 0.432, p = 0.005) were found for p-cymene. Conclusion: We found 5 characteristic VOCs in the exhaled breath of IPF patients. Among them, the VOC peaks of p-cymene were related to the clinical parameters of IPF. These VOCs may be useful biomarkers of IPF.
Newly developed active pharmaceutical ingredients (APIs) are often poorly soluble in water. As a result, the bioavailability of the API in the human body is reduced. One approach to overcome this restriction is the formulation of amorphous solid dispersions (ASDs), e.g., by hot-melt extrusion (HME). Thus, the poorly soluble crystalline form of the API is transferred into a more soluble amorphous form. To reach this aim in HME, the APIs are embedded in a polymer matrix. The resulting amorphous solid dispersions may contain small amounts of residual crystallinity and have the tendency to recrystallize. For the controlled release of the API in the final drug product, the amount of crystallinity has to be known. This review assesses the available analytical methods that have recently been used for the characterization of ASDs and the quantification of crystalline API content. Well-established techniques like near- and mid-infrared spectroscopy (NIR and MIR, respectively), Raman spectroscopy, and emerging ones like UV/VIS, terahertz, and ultrasonic spectroscopy are considered in detail. Furthermore, their advantages and limitations are discussed with regard to general practical applicability as process analytical technology (PAT) tools in industrial manufacturing. The review focuses on spectroscopic methods which have proven most suitable for in-line and on-line process analytics. Further aspects are spectroscopic techniques that have been or could be integrated into an extruder.
Wege der Gewinnermittlung
(2017)
If a company makes a profit, this does not necessarily mean that everything is settled. The decisive question is how the profit was determined, because only with the right method does one obtain the appropriate perspective: on the success of an individual transaction, on the profit of a period, on the operating assets, on liquidity, or on the balance sheet.
EBIT & Co.
(2017)
A whole range of key figures is used in business administration to determine and manage corporate profit. But not all of them are suited to the same purpose. Depending on the question at hand, different key figures should be used, and their interpretation must, not least, be industry-specific.
A realistic risk assessment is the basis of responsible corporate decisions. But how can risks be assessed correctly? Various risk management instruments make it possible to systematically identify, quantify, evaluate, and document risks.
Risks are not bad per se, as long as the return achieved is adequate for the risk taken. However, this relationship is not always understood, which was one of the reasons for the financial crisis of 2008/09. The key figures presented in this article show how risks can be related to achieved or potential returns.
Whoever invests in a company does so in order to earn money in the future and expects a risk-adequate return. The selection of the key figures that make this increase in value transparent is, however, not trivial, because they determine whether corporate targets are set correctly and whether the incentives for management are set properly.
Revenue and profits stagnate at a high level, and yet the share price and earnings per share rise, a development that can be observed at Apple or eBay, for example. Shareholders should know which arithmetic lies behind such developments and which methods they can best use to determine a company's value.
This paper studies whether a monetary union can be managed solely by a rule-based approach. The Five Presidents' Report of the European Union rejects this idea. It suggests a centralisation of powers. We analyse the philosophy of policy rules from the vantage point of the German economic school of thought. There is evidence that a monetary union consisting of sovereign states is well organised by rules, together with the principle of subsidiarity. The root cause of the euro crisis is rather the weak enforcement of rules, compounded by structural problems. Therefore, we suggest a genuine rule-based paradigm for a stable future of the Economic and Monetary Union.
The persistently high levels of debt in some member states of the European Economic and Monetary Union continue to raise fears of sovereign insolvencies. In order to cope with the problems that have arisen, but also to prevent such a situation from occurring in the first place, the author considers a sovereign insolvency regime, with bail-out by the other member states only in emergencies, to be necessary. He proposes a resolution mechanism for over-indebted euro countries based on a 2016 concept of the German Council of Economic Experts.
This paper presents an approach for label-free brain tumor tissue typing. For this application, our dual modality microspectroscopy system combines inelastic Raman scattering spectroscopy and Mie elastic light scattering spectroscopy. The system enables marker-free biomedical diagnostics and records both the chemical and morphologic changes of tissues on a cellular and subcellular level. The system setup is described and the suitability for measuring morphologic features is investigated.
To evaluate the quality of sleep, it is important to determine how much time was spent in each sleep stage during the night. The gold standard in this domain is an overnight polysomnography (PSG), but the recording of the necessary electrophysiological signals is extensive and complex, and the unfamiliar environment of the sleep laboratory might lead to distorted results. In this paper, a sleep stage detection algorithm is proposed that uses only the heart rate signal, derived from the electrocardiogram (ECG), as a discriminator. This would make it possible for sleep analysis to be performed at home, saving a lot of effort and money. From the heart rate, three parameters were calculated using the fast Fourier transformation (FFT) in order to distinguish between the different sleep stages. ECG data along with a hypnogram scored by professionals were taken from the PhysioNet database, making it easy to compare the results. With an agreement rate of 41.3%, this approach is a good foundation for future research.
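As an illustration of FFT-based features on a heart-rate series, the sketch below computes low- and high-frequency power and their ratio. The band limits follow common heart-rate-variability conventions and are assumptions, not the three parameters used in the paper.

```python
# Minimal sketch (illustrative, not the paper's exact parameters): spectral
# features of a heart-rate series via FFT, of the kind that could be used to
# separate sleep stages. Band limits follow common HRV conventions (LF/HF).
import numpy as np

def hr_spectral_features(heart_rate_bpm, fs=1.0):
    """Return low-frequency power, high-frequency power and their ratio.

    heart_rate_bpm : evenly resampled heart-rate series (one value per 1/fs s)
    fs             : sampling rate of the resampled series in Hz
    """
    hr = np.asarray(heart_rate_bpm, dtype=float)
    hr = hr - hr.mean()                                  # remove DC component
    power = np.abs(np.fft.rfft(hr)) ** 2
    freqs = np.fft.rfftfreq(hr.size, d=1.0 / fs)
    lf = power[(freqs >= 0.04) & (freqs < 0.15)].sum()   # low-frequency band
    hf = power[(freqs >= 0.15) & (freqs < 0.40)].sum()   # high-frequency band
    return lf, hf, lf / hf if hf > 0 else np.inf

# Example with a synthetic 5-minute series sampled at 1 Hz.
rng = np.random.default_rng(2)
t = np.arange(300)
hr = 60 + 2 * np.sin(2 * np.pi * 0.1 * t) + rng.normal(scale=0.5, size=t.size)
print(hr_spectral_features(hr, fs=1.0))
```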
In this paper, a method for the generation of gSPM with ontology-based generalization is presented. The resulting gSPM is modeled with BPMN/BPMNsix in an efficient way and can be executed with BPMN workflow engines. In the next step, the implementation of resource concepts, anatomical structures, and transition probabilities for workflow execution will be realized.
High-quality decorative laminate panels typically consist of two major types of components: the surface layers, comprising décor and overlay papers impregnated with melamine-based resins, and the core, which is made of stacks of kraft papers impregnated with phenolic (PF) resin. The PF-impregnated layers impart superior hydrolytic stability, mechanical strength and fire resistance to the composite. The manufacturing involves a complex interplay between resin, paper and impregnation/drying processes. Changes in the input variables cause significant alterations in the process characteristics, and adaptations of the materials used and of the specific process conditions may, in turn, be required. This review summarizes the main variables influencing both the processability and the technological properties of phenolic resin impregnated papers and the laminates produced therefrom. It aims at presenting the main influences of the components involved (resin and paper), how these may be controlled during the respective process steps (resin preparation and paper production), how they influence the impregnation and lamination conditions, how they affect specific aspects of paper and laminate performance, and how they interact with each other (synergies).
This article provides a general overview of the most promising candidates among bio-based materials and deals with the most important issues concerning their incorporation into PF resins. Due to their abundance on Earth, much knowledge of lignin-based materials has already been gained, and uses of lignin in PF resins have been studied for many decades. Other natural polyphenols that are less frequently considered for impregnation are covered as well, as they also possess some potential for PF substitution.
Characterisation of porous knitted titanium for replacement of intervertebral disc nucleus pulposus
(2017)
Effective restoration of the degenerated human intervertebral disc is challenged by numerous limitations of the currently available spinal fusion and arthroplasty treatment strategies. Consequently, the use of artificial biomaterial implants is gaining attention as a potential therapeutic strategy. Our study is aimed at investigating and characterizing a novel knitted titanium (Ti6Al4V) implant for the replacement of the nucleus pulposus to treat early stages of chronic intervertebral disc degeneration. A specific knitted geometry of the scaffold with a porosity of 67.67 ± 0.824% was used to overcome tissue integration failures. Furthermore, to improve the wear resistance without impairing the original mechanical strength, an electro-polishing step was employed. The electro-polishing treatment reduced the surface roughness from 15.22 ± 3.28 to 4.35 ± 0.87 μm without affecting the wettability, which remained at 81.03 ± 8.5°. Subsequently, the cellular responses of human mesenchymal stem cells (SCP1 cell line) and human primary chondrocytes were investigated, which showed positive responses in terms of adherence and viability. Surface wettability was further enhanced to a superhydrophilic state by oxygen plasma treatment, which eventually caused a substantial increase in the proliferation of SCP1 cells and primary chondrocytes. Our study implies that, owing to the scaffold's physicochemical and biocompatible properties, it could improve the clinical performance of nucleus pulposus replacement.
Anyone who wants to bring about change with arguments must win over their interlocutors for their proposed solutions. Whether this succeeds is no longer a question of rhetorical talent and charisma: techniques of storylining and storytelling make it possible for anyone to professionalize business argumentation and lines of reasoning.
Kennzahlen zur Liquidität
(2016)
A well-functioning logistics operation is an important competitive factor. To determine its contribution to corporate success, however, its costs must be measurable, and this is often where things fall short. Yet there are approaches for separating logistics costs from other costs; companies only need to observe concrete rules for their use.
Debt capital is currently "cheap", yet companies' investment activity remains restrained. Those who want to invest must above all consider the level of the cost of capital, which, in contrast to interest rates, has fallen only slightly. Controllers need suitable approaches to incorporate future costs of capital into investment decisions.
New drugs serving unmet medical needs are one of the key value drivers of research-based pharmaceutical companies. The efficiency of research and development (R&D), defined as the successful approval and launch of new medicines (output) relative to the monetary investments required for R&D (input), has been declining for decades. We aimed to identify, analyze and describe the factors that impact R&D efficiency. Based on publicly available information, we reviewed the R&D models of major research-based pharmaceutical companies and analyzed the key challenges and success factors of a sustainable R&D output. We calculated that the R&D efficiencies of major research-based pharmaceutical companies were in the range of USD 3.2–32.3 billion (2006–2014). As these numbers challenge the model of an innovation-driven pharmaceutical industry, we analyzed the concepts that companies are following to increase their R&D efficiencies: (A) activities to reduce portfolio and project risk, (B) activities to reduce R&D costs, and (C) activities to increase the innovation potential. While category A comprises measures such as portfolio management and licensing, measures grouped in category B are outsourcing and risk-sharing in late-stage development. Companies have taken diverse steps to increase their innovation potential, and open innovation, exemplified by open source, innovation centers, or crowdsourcing, plays a key role in doing so. In conclusion, research-based pharmaceutical companies need to be aware of the key factors which impact the rate of innovation, R&D cost and probability of success. Depending on their company strategy and their R&D set-up, they can opt for one of the following open-innovator roles: knowledge creator, knowledge integrator or knowledge leverager.
Current techniques for chromosome analysis need to be improved for the rapid, economical identification of complex chromosomal defects by sensitive and selective visualisation. In this paper, we present a straightforward method for characterising unstained human metaphase chromosomes. Backscatter imaging in a dark-field setup combined with visible and short near-infrared spectroscopy is used to monitor morphological differences in the distribution of the chromosomal fine structure in human metaphase chromosomes. The reasons for the scattering centres in the fine structure are explained, and changes in the scattering centres during preparation of the metaphases are discussed. FDTD simulations are presented to substantiate the experimental findings. We show that local scattering features, consisting of underlying spectral modulations of higher frequencies associated with a high variety of densely packed chromatin, can be represented by their scatter profiles even on a sub-microscopic level. The result is independent of the chromosome preparation and structure size. This analytical method constitutes a rapid, cost-effective and label-free cytogenetic technique which can be used in a standard light microscope.
The analysis of exhaled metabolites has become a promising field of research in recent decades. Several volatile organic compounds reflecting metabolic disturbance and nutrition status have already been reported. These are particularly important for long-term measurements, as needed in medical research for the detection of disease progression and therapeutic efficacy. In this context, it has become urgent to investigate the effect of fasting and glucose treatment on breath analysis. In the present study, we used a model of ventilated rats that fasted for 12 h prior to the experiment. Ten rats per group were randomly assigned to a continuous intravenous infusion without glucose or an infusion including 25 mg glucose per 100 g per hour during an observation period of 12 h. Exhaled gas was analysed using multicapillary column ion-mobility spectrometry. Analytes were identified with the BS-MCC/IMS database (version 1209; B & S Analytik, Dortmund, Germany). Glucose infusion led to a significant increase in blood glucose levels (p < 0.05 at 4 h and thereafter) and cardiac output (p < 0.05 at 4 h and thereafter). During the observation period, 39 peaks were found in total. There were significant differences between groups in the concentration of ten volatile organic compounds: p < 0.001 at 4 h and thereafter for isoprene, cyclohexanone, acetone, p-cymol, 2-hexanone, phenylacetylene, and one unknown compound, and p < 0.001 at 8 h and thereafter for 1-pentanol, 1-propanol, and 2-heptanol. Our results indicate that for long-term measurements, fasting and the withholding of glucose could contribute to changes of volatile metabolites in exhaled air.
Engineering of large vascularized adipose tissue constructs is still a challenge for the treatment of extensive high-graded burns or the replacement of tissue after tumor removal. Communication between mature adipocytes and endothelial cells is important for homeostasis and the maintenance of adipose tissue mass but, to date, is mainly neglected in tissue engineering strategies. Thus, new coculture strategies are needed to integrate adipocytes and endothelial cells successfully into a functional construct. This review focuses on the cross-talk of mature adipocytes and endothelial cells and considers their influence on fatty acid metabolism and vascular tone. In addition, the properties and challenges with regard to these two cell types for vascularized tissue engineering are highlighted.