When forecasting sales figures, not only the sales history but also the future price of a product influences the sales quantity. At first sight, multivariate time series seem to be the appropriate model for this task. Nonetheless, in real life history is not always repeatable; in the case of sales history there is only one price for a product at a given time, which complicates the design of a multivariate time series model. However, for some seasonal or perishable products the price is rather a function of the expiration date than of the sales history. This additional information can help to design a more accurate and causal time series model. The proposed solution uses a univariate time series model but takes the price of a product as a parameter that systematically influences the prediction based on a calculated periodicity. The price influence is computed from historical sales data using correlation analysis and adjustable price ranges to identify products with comparable history. The periodicity is calculated with a novel approach based on data folding and Pearson correlation. Compared to other techniques, this approach is easy to compute and allows presetting the price parameter for predictions and simulations. Tests with data from the Data Mining Cup 2012 as well as artificial data demonstrate better results than established sophisticated time series methods.
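The folding-based periodicity estimate lends itself to a compact illustration. The sketch below folds the series at each candidate period and scores the overlap with the Pearson correlation, keeping the period with the highest score. This is one plausible reading of the data-folding idea, not the authors' implementation; the function names are ours.

```python
import math

def pearson(x, y):
    """Plain Pearson correlation coefficient of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy) if sx and sy else 0.0

def estimate_period(series, min_p=2, max_p=None):
    """Fold the series at each candidate period p (i.e. overlay it with a
    copy of itself shifted by p) and correlate the overlapping parts; the
    p with the highest Pearson correlation is taken as the periodicity."""
    max_p = max_p or len(series) // 2
    best_p, best_r = None, -2.0
    for p in range(min_p, max_p + 1):
        r = pearson(series[:-p], series[p:])
        if r > best_r:
            best_p, best_r = p, r
    return best_p, best_r
```

On a series with a repeating weekly pattern, `estimate_period` recovers the period 7; the computation is a single pass of O(n) correlations, which matches the paper's claim of being easy to compute.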
Since November 2011, the standard DIN 4709 has stipulated performance tests for Micro-CHP units in Germany. In contrast to steady-state measurements of the CHP unit itself, the test according to DIN 4709 includes the thermal storage tank as well as the internal control unit, and it is based on a 24 h test cycle following a specified thermal load profile. Hence, heat losses from the storage tank are taken into account, as well as transient losses of the CHP unit. In addition, the control strategy for loading and unloading the storage tank affects the test results.
The DIN 4709 test cycle has been applied at the test stand for Micro-CHP units at Reutlingen University, and results for the Micro-CHP unit WhisperGen and the EC Power units XRGI 15® and XRGI 20® are available. During the analysis, a method has been developed to evaluate the results in case the test cycle does not end in a time slot between 24 and 24.5 h after the start, as demanded by DIN 4709. Since this method has so far been successfully applied to tests of various CHP units of different size and technology, it is suggested to incorporate it into DIN 4709 during the next revision of the standard.
The performance numbers obtained reveal the differences between efficiencies measured at steady state on the one hand and following the DIN 4709 test cycle on the other. While the deviations in electrical efficiencies are small, thermal efficiencies according to DIN 4709 fall below steady-state data by 3–6 percentage points. This is attributed to transient thermal losses and heat losses from the storage tank, which are not captured when the CHP unit alone is tested separately at steady state.
Enhancing the undergraduate educational experience : development of a micro-gas turbine laboratory
(2014)
A Capstone C30 MicroTurbine has been installed, instrumented, and utilized in a junior-level laboratory course at Valparaiso University. The C30 MicroTurbine experiment enables Valparaiso University to educate students interested in power generation and turbine technology. The first goal of this experiment is for students to explore a gas turbine generator and witness the discrepancies between idealized models and real thermodynamic systems. Secondly, students measure and analyze data to determine where losses occur in a real gas turbine. The third educational goal is for students to recognize the true costs associated with natural gas use, i.e., the hidden costs of transporting the gas to the consumer. Overall, the gas turbine experiment has garnered positive feedback from students. The twenty-six students who performed the lab in Spring 2014 rated the quality and usefulness of the gas turbine experiment as 4.28 and 4.19, respectively, on a 1–5 Likert scale, where 1 is low and 5 is high.
Bionic optimisation is one of the most popular and efficient applications of bionic engineering. As many different approaches and terms are in use, we propose a structuring of the strategies and compare the efficiency of the different methods. The methods most often proposed in the literature may be classified into evolutionary, particle swarm and artificial neural net optimisation. Some related classes have to be mentioned, such as non-sexual fern optimisation and response surfaces, which are close to the neural nets. To come up with a measure of efficiency that allows taking some of the published results into account, the technical optimisation problems were derived from those given in the literature. They deal with elastic studies of frame structures, as the computing time for each individual is very short. General proposals as to which approach to use cannot be given. It seems to be a good idea to learn about the applicability of the different methods on different problem classes and then carry out the optimisation according to these experiences. Furthermore, in many cases there is some evidence that switching from one method to another improves the performance. Finally, identifying the exact position of the optimum by gradient methods is often more efficient than long random walks around local maxima.
Globally acting rating agencies were held responsible for the latest financial market crisis. False estimations in ratings, non-transparent methods, processes and systems, as well as a lack of qualification of rating analysts have been points of criticism. This article outlines the extent of the tightened regulation of the agencies in the USA and in Europe. All relevant institutions and norms, as well as the international and national standards from the German point of view, are presented and exhaustively analyzed. In doing so it is illustrated that, in this oligopolistic market, one can definitely speak of protection with regard to the admission and accreditation of the agencies.
Many companies practice performance management as a heterogeneous, historically grown mix of numerous separate decisions, instruments, processes and systems rather than as a strategically and systematically planned management system. Given the inefficiency of this style of performance management, a holistic and integrated approach is a key factor. Performance management must be able to meet central objectives and requirements and lay the groundwork for long-term corporate success. This article presents a central approach to the conception of holistic and long-term performance management. Five equally important sub-disciplines are illustrated and, through their characteristics and combination, demonstrate the complexity of the composition of performance management. The objective of this article is to display and communicate the performance management issue and its context through an easily comprehensible system without following a general recipe.
Scanning Near-Field Optical Microscopy (SNOM) has developed during recent decades into a valuable tool to optically image the surface topology of materials with super-resolution. With aperture-based SNOM systems, the resolution scales with the size of the aperture, which, however, also limits the sensitivity of the detection and thus the applicability of spectroscopic techniques like Raman SNOM. In this paper we report the extension of solid immersion lens (SIL) technology to Raman SNOM. The hemispherical SIL with a tip on the bottom acts as an apertureless dielectric nanoprobe for simultaneously acquiring topographic and spectroscopic information. The SIL is placed between the sample and the microscope objective of a confocal Raman microscope. The lateral resolution in the Raman mode is validated with a cross section of a semiconductor layer system and, at approximately 180 nm, is beyond the classical diffraction limit of Abbe.
In visual adaptive tracking, the tracker adapts to the target, the background, and the conditions of the image sequence. Each update introduces some error, so the tracker might drift away from the target over time. To increase robustness against this drifting problem, we present three ideas on top of a particle filter framework: optical-flow-based motion estimation, a learning strategy that prevents bad updates while staying adaptive, and a sliding window detector for failure detection and for finding the best training examples. We experimentally evaluate the ideas using the BoBoT dataset. The code of our tracker is available online.
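The sliding window detector is only named, not specified, in the abstract. As a generic illustration of the idea, the sketch below scans a frame for the window position that best matches a stored appearance template, using the sum of absolute differences as a stand-in matching score; the actual tracker presumably scores windows with its learned model instead.

```python
def sad(a, b):
    """Sum of absolute differences between two equal-size 2D patches."""
    return sum(abs(x - y) for row_a, row_b in zip(a, b)
               for x, y in zip(row_a, row_b))

def sliding_window_detect(frame, template):
    """Slide a window the size of the template over the frame and return
    the top-left position with the lowest SAD score (best match)."""
    th, tw = len(template), len(template[0])
    best_score, best_pos = float("inf"), None
    for i in range(len(frame) - th + 1):
        for j in range(len(frame[0]) - tw + 1):
            patch = [row[j:j + tw] for row in frame[i:i + th]]
            score = sad(patch, template)
            if score < best_score:
                best_score, best_pos = score, (i, j)
    return best_pos, best_score
```

A failure test then amounts to comparing the best score against a threshold: if even the best window matches poorly, the tracker is likely lost and should not update its model.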
We investigated the excitation modes of the light-harvesting protein phycocyanin (PC) from Thermosynechococcus vulcanus in the crystalline state using UV and near-infrared Raman spectroscopy. The spectra revealed the absence of a hydrogen out-of-plane wagging (HOOP) mode in the PC trimer, which suggests that the HOOP mode is activated in the intact PC rod, while it is not active in the PC trimer. Furthermore, in the PC trimer an intense mode at 984 cm−1 is assigned to the C–C stretching vibration, while the mode at 454 cm−1 is likely due to ethyl group torsion. In contrast, in the similar chromophore phytochromobilin the C5,10,15-D wag mode at 622 cm−1 does not come from a downshift of the HOOP. Additionally, the absence of modes between 1200 and 1300 cm−1 rules out functional monomerization. A correlation between phycocyanobilin (PCB) and phycoerythrobilin (PEB) suggests that the PCB cofactors of the PC trimer appear in a conformation similar to that of PEB. The conformation of the PC rod is consistent with that of the allophycocyanin (APC) trimer, and thus excitonic flow is facilitated between these two independent light-harvesting compounds. This excitonic flow from the PC rod to APC appears to be modulated by the vibration channels during HOOP wagging, C=C stretching, and the N–H in-plane rocking vibration.
Knowledge transfer is very important to our knowledge-based society, and many approaches have been proposed to describe it. However, these approaches take a rather abstract view of knowledge transfer, which makes implementation difficult. In order to address this issue, we introduce a layered model for knowledge transfer that structures the individual steps of knowledge transfer in more detail. This paper gives a description of the process and also an example of the application of the layered model. The example is located in the area of business process modelling. Business processes contain the important knowledge describing the procedures a company uses to produce products and services. Knowledge transfer is fundamental to the modelling and usage of business processes, which makes it an interesting use case for the layered model for knowledge transfer.
DMOS transistors are often subject to high power dissipation and thus substantial self-heating. This limits their safe operating area because very high device temperatures can lead to thermal runaway and subsequent destruction. Because the peak temperature usually occurs only in a small region in the device, it is possible to redistribute part of the dissipated power from the hot region to the cooler device areas. In this way, the peak temperature is reduced, whereas the total power dissipation is still the same. Assuming that a certain temperature must not be exceeded for safe operation, the improved device is now capable of withstanding higher amounts of energy with an unchanged device area. This paper presents two simple methods to redistribute the power dissipation density and thus lower the peak device temperature. The presented methods only require layout changes. They can easily be applied to modern power technologies without the need of process modifications. Both methods are implemented in test structures and investigated by simulations and measurements.
In March of this year, the tax trial against Uli Hoeneß took place, ending in the conviction of the then president and chairman of the supervisory board of FC Bayern München. Before, during and after the eagerly awaited trial, the Deutsches Institut für Sportmarketing (DISM), in cooperation with the field service provider Norstat Germany, asked more than 7,000 respondents in Germany various questions on sports marketing in connection with the tax affair as part of an online survey. This research report discusses the central results of that survey.
DMOS transistors in integrated power technologies are often subject to significant self-heating and thus high temperatures, which can lead to device failure and reduced lifetime. Hence, it must be ensured that the device temperature does not rise too much. For this, the influence of the on-chip metallization must be taken into account because of the good thermal conductivity and significant thermal capacitance of the metal layers on top of the active DMOS area. In this paper, test structures with different metal layer and via configurations are presented that can be used to determine the influence of the on-chip metallization on the temperature caused by self-heating. It is shown how accurate results can be obtained to determine even the influence of small changes in the metallization. The measurement results are discussed and explained, showing how on-chip metallization helps to lower the device temperature. This is further supported by numerical simulations. The obtained insights are valuable for technology optimization, but are also useful for the calibration of temperature simulators.
The recent years, and especially the Internet, have changed the ways in which data is stored. It is now common to store data in the form of transactions, together with their creation time-stamps. These transactions can often be attributed to logical units, e.g., all transactions that belong to one customer. These groups, which we refer to as data sequences, have a more complex structure than tuple-based data. This makes it more difficult to find discriminatory patterns for classification purposes. However, the complex structure potentially enables us to track behaviour and its change over the course of time. This is quite interesting, especially in the e-commerce area, in which the classification of a sequence of customer actions is still a challenging task for data miners. However, before standard algorithms such as decision trees, neural nets, Naive Bayes or Bayesian belief networks can be applied to sequential data, preparations are required in order to capture the information stored within the sequences. Therefore, this work presents a systematic approach on how to reveal sequence patterns in data and how to construct powerful features out of the primitive sequence attributes. This is achieved by sequence aggregation and the incorporation of the time dimension into the feature construction step. The proposed algorithm is described in detail and applied to a real-life data set, which demonstrates its ability to boost the classification performance of well-known data mining algorithms for binary classification tasks.
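To make the idea of sequence aggregation concrete, the sketch below collapses one customer's time-stamped transaction sequence into a fixed-length feature vector that standard classifiers can consume. The chosen features (counts, amounts, time span, inter-event gaps) are illustrative assumptions, not the paper's actual feature set.

```python
from statistics import mean

def sequence_features(transactions):
    """Aggregate one data sequence into a fixed-length feature vector.
    `transactions` is a non-empty list of (timestamp, amount) tuples
    with timestamps in ascending order. The feature names are
    illustrative, not taken from the paper."""
    ts = [t for t, _ in transactions]
    amounts = [a for _, a in transactions]
    gaps = [b - a for a, b in zip(ts, ts[1:])]   # inter-event times
    return {
        "n_events": len(transactions),
        "total_amount": sum(amounts),
        "mean_amount": mean(amounts),
        "span": ts[-1] - ts[0],                  # time dimension
        "mean_gap": mean(gaps) if gaps else 0.0,
    }
```

Each customer sequence thus becomes one tuple-shaped row, so decision trees, Naive Bayes or neural nets can be trained on the output without any sequence-aware machinery.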
Vehicles have so far been improved in terms of energy efficiency and safety mainly by optimising the engine and the power train. However, there are further opportunities to increase energy efficiency and safety by adapting the individual driving behaviour to the given driving situation. In this paper, an improved rule match algorithm is introduced, which is used in the expert system of a human-centred driving system. The goal of the driving system is to optimise the driving behaviour in terms of energy efficiency and safety by giving recommendations to the driver. The improved rule match algorithm checks the incoming information against the driving rules to recognise any violation of a driving rule. The required information is obtained by monitoring the driver, the current driving situation as well as the car, using in-vehicle sensors and serial-bus systems. On the basis of the detected rule violations, the expert system creates individual recommendations in terms of energy efficiency and safety, which allow eliminating bad driving habits while considering the driver's needs.
Online credit card fraud presents a significant challenge in the field of eCommerce. In 2012 alone, the total loss due to credit card fraud in the US amounted to $54 billion. Online games merchants in particular have difficulties applying standard fraud detection algorithms to achieve timely and accurate detection. This paper describes the special constraints of this domain and highlights the reasons why conventional algorithms are not quite effective in dealing with this problem. Our suggested solution originates from the field of feature construction joined with the field of temporal sequence data mining. We present feature construction techniques which are able to create discriminative features based on a sequence of transactions and to incorporate time into the classification process. In addition, a framework is presented that allows for an automated and adaptive change of features in case the underlying pattern is changing.
Positively charged metallic oxides prevent blood coagulation whereas negatively charged metallic oxides are thrombogenic. This study was performed to examine whether this effect extends to metallic oxide nanoparticles. Oscillation shear rheometry was used to study the effect of zinc oxide and silicon dioxide nanoparticles on thrombus formation in human whole blood. Our data show that oscillation shear rheometry is a sensitive and robust technique to analyze thrombogenicity induced by nanoparticles. Blood without previous contact with nanoparticles had a clotting time (CT) of 16.7 ± 1.0 min reaching a maximal clot strength (CS) of 16 ± 14 Pa (G') after 30 min. ZnO nanoparticles (diameter 70 nm, +37 mV zeta-potential) at a concentration of 1 mg/mL prolonged CT to 20.8 ± 3.6 min and provoked a weak clot (CS 1.5 ± 1.0 Pa). However, at a lower concentration of 100 µg/mL the ZnO particles dramatically reduced CT to 6.0 ± 0.5 min and increased CS to 171 ± 63 Pa. This procoagulant effect decreased at lower concentrations reaching the detection limit at 10 ng/mL. SiO2 nanoparticles (diameter 232 nm, −28 mV zeta-potential) at high concentrations (1 mg/mL) reduced CT (2.1 ± 0.2 min) and stimulated CS (249 ± 59 Pa). Similar to ZnO particles, this procoagulant effect reached a detection limit at 10 ng/mL. Nanoparticles in high concentrations reproduce the surface charge effects on blood coagulation previously observed with large particles or solid metal oxides. However, nanoparticles with different surface charges equally well stimulate coagulation at lower concentrations. This stimulation may be an effect which is not directly related to the surface charge.
The interaction between lipid bilayers in water has been intensively studied over the last decades. Osmotic stress was applied to evaluate the forces between two approaching lipid bilayers in aqueous solution. The force–distance relation between lipid mono- or bilayers deposited on mica sheets was also measured using a surface force apparatus (SFA). Lipid-stabilised foam films offer another possibility to study the interactions between lipid monolayers. These films can be prepared comparatively easily and with very good reproducibility. Foam films usually consist of two adsorbed surfactant monolayers separated by a layer of the aqueous solution from which the film is created. Their thickness can be conveniently measured using microinterferometric techniques. Studies with foam films deliver valuable information on the interactions between lipid membranes, and especially on their stability and permeability. Foam films presenting inverse black lipid membranes (BLM) supply information about the properties of lipid self-organisation in bilayers. The present paper summarises results on microscopic lipid-stabilised foam films obtained by measuring their thickness and contact angle. Most of the presented results concern foam films prepared from dispersions of the zwitterionic lipid 1,2-dimyristoyl-sn-glycero-3-phosphorylcholine (DMPC) and some of its mixtures with the anionic lipid 1,2-dimyristoyl-sn-glycero-3-[phospho-rac-(1-glycerol)] (DMPG).
The strength of the long-range and short-range forces between the lipid layers is discussed. The van der Waals attractive force is calculated. The electrostatic repulsive force is estimated from experiments at different electrolyte concentrations (NaCl, CaCl2) or by modification of the electrostatic double-layer surface potential through incorporating charged lipids in the lipid monolayers. The short-range interactions are studied and modified by using small carbohydrates (fructose and sucrose), ethanol (EtOH) or dimethylsulfoxide (DMSO). Some results are compared with the structure of lipid monolayers deposited at the liquid/air interface (monolayers spread in a Langmuir trough), which are one of the most studied biomembrane model systems. The comparison between the film thickness and the free energy of film formation is used to estimate the contribution of the different components of the disjoining pressure to the total interaction in the film and their dependence on the composition of the film-forming solution.
An increasing number of studies focus on how adherent cells respond, in vitro, to different properties of a material. Typical properties are the surface chemistry, topographical cues (at the nano- and micro-scale) of the surface, and the substrate stiffness. Cell response studies are important for designing new biomaterials with applications in cell culture technologies, regenerative medicine, or medical implants. However, only very few studies take the cell age, or respectively the donor age, into account. In this work, we tested two types of human vascular cells (smooth muscle and endothelial cells) from old and young donors on (a) micro-structured surfaces made of poly(dimethylsiloxane) or on (b) flat polyacrylamide hydrogels with varying stiffnesses. These experiments reveal age-dependent and cell type-dependent differences in the cell response to topography and stiffness, and may establish the basis for further studies focusing on cell age-dependent responses.
It is well established that the mechanical environment influences cell functions in health and disease. Here, we address how the mechanical environment influences tumor growth, in particular, the shape of solid tumors. In an in vitro tumor model, which isolates mechanical interactions between cancer tumor cells and a hydrogel, we find that tumors grow as ellipsoids, resembling the same, oft-reported observation of in vivo tumors. Specifically, an oblate ellipsoidal tumor shape robustly occurs when the tumors grow in hydrogels that are stiffer than the tumors, but when they grow in more compliant hydrogels they remain closer to spherical in shape. Using large-scale, nonlinear elasticity computations we show that the oblate ellipsoidal shape minimizes the elastic free energy of the tumor–hydrogel system. Having eliminated a number of other candidate explanations, we hypothesize that minimization of the elastic free energy is the reason for the predominance of the experimentally observed ellipsoidal shape. This result may hold significance for explaining the shape progression.
The Informatics Inside conference is taking place for the third time this year. With its theme "Grenzen überwinden – Virtualität erweitert Realität" ("Overcoming boundaries – virtuality extends reality"), the event addresses a current focus area that attracts many interested parties from industry, science and research. The conference has developed from an event for master's students of the Medien- und Kommunikationsinformatik programme into an open student conference. To further increase its quality, a two-stage review procedure for the contributions to these proceedings was introduced in parallel.
Poly(dimethylsiloxane) can be covalently coated with ultrathin NCO-sP(EO-stat-PO) hydrogel layers which permit covalent binding of cell-adhesive moieties while minimizing unspecific cell adhesion on non-functionalized areas. We applied long-term uniaxial cyclic tensile strain (CTS) and demonstrated (a) the preservation of the protein- and cell-repellent properties of the NCO-sP(EO-stat-PO) coating and (b) the stability and bioactivity of a covalently bound fibronectin (FN) line pattern. We studied the adhesion of human dermal fibroblasts (HDFs) on non-modified NCO-sP(EO-stat-PO) coatings and on the FN pattern. HDFs adhered to FN and oriented their cell bodies and actin fibers along the FN lines independently of the direction of CTS. This mechanical long-term stability of the bioactive, patterned surface allows unraveling biomechanical stimuli for cellular signaling and behavior in order to understand physiological and pathological cell phenomena. Additionally, it allows for applications in wound healing assays, tissue engineering, and implant development demanding spatial control over specific cell adhesion.
In vivo, cells encounter different physical and chemical signals in the extracellular matrix (ECM) which regulate their behavior. Examples of these signals are micro- and nanometer-sized features, the rigidity, and the chemical composition of the ECM. The study of cell responses to such cues is important to understand complex cell functions and some diseases, and is the basis for the development of new biomaterials for applications in medical implants or regenerative medicine. Therefore, the development of new methods for surface modification with controlled physical and chemical features is crucial. In this work, we report a new combination of block copolymer micelle nanolithography (BCML) and soft micro-lithography for the production of polyethylene glycol (PEG) hydrogels with a micro-grooved surface decorated with precisely hexagonally arranged gold nanoparticles (Au NPs). The Au NPs are used for binding adhesive ligands at a well-defined density. First tests were performed by culturing human fibroblasts on the gels. Adhesion and alignment of the cells along the parallel grooves of the surface were investigated. The substrates could provide a new platform for studying cell contact guidance by micro-structures, and may enable a more precise control of cell behavior by nanometrically controlled surface functionalization.
Powder coating of engineered wood panels such as medium-density fibreboards (MDF) is gaining industrial interest due to the ecological and economic advantages of powder coating technology. For transferring powder coating technology to temperature-sensitive substrates like MDF, a thorough understanding of the melting, flowing and curing behaviour of the low-bake resins used is required. In the present study, thermo-analysis in combination with iso-conversional kinetic data analysis as well as rheometry is applied to characterise the properties of an epoxy-based powder coating. Neat resin and cured powder coating films are examined in order to define an ideal production window within which the resin is preferably applied and processed, yielding satisfactory surface performance on the one hand without exposing the carrier MDF to too high a temperature load on the other, which would deteriorate the panel's mechanical strength. In order to produce powder-coated films of high surface gloss – a feature that has not yet been successfully realized on MDF with powder coatings – a new curing technology, in-mould surface finishing, has been applied.
Hardboards (HBs) (wet-process high-density fibreboards) were made in an industrial trial using a binder system consisting of cationic mimosa tannin and laccase or just cationic tannin without any thermosetting adhesive. The boards displayed superior mechanical strength compared to reference boards made with phenol–formaldehyde, easily exceeding the European standards for general-purpose HBs. The thickness swell of most of the boards was slightly greater than the standards would allow, so some optimisation is required in this area. The improved board properties appear to be mainly associated with ionic interactions involving quaternary amino groups in cationic tannin and negatively charged wood fibres rather than to cross-linking of fibres via laccase-assisted formation and coupling of radicals in tannin and fibre lignin.
The fiber deformations of once-dried, bleached and never-dried, unbleached kraft pulps were studied with respect to their behavior in high- and low-consistency refining. The pulps were stained with Congo red to experimentally highlight areas where the arrangement of the fibrils was altered by refining, such as dislocated zones or slip planes. The stained fibers were analyzed with a conventional Metso FiberLab but also with a novel prototype measurement device utilizing a color imaging setup. The local intensity of the stain in the fiber was expressed as a degree of overall damage (overall fiber damage index, OFDI). The rewetted zero-span tensile index (RWZSTI) was used to verify the OFDI with respect to the pulp strength. High-consistency refining resulted in a clear increase in the number of kinks, which negatively influenced the pulp strength. The OFDI, which was used to detect the intensity of local fiber defects, responded accordingly: a higher OFDI resulted in a lower pulp strength. Low-consistency refining removed a significant number of kinks and resulted in an increase in fiber swelling. A slight increase in fibrillation and a significant increase in flake-like fines were also observed. The OFDI, however, was not reduced in low-consistency refining, as would be expected from the removal of less severe dislocations. One reason proposed here is that low-consistency refining created new fiber pores that allowed the dye to penetrate into the fiber wall similarly to how it does in the zones of the dislocations.
Melamine formaldehyde (MF) resins are widely used for the gluing and surface coating of wood-based consumer products in the interior design of living environments. MF resins are especially relevant in decorative laminate applications because of their good performance-to-price ratio. In their industrial processing, an important intermediate state is the liquid MF prepolymer that is used for decorative paper impregnation. Here, the drying of impregnated papers is investigated with respect to premature curing. A new method to quantify water release upon drying, which allows estimation of the degree of undesired precuring, is described. Since curing proceeds via polycondensation, crosslinking brings about the release of water molecules. By thermogravimetric analysis (TGA), drying was studied in terms of water release due to physical drying (elimination of “dilution water”) and chemical crosslinking of the prepolymer to a three-dimensional MF network (elimination of chemically liberated water). The results obtained by TGA/IR spectroscopic analysis of the liberated volatiles show that the emission of water from b-stage MF can be clearly analytically separated into a physical (evaporation of dilution water) and a chemical (liberation via condensation) sequence. TGA experiments were correlated with curing experiments performed with differential scanning calorimetry (DSC) to estimate the residual crosslinking capacities of the impregnated papers. The drying conditions used during the preparation of impregnated decorative papers seemed to significantly affect their remaining reactivity only when harsh drying conditions were used. Upon heat exposure for a prolonged time, precuring of the oligomer units results in a shift of the temperature maxima in TGA.
Ethylene terephthalate and ethylene naphthalate oligomers of defined degree of polymerization were synthesized via chemical recycling of the parent polymers. The oligomers were used as defined building blocks for the preparation of novel block-co-polyesters having tailored sequence compositions. The sequence lengths were systematically varied using Design of Experiments. The dispersive surface energy and the specific desorption energy of the co-polymers were determined by inverse gas chromatography. The study shows that polyethylene terephthalate-polyethylene naphthalate (PET-PEN) block-co-polyesters of defined sequence lengths can be prepared. Furthermore, the specific and dispersive surface energies of the obtained block-co-polyesters showed a linear dependence on the oligomer molecular weight and it was possible to regulate and control their interfacial properties. In contrast, with the corresponding random-block-co-polyesters no such dependence was found. The synthesized block-co-polyesters could be used as polymeric modifying agents for stabilizing PET-PEN polymer blends.
Applied mathematical theory for monetary-fiscal interaction in a supranational monetary union
(2014)
I utilize a differentiable dynamical system à la Lotka-Volterra and explain monetary and fiscal interaction in a supranational monetary union. The paper demonstrates an applied mathematical approach that provides useful insights into the interaction mechanisms in theoretical economics in general and a monetary union in particular. I find that a common central bank is necessary but not sufficient to tackle the new interaction problems in a supranational monetary union, such as the free-riding behaviour of fiscal policies. Moreover, I show that supranational institutions, rules or laws are essential to mitigate violations by decentralized fiscal policies.
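The paper's monetary-fiscal model is not reproduced in the abstract; as a purely illustrative sketch of the kind of Lotka-Volterra dynamics it invokes, the following integrates the classic two-species system with forward Euler. All parameter values, and the economic reading of the two state variables, are hypothetical assumptions for this sketch, not the paper's calibration.

```python
import numpy as np

def lotka_volterra(x0, y0, alpha, beta, gamma, delta, dt=0.001, steps=20000):
    """Integrate the classic two-species Lotka-Volterra system with forward Euler.

    Illustratively, x might stand for central-bank discipline and y for
    aggregate fiscal deficits; the coupling terms capture their interaction.
    """
    x, y = x0, y0
    xs, ys = [x], [y]
    for _ in range(steps):
        dx = (alpha - beta * y) * x   # x grows, dampened by high y
        dy = (delta * x - gamma) * y  # y grows when x is abundant
        x += dx * dt
        y += dy * dt
        xs.append(x)
        ys.append(y)
    return np.array(xs), np.array(ys)

# Hypothetical unit parameters: the system cycles around the equilibrium (1, 1).
xs, ys = lotka_volterra(1.0, 0.5, 1.0, 1.0, 1.0, 1.0)
```

With these placeholder parameters the trajectories oscillate persistently instead of converging, which is the qualitative point such interaction models make: without an outside rule, neither player's behaviour settles down on its own.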
This paper provides a quantitative approach to measuring the effectiveness of ambush marketing by using Google data. To our knowledge, it is one of the first studies to develop an empirical approach that directly measures the attention effect of ambush marketing in sports. The new data set consists of 14 ambushers (treatment group) and 26 official sponsors (control group) and covers the time period from 2004 to 2012. These firms conducted marketing activities during the past football World Cups and European Championships. The innovation in our paper is the measurement of attention by means of Google. The results are as follows: First, ambush marketing increases product attention significantly. Second, the product awareness of ambushers is greater than or equal to that of official sponsors. Finally, we demonstrate that ambush marketing has positive impacts on the company's performance. Overall, we conclude that Google data provide new insights for the analysis of ambush marketing.
The intelligent recycling of plastics waste is a major concern. Because of the widespread use of polyethylene terephthalate, considerable amounts of PET waste are generated that are ideally re-introduced into the material cycle by generating second-generation products without loss of material performance. Chemical recycling methods are often expensive and entail environmentally hazardous by-products. Established mechanical methods generally provide materials of reduced quality, leading to products of lower quality. These drawbacks can be avoided by the development of new recycling methods that provide materials of high quality in every step of the production cycle. In the present work, oligomeric ethylene terephthalate with defined degrees of polymerization and defined molecular weight is produced by melt-mixing PET with different quantities of adipic acid as an alternative pathway for recycling PET, offering ecological and economic advantages over conventional methods. Additionally, block-copolyesters of defined block length are designed from the oligomeric products.
Clay minerals play an increasingly important role as functional fillers and reinforcing materials for clay polymer nanocomposites (CPN) in advanced applications. Among the prerequisites necessary for polymer improvement by clay minerals are homogeneous and stable distribution of the clay mineral throughout the CPN, good compatibility of the reinforcement with the matrix component and suitable processability. Typically, clay minerals are surface-modified with organic interface-active compounds like detergents or silanes to obtain favorable properties as filler. They are incorporated into the polymer matrix using manufacturing equipment like extruders, batch reactors or other mixing machines. In order for the surface modification to survive the stresses and strains during incorporation, the modified clay minerals must display sufficient thermal and mechanical stability to retain the compatibilizing effect. In the present study, thermogravimetry was used in combination with isoconversional kinetic analysis to determine the thermal stability of a silane-modified clay mineral based on bentonite. These findings were compared with the stability of the same clay mineral that was only surfactant-modified. It was found that silane modification leads to significantly improved thermal stability, which depends strongly on the type of silane employed.
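The abstract does not state which isoconversional method was applied. As a hedged illustration only, the widely used Friedman method estimates the activation energy at a fixed conversion level from the slope of ln(dα/dt) against 1/T across runs; the function name and inputs below are our own, not from the study.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def friedman_activation_energy(temps_K, rates):
    """Friedman isoconversional analysis at one fixed conversion level.

    The rate law ln(da/dt) = ln[A f(a)] - Ea/(R T) implies that the slope of
    ln(rate) versus 1/T equals -Ea/R, so Ea follows from a linear fit.
    """
    inv_T = 1.0 / np.asarray(temps_K, dtype=float)
    ln_rate = np.log(np.asarray(rates, dtype=float))
    slope, _intercept = np.polyfit(inv_T, ln_rate, 1)
    return -slope * R  # activation energy in J/mol
```

In practice the rates are read off thermogravimetric runs at several heating rates, each evaluated at the same conversion, and repeating the fit over many conversion levels yields the Ea(α) profile that characterizes thermal stability.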
Decorative laminates based on melamine formaldehyde (MF) resin impregnated papers are used to a great extent for surface finishing of engineered wood that is used for furniture, kitchen and working surfaces, flooring and exterior cladding. In all these applications, optically flawless appearance is a major issue. The work described here is focused on enhancing the cleanability and antifingerprint properties of smooth, matt surface-finished melamine-coated particleboards for furniture fronts, without simultaneously changing or degrading other important surface parameters such as hardness, roughness or gloss. In order to adjust the surface polarity of a low-pressure melamine film, novel interface-active macromolecular compounds were prepared and tested for their suitability as antifingerprint additives. Two hydroxy-functional surfactants (polydimethylsiloxane, PDMS-OH, and perfluoroether, PF-OH) were oxidized under mild conditions to the corresponding aldehydes (PDMS-CHO and PF-CHO) using a pyridinium chlorochromate catalyst. With the most promising oxidized polymeric additive, PDMS-CHO, the contact angles against water, n-hexadecane, and squalene increased from 79.8°, 26.3° and 31.4° for the pure MF surface to 108.5°, 54.8°, and 59.3°, respectively, for the modified MF surfaces. While for the laminated MF surface based on the oxidized fluoroether the gloss values were much higher than required, for the surfaces based on oxidized polydimethylsiloxane the technological values as well as the lower gloss values were in agreement with the requirements and showed much improved surface cleanability, as was also confirmed by colorimetric measurements.
This paper studies the impact of governmental transparency on the political business cycle. The literature on electoral cycles finds evidence that cycles depend on the stage of the economy. However, we show that the cycle also depends on transparency. We use data for the G7 countries and compare them with less developed OECD countries. Our theory states that transparency reduces political cycles through peer pressure and the voting out of incumbents. We confirm the theory with an econometric assessment of 34 countries from 1970 to 2012. We find smaller cycles in countries with higher transparency, especially in the G7 countries.
The powder coating of veneered particle boards by the sequence of electrostatic powder application followed by powder curing via hot pressing is studied in order to create high-gloss surfaces. To obtain an appealing aspect, veneer sheets were glued by heat and pressure on top of particle boards and the resulting surfaces were used as carrier substrates for powder coat finishing. Prior to the powder coating, the veneered particle board surfaces were pre-treated by sanding to obtain good uniformity, and the boards were stored in a climate chamber at controlled temperature and humidity conditions to adjust an appropriate electrical surface resistance. Characterization of the surface texture was done by 3D microscopy. The electrical surface resistance was measured for the six veneers before and after their application on the particle board surface. A transparent powder top-coat was applied electrostatically onto the veneered particle board surface. Curing of the powder was done using a heated press at 130 °C for 8 min, and a smooth, glossy coating was obtained on the veneered surfaces. By applying different amounts of powder, the coating thickness could be varied, and the optimum amount of powder was determined for each veneer type.
The capability of the method of immersion transmission ellipsometry (ITE) (Jung et al., Int. Patent WO 2004/109260) to determine not only three-dimensional refractive indices in anisotropic thin films (which was already possible in the past) but even their gradients along the z-direction (perpendicular to the film plane) is investigated in this paper. It is shown that the determination of orientation gradients in deep-sub-µm films becomes possible by applying ITE in combination with reflection ellipsometry. The technique is supplemented by atomic force microscopy for measuring the film thickness. For a photo-oriented thin film, no gradient was found, as expected. For a photo-oriented film that was subsequently annealed in a nematic liquid crystalline phase, an order was found similar to the one applied in vertically aligned nematic displays, with a tilt angle varying along the z-direction. For fresh films, gradients were only detected for the refractive index perpendicular to the film plane, as expected.
A strategy is needed to adjust people's performance capabilities to new requirements and to guarantee employability in the world of work. Good examples of this are the current changes in the logistics environment. New services and processes close to production are regularly taken into the portfolio of logistics enterprises, so the daily tasks of the skilled workers are changing continuously.
LOPEC aims at developing and offering specially tailored training in Lean Logistics and the required basic skills for skilled workers at the shopfloor level. The know-how needed for today's challenges in logistics will be transferred. Another aspect of LOPEC is the development and use of a personal excellence self-assessment that allows a person to assess and thus improve his or her own level of maturity in employability skills. Thus, LOPEC aims at people enhancement as an entry ticket to lifelong continuous learning by increasing the maturity level of personal logistics excellence. A common European view of “logistics personal excellence” for skilled workers will ensure that the final product is an open product using international, pan-European validated standards. As results, LOPEC will provide training modules for post-secondary education in the area of Lean Logistics and the required basic skills, and it offers transparency of personal excellence with a personal self-assessment software solution covering the personal maturity level of hard and soft skills at any time. It can be used as an innovative tool for monitoring personal lifelong learning routes as well as within companies as a strategic tool for human resource development.
This article focuses on the potential economic implications of a free trade agreement (FTA) between the European Union (EU) and the Indian Federation. The economic implications are evaluated by estimating an extended gravity model for all existing FTAs with the Indian Federation. Moreover, we control for the trade contribution of EU member countries in our econometric model during the period from 1990 to 2008. The results show a significant increase in trade if there is a free trade agreement between India and another country. Interestingly, we find that India benefits most from FTAs with more advanced economies. Thus, we reaffirm the potential benefits of trade relationships between the EU and India.
This paper is a brief review on the book ‘Capital in the Twenty-First Century’ by the French scholar Thomas Piketty. The book has started a new debate about inequality and capital taxation in Europe. It provides interesting empirical facts and develops a theory of the functioning of capitalist economies. However, I personally think the book is less convincing than recognized in the public debate. The demonstrated theory of economic growth in the book is elusive and lacks a psychological and behavioral underpinning. In fact, I do think that the increasing inequality and economic divergence are caused by capitalism but the psychological and behavioral aspects of humans are of similar or greater significance. Therefore, Piketty’s argument does not stimulate an open and scientifically founded debate in all aspects.
Whither the German Council of Economic Experts? The past and future of public economic advice
(2014)
The article discusses the development and impact of the German Council of Economic Experts (GCEE). Firstly, the author studies the historical origins and the institutional setup of the GCEE. In a second step, an analysis of the impact of the Council's annual reports is given, along with an international comparison with other advisory boards. Finally, the paper discusses the current economic challenges and the need for modernization of the GCEE in particular and of political advisory boards in general.
This white paper builds a new financial theory of euro area sovereign bond markets under stress. The theory explains the abnormal bond pricing and increasing spreads during the recent market turmoil. We find that the strong disconnect of bond spreads from the respective bonds’ underlying fundamental values in 2010 was triggered by an increase in asymmetric information and weak reputation of government policies. Both factors cause a normal bond market to switch into a crisis mode. Finally, those markets are prone to self-fulfilling bubbles in which the economic effects are amplified by herding behaviour arising from animal spirits. Altogether, this produces contagious effects and multiple equilibria. Thus, we argue that government bond markets in a monetary union are more fragile and vulnerable to liquidity and solvency crises. Consequently, the systemic mispricing of sovereign debt creates more macroeconomic instability and bubbles in the euro area than in a single country. In other words, financial markets are partly blind to national default risks in a currency union. Therefore, the current European institutional framework puts the wrong incentives in place and needs structural changes soon. To tackle the root causes we suggest more market incentives via consistent rules, pre-emptive austerity measures in good economic times, and a resolution scheme for heavily indebted countries. In summary, our paper enhances the bond market theory and provides new insights into the recent bond market turmoil in Europe.
This paper develops a new governance scheme for a stable and lasting European Monetary Union (EMU). I demonstrate that existing economic governance is based on flawed incentives, especially due to insufficient macroeconomic coordination, failures of institutional enforcement and animal spirits in financial markets. All this caused the European sovereign debt crisis in 2010. Consequently, the EMU crisis is not a conundrum at all but rather a failure of national and supranational governance. To tackle this problem, I propose a return to flexible but compulsory rules driven by market forces. The new governance principles shall promote the compliance with and effective enforcement of rules.
This paper examines asset price determination via Google data. To capture this relation, I create a model and estimate several time-series regressions. I use weekly data from 2004 to 2010 for 30 international banks. To my knowledge, this is the first study that differentiates between Google's search volume and Google's search clicks. I show that asset prices are positively related to the rate of change in Google's search volume, trading volume and the level of Google search clicks. Secondly, I demonstrate that the absolute levels of Google's search volume and Google's search clicks behave differently regarding the asset price dynamics: Google's search volume, which measures long-run searches, is negatively related, while Google's search clicks have a positive relationship to asset prices. Hence, Google data offer new insights on both measuring attention and pricing financial assets.
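The banking and Google series used in the study are not public, but the core estimation is an ordinary least squares time-series regression. The sketch below fits the same kind of model on synthetic data; every number here is a placeholder, not one of the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 300  # stand-in for ~6 years of weekly observations

# Synthetic regressor: weekly growth in Google search volume for a bank,
# plus a noisy return series that is positively related to it by construction.
d_search = rng.normal(0.0, 1.0, n)
returns = 0.5 * d_search + rng.normal(0.0, 0.5, n)

# OLS with intercept: returns = a + b * d_search + error
X = np.column_stack([np.ones(n), d_search])
beta, *_ = np.linalg.lstsq(X, returns, rcond=None)
# beta[1] estimates the attention effect; here it should land near the
# true slope 0.5 used to generate the data.
```

A real replication would add the further regressors named in the abstract (trading volume, the level of search clicks) as extra columns of `X`, which is exactly how OLS distinguishes the differently signed volume and click effects.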
Intra-operative fluoroscopy-guided assistance system for transcatheter aortic valve implantation
(2014)
A new surgical assistance system has been developed to support the correct positioning of the aortic valve prosthesis (AVP) during transapical TAVI. The developed assistance system automatically defines the target area for implanting the AVP under live 2-D fluoroscopy guidance. Moreover, this surgical assistance system works with low levels of contrast agent for the final deployment of the AVP, thereby reducing long-term negative effects such as renal failure in elderly and high-risk patients.
The article discusses how drama can support language learning at the university level and how drama can also support learners in acquiring professional competences. In the first part, the article will briefly outline forms of drama in language teaching. It will discuss its benefits, such as putting language in context, making learning holistic and memorable, improving learners’ social and personal competences. The second part describes aspects of drama beneficial for language learning in a professional context and gives a concrete teaching example: theatre projects with a focus on business English.
Plasma polymerization is used for the modification and control of surface properties of a highly transparent, thermoplastic elastomeric silicone copolymer, GENIOMER® 80 (G80). PEG-like diglyme plasma polymer films were deposited with ether retentions varying between 20% and 70%, as measured by X-ray photoelectron spectroscopy, which did not affect the transparency of the substrate. Films with ether retentions of greater than 70% inhibit protein binding (bovine serum albumin and fibrinogen) and cell proliferation. A short oxygen plasma pretreatment enhances the adhesion and stability of the film, as shown by protein binding and cell adhesion experiments. The transparency of the material and the stability of the coating make this material a versatile bulk material for technical (e.g., lab-on-a-chip) and biomedical (e.g., intraocular lens) applications. The G80/plasma polymer composite is stable against vigorous washing and storage over 5 months and, therefore, offers an attractive alternative to poly(dimethylsiloxane).
There are several intra-operative use cases that require the surgeon to interact with medical devices. We used the Leap Motion Controller as an input device and implemented two use cases: 2D interaction (e.g., advancing EPR data) and selection of a value (e.g., room illumination brightness). The gesture detection was successful, and we mapped its output to several devices and systems.
Stent graft visualization and planning tool for endovascular surgery using finite element analysis
(2014)
Purpose: A new approach to optimizing stent graft selection for endovascular aortic repair is the use of finite element analysis. Once the finite element model is created and solved, a software module is needed to view the simulation results in the clinical work environment. A new tool for the interpretation of simulation results, named Medical Postprocessor, that enables comparison of different stent graft configurations and products was designed, implemented and tested.
Methods: Aortic endovascular stent graft ring forces and sealing states in the vessel landing zone of three different configurations were provided in a surgical planning software based on the Medical Imaging Interaction Toolkit (MITK). For data interpretation, software modules for 2D and 3D presentation were implemented. Ten surgeons evaluated the software features of the Medical Postprocessor. These surgeons performed usability tests and answered questionnaires based on their experience with the system.
Results: The Medical Postprocessor visualization system enabled vascular surgeons to determine the configuration with the highest overall fixation force in 16 ± 6 s, the best proximal sealing in 56 ± 24 s and the highest proximal fixation force in 38 ± 12 s. The majority considered the multiformat data provided helpful and found the Medical Postprocessor to be an efficient decision support system for stent graft selection. The evaluation of the user interface yielded an ISONORM-conformant user interface (113.5 points).
Conclusion: The Medical Postprocessor visualization software tool for analyzing stent graft properties was evaluated by vascular surgeons. The results show that the software can assist in the interpretation of simulation results to optimize stent graft configuration and sizing.
A vapor permeation process for the separation of aromatic compounds from aliphatic compounds
(2014)
A number of rubbery and glassy membranes have been prepared and evaluated in vapor permeation experiments for separation of aromatic/aliphatic mixtures, using 5/95 (wt:wt) toluene/methylcyclohexane (MCH) as a model solution. Candidate membranes that met the required toluene/MCH selectivity of ≥ 10 were identified. The stability of the candidate membranes was tested by cycling the experiment between higher toluene concentrations and the original 5 wt% level. The best membrane produced has a toluene permeance of 280 gpu and a toluene/MCH selectivity of 13 when tested with a vapor feed of the model mixture at its boiling point and at atmospheric pressure. When a series of related membrane materials are compared, there is a sharp trade-off between membrane permeance and membrane selectivity. A process design study based on the experimental results was conducted. The best preliminary membrane design uses 45% of the energy of a conventional distillation process.
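As a quick consistency check on the figures quoted above: selectivity is the ratio of the component permeances, so the best membrane's MCH permeance is implied by its toluene permeance and selectivity. The helper name below is ours, not from the paper.

```python
def selectivity(permeance_a_gpu, permeance_b_gpu):
    """Ideal membrane selectivity: ratio of two component permeances (gpu)."""
    return permeance_a_gpu / permeance_b_gpu

# Best membrane in the study: toluene permeance 280 gpu at a toluene/MCH
# selectivity of 13, so the implied MCH permeance is 280 / 13 ≈ 21.5 gpu.
toluene_gpu = 280.0
implied_mch_gpu = toluene_gpu / 13.0
```

The target screen in the study then reduces to a one-line test: a candidate passes when `selectivity(...) >= 10`.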