Transaction processing is of growing importance for mobile computing. Booking tickets, flight reservations, banking, ePayment, and booking holiday arrangements are just a few examples of mobile transactions. Due to temporarily disconnected situations, synchronisation and consistent transaction processing are key issues. Serializability is too strong a criterion for correctness when the semantics of a transaction are known. We introduce a transaction model that allows higher concurrency for a certain class of transactions defined by their semantics. The transaction results are "escrow serializable" and the synchronisation mechanism is non-blocking. An experimental implementation showed higher concurrency, higher transaction throughput, and lower resource usage than common locking or optimistic protocols.
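The escrow idea behind this abstract can be illustrated on a commutative counter such as a stock of bookable seats. The sketch below is a minimal, hypothetical illustration (all names invented, and it is not the paper's actual protocol): instead of locking the value, each transaction escrows the quantity it may subtract, so concurrent bookings never block as long as the worst-case value stays within bounds.

```python
# Hypothetical sketch of escrow-style synchronisation for a commutative
# counter (e.g., free seats). A transaction escrows the amount it may
# subtract; admission is non-blocking and checks only the worst case.
class EscrowCounter:
    def __init__(self, value, minimum=0):
        self.value = value        # committed value
        self.minimum = minimum    # invariant: worst case must stay >= minimum
        self.reserved = 0         # total amount escrowed by open transactions

    def reserve(self, amount):
        """Try to escrow `amount`; fails fast (no blocking) if the
        worst-case value would violate the invariant."""
        if self.value - self.reserved - amount < self.minimum:
            return False
        self.reserved += amount
        return True

    def commit(self, amount):
        self.reserved -= amount
        self.value -= amount

    def abort(self, amount):
        self.reserved -= amount   # the escrowed quantity flows back

seats = EscrowCounter(10)
assert seats.reserve(6)       # transaction T1 escrows 6 seats
assert not seats.reserve(5)   # T2 cannot escrow 5: worst case would be -1
assert seats.reserve(4)       # but escrowing 4 still fits
seats.commit(6)               # T1 books; committed value drops to 4
```

Because reserve/commit/abort only add and subtract, any interleaving of such transactions yields the same final value, which is the intuition behind accepting non-serializable but "escrow serializable" schedules.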
Modern web-based applications are often built as multi-tier architectures using persistence middleware. Middleware technology providers recommend the use of Optimistic Concurrency Control (OCC) mechanisms to avoid the risk of blocked resources. However, most vendors of relational database management systems implement only locking schemes for concurrency control. As a consequence, a form of OCC has to be implemented on the client or middleware side.
A simple Row Version Verification (RVV) mechanism has been proposed to implement OCC on the client side. For performance reasons, the middleware uses buffers (caches) of its own to avoid network traffic and possible disk I/O. This caching, however, complicates the use of RVV because the data in the middleware cache may be stale (outdated). We investigate various data access technologies, including the new Java Persistence API (JPA) and Microsoft's LINQ, for their ability to use the RVV programming discipline.
The use of persistence middleware that tries to relieve the programmer from low-level transaction programming turns out to even complicate the situation in some cases. Programmed examples show how to use SQL data access patterns to solve the problem.
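The core of the RVV discipline can be sketched in a few lines (a minimal illustration with a hypothetical schema, using SQLite for brevity; the paper's own examples target JPA and LINQ): each row carries a version counter, and an update is accepted only if the version read earlier is still current.

```python
import sqlite3

# In-memory demo schema with a version column (names are hypothetical).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE account (id INTEGER PRIMARY KEY,"
             " balance INTEGER, row_version INTEGER)")
conn.execute("INSERT INTO account VALUES (1, 100, 1)")
conn.commit()

def read_row(conn, acc_id):
    """Phase 1: read the row together with its current version."""
    return conn.execute(
        "SELECT balance, row_version FROM account WHERE id = ?", (acc_id,)
    ).fetchone()

def rvv_update(conn, acc_id, new_balance, expected_version):
    """Phase 2: update only if the version is unchanged (optimistic check).
    Returns True on success, False if a concurrent writer won the race."""
    cur = conn.execute(
        "UPDATE account SET balance = ?, row_version = row_version + 1 "
        "WHERE id = ? AND row_version = ?",
        (new_balance, acc_id, expected_version),
    )
    conn.commit()
    return cur.rowcount == 1   # 0 rows touched means the row was stale

balance, version = read_row(conn, 1)
assert rvv_update(conn, 1, balance - 30, version)      # first writer succeeds
assert not rvv_update(conn, 1, balance - 50, version)  # stale version rejected
```

The middleware-cache problem discussed above is exactly the case where phase 1 returns a stale version from the cache: the subsequent conditional UPDATE then correctly fails, but only if the version check is issued against the database rather than the cache.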
The Third International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2011), held on January 23-27, 2011 in St. Maarten, The Netherlands Antilles, continued a series of international events covering a large spectrum of topics related to advances in database fundamentals, the evolving relation between databases and other domains, database technologies and content processing, as well as the specifics of application-domain databases. Advances in different technologies and domains related to databases triggered substantial improvements in content processing, information indexing, and data, process, and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of XML adoption. High-speed communications and computations, large storage capacities, and load balancing for distributed database access allow new approaches to content processing with incomplete patterns, advanced ranking algorithms, and advanced indexing methods. Developments in e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems put pressure on database communities to push the 'de facto' methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology. We take this opportunity to thank all the members of the DBKDA 2011 Technical Program Committee as well as the numerous reviewers. The creation of such a broad and high-quality conference program would not have been possible without their involvement. We also kindly thank all the authors who dedicated much of their time and effort to contribute to DBKDA 2011. We truly believe that, thanks to all these efforts, the final conference program consists of top-quality contributions. This event could also not have been a reality without the support of many individuals, organizations, and sponsors.
We are grateful to the members of the DBKDA 2011 organizing committee for their help in handling the logistics and for their work to make this professional meeting a success. We hope that DBKDA 2011 was a successful international forum for the exchange of ideas and results between academia and industry and for the promotion of progress in database research. We are convinced that the participants found the event useful and communications very open. The beautiful places of St. Maarten surely provided a pleasant environment during the conference and we hope you had a chance to visit the surroundings.
This work presents a disconnected transaction model able to cope with the increased complexity of long-living, hierarchically structured, and disconnected transactions. We combine an Open and Closed Nested Transaction Model with Optimistic Concurrency Control and interrelate flat transactions with the aforementioned complex nature. Despite temporary inconsistencies during a transaction's execution, our model ensures consistency.
Since 2000, Indian special economic zones have been established with the intention of attracting foreign direct investment (FDI). We present a first empirical assessment with new data from 1980 to 2010 and evaluate the outcome after 10 years. In general, our empirical results confirm that special economic zones attract FDI in a statistically significant way. Another finding of the study is that open economies with stable inflation attract more FDI than small and closed economies.
India’s growth: perspectives for Indo-European business “Skilled labour in India: bridging the gap”
(2011)
The following paper is based on a survey conducted for ESB Business School and will show how German companies perceive India's labour market. Besides existing geographical and sectoral gaps, we will reveal gaps in the required qualification profile. Thinking merely of hard qualification factors like education levels, skills, etc., though, would be short-sighted. Often-cited intercultural qualifications also play an important role.
What can be done? What should be done to bridge these gaps? These will be the leading questions of this chapter. We will discuss some solutions – not forgetting that the problems German companies face are complex and knowing there is no ideal way. However, we will see that some of the most urgent problems can be solved or reduced by Indo-European or Indo-German cooperation models in the field of vocational training and institutions of higher education.
The Fourth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2012), held between February 29 and March 5, 2012 in Saint Gilles, Reunion Island, continued a series of international events covering a large spectrum of topics related to advances in database fundamentals, the evolving relation between databases and other domains, database technologies and content processing, as well as the specifics of application-domain databases. Advances in different technologies and domains related to databases triggered substantial improvements in content processing, information indexing, and data, process, and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of XML adoption. High-speed communications and computations, large storage capacities, and load balancing for distributed database access allow new approaches to content processing with incomplete patterns, advanced ranking algorithms, and advanced indexing methods. Developments in e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems put pressure on database communities to push the 'de facto' methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology. We take here the opportunity to warmly thank all the members of the DBKDA 2012 Technical Program Committee, as well as the numerous reviewers. The creation of such a broad and high-quality conference program would not have been possible without their involvement. We also kindly thank all the authors who dedicated much of their time and effort to contribute to DBKDA 2012. We truly believe that, thanks to all these efforts, the final conference program consisted of top-quality contributions. Also, this event could not have been a reality without the support of many individuals, organizations, and sponsors.
We are grateful to the members of the DBKDA 2012 organizing committee for their help in handling the logistics and for their work to make this professional meeting a success. We hope that DBKDA 2012 was a successful international forum for the exchange of ideas and results between academia and industry and for the promotion of progress in the fields of databases, knowledge, and data applications. We are convinced that the participants found the event useful and communications very open. We also hope the attendees enjoyed the charm of Saint Gilles, Reunion Island.
Redirected walking techniques allow people to walk in a larger virtual space than the physical extents of the laboratory. We describe two experiments conducted to investigate human sensitivity to walking on a curved path and to validate a new redirected walking technique. In a psychophysical experiment, we found that sensitivity to walking on a curved path was significantly lower for slower walking speeds (radius of 10 meters versus 22 meters). In an applied study, we investigated the influence of a velocity-dependent dynamic gain controller and an avatar controller on the average distance that participants were able to freely walk before needing to be reoriented. The mean walked distance was significantly greater in the dynamic gain controller condition, as compared to the static controller (22 meters versus 15 meters). Our results demonstrate that perceptually motivated dynamic redirected walking techniques, in combination with reorientation techniques, allow for unaided exploration of a large virtual city model.
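The curvature manipulation behind these results can be sketched geometrically. The toy model below assumes a constant curvature gain (it is not the authors' velocity-dependent dynamic controller): while the user walks a straight line in the virtual world, the physical path is bent onto a circle of a given radius, and a smaller radius means stronger, more easily detectable redirection.

```python
import math

def physical_path(virtual_distance, radius, steps=1000):
    """Toy curvature-gain model: integrate the physical position of a user
    who walks `virtual_distance` metres virtually straight while being
    redirected onto a physical circle of the given radius."""
    x = y = heading = 0.0
    ds = virtual_distance / steps
    for _ in range(steps):
        heading += ds / radius        # curvature injects ds/r of rotation
        x += ds * math.cos(heading)
        y += ds * math.sin(heading)
    return x, y

# Walking 22 m virtually on a 10 m-radius physical circle displaces the
# user far less than 22 m physically, which is what makes a large virtual
# space explorable inside a small laboratory.
x, y = physical_path(22, 10)
assert math.hypot(x, y) < 22
```

The closed-form check is the chord length 2r·sin(s/2r) ≈ 17.8 m for s = 22 m and r = 10 m, which the numerical integration reproduces.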
Multi-dimensional patient data, such as time-varying volume data, data of different imaging modalities, surface segmentations, etc., are of growing importance in the clinical routine. For many use cases, it is of major importance to replicate a certain visualization of a data set created on one machine on a different computer using different software tools. To date, no standardized methodology for this consistent presentation exists. We propose an extension of the Digital Imaging and Communications in Medicine (DICOM) standard called "Multi-dimensional Presentation State" and outline the scope and first results of the standardization process.
Energy efficiency and safety have become important factors for car manufacturers. Cars have therefore been optimised with regard to energy consumption and safety, for example by optimising the power train or the engine. Besides the optimisation of the car itself, energy efficiency and safety can also be increased by adapting the individual driving behaviour to the current driving situation. This paper introduces a driving system, currently in development, whose goal is to optimise driving behaviour in terms of energy efficiency and safety by giving recommendations to the driver. To create a recommendation, the driving system monitors the driver, the current driving situation, and the car using in-vehicle sensors and serial-bus systems. On the basis of the acquired data, the driving system will give individual energy-efficiency and safety recommendations in real time. This will allow eliminating bad driving habits while considering the driver's needs.
Telemedicine is becoming an increasingly important approach to diagnosing, treating, or preventing diseases. However, the use of information and communication technologies in healthcare results in a considerable amount of data that must be efficiently and securely transmitted. Many manufacturers provide telemedicine platforms without regard to interoperability, mobility, and collaboration. This paper describes a collaborative mobile telemonitoring platform that can use the IEEE 11073 and HL7 communication standards or adapt proprietary protocols. The proposed platform also covers security and modularity aspects. Furthermore, this work introduces an Android-based prototype implementation.
This paper presents a new European initiative to support the sustainable empowerment of the ageing society. Empowerment in this context represents the capability to lead a self-determined, autonomous, and healthy life. The paper justifies the need for such an initiative and highlights the role that telemedicine and ambient assisted living can play in this environment.
The purpose of this study is to evaluate German online fashion shopping websites from a customer perspective, based on a two-dimensional conceptual framework covering shopping experience and shopping quality. As the research methodology, an exploratory mystery shopping approach was used to compare online shops. The results were as follows. First, four categories of online shops were identified: heroes, marketing winners, process winners, and underperformers. Second, three main levers for improvement were elaborated: emotionality of websites, reducing complexity, and the introduction of an industry standard for payments. From these results, it is possible to analyze and benchmark websites and to adapt online marketing decisions as well as general management strategies for online fashion shopping companies. The study has originality and value, as it is the first time that an evaluation of websites has combined the consumer's perspective before the purchase with its fulfillment (e.g., delivery) after the online purchase.
The workshop aims to discuss leading-edge contributions to the interdisciplinary research area of ambient intelligence (AmI) applied to the domains of telemedicine and driving assistance. AmI refers to human-centered environments equipped with sensors. The development of AmI in the two application domains of the workshop shares several commonalities: the extensive usage of networked devices and sensors, the design of artificial intelligence algorithms for diagnosis, including recommendation systems and qualitative reasoning, and the application of mobile and wireless communication in their distributed systems. Together with the presentation of common aspects of ambient intelligence, a further goal of the workshop is to stimulate synergies between the two application domains and present examples. The telemedicine domain can benefit from methodologies for designing complex devices, real-time-compliant system design, and audiovisual or computer vision system design used in automotive driving assistance. Furthermore, the automotive domain can benefit from the user-centric view, biometric sensor data design, and multi-user databases for aggregation and diagnosis using big data, as used in telemedicine. The German Government supports these research lines in its Hightech-Strategie under the domains "Health and Nutrition" and "Climate and Energy". In Spain, the corresponding programme is the "Spanish Program for R&D Challenge-Oriented Society - Challenge in safe, efficient and clean energy & Challenge in sustainable, smart and integrated transport". Scientific contributions to the event are peer-reviewed by a suitable program committee with members from Germany and Spain. The same committee has been serving the JARCA workshop (Jornadas sobre Sistemas cualitativos y sus Aplicaciones en Diagnosis, Robótica e Inteligencia Ambiental - Conference on Qualitative Systems and their Applications in Diagnosis, Robotics and Ambient Intelligence) for 15 years.
This workshop is sponsored by the German Academic Exchange Service (DAAD) under contract number 57070010.
When forecasting sales figures, not only the sales history but also the future price of a product will influence the sales quantity. At first sight, multivariate time series seem to be the appropriate model for this task. Nonetheless, in real life history is not always repeatable, i.e., in the case of sales history there is only one price for a product at a given time. This complicates the design of a multivariate time series. However, for some seasonal or perishable products the price is rather a function of the expiration date than of the sales history. This additional information can help to design a more accurate and causal time series model. The proposed solution uses a univariate time series model but takes the price of a product as a parameter that systematically influences the prediction based on a calculated periodicity. The price influence is computed from historical sales data using correlation analysis and adjustable price ranges to identify products with comparable history. The periodicity is calculated using a novel approach based on data folding and Pearson correlation. Compared to other techniques, this approach is easy to compute and allows presetting the price parameter for predictions and simulations. Tests with data from the Data Mining Cup 2012 as well as artificial data demonstrate better results than established sophisticated time series methods.
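The data-folding step can be sketched as follows. This is a rough reconstruction under stated assumptions (the paper's exact scoring may differ): cut the series into consecutive chunks of a candidate length p and score p by the mean Pearson correlation of adjacent chunks; the true period is the length at which the chunks line up.

```python
import statistics

def pearson(xs, ys):
    """Plain Pearson correlation of two equal-length sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs)
           * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den if den else 0.0

def folded_periodicity(series, max_period):
    """Fold the series at each candidate period p and return the p whose
    consecutive folds correlate best (hypothetical reconstruction of the
    data-folding idea)."""
    best_p, best_score = None, float("-inf")
    for p in range(2, max_period + 1):
        chunks = [series[i:i + p] for i in range(0, len(series) - p + 1, p)]
        chunks = [c for c in chunks if len(c) == p]
        if len(chunks) < 2:
            continue
        score = statistics.mean(
            pearson(a, b) for a, b in zip(chunks, chunks[1:]))
        if score > best_score:
            best_p, best_score = p, score
    return best_p

# A weekly-style pattern of length 7 repeated four times.
sales = [5, 9, 12, 8, 6, 20, 25] * 4
assert folded_periodicity(sales, 10) == 7
```

At the true period the folds are (nearly) identical, so the mean correlation peaks; a price-dependent prediction can then be anchored to this detected periodicity.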
"Learning by doing" in higher education in technical disciplines is mostly realized through hands-on labs. It challenges the exploratory aptitude and curiosity of a person. But exploratory learning is hindered by technical situations that are not easy to establish and to verify. Technical skills are, however, mandatory for employees in this area. On the other hand, theoretical concepts are often compromised by commercial products. The challenge is to contrast and reconcile theory with practice. Another challenge is to implement a self-assessment and grading scheme that keeps up with the scalability of e-learning courses. In addition, it should allow the use of different commercial products in the labs and still grade the assignment results automatically in a uniform way. In two European Union funded projects we designed, implemented, and evaluated a unique e-learning reference model, which realizes a modularized teaching concept that provides easily reproducible virtual hands-on labs. The novelty of the approach is to use software products of industrial relevance to compare with theory and to contrast different implementations. In a sample case study, we demonstrate the automated assessment of a creative database modeling and design task. Pilot applications in several European countries demonstrated that the participants gained highly sustainable competences that improved their attractiveness for employment.
The Fifth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2013), held between January 27 and February 1, 2013 in Seville, Spain, continued a series of international events covering a large spectrum of topics related to advances in database fundamentals, the evolving relation between databases and other domains, database technologies and content processing, as well as the specifics of application-domain databases. Advances in different technologies and domains related to databases triggered substantial improvements in content processing, information indexing, and data, process, and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of XML adoption. High-speed communications and computations, large storage capacities, and load balancing for distributed database access allow new approaches to content processing with incomplete patterns, advanced ranking algorithms, and advanced indexing methods. Developments in e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems put pressure on database communities to push the 'de facto' methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology. We take here the opportunity to warmly thank all the members of the DBKDA 2013 Technical Program Committee, as well as the numerous reviewers. The creation of such a high-quality conference program would not have been possible without their involvement. We also kindly thank all the authors who dedicated much of their time and effort to contribute to DBKDA 2013. We truly believe that, thanks to all these efforts, the final conference program consisted of top-quality contributions. Also, this event could not have been a reality without the support of many individuals, organizations, and sponsors.
We are grateful to the members of the DBKDA 2013 organizing committee for their help in handling the logistics and for their work to make this professional meeting a success. We hope that DBKDA 2013 was a successful international forum for the exchange of ideas and results between academia and industry and for the promotion of progress in the fields of databases, knowledge and data applications. We are convinced that the participants found the event useful and communications very open. We also hope the attendees enjoyed the charm of Seville, Spain.
New storage technologies, such as Flash and Non-Volatile Memories, with fundamentally different properties are appearing. Leveraging their performance and endurance requires a redesign of existing architectures and algorithms in modern high-performance databases. Multi-Version Concurrency Control (MVCC) approaches in database systems maintain multiple timestamped versions of a tuple. Once a transaction reads a tuple, the database system tracks and returns the respective version, eliminating lock requests. Hence, under MVCC reads are never blocked, which leverages well the excellent read performance (high throughput, low latency) of new storage technologies. Upon tuple updates, however, established implementations of MVCC approaches (such as Snapshot Isolation) lead to multiple random writes – caused by (i) creation of the new version and (ii) in-place invalidation of the old version – thus generating suboptimal access patterns for the new storage media. The combination of an append-based storage manager operating with tuple granularity and snapshot isolation addresses asymmetry and in-place updates. In this paper, we highlight novel aspects of log-based storage in multi-version database systems on new storage media. We claim that multi-versioning and append-based storage can be used to effectively address asymmetry and endurance. We identify multi-versioning as the approach to address data placement in complex memory hierarchies. We focus on version handling, (physical) version placement, and compression and collocation of tuple versions on Flash storage and in complex memory hierarchies. We identify possible read- and cache-related optimizations.
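The combination of append-only versioning and snapshot reads can be sketched in a few lines. This is a hypothetical, minimal structure (not the paper's storage manager): writers append a new timestamped version instead of invalidating the old one in place, so all writes are sequential, and a reader simply picks the newest version visible at its snapshot timestamp, so reads never block.

```python
import itertools

class AppendOnlyMVCC:
    """Minimal sketch of snapshot reads over append-only tuple versions."""
    def __init__(self):
        self.log = []                    # append-only (key, ts, value) records
        self.clock = itertools.count(1)  # monotonically increasing timestamps

    def write(self, key, value):
        ts = next(self.clock)
        self.log.append((key, ts, value))  # sequential append, never in place
        return ts

    def snapshot_read(self, key, snapshot_ts):
        """Newest version of `key` committed at or before the snapshot."""
        visible = [(ts, v) for (k, ts, v) in self.log
                   if k == key and ts <= snapshot_ts]
        return max(visible)[1] if visible else None

db = AppendOnlyMVCC()
db.write("x", 10)            # version of x at ts=1
snap = db.write("y", 99)     # ts=2: a reader takes its snapshot here
db.write("x", 20)            # ts=3: a later writer does not disturb it
assert db.snapshot_read("x", snap) == 10   # snapshot sees the old version
assert db.snapshot_read("x", 3) == 20      # a newer snapshot sees the new one
```

The old version of "x" is never overwritten, which is exactly the write pattern that avoids the in-place invalidation penalty on Flash described above.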
When forecasting sales figures, not only the sales history but also the future price of a product will influence the sales quantity. At first sight, multivariate time series seem to be the appropriate model for this task. Nonetheless, in real life history is not always repeatable, i.e., in the case of sales history there is only one price for a product at a given time. This complicates the design of a multivariate time series. However, for some seasonal or perishable products the price is rather a function of the expiration date than of the sales history. This additional information can help to design a more accurate and causal time series model. The proposed solution uses a univariate time series model but takes the price of a product as a parameter that systematically influences the prediction. The price influence is computed from historical sales data using correlation analysis and adjustable price ranges to identify products with comparable history. Compared to other techniques, this novel approach is easy to compute and allows presetting the price parameter for predictions and simulations. Tests with data from the Data Mining Cup 2012 demonstrate better results than established sophisticated time series methods.
A fast-transient current-mode buck-boost DC-DC converter for portable devices is presented. Running at 1 MHz, the converter provides a stable 3 V from a 2.7 V to 4.2 V Li-Ion battery. Small voltage under-/overshoot is achieved by two fast-transient techniques: (1) adaptive pulse skipping (APS) and (2) adaptive compensation capacitance (ACC). The proposed converter was implemented in a 0.25 μm CMOS technology. Load transient simulations confirm the effectiveness of APS and ACC. The improvements in voltage undershoot and response time at a light-to-heavy load step (100 mA to 500 mA) are 17 % and 59 %, respectively, in boost mode and 40 % and 49 %, respectively, in buck mode. Similar results are achieved at a heavy-to-light load step for overshoot and response time.
The Dow Jones Sustainability Indexes (DJSI) track the performance of companies that lead in corporate sustainability in their respective sectors or in the geographies in which they operate. SAM Indexes GmbH publishes and markets the Dow Jones Sustainability Indexes in collaboration with Sustainable Asset Management (SAM). All indexes of the DJSI family are assessed according to SAM's Corporate Sustainability Assessment™ methodology.
Unraveling the double-edged sword: effects of cultural diversity on creativity and innovativeness
(2014)
Cultural diversity is considered a "double-edged sword" (Kravitz, 2005), as research on its effects on team performance regularly delivers inconsistent and contradictory results. This paper attempts to unravel the double-edged sword by discerning different forms of cultural diversity: separation and variety (Harrison & Klein, 2007). Based on a review of the literature, a conceptual model is developed hypothesizing that cultural variety yields positive, while cultural separation yields negative, effects on team creativity and innovativeness. In addition, the effects of national diversity are contrasted to test whether national diversity can serve as a proxy for cultural diversity, as is often practiced. The model is tested on a sample of 113 student teams in entrepreneurship modules at 4 European universities. Cultural diversity is measured directly on the basis of individual team members' cultural value orientations by means of the CPQ4 (Maznevski, DiStefano, Gomez, Noorderhaven & Wu, 2002). Data are analyzed using the PLS structural equation modeling technique. The results confirm the hypothesized impacts of cultural variety and separation on creativity but do not deliver evidence for impacts on innovativeness. The same is true for national diversity. Interestingly, national diversity does not show any relation to either form of cultural diversity.
The Sixth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2014), held between April 20 and 24, 2014 in Chamonix, France, continued a series of international events covering a large spectrum of topics related to advances in database fundamentals, the evolving relation between databases and other domains, database technologies and content processing, as well as the specifics of application-domain databases. Advances in different technologies and domains related to databases triggered substantial improvements in content processing, information indexing, and data, process, and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of XML adoption. High-speed communications and computations, large storage capacities, and load balancing for distributed database access allow new approaches to content processing with incomplete patterns, advanced ranking algorithms, and advanced indexing methods. Developments in e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems put pressure on database communities to push the 'de facto' methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
Silicones
(2014)
Silicones are found in a variety of applications with requirements that range from long life at elevated temperatures to fluidity at low temperatures. This chapter first considers silicone elastomers and their application in room temperature vulcanizing (RTV) and heat curing systems (HTV). Also, new technologies for UV curing are introduced. Coverage of RTVs includes both one-component and two-component systems and the different cure chemistries of each, and is followed by a separate discussion of silicone laminates. Due to the high importance of silicone fluids, they are also discussed. Fluids include polishes, release agents, surfactants, and dielectric fluids.
While digital IC design is highly automated, analog circuits are still handcrafted in a time-consuming, manual fashion today. This paper introduces a novel Parameterized Circuit Description Scheme (PCDS) for the development of procedural analog schematic generators as parameterized circuits. Circuit designers themselves can use PCDS to create circuit automatisms which capture valuable expert knowledge, offer full topological flexibility, and enhance the re-use of well-established topologies. The generic PCDS concept has been successfully implemented and employed to create parameterized circuits for a broad range of use cases. The achieved results demonstrate the efficiency of our PCDS approach and the potential of parameterized circuits to increase automation in circuit design, also to benefit physical design by promoting the common schematic-driven-layout flow, and to enhance the applicability of circuit synthesis approaches.
In the period from the 1950s to 2013, the U.S. Food and Drug Administration (FDA) approved 1346 new molecular entities (NMEs) or new biological entities (NBEs). On average, the approval rate was 20 NMEs per year. In the past 40 years, the number of new drugs launched onto the market increased slightly from 15 NMEs per year in the 1970s to 25-30 NMEs per year since the 1990s. The highest numbers of new drugs approved by the FDA were in 1996 and 1997, which might be related to the enactment of the Prescription Drug User Fee Act (PDUFA) in 1993.
This paper showed how a PLC program written in the Ladder Diagram programming language can be analysed using methods for the analysis of Petri nets. The goal of the method is not verification in the strict sense but the detection of forbidden or undesired states. The paper presented rules for transforming a sequence implemented in Ladder Diagram into a Petri net and demonstrated the capability of the approach by analysing an incorrectly implemented sequence. The example shows that programming errors can be detected even before testing on the real plant. In the further development of the method, one focus is on the generalisation to program organisation units developed in Ladder Diagram that implement more than pure sequences. Another important development step is graphical support for error localisation in the reachability graph, so that altogether a powerful tool becomes available for supporting the implementation of sequence controls in Ladder Diagram.
Proceedings of the International Workshop on Mobile Networks for Biometric Data Analysis (mBiDA)
(2014)
Prevention and treatment of common and widespread (chronic) diseases is a challenge in any modern society and vitally important for health maintenance in aging societies. Capturing biometric data is a cornerstone for any analysis and treatment strategy. Latest advances in sensor technology allow accurate data measurement in a non-intrusive way. In many cases, it is necessary to provide online monitoring and real-time data capturing to support patients' prevention plans or to allow medical professionals to access the current status. Different communication standards are required to push sensor data and to store and analyze them on different (mobile) platforms. The objective of the workshop is to show new and innovative approaches dedicated to biometric data capture and analysis in a non-intrusive way while maintaining mobility. Examples can be found in human-centered ambient intelligence equipped with sensors, or in methodologies applied in real-time-conformant mobile system design for automotive applications. The workshop's main challenge is to focus on approaches promoting non-intrusiveness, reliable prediction algorithms and high user acceptance. The workshop will provide overview presentations, young researcher poster tracks, doctoral tracks and classical peer-reviewed full paper tracks. We would especially like to encourage students and young researchers to participate and to contribute to the workshop. Scientific contributions to the event are peer-reviewed by a suitable program committee.
Today, 40 Gbps transmission over four-pair balanced cabling is under development in IEEE 802.3bq. In this paper, we describe a 25 Gbps transmission experiment enabling either a single-pair transmission of 25 Gbps over a 30 meter balanced cabling channel or a 100 Gbps transmission via a four-pair balanced channel. A scalable matrix modeling tool is introduced which allows the prediction of the transmission characteristics of a channel, taking mode conversion into account. We applied this tool to characterize channels including the magnetics and the PCB for a four-pair 100 Gbps transmission. We evaluated prototype cables and connecting hardware for frequencies up to 2 GHz and beyond. Finally, we investigated possible line encoding schemes and provide measurement results of a transmission over 30 m with a data rate of 25 Gbps per twisted pair.
In this paper, research projects with 30 meter balanced cabling and data rates up to 25 Gbps over a single pair are described. The project aim is to achieve 100 Gbps via a four-pair balanced cabling channel. In the following, the spectral characteristics of the prototype twisted pair used are presented: the insertion loss of the cable alone in comparison to the insertion loss of the cable combined with an equalizing amplifier, as well as the group delay of the cable and of the cable connected to the equalizing amplifier. Furthermore, a carrierless pulse amplitude modulation with 32 levels (PAM-32) is presented as an approach for a possible line encoding. Finally, measurements of data transmission at rates up to 25 Gbps via shielded twisted pair are shown.
An index in a Multi-Version DBMS (MV-DBMS) has to reflect different tuple versions of a single data item. Existing approaches follow the paradigm of logically separating the tuple version data from the data item, e.g. an index is only allowed to return at most one version of a single data item (while it may return multiple data items that match a search criterion). Hence, to determine the valid (and therefore visible) tuple version of a data item, the MV-DBMS first fetches all tuple versions that match the search criterion and subsequently filters the visible versions using visibility checks. This involves storage I/O accesses to tuple versions that do not have to be fetched. In this vision paper we present the Multi-Version Index (MV-IDX) approach that allows index-only visibility checks, which significantly reduce the amount of storage I/O as well as the index maintenance overhead. The MV-IDX achieves significantly lower response times and higher transactional throughput on OLTP workloads.
The use of Wireless Sensor and Actuator Networks (WSAN) as an enabling technology for Cyber-Physical Systems has increased significantly in the recent past. The challenges that arise in different application areas of Cyber-Physical Systems in general, and in WSAN in particular, are getting the attention of both academia and industry. Since the reliability of message delivery in wireless communication is of critical importance for certain safety-related applications, it is one of the areas that has received significant focus in the research community. Additionally, the diverse needs of different applications put different demands on the lower layers of the protocol stack, necessitating mechanisms in the lower layers that enable them to adapt dynamically. Another major issue in the realization of networked, wirelessly communicating cyber-physical systems in general, and WSAN in particular, is the lack of approaches that tackle the reliability, configurability and application-awareness issues together. One could consider tackling these issues in isolation. However, the interplay between these issues creates challenges that make application developers spend more time on meeting them, often in suboptimal ways, than on solving the problems related to the application being developed. Starting from some fundamental concepts, general issues and problems in cyber-physical systems, this chapter discusses issues such as energy efficiency and application- and channel-awareness for networked, wirelessly communicating cyber-physical systems. Additionally, the chapter describes a middleware approach called CEACH, an acronym for Configurable, Energy-efficient, Application- and Channel-aware Clustering-based middleware service for cyber-physical systems.
The state of the art in the area of cyber-physical systems, with a special focus on communication reliability, configurability, and application- and channel-awareness, is described in the chapter. The chapter also describes how these features have been considered in the CEACH approach. Important node-level and network-level characteristics and their significance vis-à-vis the design of applications for cyber-physical systems are also discussed, as is the issue of adaptively controlling the impact of these factors with respect to application demands and network conditions. The chapter further includes a description of Fuzzy-CEACH, an extension of the CEACH middleware service that uses fuzzy logic principles. The fuzzy descriptors used in the different stages of Fuzzy-CEACH are described, and the fuzzy inference engine used in the Fuzzy-CEACH cluster head election process is described in detail. The rule bases used by the fuzzy inference engine in the different stages of Fuzzy-CEACH are also included to give an insightful description of the protocol. The chapter also discusses in detail the experimental results validating the concepts presented in the CEACH approach, as well as the applicability of the CEACH middleware service in different application scenarios in the domain of cyber-physical systems. The chapter concludes by shedding light on publish-subscribe mechanisms in distributed event-based systems and showing how they can make use of the CEACH middleware to reliably communicate detected events to the event consumers, or to the actuators if the WSAN is modeled as a distributed event-based system.
Advanced power semiconductors such as DMOS transistors are key components of modern power electronic systems. Recent discrete and integrated DMOS technologies have very low area-specific on-state resistances, so that devices with small sizes can be chosen. However, their power dissipation can sometimes be large, for example in fault conditions, causing the device temperature to rise significantly. This can lead to excessive temperatures, reduced lifetime, and possibly even thermal runaway and subsequent destruction. Therefore, it must be ensured already in the design phase that the temperature always remains in an acceptable range. This paper shows how self-heating in DMOS transistors can be experimentally determined with high accuracy. Further, it discusses how numerical electrothermal simulations can be carried out efficiently, allowing the accurate assessment of self-heating within a few minutes. The presented approach has been successfully verified experimentally for device temperatures exceeding 500 °C, up to the onset of thermal runaway.
This paper presents a new broadband antenna for satellite communications. It describes the procedure involved in the design of a microstrip antenna array and its multi-level passive feed network that together yield circular polarization and the necessary gain to be used in an earth-satellite link. The designed antenna is notable for its large bandwidth, circular polarization, high gain and small dimensions.
This paper presents the design and simulation processes of an equiangular spiral antenna for the extremely high frequencies between 65 GHz and 170 GHz. A new approach for the analysis of the antenna's electrical parameters is described. This approach is based on the formalism proposed by Rumsey to determine the EM field produced by an equiangular spiral antenna. Analytical expressions for the electrical parameters such as the gain or the directivity are then derived using well-founded mathematical approximations. The comparison of the obtained results with those from numerical integration methods shows good agreement.
Functionally impaired people have problems choosing and finding the right clothing, and they need help in their daily lives to wash and manage it. The goal of this work is to support the user with recommendations on choosing the right clothing, finding it, and washing it. The idea behind eKlarA is a gateway-based system that uses sensors to identify clothing items and their state in the clothing cycle. The clothing cycle consists of one or more closets, laundry baskets and washing machines in one or several places. The gateway uses information about the clothing, the weather and the calendar to support the user in the different steps of the clothing cycle. This gives functionally impaired people more freedom in their daily lives.
Besides the optimisation of the car itself, energy efficiency and safety can also be increased by optimising driving behaviour. Based on this fact, a driving system is in development whose goal is to educate the driver in energy-efficient and safe driving. It monitors the driver, the car and the environment and gives recommendations relevant to energy efficiency and safety. However, the driving system tries not to distract or bother the driver, for example by avoiding recommendations during stressful driving situations or when the driver is not interested in a recommendation. Therefore, the driving system monitors the stress level of the driver as well as the driver's reaction to a given recommendation and decides whether or not to give a recommendation. This makes it possible to suppress recommendations when needed and thus to increase road safety and the user acceptance of the driving system.
DMOS transistors are often subject to high power dissipation and thus substantial self-heating. This limits their safe operating area because very high device temperatures can lead to thermal runaway and subsequent destruction. Because the peak temperature usually occurs only in a small region of the device, it is possible to redistribute part of the dissipated power from the hot region to the cooler device areas. In this way, the peak temperature is reduced, whereas the total power dissipation remains the same. Assuming that a certain temperature must not be exceeded for safe operation, the improved device is now capable of withstanding higher amounts of energy with an unchanged device area. This paper presents two simple methods to redistribute the power dissipation density and thus lower the peak device temperature. The presented methods only require layout changes. They can easily be applied to modern power technologies without the need for process modifications. Both methods are implemented in test structures and investigated by simulations and measurements.
Knowledge transfer is very important to our knowledge-based society, and many approaches have been proposed to describe this transfer. However, these approaches take a rather abstract view of knowledge transfer, which makes implementation difficult. In order to address this issue, we introduce a layered model for knowledge transfer that structures the individual steps of knowledge transfer in more detail. This paper describes the process and gives an example of the application of the layered model for knowledge transfer. The example is located in the area of business process modelling. Business processes contain the important knowledge describing the procedures a company uses to produce products and services. Knowledge transfer is the fundamental basis of the modelling and usage of business processes, which makes it an interesting use case for the layered model for knowledge transfer.
We investigated the excitation modes of the light-harvesting protein phycocyanin (PC) from Thermosynechococcus vulcanus in the crystalline state using UV and near-infrared Raman spectroscopy. The spectra revealed the absence of a hydrogen out-of-plane wagging (HOOP) mode in the PC trimer, which suggests that the HOOP mode is activated in the intact PC rod, while it is not active in the PC trimer. Furthermore, in the PC trimer an intense mode at 984 cm−1 is assigned to the C–C stretching vibration, while the mode at 454 cm−1 is likely due to ethyl group torsion. In contrast, in the similar chromophore phytochromobilin the C5,10,15-D wag mode at 622 cm−1 does not come from a downshift of the HOOP. Additionally, the absence of modes between 1200 and 1300 cm−1 rules out functional monomerization. A correlation between phycocyanobilin (PCB) and phycoerythrobilin (PEB) suggests that the PCB cofactors of the PC trimer appear in a conformation similar to that of PEB. The conformation of the PC rod is consistent with that of the allophycocyanin (APC) trimer, and thus excitonic flow is facilitated between these two independent light-harvesting compounds. This excitonic flow from the PC rod to APC appears to be modulated by the vibration channels during HOOP wagging, C=C stretching, and the N–H rocking in-plane vibration.
In visual adaptive tracking, the tracker adapts to the target, background, and conditions of the image sequence. Each update introduces some error, so the tracker might drift away from the target over time. To increase the robustness against the drifting problem, we present three ideas on top of a particle filter framework: an optical-flow-based motion estimation, a learning strategy for preventing bad updates while staying adaptive, and a sliding window detector for failure detection and finding the best training examples. We experimentally evaluate the ideas using the BoBoT dataset. The code of our tracker is available online.
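The particle filter framework underlying such a tracker boils down to a predict/weight/resample step per frame. The following Python sketch is purely illustrative and not the authors' implementation: all names and parameters are invented; the `motion` argument stands in for the optical-flow-based motion estimate, and `score` for the (adaptively learned) appearance model.

```python
import random


def pf_step(particles, motion, score, noise=2.0, rng=random.Random(0)):
    """One particle-filter tracking step over 2-D candidate positions."""
    # 1. Predict: shift particles by the estimated motion plus diffusion noise.
    moved = [(x + motion[0] + rng.gauss(0, noise),
              y + motion[1] + rng.gauss(0, noise)) for (x, y) in particles]
    # 2. Weight: evaluate each candidate position with the appearance model.
    weights = [score(p) for p in moved]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # 3. Resample: draw positions proportional to their weights, so particles
    #    concentrate on likely target locations.
    return rng.choices(moved, weights=weights, k=len(moved))
```

With a score function peaked at the true target position, repeated calls keep the particle cloud centered on the target even as the motion estimate carries it across the frame; the learning strategy and failure detector described in the abstract would sit on top of this loop.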
Many companies practice performance management as a heterogeneous, historically grown mix of numerous separate decisions, instruments, processes and systems rather than as a strategically and systematically planned management system. Because of the inefficiency of this style of performance management, a holistic and integrated approach is a key success factor. Performance management must be able to meet central objectives and requirements and lay the groundwork for long-term corporate success. This article presents a central approach to the conception of holistic and long-term performance management. Its five equally weighted sub-disciplines are illustrated and, through their characteristics and combination, demonstrate the complexity of performance management in both subject matter and composition. The objective of this article is to display and communicate the performance management issue and its context through an easily comprehensible system, without following a generic recipe.
Globally acting rating agencies were held responsible for the latest financial market crisis. False estimations in ratings, non-transparent methods, processes and systems, as well as a lack of qualification of rating analysts, have been points of criticism. This article describes the extent of the tightened regulation of the agencies in the USA and in Europe. All relevant institutions and norms, as well as the international and national standards from the German point of view, are presented and exhaustively analyzed. In doing so, it is illustrated that in this oligopolistic market one can definitely speak of protection with regard to the admission and accreditation of the agencies.
Processing
(2014)
In this chapter, some relevant aspects and illustrative examples of online monitoring tools as the basis for process control in the manufacturing and processing of thermosetting resins are briefly discussed. In principle, any chemical or physical information made accessible by sensors can be used for online monitoring of resin formation, resin location in the mold, and resin cure. For instance, changes in the flow properties of the reaction mixture are often routinely recorded as a function of reaction time during resin synthesis, as a measure of the degree of conversion of raw materials into macromolecules or oligomers, by applying rheometry in an in-process environment. Typically, a small sample of the reaction mixture is by-passed, subjected to rheological measurement, and re-introduced into the bulk reactor. pH measurements, turbidimetric measurements, and other analyses are performed in a similar way. Although rheometry may not always be suitable for following resin cure (especially in cases where there is a very rapid increase in viscosity after initiation of the cure) [1], the method can in principle also be used in the subsequent processing of the thermosets, for instance in the curing of wood glue applied to wood specimens [2]. Similarly, pH changes during thermoset curing can be followed. Hence, an encyclopedic and comprehensive approach to presenting process control methods would proceed systematically according to the physical measurement principle involved. However, since only a very brief sketch of means for monitoring thermoset processing can be given here, only a small, personally biased selection of important methods and application examples is addressed in the following sections. These examples hopefully illustrate some of the general strategies and solutions to problems that are typically encountered when processing thermosets.
Unsaturated polyester resins (UPR) and vinyl ester resins (VER) are among the most commercially important thermosetting matrix materials for composites. Although comparatively low in cost, their technological performance is suitable for a wide range of applications, such as fiber-reinforced plastics, artificial marble or onyx, polymer concrete, and gel coats. The main areas of UPR consumption include the wind energy, marine, pipe and tank, transportation, and construction industries. This chapter discusses basic UPR and VER chemistry, manufacturing technology, and the consequent applications. Some important properties and performance characteristics are discussed, such as shrinkage behavior, flame retardance, and property modification by nanoparticles. Also briefly introduced and described are the practical aspects of UPR and VER processing, with special emphasis on the most widely used technological approaches, such as hand and spray layup, resin infusion, resin transfer molding, sheet and bulk molding, pultrusion, winding, and centrifugal casting.
Three different polyols (soluble starch, sucrose, and glycerol) were tested for their potential in the chemical modification of melamine formaldehyde (MF) resins for paper impregnation. MF-impregnated papers are widely used as finishing materials for engineered wood. These polyols were selected because the presence of multiple hydroxy groups in the molecules was suspected to facilitate cocondensation with the main MF framework, which should lead to good resin performance. Moreover, they are readily produced from natural feedstock, are available in large quantities, and may serve as economically feasible, environmentally harmless alternative co-monomers suitable to substitute a portion of the fossil-based starting material. In the presented work, a number of model resins were synthesized and tested for covalent incorporation of the natural polyols into the MF framework. Spectroscopic evidence of the chemical incorporation of glycerol was found by applying 1H, 13C, 1H/13C HSQC, 1H/13C HMBC, and 1H DOSY NMR methods. It was furthermore found that covalent incorporation of glycerol into the network took place when glycerol was added at different stages during synthesis. Further, all resins were used to prepare decorative laminates, and the performance of the novel resins as surface finishes was evaluated using standard technological tests. The technological performance of the various modified thermosetting resins was assessed by determining flow viscosity, molar mass distribution, and storage stability, and, in a second step, by laminating impregnated paper to particle boards and testing the resulting surfaces according to standardized quality tests. In most cases, the average board surface properties were of acceptable quality. Our findings demonstrate the possibility of replacing several percent of the petroleum-based product melamine with compounds obtained from renewable resources.
Crosslinked thermoplastics
(2014)
Cross-linked thermoplastics represent an important class of materials for numerous applications such as heat-shrinkable tubing, rotational molded parts, and polyolefin foams. By cross-linking polyolefins, their mechanical performance can be significantly enhanced. This chapter covers the three main methods for the cross-linking of thermoplastics: radiation cross-linking, chemical cross-linking with organic peroxides, and cross-linking using silane-grafting agents. It also considers the major effects of the cross-linking procedure on the performance of the thermoplastic materials discussed.
Mass customization is a megatrend that also affects the wood industry. To obtain individually designed laminates in batch size one, efficient printing and processing technologies are required. Digital printing was envisaged because it does not depend on costly printing cylinders (as used in rotogravure printing) and allows rapid exchange of printing designs. In the present work, two well-established digital printing approaches, the multi-pass and the single-pass technique, were investigated and evaluated for their applicability in decorating engineered wood and low-pressure melamine (LPM) films. Three different possibilities of implementing digital printing in the decorative laminates manufacturing process were studied: (1) digital printing on coated chipboard with subsequent application of a lacquered top-coat or melamine overlay (designated as “direct printing”, since the LPM was the printing substrate), (2) digital printing on decorative paper which was subsequently impregnated before hot pressing (designated as “indirect printing, variant A”), and (3) digital printing on decorative paper with subsequent interlamination of the paper between impregnated underlay and overlay paper layers during the pressing process (designated as “indirect printing, variant B”). Due to various advantages of the resulting cured melamine resin surfaces, including much better technological performance and flexibility in surface texture design, it was decided to further pursue industrially only the indirect digital printing process comprising interlamination and the direct printing process with a melamine overlay finishing. The basis for the successful trials on production and laboratory scale was the identification of applicable inks (in terms of compatibility with melamine resin) and of an appropriate printing paper quality (in terms of impregnation and imprinting ability).
After selection and fine tuning of suitable materials, the next challenge to overcome was the initially insufficient bond strength between impregnated overlay and the ink layers which led to unsatisfactory quality of the print appearance and delamination effects. However, the optimization of the pressing program and the development of a modified impregnation procedure for the underlay and overlay papers allowed the successful implementation of digital printing in the production line of our industrial partner FunderMax.
Allyls
(2014)
This chapter addresses the importance and usage of the commercially low-volume group of thermoset plastics known as allyls. The three significant sub-groups are poly(diallylphthalates), poly(diallylisophthalates), and poly(allyldiglycol carbonate). Chemistry, processing, and properties are also described. Allyl polymers are synthesized by radical polymerization of allyl monomers, which usually does not produce high-molecular-mass macromolecules. Therefore, only a few specific monomers can produce thermosetting materials. Diallyldiglycolcarbonate (CR-39) and diallylphthalates are the most significant examples that have considerably improved our everyday life.
Prior studies ascribed people's poor performance in dealing with basic systems concepts to different causes. While results indicate that, among other things, domain-specific experience and familiarity with the problem context play a role in this stock-flow (SF) performance, this has not yet been fully clarified. In this article, we present an experiment that examines the role of educational background in SF performance. We hypothesize that SF performance increases when the problem context is embedded in the problem solver's knowledge domain, as indicated by educational background. Using the square wave pattern and the sawtooth pattern tasks from the initial study by Booth Sweeney and Sterman (2000), we design two additional cover stories for the former, the Vehicle story from the engineering domain and the Application story from the business domain, alongside the original Bathtub story. We then test the three sets of questions on business students. Results mainly support our hypothesis. Interestingly, participants even do better on a more complex behavioral pattern from their own knowledge domain than on a simpler pattern from more distant domains. Although these findings have to be confirmed by further studies, they contribute both to the methodology of future surveys and to the context familiarity discussion.
This paper compares the influence a video self-avatar and a lack of a visual representation of a body have on height estimation when standing at a virtual visual cliff. A height estimation experiment was conducted using a custom augmented reality Oculus Rift hardware and software prototype also described in this paper. The results show a consistency with previous research demonstrating that the presence of a visual body influences height estimates, just as it has been shown to influence distance estimates and affordance estimates.
The intelligent recycling of plastics waste is a major concern. Because of the widespread use of polyethylene terephthalate (PET), considerable amounts of PET waste are generated; these are ideally re-introduced into the material cycle as second-generation products without loss of material performance. Chemical recycling methods are often expensive and entail environmentally hazardous by-products. Established mechanical methods generally provide materials of reduced quality, leading to products of lower quality. These drawbacks can be avoided by the development of new recycling methods that provide materials of high quality in every step of the production cycle. In the present work, oligomeric ethylene terephthalate with defined degrees of polymerization and defined molecular weight is produced by melt-mixing PET with different quantities of adipic acid, offering an ecologically and economically attractive alternative to conventional recycling pathways. Additionally, block copolyesters of defined block length are designed from the oligomeric products.
Ethylene terephthalate and ethylene naphthalate oligomers of defined degree of polymerization were synthesized via chemical recycling of the parent polymers. The oligomers were used as defined building blocks for the preparation of novel block-co-polyesters having tailored sequence compositions. The sequence lengths were systematically varied using Design of Experiments. The dispersive surface energy and the specific desorption energy of the co-polymers were determined by inverse gas chromatography. The study shows that polyethylene terephthalate-polyethylene naphthalate (PET-PEN) block-co-polyesters of defined sequence lengths can be prepared. Furthermore, the specific and dispersive surface energies of the obtained block-co-polyesters showed a linear dependence on the oligomer molecular weight and it was possible to regulate and control their interfacial properties. In contrast, with the corresponding random-block-co-polyesters no such dependence was found. The synthesized block-co-polyesters could be used as polymeric modifying agents for stabilizing PET-PEN polymer blends.
Melamine formaldehyde (MF) resins are widely used for the gluing and surface coating of wood-based consumer products in the interior design of living environments. MF resins are especially relevant in decorative laminate applications because of their good performance-to-price ratio. In their industrial processing, an important intermediate state is the liquid MF prepolymer that is used for decorative paper impregnation. Here, the drying of impregnated papers is investigated with respect to premature curing. A new method to quantify water release upon drying that allows estimation of the degree of undesired precuring is described. Since curing proceeds via polycondensation, crosslinking brings about the release of water molecules. By thermogravimetric analysis (TGA), drying was studied in terms of water release due to physical drying (elimination of “dilution water”) and chemical crosslinking of the prepolymer to a three-dimensional MF network (elimination of chemically liberated water). The results obtained by TGA/IR spectroscopic analysis of the liberated volatiles show that the emission of water from b-stage MF can be clearly analytically separated into a physical (evaporation of dilution water) and a chemical (liberation via condensation) sequence. TGA experiments were correlated with curing experiments performed with differential scanning calorimetry (DSC) to estimate the residual crosslinking capacities of the impregnated papers. The drying conditions used during the preparation of impregnated decorative papers seemed to significantly affect their remaining reactivity only when harsh drying conditions were used. Upon heat exposure for prolonged time, precuring of the oligomer units results in a shift of the temperature maxima in TGA.
The fiber deformations of once-dried bleached and never-dried unbleached kraft pulps were studied with respect to their behavior in high- and low-consistency refining. The pulps were stained with congo red to highlight areas where the arrangement of the fibrils was altered by refining, such as dislocated zones or slip planes. The stained fibers were analyzed both with a conventional Metso Fiberlab and with a novel prototype measurement device utilizing a color imaging setup. The local intensity of the stain in the fiber was expressed as a degree of overall damage (overall fiber damage index, OFDI). The rewetted zero-span tensile index (RWZSTI) was used to verify the OFDI with respect to pulp strength. High-consistency refining resulted in a clear increase in the number of kinks, which negatively influenced the pulp strength. The OFDI, which was used to detect the intensity of local fiber defects, also responded accordingly: a higher OFDI resulted in a lower pulp strength. Low-consistency refining removed a significant number of kinks and resulted in an increase in fiber swelling. A slight increase in fibrillation and a significant increase in flake-like fines were also observed. The OFDI, however, was not reduced in low-consistency refining, as would be expected from the removal of less severe dislocations. One reason proposed here is that low-consistency refining created new fiber pores that allowed the dye to penetrate into the fiber wall similarly as it does in the zones of the dislocations.
Hardboards (HBs) (wet-process high-density fibreboards) were made in an industrial trial using a binder system consisting of cationic mimosa tannin and laccase or just cationic tannin without any thermosetting adhesive. The boards displayed superior mechanical strength compared to reference boards made with phenol–formaldehyde, easily exceeding the European standards for general-purpose HBs. The thickness swell of most of the boards was slightly greater than the standards would allow, so some optimisation is required in this area. The improved board properties appear to be mainly associated with ionic interactions involving quaternary amino groups in cationic tannin and negatively charged wood fibres rather than to cross-linking of fibres via laccase-assisted formation and coupling of radicals in tannin and fibre lignin.
Powder coating of engineered wood panels such as medium density fibreboards (MDF) is gaining industrial interest due to the ecological and economic advantages of powder coating technology. For transferring powder coating technology to temperature-sensitive substrates like MDF, a thorough understanding of the melting, flowing and curing behaviour of the low-bake resins used is required. In the present study, thermo-analysis in combination with isoconversional kinetic data analysis as well as rheometry is applied to characterise the properties of an epoxy-based powder coating. Neat resin and cured powder coating films are examined in order to define an ideal production window within which the resin can be applied and processed to yield satisfactory surface performance on the one hand, without, on the other hand, exposing the carrier MDF to a temperature load so high that the panel deteriorates in mechanical strength. In order to produce powder coated films of high surface gloss – a feature that has not yet been successfully realized on MDF with powder coatings – a new curing technology, in-mould surface finishing, has been applied.
In vivo, cells encounter different physical and chemical signals in the extracellular matrix (ECM) which regulate their behavior. Examples of these signals are micro- and nanometer-sized features, the rigidity, and the chemical composition of the ECM. The study of cell responses to such cues is important for understanding complex cell functions and some diseases, and forms the basis for the development of new biomaterials for applications in medical implants or regenerative medicine. Therefore, the development of new methods for surface modifications with controlled physical and chemical features is crucial. In this work, we report a new combination of block copolymer micelle nanolithography (BCML) and soft microlithography for the production of polyethylene glycol (PEG) hydrogels with a micro-grooved surface decorated with hexagonally, precisely arranged gold nanoparticles (Au NPs). The Au NPs are used for binding adhesive ligands in a well-defined density. First tests were performed by culturing human fibroblasts on the gels. Adhesion and alignment of the cells along the parallel grooves of the surface were investigated. The substrates could provide a new platform for studying cell contact guidance by microstructures, and may enable a more precise control of cell behavior by nanometrically controlled surface functionalization.
Poly(dimethylsiloxane) can be covalently coated with ultrathin NCO-sP(EO-stat-PO) hydrogel layers which permit covalent binding of cell adhesive moieties, while minimizing unspecific cell adhesion on non-functionalized areas. We applied long-term uniaxial cyclic tensile strain (CTS) and revealed (a) the preservation of the protein- and cell-repellent properties of the NCO-sP(EO-stat-PO) coating and (b) the stability and bioactivity of a covalently bound fibronectin (FN) line pattern. We studied the adhesion of human dermal fibroblasts (HDFs) on non-modified NCO-sP(EO-stat-PO) coatings and on the FN. HDFs adhered to FN and oriented their cell bodies and actin fibers along the FN lines independently of the direction of CTS. This mechanical long-term stability of the bioactive, patterned surface allows biomechanical stimuli for cellular signaling and behavior to be unraveled in order to understand physiological and pathological cell phenomena. Additionally, it allows for applications in wound healing assays, tissue engineering, and implant development demanding spatial control over specific cell adhesion.
It is well established that the mechanical environment influences cell functions in health and disease. Here, we address how the mechanical environment influences tumor growth, in particular, the shape of solid tumors. In an in vitro tumor model, which isolates mechanical interactions between cancer tumor cells and a hydrogel, we find that tumors grow as ellipsoids, matching the oft-reported observation of in vivo tumors. Specifically, an oblate ellipsoidal tumor shape robustly occurs when the tumors grow in hydrogels that are stiffer than the tumors, but when they grow in more compliant hydrogels they remain closer to spherical in shape. Using large-scale, nonlinear elasticity computations we show that the oblate ellipsoidal shape minimizes the elastic free energy of the tumor-hydrogel system. Having eliminated a number of other candidate explanations, we hypothesize that minimization of the elastic free energy is the reason for the predominance of the experimentally observed ellipsoidal shape. This result may hold significance for explaining the shape progression.
Whither the German Council of Economic Experts? The past and future of public economic advice
(2014)
The article discusses the development and impact of the German Council of Economic Experts (GCEE). Firstly, the author studies the historical origins and the institutional setup of the GCEE. In the second step, an analysis of the impact of the annual reports of the German Council is given, along with an international comparison with other advisory boards. Finally, the paper discusses the current economic challenges and the need for modernization of the GCEE in particular and of political advisory boards in general.
This paper is a brief review of the book ‘Capital in the Twenty-First Century’ by the French scholar Thomas Piketty. The book has started a new debate about inequality and capital taxation in Europe. It provides interesting empirical facts and develops a theory of the functioning of capitalist economies. However, I personally think the book is less convincing than the public debate suggests. The theory of economic growth demonstrated in the book is elusive and lacks a psychological and behavioral underpinning. In fact, I do think that increasing inequality and economic divergence are caused by capitalism, but the psychological and behavioral aspects of human beings are of similar or greater significance. Therefore, Piketty’s argument does not stimulate an open and scientifically founded debate in all respects.
This article focuses on potential economic implications of a free trade agreement (FTA) between the European Union (EU) and the Indian Federation. The economic implications are evaluated by estimating an extended gravity model for all existing FTAs with the Indian Federation. Moreover, we control for the trade contribution of EU member countries in our econometric model over the period from 1990 until 2008. The results show a significant increase in trade if there is a free trade agreement between India and another country. Interestingly, we find that India benefits most from FTAs with more advanced economies. Thus, we reaffirm the potential benefits of trade relationships between the EU and India.
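The log-linear gravity regression with an FTA dummy described in this abstract can be sketched on synthetic data; all coefficients, country-pair values, and the seed below are hypothetical illustrations, not the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500

# synthetic country-pair data (all values hypothetical)
log_gdp_i = rng.normal(26, 1, n)   # log GDP of exporter
log_gdp_j = rng.normal(26, 1, n)   # log GDP of importer
log_dist  = rng.normal(8, 0.5, n)  # log bilateral distance
fta       = rng.integers(0, 2, n)  # 1 if a free trade agreement exists

beta_fta_true = 0.5  # assumed trade-creating effect of an FTA
log_trade = (-10 + 0.8 * log_gdp_i + 0.8 * log_gdp_j
             - 1.1 * log_dist + beta_fta_true * fta
             + rng.normal(0, 0.3, n))

# OLS: regress log(trade) on log GDPs, log distance, and the FTA dummy
X = np.column_stack([np.ones(n), log_gdp_i, log_gdp_j, log_dist, fta])
beta, *_ = np.linalg.lstsq(X, log_trade, rcond=None)
print(f"estimated FTA effect: {beta[4]:.2f}")  # should be close to 0.5
```

The estimated FTA coefficient is the (approximate) percentage increase in bilateral trade attributable to an agreement, which is the quantity the gravity literature reports.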
This paper provides a quantitative approach to measuring the effectiveness of ambush marketing by using Google data. To our knowledge, it is one of the first studies that develop an empirical approach that directly measures the attention effect of ambush marketing in sports. The new data consist of 14 ambushers (treatment group) and 26 official sponsors (control group) and cover the time period of 2004 to 2012. These firms conducted marketing activities during the past football World Cups and European Championships. The innovation in our paper is the measurement of attention by means of Google. The results are as follows: First, ambush marketing increases product attention significantly. Second, the product awareness of ambushers is greater than or equal to that of official sponsors. Finally, we demonstrate that ambush marketing has positive impacts on the company's performance. Overall, we conclude that Google data provide new insights for the analysis of ambush marketing.
Adjusting people’s performance capabilities to new requirements is a strategy to guarantee employability in the world of work. The current changes in the logistics environment are a good example. New services and processes close to production are regularly taken into the portfolio of logistics enterprises, so the daily tasks of skilled workers change continuously.
LOPEC aims at developing and offering specially tailored training in Lean Logistics and the required basic skills for skilled workers at shopfloor level. The know-how needed for today’s challenges in logistics will be transferred. Another aspect of LOPEC is the development and use of a personal excellence self-assessment that allows a person to assess, and thus improve, his or her own level of maturity in employability skills. LOPEC thus aims at people enhancement as an entry ticket to lifelong continuous learning by increasing the maturity level of personal logistics excellence. A common European view of “logistics personal excellence” for skilled workers will ensure that the final product is an open product, using internationally, pan-European validated standards. As results, LOPEC will provide training modules for post-secondary education in the area of Lean Logistics and the required basic skills, and offers transparency of personal excellence with a personal self-assessment software solution showing the personal maturity level of hard and soft skills at any time. It can be used as an innovative tool for monitoring personal lifelong learning routes as well as within companies as a strategic tool for human resource development.
The capability of the method of immersion transmission ellipsometry (ITE) (Jung et al., Int. Patent WO 2004/109260) to not only determine three-dimensional refractive indices in anisotropic thin films (which was already possible in the past), but even their gradients along the z-direction (perpendicular to the film plane) is investigated in this paper. It is shown that the determination of orientation gradients in deep-sub-µm films becomes possible by applying ITE in combination with reflection ellipsometry. The technique is supplemented by atomic force microscopy for measuring the film thickness. For a photo-oriented thin film, no gradient was found, as expected. For a photo-oriented film, which was subsequently annealed in a nematic liquid crystalline phase, an order was found similar to the one applied in vertically aligned nematic displays, with a tilt angle varying along the z-direction. For fresh films, gradients were only detected for the refractive index perpendicular to the film plane, as expected.
The powder coating of veneered particle boards by the sequence of electrostatic powder application and powder curing via hot pressing is studied in order to create high-gloss surfaces. To obtain an appealing aspect, veneer sheets were glued by heat and pressure on top of particle boards and the resulting surfaces were used as carrier substrates for powder coat finishing. Prior to the powder coating, the veneered particle board surfaces were pre-treated by sanding to obtain good uniformity, and the boards were stored in a climate chamber at controlled temperature and humidity conditions to adjust an appropriate electrical surface resistance. Characterization of the surface texture was done by 3D microscopy. The surface electrical resistance was measured for the six veneers before and after their application on the particle board surface. A transparent powder top-coat was applied electrostatically onto the veneered particle board surface. Curing of the powder was done using a heated press at 130 °C for 8 min, and a smooth, glossy coating was obtained on the veneered surfaces. By applying different amounts of powder, the coating thickness could be varied, and the optimum amount of powder was determined for each veneer type.
This paper studies the impact of governmental transparency on the political business cycle. The literature on electoral cycles finds evidence that cycles depend on the stage of the economy. However, we show a reliance of the cycle on transparency. We use data for G7 countries and compare them with less developed OECD countries. Our theory states that transparency reduces political cycles through peer pressure and the threat of being voted out of office. We confirm the theory with an econometric assessment of 34 countries from 1970 to 2012. We discover smaller cycles in countries with higher transparency, especially in the G7 countries.
Decorative laminates based on melamine formaldehyde (MF) resin impregnated papers are used to a great extent for the surface finishing of engineered wood that is used for furniture, kitchen and working surfaces, flooring and exterior cladding. In all these applications, an optically flawless appearance is a major issue. The work described here is focused on enhancing the cleanability and antifingerprint properties of smooth, matt surface-finished melamine-coated particleboards for furniture fronts, without at the same time changing or deteriorating other important surface parameters such as hardness, roughness or gloss. In order to adjust the surface polarity of a low pressure melamine film, novel interface-active macromolecular compounds were prepared and tested for their suitability as an antifingerprint additive. Two hydroxy-functional surfactants (polydimethylsiloxane, PDMS-OH and perfluoroether, PF-OH) were oxidized under mild conditions to the corresponding aldehydes (PDMS-CHO and PF-CHO) using a pyridinium chlorochromate catalyst. With the most promising oxidized polymeric additive, PDMS-CHO, the contact angles against water, n-hexadecane, and squalene increased from 79.8°, 26.3° and 31.4° for the pure MF surface to 108.5°, 54.8°, and 59.3°, respectively, for the modified MF surfaces. While for the laminated MF surface based on the oxidized fluoroether the gloss values were much higher than required, for the surfaces based on oxidized polydimethylsiloxane the technological values as well as the lower gloss values were in agreement with the requirements and showed much improved surface cleanability, as was also confirmed by colorimetric measurements.
Clay minerals play an increasingly important role as functional fillers and reinforcing materials for clay polymer nanocomposites (CPN) in advanced applications. Among the prerequisites necessary for polymer improvement by clay minerals are a homogeneous and stable distribution of the clay mineral throughout the CPN, good compatibility of the reinforcement with the matrix component, and suitable processability. Typically, clay minerals are surface-modified with organic interface-active compounds like detergents or silanes to obtain favorable properties as filler. They are incorporated into the polymer matrix using manufacturing equipment like extruders, batch reactors or other mixing machines. In order for the surface modification to survive the stresses and strains during incorporation, the modified clay minerals must display sufficient thermal and mechanical stability to retain the compatibilizing effect. In the present study, thermogravimetry was used in combination with isoconversional kinetic analysis to determine the thermal stability of a silane-modified clay mineral based on bentonite. These findings were compared with the stability of the same clay mineral that was only surfactant-modified. It was found that silane modification leads to significantly improved thermal stability, which depends strongly on the type of silane employed.
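The isoconversional kinetic analysis mentioned in this abstract can be illustrated with a minimal sketch of the Friedman method, which fits ln(dα/dt) against 1/T at a fixed conversion across several heating rates; the slope of that line is −Eₐ/R. All numbers below (activation energy, temperatures, heating rates) are hypothetical, not values from the study.

```python
import numpy as np

R = 8.314        # gas constant, J/(mol K)
E_true = 120e3   # assumed activation energy, J/mol (hypothetical)
A_f = 1e9        # lumped A*f(alpha) at fixed conversion, 1/s (hypothetical)

# hypothetical temperatures (K) at which conversion alpha = 0.3 is reached
# for heating rates of 5, 10 and 20 K/min
T_alpha = np.array([610.0, 622.0, 635.0])
rate = A_f * np.exp(-E_true / (R * T_alpha))  # d(alpha)/dt at alpha = 0.3

# Friedman plot: ln(d(alpha)/dt) vs 1/T is linear with slope -E/R
slope, intercept = np.polyfit(1.0 / T_alpha, np.log(rate), 1)
E_est = -slope * R
print(f"estimated activation energy: {E_est / 1000:.1f} kJ/mol")  # 120.0
```

In a real TGA study the rates and temperatures come from measured mass-loss curves at each heating rate, and the fit is repeated over a grid of conversion levels to reveal how the apparent activation energy changes with conversion.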
A vapor permeation process for the separation of aromatic compounds from aliphatic compounds
(2014)
A number of rubbery and glassy membranes have been prepared and evaluated in vapor permeation experiments for separation of aromatic/aliphatic mixtures, using 5/95 (wt:wt) toluene/methylcyclohexane (MCH) as a model solution. Candidate membranes that met the required toluene/MCH selectivity of ≥ 10 were identified. The stability of the candidate membranes was tested by cycling the experiment between higher toluene concentrations and the original 5 wt% level. The best membrane produced has a toluene permeance of 280 gpu and a toluene/MCH selectivity of 13 when tested with a vapor feed of the model mixture at its boiling point and at atmospheric pressure. When a series of related membrane materials are compared, there is a sharp trade-off between membrane permeance and membrane selectivity. A process design study based on the experimental results was conducted. The best preliminary membrane design uses 45% of the energy of a conventional distillation process.
Stent graft visualization and planning tool for endovascular surgery using finite element analysis
(2014)
Purpose: A new approach to optimize stent graft selection for endovascular aortic repair is the use of finite element analysis. Once the finite element model is created and solved, a software module is needed to view the simulation results in the clinical work environment. A new tool for the interpretation of simulation results, named Medical Postprocessor, which enables comparison of different stent graft configurations and products, was designed, implemented and tested. Methods: Aortic endovascular stent graft ring forces and sealing states in the vessel landing zone of three different configurations were provided in a surgical planning software using the Medical Imaging Interaction Toolkit (MITK) software system. For data interpretation, software modules for 2D and 3D presentations were implemented. Ten surgeons evaluated the software features of the Medical Postprocessor. These surgeons performed usability tests and answered questionnaires based on their experience with the system.
Results: The Medical Postprocessor visualization system enabled vascular surgeons to determine the configuration with the highest overall fixation force in 16 ± 6 s, the best proximal sealing in 56 ± 24 s and the highest proximal fixation force in 38 ± 12 s. The majority considered the multiformat data provided helpful and found the Medical Postprocessor to be an efficient decision support system for stent graft selection. The evaluation of the user interface resulted in an ISONORM-conformant user interface (113.5 points).
Conclusion: The Medical Postprocessor visualization software tool for analyzing stent graft properties was evaluated by vascular surgeons. The results show that the software can assist the interpretation of simulation results to optimize stent graft configuration and sizing.
There are several intra-operative use cases which require the surgeon to interact with medical devices. We used the Leap Motion Controller as input device and implemented two use cases: 2D interaction (e.g. advancing EPR data) and selection of a value (e.g. room illumination brightness). The gesture detection was successful and we mapped its output to several devices and systems.
Plasma polymerization is used for the modification and control of surface properties of a highly transparent, thermoplastic elastomeric silicone copolymer, GENIOMER® 80 (G80). PEG-like diglyme plasma polymer films were deposited with ether retentions varying between 20% and 70% as measured by X-ray photoelectron spectroscopy analysis, which did not affect the transparency of the substrate. Films with ether retentions of greater than 70% inhibit protein binding (bovine serum albumin and fibrinogen) and cell proliferation. A short oxygen plasma pretreatment enhances the adhesion and stability of the film as shown by protein binding and cell adhesion experiments. The transparency of the material and the stability of the coating make this material a versatile bulk material for technical (e.g., lab-on-a-chip) and biomedical (e.g., intraocular lens) applications. The G80/plasma polymer composite is stable against vigorous washing and storage over 5 months and, therefore, offers an attractive alternative to poly(dimethylsiloxane).
The article discusses how drama can support language learning at the university level and how drama can also support learners in acquiring professional competences. In the first part, the article will briefly outline forms of drama in language teaching. It will discuss its benefits, such as putting language in context, making learning holistic and memorable, improving learners’ social and personal competences. The second part describes aspects of drama beneficial for language learning in a professional context and gives a concrete teaching example: theatre projects with a focus on business English.
Intra-operative fluoroscopy-guided assistance system for transcatheter aortic valve implantation
(2014)
A new surgical assistance system has been developed to assist the correct positioning of the AVP during transapical TAVI. The developed assistance system automatically defines the target area for implanting the AVP under live 2-D fluoroscopy guidance. Moreover, this surgical assistance system works with low levels of contrast agent for the final deployment of the AVP, therefore reducing long-term negative effects, such as renal failure, in elderly and high-risk patients.
This paper examines asset price determination via Google data. To capture this relation, I create a model and estimate several time-series regressions, using weekly data from 2004 to 2010 on 30 international banks. To my knowledge, this is the first study to differentiate between Google’s search volume and Google’s search clicks. I show that asset prices are positively related to the rate of change in Google’s search volume, the trading volume and the level of Google search clicks. Secondly, I demonstrate that the absolute levels of Google’s search volume and Google’s search clicks behave differently regarding the asset price dynamics. Google’s search volume, which measures long-run searches, is negatively related to asset prices, while Google’s search clicks have a positive relationship. Hence, Google data offer new insights on both measuring attention and pricing financial assets.
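The kind of time-series regression described in this abstract can be sketched on synthetic weekly data: returns are regressed on the change in search volume and a trading-volume control. The coefficient values, series, and seed below are hypothetical illustrations, not the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(7)
weeks = 350  # roughly 2004-2010 of weekly observations

# synthetic explanatory series (all parameters hypothetical)
d_log_search = rng.normal(0, 0.05, weeks)  # weekly change in search volume
d_log_volume = rng.normal(0, 0.10, weeks)  # weekly change in trading volume

beta_search_true = 0.4  # assumed attention effect on returns
returns = (0.001 + beta_search_true * d_log_search
           + 0.1 * d_log_volume + rng.normal(0, 0.02, weeks))

# OLS: weekly returns on the attention proxy and the volume control
X = np.column_stack([np.ones(weeks), d_log_search, d_log_volume])
beta, *_ = np.linalg.lstsq(X, returns, rcond=None)
print(f"estimated search-volume coefficient: {beta[1]:.2f}")
```

A positive estimated coefficient on the change in search volume is the pattern the abstract reports for the rate-of-change measure; distinguishing level effects (search volume vs. search clicks) would add the corresponding level series as further regressors.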
This paper develops a new governance scheme for a stable and lasting European Monetary Union (EMU). I demonstrate that the existing economic governance is based on flawed incentives, especially due to insufficient macroeconomic coordination, failures of institutional enforcement and animal spirits in financial markets. All this caused the European sovereign debt crisis in 2010. Consequently, the EMU crisis is not a conundrum at all, but rather a failure of national and supranational governance. To tackle this problem, I propose a return to flexible but compulsory rules driven by market forces. The new governance principles shall promote the compliance with and effective enforcement of rules.
This white paper builds a new financial theory of euro area sovereign bond markets under stress. The theory explains the abnormal bond pricing and increasing spreads during the recent market turmoil. We find that the strong disconnect of bond spreads from the respective bonds’ underlying fundamental values in 2010 was triggered by an increase in asymmetric information and weak reputation of government policies. Both factors cause a normal bond market to switch into a crisis mode. Finally, those markets are prone to self-fulfilling bubbles in which the economic effects are amplified by herding behaviour arising from animal spirits. Altogether, this produces contagious effects and multiple equilibria. Thus, we argue that government bond markets in a monetary union are more fragile and vulnerable to liquidity and solvency crises. Consequently, the systemic mispricing of sovereign debt creates more macroeconomic instability and bubbles in the euro area than in a single country. In other words, financial markets are partly blind to national default risks in a currency union. Therefore, the current European institutional framework puts the wrong incentives in place and needs structural changes soon. To tackle the root causes we suggest more market incentives via consistent rules, pre-emptive austerity measures in good economic times, and a resolution scheme for heavily indebted countries. In summary, our paper enhances the bond market theory and provides new insights into the recent bond market turmoil in Europe.
Applied mathematical theory for monetary-fiscal interaction in a supranational monetary union
(2014)
I utilize a differentiable dynamical system à la Lotka-Volterra and explain monetary and fiscal interaction in a supranational monetary union. The paper demonstrates an applied mathematical approach that provides useful insights about the interaction mechanisms in theoretical economics in general and a monetary union in particular. I find that a common central bank is necessary but not sufficient to tackle the new interaction problems in a supranational monetary union, such as the free-riding behaviour of fiscal policies. Moreover, I show that supranational institutions, rules or laws are essential to mitigate violations by decentralized fiscal policies.
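A classic Lotka-Volterra system of the type invoked here can be integrated numerically with a standard fourth-order Runge-Kutta scheme. The variable labels (aggregate fiscal deficit, union-wide interest rate) and all parameters are purely illustrative assumptions, not the paper's model.

```python
import numpy as np

# hypothetical Lotka-Volterra-style interaction: x = aggregate fiscal
# deficit, y = union-wide interest rate (labels are illustrative only)
a, b, c, d = 1.0, 0.5, 0.8, 0.4  # assumed interaction parameters

def f(state):
    x, y = state
    return np.array([x * (a - b * y),     # deficits grow, curbed by rates
                     -y * (c - d * x)])   # rates fall unless deficits rise

def rk4_step(state, h):
    # one classical fourth-order Runge-Kutta step
    k1 = f(state)
    k2 = f(state + 0.5 * h * k1)
    k3 = f(state + 0.5 * h * k2)
    k4 = f(state + h * k3)
    return state + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

state = np.array([1.5, 1.0])
h = 0.01
traj = [state]
for _ in range(5000):
    state = rk4_step(state, h)
    traj.append(state)
traj = np.array(traj)

# the classic LV invariant d*x - c*ln(x) + b*y - a*ln(y) is conserved
# along exact solutions, so its drift measures the integration error
H = (d * traj[:, 0] - c * np.log(traj[:, 0])
     + b * traj[:, 1] - a * np.log(traj[:, 1]))
print(f"invariant drift over the run: {abs(H[-1] - H[0]):.2e}")
```

The conserved quantity makes the characteristic closed orbits of the system explicit: both variables oscillate around the interior equilibrium (c/d, a/b) without converging, which is the kind of persistent policy interaction cycle such models are used to illustrate.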
Industry 4.0 predicts that industrial processes, technological infrastructure and all corresponding business processes will, with the help of information and communication technology (ICT), advance to integrated, ad-hoc interconnected and decentralized Cyber-Physical Production Systems (CPPS) with real-time capabilities of self-optimization and adaptability. Despite this change, the human being will remain in a dominant role, because it is not expected that the human factor, with its characteristics and capabilities, will be substituted entirely by autonomously acting technology in the foreseeable future. Machine intelligence, for instance, is limited to the selection of predefined options, while human creativity, flexibility, and the ability to learn and to improve are required to design and configure systems, processes and products. Humans have the expertise and experience to analyze, assess and solve problems, even in exceptional situations. However, the amount of purely manual tasks for shop floor workers will decrease. Their role will change from that of a manually executing to a proactive, preconceiving worker with increased responsibility. Due to the growing degree of digitalization and interconnectedness, the tasks and responsibilities of planning and design personnel will also continuously expand and become more complex. Work in versatile ad-hoc networks with advanced ICT tools and assistance systems will lead to increased requirements regarding the knowledge, capability and capacity of the respective employees. The ongoing pervasion of IT and the emergence of systems of unprecedented complexity specifically require significantly improved capabilities in analysis, abstraction, problem solving and decision making from the future workforce. Accordingly, industry is asking for graduates educated in an interdisciplinary and practice-oriented manner.
Some universities already meet these expectations, using learning factories for realistic, action-oriented classes and trainings. Lecturers are confronted with the challenge of identifying future job profiles and the correlated qualification requirements, especially regarding the conceptualization and implementation of CPPS, and of adapting and enhancing their education concepts and methods adequately. For the new, virtual world of manufacturing, a proper understanding of engineering as well as computer science is essential; Industry 4.0 implies this interdisciplinary combination. Integrated competencies for product and process planning and design, methodological competencies for systematic idea and innovation management, as well as a holistic system and interface competence will be crucial to achieve the interconnection of physical and digital processes and machines. The Vienna University of Technology and ESB Reutlingen have committed to successively integrating key aspects of Industry 4.0 into their respective learning factories. Thus, the students will act as the coordinators of the CPPS and thereby remain at the center of all learning and implementation activities.
Enterprise Architectures (EA) consist of many architecture elements, which stand in manifold relationships to each other. Architecture analysis is therefore important, and very difficult, for stakeholders. Because changing an architecture element has impacts on other elements, different stakeholders are involved. In practice, EAs are often analyzed using visualizations. This article aims at contributing to the field of visual analytics in EAM by analyzing how state-of-the-art software platforms in EAM support stakeholders with respect to providing and visualizing the “right” information for decision-making tasks. We investigate the collaborative decision-making process in an experiment with master’s students using professional EAM tools, by designing a research study and conducting it in a master’s level class.
Analysis and planning of Enterprise Architectures (EA) is a complex task for stakeholders. The change of one architecture element has impact on multiple other elements because of the manifold relationships and interactions between them. The interactive cockpit approach presented in this paper supports stakeholders in planning and analyzing EAs and in tackling the intrinsic complexity. This approach supplies a cockpit with multiple viewpoints to put relevant information side by side without losing the context, combined with interaction functionality. In this paper, we develop such a cockpit, starting with relevant use cases, describing a potential design based on well-established foundations in EA modeling, and outlining an exemplary usage scenario.
In the powder coating of veneered particle boards, the highly reactive hybrid epoxy/polyester transparent powder Drylac 530 Series from TIGER Coatings GmbH & Co. KG (Wels, Austria) was used. Curing is accelerated by a mixture of catalysts, reaching curing times of 3 min at 150 °C or 5 min at 135 °C, which allows for energy and time savings and makes the Drylac 530 Series powder suitable for the coating of temperature-sensitive substrates such as MDF and wood.
Powder coatings provide several advantages over traditional coatings: environmental friendliness, freedom of design, robustness and resistance of surfaces, the possibility of seamless all-around coating, a fast production process, and cost-effectiveness. In recent years, these benefits of powder coating technology have been transferred from metal to heat-sensitive natural fibre/wood-based substrates (especially medium density fibreboards, MDF) used for interior furniture applications. Powder coated MDF furniture parts are already gaining market share in the classic furniture applications: kitchens, bathrooms, living rooms and offices. The acceptance of this product is increasing, as reflected by excellent growth rates and an increasing customer base. Current efforts of the powder coating industry to develop new powders with higher reactivity (i.e. lower curing temperatures and shorter curing times; e.g. 120 °C/5 min) will enable the powder coating of other heat-sensitive substrates like natural fibre composites, wood plastic composites, lightweight panels and different plastics in the future. The coating could be applied and cured by the conventional powder coating process (electrostatic application, then melting and curing in an IR oven) or by a new powder coating procedure based on the in-mould coating (IMC) technique, which is already established in the plastics industry. Extra value could be added in the future by functional powder toner printing of powder coated substrates using electrophotographic printing technology, meeting the future demand for individualization of furniture part surfaces by applying functional 3D textures and patterns as well as individually created coloured images, and enabling shorter delivery times for these individualized parts. The paper describes the distinctiveness of powder coating on natural fibre/wood-based substrates, the requirements of the substrate and the coating powder.
Cyanate esters
(2014)
Cyanate ester resins are an important class of thermosetting compounds that have experienced ever-increasing interest as matrix systems for advanced polymer composite materials, which, among other applications, are especially suitable for highly demanding functions in the aerospace or microelectronics industries. Other names for cyanate ester resins are cyanate resins, cyanic esters, or triazine resins. The various types of cyanate ester monomers share the –OCN functional group, which trimerizes in the course of resin formation to yield a highly branched heterocyclic polymeric network based on the substituted triazine core structure. The basic reaction sequence leading to the typical cyanate ester polymer molecule is depicted in Figure 11.1. The curing reaction may take place with or without catalyst.
An increasing number of studies focus on how adherent cells respond, in vitro, to different properties of a material. Typical properties are the surface chemistry, topographical cues (at the nano- and micro-scale) of the surface, and the substrate stiffness. Cell response studies are important for designing new biomaterials with applications in cell culture technologies, regenerative medicine, or medical implants. However, only very few studies take the cell age, or respectively the donor age, into account. In this work, we tested two types of human vascular cells (smooth muscle and endothelial cells) from old and young donors on (a) micro-structured surfaces made of poly(dimethylsiloxane) or on (b) flat polyacrylamide hydrogels with varying stiffnesses. These experiments reveal age-dependent and cell type-dependent differences in the cell response to topography and stiffness, and may establish the basis for further studies focusing on cell age-dependent responses.
Positively charged metallic oxides prevent blood coagulation whereas negatively charged metallic oxides are thrombogenic. This study was performed to examine whether this effect extends to metallic oxide nanoparticles. Oscillation shear rheometry was used to study the effect of zinc oxide and silicon dioxide nanoparticles on thrombus formation in human whole blood. Our data show that oscillation shear rheometry is a sensitive and robust technique to analyze thrombogenicity induced by nanoparticles. Blood without previous contact with nanoparticles had a clotting time (CT) of 16.7 ± 1.0 min reaching a maximal clot strength (CS) of 16 ± 14 Pa (G') after 30 min. ZnO nanoparticles (diameter 70 nm, +37 mV zeta-potential) at a concentration of 1 mg/mL prolonged CT to 20.8 ± 3.6 min and provoked a weak clot (CS 1.5 ± 1.0 Pa). However, at a lower concentration of 100 µg/mL the ZnO particles dramatically reduced CT to 6.0 ± 0.5 min and increased CS to 171 ± 63 Pa. This procoagulant effect decreased at lower concentrations reaching the detection limit at 10 ng/mL. SiO2 nanoparticles (diameter 232 nm, −28 mV zeta-potential) at high concentrations (1 mg/mL) reduced CT (2.1 ± 0.2 min) and stimulated CS (249 ± 59 Pa). Similar to ZnO particles, this procoagulant effect reached a detection limit at 10 ng/mL. Nanoparticles in high concentrations reproduce the surface charge effects on blood coagulation previously observed with large particles or solid metal oxides. However, nanoparticles with different surface charges equally well stimulate coagulation at lower concentrations. This stimulation may be an effect which is not directly related to the surface charge.
The interaction between lipid bilayers in water has been intensively studied over the last decades. Osmotic stress was applied to evaluate the forces between two approaching lipid bilayers in aqueous solution. The force–distance relation between lipid mono- or bilayers deposited on mica sheets was also measured using a surface force apparatus (SFA). Lipid-stabilised foam films offer another possibility to study the interactions between lipid monolayers. These films can be prepared comparatively easily with very good reproducibility. Foam films usually consist of two adsorbed surfactant monolayers separated by a layer of the aqueous solution from which the film is created. Their thickness can be conveniently measured using microinterferometric techniques. Studies with foam films deliver valuable information on the interactions between lipid membranes, especially on their stability and permeability. Being inverse to black lipid membranes (BLM), foam films supply information about the properties of lipid self-organisation in bilayers. The present paper summarises results on microscopic lipid-stabilised foam films obtained by measuring their thickness and contact angle. Most of the presented results concern foam films prepared from dispersions of the zwitterionic lipid 1,2-dimyristoyl-sn-glycero-3-phosphorylcholine (DMPC) and some of its mixtures with the anionic lipid 1,2-dimyristoyl-sn-glycero-3-[phospho-rac-(1-glycerol)] (DMPG).
The strength of the long-range and short-range forces between the lipid layers is discussed. The van der Waals attractive force is calculated. The electrostatic repulsive force is estimated from experiments at different electrolyte concentrations (NaCl, CaCl2) or by modification of the electrostatic double-layer surface potential through incorporating charged lipids into the lipid monolayers. The short-range interactions are studied and modified by using small carbohydrates (fructose and sucrose), ethanol (EtOH) or dimethylsulfoxide (DMSO). Some results are compared with the structure of lipid monolayers deposited at the liquid/air interface (monolayers spread in a Langmuir trough), which are among the most studied biomembrane model systems. The comparison between the film thickness and the free energy of film formation is used to estimate the contribution of the different components of the disjoining pressure to the total interaction in the film and their dependence on the composition of the film-forming solution.
Online credit card fraud presents a significant challenge in the field of eCommerce. In 2012 alone, the total loss due to credit card fraud in the US amounted to $54 billion. Online games merchants in particular have difficulties applying standard fraud detection algorithms to achieve timely and accurate detection. This paper describes the special constraints of this domain and highlights the reasons why conventional algorithms are not very effective in dealing with this problem. Our suggested solution originates from the field of feature construction joined with the field of temporal sequence data mining. We present feature construction techniques that are able to create discriminative features based on a sequence of transactions and to incorporate time into the classification process. In addition, a framework is presented that allows for an automated and adaptive change of features in case the underlying pattern changes.
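The idea of incorporating time into the classification process can be illustrated with a minimal sketch: deriving velocity-style features from a timestamped transaction sequence. The field names, window size, and feature set below are illustrative assumptions, not the paper's actual features.

```python
# Minimal sketch of temporal feature construction on a transaction
# sequence. Field names, window size, and features are assumptions.
from datetime import datetime, timedelta

def velocity_features(transactions, now, window=timedelta(hours=1)):
    """Count and total amount of transactions inside a recent time window."""
    recent = [t for t in transactions if now - t["time"] <= window]
    return {"tx_count_1h": len(recent),
            "tx_amount_1h": sum(t["amount"] for t in recent)}

txs = [
    {"time": datetime(2012, 5, 1, 11, 30), "amount": 9.99},
    {"time": datetime(2012, 5, 1, 12, 10), "amount": 49.99},
    {"time": datetime(2012, 5, 1, 12, 40), "amount": 49.99},
]
feats = velocity_features(txs, now=datetime(2012, 5, 1, 12, 45))
print(feats)  # two of the three transactions fall inside the last hour
```

Such window-based features turn a variable-length sequence into a fixed-length vector that standard classifiers can consume, while an adaptive framework could swap the window sizes as fraud patterns drift.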
Vehicles have so far been improved in terms of energy efficiency and safety mainly by optimising the engine and the power train. However, there are further opportunities to increase energy efficiency and safety by adapting the individual driving behaviour to the given driving situation. In this paper, an improved rule match algorithm is introduced, which is used in the expert system of a human-centred driving system. The goal of the driving system is to optimise the driving behaviour in terms of energy efficiency and safety by giving recommendations to the driver. The improved rule match algorithm checks the incoming information against the driving rules to detect any violation of a driving rule. The needed information is obtained by monitoring the driver, the current driving situation as well as the car, using in-vehicle sensors and serial-bus systems. On the basis of the detected rule violations, the expert system creates individual recommendations in terms of energy efficiency and safety, which help to eliminate bad driving habits while taking the driver's needs into account.
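The basic rule-match step can be sketched as follows. The rule structure, thresholds, and recommendations are hypothetical placeholders, not the authors' actual expert system:

```python
# Minimal sketch of a rule-match step: check a sensor reading against
# driving rules and collect the violated ones. Rules and thresholds
# are invented examples, not the paper's rule base.

def match_rules(reading, rules):
    """Return the rules violated by the current sensor reading."""
    return [rule for rule in rules if not rule["check"](reading)]

# Hypothetical driving rules: each pairs a condition with a recommendation.
rules = [
    {"name": "shift_up_early",
     "check": lambda r: not (r["rpm"] > 2500 and r["gear"] < 5),
     "advice": "Shift up to save fuel."},
    {"name": "keep_distance",
     "check": lambda r: r["gap_s"] >= 1.8,
     "advice": "Increase the distance to the vehicle ahead."},
]

reading = {"rpm": 3100, "gear": 3, "gap_s": 1.2}
violations = match_rules(reading, rules)
for rule in violations:
    print(rule["advice"])
```

In a real system the reading would be fed continuously from in-vehicle sensors and bus data, and the recommendations would be filtered against the driver's individual needs before being presented.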
The recent years, and especially the Internet, have changed the ways in which data is stored. It is now common to store data in the form of transactions, together with their creation time-stamps. These transactions can often be attributed to logical units, e.g., all transactions that belong to one customer. These groups, which we refer to as data sequences, have a more complex structure than tuple-based data. This makes it more difficult to find discriminatory patterns for classification purposes. However, the complex structure potentially enables us to track behaviour and its change over the course of time. This is particularly interesting in the e-commerce area, in which the classification of a sequence of customer actions is still a challenging task for data miners. However, before standard algorithms such as decision trees, neural nets, naive Bayes or Bayesian belief networks can be applied to sequential data, preparations are required in order to capture the information stored within the sequences. Therefore, this work presents a systematic approach on how to reveal sequence patterns in data and how to construct powerful features out of the primitive sequence attributes. This is achieved by sequence aggregation and the incorporation of the time dimension into the feature construction step. The proposed algorithm is described in detail and applied to a real-life data set, which demonstrates its ability to boost the classification performance of well-known data mining algorithms for binary classification tasks.
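The sequence aggregation step can be sketched in a few lines: a customer's variable-length event sequence is collapsed into fixed-length features that tuple-based classifiers can use. The chosen aggregates below are illustrative assumptions, not the paper's actual feature set.

```python
# Minimal sketch of sequence aggregation: turn one logical unit's
# timestamped events into fixed-length features. The aggregates are
# invented examples of time-aware features.

def aggregate_sequence(timestamps):
    """Summarise a sorted event-time sequence as a fixed feature dict."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return {
        "n_events": len(timestamps),
        "span": timestamps[-1] - timestamps[0],
        "mean_gap": sum(gaps) / len(gaps) if gaps else 0.0,
        "min_gap": min(gaps) if gaps else 0.0,
    }

# Event times in hours since first contact, for one customer.
features = aggregate_sequence([0.0, 2.0, 3.0, 7.0])
print(features)
```

The resulting dictionary is a flat tuple-style record, so standard algorithms such as decision trees or naive Bayes can be trained on it directly.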
DMOS transistors in integrated power technologies are often subject to significant self-heating and thus high temperatures, which can lead to device failure and reduced lifetime. Hence, it must be ensured that the device temperature does not rise too much. For this, the influence of the on-chip metallization must be taken into account because of the good thermal conductivity and significant thermal capacitance of the metal layers on top of the active DMOS area. In this paper, test structures with different metal layer and via configurations are presented that can be used to determine the influence of the on-chip metallization on the temperature caused by self-heating. It is shown how accurate results can be obtained, allowing even the influence of small changes in the metallization to be determined. The measurement results are discussed and explained, showing how on-chip metallization helps to lower the device temperature. This is further supported by numerical simulations. The obtained insights are valuable for technology optimization, but are also useful for the calibration of temperature simulators.
Several diseases occur due to asbestos exposure. Predicted asbestos-related mortality and morbidity will continue to increase because of the long latency period. Currently, the methods to investigate asbestos-related disease are mostly invasive. Therefore, the aim of the present paper was to investigate whether signals in human breath can be correlated to asbestos-related lung diseases, using a multi-capillary column (MCC) connected to an ion mobility spectrometer (IMS) as a non-invasive method. Breath samples of 10 mL were taken from 25 patients suffering from asbestos-related diseases. This group includes patients with asbestos-related pleural thickening with and without pulmonary fibrosis. Twelve healthy persons constitute the control group, and their breath samples are compared with those of the BK4103 patients. In total, 83 peaks are found in the IMS chromatogram. A discrimination was possible with p-values <0.001 (99.9 %) for two peaks, <0.01 (99 %) for 5 peaks and <0.05 (95 %) for 17 peaks. The most discriminating peaks, alpha-pinene and 4-ethyltoluene, were identified among some others with lower p-values. The corresponding box-and-whisker plots comparing both groups are presented. In addition, a decision tree including all peaks was created that differentiates, via alpha-pinene, between the BK4103 (pleural plaques) group and the control group. The sensitivity was calculated as 96 %, the specificity as 50 %, and the positive and negative predictive values as 80 % and 86 %. Ion mobility spectrometry was introduced as a non-invasive method to separate the asbestos-related and healthy groups. Naturally, the findings need further confirmation on larger population groups, but they also encourage further investigations.
Children undergoing systemic chemotherapy often suffer from severe immunosuppression, usually associated with severe neutropenia (neutrophils < 0.5 × 10⁹/l). Clinical courses during those periods range from asymptomatic to septic general conditions. The development of septic symptoms can be very fast and life-threatening. Swift detection of risk factors in those patients is therefore needed. So far, no early, rapid and reliable marker or tool exists. Ion mobility spectrometry coupled with a multi-capillary column (IMS-MCC) can analyze more than 600 volatile components from exhaled air within a few minutes and is hence a potential rapid detection tool. As a proof of concept, we measured the exhaled breath of 11 patients with neutropenia and 10 healthy controls, ranging from 3 to 18 years of age at the time of measurement. Breath samples of ten milliliters were taken at the outpatient clinic and analyzed with an onsite IMS-MCC (BreathDiscovery, B&S Analytik, Dortmund, Germany). The dead-space volume was adapted in two groups (small 250 ml, large 500 ml). In total, 59 differing peaks were measured. Eleven were significantly different (p ≤ 0.05), three of which were highly significant (p ≤ 0.01) in Mann-Whitney rank-sum testing. The corresponding analytes used in the decision tree are 2-propanol, D-limonene and acetone. The analytes with the lowest rank sum identified are 2-hexanone, iso-propylamine and 1-butanol. Finally, we were able to derive a three-step decision tree, which correctly discerns all of the 21 samples except one from each group. Sensitivity was 90 % and specificity was 91 %. Naturally, these findings need further confirmation in a bigger population. Our pilot study shows that ion mobility spectrometry coupled with a multi-capillary column is a feasible rapid diagnostic tool in the setting of a pediatric oncology outpatient clinic for patients 3 years and older.
Our first results furthermore encourage additional analysis as to whether patients at risk for septic events during immunosuppression can be diagnosed in advance by rapidly assessing risk factors such as neutropenia in exhaled breath.
Background: Conventional methods for lung cancer detection, including computed tomography (CT) and bronchoscopy, are expensive and invasive. Thus, there is still a need for an optimal lung cancer detection technique. Methods: The exhaled breath of 50 patients with lung cancer histologically proven by bronchoscopic biopsy samples (32 adenocarcinomas, 10 squamous cell carcinomas, 8 small cell carcinomas) was analyzed using ion mobility spectrometry (IMS) and compared with that of 39 healthy volunteers. As a secondary assessment, we compared adenocarcinoma patients with and without epidermal growth factor receptor (EGFR) mutation. Results: A decision tree algorithm could separate patients with lung cancer, including adenocarcinoma, squamous cell carcinoma and small cell carcinoma. One hundred fifteen separated volatile organic compound (VOC) peaks were analyzed. Peak-2, noted as n-Dodecane in the IMS database, was able to separate values with a sensitivity of 70.0% and a specificity of 89.7%. Incorporating a decision tree algorithm starting with n-Dodecane, a sensitivity of 76% and a specificity of 100% were achieved. Comparing VOC peaks between adenocarcinoma and healthy subjects, n-Dodecane was able to separate values with a sensitivity of 81.3% and a specificity of 89.7%. The 14 patients positive for EGFR mutation displayed significantly higher n-Dodecane levels than the 14 patients negative for EGFR mutation (p<0.01), with a sensitivity of 85.7% and a specificity of 78.6%. Conclusion: In this prospective study, VOC peak patterns using a decision tree algorithm were useful in the detection of lung cancer. Moreover, n-Dodecane analysis from adenocarcinoma patients might be useful to discriminate the EGFR mutation.
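Sensitivity and specificity follow directly from confusion-matrix counts. As a minimal sketch, the counts below are reconstructed from the reported cohort sizes (50 patients, 39 controls) and the rates stated for the single n-Dodecane threshold; they are illustrative assumptions, not published raw data.

```python
# Sensitivity/specificity from confusion-matrix counts. The counts are
# reconstructed for illustration only: 35/50 patients flagged (70.0%),
# 35/39 controls correctly cleared (89.7%).

def sens_spec(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)   # true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    return sensitivity, specificity

sens, spec = sens_spec(tp=35, fn=15, tn=35, fp=4)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")
```

This shows why the decision tree's 100% specificity is a stronger claim than its 76% sensitivity: it implies zero false positives among the 39 volunteers.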
Ion mobility spectrometry coupled to multi-capillary columns (MCC/IMS) combines highly sensitive spectrometry with a rapid separation technique. MCC/IMS is widely used for biomedical breath analysis. The identification of molecules in such a complex sample necessitates a reference database. The existing IMS reference databases are still in their infancy and do not allow all analytes to be identified. With a gas chromatograph coupled to a mass selective detector (GC/MSD) set up in parallel to an MCC/IMS instrument, we can increase the accuracy of automatic analyte identification. To overcome the time-consuming manual evaluation and comparison of the results of both devices, we developed the software tool MIMA (MS-IMS-Mapper), which can computationally generate analyte layers for MCC/IMS spectra by using the corresponding GC/MSD data. We demonstrate the power of our method by successfully identifying the analytes of a seven-component mixture. In conclusion, the main contribution of MIMA is a fast and easy computational method for assigning analyte names to yet unassigned signals in MCC/IMS data. We believe that this will greatly impact modern MCC/IMS-based biomarker research by 'giving a name' to previously detected disease-specific molecules.
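The core mapping idea can be sketched as follows: assign GC/MSD analyte names to MCC/IMS peaks by nearest retention time within a tolerance. The matching rule, tolerance, and data shapes are assumptions for illustration, not MIMA's actual algorithm.

```python
# Minimal sketch of peak-to-analyte mapping by retention time.
# Tolerance and matching rule are assumptions, not MIMA's method.

def map_peaks(ims_peaks, gc_analytes, tol=5.0):
    """ims_peaks: (peak_id, retention_s); gc_analytes: (name, retention_s)."""
    mapping = {}
    for peak_id, rt in ims_peaks:
        name, best = None, tol
        for analyte, rt_gc in gc_analytes:
            if abs(rt - rt_gc) <= best:
                name, best = analyte, abs(rt - rt_gc)
        mapping[peak_id] = name  # None if no analyte is close enough
    return mapping

ims = [("P1", 61.0), ("P2", 118.5), ("P3", 240.0)]
gc = [("acetone", 60.0), ("2-propanol", 120.0)]
mapping = map_peaks(ims, gc)
print(mapping)
```

Peaks with no sufficiently close GC/MSD retention time remain unassigned, mirroring the situation of 'yet unnamed' signals in MCC/IMS data.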
Online measurement of drug concentrations in a patient's breath is a promising approach for individualized dosage. A direct transfer from breath to blood concentrations is not possible: measured exhaled concentrations follow the blood concentration with a delay in non-steady-state situations. Therefore, it is necessary to integrate the breath concentration into a pharmacological model. Two different approaches for pharmacokinetic modelling are presented. Usually a 3-compartment model is used for pharmacokinetic calculations of blood concentrations. In the first approach, this 3-compartment model is extended with a 2-compartment model consisting of the first compartment of the 3-compartment model and a new lung compartment. The second approach is to calculate a time delay of changes in the concentration of the first compartment to describe the lung concentration. As an example, both approaches are applied to the modelling of exhaled propofol. Based on a time series of exhaled propofol measurements taken with an ion mobility spectrometer every minute for 346 min, a correlation of the calculated plasma concentration and the breath concentration was used for modelling, yielding an interdependency of R² = 0.99. With the time-delay modelling approach, the new compartment coefficient ke0,lung was calculated as 0.27 min⁻¹ with R² = 0.96. The described models are not limited to propofol; they could be used for any kind of drug that is measurable in a patient's breath.
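The time-delay approach can be sketched as a first-order lag: the lung (breath) concentration chases the central-compartment concentration with rate constant ke0,lung (0.27 min⁻¹ in the paper). The plasma profile below is a made-up constant input, just to show the lag behaviour; the Euler step is an illustrative numerical choice.

```python
# Sketch of the time-delay lung compartment: Euler integration of
# dC_lung/dt = ke0 * (C_plasma - C_lung), with ke0 = 0.27 / min from
# the paper. The plasma profile is a made-up constant input.

def lung_concentration(plasma, ke0=0.27, dt=1.0):
    """Return the lagged lung concentration for a plasma time series."""
    c_lung, out = 0.0, []
    for c_p in plasma:                      # one sample per minute
        c_lung += ke0 * (c_p - c_lung) * dt
        out.append(c_lung)
    return out

plasma = [1.0] * 20                          # constant plasma level, 20 min
breath = lung_concentration(plasma)
print(round(breath[0], 3), round(breath[-1], 3))  # rises towards plasma level
```

This is exactly why a direct breath-to-blood transfer fails in non-steady-state situations: early breath samples underestimate the plasma level until the lag has decayed.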
Model-guided Therapy and Surgical Workflow Systems are two interrelated research fields, which have been developed separately in recent years. To make full use of both technologies, it is necessary to integrate them and connect them to Hospital Information Systems. We propose a framework for the integration of Model-guided Therapy into Hospital Information Systems based on the Electronic Medical Record, and a task-based Workflow Management System, which is suitable for clinical end users. Two prototypes - one based on the Business Process Modeling Language, one based on the Scrum board - are presented. From the experience with these prototypes, we developed a novel personalized visualization system for Surgical Workflows and Model-guided Therapy. Key challenges for further development are automated situation detection and a common communication infrastructure.
An operation room is a stressful work environment. Nevertheless, all involved persons have to work safely, as there is no room for mistakes. To ensure a high level of concentration and seamless interaction, all involved persons have to know their own tasks and the tasks of their colleagues. The entire team must work synchronously at all times. However, the operation room (OR) is a noisy environment and the actors have to keep their focus on their work. To optimize the overall workflow, a task manager supporting the team was developed. Each actor is equipped with a client terminal showing a summary of their own tasks. Moreover, a big screen displays all tasks of all actors. The architecture is a distributed system based on a communication framework that supports the interaction of all clients with the task manager. A prototype of the task manager and several clients have been developed and implemented. The system represents a proof of concept for further development. This paper describes the concept of the task manager.