Transaction processing is of growing importance for mobile computing. Booking tickets, flight reservations, banking, ePayment, and booking holiday arrangements are just a few examples of mobile transactions. Because of temporarily disconnected situations, synchronisation and consistent transaction processing are key issues. Serializability is too strong a correctness criterion when the semantics of a transaction are known. We introduce a transaction model that allows higher concurrency for a certain class of transactions defined by their semantics. The transaction results are "escrow serializable" and the synchronisation mechanism is non-blocking. An experimental implementation showed higher concurrency and transaction throughput, and lower resource usage, than common locking or optimistic protocols.
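The escrow idea behind such semantics-aware transaction models can be sketched in a few lines. The class, method names, and bounds below are illustrative assumptions, not the paper's implementation: instead of locking a numeric value, each transaction reserves (escrows) a quantity, and a new reservation is admitted only if the worst-case outcome of all pending reservations stays within the allowed bounds.

```python
class EscrowCounter:
    """Escrowed numeric resource (e.g. seats, account balance).

    Concurrent transactions are admitted without blocking as long as
    the worst-case committed value stays within [low, high].
    """

    def __init__(self, value, low=0, high=float("inf")):
        self.value = value            # last committed value
        self.low, self.high = low, high
        self.pending = {}             # txn id -> escrowed delta

    def try_escrow(self, txn, delta):
        # Worst-case value if all pending decrements commit ...
        worst_low = self.value + sum(d for d in self.pending.values() if d < 0) + min(delta, 0)
        # ... and if all pending increments commit.
        worst_high = self.value + sum(d for d in self.pending.values() if d > 0) + max(delta, 0)
        if worst_low < self.low or worst_high > self.high:
            return False              # would risk violating the bounds
        self.pending[txn] = self.pending.get(txn, 0) + delta
        return True                   # non-blocking: no lock is held

    def commit(self, txn):
        self.value += self.pending.pop(txn, 0)

    def abort(self, txn):
        self.pending.pop(txn, 0)


seats = EscrowCounter(10, low=0)
ok1 = seats.try_escrow("t1", -4)   # admitted
ok2 = seats.try_escrow("t2", -4)   # admitted concurrently
ok3 = seats.try_escrow("t3", -4)   # rejected: could drop below 0
seats.commit("t1")
seats.abort("t2")
```

Two bookings of 4 seats each can proceed concurrently, a third is refused up front rather than blocked, and an abort simply releases the reservation.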
Relationship Marketing (RM) presumes trust to be an important antecedent of the performance of interfirm relationships. Current research is dominated by an interpersonal perspective, in which trust chiefly emerges as a result of interpersonal relationships. But multiple risks arise if customer trust rests solely on elements inextricably linked to single representatives. Hence, this paper evaluates the impact of organizational capabilities and the moderating role of customer preferences on the trust creation process. The framework presented here is tested cross-industry on 220 customers of IT solutions. The results offer significant insight into the effectiveness of individual and organizational RM strategies.
Modern web-based applications are often built as multi-tier architectures using persistence middleware. Middleware technology providers recommend the use of Optimistic Concurrency Control (OCC) mechanisms to avoid the risk of blocked resources. However, most vendors of relational database management systems implement only locking schemes for concurrency control. As a consequence, a form of OCC has to be implemented on the client or middleware side.
A simple Row Version Verification (RVV) mechanism has been proposed to implement OCC on the client side. For performance reasons, the middleware uses buffers (caches) of its own to avoid network traffic and possible disk I/O. This caching, however, complicates the use of RVV because the data in the middleware cache may be stale (outdated). We investigate various data access technologies, including the new Java Persistence API (JPA) and Microsoft's LINQ, for their ability to use the RVV programming discipline.
The use of persistence middleware that tries to relieve the programmer of low-level transaction programming turns out to even complicate the situation in some cases. Programmed examples show how SQL data access patterns can be used to solve the problem.
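The RVV discipline itself can be illustrated with plain SQL. The schema and helper below are an illustrative sketch (using Python's sqlite3 for brevity, not the Java/LINQ stacks examined in the paper): every row carries a version counter, and an update succeeds only if the version read earlier is still current.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE account (id INTEGER PRIMARY KEY, balance INTEGER, rv INTEGER)")
conn.execute("INSERT INTO account VALUES (1, 100, 0)")

def read(conn, acc_id):
    # Fetch the value together with its row version.
    return conn.execute("SELECT balance, rv FROM account WHERE id = ?", (acc_id,)).fetchone()

def rvv_update(conn, acc_id, new_balance, rv_seen):
    # The WHERE clause on rv detects lost updates: zero affected rows
    # means another transaction committed in between, so the caller
    # must re-read and retry.
    cur = conn.execute(
        "UPDATE account SET balance = ?, rv = rv + 1 WHERE id = ? AND rv = ?",
        (new_balance, acc_id, rv_seen))
    conn.commit()
    return cur.rowcount == 1

balance, rv = read(conn, 1)
ok = rvv_update(conn, 1, balance - 30, rv)       # succeeds, bumps rv
stale = rvv_update(conn, 1, balance - 50, rv)    # fails: rv is now stale
```

The stale-cache problem discussed above arises exactly here: if the middleware serves `read` from a cache, `rv_seen` may already be outdated, so the verification must always run against the database itself.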
In this presentation the audience will be: (a) introduced to the aims and objectives of the DBTechNet initiative, (b) briefed on the DBTech EXT virtual laboratory workshops (VLW), i.e. the educational and training (E&T) content which is freely available over the internet and includes vendor-neutral hands-on laboratory training sessions on key database technology topics, and (c) informed about some of the practical problems encountered and the way they have been addressed. Last but not least, the audience will be invited to consider incorporating some or all of the DBTech EXT VLW content into their higher education (HE), vocational education and training (VET), and/or lifelong learning course curricula. This comes at no cost and with no commitment on the part of the teacher/trainer, who is only expected to provide feedback on the pedagogical value and the quality of the E&T content received and used.
The Informatics Inside conference takes place for the third time this year. With its theme "Grenzen überwinden – Virtualität erweitert Realität" (overcoming boundaries: virtuality augments reality), the event addresses a topical focus that attracts many interested parties from industry, science, and research. The conference has evolved from an event for the master's students of the Media and Communication Informatics programme into an open student conference. To raise quality further, a two-stage review process for the contributions to these proceedings was introduced in parallel.
The Third International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2011), held on January 23-27, 2011 in St. Maarten, The Netherlands Antilles, continued a series of international events covering a large spectrum of topics related to advances in database fundamentals, the evolving relations between databases and other domains, database technologies and content processing, as well as the specifics of databases in application domains. Advances in different technologies and domains related to databases triggered substantial improvements in content processing, information indexing, and data, process, and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of XML adoption. High-speed communications and computations, large storage capacities, and load balancing for distributed database access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms, and advanced indexing methods. Evolution in e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems puts pressure on database communities to push the 'de facto' methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology. We take this opportunity to thank all the members of the DBKDA 2011 Technical Program Committee as well as the numerous reviewers. The creation of such a broad and high-quality conference program would not have been possible without their involvement. We also kindly thank all the authors who dedicated much of their time and effort to contributing to DBKDA 2011. We truly believe that, thanks to all these efforts, the final conference program consists of top-quality contributions. This event could also not have been a reality without the support of many individuals, organizations, and sponsors.
We are grateful to the members of the DBKDA 2011 organizing committee for their help in handling the logistics and for their work to make this professional meeting a success. We hope that DBKDA 2011 was a successful international forum for the exchange of ideas and results between academia and industry and for the promotion of progress in database research. We are convinced that the participants found the event useful and communications very open. The beautiful places of St. Maarten surely provided a pleasant environment during the conference and we hope you had a chance to visit the surroundings.
This work presents a disconnected transaction model able to cope with the increased complexity of long-living, hierarchically structured, and disconnected transactions. We combine an Open and Closed Nested Transaction Model with Optimistic Concurrency Control and interrelate flat transactions with the aforementioned complex structure. Despite temporary inconsistencies during a transaction's execution, our model ensures consistency.
Suppliers need to improve their relational capabilities if they are to enhance customer trust. Debate about such capabilities is dominated by an interpersonal approach. This paper provides novel marketing options by expanding insights into alternative types of relational capabilities. Furthermore, the moderating role of customer preferences on the effectiveness of relational capabilities is evaluated.
The Fourth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2012), held between February 29th and March 5th, 2012 in Saint Gilles, Reunion Island, continued a series of international events covering a large spectrum of topics related to advances in database fundamentals, the evolving relations between databases and other domains, database technologies and content processing, as well as the specifics of databases in application domains. Advances in different technologies and domains related to databases triggered substantial improvements in content processing, information indexing, and data, process, and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of XML adoption. High-speed communications and computations, large storage capacities, and load balancing for distributed database access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms, and advanced indexing methods. Evolution in e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems puts pressure on database communities to push the 'de facto' methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology. We take this opportunity to warmly thank all the members of the DBKDA 2012 Technical Program Committee, as well as the numerous reviewers. The creation of such a broad and high-quality conference program would not have been possible without their involvement. We also kindly thank all the authors who dedicated much of their time and effort to contributing to DBKDA 2012. We truly believe that, thanks to all these efforts, the final conference program consisted of top-quality contributions. Also, this event could not have been a reality without the support of many individuals, organizations, and sponsors.
We are grateful to the members of the DBKDA 2012 organizing committee for their help in handling the logistics and for their work to make this professional meeting a success. We hope that DBKDA 2012 was a successful international forum for the exchange of ideas and results between academia and industry and for the promotion of progress in the fields of databases, knowledge, and data applications. We are convinced that the participants found the event useful and communications very open. We also hope the attendees enjoyed the charm of Saint Gilles, Reunion Island.
Turning complainers into fans: towards a framework for customer services in social media channels
(2012)
In recent years, marketing scholars have invested heavily in exploring the role of social media in marketing theory and practice. One valuable strategy for using social media in marketing communication is to provide customer services in applications like Facebook or Twitter. This paper evaluates a) the concept of perceived service quality in different service channels and b) the impact customer service strategies have on customer loyalty, word of mouth communication, and cross-sell preferences. The framework presented here is tested cross-channel against data collected from the customer service department of a large telecommunication provider. The results elucidate the effectiveness of customer service strategies in different channels.
This article presents a newly developed tool for dimensioning bond wires in ASIC design. Taking all influencing factors into account allows a bond configuration that is optimised compared with manual calculations. On the one hand, this safeguards against degradation effects up to and including burn-out and thus guarantees reliability over the entire lifetime; on the other hand, it avoids the over-dimensioning that otherwise results from reliability considerations.
The tool can calculate all parameters relevant to bond wire dimensioning. Depending on the task at hand, the current-carrying capacity for continuous or pulsed current loads, critical temperatures, or the maximum bond wire length can be computed as output quantities. Thanks to this flexibility and its user-friendly integration into an industrial development environment, the "Bond-Rechner" (bond calculator) can be used throughout the design flow and provides valuable assistance, from first estimates in early design phases to final verification.
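As a rough illustration of the kind of relationship such a tool evaluates, the classical Preece law estimates the fusing current of a round wire from its diameter. The constant below is a commonly quoted textbook value for copper and is only an assumption here; the article does not state which physical models the tool actually uses, and a production tool would also account for pulse duration, wire length, and ambient temperature.

```python
def preece_fusing_current(diameter_mm, k=80.0):
    """Preece's law: I_fuse = k * d**1.5 (I in amperes, d in millimetres).

    k ~ 80 is a textbook constant for copper; gold and aluminium bond
    wires have different constants. Illustrative first-order estimate
    only, not the model implemented in the Bond-Rechner tool.
    """
    return k * diameter_mm ** 1.5

# A thicker wire sustains a higher current before fusing.
i_25um = preece_fusing_current(0.025)   # 25 um bond wire
i_50um = preece_fusing_current(0.050)   # 50 um bond wire
```

The 3/2-power dependence on diameter is why doubling the wire diameter roughly triples the fusing current, which is the sort of trade-off the tool lets a designer explore without over-dimensioning.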
A practical means of increasing the degree of automation in analog IC design is the use of parameterized cells. These so-called pCells are used to automatically generate deterministic layouts, typically for individual devices such as transistors or diodes. This contribution shows the potential of an extended pCell concept with which deterministic layouts as well as schematics can be generated automatically for entire circuit modules. As an example, such a module pCell for analog current mirrors is described, which implements not only the sizing of the individual transistors but also different transistor types, arbitrary mirror ratios, and even several topologies, along with further degrees of freedom. The resulting flexibility makes it possible to cover the numerous circuit variants in the analog domain that otherwise often present hurdles for automation approaches.
The motto of this year's conference is "Reality++: Tomorrow comes today!". Under this almost visionary theme, the results of the various in-depth student projects of the past months are presented. The program is completed by contributions from experts from research and industry.
Multi-dimensional patient data, such as time-varying volume data, data from different imaging modalities, surface segmentations, etc., are of growing importance in clinical routine. For many use cases, it is essential to replicate a certain visualization of a data set created on one machine on a different computer using different software tools. Until now, no standardized methodology for this consistent presentation has existed. We propose an extension of Digital Imaging and Communications in Medicine (DICOM) called "Multi-dimensional Presentation State" and outline the scope and first results of the standardization process.
Energy efficiency and safety have become important factors for car manufacturers. Cars have therefore been optimised with regard to energy consumption and safety, for example by optimising the power train or the engine. Besides optimising the car itself, energy efficiency and safety can also be increased by adapting individual driving behaviour to the current driving situation. This paper introduces a driving system, currently in development, whose goal is to optimise driving behaviour in terms of energy efficiency and safety by giving recommendations to the driver. To create a recommendation, the driving system monitors the driver, the current driving situation, and the car itself using in-vehicle sensors and serial-bus systems. On the basis of the acquired data, the driving system will give individual energy-efficiency and safety recommendations in real time. This will allow bad driving habits to be eliminated while considering the driver's needs.
Telemedicine is becoming an increasingly important approach to diagnosing, treating, or preventing diseases. However, the use of Information and Communication Technologies in healthcare results in a considerable amount of data that must be transmitted efficiently and securely. Many manufacturers provide telemedicine platforms without regard to interoperability, mobility, and collaboration. This paper describes a collaborative mobile telemonitoring platform that can use the IEEE 11073 and HL7 communication standards or adapt proprietary protocols. The proposed platform also covers security and modularity aspects. Furthermore, this work introduces an Android-based prototype implementation.
This paper presents a new European initiative to support the sustainable empowerment of the ageing society. Empowerment in this context represents the capability to lead a self-determined, autonomous, and healthy life. The paper justifies the need for such an initiative and highlights the role that telemedicine and ambient assisted living can play in this environment.
The workshop aims to discuss leading-edge contributions to the interdisciplinary research area of ambient intelligence (AmI) applied to the domains of telemedicine and driving assistance. AmI refers to human-centered environments equipped with sensors. The development of AmI in the workshop's two application domains shares several commonalities: the extensive use of networked devices and sensors, the design of artificial-intelligence algorithms for diagnosis, including recommendation systems and qualitative reasoning, and the application of mobile and wireless communication in their distributed systems. Along with presenting common aspects of ambient intelligence, a further goal of the workshop is to stimulate synergies between the two application domains and to present examples. The telemedicine domain can benefit from methodologies for designing complex devices, real-time-conformant system design, and the audiovisual and computer-vision system design used in automotive driving assistance. Conversely, the automotive domain can benefit from the user-centric view, biometric sensor data design, and multi-user databases for aggregation and diagnosis using big data, as used in telemedicine. The German government supports these research lines in its Hightech-Strategie under the domains "Health and Nutrition" and "Climate and Energy". In Spain, the corresponding programme is the "Spanish Program for R&D Challenge-Oriented Society – Challenge in safe, efficient and clean energy & Challenge in sustainable, smart and integrated transport". Scientific contributions to the event are peer-reviewed by a suitable program committee with members from Germany and Spain. The same committee has been serving the JARCA workshop (Jornadas sobre Sistemas cualitativos y sus Aplicaciones en Diagnosis, Robótica e Inteligencia Ambiental – Conference on Qualitative Systems and their Applications in Diagnosis, Robotics and Ambient Intelligence) for 15 years.
This workshop is sponsored by the German Academic Exchange Service (DAAD) under contract number 57070010.
The Fifth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2013), held between January 27th and February 1st, 2013 in Seville, Spain, continued a series of international events covering a large spectrum of topics related to advances in database fundamentals, the evolving relations between databases and other domains, database technologies and content processing, as well as the specifics of databases in application domains. Advances in different technologies and domains related to databases triggered substantial improvements in content processing, information indexing, and data, process, and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of XML adoption. High-speed communications and computations, large storage capacities, and load balancing for distributed database access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms, and advanced indexing methods. Evolution in e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems puts pressure on database communities to push the 'de facto' methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology. We take this opportunity to warmly thank all the members of the DBKDA 2013 Technical Program Committee, as well as the numerous reviewers. The creation of such a high-quality conference program would not have been possible without their involvement. We also kindly thank all the authors who dedicated much of their time and effort to contributing to DBKDA 2013. We truly believe that, thanks to all these efforts, the final conference program consisted of top-quality contributions. Also, this event could not have been a reality without the support of many individuals, organizations, and sponsors.
We are grateful to the members of the DBKDA 2013 organizing committee for their help in handling the logistics and for their work to make this professional meeting a success. We hope that DBKDA 2013 was a successful international forum for the exchange of ideas and results between academia and industry and for the promotion of progress in the fields of databases, knowledge and data applications. We are convinced that the participants found the event useful and communications very open. We also hope the attendees enjoyed the charm of Seville, Spain.
New storage technologies, such as Flash and Non-Volatile Memories, with fundamentally different properties are appearing. Leveraging their performance and endurance requires a redesign of existing architectures and algorithms in modern high-performance databases. Multi-Version Concurrency Control (MVCC) approaches in database systems maintain multiple timestamped versions of a tuple. Once a transaction reads a tuple, the database system tracks and returns the respective version, eliminating lock requests. Hence, under MVCC reads are never blocked, which leverages well the excellent read performance (high throughput, low latency) of new storage technologies. Upon tuple updates, however, established implementations of MVCC approaches (such as Snapshot Isolation) lead to multiple random writes, caused by (i) creation of the new version and (ii) in-place invalidation of the old one, thus generating suboptimal access patterns for the new storage media. The combination of an append-based storage manager operating at tuple granularity with snapshot isolation addresses asymmetry and in-place updates. In this paper, we highlight novel aspects of log-based storage in multi-version database systems on new storage media. We claim that multi-versioning and append-based storage can be used to effectively address asymmetry and endurance. We identify multi-versioning as the approach to address data placement in complex memory hierarchies. We focus on version handling, (physical) version placement, and the compression and collocation of tuple versions on Flash storage and in complex memory hierarchies, and we identify possible read- and cache-related optimizations.
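The append-only multi-version idea can be sketched in a few lines. The structure below is an illustrative toy, not the storage manager described in the paper: new versions are only ever appended (no in-place invalidation, hence no random writes), and a reader's snapshot timestamp selects the visible version without taking any lock.

```python
class AppendOnlyVersionStore:
    """Toy MVCC store: a single append-only log of (key, begin_ts, value)."""

    def __init__(self):
        self.log = []          # versions appended in commit-timestamp order
        self.ts = 0            # logical clock

    def write(self, key, value):
        # Append a new version; the old one is never touched in place.
        self.ts += 1
        self.log.append((key, self.ts, value))
        return self.ts

    def read(self, key, snapshot_ts):
        # Return the newest version of `key` that began at or before the
        # reader's snapshot; never blocks and never takes a lock.
        for k, begin_ts, value in reversed(self.log):
            if k == key and begin_ts <= snapshot_ts:
                return value
        return None


store = AppendOnlyVersionStore()
t1 = store.write("x", "v1")
t2 = store.write("x", "v2")
old = store.read("x", t1)    # snapshot taken before the second write
new = store.read("x", t2)
```

A reader whose snapshot predates the second write still sees "v1", while a later reader sees "v2"; the write path is purely sequential, which is exactly the access pattern Flash and NVM favour.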
When forecasting sales figures, not only the sales history but also the future price of a product influences the sales quantity. At first sight, multivariate time series seem to be the appropriate model for this task. Nonetheless, in real life history is not always repeatable; in the case of sales history, there is only one price for a product at a given time. This complicates the design of a multivariate time series. However, for some seasonal or perishable products the price is a function of the expiration date rather than of the sales history. This additional information can help to design a more accurate and causal time series model. The proposed solution uses a univariate time series model but takes the price of a product as a parameter that systematically influences the prediction. The price influence is computed from historical sales data using correlation analysis and adjustable price ranges to identify products with comparable history. Compared to other techniques, this novel approach is easy to compute and allows presetting the price parameter for predictions and simulations. Tests with data from the Data Mining Cup 2012 demonstrate better results than established, sophisticated time series methods.
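A price-parameterised univariate forecast can be sketched minimally as follows. All names and the elasticity value are illustrative assumptions; the paper's actual correlation-based computation of the price influence is not reproduced here. The baseline is a plain mean of recent sales, scaled by a multiplicative price factor.

```python
def price_adjusted_forecast(sales_history, planned_price, reference_price, elasticity=-1.5):
    """Univariate baseline forecast scaled by a price factor.

    The factor (planned_price / reference_price) ** elasticity raises
    the prediction when the planned price drops below the reference
    (elasticity is negative). Illustrative sketch only.
    """
    baseline = sum(sales_history[-4:]) / min(len(sales_history), 4)
    price_factor = (planned_price / reference_price) ** elasticity
    return baseline * price_factor

# Lowering the price below the reference raises the predicted quantity.
history = [100, 110, 90, 100]
base = price_adjusted_forecast(history, planned_price=10.0, reference_price=10.0)
promo = price_adjusted_forecast(history, planned_price=8.0, reference_price=10.0)
```

Because the price enters as an explicit parameter rather than a second series, the same fitted model can be re-evaluated for "what if the price were X" simulations, which is the presetting capability the abstract refers to.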
This article examines the fundamental possibilities for integrating social-media functionality into companies. Building on this, social commerce is derived as a central concern of corporate management. The focus is on the customer-side buying process and its interfaces with the communication instruments of the social web. The influence of social media on the individual buying process is shown. These dynamics then form the basis for describing possible strategic fields of application and areas of social commerce in corporate management.
The voltage supply of electronic control units in the automotive sector is increasingly provided by switching regulators. The SEPIC (Single Ended Primary Inductance Converter) can convert a voltage both up and down and could thus replace classical buck and boost converters. This contribution examines the SEPIC's suitability for automotive applications. To this end, large-signal and small-signal analyses were carried out on the converter, reproduced with suitable simulation models, and compared with measurements. The SEPIC shows as its main advantages:
1. a seamless transition between buck and boost operation, 2. low input ripple, 3. DC short-circuit robustness. The SEPIC is also an interesting alternative with regard to efficiency and EMC behaviour. The capacitor between input and output carries a continuous current; the associated failure risk is discussed on the basis of the RMS currents.
A fast-transient current-mode buck-boost DC-DC converter for portable devices is presented. Running at 1 MHz, the converter provides a stable 3 V from a 2.7 V to 4.2 V Li-Ion battery. Small voltage under- and overshoot is achieved by two fast-transient techniques: (1) adaptive pulse skipping (APS) and (2) adaptive compensation capacitance (ACC). The proposed converter was implemented in a 0.25 μm CMOS technology. Load-transient simulations confirm the effectiveness of APS and ACC. The improvements in voltage undershoot and response time at a light-to-heavy load step (100 mA to 500 mA) are 17 % and 59 %, respectively, in boost mode and 40 % and 49 %, respectively, in buck mode. Similar results are achieved for overshoot and response time at a heavy-to-light load step.
The Sixth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2014), held between April 20-24, 2014 in Chamonix, France, continued a series of international events covering a large spectrum of topics related to advances in database fundamentals, the evolving relations between databases and other domains, database technologies and content processing, as well as the specifics of databases in application domains. Advances in different technologies and domains related to databases triggered substantial improvements in content processing, information indexing, and data, process, and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of XML adoption. High-speed communications and computations, large storage capacities, and load balancing for distributed database access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms, and advanced indexing methods. Evolution in e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems puts pressure on database communities to push the 'de facto' methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
Universal OTA test bench
(2014)
A universally applicable test bench for simulating integrated circuits of the OTA class (Operational Transconductance Amplifier) is presented. Transconductance amplifiers are widespread in analog circuit design and therefore of great importance. They appear both as stand-alone circuits within a chip and as building blocks of other circuits (e.g. as the first and second stages of operational amplifiers). It is safe to assume that hardly any analog or mixed-signal chip is manufactured today that contains no transconductance amplifiers. A designer's decisions when dimensioning an OTA rest largely on application-specific simulation. Creating a dedicated test bench for every application, however, takes considerable time and makes it harder to compare the simulation results of different circuit variants. A universal test bench reduces this effort and additionally allows simulation results to be compared directly, making the designer's decision-making more objective and faster. Besides comparing different circuits within one technology, comparing one circuit across different technologies is also conceivable. The idea of a universally applicable test bench can also be transferred to other analog circuit classes and thus generalized as a principle.
The goal is to provide a means for the safe reuse of circuits from the OTA class. To this end, selected OTA circuit topologies for a "copy-and-paste" method are presented. It has been shown in an industrial environment that, given a representative selection of topologies pre-dimensioned for the typical application range, they are already suitable for reuse in this form.
While digital IC design is highly automated, analog circuits are still handcrafted in a time-consuming, manual fashion today. This paper introduces a novel Parameterized Circuit Description Scheme (PCDS) for the development of procedural analog schematic generators as parameterized circuits. Circuit designers themselves can use PCDS to create circuit automatisms which capture valuable expert knowledge, offer full topological flexibility, and enhance the re-use of well-established topologies. The generic PCDS concept has been successfully implemented and employed to create parameterized circuits for a broad range of use cases. The achieved results demonstrate the efficiency of our PCDS approach and the potential of parameterized circuits to increase automation in circuit design, also to benefit physical design by promoting the common schematic-driven-layout flow, and to enhance the applicability of circuit synthesis approaches.
With the availability of powerful computers, computer-based simulation methods have found their way into all areas of science and engineering. Model-based simulation as a "virtual experiment" is an effective and long-indispensable aid, particularly in the design of technical systems, for verifying development results with respect to desired properties. The possibilities of today's simulation methods are fascinating, which is precisely why beginners (but not only they) run the risk of accepting their results uncritically. Teaching plays a special role here: besides the application of simulation tools, it is important to also convey their theoretical foundations to students and thereby sharpen their awareness of the limits of simulation. The workshop of the ASIM/GI special interest groups "Simulation technischer Systeme" and "Grundlagen und Methoden in Modellbildung und Simulation" brings together experts from industry and academia to exchange experience on all aspects of simulation, from foundations and methods to tools and application examples.
This paper showed how a PLC program written in the ladder diagram language can be analyzed using methods for the analysis of Petri nets. The goal of the method is not verification in the strict sense, but the detection of forbidden or undesired states. The paper presented rules for transforming a sequence implemented in ladder diagram into a Petri net and demonstrated the capability of the approach by analyzing an incorrectly implemented sequence. The example shows that program errors can be detected even before testing on the real plant. Further development of the method focuses on generalizing it to program organization units developed in ladder diagram that implement more than pure sequences. Another important development step is graphical support for fault localization in the reachability graph, so that altogether a powerful tool becomes available for supporting the implementation of sequential controls in ladder diagram.
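As background for the reachability analysis mentioned above, the core exploration step can be sketched as follows. This is a minimal, illustrative toy example; the marking encoding, function names, and the two-transition net are assumptions for demonstration and do not reproduce the paper's ladder-diagram transformation rules.

```python
from collections import deque

def fire(marking, pre, post):
    """Return the successor marking, or None if the transition is not enabled."""
    if all(marking.get(p, 0) >= n for p, n in pre.items()):
        m = dict(marking)
        for p, n in pre.items():
            m[p] -= n
        for p, n in post.items():
            m[p] = m.get(p, 0) + n
        return m
    return None

def reachable_markings(initial, transitions, limit=10_000):
    """Breadth-first exploration of the reachability graph (bounded)."""
    seen = {tuple(sorted(initial.items()))}
    queue = deque([initial])
    result = [initial]
    while queue and len(result) < limit:
        m = queue.popleft()
        for pre, post in transitions:
            m2 = fire(m, pre, post)
            if m2 is not None:
                key = tuple(sorted(m2.items()))
                if key not in seen:
                    seen.add(key)
                    queue.append(m2)
                    result.append(m2)
    return result

# Toy net: two mutually exclusive steps; a forbidden state would mark both.
transitions = [({"p1": 1}, {"p2": 1}), ({"p2": 1}, {"p1": 1})]
markings = reachable_markings({"p1": 1, "p2": 0}, transitions)
forbidden = [m for m in markings if m.get("p1", 0) and m.get("p2", 0)]
```

Checking the reachable set for forbidden markings in this way is exactly the kind of error detection (rather than full verification) that the paper targets.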
This paper showed how a known model-predictive control method can be used to optimize the energy efficiency of an induction machine in dynamic operation. First, the relations for the power loss in dynamic operation were derived, considering only the copper losses. Based on this, the optimization problem was formulated, the influence of the parameters of the model-predictive method on the optimization result was examined, and recommended values for these parameters were determined. A comparison with two further methods, one without optimization and one with optimization only for stationary operating points, demonstrates the advantages of the model-predictive approach.
Choosing a clinic is typically an instance of surrogate purchasing behavior: customers seek trustworthy, personal sources to support the decision. Word-of-mouth recommendation can be encouraged through incentives, but the fundamental prerequisite for honest recommendation is customer satisfaction. Customer satisfaction arises from the comparison between expected and perceived performance, and the expected performance level is often set by companies from other industries. Individuals are not able to evaluate the components of an experience in isolation; instead, they blend them (halo effect), and inconsistencies lead to a devaluation of the overall experience. Therefore, the first step is to identify the overall experience (customer journey), which begins before and ends after the customer's immediate interaction with the company or clinic. In a second step, the satisfaction drivers and the interdependencies between the individual experiences must be determined in order to then plan and implement the optimization of the customer journey.
Proceedings of the International Workshop on Mobile Networks for Biometric Data Analysis (mBiDA)
(2014)
Prevention and treatment of common and widespread (chronic) diseases is a challenge in any modern society and vitally important for health maintenance in aging societies. Capturing biometric data is a cornerstone for any analysis and treatment strategy. Latest advances in sensor technology allow accurate data measurement in a non-intrusive way. In many cases, it is necessary to provide online monitoring and real-time data capturing to support patients' prevention plans or to allow medical professionals to access the current status. Different communication standards are required to push sensor data and to store and analyze them on different (mobile) platforms. The objective of the workshop is to show new and innovative approaches dedicated to capturing and analyzing biometric data in a non-intrusive way while maintaining mobility. Examples can be found in human-centered ambient intelligence equipped with sensors, or in methodologies applied in real-time-capable mobile system design for automotive applications. The workshop's main challenge is to focus on approaches promoting non-intrusiveness, reliable prediction algorithms, and high user acceptance. The workshop will provide overview presentations, young researcher poster tracks, doctoral tracks, and classical peer-reviewed full paper tracks. We especially encourage students and young researchers to participate and to contribute to the workshop. Scientific contributions to the event are peer-reviewed by a suitable program committee.
The impact of stress on human beings has become a serious problem. Reported effects include a higher rate of health disorders such as heart problems, obesity, asthma, diabetes, depression, and many others. An individual in a stressful situation has to deal with altered cognition as well as impaired decision making and problem solving. This can lead to a higher risk of accidents in dynamic environments such as driving. Several papers have addressed the estimation and prediction of a driver's stress level while driving. An equally important question concerns not only the stress level of the driver himself, but also the mutual influence within a group of other drivers in the vicinity. This paper proposes a system which determines groups of nearby drivers as clusters and computes their individual stress levels. This information is analyzed to generate a stress map, a graphical view of road sections with elevated stress influence. Aggregated data can be used to generate navigation routes with lower stress influence, as well as to recommend driving behavior that reduces stress-influenced driving and improves road safety.
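The aggregation step behind such a stress map can be sketched briefly. The segment identifiers, the 0-to-1 stress scale, and the reading format below are illustrative assumptions, not the paper's actual data model:

```python
from collections import defaultdict

def build_stress_map(readings):
    """readings: (driver_id, segment_id, stress in [0, 1]) tuples.
    Returns {segment_id: mean stress} aggregated across drivers."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    for _driver, segment, stress in readings:
        totals[segment] += stress
        counts[segment] += 1
    return {seg: totals[seg] / counts[seg] for seg in totals}

def low_stress_route(routes, stress_map):
    """Pick the candidate route whose segments have the lowest mean stress."""
    def route_stress(route):
        return sum(stress_map.get(s, 0.0) for s in route) / len(route)
    return min(routes, key=route_stress)

# Two drivers report high stress on segment s1, one reports low stress on s2.
readings = [("a", "s1", 0.9), ("b", "s1", 0.7), ("a", "s2", 0.2)]
smap = build_stress_map(readings)
best = low_stress_route([["s1"], ["s2"]], smap)
```

Routing over the aggregated map, rather than over raw per-driver readings, is what makes the lower-stress navigation routes described in the abstract possible.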
Today, 40 Gbps transmission over four-pair balanced cabling is in development within IEEE 802.3bq. In this paper, we describe a 25 Gbps transmission experiment enabling either single-pair transmission of 25 Gbps over a 30 meter balanced cabling channel, or 100 Gbps transmission via a four-pair balanced channel. A scalable matrix modeling tool is introduced which allows the prediction of the transmission characteristics of a channel taking mode conversion into account. We applied this tool to characterize PCB channels, including the magnetics and PCB, for a four-pair 100 Gbps transmission. We evaluated prototype cables and connecting hardware for frequencies up to 2 GHz and beyond. Finally, we investigated possible line encoding schemes and provide measurement results of a transmission over 30 m with a data rate of 25 Gbps per twisted pair.
In this paper, research projects with 30 meter balanced cabling and data rates up to 25 Gbps over a single pair are described. The project aim is to achieve 100 Gbps via a four-pair balanced cabling channel. In the following, spectral characteristics of the prototype twisted pair used are presented: the insertion loss of the cable alone in comparison to the cable combined with an equalizing amplifier, as well as the group delay of the cable and of the cable connected to the equalizing amplifier. Furthermore, a carrierless pulse amplitude modulation with 32 levels (PAM-32) is presented as a candidate line encoding. Finally, measurements of data transmission at rates up to 25 Gbps via shielded twisted pair are shown.
An index in a Multi-Version DBMS (MV-DBMS) has to reflect different tuple versions of a single data item. Existing approaches follow the paradigm of logically separating the tuple version data from the data item; e.g., an index is only allowed to return at most one version of a single data item (while it may return multiple data items that match a search criterion). Hence, to determine the valid (and therefore visible) tuple version of a data item, the MV-DBMS first fetches all tuple versions that match the search criterion and subsequently filters visible versions using visibility checks. This involves I/O storage accesses to tuple versions that need not have been fetched at all. In this vision paper we present the Multi-Version Index (MV-IDX) approach that allows index-only visibility checks which significantly reduce the amount of I/O storage accesses as well as the index maintenance overhead. The MV-IDX achieves significantly lower response times and higher transactional throughput on OLTP workloads.
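The idea of an index-only visibility check can be illustrated with a small sketch: if each index entry carries the validity interval of its version, visibility at a snapshot can be decided without fetching any tuple. The entry layout and the snapshot rule (begin <= snapshot < end) are simplifying assumptions for illustration, not the paper's exact MV-IDX design:

```python
INF = float("inf")

class MVIndex:
    def __init__(self):
        # key -> list of [begin_ts, end_ts, row_id] version entries
        self.entries = {}

    def insert(self, key, begin_ts, row_id):
        # A new version is valid from begin_ts until superseded.
        self.entries.setdefault(key, []).append([begin_ts, INF, row_id])

    def invalidate(self, key, row_id, end_ts):
        # Close the validity interval of the superseded version.
        for e in self.entries.get(key, []):
            if e[2] == row_id and e[1] == INF:
                e[1] = end_ts

    def lookup(self, key, snapshot_ts):
        """Return row ids visible at snapshot_ts, decided from the index
        alone -- no fetch of invisible tuple versions is needed."""
        return [rid for begin, end, rid in self.entries.get(key, [])
                if begin <= snapshot_ts < end]

idx = MVIndex()
idx.insert("item42", begin_ts=5, row_id="v1")
idx.invalidate("item42", "v1", end_ts=9)   # v1 now valid in [5, 9)
idx.insert("item42", begin_ts=9, row_id="v2")
```

A lookup at snapshot 7 returns only "v1" and one at snapshot 12 returns only "v2", without ever touching the stored tuples, which is the I/O saving the abstract describes.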
IGBT modules with anti-parallel FWDs are widely used in inductive load switching power applications, such as motor drive applications. Nowadays there is a continuous effort to increase the efficiency of such systems by decreasing their switching losses. This paper addresses the problems arising in the turn-on process of an IGBT working in hard-switching conditions. A method is proposed which achieves – contrary to most other approaches – a high switching speed and, at the same time, a low peak reverse-recovery current. This is done by applying an improved gate current waveform that is briefly lowered during the turn-on process. The proposed method achieves low switching losses. Its effectiveness is demonstrated by experimental results with IGBT modules for 600 V and 1200 V.
Advanced power semiconductors such as DMOS transistors are key components of modern power electronic systems. Recent discrete and integrated DMOS technologies have very low area-specific on-state resistances, so that devices with small sizes can be chosen. However, their power dissipation can sometimes be large, for example in fault conditions, causing the device temperature to rise significantly. This can lead to excessive temperatures, reduced lifetime, and possibly even thermal runaway and subsequent destruction. Therefore, it must already be ensured in the design phase that the temperature always remains in an acceptable range. This paper will show how self-heating in DMOS transistors can be experimentally determined with high accuracy. Further, it will be discussed how numerical electrothermal simulations can be carried out efficiently, allowing the accurate assessment of self-heating within a few minutes. The presented approach has been successfully verified experimentally for device temperatures exceeding 500 °C, up to the onset of thermal runaway.
This paper presents a new broadband antenna for satellite communications. It describes the procedure involved in the design of a microstrip antenna array and its multi-level passive feed network that together yield circular polarization and the necessary gain to be used in an earth-satellite link. The designed antenna is notable for its large bandwidth, circular polarization, high gain and small dimensions.
This paper presents the design and simulation processes of an equiangular spiral antenna for the extremely high frequencies between 65 GHz and 170 GHz. A new approach for the analysis of the antenna's electrical parameters is described. This approach is based on the formalism proposed by Rumsey to determine the EM field produced by an equiangular spiral antenna. Analytical expressions of the electrical parameters such as the gain or the directivity are then calculated using well-founded mathematical approximations. The comparison of the obtained results with those from numerical integration methods shows good agreement.
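For context, Rumsey's frequency-independence argument rests on the equiangular spiral geometry. A standard textbook form of that geometry (not taken from the paper itself) is:

```latex
% Equiangular (logarithmic) spiral arm in polar coordinates:
\[
  r(\varphi) = r_0 \, e^{a\varphi}
\]
% Scaling all dimensions by a factor $k$ maps the curve onto itself
% rotated by $\Delta\varphi = \ln(k)/a$, since
\[
  k \, r_0 e^{a\varphi} = r_0 \, e^{a\left(\varphi + \tfrac{\ln k}{a}\right)} .
\]
% A structure defined entirely by angles is therefore (ideally)
% self-similar under frequency scaling, i.e. frequency independent.
```

The finite inner and outer radii of a practical antenna truncate this self-similarity, which is what bounds the usable band (here 65–170 GHz).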
Functionally impaired people have problems choosing and finding the right clothing, and they need help in their daily life with washing and managing it. The goal of this work is to support the user with recommendations on choosing the right clothing, finding it, and washing it. The idea behind eKlarA is a gateway-based system that uses sensors to identify clothing items and their state in the clothing cycle. The clothing cycle consists of one or more closets, laundry baskets, and washing machines in one or several places. The gateway uses information about the clothing, the weather, and the calendar to support the user in the different steps of the clothing cycle. This gives functionally impaired people more freedom in their daily life.
Besides optimising the car itself, energy efficiency and safety can also be increased by optimising driving behaviour. Based on this fact, a driving system is in development whose goal is to educate the driver in energy-efficient and safe driving. It monitors the driver, the car, and the environment and gives recommendations relevant to energy efficiency and safety. However, the driving system tries not to distract or bother the driver, for example by avoiding recommendations during stressful driving situations or when the driver is not interested in them. Therefore, the driving system monitors the driver's stress level as well as the driver's reaction to a given recommendation and decides whether to give a recommendation or not. This makes it possible to suppress recommendations when needed and thus to increase road safety and the user acceptance of the driving system.
Three different polyols (soluble starch, sucrose, and glycerol) were tested for their potential in the chemical modification of melamine formaldehyde (MF) resins for paper impregnation. MF-impregnated papers are widely used as finishing materials for engineered wood. These polyols were selected because the presence of multiple hydroxy groups in the molecules was expected to facilitate cocondensation with the main MF framework, which should lead to good resin performance. Moreover, they are readily produced from natural feedstock, are available in large quantities, and may serve as economically feasible, environmentally harmless co-monomers suitable to substitute a portion of the fossil-based starting material. In the presented work, a number of model resins were synthesized and tested for covalent incorporation of the natural polyol into the MF framework. Spectroscopic evidence of chemical incorporation of glycerol was found by applying 1H, 13C, 1H/13C HSQC, 1H/13C HMBC, and 1H DOSY methods. It was furthermore found that covalent incorporation of glycerol into the network took place when glycerol was added at different stages during synthesis. Further, all resins were used to prepare decorative laminates, and the performance of the novel resins as surface finishes was evaluated using standard technological tests. The technological performance of the various modified thermosetting resins was assessed by determining flow viscosity, molar mass distribution, and storage stability, and, in a second step, by laminating impregnated paper to particle boards and testing the resulting surfaces according to standardized quality tests. In most cases, the average board surface properties were of acceptable quality. Our findings demonstrate the possibility of replacing several percent of the petroleum-based melamine by compounds obtained from renewable resources.
Mass customization is a megatrend that also affects the wood industry. To obtain individually designed laminates in batch size one, efficient printing and processing technologies are required. Digital printing was envisaged as it does not depend on highly costly printing cylinders (as used in rotogravure printing) and allows rapid exchange of printing designs. In the present work, two well-established digital printing approaches, the multi-pass and the single-pass technique, were investigated and evaluated for their applicability in decorating engineered wood and low-pressure melamine (LPM) films. Three possibilities of implementing digital printing in the decorative laminates manufacturing process were studied: (1) digital printing on coated chipboard with subsequent application of a lacquered top-coat or melamine overlay (designated as "direct printing", since the LPM was the printing substrate), (2) digital printing on decorative paper which was subsequently impregnated before hot pressing (designated as "indirect printing, variant A"), and (3) digital printing on decorative paper with subsequent interlamination of the paper between impregnated underlay and overlay paper layers during the pressing process (designated as "indirect printing, variant B"). Due to various advantages of the resulting cured melamine resin surfaces, including a much better technological performance and flexibility in surface texture design, it was decided to industrially pursue only the indirect digital printing process comprising interlamination and the direct printing process with a melamine overlay finishing. The basis for the successful trials on production and laboratory scales was the identification of applicable inks (in terms of compatibility with melamine resin) and of appropriate printing paper quality (in terms of impregnation and imprinting ability).
After selection and fine-tuning of suitable materials, the next challenge to overcome was the initially insufficient bond strength between the impregnated overlay and the ink layers, which led to unsatisfactory print appearance and delamination effects. However, optimization of the pressing program and the development of a modified impregnation procedure for the underlay and overlay papers allowed the successful implementation of digital printing in the production line of our industrial partner FunderMax.
Prior studies ascribed people's poor performance in dealing with basic systems concepts to different causes. While results indicate that, among other things, domain-specific experience and familiarity with the problem context play a role in this stock-flow (SF) performance, this has not yet been fully clarified. In this article, we present an experiment that examines the role of educational background in SF performance. We hypothesize that SF performance increases when the problem context is embedded in the problem solver's knowledge domain, indicated by educational background. Using the square wave pattern and the sawtooth pattern tasks from the initial study by Booth Sweeney and Sterman (2000), we design two additional cover stories for the former, the Vehicle story from the engineering domain and the Application story from the business domain, next to the original Bathtub story. We then test the three sets of questions on business students. Results mainly support our hypothesis. Interestingly, participants even do better on a more complex behavioral pattern from their knowledge domain than on a simpler pattern from more distant domains. Although these findings have to be confirmed by further studies, they contribute both to the methodology of future surveys and to the context familiarity discussion.
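The stock-flow logic that such tasks test can be stated in one line: a stock accumulates the difference between inflow and outflow over time. A minimal sketch with hypothetical numbers (not the study's actual task values):

```python
def stock_trajectory(initial, inflows, outflows):
    """Accumulate stock(t) = stock(0) + sum of (inflow - outflow) up to t."""
    stock = initial
    trajectory = [stock]
    for inp, out in zip(inflows, outflows):
        stock += inp - out
        trajectory.append(stock)
    return trajectory

# A square-wave inflow (8, 8, 4, 4) against a constant outflow of 6:
# the stock rises while inflow exceeds outflow and falls afterwards,
# producing the characteristic triangular stock pattern.
traj = stock_trajectory(100, [8, 8, 4, 4], [6, 6, 6, 6])
```

The common error the SF literature reports is matching the stock's shape to the inflow's shape instead of integrating the net flow, which is exactly what this accumulation makes visible.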
This paper compares the influence a video self-avatar and a lack of a visual representation of a body have on height estimation when standing at a virtual visual cliff. A height estimation experiment was conducted using a custom augmented reality Oculus Rift hardware and software prototype also described in this paper. The results show a consistency with previous research demonstrating that the presence of a visual body influences height estimates, just as it has been shown to influence distance estimates and affordance estimates.
Fundamental changes in today's world of work confront people, systems, processes, and entire organizations with considerable challenges. In all areas of this interdependent system, the human factor makes an essential contribution to the competitive advantage of many manufacturing companies in Germany. The transition from automation to self-controlling enterprises does not leave the most adaptable element of this system, the human being, untouched. Types of workload are changing, and singular coping strategies are no longer sufficient to achieve an optimal state of strain for each individual while at the same time tapping the highest possible potential. The stress and strain cockpit ("Belastungs- und Beanspruchungscockpit") offers a solution approach for the systematic and continuous assessment of workload conditions and the individual strain of employees at assembly workstations. It delivers real-time information on the employee's stress and strain state and can be linked with ergonomic assessment methods. The aspect of multidimensionality covers the evaluation of various indicators while considering their interdependencies.
Industry 4.0 predicts that industrial processes, technological infrastructure, and all corresponding business processes will, with the help of information and communication technology (ICT), advance to integrated, ad-hoc interconnected, and decentralized Cyber-Physical Production Systems (CPPS) with real-time capabilities of self-optimization and adaptability. Considering this change, the human being will remain in a dominant role, because it is not expected that the human factor, with its characteristics and capabilities, will be substituted entirely by autonomously acting technology in the foreseeable future. Machine intelligence, for instance, is limited to the selection of predefined options, while human creativity, flexibility, and the ability to learn and to improve are required to design and configure systems, processes, and products. Humans have the expertise and experience to analyze, assess, and solve problems, even in exceptional situations. However, the amount of purely manual tasks for shop floor workers will decrease. Their role will change from manual execution to that of a proactive, anticipating worker with increased responsibility. Due to the growing degree of digitalization and interconnectedness, the tasks and responsibilities of planning and design personnel will also continuously expand and become more complex. Work in versatile ad-hoc networks with advanced ICT tools and assistance systems will lead to increased requirements regarding the knowledge, capability, and capacity of the respective employees. The ongoing pervasion of IT and the emergence of systems of unprecedented complexity specifically require significantly improved capabilities in analysis, abstraction, problem solving, and decision making from the future workforce. Accordingly, industry is asking for graduates who are educated in an interdisciplinary and practice-oriented way.
Some universities already meet these expectations, using learning factories for realistic, action-oriented classes and trainings. Lecturers are confronted with the challenge of identifying future job profiles and the correlated qualification requirements, especially regarding the conceptualization and implementation of CPPS, and of adapting and enhancing their education concepts and methods adequately and consistently. For the new, virtual world of manufacturing, a proper understanding of engineering as well as computer science is essential; Industry 4.0 implies this interdisciplinary span. Integrated competencies for product and process planning and design, methodological competencies for systematic idea and innovation management, as well as a holistic system and interface competence will be crucial to achieve the interconnection of physical and digital processes and machines. The Vienna University of Technology and ESB Reutlingen have committed to successively integrating key aspects of Industry 4.0 into their respective learning factories. Thus, the students will act as the coordinators of the CPPS and thereby remain at the center of all learning and implementation activities.