Transaction processing is of growing importance for mobile computing. Booking tickets, flight reservations, banking, ePayment, and booking holiday arrangements are just a few examples of mobile transactions. Because of temporarily disconnected situations, synchronisation and consistent transaction processing are key issues. Serializability is too strong a correctness criterion when the semantics of a transaction are known. We introduce a transaction model that allows higher concurrency for a certain class of transactions defined by their semantics. The transaction results are "escrow serializable" and the synchronisation mechanism is non-blocking. An experimental implementation showed higher concurrency and transaction throughput, and lower resource usage, than common locking or optimistic protocols.
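The abstract does not include code; the following minimal Python sketch (class and method names are hypothetical, not the paper's protocol) illustrates the escrow idea behind such semantic transaction models: commutative decrements are admitted without blocking as long as the worst-case outcome stays within a hard bound.

```python
# Hedged sketch of escrow-style concurrency control: each transaction
# reserves quantity "in escrow"; admission tests the worst case in which
# every pending decrement commits, so no transaction ever blocks.
class EscrowCounter:
    def __init__(self, value, low=0):
        self.value = value      # committed value
        self.low = low          # hard lower bound (e.g. no negative stock)
        self.pending = {}       # transaction id -> reserved (negative) delta

    def reserve(self, txn, delta):
        # Admit the decrement only if it is safe even when all other
        # pending decrements also commit (non-blocking worst-case test).
        worst = self.value + sum(self.pending.values()) + delta
        if worst < self.low:
            return False
        self.pending[txn] = delta
        return True

    def commit(self, txn):
        self.value += self.pending.pop(txn)

    def abort(self, txn):
        self.pending.pop(txn)

seats = EscrowCounter(10)
assert seats.reserve("t1", -6)      # both bookings run concurrently
assert seats.reserve("t2", -3)
assert not seats.reserve("t3", -2)  # would overdraw in the worst case
seats.commit("t1")
seats.abort("t2")
assert seats.value == 4
```

Because increments and decrements commute, two bookings that would conflict under two-phase locking proceed in parallel here, which is where the higher concurrency comes from.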
Relationship Marketing (RM) presumes trust to be an important antecedent of the performance of interfirm relationships. Current research is dominated by an interpersonal perspective: in this research stream, trust chiefly emerges as a result of interpersonal relationships. But multiple risks arise if customer trust rests solely on elements inextricably linked to single representatives. Hence, this paper evaluates the impact of organizational capabilities and the moderating role of customer preferences in the trust-creation process. The framework presented here is tested cross-industry on 220 customers of IT solutions. The results offer significant insight into the effectiveness of individual and organizational RM strategies.
In order to be innovative, an organisation has to utilise the skills of all its employees. Harnessing the potential of women, older staff members, and people with different backgrounds leads to more creative ideas and new approaches, and results in more innovative products. But how can a company use the innovative potential of its diverse workforce? And how can customer diversity be used to create innovations?
This book answers these questions as well as many more related to the topic of diversity management and innovation. Special emphasis is put on the role of women in the innovation system. To this end, the language and the effects of pictures used in job advertisements are addressed. Moreover, measures to support highly innovative women are identified, including their demands regarding workplace requirements.
In addition, the book deals with diversity management in both publicly traded companies and public institutions. The involvement of children as Lead Users along the product development process is addressed as well. Given that innovation comprises not only products but also services, a separate chapter focusing on the effect of diversity on service innovations is included.
Modern web-based applications are often built as multi-tier architectures using persistence middleware. Middleware technology providers recommend the use of Optimistic Concurrency Control (OCC) mechanisms to avoid the risk of blocked resources. However, most vendors of relational database management systems implement only locking schemes for concurrency control. As a consequence, a kind of OCC has to be implemented on the client or middleware side.
A simple Row Version Verification (RVV) mechanism has been proposed to implement OCC on the client side. For performance reasons, the middleware uses buffers (caches) of its own to avoid network traffic and possible disk I/O. This caching, however, complicates the use of RVV because the data in the middleware cache may be stale (outdated). We investigate various data access technologies, including the new Java Persistence API (JPA) and Microsoft's LINQ, for their ability to support the RVV programming discipline.
The use of persistence middleware that tries to relieve the programmer of low-level transaction programming turns out to even complicate the situation in some cases. Programmed examples show how to use SQL data access patterns to solve the problem.
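As an illustration of the RVV discipline described above (a generic sketch, not the paper's actual examples), the following Python/SQLite snippet guards an update with a row-version predicate; the table and column names are hypothetical.

```python
import sqlite3

# Row Version Verification: each row carries a version counter (rv).
# An update succeeds only if the version read earlier is still current.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE account (id INTEGER PRIMARY KEY, balance INTEGER, rv INTEGER)")
conn.execute("INSERT INTO account VALUES (1, 100, 0)")

def read_row(conn, acc_id):
    # Phase 1: read the row together with its current version number.
    return conn.execute(
        "SELECT balance, rv FROM account WHERE id = ?", (acc_id,)).fetchone()

def rvv_update(conn, acc_id, new_balance, expected_rv):
    # Phase 2: the UPDATE matches only if no concurrent transaction has
    # changed the row in between; rowcount == 0 signals a lost update.
    cur = conn.execute(
        "UPDATE account SET balance = ?, rv = rv + 1 WHERE id = ? AND rv = ?",
        (new_balance, acc_id, expected_rv))
    return cur.rowcount == 1

balance, rv = read_row(conn, 1)
assert rvv_update(conn, 1, balance - 30, rv)       # first writer wins
assert not rvv_update(conn, 1, balance - 50, rv)   # stale version is rejected
```

The same predicate works through any middleware that lets raw SQL reach the server; the caching problem arises when the version column itself is served from a stale middleware cache.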
In this presentation the audience will be: (a) introduced to the aims and objectives of the DBTechNet initiative; (b) briefed on the DBTech EXT virtual laboratory workshops (VLW), i.e., the educational and training (E&T) content which is freely available over the internet and includes vendor-neutral, hands-on laboratory training sessions on key database technology topics; and (c) informed about some of the practical problems encountered and the way they have been addressed. Last but not least, the audience will be invited to consider incorporating some or all of the DBTech EXT VLW content into their higher education (HE), vocational education and training (VET), and/or lifelong learning/training course curricula. This comes at no cost and no commitment on the part of the teacher/trainer; the latter is only expected to provide his/her feedback on the pedagogical value and the quality of the E&T content received/used.
Relationship marketing is an important issue in every business. Knowing the customers and establishing, maintaining, and enhancing long-term customer relationships is a key component of long-term business success. Considering that sport is such big business today, it is surprising that this crucial approach to marketing has yet to be fully recognised either in the literature or in the sports business itself. Relationship Marketing in Sports aims to fill this void by discussing and reformulating the principles of relationship marketing and by demonstrating how relationship marketing can be successfully applied in practice within a sports context. Written by a unique author team with academic and practitioner experience, the book provides the reader with:
- the first application of the principles of relationship marketing specifically to a sports context
- case studies from around the world, providing a uniquely global approach applicable worldwide
- strong pedagogical features including learning outcomes, overviews, discussion questions, a glossary, guided reading and web links
- practical advice for professional, semi-professional and non-professional sporting organisations
- a companion website providing web links, case studies and PowerPoint slides for lecturers.
Relationship Marketing in Sports is crucial reading for students and professionals alike and marks a turning point in the marketing of sports.
The Third International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2011), held on January 23-27, 2011 in St. Maarten, The Netherlands Antilles, continued a series of international events covering a large spectrum of topics related to advances in database fundamentals, the evolving relation between databases and other domains, database technologies and content processing, as well as the specifics of application-domain databases. Advances in different technologies and domains related to databases triggered substantial improvements in content processing, information indexing, and data, process, and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of XML adoption. High-speed communications and computations, large storage capacities, and load balancing for distributed database access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms, and advanced indexing methods. Evolution in e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems puts pressure on database communities to push the 'de facto' methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology. We take this opportunity to thank all the members of the DBKDA 2011 Technical Program Committee as well as the numerous reviewers. The creation of such a broad and high-quality conference program would not have been possible without their involvement. We also kindly thank all the authors who dedicated much of their time and effort to contributing to DBKDA 2011. We truly believe that, thanks to all these efforts, the final conference program consists of top-quality contributions. This event could also not have been a reality without the support of many individuals, organizations, and sponsors.
We are grateful to the members of the DBKDA 2011 organizing committee for their help in handling the logistics and for their work to make this professional meeting a success. We hope that DBKDA 2011 was a successful international forum for the exchange of ideas and results between academia and industry and for the promotion of progress in database research. We are convinced that the participants found the event useful and communications very open. The beautiful places of St. Maarten surely provided a pleasant environment during the conference and we hope you had a chance to visit the surroundings.
This work presents a disconnected transaction model able to cope with the increased complexity of long-living, hierarchically structured, and disconnected transactions. We combine an Open and Closed Nested Transaction Model with Optimistic Concurrency Control and interrelate flat transactions with the aforementioned complex nature. Despite temporary inconsistencies during a transaction's execution, our model ensures consistency.
Suppliers need to improve their relational capabilities if they are to enhance customer trust. The debate about such capabilities is dominated by an interpersonal approach. This paper provides novel marketing options by expanding insights into alternative types of relational capabilities. Furthermore, the moderating role of customer preferences in the effectiveness of relational capabilities is evaluated.
Behavioral economics links social, cognitive and emotional elements to help understand and explain the economic decision-making of individuals and institutions. The focus of research in behavioral economics is on individual choice and the motives underlying that choice. This study booklet introduces the key features and ideas of behavioral economics.
This is the first issue of JIEBS. The papers it presents are the result of a call for papers CEBS made in 2011. We actually received far more interesting papers and research reports than expected. They all passed a double-blind review, and the papers are naturally the original work of the named authors. The choice we finally made was also influenced by the topic of the CEBS annual conference 2011, namely the influence of infrastructure and skilled labour on Indo-European business. The papers analyse, structure, and explain many issues related to this topic; they raise questions and point towards areas for further research; and they form the nucleus of this new and currently only scientific platform for Indo-European business studies.
Since 2000, Indian special economic zones have been established with the intention of attracting foreign direct investment (FDI). We present a first empirical assessment with new data from 1980 to 2010 and evaluate the outcome after 10 years. In general, our empirical results confirm that special economic zones attract FDI in a statistically significant way. Another finding of the study is that open economies with stable inflation attract more FDI than small and closed economies.
India’s growth: perspectives for Indo-European business “Skilled labour in India: bridging the gap”
(2011)
The following paper is based on a survey conducted for ESB Business School and shows how German companies perceive India's labour market. Besides existing geographical and sectoral gaps, we also reveal gaps in the required qualification profile. Thinking merely of hard qualification factors like education levels and skills, though, would be short-sighted. Often-cited intercultural qualifications also play an important role.
What can be done, and what should be done, to bridge these gaps? These are the leading questions of this chapter. We discuss some solutions, not forgetting that the problems German companies face are complex and knowing there is no ideal way. However, we will see that some of the most urgent problems can be solved or reduced by Indo-European or Indo-German cooperation models in the field of vocational training and institutions of higher education.
The Fourth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2012), held between February 29 and March 5, 2012 in Saint Gilles, Reunion Island, continued a series of international events covering a large spectrum of topics related to advances in database fundamentals, the evolving relation between databases and other domains, database technologies and content processing, as well as the specifics of application-domain databases. Advances in different technologies and domains related to databases triggered substantial improvements in content processing, information indexing, and data, process, and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of XML adoption. High-speed communications and computations, large storage capacities, and load balancing for distributed database access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms, and advanced indexing methods. Evolution in e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems puts pressure on database communities to push the 'de facto' methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology. We take here the opportunity to warmly thank all the members of the DBKDA 2012 Technical Program Committee, as well as the numerous reviewers. The creation of such a broad and high-quality conference program would not have been possible without their involvement. We also kindly thank all the authors who dedicated much of their time and effort to contributing to DBKDA 2012. We truly believe that, thanks to all these efforts, the final conference program consisted of top-quality contributions. Also, this event could not have been a reality without the support of many individuals, organizations, and sponsors.
We are grateful to the members of the DBKDA 2012 organizing committee for their help in handling the logistics and for their work to make this professional meeting a success. We hope that DBKDA 2012 was a successful international forum for the exchange of ideas and results between academia and industry and for the promotion of progress in the fields of databases, knowledge, and data applications. We are convinced that the participants found the event useful and communications very open. We also hope the attendees enjoyed the charm of Saint Gilles, Reunion Island.
Redirected walking techniques allow people to walk in a larger virtual space than the physical extents of the laboratory. We describe two experiments conducted to investigate human sensitivity to walking on a curved path and to validate a new redirected walking technique. In a psychophysical experiment, we found that sensitivity to walking on a curved path was significantly lower for slower walking speeds (radius of 10 meters versus 22 meters). In an applied study, we investigated the influence of a velocity-dependent dynamic gain controller and an avatar controller on the average distance that participants were able to freely walk before needing to be reoriented. The mean walked distance was significantly greater in the dynamic gain controller condition, as compared to the static controller (22 meters versus 15 meters). Our results demonstrate that perceptually motivated dynamic redirected walking techniques, in combination with reorientation techniques, allow for unaided exploration of a large virtual city model.
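To make the reported thresholds concrete, the following back-of-the-envelope sketch converts a curvature radius into the rotation rate a redirected walker experiences; the walking speeds used here are assumed values for illustration, not figures from the study.

```python
import math

# A user steered onto a circle of radius r while walking at speed v is
# rotated by v / r radians per second; tighter radii mean stronger gains.
def curvature_rotation_deg_per_s(walking_speed_m_s, radius_m):
    return math.degrees(walking_speed_m_s / radius_m)

# The detection thresholds above (radius 10 m at slow speed vs 22 m) imply
# that a velocity-dependent controller may apply more curvature when the
# user slows down. Speeds below are illustrative assumptions:
slow = curvature_rotation_deg_per_s(0.75, 10.0)   # assumed slow walking speed
normal = curvature_rotation_deg_per_s(1.4, 22.0)  # assumed normal walking speed
assert slow > normal   # slower walking tolerates a higher rotation rate
```

This is the intuition behind the dynamic gain controller: exploiting the lower sensitivity at low speeds yields longer uninterrupted walks before reorientation is needed.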
Turning complainers into fans : towards a framework for customer services in social media channels
(2012)
In recent years, marketing scholars have invested heavily in exploring the role of social media in marketing theory and practice. One valuable strategy for using social media in marketing communication is to provide customer services in applications like Facebook or Twitter. This paper evaluates a) the concept of perceived service quality in different service channels and b) the impact customer service strategies have on customer loyalty, word of mouth communication, and cross-sell preferences. The framework presented here is tested cross-channel against data collected from the customer service department of a large telecommunication provider. The results elucidate the effectiveness of customer service strategies in different channels.
This study analyses the impact of Basel III on the fair pricing of bank guarantee facilities. Guarantees are an important risk-mitigation instrument between exporters and importers in international trade and are regularly a prerequisite for cross-border sales contracts to be closed. Basel III, which is to be introduced from 2013 onwards, is a new regulation stipulating higher capital requirements for banks compared to its predecessor, Basel II. It will therefore have an impact on the pricing of guarantee facilities which banks provide to exporting companies, making it also a crucial regulation for the cost of exportation overall. The study compares those contents of Basel III and Basel II which are particularly relevant for guarantees in order to identify and crystallize pricing-relevant changes in the regulations and their respective impact potential. The Basel frameworks are analyzed part by part and reviewed in terms of their relevance for guarantees. In cases of ambiguity, the analysis is verified by complementary expert interviews. References and examples mainly focus on the German banking system, but the basic conclusions can be generalized to those countries adopting Basel III. As the result, a case study expresses the quantitative outcomes of different scenarios and the impact of the different price-determining factors on the overall fair pricing of bank guarantee facilities.
The intention of this paper is to show that the statistical approach to risk is not enough to explain the behavior of investors. It furthermore proposes ideas and alternative approaches for dealing with risk. Psychological findings are of particular interest, as they might enhance our understanding of risk perception and assessment. The chapter "From the normal distribution to fat tails" starts with the rejection of the normal distribution as a simplifying basis for risk and return. This rejection is supported by several empirical observations, such as clustering of volatility and fat tails. This leads to a two-step approach to modeling risk and return based on the distinction between conditional and unconditional changes. Conditional time series models (ARMA, ARCH, GARCH) and alternative distributions (Stable Paretian, Student's t, EVT) are presented as ways to improve the art of risk and return modeling beyond the normal distribution assumption. The chapter ends with the conclusion that each model is only a statistical approximation and never encompasses the unpredictability of black swans and the nature of human behavior in financial markets. After discussing the limitations of the purely statistical approach to risk and return, this paper goes beyond the standard theory of finance for two purposes. Firstly, behavioral finance provides some arguments for the limitations of statistics in assessing risk. Secondly, an alternative approach to risk perception is presented. This alternative is Prospect Theory, a rather psychology-based approach using preferences to explain investors' actions by human behavior in decision-making processes. The starting point is the utility function and the value function, followed by a description of the two phases: framing and evaluation. The value function is then clearly distinguished from the utility function by elaborating certain effects like reference points, loss aversion, and the weighting function.
In this section the paper enters the arena of human risk perception, which is far from monetarily rational in the sense of homo oeconomicus. Cumulative Prospect Theory provides an extension to multiple-outcome scenarios in which risk does not necessarily have to be known; in such situations there also exists, besides risk, immeasurable uncertainty. Current research confirms and rejects parts of (Cumulative) Prospect Theory, which is not necessarily a bad sign, as human behavior is rarely exactly replicable and its complexity does not really allow generalizations. Therefore, even if the theory is not completely correct, it still enhances our understanding of risk perception and human decision making, which can be a very valuable input for agent-based models. The next chapter analyses in more detail possible distortions from psychological biases in the assessment of risk. In this context, the law of small numbers, overconfidence, and feelings/experience are discussed. Knowing these biases complicates the idea of developing a risk model even further. However, this is again another step towards better understanding the underlying processes and motives of decision making in the context of financial markets. The last chapter is an attempt to link the different aspects to obtain a holistic view of risk behavior. Two possibilities are discussed: hedonic psychology, with the distinction between blow-up and bleeding strategies, and heuristic-based explanations for real observations like clustering of expectations and trust in experts. This leaves space for further research, as we do not yet have a tool that is based on current findings and can actually help us explain and predict behavior in financial markets. One possibility would be to link all these aspects in the approach of computational finance to develop agent-based models in which market observations, psychological findings, and the situational context can be integrated.
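The distinction between the utility function and the Prospect Theory value function can be made concrete with the standard parametric form of Tversky and Kahneman (1992), which the abstract does not spell out; the parameter values below are their published estimates, not results from this paper:

```latex
v(x) =
\begin{cases}
x^{\alpha} & \text{if } x \ge 0 \quad \text{(gains: concave)} \\
-\lambda\,(-x)^{\beta} & \text{if } x < 0 \quad \text{(losses: convex and steeper)}
\end{cases}
\qquad \alpha \approx \beta \approx 0.88,\quad \lambda \approx 2.25
```

Here x is the deviation from the reference point rather than total wealth, and the loss-aversion coefficient λ > 1 makes losses loom larger than equal-sized gains, which is exactly how the value function departs from the classical utility function.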
Game theory is the study of how people behave in strategic situations. By "strategic" we mean a situation in which each person, when deciding what actions to take, must consider how others might respond to that action. Like other fields in economics, game theory consists of a collection of models. The understanding that game-theoretic models provide is particularly relevant in the social, political, and economic arenas.
Multi-dimensional patient data, such as time-varying volume data, data of different imaging modalities, surface segmentations, etc., are of growing importance in the clinical routine. For many use cases, it is of major importance to replicate, on a different computer using different software tools, a certain visualization of a data set created on one machine. Up until now, no standardized methodology for this consistent presentation has existed. We propose an extension of Digital Imaging and Communications in Medicine (DICOM) called "Multi-dimensional Presentation State" and outline the scope and first results of the standardization process.
Energy efficiency and safety have become important factors for car manufacturers. Cars have therefore been optimised with regard to energy consumption and safety, for example by optimising the power train or the engine. Besides the optimisation of the car itself, energy efficiency and safety can also be increased by adapting the individual driving behaviour to the current driving situation. This paper introduces a driving system which is in development. Its goal is to optimise driving behaviour in terms of energy efficiency and safety by giving recommendations to the driver. To create a recommendation, the driving system monitors the driver, the current driving situation, and the car using in-vehicle sensors and serial-bus systems. On the basis of the acquired data, the driving system will give individual energy-efficiency and safety recommendations in real time. This will allow bad driving habits to be eliminated while considering the driver's needs.
Telemedicine is becoming an increasingly important approach to diagnosing, treating, or preventing diseases. However, the usage of Information and Communication Technologies in healthcare results in a considerable amount of data that must be efficiently and securely transmitted. Many manufacturers provide telemedicine platforms without regard for interoperability, mobility, and collaboration. This paper describes a collaborative mobile telemonitoring platform that can use the IEEE 11073 and HL7 communication standards or adapt proprietary protocols. The proposed platform also covers security and modularity aspects. Furthermore, this work introduces an Android-based prototype implementation.
This paper presents a new European initiative to support the sustainable empowerment of the ageing society. Empowerment in this context represents the capability to lead a self-determined, autonomous, and healthy life. The paper justifies the need for such an initiative and highlights the role that telemedicine and ambient assisted living can play in this environment.
The purpose of this study is to evaluate German online fashion shopping websites from a customer perspective, based on a two-dimensional conceptual framework covering shopping experience and shopping quality. As the research methodology, an exploratory mystery-shopping approach was used in order to compare online shops. The results were as follows. First, four categories of online shops were identified: heroes, marketing winners, process winners, and underperformers. Second, three main levers for improvement were elaborated: the emotionality of websites, reducing complexity, and the introduction of an industry standard for payments. From these results, it is possible to analyze and benchmark websites and to adapt online marketing decisions as well as general management strategies for online fashion shopping companies. The study has originality and value, as it is the first time that an evaluation of websites has combined the consumer's perspective before the purchase and its fulfillment (e.g., delivery) after the online purchase.
The workshop aims to discuss leading-edge contributions to the interdisciplinary research area of ambient intelligence (AmI) applied to the domains of telemedicine and driving assistance. AmI refers to human-centered environments equipped with sensors. The development of AmI in the two application domains of the workshop shares several commonalities: the extensive usage of networked devices and sensors; the design of artificial intelligence algorithms for diagnosis, including recommendation systems and qualitative reasoning; and the application of mobile and wireless communication in their distributed systems. Together with the presentation of common aspects of ambient intelligence, a further goal of the workshop is to stimulate synergies between the two application domains and to present examples. The telemedicine domain can benefit from methodologies for designing complex devices, real-time-conformant system design, and the audiovisual or computer vision system design used in automotive driving assistance. Conversely, the automotive domain can benefit from the user-centric view, biometric sensor data design, and multi-user databases for aggregation and diagnosis using big data, as used in telemedicine. The German government supports these research lines in its Hightech-Strategie under the domains "Health and Nutrition" and "Climate and Energy". In Spain, the corresponding programme is the "Spanish Program for R&D Challenge-Oriented Society - Challenge in safe, efficient and clean energy & Challenge in sustainable, smart and integrated transport". Scientific contributions to the event are peer-reviewed by a suitable program committee with members from Germany and Spain. The same committee has been serving the JARCA workshop (Jornadas sobre Sistemas cualitativos y sus Aplicaciones en Diagnosis, Robótica e Inteligencia Ambiental - Conference on Qualitative Systems and their Applications in Diagnosis, Robotics and Ambient Intelligence) for 15 years.
This workshop is sponsored by the German Academic Exchange Service (DAAD) under contract number 57070010.
When forecasting sales figures, not only the sales history but also the future price of a product will influence the sales quantity. At first sight, multivariate time series seem to be the appropriate model for this task. Nonetheless, in real life history is not always repeatable; in the case of sales history, there is only one price for a product at a given time. This complicates the design of a multivariate time series model. However, for some seasonal or perishable products the price is rather a function of the expiration date than of the sales history. This additional information can help to design a more accurate and causal time series model. The proposed solution uses a univariate time series model but takes the price of a product as a parameter that systematically influences the prediction based on a calculated periodicity. The price influence is computed from historical sales data using correlation analysis and adjustable price ranges to identify products with a comparable history. The periodicity is calculated using a novel approach based on data folding and Pearson correlation. Compared to other techniques, this approach is easy to compute and allows presetting the price parameter for predictions and simulations. Tests with data from the Data Mining Cup 2012 as well as artificial data demonstrate better results than established, sophisticated time series methods.
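The periodicity calculation can be illustrated with a simplified sketch of data folding plus Pearson correlation; the function names and folding details below are illustrative assumptions, not the paper's exact algorithm.

```python
import math

def pearson(x, y):
    # Plain Pearson correlation coefficient for two equal-length sequences.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy) if sx and sy else 0.0

def detect_period(series, max_period):
    # Fold the series at each candidate period into consecutive cycles and
    # correlate neighbouring cycles; the period whose cycles look most
    # alike (highest mean correlation) is taken as the periodicity.
    best_p, best_r = None, -1.0
    for p in range(2, max_period + 1):
        cycles = [series[i:i + p] for i in range(0, len(series) - p + 1, p)]
        cycles = [c for c in cycles if len(c) == p]
        if len(cycles) < 2:
            continue
        rs = [pearson(a, b) for a, b in zip(cycles, cycles[1:])]
        r = sum(rs) / len(rs)
        if r > best_r:
            best_p, best_r = p, r
    return best_p

# A noise-free weekly sales pattern: the fold at period 7 correlates best.
sales = [5, 9, 7, 6, 8, 12, 15] * 4
assert detect_period(sales, 10) == 7
```

The detected period then tells the univariate model at which lag the price parameter should modulate the forecast, which is cheap to compute compared with fitting a full multivariate model.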
"Learning by doing" in higher education in technical disciplines is mostly realized through hands-on labs. It challenges a person's exploratory aptitude and curiosity. But exploratory learning is hindered by technical situations that are not easy to establish and verify. Technical skills are, however, mandatory for employees in this area. On the other hand, theoretical concepts are often compromised by commercial products. One challenge is to contrast and reconcile theory with practice. Another challenge is to implement a self-assessment and grading scheme that keeps up with the scalability of e-learning courses. In addition, it should allow the use of different commercial products in the labs and still grade the assignment results automatically in a uniform way. In two European Union funded projects we designed, implemented, and evaluated a unique e-learning reference model, which realizes a modularized teaching concept that provides easily reproducible virtual hands-on labs. The novelty of the approach is to use software products of industrial relevance to compare with theory and to contrast different implementations. In a sample case study, we demonstrate the automated assessment of a creative database modeling and design task. Pilot applications in several European countries demonstrated that the participants gained highly sustainable competences that improved their attractiveness for employment.
The Fifth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2013), held between January 27 and February 1, 2013 in Seville, Spain, continued a series of international events covering a large spectrum of topics related to advances in database fundamentals, the evolving relation between databases and other domains, database technologies and content processing, as well as the specifics of application-domain databases. Advances in different technologies and domains related to databases triggered substantial improvements in content processing, information indexing, and data, process, and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of XML adoption. High-speed communications and computations, large storage capacities, and load balancing for distributed database access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms, and advanced indexing methods. Evolution in e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems puts pressure on database communities to push the 'de facto' methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology. We take here the opportunity to warmly thank all the members of the DBKDA 2013 Technical Program Committee, as well as the numerous reviewers. The creation of such a high-quality conference program would not have been possible without their involvement. We also kindly thank all the authors who dedicated much of their time and effort to contributing to DBKDA 2013. We truly believe that, thanks to all these efforts, the final conference program consisted of top-quality contributions. Also, this event could not have been a reality without the support of many individuals, organizations, and sponsors.
We are grateful to the members of the DBKDA 2013 organizing committee for their help in handling the logistics and for their work to make this professional meeting a success. We hope that DBKDA 2013 was a successful international forum for the exchange of ideas and results between academia and industry and for the promotion of progress in the fields of databases, knowledge, and data applications. We are convinced that the participants found the event useful and the discussions very open. We also hope the attendees enjoyed the charm of Seville, Spain.
New storage technologies, such as Flash and Non-Volatile Memories, with fundamentally different properties are appearing. Leveraging their performance and endurance requires a redesign of existing architectures and algorithms in modern high-performance databases. Multi-Version Concurrency Control (MVCC) approaches in database systems maintain multiple timestamped versions of a tuple. Once a transaction reads a tuple, the database system tracks and returns the respective version, eliminating lock requests. Hence, under MVCC reads are never blocked, which leverages well the excellent read performance (high throughput, low latency) of new storage technologies. Upon tuple updates, however, established implementations of MVCC approaches (such as Snapshot Isolation) lead to multiple random writes, caused by (i) creation of the new version and (ii) in-place invalidation of the old version, thus generating suboptimal access patterns for the new storage media. The combination of an append-based storage manager operating with tuple granularity and snapshot isolation addresses asymmetry and in-place updates. In this paper, we highlight novel aspects of log-based storage in multi-version database systems on new storage media. We claim that multi-versioning and append-based storage can be used to effectively address asymmetry and endurance. We identify multi-versioning as the approach to address data placement in complex memory hierarchies. We focus on: version handling, (physical) version placement, compression, and collocation of tuple versions on Flash storage and in complex memory hierarchies. We identify possible read- and cache-related optimizations.
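The append-based multi-versioning idea sketched in this abstract can be illustrated with a toy version store. This is a hedged sketch under my own assumptions, not the paper's implementation; all class and variable names are illustrative. Updates append a new timestamped version instead of invalidating the old one in place, so writes become sequential appends and reads never block:

```python
# Toy append-only multi-version store (illustrative sketch): every update
# appends a new timestamped version; a read returns the newest version
# visible to the reader's snapshot timestamp, without any locking.

class AppendOnlyMVStore:
    def __init__(self):
        self.log = []      # append-only version log: (key, ts, value)
        self.clock = 0     # logical timestamp source

    def write(self, key, value):
        self.clock += 1
        self.log.append((key, self.clock, value))  # sequential append only
        return self.clock

    def read(self, key, snapshot_ts):
        # newest version of `key` created at or before the snapshot
        visible = [(ts, v) for (k, ts, v) in self.log
                   if k == key and ts <= snapshot_ts]
        return max(visible)[1] if visible else None

store = AppendOnlyMVStore()
t1 = store.write("x", "v1")
t2 = store.write("x", "v2")
assert store.read("x", t1) == "v1"  # older snapshot still sees "v1"
assert store.read("x", t2) == "v2"
```

A real system would add garbage collection of obsolete versions and an index over the version chains; the sketch only shows why this write pattern suits storage media with asymmetric read/write costs.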
When forecasting sales figures, not only the sales history but also the future price of a product will influence the sales quantity. At first sight, multivariate time series seem to be the appropriate model for this task. Nonetheless, in real life history is not always repeatable; in the case of sales history, there is only one price for a product at a given time. This complicates the design of a multivariate time series. However, for some seasonal or perishable products the price is rather a function of the expiration date than of the sales history. This additional information can help to design a more accurate and causal time series model. The proposed solution uses a univariate time series model but takes the price of a product as a parameter that systematically influences the prediction. The price influence is computed from historical sales data using correlation analysis and adjustable price ranges to identify products with comparable history. Compared to other techniques, this novel approach is easy to compute and allows presetting the price parameter for predictions and simulations. Tests with data from the Data Mining Cup 2012 demonstrate better results than established sophisticated time series methods.
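The core idea of a univariate forecast with a price parameter can be sketched as follows. This is an illustrative assumption of mine, not the paper's actual model: a plain moving average stands in for the univariate time series model, and a least-squares slope of sales versus price stands in for the correlation-based price analysis.

```python
# Hypothetical sketch: a univariate forecast (moving average) shifted by a
# price sensitivity estimated from historical (price, sales) pairs. Names
# and the linear adjustment are illustrative, not the paper's method.

def moving_average(history, window=3):
    return sum(history[-window:]) / min(window, len(history))

def price_sensitivity(prices, sales):
    # least-squares slope of sales vs. price (simple stand-in for the
    # correlation analysis described in the abstract)
    n = len(prices)
    mp, ms = sum(prices) / n, sum(sales) / n
    cov = sum((p - mp) * (s - ms) for p, s in zip(prices, sales))
    var = sum((p - mp) ** 2 for p in prices)
    return cov / var if var else 0.0

def forecast(history, prices, planned_price):
    base = moving_average(history)
    slope = price_sensitivity(prices, history)
    # shift the univariate forecast by the planned price deviation
    return base + slope * (planned_price - prices[-1])

sales  = [100, 90, 80, 70]
prices = [1.0, 1.1, 1.2, 1.3]
# a planned price below the last observed price raises the prediction
assert forecast(sales, prices, 1.0) > moving_average(sales)
```

The point of the design is that the price enters as a presettable parameter: the same fitted model can be queried with different planned prices for what-if simulations.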
Decorative laminates are the most important class of surface-finished engineered wood products. However, while there are numerous scientific publications dealing with the technology of wood, wood-based products, and also liquid coating systems, there is practically no scientific research available in the field of paper-based laminates. In view of ever-increasing global competition, it is time to systematically apply and pursue scientific approaches in this field. The present work is based on a knowledge-based manufacturing paradigm. The application of scientific methodology (e.g. instrumental analysis, process analytics, design of experiments, chemometrics, process modeling) to the preparation of decorative laminates, covering the whole process chain from resin synthesis to paper impregnation to the final laminate, should enable a targeted design of material functionality.
A fast transient current-mode buck-boost DC-DC converter for portable devices is presented. Running at 1 MHz, the converter provides a stable 3 V from a 2.7 V to 4.2 V Li-Ion battery. A small voltage under-/overshoot is achieved by two fast transient techniques: (1) adaptive pulse skipping (APS) and (2) adaptive compensation capacitance (ACC). The proposed converter was implemented in a 0.25 μm CMOS technology. Load transient simulations confirm the effectiveness of APS and ACC. The improvements in voltage undershoot and response time at a light-to-heavy load step (100 mA to 500 mA) are 17 % and 59 %, respectively, in boost mode and 40 % and 49 %, respectively, in buck mode. Similar results are achieved at a heavy-to-light load step for overshoot and response time.
The Dow Jones Sustainability Indexes (DJSI) track the performance of companies that lead in corporate sustainability in their respective sectors or in the geographies in which they operate. Sustainable Asset Management (SAM) Indexes GmbH publishes and markets the indexes, the so-called Dow Jones Sustainability Indexes. All indexes of the DJSI family are assessed according to SAM's Corporate Sustainability Assessment™ methodology.
Ambush marketing in sports
(2013)
Ambush marketing is a strategy by which a company or organisation uses its marketing communications to associate itself with an event without being an official sponsor, authorised partner, or licensee. It has become a particular concern in the marketing of major sports events, with international sponsorship and branding properties worth many millions of dollars. Ambush Marketing in Sports is the first book to offer a comprehensive analysis of the theoretical and practical implications of ambush marketing.
Drawing on cutting-edge empirical research data, the book outlines an innovative model for understanding ambush marketing and offers practical advice for all stakeholders, from sponsors and event organisers to media organisations. The book examines the opportunities and the risks of ambush marketing, assesses the legal, ethical and business dimensions, and offers advice for preventing ambush marketing in a range of contexts. Fully supported throughout with examples and cases from major international sports events, such as the FIFA World Cup and the Olympic Games, this book is important reading for any student, researcher or practitioner with an interest in sport marketing, sport business or event management.
Unraveling the double-edged sword : effects of cultural diversity on creativity and innovativeness
(2014)
Cultural diversity is considered a “double-edged sword” (Kravitz, 2005), as research on its effects on teams’ performance regularly delivers inconsistent and contradictory results. This paper makes an attempt to unravel the double-edged sword by discerning different forms of cultural diversity: separation and variety (Harrison & Klein, 2007). Based on a review of the literature, a conceptual model is developed hypothesizing that cultural variety yields positive, while cultural separation yields negative, effects on team creativity and innovativeness. In addition, the effects of national diversity are contrasted to test whether national diversity can serve as a proxy for cultural diversity, as is often practiced. The model is tested on a sample of 113 student teams in Entrepreneurship modules at 4 European universities. Cultural diversity is measured directly on the basis of individual team members’ cultural value orientations by means of the CPQ4 (Maznevski, DiStefano, Gomez, Noorderhaven & Wu, 2002). Data is analyzed using the PLS structural equation modeling technique. The results confirm the hypothesized impacts of cultural variety and separation on creativity but do not deliver evidence for impacts on innovativeness. The same is true for national diversity. Interestingly, national diversity does not show any relation to either form of cultural diversity.
The Sixth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2014), held between April 20 and 24, 2014 in Chamonix, France, continued a series of international events covering a large spectrum of topics related to advances in database fundamentals, the evolving relationship between databases and other domains, database technologies and content processing, as well as application-domain-specific databases. Advances in different technologies and domains related to databases triggered substantial improvements in content processing, information indexing, and data, process, and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalized adoption of XML. High-speed communications and computations, large storage capacities, and load balancing for distributed database access allow new approaches to content processing with incomplete patterns, advanced ranking algorithms, and advanced indexing methods. Evolution in e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems puts pressure on database communities to push the 'de facto' methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
Silicones
(2014)
Silicones are found in a variety of applications with requirements that range from long life at elevated temperatures to fluidity at low temperatures. This chapter first considers silicone elastomers and their application in room temperature vulcanizing (RTV) and heat curing systems (HTV). Also, new technologies for UV curing are introduced. Coverage of RTVs includes both one-component and two-component systems and the different cure chemistries of each, and is followed by a separate discussion of silicone laminates. Due to the high importance of silicone fluids, they are also discussed. Fluids include polishes, release agents, surfactants, and dielectric fluids.
While digital IC design is highly automated, analog circuits are still handcrafted in a time-consuming, manual fashion today. This paper introduces a novel Parameterized Circuit Description Scheme (PCDS) for the development of procedural analog schematic generators as parameterized circuits. Circuit designers themselves can use PCDS to create circuit automatisms which capture valuable expert knowledge, offer full topological flexibility, and enhance the re-use of well-established topologies. The generic PCDS concept has been successfully implemented and employed to create parameterized circuits for a broad range of use cases. The achieved results demonstrate the efficiency of our PCDS approach and the potential of parameterized circuits to increase automation in circuit design, also to benefit physical design by promoting the common schematic-driven-layout flow, and to enhance the applicability of circuit synthesis approaches.
In the period from the 1950s to 2013, the U.S. Food and Drug Administration (FDA) approved 1346 new molecular entities (NMEs) or new biologics entities (NBEs). On average, the approval rate was 20 NMEs per year. In the past 40 years, the number of new drugs launched onto the market increased slightly from 15 NMEs in the 1970s to 25–30 NMEs since the 1990s. The highest number of new drugs approved by the FDA was in 1996 and 1997, which might be related to the enactment of the Prescription Drug User Fee Act (PDUFA) in 1993.
This contribution showed how a PLC program written in the Ladder Diagram (Kontaktplan) programming language can be analyzed with the help of Petri net analysis methods. The goal of the method is not verification in the strict sense but the detection of forbidden or undesired states. The contribution presented rules for transforming a sequence implemented in Ladder Diagram into a Petri net and demonstrated the capability of the approach by analyzing an incorrectly implemented sequence. The example shows that program errors can be detected even before a test on the real plant. In the further development of the method, one focus is the generalization to program organization units developed in Ladder Diagram that implement more than pure
sequences. Another important development step is graphical support for fault localization in the reachability graph, so that, altogether, a powerful tool becomes available to support the implementation of sequential controls in Ladder Diagram.
Proceedings of the International Workshop on Mobile Networks for Biometric Data Analysis (mBiDA)
(2014)
Prevention and treatment of common and widespread (chronic) diseases is a challenge in any modern society and vitally important for health maintenance in aging societies. Capturing biometric data is a cornerstone of any analysis and treatment strategy. Latest advances in sensor technology allow accurate data measurement in a non-intrusive way. In many cases, it is necessary to provide online monitoring and real-time data capturing to support patients' prevention plans or to allow medical professionals to access the current status. Different communication standards are required to push sensor data and to store and analyze them on different (mobile) platforms. The objective of the workshop is to show new and innovative approaches dedicated to biometric data capture and analysis in a non-intrusive way while maintaining mobility. Examples can be found in human-centered ambient intelligence equipped with sensors or in methodologies applied in real-time-conformant automotive mobile system design. The workshop's main challenge is to focus on approaches promoting non-intrusiveness, reliable prediction algorithms, and high user acceptance. The workshop will provide overview presentations, young researcher poster tracks, doctoral tracks, and classical peer-reviewed full paper tracks. We would especially like to encourage students and young researchers to participate and to contribute to the workshop. Scientific contributions to the event are peer-reviewed by a suitable program committee.
The impact of stress on human beings has become a serious problem. Reported impacts include a higher rate of health disorders such as heart problems, obesity, asthma, diabetes, depression, and many others. An individual in a stressful situation has to deal with altered cognition as well as impaired decision-making and problem-solving skills. This can lead to a higher risk of accidents in dynamic environments such as automotive traffic. Several papers have addressed the estimation and prediction of a driver's stress level during driving. Another important question concerns not only the stress level of the driver himself but also the influence on and of a group of other drivers in the vicinity. This paper proposes a system that determines groups of drivers in a nearby area as clusters and derives the individual stress level. This information is analyzed to generate a stress map, a graphical view of road sections with higher stress influence. Aggregated data can be used to generate navigation routes with lower stress influence, as well as to recommend driving behavior that decreases stress-influenced driving and improves road safety.
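The stress map aggregation described above can be sketched minimally. This is a hedged illustration under my own assumptions (not the paper's system): driver reports of (road segment, stress level) are grouped per segment, the per-segment means form the stress map, and a route recommender prefers the route whose segments accumulate the least stress.

```python
# Illustrative sketch of a stress map: cluster driver reports by road
# segment, aggregate each cluster into a mean stress score, and pick the
# route with the lowest accumulated stress. All names are assumptions.

from collections import defaultdict

def build_stress_map(reports):
    """reports: iterable of (segment_id, stress_level in [0, 1])."""
    buckets = defaultdict(list)
    for segment, stress in reports:
        buckets[segment].append(stress)
    # one aggregated score per road section
    return {seg: sum(vals) / len(vals) for seg, vals in buckets.items()}

def lowest_stress_route(routes, stress_map):
    # route = list of segment ids; unknown segments count as stress 0
    return min(routes, key=lambda r: sum(stress_map.get(s, 0.0) for s in r))

reports = [("A", 0.9), ("A", 0.7), ("B", 0.2), ("C", 0.3)]
smap = build_stress_map(reports)
assert abs(smap["A"] - 0.8) < 1e-9
assert lowest_stress_route([["A", "B"], ["B", "C"]], smap) == ["B", "C"]
```

The route through segments B and C is preferred because the high-stress segment A dominates the alternative, which is exactly the kind of recommendation the abstract envisions.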
Marketing of and with sports is as international as sports itself. While this impression may be intuitively evident during global events such as the Olympic Games, internationalisation also takes place in the daily routines of our increasingly globalised domestic leagues and sports events. In this book, edited by André Bühler and Gerd Nufer, leading sports economists and marketing experts from around the world provide detailed insights into current issues and future challenges of sports marketing from an international perspective. An inspiring reading and an essential book to gain a better understanding of today’s status quo and developmental stages of sports marketing in the various regions of this world.
Today, 40 Gbps over four-pair balanced cabling is in development in IEEE 802.3bq. In this paper, we describe a transmission experiment at 25 Gbps enabling either a single-pair transmission of 25 Gbps over a 30-meter balanced cabling channel or a 100 Gbps transmission via a four-pair balanced channel. A scalable matrix modeling tool is introduced which allows the prediction of the transmission characteristics of a channel, taking mode conversion into account. We applied this tool to characterize PCB channels, including the magnetics and PCB, for a four-pair 100 Gbps transmission. We evaluated prototype cables and connecting hardware for frequencies up to 2 GHz and beyond. Finally, we investigated possible line encoding schemes and provide measurement results of a transmission over 30 m with a data rate of 25 Gbps per twisted pair.
In this paper, research projects with 30-meter balanced cabling and data rates up to 25 Gbps over one single pair are described. The project aim is to achieve 100 Gbps via a four-pair balanced cabling channel. In the following, spectral characteristics of the prototype twisted pair used are presented: the insertion loss of the single cable in comparison to the insertion loss of the cable combined with an equalizing amplifier, as well as the group delay of the cable and of the cable connected to the equalizing amplifier. Furthermore, a carrierless Pulse Amplitude Modulation with 32 different levels (PAM-32) is presented as an approach for a possible line encoding. Finally, measurements of data transmission at a data rate of up to 25 Gbps via shielded twisted pair are shown.
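The appeal of PAM-32 as a line code can be illustrated with a toy symbol mapper. This is only a sketch under my own assumptions (the paper does not specify a mapping): each symbol carries log2(32) = 5 bits, mapped to one of 32 equally spaced amplitude levels, so a 25 Gbps bit rate needs only a 5 GBd symbol rate.

```python
# Toy PAM-32 mapper (illustrative, not the paper's encoding): 5-bit groups
# map to 32 equally spaced amplitude levels in [-31, 31].

def pam32_encode(bits):
    """Split a bit string into 5-bit symbols; map each to an amplitude level."""
    assert len(bits) % 5 == 0
    levels = []
    for i in range(0, len(bits), 5):
        value = int(bits[i:i + 5], 2)   # 0 .. 31
        levels.append(2 * value - 31)   # 32 equally spaced levels
    return levels

def pam32_decode(levels):
    return "".join(format((lvl + 31) // 2, "05b") for lvl in levels)

bits = "0000011111"
assert pam32_encode(bits) == [-31, 31]     # extreme levels
assert pam32_decode(pam32_encode(bits)) == bits
```

The trade-off the sketch makes visible: packing 5 bits per symbol reduces the required channel bandwidth fivefold, at the cost of much tighter level spacing and thus reduced noise margin, which is why the cable's insertion loss and equalization matter so much in the experiments.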
An index in a Multi-Version DBMS (MV-DBMS) has to reflect different tuple versions of a single data item. Existing approaches follow the paradigm of logically separating the tuple version data from the data item; e.g., an index is only allowed to return at most one version of a single data item (while it may return multiple data items that match a search criterion). Hence, to determine the valid (and therefore visible) tuple version of a data item, the MV-DBMS first fetches all tuple versions that match the search criterion and subsequently filters visible versions using visibility checks. This involves storage I/O for tuple versions that ultimately do not need to be fetched. In this vision paper we present the Multi Version Index (MV-IDX) approach that allows index-only visibility checks, which significantly reduce the amount of storage I/O as well as the index maintenance overhead. The MV-IDX achieves significantly lower response times and higher transactional throughput on OLTP workloads.
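The index-only visibility check idea can be sketched as follows. This is a hedged illustration, not MV-IDX's actual layout: each index entry carries the version's validity interval [begin_ts, end_ts), so the index alone can decide which version a snapshot sees, and only the single visible version is ever fetched from storage.

```python
# Hypothetical sketch of index-only visibility checks: index entries carry
# timestamps, so invisible versions are filtered without any tuple fetch.
# Structure and names are illustrative assumptions.

class VersionedIndex:
    def __init__(self):
        # key -> list of (begin_ts, end_ts, tuple_pointer); end_ts=None means
        # the version is still valid
        self.entries = {}

    def insert(self, key, begin_ts, pointer):
        versions = self.entries.setdefault(key, [])
        if versions:
            # invalidate the predecessor version inside the index itself
            b, _, p = versions[-1]
            versions[-1] = (b, begin_ts, p)
        versions.append((begin_ts, None, pointer))

    def lookup(self, key, snapshot_ts):
        # visibility check on index data alone -- no storage I/O for
        # versions the snapshot cannot see
        for begin, end, ptr in self.entries.get(key, []):
            if begin <= snapshot_ts and (end is None or snapshot_ts < end):
                return ptr
        return None

idx = VersionedIndex()
idx.insert("item42", begin_ts=10, pointer="page3:slot7")
idx.insert("item42", begin_ts=20, pointer="page9:slot1")
assert idx.lookup("item42", 15) == "page3:slot7"  # old snapshot
assert idx.lookup("item42", 25) == "page9:slot1"  # current snapshot
```

In the existing paradigm the abstract criticizes, both pointers would be followed and the invisible version discarded after a fetch; here the timestamp test in `lookup` eliminates that extra I/O.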