Transaction processing is of growing importance for mobile computing. Booking tickets, flight reservations, banking, ePayment, and booking holiday arrangements are just a few examples of mobile transactions. Due to temporarily disconnected situations, synchronisation and consistent transaction processing are key issues. Serializability is too strong a criterion for correctness when the semantics of a transaction are known. We introduce a transaction model that allows higher concurrency for a certain class of transactions defined by their semantics. The transaction results are "escrow serializable" and the synchronisation mechanism is non-blocking. An experimental implementation showed higher concurrency, higher transaction throughput, and fewer resources used than common locking or optimistic protocols.
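The abstract does not detail the mechanism, but the classic escrow idea that "escrow serializable" alludes to can be sketched as follows. This is a hypothetical, simplified Java illustration (class and method names are ours, not the paper's): short critical sections guard the counter, but no transaction ever waits for another, which is the sense in which the scheme is non-blocking.

```java
/** A minimal sketch of the classic escrow idea (not the paper's code):
 *  commutative debits are admitted without transaction-level blocking
 *  as long as the worst-case value respects a lower bound. */
public class EscrowCounter {
    private long committed;  // value as seen by committed transactions
    private long reserved;   // total amount escrowed by open debits
    private final long min;  // integrity constraint, e.g., balance >= 0

    public EscrowCounter(long initial, long min) {
        this.committed = initial;
        this.min = min;
    }

    /** Decides immediately (no waiting on other transactions) whether
     *  a debit can be admitted under the worst-case assumption that
     *  all currently open debits will commit. */
    public synchronized boolean tryDebit(long amount) {
        if (committed - reserved - amount < min) {
            return false;        // would risk violating the bound
        }
        reserved += amount;      // escrow the amount for this transaction
        return true;
    }

    public synchronized void commitDebit(long amount) {
        reserved -= amount;
        committed -= amount;     // the debit becomes permanent
    }

    public synchronized void abortDebit(long amount) {
        reserved -= amount;      // simply release the escrowed amount
    }
}
```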
Modern web-based applications are often built as multi-tier architectures using persistence middleware. Middleware technology providers recommend the use of Optimistic Concurrency Control (OCC) mechanisms to avoid the risk of blocked resources. However, most vendors of relational database management systems implement only locking schemes for concurrency control. As a consequence, a kind of OCC has to be implemented on the client or middleware side.
A simple Row Version Verification (RVV) mechanism has been proposed to implement OCC on the client side. For performance reasons, the middleware uses buffers (caches) of its own to avoid network traffic and possible disk I/O. This caching, however, complicates the use of RVV because the data in the middleware cache may be stale (outdated). We investigate various data access technologies, including the new Java Persistence API (JPA) and Microsoft's LINQ technologies, for their ability to use the RVV programming discipline.
The use of persistence middleware that tries to relieve the programmer from low-level transaction programming turns out to even complicate the situation in some cases. Programmed examples show how to use SQL data access patterns to solve the problem.
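As a concrete illustration of the RVV discipline, here is a minimal JDBC sketch (the table, column and method names are illustrative, not taken from the paper): every row carries a version counter, and an update succeeds only if the version read earlier is still current.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class RvvExample {
    /**
     * Updates an account balance using Row Version Verification.
     * Returns false if the row version changed since it was read,
     * i.e., a concurrent transaction won the race.
     */
    static boolean updateBalance(Connection con, long id,
                                 long expectedVersion, double newBalance)
            throws SQLException {
        String sql = "UPDATE account SET balance = ?, rv = rv + 1 "
                   + "WHERE id = ? AND rv = ?";
        try (PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setDouble(1, newBalance);
            ps.setLong(2, id);
            ps.setLong(3, expectedVersion);
            // Zero affected rows means the version check failed:
            // the caller should re-read the row and retry or abort.
            return ps.executeUpdate() == 1;
        }
    }
}
```

The middleware-cache problem the abstract mentions arises exactly here: if the cached row (and thus expectedVersion) is stale, the version check fails even though the application believes it holds current data.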
The Third International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2011), held on January 23-27, 2011 in St. Maarten, The Netherlands Antilles, continued a series of international events covering a large spectrum of topics related to advances in the fundamentals of databases, the evolving relation between databases and other domains, database technologies and content processing, as well as specifics of application-domain databases. Advances in different technologies and domains related to databases triggered substantial improvements for content processing, information indexing, and data, process and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of XML adoption. High-speed communications and computations, large storage capacities, and load-balancing for distributed database access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms and advanced indexing methods. The evolution of e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems puts pressure on database communities to push the 'de facto' methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology. We take this opportunity to thank all the members of the DBKDA 2011 Technical Program Committee as well as the numerous reviewers. The creation of such a broad and high-quality conference program would not have been possible without their involvement. We also kindly thank all the authors who dedicated much of their time and effort to contribute to DBKDA 2011. We truly believe that, thanks to all these efforts, the final conference program consists of top-quality contributions. This event could also not have been a reality without the support of many individuals, organizations, and sponsors. We are grateful to the members of the DBKDA 2011 organizing committee for their help in handling the logistics and for their work to make this professional meeting a success. We hope that DBKDA 2011 was a successful international forum for the exchange of ideas and results between academia and industry and for the promotion of progress in database research. We are convinced that the participants found the event useful and communications very open. The beautiful places of St. Maarten surely provided a pleasant environment during the conference, and we hope you had a chance to visit the surroundings.
This work presents a disconnected transaction model able to cope with the increased complexity of long-living, hierarchically structured, and disconnected transactions. We combine an Open and Closed Nested Transaction Model with Optimistic Concurrency Control and interrelate flat transactions with the aforementioned complex nature. Despite temporary inconsistencies during a transaction's execution, our model ensures consistency.
The Fourth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2012), held between February 29th and March 5th, 2012 in Saint Gilles, Reunion Island, continued a series of international events covering a large spectrum of topics related to advances in the fundamentals of databases, the evolving relation between databases and other domains, database technologies and content processing, as well as specifics of application-domain databases. Advances in different technologies and domains related to databases triggered substantial improvements for content processing, information indexing, and data, process and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of XML adoption. High-speed communications and computations, large storage capacities, and load-balancing for distributed database access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms and advanced indexing methods. The evolution of e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems puts pressure on database communities to push the 'de facto' methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology. We take here the opportunity to warmly thank all the members of the DBKDA 2012 Technical Program Committee, as well as the numerous reviewers. The creation of such a broad and high-quality conference program would not have been possible without their involvement. We also kindly thank all the authors who dedicated much of their time and effort to contribute to DBKDA 2012. We truly believe that, thanks to all these efforts, the final conference program consisted of top-quality contributions. Also, this event could not have been a reality without the support of many individuals, organizations, and sponsors. We are grateful to the members of the DBKDA 2012 organizing committee for their help in handling the logistics and for their work to make this professional meeting a success. We hope that DBKDA 2012 was a successful international forum for the exchange of ideas and results between academia and industry and for the promotion of progress in the fields of databases, knowledge, and data applications. We are convinced that the participants found the event useful and communications very open. We also hope the attendees enjoyed the charm of Saint Gilles, Reunion Island.
Multi-dimensional patient data, such as time-varying volume data, data of different imaging modalities, surface segmentations, etc., are of growing importance in the clinical routine. For many use cases, it is of major importance to replicate a certain visualization of a data set created on one machine on a different computer using different software tools. Until now, no standardized methodology for this consistent presentation has existed. We propose an extension of Digital Imaging and Communications in Medicine (DICOM) called "Multi-dimensional Presentation State" and outline the scope and first results of the standardization process.
Energy efficiency and safety have become important factors for car manufacturers. Thus, cars have been optimised regarding energy consumption and safety, for example by optimising the power train or the engine. Besides the optimisation of the car itself, energy efficiency and safety can also be increased by adapting the individual driving behaviour to the current driving situation. This paper introduces a driving system which is in development. Its goal is to optimise the driving behaviour in terms of energy efficiency and safety by giving recommendations to the driver. To create a recommendation, the driving system monitors the driver and the current driving situation as well as the car, using in-vehicle sensors and serial-bus systems. On the basis of the acquired data, the driving system will give individual energy-efficiency and safety recommendations in real time. This will allow eliminating bad driving habits while considering the driver's needs.
Telemedicine is becoming an increasingly important approach to diagnosing, treating, or preventing diseases. However, the usage of Information and Communication Technologies in healthcare results in a considerable amount of data that must be efficiently and securely transmitted. Many manufacturers provide telemedicine platforms without regard to interoperability, mobility and collaboration. This paper describes a collaborative mobile telemonitoring platform that can use the IEEE 11073 and HL7 communication standards or adapt proprietary protocols. The proposed platform also covers security and modularity aspects. Furthermore, this work introduces an Android-based prototype implementation.
This paper presents a new European initiative to support the sustainable empowerment of the ageing society. Empowerment in this context represents the capability to lead a self-determined, autonomous and healthy life. The paper justifies the need for such an initiative and highlights the role that telemedicine and ambient assisted living can play in this environment.
The workshop aims to discuss leading-edge contributions to the interdisciplinary research area of ambient intelligence (AmI) applied to the domains of telemedicine and driving assistance. AmI refers to human-centered environments equipped with sensors. The development of AmI in the two application domains of the workshop shares several commonalities: the extensive usage of networked devices and sensors, the design of artificial intelligence algorithms for diagnosis, including recommendation systems and qualitative reasoning, and the application of mobile and wireless communication to their distributed systems. Together with the presentation of common aspects of ambient intelligence, a further goal of the workshop is to stimulate synergies between both application domains and present examples. The telemedicine domain can benefit from methodologies in designing complex devices, real-time-compliant system design, and audiovisual or computer vision system design used in automotive driving assistance. Conversely, the automotive domain can benefit from the user-centric view, biometric sensor data design, and multi-user databases for aggregation and diagnosis using big data as used in telemedicine. The German Government supports these research lines in its Hightech-Strategie under the domains "Health and Nutrition" and "Climate and Energy". In Spain, the corresponding program is the "Spanish Program for R&D Challenge-Oriented Society – Challenge in safe, efficient and clean energy & Challenge in sustainable, smart and integrated transport". Scientific contributions to the event are peer-reviewed by a suitable program committee with members from Germany and Spain. The same committee has served the JARCA workshop (Jornadas sobre Sistemas cualitativos y sus Aplicaciones en Diagnosis, Robótica e Inteligencia Ambiental – Conference on Qualitative Systems and their Applications in Diagnosis, Robotics and Ambient Intelligence) for 15 years. This workshop is sponsored by the German Academic Exchange Service (DAAD) under contract number 57070010.
The Fifth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2013), held between January 27th and February 1st, 2013 in Seville, Spain, continued a series of international events covering a large spectrum of topics related to advances in the fundamentals of databases, the evolving relation between databases and other domains, database technologies and content processing, as well as specifics of application-domain databases. Advances in different technologies and domains related to databases triggered substantial improvements for content processing, information indexing, and data, process and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of XML adoption. High-speed communications and computations, large storage capacities, and load-balancing for distributed database access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms and advanced indexing methods. The evolution of e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems puts pressure on database communities to push the 'de facto' methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology. We take here the opportunity to warmly thank all the members of the DBKDA 2013 Technical Program Committee, as well as the numerous reviewers. The creation of such a high-quality conference program would not have been possible without their involvement. We also kindly thank all the authors who dedicated much of their time and effort to contribute to DBKDA 2013. We truly believe that, thanks to all these efforts, the final conference program consisted of top-quality contributions. Also, this event could not have been a reality without the support of many individuals, organizations, and sponsors. We are grateful to the members of the DBKDA 2013 organizing committee for their help in handling the logistics and for their work to make this professional meeting a success. We hope that DBKDA 2013 was a successful international forum for the exchange of ideas and results between academia and industry and for the promotion of progress in the fields of databases, knowledge and data applications. We are convinced that the participants found the event useful and communications very open. We also hope the attendees enjoyed the charm of Seville, Spain.
New storage technologies, such as Flash and Non-Volatile Memories, with fundamentally different properties are appearing. Leveraging their performance and endurance requires a redesign of existing architectures and algorithms in modern high-performance databases. Multi-Version Concurrency Control (MVCC) approaches in database systems maintain multiple timestamped versions of a tuple. When a transaction reads a tuple, the database system tracks and returns the respective version, eliminating lock requests. Hence, under MVCC reads are never blocked, which leverages well the excellent read performance (high throughput, low latency) of new storage technologies. Upon tuple updates, however, established implementations of MVCC approaches (such as Snapshot Isolation) lead to multiple random writes – caused by (i) creation of the new version and (ii) in-place invalidation of the old version – thus generating suboptimal access patterns for the new storage media. The combination of an append-based storage manager operating with tuple granularity and snapshot isolation addresses asymmetry and in-place updates. In this paper, we highlight novel aspects of log-based storage in multi-version database systems on new storage media. We claim that multi-versioning and append-based storage can be used to effectively address asymmetry and endurance. We identify multi-versioning as the approach to address data placement in complex memory hierarchies. We focus on version handling, (physical) version placement, and compression and collocation of tuple versions on Flash storage and in complex memory hierarchies. We identify possible read- and cache-related optimizations.
When forecasting sales figures, not only the sales history but also the future price of a product will influence the sales quantity. At first sight, multivariate time series seem to be the appropriate model for this task. Nonetheless, in real life history is not always repeatable; i.e., in the case of sales history there is only one price for a product at a given time. This complicates the design of a multivariate time series. However, for some seasonal or perishable products the price is rather a function of the expiration date than of the sales history. This additional information can help to design a more accurate and causal time series model. The proposed solution uses a univariate time series model but takes the price of a product as a parameter that systematically influences the prediction. The price influence is computed based on historical sales data using correlation analysis and adjustable price ranges to identify products with comparable history. Compared to other techniques, this novel approach is easy to compute and allows presetting the price parameter for predictions and simulations. Tests with data from the Data Mining Cup 2012 demonstrate better results than established, sophisticated time series methods.
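To make the idea of a price-parameterized univariate model concrete, here is a hedged Java sketch. All names, the exponential-smoothing base model, and the linear price adjustment are our assumptions for illustration; the paper derives its price influence from correlation analysis over comparable products rather than from a given elasticity.

```java
import java.util.List;

/** Hypothetical illustration of a univariate forecast adjusted by a
 *  price parameter. Not the paper's algorithm: the sensitivity would
 *  be estimated from historical data of comparable products. */
public class PriceAwareForecast {

    /** Base forecast from a simple univariate model
     *  (single exponential smoothing with fixed alpha). */
    static double univariateForecast(List<Double> salesHistory) {
        double alpha = 0.3, level = salesHistory.get(0);
        for (double s : salesHistory) {
            level = alpha * s + (1 - alpha) * level;
        }
        return level;
    }

    /** Adjusts the base forecast by a price-sensitivity factor,
     *  allowing the price parameter to be preset for simulations. */
    static double forecast(List<Double> salesHistory,
                           double plannedPrice, double referencePrice,
                           double priceSensitivity) {
        double base = univariateForecast(salesHistory);
        // Simple multiplicative price influence: predicted sales drop
        // when the planned price exceeds the reference price.
        double factor = 1.0 - priceSensitivity
                        * (plannedPrice - referencePrice) / referencePrice;
        return Math.max(0.0, base * factor);
    }
}
```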
A fast-transient current-mode buck-boost DC-DC converter for portable devices is presented. Running at 1 MHz, the converter provides a stable 3 V from a 2.7 V to 4.2 V Li-Ion battery. A small voltage under-/overshoot is achieved by fast transient techniques: (1) adaptive pulse skipping (APS) and (2) adaptive compensation capacitance (ACC). The proposed converter was implemented in a 0.25 μm CMOS technology. Load transient simulations confirm the effectiveness of APS and ACC. The improvements in voltage undershoot and response time at a light-to-heavy load step (100 mA to 500 mA) are 17 % and 59 %, respectively, in boost mode and 40 % and 49 %, respectively, in buck mode. Similar results are achieved at a heavy-to-light load step for overshoot and response time.
The Sixth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2014), held between April 20-24, 2014 in Chamonix, France, continued a series of international events covering a large spectrum of topics related to advances in the fundamentals of databases, the evolving relation between databases and other domains, database technologies and content processing, as well as specifics of application-domain databases. Advances in different technologies and domains related to databases triggered substantial improvements for content processing, information indexing, and data, process and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of XML adoption. High-speed communications and computations, large storage capacities, and load-balancing for distributed database access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms and advanced indexing methods. The evolution of e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems puts pressure on database communities to push the 'de facto' methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
While digital IC design is highly automated, analog circuits are still handcrafted in a time-consuming, manual fashion today. This paper introduces a novel Parameterized Circuit Description Scheme (PCDS) for the development of procedural analog schematic generators as parameterized circuits. Circuit designers themselves can use PCDS to create circuit automatisms which capture valuable expert knowledge, offer full topological flexibility, and enhance the re-use of well-established topologies. The generic PCDS concept has been successfully implemented and employed to create parameterized circuits for a broad range of use cases. The achieved results demonstrate the efficiency of our PCDS approach and the potential of parameterized circuits to increase automation in circuit design, to benefit physical design by promoting the common schematic-driven-layout flow, and to enhance the applicability of circuit synthesis approaches.
This paper showed how a PLC program written in the ladder diagram programming language can be analysed using Petri net analysis techniques. The goal of the method is not verification in the strict sense but the detection of forbidden or undesired states. The paper gave rules for transforming a sequence implemented in ladder diagram into a Petri net and demonstrated the capability of the approach by analysing a faultily implemented sequence. The example shows that program errors can be detected before testing on the real plant. Further development of the method focuses on generalising it to program organization units developed in ladder diagram that implement more than pure sequences. Another important development step is graphical support for fault localisation in the reachability graph, so that altogether a powerful tool for supporting the implementation of sequential controls in ladder diagram becomes available.
Proceedings of the International Workshop on Mobile Networks for Biometric Data Analysis (mBiDA) (2014)
Prevention and treatment of common and widespread (chronic) diseases is a challenge in any modern society and vitally important for health maintenance in ageing societies. Capturing biometric data is a cornerstone for any analysis and treatment strategy. The latest advances in sensor technology allow accurate data measurement in a non-intrusive way. In many cases, it is necessary to provide online monitoring and real-time data capturing to support patients' prevention plans or to allow medical professionals to access the current status. Different communication standards are required to push sensor data and to store and analyse them on different (mobile) platforms. The objective of the workshop is to show new and innovative approaches dedicated to biometric data capture and analysis in a non-intrusive way while maintaining mobility. Examples can be found in human-centered ambient intelligence equipped with sensors or in methodologies applied in automotive real-time-compliant mobile system design. The workshop's main challenge is to focus on approaches promoting non-intrusiveness, reliable prediction algorithms and high user acceptance. The workshop will provide overview presentations, young researcher poster tracks, doctoral tracks and classical peer-reviewed full paper tracks. We would especially like to encourage students and young researchers to participate and to contribute to the workshop. Scientific contributions to the event are peer-reviewed by a suitable program committee.
Today, 40 Gbps transmission over four-pair balanced cabling is in development at IEEE 802.3bq. In this paper, we describe a transmission experiment at 25 Gbps enabling either a single-pair transmission of 25 Gbps over a 30-meter balanced cabling channel or a 100 Gbps transmission via a four-pair balanced channel. A scalable matrix modeling tool is introduced which allows the prediction of the transmission characteristics of a channel taking mode conversion into account. We applied this tool to characterize PCB channels, including the magnetics and PCB, for a four-pair 100 Gbps transmission. We evaluated prototype cables and connecting hardware for frequencies up to 2 GHz and beyond. Finally, we investigated possible line encoding schemes and provide measurement results of a transmission over 30 m with a data rate of 25 Gbps per twisted pair.
In this paper, research projects with 30-meter balanced cabling and data rates up to 25 Gbps over one single pair are described. The project aim is to achieve 100 Gbps via a four-pair balanced cabling channel. In the following, the spectral characteristics of the prototype twisted pair used are presented: the insertion loss of the single cable in comparison to the insertion loss of the cable in combination with an equalizing amplifier, as well as the group delay of the cable and of the cable connected to the equalizing amplifier. Furthermore, a carrierless Pulse Amplitude Modulation with 32 different levels (PAM-32) is presented as an approach for a possible line encoding. Finally, research measurements of the data transmission with a data rate up to 25 Gbps via shielded twisted pair are shown.
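As a quick sanity check on the line coding (a standard textbook relation, not a figure taken from the paper): a PAM-32 symbol carries $\log_2 32 = 5$ bits, so the 25 Gbps per-pair rate corresponds to a symbol rate of

\[
R_s = \frac{R_b}{\log_2 M} = \frac{25\ \mathrm{Gbit/s}}{\log_2 32} = 5\ \mathrm{GBaud},
\]

which keeps the required channel bandwidth comparatively moderate for the 30 m twisted-pair channel.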
An index in a Multi-Version DBMS (MV-DBMS) has to reflect different tuple versions of a single data item. Existing approaches follow the paradigm of logically separating the tuple version data from the data item; e.g., an index is only allowed to return at most one version of a single data item (while it may return multiple data items that match a search criterion). Hence, to determine the valid (and therefore visible) tuple version of a data item, the MV-DBMS first fetches all tuple versions that match the search criterion and subsequently filters the visible versions using visibility checks. This involves I/O storage accesses to tuple versions that do not have to be fetched. In this vision paper we present the Multi-Version Index (MV-IDX) approach, which allows index-only visibility checks that significantly reduce the amount of I/O storage accesses as well as the index maintenance overhead. The MV-IDX achieves significantly lower response times and higher transactional throughput on OLTP workloads.
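One way to picture index-only visibility checks is to let each index entry carry the validity interval of the version it points to, so the snapshot test runs on the entry alone. This is a hypothetical Java sketch of that idea (field and method names are ours; the paper's actual data layout may differ):

```java
/** Hypothetical sketch: an index entry that carries the creating and
 *  invalidating timestamps of its tuple version, so visibility can be
 *  decided without fetching the version from storage. */
public class MvIndexEntry {
    long key;            // indexed attribute value
    long tupleId;        // logical data item the version belongs to
    long versionPointer; // physical location of this tuple version
    long createdTs;      // commit timestamp of the creating transaction
    long invalidatedTs;  // invalidation timestamp, or Long.MAX_VALUE

    /** Index-only visibility check for a reader whose snapshot was
     *  taken at snapshotTs: only entries passing this test need I/O. */
    boolean visibleTo(long snapshotTs) {
        return createdTs <= snapshotTs && snapshotTs < invalidatedTs;
    }
}
```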
Advanced power semiconductors such as DMOS transistors are key components of modern power electronic systems. Recent discrete and integrated DMOS technologies have very low area-specific on-state resistances, so that devices with small sizes can be chosen. However, their power dissipation can sometimes be large, for example in fault conditions, causing the device temperature to rise significantly. This can lead to excessive temperatures, reduced lifetime, and possibly even thermal runaway and subsequent destruction. Therefore, it must be ensured already in the design phase that the temperature always remains in an acceptable range. This paper shows how self-heating in DMOS transistors can be experimentally determined with high accuracy. Further, it discusses how numerical electrothermal simulations can be carried out efficiently, allowing the accurate assessment of self-heating within a few minutes. The presented approach has been successfully verified experimentally for device temperatures exceeding 500 °C, up to the onset of thermal runaway.
This paper presents a new broadband antenna for satellite communications. It describes the procedure involved in the design of a microstrip antenna array and its multi-level passive feed network that together yield circular polarization and the necessary gain to be used in an earth-satellite link. The designed antenna is notable for its large bandwidth, circular polarization, high gain and small dimensions.
This paper presents the design and simulation of an Equiangular Spiral Antenna for the extremely high frequencies between 65 GHz and 170 GHz. A new approach for the analysis of the antenna's electrical parameters is described. This approach is based on the formalism proposed by Rumsey to determine the EM field produced by an equiangular spiral antenna. Analytical expressions for the electrical parameters, such as the gain or the directivity, are then calculated using well-founded mathematical approximations. The comparison of the obtained results with those from numerical integration methods shows good agreement.
Functionally impaired people have problems choosing and finding the right clothing, and they need help in their daily life to wash and manage it. The goal of this work is to support the user by giving recommendations on choosing the right clothing, finding the clothing, and how to wash it. The idea behind eKlarA is to create a gateway-based system that uses sensors to identify clothing and its state in the clothing cycle. The clothing cycle consists of one or more closets, laundry baskets and washing machines in one or several places. The gateway uses the information about the clothing, the weather and the calendar to support the user in the different steps of the clothing cycle. This gives functionally impaired people more freedom in their daily life.
Besides the optimisation of the car, energy efficiency and safety can also be increased by optimising the driving behaviour. Based on this fact, a driving system is in development whose goal is to educate the driver in energy-efficient and safe driving. It monitors the driver, the car and the environment and gives energy-efficiency and safety-relevant recommendations. However, the driving system tries not to distract or bother the driver with recommendations, for example during stressful driving situations or when the driver is not interested in a recommendation. Therefore, the driving system monitors the stress level of the driver as well as the driver's reaction to a given recommendation and decides whether to give a recommendation or not. This makes it possible to suppress recommendations when needed and, thus, to increase road safety and the user acceptance of the driving system.
Three different polyols (soluble starch, sucrose, and glycerol) were tested for their potential in the chemical modification of melamine formaldehyde (MF) resins for paper impregnation. MF-impregnated papers are widely used as finishing materials for engineered wood. These polyols were selected because the presence of multiple hydroxy groups in the molecules was suspected to facilitate cocondensation with the main MF framework, which should lead to good resin performance. Moreover, they are readily produced from natural feedstock. They are available in large quantities and may serve as economically feasible, environmentally harmless alternative co-monomers suitable to substitute a portion of fossil-based starting material. In the presented work, a number of model resins were synthesized and tested for covalent incorporation of the natural polyol into the MF framework. Spectroscopic evidence of chemical incorporation of glycerol was found by applying 1H, 13C, 1H/13C HSQC, 1H/13C HMBC, and 1H DOSY methods. It was furthermore found that covalent incorporation of glycerol into the network took place when glycerol was added at different stages during synthesis. Further, all resins were used to prepare decorative laminates, and the performance of the novel resins as surface finishing was evaluated using standard technological tests. The technological performance of the various modified thermosetting resins was assessed by determining flow viscosity, molar mass distribution, and storage stability, and, in a second step, by laminating impregnated paper to particle boards and testing the resulting surfaces according to standardized quality tests. In most cases, the average board surface properties were of acceptable quality. Our findings demonstrate the possibility of replacing several percent of the petrol-based product melamine by compounds obtained from renewable resources.
Mass customization is a megatrend that also affects the wood industry. To obtain individually designed laminates in batch size one, efficient printing and processing technologies are required. Digital printing was envisaged as it does not depend on costly printing cylinders (as used in rotogravure printing) and allows rapid exchange of the printing designs. In the present work, two well-established digital printing approaches, the multi-pass and the single-pass technique, were investigated and evaluated for their applicability in decorating engineered wood and low-pressure melamine (LPM) films. Three different possibilities of implementing digital printing in the decorative laminates manufacturing process were studied: (1) digital printing on coated chipboard with subsequent application of a lacquered top-coat or melamine overlay (designated as "direct printing", since the LPM was the printing substrate), (2) digital printing on decorative paper which was subsequently impregnated before hot pressing (designated as "indirect printing, variant A") and (3) digital printing on decorative paper with subsequent interlamination of the paper between impregnated under- and overlay paper layers during the pressing process (designated as "indirect printing, variant B"). Due to various advantages of the resulting cured melamine resin surfaces, including a much better technological performance and flexibility in surface texture design, it was decided to industrially pursue only the indirect digital printing process comprising interlamination and the direct printing process with a melamine overlay finishing. The basis for the successful trials on production and laboratory scales was the identification of applicable inks (in terms of compatibility with melamine resin) and of an appropriate printing paper quality (in terms of impregnation and imprinting ability). After selection and fine-tuning of suitable materials, the next challenge to overcome was the initially insufficient bond strength between the impregnated overlay and the ink layers, which led to unsatisfactory print appearance and delamination effects. However, the optimization of the pressing program and the development of a modified impregnation procedure for the underlay and overlay papers allowed the successful implementation of digital printing in the production line of our industrial partner FunderMax.
Prior studies have ascribed people's poor performance in dealing with basic systems concepts to different causes. While results indicate that, among other things, domain-specific experience and familiarity with the problem context play a role in this stock-flow (SF) performance, this has not yet been fully clarified. In this article, we present an experiment that examines the role of educational background in SF performance. We hypothesize that SF performance increases when the problem context is embedded in the problem solver's knowledge domain, indicated by educational background. Using the square wave pattern and the sawtooth pattern tasks from the initial study by Booth Sweeney and Sterman (2000), we design two additional cover stories for the former, the Vehicle story from the engineering domain and the Application story from the business domain, next to the original Bathtub story. We then test the three sets of questions on business students. Results mainly support our hypothesis. Interestingly, participants even do better on a more complex behavioral pattern from their knowledge domain than on a simpler pattern from more distant domains. Although these findings have to be confirmed by further studies, they contribute both to the methodology of future surveys and to the context familiarity discussion.
This paper compares the influence that a video self-avatar and the lack of a visual representation of a body have on height estimation when standing at a virtual visual cliff. A height estimation experiment was conducted using a custom augmented reality Oculus Rift hardware and software prototype, also described in this paper. The results are consistent with previous research, demonstrating that the presence of a visual body influences height estimates, just as it has been shown to influence distance estimates and affordance estimates.
Industry 4.0 predicts that industrial processes, technological infrastructure and all corresponding business processes will, with the help of information and communication technology (ICT), advance to integrated, ad-hoc interconnected and decentralized Cyber-Physical Production Systems (CPPS) with real-time capabilities of self-optimization and adaptability. Considering this change, the human being will remain in a dominant role, because it is not expected that the human factor with its characteristics and capabilities will be substituted entirely by autonomously acting technology in the foreseeable future. Mechanical intelligence, for instance, is limited to the selection of predefined options, while human creativity, flexibility, and the ability to learn and to improve are required to design and configure systems, processes and products. Humans have the expertise and experience to analyze, assess and solve problems, even in exceptional situations. However, the amount of purely manual tasks for shop floor workers will decrease. Their role will change from a manually executing to a proactive, preconceiving worker with increased responsibility. Due to the growing degree of digitalization and interconnectedness, the tasks and responsibilities of planning and design personnel will also continuously expand and become more complex. The work in versatile ad-hoc networks with advanced ICT tools and assistance systems will lead to increased requirements regarding the knowledge, capability and capacity of the respective employees. The ongoing pervasion of IT and the emergence of systems with unprecedented complexity specifically require significantly improved capabilities in analysis, abstraction, problem solving and decision making from future labour. Accordingly, industry is asking for graduates educated in an interdisciplinary and practice-oriented way. Some universities already meet these expectations, using learning factories for realistic, action-oriented classes and trainings. Lecturers are confronted with the challenge of identifying future job profiles and correlated qualification requirements, especially regarding the conceptualization and implementation of CPPS, and of adapting and enhancing their education concepts and methods adequately and consistently. For the new, virtual world of manufacturing, a proper understanding of engineering as well as computer science is essential; Industry 4.0 implies this interdisciplinary split. Integrated competencies for product and process planning and design, methodological competencies for systematic idea and innovation management, as well as a holistic system and interface competence will be crucial to achieve the interconnection of physical and digital processes and machines. The Vienna University of Technology and ESB Reutlingen have committed to successively integrating key aspects of Industry 4.0 into their respective learning factories. Thus, the students will act as the coordinators of the CPPS and thereby remain in the center of all learning and implementation activities.
Enterprise Architectures (EA) consist of many architecture elements which stand in manifold relationships to each other. Architecture analysis is therefore important, yet very difficult for stakeholders. Because changing an architecture element has impacts on other elements, different stakeholders are involved. In practice, EAs are often analyzed using visualizations. This article aims at contributing to the field of visual analytics in EAM by analyzing how state-of-the-art software platforms in EAM support stakeholders with respect to providing and visualizing the "right" information for decision-making tasks. We investigate the collaborative decision-making process in an experiment with master's students using professional EAM tools, developing a research study and conducting it in a master's level class.
Analysis and planning of Enterprise Architectures (EA) is a complex task for stakeholders. The change of one architecture element has impact on multiple other elements because of the manifold relationships and interactions between them. The interactive cockpit approach presented in this paper supports stakeholders in planning and analyzing EAs and in tackling the intrinsic complexity. This approach supplies a cockpit with multiple viewpoints to put relevant information side-by-side without losing the context, combined with interaction functionality. In this paper, we develop such a cockpit, starting with relevant use cases, describing a potential design based on well-established foundations in EA modeling, and outlining an exemplary usage scenario.
For the powder coating of veneered particle boards, the highly reactive hybrid epoxy/polyester transparent powder Drylac 530 Series from TIGER Coatings GmbH & Co. KG, Wels, Austria, was used. Curing is accelerated by a mixture of catalysts, reaching curing times of 3 min at 150 °C or 5 min at 135 °C, which allows for energy and time savings and makes Drylac 530 Series powder suitable for the coating of temperature-sensitive substrates such as MDF and wood.
Powder coatings provide several advantages over traditional coatings: environmental friendliness, freedom of design, robustness and resistance of surfaces, the possibility of seamless all-around coating, a fast production process, and cost-effectiveness. In recent years, these benefits of powder coating technology have been carried over from metal to heat-sensitive natural fibre/wood-based substrates (especially medium-density fibreboards, MDF) used for interior furniture applications. Powder-coated MDF furniture parts are already gaining market share in the classic furniture applications of kitchens, bathrooms, living rooms and offices. The acceptance of this product is increasing, as reflected by excellent growth rates and a growing customer base. Current efforts of the powder coating industry to develop new powders with higher reactivity (i.e., lower curing temperatures and shorter curing times; e.g., 120 °C/5 min) will enable the powder coating of other heat-sensitive substrates like natural fibre composites, wood-plastic composites, lightweight panels and different plastics in the future. The coating could be applied and cured by the conventional powder coating process (electrostatic application, then melting and curing in an IR oven) or by a new powder coating procedure based on the in-mould coating (IMC) technique, which is already established in the plastics industry. Extra value could be added in the future by functional powder toner printing on powder-coated substrates using electrophotographic printing technology, meeting the future demand for individualization of the furniture part surface by applying functional 3D textures and patterns as well as individually created coloured images, and enabling shorter delivery times for these individualized parts. The paper describes the distinctiveness of powder coating on natural fibre/wood-based substrates and the requirements on the substrate and the coating powder.
Model-guided Therapy and Surgical Workflow Systems are two interrelated research fields which have been developed separately in recent years. To make full use of both technologies, it is necessary to integrate them and connect them to Hospital Information Systems. We propose a framework for the integration of Model-guided Therapy into Hospital Information Systems based on the Electronic Medical Record and a task-based Workflow Management System which is suitable for clinical end users. Two prototypes – one based on Business Process Modeling Language, one based on the serum-board – are presented. From the experience with these prototypes, we developed a novel personalized visualization system for Surgical Workflows and Model-guided Therapy. Key challenges for further development are automated situation detection and a common communication infrastructure.
An operating room is a stressful work environment. Nevertheless, all involved persons have to work safely, as there is no room for mistakes. To ensure a high level of concentration and seamless interaction, all involved persons have to know their own tasks and the tasks of their colleagues. The entire team must work synchronously at all times. However, the operating room (OR) is a noisy environment and the actors have to keep their focus on their work. To optimize the overall workflow, a task manager supporting the team was developed. Each actor is equipped with a client terminal showing a summary of their own tasks. Moreover, a big screen displays all tasks of all actors. The architecture is a distributed system based on a communication framework that supports the interaction of all clients with the task manager. A prototype of the task manager and several clients have been developed and implemented. The system represents a proof of concept for further development. This paper describes the concept of the task manager.
At Reutlingen University in Germany, students from different countries and disciplines can learn business English within the framework of a theatre production. In the "Business English Theatre" they work in an international project team staging a play with a business focus and thus improve their language, social and professional skills.
A millimeter-wave power amplifier concept in an advanced silicon-germanium (SiGe) BiCMOS technology is presented. The goal of the concept is to investigate the impact of physical limitations of the heterojunction bipolar transistors (HBTs) used on the performance of a 77 GHz power amplifier. High-current behavior, collector-base breakdown and transistor saturation can be forced with the presented design. The power amplifier is manufactured in an advanced SiGe BiCMOS technology at Infineon Technologies AG with a maximum transit frequency fT of around 250 GHz for npn HBTs [1]. The simulation results of the power amplifier show a saturated output power of 16 dBm at a power-added efficiency of 13%. The test chip is designed for a supply voltage of 3.3 V and requires a chip size of 1.448 x 0.930 mm².
Many future Services Oriented Architecture (SOA) systems may be pervasive SmartLife applications that provide real-time support for users in everyday tasks and situations. Development of such applications will be challenging, but in this position paper we argue that their ongoing maintenance may be even more so. Ontological modelling of the application may help to ease this burden, but maintainers need to understand a system at many levels, from a broad architectural perspective down to the internals of deployed components. Thus we will need consistent models that span the range of views, from business processes through system architecture to maintainable code. We provide an initial example of such a modelling approach and illustrate its application in a semantic browser to aid in software maintenance tasks.
Current approaches to enterprise architecture lack analytical instruments for cyclic evaluations of business and system architectures in real business enterprise system environments. This impedes the broad use of enterprise architecture methodologies. Furthermore, the permanent evolution of systems quickly desynchronizes the model representation from reality. Therefore, we introduce an approach for complementing the existing top-down approach for the creation of enterprise architectures with a bottom-up approach. Enterprise Architecture Analytics uses the architectural information contained in many infrastructures: by applying Big Data technologies, it is possible to exploit this information and to derive architectural insights. That means Enterprise Architectures may be discovered, analyzed and optimized using analytics. The increased availability of architectural data also improves the possibilities for verifying the compliance of Enterprise Architectures. Architectural decisions are linked to clustered architecture artifacts and categories according to a holistic EAM Reference Architecture with specific architecture metamodels. A specially suited EAM Maturity Framework provides the basis for systematic, analytics-supported assessments of architecture capabilities.
Nowadays, software development plays an important role in the entire value chain of production machine and plant engineering. An important component for the rapid development of high-quality software is virtual commissioning: the real machine is described on the basis of simulation models, so the control software can be verified at an early stage using these models. Since production machines are produced highly individually or in very small series, the challenge of virtual commissioning is to reduce the effort of developing the simulation models. A systematic reuse of the simulation models and the control software for different variants of a machine is therefore essential for economic use. This necessarily requires a consideration of the variability which may occur between the production machines. This paper analyzes the question of how to systematically deal with software-related variability in the context of virtual commissioning. For this purpose, first the characteristics of virtual commissioning and variability handling are considered. Subsequently, the requirements for a so-called variant infrastructure for virtual commissioning are analyzed and possible solutions are discussed.
Today's cars are characterized by many functional variants. There are many reasons for the underlying variability, from the adaptation to diverse markets to different technical aspects, which are based on a cross-platform reuse of software functions. Inevitably, this variability is reflected in model-based automotive software development. A modeling language which is widely used for modeling embedded software in the automotive industry is MATLAB/Simulink. There are concepts addressing the high demand for a systematic handling of variability in Simulink models. However, not every concept is suitable for every automotive application. In order to present a classification of concepts for modeling variability in Simulink, this paper first determines the relevant use cases for variant handling in model-based automotive software development. Existing concepts for modeling variability in Simulink are then presented before being classified in relation to the previously determined use cases.
Bootstrap circuits are mainly used for supplying a gate driver circuit to provide the gate overdrive voltage for a high-side NMOS transistor. The required charge has to be provided by a bootstrap capacitor, which is often too large for integration if an acceptable voltage dip at the capacitor has to be guaranteed. Three options for an area-efficient bootstrap circuit for a high-side driver with an output stage of two NMOS transistors are proposed. The key idea is that the main bootstrap capacitor is supported by a second bootstrap capacitor, which is charged to a higher voltage and connected when the gate driver turns on. A high voltage swing at the second capacitor leads to a high charge allocation. Both bootstrap capacitors together require up to 70% less area compared to a conventional bootstrap circuit. This enables compact power management systems with fewer discrete components and smaller die size. A calculation guideline for optimum bootstrap capacitor sizing is given. The circuit was manufactured in a 180 nm high-voltage BiCMOS technology as part of a high-voltage gate driver. Measurements confirm the benefit of high-voltage charge storing. The fully integrated bootstrap circuit, including two stacked 75.8 pF and 18.9 pF capacitors, results in a voltage dip lower than 1 V. This matches well with the theory of the calculation guideline.
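The paper's exact sizing guideline is not reproduced here, but the standard charge-balance relation it necessarily builds on is this: the gate charge drawn from the bootstrap capacitor per turn-on determines the voltage dip, so

\[
\Delta V = \frac{Q_{\mathrm{gate}}}{C_{\mathrm{boot}}}
\quad\Longrightarrow\quad
C_{\mathrm{boot}} \ge \frac{Q_{\mathrm{gate}}}{\Delta V_{\mathrm{max}}},
\]

where $Q_{\mathrm{gate}}$ is the charge delivered to the power FET gate and $\Delta V_{\mathrm{max}}$ the acceptable dip. Pre-charging the second capacitor to a higher voltage lets it deliver the same charge with less capacitance, and hence less area, which is the intuition behind the reported 70% area saving.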
The size and cost of a switched-mode power supply can be reduced by increasing the switching frequency. The maximum switching frequency and the maximum input voltage range, respectively, are limited by the minimum propagated on-time pulse, which is mainly determined by the level shifter speed. At switching frequencies above 10 MHz, a voltage conversion with an input voltage range up to 50 V and output voltages below 5 V requires an on-time of the pulse-width-modulated signal of less than 5 ns. This cannot be achieved with conventional level shifters. This paper presents a level shifter circuit which controls an NMOS power FET on a high-voltage domain up to 50 V. The level shifter was implemented as part of a DC-DC converter in a 180 nm BiCMOS technology. Experimental results confirm a propagation delay of 5 ns and on-time pulses of less than 3 ns. An overlapping clamping structure with low parasitic capacitances in combination with a high-speed comparator also makes the level shifter very robust against large coupling currents during high-side transitions as fast as 20 V/ns, verified by measurements. Due to the high dv/dt, capacitive coupling currents can be two orders of magnitude larger than the actual signal current. Depending on the conversion ratio, the presented level shifter enables an increase of the switching frequency for multi-MHz converters towards 100 MHz. It supports high input voltages up to 50 V and can also be applied to other high-speed applications.
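To see why the coupling currents dominate, consider the basic displacement-current relation with an illustrative parasitic capacitance of 100 fF across the level shifter (an assumed value, not one from the paper):

\[
i = C\,\frac{dv}{dt} = 100\,\mathrm{fF} \times 20\,\mathrm{V/ns} = 2\,\mathrm{mA},
\]

already in the milliampere range, while the signal current of a fast, low-power level shifter may be only tens of microamperes – consistent with the two orders of magnitude stated above.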
Recent years, and especially the Internet, have changed the way data is stored. We now often store data together with its creation timestamp. These data sequences potentially enable us to track the change of data over time. This is quite interesting, especially in the e-commerce area, in which the classification of a sequence of customer actions is still a challenging task for data miners. However, before standard algorithms such as Decision Trees, Neural Nets, Naive Bayes or Bayesian Belief Networks can be applied to sequential data, preparations need to be done in order to capture the information stored within the sequences. Therefore, this work presents a systematic approach on how to reveal sequence patterns in data and how to construct powerful features out of the primitive sequence attributes. This is achieved by sequence aggregation and the incorporation of the time dimension into the feature construction step. The proposed algorithm is described in detail and applied to a real-life data set, which demonstrates its ability to boost the classification performance of well-known data mining algorithms for classification tasks.
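To make the aggregation idea tangible, here is a hedged Java sketch of turning a time-stamped customer action sequence into flat features a standard classifier can consume. The action types, feature choices and names are our assumptions for illustration, not the paper's algorithm:

```java
import java.util.List;

/** Illustrative sketch: aggregating a time-stamped customer action
 *  sequence into flat features for classifiers such as decision
 *  trees or Naive Bayes (names and features are assumptions). */
public class SequenceFeatures {
    record Action(String type, long timestampMillis) {}

    /** Aggregates a sequence into primitive features: event counts,
     *  total session duration, and mean time between actions. */
    static double[] extract(List<Action> sequence) {
        long first = sequence.get(0).timestampMillis();
        long last = sequence.get(sequence.size() - 1).timestampMillis();
        long views = sequence.stream()
                             .filter(a -> a.type().equals("view")).count();
        long buys = sequence.stream()
                            .filter(a -> a.type().equals("buy")).count();
        double durationSec = (last - first) / 1000.0;
        double meanGapSec = sequence.size() > 1
                ? durationSec / (sequence.size() - 1) : 0.0;
        return new double[] { views, buys, durationSec, meanGapSec };
    }
}
```

The time dimension enters through the duration and gap features; richer variants could add order-sensitive patterns (e.g., counts of specific action bigrams).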
SmartLife ecosystems are emerging as intelligent user-centered systems that will shape future trends in technology and communication. Biological metaphors of living, adaptable ecosystems provide the logical foundation for self-optimizing and self-healing run-time environments for intelligent, adaptable business services and related information systems with service-oriented enterprise architectures. The present research-in-progress work investigates mechanisms for adaptable enterprise architectures for the development of service-oriented ecosystems with integrated technologies like Semantic Technologies, Web Services, Cloud Computing and Big Data Management. With a large and diverse set of ecosystem services with different owners, our scenario of service-based SmartLife ecosystems can pose challenges for development and, more importantly, for maintenance and software evolution. Our research explores the use of knowledge modeling with ontologies and flexible metamodels for adaptable enterprise architectures to support program comprehension for software engineers during maintenance and evolution tasks of service-based applications. Our previous reference enterprise architecture model ESARC – Enterprise Services Architecture Reference Cube – and the Open Group SOA Ontology were extended to support agile semantic analysis, program comprehension and software evolution for a SmartLife application scenario. The Semantic Browser is a semantic search tool that was developed to provide knowledge-enhanced investigation capabilities for service-oriented applications and their architectures.
The purpose of this paper is to review, compare and contrast the body of published literature regarding consumer-related emotions in fashion shopping behavior. This paper analyses 39 academic articles published between 2000 and 2013 which focus on emotions in fashion shopping behavior. To this end, articles which examine the influence of environmental stimuli in a retail setting as well as articles which focus on the impact of factors affecting individuals specifically when shopping for fashion were analysed. Most of the articles are based on the SOR paradigm. A larger focus has recently been placed on the research of emotions and consumer behavior in online fashion environments. The influence of stimuli, occurring in endogenous and exogenous ways, on consumers' emotions and the resulting behavior could be confirmed in most studies. However, while the range of addressed emotions is already widely researched, their impact on consumers' shopping behavior has to be analysed in more detail.
Quest 3C : an integrative simulation game used to encourage cross-disciplinary thinking and action
(2014)
Interdisciplinary, complex problem-solving and the necessity to communicate effectively in global teams characterise today's rapidly changing business environment. Employers consistently stress the need for business engineering graduates to demonstrate technical expertise, methodological competences, and diverse soft skills. The "silo effect" in higher education has partially created a gap between what industry wants and what academia provides. Here we examine how interdisciplinary team teaching and shared ICT might be more effective in bringing higher education teaching in sync with industry and its demands.
The implementation of a web-based portal QA solution leads to high acceptance among staff, as the use of commonly known standard software (e.g., a web browser) allows intuitive handling. In daily use, a significant simplification of the workflow and a performance enhancement can be achieved through easy access to the check documents. As the data is now saved in a database, it can easily be processed and long-term trends can be displayed; possible errors can therefore be detected much more easily and earlier. Through the use of time stamps and user authentication, procedures and user responsibilities are comprehensibly documented. As the software is browser-based, integration into an existing software environment is not critical. As only technical QA data is processed, no further data security measures are necessary. Certification as a medical product is not required.
Having started as a mono-line bank focused purely on savings, ING Direct Spain was by late 2012 becoming a full-service bank. To this end, the bank had substantially increased its product and channel portfolio. ING Direct Spain originally provided "simple", "good value for money" products in an "easy to deal with" way at low cost, supported by a direct model. But with the growth of its product portfolio during the previous decade and the ambitious goal of becoming a full-service bank, an increase in complexity seemed inevitable. Like many businesses in the global, digital economy, ING Direct Spain found it needed to decide which complexity created value for its customers and which did not. It also learned that IT can contribute to complexity and/or help manage it.
This case offers a close look at the challenges of growing a company by increasing product complexity in order to provide comprehensive yet simple services.
As "the most international company on earth", DHL Express promised to deliver packages between almost any pair of countries within a defined time-frame. To fulfill this promise, the company had introduced a set of global business and technology standards. While standardization had many advantages (improving service for multinational customers, faster response to changes in import/export regulations, sharing of best practices etc.), it created impediments to local innovation and responsiveness in DHL Express' network of 220 countries/territories. Reconciling standardization-innovation tradeoffs is a critical management issue for global companies in the digital economy.
This case describes one large, successful company's approach to the tradeoff of standardization versus innovation.
Executive education in IS is under the scrutiny of many institutions for its potential to bring in financial revenue. However, teaching executives can be a very challenging task because of their prior experience, the variation in their previous education, and the multiplicity of motivations for pursuing continuing education. The panel aims at sharing successful experiences and highlighting the challenges of dealing with executive audiences. The panel will present the results of a large survey among executive students and identify the three most significant elements that emerged from the survey: the importance of theory that is actionable, the importance of varied pedagogical tools and practices, and the importance of relevance beyond practical tools. Based on a survey distributed to the audience at the beginning of the panel, the audience will be actively engaged in sharing their experiences on the three topics, aiming to capitalize on and sum up the collective knowledge of the room.
Learning and teaching require the transfer of knowledge from one person to another. Due to the relevance of knowledge, many models have been developed for knowledge transfer. However, the process of knowledge transfer has not yet been described completely, and existing approaches are too vague to facilitate its implementation. This paper contributes to a better understanding of knowledge transfer in order to support it in teaching. To address this challenge, we depict a layered model for knowledge transfer. The model structures the transfer into several steps and thus identifies major influencing factors. The paper describes the knowledge transfer from one person to another step by step. An example in the area of teaching business process management illuminates the process. The main contribution of this paper is the development of a layered model and its application in teaching.
There are several intra-operative use cases which require the surgeon to interact with medical devices. I used the Leap Motion Controller as an input device for three use cases: 2D interaction (e.g., advancing EPR data), selection of a value (e.g., room illumination brightness), and a point-and-click application scenario. I evaluated the Palm Mouse as the most suitable gesture solution for controlling the mouse and advise using the implementation that uses all fingers to perform a click. This small case study introduces the implementations and methods that led to these recommendations.
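As an illustration of how a "palm mouse" style mapping might work in principle, the following sketch linearly maps a palm position onto screen coordinates. It is a made-up approximation: the interaction-box bounds, screen size, and function names are assumptions, and it does not use the actual Leap Motion SDK.

```python
# Illustrative only: maps a palm position reported by a hand-tracking
# sensor onto screen coordinates, as a "palm mouse" might. The interaction
# box bounds and screen size are invented assumptions, not SDK values.

SCREEN_W, SCREEN_H = 1920, 1080
# Assumed sensor interaction box in millimetres (x: left/right, y: height).
X_MIN, X_MAX = -120.0, 120.0
Y_MIN, Y_MAX = 100.0, 300.0

def palm_to_cursor(palm_x_mm, palm_y_mm):
    """Linearly normalise a palm position into pixel coordinates,
    clamped to the screen."""
    nx = (palm_x_mm - X_MIN) / (X_MAX - X_MIN)
    ny = (palm_y_mm - Y_MIN) / (Y_MAX - Y_MIN)
    px = min(max(nx, 0.0), 1.0) * (SCREEN_W - 1)
    py = (1.0 - min(max(ny, 0.0), 1.0)) * (SCREEN_H - 1)  # sensor y grows upward
    return round(px), round(py)

print(palm_to_cursor(0.0, 200.0))  # roughly the screen centre
```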
Strategy to test mobile apps
(2014)
Nowadays the development of a mobile app entails challenges and difficulties which have to be faced by mobile app developers. Innovations lead to a rapidly evolving mobile app market; therefore apps should be developed faster and offered to the market in short release cycles. Testing is a decisive activity within the development process that helps to improve the quality of the app. This research paper describes a strategy to test mobile apps that overcomes the challenges mobile apps confront and permits testing the app in a structured test environment.
The automotive industry faces three major challenges: the shortage of fossil fuels, the politics of global warming, and rising competition from new markets. In order to remain competitive, companies have to develop more efficient and alternative-fuel vehicles that meet the individual requirements of their customers. Functional integration combined with new technologies and materials is the key to stable success in this industry; the sustained upward trend in system innovations within the last ten years confirms this. The development of complex products like automobiles demands skills from various disciplines, e.g., engineering and chemistry. Furthermore, these skills are spread all over the supply chain. Hence the only way to stay successful in the automotive industry is cooperation and collaborative innovation. Interdisciplinary and interorganizational development places high demands on cooperation models, especially in the automotive industry. In this case study, cooperation models are analyzed and evaluated according to their applicability to interdisciplinary, interorganizational development projects in the automotive industry. Subsequently, the research campus ARENA2036 is analyzed. ARENA2036 is an interdisciplinary, interorganizational development project housing automobile manufacturers, suppliers, research establishments, and university institutes. Finally, based on interviews with the partners and the preceding analyses of cooperation models, suggestions for implementation are given to ARENA2036.
In the luxury fashion industry, consumers can be categorized into two groups: fashion leaders and fashion followers. Both groups purchase luxury fashion products to satisfy both their functional needs and their social needs (i.e., social influence); thus the demands of the two consumer groups are related. In this paper, we construct a model to examine the effects of pricing and online retail service in luxury fashion firms under social influence. To maximize profit, we identify the optimal prices and online retail service when the luxury fashion firms provide non-differentiated and differentiated online retail services, respectively. More insights are discussed.
This paper first identifies the trade-off among costs, flexibility, and performance of autonomous robotic solutions for material handling processes, where adding value through automation is not as trivial as in production processes: hence the requirement for automated solutions to be simple, lean, and efficient becomes even stricter. A method for modelling and comparing the differential performance and costs of manual and autonomous solutions is then developed. As a result of the method, a smart man-machine collaborative interface is designed and its impact evaluated in a specific case study. The results are then generalized and support the conclusion that in unconstrained environments, where full standardization cannot be achieved, the risk of investing in autonomous solutions can only be mitigated by creating a fast and smart man-machine collaborative interface.
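A minimal sketch of such a differential cost comparison, with all figures invented for illustration (the paper's actual model and numbers are not reproduced here):

```python
# Annualised cost of a manual vs. an autonomous material-handling station,
# and the utilisation at which the autonomous option breaks even.
# All figures are illustrative assumptions.

def annual_cost(capex, lifetime_years, opex_per_hour, hours_per_year):
    return capex / lifetime_years + opex_per_hour * hours_per_year

hours = 4000  # assumed two-shift operation
manual = annual_cost(capex=5_000, lifetime_years=10,
                     opex_per_hour=35.0, hours_per_year=hours)   # labour-dominated
auto = annual_cost(capex=250_000, lifetime_years=7,
                   opex_per_hour=4.0, hours_per_year=hours)      # capital-dominated

print(f"manual: {manual:,.0f} EUR/yr  autonomous: {auto:,.0f} EUR/yr")

# Break-even operating hours: capex difference amortised by the opex gap.
break_even_h = (250_000 / 7 - 5_000 / 10) / (35.0 - 4.0)
print(f"break-even at ~{break_even_h:,.0f} operating hours per year")
```

Below the break-even utilisation the manual solution stays cheaper, which is one way to read the paper's point that automation in material handling is not trivially value-adding.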
According to a recent survey, the great majority of players in logistics are planning to adopt one or more robotic solutions by 2019. Technical solutions for the automation of processes in logistics are often available as market-ready products, but a lack of standardization and skepticism towards long-term investments are often the reasons why these solutions are not implemented on a large scale. This paper sets out to bridge the gap between the world of technologies and that of applications in order to help investors, robot producers, and system integrators decide on which branch of logistics to focus. The three main branches, Courier Express Parcel (CEP), contract logistics, and production logistics, are briefly defined and distinguished through their characteristic factors and parameters. A method based on the analysis of three parameters (operative costs, required performance, and flexibility) in the three branches is then used to identify the most convenient branch of logistics for investing in new technologies, namely the one in which the risk of investment is lowest and the return is highest and fastest. The method concludes that higher labor costs, strict regulations, and higher standardization make production logistics the most suitable branch for investments in emerging automation solutions.
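As a hedged illustration of the three-parameter comparison, the following sketch scores each branch on invented 1-5 values with a simple weighting; the paper's actual parameter values and method of combination may differ:

```python
# Toy scoring of the three logistics branches on the three parameters
# (1 = low, 5 = high). Scores and weights are invented for illustration.

branches = {
    "CEP":                  {"labor_cost": 3, "required_performance": 5, "required_flexibility": 4},
    "contract logistics":   {"labor_cost": 3, "required_performance": 3, "required_flexibility": 5},
    "production logistics": {"labor_cost": 5, "required_performance": 3, "required_flexibility": 2},
}

# Automation is most attractive where labor cost is high and the
# performance/flexibility demands placed on the robot are low.
def attractiveness(p):
    return p["labor_cost"] - 0.5 * p["required_performance"] - 0.5 * p["required_flexibility"]

for name, params in sorted(branches.items(), key=lambda kv: -attractiveness(kv[1])):
    print(f"{name}: {attractiveness(params):.1f}")
```

With these invented inputs, production logistics ranks first, consistent with the abstract's conclusion.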
The EU-funded project RobLog recently developed a system able to autonomously unload coffee sacks from a standard container. Being the first of its kind, further development is needed for the system to become competitive against manual labor. Financing this development entails a risk, and hence a justified skepticism, which can be overcome by a long-sighted view of the existing market potential. This paper presents a method to estimate the market potential of autonomous unloading systems for heavy deformable goods. Starting from an analysis of the coffee trade, the current coffee traffic is first investigated in order to calculate the number of autonomous systems needed to handle the imported sacks. The results are validated, and the method is extended to calculate the potential of other market segments where the same unloading technology can be applied.
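A back-of-the-envelope version of such a market-potential estimate might look as follows; all figures (trade volume, sack weight, throughput, operating hours, container share) are illustrative assumptions, not the numbers used in the paper:

```python
# Rough market-potential arithmetic: how many autonomous unloading
# systems would the imported sack traffic keep busy? All inputs invented.

sacks_per_year = 7_000_000 * 1000 / 60   # ~7 Mt of coffee traded, 60 kg sacks
sacks_per_hour_per_system = 400          # assumed autonomous unloading rate
operating_hours_per_year = 3000          # assumed availability per system
share_in_containers = 0.6                # assumed share shipped as bagged cargo

systems_needed = (sacks_per_year * share_in_containers
                  / (sacks_per_hour_per_system * operating_hours_per_year))
print(f"~{systems_needed:,.0f} autonomous unloading systems")
```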
An ultra-low power capacitance extrema and ratio detector for electrostatic energy harvesters
(2015)
The power supply is one of the major challenges for applications like the Internet of Things (IoT) and the smart home. The maintenance issue of batteries and the limited power level of energy harvesting are addressed by the integrated micro power supply presented in this paper. Connected to the 120/230 Vrms mains, which is one of the most reliable energy sources and available almost anywhere indoors, it provides a 3.3 V DC output voltage. The micro power supply consists of a fully integrated AC-DC and DC-DC converter with one external low-voltage SMD buffer capacitor. It is fabricated in a low-cost 0.35 μm 700 V CMOS technology and covers a die size of 7.7 mm². The use of only one external low-voltage SMD capacitor results in an extremely compact form factor. The AC-DC stage is a direct-coupled full-wave rectifier with a subsequent bipolar shunt regulator, which provides an output voltage around 17 V. The DC-DC stage is a fully integrated 4:1 switched-capacitor DC-DC converter with an input voltage as high as 17 V and a peak efficiency of 45 %. The power supply achieves an overall output power of 3 mW, resulting in a power density of 390 μW/mm². This exceeds prior art by a factor of 11.
Virtual prototyping of integrated mixed-signal smart-sensor systems requires high-performance co-simulation of analog frontend circuitry with complex digital controller hardware and embedded real-time software. We use SystemC/TLM 2.0 in combination with a cycle-count-accurate temporal decoupling approach to simulate digital components and firmware code execution at high speed while preserving clock-cycle accuracy and, thus, real-time behavior at time quantum boundaries. Optimal time quanta ensuring real-time capability can be calculated and set automatically during simulation if the simulation engine has access to exact timing information about upcoming communication events. These methods fail in the case of non-deterministic, asynchronous events, resulting in a possibly invalid simulation result. In this paper, we propose an extension of this method to asynchronous events generated by black-box sources for which a-priori event timing information is not available, such as coupled analog simulators or hardware in the loop. Additional event processing latency and/or rollback effort caused by temporal decoupling is minimized by calculating optimal time quanta dynamically in a SystemC model using a linear prediction scheme. For an example smart-sensor system model, we show that quasi-periodic events that trigger activities in temporally decoupled processes are handled accurately after the predictor has settled.
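The core of the linear prediction idea can be sketched as follows: estimate the arrival time of the next asynchronous event from recent inter-event intervals and cap the next time quantum accordingly. The class name, window size, and fallback policy are assumptions; the actual SystemC/TLM integration is omitted.

```python
# Sketch of quantum sizing from predicted event arrivals (illustrative).

class QuantumPredictor:
    def __init__(self, window=8, default_quantum_ns=1000):
        self.stamps = []            # recent event timestamps (ns)
        self.window = window
        self.default = default_quantum_ns

    def observe(self, t_ns):
        self.stamps.append(t_ns)
        self.stamps = self.stamps[-self.window:]

    def next_quantum(self, now_ns):
        if len(self.stamps) < 2:
            return self.default
        gaps = [b - a for a, b in zip(self.stamps, self.stamps[1:])]
        predicted_next = self.stamps[-1] + sum(gaps) / len(gaps)
        # Never run past the predicted event; fall back if it is overdue.
        return max(1, int(predicted_next - now_ns)) if predicted_next > now_ns else self.default

p = QuantumPredictor()
for t in (0, 980, 2010, 2990):      # quasi-periodic events, period ~1000 ns
    p.observe(t)
print(p.next_quantum(now_ns=3100))  # ~886 ns until the predicted next event
```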
Stress is recognized as a predominant disease factor, and the costs for treatment will increase in the future. The presented approach tries to detect stress in a very basic and easy-to-implement way, so that the cost of the device and the effort of wearing it remain low. The user benefits from the fact that the system offers an easy interface reporting the status of his body in real time. In parallel, the system provides interfaces to pass the obtained data on for further processing and (professional) analysis, in case the user agrees. The system is designed to be used in everyday activities and is not restricted to laboratory use or environments. The implementation of the enhanced prototype shows that the detection of stress and the reporting can be managed using correlation plots and automatic pattern recognition, even on a very lightweight microcontroller platform.
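As one plausible (and deliberately lightweight) illustration of on-device stress screening, the following sketch flags a drop in a simple heart-rate-variability measure (RMSSD over R-R intervals); the threshold and data are invented, and this is not necessarily the paper's detection method:

```python
# Toy stress screen: low heart-rate variability as a stress indicator.
# Threshold, data, and the RMSSD choice are illustrative assumptions.

def rmssd(rr_ms):
    """Root mean square of successive R-R interval differences."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

relaxed = [820, 870, 810, 880, 830, 875]   # ms, high variability
stressed = [650, 655, 648, 652, 651, 649]  # ms, low variability

for label, rr in (("relaxed", relaxed), ("stressed", stressed)):
    flag = "stress?" if rmssd(rr) < 20.0 else "ok"
    print(f"{label}: RMSSD={rmssd(rr):.1f} ms -> {flag}")
```

Such integer arithmetic over short windows is the kind of computation that fits comfortably on a small microcontroller.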
An ongoing challenge in our days is to lower the impact of dysfunctionality on quality of life through individual support. Against the background of an aging society and continuously increasing costs for care, a holistic solution is needed. This solution must integrate individual needs and preferences, locally available possibilities, regional conditions, and professional as well as informal caregivers, and it must provide the flexibility to implement future requirements. The proposed model is the result of a common initiative to overcome the major obstacles and to center a solution on the individual needs caused by dysfunctionality.
New or adapted digital business models have a huge impact on Enterprise Architectures (EA) and require them to become more agile, flexible, and adaptable. These changes happen frequently and are currently not well documented. An EA consists of many elements with manifold relationships between them; changing the business model may therefore have multiple impacts on other architectural elements. The EA engineering process deals with the development, change, and optimization of architectural elements and their dependencies. Thus an EA provides a holistic view of both business and IT from the perspective of the many stakeholders involved in EA decision-making processes. Different stakeholders have specific concerns and today collaborate in often unclear decision-making processes. In our research we investigate information from collaborative decision-making processes to support stakeholders in taking current decisions. In addition, we provide all the information necessary to understand how and why decisions were taken. We collect the decision-related information automatically to minimize manual, time-intensive work as much as possible. The core contribution of our research extends a decisional metamodel, which links basic decisions with architectural elements and extends them with an associated decisional case context. Our aim is to support a new integral method for multi-perspective and collaborative decision-making processes. We illustrate this with a practice-relevant decision-making scenario for Enterprise Architecture engineering.
Markets today are very dynamic. This situation requires agile enterprises that are able to react quickly to market influences. An enterprise's IT is especially affected, because new or changed business models have to be realized. However, enterprise architectures (EA) are complex structures consisting of many artifacts and relationships between them; analyzing an EA thus becomes a complex task for stakeholders. In addition, many stakeholders are involved in decision-making processes, because Enterprise Architecture Management (EAM) aims to provide a holistic view of the enterprise. In this article we use concepts of Adaptive Case Management (ACM) to design a decision-making case consisting of a combination of different analysis techniques to support stakeholders in decision-making. We exemplify the case with a scenario from a fictitious enterprise.
Customer services in the digital transformation: social media versus hotline channel performance
(2015)
Due to the digital transformation, online service strategies have gained prominence in practice as well as in the theory of service management. This study examines the efficacy of different types of service channels in customer complaint handling. The theoretical framework, developed using the complaint handling and social media literature, is tested against data collected from two different channels (hotline and social media) of a German telecommunication service provider. We contribute to the understanding of firms' multichannel distribution strategy in two ways: a) by conceptualizing and evaluating complaint handling quality across traditional and social media channels, and b) by testing the impact of complaint handling quality on key performance outcomes like customer loyalty, positive word-of-mouth, and cross-purchase intentions.
The Internet of Things (IoT) fundamentally influences today's digital strategies with disruptive business operating models and fast-changing markets. New business information systems are integrating emerging Internet of Things infrastructures and components. Given the huge diversity of Internet of Things technologies and products, organizations have to leverage and extend previous enterprise architecture efforts to enable business value by integrating the Internet of Things into their evolving Enterprise Architecture Management environments. Both architecture engineering and the management of current enterprise architectures are complex and have to integrate, besides the Internet of Things, synergistic disciplines such as services and cloud computing, semantics-based decision support through ontologies and knowledge-based systems, big data management, and mobility and collaboration networks. To provide adequate decision support for complex business/IT environments, it is necessary to identify the changes affecting Internet of Things environments and their related, fast-adapting architecture. We have to make transparent the impact of these changes across the entire landscape of affected EAM capabilities, such as directly and transitively impacted IoT objects, business categories, processes, applications, services, platforms, and infrastructures. The paper describes a new metamodel-based approach for integrating partial Internet of Things objects, which are semi-automatically federated into a holistic Enterprise Architecture Management environment.
Enterprise architecture management (EAM) is a holistic approach to tackling complex business and IT architectures. The transformation of an organization's EA towards a strategy-oriented system is a continuous task. Many stakeholders have to elaborate on various parts of the EA to reach the best decisions to shape the EA towards optimized support of the organization's capabilities. Since the real world is too complex, analysis techniques are needed to detect optimization potentials and to obtain all the information needed about an issue. In practice, visualizations are commonly used to analyze EAs. However, these visualizations are mostly static and do not provide analyses. In this article we combine analysis techniques from the literature with interactive visualizations to support stakeholders in EA decision-making.
The stimulation of user engagement has received significant attention in extant research. However, the theory of antecedents for user engagement with an initial electronic word-of-mouth (eWoM) communication is relatively less developed. In an investigation of 576 unique user postings across independent Facebook (FB) communities for two German firms, we contribute to the extant knowledge on user engagement in two different ways. First, we explicate senders’ prior usage experience and the extent of their acquaintance with other community members as the two key drivers of user engagement across a product and a service community. Second, we reveal that these main effects differ according to the type of community. In service communities, experience has a stronger impact on user engagement; whereas, in product communities, acquaintance is more important.
In recent years, the digital transformation has received significant attention in business-to-business (B2B) research. Social media applications provide executives with a raft of new options. Consequently, interfaces to social media platforms have also been integrated into B2B salesforce applications, although very little is as yet known about their usage and general impact on B2B sales performance. This paper evaluates 1) the conceptualization of social media usage in a dyadic B2B relationship; 2) the effects of a more differentiated usage construct on customer satisfaction; 3) antecedents of social media usage on multiple levels; and 4) the effectiveness of social media usage for different types of customers. The framework presented here is tested cross-industry against data collected from dyadic buyer-seller relationships in the IT service industry. The results elucidate the preconditions and the impact of social media usage strategies in B2B sales relations.
Location-based services in buildings represent a great advantage for people searching for places, products, or people. In our paper we examine the feasibility of Bluetooth iBeacons for indoor localization. In the first part we describe and evaluate the iBeacon technology through different experiments. In the second part our application is described. Our system is able to estimate the position of the user's smartphone based on RSSI measurements, using the built-in smartphone sensor and a building map with the required sender information. Trilateration is used as the positioning technique, in contrast to fingerprinting, to minimize up-front effort. Results are promising but do not reach the same accuracy level as sensor-fusion or fingerprinting approaches.
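A compact sketch of the two steps such a system combines: RSSI-to-distance conversion via a log-distance path-loss model, followed by least-squares trilateration. The TX power at 1 m and the path-loss exponent are assumptions, as are the beacon positions:

```python
import numpy as np

def rssi_to_distance(rssi, tx_power=-59, n=2.0):
    """Log-distance path-loss model; tx_power is the RSSI at 1 m (assumed)."""
    return 10 ** ((tx_power - rssi) / (10 * n))

def trilaterate(anchors, distances):
    """Linearised least-squares position from >= 3 (x, y) anchors.
    Subtracting the first circle equation from the others yields a
    linear system in (x, y)."""
    (x1, y1), d1 = anchors[0], distances[0]
    rows, rhs = [], []
    for (xi, yi), di in zip(anchors[1:], distances[1:]):
        rows.append([2 * (xi - x1), 2 * (yi - y1)])
        rhs.append(d1**2 - di**2 + xi**2 - x1**2 + yi**2 - y1**2)
    sol, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return sol

beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 8.0)]  # beacon map positions in metres
rssi = [-68, -75, -71]                            # RSSI measured at the smartphone
dists = [rssi_to_distance(r) for r in rssi]
print(trilaterate(beacons, dists))                # estimated (x, y)
```

In practice the noisy RSSI-to-distance step dominates the error, which is consistent with the paper's finding that plain trilateration trails fingerprinting and sensor fusion in accuracy.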
Enterprise Architecture (EA) management is an activity that seeks to foster the alignment of business and IT and pursues various goals further operationalizing this alignment. Key to effective EA management is a framework that defines the roles, activities, and viewpoints used for EA management in accordance with the concerns that the stakeholders aim to address. Consensus holds that such frameworks are organization-specific, and hence they are designed in governance activities for EA management. As of today, top-down approaches to governance are used to derive organization-specific frameworks. These usually lack systematic mechanisms for improving the framework based on the feedback of the responsible stakeholders. We outline a bottom-up approach to EA management governance that systematically observes the behavior of the actors to learn user concerns and recommend appropriate viewpoints. With this approach, we complement traditional top-down governance activities.
Decision-making in the field of Enterprise Architecture (EA) is a complex task. Many organizations establish a set of complex processes and hierarchical structures to enable strategy-driven development of their EA. This leads to slow and inefficient decision-making, entailing poor time-to-market and discontented stakeholders. Collaborative EA delineates a lightweight approach to enabling EA decisions but often neglects strategic alignment. In this paper, we present an approach that integrates the concepts of collaborative EA and goal-driven decision-making through collaborative modeling of goal-oriented information demands based on ArchiMate's motivation extension, in order to achieve goal-oriented EA decision support in a collaborative EA environment.
The digital transformation of our society changes the way we live, work, learn, communicate, and collaborate. This disruptive change interacts with all information processes and systems, which have been important business enablers for the digital transformation for years. The Internet of Things, social collaboration systems for adaptive case management, and mobility systems and services for big data in cloud service environments are emerging to support intelligent user-centered and social community systems. They will shape future trends of business innovation and the next wave of information and communication technology. Biological metaphors of living, adaptable ecosystems provide the logical foundation for self-optimizing and resilient run-time environments for intelligent business services and related distributed information systems with service-oriented enterprise architectures. The present research investigates mechanisms for the flexible adaptation and evolution of digital enterprise architectures in the context of integrated synergistic disciplines such as distributed service-oriented architectures and information systems, EAM (Enterprise Architecture and Management), metamodeling, semantic technologies, web services, cloud computing, and big data technology. Our aim is to support flexibility and agile transformations for both business domains and related enterprise systems through the adaptation and evolution of digital enterprise architectures. The present research paper investigates digital transformations of business and IT and integrates fundamental mappings between adaptable digital enterprise architectures and service-oriented information systems.
Business processes are important knowledge resources of a company. The knowledge contained in business processes imparts the procedures used to create products and services. However, the modelling and application of business processes are affected by problems connected to knowledge transfer. This paper presents and implements a layered model to improve knowledge transfer, thereby supporting the modelling and understanding of business process models. An evaluation of the approach is presented, and the results and other areas of application are discussed.
Knowledge-intense processes are characterized by participants deciding on the next process activities on the basis of the information at hand and their expert knowledge. The decisions of these knowledge workers are in general non-deterministic. It is not possible to model these processes in advance and to automate them using the process engine of a BPM system. Hence, in this context a process instance is called a case, because there is no predefined model that could be instantiated. Domain-specific or general case management systems are used to support the knowledge workers. These systems provide all case information and enable users to define the next activities, but they have no or only limited activity recommendation capabilities. In the following paper, we present a general concept for a self-learning system based on process mining that suggests the next best activity for a given case based on quantitative and qualitative data. As a proof of concept, it was applied to the area of insurance claims settlement.
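A minimal, frequency-based sketch of the recommendation idea: mine direct-succession counts from completed cases and suggest the most common successor of the running case's last activity. The insurance-claim activities are invented, and the paper's self-learning system is certainly richer than this baseline:

```python
from collections import Counter, defaultdict

# Invented event log of completed insurance-claim cases.
completed_cases = [
    ["register claim", "check coverage", "assess damage", "pay claim"],
    ["register claim", "check coverage", "reject claim"],
    ["register claim", "check coverage", "assess damage", "pay claim"],
]

# Mine direct-succession frequencies: which activity follows which?
successors = defaultdict(Counter)
for trace in completed_cases:
    for a, b in zip(trace, trace[1:]):
        successors[a][b] += 1

def recommend_next(running_case):
    """Suggest the historically most frequent successor of the last activity."""
    ranked = successors[running_case[-1]].most_common()
    return ranked[0][0] if ranked else None

print(recommend_next(["register claim", "check coverage"]))  # -> 'assess damage'
```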
In a world of rapidly changing customer requirements and an increased role of technology, companies need more flexible systems to adapt their processes and react dynamically to changes. Adaptive Case Management (ACM) comes into consideration by providing a concept for adapting to changing business conditions. Within our research project we conducted a first foundational evaluation of the potential of ACM for supporting unpredictable sales processes. Based on a set of criteria, we tested the concept of ACM with the open-source tool Cognoscenti. The evaluation gave us the opportunity to experience the concept of ACM. Hence we are able to make a statement about the potential of ACM within the context of an unpredictable sales process, setting the path for further research and discussion of ACM in the area of sales processes.
A sequence of transactions represents a complex and multidimensional type of data. Feature construction can be used to reduce the data's dimensionality and to find behavioural patterns within such sequences. The patterns can be expressed using the blueprints of the constructed relevant features. These blueprints can then be used for real-time classification of other sequences.
An operating room is a stressful work environment. Nevertheless, everyone involved has to work safely, as there is no room for mistakes. To ensure a high level of concentration and seamless interaction, all involved persons have to know their own tasks and the tasks of their colleagues. The entire team must work synchronously at all times. To optimize the overall workflow, a task manager supporting the team was developed. In parallel, a common conceptual design for a business process visualization was developed, which makes all relevant information accessible in real time during surgery. In this context, an overview of all processes in the operating room was created, and different concepts for the graphical representation of these user-dependent processes were developed. This paper describes the concept of the task manager as well as the general concept in the field of surgery.
The proposed approach applies current unsupervised clustering approaches in a different, dynamic manner. Instead of taking all the data as input and finding clusters among them, the given approach clusters Holter ECG data (long-term electrocardiography data from a Holter monitor) on a given interval, which enables a dynamic clustering approach (DCA). To this end, advanced clustering techniques based on the well-known Dynamic Time Warping (DTW) algorithm are used. Given clusters, e.g., on a daily basis, the clusters can be compared by defining cluster shape properties. This provides a measure of variation in unsupervised cluster shapes and may reveal unknown changes in health status. Embedding this approach into wearable devices offers advantages over current techniques. On the one hand, users get feedback if the characteristics of their ECG data change unforeseeably over time, which makes early detection possible. On the other hand, cluster properties such as the biggest or smallest cluster may help a doctor in making diagnoses or observing several patients. Furthermore, known processing techniques such as stress detection or arrhythmia classification may be applied to the clusters found.
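For reference, here is a minimal Dynamic Time Warping distance, the building block of such interval-wise clustering; windowing, cluster-shape properties, and the Holter pipeline are omitted, and the beat data is invented:

```python
def dtw(a, b):
    """Classic O(len(a)*len(b)) DTW with absolute-difference local cost."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]

# Two beats with the same shape but shifted in time align cheaply:
beat_a = [0, 0, 1, 5, 1, 0, 0]
beat_b = [0, 1, 5, 1, 0, 0, 0]
print(dtw(beat_a, beat_b))  # 0.0 here: the warping absorbs the shift
```

This shift-invariance is why DTW, rather than a pointwise distance, is a natural choice for comparing heartbeats that are not perfectly aligned.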
New business concepts such as Enterprise 2.0 foster the use of social software in enterprises. Social production in particular significantly increases the amount of data in the context of business processes. Unfortunately, these data remain a buried treasure in many enterprises. Due to advances in data processing such as Big Data, the exploitation of context data becomes feasible. To provide a foundation for the methodical exploitation of context data, this paper introduces a classification based on two classes: intrinsic and extrinsic data.
Modern enterprises reshape and transform continuously through a multitude of management processes with different perspectives, ranging from business process management to IT service management and the management of information systems. Enterprise Architecture (EA) management seeks to provide such a perspective and to align the diverse management perspectives. To achieve and promote this alignment, EA management cannot rely on hierarchic management processes designed in a Tayloristic manner; it has to apply, conversely, bottom-up, information-centered coordination mechanisms to ensure that the different management processes are aligned with each other and with enterprise strategy. Social software provides such a bottom-up mechanism for support within EAM processes. Consequently, the challenges of EA management processes are investigated, and the contributions of social software are presented. A cockpit provides interactive functions and visualization methods to cope with this complexity and to enable the practical use of social software in enterprise architecture management processes.
Leveraging textual information for improving decision making in the business process lifecycle
(2015)
Business process implementations fail because requirements are elicited incompletely. At the same time, a huge amount of unstructured data is not used for decision-making during the business process lifecycle. Data from questionnaires and interviews is collected but not exploited, because the effort of doing so is too high. Therefore, this paper shows how to leverage textual information to improve decision-making in the business process lifecycle. To do so, text mining is used to analyze questionnaires and interviews.
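One plausible shape of the text-mining step (illustrative only; the answers are invented, and the scikit-learn based TF-IDF ranking is an assumption, not necessarily the paper's pipeline):

```python
# Rank terms in free-text questionnaire answers by TF-IDF to surface
# candidate requirements. Answers and tooling are illustrative assumptions.

from sklearn.feature_extraction.text import TfidfVectorizer

answers = [
    "approval step takes too long, need automatic escalation",
    "please add automatic notification when approval is granted",
    "the form is fine but reporting is missing",
]

vec = TfidfVectorizer(stop_words="english")
tfidf = vec.fit_transform(answers)
terms = vec.get_feature_names_out()
scores = tfidf.sum(axis=0).A1            # aggregate weight per term
top = sorted(zip(terms, scores), key=lambda x: -x[1])[:5]
print(top)                               # terms like 'approval', 'automatic' rank high
```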
The digitization of our society changes the way we live, work, learn, communicate, and collaborate. This disruptive change interacts with all information processes and systems, which have been important business enablers in the context of digitization for years. Our aim is to support flexibility and agile transformations for both business domains and related information technology and enterprise systems through the adaptation and evolution of digital enterprise architectures. The present research paper investigates collaborative decision mechanisms for adaptive digital enterprise architectures, extending original architecture reference models with state-of-the-art elements for agile architectural engineering for digitization and for collaborative architectural decision support.
The Seventh International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2015), held between May 24-29, 2015 in Rome, Italy, continued a series of international events covering a large spectrum of topics related to advances in the fundamentals of databases, the evolution of the relation between databases and other domains, database technologies and content processing, as well as the specifics of application-domain databases. Advances in different technologies and domains related to databases triggered substantial improvements for content processing, information indexing, and data, process, and knowledge mining. The push came from web services, artificial intelligence, and agent technologies, as well as from the generalization of XML adoption. High-speed communications and computations, large storage capacities, and load-balancing for distributed database access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms, and advanced indexing methods. Evolution in e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems puts pressure on database communities to push the 'de facto' methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
This paper presents a concurrency control mechanism that does not follow a 'one concurrency control mechanism fits all needs' strategy. With the presented mechanism, a transaction runs under several concurrency control mechanisms, and the appropriate one is chosen based on the accessed data. For this purpose, the data is divided into four classes based on its access type and usage (semantics). Class O (the optimistic class) implements a first-committer-wins strategy, class R (the reconciliation class) implements a first-n-committers-win strategy, class P (the pessimistic class) implements a first-reader-wins strategy, and class E (the escrow class) implements a first-n-readers-win strategy. Accordingly, the model is called O|R|P|E. Under this model, the TPC-C benchmark outperforms other concurrency control mechanisms such as optimistic Snapshot Isolation.
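The routing idea can be sketched as a simple per-column dispatch; the schema, column names, and strategy descriptions below are invented for illustration (mirroring the class definitions in the abstract), and the real mechanism's internals are omitted:

```python
# Per-data-class concurrency control dispatch (illustrative sketch).

CC_STRATEGY = {
    "O": "optimistic: first committer wins",
    "R": "reconciliation: first n committers win",
    "P": "pessimistic: first reader wins (lock on read)",
    "E": "escrow: first n readers win (bounded quantity)",
}

# Hypothetical schema annotation: each column declares its semantic class.
schema = {"customer.name": "O", "stock.quantity": "E",
          "order.status": "P", "account.balance": "R"}

def strategy_for(column):
    """Pick the concurrency control mechanism by the column's data class."""
    return CC_STRATEGY[schema[column]]

for col in schema:
    print(f"{col}: {strategy_for(col)}")
```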
Delivering value to customers in real time requires companies to deploy software in real time in order to expose features to users faster and to shorten the feedback loop. This allows for faster reaction and helps to ensure that development is focused on features providing real value. Continuous delivery is a development practice where software functionality is deployed continuously to the customer environment. Although this practice is established in some domains such as B2C mobile software, the B2B domain imposes specific challenges. This article presents a case study conducted in a medium-sized software company operating in the B2B domain. The objective of this study is to analyze the challenges and benefits of continuous delivery in this domain. The results suggest that technical challenges are only one part of the challenges a company encounters in this transition. The company must also address challenges related to customers and procedures. The core challenges are caused by having multiple customers with diverse environments and unique properties, whose business depends on the software product. Some customers require manual acceptance testing, while others are reluctant to adopt new versions. By utilizing continuous delivery, it is possible for the case company to shorten feedback cycles, increase the reliability of new versions, and reduce the amount of resources required for deploying and testing new releases.
Software development as an experiment system : a qualitative survey on the state of the practice
(2015)
An experiment-driven approach to software product and service development is gaining increasing attention as a way to channel limited resources to the efficient creation of customer value. In this approach, software functionalities are developed incrementally and validated in continuous experiments with stakeholders such as customers and users. The experiments provide factual feedback for guiding subsequent development. Although case studies on experimentation in industry exist, the understanding of the state of the practice and the encountered obstacles is incomplete. This paper presents an interview-based qualitative survey exploring the experimentation experiences of ten software development companies. The study found that although the principles of continuous experimentation resonated with industry practitioners, the state of the practice is not yet mature. In particular, experimentation is rarely systematic and continuous. Key challenges relate to changing organizational culture, accelerating development cycle speed, and measuring customer value and product success.
SF-failure, the inability of people to correctly determine the behavior of simple stock-and-flow structures, has been the subject of a long research stream. SF-failure can be attributed to different causes, one of them being a lack of domain-specific experience, i.e., familiarity with the problem context. In this article we present the continuation of an experiment examining the role of educational background in SF-performance. We base the question set on the Bathtub Dynamics tasks introduced by Booth Sweeney and Sterman (2000) and vary the cover stories. We describe how we developed and tested a new cover story for the engineering domain and implemented the recommendations from a prior study. We test three sets of questions with engineering students, which enables us to compare the results to a previous study in which we tested the questions with business students. The results largely support our hypothesis that context familiarity increases SF-performance. With our findings we further develop the methodology of research on SF-failure.
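For readers unfamiliar with such tasks, the underlying calculation is simple accumulation: a stock changes by the net flow each period, and subjects are asked to infer the resulting trajectory. A minimal sketch with invented numbers:

```python
# Bathtub-style accumulation: the stock integrates (inflow - outflow).
# All numbers are illustrative, not taken from the study materials.

inflow  = [50, 50, 75, 75, 50, 50, 25, 25]   # units per period
outflow = [25, 25, 50, 50, 75, 75, 50, 50]

stock = [100]                                 # assumed initial level
for i, o in zip(inflow, outflow):
    stock.append(stock[-1] + i - o)

print(stock)  # rises while inflow > outflow, falls once outflow exceeds it
```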
This paper describes the design and outcomes of an experimental study that addresses stock-and-flow failure from a cognitive perspective. It is based on the assumption that holistic (global) and analytic (local) processing are important cognitive mechanisms underlying the ability to infer the behavior of dynamic systems. In a stock-and-flow task that is structurally equivalent to the department store task, we varied the format in which participants are primed to think about an environmental system, in particular whether they are primed to concentrate on lower-level (local) or higher-level (global) system elements. 148 psychology, geography, and business students participated in our study. The students' answers support our hypothesis that global processing increases participants' ability to infer the overall system behavior. The beneficial influence of global presentation is even stronger when data are presented numerically rather than in the form of a graph. Our results suggest presenting complex dynamic systems in a way that facilitates global processing. This is particularly important as policy designers and decision-makers deal with complex issues in their everyday and professional lives.
Rapid value delivery requires a company to utilize empirical evaluation of new features and products in order to avoid unnecessary product risks. This helps to make data-driven decisions and to ensure that development is focused on features that provide real value for customers. Short feedback loops are a prerequisite, as they allow for fast learning and reduced reaction times. Continuous experimentation is a development practice in which the entire R&D process is guided by constantly conducting experiments and collecting feedback. Although the principles of continuous experimentation have been successfully applied in domains such as game software or SaaS, it is not obvious how to transfer continuous experimentation to the business-to-business domain. In this article, a case study from a medium-sized software company in the B2B domain is presented. The study objective is to analyze the challenges, benefits, and organizational aspects of continuous experimentation in the B2B domain. The results suggest that technical challenges are only one part of the challenges a company encounters in this transition. The company also has to address challenges related to customers and organizational culture. Unique properties of each customer's business play a major role and need to be considered when designing experiments. Additionally, the speed at which experiments can be conducted is relative to the speed at which production deployments can be made. Finally, the article shows how the study results can be used to modify development in the case company so that more feedback and data are used instead of opinions.
For years, agile methods have been considered the most promising route toward successful software development, and a considerable number of published studies report on the (successful) use of agile methods and the benefits companies gain from adopting them. Yet, since the world is not black or white, the question of what happened to the traditional models arises. Have traditional models been replaced by agile methods? How is the transformation toward Agile managed, and, moreover, where did it start? With this paper we close a gap in the literature by studying general process use over time to investigate how traditional and agile methods are used. Is there coexistence, or do agile methods accelerate the traditional processes' extinction? The findings of our literature study comprise two major results: First, studies and reliable numbers on general process model use are rare, i.e., we lack quantitative data on actual process use and, thus, often lack the ability to ground process-related research in practically relevant issues. Second, despite the assumed dominance of agile methods, our results clearly show that companies enact context-specific hybrid solutions in which traditional and agile development approaches are used in combination.
Managers recognize that software development project teams need to be developed and guided. Although technical skills are necessary, non-technical (NT) skills are equally, if not more, necessary for project success. Currently, there are no proven tools to measure the NT skills of software developers or software development teams. Behavioral markers (observable behaviors that have a positive or negative impact on individual or team performance) are beginning to be used successfully by the airline and medical industries to measure NT skill performance. The purpose of this research is to develop and validate a behavioral marker system tool that can be used by different managers or coaches to measure the NT skills of software development individuals and teams. This paper presents an empirical study conducted at the Software Factory, where users of the behavioral marker tool rated video clips of software development teams. The initial results show that the behavioral marker tool can be used reliably with minimal training.
Competing logics in evaluating employee performance : building compromises through conventions
(2015)
Current research argues that competing institutional logics can co-exist enduringly and investigates how organizations cope with such institutional complexity (Greenwood et al. 2011). However, the role of practices in handling competing logics has been overlooked, and it is currently understood only to a limited extent how organizations establish compromises between competing logics. We therefore investigated the recent performance appraisal reform of a German public sector organization that occurred in 2008 (see also Kozica, Brandl 2015). BAND (the pseudonym for our organization) has been using performance appraisals for several decades, and performance appraisals have become entrenched instruments (Zeitz, Mittal, McAulay 1999) for handling staff promotion decisions. While BAND accepted the accountability logic of the performance appraisal, the professional logic (which is based on trust and comradeship as high values of being professional in this organization) is accepted too, and BAND had established a fine-grained compromise between the different logics. During the recent reform of the performance appraisal system, however, this compromise broke up and challenged organizational members to (re-)arrange a compromise. Using the French convention school of thought (Boltanski, Thévenot 2006), we address how BAND copes with conflicting logics by forming compromises in organizational practices. We thereby show that the concept of convention is particularly promising for understanding how organizations deal with institutional complexity. More broadly, our argument contributes to the elaboration of an organizational theory for the institutional logics discussion that explains how organizational and individual actions are interlinked.
Software process improvement (SPI) has been around for decades: frameworks have been proposed, success factors have been studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out there? Are there new emerging approaches? What are the open issues? We still struggle to answer the question of what the current state of SPI and related research is. In this paper, we present initial results from a systematic mapping study to shed light on the field of SPI and to draw conclusions for future research directions. An analysis of 635 publications draws a big picture of SPI-related research of the past 25 years. Our study shows a high number of solution proposals, experience reports, and secondary studies, but only few theories. In particular, standard SPI models like CMMI and ISO/IEC 15504 are analyzed, enhanced, and evaluated for applicability, whereas these standards are critically discussed from the perspective of SPI in small-to-medium-sized companies, which leads to new specialized frameworks. Furthermore, we find a growing interest in success factors to aid companies in conducting SPI.
Entrepreneurs and small and medium enterprises usually have difficulties developing new prototypes and new ideas or testing new techniques. To help them, academic Software Factories, a new concept of collaboration between universities and companies, have been developed in recent years. Software Factories provide a unique environment for students and companies. Students benefit from the possibility of working in a real work environment, learning how to apply the state of the art of existing techniques, and showing their skills to entrepreneurs. Companies benefit from a risk-free, protected environment in which they can develop new ideas. Universities, finally, benefit from this setup as a perfect environment for empirical studies in an industry-like setting. In this paper, we present the network of academic Software Factories in Europe, showing how companies have already benefited from existing Software Factories and reporting success stories. The results of this paper can help grow the network of factories and help other universities and companies set up similar environments to boost the local economy.
Large power semiconductors are complex structures, their metallization usually containing many thousands of contacts or vias. Detailed FEM simulations of the whole device are therefore currently not feasible due to excessive simulation time.
This paper introduces a simulation approach that allows quick identification of regions critical with respect to lifetime through a simplified simulation. For this, the complex layers are replaced by a much simpler equivalent layer, allowing a simulation of the whole device, even including its package. In a second step, precise simulations taking all details of the structure into account are carried out, but only for the critical regions of interest. Thus, this approach gives detailed results where required while still taking the whole structure, including packaging, into consideration. Furthermore, the simulation time requirements are very moderate.
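As a rough illustration of the equivalent-layer idea, a rule-of-mixtures homogenisation could replace the heterogeneous via/metal layer by one with averaged thermal conductivity; the paper's actual equivalence criteria may differ, and all numbers below are assumptions:

```python
# Toy homogenisation of a via/metal layer into one equivalent layer
# via volume-fraction-averaged thermal conductivity. All values assumed.

k_metal = 237.0      # W/(m K), aluminium metallisation (assumed)
k_oxide = 1.4        # W/(m K), inter-metal dielectric (assumed)
via_fraction = 0.15  # area fraction of vias/contacts in the layer

# In-plane (parallel) and out-of-plane (series) bounds of the mixture:
k_parallel = via_fraction * k_metal + (1 - via_fraction) * k_oxide
k_series = 1.0 / (via_fraction / k_metal + (1 - via_fraction) / k_oxide)

print(f"equivalent layer k: in-plane ~{k_parallel:.1f}, "
      f"out-of-plane ~{k_series:.2f} W/(m K)")
```

The equivalent layer makes a full-device (and package-level) FEM model tractable; the detailed via geometry is then resolved only in the critical regions identified in the first pass.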