In this article, liposome-based coatings aiming to control drug release from therapeutic soft contact lens (SCL) materials are analyzed. A PHEMA-based hydrogel material loaded with levofloxacin is used as the model system for this research. The coatings are formed by polyelectrolyte layers containing liposomes of 1,2-dimyristoyl-sn-glycero-3-phosphocholine (DMPC) and DMPC:cholesterol (DMPC:CHOL). The effect of friction and temperature on the drug release is investigated. The friction tests aim to simulate the blinking of the eyelid in order to verify whether the liposome-coated SCL materials are able to keep their properties, in particular their drug release ability. It was observed that, under the study conditions, friction did not significantly affect the drug release from the liposome-coated PHEMA material. In contrast, increasing the release temperature increases the drug diffusion rate through the hydrogel. This phenomenon was recorded in both the control and the coated samples.
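The temperature effect described above is commonly captured by an Arrhenius-type diffusion coefficient combined with early-time Fickian release from a thin slab. A minimal sketch, with purely illustrative parameter values (`D_ref`, `Ea` and the slab thickness are assumptions, not values from the study):

```python
import math

def diffusion_coefficient(T_kelvin, D_ref=1e-10, T_ref=298.15, Ea=30e3):
    """Arrhenius-type temperature scaling of the diffusion coefficient.
    D_ref (m^2/s) and Ea (J/mol) are illustrative, not from the study."""
    R = 8.314  # gas constant, J/(mol K)
    return D_ref * math.exp(-Ea / R * (1.0 / T_kelvin - 1.0 / T_ref))

def fractional_release(t_s, thickness_m, D):
    """Early-time Fickian release from a thin slab:
    M_t/M_inf ~ 2*sqrt(D*t / (pi*L^2)), capped at 1."""
    frac = 2.0 * math.sqrt(D * t_s / (math.pi * thickness_m ** 2))
    return min(frac, 1.0)

# Higher temperature -> larger D -> faster release at any given time
D25 = diffusion_coefficient(298.15)
D35 = diffusion_coefficient(308.15)
```

Raising the temperature from 25 °C to 35 °C increases D and hence the release fraction at any given time, mirroring the trend reported for both control and coated samples.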
In this study, a novel strategy has been developed for the assembly of polyelectrolyte multilayers (PEM) on CaCO3 templates in acidic pH solutions: consecutive polyelectrolyte layers (heparin/poly(allylamine hydrochloride) or heparin/chitosan) were deposited on PEM hollow microcapsules previously established on CaCO3 templates. The PEM build-up, the characterization of the hollow capsules and the successful encapsulation of fluorescein 5(6)-isothiocyanate (FITC)-dextran by coprecipitation with CaCO3 are demonstrated. The removal of the CaCO3 core was improved by the depositions. In the release profile, a high retardation of the encapsulated FITC-dextran was observed. The combined-shell capsule system has potential for tailoring functional layer-by-layer capsules as intelligent drug delivery vehicles; preliminary in vitro tests showed responsiveness to enzymes.
This article covers the design of highly integrated gate drivers and level shifters for high speed, high power efficiency and dv/dt robustness, with a focus on automotive applications. With the introduction of the 48 V board net in addition to the conventional 12 V battery, there is an increasing need for fast-switching integrated gate drivers in the voltage range of 50 V and above. State-of-the-art drivers are able to switch 50 V in less than 5 ns. The high-voltage electrical drive train demands galvanically isolated, highly integrated gate drivers. A gate driver with bidirectional signal transmission (1 Mbit/s amplitude modulation, 10/20 MHz frequency modulation) and power transfer over a single transformer is discussed. The concept of high-voltage charge storing enables an area-efficient, fully integrated bootstrapping supply with 70 % less area consumption. EMC is a major concern in automotive systems. Gate drivers with slope control optimize EMC while maintaining good switching efficiency. A current-mode gate driver, which can change its drive current within 10 ns, achieves 20 dBµV lower emissions between 7 and 60 MHz and 52 % lower switching loss compared with a conventional constant-current gate driver.
The influence of turbidity on the Raman signal strengths of condensed matter is theoretically analyzed and measured with laboratory-scale equipment for remote sensing. The results show the quantitative dependence of back- and forward-scattered signals on the thickness and elastic-scattering properties of matter. In the extreme situation of thin, highly turbid layers, the measured Raman signal strengths exceed their transparent analogs by more than a factor of ten. The opposite behavior is found for thick layers of low turbidity, where the presence of a small amount of scatterers leads to a decrease of the measured signal. The wide range of turbidities appearing in nature is experimentally realized with stacked polymer layers and solid/liquid dispersions, and theoretically modeled by the equation of radiative transfer using the analytical diffusion approximation or random walk simulations.
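The random-walk modelling mentioned above can be illustrated with a toy one-dimensional photon walk through a slab: photons take exponentially distributed steps with a given elastic mean free path and scatter isotropically, and the fractions escaping backward and forward are counted. This is a deliberately simplified sketch, not the radiative-transfer model of the paper:

```python
import random

def photon_walk(thickness, mfp, n_photons=5000, seed=1):
    """Toy 1-D photon random walk through a slab of given thickness.
    Each photon takes exponential steps of mean free path `mfp` and
    scatters into direction +1 or -1 with equal probability. Returns
    the fractions escaping backward (reflected) and forward (transmitted)."""
    rng = random.Random(seed)
    back = fwd = 0
    for _ in range(n_photons):
        x, direction = 0.0, 1  # photon enters at x = 0 moving inward
        while True:
            x += direction * rng.expovariate(1.0 / mfp)
            if x < 0.0:
                back += 1
                break
            if x > thickness:
                fwd += 1
                break
            direction = rng.choice((-1, 1))  # isotropic elastic scattering
    return back / n_photons, fwd / n_photons
```

For a short mean free path (high turbidity) most photons escape backward, while a nearly transparent slab transmits most photons forward, qualitatively matching the back/forward-scattering behaviour described.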
The purpose of this paper is to explain the key aspects and growing relevance of sustainability in fashion retail, to evaluate the possibilities for fashion retailers to act sustainably in supply chain management, and to carve out the challenges they have to deal with. The research methodology applied for this purpose is a critical literature review examining books and articles. The findings demonstrate the rising importance of sustainability in fashion retail. In this regard, fashion retailers play a key role and bear responsibility for sustainability in the fashion supply chain, from the beginning up to the end. This paper mainly analyzes sustainability in the fashion supply chain. It does not analyze topics like second-hand shopping or social media sustainability.
The purpose of this paper is to investigate how the practice of closed-loop production systems (CLPS) is implemented in the fashion industry. This paper offers a critical literature review to present a thorough understanding of the current status of the literature. Subsequently, the paper reveals that CLPS are of great importance. Generally, such systems include different activities that have to be integrated. Critical points are the product acquisition, the recovery process itself and the remarketing to the customer. A lack of reliable data concerning CLPS in the specific case of the fashion industry can be identified. Important research fields could be marketing strategies, control of the acquisition process, evolution of return technologies and strategies, adaptation of recovered products to the mass market, and the development of new technologies for recovery processes.
The purpose of this paper is to examine the impact of fashion production and consumption on sustainability in order to discuss what the main lever is to reduce the negative impact. The research methodology applied is a literature review examining academic references. Key findings suggest that fashion production and consumption have comparable impacts on sustainability. Moreover, as fashion production follows demand, the consumer steers production in a certain direction. Therefore, consumers bear responsibility and need to be informed. To reach a long-term change in the fashion industry, the consumer has to be the focus of sustainability efforts. Most results in the literature were obtained by qualitative research methods, so further quantitative testing of the results is recommended. Furthermore, most surveys were conducted with young fashion consumers in the EU or UK, which does not represent the fashion consumer in general.
The purpose of this paper is to define what impacts sustainable manufacturing standards have on the communication policy of retail brands and to find possible solutions for how companies can deal with them. To this end, sustainable standards and their impacts on internal and external communication are described. The ensuing discussion finds possible solutions for the negative impacts. A literature discussion has been conducted to investigate this purpose. Generally, there are many impacts fashion retailers have to consider if they want to transform their company to become more sustainable; even the impacts on a single, defined part of the communication policy were found to be substantial. A limitation of this paper is that the proposals for how retailers could deal with the impacts of transforming the company toward more sustainability need further research and testing before they are practicable.
The purpose of this paper is to highlight potentials and limitations of the prosumer concept in fashion retail. The paper illustrates the evolution of prosumption and the directions in which the concept is being developed. The primary research is based on a literature review containing different sources of academic and non-academic references. Findings suggest that the prosumer concept is not a new phenomenon. Recently, it has moved into the focus of companies that have noticed that it is effective for engaging with customers in order to strengthen their brand loyalty. An increasing number of companies offer innovative business models built on the concept. However, smart prosuming machines have lately been changing the objectives of the concept. Even though the prosumer concept has existed for many years and scholars continuously investigate its potentials, it is the fashion industry that has been researched comparatively little up to now.
The second hand concept has recently become a growing trend in clothing, leading to a growing number of second hand shops and the development of new second hand retail forms. This paper concentrates on the current second hand market for fashion products and presents the different motives for second hand consumption as well as alternative consumption channels for second hand products. The findings of the paper are founded on literature research of academic articles and case studies. Results show that there is a high potential for the second hand market due to the increasing interest of consumers in buying second hand products. The paper concentrates on the second hand market for fashion products in western society. This means that there was no research on second hand products for disadvantaged people in poor countries. Furthermore, the paper focuses on the formal second hand retail channels to see what is already on the market.
The purpose of this paper is to contrast existing consumption patterns caused by fast fashion with a newly appearing form of consumption, and to assess its potential as an alternative and sustainable form of fast fashion consumption. This research is set up on a theoretical background of scientific literature, including governmental and press releases, in order to evaluate the status quo of consumption and to answer the research question. A new consumption pattern as well as an emerging sharing economy can be identified, including the potential of growing businesses and sustainable alternatives to fast fashion. The framework of the research is limited to the textile and fashion industry in industrialized countries, focusing on consumption in the twenty-first century.
The purpose of this research is to explore the current boundaries of the fashion industry’s second hand market and which solutions and approaches can be adopted from the used-car industry. The paper is based on the study of existing literature dealing with sustainability in combination with second hand markets in general and adaptable features of the used-car industry. Adaptable features are identified using the business model canvas. The key finding of this study is that the fashion industry faces immense social and environmental challenges which can be partly addressed by developing the second hand market. The used-car industry can be seen as a role model for fashion retail. In this study, only aspects of used-car distribution are highlighted; characteristics of the recycling of used cars are not examined.
The purpose of this paper is to study the recycling form of reusing second hand clothing from a conventional fashion brand’s perspective. It clarifies which measures and activities a fashion company needs to integrate into its value chain in order to offer branded second hand merchandise in a self-operated store. The research paper relies on desk-based research and aims to illustrate the topic by means of a descriptive approach, processing the existing literature. Key findings demonstrate that fashion brands need to integrate complete lifecycle strategies, sustainability communication, and reverse logistics structures, such as take-back schemes, in order to offer second hand clothing. The main limitations derive from the research design. Further empirical studies need to be conducted for a more fundamental understanding of the new business model.
The purpose of this paper is to investigate the sustainable closed-loop supply chain of the fashion brand Filippa K. Information on green fashion has been gathered and a case study on the fashion retailer Filippa K conducted. Results show a shift in knowledge content between a fast fashion supply chain and a sustainable supply chain. There is also an evolution in sustainability, as companies, retailers and manufacturers come under pressure from customers, governments and the media. Sustainable fashion brands like Filippa K are interested in sharing precise knowledge on a variety of aspects linked to the sustainable closed-loop supply chain. This research has been limited by the scarce information and unexplored topics in the field of green fashion, which led to a critical personal engagement with the brand Filippa K.
This study focuses on the different roles of social media in promoting a sustainable lifestyle, behaviour and consumption, especially with regard to the typically non-ethical fashion industry. The research findings include eight roles of social media influencing sustainable consumption, in contrast to prior research naming one to five impacts. Results show that social media educates and engages the young and ethically interested target group, besides increasing supply chain transparency and brand or theme awareness. Furthermore, social media provides a platform for organisations’ relationship management and social interaction, since users are empowered to share experiences, which leads to a higher level of trust.
The purpose of this paper is to identify the potential of a fashion fTRACE (ffTRACE) application that gives transparent insight into the supply chain of a fashion item. The research methodology applied to this purpose is a literature review examining academic references. The key findings of this paper are that information plays a major role in the consumer decision process and therefore benefits the demand for sustainable products. Giving the right information content in a transparent, credible and understandable way is important. It is found that the functions of such an application would be able to satisfy this consumer demand and therefore have the potential to raise the sales of a sustainable company as well as to increase the brand’s awareness and improve its image. While mainly indicating the potentials of the ffTRACE application, the paper does not examine their relevance.
Since there is no denying that transparency is increasingly central to corporate sustainability, this paper presents a case study of a company’s attempt to be fully transparent, thereby picking up the existing scholarly conversation about uncompromising supply chain transparency. The literature so far was found to be fairly limited but, following a trend, has grown in recent years. To address these shortcomings, an in-depth literature review of the multiple dimensions of supply chain transparency has been performed and the links within supply networks stressed. On this basis, a case study of the fashion label Honest by has been drafted and its effort to become the world’s first 100 % transparent company examined. The findings are discussed with regard to whether more supply chain transparency is desirable in every case; obstacles are listed and an outlook for this kind of business model is drawn. The research is clearly limited by the amount of scholarly literature concerning Honest by in particular. For this reason, magazines and journal entries are used as references as well; only the extension of the topic to supply chain transparency in general and the preceding literature review gave the paper the necessary academic grounding. Concerning implications, it needs to be mentioned that even though Honest by demonstrates full transparency, it was not possible to find any public information about the degree of the supplier relationships, in particular concerning the control mechanisms applied to exert influence and to balance out the power gradient between the company and its suppliers.
Social networks, smart portable devices and the Internet of Things (IoT), based on technologies like big data analytics and cloud services, are emerging to support flexibly connected products and agile services as the new wave of digital transformation. Biological metaphors of living and adaptable ecosystems with service-oriented enterprise architectures provide the foundation for self-optimizing and resilient run-time environments for intelligent business services and related distributed information systems. We extend Enterprise Architecture (EA) with mechanisms for the flexible adaptation and evolution of information systems comprising distributed IoT and other micro-granular digital architectures, in order to support the next generation of digitized products, services and processes. Our aim is to support flexibility and agile transformation for both IT and business capabilities through adaptive digital enterprise architectures. The present research paper additionally investigates decision mechanisms in the context of multi-perspective explorations of enterprise services and Internet of Things architectures by extending original enterprise architecture reference models with state-of-the-art elements for architectural engineering and digitization.
Social sustainable supply chain management in the textile and apparel industry: a literature review
(2017)
A vast number of studies on sustainability in supply chain management have been conducted by academics over the last decade. Nevertheless, socially related aspects are still neglected in the discussion. The primary motivation of the present literature review has arisen from this shortcoming; thus the key purpose of this study is to enrich the discussion by providing a state-of-the-art overview focusing exclusively on social issues in sustainable supply chain management (SSCM), with the textile/apparel sector as the field of application. The authors conduct a literature review, including a content analysis covering 45 articles published in English peer-reviewed journals, and propose a comprehensive map which integrates the latest findings on socially related practices in the textile/apparel industry with the dominant conceptualization in order to reveal potential research areas in the field. The results show an ongoing lack of investigation regarding the social dimension of the triple bottom line in SSCM. Findings indicate that a company’s internal orientation is the main assisting factor in sustainable supply chain management practices. Further, supplier collaboration and assessment can be interpreted as an offer for suppliers deriving from stakeholders and a focal company’s management of social risk. Nevertheless, suppliers also face or even create huge barriers in improving their social performance. This calls for more empirical research and qualitative or quantitative survey methods, especially at the supplier level in developing countries.
Purpose: Human breath analysis is proposed with increasing frequency as a useful tool in clinical application. We performed this study to find the characteristic volatile organic compounds (VOCs) in the exhaled breath of patients with idiopathic pulmonary fibrosis (IPF) for discrimination from healthy subjects. Methods: VOCs in the exhaled breath of 40 IPF patients and 55 healthy controls were measured using a multi-capillary column and ion mobility spectrometer. The patients were examined by pulmonary function tests, blood gas analysis, and serum biomarkers of interstitial pneumonia. Results: We detected 85 VOC peaks in the exhaled breath of IPF patients and controls. IPF patients showed 5 significant VOC peaks: p-cymene, acetoin, isoprene, ethylbenzene, and an unknown compound. The VOC peak of p-cymene was significantly lower (p < 0.001), while the VOC peaks of acetoin, isoprene, ethylbenzene, and the unknown compound were significantly higher (p < 0.001 for all) compared with the peaks of controls. Comparing VOC peaks with clinical parameters, negative correlations with VC (r = −0.393, p = 0.013), %VC (r = −0.569, p < 0.001), FVC (r = −0.440, p = 0.004), %FVC (r = −0.539, p < 0.001), DLco (r = −0.394, p = 0.018), and %DLco (r = −0.413, p = 0.008) and a positive correlation with KL-6 (r = 0.432, p = 0.005) were found for p-cymene. Conclusion: We found 5 characteristic VOCs in the exhaled breath of IPF patients. Among them, the VOC peaks of p-cymene were related to the clinical parameters of IPF. These VOCs may be useful biomarkers of IPF.
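The r values reported above are plain Pearson correlation coefficients between VOC peak intensities and clinical parameters. For reference, the standard computation (this is the textbook formula, not the study's analysis code):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equally long samples,
    e.g. a VOC peak intensity versus a lung-function parameter such as %VC."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

A negative r, as found for p-cymene against VC or FVC, means higher peak intensities coincide with lower lung-function values.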
This paper investigates the electrothermal stability and the predominant defect mechanism of a Schottky gate AlGaN/GaN HEMT. Calibrated 3-D electrothermal simulations are performed using a simple semiempirical dc model, which is verified against high-temperature measurements up to 440 °C. To determine the thermal limits of the safe operating area, measurements up to destruction are conducted at different operating points. The predominant failure mechanism is identified to be hot-spot formation and subsequent thermal runaway, induced by large drain–gate leakage currents that occur at high temperatures. The simulation results and the high-temperature measurements confirm the observed failure patterns.
Nenne sie niemals Senioren! (Never call them seniors!)
(2017)
In times of dynamic markets, enterprises have to be agile to be able to react quickly to market influences. Due to the increasing digitization of products, the enterprise IT is often affected when business models change. Enterprise Architecture Management (EAM) targets a holistic view of the enterprise’s IT and its relations to the business. However, Enterprise Architectures (EA) are complex structures consisting of many layers, artifacts and relationships between them. Thus, analyzing EAs is a very complex task for stakeholders. Visualizations are common vehicles to support analysis. However, in practice, visualization capabilities lack flexibility and interactivity. A solution to improve the support of stakeholders in analyzing EAs might be the application of visual analytics. Starting from a systematic literature review, this article investigates the features of visual analytics relevant for the context of EAM.
We present a fully automatic approach to real-time 3D face reconstruction from monocular in-the-wild videos. With cascaded-regressor-based face tracking and 3D morphable face model shape fitting, we obtain a semi-dense 3D face shape. We further use the texture information from multiple frames to build a holistic 3D face representation from the video footage. Our system is able to capture facial expressions and does not require any person-specific training. We demonstrate the robustness of our approach on the challenging 300 Videos in the Wild (300-VW) dataset. Our real-time fitting framework is available as an open-source library at http://4dface.org.
It is assumed that more education leads to better understanding of complex systems. Some researchers, however, find indications that simple mechanisms like stocks and flows are not well understood even by people who have completed higher education. In this paper, we test people’s understanding of complex systems with the widely studied stock-and-flow (SF) tasks. SF tasks assess people’s understanding of the interplay between stocks and flows. We investigate SF failure of domain experts and novices in different knowledge domains. In particular, we compare performance on the original study’s bathtub task with the square-wave pattern against two alternative cover stories from the engineering and business domains, using different groups of business and engineering students from different semesters. Further, we show that, while engineering students perform better than business students, students may lose the capability of dealing with simple SF tasks as they progress through higher education. We thus find hints of déformation professionnelle in higher education.
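The bathtub SF task reduces to simple stock accounting: the stock at each step equals the previous stock plus inflow minus outflow. A minimal sketch with a square-wave inflow against a constant outflow (the numbers are illustrative, not the original task's values):

```python
def simulate_stock(inflow, outflow, initial=100.0):
    """Stock-and-flow accounting: stock(t+1) = stock(t) + inflow(t) - outflow(t).
    Returns the full stock trajectory including the initial value."""
    stock, trajectory = initial, [initial]
    for i, o in zip(inflow, outflow):
        stock += i - o
        trajectory.append(stock)
    return trajectory

# Square-wave inflow alternating 25/75 L/min against a constant 50 L/min outflow
inflow = [25, 25, 75, 75] * 4
outflow = [50] * len(inflow)
traj = simulate_stock(inflow, outflow)
```

The common error pattern in SF tasks is matching the stock trajectory to the shape of the inflow; the accounting above shows the stock instead falls while inflow is below outflow and peaks where inflow crosses below outflow again.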
Nowadays, CHP units are discussed for the production of electricity on demand rather than for the generation of heat with electricity as a by-product. By this means, CHP units are capable of satisfying a higher share of the electricity demand on-site, and in this new role they are able to reduce the load on the power grid and to compensate for high fluctuations of solar and wind power.
Evidently, a novel control strategy for CHP units is required in order to shift from heat-demand-oriented operation to operation led by the electricity demand. Nevertheless, the heat generated by the CHP unit needs to be utilized completely in any case to maintain energetic as well as economic efficiency. Such a strategy has been developed at Reutlingen University and is presented in this paper. Part of the strategy is an intelligent management of the thermal energy storage (TES), ensuring that the storage holds little heat just before an electricity demand calls the CHP unit into operation. Moreover, a proper forecast of both heat and electricity demand is incorporated, and the requirements of the CHP unit in terms of maintenance and lifetime are considered by limiting the number of starts and stops per unit time and by maintaining a certain minimum length of the operation intervals.
All aspects of this novel control strategy, which has been implemented on a controller for further testing at two field sites, are presented in the paper. Results from these tests are given, as well as results from a simulation model that evaluates the performance of the control strategy over an entire year.
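The storage-management idea described above can be condensed into a dispatch rule: start the unit only when on-site electricity demand is present and the TES has enough headroom to absorb the co-generated heat, and enforce a minimum runtime to limit starts and stops. A simplified sketch; every threshold and parameter name here is an illustrative assumption, not a value from the field implementation:

```python
def should_start_chp(elec_demand_kw, chp_elec_kw, tes_level_kwh, tes_capacity_kwh,
                     min_headroom=0.5, running=False, runtime_h=0.0, min_runtime_h=1.0):
    """Electricity-led dispatch rule sketch: run the CHP unit only when
    (a) the on-site electricity demand justifies its electrical output and
    (b) the thermal store can absorb the co-generated heat.
    A running unit is kept on until a minimum runtime is reached,
    limiting the number of starts and stops per unit time."""
    if running and runtime_h < min_runtime_h:
        return True  # respect minimum operation interval
    headroom = 1.0 - tes_level_kwh / tes_capacity_kwh
    demand_ok = elec_demand_kw >= 0.5 * chp_elec_kw  # demand covers at least half the output
    return demand_ok and headroom >= min_headroom
```

In the full strategy this decision would additionally use heat and electricity demand forecasts to empty the store just before an anticipated start; the rule above only checks the instantaneous state.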
THE PROBLEM: Companies create problems for customers and employees when product innovation goes unmanaged. Eventually, excessive operational complexity hurts the bottom line.
THREE SOLUTIONS: Focus on product integration, not product proliferation. Make sure your product developers work closely with customer-facing and operational employees. And settle on a high-level purpose that can guide decision making.
This research is about Omnichannel Retailing and addresses the question of how the omnichannel capability of retailers in the fashion market can be measured. Our sources include books, interviews, newspapers and scientific databases.
Omnichanneling is a current topic in the fashion market: retailers all over the world face the question of how to adapt to the challenges Omnichannel Retailing sets. We first define omnichanneling by explaining the differences between Multiple-, Multi-, Cross- and Omnichannel Retailing. We then take a set of 26 retailers and evaluate their Omnichannel capabilities, creating an index with criteria that measure the Omnichannel capability of each retailer.
The Omnichannel Score is based on 31 criteria which analyze the retailers’ offline, online, mobile and social aspects and make differences between retailers visible. Our findings were that retailers in the US fashion market are more advanced in Omnichannel Retailing than retailers in the German fashion market. Our top three Omnichannel retailers were Sears with an Omnichannel Score of 91, followed by KOHL’S and Marks & Spencer, both with an Omnichannel Score of 88. The best Omnichannel retailer from Germany was Adidas in fourth place with an Omnichannel Score of 81.
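A scoring index of this kind can be sketched as an equally weighted sum over fulfilled criteria, scaled to 0-100. The actual 31 criteria and any weighting used in the study are not reproduced here, and the example retailer below is hypothetical:

```python
def omnichannel_score(criteria):
    """Index sketch: each criterion is rated 0.0-1.0 (fulfilled or not);
    the score is the mean fulfilment scaled to 0-100 and rounded."""
    if not criteria:
        raise ValueError("no criteria given")
    return round(100.0 * sum(criteria.values()) / len(criteria))

# Hypothetical retailer fulfilling 28 of 31 equally weighted criteria
example = {f"criterion_{i}": 1.0 for i in range(28)}
example.update({f"criterion_{i}": 0.0 for i in range(28, 31)})
score = omnichannel_score(example)
```

With equal weights a retailer fulfilling 28 of 31 criteria scores 90, in the same range as the top retailers reported above.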
There is no doubt that the amplification of channel integration towards an omnichannel structure is a powerful idea whose time has finally come. The digitally cross-linked world postulates all-encompassing, ubiquitous and unobtrusive future services. In the concomitant, increasingly competitive market, retailers are starting to lay the foundation for omnichannel, meeting the expectations of a digitally savvy audience that wants its shopping experience to be as seamless and uncomplicated as possible. Nevertheless, recent research shows that there are still enough avenues for further research on omnichannel. Until now, the performance of companies was considered solely by experts from a supplier’s point of view. It would be rather interesting to find out whether the desire to meet increased customer expectations is also recognized by the customers themselves. This paper seeks to answer how purchasing behavior has changed and what customers demand. In addition, it elaborates the opportunities promoted by omnichannel. Having worked out these effects, the paper reaches its final step, showing how the omnichannel performance of fashion and lifestyle retailers can be measured from a consumer’s perspective by developing an exclusive index. The study is confined to four fashion and lifestyle retailers: Hugo Boss AG, Levi Strauss & Co, Pull and Bear as well as COS. Using the scientific method of mystery shopping and a multi-item checklist including 54 key performance indicators, the paper examines to what extent the four selected retailers provide a seamless customer journey according to the five decision-making phases.
The purpose of this paper is to determine the relevance of social media for luxury brand management. It employs a multi-methodological approach: after analyzing the online performance of the three luxury brands Burberry, Louis Vuitton and Gucci, the empirical research includes a survey as well as an eye-tracking test executed with Tobii Studio. The findings reveal that online and social media have given luxury fashion businesses the opportunity to establish a sustainable interaction with their customers and distinguish themselves from the competition. Still, the online business holds many challenges for luxury companies to overcome. This paper gives instructions as to how social media can be effectively incorporated into a luxury company.
The conventional view of the value-creation chain suggests offering high-value propositions at the product level (in terms of benefits provided by elements of the product) to attain high-value perceptions at the customer level, which should ultimately result in high-value appropriation at the firm level (i.e. relationship, volume, pricing and financial success). This study challenges this view and provides a differentiated understanding of the value creation chain. With a multi-industry sample of 339 companies and a sample of 626 customers to validate managerial assessments, the authors apply a configurational approach to identify whether and to what extent offering high-value propositions at the product level is necessary or sufficient for achieving superior value perceptions at the customer level and high-value appropriation at the firm level. Taking into account the company-internal and company-external environment of the value-creation chain, the study identifies seven value creation chain constellations.
This work presents a spiral antenna array for use in the V- and W-band. An array fed with Dolph-Chebyshev coefficients is investigated to address the low gain and side-lobe level of the radiating structure. The challenge is to provide an antenna that is not only well matched but also offers an appreciable effective bandwidth in the frequency bands of interest. Its radiation properties, including the effective bandwidth and the gain, are analyzed for the W-band.
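A Dolph-Chebyshev taper trades main-lobe width for a uniform side-lobe level fixed at design time. The textbook construction samples a Chebyshev polynomial around the unit circle and inverse-transforms the samples into element excitations; a sketch restricted to odd element counts (this is the standard window construction, not the paper's specific feed design):

```python
import cmath
import math

def _cheb_poly(n, x):
    """Chebyshev polynomial T_n(x), valid for |x| <= 1 and |x| > 1."""
    if abs(x) <= 1.0:
        return math.cos(n * math.acos(x))
    if x > 1.0:
        return math.cosh(n * math.acosh(x))
    return (-1) ** n * math.cosh(n * math.acosh(-x))

def dolph_chebyshev(n_elements, sidelobe_db):
    """Excitation coefficients of an n-element Dolph-Chebyshev taper with
    the given side-lobe suppression in dB (odd n only; peak normalized to 1)."""
    assert n_elements % 2 == 1, "sketch handles odd element counts only"
    order = n_elements - 1
    ripple = 10.0 ** (sidelobe_db / 20.0)
    beta = math.cosh(math.acosh(ripple) / order)
    # sample the Chebyshev-shaped array factor ...
    samples = [_cheb_poly(order, beta * math.cos(math.pi * k / n_elements))
               for k in range(n_elements)]
    # ... and inverse-DFT the samples to obtain the element excitations
    w = []
    for m in range(n_elements):
        acc = sum(s * cmath.exp(2j * math.pi * k * m / n_elements)
                  for k, s in enumerate(samples))
        w.append(acc.real)
    half = (n_elements + 1) // 2
    w = w[half:] + w[:half]  # rotate so the taper is centered on the array
    peak = max(w)
    return [x / peak for x in w]
```

For a 7-element array at 30 dB side-lobe suppression this yields a symmetric taper, largest at the center element, which is the qualitative shape applied to the spiral array feed.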
We present a topology of MIMO arrays of inductive antennas exhibiting inherently high crosstalk-cancellation capabilities. A single-layer PCB is etched into a 3-channel array of emitting/receiving antennas. Once coupled with a similar 3-channel emitter/receiver, we measured an Adjacent Channel Rejection Ratio (ACRR) as high as 70 dB from 150 Hz to 150 kHz. Another primitive device, made of copper wires wound around PVC tubes to form a 2-channel "non-contact slip ring", exhibited 22 dB to 47 dB of ACRR up to 15 MHz. In this paper we introduce the underlying theoretical model behind the crosstalk-suppression capabilities of these so-called "Pie-Chart antennas": an extension of the mutual-inductance compensation method to a higher number of channels using symmetries. We detail the simple iterative building process of these antennas, illustrate it with numerical analysis, and evaluate their effectiveness via real experiments on the 3-channel PCB array and the 2-channel rotary array up to the limit of our test setup. The Pie-Chart design is primarily, but not exclusively, intended as an alternative to costly electronic filters or cumbersome EM shields in both wireless and wired applications.
Energy transfer kinetics in photosynthesis as an inspiration for improving organic solar cells
(2017)
Clues to designing highly efficient organic solar cells may lie in understanding the architecture of light-harvesting systems and exciton energy transfer (EET) processes in very efficient photosynthetic organisms. Here, we compare the kinetics of excitation-energy tunnelling from the intact phycobilisome (PBS) light-harvesting antenna system to the reaction center in photosystem II in intact cells of the cyanobacterium Acaryochloris marina with the charge transfer after conversion of photons into photocurrent in vertically aligned carbon nanotube (va-CNT) organic solar cells with poly(3-hexyl)thiophene (P3HT) as the pigment. We find the kinetics of electron-hole creation following excitation at 600 nm to be 450 fs in the PBS and 500 fs in va-CNT solar cells. The EET process has 3 and 14 ps pathways in the PBS, while in va-CNT solar cell devices the charge trapping in the CNT takes 11 and 258 ps. We show that the main hindrance to the efficiency of va-CNT organic solar cells is the slow migration of the charges after exciton formation.
Digitisation forms a part of Industrie 4.0 and both threatens and provides an opportunity to transform business as we know it; it can make entire business models redundant. Although companies might realise the need to digitise, many are unsure of how to start this digital transformation. This paper addresses the problems and challenges faced in digitisation, and develops a model for initialising digital transformation in enterprises. The model is based on a continuous improvement cycle and includes triggers for innovative and digital thinking within the enterprise. The model was successfully validated in the German service sector.
Propofol is a commonly used intravenous general anesthetic. Multi-capillary column (MCC) coupled ion-mobility spectrometry (IMS) can be used to quantify exhaled propofol, and thus estimate plasma drug concentration. Here, we present results of the calibration and analytical validation of a MCC/IMS pre-market prototype for propofol quantification in exhaled air.
In vitro cultured cells produce a complex extracellular matrix (ECM) that remains intact after decellularization. The biological complexity derived from the variety of distinct ECM molecules makes these matrices ideal candidates for biomaterials. Biomaterials with the ability to guide cell function are a topic of high interest in biomaterial development. However, these matrices lack specific addressable functional groups, which are often required for their use as a biomaterial. Due to the biological complexity of the cell-derived ECM, it is a challenge to incorporate such functional groups without affecting the integrity of the biomolecules within the ECM. The azide-alkyne cycloaddition (click reaction, Huisgen reaction) is an efficient and specific ligation reaction that is known to be biocompatible when strained alkynes are used to avoid the use of copper (I) as a catalyst. In our work, the ubiquitous modification of a fibroblast cell-derived ECM with azides was achieved through metabolic oligosaccharide engineering by adding the azide-modified monosaccharide Ac4GalNAz (1,3,4,6-tetra-O-acetyl-N-azidoacetylgalactosamine) to the cell culture medium. The resulting azide-modified network remained intact after removing the cells by lysis, and the molecular structure of the ECM proteins was unimpaired after a gentle homogenization process. The biological composition was characterized in order to show that the functionalization does not impair the complexity and integrity of the ECM. The azides within this "clickECM" could be accessed by small molecules (such as an alkyne-modified fluorophore) or by surface-bound cyclooctynes to achieve a covalent coating with clickECM.
Purpose: The purpose of this study was to investigate the value of the web representation of certain fashion hot spots and how these results can be shown on fashion maps in an illustrated way.
Design/methodology/approach: A new ranking was created, which was evaluated with a self-instructed index, to gain solid results. Numbers were collected from Google, Instagram, Facebook, Twitter and web.alert.io. Additionally, fashion maps were created for an illustrative visualization of the results.
Findings: Compared with the ranking of the trend-forecasting agency Global Language Monitor, which compiled a ranking of non-virtual fashion cities, the web representation and therefore the ranking of the research project differs mainly in the positions of the cities within the top 10, i.e. the rank at which a city appears, and less in the actual cities mentioned.
Research limitations: The research was limited by the subjective analysis of data, leading to partly subjective results, as well as by the selected number of social media platforms that were used.
Originality/value: This is the first study to explore the web-representation value of fashion metropolises in comparison to their non-virtual ranking. The results partly build on existing findings concerning transformations of fashion cities or, in general, which cities hold the status of a fashion city.
Purpose: The purpose of this paper is to examine the service of the new business model Curated Shopping in the fashion industry, and to analyze whether the service provides a higher customer added value in comparison to traditional services in retail stores and e-commerce platforms. It offers implications for curated-shop operators on how to optimize the service in each stage of the customer buying process.
Design/methodology/approach: The research methodology applied is an empirical study that uses the principle of mystery shopping in order to investigate the services provided during the selling process.
Findings: The study showed that information about the customer should be collected carefully and as holistically as possible in order to assemble a suitable outfit. The consumer is able to benefit from the service by saving time and enjoying a stress-free way of shopping. Nevertheless, the physical distance to the customer limits the curator's ability to give individual and inspiring advice.
Research limitations: The survey was conducted among 10 mystery shoppers and 4 curated-shop operators in Germany, limiting findings to these mystery shoppers and operators.
Practical implications: One implication for the shop operators is to collect consumer information carefully and to expand the assortment and brand portfolio in order to provide fashion goods that inspire the consumer. The shop operators are on the right track, but there is still huge potential to provide a more shopper-oriented service.
Newly developed active pharmaceutical ingredients (APIs) are often poorly soluble in water. As a result, the bioavailability of the API in the human body is reduced. One approach to overcome this restriction is the formulation of amorphous solid dispersions (ASDs), e.g., by hot-melt extrusion (HME). Thus, the poorly soluble crystalline form of the API is transferred into a more soluble amorphous form. To reach this aim in HME, the APIs are embedded in a polymer matrix. The resulting amorphous solid dispersions may contain small amounts of residual crystallinity and have the tendency to recrystallize. For the controlled release of the API in the final drug product, the amount of crystallinity has to be known. This review assesses the available analytical methods that have recently been used for the characterization of ASDs and the quantification of crystalline API content. Well-established techniques like near- and mid-infrared spectroscopy (NIR and MIR, respectively) and Raman spectroscopy, as well as emerging ones like UV/VIS, terahertz, and ultrasonic spectroscopy, are considered in detail. Furthermore, their advantages and limitations are discussed with regard to general practical applicability as process analytical technology (PAT) tools in industrial manufacturing. The review focuses on spectroscopic methods which have proven most suitable for in-line and on-line process analytics. A further aspect is spectroscopic techniques that have been or could be integrated into an extruder.
Integrated power semiconductors are often used for applications with cyclic on-chip power dissipation. This leads to repetitive self-heating and thermo-mechanical stress, causing fatigue of the on-chip metallization and possibly destruction by short circuits. Because of this, an accurate simulation of the thermo-mechanical stress is needed already during the design phase to ensure that lifetime requirements are met. However, a detailed thermo-mechanical simulation of the device, including the on-chip metallization, is prohibitively time-consuming due to its complex structure, typically consisting of many thin metal lines with thousands of vias. This paper introduces a two-step approach as a solution to this problem. First, a simplified but fast simulation is performed to identify the device parts with the highest stress. Afterwards, precise simulations are carried out only for those parts. The applicability of this method is verified experimentally for LDMOS transistors with different metal configurations. The measured lifetimes and failure locations correlate well with the simulations. Moreover, a strong influence of the layout of the on-chip metallization on lifetime was observed. This could also be explained with the simulation method.
Comments on “Solubility parameter of chitin and chitosan”, Carbohydrate Polymers 36 (1998) 121–127
(2017)
Results on the solubility parameters of chitin and chitosan presented in the paper DOI: 10.1016/S0144-8617(98)00020-4 were recalculated and data evaluation was redone. A number of misprints, erroneous calculations and data evaluations were found with respect to Hansen as well as total solubility parameters as derived according to group contribution methods by Hoftyzer-Van Krevelen and Hoy’s system. Revised numerical data are presented.
Wege der Gewinnermittlung
(2017)
If a company makes a profit, this does not necessarily mean that everything is settled. The decisive question is how the profit was determined, because only with the right method does one obtain the appropriate perspective: on the success of an individual deal, on the profit of a period, on the operating assets, on liquidity, or on the balance sheet.
EBIT & Co.
(2017)
A whole range of key figures is used in business administration to determine and manage corporate profit. However, not all of them are suitable for the same purpose. Depending on the question at hand, different key figures should be used. Moreover, their interpretation must also be industry-specific.
A realistic risk assessment is the basis of responsible corporate decisions. But how can risks be assessed correctly? Various risk-management instruments make it possible to identify, quantify, evaluate, and document risks systematically.
Risks are not inherently bad, provided the return achieved is adequate for the risk taken. This relationship is not always understood, which was one of the causes of the financial crisis of 2008/09. The key figures presented in this article show how risks can be set in relation to achieved or potential returns.
Whoever invests in a company does so in order to earn money in the future, and expects a risk-adequate return. The selection of the key figures that make this value creation transparent is, however, not trivial, because they determine whether corporate objectives are set correctly and whether the right incentives are created for management.
Revenue and profits stagnate at a high level, and yet the share price and earnings per share rise, a development that can be observed at Apple or eBay, for example. Shareholders should know what arithmetic lies behind such developments and which methods best determine the value of a company.
Propofol in exhaled breath can be measured and may provide a real-time estimate of plasma concentration. However, propofol is absorbed in plastic tubing, so estimates may fail to reflect lung/blood concentration if expired gas is not extracted directly from the endotracheal tube. We evaluated exhaled propofol in five ventilated ICU patients who were sedated with propofol. Exhaled propofol was measured once per minute using ion mobility spectrometry. Exhaled air was sampled directly from the endotracheal tube and at the ventilator end of the expiratory side of the anesthetic circuit. The circuit was then disconnected from the patient and propofol was washed out with a separate clean ventilator. Propofol molecules discharged from the expiratory portion of the breathing circuit were measured for up to 60 h. We also determined whether propofol passes through the plastic of breathing circuits. A total of 984 data pairs consisting of both concentrations were collected (presented as median values with 95% confidence intervals). The concentration of propofol sampled near the patient was always substantially higher, at 10.4 [10.25–10.55] versus 5.73 [5.66–5.88] ppb (p<0.001). The reduction in concentration over the breathing circuit tubing was 4.58 [4.48–4.68] ppb: 3.46 [3.21–3.73] in the first hour, 4.05 [3.77–4.34] in the second hour, and 4.01 [3.36–4.40] in the third hour. Out-gassing of propofol from the breathing circuit remained at 2.8 ppb after 60 h of washing out. Diffusion through the plastic was not observed. Volatile propofol binds or adsorbs to the plastic of a breathing circuit with saturation kinetics. The bond is reversible, so propofol can be washed out of the plastic. Our data confirm earlier findings that accurate measurements of volatile propofol require exhaled air to be sampled as close as possible to the patient.
The modern industrial corporation encompasses a myriad of different software applications, each of which must work in concert to deliver functionality to end-users. However, the increasingly complex and dynamic nature of competition in today’s product-markets dictates that this software portfolio be continually evolved and adapted, in order to meet new business challenges. This ability – to rapidly update, improve, remove, replace, and reimagine the software applications that underpin a firm’s competitive position – is at the heart of what has been called IT agility. Unfortunately, little work has examined the antecedents of IT agility, with respect to the choices a firm makes when designing its “Software Portfolio Architecture.”
We address this gap in the literature by exploring the relationship between software portfolio architecture and IT agility at the level of the individual applications in the architecture. In particular, we draw from modular systems theory to develop a series of hypotheses about how different types of coupling impact the ability to update, remove or replace the software applications in a firm’s portfolio. We test our hypotheses using longitudinal data from a large financial services firm, comprising over 1,000 applications and over 3,000 dependencies between them. Our methods allow us to disentangle the effects of different types and levels of coupling.
Our analysis reveals that applications with higher levels of coupling cost more to update, are harder to remove, and are harder to replace, than those with lower coupling. The measures of coupling that best explain differences in IT agility include all indirect dependencies between software applications (i.e., they include coupling and dependency relationships that are not easily visible to the system architect). Our results reveal the critical importance of software portfolio design decisions, in developing a portfolio of applications that can evolve and adapt over time.
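The distinction between direct and indirect coupling can be made concrete with a small sketch (the graph, names, and data below are our own illustration, not taken from the study's dataset): indirect coupling counts every application reachable through chains of dependencies, not just immediate neighbours.

```python
def reachable(graph, app):
    """All applications reachable from `app` via dependency edges,
    i.e. direct plus indirect coupling."""
    seen, stack = set(), [app]
    while stack:
        node = stack.pop()
        for dep in graph.get(node, []):
            if dep not in seen:
                seen.add(dep)
                stack.append(dep)
    return seen

# Hypothetical portfolio: A depends on B, B depends on C. A is therefore
# indirectly coupled to C even though no direct dependency exists.
portfolio = {"A": ["B"], "B": ["C"], "C": []}
print(len(portfolio["A"]))             # direct coupling of A
print(len(reachable(portfolio, "A")))  # direct + indirect coupling of A
```

Measures built on this transitive closure capture the "not easily visible" dependencies the study found to best explain differences in IT agility.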
Given the broad range of new technologies on offer and the multitude of differently used terms around Industrie 4.0, companies often face the challenge of deriving individual implementation strategies without any orientation. The maturity model presented here makes it possible to capture the lean-management principles already implemented in the production system and gives practicable answers to the evolutionary visions by showing companies realizable, individual migration paths towards Industrie 4.0.
B-to-B sales has changed considerably through the latest information technologies and has become substantially more complex. At the same time, the opportunities have also improved dramatically. Internationally operating, virtually collaborating teams have shifted decision processes and competencies. Social selling now offers a way to better master the newly emerged challenges in sales.
Despite a growing consensus to implement intuition as a profitable complement to the rationally dominated decision culture in sales, uncertainty and ignorance about the right way to handle intuition still seem to prevail among employees. Systematically legitimizing intuition in sales can counteract this.
Condition monitoring for mechanical systems like bearings or transmissions is often done by analysing frequency spectra obtained from accelerometers mounted to the components under observation. Although this approach yields a high amount of information about the system behaviour, the interpretation of the resulting spectra requires expert knowledge, that is, a deep understanding of the effect of condition deterioration on the measured spectra. However, an increasing number of condition monitoring applications demands other representations of the measured signals that can be easily interpreted even by non-experts. Therefore, the objective of this paper is to develop an approach for processing measured process data in order to obtain an easy-to-interpret measure for assessing the component condition. The main idea is to evaluate the deterioration of a component condition by computing the correlation function of current measurements with past measurements, so that a condition deterioration is detected from a change in these correlation functions. Besides the simplicity of the obtained measure, this approach opens the opportunity for integrating a model-based approach as well. The developed method is tested based on a condition monitoring application in a roller chain.
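The core idea, correlating a current spectrum with a past (healthy) one, can be sketched as follows. This is a minimal illustration with synthetic spectra, not the paper's actual signal-processing pipeline; all names and data are hypothetical.

```python
import numpy as np

def condition_index(reference_spectrum, current_spectrum):
    """Pearson correlation between a reference (healthy) spectrum and the
    current spectrum. Values near 1.0 suggest an unchanged condition;
    lower values indicate a change, e.g. deterioration."""
    return float(np.corrcoef(reference_spectrum, current_spectrum)[0, 1])

# Synthetic example: a healthy spectrum with one dominant peak, and a
# "worn" spectrum where an additional fault-related peak has appeared.
freqs = np.linspace(0.0, 1.0, 256)
healthy = np.exp(-((freqs - 0.3) ** 2) / 0.002)
worn = healthy + 0.8 * np.exp(-((freqs - 0.7) ** 2) / 0.002)

print(condition_index(healthy, healthy))  # close to 1.0
print(condition_index(healthy, worn))     # noticeably lower
```

A non-expert only needs to watch a single number drift away from 1.0, rather than interpret the full spectrum.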
In light of the increasing digitalization of companies, the sales process might experience changes in the usage and the influence of digital tools. In order to examine the status quo of German companies in this regard, a study was conducted among 235 participants. The results of this study are outlined in the article at hand.
With the Internet of Things being one of the most discussed trends in the computer world lately, many organizations find themselves struggling with the great paradigm shift and thus the implementation of IoT on a strategic level. The Ignite methodology, part of the Enterprise-IoT project, promises to support organizations with these strategic issues, as it combines best practices with expert knowledge from diverse industries, helping to create a better understanding of how to transform into an IoT-driven business. A framework that is introduced within the context of IoT business model development is the Bosch IoT Business Model Builder. In this study the provided framework is compared to the Osterwalder Business Model Canvas and the St. Gallen Business Model Navigator, the most commonly used and referenced frameworks according to a quantitative literature analysis.
In an exploratory study of the online communication of large and medium-sized B2B companies from the German state of Baden-Württemberg, the message content communicated via their websites and the websites' appeal for international prospects were analyzed. It revealed that many basic content items were absent, making the sites less attractive for further exploration and making it difficult for international prospects to enter into a dialog, become leads, and possibly customers. The subsequent survey elicited organizational backgrounds, available resources, and objectives for online communication. It traced the deficiencies back to a lack of understanding of the importance of digital communication for lead generation and the customer journey in general, the absence of a communication strategy, a lack of urgency, and a lack of resources to implement desired changes and additions to communication content.
Digitization will require companies to fundamentally reengineer their sales processes. Adapting the concept of value selling to the digital age will enable them to deliver superior value to their customers. Specifically, social selling will provide them with an answer to the ever-increasing complexity of customer journeys. This article, based on a survey among 235 German companies, assesses the status quo and outlines opportunities. Moreover, it introduces a novel approach for developing well-grounded social selling metrics.
For managers, contradictions and paradoxical tensions are an everyday experience. Nevertheless, many managers regard contradictions as something that "actually" should not exist. They often block out the contradictory signals and ignore the paradox. Or they perceive the tensions as disturbing and burdensome and try to resolve the paradox. Management and organization research shows, however, that paradoxes are omnipresent in organizations, and that they cannot be permanently resolved. What consequences does this have for our understanding of management?
Best-practice models and change-management wisdoms enjoy great popularity, which can probably be explained by the fact that they reduce complexity and uncertainty for those responsible. However, organizations are full of contradictions, often react irrationally, and do not necessarily follow the carefully planned designs of change management. Some questions in organizations are unsolvable, and in their search for solutions organizations oscillate between opposing poles. As idealized conceptions, best practices often cannot fulfil the expectations placed in them. Where they reach their limits, it seems advisable to engage with the fundamental forces of change such as paradox, ambiguity, complexity, and non-controllability.
Purpose: Medical processes can be modeled using different methods and notations. Currently used modeling systems like Business Process Model and Notation (BPMN) are not capable of describing the highly flexible and variable medical processes in sufficient detail.
Methods: We combined two modeling approaches, Business Process Management (BPM) and Adaptive Case Management (ACM), to be able to model non-deterministic medical processes. We used the new standards Case Management Model and Notation (CMMN) and Decision Model and Notation (DMN).
Results: First, we explain how CMMN, DMN and BPMN could be used to model non-deterministic medical processes. We applied this methodology to model 79 cataract operations provided by University Hospital Leipzig, Germany, and four cataract operations provided by University Eye Hospital Tuebingen, Germany. Our model consists of 85 tasks and about 20 decisions in BPMN. We were able to expand the system with more complex situations that might appear during an intervention.
Conclusion: Effective modeling of the cataract intervention is possible using the combination of BPM and ACM. The combination makes it possible to depict complex processes with complex decisions and offers a significant advantage for modeling perioperative processes.
Most companies manage their operational marketing without quantifying the effectiveness and efficiency of the instruments they use, a critical finding given the size of typical marketing budgets. How can this be remedied? A look at one of the key concepts of Industrie 4.0 shows a possible way.
Data Integration of heterogeneous data sources relies either on periodically transferring large amounts of data to a physical Data Warehouse or retrieving data from the sources on request only. The latter results in the creation of what is referred to as a virtual Data Warehouse, which is preferable when the use of the latest data is paramount. However, the downside is that it adds network traffic and suffers from performance degradation when the amount of data is high. In this paper, we propose the use of a readCheck validator to ensure the timeliness of the queried data and reduced data traffic. It is further shown that the readCheck allows transactions to update data in the data sources obeying full Atomicity, Consistency, Isolation, and Durability (ACID) properties.
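The readCheck idea can be sketched minimally as follows, assuming each data source exposes a cheap version counter (all class and method names here are hypothetical, not the paper's implementation): before reusing a cached result, the virtual warehouse issues a lightweight readCheck comparing the cached version with the source's current version, and only on a mismatch is the data re-fetched in full.

```python
class DataSource:
    """Toy data source whose version counter increments on every update."""
    def __init__(self, rows):
        self.rows = list(rows)
        self.version = 0

    def update(self, rows):
        self.rows = list(rows)
        self.version += 1


class VirtualWarehouse:
    """Caches per-source query results; a readCheck validates freshness
    before reuse, so unchanged data causes no bulk transfer."""
    def __init__(self):
        self.cache = {}  # source id -> (version, rows)

    def query(self, source):
        cached = self.cache.get(id(source))
        if cached is not None and cached[0] == source.version:
            return cached[1]          # readCheck passed: reuse cached rows
        rows = list(source.rows)      # readCheck failed: full re-fetch
        self.cache[id(source)] = (source.version, rows)
        return rows
```

In this sketch only the version comparison travels over the network on repeated queries; the timeliness guarantee comes from re-fetching whenever the check fails.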
At major sporting events such as football World Cups and European Championships or the Olympic Games, millions are at stake for associations and official sponsors, and they defend their advertising rights accordingly fiercely. Burger King shows how this "monopoly" can be circumvented creatively. The following article presents two examples of Burger King's ambush-marketing activities during the 2016 UEFA European Championship. Non-sponsor Burger King used ambush marketing deliberately and creatively during the tournament in order to score points against the official UEFA sponsor and competitor McDonald's.
The digital transformation of our society changes the way we live, work, learn, communicate, and collaborate. This disruptive change drives current and future information processes and systems, which have been important business enablers in the context of digitization for years. Our aim is to support flexibility and agile transformations for both business domains and the related information technology with more flexible enterprise information systems through adaptation and evolution of digital architectures. The present research paper investigates the continuous bottom-up integration of micro-granular architectures for a huge number of dynamically growing systems and services, like microservices and the Internet of Things, as part of a newly composed digital architecture. To integrate micro-granular architecture models into living architectural model versions, we extend enterprise architecture reference models with state-of-the-art elements for agile architectural engineering to support digital products, services, and processes.
Steadily growing research material in a variety of databases, repositories and clouds makes academic content harder than ever to discover. Finding adequate material for one's own research, however, is essential for every researcher. Based on recent developments in the field of artificial intelligence and the identified digital capabilities of future universities, a change in the basic work of academic research is predicted. This study outlines how artificial intelligence could simplify academic research at a digital university. Today's studies in the field of AI showcase its true potential and its considerable impact on academic research.
In recent times, enterprises have been increasingly dealing with the use of social media in internal communication and collaboration. In particular, so-called Enterprise Social Networks (ESN) promise meaningful benefits for the nature of work in corporations. However, these platforms often suffer from poor degrees of use. This raises the question of what initiatives enterprises can launch in order to stimulate the vitality of ESN. Since the use of ESN is often voluntary, individual adoption by employees needs to be examined to find an answer. Therefore, the Unified Theory of Acceptance and Use of Technology (UTAUT) model was selected as the theoretical foundation of this paper. Following a qualitative research approach, the available research provides an analysis of expert interviews on specific ESN implementation strategies and included factors. In order to extensively conceptualize and generalize these strategic considerations, we conducted an inductive coding process. The results reveal that ESN implementation strategies can be understood as a multi-level construct (individual vs. group vs. organizational level) containing different factors depending on the degree of documentation and intensity. This research in progress describes a qualitative evaluation as a preliminary study for further quantitative analysis of an ESN adoption model.
Smart meter based business models for the electricity sector: a systematical literature research
(2017)
The Act on the Digitization of the Energy Transition forces German industries and households to introduce smart meters in order to save energy, to gain individually based electricity tariffs, and to digitize the energy data flow. Smart meters can be regarded as the advancement of the traditional meter. Utilizing this new technology enables a wide range of innovative business models that provide additional value for electricity suppliers as well as for their customers. In this study, we followed a two-step approach. First, we provide a state-of-the-art comparison of the business models found in the literature and identify structural differences in the way they add value to the offered products and services. Second, the business models are grouped into categories with respect to customer segments and the added value to the smart grid. Findings indicate that most business models focus on the end-customer as their main customer.
Digitization in the energy sector is a necessity to enable energy savings and energy efficiency potentials. Managing decentralized corporate energy systems is hindered by a non-existence. The required integration of energy objectives into business strategy creates difficulties resulting in inefficient decisions. To improve this, practice-proven methods such as Balanced Scorecard, Enterprise Architecture Management and the Value Network approach are transferred to the energy domain. The methods are evaluated based on a case study. Managing multi-dimensionality, high complexity and multiple actors are the main drivers for an effective and efficient energy management system. The underlying basis to gain the positive impacts of these methods on decentralized corporate energy systems is digitization of energy data and processes.
The increasing number of connected mobile devices such as fitness trackers and smartphones define new data for health insurances, enabling them to gain deeper insights into the health of their customers. These additional data sources plus the trend towards an interconnected health community, including doctors, hospitals and insurers, lead to challenges regarding data filtering, organization and dissemination. First, we analyze what kind of information is relevant for a digital health insurance. Second, functional and non-functional requirements for storing and managing health data in an interconnected environment are defined. Third, we propose a data architecture for a digitized health insurance, consisting of a data model and an application architecture.
Pokémon Go was the first mobile Augmented Reality (AR) game that made it to the top of the download charts of mobile applications. However, very little is known about this new generation of mobile online AR games. Existing media usage and technology acceptance theories provide limited applicability to the understanding of its users. Against this background, this research provides a comprehensive framework that incorporates findings from uses & gratifications theory (U&G), technology acceptance and risk research as well as flow theory. The proposed framework aims at explaining the drivers of attitudinal and intentional reactions, such as continuance in gaming or willingness to conduct in-app purchases. A survey among 642 Pokémon Go players provides insights into the psychological drivers of mobile AR games. Results show that hedonic, emotional and social benefits as well as social norms drive consumer reactions, whereas physical risks (but not privacy risks) hinder them. However, the importance of these drivers differs between different forms of user behavior.
This paper investigates the impact of dynamic capabilities (DC) on brand love. From a resource-based view, there is little clarity vis-à-vis the specific capabilities that drive the ability to create brand love. This paper focuses on three research questions: Firstly, which dynamic capabilities are relevant for brand love? Secondly, how strong is the impact of certain dynamic capabilities on brand love? Thirdly, which conditions mediate and moderate the impact of specific dynamic capabilities on brand love? Data from a multi-method research approach have been used to identify the specific capabilities that corporations need to enhance brand love. Furthermore, a standardized online survey was conducted among marketing executives and evaluated by structural equation modeling. The results indicate that customer expertise plays a major role in the relationship between dynamic capabilities and brand love. Furthermore, this relationship is more important in markets with low competitive differentiation in products and services.
Electronic word-of-mouth (eWoM) communication plays an increasingly important role in modern business. The underlying concept of word-of-mouth (WoM) communication is well researched and has proved highly significant with respect to its impact on customers' purchase behavior. However, due to the advent of digital technologies, decision-making among customers is progressively shifting to the online world. Consequently, eWoM has received a lot of attention from the academic community. As multiple research papers focus on specific facets of eWoM, there is a need to integrate current research results systematically. Thus, this paper presents a scientific literature analysis in order to determine the current state of the art in the field of eWoM. Five main research areas were analyzed, supporting the need for further eWoM studies and providing a structured overview of existing results.
Clinical reading centers provide expertise for consistent, centralized analysis of medical data gathered in a distributed context. Accordingly, appropriate software solutions are required for the involved communication and data management processes. In this work, an analysis of general requirements and essential architectural and software design considerations for reading center information systems is provided. The identified patterns have been applied to the implementation of the reading center platform which is currently operated at the Center of Ophthalmology of the University Hospital of Tübingen.
Context: The current situation and future scenarios of the automotive domain require a new strategy to develop high-quality software at a fast pace. In the automotive domain, it is assumed that a combination of agile development practices and software product lines is beneficial in order to be capable of handling a high frequency of improvements. This assumption is based on the understanding that agile methods introduce more flexibility in short development intervals. Software product lines help to manage the large number of variants and to improve quality through the reuse of software in long-term development.
Goal: This study derives a better understanding of the expected benefits of such a combination. Furthermore, it identifies the automotive-specific challenges that prevent the adoption of agile methods within a software product line.
Method: A survey based on 16 semi-structured interviews from the automotive domain, an internal workshop with 40 participants and a discussion round at the ESE Congress 2016. The results are analyzed by means of thematic coding.
Software startups often make assumptions about the problems and customers they are addressing as well as the market and the solutions they are developing. Testing the right assumptions early is a means to mitigate risks. Approaches such as Lean Startup foster this kind of testing by applying experimentation as part of a constant build-measure-learn feedback loop. The existing research on how software startups approach experimentation is very limited. In this study, we focus on understanding how software startups approach experimentation and identify challenges and advantages with respect to conducting experiments. To achieve this, we conducted a qualitative interview study. The initial results show that startups often spend a disproportionate amount of time creating solutions without testing critical assumptions. The main reasons are a lack of awareness that these assumptions can be tested early, and a lack of knowledge and support on how to identify, prioritize and test them. However, startups understand the need for testing risky assumptions and are open to conducting experiments.
Software and system development is complex and diverse, and a multitude of development approaches is used and combined with each other to address the manifold challenges companies face today. To study the current state of the practice and to build a sound understanding of the utility of different development approaches and their application to modern software system development, in 2016, we launched the HELENA initiative. This paper introduces the 2nd HELENA workshop and provides an overview of the current project state. In the workshop, six teams present initial findings from their regions, impulse talks are given, and further steps of the HELENA roadmap are discussed.
The best fully automated analysis process achieves even better classification results than the established manual process. The best algorithms for the three analysis steps are (i) SGLTR (Savitzky-Golay Laplace operator filter thresholding regions) and LM (Local Maxima) for automated peak identification, (ii) EM clustering (Expectation Maximization) and DBSCAN (Density-Based Spatial Clustering of Applications with Noise) for the clustering step and (iii) RF (Random Forest) for multivariate classification. Thus, automated methods can replace the manual steps in the analysis process to enable an unbiased high throughput use of the technology.
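As an illustration only (not the authors' code or data), the three-step pipeline can be sketched with off-the-shelf implementations: local-maxima peak identification via `scipy.signal.find_peaks`, DBSCAN clustering of the detected peak positions, and a Random Forest for the multivariate classification. All signals, thresholds and parameters below are invented:

```python
# Hypothetical sketch of the three analysis steps: (i) local-maxima peak
# identification, (ii) density-based clustering of peak positions,
# (iii) Random Forest classification. Data are synthetic.
import numpy as np
from scipy.signal import find_peaks
from sklearn.cluster import DBSCAN
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def peak_positions(spectrum, height=0.5):
    """Step (i): identify peaks as local maxima above a threshold."""
    idx, _ = find_peaks(spectrum, height=height)
    return idx

def make_spectrum(centers, n=200):
    """Synthetic spectrum: Gaussian peaks at given positions plus noise."""
    x = np.arange(n)
    s = sum(np.exp(-0.5 * ((x - c) / 2.0) ** 2) for c in centers)
    return s + 0.05 * rng.standard_normal(n)

# Two synthetic classes with peaks at different positions.
spectra = [make_spectrum([50, 120]) for _ in range(20)] + \
          [make_spectrum([70, 150]) for _ in range(20)]
labels = [0] * 20 + [1] * 20

# Step (ii): cluster all detected peak positions across spectra.
all_peaks = np.concatenate([peak_positions(s) for s in spectra]).reshape(-1, 1)
clustering = DBSCAN(eps=5, min_samples=3).fit(all_peaks)
centers = sorted(float(np.mean(all_peaks[clustering.labels_ == k]))
                 for k in set(clustering.labels_) if k != -1)

# Feature matrix: intensity at each cluster center, per spectrum.
X = np.array([[s[int(c)] for c in centers] for s in spectra])

# Step (iii): multivariate classification with a Random Forest.
clf = RandomForestClassifier(random_state=0).fit(X, labels)
print(clf.score(X, labels))
```

The sketch shows only the data flow between the three steps; the paper's SGLTR filtering and EM clustering variants would slot in as drop-in replacements for steps (i) and (ii).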
The Ninth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2017), held between May 21 - 25, 2017 in Barcelona, Spain, continued a series of international events covering a large spectrum of topics related to advances in database fundamentals, the evolution of the relation between databases and other domains, database technologies and content processing, as well as specifics of application domains of databases.
Advances in different technologies and domains related to databases triggered substantial improvements for content processing, information indexing, and data, process and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the generalization of the XML adoption.
High-speed communications and computations, large storage capacities, and load-balancing for distributed databases access allow new approaches for content processing with incomplete patterns, advanced ranking algorithms and advanced indexing methods.
Evolution in e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems puts pressure on database communities to push the 'de facto' methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
This paper studies whether a monetary union can be managed solely by a rule based approach. The Five Presidents’ Report of the European Union rejects this idea. It suggests a centralisation of powers. We analyse the philosophy of policy rules from the vantage point of the German economic school of thought. There is evidence that a monetary union consisting of sovereign states is well organised by rules, together with the principle of subsidiarity. The root cause of the euro crisis is rather the weak enforcement of rules, compounded by structural problems. Therefore, we suggest a genuine rule-based paradigm for a stable future of the Economic and Monetary Union.
The persistently high levels of debt in some member states of the European Economic and Monetary Union continue to raise fears of sovereign insolvencies. To cope with the problems that have arisen, but also to prevent such a situation from occurring in the first place, the author considers a sovereign insolvency regime necessary, with bail-outs by the other member states only in emergencies. He proposes a resolution mechanism for over-indebted euro countries based on a 2016 concept of the German Council of Economic Experts (Sachverständigenrat).
Strategic alliances have become important strategic options for firms to achieve competitive advantage. Yet, there are many examples of alliance failures. Scholars have studied this phenomenon and identified many reasons for alliance failure, including lack of trust between the partnering firms. Paradoxically, the concept of trust is still not fully understood, specifically how and under what conditions trust comes to break down within the broader process of alliance building. We synthesize a process model that describes the "alliance capability", including trust, openness, partner contributions, and relational rents. We then translate this framework into a formal simulation model and analyze it thoroughly. In analyzing trust dynamics we identify and explore a tipping boundary separating a regime of alliance failures from one of successes. We apply our core findings to openness strategies - decisions about how much knowledge to share with partners. Our analyses reveal that strategies informed by a static mental model of trust, contributions, and openness undervalue openness. Further, too little openness risks early failure due to being trapped in a vicious cycle of trust depletion.
Real estate markets are known to fluctuate. The real estate market in Stuttgart, Germany, has been booming for more than a decade: square-meter prices have hit record levels and real estate agents claim that market prices will continue to increase. In this paper, we test this market understanding by developing and analyzing a system dynamics model that depicts the Stuttgart real estate market. Simulating the model explains oscillating behavior arising from significant time delays and endogenous feedback structures - and not necessarily from oscillating interest rates, as market experts assume. Scenarios provide insights into the system's behavior in reaction to changes exogenous to the model. The first scenario tests the market development under increasing interest rates. The other scenario deals with possible effects on the real estate market if the regional automotive economy suffers from intense competition with new market players entering with alternative fuel vehicles and new technologies. With a policy run we test changes to the market structure aimed at eliminating cyclical effects. The paper confirms that the business cycle in the Stuttgart real estate market arises from within the system's underlying structure, thus emphasizing the importance of understanding feedback structures.
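The oscillation-from-delay mechanism can be illustrated with a deliberately minimal model - a textbook cobweb sketch, not the authors' system dynamics model, with invented parameters. Construction completed today was priced on yesterday's market, and this supply delay alone makes the price overshoot and oscillate around its equilibrium without any exogenous interest-rate cycle:

```python
# Minimal cobweb sketch (hypothetical parameters, not the paper's model):
# supply reacts to the previous period's price, so it always lags demand,
# and the delay alone generates a damped price cycle.

def cobweb_prices(periods=20, supply_slope=0.8, p0=1.5):
    """Demand D(p) = 3 - p clears against delayed supply S = 1 + b*p_prev."""
    prices = [p0]
    for _ in range(periods):
        supply = 1.0 + supply_slope * prices[-1]  # responds to last price
        prices.append(3.0 - supply)               # market-clearing price
    return prices

prices = cobweb_prices()

# Every step overshoots the equilibrium p* = 2/1.8, so the direction of
# the price change flips each period: a sustained, damped oscillation.
turns = sum(1 for a, b, c in zip(prices, prices[1:], prices[2:])
            if (b - a) * (c - b) < 0)
print(turns)  # 19 turning points in 21 simulated prices
```

With `supply_slope` below the demand slope the cycle is damped, as here; above it, the oscillation would diverge - the qualitative point being that the cycle is endogenous to the delay structure.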
In vitro composed vascularized adipose tissue is and will continue to be in great demand, e.g. for the treatment of extensive high-grade burns or the replacement of tissue after tumor removal. To date, the lack of adequate culture conditions, mainly of a suitable culture medium, has decelerated further achievements. In our study, we evaluated the influence of epidermal growth factor (EGF) and hydrocortisone (HC), often supplemented in endothelial cell (EC) specific media, on the co-culture of adipogenic differentiated adipose derived stem cells (ASCs) and microvascular endothelial cells (mvECs). In ASCs, EGF and HC are thought to inhibit adipogenic differentiation and to have lipolytic activity. Our results showed that in indirect co-culture for 14 days, adipogenic differentiated ASCs further incorporated lipids and partly gained a univacuolar morphology when kept in media with low levels of EGF and HC. In media with high EGF and HC levels, cells did not incorporate further lipids; on the contrary, cells without lipid droplets appeared. Glycerol release, measured to assess lipolysis, also increased with elevated amounts of EGF and HC in the culture medium. Adipogenic differentiated ASCs were able to release leptin in all setups. MvECs were functional and expressed the cell-specific markers CD31 and von Willebrand factor (vWF) independent of the EGF and HC content, as long as further EC-specific factors were present. Taken together, our study demonstrates that adipogenic differentiated ASCs can be successfully co-cultured with mvECs in a culture medium containing low or no amounts of EGF and HC, as long as further endothelial cell and adipocyte specific factors are available.
Software engineering education is under constant pressure to provide students with industry-relevant knowledge and skills. Educators must address issues beyond exercises and theories that can be directly rehearsed in small settings. Industry training has similar requirements of relevance as companies seek to keep their workforce up to date with technological advances. Real-life software development often deals with large, software-intensive systems and is influenced by the complex effects of teamwork and distributed software development, which are hard to demonstrate in an educational environment. A way to experience such effects and to increase the relevance of software engineering education is to apply empirical studies in teaching. In this paper, we show how different types of empirical studies can be used for educational purposes in software engineering. We give examples illustrating how to utilize empirical studies, discuss challenges, and derive an initial guideline that supports teachers in including empirical studies in software engineering courses. Furthermore, we give examples that show how empirical studies contribute to high-quality learning outcomes, to student motivation, and to the awareness of the advantages of applying software engineering principles. Having awareness, experience, and understanding of the actions required, students are more likely to apply such principles under real-life constraints in their working life.
Context: Development of software intensive products and services increasingly occurs by continuously deploying product or service increments, such as new features and enhancements, to customers. Product and service developers must continuously find out what customers want by direct customer feedback and usage behaviour observation. Objective: This paper examines the preconditions for setting up an experimentation system for continuous customer experiments. It describes the RIGHT model for Continuous Experimentation (Rapid Iterative value creation Gained through High-frequency Testing), illustrating the building blocks required for such a system. Method: An initial model for continuous experimentation is analytically derived from prior work. The model is matched against empirical case study findings from two startup companies and further developed. Results: Building blocks for a continuous experimentation system and infrastructure are presented. Conclusions: A suitable experimentation system requires at least the ability to release minimum viable products or features with suitable instrumentation, design and manage experiment plans, link experiment results with a product roadmap, and manage a flexible business strategy. The main challenges are proper, rapid design of experiments, advanced instrumentation of software to collect, analyse, and store relevant data, and the integration of experiment results in both the product development cycle and the software development process.
This paper presents an approach for label-free brain tumor tissue typing. For this application, our dual modality microspectroscopy system combines inelastic Raman scattering spectroscopy and Mie elastic light scattering spectroscopy. The system enables marker-free biomedical diagnostics and records both the chemical and morphologic changes of tissues on a cellular and subcellular level. The system setup is described and the suitability for measuring morphologic features is investigated.
New digital technologies present both game-changing opportunities for—and existential threats to—companies whose success was built in the pre-digital economy. This article describes our findings from a study of 25 companies that were embarking on digital transformation journeys. We identified two digital strategies—customer engagement and digitized solutions—that provide direction for a digital transformation. Two technology-enabled assets are essential for executing those strategies: an operational backbone and a digital services platform. We describe how a big old company can combine these elements to navigate its digital transformation.
A 3D face modelling approach for pose-invariant face recognition in a human-robot environment
(2017)
Face analysis techniques have become a crucial component of human-machine interaction in the fields of assistive and humanoid robotics. However, the variations in head pose that arise naturally in these environments are still a great challenge. In this paper, we present a real-time capable 3D face modelling framework for 2D in-the-wild images that is applicable for robotics. The fitting of the 3D Morphable Model is based exclusively on automatically detected landmarks. After fitting, the face can be corrected in pose and transformed back to a frontal 2D representation that is more suitable for face recognition. We conduct face recognition experiments with non-frontal images from the MUCT database and uncontrolled in-the-wild images from the PaSC database, the most challenging face recognition database to date, showing an improved performance. Finally, we present our SCITOS G5 robot system, which incorporates our framework as a means of image pre-processing for face analysis.
This paper reports on an analysis of the application and impact of FMEA on the susceptibility of generic IT networks. It is not new that, in communication systems, frequency and data transmission rate play a very important role. The rapid miniaturization of electronic devices leads to high sensitivity to electromagnetic interference. Since IT networks, with their data transfer rates, contribute substantially to this development, it is very important to monitor their functionality. Therefore, tests are performed to observe and ensure the data transfer rate of IT networks under IEMI. A fault tree model is presented, and the effects observed during irradiation of a complex system by HPEM interference sources are described using a continuous and consistent model from the physical layer to the application layer.
Together with partners from industry and politics, the ESB Business School of Reutlingen University, Offenburg University and the University of Applied Sciences and Arts Northwestern Switzerland (FHNW) are investigating, within an Interreg project, ways to reduce climate- and health-damaging emissions in cross-border traffic on the Hochrhein. To this end, electric mobility and carpooling are promoted within a pilot project and their effects are analyzed. Initial results show that today's electric cars are, under certain conditions, suitable for cross-border commuting.
In the present paper we demonstrate a novel approach to handling small updates on Flash called In-Place Appends (IPA). It allows the DBMS to revisit the traditional write behavior on Flash. Instead of writing whole database pages upon an update in an out-of-place manner on Flash, we transform those small updates into update deltas and append them to a reserved area on the very same physical Flash page. In doing so we utilize the commonly ignored fact that under certain conditions Flash memories can support in-place updates to Flash pages without a preceding erase operation.
The approach was implemented under Shore-MT and evaluated on real hardware. Under standard update-intensive workloads we observed 67% fewer page invalidations, resulting in 80% lower garbage collection overhead, which yields a 45% increase in transactional throughput while doubling Flash longevity at the same time. IPA outperforms In-Page Logging (IPL) by more than 50%.
We showcase a Shore-MT based prototype of the above approach, operating on real Flash hardware - the OpenSSD Flash research platform. During the demonstration we allow the users to interact with the system and gain hands-on experience of its performance under different demonstration scenarios. These involve various workloads such as TPC-B, TPC-C or TATP.
In the present paper we demonstrate the novel technique to apply the recently proposed approach of In-Place Appends – overwrites on Flash without a prior erase operation. IPA can be applied selectively: only to DB-objects that have frequent and relatively small updates. To do so we couple IPA to the concept of NoFTL regions, allowing the DBA to place update-intensive DB-objects into special IPA-enabled regions. The decision about region configuration can be (semi-)automated by an advisor analyzing DB-log files in the background.
We showcase a Shore-MT based prototype of the above approach, operating on real Flash hardware. During the demonstration we allow the users to interact with the system and gain hands-on experience under different demonstration scenarios.
Under update-intensive workloads (TPC, LinkBench), small updates dominate the write behavior; e.g., 70% of all updates change less than 10 bytes across all TPC OLTP workloads. These are typically performed as in-place updates and result in random writes at page granularity, causing major write overhead on Flash storage, write amplification of several hundred times, and lower device longevity.
In this paper we propose an approach that transforms those small in-place updates into small update deltas that are appended to the original page. We utilize the commonly ignored fact that modern Flash memories (SLC, MLC, 3D NAND) can handle appends to already programmed physical pages by using various low-level techniques such as ISPP to avoid expensive erases and page migrations. Furthermore, we extend the traditional NSM page-layout with a delta-record area that can absorb those small updates. We propose a scheme to control the write behavior as well as the space allocation and sizing of database pages.
The proposed approach has been implemented under Shore-MT and evaluated on real Flash hardware (OpenSSD) and a Flash emulator. Compared to In-Page Logging it performs up to 62% less reads and writes and up to 74% less erases on a range of workloads. The experimental evaluation indicates: (i) significant reduction of erase operations resulting in twice the longevity of Flash devices under update-intensive workloads; (ii) 15%-60% lower read/write I/O latencies; (iii) up to 45% higher transactional throughput; (iv) 2x to 3x reduction in overall write amplification.
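The core IPA mechanism described in the abstracts above can be sketched abstractly (a toy model, not the Shore-MT implementation; class and field names are invented): small updates become delta records appended to a reserved area of the same page via an erase-free in-place append, and only a full delta area forces the conventional out-of-place page rewrite:

```python
# Conceptual toy model of In-Place Appends (IPA), not the actual
# Shore-MT code: a page carries a reserved delta area; small updates are
# appended there without an erase, and the page is only rewritten
# out-of-place (the expensive case on Flash) when the area overflows.
class IPAPage:
    def __init__(self, data, delta_slots=4):
        self.data = dict(data)          # base record contents (NSM part)
        self.deltas = []                # append-only delta-record area
        self.delta_slots = delta_slots  # capacity of the delta area
        self.page_writes = 1            # initial out-of-place page write

    def update(self, key, value):
        if len(self.deltas) < self.delta_slots:
            self.deltas.append((key, value))  # in-place append, no erase
        else:
            # Delta area full: merge deltas and rewrite the whole page.
            self.data = self.read()
            self.data[key] = value
            self.deltas = []
            self.page_writes += 1

    def read(self):
        merged = dict(self.data)
        for key, value in self.deltas:  # replay deltas in append order
            merged[key] = value
        return merged

page = IPAPage({"balance": 100})
for i in range(8):                      # eight small updates
    page.update("balance", 100 + i)
print(page.read()["balance"], page.page_writes)  # 107 2
```

Eight small updates cost only two full page writes here instead of nine with plain out-of-place writing, which mirrors the reduction in page invalidations and write amplification reported above.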
The wet chemical deposition of solution-processed transparent conducting oxides (TCOs) provides an alternative, low-cost deposition technique to realize large-area conducting films. Since the price of the most common TCO, indium tin oxide, has risen enormously, aluminum zinc oxide (AZO) is attracting more and more interest as an alternative TCO. The optoelectronic properties of nanoparticle coatings depend, besides the porosity of the coating, strongly on the shape and size of the particles used. By using bigger or rod-shaped particles it is possible to minimize the number of grain boundaries, resulting in improved electrical properties, whereas particles bigger than 100 nm should not be used if highly transparent coatings are required, as such big particles scatter visible light and lower the transmittance of the coatings. In this work we present a simple method to synthesize AZO particles of different shapes and sizes but with comparable electronic properties. We use a simple, well-reproducible polyol method for the synthesis and influence the shape and size of the particles by adding different amounts of water to the precursor solution. We show that the addition of aluminum as a dopant strongly hinders crystal growth, but the addition of water counteracts this, so that both spherical and rod-shaped particles can be obtained.
To evaluate the quality of sleep, it is important to determine how much time was spent in each sleep stage during the night. The gold standard in this domain is overnight polysomnography (PSG). But the recording of the necessary electrophysiological signals is extensive and complex, and the environment of the sleep laboratory, which is unfamiliar to the patient, might distort the results. In this paper, a sleep stage detection algorithm is proposed that uses only the heart rate signal, derived from the electrocardiogram (ECG), as a discriminator. This would make it possible for sleep analysis to be performed at home, saving considerable effort and money. From the heart rate, three parameters were calculated using the fast Fourier transform (FFT) in order to distinguish between the different sleep stages. ECG data along with hypnograms scored by professionals were used from the PhysioNet database, making it easy to compare the results. With an agreement rate of 41.3%, this approach is a good foundation for future research.
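As a rough illustration of the FFT-based feature extraction step (the paper's exact three parameters are not specified here, so the frequency bands below are assumptions borrowed from standard heart-rate-variability analysis, and the signal is synthetic):

```python
# Hypothetical sketch: spectral features from an evenly resampled heart
# rate signal via FFT -- low-frequency (LF) power, high-frequency (HF)
# power, and their ratio -- of the kind that differ between sleep stages.
import numpy as np

def band_power(freqs, psd, lo, hi):
    """Total spectral power within a frequency band [lo, hi)."""
    mask = (freqs >= lo) & (freqs < hi)
    return float(np.sum(psd[mask]))

def hr_features(hr, fs=4.0):
    """Three spectral parameters from a heart rate signal sampled at fs."""
    hr = hr - np.mean(hr)                        # remove DC component
    psd = np.abs(np.fft.rfft(hr)) ** 2 / len(hr)
    freqs = np.fft.rfftfreq(len(hr), d=1.0 / fs)
    lf = band_power(freqs, psd, 0.04, 0.15)      # low-frequency power
    hf = band_power(freqs, psd, 0.15, 0.40)      # high-frequency power
    return lf, hf, lf / hf

# Synthetic 5-minute heart rate trace at 4 Hz with a dominant
# respiratory (HF) component, as might be seen in deep sleep.
fs = 4.0
t = np.arange(0, 300, 1.0 / fs)
hr = 60 + 1.0 * np.sin(2 * np.pi * 0.25 * t) + 0.3 * np.sin(2 * np.pi * 0.1 * t)
lf, hf, ratio = hr_features(hr, fs)
print(round(ratio, 2))  # LF/HF well below 1: HF dominates
```

Such per-epoch features would then feed a stage classifier scored against the professionally annotated hypnograms.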