The Virtual Power Plant Neckar-Alb is a demonstration platform for the operation, optimization and control of distributed energy resources, which are able to produce, store or consume electric energy. A heterogeneous set of distributed energy devices has been installed on the campus of Reutlingen University by the Reutlingen Energy Centre (REZ) of the School of Engineering. The distributed energy devices have been combined into local microgrids and connected to an operative central power plant with additional participants. The demonstration platform serves students, researchers and industry experts in education and in the investigation of new technologies, devices and software.
This study examines the phenomenon of Virtual Influencer (VI) marketing and its impact on customer purchase behavior. The aim is to understand the scope and impact of VI marketing. The study compares VI marketing to traditional Human Influencer (HI) marketing and identifies the unique benefits and challenges associated with VIs. A survey was conducted to gain insight into consumer attitudes and behaviors toward VIs. Key findings reveal varying levels of trust and acceptance of VIs among consumers. While some participants expressed openness to buying products promoted by VIs, others had reservations about their authenticity. The study also explores the potential role of VIs in the metaverse, highlighting business opportunities and challenges in this evolving digital landscape. Overall, this research sheds light on the growing influence of VIs and the need for further research in the field of marketing.
Today the optimization of metal forming processes is done using advanced simulation tools in a virtual process, e.g. FEM studies. The modification of the free parameters represents the different variants to be analysed. Experienced engineers may thus derive useful proposals in acceptable time if good initial proposals are available. As soon as the number of free parameters grows, or the total process takes a long time and uses several successive forming steps, it may be quite difficult to find promising initial ideas. In metal forming a further problem has to be considered: an optimization using a series of local improvements, often called a gradient approach, may find a local optimum, but this can be far away from a satisfactory solution. Therefore non-deterministic approaches, e.g. Bionic Optimization, have to be used. Approaches such as Evolutionary Optimization or Particle Swarm Optimization are capable of covering large, high-dimensional optimization spaces and of discovering many local optima, so the chance of including the global optimum increases when such non-deterministic methods are used. Unfortunately, these bionic methods require large numbers of studies of different variants of the process to be optimized. The number of studies tends to increase exponentially with the number of free parameters of the forming process. As the time for one single study may not be small either, the total time demand becomes unacceptable, taking weeks to months even if high-performance computing is used. Therefore the optimization process needs to be accelerated. Among the many ideas for reducing the time and computing-power requirements, Meta- and Hybrid Optimization seem to produce the most efficient results. Hybrid Optimization often consists of global searches for promising regions within the parameter space. As soon as the studies indicate that there could be a local optimum, a deterministic study tries to identify this local region. If it shows better performance than the other optima found so far, it is preserved for a more detailed analysis; if it performs worse, the region is excluded from further search. Meta-Optimization is often understood as the derivation of Response Surfaces as functions of the free parameters. Once enough studies have been performed, the optimization is done using the Response Surfaces as representatives, e.g. of the goal and the restrictions of the optimization problem. Having found regions where interesting solutions are to be expected, the studies available so far are used to define the Response Surfaces. In many cases low-degree polynomials are used, their coefficients being determined by least-squares methods. Both proposals, Hybrid Optimization and Meta-Optimization, sometimes used in combination, often reduce the total optimization process by large numbers of variants to be studied. In consequence they are highly recommended when dealing with time-consuming optimization studies.
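To make the Meta-Optimization step concrete, the following sketch fits a low-degree polynomial response surface by least squares to a small set of sampled studies and then optimizes the cheap surrogate instead of the expensive forming simulation. The toy objective function and all sampling choices are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from itertools import combinations_with_replacement

# Hypothetical stand-in for one expensive forming simulation (a single "study").
def expensive_study(x):
    return (x[0] - 1.2) ** 2 + 2.0 * (x[1] + 0.5) ** 2 + 0.1 * np.sin(5 * x[0])

def quadratic_features(X):
    # [1, x1, x2, x1^2, x1*x2, x2^2] -- a low-degree polynomial basis.
    cols = [np.ones(len(X))]
    cols += [X[:, i] for i in range(X.shape[1])]
    cols += [X[:, i] * X[:, j]
             for i, j in combinations_with_replacement(range(X.shape[1]), 2)]
    return np.column_stack(cols)

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(30, 2))           # 30 sampled parameter sets
y = np.array([expensive_study(x) for x in X])  # their simulated objective values

coeffs, *_ = np.linalg.lstsq(quadratic_features(X), y, rcond=None)  # least squares

# Optimize the cheap response surface instead of the expensive model.
grid = np.array(np.meshgrid(np.linspace(-2, 2, 201),
                            np.linspace(-2, 2, 201))).reshape(2, -1).T
surrogate = quadratic_features(grid) @ coeffs
best = grid[np.argmin(surrogate)]
print("surrogate optimum near", best, "true value there:", expensive_study(best))
```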
The purpose of this article is to provide insight into a new, simple forecasting method based on a state-estimation algorithm known as the Kalman filter. While the accuracy of such an algorithm is not comparable to state-of-the-art forecasting algorithms for PV power production, it does not require any internet connection, fisheye cameras or time-intensive training. The algorithm was tested with several months of real high-resolution data, with adequate results for the intended application: the minimization of the necessary spinning reserve in a PV-diesel hybrid system to increase the solar fraction and reduce diesel consumption.
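As a rough illustration of the approach (the abstract does not specify the authors' state model, so the random-walk formulation and all noise parameters below are assumptions), a scalar Kalman filter can track a noisy PV power signal and use the filtered state as a one-step-ahead forecast:

```python
import numpy as np

# Scalar Kalman filter with a random-walk state model:
#   x_k = x_{k-1} + w,  w ~ N(0, Q)   (true PV power drifts slowly)
#   z_k = x_k + v,      v ~ N(0, R)   (noisy measurement)
def kalman_forecast(z, Q=0.01, R=0.25):
    x, P = z[0], 1.0           # initial state estimate and its variance
    forecasts = []
    for zk in z:
        P = P + Q              # predict: random walk keeps x, inflates uncertainty
        forecasts.append(x)    # one-step-ahead forecast before seeing zk
        K = P / (P + R)        # Kalman gain
        x = x + K * (zk - x)   # update with the new measurement
        P = (1 - K) * P
    return np.array(forecasts)

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 500)
true_power = np.clip(np.sin(np.pi * t), 0, None)   # toy clear-sky profile
z = true_power + rng.normal(0, 0.5, t.size)        # noisy sensor data
pred = kalman_forecast(z)
print("RMSE vs. truth:", np.sqrt(np.mean((pred - true_power) ** 2)))
```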
The present study investigated the possibilities and limitations of using a low-cost NIR spectrometer for the verification of the presence of the declared active pharmaceutical ingredients (APIs) in tablet formulations, especially for medicine screening studies in low-resource settings. Spectra from 950 to 1650 nm were recorded for 170 pharmaceutical products representing 41 different APIs, API combinations or placebos. Most of the products, including 20 falsified medicines, had been collected in medicine quality studies in African countries. After exploratory principal component analysis, models were built using data-driven soft independent modelling of class analogy (DD-SIMCA), a one-class classifier algorithm, for tablet products of penicillin V, sulfamethoxazole/trimethoprim, ciprofloxacin, furosemide, metronidazole, metformin, hydrochlorothiazide, and doxycycline. Spectra of amoxicillin and amoxicillin/clavulanic acid tablets were combined into a single model. Models were tested using Procrustes cross-validation and by projection of spectra of tablets containing the same or different APIs. Tablets containing no or different APIs could be identified with 100 % specificity in all models. A separation of the spectra of amoxicillin and amoxicillin/clavulanic acid tablets was achieved by partial least squares discriminant analysis. 15 out of 19 external validation products (79 %) representing different brands of the same APIs were correctly identified as members of the target class; three of the four rejected samples showed an API mass percentage of the total tablet weight that was out of the range covered in the respective calibration set. Therefore, in future investigations larger and more representative spectral libraries are required for model building. Falsified medicines containing no API, incorrect APIs, or grossly incorrect amounts of the declared APIs could be readily identified. Variation between different NIR-S-G1 spectroscopic devices led to a loss of accuracy if spectra recorded with different devices were pooled. Therefore, piecewise direct standardization was applied for calibration transfer. The investigated method is a promising tool for medicine screening studies in low-resource settings.
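DD-SIMCA is not part of the common Python libraries; the following hedged sketch only illustrates the underlying one-class idea: build a PCA model of a single API class and accept or reject new spectra by their residual distance to that model. The simple quantile threshold stands in for DD-SIMCA's chi-squared-based critical limits, and the spectra are synthetic.

```python
import numpy as np
from sklearn.decomposition import PCA

# Train a one-class model on spectra of a single target API class.
# X_target: (n_samples, n_wavelengths) preprocessed NIR spectra (assumed data).
rng = np.random.default_rng(2)
X_target = rng.normal(0, 1, (40, 200)).cumsum(axis=1)  # synthetic stand-in spectra

pca = PCA(n_components=3).fit(X_target)

def orthogonal_distance(X):
    # Residual between a spectrum and its reconstruction from the class subspace.
    X_hat = pca.inverse_transform(pca.transform(X))
    return np.linalg.norm(X - X_hat, axis=1)

# Acceptance threshold from the training distribution (simplified; DD-SIMCA
# derives its critical limits from chi-squared statistics instead).
threshold = np.quantile(orthogonal_distance(X_target), 0.95)

X_new = rng.normal(0, 1, (5, 200)).cumsum(axis=1)
is_member = orthogonal_distance(X_new) <= threshold
print(is_member)
```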
Verification of an active time constant tuning technique for continuous-time delta-sigma modulators
(2022)
In this work we present a technique to compensate the effects of R-C / g m -C time-constant (TC) errors due to process variation in continuous-time delta-sigma modulators. Local TC error compensation factors are shifted around in the modulator loop to positions where they can be implemented efficiently with finely tunable circuit structures, such as current-steering digital-to-analog converters (DAC). We apply our technique to a third-order, single-bit, low-pass continuous-time delta-sigma modulator in cascaded integrator feedback structure, implemented in a 0.35-μm CMOS process. A tuning scheme for the reference currents of the feedback DACs is derived as a function of the individual TC errors and verified by circuit simulations. We confirm the tuning technique experimentally on the fabricated circuit over a TC parameter variation range of ±20%. Stable modulator operation is achieved for all parameter sets. The measured performances satisfy the expectations from our theoretical calculations and circuit-level simulations.
Venture capital and the innovative power of a state : econometric study including Google data
(2015)
This article focuses on venture capital investments and the innovative power of a state as defined by its public infrastructure. The economic implications are evaluated by estimating several panel regression models. The novelty is twofold: on the one hand the research approach, and on the other hand the new data set. The data range from 1995 to 2014 and cover 10 European countries plus the US and Canada. For the first time we include Google search data on Venture Capital. The results show that a significant increase in Venture Capital is mainly determined by economic conditions such as real GDP growth. The impact of the innovative power of a state is not significant. We also find that Google data is positively and significantly related to Venture Capital investments. Consequently, we confirm that private business investments cannot be created by government policy alone but rather via solid macroeconomic conditions.
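A minimal sketch of the type of panel regression described, with country fixed effects, GDP growth, and a Google search index as regressors; the synthetic panel below merely mimics the structure of the 1995-2014 data set and is not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic panel: 12 countries x 20 years (stand-in for the 1995-2014 data).
rng = np.random.default_rng(3)
rows = []
for c in [f"C{i}" for i in range(12)]:
    effect = rng.normal(0, 0.5)                 # unobserved country effect
    for year in range(1995, 2015):
        gdp_growth = rng.normal(2, 1.5)
        google_vc = rng.uniform(0, 100)         # search index for "venture capital"
        vc = 0.8 * gdp_growth + 0.02 * google_vc + effect + rng.normal(0, 1)
        rows.append((c, year, vc, gdp_growth, google_vc))
df = pd.DataFrame(rows, columns=["country", "year", "vc", "gdp_growth", "google_vc"])

# Country fixed effects via the C(country) dummy expansion.
model = smf.ols("vc ~ gdp_growth + google_vc + C(country)", data=df).fit()
print(model.params[["gdp_growth", "google_vc"]])
```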
Redirected walking techniques allow people to walk in a larger virtual space than the physical extents of the laboratory. We describe two experiments conducted to investigate human sensitivity to walking on a curved path and to validate a new redirected walking technique. In a psychophysical experiment, we found that sensitivity to walking on a curved path was significantly lower for slower walking speeds (radius of 10 meters versus 22 meters). In an applied study, we investigated the influence of a velocity-dependent dynamic gain controller and an avatar controller on the average distance that participants were able to freely walk before needing to be reoriented. The mean walked distance was significantly greater in the dynamic gain controller condition, as compared to the static controller (22 meters versus 15 meters). Our results demonstrate that perceptually motivated dynamic redirected walking techniques, in combination with reorientation techniques, allow for unaided exploration of a large virtual city model.
Values Management System
(2022)
The Values Management System ZfW (VMSZfW) is a management standard to “provide a sustainable safeguard of a firm and its development, in all dimensions (legal, economic, ecological, social)” (VMSZfW, p. 4). It includes a framework for values-driven governance through self-commitment and self-binding mechanisms. Values promote a sense of identity and give organizations guidance in decision-making. This is especially important in decision-making processes where topics are not clearly ruled by laws and regulations.
The VMSZfW must be embedded in the specific business strategy, structure, and culture of an organization. The following four steps describe the implementation of the Values Management System ZfW: (i) codify the core values of an organization, for instance with a “mission, vision and values statement” or a Code of Ethics, (ii) implement guidelines such as a Code of Conduct and specific policies and procedures, (iii) systematize these by establishing management systems such as compliance and CSR management systems, and (iv) finally organize and establish structures to ensure the strategic direction, operational implementation and review of these processes. The top management shows that values management is taken seriously by its self-commitment to the core values of the company.
Private equity (PE) firms are investment firms that acquire equity shares in companies. The goal of PE firms is to exit the investment after a few years with a substantial increase in value. PE firms often claim to outperform the market, i.e. to create alpha.
The overall aim of this paper is to unravel the mystery of value creation in the PE industry. First, the author presents a conceptual framework for value creation in the PE industry based on a multiple valuation model that breaks down value creation into different elements. Second, the paper evaluates whether PE firms really create value by analysing and combining results from prior empirical studies based on the conceptual framework.
The results show that the existing empirical evidence is mixed, but that there is indeed a tendency toward positive evidence that PE firms create economic value on average. However, there are methodological difficulties in measuring value creation, and studies are often subject to bias. Finally, it is pointed out that the question of whether PE firms really create value has to be viewed from different perspectives, such as those of the PE firm, the investors, and the portfolio companies.
This practical guide for advanced students and decision-makers in the pharma and biotech industry presents key success factors in R&D along with value creators in pharmaceutical innovation. A team of editors and authors with extensive experience in academia and industry and at some of the most prestigious business schools in Europe discusses in detail the innovation process in pharma as well as common and new research and innovation strategies. In doing so, they cover collaboration and partnerships, open innovation, biopharmaceuticals, translational medicine, good manufacturing practice, regulatory affairs, and portfolio management. Each chapter covers controversial aspects of recent developments in the pharmaceutical industry, with the aim of stimulating productive debates on the most effective and efficient innovation processes. A must-have for young professionals and MBA students preparing to enter R&D in pharma or biotech as well as for students on a combined BA/biomedical and natural sciences program.
Artificial Intelligence enables innovative applications, and applications based on Artificial Intelligence are increasingly important for all aspects of the Digital Economy. However, the question of how AI resources such as tools and data can be linked to provide an AI capability and create business value is still open. Therefore, this paper identifies the value-creating mechanisms of connectionist artificial intelligence using a capability-oriented view and points out the connections to different kinds of business value. The analysis supports an agenda that identifies areas needing further research to understand the mechanism of value creation in connectionist artificial intelligence.
This article investigates the fundamental value of digital platforms, such as Facebook and Google. Despite the transformative nature of digital technologies, it is challenging to value digital services, given that their usage is free of charge. Applying the methodology of discrete choice experiments, we estimated the value of free digital goods. For the first time in the literature, we obtained data for the willingness-to-pay and willingness-to-accept, together with socio-economic variables. The customer’s valuation of free digital services is, on average, 121 € per week for Google and 28 € for Facebook.
Product engineering and subsequent phases of product lifecycles are predominantly managed in isolation. Companies therefore do not fully exploit potentials through using data from smart factories and product usage. The novel intelligent and integrated Product Lifecycle Management (i²PLM) describes an approach that uses these data for product engineering. This paper describes the i²PLM, shows the cause-and-effect relationships in this context and presents in detail the validation of the approach. The i²PLM is applied and validated on a smart product in an industrial research environment. Here, the subsequent generation of a smart lunchbox is developed based on production and sensor data. The results of the validation give indications for further improvements of the i²PLM. This paper describes how to integrate the i²PLM into a learning factory.
Sleep disorders can impact daily life, affecting physical, emotional, and cognitive well-being. Due to the time-consuming, highly obtrusive, and expensive nature of standard approaches such as polysomnography, it is of great interest to develop a noninvasive and unobtrusive in-home sleep monitoring system that can reliably and accurately measure cardiorespiratory parameters while causing minimal discomfort to the user’s sleep. We developed a low-cost, low-complexity Out of Center Sleep Testing (OCST) system to measure cardiorespiratory parameters. We tested and validated two force-sensitive resistor strip sensors under the bed mattress, covering the thoracic and abdominal regions. Twenty subjects were recruited, including 12 males and 8 females. The ballistocardiogram signal was processed using the 4th smooth level of the discrete wavelet transform and a 2nd-order Butterworth bandpass filter to measure the heart rate and respiration rate, respectively. We reached a total error (with respect to the reference sensors) of 3.24 beats per minute for heart rate and 2.32 breaths per minute for respiration rate. For males and females, heart rate errors were 3.47 and 2.68, and respiration rate errors were 2.32 and 2.33, respectively. We developed and verified the reliability and applicability of the system. It showed only a minor dependency on sleeping position, one of the major confounders in sleep measurement. We identified the sensor under the thoracic region as the optimal configuration for cardiorespiratory measurement. Although testing the system with healthy subjects and regular patterns of cardiorespiratory parameters showed promising results, further investigation is required into the bandwidth frequency, and the system needs to be validated with larger groups of subjects, including patients.
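The two processing paths can be sketched as follows: the level-4 DWT "smooth" (approximation) denoises the ballistocardiogram for heartbeat detection, and a 2nd-order Butterworth band-pass isolates respiration. The wavelet family, cutoff frequencies, sampling rate, and signal are assumptions, since the abstract does not specify them.

```python
import numpy as np
import pywt
from scipy.signal import butter, filtfilt, find_peaks

fs = 100.0                                   # assumed sampling rate in Hz
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(4)
# Toy BCG-like mixture: ~1.2 Hz cardiac component + ~0.25 Hz respiration + noise.
x = (0.3 * np.sin(2 * np.pi * 1.2 * t)
     + np.sin(2 * np.pi * 0.25 * t)
     + 0.2 * rng.normal(size=t.size))

# Denoise via the 4th-level DWT "smooth": keep the approximation, zero the details.
coeffs = pywt.wavedec(x, "db4", level=4)
denoised = pywt.waverec([coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]],
                        "db4")[: x.size]

# Respiration path: 2nd-order Butterworth band-pass over typical breathing rates.
b, a = butter(2, [0.1, 0.5], btype="band", fs=fs)
respiration = filtfilt(b, a, x)

# Heart rate from peaks of the denoised cardiac component (respiration removed).
cardiac = denoised - respiration
peaks, _ = find_peaks(cardiac, distance=fs * 0.4)   # >= 0.4 s between beats
print("heart rate ~", len(peaks), "bpm over this 60 s window")
resp_peaks, _ = find_peaks(respiration, distance=fs * 2)
print("respiration ~", len(resp_peaks), "breaths/min")
```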
Hyperspectral imaging and reflectance spectroscopy in the range from 200–380 nm were used to rapidly detect and characterize copper oxidation states and their layer thicknesses on direct bonded copper in a non-destructive way. Single-point UV reflectance spectroscopy, as a well-established method, was utilized to compare the quality of the hyperspectral imaging results. For the laterally resolved measurements of the copper surfaces a UV hyperspectral imaging setup based on a pushbroom imager was used. Six different types of direct bonded copper were studied. Each type had a different oxide layer thickness and was analyzed by depth profiling using X-ray photoelectron spectroscopy. In total, 28 samples were measured to develop multivariate models to characterize and predict the oxide layer thicknesses. The principal component analysis (PCA) models enabled a general differentiation between the sample types on the first two PCs, with 100.0% and 96% explained variance for UV spectroscopy and hyperspectral imaging, respectively. Partial least squares regression (PLS-R) models showed reliable performance with R2c = 0.94 and 0.94 and RMSEC = 1.64 nm and 1.76 nm, respectively. The developed in-line prototype system combined with multivariate data modeling shows high potential for further development of this technique towards real large-scale processes.
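A hedged sketch of the PLS-R modeling step: predict oxide layer thickness from reflectance spectra and report R² and RMSE as in the abstract. The spectra below are synthetic stand-ins, not the measured direct-bonded-copper data.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in: 28 samples x 150 spectral channels (200-380 nm range),
# with a broad spectral feature whose strength scales with thickness.
rng = np.random.default_rng(5)
thickness = rng.uniform(2, 40, 28)                  # oxide layer thickness, nm
wavelengths = np.linspace(200, 380, 150)
X = (thickness[:, None] * np.exp(-((wavelengths - 300) / 60) ** 2)[None, :]
     + rng.normal(0, 0.5, (28, 150)))

X_train, X_test, y_train, y_test = train_test_split(X, thickness, random_state=0)
pls = PLSRegression(n_components=3).fit(X_train, y_train)
y_pred = pls.predict(X_test).ravel()
print("R2 =", round(r2_score(y_test, y_pred), 3),
      "RMSE =", round(np.sqrt(mean_squared_error(y_test, y_pred)), 2), "nm")
```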
In addition to increased safety by detecting possible overload, continuous component monitoring by sensor integration makes the use of fiber-reinforced plastics more cost-effective. Since the components are continuously monitored, one can switch from time-based to condition-based maintenance. However, the integration of conventional sensor components causes weak points, as foreign objects are inserted into the reinforcing structure. In this paper, we examine the use of the textile reinforcement as a sensor in itself. We describe how bending sensors can be formed by slight modifications of the composite’s reinforcement structure. We investigated two different sensor principles: (1) the integration of textile plate capacitors into the structure; (2) the construction of textile piezo elements as part of the reinforcing structure. The bending test results reveal that textile plate capacitors show a load-dependent signal output. The samples with textile piezo elements show a significant increase in signal strength.
The paper illustrates the status quo of a research project for the development of a control system enabling CHP units to produce electricity demand-oriented through an intelligent management of the heat storage tank. The focus of the project is twofold: first, the compensation of the fluctuating power production of the renewable sources solar and wind; second, a reduction of the load on the power grid by better matching local electricity demand and production.
In detail, the general control strategy is outlined, the method utilized for forecasting heat and electricity demand is illustrated, and a correlation method for the temperature distribution in the heat storage tank based on a sigmoid function is proposed. Moreover, the simulation model for verification and optimization of the control system and the two field test sites for implementing and testing the system are introduced.
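The sigmoid correlation for the tank's temperature distribution can be sketched as a logistic fit over tank height, with the thermocline position and steepness as fit parameters; the sensor heights and temperatures below are invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

# Logistic profile over tank height h: cold at the bottom, hot at the top,
# with thermocline position h0 and steepness k as fit parameters.
def sigmoid_profile(h, T_cold, T_hot, h0, k):
    return T_cold + (T_hot - T_cold) / (1 + np.exp(-k * (h - h0)))

# Invented readings from a few sensors along the tank height (m, degC).
heights = np.array([0.1, 0.4, 0.7, 1.0, 1.3, 1.6, 1.9])
temps = np.array([28.1, 29.0, 34.5, 52.0, 61.5, 63.0, 63.4])

params, _ = curve_fit(sigmoid_profile, heights, temps,
                      p0=[28, 63, 1.0, 5.0])            # initial guess
T_cold, T_hot, h0, k = params
print(f"thermocline at {h0:.2f} m, {T_cold:.1f}-{T_hot:.1f} degC")

# The fitted profile can then be integrated to estimate the stored heat.
```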
Applications often need to be deployed in different variants due to different customer requirements. However, since modern applications often need to be deployed using multiple deployment technologies in combination, such as Ansible and Terraform, the deployment variability must be considered in a holistic way. To tackle this, we previously developed Variability4TOSCA and the prototype OpenTOSCA Vintner, which is a TOSCA preprocessing and management layer that implements Variability4TOSCA. In this demonstration, we present a detailed case study that shows how to model a deployment using Variability4TOSCA, how to resolve the variability using Vintner, and how the result can be deployed.
The purpose of this paper is to determine the relevance of social media for luxury brand management. It employs a multi-methodological approach: after analyzing the online performance of the three luxury brands Burberry, Louis Vuitton and Gucci, the empirical research includes a survey as well as an eye-tracking test executed with Tobii Studio. The findings reveal that online and social media have given luxury fashion businesses the opportunity to establish a sustainable interaction with their customers and to distinguish themselves from the competition. Still, the online business holds many challenges for luxury companies to overcome. This paper gives instructions as to how social media can be effectively incorporated into a luxury company.
Recognizing actions of humans, reliably inferring their meaning, and being able to potentially exchange mutual social information are core challenges for autonomous systems when they directly share the same space with humans. Today’s technical perception solutions have been developed and tested mostly on standard vision benchmark datasets, where manual labeling of sensory ground truth is a tedious but necessary task. Furthermore, rarely occurring human activities are underrepresented in such data, leading to algorithms that do not recognize such activities. For this purpose, we introduce a modular simulation framework which allows training and validating algorithms under various environmental conditions. For this paper we created a dataset containing rare human activities in urban areas, on which a current state-of-the-art algorithm for pose estimation fails, and demonstrate how to train such rare poses with simulated data only.
Context: Organizations increasingly develop software in a distributed manner. The cloud provides an environment to create and maintain software-based products and services. Currently, it is unknown which software processes are suited for cloud-based development and what their effects in specific contexts are.
Objective: We aim at better understanding the software process applied to distributed software development using the cloud as development environment. We further aim at providing an instrument which helps project managers compare different solution approaches and adapt team processes to improve future project activities and outcomes.
Method: We provide a simulation model which helps analyze different project parameters and their impact on projects performed in the cloud. To evaluate the simulation model, we conduct different analyses using a Scrumban process and data from a project executed in Finland and Spain. An additional adaptation of the simulation model for Scrum and Kanban was used to evaluate its suitability to cover further process models.
Results: A comparison of the real project data with the results obtained from the different simulation runs shows that the simulation produces results close to the real data, and we could successfully replicate a distributed software project. Furthermore, we could show that the simulation model is suitable to address further process models.
Conclusion: The simulator helps reproduce activities, developers, and events in the project, and it helps analyze potential tradeoffs, e.g., regarding throughput, total time, project size, team size and work-in-progress limits. Furthermore, the simulation model supports project managers in selecting the most suitable planning alternative, thus supporting decision-making processes.
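A minimal sketch of such a process simulation, assuming a simpy-based queueing view of a Scrumban team: tasks arrive, a WIP-limited team processes them, and throughput and lead time are measured. All rates and limits are illustrative, not the calibrated model from the study.

```python
import random
import simpy

# Toy Scrumban-style simulation: tasks arrive, a WIP-limited team processes them;
# we measure throughput and lead time. All rates/limits are illustrative.
random.seed(0)
lead_times, done = [], [0]

def task(env, team, arrival):
    with team.request() as slot:         # WIP limit = team capacity
        yield slot
        yield env.timeout(random.expovariate(1 / 2.0))   # ~2 days of work
    lead_times.append(env.now - arrival)
    done[0] += 1

def arrivals(env, team):
    while True:
        yield env.timeout(random.expovariate(1 / 1.0))   # ~1 task/day arrives
        env.process(task(env, team, env.now))

env = simpy.Environment()
team = simpy.Resource(env, capacity=3)   # 3 developers / WIP limit of 3
env.process(arrivals(env, team))
env.run(until=200)                       # simulate 200 working days

print(f"throughput: {done[0] / 200:.2f} tasks/day, "
      f"mean lead time: {sum(lead_times) / len(lead_times):.1f} days")
```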
Engineers of the research project “Digital Product Life-Cycle” are using a graph-based design language to model all aspects of the product they are working on. This abstract model is the basis for all further investigations, developments and implementations. Particularly at early stages of development, collaborative decision making is very important. We propose a semantically augmented knowledge space by means of mixed reality technology to support engineering teams. To this end we present an interaction prototype consisting of a pico projector and a camera. In our usage scenario, engineers augment different artefacts in a virtual working environment. The concept of our prototype contains both an interaction and a technical concept. To realise implicit and natural interactions, we conducted two prototype tests: (1) a test with a low-fidelity prototype and (2) a test using the Wizard of Oz method. As a result, we present a prototype with interaction selection using augmentation spotlighting and an interaction zoom as a semantic zoom.
The desire to combine advanced user-friendly interfaces with a product personality communicating environmental friendliness to customers poses new challenges for car interior designers, as little research has been carried out in this field to date. In this paper, the creation of three personas aimed at defining key German car users with pro-environmental behaviour is presented. After collecting ethnographic data of potential drivers through a literature review, information about generation and Euro car segment led to the definition of three key user groups. The resulting personas were applied to determine the most important interaction points in the car interior. Finally, present design cues of eco-friendly product personality developed in the field of automotive design were explored. Our work presents three strategic directions for the design development of future in-car user interfaces, namely: a) foster multimodal mobility; b) emphasize the interlinkage between economy and sustainable driving; and c) highlight new technological developments. The presented results are meant as an impulse for developers to fit the needs of green customers and drivers when designing user-friendly HMI components.
Using measurement and simulation for understanding distributed development processes in the Cloud
(2017)
Organizations increasingly develop software in a distributed manner. The Cloud provides an environment to create and maintain software-based products and services. Currently, it is widely unknown which software processes are suited for Cloud-based development and what their effects in specific contexts are. This paper presents a process simulation to study distributed development in the Cloud. We contribute a simulation model which helps analyze different project parameters and their impact on projects carried out in the Cloud. The simulator helps reproduce activities, developers, issues and events in the project, and it generates statistics, e.g., on throughput, total time, and lead and cycle time. The aim of this simulation model is thus to analyze the tradeoffs regarding throughput, total time, project size, and team size. Furthermore, the modified simulation model aims to help project managers select the most suitable planning alternative. Based on observed projects in Finland and Spain, we simulated a distributed project using artificial and real data. In particular, we studied the variables project size, team size, throughput, and total project duration. A comparison of the real project data with the results obtained from the simulation shows that the simulation produces results close to the real data, and we could successfully replicate a distributed software project. By improving the understanding of distributed development processes, our simulation model thus supports project managers in their decision-making.
A sequence of transactions represents a complex and multidimensional type of data. Feature construction can be used to reduce the data’s dimensionality and to find behavioural patterns within such sequences. The patterns can be expressed using the blueprints of the constructed relevant features. These blueprints can then be used for real-time classification of other sequences.
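A small sketch of the idea, assuming pandas-style transaction data: named aggregations act as the feature "blueprints" that collapse each variable-length sequence into a fixed-dimensional vector, and the same blueprint can be re-applied to new sequences for real-time classification.

```python
import pandas as pd

# Invented transaction sequences: one row per transaction, grouped by account.
tx = pd.DataFrame({
    "account": ["A", "A", "A", "B", "B", "B", "B"],
    "amount":  [12.0, 95.5, 7.2, 300.0, 250.0, 410.0, 5.0],
    "hour":    [9, 14, 20, 2, 3, 2, 15],
})

# Feature "blueprints": named aggregations that collapse each variable-length
# sequence into a fixed-dimensional vector usable by any classifier.
features = tx.groupby("account").agg(
    n_tx=("amount", "size"),
    mean_amount=("amount", "mean"),
    max_amount=("amount", "max"),
    night_share=("hour", lambda h: (h < 6).mean()),  # share of night-time activity
)
print(features)
# The same blueprint (the agg spec) can be re-applied to new sequences in real time.
```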
Digital light microscopy techniques are among the most widely used methods in cell biology and medical research. Despite that, the automated classification of objects such as cells or specific parts of tissues in images is difficult. We present an approach to classify confluent cell layers in microscopy images by learned deep correlation features using deep neural networks. These deep correlation features are generated through the use of Gram-based correlation features and are input to a neural network for learning the correlation between them. In this work we examined whether a representation of cell data based on this approach is suitable for classification, as has been done for artworks with respect to their artistic period. The method generates images that contain recognizable characteristics of a specific cell type, for example the average size and the ordered pattern.
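The Gram-based correlation features can be sketched as follows: for a CNN feature map, the Gram matrix of channel activations captures channel-to-channel correlations while discarding spatial layout. The feature-map shape and data are placeholders.

```python
import numpy as np

# Gram-based correlation features: for a (channels, height, width) feature map,
# the Gram matrix holds channel-to-channel correlations, discarding spatial
# layout -- a texture-like signature of the image content.
def gram_features(feature_map):
    c, h, w = feature_map.shape
    flat = feature_map.reshape(c, h * w)
    gram = flat @ flat.T / (h * w)          # (c, c) correlation matrix
    iu = np.triu_indices(c)                 # keep the upper triangle as a vector
    return gram[iu]

rng = np.random.default_rng(6)
fmap = rng.normal(size=(16, 32, 32))        # stand-in for a CNN layer activation
x = gram_features(fmap)
print(x.shape)                              # (136,) = 16*17/2 feature vector
# Vectors like x can be fed to a downstream classifier for cell-layer types.
```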
Process quality has reached a high level in mass production, utilizing well-known methods like DoE. The drawback of the underlying statistical methods is the need for tests under real production conditions, which cause high costs due to the lost output. Research over the last decade led to methods for correcting a process by using in-situ data to correct the process parameters, but still a lot of pre-production is necessary to get this working. This paper presents a new approach to improving product quality in process chains by using context data - which are in part gathered using Industry 4.0 devices - to reduce the necessary pre-production.
Software evolvability is an important quality attribute, yet one difficult to grasp. A certain base level of it is allegedly provided by service- and microservice-based systems, but many software professionals lack a systematic understanding of the reasons and preconditions for this. We address this issue via the proxy of architectural modifiability tactics. By qualitatively mapping principles and patterns of Service Oriented Architecture (SOA) and microservices onto tactics and analyzing the results, we can not only generate insights into service-oriented evolution qualities but also provide a modifiability comparison of the two popular service-based architectural styles. The results suggest that both SOA and microservices possess several inherent qualities beneficial for software evolution. While both focus strongly on loose coupling and encapsulation, there are also differences in the way they strive for modifiability (e.g. governance vs. evolutionary design). To leverage the insights of this research, however, it is necessary to find practical ways to incorporate the results as guidance into the software development process.
Vehicles have so far been improved in terms of energy efficiency and safety mainly by optimising the engine and the power train. However, there are opportunities to increase energy efficiency and safety by adapting the individual driving behaviour to the given driving situation. In this paper, an improved rule match algorithm is introduced, which is used in the expert system of a human-centred driving system. The goal of the driving system is to optimise the driving behaviour in terms of energy efficiency and safety by giving recommendations to the driver. The improved rule match algorithm checks the incoming information against the driving rules to recognise any violation of a driving rule. The needed information is obtained by monitoring the driver, the current driving situation, and the car, using in-vehicle sensors and serial-bus systems. On the basis of the detected violations, the expert system creates individual recommendations in terms of energy efficiency and safety, which allow eliminating bad driving habits while considering the driver's needs.
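A minimal sketch of a rule match step, assuming rules are condition predicates over the monitored signals (signal names and thresholds are invented; the paper's actual rule base and matching optimizations are not described in the abstract):

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

# Minimal sketch of an expert-system rule matcher: each driving rule has a
# condition over the monitored signals and a recommendation issued on violation.
@dataclass
class DrivingRule:
    name: str
    violated: Callable[[Dict[str, float]], bool]
    recommendation: str

RULES: List[DrivingRule] = [
    DrivingRule("harsh_acceleration",
                lambda s: s["throttle"] > 0.8 and s["speed"] < 30,
                "Accelerate more gently to save fuel."),
    DrivingRule("tailgating",
                lambda s: s["gap_s"] < 1.5,
                "Increase the distance to the vehicle ahead."),
    DrivingRule("high_rpm",
                lambda s: s["rpm"] > 3000,
                "Shift up earlier to reduce consumption."),
]

def match_rules(signals: Dict[str, float]) -> List[str]:
    # Check incoming sensor/bus data against every rule; collect recommendations.
    return [r.recommendation for r in RULES if r.violated(signals)]

sample = {"throttle": 0.9, "speed": 20.0, "gap_s": 2.0, "rpm": 3400.0}
for advice in match_rules(sample):
    print(advice)
```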
The coupling of the heat and power sectors is required, as supply and demand in the German electricity mix drift further and further apart with a high percentage of renewable energy. Heat pumps in combination with thermal energy storage systems can be a useful way to couple the heat and power sectors. This paper presents a hardware-in-the-loop test bench for the experimental investigation of optimized control strategies for heat pumps. 24-hour experiments are carried out to test whether the heat pump is able to serve optimized schedules generated by a MATLAB algorithm. The results show that the heat pump is capable of following the generated schedules, and the maximum deviation of the operational time between schedule and experiment is only 3%. Additionally, the system can serve the demand for space heating and DHW at any time.
The paper explains a workflow to simulate the food energy water (FEW) nexus for an urban district, combining various data sources like 3D city models, particularly the City Geography Markup Language (CityGML) data model from the Open Geospatial Consortium, OpenStreetMap, and Census data. A long-term vision is to extend the CityGML data model by developing a FEW Application Domain Extension (FEW ADE) to support future FEW simulation workflows such as the one explained in this paper. Together with the mentioned simulation workflow, this paper also identifies some necessary FEW-related parameters for the future development of a FEW ADE. Furthermore, relevant key performance indicators are investigated, and the datasets necessary to calculate these indicators are studied. Finally, different calculations are performed for the downtown borough Ville-Marie in the city of Montréal (Canada) for the domains of food waste (FW) and wastewater (WW) generation. For this study, a workflow is developed to calculate the energy generation from anaerobic digestion of FW and WW. In the first step, data collection and preparation were carried out. Here, relevant data for georeferencing, data for model set-up, and data for creating the required usage libraries, like food waste and wastewater generation per person, were collected. The next step was the data integration and calculation of the relevant parameters, and lastly, the results were visualized for analysis purposes. As a use case to support such calculations, the CityGML level of detail two model of Montréal is enriched with information such as building functions and building usages from OpenStreetMap. The calculation of the total residents based on the CityGML model as the main input for Ville-Marie results in a population of 72,606. The statistical value for 2016 was 89,170, which corresponds to a deviation of 15.3%. The energy recovery potential of FW is about 24,024 GJ/year, and that of wastewater is about 1,629 GJ/year, adding up to 25,653 GJ/year. Relating these values to the calculated number of inhabitants in Ville-Marie results in 330.9 kWh/year for FW and 22.4 kWh/year for wastewater, respectively.
Avatars are used when interacting in virtual environments in different contexts: in collaborative work as well as in gaming and in virtual meetings with friends. It is therefore important to understand how the relationship between user and avatar works. In this study, an online survey is used to determine how the perception of an avatar changes in different contexts by relating it to existing avatar relationship typologies. Additionally, it is determined whether in each context a realistic, abstract or comic-like representation is preferred by the participants. One result was a preference for low-poly representations in the work context, which are associated with the perception of the avatar as a tool. In the context of meeting friends, a realistic representation is perceived as more appropriate, which is perceived as an accurate self-representation. In the gaming context, the results are less clear, which can be attributed to different gaming preferences. Here, unlike in the other contexts, a comic-like representation is also perceived as appropriate, which is associated with the perception of the avatar as a friend. A symbiotic user-avatar relationship is not directly related to any form of representation, but always lies in the midfield, which is attributed to the fact that it represents a whole spectrum between other categories.
Going forward with the requirements of missions to the Moon and further into deep space, the European Space Agency is investigating new methods of astronaut training that can help accelerate learning, increase availability, and reduce complexity and cost in comparison to currently used methods. To achieve this, technologies such as virtual reality may be utilized. In this paper, an investigation into the benefits of using virtual reality for extravehicular activity training, in comparison to conventional training methods such as neutral buoyancy pools, is given. To help determine the requirements and current uses of virtual reality for extravehicular activity training, first-hand tests of currently available software as well as expert interviews are utilized. With this knowledge a concept is developed that may be used to further advance training methods in virtual reality. The resulting concept is used as a basis for the development of a prototype to showcase user interactions and locomotion in microgravity simulations.
The stimulation of user engagement has received significant attention in extant research. However, the theory of antecedents for user engagement with an initial electronic word-of-mouth (eWoM) communication is relatively less developed. In an investigation of 576 unique user postings across independent Facebook (FB) communities for two German firms, we contribute to the extant knowledge on user engagement in two different ways. First, we explicate senders’ prior usage experience and the extent of their acquaintance with other community members as the two key drivers of user engagement across a product and a service community. Second, we reveal that these main effects differ according to the type of community. In service communities, experience has a stronger impact on user engagement; whereas, in product communities, acquaintance is more important.
Managerial accountants spend a large part of their working time on more operational activities in cost accounting, reporting, and operational planning and budgeting. In all these areas, there has been increasing discussion in recent years, both in theory and practice, about using more digital technologies. For reporting, this means not only an intensified discussion of technologies such as RPA and AI but also more intensive changes to existing reporting systems. In particular, management information systems (MIS), which are maintained by managerial accountants and used by managers for corporate management, should be mentioned here. Based on an empirical survey in a large German company, this article discusses the requirements and assessments of users when switching from a regular MIS to a cloud-based system.
To correctly assess the cleanliness of technical surfaces in a production process, corresponding online monitoring systems must provide sufficient data. A promising method for fast, large-area, and non-contact monitoring is hyperspectral imaging (HSI), which was used in this paper for the detection and quantification of organic surface contaminations. Depending on the cleaning parameter constellation, different levels of organic residues remained on the surface. Afterwards, the cleanliness was determined by the carbon content in atomic percent on the sample surfaces, characterized by XPS and AES. The HSI data and the XPS measurements were correlated using machine learning methods to generate a predictive model for the carbon content of the surface. The regression algorithms elastic net, random forest regression, and support vector machine regression were used. Overall, the developed method was able to quantify organic contaminations on technical surfaces. The best regression model found was a random forest model, which achieved an R2 of 0.7 and an RMSE of 7.65 At.-% C. Due to the easy-to-use measurement and the fast evaluation by machine learning, the method seems suitable for an online monitoring system. However, the results also show that further experiments are necessary to improve the quality of the prediction models.
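A hedged sketch of the regression step: correlate hyperspectral signatures with XPS-determined carbon content using a random forest, reporting R² and RMSE in at.% C as in the abstract. The spectra and carbon values below are synthetic.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

# Sketch of the HSI-to-XPS regression: predict surface carbon content (at.% C)
# from per-sample hyperspectral signatures. Data below is synthetic.
rng = np.random.default_rng(8)
carbon = rng.uniform(5, 60, 120)                        # XPS carbon content, at.%
bands = np.linspace(900, 1700, 100)                     # assumed spectral bands, nm
X = (carbon[:, None] * np.exp(-((bands - 1200) / 150) ** 2)[None, :]
     + rng.normal(0, 1.0, (120, 100)))                  # contamination feature + noise

X_tr, X_te, y_tr, y_te = train_test_split(X, carbon, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)
print("R2 =", round(r2_score(y_te, pred), 2),
      "RMSE =", round(np.sqrt(mean_squared_error(y_te, pred)), 2), "at.% C")
```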
This paper examines the efficacy of social media systems in customer complaint handling. The emergence of social media, as a useful complement and (possibly) a viable alternative to the traditional channels of service delivery, motivates this research. The theoretical framework, developed from literature on social media and complaint handling, is tested against data collected from two different channels (hotline and social media) of a German telecommunication services provider, in order to gain insights into channel efficacy in complaint handling. We contribute to the understanding of firm’s technology usage for complaint handling in two ways:
(a) by conceptualizing and evaluating complaint handling quality across traditional and social media channels and (b) by comparing the impact of complaint handling quality on key performance outcomes such as customer loyalty, positive word-of-mouth, and cross-purchase intentions across traditional and social media channels.
Unsaturated polyester resins (UPR) and vinyl ester resins (VER) are among the most commercially important thermosetting matrix materials for composites. Although comparatively low cost, their technological performance is suitable for a wide range of applications, such as fiber-reinforced plastics, artificial marble or onyx, polymer concrete, or gel coats. The main areas of UPR consumption include the wind energy, marine, pipe and tank, transportation, and construction industries.
This chapter discusses basic UPR and VER chemistry and technology of manufacturing, and consequent applications. Some important properties and performance characteristics are discussed, such as shrinkage behavior, flame retardance, and property modification by nanoparticles. Also briefly introduced and described are the practical aspects of UPR and VER processing, with special emphasis on the most widely used technological approaches, such as hand and spray layup, resin infusion, resin transfer molding, sheet and bulk molding, pultrusion, winding, and centrifugal casting.
Here, we study resin cure and network formation of solid melamine formaldehyde pre-polymer over a large temperature range via dynamic temperature curing profiles. Real-time infrared spectroscopy is used to analyze the chemical changes during network formation and network hardening. By applying chemometrics (multivariate curve resolution, MCR), the essential chemical functionalities that constitute the network at a given stage of curing are mathematically extracted and tracked over time. The three spectral components identified by MCR were methylol-rich, ether-linkage-rich and methylene-linkage-rich resin entities. Based on dynamic changes of their characteristic spectral patterns in dependence of temperature, curing is divided into five phases: (I) stationary phase with free methylols as main chemical feature, (II) formation of a flexible network cross-linked by ether linkages, (III) formation of a rigid, ether-cross-linked network, (IV) further hardening via transformation of methylols and ethers into methylene cross-linkages, and (V) network consolidation via transformation of ether into methylene bridges. The presented spectroscopic/chemometric approach can be used as a methodological basis for the functionality design of MF-based surface films at the stage of laminate pressing, i.e., for tailoring the technological property profile of cured MF films using a causal understanding of the underlying chemistry based on molecular markers and spectroscopic fingerprints.
Unraveling the double-edged sword : effects of cultural diversity on creativity and innovativeness
(2014)
Cultural diversity is considered a “double-edged sword” (Kravitz, 2005), as research on its effects on teams’ performance regularly delivers inconsistent and contradictory results. This paper makes an attempt to unravel the double-edged sword by discerning different forms of cultural diversity: separation and variety (Harrison & Klein, 2007). Based on a review of the literature, a conceptual model is developed hypothesizing that cultural variety yields positive, while cultural separation yields negative, effects on team creativity and innovativeness. In addition, the effects of national diversity are contrasted to test whether national diversity can serve as a proxy for cultural diversity, as is often practiced. The model is tested on a sample of 113 student teams of Entrepreneurship modules at 4 European universities. Cultural diversity is measured directly on the basis of individual team members’ cultural value orientations by means of the CPQ4 (Maznevski, DiStefano, Gomez, Noorderhaven & Wu, 2002). Data is analyzed using the PLS structural equation modeling technique. The results confirm the hypothesized impacts of cultural variety and separation on creativity but do not deliver evidence for impacts on innovativeness. The same is true for national diversity. Interestingly, national diversity does not show any relation to either form of cultural diversity.
Unprecedented formation of sterically stabilized phospholipid liposomes of cuboidal morphology
(2021)
Sterically stabilized phospholipid liposomes of unprecedented cuboid morphology are formed upon introduction into the bilayer membrane of original polymers based on polyglycidol bearing a lipid-mimetic residue. Strong hydrogen bonding in the polyglycidol sublayers creates attractive forces which, facilitated by fluidization of the membrane, bring about the flattening of the bilayers and the formation of cuboid vesicles.
Forecasting demand is challenging. Various products exhibit different demand patterns. While demand may be constant and regular for one product, it may be sporadic for another, and when demand occurs, it may fluctuate significantly. Forecasting errors are costly and result in obsolete inventory or unsatisfied demand. Methods from statistics, machine learning, and deep learning have been used to predict such demand patterns. Nevertheless, it is not clear which algorithm achieves the best forecast for which demand pattern. Therefore, even today a large number of models are used to forecast over a test period. The model with the best result on the test period is used for the actual forecast. This approach is computationally and time-intensive and, in most cases, uneconomical. In our paper we show the possibility of using a machine learning classification algorithm which predicts the best possible model based on the characteristics of a time series. The approach was developed and evaluated on a dataset from a B2B technical retailer. The machine learning classification algorithm achieves a mean ROC-AUC of 89%, which emphasizes the skill of the model.
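The meta-learning idea can be sketched as follows: describe each time series by a few characteristics (e.g. average inter-demand interval, squared coefficient of variation) and train a classifier to predict which forecasting model performed best in backtesting. Features, labels, and data below are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Sketch of model selection as classification: describe each series by simple
# features and predict which forecasting model won on a historical test period.
rng = np.random.default_rng(7)

def series_features(y):
    nz = np.flatnonzero(y)
    adi = len(y) / max(len(nz), 1)                # average inter-demand interval
    cv2 = (y[nz].std() / y[nz].mean()) ** 2 if len(nz) else 0.0
    return [adi, cv2, y.mean(), y.std()]

# Invented training set: 200 series, each labeled with the model that won a
# backtest (0 = exponential smoothing, 1 = Croston, 2 = gradient boosting).
X, labels = [], []
for _ in range(200):
    demand_prob = rng.uniform(0.1, 0.9)           # share of periods with demand
    y = rng.poisson(5, 100) * (rng.random(100) < demand_prob)
    X.append(series_features(y))
    labels.append(0 if demand_prob > 0.7 else 1 if demand_prob < 0.3 else 2)

clf = RandomForestClassifier(random_state=0).fit(X, labels)
new_series = rng.poisson(5, 100) * (rng.random(100) < 0.2)
print("recommended model id:", clf.predict([series_features(new_series)])[0])
```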
Intermittent time series forecasting is a challenging task which still needs particular attention from researchers. The more irregularly events occur, the more difficult it is to predict them. With Croston’s approach in 1972 (Operational Research Quarterly 23(3):289–303), the intermittence and the demand of a time series were investigated separately for the first time. He proposes exponential smoothing in his attempt to generate a forecast which corresponds to the average demand per period. Although this algorithm produces good results in the field of stock control, it does not capture the typical characteristics of intermittent time series within the final prediction. In this paper, we investigate a time series’ intermittence and demand individually, forecast the upcoming demand value and inter-demand interval length using recent machine learning algorithms, such as long short-term memories and light gradient-boosting machines, and reassemble both pieces of information to generate a prediction which preserves the characteristics of an intermittent time series. We compare the results against Croston’s approach, as well as against recent forecast procedures where no split is performed.
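For reference, Croston's original approach can be sketched in a few lines: smooth the nonzero demand sizes and the inter-demand intervals separately and forecast their ratio, i.e. the average demand per period. The smoothing constant and demand series are illustrative.

```python
import numpy as np

# Croston's method: smooth the nonzero demand sizes and the intervals between
# demands separately; the forecast is size / interval (average demand per period).
def croston(y, alpha=0.1):
    y = np.asarray(y, dtype=float)
    size = interval = None
    periods_since = 0
    forecast = np.full(y.size, np.nan)
    for t, value in enumerate(y):
        if size is not None:
            forecast[t] = size / interval
        periods_since += 1
        if value > 0:
            if size is None:                 # initialize on first demand
                size, interval = value, periods_since
            else:                            # exponential smoothing updates
                size += alpha * (value - size)
                interval += alpha * (periods_since - interval)
            periods_since = 0
    return forecast

demand = [0, 0, 3, 0, 0, 0, 5, 0, 2, 0, 0, 4]
print(np.round(croston(demand), 2))
```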
The United Nations (UN) Global Compact is a call to companies to align their strategies and operations with ten universal principles in the areas of human rights, labor, environment, and anti-corruption, and to take actions that advance societal goals (UN Global Compact 2017, p. 3). The UN Global Compact’s vision is “to mobilize a global movement of sustainable companies and stakeholders to create the world we want” (UN Global Compact 2021a). It is a global network with local presence all around the world.
Engineering of large vascularized adipose tissue constructs is still a challenge for the treatment of extensive high-graded burns or the replacement of tissue after tumor removal. Communication between mature adipocytes and endothelial cells is important for homeostasis and the maintenance of adipose tissue mass but, to date, is mainly neglected in tissue engineering strategies. Thus, new coculture strategies are needed to integrate adipocytes and endothelial cells successfully into a functional construct. This review focuses on the cross-talk of mature adipocytes and endothelial cells and considers their influence on fatty acid metabolism and vascular tone. In addition, the properties and challenges with regard to these two cell types for vascularized tissue engineering are highlighted.
Strategic alliances have become important strategic options for firms to achieve competitive advantage. Yet, there are many examples of alliance failures. Scholars have studied this phenomenon and identified many reasons for alliance failure, including lack of trust between the partnering firms. Paradoxically, the concept of trust is still not fully understood, specifically how and under what conditions trust comes to break down within the broader process of alliance building. We synthesize a process model that describes the “alliance capability”, including trust, openness, partner contributions, and relational rents. We then translate this framework into a formal simulation model and analyze it thoroughly. In analyzing trust dynamics we identify and explore a tipping boundary separating a regime of alliance failures from one of successes. We apply our core findings to openness strategies – decisions about how much knowledge to share with partners. Our analyses reveal that strategies informed by a static mental model of trust, contributions, and openness undervalue openness. Further, too little openness risks early failure due to being trapped in a vicious cycle of trust depletion.
This research addresses the question of why employees use enterprise social networks (ESN). Against the background of technology acceptance research, we propose an extended unified theory of acceptance and use of technology (UTAUT) model, adapt it to an ESN context, and test our model against data from ESN users of large and medium-sized enterprises. We use partial least squares structural equation modeling to gain insights into the determinants of ESN use. This paper contributes to ESN acceptance research by evaluating a model containing determinants of ESN use. It also examines the effects of determinants on five different usage dimensions of ESN. The results reveal that facilitating conditions are the main driver of ESN use while the impact of intention to use is comparably small. Implications for theory and practice are discussed.
The unprecedented acceleration in the dynamics of economic development and its dependence on global interactions make predicting the future especially difficult. Nevertheless, an examination of long-term trends provides an opportunity to begin a discussion about what reality could await us tomorrow and how we want to deal with it. With this food-for-thought paper, the member institutes of the Fraunhofer Group for Innovation Research wish to present a selection of the trends that are destined to have a significant impact on innovation systems in the period leading up to 2030. Based on these trends, the paper derives theses for innovation in the year 2030 and describes the resulting tasks for business, politics, science and society.