The digital twin concept has long been used for asset monitoring in industry; the automotive industry is a clear example. Recently, there has also been significant interest in applying digital twins in healthcare, especially in genomics, in what is known as precision medicine. This work focuses on another medical speciality where digital twins can be applied: sleep medicine. However, there is still great controversy about the fundamentals of digital twins, such as what the concept is based on and how it can be included in healthcare effectively and sustainably. This article reviews digital twins and their role so far in what is known as personalized medicine. In addition, a series of steps is set out for a possible implementation of a digital twin for a patient suffering from sleep disorders. For this, artificial intelligence techniques, clinical data management, and possible solutions for explaining the results derived from artificial intelligence models are addressed.
Implementation of product-service systems (PSS) requires structural changes in the way that business in manufacturing industries is traditionally conducted. Literature frequently mentions the importance of human resource management (HRM), since people are involved in the entire process of PSS development and employees are the primary link to customers. However, to this day, no study has provided empirical evidence on whether and in what way the HRM of firms that implement PSS differs from the HRM of firms that solely run a traditional manufacturing-based business model. The aim of this study is to contribute to closing this gap by investigating the particular HR components of manufacturing firms that implement PSS and comparing them with the HRM of firms that do not. The context of this study is the fashion industry, which is an ideal setting since it is a mature and highly competitive industry that is well-documented for causing significant environmental impact. PSS present a promising opportunity for fashion firms to differentiate and mitigate the industry's ecological footprint. Analysis of variance (ANOVA) was conducted to analyze data of 102 international fashion firms. Findings reveal a significantly higher focus on nearly the entire spectrum of HRM components in firms that implement PSS compared with firms that do not. Empirical findings and their interpretation are utilized to propose a general framework of the role of HRM for PSS implementation. This serves as a departure point for both scholars and practitioners for further research, and fosters the understanding of the role of HRM in managing PSS implementation.
CODE RED FOR HUMANITY. The alarm bells are deafening, and the evidence is irrefutable: greenhouse-gas emissions from fossil-fuel burning and deforestation are choking our planet and putting billions of people at immediate risk. Global heating is affecting every region on Earth, with many of the changes becoming irreversible. (Guterres 2021)
The ongoing digitalisation of households and the associated sustainability-related challenges are multifaceted and complex. The introductory quote from the United Nations Secretary-General refers to the latest report of the Intergovernmental Panel on Climate Change (IPCC), emphasising the urgency to act – now. As of today, becoming a sustainable population is still a distant destination. As outlined in the previous chapters, the challenges associated with that transformation remain huge, complex, and largely unsolved. Recent dramas such as the power incident in Texas (2021), the floods in Germany (2021), or the drought in sub-Saharan Africa (2020s) are just a few of the uncountable issues stirring up the debate about fossil-fuel abandonment and the timing of climate neutrality. Business research can indeed be accused of a persistent focus on gains and growth, despite early warnings for society at large (e.g., Meadows et al., 1972; Kölsch & Veit, 1981; Veit & Thatcher, 2023). However, academic researchers, corporations, and society are now waking up, as shown by the climate change conference. In fact, it appears that the information systems (IS) discipline only began tackling the mammoth challenges around climate change within the last decade (Melville, 2010; Watson et al., 2010). The central discussion in emerging work revolves around the role and use of digital technologies on the path to a healthy planet. But while early studies have focused on organisational settings (e.g., Gholami et al., 2016; Seidel et al., 2013), research increasingly addresses private settings (e.g., Wunderlich et al., 2019).
This study focuses on the different roles of social media in promoting a sustainable lifestyle, behaviour and consumption, especially with regard to the typically non-ethical fashion industry. The research findings identify eight roles of social media influencing sustainable consumption, in contrast to prior research naming one to five impacts. Results show that social media educates and engages the young and ethically interested target group, in addition to increasing supply chain transparency and brand or theme awareness. Furthermore, social media provides a platform for organisations' relationship management and social interaction, since users are empowered to share experiences, which leads to a higher level of trust.
Second hand clothing has recently become a growing trend, leading to growing numbers of second hand shops and the development of new second hand retail formats. This paper concentrates on the current second hand market for fashion products and presents the different motives for second hand consumption as well as alternative consumption channels for second hand products. The findings of the paper are founded on literature research of academic articles and case studies. Results show that there is high potential for the second hand market due to the increasing interest of consumers in buying second hand products. The paper concentrates on the second hand market for fashion products in Western society; this means that there was no research on second hand products for disadvantaged people in poor countries. Furthermore, the paper focuses on the formal second hand retail channels to see what is already on the market.
This article explores current debate on the use of soft power in international higher education, highlighting existing tensions between competing political and academic discourses. It draws on examples from practice and relevant insights in soft power scholarship to capture varying paradoxes and dilemmas that emerge as nations try to leverage the power of international tertiary education to enhance their brand and attract foreign audiences in the name of public diplomacy. Whilst exposing cases of hubris and hidden agendas, this study also addresses issues of inequality and responds to a growing call for knowledge diplomacy aimed at tackling common global problems.
The sound of brands
(2019)
The aim of this research paper is to both examine and conceptualise audio branding. Audio branding is an important part of the overall brand management concept and corporate identity. Strong brands ease the choice for customers and convey values and a certain quality promise. Branding is of vital importance. It needs to be acknowledged that only 0.004% of all external stimuli reach human consciousness. Therefore, audio branding is a way to further strengthen overall brand awareness. This leads to an emotional connection with a brand.
This study strives to determine the characteristics of audio branding and to analyse the corporate audio branding of Audi. The result of this research study is the suggestion to use audio branding in a way that fits the overall brand picture. Otherwise, brand communication is inconsistent, and this could lead to customers misunderstanding the brand values. The analysis of the Audi corporate sound design might be beneficial for practitioners. The overall evaluation of the concept of audio branding contributes to the existing body of literature in branding.
The success of an autonomous robotic system is influenced by several interdependent factors that are not easily identifiable. This paper sets out to lay the foundation of a new integrated approach for examining all the parameters in depth and understanding their contribution to success. After introducing the problem, two cutting-edge autonomous systems for the process of unloading containers are presented. Then the STIC analysis, a recently developed method for modelling and interpreting all the parameters, is introduced. The preliminary results of applying this methodology to a first case study, based on one of the two systems available to the authors, are briefly presented. Finally, future research is recommended in order to prove that this methodology is the only way to efficiently and effectively mitigate the risk that stops potential users from investing in autonomous systems in the logistics sector.
Compared to the automotive sector, where automation is the rule, in many other less standardized sectors automation is still the exception. This could soon hurt the productivity of industrialized countries, where unemployment is low and the population is aging. Phenomena like the recent downturn in productivity, due to lockdowns and social distancing during the COVID-19 pandemic, only add to the problem. For these reasons, the relevance of, motivation for, and intention towards more automation in less standardized sectors have probably never been higher. However, available statistics show that providers and users of technologies struggle to bring more automation into action in automation-unfriendly sectors. In this paper, we present a decision support method for investment in automation that tackles the problem: the STIC analysis. The method takes a holistic and quantitative approach, tying together technological, context-related and economic input parameters and synthesizing them in a final economic indicator. Thanks to the modelling of such parameters, it is possible to gain insight into the technological and/or process adjustments that would have the highest impact on the efficiency of the automation, thereby delivering value for both technology users and technology providers.
Few unfocused factories outperform competitors, but focus is elusive because the environment is constantly evolving, which requires changes to a factory's key tasks. So how can focus be achieved and sustained? We present insights derived from a historical analysis of the German Hewlett-Packard server plant, which went through a series of focus changes over the years. Using this example, we provide clues for the right timing of focus changes and discuss critical structural and infrastructural changes required during the focus transitions, as well as cross-functional coordination and leadership challenges. Our assertion is that production operations constitute a system that can adapt to disruptive change by using the levers of manufacturing policies to stay focused on a limited but absolutely essential task that creates a strategic advantage.
The recently described rhizolutin and collinolactone, isolated from Streptomyces Gö 40/10, share the same novel carbon scaffold. Analyses by NMR and X-ray crystallography verify the structure of collinolactone and propose a revision of rhizolutin's stereochemistry. Isotope-labeled precursor feeding shows that collinolactone is biosynthesized via a type I polyketide synthase with Baeyer–Villiger oxidation. CRISPR-based genetic strategies led to the identification of the biosynthetic gene cluster and a high-production strain. Chemical semisyntheses yielded collinolactone analogues with inhibitory effects on the L929 cell line. Fluorescence microscopy revealed that only particular analogues induce monopolar spindles, impairing cell division in mitosis. Inspired by the Alzheimer-protective activity of rhizolutin, we investigated the neuroprotective effects of collinolactone and its analogues on glutamate-sensitive cells (HT22); indeed, natural collinolactone displays distinct neuroprotection against intracellular oxidative stress.
Purpose: This paper shows what sustainable fashion is and how it has developed in recent years. The paper also discusses which factors are important in order to be sustainable. Above all, it is about customers who show a lot of interest in sustainable fashion. Child labor, poor working conditions, poor quality and poisonous substances are strictly rejected by these consumers. Surprisingly, fashion companies that repeatedly hit the headlines for such practices are very successful. This is the sustainability oxymoron: the gap between how consumers act and what they want.
Findings: It is difficult to be sustainable. The reasons for this are consumption patterns, little transparency in textile chains, fast fashion and much more. It is almost impossible for a product to achieve 100 percent sustainability. On the one hand, consumers want sustainable products; on the other hand, they buy new and cheap clothes. It has become clear that they buy in a state of conflict.
The tale of 1000 cores: an evaluation of concurrency control on real(ly) large multi-socket hardware
(2020)
In this paper, we set out to revisit the results of “Staring into the Abyss [...] of Concurrency Control with [1000] Cores” and analyse in-memory DBMSs on today's large hardware. Contrary to the original assumption of the authors, today we do not see single-socket CPUs with 1000 cores. Instead, multi-socket hardware has made its way into production data centres. Hence, we follow up on this prior work with an evaluation of the characteristics of concurrency control schemes on real production multi-socket hardware with 1568 cores. To our surprise, we made several interesting findings, which we report on in this paper.
Purpose
This field study aims to investigate the interactive relationships of millennial employee’s gender, supervisor’s gender and country culture on the conflict-management strategies (CMS) in ten countries (USA, China, Turkey, Germany, Bangladesh, Portugal, Pakistan, Italy, Thailand and Hong Kong).
Design/methodology/approach
This exploratory study extends past research by examining the interactive effects of gender × supervisor's gender × country on the CMS within a single generation of workers, millennials. The Rahim Organizational Conflict Inventory–II, Form A was used to assess the use of the five CMS (integrating, obliging, dominating, avoiding and compromising). Data analysis found that the CMS used in the workplace are associated with the interaction of worker and supervisor genders and the national context of their work.
Findings
Data analysis (N = 2,801) was performed using the multivariate analysis of covariance with work experience as a covariate. The analysis provided support for the three-way interaction. This interaction suggests how one uses the CMS depends on self-gender, supervisor’s gender and the country where the parties live. Also, the covariate – work experience – was significantly associated with CMS.
Research limitations/implications
One of the limitations of this study is that the authors collected data from a collegiate sample of employed management students in ten countries. There are significant implications for leading global teams and training programs for mid-level millennials.
Practical implications
There are various conflict situations where one conflict strategy may be more appropriate than others. Organizations may have to change their policies for recruiting employees who are more effective in conflict management.
Social implications
Conflict management is important not only for managers but for all human beings. Individuals handle conflict every day, and it would be beneficial if they could handle it effectively and improve their gains.
Originality/value
To the best of the authors’ knowledge, no study has tested a three-way interaction of variables on CMS. This study has a wealth of information on CMS for global managers.
The time has come: application of artificial intelligence in small- and medium-sized enterprises
(2022)
Artificial intelligence (AI) is not yet widely used in small- and medium-sized industrial enterprises (SMEs). The reasons for this are manifold and range from poorly understood use cases and a lack of trained employees to insufficient data. This article presents a successful design-oriented case study at a medium-sized company where these reasons are present. In this study, future demand forecasts are generated from historical demand data for products at material-number level using a gradient boosting machine (GBM). An improvement of 15% over the status quo (measured by root mean squared error) could be achieved with rather simple techniques. The motivation, the method, and the first results are presented. In conclusion, challenges are addressed from which practitioners can derive lessons and impulses for their own projects.
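The forecasting setup described above can be sketched with a toy gradient boosting model. This is an illustrative reconstruction, not the study's actual data or model: the seasonal demand series is synthetic, the depth-1 trees (stumps) are hand-rolled, and the "status quo" is approximated by a naive last-month baseline.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic monthly demand with yearly seasonality plus noise (the study's
# real material-number-level demand data is not available here).
t = np.arange(320)
demand = 100 + 20 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 3, len(t))

# Lag features: previous month, two months back, and same month last year.
lags = (1, 2, 12)
X = np.column_stack([demand[12 - l: len(demand) - l] for l in lags])
y = demand[12:]

def fit_stump(X, r):
    """Best single split (feature, threshold, left/right mean) on residuals r."""
    best = (np.inf, 0, 0.0, r.mean(), r.mean())
    for j in range(X.shape[1]):
        for thr in np.quantile(X[:, j], np.linspace(0.05, 0.95, 20)):
            mask = X[:, j] <= thr
            if mask.all() or not mask.any():
                continue
            lv, rv = r[mask].mean(), r[~mask].mean()
            err = ((r[mask] - lv) ** 2).sum() + ((r[~mask] - rv) ** 2).sum()
            if err < best[0]:
                best = (err, j, thr, lv, rv)
    return best[1:]

def gbm_fit(X, y, n_rounds=100, lr=0.1):
    base = y.mean()
    pred = np.full(len(y), base)
    stumps = []
    for _ in range(n_rounds):
        j, thr, lv, rv = fit_stump(X, y - pred)   # fit each stump to residuals
        pred += lr * np.where(X[:, j] <= thr, lv, rv)
        stumps.append((j, thr, lv, rv))
    return base, lr, stumps

def gbm_predict(model, X):
    base, lr, stumps = model
    pred = np.full(len(X), base)
    for j, thr, lv, rv in stumps:
        pred += lr * np.where(X[:, j] <= thr, lv, rv)
    return pred

split = 250
model = gbm_fit(X[:split], y[:split])
rmse = lambda a, b: np.sqrt(np.mean((a - b) ** 2))
rmse_gbm = rmse(gbm_predict(model, X[split:]), y[split:])
rmse_naive = rmse(X[split:, 0], y[split:])   # "same as last month" baseline
print(f"naive RMSE {rmse_naive:.2f} vs GBM RMSE {rmse_gbm:.2f}")
```

The study reports a 15% RMSE improvement over its status quo; the toy model above merely shows the mechanics of comparing a GBM forecast against a naive baseline by RMSE.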
Induced by a societal decision to phase out conventional energy production, the so-called Energiewende (energy transition), the rise of distributed generation acts as a game changer within the German energy market. The share of electricity produced from renewable resources increased to 31.6% in 2015 (UBA, 2016), with a targeted share of renewable resources in the electricity mix of 55%-60% in 2035 (RAP, 2015), opening perspectives for new products and services. Moreover, the rapidly increasing degree of digitization enables innovative and disruptive business models in niches at the grid's edge that might be the winners of the future. It also stimulates the market entry of newcomers and competitors from other sectors, such as IT or telecommunications, challenging the incumbent utilities. For example, virtual and decentralized marketplaces for energy are emerging, a trend that is likely to be accelerated considerably by blockchain technology if the regulatory environment is adjusted accordingly. Consequently, the energy business is being turned upside down, with customers now at the wheel. For instance, more than one-third of the renewable production capacities are owned by private persons (Trendsearch, 2013). Therefore, the objective of this chapter is to examine private energy consumer and prosumer segments and their needs in order to derive business models for the various decentralized energy technologies and services. Subsequently, success factors for dealing with the changing market environment and consequences of the potentially disruptive developments for the market structure are evaluated.
The typed graph model
(2020)
In recent years, the Graph Model has become increasingly popular, especially in the application domain of social networks. The model has been semantically augmented with properties and labels attached to the graph elements. It is difficult to ensure data quality for the properties and the data structure because the model does not need a schema. In this paper, we propose a schema-bound Typed Graph Model with properties and labels. These enhancements improve not only data quality but also the quality of graph analysis. The power of this model comes from the use of hyper-nodes and hyper-edges, which allow data structures to be presented on different abstraction levels. We prove that the model is at least equivalent in expressive power to the most popular data models. Therefore, it can be used as a supermodel for model management and data integration. We illustrate by example the superiority of this model over the property graph data model of Hidders and other prevalent data models, namely the relational, object-oriented, XML model, and RDF Schema.
We investigated the state of artificial intelligence (AI) in pharmaceutical research and development (R&D) and outline here a risk and reward perspective regarding digital R&D. Given the novelty of the research area, a combined qualitative and quantitative research method was chosen, including the analysis of annual company reports, investor relations information, patent applications, and scientific publications of 21 pharmaceutical companies for the years 2014 to 2019. As a result, we can confirm that the industry is in an ‘early mature’ phase of using AI in R&D. Furthermore, we can demonstrate that, despite the efforts that need to be managed, recent developments in the industry indicate that it is worthwhile to invest to become a ‘digital pharma player’.
This paper investigates whether food-retailing mobile applications from Germany, Austria, the USA and the United Kingdom are meant to stay a marginal topic in grocery shopping, or whether they have the potential to significantly shape the future of grocery retailing by serving as competitive advantages that can fulfil customer requirements and satisfaction. It filters out success factors in the form of functions of grocery apps, and it extracts key competencies that can be used to create customer value. The Kano model can help select the right app functions. However, there are other prerequisites, like customers' general attitude towards technology and their acceptance of apps of any kind, that play an important role in the big picture of apps in grocery retailing. Nonetheless, this paper contributes one vital part of giving more importance to apps in grocery retailing, in the form of app functions that clearly deliver customer value. In short, apps that fit customers' needs and that provide usability and convenience clearly have the potential to shape the future of grocery retailing, if key barriers to app use are eliminated and if incentives are given that overcome scepticism.
Due to the rising need for palliative care in Russia, it is crucial to provide timely and high-quality solutions for patients, relatives, and caregivers. A methodology for remote monitoring of patients in need of palliative care will be developed, along with the requirements for a hardware-software complex for remote monitoring of patients' health at home.
The use of gamification in workplace learning to encourage employee motivation and engagement
(2019)
When we think about playing a game, be it a card game, board game, sport, or video game, we generally associate the act of playing with a positive experience like having fun, enjoying the interaction with others, or feeling a greater motivation to reach a certain goal. By contrast, workplace learning is often perceived as being dull. Employees are likely at some point in their career to find themselves stuck in a rigidly defined seminar for a long period of time or in front of their computer navigating through a mandatory e-learning course on a dry topic such as standards of business conduct or safety policies.
In recent years, organizations have tried to leverage the motivating quality of games for more serious learning contexts. Gamification entails transferring those elements and principles from games to nongaming contexts that improve user experience and engagement. In this chapter, we will specifically focus on the context of workplace learning.
The purpose of this paper is to study the recycling practice of reusing second hand clothing from a conventional fashion brand's perspective. It clarifies which measures and activities a fashion company needs to integrate into its value chain in order to offer branded second hand merchandise in a self-operated store. The research paper relies on desk-based research and aims to illustrate the topic by means of a descriptive approach, processing the existing literature. Key findings demonstrate that fashion brands need to integrate complete lifecycle strategies, sustainability communication, and reverse logistics structures, like take-back schemes, in order to offer second hand clothing. The main limitations stem from the research design. Further, empirical evidence needs to be gathered for a more fundamental understanding of the new business model.
Thematic issue on human-centred ambient intelligence: cognitive approaches, reasoning and learning
(2017)
This editorial presents advances in human-centred Ambient Intelligence applications which take cognitive issues into account when modelling users (e.g. stress, attention disorders), and which learn users' activities/preferences and adapt to them (e.g. at home, driving a car). These papers also show AmI applications in health and education, which makes them even more valuable for society at large.
Hypericin holds great potential in modern medicine and exhibits fascinating structural dynamics, such as multiple conformations and tautomerization. However, it is difficult to study individual conformers/tautomers, as they cannot be isolated due to the similarity of their chemical and physical properties. An approach to overcome this difficulty is to combine single-molecule experiments with theoretical studies. Time-dependent density functional theory (TD-DFT) calculations reveal that tautomerization of hypericin occurs via a two-step proton transfer with an energy barrier of 1.63 eV, whereas a direct single-step pathway has a large activation energy barrier of 2.42 eV. Tautomerization in hypericin is accompanied by reorientation of the transition dipole moment, which can be directly observed by fluorescence intensity fluctuations. Quantitative tautomerization residence times can be obtained from the autocorrelation of the temporal emission behavior, revealing that hypericin stays in the same tautomeric state for several seconds, which can be influenced by the embedding matrix. Replacing hydrogen with deuterium further proves that the underlying process is based on proton tunneling. In addition, the tautomerization rate can be influenced by a λ/2 Fabry–Pérot microcavity, where the occupation of Raman-active vibrations can alter the tunneling rate.
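The step from the temporal emission behavior to quantitative residence times can be illustrated with a small simulation: a two-state (tautomer) intensity trace whose fluctuation autocorrelation decays with the switching kinetics. All rates and intensity levels below are made-up illustrative values, not measured data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a two-state emission trace: the molecule hops between two
# tautomers with different mean brightness (hypothetical values).
dt = 0.01            # time bin [s] (assumption)
n = 200_000          # 2000 s trace
rate = 0.2           # hop rate per state [1/s] -> mean residence ~5 s (assumption)
switches = rng.random(n) < rate * dt
state = np.cumsum(switches) % 2
intensity = np.where(state == 0, 100.0, 60.0) + rng.normal(0.0, 5.0, n)

def autocorr(x, max_lag):
    """Normalized fluctuation autocorrelation G(k) for lags 0..max_lag-1."""
    x = x - x.mean()
    var = np.dot(x, x) / len(x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / ((len(x) - k) * var)
                     for k in range(max_lag)])

# For a symmetric two-state (telegraph) process the correlation decays as
# exp(-2*rate*tau), i.e. with time constant 1/(2*rate) = 2.5 s here.
lags = 500
g = autocorr(intensity, lags)
tau = np.arange(lags) * dt
tau_c = tau[np.argmax(g < np.exp(-1))]   # crude 1/e decay-time estimate
print(f"correlation decay time ~ {tau_c:.2f} s")
```

Fitting the decay (rather than reading off the 1/e crossing) would give the switching rate directly; the point is only that residence times of seconds leave a clear signature in the emission autocorrelation.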
Theoretical foundation, effectiveness, and design artefact for machine learning service repositories
(2022)
Machine learning (ML) has played an important role in research in recent years. For companies that want to use ML, finding the algorithms and models that fit their business is tedious. A review of the available literature on this problem reveals only a few research papers. Given this gap, the aim of this paper is to design an effective and easy-to-use ML service repository. The corresponding research is based on a multi-vocal literature analysis combined with design science research, addressing three research questions: (1) How is the current white and gray literature on ML services structured with respect to repositories? (2) Which features are relevant for an effective ML service repository? (3) How can a prototype for an effective ML service repository be conceptualized? The findings are relevant for explaining user acceptance of ML repositories, which is essential for corporate practice in order to create and use ML repositories effectively.
Theory and practice of implementing a successful enterprise IoT strategy in the industry 4.0 era
(2021)
Since the arrival of the internet and affordable access to technologies, digital technologies have occupied a growing place in industries, propelling us towards a fourth industrial revolution: Industry 4.0. In today's era of digital upheaval, enterprises are increasingly undergoing transformations that lead to their digitalization. The traditional manufacturing industry is in the throes of a digital transformation that is accelerated by exponentially growing technologies (e.g., intelligent robots, the Internet of Things, sensors, 3D printing). Around the world, enterprises are in a frantic race to implement IoT-based solutions in order to improve their productivity and innovation, reduce costs, and improve their position in international markets. Considering the immense transformative potential that IoT and big data bring to the industrial sector, adopting IoT across industrial systems is a challenge that must be met to remain competitive and thus transform the factory into a smart factory. This paper presents a description of the innovation and digitalization process, following the Industry 4.0 paradigm, for implementing a successful enterprise IoT strategy.
This paper investigates the electrothermal stability and the predominant defect mechanism of a Schottky gate AlGaN/GaN HEMT. Calibrated 3-D electrothermal simulations are performed using a simple semiempirical dc model, which is verified against high-temperature measurements up to 440°C. To determine the thermal limits of the safe operating area, measurements up to destruction are conducted at different operating points. The predominant failure mechanism is identified to be hot-spot formation and subsequent thermal runaway, induced by large drain–gate leakage currents that occur at high temperatures. The simulation results and the high temperature measurements confirm the observed failure patterns.
In microelectronics, chips are frequently packaged in mold packages. The electrical connections from the chip to the package leads are realized with bond wires. For computing the equilibrium temperature in a bond wire at constant current, as well as temperature profiles under transient currents, the conventional FEM method is slow and unwieldy. The Bondrechner (bond-wire calculator) was therefore developed, which represents a cylindrically symmetric substitute model of the package in suitable mathematical equations.
In contrast to the first-generation Bondrechner [1], which is based on the equations of [2], a new mathematical approach makes it possible to take into account a finite effective package size as well as a finite heat transfer between bond wire and mold compound. The calculation of the interaction of several neighbouring wires has also been refined, and the calculation of arbitrary transient pulse shapes of medium length has been improved. A quadratic component in the temperature dependence of the specific resistance of the wire material can now also be taken into account.
The results were successfully compared with FEM calculations, and the computation is orders of magnitude faster than with commercial FEM programs.
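The kind of calculation the Bondrechner replaces can be sketched with a generic one-dimensional fin-equation model of a bond wire: Joule heating balanced by axial conduction to the pads and lateral heat loss into the mold compound. This is not the Bondrechner's actual model, and all material and geometry values are illustrative assumptions.

```python
import numpy as np

# Steady-state fin equation for a bond wire at constant current:
#   k*A*T'' - h*P*(T - T_amb) + I^2*rho/A = 0,  T(0) = T(L) = T_amb
# solved by finite differences. All values below are illustrative.
d = 25e-6                     # wire diameter [m]
L = 2e-3                      # wire length [m]
A = np.pi * d**2 / 4          # cross-section [m^2]
P = np.pi * d                 # perimeter [m]
k = 320.0                     # thermal conductivity (gold-like) [W/m/K]
rho = 2.3e-8                  # electrical resistivity [Ohm*m], assumed constant
h = 2000.0                    # effective wire-to-mold heat transfer [W/m^2/K]
T_amb = 25.0                  # pad/mold temperature [deg C]
I = 0.5                       # current [A]

n = 201
x = np.linspace(0.0, L, n)
dx = x[1] - x[0]

# Tridiagonal system for the interior nodes, Dirichlet ends clamped to T_amb.
main = -2.0 * k * A / dx**2 - h * P
off = k * A / dx**2
M = (np.diag(np.full(n, main))
     + np.diag(np.full(n - 1, off), 1)
     + np.diag(np.full(n - 1, off), -1))
b = np.full(n, -h * P * T_amb - I**2 * rho / A)
M[0, :] = 0.0;  M[0, 0] = 1.0;   b[0] = T_amb
M[-1, :] = 0.0; M[-1, -1] = 1.0; b[-1] = T_amb

T = np.linalg.solve(M, b)
print(f"peak wire temperature ~ {T.max():.1f} deg C at mid-span")
```

Even this crude model shows why a closed-form substitute model is attractive: the full problem only gets harder once package size, wire-to-wire interaction, transient pulses, and temperature-dependent resistivity are added, which is exactly what the Bondrechner handles analytically.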
In the course of more intensive energy generation from regenerative sources, an increased number of energy storage systems is required. In addition to the widespread means of storing electric energy, storing energy thermally can contribute significantly. However, limited research exists on the behaviour of thermal energy storages (TES) in practical operation. While the physical processes are well known, it is nevertheless often not possible to adequately evaluate performance with respect to the quality of thermal stratification inside the tank, which is crucial for the thermodynamic effectiveness of the TES. The behaviour of a TES is experimentally investigated in cyclic charging and discharging operation in interaction with a cogeneration (CHP) unit at a test rig in the lab. From the measurements, the quality of thermal stratification is evaluated under varying conditions using different metrics such as the normalised stratification factor, modified MIX number, exergy number and exergy efficiency, which extends the state of the art for CHP applications. The results show that the positioning of the temperature sensors for turning the CHP unit on and off has a significant influence on both the effective capacity of a TES and the quality of thermal stratification inside the tank. It is also revealed that positioning at least one of these sensors outside the storage tank, i.e. in the return line to the CHP unit, prevents deterioration of thermal stratification, thereby enhancing thermodynamic effectiveness. Furthermore, the effects of thermal load and thermal load profile on effective capacity and thermal stratification are discussed, even though these are much smaller compared to the effect of positioning the temperature sensors.
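One of the stratification metrics named above, the MIX number, can be sketched in its common textbook form (the paper's modified MIX number may differ): it compares the height-weighted energy moment of the measured temperature profile with perfectly stratified and fully mixed profiles of the same energy content, giving 0 for perfect stratification and 1 for a fully mixed tank. The tank data below are illustrative.

```python
import numpy as np

def mix_number(heights, temps, T_hot, T_cold):
    """MIX number for equal-volume tank nodes.

    heights: node-centre heights [m], ascending; temps: node temperatures.
    0 = perfectly stratified, 1 = fully mixed (textbook form; assumption).
    """
    heights = np.asarray(heights, float)
    temps = np.asarray(temps, float)
    n = len(temps)
    m_exp = np.sum(heights * temps)          # measured energy moment
    m_mix = np.sum(heights) * temps.mean()   # fully mixed, same energy
    # Perfectly stratified profile with the same energy: hot layer on top.
    n_hot = (temps.mean() - T_cold) / (T_hot - T_cold) * n
    strat = np.full(n, T_cold)
    full = int(n_hot)
    if full:
        strat[n - full:] = T_hot
    if full < n:
        # partially hot node at the layer boundary
        strat[n - full - 1] = T_cold + (n_hot - full) * (T_hot - T_cold)
    m_str = np.sum(heights * strat)
    return (m_str - m_exp) / (m_str - m_mix)

# Ten equal-volume nodes in a 1 m tank: a stratified and a mixed profile
heights = np.linspace(0.05, 0.95, 10)
mix_strat = mix_number(heights, [20] * 5 + [60] * 5, T_hot=60, T_cold=20)
mix_mixed = mix_number(heights, [40] * 10, T_hot=60, T_cold=20)
print(mix_strat, mix_mixed)   # ~0 for the stratified tank, ~1 for the mixed one
```

Tracking such a metric over a charging/discharging cycle is what makes the effect of sensor positioning on stratification quantifiable.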
The data presented in this article characterize the thermomechanical and microhardness properties of a novel melamine-formaldehyde (MF) resin intended for use as a self-healing surface coating. The investigated MF resin is able to undergo reversible crosslinking via Diels-Alder reactive groups. The microhardness data were obtained from nanoindentation measurements performed on solid resin film samples at different stages of the self-healing cycle. Thermomechanical analysis was performed under dynamic load conditions. The data provide supplemental material to the manuscript published by Urdl et al. 2020 (https://doi.org/10.1016/j.eurpolymj.2020.109601) on the self-healing performance of this resin, where a more thorough discussion of the preparation and properties of this coating material and its application in impregnated paper-based decorative laminates can be found.
For large-scale processes as implemented in organizations that develop software in regulated domains, comprehensive software process models are implemented, e.g., to meet compliance requirements. Creating and evolving such processes is demanding and requires software engineers with substantial modeling skills to create consistent and certifiable processes. While teaching process engineering to students, we observed issues in providing and explaining models. In this paper, we present an exploratory study in which we aim to shed light on the challenges students face when it comes to modeling. Our findings show that students are capable of doing basic modeling tasks yet fail to utilize models correctly. We conclude that the required skills, notably abstraction and solution development, are underdeveloped due to missing practice and routine. Since modeling is key to many software engineering disciplines, we advocate intensifying modeling activities in teaching.
To generate greater value faster from digital innovation, many companies are increasing how much they learn from their own innovation efforts. However, in many companies, these changes are limited to one stakeholder group: innovation teams. Two other stakeholder groups, senior executives and experts from corporate functions, also need to learn from digital innovation initiatives. We have defined three learning imperatives that address a company’s needs to learn continually about building (1) a successful innovation, (2) a portfolio of initiatives that realizes strategic objectives faster, and (3) shared resources that propel multiple initiatives. All three imperatives involve collecting data regularly from digital innovation initiatives. In this research briefing we outline the three learning imperatives and provide examples of how companies are pursuing them to achieve strategic objectives more effectively and efficiently.
This study introduces a straightforward approach to construct three-dimensional (3D) surface-enhanced Raman spectroscopy (SERS) substrates using chemically modified silica particles as microcarriers and by attaching metal nanoparticles (NPs) onto their surfaces. Tollens’ reagent and sputtering techniques are utilized to prepare the SERS substrates from mercapto-functionalized silica particles. Treatment with Tollens’ reagent generates a variety of silver NPs, ranging from approximately 10 to 400 nm, while sputtering with gold (Au) yields uniformly distributed NPs with an island-like morphology. Both substrates display wide plasmon resonances in the scattering spectra, making them effective for SERS in the visible spectral range, with enhancement factors (ratio of the analyte’s intensity at the hotspot compared to that on the substrate in the absence of metal nanoparticles) of up to 25. These 3D substrates have a significant advantage over traditional SERS substrates because their active surface area is not limited to a 2D surface but offers a much greater active surface due to the 3D arrangement of the NPs. This feature may enable achieving much higher SERS intensity from within streaming liquids or inside cells/tissues.
Instead of waiting for and constantly adapting to details of political interventions, utilities need to focus on their environment from a holistic perspective. The unique position of the company - be it a local utility, a bigger player, or an international utility specializing in specific segments - has to be the basis of goals and strategies. But without consistent translation of these goals and strategies into processes, structures, and company culture, a strategy remains pure theory. Companies need to engage in a continuing learning process. This means being willing to pass on strategies, to slow down or speed up, to work from a different angle, etc.
Advancements in Internet of Things (IoT), cloud and mobile computing have fostered the digital enrichment—or “digitization”—of physical products, which are gaining increasing relevance in practice. According to recent studies, global IoT spending will exceed USD 1 Trillion by 2021 and there will be over 25 billion IoT connections (KPMG, 2018). Porter and Heppelmann (2014) state that IT is “revolutionizing products [as …] IT is becoming an integral part of the product itself.” Senior business executives like GE’s former CEO Jeff Immelt (2015) are even proposing that “every industrial company in the coming age is also going to become a software and analytics company.” This reflects the increasing relevance of IT components’ (i.e., software, data analytics, cloud computing) integration into previously purely physical products. We call IT-enriched physical products, “digitized” products to differentiate them from purely intangible “digital” products, such as digital music, e-books, and software. Examples of digitized products include the Philips Hue smartphone-controllable lightbulb, Audi Connect internet-connected cars, or Rolls-Royce’s sensor-enabled pay per use jet engines.
Digitized products provide their producers with a wide range of opportunities to offer new functionality and product capabilities (e.g., autonomy) that traditional, physical products do not exhibit (Porter and Heppelmann, 2014). In addition, the digitization of products allows producers to continuously repurpose their offerings, by extending and/or changing the product functionality and, thus, enabling new value creation opportunities. Based on their re-programmability and connectivity, digitized products “remain essentially incomplete […] throughout their lifetime as users continue to add and delete […] and change […] functional capabilities” (Yoo, 2013). For instance, the Philips Hue connected lightbulb enables remote control of basic functions (e.g., switching on and off the light) as well as setting more advanced light scenes for day-to-day tasks (e.g., relax, read) via Amazon’s Alexa artificial intelligence assistant (Signify, 2019), offerings that were not intended use cases when Signify (previously known as Philips Lighting) created Hue in 2012. Thus, digitized products present limitless potentials for new functionality and unforeseen use cases, which provides them with a huge innovation capacity.
Despite the limitless potentials offered by digitized products, there has been a slow uptake of digitized products by businesses so far (Jernigan et al., 2016; Mocker et al., 2019). According to a 2016 MIT Sloan Management Review report (Jernigan et al., 2016), only 24% of the investigated firms were actively using IoT technologies - a key technology for digitized products. In a more recent study, Mocker et al. (2019) found that the median revenue share from digital offerings (i.e., solutions based on IT-enriched products) in large companies accounted for only 5% of the investigated companies' total revenue.
The slow uptake of digitized products might be explained by the challenges that firms face regarding the changing nature of digitized products. Pervasive digital technologies (such as IoT) change the nature of products by adding new functionality that was previously not part of the value proposition of the products/services (e.g., a pair of shoes embedded with sensors and connectivity allows joggers to have access to data regarding their run distance, speed, etc.) (Yoo et al., 2012). The addition of new functionality and use cases of digitized products makes it harder for producers to design and develop relevant products (Hui 2014). As described in the paper ‘Do Your Customers Actually Want a “Smart” Version of Your Product?’, “just because [firms] can make something with IoT technology doesn’t mean people will want it.” (Smith, 2017).
The shift in digitized products' nature poses new challenges for producers along the entire product development process (Porter and Heppelmann, 2015; Yoo et al., 2012) and creates a paradox in product digitization, described by Yoo et al. (2012) as the paradox of pace: while technology accelerates the rate of innovation, companies need to spend more time digitizing their products, extending time to market. The production of these digitized products also becomes more challenging, e.g., as companies need to deal with the different clock-speeds of software and hardware development (Porter and Heppelmann, 2015). The above-mentioned challenges suggest that producers need to better understand how they can generate value from their digitized products' generative potentials.
The body of literature on digitized products has been growing in recent years. For instance, Herterich et al. (2016) investigate how digitized product affordances (i.e., potentials) enable industrial service innovation; Nicolescu et al. (2018) explore the emerging meanings of value associated with IoT; and Benbunan-Fich (2019) studies the impact of basic wearable sensors on the quality of the user experience. However, it remains unclear what it takes for firms to generate value with their digitized product potentials. This dissertation investigates this research gap.
The increase in distributed energy generation, such as photovoltaic systems (PV) or combined heat and power plants (CHP), poses new challenges to almost every distribution network operator (DNO). In the low-voltage (LV) grids, where installed PV capacity approaches the magnitude of household load, reverse power flow occurs at the secondary substations. High PV penetration leads to voltage rise, flicker and loading problems. These problems have been addressed by the application of various techniques, amongst which is the deployment of step voltage regulators (SVR). SVR can solve the voltage problem, but do not prevent or reduce reverse power flows. Therefore, the application of SVR in low-voltage grids can result in significant power losses upstream. In this paper we present part of a research project investigating the application of remote-controlled cable cabinets (CC) with metering units in a low-voltage network as a possible alternative to SVR. A new generation of custom-made remote-controlled cable cabinets has been deployed and dynamic network reconfigurations (NR) have been realized with the following objectives: (i) reduction of reverse power flow through the secondary substation to the upstream network and therefore a reduction of upstream losses, (ii) reduction of the voltage rise caused by distributed energy resources and (iii) load balancing in the low-voltage grid. Secondary objectives are to improve the DNO's insight into the state of the network and to provide further information on future smart grid integration.
Standardisation of breath sampling is important for the application of breath analysis in clinical settings. By studying the effect of room airing on indoor and breath analytes and by generating time series of room air with different sampling intervals, we sought to gain further insights into room air metabolism, to detect the relevance of exogenous VOCs and to draw conclusions about their consideration in the interpretation of exhaled breath. Room air and exhaled breath of a healthy subject were analysed before and after room airing. Furthermore, a time series of room air with doors and windows closed was taken over 84 h by automatic sampling every 180 min. A second time series studied room air analytes over 70 h with samples taken every 16.5 min. For breath and room air measurements, an IMS coupled to a multi-capillary column (IMS/MCC) [Bio-Scout® - B&S Analytik GmbH, Dortmund, Germany] was used. The peaks were characterized using the software Visual Now (B&S Analytik, Dortmund, Germany) and identified using the software package MIMA (version 1.1, provided by the Max Planck Institute for Informatics, Saarbrücken, Germany) and the database 20160426_SubstanzDbNIST_122 (B&S Analytik GmbH, Dortmund, Germany). In the morning, 4 analytes (Decamethylcyclopentasiloxane [541-02-6]; Pentan-2-one [107-87-9] – Dimer; Hexan-1-al [66-25-1]; Pentan-2-one [107-87-9] – Monomer) showed high intensities in the room air and exhaled breath. They were significantly but not equally reduced by room airing. The 84 h time series showed a time-dependent decrease of some analytes (limonene monomer and dimer; Decamethylcyclopentasiloxane; Butan-1-ol) as well as an increase of others (Pentan-2-one [107-87-9] – Dimer). Shorter sampling intervals exhibited circadian variations of analyte concentrations for many analytes. Breath sampling in the morning requires room airing beforehand; then the variation of the intensity of indoor analytes can be kept small.
The time series of indoor analytes show that their intensities behave differently, with time-dependent declines, constant increases and circadian variations, depending on room airing. This has implications for the breath sampling procedure and the interpretation of exhaled breath.
When forecasting sales figures, not only the sales history but also the future price of a product influences the sales quantity. At first sight, multivariate time series seem to be the appropriate model for this task. Nonetheless, in real life history is not always repeatable, i.e., in the case of sales history there is only one price for a product at a given time. This complicates the design of a multivariate time series. However, for some seasonal or perishable products the price is rather a function of the expiration date than of the sales history. This additional information can help to design a more accurate and causal time series model. The proposed solution uses a univariate time series model but takes the price of a product as a parameter that systematically influences the prediction based on a calculated periodicity. The price influence is computed from historical sales data using correlation analysis and adjustable price ranges to identify products with comparable history. The periodicity is calculated with a novel approach based on data folding and Pearson correlation. Compared to other techniques, this approach is easy to compute and allows presetting the price parameter for predictions and simulations. Tests with data from the Data Mining Cup 2012 as well as artificial data demonstrate better results than established sophisticated time series methods.
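The folding-and-correlation idea mentioned in the abstract above could be sketched along the following lines. This is a minimal illustration of the general technique, not the authors' actual implementation; the function name, scoring scheme, and parameters are assumptions:

```python
import numpy as np

def detect_periodicity(series, max_period):
    """Estimate a dominant period of a 1-D series by data folding:
    for each candidate period p, fold the series into segments of
    length p and score how well each segment matches the mean
    folded profile via Pearson correlation."""
    series = np.asarray(series, dtype=float)
    best_p, best_score = None, -np.inf
    for p in range(2, max_period + 1):
        n_seg = len(series) // p
        if n_seg < 2:  # need at least two segments to compare
            break
        segs = series[:n_seg * p].reshape(n_seg, p)
        profile = segs.mean(axis=0)  # average folded cycle
        with np.errstate(invalid="ignore", divide="ignore"):
            # correlation of every segment with the mean profile;
            # zero-variance folds yield NaN and are skipped below
            scores = [np.corrcoef(s, profile)[0, 1] for s in segs]
        score = float(np.mean(scores))
        if score > best_score:  # NaN never passes this test
            best_p, best_score = p, score
    return best_p, best_score
```

For a noise-free signal the best candidate (or one of its multiples) reaches a correlation score close to 1; on real sales data, one would fold the deseasonalized demand series instead.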
Successful transitions to a sustainable bioeconomy require novel technologies, processes, and practices as well as a general agreement about the overarching normative direction of innovation. Both requirements necessarily involve collective action by those individuals who purchase, use, and co-produce novelties: the consumers. Based on theoretical considerations borrowed from evolutionary innovation economics and consumer social responsibility, we explore to what extent consumers’ scope of action is addressed in the scientific bioeconomy literature. We do so by systematically reviewing bioeconomy-related publications according to (i) the extent to which consumers are regarded as passive vs. active, and (ii) different domains of consumer responsibility (depending on their power to influence economic processes). We find all aspects of active consumption considered to varying degrees but observe little interconnection between domains. In sum, our paper contributes to the bioeconomy literature by developing a novel coding scheme that allows us to pinpoint different aspects of consumer activity, which have been considered in a rather isolated and undifferentiated manner. Combined with our theoretical considerations, the results of our review reveal a central research gap which should be taken up in future empirical and conceptual bioeconomy research. The system-spanning nature of a sustainable bioeconomy demands an equally holistic exploration of the consumers’ prospective and shared responsibility for contributing to its coming of age, ranging from the procurement of information on bio-based products and services to their disposal.
This paper is a review of the book "Stress Test: Reflections on Financial Crises" by Timothy Geithner. The book mainly discusses Geithner's policy decisions and their implications during his tenure as New York Fed president and as US Treasury secretary under President Obama. The book reveals some hidden information about the decision-making process in both institutions, but it lacks a scientific foundation that would explain the financial crisis in more detail. Hence, I think the book is less convincing than publicly recognized. No doubt, Geithner's crisis response deserves appreciation, especially the "stress test". However, the book does not demonstrate that this response is sustainable in the long run and scientifically sound. Consequently, it is more a book on public policy and governance than on economics.
Titanium(IV) surface complexes bearing chelating catecholato ligands for enhanced band-gap reduction
(2023)
Protonolysis reactions between dimethylamido titanium(IV) catecholate [Ti(CAT)(NMe2)2]2 and neopentanol or tris(tert-butoxy)silanol gave catecholato-bridged dimers [(Ti(CAT)(OCH2tBu)2)(HNMe2)]2 and [Ti(CAT){OSi(OtBu)3}2(HNMe2)2]2, respectively. Analogous reactions using the dimeric dimethylamido titanium(IV) (3,6-di-tert-butyl)catecholate [Ti(CATtBu2-3,6)(NMe2)2]2 yielded the monomeric Ti(CATtBu2-3,6)(OCH2tBu)2(HNMe2)2 and Ti(CATtBu2-3,6)[OSi(OtBu)3]2(HNMe2)2. The neopentoxide complex Ti(CATtBu2-3,6)(OCH2tBu)2(HNMe2)2 engaged in further protonolysis reactions with Si–OH groups and was consequently used for grafting onto mesoporous silica KIT-6. Upon immobilization, the surface complex [Ti(CATtBu2-3,6)(OCH2tBu)2(HNMe2)2]@[KIT-6] retained the bidentate chelating geometry of the catecholato ligand. This convergent grafting strategy was compared with a sequential and an aqueous approach, which gave either a mixture of bidentate chelating species with a bipodally anchored Ti(IV) center along with other physisorbed surface species, or surface species that could not be clearly identified. Extension of the convergent and aqueous approaches to anatase mesoporous titania (m-TiO2) enabled optical and electronic investigations of the corresponding surface species, revealing that the band-gap reduction is more pronounced for the bidentate chelating species (convergent approach) than for that obtained via the aqueous approach. The applied methods include X-ray photoelectron spectroscopy, ultraviolet photoelectron spectroscopy, and solid-state UV/vis spectroscopy. The energy-level alignment for the surface species from the aqueous approach, calculated from experimental data, accounts for the well-known type II excitation mechanism, whereas the findings indicate a distinct excitation mechanism for the bidentate chelating surface species of the material [Ti(CATtBu2-3,6)(OCH2tBu)2(HNMe2)2]@[m-TiO2].
Distributed ledger technologies such as the blockchain technology offer an innovative solution to increase visibility and security to reduce supply chain risks. This paper proposes a solution to increase the transparency and auditability of manufactured products in collaborative networks by adopting smart contract-based virtual identities. Compared with existing approaches, this extended smart contract-based solution offers manufacturing networks the possibility of involving privacy, content updating, and portability approaches to smart contracts. As a result, the solution is suitable for the dynamic administration of complex supply chains.
Enterprise Architectures (EA) consist of a multitude of architecture elements, which relate to each other in manifold ways. As a change to a single element impacts various other elements, mechanisms for architecture analysis are important to stakeholders. The high number of relationships aggravates architecture analysis and makes it a complex yet important task. In practice, EAs are often analyzed using visualizations. This article contributes to the field of visual analytics in enterprise architecture management (EAM) by reviewing how state-of-the-art software platforms in EAM support stakeholders in providing and visualizing the “right” information for decision-making tasks. In a research study, we investigate the collaborative decision-making process in an experiment with master students using professional EAM tools. We evaluate the students’ findings by comparing them with the experience of an enterprise architect.
Mature economies which are driven mainly by small and medium-sized enterprises (SMEs) are increasingly becoming dependent on material imports. Global material consumption is ever increasing, mainly driven by population growth. Decoupling material consumption from economic growth is one of the greatest challenges of the 21st century. Within this paper, available methods for the assessment of material efficiency on different economic scales are investigated, and those particularly suitable for use in SMEs are identified. Recommendations for further improvements of the selected tools and an outlook concerning planned research activities in the field of material efficiency in enterprises, supply chains and circular economy aspects are given.
Automated analysis of review data deals with the possibilities of analysing free text and extracting relevant information from it. The work engages with methods of unsupervised learning, with topic modelling at its centre. Techniques well known in text-based information retrieval are examined: Latent Semantic Indexing (LSI), probabilistic LSI (pLSI) and Latent Dirichlet Allocation (LDA) are explained and compared. The work shows how LDA was used to obtain a content overview of a corpus of one million reviews and to examine it at a finer level of detail. The topic-based analysis is used to generate insights for an opinion mining system, which will perform a deeper analysis. The entire process is designed to be fully automated and unsupervised.
In this work, a comparison between different brushless harmonic-excited wound-rotor synchronous machines is performed. The general idea of all topologies is the elimination of the slip rings and auxiliary windings by using the already existing stator and rotor winding for field excitation. This is achieved by injecting a harmonic airgap field with the help of power electronics. This harmonic field does not interact with the fundamental field, it just transfers the excitation power across the airgap. Alternative methods with varying number of phases, different pole-pair combinations, and winding layouts are covered and compared with a detailed Finite-Element-parameterized model. Parasitic effects due to saturation and coupling between the harmonic and main windings are considered.
Container virtualization has evolved into a key technology for deployment automation in line with the DevOps paradigm. Whereas container management systems facilitate the deployment of cloud applications by employing container-based artifacts, parts of the deployment logic have already been applied beforehand to build these artifacts. Current approaches do not integrate these two deployment phases in a comprehensive manner. Limited knowledge of the application software and middleware encapsulated in container-based artifacts leads to maintainability and configuration issues. Besides, the deployment of cloud applications is based on custom orchestration solutions, leading to lock-in problems. In this paper, we propose a two-phase deployment method based on the TOSCA standard. We present integration concepts for TOSCA-based orchestration and deployment automation using container-based artifacts. Our two-phase deployment method enables capturing and aligning all the deployment logic related to a software release, leading to better maintainability. Furthermore, we build a container management system, composed of a TOSCA-based orchestrator on Apache Mesos, to deploy container-based cloud applications automatically.
There are several intra-operative use cases that require the surgeon to interact with medical devices. We used the Leap Motion Controller as input device and implemented two use cases: 2D interaction (e.g. advancing EPR data) and selection of a value (e.g. room illumination brightness). The gesture detection was successful, and we mapped its output to several devices and systems.