In this paper, we deal with optimizing the monetary costs of executing parallel applications in cloud-based environments. Specifically, we investigate how the scalability characteristics of parallel applications impact the total costs of computation. We focus on a specific class of irregularly structured problems whose scalability typically depends on the input data. Consequently, dynamic optimization methods are required to minimize the costs of computation. To quantify the total monetary costs of individual parallel computations, the paper presents a cost model that considers the costs of the parallel infrastructure employed as well as the costs caused by delayed results. We discuss a method for dynamically finding the number of processors for which the total costs under our cost model are minimal. Our extensive experimental evaluation gives detailed insights into the performance characteristics of our approach.
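The trade-off such a cost model captures can be sketched in a few lines. All parameters and function names below are made up for illustration and are not taken from the paper; the sketch assumes Amdahl-style scaling, per-processor-hour billing, and a linear penalty for delayed results:

```python
# Hypothetical sketch of a total-cost model for one parallel run.
# Assumptions (not from the paper): runtime follows Amdahl's law with
# serial fraction s; infrastructure is billed per processor-hour;
# delayed results incur a linear penalty per hour of runtime.

def runtime_hours(p, t1=100.0, s=0.1):
    """Runtime on p processors (Amdahl's law); t1 = single-processor hours."""
    return t1 * (s + (1 - s) / p)

def total_cost(p, price_per_cpu_hour=0.5, delay_penalty_per_hour=20.0):
    t = runtime_hours(p)
    infrastructure = p * t * price_per_cpu_hour  # pay for p CPUs while running
    delay = t * delay_penalty_per_hour           # cost of waiting for results
    return infrastructure + delay

def best_processor_count(candidates=range(1, 129)):
    """Scan candidate processor counts for the cost minimum."""
    return min(candidates, key=total_cost)
```

Adding processors first lowers the delay penalty faster than it raises infrastructure costs, then the relation flips; the scan finds that break-even point, which a dynamic method would re-evaluate as input-dependent scalability becomes known.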
Background
The central task of electrocardiographic examination is to increase the reliability of diagnosing the condition of the heart. Within this task, an important direction is the solution of the inverse problem of electrocardiography, based on processing the electrocardiographic signals of multichannel cardio leads with known electrode coordinates in these leads (Titomir et al., Noninvasive Electrocardiotopography, 2003; Macfarlane et al., Comprehensive Electrocardiology, 2nd ed., Chapter 9, 2011).
Results
In order to obtain more detailed information about the electrical activity of the heart, we carry out a reconstruction of the distribution of equivalent electrical sources on the heart surface. In this area, we perform the reconstruction of the equivalent sources during the cardiac cycle at relatively low hardware cost. ECG maps of electrical potentials on the surface of the torso (TSPM) and of electrical sources on the surface of the heart (HSSM) were studied for different times of the cardiac cycle. We carried out a visual and quantitative comparison of these maps in the presence of pathological regions of different localization. For this purpose we used a model of the heart's electrical activity based on cellular automata.
Conclusions
The cellular automata model allows us to consider the processes of heart excitation in the presence of pathological regions of various sizes and localizations. It is shown that changes in the distribution of electrical sources on the surface of the epicardium in the presence of pathological areas with disturbed conduction of heart excitation are much more noticeable than changes in ECG maps on the torso surface.
The use of Wireless Sensor and Actuator Networks (WSAN) as an enabling technology for Cyber-Physical Systems has increased significantly in the recent past. The challenges that arise in different application areas of Cyber-Physical Systems in general, and in WSAN in particular, are getting the attention of both academia and industry. Since reliable message delivery in wireless communication is of critical importance for certain safety-related applications, it is one of the areas that has received significant focus in the research community. Additionally, the diverse needs of different applications put different demands on the lower layers of the protocol stack, necessitating mechanisms in the lower layers that enable them to adapt dynamically. Another major issue in the realization of networked, wirelessly communicating cyber-physical systems in general, and WSAN in particular, is the lack of approaches that tackle the reliability, configurability, and application-awareness issues together. One could consider tackling these issues in isolation. However, the interplay between them creates challenges that force application developers to spend more time on meeting those challenges, often in suboptimal ways, than on solving the problems of the application being developed. Starting from some fundamental concepts, general issues, and problems in cyber-physical systems, this chapter discusses issues such as energy efficiency, application-awareness, and channel-awareness for networked, wirelessly communicating cyber-physical systems. Additionally, the chapter describes a middleware approach called CEACH, an acronym for Configurable, Energy-efficient, Application- and Channel-aware Clustering-based middleware service for cyber-physical systems.
The state of the art in the area of cyber-physical systems, with a special focus on communication reliability, configurability, application-awareness, and channel-awareness, is described in the chapter. The chapter also describes how these features have been considered in the CEACH approach. Important node-level and network-level characteristics and their significance for the design of applications for cyber-physical systems are also discussed, as is the issue of adaptively controlling the impact of these factors with respect to application demands and network conditions. The chapter also includes a description of Fuzzy-CEACH, an extension of the CEACH middleware service that uses fuzzy logic principles. The fuzzy descriptors used in the different stages of Fuzzy-CEACH are described, and the fuzzy inference engine used in the Fuzzy-CEACH cluster head election process is described in detail. The rule bases used by the fuzzy inference engine in the different stages of Fuzzy-CEACH are also included to give an insightful description of the protocol. The chapter further discusses in detail the experimental results validating the concepts presented in the CEACH approach, as well as the applicability of the CEACH middleware service in different application scenarios in the domain of cyber-physical systems. The chapter concludes by shedding light on publish-subscribe mechanisms in distributed event-based systems and showing how, if the WSAN is modeled as a distributed event-based system, they can use the CEACH middleware to reliably communicate detected events to event consumers or actuators.
Platforms and their surrounding ecosystems are becoming increasingly important components of many companies' strategies. Artificial Intelligence, in particular, has created new opportunities to create and develop ecosystems around platforms. However, there is not yet a methodology to systematically develop these new opportunities for an enterprise development strategy. This paper therefore aims to lay a foundation for the conceptualization of Artificial Intelligence-based service ecosystems exploiting a Service-Dominant Logic. The basis for this conceptualization is the study of value creation and, in particular, of effective network effects. This research investigates the fundamental idea of extending specific digital concepts, considering the influence of Artificial Intelligence on the design of intelligent services along with the architecture of digital platforms and ecosystems, to enable a smooth evolutionary path and adaptability for human-centric collaborative systems and services. The paper explores an extended digital enterprise conceptual model through a combined, iterative, and permanent task of co-creating value between humans and intelligent systems as part of a new idea of cognitively adapted intelligent services.
The blockchain technology represents a decentralized database that stores information securely in immutable data blocks. Regarding supply chain management, these characteristics offer potentials in increasing supply chain transparency, visibility, automation, and efficiency. In this context, first token-based mapping approaches exist to transfer certain manufacturing processes to the blockchain, such as the creation or assembly of parts as well as their transfer of ownership. However, the decentralized and immutable structure of blockchain technology also creates challenges when applying these token-based approaches to dynamic manufacturing processes. As a first step, this paper investigates existing mapping approaches and exemplifies weaknesses regarding their suitability for products with changeable configurations. Secondly, a concept is proposed to overcome these weaknesses by introducing logically coupled tokens embedded into a flexible smart contract structure. Finally, a concept for a token-based architecture is introduced to map manufacturing processes of products with changeable configurations.
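A minimal sketch of what "logically coupled tokens" could look like. This is our illustration, not the paper's architecture: it assumes a product token keeps references to its component tokens, so that reconfiguration re-links tokens instead of burning and re-minting the product token, and ownership transfers propagate to coupled components:

```python
# Hypothetical sketch of logically coupled tokens for changeable product
# configurations. Names and semantics are illustrative assumptions, not
# taken from the paper or any real smart contract framework.

class Token:
    _next_id = 0

    def __init__(self, name, owner):
        Token._next_id += 1
        self.token_id = Token._next_id
        self.name = name
        self.owner = owner
        self.components = []            # logically coupled child tokens

    def attach(self, component):
        """Couple a component token to this product token."""
        component.owner = self.owner    # component follows product ownership
        self.components.append(component)

    def detach(self, component):
        """Decouple a component, e.g. when the configuration changes."""
        self.components.remove(component)
        return component

    def transfer(self, new_owner):
        """Ownership transfer propagates to all coupled components."""
        self.owner = new_owner
        for c in self.components:
            c.transfer(new_owner)
```

Under this coupling, swapping a part is a `detach` plus `attach` on the same product token, so the product's identity (its token ID) survives the reconfiguration.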
In rural areas, public transport causes high costs per passenger and kilometer, as scheduled buses run infrequently and many people therefore avoid using public transport. With the trend of moving from urban regions to the countryside, individual traffic will increase further. To tackle the issues of emissions, mobility for young and elderly people, and economically meaningful public transport, a new concept was elaborated in Germany. It consists of (partly) autonomous shuttle buses that are remote controlled. For implementation, rural districts of Germany have worked together and set up a three-phase plan consisting of a publicly funded project, a highly frequented pilot region, and industrial partners with the commitment and means for the necessary investments. The concept promises economic value with respect to installation, service, and maintenance costs, lowers the barriers to public transport for young and elderly people, and ultimately reduces emissions and congestion.
The early detection of head and neck cancer is a long-standing and challenging task. It requires precise and accurate identification of tissue alterations as well as a clear discrimination of cancerous from healthy tissue areas. A novel approach for this purpose uses microspectroscopic techniques, with a special focus on hyperspectral imaging (HSI) methods. Our proof-of-principle study presents the implementation and application of darkfield elastic light scattering spectroscopy (DF ELSS) as a non-destructive, high-resolution, and fast imaging modality to distinguish healthy from altered lingual tissue regions in a mouse model. The main aspect of our study is the comparison of two different HSI detection principles, point-by-point and line scanning imaging, and whether one might be more appropriate for differentiating several tissue types. Statistical models are formed by applying principal component analysis (PCA) with Bayesian discriminant analysis (DA) to the elastic light scattering (ELS) spectra. Overall accuracy, sensitivity, and precision values of 98% are achieved for both models, whereas the overall specificity is 99%. An additional classification of model-unknown ELS spectra is performed. The predictions are verified with histopathological evaluations of identical HE-stained tissue areas to prove the models' capability of tissue distinction. In the context of our proof-of-principle study, we assess the pushbroom PCA-DA model to be more suitable for tissue type differentiation and thus tissue classification. In addition to HE examination in head and neck cancer diagnosis, the use of HSI-based statistical models might be conceivable in daily clinical routine.
The influence of turbidity on the Raman signal strengths of condensed matter is theoretically analyzed and measured with laboratory-scale equipment for remote sensing. The results show the quantitative dependence of back- and forward-scattered signals on the thickness and elastic-scattering properties of matter. In the extreme situation of thin, highly turbid layers, the measured Raman signal strengths exceed their transparent analogs by more than a factor of ten. The opposite behavior is found for thick layers of low turbidity, where the presence of a small amount of scatterers leads to a decrease of the measured signal. The wide range of turbidities appearing in nature is experimentally realized with stacked polymer layers and solid/liquid dispersions, and theoretically modeled by the equation of radiative transfer using the analytical diffusion approximation or random walk simulations.
The relevance of Robotic Process Automation (RPA) has increased over the last few years. Combining RPA with Artificial Intelligence (AI) can further enhance the business value of the technology. The aim of this research was to analyze the applications, terminology, benefits, and challenges of combining the two technologies. A total of 60 articles were analyzed in a systematic literature review to evaluate the aforementioned areas. The results show that by adding AI, RPA applications can be used in more complex contexts, the human factor during the development process can be minimized, and AI-based decision-making can be integrated into RPA routines. This paper also presents a current overview of the terminology used. Moreover, it shows that integrating AI can raise previously unseen challenges in RPA projects, but also brings many new benefits. Based on the outcome, it is concluded that the topic offers a lot of potential, but further research and development are required. The results of this study help researchers gain an overview of the state of the art in combining RPA and AI.
While many maintainability metrics have been explicitly designed for service-based systems, tool-supported approaches to automatically collect these metrics are lacking. Especially in the context of microservices, decentralization and technological heterogeneity may pose challenges for static analysis. We therefore propose the modular and extensible RAMA approach (RESTful API Metric Analyzer) to calculate such metrics from machine-readable interface descriptions of RESTful services. We also provide prototypical tool support, the RAMA CLI, which currently parses the formats OpenAPI, RAML, and WADL and calculates 10 structural service-based metrics proposed in scientific literature. To make RAMA measurement results more actionable, we additionally designed a repeatable benchmark for quartile-based threshold ranges (green, yellow, orange, red). In an exemplary run, we derived thresholds for all RAMA CLI metrics from the interface descriptions of 1,737 publicly available RESTful APIs. Researchers and practitioners can use RAMA to evaluate the maintainability of RESTful services or to support the empirical evaluation of new service interface metrics.
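The quartile-based threshold idea can be sketched as follows. `quartile_thresholds` and `rate` are illustrative names, not the RAMA CLI's actual API; the sketch only shows how quartiles over a benchmark corpus split a metric's value range into the four color bands:

```python
# Sketch of quartile-based threshold ranges: given one metric's values
# over a benchmark corpus of APIs, the quartiles split the value range
# into four bands (green, yellow, orange, red). Assumes "lower is
# better" for the metric.
import statistics

def quartile_thresholds(values):
    q1, q2, q3 = statistics.quantiles(values, n=4)  # 25th/50th/75th percentiles
    return {"green":  (min(values), q1),
            "yellow": (q1, q2),
            "orange": (q2, q3),
            "red":    (q3, max(values))}

def rate(value, thresholds):
    """Assign a measured value to its band."""
    for band in ("green", "yellow", "orange"):
        if value <= thresholds[band][1]:
            return band
    return "red"
```

Deriving the bands from an empirical corpus (such as the 1,737 public API descriptions mentioned above) sidesteps the need to define absolute "good" or "bad" values for each metric.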
The purpose of this paper is to evaluate existing consumption patterns caused by fast fashion against a newly emerging form of consumption, and to assess its potential as an alternative, sustainable form of fast fashion consumption. This research is set up on a theoretical background of scientific literature, including governmental and press releases, in order to evaluate the status quo of consumption and answer the research question. A new consumption pattern as well as an emerging economy of sharing can be identified, including the potential for new businesses and sustainable alternatives to fast fashion. The framework of the research is limited to the textile and fashion industry in industrialized countries, focusing on consumption in the twenty-first century.
Co-design and endorsement
(2018)
The purpose of this paper is to determine the success factors regarding celebrities of the music business involved in fashion advertising. That famous people have the power to help brands and products stand out is well established. This paper concentrates on successful musicians and their endorsements of fashion brands and examines the benefits for both the brand and the artist. It investigates how consumers perceive brand-artist collaborations and what factors enhance purchase intention and increase sales. The paper is structured in the following manner: the introduction presents the research question and sets the aim of the paper, followed by an analysis of the existing literature. The paper ends with conclusions, limitations, and suggestions for further research.
The purpose of this paper is to investigate how the practice of closed-loop production systems (CLPS) is implemented in the fashion industry. This paper offers a critical literature review to present a thorough understanding of the current status of the literature. Subsequently, the paper reveals that CLPS are of great importance. Generally, such systems include different activities that have to be integrated. Critical points are product acquisition, the recovery process itself, and remarketing to the customer. A lack of reliable data concerning CLPS in the specific case of the fashion industry can be identified. Important research fields could be marketing strategies, controlling the acquisition process, the evolution of return technologies and strategies, the adaptation of recovered products to the mass market, and the development of new technologies concerning recovery processes.
Indoor localization systems are becoming more and more important with the digitalization of the industrial sector. Sensor data such as the current position of machines, transport vehicles, goods, or tools represent an essential component of cyber-physical production systems (CPPS). However, due to the high costs of these sensors, they are not widespread and are used mainly in special scenarios. Optical indoor positioning systems (OIPS) based on cameras, in particular, have certain advantages due to their technological specifications. In this paper, the application scenarios and requirements as well as their characteristics are presented, and a classification approach for OIPS is introduced.
New business concepts such as Enterprise 2.0 foster the use of social software in enterprises. Social production in particular significantly increases the amount of data in the context of business processes. Unfortunately, these data are still an unearthed treasure in many enterprises. Due to advances in data processing such as Big Data, the exploitation of context data is becoming feasible. To provide a foundation for the methodical exploitation of context data, this paper introduces a classification based on two classes: intrinsic and extrinsic data.
Sustainability is a development that meets the needs of the present without compromising the ability of future generations to meet their own needs.
Business Model is a plan for the successful operation of a business, identifying sources of revenue, the intended customer base, products, and details of financing.
Circular economy is an approach to how a company creates, captures, and delivers value, with a value creation logic designed to improve resource efficiency by contributing to extending the useful life of products and parts (e.g., through long-life design, repair, and remanufacturing) and closing material loops.
The amount of image data has been rising exponentially over the last decades due to numerous trends such as social networks, smartphones, automotive, biology, medicine, and robotics. Traditionally, file systems are used as storage. Although they are easy to use and can handle large data volumes, they are suboptimal for efficient sequential image processing because their data organisation is limited to single images. Database systems, and especially column stores, support more structured storage and access methods at the raw data level for entire series.
In this paper we propose definitions of various layouts for the efficient storage of raw image data and metadata in a column store. These schemes are designed to improve the runtime behaviour of image processing operations. We present a tool called the column-store Image Processing Toolbox (cIPT), which allows the data layouts and operations to be combined easily for different image processing scenarios.
The experimental evaluation of a classification task on a real-world image dataset indicates a performance increase of up to 15x with a column store compared to a traditional row store (PostgreSQL), while space consumption is reduced by 7x. With these results, cIPT provides the basis for a future mature database feature.
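The core intuition behind columnar layouts for image series can be sketched in a few lines. This is a simplification for illustration, not one of cIPT's actual layouts: a per-channel operation reads one contiguous column instead of skipping through interleaved pixel tuples, which is what favors sequential processing:

```python
# Row layout vs. column layout for RGB pixel data (illustrative only).
from array import array

# Row layout: interleaved (r, g, b) tuples, one per pixel.
def mean_red_row(pixels_rgb):
    # Must visit every tuple and pick out its first element.
    return sum(p[0] for p in pixels_rgb) / len(pixels_rgb)

# Column layout: one contiguous array per channel.
def mean_red_col(red_column):
    # Scans a single dense column; no interleaved data is touched.
    return sum(red_column) / len(red_column)

pixels = [(10, 20, 30)] * 4 + [(50, 20, 30)] * 4
red = array("B", (p[0] for p in pixels))   # the "red" column
```

Both functions compute the same value; the difference a column store exploits is purely in memory layout and access pattern, which compounds over long image series.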
The connection of fashion and film seems symbiotic at first sight, and the two influence each other. There are differences, however, including a different understanding of clothing by costume designers and fashion businesses. This article focuses on two successful movies, "The Hunger Games" and "The Great Gatsby", in order to explore the role of film in fashion and vice versa. The findings suggest that various collections in the fashion world are based on both movies. Movies therefore do influence the development of seasonal fashion. However, this connection is not natural, but rather artificially created by both industries. Through today's organized cooperation, the lines between costume designers and fashion designers become blurred. Furthermore, fashion today does not trickle down to an audience naturally, but is promoted using film and its broad reach.
In today’s education, healthcare, and manufacturing sectors, organizations and information societies are discussing new enhancements to corporate structure and process efficiency using digital platforms. These enhancements can be achieved using digital tools. Industry 5.0 and Society 5.0 offer several potentials for businesses to enhance the adaptability and efficacy of their industrial processes, paving the way for the development of new business models facilitated by digital platforms. Society 5.0 can contribute to a super-intelligent society that includes the healthcare industry. In the past decade, the Internet of Things, Big Data Analytics, Neural Networks, Deep Learning, and Artificial Intelligence (AI) have revolutionized our approach to various job sectors, from manufacturing and finance to consumer products. AI is developing quickly and efficiently. The latest artificial intelligence chatbot, ChatGPT, created by OpenAI, has taken the internet by storm. We tested the effectiveness of this large language model on four critical questions concerning “Society 5.0”, “Healthcare 5.0”, “Industry”, and “Future Education” from the perspective of Age 5.0.
Characterization of low density polyethylene greenhouse films during the composting of rose residues
(2022)
This study presents an evaluation of a potential alternative to plastic degradation in the form of organic composting. It stems from the urgent need to find solutions to plastic residues and focuses on the compost-based degradation of greenhouse film covers at an important rose exporter in Ecuador. Thus, this study analyzes the physical, chemical, and biological changes during the composting of rose waste, and also evaluates the stability of new and aged agricultural plastic under these conditions. Interestingly, the results of the compost characterization show a slow degradation rate of organic matter and total organic carbon, along with a significant increase in pH and a rise in bacterial populations. However, the results demonstrate that despite these findings, composting conditions had no significant influence on plastic degradation, and while deterioration of aged plastic samples was reported in some tests, it may be the result of environmental conditions and prolonged exposure to solar radiation. Importantly, these factors could facilitate the adhesion of microorganisms and promote plastic biodegradation. Hence, future studies are encouraged to analyze the ecotoxicity of plastics in the compost, as well as to isolate, identify, and evaluate the possible biodegradative potential of these microorganisms as an alternative for plastic waste management.
Characterisation of porous knitted titanium for replacement of intervertebral disc nucleus pulposus
(2017)
Effective restoration of human intervertebral disc degeneration is challenged by numerous limitations of the currently available spinal fusion and arthroplasty treatment strategies. Consequently, the use of artificial biomaterial implants is gaining attention as a potential therapeutic strategy. Our study is aimed at investigating and characterizing a novel knitted titanium (Ti6Al4V) implant for the replacement of the nucleus pulposus to treat early stages of chronic intervertebral disc degeneration. A specific knitted geometry of the scaffold with a porosity of 67.67 ± 0.824% was used to overcome tissue integration failures. Furthermore, to improve the wear resistance without impairing the original mechanical strength, an electro-polishing step was employed. The electro-polishing treatment reduced the surface roughness from 15.22 ± 3.28 to 4.35 ± 0.87 μm without affecting wettability, which remained at 81.03 ± 8.5°. Subsequently, the cellular responses of human mesenchymal stem cells (SCP1 cell line) and human primary chondrocytes were investigated, showing positive responses in terms of adherence and viability. Surface wettability was further enhanced to a super-hydrophilic state by oxygen plasma treatment, which caused a substantial increase in the proliferation of SCP1 cells and primary chondrocytes. Our study implies that, owing to the scaffold's physicochemical and biocompatible properties, it could improve the clinical performance of nucleus pulposus replacement.
New drugs serving unmet medical needs are one of the key value drivers of research-based pharmaceutical companies. The efficiency of research and development (R&D), defined as the successful approval and launch of new medicines (output) relative to the monetary investment required for R&D (input), has been declining for decades. We aimed to identify, analyze, and describe the factors that impact R&D efficiency. Based on publicly available information, we reviewed the R&D models of major research-based pharmaceutical companies and analyzed the key challenges and success factors of a sustainable R&D output. We calculated that the R&D efficiencies of major research-based pharmaceutical companies were in the range of USD 3.2–32.3 billion per launched drug (2006–2014). As these numbers challenge the model of an innovation-driven pharmaceutical industry, we analyzed the concepts that companies are following to increase their R&D efficiencies: (A) activities to reduce portfolio and project risk, (B) activities to reduce R&D costs, and (C) activities to increase the innovation potential. While category A comprises measures such as portfolio management and licensing, measures grouped in category B include outsourcing and risk-sharing in late-stage development. Companies have taken diverse steps to increase their innovation potential, and open innovation, exemplified by open source, innovation centers, or crowdsourcing, plays a key role in doing so. In conclusion, research-based pharmaceutical companies need to be aware of the key factors that impact the rate of innovation, R&D cost, and probability of success. Depending on their company strategy and their R&D set-up, they can opt for one of the following open-innovator roles: knowledge creator, knowledge integrator, or knowledge leverager.
Case study: Marillion
(2018)
The purpose of this paper is to highlight the use of crowdfunding, demonstrated by a case study about the rock band Marillion. The research methodology applied is a literature review examining academic references. On this basis, a case study exemplarily illustrating the rock band Marillion and how they invented crowdfunding has been drafted. Findings suggest that the crowdfunding concept is no new phenomenon, since the rock band Marillion had already pioneered the business model. Recently, the funding method has been applied to the fashion industry, as it is an efficient and engaging way to finance projects. A limitation of this paper is that the topic of crowdfunding is new to the fashion business and needs further research and testing before the results can be reliably interpreted. Results show that there is high potential for using crowdfunding in fashion to achieve long-term change in this industry.
The purpose of this paper is to examine the impact of grunge music on fashion and to explain how grunge music is reflected in grunge style. The research methodology applied is a case study on grunge music and grunge style. Key findings suggest that different elements of grunge music had a great impact on the evolution of grunge style: the mentality and philosophy of the movement, its musical style and sound, as well as its lyrical concerns are all incorporated into grunge style. The commercial exploitation of grunge partly led to its downfall. Moreover, the original spirit of the movement is not commonly shared by all sub-genres' respective contemporary styles. Musicians had a great impact on the evolution of grunge style and unintentionally rose to become style icons. The research is limited by the amount of academic literature concerning the connection between grunge music and grunge style; therefore, journal entries and blogs are used as references as well.
Case study: EMP
(2018)
The purpose of this research paper is to investigate the business model of the retailer EMP. An in-depth literature review develops the relevance of merchandising for the rock and heavy metal scene and the relevance of EMP within that market. Literature about existing approaches to multi-channelling has been reviewed. Based on this theoretical framework, a case study of EMP has been drafted. Findings are discussed with a focus on the performance of EMP as a multi-channel and lifestyle retailer, and valuable managerial implications for fashion retailers are provided. Implications for further research invite lifestyle retailers to contribute to the findings or to validate them with different examples. The research is clearly limited by the amount of scholarly literature concerning EMP in particular; hence, magazines, journals, and information provided by the company serve as references. Even though EMP provided some information, it was not possible to gather any information about how EMP manages multi-channelling operationally.
Since there is no denying that transparency is increasingly central to corporate sustainability, the purpose of this paper is a case study on a company's attempt to be fully transparent, thereby picking up the existing scholarly conversation about uncompromising supply chain transparency. The literature so far was found to be fairly limited but, following a trend, has been growing in recent years. Addressing these shortcomings, an in-depth literature review of the multiple dimensions of supply chain transparency has been performed and the links within supply networks stressed. On this basis, a case study exemplarily illustrating the fashion label Honest by has been drafted, and its effort to become the world's first 100% transparent company further examined. The findings are discussed with respect to whether more supply chain transparency is desirable in every case, obstacles are listed, and an outlook for this kind of business model is drawn. The research is clearly limited by the amount of scholarly literature concerning Honest by in particular; for this reason, magazine and journal entries are used as references as well. Only by extending the topic to supply chain transparency in general, supported by the preceding literature review, did the paper attain the necessary academic standard. Concerning implications, it needs to be mentioned that, even though Honest by demonstrates full transparency, it was not possible to find any public information about the degree of supplier relationship, in particular concerning the control mechanisms applied to exert influence and to balance out the power gradient between the company and its suppliers.
The purpose of this paper is to investigate the sustainable closed-loop supply chain of the fashion brand Filippa K. Information on green fashion has been gathered and a case study of the fashion retailer Filippa K conducted. The results show a shift in knowledge content between a fast fashion supply chain and a sustainable supply chain. There is also an evolution in sustainability, as companies, retailers, and manufacturers face pressure from customers, governments, and the media. Sustainable fashion brands like Filippa K are interested in sharing precise knowledge on a variety of aspects linked to the sustainable closed-loop supply chain. This research has been limited by the scarcity of information and the unexplored topics within green fashion, which led to a direct critical discussion with the brand Filippa K.
An ongoing challenge today is to lower, through individual support, the impact of dysfunctionality on quality of life. Against the background of an aging society and continuously rising costs of care, a holistic solution is needed. This solution must integrate individual needs and preferences, locally available possibilities, regional conditions, and professional and informal caregivers, and it must provide the flexibility to accommodate future requirements. The proposed model is the result of a joint initiative to overcome the major obstacles and to center a solution on the individual needs caused by dysfunctionality.
In the period from the 1950s to 2013, the American Food and Drug Administration (FDA) approved 1346 new molecular entities (NMEs) or new biologics entities (NBEs). On average, the approval rate was 20 NMEs per year. In the past 40 years, the number of new drugs launched into the market increased slightly from 15 NMEs in the 1970s to 25–30 NMEs since the 1990s. The highest number of new drugs approved by FDA was in 1996 and 1997, which might be related to the enactment of the Prescription Drug User Fee Act (PDUFA) in 1993.
Public enterprises find themselves in increasingly competitive markets, a situation that makes having an entrepreneurial orientation (EO) an urgent need, given that EO is an indispensable driver of performance. Research describes politicians delaying the strategic change of public enterprises when serving as board members, but empirical evidence of the impact of board behavior on EO in public enterprises is lacking. We draw on stakeholder-agency theory (SAT) and resource dependence theory (RDT) and use structural equation modeling (SEM) to investigate survey data collected from 110 German energy suppliers that are majority government owned. Results indicate that board strategy control and board networking do not seem to predict EO at first sight. Closer analysis reveals a board networking–EO relationship that depends on ownership structure. Remarkably, we find that it is not the usually suspected local municipal owner who hinders EO in our sample organizations but minority shareholders engaging in board networking activities. The results shed light on the intersection of governance and entrepreneurship with special reference to the fine-grained conceptualization of RDT.
In this chapter we introduce methods to improve mechanical designs by bionic methods. In most cases we assume that a general idea of the part or system is given by a set of data or parameters. Our task is to modify these free parameters so that a given goal or objective is optimized without violation of any of the existing restrictions.
The book provides suggestions on how to start using bionic optimization methods, including pseudo-code examples of each of the important approaches and outlines of how to improve them. The most efficient methods for accelerating the studies are discussed. These include the selection of the population size and number of generations of a study, the modification of these driving parameters, switching to gradient methods when approaching local maxima, and the use of parallel hardware.
Bionic optimization means finding the best solution to a problem using methods found in nature. As evolutionary strategies and particle swarm optimization seem to be the most important methods for structural optimization, we primarily focus on them. Other methods such as neural nets or ant colonies are more suited to control or process studies, so their basic ideas are outlined in order to motivate readers to start using them.
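Since the book names particle swarm optimization as one of its central methods, a minimal sketch of that algorithm may be helpful. This is a generic textbook variant, not the book's own pseudo-code; the inertia and acceleration coefficients, the quadratic test objective, and all function names below are illustrative assumptions.

```python
import random

random.seed(7)  # fixed seed so the illustrative run is reproducible

def pso(objective, bounds, n_particles=20, iterations=100,
        w=0.7, c1=1.5, c2=1.5):
    """Minimize `objective` over the box `bounds` with a basic particle swarm."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # personal best positions
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]     # global best so far
    for _ in range(iterations):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # inertia + pull toward personal and global best
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # move and clamp to the search box
                pos[i][d] = min(max(pos[i][d] + vel[i][d],
                                    bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Hypothetical smooth surrogate for a structural objective, minimum at (1, -2).
best, best_val = pso(lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2,
                     bounds=[(-5.0, 5.0), (-5.0, 5.0)])
```

In structural optimization the objective would instead be a finite element evaluation of, e.g., mass or compliance, and each particle a vector of free design parameters.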
A set of sample applications shows how bionic optimization works in practice. From academic studies on simple frames made of rods to earthquake-resistant buildings, readers follow the lessons learned, difficulties encountered and effective strategies for overcoming them. For the problem of tuned mass dampers, which play an important role in dynamic control, changing the goal and restrictions paves the way for multi-objective-optimization. As most structural designers today use commercial software such as FE-Codes or CAE systems with integrated simulation modules, ways of integrating bionic optimization into these software packages are outlined and examples of typical systems and typical optimization approaches are presented.
The closing section provides an overview of and outlook on reliable and robust optimization as well as multi-objective optimization, including discussions of current and upcoming research topics in the field, in particular a unified theory for handling stochastic design processes.
Today, fiber reinforced plastics (FRP) are well established in manifold technical applications because they provide advantages such as low weight, high stiffness, high strength, and chemical resistance. The broad range of production methods extends from cost-effective mass production to the manufacturing of ultra-lightweight composite parts.
Biological materials are also usually composite materials: higher plants or the bones of higher animals are hierarchically organized and are composed of only a few materials such as lignin, cellulose, apatite and collagen. The large variety and the mechanical properties of natural tissues result primarily from an optimized fiber lay-up adapted to the mechanical requirements of the respective “installation circumstances”.
Advanced lightweight technical solutions need strong materials and structurally optimized designs. In many industries, structural optimization via an appropriate fiber lay-up has become an important method for saving weight. Corresponding software tools help to optimize topology and shape (e.g. Mattheck: CAO/SKO; Altair: OptiStruct), mainly using finite element analysis technology.
The combination of strong lightweight materials, optimized topology and sophisticated fiber lay-up is also present in many bio-mineralized planktonic shells, for instance diatoms and radiolaria, but also in glass sponges.
In the following, it is shown how the high weight-related mechanical properties of plankton are biomimetically transferred into ultra-lightweight technical structures.
Silicon neurons (SiNs) represent different levels of biological detail and accuracy as a trade-off between complexity and power consumption. With respect to this trade-off and their high similarity to neuron behaviour models, relaxation-type oscillator circuits often yield a good compromise for emulating neurons. In this chapter, two exemplary relaxation-type silicon neurons are presented that emulate neural behaviour with an energy consumption below the nJ/spike scale. The first proposed, fully CMOS relaxation SiN is based on the mathematical Izhikevich model and can mimic a broad range of physiologically observable spike patterns. Results for various biologically plausible output patterns and for the coupling of two SiNs are presented in a 0.35 μm CMOS technology. The second type is a novel ultra-low-frequency hybrid CMOS-memristive SiN based on relaxation oscillators and analog memristive devices. The hybrid SiN directly emulates neuron behaviour in the range of physiological spiking frequencies (less than 100 Hz). The relaxation oscillator is implemented and fabricated in a 0.13 μm CMOS technology. An autonomous neuronal synchronization process is demonstrated in measurements with two relaxation oscillators coupled by an analog memristive device, emulating the synchronous behaviour of spiking neurons.
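For readers unfamiliar with the Izhikevich model underlying the first SiN, a short software simulation may clarify the dynamics being emulated in silicon. The sketch below uses the commonly published "regular spiking" parameter set (a=0.02, b=0.2, c=-65, d=8) and a simple forward-Euler step; these numerical choices are illustrative assumptions, not figures from the chapter.

```python
def izhikevich(I=10.0, a=0.02, b=0.2, c=-65.0, d=8.0,
               dt=0.25, steps=4000):
    """Simulate one Izhikevich neuron with forward Euler; return spike times in ms.

    dv/dt = 0.04 v^2 + 5 v + 140 - u + I
    du/dt = a (b v - u)
    On v >= 30 mV: emit a spike, then reset v <- c and u <- u + d.
    """
    v, u = c, b * c          # resting initial conditions
    spikes = []
    for n in range(steps):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:        # spike detected: record time and reset
            spikes.append(n * dt)
            v, u = c, u + d
    return spikes

spike_times = izhikevich()   # tonic spiking under constant input current
```

Changing a, b, c, d reproduces the other firing patterns (bursting, chattering, etc.) that the fully CMOS SiN mimics in hardware.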
After the initiator of the ESB Logistics Learning Factory, Prof. Vera Hummel, had gained experience in developing and implementing a concept for a Learning Factory for Advanced Industrial Engineering (aIE) at the Institute IFF of the University of Stuttgart between 2005 and 2008, she was appointed full professor at ESB Business School, a faculty of Reutlingen University, in March 2010. Lacking a realistic, hands-on learning and teaching environment of industrial scale for its industrial engineering students, the school drafted first ideas in 2012 for a Learning Factory that would strongly focus on all aspects of production logistics. Already back then, a strong integration of the virtual and the physical factory was desired: while the Learning Factory itself would be physical, neighboring partners along the supply chain, such as suppliers or distribution warehouses, could be added in a fully virtual way. Considering the implementation of the ESB Logistics Learning Factory a strategic initiative of the university, initial funding was provided by the faculty, ESB Business School, itself. Following its creed of providing future-oriented training for the region, primarily local suppliers and manufacturers were selected as equipment providers for the new Learning Factory. During the initialization phase in 2014, a total of three researchers and nine students worked approximately four months to set up a first assembly line, storage racks, AGVs, and pick-by-light systems in conjunction with the underlying didactical concept. Since then, several hundred students have participated in trainings and lectures held in the ESB Logistics Learning Factory, several research projects have been carried out, and multiple high-level politicians and industry executives have toured the shop floor.
Also, more than EUR 2 million in research and infrastructure funds could be secured for expansion and upgrade — allowing the ESB Logistics Learning Factory today to represent many core aspects of an Industrie 4.0 production environment.
Excellence in IT is a key enabler of the digital transformation of enterprises. To realize the vision of the digital enterprise, it is necessary to cope with changing business requirements and to align business and IT. In order to evaluate the contribution of enterprise architecture management (EAM) to these goals, our paper explores the impact of various factors on the perceived benefit of EAM in enterprises. Based on the literature, we build an empirical research model. It is tested with empirical data from European EAM experts using a structural equation modelling approach. It is shown that changing business requirements, IT business alignment, the complexity of the information technology infrastructure, and the enterprise architecture knowledge of information technology employees are crucial factors influencing the perceived benefit of EAM in enterprises.
Back to the future: origins and directions of the “Agile Manifesto” – views of the originators
(2018)
In 2001, seventeen professionals set up the Manifesto for Agile Software Development. They wanted to define values and basic principles for better software development. Beyond the attention it initially attracted, the manifesto has been widely adopted by developers, in software-developing organizations, and outside the world of IT. Agile principles and their implementation in practice have paved the way for radically new and innovative ways of software and product development. In parallel, the understanding of the manifesto’s underlying principles evolved over time. This, in turn, may affect current and future applications of agile principles. This article presents results from a survey and an interview study conducted in collaboration with the original contributors of the Manifesto for Agile Software Development. Furthermore, it comprises the results of a workshop with one of the original authors. This publication focuses on the origins of the manifesto, the contributors’ views from today’s perspective, and their outlook on future directions. We evaluated 11 survey responses and 14 interviews to understand the contributors’ viewpoints. They emphasize that agile methods need to be carefully selected and that agile should not be seen as a silver bullet. They underline the importance of considering the variety of different practices and methods that influenced the manifesto. Furthermore, they suggest that people should question their current understanding of "agile" and recommend reconsidering the core ideas of the manifesto.
Annotations of character IDs in news images are critical as ground truth for news retrieval and recommendation systems. Optimizing the universality and accuracy of deep neural network models is the key technology for improving the precision and computing efficiency of automatic news character identification, which is attracting increasing attention globally. This paper explores an optimized deep neural network model for automatic focus-personage identification in multi-lingual news. First, the face model of the focus personage is trained using the corresponding face images from German news as positive samples. Next, a Recurrent Convolutional Neural Network (RCNN) + Bi-directional Long Short-Term Memory (Bi-LSTM) + Conditional Random Field (CRF) scheme is utilized to label the focus name, and an RCNN-RCNN encoder–decoder is applied to translate names of people into multiple languages. Third, face features are described by combining the advantages of the Local Gabor Binary Pattern Histogram Sequence (LGBPHS) and the RCNN, and iterative quantization (ITQ) is used to binarize the codes. Finally, a name semantic network is built for different domains. Experiments are performed on a dataset comprising approximately 100,000 news images. The experimental results demonstrate that the proposed method achieves a significant improvement over other algorithms.
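The ITQ binarization step mentioned above can be sketched compactly. The routine below is the generic iterative-quantization recipe (zero-center, PCA-project, then alternate sign quantization with an orthogonal Procrustes rotation update), not the authors' exact implementation; the function name, bit count, and iteration count are illustrative assumptions.

```python
import numpy as np

def itq_binarize(X, n_bits=16, n_iter=50, seed=0):
    """Iterative quantization: learn a rotation R of PCA-projected features
    so that sign() quantization to n_bits loses as little variance as possible.
    Returns the binary codes (0/1) and the learned rotation."""
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=0)                       # zero-center the features
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    V = X @ Vt[:n_bits].T                        # PCA projection to n_bits dims
    R, _ = np.linalg.qr(rng.standard_normal((n_bits, n_bits)))  # random rotation
    for _ in range(n_iter):
        B = np.sign(V @ R)                       # current binary codes
        # orthogonal Procrustes: rotation best aligning V with B
        U, _, Wt = np.linalg.svd(B.T @ V)
        R = (U @ Wt).T
    return (np.sign(V @ R) > 0).astype(np.uint8), R

# Illustrative run on random stand-in features (real inputs would be
# LGBPHS/RCNN face descriptors).
codes, R = itq_binarize(np.random.default_rng(1).standard_normal((100, 32)),
                        n_bits=8)
```

The resulting compact binary codes make large-scale face matching a cheap Hamming-distance comparison.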
Automatic classification of rotating machinery defects using Machine Learning (ML) algorithms
(2020)
Electric machines and motors have been the subject of enormous development. New concepts in design and control allow their applications to expand into different fields. Vast amounts of data have been collected in almost every domain of interest. These data can be static, that is, they represent real-world processes at a fixed point in time. Vibration analysis and vibration monitoring, including the detection and monitoring of anomalies in vibration data, are widely used techniques for predictive maintenance in high-speed rotating machines. However, accurately identifying the presence of a bearing fault can be challenging in practice, especially when the failure is still at an incipient stage and the signal-to-noise ratio of the monitored signal is small. The main objective of this work is to design a system that analyzes the vibration signals of a rotating machine in the time/frequency domain, based on data recorded from sensors. Reflecting this substantial interest, there has been a dramatic increase in applying Machine Learning (ML) algorithms to the task. An ML system is used to classify and detect abnormal behavior and to recognize the different levels of machine operation modes. The proposed solution can be deployed for predictive maintenance in Industry 4.0.
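The feature-extraction-plus-classification pipeline described above can be sketched generically. The spectral features, the simulated healthy/faulty signals, and the nearest-centroid stand-in classifier below are illustrative assumptions, not the system actually proposed in the paper.

```python
import numpy as np

def spectral_features(signal, fs):
    """RMS level, dominant frequency, and power-spectral centroid of one window."""
    power = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    rms = np.sqrt(np.mean(signal ** 2))
    dominant = freqs[np.argmax(power)]
    centroid = (freqs * power).sum() / (power.sum() + 1e-12)
    return np.array([rms, dominant, centroid])

class NearestCentroid:
    """Minimal stand-in for the ML classifier over the feature vectors."""
    def fit(self, X, y):
        self.labels_ = sorted(set(y))
        self.centroids_ = {c: X[np.array(y) == c].mean(axis=0)
                           for c in self.labels_}
        return self
    def predict(self, X):
        return [min(self.labels_,
                    key=lambda c: np.linalg.norm(x - self.centroids_[c]))
                for x in X]

# Simulated data: 50 Hz shaft vibration; the "fault" adds a 200 Hz component,
# loosely mimicking a bearing-defect harmonic.
fs = 1000
t = np.arange(0, 1, 1.0 / fs)
rng = np.random.default_rng(0)
healthy = [np.sin(2 * np.pi * 50 * t) + 0.05 * rng.standard_normal(t.size)
           for _ in range(4)]
faulty = [np.sin(2 * np.pi * 50 * t) + 0.8 * np.sin(2 * np.pi * 200 * t)
          + 0.05 * rng.standard_normal(t.size) for _ in range(4)]
X = np.array([spectral_features(s, fs) for s in healthy + faulty])
y = ["ok"] * 4 + ["fault"] * 4
clf = NearestCentroid().fit(X, y)
```

In a real deployment, richer features (envelope spectra, kurtosis, wavelet bands) and a trained ML model would replace these stand-ins, but the windowing/feature/classify structure is the same.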
Assistant platforms
(2023)
Many assistant systems have evolved toward assistant platforms. These platforms combine a range of resources from various actors via a declarative and generative interface. Examples include voice-oriented assistant platforms like Alexa and Siri, as well as text-oriented assistant platforms like ChatGPT and Bard. They have emerged as valuable tools for handling tasks without requiring deeper domain expertise and have received considerable attention with the recent advances in generative artificial intelligence. In view of their growing popularity, this Fundamental outlines the key characteristics and capabilities that define assistant platforms. The former comprise a multi-platform architecture, a declarative interface, and a multi-platform ecosystem, while the latter include capabilities for composition, integration, prediction, and generativity. Based on this framework, a research agenda is proposed along the capabilities and affordances of assistant platforms.
This research-oriented book presents key contributions on architecting the digital transformation. It includes the following main sections, covering 20 chapters: Digital Transformation, Digital Business, Digital Architecture, Decision Support, and Digital Applications. Focusing on digital architectures for smart digital products and services, it is a valuable resource for researchers, doctoral students, postgraduates, graduates, undergraduates, academics and practitioners interested in digital transformation.
This chapter presents an introduction to the emerging trends for architecting the digital transformation, with a strong focus on digital products, intelligent services, and related systems, together with methods, models, and architectures. The primary aim of this book is to highlight some of the most recent research results in the field. We provide a focused set of brief descriptions of the chapters included in the book.
Presently, many companies are transforming their strategy and product base, as well as their culture, processes, and information systems, to become more digital or to strive for digital leadership. In recent years, new business opportunities have appeared that use the potential of the Internet and related digital technologies, like the Internet of Things, services computing, cloud computing, edge and fog computing, social networks, big data with analytics, mobile systems, collaboration networks, and cyber-physical systems. Digitization fosters the development of IT environments with many rather small and distributed structures, like the Internet of Things, microservices, or other micro-granular elements. This has a strong impact on architecting digital services and products. The change from a closed-world modeling perspective to a more flexible open-world composition and evolution of micro-granular system architectures defines the moving context for adaptable systems. We focus on a continuous bottom-up integration of micro-granular architectures for a huge number of dynamically growing systems and services, as part of a new digital enterprise architecture for service-dominant digital products.
The current advancement of Artificial Intelligence (AI), combined with other digitalization efforts, significantly impacts service ecosystems. Artificial intelligence opens up substantial new opportunities for the co-creation of value and the development of intelligent service ecosystems. Motivated by experiences and observations from digitalization projects, this paper presents new methodological perspectives and experiences from academia and practice on architecting intelligent service ecosystems and explores the impact of artificial intelligence through real cases that support an ongoing validation. Digital enterprise architecture models serve as an integral representation of the business, information, and technological perspectives of intelligent service-based enterprise systems to support their management and development. The paper focuses on architectural models for intelligent service ecosystems, showing the fundamental business mechanism of AI-based value co-creation, the corresponding digital architecture, and the associated management models, and it presents the key architectural model perspectives for the development of intelligent service ecosystems.
Our paper gives first answers to a fundamental question: how can the design of architectures of intelligent digital systems and services be accomplished methodologically? Intelligent systems and services are the goal of many current digitalization efforts and part of massive digital transformation efforts based on digital technologies. Digital systems and services are the foundation of digital platforms and ecosystems. Digitalization disrupts existing businesses, technologies, and economies and promotes the architecture of open environments. This has a strong impact on new value-added opportunities and the development of intelligent digital systems and services. Digital technologies such as artificial intelligence, the Internet of Things, services computing, cloud computing, big data with analytics, mobile systems, and social enterprise network systems are important enablers of digitalization. The current publication presents our research on the architecture of intelligent digital ecosystems, products, and services influenced by the service-dominant logic. We present original methodological extensions and a new reference model for digital architectures with an integral service and value perspective to model intelligent systems and services that effectively align digital strategies and architectures, with artificial intelligence as a main element to support intelligent digitalization.
Enterprises are currently transforming their strategy, processes, and information systems to extend their degree of digitalization. The potential of the Internet and related digital technologies, like the Internet of Things, services computing, cloud computing, artificial intelligence, big data with analytics, mobile systems, collaboration networks, and cyber-physical systems, both drives and enables new business designs. Digitalization deeply disrupts existing businesses, technologies, and economies and fosters the architecture of digital environments with many rather small and distributed structures. This has a strong impact on new value-producing opportunities and on architecting digital services and products, guiding their design through a Service-Dominant Logic. The main result of the book chapter extends methods for integral digital strategies with value-oriented models for digital products and services, defined within the framework of a multi-perspective digital enterprise architecture reference model.
The evolution of Services Oriented Architectures (SOA) presents many challenges due to their complex, dynamic and heterogeneous nature. We describe how SOA design principles can facilitate SOA evolvability and examine several approaches to support SOA evolution. SOA evolution approaches can be classified based on the level of granularity they address, namely, service code level, service interaction level and model level. We also discuss emerging trends, such as microservices and knowledge-based support, which can enhance the evolution of future SOA systems.
Military organizations have special features, such as following different organizational laws in times of peace and war and their specific embeddedness in society and politics. Especially the latter aspect has made the military an important object of study since the beginnings of modern sociology. In the wake of establishing specific sociological accounts, military sociology has developed, dedicated to the different facets of the military. This research draws on various theoretical perspectives but has hardly embraced the frameworks of the economics and sociology of conventions (EC/SC) so far. The aim of the chapter is to explore and demonstrate the potential of this approach. In a first step, the state of the art of military sociology research is outlined, and potential avenues for analyzing military forces based on EC/SC are identified. It is argued that especially the connection to organizational theory (the military as an organization) and to civil-military relations, including leadership and professionalism, offers starting points. After introducing existing studies that address military-related topics with reference to EC/SC, relevant concepts and approaches of convention theory that prove to be particularly enriching for military research are discussed. An outlook on possible further fields and topics of research is given to concretize what an inclusion of the EC/SC perspective could look like.