The early involvement of experience gained through intelligence and data analysis is becoming increasingly important for developing new products, leading to a completely different conception of product creation, development and engineering processes that exploits the advantages the digital twin entails. A novel stage-gate process is introduced, holistically anchored in learning factories, adopting idea generation and idea screening at an early stage, beta testing of first prototypes, technical implementation in real production scenarios, business analysis, market evaluation, pricing, service models as well as innovative social media portals. Corresponding product modelling in the sense of sustainability, circular economy and data analytics forecasts the product on the market both before and after market launch, with data interpretation interlinked in near real-time. The digital twin represents the link between the digital model and the digital shadow. Additionally, the connection of the digital twin with the product provides constantly updated operating status and process data as well as a mapping of technical properties and real-world behaviour. A future networked product, equipped by embedded information technology with the ability to initiate and carry out its own further development, is able to interact with people and environments and is thus relevant to the way of life of future generations. In today's development work for this new product creation approach, "Werk150" is on the one hand the object of the development itself and on the other hand the validation environment. In a next step, new learning modules and scenarios for training at master level will be derived from these findings.
It has not yet been possible to achieve the desired aim of decoupling economic growth from global material demand. Small and medium-sized enterprises (SMEs) represent the backbone of most industrialized economies. Although material efficiency is of vital importance for many SMEs, few of them actually treat it as a top priority. A cornucopia of tools and methods is available for material efficiency purposes; these, however, have gained little ground in the SME field. This work deals with the enabling factors for material efficiency improvements in manufacturing SMEs and projections towards aspects of the supply chain and the circular economy. A multi-disciplinary decoupling approach for manufacturing SMEs and an implementation roadmap for further practical development are proposed. The approach combines an appropriate complexity of technology with socio-economic considerations. It enables a connection of existing methods and the implementation of established information technologies.
Patterns are virtually simulated in 3D CAD programs before production to check the fit. However, achieving lifelike representations of human avatars, especially regarding soft tissue dynamics, remains challenging. This is mainly because conventional avatars in garment CAD programs are simulated with a continuous hard surface that does not correspond to the physical and mechanical properties of human soft tissue. In the real world, the human body’s natural shape is affected by the contact pressure of tight-fitting textiles. To verify the fit of a simulated garment, the interactions between the individual body shape and the garment must be considered. This paper introduces an innovative approach to digitising the softness of human tissue using 4D scanning technology. The primary objective of this research is to explore the interactions between tissue softness and different compression levels of apparel, exerting pressure on the tissue to capture the changes in the natural shape. To generate data and model an avatar with soft body physics, it is essential to capture the deformability and elasticity of the soft tissue and map it into the modification options of a simulation. To achieve this, various methods from different fields were researched and compared, and 4D scanning was evaluated as the most suitable method for capturing tissue deformability in vivo. In particular, it should be considered that the human body has different deformation capabilities depending on age and the amount of muscle and body fat. In addition, different tissue zones have different mechanical properties, so it is essential to identify and classify them in order to reproduce these properties in the simulation. It has been shown that by digitising the data obtained at the different defined pressure levels, a prediction of the tissue deformation of the exact person becomes possible.
As technology advances and data sets grow, this approach has the potential to reshape how we verify fit digitally with soft avatars and leverage their realistic soft tissue properties for various practical purposes.
Due to Industry 4.0, the full value creation has the chance to undergo a fundamental technological transformation, the realisation of which, however, requires the commitment of every company for its own benefit. The new approaches of Industry 4.0 are often hardly evaluated, let alone proven, so that SMEs in particular often cannot properly estimate the potentials and risks and often wait too long with the migration towards Industry 4.0. In addition, they often do not pursue an integrated concept in order to identify possible potentials through changes in their business models. As part of the research project "GEN-I 4.0 – Geschäftsmodell-Entwicklung für die Industrie 4.0" (business model development for Industry 4.0), the ESB Business School at Reutlingen University of Applied Sciences and the Fraunhofer Institute for Industrial Engineering IAO were engaged by the Baden-Württemberg Foundation from 2016 to 2018 to develop tools and an approach for how the local economy can develop digital business models for itself in a methodical, beneficial and targeted manner. Through international analyses and interviews, GEN-I 4.0 gained and concretized the knowledge required for the evaluation and selection of solutions and approaches for the transfer to developing digital business models. Together with the project partners' know-how on Industry 4.0 and business model development, the findings were incorporated into the development of two software tools with which SMEs are shown, online and in a self-assessment, the potentials of Industry 4.0 for their individual business model as well as their individual risk, and are given a comprehensively structured, concrete approach to development. Users of the tools are supported by the selected platform for networking the different players in implementing innovative business models, accompanied by coaching concepts for the companies in the follow-up and implementation of the assessment results.
Cotton contamination by honeydew is considered one of the significant problems for quality in textiles as it causes stickiness during manufacturing. Therefore, millions of dollars in losses are attributed to honeydew contamination each year. This work presents the use of UV hyperspectral imaging (225–300 nm) to characterize honeydew contamination on raw cotton samples. As reference samples, cotton samples were soaked in solutions containing sugar and proteins at different concentrations to mimic honeydew. Multivariate techniques such as a principal component analysis (PCA) and partial least squares regression (PLS-R) were used to predict and classify the amount of honeydew at each pixel of a hyperspectral image of raw cotton samples. The results show that the PCA model was able to differentiate cotton samples based on their sugar concentrations. The first two principal components (PCs) explain nearly 91.0% of the total variance. A PLS-R model was built, showing a performance with a coefficient of determination for the validation (R2cv) = 0.91 and root mean square error of cross-validation (RMSECV) = 0.036 g. This PLS-R model was able to predict the honeydew content in grams on raw cotton samples for each pixel. In conclusion, UV hyperspectral imaging, in combination with multivariate data analysis, shows high potential for quality control in textiles.
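The dimensionality-reduction step described in this abstract can be sketched as follows. This is an illustrative toy example, not the study's code: the "spectra" over the paper's 225–300 nm range are synthesized from hypothetical sugar concentrations, and PCA via SVD shows how the first principal components capture most of the variance, analogous to the ~91% the authors report for the first two PCs.

```python
import numpy as np

rng = np.random.default_rng(0)
wavelengths = np.linspace(225, 300, 50)           # nm, UV range from the paper
concentrations = rng.uniform(0, 1, size=30)       # hypothetical sugar levels per sample
band = np.exp(-((wavelengths - 260) ** 2) / 200)  # hypothetical absorption band shape

# Synthetic "hyperspectral" measurements: concentration-scaled band + noise
spectra = np.outer(concentrations, band)
spectra += 0.01 * rng.standard_normal(spectra.shape)

# PCA by SVD of the mean-centered data matrix
X = spectra - spectra.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
explained = s**2 / np.sum(s**2)                   # explained-variance ratio per PC
```

Because the synthetic data are essentially rank one plus noise, the first component dominates; on real cotton spectra, several components would share the variance.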
Military organizations have special features, such as following different organizational laws in times of peace and war and their specific embeddedness in society and politics. Especially the latter aspect has made the military an important object of study since the beginnings of modern sociology. In the wake of establishing specific sociological accounts, military sociology has developed, dedicated to the different facets of the military. This research draws on different theoretical perspectives but has hardly embraced the frameworks of the economics and sociology of conventions (EC/SC) so far. The aim of the chapter is to explore and demonstrate the potentials of this approach. In a first step, the state of the art of military sociology research is outlined, and potential avenues for analyzing military forces based on EC/SC are identified. It is argued that especially the connection to organizational theory (the military as organization) and civil-military relations, including leadership and professionalism, offers starting points. After introducing existing studies addressing military-related topics with reference to EC/SC, relevant concepts and approaches of convention theory that prove particularly enriching for military research are discussed. An outlook on possible further fields and topics of research is given to concretize what an inclusion of the EC/SC perspective could look like.
Enterprises and societies currently face crucial challenges, while Industry 4.0 is becoming all the more important in the global manufacturing industry. Industry 4.0 offers a range of opportunities for companies to increase the flexibility and efficiency of production processes. The development of new business models can be promoted with digital platforms and architectures for Industry 4.0; products from the healthcare sector can thereby increase in value. The adaptive integrated digital architecture framework (AIDAF) for Industry 4.0 is expected to promote and implement digital platforms and robotics for healthcare and medical communities efficiently. In this paper, we propose various digital platforms and robotics designed and evaluated for digital healthcare as well as for the manufacturing industry with Industry 4.0. We argue that the design of an open healthcare platform, "Open Healthcare Platform 2030" (OHP2030), for medical product design and robotics can be developed with AIDAF. The vision of AIDAF applications to enable Industry 4.0 in the OHP2030 research initiative is explained and referenced, extended in the context of Society 5.0.
Enterprises and societies currently face essential challenges, and digital transformation can contribute to their resolution. Enterprise architecture (EA) is useful for promoting digital transformation in global companies and information societies covering ecosystem partners. The advancement of new business models can be promoted with digital platforms and architectures for Industry 4.0 and Society 5.0; products from sectors such as healthcare, manufacturing and energy can thereby increase in value. The adaptive integrated digital architecture framework (AIDAF) for Industry 4.0, together with the design thinking approach, is expected to promote and implement digital platforms and digital products for healthcare, manufacturing and energy communities more efficiently. In this paper, we present various cases of digital transformation in which digital platforms and products are designed and evaluated for digital IT, digital manufacturing and digital healthcare with Industry 4.0 and Society 5.0. The vision of AIDAF applications to perform digital transformation in global companies is explained and referenced, extended toward digitalized ecosystems such as Society 5.0 and Industry 4.0.
Knowledge transfer is very important to our knowledge-based society, and many approaches have been proposed to describe it. However, these approaches take a rather abstract view of knowledge transfer, which makes implementation difficult. To address this issue, we introduce a layered model for knowledge transfer that structures the individual steps of knowledge transfer in more detail. This paper describes the process and gives an example of the application of the layered model for knowledge transfer. The example is located in the area of business process modelling. Business processes contain the important knowledge describing the procedures a company uses to produce products and services. Knowledge transfer is the fundamental basis of the modelling and usage of business processes, which makes it an interesting use case for the layered model for knowledge transfer.
This paper develops a linear and tractable model of financial bubbles. I demonstrate the application of the linear model and study the root causes of financial bubbles. Moreover, I derive leading properties of bubbles. This model enables investors and regulators to react to market dynamics in a timely manner. In conclusion, the linear model is helpful for the empirical verification and detection of financial bubbles.
This paper analyzes governance mechanisms for different group sizes. The European sovereign debt crisis has demonstrated the need for efficient governance of different group sizes. First, I find that self-governance only works for sufficiently homogeneous and small neighbourhoods. Second, as the union expands, the effect of credible self-governance decreases. Third, spill-over effects amplify the size effect. Fourth, I show that sufficiently large monetary unions are better off with costly but external governance or a free market mechanism. Finally, intermediate-size unions are the most difficult to govern efficiently.
Applied mathematical theory for monetary-fiscal interaction in a supranational monetary union
(2014)
I utilize a differentiable dynamical system à la Lotka-Volterra to explain monetary and fiscal interaction in a supranational monetary union. The paper demonstrates an applied mathematical approach that provides useful insights into the interaction mechanisms in theoretical economics in general and a monetary union in particular. I find that a common central bank is necessary but not sufficient to tackle the new interaction problems in a supranational monetary union, such as the free-riding behaviour of fiscal policies. Moreover, I show that supranational institutions, rules or laws are essential to mitigate violations by decentralized fiscal policies.
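The interaction mechanism named in this abstract can be illustrated with the classic two-species Lotka-Volterra system. This is a generic sketch under assumed parameter values and an invented economic reading of the variables, not the paper's actual model of monetary-fiscal interaction:

```python
# Classic Lotka-Volterra dynamics, integrated with explicit Euler steps.
def lotka_volterra_step(x, y, alpha, beta, delta, gamma, dt):
    """One Euler step of dx/dt = alpha*x - beta*x*y,
    dy/dt = delta*x*y - gamma*y."""
    dx = (alpha * x - beta * x * y) * dt
    dy = (delta * x * y - gamma * y) * dt
    return x + dx, y + dy

# Hypothetical reading: x as aggregate fiscal deficits, y as the
# central-bank policy response; parameter values are assumptions.
x, y = 10.0, 5.0
trajectory = [(x, y)]
for _ in range(1000):
    x, y = lotka_volterra_step(x, y, 1.1, 0.4, 0.1, 0.4, 0.01)
    trajectory.append((x, y))
```

The two state variables oscillate around the interior equilibrium (gamma/delta, alpha/beta), which is the kind of mutual feedback the differentiable-system approach makes analyzable.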
Application to CAE systems
(2016)
Due to the broad acceptance of CAD-systems based on 3D solids, the geometric data of all common CAE (Computer-Aided Engineering) software, at least in mechanical engineering, are based on these solids. We use solid models, where the space filled by material is defined in a simple and easily useable way. Solid models allow for the development of automated meshers that transform solid volumes into finite elements. Even after some unacceptable initial trials, users are able to generate meshes of non-trivial geometries within minutes to hours, instead of days or weeks. Once meshing had no longer been the cost limiting factor of finite element studies, numerical simulation became a tool for smaller industries as well.
In the early days of automated meshing development, there were discussions over the use of tetragonal (Fig. 4.1) or hexagonal-based meshes. But after a short period of time it became evident that there were, and will always be, many problems using automated meshers to generate hexagonal elements. So today nearly all automated 3D-meshing systems use tetragonal elements.
The character of knowledge-intensive processes is that participants decide on the next process activities on the basis of the information at hand and their expert knowledge. The decisions of these knowledge workers are in general non-deterministic. It is not possible to model these processes in advance and to automate them using the process engine of a BPM system. Hence, in this context a process instance is called a case, because there is no predefined model that could be instantiated. Domain-specific or general case management systems are used to support the knowledge workers. These systems provide all case information and enable users to define the next activities, but they have no or only limited activity recommendation capabilities. In the following paper, we present a general concept for a self-learning system based on process mining that suggests the next best activity for a given case, drawing on quantitative and qualitative data. As a proof of concept, it was applied to the area of insurance claims settlement.
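The core idea of recommending a next activity from historical cases can be sketched with a minimal frequency-based recommender over an event log. This is an illustrative toy, not the presented self-learning system; the activity names and the claims log below are invented:

```python
from collections import Counter, defaultdict

def train(logs):
    """Count direct-succession frequencies from completed case traces."""
    transitions = defaultdict(Counter)
    for trace in logs:
        for current, nxt in zip(trace, trace[1:]):
            transitions[current][nxt] += 1
    return transitions

def recommend(transitions, last_activity):
    """Suggest the historically most frequent follow-up activity."""
    followers = transitions.get(last_activity)
    if not followers:
        return None  # unseen or final activity: no recommendation
    return followers.most_common(1)[0][0]

# Hypothetical insurance-claims event log (one trace per completed case)
logs = [
    ["register claim", "assess damage", "approve", "pay"],
    ["register claim", "assess damage", "reject"],
    ["register claim", "assess damage", "approve", "pay"],
]
```

For example, `recommend(train(logs), "assess damage")` returns `"approve"`, since approval follows assessment in two of the three historical cases. A real system would also weigh case attributes, i.e. the quantitative and qualitative data the concept mentions.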
Like many others, fashion companies have to deal with a global and very competitive environment. Thus, companies rely on accurate sales forecasts as a key success factor of efficient supply chain management. However, forecasters have to take into account certain specificities of the fashion industry. To respond to these constraints, a variety of forecasting methods exists, including new, computer-based predictive analytics. After an evaluation of different methods, their application to the fashion industry is investigated through semi-structured expert interviews. Despite several benefits, predictive analytics is not yet frequently used in practice. This research does not only reflect an industry profile, but also gives important insights into the future potential and obstacles of predictive analytics.
Literature reviews are essential for any scientific work, both as part of a dissertation and as a stand-alone work. Scientists benefit from the fact that more and more literature is available in electronic form, and finding and accessing relevant literature has become easier through scientific databases. However, the traditional literature review method is characterized by a highly manual process, while technologies and methods in big data, machine learning, and text mining have advanced. Especially in areas where research streams are rapidly evolving and topics are becoming more comprehensive, complex, and heterogeneous, it is challenging to provide a holistic overview and identify research gaps manually. Therefore, we have developed a framework that supports the traditional approach to conducting a literature review with machine learning and text mining methods. The framework is particularly suitable in cases where a large amount of literature is available and a holistic understanding of the research area is needed. It consists of several steps in which the critical mind of the scientist is supported by machine learning. The unstructured text data is transformed into a structured form through data preparation realized with text mining, making it amenable to various machine learning techniques. A concrete example in the field of smart cities makes the framework tangible.
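The data-preparation step, turning unstructured text into a structured form that machine learning techniques can work on, can be sketched with plain TF-IDF weighting. This is a minimal illustration of that step, not the framework itself; the example documents are invented:

```python
import math
from collections import Counter

def tfidf(corpus):
    """Map each document (string) to a {term: tf-idf weight} vector."""
    tokenized = [doc.lower().split() for doc in corpus]
    n_docs = len(tokenized)
    doc_freq = Counter(term for doc in tokenized for term in set(doc))
    vectors = []
    for doc in tokenized:
        counts = Counter(doc)
        vectors.append({
            term: (count / len(doc)) * math.log(n_docs / doc_freq[term])
            for term, count in counts.items()
        })
    return vectors

# Invented stand-ins for paper abstracts in a review corpus
docs = [
    "smart city mobility data platforms",
    "smart city energy data analytics",
    "textile fibre spectroscopy",
]
vectors = tfidf(docs)
```

Terms that occur in only one document (such as "textile" above) receive higher weights than terms shared across the corpus, which is what lets clustering or topic models separate research streams downstream.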
Development work within an experimental environment, in which certain properties are investigated and optimized, requires many test runs and is therefore often associated with long execution times, costs and risks. This affects product, material and technology development in industry and research alike. New digital driver technologies offer the possibility to automate complex manual work steps cost-effectively, to increase the relevance of the results and to accelerate the processes many times over. In this context, this article presents a low-cost, modular and open-source machine vision system for test execution and evaluates it on the basis of a real industrial application. For this purpose, a methodology is presented for the automated execution of the load intervals, for process documentation and for the evaluation of the generated data by means of machine learning to classify wear levels. The software and the mechanical structure are designed to be adaptable to different conditions and components and to a variety of tasks in industry and research. The mechanical structure is required for tracking the test object and represents a motion platform positioned independently by machine vision operators or machine learning. The state of the test object is evaluated via transfer learning after the initial documentation run. The manual procedure for classifying the visually recorded data on the state of the test object is described for the training material. This leads to increased resource efficiency on the material as well as the personnel side, since on the one hand the significance of the tests performed is increased by the continuous documentation and on the other hand the responsible experts can be assigned time-efficiently. The presence and know-how of the experts are therefore only required for defined and decisive events during the execution of the experiments.
Furthermore, the generated data are suitable for later use as an additional source of data for predictive maintenance of the developed object.
This paper reports an analysis of the application and impact of FMEA on the susceptibility of generic IT networks. It is not new that in communication systems the frequency and the data transmission rate play a very important role. The rapid increase in the miniaturization of electronic devices leads to a very high sensitivity to electromagnetic interference. Since the IT network, with its data transfer rate, contributes substantially to this development, it is very important to monitor its functionality. Therefore, tests are performed to observe and ensure the data transfer rate of IT networks under IEMI. A fault tree model is presented, and the effects observed during the radiation of disturbances on a complex system by HPEM interference sources are described using a continuous and consistent model from the physical layer to the application layer.
To illustrate the power and the pitfalls of Bionic Optimization, we will show some examples spanning classes of applications and employing various strategies. These applications cover a broad range of engineering tasks. Nevertheless, there is no guarantee that our experiences and our examples will be sufficient to deal with all questions and issues in a comprehensive way. As a general rule, for each class of problems novices should begin with a learning phase. In this introductory phase, we use simple and quick examples, e.g. small FE-models, linear load cases, short time intervals and simple material models. Here, beginners within the Bionic Optimization community can learn which parameter combinations to use. In Sect. 3.3 we discuss strategies for accelerating optimization studies. Making use of these parameters as starting points is one way to set the specific ranges, e.g. the number of parents and kids, crossing, mutation radii and the number of generations. On the other hand, these trial runs will doubtless indicate that Bionic Optimization needs large numbers of individual designs, and considerable time and computing power. We recommend investing enough time in preparing each task in order to avoid frustration should large jobs fail after long calculation times.
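The parameters mentioned above (parents, kids, mutation radii, generations) can be made concrete with a minimal evolutionary strategy minimizing a toy objective. This is a sketch under assumed parameter values, not the book's implementation; a real Bionic Optimization run would evaluate each design with an FE model instead of an analytic function:

```python
import random

def evolve(objective, start, parents=3, kids=12, mutation_radius=0.5,
           generations=50, seed=0):
    """Elitist (mu + lambda)-style search: mutate parents within the
    mutation radius, then keep the best designs as the next parents."""
    rng = random.Random(seed)
    population = [start]
    for _ in range(generations):
        offspring = []
        for parent in population:
            for _ in range(kids // max(len(population), 1)):
                kid = [g + rng.uniform(-mutation_radius, mutation_radius)
                       for g in parent]
                offspring.append(kid)
        ranked = sorted(population + offspring, key=objective)
        population = ranked[:parents]
    return population[0]

# Toy objective standing in for an expensive FE evaluation
best = evolve(lambda v: sum(g * g for g in v), start=[2.0, -1.5])
```

Even this toy run makes the cost argument visible: 50 generations with 12 kids each means 600 design evaluations, which is why computing power and preparation time dominate real studies.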