Real estate markets are known to fluctuate. The real estate market in Stuttgart, Germany, has been booming for more than a decade: square-meter prices have hit record levels, and real estate agents claim that market prices will continue to increase. In this paper, we test this market understanding by developing and analyzing a system dynamics model that depicts the Stuttgart real estate market. Simulating the model explains oscillating behavior arising from significant time delays and endogenous feedback structures – and not necessarily from oscillating interest rates, as market experts assume. Scenarios provide insights into how the system reacts to changes exogenous to the model. The first scenario tests market development under increasing interest rates. The second scenario deals with possible effects on the real estate market if the regional automotive economy suffers from intense competition with new market players entering with alternative fuel vehicles and new technologies. With a policy run, we test market structure changes to eliminate cyclical effects. The paper confirms that the business cycle in the Stuttgart real estate market arises from within the system's underlying structure, thus emphasizing the importance of understanding feedback structures.
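The cycle mechanism the paper identifies, delays plus endogenous feedback, can be illustrated with a minimal stock-and-flow sketch. This is not the paper's model: the structure and every parameter below are invented for illustration. Supply reacts to a price signal through a first-order construction delay, the interest rate is held constant, and the price still oscillates.

```python
# Minimal stock-and-flow sketch of a housing market with a construction
# delay. NOT the paper's model: structure and parameters are invented
# for illustration only.

def simulate(years=60, dt=0.25):
    supply = 100.0        # housing stock (units)
    pipeline = 0.0        # units under construction
    demand = 100.0        # constant exogenous demand (no interest-rate change)
    delay = 3.0           # years from construction start to completion
    sensitivity = 0.8     # construction starts per unit of price signal
    prices = []
    for _ in range(int(years / dt)):
        price = demand / supply                        # scarcity raises price
        starts = max(sensitivity * (price - 1.0) * supply, 0.0)
        completions = pipeline / delay                 # first-order delay
        pipeline += (starts - completions) * dt
        supply += (completions - 0.01 * supply) * dt   # 1 %/yr demolition
        prices.append(price)
    return prices

prices = simulate()
```

Even with constant demand and no interest-rate input, the delayed feedback produces damped price cycles, which is the qualitative point of the argument.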
Today, 40 Gbps transmission over four-pair balanced cabling is under development in IEEE 802.3bq. In this paper, we describe a 25 Gbps transmission experiment enabling either a single-pair transmission of 25 Gbps over a 30-meter balanced cabling channel or a 100 Gbps transmission via a four-pair balanced channel. A scalable matrix modeling tool is introduced which allows the prediction of the transmission characteristics of a channel, taking mode conversion into account. We applied this tool to characterize PCB channels, including the magnetics and the PCB, for a four-pair 100 Gbps transmission. We evaluated prototype cables and connecting hardware for frequencies up to 2 GHz and beyond. Finally, we investigated possible line encoding schemes and provide measurement results of a transmission over 30 m with a data rate of 25 Gbps per twisted pair.
The 21st century: an era where emojis and hashtags find their way into every sentence, where taking selfies, live tweeting and mining bitcoin are the norm, and where Insta-culture dictates what we say and do. This is the era into which the digital native was born. With so many changes in every aspect of our lives, how is it that one of the most influential aspects, our education, has remained unchanged? Our education system not only fails to appeal to today's students, but more importantly, it fails to equip them with the skills required in the 21st century. It is thus no surprise that industries feel graduates entering the workplace lack skills in critical thinking, problem solving and self-directed learning. AI, machine learning and big data: tools and mechanisms we so eagerly incorporate to create smart factories, yet are hesitant to use elsewhere. Gamification and games have shown great results in education and training, with most research suggesting a stronger focus on personalization and adaptation. When combined with analytics and machine learning, the potential of games is yet to be realized. A real-time adaptive game would not only always present an appropriate degree of challenge for the individual, but would also allow for a shift in focus from the recitation of facts to the application of information filtered to solve the particular problem at hand. South Africa, a country faced with a severe skills gap, could benefit greatly from games. If used correctly, they may just offer a desperately needed contribution toward equipping both current and future employees with the skills needed to survive in the 21st century. This paper explores the feasibility of using such games for enhanced knowledge dissemination and the upskilling of the workforce.
A rapidly growing population and the increasing number of shipments induced by e-commerce are two of the main reasons for constantly rising urban freight traffic. Cities are therefore overwhelmed by a growing stream of goods, and the available infrastructure, shared between passenger and goods traffic, has often reached its maximum capacity. Phenomena such as traffic congestion, pollution and lack of space are direct consequences of this trend, and their impact on the quality of life in the city is not negligible. City administrations are keen to evaluate innovative city logistics concepts and adopt alternative solutions to overcome the challenges posed by such a dynamic environment, constrained within the existing infrastructure. In this paper, a heuristic method based on utility analysis is presented. Thanks to a modular approach accounting for stakeholders' requirements, possible scenarios and available technologies, the development of new city logistics concepts is supported. The proposed method is then applied to a case study concerning the city of Reutlingen (Germany). Results are presented and a brief discussion leads to the conclusion.
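The utility-analysis core of such a heuristic can be sketched as a weighted scoring of concept alternatives against stakeholder criteria. The criteria, weights and scores below are invented for illustration and do not come from the paper:

```python
# Minimal utility-analysis (Nutzwertanalyse) sketch: score city-logistics
# concepts against weighted stakeholder criteria. Criteria, weights and
# scores are purely illustrative, not taken from the paper.

criteria = {"emissions": 0.4, "cost": 0.35, "space_use": 0.25}  # weights sum to 1

# Hypothetical scores on a 1-10 scale per concept and criterion.
concepts = {
    "micro_hub_cargo_bikes": {"emissions": 9, "cost": 5, "space_use": 7},
    "night_delivery_trucks": {"emissions": 4, "cost": 8, "space_use": 6},
}

def utility(scores):
    """Weighted sum of criterion scores."""
    return sum(criteria[c] * scores[c] for c in criteria)

# Rank concepts by descending overall utility.
ranking = sorted(concepts, key=lambda k: utility(concepts[k]), reverse=True)
```

The modular aspect of the method would enter here as different criterion sets and weights per stakeholder group and scenario.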
The members of the European TRIZ Campus (ETC) have been learning from and working together with many honorable members of MATRIZ Official for many years and feel very connected to the official International TRIZ Association.
To further spread the TRIZ methodology and TRIZ teaching in the European area, the ETC has put a lot of thought over the past 12 months into making TRIZ accessible to a broader audience; getting more professionals in touch with the methodology was one of the focal points.
To this end, we have developed new formats such as the "Trainer Day" to support trainers on their way into practice. We have drawn up detailed quality guidelines for the teaching of the TRIZ methodology, which are intended to provide orientation for the design of training classes and documentation. We strive for exchange with representatives of "neighbouring" methods such as Six Sigma, Lean, DFMA and Design Thinking to indicate synergies and added value among methods and approaches of different kinds. We are testing formats for community building, in order to connect users everywhere more strongly with the TRIZ methodology through communication and information offers. If TRIZ users feel alone in their organizations, exchange outside their organization helps them keep up with the TRIZ methodology. Moreover, the ETC strives to increase the ability to communicate the benefits of TRIZ usage inside organizations. We discuss how to reach teachers and students of all ages, to make this unique way of inventive thinking accessible to them.
In our paper we want to give other MATRIZ Official members insights and share our experiences and best practices with our fellow MO members.
The high system flexibility necessary for the full automation of complex and unstructured tasks leads to increased complexity, and thus higher costs. On the other hand, the effectiveness and performance of such systems decrease, explaining the unfulfilled potential of robotics in sectors such as intralogistics, where the benefits of a robotic solution rarely justify its costs. Departing from the false idea that a task should be either fully automated or fully manual, this paper presents a method for the design of a lean human-robot interaction (HRI) with the objective of the "right level of automation", where functions are divided among human and automated agents so that the overall process gains in performance and/or costs. ... The 10 progressive steps of the method are presented and discussed with reference to their graphical tool: the House of Quality Interaction.
The financial crisis of 2007-2010 was probably one of the greatest, most lustrous black-swan events that people of our generation(s) will experience – and at its heart, it was a dynamic phenomenon. It is stated in the vision of the System Dynamics Society that we aspire to transform society by influencing decision-making. Yet, it seems as if system dynamics did not play any significant role in this crisis: we did not examine the markets, we did not provide insights to banks, and we did not warn governments or the people. In our presentation we describe the dynamics involved in a housing bubble, and describe what made the last one different. With the insights gained from this exercise we conclude that, from a system dynamics perspective, the dimension of the financial crisis of 2007-2009 was eminently foreseeable, which will lead us to pose the following question: where were we as a field while this crisis was unfolding, why were we not active players? We present a range of potential answers to this question, hoping to provoke some reflection… and maybe some (re)action.
Blockchain is a technology for the secure processing and verification of data transactions based on a distributed peer-to-peer network that uses cryptographic processes, consensus algorithms, and backward-linked blocks to make transactions virtually immutable. Within supply chain management, blockchain technology offers potential for increasing supply chain transparency, visibility, automation, and efficiency. However, its complexity requires future employees to have comprehensive knowledge of the functionality of blockchain-based applications in order to be able to apply their benefits to scenarios in supply chains and production. Learning factories represent a suitable environment allowing learners to experience new technologies and to apply them to virtual and physical processes throughout value chains. This paper presents a concept to practically transfer knowledge about the technical functionality of blockchain technology to future engineers and software developers working within supply chains and production operations, in order to sensitize them to the advantages of decentralized applications. First, the concept proposes methods to playfully convey immutable backward-linked blocks and the embedding of blockchain smart contracts. Subsequently, the students use this knowledge to develop blockchain-based application scenarios by means of an exemplary product in a learning factory environment. Finally, the developed solutions are implemented with the help of a prototypical decentralized application, which enables a holistic mapping of supply chain events.
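The backward-linked blocks mentioned above lend themselves to exactly the kind of playful demonstration the concept proposes. A minimal sketch (our illustration, not the paper's teaching material): each block stores the hash of its predecessor, so modifying any block invalidates every later link.

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain, data):
    """Link a new block to the hash of the previous one."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"data": data, "prev_hash": prev})

def is_valid(chain):
    """Recompute every link; a modified block breaks all later links."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

# Record a few supply chain events as linked blocks.
chain = []
for event in ["goods received", "quality checked", "shipped"]:
    append_block(chain, event)

assert is_valid(chain)
chain[1]["data"] = "tampered"   # immutability demo: validation now fails
assert not is_valid(chain)
```

A production blockchain adds distribution and consensus on top, but this hash-linking is the core of the "virtually immutable" property.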
The production environment faces copious challenges, but likewise discovers many new potential opportunities. To meet the new requirements caused by the development towards mass customization, human-robot cooperation (HRC) has been identified as a key technology and is becoming more and more important. HRC combines the strengths of robots, such as reliability, endurance and repeatability, with the strengths of humans, for instance flexibility and decision-making skills. Notwithstanding the high potential of HRC applications, the technology has not achieved a breakthrough in production so far. Studies have shown that one of the biggest obstacles to implementing HRC is the allocation of tasks. Another key technology that offers various opportunities to improve the production environment is Artificial Intelligence (AI). Therefore, this paper describes an AI-supported method to improve work organization in HRC with regard to task allocation. The aim of this method is to build a dynamic, semi-autonomous group work environment which keeps not just employee motivation at a high level, but also product quality, due to a decreased failure rate. The AI helps to detect the optimal condition in which the employee delivers the best performance and also supports identifying the point at which the worker leaves this optimal state. As soon as the employee reaches this trigger event, the allocation of tasks adapts based on the identified stress. This adaptation aims to return the employee to the state of optimal performance. In order to realize such a dynamic allocation, this method describes the creation of a pool of various interaction scenarios, as well as the AI-supported recognition of the defined trigger event.
The supply of customer-specific products is leading to increasing technical complexity of machines and plants in the manufacturing process. In order to ensure the availability of the machines and plants, maintenance is considered essential. The application of cyber-physical systems enables this complexity to be mastered by improving the availability of information, implementing predictive maintenance strategies and providing all relevant information in real time. The present research project deals with the development of a cost-effective and retrofittable smart maintenance system for the application of ultraviolet (UV) lamps. UV lamps are used in a variety of applications such as curing of materials and water disinfection, where UV lamps are still used instead of UV LEDs due to their higher effectiveness. The smart maintenance system enables continuous condition monitoring of the UV lamp through the integration of sensors. The data obtained are compared with data from existing lifetime models of UV lamps to provide information about the remaining useful lifetime of the UV lamp. This ensures needs-based maintenance measures and more efficient use of UV lamps. Furthermore, it is important to have accurate information on the remaining useful lifetime of a UV lamp, as the unplanned breakdown of a UV lamp can have far-reaching consequences. The key element is the functional model of the envisioned cyber-physical system, describing the dependencies between the sensors and actuator, the condition monitoring system as well as the IoT platform. Based on the requirements developed and the functional model, the necessary hardware and software are selected. Finally, the system is developed and retrofitted to a simulated curing process of a 3D printer to validate its functional capability. The developed system leads to improved availability of information on the condition of UV lamps, predictive maintenance measures and context-related provision of information.
Today's logistics systems are characterized by uncertainty and constantly changing requirements. Rising demand for customized products, short product life cycles and a large number of variants increase the complexity of these systems enormously. In particular, intralogistics material flow systems must be able to adapt to changing conditions at short notice, with little effort and at low cost. To fulfil these requirements, the material flow system needs to be flexible in three important parameters, namely layout, throughput and product. While the scope of these flexibility parameters is described in the literature, the respective effects on an intralogistics material flow system and the influencing factors are mostly unknown. This paper describes how the flexibility parameters of an intralogistics system can be determined using a multi-method simulation. The study was conducted in the learning factory "Werk150" on the campus of Reutlingen University, with its different means of transport and processes, and validated through practical experiments.
The aim of this paper is to show to what extent Artificial Intelligence can be used to optimize forecasting capability in procurement, as well as to compare AI with traditional statistical methods. At the same time, this article presents the status quo of the research project ANIMATE. The project applies Artificial Intelligence to forecast customer orders in medium-sized companies.
Precise forecasts are essential for companies' planning, decision-making and controlling. Forecasts are applied, for example, in the areas of supply chain, production and purchasing. Medium-sized companies face major challenges in using suitable methods to improve their forecasting ability.
Companies often use proven methods from classical statistics such as the ARIMA algorithm. However, simple statistical methods often fail when applied to complex non-linear predictions.
Initial results show that even a simple MLP ANN produces better results than traditional statistical methods. Furthermore, a baseline of the company (Implicit Sales Expectation) was used to compare performance. This comparison also shows that the proposed AI method is superior.
Until the developed method becomes part of corporate practice, it must be further optimized. The model has difficulties with strong declines, for example due to holidays. The authors are certain that the model can be further improved. For example, through more advanced methods, such as a FilterNet, but also through more data, such as external data on holiday periods.
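A baseline comparison of this kind boils down to scoring competing forecasts against actuals with a common error metric. Below is a minimal sketch with toy data and deliberately simple methods; it reproduces neither the project's MLP model nor its Implicit Sales Expectation baseline.

```python
# Toy comparison of two simple forecasters against actual demand.
# Data and methods are illustrative only.

actuals = [120, 130, 125, 140, 150, 145, 160, 155]

def naive_forecast(series):
    """Forecast each period as the previous period's value."""
    return series[:-1]            # predictions for periods 2..n

def moving_average_forecast(series, window=3):
    """Forecast each period as the mean of the preceding `window` values."""
    return [sum(series[i - window:i]) / window
            for i in range(window, len(series))]

def mse(pred, true):
    """Mean squared error between forecasts and actuals."""
    return sum((p - t) ** 2 for p, t in zip(pred, true)) / len(pred)

naive_err = mse(naive_forecast(actuals), actuals[1:])
ma_err = mse(moving_average_forecast(actuals), actuals[3:])
```

The same scaffolding works for any forecaster: swap in a model's predictions and compare its MSE against the baseline's, which is how superiority claims like the one above are substantiated.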
Due to Industry 4.0, the entire value creation process has the chance to undergo a fundamental technological transformation, the realisation of which, however, requires the commitment of every company for its own benefit. The new approaches of Industry 4.0 are often hardly evaluated, let alone proven, so that SMEs in particular often cannot properly estimate the potentials and risks, and often wait too long with the migration towards Industry 4.0. In addition, they often do not pursue an integrated concept in order to identify possible potentials through changes in their business models. As part of the research project "GEN-I 4.0 – Geschäftsmodell-Entwicklung für die Industrie 4.0", the ESB Business School at Reutlingen University of Applied Sciences and the Fraunhofer Institute for Industrial Engineering and Organization (Fraunhofer IAO) were engaged by the Baden-Württemberg Foundation from 2016 to 2018 to develop tools and an approach by which the local economy can develop digital business models for itself in a methodical, beneficial and targeted manner. Through international analyses and interviews, GEN-I 4.0 gained and concretized the knowledge required for the evaluation and selection of solutions and approaches for developing digital business models. Together with the project partners' know-how on Industry 4.0 and business model development, the findings were incorporated into the development of two software tools that show SMEs the potentials of Industry 4.0 for their individual business model, online and in self-assessment, together with their individual risk, and provide a comprehensively structured, concrete approach to development. Users of the tools are supported by the selected platform for networking different players to implement innovative business models, accompanied by coaching concepts for the companies in the follow-up and implementation of the assessment results.
The early incorporation of experience gained through intelligence and data analysis is becoming increasingly important for developing new products, leading to a completely different conception of product creation, development and engineering processes that exploits the advantages a dedicated digital twin entails. We introduce a novel stage-gate process, holistically anchored in learning factories, which adopts idea generation and idea screening at an early stage, beta testing of first prototypes, technical implementation in real production scenarios, business analysis, market evaluation, pricing, service models as well as innovative social media portals. Corresponding product modelling in the sense of sustainability, circular economy and data analytics forecasts the product on the market both before and after market launch, with data interpretation interlinked in near real time. The digital twin represents the link between the digital model and the digital shadow. Additionally, the connection of the digital twin with the product provides constantly updated operating status and process data as well as a mapping of technical properties and real-world behaviour. A future networked product with embedded information technology, able to initiate and carry out its own further development, can interact with people and environments and is thus relevant to the way of life of future generations. In today's development work for this new product creation approach, "Werk150" is on the one hand the object of the development itself and on the other hand the validation environment. In the next step, new learning modules and scenarios for trainings at master level will be derived from these findings.
Forecasting intermittent demand time series is a challenging business problem. Companies have difficulties in forecasting this particular form of demand pattern. On the one hand, it is characterized by many non-demand periods, and therefore classical statistical forecasting algorithms, such as ARIMA, only work to a limited extent. On the other hand, companies often cannot meet the requirements for good forecasting models, such as providing sufficient training data. The recent major advances of artificial intelligence in applications are largely based on transfer learning. In this paper, we investigate whether this method, originating from computer vision, can improve the forecasting quality of intermittent demand time series using deep learning models. Our empirical results show that, in total, transfer learning can reduce the mean square error by 65 percent. We also show that especially short (65 percent reduction) and medium-long (91 percent reduction) time series benefit from this approach.
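For context, the classical alternative for such series is Croston's method, which smooths the non-zero demand sizes and the inter-demand intervals separately. The sketch below shows that classical baseline, not the paper's transfer-learning approach, and the toy series is invented:

```python
# Minimal Croston's method sketch for intermittent demand: exponentially
# smooth the non-zero demand sizes and the intervals between demands
# separately. Classical baseline, not the paper's deep-learning approach.

def croston(series, alpha=0.1):
    """Return a flat per-period demand forecast after the last observation."""
    size = interval = None
    periods_since_demand = 1
    for d in series:
        if d > 0:
            if size is None:                 # initialize on first demand
                size, interval = d, periods_since_demand
            else:                            # exponential smoothing updates
                size += alpha * (d - size)
                interval += alpha * (periods_since_demand - interval)
            periods_since_demand = 1
        else:
            periods_since_demand += 1
    if size is None:                         # no demand observed at all
        return 0.0
    return size / interval                   # expected demand per period

forecast = croston([0, 0, 5, 0, 0, 0, 3, 0, 4, 0])
```

The many zero periods are exactly what breaks plain smoothing or ARIMA on such series, which is why the sizes and intervals are modelled separately here.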
The high system flexibility necessary for the full automation of complex and unstructured tasks leads to increased technological complexity, and thus to higher costs and lower performance. In this paper, after an introduction to the different dimensions of flexibility, a method for the flexible modular configuration and evaluation of systems of systems is introduced. The method starts from process requirements and, considering factors such as feasibility, development costs, market potential and effective impact on current processes, enables the evaluation of a flexible system of systems equipped with the needed functionalities before its actual development. This allows the focus to be set on those aspects of flexibility that add market value to the system, thus promoting the efficient development of systems addressed to interested customers in intralogistics. An example application of the method is given and discussed.
According to several surveys and statistics, the great majority of companies previously not accustomed to automation are piloting solutions to automate business processes. Those accustomed to automation also attempt to introduce more of it, focusing on automation-unfriendly processes that have remained manual. However, since the decision on what and whether to automate is non-trivial for evident reasons, even industry leaders may get stuck on an overwhelming question: where to begin automating? The question too often remains unanswered, as state-of-the-art methods fail to consider the whole picture. This paper introduces a holistic approach to decision-making for investments in automation. The method supports the iterative analysis and evaluation of operative processes, providing tools for a quantitative approach to decision-making. Thanks to the method, a large pool of processes can first be considered and then filtered in order to select the one that yields the best value for automation in the specific context. After introducing the method, a case study is reported for validation, before the discussion.
The EU-funded project RobLog recently developed a system able to autonomously unload coffee sacks from a standard container. Being the first of its kind, further development is needed for the system to be competitive against manual labor. Financing this development entails a risk, and hence a justified skepticism, which can be overcome by a long-sighted view of the existing market potential. This paper presents a method to estimate the market potential of autonomous unloading systems for heavy deformable goods. Starting from an analysis of the coffee trade, the current coffee traffic is first investigated in order to calculate the number of autonomous systems needed to handle the imported sacks. Results are validated, and the method is extended to calculate the potential of other market segments where the same unloading technology can be applied.
Prior studies have ascribed people's poor performance in dealing with basic systems concepts to different causes. While results indicate that, among other things, domain-specific experience and familiarity with the problem context play a role in this stock-flow (SF) performance, the question has not yet been fully clarified. In this article, we present an experiment that examines the role of educational background in SF-performance. We hypothesize that SF-performance increases when the problem context is embedded in the problem solver's knowledge domain, indicated by educational background. Using the square wave pattern and the sawtooth pattern tasks from the initial study by Booth Sweeney and Sterman (2000), we design two additional cover stories for the former, the Vehicle story from the engineering domain and the Application story from the business domain, next to the original Bathtub story. We then test the three sets of questions on business students. Results mainly support our hypothesis. Interestingly, participants even do better on a more complex behavioral pattern from their knowledge domain than on a simpler pattern from more distant domains. Although these findings have to be confirmed by further studies, they contribute both to the methodology of future surveys and to the context familiarity discussion.
SF-failure, the inability of people to correctly determine the behavior of simple stock and flow structures, has been the subject of a long research stream. SF-failure can be attributed to different causes, one of them being a lack of domain-specific experience, and thus of familiarity with the problem context. In this article we present the continuation of an experiment examining the role of educational background in SF-performance. We base the question set on the Bathtub Dynamics tasks introduced by Booth Sweeney and Sterman (2000) and vary the cover stories. In this paper we describe how we developed and tested a new cover story for the engineering domain and implemented the recommendations from a prior study. We test three sets of questions with engineering students, which enables us to compare the results to a previous study in which we tested the questions with business students. Results mainly support our hypothesis that context familiarity increases SF-performance. With our findings we further develop the methodology of research on SF-failure.
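The stock-flow logic behind such bathtub-style tasks is plain accumulation: the stock integrates the net flow over time. A generic sketch (our illustration, not the study's test instrument) shows why a square-wave inflow against a constant outflow produces the sawtooth-like stock trajectory participants are asked to infer:

```python
# Generic stock-and-flow accumulation, as in bathtub-style tasks:
# the stock integrates (inflow - outflow) over time.

def stock_trajectory(initial, inflows, outflows):
    """Return the stock level after each period."""
    stock, levels = initial, []
    for inf, out in zip(inflows, outflows):
        stock += inf - out
        levels.append(stock)
    return levels

# Square-wave inflow around a constant outflow: the stock rises while
# inflow exceeds outflow and falls otherwise, tracing a triangle wave.
inflows = [75, 75, 25, 25, 75, 75, 25, 25]
outflows = [50] * 8
levels = stock_trajectory(100, inflows, outflows)
```

The tasks test whether people can perform exactly this inference mentally: reading the flow pattern and producing the accumulated stock pattern.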