Forecasting demand is challenging because different products exhibit different demand patterns. While demand may be constant and regular for one product, it may be sporadic for another, and when demand does occur, it may fluctuate significantly. Forecasting errors are costly, resulting in obsolete inventory or unsatisfied demand. Methods from statistics, machine learning, and deep learning have been used to predict such demand patterns. Nevertheless, it is not clear which algorithm achieves the best forecast for which demand pattern. Therefore, it is still common practice to run a large number of models on a test period and to use the model with the best test result for the actual forecast. This approach is computationally intensive, time-consuming, and in most cases uneconomical. In our paper, we show that a machine learning classification algorithm can predict the best-suited forecasting model from the characteristics of a time series. The approach was developed and evaluated on a dataset from a B2B technical retailer. The classification algorithm achieves a mean ROC-AUC of 89%, which underlines the skill of the model.
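The core of this approach can be sketched as follows; everything here (feature set, classifier choice, synthetic placeholder data) is our own illustration rather than the paper's actual setup:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

def series_features(y):
    """Illustrative demand-pattern characteristics of one series."""
    y = np.asarray(y, float)
    nonzero = y[y > 0]
    gaps = np.diff(np.flatnonzero(y > 0))
    return [
        y.mean(),                             # average demand level
        y.std() / (y.mean() + 1e-9),          # variation of demand
        (y == 0).mean(),                      # share of zero-demand periods
        gaps.mean() if gaps.size else 0.0,    # mean inter-demand gap
        nonzero.std() / (nonzero.mean() + 1e-9) if nonzero.size else 0.0,
    ]

# Placeholder data: synthetic demand histories and, per series, the index
# of the forecasting model that won a one-off backtest (real labels would
# come from that backtest, not from random draws).
rng = np.random.default_rng(0)
series_list = [rng.poisson(rng.uniform(0.2, 5.0), 104) for _ in range(200)]
best_model = rng.integers(0, 3, 200)

X = np.array([series_features(s) for s in series_list])
clf = GradientBoostingClassifier(random_state=0)
print(cross_val_score(clf, X, best_model, scoring="roc_auc_ovr").mean())
```

The labels come from a single one-off backtest per series; afterwards, a new series only needs its characteristics computed instead of a full model tournament.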
Forecasting intermittent and lumpy demand is challenging. Demand occurs only sporadically and, when it does, it can vary considerably. Forecast errors are costly, resulting in obsolete stock or unmet demand. Methods from statistics, machine learning, and deep learning have been used to predict such demand patterns. Traditional accuracy metrics are often employed to evaluate the forecasts; however, they come with major drawbacks, such as ignoring horizontal and vertical shifts over the forecasting horizon as well as stock-keeping and opportunity costs. This leads to a disadvantageous selection of methods for intermittent and lumpy demand forecasts. In our study, we compare methods from statistics, machine learning, and deep learning by applying a novel metric called Stock-keeping-oriented Prediction Error Costs (SPEC), which overcomes the drawbacks associated with traditional metrics. Taking the SPEC metric into account, the Croston algorithm achieves the best result, just ahead of a Long Short-Term Memory neural network.
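A sketch of the SPEC computation, transcribed from our reading of the metric's published definition (verify against the original SPEC paper before relying on it); `a1` weights opportunity costs and `a2` stock-keeping costs:

```python
import numpy as np

def spec(y, f, a1=0.5, a2=0.5):
    """Stock-keeping-oriented Prediction Error Costs.

    y: actual demand per period, f: forecast per period.
    a1 weights opportunity costs (demand not yet covered by forecasts),
    a2 weights stock-keeping costs (forecast quantity not yet consumed).
    Our transcription of the published definition; verify before use.
    """
    y, f = np.asarray(y, float), np.asarray(f, float)
    n = len(y)
    total = 0.0
    for t in range(1, n + 1):
        for i in range(1, t + 1):
            opportunity = min(y[i - 1], y[:i].sum() - f[:t].sum()) * a1
            stock = min(f[i - 1], f[:i].sum() - y[:t].sum()) * a2
            total += max(0.0, opportunity, stock) * (t - i + 1)
    return total / n

# Example: a forecast that orders too early incurs stock-keeping costs,
# one that orders too late incurs opportunity costs.
print(spec(y=[0, 0, 4, 0, 3], f=[2, 2, 0, 3, 0]))
```

Unlike a per-period error such as MSE, the cumulative sums make early and late deliveries cost different amounts, which is exactly the shift-sensitivity the abstract refers to.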
Intermittent time series forecasting is a challenging task that still requires particular attention from researchers. The more irregularly events occur, the more difficult it is to predict them. With Croston's approach in 1972 (Operational Research Quarterly 23(3):289–303), the intermittence and demand of a time series were investigated separately for the first time. He proposed exponential smoothing to generate a forecast that corresponds to the average demand per period. Although this algorithm produces good results in the field of stock control, it does not capture the typical characteristics of intermittent time series in the final prediction. In this paper, we investigate a time series' intermittence and demand individually, forecast the upcoming demand value and inter-demand interval length using recent machine learning algorithms, such as long short-term memories and light gradient-boosting machines, and recombine both forecasts to generate a prediction that preserves the characteristics of an intermittent time series. We compare the results against Croston's approach, as well as recent forecasting procedures in which no split is performed.
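The split that the paper builds on can be illustrated with Croston's original recipe; the paper replaces the exponential smoothing below with LSTM and LightGBM forecasters for the two component series (helper names are ours):

```python
import numpy as np

def split_intermittent(y):
    """Split a demand series into non-zero demand sizes and the
    inter-demand intervals between them (in periods)."""
    y = np.asarray(y, float)
    idx = np.flatnonzero(y > 0)
    sizes = y[idx]
    intervals = np.diff(idx, prepend=-1)  # periods since previous demand
    return sizes, intervals

def croston(y, alpha=0.1):
    """Classic Croston forecast: smooth sizes and intervals separately,
    then return the expected demand per period (size / interval)."""
    sizes, intervals = split_intermittent(y)
    z, p = sizes[0], intervals[0]
    for s, i in zip(sizes[1:], intervals[1:]):
        z += alpha * (s - z)   # smoothed demand size
        p += alpha * (i - p)   # smoothed inter-demand interval
    return z / p

y = [0, 0, 5, 0, 0, 0, 3, 0, 4, 0]
print(split_intermittent(y))   # (array([5., 3., 4.]), array([3, 4, 2]))
print(croston(y))
```

Forecasting the two component series individually and reassembling them lets the prediction keep the zero periods that Croston's per-period average smooths away.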
Forecasting intermittent demand time series is a challenging business problem, and companies have difficulties predicting this particular form of demand pattern. On the one hand, it is characterized by many non-demand periods, so classical statistical forecasting algorithms, such as ARIMA, work only to a limited extent. On the other hand, companies often cannot meet the requirements of good forecasting models, such as providing sufficient training data. The recent major advances of artificial intelligence in applications are largely based on transfer learning. In this paper, we investigate whether this method, originating from computer vision, can improve the forecasting quality of intermittent demand time series using deep learning models. Our empirical results show that, in total, transfer learning can reduce the mean squared error by 65 percent. We also show that especially short (65 percent reduction) and medium-length (91 percent reduction) time series benefit from this approach.
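A minimal sketch of this transfer-learning recipe in Keras, assuming a small LSTM forecaster over 12-period windows; architecture, window length, and the synthetic placeholder data are our assumptions, not the paper's exact setup:

```python
import numpy as np
from tensorflow import keras

# 1) Pretrain on many pooled source series. Placeholder data: windows of
#    12 past periods predict the next period.
rng = np.random.default_rng(0)
X_src = rng.poisson(1.0, (5000, 12, 1)).astype("float32")
y_src = rng.poisson(1.0, (5000, 1)).astype("float32")

model = keras.Sequential([
    keras.Input(shape=(12, 1)),
    keras.layers.LSTM(32, name="encoder"),
    keras.layers.Dense(1, name="head"),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X_src, y_src, epochs=2, verbose=0)

# 2) Transfer: freeze the pretrained encoder and fine-tune only the small
#    output head on the short target series with a low learning rate.
X_tgt, y_tgt = X_src[:64], y_src[:64]  # placeholder target data
model.get_layer("encoder").trainable = False
model.compile(optimizer=keras.optimizers.Adam(1e-4), loss="mse")
model.fit(X_tgt, y_tgt, epochs=5, verbose=0)
```

Freezing the pretrained encoder means the scarce target data only has to fit the small output head, which is one plausible reason short series benefit most.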
This study explores the application of the PatchCore algorithm for anomaly classification in hobbing tools, an area of keen interest in industrial artificial intelligence applications. Despite utilizing only a limited number of training images, the algorithm demonstrates the capability to recognize a variety of anomalies, promising to reduce the time-intensive labeling process traditionally undertaken by domain experts. The algorithm achieved an accuracy of 92%, a precision of 84%, a recall of 100%, and a balanced F1 score of 91%, showcasing its proficiency in identifying anomalies. However, the investigation also highlights that while the algorithm effectively identifies anomalies, it does not primarily recognize domain-specific wear issues. The presented approach is therefore used only for pre-classification, with domain experts subsequently segmenting the images that indicate significant wear. The intention is to employ a supervised learning procedure to identify actual wear. This premise will be investigated further in future research.
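For orientation, the reported scores follow from the usual confusion-matrix definitions; a toy illustration (made-up labels and scores, not the study's data) also shows the design choice of thresholding for 100% recall, so that no worn tool escapes pre-classification:

```python
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

# Made-up labels for illustration: 1 = anomalous tooth image, 0 = normal.
y_true = [1, 1, 1, 0, 0, 0, 0, 1]
# Anomaly scores from a detector such as PatchCore; the threshold is set
# low enough that every true anomaly is caught (recall = 100%), accepting
# some false positives that the domain experts filter out afterwards.
scores = [0.9, 0.8, 0.7, 0.6, 0.3, 0.2, 0.1, 0.75]
y_pred = [int(s >= 0.5) for s in scores]

print(accuracy_score(y_true, y_pred),   # 0.875
      precision_score(y_true, y_pred),  # 0.8
      recall_score(y_true, y_pred),     # 1.0
      f1_score(y_true, y_pred))         # ~0.89
```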
Thorough maintenance of industrial equipment is crucial for the finances of companies. Whereas the purchase of new tools can be expensive, reconditioning special gear often costs just a fraction. In this paper, preliminary steps toward accurate vision-based preventive maintenance of hobbing wheels are investigated. To make robust and reliable decisions about a wheel's condition, tool department specialists require precisely captured images of it. For this reason, a visual control cell is built, whose image acquisition setup depends on correctly aligned hobbing wheels. The tool is placed on a turntable and rotated so that a single tooth is centered in the field of view of the camera mounted on a robot arm. For this alignment task, three main approaches with various preprocessing steps are investigated: a brute-force algorithm, an ORB-feature approach, and an image regression model. The results show that even a brute-force algorithm can be outperformed by a moderately sized deep neural network.
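The ORB-feature variant can be sketched with OpenCV as follows; the file names and the affine-based angle readout are illustrative assumptions, not the paper's pipeline:

```python
import cv2
import numpy as np

# Reference image of a correctly centered tooth and the current camera
# frame (paths are placeholders).
ref = cv2.imread("tooth_reference.png", cv2.IMREAD_GRAYSCALE)
cur = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=1000)
kp1, des1 = orb.detectAndCompute(ref, None)
kp2, des2 = orb.detectAndCompute(cur, None)

# Hamming-distance matcher suits ORB's binary descriptors.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

# Estimate a similarity transform from the best matches and read off the
# in-plane rotation the turntable still has to compensate.
src = np.float32([kp1[m.queryIdx].pt for m in matches[:50]])
dst = np.float32([kp2[m.trainIdx].pt for m in matches[:50]])
M, _ = cv2.estimateAffinePartial2D(src, dst)
angle = np.degrees(np.arctan2(M[1, 0], M[0, 0]))
print(f"rotate turntable by {angle:.2f} degrees")
```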
Long setup times in CNC tool production significantly hinder operational efficiency, characterized by reduced machine utilization, increased planning effort, and subsequent delivery delays and production bottlenecks. These inefficiencies not only escalate production costs but also tie up capital, compromise order flexibility, increase storage expenses, and prevent companies from capitalizing on market opportunities. This paper explores the application of explainable AI to analyze process data from CNC setups, aiming to identify and elucidate the patterns that contribute to prolonged setup durations. By combining AI models with explanation methods, this research transparently highlights critical points for improvement, facilitating targeted interventions to enhance production agility. The outcome is a dual advantage of reduced setup times and operational costs, thereby speeding up the overall manufacturing process. This approach emphasizes innovative manufacturing systems and provides practical insights into using artificial intelligence to enhance efficiency in CNC tool production.
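One common way to realize such an analysis (our illustration, not necessarily the paper's toolchain) is to train a model on setup records and attribute its predictions with SHAP:

```python
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import RandomForestRegressor

# Placeholder setup records: feature names are illustrative.
rng = np.random.default_rng(0)
X = pd.DataFrame({
    "tool_changes":      rng.integers(0, 15, 500),
    "operator_shift":    rng.integers(0, 3, 500),
    "program_is_new":    rng.integers(0, 2, 500),
    "clamping_variants": rng.integers(1, 5, 500),
})
# Synthetic setup duration in minutes, driven mainly by two features.
y = 20 + 4 * X["tool_changes"] + 15 * X["program_is_new"] + rng.normal(0, 3, 500)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# TreeExplainer attributes each predicted duration to the input features,
# exposing which setup conditions drive long setup times.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)
print(pd.Series(np.abs(shap_values).mean(axis=0), index=X.columns)
      .sort_values(ascending=False))
```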
This paper proposes a novel framework - "Transparent Reasoning in Artificial intelligence Cause Explanation" (TRACE) - that combines root cause analysis, explainable artificial intelligence, and machine learning in a manner comprehensible to the shopfloor worker. The goal is to enhance transparency, interpretability, and explainability in AI-driven decision-making processes and to increase the acceptance of AI within industrial manufacturing, with a human-AI collaboration tool as the long-term perspective. The paper outlines the need for such a framework and describes the proposed design science approach for its development.
Enhancing power skiving tool longevity: the synergy of AI and robotics in manufacturing automation
(2024)
In gear manufacturing, the longevity and cost-effectiveness of power skiving tools are essential. This study presents an innovative approach that combines artificial intelligence and robotics in manufacturing automation to prevent tool breakage and improve the remaining useful life (RUL). Using a robotic cell, the system captures six images per tooth from different angles. An unsupervised generative deep learning approach is used because it is more suitable for industrial application: it can be trained with a small number of defect-free images. It is used first as a classifier and then to segment tool wear. This approach promises economic benefits by reducing manual inspection and, through automated tool inspection, by detecting wear earlier to prevent tool breakage.
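A minimal sketch of the two-step idea, assuming a convolutional autoencoder as the unsupervised generative model (the abstract does not name the exact architecture); trained only on defect-free images, high reconstruction error first flags an image and then localizes the wear:

```python
import numpy as np
from tensorflow import keras

# Convolutional autoencoder trained only on defect-free tooth images.
ae = keras.Sequential([
    keras.Input(shape=(128, 128, 1)),
    keras.layers.Conv2D(16, 3, strides=2, padding="same", activation="relu"),
    keras.layers.Conv2D(32, 3, strides=2, padding="same", activation="relu"),
    keras.layers.Conv2DTranspose(32, 3, strides=2, padding="same", activation="relu"),
    keras.layers.Conv2DTranspose(16, 3, strides=2, padding="same", activation="relu"),
    keras.layers.Conv2D(1, 3, padding="same", activation="sigmoid"),
])
ae.compile(optimizer="adam", loss="mse")
good = np.random.rand(64, 128, 128, 1).astype("float32")  # placeholder images
ae.fit(good, good, epochs=2, verbose=0)

# Step 1, classification: a worn tooth reconstructs poorly, so a high
# mean reconstruction error flags the image as anomalous.
img = good[:1]
recon = ae.predict(img, verbose=0)
error_map = np.squeeze((img - recon) ** 2)
is_anomalous = error_map.mean() > 0.01  # threshold tuned on validation data

# Step 2, segmentation: thresholding the per-pixel error map localizes wear.
wear_mask = error_map > 0.05
```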