Technical Sessions A3 - G3
SESSION A3: Applied Operations Research
 An Algorithm for Packing Tubes and Boxes
João Pedro Pedroso, João Nuno Tavares and Jorge Leite
In this paper we describe a method for packing tubes and boxes in containers. Each container is divided into parts (holders) which are allocated to subsets of objects. The method consists of a recursive procedure which, based on a predefined order for dealing with tubes and boxes, determines the dimensions and position of each holder. Characteristics of the objects to pack and rules limiting their placement make this problem unique. The method devised provides timely and practical solutions.
 A STUDY ON CAPACITY PRICING AND RESERVATION PROBLEM UNDER OPTION CONTRACT
Yi Tao, Ek Peng Chew and Loo Hay Lee
Option contracts have been increasingly applied in the air cargo freight industry due to their ability to mitigate the asset provider's capacity utilization risk. By entering into an option contract with an air cargo carrier, freight forwarders reserve a certain amount of capacity upon signing the contract and execute the option partially or completely after the market demand is realized. In this work, we address the capacity pricing and reservation problem under an option contract in the air cargo freight industry. Mathematical models are established to simulate the behavior of the air cargo carrier and the freight forwarders, and optimal pricing and reservation policies are then derived for both parties with the aim of maximizing their expected profits when multiple freight forwarders are involved. Lastly, numerical experiments are conducted.
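The reservation trade-off the abstract describes can be sketched with a Monte Carlo estimate for a single forwarder. The stylized model below (reservation price `c_r`, execution price `c_e`, spot price `c_s`, uniform demand) is an illustrative assumption, not the paper's formulation:

```python
import random

def forwarder_expected_profit(q, c_r, c_e, c_s, price, demand_sampler, n=20000, seed=0):
    """Monte Carlo estimate of a forwarder's expected profit when q units
    of capacity are reserved at price c_r per unit.  After demand D is
    realized, the forwarder executes min(D, q) options at price c_e and
    covers excess demand on the spot market at price c_s (c_e < c_s).
    All parameters are illustrative assumptions, not the paper's model."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        d = demand_sampler(rng)
        executed = min(d, q)
        spot = max(d - q, 0.0)
        total += price * d - c_r * q - c_e * executed - c_s * spot
    return total / n

# Uniform demand on [50, 150]: reserving more capacity trades reservation
# cost against expensive spot purchases.
sampler = lambda rng: rng.uniform(50, 150)
p_low = forwarder_expected_profit(60, c_r=1.0, c_e=4.0, c_s=9.0, price=12.0, demand_sampler=sampler)
p_mid = forwarder_expected_profit(100, c_r=1.0, c_e=4.0, c_s=9.0, price=12.0, demand_sampler=sampler)
```

With these numbers, reserving close to mean demand beats under-reserving, since the analytic optimum balances the reservation cost against the spot premium.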
 OPERATIONAL RULES AND SIMULATOR TO DYNAMIC SLAB STACK SHUFFLING PROBLEM
Gislaine Almeida, Michelle Botelho, Gabriela Breder, Gabriel Bianchi and Leandro Resendo
In the steel industry, the slab yard works as a buffer that stores slabs produced by Continuous Casting Machines (CCM) before they are transformed into coils in the Hot Strip Mill (HSM). The problem of managing this yard's logistics is known as the Slab Stack Shuffling Problem. In this work we propose a simulator (3S-Sim) and a set of operational rules to investigate effective management of the slab yard in a dynamic scenario. The operational rules presented in this work are inspired by good practices observed at a large steel plant. In the numerical results we investigate three management policies for straightening out stacks, in so-called Campaign Areas, and the customization of parameters for each Campaign Area. Results show that, to improve yard effectiveness, the management rules must be customized according to the Campaign Area. Moreover, it is shown that fine-tuning the parameters can improve fulfilment of the analyzed demand in a real-size yard by up to 36.19%.
 Electricity cost and makespan optimization on a single batch processing machine under Time-of-use pricing policy
Junheng Cheng, Feng Chu, Ming Liu and Weili Xia
Time-of-use (TOU) electricity pricing policy offers a good electricity cost saving opportunity for industries, where batch scheduling is often involved. The electricity consumption of a machine includes processing electricity (PE) as well as non-processing electricity (NPE). This work investigates a new bi-objective single batch scheduling problem that simultaneously optimizes the electricity cost of both PE and NPE and the makespan under a TOU pricing policy. For this problem, we first establish a bi-objective mixed-integer nonlinear program. Then, the nonlinear program is transformed into an equivalent linear one by analyzing the optimal strategy for NPE saving. Finally, the equidistant ε-constraint method is adapted to the bi-objective problem to obtain a set of Pareto optimal solutions. Computational results on randomly generated instances show the effectiveness of the proposed approach.
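The equidistant ε-constraint step can be illustrated on a toy enumeration of feasible (electricity cost, makespan) points standing in for the paper's linearized model; this is a generic sketch of the method, not the authors' implementation:

```python
def equidistant_eps_constraint(points, k):
    """Equidistant epsilon-constraint sketch for a bi-objective problem.
    `points` are feasible (cost, makespan) pairs (a toy enumeration
    standing in for the linearized MILP).  The makespan range is split
    into k equidistant epsilon levels; at each level we minimize cost
    subject to makespan <= eps, then filter out dominated points."""
    ms = [m for _, m in points]
    lo, hi = min(ms), max(ms)
    sols = set()
    for i in range(k + 1):
        eps = lo + (hi - lo) * i / k
        feas = [p for p in points if p[1] <= eps + 1e-9]
        if feas:
            sols.add(min(feas))          # lexicographic: min cost, then makespan
    # keep only Pareto-optimal (non-dominated) solutions
    pareto = [p for p in sols
              if not any(q != p and q[0] <= p[0] and q[1] <= p[1] for q in sols)]
    return sorted(pareto)

toy = [(10, 9), (8, 10), (7, 12), (6, 15), (12, 8), (9, 11)]
front = equidistant_eps_constraint(toy, k=6)   # approximated Pareto front
```

Sweeping equidistant ε levels is what spreads the returned solutions evenly along the makespan axis; dominated points such as (9, 11) never survive the final filter.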
SESSION B3: Cellular manufacturing
 A NEW MATHEMATICAL MODEL FOR DYNAMIC CELLULAR MANUFACTURING SYSTEM WITH CONSIDERING SEVERAL RAW MATERIAL WITH DIFFERENT LEAD TIME
Reza Tavakkoli-Moghaddam, Mohammad Kazemi, Shima Shafiee-Gol and Sobhan Mostafayi
This paper presents an integer mathematical programming model for the design of cellular manufacturing systems (CMSs) in a dynamic environment considering several raw materials, each with a different lead time for producing parts. To the best of our knowledge, no previous research in the field of cellular manufacturing has considered different delivery lead times for raw materials. We consider a warehouse for raw materials, so the required materials can be bought in periods when it is economical and stored for use in later periods. The problem is solved with the GAMS software, and computational results for an example are presented to demonstrate the validity of the model. To solve large-scale problems, a hybrid genetic algorithm (HGA) is used.
 CELL FORMATION PROBLEM: A GENETIC ALGORITHM BASED ON AN INTER-OPERATION FLOW MATRIX
Ana Raquel Xambre
When designing a Cellular Manufacturing System an essential step is to solve the cell formation problem: determining which machines and parts belong to each cell. The main purpose is to obtain autonomous cells capable of completely processing the respective family of parts and thus eliminate, or at least reduce, intercellular flow.
Additionally, if there is more than one machine of each type (i.e. if a certain operation can be performed in different machines) the assignment of operations to specific machines becomes part of the cell formation problem.
In this paper an algorithm for the cell formation problem with multiple identical machines, which minimises the intercellular flow, is presented. The algorithm uses the information provided by an inter-operation flow matrix so the real flow, associated with each solution, can be adequately determined. Furthermore, due to the combinatorial nature of this problem, the procedure is based on genetic algorithms in order to improve the exploration of the solution space.
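The fitness evaluation such a genetic algorithm relies on can be sketched from an inter-operation flow matrix: each candidate chromosome assigns operations (and hence machines) to cells, and its fitness is the flow crossing cell boundaries. The small matrix and assignments below are illustrative, not data from the paper:

```python
def intercellular_flow(flow, assignment):
    """Total flow between operations assigned to different cells.
    `flow[i][j]` is the material flow from operation i to operation j
    (the inter-operation flow matrix); `assignment[i]` is the cell of
    the machine performing operation i.  This is the quantity a cell
    formation GA would minimize."""
    n = len(flow)
    return sum(flow[i][j]
               for i in range(n) for j in range(n)
               if assignment[i] != assignment[j])

# 4 operations; operations 0-1 and 2-3 exchange heavy flow.
F = [[0, 5, 1, 0],
     [0, 0, 0, 1],
     [0, 0, 0, 6],
     [0, 0, 0, 0]]
good = intercellular_flow(F, [0, 0, 1, 1])   # cells {0,1} and {2,3}
bad  = intercellular_flow(F, [0, 1, 0, 1])   # splits the heavy flows
```

The GA's selection pressure simply favors chromosomes like `good`, whose cells keep the heavy inter-operation flows internal.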
 PRINCIPLES OF LEAN PLANNING AND CONTROL
Lean production systems use teams instead of functional departments, as well as simple shop floor control methods to manage the flow of orders on the shop floor. Lean shop floor control focuses on robust and visual methods that are able to cope with variation in processing times, routing sequences, disturbances, resource allocation, et cetera. Although the main focus of lean is to eliminate unnecessary variation that causes waste, unreasonableness, and unevenness, some variation relates directly to providing customer value and hence needs to be accommodated in the control system. This paper investigates the fundamental principles behind lean planning and control methods. We describe underlying principles that were developed even before computerized methods such as Material Requirements Planning and Input/Output control became popular. Next we review some recent developments in the use of lean planning and control methods, such as Polca, (M-)Conwip, and Cobacabana.
 Lean and agile productions: an evolution process by seru systems
How do manufacturing firms operate in a highly volatile environment where the rate of technological and competitive change is extreme, where market information is often unpredictable, and where customer demands fluctuate wildly? We introduce how companies implemented a new production system, seru, that combines the power of leanness and agility to overcome the difficulties inherent in a volatile environment. Remarkable seru performance outcomes have been reported. Our work gives a concrete, empirical answer to the question, raised in prior studies [e.g., Narasimhan et al. (2006)], of how to evolve a lean system into an agile one. We also list a number of open questions for this new research area, which spans several academic disciplines and requires a multitude of research methodologies.
SESSION C3: Data Mining, Knowledge Discovery and Computational Intelligence
 FACILITATING LARGE DATA ANALYSIS IN VIRTUAL HUMANITIES AND E-SCIENCE: THE ALIS PARADIGM
Yann Girard, Pierre Saurel and Francis Rousseaux
In this paper, we propose a method based upon recent progress in active learning for researchers or companies who have access to large amounts of unlabeled data. This has become especially important for the relatively new research domains of e-science and virtual humanities. Those disciplines tend to collect large quantities of data that then require intensive analysis. Unsupervised analysis of data, despite being able to give an insight into ontological relationships between elements, cannot account for semantically complex models in the same way supervised analysis does. But setting up a supervised learner requires a lot of annotation work. ALIS is a solution based on the analysis of training sets, meant to reduce the man-hours needed for annotation by setting up evocative training sets right off the bat.
Although ALIS has been used for prospective research for some years, this paper is the first communication about it in an academic setting. We describe below the method along with its computational and mathematical model. We then show its application and results in an experimental setting.
 CASE-BASED REASONING MODEL OF BAYESIAN NETWORK BASED ON MUTUAL INFORMATION
Man Xu and Jiang Shen
Decision-making systems suffer from redundant data features, which lower the search efficiency and accuracy of case-based reasoning (CBR). Traditional Bayesian-network CBR (BN-CBR) does not eliminate redundant feature sets, and thus makes poor use of prior knowledge. This paper presents a case-based reasoning model of a Bayesian network based on mutual information (MI-BNCBR). It uses a mutual information method to eliminate redundancy among the features of the data set, obtain an optimal feature subset, and calculate comprehensive weights for case features based on feature redundancy and mutual information, which improves the utilization of prior knowledge. A K-D tree search method based on nearest-neighbour distance calculation is used to improve the efficiency of Bayesian case inference based on mutual information. Experimental results on benchmark data show that the proposed method improves the accuracy of the decision-making system and the efficiency of knowledge utilization, and that the method is robust and generalizes well.
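The mutual-information ranking at the core of the feature pruning can be sketched for discrete features; the toy sequences below are illustrative, not the paper's benchmark data:

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """I(X;Y) in bits for two discrete sequences, as used to rank (and
    prune redundant) case features before CBR retrieval."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

labels   = [0, 0, 1, 1, 0, 1, 0, 1]
relevant = [0, 0, 1, 1, 0, 1, 0, 1]   # copies the label: maximal MI (1 bit)
noisy    = [0, 1, 0, 1, 0, 0, 1, 1]   # statistically independent of the label
mi_rel = mutual_information(relevant, labels)
mi_noise = mutual_information(noisy, labels)
```

A feature like `noisy` scores zero bits and would be dropped from the feature subset, while `relevant` carries the full entropy of the label and would receive a high comprehensive weight.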
 Visual analytics for exploring the topic evolution of company targeted tweets
Lambert Pépin, Nicolas Greffard, Pascale Kuntz, Julien Blanchard, Fabrice Guillet and Philippe Suignard
Business decision support tools including social media data analysis are required to help managers better understand trends and customer opinions. This paper presents an interactive approach to assist an analyst in tracking topics from Twitter relative to his/her company. Developed to visualize long-term topic evolution, our process is composed of three complementary steps: (i) a time-dependent topic extraction based on Latent Dirichlet Allocation, (ii) a topic relationship detection based on a dissimilarity which evaluates topic proximities between consecutive time slots, and (iii) an interactive topic evolution visualization based on a Sankey diagram, popular in industrial environments for showing dynamic relationships in a system. Our approach has been tested on a real-life dataset from the French energy company EDF: we have analyzed the evolution of a corpus of more than 70,000 tweets related to this company published during one year.
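Step (ii), detecting topic relationships between consecutive time slots, can be sketched with a cosine-based dissimilarity (the paper's exact measure is not reproduced here); linked topic pairs are what would become the edges of the Sankey diagram in step (iii):

```python
from math import sqrt

def cosine_dissimilarity(a, b):
    """1 - cosine similarity between two topic-word weight vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na, nb = sqrt(sum(x * x for x in a)), sqrt(sum(y * y for y in b))
    return 1.0 - dot / (na * nb)

def link_topics(slot_t, slot_t1, threshold=0.3):
    """Topic i of slot t is linked to topic j of slot t+1 when their
    dissimilarity falls under `threshold`; the links mark topic
    continuations across time.  Threshold and data are illustrative."""
    return [(i, j)
            for i, a in enumerate(slot_t)
            for j, b in enumerate(slot_t1)
            if cosine_dissimilarity(a, b) < threshold]

# Two topics per slot over a 4-word vocabulary; each topic drifts slightly.
t0 = [[5, 4, 0, 0], [0, 0, 6, 5]]
t1 = [[4, 5, 1, 0], [1, 0, 5, 6]]
links = link_topics(t0, t1)
```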
 THE ESTABLISHMENT AND DEVELOPMENT OF THE CUSTOMS INFORMATION TECHNOLOGY DATA PLATFORM
Changhu Liu, Sihan Zhang and Bozhi Yu
In recent years, although China's foreign trade has developed rapidly, Customs operations still face challenges. This paper builds on the Nolan, Synnott and Mische models and on the innovative technologies of RFID and ASN. The authors suggest that Customs should establish an information technology data platform to address the difficulties of current Customs management, and propose policies and applications for law enforcement, finance, tax, legislative and related departments and industries.
SESSION D3: Artificial Intelligence in Medicine
 METHOD FOR THE INVESTIGATION OF THE ELECTROMECHANICAL ACTIVITY OF THE HEART USING TIME-FREQUENCY TOOLS
Zied Bouguila, Ali Moukadem, Alain Dieterlen, Christian Brandt, Samuel Schmidt, Ana Castro, Samy Talha and Emmanuel Andres
Waveform analysis of the electrocardiogram (ECG) and phonocardiogram (PCG) for heart monitoring has proven capable of providing important information, which can help reduce the number of heart attacks and resulting deaths. Although of a different nature, the fusion of these two dependent signals could improve the overall recognition performance of the mechanical and electrical activity of the heart.
Acoustic cardiography, synchronizing and analyzing electrical activity and heart sound, permits a more detailed analysis of the timing characteristics of the heart. The portability and low cost of acoustic cardiography makes it useful for the estimation and monitoring of patients with known or suspected heart disease for both clinical and investigational applications.
In this field, time–frequency representation (TFR) methods are used to study the ECG and PCG signals since they are non-stationary bio-signals. Due to its richness in both the time and frequency domains, and its robustness against noise, TFR has recently become a favorite tool. These tools can provide macroscopic time information (S1, S2, Q, R …) and microscopic details (e.g. heart sound splits, murmurs) that are beneficial for diagnosis purposes.
In this study, a number of coordinated research projects in the heart activity analysis (MARS500, E-Care and data from Aalborg University) are presented, and TFR tools are demonstrated to have a large potential in the acoustic cardiography for monitoring heart function. Design of a computer-based monitor using the techniques specified in this study is discussed along with relative strengths and weaknesses of such a system.
 TOWARD A GENERIC METHODOLOGY FOR THE CONSTRUCTION OF A TELEMONITORING SYSTEM
Amine Ahmed Benyahia, Amir Hajjam, Vincent Hilaire and Mohamed Hajjam
Nowadays, telemonitoring systems are increasingly used, due to increasing life expectancy and chronic diseases. Indeed, chronic diseases and disabilities linked to advancing age are responsible for ever-growing health care costs. Telemonitoring systems provide a low-cost way to monitor patients and their needs in the comfort of their own homes. In early systems, data were collected and then sent directly to physicians to be interpreted. Nowadays, thanks to technological advances, software and systems have been developed to process the data on a simple computer or even a smartphone. In this paper, we present the e-Care telemonitoring system, which combines the semantic web and an expert system. E-Care is based on generic ontologies and a decision support system. The decision support system uses ontologies as its knowledge base and an inference engine to detect abnormal situations. The e-Care platform has a generic open architecture, which can include knowledge coming from other systems. We show how to integrate auscultation sound data into this architecture.
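The inference-engine idea, matching observations against knowledge-base rules to flag abnormal situations, can be sketched with a tiny rule table; the vital-sign thresholds below are illustrative assumptions, not e-Care's actual rules:

```python
def detect_abnormal(observations, rules):
    """Minimal rule-matching sketch of a decision-support engine: each
    rule is (name, predicate, message); a rule fires when its predicate
    holds for the current observations."""
    alerts = []
    for name, predicate, message in rules:
        if predicate(observations):
            alerts.append((name, message))
    return alerts

# Hypothetical rules; real thresholds would come from the knowledge base.
RULES = [
    ("tachycardia", lambda o: o.get("heart_rate", 0) > 100,
     "heart rate above 100 bpm"),
    ("hypoxia", lambda o: o.get("spo2", 100) < 92,
     "oxygen saturation below 92%"),
    ("fever", lambda o: o.get("temperature", 37.0) >= 38.0,
     "body temperature at or above 38 C"),
]

alerts = detect_abnormal({"heart_rate": 112, "spo2": 96, "temperature": 38.4}, RULES)
```

An ontology-backed system would generate such rules from the knowledge base rather than hard-code them, but the matching loop is the same in spirit.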
 Efficient algorithms for an optimization problem in a hospital’s pharmacy
Benoît Beroule, Olivier Grunder, Oussama Barakat and Olivier Aujoulat
In this paper, we study the performance of a hospital pharmacy, evaluated through the process of collecting medical devices. This is a well-known Order Batching and Picking problem, which has been studied mainly in industrial warehouses. In general terms, it consists in finding the best batching strategy coupled with a picking scheme in order to collect a list of items using a minimum of available resources. To tackle this problem, we first propose a graph modeling approach which can represent the specific constraints of the considered system, such as narrow aisles or one-way aisles, but can also be applied to other particular configurations. The notion of crossroad is introduced in order to represent the possible connections between the different areas of the pharmacy's warehouse. From this graph modeling, we are able to generate an Integer Programming model derived from the Vehicle Routing Problem, which also needs to be adapted when some medical devices have to be picked together. We then develop a genetic algorithm to solve this problem and compare its results with the solution currently used in the pharmacy and, for tiny instances, with the optimal solution found with the linear formulation. Computational experiments show that the proposed genetic algorithm is very efficient in terms of processing time and is capable of generating near-optimal solutions.
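The graph modeling with crossroads lends itself to standard shortest-path computation between picking locations; a minimal Dijkstra sketch on an illustrative layout, where a one-way aisle forces the return trip through a crossroad:

```python
import heapq

def shortest_path_length(graph, src, dst):
    """Dijkstra on a warehouse graph: nodes are picking locations and
    crossroads, directed edges model one-way aisles.  `graph[u]` is a
    list of (neighbor, distance) pairs.  The layout is illustrative."""
    dist = {src: 0}
    heap = [(0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            return d
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return float("inf")

# A one-way aisle A->B; going back requires the crossroad Y.
G = {"A": [("B", 2), ("X", 1)],
     "B": [("Y", 1)],
     "X": [("Y", 3)],
     "Y": [("A", 4), ("X", 3)]}
forward = shortest_path_length(G, "A", "B")   # along the aisle
back = shortest_path_length(G, "B", "A")      # must detour via Y
```

These pairwise distances are exactly what a VRP-style integer program or a genetic algorithm would consume as its cost matrix.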
 A HARDWARE SOLUTION FOR HEVC INTRA PREDICTION LOSSLESS CODING
Farouk Amish and El-Bay Bourennane
The lossless coding mode of the High Efficiency Video Coding (HEVC) main profile, which bypasses transform, quantization, and in-loop filters, is described. Compared to the HEVC non-lossless coding mode, the HEVC lossless coding mode provides perfect fidelity and an average bit-rate reduction of 3.2%–13.2%. It also significantly outperforms existing lossless compression solutions, such as JPEG2000 and JPEG-LS for images as well as WinRAR for data archiving. A fully parallel solution is presented in this paper in order to reduce the processing time and computational complexity resulting from intra prediction. Two high-performance structures are designed to perform the angular and planar modes, implemented in the five engines which compose our architecture. This solution supports all intra prediction modes for all prediction unit sizes. Synthesis results show that our design can run at 256 MHz on a Xilinx Virtex 7 and is capable of processing 120 1080p frames per second in real time.
SESSION E3: Data Mining, Knowledge Discovery and Computational Intelligence
 BANISHING A TYPE OF WASTE AND ITS IMPACT ON THE COMPANY: AN AUTOMOTIVE FIELD CASE STUDY
Samah Elrhanimi, Laila El Abbadi and Abdellah Abouabdellah
To improve its performance, the modern company leans towards the adoption of Lean manufacturing, which has been presented, since its appearance, as a system allowing the company to reduce its costs, satisfy the needs of its customers, and reduce the overwork of its employees. The basic idea of Lean is the total elimination of waste. However, eliminating all waste can be detrimental to performance, since it can cause stress that translates into a deterioration of working conditions and, consequently, an apparent decrease in productivity.
In this context, our work presents an overview of Lean manufacturing with a focus on its history, its different definitions, and the different types of waste; then, a case study of the elimination of one type of waste, with a focus on its impact. We finish the article with research perspectives.
 A SVR/HADOOP BASED TRAFFIC FORECASTING SCHEME USING BIG OPEN DATA
Road traffic information is useful and valuable for intelligent transportation systems (ITS) and advanced driver assistance systems (ADAS). A precise speed forecasting system can help government traffic departments to devise good traffic strategies and management. Real-time and forecast vehicle speed information is also very useful for many companies and end users, for example in logistics, passenger traffic, and shortest-travel-time path navigation. In this paper, we propose a vehicle speed forecasting scheme using big open traffic data and implement it in a Hadoop environment for analyzing large historical traffic data of Taipei City in Taiwan. The proposed scheme can be divided into several stages. The first stage is data filtering, to detect and remove missing, erroneous, or outlier data from the raw data. Such erroneous data may come from broken or unstable VDs or other unknown causes. This paper proposes an automatic raw data collection system with an XML parser and HBase. The second stage is data splitting for the map function in the Hadoop environment. Then, a multiple Support Vector Regression (SVR) model architecture is proposed to forecast vehicle speed in the map/reduce framework. Besides, we also propose a model selection mechanism to select multiple better but smaller models at the model training stage (map); these are used for forecasting vehicle speed, and the final results are obtained by fusing their outputs (reduce). Each forecasting model can be used for a certain road, time interval, or split dataset.
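The first-stage data filter can be sketched as a range check followed by a simple sigma rule on the cleaned speeds; the thresholds and sample data are illustrative assumptions:

```python
def filter_speeds(raw, v_min=0.0, v_max=120.0, z=2.0):
    """First-stage data filter: drop missing values, physically
    impossible speeds (outside [v_min, v_max] km/h), and z-sigma
    outliers, before the data-splitting stage.  Thresholds are
    illustrative."""
    clean = [v for v in raw if v is not None and v_min <= v <= v_max]
    if len(clean) < 2:
        return clean
    mean = sum(clean) / len(clean)
    var = sum((v - mean) ** 2 for v in clean) / len(clean)
    std = var ** 0.5
    return [v for v in clean if abs(v - mean) <= z * std]

# None = missing record; -3.0 and 250.0 are sensor errors; 110.0 is a
# statistical outlier against the ~50 km/h regime.
raw = [52.0, 48.0, None, 55.0, -3.0, 250.0, 51.0, 49.0, 50.0, 110.0]
clean = filter_speeds(raw)
```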
 LOCATE: INFERRING TIMETABLE OF INDIVIDUALS FROM THE GPS TRACES OF THEIR VEHICLES
The problem we want to solve consists in finding, among locations of interest, the most probable one where a car driver would have been each time his/her car stopped significantly. The goal is then to be able to rebuild a probable timetable of such a person according to the GPS traces of his/her vehicle. This document presents the model and the algorithm developed. The study and the experimental validation have been conducted with data excerpted from the 2014 VAST Challenge.
 A REAL TIME DATA MINING RULES SELECTION MODEL FOR THE JOB SHOP SCHEDULING PROBLEM
Mohamed Habib Zahmani, Baghdad Atmani, Abdelghani Bekrar and Nassima Aissani
Finding the best Dispatching Rule for a Job Shop Scheduling Problem is a tedious task, both time- and cost-consuming. Since no single rule outperforms all the others, in this paper we propose an approach able to assign, in real time, a different Dispatching Rule to each machine while minimizing the makespan. This approach is based on simulation and Data Mining. Experiments show that the proposed system returns good results in terms of both makespan and processing time requirements.
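Why rule choice matters can be illustrated on a deliberately simplified parallel-machine stand-in for the job shop (list scheduling with a rule-ordered queue); this is not the paper's simulation model, and the job data are illustrative:

```python
def dispatch(jobs, n_machines, rule):
    """List-scheduling sketch: order the job queue with a dispatching
    rule (SPT = shortest processing time first, LPT = longest first,
    FIFO = arrival order), then always give the next job to the
    earliest-free machine.  Returns the makespan."""
    if rule in ("SPT", "LPT"):
        order = sorted(jobs, reverse=(rule == "LPT"))
    else:                         # FIFO keeps arrival order
        order = list(jobs)
    free = [0.0] * n_machines     # next-free time per machine
    for p in order:
        m = free.index(min(free))
        free[m] += p
    return max(free)

jobs = [7, 7, 6, 2, 2, 2, 2, 2]
mk_fifo = dispatch(jobs, 2, "FIFO")
mk_lpt = dispatch(jobs, 2, "LPT")
mk_spt = dispatch(jobs, 2, "SPT")
```

On this instance LPT reaches the optimal makespan of 15 while SPT ends at 17, even though SPT often wins on mean flow time: the same tension that motivates selecting rules per machine rather than globally.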
SESSION F3: Computers & Industrial Engineering in fashion industry
 SIMULATION BASED OPTIMISATION PLANNING FOR A HIGH VARIETY TEXTILE PRODUCTION
Brahmadeep and Sébastien Thomassey
This paper explains the methodology used to develop a hybrid model combining an optimisation model (genetic algorithm) and a production simulation model (discrete event simulation) with a robust link (an ActiveX/OLE Automation server). The purpose of this model is to optimise the production schedule of an automated manufacturing process for a high variety of dyed yarns. The data involved in such a manufacturing process are huge and comprise many parameters and constraints; the complexity of this scenario demands this approach. The global model can be divided into two main modules, the optimisation model and the production floor model. These form a synchronised loop which replicates and improves the production schedule until the best results are achieved. A case is demonstrated to verify the model. The expected impacts are on-time shipment and increased productivity and profitability through the implementation of lean tools. The scope of application of this model is very broad: coupling a powerful discrete event model with an optimisation algorithm opens numerous possibilities, from manufacturing scheduling to global supply chain, distribution and logistics planning and optimisation.
 DETECTING THE MORPHOLOGY OF A REMOTE INDIVIDUAL CONSUMER IN A WEB-BASED ENVIRONMENT
Pascal Bruniaux, Maria Kulinska and Xianyi Zeng
As the number of computer users steadily increases, the sale of garments in Web-based environments is growing very quickly. Fashion companies that wish to gain competitiveness in this market need to increase their level of service as well as consumer satisfaction. One of the main difficulties facing Web-based garment retail is the lack of an efficient try-on process. In a remote environment, the body shape of an individual consumer cannot be measured physically in order to be matched with a specific garment size and ensure proper fit. Fashion companies are trying to address this challenge by developing efficient and practical Web-based non-contact human body measurement methods.
In this paper, a new non-contact method is proposed in order to match the individual consumer’s body dimensions with those of a standardized 3D morphotype. Our method is based on comparison between 2D images of the individual consumer with those projected from the 3D morphotypes, of the database of a target population. We will show how the 3D morphotypes are converted into 2D information data set and how the comparison with 2D data obtained from Kinect camera is performed.
The proposed method is beneficial for Web-based applications in e-commerce. Especially, it has a potential future for being implemented in Web-based fashion shops in order to characterize remote consumer’s body shapes.
 AUTOMATIC DEFINITION OF ADAPTIVE MORPHOTYPE FROM A 3D SCAN POPULATION FOR VIRTUAL TRY-ON
Moez Hamad, Sébastien Thomassey and Pascal Bruniaux
The textile-apparel industry requires a very accurate sizing system to minimize costs and satisfy customers. However, the specific constraints of human morphologies complicate the definition of a sizing system, and distributors prefer to use standard sizing systems rather than an intelligent system suited to their customers. In this paper, we propose a two-level clustering method (SOM + K-means) based on 3D scans to define adaptive 3D morphotype mannequins. Based on these morphotype mannequins, we define an intelligent system for virtual try-on, and a new sizing system can be defined in the future. The performance of our method is evaluated using real data from the French Sizing Survey conducted in 2006 by the French Institute of Textiles and Clothing.
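The second clustering level can be sketched with plain k-means applied directly to toy 2D body measurements (the SOM prototype layer is omitted here); the resulting centroids play the role of morphotypes:

```python
import random

def kmeans(points, k, iters=20, seed=1):
    """Plain k-means, the second level of the SOM + K-means pipeline.
    `points` are measurement tuples; returns the sorted centroids."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            groups[i].append(p)
        # recompute each centroid; keep the old one if its group emptied
        centers = [tuple(sum(x) / len(g) for x in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return sorted(centers)

# Illustrative (stature, chest girth) pairs in cm: two body-shape groups.
scans = [(160, 84), (162, 86), (158, 82), (178, 100), (180, 102), (176, 98)]
morphotypes = kmeans(scans, 2)
```

In the paper's pipeline the inputs would be SOM prototype vectors of many 3D-scan measurements rather than raw 2D points, but the centroid logic is the same.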
 APPAREL SALES PERFORMANCE: FINDINGS OF A CASE STUDY OF FASHION AND FAST FASHION
Adriana P. Martins, Sébastien Thomassey and Pascal Bruniaux
Successful fashion retailers are recognized as those who manage to respond rapidly to demand by commanding modern supply chains and short lead times. To achieve this, operational flexibility is often preferred to statistical forecasts. The segment commonly called “Fast Fashion” is pre-eminent in the use of such concepts. Recent studies argue that operations undertaken by fast fashion chains may be detrimental to some environmental and social aspects, and that their sales forecasts, based on expert judgement, are inaccurate. This paper presents the results of a case study that aims to assess sales performance in the field of fashion apparel. Data from a European online fashion retailer are presented and analysed. The case study findings corroborate the literature insofar as stochastic demand for fashion goods remains a challenge to retail sales forecasting. The enormous variety of products and a high rate of error are shown to reduce stock turnover and the gross margin return on investment at retail price. This evidence confirms a need for improved sales forecasting methods for fashion retailers and supports the use of data mining techniques in future studies.
SESSION G3: Heuristics and Approximation Algorithms for Scheduling Problems
A NEW BRANCHING SCHEME FOR THE OPEN PIT MINE PRODUCTION SCHEDULING PROBLEM
Mehran Samavati, Daryl Essam, Micah Nehring and Ruhul Sarker
The design and scheduling of open pit mining operations is an enormously complex task. Given a discretisation of an orebody as a block model, the open pit mine production scheduling problem (OPMPSP) can be defined as finding the best extraction time for each block over the lifetime of the deposit. Despite their importance in practical applications, minimum resource constraints significantly increase the complexity of solving the scheduling problem. Several heuristics employing Branch and Bound (B&B) and Branch and Cut (B&C) as black boxes have been developed for this problem; however, they are still not efficient enough due to deficiencies in traditional B&B and B&C. In this paper, we propose a new branching scheme for the OPMPSP based on type-1 special ordered sets, while considering the aforementioned resource constraints. By fixing variables at nodes, this scheme can substantially accelerate branching-based heuristics. For evaluation, we add our scheme to B&B and B&C, solve several randomly generated instances, and compare the results to those given by plain B&B and B&C. The results demonstrate that the scheme significantly improves the performance of B&B and B&C, and thus the heuristics that use these techniques.
THE IMPACT OF BATCH SHIPMENTS ON THE ECONOMIC LOT SCHEDULING PROBLEM
Fabian Beck and Christoph Glock
This paper studies the Economic Lot Scheduling Problem (ELSP) for the single-machine-multi-product case where batch shipments are permitted. Integrating batch shipments into the ELSP helps to reduce cycle times at the expense of higher transportation costs. In addition, the question arises whether batch shipments have an effect on the general problem addressed in the ELSP, which is to avoid overlaps in the production schedule. This paper selects two popular approaches used in the literature to solve the ELSP, namely the Common-Cycle-Approach of Hanssmann and the Basic-Period-Approach of Haessler and Hogue, extends them to include batch shipments, and suggests a solution procedure for each approach. Both model extensions are illustrated using the modified Bomberger data set. Finally, ideas for further research are presented.
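Hanssmann's Common-Cycle-Approach, which the paper extends with batch shipments, reduces in its basic form to a closed-form cycle time; the sketch below shows that baseline (without the batch-shipment extension) on illustrative data:

```python
from math import sqrt

def common_cycle_time(setup_costs, hold_costs, demands, prod_rates):
    """Hanssmann's common-cycle solution to the ELSP: all products share
    one cycle T* = sqrt(2 * sum(A_i) / sum(h_i * d_i * (1 - d_i / p_i))),
    balancing setup costs against holding costs.  A_i = setup cost,
    h_i = holding cost per unit per day, d_i = demand rate, p_i =
    production rate.  Data below are illustrative."""
    num = 2.0 * sum(setup_costs)
    den = sum(h * d * (1 - d / p)
              for h, d, p in zip(hold_costs, demands, prod_rates))
    return sqrt(num / den)

A = [100.0, 80.0]        # setup cost per product
h = [0.5, 0.4]           # holding cost per unit per day
d = [20.0, 30.0]         # demand rate (units/day)
p = [100.0, 150.0]       # production rate (units/day)
T = common_cycle_time(A, h, d, p)   # common cycle length in days
```

Batch shipments would modify the holding-cost term of this expression, which is where the paper's extension and its effect on cycle times comes in.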
CONSIDERING PRODUCT DIMENSIONALITY AND UTILIZATION RATES ON THE PERFORMANCE OF DIFFERENT PRODUCTION STRATEGIES FOR THE ECONOMIC LOT SCHEDULING PROBLEM
Raul Cortes-Fibla, Pilar Isabel Vidal-Carreras and Jose Pedro Garcia-Sabater
We consider the problem of scheduling the production of multiple products on a single facility with limited production capacity, known in the literature as the Economic Lot Scheduling Problem (ELSP). In this work we compare the performance of five different production strategies for the ELSP under different stationary stochastic demand patterns. The performance of the different approaches to this problem, in terms of total inventory and setup costs, is clearly dependent upon the production environment. In this study we focus on utilization rates, as one of the most important parameters affecting this environment. We analyze through a simulation study the effect of dimensionality on utilization rates, at different utilization levels. The results of our simulation study confirm that the performance of a production strategy is strongly dependent on the dimensionality of utilization rates. This effect therefore raises the need for a sensitivity analysis to evaluate the impact of the particular values of these and other parameters on the performance of the production strategies for the ELSP.
STOCHASTIC SCHEDULING OF AN AUTOMATED TWO-MACHINE ROBOTIC CELL WITH IN-PROCESS INSPECTION SYSTEM
Mehdi Foumani, Kate Smith-Miles, Indra Gunawan and Asghar Moeini
This study focuses on a two-machine robotic cell scheduling problem. In particular, we propose the first analytical method for minimizing the partial cycle time of such a cell with a PC-based automatic inspection system, making the problem more realistic. It is assumed that parts must be inspected on one of the production machines, which may result in a rework process. The stochastic nature of the rework process prevents us from applying existing deterministic solution methods to the scheduling problem. This study aims to develop in-line inspection of identical parts using multiple contact/non-contact sensors. Initially, we present a heuristic method that converts a multiple-sensor inspection system into a single-sensor inspection system. Then, the expected sequence times of two different cycles are derived based on a geometric distribution, and finally the maximum expected throughput is pursued for each individual case.
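The role of the geometric distribution can be illustrated with the textbook expectation for the number of processings under repeated rework; this simple calculation, not the paper's full cycle analysis, shows why rework inflates the expected cycle time:

```python
def expected_processings(p_fail):
    """A part failing inspection with probability p_fail is reworked and
    reprocessed; the number of processings is geometric, so on average
    1 / (1 - p_fail) passes are needed."""
    return 1.0 / (1.0 - p_fail)

def expected_cycle_time(base_cycle, machine_time, p_fail):
    """Expected cycle time when one machine's operation may repeat: each
    extra (geometric) pass adds machine_time on average.  Numbers below
    are illustrative, not the paper's cell parameters."""
    extra = expected_processings(p_fail) - 1.0
    return base_cycle + extra * machine_time

# A 20% rework rate adds 0.25 expected extra passes of 10 time units.
t = expected_cycle_time(base_cycle=30.0, machine_time=10.0, p_fail=0.2)
```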