**ABSTRACT**

The basic idea of a Variable Neighborhood Search (VNS) algorithm is to systematically explore the neighborhoods of a solution using a set of predefined neighborhood structures. Since different problem instances have different landscapes and complexities, and different neighborhood structures lead to different solution spaces, choosing which neighborhood structure to apply is a challenging task. This work therefore proposes an Adaptive Guided Variable Neighborhood Search (AG-VNS). AG-VNS has two phases. The first is a learning phase, which memorizes the neighborhood structures that can effectively reduce specific soft constraint violations by applying each neighborhood structure to the best solution; these steps are repeated until a stopping condition for the learning phase is met. The second is an improvement phase, which enhances the quality of the current best solution by selecting the most suitable neighborhood structure from memory and applying it to the current solution. The effectiveness of AG-VNS is illustrated by solving course timetabling problems, with performance tested over the Socha course timetabling datasets. Results demonstrate that AG-VNS is comparable with other VNS variants, while outperforming some variants on particular instances. This demonstrates the effectiveness of applying an adaptive learning mechanism to guide the selection of neighborhood structures in the VNS algorithm.


**Received:** January 11, 2013;

**Accepted:** May 08, 2013;

**Published:** July 19, 2013

**How to cite this article**

*Journal of Applied Sciences, 13: 883-888.*

**DOI:** 10.3923/jas.2013.883.888

**URL:** https://scialert.net/abstract/?doi=jas.2013.883.888

**INTRODUCTION**

Variable Neighborhood Search (VNS) is an improvement meta-heuristic (Hansen and Mladenovic, 2001). VNS deals with several neighborhood structures and, therefore, the decision of which neighborhood to apply at the current iteration is crucial (Thompson and Dowsland, 1995), because different neighborhood structures generate different landscapes. The literature shows that VNS has been applied to solve challenging optimization problems such as course timetabling.

However, to date, no work has been undertaken to solve course timetabling problems by applying an adaptive search within VNS. Motivated by this, this work proposes an adaptive learning mechanism for the VNS algorithm in order to guide it to choose the right neighborhood structure during the search. The proposed approach is tested over the Socha course timetabling datasets (Socha *et al*., 2003). Results demonstrate that the adaptive learning mechanism enables AG-VNS to obtain better solutions (compared with other variants of VNS) for the course timetabling problem.

**PROBLEM DESCRIPTION**

University course timetabling problems can be defined as assigning a given number of courses to a given number of timeslots and rooms, subject to a set of hard and soft constraints. This work uses the Socha datasets (Socha *et al*., 2003), which are modeled as follows:

• | A set of C courses c_{i} (i = 1,…,C) |

• | A set of 45 timeslots t_{n} (n = 1,…,45) |

• | A set of R rooms r_{j} (j = 1,…,R) |

• | A set of F room features |

• | A set of M students |

The course timetabling problem consists of assigning every course c_{i} to a timeslot t_{n} and a room r_{j} so that the following hard constraints are satisfied:

• | No student can be assigned to more than one course at the same time |

• | The room should satisfy the features required by the course |

• | The number of students attending the course should be less than or equal to the capacity of the room |

• | No more than one course is allowed in a timeslot in each room |
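As an illustration, the four hard constraints above can be checked with a simple feasibility test. The data structures (dicts mapping courses to student sets and required features, rooms to features and capacities) and the function name are illustrative assumptions, not from the paper:

```python
# Sketch of a hard-constraint feasibility check for the timetabling
# model above. A timetable is a dict: course -> (timeslot, room).

def is_feasible(assign, attends, required, features, capacity):
    """attends: course -> set of students; required: course -> set of
    features; features: room -> set of features; capacity: room -> int."""
    used = set()
    for c, (t, r) in assign.items():
        if (t, r) in used:                  # one course per room per timeslot
            return False
        used.add((t, r))
        if not required[c] <= features[r]:  # room must offer needed features
            return False
        if len(attends[c]) > capacity[r]:   # room must be big enough
            return False
    # no student may attend two courses in the same timeslot
    for t in {t for t, _ in assign.values()}:
        seen = set()
        for c, (tc, _) in assign.items():
            if tc == t:
                if seen & attends[c]:
                    return False
                seen |= attends[c]
    return True
```
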

The objective is to satisfy all hard constraints while minimizing the number of students involved in violations of the soft constraints. The soft constraints are equally penalized (penalty cost = 1 for each violation per student). The soft constraints are:

• | A student should not have a course scheduled in the last timeslot of the day |

• | A student should not have more than two consecutive courses |

• | A student should not have a single course on a day |

Table 1: | Eleven instances of the Socha datasets (Socha et al., 2003) |

Table 1 presents the characteristics of the Socha datasets (Socha *et al*., 2003). The datasets consist of 11 problem instances, categorized as small, medium and large.
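To make the penalty scheme concrete, the following sketch evaluates the three soft constraints per student, assuming the Socha layout of 45 timeslots as 5 days of 9 slots each; the per-student occupancy map and function name are illustrative assumptions:

```python
# Illustrative soft-constraint penalty function: cost 1 per student
# per violation, as stated above. Timeslots are numbered 0..44.

DAYS, SLOTS_PER_DAY = 5, 9

def soft_penalty(student_slots):
    """student_slots: student -> list of occupied timeslots.
    Returns (s1, s2, s3): last-slot, >2-consecutive and
    single-course-day violation costs summed over students."""
    s1 = s2 = s3 = 0
    for slots in student_slots.values():
        days = {}
        for t in sorted(slots):
            d, p = divmod(t, SLOTS_PER_DAY)
            days.setdefault(d, []).append(p)
            if p == SLOTS_PER_DAY - 1:        # S1: course in last slot of a day
                s1 += 1
        for ps in days.values():
            if len(ps) == 1:                  # S3: a single course on a day
                s3 += 1
            run = 1
            for a, b in zip(ps, ps[1:]):
                run = run + 1 if b == a + 1 else 1
                if run > 2:                   # S2: each course beyond two in a row
                    s2 += 1
    return s1, s2, s3
```
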

**VNS-BASIC FOR COURSE TIMETABLING PROBLEMS**

Contrary to other local search methods, VNS does not follow a trajectory but explores increasingly distant neighborhoods of the current incumbent solution and jumps from this solution to a new one if and only if an improvement has been made (Hansen and Mladenovic, 2001). Figure 1 shows the pseudocode of the basic VNS (Hansen and Mladenovic, 2001).

Figure 2 shows the pseudocode of the VNS-Basic for solving course timetabling problems. In this work, the initial solution is produced using a largest degree graph coloring heuristic. The feasible timetable is obtained by adjusting appropriate courses in the schedule based on room availability and other hard constraint violations until all the hard constraints are satisfied, without taking into account any of the soft constraint violations.

This study used the following neighborhood structures:

• | (N1) Swap: Select two timeslots at random and swap all the courses in one timeslot with all the courses in the other timeslot, while maintaining a feasible timetable |

• | (N2) Swap: Select any last timeslot of the day and another timeslot (excluding the last timeslots of the day) and swap all the courses in the last timeslot with all the courses in the other timeslot, while maintaining a feasible timetable |

• | (N3) Move: Select any 5 events from any last timeslot of the day and move them to a new free timeslot (excluding the last timeslots of the day), while maintaining a feasible timetable |

• | (N4) Swap: Select any 2 events at random and swap their timeslots, while maintaining a feasible timetable |

• | (N5) Move: Select any 5 events from any timeslot and move them to a new free timeslot (excluding the last timeslots of the day), while maintaining a feasible timetable |

Fig. 1: | Steps of the basic VNS |

Fig. 2: | Steps of the VNS-Basic for solving course timetabling problems |
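As a rough illustration, two of the moves above (N1 and N4) could be sketched as follows, on a hypothetical timetable represented as a dict mapping each course to a (timeslot, room) pair; the feasibility check after each move is assumed to be done by the caller, and all names are assumptions:

```python
# Minimal sketches of neighborhood moves N1 and N4.
import random

def n1_swap_timeslots(assign, rng=random):
    """N1: swap all courses between two randomly chosen timeslots."""
    slots = sorted({t for t, _ in assign.values()})
    t1, t2 = rng.sample(slots, 2)
    return {c: (t2 if t == t1 else t1 if t == t2 else t, r)
            for c, (t, r) in assign.items()}

def n4_swap_events(assign, rng=random):
    """N4: pick two courses at random and swap their timeslots."""
    c1, c2 = rng.sample(sorted(assign), 2)
    new = dict(assign)
    (t1, r1), (t2, r2) = new[c1], new[c2]
    new[c1], new[c2] = (t2, r1), (t1, r2)
    return new
```
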

**AG-VNS ALGORITHM PROPOSED FOR COURSE TIMETABLING PROBLEMS**

The basic idea of AG-VNS is to guide the VNS algorithm to choose the right neighborhood structure at the right time. Therefore, the adaptive learning mechanism is introduced. With this mechanism, the algorithm memorizes which neighborhood structure could effectively solve the specific soft constraint violations. Then, the algorithm uses the memory information as a guide for selecting the right neighborhood structure to enhance the quality of the best solution. For this purpose, AG-VNS is divided into two phases. Phase I is for learning while Phase II is for improvement.

**Phase I: Learning:** In this phase, the algorithm memorizes which neighborhood structure can effectively solve specific soft constraint violations. For this purpose, the memory is divided into three parts (representing the three soft constraints S_{i}, where i = 1, 2, 3) and each part contains five spaces (representing the neighborhood structures) that keep a running total of the reduction values of the soft constraint violations.

This phase begins when the algorithm generates a solution x’ by applying N_{k} (where k = 1) to the initial solution x^{0}. If the solution x’ is better, the algorithm identifies which soft constraint violations have been reduced. The reduction value is calculated and updated in the memory. After that, the algorithm repeats the same steps by applying the neighborhood structures N_{k} (for k = 2 up to k = k_{max}). These procedures are repeated until the stopping condition for Phase I is met.

For example, given an initial solution x^{0}, where f(x^{0}) = S^{0}_{1}+S^{0}_{2}+S^{0}_{3}, after applying neighborhood structure N_{k} (where k = 1), an incumbent solution x’ is generated where f(x’) = S’_{1}+S’_{2}+S’_{3}. If f(x’) is less than f(x^{0}), then the algorithm identifies which soft constraint violations were reduced. For example, if S’_{1}<S^{0}_{1}, the reduction value is calculated as d = S^{0}_{1}-S’_{1}. Then, the value d is added to the memory at space 1 (which represents the first neighborhood structure) in part 1 (which represents soft constraint 1 (S_{1})).

Table 2 illustrates an example of the memory contents after the stopping condition of the learning phase (Phase I) has been met. Based on the table, the highest total reduction value for soft constraint 1 (S_{1}), which is 123, was obtained after applying neighborhood structure 1 (N_{1}) to the solution x^{0}. This means that N_{1} is the most appropriate neighborhood structure to select to solve soft constraint 1 while simultaneously enhancing the quality of the solution x” (in Phase II). N_{2}, on the other hand, is the most appropriate neighborhood structure to select to solve soft constraint 2 (S_{2}) and soft constraint 3 (S_{3}). This is because the highest total reduction value for soft constraint 2 (S_{2}), which is 88, was obtained after applying neighborhood structure 2 (N_{2}) to the solution x^{0}, and the highest total reduction value for soft constraint 3 (S_{3}), which is 78, was also obtained after applying neighborhood structure 2 (N_{2}) to the solution x^{0}. This information will be used in Phase II.
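The learning-phase memory described above can be sketched as a 3 x 5 table of accumulated reduction values; the function names and 0-based indexing are illustrative assumptions, not the authors' code:

```python
# Sketch of the Phase I memory: rows are soft constraints S1..S3,
# columns are neighborhood structures N1..N5.

def update_memory(memory, before, after, k):
    """before/after: (s1, s2, s3) violation counts for x0 and x';
    k: 0-based index of the neighborhood just applied."""
    if sum(after) < sum(before):            # x' improved on x0
        for i, (b, a) in enumerate(zip(before, after)):
            if a < b:                       # this soft constraint was reduced
                memory[i][k] += b - a       # accumulate d = S0_i - S'_i
    return memory

def best_neighborhood(memory, i):
    """Most appropriate neighborhood for soft constraint i (0-based)."""
    return max(range(len(memory[i])), key=lambda k: memory[i][k])
```
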

**Phase II: Improvement:** In this phase, the algorithm uses the memory (from Phase I) as a guide for selecting the most appropriate neighborhood structure to enhance the quality of the best solution. For the first iteration, the initial solution x^{0} is set as the best solution x_{best}.

This phase starts when the algorithm generates a solution x” by randomly applying any of the neighborhood structures (N_{k}) to the best solution x_{best}. Then, the algorithm identifies the highest soft constraint violations in the solution x”. Subsequently, the algorithm uses the memory information from Phase I to select the most appropriate neighborhood structure to solve the soft constraint violations to enhance the quality of the solution x”.

Table 2: | Example of the memory contents after the stopping condition of the learning phase (Phase I) has been met |

Fig. 3: | Pseudocode of the AG-VNS |

For example, if the highest soft constraint violations in the solution x” are in S_{1}, the algorithm refers to the memory to select the most appropriate neighborhood structure to solve soft constraint 1 while enhancing the quality of the solution x”. Based on Table 2, the highest total reduction value for S_{1} is from N_{1}. Therefore, N_{1} is used to enhance the quality of the solution x”. If the solution x* produced by that step is better than the best solution x_{best}, the best solution x_{best} is updated with the solution x* and the memory is updated by repeating step (i) to step (iii) as in the first phase.

In instances where the solution x* is found to be worse than the best solution x_{best}, the algorithm applies the neighborhood structure N_{p} to the solution x”. If the solution x** produced from that step is better than the solution x”, then the solution x” is updated with the solution x** and the memory is updated by repeating step (i) to step (iii) as in the first phase. Otherwise, the algorithm continues the search with N_{p} (where p = p+1). These steps are repeated until the solution x** produced is better than the solution x” or p = p_{max}. Finally, if the solution x” is better than the solution x_{best}, the solution x_{best} is updated with the solution x”. The algorithm proceeds to the next iteration until the stopping condition for Phase II is met. Figure 3 shows the pseudocode of the AG-VNS.
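A single Phase II iteration could be sketched as follows, reusing the illustrative 3 x 5 memory table and assuming helper functions `violations(x)` (returning the triple of soft constraint violation counts) and `cost(x)` (their sum); all names and data structures here are assumptions, not the authors' implementation:

```python
# High-level sketch of one improvement-phase iteration of AG-VNS.
import random

def phase2_step(x_best, neighborhoods, memory, violations, cost, rng=random):
    # (1) apply a random neighborhood N_k to the best solution -> x''
    x2 = rng.choice(neighborhoods)(x_best)
    # (2) identify the soft constraint with the most violations in x''
    s = violations(x2)
    i = max(range(len(s)), key=lambda j: s[j])
    # (3) guided move: pick the neighborhood with the highest total
    #     reduction value for that constraint in the learning memory
    k = max(range(len(memory[i])), key=lambda j: memory[i][j])
    x_star = neighborhoods[k](x2)
    if cost(x_star) < cost(x_best):
        return x_star                        # accept the improved solution
    # (4) otherwise try the neighborhoods N_p in turn on x''
    for move in neighborhoods:
        x_pp = move(x2)
        if cost(x_pp) < cost(x2):
            x2 = x_pp
            break
    return x2 if cost(x2) < cost(x_best) else x_best
```

By construction the step never returns a solution worse than x_{best}, mirroring the acceptance rule described above.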

**RESULTS AND DISCUSSION**

In order to assess the effectiveness of applying the learning mechanism in VNS, the same neighborhood structures are used to compare AG-VNS with VNS-Basic. The stopping conditions for AG-VNS in Phase I and Phase II are 2000 and 100,000 iterations, respectively. These values were determined based on preliminary experiments. Table 3 presents the average penalty cost over 20 runs.

It is clear from Table 3 that AG-VNS outperforms VNS-Basic. This is due to the use of the adaptive learning mechanism to guide the algorithm to choose the right neighborhood structure at the right time. The performance of AG-VNS has also been compared with other approaches in the literature. Table 4 presents the comparison results. The following works were compared in this experiment:

• | M1: Variable neighborhood search with EMC (Abdullah, 2006) |

• | M2: Variable neighborhood search with Tabu (Abdullah, 2006) |

• | M3: Non-linear Decay Rate (Abdullah and Turabieh, 2008) |

• | M4: Genetic Algorithm with a sequential local search (Jat and Yang, 2009) |

Table 3: | Comparison results on the course timetabling problem between VNS-Basic and AG-VNS |

Table 4: | Comparison results on the course timetabling problem between AG-VNS and other approaches |

It can be seen that AG-VNS produces competitive results across the problem instances when compared against the other approaches in the literature. For the five small instances, AG-VNS is able to solve them to optimality. The AG-VNS results for medium1, medium3, medium5 and the large instance are better than the results obtained by the other approaches. These results demonstrate the effectiveness of applying the adaptive learning mechanism in VNS.

**CONCLUSION**

This study has proposed the AG-VNS algorithm for solving course timetabling problems. The basic idea of AG-VNS is to guide the VNS algorithm to choose the right neighborhood structure during the search. To this end, an adaptive learning mechanism was introduced to memorize which neighborhood structure can effectively solve specific soft constraint violations. The recommended neighborhood structure (from the learning phase) is then used to enhance the quality of solutions in the improvement phase. Results have demonstrated that AG-VNS produces better solutions for course timetabling problems and outperforms the other stated approaches on several instances. This finding shows the effectiveness of the adaptive learning mechanism in guiding the VNS algorithm to choose the right neighborhood structure at the right time.

For future work, the AG-VNS will be hybridized with Honey Bee Mating Optimization (HBMO) to investigate the effectiveness of the AG-VNS in HBMO.

**ACKNOWLEDGMENT**

The authors wish to thank the Ministry of Higher Education for supporting this work under the FRGS Research Grant Scheme (FRGS/1/2012/SG05/UKM/02/11).

**REFERENCES**

- Thompson, J. and K. Dowsland, 1995. General cooling schedules for a simulated annealing based timetabling system. Proceedings of the 1st International Conference on Practice and Theory of Automated Timetabling, August 29-September 1, 1995, Edinburgh, UK., pp: 345-363.

- Socha, K., M. Samples and M. Manfrin, 2003. Ant Algorithms for the University Course Timetabling Problem with Regard to the State-of-the-Art. In: Applications of Evolutionary Computing, Cagnoni, S., C.G. Johnson, J.J.R. Cardalda, E. Marchiori and D.W. Corne *et al*. (Eds.). Springer, Berlin, Germany, ISBN-13: 9783540009764, pp: 334-345.

- Hansen, P. and N. Mladenovic, 2001. Variable neighborhood search: Principles and applications. Eur. J. Operat. Res., 130: 449-467.

- Abdullah, S. and H. Turabieh, 2008. Generating university course timetable using genetic algorithms and local search. Proceedings of the 3rd International Conference on Convergence and Hybrid Information Technology, Volume 1, November 11-13, 2008, Busan, pp: 254-260.