Computer Science
Permanent URI for this community: https://hdl.handle.net/10413/6769
Browsing Computer Science by Author "Adewumi, Aderemi Oluyinka."
Now showing 1 - 16 of 16
Item Discrete particle swarm optimization for combinatorial problems with innovative applications. (2016) Ayokunle, Popoola Peter.; Adewumi, Aderemi Oluyinka.; Martins, Arasomwan Akugbe.
Abstract available in PDF file.

Item The enhanced best performance algorithm for global optimization with applications. (2016) Chetty, Mervin.; Adewumi, Aderemi Oluyinka.
Abstract available in PDF file.

Item Fusion of face and iris biometrics in security verification systems. (2016) Azom, Valentine.; Adewumi, Aderemi Oluyinka.
Abstract available in PDF file.

Item Hierarchical age estimation using enhanced facial features. (2018) Angulu, Raphael.; Tapamo, Jules-Raymond.; Adewumi, Aderemi Oluyinka.
Ageing is a stochastic, inevitable and uncontrollable process that constantly affects the shape, texture and general appearance of the human face. Humans can determine a person's gender, identity and ethnicity far more accurately than they can estimate age. This makes the development of automatic age estimation techniques that surpass human performance an attractive yet challenging task. Automatic age estimation requires the extraction of robust and reliable age-discriminative features. The sensitivity of local binary patterns (LBP) to noise makes them insufficiently reliable for capturing age-discriminative features. Although the local ternary pattern (LTP) is insensitive to noise, it uses a single static threshold for all images regardless of varied imaging conditions. The local directional pattern (LDP) uses the k strongest directional responses to encode the image gradient, disregarding not only the central pixel in the local neighborhood but also the remaining 8 − k directional responses. Every pixel in an image carries subtle information, and discarding the 8 − k directional responses leads to a loss of discriminative texture features. This study proposes two variations of the LDP operator for texture extraction. Significant orientation response LDP (SOR-LDP) encodes the image gradient by grouping the eight directional responses into four pairs. Each pair represents the orientation of an edge with respect to the central reference pixel. The values in each pair are compared, and the bit corresponding to the maximum value in the pair is set to 1 while the other is set to 0. The resultant binary code is converted to decimal and assigned to the central pixel as its SOR-LDP code. Texture features are contained in the histogram of the SOR-LDP-encoded image. The local ternary directional pattern (LTDP) first computes the differences between the neighboring pixels and the central pixel in a 3 × 3 image region. These differential values are convolved with Kirsch edge detectors to obtain directional responses, which are normalized and used as the probability of an edge occurring in the respective direction. An adaptive threshold is applied to derive the LTDP code, which is then split into its positive and negative parts. The histograms of the negative and positive LTDP-encoded images are concatenated to obtain the texture feature. Despite evidence of spatial-frequency processing in the primary visual cortex, biologically inspired features (BIF) that model the visual cortex use only scale and orientation selectivity in feature extraction. Furthermore, these BIF are extracted using holistic (global) pooling across scales and orientations, leading to a loss of substantive information. This study proposes multi-frequency BIF (MF-BIF), in which frequency selectivity is introduced into BIF modelling. Local statistical BIF (LS-BIF) uses local pooling within scale, orientation and frequency in an n × n region for BIF extraction.
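As an aid to reading, the SOR-LDP encoding just described can be sketched in a few lines of Python/NumPy. This is an editor's illustrative interpretation only: it assumes the standard eight 3 × 3 Kirsch compass masks and assumes the four pairs are formed from responses in opposite directions, with one bit per response; the thesis may pair the responses and order the bits differently.

```python
import numpy as np
from scipy.ndimage import convolve

def kirsch_masks():
    # Eight 3x3 Kirsch compass masks, obtained by rotating the border
    # weights of the East mask one step at a time.
    border = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    east = [-3, -3, 5, 5, 5, -3, -3, -3]          # East mask border, clockwise from top-left
    masks = []
    for k in range(8):
        m = np.zeros((3, 3))
        for idx, (r, c) in enumerate(border):
            m[r, c] = east[(idx - k) % 8]
        masks.append(m)
    return masks

def sor_ldp_codes(gray):
    """SOR-LDP code image (editor's reading of the abstract): the eight Kirsch
    responses are grouped into four pairs of opposite directions; in each pair
    the bit of the larger response is set to 1, the other to 0."""
    gray = np.asarray(gray, dtype=float)
    responses = np.stack([convolve(gray, m, mode="nearest") for m in kirsch_masks()])
    codes = np.zeros(gray.shape, dtype=np.uint8)
    for i in range(4):                            # assumed pairing: mask i vs. mask i + 4
        first_wins = responses[i] >= responses[i + 4]
        codes |= np.where(first_wins, 1 << i, 1 << (i + 4)).astype(np.uint8)
    return codes                                  # only 16 distinct 8-bit codes can occur

def sor_ldp_histogram(gray):
    hist, _ = np.histogram(sor_ldp_codes(gray), bins=256, range=(0, 256))
    return hist / hist.sum()                      # normalized texture feature vector

# feature = sor_ldp_histogram(face_region)        # face_region: 2-D grayscale array
```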
Using the leave-one-person-out (LOPO) validation protocol, this study investigated the performance of the proposed feature extractors in age estimation in a hierarchical way, performing age-group classification with a multi-layer perceptron (MLP) followed by within-group exact-age regression with support vector regression (SVR). Mean absolute error (MAE) and cumulative score (CS) were used to evaluate the performance of the proposed face descriptors. Experimental results on the FG-NET ageing dataset show that SOR-LDP, LTDP, MF-BIF and LS-BIF outperform state-of-the-art feature descriptors in age estimation. The results also show that performing gender discrimination before age-group classification and age estimation further improves accuracy. Shape, appearance, wrinkle and texture features are extracted simultaneously by the primate visual system for the brain to process and understand an image or a scene. However, age estimation systems in the literature typically use a single feature, which is not sufficient to capture subtle age-discriminative traits, owing to the stochastic and personalized nature of ageing. This study therefore proposes the fusion of different facial features to enhance their discriminative power. Experimental results show that fusing shape, texture, wrinkle and appearance features yields robust age-discriminative features that achieve a lower MAE than any single feature.
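The hierarchical scheme above (an MLP for age-group classification followed by one SVR per group for exact-age regression) can be illustrated with a small scikit-learn sketch. The age-group boundaries and hyper-parameters below are editor's assumptions, not the settings used in the thesis.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVR

AGE_GROUPS = [(0, 12), (13, 21), (22, 39), (40, 120)]     # assumed group boundaries

def group_of(age):
    for i, (lo, hi) in enumerate(AGE_GROUPS):
        if lo <= age <= hi:
            return i
    return len(AGE_GROUPS) - 1

class HierarchicalAgeEstimator:
    """Age-group classification with an MLP, then exact-age regression with one SVR per group."""

    def fit(self, X, ages):
        X, ages = np.asarray(X), np.asarray(ages, dtype=float)
        groups = np.array([group_of(a) for a in ages])
        self.clf = MLPClassifier(hidden_layer_sizes=(128,), max_iter=1000).fit(X, groups)
        self.regs = {g: SVR(kernel="rbf", C=10.0).fit(X[groups == g], ages[groups == g])
                     for g in np.unique(groups)}
        return self

    def predict(self, X):
        X = np.asarray(X)
        groups = self.clf.predict(X)
        return np.array([self.regs[g].predict(x.reshape(1, -1))[0] for g, x in zip(groups, X)])

# est = HierarchicalAgeEstimator().fit(X_train, ages_train)        # rows of X_* are face features
# mae = np.mean(np.abs(est.predict(X_test) - ages_test))           # mean absolute error (MAE)
```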
Item Improved roach-based algorithms for global optimization problems. (2014) Obagbuwa, Ibidun Christiana.; Adewumi, Aderemi Oluyinka.
Optimization of systems plays an important role in various fields including mathematics, economics, engineering and the life sciences. Many real-world optimization problems exist across fields of endeavour such as engineering design, space planning, networking, data analysis, logistics management, financial planning, risk management and a host of others. These problems are constantly increasing in size and complexity, necessitating improved techniques. Many conventional approaches have failed to solve complex problems effectively because of their increasingly large solution spaces. This has led to the development of evolutionary algorithms that draw inspiration from the process of natural evolution, on the premise that nature provides inspirations that can lead to innovative models or techniques for solving complex optimization problems. Among the paradigms based on this inspiration is Swarm Intelligence (SI), one of the more recent developments in evolutionary computation. The SI paradigm comprises algorithms inspired by the social behaviour of animals and insects. SI-based algorithms have attracted interest and gained popularity because of their flexibility and versatility, and they have been found to be efficient in solving real-world optimization problems. Examples of SI algorithms include Ant Colony Optimization (ACO), inspired by the pheromone trail-following behaviour of ant species; Particle Swarm Optimization (PSO), inspired by the flocking and swarming behaviour of insects and animals; and Bee Colony Optimization (BCO), inspired by the food foraging of bees. Recently emerging SI techniques include Roach-based Algorithms (RBA), motivated by the social behaviour of cockroaches. Two recently introduced RBA are Roach Infestation Optimization (RIO) and Cockroach Swarm Optimization (CSO), which have been applied to some optimization problems and achieved results competitive with PSO.
This study is motivated by the promising results of RBA, which show that these algorithms have the potential to be efficient tools for solving optimization problems. Extensive studies of existing RBA were carried out in this work, revealing shortcomings such as slow convergence and entrapment in local minima. The aim of this study is to overcome the identified drawbacks. RBA variants are introduced and investigated by incorporating parameters such as a constriction factor and a sigmoid function, which have proved effective for similar evolutionary algorithms in the literature. In addition, components such as vigilance, cannibalism and hunger are incorporated into existing RBA. These components are constructed using known techniques such as the simple Euler method, partial differential equations, and crossover and mutation, to speed up convergence and enhance the stability, exploitation and exploration of RBA. Specifically, a stochastic constriction factor was introduced into the existing CSO algorithm to improve its performance and enhance its ability to solve optimization problems involving thousands of variables. The CSO algorithm, originally designed with three components (chase-swarming, dispersion and ruthlessness), is extended in this work with a hunger component to improve its searching ability and diversity. Predator-prey evolution using crossover and mutation techniques was also introduced into the CSO algorithm to create an adaptive search in each iteration, thereby making the algorithm more efficient. To create a discrete version of the CSO algorithm that can handle optimization problems over any discrete range of values, the sigmoid function was introduced. Furthermore, a dynamic step-size adaptation based on the simple Euler method was introduced into the existing RIO algorithm, enhancing swarm stability and improving local and global search. The existing RIO model was also redesigned to include vigilance and cannibalism components. The improved RBA were tested on established global optimization benchmark problems and the results obtained were compared with those from the literature. The improved RBA introduced in this work show clear improvements over existing ones.
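The chase-swarming component and the constriction factor mentioned above can be illustrated roughly as follows. The update rule in this sketch is an editor's generic reading of chase-swarming (each roach moves toward the best roach within its visual range, or toward the global best if it is itself the local best), scaled by a constriction factor chi; the exact update rules and the stochastic constriction scheme proposed in the thesis may differ.

```python
import numpy as np

def chase_swarm_step(positions, fitness, step=1.0, visual=2.0, chi=0.729, rng=None):
    """One chase-swarming-style move scaled by a constriction factor chi.

    positions: (n, d) array of roach positions; fitness: length-n array of
    objective values (lower is better). Illustrative sketch only."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = positions.shape
    g_best = positions[np.argmin(fitness)]
    new_positions = positions.copy()
    for i in range(n):
        dists = np.linalg.norm(positions - positions[i], axis=1)
        neighbours = np.where(dists <= visual)[0]                 # roaches within visual range
        p_i = positions[neighbours[np.argmin(fitness[neighbours])]]
        target = g_best if np.allclose(p_i, positions[i]) else p_i
        new_positions[i] = positions[i] + chi * step * rng.random(d) * (target - positions[i])
    return new_positions
```

A stochastic constriction factor of the kind proposed in the thesis would redraw chi at each iteration instead of keeping it fixed, and the dispersion, ruthlessness and hunger components would be applied alongside this step.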
Item Improved techniques for phishing email detection based on random forest and firefly-based support vector machine learning algorithms. (2014) Andronicus, Ayobami Akinyelu.; Adewumi, Aderemi Oluyinka.
Electronic fraud is one of the major challenges faced by the vast majority of online internet users today. Curbing this menace is not an easy task, primarily because of the rapid rate at which fraudsters change their mode of attack. Many techniques have been proposed in the academic literature to handle e-fraud, including blacklists, whitelists, and machine learning (ML) based techniques. Among these, ML-based techniques have proven to be the most efficient because of their ability to detect new fraudulent attacks as they appear. There are three commonly perpetrated electronic frauds, namely email spam, phishing and network intrusion, and of the three, phishing attacks have caused the greatest financial loss.
This research investigates and reports on the use of ML and nature-inspired techniques in the domain of phishing detection, with the foremost objective of developing a dynamic and robust phishing email classifier with improved classification accuracy and reduced processing time. Two approaches to phishing email detection are proposed, and two email classifiers are developed based on them. In the first approach, a random forest algorithm is used to construct decision trees, which are, in turn, used for email classification. The second approach introduces a novel ML method that hybridizes the firefly algorithm (FFA) and the support vector machine (SVM). The hybridized method consists of three major stages: a feature extraction phase, a hyper-parameter selection phase and an email classification phase. In the feature extraction phase, the feature vectors of all the features described in Section 3.6 of the thesis are extracted and saved in a file for easy access. In the second stage, a novel hyper-parameter search algorithm, developed in this research, is used to generate an exponentially growing sequence of paired C and gamma (γ) values. FFA is then used to optimize the generated SVM hyper-parameters and to find the best hyper-parameter pair. Finally, in the third phase, SVM is used to carry out the classification. This new approach addresses the problem of hyper-parameter optimization in SVM and, in turn, improves the classification speed and accuracy of SVM. Using two publicly available email datasets, experiments were performed to evaluate the performance of the two proposed phishing email detection techniques. During the evaluation of each approach, a set of features well suited to phishing detection is extracted from the training dataset and used to construct the classifiers; the trained classifiers are then evaluated on the test dataset. The evaluations produced very good results: the RF-based classifier yielded a classification accuracy of 99.70%, an FP rate of 0.06% and an FN rate of 2.50%, while the hybridized classifier (known as FFA_SVM) produced a classification accuracy of 99.99%, an FP rate of 0.01% and an FN rate of 0.00%.
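The hyper-parameter selection stage described above starts from an exponentially growing sequence of paired C and γ values, which FFA then refines. The sketch below is an editor's simplified stand-in: it generates such an exponential grid and picks the best pair by cross-validated SVM accuracy, leaving out the firefly optimization itself; the parameter ranges are assumptions.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def exponential_pairs(c_range=(-5, 15), g_range=(-15, 3), step=2):
    """Exponentially growing (C, gamma) pairs, e.g. C = 2**-5, 2**-3, ..., 2**15."""
    return [(2.0 ** c, 2.0 ** g)
            for c in range(c_range[0], c_range[1] + 1, step)
            for g in range(g_range[0], g_range[1] + 1, step)]

def select_svm_hyperparameters(X, y, cv=5):
    """Pick the (C, gamma) pair with the best cross-validated accuracy.

    In the thesis, the firefly algorithm (FFA) refines this search; here a
    plain evaluation over the exponential grid stands in for it."""
    best_pair, best_score = None, -np.inf
    for C, gamma in exponential_pairs():
        score = cross_val_score(SVC(C=C, gamma=gamma, kernel="rbf"), X, y, cv=cv).mean()
        if score > best_score:
            best_pair, best_score = (C, gamma), score
    return best_pair, best_score

# (C, gamma), _ = select_svm_hyperparameters(X_train, y_train)
# classifier = SVC(C=C, gamma=gamma, kernel="rbf").fit(X_train, y_train)
```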
Item Intelligent instance selection techniques for support vector machine speed optimization with application to e-fraud detection. (2017) Akinyelu, Ayobami Andronicus.; Adewumi, Aderemi Oluyinka.
Decision-making is a very important aspect of many businesses. There are grievous penalties involved in wrong decisions, including financial loss, damage to company reputation and reduced productivity; hence it is vitally important that managers make the right decisions. Machine Learning (ML) simplifies the process of decision-making: it helps to discover useful patterns in historical data, which can be used for meaningful decision-making. The ability to make strategic and meaningful decisions depends on the reliability of data. Currently, many organizations are overwhelmed with vast amounts of data, and unfortunately, ML algorithms cannot effectively handle large datasets. This thesis therefore proposes seven filter-based and five wrapper-based intelligent instance selection techniques for optimizing the speed and predictive accuracy of ML algorithms, with a particular focus on the Support Vector Machine (SVM), together with a novel fitness function for instance selection. The primary difference between the filter-based and wrapper-based techniques lies in their method of selection: the filter-based techniques use the proposed fitness function, while the wrapper-based techniques use the SVM algorithm itself. The proposed techniques are obtained by fusing the SVM algorithm with the following nature-inspired algorithms: the flower pollination algorithm, social spider algorithm, firefly algorithm, cuckoo search algorithm and bat algorithm. Two of the filter-based techniques are boundary detection algorithms, inspired by edge detection in image processing and edge selection in ant colony optimization. Two different sets of experiments were performed to evaluate the proposed techniques (wrapper-based and filter-based). All experiments were performed on four datasets covering three popular e-fraud types: credit card fraud, email spam and phishing email. In addition, experiments were performed on 20 datasets provided by the well-known UCI data repository. The results show that the proposed filter-based techniques substantially improved SVM training speed in 100% (24 out of 24) of the datasets used for evaluation, without significantly affecting SVM classification quality. The wrapper-based techniques consistently improved SVM predictive accuracy in 78% (18 out of 23) of the datasets used for evaluation, and simultaneously improved SVM training speed in all cases. Furthermore, two statistical tests were conducted to validate the credibility of the results: Friedman's test and Holm's post-hoc test. The statistical tests reveal that the proposed filter-based and wrapper-based techniques are significantly faster than standard SVM and some existing instance selection techniques in all cases, and that the Cuckoo Search Instance Selection Algorithm outperforms the other proposed techniques in terms of speed. Overall, the proposed techniques have proven to be fast and accurate ML-based e-fraud detection techniques, with improved training speed, improved predictive accuracy and reduced storage requirements. In real-life applications such as video surveillance and intrusion detection systems, which require a classifier to be trained very quickly for speedy classification of new target concepts, the filter-based techniques provide the best solutions, while the wrapper-based techniques are better suited to applications, such as email filters, that are very sensitive to slight changes in predictive accuracy.
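A wrapper-style instance selection pass of the general kind described above can be sketched briefly. The version below is an editor's simplification: random candidate subsets are scored by the validation accuracy of an SVM trained on them, standing in for the nature-inspired search strategies and the proposed fitness function, which are not reproduced here.

```python
import numpy as np
from sklearn.svm import SVC

def wrapper_instance_selection(X_train, y_train, X_val, y_val,
                               keep_fraction=0.2, n_candidates=30, rng=None):
    """Return indices of a reduced training set chosen by SVM validation accuracy.

    Illustrative wrapper-style selection: random candidate subsets stand in
    for the nature-inspired search used in the thesis."""
    rng = np.random.default_rng() if rng is None else rng
    X_train, y_train = np.asarray(X_train), np.asarray(y_train)
    n = len(y_train)
    k = max(2, int(keep_fraction * n))
    best_idx, best_score = None, -np.inf
    for _ in range(n_candidates):
        idx = rng.choice(n, size=k, replace=False)
        if len(np.unique(y_train[idx])) < 2:          # need both classes to train
            continue
        model = SVC(kernel="rbf").fit(X_train[idx], y_train[idx])
        score = model.score(X_val, y_val)
        if score > best_score:
            best_idx, best_score = idx, score
    return best_idx, best_score

# idx, _ = wrapper_instance_selection(X_tr, y_tr, X_va, y_va)
# fast_svm = SVC(kernel="rbf").fit(X_tr[idx], y_tr[idx])   # trained on far fewer instances
```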
Item Model and solutions to campus parking space allocation problem. (2013) Joel, Luke Oluwaseye.; Adewumi, Aderemi Oluyinka.
Parking is considered a major land-use challenge in campus planning. The problem can be one of scarcity (few available spaces compared to demand) or of management (inefficient usage of available facilities). Many studies have looked at the parking problem from administrative and management points of view; however, mathematical models and optimization can provide substantial solutions to it. This study investigates a model for allocating car parking spaces in a university environment and improves on the constraints to address the reserved parking policy on campus. Both exact and heuristic techniques were investigated to provide solutions to this model, with a case study of the University of KwaZulu-Natal (UKZN), Westville Campus.
The optimization model was tested with four different sets of data generated to mimic real-life situations of parking supply and demand on campus for reserved and unreserved parking spaces. These datasets describe the number of parking lots and office buildings in the case study. The study also investigates optimization algorithms that can be used to solve this problem. An exact solution of the model was generated with the CPLEX solver (as incorporated in the AIMMS software), and the performance of three meta-heuristics on the same problem was then investigated in a comparative study. Results obtained from the meta-heuristic algorithms indicate that they can successfully solve the parking allocation problem and give near-optimal solutions. The parking allocation and fitness value of each meta-heuristic algorithm on the datasets used were obtained and compared with each other and with those obtained from the CPLEX solver. The results suggest that PSwarm performs better and faster than the other two algorithms and gives solutions that are close to the exact solutions obtained from the CPLEX solver.
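A toy version of such a parking allocation model can be written as a small integer program. The model below is an editor's illustration only, with hypothetical buildings, lots, distances and capacities, solved with PuLP rather than CPLEX/AIMMS: integer variables assign each building's reserved and unreserved demand to lots, subject to lot capacities and reserved-bay limits, while minimizing total walking distance. The thesis's actual objective and constraints may differ.

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpInteger

# Hypothetical data: walking distance (metres) from office buildings to lots,
# reserved/unreserved demand per building, and lot capacities.
buildings = ["B1", "B2", "B3"]
lots = ["L1", "L2"]
distance = {("B1", "L1"): 50, ("B1", "L2"): 200,
            ("B2", "L1"): 120, ("B2", "L2"): 80,
            ("B3", "L1"): 300, ("B3", "L2"): 60}
reserved_demand = {"B1": 5, "B2": 3, "B3": 4}
unreserved_demand = {"B1": 20, "B2": 15, "B3": 10}
capacity = {"L1": 30, "L2": 35}
reserved_capacity = {"L1": 8, "L2": 6}            # bays set aside for reserved permits

prob = LpProblem("campus_parking_allocation", LpMinimize)

# x[b][l]: unreserved bays in lot l assigned to building b; r[b][l]: reserved bays.
x = LpVariable.dicts("unres", (buildings, lots), lowBound=0, cat=LpInteger)
r = LpVariable.dicts("res", (buildings, lots), lowBound=0, cat=LpInteger)

prob += lpSum(distance[b, l] * (x[b][l] + r[b][l]) for b in buildings for l in lots)

for b in buildings:                               # every building's demand must be met
    prob += lpSum(x[b][l] for l in lots) == unreserved_demand[b]
    prob += lpSum(r[b][l] for l in lots) == reserved_demand[b]
for l in lots:                                    # lot capacities and reserved-bay limits
    prob += lpSum(x[b][l] + r[b][l] for b in buildings) <= capacity[l]
    prob += lpSum(r[b][l] for b in buildings) <= reserved_capacity[l]

prob.solve()
# for b in buildings:
#     for l in lots:
#         print(b, l, x[b][l].value(), r[b][l].value())
```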
Item On modeling and optimisation of air traffic flow management problem with en-route capacities. (2016) Alochukwu, Alex Somto Arinze.; Adewumi, Aderemi Oluyinka.
The air transportation industry has witnessed an upsurge over the past ten years, with passenger numbers growing rapidly. This development has led to high demand for airport and airspace usage, which places enormous strain on a country's aviation industry. Although increasing airport capacity would be the logical way to meet this demand, factors such as poor weather conditions and other unforeseen events make this difficult, if not impossible; in fact, there is a high probability of capacity reduction at most airports and air sectors. It is no surprise, therefore, that most countries experience congestion almost daily. Congestion interrupts activities in the air transportation network, with dire consequences for the air traffic control system and the nation's economy owing to the significant costs incurred by airlines and passengers. Against this background, most air traffic managers face the challenge of finding optimal scheduling strategies that minimise delay costs. Current practice and research have shown that the effects of congestion on the air traffic control system, as well as the total delay costs incurred, can be reduced substantially through optimal control of flights. Optimal control of flights can be achieved by assigning ground-holding delays or airborne delays, together with other control actions, to mitigate congestion. This exposes the need for adequate air traffic flow management, which plays a crucial role in alleviating delay costs. Air Traffic Flow Management (ATFM) is defined as a set of strategic processes that reduce air traffic delays and congestion problems; more precisely, it is the regulation of air traffic in such a way that the available airport and airspace capacity is utilised efficiently without being exceeded when handling traffic. The problem of managing air traffic so as to ensure an efficient and safe flow of aircraft throughout the airspace is often referred to as the Air Traffic Flow Management Problem (ATFMP).
This thesis provides detailed insight into the ATFMP, critically examining the existing approaches, methodologies and optimisation techniques that have been (and continue to be) used to address it. Particular attention is paid to optimisation models of airport capacity and airspace allocation, as they depict what is obtainable in the air transportation system. The thesis also presents a comprehensive, up-to-date review drawing extensively on the ATFMP literature, with instances derived mainly from North America, Europe and Africa. Having reviewed current ATFM practices and the existing optimisation models and approaches for solving the ATFMP, the generalised basic model was extended to account for additional modelling variations. Deterministic integer programming formulations were developed for reducing air traffic delays and congestion, based on the sector-based and path-based approaches already proposed for incorporating rerouting options into the basic ATFMP model. The formulation not only takes into account all the flight phases but also solves for an optimal synthesis of other flow management activities, including rerouting decisions, flight cancellation and penalisation. The claims of the basic ATFMP model were validated on artificially constructed datasets and generated instances. The computational performance of the basic and modified ATFMP models shows that the resulting solutions are completely integral and that an optimal solution can be obtained within short computational times, affirming that these models can support effective decision-making and efficient management of air traffic flow.
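The ground-holding side of the ATFMP can be illustrated with a deliberately small integer program. The model below is an editor's toy, with hypothetical flights, delay costs and arrival capacities, solved with PuLP: each flight is assigned an arrival period no earlier than its schedule, per-period capacity is respected, and total ground-holding delay cost is minimised. It is far simpler than the deterministic formulations with en-route capacities and rerouting developed in the thesis.

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary

periods = range(6)                       # hypothetical 15-minute arrival periods
flights = {                              # scheduled arrival period and per-period delay cost
    "F1": {"sched": 0, "cost": 10},
    "F2": {"sched": 0, "cost": 25},
    "F3": {"sched": 1, "cost": 15},
    "F4": {"sched": 1, "cost": 30},
}
arrival_capacity = {0: 1, 1: 1, 2: 2, 3: 2, 4: 2, 5: 2}

prob = LpProblem("ground_holding_toy", LpMinimize)

# y[f][t] = 1 if flight f is assigned to arrive in period t.
y = LpVariable.dicts("arrive", (flights, periods), cat=LpBinary)

# Objective: total ground-holding delay cost (delay = assigned period - schedule).
prob += lpSum(flights[f]["cost"] * (t - flights[f]["sched"]) * y[f][t]
              for f in flights for t in periods if t >= flights[f]["sched"])

for f in flights:
    # Each flight arrives exactly once, and never earlier than scheduled.
    prob += lpSum(y[f][t] for t in periods if t >= flights[f]["sched"]) == 1
    prob += lpSum(y[f][t] for t in periods if t < flights[f]["sched"]) == 0
for t in periods:
    prob += lpSum(y[f][t] for f in flights) <= arrival_capacity[t]

prob.solve()
# delays = {f: sum(t * y[f][t].value() for t in periods) - flights[f]["sched"] for f in flights}
```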
Item On the performance of metaheuristics for the blood platelet production and inventory problem. (2016) Olayemi, Fagbemi Seun.; Adewumi, Aderemi Oluyinka.; Olusanya, Micheal Olusoji.
Abstract available in PDF file.

Item On the performance of recent swarm based metaheuristics for the traveling tournament problem. (2013) Saul, Sandile Sinethemba.; Adewumi, Aderemi Oluyinka.

Item On the sample consensus robust estimation paradigm: comprehensive survey and novel algorithms with applications. (2016) Olukanmi, Peter Olubunmi.; Adewumi, Aderemi Oluyinka.
This study begins with a comprehensive survey of existing variants of the Random Sample Consensus (RANSAC) algorithm; five new variants are then contributed. RANSAC, arguably the most popular robust estimation algorithm in computer vision, has limitations in accuracy, efficiency and repeatability. Research into techniques for overcoming these drawbacks has been active for about two decades. In the last decade and a half, nearly every year has seen at least one variant published, with more than ten in the last two years. However, many existing variants compromise two attractive properties of the original RANSAC: simplicity and generality. Some introduce new operations, resulting in a loss of simplicity, while many of those that do not introduce new operations require problem-specific priors. In this way they trade off generality and introduce some complexity, as well as dependence on other steps of an application's workflow. Noting that these observations may explain the persisting trend of finding only the older, simpler variants in mainstream computer vision software libraries, this work adopts an approach that preserves the two mentioned properties.
Modification of the original algorithm is restricted to replacing the search strategy, since many drawbacks of RANSAC are consequences of the search strategy it adopts. A second constraint, serving to preserve generality, is that this 'ideal' strategy must require no problem-specific priors. Such a strategy is developed and reported in this dissertation. Another limitation, yet to be overcome in the literature but successfully addressed in this study, is the inherent variability of RANSAC. A few theoretical discoveries are presented, providing insights into the generic robust estimation problem; notably, a theorem proposed as an original contribution of this research reveals insights that are foundational to the newly proposed algorithms. Experiments on both generic and computer-vision-specific data show that all the proposed algorithms are generally more accurate and more consistent than RANSAC. Moreover, they are simpler in the sense that they do not require some of RANSAC's input parameters. Interestingly, although non-exhaustive in their search like typical RANSAC-like algorithms, three of the new algorithms exhibit absolute non-randomness, a property not claimed by any existing variant. One of the proposed algorithms is fully automatic, eliminating the need for user-supplied input parameters. Two of the proposed algorithms are implemented as contributed alternatives to the homography estimation function provided in MATLAB's Computer Vision Toolbox, after being shown to improve on the performance of M-estimator Sample Consensus (MSAC), which has been the choice in all releases of the toolbox, including the latest, 2015b. While this research is motivated by computer vision applications, the proposed algorithms are generic and can be applied to model-fitting problems in other scientific fields.
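For readers unfamiliar with the baseline being surveyed and improved upon, a minimal RANSAC loop for 2-D line fitting is sketched below. This is the standard textbook algorithm (hypothesise a model from a minimal sample, count inliers, keep the largest consensus set), not any of the variants contributed in the thesis.

```python
import numpy as np

def ransac_line(points, n_iters=1000, inlier_tol=1.0, rng=None):
    """Fit a line y = a*x + b to `points` (N x 2) with the classic RANSAC loop."""
    rng = np.random.default_rng() if rng is None else rng
    best_model, best_inliers = None, np.zeros(len(points), dtype=bool)
    for _ in range(n_iters):
        # 1. Hypothesise a model from a minimal sample (2 points define a line).
        i, j = rng.choice(len(points), size=2, replace=False)
        (x1, y1), (x2, y2) = points[i], points[j]
        if np.isclose(x1, x2):
            continue                              # skip degenerate (vertical) samples
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        # 2. Score the model by counting inliers within the tolerance.
        residuals = np.abs(points[:, 1] - (a * points[:, 0] + b))
        inliers = residuals < inlier_tol
        # 3. Keep the model with the largest consensus set.
        if inliers.sum() > best_inliers.sum():
            best_model, best_inliers = (a, b), inliers
    return best_model, best_inliers

# (a, b), mask = ransac_line(np.column_stack([x, y]))
# a_ls, b_ls = np.polyfit(x[mask], y[mask], 1)      # optional least-squares refit on inliers
```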
Item A semantic sensor web framework for proactive environmental monitoring and control. (2017) Adeleke, Jude Adekunle.; Moodley, Deshendran.; Rens, Gavin Brian.; Adewumi, Aderemi Oluyinka.
Observing and monitoring the natural and built environments is crucial for maintaining and preserving human life. Environmental monitoring applications typically incorporate sensor technology to continually observe specific features of interest in the physical environment and transmit the data emanating from these sensors to a computing system for analysis. Semantic Sensor Web technology supports the semantic enrichment of sensor data and provides expressive analytic techniques for data fusion, situation detection and situation analysis. Despite the promising successes of Semantic Sensor Web technology, current Semantic Sensor Web frameworks typically focus on developing applications that detect and react to situations identified from current or past observations. While these reactive applications respond quickly to detected situations to minimize adverse effects, they are limited when it comes to anticipating future adverse situations and determining proactive control actions to prevent or mitigate them. Most current Semantic Sensor Web frameworks lack two essential mechanisms required to achieve proactive control, namely mechanisms for anticipating the future and coherent mechanisms for consistent decision processing and planning. Designing and developing proactive monitoring and control Semantic Sensor Web applications is challenging.
It requires incorporating and integrating different techniques for situation detection, situation prediction, decision-making and planning in a coherent framework. This research proposes such a coherent Semantic Sensor Web framework for proactive monitoring and control. It incorporates an ontology to facilitate situation detection from streaming sensor observations, statistical machine learning for situation prediction, and Markov Decision Processes for decision-making and planning. The efficacy and use of the framework are evaluated through the development of two prototype applications. The first is for proactive monitoring and control of indoor air quality, to avoid poor air quality situations; the second is for proactive monitoring and control of electricity usage in blocks of residential houses, to prevent strain on the national grid. These applications show the effectiveness of the proposed framework for developing Semantic Sensor Web applications that proactively avert unwanted environmental situations before they occur.
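The decision-making and planning layer above uses Markov Decision Processes. As a hedged illustration of how a proactive controller of that kind can be set up, the toy MDP below has two predicted air-quality states and two actions (wait or ventilate); all transition probabilities and costs are invented for the example, and value iteration extracts the policy. The ontology-based detection and statistical prediction components of the framework are not modelled here.

```python
import numpy as np

states = ["good", "poor"]                 # predicted indoor air quality
actions = ["wait", "ventilate"]

# P[a][s, s']: probability of moving from state s to s' under action a (invented numbers).
P = {"wait":      np.array([[0.80, 0.20],
                            [0.10, 0.90]]),
     "ventilate": np.array([[0.95, 0.05],
                            [0.70, 0.30]])}
# cost[a][s]: immediate cost of taking action a in state s (ventilating costs a little
# energy; being in the "poor" state carries a large penalty).
cost = {"wait": np.array([0.0, 10.0]), "ventilate": np.array([1.0, 11.0])}

def value_iteration(gamma=0.95, tol=1e-6):
    V = np.zeros(len(states))
    while True:
        Q = np.array([cost[a] + gamma * P[a] @ V for a in actions])   # action-values per state
        V_new = Q.min(axis=0)
        if np.max(np.abs(V_new - V)) < tol:
            policy = [actions[i] for i in Q.argmin(axis=0)]
            return V_new, policy
        V = V_new

# values, policy = value_iteration()
# policy[states.index("poor")] is the action to take when poor air quality is anticipated.
```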
Item Studies in heuristics for the annual crop planning problem. (2012) Chetty, Sivashan.; Adewumi, Aderemi Oluyinka.
Increases in the costs associated with agricultural production and the limited availability of resources have amplified the need for optimized solutions to the problem of crop planning. The increased costs have impacted negatively on both the cost of production and the sale prices of finished products to consumers, with resultant effects on the socio-economic livelihoods of people around the world, increasing the burden of poverty, malnutrition, disease and other social problems. The limited availability of land, irrigated water and other resources therefore demands optimal solutions to crop planning, in order to maintain profitable outputs that do not strain available resources while still meeting consumer demand. The current situation is such that crop producers are required to generate more output per area of crops cultivated within the ambit of the resources available for crop production. This creates a great challenge for both farmers and researchers; the problem is essentially an optimization problem and hence also a challenge for researchers in the mathematical and computing sciences. Within the agricultural sector in particular, achieving efficient use of irrigated water demands that optimized solutions be found for its usage during crop planning and production. Population growth and the limited availability of fresh water have increased the demand for fresh water from all sectors of the economy, which in turn has increased the pressure on the agricultural sector, one of the primary users of fresh water, to use irrigated water more efficiently and minimize excessive wastage. It has therefore become very important that optimized solutions be found for the allocation and use of irrigated water, both for conservation and as an essential input to crop planning decisions. To determine good solutions to crop planning decisions, this study therefore dwells on a fairly new but important area of agricultural planning, namely the Annual Crop Planning (ACP) problem, which is posed at the level of an irrigation scheme.
The study presents a model of the ACP problem that helps determine resource allocations amongst the various competing crops required to be grown at an irrigation scheme within a year. Both new and existing irrigation schemes are considered. Determining solutions to an ACP problem requires that the requirements and constraints presented by crop characteristics, climatic conditions, market demand and the variable costs associated with agricultural production are observed. The objective is to maximize the total gross profit that can be earned from producing the various crops within a production year. Owing to the complexity involved in determining solutions to an ACP problem, exact methods are not researched in this study. Rather, to determine near-optimal solutions to this NP-Hard optimization problem, this research introduces three new Local Search (LS) metaheuristic algorithms, called the Best Performance Algorithm (BPA), the Iterative Best Performance Algorithm (IBPA) and the Largest Absolute Difference Algorithm (LADA). The motivation for implementing these algorithms is to investigate techniques that can determine effective solutions to difficult optimization problems at low computational cost. This study also investigates the performance of three recently introduced swarm intelligence (SI) metaheuristic algorithms in solving the ACP problems studied. These algorithms have shown great strength in providing competitive solutions to similar optimization problems in the literature, hence their use in this work. To the best of the researchers' knowledge, this is the first work that reports a comparative study of the performance of these particular SI algorithms on a crop planning problem. The results obtained and reported herein show the viability, effectiveness and efficiency of incorporating proven metaheuristic techniques into any decision support system intended to help determine solutions to the ACP problem.
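At its core, the ACP problem above allocates scarce land and irrigation water among competing crops to maximise gross profit within a production year. The toy linear program below, with hypothetical crops, profits, water requirements and demand caps and solved with PuLP, shows the general shape of such a model; it is not the model or the local search heuristics studied in the thesis.

```python
from pulp import LpProblem, LpMaximize, LpVariable, lpSum

# Hypothetical data for three competing crops at an irrigation scheme.
crops = ["maize", "wheat", "soybean"]
profit_per_ha = {"maize": 900.0, "wheat": 700.0, "soybean": 1100.0}    # gross profit per hectare
water_per_ha = {"maize": 6.0, "wheat": 4.0, "soybean": 5.0}            # megalitres per hectare
max_demand_ha = {"maize": 120.0, "wheat": 150.0, "soybean": 80.0}      # market demand cap

total_land_ha = 200.0
total_water_ml = 1000.0

prob = LpProblem("annual_crop_planning_toy", LpMaximize)
area = LpVariable.dicts("area_ha", crops, lowBound=0)

prob += lpSum(profit_per_ha[c] * area[c] for c in crops)                   # maximise gross profit
prob += lpSum(area[c] for c in crops) <= total_land_ha                     # land limit
prob += lpSum(water_per_ha[c] * area[c] for c in crops) <= total_water_ml  # irrigation water limit
for c in crops:
    prob += area[c] <= max_demand_ha[c]                                    # market demand limits

prob.solve()
# plan = {c: area[c].value() for c in crops}       # hectares allocated to each crop
```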
Item Studies in particle swarm optimization technique for global optimization. (2013) Martins, Arasomwan Akugbe.; Adewumi, Aderemi Oluyinka.
Abstract available in the digital copy.

Item Studies of heuristics for hostel space allocation problem. (2013) Ajibola, Ariyo Sunday.; Adewumi, Aderemi Oluyinka.
This research work focused on the performance of heuristics and metaheuristics for the recently defined Hostel Space Allocation Problem (HSAP), a new instance of the space allocation problem (SAP) in higher institutions of learning (HIL). SAP is a combinatorial optimisation problem that involves distributing the available spaces amongst a set of deserving entities (rooms, bed spaces, office spaces, etc.) so that the spaces are optimally utilized while the given set of constraints is satisfied. HSAP deals with the allocation of bed space in available but limited halls of residence to competing groups of students, such that the given requirements and constraints are satisfied as far as possible. The problem was recently introduced in the literature, and a preliminary baseline solution using a Genetic Algorithm (GA) was provided to show the viability of heuristics for solving the problem rather than resorting to the usual manual processing. Since the administration of hostel space allocation varies across institutions, countries and continents, the available instance is defined as obtained from a top institution in Nigeria.
This identified instance is the focus of this research study. The main aim of this thesis is to study the strength and performance of some Local Search (LS) heuristics in solving the problem. In the process, however, some hybrid techniques that combine population-based and LS heuristics are also derived, enabling a comprehensive comparative study aimed at determining which heuristic, or combination of heuristics, performs best for the given problem. HSAP is a multi-objective and multi-stage problem, and each stage of the allocation has different requirements and constraints. A formulation of the problem as an optimisation problem is provided, and various inter-related heuristics and meta-heuristics are then applied to solve it at the different levels of the allocation process. Specifically, Hill Climbing (HC), Simulated Annealing (SA), Tabu Search (TS), Late Acceptance Hill Climbing (LAHC) and GA were applied to distribute the students at all three levels of allocation, and a comparison of the algorithms is presented at each level. In addition, multi-objective variants of the algorithms were implemented, producing promising solutions that improve on the results of the manual method used by administrators in the institutions. Comparisons and analyses of the results obtained from the above methods were carried out. Obtaining datasets for HSAP is very difficult, as most institutions either do not keep proper records of past allocations or are unwilling to make such records available for research purposes. The only dataset available, which is also used for simulation in this study, is the one recently reported in the literature. However, to test the robustness of the algorithms, two new datasets that follow the pattern of the known dataset were randomly generated. Results obtained with these datasets further demonstrate the viability of efficiently applying tested operations research techniques to solve this new instance of SAP.
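Of the local search heuristics listed above, Late Acceptance Hill Climbing (LAHC) is perhaps the least widely known. The sketch below shows the standard LAHC acceptance rule of Burke and Bykov (compare a candidate against a fixed-length history of earlier costs rather than only against the current cost); the cost function and neighbourhood move are hypothetical placeholders to be supplied by the allocation problem, and this is not the thesis's specific implementation.

```python
def lahc(initial_solution, cost, neighbour, history_length=50, max_iters=100_000):
    """Late Acceptance Hill Climbing: accept a candidate if it is no worse than the
    current solution OR no worse than the cost recorded `history_length` iterations ago."""
    current = initial_solution
    current_cost = cost(current)
    best, best_cost = current, current_cost
    history = [current_cost] * history_length     # fixed-length cost history
    for it in range(max_iters):
        candidate = neighbour(current)            # problem-specific move, e.g. swap two students
        candidate_cost = cost(candidate)
        slot = it % history_length
        if candidate_cost <= current_cost or candidate_cost <= history[slot]:
            current, current_cost = candidate, candidate_cost
            if current_cost < best_cost:
                best, best_cost = current, current_cost
        history[slot] = current_cost              # record the (possibly unchanged) current cost
    return best, best_cost

# Example with hypothetical problem pieces:
# best_allocation, c = lahc(initial_allocation, cost=unmet_requirements, neighbour=swap_two_students)
```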