Masters Degrees (Computer Science)
Permanent URI for this collection: https://hdl.handle.net/10413/7114
Browsing Masters Degrees (Computer Science) by Issue Date
Now showing 1 - 20 of 86
Item Planarity testing and embedding algorithms. (1990) Carson, D. I.; Oellermann, Ortrud Ruth.
This thesis deals with several aspects of planar graphs, and some of the problems associated with non-planar graphs. Chapter 1 is devoted to introducing some of the fundamental notation and tools used in the remainder of the thesis. Graphs serve as useful models of electronic circuits. It is often of interest to know if a given electronic circuit has a layout on the plane so that no two wires cross. In Chapter 2, three efficient algorithms are described for determining whether a given 2-connected graph (which may model such a circuit) is planar. The first planarity testing algorithm uses a path addition approach. Although this algorithm is efficient, it does not have linear complexity. However, the second planarity testing algorithm has linear complexity, and uses a recursive fragment addition technique. The last planarity testing algorithm also has linear complexity, and relies on a relatively new data structure called PQ-trees, which have several important applications to planar graphs. This algorithm uses a vertex addition technique. Chapter 3 further develops the idea of modelling an electronic circuit using a graph. Knowing that a given electronic circuit may be placed in the plane with no wires crossing is often insufficient. For example, some electronic circuits have in excess of 100 000 nodes. Thus, obtaining a description of such a layout is important. In Chapter 3 we study two algorithms for obtaining such a description, both of which rely on the PQ-tree data structure. The first algorithm determines a rotational embedding of a 2-connected graph. Given a rotational embedding of a 2-connected graph, the second algorithm determines if a convex drawing of the graph is possible. If a convex drawing is possible, then we output the convex drawing. In Chapter 4, we concern ourselves with graphs that have failed a planarity test of Chapter 2.
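The planarity algorithms summarised above decide the question exactly. By way of contrast, Euler's formula gives a much weaker necessary condition that is easy to sketch (this is not one of the thesis's algorithms): a simple planar graph with v ≥ 3 vertices has at most 3v − 6 edges.

```python
from itertools import combinations

def may_be_planar(num_vertices, edges):
    """Necessary (not sufficient) condition for planarity from Euler's
    formula: a simple planar graph with v >= 3 vertices has <= 3v - 6 edges."""
    v, e = num_vertices, len(edges)
    return True if v < 3 else e <= 3 * v - 6

# K5 has 10 edges, exceeding 3*5 - 6 = 9, so it is certainly non-planar.
k5 = list(combinations(range(5), 2))
print(may_be_planar(5, k5))   # False

# K3,3 has only 9 edges <= 3*6 - 6 = 12, so this test cannot reject it,
# even though K3,3 is non-planar -- hence the need for real algorithms.
k33 = [(a, b) for a in range(3) for b in range(3, 6)]
print(may_be_planar(6, k33))  # True
```

The two graphs used here, K5 and K3,3, are the Kuratowski graphs whose subdivisions the Chapter 4 algorithm locates in non-planar graphs.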
This is of particular importance, since complex electronic circuits often do not allow a layout on the plane. We study three different ways of approaching the problem of an electronic circuit modelled on a non-planar graph, all of which use the PQ-tree data structure. We study an algorithm for finding an upper bound on the thickness of a graph, an algorithm for determining the subgraphs of a non-planar graph which are subdivisions of the Kuratowski graphs K5 and K3,3, and lastly we present a new algorithm for finding an upper bound on the genus of a non-planar graph.

Item Packing problems on a PC. (1991) Deighton, Andrew George.; Meyerowitz, Jane Julia.
Bin packing is a problem with many applications in various industries. This thesis addresses a specific instance of this problem, known as the Container Packing problem. Special attention is paid to the Pallet Loading problem, which is a restricted sub-problem of the general Container Packing problem. Since the Bin Packing problem is NP-complete, it is customary to apply a heuristic measure in order to approximate solutions in a reasonable amount of computation time rather than to attempt to produce optimal results by applying some exact algorithm. Several heuristics are examined for the problems under consideration, and the results produced by each are shown and compared where relevant.

Item Built-in tests for a real-time embedded system. (1991) Olander, Peter Andrew.
Beneath the facade of the applications code of a well-designed real-time embedded system lies intrinsic firmware that facilitates a fast and effective means of detecting and diagnosing inevitable hardware failures. These failures can encumber the availability of a system, and, consequently, an identification of the source of the malfunction is needed. It is shown that the number of possible origins of all manner of failures is immense. As a result, fault models are contrived to encompass prevalent hardware faults.
Furthermore, the complexity is reduced by determining syndromes for particular circuitry and applying test vectors at a functional block level. Testing phases and philosophies, together with standardisation policies, are defined to ensure the compliance of system designers with the underlying principles of evaluating system integrity. The three testing phases of power-on self tests at system start-up, on-line health monitoring and off-line diagnostics are designed to ensure that the inherent test firmware remains inconspicuous during normal applications. The prominence of the code is, however, apparent on the detection or diagnosis of a hardware failure. The authenticity of the theoretical models, standardisation policies and built-in test philosophies is illustrated by means of their application to an intricate real-time system. The architecture and the software design implementing the ideologies are described extensively. Standardisation policies, enhanced by the proposition of generic tests for common core components, are advocated at all hierarchical levels. The presentation of the integration of the hardware and software is aimed at portraying the moderately complex nature of the task of generating a set of built-in tests for a real-time embedded system. In spite of generic policies, the intricacies of the architecture are found to have a direct influence on software design decisions. It is thus concluded that the diagnostic objectives of the user requirements specification be lucidly expressed by both operational and maintenance personnel for all testing phases. Disparity may exist between the system designer and the end user in the understanding of the requirements specification defining the objectives of the diagnosis. Complete collaboration between the two parties is therefore essential throughout the development life cycle, but especially during the preliminary design phase.
Thereafter, the designer would be able to decide on the sophistication of the system testing capabilities.

Item Application of backpropagation-like generative algorithms to various problems. (1992) Powell, Alan Roy.; Sartori-Angus, Alan G.
Artificial neural networks (ANNs) were originally inspired by networks of biological neurons and the interactions present in networks of these neurons. The recent revival of interest in ANNs has again focused attention on the apparent ability of ANNs to solve difficult problems, such as machine vision, in novel ways. There are many types of ANNs which differ in architecture and learning algorithms, and the list grows annually. This study was restricted to feed-forward architectures and backpropagation-like (BP-like) learning algorithms. However, it is well known that the learning problem for such networks is NP-complete. Thus generative and incremental learning algorithms, which have various advantages and to which the NP-completeness analysis used for BP-like networks may not apply, were also studied. Various algorithms were investigated and their performance compared. Finally, the better algorithms were applied to a number of problems including music composition, image binarization, and navigation and goal satisfaction in an artificial environment. These tasks were chosen to investigate different aspects of ANN behaviour. The results, where appropriate, were compared to those resulting from non-ANN methods, and varied from poor to very encouraging.

Item Speech recognition and blackboard expert systems. (1992) Loureiro, Guy Marchand.; Sartori-Angus, Alan G.
Spoken language is used by people to communicate naturally with one another. A simplistic view of the communication process is as follows. Person A wishes to communicate an idea to person B. The idea, initiated in the mind/brain of person A, is encoded into speech signals by means of person A's speech production mechanism, the vocal apparatus in the vocal tract.
Various kinds of noise may interfere with the speech signals as they travel to person B. The resulting signal is captured by person B's speech receiving mechanism, the ear. It is then analysed and decoded into a meaningful message by the brain of person B. This thesis concerns itself with the investigation of, and an attempt to automate, the receiving and decoding of English sentences using a machine, that is, to perform the task of person B in the above scenario using a computer. The aim is not only to produce a sequence of phonetic sounds, but to look at the problems of building, in the 'mind of the machine', a picture of the meanings, intentions, absurdities and realities of the spoken message. The various models, algorithms and techniques of speech recognition and speech understanding systems are examined. Speech signals are captured and digitised by hardware. The digital samples are analysed and the important distinguishing features of all speech sounds are identified. These are then used to classify speech sounds in subsequent spoken words. The way speech sounds are joined together to form syllables and words introduces difficult problems to the automatic recognition process. Speech sounds are blurred, overlapped or left out due to the effects of coarticulation. Finally, natural language processing issues, such as the importance of syntax (the structure) and semantics (the meaning) of sentences, are studied. A system to control and unite all the above processing is considered.
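The analysis step above, identifying distinguishing features in the digitised samples, can be illustrated with a naive discrete Fourier transform over one frame. This is a sketch only: the thesis does not prescribe this particular feature set, and practical recognisers use FFTs and perceptually motivated features.

```python
import cmath
import math

def dft_magnitudes(frame):
    """Magnitude spectrum of one digitised frame via the naive O(n^2) DFT."""
    n = len(frame)
    return [abs(sum(frame[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)))
            for k in range(n // 2)]

# A synthetic 'voiced' frame: a pure tone with 4 cycles per 32-sample frame.
frame = [math.sin(2 * math.pi * 4 * t / 32) for t in range(32)]
mags = dft_magnitudes(frame)
peak_bin = max(range(len(mags)), key=mags.__getitem__)
print(peak_bin)  # 4: the spectral energy sits in the bin matching the tone
```

Real speech frames spread energy over many bins; classification then works on such spectral (or derived) feature vectors rather than raw samples.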
The blackboard expert system model of the widely reported HEARSAY-II speech recognition system is reviewed as the system with the best potential for the above tasks.

Item Possible Models Diagrams - a new approach to teaching propositional logic. (1994) Clarke, Matthew Charles.; Dempster, Robert.; Grayson, Diane J.
Abstract available in PDF.

Item Neural network assisted software engineered refractive fringe diagnostic of spherical shocks. (1996) Kistan, Trevor.; Sartori-Angus, Alan G.
A shock is essentially a propagating variation in the pressure or density of a medium. If the medium is transparent, such as air, and the shock radially symmetric, the refractive fringe diagnostic can be used to examine its general features. A laser beam probes the shock; the central part of the beam, refracted to different degrees by the different density features within the shock, interferes with itself and with the outer unrefracted part, creating a series of coarse and fine fringes. By examining this interference pattern one can gain insight into the density profile underlying the shock. A series of such experiments was conducted by the Plasma Physics Research Institute at the University of Natal in 1990. To model the situation computationally, they developed a ray-tracer which produced interference patterns for modified theoretical density profiles based on those predicted by Sedov. After numerous trials, an intensity pattern was produced which agreed approximately with experimental observations. Thus encouraged, the institute then sought to determine density profiles directly from the interference patterns, but a true mathematical deconvolution proved non-trivial and is still awaited. The work presented in this thesis reconstructs the ray-tracer using software engineering techniques and achieves the desired deconvolution by training a neural network of the back-propagation type to behave as an inverse ray-tracer.
The ray-tracer is first used to generate numerous density-profile/interference-pattern pairs. The neural network is trained with this theoretical data to provide a density profile when presented with an interference pattern. The trained network is then tested with experimental interference patterns extracted from captured images. The density profiles predicted by the neural network are then fed back to the ray-tracer and the resultant theoretically determined interference patterns compared to those obtained experimentally. The shock is examined at various times after the initial explosion, allowing its propagation to be tracked by its evolving density profile and interference pattern. The results obtained prove superior to those first published by the institute and confirm the neural network's promise as a research tool. Instead of lengthy trial and error sessions with the unaided ray-tracer, experimental interference patterns can be fed directly to an appropriately trained neural network to yield an initial density profile. The network, not the researcher, does the pattern association.

Item A computer mediated system for distance education. (1996) Pillay, Nelishia.; Dempster, Robert.
A problem currently facing South Africa is the large number of poorly educated or uneducated people in many parts of the country. Distance education has proven to be an apt solution to this problem. However, one of the numerous constraints associated with studying at a distance is insufficient communication between students and lecturers and the lack of peer interaction. The integration of Computer Mediated Communications (CMC) in the delivery of distance education courses world-wide has proved to be a means of alleviating this communication problem.
The study presented in this thesis examines the technical feasibility of implementing CMC in the delivery of South African distance education courses as a solution to the communication problems experienced by distance learners in this country. For this purpose a system was developed and implemented at a South African distance education institution, namely the Natal College of Education in Pietermaritzburg. Based on this implementation, a technical evaluation of the feasibility of CMC in the instruction of distance education courses within a South African infrastructure was carried out. As a result of this study we have been able to:
• Determine the technical problems associated with the implementation of a CMC system in a South African distance education environment.
• Identify possible solutions to these technical problems.
• Define a set of criteria which, if met by a CMC system, would ensure the technical feasibility of the system as a solution to the communication problems experienced by South African distance learners.
• Determine the effects of students' attitudes towards computers on their use of the CMC system.
• Determine the effect of CMC on students' attitudes towards computers.
• Identify any additional factors, besides technical issues, which need to be taken into account when implementing a CMC system.

Item Application of genetic algorithms to the travelling salesperson problem. (1996) McKenzie, Peter John Campbell.; Petkov, Doncho.
Genetic Algorithms (GAs) can be easily applied to many different problems since they make few assumptions about the application domain and perform relatively well. They can also be modified with some success for handling a particular problem. The travelling salesperson problem (TSP) is a famous NP-hard problem in combinatorial optimization; consequently, no polynomial-time solution for it is known. The aim of this dissertation is to investigate the application of a number of GAs to the TSP.
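As one hedged illustration of the evolutionary approach to the TSP (a minimal sketch, not any of the GAs studied in the dissertation), the following evolves tours with tournament selection and reversal mutation only; published GAs for the TSP typically add an order-preserving crossover such as OX.

```python
import math
import random

def tour_length(tour, pts):
    """Total length of a closed tour over the city coordinates in pts."""
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def evolve(pts, pop_size=60, generations=300, seed=1):
    """Minimal evolutionary TSP solver: tournament selection, segment
    reversal (2-opt style) mutation, keep the better of parent and child."""
    rng = random.Random(seed)
    n = len(pts)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            parent = min(rng.sample(pop, 3), key=lambda t: tour_length(t, pts))
            child = parent[:]
            i, j = sorted(rng.sample(range(n), 2))
            child[i:j + 1] = reversed(child[i:j + 1])  # mutate by reversal
            new_pop.append(min(child, parent, key=lambda t: tour_length(t, pts)))
        pop = new_pop
    return min(pop, key=lambda t: tour_length(t, pts))

# Eight cities on a unit circle: the optimal tour follows the circle,
# with length 8 * 2 * sin(pi/8), roughly 6.123.
pts = [(math.cos(2 * math.pi * k / 8), math.sin(2 * math.pi * k / 8))
       for k in range(8)]
best = evolve(pts)
```

On convex instances like this one, reversal moves suffice to untangle any crossing tour, which is why even this stripped-down GA reliably recovers the circular ordering.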
These results will be compared with those of traditional solutions to the TSP and with the results of other applications of the GA to the TSP.

Item Artificial neural networks for image recognition : a study of feature extraction methods and an implementation for handwritten character recognition. (1996) Moodley, Deshendran.; Ram, Vevek.; Haines, Linda Margaret.
The use of computers for digital image recognition has become quite widespread. Applications include face recognition, handwriting interpretation and fingerprint analysis. A feature vector whose dimension is much lower than that of the original image data is used to represent the image. This removes redundancy from the data and drastically cuts the computational cost of the classification stage. The most important criterion for the extracted features is that they must retain as much as possible of the discriminatory information present in the original data. Feature extraction methods which have been used with neural networks are moment invariants, Zernike moments, Fourier descriptors, Gabor filters and wavelets. These, together with the Neocognitron, which incorporates feature extraction within a neural network architecture, are described, and two methods, Zernike moments and the Neocognitron, are chosen to illustrate the role of feature extraction in image recognition.

Item A support environment for the teaching of programming. (1996) Stewart, Rosanne.
This thesis examines the effectiveness of a specially constructed computer based support environment for the teaching of computer programming to novice programmers. In order to achieve this, the following distinct activities were pursued. Firstly, an in-depth investigation of programming misconceptions and techniques used for overcoming them was carried out. Secondly, the educational principles gained from this investigation were used to design and implement a computer based environment to support novice programmers learning the Pascal language.
Finally, several statistical methods were used to compare students who made use of the support environment to those who did not, and the results are discussed.

Item The development of a behavioural video analysis system and a toolbox prototype. (1996) Naidoo, Throshni.; Dempster, Robert.
Behavioural research studies often include a data acquisition process, during which subjects are observed and the displayed behaviours recorded, and a subsequent analysis process, during which the observed behaviours are analysed. The data acquisition process can occur in real time, or at a later stage by referring to a recording of the original session. This dissertation addresses the latter approach through the use of computer based digital video technology. A Windows-based video analysis system, called VAS, that was developed to assist with the acquisition of observed behaviours from the video, and the analysis thereof, is discussed. This is followed by a discussion of the implementation of VAS. Finally, directions for further research are identified and discussed.

Item The application of computer technology in South African distance education. (1996) Owusu-Sekyere, Charles.; Meyerowitz, Jane Julia.
The advent of on-line Computer-Assisted Instruction and Computer Mediated Communication may improve instruction and communication in distance education in South African universities. On-line Computer-Assisted Instruction in distance education makes the reinforcement of knowledge both systematic and immediate. With instructional media such as printed text, audio-cassettes, radio and television broadcasts, the student at a distance is an isolated and passive recipient of knowledge. On-line Computer-Assisted Instruction supported by Computer Mediated Communication for interaction and feedback could close the gaps in time and distance between the teacher and the student in distance education.
The current network capabilities of the computer make it possible for such a student to interact with peers and lecturers before, during and after instructional episodes. Computer Mediated Communication can facilitate the use of electronic messaging such as Electronic Mail, Internet Relay Chat, List Servers, Multi-User Domains and Bulletin Board Services for interaction and feedback. This thesis investigates whether instruction and communication in South African universities with a distance education option can be improved using on-line Computer-Assisted Instruction and Computer Mediated Communication respectively. The thesis also makes proposals for their implementation in South Africa by analysing the applications of computer technology in degree-awarding distance education institutions in some developed and developing countries that use on-line Computer-Assisted Instruction and Computer Mediated Communication.

Item On case representation and indexing in a case-based reasoning system for waste management. (1997) Wortmann, Karl Lyndon.; Petkov, Doncho.; Senior, Eric.
Case-Based Reasoning is a fairly new Artificial Intelligence technique which makes use of past experience as the basis for solving new problems. Typically, a case-based reasoning system stores actual past problems and solutions in memory as cases. Due to its ability to reason from actual experience and to save solved problems, and thus learn automatically, case-based reasoning has been found to be applicable to domains for which techniques such as rule-based reasoning have traditionally not been well suited, such as experience-rich, unstructured domains. This applicability has led to it becoming a viable new artificial intelligence topic from both a research and an application perspective. This dissertation concentrates on researching and implementing indexing techniques for case-based reasoning.
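Flat nearest-neighbour retrieval is the usual baseline against which richer case-base indexing techniques are measured; a minimal sketch follows (the feature names, weights and cases below are invented for illustration and are not taken from the thesis).

```python
def retrieve(case_base, query, weights):
    """Return the stored case most similar to the query under a weighted
    count of matching feature values (flat nearest-neighbour retrieval)."""
    def similarity(case):
        return sum(w for feature, w in weights.items()
                   if case["features"].get(feature) == query.get(feature))
    return max(case_base, key=similarity)

# Hypothetical hazardous-waste cases: each pairs a problem description
# (features) with a past pre-transportation decision (solution).
cases = [
    {"features": {"state": "liquid", "flammable": True},
     "solution": "sealed drum, flammable-goods carrier"},
    {"features": {"state": "solid", "flammable": False},
     "solution": "lined skip, standard carrier"},
]
query = {"state": "liquid", "flammable": True}
best = retrieve(cases, query, weights={"state": 2.0, "flammable": 1.0})
print(best["solution"])  # reuses the matching liquid, flammable case
```

Because this scan is linear in the size of the case base, indexing techniques such as those designed in the dissertation aim to reach a similar best case without comparing the query against every stored case.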
Case representation is researched as a requirement for implementation of indexing techniques, and pre-transportation decision making for hazardous waste handling is used as the domain for applying and testing the techniques. The field of case-based reasoning was covered in general, and case representation and indexing were researched in detail. A single case representation scheme was designed and implemented. Five indexing techniques were designed, implemented and tested. Their effectiveness is assessed in relation to each other and to other reasoners, and implications for their use as the basis for a case-based reasoning intelligent decision support system for pre-transportation decision making for hazardous waste handling are briefly assessed.

Item A real time, system independent, secure, Internet based auctioning system. (2000) Brown, Cuan.; Murrell, Hugh Crozier.; Swart, Hendrika Cornelia Scott.
This thesis outlines the creation of a secure, real time, system independent, Internet based auctioning application. The system has been developed to meet today's stringent requirements on secure Internet based applications. To attain this goal, the latest cryptographic algorithms and development platforms have been used. The result is a JAVA based server and client auctioning application. The client application is designed to run in any common web browser, and the server to execute on any JAVA enabled operating system with a web server and Internet connection. The real time system uses a relatively secure hybrid cryptosystem for communication. This involves the use of RSA for secure key exchange, and RC6 and MARS for secure communication.

Item Computer assisted education : design, development and evaluation. (2001) Murrell, Katherine Ann.; Meyerowitz, Jane Julia.
Educational institutions throughout the world are increasingly facing classes of educationally, culturally and linguistically diverse student groups.
At the same time, economic constraints require these institutions to expand their student base, and they are therefore looking to distance education and continuing education modules to meet these challenges. Simultaneously, rapid advances in desktop computing capabilities and Internet delivered information have revived interest in Computer Assisted Education (CAE). The University of Natal is no exception to these trends; schools, departments and individual members of staff are increasingly exploring the possibility of using the University's computer infrastructure to assist in delivering quality education, maintaining current standards, and addressing the multiple needs of the students. To investigate these issues a CAE program was developed for use in the Nelson R. Mandela School of Medicine to investigate how students would make use of the technology, and to report on the development and evaluation processes of such a development. In doing so, various lessons could be learnt which could inform the further development of such software at the University. In order to support the development of the CAE program an extensive literature survey into current educational theory was conducted. Its objectives were to explore and understand all the factors affecting the development and use of computer based systems as an educational tool. Particular aspects considered were
• the debate between constructivist and instructivist theory in their applicability to both the medium and the subject material.
• instructional styles, and with them the learning styles, that could be used to support the educational goals of the diverse student population.
• instructional design methodologies that are currently used, as well as media production methodologies. The goal of this aspect of the research was to inform both the development of the case study and a broader understanding of the methodology that could be used for other developments.
Included in this phase of the research are methods and criteria for the selection of authoring systems, and interface design issues in a multi-cultural, multi-lingual environment.
• the review of different evaluation strategies, in order to incorporate appropriate evaluation in the CAE case study.
• the investigation of broader sociological and historical factors that may influence the way in which CAE can be used effectively in a South African context.
The presumption was that students from historically disadvantaged backgrounds and those with English as a second language would be less willing to use technological interventions than those who were more likely to have had access to computers earlier in their education. The case study set out to investigate whether this presumption was valid, and if so, what elements of design and delivery could facilitate these students' usage of such systems. The presumption was not, however, validated by the case study, which showed the exact opposite of expectations, with more historically disadvantaged students showing a willingness to use the module.

Item Spectral techniques for roughness estimation. (2001) Lewis, Mark.
Roughness is a relatively untouched field considering its significance to natural scientists. In this thesis mathematical techniques for measuring the roughness of signals are constructed and investigated. Both one-dimensional and two-dimensional signals are tackled. Applications include geological profiles and biological surfaces. Mathematical techniques include Fourier and Wavelet Transforms.

Item Executive information systems usage : the impact of web-based technologies. (2002) Averweg, Udo Richard Franz.; Petkov, Don.
Executive Information Systems (EIS) grew out of the information needs of top executives. The recent literature reports that EIS usage has spread throughout organisations. Web-based technologies are causing a revisit of existing IT implementation models, including those for EIS.
These technologies include: Intranet, Internet, Extranet, e-Commerce: Business-to-Business (B2B), e-Commerce: Business-to-Consumer (B2C), Wireless Application Protocol (WAP) and other mobile technologies. The author conducts a field study of 31 well-established organisations in KwaZulu/Natal, South Africa, which have EIS experience. A validated survey instrument is administered to an EIS stakeholder in each organisation surveyed. This dissertation reports on (1) an investigation into previous research on IT adoption; (2) the finding that there is little evidence that the theoretical usage aspects of the Technology Acceptance Model (TAM) are echoed in EIS implementations in KwaZulu/Natal; and (3) the identification and ranking of Web-based technologies in order of their perceived impact on EIS currently and in the future. There is a positive impact level trend for all Web-based technologies on future EIS implementations. The results from this field study could be useful in formulating a set of management perspectives for organisations in South Africa wishing to embark on EIS implementation programs.

Item Using mobile agents to solve the distributed buying problem. (2002) Reddy, Kamil.; Goddard, Wayne.
This study deals with the Distributed Buying Problem, that is, the problem faced by geographically distributed businesses when it comes to optimising buyer time and global business resources. It adopts a software agent-based approach to the problem. A literature survey was carried out to review the relatively new field of software agents, and mobile agents in particular. The role of agents in electronic commerce was also studied. A mobile agent system was then designed and implemented to serve as a proof-of-concept system for an agent-based solution to the problem.
The design and implementation are then discussed.

Item Computer simulation of the two body abrasive wear process. (2002) Naicker, Theo.; Murrell, Hugh Crozier.
New computer technologies are applied to the classical materials engineering two-body abrasive wear process. The computer simulation provides an interactive and visual representation of the wear process. The influence of grit size, grit tip radius and load (at constant workpiece hardness and tool path) on the wear rate, wear coefficient and wear surface topography is predicted. The simulation implements microcutting and microploughing with material displacement to the sides of the groove. The simulation is validated by comparison with the previous modelling literature and with experiments.
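The microcutting and microploughing mechanisms named above can be caricatured on a one-dimensional surface cross-section. This is a heavily simplified sketch with invented parameters, not the thesis's simulation: material under a circular grit tip is removed, part of it leaves as debris (microcutting) and the rest is piled onto the groove shoulders (microploughing).

```python
import math

def groove_pass(surface, centre, tip_radius, depth, cut_fraction=0.5):
    """One pass of a single grit over a 1-D surface cross-section.
    Heights share units with tip_radius; returns the debris (wear) volume."""
    half = int(math.sqrt(tip_radius ** 2 - (tip_radius - depth) ** 2))
    removed = 0.0
    for dx in range(-half, half + 1):
        x = centre + dx
        if 0 <= x < len(surface):
            # Height of the circular tip's lower edge above position x.
            tip_bottom = (tip_radius - depth) - math.sqrt(tip_radius ** 2 - dx ** 2)
            if surface[x] > tip_bottom:
                removed += surface[x] - tip_bottom
                surface[x] = tip_bottom           # carve the groove
    ridge = removed * (1.0 - cut_fraction) / 2.0  # microploughing share
    for shoulder in (centre - half - 1, centre + half + 1):
        if 0 <= shoulder < len(surface):
            surface[shoulder] += ridge            # pile-up at the sides
    return removed * cut_fraction                 # microcutting share

# A flat surface scored once by a grit of tip radius 5 at depth 1.
surface = [0.0] * 21
debris = groove_pass(surface, centre=10, tip_radius=5.0, depth=1.0)
```

Sweeping such passes along a tool path while varying grit size, tip radius and load-derived depth would give the kind of wear-rate and topography trends that the thesis investigates far more rigorously.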