Forensic computing strategies for ethical academic writing.

Date

2009

Abstract

This study resulted in the creation of a conceptual framework for ethical academic writing that can be applied to cases of authorship identification. The framework is the culmination of research into various other forensic frameworks and aspects of cyber forensics, undertaken to ensure the maximum effectiveness of the newly developed methodology. The research shows how synergies between forensic linguistics and electronic forensics (computer forensics) create the conceptual space for a new interdisciplinary field, cyber forensic linguistics, along with forensic auditing procedures and tools for authorship identification. The research also shows that an individual’s unique word-pattern usage can be used to determine document authorship, and that in other instances authorship can be attributed with a significant degree of probability using the identified process. The importance of this fact cannot be overstated, because accusations of plagiarism have to be based on facts that will withstand cross-examination in a court of law. Forensic auditing procedures are therefore required when attributing authorship in cases of suspected plagiarism, which is regarded as one of the most serious problems facing any academic institution. This study identifies and characterises various forms of plagiarism, as well as the responses that can be implemented to prevent and deter it. A number of online and offline tools for the detection and prevention of plagiarism are identified, over and above the more popular tools which, in the author’s view, are overrated because they rely on mechanistic identification of word similarities in source and target texts rather than on proper grammatical and semantic principles. Linguistic analysis is not well understood and is often underestimated, yet it is a critical field of inquiry in determining specific cases of authorship. The research identifies the various methods of linguistic analysis that could be applied to help establish authorship identity, as well as how they can be applied within a forensic environment. Various software tools that can be used to identify and analyse plagiarised source documents are also identified and briefly characterised. Concordance, function word analysis and other methods of corpus analysis are explained, along with some of their related software packages. Corpus analysis that would once have taken months to perform manually can now take a matter of hours with the correct programs, given the availability of computerised analysis tools. This research integrates the strengths of these tools within a structurally sound forensic auditing framework; the result is a conceptual framework that encompasses all the pertinent factors and ensures admissibility in a court of law by adhering to the strict rules and features characteristic of the legal requirements for a forensic investigation.
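
To illustrate the kind of function word analysis the abstract refers to, the following minimal Python sketch compares relative function-word frequencies between texts. The word list, tokenisation and distance measure here are illustrative assumptions for demonstration only, not the procedure developed in the thesis.

from collections import Counter
import math
import re

# A small, illustrative set of English function words; a forensic study
# would use a much larger, validated list.
FUNCTION_WORDS = ["the", "of", "and", "to", "a", "in", "that", "is",
                  "was", "for", "it", "with", "as", "but", "on", "not"]

def profile(text):
    # Relative frequency of each function word in the text.
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter(tokens)
    total = len(tokens) or 1
    return [counts[w] / total for w in FUNCTION_WORDS]

def distance(text_a, text_b):
    # Euclidean distance between two function-word profiles;
    # smaller values suggest more similar authorial habits.
    pa, pb = profile(text_a), profile(text_b)
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(pa, pb)))

# Usage: compare a questioned text against writing samples from two
# candidate authors and treat the closer profile as the more probable one.
if __name__ == "__main__":
    questioned = "It was the best of times, it was the worst of times."
    candidate_a = "It is a truth universally acknowledged that a single man is in want of a wife."
    candidate_b = "Call me Ishmael. Some years ago, never mind how long precisely, I went to sea."
    print(distance(questioned, candidate_a), distance(questioned, candidate_b))

In practice such a comparison would rest on much larger corpora and on statistically grounded measures, as the thesis's discussion of corpus analysis tools indicates.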

Description

Thesis (M.Com.)-University of KwaZulu-Natal, Westville, 2009.

Keywords

Ethics., Academic writing--Data processing., Theses--Information systems and technology.
