Academic journal article The Qualitative Report

Computer Aided Phenomenography: The Role of Leximancer Computer Software in Phenomenographic Investigation

Article excerpt

Introduction and Background

As students enter tertiary study with varying levels of literacy, tertiary institutions provide a variety of support mechanisms and programs to develop academic literacy skills (e.g., reading, writing). These are predominantly directed towards correcting problems evidenced in students' early work. Although this has its use, students also need to be encouraged to be proactive in raising and refining their academic literacy skills before submitting their first assignments. Students will only access such literacy support, however, if they understand what literacy entails and recognise that their own levels of skill could be further enhanced. To this end, a study was undertaken to identify the conceptions of a range of terms relevant to academic literacy held by beginning university students enrolled in a teacher education degree program at an Australian tertiary institution.

The relatively large number of written responses (274) prompted me to experiment with a computer-aided form of analysis to streamline the traditional manual phenomenographic analysis process. Reporting on both the survey data and this methodological experiment proved lengthy, so the account was divided into two papers. In this paper I report only on the two methods of lexicological analysis, using one set of data for illustration. The full data of the study are to be reported elsewhere (Penn-Edwards, 2009).

The data from a recent survey to ascertain how beginning pre-service education students conceive of the phenomenon of literacy were analysed using phenomenographic techniques as described by Marton (1994). Phenomenography is a qualitative research approach that aims to capture and analyse lexigraphically subjects' qualitative observations and perceptions of events and propositions. It emerged from studies undertaken by Ference Marton and the Gothenburg School in the early 1980s which focused on the experiences of learning and teaching.

Phenomenography is grounded in a distinct theoretical framework with an accompanying research methodology. It is a qualitative exploration of how a specific phenomenon is experienced by a group of people, each of whom may perceive the phenomenon from a different standpoint. In education, the phenomenon under study may be a process or an act, such as that of learning or teaching. Here, researchers seek to qualitatively describe the subjects' expressed understandings or "accounting practices" (Saljo, 1997, p. 184) of the process. In phenomenography, various tools of inquiry can be used to collect data in order to investigate subjects' conceptions of the phenomenon, with the aim being to "describe differences between conceptions" (Dahlgren & Fallsberg, 1991, pp. 151-152). In this study the researcher explores how beginning tertiary students in Education programs at an Australian university conceptualise the phenomenon of literacy; that is, what they think the role of literacy is in learning and education.

The phenomenographic analysis of data is usually undertaken by manually sorting concepts inferred from transcripts into descriptive categories. This process is "a strongly iterative and comparative one, involving the continual sorting and resorting of data, plus ongoing comparisons between the data and the developing categories of description, as well as between the categories themselves" (Akerlind, 2005, p. 324). The objective is to develop a coherent visual mapping, or outcome space, comprising the minimum number of categories that includes all the variations in the data while also demonstrating internal consistency. The sorting process is time consuming, but whilst it is seen as necessary to the iterative and comparative method, it also provides an opportunity for analysts to become immersed in the data and so better order and identify categories of description. The amount of data generated by lengthy or multiple interviews can be overwhelming, but the development of a computer software package, Leximancer, would appear to offer a fast, efficient method of sorting large amounts of transcribed data and identifying expressed concepts. …
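Leximancer's own concept-identification algorithm is proprietary and is not described here, but the general idea it automates (surfacing frequently occurring content words, and the co-occurrence of words within the same response, as candidate concepts) can be sketched in a few lines. The sample responses, stop-word list, and counts below are purely hypothetical illustrations, not data or parameters from this study:

```python
# A minimal, naive sketch of machine-identified "concepts": count frequent
# content words across responses and the co-occurrence of word pairs within
# a response. This is NOT Leximancer's algorithm, only an illustration.
from collections import Counter
from itertools import combinations

# Hypothetical survey responses (invented for illustration)
responses = [
    "literacy means being able to read and write well",
    "reading and writing are skills for learning",
    "literacy is communication through reading and writing",
]

# A toy stop-word list; real tools use much larger, curated lists
STOPWORDS = {"and", "are", "is", "the", "to", "for", "a", "being",
             "able", "means", "well", "through"}

def tokens(text):
    """Return lower-cased content words from one response."""
    return [w for w in text.lower().split() if w not in STOPWORDS]

# Frequency of candidate concept words across all responses
freq = Counter(w for r in responses for w in tokens(r))

# Co-occurrence of word pairs appearing in the same response
cooc = Counter()
for r in responses:
    for pair in combinations(sorted(set(tokens(r))), 2):
        cooc[pair] += 1

print(freq.most_common(3))
print(cooc.most_common(2))
```

A researcher would still have to interpret such output qualitatively; the software only accelerates the mechanical sorting that the manual process performs by hand.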
