Electronic Text: Investigations in Method and Theory

Since the 1950s, when Roland Barthes re-expressed the formalist ideal of an open-ended text, there has been much interest among literary critics and theorists of text in the question of what text is and what it gives us access to. The computer storage and electronic dissemination of texts adds a new controversy to the debate: what is the significance of the electronic text for the representation and transmission of knowledge? In its function as a store of multiple texts and in its capacity to weave, unweave, and reweave text, the computer lends itself to a variety of later twentieth-century theoretical and cultural practices, from the decomposing strategies of deconstructive criticism to the data-dense contextualism of postmodern criticisms coming from new historicism, cultural anthropology, and post-Marxism. The contributors to this book examine the impact of electronic technology on literary and textual studies. They ask how the computer is being used to reshape ideas of text, of authorship, of a literary canon, and of authenticity and value as embodied in the edited work. They combine approaches from literary theory, the philosophy of text, feminist theory, and textual criticism. Topics include interactive Shakespeare, the poetry of Laetitia Landon, Mark Twain and hypertext, and the Mighty Morphin' Power Rangers.


Allen Renear

Many different crafts and disciplines concern themselves with texts, books, documents, and the like. Some, such as literary criticism and the philosophy of art, are recognized as intrinsically involved, at least in part, in developing a theoretical knowledge of textuality. Other more practical activities, such as publishing, office automation, textual editing, engineering text processing software, managing publishing systems, and developing textbases and digital libraries, while quite intimately involved with text, are typically not only not thought of as being sources of much theoretical insight into textuality, but may seem to have little theoretical content at all.

This essay focuses on one of these latter practical pursuits: computer text processing and, more specifically, computer text encoding. I hope to show two things. First, that the particular community which has been designing and configuring computer text-processing and encoding systems has evolved a rich body of illuminating theory about the nature of text--theory that is useful not only to anyone who would create, manage, or use electronic texts, but also to anyone who would, more generally, understand electronic textuality from a theoretical perspective. Second, and more ambitiously, I hope to suggest that the significance of this body of theory and analysis extends well beyond the specific concerns of text processing and text encoding and contributes directly to our understanding of the deepest issues of textuality and textual communication in general.

In fact, computer text processing and text encoding provide, I think, a much-needed fresh perspective on textuality, a perspective which is deeply interdisciplinary and which is largely driven by very practical problems and projects. For the most part, however, I will focus in what follows on understanding these theorizing practitioners in their own terms, so that . . .
