Citation: “Transcendental Data: Toward A Cultural History and Aesthetics of the New Encoded Discourse.” Critical Inquiry 31 (2004): 49-84.

  • DOI: 10.1086/427302
  • Full text (post-embargo published version in institutional repository, PDF)
  • Full text (paywalled, JSTOR)
  • Full text (paywalled, Univ. of Chicago Press Journals)

Beginning of essay (pages 49-51)

Whether one writes fiction or business reports, prepares lectures or sales presentations, publishes works stored in a library or a commercial database—whatever, in fact, one’s domain of authoring might be—the chances are that one is already producing content that somewhere along the route of its transmission takes the form of a uniquely contemporary kind of discourse: encoded or structured discourse, in the technical sense of digital text encoding and structured markup. At its most local, such encoding or markup shows up in the copyedited manuscripts that authors now see from publishers, which instead of notes to the designer in the old style of “18 pt. heading” (and so on) provide pure logical descriptors or “tags” keyed to house style—for example, “Chapter Title.” At its most global, a bewildering variety of the world’s documents and media have in the recent past been encoded in, or are managed by, standardized text-based markup schemes (especially XML, or Extensible Markup Language) that include descriptors for everything from textual or multimedia content to such metadata as author, date, section, and so on. Alternatively, such documents and media have been entered in databases that hold content in tables, records, and fields exportable into XML.
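The contrast the essay draws here—logical descriptors and metadata rather than visual formatting instructions—might be sketched in a minimal XML fragment such as the following. The element names (chapter, chapterTitle, and so on) are illustrative inventions, not drawn from any actual publisher’s house schema:

```xml
<!-- A hypothetical structured-markup fragment: the tags describe
     what each piece of content IS (a title, an author, a paragraph),
     not how it should look ("18 pt. heading"). -->
<chapter>
  <meta>
    <author>A. Author</author>
    <date>2004</date>
    <section>1</section>
  </meta>
  <chapterTitle>Opening Chapter</chapterTitle>
  <para>Body text of the chapter goes here.</para>
</chapter>
```

Presentation (font, point size, layout) is then supplied separately by a stylesheet keyed to these logical names, which is what allows the same encoded content to be routed to print, web, or database storage.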
This entire collection of databases and text markup languages has so far remained largely hidden from individual writers and readers because it is first being implemented at the institutional level. An increasing number of businesses, publishers, booksellers, university libraries, and digital text archives now use databases and XML to manage the jostling, dynamic bundle of data objects we once called books, articles, reports, or songs. But now that XML is being integrated into standard enterprise and personal productivity software (including Microsoft’s Office 2003 suite), ordinary authors and readers—especially those working in institutional settings—will be influenced as well. Authors and readers will join with their institutions to complete a new discursive circuit we might call, updating Friedrich Kittler’s media analysis, discourse network 2000.

Though the problem of reading the new discourse—that is, browsing—is intriguing in its own right, I will concentrate in this essay on the originating end of the transmission act—authoring. What will discourse network 2000 mean for the act of authoring?
