Friending the Past: The Sense of History in the Digital Age. Chicago: University of Chicago Press, 2018 (forthcoming November 2018).

Cover of Alan Liu, Friending the Past
Catalogue Copy

Can today’s society, increasingly captivated by a constant flow of information, share a sense of history? How did our media-making forebears balance the tension between the present and the absent, the individual and the collective, the static and the dynamic—and how do our current digital networks disrupt these same balances? Can our social media, with its fleeting nature, even be considered social at all?

In Friending the Past, Alan Liu proposes fresh answers to these innovative questions of connection. He explores how we can learn from the relationship between past societies whose media forms fostered a communal and self-aware sense of history—such as prehistorical oral societies with robust storytelling cultures, or the great print works of nineteenth-century historicism—and our own instantaneous present. He concludes with a surprising look at how the sense of history exemplified in today’s JavaScript timelines compares to the temporality found in Romantic poetry.

Interlaced among these inquiries, Liu shows how extensive “network archaeologies” can be constructed as novel ways of thinking about our affiliations with time and with each other. These conceptual architectures of period and age are also always media structures, scaffolded with the outlines of what we mean by history. Thinking about our own time, Liu wonders if the digital, networked future can sustain a similar sense of history.

[See abstracts for book and chapters]

 

Table of Contents

Acknowledgments

Introduction (abstract)

1          Friending the Past (abstract)

2          Imagining the New Media Encounter (abstract)

3          When Was Linearity? (abstract)

4          Remembering Networks (abstract)

5          Like a Sense of History (abstract)

Appendix: Hypothetical Machine-Learning Workflow for Studying the Sense of History

 

Book Abstract

Friending the Past asks if today’s society, increasingly captivated by up-to-the-minute information media, can have a sense of history. What is the relation between past societies whose media forms fostered a communal or self-aware sense of history—for example, storytelling in prehistorical oral societies, or the great print works of historicism in the nineteenth century—and today’s “instant” networked information society? How did the sense of history once balance between the feeling for the present and for the absent, the temporal and the social, the individual and the collective, and the static and the dynamic? And how do digital networks now change the balance? Blending the approaches of intellectual history, media studies, and digital humanities, the book proposes novel ways of thinking about the evolving sense of history. Topics include the relation between high-print historicism and social networking; narratives of “new media encounters” between societies; graphically visualized and conceptualized understandings of history; and “network archaeology” as the variant of media archaeology needed to grasp the networked texture of our contemporary feeling for history. At its close, the book calls the question: is there a sense of history in the digital, networked age? The book concludes with an example of what a digitally networked sense of history can be by examining (in a manner poised between “close reading” and “distant reading”) the code of one of today’s JavaScript “timelines” and comparing it to the experience of temporality encoded in William Wordsworth’s poetry during the era of romanticism.

Book Keywords:
digital humanities, historicism, history, information society, media, media archaeology, networks, romanticism, temporality, timelines

 


Chapter Abstracts

Introduction

Written fictionally in the voice of today’s “sense of history,” the introduction frames the central question of the book: in the age of digital media, digital networks, social networking, and data, can society have a sense of history comparable to that which characterized earlier eras of history and media? Speaking in character, like the chorus in a Greek tragedy, the Sense of History reframes the problem as the transition from an older, interconnected circuit of meaning-making acts—“rhetoric, representation, interpretation”—to a later one: “communication, information, media.”

Chapter Keywords: communication, data, digital, information, interpretation, media, networks, representation, rhetoric, sense of history

 


Chapter 1: Friending the Past

Chapter 1 studies the change from prior senses of history to today’s “real time” sense of history—or instant sense of community—of social networks. How was the equivalent of a sense of history experienced, and mediated, in prehistorical oral cultures? How did print culture at the height of the history of the book, which coincided with narrative historicism in the mode of Leopold von Ranke (Historismus), alter the sense of history? And how do “Web 2.0” and social networking today yet again change the sense of history? Can today’s society “friend” past ones to imagine, and absorb, prior senses of history as a layered, enrichening texture of the present? What continuities—for example, of Internet transmissions following the routes once forced by imperial roads across conquered lands—lock the digital present to its historical past? But, also, what discontinuities allow past historicism and today’s information empire to challenge each other’s assumptions, thus enabling a more humane texture of the present mindful of the past?

Chapter Keywords: historicism, Historismus, history of the book, Leopold von Ranke, media determinism, narrative, oral culture, print culture, social networking, web 2.0

 


Chapter 2: Imagining the New Media Encounter

This chapter studies “narratives of new media encounter” (accounts of how individuals and societies react to the introduction of writing, radio, television, the Internet, Web 2.0, and so on) to suggest that major historical changes in the sociocultural order are mirrored in narratives of media history. Often, as in the case of Marshall McLuhan’s writings, such narratives follow a plot of progressivist media determinism—of necessary change from old media to new media—even as they also reveal the more ambivalent experience of a “contact zone” between civilizations. At once descriptive and interpretive, tales of new media encounter are a foundational form of media theory—a kind of media archaeology of media theory. They show how societies experience history as communication and information media, and communication and information media as history. They register the experience of history as media history. Finishing on the promising example of a recent collection of essays on the digital humanities, the chapter concludes by asking the critical question: what is an imaginatively enrichening rather than determinist and constraining narrative of new media encounter?

Chapter Keywords: contact zone, digital humanities, Marshall McLuhan, media archaeology, media determinism, media history, media theory, narrative, new media, old media

 


Chapter 3: When Was Linearity?

Linearists, as they might be called, have staked deep claims of cultural and other value on the linear exposition of history, narrative, argument, and other forms of thought. Theorists of networks, hypertext, and other domains of today’s digital era stake equally significant claims on the nonlinear, often represented emblematically in network-style or other postlinear graphical visualizations. Indeed, they often elevate the importance of graphical knowledge in general. Informed by media history extending from oral culture and the history of the book to digital new media, this chapter asks the simplifying question: what if there never was any linearity to defend or to contest? What if the idea of linearity has always been an ideology deployed through graphical knowledge systems that are realized in graphics as the visualization of any era’s idea of authoritative linearity—for example, who gets to go to the front of a line and why—and ultimately of its sense of history? The chapter makes Wallace Stevens’s poem “The Idea of Order at Key West” (with its invocation of “meaningless plungings” yet also visualization of seas “portioned” into fixed “emblazoned zones”) a recurrent poetic touchstone of its argument–in part by using digital humanities text analysis methods to render the poem as visualizations.

Chapter Keywords: digital humanities, graphics, history of the book, ideology, linearity, media history, networks, oral culture, Wallace Stevens, visualization

 


Chapter 4: Remembering Networks

Chapter 4 begins with the paradigmatic instance of a hybrid print/digital work at the onset of the digital networked era—Agrippa (A Book of the Dead) by Dennis Ashbaugh, Kevin Begos, Jr., and William Gibson (1992)—to call for a method of “network archaeology” extending media archaeology. Network archaeology facilitates understanding the sense of history in our postlinear age of digital networks filled with buzzing, flitting, ephemeral, and dynamic artifacts that make a mockery of archiving yet urgently require methods not just of archiving but of open, transparent archiving. Past eras created networked artifacts and systems in their own way. The chapter braids together research on web archiving, scientific workflows (data-analysis workflows facilitating reproducible research), data provenance, and digital humanities prosopography to make the case for remembering networks through new digital archiving methods. Remembering networks, it argues, is foundational for providing our networked age with its appropriate, distinctive sense of history.

Chapter Keywords: Agrippa (A Book of the Dead), media archaeology, network archaeology, networks, prosopography, provenance, reproducible research, scientific workflows, web archiving, William Gibson

 


Chapter 5: Like a Sense of History

This concluding chapter defines the sense of history of any era or culture as a set of parameters—ontological, epistemological, socio-historical, and others—that can be studied through a combination of close reading and digital humanities distant reading. Splitting the difference between close and distant reading, the chapter studies visualized “timelines” as a traditional mode of distant reading history (analyzing and visualizing long vistas of historical event). Then, to define the sense of history specific to the internet age, it “close reads” at the code level an influential contemporary form of history: digital timelines. Focusing on the genre of JavaScript digital timelines, which dynamically draw data from backend sources to populate the “document object model” (DOM) of web-based timelines in frontend interfaces, the chapter postulates that the digital era is characterized by its own sense of history—one attuned to the contingency of networks. Setting this contingent sense of history in relief against that of an earlier era, the chapter ends by comparing the TimelineJS JavaScript timeline in particular to the time sense, and implicit timelines, in William Wordsworth’s poetry and romanticism. Code meets poetry at a junction between the internet era and the humanities.

Chapter Keywords: close reading, contingency, digital humanities, distant reading, JavaScript, networks, romanticism, sense of history, timelines, William Wordsworth
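To make concrete the general pattern the chapter describes (a frontend script that draws event data from a backend source and populates the DOM of a web-based timeline), the following is a minimal, hypothetical sketch in TypeScript. The endpoint path, data shape, and element id are illustrative assumptions; this is not TimelineJS’s actual API, which the chapter itself reads at the code level.

    // Hypothetical sketch: fetch timeline events from a backend source and
    // populate the DOM of a web-based timeline. Endpoint, data shape, and
    // element id are assumptions for illustration, not TimelineJS's API.

    interface TimelineEvent {
      year: number;       // year of the event
      headline: string;   // short title shown on the timeline
      text: string;       // longer description
    }

    async function renderTimeline(containerId: string, endpoint: string): Promise<void> {
      const container = document.getElementById(containerId);
      if (!container) throw new Error(`No element with id "${containerId}"`);

      // Backend source: a JSON array of events in the assumed shape above.
      const response = await fetch(endpoint);
      const events: TimelineEvent[] = await response.json();

      // Populate the DOM: one list item per event, in chronological order.
      const list = document.createElement("ol");
      for (const ev of events.sort((a, b) => a.year - b.year)) {
        const item = document.createElement("li");
        item.textContent = `${ev.year}: ${ev.headline}. ${ev.text}`;
        list.appendChild(item);
      }
      container.replaceChildren(list);
    }

    // Usage (hypothetical endpoint and container id):
    // renderTimeline("timeline", "/api/events");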

The following writings by Alan Liu are available online in open-access full text form.

Citation: “Drafts for Against the Cultural Singularity (book in progress).” Alan Liu, 2 May 2016. doi:10.21972/G2B663. https://liu.english.ucsb.edu/drafts-for-against-the-cultural-singularity

The following is draft work (notes and bibliography not included) from one of my books in progress tentatively titled Against the Cultural Singularity: Digital Humanities & Critical Infrastructure Studies. Excerpted are a few portions from the beginning of the manuscript that bear on the critical potential of the digital humanities and critique.

For a talk including this material as well as additional excerpts from my book in progress, see the video recording of my contribution to the Workshop on “Frontiers of DH: Humanities Systems Infrastructure,” University of Canterbury, 12 November 2015 (delivered as part of a series in New Zealand during my Fulbright Specialist residency at U. Canterbury, October-November 2015).

2 May 2016

My aim in this book is to make a strategic intervention in the development of the digital humanities.  Following up on my 2012 essay, “Where is Cultural Criticism in the Digital Humanities?”, I call for digital humanities research and development informed by, and able to influence, the way scholarship, teaching, administration, support services, labor practices, and even development and investment strategies in higher education intersect with society, where a significant channel of the intersection between the academy and other social sectors, at once symbolic and instrumental, consists in shared but contested information-technology infrastructures.  I first lay out in the book a methodological framework for understanding how the digital humanities can develop a mode of critical infrastructure studies.  I then offer a prospectus for the kinds of infrastructure (not only research “cyberinfrastructures,” as they have been called) whose development the digital humanities might help create or guide.  And I close with thoughts on how the digital humanities can contribute to ameliorating the very idea of “development”–technological, socioeconomic, and cultural–today.

Method (1)

The first step–framing for the digital humanities a suitable methodology for critical digital infrastructure studies–is challenging, given that the digital humanities are maturing after the late twentieth-century bloom of humanities “theory” and “cultural criticism,” which I here group together (grosso modo) under the name “critique.” The latecomer status of the digital humanities in this regard is epitomized in the field’s debate a few years ago about “hack versus yack.” Should digital humanists primarily program, build, or make (hack)? Or should they instead critically interpret and theorize information media, past and present, in a manner much like normative humanities research (yack)? At core, the debate is not really about theorized critique versus something other than such critique. Instead, the debate situates the digital humanities at a fork between two branches of late humanities critique. One, a hack branch (sometimes referred to as “critical making”), affiliates with, but is often more concretely pragmatic than, “thing theory,” the new materialism, actor-network theory, assemblage theory, and similar late poststructuralist theories. The other, a yack branch, descends from the not unrelated critical traditions of Frankfurt School “critical theory,” deconstruction, Foucauldian “archaeology,” cultural materialism, postcolonial theory, and gender and race theory–especially as all these have now been inflected by media studies.

In short, the question is not whether the digital humanities should include theorized critique.  At some level, and especially in some branches, the field already does by virtue simply of belonging to the family of the contemporary humanities.  Instead, the question is what sort of critique is uniquely appropriate and purposive for the digital humanities.  What critique, in other words, not only allows the field to assist mainstream humanities critique but could not be conducted except through digital humanities methods that use technology self-reflexively as part of the very condition, and not just facility, of critically knowing and acting on culture today?

The answer to this question, I suggest, is critique at the level of, and articulated through, infrastructure–where “infrastructure,” the social-cum-technological milieu that at once enables the fulfillment of human experience and enforces constraints on that experience, today has much of the same scale, complexity, and general cultural impact as the idea of “culture” itself.  Indeed, it may be that in late modernity when the bulk of life and work occurs in organizational institutions of one kind or another, the experience of infrastructure at institutional scales (undergirded by national or regional infrastructures such as electricity grids and global-scale infrastructures such as the Internet) is operationally the experience of “culture.”  Put another way, the word “infrastructure” can now give us the same kind of general purchase on social complexity that Stuart Hall, Raymond Williams, and others sought when they reached for their all-purpose word, “culture.”  Consider the way dystopian films produced at the onset of the digital information age such as Blade Runner (1982) and the Mad Max films (beginning in 1979) characterized whole cultures by foregrounding infrastructure–in the former: glistening, noir cityscapes defined by transportation and media technology; in the latter: desert landscapes defined by fuel and water supply systems. Those films gave a foretaste of the way late-modern infrastructure is increasingly the mise-en-scène of culture.  Daily life steeps us in pervasive encounters with transportation, media, and other infrastructures that do not just neutrally convey the experience of culture but are visibly parts of our cultural experience. Late modernity is thus car culture, cable TV culture, Internet culture, smartphone culture, and any other kind of “cool” culture where, as I studied in my Laws of Cool, “cool” is a cultural affect of both “smart” technologies and the knowledge workers who use them to be, or at least look, smart.

The consequence of such convergence between infrastructure and culture for critique may be predicted as follows: especially in the digital humanities, critique must now begin to focus on infrastructure in order to have any hope of creating tomorrow’s equivalents of the great cultural-critical statements of the past. Tomorrow’s E. P. Thompson writing about the making of the working class, C. Wright Mills about white collars, Raymond Williams about culture and society, Michel Foucault about discipline, Judith Butler about gender and performativity, Donna Haraway about cyborgs, or Homi Bhabha about hybridity–among many more who could be cited–will need to include in their critiques attention to infrastructure as that cyborg being whose making, working, disciplining, performance, gender formation, and hybridity are increasingly part of the core identity of late modern culture.

What would the method for such a digital humanities cultural criticism focused on infrastructure actually look like? [material elided here]  . . . [P]rosaically, the style of digital humanities infrastructural critique I imagine–one that takes advantage of modes of thinking already prevalent in the field–may be called lightly-antifoundationalist.  The question that I concoct this phrase to answer is how much antifoundationalism–or, perhaps “anti-groundwork” (to allude to Marx’s Grundrisse der Kritik der Politischen Ökonomie)–is actually useful for critical infrastructure studies.  Mainstream humanistic critique (the “hermeneutics of suspicion” that Rita Felski has recently taken to task in her critique of critique) has often been antifoundationalist all the way down according to a three-stage logic that might be outlined as follows.

In its first logical moment, critique recognizes that the “real,” “true,” or “lawful” groundwork (i.e., infrastructure) for anything, especially the things that matter most to people, such as the allocation of goods or the assignation of identity, is ungrounded.  For example, while there are material reasons for resource allocation and the social relations of force needed to do that dirty deed–i.e., for political economy and society–any particular political economy and society are arbitrary and, in the last analysis, unjust.  Political economy and society are thus not grounds but, to play on the word, precisely groundworks: particular ways of working the ground (i.e., a mode of production) supported by discursive, epistemic, psychic, and cultural institutions for ensuring that the work continues in the absence of rational or moral foundation.

In its second logical moment, critique then goes antifoundationalist to the second degree by criticizing its own standing in the political-economic system–a recursion effect attested in now familiar, post-May-1968 worries that critics themselves are complicit in elitism, “embourgeoisement,” “recuperation,” “containment,” and majoritarian identity, not to mention tenure.

Finally, in its third logical moment, critique seeks to turn its complicity to advantage–for example, by positioning critics as what Foucault called embedded or “specific intellectuals” acting on a particular institutional scene to steer social forces.  A related idea is to go “tactical” in the manner theorized by Michel de Certeau, who argued that people immured in any system can appropriate that system’s infrastructure through bottom-up agency for deviant purposes (as in his paradigm of jaywalking in the city).  Media critics, including new media critics, have generalized de Certeau’s notion in the name of “tactical media,” meaning media whose platforms, channels, interfaces, and representations can be appropriated by users for alternative ends.

In general, the digital humanities tend to do things according to methods that slice out from the above total arc of critique just the latter tactical moment. Such slicing–hacking critique to sever its roots from purist antifoundationalism–brings digital humanities critique into the orbit of several late- or post-critical approaches with a similar style (style rather than full-blown theory precisely because they eschew foundational purity). One approach that James Smithies has associated with the digital humanities is “postfoundationalism” in his “Digital Humanities, Postfoundationalism, Postindustrial Culture.” Borrowing from the philosopher of science Dimitri Ginev, Smithies argues that postfoundationalism is “an intellectual position that balances a distrust of grand narrative with an acceptance that methods honed over centuries and supported by independently verified evidence can lead, if not to Truth itself, then closer to it than we were before” (¶ 26). Postfoundationalism is thus well matched to the digital humanities, Smithies suggests, if we think of the digital humanities as “a process of continuous methodological and . . . theoretical refinement that produces research outputs as snapshots of an ongoing activity rather than the culmination of ‘completed’ research” (¶ 29). A related idea is “critical technical practice,” which Michael Dieter (“The Virtues of Critical Digital Practice”)–building on Philip Agre’s writings on artificial intelligence research–makes a goal of the digital humanities. Dieter quotes from Agre: “The word ‘critical’ here does not call for pessimism and destruction but rather for an expanded understanding of the conditions and goals of technical work. . . . Instead of seeking foundations it would embrace the impossibility of foundations, guiding itself by a continually unfolding awareness of its own workings as a historically specific practice.” Other ideas that are lightly-antifoundationalist in this way, though not to my knowledge yet applied to the digital humanities, include Bruno Latour’s “compositionism” (fixed on neither absolute foundations of knowledge nor absolutist refutations of such foundations but instead on mixed, impure, make-do, and can-do compositions of multiple positions; “An Attempt at a ‘Compositionist Manifesto’,” PDF) and Ackbar Abbas and David Theo Goldberg’s “poor theory” (which uses “tools at hand” and “limited resources” to engage “with heterogeneous probings, fragmentary thinking, and open-endedness” in resistance to “totalization, restriction, and closure”) (“Poor Theory: Notes Toward a Manifesto”, PDF).

All these lightly-antifoundationalist approaches are tactical rather than strategically pure because their very potential for critique arises from polluting proximity to, and sometimes even partnership with, their objects of critique.  Unlike distantiated critique, that is, tactical critique (as the root of the word “tactic” might indicate) makes contact.  Smithies thus notes postfoundationalism’s function as a “bridging concept” for the “interdependence” and “entanglement” of the digital humanities with postindustrialism (¶ 8, 3, 2).  Indeed, I add that all the approaches thus far mentioned as a “light foundation” for critical infrastructure studies are similarly contaminated by the double principle of efficiency and flexibility, which (as I articulated in my The Laws of Cool) is the two-stroke engine of the postindustrial mode of production.  As it were, all the approaches I have mentioned are instances of “lean” and “just-in-time” critique and thus not dissimilar in spirit to the in-house critique that postindustrial corporations at the end of the twentieth century began to design into their own production lines by famously empowering workers to “stop the line” ad hoc (or, less catastrophically, to suggest incremental improvements) when they saw something wrong.  Such dirty contact with postindustrialism is both the weakness and strength of lightly-antifoundationalist approaches, where weakness means being swallowed up by the system and strength comes from getting close enough to the system to know its critical points of inflection, difference, and change.  If, as Smithies says, the digital humanities are “deeply entangled” in postindustrialism, in other words, entanglement need not be the same as equivalence.  It is also engagement.

The critical potential of this tendency in the digital humanities to be lightly-antifoundationalist can now be stated: it is precisely the ability to treat infrastructure not as a foundation, a given, but instead as a tactical medium that opens the possibility of critical infrastructure studies as a mode of cultural studies.  And it is such cultural studies that will allow the digital humanities to fulfill their final-cause critical function at the present time, which is to help adjudicate how academic infrastructure connects higher education to, but also differentiates it from, the workings of other institutions in advanced technological societies.  The critical function of the digital humanities going forward, in other words, is to assist in shaping smart, ethical academic infrastructures that not only further normative academic work (research, pedagogy, advising, administration, etc.) but also intelligently transfer some, but not all, values and practices in both directions between higher education and today’s other powerful institutions–business, law, medicine, government, the media, the creative industries, NGOs, and so on.

Method (2)

At present, some of the most influential general understandings of infrastructure cited by digital humanists studying humanities cyberinfrastructure in particular, such as Sheila Anderson and James Smithies, have been the Large Technical Systems (LTS) approach, stemming originally from the historian Thomas Hughes’s Networks of Power (1983), and the information-ethnography approach stemming from Susan Leigh Star, Geoffrey Bowker, and their circle. Good expositions of both are combined in one of the best conceptualizations of infrastructure I have so far found: a 2007 document titled “Understanding Infrastructure: Dynamics, Tensions, and Design” (PDF) (whose authors include Bowker), the final report to the National Science Foundation of a workshop it sponsored.

Adding to these general approaches to infrastructure, I borrow in this book another portfolio of thought that to my knowledge has not yet been introduced directly to infrastructure studies. It is also a portfolio largely unknown in the digital humanities and, for that matter, in the humanities as a whole, even though it is broadly compatible with humanities cultural criticism. The portfolio consists of the “neoinstitutionalist” approach to organizations in sociology and the highly consonant “social constructionist” (especially “adaptive structuration”) approaches to organizational infrastructure in sociology and information science. Taken together, these approaches explore how organizations are structured as social institutions by so-called “carriers” of beliefs and practices (i.e., culture), among which information-technology infrastructure is increasingly crucial. Importantly, these approaches are a social-science version of what I have called lightly-antifoundationalist. Scholars in these areas “see through” the supposed rationality of organizations and their supporting infrastructures to the fact that they are indeed social institutions with all the irrationality that implies. But they are less interested in exposing the ungrounded nature of organizational institutions and infrastructures (as if it were possible to avoid or get outside them) than in illuminating, and pragmatically guiding, the agencies and factors involved in their making and remaking. Such approaches are thus inherently a good match for the epistemology of building, unbuilding, and rebuilding in the digital humanities.

More than a good match, neoinstitutionalism and the social science of organizational technologies offer exactly the right tactical opening for a digital humanities cultural criticism because they are all about the site on which the already existing critical force of the digital humanities is pent up: institutional forms of technologically-assisted knowledge work. After all, the digital humanities stand in contrast to new media studies and network critique among cousin fields as the branch of digitally-focused humanities work that has been primarily concerned with changing research, authorship, dissemination, and teaching inside (and across) academic institutions and related cultural or heritage institutions rather than with broader commentary directed externally at society and social justice. The digital humanities are all about developing analytical, publishing, curatorial, and hybrid-pedagogical tools and practices at scales ranging from standalone projects to federated or regional frameworks; creating new university programs and centers; changing the accepted notion of academic careers (e.g., to include “alt-ac” alternative academic careers); and, ultimately, instilling a new scholarly digital ethos in the academy in the name of “collaboration” and “open access.” As a consequence, the existing critical energy of the digital humanities–sometimes quite passionate and even militant–has been primarily devoted to such institutional issues. Breaking down the paywalls of closed publication infrastructures, for instance, is the digital humanities version of storming a university administration building in the 1970s.

Can neoinstitutional and social-structuration-of-technology approaches to understanding the evolving relation between the academic institution and today’s more domineering institutions (most notably, business and government) help the digital humanities release its intramural critical energy? Can that release help propel not just change in higher education but, through higher education and the technological infrastructures that mediate its relationship to other institutions, also extramural change in the larger society that higher ed contributes to? (Besides its focus on culture, I note, one of the special strengths of neoinstitutionalism that make it attractive to add to Large Technical Systems analyses of infrastructure is that it is especially attuned to studying change and divergence among dominant institutional systems.) In short, can the considerable existing intelligence, idealism, and moral force of the digital humanities be redirected from being only an instrument of institution work to becoming, through interventions in instrumental infrastructure, also a way to act on institutions and their wider social impact?

But I do not wish to overreach, which is also why I think an approach focused on institutions and their infrastructures is particularly appropriate.  Ultimately, the digital humanities field must be critical in a way that does not ask it inauthentically to reach beyond its expertise and mandate to bear exaggerated responsibility for larger social phenomena.  Acting out through the digital humanities about larger social issues is necessary.  But such actions must be complemented by creating infrastructures and practices that make their social impact by being what Susan Leigh Star called “boundary objects”–in this case boundary objects situated between the academic institution and other major social institutions.  It is in this boundary zone–just as one example, “content management system” infrastructures whose use by scholars oscillates between corporate “managed” and “open community” philosophies–that higher education can most pertinently influence, and be influenced by, other institutions through what I earlier called “shared but contested information-technology infrastructures.”  It is in this boundary zone of hybrid scholarly, pedagogical, and administrative institutional infrastructure that we need the attention of skilled and thoughtful digital humanists, even if the interventions they make are not called anything as ambitious as “activism” but instead simply “building.”

[End of excerpt]

Sessions at the 2016 MLA Convention related to digital humanities research, teaching, or the direction of the DH field, with some overlap with new media studies, writing studies, editing, and other topics. (Some sessions listed here are not centrally on DH but include at least one relevant paper.)

Online versions of list: .docx | .pdf

This list is compiled by Alan Liu, U. California, Santa Barbara (with kudos to Mark Sample for the idea, based on his listing of MLA DH sessions at previous MLA conventions). Please send Alan corrections and notices of sessions he has missed: ayliu@english.ucsb.edu

List last revised: 23 Dec 2015

Citation: Peter de Bolla, “Digital Knowledge: Format, Scale, and the Information-knowledge Parallax at the Total Knowledge Horizon — A Reply to Alan Liu.” 15 November 2014. https://liu.english.ucsb.edu/peter-de-bolla-reply-to-alan-lius-theses-on-the-epistemology-of-the-digital/

The following was written by Peter de Bolla of Cambridge University in reply to Alan Liu’s “Theses on the Epistemology of the Digital,” a solicited follow-up to Liu’s participation in the second planning consultation session of the Cambridge University Centre for Digital Knowledge (CDK). Held on 7 May 2014 at the Cambridge Centre for Research in the Arts, Social Sciences and Humanities (CRASSH), the consultation session focused on “digital epistemology,” one of the two intended thematic strands of the Centre for Digital Knowledge. A previous planning consultation at CRASSH focused on the other intended strand of “digital society.”

At the consultation session that prompted Liu’s “Theses,” de Bolla initiated the proceedings by reading a not-yet-published paper on the epistemology of the digital that served as a provocation for the discussion.

(November 2015; posted here 15 November 2015)

It is clear, as Alan Liu points out in his “Theses on the Epistemology of the Digital,” that knowledge is not the first thing that comes to mind when we turn our attention to the digital. As he notes, we more commonly think of the digital as having considerable impact on the social, economic, political and cultural. This, it seems to me, is because our primary engagement with the digital is at the level of its effects and affordances with respect to communication, information storage and retrieval, statistical inference and manipulation. And the overwhelming majority of us use the digital as if it were analogue: the interfaces we are comfortable with simulate analogue forms. This is why the question of knowledge is so far back, so buried in our encounter with the format of the digital.

It is true that programmers take a different route, but even in this case the question of knowledge is not particularly to the fore, as programmers attend to system more than episteme. They are most concerned to get the ontology of their programmes to function efficiently. Perhaps this focus on “ontology” rather than “episteme” reveals a truth: the ontology of computation (where “ontology” is now being used in its philosophical sense) is order and sequence. In contrast to this sense, “ontology” is used in programming to mean “a hierarchically organized classification system associated with a controlled, structured vocabulary that is designed to serve the retrieval and integration of data” [Ceusters and Smith, Switching Codes].
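A toy sketch (in TypeScript, with invented names) may help picture the programming sense of “ontology” quoted above: a small hierarchical classification with a controlled vocabulary, used to tag records and to retrieve them in a way that respects the hierarchy. This is an editorial illustration under stated assumptions, not a reconstruction of any particular system.

    // Toy illustration (invented names) of "ontology" in the programming
    // sense quoted above: a hierarchically organized classification system
    // with a controlled vocabulary, used here to tag and retrieve records.

    // Controlled vocabulary: only these class names are admissible.
    type OntologyClass = "Artifact" | "Document" | "Manuscript" | "PrintedBook";

    // Hierarchy: each class points to its parent (the root has none).
    const parentOf: Record<OntologyClass, OntologyClass | null> = {
      Artifact: null,
      Document: "Artifact",
      Manuscript: "Document",
      PrintedBook: "Document",
    };

    // A record is tagged with a class drawn from the controlled vocabulary.
    interface CatalogRecord {
      title: string;
      cls: OntologyClass;
    }

    // Retrieval that respects the hierarchy: asking for "Document"
    // also returns manuscripts and printed books.
    function isSubclassOf(cls: OntologyClass, ancestor: OntologyClass): boolean {
      let current: OntologyClass | null = cls;
      while (current !== null) {
        if (current === ancestor) return true;
        current = parentOf[current];
      }
      return false;
    }

    function retrieve(records: CatalogRecord[], cls: OntologyClass): CatalogRecord[] {
      return records.filter((r) => isSubclassOf(r.cls, cls));
    }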

The Cambridge University Centre for Digital Knowledge (CDK) proposes to put the question of knowledge to the fore in our attempts to understand the difference that is made by the move from the analogue to the digital. This project is, at least initially, focussed on the difference in format. Seen from this perspective the alterations in the social, political, economic and cultural that arise when digital technologies become ubiquitous are epiphenomenal: these changes, important as they surely are, tend to obscure the alteration in episteme that occurs when analogue materials are migrated into digital format. This alteration is, I think, indicated in Liu’s observation that the “unique quality, or quantum, that is digital knowledge” involves “rebalancing the values of quality and quantity.” Put another way, data is knowledge in computational environments. The problem for us humans is that at the quantum level of data we are unable to perceive that knowledge. It is only at higher orders of data configuration that we are able to transform information into larger bits that we identify as knowledge. But to the computer this transformation is not one of type or category–a changing epistemology–it is merely one of scale. So, in concert with Liu’s remark that the CDK will need to “let go of too fixed an adherence to established modern ideas of knowledge,” we mean to push hard at the gates which sort information from knowledge.

It is manifestly clear that, as Liu points out, the landscape within which knowledge is produced and disseminated has changed significantly in recent times. There are new “systems, forms, and standards of knowledge” which pit “algorithmic” against “philosophical” knowledge, or “multimedia instead of print-codex knowledge.” The question for the CDK, however, is once again: are these changes epiphenomenal? Or, to put that more carefully, has our attention to these alterations so far been less directed to how epistemic shifts are at play than to the effects of these shifts? Liu’s attention to the flatness of knowledge in the realm of big or crowd data (in contrast to my “vertical axiology”) is, I think, directed at one of these effects. The distribution of information, its access and pliability all fall within a flat terrain where the “flatness” is the price of entry to the information. That is to say there is no–or very little–cost. But is that the same thing as a “flat epistemology”? As Liu puts it: “the wisdom of the crowd challenges the very notion of an epistemology.” Once more this seems to be a question of scale: is it the case that at this level of quanta one cannot see a theory of knowledge? Or, perhaps as important, that at this level one does not need a theory of knowledge since the mass circulation of and access to information works as if it were knowledge?

I am not sure that this account is fully satisfying or convincing. Perhaps we need a more complex topography in order to see what is going on. If we were to plot the terms “information,” “knowledge,” and “opinion” within a multidimensional space–a vector space–it might be possible to begin to see how these slightly different epistemic identifications overlap and connect, disconnect and repel. Here are some thought experiments. If one way of seeing knowledge is this: “knowledge is information which, when it is physically embodied in a suitable environment, tends to cause itself to remain so” (David Deutsch), and one way of seeing the wisdom of the crowd is “mass circulation opinion,” then one of the vectors we would need in order to plot the topology would be time or persistence. Mass circulation opinion has, for example, a very distinctive temporality that might be represented as a wave form that corresponds to the volume of traffic at time t. Knowledge, in the Deutsch formulation, tends towards stability and might be represented as a flat line. The temporality of information might be represented as a set of discontinuities as older information becomes updated and replaced by more recent information. A vector space model would seek to plot these data points within a planar matrix one of whose axes would be temporality. Another might be quantity. This would seek to plot the topology within which expert knowledge, the knowledge of the few, is distributed against mass circulation opinion, the knowledge of many. What measures would be helpful here? Would this help one to identify the range of measure x within which opinion tends towards knowledge? This, I think, might be close to Liu’s observation that knowledge “may not be either truth or story but just a probability distribution.”
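As a purely illustrative gloss on the thought experiment above, the following TypeScript sketch plots “opinion,” “information,” and “knowledge” as points in a two-dimensional vector space whose axes are persistence (temporality) and circulation volume (quantity). The numeric values are invented placeholders for the sake of the sketch, not measurements.

    // Toy sketch of the thought experiment above: epistemic kinds plotted
    // as points in a two-dimensional vector space whose axes are
    // persistence (temporality) and circulation volume (quantity).
    // All numeric values are invented placeholders, not measurements.

    interface EpistemicPoint {
      label: "opinion" | "information" | "knowledge";
      persistence: number;  // 0 = vanishes immediately, 1 = tends to remain
      circulation: number;  // 0 = held by few, 1 = mass circulation
    }

    const points: EpistemicPoint[] = [
      { label: "opinion",     persistence: 0.2, circulation: 0.9 }, // wave-like, high traffic
      { label: "information", persistence: 0.5, circulation: 0.6 }, // updated and replaced
      { label: "knowledge",   persistence: 0.9, circulation: 0.3 }, // tends to remain embodied
    ];

    // Distance in this space is one crude way to ask how close mass
    // circulation opinion comes to knowledge along the chosen axes.
    function distance(a: EpistemicPoint, b: EpistemicPoint): number {
      return Math.hypot(a.persistence - b.persistence, a.circulation - b.circulation);
    }

    const opinion = points[0], knowledge = points[2];
    console.log(distance(opinion, knowledge).toFixed(2)); // 0.92 with these placeholders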

If one were now to return to the initiating question for the CDK–the issue of digital format–a similar observation might be made. The topology I have begun to sketch could equally be applied to the scalar issue. If one were to plot quanta of data in a vector space alongside persistence of accepted knowledge it might be possible to see more clearly how information at the very basic level of computational format (zero-one) contains within it knowledge, but knowledge that can only be seen at very high levels of data compression. Put this way, knowledge decomposes into information at the quantum level. This seems to me to speak to Liu’s observation that “the humanistic and quantum universes of uncertainty are doppelgängers of each other.” It is a matter of scale. From the massive scale of knowledge in the humanities uncertainty appears as ambiguity; from the minuscule scale of data in the quantum world ambiguity appears as uncertainty.

This brings me to the question of the distinctiveness of the humanities. From whichever way I look at this issue it seems to me that we obscure too much when we maintain disciplinary coherence. The humanities are wedded to verification just as the non-humanities are. The humanities give considerable weight to accuracy just as the non-humanities do. But in saying this we should not seek to make all knowledge wear the same overcoat. There are distributions within one kind of knowledge (call it humanistic) that are distinct from other kinds. But I am less convinced than some that these distributions are fixed. Still less am I convinced that they ought to stay fixed. By and large what we identify as humanities scholarship contains a mix–a distribution–of opinion, information and knowledge. Up until now the accreditation for this distribution of epistemic entities has been underwritten by a set of practices that give considerable cultural capital to individuals (so-called experts) and institutions. What we might call single researcher accreditation protocols underwrite the opinions of literary critics. Outside the humanities the blend of opinion, information and knowledge is not quite the same. The hard sciences, for example, use the rhetoric of observer-independent protocols for verifying information. Accreditation for opinions here stems from the repeatability of observational verification protocols. Single person cultural capital also applies, but it is tempered by mass and repeated observation. As long as the information persists and supports the opinion, knowledge is said to be gained.

It seems to me that what Liu identifies as the explosion of knowledge in the digital domain–“from crowds, people outside expert institutions, people outside formal organisations entirely”–consists in a remix of the distributions in a vector space that plots opinion, information and knowledge. This remix is not distinctively humanistic. In some ways it is “trans-humanistic.” Indeed we may be witnessing the decomposition of the humanities as knowledge decomposes into data or information. Put the other way around, we may be witnessing the recomposition of information as knowledge in the digital domain. And that applies to all types or kinds of knowledge, it is not unique to what heretofore we have called the humanities.

So let’s try and fit this around Liu’s sketch of a future humanities scholarship–one that has shed the protective skin of the “discourse model.” In place of the “one” reading and writing a book–the lone humanist–we will find collective and collaborative enterprise distributed in expansive networks of communication and experimentation. This, to my mind, will not be humanities 2.0 (or some such) but digital knowledge work. In this domain interpretation and critique are no longer centre stage (though there will be no necessary reason to jettison them entirely); occupying the foreground will be information, its gathering and manipulation so as to reveal what knowledge lies within the quantum level of information as data. Pattern recognition as much as pattern building will be the primary task for the agent who seeks to reveal this knowledge, and the machine (computer) will certainly have equal agential responsibilities in these tasks. There will doubtless be those who see such a sketch as irredeemably anti-humanist, but the clear direction in which the contemporary is travelling seeks to make the division between the human and non-human far less monolithic. Things, animals and machines are more useful and friendly to humans when we begin to investigate the ways in which they may have or obtain agency or quasi-agency. The more we understand the quantum universe of biology the clearer it becomes that at the lowest level of life the domain of operation is digital. Pattern recognition and pattern building is what we, humans, do and are made of. To that extent the most humanistic enquiry is, therefore, digital. Once we accept this it becomes possible to see how much we have to learn from opening out to inspection digital knowledge, that is, information held in a format that creates knowing from bits that are sub-opinion or only recognisable as knowledge at very high densities. It is not that we stand to gain an enormous revolution in what we know–that by definition will be impossible–but in how it comes to be known.

I agree with Liu that one of the ways this can be implemented is to build digital objects–hardware, software, algorithms and so forth–since this is a very testing methodology for exploring how the computer thinks. In this case there is a kind of “craft knowledge” that is only really accessible to those who make. But we should take care to insist that this is but one way. Others include embracing the techniques and technologies of advanced and advancing computer science–not the off-the-peg packages designed to fit problems already identified but the future horizon thinking that pushes at the boundaries of computation or the digital. What if we seek to produce a “total knowledge horizon” in which dynamic contextualisation operates at the largest scale of data repository and inspection? In a domain in which everything is potentially capable of finding relation with everything the task will be to sort out the noise from the significant signal. Although this may well involve building in the first sense above, it is as likely to involve building in the sense of conceptualising what digital knowledge might be at the “total knowledge horizon.” That’s the thing I hope we might aspire to build together.

Peter de Bolla has been Professor of Cultural History and Aesthetics at King’s College, Cambridge University, since 2009. He has been a visiting professor at Siegen, Vanderbilt, and New York University. He is Director of the Cambridge Concept Lab, which is housed in the Cambridge Centre for Digital Knowledge at CRASSH (the Centre for Research in the Arts, Social Sciences, and Humanities at Cambridge University).

 


Works Cited

  • Ceusters, Werner, and Barry Smith. “Switching Partners: Dancing with the Ontological Engineers.” Switching Codes: Thinking Through Digital Technology in the Humanities and the Arts. Ed. Thomas Bartscherer and Roderick Coover. Chicago: University of Chicago Press, 2011. 103. [Link to quotation in Google Books copy of book]
  • Deutsch, David. The Beginning of Infinity: Explanations That Transform the World. 2011; rpt. London: Penguin, 2012. E-book. Kindle Edition. Page 130. [Link to quotation in Google Books unpaginated copy of e-book]

Citation: “Theses on the Epistemology of the Digital: Advice For the Cambridge Centre for Digital Knowledge.” Alan Liu, 14 August 2014. https://liu.english.ucsb.edu/theses-on-the-epistemology-of-the-digital-page/

The following was written as a solicited follow-up to my participation in the second planning consultation session of the Cambridge University Centre for Digital Knowledge. The session, held on 7 May 2014 at the Cambridge Centre for Research in the Arts, Social Sciences and Humanities (CRASSH), focused on “digital epistemology,” one of the two intended thematic strands of the Centre for Digital Knowledge. A previous planning consultation at CRASSH that I did not attend focused on the other intended strand of “digital society.”

My theses below are meant more as provocation than as prescription; and they do not take account of plans that may have been put in place for the Centre for Digital Knowledge since the planning consultations.

14 August 2014

Thesis 1: Enlightening the Digital

Establishing a Centre for Digital Knowledge oriented around “digital epistemology” will require a laser-sharp focus on making “knowledge” a productive framework for understanding the digital age. This framework must be robust enough to compete with such more common gestalts as “society,” “politics,” “culture,” and “economy” (represented in such phrases as “information society,” “surveillance society,” “social media,” “online culture,” “information economy,” etc.). The proposed Centre for Digital Knowledge can generate its agenda by deliberately harnessing the tension between knowledge (including ideals of academic knowledge shaped by the German research university model and the Enlightenment) and social, cultural, and economic understandings of the digital age.

After all, knowledge today is not intuitively the first thing that comes to mind when thinking about the digital, even in regard to such iconic artifacts of the Internet as Wikipedia that ought by rights to hew to the Enlightenment tradition of the Encyclopédie. Not only do Wikipedia’s “no original research” and “notability” principles abridge the idea of knowledge, but its most distinctive traits as a knowledge enterprise are characterized in social terms such as “open” and “community.” And this is before we even come to the identification of the digital with such knowledge-“lite” paradigms as entertainment.

For many, therefore, the digital is not primarily a mutation in knowledge. It is a social change. Social-science and other disciplines operating on this premise treat the digital as a phenomenon of “communication” (“ICT”: “information and communication technology”) impacting social practices, institutions, and organizations [example]. Or the digital facilitates political change. Political scientists or sociologists who study the Internet see it as a testing ground for new kinds of organizing, protest, voting, and other virtual realpolitik [example]. Or, again, the digital marks a cultural change. Disciplines such as “new media studies” and “network critique”–extending British, European, and American traditions of cultural criticism–treat the digital as a domain of contested identity, gender, ethnicity, ideology, affect, privacy, and so on [example]. And, yet again, the digital is an economic change. Economists and organization theorists (chorused by business journalists and business consultants) see the digital as a proxy for the postindustrial reorganization of capital [example].

Amid this clash of paradigms, the specific mission of a Centre for Digital Knowledge should be to illuminate–we may say, “reenlighten”–the knowledge overshadowed by other major views of the digital. Why is it, for instance, that business theorists discuss “knowledge work” in ways that say everything about work but almost nothing about knowledge [example]? What is the actual knowledge embedded in the society, politics, culture, and economy of the digital with their faux-knowledges of “information,” “wisdom of the crowd,” “knowledge bases,” “smart phones,” etc.?

The Centre for Digital Knowledge can design a sequence of events, activities, and outputs that foreground the specific force of digital knowledge amid digital society, politics, culture, and economy. For example, one cycle of Centre activities could focus on how the production and circulation of digital academic knowledge (or of specific “knowledge artifacts”; see provisional plan below) compares to crowdsourcing or social networking. A second could explore how new ideologies of scholarly open-access and open peer review compare to the politics of “open-source” and “open government.” A third could focus on the relation between traditional expert cultures (including but not limited to academic culture) and the new open-source knowledge cultures. And a fourth could focus on the uncanny convergence/divergence between the digitization of scholarly archives (e.g., of traditional restricted-access or closed-stack research libraries) and the economics of monetized proprietary databases (e.g., Google’s). All these cycles of activities would have in common the goal of sifting the sands of the digital for the unique quality, or quantum, that is digital knowledge (where rebalancing the values of quality and quantity is itself a problem of the epistemology of the digital comparable to similar recalculations of value in the social, political, cultural, and economic digital realms).

Thesis 2: Rethinking Enlightenment

But alluding to the Enlightenment forecloses as much as it discloses. An honest effort to grapple with digital knowledge will also require the Centre for Digital Knowledge to let go of too fixed an adherence to established modern ideas of knowledge (here simplistically branded “Enlightenment”). Those ideas are bound up with philosophical, media-specific (print, codex), institutional (academic and other expert-faculty), and “public sphere” configurations of knowledge that co-evolved as the modern system of knowledge. But today there are new systems, forms, and standards of knowledge, including some that refute or make unrecognizable each of the modern configurations mentioned above–e.g., algorithmic instead of philosophical knowledge, multimedia instead of print-codex knowledge, autodidactic or crowdsourced instead of institutional knowledge, and paradoxically “open”/”private” (even encrypted) instead of public-sphere knowledge.

In this light, Peter de Bolla’s incisive “provocation” paper on digital knowledge (presented 7 May 2014 at the start of the second planning consultation for the proposed Centre for Digital Knowledge held at Cambridge University’s CRASSH Center) is revealing for its frequent rhetorical reliance on two prepositions: “under” and “beneath” (used to query the foundations under or beneath the digital). Evidenced in this rhetoric is an inverted Platonic Divided Line that locates essential knowledge not high above but–in the modern tradition that runs from Kant’s “conditions of possibility” through Foucault’s “archaeology of knowledge”–deep below.

But it is unclear that the epistemology of the digital respects, or should respect, a vertical axiology of truth. Some of the most important dimensions of the digital extend laterally in networked, distributed, and other “inch-deep but mile-wide” formations. Big data or crowd data is bottom-up data, not high data (in the sense of “high church” or “high Latin”). In this regard, the Facebook-era cliché of “the social graph” is symptomatic. Used with the definite article in discussions of social networking, the social graph (commonly reified in visual graphs of nodes and links) has become the icon of a flat epistemology with just two secular dimensions (who knows whom) oblivious to any Platonic or Kantian higher dimension.

In the digital age, in other words, the “wisdom of the crowd” challenges the very notion of an epistemology, or philosophy, of knowledge. If we were to juxtapose the Enlightenment with the digital age, we might say that (a) the French Revolution paid quit to philosophy (and philosophes) by advancing a mob mentality that later nineteenth-century “historicists” (and twentieth-century revisionary historians of the Revolution such as François Furet) could only “know” by displacing the Revolutionary “idea” into notions of “spirit [Geist],” “rumor,” “representation,” etc.; and (b) the “digital commons” and “open” movement now represents the resurgence of a similar crowd knowledge challenging scholars. Then and now, the difficulty is that the object of inquiry puts in question the knowledge-standards of scholarly inquiry itself. Circa 1790, for example, people in Paris “knew” who was an “aristocrat” to be denounced to the local Watch Committee because “everyone knows.” After 2000, with the onset of Web 2.0 and social media, people similarly know who the “celebs” are (not to mention more plebeian “friends” and “followers”) because Facebook, Twitter, etc. know. Pity scholars who want to know what such “knowing” means but are constrained to rigorous older standards of “critical” knowledge that are like being the only person on Facebook who doesn’t “like” anything.

A similar incommensurability between old and new epistemologies applies in temporal terms. Instead of valuing enduring or permanent truths (the temporal version of “high” knowledge), the digital age is preoccupied with information of much shorter durations–time spans plunging down to the diurnal rhythm of blog posts, the microseconds of a data packet’s “TTL” (defined “time to live”), and even the gigahertz clock rate of a computer’s CPU. Originally, after all, Facebook and Twitter both prompted their users for “status updates” with variants of the hyper-immediate question: “What are you doing now?” Nor is it just a matter of the short durée but also of different temporal rhythms. Digital knowledge moves through computers and networks in fitful, robotic ballets of inhumanly precise starts and stops that fatally deform the early-twentieth-century Bergsonian intuition of flow and even the late-twentieth-century McLuhan intuition of media flow or field. Today the time of knowledge belongs to the invisible order of “micro-temporality” theorized by such media archaeologists as Wolfgang Ernst.
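
The scale of this micro-temporality can be made concrete with a minimal sketch using only Python's standard library; the figures it prints are machine-dependent and purely illustrative.

```python
import time

# Clock a trivial burst of machine work at the resolution the machine itself keeps:
# nanoseconds, far below the threshold of any humanly felt "flow."
start = time.perf_counter_ns()
checksum = sum(range(1000))              # a stand-in for one small unit of computation
elapsed_ns = time.perf_counter_ns() - start

print(f"checksum={checksum}, elapsed={elapsed_ns} ns "
      f"({elapsed_ns / 1_000_000:.6f} ms)")
```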

So, too, the incommensurability of digital epistemology can be formulated in terms of “uncertainty.” After all, digital knowledge often verges into or draws on stochastic processes that are native to our current scientific epistemology of statistical, probabilistic knowledge. Probability theory and the world view it models (e.g., the quantum-mechanical view of the universe) undercut the foundation of any knowledge that, in order to count as knowledge, needs definite subjects and predicates linked in narrative syntax of the sort that Boris Tomashevsky instanced in his definition of a thematic “motif.” Tomashevsky’s example of a motif: “Raskolnikov kills the old woman.” To conform to today’s scientific world view, we would have to rewrite that sentence approximately as follows: “There is a 74% chance that in this document Raskolnikov kills (82%) / wounds (15%) / ignores (3%) the old woman (68%) / young woman (23%) / other (9%).” (Those familiar with “topic modeling” in the digital humanities and other digital research fields will recognize that such a recasting of “motif” makes it resemble the probabilistic “topics” generated by the MALLET topic modeling tool.) In other words, the humanities today have a hard time adjusting to the idea that knowledge may not be either truth or story but just a probability distribution. Even the “ambiguity,” “paradox,” and “irony” that were the highest evolutions of humanistic knowledge valued by the New Critics seem to exist in an alternate cosmos from the equivalent uncertainties of quantum mechanics. Not Cleanth Brooks’s well-wrought urn, in other words, but Schrödinger’s cat. The New Critics equated the paradox of “Beauty is truth, truth beauty” (the line from John Keats’s “Ode on a Grecian Urn” that so exercised Brooks in The Well Wrought Urn) with the full richness of human reality, which they also called “experience” in consonance with John Dewey’s contemporaneous philosophy of experience. In today’s scientific epistemology, by contrast, reality is defined by the collapse of the quantum wave function, as it were, into either beauty or truth, a binary decision state (consonant with the digital epistemology of 1 vs. 0) that nevertheless does not negate wonder at the unknowability of the paradoxically more real (but also less real because created from “virtual particles”) reality of the “quantum foam” underlying it all. The humanistic and quantum universes of uncertainty are doppelgängers of each other, incommensurable in difference and similarity.
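
As an illustration, the rewritten motif above can be held quite literally as a data structure rather than a sentence. The sketch below (plain Python, reusing the hypothetical percentages from the example) shows only the general shape of such probabilistic knowledge; it is not the actual output format of MALLET or any other topic-modeling tool.

```python
# A motif recast as a probability distribution rather than a sentence.
# The figures repeat the illustrative percentages used above, not real model output.
motif = {
    "document_match": 0.74,
    "action": {"kills": 0.82, "wounds": 0.15, "ignores": 0.03},
    "object": {"the old woman": 0.68, "the young woman": 0.23, "other": 0.09},
}

def most_likely_reading(m: dict) -> str:
    """Collapse the distribution into the single most probable 'story'."""
    action = max(m["action"], key=m["action"].get)
    obj = max(m["object"], key=m["object"].get)
    return f"Raskolnikov {action} {obj} (p={m['document_match']:.2f})"

print(most_likely_reading(motif))
# -> "Raskolnikov kills the old woman (p=0.74)": the Tomashevskian motif
#    survives only as the mode of a distribution.
```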

In sum, there was knowledge; and today there are other kinds of knowledge that seem to come foaming up from the zero state of knowability not just in physics (and metaphysics) but in the epistemology of the digital–e.g., from crowds, people outside expert institutions, people outside formal organizations entirely, people from other parts of the world, and so on whose virtual knowledge seems as transient as virtual particles. That is one of the lessons of the digital.

Thesis 3: Decentering the Centre

A Centre for Digital Knowledge also needs to try out alternatives to the very form of an academic “centre,” since that form is vested in traditional ways of organizing knowledge production that the digital is currently reinvesting in a wider, differently articulated network of institutions, collectives, and media. “Neoinstitutional” theory combined with “adaptive structuration theory” (in the fields of sociology and organizational technology studies, respectively) helps us understand how the digital facilitates changes in organizational and institutional structures, especially those oriented toward knowledge work. For example, Wikipedia, open-source communities, etc., evidence how the once hallowed institutions of “expertise” (professional work in corporations, professorial work in universities, professional journalism, etc.) are being repositioned by the new technologies in unstable relation to networked “open” para-institutions of knowledge outside settled organizational fields.

It thus seems clear that a Centre for Digital Knowledge that relies solely on traditional institutional forms–even the now normative “interdisciplinary” form (e.g., a centre that creates weak-tie intersections among faculty in different fields)–will be cut off from some of the most robust conceptual and practical adventures of digital knowledge. A key test for the proposed Centre for Digital Knowledge, therefore, will be whether it is willing at least on occasion to accommodate non-standard forms of knowledge organization, production, presentation, exploration, and dissemination acclimated to the digital age or open to its networked ethos. Examples of such forms include “THATCamps” or “unconferences,” writing or coding “sprints,” design “charrettes,” online forums, events planned by non-academic invitees, cross-institutional collaboration (university to high school, university to newspaper, university to corporation, university to NGO, etc.), direct engagement with the public in online or face-to-face venues, and intellectual events planned not just by research faculty but also by teaching-first instructors, clerical staff, and students (to break down the divide between those tiers).

An additional desideratum is that the Centre should produce a replicable model for other academic (or hybrid academic/public-humanities) institutions, programs, and events that does not depend on the funding resources and “A-list” guest speakers of an elite university such as Cambridge. That is, the Centre should ensure that every event aspiring to be the academic equivalent of an Aspen Institute session or a TED talk is balanced by an event aspiring to be a THATCamp, beginner or early-career forum, project incubation workshop, regional all-institutions conference, or other forum that sows the seeds far and wide.

Thesis 4: Redesigning Discourse

In modern times, the academic production and dissemination of humanities knowledge have run in a well-known discourse pattern (OED: “discourse” from “discursus action of running off in different directions, dispersal, action of running about”). With some exceptions (e.g., co-editions), humanities scholarly discourse runs, mutatis mutandis, as follows:

Reading & Research → Syllabi & Teaching notes → Talks → Articles → Monographs.

Some traits associated with this program are dominant and others recessive. Solo agents of knowledge are dominant in the humanities. One reads and annotates a book; one designs a syllabus; one writes a paper; etc. By contrast, collective agency–the thick bunchings of academic life in meetings, reading groups, conferences, etc.–is recessive: either epiphenomenal (one would be writing that article anyway) or taken for granted as para-academic apparatus (e.g., the discourse between a scholar and editor that only occasionally comes to view in a book’s acknowledgements).

In terms of the acts rather than agents of humanities knowledge, interpretation and critique are dominant as the ends of knowledge, while observation and analysis are recessive as preliminaries to knowledge. Spanning in between are the acts of rhetoric and narrative that comprise the dispositio that William Germano (drawing on his experience as a former editor of humanities monographs) calls a book’s “throughline.”

Additionally, humanistic discourse has dominant and recessive styles. Through an act of introjection, many humanities scholars have come to believe that their dominant discourse should be of the same order of linguistic phenomena as their object of study. Since much of humanistic study concentrates on exceptional texts (e.g., literary works, pivotal historical speeches or documents), this means that higher value is ascribed to scholarly writings that at least to some degree are as resonantly crafted, nuanced, or elegant as complex literary language; as classically or biblically periodic as famous historical speeches; or otherwise as linguistically tour-de-force as some variant of the above. (Disclaimer: the present piece of humanistic writing is no exception, at least in its aims.) Even a humanities scholar’s spoken lectures are traditionally pre-scripted for high-pitch verbatim performance–an exercise that other disciplines such as the sciences and engineering view as bizarrely theatrical, not to mention fantastically inefficient for presenting data and conclusions.

Indeed, the issue of “data” in the humanities is increasingly acute in the digital age since it is a direct challenge to the privilege of high style. With some exceptions in fields like history, the humanities treat data as something to be embedded in discourse as part of the argument (or at least kept as close as a footnote or appendix at one remove). “Close reading” is an example of how the humanities fold data–the precise lines of poetry being interpreted, for instance–into argument. As a consequence, and by corollary with its stylistic ideal, the humanities create arguments that seem data-lite. After all, only so much concrete evidence can be folded into an argument without the prose taking on the poured concrete quality of many scientific or social-scientific articles with their masses of particulate citations–e.g., “Empirical studies adopting this social constructionist view of technology have been done by sociologists of technology (Bijker 1987; Bijker, Hughes and Pinch 1987; Collins 1987; Pinch and Bijker 1984, 1987; Woolgar 1985; Wynne 1988), and information technology researchers (Boland and Day 1982; Hirschheim, Klein and Newman 1987; Klein and Hirschheim 1983; Newman and Rosenberg 1985)” (source for this example [PDF]). Of course, the appearance of being data-lite belies the true heft and complexity of humanities data (where “data” here means low-level observational and descriptive information recorded in some structured pattern, as in the “images” or “paradoxes” Brooks accumulates in his Keats chapter in The Well Wrought Urn, whose title notably rejects the idea of explicit data: “Keats’s Sylvan Historian: History Without Footnotes”). First, there is a multiplier effect by which humanistic knowledge is attended by messy problems of missing, irregular, incommensurate, and ambiguous information that require much behind-the-scenes processing and adjudication (see a post by Hugh Cayless on this issue). Second, much underlying data in the humanities is implicit. Data inheres in entrained reading practices such that the “what is your data?” question typical in other disciplines is normatively answered in literary studies: “here’s the book; do a close reading yourself to see if my interpretation is persuasive.” And data also inheres silently in the stability of a massive infrastructure of book collections, curatorial staffs, bibliographies, metadata, and other apparatuses–i.e., the whole order of data to which even simple humanities citations (e.g., “see Cleanth Brooks”) really refer. Humanities data refers to “all that” (background editing, archiving, reading practices and apparatuses) even when, as in Brooks’s case, it seems to wear on its sleeve few, if any, footnotes. So long as libraries, books, or reading do not change, “all that” can be left unspoken as assumed knowledge.

By contrast, the sciences and social sciences (especially branches of the latter focused on quantitative or empirical research) cleave the orders of data and of argument so that they can be managed separately. Data is channeled through closed or open datasets, databases, repositories, etc.; while argument appears in pre-prints, conference proceedings, and journal publications. This separation allows for the creation, processing, maintenance, and presentation of data as a distinct workflow–one that can acquire independent value and even generate its own research problems (as in recent work on computationally assisted “data provenance” [example, PDF]). Scientific and social-scientific data can thus be presented or otherwise made available autonomously for critical inspection–a fact demonstrated, for example, in recent arguments for and against the data validity of Thomas Piketty’s Capital in the Twenty-First Century.

Humanities discourse has rarely needed to aspire to the same standards for making all its data explicit, shareable, and open to critical examination. “So long as the nature of libraries, books, or reading do not change,” as I put it above, there was no need. But today digital media are rapidly destabilizing the traditional evidentiary structure of the humanities and bringing it closer to that of the sciences. The digital humanities field is a leading example. There are no established humanities protocols for adequately citing even the moderately “big data” that advanced digital methods now tempt humanists to study–e.g., the 7,000 novels that Franco Moretti explores in “Style, Inc. Reflections on Seven Thousand Titles”; the 3,500 works of Irish American prose literature that Matthew L. Jockers mines in Macroanalysis; or the 21,000 articles from “seven generalist literary-studies journals” with up to a century of volumes each that Andrew Goldstone and Ted Underwood canvass in their “The Quiet Transformations of Literary Studies” [PDF]. Even outside the digital humanities, mainstream humanities scholars who work with any kind of digital material are now at sea when needing to quote or cite the increasingly important plenum of born-digital, dynamic, social-media, streaming, and other new kinds of resources. For example, how does one shoehorn into the MLA’s citation style for a Web resource–simply “Web,” void of URLs–any granular reference to a distinct structure or state of an online site, archive, or database?
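
One hedged way to imagine more granular citation is as a structured record that captures not just a URL but the state of the resource consulted. The sketch below is hypothetical: its fields and example values are invented for illustration and are prescribed by no existing citation standard (MLA or otherwise).

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class DigitalCitation:
    """Hypothetical record for citing a distinct state of an online resource."""
    title: str
    url: str
    accessed: str           # ISO 8601 timestamp of consultation
    resource_state: str     # e.g., corpus release, database version, or the query run
    archived_snapshot: str  # e.g., a web-archive capture of the page as consulted

example = DigitalCitation(
    title="Hypothetical literary-studies corpus",
    url="https://example.org/corpus",
    accessed="2014-08-19T10:00:00Z",
    resource_state="release 2.1; query: generalist journal articles, 1890-2013",
    archived_snapshot="https://web.archive.org/web/2014/https://example.org/corpus",
)

print(json.dumps(asdict(example), indent=2))
```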

The high style of humanities discourse, in sum, is increasingly under threat in a digital age that values information over style. Meanwhile, the more data-explicit “ordinary” humanities style of book prospectuses, grant proposals, personnel case reviews, research assessment reports, etc., remains recessive even as it becomes increasingly pervasive. Days and nights may be spent writing a grant proposal, for example, but the prose that emerges is never valued as the “real” voice of the humanist. The situation grows only more unstable as humanities scholars devote larger proportions of time to writing such works as reports for program reviews or research assessments. What the digital age seems to be telling the academy–an outcome that the humanities will need to adapt for their own purposes–is that the dominant/recessive relation between the language of a book and that of a report or proposal may need to be rebalanced. Nor is the rebalancing solely driven by intramural and administrative needs–part of the rise of “managerialism” in universities. “Public humanities” scholars and humanities advocates make a strong case for complementing humanities research with dissemination in “plain and simple” language [example].

What, then, should be the discourse of knowledge in a Centre for Digital Knowledge? One thesis is that such a Centre should embrace alternatives to normative humanities academic discourse as part of its very project of understanding the difference of digital knowledge. “Alternatives” does not necessarily mean abandoning the most distinguished features of humanities discourse–individually cultivated voices of eloquence feeling their way toward sustained, rigorous, and elegant or “edgy” interpretations of past and present phenomena. But it does mean diversifying and reordering humanities discourse so that its voice can join in a broader discursive cycle of digital knowledge.

What I mean may be elucidated through a hypothetical research scenario of a sort increasingly common among scholars collaborating with digital methods. Imagine that a major grant has been won to fund a cross-disciplinary, multi-year project entitled “Climate Change and Social Change.” The project’s mission is to correlate climate change with both historical and recent social, economic, political, and cultural impacts–e.g., impacts on the perception of climate (e.g., in the media), social demographics (e.g., mortality rates and migration patterns), monetary flows, political movements, and policy decisions. The promised deliverables are heavily digital: a dataset or corpus, digital tools and interfaces for researchers and the public, and digitally-accessible conferences, papers, and articles. Members of the project team include scholars in computer science, biology, epidemiology, sociology, political science, communication, anthropology, film and media studies, environmental history or literary ecocriticism, history, and literary studies or comparative literature. The operational procedure is a series of plenary meetings branching off into working groups and development “sprints,” all coordinated around a series of defined project milestones and deliverables.

One of the distinctive features of such projects in the digital age is that the breadth of disciplines involved is homologous with a condition of the digital itself: the fact that the object of study can be mutated into a common digital dataset and transformed into countless permutational views for treatment from different disciplinary angles. Thus there is no one primary discourse of knowledge agents, acts, and styles. Monographic publications written in high style by humanities scholars are on a par with the discourses that dominate other disciplines, such as collaborative conference papers, datasets, prototype demonstrations, etc. Or, more accurately, the dominant discourses of different disciplines each take command at different phases of the overall cycle of knowledge production before receding to let other kinds of discourse dominate–the whole alternating sequence driving the process forward iteratively. Thus, for example, individuals may drive the work in some parts of the cycle, and teams in others. Observation and analysis come to the fore in some parts of the cycle, and interpretation and critique in others (e.g., critical discussion that occurs at the beginning of the project to shape the mission, or midway in the project as a correction of preliminary results). And style modulates through the cycle accordingly–full-throated at some points, but collapsed to bullet points, diagrams, mockups, and “demos” at others. In this regard, the “provocation” paper by de Bolla at the second planning consultation for the Centre for Digital Knowledge is a perfect exemplum of high-style humanistic critical argument used tactically to start rather than finish a project. Ideally, the sum of all the phase-cycles of this discourse–in which the discursive norms of each discipline take the lead at different points–creates a whole greater than the parts.
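
A small sketch may help clarify what "countless permutational views" of a single shared dataset could mean in practice. The table, columns, and numbers below are invented for illustration (and assume the pandas library); the point is only that the same rows can be re-pivoted for an epidemiologist's question and for a media scholar's question.

```python
import pandas as pd

# One hypothetical shared table for the "Climate Change and Social Change" project.
records = pd.DataFrame({
    "year":           [1998, 1998, 2003, 2003],
    "region":         ["delta", "plateau", "delta", "plateau"],
    "heat_days":      [12, 4, 21, 9],
    "excess_deaths":  [30, 5, 80, 12],
    "press_mentions": [14, 3, 95, 40],
})

# An epidemiologist's permutation: mortality and heat exposure, totaled by region.
health_view = records.pivot_table(index="region",
                                  values=["heat_days", "excess_deaths"],
                                  aggfunc="sum")

# A media scholar's permutation: press coverage over time.
media_view = records.groupby("year")["press_mentions"].sum()

print(health_view)
print(media_view)
```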

The humanities, in other words, need not think that the discursive flow of “Reading & Research → Syllabi & Teaching notes → Talks → Articles → Monographs” is a linear path. Different segments of that traditional agenda can be broken out separately and inserted tactically into other phases of the overall collaborative act of knowledge production where they will have the most value. From the point of view of the humanities themselves, this thesis assumes its most radical form in two propositions. One is that in the digital age humanities scholars should be encouraged to complement their dominant discourse with other kinds of discourse–including challenging collaborative work, difficult and innovative acts of data collection and analysis, and research outputs such as published conference proceedings or online projects that do not sum up in a critical/interpretive monograph. The other proposition is that in the digital age humanities scholars should not be engaging solely in discursive acts at all. Instead, it is already clear in the field of the digital humanities–a leading edge of the humanities’ encounter with digital knowledge–that a gestalt-shift is underway that recasts acts of discourse as acts of “making” and “building.” In the digital humanities, the “epistemology of building”–realized through the building of digital projects, hardware DIY projects, media archaeology labs, etc., and theorized with the aid of such broader intellectual movements as the “new materialism”–is, as they say, a thing.

Thesis 5: Program for the Centre for Digital Knowledge

There are many possible ways the above recommendations could be built into a Centre for Digital Knowledge. Here, for example, is one program of activities that interweaves many of the above theses:

  • Imagine that the Centre for Digital Knowledge would organize itself for the first four years of activity around the question, “What will be the important new digital artifacts of knowledge in the year 2050, and what will their relation be to older digital or material artifacts of knowledge?” The year 2050 is chosen to provide an aim point that provokes imagination, but not one so far in the future as to encourage pure fantasy. The notion of “artifacts” (rather than “media,” “society,” “culture,” etc.) is chosen to anchor the question in the concrete and in building.
  • To address this question, the Centre for Digital Knowledge would recruit and provide fellowships for one or more cross-disciplinary teams of researchers (both senior and early-career, intramural and extramural)–e.g., several humanists, social scientists, and engineers, with at least one ethnographer and one administrator.
  • The team(s) would be given the following mission: design a digital artifact of knowledge for the year 2050, supported by research, mockups or prototypes, exploration of the intellectual premises and theory, speculations on economic and social viability, etc. In doing so, conduct activities that engage other kinds of institutions (e.g., high schools, corporations, the government) and the public; and at least on occasion plan activities that do not conform to established academic forms such as a conference or colloquium.
  • The ethnographer on the team would be given the mission: document the workflow, discourse patterns, etc. of the team(s).
  • The administrator on the team would be given the mission: note the kinds of activities, discourses, and outputs in the project that currently do not have a place in a university’s reward or hiring procedures; and draft a revision of personnel policy that finds a viable way to recognize those activities in a way that furthers the overall research and teaching strength of the university.
  • The final outputs of all the above would consist of traditional scholarly articles and research; an online site giving access to the project and its data as well as explanations addressed to the public; and publications on the project workflow itself.

As stated above, this is just one example program. Many other kinds of organization, activity, and output could be imagined that would allow the Centre to explore, and enact, the epistemology of the digital. Whatever the program, the goal is to engage the topic of what it means to “know” in the digital age in a spirit of serious play–at once disciplined and exploratory of new paradigms.

Errata & Revisions

19 August 2014: Corrected to 21,000 the mention of 13,000 articles in Andrew Goldstone and Ted Underwood’s “The Quiet Transformations of Literary Studies” [PDF].

10 November 2014: Corrected to 3,500 the mention of 758 works of Irish American prose literature that Matthew L. Jockers mines in Macroanalysis (the latter was correct only for Chapter 8 in Jockers’s book).

“This is Not a Book: Long Forms of Shared Attention in the Digital Age.” INKE Conference on Research Foundations for Understanding Books and Reading in the Digital Age: E/Merging Reading, Writing, and Research Practices, Havana. 12 December 2012.


“This is Not a Book: Long Forms of Shared Attention in the Digital Age.” Humanities Center, DePaul University. 11 April 2013.


Literature and Data
(Theory & Media Studies Colloquium, Yale Univ., Oct. 7, 2009)

ProSE
How a Romantic Became a Digital Humanist
Tom Swift

  • The Two Cultures
  • The Sense of History and Information Culture

Selected UCSB English Department Digital Initiatives

Department Projects

Literature+

Experimental Courses

  • English 194: Creativity and Collaboration
  • English 194: Literature+ (Spring 2007)
  • English 149: Literature+ (Winter 2008)
  • English 149: Literature+ (Winter 2009; co-taught with James Donelan)
  • English 236: Literature+ (Winter 2008)
  • Toy Chest (Online or Downloadable Tools for Building Projects)

  • See A. Liu, “Literature+”.
    Currents in Electronic Literacy (Spring 2008). <http://currents.cwrl.utexas.edu/Spring08/Liu>

    • Ideal Conclusion

      There has never been a time when world issues on the scale of globalism, terrorism, and the environment have created such a need for radical interdisciplinarity in the academy. There has never been a time when the digital tools facilitating such interdisciplinarity have been more accessible, shareable, and usable. And, from the point of view of our students (who are idealistic about the future but also worried about their careers after graduation), there has also never been a time when the workplace seems more to reward “knowledge workers” able to collaborate via digital technologies across expertises, departments, firms, and nations. My Literature+ courses are packed, drawing students from many disciplines who sense that they are in the pipeline, for better or worse, to such a future. Can the humanities prepare their students not just to survive but to shape the future into what might be called, in complementarity to Literature+, Dataset+? I mean by this a view of the world that exceeds the usual spreadsheets, databases, reports, and other bleak expressive forms that today sum up the knowledge of business, government, etc., so as to afford some measure of ethical intelligence, social awareness, communicational fluency, aesthetic/design sensibility, and other cultural quotients of a robust human knowledge.

      Of course, a skeptic responding to such idealism might be suspicious that asking students to take a literary work and do anything with it other than literary interpretation, in preparation for a more robust form of knowledge work, can only be a recipe for dilution, popularization, and philistinism. But I have rarely, if ever, seen students more truly engaged with literature than in these courses, where they decide what is essential about a work that must be modeled in new paradigms and technologies so as to make literary experience tractable and manipulable in other disciplinary world views. During the studio/lab classes, I rotate among student teams to ask such questions as, “So what is this work really about? What does your project have to carry over no matter what?” Given that responsibility, students act as if they were at the sensitive stick of a jet fighter called literature.

 

What is the Relation of Literary Study to Data?

  • Exempla:
    • Shaun Sanders, Textones I, Textones II
    • Jeremy Douglass (and Lev Manovich), Cultural Analytics Project (Software Studies Program, UC San Diego)
    • Hans Rosling, demo of GapMinder software at TED (2006)
      (TED = Technology, Entertainment, Design annual conference, Monterey, CA)

      • Bio: "Rosling began his wide-ranging career as a physician, spending many years in rural Africa tracking a rare paralytic disease (which he named konzo) and discovering its cause: hunger and badly processed cassava. He co-founded Médecins Sans Frontières (Doctors without Borders) Sweden, wrote a textbook on global health, and as a professor at the Karolinska Institutet in Stockholm initiated key international research collaborations. He’s also personally argued with many heads of state, including Fidel Castro."
    • Franco Moretti, Graphs, Maps, Trees: Abstract Models for a Literary History (2005)
  • Some Questions:
    • What is the relation of literary study to data?

      • What do we gain, and what do we lose with "distant reading"?

      • Whither "interpretation"?

      • What is the relation between data and aesthetics?
Selected Quotations and Concepts
  • Franco Moretti, Graphs, Maps, Trees: Abstract Models for a Literary History (2005):

    “But within that old territory [of literature], a new object of study: instead of concrete, individual works, a trio of artificial constructs–graphs, maps, and trees–in which the reality of the text undergoes a process of deliberate reduction and abstraction. ‘Distant reading,’ I have once called this type of approach; where distance is however not an obstacle, but a specific form of knowledge: fewer elements, hence a sharper sense of their overall interconnection. Shapes, relations, structures. Forms. Models” (p. 1).

  • Willard McCarty, Humanities Computing (2005):

    “By ‘modelling’ I mean the heuristic process of constructing and manipulating models: a ‘model’ I take to be either a representation of something for purposes of study, or a design for realizing something new…. Two effects of computing sharpen the distinction between ‘concept’ on the one hand and the ‘model’ on the other: first, the computational demand for tractability, i.e. for complete explicitness and absolute consistency; second, the manipulability that a digital representation provides…. Take, for example, knowledge one might have of a particular concentration in a deeply familiar work of literature. In modelling one begins by privileging this knowledge, however wrong it might later turn out to be, then building a computational representation of it, e.g. by specifying a structured vocabulary of word-forms in a text-analysis tool. In the initial stages of use, this model would be almost certain to reveal trivial errors of omission and commission. Gradually, however, through perfective iteration trivial error is replaced by meaningful surprise . . . either by a success we cannot explain . . . or by a likewise inexplicable failure” (pp. 24, 25, 25-26)
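    (A minimal code sketch of the modelling loop McCarty describes here appears after the list below.)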

  • Lisa Samuels and Jerome J. McGann, “Deformance and Interpretation,” New Literary History 30, No. 1 (Winter, 1999):

    “The usual object of interpretation is “meaning,” or some set of ideas that can be cast in thematic form. These meanings are sought in different ways: as though resident ‘in’ the work, or evoked through ‘reader-response,’ or deconstructable through a process that would reinstall a structure of intelligibility at a higher, more critical level…. In this paper we want to propose–or recall–another way of engaging imaginative work…. The alternative moves to break beyond conceptual analysis into the kinds of knowledge involved in performative operations–a practice of everyday imaginative life. We will argue that concept-based interpretation, reading along thematic lines, is itself best understood as a particular type of performative and rhetorical operation…. In an undated fragment on a leaf of stationery, Emily Dickinson wrote what appears to be one of her ‘letters to the world’: ‘Did you ever read one of her Poems backward, because the plunge from the front overturned you? I sometimes (often have, many times) have–a Something overtakes the Mind’ (Prose Fragment 30)…. Our deformations do not flee from the question, or the generation, of ‘meaning.’ Rather, they try to demonstrate–the way one demonstrates how to make something, or do something … that ‘meaning’ in imaginative work is a secondary phenomenon, a kind of meta-data, what Blake called a form of worship ‘Dependent’ upon some primary poetical tale. This point of view explains why, in our deformative maneuvers, interpretive lines of thought spin out of some initial nondiscursive ‘experiment’ with the primary materials. ‘Meaning’ is important not as explanation but as residue. It is what is left behind after the experiment has been run” (pp. 26, 48).

  • The unstable continuum between modeling and interpreting:

    • Model
    • Adaptation
    • Translation
    • Performance
    • Rendering
    • Simulation
    • Deformance
    • Edition
    • Interpretation
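
As flagged above, here is a minimal sketch of the modelling loop McCarty describes: specify a structured vocabulary of word-forms, run it against a text, and let the trivial errors of omission prompt the next iteration. The vocabulary below is an invented assumption for illustration; the sample line is the opening of Keats's "Ode on a Grecian Urn."

```python
import re

# A hypothetical structured vocabulary for a "concentration" one believes a text has,
# e.g., imagery of enclosure and stillness in a deeply familiar poem.
vocabulary = {
    "enclosure": {"urn", "bower", "cell", "frame"},
    "stillness": {"still", "silent", "quiet"},
}

sample_text = "Thou still unravish'd bride of quietness, thou foster-child of silence"

def model_pass(text: str, vocab: dict) -> dict:
    """One iteration of the modelling loop: record which word-forms actually appear."""
    words = re.findall(r"[a-z']+", text.lower())
    return {label: sorted(w for w in words if w in forms)
            for label, forms in vocab.items()}

print(model_pass(sample_text, vocabulary))
# -> {'enclosure': [], 'stillness': ['still']}
# The text's "quietness" and "silence" slip past the vocabulary's "quiet" and "silent":
# trivial errors of omission in McCarty's sense. Revising the vocabulary and rerunning
# is the "perfective iteration" through which surprise eventually replaces error.
```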

Question for This Seminar
Selected UCSB English Department Digital Initiatives

Department Projects

Collaborative Research or Curricular Development Projects

Digital Technology and Transdisciplinarity

Paradigmatic Transdisciplinary Question

Literature+

Experimental Courses

Question for This Seminar

What is the future of “interpretation”?


University of Chicago Press, 2008, 392 pages, ISBN-10: 0226486966, ISBN-13: 978-0226486963

[Catalog copy]

Driven by global economic forces to innovate, today’s society paradoxically looks forward to the future while staring only at the nearest, most local present–the most recent financial quarter, the latest artistic movement, the instant message or blog post at the top of the screen. Postmodernity is lived, it seems, at the end of history.

In the essays collected in Local Transcendence, Alan Liu takes the pulse of such postmodern historicism by tracking two leading indicators of its acceleration in the late twentieth and early twenty-first centuries: postmodern cultural criticism–including the new historicism, the new cultural history, cultural anthropology, the new pragmatism, and postmodern and postindustrial theory–and digital information technology. What is the relation between the new historicist anecdote and the database field, Liu asks, and can either have a critical function in the age of postmodern historicism? Local Transcendence includes two previously unpublished essays and a synthetic introduction, in which Liu traverses from his earlier work on the theory of historicism to his recent studies of information culture to propose a theory of contingent method incorporating a special inflection of history: media history.

“This book is a reflection of and on a nearly twenty-year career. It is as much a work of history as of literary and cultural critique, as much a narrative and a piece of performance art as it is philosophical investigation and Nietzschean genealogy. Alan Liu is sui generis.”

–Marjorie Levinson, University of Michigan

“Following the magnificent achievement of The Laws of Cool, Alan Liu in Local Transcendence takes on the problem he astutely identified as deeply connected with the ‘cool’: the loss of historical grounding and consequent restructuring of identities by postindustrial corporations. Offering a rigorous yet humane critique of new historicism and cultural criticism from the inside, he interrogates the possibilities for historical grounding in the age of information in a witty prose style and a capacious field of reference. Local Transcendence is required reading for anyone interested in the multiple conjunctions, oppositions, and synergies between information, historicism, and cultural context.”

–N. Katherine Hayles, University of California, Los Angeles

“Before he turned to digital humanities, Alan Liu once posed the key question for the new historicism: what’s the connection? What’s the connection, for example, between two juxtaposed details or anecdotes in a cultural field? Now he has reframed both inquiries with a broader question that raises the level of both the game and its stakes: what is the connection between the ‘new historicism’ and the ‘new media’? The result is a book that addresses the central question of the ‘link’ itself in our age and that links the link not only conceptually but also historically. It is a book for anyone interested in how disciplinary and technological innovation in the humanities have informed each other over these past two decades.”

–James Chandler, University of Chicago
Table of Contents

Acknowledgements

Introduction: Contingent Methods

1. The Power of Formalism: The New Historicism

2. Trying Cultural Criticism: Wordsworth and Subversion

3. Local Transcendence: Cultural Criticism, Postmodernism, and the Romanticism of Detail

4. Remembering the Spruce Goose: Historicism and Postmodernism

5. The New Historicism and the Work of Mourning

6. The Interdisciplinary War Machine

7. Sidney’s Technology: A Critique by Technology of Literary History

8. Transcendental Data: Toward a Cultural History and Aesthetics of the New Encoded Discourse

9. Escaping History: New Historicism, Databases, and Contingency

Selected UCSB English Department Digital Initiatives

1. Department Projects

3. Transformation Triggered by Digital Technology

4. Global Humanism & Transdisciplinarity


Suggestions for a 21st-Century English Department
  1. English Departments should hire to clusters of topical or project-centered interests (e.g., literature and global media, literature and science, literature and terror) that have the potential both to foster collaboration within the department and to link up to campus-wide initiatives. Considerations of historical or field specialization should be secondary (such considerations should not be a priori, but should be generated as part of robust topics and projects).
  2. Every three years, each senior faculty member should be asked to teach a new course on a period, topic, or approach in which they are complete novices or are very uncomfortable.
  3. To foster a more genuine relation between research and teaching, one or two courses in a faculty member’s load each year should be workshop- or lab-style courses in which faculty work alongside students (grad, undergrad, or both) to produce something (e.g., an essay, a web resource, an edition, a conference, a film). At the extreme, such a course would start with no syllabus.
  4. Using teleconferencing or virtual-immersion information technology (e.g., Second Life instructional spaces), English departments at major research institutions in the U.S. should co-teach classes (if not whole courses) with instructors from significantly different areas of the world or different kinds of educational institutions. What do the topics and approaches that matter to “us” (e.g., identity, ethnicity, aesthetics, theory, culture, popular culture) look like when brought into dialogue with the needs and assumptions of students in Europe, Africa, or the East, students at a different grade level, adult students, students from a different social class, etc.?
  5. Today the assumptions that divide, and unite, “literary interpretation” and “creative writing” in a literature department should be rethought in a larger social context that privileges over both poles of that binary such goals as “innovation,” “collaboration,” and “entertainment.” In the globally competitive age of innovate-or-die and critique-by-radio-talk-show-or-blog, scholars entrenched in either interpretive critique or avant-garde creativity seem to be fighting some past war.
  6. English Departments should borrow paradigms from such departments as Engineering to establish robust, proactive internship programs that place students in a variety of for-profit, non-profit, and other organizations. Such an internship program should have a high level of visibility and supervision in the department–e.g., supported by an adviser who visits area businesses, arranges field trips for students, etc.
  7. English Departments should have a “public humanities” initiative with public events and outreach missions. Such an initiative should be coordinated alongside an extramural fund-raising campaign of the sort that other disciplines organize.
Selected UCSB English Department Digital Initiatives

1. Solo and Small-Team Projects

4. Global Humanism & Transdisciplinarity


1. Small-Team Digital Projects

Selected UCSB English Department Small-Team Projects

Some (Tentative) Principles of Small-Team Projects

  • Team model + POST method (Forrester Research group on POST)
    • Team-first versus project-first philosophy (i.e., the difference between the academy and business)
    • Differentiation of skills/tasks
    • Parity of interests and intellectual engagement in project (“mindshare” problem)
    • Simultaneous work tracks (i.e., avoid “engineer-first” project design)
    • Project collaboration logistics:
      • hands-on supervision
      • lead research assistant
      • weekly face-to-face meetings
      • content-management-system as staging ground for work (or equivalent: blogs, wikis, Google Docs, etc.)

2. Large, Distributed Digital Projects

Selected UCSB/UC Large Collaborative Projects

Some Problems of Large, Distributed Projects

  • Normal humanities large-scale formats for working and sharing interim results not fully useful (e.g., conferences, editions)
    • Cost and logistics issues
    • Not collaborative-goal oriented
  • Need to scale modularly into multiple small-team groups (e.g., Transliteracies research working groups)
  • Need to cross between disciplines
  • Need to bridge across multiple geographical locations
    • Small-scale face-to-face workshops
    • Remote meetings
    • Asynchronous use of video/audio recordings
    • Staff-to-staff financial coordination / group and task-oriented budget reporting
Suggestions for a 21st-Century English Department
  1. English Departments should hire to clusters of topical or project-centered interests (e.g., literature and global media, literature and science, literature and terror) that have the potential both to foster collaboration within the department and to link up to campus-wide initiatives. Considerations of historical or field specialization should be secondary (such considerations should not be a priori, but should be generated as part of robust topics and projects).
  2. Every three years, each senior faculty member should be asked to teach a new course on a period, topic, or approach in which they are complete novices or are very uncomfortable.
  3. To foster a more genuine relation between research and teaching, one or two courses in a faculty member’s load each year should be workshop- or lab-style courses in which faculty work alongside students (grad, undergrad, or both) to produce something (e.g., an essay, a web resource, an edition, a conference, a film). At the extreme, such a course would start with no syllabus.
  4. Using teleconferencing or virtual-immersion information technology (e.g., Second Life instructional spaces), English departments at major research institutions in the U.S. should co-teach classes (if not whole courses) with instructors from significantly different areas of the world or different kinds of educational institutions. What do the topics and approaches that matter to “us” (e.g., identity, ethnicity, aesthetics, theory, culture, popular culture) look like when brought into dialgue with the needs and assumptions of students in Europe, Africa, or the East, students at a different grade level, adult students, students from a different social class, etc.?
  5. Today the assumptions that divide, and unite, “literary interpretation” and “creative writing” in a literature department should be rethought in a larger social context that privileges over both poles of that binary such desiderata as “innovation,” “collaboration,” and “entertainment.” In the globally competitive age of innovate-or-die and critique-by-radio-talk-show-or-blog, scholars entrenched in either interpretive critique or avant-garde creativity seem to be fighting some past war.
  6. English Departments should borrow paradigms from such departments as Engineering to establish robust, proactive internship programs that place students in a variety of for-profit, non-profit, and other organizations. Such an internship program should have a high level of visibility and supervision in the department–e.g., supported by an adviser who visits area businesses, arranges field trips for students, etc.
  7. English Departments should have a “public humanities” initiative with public events and outreach missions. Such an initiative should be coordinated alongside an extramural fund-raising campaign of the sort that other disciplines organize.

 

Digital Humanities (1):
Selected UCSB English Department Digital Initiatives

Small-Team Projects

Collaborative Research or Curricular Development Projects

Selected Quotations and Concepts

  • Franco Moretti, Graphs, Maps, Trees: Abstract Models for a Literary History (2005):

    “But within that old territory [of literature], a new object of study: instead of concrete, individual works, a trio of artificial constructs–graphs, maps, and trees–in which the reality of the text undergoes a process of deliberate reduction and abstraction. ‘Distant reading,’ I have once called this type of approach; where distance is however not an obstacle, but a specific form of knowledge: fewer elements, hence a sharper sense of their overall interconnection. Shapes, relations, structures. Forms. Models” (p. 1).

  • Willard McCarty, Humanities Computing (2005):

    “By ‘modelling’ I mean the heuristic process of constructing and manipulating models: a ‘model’ I take to be either a representation of something for purposes of study, or a design for realizing something new…. Two effects of computing sharpen the distinction between ‘concept’ on the one hand and the ‘model’ on the other: first, the computational demand for tractability, i.e. for complete explicitness and absolute consistency; second, the manipulability that a digital representation provides…. Take, for example, knowledge one might have of a particular concentration in a deeply familiar work of literature. In modelling one begins by privileging this knowledge, however wrong it might later turn out to be, then building a computational representation of it, e.g. by specifying a structured vocabulary of word-forms in a text-analysis tool. In the initial stages of use, this model would be almost certain to reveal trivial errors of omission and commission. Gradually, however, through perfective iteration trivial error is replaced by meaningful surprise . . . either by a success we cannot explain . . . or by a likewise inexplicable failure” (pp. 24, 25, 25-26)

  • Lisa Samuels and Jerome J. McGann, “Deformance and Interpretation,” New Literary History 30, No. 1 (Winter, 1999):

    “The usual object of interpretation is “meaning,” or some set of ideas that can be cast in thematic form. These meanings are sought in different ways: as though resident ‘in’ the work, or evoked through ‘reader-response,’ or deconstructable through a process that would reinstall a structure of intelligibility at a higher, more critical level…. In this paper we want to propose–or recall–another way of engaging imaginative work…. The alternative moves to break beyond conceptual analysis into the kinds of knowledge involved in performative operations–a practice of everyday imaginative life. We will argue that concept-based interpretation, reading along thematic lines, is itself best understood as a particular type of performative and rhetorical operation…. In an undated fragment on a leaf of stationery, Emily Dickinson wrote what appears to be one of her ‘letters to the world’: ‘Did you ever read one of her Poems backward, because the plunge from the front overturned you? I sometimes (often have, many times) have–a Something overtakes the Mind’ (Prose Fragment 30)…. Our deformations do not flee from the question, or the generation, of ‘meaning.’ Rather, they try to demonstrate–the way one demonstrates how to make something, or do something … that ‘meaning’ in imaginative work is a secondary phenomenon, a kind of meta-data, what Blake called a form of worship ‘Dependent’ upon some primary poetical tale. This point of view explains why, in our deformative maneuvers, interpretive lines of thought spin out of some initial nondiscursive ‘experiment’ with the primary materials. ‘Meaning’ is important not as explanation but as residue. It is what is left behind after the experiment has been run” (pp. 26, 48).

  • The unstable continuum between modeling and interpreting:
    • Model
    • Adaptation
    • Rendering
    • Translation
    • Imitation
    • Simulation
    • Deformance
    • Edition
    • Interpretation
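
McCarty’s account of modelling, in which one encodes a hunch about a familiar text as “a structured vocabulary of word-forms” and lets the tool expose “errors of omission and commission,” can likewise be sketched in a few lines. The Python snippet below is a hedged, hypothetical example: the vocabulary, the sample line (the opening of “I Wandered Lonely as a Cloud”), and the function names are placeholders rather than anything drawn from McCarty’s text.

    # A minimal, hypothetical "model" in McCarty's sense: a structured vocabulary
    # of word-forms checked against a text, so that trivial misses and surprises
    # become visible for the next iteration of the model.

    import re
    from collections import Counter

    def apply_model(text, vocabulary):
        """Count vocabulary hits and list the word-forms the model omits."""
        words = re.findall(r"[a-z']+", text.lower())
        hits = Counter(w for w in words if w in vocabulary)
        omissions = [w for w in words if w not in vocabulary]
        return hits, omissions

    # Illustrative "nature" vocabulary and a familiar line of verse.
    vocabulary = {"cloud", "vale", "hill", "daffodil", "breeze"}
    text = "I wandered lonely as a cloud that floats on high o'er vales and hills"

    hits, omissions = apply_model(text, vocabulary)
    print("hits:", hits)            # the model catches "cloud"
    print("omissions:", omissions)  # note that "vales" and "hills" escape the singular forms

The trivial mismatch between “vales” and “vale” is exactly the kind of error of omission McCarty expects a first pass to surface; refining the vocabulary is the “perfective iteration” he describes.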

 

Larger Thesis

Global Humanism

Global humanism is not an older classical or Enlightenment universal humanism–the idea that, as Sir Joshua Reynolds said, there is a “central form” of humanity. And it is also not the modernizing ideal of melting-pot or fusion humanism. Global humanism is not universality or fusion but, as we now say, diversity; not culture but multiculturalism.

(Of course, these latter terms are overused today, but that does not mean that they are just cliché or banal. They are very much alive because the larger social and semantic frameworks that give them meaning are still in the process of collision and adjustment. Diversity and multiculturalism as understood in the academic humanities today, for instance, abut uncomfortably against the usage of those terms in such other frameworks as neo-corporatism, neo-nationalism, and even neo-regionalism.)

 

Diversity as Interdisciplinarity

But understanding global humanism requires a diversity rather than a harmony of disciplinary methods capable of revealing the seams between alternative understandings of the “human”–e.g., economic, social, political, historical, cognitive, cultural. Indeed, it may be that we do not have meaningful diversity unless it comprises lived experience that refuses to fit into any single, stable organization of the various human knowledges. A case in point would be so-called “marginal” peoples who have almost no global economic or political presence but enormous local cultural, aesthetic, and historical presence only uneasily meshed with the institutions and laws of the new global world order.

 

Digital Humanities (2):
The Difference That New Media Technologies Make

Evolutionary Changes

  • Authorship ➝ collaboration, open-source, anonymity, piracy, Wikipedia
  • Refereeing ➝ not peer-review but after-the-fact-review
  • Publication ➝ not publishers but databases and search engines; not proprietary but open-access or mash-up (open-API)
  • Reading ➝ blogs, wikis, social networking (social computing)
  • Interpretation ➝ data-mining, data-visualization, etc. (see the sketch after this list)
  • Critical Judgement ➝ reputation, trust, information credibility
  • Teaching ➝ Co-building
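
To illustrate the “Interpretation ➝ data-mining, data-visualization” item above, the sketch below shows the shape of a toy distant-reading pass in Python: count one feature across a (here, two-text) corpus and chart the counts. The corpus keys name the volumes in which the quoted first lines appeared, but the feature and the chart are placeholders rather than real findings.

    # A toy "distant reading" pass: count one feature across a tiny corpus and
    # print a crude bar chart. The corpus and the counted feature are placeholders.

    corpus = {
        "Lyrical Ballads (1798)": "I heard a thousand blended notes",
        "Poems, in Two Volumes (1807)": "I wandered lonely as a cloud",
    }

    def feature_counts(texts, feature):
        """Count case-insensitive whole-word occurrences of `feature` per text."""
        return {title: text.lower().split().count(feature.lower())
                for title, text in texts.items()}

    for title, n in feature_counts(corpus, "cloud").items():
        print(f"{title:30} {'#' * n} ({n})")

Scaled up from two first lines to thousands of titles, this is the move from concrete, individual works to the graphs, maps, and trees of Moretti’s abstract models.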

Revolutionary Changes

  • New media is changing every single discipline I know, both instrumentally and to the core, in ways similar to the humanities.
  • New media is thus bringing each discipline’s basic paradigm of knowledge into fundamental (not just superficial) collision. (Science: interface as metaphor.) (Art: engineering.)
  • This collision of paradigms occurs not just intra-academically but across social sectors: business (e.g., spreadsheets, team collaboration).
  • Hence: “Literature+”
  • Hence: reshaping the profession (see my “Suggestions for a 21st-Century English department”).

 

What the Humanities Offer in Return
  • Ambiguity: areas where qualitative judgements have to be made based on data in which there are crucial quantitative gaps for technical or social reasons.
  • Big Humanities:
    • KAREN (Kiwi Advanced Research and Education Network)
    • Cathy Davidson on “Big Humanities” (e.g., the Shoah Foundation visual/online testimony project with its 200 terabytes of data).
    • NEH/DOE Humanities High Performance Computing Program (“The goal of the program is to provide opportunities for humanities scholars whose research requires high performance computing to collaborate with computer scientists and others at centers already familiar with the challenges of intensive data mining, visualization, and other demanding applications.”)

 

Materials Related to Digital Strategy, Digital Capability Development, and the Humanities in New Zealand

Selected Quotations:

1. First sentences of the Foreword to “The Digital Strategy” by the Minister of Information Technology and of Communications (David Cunliffe): “There’s a buzz about New Zealand right now. We have vibrant communities. We have innovative people and companies at the creative cutting-edge.”

2. From “The Digital Strategy”: “It is important that we keep all the dimensions of the Digital Strategy in line. Content, Connection, and Confidence are the three enablers. Connection is necessary but not sufficient — it simply provides the means. Confidence gives us the skills and a secure online environment, whilst accessing or creating Content provides a compelling reason to make it happen.”

3. In the “The Digital Strategy,” the section on “Why We Need a Digital Strategy” begins: “The information we access through digital technologies can promote innovation, increase productivity, and enrich the quality of our lives. Content creation is not only a global business — now it can be anyone’s business. Using digital technologies to create and access our distinctive cultural content enhances our identity as New Zealanders. ICT helps us unlock our stores of national content, making them accessible to all, and it is a powerful tool for directing and expressing our creativity.”

4. From Draft New Zealand Digital Content Strategy: “The appropriate mechanisms are also needed to unlock New Zealand’s stock of current and future content, in part to provide a supply of high quality content to stimulate demand and uptake of digital technology. In stimulating demand for content however, we must also protect, preserve and promote our heritage and cultural identities, in an environment open to being swamped by the widening access to international content. Maori language, knowledge and culture, a vital part of New Zealand’s identity, is particularly vulnerable to being drowned out or appropriated by international interests unless adequately protected.”

5. From an appendix of the Council of Humanities “Research Policy Paper” (the appendix is a table titled “Sketch of the Cultural Knowledge Research System”): “Research Mode: Primarily interpretive, but including creative and social scientific methodologies. Research Outcomes: Including: Peer-reviewed academic research, contract research, catalogues . . . , conferences and seminars, [etc.]”

Observations

  • The primary goal of the national digital strategy is to bring New Zealand front and center as a postindustrial “knowledge society” in which the premium value is “innovative” or “creative” knowledge.
  • But one of the distinctive premium values of New Zealand is heritage, including Maori language and culture.
  • A national digital strategy should allow New Zealand fully to access—and fully be accessed by—global informational, economic, social, and cultural networks.
  • But New Zealand must protect itself from those global networks.
  • The driver of the whole digital strategy is national “content,” which is to be “preserved.”
  • Except when it is being “unlocked.”
  • Neither of those verbs has any apparent relation to the master verbs of the strategy: “create” and “innovate.”
  • Capability-development initiatives in support of the national digital strategy are BIG (KAREN, BESTGRID, Cultural Portals, etc.).
  • But much of the distinctive culture and heritage of the nation starts small: at the level of the local “community,” which wouldn’t know what to do with a GRID if it met one.

Selected Web Sites

Collaborative Research or Curricular Development Projects

Sterling Publishing, 2003, 48 pp., 35 illustrations, ISBN-10: 0806982772, ISBN-13: 978-0806982779

Table of Contents

Introduction

I. Nature

  • “I Wandered Lonely as a Cloud”
  • “To a Butterfly”
  • “Inscriptions Supposed to be Found in and near a Hermit’s Cell, 1818: III”
  • “Lines Written in Early Spring”
  • “My Heart Leaps Up When I Behold”

II. Children & Young People

  • “The Reverie of Poor Susan”
  • “A Slumber Did My Spirit Seal”
  • “Lucy Gray: or, Solitude”
  • “The Solitary Reaper”
  • “Alice Fell: or, Poverty”

III. The Present and the Past

  • “The Two April Mornings”
  • “Composed Upon Westminster Bridge, September 3, 1802”
  • “It is a Beauteous Evening, Calm and Free”
  • “The White Doe of Rylstone (Excerpt)”
  • “Surprised by Joy”

IV. Scenes from The Prelude

  • The Stolen Boat
  • The Boy of Winander
  • Climbing Mt. Snowdon

V. Growing Up

  • From “Ode: Intimations of Immortality from Recollections of Early Childhood”

This is the professional home page of Alan Liu, Professor of English, University of California, Santa Barbara (UCSB). The English Department at UCSB also maintains a less complete bio page for Alan Liu.

See also Nothing Transcendental: Alan Liu’s Ad Hoc Site for Ordinary Business: “Here, the ordinary and routine business of professional life finds shelter from the pressure to be any more than it simply is.”

Stanford Univ. Press, 1989, 726 pp., ISBN-10: 0804718938, ISBN-13: 978-0804718936

front cover
[Catalog copy: original description on hardcover jacket]

The imaginative power of Wordsworth’s poetry stems from a denial of history so strong and precise that denial itself—the determined absence of history—must be studied as positive fact. The author argues this thesis with the aid of substantial methodological innovations allowing the best of formalist, deconstructive, and New Historicist reading strategies to be synthesized and informed by a wealth of historical matter. Drawing upon recent advances in the history and theory of the French Revolution, art history, economic history, family history, and the social history of the Lake District, he shows that history—however absent it seems to be—influences literature deeply at the level of form. In particular, the most telling register of historical change and perception in Wordsworth’s poetry is generic transformation. Studying the works of the early and middle years intensively, and the later works suggestively, the author argues that Wordsworth’s overall shift from description to narrative, and from narrative to lyric, is a mimetic denial of contemporary cultural history. By the time “imagination” invests lyric imagery, it has learned to capture history within an empire of self that is no less than a surrogate history, a facsimile ideology.

Part One of the book introduces the subject by rereading the Simplon Pass episode in The Prelude as a denial of Napoleon’s Alpine crossing of 1800. It then formulates a methodology of historical reading by witnessing in the modern and postmodern notion of “context” a developing collaboration between formalist and materialist perspectives. The “matter” of history, the author argues, is collectively structured, witnessed, and uttered absence; and the reading of history is therefore a discrimination of forms of absence. When a city or a cottage is effaced, there is left only the nothing that is the constitutive basis of conventions of difference—of hate, prejudicial discrimination, “nation,” “culture,” and, as one of the most discriminating of cultural discriminations, the differential forms of art.

back cover

Part Two draws upon art history, political history, contemporary journalism, and narrative theory to study the formal collision between Wordsworth’s early picturesque and the predominantly narrative mode of French Revolutionary violence. Out of this collision, “time” arose as the massive denial of history, giving the poet his first authority separate from the “People.” In chapters entitled “The Tragedy of the Family,” “The Economy of Lyric,” and “A Transformed Revolution,” Part Three traces the development of authority into the “originality” of the poet’s mature ideology of autobiography. Part Four concludes the work by pointing ahead in Wordsworth’s corpus toward “The Idea of the Memorial Tour” and the self-critical stance of a poet whose quintessential act was to “collect” himself. The book ends with a brief epilogue on history and critical self-consciousness.

[Jacket Illustration: La Journée des Brouettes (or Préparatifs de la Fête de la Fédération au Champ de Mars, Juillet 1790) by Étienne-Charles Le Guay. Musée Carnavalet]

Table of Contents
back cover

Part I. Introduction

  1. The History in "Imagination"
  2. History, Literature, Form

Part II. Violence and Time: A Study in Poetic Emergence

    Before Time

  1. The Politics of the Picturesque: An Evening Walk
    • Motive and Motif
    • From Form to Institution
    • Toward the Indescribable
  2. The Poetics of Violence
    • London: The Silence
    • Paris: The Story
  3. A First Time: Descriptive Sketches, Salisbury Plain
    • The Tragedy of Nature
    • Unexplained Violence
    • The Terror of Time

Part III. The Flight of Forms: A Study of Poetic Individuation

    Lyric and Empire

  1. The Tragedy of the Family: The Borderers
    • A Question of Legitimacy
    • The Crime of the Family
    • Toward a Discourse of Self
  2. The Economy of Lyric: The Ruined Cottage
    • The Value of Imagery
    • The Economy of Debt
    • Peddling Poetry
  3. A Transformed Revolution: The Prelude, Books 9-13
    • The Contest of Genres
    • Autobiography and Ideology

Part IV. Conclusion

  1. The Idea of the Memorial Tour: "Composed Upon Westminster Bridge" (excerpt)

    Epilogue

Appendix
Notes
References
Index

University of Chicago Press, 2004, 552 pages, ISBN-10: 0226486990, ISBN-13: 978-0226486994 (fuller precis of book)

front cover
[Catalogue copy]

“Knowledge work” is now the reigning business paradigm and affects even the world of higher education. But what perspective can the knowledge of the humanities and arts contribute to a world of knowledge work whose primary mission is business? And what is the role of information technology as both the servant of the knowledge economy and the medium of a new technological cool?

In The Laws of Cool, Alan Liu reflects on these questions as he considers the emergence of new information technologies and their profound influence on the forms and practices of knowledge. Liu first explores the nature of postindustrial corporate culture, studies the rise of digital technologies, and charts their dramatic effect on business. He then shows how such technologies have given rise to a new high-tech culture of cool. At the core of this book are an assessment of this new cool and a measured consideration of its potential and limitations as a popular new humanism.

According to Liu, cool at once mimics and resists the postindustrial credo of innovation and creative destruction, which holds that the old must perpetually give way to the new. Information, he maintains, is no longer used by the cool just to revolutionize human knowledge—it is also used to resist it. What counts as cool today, however, is too frequently narrow, shallow, and self-centered. The challenge for the humanities, then, is to help redefine cool and to use technology in a way that mediates between knowledge work and a fuller lifework glimpsed in historical lives and works.

A study of enormous scope, ambition, and intellect, The Laws of Cool provides an indispensable account of knowledge work today and its future.

[Original draft of catalog copy]

In The Laws of Cool, Alan Liu thinks about knowledge work in contemporary society from the viewpoint of the historical, critical, and aesthetic knowledges valued by the humanities and arts. He also looks through the glass in the other direction to reflect on the evolving nature of the humanities and arts under the pressure of the newly dominant, corporate knowledge cultures of lifelong learning, learning organizations, team work, and diversity management. Liu’s pivotal topic is information technology and its semi-autonomous culture of cool (as in Web pages so cool that they thwart the flow of information). Information cool, as he calls it, is now the symptom not just of consumer culture but of a producer culture–the culture of the cubicle–that seeks an “ethos of the unknown” within the world of knowledge work.

back cover

Liu draws on contemporary business theory, sociology, anthropology, art, literature, literary theory, cultural studies, history of information technology, and Internet and new media theory to create an argument that is at once historical, formal, and theoretical. After articulating the concept of postindustrial knowledge work, he narrates the rise of information technology in the workplace and the cognate rise of cool subcultures, countercultures, and cubicle “intracultures.” He then focuses on the formal, technical, social, and political features of high-tech “information cool” and concludes with a sustained reflection–and some practical suggestions–on how the humanities and arts can help educate the contemporary generations of cool.

One of Liu’s special concerns is the emergence of new “destructively creative” or viral arts that resist the postindustrial credo of innovation or what economist Joseph Schumpeter called creative destruction. Another is the current humanities emphasis on historicist critique, which also reevaluates the process of creative destruction. How might these twin tendencies in recent humanities and arts collaborate, he asks, to help shape the well-being–or wealth in a deeper sense–of the new classes of knowledge workers who spend their days and nights staring at a computer screen and wishing they were cool?

Since the early 1990s, Liu has built on his work in literary history, theory, and cultural criticism by exploring contemporary information culture through a number of technology projects, including his Voice of the Shuttle Web site and Transcriptions: Literary History and the Culture of Information (the NEH-funded research and curricular development initiative he directs). The Laws of Cool harvests the practical and theoretical experience gained in such projects.

(See fuller precis of book)

book spine
Table of Contents

Introduction: Literature and Creative Destruction

Part I. The New Enlightenment

Preface “Unnice Work”: Knowledge Work and the Academy

  1. The Idea of Knowledge Work

Part II. Ice Ages

Preface “We Work Here, But We’re Cool”

  1. Automating
  2. Informating
  3. Networking

Part III. The Laws of Cool

Preface “What’s Cool?” (excerpt)

  1. The Ethos of Information
  2. Information is Style
  3. The Feeling of Information
  4. Cyber-Politics and Bad Attitude

Part IV. Humanities and Arts in the Age of Knowledge Work

Preface “More”

  1. The Tribe of Cool
  2. Historicizing Cool: Humanities in the Information Age
  3. Destructive Creativity: The Arts in the Information Age
  4. Speaking of History: Toward an Alliance of New Humanities and New Arts (With a Prolegomenon on the Future Literary)

Epilogue

Appendices

  1. Taxonomy of Knowledge Work
  2. Chronology of Downsizing
  3. “Ethical Hacking” and Art
Profession 2000: 186-88. This reply is in response to a “Letter to the Editor” by William Pitsenberger in Profession 2000 (185-86) regarding Alan Liu’s “Knowledge in the Age of Knowledge Work,” which had appeared in Profession 1999. The following is the full text of the “Reply.”

With his combined background in business management, business law, and graduate literary studies, William Pitsenberger is uniquely placed to follow up on my call for the academy and business to engage each other critically. “Suppose instead,” he says, “that the training in critical analysis with which those with advanced degrees in literature are armed were brought into the business community in a way that offered that community new kinds of value—understanding, for example, how business texts can be read, what contradictions exist between those texts and the desired message, and how to resolve those contradictions?”

This is an imaginative vision of humanities scholarship as a new missionary activity, one that attempts to offer business not just “skills” and “tools” (to which Pitsenberger admirably refuses to reduce the issues), not even just “value” (or, as he says later, “best use of academic training”), but instead “new kinds of value.” It would be interesting for a group of experienced managers and professionals from both sides of the business/academy divide to sit down together to judge whether this idea has merit and how it could be implemented—whether in a consultancy, training workshop, internship program, or something else.

I would like to take the occasion here, however, to put Pitsenberger’s prescription in broader perspective. If the goal is to offer business the critical understanding it needs to make wise use of the texts of contemporary management literature—whose now ample and influential body of works is by turns insightful, cruel, heedless, and shallow—then the best general term I know for such an enterprise is still education. In this light, what Pitsenberger’s suggestion makes me wonder about is the very role of education today. In the “knowledge economy,” education occurs across a whole lifetime in an unprecedented variety of social sectors, institutions, and media—not just schools, community colleges, and universities, for instance, but also businesses, broadcast media, the Internet, even the manuals or “tutorials” that accompany software applications. Education, in other words, is now a decentralized field where no one institution any longer individually corners the market and where the sheer dispersion of the kinds and scales of learning—all the way from programs leading to degrees to CNN “factoids” leading only to the next commercial—is dizzying. Given this context, I think, the relevant question becomes: where can society most responsibly and effectively place the training in critical analysis that Pitsenberger suggests? Is it in consultancies or reading groups (workshops, team exercises, and focus groups) within corporations? Is it within the academy in humanities departments, on the hoary theory that the best way to insert critical understanding in society is to teach well the students destined to enter that society? (The humanities could thus teach contemporary management theory with the same critical perspective it brings to any other past theory of civilization, which is what management theory really is in its grandest ambition.) Or, because of the importance to business of non-textual knowledges not easily amenable to learning “how business texts can be read” (a point I owe to my colleague, Christopher Newfield, who also studies business and the academy), should we instead look to the sciences to develop courses on the critical understanding of numerical analysis or to the media industry to sponsor programs on the critical use of images and music? Perhaps the best question: how can society create the most inclusive, flexible, and intelligently interrelated mix of such options to take care of all its citizens hungry to “know”?

None of these questions are rhetorical; all are open. I suspect that they will not be solved from the top down by adding more representatives from government, media, etc., to the panel of business and education managers I imagined above. Rather, the work will begin from the bottom—through efforts by those like Pitsenberger who might want to try innovating a business training workshop here or an internship program there; and also by those working within the academy to introduce works of business literature among other works we ask students to read critically. (See the following course on “The Culture of Information” for my own example:
http://transcriptions.english.ucsb.edu/archive/courses/liu/english236/)

“The Interdisciplinary War Machine (The Theory of Interdisciplinary Studies).” Harvard University. 1 December 1994.