The following writings by Alan Liu are available online in open-access full text form.
- “The Meaning of the Digital Humanities.” PMLA 128 (2013): 409-23.
(Post-embargo published version in institutional repository, PDF)
Citation: “Drafts for Against the Cultural Singularity (book in progress).” Alan Liu, 2 May 2016. http://liu.english.ucsb.edu/drafts-for-against-the-cultural-singularity
The following is draft work (notes and bibliography not included) from one of my books in progress tentatively titled Against the Cultural Singularity: Digital Humanities & Critical Infrastructure Studies. Excerpted are a few portions from the beginning of the manuscript that bear on the critical potential of the digital humanities and critique.
For a talk including this material as well as additional excerpts from my book in progress, see the video recording of my contribution to the Workshop on “Frontiers of DH: Humanities Systems Infrastructure,” University of Canterbury, 12 November 2015 (delivered as part of a series in New Zealand during my Fulbright Specialist residency at U. Canterbury, October-November 2015).
2 May 2016
My aim in this book is to make a strategic intervention in the development of the digital humanities. Following up on my 2012 essay, “Where is Cultural Criticism in the Digital Humanities?”, I call for digital humanities research and development informed by, and able to influence, the way scholarship, teaching, administration, support services, labor practices, and even development and investment strategies in higher education intersect with society, where a significant channel of the intersection between the academy and other social sectors, at once symbolic and instrumental, consists in shared but contested information-technology infrastructures. I first lay out in the book a methodological framework for understanding how the digital humanities can develop a mode of critical infrastructure studies. I then offer a prospectus for the kinds of infrastructure (not only research “cyberinfrastructures,” as they have been called) whose development the digital humanities might help create or guide. And I close with thoughts on how the digital humanities can contribute to ameliorating the very idea of “development”–technological, socioeconomic, and cultural–today.
The first step–framing for the digital humanities a suitable methodological framework for critical digital infrastructure studies–is challenging, given that the digital humanities are maturing after the late twentieth-century bloom of humanities “theory” and “cultural criticism,” which I here group together (grosso modo) under the name “critique.” The latecomer status of the digital humanities in this regard is epitomized in the field’s debate a few years ago about “hack versus yack.” Should digital humanists primarily program, build, or make (hack)? Or should they instead critically interpret and theorize information media, past and present, in a manner much like normative humanities research (yack)? At core, the debate is not really about theorized critique versus something other than such critique. Instead, the debate situates the digital humanities at a fork between two branches of late humanities critique. One, a hack branch (sometimes referred to as “critical making”), affiliates with, but is often more concretely pragmatic than, “thing theory,” the new materialism, actor-network theory, assemblage theory, and similar late poststructuralist theories. The other, a yack branch, descends from the not unrelated critical traditions of Frankfurt School “critical theory,” deconstruction, Foucauldian “archaeology,” cultural materialism, postcolonial theory, and gender and race theory–especially as all these have now been inflected by media studies.
In short, the question is not whether the digital humanities should include theorized critique. At some level, and especially in some branches, the field already does by virtue simply of belonging to the family of the contemporary humanities. Instead, the question is what sort of critique is uniquely appropriate and purposive for the digital humanities. What critique, in other words, not only allows the field to assist mainstream humanities critique but could not be conducted except through digital humanities methods that use technology self-reflexively as part of the very condition, and not just facility, of critically knowing and acting on culture today?
The answer to this question, I suggest, is critique at the level of, and articulated through, infrastructure–where “infrastructure,” the social-cum-technological milieu that at once enables the fulfillment of human experience and enforces constraints on that experience, today has much of the same scale, complexity, and general cultural impact as the idea of “culture” itself. Indeed, it may be that in late modernity when the bulk of life and work occurs in organizational institutions of one kind or another, the experience of infrastructure at institutional scales (undergirded by national or regional infrastructures such as electricity grids and global-scale infrastructures such as the Internet) is operationally the experience of “culture.” Put another way, the word “infrastructure” can now give us the same kind of general purchase on social complexity that Stuart Hall, Raymond Williams, and others sought when they reached for their all-purpose word, “culture.” Consider the way dystopian films produced at the onset of the digital information age such as Blade Runner (1982) and the Mad Max films (beginning in 1979) characterized whole cultures by foregrounding infrastructure–in the former: glistening, noir cityscapes defined by transportation and media technology; in the latter: desert landscapes defined by fuel and water supply systems. Those films gave a foretaste of the way late-modern infrastructure is increasingly the mise-en-scène of culture. Daily life steeps us in pervasive encounters with transportation, media, and other infrastructures that do not just neutrally convey the experience of culture but are visibly parts of our cultural experience. Late modernity is thus car culture, cable TV culture, Internet culture, smartphone culture, and any other kind of “cool” culture where, as I studied in my Laws of Cool, “cool” is a cultural affect of both “smart” technologies and the knowledge workers who use them to be, or at least look, smart.
The consequence of such convergence between infrastructure and culture for critique may be predicted as follows: especially in the digital humanities, critique must now begin to focus on infrastructure in order to have any hope of creating tomorrow’s equivalents of the great cultural-critical statements of the past. Tomorrow’s E. P. Thompson writing about the making of the working class, C. Wright Mills about white collars, Raymond Williams about culture and society, Michel Foucault about discipline, Judith Butler about gender and performativity, Donna Haraway about cyborgs, or Homi Bhabha about hybridity–among many more who could be cited–will need to include in their critiques attention to infrastructure as that cyborg being whose making, working, disciplining, performance, gender formation, and hybridity are increasingly part of the core identity of late modern culture.
What would the method for such a digital humanities cultural criticism focused on infrastructure actually look like? [material elided here] . . . [P]rosaically, the style of digital humanities infrastructural critique I imagine–one that takes advantage of modes of thinking already prevalent in the field–may be called lightly-antifoundationalist. The question that I concoct this phrase to answer is how much antifoundationalism–or, perhaps “anti-groundwork” (to allude to Marx’s Grundrisse der Kritik der politischen Ökonomie)–is actually useful for critical infrastructure studies. Mainstream humanistic critique (the “hermeneutics of suspicion” that Rita Felski has recently taken to task in her critique of critique) has often been antifoundationalist all the way down according to a three-stage logic that might be outlined as follows.
In its first logical moment, critique recognizes that the “real,” “true,” or “lawful” groundwork (i.e., infrastructure) for anything, especially the things that matter most to people, such as the allocation of goods or the assignation of identity, is ungrounded. For example, while there are material reasons for resource allocation and the social relations of force needed to do that dirty deed–i.e., for political economy and society–any particular political economy and society are arbitrary and, in the last analysis, unjust. Political economy and society are thus not grounds but, to play on the word, precisely groundworks: particular ways of working the ground (i.e., a mode of production) supported by discursive, epistemic, psychic, and cultural institutions for ensuring that the work continues in the absence of rational or moral foundation.
In its second logical moment, critique then goes antifoundationalist to the second degree by criticizing its own standing in the political-economic system–a recursion effect attested in now familiar, post-May-1968 worries that critics themselves are complicit in elitism, “embourgeoisement,” “recuperation,” “containment,” and majoritarian identity, not to mention tenure.
Finally, in its third logical moment, critique seeks to turn its complicity to advantage–for example, by positioning critics as what Foucault called embedded or “specific intellectuals” acting on a particular institutional scene to steer social forces. A related idea is to go “tactical” in the manner theorized by Michel de Certeau, who argued that people immured in any system can appropriate that system’s infrastructure through bottom-up agency for deviant purposes (as in his paradigm of jaywalking in the city). Media critics, including new media critics, have generalized de Certeau’s notion in the name of “tactical media,” meaning media whose platforms, channels, interfaces, and representations can be appropriated by users for alternative ends.
In general, the digital humanities tend to do things according to methods that slice out from the above total arc of critique just the latter tactical moment. Such slicing–hacking critique to sever its roots from purist antifoundationalism–brings digital humanities critique into the orbit of several late- or post-critical approaches with a similar style (style rather than full-blown theory precisely because they eschew foundational purity). One approach that James Smithies has associated with the digital humanities is “postfoundationalism” in his “Digital Humanities, Postfoundationalism, Postindustrial Culture.” Borrowing from the philosopher of science Dimitri Ginev, Smithies argues that postfoundationalism is “an intellectual position that balances a distrust of grand narrative with an acceptance that methods honed over centuries and supported by independently verified evidence can lead, if not to Truth itself, then closer to it than we were before” (¶ 26). Postfoundationalism is thus well matched to the digital humanities, Smithies suggests, if we think of the digital humanities as “a process of continuous methodological and . . . theoretical refinement that produces research outputs as snapshots of an ongoing activity rather than the culmination of ‘completed’ research” (¶ 29). A related idea is “critical technical practice,” which Michael Dieter (“The Virtues of Critical Digital Practice”)–building on Philip Agre’s writings on artificial intelligence research–makes a goal of the digital humanities. Dieter quotes from Agre: “The word ‘critical’ here does not call for pessimism and destruction but rather for an expanded understanding of the conditions and goals of technical work. . . . 
Instead of seeking foundations it would embrace the impossibility of foundations, guiding itself by a continually unfolding awareness of its own workings as a historically specific practice.” Other ideas that are lightly-antifoundationalist in this way, though not to my knowledge yet applied to the digital humanities, include Bruno Latour’s “compositionism” (fixed on neither absolute foundations of knowledge nor absolutist refutations of such foundations but instead on mixed, impure, make-do, and can-do compositions of multiple positions; “An Attempt at a ‘Compositionist Manifesto’,” PDF) and Ackbar Abbas and David Theo Goldberg’s “poor theory” (which uses “tools at hand” and “limited resources” to engage “with heterogeneous probings, fragmentary thinking, and open-endedness” in resistance to “totalization, restriction, and closure”) (“Poor Theory: Notes Toward a Manifesto”, PDF).
All these lightly-antifoundationalist approaches are tactical rather than strategically pure because their very potential for critique arises from polluting proximity to, and sometimes even partnership with, their objects of critique. Unlike distantiated critique, that is, tactical critique (as the root of the word “tactic” might indicate) makes contact. Smithies thus notes postfoundationalism’s function as a “bridging concept” for the “interdependence” and “entanglement” of the digital humanities with postindustrialism (¶ 8, 3, 2). Indeed, I add that all the approaches thus far mentioned as a “light foundation” for critical infrastructure studies are similarly contaminated by the double principle of efficiency and flexibility, which (as I articulated in my The Laws of Cool) is the two-stroke engine of the postindustrial mode of production. As it were, all the approaches I have mentioned are instances of “lean” and “just-in-time” critique and thus not dissimilar in spirit to the in-house critique that postindustrial corporations at the end of the twentieth century began to design into their own production lines by famously empowering workers to “stop the line” ad hoc (or, less catastrophically, to suggest incremental improvements) when they saw something wrong. Such dirty contact with postindustrialism is both the weakness and strength of lightly-antifoundationalist approaches, where weakness means being swallowed up by the system and strength comes from getting close enough to the system to know its critical points of inflection, difference, and change. If, as Smithies says, the digital humanities are “deeply entangled” in postindustrialism, in other words, entanglement need not be the same as equivalence. It is also engagement.
The critical potential of this tendency in the digital humanities to be lightly-antifoundationalist can now be stated: it is precisely the ability to treat infrastructure not as a foundation, a given, but instead as a tactical medium that opens the possibility of critical infrastructure studies as a mode of cultural studies. And it is such cultural studies that will allow the digital humanities to fulfill their final-cause critical function at the present time, which is to help adjudicate how academic infrastructure connects higher education to, but also differentiates it from, the workings of other institutions in advanced technological societies. The critical function of the digital humanities going forward, in other words, is to assist in shaping smart, ethical academic infrastructures that not only further normative academic work (research, pedagogy, advising, administration, etc.) but also intelligently transfer some, but not all, values and practices in both directions between higher education and today’s other powerful institutions–business, law, medicine, government, the media, the creative industries, NGOs, and so on.
At present, some of the most influential general understandings of infrastructure cited by digital humanists such as Sheila Anderson and James Smithies studying humanities cyberinfrastructure in particular have been the Large Technical Systems (LTS) approach, stemming originally from the historian Thomas Hughes’s Networks of Power (1983), and the information-ethnography approach stemming from Susan Leigh Star, Geoffrey Bowker, and their circle. Good expositions of both are combined in one of the best conceptualizations of infrastructure I have so far found: a document of 2007 titled “Understanding Infrastructure: Dynamics, Tensions, and Design” (PDF) (whose authors include Bowker) representing the final report to the National Science Foundation of a workshop it sponsored.
Adding to these general approaches to infrastructure, I borrow in this book another portfolio of thought that to my knowledge has not yet been introduced directly to infrastructure studies. It is also a portfolio largely unknown in the digital humanities and, for that matter, in the humanities as a whole even though it is broadly compatible with humanities cultural criticism. The portfolio consists of the “neoinstitutionalist” approach to organizations in sociology and, highly consonant with it, “social constructionist” (especially “adaptive structuration”) approaches to organizational infrastructure in sociology and information science. Taken together, these approaches explore how organizations are structured as social institutions by so-called “carriers” of beliefs and practices (i.e., culture), among which information-technology infrastructure is increasingly crucial. Importantly, these approaches are a social-science version of what I have called lightly-antifoundationalist. Scholars in these areas “see through” the supposed rationality of organizations and their supporting infrastructures to the fact that they are indeed social institutions with all the irrationality that implies. But they are less interested in exposing the ungrounded nature of organizational institutions and infrastructures (as if it were possible to avoid or get outside them) than in illuminating, and pragmatically guiding, the agencies and factors involved in their making and remaking. Such approaches are thus inherently a good match for the epistemology of building, unbuilding, and rebuilding in the digital humanities.
More than a good match, neoinstitutionalism and the social science of organizational technologies offer exactly the right tactical opening for a digital humanities cultural criticism because they are all about the site on which the already existing critical force of the digital humanities is pent up: institutional forms of technologically-assisted knowledge work. After all, the digital humanities stand in contrast to new media studies and network critique among cousin fields as the branch of digitally-focused humanities work that has been primarily focused on changing research, authorship, dissemination, and teaching inside (and across) academic institutions and related cultural or heritage institutions rather than on broader commentary directed externally at society and social justice. The digital humanities are all about developing analytical, publishing, curatorial, and hybrid-pedagogical tools and practices at scales ranging from standalone projects to federated or regional frameworks; creating new university programs and centers; changing the accepted notion of academic careers (e.g., to include “alt-ac” alternative academic careers); and, ultimately, instilling a new scholarly digital ethos in the academy in the name of “collaboration” and “open access.” As a consequence, the existing critical energy of the digital humanities–sometimes quite passionate and even militant–has been primarily devoted to such institutional issues. Breaking down the paywalls of closed publication infrastructures, for instance, is the digital humanities version of storming a university administration building in the 1970s.
Can neoinstitutional and social-structuration-of-technology approaches to understanding the evolving relation between the academic institution and today’s more domineering institutions (most notably, business and government) help the digital humanities release its intramural critical energy? Can that release help propel not just change in higher education but, through higher education and the technological infrastructures that mediate its relationship to other institutions, also extramural change in the larger society that higher ed contributes to? (Besides its focus on culture, I note, one of the special strengths of neoinstitutionalism that makes it attractive to add to Large Technical System analyses of infrastructure is that it is especially attuned to studying change and divergence among dominant institutional systems.) In short, can the considerable existing intelligence, idealism, and moral force of the digital humanities be redirected from being only an instrument of institution work to becoming through interventions in instrumental infrastructure also a way to act on institutions and their wider social impact?
But I do not wish to overreach, which is also why I think an approach focused on institutions and their infrastructures is particularly appropriate. Ultimately, the digital humanities field must be critical in a way that does not ask it inauthentically to reach beyond its expertise and mandate to bear exaggerated responsibility for larger social phenomena. Acting out through the digital humanities about larger social issues is necessary. But such actions must be complemented by creating infrastructures and practices that make their social impact by being what Susan Leigh Star called “boundary objects”–in this case boundary objects situated between the academic institution and other major social institutions. It is in this boundary zone–just as one example, “content management system” infrastructures whose use by scholars oscillates between corporate “managed” and “open community” philosophies–that higher education can most pertinently influence, and be influenced by, other institutions through what I earlier called “shared but contested information-technology infrastructures.” It is in this boundary zone of hybrid scholarly, pedagogical, and administrative institutional infrastructure that we need the attention of skilled and thoughtful digital humanists, even if the interventions they make are not called anything as ambitious as “activism” but instead simply “building.”
[End of excerpt]
Sessions at the 2016 MLA Convention related to digital humanities research, teaching, or the direction of the DH field, with some overlap with new media studies, writing studies, editing, and other topics. (Some sessions listed here are not centrally on DH but include at least one relevant paper.)
This list is compiled by Alan Liu, U. California, Santa Barbara (with kudos to Mark Sample for the idea, based on his listing of MLA DH sessions at previous MLA conventions). Please send Alan corrections and notices of sessions he has missed: firstname.lastname@example.org
List last revised: 23 Dec 2015
Peter de Bolla, “Digital Knowledge: Format, Scale, and the Information-knowledge Parallax at the Total Knowledge Horizon — A Reply to Alan Liu”
Citation: Peter de Bolla, “Digital Knowledge: Format, Scale, and the Information-knowledge Parallax at the Total Knowledge Horizon — A Reply to Alan Liu.” 15 November 2014. http://liu.english.ucsb.edu/peter-de-bolla-reply-to-alan-lius-theses-on-the-epistemology-of-the-digital/
The following was written by Peter de Bolla of Cambridge University in reply to Alan Liu’s “Theses on the Epistemology of the Digital,” a solicited follow-up to Liu’s participation in the second planning consultation session of the Cambridge University Centre for Digital Knowledge (CDK). Held on 7 May 2014 at the Cambridge Centre for Research in the Arts, Social Sciences and Humanities (CRASSH), the consultation session focused on “digital epistemology,” one of the two intended thematic strands of the Centre for Digital Knowledge. A previous planning consultation at CRASSH focused on the other intended strand of “digital society.”
At the consultation session that prompted Liu’s “Theses,” de Bolla initiated proceedings by reading a not-yet-published paper on the epistemology of the digital that served as a provocation for the proceedings.
(November 2015; posted here 15 November 2015)
It is clear, as Alan Liu points out in his “Theses on the Epistemology of the Digital,” that knowledge is not the first thing that comes to mind when we turn our attention to the digital. As he notes, we more commonly think of the digital as having considerable impact on the social, economic, political and cultural. This, it seems to me, is because our primary engagement with the digital is at the level of its effects and affordances with respect to communication, information storage and retrieval, statistical inference and manipulation. And the overwhelming majority of us use the digital as if it were analogue: the interfaces we are comfortable with simulate analogue forms. This is why the question of knowledge is so far back, so buried in our encounter with the format of the digital.
It is true that programmers take a different route, but even in this case the question of knowledge is not particularly to the fore as programmers attend to system more than episteme. They are most concerned to get the ontology of their programmes to function efficiently. Perhaps this focus on “ontology” rather than “episteme” reveals a truth: the ontology of computation (where “ontology” is now being used in its philosophical sense) is order and sequence. In contrast to this sense, “ontology” is used in programming to mean “a hierarchically organized classification system associated with a controlled, structured vocabulary that is designed to serve the retrieval and integration of data” [Ceusters and Smith, Switching Codes].
The Cambridge University Centre for Digital Knowledge (CDK) proposes to put the question of knowledge to the fore in our attempts to understand the difference that is made by the move from the analogue to the digital. This project is, at least initially, focussed on the difference in format. Seen from this perspective the alterations in the social, political, economic and cultural that arise when digital technologies become ubiquitous are epiphenomenal: these changes, important as they surely are, tend to obscure the alteration in episteme that occurs when analogue materials are migrated into digital format. This alteration is, I think, indicated in Liu’s observation that the “unique quality, or quantum, that is digital knowledge” involves “rebalancing the values of quality and quantity.” Put another way, data is knowledge in computational environments. The problem for us humans is that at the quantum level of data we are unable to perceive that knowledge. It is only at higher orders of data configuration that we are able to transform information into larger bits that we identify as knowledge. But to the computer this transformation is not one of type or category–a changing epistemology–it is merely one of scale. So, in concert with Liu’s remark that the CDK will need to “let go of too fixed an adherence to established modern ideas of knowledge,” we mean to push hard at the gates which sort information from knowledge.
It is manifestly clear that, as Liu points out, the landscape within which knowledge is produced and disseminated has changed significantly in recent times. There are new “systems, forms, and standards of knowledge” which pit “algorithmic” against “philosophical” knowledge, or “multimedia instead of print-codex knowledge.” The question for the CDK, however, is once again: are these changes epiphenomenal? Or, to put that more carefully, has our attention to these alterations so far been less directed to how epistemic shifts are at play than to the effects of these shifts? Liu’s attention to the flatness of knowledge in the realm of big or crowd data (in contrast to my “vertical axiology”) is, I think, directed at one of these effects. The distribution of information, its access and pliability all fall within a flat terrain where the “flatness” is the price of entry to the information. That is to say there is no–or very little–cost. But is that the same thing as a “flat epistemology”? As Liu puts it: “the wisdom of the crowd challenges the very notion of an epistemology.” Once more this seems to be a question of scale: is it the case that at this level of quanta one cannot see a theory of knowledge? Or, perhaps as important, at this level one does not need a theory of knowledge since the mass circulation of and access to information works as if it were knowledge.
I am not sure that this account is fully satisfying or convincing. Perhaps we need a more complex topography in order to see what is going on. If we were to plot the terms “information,” “knowledge,” “opinion” within a multidimensional space–a vector space–it might be possible to begin to see how these slightly different epistemic identifications overlap and connect, disconnect and repel. Here are some thought experiments. If one way of seeing knowledge is this: “knowledge is information which, when it is physically embodied in a suitable environment, tends to cause itself to remain so” (David Deutsch), and one way of seeing the wisdom of the crowd is “mass circulation opinion,” then one of the vectors we would need in order to plot the topology would be time or persistence. Mass circulation opinion has, for example, a very distinctive temporality that might be represented as a wave form that corresponds to the volume of traffic at time t. Knowledge, in the Deutsch formulation, tends towards stability and might be represented as a flat line. The temporality of information might be represented as a set of discontinuities as older information becomes updated and replaced by more recent information. A vector space model would seek to plot these data points within a planar matrix one of whose axes would be temporality. Another might be quantity. This would seek to plot the topology within which expert knowledge, the knowledge of the few, is distributed against mass circulation opinion, the knowledge of many. What measures would be helpful here? Would this help one to identify the range of measure x within which opinion tends towards knowledge? This, I think, might be close to Liu’s observation that knowledge “may not be either truth or story but just a probability distribution.”
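The three temporalities sketched above lend themselves to a toy computational illustration. The following is a minimal sketch, not an implementation of any model de Bolla proposes: the specific curve shapes (a sine wave for opinion, a constant for knowledge, a sawtooth for information) and all parameter values are illustrative assumptions.

```python
import math

def opinion(t, period=30.0):
    """Mass-circulation opinion: a wave form whose height tracks
    the volume of traffic at time t (hypothetical sine shape)."""
    return 0.5 * (1 + math.sin(2 * math.pi * t / period))

def knowledge(t):
    """Knowledge in Deutsch's formulation tends to persist:
    represented here as a flat line."""
    return 1.0

def information(t, refresh=25.0):
    """Information as a set of discontinuities: its value decays until
    older information is updated and replaced at each refresh point."""
    return 1.0 - (t % refresh) / refresh

# Sample each epistemic kind along the temporality axis of the
# planar matrix described above.
timeline = range(0, 100)
trajectories = {
    "opinion": [opinion(t) for t in timeline],
    "knowledge": [knowledge(t) for t in timeline],
    "information": [information(t) for t in timeline],
}
```

Plotting these trajectories (or extending each sample with a quantity coordinate) would yield the kind of two-axis topology the paragraph imagines, within which one could ask where the opinion curve begins to approximate the flat line of knowledge.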
If one were now to return to the initiating question for the CDK–the issue of digital format–a similar observation might be made. The topology I have begun to sketch could equally be applied to the scalar issue. If one were to plot quanta of data in a vector space alongside persistence of accepted knowledge it might be possible to see more clearly how information at the very basic level of computational format (zero-one) contains within it knowledge, but knowledge that can only be seen at very high levels of data compression. Put this way knowledge decomposes into information at the quantum level. This seems to me to speak to Liu’s observation that “the humanistic and quantum universes of uncertainty are doppelgängers of each other.” It is a matter of scale. From the massive scale of knowledge in the humanities uncertainty appears as ambiguity; from the minuscule scale of data in the quantum world ambiguity appears as uncertainty.
This brings me to the question of the distinctiveness of the humanities. From whichever way I look at this issue it seems to me that we obscure too much when we maintain disciplinary coherence. The humanities are wedded to verification just as the non-humanities are. The humanities give considerable weight to accuracy just as the non-humanities do. But in saying this we should not seek to make all knowledge wear the same overcoat. There are distributions within one kind of knowledge (call it humanistic) that are distinct from other kinds. But I am less convinced than some that these distributions are fixed. Still less am I convinced that they ought to stay fixed. By and large what we identify as humanities scholarship contains a mix–a distribution–of opinion, information and knowledge. Up until now the accreditation for this distribution of epistemic entities has been underwritten by a set of practices that give considerable cultural capital to individuals (so-called experts) and institutions. What we might call single researcher accreditation protocols underwrite the opinions of literary critics. Outside the humanities the blend of opinion, information and knowledge is not quite the same. The hard sciences, for example, use the rhetoric of observer independent protocols for verifying information. Accreditation for opinions here stems from the repeatability of observational verification protocols. Single person cultural capital also applies, but it is tempered by mass and repeated observation. As long as the information persists and supports the opinion, knowledge is said to be gained.
It seems to me that what Liu identifies as the explosion of knowledge in the digital domain–“from crowds, people outside expert institutions, people outside formal organisations entirely”–consists in a remix of the distributions in a vector space that plots opinion, information and knowledge. This remix is not distinctively humanistic. In some ways it is “trans-humanistic.” Indeed we may be witnessing the decomposition of the humanities as knowledge decomposes into data or information. Put the other way around, we may be witnessing the recomposition of information as knowledge in the digital domain. And that applies to all types or kinds of knowledge; it is not unique to what heretofore we have called the humanities.
So let’s try and fit this around Liu’s sketch of a future humanities scholarship–one that has shed the protective skin of the “discourse model.” In place of the “one” reading and writing a book–the lone humanist–we will find collective and collaborative enterprise distributed in expansive networks of communication and experimentation. This, to my mind, will not be humanities 2.0 (or some such) but digital knowledge work. In this domain interpretation and critique will no longer be centre stage (though there will be no necessary reason to jettison them entirely); occupying the foreground will be information–its gathering and manipulation so as to reveal what knowledge lies within the quantum level of information as data. Pattern recognition as much as pattern building will be the primary tasks for the agent who seeks to reveal this knowledge, and the machine (computer) will certainly have equal agential responsibilities in these tasks. There will doubtless be those who see such a sketch as irredeemably anti-humanist, but the clear direction in which the contemporary world is travelling is toward making the division between the human and the non-human far less monolithic. Things, animals and machines are more useful and friendly to humans when we begin to investigate the ways in which they may have or obtain agency or quasi-agency. The more we understand the quantum universe of biology, the clearer it becomes that at the lowest level of life the domain of operation is digital. Pattern recognition and pattern building are what we, humans, do and are made of. To that extent the most humanistic enquiry is, therefore, digital. Once we accept this it becomes possible to see how much we have to learn from opening out to inspection digital knowledge–that is, information held in a format that creates knowing from bits that are sub-opinion or only recognisable as knowledge at very high densities.
It is not that we stand to gain an enormous revolution in what we know–that by definition would be impossible–but rather in how it comes to be known.
I agree with Liu that one of the ways this can be implemented is to build digital objects–hardware, software, algorithms and so forth–since this is a very testing methodology for exploring how the computer thinks. In this case there is a kind of “craft knowledge” that is only really accessible to those who make. But we should take care to insist that this is but one way. Others include embracing the techniques and technologies of advanced and advancing computer science–not the off-the-peg packages designed to fit problems already identified but the future-horizon thinking that pushes at the boundaries of computation or the digital. What if we seek to produce a “total knowledge horizon” in which dynamic contextualisation operates at the largest scale of data repository and inspection? In a domain in which everything is potentially capable of finding relation with everything, the task will be to sort out the noise from the significant signal. Although this may well involve building in the first sense above, it is as likely to involve building in the sense of conceptualising what digital knowledge might be at the “total knowledge horizon.” That’s the thing I hope we might aspire to build together.
Citation: “Theses on the Epistemology of the Digital: Advice For the Cambridge Centre for Digital Knowledge.” Alan Liu, 14 August 2014. http://liu.english.ucsb.edu/theses-on-the-epistemology-of-the-digital-page/
The following was written as a solicited follow-up to my participation in the second planning consultation session of the Cambridge University Centre for Digital Knowledge. The session, held on 7 May 2014 at the Cambridge Centre for Research in the Arts, Social Sciences and Humanities (CRASSH), focused on “digital epistemology,” one of the two intended thematic strands of the Centre for Digital Knowledge. A previous planning consultation at CRASSH that I did not attend focused on the other intended strand of “digital society.”
My theses below are meant more as provocation than as prescription; and they do not take account of plans that may have been put in place for the Centre for Digital Knowledge since the planning consultations.
14 August 2014
Establishing a Centre for Digital Knowledge oriented around “digital epistemology” will require a laser-sharp focus on making “knowledge” a productive framework for understanding the digital age. This framework must be robust enough to compete with such more common gestalts as “society,” “politics,” “culture,” and “economy” (represented in such phrases as “information society,” “surveillance society,” “social media,” “online culture,” “information economy,” etc.). The proposed Centre for Digital Knowledge can generate its agenda by deliberately harnessing the tension between knowledge (including ideals of academic knowledge shaped by the German research university model and the Enlightenment) and social, cultural, and economic understandings of the digital age.
After all, knowledge today is not intuitively the first thing that comes to mind when thinking about the digital, even in regard to such iconic artifacts of the Internet as Wikipedia that ought by rights to hew to the Enlightenment tradition of the Encyclopédie. Not only do Wikipedia’s “no original research” and “notability” principles abridge the idea of knowledge, but its most distinctive traits as a knowledge enterprise are characterized in social terms such as “open” and “community.” And this is before we even come to the identification of the digital with such knowledge-“lite” paradigms as entertainment.
For many, therefore, the digital is not primarily a mutation in knowledge. It is a social change. Social-science and other disciplines operating on this premise treat the digital as a phenomenon of “communication” (“ICT”: “information and communication technology”) impacting social practices, institutions, and organizations [example]. Or the digital facilitates political change. Political scientists or sociologists who study the Internet see it as a testing ground for new kinds of organizing, protest, voting, and other virtual realpolitik [example]. Or, again, the digital marks a cultural change. Disciplines such as “new media studies” and “network critique”–extending British, European, and American traditions of cultural criticism–treat the digital as a domain of contested identity, gender, ethnicity, ideology, affect, privacy, and so on [example]. And, yet again, the digital is an economic change. Economists and organization theorists (chorused by business journalists and business consultants) see the digital as a proxy for the postindustrial reorganization of capital [example].
Amid this clash of paradigms, the specific mission of a Centre for Digital Knowledge should be to illuminate–we may say, “reenlighten”–the knowledge overshadowed by other major views of the digital. Why is it, for instance, that business theorists discuss “knowledge work” in ways that say everything about work but almost nothing about knowledge [example]? What is the actual knowledge embedded in the society, politics, culture, and economy of the digital with their faux-knowledges of “information,” “wisdom of the crowd,” “knowledge bases,” “smart phones,” etc.?
The Centre for Digital Knowledge can design a sequence of events, activities, and outputs that foreground the specific force of digital knowledge amid digital society, politics, culture, and economy. For example, one cycle of Centre activities could focus on how the production and circulation of digital academic knowledge (or of specific “knowledge artifacts”; see provisional plan below) compares to crowdsourcing or social networking. A second could explore how new ideologies of scholarly open-access and open peer review compare to the politics of “open-source” and “open government.” A third could focus on the relation between traditional expert cultures (including but not limited to academic culture) and the new open-source knowledge cultures. And a fourth could focus on the uncanny convergence/divergence between the digitization of scholarly archives (e.g., of traditional restricted-access or closed-stack research libraries) and the economics of monetized proprietary databases (e.g., Google’s). All these cycles of activities would have in common the goal of sifting the sands of the digital for the unique quality, or quantum, that is digital knowledge (where rebalancing the values of quality and quantity is itself a problem of the epistemology of the digital comparable to similar recalculations of value in the social, political, cultural, and economic digital realms).
But alluding to the Enlightenment forecloses as much as it discloses. An honest effort to grapple with digital knowledge will also require the Centre for Digital Knowledge to let go of too fixed an adherence to established modern ideas of knowledge (here simplistically branded “Enlightenment”). Those ideas are bound up with philosophical, media-specific (print, codex), institutional (academic and other expert-faculty), and “public sphere” configurations of knowledge that co-evolved as the modern system of knowledge. But today there are new systems, forms, and standards of knowledge, including some that refute or make unrecognizable each of the modern configurations mentioned above–e.g., algorithmic instead of philosophical knowledge, multimedia instead of print-codex knowledge, autodidactic or crowdsourced instead of institutional knowledge, and paradoxically “open”/”private” (even encrypted) instead of public-sphere knowledge.
In this light, Peter de Bolla’s incisive “provocation” paper on digital knowledge (presented 7 May 2014 at the start of the second planning consultation for the proposed Centre for Digital Knowledge held at Cambridge University’s CRASSH Center) is revealing for its frequent rhetorical reliance on two prepositions: “under” and “beneath” (used to query the foundations under or beneath the digital). Evidenced in this rhetoric is an inverted Platonic Divided Line that locates essential knowledge not high above but–in the modern tradition that runs from Kant’s “conditions of possibility” through Foucault’s “archaeology of knowledge”–deep below.
But it is unclear that the epistemology of the digital respects, or should respect, a vertical axiology of truth. Some of the most important dimensions of the digital extend laterally in networked, distributed, and other “inch-deep but mile-wide” formations. Big data or crowd data is bottom-up data, not high data (in the sense of “high church” or “high Latin”). In this regard, the Facebook-era cliché of “the social graph” is symptomatic. Used with the definite article in discussions of social networking, the social graph (commonly reified in visual graphs of nodes and links) has become the icon of a flat epistemology with just two secular dimensions (who knows whom) oblivious to any Platonic or Kantian higher dimension.
In the digital age, in other words, the “wisdom of the crowd” challenges the very notion of an epistemology, or philosophy, of knowledge. If we were to juxtapose the Enlightenment with the digital age, we might say that (a) the French Revolution put paid to philosophy (and philosophes) by advancing a mob mentality that later nineteenth-century “historicists” (and twentieth-century revisionary historians of the Revolution such as François Furet) could only “know” by displacing the Revolutionary “idea” into notions of “spirit [Geist],” “rumor,” “representation,” etc.; and (b) the “digital commons” and “open” movement now represents the resurgence of a similar crowd knowledge challenging scholars. Then and now, the difficulty is that the object of inquiry puts in question the knowledge-standards of scholarly inquiry itself. Circa 1790, for example, people in Paris “knew” who was an “aristocrat” to be denounced to the local Watch Committee because “everyone knows.” After 2000, with the onset of Web 2.0 and social media, people similarly know who the “celebs” are (not to mention more plebeian “friends” and “followers”) because Facebook, Twitter, etc. know. Pity scholars who want to know what such “knowing” means but are constrained by rigorous older standards of “critical” knowledge–a position like being the only person on Facebook who doesn’t “like” anything.
A similar incommensurability between old and new epistemologies applies in temporal terms. Instead of valuing enduring or permanent truths (the temporal version of “high” knowledge), the digital age is preoccupied with information of much shorter durations–time spans plunging down to the diurnal rhythm of blog posts, the microseconds of a data packet’s “TTL” (defined “time to live”), and even the gigahertz clock rate of a computer’s CPU. Originally, after all, Facebook and Twitter both prompted their users for “status updates” with variants of the hyper-immediate question: “What are you doing now?” Nor is it just a matter of the short durée but also of different temporal rhythms. Digital knowledge moves through computers and networks in fitful, robotic ballets of inhumanly precise starts and stops that fatally deform the early-twentieth-century Bergsonian intuition of flow and even the late-twentieth-century McLuhan intuition of media flow or field. Today the time of knowledge belongs to the invisible order of “micro-temporality” theorized by such media archaeologists as Wolfgang Ernst.
So, too, the incommensurability of digital epistemology can be formulated in terms of “uncertainty.” After all, digital knowledge often verges into or draws on stochastic processes that are native to our current scientific epistemology of statistical, probabilistic knowledge. Probability theory and the world view it models (e.g., the quantum-mechanical view of the universe) undercut the foundation of any knowledge that, in order to count as knowledge, needs definite subjects and predicates linked in narrative syntax of the sort that Boris Tomashevsky instanced in his definition of a thematic “motif.” Tomashevsky’s example of a motif: “Raskolnikov kills the old woman.” To conform to today’s scientific world view, we would have to rewrite that sentence approximately as follows: “There is a 74% chance that in this document Raskolnikov kills (82%) / wounds (15%) / ignores (3%) the old woman (68%) / young woman (23%) / other (9%).” (Those familiar with “topic modeling” in the digital humanities and other digital research fields will recognize that such a recasting of “motif” makes it resemble the probabilistic “topics” generated by the MALLET topic modeling tool.) In other words, the humanities today have a hard time adjusting to the idea that knowledge may not be either truth or story but just a probability distribution. Even the “ambiguity,” “paradox,” and “irony” that were the highest evolutions of humanistic knowledge valued by the New Critics seem to exist in an alternate cosmos from the equivalent uncertainties of quantum mechanics. Not Cleanth Brooks’s well-wrought urn, in other words, but Schrödinger’s cat. The New Critics equated the paradox of “Beauty is truth, truth beauty” (the line from John Keats’s “Ode on a Grecian Urn” that so exercised Brooks in The Well Wrought Urn) with the full richness of human reality, which they also called “experience” in consonance with John Dewey’s contemporaneous philosophy of experience.
In today’s scientific epistemology, by contrast, reality is defined by the collapse of the quantum wave front, as it were, into either beauty or truth, a binary decision state (consonant with the digital epistemology of 1 vs. 0) that nevertheless does not negate wonder at the unknowability of the paradoxically more real (but also less real because created from “virtual particles”) reality of the “quantum foam” underlying it all. The humanistic and quantum universes of uncertainty are doppelgängers of each other, incommensurable in difference and similarity.
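The probabilistic recasting of the motif sketched above can be made concrete in a few lines of code. The following is a toy illustration only (it is not MALLET or any actual topic-modeling algorithm, and all the counts are invented): it simply normalizes hypothetical corpus counts into probability distributions and renders the motif as a hedged, probabilistic sentence.

```python
# Toy illustration: a narrative "motif" recast as probability
# distributions, in the spirit of topic-model output.
# All counts below are invented for the sake of the example.

def normalize(counts):
    """Turn raw counts into a probability distribution summing to 1."""
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

# Hypothetical evidence: how often each reading of the motif
# appears across an imagined corpus of document passages.
verb_counts = {"kills": 82, "wounds": 15, "ignores": 3}
object_counts = {"old woman": 68, "young woman": 23, "other": 9}

verbs = normalize(verb_counts)
objects = normalize(object_counts)

# Render the motif as a probabilistic sentence: the most likely
# reading, annotated with how much probability mass it carries.
best_verb = max(verbs, key=verbs.get)
best_obj = max(objects, key=objects.get)
print(f"Raskolnikov {best_verb} ({verbs[best_verb]:.0%}) "
      f"the {best_obj} ({objects[best_obj]:.0%}).")
# prints: Raskolnikov kills (82%) the old woman (68%).
```

The point of the sketch is the data structure, not the arithmetic: the "motif" is no longer a sentence but a distribution over possible sentences, from which any single assertion is only a maximum-likelihood sample.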
In sum, there was knowledge; and today there are other kinds of knowledge that seem to come foaming up from the zero state of knowability not just in physics (and metaphysics) but in the epistemology of the digital–e.g., from crowds, people outside expert institutions, people outside formal organizations entirely, people from other parts of the world, and so on whose virtual knowledge seems as transient as virtual particles. That is one of the lessons of the digital.
A Centre for Digital Knowledge also needs to try out alternatives to the very form of an academic “centre,” since that form is vested in traditional ways of organizing knowledge production that the digital is currently reinvesting in a wider, differently articulated network of institutions, collectives, and media. “Neoinstitutional” theory combined with “adaptive structuration theory” (in the fields of sociology and organizational technology studies, respectively) helps us understand how the digital facilitates changes in organizational and institutional structures, especially those oriented toward knowledge work. For example, Wikipedia, open-source communities, etc., evidence how the once hallowed institutions of “expertise” (professional work in corporations, professorial work in universities, professional journalism, etc.) are being repositioned by the new technologies in unstable relation to networked “open” para-institutions of knowledge outside settled organizational fields.
It thus seems clear that a Centre for Digital Knowledge that relies solely on traditional institutional forms–even the now normative “interdisciplinary” form (e.g., a centre that creates weak-tie intersections among faculty in different fields)–will be cut off from some of the most robust conceptual and practical adventures of digital knowledge. A key test for the proposed Centre for Digital Knowledge, therefore, will be whether it is willing at least on occasion to accommodate non-standard forms of knowledge organization, production, presentation, exploration, and dissemination acclimated to the digital age or open to its networked ethos. Examples of such forms include “THATcamps” or “unconferences,” writing or coding “sprints,” design “charrettes,” online forums, events planned by non-academic invitees, cross-institutional collaboration (university to high school, university to newspaper, university to corporation, university to NGO, etc.), direct engagement with the public in online or face-to-face venues, and intellectual events planned not just by research faculty but also by teaching-first instructors, clerical staff, and students (to break down the divide between those tiers).
An additional desideratum is that the Centre should produce a replicable model for other academic (or hybrid academic/public-humanities) institutions, programs, and events that does not depend on the funding resources and “A-list” guest speakers of an elite university such as Cambridge. That is, the Centre should ensure that every event aspiring to be the academic equivalent of an Aspen Institute or TED Talks should be balanced by an event aspiring to be a THATcamp, beginner or early-career forum, project incubation workshop, regional all-institutions conference, or other forum that sows the seeds wide and far.
In modern times, the academic production and dissemination of humanities knowledge have run in a well-known discourse pattern (OED: “discourse” from “discursus action of running off in different directions, dispersal, action of running about”). With some exceptions (e.g., co-editions), humanities scholarly discourse runs, mutatis mutandis, as follows:
Reading & Research → Syllabi & Teaching notes → Talks → Articles → Monographs
Some traits associated with this program are dominant and others recessive. Solo agents of knowledge are dominant in the humanities. One reads and annotates a book; one designs a syllabus; one writes a paper; etc. By contrast, collective agency–the thick bunchings of academic life in meetings, reading groups, conferences, etc.–is recessive: either epiphenomenal (one would be writing that article anyway) or taken for granted as para-academic apparatus (e.g., the discourse between a scholar and editor that only occasionally comes to view in a book’s acknowledgements).
In terms of the acts rather than agents of humanities knowledge, interpretation and critique are dominant as the ends of knowledge, while observation and analysis are recessive as preliminaries to knowledge. Spanning in between are the acts of rhetoric and narrative that comprise the dispositio that William Germano (drawing on his experience as a former editor of humanities monographs) calls a book’s “throughline.”
Additionally, humanistic discourse has dominant and recessive styles. Through an act of introjection, many humanities scholars have come to believe that their dominant discourse should be of the same order of linguistic phenomena as their object of study. Since much of humanistic study concentrates on exceptional texts (e.g., literary works, pivotal historical speeches or documents), this means that higher value is ascribed to scholarly writings that at least to some degree are as resonantly crafted, nuanced, or elegant as complex literary language; as classically or biblically periodic as famous historical speeches; or otherwise as linguistically tour-de-force as some variant of the above. (Disclaimer: the present piece of humanistic writing is no exception, at least in its aims.) Even a humanities scholar’s spoken lectures are traditionally pre-scripted for high-pitch verbatim performance–an exercise that other disciplines such as the sciences and engineering view as bizarrely theatrical, not to mention fantastically inefficient for presenting data and conclusions.
Indeed, the issue of “data” in the humanities is increasingly acute in the digital age since it is a direct challenge to the privilege of high style. With some exceptions in fields like history, the humanities treat data as something to be embedded in discourse as part of the argument (or at least kept as close as a footnote or appendix at one remove). “Close reading” is an example of how the humanities fold data–the precise lines of poetry being interpreted, for instance–into argument. As a consequence, and by corollary with its stylistic ideal, the humanities create arguments that seem data-lite. After all, only so much concrete evidence can be folded into an argument without the prose taking on the poured concrete quality of many scientific or social-scientific articles with their masses of particulate citations–e.g., “Empirical studies adopting this social constructionist view of technology have been done by sociologists of technology (Bijker 1987; Bijker, Hughes and Pinch 1987; Collins 1987; Pinch and Bijker 1984, 1987; Woolgar 1985; Wynne 1988), and information technology researchers (Boland and Day 1982; Hirschheim, Klein and Newman 1987; Klein and Hirschheim 1983; Newman and Rosenberg 1985)” (source for this example [PDF]). Of course, the appearance of being data-lite belies the true heft and complexity of humanities data (where “data” here means low-level observational and descriptive information recorded in some structured pattern, as in the “images” or “paradoxes” Brooks accumulates in his Keats chapter in The Well Wrought Urn, whose title notably rejects the idea of explicit data: “Keats’s Sylvan Historian: History Without Footnotes”). First, there is a multiplier effect by which humanistic knowledge is attended by messy problems of missing, irregular, incommensurate, and ambiguous information that require much behind-the-scenes processing and adjudication (a post by Hugh Cayless on this issue).
Secondly, much underlying data in the humanities is implicit. Data inheres in entrained reading practices such that the “what is your data?” question typical in other disciplines is normatively answered in literary studies: “here’s the book; do a close reading yourself to see if my interpretation is persuasive.” And data also inheres silently in the stability of a massive infrastructure of book collections, curatorial staffs, bibliographies, metadata, and other apparatuses–i.e., the whole order of data to which even simple humanities citations (e.g., “see Cleanth Brooks”) really refer. Humanities data refers to “all that” (background editing, archiving, reading practices and apparatuses) even when, as in Brooks’s case, it seems to wear on its sleeve few, if any, footnotes. So long as libraries, books, or reading do not change, “all that” can be left unspoken as assumed knowledge.
By contrast, the sciences and social sciences (especially branches of the latter focused on quantitative or empirical research) cleave the orders of data and of argument so that they can be managed separately. Data is channeled through closed or open datasets, databases, repositories, etc.; while argument appears in pre-prints, conference proceedings, and journal publications. This separation allows for the creation, processing, maintenance, and presentation of data as a distinct workflow–one that can acquire independent value and even generate its own research problems (as in recent work on computationally assisted “data provenance” [example, PDF]). Scientific and social-scientific data can thus be presented or otherwise made available autonomously for critical inspection–a fact demonstrated, for example, in recent arguments for and against the data validity of Thomas Piketty’s Capital in the Twenty-First Century.
Humanities discourse has rarely needed to aspire to the same standards for making all its data explicit, shareable, and open to critical examination. “So long as the nature of libraries, books, or reading do not change,” as I put it above, there was no need. But today digital media are rapidly destabilizing the traditional evidentiary structure of the humanities and bringing it closer to that of the sciences. The digital humanities field is a leading example. There are no established humanities protocols for adequately citing even the moderately “big data” that advanced digital methods now tempt humanists to study–e.g., the 7,000 novels that Franco Moretti explores in “Style, Inc. Reflections on Seven Thousand Titles”; the 3,500 works of Irish American prose literature that Matthew L. Jockers mines in Macroanalysis; or the 21,000 articles from “seven generalist literary-studies journals” with up to a century of volumes each that Andrew Goldstone and Ted Underwood canvass in their “The Quiet Transformations of Literary Studies” [PDF]. Even outside the digital humanities, mainstream humanities scholars who work with any kind of digital material are now at sea when needing to quote or cite the increasingly important plenum of born-digital, dynamic, social-media, streaming, and other new kinds of resources. For example, how does one shoehorn into the MLA’s citation style for a Web resource–simply “Web,” void of URLs–any granular reference to a distinct structure or state of an online site, archive, or database?
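To make the citation problem concrete, here is a minimal, entirely hypothetical sketch (my own invention, not the MLA style or any existing standard) of the state information a granular citation of a dynamic online resource would need to capture beyond the bare label “Web”:

```python
# Hypothetical sketch of a "granular" citation record for a dynamic
# online resource: not any existing citation standard, just an
# enumeration of the state that the bare label "Web" discards.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class WebCitation:
    title: str
    url: str                                    # omitted entirely by the citation style at issue
    accessed: str                               # date of access (ISO 8601)
    query: dict = field(default_factory=dict)   # search terms or filter state producing the view
    snapshot: Optional[str] = None              # archive snapshot, database release, or version id

# Invented example: citing a particular state of an online archive.
cite = WebCitation(
    title="Example Digital Archive",
    url="https://example.org/archive",
    accessed="2014-08-14",
    query={"keyword": "urn", "sort": "date"},
    snapshot="release-2014.2",
)
```

The design point is that a dynamic resource is not one object but a family of states, so the citation must record the query and version that produced the state being quoted, or the reference is unreproducible.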
The high style of humanities discourse, in sum, is increasingly under threat in a digital age that values information over style. Meanwhile, the more data-explicit “ordinary” humanities style of book prospectuses, grant proposals, personnel case reviews, research assessment reports, etc., remains recessive even as it becomes increasingly pervasive. Days and nights may be spent writing a grant proposal, for example, but the prose that emerges is never valued as the “real” voice of the humanist. This is a situation that is increasingly unstable as humanities scholars devote larger proportions of time to writing such works as reports for program reviews or research assessments. What the digital age seems to be telling the academy–an outcome that the humanities will need to adapt for its own purposes–is that the dominant/recessive relation between the language of a book and that of a report or proposal may need to be rebalanced. Nor is the rebalancing solely driven by intramural and administrative needs–part of the rise of “managerialism” in universities. “Public humanities” scholars and humanities advocates make a strong case for complementing humanities research with dissemination in “plain and simple” language [example].
What, then, should be the discourse of knowledge in a Centre for Digital Knowledge? One thesis is that such a Centre should embrace alternatives to normative humanities academic discourse as part of its very project of understanding the difference of digital knowledge. “Alternatives” does not necessarily mean abandoning the most distinguished features of humanities discourse–individually cultivated voices of eloquence feeling their way toward sustained, rigorous, and elegant or “edgy” interpretations of past and present phenomena. But it does mean diversifying and reordering humanities discourse so that its voice can join in a broader discursive cycle of digital knowledge.
What I mean may be elucidated through a hypothetical research scenario of a sort increasingly common among scholars collaborating with digital methods. Imagine that a major grant has been won to fund a cross-disciplinary, multi-year project entitled “Climate Change and Social Change.” The project’s mission is to correlate climate change with both historical and recent social, economic, political, and cultural impacts–e.g., impacts on the perception of climate (e.g., in the media), social demographics (e.g., mortality rates and migration patterns), monetary flows, political movements, and policy decisions. The promised deliverables are heavily digital: a dataset or corpus, digital tools and interfaces for researchers and the public, and digitally-accessible conferences, papers, and articles. Members of the project team include scholars in computer science, biology, epidemiology, sociology, political science, communication, anthropology, film and media studies, environmental history or literary ecocriticism, history, and literary studies or comparative literature. The operational procedure is a series of plenary meetings branching off into working groups and development “sprints,” all coordinated around a series of defined project milestones and deliverables.
One of the distinctive features of such projects in the digital age is that the breadth of disciplines involved is homologous with a condition of the digital itself: the fact that the object of study can be mutated into a common digital dataset and transformed into countless permutational views for treatment from different disciplinary angles. Thus there is no one primary discourse of knowledge agents, acts, and styles. Monographic publications written in high style by humanities scholars are on a par with such discourses dominating other disciplines as collaborative conference papers, datasets, prototype demonstrations, etc. Or, more accurately, the dominant discourses of different disciplines each take command at different phases of the overall cycle of knowledge production before receding to let other kinds of discourse dominate–the whole alternating sequence driving the process forward iteratively. Thus for example, individuals may drive the work in some parts of the cycle, and teams in others. Observation and analysis come to the fore in some parts of the cycle, and interpretation and critique in others (e.g., critical discussion that occurs at the beginning of the project to shape the mission, or midway in the project as a correction of preliminary results). And style modulates through the cycle accordingly–full-throated at some points, but collapsed to bullet points, diagrams, mockups, and “demos” at others. In this regard, the “provocation” paper by de Bolla at the second planning consultation for the Centre for Digital Knowledge is a perfect exemplum of high-style humanistic critical argument used tactically to start rather than finish a project. Ideally, the sum of all the phase-cycles of this discourse–in which the discursive norms of each discipline take the lead at different points–creates a whole greater than the parts.
The humanities, in other words, need not think that the discursive flow of “Reading & Research → Syllabi & Teaching notes → Talks → Articles → Monographs” is a linear path. Different segments of that traditional agenda can be broken out separately and inserted tactically into other phases of the overall collaborative act of knowledge production where they will have the most value. From the point of view of the humanities themselves, this thesis assumes its most radical form in two propositions. One is that in the digital age humanities scholars should be encouraged to complement their dominant discourse with other kinds of discourse–including challenging collaborative work, difficult and innovative acts of data collection and analysis, and research outputs such as published conference proceedings or online projects that do not sum up in a critical/interpretive monograph. The other proposition is that in the digital age humanities scholars should not be engaging solely in discursive acts at all. Instead, it is already clear in the field of the digital humanities–a leading edge of the humanities’ encounter with digital knowledge–that a gestalt-shift is underway that recasts acts of discourse as acts of “making” and “building.” In the digital humanities, the “epistemology of building”–realized through the building of digital projects, hardware DIY projects, media archaeology labs, etc., and theorized with the aid of such broader intellectual movements as the “new materialism”–is, as they say, a thing.
There are many possible ways the above recommendations could be built into a Centre for Digital Knowledge. Here, for example, is one program of activities that interweaves many of the above theses:
As stated above, this is just one example program. Many other kinds of organization, activity, and output could be imagined that would allow the Centre to explore, and enact, the epistemology of the digital. Whatever the program, the goal is to engage the topic of what it means to “know” in the digital age in a spirit of serious play–at once disciplined and exploratory of new paradigms.
19 August 2014: Corrected to 21,000 the mention of 13,000 articles in Andrew Goldstone and Ted Underwood’s “The Quiet Transformations of Literary Studies” [PDF].
10 November 2014: Corrected to 3,500 the mention of 758 works of Irish American prose literature that Matthew L. Jockers mines in Macroanalysis (the latter was correct only for Chapter 8 in Jockers’s book).
“This is Not a Book: Long Forms of Shared Attention in the Digital Age.” INKE Conference on Research Foundations for Understanding Books and Reading in the Digital Age: E/Merging Reading, Writing, and Research Practices, Havana. 12 December 2012.
“This is Not a Book: Long Forms of Shared Attention in the Digital Age.” Humanities Center, DePaul University. 11 April 2013.
Literature and Data
(Theory & Media Studies Colloquium, Yale Univ., Oct. 7, 2009)
Local Transcendence: Essays on Postmodern Historicism and the Database
University of Chicago Press, 2008, 392 pages, ISBN-10: 0226486966, ISBN-13: 978-0226486963
Driven by global economic forces to innovate, today’s society paradoxically looks forward to the future while staring only at the nearest, most local present–the most recent financial quarter, the latest artistic movement, the instant message or blog post at the top of the screen. Postmodernity is lived, it seems, at the end of history.
In the essays collected in Local Transcendence, Alan Liu takes the pulse of such postmodern historicism by tracking two leading indicators of its acceleration in the late twentieth and early twenty-first centuries: postmodern cultural criticism–including the new historicism, the new cultural history, cultural anthropology, the new pragmatism, and postmodern and postindustrial theory–and digital information technology. What is the relation between the new historicist anecdote and the database field, Liu asks, and can either have a critical function in the age of postmodern historicism? Local Transcendence includes two previously unpublished essays and a synthetic introduction, in which Liu traverses from his earlier work on the theory of historicism to his recent studies of information culture to propose a theory of contingent method incorporating a special inflection of history: media history.
“This book is a reflection of and on a nearly twenty-year career. It is as much a work of history as of literary and cultural critique, as much a narrative and a piece of performance art as it is philosophical investigation and Nietzschean genealogy. Alan Liu is sui generis.”
“Following the magnificent achievement of The Laws of Cool, Alan Liu in Local Transcendence takes on the problem he astutely identified as deeply connected with the ‘cool’: the loss of historical grounding and consequent restructuring of identities by postindustrial corporations. Offering a rigorous yet humane critique of new historicism and cultural criticism from the inside, he interrogates the possibilities for historical grounding in the age of information in a witty prose style and a capacious field of reference. Local Transcendence is required reading for anyone interested in the multiple conjunctions, oppositions, and synergies between information, historicism, and cultural context.”
“Before he turned to digital humanities, Alan Liu once posed the key question for the new historicism: what’s the connection? What’s the connection, for example, between two juxtaposed details or anecdotes in a cultural field? Now he has reframed both inquiries with a broader question that raises the level of both the game and its stakes: what is the connection between the ‘new historicism’ and the ‘new media’? The result is a book that addresses the central question of the ‘link’ itself in our age and that links the link not only conceptually but also historically. It is a book for anyone interested in how disciplinary and technological innovation in the humanities have informed each other over these past two decades.”
Table of Contents
Introduction: Contingent Methods
1. The Power of Formalism: The New Historicism
2. Trying Cultural Criticism: Wordsworth and Subversion
3. Local Transcendence: Cultural Criticism, Postmodernism, and the Romanticism of Detail
4. Remembering the Spruce Goose: Historicism and Postmodernism
5. The New Historicism and the Work of Mourning
6. The Interdisciplinary War Machine
7. Sidney’s Technology: A Critique by Technology of Literary History
8. Transcendental Data: Toward a Cultural History and Aesthetics of the New Encoded Discourse
9. Escaping History: New Historicism, Databases, and Contingency
Notes and Links for Presentation in Lynne Siemens’s seminar on “Issues in Large Project Planning and Management” (DHSI, U. Victoria)
Link library for Alan’s presentation at the graduate seminar on “The Truth of the Humanities” related to the lecture series
Sterling Publishing, 2003, 48 pp., 35 illustrations, ISBN-10: 0806982772, ISBN-13: 978-0806982779
II. Children & Young People
III. The Present and the Past
IV. Scenes from The Prelude
V. Growing Up
This is the professional home page of Alan Liu, Professor of English, University of California, Santa Barbara (UCSB). The English Department at UCSB also maintains a less complete bio page for Alan Liu.
See also Nothing Transcendental: Alan Liu’s Ad Hoc Site for Ordinary Business: “Here, the ordinary and routine business of professional life finds shelter from the pressure to be any more than it simply is.”
Stanford Univ. Press, 1989, 726 pp., ISBN-10: 0804718938, ISBN-13: 978-0804718936
The imaginative power of Wordsworth’s poetry stems from a denial of history so strong and precise that denial itself—the determined absence of history—must be studied as positive fact. The author argues this thesis with the aid of substantial methodological innovations allowing the best of formalist, deconstructive, and New Historicist reading strategies to be synthesized and informed by a wealth of historical matter. Drawing upon recent advances in the history and theory of the French Revolution, art history, economic history, family history, and the social history of the Lake District, he shows that history—however absent it seems to be—influences literature deeply at the level of form. In particular, the most telling register of historical change and perception in Wordsworth’s poetry is generic transformation. Studying the works of the early and middle years intensively, and the later works suggestively, the author argues that Wordsworth’s overall shift from description to narrative, and from narrative to lyric, is a mimetic denial of contemporary cultural history. By the time “imagination” invests lyric imagery, it has learned to capture history within an empire of self that is no less than a surrogate history, a facsimile ideology.
Part One of the book introduces the subject by rereading the Simplon Pass episode in The Prelude as a denial of Napoleon’s Alpine crossing of 1800. It then formulates a methodology of historical reading by witnessing in the modern and postmodern notion of “context” a developing collaboration between formalist and materialist perspectives. The “matter” of history, the author argues, is collectively structured, witnessed, and uttered absence; and the reading of history is therefore a discrimination of forms of absence. When a city or a cottage is effaced, there is left only the nothing that is the constitutive basis of conventions of difference—of hate, prejudicial discrimination, “nation,” “culture,” and, as one of the most discriminating of cultural discriminations, the differential forms of art.
Part Two draws upon art history, political history, contemporary journalism, and narrative theory to study the formal collision between Wordsworth’s early picturesque and the predominantly narrative mode of French Revolutionary violence. Out of this collision, “time” arose as the massive denial of history, giving the poet his first authority separate from the “People.” In chapters entitled “The Tragedy of the Family,” “The Economy of Lyric,” and “A Transformed Revolution,” Part Three traces the development of authority into the “originality” of the poet’s mature ideology of autobiography. Part Four concludes the work by pointing ahead in Wordsworth’s corpus toward “The Idea of the Memorial Tour” and the self-critical stance of a poet whose quintessential act was to “collect” himself. The book ends with a brief epilogue on history and critical self-consciousness.
[Jacket Illustration: La Journée des Brouettes (or Préparatifs de la Fête de la Fédération au Champ de Mars, Juillet 1790) by Étienne-Charles Le Guay. Musée Carnavalet]
Part I. Introduction
Part II. Violence and Time: A Study in Poetic Emergence
Part III. The Flight of Forms: A Study of Poetic Individuation
Lyric and Empire
Part IV. Conclusion
The Laws of Cool: Knowledge Work and the Culture of Information
University of Chicago Press, 2004, 552 pages, ISBN-10: 0226486990, ISBN-13: 978-0226486994 (fuller précis of book)
“Knowledge work” is now the reigning business paradigm and affects even the world of higher education. But what perspective can the knowledge of the humanities and arts contribute to a world of knowledge work whose primary mission is business? And what is the role of information technology as both the servant of the knowledge economy and the medium of a new technological cool?
In The Laws of Cool, Alan Liu reflects on these questions as he considers the emergence of new information technologies and their profound influence on the forms and practices of knowledge. Liu first explores the nature of postindustrial corporate culture, studies the rise of digital technologies, and charts their dramatic effect on business. He then shows how such technologies have given rise to a new high-tech culture of cool. At the core of this book are an assessment of this new cool and a measured consideration of its potential and limitations as a popular new humanism.
According to Liu, cool at once mimics and resists the postindustrial credo of innovation and creative destruction, which holds that the old must perpetually give way to the new. Information, he maintains, is no longer used by the cool just to revolutionize human knowledge—it is also used to resist it. What counts as cool today, however, is too frequently narrow, shallow, and self-centered. The challenge for the humanities, then, is to help redefine cool and to use technology in a way that mediates between knowledge work and a fuller lifework glimpsed in historical lives and works.
A study of enormous scope, ambition, and intellect, The Laws of Cool provides an indispensable account of knowledge work today and its future.
[Original draft of catalog copy]
In The Laws of Cool, Alan Liu thinks about knowledge work in contemporary society from the viewpoint of the historical, critical, and aesthetic knowledges valued by the humanities and arts. He also looks through the glass in the other direction to reflect on the evolving nature of the humanities and arts under the pressure of the newly dominant, corporate knowledge cultures of lifelong learning, learning organizations, teamwork, and diversity management. Liu’s pivotal topic is information technology and its semi-autonomous culture of cool (as in Web pages so cool that they thwart the flow of information). Information cool, as he calls it, is now the symptom not just of consumer culture but of a producer culture–the culture of the cubicle–that seeks an “ethos of the unknown” within the world of knowledge work.
Liu draws on contemporary business theory, sociology, anthropology, art, literature, literary theory, cultural studies, history of information technology, and Internet and new media theory to create an argument that is at once historical, formal, and theoretical. After articulating the concept of postindustrial knowledge work, he narrates the rise of information technology in the workplace and the cognate rise of cool subcultures, countercultures, and cubicle "intracultures." He then focuses on the formal, technical, social, and political features of high-tech "information cool" and concludes with a sustained reflection–and some practical suggestions–on how the humanities and arts can help educate the contemporary generations of cool.
One of Liu’s special concerns is the emergence of new "destructively creative" or viral arts that resist the postindustrial credo of innovation or what economist Joseph Schumpeter called creative destruction. Another is the current humanities emphasis on historicist critique, which also reevaluates the process of creative destruction. How might these twin tendencies in recent humanities and arts collaborate, he asks, to help shape the well-being–or wealth in a deeper sense–of the new classes of knowledge who spend their days and nights staring at a computer screen and wishing they were cool?
Since the early 1990s, Liu has built on his work in literary history, theory, and cultural criticism by exploring contemporary information culture through a number of technology projects, including his Voice of the Shuttle Web site and Transcriptions: Literary History and the Culture of Information (the NEH-funded research and curricular development initiative he directs). The Laws of Cool harvests the practical and theoretical experience gained in such projects.
Table of Contents
Introduction: Literature and Creative Destruction
Part I. The New Enlightenment
Preface "Unnice Work": Knowledge Work and the Academy
Part II. Ice Ages
Preface "We Work Here, But We’re Cool"
Part III. The Laws of Cool
Preface "What’s Cool?" (excerpt)
Part IV. Humanities and Arts in the Age of Knowledge
With his combined background in business management, business law, and graduate literary studies, William Pitsenberger is uniquely placed to follow up on my call for the academy and business to engage each other critically. “Suppose instead,” he says, “that the training in critical analysis with which those with advanced degrees in literature are armed were brought into the business community in a way that offered that community new kinds of value—understanding, for example, how business texts can be read, what contradictions exist between those texts and the desired message, and how to resolve those contradictions?”
This is an imaginative vision of humanities scholarship as a new missionary activity, one that attempts to offer business not just “skills” and “tools” (to which Pitsenberger admirably refuses to reduce the issues), not even just “value” (or, as he says later, “best use of academic training”), but instead “new kinds of value.” It would be interesting for a group of experienced managers and professionals from both sides of the business/academy divide to sit down together to judge whether this idea has merit and how it could be implemented—whether in a consultancy, training workshop, internship program, or something else.
I would like to take the occasion here, however, of putting Pitsenberger’s prescription in broader perspective. If the goal is to offer business the critical understanding it needs to make wise use of the texts of contemporary management literature—whose now ample and influential body of works is by turns insightful, cruel, heedless, and shallow—then the best general term I know for such an enterprise is still education. In this light, what Pitsenberger’s suggestion makes me wonder about is the very role of education today. In the “knowledge economy,” education occurs across a whole lifetime in an unprecedented variety of social sectors, institutions, and media—not just schools, community colleges, and universities, for instance, but also businesses, broadcast media, the Internet, even the manuals or “tutorials” that accompany software applications. Education, in other words, is now a decentralized field where no one institution any longer individually corners the market and where the sheer dispersion of the kinds and scales of learning—all the way from programs leading to degrees to CNN “factoids” leading only to the next commercial—is dizzying. Given this context, I think, the relevant question becomes: where can society most responsibly and effectively place the training in critical analysis that Pitsenberger suggests? Is it in consultancies or reading groups (workshops, team exercises, and focus groups) within corporations? Is it within the academy in humanities departments, on the hoary theory that the best way to insert critical understanding in society is to teach well the students destined to enter that society? (The humanities could thus teach contemporary management theory with the same critical perspective it brings to any other past theory of civilization, which is what management theory really is in its grandest ambition.)
Or, because of the importance to business of non-textual knowledges not easily amenable to learning “how business texts can be read” (a point I owe to my colleague, Christopher Newfield, who also studies business and the academy), should we instead look to the sciences to develop courses on the critical understanding of numerical analysis or to the media industry to sponsor programs on the critical use of images and music? Perhaps the best question: how can society create the most inclusive, flexible, and intelligently interrelated mix of such options to take care of all its citizens hungry to “know”?
None of these questions are rhetorical; all are open. I suspect that they will not be solved from the top down by adding more representatives from government, media, etc., to the panel of business and education managers I imagined above. Rather, the work will begin from the bottom—through efforts by those like Pitsenberger who might want to try innovating a business training workshop here or an internship program there; and also by those working within the academy to introduce works of business literature among other works we ask students to read critically. (See the following course on “The Culture of Information” for my own example.)
“The Interdisciplinary War Machine (The Theory of Interdisciplinary Studies).” Harvard University. 1 December 1994.