Citation: “Theses on the Epistemology of the Digital: Advice For the Cambridge Centre for Digital Knowledge.” Alan Liu, 14 August 2014. http://liu.english.ucsb.edu/theses-on-the-epistemology-of-the-digital-page/

The following was written as a solicited follow-up to my participation in the second planning consultation session of the Cambridge University Centre for Digital Knowledge. The session, held on 7 May 2014 at the Cambridge Centre for Research in the Arts, Social Sciences and Humanities (CRASSH), focused on “digital epistemology,” one of the two intended thematic strands of the Centre for Digital Knowledge. A previous planning consultation at CRASSH that I did not attend focused on the other intended strand of “digital society.”

My theses below are meant more as provocation than as prescription; and they do not take account of plans that may have been put in place for the Centre for Digital Knowledge since the planning consultations.

14 August 2014

Thesis 1: Enlightening the Digital

Establishing a Centre for Digital Knowledge oriented around “digital epistemology” will require a laser-sharp focus on making “knowledge” a productive framework for understanding the digital age. This framework must be robust enough to compete with such more common gestalts as “society,” “politics,” “culture,” and “economy” (represented in such phrases as “information society,” “surveillance society,” “social media,” “online culture,” “information economy,” etc.). The proposed Centre for Digital Knowledge can generate its agenda by deliberately harnessing the tension between knowledge (including ideals of academic knowledge shaped by the German research university model and the Enlightenment) and social, cultural, and economic understandings of the digital age.

After all, knowledge today is not intuitively the first thing that comes to mind when thinking about the digital, even in regard to such iconic artifacts of the Internet as Wikipedia that ought by rights to hew to the Enlightenment tradition of the Encyclopédie. Not only do Wikipedia’s “no original research” and “notability” principles abridge the idea of knowledge, but its most distinctive traits as a knowledge enterprise are characterized in social terms such as “open” and “community.” And this is before we even come to the identification of the digital with such knowledge-“lite” paradigms as entertainment.

For many, therefore, the digital is not primarily a mutation in knowledge. It is a social change. Social-science and other disciplines operating on this premise treat the digital as a phenomenon of “communication” (“ICT”: “information and communication technology”) impacting social practices, institutions, and organizations [example]. Or the digital facilitates political change. Political scientists or sociologists who study the Internet see it as a testing ground for new kinds of organizing, protest, voting, and other virtual realpolitik [example]. Or, again, the digital marks a cultural change. Disciplines such as “new media studies” and “network critique”–extending British, European, and American traditions of cultural criticism–treat the digital as a domain of contested identity, gender, ethnicity, ideology, affect, privacy, and so on [example]. And, yet again, the digital is an economic change. Economists and organization theorists (chorused by business journalists and business consultants) see the digital as a proxy for the postindustrial reorganization of capital [example].

Amid this clash of paradigms, the specific mission of a Centre for Digital Knowledge should be to illuminate–we may say, “reenlighten”–the knowledge overshadowed by other major views of the digital. Why is it, for instance, that business theorists discuss “knowledge work” in ways that say everything about work but almost nothing about knowledge [example]? What is the actual knowledge embedded in the society, politics, culture, and economy of the digital with their faux-knowledges of “information,” “wisdom of the crowd,” “knowledge bases,” “smart phones,” etc.?

The Centre for Digital Knowledge can design a sequence of events, activities, and outputs that foreground the specific force of digital knowledge amid digital society, politics, culture, and economy. For example, one cycle of Centre activities could focus on how the production and circulation of digital academic knowledge (or of specific “knowledge artifacts”; see provisional plan below) compares to crowdsourcing or social networking. A second could explore how new ideologies of scholarly open-access and open peer review compare to the politics of “open-source” and “open government.” A third could focus on the relation between traditional expert cultures (including but not limited to academic culture) and the new open-source knowledge cultures. And a fourth could focus on the uncanny convergence/divergence between the digitization of scholarly archives (e.g., of traditional restricted-access or closed-stack research libraries) and the economics of monetized proprietary databases (e.g., Google’s). All these cycles of activities would have in common the goal of sifting the sands of the digital for the unique quality, or quantum, that is digital knowledge (where rebalancing the values of quality and quantity is itself a problem of the epistemology of the digital comparable to similar recalculations of value in the social, political, cultural, and economic digital realms).

Thesis 2: Rethinking Enlightenment

But alluding to the Enlightenment forecloses as much as it discloses. An honest effort to grapple with digital knowledge will also require the Centre for Digital Knowledge to let go of too fixed an adherence to established modern ideas of knowledge (here simplistically branded “Enlightenment”). Those ideas are bound up with philosophical, media-specific (print, codex), institutional (academic and other expert-faculty), and “public sphere” configurations of knowledge that co-evolved as the modern system of knowledge. But today there are new systems, forms, and standards of knowledge, including some that refute or make unrecognizable each of the modern configurations mentioned above–e.g., algorithmic instead of philosophical knowledge, multimedia instead of print-codex knowledge, autodidactic or crowdsourced instead of institutional knowledge, and paradoxically “open”/”private” (even encrypted) instead of public-sphere knowledge.

In this light, Peter de Bolla’s incisive “provocation” paper on digital knowledge (presented 7 May 2014 at the start of the second planning consultation for the proposed Centre for Digital Knowledge, held at Cambridge University’s CRASSH) is revealing for its frequent rhetorical reliance on two prepositions: “under” and “beneath” (used to query the foundations under or beneath the digital). Evidenced in this rhetoric is an inverted Platonic Divided Line that locates essential knowledge not high above but–in the modern tradition that runs from Kant’s “conditions of possibility” through Foucault’s “archaeology of knowledge”–deep below.

But it is unclear that the epistemology of the digital respects, or should respect, a vertical axiology of truth. Some of the most important dimensions of the digital extend laterally in networked, distributed, and other “inch-deep but mile-wide” formations. Big data or crowd data is bottom-up data, not high data (in the sense of “high church” or “high Latin”). In this regard, the Facebook-era cliché of “the social graph” is symptomatic. Used with the definite article in discussions of social networking, the social graph (commonly reified in visual graphs of nodes and links) has become the icon of a flat epistemology with just two secular dimensions (who knows whom) oblivious to any Platonic or Kantian higher dimension.
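The flatness of this epistemology can be made literal. As a data structure, “the social graph” is nothing but a set of who-knows-whom pairs, and every question it can answer is a lateral one. A minimal sketch in Python (the names are invented for illustration):

```python
# A "social graph" reduced to its two secular dimensions: who knows whom.
# The names are invented for illustration; the flat structure is the point.
from collections import defaultdict

edges = [
    ("ada", "ben"), ("ben", "cam"), ("ada", "cam"), ("cam", "dee"),
]

# Adjacency list: the entire "epistemology" of the graph is lateral linkage.
graph = defaultdict(set)
for a, b in edges:
    graph[a].add(b)
    graph[b].add(a)

# The only question the structure can answer is relational: who knows whom?
print(sorted(graph["cam"]))
```

There is no third axis in this structure–no node is “deeper” or “higher” than any other–which is precisely the sense in which the social graph icons a flat epistemology.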

In the digital age, in other words, the “wisdom of the crowd” challenges the very notion of an epistemology, or philosophy, of knowledge. If we were to juxtapose the Enlightenment with the digital age, we might say that (a) the French Revolution put paid to philosophy (and philosophes) by advancing a mob mentality that later nineteenth-century “historicists” (and twentieth-century revisionary historians of the Revolution such as François Furet) could only “know” by displacing the Revolutionary “idea” into notions of “spirit [Geist],” “rumor,” “representation,” etc.; and (b) the “digital commons” and “open” movements now represent the resurgence of a similar crowd knowledge challenging scholars. Then and now, the difficulty is that the object of inquiry puts in question the knowledge-standards of scholarly inquiry itself. Circa 1790, for example, people in Paris “knew” who was an “aristocrat” to be denounced to the local Watch Committee because “everyone knows.” After 2000, with the onset of Web 2.0 and social media, people similarly know who the “celebs” are (not to mention more plebeian “friends” and “followers”) because Facebook, Twitter, etc. know. Pity the scholars who want to know what such “knowing” means but are constrained by rigorous older standards of “critical” knowledge–a position like being the only person on Facebook who doesn’t “like” anything.

A similar incommensurability between old and new epistemologies applies in temporal terms. Instead of valuing enduring or permanent truths (the temporal version of “high” knowledge), the digital age is preoccupied with information of much shorter durations–time spans plunging down to the diurnal rhythm of blog posts, the microseconds of a data packet’s “TTL” (“time to live”), and even the gigahertz clock rate of a computer’s CPU. Originally, after all, Facebook and Twitter both prompted their users for “status updates” with variants of the hyper-immediate question: “What are you doing now?” Nor is it just a matter of the short durée but also of different temporal rhythms. Digital knowledge moves through computers and networks in fitful, robotic ballets of inhumanly precise starts and stops that fatally deform the early-twentieth-century Bergsonian intuition of flow and even the late-twentieth-century McLuhanesque intuition of media flow or field. Today the time of knowledge belongs to the invisible order of “micro-temporality” theorized by such media archaeologists as Wolfgang Ernst.

So, too, the incommensurability of digital epistemology can be formulated in terms of “uncertainty.” After all, digital knowledge often verges into or draws on stochastic processes that are native to our current scientific epistemology of statistical, probabilistic knowledge. Probability theory and the world view it models (e.g., the quantum-mechanical view of the universe) undercut the foundation of any knowledge that, in order to count as knowledge, needs definite subjects and predicates linked in narrative syntax of the sort that Boris Tomashevsky instanced in his definition of a thematic “motif.” Tomashevsky’s example of a motif: “Raskolnikov kills the old woman.” To conform to today’s scientific world view, we would have to rewrite that sentence approximately as follows: “There is a 74% chance that in this document Raskolnikov kills (82%) / wounds (15%) / ignores (3%) the old woman (68%) / young woman (23%) / other (9%).” (Those familiar with “topic modeling” in the digital humanities and other digital research fields will recognize that such a recasting of “motif” makes it resemble the probabilistic “topics” generated by the MALLET topic modeling tool.) In other words, the humanities today have a hard time adjusting to the idea that knowledge may not be either truth or story but just a probability distribution. Even the “ambiguity,” “paradox,” and “irony” that were the highest evolutions of humanistic knowledge valued by the New Critics seem to exist in an alternate cosmos from the equivalent uncertainties of quantum mechanics. Not Cleanth Brooks’s well-wrought urn, in other words, but Schrödinger’s cat. The New Critics equated the paradox of “Beauty is truth, truth beauty” (the line from John Keats’s “Ode on a Grecian Urn” that so exercised Brooks in The Well Wrought Urn) with the full richness of human reality, which they also called “experience” in consonance with John Dewey’s contemporaneous philosophy of experience.
In today’s scientific epistemology, by contrast, reality is defined by the collapse of the quantum wave function, as it were, into either beauty or truth, a binary decision state (consonant with the digital epistemology of 1 vs. 0) that nevertheless does not negate wonder at the unknowability of the paradoxically more real (but also less real because created from “virtual particles”) reality of the “quantum foam” underlying it all. The humanistic and quantum universes of uncertainty are doppelgängers of each other, incommensurable in difference and similarity.
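The probabilistic recasting of Tomashevsky’s motif can itself be written down as a small computation. The sketch below simply encodes the hypothetical percentages from the passage above as distributions–an illustration of the idea of knowledge as a probability distribution, not output from MALLET or any real topic model:

```python
# Tomashevsky's motif "Raskolnikov kills the old woman," recast as the
# probability distributions suggested above. The figures are the hypothetical
# percentages from the passage, not the output of any actual model.
verb = {"kills": 0.82, "wounds": 0.15, "ignores": 0.03}
obj = {"the old woman": 0.68, "the young woman": 0.23, "other": 0.09}
p_motif_present = 0.74  # chance the motif appears in this document at all

# Each well-formed distribution sums to one.
assert abs(sum(verb.values()) - 1.0) < 1e-9
assert abs(sum(obj.values()) - 1.0) < 1e-9

# The single most probable reading: the distribution "collapsed" to a sentence.
best = (max(verb, key=verb.get), max(obj, key=obj.get))

# Probability of that exact reading, naively assuming the parts are independent.
p_best = p_motif_present * verb[best[0]] * obj[best[1]]
print(best, round(p_best, 4))
```

Note that even the most probable reading carries well under a 50% chance: the “knowledge” here is the whole distribution, not any one sentence recoverable from it.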

In sum, there was knowledge; and today there are other kinds of knowledge that seem to come foaming up from the zero state of knowability not just in physics (and metaphysics) but in the epistemology of the digital–e.g., from crowds, people outside expert institutions, people outside formal organizations entirely, people from other parts of the world, and so on whose virtual knowledge seems as transient as virtual particles. That is one of the lessons of the digital.

Thesis 3: Decentering the Centre

A Centre for Digital Knowledge also needs to try out alternatives to the very form of an academic “centre,” since that form is vested in traditional ways of organizing knowledge production that the digital is currently reinvesting in a wider, differently articulated network of institutions, collectives, and media. “Neoinstitutional” theory combined with “adaptive structuration theory” (in the fields of sociology and organizational technology studies, respectively) helps us understand how the digital facilitates changes in organizational and institutional structures, especially those oriented toward knowledge work. For example, Wikipedia, open-source communities, etc., evidence how the once hallowed institutions of “expertise” (professional work in corporations, professorial work in universities, professional journalism, etc.) are being repositioned by the new technologies in unstable relation to networked “open” para-institutions of knowledge outside settled organizational fields.

It thus seems clear that a Centre for Digital Knowledge that relies solely on traditional institutional forms–even the now normative “interdisciplinary” form (e.g., a centre that creates weak-tie intersections among faculty in different fields)–will be cut off from some of the most robust conceptual and practical adventures of digital knowledge. A key test for the proposed Centre for Digital Knowledge, therefore, will be whether it is willing at least on occasion to accommodate non-standard forms of knowledge organization, production, presentation, exploration, and dissemination acclimated to the digital age or open to its networked ethos. Examples of such forms include “THATCamps” or “unconferences,” writing or coding “sprints,” design “charrettes,” online forums, events planned by non-academic invitees, cross-institutional collaboration (university to high school, university to newspaper, university to corporation, university to NGO, etc.), direct engagement with the public in online or face-to-face venues, and intellectual events planned not just by research faculty but also by teaching-first instructors, clerical staff, and students (to break down the divide between those tiers).

An additional desideratum is that the Centre should produce a replicable model for other academic (or hybrid academic/public-humanities) institutions, programs, and events that does not depend on the funding resources and “A-list” guest speakers of an elite university such as Cambridge. That is, the Centre should ensure that every event aspiring to be the academic equivalent of the Aspen Institute or a TED talk is balanced by an event aspiring to be a THATCamp, beginner or early-career forum, project-incubation workshop, regional all-institutions conference, or other forum that sows the seeds far and wide.

Thesis 4: Redesigning Discourse

In modern times, the academic production and dissemination of humanities knowledge have run in a well-known discourse pattern (OED: “discourse” from “discursus action of running off in different directions, dispersal, action of running about”). With some exceptions (e.g., co-editions), humanities scholarly discourse runs, mutatis mutandis, as follows:

Reading & Research → Syllabi & Teaching notes → Talks → Articles → Monographs.

Some traits associated with this program are dominant and others recessive. Solo agents of knowledge are dominant in the humanities. One reads and annotates a book; one designs a syllabus; one writes a paper; etc. By contrast, collective agency–the thick bunchings of academic life in meetings, reading groups, conferences, etc.–is recessive: either epiphenomenal (one would be writing that article anyway) or taken for granted as para-academic apparatus (e.g., the discourse between a scholar and an editor that only occasionally comes to view in a book’s acknowledgements).

In terms of the acts rather than agents of humanities knowledge, interpretation and critique are dominant as the ends of knowledge, while observation and analysis are recessive as preliminaries to knowledge. Spanning in between are the acts of rhetoric and narrative that constitute the dispositio that William Germano (drawing on his experience as a former editor of humanities monographs) calls a book’s “throughline.”

Additionally, humanistic discourse has dominant and recessive styles. Through an act of introjection, many humanities scholars have come to believe that their dominant discourse should be of the same order of linguistic phenomena as their object of study. Since much of humanistic study concentrates on exceptional texts (e.g., literary works, pivotal historical speeches or documents), this means that higher value is ascribed to scholarly writings that at least to some degree are as resonantly crafted, nuanced, or elegant as complex literary language; as classically or biblically periodic as famous historical speeches; or otherwise as linguistically tour-de-force as some variant of the above. (Disclaimer: the present piece of humanistic writing is no exception, at least in its aims.) Even a humanities scholar’s spoken lectures are traditionally pre-scripted for high-pitch verbatim performance–an exercise that other disciplines such as the sciences and engineering view as bizarrely theatrical, not to mention fantastically inefficient for presenting data and conclusions.

Indeed, the issue of “data” in the humanities is increasingly acute in the digital age since it is a direct challenge to the privilege of high style. With some exceptions in fields like history, the humanities treat data as something to be embedded in discourse as part of the argument (or at least kept as close as a footnote or appendix at one remove). “Close reading” is an example of how the humanities fold data–the precise lines of poetry being interpreted, for instance–into argument. As a consequence, and as a corollary of their stylistic ideal, the humanities create arguments that seem data-lite. After all, only so much concrete evidence can be folded into an argument without the prose taking on the poured-concrete quality of many scientific or social-scientific articles with their masses of particulate citations–e.g., “Empirical studies adopting this social constructionist view of technology have been done by sociologists of technology (Bijker 1987; Bijker, Hughes and Pinch 1987; Collins 1987; Pinch and Bijker 1984, 1987; Woolgar 1985; Wynne 1988), and information technology researchers (Boland and Day 1982; Hirschheim, Klein and Newman 1987; Klein and Hirschheim 1983; Newman and Rosenberg 1985)” (source for this example [PDF]). Of course, the appearance of being data-lite belies the true heft and complexity of humanities data (where “data” here means low-level observational and descriptive information recorded in some structured pattern, as in the “images” or “paradoxes” Brooks accumulates in his Keats chapter in The Well Wrought Urn–a chapter whose title notably rejects the idea of explicit data: “Keats’s Sylvan Historian: History Without Footnotes”). First, there is a multiplier effect by which humanistic knowledge is attended by messy problems of missing, irregular, incommensurate, and ambiguous information that require much behind-the-scenes processing and adjudication (a post by Hugh Cayless on this issue).
Secondly, much underlying data in the humanities is implicit. Data inheres in entrained reading practices such that the “what is your data?” question typical in other disciplines is normatively answered in literary studies: “here’s the book; do a close reading yourself to see if my interpretation is persuasive.” And data also inheres silently in the stability of a massive infrastructure of book collections, curatorial staffs, bibliographies, metadata, and other apparatuses–i.e., the whole order of data to which even simple humanities citations (e.g., “see Cleanth Brooks”) really refer. Humanities data refers to “all that” (background editing, archiving, reading practices and apparatuses) even when, as in Brooks’s case, it seems to wear on its sleeve few, if any, footnotes. So long as libraries, books, or reading do not change, “all that” can be left unspoken as assumed knowledge.

By contrast, the sciences and social sciences (especially branches of the latter focused on quantitative or empirical research) cleave the orders of data and of argument so that they can be managed separately. Data is channeled through closed or open datasets, databases, repositories, etc., while argument appears in pre-prints, conference proceedings, and journal publications. This separation allows for the creation, processing, maintenance, and presentation of data as a distinct workflow–one that can acquire independent value and even generate its own research problems (as in recent work on computationally assisted “data provenance” [example, PDF]). Scientific and social-scientific data can thus be presented or otherwise made available autonomously for critical inspection–a fact demonstrated, for example, in recent arguments for and against the data validity of Thomas Piketty’s Capital in the Twenty-First Century.

Humanities discourse has rarely needed to aspire to the same standards for making all its data explicit, shareable, and open to critical examination. “So long as the nature of libraries, books, or reading do not change,” as I put it above, there was no need. But today digital media are rapidly destabilizing the traditional evidentiary structure of the humanities and bringing it closer to that of the sciences. The digital humanities field is a leading example. There are no established humanities protocols for adequately citing even the moderately “big data” that advanced digital methods now tempt humanists to study–e.g., the 7,000 novels that Franco Moretti explores in “Style, Inc. Reflections on Seven Thousand Titles”; the 3,500 works of Irish American prose literature that Matthew L. Jockers mines in Macroanalysis; or the 21,000 articles from “seven generalist literary-studies journals” with up to a century of volumes each that Andrew Goldstone and Ted Underwood canvass in their “The Quiet Transformations of Literary Studies” [PDF]. Even outside the digital humanities, mainstream humanities scholars who work with any kind of digital material are now at sea when needing to quote or cite the increasingly important plenum of born-digital, dynamic, social-media, streaming, and other new kinds of resources. For example, how does one shoehorn into the MLA’s citation style for a Web resource–simply “Web,” void of URLs–any granular reference to a distinct structure or state of an online site, archive, or database?
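To make the citation problem concrete, one can imagine–as a thought experiment, not an existing standard; the field names below are invented–a structured citation that records not just a URL but the exact state of a dynamic resource at the moment of consultation, for instance by fingerprinting its content:

```python
# A hypothetical structured citation for a born-digital, dynamic resource.
# Field names are invented for illustration; this is not any existing
# citation standard.
import hashlib
import json

def cite_web_state(url: str, accessed: str, content: bytes) -> dict:
    """Record not just the resource but the exact state consulted."""
    return {
        "url": url,
        "accessed": accessed,  # ISO 8601 timestamp of consultation
        "sha256": hashlib.sha256(content).hexdigest(),  # fingerprint of state
    }

record = cite_web_state(
    "https://example.org/archive/item/42",
    "2014-08-14T10:00:00Z",
    b"<html>snapshot of the page as consulted</html>",
)
print(json.dumps(record, indent=2))
```

Web archives snapshot pages in roughly this spirit; the point is that a bare “Web” designation in a works-cited list captures none of this granularity.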

The high style of humanities discourse, in sum, is increasingly under threat in a digital age that values information over style. Meanwhile, the more data-explicit “ordinary” humanities style of book prospectuses, grant proposals, personnel case reviews, research assessment reports, etc., remains recessive even as it becomes increasingly pervasive. Days and nights may be spent writing a grant proposal, for example, but the prose that emerges is never valued as the “real” voice of the humanist. This situation grows increasingly unstable as humanities scholars devote larger proportions of their time to writing such works as reports for program reviews or research assessments. What the digital age seems to be telling the academy–an outcome that the humanities will need to adapt for their own purposes–is that the dominant/recessive relation between the language of a book and that of a report or proposal may need to be rebalanced. Nor is the rebalancing driven solely by intramural and administrative needs–part of the rise of “managerialism” in universities. “Public humanities” scholars and humanities advocates make a strong case for complementing humanities research with dissemination in “plain and simple” language [example].

What, then, should be the discourse of knowledge in a Centre for Digital Knowledge? One thesis is that such a Centre should embrace alternatives to normative humanities academic discourse as part of its very project of understanding the difference of digital knowledge. “Alternatives” does not necessarily mean abandoning the most distinguished features of humanities discourse–individually cultivated voices of eloquence feeling their way toward sustained, rigorous, and elegant or “edgy” interpretations of past and present phenomena. But it does mean diversifying and reordering humanities discourse so that its voice can join in a broader discursive cycle of digital knowledge.

What I mean may be elucidated through a hypothetical research scenario of a sort increasingly common among scholars collaborating with digital methods. Imagine that a major grant has been won to fund a cross-disciplinary, multi-year project entitled “Climate Change and Social Change.” The project’s mission is to correlate climate change with both historical and recent social, economic, political, and cultural impacts–e.g., impacts on the perception of climate (e.g., in the media), social demographics (e.g., mortality rates and migration patterns), monetary flows, political movements, and policy decisions. The promised deliverables are heavily digital: a dataset or corpus, digital tools and interfaces for researchers and the public, and digitally-accessible conferences, papers, and articles. Members of the project team include scholars in computer science, biology, epidemiology, sociology, political science, communication, anthropology, film and media studies, environmental history or literary ecocriticism, history, and literary studies or comparative literature. The operational procedure is a series of plenary meetings branching off into working groups and development “sprints,” all coordinated around a series of defined project milestones and deliverables.

One of the distinctive features of such projects in the digital age is that the breadth of disciplines involved is homologous with a condition of the digital itself: the fact that the object of study can be mutated into a common digital dataset and transformed into countless permutational views for treatment from different disciplinary angles. Thus there is no one primary discourse of knowledge agents, acts, and styles. Monographic publications written in high style by humanities scholars are on a par with such discourses dominating other disciplines as collaborative conference papers, datasets, prototype demonstrations, etc. Or, more accurately, the dominant discourses of different disciplines each take command at different phases of the overall cycle of knowledge production before receding to let other kinds of discourse dominate–the whole alternating sequence driving the process forward iteratively. Thus for example, individuals may drive the work in some parts of the cycle, and teams in others. Observation and analysis come to the fore in some parts of the cycle, and interpretation and critique in others (e.g., critical discussion that occurs at the beginning of the project to shape the mission, or midway in the project as a correction of preliminary results). And style modulates through the cycle accordingly–full-throated at some points, but collapsed to bullet points, diagrams, mockups, and “demos” at others. In this regard, the “provocation” paper by de Bolla at the second planning consultation for the Centre for Digital Knowledge is a perfect exemplum of high-style humanistic critical argument used tactically to start rather than finish a project. Ideally, the sum of all the phase-cycles of this discourse–in which the discursive norms of each discipline take the lead at different points–creates a whole greater than the parts.

The humanities, in other words, need not think that the discursive flow of “Reading & Research → Syllabi & Teaching notes → Talks → Articles → Monographs” is a linear path. Different segments of that traditional agenda can be broken out separately and inserted tactically into other phases of the overall collaborative act of knowledge production where they will have the most value. From the point of view of the humanities themselves, this thesis assumes its most radical form in two propositions. One is that in the digital age humanities scholars should be encouraged to complement their dominant discourse with other kinds of discourse–including challenging collaborative work, difficult and innovative acts of data collection and analysis, and research outputs such as published conference proceedings or online projects that do not sum up in a critical/interpretive monograph. The other proposition is that in the digital age humanities scholars should not be engaging solely in discursive acts at all. Instead, it is already clear in the field of the digital humanities–a leading edge of the humanities’ encounter with digital knowledge–that a gestalt-shift is underway that recasts acts of discourse as acts of “making” and “building.” In the digital humanities, the “epistemology of building”–realized through the building of digital projects, hardware DIY projects, media archaeology labs, etc., and theorized with the aid of such broader intellectual movements as the “new materialism”–is, as they say, a thing.

Thesis 5: Program for the Centre for Digital Knowledge

There are many possible ways the above recommendations could be built into a Centre for Digital Knowledge. Here, for example, is one program of activities that interweaves many of the above theses:

  • Imagine that the Centre for Digital Knowledge would organize itself for the first four years of activity around the question, “What will be the important new digital artifacts of knowledge in the year 2050, and what will their relation be to older digital or material artifacts of knowledge?” The year 2050 is chosen to provide an aim point that provokes imagination, but not one so far in the future as to encourage pure fantasy. The notion of “artifacts” (rather than “media,” “society,” “culture,” etc.) is chosen to anchor the question in the concrete and in building.
  • To address this question, the Centre for Digital Knowledge would recruit and provide fellowships for one or more cross-disciplinary teams of researchers (both senior and early-career, intramural and extramural)–e.g., several humanists, social scientists, and engineers, with at least one ethnographer and one administrator.
  • The team(s) would be given the following mission: design a digital artifact of knowledge for the year 2050, supported by research, mockups or prototypes, exploration of the intellectual premises and theory, speculations on economic and social viability, etc. In doing so, conduct activities that engage other kinds of institutions (e.g., high schools, corporations, the government) and the public; and at least on occasion plan activities that do not conform to established academic forms such as a conference or colloquium.
  • The ethnographer on the team would be given the mission: document the workflow, discourse patterns, etc. of the team(s).
  • The administrator on the team would be given the mission: note the kinds of activities, discourses, and outputs in the project that currently do not have a place in a university’s reward or hiring procedures; and draft a revision of personnel policy that finds a viable way to recognize those activities in a way that furthers the overall research and teaching strength of the university.
  • The final outputs of all the above would consist of traditional scholarly articles and research; an online site giving access to the project and its data as well as explanations addressed to the public; and publications on the project workflow itself.

As stated above, this is just one example program. Many other kinds of organization, activity, and output could be imagined that would allow the Centre to explore, and enact, the epistemology of the digital. Whatever the program, the goal is to engage the topic of what it means to “know” in the digital age in a spirit of serious play–at once disciplined and exploratory of new paradigms.

Errata & Revisions

19 August 2014: Corrected to 21,000 the mention of 13,000 articles in Andrew Goldstone and Ted Underwood’s “The Quiet Transformations of Literary Studies” [PDF].

10 November 2014: Corrected to 3,500 the mention of 758 works of Irish American prose literature that Matthew L. Jockers mines in Macroanalysis (the latter figure was correct only for Chapter 8 of Jockers’s book).