Norms and Knowledge
“Think about the most boring book you can think of. Think of the history text book from one of those red states filled with racist, false and colonialist information. Books that are as stiff to the touch as they are blithely mundane. This is what I mean when I say books. Books that are not made by artists. Books that when you touch them you feel absolutely unimportant.” [1]
“As a mechanism of settler-colonial erasure and possession of Indigenous knowledge, attribution is a technique of ownership in that it signifies an authorial relationship. Working in a complementary way to the function of land title, attribution rewrites relationships to knowledge through a colonial property paradigm […] these logics of possession are buried and normalized within classificatory systems, organizational processes, and infrastructural logics of institutions themselves.” [2]
The first quote above comes from Be Oakley’s collection of texts Genderfail Reader, a small, softcover text that presents Oakley’s meditations on the praxis, stakes, and value systems at play in their practice as a queer independent publisher, a collector of books, and an artist. In this rhizomatic, nonlinear essay, they articulate the relationship between a book’s form and its content, framing the alternative materialities of zines and independently-published texts as reflective of their alternate relationalities with creators, archivists, and publics. The alternative materiality of the zine, they argue, embodies “radical softness as a boundless form of resistance.”[3]
In contrast, normative books, as described above, are “as stiff to the touch as they are blithely mundane.” Oakley’s relational framework of content, materiality, and affect points us to the second quote, taken from Anderson and Christen’s Journal of Radical Librarianship article “Decolonizing Attribution.” Anderson and Christen examine how conventions of knowledge-gathering, publication, and collection intertwine with colonial processes of Indigenous dispossession. They note that the colonization of knowledge has a material component, as it “rewrites relationships to knowledge” through the structures of European land deeds and property ownership; however, they also note that this materiality is “buried and normalized” within larger systems of how we think about and organize knowledge. While Anderson and Christen analyze how colonialism structures our relationships to knowledge, Oakley examines how the physical form of a book embodies those relationships.
Taken together, these two critical analyses point us toward questions that should concern those in library and information sciences (LIS), material culture scholars, and designers who work in information systems: how does bias in knowledge organization systems (KOS) relate to the materiality of books that those systems are designed to organize? In this paper, I argue that a material culture analysis of books and library metadata, informed by queer and decolonial lenses, allows us to see how standard library KOSs are built to house a particular kind of book-object, thereby creating a norm of a book that is exclusionary to non-Western, queer, and anticapitalist texts. Library metadata, which consists of the fields and subjects that are used to describe and organize books, codifies that norm and imposes it onto larger systems of information. This data creates entire systems of collecting, cataloguing, and physical positioning that reify the norm of a book while forming an increasingly narrow definition of what constitutes valid knowledge and information.
In this paper, I begin with an analysis of how the materiality of a Eurocentric, commodified book-object is historically constructed, and how it is codified through the standard metadata systems used by institutional libraries and archives throughout the U.S. and much of the world. I examine how these metadata build out into catalogs that systematize the attributes of the book-object, creating oppressive knowledge organization systems. I then turn to non-normative texts, analyzing what happens when these exclusionary KOSs come into contact with works that they were never designed to accommodate – in particular, Indigenous- and queer-authored works. Finally, I examine current activist and radical archiving/cataloguing practices that gesture toward alternate forms of knowledge organization based on the relationalities of different texts, again drawing on long histories of work by Indigenous and queer librarians, scholars, cataloguers, and activists.
The book-object as object: the colonial construction of books as owned “knowledge”
The question of what constitutes a legitimate text lies at the heart of literary and media studies; it also presents a point of historical and ongoing political struggle, as Indigenous nations have fought to legitimize their intellectual traditions in the face of colonial attempts to dismiss, expropriate, and assimilate them. In previous work, I have written about how Indigenous people incorporated, altered, and re-appropriated (indeed, indigenized)[4] print technologies into existing traditions of Indigenous literature and intellectual thought. Beginning in 1659, a Nipmuc man who took on the name James Printer (and yes, the surname is significant) set type for the first Bible printed in North America – a Bible printed in an Algonquian language. Printer
was carefully navigating tensions between continuous adaptation and fierce resistance as he engaged with the printing press. Part of this negotiation was bringing his skill as a multilingual scribe back to his community during the [King Philip’s] war, turning the skills that he learned in colonial contexts into instruments of radical indigenous resistance. As people like Printer adapted colonial writing technologies to their own contexts, writing technologies shaped by colonial and indigenous users and makers passed through native spatial networks. Colonial print technologies came to be brought into indigenous ideological and grammatical conceptions of “image making,” becoming awikhiganak – the Abenaki word that pertains to written or drawn material. Indigenous awikhigan technologies predated the arrival of colonial printing.[5]
Printer is accompanied by other major figures in Indigenous literary history such as Sequoyah, whose Cherokee syllabary from 1821 is used to this day.[6] All of this is to say that the questions of whose text counts as text; what kinds of textual production are valued; and whose definition of a “book” or an awikhigan is followed – these are questions with very real context and stakes for Indigenous sovereignty, historically and today.
With this contested context, we can begin to see how the idea of a text as a printed, bound volume captures a very narrow and decidedly Eurocentric subjectivity. Media studies also has something to say about this; Marshall McLuhan writes of “printing, a ditto device” under whose auspices “the private, fixed point of view became possible and literacy conferred the power of detachment, non-involvement.”[7] Printing, to McLuhan, is a material counterpart to Enlightenment values about rationality and objectivity. The printed book, then, becomes the materialization of those values brought to standardized, mass-produced form. McLuhan also notes that printing constructed the idea of authorship as “the habit of considering intellectual effort as private property,” reifying the idea of a book’s contents as an object unto itself.[8] The idea of a book becomes deeply bound up with assumptions that books (and indeed, information itself) are logical and objective; that they constitute a kind of commodity as objects; and that they pass on their knowledge to their readers in a kind of teacher-student pedagogical relationship. Bibliographers like Phil Round, as well as LIS scholars like Anderson and Christen, have examined how book-objects have functioned in exactly this way in colonial contexts. Round writes on the crucial intertwining of colonial literacy, religious conversion projects (from the early “praying towns” of the Northeast to the notoriously cruel and violent boarding schools of the 1800s), and assimilation ideology. He argues that “for the Europeans who came to the Americas, the content of the book was sometimes less important than its ideological function as ‘the object in which a set of regulations and metaphors was inscribed, giving to it the special status of Truth and Wisdom.’”[9]
The book-object, then, materializes particularly European-Enlightenment ideas about knowledge, viewpoint, and objectivity, which have historically been leveraged against Indigenous peoples through violent, paternalistic education systems. Be Oakley’s analysis of books that are “stiff to the touch” connects this history back to materiality: “Books that are as stiff to the touch as they are blithely mundane. This is what I mean when I say books… Books that when you touch them you feel absolutely unimportant.”[10] Stiff to the touch, inflexible, authoritative – the materiality of these books holds onto “the special status of Truth and Wisdom.” Read closely, Oakley’s last sentence is crucial: when you touch these books, you are the one who feels unimportant. As objects, hardcover books carry a particular authority, perhaps connected to the corresponding concept of authorship.
Micro and macro: metadata and knowledge organization
Think of the spine of a conventional book: picture its title, the author’s surname, maybe the publisher, and – if it’s a library book – perhaps a call number, including a date. Title, author, publisher, call number, date: these are among the metadata fields that most commonly describe a book in a library catalog. These data have their own material qualities, linked on one hand to the material history of cataloguing and on the other to the material history of the book-objects they are designed to describe.
In order to understand the material history of cataloguing, we must look at the material history of MARC. The MARC (MAchine Readable Cataloguing) data format, a system that many (if not most) institutional libraries continue to use[11], was developed in the 1960s and first tested in 1966. Its creator, Henriette Avram, was a mathematician, systems analyst, and programmer who had previously worked at the NSA. She developed MARC to automate and extend existing cataloguing practices, encoding rather than replacing older classification schemes such as the Dewey Decimal System. It is important to note that MARC was designed in ways that do not correspond to how we now think about computers and digital media – it is not networked, or linked, or nodal. Rather, MARC stores data in a digital version of a catalog card, because that is the work that it was originally designed to automate.[12] MARC records consist of fields and subfields of data. For example, field 245 is the title field; add subfield a and you get 245$a, the primary title field; 245$b is the subtitle field; and so on. Finally, MARC contains a series of fields for Library of Congress (LC) subject headings (the 600s), which are standard descriptors of content that allow users to view books about a particular “subject” all at once.
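To make this field-and-subfield structure concrete, the short sketch below models a MARC-style record as plain Python data structures. It is an illustration of the record shape just described – numbered fields, coded subfields like 245$a and 245$b, repeatable 6XX subject headings – not an implementation of the actual MARC binary format, and the sample title, author, and headings are invented.

```python
# Illustrative sketch of a MARC-style record (invented values): a flat set of
# numbered fields, each holding coded subfields, echoing the catalog card
# that the format was designed to automate. Field occurrences are lists
# because MARC fields are repeatable.

record = {
    "100": [{"a": "Surname, Author"}],                  # main entry: author name
    "245": [{"a": "An Example Title :",                 # 245$a primary title
             "b": "with an example subtitle"}],         # 245$b subtitle
    "650": [{"a": "Example subject heading"},           # 6XX: subject headings,
            {"a": "Another subject heading"}],          #      standardized descriptors
}

def display_title(rec):
    """Assemble a display title from 245$a and 245$b, if present."""
    f245 = rec.get("245", [{}])[0]
    return " ".join(part for part in (f245.get("a"), f245.get("b")) if part)

print(display_title(record))   # An Example Title : with an example subtitle
```

Even in this toy form, the card-like shape is visible: the record anticipates an author, a title, and standardized subjects before it knows anything about the particular object it describes.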
Figures 1 and 2 contain the bibliographic metadata for a book in Columbia’s online catalog. Figure 1 displays the information in the user-friendly format to which the catalog defaults; figure 2 displays the MARC record. Through these records, we can begin to see how this metadata sits at the intersection of the materiality of the book-object and the materiality of the system created by Avram, Dewey, the Library of Congress, and other figures and entities involved in the design of LIS systems.
Bibliographic metadata codifies the material and ideological properties of the Eurocentric book-object, and by doing so, it imposes a set of biases and emphases. On a micro-level, bibliographic metadata are meant to describe the book-object in terms of its physical properties, its subject, and any information related to its production. Hence, in the MARC record in figure 2, we see physical descriptors in the 300s, LC subject headings in the 600s, and, if this entry had publication data, it would appear in the 200s. Interestingly, because this is a zine, there is no publisher and therefore no publication data.
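The absence is easy to see if we extend the toy representation from the previous sketch. The records below are hypothetical (the values are invented, and real MARC carries publication data in field 260 or 264): a conventional trade book has an imprint to record, while the zine simply has nothing to put in that slot.

```python
# Hypothetical contrast (invented values): a conventional book record carries
# publication data in the 2XX range (field 260/264), while a zine record has
# no publisher, place, or date to record at all.

book = {
    "245": [{"a": "A Conventional Monograph"}],
    "260": [{"a": "New York :", "b": "Example Press,", "c": "1998."}],   # imprint
    "300": [{"a": "xii, 254 pages ; 24 cm"}],                            # physical description
}

zine = {
    "245": [{"a": "An Example Zine"}],
    "300": [{"a": "1 zine (16 unnumbered pages) : illustrations ; 14 x 22 cm"}],
    # no 260/264 field: nothing to enter
}

for rec in (book, zine):
    imprint = rec.get("260") or rec.get("264")
    print(rec["245"][0]["a"], "->", imprint or "no publication data")
```

The empty slot is not neutral: the structure still expects a publisher, and the zine registers as a record with something missing.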
A lack of publication data is almost unheard of for a conventional Eurocentric book-object, because such objects are so steeped in authorship (Enlightenment authority) and copyright (property ownership). Oakley, Anderson and Christen, and McLuhan have all commented on this – from their respective perspectives as queer independent publisher, scholars of Indigenous studies and LIS, and media theorist. Anderson and Christen write on how copyright merges intellectual property with land ownership, and therefore with dispossession, and they critique the very idea of an author:
because the kind of labor required for authorship mirrors that required for real property, we see an overlap in the making of racialized subjects and those deemed unfit to be authors. Moreover, authorship is determined and conferred through hierarchies of documentation practices where the written word exists as the pinnacle of a civilizational imaginary and thus informs the ideal modern political subject. Those who don’t write and don’t have an identifiable written culture are reduced in this property model to a lower status.[13]
Anderson and Christen’s analysis is important in the context of the many materials held by libraries that originate from Indigenous communities and either lack a single identifiable “author” or were “recorded” or “written” by a white anthropologist or ethnographer to whom the material is “attributed” in the library’s bibliographic data. In order to even exist in a system of data that is designed to hold Eurocentric book-objects, these kinds of stories, texts, and intellectual materials must be converted to a kind of held property – and the property holder, through the concept of single “authorship,” is often white. Meanwhile, McLuhan links authorship and copyright to capitalism, and to print as a commodity, as “rising consumer-oriented culture became concerned with labels of authenticity and protection against theft and piracy.”[14] The heavy emphasis on author and title in the metadata – and indeed, in how we think about books – reflects those “labels of authenticity.”
In line with McLuhan’s analysis, Oakley reflects on the anti-authorial (anti-authority) and anti-copyright conventions of queer, activist, and underground print cultures, positioning these material conventions of production as anti-capitalist:
There is something about the urgency of making zines in which authorship doesn’t become the most important part. This lack of a regard for authorship becomes an anti-capitalist gesture, something like the complete opposite of a trademark. This invites others to build up the ideas of the unknown authors; to plagiarize the content in the refusal of complete ownership over any idea.[15]
Oakley’s analysis of how queer and underground texts often refuse “complete ownership over any idea” resonates with the intellectual traditions of Indigenous nations who do not and have never held knowledge as “property,” and whose knowledge systems are drastically at odds with the model that drives MARC and, indeed, publishing at large. Oakley and other queer, independent, and underground “publishers” (if the commercial word even fits here) stand alongside Indigenous intellectuals by showing us alternative ways that knowledge can be conceptualized, reproduced, archived, and organized.
Break it: non-normative texts in contexts
There have been many nuanced analyses of the oppressive consequences of normative bibliographic data, a few of which I will briefly outline here. Anderson and Christen provide one particularly clear example of the consequences of colonial knowledge structures for Indigenous access to knowledge contained in libraries:
A recent meeting with a tribal member at a university library began with a very common question about how much material the library held about or in relation to the community. The librarian was unable to answer. […] It became clear through discussions with the librarian that the reasons for this lack of knowledge were due, at least in part, to the structure of the metadata. […] There was no field where homelands were named as a part of knowledge relations, no field where communities, clans, families, or ceremonial societies were acknowledged as authorities. What is prominent, of course, is the author field – bearing the weight of its colonial underpinnings.[16]
The real consequences of the narrow colonial materiality of the metadata lie in the dismemberment of alternate knowledge structures. It is a kind of intellectual and symbolic dismemberment that continues the violence of forced colonial education and assimilation into the present day by disappearing Indigenous knowledge into the system.[17] Scholar and radical cataloguer KR Roberto, meanwhile, has examined the appearance and disappearance of queer and transgender identities in MARC fields and in LC subject headings (LCSH). Roberto has found:
Instead of GLBT or LGBT people, Sexual minorities is the authorized form for these terms […] The only subject heading to use the word “queer” is Queer theory (with a cryptic reference to Gender identity as a broader term). If there are no queers in LCSH, what does Queer theory study? […] Removing queerness from the catalog does not eliminate it; rather it creates a space that only values clearly delineated identities. No matter what the terminology says, queerness is still present in the catalog; its explicit invisibility haunts LCSH.[18]
Roberto’s terminology of hauntingly “explicit invisibility” echoes Duarte and Belarde-Lewis’s “reading through and searching through the interstices of subject headings,” a kind of true reading from the margins borne out of erasure and necessity. What the imperfect erasure of queerness and the dismemberment of Indigenous knowledge have in common is a mismatch, a misfit, that is both theoretical and practical. Theoretically, there is a mismatch of knowledge systems between the metadata and the text (as well as the text’s community of origin); practically, there is a mismatch between how a user or reader sees themself and what they see reflected of themself within the system – and therefore within history, literature, and beyond.
However, it is also crucial to emphasize that there are radical cataloguers, scholars, activists, librarians, and many others who are working to break the structures that do not fit and to build alternatives that do. In 1998, founding librarian Gene Joseph (Wet’suwet’en/Dakelh) and head librarian Ann Doyle began implementing a set of Indigenous subject headings in a local (690) MARC field at the Xwi7xwa library on Musqueam territory (at the University of British Columbia). Local fields like this are reserved for library-specific information rather than standardized description – anything from locally defined subject terms to notes about a particular copy’s annotations or signatures. Their classification system was developed in collaboration with, and based upon, the earlier classification design work of Kanien’kehaka librarian Brian Deer at the National Indian Brotherhood library. Interestingly, the librarians at Xwi7xwa submitted their classification system to the Library of Congress MARC Standards Office as a thesaurus titled First Nations House of Learning (FNHL) subject headings. By 2005, the thesaurus had been accepted, and their Indigenously-determined subject headings moved into the 650 field reserved for standard subject headings. This movement from the 690 to the 650 field represents a material formalization of an Indigenous knowledge organization system within MARC data. While it is by no means a complete overturning of the knowledge-as-property model that MARC is built upon, it nevertheless stands as a material Indigenous intervention into library metadata that has served as an important example to other institutions and radical librarians.[19]
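As a rough sketch of what that formalization can look like at the level of an individual record, the snippet below continues the illustrative Python representation used earlier; the heading text and the “fnhl” source code are assumptions for illustration, not transcriptions of actual Xwi7xwa records. It promotes a locally held heading from a 690 field into a 650 field tagged with its thesaurus source.

```python
# Illustrative sketch (invented values): once a local thesaurus is formally
# recognized, headings can move from a locally defined 690 field into the
# standard 650 subject field, carrying a source code in subfield $2.

record = {
    "245": [{"a": "An Example Title"}],
    "690": [{"a": "Example First Nations subject heading"}],   # local subject field
}

def promote_local_headings(rec, source_code="fnhl"):
    """Move 690 local headings into 650 fields, noting their thesaurus source in $2."""
    for heading in rec.pop("690", []):
        rec.setdefault("650", []).append({**heading, "2": source_code})
    return rec

promote_local_headings(record)
print(record["650"])
# [{'a': 'Example First Nations subject heading', '2': 'fnhl'}]
```

The data operation itself is tiny; what matters is that the heading now lives in the field that mainstream systems index as “subject,” rather than in a corner reserved for local idiosyncrasy.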
It is intriguing that elements of Xwi7xwa’s work took place within locally modified fields, as these kinds of fields have been sites of intervention in other institutional collections of texts that carry their own nonstandard knowledge systems. In Amherst College’s Native American Literature collection, the 655 field – normally used to classify genre and form – is used to record the tribal affiliations of authors. Barnard’s Zine Library, meanwhile, uses the 500 and 520 fields, a general note field and a summary field respectively, to describe the unique physical appearance and content of zines with content-rich summaries. The Barnard zine librarian, Jenna Freedman, explained to me that these summaries, which become searchable in the catalog, are her way of achieving the searchability that subject headings are supposed to provide without dealing with the ill-fitting and biased language of the standard LC headings.[20] The summaries are used for a wide variety of descriptive purposes, but generally emphasize the contents of the zine, its acquisition/provenance information, and whether it is part of a series. In this way, they follow the relational framework that Be Oakley describes as central to independent publishing and zine work:
The producers (publisher, bookbinder, printer, paper-maker) of any book (zine, publication or artist book) are in collaboration with each author (writer, designer, image-maker, editor) in the making of the printed object. As these publications are disseminated into the world they are waiting for their interactions with each public they inhabit.[21]
By making room for tribal affiliations and the relational metadata of independent publishing, these examples demonstrate that a level of structural change can take place within the material structures of metadata that we already have. The re-appropriation of local and note fields into a kind of metadata that makes sense for non-normative books is one material step toward knowledge organization systems that can better, if not perfectly, hold these books.
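A final sketch, again in the illustrative Python representation used above and with invented sample values, shows what that re-appropriation can look like in practice: a 655 term carrying an author’s tribal affiliation, a 520 summary rich enough to stand in for ill-fitting subject headings, and a simple keyword search that finds the zine through that summary.

```python
# Illustrative sketch of re-appropriated fields (invented values): a 655 term
# recording tribal affiliation, and 500/520 note fields whose free-text
# descriptions become searchable in place of standard subject headings.

records = [
    {
        "245": [{"a": "An Example Novel"}],
        "655": [{"a": "Example tribal affiliation"}],                    # re-purposed genre field
    },
    {
        "245": [{"a": "An Example Zine"}],
        "500": [{"a": "Hand-stitched binding; photocopied covers."}],    # general note
        "520": [{"a": "A personal zine about queer friendship, mutual aid, "
                      "and neighborhood organizing."}],                  # content-rich summary
    },
]

def keyword_search(recs, term):
    """Return titles of records whose note, summary, or 655 fields mention the term."""
    hits = []
    for rec in recs:
        text = " ".join(value
                        for tag in ("500", "520", "655")
                        for field in rec.get(tag, [])
                        for value in field.values())
        if term.lower() in text.lower():
            hits.append(rec["245"][0]["a"])
    return hits

print(keyword_search(records, "mutual aid"))          # ['An Example Zine']
print(keyword_search(records, "tribal affiliation"))  # ['An Example Novel']
```

Nothing about the underlying format changes here; the fields are simply being made to hold relationships that the standard headings were never built to name.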
The case studies and material culture analyses that I have presented here demonstrate some of the ways that bibliographic metadata, particularly in the MARC format, reify aspects of a Eurocentric book-object, including property ownership and the production of the book as a commodity. The knowledge organization systems that derive from this data format, particularly through an overemphasis on authorial and publication data, marginalize and exclude relationships to knowledge that fall outside of the dominant European print tradition. These systems are insidiously dangerous because they hide behind the supposed neutrality of the institutional library, which conceals its own subjectivities and colonial histories. The invisibility of these systems makes it crucial to amplify both the critical voices of those who are analyzing oppression in the metadata and the creative voices of those who are building alternatives within or beyond the confines of its fields and subject headings.
***