Category Archives: Information Theory

Solutions to Controlled Vocabularies Part Two……


Structuralism can be understood as a normative science in which a classification system would begin with the norm and afterwards, if necessary, deal with any exceptions. This goes a long way towards explaining how some of the solutions to classification biases are conceptualised. Both Olson and Mai have essentially inserted into the old systems a new way to deal with any exceptions to the norm. They both imply an understanding and incorporation of difference into their models, but only once the normative system has first been applied. In poststructuralism, however, there emerges a more appropriate definition of difference: not as an exception or 'limit' to the 'core', but as a regulating principle that functions to define the core of a subject. As James Williams (2014) demonstrates, in poststructuralism "the limit is not compared with the core, or balanced with it […] the limit is the core". Poststructuralism sees dualism as a problematic approach to understanding language, and what is even more problematic about the dualist approach of classification theorists is that they unbalance the dualism between sameness and difference, lending more significance to sameness than to difference.

Perhaps it is better to explain the principle of 'core' and 'limit' through an example. Something like 'Irishness' is traditionally defined through what is at its core, that is, being born in a certain place and time, to certain parents, being a certain skin colour, speaking a certain language, and so on. This understanding of Irishness is what regulates our political system and society as a whole. The 'limit' in this example would be the problems that arise with the definition once we factor in ethnic minorities who become naturalised with their own set of cultures and histories. In a traditional, structuralist understanding, these minorities are the exception, and while the Irish government may legislate to create better conditions for ethnic minorities, there will always be discrimination because the limit does not change the core. Poststructuralism, in Williams' (2014) words, would argue instead that "The truth of a population is where it is changing. A nation is defined at its borders", that is, at the point of difference, because everything that happens at the borders of a country changes how the core is defined. It is the 'difference' of ethnic minorities that is most representative of where a nation is going in the future, and so the 'limit' is, for poststructuralism, the most meaningful way that a nation, or a text, or indeed a classification system can be defined. In terms of vocabulary control, then, what defines how a text is classified should not be a biased and static system. Texts should be classified by emphasising how they are being, or could be, used in 'different' ways by different disciplines. A book like 'Words of Power', which would be classified under LCSH and DDC as Philosophy: Logic, would no longer be limited to such static systems. It could also be used by feminist scholars, or by those working in ethics, anthropology, sociology, psychology, gender studies, linguistics, and so on. Defining the book under strict controlled vocabularies denies access to 'exceptional' groups, which results in a lack of real innovation and creativity in academia. In this sense, there are lessons to be learned from the poststructuralist theorist Deleuze, who argues for the power of openness in creativity.

To reassert, then: poststructuralism denies the traditional approach to classification through controlled vocabularies and aims to positively disrupt traditional classification systems in order to achieve greater autonomy for texts and their users. The problem with the solutions discussed above is that they are developed from the point of view that it is too disruptive to completely change our classification systems. Here, disruption is seen negatively rather than positively. The seriousness of this ultimately limiting attitude, and of the reliance on outmoded classifications, can best be understood by applying the work of Jacques Derrida to the topic. Derrida's (1976) 'textual positivism' would not ask 'what is this book about?', but rather, 'what does this book do?' This question radically changes the way we would categorise texts because it places an emphasis on multiplicity of use rather than the singularity of meaning as defined by 'specialists'. Derrida's approach distinctly adopts anthropocentrism and sees the classification system as onto-theological. Derrida's 'origin' is constantly being affected by a text's 'presence', that is, what a text is doing in the moment. It is this 'presence' that leads to both the future and the past, that essentially pulls the 'origin' into ever-evolving new contexts. A 'sign', then, for Derrida is nothing more than a 'trace' of that change, a trace that can be followed to a point of difference so long as we understand that once we reach the trace, we too have altered its origin, which has moved off beyond our grasp. It is this point of difference that allows for creativity to emerge in what Derrida defines as 'play'. Applied to classification systems, the moment at which a reader reads 'Words of Power' in relation to psychoanalytical studies is the moment that changes the origin indefinitely, opening up that text to new contexts in the future and the past, and thereby changing how that text should be defined in a classification system.


This essay will argue, then, that classification systems should take 'difference' as a key underlying principle in the way we organise information. To do otherwise is to consciously do 'violence' by excluding individuals or groups from our knowledge economy. As Derrida (1976, 140) writes, "There is no ethics without the presence of the other but also, and consequently, without absence, dissimulation, detour, differance, writing". This is because traditional classification systems, and the solutions outlined in this paper, try to regulate the past, present and future of information in order to make it more accessible. However, the adoption of a classical scientific approach only makes it possible to categorise texts if texts, and language itself, are seen as static. But Derrida (1976, 67) teaches us that "The concepts of present, past, and future, everything in the concepts of time and history that assumes their classical evidence – the general metaphysical concepts of time – cannot describe the structure of the trace adequately". This amounts to a denial that there is any 'final' past, present or future of a text. Derrida (1976, 69) would argue that what is really happening in controlled vocabularies is a violent and unethical act of control, borne out of a fear and misunderstanding of 'death': "Spacing as writing is the becoming-absent and the becoming-unconscious of the subject. By the movement of its drift/derivation the emancipation of the sign constitutes in return the desire of the presence. That becoming – or that drift/derivation – does not befall the subject which would choose it or would passively let itself be drawn along by it. As the subject's relationship with its own death, this becoming is the constitution of subjectivity." Death in this sense is seen as continuity from one context to the next rather than a final end. In many ways, traditional classification systems are a kind of death sentence in the traditional sense, in that they render entire texts and disciplines static and irrelevant as new contexts emerge and are classified in inadequate ways. Perhaps the point is best represented by Williams (2014), who claims that "The demand for clarity is dangerous because clarity justifies violent judgements and exclusions on the basis of a promise of a world of understanding and togetherness."

What is happening in our classification systems, then, is that subject specialists are attempting to create greater accessibility to information by categorising it under specific and specialist subject headings, in order to create a sense of clarity when one is searching for that information. Texts are gathered together around a principle of sameness and difference in which things that are similar are classified under the same subject headings. The reality is that this system, based on controlled vocabularies, is extremely biased and fails to account for real difference, heterogeneity and multiplicity in our information world. There have been attempts to create new systems that, for example, are more appropriate for those interested in feminist studies, but these systems simply shift control from one universalising group to another, smaller one. They do highlight a very important politics and injustice in the way texts are classified, but they do so by reverting to an equally biased system.

It is at the point of difference that real innovation and creativity occurs. Our universities are set up as places in which creativity and independent thinking are supposed to lead to new innovations, while our public libraries are moving more and more towards providing creative spaces for communities to grow and develop. Yet the way in which we search for information is limited and contradictory, and no longer fit for purpose. The argument that 'difference' should be prevalent in classification systems is not absurd or contradictory, but it would require a complete overhaul of the way in which we understand and categorise information. Perhaps a working model already exists in the way internet search engines like Google operate. Websites on Google are presented to us in a way that can promote difference as a classifying principle, because those websites that have the most links to other active sites are presented as more prominent and relevant. What would happen if a similar approach were adopted by libraries? What if, when we searched for information under a certain topic, we were presented with a list of texts organised around the number of connections those texts have to other texts, and thus to other disciplines? This would perhaps provide a classification system that celebrates multiplicity, ranking texts according to the many possible ways they can be used and interpreted.

In the field of literature, I remember my PhD supervisor suggesting that I include some comparative work between John Banville and Gabriel Garcia Marquez in my thesis, because the postcolonial contexts of Ireland and Colombia bear many similarities. As a researcher, in all the hours spent searching for information on Banville and postcolonialism, I was never presented with any texts that implied real difference or multiplicity. Searches were singular and restrictive, never indicating that any different approach was possible. The classification system provided no new spark of creativity for young researchers to pursue. In fact, using Boolean logic, I found that the more search terms I entered across different searches, the more 'relevant' the presented results became. If difference were to become an organising principle, then information would be retrieved that prioritised multiplicity, and that would lead to greater inclusion and thus innovation.
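To make that suggestion a little more concrete, here is a minimal, purely illustrative sketch of a catalogue ranked by the breadth of a record's connections rather than by a single assigned heading. The titles, links and discipline tags are invented for the example and are not drawn from any real catalogue or from the authors cited here; a real system would need far richer link data (citations, shared readers, shared keywords) than this.

```python
# Illustrative sketch: rank catalogue records by how many distinct
# disciplines their connections to other texts span. All links and
# discipline tags below are invented for the example.
from collections import defaultdict

links = {
    "Words of Power": [
        ("Gender Trouble", "gender studies"),
        ("Course in General Linguistics", "linguistics"),
        ("Purity and Danger", "anthropology"),
        ("Begriffsschrift", "logic"),
    ],
    "Begriffsschrift": [
        ("Words of Power", "philosophy"),
    ],
}

def rank_by_difference(links):
    """Order records so that those used across the most disciplines come first."""
    disciplines = defaultdict(set)
    for record, connections in links.items():
        for _, discipline in connections:
            disciplines[record].add(discipline)
    return sorted(disciplines.items(), key=lambda kv: len(kv[1]), reverse=True)

for record, fields in rank_by_difference(links):
    print(f"{record}: {len(fields)} disciplines -> {sorted(fields)}")
```

Whatever the particular mechanics, the point is that 'relevance' here is measured by breadth of connection across disciplines, not by the single place a specialist has assigned the text.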
The problem at the moment is that the way we search the digital databases is dictated by the way in which information is placed on the stacks. The stacks, and the numbers on the books, do not have to change. Where the information is located in the library is largely irrelevant, because nowadays we rarely search the stacks anyway. What needs to change is the way information is classified in the digital databases. There is nothing stopping us from radically changing this digital system to one that is more inclusive of difference and that contains less 'violent' vocabularies.

Bibliography

Beghtol, C. (1986). “Semantic validity: concepts of warrant in bibliographic classification systems”. Library Resources & Technical Services, Vol. 30 No. 2, pp. 109-25.

Borges, J.L. (1952). “The analytical language of John Wilkins”, Other Inquisitions 1937-1952. Souvenir Press, London, 1973.

Bowker, G.C. and Star, S.L. (1999). Sorting Things Out: Classification and Its Consequences. MIT Press, Boston, MA.

Derrida, Jacques (1976). Of Grammatology. Johns Hopkins University Press, Baltimore, MD.

Lakoff, G. (1987). Women, Fire, and Dangerous Things: What Categories Reveal about the Mind. University of Chicago Press, Chicago, IL.

Mai, Jens-Erik (2010). Classification in a social world: bias and trust. Journal of Documentation, Vol. 66 Issue 5, pp. 627-642.

Miksa, F. (1998). The DDC, the Universe of Knowledge, and the Post-Modern Library. Forest Press, Albany, NY.

Olson, Hope A. (2001). Sameness and difference: a cultural foundation of classification. Library Resources and Technical Services, Vol. 45 No. 3, pp. 115-122.

Shirky, C. (2005). "Ontology is overrated: categories, links, and tags". Clay Shirky's Writings about the Internet. Economics & Culture, Media & Community, available at: http://www.shirky.com/writings/ontology_overrated.html (Date Accessed 28th April 2015)

Williams, James (2014). Understanding Poststructuralism. Routledge, London.

Problematic Solutions to Controlled Vocabularies……

Controlled vocabularies in modern classification systems were designed to assist in making information more accessible to library patrons and professional researchers. The very idea that vocabulary can be controlled finds its origins in Enlightenment ideology, in which modern science was founded on the principles of 'knowledge and truth'. Within the field of linguistics, this meant that the greater the understanding of the mechanics of language, and the greater the accuracy and control over words, the closer one could come to truth. In applying this ideology to library classification systems, both the Dewey Decimal System and the Library of Congress Classification System attempted to create subject headings under which library content could be listed. The claim that these systems are biased is no longer really in dispute, as both systems have attempted in recent years to adapt to new emergent disciplines that had been marginalised due to the biases within the respective classifications. However, the continued attempt to regulate the classification of information through slightly more 'flexible' controlled vocabularies is detrimental to real innovation and creativity, not only in academic and scientific research, but also in terms of promoting real diversity and creativity, privately and publicly, in social, political, economic and cultural spheres. It is ironic that the very systems that set out to improve accessibility to information, and thereby to foster greater innovation and awareness, have in fact led to greater ignorance. The problem persists insofar as classification systems continue to be regulated by outmoded ideology in which classification 'specialists' take as their starting point the mantra that information can be classified in a coherent and organised way under specified subject headings. The truth is that until classification systems more fairly account for the differences and diversity that exist within singular texts, these systems will continue to be biased, acting as an obstacle to knowledge rather than a medium of access. The following paper sets out to explicate the problems created by controlled vocabularies in classification systems by discussing the issue from a poststructuralist perspective, specifically utilising Derrida's work in Of Grammatology to explain the contradictions in traditional classification systems and to further critique some modern solutions to these problems.


Borges (1952, 104) asserts that "there is no classification of the universe that is not arbitrary and conjectural". Borges identified that at the core of any attempt to classify or to organise objects through controlled vocabularies is the fact that this organisation is based on shifting premises that can be challenged from a point of difference. Beghtol (1986) illuminates the problem with traditional classification systems further by arguing that these systems are created through concepts of authority, status and control. In this sense, traditional classification systems become homogeneous and hegemonic, leading theorists like Shirky (2005) to argue that classifications are, by their very nature, biased. Miksa (1998, 81) demonstrates that the problem with classification lies in its Enlightenment roots, arguing that classification is based on "the idea that somewhere, somehow, we can, or should try to, produce the one best classification system that will serve all purposes". What is emphasised here is the assumption that there is one purely scientific system that pertains to truth, in that this one system is complete in its ability to categorise all knowledge. Miksa (1998, 81) goes on to highlight the assumption that "knowledge categories are by nature hierarchical and logical in a classical, systematic sense". Any kind of hierarchy is established to deny real difference in a subject, because everything that exists under that hierarchy must be shaped to fit into the categories that the hierarchy dictates. If that hierarchy is Western, or North American, then there is naturally going to be a bias towards the ideological priorities of those demographics. Such systems, then, become too rigid and cannot adequately account for emerging disciplines or, indeed, for the transfer of information across national and continental boundaries.

Theorists have developed the critique of traditional classification systems further to incorporate contextual elements into the debate. These contextual arguments all revolve around the idea that classification systems work to identify similarities between objects and thus to categorise them under related headings. Bowker and Star (1999, 131) suggest that "classifications that appear natural, eloquent, and homogenous within a given human context appear forced and heterogeneous outside that context". Thus, extracting classifications from their original context within the system demonstrates just how biased the system actually becomes. Mai (2010) develops this concept further to highlight the prominence of 'similarity' in classification systems: "most bibliographic classification theory stipulates that documents are holders of concepts and concepts are context and human independent constructs and that classification brings together concepts based on similarity". Lakoff (1987, 6) explains this idea of similarity further by explicating that since the writings of Aristotle, and following through the entire history of Western thought, objects were categorised based on whether or not they had 'common properties'. Olson (2001, 116) develops the concept of similarity further still to incorporate concepts of 'sameness and difference' in organising information: 'once we collect this innovative material we try to organize it by gathering what is the same […] We build our classifications using these facets that bring things together according to some kinds of sameness'. However, Olson, like many contemporary theorists, navigates more towards 'sameness' as a regulating principle of classification systems, ultimately paying tribute to difference within the dualism but never seriously considering its significance comprehensively enough. Olson further relates the idea of sameness to that of 'discipline', referring to discipline as 'the primary facet in our classification systems'. 'Discipline' is a word that implies authority and strictness in controlling vocabularies and categories of information.

What is interesting is that any attempt by theorists to explain classification as biased through the concept of sameness implicitly means using the concept of 'difference' as a critical hinge upon which to base those critiques. It is through this 'hinge' that solutions to the biases within controlled vocabularies start to emerge. Clare Beghtol (1982, 2), for example, suggests that "increasing multidisciplinary knowledge creation makes it critical to reconsider the traditional reliance on discipline-based classification and to try to solve the problems that orientation has created". Olson (2001, 120) highlights, in relation to the classification of English literature, that this traditional mode privileges 'colonisers over colonised'. She (2001, 118) further develops the importance of 'difference' when she explains that "Recent recognition of the validity of oral literary traditions and the questioning of existing literary canons suggest that this definition of literature is exclusive rather than inclusive. It is defined by difference as much as by sameness". It is through the importance of understanding difference in classification systems that some solutions to the biases begin to emerge. Olson's solution appears to be more practical and achievable than many others. The problem is that Olson's solutions (2001, 120-122) self-consciously stop short of real difference, preferring to base change around 'local control' rather than anything truly radical. This means giving libraries, both regional and national, 'notational options' that allow them to make amendments to the subject headings so as to find vocabularies that are more suitable to the given context. Olson also suggests that "Flexibility can also be achieved by varying the citation order of classifications – shifting which samenesses get priority. It must involve rejecting at least some of the samenesses and differences of our classifications." Olson justifies these changes by referencing postmodernism's rejection of universals, seeing more local control as a disruption of traditional all-encompassing systems. However, she never fully grasps postmodern, and thus poststructuralist, concepts of difference and disruption. The suggested solutions simply replace one form of universal control with another, more local version of the same thing. The solutions still rely on 'specialists' to assert authority over the vocabularies used to classify, taking authority out of the hands of the users and placing it into the hands of individuals. This is as problematic as Mai's (2010) concept of 'cognitive control', which allows for the continuation of the traditional system once it has been properly theorised, questioned and explained, so long as it is self-conscious of its own potential biases. The problem is that being self-conscious of bias does nothing to eradicate the bias; it is simply bias in a softer guise. And in any case, no matter how self-conscious we are of the bias, objects in our libraries will continue to be irretrievably buried under inadequate subject headings.


What becomes clear, then, is that there is an awareness of poststructuralism's influence, through the referencing of the concept of 'difference', in attempts to understand and solve the problem of controlled vocabularies in classification systems. However, there is also a clear reluctance to engage with poststructuralism in a meaningful way. In fact, there is a clear misunderstanding of poststructuralism in the dualistic understanding of 'difference' as the opposite of 'sameness'. This suggests that there is an assertion of poststructuralist politics in promoting some 'difference' at local and national level, but that this politics is built on structuralist rather than poststructuralist linguistic foundations, thereby rendering it contradictory and self-defeating. All of the 'solutions' to classification bias are retained within a traditional mindset in which texts ought to be categorised into similar or related categories that are hierarchical in nature. No matter how much flexibility one allows within this model, and no matter how much 'trustworthiness' is achieved through cognitive control, there still persists a traditional model that gives authority control to an oppressive few. The remainder of this paper will attempt to explain the real value of poststructuralism to this debate, and will further attempt to demonstrate the radical potential of poststructuralism not only to disrupt traditional classification systems, but to disrupt them in a positive way that could lead to more meaningful solutions to the problem.

Part two, explicating how poststructuralist theory impacts upon classification systems, is coming soon…………

Part 3: The Semiotic Approach to Citation Indexing


This brings us to the final approach to theorizing citation indexing. This approach was termed 'phenomenological' by Chubin and Moitra (1975) in that it looks at citing as a symbolic exchange. Small (1980) puts forward the idea that citations become markers or symbols which are indicative of theories, concepts, ideas or methods. Blaise Cronin develops this approach in a more interesting way by looking at citations as both sign and symbol in his essay 'Symbolic Capitalism'. Cronin (2005, 143) asserts that a citation is a signaling device or action indicating that one is familiar with and has drawn upon a particular author and work. However, here Cronin once again places equal emphasis on the author and the work, meaning his semiotic approach draws more from structuralism than post-structuralism. The concept of the author becomes a regulating force over all future iterations of the text, meaning that the text can never be re-conceptualized in ways that lead to greater innovation, never finally released from its original hegemonic authorial context and given a life of its own. Cronin is not alone in positing a semiotic, symbolic relationship between texts. Wouters (1993, 7) suggests that citations act as two different signs, one that points back to the original text and one that refers to its own context. Warner (1990, 28) rejects this approach, arguing that "the ambiguity of citation in aggregate form can be seen as a special case of the indeterminacy other written signifiers, such as words, can acquire when torn from their discursive context". However, while there is some validity to this argument, Warner is still reliant on a citation being held within an original authoritative context. Cronin's approach (2005, 156) is successful with regard to his focus on sign systems, arguing that "references and citations need to be unraveled in respect of their respective sign systems." He (Cronin 2005, 159) goes on to suggest that this sign system is triadic in nature: "The referent of the bibliographic reference is a specific work; the referent of a citation the absent text that it denotes; in the case of large-scale citation counts, the referents are the cited authors."

The problem with Cronin's approach is that he views language from a structuralist perspective, as is clearly evident from his triangular structure of language in which signs fall back upon an original context. But reference to Roland Barthes's theory above demonstrates that signs do not necessarily operate in such a coherent direction. Rather, signs are dispersive entities that ripple out into the past, present and future, thereby creating multiple contexts. They do not necessarily fold back upon the original text, but rather re-conceptualize that text, pushing it into the future as a 'new' work. Baudrillard (1981, 150) would refer to Cronin's sign system as the "mirage of the referent". This essay supports Baudrillard's concept of the sign becoming a kind of false referent that signals back to the original text. So citations as signs do not really contain a past; rather, they only push past texts into a newly imagined future. In many ways, post-structuralism depicts citations as signs in terms of what Brian McHale (1987, 166) defined as heteroglossia, that is, "a plurality of discourse […] which serves as the vehicle for the confrontation and dialogue among world-views". What this recognition must do is destroy any sense of hierarchy within the citation process. It not only tears citations from their authorial and hierarchical structure, but it can also seriously undermine the normative approach that all theories appear to fall back into, no longer allowing citations to be retained under a hegemonic capitalist scheme.

In conclusion, this paper has attempted to explicate the three main approaches to understanding and theorizing citation indexing. It has done this through a brief review of the literature available in the academic field. The suggestion is that citation indexing has become blinded to the hierarchy that now controls it. In this sense, Sosteric's (1999) argument that hegemonic control over scholarship would be consolidated through the proliferation and globalization of citation practices in the wake of the technological revolution has well and truly been realized. This can be argued as we see scholars become blinded to the underlying capitalism that controls scholarly thinking by embedding scholarship within a fragmented and contradictory paradigm. Many scholars argue for a more expansive theory through the interpretative, phenomenological and semiotic approaches, but these remain retained within authoritative contexts and ultimately collapse back into a normative approach. By identifying the persistence of an underlying capitalist structure, this essay has attempted to take a more holistic and ontological approach to the subject. It has also attempted to utilise some post-structuralist theory in order to develop the semiotic approach of Cronin. In doing so, this paper argues for the freeing up of Cronin's sign system to incorporate a more dispersed, heterogeneous theory that could ultimately create a freer, more autonomous citation system.

References

Baudrillard, J. (1981), For a Critique of the Political Economy of the Sign, London: Telos Press

Chubin, D.E. & Moitra, S.D. (1975), Content analysis of references: adjunct or alternative to citation counting? Social Studies of Science, 5, 423-441

Cronin, B. (2005), ‘Symbolic capitalism’, The Hand of Science: Academic writing and its rewards. Lanham, MD: Scarecrow.

McHale, Brian (1987), Postmodernist Fiction, New York and London: Methuen

Small, H.G. (1980), Co-citation context analysis and the structure of paradigms, Journal of Documentation, 36(3), 183-196

Sosteric, M. (1999). Endowing mediocrity: Neoliberalism, information technology, and the decline of radical pedagogy. Radical Pedagogy. http://www.radicalpedagogy.org/radicalpedagogy.org/Endowing_Mediocrity__Neoliberalism,_Information_Technology,_and_the_Decline_of_Radical_Pedagogy.html

Warner, J. (1990), Semiotics, information science, documents and computers, Journal of Documentation, 46(1), 16-32

Wouters, P. (1993), Writing histories of scientometrics or what precisely is scientometrics?

Part 2: The Interpretative Approach to Citation Indexing


The second, and again fragmented, approach to citation indexing is best described as the interpretative approach, which relies on the idea of citation as a communicative act that forms a relationship between texts. May (1967, 890) suggests that citations are 'deviants' and are partly informed by the 'scientific, political and personal motivations' of the user. This has led to citation indexing being viewed as a social science, with the emphasis on communication between texts and also between authors. Mitroff (1972) questioned the normative approach in this way by suggesting that referencing relies on subjective behavior in the methods scientists use to cite. What this means, according to Gilbert and Mulkay (1980), is that citation behavior is context dependent. The problem with this approach was best summarized by Blackwell and Kochtanek (1981), who point out that while citation indexing is a communicative relationship involving two texts, it does not make explicitly clear what the nature of the relationship actually is. This leads to the inclusion of psychological analysis in the debate. Harter, Nisonger and Weng (1993) suggest that there is psychological validity to citation usage in that citations do not always retain a clear topical relevance. However, little consensus was reached regarding what the texts are actually saying to each other, given the rather slippery application of terms like 'subjective' and 'context-based'.

Once again, as long as a fragmented rather than holistic approach is taken to the subject, the real value of citation indexing, if any at all, will not be realized. For example, Stanley Fish (1989, 164) argues from a reader-response perspective that "the convention is a way of acknowledging that we are involved in a community activity in which the value of one's work is directly related to the work that has been done by others; that is, in this profession you earn the right to say something because it has not been said by anyone else, or because while it has been said, its implications have not been spelled out." However, Fish's explanation only assesses the citation process through an insular relationship between two texts, rather than assessing qualitatively 'why' a specific cited text is valuable. His approach still falls victim to the idea expressed by Voos and Dagaev (1976) that citations function on the assumption that they have an equivalent value. In this sense, citations fail to distinguish between degrees of importance among differently weighted texts. This has led to Czarniawska-Joerges' (1998, 63) supposition that citations act as a "trace of conversations between texts". In this sense, Merton's (1977, 84) early argument that citation behavior is too cognitively complex to be accurate and comprehensive still holds true. However, this does not prevent Brodkey (1987, 4) and Bourdieu (1991, 20) from falling back on the idea that there are normative procedures that regulate citation practices.
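To make the equivalence assumption concrete, here is a minimal, purely illustrative sketch; the paper labels and weights are invented, and the weighting scheme is not drawn from Voos and Dagaev or any other cited author. A flat count treats every citation as worth exactly one, while even a crude weighting by how central the citation is to the citing text produces a different ranking.

```python
# Illustrative sketch of the equivalence assumption in citation counting.
# Paper labels and weights are invented; the weight stands in for how
# central the citation is to the citing text (a passing mention versus a
# sustained engagement), which a flat count ignores entirely.
from collections import Counter

citations = [
    ("A", "X", 0.1),  # perfunctory mentions of X
    ("B", "X", 0.1),
    ("C", "X", 0.1),
    ("D", "Y", 1.0),  # sustained engagements with Y
    ("E", "Y", 1.0),
]

# Flat counting: every citation counts as one, so X outscores Y.
flat_counts = Counter(cited for _, cited, _ in citations)

# Weighted counting: the same data, but degrees of importance are kept,
# and Y now outscores X.
weighted_counts = Counter()
for _, cited, weight in citations:
    weighted_counts[cited] += weight

print("flat:", flat_counts.most_common())
print("weighted:", dict(weighted_counts))
```

The point is not this particular weighting, which is as arbitrary as any other, but that a flat count erases exactly the degrees of importance the interpretative approach worries about.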

The key problem with the interpretative approach is that it adopts the scientific tenet that the process of citing, and the relationship between texts, needs to be clearly defined. Post-structuralist theory can be useful here in that it demonstrates that one cannot really assert clear definitions based around authorial intention onto context-based reading processes. Roland Barthes's (1967) essay 'The Death of the Author' argues that understanding the intention of an author is neither useful nor desirable when understanding textual referents. This is because language operates not as a circular, reciprocal structure, but as a more dispersed set of signs. A citation marker, then, cannot refer backwards to highlight the importance of an older text; rather, a citation marker can only refer forward into the future of that 'old' text. This is because language does not work as a static system; in post-structuralism, language is a highly dispersive and heterogeneous marker that pushes 'past' texts into the future while re-contextualizing them in the process. The interpretative approach views language as something ordered and permanent. The fact that its proponents cannot figure out what these 'ordered' citation markers are actually saying should act as a solid indicator that citations do not carry on a conversation between the original cited work and the citing work, and that each time a work is cited it is transformed into a new context, taking on new signification. In this sense, the author as an authority becomes irrelevant and dispersed, in that he or she cannot possibly retain control of the original information. It is here that the academics theorizing citation indexing come unstuck. If they fall back upon a normative approach, then they must realize how ideologically corrupt that approach is, due to the overarching commodification of education and research, not to mention the fact that normative theory is an attempt to assert control and authority by imposing predictable practices. However, if they embrace contemporary linguistic and cultural theory, then they must accept that they will lose control of the hierarchy altogether.

References

Barthes, Roland (1967), The Death of the Author, Aspen, No. 5-6

Blackwell, P.K. & Kochtanek, T.R. (1981), An iterative technique for document retrieval using descriptors and relations, Proceedings of the 44th American Society for Information Science Annual Meeting, Washington: ASIS, 215-217

Bourdieu, P. (1991), Language and Symbolic Power, Cambridge, MA: Harvard University Press

Brodkey, L. (1987), Academic Writing as Social Practice, Philadelphia: Temple University Press

Czarniawska-Joerges B. (1998), Narrative Approach to Organization Studies, London: Sage

Fish, S. (1989), Doing What Comes Naturally: Change, Rhetoric, and the Practice of Theory in Literary and Legal Studies, Durham NC, Duke University Press

Gilbert, G.N. & Mulkay, M. (1980), Contexts of scientific discourse: social accounting in experimental papers, in Knorr, K.D. et al (eds.), The social process of scientific investigation, Dordrecht: Reidel, 269-294

Harter S.P., Nisonger T.E. and Weng A. (1993), Semantic relationships between cited and citing articles in library and information science journals, Journal of the American Society for Information Science, 44(9), 543-552

May, K.O. (1967), Abuses of citation indexing, Science, 156, 890-892

Merton, R.K. (1977), The sociology of science: an episodic memoir, The Sociology of Science in Europe, Carbondale: South Illinois University Press, 3-141

Mitroff, I.I. (1972), The myth of subjectivity or why science needs a new psychology of science, Management Science, 18, 613-618

Voos, H. & Dagaev, K.S. (1976), Are all citations equal? Or did we op.cit. your idem? Journal of Academic Librarianship, 1(6), 19-21

Part 1: The Normative Approach to Citation Indexing


This is the first part of three short critiques of citation indexing….

The first theoretical approach to citation indexing is the normative approach. However, much of the discussion around this approach remains fragmented, as protagonists of the approach maintain an outlook that assesses normative measures by analysing codes and processes 'within' the practice of citing. Cronin (1984, 2) explains that "Implicit in this is the assumption that authors' citing habits display conformity and consistency." This view was originally developed by Garfield (1963), who argues that citation indexes are quantitatively valuable if they adhere to scientific principles. The fact that this argument requires codified modes of behaviour demonstrates that the approach looks only at the processes of citing, rather than asking questions about the value of citing itself and the motivations that encourage or dictate authors to cite in the first place. After Kaplan (1965) argued for a citation approach that was sociological, in that citations relate to other kinds of social data, Merton (1973) developed the normative approach to include four categories upon which this code can be identified and understood. These include: Universalism; Organised Skepticism; Communism; and Disinterestedness. These four categories were then expanded by Mitroff (1974) to eleven.

However, this method of assessing only an implicit code of reference within citation practices ultimately falls victim to hierarchy, in which a few elite or powerful authors become dominant players in influencing new research. Whitley (1969, 219) argues that "The formal communication system also forms the basis for the allocation of rewards: instrumental and consummatory. Thus it is a means of exercising social control . . . Publication of an article in an archival journal signifies a degree of recognition for the author, while legitimizing the object of research and methodology." Thus, the danger of any normative approach that relies on there being established rules or codes of practice that regulate citation practices is that it is prone to become part of a system of control in which influential academics begin to benefit from a normative approach that acts as a kind of pyramid scheme. Cronin (1984, 12-13) seems to celebrate the notion that "Maverick ideas, or notions which are, scientifically speaking, revolutionary, are thus effectively debarred from the official record of science – the journal archive". Storer (1966) highlights that citations will continue to be used out of a principle of self-interest, in which scientists adhere to the norms because citations are necessary commodities in which colleagues share a mutual interest. This monetization of citations is confirmed by Hagstrom (1971), who goes on to argue that citations coincide with the value of grants, funding and university rewards. However, the fact that academics are engaging in a discourse that commonly accepts the commodification of ideas within an educational setting is ethically reprehensible. It also demonstrates a lack of real interest in exploring the core value of citation indexes, because the academics in question are benefiting from being cited. It can clearly be seen from looking at the literature that there is an acceptance of the monetization of citations as part of normative practice. However, the normative argument is highly fragmentary in that it fails to acknowledge that the citing norms are only compliant with an underlying monetized hierarchy. All the norms do is reinforce a homogeneous and hierarchical academic system. The approach cannot claim to be truly normative because the norms are actually imposed.

Mike Sosteric, in his essay 'Endowing Mediocrity', takes a more holistic approach to the subject as he attempts to expose the narrative that underlies and informs the normative codes in citation analysis. In doing so he gives greater context to some of the above-mentioned problems with the normative approach to citation indexes. Sosteric (1999) examines the influence of capitalism and cybernetics on bibliometrics, asserting that citation indexing creates a homogeneous narrative that reasserts hierarchy within education. Sosteric expands upon Teeple's (1995, 1) suggestion that the 1980s "signified the beginning of what has been called the triumph of capitalism". Sosteric (1999) continues to argue that "as a result of the neoliberal push, universities are being colonized, both physically and intellectually, by capital, its representatives, and its ideologies." What can be seen here is that the normative trends that regulate citation indexing are monopolized by capitalist processes. Senior or established academics at the top of the hierarchy directly benefit from the setting up of normative modes of practice, because the more their work is cited, the greater the monetary and symbolic gain. Less established academics cannot become more visible unless they pay tribute, through normative citation practices, to the established scholars and universities who exert significant authority over the career trajectories of younger and emerging academics and researchers. In this sense, normative practices within citation indexing are regulated under hegemonic control. And as Boor (1982) points out, the system is highly susceptible to manipulation, especially now that it has come under the complete control of cybernetic processes, insofar as citation counts can be 'engineered' through unfair means in order to create inflated citation scores. Therefore, Nelson (1997, 39) may refer to citation indexing as "academia's version of applause", and Grafton (1997, 5) may insist that it is codified by "ideology and technical practices", but their assessments remain fragmentary. Once we assess the processes of citation from a more holistic approach, we must question the very ideology that is creating such practices and more deeply consider the true value that they have.

References:

Cronin, Blaise (1984), The Citation Process: The Role and Significance of Citations in Scientific Communication, Taylor Graham

Garfield, E. (1963), Citation indexes in sociological research, American Documentation, 14(4), 289-291

Grafton, A. (1997), The Footnote: A Curious History, Cambridge, MA: Harvard University Press

Hagstrom, W.O. (1971), Inputs, outputs and the prestige of university science departments, Sociology of Education, 44(4), 375-397

Kaplan, N. (1965), The norms of citation behaviour: prolegomena to the footnote, American Documentation, 16(3), 179 – 184

Merton, R.K. (1973), The sociology of science: theoretical and empirical investigations, Chicago University Press

Mitroff, I.I. (1974), The subjective side of science: a philosophical inquiry into the psychology of the Apollo moon scientists, Amsterdam: Elsevier

Nelson, P. (1997), Superstars, Academe, 87(1), 38-54

Sosteric, M. (1999). Endowing mediocrity: Neoliberalism, information technology, and the decline of radical pedagogy. Radical Pedagogy. http://www.radicalpedagogy.org/radicalpedagogy.org/Endowing_Mediocrity__Neoliberalism,_Information_Technology,_and_the_Decline_of_Radical_Pedagogy.html

Teeple, Gary (1995). Globalization and the Decline of Social Reform. Toronto: Garamond Press.

Storer, N.W. (1966) The social system of science, New York: Holt Rinehart & Winston

Whitley, R.D. (1969), Communication nets in science: status and citation patterns in animal

Academia, Capitalism and Bibliometrics


Many of the ideas and concepts within academic scholarship are quite simple; however, they are often dressed up in quite complex discourses. This paper aims to reduce Cronin's article to its more basic ideas and then to assess their validity and relevance. It is interesting that the article is framed by the title 'Symbolic Capitalism' and yet Cronin never makes explicit reference to the capitalist contradictions inherent in scholarship. All capitalism, no matter what tag or label you add to it, is concerned with a monetary system. Academics cannot claim to be interested only in collecting symbolic status, because this status inevitably leads to greater monetary gain. Bibliometrics in this sense is an extension of a longstanding hierarchical system, an 'economy of attention' within academia in which politics overshadows the search for truth. Furthermore, Cronin's semiotic approach cannot really add much value to the debate until he moves it more fully into a promotion of Open Access and Open Source modes of publishing.

Cronin is essentially attempting to validate the value of bibliometrics and citation indexes in assessing the significance of scholarship. It is for this reason that he draws a distinction between "enduring scholarly impact […] and, on the other hand, web-based measures of 'transient group interest'". The idea that citation and referencing provide a more relevant account of a scholar's status than his or her media celebrity is accurate; however, there is no doubt that the two are deeply connected. Scholars sit in university chairs, but they also do consultation work across a wide spectrum of public and private enterprises. They sit on funding allocation committees, on external examining boards, on the boards of private companies and on government advisories. They accumulate a media profile in much the same way as they do through bibliometrics, and the one informs the other. Those academics at the top of their fields hold quite a lot of influence over the career trajectories of those who are just entering the hierarchy. Citation, in this sense, does not take place on the basis of the actual merit of scholarship, but out of the necessity to network by acknowledging the work of those academics who might become influential in one's work being funded, published and promoted.


The real contribution that Cronin's article makes is in his semiotic approach to understanding citation and referencing. He suggests that citations act as signposts within a discipline to authoritative and meaningful scholarship, referring to citations as "frozen footprints in the landscape of scholarly achievements." This is drawn from Saussure's structuralist approach to linguistics: like Saussure, Cronin views language as stable and static. Following on from this approach, Cronin can argue that citations retain 'enduring' characteristics, in that they are quantifiable elements of culture building. The problem here is that structuralist models were quickly displaced by post-structuralist principles, within which language is no longer seen as permanent; rather, language is seen as dispersive and highly manipulable. Or, more metaphorically speaking, seasons change and alter the shape of those 'frozen footprints', if not melting them altogether. The structuralist approach suits Cronin's purpose in that he views academia as a closed-off community in which cultural norms and significances are established by the participants. However, the politics of academia, as mentioned above, is one driven by capitalist gain, in which citation indexes become a more globalized form of academic hierarchy. This happens because bibliometrics does not change the politics. Young scholars cannot get published within their discipline if they write a paper that does not reference the hierarchy of that discipline. Why? Simply because that work will be considered incomplete by the hierarchy itself. Structuralism will tell us that consistent referencing and citation lead to more authority in assessing an author's contribution to a discipline. But post-structuralism will be highly suspicious of this word-game, in which language becomes a tool of hierarchical control over a discipline that allows established scholars to earn more money. Culture, then, is simply a construct of hierarchy.

In this sense, it does not really matter how Cronin dresses up bibliometrics in semiotic garb. It may add to the ways in which citations and references produce meaning, but only insofar as this meaning is retained within a closed academic framework. But of course, Cronin is acutely aware of this closed community and, in fact, supports it. He acknowledges that bibliometrics is playing a role in commodifying academia and that academia has always been a commodity. He goes so far as to compare scholarly citations to the stock exchange. It is in this sense that he draws on Stanley Fish's idea of 'interpretative communities'. Fish in particular relies on a kind of 'ganging up' in forming community. His theory posits that readers do come to a consensus about what texts mean. However, this consensus is often dictated by those within a hierarchy who wield more power. The issue is even more acute within a closed community such as an academic discipline, within which scholars are clamoring for university chairs, research funding and, now, positions on citation indexes. In actual fact, Cronin's use of reader-response is antithetical to his structuralist approach. Fish is indeed a reader-response proponent, but reader-response theory is itself a highly dispersed field of study with often contradictory ideas contained within it. Cronin's use of the term is far too general to be meaningful. In any case, Fish's 'interpretative communities' do not adapt very well to academia, because it is extremely difficult for those outside of academia to become contributors to meaning within it, while the core tenet of reader-response is that more readers build greater consensus. This is because of the inaccessibility of academic scholarship and journals, which are expensive to access and are usually only subscribed to by interested bodies such as universities and research institutes. In order for, say, an internet blogger to enter the debate, they would have to somehow gain access to scholarship. A subscription to just one discipline in JSTOR, for example, costs more than $6000 per year. And JSTOR is only one of many such databases.

Finally, the structuralist approach adopted by Cronin, coupled with reader-response theory, does allow him to open up a new way of assessing and conceptualizing the validity of citation and referencing indexes. However, his essay really only moves to reassert the status quo within a hierarchical academic system. I believe that the adoption of post-structuralist theories, coupled with a move towards Open Access and Open Source scholarship, is the only way of achieving scholarly work and a bibliometrics that is truly meaningful, based on the merits of the work and not on its political or monetary motives. But this idea would not work within Cronin's framework of 'Symbolic Capitalism', because it would deny direct monetary reward for academic scholarship and would open scholarship up to a new world of scholars who exist outside of the university and research system, thus dispersing homogeneous hierarchical systems and allowing real innovation to emerge through new contexts. This is self-evident in Cronin's essay, because his work is written for a traditional academic audience who might not be so quick to read it had it put forward a pro-open-access argument. I believe maintaining the status quo is one reason why scholars are quick to distinguish the 'impact' of altmetrics from the more 'enduring' effect of bibliometrics.

[Perhaps we should cease citing and referencing altogether, as a protest against ‘Academic Capitalism’]

Information as ‘story’


It is interesting how the past continues to steamroll into the present when you least expect it. I was sitting in a lecture on Information Theory when the literary theory of my past lit a fire in my mind once again. I had thought studying the MA in Library and Information Studies was a new departure into a different future, until I heard Dr. Lai ask, "What is 'information'?". Of course, there was no response. We had just spent the last week reading theorists who had been studying the subject for decades and who had failed to come to a definite answer to the question. Dr. Lai answered the question for us: 'Everything!'. That naturally did not really narrow it down all that much!

I have to admit I was frustrated by the readings that week. My first impression was that information theorists are stuck back in a time before post-structuralism, still theorising circles around each other in pursuit of a definitive definition of the word 'information'. And all for what? So that more regulation can be introduced. So that the word, the idea, can be further controlled. There was a sense in all of the readings that each theorist believed they were being objective. I remember thinking that the more they try to narrow it down, the more out of control and expansive the word becomes. That is because every attempt to define the word resulted in more being added to it. For me, the heterogeneity of the word is where its strength lies. Let's not try to tie it down. Why not let the word grow organically? Let's explore its possibilities so as to create more space for innovation to emerge. Let's finally learn that narrow definitions that lead to stricter rules and regulations actually destroy creativity.

And then, with those thoughts, a new answer came rushing back out of the past. Dr. Lai was right: information is everything. But what is everything? The answer……'story', or narrative. The key terms that define 'information' revolve around data, process, knowing and communicating. There emerged the idea in the lectures and readings that information is essentially manipulated data, that is, data used in specific contexts by people with a specific agenda. Poststructuralist linguistic theory determines that the same word spoken by two different people results in two different words. Why? Because words are not just lines on a page, or sounds vibrating through the air. Words are experiences with context and subjectivity built into them, and that 'experience' of a word changes in mid-air, transformed the moment it leaves the speaker's lips. What it transforms into is another, different experience that depends on who is hearing it. It might sound strange, but no one word is the same. So in this sense, or more accurately, in my own sense, information is narrative, a never-ending game of Chinese Whispers spiralling out of control because no two people in the game speak the same language…….