Monthly Archives: March 2015

Week 7

I thought that McDonough’s article for this week offered an informative look at a problem that is easy to overlook. Being new to XML, I had no idea that the language’s deliberately flexible design makes interoperability difficult. It is an interesting point that the differing needs of different library communities create social divisions in how XML-based tools are used, and miscommunication between them.

I also felt that the author’s call for some sort of standardization runs a little counter to the entire philosophy of XML. As he says, “In the designers’ world view, a key benefit to XML is the freedom it provides users to define their own structure for documents and data, using their own semantics, and to escape restrictions software vendors might wish to impose on their users.” In that light, isn’t standardizing the language to some degree contrary to its design? This is definitely an interesting problem, as I don’t see a potential solution, or even whether one is required.

And maybe it’s the comic book lover in me, but I found Walsh’s article (look at all the superheroes!!!) to be very interesting. The idea of a markup language capable of describing documents that combine images and text seems extremely beneficial for the future.

Reading the article, I kept thinking of the X-Men. As a comic book series, it serves as an allegory for the American civil rights movement. This example demonstrates the literary value of such a markup language, and its usefulness for exploring items such as tapestries or other historical documents is obvious. If a historical work makes heavy use of primary sources such as images or maps, an accurate digital rendition could easily live online, either by itself or as a complement to a monograph or other type of publication.

Week 7 Readings

The readings this week address digital preservation, digital ontology, and interoperability. Much of the information in these readings was somewhat abstract and technical, but I think I understand what is at stake.

McDonough

This article discusses the unintended consequences of XML use in the digital library community, mainly the failures of interoperability between institutions, developers, and other users of XML.  The whole idea of XML was to provide flexibility within a generalized, standard markup language so that different programs could communicate with one another.  What has happened, though, is not quite what the creators of XML envisioned.  McDonough attributes this to culture and social relationships between content creators and software vendors (par. 13), among other things.

His solution is to establish “common principles of structural encoding and standardized stylesheets for translation” (par. 36).  The main problem I see is that a standard, universal markup language is probably impossible.  We don’t have a standard, universal human language; we don’t have a standard, universal culture; and we don’t have standard, universal social relationships.  Human variability makes the digital hard to generalize.
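The interoperability failure McDonough describes can be sketched in a few lines. Below is a hypothetical illustration (the element names and records are invented for the sketch, not drawn from any real library schema): two institutions describe the same book in perfectly well-formed XML, but with different structures, so a tool written against one structure silently fails on the other.

```python
# Hypothetical sketch of the interoperability problem: both records
# are valid XML, but an extractor built for library A's flat layout
# returns nothing for library B's nested layout.
import xml.etree.ElementTree as ET

record_a = "<record><title>Moby-Dick</title><creator>Melville</creator></record>"
record_b = "<record><titleInfo><title>Moby-Dick</title></titleInfo></record>"

def get_title_a(xml_text):
    """Extractor written against library A's structure: <title> as a direct child."""
    return ET.fromstring(xml_text).findtext("title")

print(get_title_a(record_a))  # Moby-Dick
print(get_title_a(record_b))  # None -- valid XML, but the tool can't read it
```

Neither record is “wrong,” which is exactly the point: XML guarantees well-formedness, not shared structure or semantics.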

Kirschenbaum

We are again dealing with archives in this article.  The author tells a great story about preservation and the possibilities for research within born-digital archives with his example of MITH.  His hook is the “.txtual condition.”  He says the catechism of this condition is “Access is duplication, duplication is preservation, and preservation is creation – and recreation” (par. 16).  I took this to mean that archiving is all about making things ready for recall.

On the digital side, an archivist has to “be able to identify and retrieve all its digital components,” as Kenneth Thibodeau puts it (par. 16).  This means hardware and software may need to be physically available, or emulated, to access old digital information.  That matters to historians because records are primary sources: digital records will need to be accessible if future historians are to study the digital age.

Evens

This is a very abstract article about the abstraction of the digital world.  It is about binary code (0, 1) being behind everything digital.  We don’t think about that as we go about our digital lives, but it is real.  Evens says, “The digital has an ontology, a way of being, and products and processes generated through digital technologies bear traces of this way of being” (par. 9).  Later, Evens says it is not just about technology: binary code has abstract, ideal, and discrete characteristics that end up being expressed in human relations (par. 15).  I found this article difficult to digest, but I do think it would be an interesting area of study for philosophers.

Endless Possibilities

Just wanted to check in and see if anyone else loves CSS and HTML. There is so much we can do with them, and this is only the beginning. I think that learning web design has great potential for the field of Public History. The “public” sees the world in these formats (i.e., web pages), so we should know how to show them history through the lens with which they are most comfortable. Accessibility is key here.

I do want to reiterate my earlier statements: I believe that this is the job of the Public Historian, not the Academic. We need trained professionals (digital historians) in academia to teach these skills for use by the public historian. It is easy to get lost in learning web design elements and lose sight of what is important about what we are doing (or trying to do) as academics/professionals.

There is a great work that predates the digital revolution and is poignant to me as we work through these problems; you can access it here.


Now you may like this, Christina!

Week 6

Argamon & Olsen

This article does an excellent job highlighting the potential of computers as tools in the humanities. Similar to the Gibbs and Owens article, it shows how computers can help brainstorm and articulate research questions. One particular example from the article is the new connections discovered between Diderot and d’Alembert’s Encyclopédie and the Journal de Trévoux. With the discovery of new connections and new historical questions to ask, who’s to say that we won’t discover new connections in centuries-old texts, or even millennia-old ones?

It also seems to me that there may be potential in using machine learning to examine content or stylistic differences and similarities in unattributed historical writings. There are numerous texts written by that pesky anonymous person. Without machine learning, these writings may remain unattributed to any particular author and continue to be wondered about. Such tools may not be able to tell us definitively who wrote what, but they may provide insight and suggest new potential authors to theorize about. In a large enough body of work from a similar time period, could such a tool help us discover, if not one author, perhaps a style linked to an institution or an organization? Could such a discovery help us discuss and/or further develop certain hegemonic theories?
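A toy sketch of the attribution idea above (every text and author name here is invented, and real stylometry needs far larger samples and careful validation): build a function-word frequency profile for each known author and compare an unattributed text against them with cosine similarity.

```python
# Invented-data sketch of stylometric attribution: function-word
# frequencies as style profiles, cosine similarity as the comparison.
import math
from collections import Counter

FUNCTION_WORDS = ["the", "of", "and", "to", "a", "in", "that", "it"]

def profile(text):
    """Relative frequencies of the function words in a text."""
    counts = Counter(text.lower().split())
    total = sum(counts[w] for w in FUNCTION_WORDS) or 1
    return [counts[w] / total for w in FUNCTION_WORDS]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

known = {
    "Author A": "the whale and the sea and the ship of the line",
    "Author B": "it is that it was in a time that a man in a town",
}
mystery = "the storm and the deck of the whale"

scores = {name: cosine(profile(text), profile(mystery))
          for name, text in known.items()}
print(max(scores, key=scores.get))  # closest stylistic match -- not proof
```

The output is only a ranking of stylistic closeness, which matches the hedge above: such tools suggest candidates to theorize about rather than settle authorship.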

One other brief point (instead of risking beating a dead horse…with an already-dead horse) that Argamon and Olsen are careful to make: new technology can lead to new interpretations of the past; however, any new interpretations must still be researched and developed by professional historians, as context must be acknowledged, especially in new historical connections.

Kramer

I agree with Kramer’s argument that now, during the digitization of the humanities, is the time to reconsider how we think about and approach the interpretation of history. We do not have to completely revolutionize the practice of history, but incorporating a new way of thinking could freshen up the field and the methods used to define it. Given the increasing significance of the digital archive, thinking of digital and physical archives as a historiography in and of themselves is an important step toward building a dynamic fluidity not only between primary and secondary sources, as Kramer asserts, but also between scholarly works of history and public history.

Week 6 Readings

Machine Learning, Text Mining and Archives

Argamon and Olsen

This article from 2009 introduces the challenges humanities scholars face in dealing with voluminous digital/digitized sources, or the “global digital library.”  The authors believe text mining is complementary to traditional textual analysis.  They used predictive text mining, comparative text mining, and clustering/similarity analysis on three different projects.  In two cases, the digital tools supported the original, traditional scholarly conclusions; in one case they found new connections previously unnoticed by traditional scholarly methods.
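The clustering/similarity analysis mentioned above can be illustrated with a minimal sketch (the documents and labels are invented stand-ins, not the corpora the authors mined): score how much vocabulary two documents share and pair each document with its nearest neighbor.

```python
# Invented-data sketch of similarity analysis: Jaccard similarity on
# vocabulary sets, then nearest-neighbor pairing of documents.
docs = {
    "essay_1":  "liberty reason progress science method",
    "essay_2":  "reason method science liberty evidence",
    "sermon_1": "faith grace doctrine scripture tradition",
}

def jaccard(a, b):
    """Share of the combined vocabulary that the two texts have in common."""
    wa, wb = set(a.split()), set(b.split())
    return len(wa & wb) / len(wa | wb)

for name, text in docs.items():
    others = {n: jaccard(text, t) for n, t in docs.items() if n != name}
    nearest = max(others, key=others.get)
    print(f"{name} -> {nearest} ({others[nearest]:.2f})")
```

On a real corpus the same pairwise-similarity idea, scaled up, is what lets unexpected connections between bodies of text surface for a scholar to then interpret.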

I thought the warnings were interesting.  You could create the results you want to find based on the construction of the task itself, and there is anxiety inherent in doing criticism by algorithm.   I think these warnings are born of normal professional concerns for historians, whether digital tools are used or not.  I mean, is the evidence there, or are we forcing the evidence to fit our argument?

Kramer

I did not know there were any issues between historians and archivists.  Kramer’s article makes it sound like historians take archivists for granted, and that historians think the archive is for their use alone.  I don’t like the sound of that, but the article is from summer 2014, so apparently it’s a thing.  I agree with his point that digitization will bring the archival and history professions closer together.  I like the idea of digital historiography and all the possibilities that digitization opens up for historical transparency, accessibility and openness.  I think historians and archivists all need to reconsider their relevance in the digital age, so this kind of discussion makes sense.

The history of historical inquiry could become much more interesting than traditional historiography.  Following intellectual journeys through the digital world sounds more fun to me than reading dry bibliographies.  I think this is what Kramer meant when he said we could dynamically link primary sources and their subsequent interpretations.