Week 7

The readings were pretty technical this week and were, I admit, over my head. The idea behind XML and library use seems interesting; if all metadata must conform to certain standards for compatibility, that strips away the institutional quirks and/or biases that have marked archival work. On the other hand, this also makes sources harder to interpret; the archivist, like the historian, should have some right to organize information according to standards, but with a sense of individualism.

Walsh states that, “The act of encoding a document is a form of discovery, or prospecting, in which the encoder maps a document’s structure, identifies semantic elements of interest, and documents relationships internal and external to the document.” This is an interesting way to see encoding, and I think I have to remind myself of this. While working with web design (did I mention I love this?) or databases I have felt disconnected from the “doing history” part of the process; however, these digital tools are part of the process. Using them forces the historian to explore the document in new ways. Comic books are an interesting way to discuss the implications of this because they pair small amounts of text with images.
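Just to make Walsh’s idea of encoding-as-discovery concrete for myself, here is a rough sketch of what marking up a single comic book panel might look like. This is not the actual vocabulary Walsh describes; the element names, attributes, and file name are all made up for illustration, and I have written it as a little Python script since that is the tool we are learning.

```python
# A toy sketch (not Walsh's actual tag set) of encoding a comic book panel:
# mapping structure (page, panel), semantic elements (the speaker), and the
# relationship between text and image. All names here are hypothetical.
import xml.etree.ElementTree as ET

page = ET.Element("page", n="1")
panel = ET.SubElement(page, "panel", n="3")
ET.SubElement(panel, "image", href="page1_panel3.png")    # hypothetical image file
balloon = ET.SubElement(panel, "balloon", who="hero")      # speaker recorded as metadata
balloon.text = "Sample dialogue goes here."

# Printing the tree shows the structure the encoder has "discovered" as markup.
print(ET.tostring(page, encoding="unicode"))
```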


Week Six – Meta to My MetaData

Kramer

This week’s readings were interesting as they explored the implications of digital history for the way we view our profession. Kramer suggests that perhaps “all” that historians do is add meta to metadata. While at first I found this insulting, eventually I saw the value of the label. History adds information to, and about, primary source documents. What we do is analysis, and by placing the historian and the archivist together in this way, the act of “doing” history becomes more clearly defined and accessible.

The historiography is where Kramer runs into trouble. I do not believe that the entire historiography of a source can be layered onto a primary source and still be understood. Perhaps arguments and counterarguments could be built into the system; for instance, full access for people who would find that meta useful, and common access for the average person who cannot, without training, understand the implications of what he/she is looking at.

Olsen and Argamon

Text mining is an interesting concept that seems filled with both problems and solutions. When a historian works with primary sources, they are limited in the number of texts they can use for any given project. A computer is not human; it can therefore sort through many documents much more quickly, which means faster work with a more complete body of sources. This sounds like it would be extremely helpful in the humanities, but a computer is not human.

A computer cannot understand the text it reads, it cannot weigh circumstances, and it cannot feel the era it “reads”; a computer only recognizes symbols and their recurrence in a particular set of data. As Dr. Church said, it is difficult for the computer to mine symbols and metaphors in primary source documents. This is a red flag for the use of text mining, because it is limited in scope; the historian can view patterns through it, but computers have a limited capacity. Perhaps this is comforting.

The article suggests that one goal of text mining is to make very large data sets “manageable and meaningful.” In this pursuit, I believe that text mining provides the ability to organize and work with a large set of primary source documents; however, it does not provide meaning. Meaning comes from the human ability to grapple with data. Meaning comes from the interpretation of the historian. Text mining can lead the professional to explore different research questions and broaden his/her ability to work with sources, but it cannot be relied upon to give meaning to sources.
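To ground this, here is a minimal sketch of the “manageable” half of that equation: counting how often a few terms of interest appear across a folder of documents so that a human can decide which ones deserve a close reading. The folder name and term list are hypothetical, and the meaning-making still happens after the script runs.

```python
# A minimal text-mining sketch: tally a few chosen terms across many documents.
# The "documents" folder and the terms are placeholders for illustration.
from collections import Counter
from pathlib import Path

terms = {"liberty", "commerce", "religion"}
counts = {}

for path in Path("documents").glob("*.txt"):        # hypothetical folder of sources
    words = path.read_text(encoding="utf-8").lower().split()
    tallies = Counter(word.strip('.,;:"()') for word in words)
    counts[path.name] = {term: tallies[term] for term in terms}

# The output is just a table of document -> term frequencies; interpreting
# what the pattern means is still the historian's job.
for doc, freqs in sorted(counts.items()):
    print(doc, freqs)
```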

Idea for Space History

In my main field of research, I am looking at the way outer space is viewed and handled in Soviet and U.S. culture in the 20th century. For instance, what happened in U.S. culture between the 1960s and the 1970s to change people’s perception of the conquest of outer space from admirable to unnecessary? How is this reflected in culture? How could I use digital tools to research these questions and represent the answers digitally?

From discussion with our class instructor, Dr. Church, I will find primary sources in public media, such as newspaper articles and the Time magazine archives, and in government sources, such as NASA archives and State of the Union addresses. Text mining, web scraping, and Python were mentioned. These are all tools and techniques that I will be learning in the next several weeks. I am excited to see how it all turns out.
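As a first experiment, I imagine the web-scraping piece looking something like the sketch below, using the requests and BeautifulSoup libraries. The URL is a placeholder rather than a real archive address, and the tag names would have to be adjusted to whatever the actual source uses.

```python
# A rough web-scraping sketch: fetch one page and pull out headline text.
# The URL and the <h2> assumption are hypothetical placeholders.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/space-coverage/1969"   # placeholder, not a real archive
response = requests.get(url, timeout=30)
response.raise_for_status()

soup = BeautifulSoup(response.text, "html.parser")

# Print each headline; a real script would adapt the tags/classes to the site
# and save the text for later text mining.
for headline in soup.find_all("h2"):
    print(headline.get_text(strip=True))
```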

Week 7

I thought that McDonough’s article for this week was very informative about a problem that is easy to overlook. Being new to XML, I had no idea that the language’s built-in variability was making interoperability so difficult. It is an interesting point that the different needs of different library communities are creating social differences in the use, and the miscommunication, of XML-based tools.

I also felt that the author’s call for some sort of standardization runs a little counter to the entire philosophy of XML. As he says, “In the designers’ world view, a key benefit to XML is the freedom it provides users to define their own structure for documents and data, using their own semantics, and to escape restrictions software vendors might wish to impose on their users.” In this way, isn’t standardizing the language to some degree counterintuitive to its design? This is definitely an interesting problem, as I don’t know a potential solution, or even if one is required.

And maybe it’s the comic book lover in me, but I found Walsh’s article (look at all the superheroes!!!) to be very interesting. The idea that a markup language can be designed to describe documents that combine images and text is extremely beneficial for the future.

Reading the article, I kept thinking of the X-Men. As a comic book series, it is often read as an allegory for the American civil rights movement. This example demonstrates the literary value of such a markup language, but the use for exploring items such as tapestries or other historical documents is obvious. If a historical work makes heavy use of primary sources such as images or maps, an accurate digital rendition could easily live online, either by itself or as a complement to a monograph or other type of publication.

Week 7 Readings

The readings this week address digital preservation, digital ontology, and interoperability. Much of the information in these readings was somewhat abstract and technical, but I think I understand what is at stake.

McDonough

This article talks about the unintended consequences of XML use in the digital library community, mainly why there are failures of interoperability between institutions, developers, and other users of XML. The whole idea of XML was to have flexibility within a generalized, standard markup language so that different programs could communicate with one another. What has happened, though, is not quite what the creators of XML envisioned. McDonough says this is because of culture and the social relationships between content creators and software vendors (par. 13), among other things.

His solution is to establish “common principles of structural encoding and standardized stylesheets for translation” (par. 36). The main problem I see is that a standard, or universal, markup language is probably impossible. We don’t have a standard, universal human language, we don’t have a standard, universal culture, and we don’t have standard, universal social relationships. Human flexibility makes the digital hard to generalize.
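To picture what a “stylesheet for translation” is doing, here is a toy stand-in written in Python rather than XSLT: it simply renames one institution’s elements into another’s so the two records line up. The element names and the mapping are invented for illustration; they are not taken from any real schema McDonough discusses.

```python
# A toy stand-in for a translation stylesheet: map one institution's element
# names onto another's. The vocabulary here is hypothetical.
import xml.etree.ElementTree as ET

RENAME = {"creator": "author", "dateIssued": "date"}   # made-up mapping

source = ET.fromstring(
    "<record><creator>N. Armstrong</creator><dateIssued>1969</dateIssued></record>"
)

for element in source.iter():
    if element.tag in RENAME:
        element.tag = RENAME[element.tag]

print(ET.tostring(source, encoding="unicode"))
# -> <record><author>N. Armstrong</author><date>1969</date></record>
```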

Kirschenbaum

We are again dealing with archives in this article. The author tells a great story about preservation and the possibility for research within born-digital archives through his example of MITH. His hook is the “.txtual condition.” He says the catechism of this condition is “Access is duplication, duplication is preservation, and preservation is creation – and recreation” (par. 16). I took this to mean that archiving is all about making things ready for recall.

In the digital realm, an archivist has to “be able to identify and retrieve all its digital components,” as Kenneth Thibodeau puts it (par. 16). What this means is that hardware and software may need to be physically available, or emulated, to access old digital information. This would be very important to a historian, because records are primary sources. Digital records will need to be accessible if future historians are to study the digital age.

Evens

This is a very abstract article about the abstraction of the digital world. It is about binary code (0, 1) being behind everything digital. We don’t think about that as we go about our digital lives, but it is real. Evens says, “The digital has an ontology, a way of being, and products and processes generated through digital technologies bear traces of this way of being” (par. 9). Later, Evens says it’s not just about technology; binary code has abstract, ideal, and discrete characteristics that end up being expressed in human relations (par. 15). I found this article difficult to digest, but I do think it would be an interesting area of study for philosophers.
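One way I made the “binary behind everything” point concrete for myself was with a few lines of Python showing the bytes and bits underneath an ordinary word; this is my own illustration, not an example from Evens.

```python
# The word we read on screen is stored and processed as these 0s and 1s.
text = "history"
for byte in text.encode("utf-8"):
    print(f"{byte:3d}  {byte:08b}")
```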


Endless Possibilities

Just wanted to check in and see if anyone else loves CSS and HTML. There is so much we can do with them, and this is only the beginning. I think that learning web design has so much potential for the field of Public History. The “public” sees the world in these formats (i.e., web pages), so we should know how to show them history through the lens with which they are most comfortable. Accessibility is key here.

I do want to reiterate my earlier statements: I believe that this is the job of the Public Historian, not the Academic. We need trained professionals (digital historians) in academia to teach these skills for use by the public historian. It is easy to get lost in learning web design elements and lose what is important about what we are doing (or trying to do) as academics/professionals.

There is a great work that predates the digital revolution and is poignant to me as we work through these problems; you can access it here.


Now you may like this, Christina!

Week 6

Argamon & Olsen

This article does an excellent job highlighting the potential of computers as tools in the humanities. Similar to the Gibbs and Owens article, it shows the use of computers to help brainstorm and articulate research questions. One particular example from the article is the new connections discovered between Diderot and d’Alembert’s Encyclopédie and the Journal de Trévoux. With the discovery of new connections and new historical questions to be asked, who’s to say that we won’t discover new connections in centuries-old texts, or even millennia-old texts?

It also seems to me that there may be potential in using machine learning to examine content or stylistic differences and similarities in unattributed historical writings. There are numerous texts written by that pesky anonymous person. Without machine learning, these writings may remain unattributed to any particular author and instead continue to be wondered about. Such tools may not be able to definitively tell us who wrote what, but they may provide insight and give us new potential authors to theorize about. In large enough bodies of work from a similar time period, could such a tool help us discover, if not one author, perhaps a style linked to an institution or an organization? Could such a discovery help us discuss and/or further develop certain hegemonic theories?
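Here is a back-of-the-envelope sketch of that authorship idea: compare an unattributed text to two candidate authors using function-word frequencies and cosine similarity. The text samples are placeholders, and a serious study would need much longer samples and better features; this is only meant to show the shape of the method.

```python
# A toy stylometry sketch: function-word profiles compared by cosine similarity.
# The three text samples below are placeholders, not real sources.
from collections import Counter
import math

FUNCTION_WORDS = ["the", "of", "and", "to", "in", "that", "it", "is"]

def profile(text):
    words = text.lower().split()
    counts = Counter(words)
    total = max(len(words), 1)
    return [counts[w] / total for w in FUNCTION_WORDS]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

candidate_a = "the cause of liberty is the cause of all mankind"     # placeholder
candidate_b = "it is in the nature of commerce that it expands"      # placeholder
unknown = "the nature of the cause is that it is common to all"      # placeholder

print("similarity to A:", cosine(profile(unknown), profile(candidate_a)))
print("similarity to B:", cosine(profile(unknown), profile(candidate_b)))
```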

One other brief point (instead of risking beating a dead horse…with an already-dead horse) that Argamon & Olsen are careful to make: new technology can lead to new interpretations of the past; however, any new interpretations must still be researched and developed by professional historians, as context must be acknowledged, especially in new historical connections.

Kramer

I agree with Kramer’s argument that now, during the emergence of the digital humanities, is the time to reconsider how we think about and approach the interpretation of history. Not that we have to completely change and revolutionize the practice of history, but incorporating a new way of thinking could definitely freshen up the field and the methods used to define it. Regarding the increasing significance of the digital archive, thinking of digital and physical archives as historiography in and of themselves is an important step toward building a dynamic fluidity not only between primary and secondary sources, as Kramer asserts, but also between scholarly works of history and public history.

Week 6 Readings

Machine Learning, Text Mining and Archives

Argamon and Olsen

This article from 2009 introduces the challenges humanities scholars face in dealing with voluminous digital and digitized sources, or the “global digital library.” The authors believe text mining is complementary to traditional textual analysis. They used predictive text mining, comparative text mining, and clustering/similarity analysis on three different projects. In two cases, the digital tools supported the original, traditional scholarly conclusions, and in one case they found new connections previously unnoticed by traditional scholarly methods.
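For my own reference, here is a small sketch of the clustering/similarity idea using scikit-learn’s TF-IDF vectorizer and k-means. This is not the authors’ actual toolchain, and the four “documents” are placeholders; it only shows how similar texts can be grouped without anyone reading them first.

```python
# A toy clustering sketch: vectorize a few placeholder documents with TF-IDF
# and group them with k-means. Not the pipeline used in the article.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

documents = [
    "treatise on natural philosophy and the motion of bodies",
    "sermon concerning providence and the duties of the faithful",
    "essay on commerce, trade, and the wealth of nations",
    "discourse on grace, scripture, and the church",
]

vectors = TfidfVectorizer(stop_words="english").fit_transform(documents)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

for doc, label in zip(documents, labels):
    print(label, doc)
```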

I thought the warnings were interesting.  You could create the results you want to find based on the construction of the task itself, and there is anxiety inherent in doing criticism by algorithm.   I think these warnings are born of normal professional concerns for historians, whether digital tools are used or not.  I mean, is the evidence there, or are we forcing the evidence to fit our argument?

Kramer

I did not know there were any issues between historians and archivists.  Kramer’s article makes it sound like historians take archivists for granted, and that historians think the archive is for their use alone.  I don’t like the sound of that, but the article is from summer 2014, so apparently it’s a thing.  I agree with his point that digitization will bring the archival and history professions closer together.  I like the idea of digital historiography and all the possibilities that digitization opens up for historical transparency, accessibility and openness.  I think historians and archivists all need to reconsider their relevance in the digital age, so this kind of discussion makes sense.

The history of historical inquiry could become much more interesting than reading traditional historiography.  To follow intellectual journeys in the digital world sounds more fun than reading dry bibliographies to me.  I think this is what Kramer meant when he said we could dynamically link primary sources and their subsequent interpretations.


The “hegefox”

This article was much more realistic than the others we have read, and it asks questions similar to those that have cropped up in class. Obviously, history, the humanities, and academia in general are changing as a result of the digital era. This may not always be for the better, but historians, and humanities professionals in general, have an opportunity and a responsibility to preserve what is traditionally important while taking on digital tools.

The term “hegefox” was discussed, and I think it perfectly describes the conundrum and perhaps the solution. Leavis in the 1960s, and the author now, speculated on how we professionals could deliberate over new technologies and evaluate them properly: how the humanities could thrive in a new environment without losing the precious pillars of our profession or becoming obsolete. The idea of the hegefox was his suggestion: we must preserve traditional methodology while also incorporating new digital tools.

The author presents the same question I have been asking myself: “…to what extent will those of us who care about the humanities be allowed to fret about the present state and future of our disciplines in the same way that Leavis wondered about what can and should be done, without being considered ‘highbrow,’ elitist snobs?” It is just as important, I would argue, for the hedgehog to respect the fox as it is for the fox to respect the hedgehog in this debate. Those who want to dive headfirst into quantitative research and digital analysis without considering the importance of traditional methodology do not allow for the evaluation necessary for viable and long-lasting solutions.

I wonder if the digital age, because of the fast pace at which it evolves, produces a moment in which we must choose. Instead of being able to evaluate and progress, will the humanities have to let go of the past and embrace a future that steps further and further away from the “human” element? Social media, as mentioned at the beginning of the article, has driven people both closer together and further apart; we may be losing part of our human connection. Is this what will happen to the humanities?

This article made me look closely at the database work that we have been doing. How did people feel about the primary sources after working with them in a database? I saw that it would be possible to come to new conclusions, draw correlations I may not have seen before, and work with large amounts of data; but I also felt distant from the subject matter, the period, and my own connection with the material. Maybe that is because this method is new to me? Maybe not.


Week 5 – Porsdam

Porsdam’s article raised an interesting point, one that I had not considered much. How do we balance the need for more scientific methods with the values of the humanities? I had only briefly heard of Snow and Leavis before this article, and I don’t think I really appreciated at the time the importance of their debate. Obviously, bridging the gap between the “two cultures” is crucial to the development of digital history.

Once again, one of the recurring themes of the class is present in the readings. In this gap-bridging, there is a focus on the “process rather than the finished product” in history. This methodological transparency was mentioned in the Gibbs and Owens reading as well, but the risks of wide engagement by the general, untrained public in history need to be tempered with the training of traditional historical academia. The problem with this is that, as Porsdam states, “the humanities have come to be seen today as out of touch with life outside the walls of the university. This has in turn led to an attempt to move more students into vocational training in order that higher education may be reserved for more elitist-minded students.”

Bridging this gap may involve redefining the humanities through a vocational filter. As humanistic research becomes more digitized, there may be a more blue-collar (or maybe “less elitist” is a good enough way to say it) association with the humanities. The trick then, I think, is balancing elitist-associated academic methods and training with the engagement of the less elitist, general-interest public. Stated differently, the problem, perhaps, is less about balancing the current methods of the humanities and the sciences and more about altering how we view these two cultures.