
Mid Term Assignment: VoS

Zach Kleinbaum, Electra Washburn

Digital Humanities

Midterm Writing Assignment

3/8/17


http://vos.ucsb.edu

 

Describe and evaluate the significance of the scholarship for the humanities?

 

Voice of the Shuttle (VoS) is a database dedicated to the humanities.  VoS consists of extensive research in the form of weblinks covering twenty-eight disciplines of the humanities.  Within these twenty-eight categories, there are hundreds of links that direct the user toward research and scholarly articles related to each category.  The topics range from traditional humanities disciplines such as Anthropology, Art History, and History to more obscure ones such as Cyberculture, Minority Studies, and Technology of Writing.  This broad scope of disciplines is one aspect that allows the website to contribute significantly to humanities scholarship.  Additionally, the detailed organization of links creates a serious and comprehensive database on the humanities.  Overall, the volume and range of information in this database make it an extremely relevant source of scholarship for the humanities.

 

How does the project push forward (or fail to push forward) the state of knowledge of a discipline?

 

Voice of the Shuttle certainly pushes the state of knowledge in the humanities forward.  The volume of material on the various subjects is one obvious way in which the database contributes to the knowledge of the various disciplines within the humanities.  Before opening any of the links on a certain discipline, one can gather a great deal of information simply by looking at the organization and titles of the links that lead to the actual articles.  For example, under the category of “Photography” on the homepage of the database, one can immediately recognize some of the major photography museums by looking at the titles listed under “Galleries and Museums.”  Additionally, one can tell that one of the most important photographers is Ansel Adams, for most of the links under “Photography” fall under the category “Ansel Adams.” Therefore, by simply glancing at this database, one can learn more about these humanities disciplines.  Once one wants to explore these topics in greater depth, the organization of the website makes it extremely easy to do so.  One critique of the way in which this project pushes the state of knowledge of the disciplines forward is that it does not include any original research.  Additionally, the fact that some links do not work is one way the project fails to push the state of knowledge forward.  Nevertheless, one cannot deny that the access to hundreds of other websites, and the sheer volume of them, ultimately makes the project successful in pushing the knowledge of the disciplines forward.

 

Can you identify the project’s primary research question? What is it? A series of questions?

 

There is not necessarily a clear question that this project aims to answer.  The project is more of a resource that provides links to more information on various fields of the humanities than a unified set of pages that aims to answer a question or questions on a single topic.  I would say that access to information, specifically primary and secondary sources, is the primary purpose of this project; however, one overarching question that the distribution of these links could lead to is, “How does one gain a deeper knowledge of the humanities?”

 

Describe and evaluate the project’s design and interface. Evaluate the interactivity and modes of navigation of the project.

 

The design and interface of this project seem rather outdated.  The aesthetic is reminiscent of websites from when the project was first created in 1994.  Additionally, the color scheme is bland, and there is no element that draws the user’s eye.  The layout, with a short description of the project on the homepage and a list on the left of the humanities disciplines it covers, is extremely simple.  Additionally, there is a lack of visuals throughout the project, which makes it less interesting for the viewer.  In terms of the interface, there is no advanced form of interactivity.  The only forms of interactivity are the ability to click on the links that lead to websites or articles and, occasionally, small boxes next to categories that one can click to reach “subcategories.”  The mode of navigation is simply clicking links.  I think that this project would greatly benefit from updating its design and interface; however, the information is presented in an extremely clear manner that is easy to navigate, which is arguably the most important feature of a DH project.

 

What technologies does the project employ (both frontend and backend) and how does the scholarship make use of these technologies?

 

Front end technologies are those that create the user interface of a digital humanities project and are critical when judging usability and interactivity. An accessible and comprehensible interface allows a project to convey its argument more effectively to the user. The VoS front end is archaic, muddled, and confusing. The design is a simple one made using HTML, but it fails to provide the user with any direction in their use of the database. The main page lists the different categories of content, and each one takes the user to a page of hyperlinks with related source materials; however, the user must navigate through several links to find something of interest. Overall, the front end of VoS could be improved. One possible improvement is to enhance the descriptions of the hyperlinks, so users know what kind of source they are choosing.

Back end technology consists of the servers and databases responsible for the management of data. The back end used by VoS consists of a SQL Server database, ASP, and VB code that allows for a dynamic website. The programming allows the database’s category pages to be generated dynamically: because the website pulls its information from the database, new links can be added easily, and the site can effectively manage the addition of new resources.
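To make the idea of dynamically generated category pages concrete, here is a minimal Python/SQLite sketch of the general pattern. It is an illustration only, not VoS’s actual ASP/VB code, and the table, columns, and sample data are invented:

```python
import sqlite3

# Hypothetical stand-in for VoS's SQL Server "links" table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE links (category TEXT, title TEXT, url TEXT)")
conn.execute(
    "INSERT INTO links VALUES ('Photography', 'Ansel Adams', 'http://example.org/adams')"
)

def render_category_page(category: str) -> str:
    """Build an HTML category page from whatever rows are currently in the
    database, so newly added links appear without editing any static page --
    the general idea behind dynamically generated pages."""
    rows = conn.execute(
        "SELECT title, url FROM links WHERE category = ? ORDER BY title",
        (category,),
    ).fetchall()
    items = "\n".join(f'  <li><a href="{url}">{title}</a></li>' for title, url in rows)
    return f"<h1>{category}</h1>\n<ul>\n{items}\n</ul>"

print(render_category_page("Photography"))
```

Because the page is assembled from the database at request time, adding a row is all it takes to publish a new resource.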

 

What do you consider to be the successes and failures of the project?

 

I consider the Voice of the Shuttle to be a largely outdated project in the digital humanities. The age of the database shows in its archaic user interface and its lack of properly functioning links. The user guide is essentially obsolete, as only half of its links work. However, if I were judging the project when it was first created, I believe my opinion would differ. As it stands, the site’s functionality is surpassed by a basic Google search; VoS even suggests on its home page that users try Google’s “Glossary” feature as an alternative to its own services.

Setting aside its current flaws, VoS was one of the first databases to provide a centralized collection of humanities resources, helping technology become a research tool for the humanities. The database links to primary sources, secondary sources, and other databases that all aid humanities research. The ability for users to contribute new links allows the collection to span a wide range of resources. However, the site fails to provide a centralized argument and does not point the user in any direction, making it difficult to navigate for new users with no particular subject of interest. I could see how the design could lead some users to endless hours of clicking in a quest for a desirable resource or document. Still, I believe the site is a valuable tool for humanities researchers searching for information on a particular topic.

 

Consider the role of the project director (listed in parentheses). What influence does the project director have on the project’s success (or failure)?

 

The project director of the Voice of the Shuttle is Alan Liu, a member of UC Santa Barbara’s English department. Liu specializes in cultural studies and created VoS as part of his digital humanities research, initially conceiving the project as a means to connect literature and technology. Like any project director, Liu has played an essential role in creating VoS and maintaining the project since its inception in 1994. Liu’s expertise in the humanities allows him to properly review all the data sources, and users can trust his judgment of these sources because of his credibility as a professor. However, Liu has not necessarily made the site easy for others to use. If Liu were to fix the user guide on his site, he could attract more users, and in turn more contributors. Liu’s role in providing credible information supports the project; however, his neglect of functionality detracts from its usability.

 

Consider using your rubric and applying whatever form of evaluation from that assignment that might work best with the project you are examining.

 

My personal rubric judges a project in the digital humanities on the following categories: accessibility, manipulability, credibility, and user experience. I would give VoS positive reviews only in terms of manipulability and credibility. Alan Liu and his team at UC Santa Barbara provide the project with an accomplished and knowledgeable group of content reviewers. The ability for users to suggest new sites, coupled with the backend software that allows for dynamically created web pages, ensures that new data can constantly be added. However, these features can also harm the site, as over-collection of resources has resulted in many broken links. With regard to accessibility, the site contains no direction and is merely a collection of information. If users are new to the site, they will have difficulty using the database, which makes the user experience subpar. Usability could potentially be improved with a properly functioning user guide, but at the moment many of those links are broken. Overall, the failure to manage broken links has hurt VoS, but pending a major cleanup the site retains the ability to attract digital humanities researchers and contribute to the field.

Digital Humanities Project Evaluation

Zach Kleinbaum, Cooper Halpern

 

September 11 Digital Archive

http://911digitalarchive.org

&

Bracero History Archive

http://braceroarchive.org

 

  1. What is the difference between a “website” and a digital humanities project?

A website is a tool for presenting and providing access to information. A project is a meticulously thought-out experiment with a research question that it is designed to answer. Projects employ different methods to collect data and present findings. The results or tools can be posted to a website as a way of presenting the data, but a website is not a necessary component of a digital humanities project.

 

  2. What is the research question in each of the sites above?

The research question driving the 9/11 archive is how to preserve the history of the events of September 11, 2001 in New York, Virginia, and Pennsylvania, and the public reaction to those events. The archive contains emails, first-person accounts, and images, and it was accepted by the Library of Congress, which will ensure its preservation.

The research question of the Bracero History Archive is how to contribute to our understanding of immigration, citizenship, nationalism, agriculture, labor practices, gender, sexuality, family, visual culture, and the Cold War in the context of Mexican migrant workers. The archive examines the Bracero Program, a US government program in place from 1942 to 1964 that allowed Mexican workers to enter the US to fill agricultural job openings. Because of the limited documentation on this topic, the archive was created to preserve the history of these people, especially now that immigration is such a hot-button issue.

 

Personal Rubric:

Accessibility:

9/11 – The homepage of the September 11 archive is user friendly and even presents instructions on how to use the site. It instructs users on how to browse, search, and contribute to the material in the archive and points them toward featured collections by linking those collections from the home page. Many of the first-person accounts are scanned images of handwritten documents and can be harder to read. The previews of files are not accessible without an account, which is a significant barrier.

Bracero – The Bracero homepage is welcoming and clearly presents its research topic. The information is presented via several clearly labeled tabs. However, some of the information is hard to get to. The archive contains several forms of media that need to be downloaded or opened in another window; one example is the interviews with participants in the Bracero Program. Also, some of the interviews are in Spanish without any subtitles or labeling of their language, so much of that material is impossible to understand or navigate without fluency in Spanish. Another media issue is that many pictures lack descriptions, so without significant historical background it is hard to understand the photos’ purpose in the archive. Despite these issues, the media is of high fidelity and is searchable and sortable.

 

Manipulability/Interactivity:

9/11 – In 2011, the project ensured its long-term preservation by relaunching the site with Omeka software. The site is easily manipulable and encourages user interaction: anyone can freely contribute personal accounts of the events via the Contribute tab. Users can search and navigate the materials, but the materials are not presented in a friendly manner, so it is hard to find what you are looking for. Also, the documents and materials themselves are not interactive.

Bracero – The website provides a separate URL for users to harvest the archive’s metadata, making the raw data easily accessible. It also provides video tutorials on how to use the site effectively, including an introduction, how to add to the archive, and how to create a poster, among others. The site also provides resources on how to properly collect and contribute data, and it even suggests possible interview questions to ask potential sources such as farmers or border patrol agents. These features make the site especially manipulable and make its interactive features easier to reach.
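For readers curious what harvesting such a metadata feed might look like in practice, here is a short Python sketch. The endpoint URL, feed format, and element names are assumptions made for illustration, not the archive’s documented interface:

```python
import urllib.request
import xml.etree.ElementTree as ET

# Hypothetical endpoint -- the real harvesting URL is whatever the
# Bracero History Archive documents on its site.
HARVEST_URL = "http://braceroarchive.org/example-metadata-endpoint"

def harvest_titles(url=HARVEST_URL):
    """Fetch an XML metadata feed and return the text of any title-like
    elements, assuming a Dublin Core-style record structure."""
    with urllib.request.urlopen(url) as response:
        tree = ET.parse(response)
    return [el.text for el in tree.iter() if el.tag.endswith("title")]

# Example usage (requires the real endpoint to exist):
# for title in harvest_titles():
#     print(title)
```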

 

Credibility:

9/11 – The site provides collections of submissions, a uniquely open form of documentation commemorating the event, but made by individuals. The credibility of individual submissions is impossible to determine, since they are anonymous. However, seven people are employed to staff the archive, so it seems there must be substantial funding and access to resources. The employees keep the archive current and up to date. Besides the Library of Congress, the site has highly credible partners, including the Smithsonian and the American Red Cross.

Bracero – The site is extremely credible. Under the History tab there is an extensive bibliography listing the sources of its data on Bracero history. The site even advertises an award it won in 2010 for best project from the National Council on Public History. It also boasts many credible partners, including George Mason University, Brown University, and the Smithsonian National Museum of American History. Clearly, the project has been well received by academics.

 

Audience Experience:

9/11 – The homepage of the September 11 archive is user friendly and informative. It provides a unique and interesting collection of data, ranging from documentation of flight safety protocol to interviews with Arab Americans before and after the attacks. However, since many files need to be downloaded before they can be viewed, this limited accessibility negatively affects the user experience with the project.

Bracero – Overall, the project is very informative and easy to use, and it covers a topic that I think few people know much about. The site even has activities that students can access to learn more about the Bracero Program and engage with the material. The site is extremely successful in its content and the delivery of that content.

 

Presner’s Rubric:

Before comparing our rubric to Presner’s, it is worth noting that he posits that to accurately evaluate a project, it must be evaluated in its original, intended medium. Overall, Presner’s rubric includes eight categories for grading digital projects. We have only four, but our categories are a bit broader than his. For example, our category “Credibility” covers who is sourced and how believable those sources are. Presner, on the other hand, uses “Crediting,” “Intellectual Rigor,” and “Peer Review” to get at this same idea of whether the information is viable or not. Though Presner’s categories are more detailed, they may not be as broadly applicable as ours. Evaluating the 9/11 Digital Archive by either of the two latter categories is somewhat pointless, because the archive consists largely of user-submitted accounts of the events surrounding 9/11; it is therefore impossible to confirm accurate accreditation. Also, it is simply a collection of materials, so it is hard to say what the “Intellectual Rigor” of the work is. The materials by themselves are not particularly intellectually challenging, though as a whole they present an impressive collection. Therefore, for this particular project, and possibly others, Presner’s categories are unnecessarily detailed. However, Presner gets at ideas that we do not address, like the cultural impact of the work. This is an important idea which we neglected to mention, but it is also difficult to address without doing outside research. Despite each set of criteria’s flaws, I think it is probably impossible to come up with a perfect set of categories well suited to every digital humanities project; by its nature, a grading rubric like this will always be too wide or too narrow for some projects.

Writing Assignment #2 – Cultural Analytics

Zach Kleinbaum

Writing Assignment #2 – Evaluating a Cultural Analytics Project

When Lev Manovich first coined the term “Cultural Analytics” in 2005, he established a new means of studying the digital humanities. Cultural Analytics attempts to attach quantifiable data to previously unquantifiable aspects of human culture, such as trends in artwork, television, video games, and social media, so that they can be analyzed for patterns and statistics. This cultural data is then displayed in aesthetically pleasing and easily interpretable ways, usually on high-quality LCD screens or other imaging platforms. The goal of Cultural Analytics is to make previously unobtainable cultural data more open to trend analysis through the development of software and visual data representations.

One example of a project in Cultural Analytics is “Making Visible the Invisible” by George Legrady. Every Cultural Analytics project requires a main research question, or problem at hand. Legrady’s work is designed so that both patrons and librarians alike can determine which works are popular at any given moment in time. To accomplish this, Legrady generates data on the books and other forms of media, such as DVDs and VHS tapes, being checked out at the Seattle Central Library. This information can be useful for several reasons, such as evaluating trends in societal preferences or the relationship between current events and what the library’s patrons are reading. The data is then displayed using four different styles of visuals across six LCD screens located above the main librarian information desk in the library’s “Mixing Chamber,” an area dedicated to information retrieval and public research.

Accessibility and interpretability are characteristics used to judge the effectiveness of Cultural Analytics works, and Legrady succeeds in both categories. The screens are easily accessible and centrally located, situated as they are in the library’s main data center, which makes the information relevant for patrons conducting research there. With regard to interpretability, Legrady accumulates data on the materials being checked out of the library based on the Dewey Decimal System, a popular method of categorization used by libraries, as well as keywords found in the text of the respective works and the check-out time. He then uses four styles of visualization to present the data. The first graphic is a count of vital statistics, such as the total number of items checked out, the names of books checked out, and totals of fiction versus nonfiction items. The second is a floating list of titles being checked out, organized chronologically by check-out time and color coded by genre. The third is a dot matrix that organizes the titles by the Dewey classification system. The final visual is a keyword map that connects recently checked-out titles by the presence of shared keywords and shows the distribution of keywords across the Dewey categories of the titles; for example, books may be connected based on the presence of words like “1941” or “winter.” Overall, the collection of all four of these visuals provides the viewer with an assortment of useful data that benefits their visit to the library.
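To give a sense of the kind of aggregation behind these displays, here is a hedged Python sketch, with invented sample records rather than Legrady’s actual data pipeline, that tallies checkouts by top-level Dewey class, roughly the summary the vital-statistics and dot-matrix views present:

```python
from collections import Counter

# Invented sample checkout records: (title, dewey_number, checkout_hour).
checkouts = [
    ("Snow Country", 895, 13),
    ("Winter 1941", 940, 13),
    ("Intro to Photography", 770, 14),
    ("Cooking Basics", 641, 14),
]

# Dewey top-level classes are determined by the hundreds digit.
DEWEY_CLASSES = {
    6: "Technology",
    7: "Arts & Recreation",
    8: "Literature",
    9: "History & Geography",
}

def tally_by_dewey_class(records):
    """Count checkouts per top-level Dewey class -- the kind of running
    summary a vital-statistics or dot-matrix display could draw on."""
    counts = Counter()
    for _title, dewey, _hour in records:
        counts[DEWEY_CLASSES.get(dewey // 100, "Other")] += 1
    return counts

print(tally_by_dewey_class(checkouts))
# Counter({'Literature': 1, 'History & Geography': 1, 'Arts & Recreation': 1, 'Technology': 1})
```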

Overall, I think George Legrady’s project is an exemplary representation of Cultural Analytics and how it can benefit society. Legrady’s work at the library helps patrons make informed decisions based on the preferences and trends of their peers. The success of the project can be quantified by its lifespan: it ran in the library for nearly nine years, from 2005 until 2014. Longevity is another characteristic of successful projects, along with accessibility and interpretability.

Works Cited:

Forbes, Angus. “Cultural Analytics.” http://angusforbes.com/blog/cultural-analytics/

Legrady, George. “Making Visible the Invisible, 2005 – 2014.” http://georgelegrady.com

Manovich, Lev. “How and why study big cultural data.” http://lab.softwarestudies.com/2008/09/cultural-analytics.html

What is Digital Humanities?

Zach Kleinbaum

2/6/17

Intro to Digital Humanities

Writing Assignment #1

 

What is Digital Humanities?

Digital Humanities is an interdisciplinary academic field of study encompassing aspects of both Humanities Computing and the traditional humanities. Humanities Computing is a field focused on software development meant to utilize the capabilities of computers for the analysis of traditional humanities resources, like manuscripts and artworks. John Unsworth’s definition of the field as “new and continuing investments of personal, professional, institutional, and cultural resources” contributes to the notion that Humanities Computing is the creation of assets for use by traditional humanists, but it lacks its own substance, considering that the field would not exist without people to utilize its products. [1] Humanities Computing is a subcategory of the traditional humanities because it is centered on the creation of tools and not their employment. Humanities Computing is an asset for traditional humanists, who use the new tools in their work, but it cannot qualify as its own individual field of study.

Digital Humanities builds upon the foundation established by Humanities Computing by utilizing modern technology as a means of research in the humanities. However, Digital Humanities seeks to bind the two fields by posing traditional humanities research questions and approaching them with new modes of engagement, allowing for the discovery of unseen trends and patterns in large data sets. Digital Humanities goes beyond software development; its main goal is to create “‘good’ data” meant to “undergird new searches for patterns, visualization, and algorithmic analysis.”[2] Using methods like text analysis, digital mapping, archiving, and databasing, digital humanists develop quantitative and statistical accounts of traditional resources, revealing trends and patterns that previously went unnoticed.

An example of Digital Humanities at work is The Newton Project, an archive created to provide internet users free access to transcribed versions of Newton’s original manuscripts. The goal of the archive is to provide users with access to previously unreleased documents, allowing them to contribute to the ever-growing body of interpretations of Newton’s work.[3] The Newton Project exemplifies a key aspect of the Digital Humanities: collaboration. Projects are constantly changing due to user collaboration; therefore, Digital Humanities stresses the process of answering a specific research question and “the questions raised by such algorithmic thinking” over the status of the final product, which is constantly in limbo.[4] The main factor distinguishing Digital Humanities from Humanities Computing is the presence of a research question, or an objective for each project to expand public knowledge of specific subject matter. Digital Humanities is not just focused on technology, but on creating an intricate understanding of human culture and society by processing, organizing, and displaying data in innovative ways. Digital Humanities changes the way history is studied, and that is what separates it from Humanities Computing and qualifies it as its own field.

[1] Unsworth, John. “What is Humanities Computing and what is not?” Annual Review of Computer Philology 4 (2002).

[2] Kramer, Michael. “What Does Digital Humanities bring to the Table?,” Michaeljkramer.net (blog), September 23, 2012, http://www.michaeljkramer.net/cr/what-does-digital-humanities-bring-to-the-table/.

[3] Popova, Maria. “Digital Humanities Spotlight: 7 Important Digitization Projects,” http://www.brainpickings.org

[4] McCarty, Willard. “Humanities Computing,” http://www.mccarty.org.uk/essays/McCarty,%20Humanities%20computing.pdf

Lab #1 Cultural Analytics

Zach Kleinbaum, Cooper Halpern

  1. What kinds of patterns are being examined and how are they being measured in the projects found at the Stanford Literary Lab?

The projects at the Stanford Literary Lab are designed to analyze literary trends. These projects track patterns and relationships among textual and thematic elements of texts in order to tell us more about a genre or the time period in which a piece was written. They all attempt to place the texts they examine within a defined, larger picture by elucidating the patterns that connect the texts to the research question. For example, one project tracks the relationship between the race of fictional characters and how they are perceived in the literature. The patterns are measured by tracking the descriptive terms associated with particular characters and then connecting those adjectives to each character’s stated ethnicity. This project allows for an observation of the way race was viewed at a specific point in time, based on when the book was written. A second project looks at how the gender of the author influences the way genders are depicted in literature, based on the dialogue of the characters in the book. This too is tracked by analyzing the text and observing the words used by characters of each gender. Such projects can help us gain deeper insight into the role of gender in the development of the novel.
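As a hedged illustration of the measurement idea rather than the Literary Lab’s actual method, a Python sketch might tally the descriptive words that co-occur in the same sentence as a character’s name, as a crude proxy for how the text portrays that character; the passage and descriptor list below are invented:

```python
import re
from collections import Counter

# Invented sample passage and descriptor list, for illustration only.
text = ("Ahab was grim and relentless. The crew thought Ahab noble. "
        "Starbuck stayed cautious and quiet.")
descriptors = {"grim", "relentless", "noble", "cautious", "quiet"}

def describe(character, passage):
    """Count descriptor words appearing in the same sentence as the
    character's name -- a crude proxy for how the text portrays them."""
    counts = Counter()
    for sentence in re.split(r"[.!?]", passage):
        if character in sentence:
            counts.update(word for word in re.findall(r"[a-z]+", sentence.lower())
                          if word in descriptors)
    return counts

print(describe("Ahab", text))      # Counter({'grim': 1, 'relentless': 1, 'noble': 1})
print(describe("Starbuck", text))  # Counter({'cautious': 1, 'quiet': 1})
```

A real project would work at the scale of whole corpora with more sophisticated linguistic tooling, but the underlying move, turning descriptive language into countable data tied to characters, is the same.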

  2. Review the visualizations listed below. What makes these visualizations successful? http://www.visualisingdata.com/2015/01/new-visual-package-chicago-planning-agency/

This visualization depicts the past, present, and future plans for Chicago’s transit, roads, and freight. It successfully takes a multimedia approach, smoothly blending text, video, images, and interactivity to create an engaging, useful, and informative demonstration of the information and a tool for future planning. However, it would have been more readable for a person unfamiliar with the geography of Chicago if it had included more literal signposting, such as names of neighborhoods, main streets, and geographic landmarks.

http://www.visualisingdata.com/index.php/2015/01/make-grey-best-friend/

These visualizations serve as an argument for the use of grey as the main color for presentations. The argument follows that grey as a main color allows for sharp contrast in certain colorful areas, drawing the eye to specific, more important data. Also, the author provides images of presentations which employ this method of heavily featuring grey which helps illustrate (literally) their point.

http://www.hyperhistory.com/online_n2/History_n2/a.html

This visualization is an expansive historical database covering topics grouped into two sets of four categories: science, culture, religion, and politics; and people, history, events, and maps. Users can view timelines of either important figures’ lives or events in a particular category. The site color codes the timelines and provides the user with short summaries when a specific piece of data is clicked. Though each timeline is useful for getting a sense of a specific time, it is hard to understand the big picture across time periods. Also, the site itself is abrasively Web 1.0 in its clunky design and functionality. The most difficult part of the site is that the same information can be accessed through the two different sets of categories; since the categories are so loosely defined, it is not always obvious where the categorical lines are drawn or which chart you are looking at.

http://neoformix.com/2013/NovelViews.html

This visualization attempts to analyze Les Misérables by connecting characters to various words in the text and assessing the frequency and proximity of those words to one another. Though the graphics are consistently aesthetically pleasing, and some are quite simple (like the word clouds), many are almost impossible to understand (like the radial word connections). Also, the descriptions that accompany the graphics are often so opaque and confusing that they lend no clarity to the complex illustrations.

How would you measure their success?  If you had to develop a list of features that make these visualizations successful, what might those include?

I would measure the success of these projects by how clearly and engagingly they portray the information they attempt to explain. Though no single one of these alone would make a project successful, the following features would provide a rubric for grading a visualization: clarity, accessibility, interactivity, efficiency of exposition, and depth of analysis.

  3. Go to DiRT (Digital Research Tools) and choose one (1) tool listed under “Analyze Data” and one tool listed under “Visualize Data.” How might these tools be useful in analyzing large amounts of data?

Under “Analyze Data,” we chose “TimeFlow,” an open-source timeline tool that allows journalists to analyze temporal data. It offers several modes for viewing data by time, including calendar, timeline, list, and table views, and it also allows you to import your own data. TimeFlow might be useful in analyzing large amounts of data because it does a good job of organizing data by time, so if I were working on a project with a lot of data spread over a long period, it would be a helpful organizational tool. Under “Visualize Data,” I chose “Tableau Public,” which is not open source but is free and allows anyone to publish their data visualizations. This would be useful for a personal project on which I could not or did not want to spend my own money. It is also a relatively accessible and usable tool that would help someone without a coding background visualize a large amount of data, letting users build convenient pivot tables or dashboards to create an interactive assortment of data.