Closing the Gap in Non-Latin-Script Data: A tool for building and navigating collections of DH research projects

The Closing the Gap in Non-Latin-Script Data project aims to map the field of digital humanities projects outside and beyond the Anglosphere, with a particular focus on non-Latin scripts such as Arabic or Chinese, in both machine-actionable and human-readable form. The urgency and value of such a survey have been highlighted in recent discussions around global, decolonial, and multilingual digital humanities.
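
To make the idea of a machine-actionable collection concrete, here is a minimal sketch that filters a tabular export of project records by script. The file name and column names ("title", "script", "country") are invented for illustration and are not taken from the project's actual data model.

```python
import csv

# Hypothetical sketch only: "dh_projects.csv" and its columns are assumptions,
# not the project's real export format.
with open("dh_projects.csv", newline="", encoding="utf-8") as f:
    projects = list(csv.DictReader(f))

# Filter the collection for projects working with Arabic-script material.
arabic_projects = [p for p in projects if p.get("script") == "Arabic"]
for p in arabic_projects:
    print(p["title"], "-", p.get("country", "unknown"))
```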

Tools for Critical Discourse Analysis – an introduction to tool criticism

In this video, Drs. Stephanie Vie and Jennifer deWinter explain some of the tools digital humanists can use for critical discourse analysis and for visualizing data collected from social media platforms. Although not all the tools they mention are open source, most of them have free-to-use or freemium versions, including AntConc, a free concordancing tool, and several Twitter data visualization tools such as Tweepsmap or Tweetstats.
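
As a rough illustration of what a concordancer such as AntConc produces, the sketch below builds a simple keyword-in-context (KWIC) listing with the Python standard library. It is not AntConc's own code, and the sample sentence is invented.

```python
import re

def kwic(text: str, keyword: str, width: int = 4):
    """Return keyword-in-context lines with `width` tokens on each side."""
    tokens = re.findall(r"\w+", text.lower())
    lines = []
    for i, tok in enumerate(tokens):
        if tok == keyword.lower():
            left = " ".join(tokens[max(0, i - width):i])
            right = " ".join(tokens[i + 1:i + 1 + width])
            lines.append(f"{left:>30} | {tok} | {right}")
    return lines

sample = "Tool criticism asks how a tool shapes the results it produces; no tool is neutral."
for line in kwic(sample, "tool"):
    print(line)
```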

Even though the video does not offer equally good open source alternatives to Atlas.ti or MAXQDA (an obvious shortcoming that is recurrently discussed on OpenMethods), it sets an excellent example of how to introduce tool criticism in the classroom alongside an introduction to specific Digital Humanities tools. After briefly touching on the advantages and disadvantages of each tool, they encourage their audience (students in Digital Humanities study programs) to pilot each of them on the same dataset and not only compare their results but also reflect on the epistemic processes in between.

Sharing the video on Humanities Commons with stable archiving, a DOI, and rich metadata is among the best things that could happen to teaching resources of all kinds.

Approaching Linked Data

Introduction: Linked Data and Linked Open Data are attracting increasing interest and application in many fields. An experiment conducted in 2018 at Furman University illustrates and discusses, from a pedagogical perspective, some of the challenges posed by Linked Open Data applied to research in the historical domain.

“Linked Open Data to navigate the Past: using Peripleo in class” by Chiara Palladino describes the use of the search engine Peripleo to reconstruct the past of four archaeologically relevant cities. Many databases, comprising various types of information, were consulted, and the results, as highlighted in Palladino's contribution, show both the advantages and the limitations of a Linked Open Data-oriented approach to historical investigation.
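
The workflow described in the post relies on Peripleo's own interface, but the underlying Linked Open Data principle, dereferenceable URIs that return machine-readable descriptions, can be sketched in a few lines. The example below fetches the RDF description of a Wikidata item (Q220, Rome) with rdflib; it is a generic illustration, not the procedure used in the class.

```python
from rdflib import Graph

# Dereference a Linked Open Data URI: Wikidata exposes RDF for every item
# via Special:EntityData. Q220 is the item for Rome.
g = Graph()
g.parse("https://www.wikidata.org/wiki/Special:EntityData/Q220.ttl", format="turtle")

print(f"Loaded {len(g)} triples about Rome")
# Print a handful of triples to show what a client like Peripleo can build on.
for subj, pred, obj in list(g)[:5]:
    print(subj, pred, obj)
```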

Old Periodicals, a New Datatype and Spiderfied Query Results in Wikidata

Introduction: This blog post describes how the National Library of Wales makes use of Wikidata to enrich its collections. It showcases new features for visualizing items on a map, including a clustering service and support for polygons and multipolygons. It also shows how polygons such as the shapes of buildings can be imported from OpenStreetMap into Wikidata, which is a great example of re-using already existing information.
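
To give a flavour of the geospatial queries behind such map visualizations, here is a hedged sketch that retrieves geocoded items from the Wikidata Query Service. The items and properties chosen (Q7075 for library, Q25 for Wales, P31 for instance of, P131 for located in, P625 for coordinate location) are illustrative and not the Library's own queries.

```python
import requests

# Ask the Wikidata Query Service for libraries in Wales that have coordinates.
query = """
SELECT ?item ?itemLabel ?coords WHERE {
  ?item wdt:P31 wd:Q7075 ;          # instance of: library
        wdt:P625 ?coords ;          # coordinate location
        wdt:P131* wd:Q25 .          # located (transitively) in Wales
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
LIMIT 20
"""

resp = requests.get(
    "https://query.wikidata.org/sparql",
    params={"query": query, "format": "json"},
    headers={"User-Agent": "lod-map-example/0.1"},
)
resp.raise_for_status()
for row in resp.json()["results"]["bindings"]:
    print(row["itemLabel"]["value"], row["coords"]["value"])
```

Pasting the same query into the Wikidata Query Service web interface with a `#defaultView:Map` comment at the top renders the results directly on a map, which is the kind of clustered map view the blog post discusses.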