‘Voyant Tools’

Introduction: Digital humanists looking for tools to visualize and analyze texts can rely on ‘Voyant Tools’ (https://voyant-tools.org), a software package created by Stéfan Sinclair and Geoffrey Rockwell. Online resources are available for learning how to use Voyant. In this post, we highlight two of them: “Using Voyant-Tools to Formulate Research Questions for Textual Data” by Filipa Calado (GC Digital Fellows) and the tutorial “Investigating texts with Voyant” by Miriam Posner.
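
A quick way to try Voyant without installing anything is to point the hosted instance at a text that is already online. The short sketch below is a minimal illustration, assuming the "input" URL parameter accepted by the hosted instance at voyant-tools.org and using a Project Gutenberg file as an example; any publicly reachable plain-text URL should work the same way.

import urllib.parse
import webbrowser

# Publicly reachable plain-text file to analyze (example URL only).
text_url = "https://www.gutenberg.org/files/1342/1342-0.txt"

# Assumption: the hosted Voyant instance accepts an "input" query parameter
# pointing to a source text URL and builds the corpus on the fly.
voyant_url = "https://voyant-tools.org/?" + urllib.parse.urlencode({"input": text_url})

print(voyant_url)
webbrowser.open(voyant_url)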

Evaluating named entity recognition tools for extracting social networks from novels

Introduction: Named Entity Recognition (NER) is used to identify textual elements that name things, such as people, places, and organizations. In this study, four different NER tools are evaluated on a corpus of modern and classic fantasy and science fiction novels. Since most NER tools were developed for the news domain, it is interesting to see how they perform in a totally different domain. The article comes with a very detailed methodological part, and the accompanying dataset is also made available.
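
To give a flavour of the kind of pipeline the study evaluates, the sketch below runs an off-the-shelf NER model over a short snippet of prose and collects the person names that could become nodes in a social network. It is a minimal illustration using spaCy's small English model, not the tools or corpus used in the article.

from collections import Counter

import spacy

# Off-the-shelf English pipeline; install with: python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

text = (
    "Frodo and Sam left the Shire at dawn. "
    "Gandalf had already ridden ahead to Bree to wait for them."
)

doc = nlp(text)

# Keep only PERSON entities; in a novel these would become the nodes
# of a character network, with co-occurrence providing the edges.
persons = Counter(ent.text for ent in doc.ents if ent.label_ == "PERSON")
print(persons.most_common())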

The Uncanny Valley and the Ghost in the Machine

Introduction: There is a postulated level of anthropomorphism at which people begin to find the appearance of a robot uncanny. But what happens when digital facsimiles and online editions become nigh indistinguishable from the real thing, while remaining materially so vastly different? How do we ethically provide access to the digital object without creating a blind spot and neglect for the real thing? It is a question that keeps digital librarian Dot Porter awake and which she ponders in this thoughtful contribution.

Old Periodicals, a New Datatype and Spiderfied Query Results in Wikidata

Introduction: This blog post describes how the National Library of Wales makes use of Wikidata to enrich its collections. It especially showcases new features for visualizing items on a map, including a clustering service and support for polygons and multipolygons. It also shows how polygons such as the shapes of buildings can be imported from OpenStreetMap into Wikidata, a great example of re-using existing information.
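
Readers who want to experiment with the kind of geographic data behind these map views can query the Wikidata Query Service directly. The sketch below is an illustrative example only: it asks for libraries located in Wales that have a coordinate location, which is merely an assumed stand-in for the National Library of Wales's own collection items discussed in the post.

import requests

ENDPOINT = "https://query.wikidata.org/sparql"

# Illustrative query: libraries (Q7075) located in Wales (Q25) with a
# coordinate location (P625). Map and clustering views are built on
# exactly this kind of coordinate data.
QUERY = """
SELECT ?item ?itemLabel ?coord WHERE {
  ?item wdt:P31 wd:Q7075 ;
        wdt:P131* wd:Q25 ;
        wdt:P625 ?coord .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
LIMIT 20
"""

response = requests.get(
    ENDPOINT,
    params={"query": QUERY, "format": "json"},
    headers={"User-Agent": "wikidata-map-demo/0.1"},
)
response.raise_for_status()

for row in response.json()["results"]["bindings"]:
    print(row["itemLabel"]["value"], "->", row["coord"]["value"])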

Attributing Authorship in the Noisy Digitized Correspondence of Jacob and Wilhelm Grimm

Introduction: Apart from its encouraging conclusion that authorship attribution methods are rather robust to the noise (transcription errors) introduced by optical character recognition and handwritten text recognition, this article also offers a comprehensive read on the application of sophisticated computational techniques for testing and validation in a data curation process.

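
The core experimental idea can be imitated in a few lines: train a simple attribution model on clean text, then test it on copies with artificially injected character noise that mimics OCR or HTR errors. The sketch below uses character n-grams and logistic regression from scikit-learn on made-up toy samples as stand-ins; it is not the article's actual feature set, classifier, or data.

import random
import string

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data: a few clean samples per "author" (placeholder sentences).
texts = [
    "The brothers collected tales from oral tradition and edited them carefully.",
    "He wrote long letters about grammar, dictionaries and the history of words.",
    "She described the manuscripts she had copied during her travels abroad.",
    "Her notes on provenance and binding fill several small notebooks.",
]
authors = ["author_a", "author_a", "author_b", "author_b"]

def add_noise(text, rate=0.05, seed=0):
    """Randomly substitute characters to mimic OCR/HTR transcription errors."""
    rng = random.Random(seed)
    chars = list(text)
    for i in range(len(chars)):
        if chars[i].isalpha() and rng.random() < rate:
            chars[i] = rng.choice(string.ascii_lowercase)
    return "".join(chars)

# Character n-gram features are a common choice because they degrade
# gracefully when individual characters are misrecognized.
model = make_pipeline(
    TfidfVectorizer(analyzer="char", ngram_range=(2, 4)),
    LogisticRegression(max_iter=1000),
)
model.fit(texts, authors)

noisy_test = add_noise("He wrote again about dictionaries and the history of words.")
print(noisy_test)
print(model.predict([noisy_test]))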
Know Your Implementation: Subgraphs in Literary Networks

Introduction: “Know Your Implementation: Subgraphs in Literary Networks” shows how the online tool ezlinavis can account for detached subgraphs when doing network analysis of literary texts. For this specific case, Goethe’s Faust, Part One (1808) was analyzed and visualized with ezlinavis, and average distances were calculated, yielding some new results concerning the role of Faust as protagonist.
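
The notions of detached subgraphs and average distances can be reproduced with any network library. The sketch below builds a tiny, made-up character co-occurrence network with networkx, lists its connected components (the detached subgraphs) and computes the average shortest path length within each one; the edge list is purely illustrative and not the Faust data behind the article.

import networkx as nx

# Made-up co-occurrence edges between characters (illustrative only).
edges = [
    ("Faust", "Mephistopheles"),
    ("Faust", "Gretchen"),
    ("Mephistopheles", "Wagner"),
    ("Gretchen", "Marthe"),
    ("Witch", "Cat"),  # a detached subgraph, not connected to the rest
]

G = nx.Graph(edges)

# Detached subgraphs are the connected components of the network.
for component in nx.connected_components(G):
    sub = G.subgraph(component)
    # Average shortest path length is only defined within a connected component,
    # which is why detached subgraphs have to be handled explicitly.
    avg = nx.average_shortest_path_length(sub) if sub.number_of_nodes() > 1 else 0.0
    print(sorted(component), "average distance:", round(avg, 2))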

Zur Epistemologie digitaler Methoden in den Geisteswissenschaften (On the Epistemology of Digital Methods in the Humanities)

Introduction: What is the precise impact of the digital humanities on the humanities in general? That this influence exists seems a given, but how the digital humanities affect humanities methodology and epistemology is still an open question. This article delves deeper into this problem of epistemology and presents a model of five ‘polarities’ along which these influences can be positioned.