Old Periodicals, a New Datatype and Spiderfied Query Results in Wikidata

Introduction: This blog post describes how the National Library of Wales makes use of Wikidata to enrich its collections. It especially showcases new features for visualizing items on a map, including a clustering service and support for polygons and multipolygons. It also shows how polygons such as building shapes can be imported from OpenStreetMap into Wikidata, a great example of reusing existing information.
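The "spiderfied" results mentioned above come from clustering overlapping map markers before spreading them apart. A minimal sketch of the underlying idea, grid-based clustering of coordinate results, is shown below; the function name, cell size, and sample coordinates are illustrative assumptions, not the Wikidata Query Service implementation.

```python
# Illustrative sketch: grid-based clustering of (lat, lon) query results,
# similar in spirit to the marker clustering a map view applies before
# "spiderfying" overlapping points. NOT the actual Wikidata implementation.

from collections import defaultdict

def cluster_points(points, cell_size=1.0):
    """Group (lat, lon) pairs into grid cells; each cell becomes one cluster."""
    cells = defaultdict(list)
    for lat, lon in points:
        key = (int(lat // cell_size), int(lon // cell_size))
        cells[key].append((lat, lon))
    # Represent each cluster by its centroid and its member count.
    clusters = []
    for members in cells.values():
        lat_c = sum(p[0] for p in members) / len(members)
        lon_c = sum(p[1] for p in members) / len(members)
        clusters.append({"center": (lat_c, lon_c), "count": len(members)})
    return clusters

# Example: three nearby points (around Aberystwyth) and one distant point
# (London) yield two clusters; the dense one would be spiderfied on zoom.
points = [(52.41, -4.08), (52.42, -4.07), (52.40, -4.09), (51.50, -0.12)]
clusters = cluster_points(points, cell_size=0.5)
```

A real map widget would recompute clusters per zoom level (smaller cells as the user zooms in) and fan out, or "spiderfy", the members of a cluster only when they still overlap at maximum zoom.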

Attributing Authorship in the Noisy Digitized Correspondence of Jacob and Wilhelm Grimm | Digital Humanities

Introduction: Beyond its encouraging conclusion that authorship attribution methods are fairly robust to the noise (transcription errors) introduced by optical character recognition and handwritten text recognition, this article also offers a comprehensive account of applying sophisticated computational techniques to testing and validation in a data curation process.
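The robustness claim can be illustrated with a toy experiment: build per-author character-trigram profiles, inject random character substitutions to simulate OCR errors, and check that nearest-profile attribution still succeeds. Everything below (trigram features, cosine similarity, the noise model) is a generic stand-in, not the specific methods the article evaluates.

```python
# Toy experiment: nearest-profile authorship attribution under simulated
# OCR noise. Character trigrams + cosine similarity are a generic stand-in
# for the attribution methods tested in the article.

import random
from collections import Counter
from math import sqrt

def trigram_profile(text):
    """Character-trigram frequency profile of a text."""
    text = text.lower()
    return Counter(text[i:i + 3] for i in range(len(text) - 2))

def cosine(p, q):
    """Cosine similarity between two trigram Counters."""
    dot = sum(p[g] * q[g] for g in set(p) & set(q))
    norm = sqrt(sum(v * v for v in p.values())) * sqrt(sum(v * v for v in q.values()))
    return dot / norm

def attribute(text, profiles):
    """Return the candidate author whose profile is most similar."""
    probe = trigram_profile(text)
    return max(profiles, key=lambda author: cosine(probe, profiles[author]))

def add_noise(text, rate, rng):
    """Randomly substitute characters to simulate transcription errors."""
    chars = list(text)
    for i in range(len(chars)):
        if rng.random() < rate:
            chars[i] = rng.choice("abcdefghijklmnopqrstuvwxyz ")
    return "".join(chars)

# Hypothetical corpora for two "authors" with distinct vocabulary.
profiles = {
    "A": trigram_profile("the quick brown fox jumps over the lazy dog " * 20),
    "B": trigram_profile("colourless green ideas sleep furiously tonight " * 20),
}
noisy_sample = add_noise("the quick brown fox jumps over the lazy dog",
                         rate=0.1, rng=random.Random(0))
```

Even with roughly 10% of characters corrupted, enough intact trigrams survive for the sample to remain closest to profile A, which is the intuition behind the article's robustness finding.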

Know Your Implementation: Subgraphs in Literary Networks

Introduction: Know Your Implementation: Subgraphs in Literary Networks shows how the online tool ezlinavis can detect detached subgraphs when applying network analysis to literary texts. As a case study, Goethe’s Faust, Part One (1808) was analyzed and visualized with ezlinavis, and average distances were calculated, yielding new results on Faust’s role as protagonist.
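Detached subgraphs are simply the connected components of the character network, and the average distance is the mean shortest-path length within a component. A minimal sketch of both computations is below; the toy edge list is invented for illustration and is not the actual Faust, Part One network, nor ezlinavis's implementation.

```python
# Sketch of what a tool like ezlinavis computes: find detached subgraphs
# (connected components) of a character co-occurrence network, then the
# average shortest-path distance within a component. Toy data, illustrative only.

from collections import deque

def components(edges):
    """Connected components of an undirected graph given as edge pairs."""
    adj = {}
    for a, b in edges:
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    seen, comps = set(), []
    for node in adj:
        if node in seen:
            continue
        comp, queue = set(), deque([node])
        while queue:
            n = queue.popleft()
            if n in comp:
                continue
            comp.add(n)
            queue.extend(adj[n] - comp)
        seen |= comp
        comps.append(comp)
    return comps

def average_distance(edges, nodes):
    """Mean BFS shortest-path length over node pairs within one component."""
    adj = {}
    for a, b in edges:
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    nodes = list(nodes)
    total = pairs = 0
    for src in nodes:
        dist, queue = {src: 0}, deque([src])
        while queue:
            n = queue.popleft()
            for m in adj.get(n, ()):
                if m not in dist:
                    dist[m] = dist[n] + 1
                    queue.append(m)
        for dst in nodes:
            if dst != src and dst in dist:
                total += dist[dst]
                pairs += 1
    return total / pairs

# Invented toy network: a main cluster plus one detached pair.
edges = [("Faust", "Mephistopheles"), ("Faust", "Gretchen"),
         ("Mephistopheles", "Gretchen"), ("Faust", "Wagner"),
         ("Frosch", "Brander")]
```

Running `components(edges)` on this toy graph yields two subgraphs, and a character like Faust lowers the average distance of the main component because he links otherwise unconnected figures, which is the kind of protagonist-centrality observation the article draws.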

Zur Epistemologie digitaler Methoden in den Geisteswissenschaften (On the Epistemology of Digital Methods in the Humanities)

Introduction: What is the precise impact of digital humanities on the humanities in general? That this influence exists seems a given, but how the digital humanities affect humanities methodology and epistemology is still an open question. This article delves deeper into this epistemological problem and presents a model of five ‘polarities’ along which these influences can be positioned.

Towards a Computational Literary Science

Introduction: This article introduces a novel way to unfold and discover patterns in complex texts, at the intersection of macro and micro analytics. The technique, called Transcendental Information Cascades (TIC), allows analysis of how a cast of characters is generated and managed dynamically over the duration of a text.
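One rough reading of the cascade idea can be sketched as follows: scan text units in order, record which characters each unit mentions, and link every mention back to the previous unit that mentioned the same character, producing a graph of how the active cast evolves over the text. This is a loose illustration of the concept under my own simplifying assumptions (string matching on a fixed name list, sentence-level units), not the authors' TIC implementation.

```python
# Rough sketch of an information-cascade construction over a text:
# each character mention is linked to that character's previous mention,
# so the resulting edges trace how the cast is sustained across the text.
# Simplified illustration only; not the article's actual TIC method.

def build_cascade(units, characters):
    """units: ordered text segments; characters: names to track.
    Returns edges (prev_unit_index, unit_index, character)."""
    last_seen = {}
    edges = []
    for i, unit in enumerate(units):
        lowered = unit.lower()
        for name in characters:
            if name.lower() in lowered:
                if name in last_seen:
                    edges.append((last_seen[name], i, name))
                last_seen[name] = i
    return edges

# Tiny invented example: four "scenes" and three tracked characters.
units = ["Faust broods.", "Mephistopheles tempts Faust.",
         "Gretchen prays.", "Faust courts Gretchen."]
cascade = build_cascade(units, ["Faust", "Mephistopheles", "Gretchen"])
```

Gaps in a character's chain of edges show where they drop out of the narrative, and units where several chains converge mark scenes that bring the cast together, the kind of macro-from-micro pattern the article is after.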