OpenMethods

HIGHLIGHTING DIGITAL HUMANITIES METHODS AND TOOLS
Category: Metadata

Navigating the centuries with the ‘Mapping of the Republic of Letters’ project.
  • Analysis

  • Posted on September 19, 2022
  • by Marinella Testori

In this post, we reach back in time to showcase an older project and highlight its impact on data visualization in Digital Humanities, as well as its good practices in making different layers of scholarship available for increased transparency and reusability.

Developed at Stanford with other research partners (‘Cultures of Knowledge’ at Oxford, the Groupe d’Alembert at CNRS, the KKCC-Circulation of Knowledge and Learned Practices in the 17th-century Dutch Republic, and the DensityDesign ResearchLab), the ‘Mapping of the Republic of Letters’ project aimed to digitize and visualize the intellectual community, active from the sixteenth to the eighteenth century, known as the ‘Republic of Letters’ (an overview of the concept can be found in Bots and Waquet, 1997), in order to get a better sense of its shape, size and associated intellectual network, with its inherent complexities and boundaries.

Below we highlight the different, interrelated layers of making project outputs available and reusable in the long term (way before FAIR data became a widespread policy imperative!): methodological reflections, interactive visualizations, the associated data and its data model schema. All of these layers are published in a trusted repository and are interlinked with each other via their Persistent Identifiers.
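
To make the PID-based interlinking concrete, here is a minimal sketch (ours, not the project's code) of how such links can be followed programmatically, assuming the layers are registered as DOIs with DataCite metadata; the DOI in the example is a placeholder, not the project's actual identifier.

```python
# Minimal sketch: follow the relatedIdentifiers of a DOI-registered deposit
# to discover the other interlinked layers (data, schema, visualization, paper).
# Assumes DataCite registration; the DOI below is a placeholder, not the
# project's actual identifier.
import requests

def related_layers(doi: str) -> list[dict]:
    """Return the relatedIdentifiers recorded in the DOI's DataCite metadata."""
    resp = requests.get(f"https://api.datacite.org/dois/{doi}", timeout=30)
    resp.raise_for_status()
    attributes = resp.json()["data"]["attributes"]
    return attributes.get("relatedIdentifiers", [])

if __name__ == "__main__":
    for rel in related_layers("10.1234/example-doi"):  # hypothetical DOI
        print(rel.get("relationType"),
              rel.get("relatedIdentifierType"),
              rel.get("relatedIdentifier"))
```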

OpenMethods Spotlights #4 Improving access to Asian cultural heritage and enabling new ways to connect and study them: a podcast with Alíz Horváth and Shih-Pei Chen
  • Analysis

  • Posted on August 8, 2022
  • by Erzsebet Tóth-Czifra

The conversation below is a special summer episode of our Spotlight series. It is a collaboration between OpenMethods and the Humanista podcast, and this time it comes as a podcast, in which Alíz Horváth, owner of the Humanista podcast series and proud Editorial Team member of OpenMethods, asks Shih-Pei Chen, scholar and Digital Content Curator at the Max Planck Institute for the History of Science, about the text analysis tools LoGaRT, RISE and SHINE; non-Latin-script Digital Humanities; why local gazetteers are goldmines for Asian Studies; how digitization changes and broadens the kinds of research questions one can study; where the challenges lie in access to cultural heritage and in liaising with proprietary infrastructure providers… and much more! Enjoy!

An end-to-end approach for extracting and segmenting high-variance references from pdf documents
  • Analysis

  • Posted on November 9, 2020
  • by Stefan Karcher

Introduction: Digital text analysis depends on one important thing: text that can be processed with little effort. Working with PDFs often leads to great difficulties, as Zeyd Boukhers, Shriharsh Ambhore and Steffen Staab describe in their paper. Their goal is to extract references from PDF documents. A highlight of the workflow they describe is its very impressive precision rates. The paper thereby encourages further development of the process and its application as a “method” in the humanities.
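
For readers who want a feel for the problem, here is a deliberately naive baseline sketch (not the authors' learned, end-to-end pipeline): extract raw text with pdfminer.six and split a references section on one common numbering style. The file name and the regular expressions are assumptions for illustration; high-variance reference formats are exactly what defeats heuristics like this.

```python
# Naive baseline for pulling references out of a PDF, for illustration only;
# the paper's approach is an end-to-end learned pipeline, not this heuristic.
# Requires pdfminer.six; "paper.pdf" is a placeholder file name.
import re
from pdfminer.high_level import extract_text

def naive_references(pdf_path: str) -> list[str]:
    text = extract_text(pdf_path)
    # Keep only the part after the last "References" heading, if present.
    parts = re.split(r"\n\s*References\s*\n", text, flags=re.IGNORECASE)
    refs_block = parts[-1]
    # Split on entries that start with "[1]", "[2]", ... (one common style;
    # variant styles are exactly where this heuristic breaks down).
    entries = re.split(r"\n(?=\[\d+\]\s)", refs_block)
    return [" ".join(e.split()) for e in entries if e.strip()]

if __name__ == "__main__":
    for ref in naive_references("paper.pdf"):
        print(ref)
```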

OpenMethods Spotlights #1: Interview with Hilde De Weerdt about MARKUS
  • Analysis

  • Posted on October 13, 2020
  • by Alíz Horváth

OpenMethods Spotlights showcase the people and epistemic reflections behind Digital Humanities tools and methods. Here you can find brief interviews with the creator(s) of the blogs or tools highlighted on OpenMethods, to humanize and contextualize them. In the first episode, Alíz Horváth talks with Hilde de Weerdt at Leiden University about MARKUS, a tool that offers a variety of functionalities for the markup, analysis, export, linking, and visualization of texts in multiple languages, with a special focus on Chinese and now Korean as well.

MARKUS – Comprehensive tool with the needs of non-Latin script users in mind
  • Analysis

  • Posted on October 11, 2020
  • by Alíz Horváth

East Asian studies are still largely underrepresented in digital humanities. Part of the reason for this phenomenon is the relative lack of tools and methods which could be used smoothly with non-Latin scripts. MARKUS, developed by Brent Ho within the framework of the Communication and Empire: Chinese Empires in Comparative Perspective project led by Hilde de Weerdt at Leiden University, is a comprehensive tool which helps mitigate this issue. Selected as a runner up in the category “Best tool or suite of tools” in the DH2016 awards, MARKUS offers a variety of functionalities for the markup, analysis, export, linking, and visualization of texts in multiple languages, with a special focus on Chinese and now Korean as well.

Das Projekt “GND für Kulturdaten” (GND4C)
  • Data

  • Posted on September 1, 2020
  • by Ulrike Wuttke

Introduction: Standardized metadata, linked meaningfully using semantic web technologies, are prerequisites for cross-disciplinary Digital Humanities research as well as for FAIR data management. In this article from the Open Access journal o-bib, members of the project „GND for Cultural Data“ (GND4C) describe how the Gemeinsame Normdatei (GND; in English, the Integrated Authority File), a widely accepted vocabulary for description and information retrieval in the library world, is maintained by the German National Library, and how it supports semantic interoperability and reuse of data. The article also explores how the GND can be utilized and advanced collaboratively, integrating the perspectives of its multidisciplinary stakeholders, including the Digital Humanities. For background reading, the training resources „Controlled Vocabularies and SKOS“ (https://campus.dariah.eu/resource/controlled-vocabularies-and-skos) and „Formal Ontologies“ (https://campus.dariah.eu/resource/formal-ontologies-a-complete-novice-s-guide) are of interest.
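
As a small illustration of what authority identifiers make possible (our example, not part of the article), the sketch below asks the public Wikidata SPARQL endpoint which item carries a given GND ID via the “GND ID” property (P227), using Goethe's well-known GND number; the linking logic, not the specific entity, is the point.

```python
# Illustration of identifier-based linking: find the Wikidata item that
# carries a given GND ID (property P227) and print its label.
# Uses the public Wikidata SPARQL endpoint; not part of the GND4C article.
import requests

SPARQL_ENDPOINT = "https://query.wikidata.org/sparql"

QUERY = """
SELECT ?item ?itemLabel WHERE {
  ?item wdt:P227 "118540238" .           # GND ID of Johann Wolfgang von Goethe
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
"""

def lookup_by_gnd() -> None:
    resp = requests.get(
        SPARQL_ENDPOINT,
        params={"query": QUERY, "format": "json"},
        headers={"User-Agent": "openmethods-example/0.1"},
        timeout=30,
    )
    resp.raise_for_status()
    for row in resp.json()["results"]["bindings"]:
        print(row["item"]["value"], row["itemLabel"]["value"])

if __name__ == "__main__":
    lookup_by_gnd()
```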

Exposing legacy project datasets in Digital Humanities | King’s Digital Lab
  • Archiving

  • Posted on July 28, 2020
  • by Erzsebet Tóth-Czifra

Introduction: How to sustain digital project outputs after their funding period is a recurrent topic on OpenMethods. In this post, Arianna Ciula introduces King’s Digital Lab’s solution, a workflow around their CKAN (Comprehensive Knowledge Archive Network) instance, and uncovers the many questions around not only maintaining a variety of legacy resources from long-running projects, but also opening them up for data reuse, verification and integration beyond siloed resources.
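
For readers unfamiliar with CKAN, the sketch below shows how datasets exposed through a CKAN instance can be queried over CKAN's standard action API (package_search); the base URL is a placeholder, not necessarily the Lab's actual instance.

```python
# Minimal sketch of querying a CKAN instance via its standard action API.
# CKAN_URL is a hypothetical placeholder; substitute a real instance's base URL.
import requests

CKAN_URL = "https://data.example.org"  # hypothetical CKAN instance

def search_datasets(query: str, rows: int = 10) -> list[dict]:
    """Full-text search over the datasets published in the CKAN instance."""
    resp = requests.get(
        f"{CKAN_URL}/api/3/action/package_search",
        params={"q": query, "rows": rows},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["result"]["results"]

if __name__ == "__main__":
    for ds in search_datasets("legacy project"):
        print(ds["name"], "-", ds.get("title", ""))
```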

Ediarum. A toolbox for editors and developers | RIDE
  • Annotating

  • Posted on April 14, 2020
  • by Erzsebet Tóth-Czifra

Introduction: The RIDE journal (the Review Journal of the Institute for Documentology and Scholarly Editing) aims to offer a solution to current misalignments between scholarly workflows and their evaluation, and provides a forum for the critical evaluation of the methodology of digital edition projects. This time, we have been cherry-picking from their latest issue (Issue 11), dedicated to the evaluation and critical improvement of tools and environments.
Ediarum is a toolbox developed for editors by the TELOTA initiative at the BBAW in Berlin to generate and annotate TEI-XML data in German. In his review, Andreas Mertgens touches upon issues regarding methodology and implementation, use cases, deployment and learning curve, open source, sustainability and extensibility of the tool, and user interaction and GUI, and of course gives a rich functional overview.
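
To give a flavour of the kind of markup such a toolbox targets, here is a generic, hand-rolled TEI fragment built with lxml; it illustrates TEI-XML annotation in general and is not output produced by Ediarum itself.

```python
# Generic illustration (not Ediarum output) of the kind of TEI-XML annotation
# such editor toolboxes target: a paragraph with a person name tagged and
# pointed at an authority record. Requires lxml.
from lxml import etree

TEI_NS = "http://www.tei-c.org/ns/1.0"

def minimal_tei_paragraph() -> bytes:
    p = etree.Element(f"{{{TEI_NS}}}p", nsmap={None: TEI_NS})
    p.text = "Letter from "
    pers = etree.SubElement(p, f"{{{TEI_NS}}}persName", ref="#goethe")
    pers.text = "Goethe"
    pers.tail = " to Schiller."
    return etree.tostring(p, pretty_print=True, encoding="utf-8")

if __name__ == "__main__":
    print(minimal_tei_paragraph().decode("utf-8"))
```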

TEIdown: Uso de Markdown extendido para el marcado automático de documentos TEI
  • Digital Humanities

  • Posted on July 23, 2019
  • by Gimena Del Rio

Introduction: In this article, Alejandro Bia Platas and Ramón P. Ñeco García introduce TEIdown, an extension of the Markdown syntax, together with transformation programs, aimed at creating XML-TEI documents automatically. TEIdown helps editors validate TEI documents and find errors in them.
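
As a toy illustration of the general idea behind such a pipeline, and emphatically not TEIdown's actual syntax or transformation programs, the sketch below maps a couple of invented “@key: value” metadata lines and plain paragraphs onto TEI-flavoured elements (the output is not schema-valid TEI).

```python
# Toy illustration of a Markdown-to-TEI idea, NOT TEIdown's actual syntax or
# implementation: invented "@key: value" header lines plus plain paragraphs
# are mapped onto a TEI-flavoured skeleton.
from xml.sax.saxutils import escape

def toy_markdown_to_tei(source: str) -> str:
    header, paragraphs = {}, []
    for line in source.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.startswith("@") and ":" in line:
            key, value = line[1:].split(":", 1)
            header[key.strip()] = value.strip()
        else:
            paragraphs.append(line)
    title = escape(header.get("title", "Untitled"))
    body = "\n".join(f"      <p>{escape(p)}</p>" for p in paragraphs)
    return (
        "<TEI xmlns=\"http://www.tei-c.org/ns/1.0\">\n"
        "  <teiHeader>\n"
        f"    <fileDesc><titleStmt><title>{title}</title></titleStmt></fileDesc>\n"
        "  </teiHeader>\n"
        "  <text>\n    <body>\n"
        f"{body}\n"
        "    </body>\n  </text>\n</TEI>"
    )

if __name__ == "__main__":
    sample = "@title: A short example\n\nFirst paragraph.\nSecond paragraph."
    print(toy_markdown_to_tei(sample))
```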

Forschungsdaten in der (digitalen) Geschichtswissenschaft. Warum sie wichtig sind und wir gemeinsame Standards brauchen – Digitale Geschichtswissenschaft
  • Data

  • Posted on October 1, 2018
  • by Ulrike Wuttke

Introduction: This is a comprehensive account of a workshop on research data in the study of the past. It introduces a broad spectrum of aspects and questions related to the growing relevance of digital research data and methods for this discipline, and to the methodological and conceptual consequences these entail, especially the need for a shared understanding of standards.

