Introduction: For humanities scholars (not only historians) who have not yet had any contact with the Digital Humanities, Silke Schwandt offers a motivating and vivid introduction to the potential of this approach, using the analysis of word frequencies as an example. With the help of Voyant Tools and Nopaque, she provides her listeners with the necessary equipment to work quantitatively with their corpora. Schwandt’s presentation, to which the following report by Maschka Kunz, Isabella Stucky and Anna Ruh refers, can also be viewed at https://www.youtube.com/watch?v=tJvbC3b1yPc.
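To give a sense of the kind of result Schwandt demonstrates, a minimal word-frequency count can be scripted in a few lines of Python. This is only an illustrative sketch, not part of her workflow or of Voyant Tools/Nopaque; the file name and the naive tokenization rule are assumptions.

```python
# Minimal word-frequency sketch (illustrative; not Voyant Tools or Nopaque itself).
# Assumes a plain-text file "corpus.txt"; the file name is a placeholder.
import re
from collections import Counter

with open("corpus.txt", encoding="utf-8") as f:
    text = f.read().lower()

# Naive tokenization: sequences of word characters count as tokens.
tokens = re.findall(r"\w+", text)
frequencies = Counter(tokens)

# Print the 20 most frequent words, comparable to a "terms" list in Voyant.
for word, count in frequencies.most_common(20):
    print(f"{word}\t{count}")
```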
Category: Analysis
This general research goal refers to the activity of extracting any kind of information from open or closed, structured or unstructured collections of data, of discovering recurring phenomena, units, elements, patterns, groupings, and the like. This can refer to structural, formal or semantic aspects of data. Analysis also includes methods used to visualize results. Methods and techniques related to this goal may be considered to follow Capture and Enrichment; however, Enrichment depends upon assumptions, research questions and results related to Analysis.
Introduction: One of the major challenges of digital data workflows in the Arts and Humanities is that resources that belong together (in extreme cases, such as this one, even parts of dismembered manuscripts) are hosted and embedded in different geographical and institutional silos. Combining IIIF with a MySQL database, Fragmentarium provides a user-friendly but also standardized, open workspace for the virtual reconstruction of medieval manuscript fragments. Lisa Fagin Davis’s blog post gives contextualized insights into the potential of Fragmentarium and into how, as she writes, “technology has caught up with our dreams”.
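The interoperability behind such virtual reconstructions rests on the IIIF Presentation API, in which each digitized fragment is described by a JSON manifest that any compliant viewer can read. The following sketch, with a placeholder manifest URL, shows how little code is needed to list the canvases of one such manifest; it is not Fragmentarium’s own code.

```python
# Sketch: reading a IIIF Presentation API (v2) manifest and listing its canvases.
# The manifest URL is a placeholder, not a Fragmentarium endpoint.
import requests

MANIFEST_URL = "https://example.org/iiif/manifest.json"  # placeholder

manifest = requests.get(MANIFEST_URL, timeout=30).json()
print("Manifest label:", manifest.get("label"))

# In IIIF Presentation API 2.x, images live on canvases inside a sequence.
for sequence in manifest.get("sequences", []):
    for canvas in sequence.get("canvases", []):
        print(canvas.get("label"), canvas.get("@id"))
```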
Introduction: Digital text analysis depends on one important thing: text that can be processed with little effort. Working with PDFs often leads to great difficulties, as Zeyd Boukhers, Shriharsh Ambhore and Steffen Staab describe in their paper. Their goal is to extract references from PDF documents. A highlight of the workflow they describe is its very impressive precision rates. The paper thereby encourages further development of the process and its application as a “method” in the humanities.
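The authors’ pipeline is considerably more sophisticated, but a naive baseline makes the problem they tackle concrete: pull the raw text out of a PDF and cut out everything after a “References” heading. The sketch below is such a baseline only; the file name and heading pattern are assumptions, and real PDFs break this approach in exactly the ways the paper addresses.

```python
# Naive baseline for locating a reference section in a PDF
# (for illustration only; not the authors' extraction pipeline).
import re
from pypdf import PdfReader

reader = PdfReader("paper.pdf")  # placeholder file name
full_text = "\n".join(page.extract_text() or "" for page in reader.pages)

# Split at a "References" or "Bibliography" heading, if one is found.
match = re.search(r"\n\s*(References|Bibliography)\s*\n", full_text, re.IGNORECASE)
if match:
    for line in full_text[match.end():].splitlines():
        if line.strip():
            print(line.strip())
else:
    print("No reference section heading found.")
```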
Introduction: Vistorian is a network analysis tool for digital humanists, especially historians. Its features have been specifically developed along basic principles of simplicity, privacy, openness and extensibility. The tool is part of the open-source Networkcube Project.
OpenMethods Spotlights showcase the people and epistemic reflections behind Digital Humanities tools and methods. Here you can find brief interviews with the creator(s) of the blogs or tools that are highlighted on OpenMethods, in order to humanize and contextualize them. In the first episode, Alíz Horváth talks with Hilde de Weerdt at Leiden University about MARKUS, a tool that offers a variety of functionalities for the markup, analysis, export, linking, and visualization of texts in multiple languages, with a special focus on Chinese and now Korean as well.
East Asian studies are still largely underrepresented in digital humanities. Part of the reason for this is the relative lack of tools and methods that can be used smoothly with non-Latin scripts. MARKUS, developed by Brent Ho within the framework of the Communication and Empire: Chinese Empires in Comparative Perspective project led by Hilde de Weerdt at Leiden University, is a comprehensive tool which helps mitigate this issue. Selected as a runner-up in the category “Best tool or suite of tools” in the DH2016 awards, MARKUS offers a variety of functionalities for the markup, analysis, export, linking, and visualization of texts in multiple languages, with a special focus on Chinese and now Korean as well.
Introduction: In this blog post, Michael Schonhardt explores and evaluates a range of freely available, Open Source tools – Inkscape, Blender, Stellarium, Sketchup – that enable the digital, 3D modelling of medieval scholarly objects. These diverse tools bring easily implementable solutions for both the analysis and the communication of results of object-related cultural studies and are especially suitable for projects with small budgets.
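Among the tools Schonhardt discusses, Blender is also scriptable: its Python API (bpy) can build and export simple solids as a starting point for a model of an instrument or codex. The sketch below is a generic illustration to be run inside Blender’s scripting workspace, not a reconstruction from his post; the object and export path are placeholders.

```python
# Minimal Blender scripting sketch (run inside Blender; bpy is not a standalone package).
# Builds a simple flat disc, a rough stand-in for an object such as an astrolabe plate.
import bpy

# Remove the default objects so the scene only holds our model.
bpy.ops.object.select_all(action="SELECT")
bpy.ops.object.delete()

# A flat cylinder approximates a disc-shaped object.
bpy.ops.mesh.primitive_cylinder_add(radius=1.0, depth=0.02, location=(0, 0, 0))

# Export the scene as glTF for sharing or web presentation (path is a placeholder).
bpy.ops.export_scene.gltf(filepath="/tmp/model.glb")
```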
Historical newspapers, already available in many digitized collections, may represent a significant source of information for the reconstruction of events and backgrounds, enabling historians to cast new light on facts and phenomena, as well as to advance new interpretations. A collaboration between institutions in Lausanne, the University of Zurich and C2DH Luxembourg, the ‘impresso – Media Monitoring of the Past’ project wishes to offer an advanced corpus-oriented answer to the increasing need to access and consult collections of historical digitized newspapers.
[…] Thanks to a suite of computational tools for data extraction, linking and exploration, impresso aims to overcome the traditional keyword-based approach by applying advanced techniques, from lexical processing to semantically deepened n-grams, from data modelling to interoperability.
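One of the elementary building blocks behind such corpus exploration is n-gram extraction. The sketch below counts bigrams over a toy list of article texts as a minimal stand-in; impresso’s own lexical processing is far more elaborate, and the sample sentences are invented.

```python
# Minimal bigram-counting sketch (a stand-in for corpus-level n-gram statistics;
# impresso's own processing is far more elaborate).
from collections import Counter

articles = [
    "the federal council met in bern",        # toy stand-ins for digitized articles
    "the council discussed the railway law",
]

bigrams = Counter()
for article in articles:
    tokens = article.split()
    bigrams.update(zip(tokens, tokens[1:]))

for (first, second), count in bigrams.most_common(5):
    print(f"{first} {second}\t{count}")
```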
Introduction: In our guidelines for nominating content, databases are explicitly excluded. However, this database is an exception, which is due not to the burning issue of COVID-19 but to the exemplary variety of digital humanities methods with which its data can be processed. AVOBMAT makes it possible to process 51,000 articles with almost every conceivable approach (Topic Modeling, Network Analysis, N-gram viewer, KWIC analyses, gender analyses, lexical diversity metrics, and so on) and is thus much more than just a simple database – rather, it is a welcome stage for the Who is Who (or What is What?) of OpenMethods.
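Of the methods listed, keyword-in-context (KWIC) is the simplest to reproduce outside AVOBMAT. The sketch below shows the basic idea on an invented sentence; the query and window size are arbitrary, and AVOBMAT’s own implementation is not shown here.

```python
# Minimal keyword-in-context (KWIC) sketch; not AVOBMAT's implementation.
def kwic(text, keyword, window=4):
    """Yield (left context, keyword, right context) for each hit, using a word window."""
    tokens = text.split()
    for i, token in enumerate(tokens):
        if token.lower() == keyword.lower():
            left = " ".join(tokens[max(0, i - window):i])
            right = " ".join(tokens[i + 1:i + 1 + window])
            yield left, token, right

sample = "The epidemic spread quickly and the epidemic dominated every newspaper report"
for left, hit, right in kwic(sample, "epidemic"):
    print(f"{left:>35} [{hit}] {right}")
```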
Introduction: This blog, curated by Andreas W. Müller from Halle University, provides insight into qualitative data analysis (QDA) techniques for conducting research in the field of Digital Humanities. The field is currently dominated by quantitative research methods and still lacks digital analysis derived from qualitative approaches. The author implies that QDA is not a method but a set of techniques that can be used with different analysis methods, for instance Content Analysis or Discourse Analysis. He also outlines how QDA combines qualitative data with qualitative analysis, both of which are fundamental to it.
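To make the distinction concrete: in QDA, researchers attach codes to text segments and only then analyse the coded material. The sketch below shows one minimal way such a coding table could be represented and tallied; the codes and segments are invented for the example, and dedicated QDA software of course offers far more.

```python
# Toy illustration of qualitative coding: codes attached to text segments, then tallied.
# Codes and segments are invented for the example.
from collections import Counter

coded_segments = [
    {"segment": "I only use the archive on weekends", "codes": ["access", "time"]},
    {"segment": "The scans are hard to read", "codes": ["data quality"]},
    {"segment": "We discuss findings in a reading group", "codes": ["collaboration"]},
]

# A simple frequency count of codes is one (quantifying) way to summarize the coding;
# Content or Discourse Analysis would go on to interpret the segments themselves.
code_counts = Counter(code for item in coded_segments for code in item["codes"])
print(code_counts.most_common())
```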