Do humanists need BERT?

OpenMethods introduction to: Do humanists need BERT? | The Stone and the Shell (2019-08-12)
https://openmethods.dariah.eu/2019/08/12/do-humanists-need-bert-the-stone-and-the-shell/
Introduction by OpenMethods Editor (Christopher Nunn): Ted Underwood tests a new language representation model called "Bidirectional Encoder Representations from Transformers" (BERT) and asks whether humanists should use it. Given its high degree of difficulty and its limited success so far (e.g., in genre detection), he concludes that this approach will be important in the future, but that it is not something humanists need to engage with at the moment. An important caveat worth reading.

Even if you have no intention of ever using the model, there is something thrilling about BERT’s ability to reuse the knowledge it gained solving one problem to get a head start on lots of other problems. This approach, called “transfer learning,” brings machine learning closer to learning of the human kind.
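For readers curious what transfer learning looks like in practice, below is a minimal, hypothetical sketch of fine-tuning a pre-trained BERT model on a tiny classification task (here, toy genre labels), assuming the Hugging Face transformers and PyTorch libraries. It is an illustration of the general idea only, not the setup Underwood used in his experiments.

```python
# A minimal sketch of transfer learning with BERT: start from a model
# pre-trained on a large corpus, then fine-tune it on a small labelled task.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

# Hypothetical labelled passages (1 = science fiction, 0 = other).
texts = [
    "The colony ship drifted past the outer moons of Jupiter.",
    "She poured the tea and asked after her cousin's health.",
]
labels = torch.tensor([1, 0])

# Load the pre-trained model: the "knowledge gained solving one problem".
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

# Convert the passages into BERT's input format.
inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

# Fine-tune: a few gradient steps adapt the general model to the new task.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for _ in range(3):
    optimizer.zero_grad()
    outputs = model(**inputs, labels=labels)
    outputs.loss.backward()
    optimizer.step()

# Use the fine-tuned model on an unseen passage.
model.eval()
with torch.no_grad():
    test = tokenizer("The android dreamed of electric sheep.", return_tensors="pt")
    prediction = model(**test).logits.argmax(dim=-1).item()
print(prediction)  # 1 if the model guesses science fiction
```

In a real experiment the labelled set would of course be far larger, and the point of transfer learning is precisely that the pre-trained weights let the model get by with less task-specific data than training from scratch would require.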


Source: Do humanists need BERT? | The Stone and the Shell, https://tedunderwood.com/2019/07/15/do-humanists-need-bert/