The Programming Historian (Jan 2024)

Understanding and Creating Word Embeddings

  • Avery Blankenship,
  • Sarah Connell,
  • Quinn Dombrowski

DOI
https://doi.org/10.46430/phen0116
Journal volume & issue
Vol. 13

Abstract

Word embeddings allow you to analyze how different terms are used in a corpus of texts by capturing information about the contexts in which they appear. Through a primarily theoretical lens, this lesson will teach you how to prepare a corpus and train a word embedding model. You will explore how word vectors work, how to interpret them, and how to answer humanities research questions using them.
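As a rough illustration of the workflow the abstract describes (preparing a corpus, training a model, and querying word vectors), the sketch below uses the gensim library's Word2Vec implementation. This is an assumption for illustration only: the lesson's actual tooling, corpus, and parameter choices may differ, and the example corpus and query term here are invented.

```python
# A minimal sketch of training and querying a word embedding model,
# assuming gensim (4.x) is installed and the corpus is already tokenized.
from gensim.models import Word2Vec

# Placeholder corpus: in practice, each inner list would be the tokens
# of one document or sentence from your prepared texts.
corpus = [
    ["the", "nation", "gathered", "for", "the", "harvest"],
    ["recipes", "for", "bread", "and", "preserves"],
    # ... many more tokenized documents ...
]

# Train a Word2Vec model; these parameter values are illustrative.
model = Word2Vec(
    sentences=corpus,
    vector_size=100,   # dimensionality of the word vectors
    window=5,          # context window size
    min_count=1,       # keep rare words here only because the toy corpus is tiny
    workers=4,         # parallel training threads
)

# Query the trained vectors: which words appear in similar contexts to "bread"?
print(model.wv.most_similar("bread", topn=5))
```

With a real corpus, calls like `most_similar` are one way to start answering humanities research questions, since they surface terms that the model has learned to associate through shared contexts of use.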