Trayectorias Humanas Trascontinentales (Dec 2024)

Digital materiality

  • Christian STEIN

DOI
https://doi.org/10.25965/trahs.6416
Journal volume & issue
no. 18

Abstract

In 2017, the paper “Attention Is All You Need” (Vaswani et al.), by researchers at Google, introduced the Transformer architecture, laying the groundwork for today’s large language models (LLMs) such as GPT, Claude, and Llama. Transformers excel at processing sequences through self-attention, which dynamically weights the relationships between words in a sentence. This approach revolutionized natural language processing, enabling models to understand and generate human-like text by computing complex contextual meanings. These foundation models are now so advanced that interactions with them often feel human-like. This evolution challenges not only technological norms but also human self-perception, sparking both fascination and fear. Historically, humans have resisted ideas that diminish their unique place in the world, such as the Copernican and Darwinian revolutions. Similarly, today's AI advances evoke concerns about technology surpassing human abilities, including creativity and problem-solving. By processing vast amounts of digital information, Transformers act as material production engines, transforming human-readable inputs into abstract data structures and producing outputs that are often more polished than what humans themselves create. As AI continues to advance at an unprecedented rate, its integration into society raises ethical and existential questions. Understanding AI as active digital material requires new metaphors, vocabularies, and frameworks, including ethical reflection on creativity, intelligence, and responsibility in a rapidly evolving technological landscape.
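
For readers unfamiliar with the mechanism the abstract refers to, the self-attention of Vaswani et al. (2017) is usually summarized by the scaled dot-product attention formula. This is the standard formulation from the cited paper, not a result of the present article: Q, K, and V denote the query, key, and value matrices derived from the input tokens, and d_k is the dimensionality of the keys.

\[ \mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{Q K^{\top}}{\sqrt{d_k}}\right) V \]

The softmax over QK^T yields the “dynamic weighting of relationships between words” mentioned above; multiplying by V then mixes the token representations according to those weights.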

Keywords