
AI-LARGE LANGUAGE MODELS, DEISM/MONOTHEISM, AND THE PARTICULARITY OF WORDS

TAKEAWAYS:
  1. Large Language Models (LLMs) are potent neural networks for processing natural language. Developed from extensive language datasets and refined through human guidance, they've evolved beyond basic text prediction to generate innovative ideas, even crafting in-depth term papers.

  2. LLMs excel at synthesizing ideas creatively while handling 1,500+ natural languages, and models like Megatron-Turing, with over 500 billion parameters, exemplify their rapid progress.

  3. Language's role in shaping human understanding and socio-political cultures is underscored by major religions attributing the universe's origin to a language event, aligning with the Sapir-Whorf Hypothesis, which emphasizes language's influence on perception and comprehension.

RAMBLINGS FROM MICAH VORARITSKUL | 2023

Note: This article was written in April 2023 - a few things have changed in the development of generative AI technology in the past six months.


LARGE LANGUAGE MODELS CAN GENERATE NOVEL IDEAS

AI–Large Language Models (LLMs) are neural networks that process natural language. These LLMs are trained on billions of words of language data and have been set up to crunch words (not just numbers, actual words). The networks' outputs are scored through human-supervised training, and the network is tweaked (and self-corrects) in billions of tiny, infinitely complex, unknowable, and often random (stochastic) ways. When a network reaches an acceptable level (or rather, an amazing level), the result is a powerful data model driven by language and useful to humans.

These datasets, networks, and models have advanced far beyond general text prediction (the way most of us first encountered them on cell phones and in internet searches). They can parse and generate useful idea streams (which is why they can write remarkable term papers). They can even amalgamate and synthesize ideas, combining them in novel ways. And the models are learning natural languages (speech-to-text in 1,500+ natural languages now). LLMs are proliferating globally at an alarming rate. As of this writing (April 2023), there are several monoliths (models with tens of billions of parameters, some, like Megatron-Turing, with over 500 billion). Others include well-known ones like BERT, GPT-3, and T5. While we sleep, LLMs are doing ten trillion push-ups a night and getting smarter, faster, and more accurate.
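To make the "text prediction" piece concrete, here is a minimal sketch (mine, not the author's) of next-token prediction, the basic operation these models are built on. It assumes Python with the Hugging Face transformers and PyTorch libraries installed and uses the small, freely available GPT-2 model as a stand-in for the giants named above.

```python
# Minimal sketch: next-token prediction with a small pretrained language model.
# Assumes the `transformers` and `torch` packages are installed; GPT-2 is used
# here only as an illustrative stand-in for the much larger models in the article.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

prompt = "In the beginning was the"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    # logits has shape (batch, sequence_length, vocab_size)
    logits = model(**inputs).logits

# The scores at the last position rank every word piece in the vocabulary
# as a candidate continuation; the top few are the model's "prediction."
next_token_logits = logits[0, -1]
top = torch.topk(next_token_logits, k=5)
for score, token_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode([int(token_id)])!r}  (logit {float(score):.2f})")
```

Everything a chatbot does is layered on top of this one step, repeated over and over: pick a likely next word piece, append it, and predict again.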

WHY THIS MATTERS

Words, language, and the advancement of ideas are what drive and shape the world we live in.* We would be foolish to underestimate the power of language as it relates to the world we understand and the parts of the world (or universe) we don't/can't understand.

FIRST CAUSE OF EXISTENCE: A SPEECH EVENT?


Consider this: Three major world religions, whose adherents comprise the majority of humans in the world (about 56% according to Pew), present a systematic, philosophical/theological framework rooted in this idea: The Universe was created with words through a language event (i.e., the universe was a swirling soup of chaos and a powerful creative being said, "Let's start this project with some light."). Notwithstanding the agnostic's problem (why is there something and not nothing?), we should pay attention when most humans share a paradigm that underpins their collective socio-political cultures.

Those who identify with Christianity (2.4B - 31%)

Those who identify with Islam (1.9B - 25%)

Those who identify with Judaism (15M - <1%)

*EDWARD SAPIR & BENJAMIN WHORF (1929): REALITY IS FRAMED BY LANGUAGE

The famous Sapir-Whorf Hypothesis is also worth consideration. The hypothesis holds that the way people learn a language (and the actual grammatical structure of that language) determines the limits (the framework) of their cognitive processes. In other words, the language a person "thinks in" and "speaks in" guides that person's perception, categorization, and understanding of everything they experience in the world. A famous example is the Eskimo-Aleut language family (which includes the Inuit languages), whose languages contain many words for "snow." Eskimos know snow: soft white snow, dirty white snow, wet white snow, powdery white snow, glimmering white snow, etc. It's safe to say native speakers of these languages have a deeper understanding of "snow-ness" and the white qualities of snow than most white people.
