Artificial Intelligence
NOUN
Computing.
The capacity of computers or other machines to exhibit or simulate intelligent behaviour; the field of study concerned with this. In later use also: software used to perform tasks or produce output previously thought to require human intelligence, esp. by using machine learning to extrapolate from large collections of data. Also as a count noun: an instance of this type of software; a (notional) entity exhibiting such intelligence. Abbreviated AI.
“Artificial Intelligence, N.” Oxford English Dictionary, Oxford UP, March 2024, https://doi.org/10.1093/OED/7359280480.
But what is our current level of progress in the development of Artificial Intelligence?
At present, a variety of AI software is available; much of it is proprietary, and some of it is strictly confidential. The best known of the existing AI tools are Large Language Models (abbreviated LLMs). These models are trained on massive datasets of text and use advanced algorithms to answer questions, treating the instructions given to them as context from which to predict a viable response. They include AI assistant software such as ChatGPT and Copilot, as well as tools for generating stories and other forms of writing such as NovelAI.
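The idea of using context to predict a viable next piece of text can be illustrated with a toy sketch. This is not how production LLMs work (they use neural networks trained on billions of tokens, not word counts); the corpus below is hypothetical and exists only to show the prediction step:

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus; a real LLM trains on billions of tokens.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the most likely next word given the preceding word."""
    candidates = following[word]
    return candidates.most_common(1)[0][0] if candidates else None

print(predict_next("the"))  # "cat" follows "the" most often in this corpus
```

An LLM does the same kind of thing at vastly greater scale, conditioning its prediction on the entire prompt rather than a single preceding word.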
Also prominent in the news are tools for generating images and video from a prompt. These models learn patterns from large datasets of images and their accompanying metadata, and operate by replicating the patterns they identified in the training data in new images, guided by the context of the prompt posed to them. Stable Diffusion and Midjourney are the leaders in image generation software, while OpenAI's Sora has shown substantial results in generating videos from text prompts.
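Diffusion-based image generators of this kind work by starting from random noise and refining it step by step toward something that matches the learned patterns. The sketch below is a heavily simplified, hypothetical illustration of that iterative refinement: a real model uses a trained neural network to predict the noise at each step, whereas here a known "clean image" stands in for the network's prediction:

```python
import numpy as np

# Stand-in for a tiny "image" (hypothetical; a real image has millions of values).
clean = np.array([0.2, 0.8, 0.5, 0.1])

rng = np.random.default_rng(0)
x = rng.normal(size=4)          # start from pure random noise

steps = 50
for t in range(steps):
    predicted_clean = clean     # a trained denoiser would produce this prediction
    # Move the current sample a small step toward the prediction.
    x = x + (predicted_clean - x) / (steps - t)

print(np.allclose(x, clean))    # True: noise has been refined into the "image"
```

The prompt's role in a real system is to steer each denoising step toward images whose training-data patterns matched similar text.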
Is AI actually intelligent? Does it reason as a human would?
There is currently no consensus on the matter. We (in the collective sense) have neither an adequate understanding of how the brain accomplishes what it accomplishes, nor of how an AI arrives at its output: AI is a black box, in that we know what we did to create it, but it is unclear exactly what internal processes it develops in order to produce its responses. Because an AI is trained on large datasets over time, it is functionally grown rather than built.
Chemero, A. LLMs differ from human cognition because they are not embodied. Nat Hum Behav 7, 1828–1829 (2023). https://doi.org/10.1038/s41562-023-01723-5
Components of an LLM
So-called neural networks are what allow an AI to "think". They are loosely inspired by how human brains process information, with nodes (analogous to neurons) forming weighted connections (analogous to synapses) between them.
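A minimal sketch of this structure: each node sums its weighted inputs and applies an activation function before passing the result on. The weights below are random and hypothetical, standing in for the connections a trained network would have learned:

```python
import numpy as np

# Hypothetical random weights; a trained network learns these from data.
rng = np.random.default_rng(42)
W1 = rng.normal(size=(3, 4))    # connections: 3 inputs -> 4 hidden nodes
W2 = rng.normal(size=(4, 2))    # connections: 4 hidden nodes -> 2 outputs

def forward(x):
    hidden = np.maximum(0, x @ W1)   # ReLU activation: a node "fires" only when positive
    return hidden @ W2               # output layer

x = np.array([1.0, 0.5, -0.2])
print(forward(x).shape)  # (2,)
```

Training adjusts the numbers in W1 and W2 so that the network's outputs become useful; the structure itself stays the same.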
Transformers turn language into numbers (embeddings), then use those numbers to organize concepts by proximity: concepts that are related are placed closer together.
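The "closer together" idea can be made concrete with toy embeddings. The three-dimensional vectors below are hypothetical (real models learn embeddings with hundreds or thousands of dimensions); cosine similarity is one standard way to measure how close two concepts are:

```python
import numpy as np

# Hypothetical 3-dimensional embeddings; real models learn far larger ones.
embeddings = {
    "cat": np.array([0.9, 0.8, 0.1]),
    "dog": np.array([0.8, 0.9, 0.2]),
    "car": np.array([0.1, 0.2, 0.9]),
}

def cosine_similarity(a, b):
    """1.0 means identical direction; near 0 means unrelated."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Related concepts ("cat", "dog") sit closer than unrelated ones ("cat", "car").
print(cosine_similarity(embeddings["cat"], embeddings["dog"]) >
      cosine_similarity(embeddings["cat"], embeddings["car"]))  # True
```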
This allows the AI to effectively "think" by performing mathematics on these concepts, using the prompt as a guide for which concepts are at play.
The transformer then translates the answer back into natural language.
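The core mathematical operation a transformer performs on these concept vectors is attention, introduced in the Vaswani et al. paper cited below. The sketch uses toy values (a single head, no learned projection weights) purely to show the mechanics:

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: mix values, weighted by query-key match."""
    scores = Q @ K.T / np.sqrt(K.shape[-1])                  # compare queries to keys
    weights = np.exp(scores) / np.exp(scores).sum(-1, keepdims=True)  # softmax
    return weights @ V                                       # weighted mix of values

# Three tokens, each a 4-dimensional vector (hypothetical embeddings).
X = np.array([[1.0, 0.0, 1.0, 0.0],
              [0.0, 1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0, 0.0]])

out = attention(X, X, X)   # self-attention: every token attends to every token
print(out.shape)  # (3, 4): each token becomes a context-aware blend of all tokens
```

In a full transformer this operation is stacked in many layers, each with learned weight matrices, before the final layer maps the result back to probabilities over words.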
Vaswani, Ashish, et al. "Attention Is All You Need." Advances in Neural Information Processing Systems, vol. 30, Curran Associates, Inc., 2017.
This work is licensed under a Creative Commons Attribution NonCommercial 4.0 International License.