• Generative pre-trained transformer
    A generative pre-trained transformer (GPT) is a type of large language model (LLM) and a prominent framework for generative artificial intelligence. It...
    65 KB (5,342 words) - 13:55, 1 May 2025
  • ChatGPT
    misinformation. ChatGPT is built on OpenAI's proprietary series of generative pre-trained transformer (GPT) models and is fine-tuned for conversational applications...
    207 KB (17,915 words) - 15:42, 4 May 2025
  • GPT-2
    Generative Pre-trained Transformer 2 (GPT-2) is a large language model by OpenAI and the second in their foundational series of GPT models. GPT-2 was pre-trained...
    44 KB (3,243 words) - 09:14, 19 April 2025
  • Generative Pre-trained Transformer 4Chan (GPT-4chan) is a controversial AI model that was developed and deployed by YouTuber and AI researcher Yannic Kilcher...
    9 KB (1,128 words) - 12:11, 24 April 2025
  • Generative Pre-trained Transformer 4 (GPT-4) is a multimodal large language model created by OpenAI and the fourth in its series of GPT foundation...
    64 KB (6,200 words) - 22:30, 6 May 2025
  • GPT-1 (category Generative pre-trained transformers)
    Generative Pre-trained Transformer 1 (GPT-1) was the first of OpenAI's large language models following Google's invention of the transformer architecture...
    32 KB (1,064 words) - 01:34, 21 March 2025
  • OpenAI o1 is a reflective generative pre-trained transformer (GPT). A preview of o1 was released by OpenAI on September 12, 2024. o1 spends time "thinking"...
    13 KB (1,349 words) - 01:41, 28 March 2025
  • Generative Pre-trained Transformer 3 (GPT-3) is a large language model released by OpenAI in 2020. Like its predecessor, GPT-2, it is a decoder-only transformer...
    55 KB (4,916 words) - 23:17, 7 May 2025
  • Transformer (deep learning architecture)
    of pre-trained systems, such as generative pre-trained transformers (GPTs) and BERT (bidirectional encoder representations from transformers). For many...
    106 KB (13,091 words) - 06:27, 8 May 2025
  • Generative artificial intelligence
    advancements in generative models compared to older long short-term memory (LSTM) models, leading to the first generative pre-trained transformer (GPT), known as...
    172 KB (14,754 words) - 05:24, 8 May 2025
  • OpenAI o3 (category Generative pre-trained transformers)
    OpenAI o3 is a reflective generative pre-trained transformer (GPT) model developed by OpenAI as a successor to OpenAI o1. It is designed to devote additional...
    8 KB (744 words) - 06:49, 29 April 2025
  • GPT-4o (category Generative pre-trained transformers)
    GPT-4o ("o" for "omni") is a multilingual, multimodal generative pre-trained transformer developed by OpenAI and released in May 2024. GPT-4o is free,...
    23 KB (2,244 words) - 22:36, 6 May 2025
  • GPT-J (category Generative pre-trained transformers)
    developed by EleutherAI in 2021. As the name suggests, it is a generative pre-trained transformer model designed to produce human-like text that continues from...
    11 KB (1,015 words) - 12:21, 2 February 2025
  • OpenAI o4-mini (category Generative pre-trained transformers)
    OpenAI o4-mini is a generative pre-trained transformer model created by OpenAI. On April 16, 2025, the o4-mini model was released to all ChatGPT users...
    4 KB (303 words) - 22:34, 6 May 2025
  • Generative pre-trained transformer, a type of artificial intelligence language model; ChatGPT, a chatbot developed by OpenAI, based on generative pre-trained...
    1 KB (145 words) - 03:59, 12 March 2025
  • DeepSeek (chatbot) (category Generative pre-trained transformers)
    published a paper unveiling a new model that combines the techniques generative reward modeling (GRM) and self-principled critique tuning (SPCT). The...
    50 KB (4,279 words) - 13:24, 6 May 2025
  • IBM Watsonx (category Generative pre-trained transformers)
    Watsonx is IBM's commercial generative AI and scientific data platform based on cloud. It offers a studio, data store, and governance toolkit. It supports...
    8 KB (634 words) - 08:23, 9 February 2025
  • Microsoft Copilot (category Generative pre-trained transformers)
    Microsoft Copilot (or simply Copilot) is a generative artificial intelligence chatbot developed by Microsoft. Based on the GPT-4 series of large language...
    63 KB (5,651 words) - 15:18, 1 May 2025
  • Attention Is All You Need
    as multimodal Generative AI. The paper's title is a reference to the song "All You Need Is Love" by the Beatles. The name "Transformer" was picked because...
    15 KB (3,915 words) - 20:36, 1 May 2025
  • and are trained with self-supervised learning on a vast amount of text. The largest and most capable LLMs are generative pretrained transformers (GPTs)...
    114 KB (11,944 words) - 06:36, 8 May 2025
  • meaning), transformers (a deep learning architecture using an attention mechanism), and others. In 2019, generative pre-trained transformer (or "GPT")...
    279 KB (28,676 words) - 09:36, 8 May 2025
  • Ernie Bot (category Generative pre-trained transformers)
    regulatory authorities on August 31, 2023. "Ernie 3.0", the language model, was trained with 10 billion parameters on a 4 terabyte (TB) corpus which consists of...
    18 KB (1,743 words) - 12:41, 2 May 2025
  • DALL-E (category Generative pre-trained transformers)
    Initiative. The first generative pre-trained transformer (GPT) model was developed by OpenAI in 2018, using the Transformer architecture. The first...
    54 KB (4,243 words) - 02:48, 30 April 2025
  • long stretches of contiguous text. Generative Pre-trained Transformer 2 ("GPT-2") is an unsupervised transformer language model and the successor to...
    218 KB (19,027 words) - 18:47, 5 May 2025
  • Claude (language model)
    compared to previous versions. Claude models are generative pre-trained transformers. They have been pre-trained to predict the next word in large amounts of...
    21 KB (1,894 words) - 22:18, 6 May 2025
  • PaLM
    billion-parameter dense decoder-only transformer-based large language model (LLM) developed by Google AI. Researchers also trained smaller versions of PaLM (with...
    13 KB (807 words) - 13:21, 13 April 2025
  • OpenAI Codex (category Generative pre-trained transformers)
    programming applications. Based on GPT-3, a neural network trained on text, Codex was additionally trained on 159 gigabytes of Python code from 54 million GitHub...
    11 KB (1,085 words) - 18:00, 2 May 2025
  • AutoGPT (category Generative pre-trained transformers)
    CEO commented that "AutoGPT illustrates the power and unknown risks of generative AI," and that due to usage risks, enterprises should include a human in...
    17 KB (1,656 words) - 04:47, 26 April 2025
  • GigaChat (category Generative pre-trained transformers)
    GigaChat is a generative artificial intelligence chatbot developed by the Russian financial services corporation Sberbank and launched in April 2023. It...
    3 KB (294 words) - 15:47, 16 March 2025
    generated artworks. In 2021, building on the influential generative pre-trained transformer language models GPT-2 and GPT-3, OpenAI released a...
    96 KB (9,167 words) - 06:18, 5 May 2025