• Generative pre-trained transformer
    A generative pre-trained transformer (GPT) is a type of large language model (LLM) that is widely used in generative AI chatbots. GPTs are based on a deep...
  • ChatGPT
    ChatGPT is a generative artificial intelligence chatbot developed by OpenAI and released on November 30, 2022. It uses generative pre-trained transformers (GPTs)...
  • GPT-2
    Generative Pre-trained Transformer 2 (GPT-2) is a large language model by OpenAI and the second in their foundational series of GPT models. GPT-2 was pre-trained...
  • Transformer (deep learning architecture)
    of pre-trained systems, such as generative pre-trained transformers (GPTs) and BERT (bidirectional encoder representations from transformers). For many...
  • OpenAI o1 is a reflective generative pre-trained transformer (GPT). A preview of o1 was released by OpenAI on September 12, 2024. o1 spends time "thinking"...
  • Generative artificial intelligence
    advancements in generative models compared to older long short-term memory (LSTM) models, leading to the first generative pre-trained transformer (GPT), known as...
  • Generative Pre-trained Transformer 4Chan (GPT-4chan) is a controversial AI model that was developed and deployed by YouTuber and AI researcher Yannic Kilcher...
  • Generative Pre-trained Transformer 4 (GPT-4) is a large language model created and trained by OpenAI, and the fourth in its series of GPT foundation models...
  • GPT-1 (category Generative pre-trained transformers)
    Generative Pre-trained Transformer 1 (GPT-1) was the first of OpenAI's large language models following Google's invention of the transformer architecture...
  • Claude (language model) (category Generative pre-trained transformers)
    was released in May 2025. Claude models are generative pre-trained transformers. They have been pre-trained to predict the next word in large amounts of...
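The "predict the next word" pre-training objective mentioned in the Claude snippet can be sketched with a toy example (the tokens are made up; real models work on integer token IDs over large corpora):

```python
# Toy illustration of next-token prediction: each position's training
# target is simply the token that follows it in the text.
def next_token_pairs(tokens):
    """Return (input, target) pairs where the target is the next token."""
    inputs = tokens[:-1]   # the model sees tokens up to position t
    targets = tokens[1:]   # and is trained to predict the token at t+1
    return list(zip(inputs, targets))

pairs = next_token_pairs(["The", "cat", "sat", "down"])
print(pairs)  # [('The', 'cat'), ('cat', 'sat'), ('sat', 'down')]
```

During pre-training, a loss (typically cross-entropy) scores the model's predicted distribution for each target token.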
  • OpenAI o3 (category Generative pre-trained transformers)
    OpenAI o3 is a reflective generative pre-trained transformer (GPT) model developed by OpenAI as a successor to OpenAI o1 for ChatGPT. It is designed to...
  • GPT-J (category Generative pre-trained transformers)
    developed by EleutherAI in 2021. As the name suggests, it is a generative pre-trained transformer model designed to produce human-like text that continues from...
  • OpenAI o4-mini (category Generative pre-trained transformers)
    OpenAI o4-mini is a generative pre-trained transformer model created by OpenAI. On April 16, 2025, the o4-mini model was released to all ChatGPT users...
  • Generative Pre-trained Transformer 3 (GPT-3) is a large language model released by OpenAI in 2020. Like its predecessor, GPT-2, it is a decoder-only transformer...
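"Decoder-only transformer", as used in the GPT-3 snippet, means each position may attend only to itself and earlier positions. A minimal sketch of the causal mask that enforces this (plain Python, illustrative only):

```python
# Causal (autoregressive) attention mask for a decoder-only transformer:
# a 1 at (row, col) means position `row` may attend to position `col`.
def causal_mask(n):
    """Lower-triangular mask: position t sees only positions 0..t."""
    return [[1 if col <= row else 0 for col in range(n)] for row in range(n)]

for row in causal_mask(4):
    print(row)
# [1, 0, 0, 0]
# [1, 1, 0, 0]
# [1, 1, 1, 0]
# [1, 1, 1, 1]
```

In practice the masked positions receive a large negative value before the softmax, so they contribute no attention weight.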
  • Grok (chatbot) (category Generative pre-trained transformers)
    Grok is a generative artificial intelligence chatbot developed by xAI. It was launched in November 2023 by Elon Musk as an initiative based on the large...
  • PaLM (category Generative pre-trained transformers)
    billion-parameter dense decoder-only transformer-based large language model (LLM) developed by Google AI. Researchers also trained smaller versions of PaLM (with...
  • GPTs are customized generative pre-trained transformers accessible through OpenAI's chatbot ChatGPT. They are derived from OpenAI's main GPT models, such...
  • largest and most capable LLMs are generative pre-trained transformers (GPTs), which are largely used in generative chatbots such as ChatGPT, Gemini or...
  • GPT-4o (category Generative pre-trained transformers)
    GPT-4o ("o" for "omni") is a multilingual, multimodal generative pre-trained transformer developed by OpenAI and released in May 2024. It can process and...
  • Llama (language model) (category Generative pre-trained transformers)
    laboratories around the world". Llama was trained on only publicly available information, and was trained at various model sizes, with the intention...
  • Gemini (language model) (category Generative pre-trained transformers)
    tablets. The latest version of Gemma, Gemma 3, is based on a decoder-only transformer architecture with grouped-query attention (GQA) and the SigLIP vision...
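Grouped-query attention (GQA), mentioned in the Gemma snippet, lets several query heads share a single key/value head, shrinking the KV cache. A shape-only sketch of the head assignment (the head counts here are illustrative, not Gemma's actual configuration):

```python
# Under GQA, n_q_heads query heads are partitioned into n_kv_heads groups,
# and every query head in a group reuses that group's key/value head.
def kv_head_for_query(q_head, n_q_heads, n_kv_heads):
    """Map a query-head index to the key/value head it shares under GQA."""
    group_size = n_q_heads // n_kv_heads  # query heads per KV head
    return q_head // group_size

# Example: 8 query heads sharing 2 KV heads -> groups of 4.
assignment = [kv_head_for_query(h, 8, 2) for h in range(8)]
print(assignment)  # [0, 0, 0, 0, 1, 1, 1, 1]
```

With n_kv_heads = n_q_heads this reduces to standard multi-head attention, and with n_kv_heads = 1 to multi-query attention.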
  • GPT-4.1 (category Generative pre-trained transformers)
    tools field when giving the model access to tools. The models are also trained to follow instructions more literally, making the model more steerable...
  • Microsoft Copilot (category Generative pre-trained transformers)
    Microsoft Copilot (or simply Copilot) is a generative artificial intelligence chatbot developed by Microsoft. Based on the GPT-4 series of large language...
  • Generative pre-trained transformer, a type of artificial intelligence language model ChatGPT, a chatbot developed by OpenAI, based on generative pre-trained...
  • Attention Is All You Need
    as multimodal generative AI. The paper's title is a reference to the song "All You Need Is Love" by the Beatles. The name "Transformer" was picked because...
  • DeepSeek (chatbot) (category Generative pre-trained transformers)
    DeepSeek is a generative artificial intelligence chatbot by the Chinese company DeepSeek. Released on 10 January 2025, DeepSeek-R1 surpassed ChatGPT as...
  • Qwen (category Generative pre-trained transformers)
    Qwen-VL series is a line of visual language models that combines a vision transformer with an LLM. Alibaba released Qwen2-VL with variants of 2 billion and...
  • community for a diverse set of AI development tasks. These models come pre-trained and are designed to excel in various Natural Language Processing (NLP)...
  • meaning), transformers (a deep learning architecture using an attention mechanism), and others. In 2019, generative pre-trained transformer (or "GPT")...
  • Google logo
    "G" visually connects with the gradient present in the logo of Google's generative artificial intelligence chatbot Gemini. "Information about the typeface...