• Sequence learning
    In cognitive psychology, sequence learning is inherent to human ability because it is an integrated part of conscious and nonconscious learning as well as activities...
    15 KB (1,974 words) - 21:14, 25 October 2023
  • Seq2seq (redirect from Sequence-to-sequence)
    recognition, and text summarization. Seq2seq uses sequence transformation: it turns one sequence into another sequence. One naturally wonders if the problem of...
    23 KB (2,946 words) - 03:12, 19 July 2025
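The Seq2seq entry above describes turning one sequence into another with an encoder–decoder pair. Below is a minimal PyTorch sketch of that idea; the layer sizes, the class name, and the teacher-forced forward pass are illustrative assumptions, not the configuration of the original 2014 system.

```python
# Minimal encoder–decoder (seq2seq) sketch in PyTorch.
# Vocabulary sizes, embedding/hidden widths, and teacher forcing are illustrative choices.
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    def __init__(self, src_vocab=1000, tgt_vocab=1000, emb=64, hidden=128):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb)
        self.encoder = nn.LSTM(emb, hidden, batch_first=True)
        self.decoder = nn.LSTM(emb, hidden, batch_first=True)
        self.out = nn.Linear(hidden, tgt_vocab)

    def forward(self, src, tgt):
        # Encode the source sequence into a fixed-size state (h, c).
        _, state = self.encoder(self.src_emb(src))
        # Decode the target sequence conditioned on that state (teacher forcing).
        dec_out, _ = self.decoder(self.tgt_emb(tgt), state)
        return self.out(dec_out)  # logits over the target vocabulary

# Toy usage: a batch of 2 source sequences of length 5, targets of length 6.
model = Seq2Seq()
src = torch.randint(0, 1000, (2, 5))
tgt = torch.randint(0, 1000, (2, 6))
print(model(src, tgt).shape)  # torch.Size([2, 6, 1000])
```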
  • Transformer (deep learning architecture)
    to overcome the vanishing gradient problem, allowing efficient learning of long-sequence modelling. One key innovation was the use of an attention mechanism...
    106 KB (13,130 words) - 14:54, 15 July 2025
  • Associative sequence learning (ASL) is a neuroscientific theory that attempts to explain how mirror neurons are able to match observed and performed actions...
    9 KB (1,235 words) - 11:31, 13 April 2025
  • accurate". Sutskever, Ilya; Vinyals, Oriol; Le, Quoc V. (2014). "Sequence to Sequence Learning with Neural Networks" (PDF). Electronic Proceedings of the Neural...
    90 KB (10,416 words) - 11:26, 18 July 2025
  • Attention Is All You Need
    Sutskever, Ilya; Vinyals, Oriol; Le, Quoc Viet (14 December 2014). "Sequence to sequence learning with neural networks". arXiv:1409.3215 [cs.CL]. [first version...
    15 KB (3,911 words) - 13:54, 9 July 2025
  • Deep learning
    Retrieved 2020-02-25. Sutskever, I.; Vinyals, O.; Le, Q. (2014). "Sequence to Sequence Learning with Neural Networks" (PDF). Proc. NIPS. arXiv:1409.3215....
    182 KB (17,994 words) - 00:54, 4 July 2025
  • Ilya Sutskever (category Machine learning researchers)
    Sutskever worked with Oriol Vinyals and Quoc Viet Le to create the sequence-to-sequence learning algorithm, and worked on TensorFlow. He is also one of the AlphaGo...
    27 KB (2,177 words) - 16:18, 27 June 2025
  • approach models reinforcement learning as a sequence modelling problem. Similar to Behavior Cloning, it trains a sequence model, such as a Transformer...
    13 KB (1,339 words) - 15:09, 2 June 2025
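The entry above frames reinforcement learning as sequence modelling: a trajectory is turned into a sequence a Transformer-like model can be trained on. A small sketch of the usual data preparation, computing a return-to-go for each step so the model sees (return-to-go, state, action) triples; the function name and the plain-tuple encoding are illustrative assumptions.

```python
# RL as sequence modelling: flatten a trajectory into (return-to-go, state, action)
# triples that a sequence model can be trained on. Encoding details are illustrative.
def to_sequence(states, actions, rewards):
    rtg, acc = [], 0.0
    for r in reversed(rewards):  # return-to-go: sum of rewards from this step onward
        acc += r
        rtg.append(acc)
    rtg.reverse()
    return list(zip(rtg, states, actions))

print(to_sequence(states=["s0", "s1", "s2"],
                  actions=["a0", "a1", "a2"],
                  rewards=[0.0, 0.0, 1.0]))
# [(1.0, 's0', 'a0'), (1.0, 's1', 'a1'), (1.0, 's2', 'a2')]
```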
  • Mamba is a deep learning architecture focused on sequence modeling. It was developed by researchers from Carnegie Mellon University and Princeton University...
    11 KB (1,159 words) - 19:42, 16 April 2025
  • learning. The topic has been studied in relation to real world systems (dynamic control systems), artificial grammar learning and sequence learning most...
    28 KB (3,673 words) - 18:19, 5 July 2025
    Generalization (learning) · Knowledge representation and reasoning · Memory · Memory Encoding · Merge (linguistics) · Method of loci · Mnemonic · Sequence learning · Tokens in...
    47 KB (6,098 words) - 22:12, 11 July 2025
  • Long short-term memory (category Deep learning)
    is its advantage over other RNNs, hidden Markov models, and other sequence learning methods. It aims to provide a short-term memory for RNN that can last...
    52 KB (5,822 words) - 10:08, 15 July 2025
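The LSTM entry above highlights a short-term memory that an RNN can carry across many steps. The NumPy sketch below runs one LSTM cell step to show how the gates update that memory; the parameter packing (gates stacked as i, f, o, g) is one common convention and every name is illustrative.

```python
# One LSTM cell step (NumPy): gates decide what to forget, write, and expose.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """x: input (d_in,); h, c: previous hidden and cell state (d,);
    W: (4d, d_in), U: (4d, d), b: (4d,) with gates stacked as [i, f, o, g]."""
    z = W @ x + U @ h + b
    d = h.shape[0]
    i, f, o = sigmoid(z[:d]), sigmoid(z[d:2*d]), sigmoid(z[2*d:3*d])
    g = np.tanh(z[3*d:])
    c_new = f * c + i * g       # forget part of the old memory, write the new candidate
    h_new = o * np.tanh(c_new)  # expose part of the cell state as the hidden state
    return h_new, c_new

d_in, d = 3, 4
rng = np.random.default_rng(0)
h, c = lstm_step(rng.normal(size=d_in), np.zeros(d), np.zeros(d),
                 rng.normal(size=(4 * d, d_in)), rng.normal(size=(4 * d, d)), np.zeros(4 * d))
print(h.shape, c.shape)  # (4,) (4,)
```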
  • close connection between machine learning and compression. A system that predicts the posterior probabilities of a sequence given its entire history can be...
    140 KB (15,562 words) - 06:08, 19 July 2025
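The entry above points to the close connection between prediction and compression: a model that assigns probability p to the next symbol given the full history can encode that symbol in about -log2(p) bits, for example with an arithmetic coder. A toy illustration with a made-up constant predictor:

```python
# Prediction vs. compression: total ideal code length of a sequence under a predictor.
import math

def ideal_code_length(symbols, predict):
    """predict(history) -> dict mapping each possible next symbol to its probability."""
    return sum(-math.log2(predict(symbols[:i])[s]) for i, s in enumerate(symbols))

# Toy predictor: 0.9 for 'a', 0.1 for 'b', regardless of history.
predict = lambda history: {"a": 0.9, "b": 0.1}
print(ideal_code_length("aaab", predict))  # ≈ 3.78 bits
```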
  • Attention (machine learning)
    machine learning, attention is a method that determines the importance of each component in a sequence relative to the other components in that sequence. In...
    35 KB (3,418 words) - 03:54, 9 July 2025
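Both the Transformer entry and this one describe attention as weighting each element of a sequence against the others. A NumPy sketch of scaled dot-product attention; the formulation is the standard one, while the shapes and names are illustrative.

```python
# Scaled dot-product attention sketch (NumPy).
import numpy as np

def attention(Q, K, V):
    """Q: (n_q, d) queries; K: (n_k, d) keys; V: (n_k, d_v) values."""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])          # similarity of each query to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the keys
    return weights @ V                               # weighted average of the values

rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(4, 8)), rng.normal(size=(6, 8)), rng.normal(size=(6, 8))
print(attention(Q, K, V).shape)  # (4, 8)
```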
  • researchers at Google. It learns to represent text as a sequence of vectors using self-supervised learning. It uses the encoder-only transformer architecture...
    32 KB (3,623 words) - 18:01, 18 July 2025
  • are frequently in the wrong sequence, errors that are similar to phonological mistakes made by young children when learning a language. As the bird ages...
    66 KB (8,038 words) - 08:34, 2 July 2025
    Procedural memory · Proximodistal trend · Sequence learning · Adams JA (June 1971). "A closed-loop theory of motor learning". J Mot Behav. 3 (2): 111–49. doi:10...
    28 KB (3,349 words) - 19:23, 26 June 2025
  • in-context learning is temporary. Training models to perform in-context learning can be viewed as a form of meta-learning, or "learning to learn". Self-consistency...
    40 KB (4,480 words) - 01:25, 20 July 2025
  • Neural network (machine learning)
    232, no. 2 (1993): 584–599. Amari SI (November 1972). "Learning Patterns and Pattern Sequences by Self-Organizing Nets of Threshold Elements". IEEE Transactions...
    168 KB (17,613 words) - 15:58, 16 July 2025
  • Reinforcement learning
    Reinforcement learning is one of the three basic machine learning paradigms, alongside supervised learning and unsupervised learning. Reinforcement learning differs...
    69 KB (8,200 words) - 18:16, 17 July 2025
  • to perform tasks typically associated with human intelligence, such as learning, reasoning, problem-solving, perception, and decision-making. It is a field...
    285 KB (29,076 words) - 16:53, 18 July 2025
  • In statistics and machine learning, ensemble methods use multiple learning algorithms to obtain better predictive performance than could be obtained from...
    53 KB (6,692 words) - 01:25, 12 July 2025
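The ensemble entry above says several learners are combined to obtain better predictions than any single one. A minimal majority-vote sketch; the stand-in classifiers are purely illustrative.

```python
# Majority-vote ensemble over a list of classifiers (callables returning a label).
from collections import Counter

def ensemble_predict(classifiers, x):
    votes = [clf(x) for clf in classifiers]
    return Counter(votes).most_common(1)[0][0]

clfs = [lambda x: x > 0, lambda x: x > 1, lambda x: x > -1]
print(ensemble_predict(clfs, 0.5))  # True (2 of the 3 classifiers vote True)
```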
  • In machine learning, sequence labeling is a type of pattern recognition task that involves the algorithmic assignment of a categorical label to each member...
    3 KB (506 words) - 19:50, 25 June 2025
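Sequence labeling assigns one categorical label to each member of the input sequence. A toy example with a BIO-style tag set; the rule and the entity list are purely illustrative.

```python
# Toy sequence labeler: one label per token (BIO-style tags for illustration).
def label(tokens, entity_words=frozenset({"Paris", "London"})):
    return ["B-LOC" if t in entity_words else "O" for t in tokens]

tokens = "She flew to Paris".split()
print(list(zip(tokens, label(tokens))))
# [('She', 'O'), ('flew', 'O'), ('to', 'O'), ('Paris', 'B-LOC')]
```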
  • accurate". Sutskever, Ilya; Vinyals, Oriol; Le, Quoc V. (2014). "Sequence to Sequence Learning with Neural Networks" (PDF). Electronic Proceedings of the Neural...
    85 KB (8,625 words) - 20:54, 10 June 2025
  • perceptron algorithm for learning linear classifiers with an inference algorithm (classically the Viterbi algorithm when used on sequence data) and can be described...
    6 KB (773 words) - 20:14, 1 February 2025
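The structured perceptron entry above pairs a linear model with an inference algorithm, classically Viterbi on sequence data. A NumPy sketch of Viterbi decoding over additive label scores; in a structured perceptron these score matrices would come from the learned feature weights.

```python
# Viterbi decoding: highest-scoring label sequence under additive per-position
# (emission) and label-to-label (transition) scores.
import numpy as np

def viterbi(emission, transition):
    """emission: (T, L) scores; transition: (L, L) scores. Returns a list of T labels."""
    T, L = emission.shape
    score = emission[0].copy()
    back = np.zeros((T, L), dtype=int)
    for t in range(1, T):
        cand = score[:, None] + transition + emission[t]  # cand[prev, next]
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0)
    path = [int(score.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

emission = np.array([[1.0, 0.0], [0.0, 2.0], [1.5, 0.0]])
transition = np.array([[0.5, -1.0], [-1.0, 0.5]])
print(viterbi(emission, transition))  # [0, 0, 0]
```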
  • myriad of learning paradigms, notably unsupervised learning, supervised learning, reinforcement learning, multimodal learning, and sequence learning. In addition...
    16 KB (2,200 words) - 09:30, 30 June 2025
  • pairwise dissimilarities such as categorical sequences. Decision trees are among the most popular machine learning algorithms given their intelligibility and...
    47 KB (6,542 words) - 15:35, 9 July 2025
  • between encoding and retrieval in the domain of sequence learning". Journal of Experimental Psychology: Learning, Memory, and Cognition. 32 (1): 118–130. doi:10...
    30 KB (3,921 words) - 07:45, 24 May 2025
  • Quoc V. Le (category Machine learning researchers)
    Sutskever, Ilya; Vinyals, Oriol; Le, Quoc V. (2014-12-14). "Sequence to Sequence Learning with Neural Networks". arXiv:1409.3215 [cs.CL]. Zoph, Barret;...
    10 KB (796 words) - 07:40, 10 June 2025