• Eliezer Yudkowsky
    Eliezer S. Yudkowsky (/ˌɛliˈɛzər ˌjʌdˈkaʊski/ EH-lee-EH-zər YUD-KOW-skee; born September 11, 1979) is an American artificial intelligence researcher and...
    23 KB (1,835 words) - 06:41, 6 May 2024
  • conjecture or speculation by many LessWrong users, LessWrong co-founder Eliezer Yudkowsky reported users who described symptoms such as nightmares and mental...
    32 KB (3,243 words) - 20:48, 15 May 2024
  • Methods of Rationality (HPMOR) is a work of Harry Potter fan fiction by Eliezer Yudkowsky published on FanFiction.Net as a serial from February 28, 2010, to...
    27 KB (2,757 words) - 11:01, 12 May 2024
  • design and on predicting the rate of technology development. In 2000, Eliezer Yudkowsky founded the Singularity Institute for Artificial Intelligence with...
    16 KB (1,138 words) - 09:01, 21 December 2023
  • by Machine Intelligence Research Institute, founded by Eliezer Yudkowsky. In 2007, Yudkowsky suggested that many of the varied definitions that have...
    112 KB (12,050 words) - 13:08, 30 April 2024
  • LessWrong
    intelligence researcher Eliezer Yudkowsky and economist Robin Hanson as the principal contributors. In February 2009, Yudkowsky's posts were used as the...
    16 KB (1,254 words) - 11:43, 24 April 2024
  • to change their behavior, or block their attempts at interference. Eliezer Yudkowsky illustrates such instrumental convergence as follows: "The AI does...
    24 KB (2,948 words) - 20:09, 13 May 2024
  • "Pascal's mugging" to refer to this problem was originally coined by Eliezer Yudkowsky in the LessWrong forum. Philosopher Nick Bostrom later elaborated...
    12 KB (1,566 words) - 20:08, 28 April 2024
  • AI takeover
    with a given ethical framework but not "common sense". According to Eliezer Yudkowsky, there is little reason to suppose that an artificially designed mind...
    39 KB (4,258 words) - 00:03, 14 May 2024
  • and ensuring it is adequately constrained. The term was coined by Eliezer Yudkowsky, who is best known for popularizing the idea, to discuss superintelligent...
    24 KB (2,710 words) - 14:18, 7 February 2024
  • Russell Jaan Tallinn Max Tegmark Frank Wilczek Roman Yampolskiy Eliezer Yudkowsky Other Statement on AI risk of extinction Human Compatible Open letter...
    18 KB (1,586 words) - 17:18, 13 May 2024
  • Institute. It was started in 2006 at Stanford University by Ray Kurzweil, Eliezer Yudkowsky, and Peter Thiel, and the subsequent summits in 2007, 2008, 2009,...
    7 KB (670 words) - 23:33, 14 October 2023
  • algorithms to use in machines. For simple decisions, Nick Bostrom and Eliezer Yudkowsky have argued that decision trees (such as ID3) are more transparent...
    129 KB (13,861 words) - 00:47, 9 May 2024
  • an arbitrary goal." While the game often takes narrative license, Eliezer Yudkowsky of the Machine Intelligence Research Institute argues that the core...
    15 KB (1,673 words) - 09:00, 4 December 2023
  • University Press. ISBN 9780199678112. "The AI-Box Experiment: – Eliezer S. Yudkowsky". www.yudkowsky.net. Retrieved 2022-09-19. Armstrong, Stuart; Sandberg, Anders;...
    24 KB (3,055 words) - 13:03, 14 April 2024
  • rational choice, both theories are incorrect. FDT was first proposed by Eliezer Yudkowsky and Nate Soares in a 2017 research paper supported by the Machine...
    21 KB (2,953 words) - 12:05, 12 April 2024
  • professor, political activist, and author Eliezer Williams (1754–1820), Welsh clergyman and genealogist Eliezer Yudkowsky (born 1979), American decision theorist...
    4 KB (500 words) - 02:35, 11 April 2024
  • beginning to minimize risks and to make choices that benefit humans. Eliezer Yudkowsky, who coined the term, argues that developing friendly AI should be...
    217 KB (22,027 words) - 21:09, 17 May 2024
  • Rothblatt Anders Sandberg Peter Thiel Edward O. Thorp Natasha Vita-More Eliezer Yudkowsky James Bedford, 1967 Dick Clair, 1988 L. Stephen Coles, 2014 Peter...
    11 KB (706 words) - 09:17, 14 April 2024
  • Manifold (prediction market)
    California. Attendees included Nate Silver, Robin Hanson, Richard Hanania, Eliezer Yudkowsky, Robert Miles, and Destiny. Manifold is a reputation-based prediction...
    6 KB (453 words) - 17:12, 23 April 2024
  • catastrophic and existential risks'". CNN Business. Retrieved 20 July 2023. Yudkowsky, Eliezer (2008). "Artificial Intelligence as a Positive and Negative Factor...
    122 KB (12,714 words) - 05:54, 10 May 2024
  • Russell Jaan Tallinn Max Tegmark Frank Wilczek Roman Yampolskiy Eliezer Yudkowsky Other Statement on AI risk of extinction Human Compatible Open letter...
    79 KB (9,363 words) - 00:44, 4 May 2024
  • as a proxy for how powerful an AI is, and thus as a threshold. Eliezer Yudkowsky wrote that the letter "doesn't go far enough" and argued that it should...
    6 KB (667 words) - 22:29, 13 May 2024
  • Perenelle Flamel
    group of vampires. In Harry Potter and the Methods of Rationality by Eliezer Yudkowsky, Nicolas Flamel is simply a disguise of Perenelle's that she creates...
    5 KB (570 words) - 21:14, 22 April 2024
  • originality. Readership doubled when it was recommended by author Eliezer Yudkowsky on his website while the story was in its final months. Critics favorably...
    37 KB (4,206 words) - 11:20, 12 May 2024
  • Future of Humanity Institute
    Moskovitz Yew-Kwang Ng Toby Ord Derek Parfit Peter Singer Cari Tuna Eliezer Yudkowsky Organizations 80,000 Hours Against Malaria Foundation All-Party Parliamentary...
    17 KB (1,528 words) - 09:11, 15 May 2024
  • coherent ideology in 2000, when artificial intelligence (AI) researcher Eliezer Yudkowsky wrote The Singularitarian Principles, in which he stated that a Singularitarian...
    15 KB (1,779 words) - 22:25, 1 April 2024
  • Russell Jaan Tallinn Max Tegmark Frank Wilczek Roman Yampolskiy Eliezer Yudkowsky Other Statement on AI risk of extinction Human Compatible Open letter...
    12 KB (1,133 words) - 03:28, 14 May 2024
  • as many people would die, and our entire future may be destroyed). Eliezer Yudkowsky previously made similar remarks regarding the effect of scope neglect...
    5 KB (562 words) - 14:31, 24 February 2024
  • come in many forms or variations. The term "Seed AI" was coined by Eliezer Yudkowsky. The concept begins with a hypothetical "seed improver", an initial...
    10 KB (1,180 words) - 14:52, 9 May 2024