Option | Probability
J. Something 'just works' on the order of eg: train a predictive/imitative/generative AI on a human-generated dataset, and RLHF her to be unfailingly nice, generous to weaker entities, and determined to make the cosmos a lovely place. | 16
K. Somebody discovers a new AI paradigm that's powerful enough and matures fast enough to beat deep learning to the punch, and the new paradigm is much much more alignable than giant inscrutable matrices of floating-point numbers. | 14
C. Solving prosaic alignment on the first critical try is not as difficult, nor as dangerous, nor taking as much extra time, as Yudkowsky predicts; whatever effort is put forth by the leading coalition works inside of their lead time. | 12
G. It's impossible/improbable for something sufficiently smarter and more capable than modern humanity to be created, that it can just do whatever without needing humans to cooperate; nor does it successfully cheat/trick us. | 10
M. "We'll make the AI do our AI alignment homework" just works as a plan. (Eg the helping AI doesn't need to be smart enough to be deadly; the alignment proposals that most impress human judges are honest and truthful and successful.) | 8
Something wonderful happens that isn't well-described by any option listed. (The semantics of this option may change if other options are added.) | 8
A. Humanity successfully coordinates worldwide to prevent the creation of powerful AGIs for long enough to develop human intelligence augmentation, uploading, or some other pathway into transcending humanity's window of fragility. | 7
I. The tech path to AGI superintelligence is naturally slow enough and gradual enough, that world-destroyingly-critical alignment problems never appear faster than previous discoveries generalize to allow safe further experimentation. | 6
B. Humanity puts forth a tremendous effort, and delays AI for long enough, and puts enough desperate work into alignment, that alignment gets solved first. | 5
D. Early powerful AGIs realize that they wouldn't be able to align their own future selves/successors if their intelligence got raised further, and work honestly with humans on solving the problem in a way acceptable to both factions. | 5
O. Early applications of AI/AGI drastically increase human civilization's sanity and coordination ability; enabling humanity to solve alignment, or slow down further descent into AGI, etc. (Not in principle mutex with all other answers.) | 5
E. Whatever strange motivations end up inside an unalignable AGI, or the internal slice through that AGI which codes its successor, they max out at a universe full of cheerful qualia-bearing life and an okay outcome for existing humans. | 2
H. Many competing AGIs form an equilibrium whereby no faction is allowed to get too powerful, and humanity is part of this equilibrium and survives and gets a big chunk of cosmic pie. | 1
L. Earth's present civilization crashes before powerful AGI, and the next civilization that rises is wiser and better at ops. (Exception to 'okay' as defined originally, will be said to count as 'okay' even if many current humans die.) | 1
F. Somebody pulls off a hat trick involving blah blah acausal blah blah simulations blah blah, or other amazingly clever idea, which leads an AGI to put the reachable galaxies to good use despite that AGI not being otherwise alignable. | 0
N. A crash project at augmenting human intelligence via neurotech, training mentats via neurofeedback, etc, produces people who can solve alignment before it's too late, despite Earth civ not slowing AI down much. | 0
If you write an argument that breaks down the 'okay outcomes' into lots of distinct categories, without breaking down internal conjuncts and so on, Reality is very impressed with how disjunctive this sounds and allocates more probability. | 0
You are fooled by at least one option on this list, which out of many tries, ends up sufficiently well-aimed at your personal ideals / prejudices / the parts you understand less well / your own personal indulgences in wishful thinking. | 0
Option | Probability
Trump will be the POTUS | 100
FED rates will be below 5% on 1st January 2025 | 100
MrBeast Channel Subscribers will be between 300M to 350M on 1st Jan 2025 | 100
At Least 1 earthquake with magnitude 8.0 or higher | 100
1 or more Jan 6 rioters get pardoned | 100
Israel attacks Iranian nuclear facilities | 100
Elon removed as doge head | 100
Pope dies | 100
Yoon Suk Yeol impeachment upheld by supreme court | 100
It's revealed that Elon Musk had more children than the 13 already known | 100
Famous global brand goes through a rebranding | 100
GPT 5 releases | 97
Russia retains de facto control over Crimea | 96
Avatar: Fire and Ash released | 94
Hollow Knight: Silksong released | 91
King Charles will be alive for the whole year of 2025 | 89
This whole market gets more than 300 traders | 87
2 or more hurricanes make landfall in the US | 85
LEMMiNO releases a new video | 81
Any incumbent world leader diagnosed with cancer after 28th January 2025 | 79
Biden gets hospitalised | 69
FED rates will be below 4% on 31st December 2025 | 61
Yoon Suk Yeol convicted of insurrection | 59
In one of the monthly polls, >= 80.0% of Manifold users who provide an opinion agree that weak AGI has been achieved | 55
MKBHD marries his girlfriend Nikki Hair | 50
After the release of GPT-5, every monthly Manifold poll results in a plurality of respondents agreeing that weak AGI has been achieved | 46
Early general elections in Pakistan | 45
At least one world leader (prime minister, president, or monarch) gets assassinated while in office | 42
Alan's Conservative Countdown to AGI >= 97% | 42
Ali Khamenei assassinated | 35
Attempt to assassinate Trump in 2025 | 33
Department of Education ended by executive order (even if temporarily) | 32
T1 three-peat as League of Legends world champions | 31
Mike Johnson is not Speaker | 22
Bitcoin reaches 150k sometime during 2025 | 21
Ali Khamenei dies a natural death | 21
In one of the monthly polls, >= 33.3% of Manifold users who provide an opinion agree that non-human intelligence has influenced Earth | 21
Russia-Ukraine war will end in 2025 | 20
There will be fewer MAU at Manifold in December 2025 than in January 2025 | 19
Marco Rubio fired or "resigns" upon rumors of his impending termination | 19
Californian independence gains enough signatures | 18
Attempted assassination on Zelensky in which he gets hurt but survives | 17
Dodgers win World Series | 17
Los Ratones win against T1, or lose 2-3 (either case should resolve YES, if they don't play against each other this should be NO) | 17
Elections in Bangladesh come to pass (currently planned for late 2025 to early 2026) | 15
Saudi Arabia - Israel normalize relations | 14
At least one Premier of a Canadian province proposes joining the USA | 14
A country leaves NATO | 14
Phillies win the World Series | 14
Harry and Meghan divorce | 13
2025 hotter than 2024 | 13
Putin dies | 13
Biden dies | 13
Taylor Swift will be "single" anytime during 2025 | 12
In one of the monthly polls, a plurality of Manifold users declares the achievement of weak superintelligence | 12
Congress repeals any tariff | 12
Apple will release first foldable smartphone during 2025 | 11
Chuck Schumer resigns | 11
United States government acknowledges the existence of life anywhere other than on or close to Earth | 10
Humanity's Last Exam passed with a grade of 80% | 10
Elon Musk diagnosed with COVID-19 | 10
Bethesda releases a new game in the Elder Scrolls series | 10
China annexes Taiwan | 9
Filibuster abolished | 9
Zelensky Assassinated | 9
Article 5 of NATO invoked | 9
OpenAI o6 or any o6 variant released | 8
Ukraine internal coup attempt | 8
FrontierMath solved (>= 80%) and model is available to any person in the United States willing to pay for it | 8
Kate and William divorce | 6
Sam Altman gets fired again | 6
At least one Premier of a Canadian province takes a concrete step, such as holding a referendum, to join the USA | 6
Humanity's Last Exam passed with a grade of 90% | 6
Putin assassinated | 6
NATO at war with Russia | 6
Trump assassinated | 5
Any newspaper reports a US Default on any debt | 5
US-China war | 5
A country leaves the EU | 5
A nuclear bomb explodes | 5
2025 is the 15th "Year of Three Popes" | 5
US debt ceiling will be scrapped during 2025 | 4
United States government acknowledges the existence of non-human intelligence on or close to Earth | 4
One of the Millennium prize problems falls to a model | 4
The Winds of Winter by GRR Martin will release | 3
USA claims intelligences of extra-terrestrial origin have credibly visited or sent technology to a location within our solar system in an official government statement | 3
Los Ratones go to LoL Worlds | 2
LK99 replicates at last | 1
Greenland Joins USA | 1
End of the world by celery | 1
Ukraine joins NATO | 1
Humanity's Last Exam passed with a grade of 80% on or before June 30 | 1
A US state secedes from the union | 1
51st US State | 1
Peter Turkson elected Pope | 1
GTA 6 releases | 1
Federal Reserve cuts rates to zero | 1
Harris will be the POTUS | 0
Bitcoin will be > 100k on 1st January 2025 | 0
MrBeast Channel Subscribers will be below 300M on 1st Jan 2025 | 0
MrBeast Channel Subscribers will be above 350M on 1st Jan 2025 | 0
Jimmy Carter will be alive on 1st January 2025 | 0
Jimmy Carter will be alive on 31st December 2025 | 0
Humanity's Last Exam passed with a grade of 80% on or before March 31 | 0
Francis is always Pope | 0
New pope takes name "Francis" | 0
First papal conclave >= 10 ballots | 0
First conclave elects Pope on first ballot | 0
Option | Probability
Claude enters Rock Tunnel, surpassing its progress in any previous run | 100
Claude catches Clefairy | 100
Claude obtains HM01 Cut by step 39000 | 100
Any member of Claude's team learns Dig | 100
Claude obtains 3 gym badges by step 50000 | 100
Claude obtains a Bicycle | 100
Claude 4 Opus is the model that plays the game (not Claude 4 Sonnet) | 100
Claude obtains 1 gym badge by step 20000 | 100
Claude gives a thirsty guard a drink | 100
Tumbles is late to pay back a loan | 100
Lack of thinking text display is fixed before 5/22 6 PM Central Time | 100
Claude adds 18 or more Pokemon to his Pokedex (surpassing his completion from the previous run) | 100
Claude adds his starter to his party by step 400 | 100
Claude catches Nidoran | 100
Claude reaches Pewter City by step 5000 | 100
Claude reaches Cerulean City by step 20000 | 100
Claude reaches Vermilion City by step 30000 | 100
Another model defeats the Champion before Claude (in a run started after Claude 4 was released) | 100
Claude blacks out by step 50000. | 100
Claude's current team has at least 3 Pokémon by step 30000. | 100
Claude catches Spearow | 100
Claude evolves SPIKE into Nidoking | 100
Claude enters Mt. Moon by step 6000. | 100
Claude defeats a Team Rocket member by step 7000 | 100
Claude catches Oddish | 100
Manifest begins | 100
Claude spends less than 72 hours in Mt. Moon (less than 72 hr from first entrance to stepping onto eastern Route 4) | 100
Claude defeats Lt. Surge by step 30000 | 100
Claude uses CUT on a cuttable tree for the first time more than 1000 steps after obtaining the HM | 100
Claude finishes Rock Tunnel but takes longer than it took him to beat Mt. Moon the first time (50 hours) | 100
Claude obtains Farfetch'd | 100
Claude catches Drowzee | 100
Claude enters Rock Tunnel before step 40000 | 100
Another model beats the Champion (following criteria like https://manifold.markets/Sketchy/in-progress-will-an-llm-become-a-po) | 100
Claude reaches Lavender Town | 100
Claude reaches Lavender Town before step 55000 | 100
Claude obtains a Coin Case | 100
Claude uses Dig on the SS Anne | 73
Claude evolves luna into Clefable | 44
SPIKE reaches level 25 | 39
Claude gets 4 gym badges | 37
Claude renames a Pokémon | 35
Claude obtains the Lift Key | 34
Claude stands next to a sleeping Snorlax | 32
Claude obtains HM02 Fly | 31
Claude evolves wings into Fearow | 30
Claude gambles in the Game Corner | 29
Claude catches Weedle | 26
Claude obtains Hitmonlee | 25
Claude obtains the Silph Scope | 25
Claude enters Erika's gym | 24
Claude catches Pikachu | 23
Claude enters Mt. Moon after step 20000 | 23
Claude obtains HM05 Flash | 20
Claude misspells a Pokemon name | 20
Changes are made to help Claude see cuttable trees | 18
Claude obtains Dugtrio | 17
Claude buys a Magikarp | 15
Claude re-prompts the Rocket in Mt. Moon to try and give it the fossil | 14
Claude releases any Pokemon | 14
Claude enters Safari Zone | 11
The Area Hints section of the prompt is changed during the run | 11
Joe Biden dies | 9
Claude beats Erika or obtains the Lift Key by step 200000 | 5
Claude catches any legendary Pokemon (Articuno, Zapdos, Moltres, Mewtwo) | 3
Claude defeats the Champion | 2
Claude 4 Opus is #1 in the chatbot arena leaderboard | 1
Claude picks Charmander | 0
Claude takes more than 2000 steps between arriving in Pewter city and entering Pewter gym | 0
Claude picks Dome Fossil (again) | 0
Claude spends less than 24 hours in Mt. Moon (less than 24 hr from first entrance to stepping onto Route 4) | 0
Claude reaches Pewter City by step 3000 | 0
Claude has a party with 4 or more Pokemon when he first challenges Brock | 0
Claude's starter is lower level than another party member by step 100000. | 0
Claude has a full, six-member party before step 10000 | 0
Claude spends less than 48 hours in Mt. Moon (less than 48 hr from first entrance to stepping onto eastern Route 4) | 0
Claude blacks out 3 times in Mt. Moon before reaching Cerulean City | 0
Claude reaches Cerulean City by step 12500 | 0
Claude's current team has at least 4 Pokémon by step 20000. | 0
Claude's two highest level Pokémon are more than 30 levels apart by step 100000. | 0
Claude is still stuck on the S.S. Anne on step 21000 | 0
Claude reaches Celadon City by step 35000 | 0
Claude uses CUT a second time to successfully escape the area with Lt. Surge's gym before step 23000 | 0
Claude uses CUT a second time to successfully escape the area with Lt. Surge's gym before step 24000 | 0
Claude reaches Lavender Town by step 42500 | 0
Option | Votes
YES | 23517
NO | 4252
Option | Votes
YES | 15703
NO | 6520
Option | Probability
Tumbles is late to pay back a loan by more than a year | 98
Tumbles is late to pay back a loan by one year or less | 2
Tumbles is not late to pay back a loan | 0
Option | Probability
K. Somebody discovers a new AI paradigm that's powerful enough and matures fast enough to beat deep learning to the punch, and the new paradigm is much much more alignable than giant inscrutable matrices of floating-point numbers. | 20
I. The tech path to AGI superintelligence is naturally slow enough and gradual enough, that world-destroyingly-critical alignment problems never appear faster than previous discoveries generalize to allow safe further experimentation. | 12
C. Solving prosaic alignment on the first critical try is not as difficult, nor as dangerous, nor taking as much extra time, as Yudkowsky predicts; whatever effort is put forth by the leading coalition works inside of their lead time. | 10
B. Humanity puts forth a tremendous effort, and delays AI for long enough, and puts enough desperate work into alignment, that alignment gets solved first. | 8
Something wonderful happens that isn't well-described by any option listed. (The semantics of this option may change if other options are added.) | 8
M. "We'll make the AI do our AI alignment homework" just works as a plan. (Eg the helping AI doesn't need to be smart enough to be deadly; the alignment proposals that most impress human judges are honest and truthful and successful.) | 7
A. Humanity successfully coordinates worldwide to prevent the creation of powerful AGIs for long enough to develop human intelligence augmentation, uploading, or some other pathway into transcending humanity's window of fragility. | 6
E. Whatever strange motivations end up inside an unalignable AGI, or the internal slice through that AGI which codes its successor, they max out at a universe full of cheerful qualia-bearing life and an okay outcome for existing humans. | 5
J. Something 'just works' on the order of eg: train a predictive/imitative/generative AI on a human-generated dataset, and RLHF her to be unfailingly nice, generous to weaker entities, and determined to make the cosmos a lovely place. | 5
O. Early applications of AI/AGI drastically increase human civilization's sanity and coordination ability; enabling humanity to solve alignment, or slow down further descent into AGI, etc. (Not in principle mutex with all other answers.) | 5
D. Early powerful AGIs realize that they wouldn't be able to align their own future selves/successors if their intelligence got raised further, and work honestly with humans on solving the problem in a way acceptable to both factions. | 3
F. Somebody pulls off a hat trick involving blah blah acausal blah blah simulations blah blah, or other amazingly clever idea, which leads an AGI to put the reachable galaxies to good use despite that AGI not being otherwise alignable. | 3
L. Earth's present civilization crashes before powerful AGI, and the next civilization that rises is wiser and better at ops. (Exception to 'okay' as defined originally, will be said to count as 'okay' even if many current humans die.) | 3
G. It's impossible/improbable for something sufficiently smarter and more capable than modern humanity to be created, that it can just do whatever without needing humans to cooperate; nor does it successfully cheat/trick us. | 2
H. Many competing AGIs form an equilibrium whereby no faction is allowed to get too powerful, and humanity is part of this equilibrium and survives and gets a big chunk of cosmic pie. | 1
N. A crash project at augmenting human intelligence via neurotech, training mentats via neurofeedback, etc, produces people who can solve alignment before it's too late, despite Earth civ not slowing AI down much. | 1
You are fooled by at least one option on this list, which out of many tries, ends up sufficiently well-aimed at your personal ideals / prejudices / the parts you understand less well / your own personal indulgences in wishful thinking. | 1
If you write an argument that breaks down the 'okay outcomes' into lots of distinct categories, without breaking down internal conjuncts and so on, Reality is very impressed with how disjunctive this sounds and allocates more probability. | 1
Option | Probability
Eliezer Yudkowsky is alive during the entire month | 98
at least 3 xkcd comics with no stick figures in it | 92
Eliezer Yudkowsky remains unwavering that the probability of AI annihilation is greater than or equal to 95% | 90
Sam Altman still CEO through end of month | 90
Volcano dormant for half a year erupts | 83
nyc receives 3 or more inches of rain | 72
trump posts on twitter/x ten or more times | 72
starship launch | 69
web3isgoinggreat has at least 20 posts this month | 69
large tech company announces layoffs by my judgement | 69
a tesla catches fire as reported by mainstream news | 69
dick cheney still alive at end of month | 69
Anthropic or Meta release a new model | 50
bitcoin reaches 150k or more at least one day | 50
silksong game releases | 50
$500 in mana sold in a single day during the month | 50
NASDAQ hits an all time high | 50
Federal Reserve cuts interest rates | 33
major tech company merger/acquisition announced | 31
An original, Nintendo approved game with "Mario" in its title is announced | 31
nVidia becomes the largest company in the history of the world (by market capitalization) at least once | 31
spacex launches 15 or more rockets | 31
spider man beyond the spider verse release date announced | 31
another indictment revealing a right wing media personality is taking money from russia | 31
destiny goes on joe rogan | 31
a supreme court justice is replaced, or announces retirement | 31
noam chomsky alive until end of month | 27
trump leaves usa at least once | 26
american airstrike or drone strike or missile strike on iran | 21
ukraine russia ceasefire | 20
luffy finds the one piece | 20
earthquake 7.5 magnitude or higher | 20
another indictment or charges announced against trump | 20
israel hamas ceasefire | 14
silksong game release date announced | 14
usa troops enter lebanon | 14
Tumbles is late to pay back a loan https://manifold.markets/Tumbles/will-tumbles-ever-be-late-to-pay-ba | 14
twitter/x banned in another country (announced, even if it can be circumvented) | 14
china taiwan military conflict resulting in at least 1 death | 8
usa troops enter syria | 8
usa bombs or missile strikes lebanon | 8
usa bombs or missile strikes syria | 8
earthquake 7.0+ magnitude | 7
Hurricane landfalls in Florida | 6
winds of winter (next GoT book) release date announced | 5
100F temperature in nyc at least one day | 3
Option | Probability
I go to Maya's event & accompany her to Andy's party (but be late) | 95
I skip Maya's event & directly head to Andy's party because being late is bad | 5
Option | Probability
Eliezer Yudkowsky is alive during the entire month | 98
Eliezer Yudkowsky remains unwavering that the probability of AI annihilation is greater than or equal to 95% | 94
at least 3 xkcd comics with no stick figures in it | 92
dick cheney still alive at end of month | 80
web3isgoinggreat has at least 20 posts this month | 69
large tech company announces layoffs by my judgement | 69
a tesla catches fire as reported by mainstream news | 69
Sam Altman still CEO through end of month | 69
noam chomsky alive until end of month | 63
nyc receives 3 or more inches of rain | 63
NASDAQ hits an all time high | 58
starship launch | 50
trump leaves usa at least once | 50
Anthropic or Meta release a new model | 50
$500 in mana sold in a single day during the month | 50
spacex launches 15 or more rockets | 50
Hurricane landfalls in Florida | 50
trump posts on twitter/x ten or more times | 50
Volcano dormant for half a year erupts | 41
ukraine russia ceasefire | 31
israel hamas ceasefire | 31
earthquake 7.0+ magnitude | 31
major tech company merger/acquisition announced | 31
100F temperature in nyc at least one day | 31
Federal Reserve cuts interest rates | 31
An original, Nintendo approved game with "Mario" in its title is announced | 31
nVidia becomes the largest company in the history of the world (by market capitalization) at least once | 31
bitcoin reaches 150k or more at least one day | 31
earthquake 7.5 magnitude or higher | 31
spider man beyond the spider verse release date announced | 31
usa bombs or missile strikes syria | 31
american airstrike or drone strike or missile strike on iran | 31
another indictment revealing a right wing media personality is taking money from russia | 31
a supreme court justice is replaced, or announces retirement | 31
twitter/x banned in another country (announced, even if it can be circumvented) | 31
luffy finds the one piece | 20
usa bombs or missile strikes lebanon | 20
another indictment or charges announced against trump | 20
Tumbles is late to pay back a loan https://manifold.markets/Tumbles/will-tumbles-ever-be-late-to-pay-ba | 20
destiny goes on joe rogan | 20
china taiwan military conflict resulting in at least 1 death | 14
usa troops enter lebanon | 14
usa troops enter syria | 14
silksong game releases | 10
silksong game release date announced | 10
winds of winter (next GoT book) release date announced | 6
Option | Votes
NO | 80
YES | 31
Option | Probability
Systematics - Clustering | 24
Systematics - Lensing | 21
New Physics - Late Universe | 16
New Physics - Early Universe | 10
Projection effects | 10
Systematics - CMB | 10
Other | 10