Option — Probability (%)
J. Something 'just works' on the order of eg: train a predictive/imitative/generative AI on a human-generated dataset, and RLHF her to be unfailingly nice, generous to weaker entities, and determined to make the cosmos a lovely place. — 16
K. Somebody discovers a new AI paradigm that's powerful enough and matures fast enough to beat deep learning to the punch, and the new paradigm is much much more alignable than giant inscrutable matrices of floating-point numbers. — 14
C. Solving prosaic alignment on the first critical try is not as difficult, nor as dangerous, nor taking as much extra time, as Yudkowsky predicts; whatever effort is put forth by the leading coalition works inside of their lead time. — 12
G. It's impossible/improbable for something sufficiently smarter and more capable than modern humanity to be created, that it can just do whatever without needing humans to cooperate; nor does it successfully cheat/trick us. — 10
M. "We'll make the AI do our AI alignment homework" just works as a plan. (Eg the helping AI doesn't need to be smart enough to be deadly; the alignment proposals that most impress human judges are honest and truthful and successful.) — 8
Something wonderful happens that isn't well-described by any option listed. (The semantics of this option may change if other options are added.) — 8
A. Humanity successfully coordinates worldwide to prevent the creation of powerful AGIs for long enough to develop human intelligence augmentation, uploading, or some other pathway into transcending humanity's window of fragility. — 7
I. The tech path to AGI superintelligence is naturally slow enough and gradual enough, that world-destroyingly-critical alignment problems never appear faster than previous discoveries generalize to allow safe further experimentation. — 6
B. Humanity puts forth a tremendous effort, and delays AI for long enough, and puts enough desperate work into alignment, that alignment gets solved first. — 5
D. Early powerful AGIs realize that they wouldn't be able to align their own future selves/successors if their intelligence got raised further, and work honestly with humans on solving the problem in a way acceptable to both factions. — 5
O. Early applications of AI/AGI drastically increase human civilization's sanity and coordination ability; enabling humanity to solve alignment, or slow down further descent into AGI, etc. (Not in principle mutex with all other answers.) — 5
E. Whatever strange motivations end up inside an unalignable AGI, or the internal slice through that AGI which codes its successor, they max out at a universe full of cheerful qualia-bearing life and an okay outcome for existing humans. — 2
H. Many competing AGIs form an equilibrium whereby no faction is allowed to get too powerful, and humanity is part of this equilibrium and survives and gets a big chunk of cosmic pie. — 1
L. Earth's present civilization crashes before powerful AGI, and the next civilization that rises is wiser and better at ops. (Exception to 'okay' as defined originally, will be said to count as 'okay' even if many current humans die.) — 1
F. Somebody pulls off a hat trick involving blah blah acausal blah blah simulations blah blah, or other amazingly clever idea, which leads an AGI to put the reachable galaxies to good use despite that AGI not being otherwise alignable. — 0
N. A crash project at augmenting human intelligence via neurotech, training mentats via neurofeedback, etc, produces people who can solve alignment before it's too late, despite Earth civ not slowing AI down much. — 0
If you write an argument that breaks down the 'okay outcomes' into lots of distinct categories, without breaking down internal conjuncts and so on, Reality is very impressed with how disjunctive this sounds and allocates more probability. — 0
You are fooled by at least one option on this list, which out of many tries, ends up sufficiently well-aimed at your personal ideals / prejudices / the parts you understand less well / your own personal indulgences in wishful thinking. — 0
Option — Probability (%)
Trump will be the POTUS — 100
FED rates will be below 5% on 1st January 2025 — 100
MrBeast Channel Subscribers will be between 300M to 350M on 1st Jan 2025 — 100
At least 1 earthquake with magnitude 8.0 or higher — 100
1 or more Jan 6 rioters get pardoned — 100
Israel attacks Iranian nuclear facilities — 100
Elon removed as DOGE head — 100
Pope dies — 100
Yoon Suk Yeol impeachment upheld by supreme court — 100
It's revealed that Elon Musk had more children than the 13 already known — 100
Famous global brand goes through a rebranding — 100
GPT 5 releases — 97
Russia retains de facto control over Crimea — 96
Avatar: Fire and Ash released — 94
Hollow Knight: Silksong released — 91
King Charles will be alive for the whole year of 2025 — 89
This whole market gets more than 300 traders — 87
2 or more hurricanes make landfall in the US — 85
LEMMiNO releases a new video — 81
Any incumbent world leader diagnosed with cancer after 28th January 2025 — 79
Biden gets hospitalised — 69
FED rates will be below 4% on 31st December 2025 — 61
Yoon Suk Yeol convicted of insurrection — 59
In one of the monthly polls, >= 80.0% of Manifold users who provide an opinion agree that weak AGI has been achieved — 55
MKBHD marries his girlfriend Nikki Hair — 50
After the release of GPT-5, every monthly Manifold poll results in a plurality of respondents agreeing that weak AGI has been achieved — 46
Early general elections in Pakistan — 45
At least one world leader (prime minister, president, or monarch) gets assassinated while in office — 42
Alan's Conservative Countdown to AGI >= 97% — 42
Ali Khamenei assassinated — 35
Attempted assassination of Trump in 2025 — 33
Department of Education ended by executive order (even if temporarily) — 32
T1 three-peat as League of Legends world champions — 31
Mike Johnson is not Speaker — 22
Bitcoin reaches 150k sometime during 2025 — 21
Ali Khamenei dies a natural death — 21
In one of the monthly polls, >= 33.3% of Manifold users who provide an opinion agree that non-human intelligence has influenced Earth — 21
Russia-Ukraine war will end in 2025 — 20
Manifold will have fewer MAU in December 2025 than in January 2025 — 19
Marco Rubio fired or "resigns" upon rumors of his impending termination — 19
Californian independence gains enough signatures — 18
Attempted assassination on Zelensky in which he gets hurt but survives — 17
Dodgers win World Series — 17
Los Ratones win against T1, or lose 2-3 (either case should resolve YES, if they don't play against each other this should be NO) — 17
Elections in Bangladesh come to pass (currently planned for late 2025 to early 2026) — 15
Saudi Arabia - Israel normalize relations — 14
At least one Premier of a Canadian province proposes joining the USA — 14
A country leaves NATO — 14
Phillies win the World Series — 14
Harry and Meghan divorce — 13
2025 hotter than 2024 — 13
Putin dies — 13
Biden dies — 13
Taylor Swift will be "single" anytime during 2025 — 12
In one of the monthly polls, a plurality of Manifold users declares the achievement of weak superintelligence — 12
Congress repeals any tariff — 12
Apple will release first foldable smartphone during 2025 — 11
Chuck Schumer resigns — 11
United States government acknowledges the existence of life anywhere other than on or close to Earth — 10
Humanity's Last Exam passed with a grade of 80% — 10
Elon Musk diagnosed with COVID-19 — 10
Bethesda releases a new game in the Elder Scrolls series — 10
China annexes Taiwan — 9
Filibuster abolished — 9
Zelensky assassinated — 9
Article 5 of NATO invoked — 9
OpenAI o6 or any o6 variant released — 8
Ukraine internal coup attempt — 8
FrontierMath solved (>= 80%) and model is available to any person in the United States willing to pay for it — 8
Kate and William divorce — 6
Sam Altman gets fired again — 6
At least one Premier of a Canadian province takes a concrete step, such as holding a referendum, to join the USA — 6
Humanity's Last Exam passed with a grade of 90% — 6
Putin assassinated — 6
NATO at war with Russia — 6
Trump assassinated — 5
Any newspaper reports a US default on any debt — 5
US-China war — 5
A country leaves the EU — 5
A nuclear bomb explodes — 5
2025 is the 15th "Year of Three Popes" — 5
US debt ceiling will be scrapped during 2025 — 4
United States government acknowledges the existence of non-human intelligence on or close to Earth — 4
One of the Millennium prize problems falls to a model — 4
The Winds of Winter by G.R.R. Martin will release — 3
USA claims intelligences of extra-terrestrial origin have credibly visited or sent technology to a location within our solar system in an official government statement — 3
Los Ratones go to LoL Worlds — 2
LK99 replicates at last — 1
Greenland joins USA — 1
End of the world by celery — 1
Ukraine joins NATO — 1
Humanity's Last Exam passed with a grade of 80% on or before June 30 — 1
A US state secedes from the union — 1
51st US state — 1
Peter Turkson elected Pope — 1
GTA 6 releases — 1
Federal Reserve cuts rates to zero — 1
Harris will be the POTUS — 0
Bitcoin will be > 100k on 1st January 2025 — 0
MrBeast Channel Subscribers will be below 300M on 1st Jan 2025 — 0
MrBeast Channel Subscribers will be above 350M on 1st Jan 2025 — 0
Jimmy Carter will be alive on 1st January 2025 — 0
Jimmy Carter will be alive on 31st December 2025 — 0
Humanity's Last Exam passed with a grade of 80% on or before March 31 — 0
Francis is always Pope — 0
New pope takes name "Francis" — 0
First papal conclave >= 10 ballots — 0
First conclave elects Pope on first ballot — 0
Option — Votes
NO — 2034
YES — 1605
Option — Probability (%)
Make the bread taste good — 92
Don't eat anything for at least 48 hours before eating the bread — 90
Stretch-and-fold after mixing, 3x every 30 min — 88
Create indentation, fill with melted cheese and butter — 85
Bake on upside-down sheet pan, covered with Dutch oven — 85
Resolve this option YES while eating the bread — 80
Donate the bread to a food pantry, homeless person, or someone else in need — 72
Use sourdough instead of yeast — 70
Sprinkle 3 grams of flaky sea salt on top of each loaf before the second bake — 70
Watch the video — 67
Autolyse 20 minutes — 66
3 iterations of stretch-and-fold, at any time during the 14h waiting period. Minimum wait time between iterations 1 hour — 65
Make a poolish 12 h ahead: 100 g flour + 100 g water + 0.8 g yeast (0.1 %). After it ferments, use this poolish in place of 100 g flour and 100 g water in the final dough. — 64
Bake it with your best friend. — 63
Add 50g honey — 62
Swap 200ml water for milk — 62
Incorporate a whole grain flour (buckwheat for example) — 59
More steam! Either spritz with more water (preferably hot) or actually pour some boiling water in just before closing the lid. — 58
Bake for an amount of minutes equal to the percent this market answer is at when it comes time to begin baking. (Maintain the ±3 minute tolerances and the 2:1 ratio of time before:after the water spritz.) — 52
Use King Arthur Bread Flour instead of All-Purpose — 52
Decompose it into infinite spheres, then a few parts per sphere, rotate the spheres by arccos(1/3), unite them and you will find 2 chilis (Banach-Tarski) — 52
Let dough rise on counter only until double volume or 2h max, any time longer in fridge — 51
Add lots of butter (0.2 ml per gram) — 51
Use 50% whole grain flour — 51
Ditch current process, do everything the same as the video — 50
Toast the bread — 50
Eat the bread while punching @realDonaldTrump in the face — 50
Eat the bread while watching your mana balance steadily tick to (M)0 — 50
Throw the bread at a telescope — 50
Add 50g sugar — 50
Put a baking rack in the Dutch oven before putting the loaf in, raising the loaf off the floor and lofting it over a layer of air. — 50
Replace all water spritz steps with a basting of extra virgin olive oil. — 50
Use flour made from an unconventional grain e.g. barley, millet, oats, rye, sorghum, maize etc. — 50
Assume the chili is not in the interval [0,1], square it for more chili, if it is in (0,1), take the square root, else (equals 0 or 1) add 1 to it. — 50
Assume the chili is in the interval (0,1), square it for less chili, if it is in (1,infinity) take the square root, if it is in (-infinity,0) take the negative of the square of the chili, else (equals 0 or 1) subtract 1 from it. — 50
Get your friends to help you make a batch ten times the size, but add a Pepper X (2.7M Scoville heat units) to the mixture — 50
Add 1tsp of diastatic malt powder per 3cps of flour — 48
Replace 10% of flour with farina bona — 47
Bake the bread into a fun shape, like a fish, or an octagon — 47
While the bread is baking, tip every user who voted "Yes" on this option 25 Mana — 46
Add 50g vital wheat gluten — 42
Give ChatGPT your current recipe as well your take on what optimal bread tastes like, then take that advice for your next bake — 42
Bread flour, 3x yeast, cut rise to ~3h — 41
Use whole wheat to improve the nutrition of the bread — 41
Add an amount of MSG equivalent to half the current salt content — 40
Place small ice cubes between parchment and pot instead of water — 38
Cook the bread with a rod/puck of aluminum foil (or similar) in the core in an attempt to conduct heat through the center of the bread, cooking it evenly like a doughnut. — 37
Make all of the ingredients from scratch. — 35
Add a pinch of sugar — 34
Make the bread edible then throw it in — 34
Buy bread from a Michelin-star restaurant. — 34
Increase water by 50 g — 34
Drink vodka while eating the bread — 34
Cover bread with damp paper towel instead of initial water spritz. Rehydrate paper towel during 2nd spritz. Remove paper towel before placing on cooling rack. — 34
Do FOLDED — 34
Quit Manifold into the bread. — 34
Kill the bread into Manifold. — 34
Improve the bread — 33
Start at 500F, drop to 450F and uncover half way through — 32
Grind/powderize all salt used into a fine powder (with pestle & mortar or similar device) — 31
It needs more salt — 31
Add 1/2 cup yogurt to the bread and name the bread “gurt” while addressing it with “yo, gurt”. — 28
Half yeast — 27
Ship a piece of the bread to a random person. — 26
Encourage people to participate in the market in good faith while making the bread — 26
Add 2g? of baking soda — 24
Let dough sit 48 hrs — 24
Resolve this option NO while eating the bread — 24
Put butter into it — 23
Mix half sodium/potassium chloride — 22
Add a tablespoon of sugar — 20
Bake for 5 fewer minutes — 20
Bake one more minute — 20
Mail the bread to 1600 Pennsylvania Ave. Washington D.C. — 19
Use tap water instead of fancy RO water — 18
Frost it and put sprinkles on it to make it a birthday cake. — 18
Add sawdust to increase the volume of the bread (but only like 10% sawdust by volume max. maybe 20% if it's good sawdust) — 17
Add as many Jack Daniel's whiskey barrel smoking chips as feasible to the Dutch oven before baking, physically separating them from the bread as necessary while baking. — 17
Eat the bread while sending all your mana to @realDonaldTrump — 17
Bake the Manifold Crane into the Bread — 16
Don't eat anything for at least 24 hours before eating the bread — 16
Quadruple salt — 15
Do all the changes in the top 5 open options by probability, excluding this option — 15
Have someone sell the bread to you at an expensive price — 14
Use lemonade instead of water. — 14
Bake one fewer minute — 14
Bake the cake while wearing a onesie. — 13
Bake Vegemite into it. — 12
Bake for 5 more minutes — 11
Replace salt with sugar — 11
Eat the bread in front of the White House. — 10
Bake vodka into it — 10
Implement all options that resolved NO — 10
Make the bread inedible then throw it out. — 10
Replace flour with flowers — 10
Throw the bread at @realDonaldTrump — 10
Force Feed it to @realDonaldTrump — 10
Make the bread great again — 9
Cut the bread into the number of traders in the market slices. — 9
Make naan bread, an easy-to-make bread — 8
Only buy ingredients from 7/11. — 8
Implementing every element listed below. — 8
Put a non-lethal dose of any rat poison. — 8
Just make donuts instead — 8
Bake it in an easy bake kids oven — 7
Think positive thoughts before tasting — 6
Use a plastic baking sheet. — 6
Eat the bread while betting yes on Cuomo on Manifold — 6
Ditch all the steps. Just buy the bread from the supermarket — 6
Double oven temperature — 6
Halve oven temperature — 6
Play classical music while baking — 5
Light it on fire with birthday candles. — 5
Bake it with a microwave — 5
Eat the bread while betting yes on Mamdani on Manifold — 5
Wear a suit while baking the cake. — 4
Bake your social security number into it. — 4
Bring it to Yemen and put a bomb in it — 3
Bake America Great Again — 3
Sacrifice a lamb — 2
Add MAGA and a splash of Trump juice — 2
Bake in a cat and a dog — 2
Explode it: — 2
Take a fat dump in the dough — 1
Sit in dough 24 hrs — 1
Let dough sit 24 hrs — 0
Bake in rectangular tin — 0
Double yeast — 0
Halve salt — 0
Double salt — 0
Add 2tsp olive oil — 0
Refrigerate dough instead of room temp wait — 0
Do not mix salt and yeast in water together — 0
Put fork in microwave — 0
Don't eat anything for at least 12 hours before eating the bread — 0
Add 2tbsp vanilla extract — 0
Eat the bread with friends — 0
Bake it in the country you were born in. — 0
Eat the bread over the course of a week. — 0
Bake the bread with love — 0
Option — Probability (%)
Brakes hit — 56
Capability limit in AI — 50
Huge alignment effort — 49
New AI paradigm — 49
Slow and gradual capability gains in AI — 48
Enhancing human minds and/or society — 44
Major capability limit in AI — 42
Non-AI tech — 38
Alignment relatively easy — 35
Brakes not hit — 28
Alignment unnecessary — 25
Well-behaved AI with bad ends — 19
Alignment extra hard — 1
Option — Votes
YES — 2690
NO — 458
Option — Probability (%)
K. Somebody discovers a new AI paradigm that's powerful enough and matures fast enough to beat deep learning to the punch, and the new paradigm is much much more alignable than giant inscrutable matrices of floating-point numbers. — 20
I. The tech path to AGI superintelligence is naturally slow enough and gradual enough, that world-destroyingly-critical alignment problems never appear faster than previous discoveries generalize to allow safe further experimentation. — 12
C. Solving prosaic alignment on the first critical try is not as difficult, nor as dangerous, nor taking as much extra time, as Yudkowsky predicts; whatever effort is put forth by the leading coalition works inside of their lead time. — 10
B. Humanity puts forth a tremendous effort, and delays AI for long enough, and puts enough desperate work into alignment, that alignment gets solved first. — 8
Something wonderful happens that isn't well-described by any option listed. (The semantics of this option may change if other options are added.) — 8
M. "We'll make the AI do our AI alignment homework" just works as a plan. (Eg the helping AI doesn't need to be smart enough to be deadly; the alignment proposals that most impress human judges are honest and truthful and successful.) — 7
A. Humanity successfully coordinates worldwide to prevent the creation of powerful AGIs for long enough to develop human intelligence augmentation, uploading, or some other pathway into transcending humanity's window of fragility. — 6
E. Whatever strange motivations end up inside an unalignable AGI, or the internal slice through that AGI which codes its successor, they max out at a universe full of cheerful qualia-bearing life and an okay outcome for existing humans. — 5
J. Something 'just works' on the order of eg: train a predictive/imitative/generative AI on a human-generated dataset, and RLHF her to be unfailingly nice, generous to weaker entities, and determined to make the cosmos a lovely place. — 5
O. Early applications of AI/AGI drastically increase human civilization's sanity and coordination ability; enabling humanity to solve alignment, or slow down further descent into AGI, etc. (Not in principle mutex with all other answers.) — 5
D. Early powerful AGIs realize that they wouldn't be able to align their own future selves/successors if their intelligence got raised further, and work honestly with humans on solving the problem in a way acceptable to both factions. — 3
F. Somebody pulls off a hat trick involving blah blah acausal blah blah simulations blah blah, or other amazingly clever idea, which leads an AGI to put the reachable galaxies to good use despite that AGI not being otherwise alignable. — 3
L. Earth's present civilization crashes before powerful AGI, and the next civilization that rises is wiser and better at ops. (Exception to 'okay' as defined originally, will be said to count as 'okay' even if many current humans die.) — 3
G. It's impossible/improbable for something sufficiently smarter and more capable than modern humanity to be created, that it can just do whatever without needing humans to cooperate; nor does it successfully cheat/trick us. — 2
H. Many competing AGIs form an equilibrium whereby no faction is allowed to get too powerful, and humanity is part of this equilibrium and survives and gets a big chunk of cosmic pie. — 1
N. A crash project at augmenting human intelligence via neurotech, training mentats via neurofeedback, etc, produces people who can solve alignment before it's too late, despite Earth civ not slowing AI down much. — 1
You are fooled by at least one option on this list, which out of many tries, ends up sufficiently well-aimed at your personal ideals / prejudices / the parts you understand less well / your own personal indulgences in wishful thinking. — 1
If you write an argument that breaks down the 'okay outcomes' into lots of distinct categories, without breaking down internal conjuncts and so on, Reality is very impressed with how disjunctive this sounds and allocates more probability. — 1
Option — Votes
NO — 1146
YES — 415
Option — Probability (%)
Prediction markets — 30
Signalling — 11
Ems — 11
Criminal law vouching — 9
Tax Career Agents — 7
Healthcare — 7
Reversible agents — 7
Other — 7
Extra-terrestrial civilizations (Grabby aliens, Great Filter...) — 6
Fertility — 5
Option — Votes
NO — 200
YES — 50
Option — Votes
YES — 148
NO — 84