8 Comments

Trevor - excellent essay. I've had similar concerns that AI alignment researchers are being quite naive about possible social/cultural backlash against AI. I'm working on a related essay.

You might be interested in the responses to this question I posed on Twitter on Oct 10:

https://twitter.com/primalpoly/status/1579552298262736896

I think there are many possible applications of narrow AI over the next 20 years that would be so disgusting, alarming, distressing, and politically polarizing that anti-AI activists could easily turn the public against AI....


Can I summarize the key points of the argument to make sure I understand?

1) Nick Beckstead's subjective probability estimate of AGI before 2043 should go from 20% to below 2%, specifically because a Three Mile Island-style ("TMI") incident or a "becomes a military matter" event is likely to happen before the technology comes to fruition (if in fact it does).

2) At least one of these two events is very likely to happen.

3) If it does, it is very likely to mean that AGI is not developed before 2043.

Since the punchline argues for a 90% reduction in the subjective probability, presumably you'd claim that #2 and #3 are actually really, really likely -- something like 95% likely that a relevant incident happens before AGI (if AGI does happen), and also that such an incident reduces the chance that AGI then happens before 2043 by 95% (or 90% in one slot and 100% in the other, or...). Otherwise, they shouldn't convince someone who starts at 20% to revise their belief to below 2% (assuming they previously assigned those events 0% probability; you'd have to argue for even higher numbers if the 20%-believer already thought them possible). Do you believe / argue for numbers like 95%; 95%? Higher?
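To make the arithmetic explicit (using my illustrative 95% figures, not numbers from the essay): under that model the surviving probability would be

20% × (1 − 0.95 × 0.95) = 20% × 0.0975 ≈ 1.95%,

which only just clears the "below 2%" bar. Weaker numbers in either slot leave you above it.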

(I work at the FTX Foundation, but not for the Future Fund, and my question-asking here is as a blog reader, not in any connection to the prizes.)


Hi Trevor! Thanks for contributing to our contest!

I wanted to drop in to see if I can explain myself better on the subject of "how you can assign an exact numeric probability to the probability of AGI being developed by 2043." I definitely agree that reporting my subjective probabilities about a roughly defined event is inexact and less scientific than what the IPCC is doing, and I don't consider what I'm encouraging people to do here to be science - I consider it something more like "thoughtfully explaining your bets."

It seems like there is a gulf of understanding between us: I think of assigning a subjective probability to AGI timelines as a natural thing to do when trying to make sense of the world, while you (and I think many others) seem to consider it a kind of crazy and hubristic thing to do, which couldn't possibly be done exactly, and is perhaps an affront to the diligence, objectivity, and seriousness of the scientific enterprise. From my point of view, you're implicitly setting too high a bar for people to consider themselves qualified to have any opinion at all on the question.

One thing that may be helpful here is to consider start-up valuations. Take a pre-revenue start-up in a speculative industry, such as Helion Energy, and suppose we wanted to have a discussion about what it's worth. Suppose I write up a few pages of thoughts and come up with a valuation of $X based on guesses about the size of the total addressable market, the odds they succeed in getting cost/kWh below some threshold, the odds they get regulatory approval, and their expected market share conditional on getting there first.
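To show the shape of that napkin math, here's a minimal sketch. Every input is a placeholder I've made up for illustration - none of these numbers are claims about Helion:

```python
# Napkin-math valuation for a pre-revenue start-up.
# Every input below is an illustrative guess, not a claim about any real company.

tam = 300e9              # total addressable market, in $/year (guess)
p_tech = 0.15            # odds of getting cost/kWh below the threshold (guess)
p_regulatory = 0.50      # odds of regulatory approval, given the tech works (guess)
market_share = 0.10      # expected share, conditional on getting there first (guess)
margin = 0.20            # long-run profit margin (guess)
multiple = 15            # value as a multiple of steady-state earnings (guess)

steady_state_earnings = tam * market_share * margin
value_if_success = steady_state_earnings * multiple
valuation = p_tech * p_regulatory * value_if_success

print(f"Napkin valuation: ${valuation / 1e9:.1f}B")
# Deliberately ignores discounting and dilution, as napkin math tends to.
```

The point isn't the output; it's that each line is a bet you can argue with separately.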

You could come back to me and say, "I don't understand how you can assign an exact numeric valuation to Helion. It's a pre-revenue company with an unproven technology. You know that when the IPCC uses subjective probabilities, it bases them on mountains of evidence and detailed climate models. Our models of the trajectory of fusion energy are much worse than our models of climate change. If you want a scientific discussion of start-up valuations, then you should at least come in with clearly stated assumptions about the company!"

And if you did, I would say: I don't know how to fully explain it and make my thinking fully reproducible by others, but there are some valuations at which I'm a buyer, some at which I'm a seller, and a range of intermediate valuations where I could go either way. I think it's possible to make meaningful statements about this question using napkin math, without IPCC-level standards of rigor.

I believe something similar about forecasting AGI by 2043. Ultimately, we have to decide what bets we're going to take on this subject. We won't be able to do it as rigorously as the IPCC (success probably looks more like "about as rigorous as a good VC making long-term bets"), but maybe we'll learn something if we try to explain our thinking and subject it to external scrutiny!
