Will betting in or subsidizing specific prediction markets be an EA cause area by 2030?
Ṁ2943 · 2030 · 25% chance

Funding prediction market platforms such as Manifold does not count. I'm referring to putting money into specific markets, like "provide large subsidies for markets on charity impact to increase the accuracy of their results", or "bet YES on AI doomsday markets to convince people it's a risk worth taking seriously". I will also include a grant from an EA-aligned organization to try it out and see whether this is a good use of money.


I want to bet NO because we'll probably all be paperclips by then, but then I'd be betting for EA reasons, since I won't have much use for mana as a paperclip, and that might make the market resolve yes. Hmm.

It seems likely this can't be a good cause area until several earlier steps of the tech tree are done. But given that you pushed it all the way out to 2030, I think there's a good chance there will be something like this. Buying predictions may get drastically easier; I would be shocked if buying predictions using the best tools of the era doesn't stay a major EA cause area, and EA folks have donated to stuff like this before.

That said, I'd argue that this market could benefit from much clearer resolution criteria.

@L You EAs with your darn needs to quantify everything! Why can't you just let it be ambiguous and potentially lose a bunch of money when I resolve this market based on a different interpretation than you were betting on‽

...Ok, I might have answered my own question.

Seriously though, I'd love to have more objective resolution criteria for a lot of my markets, but coming up with such a thing is often a significant challenge. The world is complicated, and getting my intuitions down to fit in a few paragraphs is not easy. (It's not quite "AI alignment" levels of difficulty, but perhaps could be used as an example case...)

If you have specific suggestions, I'm very open to them. Or if you have questions about what would count, ask and I will answer to the best of my ability.

(predicted NO)

Would love it if you clarified more what you mean by "EA cause area".

Hmm. Anything that a significant fraction of EAs believe is worthwhile to donate to, and encourage others to donate to, because they believe it's one of the most effective ways to benefit the world.

Is that clear enough?

(predicted NO)

@IsaacKing I would rather it be measured by the share of total EA donations, rather than the subjective perception that it's effective. If "a significant fraction of EAs" believe it's worthwhile to donate to, then that should be measurable in actual money, either in absolute terms or as a percentage of donations, no?

(predicted NO)

@DeanValentine Otherwise clearly they don't actually believe it's a legitimate cause area vs. malaria nets or AI risk.

@DeanValentine Good idea. That seems hard to measure though, since there likely won't be any single organization receiving and tracking all the funds.

According to https://effectivealtruismdata.com/, the smallest cause area that was worth giving its own label is "far future", at 0.1% of total funding. I'd be happy to consider prediction markets a cause area if they reach that level of funding.

I don't, however, want to say that I won't consider them a cause area at less than that level. As Katja mentioned, some things can reasonably be considered "cause areas" despite not receiving much direct funding or explicit attention.

For example, if it's determined that there are only a small number of specific markets that are beneficial to fund, there may be rapidly diminishing returns, such that only a few thousand dollars satisfies the demand, and EAs just make sure that that small fund never dries up.

Would it be a 'cause area' if EAs were often doing it as an altruistic activity, or does it take more than that? (e.g. 'writing EA forum posts' is a thing EAs do a lot, often with altruistic intent, but I wouldn't call it a cause area.)

@KatjaGrace If it seems that people are doing it because they believe it's an effective way to help the world, it counts. If they're doing it as a personal project that they acknowledge is for more selfish reasons, like donating to a YouTube channel they enjoy watching and want to support, that doesn't count.

Writing forum posts is an interesting example. Do you know of anyone who's argued that writing a good forum post has higher impact than spending that time earning to give?

@IsaacKing I don't remember seeing such an argument, but I do take that for granted and expect many others do too. Just because forum posts are often straightforwardly valuable EA work of the kind that people do at their jobs. (e.g. my last forum post I actually wrote at work: https://forum.effectivealtruism.org/posts/zoWypGfXLmYsDFivk/counterarguments-to-the-basic-ai-risk-case)

@KatjaGrace Yeah I kind of assumed the same. If a forum post can redirect 0.1% of all EA funds, that's definitely worth more than ~20 hours of work elsewhere.

Yeah ok, I'll count that as a cause area for the purposes of this market. If it becomes tribal knowledge in EA that betting in or subsidizing specific prediction markets is good and people should do it more, but very few people talk about it explicitly, I'll resolve this to YES.

But it has to be specific markets. (Or markets about specific topics.) A general ethos of "it's epistemically virtuous to bet on your beliefs, and prediction markets are a good way to do that" is not sufficient.

Hope I'm wrong!

(predicted NO)

@DeanValentine Subsidized a thousand M$ btw. (Hate how it doesn't show you that)

@DeanValentine I appreciate it! (I get a notification it happened, and it's possible to see it by opening the market details in the top right corner. But it's not all that visible if you're not looking for it unfortunately.)
