Pusulam Admin

Collective Intelligence vs Expert Panel: Who Is More Accurate?

#collective-intelligence #experts #research #superforecasting #accuracy

The Reputation of Experts

On television screens, in newspaper columns, and on conference stages, experts talk. Economists forecast exchange rates, political scientists predict election outcomes, technology analysts explain the next big trend.

We trust them. After all, they've spent years in training and have experience in their fields. But we need to ask a question: how accurate are their predictions, really?

Tetlock's Finding

Philip Tetlock is a professor at the University of Pennsylvania. Over 20 years, he tracked more than 28,000 predictions made by 284 experts. The results sent shockwaves through the academic world.

Experts' prediction accuracy was only marginally better than a chimpanzee making random guesses.

Yes, you read that right. A chimpanzee throwing darts at a dartboard would have hit the mark about as often as the average expert.

But Tetlock didn't stop there. In his next project, the Good Judgment Project, he discovered a group of ordinary people he called "superforecasters." These people weren't experts. There were homemakers, retired engineers, and students among them. Yet their forecasting ability surpassed even intelligence analysts.

What's Their Secret?

Superforecasters shared these common traits:

First, intellectual humility. They could say "I might be wrong." This sounds simple, but most experts can't do it. An expert who has stated a position on a public platform struggles to admit when they're mistaken.

Second, constant updating. They revised their predictions as new information came in. They didn't cling to an idea. If their initial prediction was 60% and new data arrived, they'd update it to 55% or 65%.

Third, looking from different perspectives. They evaluated a topic not just through their own area of expertise, but from multiple angles.
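The "constant updating" trait can be made concrete with Bayes' rule: a prior probability is revised when new evidence arrives. The numbers below are illustrative (a 60% prior nudged to about 65%, matching the example above), not data from the Good Judgment Project.

```python
# Illustrative sketch of "constant updating" as one Bayes-rule step:
# revise a prior probability in light of new evidence.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior P(event | evidence) via Bayes' rule.

    prior: probability of the event before the new information
    p_evidence_if_true: how likely this evidence is if the event will happen
    p_evidence_if_false: how likely it is if the event won't happen
    """
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# A 60% prediction, plus mildly supportive new data:
posterior = bayes_update(0.60, 0.55, 0.45)
print(round(posterior, 2))  # 0.65
```

The point is the habit, not the formula: each piece of evidence moves the estimate a little, rather than flipping it to certainty or being ignored.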

The Power of Prediction Markets

This is exactly where prediction markets come in. A prediction market does what superforecasters do, but at scale.

Individuals bring different knowledge and perspectives. The market mechanism converts these different views into a price (a probability). And that price is constantly updated.
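As a minimal sketch of that aggregation, a price can be viewed as a stake-weighted average of traders' independent probability estimates. Real market mechanisms (order books, automated market makers) are more involved; the function and numbers here are hypothetical.

```python
# Hypothetical sketch: a market "price" as a stake-weighted average
# of independent traders' probability estimates for one event.

def market_price(estimates, stakes):
    """Aggregate probabilities (0-1) into a single price.

    estimates: each trader's probability for the event
    stakes: how much each trader has put behind their estimate
    """
    total = sum(stakes)
    return sum(p * s for p, s in zip(estimates, stakes)) / total

# Three traders with different views and different conviction:
price = market_price([0.70, 0.55, 0.40], [100, 50, 50])
print(round(price, 2))  # 0.59
```

Notice that the trader with the most at stake pulls the price furthest, which is one way "skin in the game" feeds back into the aggregate.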

The results are impressive. In the 2024 US presidential election, Polymarket produced more accurate predictions than polls and expert panels. A similar picture emerged in 2020. Manifold Markets' Brier score is 0.17, which falls in the "very good" category.
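For readers unfamiliar with the metric: the Brier score is the mean squared error between probability forecasts and outcomes (1 if the event happened, 0 if not). Lower is better; always guessing 50% scores 0.25, and 0.0 is perfect. The forecasts below are made-up numbers to show the computation, not Manifold's data.

```python
# Brier score: mean squared difference between probability forecasts
# and binary outcomes. 0.0 is perfect; a constant 50% guess scores 0.25.

def brier_score(forecasts, outcomes):
    n = len(forecasts)
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / n

# A forecaster who leaned the right way on three events:
score = brier_score([0.8, 0.3, 0.9], [1, 0, 1])
print(round(score, 3))  # 0.047
```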

But Is the Crowd Always Right?

No. A few conditions need to be in place for the wisdom of crowds to work:

First, diversity. A crowd of like-minded people isn't wise, it's dangerous. The echo chamber effect kicks in.

Second, independence. People need to form their judgments without being swayed by each other. That's why prediction markets work best when everyone commits to their own position before seeing how others have voted.

Third, having skin in the game. Simply saying "I think this will happen" isn't enough. You need to put something on the line. In Pusulam, that's Voting Rights. Spending even a small, limited resource encourages people to think more carefully.

Should We Stop Listening to Experts?

No, absolutely not. Experts matter. But rather than seeing them as the sole source of information, it's better to treat them as one of many sources.

The best results come when you combine expert opinion with community prediction. Experts give you depth, the community gives you breadth. Together, they're more powerful than either is alone.

In Pusulam, there's an AI assistant beneath every market. This assistant presents expert opinions, up-to-date data, and different perspectives. But you make the final call. And your decision becomes part of the community's collective prediction.
