Wikipedia at 20: What next for the world's biggest collection of information?
As it marks its 20th anniversary The Big Issue speaks to Wikipedia co-founder Jimmy Wales, and Katherine Maher, CEO of the Wikimedia Foundation, about the responsibilities of overseeing the biggest source of information on the planet
Wikipedia was launched on 15 January 2001. I know that because I looked it up on Wikipedia.
The “multilingual open-collaborative online encyclopedia” (as Wikipedia is described on the Wikipedia page about Wikipedia) contains, at the time of writing, 6,227,517 articles in English, with, on average, 597 added every day.
And there are versions in around 300 other languages, relying on a community of volunteer contributors – currently 280,000 – to generate and moderate content, which they do at a rate of around 350 edits per minute.
Last year the site was visited 269 billion times. More people have had more time to browse online, and plenty of big news stories to find out about – stories that increasingly divide and polarise opinion – meaning Wikipedia’s place as a respected source of trusted information is more critical than ever.
But let’s take a step back. Maybe 20 years back.
A pathological optimist’s ‘really big idea’
In January 2001, tech pioneers brimmed with optimism about the potential of the information age, no doubt buoyed by the fact the Millennium Bug could be dismissed with a shrug.
On 15 January 2001, Alabama-born Jimmy Wales posted the first entries on Wikipedia.com. The 34-year-old had a vision at once simple and preposterously ambitious: “To create a free, high-quality encyclopedia in all the languages of the world”.
Jimmy Wales co-founded Wikipedia in 2001
“That was a really big idea back then and it’s still a really big idea today,” Wales, now living in London, tells The Big Issue.
Then, as now, the site’s growth depended on volunteer contributors who would emphasise quality and neutrality, providing sources that have improved Wikipedia’s reputation for accuracy over the years.
“I set down as one of the very first rules of Wikipedia that neutrality is non-negotiable,” Wales says. “Wikipedia would and should not become a vehicle for any one particular ideology or point of view. Which of course, in its own way, is an ideology. But it’s a very enlightening one.”
In retrospect, did Wales know people would get behind the idea?
“I’m a pathological optimist,” Wales replies. “I always thought, everything’s going to work out fine, but I was very pleased to see how many people are actually quite thoughtful and really try to work together. I think Wikipedia does represent the optimistic, hopeful side of what we can do as humans on this planet.
“And it’s a reminder that it doesn’t have to be all about people screaming at each other on Twitter and so forth. There is another side of the internet that’s much more healthy.”
Two decades later, Wikipedia has remained largely the same, but the world has changed.
How can it be that we live in an age where more information about more things is available to more people, but ignorance is not only widespread but weaponised?
“Wikipedia tries really hard to be that place you go for quality, thoughtful information,” Wales says. “But, of course, not every place online has the same values.
“You can go online today and access huge quantities of information, both high and low quality – Wikipedia has really always been about trying to help navigate that.
“We have very strict rules about reliable sources, with community members always discussing and debating the accuracy of information. We have a lot of people who have been very, very diligent in keeping entries – for example, those related to COVID-19 – very high quality and fact-based, to fight some of the misinformation that is going around.
“We are seeing a real problem with an advertising-only business model for social networks, giving them a real incentive to create addictive products that generate outrage, dispute and anger. Because that keeps you on the site longer; it keeps you there long enough to see more ads.”
The consequences of this, Wales and the Wikipedia team believe, are causing problems for society. But a look at how Wikipedia works through differing viewpoints to find consensus could provide a route map for the rest of us to follow.
‘We’re not interested in clickbait’
Wikipedia is run as a non-profit organisation.
“We don’t have any advertising, we just ask people if they find it meaningful to donate,” Wales explains. “This changes the whole incentive structure for us. We’re not interested in clickbait headlines, we just want to give you a quality experience so that when you do see that banner at the end of the year that says, ‘Look, could you please chip in?’ You go, ‘Oh, yeah, that’s great, I want to help.’”
If Wales had decided to sell out (conservative estimates value the site at more than $10 billion) and Wikipedia’s aim was to make as much money as possible, it would look very different.
Algorithms would tailor articles to match your digital profile, similar to how many social networks suggest posts or topics. If Wikipedia’s aim was to keep you on pages as long as it could, and to keep you coming back by feeding you what it knew you liked, it would use your digital profile to predict the information you wanted.
Brexit supporter? Boris is great! Against it all? Campaign to Remain!
That’s effectively what YouTube, Facebook and others do: curate content they predict you’re interested in, even if that means posts by flat Earth anti-vax UFO abductees.
Mis- and disinformation have stoked division and polarised viewpoints. The real-world consequences of this became starkly clear when insurrectionists stormed the Capitol in Washington earlier this month. So how does Wikipedia handle such a complex and incendiary event?
Wikipedia user Molly White posted a thread of tweets about how events were recorded.
As the news broke on Wednesday that there was a riot at the U.S. Capitol, volunteer #Wikipedia editors worked to document what was happening. A thread about the fascinating process of breaking news editing: pic.twitter.com/8FoCCphKVJ
As of today, the article has been edited more than 4,500 times by almost 800 different people, who collaborated to create an article that is nearly 14,000 words long and cites 480-plus sources. It’s been viewed almost a million times.
It’s this collaborative process that ensures some consensus on even the most divisive issues.
‘A baseline of public education is so important for our societies’
“We might not all agree on everything about the world,” San Francisco-based CEO Katherine Maher says. “And it’s not just politics – with any matter of sensitivity, you have a range of people pushing from all sides. What that ends up doing is pushing the articles into something that approximates agreement.”
Trust in official sources and governments is at a low, Maher says.
“We’re in a period at the tail end of a number of decades that have undermined public institutions and education, sometimes for very good reasons. And the rise of the internet, the ability for misinformation to be so widely disseminated [means] it is reaching an audience that has lost confidence in those institutions. Wikipedia can play a role in remaining a trusted information source.
“A baseline of public education is so important for our societies to be able to have a common ground to stand on as we go about making difficult decisions and trying to interpret our world around us.”
But with great power comes great responsibility and even greater challenges.
Maher runs through some of her day-to-day priorities: “One of our first commitments is we want Wikipedia to be free to everyone, free to participate in, free to use. We view that as a core part of our mission.
“A second challenge is how do we make sure that everyone can access it, how do people around the world who come from very different circumstances, how do we make sure that it is available to them?
“Then the next question is, how do we ensure that there’s information that represents everybody? Wikipedia is in 300 languages, give or take, ranging from very small to very large languages. The largest is English but there are small European languages such as Basque and Welsh. Then there are languages spoken by millions of people in Africa and India that don’t have as many articles, they’re not as rich in terms of their knowledge.
“And so our next question is, how do we make sure we really serve the world? And in this moment of misinformation and disinformation, how do we make sure that people trust that information and that it’s not biased in some sort of way?
“Those are the things that we think about at least every day as we go about our work.”
Katherine Maher, CEO of the Wikimedia Foundation
So how can Wikipedia ensure factual accuracy at a time when nobody can agree on anything?
“The problem is complex and complex problems rarely have one solution,” Maher says.
“An article itself evolves over months or years but we only get to see a single version of it, which means that we don’t get to fall into sort of rabbit holes of our own political persuasions or ideas.
“I think that’s the really fundamental difference between us and other social platforms with individualised, personalised feeds that have a tendency to sort of pull you further away from consensus, whereas Wikipedia sort of forces you into dialogue with people you may or may not agree with.”
So unlike pretty much every other major or minor website, Wikipedia has no interest in collecting data about its visitors. One fact that neither Wales nor Maher can settle is where Wikipedia ranks among other websites – somewhere between the fifth and 15th most popular in the world – simply because they don’t collect visitor data.
“We don’t track our users because we’d never want the data about who they are or what they believe because we think that it’s just so tempting once you have it!” Maher admits.
“And it’s why we don’t want to ever be a commercial entity because we would be tempted to make decisions like that, because that would be, you know, potentially good business.
“We’re not taking positions on politics or matters of social disagreement, what we’re trying to do is present the information so that people can come to their own conclusions.
“We just know that no matter where you are in your life, whether you’re a student or you are older, whether you’re pursuing information in order to make an important life decision, or because you’re curious about the world and are looking for a diversion, whether it is serious scientific information, or whether it’s just about pop culture, knowledge really matters to people.”
‘A non-toxic social network’
Wales is currently developing WT:Social, a “non-toxic social network” to provide an alternative to Facebook and Twitter with no adverts and users able to flag misleading posts.
“It’s a pilot project,” Wales explains. “Everything I’m doing there is focused on trying to completely reinvent what it means to be in a social network. I’m having fun building it.
“I think the public is quite dissatisfied with the current state, saying, you know what, let’s start looking for alternatives.”
Somebody looking for an alternative social media platform may be President Trump. How would he have to behave differently on a different kind of social media?
“Well, I mean, I have a pretty strong feeling that if he behaves the way he normally behaves on social media, the community would vote to ban him quite quickly.
“My view is, in order to solve the problems of social networking, it’s not about simply changing their top-down centralised policies, it’s fundamentally rethinking how the whole thing works. Stop thinking about a model in which the general public that’s using the platform is disempowered entirely from having any say in how it’s run. It’s not easy, but my belief is that that’s what we have to try to do.”
‘We really want to keep Wikipedia safe’
Twenty years from now, Wales believes Wikipedia will look very similar to how it does today. “It’s an encyclopedia, we’re not going to become TikTok or anything.
“We try to run the Wikimedia Foundation, the charity that I set up to own and operate Wikipedia, in a very cautious way. So financially, every year we spend a little bit less than we take in to build up our reserves to fund us through hard times in the future, or to fund any opportunities that come.
“We really want to keep Wikipedia safe. We really want to be here for the long haul.
“A lot of the changes will be invisible to most of us because it’s going to be about Wikipedia in the languages of poorer parts of the world as we welcome the next two billion people online.”
Katherine Maher adds: “For example, India is a country with at least 24 Wikipedia language versions. And we want to be able to serve the hundreds of millions of people who live in India, who speak one of these 24 languages.
“On the flip side, we look at countries where we worry about the risk of censorship, which we think is increasing around the globe. We’ve been blocked in China for a very long time now and while that’s particularly disappointing, we want to make sure that doesn’t spread, that censorship does not continue to grow.”
‘We’ve always had conspiracy theories’
Nowadays, to be the smartest person in the room you just have to be the fastest on a smartphone. Have the internet and unlimited access to information actually changed the meaning of intelligence? It’s no longer about knowing information, it’s about knowing the value of that information.
“I think it was always important, but it’s probably more important today, to have skills at determining what is quality information and what isn’t,” Jimmy Wales says. “If you believe every random piece of nonsense that you see floating around on social media, you’re going to end up with a very poor understanding of the world.”
Human beings have remained unchanged for thousands of years, he adds, but our environment is evolving rapidly.
“We’ve always had conspiracy theories but people do get sucked into it. How could you believe, for example, that 5G is the cause of coronavirus? I mean, it’s a completely stupid thing to believe, right? If you told people that 100 years ago, they wouldn’t have a very easy way of going to check that. Whereas now they do. You can just go on Wikipedia and it takes you 10 minutes to go, oh I see that makes no sense whatsoever.
“And a lot of people don’t. I’m not sure what the answer is. People should read more Wikipedia.”
To learn more about Wikipedia over the last two decades visit 20.wikipedia.org