Wikipedia co-founder says site has liberal bias — here’s his plan to fix that
Oct 3, 2025
Sanger told SAN that he hopes the organization’s leadership will adopt his plan to return to the original mission of neutrality.
Video Transcript
0:00
Does Wikipedia have a left-leaning bias? Its co-founder, Larry Sanger, believes it does
0:04
citing a blacklist of conservative outlets that are not allowed to be used as sources on the site
0:09
Sanger wants Wikipedia to improve its neutrality and transparency, and he has developed a nine-point plan that he hopes administrators and editors at the encyclopedia
0:19
will follow. Why do you think Wikipedia needs reforms? Wikipedia has got a lot of problems that
0:25
it's accumulated over the years. And this is for a number of reasons
0:34
But I think the main root reason why it has as many problems as it does is that it has no way to legitimize policy proposals and big new changes
0:49
So it ends up being institutionally conservative. It simply lacks any method of major reform
0:58
Sanger wrote nine suggestions that he hopes Wikipedia's current leadership will adopt
1:02
They include enabling competing articles, abolishing the blacklist, reviving the original neutrality policy, allowing articles to be rated, and ending indefinite blocking
1:13
He also wants Wikipedia to reveal who its current leaders are. Consider joining Wikipedia at some point in the next few months and making your voice heard there
1:30
You do have a right to edit there. And I think Wikipedia needs fresh blood from a wide variety of ideological and religious and other perspectives
1:44
national perspectives, for example. Right now, Wikipedia is very much a sort of echo chamber,
1:52
I find. And they won't like me saying this, but they should,
2:00
because I'm simply encouraging people to participate and to get behind the proposals
2:07
that I've put out there. I think that really would make the world
2:13
better by making Wikipedia better. Wikipedia has a list of reliable sources that can be used for
2:21
entries. The sources are given one of five ratings: generally reliable, no consensus,
2:27
generally unreliable, deprecated, and blacklisted. Fox News is listed as unreliable, which means it shouldn't be used, while MSNBC is considered generally reliable
2:39
Do you think that's a good system, or do you think that system needs to be changed entirely?
2:43
Because I see that there could be a difference between thinking it's an acceptable system that just isn't executed properly and thinking the system itself is flawed. You see what I mean?
2:51
I think it's a pretty bad system, generally speaking. My view is that sometimes facts appear, or are reported, only in sources that are currently disallowed on Wikipedia
3:09
maybe the source decided to speak only to that publication. And in that case, if that person doesn't speak to any other publication that is allowed on Wikipedia
3:24
then that information is never going to appear on Wikipedia. So the way it has to work, I think, is all sources are allowed, but we can still make a selection of the most credible or the best sources
3:40
And we can have a debate about which those are. Controversial views need to be appropriately attributed to the people who own them
3:48
And that also goes for opinions about the sources themselves. You can see that Ad Fontes has this media bias chart, and it's got two separate ways in which it grades news outlets, one based on their partisanship and one based on their reliability
4:05
And as you can see, some of the outlets on the right that you described, like Newsmax and the New York Post, have roughly equal reliability to MSNBC, The New Republic, and Mother Jones
4:19
But none of them are within that green box, which would hit the right balance of not being partisan, having neutral news, and also being very reliable
4:29
So I want to just push back on this idea that certain outlets should be allowed and wonder whether perhaps the standards should be stricter
4:38
No, I think most of the information that can be found about what's going on in the world is reported in sources that are outside of the box
4:50
That's not a problem. I think we can learn a lot from biased sources. Speaking as a sort of conservatarian myself,
5:06
I think it's possible to learn things from the New York Times
5:10
I wouldn't simply take its word for it if it were the only source
5:14
or if I hadn't checked it out on controversial issues
5:24
but there's a lot of things that I might trust it on. And if it's among multiple sources that
5:31
are reporting a certain thing, well, its fact-checking tradition is pretty strong. So I
5:38
would say, I suppose the New York Times is okay, but it doesn't belong in the green box
5:45
It's very far left. When do you believe Wikipedia was most neutral and best fulfilling its mission
5:52
And when do you believe it started moving to the left? And how do you think that happened?
5:56
Best fulfilling its mission and being most neutral are distinct things. I would say it was most
6:04
neutral in its first three years or so, but it actually fulfilled its mission better in later
6:12
years, maybe between 2005 and 2010, though by then it was already starting to be biased
6:22
I remember complaining about that a bit at the time. Nevertheless, it wasn't nearly as bad as it is now, and it had expanded into just an
6:33
absolutely enormous resource. So there never was a time that I'd want to go back to
6:43
where it was just the way I would like it to be
6:47
I think in terms of the qualities of the community I actually like how
6:56
things were in the very first year. There were a lot of decent people around,
7:00
there was a lot of give and take, and people were not so full of themselves and busy negotiating the rules
7:17
So now, as to the question of when things really changed in the sense of becoming more biased, I would actually break it down into a few different periods
7:29
So first of all, the left really started descending on the site in what I would call a noticeable way, even in the second half of 2001. Not a lot of people, but there were some people who were very clearly on the left
7:49
And at first they didn't really crowd anyone out. And when I speak of the left here,
7:57
I'm not just talking about people who might happen to vote for the Democratic Party
8:01
That's not what I mean. I mean people who are serious full-time, or at least part-time, activists
8:13
and who generally tend to be on the far left. So there were a number of those people toward the end of 2001, and a lot more started showing up in the following year
8:27
And before too long, they really dominated what was going on, so that by 2008 or 2010, the board of trustees was pretty much dominated by such people
8:50
And it's been that way ever since. But here we're just talking about the people on the board of trustees. In general, the community simply became more radicalized as the political discourse on the left became more radicalized
9:13
So there was a big change, I would say, during the Bush years, especially over the Iraq war
9:26
And then there was an even bigger change, of course, due to Trump's election
9:32
So that's really when they became, I think, extremely radical. And that was reflected in the choice of sources
9:46
So it was just a couple of years after Trump was in office
9:52
I don't want to interrupt you, but you talked about the community, the Wikipedia community, and how it changed over time, and I find that very interesting. I saw the statistic that nearly 260,000 volunteer editors contribute to Wikipedia every month
10:09
Do you have any insight into who exactly they are and how that process works?
10:13
They're able, of course, to be anonymous. So that has been the rule since the founding of
10:21
Wikipedia and in this it follows much of the rest of the internet. So we don't know who they are in
10:29
that sense. We can make some generalizations about what kinds of people edit. They aren't all
10:36
lefties. And indeed, there's a lot of people who edit Wikipedia on topics that have nothing to do
10:43
with politics, of course. You know, of course, a lot of people edit Wikipedia on topics about music
10:50
and pop culture and, you know, apolitical history topics and geography and on such topics
11:00
it can be quite good. Because I know you know Steven Pruitt, who is the single biggest editor
11:06
Over the years, I believe he's made more than 6 million edits to 33,000 articles
11:11
No matter what the numbers are, they're astronomical. So what do you think of one single person having that much influence or impact?
11:18
Well, he has every right to do it under the system. That's how the system is set up
11:27
Doesn't bother me if that's what you're asking. I think there's nothing really wrong with somebody being able to do that
11:36
If he is, however, a front for an operation, that's another matter. But I'm not accusing him of being one
11:46
I think he might run a lot of bots, and that might be one of the main reasons why he has as many edits as he does
11:54
But I actually don't know. The House Oversight Committee started an investigation into what they call, and these are their words,
12:00
organized efforts undertaken in violation of Wikipedia platform rules to influence U.S. public opinion on important and sensitive topics by manipulating Wikipedia articles
12:11
The chairman, James Comer, wrote a letter to Wikimedia Foundation's CEO requesting documents and information, and they are saying that some volunteer editors have been caught violating platform policies, and they're curious to hear about the CEO's efforts to thwart that
12:28
I wonder what you think of congressional involvement in Wikipedia. It's sad that it's necessary, but at this point, I wouldn't second guess Congress if they think that it is necessary
12:43
It never makes me happy when Congress has to get involved overseeing any part of the Internet
12:53
It's extremely important that the Internet be free. And even if it is a Republican Congress,
13:04
it makes me nervous and it should make us all nervous. That said, I do think the issues
13:13
that they are taking up in that subcommittee hearing are very important,
13:21
and, let's just say, the Wikimedia Foundation, in my opinion, has not done the best job of staying
13:36
on top of all of the problems that are going on on the platform. And I don't know why that is. And
13:44
I think it'll be interesting to see if Congress is able to shed any light on that
13:51
Let me just go back to the 260,000 volunteers real quick, just because I think it's a fascinating insight to hear from you about how the process works. Once one of those volunteers makes an edit, is there a check-and-balance system in place to make sure that what they wrote is accurate?
14:08
Not as such. The edit is immediately available to others to look at, and they will. Generally, the way it works is that if you have made edits to an article, then, unless you tell it otherwise, the software will add that article to your watchlist
14:33
And so if somebody else edits an article that you have edited, you'll be alerted
14:39
And then you'll go and look over the edits that other people have made. It's a sort of informal peer review system in that way
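For readers who want a concrete picture of the watchlist-and-alert flow Sanger describes, here is a minimal sketch. The class and method names are hypothetical illustrations, not MediaWiki's actual code or API.

```python
# Minimal sketch of the watchlist flow described above (hypothetical names,
# not MediaWiki's real implementation): editing an article adds it to the
# editor's watchlist, and later edits by others are surfaced to every watcher.

from collections import defaultdict

class WatchlistDemo:
    def __init__(self):
        self.watchers = defaultdict(set)  # article title -> set of usernames

    def edit(self, user, article, auto_watch=True):
        """Record an edit; by default the article joins the editor's watchlist."""
        if auto_watch:
            self.watchers[article].add(user)
        # Everyone else already watching the article gets alerted to review the edit.
        return [w for w in self.watchers[article] if w != user]

wiki = WatchlistDemo()
wiki.edit("alice", "Neutral point of view")         # alice now watches the page
alerts = wiki.edit("bob", "Neutral point of view")  # alice is alerted to review
print(alerts)  # ['alice']
```

On the real site, the "alert" takes the form of the article appearing in the watcher's watchlist of recent changes, which the watcher then reviews.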
14:48
And in a lot of ways, it works well. If I didn think so I wouldn have started the thing So it is quite amazing that wikis work as well as they do And you know we actually do
15:10
have this process to thank. And of course, the human beings that have have followed it to for
15:18
a lot of the ease of access to information that that we enjoy today. It's just that, unfortunately
15:26
that has been greatly, well, let's just say it hasn't gone well in recent years
15:36
And then finally, let me cover one more topic with you. You point out in your article that
15:41
Wikipedia is the free encyclopedia, but it seems like Elon Musk wants to change that. He recently
15:47
called for people to join xAI to help build Grokipedia, which he says is going to be an
15:53
open source knowledge repository that is vastly better than Wikipedia. How do you feel about Elon Musk trying to build a competitor
16:01
Well, I hope he does a good job. I'm happy about it. I think that's great
16:06
The more encyclopedias in the world there are, the happier I am, I suppose, unless they're
16:17
extremely irresponsible. I do worry, however, that Grokipedia, or whatever it's going to be called,
16:27
will reflect the same sort of biases that Wikipedia has and that the Grok chatbot LLM
16:43
has reflected. So in the last year or so, actually shortly after it rolled out,
16:54
it started drifting to the left. And in the last six months
16:59
if you ask it a question on anything remotely political, anything having to do with the culture war
17:07
it answers from a left-wing perspective. It's pretty annoying unless you actually support that
17:19
What do you make of the use of AI to build something like this? I've anticipated it for quite a while
17:25
For over a couple of years, I've been thinking about what's going to happen
17:31
And I do think that AI can be used to write a decent encyclopedia article
17:38
I've done it myself. I've observed others using it to add articles to encyclopedias
17:48
So I know it can be done and it can be done reasonably well
17:52
When I asked you the original question about Grokipedia and xAI, you said you hope he does it right
17:59
What is your advice to make sure he does it right? Well, there are these prompts that are in the background when you make a query to a chatbot
18:18
So you type in your prompt, but then it's wrapped essentially by another prompt
18:26
and then the whole thing is submitted to the LLM, to the system
18:33
And it's possible to edit that wrapping so that it elicits a more fair and balanced sort of response
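As a rough illustration of the wrapping Sanger describes, here is a minimal Python sketch of how a hidden background (system) prompt can be combined with the user's query before anything reaches the model. The prompt wording and function names are hypothetical, not xAI's or any vendor's actual implementation.

```python
# Illustrative sketch only: a hidden "wrapper" (system) prompt is combined with
# the user's query before the text is sent to a model. The prompt wording and
# names here are hypothetical placeholders.

SYSTEM_WRAPPER = (
    "You are a reference assistant. Present contested topics neutrally, "
    "attribute opinions to the people who hold them, and avoid taking sides."
)

def build_request(user_prompt: str) -> list[dict]:
    # The user's text is wrapped by the background prompt; editing
    # SYSTEM_WRAPPER changes the tone of every downstream response.
    return [
        {"role": "system", "content": SYSTEM_WRAPPER},
        {"role": "user", "content": user_prompt},
    ]

if __name__ == "__main__":
    for message in build_request("Summarize the debate over source reliability ratings."):
        print(message["role"], "->", message["content"])
```

Editing that background wrapper is the lever Sanger is pointing to when he says the behavior of the chatbot, and of an encyclopedia built on it, could be made more balanced.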
18:48
I would like to see that in Grok, and I would like to see that reflected in Grokipedia
18:56
But I'm not sure that we're going to see that. I think Elon Musk and his people know that, of course, it's possible to edit those background prompts
19:13
I forget what they're called, but they clearly are not. They have not...
19:24
Let's just put it this way. They have to know that the system as it exists now has the sort of biased result
19:34
Therefore, I think they must intend it. And if they intend it, that means that they're probably going to come up with a similarly biased encyclopedia
19:48
I hope not. I hope I'm wrong. Well, thank you very much. I really appreciate your time
19:53
I'm Ray Bogan for Straight Arrow News. For more reporting straight from our nation's capital, download the SAN app