Why would anyone want an editor that doesn’t fact check?
tbh i somehow didn't even realize that wikipedia is one of the few super popular sites not trying to shove ai down my throat every 5 seconds
i’m grateful now
Don’t count your chickens before they hatch: Jimmy Wales founded Wikipedia, and according to this article he has already used ChatGPT in a review process once.
damn T_T
To all our readers on Lemmy,
Please don’t scroll past this. This Friday, for the 1st time recently, we interrupt your reading to humbly ask you to support Wikipedia’s independence. Only 2% of our readers give. Many think they’ll give later, but then forget. If you donate just £2, or whatever you can this Friday, Wikipedia could keep thriving for years. We don’t run ads, and we never have. We rely on our readers for support. We serve millions of people, but we run on a fraction of what other top sites spend. Wikipedia is special. It is like a library or a public park where we can all go to learn. We ask you, humbly: please don’t scroll away. If Wikipedia has given you £2 worth of knowledge this year, take a minute to donate. Show the world that access to neutral information matters to you. Thank you.
I will stop donating to Wikipedia if they use AI
Wikipedia already has a decade’s worth of operating costs in savings.
No they don’t because they blast it on inflated exec wages.
Why don’t they blast the execs and reduce the expenses?
Just got back from asking them. They said they like cash moneys and don’t like blasting themselves.
This is such a tiresome aspect of society. Even if you believe in executives, they certainly don’t need to get paid more than anyone else.
What’s funny is that for enormous systems with network effects we’re trying to use mechanisms intended for smaller businesses, like a hot dog kiosk.
IRL we have a thing for those, it’s called democracy.
On the Internet it’s either anarchy or monarchy, sometimes bureaucratic dictatorship, but in that area even Soviet-style collegial rule hasn’t appeared yet.
I recently read that McPherson article about Unix and racism, and how our whole perception of correct computing (modularity, encapsulation, object-orientation, even the whole KISS philosophy) is based on the social changes of that time and the reaction to them. I mean, the real world is continuous and you can quantize it into discrete elements in many ways. Some unfit for your task. All unfit for some task.
So - first, I like the Usenet model.
Second, cryptography is good.
Third, cryptographic ownership of a limited resource is … fine, blockchains are maybe not so stupid. But it’s not really necessary, because one can choose between a few retrieved versions of the same article based on a web of trust or whatever else. No need to have only one right version.
Fourth, we already have a way to turn a sequence of interdependent actions into state information; it’s called a filesystem.
Fifth, Unix with its hierarchies is really not the only thing in existence, there’s BTRON, and even BeOS had a tagged filesystem.
Sixth, interop and transparency are possible with cryptography.
Seventh, all of this also applies to a hypothetical service over a global network.
Eighth, of course, the global network doesn’t have to be globally visible/addressable to spread data globally, so even the Internet itself isn’t as necessary as the actual connectivity over which those change messages will propagate where needed and synchronize.
Ninth, for Wikipedia you don’t need as much storage as for, say, the Internet Archive.
And tenth: with all of this one could make a Wikipedia-like decentralized system with democratic governance, based on rather primitive principles, apart from, of course, the cryptography involved.
(Yes, Briar impressed me.)
EDIT: Oh, about democracy: I mean technical democracy. That an event (any change) isn’t valid unless it’s processed correctly, for example signed by people eligible to sign it; they’re made eligible by a signed appointment, and those who sign the appointment are in turn made eligible by a democratic process (signed by a majority of some body, itself signed). That’s the blockchain democracy people dreamed of at some point. Maybe it’s not a scam. It just hasn’t been done yet.
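To make that chain of signed appointments a bit more concrete, here's a minimal sketch in Python using ed25519 signatures from the `cryptography` package. The founding body, the majority threshold, and the message formats are all hypothetical illustrations of the idea above, not an existing protocol and certainly not anything Wikipedia uses.

```python
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

def verify(pub, signature: bytes, message: bytes) -> bool:
    """True if `signature` over `message` checks out against `pub`."""
    try:
        pub.verify(signature, message)
        return True
    except InvalidSignature:
        return False

def raw(pub) -> bytes:
    return pub.public_bytes(serialization.Encoding.Raw, serialization.PublicFormat.Raw)

# A founding body of three keys; appointments count only with a majority of signatures.
founders = [ed25519.Ed25519PrivateKey.generate() for _ in range(3)]
founder_pubs = [k.public_key() for k in founders]

# Appointing an editor: founders sign the editor's public key (2 of 3 here).
editor = ed25519.Ed25519PrivateKey.generate()
appointment_msg = b"appoint-editor:" + raw(editor.public_key())
appointment_sigs = [f.sign(appointment_msg) for f in founders[:2]]

def appointment_is_valid(message: bytes, sigs: list[bytes]) -> bool:
    """Count how many distinct founders signed; require a strict majority."""
    signed = sum(
        1 for pub in founder_pubs if any(verify(pub, s, message) for s in sigs)
    )
    return signed > len(founder_pubs) // 2

# An edit event is only accepted if it is signed by a validly appointed editor.
edit = b"replace paragraph 3 of article X with ..."
edit_sig = editor.sign(edit)

accepted = appointment_is_valid(appointment_msg, appointment_sigs) and verify(
    editor.public_key(), edit_sig, edit
)
print("edit accepted:", accepted)  # True with 2-of-3 signatures on the appointment
```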
How do you prevent Sybil attacks without making it overly expensive to vote?
How do you mount a Sybil attack against a system where the initial creator signs the initial voters, and then they collectively sign elections, acceptance of new members and all such stuff?
Doesn’t seem to be a problem for a system with authorized voters.
Flood them with AI-generated applicants.
So why would they accept said AI-generated applicants?
If we’re making a global system, then confirmation using some nation’s ID can be done, with fakes removed as they’re found out later. Like with IRL nation states. Or “bring a friend and be responsible if they turn out to be a fake”. Or both at the same time.
Would every participant get to see my government-issued ID?
One can elect a small group which will see it, and which will sign its connection to something intermediate. Then only they will.
jimmy wales is also the president and co-founder of fandom
to give you an idea of who that guy is
Oh gross
Why?
Fandom (previously Wikia) is an extremely shitty service with low-quality wikis mostly consisting of content copied from independent wikis and a terrible layout that only exists to amplify their overwhelming advertising.
While this is true, the majority of the wikis are not at all low quality. Some are the only ones existing for a topic. The wikis are community-based, after all.
But it’s easy to vandalize and is highly profit-driven. The Fandom wikis are filled with ads that absolutely destroy navigation. Infamous is the video ad that automatically scrolls you back up in the middle of reading once it finishes; you have to pause it to read the article without interruption.
my one weird trick for using fandom.com is to disable javascript for that domain.
What if they put Anubis on it?
Anubis only does a proof of work challenge if you lack a specific cookie that it gives you. You can temporarily enable JavaScript, pass the challenge, get the cookie, then disable JavaScript.
I use uBlock Origin, btw, to make selectively enabling/disabling JavaScript per domain a simple two-click task.
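For anyone curious what that challenge/cookie dance actually looks like, here's a generic sketch of a cookie-gated proof-of-work check. This is not Anubis's actual code; the difficulty, token format, and flow are made up purely to illustrate the idea described above.

```python
# Generic cookie-gated proof-of-work sketch (NOT Anubis's real implementation).
import hashlib
import hmac
import os
import secrets

SERVER_SECRET = os.urandom(32)   # used to sign the "you passed" cookie
DIFFICULTY_BITS = 16             # hypothetical difficulty

def make_challenge() -> str:
    return secrets.token_hex(16)

def solve_challenge(challenge: str) -> int:
    """What the client-side JavaScript would do: brute-force a nonce."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") >> (256 - DIFFICULTY_BITS) == 0:
            return nonce
        nonce += 1

def verify_solution(challenge: str, nonce: int) -> bool:
    digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") >> (256 - DIFFICULTY_BITS) == 0

def issue_cookie(challenge: str) -> str:
    """Signed token the server sets once the challenge is passed."""
    return hmac.new(SERVER_SECRET, challenge.encode(), hashlib.sha256).hexdigest()

def cookie_is_valid(challenge: str, cookie: str) -> bool:
    expected = hmac.new(SERVER_SECRET, challenge.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, cookie)

# Flow: no cookie -> solve the challenge once with JS enabled -> keep the cookie,
# then later requests (even with JS disabled) just pass the cookie check.
challenge = make_challenge()
nonce = solve_challenge(challenge)
assert verify_solution(challenge, nonce)
cookie = issue_cookie(challenge)
print("cookie valid:", cookie_is_valid(challenge, cookie))
```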
they captured the “niche wiki” market as wikia, then rebranded and started serving shittons of ads. the vim wiki is unusable these days because it runs like ass and looks like a gamer rgb nightmare
There’s an addon for that, Indie Wiki Buddy.
It tries to redirect you to non-Fandom/Fextralife wikis if they exist, and if not, it proxies Fandom wikis through BreezeWiki, which just displays the content. And I’ll take this opportunity to plug Hohser and the uBlock AI blocklist as well.
I mean, the Wikipedia page does say it was sold in 2018. Not sure how it was before but it’s not surprising that it enshittified by now.
Obligatory plug for BreezeWiki. Makes that shit actually usable.
Oh yeah, that website’s pretty great. It has really in-depth wikis about games, like https://fallout.fandom.com/wiki/Caesar’s_Legion
So I guess you mean that Wales guy is pretty great then
Oh, you mean the https://fallout.wiki/wiki/Caesar's_Legion ?
Yup, Fallout Wiki has a pretty crazy history. I don’t remember if they were originally a Fandom wiki, but at some point they definitely went “well, we don’t want to go with Fandom, we’ll go with the Curse wiki host instead.” Then Fandom bought the Curse wikis and put all of them under the Fandom banner anyway.
The independent Fallout Wiki is basically where the actual community is right now; the Fandom wiki is just there to confuse passers-by with its high search engine rank. Fandom has a policy that the community can fork a wiki and go elsewhere, but they will not close down the Fandom wiki, so good luck with your search rankings.
Many game communities have opted for the “unbridled vandalism” strategy to push people away from fandom. Just replace all the articles with plausible lies.
The “fandom” one is much more complete?
I mean, they’re both pretty great. From the search engine, if I wanted to know about an in-game faction, I’d just pick whichever appeared first, and it’d be fine either way. So why would “Chloé 🥕@lemmy.blahaj.zone” think they can just point at it and imagine any random people would even know what she means by “who that guy is”, just because he’s associated with that wiki? And why would my innocuous comment trigger the nerds into such a unanimously negative response?
The user content on fandom is generally pretty good, at least for the wikis I frequent. It’s everything else about the site which is awful – the pop-ups, the completely irrelevant auto-playing videos, how it’s constantly trying to shove other fandom wikis into your attention.
I’m sure the site is improved with userscripts and such, and I am already using adblock, but it’s pretty unforgivable IMO.
He can also stick AI inside his own ass
Christ, I miss when I could click on an article and not be asked to sign up for it.
Oh, right! Thanks for reminding me. I tried to archive it the last time but it took forever.
Edit: There ya go: https://archive.is/oWcIr
You know, I remember way back in the day when…
# Interested in reading the rest of this comment?
Please sign up with your name, DOB, banking information, list of valuables, times you’re away from home, and an outline of your house key to “Yaztromo@lemmy.world”. It’s quick, easy, and fun!
…and that’s why I’m no longer welcome in New Zealand. Crazy!
Since I have adblock mostly because of the abuse of trackers, I do understand people trying to monetize their work.
Journalists monetizing their work is totally reasonable. The problem for me is that it seems unfair to ask that literally everyone trying to read an article have to sign up. Maybe I’m missing something.
Not sure about Wikipedia, but Conservapedia would find it very useful. In fact, since most of their entries are factually incorrect and read like fantasy, I think AI writing the articles would save them a lot of time.
Bonus: hallucinations can help create new conspiracy theories!
He’s a nobody to Wikipedia now. He also failed to create a news site and a micro social network.
if jimmy wales puts ai in wikipedia i stg imma scream
The editor community rejected the idea so overwhelmingly that Wikipedia paused the planned experiment in June, hopefully for good.
Fuck AI
Important context: he’s not suggesting AIs writing content for Wikipedia. He’s suggesting using AI to provide feedback for new editors. Take that how you will.
Right, which makes it just as bad. Wikipedia has enough proofreaders. You don’t need AI for that, because the need is already filled.
This is entirely different from a book writer who is doing everything solo and has exactly one publishing window.
And writing-feedback software has existed for decades, so AI adds nothing new. Again, it is snake oil. It is always snake oil. Except when it’s bait and switch, to pretend it wasn’t snake oil in the first place.
So I fed the page to ChatGPT to ask for advice. And I got what seems to me to be pretty good. And so I’m wondering if we might start to think about how a tool like AFCH might be improved so that instead of a generic template, a new editor gets actual advice. It would be better, obviously, if we had lovingly crafted human responses to every situation like this, but we all know that the volunteers who are dealing with a high volume of various situations can’t reasonably have time to do it. The templates are helpful - an AI-written note could be even more helpful.
This actually sounds like a plausibly decent use for an LLM. An initial review pass to take some of the load off the human review process isn’t a bad idea - he isn’t advocating for AI to write articles, just saying that it can be useful for copy-editing and potentially supplement a system already heavy in Go/No Go evaluations.
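For what it's worth, the kind of tool he's describing is a thin wrapper around an LLM call. Here's a rough sketch of what "AFCH plus an AI-written note" could look like, using the OpenAI Python client; the decline code, prompt, model choice, and file name are all hypothetical placeholders, and this is not how AFCH actually works today.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

DECLINE_REASON = "no evidence of notability"  # hypothetical decline reason

PROMPT = (
    "You are helping a brand-new Wikipedia editor whose Articles-for-Creation draft "
    "was declined for the reason '{reason}'. Based on the draft below, explain in a "
    "friendly tone what specifically needs to improve (sourcing, tone, notability), "
    "citing concrete sentences from the draft. Do not invent sources.\n\nDRAFT:\n{draft}"
)

def draft_feedback(draft_text: str, reason: str = DECLINE_REASON) -> str:
    """Return model-written feedback to append below the standard decline template."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=[{"role": "user", "content": PROMPT.format(reason=reason, draft=draft_text)}],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    # Hypothetical file holding the declined draft's wikitext.
    with open("declined_draft.txt") as f:
        print(draft_feedback(f.read()))
```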
Which is weird, really. Jimmy Wales is just fucking awful. I didn’t realize he was anatomically capable of not talking out of his ass.
Honestly, translating the good articles from other languages would improve Wikipedia immensely.
For example, the Nanjing dialect article is pretty bare in English and very detailed in Mandarin.
You can do that, that’s fine. As long as you can verify it is an accurate translation, which means you need to know the subject matter and the target language.
But you could probably also have used Google Translate and then just fine-tuned the output yourself. Anyone could have done that at any point in the last 10 years.
Google Translate is horrendously bad at Korean, especially with slang and accidental typos. Like, nonsense bad.
Same in Hungarian: machine translation still often gives hilariously bad results. It’s especially prone to mixing up formal and informal ‘you’ within the same paragraph, something humans never do. At least it makes it easy to tell when a website is one of those ‘auto-translated into 30 languages’ content mills.
As long as you can verify it is an accurate translation
Unless the process has changed in the last decade, article translations are a multi-step process that includes translators and proof-readers. It’s easier to get volunteer proof-readers than volunteer translators. Adding AI for the translation step but keeping the proof-reading step should be a great help.
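To make that concrete, here's a minimal sketch (in Python) of the kind of pipeline this describes: the machine-translation step produces a draft, and nothing counts as publishable until a human proof-reader signs off. The `machine_translate` stub and the data model are hypothetical placeholders, not Wikipedia's actual translation tooling.

```python
# Machine translation drafts a version; a human proof-reader must sign off before publishing.
from dataclasses import dataclass, field

def machine_translate(paragraph: str, source_lang: str, target_lang: str) -> str:
    # Placeholder: call your MT system of choice here (Google Translate, an LLM, ...).
    return f"[{source_lang}->{target_lang} draft] {paragraph}"

@dataclass
class TranslationDraft:
    source_paragraphs: list[str]
    draft_paragraphs: list[str] = field(default_factory=list)
    proofread_by: str | None = None       # stays None until a human signs off

    @property
    def publishable(self) -> bool:
        return self.proofread_by is not None

def start_draft(paragraphs: list[str], source_lang: str, target_lang: str) -> TranslationDraft:
    draft = TranslationDraft(source_paragraphs=paragraphs)
    draft.draft_paragraphs = [
        machine_translate(p, source_lang, target_lang) for p in paragraphs
    ]
    return draft

def sign_off(draft: TranslationDraft, corrected: list[str], proofreader: str) -> None:
    """The human proof-reading step: corrected text replaces the machine draft."""
    draft.draft_paragraphs = corrected
    draft.proofread_by = proofreader

article = start_draft(["南京话是江淮官话的一种。"], source_lang="zh", target_lang="en")
print(article.publishable)  # False: machine draft only, not publishable yet
sign_off(article, ["Nanjing dialect is a variety of Lower Yangtze Mandarin."], "volunteer-proofreader")
print(article.publishable)  # True: a human has reviewed the translation
```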
But you could probably also have used Google Translate and then just fine-tuned the output yourself. Anyone could have done that at any point in the last 10 years.
Have you ever used Google Translate? Putting an entire Wikipedia article through it and then “fine-tuning” it would be more work than translating it from scratch. There’s absolutely no comparison between Google Translate and AI translations.
Putting an entire Wikipedia article through it and then “fine-tuning” it would be more work than translating it from scratch.
That depends on whether you’re capable of translating the language. If you don’t know the language, then the translator will give you a good start.
If you don’t know the language then you shouldn’t be involved in the translation at all… The current process requires both the translators and the proof-readers to know the language.
I recently edited a small wiki page that was obviously written by someone who wasn’t proficient in English. I used AI just to reword what was already written, and then I edited the output myself. It did a pretty good job. It was a page about some B-list Indonesian actress that I just stumbled upon; I didn’t want to put much time and effort into it, but the page really needed work done.
This is the goal of Abstract Wikipedia. https://meta.wikimedia.org/wiki/Special:MyLanguage/Abstract_Wikipedia
Wikipedia’s translation tool for porting articles between languages currently uses Google Translate, so I could see an LLM being an improvement, but LLMs are also way, way costlier than normal translation models like Google Translate. Would it be worth it? And would better LLM translations make editors less likely to reword the translation to improve its tone?
You can use an LLM to reword the translation to make the tone better. It’s literally what LLMs are designed to do.
WikipedAI
Sit down Jimmy. Wikipedia has enough problems already, it doesn’t need more to be added by AI.