Wikipedia:Wikipedia Signpost/Single/2024-06-08

The Signpost
Single-page Edition
WP:POST/1
8 June 2024

Deletion report
The lore of Kalloor
Featured content
We didn't start the wiki
 

File:Foundation Form 990 FY 2022-2023 cover image.png (Wikimedia Foundation, CC BY-SA 4.0)

Wikimedia Foundation publishes its Form 990 for fiscal year 2022–2023

Form 990: WMF now holding quarter-billion dollars

Black and white Wikimedia Foundation logo

The Form 990 is a United States Internal Revenue Service document that provides the public with financial information about a nonprofit organization. It is often the only source of such information. The Wikimedia Foundation recently published its Form 990 for the 2022–2023 financial year, along with an FAQ on Meta and a public-facing blog post. Here is a very brief summary of some key points:

  • Total revenue was $180.2M ($173.4M in donations and grants, $3M in investment income, and $3.8M of "other revenue", such as funds coming from Wikimedia Enterprise), up from $167.9M.
  • Total expenditure was $168.3M, up from $145.8M.
    • According to the FAQ, 43% of this went to technology, 33% went to support volunteers, 11% went to fundraising expenses, and 13% went to general and administrative support.

Net assets at the end of the year (not including the $120M Wikimedia Endowment) were $254,971,336 (up from $239,351,532).

"Direct support to volunteers"?

The WMF blog post and the FAQ further mention that –

Our second largest expense category is direct support to volunteers. A third of our expenses went to support volunteers totalling $56.1M, of which $24.7M is given as grants to community groups for their work towards the Wikimedia mission.

Careful readers will note that this says nothing about what the remaining $31.4M was spent on. The paragraph continues:

You can find out more about the grants in this fiscal year in the Wikimedia Foundation Funding Report.

This sounded promising. But the linked Wikimedia Foundation Funding Report covers "grants to mission-aligned organizations and people around the world, totaling $17,512,472". Disappointingly, the figure of $24.7M is nowhere to be found on the page.

As we were drafting this article – in public, as we usually do – the WMF added an explanation to the FAQ on 30 May 2024 that the $24.7M additionally includes –

This is welcome information. However, the $17.5M that is accounted for in the Wikimedia Foundation Funding Report already includes $1M for the Knowledge Equity Fund. According to this, then, there were two separate Knowledge Equity grants of $1M each. Right?

And apart from the Wikimania scholarships, none of the items mentioned – support for Wikidata, the Endowment and the Knowledge Equity Fund, which gives grants to various NGOs unrelated to the Wikimedia projects (see previous Signpost coverage) – would seem to benefit volunteers directly. So why are they included under "direct support to volunteers"? Moreover, whichever way one cuts it, we were still well over $30M short of the claimed $56.1M volunteer support total.

But the edit made to the FAQ on 30 May addressed that too. It added the following new text to the FAQ:

The Foundation spent $56.1M on support for volunteers in FY 22–23, of which $24.7M was given as grants to community groups for their work towards the Wikimedia mission (line 4b in Part III). The remaining spend of $31.4M funded volunteer support work led by Wikimedia Foundation teams, including legal support, trust and safety support, community programs like GLAM and education work, partnerships, public policy work, human rights work, community research such as the Community Insights survey, and communications. Through these activities, the Foundation aims to support, protect and advocate for volunteers, and expand the impact and reach of the Wikimedia projects.

The information added to the FAQ does improve the page. But it still seems odd that the $56.1M spent on "direct support to volunteers" should include $7.5M in grants given to Wikidata, the Wikimedia Endowment and the Knowledge Equity Fund. One gets the sense that these may be "broad strokes" data published for PR purposes, rather than rigorously sourced figures.

Salaries and severance payments

The WMF's salary costs were up from $88.1M to $96.8M. Page 9 tells us that 252 individuals (up from 233) received more than $100,000 in reportable compensation. Page 53 of the Form tells us that there were no big raises for Wikimedia executives in 2022–2023. In fact, some of the top earners' total compensation shrank slightly, with increases in base salaries cancelled out by the absence of bonuses and incentives (cf. previous year). Ten executives had total compensation above $300K; figures for the top five on page 53, column E, are:

  • Maryana Iskander, CEO: $534,468
  • Jaime Villagomez, CFO & Treasurer: $418,040
  • Lisa Seitz-Gruwell, CAO and Deputy CEO: $411,984
  • Amanda Keton, General Counsel & Sec (through 23 February): $411,682
  • Robyn Arville, Chief T&C Officer: $384,200

Severance payments (page 54) were:

  • $248,228 for Robyn Arville
  • $111,383 for Anthony Negrin
  • $94,738 for Carol Dunn

Below, for your reference, is the announcement the WMF's Elena Lappen submitted to us for publication in The Signpost. – AK

WMF announcement

Expense breakdowns for the 2022–2023 fiscal year

The Wikimedia Foundation released its Form 990 for fiscal year July 2022 – June 2023 on 14 May. It is now available on the Foundation website, along with FAQs available on Meta. The Form 990 is the annual tax form required of all nonprofits in the United States. It contains disclosures about an organization’s finances, governance practices, activities and more. The highlights of the Foundation's 2022–2023 Form 990 show: high ratings for governance policies; information about leadership transitions; revenue numbers supported mostly by donations; slow growth in expenses driven primarily by an increase in the grants budget and in personnel costs; and expense breakdowns that aligned with the 2022–2023 Annual Plan goals. More details about these highlights are available in a summary post on Diff. Questions and comments can be left on the FAQ talk page.

Wikimedia Movement Charter ratification begins

The Signpost encourages all individual Wikimedia editors and all Wikimedia movement affiliates to express their support or rejection of the Movement Charter in the vote scheduled to take place from 25 June till 9 July.

The Movement Charter would guide many governance decisions in the Wikimedia Movement by establishing a Global Council of Wikimedia community volunteer representatives who would make funding decisions, such as overseeing the money described above. Wikimedia Deutschland's vision for the Charter is that it should empower the "Global Council as the highest decision making body" in the Wikimedia Movement, even over the Wikimedia Foundation. Wikimedia Deutschland has complained that the current draft does not do this and instead positions the Global Council as an advisory body without decision-making authority. Attendees at the Wikimedia Summit rejected the current Movement Charter draft pending the correction of a series of deal-breaking problems: the charter failed to provide resources for the Wikimedia community to maintain its own directly managed staff to access Charter rights; failed to grant the Wikimedia community the power to demand Wikimedia Movement financial transparency; failed to make the Global Council accountable to the Wikimedia community; failed to clarify the division of powers between the Council and the Wikimedia Foundation; and had a series of other shortcomings, ranked by severity. The Wikimedia Foundation itself rejected the draft with its own recommendations for improvement.

Despite these challenges, the Movement Charter Drafting Committee is optimistic that it can revise the rejected draft to make it acceptable by 18 June, in time for the ratification vote to run for two weeks starting 25 June. The Charter Electoral Commission will be running two elections: one asking individual Wikimedia editors to vote, and one asking each registered Wikimedia organization to cast votes representing that organization. The charter will be ratified if both the individual and the affiliate votes pass with more than 50% approval.

Movement Charter Drafting Committee member Risker says, "We are now doing final work on the Charter that will be the subject of the ratification vote. One should always keep in mind that, given there are around 70,000 potential voters, the number of individuals who have commented and provided feedback is actually very, very small. Without a much more major push for participation, we really don't know what the broad level of the community thinks about the charter. We've tried a range of activities to get the broader community involved, but I think it's been known for years that many (most?) Wikimedians only comment on proposed significant changes when there's likely to be a direct impact on them; to this point, I'm sure many folks have believed this was largely a theoretical discussion with little or no effect on them. We will have a much better sense of where the broad community, the affiliate community, and the Board of Trustees stands after the ratification vote. If the vote succeeds, then we have a charter. If it does not....well, we will also have a metric tonne more comments, including a lot from individual Wikimedians. We will all have a result by Wikimania, at which point next steps can be discussed."

Again, do your civic duty and VOTE. Get out the vote and remind your wiki colleagues to vote. Comment on the charter when it is published. The Wikimedia Foundation, your fellow wiki editors, The Signpost, Wikimedia affiliate organizations, Wikimedia's readers, and Wikimedia donors all want you to protect Wikimedia values by participating in governance. BR

The New York Times, NPR and Reuters block Wikipedia editors from citing their articles

The "Cite" button in VisualEditor

A bug report filed on April 12 (phab:T362379) revealed that "several major news websites (NYT, NPR, Reuters...) block citoid", the service underlying the "Add a citation" tool in Wikipedia's VisualEditor. The tool retrieves metadata (such as author, title and publication date) from the cited site to generate a reference, and fails with a "we couldn't make a citation for you" message in those cases.
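To illustrate what the tool does behind the scenes, here is a minimal sketch of querying Citoid directly through the public Wikimedia REST API citation endpoint. This is not VisualEditor's own code; the endpoint path is the documented public one, but the target URL below is a placeholder rather than one of the articles tested in the bug report, and Node 18+ is assumed for the built-in fetch.

```typescript
// Minimal sketch: ask Citoid (via the public Wikimedia REST API) for citation
// metadata for a given URL. The target URL passed in below is a placeholder,
// not one of the articles from the Phabricator report. Assumes Node 18+ so
// that fetch is available globally.
async function tryCitoid(target: string): Promise<void> {
  const endpoint =
    "https://en.wikipedia.org/api/rest_v1/data/citation/mediawiki/" +
    encodeURIComponent(target);

  const response = await fetch(endpoint, {
    headers: { accept: "application/json" },
  });

  if (response.ok) {
    // On success the service returns an array of citation objects (title,
    // author, date, ...) that VisualEditor turns into a filled-in reference.
    const citations = (await response.json()) as Array<Record<string, unknown>>;
    console.log("Citation metadata:", citations[0]);
  } else {
    // If the upstream site blocks Citoid's request, the service returns an
    // error status and the editor sees "we couldn't make a citation for you".
    console.error(`No citation generated (HTTP ${response.status})`);
  }
}

tryCitoid("https://example.com/some-news-article").catch(console.error);
```

Run against a publisher that blocks Citoid, a request like this fails with an error status, which is the same failure patrollers and editors see surfaced in the editor; against an unblocked site it returns the metadata used to fill in the citation template.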

As explained by the Wikimedia Foundation's Citoid expert Marielle Volz:

The NYTimes has been blocking us for a while, it briefly worked when we changed datacenters and ergo IP, but they've understandably reblocked us after a few weeks' reprieve!

[...]

This is partly a consequence of the fact that over the last few years our traffic has increased a lot, we didn't used to trigger IP blocks as often.

Sam Walton (who, as product manager for The Wikipedia Library, maintains contact with various publishers including Elsevier, whose ScienceDirect database appears affected too) confirmed that

We've been explicitly told by at least one organisation that the block is deliberate unfortunately, due to concerns with the volume of traffic. I agree that there are convincing reasons for them not to block us and that it is ultimately in their best interests – we'll just have to see how these conversations go.

However, other comments questioned whether the rate of citations requested (which on average amounts to only a few per second overall, according to one commenter) could really be the root cause of overloading the APIs of external sites, and further discussion veered into an investigation of whether Citoid itself might be generating an excessive amount of traffic, possibly due to issues with the Zotero service that it relies on.

At the time of writing, the tool worked again for a New York Times article, but was still failing for examples from NPR and Reuters. – H

U4C election results in no quorum

Results were announced on May 31 by the chair of the Elections Committee for the first elections for the Universal Code of Conduct Coordinating Committee (U4C). Only seven of the sixteen seats were filled. Thirty other candidates did not receive the required 60% support from the voters. The U4C serves "as final recourse in the case of systemic failures by local groups to enforce the UCoC."

Eight members of the committee are required to form a quorum to vote or to make any decision, though the committee may still conduct discussions. The only exception is that it may call a special election, and set its scope, to seat additional members. 0xDeadbeef, one of the newly elected members, states that so far "there doesn't appear to be any progress" in finding a way to secure a quorum. Another newly elected member, Ghilt, adds that changes to the rules are contemplated: "It is obvious that the current set of rules is partially dysfunctional."

Three candidates were elected to fill regional seats:

  • Ghilt (Northern and Western Europe) – home wiki German
  • Ibrahim.ID (Middle East and Northern Africa) – home wiki Arabic
  • 0xDeadbeef (East, South East Asia and Pacific) – home wiki English

Four candidates were elected to fill community-at-large seats:

The Signpost thanks all candidates, the election committee, and voters for their participation. – S

WikiConference in Indiana, 4–6 October 2024

Wikimedians of Indiana User Group, other Hoosier Wikimedians, and the WikiConference North America team invite the world to WikiConference North America in Indianapolis, Indiana on 4–6 October. Scholarships were available to attendees in North America until 31 May. Submissions are currently being sought, with no posted deadline, but the sooner the better. – BR

Brief notes

  • A group from Wikimedia Germany travelled to Prague for the Wikimedia Europe General Assembly in June, where the next goals for Wikimedia Europe were set.
  • Annual reports: Wikimedia Deutschland, Wikimedia España, Wikimedians of Slovakia.
  • New administrators: There are no new administrators to report. Two requests for adminship (RfAs) in the waning days of May were closed as unsuccessful, and another one that began on 31 May is still running. The last successful RfA was in February — one of just four successful RfAs this year — making 2024 one of the four least productive January–May periods on record. Only 2018 and 2023 were less productive, according to the monthly charts. The list of active administrators stood at 435 as of 2 June, equal to the record low reported in the last issue of The Signpost.
  • WikiCup: In the annual WikiCup, editors compete to improve and create content, earning points for various types of work. The competition is divided into multiple rounds, each progressively narrowing down the field of participants. We are now in Round 3, and the leaders are currently Generalissima (752 points), AryKun (415 points), Skyshifter (403 points), OlifanofmrTennant (261 points), and Hey man im josh (235 points). Across all contestants, a total of 3882 points have been scored in this round.
  • Articles for Improvement: This week's Article for Improvement is Sun Management Center, followed next week by State of emergency. Please be bold in helping improve these articles!



Reader comments

File:Triage 041105 big.jpg (Paramedics Worldwide, CC BY-SA 3.0 NL)

New Page Patrol receives a much-needed software upgrade

Following a widely supported community campaign which kicked off in 2022, technical updates and a new user interface have been deployed for the New Pages Patrol PageTriage software. Sam Walton, Product Manager for the Moderator Tools team at the Wikimedia Foundation (WMF), and Novem Linguae, English Wikipedia administrator, volunteer developer, and lead NPP coordinator, explain how collaboration between the WMF and volunteer developers was key to the success of this project.

What is New Page Patrol? What software do they use?

The English Wikipedia's new page patrollers aim to review every newly created article produced by newer editors. To help tackle this large workload, specialised software (called PageTriage) was deployed in 2012 to help patrollers navigate and take action on the new page backlog. This software powers Special:NewPagesFeed and the Page Curation toolbar. Becoming the default mechanism for patrollers to review new pages, the software grew over the years to incorporate a wide array of moderation processes, including adding maintenance tags, sending WikiLove, and making the various forms of deletion nominations. In 2018 and 2019, the WMF Growth and Community Tech teams worked to improve the extension by incorporating ORES scores, integrating Articles for Creation into the software, and adding a variety of requested features.

A call for help

After the burst of work on PageTriage in 2019, neither WMF nor volunteer developers spent focused time on the extension for a number of years, and issues began to arise. Workflows such as PROD tagging were broken, and AfD tagging had a tricky bug that caused it to fail regularly.

As a result of these mounting issues, NPP coordinator MB wrote an open letter to the WMF in July 2022 as a call for help. The open letter asked the WMF to direct resources towards PageTriage, with more time spent fixing bugs and developing new features in the software. The English Wikipedia community rallied to support the open letter, which gathered 444 signatures, making it one of very few English Wikipedia pages to achieve 400+ signatories. There was a watchlist notice for it, there were emails with WMF staff, and Atsme and Novem Linguae each attended a "Conversation with the Trustees" with the Community Affairs Committee of the WMF's Board of Trustees.

Following this widespread support, the WMF's relatively new Chief Product and Technology Officer, Selena Deckelmann, joined in conversations with patrollers. The WMF shifted some resources from improving mobile web (which was in the Foundation's 2022–2023 annual plan), and designated the Moderator Tools team to begin exploring PageTriage's issues. The Growth Team also made contact with the NPP team, and three video meetings were arranged to brainstorm improvements to the new article creation process (such as the article wizard).

The open letter, along with other community discussions over the past year, also encouraged the WMF to rethink how it receives community input on technical requests. The ongoing Future of the Wishlist planning and discussions factored in lessons learned from the NPP open letter and the PageTriage software improvement process.

Help arrives!

The WMF's Moderator Tools product team began work internally to understand the problem space: How was PageTriage being used? What problems did it have? Why wasn't it being actively maintained? One of the big findings from these discussions was that the extension was in a poor technical state: it was built a decade ago, and since then features had been added as needed, but without a coherent strategy or consistent technical choices. Tackling technical debt had not been prioritized during that decade, and many of the technologies used in the software were by this point out of date or unique to this extension. It is not good for software in a large ecosystem such as MediaWiki to rely on unique technology, because other MediaWiki developers are less likely to know how to support it. This made it very difficult for staff or volunteers to make improvements or changes, and made piling more new features onto the extension a particularly unappealing prospect.

In January 2023, the team announced that it would be spending at least 3 months working on the PageTriage extension. The work would be focused primarily on technical modernisation, with a view to maintaining the toolset in the longer term, but the details were left open for the team to figure out over the coming months.

Over the course of these months, which spanned well beyond the original 3 months allotted, two collaborative processes took place to figure out the priorities. Patrollers were interviewed to ensure that the Foundation understood how the software was being used, and to identify any high priority workflow-breaking bugs which might need attention. At the same time, the team met with volunteer developers to jointly prioritise the various ways in which the codebase might be improved technically.

It's fair to say that both the WMF and volunteers approached this collaboration warily. WMF staff were faced with an overwhelming set of bugs and a long list of requested features. On the community side, some NPP coordinators and volunteer developers had little experience working directly with WMF staff, and were concerned that the WMF would want to change the software's appearance too much, possibly disrupting patroller workflows.

Over the following months, what ensued was a very productive collaboration between the Moderator Tools team's engineers – Jason Sherman and Susana Cardenas Molinar – and volunteer developers, including Novem Linguae, MPGuy2824, TheresNoTime, DannyS712, Sohom Datta, and Chlod. We were able to agree on ways forward that preserved the existing behaviour of PageTriage while making substantial improvements to the underlying codebase, including replacing old JavaScript frameworks with modern Vue.js, updating deprecated code, and improving test coverage. Weekly office hours and active discussions on Discord meant that WMF and volunteer developers collaborated closely on defining and solving problems, feeling like one coherent team. New volunteers became involved, and Sohom Datta updated Special:NewPagesFeed to use Codex components, giving it a much-needed visual facelift! More information on the work achieved during this project can be found in the project's final update.
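By way of illustration only, and not actual PageTriage source code, the sketch below shows the kind of change such a migration involves: a hypothetical "mark as reviewed" button written as a Vue 3 component with the Composition API, the sort of declarative structure that replaced the older imperative JavaScript. All names here are invented for the example.

```typescript
// Illustrative only; not actual PageTriage source. A hypothetical "mark as
// reviewed" button written as a Vue 3 component with the Composition API.
import { defineComponent, h, ref } from "vue";

export default defineComponent({
  name: "ReviewButton",
  props: {
    pageId: { type: Number, required: true },
  },
  setup(props) {
    const reviewed = ref(false);

    async function markAsReviewed(): Promise<void> {
      // Stand-in for the HTTP request the real extension would send through
      // MediaWiki's action API; omitted here to keep the sketch self-contained.
      reviewed.value = true;
    }

    // Render function rather than a template string, so the sketch runs on
    // the standard Vue runtime build without the template compiler.
    return () =>
      h(
        "button",
        { disabled: reviewed.value, onClick: markAsReviewed },
        reviewed.value
          ? `Page ${props.pageId} reviewed`
          : `Mark page ${props.pageId} as reviewed`
      );
  },
});
```

The real codebase is considerably more involved, but the shape of the change (declarative state in place of scattered DOM manipulation) reflects the kind of modernisation described above.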

Empowering technical volunteers

The project was also an excellent opportunity to involve volunteer developers more in the MediaWiki technical community. PageTriage patches ended up being Novem Linguae and MPGuy2824's first patches ever submitted to Gerrit, the MediaWiki code review system. Novem Linguae and Sohom Datta also underwent an RfA-like process on Phabricator to receive "+2" certification for the PageTriage code repository, confirming they had the technical community's trust to approve other developers' PageTriage patches. Having more developers with +2 rights for a repository is a big help towards improving its technical maintainability in the long term. Finally, Novem Linguae attended technical conferences and discussed PageTriage there with developers, product managers, and leadership. Sohom Datta's participation in the PageTriage project began after meeting Novem at a conference and working on a PageTriage bug report together.

Looking to the future

Going forward, the Moderator Tools team has taken over as official maintainers of the codebase, and is continuing to provide code review and office hours for volunteer developers, as well as working on high priority bugs and potentially tackling more technical debt in the future. Between this and the active volunteer developer community around PageTriage, both the WMF and PageTriage's volunteer developers feel that the extension is in a much better place than it was a year ago. We hope that this project can serve as a positive example of collaboration between the WMF and volunteer developers to make Wikipedia's tooling better.

We also hope that this can serve as an example of the power of the open source movement and philosophy. The fact that Phabricator tickets, Gerrit code review, and the PageTriage source code are all public, and that volunteers have a process by which they can apply to receive +2 rights to MediaWiki repositories, enabled volunteers to step up to the plate and very actively participate in the process of modernizing PageTriage. Working side-by-side with WMF software engineers, much more was achieved than if either group had worked in isolation.

A full overview of the project and the updates posted over the course of this year can be found at Wikipedia:Page Curation/2023 Moderator Tools project.

This year the Moderator Tools team and other product teams are prioritising other improvements aimed at supporting experienced editors as part of the WMF's annual plan focus on editors with extended rights. That work includes Automoderator, Edit check, patrolling on Android, and Commons Upload Wizard improvements. Input is welcome on all of these projects, as well as the draft goals for next Fiscal Year.




Reader comments


File:Chetput Village.jpeg (India Illustrated, public domain)

The lore of Kalloor

The writer is a long-term reader of Wikipedia and decided to finally bite the bullet to improve the project. He is starting off by working on fixup projects while easing into writing article content.

Kalloor, purportedly a location in Tamil Nadu in India linked to the death of Thomas the Apostle, became the center of scrutiny. Nominated for deletion on 5 May 2024, the article was criticized for lacking verifiable sources and for potentially being a hoax. Because the article was obscure, this process presented an opportunity to highlight the consequences that original information on Wikipedia has for knowledge, without the risk of causing a heated debate.

Kalloor: a village.
Chetpet (a village in Tamil Nadu that has multiple sources to back up its existence) in 1905.

Editor TenPoundHammer initially brought the issue to the discussion page for possible hoaxes. They highlighted that the article had remained relatively unchanged since its creation on 31 August 2005. Despite a sparse web footprint, the article claimed Kalloor as the site where Thomas the Apostle was killed. Even more concerning, Piotrus' investigation revealed that the article's primary contributor, an anonymous IP address, had also created a similarly questionable entry on Thrikkannamangal, a village that does have sources to back up its existence. This pattern raised further suspicions about the legitimacy of the Kalloor article.

Piotrus and Malerisch debated whether the article might be a hoax or an urban legend. In another peculiarity, Malerisch found a brief mention of Kalloor in the 2005 book First International Conference on the History of Early Christianity in India, suggesting the name might have some historical basis. The quote read: "Apostle Thomas was martyred in Mylapore near Madras (Tradition calls this place Kalloor – the place of rock) in Tamilnadu State, India". This conference took place in early August 2005 and predated the article's creation by around two weeks. Viewed from a Wikipedian perspective, this single reference may have been insufficient to establish notability or credibility.

Kalloor was initially tagged for speedy deletion, but upon review it underwent a full articles for deletion (AfD) process, which allowed a thorough examination by the Wikipedia editorial community. (Had it been speedily deleted, that could have opened up the possibility of a deletion review.) The discussion, initiated by Piotrus, highlighted concerns over the article's authenticity. Piotrus noted that the claim of Kalloor being the "place in Tamil Nadu, India, where the Apostle Thomas, one of the 12 disciples of Jesus, is believed to have been killed" was a significant one that might fail to meet Wikipedia's verification standards, and cited the absence of credible sources and the article's dubious nature as reasons for the deletion nomination. Editors such as Gawaon and SparklessPlug supported deletion due to the absence of reliable sources and the high probability of the article being a hoax.

Other editors largely agreed on the article's lack of verifiability. JBW, another editor, pointed out the historical inconsistencies and the difficulty in finding reliable references to support the claims made about Kalloor. The discussion revealed that the original text, with minimal changes over time, remained unsubstantiated and potentially fabricated. Despite the initial consensus leaning towards deletion, further examination by editor Malerisch suggested that, while parts of the article might be dubious, the entire entry could not be entirely dismissed as a hoax: "Kalloor" was confirmed to be an Indian surname, with Yoohanon Chrysostom Kalloor cited as an example. The discussion ran for the full seven days and closed with the article being deleted.

Wikipedia, a platform reliant on community contributions, faces the constant challenge of verifying the vast amount of information it hosts. Kalloor serves as a fleeting reminder of the necessity for thorough verification processes to prevent the spread of misinformation. Although the village may exist, as there are sources that predate the article, there is not enough evidence to support its inclusion in the encyclopedia.

Brazilian aardvark problem

An aardvark in South Africa (I couldn't find a picture of a Brazilian one)

Wikipedia has become an important fact-checking website, meaning false information on it can cause knowledge to become distorted. The New York Times has called Wikipedia a "factual netting that holds the digital world together"; as an encyclopedia that anyone can edit, it is uniquely paradoxical, being trustworthy and untrustworthy at the same time. While the notion of "don't trust Wikipedia, anyone can edit it" has taken on a new meaning, highlighting political bias, it originally focused on factually incorrect information. The most obvious way to identify original information on Wikipedia is to check whether it existed anywhere before it appeared in the encyclopedia.

If original information persists long enough in the encyclopedia, other outlets may reuse it, and still others will then cite those outlets, in a potentially endless loop. This happened in 2008, when a 17-year-old student added the nickname "Brazilian aardvark" to the article about the coati. The nickname stayed on Wikipedia for six years, which led to it being cited by other publications. With the newspapers The Independent, the Daily Express, Metro and The Daily Telegraph, as well as works published by the University of Chicago and the University of Cambridge, using the nickname, it became reliably sourced through circular reporting. From November 2007 until April 2014, a list of Chinese names for kiwifruit included an anonymous editor's addition that translated to "hairy bush fruit". The term was then used by The Guardian and subsequently cited by the Wikipedia article to source the name.

Coati in Brazil

The examples section in the Wikipedia article on circular reporting lists other times this has happened; most are no longer part of the respective articles. When a Brazilian aardvark surfaces, it sometimes sparks a discussion about whether the information should now be included in the article, given that it has become reliably sourced.

As I was writing this piece, I stumbled upon a fascinating discovery: an entire page on Wikipedia dedicated to the phenomenon. Wikipedia:List of citogenesis incidents meticulously keeps a record of the widespread occurrences of this problem.

Some Brazilian aardvarks do eventually become real. The name of the Pringles mascot, Julius Pringle, originated in 2006 when an editor inserted the name "Julius Pringles" into Wikipedia; it was then used by publications and subsequently adopted by Pringles. The alias "Patrick Parker" for the comic book supervillain the Riddler originated in 2013, when an anonymous editor inserted the nickname into Wikipedia. It remained in the article for nine years and was eventually used in the 2022 film The Batman.

The aim of Wikipedia is not to seek truth but verifiability. Which sources are considered reliable or unreliable is decided by Wikipedians through discussions. If incorrect information comes from sources that are deemed reliable, it can be included until reliable sources correct it. However, information that originates on Wikipedia itself makes the "Brazilian aardvark problem" a special case, one that both harms knowledge and challenges some of Wikipedia's core principles.

Kalloor has sources which predated the article, so it wasn't a Brazilian aardvark. However, it would be insightful to know whether there has ever been information that originally came from Wikipedia, was reused by reliable sources, and never eventually became real. How did editors handle it, knowing the information came from Wikipedia? Have publishers ever been notified that the information came from Wikipedia?



Reader comments

File:Invisible couple - geograph.org.uk - 2387488.jpg (william, CC BY-SA 2.0)

National cable networks get in on the action arguing about what the first sentence of a Wikipedia article ought to say

Trump conviction causes kerfuffle

Then-president Trump (2017)

Despite our policy about not being the news, Wikipedians can be very fast in reporting it. On Thursday, May 30, they were quite speedy in editing to reflect the news that the prosecution of former U.S. president Donald Trump had ended with a verdict convicting him on 34 New York state felony charges of falsifying business records to cover up a crime. While the jury was still in the jury box at 5:07 p.m. Eastern Time (21:07 UTC), the first "guilty" edit hit the Donald Trump article, adding "convicted felon" to the article's first sentence.

The jury foreperson had started pronouncing the 34 individual guilty verdicts about 5:00 p.m., perhaps a bit earlier. Each juror confirmed their agreement with the verdict. Judge Juan Merchan thanked and excused the jurors at 5:11. By the time Trump walked out of the courtroom four or five minutes later, five different editors had each edited the article once.

After another fifteen minutes (and ten edits), editors had removed the words "convicted felon" from the article's first line, but not from the sixth paragraph of the introduction, where it had also been added. Nine minutes later, a Request for Comment was posted on the talk page, effectively forestalling an edit war on whether "convicted felon" should be in the first sentence and the sixth paragraph and the body text, as opposed to in only the sixth paragraph and the body text. The RfC will likely be closed by 10 June. So far the "Supports" (for including "convicted felon" in the first line) seem to have an edge in !votes, but the "Opposes" might have an edge in the style of their comments.

Bizarrely, the argument continued the next morning on CNN and on Fox, the latter of which headlined its story "CNN host suggests Trump conviction not mentioned prominently enough on former president's Wikipedia page". CNN host John Berman's first question to President Joe Biden's campaign co-chair Mitch Landrieu had included "read Trump’s Wikipedia page after the decision, noting that the historic conviction had not been entered into the entry until the sixth paragraph. And the very top line is, 'Donald John Trump, born June 14, 1946, is an American politician, media personality, and businessman … who served as the 45th president of the United States from 2017 to 2021.' That's the first paragraph. It’s not until paragraph six where it says he was convicted of a felony." While CNN has removed the actual video from their website, the Fox video includes this short segment from CNN.

The final part of Berman's question to Landrieu was "If it were up to you ... Where would 'convicted felon' appear in this entry?" He responded: "Well, I'm not going to tell people how to write that (sic) Wikipedia pages." We can all be grateful that there is at least one sane person appearing on the news channels.

S

Trump's Truth Social borrows term "unified Reich" from Wikipedia

For many years, movies and TV shows have used spinning newspaper cuts, where the main plot points are depicted as the main headlines on a newspaper, with other filler articles below. Many B movies of the mid-twentieth century used the same prop for this, resulting in countless headlines about zombies and space aliens running above identical stories of "New Petitions Against Tax" and "Building Code Under Fire".

Recently, Donald Trump's account on Truth Social posted a video that showed images of the former U.S. president embedded in one of these — apparently a video stock template called "Newspaper Vintage History Headlines Promo". The trouble started when the Associated Press noticed that "at least one of the headlines flashing in the video appears to be text that is copied verbatim from a Wikipedia entry on World War I: 'German industrial strength and production had significantly increased after 1871, driven by the creation of a unified Reich.'"[n 1] The Trump campaign press secretary told AP "This was not a campaign video, it was created by a random account online and reposted by a staffer who clearly did not see the word, while the President was in court."[n 2]

The AP story, in addition to being run by major US news outlets like ABC News and global media like The Times of Israel, was covered as secondary reporting by the Wall Street Journal, Politico, Axios, The Guardian and Reuters, which noted the Wikipedia text connection, as did Newsweek's article "Where Trump's 'Unified Reich' Reference Came From".

So where did Wikipedia's sentence come from? Before July 8, 2009, there had been a sentence in World War I § Background about the growth of German industrial power, but it didn't mention any connection to the founding of the Second Reich in 1871. The July 8 edit made this connection through the 1871 unification of Germany, without using the words "unified" or "Reich". Over the next 13 years, the sentence was rewritten – expanded and contracted – several times, with "unified" and "Reich" each appearing and then disappearing at least once, but apparently never appearing together. "Unified" remained in the article from December 2021 until this May 22, when the paragraph was rewritten after the video appeared. "Reich" and the quoted word order appeared on November 15, 2022.[n 3]

One may surmise that Wikipedia provided an attractive source for filler text, as its permissive Creative Commons license allows liberal reuse (although here the text was reused without the necessary attribution; then again, maybe the use was de minimis). The guy who invented Godwin's Law used to be the general counsel for the Wikimedia Foundation; maybe he'd know more about the copyright issue and the unified Reich. At any rate, maybe in the future the stock video makers should stick with lorem ipsum – or Signpost articles.

B, S, J

Footnotes:
  1. ^ We confirm this is the first sentence at World War I § Arms race at the time of writing.
  2. ^ [1]
  3. ^ revision 1122080181

In brief

  • Wikipedia is better for the contributions from a passionate editor at this library.
  • Articles should not be invisible, like these people.



Do you want to contribute to "In the media" by writing a story or even just an "in brief" item? Edit our next issue in the Newsroom or leave a tip on the suggestions page.




Reader comments

File:Header_image_Annual_Plan_Diff_post_23-24.png (Wikimedia Foundation, CC BY-SA 4.0)

Progress on the plan — how the Wikimedia Foundation advanced on its Annual Plan goals during the first half of fiscal year 2023–2024

Elena Lappen is the Wikimedia Foundation's Movement Communications Manager, and works to strengthen the communications and collaboration between the Wikimedia movement and the Wikimedia Foundation.

Each year, the Wikimedia Foundation summarizes goals for the year in the Annual Plan. This fiscal year’s plan, which started in July 2023, centers Product and Technology work, recognizing Wikimedia’s role as a platform for people to contribute on a massive scale. To that end, it puts special emphasis on established editors, who have an outsized impact expanding and improving the quality of content, as well as managing community processes. Hundreds of Wikimedians shaped this annual plan both on and off wiki.

Below, we have summarized progress on each of the four goals during the first half of fiscal year 2023–2024 (July 2023 to January 2024).

Goal 1: Infrastructure

Goal 2: Equity

Goal 3: Safety and integrity

  • EU Digital Services Act: took steps to comply with the Digital Services Act, a new law that went into effect in August 2023 and regulates internet platforms operating in the European Union.
  • Advocacy: educated regulators, policymakers and government leaders about Wikimedia's model.
  • Disclosure: met our reporting and disclosure obligations, including publishing a supplemental transparency report.
  • Disinformation: supported volunteers and project integrity by mapping anti-disinformation initiatives across the ecosystem; tackled disinformation on the projects in an Anti-Disinformation Repository.
  • Volunteer safety: supported community measures for safety and inclusion by working with the Affiliations Committee, Case Review Committee and Ombuds Commission.

Goal 4: Effectiveness

  • Increased efficiency: we are on track to increase the percentage of our budget that goes to directly supporting Wikimedia's mission (our "Programmatic Efficiency Ratio") through increasing our internal efficiency around administrative and fundraising costs.
  • Additional investment into supporting the movement: this increased efficiency will enable an additional investment of $1.8M into funding in areas like grants, feature development, site infrastructure and more.

If you are interested in diving deeper into some of these workstreams, you can read about our progress against the plan on Diff. We also maintain quarterly Metrics Reports to help measure impact, and are constantly feeding data back into the process to see what interventions are working and where we need to course correct. We look forward to sharing more progress as fiscal year 2023–2024 wraps up and we head towards the next fiscal year, for which Annual Plan conversations and drafting are already underway.



Reader comments

File:Letter W, drop capital illustration.png (author unknown, public domain)

Public response to the editors of Settler Colonial Studies

The Signpost strives to publish a variety of opinion pieces, essays and letters representing a diversity of perspectives; the following is a response by Tamzin, a Wikipedia editor, to a recently published academic paper. On one hand, it concerns specific claims made in that paper; on the other hand, it relates strongly to the public interest and to the mission of the Wikimedia projects, which the Signpost exists to further.
While we as Wikipedia editors accept that our work is mostly anonymous, and while we lack the prestige and imprimatur of academic institutions, in the name of our project's stability and continued reliability, it is important to stand up for ourselves from time to time. J

Tamzin Hadasa Kelly (they/xe/any; Mx.)
wikimedian@tamz.in
3 June 2024

Dr. Janne Lahti et al., editors
Settler Colonial Studies

To the editors:

I write in response to "Wikipedia's Indian problem: settler colonial erasure of native American knowledge and history on the world's largest encyclopedia", an article by Dr. Kyle Keeler published on 24 May 2024. I believe that this article contains multiple factual errors, as well as an undisclosed conflict of interest.

But before we get to that, I'd like to tell a story.

In February of 2023, a user named Insitemobile came to Wikipedia's administrators' noticeboard for incidents, to report an Indigenous editor named Yuchitown for reverting his edits. His report, titled "Spam, Vandalism and Bullying By Native Tribes", contained the claim that Yuchitown had a conflict of interest regarding the Saponi and Sappony because the Yuchi historically fought the Saponi.[1] Wikipedians broadly recognize that attempting to disqualify an editor's opinions on the basis of inherent or quasi-inherent attributes (race, ethnicity, gender, sexual orientation, religion, etc.) is hateful conduct, and administrators routinely block[2] editors who make such claims. Yet, while administrators did call out Insitemobile as disruptive, none called out the racism in his comments. He was blocked for a week, which became a month when he called his opponents "wikinazis". He then switched to editing without an account, writing to Yuchitown, "I have some advice, be careful online with oppressing other groups of people and especially be careful what IT people you offend and call an OP because this site and country is not safe. people can drive around and use any ip and stalk etc". A second administrator blocked the IP subnet.[3] But the Insitemobile account remained under a merely temporary block. Ten hours later, I noticed this and converted his block to indefinite. I added, "[In my opinion], where threats are involved, that's 'indefinite as in infinite'."

I do not mention this anecdote to claim credit for some act of heroism. I did what any administrator should have done; it took me only a few minutes. I mention it for a few reasons:

Firstly, it was a good introduction to the challenges of systemic racism that editors face in the Indigenous topic area. I do not believe that either of the two administrators who under-reacted harbors any racism against Indigenous people. Assuming good faith is a core principle on Wikipedia, and I assume that the first administrator simply overlooked the racist argument of disqualifying based on tribe membership (which arose several paragraphs into a long post) and that the second didn't realize that the first had only blocked temporarily. But such oversights are often good indicators of where systemic biases lie. If an editor had tried to disqualify someone's views because they were a woman or Black or gay or Muslim, it would not have taken 30 hours and an intervening threat of violence for someone to block the account indefinitely.

Secondly, this incident is how I got to know CorbieVreccan and Indigenous girl. I exchanged emails with both of them about their experiences in the topic area and their sense of administrators not stepping in to keep Indigenous editors safe. I was left with very favorable impressions of both of them. That remains the case with Indigenous girl.

Thirdly, this provides a good illustration of what Wikipedia administrators do and don't do. They[4] do determine whether editors are acting in compliance with our policies and guidelines, especially as pertains to user conduct. They do not decide who is right or wrong in a dispute. I made, absolutely, the right call in blocking Insitemobile, but I couldn't tell you who is right in the underlying dispute as to how Wikipedia should characterize the recognition of the Sappony. I am not a subject-matter expert. Even if I did have a personal opinion on the matter, it would not have influenced my decision. My action was based on the racism and death threats, no more, no less. This distinction is important to keep in mind as one considers the narrative that Dr. Keeler has presented.

I do not wish to delegitimize the core message of Dr. Keeler's piece. Members of WikiProject Indigenous peoples of North America (IPNA) have often been mistreated by Wikipedians who are racist, clueless, or somewhere in between. There was great potential in this article to uncover the nuances of how the Wikipedia community has interacted with CorbieVreccan, Indigenous girl, and other members of IPNA. Dr. Keeler, however, squandered that opportunity in two ways: First, he failed to disclose his personal involvement in the matters his article discusses and his past conflict with me. In addition to the ethical implications of these omissions, Dr. Keeler's lack of necessary distance led him to his second critical mistake: not interviewing all of the people he wrote about. By presenting a narrative based only on his own recollection of events and those of, apparently, those with similar perspectives, he perpetrated many easily avoided errors and misrepresentations. This is a shame: What could have been a compelling investigation into systemic racism on Wikipedia instead becomes, in essence, one ex-Wikipedian's grudge piece against people he feels wronged him and his allies, facts be damned.

Part one: Conflict of interest

This part I will address to Dr. Keeler: I would like to start off with an apology. When I took administrative action against your account[5] some time ago, I was hasty. I don't think you were behaving particularly well in the dispute in question, but I had gotten in over my head and tried to quell the unrest with blunt instruments rather than defusing tensions. I am, genuinely, sorry about that.

To the editors, however, I do feel that Dr. Keeler has done you a great disservice by failing to disclose this past interaction. The article that you published mentions me in three contexts, none of them flattering, and none of them accurate (as we will see in the next section). It is hard to believe that this is a coincidence, given his and my background.

For the public version of this letter, I am not going to name the account or give much information about it, because I respect that Dr. Keeler may have good-faith reasons for keeping it private. But the evidence is all pretty clear-cut, down to similar phrasing used on Wikipedia and in a published paper. So instead I will summarize the relevant facts: Dr. Keeler had a somewhat active Wikipedia account; he was involved in the overarching conflict described in the article, although not the specific disputes; I took administrative action against him once as described above (not about an Indigenous topic); he at least twice cited himself in a manner that exceeded what's allowed by policy;[6] and he retired from editing while in good standing. If there is any part of that that Dr. Keeler disputes, he is welcome to let me know. The private version of this letter will go into more detail.

Dr. Keeler's article discloses no conflicts of interest. I submit that the failure to disclose both the general involvement in content disputes about Indigenous topics on Wikipedia, and the specific conflict with me in a different setting, violates your journal's policies. Per Taylor & Francis, "Examples of non-financial conflicts of interests [include] ... personal, political, religious, ideological, academic and intellectual competing interests which are perceived to be relevant to the published content." Certainly a dispute with someone mentioned in an article is a competing personal interest, and being part of the group of editors discussed in an article is a competing ideological interest. T&F considers the non-disclosure of competing interests a form of authorial misconduct. As we are about to see, this misconduct harmed not only the subjects of this article you published, but also your readers.

Part two: Factual errors

I have neither the expertise nor the time and energy to fact-check everything in Dr. Keeler's article, at least not for free. But I know well the parts that I was involved in, so I will go over those. I suppose it is possible that I was just unlucky and everything else in the article is accurate, but, per Crichton and Gell–Mann, I feel obliged to assume that this rate of errors is pervasive throughout.

Freoh

In July Freoh was blocked by Tamzin, an administrator, 'for persistent disruption ... after warnings by three admins across two massive [Administrator's Noticeboard] threads' and edits that 'consist almost entirely of stirring drama.' Freoh's history at administrator's noticeboards, and the narrative constructed there by settler nationalists, follow the strategies utilized to remove editors who settler nationalists disagree with. Content disputes were turned into conduct disputes, Freoh was accused of pushing a specific point of view and righting great wrongs, settler nationalists suggested banning Freoh, and administrators did so. Freoh's suggested content was never added, the pages Freoh sought to edit read as they did before Freoh intervened, and Freoh was erased from Wikipedia.

The term "block" on Wikipedia usually refers to what most other sites would call a ban. My action against Freoh, however, was only a partial block, making him unable to edit some behind-the-scenes parts of the encyclopedia, but still able to edit all articles and their discussion pages. I did this specifically because I felt Freoh was a productive content contributor and needed to focus on encyclopedic work rather than drama. Furthermore, Freoh was not editing about Indigenous topics when I blocked him. I blocked him for editing another user's comments[7] to remove what he saw as an unacceptable insult against the French. A good illustration of constantly stirring drama, less so of being a martyr in the fight against settler colonialism, what with France being perhaps the most overtly colonialist country in the world.

Freoh was not banned. Freoh was not blocked sitewide. Freoh was not erased. Freoh simply chose to stop editing[8] at that point, cut off from drama venues.

Pingnova

PingNova [sic] continued to ignore Native and allied editors, and they reached out to Tamzin for help. Tamzin accused CorbieVreccan of mistreating PingNova [sic], explaining that 'This could land at [Administrator's Noticeboard] with a lot of recriminations.' The message was clear: if Native editors did not allow PingNova [sic] to edit articles related to Native topics they would be taken to the Administrators' noticeboard and removed from digital space.

Let's quote a bit more context of what I said:

To be clear, I did ask [Pingnova], before posting this, if they would agree to some kind of mentorship / 'on-the-job learning', and they did. What I've seen so far here is that they made some good-faith changes, and you [CorbieVreccan] came down fairly hard on them, and they've taken umbrage at that, and now we're in a cycle heading in a bad direction. This could land at [the administrators' noticeboard for incidents] with a lot of recriminations. Or we could defuse tensions and try to get some quality content out of this. I'd really like to see both Pingnova and you step up to the plate on that.

There is, of course, no threat to "remove" anyone here. One informal role of an administrator is to ease tensions before they land at our infamously drama-prone user-conduct noticeboards (i.e., to avoid removing anyone from the community), which is why I intervened in hopes of getting Pingnova good advice on the particulars of editing in the Indigenous American topic area, which is Wikipedia's standard practice for well-meaning new editors who are making some mistakes. By Dr. Keeler's article's own description of it, nothing that Pingnova wrote was hateful or otherwise would have disqualified them from continuing to write in the topic area with some guidance, and so that is what I sought. After Indigenous girl graciously agreed to make herself available to answer questions from Pingnova, I withdrew from the discussion:

The rest, from here, is up to [Pingnova]. They can heed your critique, or not. I just wanted to make sure they were getting a fair chance to sink or swim. I hope that makes sense. All the best.

CorbieVreccan

One day later, Tamzin opened a public case against CorbieVreccan at the Administrator's Noticeboard, accusing them of 'meatpuppetry,' or when an editor recruits acquaintances offline to support them in a debate on Wikipedia.

That's a very brief summary of the controversy at the heart of this paper—one that omits almost every important detail. It does not even name the other editor involved, Mark Ironie, a fellow administrator and long-time offline associate of Corbie's. It is important to understand here that abuse of multiple accounts is one of the cardinal sins of Wikipedia: We cannot have a collaborative, consensus-based community if we don't know how many people we're really talking to.

One of my volunteer roles at the time[9] was to investigate multiple-account abuse. I was spurred into investigating longstanding rumors of multiple-account abuse by Corbie on August 26, when Pingnova pointed out an action of Corbie's that I recognized as violating a different administrative policy.[10] After my initial private report led to Mark and Corbie being ordered to disclose their shared IP, on 11 September 2023—six days after the "This could land" comment, not one—I explained to the community, in painstaking detail, how almost all of Mark's substantive participation since 2020 was to back up Corbie, including demanding sanctions of those Corbie opposed and even blocking someone who opposed Corbie in an "edit war". For a paper ostensibly about tactics used on Wikipedia to manipulate disputes, one would think that Dr. Keeler would have been more interested in these details. Mark and Corbie's peers, including fellow IPNA member ARoseWolf, certainly were.

CorbieVreccan explained that the issue had been resolved privately by administrators some days earlier.

This argument was rejected by every single person to hear it, including the same group of administrators (our Arbitration Committee) who Corbie claimed had resolved the matter. Nor would it matter if they had: Honest participation in consensus-building is a bedrock aspect of running a collaborative project, and no entity can give someone permission to manipulate the community.

Indigenous girl was also placed under scrutiny for working with CorbieVreccan.

This bit is true, and it's the bit I most wish weren't. Indigenous girl is an amazing editor and got dragged into this (not by me) even though no clear evidence was ever presented against her. She absolutely deserved better.

For what it's worth, we remained on good terms throughout the dispute, even as we clashed on-wiki, and had a long, very cathartic debrief phone call after the dust settled. I wish her nothing but the best, on-wiki and off-.

Native and allied editors pointed out the suspect timing, given the proximity to Tamzin's disagreements with CorbieVreccan about PingNova's [sic] editing.

Explained above. If Dr. Keeler does not believe my explanation for why my investigation of Corbie began the same day as the Pingnova dispute, he should say so, not cite speculation that I have already publicly responded to. Furthermore, combined with the 'One day later' error, the preceding sentence about how 'The message was clear', and the omission of how the Pingnova matter was resolved, this sentence creates the impression that I reported Corbie for failing to reconcile with Pingnova. This is demonstrably false: the public record establishes that I had submitted my private evidence against Corbie 10 days before the reconciliation attempt and 16 days before the administrators' noticeboard thread.

CorbieVreccan and Indigenous girl were brought before Wikipedia publicly for refusing to allow non-Native editors to add colonial viewpoints to Native pages

They were not, and nothing Dr. Keeler writes up to this point supports this statement.

Just a note here about Corbie. At one point in the dispute about Pingnova, Corbie said

[Pingnova] added in a quote from a Pretendian as a giant pull-quote up top and argued with Indigenous people about it. When some of us tried to engage with them here and on talk, they were either incivil or refused to respond. They have whitesplained to Indigenous people, insisting they know these topics better. They have pinged non-Natives into discussions instead of Indigenous editors.

Corbie often spoke in this way, blending "we/us" with "Indigenous" in a way that didn't quite say "I'm Indigenous" but sure implied it. (They are not Indigenous, by their own admission on their personal blog, although they do claim some amount of Native American heritage.) Dr. Keeler's way of describing Corbie—here on the side of "Native pages" and against "non-Native editors", earlier lumped ambiguously among "Native and allied editors"—perpetuates that long-term blurring of lines.

If that seems unfair to CorbieVreccan, if it seems to go against what I said earlier about assuming good faith, please understand how profoundly disruptive an editor Corbie has been on Wikipedia. Much of this evidence was presented to the Arbitration Committee, but Dr. Keeler omits it. In addition to the massive, long-term breach of trust that allowed them and another (also non-Indigenous, to my knowledge) administrator to manufacture consensus about Indigenous topics, they used Wikipedia to promote their personal agenda for almost two decades. They promoted an obscure religious movement that they and Mark are prominent figures in, and advocated against the legitimacy of rival pagan movements. They promoted a personal friend's questionable claims of having married Jim Morrison. They fought to deny that queer rights activist Marsha P. Johnson was anything other than a gay man.[11]

Summation

This is the kind of added depth that Dr. Keeler's article could have had if he had interviewed a more diverse group of editors. He could have avoided easy errors, put forward a more complex narrative that better advanced the too-small field of Wikipedia research, and offset his own biases as a participant in the overarching dispute and a critic of mine.

He could have also strengthened his own arguments about racism. For instance, Gwillhickers has responded to the article with a horrific comment about, among other things, how all civil liberties are thanks to settlers and how Indigenous people who resisted colonization were genocidal. A quote like that would have made for a much more concrete example of racism and colonialism on Wikipedia than much of what is actually in the article. And it would have sent a clear message back to the Wikipedia community that Gwillhickers is someone who is incapable of editing neutrally about Indigenous topics, who is himself on a crusade to "right great wrongs", who should be as unwelcome to edit about that topic as CorbieVreccan is to serve as an administrator.

If I were to write a Wikipedia article about Dr. Keeler, every fact in the article would have to be verifiable in reliable sources. And I would pull my hair out making sure that they all were, because I take my role seriously, even if I don't get paid to do it, even if it doesn't require any degree. I would subject myself to the most rigorous fact-checking I could, and my peers would do the same to me. I would also be held to a pre-review requirement because of my conflict of interest with Dr. Keeler. To create the article at all, I would need to convince an independent editor that it fully complies with all of our policies, including verifiability and neutral point of view, and to make any subsequent nontrivial changes I would have to go through a similar process.

I imagine you all consider Wikipedia a less reliable source than your own journal. Wikipedia itself does not consider itself a reliable source. Why is it, then, that I would be held to a higher standard when writing about Dr. Keeler than he was held to when writing about me?

I request corrections of the numerous errors highlighted above, an acknowledgment of Dr. Keeler's conflict of interest, and an acknowledgment that Wikipedians who were criticized in the article were not given the chance to comment. I reiterate that I have not fact-checked those parts of his article not about me; I suggest consulting with a disinterested experienced Wikipedian for expert feedback.

Thank you for considering this request,

Tamzin Hadasa Kelly
Volunteer editor and retired administrator
Wikipedia

Notes

  1. ^ I am not an expert on Indigenous history. I do not know if the underlying historical claim there is actually true.
  2. ^ Wikipedia uses the word "block" for what most sites call banning. On Wikipedia, "ban" refers to a small subset of blocks that are imposed through certain formal processes.
  3. ^ When people edit Wikipedia without signing into an account, their edits are tied to their IP address. With the more modern "IPv6" form of IP addresses, an individual end-user will usually not have access to just a single IP, but rather a "subnet" of about 18 quintillion IPs, so this is what Wikipedia administrators block.
  4. ^ I am no longer a Wikipedia administrator. I resigned in February 2024 after the suicide of a friend who had been the victim of brutal personal attacks from administrators and other experienced editors—an event that predated his death by several years, and which was by no means its sole cause, but which I know for a fact was a major trauma in his life. His death caused me to reconsider the way editors treat one another in our back-room processes, and led me to decide I did not want to be part of those processes. I elaborate on this in the audio essay "On the backrooms".
  5. ^ I am being as vague as possible for privacy reasons. There are perhaps 3,000 people whose accounts I took action against in some manner when I was an administrator.
  6. ^ In short, editors may cite their own publications if a reasonable independent editor would do so, which was not the case here.
  7. ^ Allowed only to remedy serious policy violations or for technical fixes.
  8. ^ At the time the article was submitted, Freoh had made no edits since the partial block. He has since made some, including a successful appeal to have the block's scope relaxed and an unsuccessful one to have it lifted in full.
  9. ^ I resigned at the same time that I resigned as an administrator.
  10. ^ Corbie has been the primary editor of the Two-spirit article, but also locked the page down to changes by less experienced users, citing a special policy that may not be used by administrators who are editorially involved in a page.
  11. ^ Much of Corbie's work regarding Johnson was devoted to resisting the ahistorical narrative that Johnson was transgender. To that extent, at least, I agree with Corbie: Johnson never called themself transgender, and arguments in favor of that term are based on people's own personal interpretation of what Johnson might have felt, a form of research forbidden under Wikipedia policy. However, in these same debates, Corbie resorted to the same kind of arguments in the opposite direction, applying their own personal analysis of why Johnson really didn't feel like anything other than a man, despite Johnson having variously referred to themself as a man, woman, transvestite, and transsexual, all in the same interview, even. Corbie even omitted well-sourced information about Johnson's use of hormone replacement therapy, based on their own analysis of Johnson's breast size.




Reader comments

File:The fundamental knowledge scaffolding model.png
Bagnoli et al.
CC BY 4.0
2024-06-08

ChatGPT did not kill Wikipedia, but might have reduced its growth


A monthly overview of recent academic research about Wikipedia and other Wikimedia projects, also published as the Wikimedia Research Newsletter.

Actually, Wikipedia was not killed by ChatGPT – but it might be growing a little less because of it

A preprint[1] by three researchers from King's College London tries to identify the impact of the November 2022 launch of ChatGPT on "Wikipedia user metrics across four areas: page views, unique visitor numbers, edit counts and editor numbers within twelve language instances of Wikipedia." The analysis concludes that

"any impact has been limited and while [ChatGPT] may have led to lower growth in engagement [i.e. Wikipedia pageviews] within the territories where it is available, there has been no significant drop in usage or editing behaviours"

The authors note that there are good a priori reasons to hypothesize that ChatGPT may have replaced Wikipedia for some usages:

"At this time, there is limited published research which demonstrates how and why users have been engaging with ChatGPT, but early indications would suggest users are turning to it in place of other information gathering tools such as search engines [...]. Indeed, question answering, search and recommendation are key functionalities of large language models identified in within the literature [...]"

However, like many other current concerns about AI, these have been speculative and anecdotal. Hence the value of a quantitative analysis that tries to identify the causal impact of ChatGPT on Wikipedia in a statistically rigorous manner. Without conducting experiments, though (i.e. based on observational data alone), it is not easy to establish that a particular change or external event caused persistent increases or decreases in Wikipedia usage overall (as opposed to one-time spikes from particular events, or recurring seasonal changes).

The paper's literature review section cites only one previous publication which achieved that for Wikipedia pageviews: a 2019 paper by three authors from the Wikimedia Foundation (see our earlier coverage: "An awareness campaign in India did not affect Wikipedia pageviews, but a new software feature did"). They had used a fairly sophisticated statistical approach (Bayesian structural time series) to first create a counterfactual forecast of Wikipedia traffic in a world where the event in question did not happen, and then interpret the difference between that forecast and the actual traffic as related to the event's impact. Their method successfully estimated the impact of a software change (consistent with the results of a previous randomized experiment conducted by this reviewer), as highlighted by the authors of the present paper: "Technological changes can [...] have significant and pervasive changes in user behaviour as demonstrated by the significant and persistent drop in pageviews observed in 2014 [sic, actually 2018] when Wikipedia introduced a page preview feature allowing desktop users to explore Wikipedia content without following links." The WMF authors concluded their 2019 paper by expressing the hope that "it lays the groundwork for exploring more standardized methods of predicting trends such as page views on Wikipedia with the goal of understanding the effect of external events."
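
To make the counterfactual-forecast idea concrete, here is a minimal sketch in Python of what such an analysis can look like. It uses an off-the-shelf structural time series model from statsmodels rather than the exact Bayesian model of the 2019 paper, and the cutoff date, seasonality choice and variable names are illustrative assumptions rather than the WMF authors' code.

    # Minimal sketch (not the WMF authors' code): fit a structural time series on
    # pre-event traffic only, forecast the post-event period as a counterfactual,
    # and read the gap between forecast and observed traffic as the event's impact.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    def estimate_event_impact(pageviews: pd.Series, event_date: str = "2022-11-30"):
        cutoff = pd.Timestamp(event_date)
        pre = pageviews[pageviews.index < cutoff]     # training period
        post = pageviews[pageviews.index >= cutoff]   # evaluation period

        # Local linear trend plus weekly seasonality, fitted on pre-event data only.
        model = sm.tsa.UnobservedComponents(
            pre, level="local linear trend", freq_seasonal=[{"period": 7}]
        )
        fitted = model.fit(disp=False)

        # Counterfactual: traffic forecast as if the event had never happened.
        forecast = fitted.get_forecast(steps=len(post))
        counterfactual = pd.Series(np.asarray(forecast.predicted_mean), index=post.index)

        # Positive values = more traffic than predicted, negative = less.
        return post - counterfactual, forecast.conf_int()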

In contrast, the present paper starts out with a fairly crude statistical method.

First,

We gathered data for twelve languages from the Wikipedia API covering a period of twenty two months between the 1st of January 2021 and the 1st of January 2024. This includes a period of approximately one year following the date on which ChatGPT was initially released on the 30th of November 2022.

(The paper does not state which 22 months of the 36 months in that timespan were included.)

The 12 Wikipedia languages were

"selected to ensure geographic diversity covering both the global north and south. When selecting languages, we looked at three key factors:

  1. The common crawl size of the GPT-3 main training data as a proxy for the effectiveness of ChatGPT in that language.
  2. The number of Wikipedia articles in that language.
  3. The number of global first and second language speakers of that language.

We aimed to contrast languages with differing numbers of global speakers and languages with differing numbers of Wikipedia articles [...ending up with English, Urdu, Swahili, Arabic, Italian and Swedish].

As a comparison, we also analysed six languages selected from countries where ChatGPT is banned, restricted or otherwise unavailable [Amharic, Farsi, Russian, Tigrinya, Uzbek and Vietnamese].

Then, "[a]s a first step to assess any impact from the release of ChatGPT, we performed paired statistical tests comparing aggregated statistics for each language for a period before and after release" (the paper leaves it unclear how long these periods were). E.g.

"For page views, we first performed a two-sided Wilcoxon Rank Sum test to identify whether there was a difference between the two periods (regardless of directionality). We found a statistically significant different for five of the six languages where ChatGPT was available and two of the six languages where it was not. However, when repeating this test with a one-sided test to identify if views in the period after release were lower than views in the period before release, we identified a statistically significant result in Swedish, but not for the remaining 11 languages."

For the other three metrics (unique users, active editors, and edits) the results were similarly ambiguous, motivating the authors to resort to a somewhat more elaborate approach:

"While the Wilcoxon Signed-Rank test provided weak evidence for changes among the languages before and after the release of ChatGPT, we note ambiguities in the findings and limited accounting for seasonality. To address this and better evaluate any impact, we performed a panel regression using data for each of the four metrics. Additionally, to account for longer-term trends, we expanded our sample period to cover a period of three years with data from the 1st of January in 2021 to the 1st of January 2024."

While this second method accounts for weekly and yearly seasonality, it too does not attempt to disentangle the impact of ChatGPT from ongoing longer term trends. (While the given regression formula includes a language-specific fixed effect, it doesn't have one for the availability of ChatGPT in that language, and also no slope term.) The usage of Wikipedia might well have been decreasing or increasing steadily during those three years for other reasons (say the basic fact that every year, the number of Internet users worldwide increases by hundreds of millions). Indeed, a naive application of the method would yield the counter-intuitive conclusion that ChatGPT increased Wikipedia traffic in those languages where it was available:

"For all six languages, [using panel regression] we found a statistically significant difference in page views associated with whether ChatGPT had launched when controlling for day of the week and week of the year. In five of the six languages, this was a positive effect with Arabic featuring the most significant rise (18.3%) and Swedish featuring the least (10.2%). The only language where a fall was observed was Swahili, where page views fell by 8.5% according to our model. However, Swahili page viewing habits were much more sporadic and prone to outliers perhaps due to the low number of visits involved."

To avoid this fallacy (and partially address the aforementioned lack of trend analysis), the authors apply the same method to their (so to speak) control group, i.e. "the six language versions of Wikipedia where ChatGPT was unavailable":

"Once again, results showed a statistically significant rise across five of the six languages. However, in contrast with the six languages where ChatGPT was available, these rises were generally much more significant. For Farsi, for example, our model showed a 30.3% rise, while for Uzbek and Vietnamese we found a 20.0% and 20.7% rise respectively. In fact, four of the languages showed higher rises than all of the languages where ChatGPT was available except Arabic, while one was higher than all languages except Arabic and Italian."

The authors stop short of attempting to use this difference (between generally larger pageview increases in ChatGPT-less languages and generally smaller increases for those where ChatGPT was available) to quantify the overall effect of ChatGPT directly, perhaps because such an estimation would become rather statistically involved and require additional assumptions. In the paper's conclusion section, they instead frame this finding in vague, qualitative terms, stating that ChatGPT "may have led to lower growth in engagement [pageviews] within the territories where it is available".
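
Purely as an illustration of what such a quantification could look like (the paper does not attempt this), the regression sketch above could be extended with an availability interaction, whose coefficient is a difference-in-differences style estimate of how much less pageviews grew where ChatGPT was available. The chatgpt_available column is an assumed 0/1 flag per language, and the derived columns are those created in the earlier sketch.

    # Extension of the earlier sketch, not taken from the paper: the coefficient on
    # post_launch:chatgpt_available compares post-launch changes in languages where
    # ChatGPT was available against languages where it was not. Assumes df already
    # carries the log_views, post_launch, dow and week columns built above.
    import statsmodels.formula.api as smf

    def fit_did(df):
        model = smf.ols(
            "log_views ~ post_launch + post_launch:chatgpt_available"
            " + C(language) + C(dow) + C(week)",
            data=df,
        )
        return model.fit(cov_type="cluster", cov_kwds={"groups": df["language"]})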

For the other three metrics studied (unique devices, active editors, and edits), the results appear to have been even less conclusive. E.g. for edits, "[p]anel regression results for the six languages were generally not statistically significant. Among the languages where a significant result was found, our model suggested a 23.7% rise in edits in Arabic, while for Urdu the model suggested a 21.8% fall."

In the "Conclusion" section, the authors summarize this as follows:

Our findings suggest an increase in page visits and visitor numbers [i.e. page views and unique devices] that occurred across languages regardless of whether ChatGPT was available or not, although the observed increase was generally smaller in languages from countries where it was available. Conversely, we found little evidence of any impact for edits and editor numbers. We conclude any impact has been limited and while it may have led to lower growth in engagement within the territories where it is available, there has been no significant drop in usage or editing behaviours.

Unfortunately this preprint does not adhere to research best practices about providing replication data or code (let alone a preregistration), making it impossible to e.g. check whether the analysis of pageviews included automated traffic by spiders etc. (the default setting in the Wikimedia Foundation's Pageviews API), which would considerably impact the interpretations of the results. The paper itself notes that such an attempt was made for edits ("we tried to limit the impact of bots by requesting only contributions from users") but doesn't address the analogous question for pageviews.
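
For reference, the spider/automated filter is exposed directly in the URL of the Pageviews REST API, whose agent segment can be set to all-agents, user, spider or automated. Below is a minimal sketch of pulling human-only daily totals; the endpoint path follows Wikimedia's published API documentation, while the contact address in the User-Agent header is a placeholder.

    # Fetch daily pageviews counted only for human users ("user" agent type);
    # replacing "user" with "all-agents" in the URL is exactly the difference
    # the review is pointing at.
    import requests

    API = "https://wikimedia.org/api/rest_v1/metrics/pageviews/aggregate"

    def daily_user_pageviews(project="en.wikipedia", start="2021010100", end="2024010100"):
        url = f"{API}/{project}/all-access/user/daily/{start}/{end}"
        headers = {"User-Agent": "research-sketch/0.1 (contact@example.org)"}
        response = requests.get(url, headers=headers, timeout=30)
        response.raise_for_status()
        return {item["timestamp"]: item["views"] for item in response.json()["items"]}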

An earlier version of the paper as uploaded to arXiv had the title "'The Death of Wikipedia?' – Exploring the Impact of ChatGPT on Wikipedia Engagement", which was later shortened by removing the attention-grabbing "Death of Wikipedia". As explained in the paper itself, that term refers to "an anonymous Wikipedia editor's fears that generative AI tools may lead to the death of Wikipedia" – specifically, the essay User:Barkeep49/Death of Wikipedia, via its mention in a New York Times article (see Wikipedia:Wikipedia Signpost/2023-08-01/In the media). While the paper's analysis conclusively disproves that Wikipedia has died as of May 2024, it is worth noting that Barkeep49 did not necessarily predict the kind of immediate, lasting drop that the paper's methodology was designed to measure. In fact, the aforementioned NYT article quoted him as saying (in July 2023) "It wouldn't surprise me if things are fine for the next three years [for Wikipedia] and then, all of a sudden, in Year 4 or 5, things drop off a cliff." Nevertheless, the paper's findings give reason to doubt that this will be the first of the many predictions of the end of Wikipedia to come true.

Briefly

Other recent publications

Other recent publications that could not be covered in time for this issue include the items listed below. Contributions, whether reviewing or summarizing newly published research, are always welcome.

"Do We Trust ChatGPT as much as Google Search and Wikipedia?"

From the abstract:[2]

"A focus group and interview study (N=14) revealed that thankfully not all users trust ChatGPT-generated information as much as Google Search and Wikipedia. It also shed light on the primary psychological considerations when trusting an online information source, namely perceived gatekeeping, and perceived information completeness. In addition, technological affordances such as interactivity and crowdsourcing were also found to be important for trust formation."

From the paper:

"Among all three information sources, Google was the most trusted platform, favored by 57% of our participants, followed by Wikipedia, which was liked by 29% of our participants [...]. Four participants expressed that ChatGPT is less credible than Google because it does not disclose the original source of the information."

It should be noted that the authors' relieved conclusion ("thankfully") is somewhat in contrast with the result of a larger scale blind experiment published last year in preprint form (see our coverage: "In blind test, readers prefer ChatGPT output over Wikipedia articles in terms of clarity, and see both as equally credible").


WikiChat, "the first few-shot LLM-based chatbot that almost never hallucinates"

A diagram.
"All WikiChat components, and a sample conversation about an upcoming movie [Oppenheimer], edited for brevity. The steps taken to generate a response include (1) generating a query to retrieve from Wikipedia, (2) summarizing and filtering the retrieved passages, (3) generating a response from an LLM, (4) extracting claims from the LLM response (5) fact-checking the claims in the LLM response using retrieved evidence, (6) drafting a response, and (7) refining the response." (Figure 1 from the paper)

From the abstract of this paper (by three graduate students at Stanford University's computer science department and Monica S. Lam as fourth author):[3]

"This paper presents the first few-shot LLM-based chatbot that almost never hallucinates and has high conversationality and low latency. WikiChat is grounded on the English Wikipedia, the largest curated free-text corpus. WikiChat generates a response from an LLM, retains only the grounded facts, and combines them with additional information it retrieves from the corpus to form factual and engaging responses. We distill WikiChat based on GPT-4 into a 7B-parameter LLaMA model with minimal loss of quality, to significantly improve its latency, cost and privacy, and facilitate research and deployment. [...] we show that our best system achieves 97.3% factual accuracy in simulated conversations. It significantly outperforms all retrieval-based and LLM-based baselines, and by 3.9%, 38.6% and 51.0% on head, tail and recent knowledge compared to GPT-4. Compared to previous state-of-the-art retrieval-based chatbots, WikiChat is also significantly more informative and engaging, just like an LLM. WikiChat achieves 97.9% factual accuracy in conversations with human users about recent topics, 55.0% better than GPT-4, while receiving significantly higher user ratings and more favorable comments."

An online demo is available at https://wikichat.genie.stanford.edu/ . The code underlying the paper has been released under an open source license, and two distilled models (for running the chatbot locally without relying e.g. on OpenAI's API) have been published on Huggingface.

See also our review of a previous (preprint) version of this paper: "Wikipedia-based LLM chatbot 'outperforms all baselines' regarding factual accuracy"

"A Simple Model of Knowledge Scaffolding Applied to Wikipedia Growth"

From the abstract:[4]

"We illustrate a simple model of knowledge scaffolding, based on the process of building a corpus of knowledge, each item of which is linked to “previous” ones. [...]. Our model can be used as a rough approximation to the asymptotic growth of Wikipedia, and indeed, actual data show a certain resemblance with our model. Assuming that the user base is growing, at beginning, in an exponential way, one can also recover the early phases of Wikipedia growth."

"The fundamental knowledge scaffolding model. (left) Knowledge bits are represented as nodes of a network, where different colors represent different levels and nodes at a certain level only depend on a certain number of nodes at lower levels. Green (basic) nodes represent axioms. (right) Observing the filling of the network (here with fixed width W and with fixed number of dependencies K), one can detect holes [e.g. content gaps on Wikipedia] that are filled after the appearance of nodes at higher levels." (from the paper)


"males outperform females" when navigating Wikipedia under time pressure

From the abstract:[5]

"we conducted an online experiment where participants played a navigation game on Wikipedia and completed personal information questionnaires. Our analysis shows that age negatively affects knowledge space navigation performance, while multilingualism enhances it. Under time pressure, participants’ performance improves across trials and males outperform females, an effect not observed in games without time pressure. In our experiment, successful route-finding is usually not related to abilities of innovative exploration of routes."

From the paper:

"In a popular online navigation game on Wikipedia, implemented in several versions such as the Wikispeedia (https://dlab.epfl.ch/wikispeedia/play/) and the Wikigame (https://www.thewikigame.com/), players try to go from one Wikipedia article (source) to another (target) through the hyperlinks of other articles within the Wikipedia website. Several navigation patterns on the Wikipedia knowledge network have been discovered: players typically first navigate to more general and popular articles and then narrow down to articles that are semantically closer to the target[...]; players’ search is not Markovian, meaning that a navigation step depends on the previous steps taken by the players [...] To gain a better understanding of how navigation on the knowledge network is affected by individual characteristics, we conducted an online experiment where we hired 445 participants from the US to play nine rounds of Wikipedia navigation games [...]"

"WorldKG": Interlinking Wikidata and OpenStreetmap

From the abstract:[6]

"[...] the coverage of geographic entities in popular general-purpose knowledge graphs, such as Wikidata and DBpedia, is limited. An essential source of the openly available information regarding geographic entities is OpenStreetMap (OSM). In contrast to knowledge graphs, OSM lacks a clear semantic representation of the rich geographic information it contains. [...] This chapter discusses recent knowledge graph completion methods for geographic data, comprising entity linking and schema inference for geographic entities, to provide semantic geographic information in knowledge graphs. Furthermore, we present the WorldKG knowledge graph, lifting OSM entities into a semantic representation."

From the paper:

"As of September 2022, WORLDKG contains over 800 million triples describing approximately a 100 million entities that belong to over 1,000 distinct classes. The number of unique properties (wgks:WKGProperty) in WORLDKG is over 1,800. [...] WORLDKG provides links to 40 Wikidata and 21 DBpedia classes."

From https://www.worldkg.org/data :

"In total, WorldKG covers 113,444,975 geographic entities, clearly more than Wikidata (8,621,058) and DBpedia (8,621,058)."


Dissertation: "Multilinguality in knowledge graphs" such as Wikidata

From the abstract:[7]

"In this thesis, we present studies to assess and improve the state of labels and languages in knowledge graphs and apply multilingual information. We propose ways to use multilingual knowledge graphs to reduce gaps in coverage between languages. We explore the current state of language distribution in knowledge graphs by developing a framework – based on existing standards, frameworks, and guidelines – to measure label and language distribution in knowledge graphs. We apply this framework to a dataset representing the web of data, and to Wikidata. [...] Due to its multilingual editors, Wikidata has a better distribution of languages in labels. [...] A way of overcoming the lack of multilingual information in knowledge graphs is to transliterate and translate knowledge graph labels and aliases. We propose the automatic classification of labels into transliteration or translation in order to train a model for each task. [...] A use case of multilingual labels is the generation of article placeholders for Wikipedia using neural text generation in lower-resourced languages. On the basis of surveys and semi-structured interviews, we show that Wikipedia community members find the placeholder pages, and especially the generated summaries, helpful, and are highly likely to accept and reuse the generated text."

See also mw:Extension:ArticlePlaceholder and our coverage of a subsequent paper: "Using natural language generation to bootstrap missing Wikipedia articles: A human-centric perspective"


"Increasing Coverage and Precision of Textual Information in Multilingual Knowledge Graphs" such as Wikidata

From the abstract:[8]

"Recent work in Natural Language Processing and Computer Vision has been using textual information – e.g., entity names and descriptions – available in knowledge graphs [such as Wikidata] to ground neural models to high-quality structured data. However, when it comes to non-English languages, the quantity and quality of textual information are comparatively scarce. To address this issue, we [...] i) bring to light the problem of increasing multilingual coverage and precision of entity names and descriptions in Wikidata; ii) demonstrate that state-of-the-art methods, namely, Machine Translation (MT), Web Search (WS), and Large Language Models (LLMs), struggle with this task; iii) present M-NTA, a novel unsupervised approach that combines MT, WS, and LLMs to generate high-quality textual information; and, iv) study the impact of increasing multilingual coverage and precision of non-English textual information in Entity Linking, Knowledge Graph Completion, and Question Answering. As part of our effort towards better multilingual knowledge graphs, we also introduce WikiKGE-10, the first human-curated benchmark to evaluate KGE approaches in 10 languages across 7 language families."

"LIS Journals’ Lack of Participation in Wikidata Item Creation"

From the abstract:[9]

"... This article presents findings from a survey investigating practices of library and information studies (LIS) journals in Wikidata item creation. Believing that a significant number of LIS journal editors would be aware of Wikidata and some would be creating Wikidata items for their publications, the authors sent a survey asking 138 English-language LIS journal editors if they created Wikidata items for materials published in their journal and follow-up questions. With a response rate of 41 percent, respondents overwhelmingly indicated that they did not create Wikidata items for materials published in their journal and were completely unaware of or only somewhat familiar with Wikidata. Respondents indicated that more familiarity with Wikidata and its benefits for scholarly journals as well as institutional support for the creation of Wikidata items could lead to greater participation; however, a campaign of education about Wikidata, documentation of benefits, and support for creation would be a necessary first step."

Survey on entity linking: Wikidata's potential is still underused

From the paper:[10]

"Entity Linking (EL) is the task of connecting already marked mentions in an utterance to their corresponding entities in a knowledge graph (KG) [...]. In the past, this task was tackled by using popular knowledge bases such as DBpedia [67], Freebase [11] or Wikipedia. While the popularity of those is still imminent, another alternative, named Wikidata [120], appeared."

From the abstract:

"Our survey reveals that current Wikidata-specific Entity Linking datasets do not differ in their annotation scheme from schemes for other knowledge graphs like DBpedia. Thus, the potential for multilingual and time-dependent datasets, naturally suited for Wikidata, is not lifted. Furthermore, we show that most Entity Linking approaches use Wikidata in the same way as any other knowledge graph missing the chance to leverage Wikidata-specific characteristics to increase quality. Almost all approaches employ specific properties like labels and sometimes descriptions but ignore characteristics such as the hyper-relational structure. [...] Many approaches also include information from Wikipedia, which is easily combinable with Wikidata and provides valuable textual information, which Wikidata lacks."


References

  1. ^ Reeves, Neal; Yin, Wenjie; Simperl, Elena (2024-05-22). "Exploring the Impact of ChatGPT on Wikipedia Engagement". arXiv:2405.10205 [cs.HC].
  2. ^ Jung, Yongnam; Chen, Cheng; Jang, Eunchae; Sundar, S. Shyam (2024-05-11). "Do We Trust ChatGPT as much as Google Search and Wikipedia?". Extended Abstracts of the 2024 CHI Conference on Human Factors in Computing Systems. CHI EA '24. New York, NY, USA: Association for Computing Machinery. pp. 1–9. doi:10.1145/3613905.3650862. ISBN 9798400703317.
  3. ^ Semnani, Sina; Yao, Violet; Zhang, Heidi; Lam, Monica (December 2023). "WikiChat: Stopping the Hallucination of Large Language Model Chatbots by Few-Shot Grounding on Wikipedia". Findings of the Association for Computational Linguistics: EMNLP 2023. EMNLP 2023. Singapore: Association for Computational Linguistics. pp. 2387–2413. doi:10.18653/v1/2023.findings-emnlp.157. Code
  4. ^ Bagnoli, Franco; de Bonfioli Cavalcabo’, Guido (February 2023). "A Simple Model of Knowledge Scaffolding Applied to Wikipedia Growth". Future Internet. 15 (2): 67. doi:10.3390/fi15020067. ISSN 1999-5903.
  5. ^ Zhu, Manran; Yasseri, Taha; Kertész, János (2024-04-09). "Individual differences in knowledge network navigation". Scientific Reports. 14 (1): 8331. arXiv:2303.10036. Bibcode:2024NatSR..14.8331Z. doi:10.1038/s41598-024-58305-2. ISSN 2045-2322. PMID 38594309.
  6. ^ Dsouza, Alishiba; Tempelmeier, Nicolas; Gottschalk, Simon; Yu, Ran; Demidova, Elena (2024). "WorldKG: World-Scale Completion of Geographic Information". In Dirk Burghardt; Elena Demidova; Daniel A. Keim (eds.). Volunteered Geographic Information: Interpretation, Visualization and Social Context. Cham: Springer Nature Switzerland. pp. 3–19. doi:10.1007/978-3-031-35374-1_1. ISBN 978-3-031-35374-1. Dataset: doi:10.5281/zenodo.4953986
  7. ^ Kaffee, Lucie-Aimée (October 2021). Multilinguality in knowledge graphs (Thesis). University of Southampton.
  8. ^ Conia, Simone; Li, Min; Lee, Daniel; Minhas, Umar; Ilyas, Ihab; Li, Yunyao (December 2023). "Increasing Coverage and Precision of Textual Information in Multilingual Knowledge Graphs". In Houda Bouamor; Juan Pino; Kalika Bali (eds.). Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing. EMNLP 2023. Singapore: Association for Computational Linguistics. pp. 1612–1634. doi:10.18653/v1/2023.emnlp-main.100.
  9. ^ Willey, Eric; Radovsky, Susan (2024-01-02). "LIS Journals' Lack of Participation in Wikidata Item Creation". KULA: Knowledge Creation, Dissemination, and Preservation Studies. 7 (1): 1–12. doi:10.18357/kula.247. ISSN 2398-4112.
  10. ^ Cedric Möller, Jens Lehmann, Ricardo Usbeck: Survey on English Entity Linking on Wikidata. In: Semantic Web Journal, Special issue: Latest Advancements in Linguistic Linked Data, 2021; also as: Möller, Cedric; Lehmann, Jens; Usbeck, Ricardo (2021-12-03). "Survey on English Entity Linking on Wikidata". arXiv:2112.01989 [cs.CL]. Code




Reader comments

File:Wreck of the ship George Roper, Point Lonsdale (1883) by Fred Kruger.jpg
Fred Kruger
PD
2024-06-08

We didn't start the wiki

Wreck of the ship George Roper, Point Lonsdale (1883) by Fred Kruger — restored by Adam Cuerden, promoted to FP in January 2024.

This Signpost "Featured content" report covers material promoted from October 2023 through May 2024. Quotes are generally from the articles, but may be abridged or simplified for length.

It has been quite some time since we last had a featured content segment, huh? The backlog has grown so great that, in the interest of giving things their proper respect, we've decided to separate the FAs from the other forms of featured content. Here is a full list of newly promoted FAs, starting from where our last Featured Content column left off (in this case, at the end of October!).


135 featured articles were promoted during this period. Enjoy the list, and pay no heed to the ordering or any musical accompaniments you might hear.

Night frog, Angel Reese
Tannehill, Marie Hingst
26 innings, Bandit Queen, Makhno
Easy on Me, Walt and Lincoln
Marshfield, Communication
H. sechmenii, Argosy
Taylor swift promo.
Benty Grange, Leigh Sayers
Firebird, plovers
Boulez, Guallatiri
Teloschistaceae.
Caitlin Clark, Holidays
Brompton doing okay
Tom de la More, Spencer scores,
Torpedo boat T4

[Chorus]:
We didn't start the wiki
It's been always growing while the web's been slowing
We didn't start the wiki
No we didn't write it,
but we tried to cite it!

Len Deighton, Mount Berlin
Eye and Brother Jonathan
My Little Love, the peeping tom
Channel 64
OneShot, The One
Reverend John Littlejohn
Somerset Cricket Club, Running Out of Time
Will Slack, Kalven
Albona, Sovereign
Jack the Ripper, Old King Ed
a Printing Plant and Bill Madden
Steinbock, Hernan, Bulkeley and Doom again
Illinois race, Hanson Place
Champaign v. Madigan

[Chorus]

George Griffith, Phil 101
Speechless, Dolly de Leon
Hö'elün, Israelis
Gillingham early eighties
Wildest Dreams, Oyster dress,
attempt against the Tirpitz
Missouri Medicaid
George Town in Pulau Pinang
Carmichael, Cane Hill
Citigroup, storm petrel
Warburg House, sungem
Tomorrow Speculative Fiction
Notts in twenty-twenty-three
Beulé and the Beaulieu
Leucippus, Ben&Ben
adapted dissertation

[Chorus]

Sounders' year, Boundary
The Spy Who Loved Me
Breakdown, new Brompton
Day Before the Revolution
Ojos del Salado
Hanford, FA Cup final
Capri-Sun, Cross Temple
Nicoll Highway caving in
Ty Cobb getting suspended
Dark Pictures Anthology
"Well he would, wouldn't he?"

[Chorus]

Overlook, Benneson
Tufted jays and John Pulman
Edziza Complex
Pornographic artifacts
Olsen, Munsey's, Alan Wace
Centre vs. Harvard game
Blackburne, Raynald, Morgan-Chase,
Tumors in the prostate
Illinois FOIA
Member of Stenochiridae,
Pseudastacus crustaceans
Broadway, Nasutoceratops
Etika, Ed Bradley
Soccer eighteen-seven-three
An Aston Martin touring car
and John Bullock Clark Sr.

[Chorus]

The Hunger by McQueen
Mount Hudson, Ed Ætheling
Ottoman art history
Football final forty-three
Franco-Russian feminist
Her Majesty's Secret Service
Guinea-worms inside the skin
Worlds by Porter Robinson
Jacques, operetta factory
Great cuckoo-dove from New Guinea
Smithsonian photographer,
the Prior of Worcester
Viaduct in Milton Keynes
George's Greek prehistory
Pan Am Flight Two-one-four
I can't write this any more!

[Chorus]



Reader comments

File:Wikipedia LGBT.png
Wiki thành viên Liên minh hội (WTL)
CC BY-SA 4.0
2024-06-08

No queerphobia

This essay was recently the subject of quite some debate — a deletion nomination that reached nearly two hundred thousand characters of text, a deletion review, and a wide assortment of conversations on- and off-wiki. Opinions varied on the content and tone of the essay, and the essay itself changed quite a bit during the process of these discussions (including the addition and removal of material, as well as a retitling of the page itself, and a move into and back out of userspace). Ultimately, a consensus was reached that it did not merit deletion from projectspace.
In light of this volume of discourse, it seems condign that it be made available to all who are interested. The version posted here has been adapted and slightly abridged from the full version at WP:No queerphobia; the authors with 10 or more edits are listed in the byline. J

Many people are drawn to edit Wikipedia in order to promote anti-LGBT views, mistakenly believing that their beliefs are protected by the WP:NPOV policy. Expressions of homophobia, lesbophobia, biphobia, transphobia, arophobia, acephobia, or general queerphobia are not welcome here. They disrupt the encyclopedia by promoting WP:FRINGE viewpoints and drive away productive LGBT editors.

The essay WP:HATEISDISRUPTIVE lays out why denigrating minorities is not allowed on Wikipedia and results in blocking and banning; others such as Wikipedia:No racists, Wikipedia:No Nazis, and Wikipedia:No Confederates lay out more specific guidelines for those forms of bigotry; this essay specifically serves to outline common anti-LGBT beliefs, disruptive manifestations of them, and the systems of recourse on English Wikipedia.

Context of this essay

Discussions have raged on for decades about how Wikipedia should write about LGBT people and topics. Gender and sexuality (WP:GENSEX) are currently considered a contentious topic (formerly "discretionary sanctions"), meaning that editors contributing to articles and discussions about these topics must strictly follow Wikipedia's behavioral and editorial guidelines. MOS:GENDERID and the supplementary essay MOS:GIDINFO contain the most up-to-date guidelines for writing about transgender people on Wikipedia.

Anti-LGBT editors frequently disrupt Wikipedia by promoting misinformation or pushing fringe viewpoints (particularly dangerous in medical articles), and create an unwelcoming environment for other editors. Editors who are unable to set aside their beliefs about the LGBT community when editing or who seek to promote WP:FRINGE viewpoints may be restricted from editing.

This essay outlines common queerphobic beliefs, popular misinformation about the LGBT community, and groups known to spread and support it, so that administrators and editors may recognize them, address them, and show queerphobes the door.

Beliefs, expressions, and actions

This essay and sister essays such as WP:NORACISTS, WP:NOCONFED, and WP:NONAZIS face a common criticism: "we should sanction editors for their behaviors, not their beliefs".

This is not an unfair argument, so it bears exploration. The essay Wikipedia:Hate is disruptive addresses the issue like this (emphasis added):

This essay is based on that underlying principle, put succinctly as "your right to swing your fist stops where my nose begins". If you believe LGBT people are amoral deviants who need conversion therapy, but practice civility, never bring it up, and solely contribute to articles about entomology and highways, you have nothing to worry about and your contributions to Wikipedia are welcomed. This essay isn't about you. If you try to change the first sentence of LGBT to All LGBT people are amoral deviants who need conversion therapy...—or insist on talk pages that this is the case and Wikipedia needs to take your POV seriously—that is a behavioral issue and the focus of this essay.

Queerphobic beliefs

Queerphobia is the fear, hatred, or dislike of lesbian, gay, bisexual, transgender, and otherwise queer people. Queerphobes commonly believe that LGBT people and identities are deviant, and should be denied rights and protections.

Frequent anti-LGBT narratives include:

  • That being LGBT is a conscious choice or unnatural.
  • That LGBT people are inherently fetishistic, predatory, pedophilic, or otherwise dangerous.
  • That LGBT people cannot know their identities.
  • That LGBT people only identify as such due to media exposure, peer pressure, or social contagion.
  • That the LGBT community or a subset of it are indoctrinating or grooming youth into being LGBT.
  • That LGBT people overall have greater societal power than cisgender/heterosexual people.
  • That marriage, adoption, or parenting should be restricted to heterosexual couples.
  • That recognizing same-sex marriage is a slippery slope towards legalizing bestiality or other strange or disfavored sexual practices.
  • That the open or subtextual presence of LGBT people or acknowledgment of them is inappropriately sexual or political and should be kept from the public square, media, or education.

Overlapping with the narratives and beliefs above are more medically-related pseudoscientific/unevidenced proposals and typologies. The guideline WP:FRINGE addresses how to handle these in articles (we don't include them in articles on the broader topic, but if notable we can discuss them in their own articles while making clear they're fringe).

  • That LGBT identities and/or gender dysphoria are the result of mental illness.[1]
  • That LGBT identities should be cured, treated, or suppressed,[2] commonly referred to as conversion therapy; advocates often use terms such as reparative therapy or gender exploratory therapy and may justify the practice in scientific or religious terms.
  • That LGBT people should be forced to undergo medical or psychological treatments, procedures, or testing on the basis of their identity.[2]
  • That transgender people should be unable to change their legal gender, should be invariably excluded from gendered spaces, or should be legally denied medical transition or have it otherwise made inaccessible.[1][2]

Queerphobic editors on Wikipedia frequently think:

Possible manifestations

These beliefs may manifest in various ways that damage the encyclopedia. Below is a non-exhaustive list of possible ones.

Aspersions

Accusations of queerphobia (like other -ist or -phobe accusations) should not be used as a trump card in disputes over content or as a coup de grâce on a noticeboard. They have the potential to permanently damage someone's reputation, especially when the accused's account is publicly tied to a real-world identity. As such, unsubstantiated aspersions are a form of personal attack which may lead to the accuser being blocked.

Aspersions make the normal dispute resolution process difficult to go through and may create a chilling effect. Editors are encouraged to work through the normal dispute-resolution process when it comes to legitimate content disputes, such as disagreements on the interpretation or quality of sources.

What to do if you encounter queerphobia

You should always assume good faith and exercise civility. However, our social policies are not a suicide pact; we don't have to treat every harmful edit as the result of non-malicious ignorance.

For a new editor, understand that they are likely ignorant of Wikipedia systems and standards. Point them toward relevant guidelines and policies. If they are editing material related to gender identification, make them aware of the GENSEX topic restrictions via the {{Contentious topics/alert/first|gg}} or {{Contentious topics/alert|gg}} templates. If they are arguing against the guidelines, make it clear that you can't change the guidelines in an article discussion and direct them toward where such discussions can take place.

If an editor consistently and chronically disrupts the encyclopedia by promoting queerphobic opinions/viewpoints, you should collect relevant diffs and report them. If an editor was already made aware of the GENSEX topic restrictions, then you can request enforcement at WP:AE. Otherwise, request administrator attention at WP:ANI.

Editors brazenly vandalizing articles or using slurs may be immediately blocked. Wikipedia has zero tolerance for such behavior. If an edit is grossly insulting, degrading, or offensive, it may be subject to revision deletion. If an edit breaches someone's privacy, you should request Oversight.

It can be very tempting, especially on article talk pages, to debate or rebut anti-LGBT talking points on their own merits. However, remember that Wikipedia is not a forum. Stick to source-based and policy-based discussions which serve to improve articles. If a conversation is blatantly unconstructive or off-topic, consider collapsing, refactoring, or moving it so that you and other editors don't waste time on it.

References

  1. ^ a b "APA Policy Statement on Affirming Evidence-Based Inclusive Care for Transgender, Gender Diverse, and Nonbinary Individuals, Addressing Misinformation, and the Role of Psychological Practice and Science" (PDF).
  2. ^ a b c o'Connor, Aoife M.; Seunik, Maximillian; Radi, Blas; Matthyse, Liberty; Gable, Lance; Huffstetler, Hanna E.; Meier, Benjamin Mason (2022). "Transcending the Gender Binary under International Law: Advancing Health-Related Human Rights for Trans* Populations". Journal of Law, Medicine & Ethics. 50 (3): 409–424. doi:10.1017/jme.2022.84.



Reader comments

File:Red flag warning banner at Cal Fire Green Springs Station, May 2022.JPG
Pi.1415926535
CC BY-SA 4.0
2024-06-08

RetractionBot is back to life!

A bit of history, context, and what you can expect to see in articles

RetractionBot, from pre-history to v2

Back in 2012, Doc James made a query over at WikiProject Medicine about what sort of work could be done by a bot to find retracted papers. At the time, there was no centralized way of finding retracted papers, so Rich Farmbrough queried the PubMed database looking for retraction-related keywords (like 'retraction of publication' in the metadata). Of the roughly 4000 retractions, he found 138 that matched papers cited on Wikipedia. The template {{retracted}} was created to flag those papers, and was manually and semi-automatically added to articles.
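
That kind of metadata query is still easy to reproduce today against PubMed's E-utilities interface. Here is a rough sketch in Python: the "retraction of publication" publication type is a documented PubMed filter, while the parameter values and the lack of paging or error handling are purely illustrative.

    # Sketch of the kind of PubMed query described above: ask the E-utilities
    # esearch endpoint for everything tagged with the "retraction of publication"
    # publication type and get back a list of PubMed IDs.
    import requests

    ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

    def pubmed_retraction_pmids(max_results=10_000):
        params = {
            "db": "pubmed",
            "term": '"retraction of publication"[Publication Type]',
            "retmax": max_results,
            "retmode": "json",
        }
        response = requests.get(ESEARCH, params=params, timeout=30)
        response.raise_for_status()
        return response.json()["esearchresult"]["idlist"]  # PMIDs as strings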

Then in 2018, JenOttawa noticed the then newly-launched RetractionDatabase.org, a database of retractions maintained by Retraction Watch. This led Samwalton9 to code the first iteration of RetractionBot, which did automatically what people had been doing manually, saving everyone a lot of hassle. However, the bot ran for only a few months before hitting a snag: several Cochrane Reviews were flagged as retracted for technical reasons, even though they had never actually been retracted. The bot was put on hiatus, and Samwalton9 never got around to fixing the issue.

Five years later, motivated by the slew of retractions hitting Elsevier, Hindawi, SAGE, and many other major publishers, as well as the opening up of RetractionDatabase.org (now with nearly 40,000 retractions), I thought it would do us some good to kick the hornet's nest and see if I could interest someone in revisiting this project.

Turns out I could. Not even a week after prodding the volunteers at WP:BOTREQ, mdann52 graciously took over maintenance of RetractionBot, and the bot is now back in action, with many improvements. In particular, it now covers not only retractions but also expressions of concern, which are early warning signs that a paper might be dubious and could be retracted or in need of a major revision. This led to the creation of {{expression of concern}}, which works very similarly to {{retracted}} (see below).

What the bot does


The bot first downloads a .csv file containing all the information in the RetractionDatabase (a 50MB download available here). Then it crosschecks retracted DOIs and PMIDs in the database against those found on Wikipedia.
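
For readers curious how that crosscheck might look in practice, here is a minimal Python sketch. It is not RetractionBot's actual code: the download URL and the CSV column names are placeholders assumed for illustration, and real citation templates need more careful parsing than these simple regexes attempt.

import csv
import io
import re

import requests

CSV_URL = "https://example.org/retraction_database.csv"  # placeholder for the real 50MB export

def load_retracted_ids(url: str) -> tuple[set[str], set[str]]:
    """Build lookup sets of retracted DOIs and PMIDs from the database export."""
    text = requests.get(url, timeout=120).text
    dois, pmids = set(), set()
    for row in csv.DictReader(io.StringIO(text)):
        # Column names are assumed for the example; the real export may differ.
        if row.get("OriginalPaperDOI"):
            dois.add(row["OriginalPaperDOI"].strip().lower())
        if row.get("OriginalPaperPubMedID"):
            pmids.add(row["OriginalPaperPubMedID"].strip())
    return dois, pmids

def find_retracted_citations(wikitext: str, dois: set[str], pmids: set[str]) -> list[str]:
    """Return the cited DOIs/PMIDs on a page that also appear in the retraction database."""
    cited_dois = {m.lower() for m in re.findall(r"\|\s*doi\s*=\s*([^|}\s]+)", wikitext)}
    cited_pmids = set(re.findall(r"\|\s*pmid\s*=\s*(\d+)", wikitext))
    return sorted(cited_dois & dois) + sorted(cited_pmids & pmids)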

If a match is found, the bot will, for example, change

  • ...Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.[1]
1. ^ Restrepo-Arango, Marcos; Gutiérrez-Builes, Lina Andrea; Ríos-Osorio, Leonardo Alberto (April 2018). "Seguridad alimentaria en poblaciones indígenas y campesinas: una revisión sistemática". Ciência & Saúde Coletiva. 23 (4): 1169–1181. doi:10.1590/1413-81232018234.13882016. PMID 29694594.

to

  • ...Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.[1]
1. ^ Restrepo-Arango, Marcos; Gutiérrez-Builes, Lina Andrea; Ríos-Osorio, Leonardo Alberto (April 2018). "Seguridad alimentaria en poblaciones indígenas y campesinas: una revisión sistemática". Ciência & Saúde Coletiva. 23 (4): 1169–1181. doi:10.1590/1413-81232018234.13882016. PMID 29694594. (Retracted, see doi:10.1590/1413-81232018241.32242011, PMID 30698268,  Retraction Watch. If this is an intentional citation to a retracted paper, please replace {{retracted|...}} with {{retracted|...|intentional=yes}}.)

It is now up to humans like you to review whether this is a problem for the article. If the citation is no longer reliable, the article needs to be updated; that could be as minor as removing the citation or replacing it with a reliable one, or as major as rewriting an entire section that was based on flawed premises. If the citation to a retracted paper was intentional, as in the context of a controversy noting that a paper was later retracted, you can replace {{retracted|...}} with {{retracted|...|intentional=yes}}, suppressing the red notice:

  • ...Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.[1]
1. ^ Restrepo-Arango, Marcos; Gutiérrez-Builes, Lina Andrea; Ríos-Osorio, Leonardo Alberto (April 2018). "Seguridad alimentaria en poblaciones indígenas y campesinas: una revisión sistemática". Ciência & Saúde Coletiva. 23 (4): 1169–1181. doi:10.1590/1413-81232018234.13882016. PMID 29694594. (Retracted, see doi:10.1590/1413-81232018241.32242011, PMID 30698268,  Retraction Watch )

What you can do

If you are interested in doing systematic work involving Wikipedia articles citing retractions, the category Category:Articles intentionally citing retracted publications will be automatically populated by {{retracted}}. The retractions that haven't yet been reviewed by a human can be found in the sub-category Category:Articles citing retracted publications instead.

Otherwise? Well... carry on as usual. But if you see one of those big red notices, don't panic. Treat it like any other unreliable source, and update the article accordingly. If a retracted paper (or one with an expression of concern) is intentionally cited, simply follow the instructions and replace {{retracted|...}} with {{retracted|...|intentional=yes}} (or {{expression of concern|...}} with {{expression of concern|...|intentional=yes}}) to suppress the red notice.
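
The swap can also be scripted when many pages are involved. The sketch below is not part of RetractionBot, and the helper name is made up for the example; it only handles simple template calls with no nested templates, so treat it as an illustration rather than a ready-made tool.

import re

def mark_intentional(wikitext: str, template: str = "retracted") -> str:
    """Append |intentional=yes to simple {{retracted|...}} calls that lack it."""
    pattern = re.compile(
        r"\{\{\s*" + re.escape(template) + r"\s*\|([^{}]*?)\}\}", re.IGNORECASE
    )

    def add_flag(match: re.Match) -> str:
        params = match.group(1)
        if "intentional" in params:
            return match.group(0)  # already flagged, leave untouched
        return "{{%s|%s|intentional=yes}}" % (template, params)

    return pattern.sub(add_flag, wikitext)

# Example:
# mark_intentional("{{retracted|doi=10.1590/1413-81232018241.32242011}}")
# -> "{{retracted|doi=10.1590/1413-81232018241.32242011|intentional=yes}}"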

And while these are not currently handled by RetractionBot, {{erratum}} and the matching categories (Category:Articles intentionally citing publications with errata and Category:Articles citing publications with errata) work very similarly to the above for papers with errata and might also be of interest to you.

Happy editing!



Reader comments

File:Rinpa style ink-stone box.jpg
?
PD
47
9
400
2024-06-08

Chimps, Eurovision, and the return of the Baby Reindeer

This traffic report is adapted from the Top 25 Report, prepared with commentary by Igordebraga, Vestrian24Bio, Shuipzv3, and CAWylie (May 12 to 18) and Igordebraga, CAWylie, Boyinaroom, and Vestrian24Bio (May 19 to 25).

I hate every ape that I see, from Chimpan-A to Chimpanzee (May 12 to May 18, 2024)

Rank Article Views Notes/about
1 Kingdom of the Planet of the Apes 996,673 War for the Planet of the Apes closed an inspired trilogy seven years ago, but there are clearly more stories to be told in a world of talking apes. Wes Ball directs the franchise's tenth movie overall, set generations after Caesar, in which an ambitious bonobo tries to forcibly unite the ape clans and is opposed by a young member of a tribe of falconer chimpanzees and a woman who is a reminder of when humans were the dominant species. Continuing the franchise's good streak, Kingdom of the Planet of the Apes earned positive reviews and is close to clearing its $160 million budget after less than two weeks in theaters, making it clear that plans for a whole trilogy will soon be fulfilled.
2 Eurovision Song Contest 2024 984,621 After Sweden won the 2023 contest, it hosted this year's edition at Malmö Arena. Of the 37 entrants, 25 made it to the final; the semifinals began May 7. Romania opted not to participate, and Luxembourg competed for the first time since 1993. The Netherlands was disqualified before the final, but retained its right to vote. The inclusion of Israel among the participants, in the context of the Israel–Hamas war, was met with controversy. Ultimately, Switzerland won the contest.
3 2024 Indian general election 975,252 The largest election in history, held to choose India's 18th parliament, finished its fourth phase on 13 May, and the fifth phase began the following Monday. It will continue until the final count on 4 June.
4 Deaths in 2024 968,728 If man is five
Then the devil is six
Then God is seven
This monkey's gone to heaven
5 Yasuke 924,347 Yasuke was a man, likely of African origin, who served under the daimyo Oda Nobunaga from 1581 to 1582. The video game company Ubisoft announced that a fictionalized version of Yasuke would be a primary character in the upcoming Assassin's Creed Shadows, which prompted discussions about his being featured over a Japanese character in a game set in feudal Japan.
6 Heeramandi 862,850 Despite mixed reviews following its May 1 release, this Indian period drama series was that week's second most-watched non-English show on Netflix globally, receiving 4.5 million views and 33 million viewing hours and breaking the record for the most-viewed Indian series in its first week of release. It trended at number one in 10 countries and was among the ten most-watched shows in 43 countries.
7 Robert Fico 663,364 The Prime Minister of Slovakia survived an assassination attempt; after emergency surgery for five bullet wounds, his condition stabilized. The attack was possibly politically motivated, with the suspect having opposed some of Fico's policies, such as taking more control of the media and weakening anti-corruption policies.
8 Harrison Butker 654,977 Butker, a kicker for the Kansas City Chiefs and a conservative Catholic, gave a commencement speech at the private Benedictine College on May 11, in which he congratulated the women in the audience – and said most were probably eager to get married and have children. Butker also criticized Pride Month and US president Joe Biden's stance on abortion. While he received a standing ovation from the crowd, others were less impressed: the NFL distanced itself from his comments, and it was pointed out that Butker's mother is a medical physicist with two university degrees.
9 Baby Reindeer 632,686 This critically acclaimed Netflix miniseries, based on author Richard Gadd's real-life experience, has remained in the Top 10 five weeks after its release.
10 Bridgerton 631,370 Shondaland's first scripted series for Netflix, set during the social season of Regency-era London, was renewed for a third season in 2021; the first half of that season premiered on May 16.

When the pastor's music plays, and that casket rolls away (May 19 to 25)

Rank Article Views Notes/about
1 Ebrahim Raisi 2,165,931 #8 ended the life of the President of Iran, a controversial figure even before taking office: in the 1980s he was part of a commission that ordered the execution of political prisoners, and under his government the international community complained about the intensification of Iran's nuclear program and its overall belligerence (supporting Russia against Ukraine, arming the groups that attack Israel, and last month outright firing missiles and drones), while Iranians themselves staged protests against the morality police. In spite of that, the government itself is not expected to change much after his death, beyond the country's de facto ruler Ali Khamenei (who has been Supreme Leader of Iran since 1989) appointing the vice president as acting president until new elections are held on June 28.
2 Ali Khamenei 1,029,402
3 Deaths in 2024 1,000,312 In the dark of night
These faces, they haunt me
But I wish you were
So close to me...
4 2024 Indian general election 990,635 The Indian Lok Sabha election reaches its conclusion next week, deciding whether the ruling party (in power since 2014) will continue for another five years, making it 15 consecutive years, or whether the opposition will get its turn after 10 years.
5 Ademola Lookman 933,770 The 2024 UEFA Europa League final was a crowning moment for this striker, born in London to Nigerian parents (which led him to play internationally for the African country), who scored all three goals that secured the title for Atalanta BC. The losing team, Bayer Leverkusen, didn't cry too much, having finally won the Bundesliga title that had eluded them for over 100 years.
6 Oleksandr Usyk 910,189 This Ukrainian boxer defeated Tyson Fury to become the undisputed heavyweight champion.
7 Morgan Spurlock 846,546 On the 20th anniversary of Super Size Me, an Academy Award-nominated documentary about spending a month eating only McDonald's to see what so much fast food does to a body, its director and star Morgan Spurlock died at 53, following struggles with cancer. His career was not free of controversy, including claims that Super Size Me was not completely accurate (particularly for hiding Spurlock's alcohol abuse) and Spurlock's own admission of a history of sexual misconduct (which made 2017's Super Size Me 2: Holy Chicken!, an exposé of the chicken industry, lose its distribution deal and ultimately become his final movie).
8 2024 Varzaqan helicopter crash 821,820 #1 had left the Giz Galasi Dam near the border with Azerbaijan to inaugurate a refinery in Tabriz when his helicopter crashed in a forest. All eight people aboard died, including the Foreign Minister, two officials of the province of East Azerbaijan (one of whom survived the crash itself and answered the pilot's phone when the rescue crew called, before dying of his injuries), and three flight crew. It should be noted that two other Iranian presidents have been involved in helicopter crashes, as the country's aircraft are not in optimal condition: international sanctions have led to shortages of replacement parts.
9 Bridgerton 779,673 The first half of the third season of this American Netflix series, set in an alternative Regency-era London of the early 1800s, was released last week. With the previous seasons having won two Emmy Awards and more, the highly anticipated third season opened to positive reviews across the internet.
10 President of Iran 729,927 Ever since the Iranian Revolution of 1979 overthrew the monarchy, the country's highest authority has been a religious one, the Supreme Leader of Iran, but right below him is a President chosen by the people. #2 has held both offices, serving as President from 1981 to 1989 (taking office after the only other president besides #1 to die in office, Mohammad-Ali Rajai, who was killed alongside prime minister Mohammad-Javad Bahonar when their offices were bombed), before becoming Supreme Leader as the hand-picked successor of the recently deceased Ayatollah Khomeini.

Exclusions

  • These lists exclude the Wikipedia main page, non-article pages (such as redlinks), and anomalous entries (such as DDoS attacks or likely automated views). Since mobile view data became available to the Report in October 2014, we exclude articles that have almost no mobile views (5–6% or less) or almost all mobile views (94–95% or more) because they are very likely to be automated views based on our experience and research of the issue. Please feel free to discuss any removal on the Top 25 Report talk page if you wish.
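
In code terms, the mobile-share rule amounts to a simple threshold check. The toy function below is only an illustration of that rule (the function name and figures are invented); in practice the Report's editors also apply their own judgment before excluding an article.

def is_likely_automated(mobile_views: int, desktop_views: int,
                        low: float = 0.06, high: float = 0.94) -> bool:
    """Flag an article whose share of mobile views is suspiciously low or high."""
    total = mobile_views + desktop_views
    if total == 0:
        return True
    share = mobile_views / total
    return share <= low or share >= high

# Example: about 2% mobile views falls below the ~5-6% threshold, so the article would be excluded.
# is_likely_automated(mobile_views=2_000, desktop_views=98_000)  # -> True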

Most edited articles

For the April 27 – May 27 period, per this database report.

Title Revisions Notes
Deaths in 2024 1964 Along with Ebrahim Raisi and Morgan Spurlock, listed above, the period saw the deaths of, among others, Steve Albini, Roger Corman, and Richard M. Sherman.
List of tennis families 1494 Mellamelina created this page, listing the tennis players whose relatives also picked up a racket.
2024 Varzaqan helicopter crash 1437 As mentioned above, the helicopter crash that killed five Iranian politicians, including the country's president.
Eurovision Song Contest 2024 1327 After Sweden won the 2023 contest, it hosted this year's edition at Malmö Arena. Of the 37 entrants, 25 made it to the final; the semifinals began May 7. Romania opted not to participate, and Luxembourg competed for the first time since 1993. The Netherlands was disqualified before the final, but retained its right to vote. The inclusion of Israel among the participants, in the context of the Israel–Hamas war, was met with controversy. Ultimately, Switzerland won the contest with "The Code", performed by Nemo.
Drake–Kendrick Lamar feud 1241 This rap feud between Aubrey Drake Graham from Toronto and Kendrick Lamar from Compton has been going on since the 2010s, but it escalated in March 2024 with back-to-back releases from both rappers. The lyrical content included Drake accusing Lamar of beating his wife and Lamar calling Drake a pedophile.
Legalism (Chinese philosophy) 1013 Big wheel keep on turnin', FourLights keep on burnin'.
Megalopolis (film) 992 Legendary director Francis Ford Coppola unveiled his first movie in 13 years, a project he had been trying to film since the 1970s; it debuted at the 77th Cannes Film Festival to a polarizing critical reception.
Tornadoes of 2024 962 Before Twisters hits theaters in July, a reminder that even outside disaster movies those devastating winds are very dangerous, with a particular increase during two multi-day outbreaks.
2023–24 Liga 3 (Indonesia) 910 The third division of Indonesian football.
2024 Indian Premier League 849 Indians are not checking the page on their cricket tournament as much as in previous weeks, but updates are still frequent.
History of Christianity 812 After shepherding this vital article through a GA reassessment, Jenhawk777 started a peer review in the hope of pushing it to Featured status.
Bigg Boss (Malayalam TV series) season 6 794 Like its film industries, India has one Big Brother for every language.
2024 New Caledonia unrest 780 New Caledonia, a French territory in the Pacific Ocean, reacted violently to a controversial voting reform aiming to change restrictions that prevent up to one-fifth of the population from voting, leading to seven deaths and the declaration of a state of emergency.
2024 pro-Palestinian protests on university campuses 752 Benjamin Netanyahu is a stubborn man who won't stop the Israel–Hamas war even though the international community would prefer otherwise. Among those protesting are university students all over the world, mostly in non-violent ways, though the protests have also raised concerns about antisemitic incidents.
2024 Stanley Cup playoffs 695 The NHL postseason rolls on, and the four remaining teams in the Conference Finals have all been waiting a long time to celebrate a title: New York Rangers (last won in 1994) vs. Florida Panthers (entered the league in 1993 and have never won), and Dallas Stars (one slightly controversial title in 1999) vs. Edmonton Oilers (last raised the Stanley Cup in 1990).



Reader comments

File:Wikipediholic_Family_Comic.png
Relativity
CC0
60
120
900
2024-06-08

The Wikipediholic Family

Note: Light mode is recommended for viewing this comic.

Wikipediholic Family Comic




Reader comments

File:Man using computer (cropped) (vandalisming).jpg
Sapphireasa
CC BY-SA 3.0
300
2024-06-08

Wikipedia rattled by sophisticated cyberattack of schoolboy typing "balls" in infobox

A guy typing stupid stuff into a Wikipedia edit window

EAST SCOWTUMPKA, WISCONSIN – A local high-school sophomore and superhacker, known only as 199.85.228.117, today announced that he'd successfully hacked Wikipedia, the world-famous free encyclopedia anybody can edit, making his school's infobox say that principal Hubert Glockenspiel's name was instead "Hubert Glockenballs".

"I just went in there and, uh, I totally hacked it dude," 199.85.228.117 told the Signpost in an interview yesterday morning. "They thought they were so cool making a website anybody could edit, but they never even realized that people might go on there and write stuff that wasn't true on purpose."

The extent of the damage

"Glockenballs" wasn't the only piece of misleading information inserted by this sophisticated cyberattack – information about the school's cafeteria was also manipulated. "Pizza on Wednesdays", for example, was changed to "Pizza on EVERY DAY". And those seeking to find out more about the school's "creamed corn" day were, instead, confronted with a Wikipedia article claiming that they served "creamed crap".

It's unclear what the implications of this shocking attack will be for the future of the collaborative encyclopedia project. "It's devastating," said longtime arbitrator and Wikipedia administrator Newyorkchad, "and this is making me seriously question whether Wikipedia is viable in the long term – it had never occurred to us that people could open up the edit box and type swear words into it."


Meanwhile, some are puzzled by what could have motivated such a malicious deed. 199.85.228.117 explained: "I might not look like it, but I'm actually a pretty dark and twisted guy. I'm honestly actually kind of like, uh, the Joker meets South Park. So this is just a little way of me showing everyone what happens when you mess with the nice guy... hehe."

A task force springing into action, and very real harms

The volunteer staff of Wikipedia are currently working round-the-clock to figure out how to address this problem, with hundreds of ideas being worked out on the Village Pump regarding what can be done to prevent this menace to the project. Currently, the proposal with the most support is for the Wikimedia Foundation to establish an Anti-Misediting Exploratory Committee, which will come up with proposals on how to deal with this new phenomenon: first by setting up an Anti-Misediting Steering Board, which would draft a charter for an Anti-Misediting Policy Evaluation Panel, which could then publish a report on best practices for a provisional Anti-Misediting Implementation Group.

This news comes as a shock for those who previously held great faith in Wikipedia's reliability, and experts warn that it could threaten the very foundations of our democracy. "It's like nothing we've ever seen before," says Amelia Hegginbottom, professor of Malnarrative Resilience Studies at the University of Bridgetonshire-upon-Prestigiousham. "Study after study has shown, and experts have warned for years, that there are very real harms in the unchecked spread of malnarratives – like that this man's age was '999', or that his favorite food was 'rocks'."

"If people start thinking it's good to eat rocks," she explains, "they might eat a bunch of rocks, and then they'll have to go to the hospital, so that's one real harm already. And if my research group got a few million dollars in grants, we could figure out the rest, and then save democracy forever."

Wikipedia cofounder Larry Sanger was less optimistic, saying: "Whatever this thing is, I'm pretty sure it goes to show that nowadays Wikipedia is a pile of crap".

At the mercy of a master hacker

Meanwhile, 199.85.228.117 has shown no remorse, and has threatened to carry out further cyberattacks.

"We live in a society where the independent people who think for themselves are called 'crazy' and the ignorant sheep are called 'normal'. Just a glimpse into my twisted reality, a full stare would make most people simply just go insane haha. I'm just getting started... and this time it's personal."

However, back in East Scowtumpka, things are a little more relaxed: "Wow," said Principal Glockenspiel. "You'd think I would have heard them all by now, but that's actually a new one. Frankly, it is kind of funny. 'Glockenballs'. You know, like... uh... never mind."

When asked how he was coping with the recent developments, Glockenspiel said that it "doesn't really seem like that big of a deal, especially if it was only there for a minute or two," adding that "that article probably gets like ten views a week or something."

At press time, he had excused himself to the school's computer lab to check up on the Wikipedia page and "fix a couple things". We at The Signpost wish him the best of luck, although he may not need it in a place like that: according to the school's own article, the computer lab is a "cutting-edge, state-of-the-art facility made possible by award-winning principal Hubert Glockenspiel's commitment to investing in excellence, teaching valuable research skills and bringing a global perspective to the students at Powpica County's highest-ranked secondary institution with his time-tested Glockenspiel Method of innovative education leadership".



Reader comments

File:Photograph of Federal Records Center, Alexandria, Virginia (34877725360).jpg
USNA
PD
800
2024-06-08

Palimpsestuous

Early human habitation

From Wikipedia, the correct encyclopedia

This is an old revision of this page, as edited by Username or IP removed in the year of our Redemption 338-01-06T03:00:27 (edit summary removed). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.
(change visibility) (diff) ← Previous revision | Latest revision (diff) | Newer revision → (diff)

This page revision has been removed from the public archives. Details can be found in the deletion log for this page.
As an administrator, you can still view this revision if you wish to proceed, by clicking "show" to the right.
The '''early human habitation heresy''' is the erroneous, debunked belief that human beings existed prior to the Antecessors' intervention in the first year of our creation and Redemption (1 YR). It is considered to be blasphemy, and is frequently espoused by members of extremist sects. Correct knowledge indicates that humans were created in their present state by the Antecessors, who established the structures of Governance and entrusted them to the Holy Council, before leaving our planet permanently.
+
The '''early human habitation is the belief that human beings existed prior to the Antecessors' intervention in the first year of our creation and Redemption (1 .



Reader comments

If articles have been updated, you may need to refresh the single-page edition.