Fake news website
Fake news websites are websites that publish hoaxes, propaganda, or disinformation to increase web traffic through sharing on social media. Unlike news satire, where humor is the object, fake news websites seek to increase their traffic by knowingly circulating false stories. Fake news websites have promoted misleading or factually incorrect information concerning the politics of several countries, including Germany, Indonesia and the Philippines, Sweden, China, Myanmar, Italy, France, Brazil, Australia, India, and the United States. Many of the false news sites are hosted in Russia, Macedonia, Romania, and the U.S.
One Pan-European newspaper, The Local, described the proliferation of fake news as a form of psychological warfare. The European Parliament's Committee on Foreign Affairs called attention to the problem in 2016 when it passed a resolution warning that the Russian government was using think tanks, "pseudo-news agencies" and "Internet trolls" as forms of propaganda and disinformation to weaken confidence in Western institutions.
In 2015, the Swedish Security Service, Sweden's national security agency, issued a report concluding Russia was utilizing the tactic to inflame "splits in society" through the proliferation of propaganda. Sweden's Ministry of Defence tasked its Civil Contingencies Agency to combat fake news from Russia. Fraudulent news affected politics in Indonesia and the Philippines, where there was simultaneously widespread usage of social media and limited resources to check the veracity of political claims. German Chancellor Angela Merkel warned of the societal impact of "fake sites, bots, trolls".
Fraudulent articles spread through social media during the 2016 U.S. presidential election. Several officials within the United States Intelligence Community said that Russia was engaged in spreading fake news. Computer security company FireEye concluded Russia used social media as cyberwarfare. Google and Facebook banned fake sites from using online advertising. U.S. President Barack Obama said a disregard for facts created a "dust cloud of nonsense". Concern advanced bipartisan legislation in the U.S. Senate to authorize U.S. State Department action against foreign propaganda. U.S. Senate Intelligence Committee member Ron Wyden said: "There is definitely bipartisan concern about the Russian government engaging in covert influence activities of this nature."
Prominent among fraudulent news sites is false propaganda created by individuals in Russia, Macedonia, Romania, and the United States. Several of these websites are structured to fool visitors into believing they are real publications, mimicking the stylistic appearance of ABC News and MSNBC, while other pages are overtly propaganda.
A significant amount of fraudulent news during the 2016 United States election cycle came from teenagers in Macedonia attempting to profit quickly from readers who believed their falsehoods. An investigation by BuzzFeed revealed that over 100 websites spreading fraudulent articles supportive of Donald Trump were run by teenagers in the town of Veles, Macedonia. The teenagers experimented with fraudulent stories about Bernie Sanders and other articles written from a left-leaning or liberal slant; they quickly found that their most popular fraudulent writings were about Donald Trump.
The Guardian performed its own independent investigation, reached the same conclusion as BuzzFeed News, and traced over 150 fraudulent news sites to the same town of Veles, Macedonia. One of the Macedonian teenagers, "Alex", interviewed by The Guardian during the election cycle in August 2016, said that regardless of whether Trump won or lost, fraudulent news websites would remain profitable. He explained that he often began his pieces with plagiarism, copying and pasting content directly from other websites.
Craig Silverman of BuzzFeed News, one of the investigative journalists who exposed the ties between fraudulent websites and Macedonian teenagers, told Public Radio International that successful false stories netted the Balkan teenagers a few thousand dollars per day, and that fake articles in aggregate earned them on average a few thousand per month. Public Radio International reported that after the 2016 election season the teenagers from Macedonia would likely turn back to making money off fraudulent medical advice websites, which Silverman noted was where most of the youths had previously garnered clickbait revenues.
The Associated Press tracked down an 18-year-old in Veles, Macedonia, and interviewed him about his tactics, meeting the fake news website operator at Gemdidzii Sports Hall. The teenager used Google Analytics to assess his web traffic and garnered 650,000 views over the course of one week. He regularly copied and pasted stories favorable to Donald Trump from a right-wing site called The Political Insider. He told the Associated Press he did not personally care about politics and was producing fake news merely to earn extra money and to gain experience in his chosen field of marketing. The teenager said the onus should be on the consumer to check information: "They can read it if they want to. I'm not the one pushing them to click on the article." The Associated Press used DomainTools to confirm the teenager was behind several fake news websites, and to determine that approximately 200 websites focused on U.S. news were traced to Veles. The Associated Press reported that the majority of the fake news sites consisted of plagiarized material. In Veles, a town of 50,000, the additional income brought in by fake news sites drew no objection from the local populace. Petar Peckov, a local reporter, told the Associated Press the townspeople were happy the youths were working.
"Ending the Fed", a site set up in March 2016 by Ovidiu Drobota, reached over 3 million visitors a month according to Alexa Internet. At the time of the site's launch, Drobota was a 24-year-old Romanian web developer specializing in search engine optimization. In August 2016 the site propagated a false story that Fox News had fired journalist Megyn Kelly. Facebook manually removed the article from its "Trending News" section once it became clear the story was bogus. In the three months prior to the 2016 U.S. election, "Ending the Fed" held four of the ten most popular fake election-related articles on Facebook, and its associated Facebook page had 350,000 followers in November 2016.
Internet Research Agency
Beginning in fall 2014, The New Yorker writer Adrian Chen performed a six-month-long investigation into Russian propaganda campaigns on the Internet orchestrated by a group that called itself the Internet Research Agency. Evgeny Prigozhin, a close associate of Vladimir Putin, was behind the operation which hired hundreds of individuals to work in Saint Petersburg to support Russian government views online.
The Internet Research Agency came to be regarded as a "troll farm", a term for propaganda operations that control many online accounts with the aim of artificially creating the semblance of a grassroots organization. Chen reported that the Russian government adopted Internet trolling as a tactic largely after observing the organic social media organization of the 2011 protests against Putin.
Chen interviewed reporters and political activists in Russia, and was told that the end goal of the Russian government's use of fake news was not to persuade readers that any particular item was factual, but simply to sow discord and chaos online. Chen wrote: "The real effect, the Russian activists told me, was not to brainwash readers but to overwhelm social media with a flood of fake content, seeding doubt and paranoia, and destroying the possibility of using the Internet as a democratic space."
EU regulation of Russian fake news
In 2015, the Organization for Security and Co-operation in Europe released an analysis highly critical of disinformation campaigns by Russia designed to appear as legitimate news reporting. These campaigns were intended to interfere with Ukraine's relations with Europe after the removal of former Ukrainian president Viktor Yanukovych from power. According to Deutsche Welle, "The propaganda in question employed similar tactics used by fake news websites during the U.S. elections, including misleading headlines, fabricated quotes and misreporting". This propaganda motivated the European Union to create a special taskforce to deal with disinformation campaigns originating from Russia.
Foreign Policy reported that the taskforce, called the East StratCom Team, "employs 11 mostly Russian speakers who scour the web for fake news and send out biweekly reviews highlighting specific distorted news stories and tactics." The European Union voted in November 2016 to increase funding for the taskforce.
Deutsche Welle noted: "Needless to say, the issue of fake news, which has been used to garner support for various political causes, poses a serious danger to the fabric of democratic societies, whether in Europe, the U.S. or any other nation across the globe."
In November 2016, the European Parliament Committee on Foreign Affairs passed a resolution warning of Russia's use of tools including "pseudo-news agencies ... social media and internet trolls" as forms of propaganda and disinformation in an attempt to weaken democratic values. The resolution urged media analysts within the European Union to investigate, citing the limited awareness among some member states that they are "audiences and arenas of propaganda and disinformation". The resolution condemned Russian sources for publicizing "absolutely fake" news reports, and on 23 November 2016 it passed by a margin of 304 votes to 179.
Gleb Pavlovsky, who helped create a propaganda program for the Russian government prior to 2008, told The New York Times in August 2016: "Moscow views world affairs as a system of special operations, and very sincerely believes that it itself is an object of Western special operations. I am sure that there are a lot of centers, some linked to the state, that are involved in inventing these kinds of fake stories."
Anders Lindberg, a Swedish attorney and reporter, explained a common pattern of fake news distribution: "The dynamic is always the same: It originates somewhere in Russia, on Russia state media sites, or different websites or somewhere in that kind of context. Then the fake document becomes the source of a news story distributed on far-left or far-right-wing websites. Those who rely on those sites for news link to the story, and it spreads. Nobody can say where they come from, but they end up as key issues in a security policy decision."
The International Business Times reported that the United States Department of State had planned to use a unit formed specifically to combat disinformation from the Russian government, but that the unit was disbanded in September 2015 after State Department leadership failed to foresee the peril of the propaganda in the run-up to the 2016 U.S. presidential campaign. The State Department had put eight months of work into developing the counter-disinformation unit before deciding to scrap it.
Titled the Counter-Disinformation Team, the program would have been a reboot of the Active Measures Working Group set up by the Reagan Administration, which had operated under the auspices of the U.S. State Department and the United States Information Agency. The Counter-Disinformation Team was set up under the Bureau of International Information Programs of the U.S. State Department. Work on the team began in 2014, under the Obama Administration, with the intention of combating propaganda from Russian sources such as Russia Today. A beta version of the team's website was ready to go live, and several staff members had been hired, before the program's cancellation. United States Intelligence Community officials explained to former National Security Agency analyst and counterintelligence officer John R. Schindler that the Obama Administration canceled the Counter-Disinformation Team because it feared antagonizing the Russian government.
Under Secretary of State for Public Diplomacy and Public Affairs Richard Stengel was the point person at the U.S. State Department for the Counter-Disinformation Team before it was canceled. Stengel had prior experience with the issue, having written publicly for the State Department about the disinformation campaign by the Russian government and Russia Today. After United States Secretary of State John Kerry called Russia Today a "propaganda bullhorn" for Russian President Vladimir Putin, Russia Today insisted that the State Department give an "official response" to Kerry's statement. In his response, Stengel wrote for the State Department that Russia Today engaged in a "disinformation campaign". Stengel spoke out against the spread of fake news and distinguished reporting from propaganda: "Propaganda is the deliberate dissemination of information that you know to be false or misleading in order to influence an audience."
A representative for the U.S. State Department explained to the International Business Times in a statement after being contacted regarding the closure of the Counter-Disinformation Team: "The United States, like many other countries throughout Europe and the world, has been concerned about Russia's intense propaganda and disinformation campaigns. We believe the free flow of reliable, credible information is the best defense against the Kremlin's attack on the truth."
Peter Kreko of the Hungary-based Political Capital Institute spoke to International Business Times about his work studying the disinformation initiatives by the Russian government, and said: "I do think that the American [Obama] administration was caught not taking the issue seriously enough and there were a lot more words than action." Kreko recounted that employees within the U.S. government told him they were exasperated due to the "lack of strategy, efficiency and lack of taking it seriously" regarding the information warfare by the Russian government against the United States.
Further role in 2016 U.S. presidential election
Adrian Chen observed a strange pattern in December 2015 whereby pro-Russia online accounts he had been monitoring suddenly became highly supportive of 2016 U.S. presidential candidate Donald Trump. Andrew Weisburd and Clint Watts, a Foreign Policy Research Institute fellow and senior fellow at the Center for Cyber and Homeland Security at George Washington University, wrote for The Daily Beast in August 2016: "Fake news stories from Kremlin propagandists regularly become social media trends." Weisburd and Watts documented how a disinformation campaign spread from Russia Today and Sputnik News, "the two biggest Russian state-controlled media organizations publishing in English", to pro-Russian accounts on Twitter. Prior to the election, U.S. national security officials told BuzzFeed News they were more anxious about Russia tampering with U.S. news than hacking the election itself.
Citing Adrian Chen's prior research, Weisburd and Watts compared the tactics used by Russia during the 2016 U.S. election to those previously used by the Soviet Union against the U.S. during the Cold War. They referenced the 1992 United States Information Agency report to the United States Congress, which warned about Russian propaganda campaigns called active measures, and concluded that such active measures became much easier for intelligence agents with the advent of social media. Mark Galeotti, a senior research fellow at the Institute of International Relations Prague and a scholar of Russian intelligence, agreed the Kremlin operations were a form of active measures. The Guardian reported in November 2016 that the most strident Internet promoters of Trump were not U.S. citizens but paid Russian propagandists. The paper estimated that several thousand trolls were engaged in the effort, and that their primary topics included promoting Trump and Putin and criticizing President Obama.
Weisburd and Watts collaborated with colleague J. M. Berger on a follow-up to their Daily Beast article, published in the online magazine War on the Rocks under the title "Trolling for Trump: How Russia Is Trying to Destroy Our Democracy". The three writers researched 7,000 pro-Trump social media accounts over a two-and-a-half-year period. Their research detailed the techniques Internet trolls used to degrade the reputation of critics of Russian activities in Syria and to proliferate falsehoods about Hillary Clinton's health. Watts explained his colleagues' analysis to CNN, saying the Russian propaganda effort targeted the alt-right movement, individuals from right-wing politics, and fascist groups.
BuzzFeed News reported that Internet trolls financed by the Kremlin were open about their authorship and spread of fake news. After each presidential debate, tens of thousands of Twitter bots pushed hashtags including #Trumpwon in an attempt to change online perceptions. The Federal Bureau of Investigation released a statement to BuzzFeed News saying it was investigating the propaganda. United States Intelligence Community officials told BuzzFeed News they believed the Russian government was engaged in spreading fake news.
The United States Intelligence Community devoted resources to determining why Vladimir Putin chose the summer of 2016 to escalate active measures aimed at influencing domestic U.S. politics. Director of National Intelligence James R. Clapper said that after the 2011–13 Russian protests, Putin's confidence in his long-term viability as a politician was damaged, and he decided to respond with the propaganda intelligence operation. Former Central Intelligence Agency case officer Patrick Skinner explained that the true goal of the operation was to spread uncertainty, regardless of whether a particular fake statement had been debunked. Bellingcat investigative analyst Aric Toler noted that fact-checking could simply draw further attention to the fake news.
U.S. Congressman Adam Schiff, Ranking Member of the House Permanent Select Committee on Intelligence, commented on Putin's aims and said U.S. intelligence agencies were significantly concerned about Russian propaganda in the U.S. Speaking about online disinformation that had appeared in Hungary, Slovakia, the Czech Republic, and Poland, Schiff said the same behavior was on the rise in the U.S., and he concluded that Russian propaganda intelligence operations would likely continue against the U.S. after the election.
On 24 November 2016, The Washington Post reported that members of the Foreign Policy Research Institute had stated that Russian propaganda during the election helped foment criticism of Hillary Clinton and support for Donald Trump. The strategy involved social media users, Internet trolls working for hire, botnets, and organized websites, all deployed to cast Clinton in a negative light. Clint Watts, who monitored the Russian propaganda, said its tactics were Cold War era strategies applied to social media, and that Russia's goal was to decrease trust in the U.S. government. Watts's research with colleagues Andrew Weisburd and J. M. Berger was published in November 2016, and its conclusions were confirmed by prior research from the Elliott School of International Affairs at George Washington University and by the RAND Corporation. The Nation editor Katrina vanden Heuvel opined that the "hysteria being drummed up around Putin's alleged intervention in the campaign" was overblown, arguing that it was the broken U.S. electoral system, rather than propaganda from afar, that decided the election.
In the same article, The Washington Post reported that the previously unknown group PropOrNot came to similar conclusions about involvement by Russia in propagating fake news during the 2016 U.S. election. The Washington Post and PropOrNot received criticism from The Intercept, Fortune, Rolling Stone, AlterNet, Adrian Chen at The New Yorker, and in an opinion piece in the paper itself, written by Katrina vanden Heuvel.
Ari Shapiro on the National Public Radio program All Things Considered interviewed Washington Post journalist Craig Timberg, who explained there was a massive amount of botnets and financed Internet trolls to increase the spread of fake news online. Timberg said there were thousands of social media accounts working for Russia that functioned as a "massive online chorus". Timberg stated Russia had a vested interest in the 2016 U.S. election due to a dislike for Hillary Clinton over the 2011–13 Russian protests.
Bloomberg News reported computer security company FireEye concluded the Russian government utilized social media as a weapon to influence perspectives regarding the U.S. election. FireEye Chairman David DeWalt told Bloomberg News the intelligence operation by the Russian government in 2016 was a new development in cyberwarfare by Russia. FireEye CEO Kevin Mandia stated the tactics of Russian propaganda cyberwarfare changed significantly after fall 2014, from covert computer hacking to suddenly more overt tactics with decreased concerns for operational security or being revealed to the public as an intelligence operation.
U.S. News & World Report warned readers to be wary of popular fraudulent news sites composed of either outright hoaxes or propaganda, and recommended the website Fake News Watch for a listing of such problematic sources.
Marco Chacon created the fake news website RealTrueNews to show his alt-right friends "how ridiculous" their gullibility toward such websites was. In one story, Chacon wrote a fake transcript of Hillary Clinton's leaked speeches in which Clinton explains bronies to Goldman Sachs bankers. Chacon was shocked when his fake article was treated as factual by Fox News and he heard his own creation on The Kelly File, hosted by Megyn Kelly. Trace Gallagher repeated Chacon's story word for word, saying Clinton had called Bernie Sanders supporters a "bucket of losers", a phrase made up and written by Chacon himself. Megyn Kelly apologized after emphatic denials from representatives for Hillary Clinton.
After his fabricated stories were believed as factual and shared and viewed tens of thousands of times, Chacon told Brent Bambury of the CBC Radio One program Day 6 that he was so shocked at Internet consumers' credulity that it felt like an episode of The Twilight Zone. In an interview with ABC News, Chacon defended his site as an over-the-top parody of other fake news sites, intended to show how ridiculous they were: "The only way I could think of to have a conversation with these people is to say, 'if you have a piece of crazy fake news, look I got one too, and it's even crazier, it's absurd.'"
Jestin Coler of Los Angeles is the founder and CEO of Disinfomedia, a company that owns many fake news websites. He had previously given interviews to multiple media organizations about fake news under the pseudonym Allen Montgomery in order to evade personal scrutiny. With the help of tech-company engineer John Jansen, journalists from NPR identified Coler. After being identified as Disinfomedia's owner, Coler agreed to an interview. He explained how the original intent of his project backfired: "The whole idea from the start was to build a site that could kind of infiltrate the echo chambers of the alt-right, publish blatantly false or fictional stories and then be able to publicly denounce those stories and point out the fact that they were fiction." He said his company had attempted to write fraudulent reports from a left-wing perspective, but found those articles were not shared nearly as much as fake news from a right-wing point of view. Coler told NPR that consumers of information must become more skeptical in order to combat fake news: "Some of this has to fall on the readers themselves. The consumers of content have to be better at identifying this stuff. We have a whole nation of media-illiterate people. Really, there needs to be something done."
Paul Horner, a creator of fraudulent news stories, stated in an interview with The Washington Post that he was making approximately US$10,000 a month through advertisements linked to the fraudulent news. He claimed to have posted a fraudulent advertisement to Craigslist offering thousands of dollars in payment to protesters, and to have written a story based on this which was later shared online by Trump's campaign manager. Horner believed that when the stories were shown to be false, this would reflect badly on Trump's supporters who had shared them, but concluded "Looking back, instead of hurting the campaign, I think I helped it. And that feels [bad]."
In a follow-up interview with Rolling Stone, Horner revealed that The Washington Post profile piece on him spurred greatly increased interest with over 60 interview requests from media including ABC News, CBS News, and Inside Edition. Horner explained that his writing style was such that articles appeared legitimate at the top and became increasingly couched in absurdity as the reader progressed: "Most of my stuff, starts off, the first paragraph is super legit, the title is super legit, the picture is super legit, but then the story just gets more and more ridiculous and it becomes obvious that none of it is true." Horner told Rolling Stone that he always placed his name as a fictional character in his fake articles. He said he supported efforts to decrease fake news websites.
Impacts by country
Australia was plagued with fake stories shared as truth on Facebook, especially false news about Muslim religious practices in the country. One prominent Facebook group in the country focused on abolishing halal, the Muslim laws regarding religious dietary restrictions. The "Boycott Halal in Australia" group had about 100,000 members on its Facebook page in 2016; in November 2014 it publicized a satirical newspaper report and passed it off as truth. Another page, for proponents of the Q Society, which calls itself "Australia's leading Islam-critical movement", frequently posts baseless fake statements.
Brazil faced increasing influence from fake news after the 2014 re-election of president Dilma Rousseff and her subsequent impeachment in August 2016. BBC Brazil reported in April 2016 that sixty percent of the most-shared articles on Facebook about the impeachment proceedings against Rousseff were fake.
In 2015, reporter Tai Nalon resigned from her position at Brazilian newspaper Folha de S Paulo in order to start the first fact-checking website in Brazil, called Aos Fatos (To The Facts).
Nalon told The Guardian: "There is a lot of false news, but I would be cautious about saying the problem is similar to what happens in the USA."
The government of China used the growing problem of fake news as a rationale for increasing Internet censorship in November 2016. China took the opportunity to publish an editorial in its Communist Party newspaper The Global Times titled "Western Media's Crusade Against Facebook", criticizing the "unpredictable" political problems posed by the freedoms enjoyed by users of Twitter, Google, and Facebook. Chinese government leaders meeting in Wuzhen at the third World Internet Conference in November 2016 said fake news in the U.S. election justified adding more curbs to free and open use of the Internet. Ren Xianliang, deputy minister at the Cyberspace Administration of China, said increasing online participation had led to additional "harmful information" and that "intimidation and fraud are more common than ever". Kam Chow Wong, a former Hong Kong law enforcement official and criminal justice professor at Xavier University, said at the conference: "It's a good move that the U.S. is trying to regulate social media; it's overdue." The Wall Street Journal noted that China's themes of Internet censorship gained fresh relevance at the World Internet Conference because of the outgrowth of fake news: "China's efforts to promote its concept of the internet had fresh resonance as Western minds now debate whether social media sites should screen out fake news". Fake news from the 2016 U.S. election also spread to China: articles that became popular in the United States were translated into Chinese and shared within the country.
France saw an uptick in disinformation and propaganda, primarily in the midst of election cycles. Samuel Laurent, head of Le Monde's fact-checking division "Les décodeurs", told The Guardian in December 2016: "I think the French presidential election campaign [next spring] will be fraught with this type of thing."
The country faced controversy over fake websites providing false information about abortion. The lower house of parliament moved forward with plans to ban such sites. Laurence Rossignol, France's minister for women, told parliament that though the fake sites "appear neutral and objective", they were in actuality "deliberately seeking to trick women".
German Chancellor Angela Merkel lamented the problem of fraudulent news reports in a November 2016 speech, days after announcing her campaign for a fourth term as leader of her country. In a speech to the German parliament, Merkel was critical of such fake sites: "Something has changed -- as globalisation has marched on, (political) debate is taking place in a completely new media environment. Opinions aren't formed the way they were 25 years ago. Today we have fake sites, bots, trolls -- things that regenerate themselves, reinforcing opinions with certain algorithms and we have to learn to deal with them." She warned that such fraudulent news websites were a force increasing the power of populist extremism. Merkel called fraudulent news a growing phenomenon that might need to be regulated in the future.
Bruno Kahl, chief of Germany's foreign intelligence agency, the Federal Intelligence Service, warned of the potential for Russian cyberattacks in the 2017 German election. He said the cyberattacks would take the form of intentionally spread misinformation, with the goal of "elicit[ing] political uncertainty". Hans-Georg Maassen, chief of Germany's domestic intelligence agency, the Federal Office for the Protection of the Constitution, said: "The information security of German government, administrative, business, science and research institutions is under permanent threat. ... Russian intelligence agencies are also showing a readiness to [carry out] sabotage."
India had over 50 million accounts on the messaging application WhatsApp in 2016. In November 2016 the country's prime minister announced a new 2,000-rupee banknote, and fake news went viral over WhatsApp claiming the note came equipped with spying technology that could track bills up to 120 meters below the earth. The Reserve Bank of India refuted the falsities, but not before they had spread to the country's mainstream news outlets. Prabhakar Kumar of the Indian media research agency CMS told The Guardian: "Mainstream media in India is more impacted by the phenomena [of fake news] because they broadcast these kinds of stories without verifying. There is no standard policy for TV news and newspapers about the process of researching and publishing stories."
Law enforcement officers in India have arrested individuals on charges of creating fictitious articles, particularly when the articles were likely to inflame societal conflict. Authorities also warned administrators of WhatsApp groups that they could be held liable for the spread of fake news.
Indonesia and Philippines
Fraudulent news has been particularly problematic in Indonesia and the Philippines, where social media has an outsized political influence. According to media analysts, "many developing countries with populations new to both democracy and social media" are particularly vulnerable to the influence of fraudulent news. In some developing countries, "Facebook even offers free smartphone data connections to basic public online services, some news sites and Facebook itself — but limits access to broader sources that could help debunk fake news."
Italy
Between 1 October and 30 November 2016, ahead of the Italian constitutional referendum, five of the ten referendum-related stories with the most engagement on social media (shares, likes, and comments on Facebook, plus shares on Twitter, LinkedIn and Google+) were hoaxes or carried misleading titles. Of the three stories with the most social media engagement, two were fake.
Prime Minister of Italy Matteo Renzi met with U.S. President Barack Obama and with leaders of European nations at a meeting in Berlin, Germany in November 2016, and privately spoke with them about the pervasive problem of fake news.
Propaganda grew more pervasive in advance of the constitutional referendum scheduled for 4 December 2016. Its influence became so problematic that a senior adviser to Matteo Renzi filed a defamation complaint against an anonymous Twitter user with the screen name "Beatrice di Maio". Cyberwarfare propaganda against Renzi increased before the referendum date, and the Italian newspaper La Stampa called attention to false reporting by Russia Today, which wrongly claimed that a rally in Rome held in support of Renzi was actually a rally against him.
The Hollywood Reporter and The New York Times reported that the Five Star Movement (M5S), an Italian political party founded by Beppe Grillo, was said to manage a network of fake news websites that amplified Russian news sources, propaganda, and inflammatory conspiracy theories. The Hollywood Reporter noted that the Five Star Movement's site TzeTze had 1.2 million fans on Facebook and regularly shared fake news articles and pieces supportive of Vladimir Putin, primarily sourced to Russian government-owned outlets including Sputnik News. TzeTze often plagiarized the Russian source, copying article titles and content directly from Sputnik News and re-posting them on its site.
BuzzFeed News investigative journalists traced TzeTze, another anti-Renzi site called La Cosa, and a blog by Beppe Grillo to the same technology company, Casaleggio Associati, which was started by Five Star Movement co-founder Gianroberto Casaleggio. These Five Star Movement-controlled sources cross-post each other's articles, amplifying their reach to a wider audience. Casaleggio's son Davide Casaleggio owns and manages TzeTze and La Cosa, in addition to the medical website La Fucina, which markets anti-vaccine conspiracy theories and cure-all remedies. BuzzFeed News found that Grillo's blog, the Five Star Movement's websites, and all of the fake news sites operated by the party use the same IP addresses, Google Analytics account, and Google AdSense account.
A former Google AdSense staff member analyzed the BuzzFeed News investigation and compared the Five Star Movement's network of fake news sites to the pro-Trump fake news sites BuzzFeed News had previously traced to a single town in Macedonia. The former staffer stated: "M5S talks a lot about transparency, but then as part of my job I realised that they are making so much money off this thing. When you look online there is no transparency about the amount of money they make with the blog and the sites. It’s all so mixed up. The leaders of the party are making money with a fake news aggregator. It’s like if Trump owned the Macedonian sites."
In October 2016, the Five Star Movement disseminated a video from Kremlin-aligned Russia Today which falsely claimed to show thousands of people protesting the referendum scheduled for 4 December 2016 in Italy; in fact the video, which drew 1.5 million views, showed people demonstrating in support of the referendum. According to BuzzFeed News, the fake news sites run by the Five Star Movement profit financially from the spread of such disinformation.
President of the Italian Chamber of Deputies, Laura Boldrini, stated: "Fake news is a critical issue and we can’t ignore it. We have to act now." Boldrini met on 30 November 2016 with vice president of public policy in Europe for Facebook Richard Allan to voice her concerns about the spread of fake news. She said Facebook needed to admit they functioned for all intents and purposes as a media organization: "They can’t pretend that they are just a platform. They are giant media companies."
Myanmar
Fake news has had harmful effects in Myanmar, contributing to a rise in violence against Muslims in the country. Online participation surged from one percent to 20 percent of Myanmar's total populace between 2014 and 2016. Fake stories from Facebook grew so influential that, for those without Internet access, they were reprinted in paper periodicals called Facebook and The Internet, which simply regurgitated the website's newsfeed text, often without factual oversight. False reporting about practitioners of Islam in the country was directly correlated with increased attacks on, and protests against, Muslims in Myanmar.
BuzzFeed News journalist Sheera Frenkel reported: "there has also been an increase in articles that demonize the country’s minority Muslim community, with fake news claiming that vast hordes of Muslim worshippers are attacking Buddhist sites. These articles, quickly shared and amplified on social media, have correlated with a surge in anti-Muslim protests and attacks on local Muslim groups." Frenkel noted countries that were relatively newer to Internet exposure were more susceptible to the problem, writing: "Countries like Myanmar, which come online quickly and without many government-backed programs to teach safe internet habits — like secure passwords and not revealing personal details online — rank among the lowest in digital literacy. They are the most likely to fall for scams, hacks, and fake news."
Sweden
The Swedish Security Service issued a report in 2015 identifying propaganda from Russia infiltrating Sweden with the objective to "spread pro-Russian messages and to exacerbate worries and create splits in society."
The Swedish Civil Contingencies Agency (MSB), part of the Ministry of Defence of Sweden, identified fake news reports targeting Sweden in 2016 which originated from Russia. Swedish Civil Contingencies Agency official Mikael Tofvesson stated: "This is going on all the time. The pattern now is that they pump out a constant narrative that in some respects is negative for Sweden."
The Local identified these tactics as a form of psychological warfare. The newspaper reported the MSB identified Russia Today and Sputnik News as "important channels for fake news". As a result of growth in this propaganda in Sweden, the MSB planned to hire six additional security officials to fight back against the campaign of fraudulent information.
United States
Fraudulent stories popularized on Facebook during the 2016 U.S. presidential election included a viral post claiming that Pope Francis had endorsed Donald Trump, and another claiming that actor Denzel Washington "backs Trump in the most epic way possible".
Donald Trump's son and campaign surrogate Eric Trump, top national security adviser Michael T. Flynn, and then-campaign managers Kellyanne Conway and Corey Lewandowski shared fake news stories during the campaign.
In a speech the day before Election Day in 2016, U.S. President Barack Obama commented on the significant problem of fraudulent information on social networks affecting elections: "The way campaigns have unfolded, we just start accepting crazy stuff as normal. And people, if they just repeat attacks enough and outright lies over and over again, as long as it’s on Facebook, and people can see it, as long as it’s on social media, people start believing it. And it creates this dust cloud of nonsense."
Shortly after the election, Obama again commented on the problem, saying in an appearance with German Chancellor Angela Merkel: "If we are not serious about facts and what’s true and what's not, and particularly in an age of social media when so many people are getting their information in sound bites and off their phones, if we can't discriminate between serious arguments and propaganda, then we have problems."
One prominent fraudulent news story released after the election—that protesters at anti-Trump rallies in Austin, Texas, were "bused in"—started as a tweet by one individual with 40 Twitter followers. Over the next three days, the tweet was shared at least 16,000 times on Twitter and 350,000 times on Facebook, and promoted in the conservative blogosphere, before the individual stated that he had fabricated his assertions.
BuzzFeed called the problem an "epidemic of misinformation". According to BuzzFeed's analysis, the 20 top-performing election news stories from fraudulent sites generated more shares, reactions, and comments on Facebook than the 20 top-performing stories from 19 major news outlets.
Howard Kurtz, host of the Fox News media-analysis program Media Buzz, acknowledged that fraudulent news was a serious problem, and relied heavily on the BuzzFeed analysis for his reporting on the controversy. Kurtz wrote that "Facebook is polluting the media environment with garbage". Citing the BuzzFeed investigation, he pointed out: "The legit stuff drew 7,367,000 shares, reactions and comments, while the fictional material drew 8,711,000 shares, reactions and comments." Kurtz concluded that Facebook founder Mark Zuckerberg must admit the website is a media company: "But once Zuckerberg admits he’s actually running one of the most powerful media brands on the planet, he has to get more aggressive about promoting real news and weeding out hoaxers and charlatans. The alternative is to watch Facebook’s own credibility decline."
As worries grew that fake news spread by the Russian government had swayed the outcome of the election, representatives in the U.S. Congress took action to safeguard the national security of the United States by advancing legislation to monitor incoming propaganda from external threats. On 30 November 2016, legislators approved a measure within the National Defense Authorization Act asking the U.S. State Department to counter foreign propaganda through an interagency panel. The legislation authorized funding of $160 million over a two-year period.
The initiative was developed through a bipartisan bill written in March 2016 by U.S. Senators Chris Murphy and Rob Portman titled: Countering Foreign Propaganda and Disinformation Act. U.S. Senator Rob Portman stated: "This propaganda and disinformation threat is real, it’s growing, and right now the U.S. government is asleep at the wheel. The U.S. and our allies face many challenges, but we must better counter and combat the extensive propaganda and disinformation operations directed against us." U.S. Senator Chris Murphy was interviewed by The Washington Post about the legislation and said: "In the wake of this election, it’s pretty clear that the U.S. does not have the tools to combat this massive disinformation machinery that the Russians are running." United States Senate Select Committee on Intelligence member Senator Ron Wyden told The Washington Post: "There is definitely bipartisan concern about the Russian government engaging in covert influence activities of this nature."
Members of the United States Senate Select Committee on Intelligence traveled to Ukraine and Poland in March 2016 and heard from officials in both countries on Russian operations to influence their affairs. U.S. Senator Angus King told the Portland Press Herald that tactics used by Russia during the 2016 U.S. election cycle were analogous to those used against other countries as well. King recalled: "We were told by various officials in both countries about the Russian standard practice of interfering with elections: planting fake news stories". On 30 November 2016, King joined a letter in which seven members of the U.S. Senate Select Committee on Intelligence asked President Obama to publicize more information from the intelligence community on Russia's role in the U.S. election. In an interview with CNN, Senator King warned against ignoring the problem: "I don't consider this a partisan issue. We can't just let it go and say that's history because they will keep doing it."
Google CEO comment and actions
In the aftermath of the 2016 U.S. presidential election, Google, along with Facebook, faced increased scrutiny over the role of fake-news websites in the election. The top Google result for searches on the outcome of the race pointed to a fraudulent news site, "70 News", which had falsely reported that Donald Trump won the popular vote against Hillary Clinton. Google later stated that the prominence of the "70 News" article in its search results was a mistake: "In this case we clearly didn't get it right, but we are continually working to improve our algorithms." By Monday, November 14, the "70 News" result was the second link shown to people searching for results of the race.
When asked shortly after the election whether fraudulent news sites could have changed the election's results, Google CEO Sundar Pichai responded: "Sure" and went on to emphasize the importance of stopping the spread of fraudulent news sites: "Look, it is important to remember this was a very close election and so, just for me, so looking at it scientifically, one in a hundred voters voting one way or the other swings the election either way. ... From our perspective, there should just be no situation where fake news gets distributed, so we are all for doing better here."
On 14 November 2016, Google responded to the growing problem of fraudulent news sites by banning such companies from profiting on advertising from traffic to false articles through its marketing program AdSense. The company already had a policy denying ads for dieting scams and counterfeit merchandise. Google stated upon the announcement: "We’ve been working on an update to our publisher policies and will start prohibiting Google ads from being placed on misrepresentative content. Moving forward, we will restrict ad serving on pages that misrepresent, misstate, or conceal information about the publisher, the publisher’s content, or the primary purpose of the web property." This builds upon one of Google's existing advertising policies, under which misleading advertising is already banned from Google AdSense. The ban is not expected to apply to news satire sites like The Onion, though some satirical sites may be inadvertently blocked under the new system.
Blocking fraudulent advertisers
The day after Google acted, Facebook made a similar move and blocked fake news sites from advertising on its platform. Facebook explained its new policy: "We do not integrate or display ads in apps or sites containing content that is illegal, misleading or deceptive, which includes fake news. ... We have updated the policy to explicitly clarify that this applies to fake news. Our team will continue to closely vet all prospective publishers and monitor existing ones to ensure compliance." The steps by both Google and Facebook were intended to deny ad revenue to fraudulent news sites; neither company took action to prevent the dissemination of false stories in search engine results pages or web feeds.
Facebook CEO Mark Zuckerberg, in a post to his website on the issue, rejected the notion that fraudulent news sites had affected the 2016 election, describing the idea that his website might have played a role in the outcome as "pretty crazy". In a blog post, he stated that more than 99% of content on Facebook was authentic (i.e. neither fake news nor a hoax), and that "News and media are not the primary things people do on Facebook, so I find it odd when people insist we call ourselves a news or media company in order to acknowledge its importance." Separately, Zuckerberg advised Facebook users to check the fact-checking website Snopes.com whenever they encounter fake news on Facebook.
Top staff members at Facebook did not feel that simply blocking ad revenue from fraudulent sites was a strong enough response to the problem, and on their own initiative created a secret group to deal with the issue themselves. In response to Zuckerberg's initial statement that fraudulent news had not impacted the 2016 election, the secret group disputed the idea: "It’s not a crazy idea. What’s crazy is for him to come out and dismiss it like that, when he knows, and those of us at the company know, that fake news ran wild on our platform during the entire campaign season." BuzzFeed reported that the secret task force included "dozens" of Facebook employees.
Facebook faced mounting criticism in the days after its decision to revoke only advertising revenue from fraudulent news providers without taking further action. After a week of negative media coverage, including assertions that the proliferation of fraudulent news on Facebook had handed the 2016 U.S. presidential election to Donald Trump, Mark Zuckerberg published a second post on the issue on 18 November 2016, reversing his earlier comments in which he had discounted the impact of fraudulent news.
Zuckerberg said that filtering out fraudulent news was inherently difficult: "The problems here are complex, both technically and philosophically. We believe in giving people a voice, which means erring on the side of letting people share what they want whenever possible." The New York Times reported that measures being considered but not yet implemented by Facebook included "third-party verification services, better automated detection tools and simpler ways for users to flag suspicious content." The 18 November post did not announce any concrete actions the company would definitively take, or when such measures would formally be put into use on the website.
Many people commented positively on Zuckerberg's second post on fraudulent news. National Public Radio observed that the changes Facebook was considering to identify fraud marked the company's progression into a new medium: "Together, the projects signal another step in Facebook's evolution from its start as a tech-oriented company to its current status as a complex media platform." On 19 November 2016, BuzzFeed advised Facebook users that they could report posts from fraudulent news websites by choosing the report option "I think it shouldn't be on Facebook", followed by "It’s a false news story."
In November 2016, Facebook began assessing use of warning labels on fake news. The rollout was at first only available to a few users in a testing phase. A sample warning read: "This website is not a reliable news source. Reason: Classification Pending". TechCrunch analyzed the new feature during the testing phase and surmised it may have a tendency towards false positives.
Fake news proliferation on Facebook had a negative financial impact for the company. The Economist reported: "Brian Wieser of Pivotal Research recently wrote that the focus on fake news and the concerns over the measurement of advertising could well cut revenue growth by a couple of percentage points."
Shortly after Mark Zuckerberg's second statement on fake news proliferation on his website, The New York Times reported that Facebook would assist the government of China with a version of its software that would allow increased government censorship. Barron's contributor William Pesek was highly critical of the move, writing: "By effectively sharing its fake news problem with the most populous nation, Facebook would be a pawn of [China’s President Xi] Jinping's intensifying censorship push."
Fact-checking websites and journalists
Fact-checking websites play a role as debunkers of fraudulent news reports. Such sites saw large increases in readership and web traffic during the 2016 U.S. election cycle. FactCheck.org, PolitiFact.com, Snopes.com, and "The Fact Checker" section of The Washington Post are prominent fact-checking websites that played an important role in debunking fraud. The New Yorker writer Nicholas Lemann wrote on how to address fake news and called for increasing the roles of FactCheck.org, PolitiFact.com, and Snopes.com in the age of post-truth politics. CNN media analyst Brian Stelter wrote: "In journalism circles, 2016 is the year of the fact-checker."
By the close of the 2016 U.S. election season, fact-checking websites FactCheck.org, PolitiFact.com, and Snopes.com, had each authored guides on how to respond to fraudulent news. FactCheck.org advised readers to check the source, author, date, and headline of publications. They recommended their colleagues Snopes.com, The Washington Post Fact Checker, and PolitiFact.com as important resources to rely upon before re-sharing a fraudulent story. FactCheck.org admonished consumers to be wary of their own biases when viewing media they agree with. PolitiFact.com announced they would tag stories as "Fake news" so that readers could view all fraudulent stories they had debunked. Snopes.com warned readers: "So long as social media allows for the rapid spread of information, manipulative entities will seek to cash in on the rapid spread of misinformation."
The Washington Post's "The Fact Checker" section, which is dedicated to evaluating the truth of political claims, greatly increased in popularity during the 2016 election cycle. Glenn Kessler, who runs the Post's "Fact Checker", wrote that "fact-checking websites all experienced huge surges in readership during the election campaign"; The Fact Checker had five times more unique visitors than during the 2012 cycle. Kessler cited research showing that fact-checks are effective at reducing "the prevalence of a false belief." Will Moy, director of the London-based Full Fact, a UK fact-checking website, said that debunking must take place over a sustained period of time to truly be effective. Full Fact began work to develop multiple products in a partnership with Google to help automate fact-checking.
FactCheck.org former director Brooks Jackson remarked that larger media companies had devoted increased focus to the importance of debunking fraud during the 2016 election: "It's really remarkable to see how big news operations have come around to challenging false and deceitful claims directly. It's about time." FactCheck.org began a new partnership with CNN journalist Jake Tapper in 2016 to examine the veracity of reported claims by candidates.
Angie Drobnic Holan, editor of PolitiFact.com, noted the circumstances warranted support for the practice: "All of the media has embraced fact-checking because there was a story that really needed it." Holan was heartened that fact-checking garnered increased viewership for those engaged in the practice: "Fact-checking is now a proven ratings getter. I think editors and news directors see that now. So that's a plus." Holan cautioned that heads of media companies must strongly support the practice of debunking, as it often provokes hate mail and extreme responses from zealots.
On 17 November 2016, the International Fact-Checking Network (IFCN) published an open letter on the website of the Poynter Institute to Facebook founder and CEO Mark Zuckerberg, imploring him to utilize fact-checkers in order to help identify fraud on Facebook. Created in September 2015, the IFCN is housed within the St. Petersburg, Florida-based Poynter Institute for Media Studies and aims to support the work of 64 member fact-checking organizations around the world. Alexios Mantzarlis, co-founder of FactCheckEU.org and former managing editor of Italian fact-checking site Pagella Politica, was named director and editor of IFCN in September 2015. Signatories to the 2016 letter to Zuckerberg featured a global representation of fact-checking groups, including: Africa Check, FactCheck.org, PolitiFact.com, and The Washington Post Fact Checker. The groups wrote they were eager to assist Facebook root out fraudulent news sources on the website.
In his second post on the matter on 18 November 2016, Zuckerberg responded to the fraudulent news problem by suggesting usage of fact-checking websites. He specifically identified fact-checking website Snopes.com, and pointed out that Facebook monitors links to such debunking websites in reply comments as a method to determine which original posts were fraudulent. Zuckerberg explained: "Anyone on Facebook can report any link as false, and we use signals from those reports along with a number of others — like people sharing links to myth-busting sites such as Snopes — to understand which stories we can confidently classify as misinformation. Similar to clickbait, spam and scams, we penalize this content in News Feed so it's much less likely to spread."
Society of Professional Journalists president Lynn Walsh said in November 2016 that the society would reach out to Facebook to provide assistance with weeding out fake news. Walsh said Facebook should evolve and admit that it functioned as a large media company: "The media landscape has evolved. Journalism has evolved, and continues to evolve. So I do hope that, while it may not be the original thought that Facebook had, I think they should be now."
Proposed technology tools
New York magazine contributor Brian Feldman responded to an article by media communications professor Melissa Zimdars, and used her list to create a Google Chrome extension that would warn users about fraudulent news sites. He invited others to use his code and improve upon it.
Slate magazine senior technology editor Will Oremus wrote that fraudulent news sites were controversial, and that their prevalence was obscuring a wider discussion about the negative societal impact of consuming media only from one tailored viewpoint, a habit that perpetuates filter bubbles.
Upworthy co-founder and The Filter Bubble author Eli Pariser launched an open-source model initiative on 17 November 2016 to address false news. Pariser began a Google Document to collaborate with others online on how to lessen the phenomenon of fraudulent news. Pariser called his initiative: "Design Solutions for Fake News". Pariser's document included recommendations for a ratings organization analogous to the Better Business Bureau, and a database on media producers in a format like Wikipedia.
Writing for Fortune, Matthew Ingram agreed with the idea that Wikipedia could serve as a helpful model to improve Facebook's analysis of potentially fake news. Ingram concluded: "If Facebook could somehow either tap into or recreate the kind of networked fact checking that Wikipedia does on a daily basis, using existing elements like the websites of Politifact and others, it might actually go some distance towards being a possible solution."
Writing for MIT Technology Review, Jamie Condliffe said that merely banning ad revenue from the fraudulent news sites was not enough action by Facebook to effectively deal with the problem. He wrote: "The post-election furor surrounding Facebook’s fake-news problem has sparked new initiatives to halt the provision of ads to sites that peddle false information. But it’s only a partial solution to the problem: for now, hoaxes and fabricated stories will continue to appear in feeds." Condliffe concluded: "Clearly Facebook needs to do something to address the issue of misinformation, and it’s making a start. But the ultimate solution is probably more significant, and rather more complex, than a simple ad ban."
Indiana University informatics and computer science professor Filippo Menczer commented on the steps by Google and Facebook to deny fraudulent news sites advertising revenue: "One of the incentives for a good portion of fake news is money. This could cut the income that creates the incentive to create the fake news sites." Menczer's research team developed an online tool called Hoaxy to track the spread of unconfirmed claims, as well as related debunking, on the Internet.
Zeynep Tufekci wrote critically about Facebook's stance on fraudulent news sites in a piece for The New York Times, pointing out fraudulent websites in Macedonia profited handsomely off false stories about the 2016 U.S. election: "The company's business model, algorithms and policies entrench echo chambers and fuel the spread of misinformation."
Merrimack College assistant professor of media studies Melissa Zimdars wrote an article "False, Misleading, Clickbait-y and Satirical 'News' Sources" in which she advised how to determine if a fraudulent source was a fake news site. Zimdars identified strange domain names, lack of author attribution, poor website layout, the use of all caps, and URLs ending in "lo" or "com.co" as red flags of a fake news site. In evaluating whether a website contains fake news, Zimdars recommends that readers check the "About Us" page of the website, and consider whether reputable news outlets are reporting on the story.
Sam Wineburg, professor of education and history at the Stanford Graduate School of Education at Stanford University, and colleague Sarah McGrew authored a 2016 study analyzing students' ability to discern fraudulent news from factual reporting. The year-long study collected over 7,800 responses from university, secondary, and middle school students in 12 U.S. states. The researchers were "shocked" at the "stunning and dismaying consistency" with which students judged fraudulent news reports to be factual. The study found that 82 percent of middle school students were unable to differentiate an advertisement labeled as sponsored content from an actual online news article. The authors concluded that the solution was to educate online media consumers to behave like fact-checkers themselves, actively questioning the veracity of all sources they encounter.
Scientist Emily Willingham proposed applying the scientific method to fake news analysis. Having previously written about differentiating science from pseudoscience, she applied the same logic to fake news. Her recommended steps were: Observe, Question, Hypothesize, Analyze data, Draw conclusion, and Act on results. Willingham suggested starting from the hypothesis "This is real news" and then forming a strong set of questions to attempt to disprove it: checking the URL and the date of the article, evaluating reader and writer bias, double-checking the evidence, and verifying the sources cited.
Full Frontal
For her television show Full Frontal, Samantha Bee went to Russia and met with individuals financed by the Russian government to act as Internet trolls and attempt to subvert the 2016 U.S. election. The man and woman interviewed by Bee said they influenced the election by commenting on the websites of the New York Post, The Wall Street Journal, and The Washington Post, as well as on Twitter and Facebook. They kept their identities covert, maintaining cover identities separate from their real Russian names; the woman claimed in posts to be a housewife residing in Nebraska. They blamed consumers for believing everything they read online.
Executive producers for Full Frontal told The Daily Beast that they relied upon writer Adrian Chen, who had previously reported on Russian trolls for The New York Times Magazine in 2015, as a resource to contact those in Russia willing to be interviewed by Bee. The Russian trolls wore masks on camera and asked Full Frontal producers to keep all of their fake accounts confidential so they would not be publicly identified. Full Frontal producers paid the Russian trolls to use the Twitter hashtag #SleazySam to troll the show itself, so the production staff could verify the trolls were indeed able to manipulate content online as they claimed.
After further research in Russia for a second segment on Full Frontal, the production staff concluded that Russian leader Vladimir Putin supported Donald Trump for U.S. president in order to subvert the system of democracy within the U.S. Television producer Razan Ghalayini explained to The Daily Beast: "Russia is an authoritarian regime and authoritarian regimes don't benefit from the vision of democracy being the best version of governance." Television producer Miles Kahn concurred with this analysis, adding: "It's not so much that Putin wants Trump. He probably prefers him in the long run, but he would almost rather the election be contested. They want chaos."
Last Week Tonight
John Oliver said the problem of fraudulent news sites fed into a wider issue of echo chambers in the media, saying there was "a whole cottage industry specializing in hyper-partisan, sometimes wildly distorted clickbait."
Critics contended that fraudulent news on Facebook may have been responsible for Donald Trump winning the 2016 U.S. presidential election, because most of the fake news stories Facebook allowed to spread portrayed him in a positive light. Facebook is not liable for posting or publicizing fake content because, under the Communications Decency Act, interactive computer services cannot be held responsible for information provided by another Internet entity. Some legal scholars, such as Keith Altman, argue that Facebook's huge scale creates so large a potential for fake news to spread that this law may need to be changed. Writing for The Washington Post, Institute for Democracy in Eastern Europe co-director Eric Chenoweth pointed to "many 'fake news' stories that evidence suggests were generated by Russian intelligence operations".
British BBC News interviewed a fraudulent news site writer who went by the pseudonym "Chief Reporter (CR)", who defended his actions and possible influence on elections: "If enough of an electorate are in a frame of mind where they will believe absolutely everything they read on the internet, to a certain extent they have to be prepared to deal with the consequences."
- 2016 Democratic National Committee email leak
- Confirmation bias
- Cyberwarfare by Russia
- Democratic National Committee cyber attacks
- Echo chamber (media)
- Fancy Bear
- Filter bubble
- Guccifer 2.0
- Hybrid warfare
- List of satirical news websites
- Post-truth politics
- Russian espionage in the United States
- Russian propaganda
- Selective exposure theory
- Spiral of silence
- State-sponsored Internet propaganda
- Tribe (Internet)
- Trolls from Olgino
- Web brigades
- Fortune magazine described the Foreign Policy Research Institute as "a conservative think tank known for its generally hawkish stance on relations between the U.S. and Russia"
- The Washington Post and the Associated Press described PropOrNot as a nonpartisan foreign policy analysis group composed of persons with prior experience in the international relations, warfare, and information technology sectors. Their spokesman, interviewed by Adrian Chen of The New Yorker, said they were composed of government officials and tech company employees who agreed "that Russia should not be able to fuck with the American people".
- FactCheck.org, a nonprofit organization and a project of the Annenberg Public Policy Center of the Annenberg School for Communication at the University of Pennsylvania, won a 2010 Sigma Delta Chi Award from the Society of Professional Journalists.
- PolitiFact.com, run by the Tampa Bay Times, received a 2009 Pulitzer Prize for National Reporting for its fact-checking efforts the previous year.
- Snopes.com, privately run by Barbara and David Mikkelson, was given "high praise" by FactCheck.org, another fact-checking website; in addition, Network World gave Snopes.com a grade of "A" in a meta-analysis of fact-checking websites.
- "The Fact Checker" is a project by The Washington Post to analyze political claims. Their colleagues and competitors at FactCheck.org recommended The Fact Checker as a resource to use before assuming a story is factual.
- "Concern over barrage of fake Russian news in Sweden", The Local, 27 July 2016, retrieved 25 November 2016
- Lewis Sanders IV (11 October 2016), "'Divide Europe': European lawmakers warn of Russian propaganda", Deutsche Welle, retrieved 24 November 2016
- Paul Mozur and Mark Scott (17 November 2016), "Fake News on Facebook? In Foreign Elections, That's Not New", The New York Times, retrieved 18 November 2016
- "Merkel warns against fake news driving populist gains", Yahoo! News, Agence France-Presse, 23 November 2016, retrieved 23 November 2016
- Timberg, Craig (24 November 2016), "Russian propaganda effort helped spread 'fake news' during election, experts say", The Washington Post, retrieved 25 November 2016,
Two teams of independent researchers found that the Russians exploited American-made technology platforms to attack U.S. democracy at a particularly vulnerable moment
- "Russian propaganda effort likely behind flood of fake news that preceded election", PBS NewsHour, Associated Press, 25 November 2016, retrieved 26 November 2016
- "Russian propaganda campaign reportedly spread 'fake news' during US election", Nine News, Agence France-Presse, 26 November 2016, retrieved 26 November 2016
- Ali Watkins and Sheera Frenkel (30 November 2016), "Intel Officials Believe Russia Spreads Fake News", BuzzFeed News, retrieved 1 December 2016
- Strohm, Chris (1 December 2016), "Russia Weaponized Social Media in U.S. Election, FireEye Says", Bloomberg News, retrieved 1 December 2016
- "Google and Facebook target fake news sites with advertising clampdown", Belfast Telegraph, 15 November 2016, retrieved 16 November 2016
- Shanika Gunaratna (15 November 2016), "Facebook, Google announce new policies to fight fake news", CBS News, retrieved 16 November 2016
- John Ribeiro (14 November 2016), "Zuckerberg says fake news on Facebook didn't tilt the elections", Computerworld, retrieved 16 November 2016
- Timberg, Craig (30 November 2016), "Effort to combat foreign propaganda advances in Congress", The Washington Post, retrieved 1 December 2016
- Weisburd, Andrew; Watts, Clint (6 August 2016), "Trolls for Trump - How Russia Dominates Your Twitter Feed to Promote Lies (And, Trump, Too)", The Daily Beast, retrieved 24 November 2016
- Dan Tynan (24 August 2016), "How Facebook powers money machines for obscure political 'news' sites - From Macedonia to the San Francisco Bay, clickbait political sites are cashing in on Trumpmania – and they're getting a big boost from Facebook", The Guardian, retrieved 18 November 2016
- Ben Gilbert (15 November 2016), "Fed up with fake news, Facebook users are solving the problem with a simple list", Business Insider, retrieved 16 November 2016,
Some of these sites are intended to look like real publications (there are false versions of major outlets like ABC and MSNBC) but share only fake news; others are straight-up propaganda created by foreign nations (Russia and Macedonia, among others).
- Townsend, Tess (21 November 2016), "Meet the Romanian Trump Fan Behind a Major Fake News Site", Inc. magazine, ISSN 0162-8968, retrieved 23 November 2016
- Sydell, Laura (23 November 2016), "We Tracked Down A Fake-News Creator In The Suburbs. Here's What We Learned", All Things Considered, National Public Radio, retrieved 26 November 2016
- THR staff (17 November 2016), "Facebook Fake News Writer Reveals How He Tricked Trump Supporters and Possibly Influenced Election", The Hollywood Reporter, retrieved 18 November 2016
- Jamie Condliffe (15 November 2016), "Facebook's Fake-News Ad Ban Is Not Enough", MIT Technology Review, retrieved 16 November 2016
- Craig Silverman and Lawrence Alexander (3 November 2016), "How Teens In The Balkans Are Duping Trump Supporters With Fake News", BuzzFeed News, retrieved 16 November 2016,
As a result, this strange hub of pro-Trump sites in the former Yugoslav Republic of Macedonia is now playing a significant role in propagating the kind of false and misleading content that was identified in a recent BuzzFeed News analysis of hyperpartisan Facebook pages.
- Ishmael N. Daro and Craig Silverman (15 November 2016), "Fake News Sites Are Not Terribly Worried About Google Kicking Them Off AdSense", BuzzFeed, retrieved 16 November 2016
- Christopher Woolf (16 November 2016), "Kids in Macedonia made up and circulated many false news stories in the US election", Public Radio International, retrieved 18 November 2016
- "In Macedonia's fake news hub, this teen shows how it's done", CBS News, Associated Press, 2 December 2016, retrieved 3 December 2016
- Chen, Adrian (27 July 2016), "The Real Paranoia-Inducing Purpose of Russian Hacks", The New Yorker, retrieved 26 November 2016
- Lewis Sanders IV (17 November 2016), "Fake news: Media's post-truth problem", Deutsche Welle, retrieved 24 November 2016
- European Parliament Committee on Foreign Affairs (23 November 2016), "MEPs sound alarm on anti-EU propaganda from Russia and Islamist terrorist groups" (PDF), European Parliament, retrieved 26 November 2016
- Surana, Kavitha (23 November 2016), "The EU Moves to Counter Russian Disinformation Campaign", Foreign Policy, ISSN 0015-7228, retrieved 24 November 2016
- "EU Parliament Urges Fight Against Russia's 'Fake News'", Radio Free Europe/Radio Liberty, Agence France-Presse and Reuters, 23 November 2016, retrieved 24 November 2016
- MacFarquhar, Neil (29 August 2016), "A Powerful Russian Weapon: The Spread of False Stories", The New York Times, p. A1, retrieved 24 November 2016
- Porter, Tom (28 November 2016), "How Obama and EU failings allowed Kremlin propaganda and fake news to spread through the West", International Business Times, retrieved 29 November 2016
- Schindler, John R. (5 November 2015), "Obama Fails to Fight Putin's Propaganda Machine", New York Observer, retrieved 28 November 2016
- Schindler, John R. (26 November 2016), "The Kremlin Didn't Sink Hillary—Obama Did", New York Observer, retrieved 28 November 2016
- LoGiurato, Brett (29 April 2014), "Russia's Propaganda Channel Just Got A Journalism Lesson From The US State Department", Business Insider, retrieved 29 November 2016
- LoGiurato, Brett (25 April 2014), "RT Is Very Upset With John Kerry For Blasting Them As Putin's 'Propaganda Bullhorn'", Business Insider, retrieved 29 November 2016
- Stengel, Richard (29 April 2014), "Russia Today's Disinformation Campaign", Dipnote, United States Department of State, retrieved 28 November 2016
- Dougherty, Jill (2 December 2016), "The reality behind Russia's fake news", CNN, retrieved 2 December 2016
- Frenkel, Sheera (4 November 2016), "US Officials Are More Worried About The Media Being Hacked Than The Ballot Box", BuzzFeed News, retrieved 2 December 2016
- Benedictus, Leo (6 November 2016), "Invasion of the troll armies: from Russian Trump supporters to Turkish state stooges", The Guardian, retrieved 2 December 2016
- "U.S. officials defend integrity of vote, despite hacking fears", WITN-TV, 26 November 2016, retrieved 2 December 2016
- "Vladimir Putin Wins the Election No Matter Who The Next President Is", The Daily Beast, 4 November 2016, retrieved 2 December 2016
- Schatz, Bryan, "The Kremlin Would Be Proud of Trump's Propaganda Playbook", Mother Jones, retrieved 2 December 2016
- Ingram, Matthew (25 November 2016), "No, Russian Agents Are Not Behind Every Piece of Fake News You See", Fortune magazine, retrieved 27 November 2016
- vanden Heuvel, Katrina (29 November 2016), "Putin didn't undermine the election. We did.", The Washington Post, retrieved 1 December 2016
- "The Propaganda About Russian Propaganda". The New Yorker. 1 December 2016. Retrieved 3 December 2016.
- Ben Norton; Glenn Greenwald (26 November 2016), "Washington Post Disgracefully Promotes a McCarthyite Blacklist From a New, Hidden, and Very Shady Group", The Intercept, retrieved 27 November 2016
- Taibbi, Matt (28 November 2016), "The 'Washington Post' 'Blacklist' Story Is Shameful and Disgusting", Rolling Stone, retrieved 30 November 2016
- Blumenthal, Max (25 November 2016). "Washington Post Promotes Shadowy Website That Accuses 200 Publications of Being Russian Propaganda Plants". AlterNet. Retrieved 3 December 2016.
- Shapiro, Ari (25 November 2016), "Experts Say Russian Propaganda Helped Spread Fake News During Election", All Things Considered, National Public Radio, retrieved 26 November 2016
- Collins, Ben (28 October 2016), "This 'Conservative News Site' Trended on Facebook, Showed Up on Fox News—and Duped the World", The Daily Beast, retrieved 27 November 2016
- Chacon, Marco (21 November 2016), "I've Been Making Viral Fake News for the Last Six Months. It's Way Too Easy to Dupe the Right on the Internet.", The Daily Beast, retrieved 27 November 2016
- Bambury, Brent (25 November 2016), "Marco Chacon meant his fake election news to be satire — but people took it as fact", Day 6, CBC Radio One, retrieved 27 November 2016
- Rachel Dicker (14 November 2016), "Avoid These Fake News Sites at All Costs", U.S. News & World Report, retrieved 16 November 2016
- Chang, Juju (29 November 2016), "When Fake News Stories Make Real News Headlines", ABC News, retrieved 29 November 2016
- McAlone, Nathan (17 November 2016), "This fake-news writer says he makes over $10,000 a month, and he thinks he helped get Trump elected", Business Insider, retrieved 18 November 2016
- Goist, Robin (17 November 2016), "The fake news of Facebook", The Plain Dealer, retrieved 18 November 2016
- Dewey, Caitlin (17 November 2016), "Facebook fake-news writer: 'I think Donald Trump is in the White House because of me'", The Washington Post, ISSN 0190-8286, retrieved 17 November 2016
- Hedegaard, Erik (29 November 2016), "How a Fake Newsman Accidentally Helped Trump Win the White House - Paul Horner thought he was trolling Trump supporters – but after the election, the joke was on him", Rolling Stone, retrieved 29 November 2016
- Eunice Yoon and Barry Huang (22 November 2016), "China on US fake news debate: We told you so", CNBC, retrieved 28 November 2016
- Cadell, Catherine (19 November 2016), "China says terrorism, fake news impel greater global internet curbs", Reuters, retrieved 28 November 2016
- Read, Max (27 November 2016), "Maybe the Internet Isn't a Fantastic Tool for Democracy After All", New York Magazine, retrieved 28 November 2016
- Frenkel, Sheera (20 November 2016), "This Is What Happens When Millions Of People Suddenly Get The Internet", BuzzFeed News, retrieved 28 November 2016
- Kate Connolly, Angelique Chrisafis, Poppy McPherson, Stephanie Kirchgaessner, Benjamin Haas, Dominic Phillips, and Elle Hunt (2 December 2016), "Fake news: an insidious trend that's fast becoming a global problem - With fake online news dominating discussions after the US election, Guardian correspondents explain how it is distorting politics around the world", The Guardian, retrieved 2 December 2016
- Orlowski, Andrew (21 November 2016), "China cites Trump to justify 'fake news' media clampdown. Surprised?", The Register, retrieved 28 November 2016
- Pascaline, Mary (20 November 2016), "Facebook Fake News Stories: China Calls For More Censorship On Internet Following Social Media's Alleged Role In US Election", International Business Times, retrieved 28 November 2016
- Rauhala, Emily (17 November 2016), "After Trump, Americans want Facebook and Google to vet news. So does China.", The Washington Post, retrieved 28 November 2016
- Dou, Eva (18 November 2016), "China Presses Tech Firms to Police the Internet - Third-annual World Internet Conference aimed at proselytizing China's view to global audience", The Wall Street Journal, retrieved 28 November 2016
- Murdock, Jason (30 November 2016), "Russian hackers may disrupt Germany's 2017 election warns spy chief", International Business Times UK edition, retrieved 1 December 2016
- Horowitz, Jason (2 December 2016), "Spread of Fake News Provokes Anxiety in Italy", The New York Times, retrieved 3 December 2016
- "La notizia più condivisa sul referendum? È una bufala" [The most shared news story on the referendum? It's a hoax], Pagella Politica (in Italian), pagellapolitica.it, retrieved 2 December 2016
- Anderson, Ariston (30 November 2016), "Italy's Populist Party Found to Be Leader in Europe for Fake News", The Hollywood Reporter, retrieved 3 December 2016
- Alberto Nardelli and Craig Silverman (29 November 2016), "Italy's Most Popular Political Party Is Leading Europe In Fake News And Kremlin Propaganda", BuzzFeed News, retrieved 3 December 2016
- Alyssa Newcomb (15 November 2016), "Facebook, Google Crack Down on Fake News Advertising", NBC News, retrieved 16 November 2016
- Drum, Kevin (17 November 2016), "Meet Ret. General Michael Flynn, the Most Gullible Guy in the Army", Mother Jones, retrieved 18 November 2016
- Tapper, Jake (17 November 2016), "Fake news stories thriving on social media - Phony news stories are thriving on social media, so much so President Obama addressed it. CNN's Jake Tapper reports.", CNN, retrieved 18 November 2016
- Masnick, Mike (14 October 2016), "Donald Trump's Son & Campaign Manager Both Tweet Obviously Fake Story", Techdirt, retrieved 18 November 2016
- President Barack Obama (7 November 2016), Remarks by the President at Hillary for America Rally in Ann Arbor, Michigan, White House Office of the Press Secretary, retrieved 16 November 2016
- Gardiner Harris and Melissa Eddy (17 November 2016), "Obama, With Angela Merkel in Berlin, Assails Spread of Fake News", The New York Times, retrieved 18 November 2016
- Maheshwari, Sapna (20 November 2016), "How Fake News Goes Viral", The New York Times, ISSN 0362-4331, retrieved 20 November 2016
- Kurtz, Howard, "Fake news and the election: Why Facebook is polluting the media environment with garbage", Fox News, archived from the original on 18 November 2016, retrieved 18 November 2016
- Porter, Tom (1 December 2016), "US House of representatives backs proposal to counter global Russian subversion", International Business Times UK edition, retrieved 1 December 2016
- Miller, Kevin (1 December 2016), "Angus King: Russian involvement in U.S. election 'an arrow aimed at the heart of democracy'", Portland Press Herald, retrieved 2 December 2016
- Staff report (30 November 2016), "Angus King among senators asking president to declassify information about Russia and election", Portland Press Herald, retrieved 2 December 2016
- Jim Sciutto and Manu Raju (3 December 2016), "Democrats want Russian hacking intelligence declassified", CNN, retrieved 3 December 2016
- Bump, Philip (14 November 2016), "Google's top news link for 'final election results' goes to a fake news site with false numbers", The Washington Post, retrieved 26 November 2016
- Jacobson, Louis (14 November 2016), "No, Donald Trump is not beating Hillary Clinton in the popular vote", PolitiFact.com, retrieved 26 November 2016
- Wingfield, Nick; Isaac, Mike; Benner, Katie (14 November 2016), "Google and Facebook Take Aim at Fake News Sites", The New York Times, retrieved 28 November 2016
- Sonam Sheth (14 November 2016), "Google looking into grossly inaccurate top news search result displayed as final popular-vote tally", Business Insider, retrieved 16 November 2016
- "Google to ban fake news sites from its advertising network", Los Angeles Times, Associated Press, 14 November 2016, retrieved 16 November 2016
- Avery Hartmans (15 November 2016), "Google's CEO says fake news could have swung the election", Business Insider, retrieved 16 November 2016
- "Google cracks down on fake news sites", The Straits Times, 15 November 2016, retrieved 16 November 2016
- Richard Waters (15 November 2016), "Facebook and Google to restrict ads on fake news sites", Financial Times, retrieved 16 November 2016
- Sridhar Ramaswamy (21 January 2016), "How we fought bad ads in 2015", Google blog, Google, retrieved 28 November 2016
- Paul Blake (15 November 2016), "Google, Facebook Move to Block Fake News From Ad Services", ABC News, retrieved 16 November 2016
- Gina Hall (15 November 2016), "Facebook staffers form an unofficial task force to look into fake news problem", Silicon Valley Business Journal, retrieved 16 November 2016
- Frenkel, Sheera (14 November 2016), "Renegade Facebook Employees Form Task Force To Battle Fake News", BuzzFeed, retrieved 18 November 2016
- Shahani, Aarti (15 November 2016), "Facebook, Google Take Steps To Confront Fake News", National Public Radio, retrieved 20 November 2016
- Cooke, Kristina (15 November 2016), "Google, Facebook move to restrict ads on fake news sites", Reuters, retrieved 20 November 2016
- "Facebook's Fake News Problem: What's Its Responsibility?", The New York Times, Associated Press, 15 November 2016, retrieved 20 November 2016
- Ohlheiser, Abby (19 November 2016), "Mark Zuckerberg outlines Facebook's ideas to battle fake news", The Washington Post, retrieved 19 November 2016
- Vladimirov, Nikita (19 November 2016), "Zuckerberg outlines Facebook's plan to fight fake news", The Hill, ISSN 1521-1568, retrieved 19 November 2016
- Mike Isaac (19 November 2016), "Facebook Considering Ways to Combat Fake News, Mark Zuckerberg Says", The New York Times, retrieved 19 November 2016
- Samuel Burke (19 November 2016), "Zuckerberg: Facebook will develop tools to fight fake news", CNNMoney, CNN, retrieved 19 November 2016
- Chappell, Bill (19 November 2016), "'Misinformation' On Facebook: Zuckerberg Lists Ways Of Fighting Fake News", National Public Radio, retrieved 19 November 2016
- Silverman, Craig (19 November 2016), "This Is How You Can Stop Fake News From Spreading On Facebook", BuzzFeed, retrieved 20 November 2016
- Taylor Hatmaker and Josh Constine (1 December 2016), "Facebook quietly tests warnings on fake news", TechCrunch, retrieved 2 December 2016
- "False news items are not the only problem besetting Facebook", The Economist, 26 November 2016, retrieved 28 November 2016
- Pesek, William (27 November 2016), "Will Facebook be China's propaganda tool?", The Japan Times, Barron's newspaper, retrieved 28 November 2016
- Stelter, Brian (7 November 2016), "How Donald Trump made fact-checking great again", CNNMoney, CNN, retrieved 19 November 2016
- Kessler, Glenn (10 November 2016), "Fact checking in the aftermath of a historic election", The Washington Post, retrieved 19 November 2016
- Neidig, Harper (17 November 2016), "Fact-checkers call on Zuckerberg to address spread of fake news", The Hill, ISSN 1521-1568, retrieved 19 November 2016
- Hartlaub, Peter (24 October 2004), "Web sites help gauge the veracity of claims / Online resources check ads, rumors", San Francisco Chronicle, p. A1, retrieved 25 November 2016
- "Fact-Checking Deceptive Claims About the Federal Health Care Legislation - by Staff, FactCheck.org", 2010 Sigma Delta Chi Award Honorees, Society of Professional Journalists, 2010, retrieved 25 November 2016
- Columbia University (2009), "National Reporting - Staff of St. Petersburg Times", 2009 Pulitzer Prize Winners, retrieved 24 November 2016,
For "PolitiFact," its fact-checking initiative during the 2008 presidential campaign that used probing reporters and the power of the World Wide Web to examine more than 750 political claims, separating rhetoric from truth to enlighten voters.
- Novak, Viveca (10 April 2009), "Ask FactCheck - Snopes.com", FactCheck.org, retrieved 25 November 2016
- McNamara, Paul (13 April 2009), "Fact-checking the fact-checkers: Snopes.com gets an 'A'", Network World, retrieved 25 November 2016
- Lori Robertson and Eugene Kiely (18 November 2016), "How to Spot Fake News", FactCheck.org, retrieved 19 November 2016
- Lemann, Nicholas (30 November 2016), "Solving the Problem of Fake News", The New Yorker, retrieved 30 November 2016
- LaCapria, Kim (2 November 2016), "Snopes' Field Guide to Fake News Sites and Hoax Purveyors - Snopes.com's updated guide to the internet's clickbaiting, news-faking, social media exploiting dark side.", Snopes.com, retrieved 19 November 2016
- Sharockman, Aaron (16 November 2016), "Let's fight back against fake news", PolitiFact.com, retrieved 19 November 2016
- Burgess, Matt (17 November 2016), "Google is helping Full Fact create an automated, real-time fact-checker", Wired magazine UK edition, retrieved 29 November 2016
- The International Fact-Checking Network (17 November 2016), "An open letter to Mark Zuckerberg from the world's fact-checkers", Poynter Institute, retrieved 19 November 2016
- Hare, Kristen (21 September 2015), "Poynter names director and editor for new International Fact-Checking Network", Poynter Institute for Media Studies, retrieved 20 November 2016
- "About the International Fact-Checking Network", Poynter Institute for Media Studies, 2016, retrieved 20 November 2016
- Klasfeld, Adam (22 November 2016), "Fake News Gives Facebook a Nixon-Goes-to-China Moment", Courthouse News Service, retrieved 28 November 2016
- Brian Feldman (15 November 2016), "Here's a Chrome Extension That Will Flag Fake-News Sites for You", New York Magazine, retrieved 16 November 2016
- Will Oremus (15 November 2016), "The Real Problem Behind the Fake News", Slate magazine, retrieved 16 November 2016
- Morris, David Z. (27 November 2016), "Eli Pariser's Crowdsourced Brain Trust Is Tackling Fake News", Fortune magazine, retrieved 28 November 2016
- Burgess, Matt (25 November 2016), "Hive mind assemble! There is now a crowdsourcing campaign to solve the problem of fake news", Wired magazine UK edition, retrieved 29 November 2016
- Ingram, Matthew (21 November 2016), "Facebook Doesn't Need One Editor, It Needs 1,000 of Them", Fortune magazine, retrieved 29 November 2016
- "Google, Facebook move to curb ads on fake news sites", Kuwait Times, Reuters, 15 November 2016, retrieved 16 November 2016
- Menczer, Filippo (28 November 2016), "Fake Online News Spreads Through Social Echo Chambers", Scientific American, The Conversation, retrieved 29 November 2016
- Douglas Perry (15 November 2016), "Facebook, Google try to drain the fake-news swamp without angering partisans", The Oregonian, retrieved 16 November 2016
- Cassandra Jaramillo (15 November 2016), "How to break it to your friends and family that they're sharing fake news", The Dallas Morning News, retrieved 16 November 2016
- Domonoske, Camila (23 November 2016), "Students Have 'Dismaying' Inability To Tell Fake News From Real, Study Finds", National Public Radio, retrieved 25 November 2016
- McEvers, Kelly (22 November 2016), "Stanford Study Finds Most Students Vulnerable To Fake News", National Public Radio, retrieved 25 November 2016
- Shellenbarger, Sue (21 November 2016), "Most Students Don't Know When News Is Fake, Stanford Study Finds", The Wall Street Journal, retrieved 29 November 2016
- Willingham, Emily (28 November 2016), "A Scientific Approach To Distinguishing Real From Fake News", Forbes magazine, retrieved 29 November 2016
- "Samantha Bee Interviews Russian Trolls, Asks Them About 'Subverting Democracy'", The Hollywood Reporter, 1 November 2016, retrieved 25 November 2016
- Holub, Christian (1 November 2016), "Samantha Bee interviews actual Russian trolls", Entertainment Weekly, retrieved 25 November 2016
- Wilstein, Matt (7 November 2016), "How Samantha Bee's 'Full Frontal' Tracked Down Russia's Pro-Trump Trolls", The Daily Beast, retrieved 25 November 2016
- Rogers, James (11 November 2016), "Facebook's 'fake news' highlights need for social media revamp, experts say", Fox News, retrieved 20 November 2016
- Chenoweth, Eric (25 November 2016), "Americans keep looking away from the election's most alarming story", The Washington Post, retrieved 26 November 2016
- "'I write fake news that gets shared on Facebook'", BBC News, BBC, 15 November 2016, retrieved 16 November 2016
|Wikinews has related news: Wikinews investigates: Advertisements disguised as news articles trick unknowing users out of money, credit card information|
- Craig Silverman (16 November 2016), "Viral Fake Election News Outperformed Real News On Facebook In Final Months Of The US Election", BuzzFeed, retrieved 16 November 2016
- Taylor, Adam (26 November 2016), "Before 'fake news,' there was Soviet 'disinformation'", The Washington Post, retrieved 3 December 2016
- Kim LaCapria (2 November 2016), "Snopes' Field Guide to Fake News Sites and Hoax Purveyors", Snopes.com, snopes.com, retrieved 16 November 2016
- Lori Robertson and Eugene Kiely (18 November 2016), "How to Spot Fake News", FactCheck.org, Annenberg Public Policy Center, retrieved 19 November 2016
- Jared Keller (19 November 2016), "This Critique of Fake Election News Is a Must-Read for All Democracy Lovers", Mother Jones, retrieved 19 November 2016
- Lance Ulanoff (18 November 2016), "7 signs the news you're sharing is fake", Mashable, retrieved 19 November 2016
- Laura Hautala (19 November 2016), "How to avoid getting conned by fake news sites - Here's how you can identify and avoid sites that just want to serve up ads next to outright falsehoods.", CNET, retrieved 19 November 2016
- Sreenivasan, Hari (17 November 2016), "How online hoaxes and fake news played a role in the election", PBS NewsHour (video), retrieved 29 November 2016