
YouTube puts mail-in voting information next to videos on the topic – CNET

[Image: YouTube adds an information panel on mail-in voting. Angela Lang/CNET]

YouTube on Thursday said it will give people information on mail-in voting when they watch videos that discuss the subject. The ballot-casting method has become fraught with misinformation as President Donald Trump has tried to discredit the process, while providing no evidence of security flaws in the time-tested system.

To provide people with more context, YouTube's software will add a text panel to accompany vote-by-mail videos, linking to information from the Bipartisan Policy Center, a Washington, DC-based think tank.

“Mail-in ballots that meet eligibility and validity requirements are counted in every election,” reads the page YouTube users will see when they click on the link. “The law requires all valid votes to be counted in every election regardless of how they are cast.”

YouTube, which is owned by Google, isn't the only tech giant trying to quash misinformation related to mail-in voting. Facebook and Twitter have both flagged Trump's posts on the topic. Earlier this month, the two social networks added labels to posts in which the president suggested people vote twice if they thought their mail-in ballot hadn't been counted.

Thursday’s announcement comes as Silicon Valley companies try to prove they can avoid the pitfalls they encountered in 2016. That election was marred by interference from Russia, which exploited platforms from Google, Facebook and Twitter to try to influence the outcome of the contest. 

Google earlier this month said it would block autocomplete suggestions in its search engine related to queries on voting procedures or donations to candidates. Last month, YouTube said it would ban videos containing information that was obtained through hacking and could meddle with elections or censuses.

YouTube first introduced information panels two years ago, adding short blurbs that appear under false or misleading videos and aim to debunk misinformation by linking to accurate sources. Since then, the company has added the panels to videos about COVID-19, the moon landing and other subjects rife with conspiracy theories. 

The panels haven't always worked as planned. When the Notre Dame cathedral in Paris went up in flames last April, YouTube's algorithm mistakenly displayed an information panel about the 9/11 terrorist attacks because the software misanalyzed the images in the video. After the fire, YouTube said its systems made the "wrong call."
