According to Kirby, this assessment is supported by the fact that Putin "keeps changing generals the way I change socks."
…
ByteDance, the Chinese parent company of social media platform TikTok, has dramatically upped its U.S. lobbying efforts since 2020 as U.S.-China relations continue to sour and, as of last year, ranked fourth among internet companies in spending on federal lobbying, according to newly released data.
Publicly available information collected by OpenSecrets, a Washington nonprofit that tracks campaign finance and lobbying data, shows that ByteDance and its subsidiaries, including TikTok, the wildly popular short video app, have spent more than $13 million on U.S. lobbying since 2020. In 2022 alone, Fox News reported, the companies spent $5.4 million on lobbying.
Only Amazon.com ($19.7 million) and the parent companies of Google ($11 million) and Facebook ($19 million) spent more, according to OpenSecrets.
In the fourth quarter of 2022, ByteDance spent $1.2 million on lobbying, according to Fox News.
The lobbyists hired by ByteDance include former U.S. senators Trent Lott and John Breaux; David Urban, a former senior adviser to Donald Trump’s 2016 presidential campaign and former chief of staff to the late Senator Arlen Specter; Layth Elhassani, special assistant to President Barack Obama in the White House Office of Legislative Affairs; and Samantha Clark, former deputy staff director of the U.S. Senate Armed Services Committee.
In November, TikTok hired Jamal Brown, a deputy press secretary at the Pentagon who was national press secretary for Joe Biden’s presidential campaign, to manage policy communications for the Americas, with a focus on the U.S., according to Politico.
“This is kind of the template for how modern tech lobbying goes,” Dan Auble, a senior researcher at OpenSecrets, told Vox. “These companies come on the scene and suddenly start spending substantial amounts of money. And ByteDance has certainly done that.”
U.S. officials have criticized TikTok as a security risk due to ties between ByteDance and the Chinese government. The worry is that user data collected by TikTok could be passed to Beijing, so lawmakers have been trying to regulate or even ban the app in the U.S.
In 2019, TikTok paid a $5.7 million fine as part of a settlement with the Federal Trade Commission over violating children’s privacy rights. The Trump administration attempted unsuccessfully to ban downloads of TikTok from app stores and outlaw transactions between Americans and ByteDance.
As of late December, TikTok had been banned on federally managed devices, and 19 states had at least partially blocked the app from state-managed devices.
The number of federal bills that ByteDance has been lobbying on increased to 14 in 2022 from eight in 2020.
With TikTok CEO Shou Zi Chew scheduled to testify before the U.S. House of Representatives Energy and Commerce Committee on March 23, and a House of Representatives Foreign Affairs Committee vote in March on a bill that would ban the use of TikTok in the U.S., the company is expected to further expand its U.S. influence campaign.
Erich Andersen, general counsel and head of corporate affairs at ByteDance and TikTok, told the New York Times in January that “it was necessary for us to accelerate our own explanation of what we were prepared to do and the level of commitments on the national security process.”
TikTok’s efforts to prove that its U.S. operations are outside Beijing’s sphere of influence have met with a mixed response.
Michael Beckerman, who oversees public policy for the Americas at TikTok, met with Mike Gallagher, chairman of the U.S. House of Representatives Select Committee on China Affairs, on February 1 to explain the company’s U.S. data security plans.
According to Reuters, Gallagher’s spokesperson, Jordan Dunn, said after the meeting that the lawmaker “found their argument unpersuasive.”
Congressman Ken Buck and Senator Josh Hawley on January 25 introduced a bill, the No TikTok on United States Devices Act, which would instruct President Joe Biden to use the International Emergency Economic Powers Act to prohibit downloads of TikTok and ban commercial activity with ByteDance.
Joel Thayer, president of the Digital Progress Institute and a telecom regulation lawyer, told VOA Mandarin that he doubted the Buck-Hawley bill would become law. He said that calls to ban TikTok began during the Trump administration, yet TikTok has remained a visible and influential presence in the U.S.
James Lewis, director of the CSIS Technology and Public Policy Program, told VOA Mandarin, “An outright ban will be difficult because TikTok is speech, which is protected speech. But it [the U.S. government] can ban financial transactions, that’s possible.”
Senators Marco Rubio and Angus King reintroduced bipartisan legislation on February 10 to ban TikTok and other similar apps from operating in the U.S. by “blocking and prohibiting all transactions from any social media company in, or under the influence of, China, Russia, and several other foreign countries of concern unless they fully divest of dangerous foreign ownership.”
The Committee on Foreign Investment in the United States (CFIUS), an interagency group that reviews transactions involving foreign parties for possible national security threats, ordered ByteDance to divest TikTok in 2020. The two parties have yet to reach an agreement after two years of talks.
Chuck Flint, vice president of strategic relationships at Breitbart News and former chief of staff to Senator Marsha Blackburn, told VOA Mandarin, “I expect that CFIUS will be hesitant to ban TikTok. Anything short of an outright ban will leave China’s TikTok data pipeline in place.”
China experts believe that TikTok wants to reach an agreement with CFIUS rather than being banned from the U.S. or being forced to sell TikTok’s U.S. business to an American company.
Lewis of CSIS said, “Every month that we don’t do CFIUS is a step closer towards some kind of ban.”
Julian Ku, professor of law and faculty director of international programs at Hofstra University, told VOA Mandarin, “The problem is that no matter what they offer, there’s no way to completely shield the data from the Chinese government … as long as there continues to be a shared entity.”
Adrianna Zhang contributed to this report.
…
After seeing promising results in Eastern Europe, Google will initiate a new campaign in Germany that aims to make people more resilient to the corrosive effects of online misinformation.
The tech giant plans to release a series of short videos highlighting the techniques common to many misleading claims. The videos will appear as advertisements on platforms like Facebook, YouTube or TikTok in Germany. A similar campaign in India is also in the works.
It’s an approach called prebunking, which involves teaching people how to spot false claims before they encounter them. The strategy is gaining support among researchers and tech companies.
“There’s a real appetite for solutions,” said Beth Goldberg, head of research and development at Jigsaw, an incubator division of Google that studies emerging social challenges. “Using ads as a vehicle to counter a disinformation technique is pretty novel. And we’re excited about the results.”
While belief in falsehoods and conspiracy theories isn’t new, the speed and reach of the internet have given them a heightened power. When catalyzed by algorithms, misleading claims can discourage people from getting vaccines, spread authoritarian propaganda, foment distrust in democratic institutions and spur violence.
It’s a challenge with few easy solutions. Journalistic fact checks are effective, but they’re labor intensive, aren’t read by everyone, and won’t convince those already distrustful of traditional journalism. Content moderation by tech companies is another response, but it only drives misinformation elsewhere, while prompting cries of censorship and bias.
Prebunking videos, by contrast, are relatively cheap and easy to produce and can be seen by millions when placed on popular platforms. They also avoid the political challenge altogether by focusing not on the topics of false claims, which are often cultural lightning rods, but on the techniques that make viral misinformation so infectious.
Those techniques include fear-mongering, scapegoating, false comparisons, exaggeration and missing context. Whether the subject is COVID-19, mass shootings, immigration, climate change or elections, misleading claims often rely on one or more of these tricks to exploit emotions and short-circuit critical thinking.
Last fall, Google launched the largest test of the theory so far with a prebunking video campaign in Poland, the Czech Republic and Slovakia. The videos dissected different techniques seen in false claims about Ukrainian refugees. Many of those claims relied on alarming and unfounded stories about refugees committing crimes or taking jobs away from residents.
The videos were seen 38 million times on Facebook, TikTok, YouTube and Twitter — a number that equates to a majority of the population in the three nations. Researchers found that compared to people who hadn’t seen the videos, those who did watch were more likely to be able to identify misinformation techniques, and less likely to spread false claims to others.
The pilot project adds to a growing consensus in support of the theory.
“This is a good news story in what has essentially been a bad news business when it comes to misinformation,” said Alex Mahadevan, director of MediaWise, a media literacy initiative of the Poynter Institute that has incorporated prebunking into its own programs in countries including Brazil, Spain, France and the U.S.
Mahadevan called the strategy a “pretty efficient way to address misinformation at scale, because you can reach a lot of people while at the same time address a wide range of misinformation.”
Google’s new campaign in Germany will include a focus on photos and videos, and the ease with which they can be presented as evidence of something false. One example: Last week, following the earthquake in Turkey, some social media users shared video of the massive explosion in Beirut in 2020, claiming it was actually footage of a nuclear explosion triggered by the earthquake. It was not the first time the 2020 explosion had been the subject of misinformation.
Google will announce its new German campaign Monday ahead of next week’s Munich Security Conference. The timing of the announcement, coming before that annual gathering of international security officials, reflects heightened concerns about the impact of misinformation among both tech companies and government officials.
Tech companies like prebunking because it avoids touchy topics that are easily politicized, said Sander van der Linden, a University of Cambridge professor considered a leading expert on the theory. Van der Linden worked with Google on its campaign and is now advising Meta, the owner of Facebook and Instagram, as well.
Meta has incorporated prebunking into many different media literacy and anti-misinformation campaigns in recent years, the company told The Associated Press in an emailed statement.
They include a 2021 program in the U.S. that offered media literacy training about COVID-19 to Black, Latino and Asian American communities. Participants who took the training were later tested and found to be far more resistant to misleading COVID-19 claims.
Prebunking comes with its own challenges. The effects of the videos eventually wear off, requiring the use of periodic “booster” videos. Also, the videos must be crafted well enough to hold the viewer’s attention, and tailored for different languages, cultures and demographics. And like a vaccine, it’s not 100% effective for everyone.
Google found that its campaign in Eastern Europe varied from country to country. While the effect of the videos was highest in Poland, in Slovakia they had “little to no discernible effect,” researchers found. One possible explanation: The videos were dubbed into the Slovak language, and not created specifically for the local audience.
But together with traditional journalism, content moderation and other methods of combating misinformation, prebunking could help communities reach a kind of herd immunity when it comes to misinformation, limiting its spread and impact.
“You can think of misinformation as a virus. It spreads. It lingers. It can make people act in certain ways,” Van der Linden told the AP. “Some people develop symptoms, some do not. So: if it spreads and acts like a virus, then maybe we can figure out how to inoculate people.”
…