In January, the president's office announced that Geingob had been diagnosed with cancer. In recent weeks he had been battling the illness.
…
BEIJING — A small but powerful Chinese rocket capable of carrying payloads at competitive costs delivered nine satellites into orbit Saturday, Chinese state media reported, in what is gearing up to be another busy year for Chinese commercial launches.
The Jielong-3, or Smart Dragon-3, blasted off from a floating barge off the coast of Yangjiang in southern Guangdong province, the second launch of the rocket in just two months.
Developed by China Rocket Company, a commercial offshoot of a state-owned launch vehicle manufacturer, Jielong-3 made its first flight in December 2022.
President Xi Jinping has called for the expansion of strategic industries including the commercial space sector, deemed key to building constellations of satellites for communications, remote sensing and navigation.
Also Saturday, Chinese automaker Geely Holding Group launched 11 satellites to boost its capacity to provide more accurate navigation for autonomous vehicles.
Last year saw 17 Chinese commercial launches with one failure, among a record 67 orbital launches by China. That was up from 10 Chinese commercial launches in 2022, including two failures.
In 2023, China conducted more launches than any other country except for the United States, which made 116 launch attempts, including just under 100 by Elon Musk’s SpaceX.
Critical to the construction of commercial satellite networks is China’s ability to open more launch windows, expand rocket types to accommodate different payload sizes, lower launch costs and increase the number of launch sites.
…
NEW YORK — A graphic video from a Pennsylvania man accused of beheading his father that circulated for hours on YouTube has put a spotlight yet again on gaps in social media companies’ ability to prevent horrific postings from spreading across the web.
Police said Wednesday that they charged Justin Mohn, 32, with first-degree murder and abusing a corpse after he beheaded his father, Michael, in their Bucks County home and publicized it in a 14-minute YouTube video that anyone, anywhere could see.
News of the incident — which drew comparisons to the beheading videos posted online by Islamic State militants at the height of their prominence nearly a decade ago — came as the CEOs of Meta, TikTok and other social media companies were testifying in front of federal lawmakers frustrated by what they see as a lack of progress on child safety online. YouTube, which is owned by Google, did not attend the hearing despite its status as one of the most popular platforms among teens.
The disturbing video from Pennsylvania follows other horrific clips that have been broadcast on social media in recent years, including domestic mass shootings livestreamed from Louisville, Kentucky; Memphis, Tennessee; and Buffalo, New York — as well as carnage filmed abroad in Christchurch, New Zealand, and the German city of Halle.
Middletown Township Police Capt. Pete Feeney said the video in Pennsylvania was posted at about 10 p.m. Tuesday and online for about five hours, a time lag that raises questions about whether social media platforms are delivering on moderation practices that might be needed more than ever amid wars in Gaza and Ukraine, and an extremely contentious presidential election in the U.S.
“It’s another example of the blatant failure of these companies to protect us,” said Alix Fraser, director of the Council for Responsible Social Media at the nonprofit advocacy organization Issue One. “We can’t trust them to grade their own homework.”
A spokesperson for YouTube said the company removed the video, deleted Mohn’s channel and was tracking and removing any re-uploads that might pop up. The video-sharing site says it uses a combination of artificial intelligence and human moderators to monitor its platform but did not respond to questions about how the video was caught or why it was not removed sooner.
Major social media companies moderate content with the help of powerful automated systems, which can often catch prohibited content before a human can. But that technology can sometimes fall short when a video is violent and graphic in a way that is new or unusual, as it was in this case, said Brian Fishman, co-founder of the trust and safety technology startup Cinder.
That’s when human moderators are “really, really critical,” he said. “AI is improving, but it’s not there yet.”
The Global Internet Forum to Counter Terrorism, a group set up by tech companies to prevent these types of videos from spreading online, was in communication with all of its members about the incident on Tuesday evening, said Adelina Petit-Vouriot, a spokesperson for the organization.
Roughly 40 minutes after midnight Eastern time on Wednesday, GIFCT issued a “Content Incident Protocol,” which it activates to formally alert its members – and other stakeholders – about a violent event that’s been livestreamed or recorded. GIFCT allows the platform with the original footage to submit a “hash” — a digital fingerprint corresponding to a video — and notifies nearly two dozen other member companies so they can restrict it from their platforms.
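In broad terms, the hash-sharing idea works like the minimal sketch below: one platform computes a compact fingerprint of a flagged file, and member platforms compare incoming uploads against the shared list. This is a hypothetical Python illustration, not GIFCT's actual system; the shared list, function names and the use of an exact SHA-256 hash are assumptions made for clarity, and production systems generally rely on perceptual hashes (such as Meta's open-source PDQ and TMK+PDQF) that tolerate re-encoding and cropping.

```python
# Simplified, illustrative sketch of hash-list matching (not GIFCT's real implementation).
import hashlib

# Hypothetical shared hash list distributed to member platforms.
SHARED_HASH_LIST = {
    "9f2b5c0000000000": "content-incident-placeholder",  # placeholder entry
}

def fingerprint(file_bytes: bytes) -> str:
    """Exact cryptographic fingerprint of the upload; real hash-sharing systems
    favor perceptual hashes that survive re-encoding, unlike SHA-256."""
    return hashlib.sha256(file_bytes).hexdigest()

def should_block(file_bytes: bytes) -> bool:
    """Return True if the upload's fingerprint appears in the shared list."""
    return fingerprint(file_bytes) in SHARED_HASH_LIST

if __name__ == "__main__":
    sample = b"example upload payload"
    print(should_block(sample))  # False unless the sample's hash is in the shared list
```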
But by Wednesday morning, the video had already spread to X, where a graphic clip of Mohn holding his father’s head remained on the platform for at least seven hours and received 20,000 views. The company, formerly known as Twitter, did not respond to a request for comment.
Experts in radicalization say that social media and the internet have lowered the barrier to entry for people to explore extremist groups and ideologies, allowing any person who may be predisposed to violence to find a community that reinforces those ideas.
In the video posted after the killing, Mohn described his father as a 20-year federal employee, espoused a variety of conspiracy theories and ranted against the government.
Most social platforms have policies to remove violent and extremist content. But they can’t catch everything, and the emergence of many newer, less closely moderated sites has allowed more hateful ideas to fester unchecked, said Michael Jensen, senior researcher at the University of Maryland-based Consortium for the Study of Terrorism and Responses to Terrorism, or START.
Despite the obstacles, social media companies need to be more vigilant about regulating violent content, said Jacob Ware, a research fellow at the Council on Foreign Relations.
“The reality is that social media has become a front line in extremism and terrorism,” Ware said. “That’s going to require more serious and committed efforts to push back.”
Nora Benavidez, senior counsel at the media advocacy group Free Press, said among the tech reforms she would like to see are more transparency about what kinds of employees are being impacted by layoffs, and more investment in trust and safety workers.
Google, which owns YouTube, this month laid off hundreds of employees working on its hardware, voice assistance and engineering teams. Last year, the company said it cut 12,000 workers “across Alphabet, product areas, functions, levels and regions,” without offering additional detail.
…