The decision was made because António Guterres did not "unequivocally" condemn Iran's attack on Israel
…
MELBOURNE, Australia — A code of conduct will be enforced on the online dating industry to better protect Australian users after research found that three in four people suffer some form of sexual violence through the platforms, Australia’s government said on Tuesday.
Bumble, Grindr and Match Group Inc., a Texas-based company that owns platforms including Tinder, Hinge, OKCupid and Plenty of Fish, have agreed to the code that took effect on Tuesday, Communications Minister Michelle Rowland said.
The platforms, which account for 75% of the industry in Australia, have until April 1 to implement the changes before they are strictly enforced, Rowland said.
The code requires the platforms’ systems to detect potential incidents of online-enabled harm and demands that the accounts of some offenders be terminated.
Complaint and reporting mechanisms are to be made prominent and transparent. A new rating system will show users how well platforms are meeting their obligations under the code.
The government called for a code of conduct last year after research by the Australian Institute of Criminology found that three in four users of dating apps or websites had experienced some form of sexual violence through these platforms in the five years through 2021.
“There needs to be a complaint-handling process. This is a pretty basic feature that Australians would have expected in the first place,” Rowland said on Tuesday.
“If there are grounds to ban a particular individual from utilizing one of those platforms, if they’re banned on one platform, they’re blocked on all platforms,” she added.
Match Group said it had already introduced new safety features on Tinder, including photo and identification verification to prevent bad actors from accessing the platform while giving users more confidence in the authenticity of their connections.
The platform used artificial intelligence to issue real-time warnings about potentially offensive language in an opening line, advising users to pause before sending.
“This is a pervasive issue, and we take our responsibility to help keep users safe on our platform very seriously,” Match Group said in a statement on Wednesday.
Match Group said it would continue to collaborate with the government and the industry to “help make dating safer for all Australians.”
Bumble said it shared the government’s hope of eliminating gender-based violence and was grateful for the opportunity to work with the government and industry on what the platform described as a “world-first dating code of practice.”
“We know that domestic and sexual violence is an enormous problem in Australia, and that women, members of LGBTQ+ communities, and First Nations are the most at risk,” a Bumble statement said.
“Bumble puts women’s experiences at the center of our mission to create a world where all relationships are healthy and equitable, and safety has been central to our mission from day one,” Bumble added.
Grindr said in a statement it was “honored to participate in the development of the code and shares the Australian government’s commitment to online safety.”
All the platforms helped design the code.
Platforms that have not signed up include Happn, Coffee Meets Bagel and Feeld.
The government expects the code will enable Australians to make better informed choices about which dating apps are best equipped to provide a safe dating experience.
The government has also warned the online dating industry that it will legislate if the operators fail to keep Australians safe on their platforms.
…
LITTLE ROCK, Arkansas — Arkansas sued YouTube and parent company Alphabet on Monday, saying the video-sharing platform is made deliberately addictive and fueling a mental health crisis among youth in the state.
Attorney General Tim Griffin’s office filed the lawsuit in state court, accusing them of violating the state’s deceptive trade practices and public nuisance laws. The lawsuit claims the site is addictive and has resulted in the state spending millions on expanded mental health and other services for young people.
“YouTube amplifies harmful material, doses users with dopamine hits, and drives youth engagement and advertising revenue,” the lawsuit said. “As a result, youth mental health problems have advanced in lockstep with the growth of social media, and in particular, YouTube.”
Alphabet’s Google, which owns the video service and is also named as a defendant in the case, denied the lawsuit’s claims.
“Providing young people with a safer, healthier experience has always been core to our work. In collaboration with youth, mental health and parenting experts, we built services and policies to provide young people with age-appropriate experiences, and parents with robust controls,” Google spokesperson Jose Castaneda said in a statement. “The allegations in this complaint are simply not true.”
YouTube requires users under 17 to get their parent’s permission before using the site, while accounts for users younger than 13 must be linked to a parental account. But it is possible to watch YouTube without an account, and kids can easily lie about their age.
The lawsuit is the latest in an ongoing push by state and federal lawmakers to highlight the impact that social media sites have on younger users. U.S. Surgeon General Vivek Murthy in June called on Congress to require warning labels on social media platforms about their effects on young people’s lives, like those now mandatory on cigarette boxes.
Arkansas last year filed similar lawsuits against TikTok and Facebook parent company Meta, claiming the social media companies were misleading consumers about the safety of children on their platforms and protections of users’ private data. Those lawsuits are still pending in state court.
Arkansas also enacted a law requiring parental consent for minors to create new social media accounts, though that measure has been blocked by a federal judge.
Along with TikTok, YouTube is one of the most popular sites for children and teens. Both sites have been questioned in the past for hosting, and in some cases promoting, videos that encourage gun violence, eating disorders and self-harm.
YouTube in June changed its policies about firearm videos, prohibiting any videos demonstrating how to remove firearm safety devices. Under the new policies, videos showing homemade guns, automatic weapons and certain firearm accessories like silencers will be restricted to users 18 and older.
Arkansas’ lawsuit claims that YouTube’s algorithms steer youth to harmful adult content, and that it facilitates the spread of child sexual abuse material.
The lawsuit doesn’t seek specific damages, but asks that YouTube be ordered to fund prevention, education and treatment for “excessive and problematic use of social media.”
…