Various social media platforms are now facing child safety lawsuits

Child online safety is a major concern, and several social media platforms have been found lacking in this area. As a result, platforms owned by Meta, ByteDance, Alphabet, and Snap are facing child safety lawsuits. Plaintiffs allege that these platforms are not only addictive to children but also damaging to their mental health.

The issue has brought these companies before the courts repeatedly, and the most recent ruling went against them. On Tuesday, a federal court rejected their request to dismiss the cases filed against them. Most of these cases come from school districts around the country, which bear the responsibility of educating the children these platforms allegedly harm.

Other cases against these social media platforms come from individual states. All make a similar demand: regulation of platforms that, they say, are harming the children who use them. Here are the details of the case and how the companies behind these platforms are reacting to the ruling.

The court rules against social media platforms for putting children at risk with the services they offer

In the ruling, US District Judge Yvonne Gonzalez Rogers sided with the plaintiffs. Social media platforms exert considerable influence over those who use their services, and over the past few months, many cases concerning the harm these platforms do to young users have come before the court.

The content accessible to children on these platforms can shape their still-developing minds. The ruling also holds that the First Amendment and Section 230 of the Communications Decency Act do not shield these companies from claims over the harms their platforms cause. Section 230 says platforms should not be treated as publishers of third-party content, but the claims here target the platforms' own design rather than what users post.

The issue at hand isn't the content itself, but the measures put in place to protect minors from content not meant for them. The plaintiffs argue that all the platforms in this case fall short in this area, yet the companies still sought to have the case dismissed. Measures these platforms can put in place to protect children include age verification and parental controls, among others.

All of these platforms already offer such protections, but even children can bypass them, which suggests how lax these companies have been about protecting minors. The court stressed the need for changes that will strengthen these protective measures.

Companies like Meta, ByteDance, Alphabet, and Snap will need to watch more closely over minors on their platforms. Parents can also play a vital role by keeping an eye on the content their children can access on social media. Going forward, parents and guardians can expect tighter safeguards from social media platforms to help protect young users.
