On September 15, California Governor Gavin Newsom (D) signed the Age-Appropriate Design Code Act into law, after it passed unanimously in the state Senate in late August despite protests from the tech industry.

Modeled after the UK Children’s Code, which went into effect last year, the California law protects the privacy and well-being of children online and requires companies to assess the impact of any product or service that is either designed for children or “likely to be accessed by children.”

The law will go into effect on July 1, 2024, after which companies that violate it could be fined up to $7,500 per affected child. While this may seem like a small amount, similar legislation in the European Union allowed Ireland’s Data Protection Commission to fine Instagram $400 million over the way it treated children’s data. (In the case of the new law, the California attorney general would impose the fines.)

California’s Age-Appropriate Design Code Act defines a child as anyone under the age of 18, unlike the federal Children’s Online Privacy Protection Act (COPPA) of 1998, for which the cutoff age is 13.

COPPA codified protections for children’s data, prohibiting “unfair or deceptive acts or practices in connection with the collection, use, and/or disclosure of personal information from and about children on the Internet.”

California’s new law goes even further. It requires that the highest privacy settings be the default for young users, and that companies “provide a clear signal” to children when their location is being tracked.

“This is a very important victory for children and families,” Jim Steyer, founder and CEO of Common Sense Media, one of the bill’s lead sponsors, told OlxPraca.

The law comes down strongly on the side of protecting children over profit, stating: “If a conflict arises between commercial interests and the best interests of children, companies must prioritize the privacy, safety and well-being of children over commercial interests.”

In a 2019 interview with The New York Times, Baroness Beeban Kidron, the chief architect of the UK Children’s Code, described her meetings with tech executives.

“The main question they ask me is: ‘Are you really expecting companies to give up profits by limiting the data they collect on children?’” Her answer? “Of course I am! Of course, everyone should.”

“If a conflict arises between commercial interests and the best interests of children, companies must prioritize the privacy, safety and well-being of children over commercial interests.”

– California Age Appropriate Design Code Act

How will the Age-Appropriate Design Code Act protect children online?

The dangers of the internet for children go far beyond being contacted by strangers online (although, by making high privacy settings the default, the California act seeks to prevent such interactions).

Increasingly, parents are concerned about the excessive time children spend online, the lure of platforms with auto-play and other addictive features, and children’s content that promotes risky behaviors such as self-harm and eating disorders.

The Age-Appropriate Design Code Act requires companies to write a “data protection impact assessment” for each new product or service, outlining how children’s data may be used and whether that use could cause harm.

“Basically, [companies] have to look at whether their product design exposes children and teens to harmful content, or allows others to make harmful contact with them, or uses harmful algorithms,” said Steyer.

Under the law, Steyer explained, YouTube would still be able to make video recommendations, for example; the difference is that it would have less data with which to make those recommendations. Companies will also be responsible for checking whether their algorithms are promoting harmful content and, if so, taking action.

Haley Hinkle, policy counsel at Fairplay, an organization “dedicated to ending marketing to children,” told OlxPraca that by mandating impact assessments, “big tech companies will be responsible for assessing the impact of their algorithms on children before they present a product or new design feature to the public.”

Hinkle continued, “This is important in shifting the responsibility for protection onto the digital platforms themselves, and away from families, who don’t have the time or the resources to decode endless pages of privacy policies and settings options.”

Under the law, companies cannot “collect, sell, share or retain” a young person’s information unless doing so is necessary for the app or platform to provide its service. The law directs businesses to “estimate the age of child users with a reasonable level of certainty,” or simply provide the same data protections to all users.

“You cannot profile a child or teenager by default, unless the business has adequate safeguards in place,” Steyer said. “And you cannot collect precise geolocation information by default.”

Hinkle explained the incentive for companies to collect such data: “Online platforms are designed to capture as much of a child’s time and attention as possible. The more data a platform collects about a child or teen, the more effectively it can target them with content and design features that keep them online.”

Although the scope of the law is limited to California, it may inspire broader reform: some companies changed their practices worldwide ahead of the UK Children’s Code taking effect. Instagram, for example, made teen accounts private by default and disabled direct messages between kids and adults they don’t follow. However, how Instagram defines “adult” varies by country: 18 in the UK and “some countries,” but 16 in other places around the world, according to the statement announcing the changes.

While it’s uncertain whether Instagram will now raise that age cutoff to 18 in California, the Age-Appropriate Design Code Act requires companies to take into account the “unique needs of different age ranges,” with developmental stages defined by the law: “0 to 5 years of age or ‘preliterate and early literacy,’ 6 to 9 years of age or ‘core primary school years,’ 10 to 12 years of age or ‘transition years,’ 13 to 15 years of age or ‘early teens,’ and 16 to 17 years of age or ‘approaching adulthood.’”

“Child development and social media are not optimally linked.”

– Devorah Heitner

What is the biggest risk to children online?

Some of the threats to children come from large, impersonal corporations that collect data to target them with advertising, or to profile them with content that may promote risky behaviors, such as disordered eating.

Other threats come from people your child knows in real life, or even from your child herself.

Devorah Heitner, author of “Screenwise: Helping Kids Thrive (and Survive) in Their Digital World,” told OlxPraca that besides interpersonal harm from people they know, such as cyberbullying, “there are ways they can compromise their own reputation.”

“What you share when you’re 12 years old can stay with you for a really long time,” Heitner explained.

While no law can prevent a child from posting something they probably shouldn’t, the Age-Appropriate Design Code Act requires that businesses “take into account the unique needs of different age ranges,” setting the precedent that children and adolescents are still developing, are different from adults, and require different considerations.

“Child development and social media are not optimally linked,” noted Heitner.

What can parents do now to protect their children’s privacy and safety?

Parents don’t have to wait for California’s new law to take effect, or for big tech companies to change their behavior. There are things you can do now to increase your child’s privacy and safety online.

Hinkle recommends keeping children off social media until at least age 13. To do this, she says, it can be helpful to talk to the parents of your child’s friends, since the presence of their peers is the biggest draw to social media for most children.

Once they have social media accounts, Hinkle recommends “reviewing the settings with your child, and explaining why you want the most protective settings.” These include turning off location data, choosing private accounts and disabling contact with strangers.

Heitner advocates an approach she calls “mentoring over monitoring.” Because safety settings can only do so much, and because kids are adept at finding workarounds, she says your best defense is an ongoing conversation with your child about their online habits and about how their actions can affect themselves and others.

Your children will see harmful content during their time online. You want them to feel comfortable telling you about it or, when appropriate, reporting it.

When it comes to evaluating their own behavior, children need to know that you are open to discussion and will not rush to judgment. Heitner suggests using phrases like, “I know you’re a good friend, but if you post that, it might not come across that way.”

Children should understand how their posts can be misinterpreted, and why they should think before posting, especially when they are angry.

It’s a delicate balance: respecting how important your child’s online life is to them, while also teaching them that social media “can make you feel terrible, and that [companies] are taking advantage of their time spent there,” Heitner said.

Parents should aim to educate children about these issues, and “instill in children a healthy skepticism” of big tech, Heitner said.

In addition to the resources available on Common Sense Media, Steyer recommends that parents take advantage of Apple’s privacy settings, which Common Sense Media helped produce.

He also suggested that parents be role models in their own media consumption.

“If you’re spending all your time [there] yourself, what message is that sending to your child?”

