2019 is a pivotal year for data privacy, safety, and internet security. While strides have been made to protect children from online dangers, there is still much work to be done. Large tech companies with massive footprints are struggling to put processes and policies in place that ensure the safety of young viewers.
The public and government are demanding more action to bring big tech into compliance and ensure that apps targeting children meet certain safety and privacy standards. Here’s what you need to know about the latest legislation and steps to protect our children.
COPPA, or the Children’s Online Privacy Protection Act, was passed in 1998 and instituted in 2000. It was created to restrict the way websites and online applications collect information in two main ways:
- Ensuring that websites and video streaming apps can’t collect personal information from children under 13 without their parents’ consent.
- Preventing websites and apps from sharing information about children outside of the website or app’s immediate organization without parental consent.
The law was also designed to keep any information received from children highly secure and completely confidential. Companies must also delete the information after a specified time period and give parents access to their children’s information upon request. Lastly, the law empowers parents to stop the collection or use of their children’s information if they choose.
While the legislation itself was a giant leap in the right direction when it comes to protecting the online privacy of children, tech giants continually fail to meet COPPA guidelines, and many remain in violation of the rules to this day.
It’s a challenging problem to solve, but instead of attempting to solve it and become compliant with COPPA, some companies are ignoring and abandoning the legislation altogether by changing their advertising standards. It all boils down to the almighty dollar.
Advertising Monetization Is Nearly Impossible to Regulate
Advertising is the main form of monetization that most video platforms use to sustain their business and gain profitability. The problem is that it is difficult to regulate which ads show for which content (and which audience). One major video platform recently announced that it plans to end targeted advertising on uploaded videos with content that is likely to be consumed by children.
Sounds great, right? But who determines which videos and what content are geared directly toward children? For example, a video about construction equipment or heavy machinery might appeal to adults and children alike.
There are plenty of gray areas that blur the lines between when to use targeted advertising and when to omit it because the content may be viewed by a young audience. These fuzzy boundaries will only hurt the community itself, curtailing monetization for content creators across the board, not just for kids’ content. Unless someone performs countless hours of manual review to know which ads are appearing where, advertising is simply not a viable option for kids’ video apps. Beyond that, advertising to kids promotes commercialism and can encourage various unwanted behaviors.
Taking Real Steps to Protect Young Viewers
So how do you know which kids’ video apps you can trust? Our team at Jellies put together a list of questions to ask before you download a video app, to ensure that your child’s information stays safe and that they are protected from inappropriate advertising tactics.
At Jellies, we never include advertising in our videos for many reasons, including:
- Safety and prevention of the wrong content appearing to kids
- Combating commercialism
- Preventing unwanted behaviors
COPPA and its continued evolution to keep up with changing technology are critical in the effort to protect young viewers. As a parent, you can help by conducting proper research before letting your child explore an unknown app or technology. Keeping our children safe is a team effort.