Protecting kids from violent images on family-friendly video app
Posted April 26
Lisa Echols is always on the go. As a mom of four young boys, including a newborn, she has her hands full.
"It's a lot of work. It's fun," she says with a laugh.
Her oldest son is a kindergartener and loves watching Pokemon videos online.
The Echols family used to let their kids watch regular YouTube, but because anyone can post videos that may not be kid-friendly, she grew concerned.
"They were watching YouTube and some questionable things would pop up," Echols said.
When we searched the site using the phrase "kids videos," it wasn't tough to find the stuff bad dreams are made of: cartoon images of sharks circling a child's bedroom at night, a bed that harbors scary-looking eyeballs, a killer clown rising from the sea.
The team also searched for the popular children's character Peppa Pig and found violent spoofs. So instead of clicking on the kid-friendly Peppa, your child might innocently click on a version that shows characters getting stabbed or shot.
It's why Echols kicked YouTube to the curb and downloaded the YouTube Kids app.
The app uses an algorithm to screen for videos geared toward kids, making it a safer version of YouTube.
"For me, I've been happy with what I've seen," Echols said.
But not all moms have had the same experience. Last year, Ohio mom Beth Brister Kaster took her disapproval online, reposting a video she'd found on the app.
"It's normal, then it changes to something awful," she said.
The video cuts to characters shooting each other in the head.
We searched the app and found plenty of kid-friendly videos, but also saw some that depicted acts of violence or language parents may consider inappropriate for their child.
Online safety expert Josie Angerhofer has heard concerns from parents.
"Somebody had taken a Sesame Street video and turned it into lots of swear words," said Angerhofer, who is the regional director for Utah NetSmartz. "People are trying to freak kids out."
She says questionable content can slip through. Although YouTube Kids uses an algorithm to block inappropriate videos, she explained no filter is perfect.
"There are some things that are going to make it through that filter," Angerhofer said.
YouTube's statement on its Kids app says:
We work to make the videos in YouTube Kids as family-friendly as possible and take feedback very seriously. We appreciate people drawing problematic content to our attention, and make it easy for anyone to flag a video. Flagged videos are manually reviewed 24/7 and any videos that don't belong in the app are removed within hours. For parents who want a more restricted experience, we recommend that they turn off the Search feature in the app.
Concerned parents should create a YouTube account and sign in, which will allow them to control the app's settings.
Be sure to turn off the search option in YouTube Kids and lock the setting with a secret passcode. That will keep kids from searching for new videos without adult permission.
With the search function deactivated, the app won't recommend new videos for a child to click on and watch.
Angerhofer recommends that parents have ongoing conversations with their children and advise kids to speak up immediately if they see something that's not right. She says parents can tap the three dots in the corner of a video to report inappropriate content.
YouTube offers an online guide that can help parents set controls.