Many parents assume that YouTube Kids is a safe app for their children. Its content is, after all, supposed to be screened.
However, videos recently discovered by pediatrician Dr. Free Hess show that the screening isn’t working. In her research, Dr. Hess found videos, many related to or featuring the popular video game Minecraft, that depicted disturbing themes such as school shootings, self-harm, and even suicide. Others, not featuring Minecraft but still styled as child-friendly cartoons, glorified sexual exploitation, trafficking, domestic violence, and abuse. She later discussed these videos, and posted examples of them, on her parenting blog.
“There were so many that I had to stop recording,” Hess said.
YouTube issued a statement in response to Hess’ blog post saying, “We appreciate people drawing problematic content to our attention and make it possible for anyone to flag a video. Flagged videos are manually reviewed 24/7 and any videos that don’t belong in the app are removed.”
The main problem with that, of course, is that many parents don’t police what their children are watching because they believe the content on the app is safe. And many children may not know how, or may not feel comfortable enough, to tell their parents when they encounter disturbing content.
“There is this disconnect between what kids know about technology and what their parents know because the parents didn’t grow up with it,” Hess says. “The kids are the digital natives and the parents are digital immigrants.”
Parents need to be more careful about screening content themselves before allowing their children on the app.
“Once someone reports it, it’s too late because a kid has already seen it,” Hess points out.
Jenn Bentley is a writer and editor originally from Cadiz, Kentucky. Her writing has been featured in publications such as The Examiner, The High Tech Society, FansShare, Yahoo News, and others. When she’s not writing or editing, Jenn spends her time raising money for Extra Life and advocating for autism awareness.