Positive Changes That Affect How Kids Experience the Internet
- Editor OGN Daily
- Dec 2, 2025
- 4 min read
When we were young, we played outside all day long. Today? Not so much. Today’s ‘outside’ has turned into ‘online’. And even though we know this isn’t the best approach, given all the brain rot circulating online, we still allow it.

But what can you do, really? Their friends spend a lot of time online, so your kids don’t want to be left out. While 10 years ago this would have sounded like science fiction, today things like games, social media, video apps, and even digital classrooms are all considered ‘normal’. But while they ARE normal, they introduce plenty of risks, because for years safety measures lagged far behind the technology. Thankfully, this has started to change, and the internet is a safer place for kids than it was just a couple of years ago, thanks to new laws and regulations, closer oversight, pressure from families and educators, and clearer responsibilities for the companies running these online platforms.
You won’t see predictions or opinions here, just cold, hard facts on the good things happening RIGHT NOW that keep our kids safe every time they venture online.
What’s Changed (for the Better)?
Things didn’t magically get safer. What happened is that they’ve gotten more structured. We can’t live off promises. We need results. And when it comes to our children, those results come in the form of MUCH clearer rules and regulations and shared responsibility.
Kid-Friendly Designed Platforms: With new platforms, apps, and games coming out on a monthly basis, developers need to design with kids in mind. Not as a ‘potential’ demographic; it needs to be presumed that kids WILL use the apps. Because, realistically, they will. 100 percent. In practice, this means age-appropriate design, where every platform sets safer defaults for children. This means:
Privacy settings are on by default (they can only be disabled manually, and only by users 18+);
Design patterns intended to hook users (e.g., infinite scrolling, variable rewards and notifications, streak mechanics, loot boxes, randomized in-app rewards, time-spent obfuscation, bright colors, loud noises, etc.) are restricted on underage accounts;
Communication limits (no DMs, no voice chat, no clickable links, etc.);
Clear time spent/played visibility;
No credit card linking.
Safety is part of the design; it can’t be added later or considered as a ‘bonus’ feature.
Mandated Laws: Voluntary guidelines? No way! That might have passed before, but not anymore. Much of the world has addressed this by implementing laws such as COPPA in the U.S., GDPR-K in the EU, the UK’s Online Safety Act and Age Appropriate Design Code, Australia’s Online Safety Act, Canada’s PIPEDA, Brazil’s LGPD, and the UN Convention on the Rights of the Child (CRC). These laws are just some examples of how seriously this is being taken; and it should be. That pressure has forced development companies to treat child safety as a primary requirement, not a mere suggestion.
More Control For Parents: Now you get more help built directly into the apps and devices kids use. A lot of platforms have simple dashboards where you can see screen time, block certain content, and limit who your child can talk to. You can find controls like these on phones and computers, too. Platforms have to clearly explain all these tools so you don’t have to guess what random settings do.
Easy-To-Use Report System: You don’t have to jump through hoops to report serious problems. Now, platforms have clearer ways to report abuse that involves kids, and those reports are then handled by larger moderation teams with set steps on how to respond.
When it comes to minors, companies have to act quickly. And it’s not because they want to (although, hopefully, they do), but because regulators say they don’t have any other choice.
Take the Roblox sexual abuse lawsuit updates, for example: a number of federal and state cases allege that Roblox and connected platforms (e.g., Discord) failed massively to protect minors (our kids) from grooming, predators, and exploitation because of lax safety measures (e.g., weak moderation, missing age verification, unsafe chat features, etc.). We need to highlight such cases so that laws are enforced against such companies. Kids under 13 make up approximately 50 percent of the entire Roblox playerbase. We need them protected.
Kids See Less Adult Content by Default: Safer defaults make a huge difference. A lot of platforms now automatically block adult content for younger users, no extra setup required. Direct messaging and public chat features are often limited or completely turned off for kids.
If you’re a parent, you’ll really appreciate these changes because they reduce the chance of kids being accidentally exposed to content that could harm them. Without this, it’s up to the kids to make the right choices, and with how curious they are, you know that’s pretty much impossible.
Transparency: Companies are now expected to show how they handle safety. You’ll see a lot of them publishing public reports that explain how they moderate content and how they handle reports that involve kids. Some places even require independent reviews. This makes it easier to see if platforms are actually following rules or if they just claim to follow them.
Schools Have Become Part of Online Safety: Online safety has been handled at home up until recently, but not anymore. Now, schools in many countries teach digital literacy and online safety as part of regular education. Governments support these programs, and schools often work with nonprofits and regulators to keep the curriculum up to date. This means that kids are learning how to handle online spaces safely as part of everyday life; it’s no longer something only mom and dad tell them they have to do.
Conclusion
Our kids spend more and more of their time online. Luckily, there’s good news. And the good news is that there’s a whole lot more structure around the spaces they hang out in. You used to cross your fingers and hope that platforms would be responsible, but that’s no longer the case. Today, they don’t have any other choice; if they don’t follow the rules, they face serious consequences.
The biggest change is in accountability. Online platforms have to explain themselves and respond faster, meaning that if there are any gaps or risks, they’re mandated to fix them ASAP or risk massive fines. We’re glad that profits are no longer being put ahead of safety.



