
Learn About Algorithms and Recommendation Systems

Let's learn about...

Algorithms, recommendation systems, and personalised feeds shape a lot of what young people see online. They can help tamariki and rangatahi discover useful, relevant content, but they can also make online experiences feel narrower or harder to step away from. You do not need to be a tech expert to support your child.

In a nutshell

Algorithms decide what shows up first, and recommendation systems suggest what to watch, read, buy, or click on next. They learn from what a young person watches, skips, clicks, searches for, and follows, which is why two people can use the same app and see very different things.

This creates a feedback loop: the more someone watches, pauses, clicks, or searches, the more similar content they are likely to see. Over time, continued use of the same account can make recommendations feel “sticky”.

The goal is usually to keep attention, not to decide what is healthiest or most useful. This is part of the attention economy, where platforms are often designed to hold attention for longer.


5-minute whānau safety check

Use this as a quick starting point.

  • We understand that feeds and recommendations are not random.
  • We have noticed what kinds of things keep showing up lately.
  • We have talked about how repeated content can affect mood, pressure, or perspective.
  • My young person knows they can tell me if something online feels off, upsetting, or hard to avoid.
  • We are keeping the conversation curious, not judgmental.

What to expect

You will come across algorithms and recommendation systems in most online services: ‘For You’ and ‘Explore’ pages, streaming suggestions, search results, related content, shopping recommendations, video platforms, games, and online communities.

The first signs are often subtle. Parents may only notice them once one topic seems to take over, scrolling becomes harder to stop, or mood shifts appear after time online. You might hear this described as a feed algorithm shaping what shows up, or a recommender system deciding what comes next.

You might notice:

  • one topic, creator, look, product, or opinion showing up again and again.
  • stronger “For You”, “Suggested”, or recommended content.
  • more “just one more” scrolling.
  • frustration when asked to switch off.
  • mood shifts after certain apps or platforms.
  • content that feels hard to get away from.

These changes often build slowly. Noticing them early can help before they turn into conflict.

Algorithms are not magic. They respond to patterns, and patterns can be changed.

What's the up-side?

How can algorithms and recommendations support young people online?

Discovery

Finding hobbies, interests, music, creators, communities, and ideas they might not have searched for on their own.

Learning

Coming across useful tips, tutorials, information, and inspiration.

Connection

Seeing content that reflects their interests, humour, identity, or communities.

Downtime

Finding funny, relaxing, or entertaining content that helps them unwind.

What's the flip-side?

Recommendations can shape not just what young people enjoy, but what they think is common, true, or expected. That can matter for things like body image, buying pressure, opinions, and exposure to extreme or repetitive content.

More of the same

When similar content keeps appearing, it can narrow what feels normal, popular, or worth paying attention to, especially when repeated search results, suggested videos, or trending posts make one view seem more common than it is.

This can create an echo chamber, where a feed starts to feel narrow, repetitive, or more extreme than real life. You might also hear this called a filter bubble, where algorithms keep serving similar content and make it harder to come across different viewpoints.

Harder-to-avoid content

A young person might see more of a topic than they meant to, especially if they pause on it, watch it, or click on it once. Young people sometimes describe this as going down a rabbit hole, where one piece of content leads to more and more of the same.

Comparison and pressure

Repeated content about appearance, popularity, money, success, or lifestyle can affect how young people feel about themselves and what they think they should want.

Ads that blend in

Sponsored content, influencer promotions, and product recommendations do not always look like ads. Some content appears because it is promoted or designed to hold attention. This can include targeted ads or personalised advertising, where advertising and product suggestions are matched to a person’s activity, interests, or behaviour online.

Sticky design

Autoplay, endless scrolling, and stacked suggestions can make it harder to stop on purpose. These design choices can strengthen the feedback loop and keep young people engaged for longer, even when they meant to log off.

“Once you start watching videos, the algorithm kicks in - where if you’re watching a lot of the same type of videos and all of that they’ll start suggesting you more. That could send you on a spiral and just make you start watching more of the same type of content.”

Male, 16, Māori

Digital Reflections – The online experience and its influence on youth body image 2024


Safety Check

Try this together on one online account, app, or platform for one minute.

  • Look at the first five suggestions, results, or recommendations.
  • Ask: “What do these seem to want from me — attention, clicks, or a reaction?”
  • Notice whether one topic, feeling, or message shows up more than once.
  • Point out anything that looks sponsored, pushy, or designed to keep you scrolling.
  • If the feed feels off, agree to pause and talk about it.

Top Tips

What helps most is noticing patterns and keeping the conversation open.

Here are some ways you can engage your whānau about algorithms and recommendation systems.

You do not need to know everything. Ask your young person to show you what an app or platform shows as recommendations or suggested content for them right now.

You might say:

  • “What kinds of things keep coming up for you lately?”
  • “Why do you think the app is showing you this?”
  • “Do you think your friend would see the same thing?”

Talking about algorithms does not need to be one big formal conversation.

You can:

  • scroll together sometimes
  • notice when something looks repeated or heavily promoted
  • notice when search, shopping, or streaming keeps pushing the same thing
  • mention your own experience of getting pulled into recommended content
  • talk about how different apps keep suggesting certain things.

This helps make algorithms feel less mysterious and easier to talk about.

Some young people find it easier to make sense of recommendations when an adult notices alongside them.

You can:

  • compare what different family members get shown in search, shopping, or suggested videos
  • ask what seems to come up the most
  • notice whether certain topics feel hard to avoid
  • talk about whether the content feels helpful, annoying, pressure-filled, or too much.

Try not to frame algorithms as all bad, or your child as passive or careless.

It helps to acknowledge both sides:

  • recommendations can be fun, useful, and interesting
  • they can also repeat pressure, comparison, or upsetting content.

When young people feel heard rather than judged, they are more likely to talk honestly about what they are seeing.

Need help right now?

If you would like advice or support about keeping your whānau safe online, Netsafe can help.

Contact the helpline for free, confidential and non-judgmental advice and support.

Contact Netsafe
