Meta’s restrictions could be just the start of changes in digital marketing to teens
By Ciaran Deering, Head of Online
Bowing to pressure over how advertisers can target young people on Facebook and Instagram, Meta has just introduced new measures that restrict brands’ access to teens’ data and give users tools to better manage the types of ads they see on its platforms. In 2021, Meta removed advertisers’ ability to target teens based on their interests and activities, and from the beginning of this month it has added gender to these restrictions.
These changes will inevitably shift some ad revenue to Meta’s competitors TikTok and Snapchat. But all platforms may eventually have to introduce similar restrictions on targeting teens, which could see brands shifting spend into the far less regulated arena of content marketing.
The changes introduced this month come amid heightened concerns about young people’s exposure to harmful online content and messaging, and against the backdrop of the Online Safety Bill, which is currently being finalised and will make social media firms legally required to abide by their own terms and conditions.
Meta’s new restrictions mean that, in most countries, the only attributes marketers can use to target 13–17-year-olds are location and age. Teens can continue to choose to hide any or all ads from a specific advertiser. The topics already restricted in Meta’s policies will be set to ‘see less’ by default, so teens can’t opt in to content that may not be age appropriate. Meta has also committed to providing documentation designed to inform teens how their data is being used for advertising purposes.
Facebook has long trended towards an older user base but has retained younger audiences via Instagram. Brands will inevitably now seek social media options outside of Meta, and TikTok and Snapchat are obvious choices. Both have high reach and coverage of teen audiences – Ofcom reports that 44% of eight- to 12-year-olds use TikTok and 42% of UK children aged three to 17 use Snapchat.
TikTok and Snapchat’s advertising propositions are similar in nature to Facebook’s. Both offer algorithms that optimise campaigns for reach, traffic, app installs, conversions and so on. Both accommodate image and video ads as their basic offering, with other formats available, such as collection ads – most frequently used by e-commerce retailers – as well as options to boost organic content. So the transition for advertisers that already have Facebook-ready assets is relatively straightforward.
Snapchat and TikTok have also recently stepped up their marketing efforts to attract SME brands that target teens. Both have created relevant materials, tools and case studies to make it easier for this large cohort of potential advertisers to use their platforms.
Amazon-owned Twitch, the video streaming platform for gamers, is also an option for brands targeting younger audiences. Numerous video, carousel and display ad formats are currently available, which would make a transition to Twitch similarly easy.
Advertisers may also look to networked gaming platforms such as Roblox, which attracts sizeable younger audiences. Late last year Roblox announced it would partner with select brands to allow advertising across multiple games on its platform. Roblox’s move towards ‘immersive’ ads is well timed and seeks to diversify a business that until now has relied on in-game transactions as its main source of revenue.
There are currently no signs that other social platforms will immediately follow Meta’s lead in disallowing gender-based targeting; TikTok and Snapchat continue to offer it. Arguably, Meta’s changes were a result of the intense regulatory scrutiny it has been under globally. TikTok and Snapchat are smaller and unlikely to make changes in this regard unless they’re forced to.
However, the pressure will be on for both platforms and advertisers to take greater responsibility and control when targeting teens. And this is becoming increasingly hard to manage: according to a 2022 report from Ofcom, a third of children aged eight to 17 with a social media profile have an adult user age, having signed up with a false date of birth. Despite most platforms having a minimum age of 13, the research suggests that six in 10 children aged eight to 12 who use these platforms are signed up with their own profile.
The industry is likely to see a growing move towards ‘age assurance’, which encompasses a range of techniques designed to prevent children from accessing adult and harmful content, ranging from self-declaration and hard identifiers such as passports to AI- and biometric-based systems. With the Online Safety Bill, platforms will inevitably have to take greater responsibility for age assurance. This has to be a good thing, both in terms of protecting young people from inappropriate content and ensuring advertisers have accurate audience data.
Ultimately, controls and scrutiny in advertising to teens are the right way forward. But it’s important to remember that the majority of the negative messaging teens see lies outside advertising, in the huge arena of online content. If the likes of TikTok and Snapchat do eventually follow Meta’s lead, the restrictions may make advertising to teens unviable, forcing brands further into online content. And as we are seeing with the Online Safety Bill, content is much harder to regulate.
This article first appeared in New Digital Age.