alecm<p><strong>The Guardian uses — and then hides/recants — photos of schoolgirl in article targeting fearful parents; perhaps we need a better conversation?</strong></p><p>If, like me, you’re a parent, there’s a good chance that you woke up on Saturday to outraged fellow parents on various group chats, <a href="https://archive.ph/cfpjw" rel="nofollow noopener" target="_blank">sharing a story from the Guardian</a>:</p><blockquote><p>Exclusive: Instagram pictures of girls as young as 13 were posted to promote Threads site ‘as bait’, campaigner says</p></blockquote><p>…with a very bait-like photograph immediately beneath / as part of the share-card, which I strongly suspect contributed to the article’s viral appeal:</p><p>The article cites an anonymous man — identified only as the “Campaigner” or “Recipient” or “Instagram user” — <a href="https://archive.ph/cfpjw" rel="nofollow noopener" target="_blank">about whom (for the moment, ignoring the rest of the article) we learn</a>:</p><blockquote><p>Meta has used back-to-school pictures of schoolgirls to advertise one of its social media platforms to a 37-year-old man, in a move parents described as “outrageous” and “upsetting”. </p><p>The man noticed that posts encouraging him to “get Threads” […] were being dropped into his Instagram feed featuring embedded posts of uniformed girls as young as 13 with their faces visible and, in most cases, their names. […]</p><p>The recipient told the Guardian the posts felt “deliberately provocative and ultimately exploitative of the children and families involved”. […]</p><p>The man who received the posts said that as he was only sent promotional posts of schoolgirls – there were no boys in school uniform, for example – there appeared to be “an aspect of sexualisation”. […]</p><p>The 37-year-old Instagram user from London who received the posts and asked to remain anonymous said: “Over several days I was repeatedly served Meta adverts for Threads that exclusively featured parents’ images of their daughters in school uniform, some revealing their names.
As a father, I find it deeply inappropriate for Meta to repurpose these posts in targeted promotion to adults.”</p><p>He said he had not posted or liked any similar images before he was sent the schoolgirl pictures.</p><p>“To me, showcasing such content as trending or popular feels deliberately provocative and ultimately exploitative of the children and families involved, putting their online safety at risk.”</p></blockquote>
<p><strong>So what’s wrong here?</strong></p><p>First up: I have to give credit to the journalist, Robert Booth, for at least attempting (<em>“any similar images”</em>) to recognise that <strong>IF YOU CLICK ON STUFF THEN THE ALGORITHM WILL TRY TO SERVE YOU MORE OF THE SAME STUFF.</strong></p>
<p>This is why you get so many people, hooked in a hate/love outrage loop, saying: <em>“My feed is nothing but fascism and I hate it, I keep on clicking on the links and checking and yes, social media is serving me nothing but evil content!”</em> — the solution to which is (a) <em>don’t click on the links</em>, and (b) find a pulldown menu option saying something like <em>“Show Me Less Of This”</em> (or, <a href="https://alecmuffett.com/article/114576" rel="nofollow noopener" target="_blank"><em>“Show Me More Of This”</em></a>), which should be the hallmark of a good algorithmic feed-based platform.</p>
<p>What Robert / the Campaigner presumably <em>don’t</em> consider — and certainly don’t mention — is that this algorithmic boosting happens <strong>in real time</strong>, so much so that you can be scrolling sideways in a gallery and it will be working out what to show you next on the basis of what you engaged with only moments before, caching its decisions temporarily (until you scroll away for an extended period) in order to maintain consistency.</p>
<p>This is less a matter of long-term profiling than <em>“if you’ve engaged with this, then maybe you will engage with that”</em>, which — like it or not — is a standard sales tactic around the world: <em>would you like to supersize that choice</em>? (A toy sketch of this loop follows below.)</p>
<p>So my personal feeling is that if the Campaigner is the sort of person to click on stuff that outrages him, then he’s going to get a lot more outraged, and that’s kind of where everyone is at the moment.</p>
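<p>To make that concrete, here is a minimal sketch, in Python, of how an <em>“engage with this, get served more of that”</em> loop with short-lived caching can work. It is entirely my own illustration (the parameter, topic labels, and function names are all invented), not anything we know about Meta’s actual systems:</p>
<pre><code>import time

# Hypothetical parameter: how long a recent engagement keeps boosting
# similar content before the effect fades, i.e. the temporary cache.
ENGAGEMENT_TTL_SECS = 600

# topic -> timestamp of the user's most recent click / linger / expand
recent_engagements = {}

def record_engagement(topic):
    """Note, in real time, that the user just engaged with this topic."""
    recent_engagements[topic] = time.time()

def score(post):
    """Boost any candidate resembling something engaged with moments ago."""
    base = post["popularity"]
    engaged_at = recent_engagements.get(post["topic"])
    if engaged_at is not None and engaged_at + ENGAGEMENT_TTL_SECS > time.time():
        base *= 5  # "would you like to supersize that choice?"
    return base

def next_recommendations(candidates, n=3):
    """Re-rank the candidate pool on every scroll, using the cache above."""
    return sorted(candidates, key=score, reverse=True)[:n]

# One tap on a back-to-school photo...
record_engagement("school-uniform-photos")
# ...and on the very next scroll, similar posts outrank everything else.</code></pre>
<p>Note that no long-term dossier is needed: the only state that matters is what was engaged with moments ago, held briefly and then discarded.</p>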
<p>Secondly: Meta <em>are</em> telling the truth. The images are public, and under the terms on which people use Instagram & Threads, they may be used in this way.</p>
<p>If you don’t like it, make the images private and share them only amongst friends.</p>
<p>So… let’s dig into <em>privacy</em> for a bit. Back to the article:</p>
<p><strong>Privacy versus Unstated Expectations</strong></p><p>Meta and all the other big platforms have recently been roasted by the Online Safety Act, causing them to lock down access to children’s accounts — or at least that’s what politicians want you to believe; in truth, a lot of the constraints that keep kids safe online are years, if not more than a decade, old.</p><p>But what if it’s the parent who posts the photograph?</p>
<p>So I was <a href="https://archive.ph/cfpjw" rel="nofollow noopener" target="_blank">reading the Guardian article</a> on my Android phone, and out of curiosity I tapped and held down the big central button — thereby launching <em>Google Lens</em> and <em>Google Image Search</em> — and circled the image with which the Guardian had chosen to illustrate the article.</p>
<p>This immediately (<em>“Images Similar To This”</em>) led me to mother Sarah Dart’s <a href="https://www.threads.com/@sdart23/post/DOORf0BiOn8" rel="nofollow noopener" target="_blank">posting on Instagram — which is public and visible to the world</a> — with approving comments from (presumably) her circle of friends:</p><p>…although with immediate, critical and crude <em>“don’t post private pictures”</em> commentary on the related automatic reposting to Threads.</p><p>If we pick through the article, focusing on commentary that is probably attributable to Sarah:</p><blockquote><p>[Image Caption] The mother whose Instagram picture of her 15-year-old daughter was used in a post advertising Threads said she had ‘no idea’ it would be used as a promotion.</p><p>[The mother] said she posted the picture to a public Instagram account. The posts of their children were highlighted to the stranger as “suggested threads”. […]</p><p>The mother of a 15-year-old whose picture was used in a promotional post that featured a large “Get Threads” button said: “For me it was a picture of my daughter going to school. I had no idea Instagram had picked it up and are using it as a promotion. It’s absolutely disgusting. She is a minor.”</p><p>She said she would have refused consent and “not for any money in the world would I let them use a girl dressed in a school uniform to get people on to [its platform]”.</p><p>With 267 followers, her Instagram account usually had modest reach but the post of her child attracted nearly 7,000 views, 90% from non-followers, half of whom were aged over 44 and 90% of whom were men.</p></blockquote>
<p>My perspective is that Sarah’s wants are absolutely fair, but it seems odd to me that she posted a photo — <em>one of such quality that for some reason the Guardian photo editor picked it up to illustrate the story</em> — and shared it with the world rather than just with a circle of friends.</p>
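<p>Taking the Guardian’s own audience figures at face value, the arithmetic is worth spelling out: 90% of nearly 7,000 views is roughly 6,300 strangers, and if 90% of those were men, that is on the order of 5,700 unknown men, against a usual audience of 267 followers: an amplification of more than 25×. That is what posting publicly to a platform built for distribution means in practice.</p>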
<p>Sarah has had a Threads account since November 2024, but it has little content, so it could be that she forgot her Threads account exists and that this was essentially an unpleasant surprise… but <a href="https://www.instagram.com/sdart23/p/DOOReZjCBQl/" rel="nofollow noopener" target="_blank">the photo is still public on Instagram</a> and it’s <a href="https://www.instagram.com/explore/locations/218182521/exeter-devon/" rel="nofollow noopener" target="_blank">tagged with the location of Exeter</a> — so anyone could stumble across it by that means, too.</p>
<p>One of the other things you find from Sarah’s Threads account is that Robert created his own and reached out to her:</p><p>…which — reading his other posts — means that you can find the names of all the other people he reached out to:</p>
<p>Let’s ignore the joint failures of operational security and of source protection — the Guardian are <a href="https://www.theguardian.com/world/2013/jun/09/edward-snowden-nsa-whistleblower-surveillance" rel="nofollow noopener" target="_blank">generally pretty good</a> <a href="https://www.theguardian.com/help/insideguardian/2022/may/30/guardian-launches-tor-onion-service" rel="nofollow noopener" target="_blank">at that stuff</a>; maybe Robert just needs a refresher — but these messages lead us to Jade, Lau, and Katie, who all have private Instagram accounts, some of whom actively use their Threads accounts for typical purposes, and one or more of whom (or their partners) are cited:</p><blockquote><p>One mother said her account was set to private, but the posts were automatically cross-posting to Threads where they were visible. […]</p><p> The father of a 13-year-old who appeared in one of the posts said it was “absolutely outrageous”. The images were all of schoolgirls in short skirts with either bare legs or stockings.</p><p>“When I found out an image of her has been exploited in what felt like a sexualised way by a massive company like that to market their product it left me feeling quite disgusted,” he said. […]</p><p>Another mother whose post of her 13-year-old was used in a promotional post said: “Meta did all of this on purpose, not informing us, as they want to generate content. It’s despicable. And who is responsible for creating that Threads ad using children’s photos to promote the platform for older men?”</p></blockquote>
<p>It’s interesting to observe that the parents have apparently been approached by Robert with the frame <em>“Threads [is] using children’s photos to promote the platform for older men”</em>, as if this were an intentional marketing tactic, as opposed to <em>“Some Guy Clicked On A Schoolgirl Photo And Was Served More Of The Same”</em> — which would be less outrageous.</p>
<p>What I <em>would</em> strongly criticise Meta for is not nagging people enough about the existence of crossposting between Instagram, Threads and Facebook itself.</p>
<p>As a heavy Threads user I get frequent popups along the lines of:</p><blockquote><p><em>Do You Wish To Keep Sharing Your Threads Posts Into The [Mastodon, ActivityPub] Fediverse?</em></p></blockquote><p>…but there’s far less (some, but <em>far less</em>) of this for mutual sharing between Meta-owned platforms — <em>Instagram to Threads, Instagram to Facebook</em> — probably because it’s not such a significant unaddressed legal risk when posts are automatically re-shared <em>within</em> the Meta ecosystem. (A toy sketch of the kind of reminder policy I have in mind follows below.)</p>
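<p>To be clear about what I’m asking for, here is a small sketch (the cadences, route names, and function are hypothetical, my own invention, not any real Meta setting) of reminder logic that re-prompts the user periodically, and more aggressively the further an automatic reshare travels from where they actually post:</p>
<pre><code>import time

DAY = 86_400  # seconds

# Hypothetical cadences: nag hardest where posts leave the ecosystem,
# but still nag periodically for resharing within it.
REPROMPT_INTERVAL_SECS = {
    "instagram->threads":  30 * DAY,  # within the Meta ecosystem
    "instagram->facebook": 30 * DAY,
    "threads->fediverse":   7 * DAY,  # leaving Meta entirely
}

def should_reprompt(route, last_confirmed, now=None):
    """True if the user should be re-asked before auto-resharing via `route`."""
    now = time.time() if now is None else now
    interval = REPROMPT_INTERVAL_SECS.get(route, 7 * DAY)  # unknown route: nag often
    return now - last_confirmed >= interval

# A user who last confirmed Instagram-to-Threads resharing 90 days ago
# would be asked again before the next automatic repost:
print(should_reprompt("instagram->threads", time.time() - 90 * DAY))  # True</code></pre>
<p>The design point is simply that consent to automatic resharing should decay and be refreshed, rather than being granted once and then forgotten — which appears to be exactly what happened here.</p>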
<p>But Robert (with his photo editor) is not making that criticism; worse, I feel that his whole article is actually framed to convey a subtext along the parodical lines of:</p><blockquote><p><em>Exclusive! Meta are actively attempting to grow Threads readership and engagement by attracting pervy men via posting sexualised pictures of young girls dressed in school uniform in short skirts with either bare legs or stockings. Photos like, for example, this one: [illustration]</em></p></blockquote>
<p>That the Guardian have slightly recognised this hypocrisy <em>may</em> be illustrated by the fact that they <a href="https://archive.ph/u40S8" rel="nofollow noopener" target="_blank">subsequently changed the headline image to a stock photo</a>; however, this may also be a belated attempt to preserve the privacy of Sarah and her daughter — a young woman who has her own, private Instagram account.</p>
<p>As ever, Baroness Kidron gets a shout-out:</p><blockquote><p>Beeban Kidron, a crossbench peer and campaigner for children’s rights online, said: “Offering up school-age girls as bait to advertise a commercial service is a new low even for Meta.</p><p>“At every opportunity Meta privileges profit over safety, and company growth over children’s right to privacy. It is the only reason that they could think it appropriate to send pictures of schoolgirls to a 37-year-old man – as bait – Meta is a wilfully careless company.”</p><p>She called on the regulator Ofcom to consider if measures, introduced this summer to prevent unknown adults connecting to children, make clear that “companies cannot offer sexualised images of children as bait to unknown men”.</p></blockquote>
<p>…but, but, but, the photos were <em>posted by the parents</em>, and the “Campaigner” is the one who “sexualised” them:</p><blockquote><p>[he] told the Guardian the posts felt “deliberately provocative and ultimately exploitative” […] there appeared to be “an aspect of sexualisation”</p></blockquote>
<p>Who is the wilfully careless person in this scenario, and who is the wilfully sexualising one? Not the parents? Not the guy who is so offended that he has to keep clicking through to establish (quote) <em>“parents’ images of their daughters in school uniform, some revealing their names”</em>?</p>
<p><strong>Lessons</strong></p><p>Trying to wrap this up:</p><ol><li>If parents post pictures of their kids, publicly, to platforms whose primary purpose is to share images widely… then these photos, posted from parental (i.e. <em>adult</em>) accounts, will be shared widely and without the protections that would have been applied had the child posted their own stuff around their own circle.</li><li>Expecting pictures that are posted to such platforms to remain somehow “private” on the grounds that (paraphrase) <em>“not many people follow or are interested in me”</em> is not merely unrealistic, it’s antithetical to the entire purpose of such platforms.
Some experts refer to this as <a href="https://en.wikipedia.org/wiki/Context_collapse" rel="nofollow noopener" target="_blank"><em>“context collapse”</em></a>, but more simply it’s <a href="https://www.youtube.com/watch?v=ewi_KNMs2rg" rel="nofollow noopener" target="_blank">living distractedly in your own little world until you walk into a tree</a>.</li><li>If you want something to be private, make/keep it private. That’s what privacy means.</li><li><em>Example:</em> If parents want to share photos privately amongst relatives, then use an end-to-end encrypted messenger tool such as WhatsApp or Signal, and turn on disappearing messages so that the pictures are trashed after a reasonable time unless actively saved.</li><li>Ironically, the Online Safety Act means that parents posting pictures of their kids is in some ways riskier than kids sharing photos amongst themselves: the parents have fewer guardrails and less awareness than the kids themselves, and then they blame their own choices on the platforms.</li><li>That said, platforms do need to remind people about federation and automated resharing more often, because sometimes people forget, or fail to understand.</li><li>The question of whether <em>“all public content”</em> on Instagram should be “fair game” for inclusion in “recommendation boxes” is a thorny one: this is not like TikTok, where people use the “#fyp” tag to compete for the privilege of being recommended; but recommendations may also be “from friends” rather than merely “popular”, so demanding that content be nominated by its posters for inclusion may undermine the whole point of the recommendation mechanism. (A toy illustration of the two policies follows after this list.)</li><li>Absolutely nobody is talking about the good things that the Internet does any more — connecting family, friends, people; all of that is washed away in fear of abuse and exploitation.</li><li>The Guardian have been derelict: they should not have been posting the parents’ pictures of the kids — THEY SHOULD HAVE BEEN POSTING SCREENCAPS OF THE ADVERTS DEEMED AT FAULT — so that we could judge the pictures in their proper context and sense how (or: if) they were targeted. Reporting hearsay generated by an outraged “Campaigner” who has been clicking on links is not worthy of coverage.</li></ol>
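<p>On that seventh point, the two eligibility policies can be contrasted in a few lines; the field names are hypothetical, and this is purely my own illustration of the tension:</p>
<pre><code># Opt-in, TikTok-style: posters nominate themselves for recommendation.
def eligible_optin(post):
    return "#fyp" in post["tags"]

# Everything-public, Instagram-style: any public post is fair game,
# which is how a parent's back-to-school photo ends up in a stranger's feed.
def eligible_public(post):
    return post["visibility"] == "public"</code></pre>
<p>The opt-in model protects the inadvertent sharer but guts “from friends” recommendations; the everything-public model powers discovery but produces exactly the surprises described in this story.</p>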
<p><strong>Experiment</strong></p><p>As an experiment, as a parent, and subsequent to my clicking around the Guardian article for this investigation, I scrolled down my Instagram feed to find my Threads recommendations, and what did I find?</p><p>…and if you click through, you find a Threads posting with 4 million views and some savage child-protection commentary from Threads users:</p><p>How are you, dear reader, going to feel about this?</p>
<p><em>“stop posting your minor children on the internet?”</em></p><p><strong>Elsewhere</strong></p><p>I’m not the only person discussing this:</p><p><a href="https://www.reddit.com/r/news/comments/1nlrlwm/parents_outraged_as_meta_uses_photos_of/" rel="nofollow noopener" target="_blank">https://www.reddit.com/r/news/comments/1nlrlwm/parents_outraged_as_meta_uses_photos_of/</a></p><p><a rel="nofollow noopener" class="hashtag u-tag u-category" href="https://alecmuffett.com/article/tag/child-safety" target="_blank">#childSafety</a> <a rel="nofollow noopener" class="hashtag u-tag u-category" href="https://alecmuffett.com/article/tag/guardian" target="_blank">#guardian</a> <a rel="nofollow noopener" class="hashtag u-tag u-category" href="https://alecmuffett.com/article/tag/instagram" target="_blank">#instagram</a> <a rel="nofollow noopener" class="hashtag u-tag u-category" href="https://alecmuffett.com/article/tag/meta" target="_blank">#meta</a> <a rel="nofollow noopener" class="hashtag u-tag u-category" href="https://alecmuffett.com/article/tag/online-safety" target="_blank">#onlineSafety</a> <a rel="nofollow noopener" class="hashtag u-tag u-category" href="https://alecmuffett.com/article/tag/online-safety-act" target="_blank">#onlineSafetyAct</a> <a rel="nofollow noopener" class="hashtag u-tag u-category" href="https://alecmuffett.com/article/tag/privacy" target="_blank">#privacy</a></p>