cross-posted from: https://programming.dev/post/37122335

  • Sexual extortion, or ‘sextortion’ scams against children and young people on the rise, with ‘hideous and callous cruelty’ used to blackmail victims.
  • Boys still at particular risk as numbers surge – making up 97% of confirmed sextortion cases seen by the Internet Watch Foundation (IWF).
  • UK’s Report Remove service, run jointly by Childline and the IWF, sees significant rise in children self-reporting nude or sexual imagery which may have got out of control online.
    • omcgo@lemmy.world · 1 day ago

      We have to look at hardware-level solutions like VOOP, which can block the sending of nudes (for example) entirely.
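
      For illustration, here's a minimal sketch of the kind of outbound filter such a product might implement: a local classifier scores every outgoing image, and the send is blocked above a confidence threshold. Everything here (the `OutboundImageFilter` name, the stub classifier, the 0.85 cutoff) is a hypothetical Python illustration under those assumptions, not VOOP's actual design.

      ```python
      # Sketch of an outbound image gate: score each outgoing image with a
      # local classifier and block the send above a confidence threshold.
      from dataclasses import dataclass
      from typing import Callable

      # A classifier maps raw image bytes to an explicitness score in [0, 1].
      Classifier = Callable[[bytes], float]

      @dataclass
      class OutboundImageFilter:
          classifier: Classifier
          threshold: float = 0.85  # hypothetical cutoff, not a real product value

          def allow_send(self, image_bytes: bytes) -> bool:
              """Return True only if the image scores below the block threshold."""
              return self.classifier(image_bytes) < self.threshold

      def stub_classifier(image_bytes: bytes) -> float:
          """Stand-in: a real device would run an on-device NSFW model here."""
          return 0.0

      if __name__ == "__main__":
          gate = OutboundImageFilter(classifier=stub_classifier)
          fake_jpeg = b"\xff\xd8\xff\xe0" + b"\x00" * 64  # dummy JPEG-ish bytes
          print("send allowed:", gate.allow_send(fake_jpeg))
      ```

      Even granting those assumptions, a gate like this only sees what passes through the sanctioned messaging path, which is exactly the gap the reply below points out.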

      • Blemgo@lemmy.world · 2 hours ago

        This solution also seems very flawed, though, doesn’t it? You’re basically trusting another company to manage your child’s smartphone and granting it full access to the device. Furthermore, it doesn’t stop predators, as they could still arrange meetups with their unknowing victims. And even if it captures text messages, kids would be discouraged from using their phones for fear of their parents disapproving of their friends or of their communication with them. Instead, they’d more likely learn to use “burner phones”, getting hold of a factory-reset phone and using that one instead.

        It’s the sort of ham-fisted attempt you’d expect from parents who blame their kids for their mistakes instead of looking at their own parenting.

  • TheGrandNagus@lemmy.world · 1 day ago

    This feels more like politics news than technology news. Sure, it’s done over social media/messaging apps, but so are plenty of things that I wouldn’t really call technology news.

    I’m somewhat surprised that boys are at a much greater risk of online sexual exploitation than girls, though.

    • sleen@lemmy.zip · 1 day ago

      It is a cyber-enabled crime, and as such can take other forms besides digital.

      Also, the situation presented by the article seems odd. The biggest change recently was the child protection laws, and yet the problem is growing faster than ever. I wonder if they provide any statistics, because I might just be speculating.

      The reason for this increase just doesn’t seem clear.

  • Dyskolos@lemmy.zip · 1 day ago

    I love tech very much, but what the past two decades have done to kids and minors (the topic above, social media et al.) saddens me more.

    I’m super glad I decided long ago against procreating…

    • sleen@lemmy.zip · 1 day ago

      This doesn’t hold up, because how do these two relate? Of course that could be deemed privacy-invasive by anyone, and even trust-invasive, since consent is treated loosely. But I wouldn’t consider oversharing to be remotely close to such a crime.

      • 3dcadmin@lemmy.relayeasy.com · 1 day ago

        There’s also the rise of AI combined with easily found images of children, which is what’s starting to happen now: a massive library of them, overshared on, for example, Facebook over the years.

        • sleen@lemmy.zip · 1 day ago

          I know AI usage is increasing, but I don’t understand how it allows ‘child images’ (which I assume means sexual abuse material) to be easily found.

          Additionally, I’d want clarification on how those massive overshared libraries are related to oversharing.

          • 3dcadmin@lemmy.relayeasy.com · 1 day ago

            They do not need to be sexual abuse material. They find easily scraped images, use AI to make them sexually abusive or pornographic, and then blackmail the child/children and their parents.

            For instance, in a case two weeks back, a young mother had shared images of her kids, nothing sexual, but a few in swimsuits when they were young. These were doctored and sent to the girl, who is now older, to blackmail her, and then on to her mother in an attempt to blackmail her as well. The pictures had been shared with a lack of understanding of privacy at the time, so anyone could see them.

            The police are struggling to find out who is doing the blackmailing, and struggling to find a reason to actually investigate: they say it is a likeness of the person rather than the actual person, and it was shared with the world years ago, meaning permission was given (i.e. the picture was allowed to be shared at the time). Of course I am not revealing any info, as it is a current police investigation and that would be illegal, but it appears to be going nowhere, and it is deeply disturbing to the kid and the mother.

          • Passerby6497@lemmy.world · 1 day ago

            Think of it as nonconsensual AI-generated CSAM. Like nonconsensual AI porn: you take public SFW photos and use AI to generate explicit images, using the provided photo as a reference for the victim’s likeness.

            As for the overshared library: think of all the pictures of people’s kids you see in your social media feed, then consider how few people set proper privacy controls on what they post (or just intentionally post publicly for laughs/attention). All of those images can be used as the basis for generating nonconsensual AI porn/CSAM.