cross-posted from: https://programming.dev/post/37122335

  • Sexual extortion, or ‘sextortion’ scams against children and young people on the rise, with ‘hideous and callous cruelty’ used to blackmail victims.
  • Boys still at particular risk as numbers surge – making up 97% of confirmed sextortion cases seen by the Internet Watch Foundation (IWF).
  • UK’s Report Remove service, run jointly by Childline and the IWF, sees significant rise in children self-reporting nude or sexual imagery which may have got out of control online.
  • TheGrandNagus@lemmy.world · 14 hours ago

    This feels more like politics news than technology news. Sure, it’s done over social media/messaging apps, but so are plenty of things that I wouldn’t really call technology news.

    I’m somewhat surprised that boys are at a much greater risk of online sexual exploitation than girls, though.

    • sleen@lemmy.zip · 13 hours ago

      It is a cyber-enabled crime, and as such it can take other forms besides purely digital ones.

      Also, the situation presented by the article seems odd. The only major change recently was the child protection laws, and yet the crime is growing faster than ever. I wonder if they provide any statistics, because I might just be speculating.

      The reason for this increase just doesn’t seem clear.

  • Dyskolos@lemmy.zip · 15 hours ago

    I love tech so much, but what the last two decades have done to kids and minors (the topic above, social media, et al.) saddens me even more.

    I’m so glad I decided long ago against procreating…

    • sleen@lemmy.zip · 13 hours ago

      This doesn’t hold up, because how do the two relate? Of course social media could be deemed privacy-invasive by anyone, and even trust-invasive, given how loosely consent is treated. But I wouldn’t consider oversharing to be remotely close to such a crime.

      • 3dcadmin@lemmy.relayeasy.com · 12 hours ago

        There’s also the rise of AI combined with easily found images of children. A massive library of those has been overshared on, for example, Facebook over the years.

        • sleen@lemmy.zip · 11 hours ago

          I know AI usage is increasing, but I don’t understand how it allows ‘child images’ (which I assume means sexual abuse material) to be easily found.

          Additionally, I’d want clarification on how those massive overshared libraries relate to this kind of crime.

          • 3dcadmin@lemmy.relayeasy.com · 10 hours ago

            They do not need to be sexually abusive material. Scammers find easily scraped images, use AI to make them sexually abusive or pornographic, and then blackmail the children and their parents.

            For instance, in a case two weeks back, a young mother had shared images of her kids: nothing sexual, but a few in swimsuits when they were young. These were doctored and sent to the girl, who is now older, to blackmail her, and then passed on to her mother in an attempt to blackmail her as well. The pictures had been shared with a lack of understanding of privacy at the time, so anyone could see them.

            The police are struggling to find out who is blackmailing them, and struggling to find a reason to investigate at all, as they say it is a likeness of the person rather than the actual person, and it was shared with the world years ago, meaning permission was given (i.e. the picture was allowed to be shared at the time). Of course I am not revealing any details, as it is a current police investigation and that would be illegal, but it appears to be going nowhere, and it is deeply disturbing to the kid and the mother.

          • Passerby6497@lemmy.world · 9 hours ago

            Think of it as nonconsensual AI-generated CSAM. As with nonconsensual AI porn, you take public SFW photos and use AI to generate explicit images, with the provided photo serving as a reference for the victim’s likeness.

            As for the overshared library, think of all the pictures of people’s kids you see in your social media feed, then consider how few people set proper privacy controls on what they post (or just intentionally post publicly for laughs/attention). All of those images can be used as the basis for generating nonconsensual AI porn/CSAM.