• Warl0k3@lemmy.world
    2 months ago

    We’ve been dealing with this a lot in WA, too. The popular one a couple months ago was the cliché Hollywood “We’ve kidnapped your grandkid, here’s proof (an AI copy of their grandchild’s voice, taken from Facebook videos as far as we can tell, reading off that day’s news article), please send $xxx in bitcoin.” It’s disgustingly effective. Don’t post your kids on social media, folks. Don’t do it.

    • FiveMacs@lemmy.ca
      2 months ago

      Is this the 2000s again…

      No names, no numbers, no addresses, stop storing things on someone else’s computer. But it’s honestly too late for most people. They were all caught up in the fuckerburg give-him-your-personal-info days.

      More than half the world doesn’t care or never thinks of the potential consequences, and then this kind of crap happens.

  • Rai@lemmy.dbzer0.com
    2 months ago

    Everyone should make their elders watch Kitboga. I’ve learned about so many scams from him.

  • EpeeGnome@lemm.ee
    2 months ago

    This scam has been around since long before AI voice was a thing. Say something scary enough, and people will subconsciously attribute anything off about the voice to a bad connection and the severe stress the person is presumably under, and genuinely think the voice sounds exactly like their loved one. AI voice makes it easier to fool more people, but I bet most of these scammers aren’t putting in the time to research every target and build a voice profile; instead they focus on calling as many people as possible. Of course, these days anyone taken in by this scam will assume it must have been AI voice, because otherwise how did the caller sound so convincing?