Need to make a primal scream without gathering footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid!

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).
Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

  • BigMuffin69@awful.systems · 22 points · 5 months ago

    https://xcancel.com/AISafetyMemes/status/1802894899022533034#m

    The same pundits have been saying “deep learning is hitting a wall” for a DECADE. Why do they have ANY credibility left? Wrong, wrong, wrong. Year after year after year. Like all professional pundits, they pound their fist on the table and confidently declare AGI IS DEFINITELY FAR OFF and people breathe a sigh of relief.

    Because to admit that AGI might be soon is SCARY. Or it should be, because it represents MASSIVE uncertainty. AGI is our final invention. You have to acknowledge the world as we know it will end, for better or worse. Your 20 year plans up in smoke. Learning a language for no reason. Preparing for a career that won’t exist. Raising kids who might just… suddenly die. Because we invited aliens with superior technology we couldn’t control.

    Remember, many hopium addicts are just hoping that we become PETS. They point to Ian Banks’ Culture series as a good outcome… where, again, HUMANS ARE PETS. THIS IS THEIR GOOD OUTCOME.

    What’s funny, too, is that noted skeptics like Gary Marcus still think there’s a 35% chance of AGI in the next 12 years - that is still HIGH! (Side note: many skeptics are butthurt they wasted their career on the wrong ML paradigm.)

    Nobody wants to stare in the face the fact that 1) the average AI scientist thinks there is a 1 in 6 chance we’re all about to die, or that 2) most AGI company insiders now think AGI is 2-5 years away. It is insane that this isn’t the only thing on the news right now.

    So… we stay in our hopium dens, nitpicking The Latest Thing AI Still Can’t Do, missing forests from trees, underreacting to the clear-as-day exponential. Most insiders agree: the alien ships are now visible in the sky, and we don’t know if they’re going to cure cancer or exterminate us. Be brave. Stare AGI in the face.

    This post almost made me crash my self-driving car.

    • self@awful.systems · 20 points · 5 months ago

      Remember, many hopium addicts are just hoping that we become PETS. They point to Ian Banks’ Culture series as a good outcome… where, again, HUMANS ARE PETS. THIS IS THEIR GOOD OUTCOME.

      I am once again begging these e/acc fucking idiots to actually read and engage with the sci-fi books they keep citing

      but who am I kidding? the only way you come up with a take as stupid as “humans are pets in the Culture” is if your only exposure to the books is having GPT summarize them

    • maol@awful.systems · 19 points · 5 months ago

      It’s mad that we have an actual existential crisis in climate change (temperature records broken across the world this year) but these cunts are driving themselves into a frenzy over something that is nowhere near as pressing or dangerous. Oh, people dying of heatstroke isn’t as glamorous? Fuck off

    • Mii@awful.systems · 16 points · 5 months ago

      Seriously, could someone gift this dude a subscription to spicyautocompletegirlfriends.ai so he can finally cum?

      One thing that’s crazy: it’s not just skeptics, virtually EVERYONE in AI has a terrible track record - and all in the same OPPOSITE direction from usual! In every other industry, due to the Planning Fallacy etc, people predict things will take 2 years, but they actually take 10 years. In AI, people predict 10 years, then it happens in 2!

      ai_quotes_from_1965.txt

    • Soyweiser@awful.systems · 14 points · 5 months ago

      humans are pets

      Actually not what is happening in the books. I get where they are coming from, but this requires redefining the word “pet” in such a way that it becomes a useless word.

      The Culture series really breaks the brains of people who can only think in hierarchies.

    • gerikson@awful.systems · 12 points · 5 months ago

      If you’ve been around the block like I have, you’ve seen reports about people joining cults to await spaceships, people preaching that the world is about to end, &c. It’s a staple trope in old New Yorker cartoons, where a bearded dude walks around with a sandwich board saying “The End is nigh”.

      The tech world is growing up, and a new internet-native generation has taken over. But everyone is still human, and the same pattern-matching that leads a 19th century Christian to discern when the world is going to end by reading Revelation will lead a 25 year old tech bro steeped in “rationalism” to decide that spicy autocomplete is the first stage of The End of the Human Race. The only difference is the inputs.