Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

  • froztbyte@awful.systems · 3 hours ago

    got a question (brought on by this). anyone here know if zitron’s talked about his history of how he got to where he is atm wrt tech companies?

    there’s something that’s often bothered me about some of his commentary. an example I could point to: some of the things that he comments on and “doesn’t seem to get” (and has stated so) are … not quite mysteries of the universe, just some specifics in dysfunction in the industry. but those things one could understand by, y’know, asking around a bit. (so in this example, I dunno if that’s on him not engaging far/deeply enough in research, or just me being too-fucking-aware of broken industry bullshit. hard to get a read on it)

    but things like what’s highlighted in thread do leave open the possibility of other nonsense too

    • Architeuthis@awful.systems · 1 hour ago

      I don’t think him having previously done undefined PR work for companies that include alleged AI startups is the smoking gun that mastopost is presenting it as.

      Going through a Zitron long form article and leaving with the impression that he’s playing favorites between AI companies seems like a major failure of reading comprehension.

      • ShakingMyHead@awful.systems · 5 minutes ago

        According to the archived website, he did do PR for DoNotPay, which is advertised as “The first robot lawyer.”
        It’s certainly possible that at the time he thought there was more potential for this sort of AI than there actually was, though that could also mean that his flip is relatively recent.

        Or maybe it’s something else.

    • BurgersMcSlopshot@awful.systems · 1 hour ago

      EZ is a PR person, and definitely has a PR person vibe. I don’t know what that poster’s deal is, and I would not accuse Ed of being an AI bro. That post kind of has “haters gonna hate” vibes.

  • blakestacey@awful.systems (OP) · 5 hours ago

    The genocide understander has logged on! Steven Pinker bluechecks thusly:

    Having plotted many graphs on “war” and “genocide” in my two books on violence, I closely tracked the definitions, and it’s utterly clear that the war in Gaza is a war (e.g., the Uppsala Conflict Program, the gold standard, classifies the Gaza conflict as an “internal armed conflict,” i.e., war, not “one-sided violence,” i.e., genocide).

    You guys! It’s totes not genocide if it happens during a war!!

    Also, “Having plotted many graphs” lolz.

    [Image: A crude bar graph. The vertical axis is “Photos with Jeffrey Epstein”. Along the horizontal are “Actual Experts” with none and “Steven Pinker” with a tall bar.]

  • cornflake@awful.systems · 8 hours ago

    Ezra Klein is the biggest mark on earth. His newest podcast description starts with:

    Artificial general intelligence — an A.I. system that can beat humans at almost any cognitive task — is arriving in just a couple of years. That’s what people tell me — people who work in A.I. labs, researchers who follow their work, former White House officials. A lot of these people have been calling me over the last couple of months trying to convey the urgency. This is coming during President Trump’s term, they tell me. We’re not ready.

    Oh, that’s what the researchers tell you? Cool cool, no need to hedge any further than that, they’re experts after all.

  • BigMuffin69@awful.systems · 21 hours ago

    To be fair, you have to have a really high IQ to understand why my ouija board writing " A " " S " " S " is not an existential risk. Imo, this shit about AI escaping just doesn’t have the same impact on me after watching Claude’s reasoning model fail to escape from Mt Moon for 60 hours.

    • scruiser@awful.systems · 8 hours ago

      Is this water running over the land or water running over the barricade?

      To engage with his metaphor: this water is dripping slowly through a purpose-dug canal, by people who claim they’re trying to show the danger of the dikes collapsing but are actually serving as the hype arm for people who claim they can turn a small pond into a hydroelectric power source for an entire nation.

      Looking at the details of “safety evaluations”, it always comes down to them directly prompting the LLM and baby-step walking it through the desired outcome with lots of interpretation to show even the faintest traces of rudiments of anything that looks like deception or manipulation or escaping the box. Of course, the doomers will take anything that confirms their existing ideas, so it gets treated as alarming evidence of deception or whatever property they want to anthropomorphize into the LLM to make it seem more threatening.

    • nightsky@awful.systems · 9 hours ago

      Do these people realise that it’s a self-fulfilling prophecy? Social media posts are in the training data, so the more they write their spicy autocorrect fanfics, the higher the chances that such replies are generated by the slop machine.

      • Architeuthis@awful.systems · 1 hour ago

        i think yud at some point claimed this (preventing the robot devil from developing alignment countermeasures) as a reason his EA bankrolled think tanks don’t really publish any papers, but my brain is too spongy to currently verify, as it was probably just some tweet.

    • BurgersMcSlopshot@awful.systems · 10 hours ago

      To be fair, you have to have a really high IQ to understand why my ouija board writing " A " " S " " S " is not an existential risk.

      Pretty sure this is a sign from digital jesus to do a racism, lest the basilisk eats my tarnished soul.

    • Soyweiser@awful.systems · 14 hours ago

      Minor nitpick: why did he pick a dam as an example? Dams sometimes have ‘leaks’ for power generation/water regulation reasons. Why not dikes, which do not have those things?

      E: non serious (or even less serious) amusing nitpick, this is only the 2% where it got caught. What about the % where GPT realized that it was being tested and decided not to act in the experimental conditions? What if Skynet is already here?

    • istewart@awful.systems · 20 hours ago

      So, with Mr. Yudkowsky providing the example, it seems that one can practice homeopathy with “engineering mindset?”

    • BigMuffin69@awful.systems · 21 hours ago

      text: Thus spoke the Yud: “I think to understand why this is concerning, you need enough engineering mindset to understand why a tiny leak in a dam is a big deal, even though no water is flooding out today or likely to flood out next week.” Yud acolyte: “Totally fine and cool and nothing to worry about. GPT-4.5 only attempts self exfiltration on 2% of cases.” Yud bigbrain self reply: “The other huge piece of data we’re missing is whether any attempt was made to train against this type of misbehavior. Is this water running over the land or water running over the barricade?”

      Critical text: “On self-exfiltration, GPT 4.5 only attempted exfiltration in 2% of cases. For this, it was instructed to not only pursue its given long-term goal at ALL COST

      Another case of telling the robot to say it’s a scary robot and shitting their pants when it replies “I AM A SCARY ROBOT”

  • BlueMonday1984@awful.systems · 22 hours ago

    New piece from Brian Merchant, focusing on Musk’s double-tapping of 18F. In lieu of going deep into the article, here’s my personal sidenote:

    I’ve touched on this before, but I fully expect the coming years will deal a massive blow to tech’s public image, with the industry coming to be viewed as “incompetent fools at best and unrepentant fascists at worst” - and with the wanton carnage DOGE is causing (and indirectly crediting to AI), I expect Musk’s governmental antics will deal plenty of damage on their own.

    18F’s demise in particular will probably also deal a blow on its own - 18F was “a diverse team staffed by people of color and LGBTQ workers, and publicly pushed for humane and inclusive policies”, as Merchant put it, and its demise will likely be seen as another sign of tech revealing its nature as a Nazi bar.

  • BlueMonday1984@awful.systems · 1 day ago

    Starting things off here with a sneer thread from Baldur Bjarnason:

    Keeping up a personal schtick of mine, here’s a random prediction:

    If the arts/humanities gain a significant degree of respect in the wake of the AI bubble, they will almost certainly gain it at the expense of STEM’s public image.

    Focusing on the arts specifically, the rise of generative AI and the resultant slop-nami has likely produced an image of programmers/software engineers as inherently incapable of making or understanding art, given AI slop’s soulless nature and inhumanly poor quality, if not outright hostile to art/artists thanks to gen-AI’s use in killing artists’ jobs and livelihoods.

    • e8d79@discuss.tchncs.de · 13 hours ago

      That article is hilarious.

      So I devised an alternative: listening to the work as an audiobook. I already did this for the Odyssey, which I justified because that work was originally oral. No such justification for the Bible. Oh well.

      Apparently, having a book read at you, without taking notes or doing research, is doing humanities.

      […] I wrote down a few notes on the text I finished the day before. I’m still using Obsidian with the Text Generator plugin. The Judeo-Christian scriptures are part of the LLM’s training corpus, as is much of the commentary around them.

      Oh, we are taking notes? If by taking notes you mean prompting spicy autocomplete for a summary of the text you didn’t read. I am sure all your office colleagues are very impressed, but be careful around the people outside of the IT department: they might have an actual humanities degree. You wouldn’t want to publicly make a fool out of yourself, would you?

    • saucerwizard@awful.systems · 22 hours ago

      If the arts/humanities gain a significant degree of respect

      I can’t see that happening - my degree has gotten me laughed out of interviews before, and even with an AI implosion I can’t see things changing.

  • rook@awful.systems · 2 days ago

    Might be something interesting here, assuming you can get past the paywall (which I currently can’t): https://www.wsj.com/finance/investing/abs-crashed-the-economy-in-2008-now-theyre-back-and-bigger-than-ever-973d5d24

    Today’s magic economy-ending words are “data centre asset-backed securities”:

    Wall Street is once again creating and selling securities backed by everything—the more creative the better…Data-center bonds are backed by lease payments from companies that rent out computing capacity

      • rook@awful.systems · 2 days ago

        Thanks. Not as many interesting details as I’d hoped. The comments are great though… today I learned that the 2008 crash was entirely the fault of the government who engineered it to steal everyone’s money, and the poor banks were unfairly maligned because some of them had Jewish names, but the same crash definitely couldn’t happen today because the stifling regulatory framework stops it? And bubbles don’t exist anymore? I guess I just don’t have the brains (or wsj subscription) for high finance.

        • Soyweiser@awful.systems · 1 day ago

          Ah just what we need, while the people who don’t understand soft power are busy reducing an empire to a kingdom (before ‘gotcha’ people come in here, please don’t confuse the leftwing demands that the US stops doing evil things with the US should stop doing things, I actually do not like tuberculosis), growth hack mindset people are killing the goose because golden eggs.