• kusivittula@sopuli.xyz · ↑18 ↓1 · 10 months ago

      I’m studying mechanical engineering, and there’s a guy in our class who’s obsessed with ChatGPT. He’s always trying to solve every task with ChatGPT, and he’s always the first to share his solution on Zoom. So far it’s never been correct, but he just sticks with it…

      • Poayjay@lemmy.world · ↑8 ↓1 · 10 months ago

        I am a mechanical engineer. I got special permission from my IT department to use LLMs as part of my workflow, as a guinea pig for the department. It is completely useless.

        One of the most valuable skills an engineer can have is being able to communicate technical information effectively to different audiences. GPT is an overly polite meat grinder, spitting out half-chewed technical slop.

        • intensely_human@lemm.ee · ↑8 ↓2 · 10 months ago

          AI will have a better sense of something like mechanical engineering when it’s inhabited a body for a while.

      • drawerair@lemmy.world · ↑1 · 10 months ago

        If I’m recalling right, I asked ChatGPT about a banked turn with friction. It didn’t give the answer I was looking for.
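
        (Aside, not from the original comment: assuming the standard textbook setup of a car on a curve of radius r banked at angle θ, with static friction coefficient μ_s, the usual result for the maximum no-slip speed is

        v_{\max} = \sqrt{\frac{r g (\tan\theta + \mu_s)}{1 - \mu_s \tan\theta}}

        which is likely the kind of answer the commenter was checking ChatGPT against.)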

        I asked ChatGPT about the best big phones of 2022. One of the phones it cited was released in 2021.

    • weird_nugget@lemmy.world · ↑11 · 10 months ago

      Yeah, I use it too, and it is indeed frequently incorrect. It’s good when you have basically no idea what you’re doing: it can help you get on track, and then you can research the rest yourself.

    • TehBamski@lemmy.world (OP) · ↑8 · 10 months ago

      Not picking fights. Just curious.

      Is this an improvement or a decline in your overall programming success?

      • Matriks404@lemmy.world · ↑14 · 10 months ago

        I am a hobbyist (and not very good) programmer, and while ChatGPT (the free version) often gives me wrong answers, it still gives me some insight into how something could be done (intentionally or not) or how something works. It’s actually somewhat helpful for learning, though I guess it could be a double-edged sword even in that regard.

        It is also pretty good at detecting simple code errors, from what I have seen.

        Overall more positive than negative, but I wouldn’t recommend using it blindly.

      • Deceptichum@kbin.social · ↑8 ↓1 · 10 months ago

        Huge improvement in workflow.

        Don’t get it to write your code for you; it won’t work 3 times out of 10. Instead, use it to review your code and help remove code smells when refactoring.

      • blackbirdbiryani@lemmy.world · ↑7 ↓1 · 10 months ago

        I don’t use ChatGPT, but I work with colleagues who do. Their productivity visibly drops, and half the time I have to fix their shitty code.

    • LanternEverywhere@kbin.social · ↑8 · 10 months ago

      I use ChatGPT for any topic I’m curious about, and about half the time when I double-check the answers, it turns out they’re wrong.

      For example, I asked for a list of phones with screens that don’t use PWM, and when I looked up the specs of the phones it recommended, it turned out they all had PWM, even though the ChatGPT answer explicitly stated that each of these phones doesn’t use PWM. Why does it straight up lie?!

  • abbadon420@lemm.ee · ↑22 · 10 months ago

    People on ELI5 ask questions that can be answered with a single Google search, yet they don’t do the Google search. What makes you think they will use Bard or ChatGPT?

    • XeroxCool@lemmy.world · ↑2 · 10 months ago

      Because if the second-worst option is asking ELI5 something basic, then the worst is asking AI the same question and getting the wrong answer. So they choose AI.

    • sturlabragason@lemmy.world · ↑8 · 10 months ago

      Yeah, I read a 30-year-old newspaper a while back, and it was like super-high-latency internet: message boards, posts, replies to posts, personals, etc. None of that stuff makes it into newspapers anymore…

      • TehBamski@lemmy.world (OP) · ↑7 ↓1 · 10 months ago

        It is. I’ve seen ‘Write to the Editor’ sections often in the magazines I check out from the library from time to time. IIRC, Popular Mechanics, Popular Science, and The New Yorker each have one.

      • Captain Poofter@lemmy.world · ↑1 · edited · 10 months ago

        Newspapers? Depends who you ask

        Jokes aside, I am not talking about the “Write to the Editor” sections we see now. I am saying they’d use it like Google. You’re not going to see someone ask “What’s 32,344 divided by 7?” or “Who is the senator of Idaho?” in The New Yorker.

  • take6056@feddit.nl · ↑14 · 10 months ago

    Interestingly, since ChatGPT might be trained on these ELI5 questions, and since they are now asked less frequently as a result, it might get worse or fall out of date on these types of questions through its own doing. I especially wonder how bad this effect will get on subjects you’d normally search Stack Overflow for.

  • treadful@lemmy.zip · ↑14 · 10 months ago

    I don’t get why people trust their answers so much. They lie. Confidently. Constantly.

    • Otter@lemmy.ca · ↑7 · 10 months ago

      I usually ask the GPT, then look up the topic myself based on the terms and keywords it mentioned.

    • Fubber Nuckin'@lemmy.world · ↑3 · 10 months ago

      I don’t get why people completely disregard their usefulness because of that. Just don’t trust anything they say until you verify it. It’s still useful for exploration or to get enough of a grasp of something that you can figure it out on your own.

  • Otter@lemmy.ca · ↑6 · 10 months ago

    What are some other communities that are less used now? WritingPrompts and PhotoshopBattles come to mind for me

    • TehBamski@lemmy.world (OP) · ↑5 · 10 months ago

      Oh man. I forgot that PhotoshopBattles existed.

      That’s a good question. I would guess their use has lessened some. But both are for creative tasks, versus explaining an established topic, item, or thing.