• WiildFiire@lemmy.world · 9 months ago

      It’ll be kept within product marketing and, I dunno how, but it would absolutely be used to see what they can raise prices on

    • CeeBee@lemmy.world · 9 months ago

      It’s getting there. In the next few years, as hardware gets better and models get more efficient, we’ll be able to run these systems entirely locally.

      I’m already doing it, but I have some higher-end hardware.

        • CeeBee@lemmy.world · 9 months ago

          Stable Diffusion SDXL Turbo model running in Automatic1111 for image generation.
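
          A minimal sketch of driving that setup from a script, assuming the Automatic1111 web UI was started with its --api flag on the default port (7860) and an SDXL Turbo checkpoint is the active model; the steps/cfg_scale values are just the low settings Turbo models are tuned for, not anything from the comment:

          ```python
          # Call the local Automatic1111 API (http://127.0.0.1:7860, --api enabled).
          # SDXL Turbo checkpoints are tuned for very few steps and low guidance.
          import base64
          import requests

          payload = {
              "prompt": "a watercolor painting of a lighthouse at dusk",
              "steps": 4,        # Turbo models target roughly 1-8 steps
              "cfg_scale": 1.5,  # low guidance works better with Turbo
              "width": 512,
              "height": 512,
          }

          resp = requests.post("http://127.0.0.1:7860/sdapi/v1/txt2img", json=payload, timeout=120)
          resp.raise_for_status()

          # The API returns base64-encoded PNGs in the "images" list.
          with open("output.png", "wb") as f:
              f.write(base64.b64decode(resp.json()["images"][0]))
          ```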

          Ollama with Ollama-webui for an LLM. I like the Solar:7b model. It’s lightweight, fast, and gives really good results.
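
          And a matching sketch for the LLM side, assuming Ollama is serving on its default port (11434) and the solar model has already been pulled (e.g. with `ollama pull solar`); this hits the plain REST API directly rather than going through the web UI:

          ```python
          # Ask the local Ollama server for a single, non-streamed completion.
          import requests

          resp = requests.post(
              "http://127.0.0.1:11434/api/generate",
              json={
                  "model": "solar",
                  "prompt": "Give me three dinner ideas using rice, eggs, and spinach.",
                  "stream": False,  # one JSON object instead of a token stream
              },
              timeout=120,
          )
          resp.raise_for_status()
          print(resp.json()["response"])
          ```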

          I have some beefy hardware that I run it on, but it’s not necessary to have.

  • AeonFelis@lemmy.world · 9 months ago

    I’m sure there are companies who’d love to develop something like this, and collect information about exactly which groceries you currently have and statistics on how you consume them, so they can sell it to advertisers. Not advertisers that sell those groceries, of course - for those, the AI company could just make the AI buy them from whichever suppliers pay them.

  • theneverfox@pawb.social · 9 months ago

    AI could do this. Conventional programming could do it faster and better, even if it were written by AI.

    It’s an important concept to grasp.
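
    For the grocery-tracking idea upthread, a toy sketch of what the “conventional programming” version might look like - plain data structures and a threshold check, no model involved (all names here are hypothetical):

    ```python
    # Hypothetical pantry tracker: the boring, deterministic version of the idea.
    from dataclasses import dataclass, field


    @dataclass
    class Pantry:
        stock: dict[str, float] = field(default_factory=dict)  # item -> quantity on hand

        def record_purchase(self, item: str, qty: float) -> None:
            self.stock[item] = self.stock.get(item, 0) + qty

        def record_use(self, item: str, qty: float) -> None:
            self.stock[item] = max(self.stock.get(item, 0) - qty, 0)

        def shopping_list(self, minimums: dict[str, float]) -> list[str]:
            # Anything that has fallen below its minimum level goes on the list.
            return [item for item, floor in minimums.items()
                    if self.stock.get(item, 0) < floor]


    pantry = Pantry()
    pantry.record_purchase("milk", 2)
    pantry.record_use("milk", 1.5)
    print(pantry.shopping_list({"milk": 1, "eggs": 6}))  # ['milk', 'eggs']
    ```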