• sturlabragason@lemmy.world
    9 months ago

    Second this.

    We’re in the early days, and every day I add a new model or technique to my reading list. We’re getting close to talking to our CPUs. We’re building these stacks and solving the memory problems: with million-token context windows you barely need RAG, Gorilla-style models can call APIs, and most models are great at Python, which is versatile as fuck. I can see the singularity on the horizon.

    Try Ollama if you want to test things yourself.
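    If you want a starting point, here’s a minimal sketch of hitting Ollama’s local HTTP API from Python. It assumes you have the Ollama server running (`ollama serve`) and a model pulled (e.g. `ollama pull llama3`); the model name and prompt are just placeholders.

    ```python
    import json
    import urllib.request

    # Ollama's default local endpoint for single-shot generation.
    OLLAMA_URL = "http://localhost:11434/api/generate"

    def build_request(model: str, prompt: str) -> dict:
        # Minimal payload; stream=False makes the server return one JSON object
        # instead of a stream of chunks.
        return {"model": model, "prompt": prompt, "stream": False}

    def ask(model: str, prompt: str) -> str:
        # Requires a running local Ollama server with the model already pulled.
        payload = json.dumps(build_request(model, prompt)).encode()
        req = urllib.request.Request(
            OLLAMA_URL,
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    # Example (needs a running server):
    # print(ask("llama3", "Explain RAG in one sentence."))
    ```

    From there it’s a short hop to wiring model output into your own scripts.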

    Use GPT-4 if you want an inkling of the potential that’s coming. I mean really use it.