• Lucy :3
    13 months ago

    If you have a decent GPU (or a reasonably fast CPU), you can just set up Ollama with ollama-cuda or ollama-rocm and run llama3.1 or llama3.1-uncensored.
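    For reference, a minimal sketch of that setup on an Arch-based system (the ollama-cuda/ollama-rocm package names come from the comment; the systemd unit name and exact model tag are assumptions, so adjust for your distro and GPU):

    ```shell
    # Install Ollama with GPU acceleration -- pick the one matching your hardware:
    sudo pacman -S ollama-cuda    # NVIDIA GPUs
    # sudo pacman -S ollama-rocm  # AMD GPUs

    # Start the Ollama server, then pull and chat with the model
    # (the model is downloaded automatically on first run):
    sudo systemctl enable --now ollama
    ollama run llama3.1
    ```

    Without a supported GPU, the plain ollama package falls back to CPU inference, which works but is much slower.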

      • Lucy :3
        13 months ago

        I bet even my Pi Zero W could run such a model*

        * with 1 character per hour or so