Since self-hosted clouds seem to be the most common thing people host, I'm wondering what else people here are self-hosting. Is anyone making use of something like Excalidraw in the workplace? Curious what apps would be useful to always access over the web that aren't media servers.

  • AtHeartEngineer
    14 points · 16 hours ago

    Local LLMs. I'm surprised no one has brought that up yet. I've got an old GPU in my server, and I'm running some local models with Open WebUI for use in the browser and Maid as an Android app to connect to it.
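    For anyone wanting to replicate that kind of setup, here is a minimal docker-compose sketch. It assumes Ollama as the model backend (the commenter doesn't say which server they use; Open WebUI supports Ollama out of the box), so treat the details as illustrative:

```yaml
services:
  ollama:
    image: ollama/ollama            # model server, listens on 11434
    volumes:
      - ollama:/root/.ollama        # persist downloaded models
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia        # pass the old GPU through to the container
              count: all
              capabilities: [gpu]

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"                 # browser UI at http://<server>:3000
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    depends_on:
      - ollama

volumes:
  ollama:
```

    An Android client like Maid can then be pointed at the same Ollama endpoint over the LAN.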

    • @ifItWasUpToMe@lemmy.ca
      6 points · 15 hours ago

      You're a brave one admitting that on here. Don't you know LLMs are pure evil? You might as well be torturing children!

      • AtHeartEngineer
        2 points · 7 hours ago

        I think looking through the comments on this post about AI stuff is a pretty good representation of my experience on lemmy. Definitely some opinions, but most people are pretty reasonable 🙂

      • 3DMVROP
        5 points · 14 hours ago

        AI is fine as a tool; trying to replace workers and artists while blatantly ripping stuff off is annoying. It can be a time saver, or just helpful for searching through your own docs/files.

        • @ifItWasUpToMe@lemmy.ca
          5 points · 12 hours ago

          If you agree it's a time saver, then you agree it makes workers more efficient. You now have a team of 5 doing the work of a team of 6. From a business perspective it's idiotic to have more people than you need, so someone would be let go from that team.

          I personally don’t see any issue with this, as it’s been happening for the existence of humanity.

          Tools are constantly improving that make us more efficient.

          Most people's issue with AI is really an issue with greedy humans, not the technology itself. Lord knows that new team of 5 is not getting the collective pay of the previous team of 6.

          • @bluesheep@lemm.ee
            1 point · 2 hours ago

            Nor will they get the workload of 6 people. They might for a couple of months, but at some point the KPIs will suddenly say that it's possible to squeeze out the workload of 2 more people. Maybe even with 1 worker fewer!

          • 3DMVROP
            1 point · 3 hours ago

            More work can get done and more work can be shown in progress. It's a marginal time saver; it'll knock off maybe 25% of a human's workload, if that, not replace a whole one.

      • AtHeartEngineer
        6 points · 15 hours ago

        I think most people on here are reasonable, and I think local LLMs are reasonable.

        The race to AGI and companies trying to shove "AI" into everything is kind of insane, but it's hard to deny LLMs are useful, and running them locally avoids the privacy concerns.

        • @ifItWasUpToMe@lemmy.ca
          6 points · 15 hours ago

          Interesting, this has not been my experience. Most people on here seem to treat AI as completely black and white, with zero shades of grey.

          • AtHeartEngineer
            3 points · 15 hours ago

            I see a mix, don’t get me wrong, Lemmy is definitely opinionated lol, but I don’t think it’s quite black and white.

            Also, generally, I'm not going to not share my thoughts or opinions because I'm afraid of people that don't understand nuance. Sometimes I don't feel like dealing with it, but I'm going to share my opinion most of the time.

            OP asked what you self-host that isn't media; self-hosted LLMs are something I find very useful and didn't see mentioned. Home Assistant, Pi-hole, etc. are all great answers… but those were already mentioned.

            I still have positive upvotes on that comment, and no one has flamed me yet, but we will see.

            • @treyf711@lemm.ee
              2 points · 14 hours ago

              I'll give my recommendation to local LLMs as well. I have a 1060 Super that I bought back in 2019, and it's just big enough to do some very basic autocompletion within Visual Studio. I love it. I wouldn't trust it to write an entire program on its own, but when I hit a mental block and need a rough idea of how to use a library or how to arrange some code, it gives me enough inspiration to get over that hump.
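              To sketch how an editor talks to a local model: most local servers (Ollama, llama.cpp's server, etc.) expose an OpenAI-compatible completions endpoint, so a request is just a small JSON body POSTed over HTTP. The model name and port below are illustrative assumptions, not what the commenter actually runs:

```python
import json
from urllib import request

def completion_body(prompt: str, model: str = "qwen2.5-coder:1.5b",
                    max_tokens: int = 64) -> dict:
    """Build the JSON body for an OpenAI-compatible /v1/completions call."""
    return {
        "model": model,
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": 0.2,  # low temperature suits code completion
    }

def complete(prompt: str, base_url: str = "http://localhost:11434") -> str:
    """POST the prompt to a local server and return the completion text."""
    req = request.Request(
        f"{base_url}/v1/completions",
        data=json.dumps(completion_body(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["text"]

# Example payload for a code-completion prompt:
print(json.dumps(completion_body("def fibonacci(n):"), indent=2))
```

              A small model quantized to 4 bits is what makes this workable on an older 6 GB card.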

              • AtHeartEngineer
                2 points · 14 hours ago

                Ya, exactly! Or just sanity-checking whether you understand how something works; I use it a lot for that, or for filling in knowledge gaps.

                Also, Qwen3 is out, check that out; it might fit on a 1060.
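                A quick way to guess whether a model fits: the weights take roughly parameter count × bits-per-weight ÷ 8 bytes, plus headroom for the KV cache and runtime buffers. A back-of-envelope sketch (the flat 1 GB overhead is a rough assumption; real usage grows with context length):

```python
def vram_estimate_gb(params_billion: float, bits_per_weight: float,
                     overhead_gb: float = 1.0) -> float:
    """Rough VRAM needed: quantized weights plus a flat allowance
    for KV cache and runtime buffers."""
    weights_gb = params_billion * bits_per_weight / 8  # 1e9 params * bits/8 bytes ~= GB
    return weights_gb + overhead_gb

# A 4B-parameter model at 4-bit quantization vs. a 6 GB GTX 1060:
print(vram_estimate_gb(4, 4))   # 3.0 -> should fit
print(vram_estimate_gb(8, 4))   # 5.0 -> tight, little room left for context
```

                By this estimate the smaller quantized Qwen3 variants should fit in 6 GB, while 8B-class models would be a squeeze.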