see also: @smallpatatas@gotosocial.patatas.ca

  • 4 Posts
  • 11 Comments
Joined 1 year ago
Cake day: September 18th, 2023


  • This is very similar to my story - the end of support for Windows 7 meant putting Mint on the HTPC.

    Soon after that, it was the old laptop my spouse was about to chuck out. Cinnamon was a little sluggish, so I eventually landed on Debian + XFCE.

    And when I discovered I could get my desktop’s audio interface working on Linux (it’s FireWire, and by most people’s standards, ancient), it was game over for Windows.

    I don’t know what Freetrack is, but I hope it gets implemented for you :)

  • That’s a good question. The best answer is, I don’t know!

    But if I had to guess, based on the small amount I’ve learned:

    Larger servers most likely benefit from economies of scale. They’ll be using CDNs, and will often have several people on their server following any given remote account, rather than just one. So the per-client energy use is almost certainly lower than for small servers (there’s a rough back-of-envelope sketch of this after the comment below).

    But it’s still tough to know whether it’s the client or the server using more energy. IIRC, with video streaming, the end user’s device was a big factor in overall consumption - but a streaming server isn’t chugging away 24/7 fetching media for you the way a Fediverse server is.

    For single-user servers, or servers with only a few accounts, I expect the server (and all the network infrastructure between the two servers) is doing a lot more work than the client(s) - unless, say, the server is on a Raspberry Pi and the client is running on a powerful desktop for much of the day. Again, many factors at play.

    Really though, the question I start asking in all this is: which parts of the system are the most difficult to justify?
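
To make the economies-of-scale point above concrete, here is a minimal back-of-envelope sketch in Python. Every number in it is an invented placeholder, not a measurement; it only illustrates how one remote fetch, amortized over more local followers, drives the per-client energy cost down.

```python
# Back-of-envelope sketch of the amortization argument above.
# FETCH_ENERGY_J is a hypothetical placeholder, not a measured value.

FETCH_ENERGY_J = 5.0  # assumed energy cost (joules) for one server to
                      # fetch and process a single remote post

def per_client_energy(followers_on_server: int) -> float:
    """Energy per local follower when one fetch serves all of them."""
    return FETCH_ENERGY_J / followers_on_server

# A single-user server pays the full fetch cost for one person;
# a large server spreads the same fetch across many followers.
for n in (1, 10, 1000):
    print(f"{n:>5} local followers -> {per_client_energy(n):.4f} J per client")
```

Under these made-up assumptions, the single-user server pays 5 J per client for a post, while a server with 1000 local followers of the same account pays 0.005 J per client - which is the whole of the economies-of-scale argument, before CDNs and caching make the gap wider still.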