• 1 Post
  • 513 Comments
Joined 1 year ago
Cake day: July 24th, 2023


  • You can either get something representative, i.e. the median, by choosing mid-range traits (mid spirituality, mid artistic ability, mid intro-/extroversion, mid IQ), or by choosing at random. Anything else would not be representative, and depending on your own traits you would be biased: e.g. non-spiritual, non-artistic, introverted, anxious, mid IQ, technology-oriented, but also trans and depressed. Some of those I would obviously not see as average traits, but being atheist and non-spiritual would definitely be on my list, despite that not being the actual norm.





  • 30p87@feddit.de to Memes@lemmy.ml · priorities · 4 months ago

    The local backups are done hourly, and incrementally. They hold 2+ weeks of backups, which means I can easily roll back package versions, as the normal package cache is cleaned regularly. They also protect against accidentally losing individual files through weird behaviour of apps, or my own mistakes.

    The backups to my workstation are also hourly and incremental, shifted by 15 minutes per device. They protect against the device itself breaking, ransomware, or some rogue program rm -rf'ing /, any of which would affect the local backups too (as those are mounted in /backups); the local ones are mainly for providing a file history, as I said.
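    A staggered schedule like that can be expressed directly in cron; the device names and the backup-push helper are hypothetical, not real tools:

```shell
# Hourly pushes to the workstation, shifted 15 minutes per device
# so it never receives two backups at once.
# ("backup-push" is a hypothetical wrapper script, not a real command.)
0  * * * *  backup-push homeserver
15 * * * *  backup-push desktop
30 * * * *  backup-push laptop
45 * * * *  backup-push nas
```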

    As most drives are slower than 1 Gbps ethernet, the local backups are just more convenient to access and use than the ones on my workstation, but they are otherwise exactly the same.

    The .tar.xz’d backups are the actual backups, considering they are not easily accessible, need to be unpacked, and are stored externally.

    I didn’t measure the speed of a single SSD vs. the RAID, but it feels faster. Not a valid argument, of course. Either way, I want to use it as RAID 0/unraided for more storage space, so I can keep 2 weeks of backups instead of 5 days (since it always reserves space for 2 backups, I would otherwise have under 200 GB of usable space instead of 700+).

    The latest hourly backup is 1.3 GB in size, but if an application with a single, big DB is used, that can quickly shoot up to dozens of GB, which is relatively big for a home server hosting primarily my own stuff plus a few things for my father. Synapse’s DB alone is 20 GB. On an uneventful day that adds up to 31 GB; with several updates done, meaning dozens of new packages in the cache, it could grow to 70+ GB.
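    The daily figure is just the hourly increment times 24:

```shell
# 24 hourly increments of ~1.3 GB each on an uneventful day
hourly_mb=1300
echo "$((hourly_mb * 24 / 1000)) GB per day"   # 31 GB per day
```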



  • 30p87@feddit.de to Memes@lemmy.ml · priorities · 4 months ago

    Because that’s what RAID 0 is for: basically adding storage space together, with faster reads and writes. The local backups are basically just there to have earlier versions of (system) files, incrementally every hour, for reference or restoring. In case something goes wrong with the main root NVMe and a backup SSD at the same time (e.g. a trojan wiping everything), I still have exactly the same backups on my “workstation” (a beefier server), also on a RAID 0, of 3 × 1 TB HDDs. And in case the house burns down or something, there are still daily full backups on Google Cloud and Hetzner.
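    The comment doesn’t say how the array was built; on Linux, a 3-disk RAID 0 is typically assembled with mdadm. The device names below are placeholders, and creating the array destroys the disks’ contents:

```shell
# Hypothetical mdadm setup for a 3 × 1 TB RAID 0; /dev/sdb..sdd are
# placeholders. Striping sums the capacities (~3 TB usable), but any
# single disk failure loses the whole array, hence the off-site copies.
mdadm --create /dev/md0 --level=0 --raid-devices=3 /dev/sdb /dev/sdc /dev/sdd
mkfs.ext4 /dev/md0
mount /dev/md0 /backups
```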






  • In the modern world it’s completely subjective.
    The lowest-level language is probably ASM/machine code, as many people at least edit that regularly, and the highest-level would be LLMs. They are the shittiest way to program, yes, but technically you just enter instructions and the LLM outputs e.g. Python, which is then compiled to bytecode and run, similar to how e.g. Java works. And that’s the subjective part: many people (me included) don’t use LLMs as the only way to program, or only use them for convenience and some help. Therefore the highest-level language is probably either some drag-and-drop UI stuff (like Scratch) or Python/JS, and the lowest-level is either C/C++ (because “no one uses ASM anyway”) or straight-up machine code.