Anyone with the knowledge to self-host will quickly discover the 3-2-1 rule. If they choose not to follow it, that’s on them, but data loss won’t be from ignorance.
Borg backup to BorgBase is not very expensive, and Borg encrypts the data client-side; the BorgBase vault is encrypted as well.
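For anyone new to Borg, a minimal sketch of the workflow, assuming a BorgBase repo URL and a local data path (both placeholders here):

```
# Initialize the repo once with client-side encryption (the passphrase never leaves your machine).
borg init --encryption=repokey-blake2 ssh://xxxx@xxxx.repo.borgbase.com/./repo

# Create a dated archive of the data directory.
borg create --compression zstd \
    ssh://xxxx@xxxx.repo.borgbase.com/./repo::'{hostname}-{now}' \
    /srv/docker-data

# Prune old archives to a simple retention policy.
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 \
    ssh://xxxx@xxxx.repo.borgbase.com/./repo
```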
I have an atomic variant of Fedora 40 (Aurora) and it just works on an Intel CPU with integrated graphics. I have a USB-C dongle with HDMI out and it just works when I plug it in.
I also tried it on my Steam Deck dock the other day and it worked without issue.
FreshRSS if you want a self-hosted option
Conversely, I have a Dell XPS from 2018 that runs very well with Fedora Atomic (KDE). I upgraded the SSD and Wi-Fi card and replaced the battery. It should easily last me another 5 years.
Any good resources on how to learn to use Toolbox properly?
Does anyone know if Timeshift has any use with Fedora Atomic distros?
I tried to find this on DDG but also had trouble, so I dug it out of my docker compose.
Use this docker container:
prodrigestivill/postgres-backup-local
(I have one of these for every docker compose stack/app)
It connects to your Postgres instance and runs the pg_dump command on a schedule you set, with retention (you choose how many dumps to keep).
The output then goes to whatever folder you want.
So I have a main folder called docker data; this folder is backed up by borgmatic.
Inside I have a folder per app, like authentik.
In that I have folders like data, database, db-bak, etc.
The Postgres data would be in database, and the output of the above dump would go in the db-bak folder.
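For reference, a rough sketch of what that service looks like in a compose file; the service names, credentials, and paths are placeholders, and the env var names are from memory of the image’s README, so double-check them before use:

```
  db-bak:
    image: prodrigestivill/postgres-backup-local
    restart: unless-stopped
    depends_on:
      - database
    volumes:
      # Dumps land in the app's db-bak folder described above
      - ./db-bak:/backups
    environment:
      - POSTGRES_HOST=database
      - POSTGRES_DB=authentik
      - POSTGRES_USER=authentik
      - POSTGRES_PASSWORD=changeme
      - SCHEDULE=@hourly
      - BACKUP_KEEP_DAYS=7
```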
So if I need to recover something, the first step is to just copy back the whole app folder and see if that works. If not, I can grab a database dump and restore it into the database and see if that works. If that fails, I can pull a db dump from any of my previous backups until I find one that works.
I don’t shut down or stop the app container to back up the database.
In addition to hourly Borg backups kept for 24 hours, I have ZFS snapshots every 5 minutes kept for an hour, and the pg_dump happens every hour as well. For a homelab this is probably more than sufficient.
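If I do have to fall back to a dump, the restore is roughly this; the container, user, and database names are placeholders, and the dump path depends on how the backup image organizes its output (gzipped SQL in my case):

```
# Stream a gzipped dump into psql inside the database container.
zcat ./db-bak/last/authentik-latest.sql.gz | docker exec -i authentik-db psql -U authentik -d authentik
```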
Fair enough. I primarily use NFS for Linux-to-Linux server communication and heavy file access.
SMB is mostly for moving files around occasionally.
Not sure if trying to run a database over SMB is a good idea, but I do it over NFS all the time.
Regardless, it doesn’t have to be exclusive. OP can change it up depending on the application.
You can use both without issue. I use NFS to share between two Linux servers (Unraid and Proxmox/Docker), and then some of those same folders are shared via SMB for a Windows desktop or Linux laptop.
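As a rough sketch of sharing the same folder both ways (paths, subnets, share names, and users are just examples):

```
# /etc/exports -- NFS for the Linux-to-Linux side
/srv/shared  192.168.1.0/24(rw,sync,no_subtree_check)

# /etc/samba/smb.conf -- SMB share of the same folder for desktops/laptops
[shared]
    path = /srv/shared
    read only = no
    valid users = myuser
```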
The Aurora dev edition is the Bazzite equivalent for devs. Containers are built right into the terminal (Ptyxis).
Self-hosted AI seems like an intriguing option for those capable of running it. Naturally this will always be more complex than paying someone else to host it for you, but it seems like that’s the only way if you care about privacy.
Use the Multi-Account Containers extension for Firefox and keep all your Google stuff in one container, banks in another, social media in another, etc.
https://addons.mozilla.org/en-US/firefox/addon/multi-account-containers/
Do you have a link that talks about this? What is missing?
Do you want to have 2FA keys on all your devices? Doesn’t that defeat the purpose?
Throw away your current SSH client and get
Actually, yes. Fedora Atomic has a system called Toolbox that uses Podman to encapsulate apps in containers. Flatpak also provides sandboxed containers for desktop apps.
The idea is to keep the OS and apps separate as much as possible, for both security and stability.
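A minimal example of that workflow (container, package, and app names are just illustrative):

```
# Create a mutable Fedora container and drop into it.
toolbox create dev
toolbox enter dev

# Inside the toolbox, dnf works as usual without touching the immutable host image.
sudo dnf install gcc make

# GUI apps come from Flatpak on the host instead.
flatpak install flathub org.mozilla.firefox
```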
I don’t think XPipe would work; it needs too many permissions.
Something like Seafile would work, better than overlaying it I guess, but it still isn’t part of a package manager with easy auto-updates etc. like it would be if the devs published it as a Flatpak.
At the end of the day it’s a lot more work than the promise of opening Discover, searching for an app, and hitting install.
Yeah, exactly this. I get optimizing for speed, but there should at least be an option afterwards to check file integrity. Feels like a crucial feature for a critical system.
Is it still a drop-in replacement for Gitea? I’ve been meaning to switch.