Programmer and sysadmin (DevOps?), wannabe polymath in tech, science and the mind. Neurodivergent, disabled, burned out, and close to throwing in the towel, but still liking ponies 🦄 and sometimes willing to discuss stuff.

  • 0 Posts
  • 278 Comments
Joined 2 years ago
Cake day: June 26th, 2023

  • TLDR: It’s a mess.

    Back in the day, I started migrating notepad stuff to Markdown on a Wiki. Then on a MediaWiki. Then DokuWiki. Then ZimWiki. Then Joplin. Then GitHub Pages and a self-hosted Jekyll.

    Each and every single one of them uses a slightly different flavor of Markdown. At this point, I have stuff spread over ALL OF THEM, much of it rotting away in “backups to migrate later”. 😮‍💨
    I’ve been considering “vibe coding” some converters…

    As for syncing… the Markdown part is easy: git.
    Working with a Markdown editor to update GH Pages was a good experience.
    Having ZimWiki auto-sync to git was good, but I didn’t find a decent compatible editor for Android.
    I switched to Joplin, lured by the built-in auto-sync options, but I kind of regret it now that it has a folder with thousands of files in it.

    Obsidian is not OSS itself, but it has an OSS plugin to sync to git.
    I’ve read that using Logseq alongside Obsidian should be possible… I was planning to test that setup, keeping Obsidian in charge of sync, possibly with GitHub/Jekyll and git-lfs for images and attachments.


    PS: Assuming one had working back-and-forth converters for the different Markdown flavors, and everything stored in git, then one could theoretically use git hooks to convert to/from whatever local flavor a particular editor uses.
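
    As a minimal sketch of that idea (assuming pandoc is installed, that its `gfm` and `markdown` formats are close enough to the flavors involved, and with the hook name and paths purely illustrative), a post-checkout hook could rewrite files into the local editor’s flavor:

    ```python
    #!/usr/bin/env python3
    # Hypothetical .git/hooks/post-checkout: convert checked-out Markdown into the
    # local editor's flavor with pandoc. Flavor names and paths are assumptions.
    import pathlib
    import subprocess

    REPO_FLAVOR = "gfm"        # flavor committed to git (assumed)
    LOCAL_FLAVOR = "markdown"  # pandoc's extended Markdown, as an example target

    for md_file in pathlib.Path(".").rglob("*.md"):
        converted = subprocess.run(
            ["pandoc", "--from", REPO_FLAVOR, "--to", LOCAL_FLAVOR, str(md_file)],
            capture_output=True, text=True, check=True,
        ).stdout
        md_file.write_text(converted, encoding="utf-8")
    ```

    A matching pre-commit hook would run the conversion in the other direction before anything lands in git. Anything one flavor can’t represent would still get lost, so it’s a best-effort bridge, not a guarantee.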



  • I was going to say that AI has a lot of implications for the online world that Mozilla was supposed to promote… but maybe you’re right, the AI genie is out of the bottle and there is little left to do about it. Its impact will be whatever it will be, no matter what people want to say about it.

    Not sure which “old Mozilla” you want: the 1998 one? The 2005 one? The 2015 one? It has changed a lot indeed, but it has kind of been Google’s anti-antitrust shield for 20+ years.



  • Is this on the same machine, or multiple machines?

    The typical/easy design for an outgoing proxy would be to set up the proxy on one machine, configure the client on another machine to connect to the proxy, and drop any packets from the client that aren’t targeted at the proxy.

    For a transparent proxy, all connections coming from a client could be rewritten via NAT to go to the proxy; the proxy can then decide which ones it can, or is willing to, handle (rough sketch below).

    If you try to fold this up into a single machine, I’d suggest using containers to keep things organized.
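
    As a rough sketch of the NAT part on the gateway/proxy box (this assumes an iptables-based setup rather than nftables; the interface name, client IP, and proxy port are placeholders):

    ```python
    #!/usr/bin/env python3
    # Rough sketch: redirect a client's outgoing HTTP to a local transparent proxy
    # and drop whatever else it tries to send out directly. Run on the gateway/proxy
    # machine; the interface, client IP, and port below are illustrative assumptions.
    import subprocess

    LAN_IF = "eth0"             # interface the client's traffic arrives on (assumed)
    CLIENT_IP = "192.168.1.50"  # the client machine (assumed)
    PROXY_PORT = "3128"         # e.g. Squid's default port

    def iptables(*args: str) -> None:
        subprocess.run(["iptables", *args], check=True)

    # Rewrite the client's outgoing HTTP connections to land on the local proxy.
    iptables("-t", "nat", "-A", "PREROUTING", "-i", LAN_IF, "-s", CLIENT_IP,
             "-p", "tcp", "--dport", "80", "-j", "REDIRECT", "--to-ports", PROXY_PORT)

    # Drop anything else from the client that would otherwise be forwarded out.
    iptables("-A", "FORWARD", "-i", LAN_IF, "-s", CLIENT_IP, "-j", "DROP")
    ```

    The same idea carries over to the single-machine/container case: each container gets its own virtual interface, and the host applies the redirect/drop rules to the client container’s traffic.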