Kind of did that already, and for less money.
On the bright side, daggers are cheap… so if you really hated them that much… prison is like a vacation where you get to read a lot of books, right…?
TLDR: It’s a mess.
Back in the day, I started migrating notepad stuff to Markdown on a Wiki. Then on a MediaWiki. Then DokuWiki. Then ZimWiki. Then Joplin. Then GitHub Pages and a self-hosted Jekyll.
Each, single, one, of, them, uses a slightly different flavor of Markdown. At this point, I have stuff spread over ALL OF THEM, much of it rotting away in “backups to migrate later”. 😮💨
I’ve been considering “vibe coding” some converters…
As for syncing… the Markdown part is easy: git.
Working with a Markdown editor to update GH Pages was a good experience.
Having ZimWiki auto-sync to git was good, but I didn’t find a decent compatible editor for Android.
I switched to Joplin, lured by the built-in auto-sync options, but kind of regret it now that it has a folder with thousands of files in it.
Obsidian is not OSS itself, but has an OSS plugin to sync to git.
I’ve read that using Logseq alongside Obsidian should be possible… and was planning to test that setup, keeping Obsidian in charge of sync. Possibly with GitHub/Jekyll, and git-lfs for images and attachments.
PS: assuming one could have working back-and-forth converters for the different Markdown flavors, and everything stored in git, then one could theoretically use git hooks to convert to/from whatever local version used by a particular editor.
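The hook idea could look something like this `post-checkout` sketch. To be clear, `md-convert` is a hypothetical converter standing in for whatever tool ends up doing the flavor translation (pandoc already covers some flavors, e.g. `pandoc -f gfm -t markdown_mmd`):

```shell
#!/bin/sh
# .git/hooks/post-checkout — after every checkout, rewrite the canonical
# Markdown into the local editor's flavor. A matching pre-commit hook
# would convert back before anything lands in git.
# NOTE: "md-convert" is a hypothetical placeholder, not a real tool.
old_head="$1"; new_head="$2"
git diff --name-only "$old_head" "$new_head" -- '*.md' |
while IFS= read -r f; do
    md-convert --from canonical --to local-flavor "$f" > "$f.tmp" &&
        mv "$f.tmp" "$f"
done
```

The same pattern in reverse (`pre-commit` converting local → canonical) would keep the repo itself in one flavor, so every editor only ever sees its own dialect.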
Yeah, I don’t think I like llamafile. Reusing some weights between models, with smaller updates, sounds like a better idea.
What I’d like to see is a unified WebNN support, for CPU, GPU, and NPU: WebNN Overview
(Not to pull rank, but my mail profile can be tracked to Netscape Navigator, across multiple OSs 😁)
I was going to say that AI has a lot of implications in the online world that Mozilla was supposed to promote… but maybe you’re right, the AI genie is out of the bottle and there is little left to do about it. Its impact will be whatever it will be, no matter what people want to say about it.
Not sure which “old Mozilla” you want, the 1998 one? The 2005 one? The 2015 one? It has changed a lot indeed, but it has kind of been Google’s anti-antitrust shield for 20+ years.
From those projects, which ones are out of scope for the Mozilla Manifesto?
The African nuclear reactors might need more explaining, but the rest seem to be right on the goals:
Is there more information about that nuclear reactor?
Pollution related to computing and coding seems relevant to the mission statement.
What would be the proper advocacy groups? Would you have ever heard of Mozilla without some advocacy group?
Keeping in mind it’s an advertorial for their apps…
Not sure what they mean by “weird characters”, but chatbots add zero-width Unicode characters as a watermarking mechanism, and LLMs output their own tags to mark different sections.
(the “stochastic parrots” expression is already a contradiction, but whatever)
🟢 Exposure
🟠 Feeding AIs
🟣 Feeding editors
Seems like it hasn’t been updated for 2 years. Is it abandoned?
Depends… I’ve been running 2.99 for a while now 😄
Oooooh, can’t wait to see the new features!
[what? there are no new features…?]
Can’t wait to see the new splash screen! 😇
You can run a stress test, and compare your desired response times with the resource usage on the server side.
https://en.wikipedia.org/wiki/ApacheBench
Take into account all the requests needed to load a website, and the fact that:
Loading some content in 100ms, then loading more in the background, is a reasonable compromise. You may want a very quick response time for the first few requests, then put the rest on a possibly slower server, or running at a lower priority.
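A minimal ApacheBench run looks like this (the URL and numbers are placeholders; pick a concurrency close to your expected real load):

```shell
# 1000 requests total, 50 in parallel, with HTTP keep-alive.
# Key numbers in the report: "Requests per second", "Time per request",
# and the percentile table at the end (how slow the worst requests got).
ab -n 1000 -c 50 -k https://example.com/
```

While it runs, watch CPU/memory/IO on the server side (`top`, `vmstat`, etc.) to see what resource gives out first as you raise `-c`.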
And who said anything about being surprised? Got my offline copies of all of them already, but can get another one.
6 pages of apps… nice. 😅
Now, seriously. Android 14 on a Samsung phone lets me select “location: only while using the app”. I close apps when not using them, and limit notifications so they don’t get auto-started at random. Unused app detection puts them in “deep sleep”, which doesn’t allow them to run at all, and strips them of all permissions.
This scanner would be more useful if it also checked which apps have the location permission enabled and are frequently used.
“Your phone is not gay enough… here are some themes and backgrounds to make it more fun!”
(probably not PC, but that could be a fun app)
TIL Wikipedia has an app. My extras to that list:
Is this on the same machine, or multiple machines?
The typical/easy design for an outgoing proxy would be to set the proxy on one machine, configure the client on another machine to connect to the proxy, and drop any packets from the client that aren’t targeted at the proxy.
For a transparent proxy, all connections coming from a client could be rewritten via NAT to go to the proxy, then the proxy can decide which ones it can handle or is willing to.
If you try to fold this up into a single machine, I’d suggest using containers to keep things organized.
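Both designs can be sketched with iptables on the gateway. All addresses and ports below are example values (clients on 10.0.0.0/24, proxy at 10.0.0.2:3128), not a drop-in config:

```shell
# Transparent variant: rewrite clients' outgoing HTTP so it lands on the
# proxy instead; the proxy then decides which connections it handles.
iptables -t nat -A PREROUTING -s 10.0.0.0/24 -p tcp --dport 80 \
  -j DNAT --to-destination 10.0.0.2:3128

# Explicit-proxy variant: allow a client to reach only the proxy port,
# and drop everything else it tries to send out.
iptables -A FORWARD -s 10.0.0.10 -d 10.0.0.2 -p tcp --dport 3128 -j ACCEPT
iptables -A FORWARD -s 10.0.0.10 -j DROP
```

On a single machine with containers, the same rules apply on the bridge between the client container and the proxy container.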
Check IzzyOnDroid’s photo section; there are many feature-centered tools, but I’m afraid no “one-size-fits-all” ones.
At this point, why not just root it? There are public exploits for Android 6 vulnerabilities, it’s not like you’d lose any security.