Basically a deer with a human face. Despite probably being some sort of magical nature spirit, his interests are primarily in technology and politics and science fiction.

Spent many years on Reddit and then some time on kbin.social.

  • 0 Posts
  • 646 Comments
Joined 6 months ago
Cake day: March 3rd, 2024

  • Eh, there didn’t seem to be any sort of implied threat or imbalance of power in the little snippet presented here. The old ladies approached the soldiers and asked for a lift, and the soldiers seemed honestly apologetic that they had no room to provide one.

    It’s quite interesting seeing the “depoliticization” of the general Russian population having this effect; when the Ukrainians moved in, a surprising number seemed to just shrug and go “new management, I guess.” It will be interesting to see how the occupation goes if it’s long-term.






  • particularly for companies entrusted with vast amounts of sensitive personal information.

    I nodded along to most of your comment, but this struck a discordant, jarring note. Why particularly those companies? The CrowdStrike failure didn’t actually result in sensitive information being deleted or revealed; it just caused computers to shut down entirely. Throwing that in as an area of particular concern seems clickbaity.



  • Different countries have a variety of very different approaches to appointing judges, and some of those methods are not nearly as easy to corrupt as the American system.

    Americans are subject to a lot of cultural indoctrination about how their system is the “greatest democracy in the world,” the “leader of the free world,” and other such platitudes. It’s really not the case, though. America’s system is one of the earliest still around, and unfortunately that means it has a lot of problems that were corrected in democracies founded later on but remain embedded in America’s system.

    Doesn’t help that America has a somewhat problematic electorate as well.


  • Not necessarily. Curation can also be done by AIs, at least in part.

    As a concrete example, NVIDIA’s Nemotron-4 is a system specifically intended for generating “synthetic” training data for other LLMs. It consists of two separate LLMs: Nemotron-4 Instruct, which generates text, and Nemotron-4 Reward, which evaluates the outputs of Instruct to determine whether they’re good to train on.

    Humans can still be in that loop, but they don’t necessarily have to be, and the AI can assist them in that role so that curation isn’t a huge task.
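
    Just to make the shape of that loop concrete, here’s a minimal sketch (purely illustrative; generate_text and score_response are hypothetical stand-ins for an instruct model and a reward model, and the threshold is arbitrary, not anything from NVIDIA’s actual pipeline):

    ```python
    # Hypothetical generate-then-filter loop for building synthetic training data.
    # generate_text() and score_response() stand in for a generator LLM and a
    # reward LLM (in Nemotron-4's case, Instruct and Reward); they are not real APIs.

    KEEP_THRESHOLD = 0.7  # arbitrary cutoff for "good enough to train on"

    def build_synthetic_dataset(prompts, generate_text, score_response):
        """Generate candidate responses, keep only the ones the reward model rates highly."""
        kept = []
        for prompt in prompts:
            for response in generate_text(prompt, n_samples=4):        # generator LLM
                if score_response(prompt, response) >= KEEP_THRESHOLD:  # reward LLM
                    kept.append({"prompt": prompt, "response": response})
        return kept
    ```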


  • It means that even if AI is having more environmental impact right now, there’s no reason to say “you can’t improve it that much.” Maybe you can improve it. As I said previously, a lot of research is being done on exactly that: methods to train and run AIs much more cheaply than has been possible so far. I see developments along those lines being discussed all the time in AI forums such as /r/localllama.

    Much like with blockchains, though, it’s really popular to hate AI, and “they waste enormous amounts of electricity” is an easy way to justify that. So news of developments like these doesn’t spread easily.




  • The term “model collapse” gets brought up frequently to describe this, but it’s very commonly misunderstood. There actually isn’t a fundamental problem with training an AI on data that includes other AI outputs, as long as the training data is well curated to maintain its quality. That already needs to be done with non-AI-generated training data anyway, so it’s not really extra effort.

    The research paper that popularized the term “model collapse” used an unrealistically simplistic approach: it just recycled all of an AI’s output into the training set for subsequent generations of AI, without any quality control or additional training data mixed in.
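
    To illustrate the difference (a toy sketch only; the quality filter and the mixing step are invented for the example, not taken from the paper):

    ```python
    # Toy contrast between the paper's setup and curated training in practice.

    def next_gen_data_naive(model_outputs):
        # The "model collapse" paper's setup: recycle everything, no quality
        # control, no fresh non-AI data. Errors compound with each generation.
        return model_outputs

    def next_gen_data_curated(model_outputs, human_data, passes_quality_filter):
        # Curated approach: keep only outputs that pass a quality check and mix
        # them with non-AI-generated data so the distribution stays anchored.
        good_synthetic = [x for x in model_outputs if passes_quality_filter(x)]
        return human_data + good_synthetic
    ```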