Sorry guys, this is my experimental atheist AI. I never gave it any examples of christians being oppressed, so it kinda spits out gibberish when it sees any.
Rust is already obsolete, compared to Stingpie’s excellent assembly language, paired with object oriented programming!
This is the SEALPOOP specification:
Fahrenhaters are always like, “nooo!! 40 degrees is so hot!!” Meanwhile, the Fahrenchad’s resting body temperature is nearly 2.5 times hotter. All Fahrenhaters would die at that temperature.
Which one is cooler? Cause I’m that one.
That is so kind of you to say. Thank you!
I deliberately create characters which have an interesting dynamic with other players’ characters.
That either tells you nothing about me, or everything about me.
You have reinvented the typewriter. Ever since I got one, I’ve been amassing a hoard of surrealist writing.
Recursion makes it cheaper to run in the dev’s mind, but more expensive to run on the computer. A subroutine call is always slower than a simple jump.
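A quick illustration in Python (toy functions of my own, not from any particular codebase): the recursive version reads cleanly but pays for a stack frame per call, while the loop does the same work with plain jumps.

```python
import timeit

def sum_recursive(n: int) -> int:
    # Easy to reason about, but every call pushes a new stack frame.
    if n == 0:
        return 0
    return n + sum_recursive(n - 1)

def sum_iterative(n: int) -> int:
    # Same result as above, but the loop runs as simple jumps in the bytecode.
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

# The recursive version is consistently slower on CPython.
print(timeit.timeit(lambda: sum_recursive(500), number=1000))
print(timeit.timeit(lambda: sum_iterative(500), number=1000))
```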
This always cracks me up, because it’s a perfect example of a snake eating its own tail. “Based” was originally just a shortened way of saying “based in reality” or “based in fact”, but new people didn’t get the original context, so it just became its own word. Then, the uninitiated started making the “Based? Based on what?” joke, completely oblivious to the original meaning.
There are bindings in Java and C++, but Python is the industry standard for AI. The libraries for machine learning are actually written in C++, but use Python language bindings. Python doesn’t tend to slow things down, since machine learning is GPU-bound anyway. There are also library-specific languages that urge the user to write Pythonic code that can be compiled into C++.
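For instance, here’s a minimal sketch using PyTorch’s TorchScript (my choice of example; the comment above doesn’t name a specific library): a restricted subset of Python gets compiled to a graph that the C++ runtime executes.

```python
import torch

@torch.jit.script
def fused_step(x: torch.Tensor, w: torch.Tensor) -> torch.Tensor:
    # TorchScript compiles this Python-subset function into a graph
    # executed by the C++ runtime, cutting the interpreter out of the loop.
    return torch.relu(x @ w).sum()

# Called like any ordinary Python function.
print(fused_step(torch.randn(8, 8), torch.randn(8, 8)))
```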
I completely agree that it’s a stupid way of doing things, but it is how OpenAI reduced the vocab size of GPT-2 and GPT-3. As far as I know (I have only read the comments in the source code), the conversion is done as a preprocessing step. Here’s the code to GPT-2: https://github.com/openai/gpt-2/blob/master/src/encoder.py I did apparently make a mistake, as the vocab reduction is done through a lookup table instead of a simple mod.
Can’t find the exact source (I’m on mobile right now), but the code for the GPT-2 encoder uses a UTF-8 byte to Unicode lookup table to shrink the vocab size. https://github.com/openai/gpt-2/blob/master/src/encoder.py
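For reference, the lookup table in that encoder.py is built roughly like this (paraphrased from memory of the linked file, so treat the details as approximate):

```python
def bytes_to_unicode():
    # Map all 256 byte values to printable Unicode characters, so that
    # byte-level BPE can treat arbitrary UTF-8 text as plain strings.
    bs = (list(range(ord("!"), ord("~") + 1))
          + list(range(ord("¡"), ord("¬") + 1))
          + list(range(ord("®"), ord("ÿ") + 1)))
    cs = bs[:]
    n = 0
    for b in range(2 ** 8):
        if b not in bs:
            # Bytes with no printable character get remapped above 255.
            bs.append(b)
            cs.append(2 ** 8 + n)
            n += 1
    return dict(zip(bs, [chr(c) for c in cs]))

table = bytes_to_unicode()
print(table[ord("A")], table[0])  # 'A' stays put; byte 0 gets a stand-in
```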
This might be happening because of the ‘elegant’ (incredibly hacky) way OpenAI encodes multiple languages into their models. Instead of using all character sets, they use a modulo operator on each character, so that all Unicode characters are represented by a small range of values. On the back end, it somehow detects which language is being spoken, and uses that character set for the response. Seeing as the last line seems to be the same mathematical expression as what you asked, my guess is that your equation just happened to perfectly match some sentence that would make sense in the weird language.
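To make that concrete, here’s a hypothetical sketch of the modulo scheme described above (note the correction in my other comment: the real encoder uses a lookup table, not a mod):

```python
def fold_codepoint(ch: str, vocab_range: int = 256) -> int:
    # Hypothetical: collapse any Unicode code point into a small range.
    # Distinct characters collide, which is why the back end would have
    # to guess the language before decoding a response.
    return ord(ch) % vocab_range

# 'A' (65) and 'Ł' (321) collide onto the same value.
print(fold_codepoint("A"), fold_codepoint("Ł"), fold_codepoint("你"))
```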
I don’t know about that guy, but I used to have a speech impediment that meant I couldn’t pronounce the letter R. I went to several speech therapists, so I started to enunciate every other letter, but that made people think I had a British accent. Anyway, I eventually learned how to say R, so now I have a speech impediment that makes me sound like a British person doing a fake American accent.
Of course it’s written in Lisp. Though I’d expect it to be more like EURISKO or Cyc, instead of a more conversational AI.
Yeah, mine. EYYYYOOOOO! (I may or may not have ED)
Oh, so the goal is to get the certain doom?
“An anaconda that is sprung?” What does that mean?
You can’t choose where you grow up. :(