
To summarize: new fancy code autocompletion tools will make useless junior devs even more useless, plus the usual amount of AI-buzzword doomsaying.


The few people who actually used DeX for doing stuff are now religious about it.
Well, you have to commit to a Bluetooth mouse and keyboard and an external display for it to be of any use.



Mine doesn’t. Seems like an aesthetic feature, since the pen is transparent.
If you can call CRISPR a known technology, then yes. Like calling a combustion engine a known technology to develop a new car.


Yup, because they are annoying, not because they do not work.
Pretty much every news site asks you to enable browser notifications.


No app needed. Tell me if it works on your phone.


Patreon app is literally their website. What does it do that the website does not? Notifications? Offline images? All of these can be done in your mobile browser.

Are you saying the country that created the first ever nuclear bomb does not have uranium deposits?

I hope it will pass trials faster than Vasalgel.

I am sure I can see multiple never before seen colors as scientific lasers obliterate my retinas and burn my optical nerves.


Ah, don’t worry, if you do fopen(file, "w") on Windows and forget to use the "wb" flag, it will silently replace all your \n with \r\n when you fwrite, and then you will spend half a day debugging your corrupted JPEG file, which totally never happened to me because I’m an experienced C++ developer who could never make such a novice mistake.


It depends on whether you are printing to a terminal or to a file (and yes, the terminal is also a file), and even then you can control the flushing behaviour with something like unbuffer.


To separate is to part, not to pert. Easy.


printf is superior and more concise, and snprintf is practically the only C string manipulation function that is not painful to use.
Try to print a 32-bit unsigned int as a hexadecimal number of exactly 8 digits using cout. You can do std::hex and std::setw(8) and std::setfill('0') and don’t forget std::dec afterwards, or you can just, you know, printf("%08x") like a sane person.
Just don’t forget to enable -Wformat (or -Werror=format to make it fatal); GCC and Clang pull it in with -Wall anyway.
C++23 now includes std::print, which gives you printf-style one-liners with type-checked std::format syntax, so the whole argument is over.


It’s way less expensive for state-sponsored hackers to blackmail your country’s officials into leaking backdoor keys than to try to break the unbreakable crypto with a nuclear-powered GPU farm.
As long as your byte consists of 8 bits.

Just put the GPU in its own separate ATX case; why bother with a PCIe slot?


Eh, who is still using paper books to learn programming languages? Every popular language has a website with online manuals.
Well, except C, because it’s crammed together with C++ on https://cplusplus.com/ and https://cppreference.com/
And socks just grow organically after 3 years of coding.
A real-world optical chip that you can actually buy is exciting. Still, it seems far from a consumer-grade optical CPU. It’s more like a microcontroller that you stick at the end of your 10 Gbit fiber-optic cable to receive processed optical data.
Memory is going to be a big problem, because any AI workload requires a ton of it, and replacing even a simple 16 GB DRAM chip with an all-optical equivalent means you are essentially creating 16 GB of L1 CPU cache, which would be like 100 server CPUs stacked together, used only for their cache memory. And if you are using conventional DRAM instead, you need to introduce an optical-to-electrical converter, which will be the speed bottleneck of your system, and probably expensive.