I wonder how this changes if you adjust it for land size instead. As someone living in Singapore, I'd be surprised if it wasn't at least in the top 3.
Maybe I could find this believable in the US (I only stayed there for a few months and heard nothing good; data caps on broadband are wild), but not a chance in countries with stricter regulations and guidelines on what ISPs are allowed to do.
It does stand to reason that if they're dropping all in-house engine development, a lot of roles will be freed up. It's not great, and I'm personally not a fan of this consolidation of engines.
I do believe there is value in understanding the fundamentals of how the computer executes code by learning C, as it strikes a nice balance without going all the way down to assembly. I don't think I would be as good a programmer as I am today without having learnt C as my first language, but the way the school teaches it is important.
That said, that's in the context of a software engineer role with a CS degree. If you're just a regular developer writing web apps, or you plan on only ever using frameworks, then yeah, you probably don't need that kind of knowledge. Even then, I'd argue knowing these details would help you resolve issues with the framework if you ever run into them.
It doesn't necessarily mean you have to use C to build products, but it's certainly useful for getting a feel for how things work.
Indefinitely implies never being able to go back, which is much closer to your analogy though. Either way, it's clearly a clickbait headline.
Same here. Game Pass is a pretty good deal even at full price for AAA single-player games that you won't touch after a single playthrough. Plus, there are a lot of games I wouldn't have given a shot if I hadn't happened to have Game Pass at the time.
I started with 0 and dropped it within an hour because it was moving so slowly, and the plot wasn't very engaging because I didn't really know the characters yet.
I didn't touch the series for another year before playing Kiwami 1, which hooked me in all the way through the order I mentioned earlier.
0 is really good, but I don't think it would have hit as hard for me without having played at least Kiwami 1 first.
The other TF2 does have dedicated servers now though. Unofficial ones, but they exist and come with mods.
It was pretty buggy though; in my class, people's laptops got permanently locked into the browser, unable to close it after the exam. Sometimes it wouldn't even let you start the exam after launching the browser until you restarted the whole system.
Infinite Wealth was a very fun game, but it's honestly one of the worst stories they've written in the series, with plot lines that were incoherent with the situation they'd written themselves into. Not to mention that all the time spent on the Hawaii side of things ends in a very plain and unexciting wrap-up. Its only redeeming points are the numerous callbacks to the earlier games, which are lost if you haven't played them.
I highly recommend playing in a modified release order.
There's also Ishin, which is fine to play anytime since it's a spinoff, but it casts characters from the main games as historical figures, so there's a neat bit of irony if you know who the characters are in the first place. Playing it after 7 would be the best time.
And then if you still want more, Judgment and Lost Judgment take place in the same world and use the same brawler-style gameplay, but follow a detective instead.
There aren't really a lot of options out there. I can't say I agree with Samsung's policies, but their devices are pretty good compared to everyone else's. iPhones are, well... if you'd consider an iPhone, we wouldn't be having this conversation. Chinese brands generally have very problematic software, Pixels are pretty barebones unless you're into the AI stuff (Material 3 is also pretty ugly), and Sony is very expensive and fairly barebones too.
Well, for the current generation of consoles, they're both x86-64 CPUs with a single pool of GDDR6 memory shared across the CPU and GPU, so I'm not sure you have such a penalty anymore.
It's not that unified memory can't be built, but it's not the architecture of a PC, where peripheral cards communicate over the PCI bus, with significant penalties for touching RAM.
Are there any tests showing the difference in memory access of x86-64 CPUs with iGPUs compared to ARM chips?
Do you have any sources for this? I can't seem to find anything specific describing this behaviour. It's quite surprising to me, since the Xbox and PS5 use unified memory on x86-64, and it would be strange if it were extremely slow for such a use case.
Thanks for the links, they're really informative. That said, it doesn't seem entirely certain that the extra work done by the x86 arch incurs a comparatively huge difference in energy consumption. Granted, that isn't really the point of the article. I would love to hear from someone more well versed in CPU design on the impact of its memory model. The paper is more interesting with regard to performance, but I don't find it very conclusive since it's comparing ARM vs TSO on an ARM processor. It does link this paper, which seems more relevant to our discussion, but it's a shame that it's paywalled.
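For anyone following along, the difference being measured is roughly the classic message-passing litmus test below, sketched in C++ (my own illustration, not taken from either paper). Two threads share a payload and a flag, and neither requests any ordering:

```cpp
// Message-passing litmus test: what TSO gives you "for free".
// The writer publishes a payload, then raises a flag; the reader spins
// on the flag, then reads the payload. All operations are relaxed,
// i.e. no ordering is requested from the compiler or the CPU.
#include <atomic>
#include <cstdio>
#include <thread>

std::atomic<int> data{0};
std::atomic<int> flag{0};

int main() {
    for (int i = 0; i < 10000; ++i) {
        data.store(0, std::memory_order_relaxed);
        flag.store(0, std::memory_order_relaxed);

        std::thread writer([] {
            data.store(1, std::memory_order_relaxed); // (1) write payload
            flag.store(1, std::memory_order_relaxed); // (2) raise flag
        });
        std::thread reader([&] {
            while (flag.load(std::memory_order_relaxed) == 0) {} // (3) wait
            if (data.load(std::memory_order_relaxed) == 0)       // (4) stale read?
                std::printf("reordering observed on iteration %d\n", i);
        });
        writer.join();
        reader.join();
    }
}
```

On x86-64, TSO means the hardware keeps (1) before (2) and (3) before (4), so the stale read essentially never shows up (the compiler is still free to reorder relaxed operations, though). On a weakly ordered ARM core it can happen, and the portable fix is a release store at (2) paired with an acquire load at (3). That's the trade-off these papers weigh: ARM pays for barriers only where you ask for them, while x86 pays for TSO's ordering guarantees everywhere.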
Do x86 CPUs with iGPUs not already use unified memory? I'm not exactly sure what you mean, but are you referring to the overhead of copying data from CPU to GPU memory on discrete graphics cards when performing GPU calculations?
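If so, here's a minimal host-side sketch of what I mean, using the CUDA runtime API (my own example, assuming a discrete NVIDIA card and the CUDA toolkit; kernel launches omitted):

```cpp
// Contrast: explicit staging copies for a discrete GPU vs. a single
// managed/unified allocation visible to both CPU and GPU.
#include <cuda_runtime.h>
#include <cstdlib>

int main() {
    const size_t n = 1 << 20;
    const size_t bytes = n * sizeof(float);
    float* host = static_cast<float*>(std::malloc(bytes));

    // Discrete-card path: a separate VRAM allocation, with explicit
    // copies over PCIe before and after the (omitted) kernel launch.
    float* dev = nullptr;
    cudaMalloc(reinterpret_cast<void**>(&dev), bytes);
    cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice); // CPU -> GPU
    // ... kernel launch would go here ...
    cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost); // GPU -> CPU
    cudaFree(dev);

    // Unified/managed path: one allocation usable from both sides. On
    // hardware with a genuinely shared memory pool (consoles, iGPUs)
    // no bulk copy exists at all; on discrete cards the driver instead
    // migrates pages on demand.
    float* shared = nullptr;
    cudaMallocManaged(reinterpret_cast<void**>(&shared), bytes);
    // ... CPU writes, kernel launch, CPU reads, same pointer throughout ...
    cudaFree(shared);

    std::free(host);
    return 0;
}
```

On a console or an iGPU, the two cudaMemcpy calls in the first path are exactly the cost that disappears, which is why I'm not sure the "great penalties to touch RAM" point applies there.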
Their primary money makers are what's stopping them, I reckon. Apple's move to ARM worked because they already had a ton of experience building their own in-house processors for their mobile devices, and ARM licenses stock chip designs, making it easier for other companies to come up with their own custom chips, whereas there really isn't any equivalent for x86-64. There were some disagreements between Intel and AMD over patents on the x86 instruction set too.
Do you mind elaborating on what it is about the difference in their memory models that matters?
There's nothing stopping x86-64 processors from being power efficient. This article is pretty technical but does a really good job of explaining why that's the case: https://chipsandcheese.com/2024/03/27/why-x86-doesnt-need-to-die/
It's just that traditionally, Intel and AMD have earned most of their money from the server and enterprise sectors, where high performance matters more than super low power usage. And even then, AMD's Z1 Extreme gets within striking distance of the M3 at a similar power draw. It also helps that Apple is generally one node ahead.
They're still pretty good, at least here in Asia. The horror stories I hear of Asus support in the US are a night and day difference from what I experienced. Their Taiwan HQ needs to smash some sense into the US office and clean house.
Oh, as someone very familiar with the field, I perfectly understand why things have come to this point, and I honestly have no idea if there's any way things could stay the way they were before. I just find it worrying in different ways.