Ensuring that the system complies with industry standards and integrating security measures for cross-technology communication are also necessary steps, Gao adds.
This is absolutely a huge factor that could make or break the technology if they don’t get it right. It could be the single most important part of the tech.
2.4 GHz is super saturated. The last thing we need is long-range, large-footprint signals in already saturated spectrum. This technology should be deployed either not at all or very carefully, to prevent widespread interference with existing Wi-Fi devices. The spectrum is already on the verge of being complete trash. Please do not deploy more stuff on 2.4 GHz spanning an entire “smart city.”
I actually ditched 2.4 GHz Wi-Fi on my home network entirely for this exact reason. If a device is not compatible with 5 GHz Wi-Fi, it doesn’t get purchased.
It doesn’t just benefit you. You’re benefiting the current users of that spectrum who, for one reason or another, might not be able to switch.
I suspect most users, though, couldn’t tell you what frequency their network uses, let alone their devices on it.
Anyone with a NAS will immediately notice that they are on 2.4GHz because it will take several times longer to transfer files.
I think users who know what a NAS is probably know that information already. But true, yes!
Some of us know what a NAS is but aren’t fortunate enough to afford one.
Do you live in a high density urban environment?
Because if so, that totally makes sense, and the other benefit of 5 GHz/6 GHz, not traveling far outside your apartment or condo walls, is pretty nifty as well.
But if you live in a house in the suburbs, man, that is commitment well beyond necessity or convenience. Not saying it’s a bad choice per se, it just seems unnecessarily burdensome IMO.
I live in a single family house, but the area has quite a few single family houses packed pretty close together. So there’s still a lot of traffic on 2.4 GHz.
In my experience, a VR setup with Vive body trackers saturates the 2.4 GHz band really fast, so there are still reasons to swap in the suburbs, but they’re more niche.
Source: my PC is too far from the router for wired, so it uses Wi-Fi. I had to switch to 5 GHz because my internet would drop out on 2.4 GHz whenever I played VRChat.
Radio shouldn’t be used when it’s avoidable. It’s for emergencies, aviation, hiking, and maybe short-range communication for convenience. Phones, yes.
But providing internet connectivity via radio when you can lay cable is just stupid.
I mostly agree with you. I find it really weird that I live in a world where all my Internet runs over 5G cellular for political and social reasons, not technical ones. Due to the monopoly on the cables, it’s actually much cheaper here to buy 5G home internet. It’s unnecessarily complicated, choosing a shared medium for no good reason. It’s just politics.
In case you’re not from the States: here there’s a monopoly on Internet service pretty much everywhere.
With my 5G I have unlimited data, and it’s 300 Mbps down / 44 Mbps up on a good day. It’s perfectly serviceable if you can live with increased latency.
Sounds like they basically crafted special messages that are nonsense at 2.4 GHz but smooth out into a LoRa message on a much, much lower frequency band (<1 GHz).
It’s LoRa on 2.4 GHz.
It’s just that chirp signals are easy to decode from a lot of noise.
And they don’t really affect most other modulation techniques. I think you can even have multiple CSS-coded signals on the same frequency, as long as they’re configured slightly differently.
LoRa is incredibly resilient.
It’s just really, really slow.
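That resilience is easy to demonstrate. Here's a toy baseband sketch (not real LoRa: made-up parameters, spreading factor 7, no RF, no sync or coding): dechirping collapses each chirp symbol into a single FFT bin, so the symbol survives noise stronger than the signal itself.

```python
import numpy as np

# Toy chirp-spread-spectrum (CSS) sketch, loosely LoRa-style.
SF = 7
N = 2 ** SF                      # samples per symbol
k = np.arange(N)

def chirp(symbol):
    # Up-chirp whose starting frequency offset encodes the symbol value.
    return np.exp(2j * np.pi * (k * k / (2 * N) + symbol * k / N))

def decode(rx):
    # Dechirp: multiply by the conjugate base chirp, then FFT.
    # The symbol shows up as the dominant FFT bin.
    dechirped = rx * np.conj(chirp(0))
    return int(np.argmax(np.abs(np.fft.fft(dechirped))))

rng = np.random.default_rng(0)
tx_symbol = 42
# Noise power is ~8x the signal power (about -9 dB SNR per sample).
noise = 2.0 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
rx = chirp(tx_symbol) + noise

print(decode(rx))  # recovers 42 despite noise well above the signal
```

The FFT over N samples gives roughly 10·log10(N) ≈ 21 dB of processing gain here, which is why the peak still stands out; real LoRa pushes this much further with larger spreading factors, at the cost of being really slow.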
I don’t think it’s “just” LoRa on 2.4 GHz, because if it were, existing LoRa devices wouldn’t be able to decode the signals off the shelf, as the article claims they can. From the perspective of the receiver, the messages must “appear” to be in a LoRa band, right?
How do you make a device whose hardware operates in one frequency band emulate messages in a different band? I think that’s the nature of this research.
And like, we already know how to do that in the general sense. For all intents and purposes, that’s what AM radio does. Hacking a specific piece of consumer hardware to do it entirely on the software side is what makes it a research paper.
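The AM-radio analogy can be sketched in a few lines: pick the transmitted samples in software so that the *envelope* of a fast carrier carries a much slower message, and a receiver that only looks at the envelope, never the carrier band, recovers it. All rates and thresholds below are illustrative; nothing here models actual Wi-Fi or LoRa hardware.

```python
import numpy as np

fs = 100_000            # sample rate, Hz
fc = 10_000             # carrier, Hz ("the band the hardware lives in")
bit_rate = 100          # Hz: the much slower "emulated" signaling
bits = [1, 0, 1, 1, 0, 0, 1, 0]

# Software-side crafting: the bit pattern becomes the carrier's envelope.
samples_per_bit = fs // bit_rate
t = np.arange(len(bits) * samples_per_bit) / fs
envelope = np.repeat(bits, samples_per_bit).astype(float)
tx = envelope * np.sin(2 * np.pi * fc * t)   # on-off keyed carrier

# Crude envelope detector: rectify, then average over a few carrier cycles.
window = (fs // fc) * 4
detected = np.convolve(np.abs(tx), np.ones(window) / window, mode="same")

# Sample the middle of each bit period and threshold.
mid = np.arange(len(bits)) * samples_per_bit + samples_per_bit // 2
rx_bits = (detected[mid] > 0.25).astype(int).tolist()
print(rx_bits)  # [1, 0, 1, 1, 0, 0, 1, 0]
```

The receiver never needs to demodulate the 10 kHz carrier itself, only its slow envelope, which is the general trick: craft payloads in one band so that some property of the emitted waveform looks like a valid signal to a receiver built for something else.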