2.4GHz WiFi is not suitable for two big reasons: interference and low bandwidth. In any suburban or city environment, and sometimes even in rural ones, 2.4GHz WiFi will be congested with other networks, microwaves, and other appliances, causing massive speed degradation or fluctuations. The range of 2.4GHz is just too large for all the equipment that uses it in today’s world. In my previous apartment complex, for example, my phone could see 35 distinct 2.4GHz WiFi networks, while at most 3 can operate without interfering with each other. In that same building I could only see 13 5GHz networks. Which brings me to the second issue: bandwidth.

On 2.4GHz, at least here in the US, only channels 1, 6, and 11 will not interfere with each other. If anyone puts their network between these three channels, it knocks out both the one below and the one above; channel 3, for example, would interfere with both channels 1 and 6. By going up to 5GHz you get many more free channels, fewer networks competing for those channels, and wider channels allowing for much higher throughput. 2.4GHz allows 40MHz-wide channels, which in isolation would offer ~400Mbps, but you will never see that in the real world.
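The 1/6/11 rule is easy to verify: 2.4GHz channel centers sit 5MHz apart (channel n is at 2407 + 5n MHz), while a transmission occupies roughly 22MHz, so only channels at least five numbers apart stay clear of each other. A quick sketch (the 22MHz occupied width is the classic 802.11b figure; 20MHz OFDM channels behave similarly):

```python
# 2.4 GHz Wi-Fi: channel n is centered at 2407 + 5*n MHz (US channels 1-11).
def center_mhz(ch):
    return 2407 + 5 * ch

def overlaps(ch_a, ch_b, width_mhz=22):
    # Channels interfere when their ~22 MHz occupied bands intersect,
    # i.e. when the centers are closer together than one channel width.
    return abs(center_mhz(ch_a) - center_mhz(ch_b)) < width_mhz

print(overlaps(3, 1), overlaps(3, 6))    # channel 3 hits both 1 and 6
print(overlaps(1, 6), overlaps(6, 11))   # 1/6/11 stay clear of each other
```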

Personally, I think OEMs should just stop including it, or at least have it disabled by default and only enable it in an “advanced settings” area.

Edit: I am actually really surprised at how unpopular this opinion appears to be.

  • dosse91@lemmy.trippy.pizza

    The problem with 5GHz is that it doesn’t go through walls very well compared to 2.4GHz, resulting in APs having less range (or having to use several times more power).
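Even before walls enter the picture, the physics is against 5GHz: free-space path loss grows with frequency, so moving from 2.4GHz to 5GHz costs about 6.4dB at any given distance, on top of the worse wall penetration. A quick sketch of that calculation:

```python
import math

def fspl_db(distance_m, freq_hz):
    # Free-space path loss: 20*log10(4*pi*d*f/c), in dB.
    c = 299_792_458.0
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# Only the frequency term differs, so the gap is the same at any distance.
delta_db = fspl_db(10, 5.0e9) - fspl_db(10, 2.4e9)
print(f"extra free-space loss at 5 GHz: {delta_db:.1f} dB")
```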

    • shortwavesurfer@monero.townOP

      The max power a 5 gigahertz access point puts out is 1 watt, where the max on 2.4 gigahertz is 0.3 watts. You are right, though: you do have to be more careful about centrally locating the access point in your home in order to get the best performance from it, because otherwise one side will have good WiFi and the other side will have weak WiFi or none at all.
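Taking the quoted figures at face value (regulatory limits differ by country and band, so treat them as ballpark numbers), the extra transmit power buys about 5dB, slightly less than the ~6.4dB of extra free-space loss 5GHz suffers at any given distance:

```python
import math

# dB gained by transmitting at 1 W instead of 0.3 W (the figures quoted above).
gain_db = 10 * math.log10(1.0 / 0.3)
print(f"power advantage: {gain_db:.1f} dB")
```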

      Edit: Another benefit of that is that if somebody wants to crack your Wi-Fi network, they have to be physically closer to your house to do so. So, on like 60 gigahertz, where the signal doesn’t leave the room you’re in, it’s basically as secure as Ethernet, because an intruder would have to break into your house to crack your Wi-Fi network.

        • shortwavesurfer@monero.townOP

          That is a fair point, and mobile devices are going to be hardest hit by that, since they have such small batteries, but laptops and desktops and stuff would be just fine since they are constantly connected to a power source, and can use a card with higher transmit power.

  • GarlicToast@programming.dev

    You sound like a US citizen. There are many places in the world where walls are made of concrete, and 5GHz doesn’t penetrate concrete.

    In such cases, the only way to get 5GHz into every room is to run cat5 cable in the walls and place an AP in each room.

    Running a cable through a concrete wall requires a conduit that was placed there when the house was built! And in many cases, the conduits that exist are too narrow for cat5 and are already in use anyway.

    So to fulfill your idea and still have WiFi, we would need to raze whole cities to the ground and rebuild them.

    Unless you are footing the bill, and take care of the CO2 emissions, just learn to disable 2.4GHz on your router.

    • Bizarroland@kbin.social

      CAT5 is essentially dead. It’s highly recommended to use cat6/6a as a minimum, or cat8. The world is beginning to switch to multi-gig ethernet, and CAT5 is simply insufficient for that.

      Yes, it will work at gigabit speeds, and most things you do will not require more than gigabit, but who knows what we will be running in 10 years, and cat6 can handle 10 gig over a pretty good distance, which should be sufficient until it needs to be completely replaced.

      That being said, unless you are currently running a multi gig ethernet setup and are running into bandwidth limitations on CAT5 or cat5e, there is no need to pull and replace what is already there. This advice is for new deployments.

      • GarlicToast@programming.dev

        I agree with the sentiment, but I think cat5 is enough for at home deployment. My edge device isn’t using 1Gb now, and it won’t use 10 in ten years. Mostly because it may be cheaper to replace when needed than to deploy for future proofing.

        For offices and such I agree, as the disruption of work for a few days may cost more than future proofing the net.

    • shortwavesurfer@monero.townOP

      What are your walls made of, lead? Mine are about a foot of thick brick, and the signal still gets through.

      • 👍Maximum Derek👍@discuss.tchncs.de

        Don’t know about OP, but my 1950s home is a half inch of plaster over chicken wire over wooden lattice. My options are 2.4GHz or ethernet, and ethernet for phones is problematic.

        • OsaErisXero@kbin.run

          That chicken wire could hardly have been better designed to absorb 5GHz signal, and it is death to it. Literally any other material would be fine up to 3 rooms away, depending on the noise floor in the space. 6GHz /might/ be able to punch through, depending on the width of the gaps between the wires, and might be worth exploring in your case.

          • thejml@lemm.ee

            Chicken wire in walls came WAY before WiFi. It’s used for plaster in much the same way rebar is used for concrete.

            • OsaErisXero@kbin.run

              No, I know, but my point was that if you were designing a wall material to block 5GHz, you would end up with plaster on wire mesh. It couldn’t have been better if it were done on purpose.

              • Bizarroland@kbin.social

                I bet that could be disabled if you somehow removed any path to ground from that chicken wire.

                My guess is there are a few conductive points that are attached to materials that can dissipate electrical energy, which would turn the chicken wire into a faraday cage.

                Without those conductive points, it would not function as a faraday cage or at least not well enough to significantly attenuate Wi-Fi.

          • 👍Maximum Derek👍@discuss.tchncs.de

            LOL, yeah. My wife and I like to call it our personal faraday cage. If we didn’t have windows in every room I’m guessing even our cell phones would struggle.

      • YMS@kbin.social

        There are bricks of various kinds, and they can very well be challenging for Wifi. Concrete is even harder, and if you have reinforced concrete, good luck.

  • r00ty@kbin.life

    Here’s the thing: there are still plenty of devices that only have 2.4GHz radios, and some cheaper stuff made today still has only 2.4GHz, so you’d just cut a load of devices out of working straight away. This kind of thing needs to be done slowly. 3G was very different, because phone makers generally always want the more modern technology, and phones that don’t have radios capable of 4G or better really are just rare now.

    But there’s also just no reason to. Having 2.4GHz available doesn’t hurt you if you’re not using it. Any chipset with 5GHz doesn’t cost more to also support 2.4; they’re pretty much all single-chip solutions these days, and the aerial is usually just a coil on the board somewhere. If your device works on 5GHz, it will use 5GHz.

    I’d also argue that in real terms 5GHz isn’t much better than 2.4GHz in terms of channel space. In places that need to respect DFS rules, you generally only get one 80MHz channel that will definitely work, and if you’re using 802.11ax, 80MHz is really the minimum you want to get even remotely close to the advertised rate. Everything else useful is either DFS or limited power (at least here in the UK, and I don’t recall seeing the limited-power channel as an option). I’ve generally set up two WiFi APs in my house: one on the only non-DFS channel, and the other on a DFS channel. That way, if the DFS channel gets knocked out, there’s a fallback to the already congested “main” 5GHz channel.

    I think the main point is, why remove something that doesn’t really affect you but may well affect others?

  • krellor@kbin.earth

    For residential space, sure. For campus deployments, 2.4 is really helpful for getting coverage in places where you couldn’t justify additional antennas, or for blanketing outdoor spaces between buildings. When you manage tens of thousands of WAPs, in all sorts of crazy buildings and locations, you need every tool you can get.

    • shortwavesurfer@monero.townOP

      There comes a time when maintaining backward compatibility either introduces serious security flaws or becomes too great a load to maintain. Take the cellular networks shutting down 2G and 3G in the United States, for example. Yes, keeping 2G up maintains backwards compatibility, but 2G is highly flawed and easy to exploit. They sure as hell aren’t shutting it down just to free up the 700 kilohertz of bandwidth it’s been stuck on for 20 years.

        • shortwavesurfer@monero.townOP

          For now, sure. But that’s likely because there aren’t very many bands. Even with the inclusion of 6 gigahertz now, there are only three bands that must be supported, which is nothing like what the cellular networks have to do.

          • KairuByte@lemmy.dbzer0.com

            2.4 has many applications that just don’t make sense for 5 or 6. Sure, the latter can transmit at higher rates, but you don’t always care about the speed. Sometimes you’ve got the choice between blasting 2.4 through an annoying section of wall, or drilling a hole. And you don’t always have the right to drill a hole through a wall.

            Consider my student dorm housing in college. The walls were literal concrete with wiring and piping running through them. 5GHz had issues with penetration, and in certain areas you just couldn’t get internet. 2.4 had similar issues, sure, but to a much lesser extent. Those dorms are still in use today, and while you might be able to finagle the perfect placement for coverage, hanging the router on the back of a wooden door in the middle of the unit just isn’t a great idea, for many reasons.

            • shortwavesurfer@monero.townOP

              Eventually, I suspect that Wi-Fi frequencies will be high enough that you will need a router in every single room, just like you need a light in every single room. I’m thinking of those Wi-Fi access points that look kind of like smoke detectors, except larger, hung from the ceiling.

              • KairuByte@lemmy.dbzer0.com

                This would be a step back from where we are now.

                The concept of wireless is to remove the need to set things up over and over: no wires, no dealing with holes in walls, no need to get to the specific spot that has a plug, etc.

                What you’re talking about would make every room require more cables than before: a run to every room you might want to use the internet in, possibly even two.

                I’d rather run a cell tower type setup in my backyard than deal with running dozens of cables through my wall just to get wifi.

  • abhibeckert@lemmy.world

    Edit: I am actually really surprised at how unpopular this opinion appears to be.

    2.4Ghz WiFi works perfectly for me, possibly because I’m not using an “OEM” access point - but rather went out and spent a couple hundred dollars on a good one myself. Both at home in the suburbs and at our office in the city with several businesses in one building, 2.4Ghz works great.

    In my experience 5Ghz only has acceptable performance if you have an access point in every internal room. I have zero interest in setting that up and like the fact that I can have reliable internet on my entire suburban block with a single (good) access point.

    “Upgrading” to 5Ghz would mean replacing one access point with eight access points. No thanks.

    As for wanting 400Mbps… wtf for? I have a 10Gbps connection (wired) at the office and 50Mbps (WiFi, 2.4GHz) at home, and honestly I can’t tell the difference. Sure, large downloads are faster, but that’s not something I do often, especially at home. And if I did want that, I wouldn’t be using wireless. Latency is far more important than bandwidth, and wired has better latency.

  • spaghettiwestern@sh.itjust.works

    It is always amazing how many people think their own specific situation should be used as the defining standard for the rest of the world.

    5GHz just doesn’t get through stucco, concrete, or even an inconveniently located furnace very well, nor does it reach nearly as far as a 2.4GHz signal when only drywall and wooden studs are in the way. It would take 5 APs at 5GHz to cover the same area as 2 at 2.4GHz in my environment.

    The great thing is that you can disable 2.4 ghz wifi on all your devices and the rest of us can continue to do what works for us.

      • Olap@lemmy.world

        Wifi for rural services should be replaced with cellular connections. Let’s pivot

        • shortwavesurfer@monero.townOP

          You mean like fixed wireless access? Because T-Mobile does that. And in my previous house that was very rural, it worked very well. I was able to get 100 megabits a second over the cellular network where my neighbors were only getting 10 megabits per second on DSL. And that’s all that was available.

          • Olap@lemmy.world

            Sorta. There’s a point-to-point frequency already. Were you using 4G+/5G for that previous one, or was it the 2.5GHz spectrum that the original doc I listed was using?

            Either way, talk of 6G is already here. Let’s reassign 2.5GHz.

            • shortwavesurfer@monero.townOP

              6G and wireless spectrum, at least in that regard, have very little to do with each other. T-Mobile is making very good use of that 2.5GHz spectrum, offering serious capacity for home internet and cellular usage.

  • Toes♀@ani.social

    I get this opinion quite well.

    The problem I’ve observed is devices that foolishly switch to 2.4GHz in a crowded space, such as the Nintendo Switch. There really needs to be an extra check so that latency-sensitive devices never connect to a 2.4GHz network on a crowded channel unless it’s the only option.

      • Bizarroland@kbin.social

        I agree that 2.4 gigahertz is ultimately doomed, but we are easily 25 years away from moving out of that space and even then there will still be use cases for it.

        If you were to suddenly disable all 2.4 GHz Wi-Fi connections across the world a large portion of the world would be stranded without Wi-Fi.

        And since smart home devices and many other products that are actively being created require 2.4 gigahertz to function, any router that did not include 2.4 gigahertz would be e-waste before it was even taken out of the box.

  • skatrek47@sh.itjust.works

    I think it’s a fair opinion, but a lot of “cheap” IoT devices only support 2.4GHz, so I do have both networks setup in my house for that reason…

    • shortwavesurfer@monero.townOP

      IoT devices should support 5GHz, and at least for me personally, if a device doesn’t support it, I don’t buy it. Which also means that I have no IoT devices, LOL. My alarm system only supports 2.4GHz, but it also has a cellular radio, so it has never been connected to Wi-Fi in the time I’ve owned it.

      • Max-P@lemmy.max-p.me

        Why would you refuse to buy IoT devices unless they’re more expensive, use more battery and have less range? Like why, what does it give you to not have a 2.4 GHz network? It’s not like it’ll interfere with the 5 GHz network.

        Like sure the 2.4 GHz spectrum is pretty crowded and much slower. But at this point that’s pretty much all that’s left on 2.4GHz: low bandwidth, battery powered devices at random locations of your house and on the exterior walls of your house and all the way across the yard.

        It’s the ideal spectrum to put those devices on: it’s dirt cheap (they all seem to use ESP8266 or ESP32 chips; lots of Espressif devices on the IoT network), it uses less power, it goes through walls better, and all it needs to carry is that the button has been pressed. I’m not gonna install an extra AP or two when 2.4 reaches fine, just so a button can make my phone ring and a bell go ding dong, or for a camera that streams at bitrates you could handle over dialup internet.

        Phones and laptop? Yeah they’re definitely all on 5 GHz. If anything I prefer my IoT on 2.4 because then I can make my 5 GHz network WPA3 and 11ac/11ax only so I don’t have random IoT devices running at 11n speeds slowing down my 5 GHz network.

        • shortwavesurfer@monero.townOP

          But cameras on 5GHz could stream very high quality 4K video directly to your phone or whatever; 2.4GHz would be a lot more likely to buffer and skip doing that.

          • Max-P@lemmy.max-p.me

            My best camera does 1080p at 150kbit/s H264. Most “4K” cameras have such shit encoding they’re nowhere near exceeding what 2.4 GHz can provide still. And if I were to spend money on a nice 4K camera that actually streams real 4K I would also invest on making it run over PoE because that would chew through battery like there’s no tomorrow and needs a power source anyway, and would go to an NVR to store it all on a RAID array.

            And if that had to happen, I’d just put it on a dedicated 5 GHz network, because I want to keep the good bandwidth for the devices that need it, like the TV, phones, and laptops. Devices on older WiFi standards slow down the network because they use more airtime to send data at lower rates, so fast devices get less airtime to send data at high rates.
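The airtime effect is easy to quantify: time on air is data divided by rate, so a slow sender holds the channel disproportionately long. A rough illustration (the 72Mbps and 866Mbps figures are assumed 11n/11ac PHY rates for the sake of the example, not measurements):

```python
def airtime_share(rates_mbps, payload_bytes):
    # Time on air per sender = payload / rate; normalize to get each share.
    times = [payload_bytes * 8 / (r * 1e6) for r in rates_mbps]
    total = sum(times)
    return [t / total for t in times]

# Both devices send 1 MB; the legacy device eats over 90% of the airtime.
slow, fast = airtime_share([72, 866], payload_bytes=1_000_000)
print(f"11n device: {slow:.0%} of airtime, 11ac device: {fast:.0%}")
```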

            Using the most fitting tech for the needs is more important than trying to get them all on the latest and greatest. Devices needs to be worthy of getting granted access to my 5 GHz networks.

            • shortwavesurfer@monero.townOP
              8 months ago

              Channel slicing into resource units (OFDMA) solves some of this, and when you go higher in frequency you can fit more antennas in the same physical space, so you can have something like 16-transmit, 16-receive setups to combat those airtime issues.
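For a rough sense of what that slicing buys: 802.11ax OFDMA divides a channel into resource units, and with the smallest (26-tone) units the per-width user counts below follow from the 11ax RU layout (worth double-checking against a given chipset’s documentation):

```python
# Max concurrent OFDMA users per channel width when every user gets
# a minimal 26-tone resource unit (counts per the 802.11ax RU layout).
RU26_PER_WIDTH = {20: 9, 40: 18, 80: 37, 160: 74}

for width_mhz, users in RU26_PER_WIDTH.items():
    print(f"{width_mhz} MHz -> up to {users} simultaneous small-RU users")
```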

  • TimeSquirrel@kbin.social

    I run both. 5Ghz for high bandwidth devices such as phones and laptops. 2.4Ghz for IoT stuff that needs to penetrate through walls and isn’t using much bandwidth.

    Because of this useful niche, it probably won’t go away for a long time. Just like new burglar/fire alarm panels, UPSs, and network appliances that still use RS232 serial interfaces to program some settings.