• asudox@lemmy.world

    Block? Nope, robots.txt does not block the bots. It’s just a text file that says: “Hey robot X, please do not crawl my website. Thanks :>”
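    For illustration, a minimal robots.txt might look like this (GPTBot is OpenAI's real crawler user-agent; which bots and paths you list is up to you). A compliant bot reads it and skips the listed paths; nothing stops a rude one:

    ```
    User-agent: GPTBot
    Disallow: /

    User-agent: *
    Disallow: /private/
    ```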

    • ɐɥO@lemmy.ohaa.xyz

      I disallow a page in my robots.txt and IP-ban everyone who goes there. That's pretty effective.
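      A sketch of how that can be wired up (not necessarily what ɐɥO actually runs; the trap path, log location, and nginx-style log format are all assumptions):

      ```python
      import re
      import subprocess
      import time

      TRAP = "/secret-trap/"             # also listed as Disallow in robots.txt
      LOG = "/var/log/nginx/access.log"  # assumed access log location and format
      banned = set()

      def ban(ip):
          # Drop all further traffic from this IP via iptables.
          subprocess.run(["iptables", "-I", "INPUT", "-s", ip, "-j", "DROP"], check=False)
          banned.add(ip)

      with open(LOG) as log:
          log.seek(0, 2)  # jump to the end of the log and follow new entries
          while True:
              line = log.readline()
              if not line:
                  time.sleep(1)
                  continue
              match = re.match(r'(\S+) .*"(?:GET|HEAD|POST) ' + re.escape(TRAP), line)
              if match and match.group(1) not in banned:
                  ban(match.group(1))
      ```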

        • bountygiver [any]@lemmy.ml

          humans typically don’t visit [website]/fdfjsidfjsidojfi43j435345 when there’s no button that links to it

          • Avatar_of_Self@lemmy.world

            I used to do this on one of my sites that was moderately popular in the '00s. I had a link hidden via JavaScript, so a user couldn't click it (unless they disabled JavaScript and clicked it), though it was hidden pretty well even then.
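            Presumably something along these lines (a guess at the markup; the comment doesn't show the actual code):

            ```html
            <!-- Honeypot link: present in the HTML for crawlers to follow -->
            <a href="/secret-trap/" id="trap">stats</a>
            <script>
              // Hide it as soon as JavaScript runs, so normal users never see or click it.
              document.getElementById("trap").style.display = "none";
            </script>
            ```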

            Hits to that page went into a log, and my script added the /24 subnet of each offending IP to my firewall. I allowed specific IP ranges for some search engines.

            Anyway, it caught a lot of bots. I really just wanted to stop automated attacks and spambots on the web front.

            I also had a honeypot port that basically did the same thing. If you sent packets to it, your /24 was added to the firewall for a week or so. I think I just used netcat to append to yet another log and wrote a script to add those /24s to iptables.
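            In the same spirit, a minimal version of that honeypot port (the original used netcat plus a separate script; the port number and iptables call here are assumptions, and the week-long expiry is left out):

            ```python
            import socket
            import subprocess

            PORT = 2222  # hypothetical trap port; nothing legitimate listens here

            def ban_subnet(ip):
                # Drop the whole /24 the connecting IPv4 address belongs to.
                subnet = ip.rsplit(".", 1)[0] + ".0/24"
                subprocess.run(["iptables", "-I", "INPUT", "-s", subnet, "-j", "DROP"], check=False)

            srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            srv.bind(("0.0.0.0", PORT))
            srv.listen(5)
            while True:
                conn, addr = srv.accept()  # any connection at all is suspicious
                conn.close()
                ban_subnet(addr[0])
            ```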

            I did it because there was so much bad noise in my logs from spambots; it was pretty crazy.

            • Mikelius@lemmy.ml

              This thread has provided genius ideas I somehow never thought of, and I’m totally stealing them for my sites lol.

          • JackbyDev@programming.dev

            I LOVE VISITING FDFJSIDFJSIDOJFI435345 ON HUMAN WEBSITES, IT IS ONE OF MY FAVORITE HUMAN HOBBIES. 🤖👨

      • Dizzy Devil Ducky@lemm.ee

        I doubt it'd be possible given the lack of server control, but I'm definitely gonna have to look this up to see if anything similar could be done on a Neocities site.

      • asudox@lemmy.world

        Not sure that's effective at all. Why would a crawler check robots.txt if it's programmed to ignore it anyway?

    • Cynicus Rex@lemmy.ml (OP)

      Unfortunate indeed.

      “Can AI bots ignore my robots.txt file? Well-established companies such as Google and OpenAI typically adhere to robots.txt protocols. But some poorly designed AI bots will ignore your robots.txt.”

      • breadsmasher@lemmy.world

        "typically adhere". But they don't have to follow it.

        "poorly designed AI bots"

        Is it poor design if ignoring robots.txt entirely is an explicit design choice, made to scrape as much data as possible? I'd argue it's more that these bots are designed to scrape everything regardless of robots.txt. That's the intention. Asshole design vs. poor design.