• petaqui@lemmings.world · ↑1 · 6 hours ago

      As with everything, it has good sides and bad sides. We need to be careful and use it properly, and the same applies to the people creating this technology.

    • Slaxis@discuss.tchncs.de · ↑3 ↓1 · 7 hours ago

      The problem is, how? I can set it up on my own computer using open source models and some of my own code. It’s really rough to regulate that.

    • gap_betweenus@lemmy.world · ↑1 · 7 hours ago

      Once a technology or even an idea is out there, you can’t really make it go away - AI is here to stay. Generative LLMs are just a small part of it.

  • x0x7@lemmy.world · ↑4 · 15 hours ago

    Joke’s on them. I’m going to use AI to estimate the value of content, and now I’ll get the kind of content I want - fake, sure, but they’ll have to generate it.

  • umbraroze@lemmy.world · ↑58 · 1 day ago

    I have no idea why the makers of LLM crawlers think it’s a good idea to ignore bot rules. The rules are there for a reason and the reasons are often more complex than “well, we just don’t want you to do that”. They’re usually more like “why would you even do that?”

    Ultimately you have to trust what the site owners say. The reason why, say, your favourite search engine returns the relevant Wikipedia pages and not a bazillion random old page revisions from ages ago is that Wikipedia said “please crawl the most recent versions using canonical page names, and do not follow the links to the technical pages (including history)”. Again: why would anyone index those?
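
    Those crawl rules live in a plain-text robots.txt file at the site root. As a minimal sketch of the kind of policy described above (illustrative only - not Wikipedia’s actual file, which is far longer):

    ```
    User-agent: *
    # Crawl the canonical article URLs
    Allow: /wiki/
    # Skip the technical pages: edit forms, old revisions, histories, diffs
    Disallow: /w/
    Disallow: /wiki/Special:
    ```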

    • EddoWagt@feddit.nl · ↑2 · 9 hours ago

      They want everything. Does it exist, but isn’t in their dataset? Then they want it.

      They want their AI to answer any question you could possibly ask it. Filtering out what is and isn’t useful doesn’t achieve that.

    • T156@lemmy.world · ↑4 · edited · 10 hours ago

      Because it takes work to obey the rules, and you get less data for it. A theoretical competitor could get more by ignoring them and gain some vague advantage from it.

      I wouldn’t be surprised if the crawlers they use are bare-basic utilities set up to just grab everything, without worrying about rules and the like.

    • Phoenixz@lemmy.ca · ↑29 · 23 hours ago

      Because you are coming at it from the perspective of a reasonable person.

      These people are billionaires who expect to get everything for free. Rules are for the plebs - just take it already.

  • surph_ninja@lemmy.world · ↑35 ↓1 · 1 day ago

    I’m imagining a sci-fi spin on this where AI generators are used to keep AI crawlers in a loop, and they accidentally end up creating some unique AI culture or relationship in the process.

  • DigitalDilemma@lemmy.ml · ↑71 · 2 days ago

    Surprised at the level of negativity here. Having had my sites repeatedly DDoSed offline by ClaudeBot and others scraping the same damned thing over and over again, thousands of times a second, I welcome any measure that helps.

    • dan@upvote.au · ↑4 · 11 hours ago

      thousands of times a second

      Modify your Nginx (or whatever web server you use) config to rate limit requests to dynamic pages, and cache them. For Nginx, you’d use either fastcgi_cache or proxy_cache depending on how the site is configured. Even if the pages change a lot, a cache with a short TTL (say 1 minute) can still help reduce load quite a bit while not letting them get too outdated.

      Static content (and cached content) shouldn’t cause issues even if requested thousands of times per second. Following best practices like pre-compressing content using gzip, Brotli, and zstd helps a lot, too :)
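
      As a rough sketch of that setup (assuming a reverse-proxied backend; the zone names and upstream address here are placeholders, not anything from the article):

      ```nginx
      # Per-IP rate limit: 10 requests/second, tracked in a 10 MB shared zone
      limit_req_zone $binary_remote_addr zone=perip:10m rate=10r/s;

      # Short-TTL "microcache" for dynamic pages
      proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=microcache:10m
                       max_size=1g inactive=10m;

      server {
          listen 80;

          location / {
              limit_req zone=perip burst=20 nodelay;   # absorb small bursts, reject floods
              proxy_cache microcache;
              proxy_cache_valid 200 1m;                # 1-minute TTL keeps pages fresh enough
              proxy_cache_use_stale updating timeout;  # serve stale while revalidating
              proxy_pass http://127.0.0.1:8080;        # placeholder backend
          }
      }
      ```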

      Of course, this advice is just for “unintentional” DDoS attacks, not intentionally malicious ones. Those are often much larger and need different protection - usually something at the network or load-balancer level, before traffic even hits the server.

      • DigitalDilemma@lemmy.ml · ↑1 · 7 hours ago

        Already done, along with a bunch of other stuff, including Cloudflare WAF and rate-limiting rules.

        I am still annoyed that it took over a day of my life to finally (so far) restrict these things, and several more days to offload the problem to Cloudflare Pages for sites that I previously self-hosted but my rural link couldn’t support.

        this advice is just for “unintentional” DDoS attacks, not intentionally malicious ones.

        And I don’t think these high-volume AI scrapes are unintentional DDoS attacks. I consider them entirely intentional - not deliberately malicious, but negligent to the point of criminality (especially in requesting the same pages so frequently, and all of them ignoring robots.txt).

    • Fluke@lemm.ee · ↑12 ↓2 · 24 hours ago

      And consumed the power output of a medium-sized country to do it.

      Yeah, great job! 👍

      • LeninOnAPrayer@lemm.ee · ↑19 · edited · 24 hours ago

        We truly are getting dumber as a species. We’re facing climate change, yet we’re running some of the most power-hungry processors in the world to spit out cooking recipes and homework answers for millions of people - all to better collect their data and sell them products that distract them from the climate disaster our corporations have caused. It would be really fun to watch if it weren’t so sad.

    • zovits@lemmy.world · ↑1 · 6 hours ago

      It certainly sounds like they generate the fake content once and serve it from cache every time: “Rather than creating this content on-demand (which could impact performance), we implemented a pre-generation pipeline that sanitizes the content to prevent any XSS vulnerabilities, and stores it in R2 for faster retrieval.”
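
      The serving side of that pattern is cheap to run. A hypothetical sketch as a Cloudflare Worker - not Cloudflare’s actual code; the LABYRINTH bucket binding and key layout are invented for illustration:

      ```typescript
      // Hypothetical sketch of the "pre-generate, then serve from R2" pattern.
      // Assumes a separate pipeline has already written sanitized HTML into the
      // bucket; types such as R2Bucket come from @cloudflare/workers-types.
      export default {
        async fetch(request: Request, env: { LABYRINTH: R2Bucket }): Promise<Response> {
          const path = new URL(request.url).pathname;
          // Look up the pre-generated decoy page; nothing is generated on demand here.
          const page = await env.LABYRINTH.get(`labyrinth${path}.html`);
          if (page === null) {
            return new Response("Not found", { status: 404 });
          }
          return new Response(page.body, {
            headers: { "content-type": "text/html; charset=utf-8" },
          });
        },
      };
      ```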

  • 4am@lemm.ee · ↑310 ↓3 · 2 days ago

    Imagine how much power is wasted on this unfortunate necessity.

    Now imagine how much power will be wasted circumventing it.

    Fucking clown world we live in

    • zovits@lemmy.world · ↑1 · 6 hours ago

      From the article it seems like they don’t generate a new labyrinth every single time: “Rather than creating this content on-demand (which could impact performance), we implemented a pre-generation pipeline that sanitizes the content to prevent any XSS vulnerabilities, and stores it in R2 for faster retrieval.”

    • Demdaru@lemmy.world · ↑57 ↓2 · 2 days ago

      On one hand, yes. On the other… imagine the frustration of the management at companies making and selling AI services. Such a sweet thing to imagine.

      • Melvin_Ferd@lemmy.world · ↑8 ↓9 · 2 days ago

        I just want to keep using uncensored AI that answers my questions. Why is this a good thing?

          • Melvin_Ferd@lemmy.world · ↑2 ↓11 · edited · 7 hours ago

            Good, I ignore that too. I want a world where information is shared. I can get behind the

            • explodicle@sh.itjust.works · ↑15 · 1 day ago

              Get behind the what?

              Perhaps an AI crawler crashed Melvin’s machine halfway through the reply, denying that information to everyone else!

              • Melvin_Ferd@lemmy.world · ↑1 · 7 hours ago

                Capitalist pigs are paying media to generate AI hatred to help convince you people to get behind laws that’ll limit info sharing under the guise of IP and copyright.

        • CileTheSane@lemmy.ca · ↑9 ↓4 · 2 days ago

          Because it’s not AI, it’s LLMs, and all LLMs do is guess what word most likely comes next in a sentence. That’s why they are terrible at answering questions and do things like suggest adding glue to the cheese on your pizza - because somewhere in the training data some idiot said that.

          The training data for LLMs comes from the internet, and the internet is full of idiots.

          • Melvin_Ferd@lemmy.world · ↑5 ↓6 · 1 day ago

            That’s what I do too, just with less accuracy and knowledge. I don’t get why I have to hate this. It feels like a bunch of cavemen telling me to hate fire because it might burn the food.

            • CileTheSane@lemmy.ca · ↑3 · 13 hours ago

              Because we have better methods that are easier, cheaper, and less damaging to the environment. They are solving nothing and wasting a fuckton of resources to do so.

              It’s like telling cavemen they don’t need fire because you can mount an expedition to the nearest volcano to cook food without the need for fuel, then bring it back to them.

              The best-case scenario is the LLM tells you information that is already available on the internet, but 50% of the time it just makes shit up.

              • Melvin_Ferd@lemmy.world · ↑1 ↓3 · 8 hours ago

                Wasteful?

                Energy production is an issue. Using that energy isn’t. LLMs are a better use of energy than most of the useless shit we produce every day.

                • CileTheSane@lemmy.ca · ↑1 · 3 hours ago

                  Did the LLMs tell you that? It’s not hard to look up on your own:

                  Data centers, in particular, are responsible for an estimated 2% of electricity use in the U.S., consuming up to 50 times more energy than an average commercial building, and that number is only trending up as increasingly popular large language models (LLMs) become connected to data centers and eat up huge amounts of data. Based on current datacenter investment trends, LLMs could emit the equivalent of five billion U.S. cross-country flights in one year.

                  https://cse.engin.umich.edu/stories/power-hungry-ai-researchers-evaluate-energy-consumption-across-models

                  Far more than straightforward search engines that have the exact same information and don’t make shit up half the time.

    • supersquirrel@sopuli.xyz · ↑3 · edited · 21 hours ago

      No, it is far less environmentally friendly than RC bots made of metal, plastic, and electronics full of nasty little things like batteries, all blasting, sawing, burning, and smashing one another to pieces.

    • IninewCrow@lemmy.ca · ↑10 · 2 days ago

      They should program the actions and reactions of each system into actual battle bots and then televise the event for our entertainment.

  • oldfart@lemm.ee · ↑107 · 2 days ago

    So the web is a corporate war zone now, and you can choose between feudal protection and being attacked from all sides. What a time to be alive.

    • theparadox@lemmy.world · ↑14 · 2 days ago

      There is also the corpo verified-ID route. In order to avoid the onslaught of AI bots and all that comes with them, you’ll need to sacrifice freedom, anonymity, and privacy like a good little peasant to prove you aren’t a bot… and so will everyone else. You’ll likely be forced to deal with whatever AI bots are forced upon you while within the walls, but better the enemy you know, I guess?

  • TorJansen@sh.itjust.works · ↑42 ↓2 · 2 days ago

    And soon, the already AI-flooded net will be filled with so much nonsense that it becomes impossible for anyone to get some real work done. Sigh.

    • rocket_dragon@lemmy.dbzer0.com · ↑71 · 2 days ago

      Next step is an AI that detects AI labyrinth.

      It gets trained on labyrinths generated by another AI.

      So you have an AI generating labyrinths to train an AI to detect labyrinths which are generated by another AI so that your original AI crawler doesn’t get lost.

      It’s gonna be AI all the way down.

      • finitebanjo@lemmy.world · ↑27 ↓4 · 2 days ago

        All the while each AI takes more power to run than a million human beings, and the world burns down around us.

        • Fluke@lemm.ee · ↑2 ↓1 · 24 hours ago

          This is the great filter.

          Why isn’t there detectable life out there? They all do the same thing we’re doing. Undone by greed.

          • BeanCounter781@lemm.ee · ↑1 · 7 hours ago

            I haven’t heard someone refer to the great filter of intelligent life in a while. Good post.

        • LainTrain@lemmy.dbzer0.com · ↑18 ↓1 · 2 days ago

          The same way they justify cutting benefits for the disabled to balance budgets - instead of taxing the rich, or just not giving them bailouts - they’ll justify cutting power to you before they cut it to a data centre that’s 10 corporate AIs all fighting each other, unless we as a people stand up and actually demand change.

          • BeanCounter781@lemm.ee · ↑3 · 6 hours ago

            In Texas 80% of our water usage is corporate. But when the lakes are low during a drought, they tell homeowners to water the grass less. Nobody tells the corporations to throw away less water.

            AI will be allowed to use as much energy as it wants. It will even remind people to turn off the lights in an unoccupied room while wasting energy to monitor everyone’s power usage.

            • BeanCounter781@lemm.ee · ↑1 ↓1 · 6 hours ago

              This is why we need a centrist political party. Solutions shouldn’t be a false dichotomy.

              And we shouldn’t downvote people into oblivion. Take my charitable upvote.

              • finitebanjo@lemmy.world · ↑2 · 6 hours ago

                That will require reform of campaign finance laws and progressive reform for elections, both of which are highly partisan issues.

                • BeanCounter781@lemm.ee · ↑2 · 6 hours ago

                  I bet there are corporations that want regulatory stability over political football. If someone could figure out how to tap that with super PACs, they could capture funding. It’s probably easier to do in local elections where one party or another has failed to put up a candidate for a judge.

                  I see one-party races in Texas for justice of the peace and for judges. The Dems have no viable candidates in some jurisdictions because no one wants a Democrat, or to be labeled as one. But maybe a centrist could brand themselves as the non-Democrat alternative to Republicans.

            • TronBronson@lemmy.world · ↑3 ↓2 · 1 day ago

              Plenty of Democrats are voting to put Trump nominees in office, and plenty are voting for partisan spending bills. The CR vote should tip you off that “any Democrat is better than any Republican” doesn’t hold - half of them are complicit too. Ten Senate Dems just financed this authoritarian takeover.

              • finitebanjo@lemmy.world · ↑2 ↓4 · 1 day ago

                Not a single Democrat voted to confirm Hegseth - 3 Republicans didn’t either - but he still got confirmed.

                Every single Democrat was present and voted no on the budget that passed the House, and it still passed.

                Even if 10 Dems voted not to shut down the government and enter congressional recess, the CR only exists because Republicans wrote it and won’t compromise.

                Any Democrat is Better than Any Republican.

                • TronBronson@lemmy.world · ↑2 ↓1 · 1 day ago

                  Schumer rubber-stamped autocracy by not filibustering the CR. I think anyone who protects the constitution and their constituents is better than someone who doesn’t. Not that any Republicans fit the bill, but it’s not like we can just trust any old Democrat. Look at Gavin Newsom sliding to the right to maintain power. Is that the kind of Dems we want?

            • LainTrain@lemmy.dbzer0.com · ↑5 ↓3 · edited · 2 days ago

              In my country blue is the conservatives… but I agree with the sentiment! It worked for California, and it can work for your whole country. Let the Dems stop fearing they’ll lose elections: give them comfortable margins, then massively support progressives who can bring in the good stuff. They won’t have a chance if the party core thinks the very future of elections is on the line, but if they think they’ll likely win anyway, you might just be able to push a progressive candidate through and end the neoliberal decay.

              • knexcar@lemmy.world · ↑1 · 1 day ago

                To be fair, California is kind of dysfunctional and constantly trips over its own regulations when trying to get anything built. For instance, needing excessive environmental impact review for things like trains that will obviously help the environment, or limiting ferry boats crossing the bay to protect the environment even though it likely results in more people driving instead.

                • LainTrain@lemmy.dbzer0.com · ↑1 · 4 hours ago

                  You need a strong and agile state. This dysfunction often stems from complexities introduced by corporate interests during the legislative process.

      • brucethemoose@lemmy.world · ↑14 · edited · 2 days ago

        LLMs tend to be really bad at detecting AI-generated content, and I can’t imagine specialized models are much better. For the crawlers, it’s also exponentially more expensive and more human work, and it must be replicated for every crawler, since they’re so freaking secretive.

        I think the hosts win here.