Today I set up a little project website on a new subdomain. It’s not a www subdomain or a newly registered domain, which would be easy to detect. We’re talking about:

Randomchars.mydomain.com

Within 20 minutes, Anthropic’s ClaudeBot was on it. I could tell because the nginx access log showed a hit to robots.txt and then a handful of pages.

First off, how the hell did they find it? Next, is my DNS provider, Amazon Route 53, selling this kind of data now? Or is there some kind of DNS wildcard query?

  • Courant d'air 🍃 · 4 days ago

    An SSL certificate could leak subdomains depending on how it’s configured. Otherwise I wouldn’t be surprised if Amazon sold this info within 20 min.
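    One way a cert can leak names is a shared SAN list: a single certificate covering several hostnames reveals all of them to anyone who connects to any one of them. A minimal Python sketch of reading that list (example.com is a placeholder):

    ```python
    import socket
    import ssl

    def san_names(hostname: str, port: int = 443) -> list[str]:
        """Return the DNS names listed in a server's certificate SAN field."""
        ctx = ssl.create_default_context()
        with socket.create_connection((hostname, port)) as sock:
            with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
                cert = tls.getpeercert()
        # subjectAltName is a tuple of ("DNS", "name") pairs
        return [value for kind, value in cert.get("subjectAltName", ()) if kind == "DNS"]

    print(san_names("example.com"))
    ```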

  • psycotica0@lemmy.ca · 4 days ago

    What’s that thing Google is pushing, where the CAs basically publish a list of all the certs they issue? Is that live? Maybe Amazon issued you a cert, and then published it in a list of “domains I’ve issued certs for”, and they’re just watching that list?

    Unless that’s not a thing, or not a thing yet, or I’m fully misremembering…
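    That thing is Certificate Transparency, and it is live: CAs log every certificate they issue to public, append-only logs that anyone can watch. A minimal sketch of searching those logs through crt.sh (its JSON endpoint is an informal interface, so the exact parameters here are an assumption):

    ```python
    import requests

    def ct_logged_names(domain: str) -> set[str]:
        """Ask crt.sh which names appear on certs issued under a domain."""
        resp = requests.get(
            "https://crt.sh/",
            params={"q": f"%.{domain}", "output": "json"},
            timeout=30,
        )
        resp.raise_for_status()
        # name_value can hold several names separated by newlines
        return {name
                for entry in resp.json()
                for name in entry["name_value"].splitlines()}

    print(sorted(ct_logged_names("mydomain.com")))
    ```

    A bot polling a feed like this sees a new subdomain within minutes of its certificate being issued, which would fit the 20-minute window.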

  • Taldan@lemmy.world · 4 days ago

    I’m a bit confused about your DNS config. DNS is generally public; that’s the point of it.

    AI scrapers, like most scrapers, just crawl every new DNS entry that gets created.

    • cron@feddit.org · 4 days ago

      How exactly can you find all subdomains of a given domain?

      Sure, it is possible with misconfigured DNSSEC (zone walking), but otherwise I’d say it is not possible.
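      For context, zone walking works because old-style DNSSEC (NSEC rather than NSEC3) denies a name’s existence by naming the next record in the zone, so you can hop through the zone record by record. A rough sketch with dnspython (example.com is a placeholder):

      ```python
      import dns.resolver  # pip install dnspython

      def walk_nsec_zone(zone: str, limit: int = 100) -> list[str]:
          """Hop along NSEC records to enumerate an NSEC-signed zone."""
          found = []
          current = zone
          for _ in range(limit):
              try:
                  answer = dns.resolver.resolve(current, "NSEC")
              except Exception:
                  break  # no NSEC record: zone uses NSEC3 or isn't signed
              nxt = answer[0].next.to_text().rstrip(".")
              if nxt == zone or nxt in found:
                  break  # wrapped around to the zone apex
              found.append(nxt)
              current = nxt
          return found

      print(walk_nsec_zone("example.com"))
      ```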

      • ramble81@lemmy.zip · 4 days ago

        I could see accidentally having zone transfers (AXFR) enabled (not even DNSSEC related), and they transfer your entire zone.
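        A quick way to test whether a nameserver permits this (dnspython again; the nameserver address and zone below are placeholders):

        ```python
        import dns.query
        import dns.zone  # pip install dnspython

        def try_axfr(nameserver_ip: str, zone_name: str) -> list[str]:
            """Attempt a full zone transfer; raises if the server refuses,
            which is the normal, correct configuration."""
            zone = dns.zone.from_xfr(dns.query.xfr(nameserver_ip, zone_name))
            return sorted(name.to_text() for name in zone.nodes)

        # Hypothetical usage:
        # print(try_axfr("203.0.113.53", "mydomain.com"))
        ```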

        • cron@feddit.org · 4 days ago

          That’s possible, though probably not the most likely explanation; a misconfigured webserver or the certificate transparency logs are more likely.

  • thisbenzingring@lemmy.today · 4 days ago

    If you are using any Amazon web services, those IP ranges are all well known. If an address that was dormant starts responding, it will be seen, and if it is not already in a scraper’s database, it will be added.
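    AWS even publishes every range it owns at a well-known URL, so a scanner needs no DNS information at all. A minimal sketch:

    ```python
    import requests

    # AWS publishes its full address space as JSON at a stable URL.
    ranges = requests.get(
        "https://ip-ranges.amazonaws.com/ip-ranges.json", timeout=30
    ).json()

    ec2 = [p["ip_prefix"] for p in ranges["prefixes"] if p["service"] == "EC2"]
    print(f"{len(ec2)} EC2 prefixes a scanner could sweep")
    ```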

  • techconsulnerd@programming.dev · 4 days ago

    Perhaps it was crawling a list of IP addresses and your web server also serves the site when requested by bare IP address (no domain/subdomain). You can configure the web server to return a blank page or a 403 error when accessed by IP.
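    A minimal nginx sketch of that idea, assuming a catch-all default server is acceptable (ssl_reject_handshake needs nginx 1.19.4 or newer):

    ```nginx
    # Catch-all server: refuse requests that arrive by bare IP or an
    # unknown Host header instead of serving the real site.
    server {
        listen 80 default_server;
        listen 443 ssl default_server;
        server_name _;
        ssl_reject_handshake on;  # drop TLS handshakes whose SNI matches no real vhost
        return 444;               # nginx-specific: close the connection with no response
    }
    ```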

  • SpaceMan9000@lemmy.world · 4 days ago

    What’s the default in nginx? Did they need to know the actual subdomain? A lot of the time you can get it by querying the DNS servers directly, or certs can leak it.