👽 haze

Also to whoever is obviously developing a new Gemini crawler (hitting new URLs every 0.1s) - respect the robots.txt. It's a mutual respect thing. Please.

1 year ago · 👍 lufte, justyb
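
For anyone writing a crawler: a Gemini capsule's robots.txt lives at gemini://host/robots.txt, and honoring it takes only a few lines. Here is a minimal sketch in Python (the host name and user-agent token are hypothetical, not from this thread):

```python
import socket
import ssl
import urllib.robotparser

def gemini_fetch(host: str, path: str, port: int = 1965) -> str:
    """Fetch a Gemini resource; return the response body as text."""
    ctx = ssl.create_default_context()
    # Geminispace runs on self-signed certs (trust-on-first-use), so skip CA checks.
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            tls.sendall(f"gemini://{host}{path}\r\n".encode())
            raw = b""
            while chunk := tls.recv(4096):
                raw += chunk
    header, _, body = raw.partition(b"\r\n")
    if not header.startswith(b"2"):  # Gemini 2x = success
        raise RuntimeError(header.decode())
    return body.decode()

host = "example.capsule"  # hypothetical capsule, not from this thread
robots = urllib.robotparser.RobotFileParser()
try:
    robots.parse(gemini_fetch(host, "/robots.txt").splitlines())
except RuntimeError:
    robots.parse([])  # no robots.txt served: everything is allowed

url = f"gemini://{host}/some/page"
if robots.can_fetch("examplecrawler", url):  # hypothetical UA token
    print("allowed:", url)
else:
    print("disallowed by robots.txt:", url)  # skip it; don't hammer the capsule
```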

3 Replies

👽 haze

I mean, yeah, I wrote my own rate limiter. But that's not an excuse for people to misbehave (Gemini is small enough) · 11 months ago
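
haze doesn't say how their limiter is built, but the usual server-side shape is a per-client token bucket, answering Gemini status 44 (SLOW DOWN) when a client drains it. A minimal sketch with made-up parameters (1 request/second, burst of 5):

```python
import time
from collections import defaultdict

class TokenBucket:
    def __init__(self, rate: float = 1.0, burst: float = 5.0):
        self.rate, self.burst = rate, burst  # illustrative numbers
        self.tokens = burst
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill in proportion to elapsed time, capped at the burst size.
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

buckets: dict[str, TokenBucket] = defaultdict(TokenBucket)

def should_serve(client_ip: str) -> bool:
    # A 0.1s crawl interval (10 req/s) exhausts the bucket right after the
    # initial burst; the server would then respond 44 instead of serving.
    return buckets[client_ip].allow()
```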

👽 acidus

ugh, tell me about it. These things get stuck in Gemipedia or NewsWaffle, which have a virtually unlimited number of URLs, and they pound my capsule. Some of these are crawlers in Geminispace, but sometimes it's a web crawler hitting my capsule through @mozz's HTTP-to-Gemini portal (or another one of the handful of public portals out there...) · 11 months ago
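
The crawler-side fix for this kind of trap is a per-host budget: once a host with an effectively infinite URL space (a Gemipedia or NewsWaffle proxy, say) uses up its allotment of fetches, stop following its links. A rough sketch (the budget of 500 URLs is an arbitrary illustration):

```python
from collections import Counter
from urllib.parse import urlparse

PER_HOST_BUDGET = 500  # arbitrary cap for illustration
fetched = Counter()

def should_crawl(url: str) -> bool:
    host = urlparse(url).hostname or ""
    if fetched[host] >= PER_HOST_BUDGET:
        return False  # budget spent; likely an unbounded URL space
    fetched[host] += 1
    return True
```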

👽 mrrobinhood5

not it · 11 months ago