👽 clseibold

I have made some improvements to my crawler that enable some interesting ideas I have planned for AuraGem Search. For at least 2 years now, my search engine has been able to detect which pages can be used as gemsub feeds and which cannot. With slight changes, the crawler can now query the db for a list of all URLs that are considered feeds and crawl only the internal links from those pages - that is, only the non-cross-host links of those feed pages. This will let me run a constantly updated feed aggregator on top of my search engine, with no censorship and no requirement to submit a URL.
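The "non-cross-host" rule could be sketched roughly like this (a hypothetical illustration, not AuraGem's actual code): resolve each link found on a feed page against the page's URL, then keep only links whose host matches the feed page's host. Note that Python's `urljoin` must first be taught the `gemini://` scheme, since it only resolves relative references for schemes it knows about.

```python
from urllib.parse import urljoin, urlparse, uses_relative, uses_netloc

# urllib does not know the gemini:// scheme by default; register it so
# that relative references resolve against gemini URLs.
for table in (uses_relative, uses_netloc):
    if "gemini" not in table:
        table.append("gemini")

def internal_links(feed_url, links):
    """Keep only links on the same host as the feed page (hypothetical helper)."""
    feed_host = urlparse(feed_url).hostname
    kept = []
    for link in links:
        # Resolve relative links (e.g. "post1.gmi") against the feed page's URL.
        absolute = urljoin(feed_url, link)
        if urlparse(absolute).hostname == feed_host:
            kept.append(absolute)
    return kept
```

With this filter, a crawl seeded only from known feed URLs never leaves the host each feed lives on, which is what keeps the aggregator scoped to the feeds themselves.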

2 years ago · 👍 maxheadroom


1 Reply

👽 clseibold

Note: If you don't want your pages crawled by my search engine, be sure to use a robots.txt · 2 years ago
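For anyone unfamiliar, a minimal robots.txt served from the capsule root is enough to opt out of all well-behaved crawlers (the path shown is just an example):

```
# Served at e.g. gemini://example.org/robots.txt
User-agent: *
Disallow: /
```

Gemini's robots.txt companion convention follows the web's format, so crawlers that respect it will skip everything under `Disallow`-ed paths.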