I have made some improvements to my crawler that will enable some interesting ideas I have planned for AuraGem Search. For at least 2 years now, my search engine has had a way of detecting which pages can be used as gemsub feeds and which cannot. With slight changes to my crawler, it can now query the db for a list of all URLs that are considered feeds and crawl only internal links from those pages - that is, only the non-cross-host links of those feed pages. This will let me run a constantly updated feed aggregator on top of my search engine, with no censorship and no requirement to submit a URL.
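The core of the internal-link restriction is a same-host check against each feed page's URL. A minimal sketch in Go (hypothetical names; in the real crawler the feed URLs would come from the database query described above, not a hard-coded slice):

```go
package main

import (
	"fmt"
	"net/url"
)

// sameHost reports whether link, resolved against the feed page's URL,
// points at the same scheme and host as the feed itself.
// Cross-host links are the ones the crawler should skip.
func sameHost(feed, link string) bool {
	f, err := url.Parse(feed)
	if err != nil {
		return false
	}
	// Parse resolves relative links (like "/gemlog/post.gmi")
	// against the feed URL before comparing hosts.
	l, err := f.Parse(link)
	if err != nil {
		return false
	}
	return l.Scheme == f.Scheme && l.Host == f.Host
}

func main() {
	feed := "gemini://example.org/gemlog/" // hypothetical feed page
	links := []string{
		"/gemlog/2024-01-01.gmi",          // relative: internal
		"gemini://example.org/about.gmi",  // absolute, same host: internal
		"gemini://other.example.com/feed", // cross-host: skipped
	}
	for _, link := range links {
		fmt.Printf("%s -> internal=%v\n", link, sameHost(feed, link))
	}
}
```

Resolving each link against the feed URL first means relative gemtext links are handled the same way as absolute ones.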
2 months ago · maxheadroom
1 Reply
Note: If you want your pages not to be crawled by my search engine, be sure to use a robots.txt · 2 months ago
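For capsules that want to opt out, a robots.txt served at the capsule root is enough. A minimal example (the "indexer" virtual user-agent is from the robots.txt companion spec commonly used on Gemini; the paths are placeholders):

```
# Served at gemini://example.org/robots.txt
# Ask search-engine crawlers to skip the whole capsule:
User-agent: indexer
Disallow: /

# Or disallow only a specific directory:
# User-agent: indexer
# Disallow: /private/
```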
| Field | Value |
|---|---|
| Original URL | gemini://station.martinrue.com/clseibold/93925dca35894a12... |
| Status Code | 20 (Success) |
| Content-Type | text/gemini; charset=utf-8 |