SMOLNET PORTAL
👽 clseibold

I have made some improvements to my crawler that enable some interesting ideas I have planned for AuraGem Search. For at least two years now, my search engine has been able to detect which pages can be used as gemsub feeds and which cannot. With slight changes to the crawler, it can now query the database for a list of all URLs that are considered feeds and crawl only the internal links from those pages - meaning it will crawl only the non-cross-host links of those feed pages. This will let me build a constantly updated feed aggregator on top of my search engine, with no censorship and no requirement to submit a URL.
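The internal-link restriction described above can be sketched roughly like this. This is a minimal illustration, not AuraGem's actual code; the function name `internal_links` and the `gemini://` scheme registration are my own assumptions:

```python
from urllib.parse import urlparse, urljoin, uses_relative, uses_netloc

# urllib doesn't know the gemini:// scheme by default; registering it lets
# urljoin resolve relative links correctly (a common workaround, not part
# of AuraGem itself).
for table in (uses_relative, uses_netloc):
    if "gemini" not in table:
        table.append("gemini")

def internal_links(feed_url, links):
    """Resolve links found on a feed page and keep only the ones on the
    same host as the feed itself (i.e. drop all cross-host links)."""
    host = urlparse(feed_url).hostname
    resolved = (urljoin(feed_url, link) for link in links)
    return [url for url in resolved if urlparse(url).hostname == host]
```

A crawler loop would then fetch each URL the database flags as a feed, extract its links, and enqueue only what `internal_links` returns, so the aggregator stays scoped to each feed's own capsule.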

2 months ago · 👍 maxheadroom

1 Reply


👽 clseibold

Note: If you don't want your pages crawled by my search engine, be sure to use a robots.txt · 2 months ago
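For reference, a minimal robots.txt served at a capsule's root might look like the following. This is a hedged sketch: `example.org` and the disallowed paths are placeholders, and the `indexer` virtual user-agent comes from Gemini's robots.txt companion convention, which search-engine crawlers are expected to honor:

```
# Served at gemini://example.org/robots.txt
User-agent: indexer
Disallow: /private/

User-agent: *
Disallow: /drafts/
```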

Response: 20 (Success), text/gemini
Original URL: gemini://station.martinrue.com/clseibold/93925dca35894a12...
Status Code: 20 (Success)
Content-Type: text/gemini; charset=utf-8