Adding learning resources - URLs and subpaths - limit crawl depth

I just pasted a single URL and suddenly had 2,000 pages being parsed, with no way to limit which pages were included.

The easiest way to improve this is to introduce a crawl depth setting:
- Depth 0: only the URL passed in
- Depth 1: the URL plus the pages it links to directly
- Depth 2: those pages plus the pages they link to, and so on

Also, add a toggle to choose between following links outside the root domain and staying within it.
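The two settings above could be combined in a simple breadth-first crawl. Below is a minimal sketch; the `fetch_links` callback is a hypothetical stand-in for fetching a page and extracting its absolute link URLs, and all parameter names are my own assumptions, not the product's actual API.

```python
from collections import deque
from urllib.parse import urlparse

def crawl(start_url, fetch_links, max_depth=1, same_domain_only=True):
    """Breadth-first crawl limited to max_depth link hops from start_url.

    fetch_links(url) is assumed to return a list of absolute URLs linked
    from that page (hypothetical; in practice an HTTP fetch + HTML parse).
    """
    root = urlparse(start_url).netloc
    seen = {start_url}
    queue = deque([(start_url, 0)])
    pages = []
    while queue:
        url, depth = queue.popleft()
        pages.append(url)
        if depth == max_depth:
            continue  # depth limit reached: keep the page, don't follow its links
        for link in fetch_links(url):
            if link in seen:
                continue
            if same_domain_only and urlparse(link).netloc != root:
                continue  # skip links that leave the root domain
            seen.add(link)
            queue.append((link, depth + 1))
    return pages
```

With `max_depth=0` this returns just the start URL; `max_depth=1` with `same_domain_only=True` yields exactly the "URL and its direct subpages" case described in this request.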

With these small changes the feature would be vastly improved. Right now I cannot build a knowledge base from a URL and its direct subpages: I can only index 1 page or all 2,000, when I really want about 30.

Status: In Progress
Board: 💡 General
Tags: High Priority
Date: About 1 month ago
Author: Patrick Wolf
