How much does it spam? If I run it, am I likely to get IP banned anywhere?
…Or get in trouble for 'visiting' unsavoury sites?
There are always several projects to choose from.
The URLs project plays it fast and loose and archives an assortment of random URLs. This one has an IP block warning.
Some have NSFW warnings.
Other projects aim to archive a single site as accurately as possible (sometimes against a deadline, when the site is shutting down), so they can't afford to have their warriors blocked or rate limited; if yours is, that points to a problem worth reporting. You can also choose which projects to run, so you can simply avoid sites you don't want to visit.
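For example, if you run the official Docker image, you can pick a project up front instead of leaving it on auto-select (the nickname below is a placeholder, and the variable names are as I remember them from the ArchiveTeam wiki, so double-check there):

    docker run --detach \
      --env DOWNLOADER="yournick" \
      --env SELECTED_PROJECT="auto" \
      --publish 8001:8001 --restart unless-stopped \
      atdr.meo.ws/archiveteam/warrior-dockerfile

Replacing "auto" with a named project keeps the warrior off anything you'd rather not touch; you can also change the project later in the web UI at localhost:8001.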
Ah, I thought it was just a monolithic app you set going and have no control over. Ty
Just limit it to one job per session.
I ran 3 concurrent jobs while archiving Reddit and could still connect to the site without issue.
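If memory serves, that's the "Concurrent items" setting in the warrior's web UI; on the Docker image it can also be set with an environment variable (again, treat the exact name as a sketch and confirm on the wiki):

    docker run --detach \
      --env CONCURRENT_ITEMS="1" \
      --publish 8001:8001 --restart unless-stopped \
      atdr.meo.ws/archiveteam/warrior-dockerfile

One concurrent item keeps your request rate low; the warrior itself caps the setting at a handful (six, last I checked).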