DanWlker/dead_links_scraper

A recursive web scraper that finds dead links on a website. It supports both parallel and sequential execution.
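At a high level, the crawl can be pictured like the Go sketch below. This is illustrative only, not the repository's code: extractLinks here is a hypothetical stub, and the Caveats section below describes what real extraction would match.

package main

import (
	"fmt"
	"io"
	"net/http"
)

// extractLinks is a stand-in for real <a>-tag extraction (see Caveats).
func extractLinks(page string) []string { return nil }

// crawl visits url once, reports it if dead, and recurses into its links.
func crawl(url string, visited map[string]bool) {
	if visited[url] {
		return
	}
	visited[url] = true

	resp, err := http.Get(url)
	if err != nil {
		fmt.Println("dead:", url, err)
		return
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 400 {
		fmt.Println("dead:", url, resp.StatusCode)
		return
	}

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		return
	}
	for _, link := range extractLinks(string(body)) {
		crawl(link, visited)
	}
}

func main() {
	crawl("https://example.com/", map[string]bool{})
}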

To Build

go build .

Usage

./dead_links_scraper https://<your_link>

To start from a relative path, specify -s

./dead_links_scraper -s /relative_path https://<your_link>

To execute in parallel, specify -p

./dead_links_scraper -p https://<your_link>
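Conceptually, parallel mode checks many links at once instead of one after another. Below is a minimal sketch of that idea using goroutines and a sync.WaitGroup; it is illustrative only, and the repository's actual concurrency scheme may differ.

package main

import (
	"fmt"
	"net/http"
	"sync"
)

func main() {
	links := []string{
		"https://example.com/",
		"https://example.com/missing",
	}

	var wg sync.WaitGroup
	for _, link := range links {
		wg.Add(1)
		// Each link is checked in its own goroutine.
		go func(url string) {
			defer wg.Done()
			resp, err := http.Get(url)
			if err != nil {
				fmt.Printf("DEAD %s: %v\n", url, err)
				return
			}
			resp.Body.Close()
			if resp.StatusCode >= 400 {
				fmt.Printf("DEAD %s: status %d\n", url, resp.StatusCode)
			}
		}(link)
	}
	wg.Wait()
}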

Caveats

This scraper only follows links found inside HTML <a> tags; URLs referenced elsewhere (for example in <img> or <link> tags) are not checked.
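For reference, extracting only <a> hrefs can be done with the golang.org/x/net/html parser along the following lines. This is a minimal sketch, not the repository's code.

package main

import (
	"fmt"
	"strings"

	"golang.org/x/net/html"
)

// extractLinks returns the href values of every <a> tag in the document.
// URLs in <img src>, <link href>, <script src>, etc. are ignored, which is
// why such links are never checked.
func extractLinks(body string) ([]string, error) {
	doc, err := html.Parse(strings.NewReader(body))
	if err != nil {
		return nil, err
	}
	var links []string
	var walk func(n *html.Node)
	walk = func(n *html.Node) {
		if n.Type == html.ElementNode && n.Data == "a" {
			for _, attr := range n.Attr {
				if attr.Key == "href" {
					links = append(links, attr.Val)
				}
			}
		}
		for c := n.FirstChild; c != nil; c = c.NextSibling {
			walk(c)
		}
	}
	walk(doc)
	return links, nil
}

func main() {
	links, _ := extractLinks(`<p><a href="/about">About</a> <img src="/logo.png"></p>`)
	fmt.Println(links) // prints [/about]; the <img> src is ignored
}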
