Fetch web pages using headless Chrome, storing all fetched resources, including JavaScript files. Run arbitrary JavaScript on many web pages and see the returned values.
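One half of that description is storing every fetched resource on disk. A minimal Go sketch of one way a crawler might map a fetched URL to a local file path (a hypothetical scheme for illustration, not necessarily this tool's actual layout; `savePath` and the `index.html` default are assumptions):

```go
package main

import (
	"fmt"
	"net/url"
	"path"
)

// savePath maps a fetched URL to a relative on-disk path:
// host/path, with a default filename for directory-style URLs.
// (Hypothetical scheme, for illustration only.)
func savePath(raw string) (string, error) {
	u, err := url.Parse(raw)
	if err != nil {
		return "", err
	}
	p := u.Path
	if p == "" || p[len(p)-1] == '/' {
		// "https://example.com/" has no filename; pick a default.
		p = path.Join(p, "index.html")
	}
	return path.Join(u.Host, p), nil
}

func main() {
	for _, raw := range []string{
		"https://example.com/",
		"https://example.com/static/app.js",
	} {
		p, _ := savePath(raw)
		fmt.Println(p)
	}
	// Output:
	// example.com/index.html
	// example.com/static/app.js
}
```

For the other half (driving headless Chrome and evaluating JavaScript on a page), the usual Go building block is a Chrome DevTools Protocol client such as chromedp; which client this tool uses is not stated here.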
Information
Category: Golang / Web Crawling
Watchers: 2
Stars: 42
Forks: 1
Last update: Jun 3, 2021

Related Repos
southwolf Mac Bot - A simple crawler that sends a Telegram notification when a refurbished MacBook Air / Pro is in stock.
 

s0rg crawley - Crawls web pages and prints any link it can find. Scan depth (default 0) is configurable; features a fast SAX parser.
 

reactor-joy Reactor Crawler - A simple CLI content crawler for Joyreactor. It finds all media content on the page you provide and saves it.
 

amirgamil A Unix-style personal search engine and web crawler for your digital footprint
 

Zartenc Collyzar provides a very simple configuration and tools to implement distributed crawling/scraping.
 

el10savio GoCrawler - A distributed web crawler implemented using Go, Postgres, RabbitMQ and Docker
 

ianmarmour - A simple bot written in Go that tracks stock availability in NVIDIA's store and automatically adds items to your checkout.
 

denverquane Discord Bot to scrape Among Us on-screen data, and automatically mute/unmute players during the course of the game!
 

IAmStoxe - A Go utility to spider through a website, searching for additional links, with support for JavaScript rendering.
 

rocketlaunchr Quickly scrape Google Search Results.
 

jaeles-project GoSpider - A fast web spider written in Go. Install with `go get -u github.com/jaeles-project/gospider`. Features: fast web crawling, brute forcing and parsing of sitemap.xml, and robots.txt parsing.
 

hakluke hakrawler - A Go web crawler designed for easy, quick discovery of endpoints and assets within a web application. It can be used to discover forms, endpoints, subdomains, and related domains.
 

alash3al scrapyd-go - A drop-in replacement for scrapyd that is easier to scale and distribute across any number of commodity machines; each scrapyd-go instance is a stateless microservice.
 

geziyor Geziyor - A blazing fast web crawling and scraping framework. It can be used to crawl websites and extract structured data from them, for purposes such as data mining and monitoring.