# gtfs-realtime-capsule

A tool to archive GTFS-rt (GTFS-realtime) data.

## HOW-TOs

### How to start the local development Docker container

See the docker doc.

### How to run the scraper

#### Prerequisites

1. Update `config/config.json` to include the related credentials.
2. Make sure the implementation and the metadata JSON of the feed you want to scrape are in `src/scraper/feeds/`.
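As a rough illustration, the credential entries in `config/config.json` might look something like the sketch below. The key names (`feeds`, `api_key`, `s3`, `bucket`) are hypothetical placeholders, not the actual schema of this repository; check the shipped `config/config.json` for the real structure.

```json
{
  "feeds": {
    "mta_subway": {
      "api_key": "YOUR_API_KEY"
    }
  },
  "s3": {
    "bucket": "your-archive-bucket"
  }
}
```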

#### Via Docker

In `docker/prod/Dockerfile`, update the last `CMD` step with the feed you want to scrape, then build and run:

```shell
make local-prod-build
make local-prod-run
# you are now in a shell inside the docker container;
# check the scraped files with:
ls /src/data/
```
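For reference, the final `CMD` step might look something like the following. This is a guess assembled from the local run command shown later in this README; the feed id `mta_subway` is a hypothetical example, not necessarily what the shipped Dockerfile contains.

```dockerfile
# Hypothetical final step; replace mta_subway with the feed you want to scrape.
CMD ["python3", "/local/src/scraper/scrape.py", "-f", "mta_subway"]
```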

To inspect the scraper log on the machine running Docker:

```shell
docker logs -f local-prod
```

To store the logs locally:

```shell
docker logs -f local-prod &> prod_run.log &
```

#### Locally on your computer

```shell
python3 /local/src/scraper/scrape.py -f YOUR_FEED
```

An example is included in `example_mta_subway.txt`.
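To give a sense of what archiving a GTFS-rt feed involves, here is a minimal, self-contained sketch of a polling loop: fetch the raw protobuf bytes and write each snapshot under a timestamped filename. This is NOT the repository's `scrape.py` implementation; the feed URL, output directory, and filename scheme are all assumptions for illustration.

```python
# Hypothetical sketch of a GTFS-rt archiving loop. The feed URL,
# output directory, and filename format are placeholders, not the
# conventions used by this repository's scrape.py.
import time
import urllib.request
from datetime import datetime, timezone
from pathlib import Path

FEED_URL = "https://example.com/gtfs-rt/vehicle-positions"  # placeholder URL
OUT_DIR = Path("data")


def archive_name(feed_id: str, ts: datetime) -> str:
    """Build a sortable, collision-resistant filename for one snapshot."""
    return f"{feed_id}_{ts.strftime('%Y%m%dT%H%M%SZ')}.pb"


def scrape_once(feed_id: str) -> Path:
    """Fetch the raw protobuf payload once and archive it to disk."""
    OUT_DIR.mkdir(parents=True, exist_ok=True)
    raw = urllib.request.urlopen(FEED_URL, timeout=30).read()
    path = OUT_DIR / archive_name(feed_id, datetime.now(timezone.utc))
    path.write_bytes(raw)
    return path


def run(feed_id: str, interval_s: int = 30) -> None:
    """Poll the feed forever; stop with Ctrl-C."""
    while True:
        scrape_once(feed_id)
        time.sleep(interval_s)
```

Storing the raw bytes rather than a decoded form keeps the archive lossless; the protobuf can always be parsed later with the official GTFS-realtime bindings.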
