diff --git a/docs/usage.md b/docs/usage.md
index 8f536750..209f1b78 100644
--- a/docs/usage.md
+++ b/docs/usage.md
@@ -2,17 +2,26 @@
 
 You can use the `clean-scraper` command-line tool to scrape available agencies by supplying agency slugs. It will write files, by default, to a hidden directory in the user's home directory. On Apple and Linux systems, this will be `~/.clean-scraper`.
 
+To run a scraper, you must first know its agency "slug" (a state postal code + short agency name).
+
+You can list available agencies (and get their slugs) using the `list` subcommand:
+
 ```bash
-# Scrape a single state
-clean-scraper ca_san_diego_pd
+clean-scraper list
+```
+
+You can then run a scraper for an agency using its slug:
+
+```bash
+clean-scraper scrape ca_san_diego_pd
 ```
 
 To use the `clean` library in Python, import an agency's scraper and run it directly.
 
 ```python
-from clean.ca import import san_diego_pd as sdpd
+from clean.ca import san_diego_pd
 
-sdpd.scrape()
+san_diego_pd.scrape()
 ```
 
 ## Configuration
@@ -22,19 +31,14 @@ You can set the `CLEAN_OUTPUT_DIR` environment variable to specify a different d
 Use the `--help` flag to view additional configuration and usage options:
 
 ```bash
-clean-scraper --help
-
-Usage: python -m warn [OPTIONS] [STATES]...
+Usage: clean-scraper [OPTIONS] COMMAND [ARGS]...
 
-  Command-line interface for downloading law enforcement agency files.
-
-  SLUGS -- a list of one or agency slugs to scrape.
+  Command-line interface for downloading CLEAN files.
 
 Options:
-  --data-dir PATH         The Path were the results will be saved
-  --cache-dir PATH        The Path where results can be cached
-  --delete / --no-delete  Delete generated files from the cache
-  -l, --log-level [DEBUG|INFO|WARNING|ERROR|CRITICAL]
-                          Set the logging level
-  --help                  Show this message and exit.
+  --help  Show this message and exit.
+
+Commands:
+  list    List all available agencies and their slugs.
+  scrape  Command-line interface for downloading CLEAN files.
 ```