This assignment shows how to scrape the web using R and how to use APIs to access information. I looked at whether there is a relationship between the amount of money a country spends on R&D and the number of Nobel laureates it produces.
The R script was used to generate the R Markdown, markdown, and HTML documents.
Note that I didn't write the R Markdown manually; the R script generated all of the documents using `rsalad::spinMyR`. `rsalad` is a package I wrote that is available on GitHub and can be installed with `devtools::install_github("daattali/rsalad")`. The command used to generate all the files was `rsalad::spinMyR("hw12_web-scraping-api.R", wd = "hw/hw12_web-scraping-api")`, with the working directory being the root directory of this repository.
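Putting the two commands above together, the full workflow can be sketched as follows (a minimal sketch, assuming `devtools` and `rsalad` are not yet installed, run with the working directory set to the repository root):

```r
# Install rsalad from GitHub (requires the devtools package)
install.packages("devtools")
devtools::install_github("daattali/rsalad")

# Generate the R Markdown, markdown, and HTML documents from the R script,
# writing the output into the hw/hw12_web-scraping-api directory
rsalad::spinMyR("hw12_web-scraping-api.R", wd = "hw/hw12_web-scraping-api")
```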
Visual report showing the output.