fix R CMD check url issues & CRAN submission
pedrobtz committed Aug 25, 2024
1 parent 3c96e9f commit fa7d50e
Showing 11 changed files with 22 additions and 33 deletions.
4 changes: 2 additions & 2 deletions CRAN-SUBMISSION
@@ -1,3 +1,3 @@
 Version: 0.7.15
-Date: 2024-08-24 13:03:06 UTC
-SHA: d2ab4788e97bfad0a6e7b7a7c3b70938be954a5e
+Date: 2024-08-25 07:16:38 UTC
+SHA: 3c96e9f6872735f123da4dea2404c8f8df94810f
3 changes: 1 addition & 2 deletions DESCRIPTION
@@ -1,5 +1,5 @@
 Package: robotstxt
-Date: 2024-08-24
+Date: 2024-08-25
 Type: Package
 Title: A 'robots.txt' Parser and 'Webbot'/'Spider'/'Crawler' Permissions Checker
 Version: 0.7.15
@@ -24,7 +24,6 @@ Description: Provides functions to download and parse 'robots.txt' files.
     (spiders, crawler, scrapers, ...) are allowed to access specific
     resources on a domain.
 License: MIT + file LICENSE
-LazyData: TRUE
 BugReports: https://github.com/ropensci/robotstxt/issues
 URL: https://docs.ropensci.org/robotstxt/, https://github.com/ropensci/robotstxt
 Imports:
5 changes: 2 additions & 3 deletions R/get_robotstxt.R
@@ -8,9 +8,8 @@
 #' @param user_agent HTTP user-agent string to be used to retrieve robots.txt
 #' file from domain
 #'
-#' @param ssl_verifypeer analog to CURL option
-#' \url{https://curl.haxx.se/libcurl/c/CURLOPT_SSL_VERIFYPEER.html} -- and
-#' might help with robots.txt file retrieval in some cases
+#' @param ssl_verifypeer either 1 (default) or 0, if 0 it disables SSL peer verification, which
+#' might help with robots.txt file retrieval
 #' @param rt_robotstxt_http_getter function that executes HTTP request
 #' @param rt_request_handler handler function that handles request according to
 #'   the event handlers specified
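
For reference, a minimal usage sketch of the reworded parameter; the domain below is illustrative, and the calls assume the package's exported get_robotstxt():

library(robotstxt)

# Default: SSL peer verification on (ssl_verifypeer = 1).
rtxt <- get_robotstxt(domain = "example.com")

# If retrieval fails because of certificate problems, verification can be
# disabled -- use with care, as this weakens transport security.
rtxt <- get_robotstxt(domain = "example.com", ssl_verifypeer = 0)
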
7 changes: 2 additions & 5 deletions R/get_robotstxt_http_get.R
@@ -9,11 +9,8 @@ rt_last_http$request <- list()
 
 #' get_robotstxt() worker function to execute HTTP request
 #'
-#'
-#' @param ssl_verifypeer analog to CURL option
-#' \url{https://curl.haxx.se/libcurl/c/CURLOPT_SSL_VERIFYPEER.html}
-#' -- and might help with robots.txt file retrieval in some cases
-#'
+#' @param ssl_verifypeer either 1 (default) or 0, if 0 it disables SSL peer verification, which
+#' might help with robots.txt file retrieval
 #' @param domain the domain to get robots.txt file for
 #' @param user_agent the user agent to use for HTTP request header
 #'
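
As above, a hedged sketch of calling the worker directly; get_robotstxt_http_get() is normally invoked internally by get_robotstxt(), and the user-agent string here is made up:

library(robotstxt)

# Execute the HTTP request for a domain's robots.txt; the last request
# is recorded in the package-internal rt_last_http environment.
res <- get_robotstxt_http_get(
  domain         = "example.com",
  user_agent     = "my-crawler/0.1",
  ssl_verifypeer = 1
)
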
6 changes: 2 additions & 4 deletions R/get_robotstxts.R
@@ -7,10 +7,8 @@
 #' pages and vignettes of package future on how to set up
 #' plans for future execution because the robotstxt package
 #' does not do it on its own.
-#' @param ssl_verifypeer analog to CURL option
-#' \url{https://curl.haxx.se/libcurl/c/CURLOPT_SSL_VERIFYPEER.html}
-#' -- and might help with robots.txt file retrieval in some cases
-#'
+#' @param ssl_verifypeer either 1 (default) or 0, if 0 it disables SSL peer verification, which
+#' might help with robots.txt file retrieval
 #' @param rt_request_handler handler function that handles request according to
 #'   the event handlers specified
 #'
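
A sketch of the point made in this doc block, namely that the caller must set up a future plan themselves; it assumes get_robotstxts() accepts a use_futures argument (not shown in this hunk) and that the future package is installed:

library(robotstxt)

# robotstxt does not set up a plan on its own, so do it first.
future::plan(future::multisession)

# Retrieve robots.txt files for several domains in parallel.
rtxts <- get_robotstxts(
  domain      = c("example.com", "example.org"),
  use_futures = TRUE
)
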
5 changes: 3 additions & 2 deletions cran-comments.md
@@ -1,5 +1,6 @@
 ## R CMD check results
 
-0 errors | 0 warnings | 0 note
-* fixing all checks problems
+0 errors | 0 warnings | 1 note
+
+* fixing "incoming feasibility" URL checks problems
 * changing maintainer to Pedro Baltazar <pedrobtz@gmail.com>
5 changes: 2 additions & 3 deletions man/get_robotstxt.Rd

5 changes: 2 additions & 3 deletions man/get_robotstxt_http_get.Rd
5 changes: 2 additions & 3 deletions man/get_robotstxts.Rd
5 changes: 2 additions & 3 deletions man/paths_allowed.Rd
5 changes: 2 additions & 3 deletions man/robotstxt.Rd

(Generated .Rd files are not rendered in the diff view.)