I created the following program to download some files from NCBI (GEO):
for (i in 1:5) {
  x <- GEO[i, 1]
  myPath <- paste0("https://www.ncbi.nlm.nih.gov/geo/download/?acc=", x, "&format=file")
  download.file(myPath, paste0(x, ".tar"))
  out <- tryCatch(
    {
      message("This is the 'try' part")
      readLines(con = myPath, warn = FALSE)
    },
    error = function(cond) {
      message(paste("URL does not seem to exist:", myPath))
      message("Here's the original error message:")
      message(cond)
      # Choose a return value in case of error
      return(NA)
    },
    warning = function(cond) {
      message(paste("URL caused a warning:", myPath))
      message("Here's the original warning message:")
      message(cond)
      # Choose a return value in case of warning
      return(NULL)
    },
    finally = {
      message(paste("Processed URL:", myPath))
      message("Some other message at the end")
    }
  )
  return(out)
}
I'm trying to use tryCatch inside the loop, but it is not working as I expected.
How does one write a tryCatch loop so that: (i) when a URL is wrong, the code does not stop and continues downloading the rest of the URL list; (ii) when a URL is wrong, the code saves that "wrong URL" to a separate file (an exclusion list)?
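One possible way to structure this (a sketch, not tested against GEO itself): wrap only the `download.file()` call in `tryCatch()`, have the error and warning handlers return a failure flag instead of calling `return()` (which is an error at the top level of a script, outside a function), collect the failing URLs in a vector, and write that vector to a file after the loop. The function name `download_all`, the vector `failed_urls`, and the file name `"excluded_urls.txt"` are my own choices; the `downloader` argument exists only so the download step can be swapped out.

```r
# Sketch: loop over GEO accession IDs, skip failures, save an exclusion list.
# Assumes `ids` is a character vector of accessions, e.g. GEO[, 1].
download_all <- function(ids, downloader = download.file) {
  failed_urls <- character(0)

  for (x in ids) {
    myPath <- paste0("https://www.ncbi.nlm.nih.gov/geo/download/?acc=", x,
                     "&format=file")
    ok <- tryCatch(
      {
        downloader(myPath, destfile = paste0(x, ".tar"), mode = "wb")
        TRUE                          # reached only if the download succeeded
      },
      error = function(cond) {
        message("URL failed: ", myPath, " (", conditionMessage(cond), ")")
        FALSE                         # value tryCatch returns on error
      },
      warning = function(cond) {
        message("URL warned: ", myPath, " (", conditionMessage(cond), ")")
        FALSE                         # treat warnings as failures too
      }
    )
    if (!ok) failed_urls <- c(failed_urls, myPath)  # remember and keep going
  }

  writeLines(failed_urls, "excluded_urls.txt")      # the exclusion list
  invisible(failed_urls)
}
```

Because the handlers only return `FALSE`, a bad URL never stops the loop; each failure is appended to `failed_urls` and the next accession is tried. The `downloader` argument also lets you test the control flow without touching the network, e.g. by passing a stub function that throws an error for certain IDs.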