It suddenly occurred to me that R might be able to download files directly, and it turns out there is a function for exactly that: download.file. You give it the URL of the file you want to download and a destination file name, and it saves the file for you. Combined with a web crawler that collects multiple URLs, this should make it possible to download many files automatically. A code example is below.
# URL of the file to download and the local file name to save it as
url <- "https://exam.tcte.edu.tw/105_4y/105-4y-00-ma.pdf"
destfile <- "myfile.pdf"

# mode = "wb" (binary write) is needed on Windows for non-text files like PDFs
download.file(url, destfile, mode = "wb")
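To sketch the "crawler plus download.file" idea, here is a minimal loop over a vector of URLs. The second URL is a hypothetical example for illustration; in practice the vector would come from whatever the crawler scraped. basename() takes the last segment of each URL as the local file name.

```r
# Hypothetical list of URLs, e.g. collected by a crawler
# (only the first URL is from the original post; the second is made up)
urls <- c(
  "https://exam.tcte.edu.tw/105_4y/105-4y-00-ma.pdf",
  "https://exam.tcte.edu.tw/105_4y/105-4y-01-ma.pdf"
)

for (url in urls) {
  # Use the last path segment of the URL as the destination file name
  destfile <- basename(url)
  download.file(url, destfile, mode = "wb")
}
```

Note that download.file stops with an error if a URL fails; wrapping the call in tryCatch() lets the loop continue past broken links.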