Does elisp have a function that takes a url and a destination and downloads that url off the internet?
I've discovered url-retrieve and url-retrieve-synchronously, but url-retrieve takes a callback and url-retrieve-synchronously puts everything into a buffer. Is there anything simpler?
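For reference, this is roughly what I mean (the URL here is just a placeholder):

;; url-retrieve wants a callback that runs when the request finishes
(url-retrieve "https://example.com/file.txt"
              (lambda (status) (message "done: %S" status)))

;; url-retrieve-synchronously just returns a buffer containing the
;; HTTP headers followed by the body
(url-retrieve-synchronously "https://example.com/file.txt")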
To download a file: from inside Emacs, type M-x shell to start a shell. Since wget places the downloaded file in the current directory, change to the directory where you want the file saved. Once you are there, type wget [url] and press Enter.
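If you'd rather drive the same idea from Lisp instead of an interactive shell, a sketch like this should work, assuming wget is on your PATH (the URL and directory are placeholders):

;; run wget with ~/downloads/ as the working directory, so the file lands there
(let ((default-directory "~/downloads/"))
  (shell-command "wget https://example.com/file.txt"))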
In a fresh Emacs buffer, type ESC-x lisp-interaction-mode. That turns the buffer into a Lisp terminal: pressing Ctrl+j feeds the s-expression just before your cursor (called "point" in Emacs manual jargon) to Lisp and prints the result.
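For example (the printed result is just the normal behaviour of eval-print-last-sexp, which Ctrl+j runs in this mode):

;; Put point right after the closing paren and press Ctrl+j;
;; the value 3 is printed on the next line.
(+ 1 2)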
Buffers in Emacs editing are objects that have distinct names and hold text that can be edited. Buffers appear to Lisp programs as a special data type. You can think of the contents of a buffer as a string that you can extend; insertions and deletions may occur in any part of the buffer. See Text.
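As a quick illustration of buffers as Lisp values (the text here is arbitrary):

(with-temp-buffer          ; create a temporary buffer and make it current
  (insert "hello, ")       ; insertions happen at point
  (insert "world")
  (buffer-string))         ; => "hello, world"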
Try url-copy-file. Its description reads,
url-copy-file is an autoloaded Lisp function in `url-handlers.el'.
(url-copy-file url newname &optional ok-if-already-exists keep-time)
Copy url to newname. Both args must be strings. Signals a `file-already-exists' error if file newname already exists, unless a third argument ok-if-already-exists is supplied and non-nil. A number as third arg means request confirmation if newname already exists. This is what happens in interactive use with M-x. Fourth arg keep-time non-nil means give the new file the same last-modified time as the old one. (This works on only some systems.) A prefix arg makes keep-time non-nil.
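So in practice a one-liner like this should be enough (the URL and destination are placeholders; the non-nil third argument allows overwriting an existing file):

(url-copy-file "https://example.com/file.txt" "~/downloads/file.txt" t)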
Obviously url-copy-file is the best option, but to the more adventurous Emacs hackers I'd suggest something like this:
(require 'url)

(defun download-file (&optional url download-dir download-name)
  (interactive)
  (let* ((url (or url (read-string "Enter download URL: ")))
         (download-buffer (url-retrieve-synchronously url)))
    (with-current-buffer download-buffer
      ;; we may have to trim the http response headers
      (goto-char (point-min))
      (re-search-forward "^$" nil 'move)
      (forward-char)
      (delete-region (point-min) (point))
      ;; default to saving under the last component of the URL
      (write-file (concat (or download-dir "~/downloads/")
                          (or download-name
                              (car (last (split-string url "/" t)))))))))
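You can run it with M-x download-file and paste the URL at the prompt, or call it from Lisp (the arguments below are just examples):

(download-file "https://example.com/file.txt" "~/downloads/" "file.txt")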