Mirror of https://git.savannah.gnu.org/git/emacs.git (synced 2024-12-23 10:34:07 +00:00)
Fix previous URL doc change
* lisp/url/url-queue.el (url-queue-retrieve): Fix previous doc fix.
* doc/misc/url.texi (Retrieving URLs): Update url-retrieve arguments.
Mention url-queue-retrieve.
* etc/NEWS: Related edit.
commit a48ec60ca1
parent 578ad769a6
doc/misc/url.texi
@@ -216,10 +216,10 @@ non-@code{nil}, do not store or send cookies.
 @vindex url-queue-parallel-processes
 @vindex url-queue-timeout
 @defun url-queue-retrieve url callback &optional cbargs silent no-cookies
-This acts like the @code{url-retrieve} function, but downloads in
-parallel.  The option @code{url-queue-parallel-processes} controls the
-number of concurrent processes, and the option @code{url-queue-timeout}
-sets a timeout in seconds.
+This acts like the @code{url-retrieve} function, but with limits on
+the degree of parallelism.  The option @code{url-queue-parallel-processes}
+controls the number of concurrent processes, and the option
+@code{url-queue-timeout} sets a timeout in seconds.
 @end defun

 @node Supported URL Types
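For readers skimming this change, a minimal usage sketch of the function being documented (the URL, the callback body, and the chosen limit values below are illustrative assumptions, not part of this commit):

    (require 'url-queue)

    ;; The two limits the revised Texinfo text describes.
    (setq url-queue-parallel-processes 4   ; at most 4 fetches run concurrently
          url-queue-timeout 10)            ; a fetch may take at most 10 seconds

    ;; Queue an asynchronous fetch; CALLBACK is called with the retrieval
    ;; STATUS followed by CBARGS when the download finishes.
    (url-queue-retrieve "https://www.gnu.org/"
                        (lambda (status label)
                          (message "Fetched %s: %S" label status))
                        '("gnu-home"))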
etc/NEWS
@@ -858,8 +858,9 @@ default value to "".
 remote machines that support SELinux.

 +++
-** New function, url-queue-retrieve, fetches URLs asynchronously like
-url-retrieve does, but in parallel.
+** New function, `url-queue-retrieve', which behaves like url-retrieve,
+but with limits (`url-queue-parallel-processes', `url-queue-timeout') on
+the degree of parallelism.

 ** VC and related modes

lisp/url/ChangeLog
@@ -1,3 +1,7 @@
+2012-02-10  Glenn Morris  <rgm@gnu.org>
+
+        * url-queue.el (url-queue-retrieve): Fix previous doc fix.
+
 2012-02-10  Andreas Schwab  <schwab@linux-m68k.org>

         * url-http.el (url-http-clean-headers): Return the number of
lisp/url/url-queue.el
@@ -57,9 +57,9 @@
 (defun url-queue-retrieve (url callback &optional cbargs silent inhibit-cookies)
   "Retrieve URL asynchronously and call CALLBACK with CBARGS when finished.
 This is like `url-retrieve' (which see for details of the arguments),
-but downloads in parallel.  The variable `url-queue-parallel-processes'
-sets the number of concurrent processes.  The variable `url-queue-timeout'
-sets a timeout."
+but with limits on the degree of parallelism.  The variable
+`url-queue-parallel-processes' sets the number of concurrent processes.
+The variable `url-queue-timeout' sets a timeout."
   (setq url-queue
        (append url-queue
                (list (make-url-queue :url url
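As a complement to the revised docstring, a hedged sketch of how the two limit variables come into play when several retrievals are queued at once (the URLs and the message callback are made up for the example; only url-queue-retrieve, url-queue-parallel-processes, and url-queue-timeout come from the library):

    (require 'url-queue)

    (setq url-queue-parallel-processes 2  ; only two downloads run at a time
          url-queue-timeout 5)            ; each download gets at most 5 seconds

    (dolist (url '("https://www.gnu.org/software/emacs/"
                   "https://www.gnu.org/licenses/"
                   "https://www.gnu.org/philosophy/"))
      ;; All three calls return immediately; the third download only
      ;; starts once one of the first two has finished or timed out.
      (url-queue-retrieve url
                          (lambda (status)
                            (message "Finished %S" status))
                          nil t))         ; no CBARGS; SILENT is non-nil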