Update 20221110.1 How to Fix bash wget Command Not Found Error.md

六开箱 2022-11-13 14:30:18 +08:00 committed by GitHub
parent c5313884c6
commit 0abb164c8f

@@ -69,7 +69,28 @@ wget https://releases.ubuntu.com/22.04.1/ubuntu-22.04.1-desktop-amd64.iso
Similarly, you can download using the above command, or by combining several switches as described below. You can also see these options via the `wget --help` command.
```
-t, --tries=NUMBER set number of retries to NUMBER (0 unlimits)
--retry-connrefused retry even if connection is refused
--retry-on-http-error=ERRORS comma-separated list of HTTP errors to retry
-O, --output-document=FILE write documents to FILE
-nc, --no-clobber              skip downloads that would download to existing files (overwriting them)
--no-netrc don't try to obtain credentials from .netrc
-c, --continue resume getting a partially-downloaded file
--start-pos=OFFSET start downloading from zero-based position OFFSET
--progress=TYPE select progress gauge type
--show-progress display the progress bar in any verbosity mode
-N, --timestamping don't re-retrieve files unless newer than local
--no-if-modified-since don't use conditional if-modified-since get requests in timestamping mode
--no-use-server-timestamps don't set the local file's timestamp by the one on the server
-S, --server-response print server response
--spider don't download anything
-T, --timeout=SECONDS set all timeout values to SECONDS
--dns-timeout=SECS set the DNS lookup timeout to SECS
--connect-timeout=SECS set the connect timeout to SECS
--read-timeout=SECS set the read timeout to SECS
-w, --wait=SECONDS             wait SECONDS between retrievals (applies if more than 1 URL is to be retrieved)
--waitretry=SECONDS            wait 1..SECONDS between retries of a retrieval (applies if more than 1 URL is to be retrieved)
--random-wait                  wait from 0.5*WAIT...1.5*WAIT secs between retrievals (applies if more than 1 URL is to be retrieved)
```
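As a minimal illustration, several of the switches above can be combined in one command. The sketch below reuses the Ubuntu ISO URL from earlier; `--spider` is added so the command only checks that the link is reachable rather than saving the full image (drop it to actually download), and the retry/wait values are arbitrary examples:

```shell
# -c               resume a partially downloaded file (--continue)
# -t 3             stop after 3 retries (--tries)
# -w 5             wait 5 seconds between retrievals (--wait)
# --show-progress  display the progress bar in any verbosity mode
# --spider         check the URL only, download nothing (remove to download)
wget --spider -c -t 3 -w 5 --show-progress \
  https://releases.ubuntu.com/22.04.1/ubuntu-22.04.1-desktop-amd64.iso \
  || echo "wget missing or host unreachable"
```

With `--spider` removed, re-running the same command after an interruption picks up where the partial file left off, thanks to `-c`.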
### Wrapping Up