From 0abb164c8f23c72371440f0fde37d30269f4980b Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E5=85=AD=E5=BC=80=E7=AE=B1?=
Date: Sun, 13 Nov 2022 14:30:18 +0800
Subject: [PATCH] =?UTF-8?q?Update=2020221110.1=20=E2=AD=90=EF=B8=8F=20How?=
 =?UTF-8?q?=20to=20Fix=20bash=20wget=20Command=20Not=20Found=20Error.md?=
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 ...ow to Fix bash wget Command Not Found Error.md | 23 ++++++++++++++++++-
 1 file changed, 22 insertions(+), 1 deletion(-)

diff --git a/sources/tech/20221110.1 ⭐️ How to Fix bash wget Command Not Found Error.md b/sources/tech/20221110.1 ⭐️ How to Fix bash wget Command Not Found Error.md
index dd8e9bb4e8..5d010bc7e5 100644
--- a/sources/tech/20221110.1 ⭐️ How to Fix bash wget Command Not Found Error.md
+++ b/sources/tech/20221110.1 ⭐️ How to Fix bash wget Command Not Found Error.md
@@ -69,7 +69,28 @@ wget https://releases.ubuntu.com/22.04.1/ubuntu-22.04.1-desktop-amd64.iso
 Similarly, you can also download using the above command or, by combining several switches as described below. You can also get this via `wget --help` command.
 
 ```
--t, --tries=NUMBER set number of retries to NUMBER (0 unlimits)--retry-connrefused retry even if connection is refused--retry-on-http-error=ERRORS comma-separated list of HTTP errors to retry-O, --output-document=FILE write documents to FILE-nc, --no-clobber skip downloads that would download toexisting files (overwriting them)--no-netrc don't try to obtain credentials from .netrc-c, --continue resume getting a partially-downloaded file--start-pos=OFFSET start downloading from zero-based position OFFSET--progress=TYPE select progress gauge type--show-progress display the progress bar in any verbosity mode-N, --timestamping don't re-retrieve files unless newer thanlocal--no-if-modified-since don't use conditional if-modified-since getrequests in timestamping mode--no-use-server-timestamps don't set the local file's timestamp bythe one on the server-S, --server-response print server response--spider don't download anything-T, --timeout=SECONDS set all timeout values to SECONDS--dns-timeout=SECS set the DNS lookup timeout to SECS--connect-timeout=SECS set the connect timeout to SECS--read-timeout=SECS set the read timeout to SECS-w, --wait=SECONDS wait SECONDS between retrievals(applies if more then 1 URL is to be retrieved)--waitretry=SECONDS wait 1..SECONDS between retries of a retrieval(applies if more then 1 URL is to be retrieved)--random-wait wait from 0.5WAIT…1.5WAIT secs between retrievals(applies if more then 1 URL is to be retrieved)
+-t, --tries=NUMBER               set number of retries to NUMBER (0 unlimits)
+--retry-connrefused              retry even if connection is refused
+--retry-on-http-error=ERRORS     comma-separated list of HTTP errors to retry
+-O, --output-document=FILE       write documents to FILE
+-nc, --no-clobber                skip downloads that would download to existing files (overwriting them)
+--no-netrc                       don't try to obtain credentials from .netrc
+-c, --continue                   resume getting a partially-downloaded file
+--start-pos=OFFSET               start downloading from zero-based position OFFSET
+--progress=TYPE                  select progress gauge type
+--show-progress                  display the progress bar in any verbosity mode
+-N, --timestamping               don't re-retrieve files unless newer than local
+--no-if-modified-since           don't use conditional if-modified-since get requests in timestamping mode
+--no-use-server-timestamps       don't set the local file's timestamp by the one on the server
+-S, --server-response            print server response
+--spider                         don't download anything
+-T, --timeout=SECONDS            set all timeout values to SECONDS
+--dns-timeout=SECS               set the DNS lookup timeout to SECS
+--connect-timeout=SECS           set the connect timeout to SECS
+--read-timeout=SECS              set the read timeout to SECS
+-w, --wait=SECONDS               wait SECONDS between retrievals (applies if more than 1 URL is to be retrieved)
+--waitretry=SECONDS              wait 1..SECONDS between retries of a retrieval (applies if more than 1 URL is to be retrieved)
+--random-wait                    wait from 0.5*WAIT...1.5*WAIT secs between retrievals (applies if more than 1 URL is to be retrieved)
 ```
 
 ### Wrapping Up
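
Aside: the switches listed in the hunk above can be combined in a single invocation. A minimal sketch, assuming the Ubuntu 22.04.1 ISO URL quoted in the patched article; the output name `ubuntu.iso` is an arbitrary illustration:

```
# Resume a partially-downloaded file (-c), retry up to 5 times (-t 5),
# wait up to 10 seconds between retries (--waitretry=10),
# and write the download to ubuntu.iso (-O).
wget -c -t 5 --waitretry=10 -O ubuntu.iso \
    https://releases.ubuntu.com/22.04.1/ubuntu-22.04.1-desktop-amd64.iso
```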