[PATCH cygport] src_fetch.cygpart: iterate thru mirror lists if downloads fail
Brian Inglis
Brian.Inglis@Shaw.ca
Mon Mar 27 05:41:45 GMT 2023
This patch allows downloads of recently released files (especially from GNU
mirrors) to succeed where they currently fail repeatedly (particularly in
Scallywag), because mirror downloads do not check whether the requested file
is actually available yet on the selected mirror, and propagation delays mean
it often is not.
--
Take care. Thanks, Brian Inglis Calgary, Alberta, Canada
La perfection est atteinte, non pas lorsqu'il n'y a plus rien à ajouter,
mais lorsqu'il n'y a plus rien à retirer.
Perfection is achieved, not when there is no more to add,
but when there is no more to cut.
-- Antoine de Saint-Exupéry
cygport lib/src_fetch.cygpart: iterate thru mirror lists if downloads fail
Downloads of recently released packages from mirrors often fail due to
variable mirror propagation delays; failures often occur on .asc/.sig
retrieval, which may come from a different mirror than the compressed tar,
and a failure seen in local downloads is likely to recur in CI jobs,
requiring cygport tweaks just to redo a push.
This is particularly common with GNU packages, which often seem to be
released close to weekends; that is also when I mainly run builds, and
probably when many mirrors run backup processes which may further lengthen
propagation.
The GNU mirror stats use 28 and 52 hours as their cutoffs for normal
propagation delay, and there always seem to be more sites impacted on
weekends.
I have often had to change my cygport(s) to get builds to start, especially
under Scallywag, where the process attempts to download the files three
times (for the source, arch, and noarch package builds) and failure of any
one of these fails the build; in my experience it is mainly the initial
source build that fails, and I then have to modify the cygport in some
trivial way in order to be able to commit and push it again.
This patch has solved the problem for mirror use in my local builds, so it
would be good to know that I can continue to use mirrors in CI as well,
rather than hard-coding overrides to the primary source after a succession
of download failures.
The design of __mirror_fetch() expects to be able to try another site in
its mirror list if the first fails, but on errors fetch() calls error,
which exits cygport.
Allow retries from other mirrors in the list when there is more than one,
by passing an override variable from __mirror_fetch() to fetch().
If the override variable is defined, fetch() returns the download
program's exit status to __mirror_fetch() rather than exiting cygport.
The download program's error message is still issued, so the maintainer
is made aware of the issue, but __mirror_fetch() continues trying all
mirrors in the list.
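The mechanism can be sketched stand-alone as follows; try_fetch and
MIRRORS are hypothetical stand-ins for the real fetch()/mirror-list code,
used only to show the per-invocation override and return-vs-exit split:

```shell
#!/bin/bash
# Sketch of the override pattern (try_fetch and MIRRORS are
# illustrative names, not the actual cygport functions).

try_fetch() {
	local uri=$1
	local rc
	# Simulate a download that only succeeds on one mirror.
	[ "${uri}" = "https://good.example/pkg.tar.gz" ]
	rc=$?
	if [ ${rc} -ne 0 ]
	then
		echo "fetch ${uri} failed" >&2
		if [ -n "${__DL_MIRROR_LIST}" ]
		then
			return ${rc}	# let the caller try the next mirror
		else
			exit ${rc}	# stand-alone fetch: fatal, as before
		fi
	fi
	return 0
}

MIRRORS=(https://bad1.example https://bad2.example https://good.example)
n=0
while (( n < ${#MIRRORS[@]} ))
do
	# The override is passed per-invocation, so plain try_fetch calls
	# elsewhere keep the old exit-on-error behaviour.
	if __DL_MIRROR_LIST=${#MIRRORS[@]} try_fetch "${MIRRORS[n]}/pkg.tar.gz"
	then
		echo "downloaded from ${MIRRORS[n]}"
		break
	fi
	n=$((n + 1))
done
```

Because the assignment prefixes the call, __DL_MIRROR_LIST is visible only
for that invocation and is unset again afterwards, which is why no cleanup
is needed in the caller.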
Where there are ongoing mirror issues, the maintainer can add local
unofficial backup sites to the relevant list by adding to the definition
from their cygport using 'mirror_NAME+="proto://site ..."', for example:
mirror_gnu+="https://muug.ca/mirror/gnu"
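The append interacts with the list expansion in __mirror_fetch() roughly
as below; the two official-looking URLs are hypothetical short stand-ins
for the real mirror_gnu list, and note the leading space in the appended
string so the new entry stays a separate word after word-splitting:

```shell
#!/bin/bash
# Hypothetical short list; the real mirror_gnu in cygport is longer.
mirror_gnu="https://ftpmirror.gnu.org https://mirrors.kernel.org/gnu"

# A maintainer's cygport appends an unofficial backup site; the leading
# space keeps it a separate word when the list is split.
mirror_gnu+=" https://muug.ca/mirror/gnu"

# __mirror_fetch() resolves the list name indirectly, then word-splits
# the value into an array of candidate sites:
mirvar=mirror_gnu
mirlist=(${!mirvar})
echo "${#mirlist[@]} mirrors, last: ${mirlist[2]}"
```

The backup site is tried last because the retry loop walks the array in
order, so official mirrors are still preferred.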
--- lib/src_fetch.cygpart 2022-02-23 05:52:54.000000000 -0700
+++ lib/src_fetch.cygpart 2022-09-06 09:50:10.405917200 -0600
@@ -74,6 +74,8 @@
fetch() {
local uri;
local urifile;
+ local prog;
+ local rc;
uri=${1%\#/*};
urifile=${1##*\#/};
@@ -86,24 +88,29 @@ fetch() {
return 0
elif check_prog wget
then
- if wget --no-check-certificate -O ${urifile}.tmp ${uri}
- then
- mv -f ${urifile}.tmp ${urifile}
- else
- rm -f ${urifile}.tmp
- error "wget ${uri} failed"
- fi
+ prog=wget
+ wget --no-check-certificate -O ${urifile}.tmp ${uri}
+ rc=$?
elif check_prog curl
then
- if curl -R -k --url ${uri} -o ${urifile}.tmp
+ prog=curl
+ curl -R -k --url ${uri} -o ${urifile}.tmp
+ rc=$?
+ else
+ error "Either wget or curl are required to fetch sources.";
+ fi
+
+ if [ 0 = ${rc} ]
+ then
+ mv -f ${urifile}.tmp ${urifile}
+ else
+ if defined __DL_MIRROR_LIST
then
- mv -f ${urifile}.tmp ${urifile}
+ return ${rc}
else
rm -f ${urifile}.tmp
- error "curl ${uri} failed"
+ error "${prog} ${uri} failed"
fi
- else
- error "Either wget or curl are required to fetch sources.";
fi
if defined DISTDIR && [ -f ${urifile} ]
@@ -119,6 +126,7 @@ __mirror_fetch() {
local mirvar;
local -a mirlist;
local -i n;
+ local dl_mirrors;
miruri=${1#mirror://};
mirname=${miruri%%/*};
@@ -131,17 +139,22 @@ __mirror_fetch() {
mirlist=(${!mirvar});
+ if [ ${#mirlist[*]} -gt 1 ] # iterate thru list > 1
+ then
+ dl_mirrors=${#mirlist[*]}
+ fi
+
n=0;
while (( n < ${#mirlist[*]} ))
do
- if fetch ${mirlist[${n}]}/${miruri#*/}
+ if __DL_MIRROR_LIST=${dl_mirrors} fetch ${mirlist[${n}]}/${miruri#*/}
then
return 0;
fi
n+=1;
done
- error "Could not download ${1##*/}";
+ error "Could not download ${1##*/} from ${mirname} mirror(s)";
}
# downloads all sources through method-specific functions