task #15345: Automatically using backup tarball repository
Submitter:             Mohammad Akhlaghi <makhlaghi>
Submitted:             Mon 29 Jul 2019 01:08:10 AM UTC
Should Start On:       Sun 28 Jul 2019 11:00:00 PM UTC
Should be Finished on: Sun 28 Jul 2019 11:00:00 PM UTC
Category:              Software
Priority:              5 - Normal
Status:                Postponed
Privacy:               Public
Percent Complete:      80%
Assigned to:           None
Open/Closed:           Open
Effort:                0.00
Tue 06 Aug 2019 01:15:10 AM UTC, comment #4: I started an early implementation of this on my fork's `auto-mirror' branch; its first commit shows the current implementation. In short, there is now a `mirror.conf' file in which a project designer can define their own list of mirrors to use when a tarball can't be downloaded from its original URL.
I haven't had time to fully check it, so I haven't merged it into the project's `master' branch yet.
Mohammad Akhlaghi <makhlaghi>
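The `mirror.conf' fallback described in the comment above can be summarized with a minimal POSIX-shell sketch: try the original URL first, then each mirror in turn, and stop at the first successful download. The one-URL-per-line file format, the example tarball and server names, and the use of wget are assumptions for illustration, not the `auto-mirror' branch's actual code.

    # Minimal sketch of the mirror-fallback idea (not the real implementation).
    name=gnuastro-0.10.tar.lz
    origbase=http://ftp.gnu.org/gnu/gnuastro    # original server (example)

    downloaded=0
    for base in $origbase $(cat mirror.conf); do
        # Stop at the first server that delivers the tarball.
        if wget "$base/$name" -O "$name"; then
            downloaded=1
            break
        fi
    done

    if [ $downloaded = 0 ]; then
        echo "$name: could not be downloaded from any server" >&2
        exit 1
    fi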
Mon 05 Aug 2019 12:47:39 AM UTC, comment #3: The current backup repository is in Gitlab, which can't be opened in some countries.
I have cloned that repository to this top-level address: `http://akhlaghi.org/reproduce-software'. It doesn't actually have any HTML file (so nothing will show up if you click on it), but it does contain all the tarballs. For example, this link to Gnuastro's tarball will work: http://akhlaghi.org/reproduce-software/gnuastro-0.10.tar.lz
Because all the tarballs (including the ones that are only in our repository) are now available there, the base URL of those tarballs has also been changed to point to this address.
Until we complete this task, if the download of any software fails, just download it manually (with the same filename) from this top URL: `http://akhlaghi.org/reproduce-software'.
Mohammad Akhlaghi <makhlaghi>
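Until the automatic fallback lands, the manual workaround in the comment above amounts to fetching the tarball, with the same filename, from the backup top URL. A hedged example (wget is only used for illustration; any downloader works):

    # Fetch a tarball from the backup address, keeping the same filename.
    backup=http://akhlaghi.org/reproduce-software
    name=gnuastro-0.10.tar.lz      # example filename from the comment above
    wget "$backup/$name" -O "$name"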
Mon 29 Jul 2019 01:28:43 PM UTC, comment #2: Thanks for bringing this up, Boud; you are completely right.
Checking tarball checksums is something I have been thinking of doing for a long time, but I haven't had the time to do it yet. We will definitely add this step :-).
I just defined task #15347 to officially document this task. Hopefully one of us will be able to do it; as described there, it's not too hard ;-).
In fact, even for Python packages, we don't check the checksums! Those hash values are actually within the URL that we use to download the tarball!
Mohammad Akhlaghi <makhlaghi>
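For context, the verification step that task #15347 asks for could look roughly like the sketch below. It assumes SHA256 (the task doesn't fix an algorithm) and a per-tarball expected value recorded in the project; the filename and checksum here are placeholders, not values from the project itself.

    # Verify a downloaded tarball against its recorded checksum (sketch).
    tarball=gnuastro-0.10.tar.lz
    expected=0000000000000000000000000000000000000000000000000000000000000000  # placeholder

    actual=$(sha256sum "$tarball" | cut -d' ' -f1)
    if [ "$actual" != "$expected" ]; then
        echo "$tarball: checksum mismatch, not using this file" >&2
        exit 1
    fi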
Mon 29 Jul 2019 01:02:55 PM UTC, comment #1: Is it correct that as of commit da1123c, searching for hashes to check tarball integrity only finds the python hashes? I'd agree that python packages are a somewhat special case here, since their hash values appear in the download URLs themselves.
I think that a necessary element of automatically using backup tarballs is verifying a checksum of whatever file is actually downloaded, from whichever server.
The python checksums appear to have 64 hexadecimal characters, with two pairs split off, which suggests a 256-bit hash.
Anyway, my recommendation would be to, at least, have checksums for all downloaded packages and check them after every download.
Boud Roukema <boud>
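The URL-embedded hashes that comment #1 describes (64 hexadecimal characters with two pairs split off) can be reassembled from the download path. A small sketch under the assumption that the split-off pairs are the leading path components; the URL and hash below are made up and only mimic that layout:

    # Re-join the hash pieces embedded in a python download URL (illustration only).
    url=https://files.pythonhosted.org/packages/ab/cd/0123456789abcdef0123456789abcdef0123456789abcdef0123456789ab/pkg-1.0.tar.gz

    path=${url#*/packages/}                        # ab/cd/<60 more characters>/pkg-1.0.tar.gz
    hash=$(printf '%s' "${path%/*}" | tr -d '/')   # drop the filename, concatenate the pieces
    printf '%s (%d characters)\n' "$hash" "${#hash}"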
Mon 29 Jul 2019 01:08:10 AM UTC, original submission:
When a software's server is inaccessible for any reason and the template user doesn't have a local copy of the necessary tarballs, the pipeline will stop progressing.
We already have a backup repository with all the necessary software tarballs. But so far, it's up to the user to manually get the tarballs from there and put them where the template can see them.
We should make this automatic: as soon as a software tarball can't be downloaded from its original URL, the template should go to our backup repository and try downloading it from there.
Mohammad Akhlaghi <makhlaghi>
No files currently attached
Depends on the following items: None found
Items that depend on this one: None found
There are 0 votes so far.