  1. Apr 15, 2020
    • Added ability to remove packages from S3 · d6c50af1
      Alexander V. Tikhonov authored
      Added the ability to remove a package, given via an option, from S3.
      To remove a package, set the '-r=<package name with version>' option,
      for example:
        ./tools/update_repo.sh -o=<OS> -d=<DIST> -b=<S3 repo> \
          -r=tarantool-2.2.2.0
      This removes all matching source and binary packages from the given
      S3 repository and updates the meta files there accordingly.
      
      Close #4839
    • Add help instruction on 'product' option · cccc989c
      Alexander V. Tikhonov authored
      Added help instructions for the 'product' option, with examples.
      
      Part of #4839
    • Enable script for saving packages in S3 for modules · 4527a4da
      Alexander V. Tikhonov authored
      Found that modules may provide only binary packages, without source
      packages. Changed the script so it can work with binary-only or
      source-only packages.
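      That check can be sketched as a small shell helper (a minimal
      illustration; the function name and layout are assumptions, not the
      actual update_repo.sh code):

```shell
# Sketch: succeed when the directory holds binary packages, source
# packages, or both; fail only when it holds neither.
have_packages() {
    dir=$1
    binaries=$(find "$dir" -maxdepth 1 -name '*.rpm' ! -name '*.src.rpm' | wc -l)
    sources=$(find "$dir" -maxdepth 1 -name '*.src.rpm' | wc -l)
    [ "$binaries" -gt 0 ] || [ "$sources" -gt 0 ]
}
```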
      
      Part of #4839
    • Add metafiles cleanup routines at S3 pack script · ed491409
      Alexander V. Tikhonov authored
      Added cleanup functionality for the meta files.
      The script may encounter the following situations:

       - package files were removed from S3 but are still registered:
         the script stores and registers the new packages on S3 and
         removes all other blocks registered for the same files in the
         meta files.

       - package files already exist on S3 with the same hashes:
         the script skips them with a warning message.

       - package files already exist on S3 with old hashes:
         the script fails if the force flag is not set; with the force
         flag it stores and registers the new packages on S3 and removes
         all other blocks registered for the same files in the meta files.

      Added the '-s|skip_errors' option flag to skip errors on changed
      packages, so the script does not exit on such errors.
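      The decision among those three situations can be sketched as a small
      shell function (the function name, argument convention, and action
      labels here are illustrative assumptions, not the script's real code):

```shell
# Sketch of the metadata hash-check decision described above.
decide_action() {
    file_on_s3=$1   # 'yes' if the package file is present on S3
    same_hash=$2    # 'yes' if the registered hash matches the new package
    force=$3        # 'yes' if the force flag was given
    if [ "$file_on_s3" != "yes" ]; then
        # removed from S3 but still registered: store and register the
        # new package, drop the stale metadata blocks
        echo store
    elif [ "$same_hash" = "yes" ]; then
        # same hashes: pass with a warning
        echo skip
    elif [ "$force" = "yes" ]; then
        # old hashes + force: store, register, drop stale blocks
        echo replace
    else
        # old hashes, no force: fail
        echo fail
    fi
}
```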
      
      Part of #4839
  2. Mar 26, 2020
    • test: fix OSX host setup · a41bef3b
      Alexander V. Tikhonov authored
      Fixed OSX host setup for the Tarantool build:
      - set up brew installation following the Homebrew repository
        instructions;
      - use the latest Python 2 commit from a tapped local formula, since
        Python 2 is EOL; also removed the extra pip installation via the
        get-pip script, because the tapped formula installs pip itself.
        python@2 was deleted from homebrew/core in commit 028f11f9e:
          python@2: delete (https://github.com/Homebrew/homebrew-core/issues/49796)
          EOL 1 January 2020.
        The tapped formula was created from the latest formula before its
        removal:
          git -C "$(brew --repo homebrew/core)" show 028f11f9e^:Formula/python@2.rb
      - added an upgrade-packages call to avoid failures on packages that
        are already installed but at a previous version;
      - fixed the gitlab-ci configuration for sudo on the testing hosts and
        removed the pip '--user' option to avoid using per-user paths that
        need special setup.

      Fixed OSX host setup for Tarantool tests:
      - set the maximum processes limit to 2500 for the testing process;
      - new Mac machines are going to be added to CI, and usernames on them
        are long due to internal policies. This makes the home directory
        path long as well, so the path to a unix socket created during
        testing can exceed the UNIX_PATH_MAX=108 constant, as described in
          https://github.com/tarantool/tarantool/issues/4634
        To avoid this, a short working directory is set for testing via:
          --vardir /tmp/tnt
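      The socket-path constraint behind that vardir choice can be expressed
      as a one-line check (an illustrative sketch; UNIX_PATH_MAX=108 comes
      from the issue above, while the helper name is an assumption):

```shell
# A unix socket path must fit into sun_path, which is limited by
# UNIX_PATH_MAX=108 bytes on Linux. A short vardir like /tmp/tnt keeps
# test socket paths well under that limit.
path_fits_unix_socket() {
    [ "${#1}" -lt 108 ]
}
```

      For example, path_fits_unix_socket "/tmp/tnt/001_box/test.sock"
      succeeds, while a socket under a long corporate-policy home
      directory may not.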
  3. Feb 21, 2020
    • gitlab-ci: adjust base URL of RPM/Deb repositories · 4dee6890
      Alexander V. Tikhonov authored
      Our S3 based repositories now reflect packagecloud.io repositories
      structure.
      
      This will allow us to migrate from packagecloud.io without overly
      complicating the redirection rules on the web server serving
      download.tarantool.org.
      
      Deploy source packages (*.src.rpm) into a separate 'SRPM' repository,
      as packagecloud.io does.
      
      Changed the repository signing key from a subkey to the public key
      and moved it into the gitlab-ci environment.
      
      Follows up #3380
  4. Feb 04, 2020
    • gitlab-ci: push Deb/RPM packages to S3 based repos · 05d3ed4b
      Alexander V. Tikhonov authored
      We're going to use S3 compatible storage for Deb and RPM repositories
      instead of packagecloud.io service. The main reason is that
      packagecloud.io provides a limited amount of storage, which is not
      enough for keeping all packages (w/o regular pruning of old versions).
      
      Note: At the moment packages are still pushed to packagecloud.io from
      Travis-CI. Disabling this is out of scope of this patch.
      
      This patch implements saving packages to an S3 compatible storage and
      regenerating the repository metadata.
      
      The layout is a bit different from the one we have on packagecloud.io.
      
      packagecloud.io:
      
       | - 1.10
       | - 2.1
       | - 2.2
       | - ...
      
      S3 compatible storage:
      
       | - live
       |   - 1.10
       |   - 2.1
       |   - 2.2
       |   - ...
       | - release
       |   - 1.10
       |   - 2.1
       |   - 2.2
       |   - ...
      
      Both 'live' and 'release' repositories track the release branches
      (named <major>.<minor>) and the master branch. The difference is that
      'live' is updated on every push, while 'release' only gets tagged
      versions (<major>.<minor>.<patch>.0).
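      That routing can be sketched as a case on the pushed git ref (a
      simplified illustration; the real gitlab-ci rules are expressed
      differently):

```shell
# Map a pushed git ref to the repository it feeds: tagged versions
# <major>.<minor>.<patch>.0 go to 'release', master and <major>.<minor>
# branches go to 'live' on every push, and anything else (e.g. a
# '*-full-ci' branch) is built for testing but pushed nowhere.
repo_for_ref() {
    case "$1" in
        [0-9]*.[0-9]*.[0-9]*.0) echo release ;;
        master|[0-9]*.[0-9]*)   echo live ;;
        *)                      echo none ;;
    esac
}
```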
      
      Packages are also built on '*-full-ci' branches, but only for testing
      purposes: they aren't pushed anywhere.
      
      The core logic is in the tools/update_repo.sh script, which implements
      the following flow:
      
      - create metadata for new packages
      - fetch relevant metadata from the S3 storage
      - push new packages to the S3 storage
      - merge and push the updated metadata to the S3 storage
      
      The script uses 'createrepo' for RPM repositories and 'reprepro' for Deb
      repositories.
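      The four steps can be mocked end to end with local directories
      standing in for the S3 bucket and a plain file list standing in for
      the createrepo/reprepro metadata (everything here — the paths and
      the 'packages.list' format — is an illustrative assumption, not the
      script's real layout):

```shell
set -e
# Local stand-ins for the S3 bucket and the freshly built packages.
s3=$(mktemp -d); new=$(mktemp -d)
touch "$new/tarantool-2.2.2.0.x86_64.rpm"
echo "tarantool-2.1.3.0.x86_64.rpm" > "$s3/packages.list"  # already published

# 1. create metadata for the new packages (createrepo/reprepro in reality)
(cd "$new" && ls *.rpm > packages.list)
# 2. fetch the relevant metadata from the storage ('aws s3 cp' in reality)
cp "$s3/packages.list" "$new/remote.list"
# 3. push the new packages to the storage
cp "$new"/*.rpm "$s3"/
# 4. merge the metadata and push the updated copy back
sort -u "$new/remote.list" "$new/packages.list" > "$s3/packages.list"
```

      After the run, the mock bucket holds both the old and the new
      package entries in its merged metadata.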
      
      Closes #3380