Rather than using params directly in the script, set environment
variables and read them in the Python script with os.getenv.
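A minimal sketch of the pattern, assuming a step that previously interpolated the param into the script body (the param, env var, and step names below are illustrative, not the task's actual ones):

```
steps:
  - name: upload
    image: python:3.8
    env:
      # Wire the Tekton param into the step as an environment variable
      - name: SOME_OPTION
        value: $(params.some-option)
    script: |
      #!/usr/bin/env python3
      import os

      # Read the value from the environment instead of interpolating
      # $(params.some-option) directly into the script body
      some_option = os.getenv("SOME_OPTION", "")
      print(f"some-option={some_option}")
```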
Signed-off-by: Andrea Frittoli <andrea.frittoli@gmail.com>
Currently, in the tests of the upload-pypi task, pypiserver was
pointing to latest, and recent changes made there are breaking the
test here. Pin the version to v1.4.2.
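Roughly the shape of the change, assuming the test deploys pypiserver as a container (the exact image reference is an assumption):

```
containers:
  - name: pypiserver
    # Pin to a fixed release instead of the moving latest tag
    image: pypiserver/pypiserver:v1.4.2
```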
Signed-off-by: vinamra28 <jvinamra776@gmail.com>
This bumps the version of ansible-runner to use the latest
ansible-runner image, which requires community.general to be installed
from Ansible collections.
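If the collection is pulled in via a requirements file, the standard ansible-galaxy format is shown below (whether the task uses a requirements file or a direct install command is an assumption):

```
# requirements.yml, installed with: ansible-galaxy collection install -r requirements.yml
collections:
  - name: community.general
```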
Signed-off-by: vinamra28 <jvinamra776@gmail.com>
The image being used here was removed by the authors from Docker Hub
and moved to quay.io instead. Also, the tag used in this version was
very old and is no longer present on quay.io.
Signed-off-by: vinamra28 <jvinamra776@gmail.com>
The new param introduced in the latest version of the task was meant
to have a default value (as documented). Actually set the default.
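In the task spec this amounts to adding a default to the param, roughly as follows (the param name and value are placeholders, not the task's actual ones):

```
params:
  - name: new-option   # placeholder for the param added in the latest version
    description: Behaviour documented as optional
    type: string
    default: ""        # the documented default, now actually set
```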
Signed-off-by: Andrea Frittoli <andrea.frittoli@gmail.com>
gcs-upload uses rsync to upload folders. When the target contains
many thousands of files, it can take a very long time.
Change the default implementation to use "cp -r", which only copies
new files. Recursive copy supports not replacing existing files, so an
option for that is added to the task.
The option to delete extra remote files is still there, and still
false by default. Deleting extra remote files is not compatible with
"do not replace remote files", so a note about that is added.
Signed-off-by: Andrea Frittoli <andrea.frittoli@gmail.com>
fix: Use single parameter for YAML source
Update pipeline with kubernetes example repository
docs: add description for datree task
docs: add complete steps to datree readme
fix: run as non root and non privileged for datree task
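In the Tekton step that typically translates to a securityContext along these lines (a sketch; the step name is illustrative):

```
steps:
  - name: datree-scan
    securityContext:
      runAsNonRoot: true
      privileged: false
```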
feat: add labels annotations to datree task
feat: datree image as parameter
Use a parameter for the datree image instead of a hardcoded image with
the latest tag.
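Roughly (the param name and default below are assumptions, not the task's actual values):

```
params:
  - name: datreeImage                      # hypothetical param name
    description: Image providing the datree CLI
    default: datree/datree:<pinned-tag>    # pinned tag instead of a hardcoded :latest
steps:
  - name: datree-scan
    image: $(params.datreeImage)
```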
fix: remove trailing spaces for datree task
Since the number of resources in the catalog is increasing, the whole
test suite is taking more time, so increase the parallel runs from 5 to 8.
Signed-off-by: vinamra28 <jvinamra776@gmail.com>
Black task tests are failing with the following issue:
```
Traceback (most recent call last):
  File "/tmp/python/.local/bin/black", line 8, in <module>
    sys.exit(patched_main())
  File "/tmp/python/.local/lib/python3.8/site-packages/black/__init__.py", line 1127, in patched_main
    patch_click()
  File "/tmp/python/.local/lib/python3.8/site-packages/black/__init__.py", line 1113, in patch_click
    from click import _unicodefun  # type: ignore
ImportError: cannot import name '_unicodefun' from 'click' (/tmp/python/.local/lib/python3.8/site-packages/click/__init__.py)
```
Bumping the version of black in resources.yaml fixes the issue.
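For context, click 8.1 removed its private `_unicodefun` module, and black releases from 22.3.0 onwards no longer import it, so any pin at or above that release avoids the crash. A sketch of such a pin (the step layout is illustrative, not the actual resources.yaml contents):

```
steps:
  - name: install-black
    image: python:3.8
    script: |
      #!/usr/bin/env bash
      # Any black release >= 22.3.0 no longer imports click._unicodefun
      pip install --user "black>=22.3.0"
```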
Signed-off-by: vinamra28 <jvinamra776@gmail.com>
The gcs-upload task v0.1, when dealing with folders, uses the "-d"
option by default, which deletes extra remote files. This is a
relatively dangerous default, and it is safer to make it opt-in, as it
may otherwise result in the accidental deletion of many remote files.
V0.2 has a new parameter "deleteExtraFiles" which corresponds to the
gsutil "-d" flag. deleteExtraFiles defaults to false, making the
default behaviour of the task different from v0.1.
V0.2 replaces parameters in the script with environment variables
for security, as recommended in the catalog best practices.
The cloud-sdk image version has been updated to 390.0.0-slim, which
is newer, ~60% smaller than the image currently in use, and includes
gsutil.
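A sketch of the v0.2 surface described above (only `deleteExtraFiles` and the image tag come from this message; the registry path and env var name are assumptions):

```
params:
  - name: deleteExtraFiles
    description: Delete extra files under the destination that are not under the source (gsutil "-d")
    type: string
    default: "false"                # opt-in in v0.2, unlike v0.1
steps:
  - name: upload
    image: gcr.io/google.com/cloudsdktool/cloud-sdk:390.0.0-slim
    env:
      # Params are passed as environment variables rather than interpolated into the script
      - name: DELETE_EXTRA_FILES
        value: $(params.deleteExtraFiles)
```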
Signed-off-by: Andrea Frittoli <andrea.frittoli@gmail.com>
Currently the readme says the default `gcloud-image` uses the `slim`
tag. However, running a `gcloud artifacts` command revealed that gcloud
was pinned to an old version.
So, update the gcloud image to the latest and correctly document that
it is pinned to a specific version.
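The documented default then looks roughly like this (the registry path and exact version are placeholders):

```
params:
  - name: gcloud-image
    description: gcloud CLI image used to run the step
    # pinned to a specific release rather than the floating slim tag
    default: gcr.io/google.com/cloudsdktool/cloud-sdk:<pinned-version>-slim
```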