Compare revisions

Changes are shown as if the source revision was being merged into the target revision.

Target project: seco-ne/yocto/infrastructure/gitlab-ci
Commits on Source (28)
Showing 248 additions and 280 deletions
scripts/__pycache__
.idea/
.idea/
.vscode/
scripts/__pycache__
......@@ -112,36 +112,31 @@ build:check-foo-branch:
- echo "Getting Yocto build artifacts"
- wget -O artifacts.zip ${BUILD_ARTIFACTS}
- unzip artifacts.zip
cache:
- !reference [.buildbase, cache]
# Additionally cache the build artifacts for re-runs of this job in other pipelines
- key: ${CI_JOB_NAME}-${BUILD_ARTIFACTS}
paths: !reference [.buildbase, artifacts, paths]
simulate-build-seco-mx6:
extends:
- .simulate_build
- .buildbase
- .simulate_build
variables:
# We have to specify a tag here instead of getting the artifacts from a master
# branch, because we don't always execute the full pipeline on the master branches.
# In those cases the actual build job does not exist, which results in a 404 error.
BUILD_ARTIFACTS: https://git.seco.com/seco-ne/yocto/manifest/-/jobs/artifacts/kirkstone/3.0/download?job=build-seco-mx6
BUILD_ARTIFACTS: https://git.seco.com/seco-ne/yocto/manifest/-/jobs/artifacts/kirkstone/7.0/download?job=build-seco-mx6
ARTIFACTS_PATH: build-*/tmp/deploy/images/**/*
cache:
- !reference [.buildbase, cache]
# Additionally cache the build artifacts for re-runs of this job in other pipelines
- key: ${CI_JOB_NAME}-${BUILD_ARTIFACTS}
paths: !reference [.buildbase, artifacts, paths]
simulate-buildsdk-seco-mx6:
extends:
- .simulate_build
- .buildbase
- .simulate_build
variables:
BUILD_ARTIFACTS: https://git.seco.com/seco-ne/yocto/manifest/-/jobs/artifacts/kirkstone/3.0/download?job=buildsdk-seco-mx6
BUILD_ARTIFACTS: https://git.seco.com/seco-ne/yocto/manifest/-/jobs/artifacts/kirkstone/7.0/download?job=buildsdk-seco-mx6
ARTIFACTS_PATH: build-*/tmp/deploy/sdk/*
MANUAL_BUILD: "true"
cache:
- !reference [.buildbase, cache]
# Additionally cache the build artifacts for re-runs of this job in other pipelines
- key: ${CI_JOB_NAME}-${BUILD_ARTIFACTS}
paths: !reference [.buildbase, artifacts, paths]
# --------------------------------------------------------------------------------------
# Stage: Test
......@@ -209,7 +204,7 @@ package-sdk-seco-mx6:
extends: .deploy
stage: Deploy SoftwareStore
variables:
# We can't use the RELEASE_NAME variable from package.env in the variables secion,
# We can't use the RELEASE_NAME variable from package.env in the variables section,
# because we're loading the file from the cache. Usually it is downloaded via
# GitLab's standard artifact mechanism, which adds all dotenv variables to the
# pipeline scope. This doesn't work when using the cache, so we have to prevent
......
......@@ -116,7 +116,7 @@ include:
variables:
GITLAB_CI_REVISION: 31e5a427d57e2d3cac971a32d18e6274932adb99
BB_RECIPE_NAME: <recipe name, like emc-test-suite.bb>
BB_RECIPE_NAME: <recipe name, like emc-test-suite, without the '.bb'>
```
The `BB_RECIPE_NAME` needs to be adapted, as it is later used to modify the SRCREV.
......
......@@ -43,11 +43,11 @@ dynamic-child-pipeline feature. [See gitlab docs.][1]
[1]: https://docs.gitlab.com/ee/ci/pipelines/parent_child_pipelines.html#dynamic-child-pipeline-example
There is a *'generate-build-jobs'* job, that creates a yaml file containing the
pipeline with all needed jobs.
There are the following CI variables in the 'generate-build-jobs' job controlling
the content (make sure these are not set in a more global scope, as this
would overwrite the settings in the generated yml):
There is a *'generate-build-pipeline'* job, that creates a yaml file containing
the pipeline with all needed jobs.
There are the following CI variables in the 'generate-build-pipeline' job
controlling the content (make sure these are not set in a more global scope, as
this would overwrite the settings in the generated yml):
* `CI_PARAM_MACHINES`: Space separated list of machines to build for, like "santaro santoka"
* `CI_PARAM_IMAGE`: The name of the image to build. If set to an empty string,
......@@ -59,7 +59,7 @@ would overwrite the settings in the generated yml):
* `CI_PARAM_DISTRO_FNG`: The name of the fngsystem distro to build
It uses a python script called `generate_job_from_template.py` to convert the
`build-jobs.jinja2` to `build-jobs.yml`. This yml file is then used by the
*'trigger-build-jobs'* job, to setup the pipeline described by it.
`build-pipeline.yml.jinja2` to `build-pipeline.yml`. This yml file is then used
by the *'trigger-build-pipeline'* job, to setup the pipeline described by it.
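The template-to-pipeline conversion can be sketched with the standard library (the real script uses Jinja2 and passes the full OS environment; `render_pipeline` and the miniature template text here are illustrative):

```python
from string import Template

def render_pipeline(template_text: str, env: dict) -> str:
    # Substitute ${VAR} placeholders from the given environment;
    # safe_substitute leaves unknown placeholders untouched, which
    # roughly mirrors a lenient template render.
    return Template(template_text).safe_substitute(env)

# Hypothetical miniature of a build-pipeline template.
template = "build-${CI_PARAM_IMAGE}:\n  script:\n    - bitbake ${CI_PARAM_IMAGE}\n"
pipeline_yaml = render_pipeline(template, {"CI_PARAM_IMAGE": "fngsystem-image"})
```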
![Manifest's parent child pipeline](manifest-parent-child.png)
......@@ -18,8 +18,8 @@ variables:
BUILD_TIMEOUT: 2m
# This is the jinja2 template file used to generate the build jobs
BUILD_JOBS_TEMPLATE: build-jobs-ci-test.yml.jinja2
# This is the jinja2 template file used to generate the build pipeline
BUILD_PIPELINE_TEMPLATE: build-pipeline-ci-test.yml.jinja2
# The master branch is hardcoded here, because it cannot be determined automatically.
# Has to be modified for new branches, e.g. new Yocto versions or fix releases.
......@@ -30,6 +30,6 @@ variables:
seco-ne/yocto/infrastructure/ci-test/minimal-bar
seco-ne/yocto/infrastructure/ci-test/minimal-foo
build-jobs:
build-pipeline:
extends:
- .build-jobs
- .build-pipeline
......@@ -18,8 +18,8 @@ variables:
BUILD_TIMEOUT: 1h
# This is the jinja2 template file used to generate the build jobs
BUILD_JOBS_TEMPLATE: build-jobs-yocto.yml.jinja2
# This is the jinja2 template file used to generate the build pipeline
BUILD_PIPELINE_TEMPLATE: build-pipeline-yocto.yml.jinja2
# Projects to include in the changelog in addition to the manifest project
CHANGELOG_PROJECTS:
......@@ -58,9 +58,9 @@ variables:
DEPLOY_FTP_TARGET_LINK: >
http://support.garz-fricke.com/projects/Flash-N-Go/FNGSystem/"$"{RELEASE_NAME}
yocto-build-jobs:
yocto-pipeline:
extends:
- .build-jobs
- .build-pipeline
- .yocto-deploy
variables:
BITBAKE_TASK: build
......@@ -72,9 +72,9 @@ yocto-build-jobs:
TEST_STAGE: "true"
ALPHAPLAN_STAGE: "true"
sdk-build-jobs:
sdk-pipeline:
extends:
- .build-jobs
- .build-pipeline
- .yocto-deploy
variables:
BITBAKE_TASK: populate_sdk
......@@ -84,9 +84,9 @@ sdk-build-jobs:
MANUAL_BUILD: "true"
PACKAGE_TYPE: sdk
fngsystem-build-jobs:
fngsystem-pipeline:
extends:
- .build-jobs
- .build-pipeline
- .fngsystem-deploy
variables:
BITBAKE_TASK: build
......
......@@ -60,33 +60,33 @@ workflow:
# --------------------------------------------------------------------------------------
# Full build pipeline (runs in merge requests, and on master if manually triggered)
# --------------------------------------------------------------------------------------
generate-build-jobs:
generate-build-pipeline:
extends:
- .infrastructure
- .full_build_pipeline
stage: manifest-pipeline
script:
- echo "Generating build jobs from template file '${BUILD_JOBS_TEMPLATE}'"
- echo "Generating build pipeline from template file '${BUILD_PIPELINE_TEMPLATE}'"
# The job generation script implicitly passes the OS environment to the template, so
# that the template has access to all GitLab CI variables. Hence there is no need
# to explicitly pass any of them as command line arguments.
- .gitlab-ci/scripts/generate_job_from_template.py
--template=.gitlab-ci/${BUILD_JOBS_TEMPLATE}
> build-jobs.yml
--template=.gitlab-ci/${BUILD_PIPELINE_TEMPLATE}
> build-pipeline.yml
artifacts:
expire_in: 4 weeks
paths:
- build-jobs.yml
- build-pipeline.yml
.build-jobs:
.build-pipeline:
extends:
- .full_build_pipeline
stage: trigger
needs: ["generate-build-jobs"]
needs: ["generate-build-pipeline"]
trigger:
include:
- artifact: build-jobs.yml
job: generate-build-jobs
- artifact: build-pipeline.yml
job: generate-build-pipeline
strategy: depend
yamllint:
......
......@@ -8,33 +8,18 @@ rule_settings:
disable:
# Code is easier to read without this
- min-max-identity
# Keep explicit lower range limit instead of using implicit default value
- remove-zero-from-range
# FIXME: verify if we want to keep the checks below.
# If not, remove them. If yes, move them above this comment.
- use-fstring-for-formatting
- replace-interpolation-with-fstring
- use-fstring-for-concatenation
- merge-dict-assign
- assign-if-exp
- inline-immediately-returned-variable
- remove-redundant-if
- list-comprehension
- switch
- dict-literal
- low-code-quality
- dict-comprehension
- for-append-to-extend
- raise-specific-error
- simplify-len-comparison
- de-morgan
- merge-nested-ifs
- remove-dict-keys
- remove-zero-from-range
- use-named-expression
- use-next
- sum-comprehension
- use-join
- merge-comparisons
- merge-else-if-into-elif
rule_types:
- refactoring
......
......@@ -32,128 +32,118 @@ class ApSubKeys(Flag):
def get_ap_dict(machine, machine_ap, release_name_local):
"""Return a customized dict with information for AlphaPlan FWR articles"""
ap_dict = {}
ap_dict[ApKeys.YOCTO_PKG_PY] = {
ApSubKeys.MATCH: "pkg.py",
ApSubKeys.MATCHCODE: "FNGUpdate",
ApSubKeys.BEZEICHNUNG: "{} Flash-N-Go Update general pkg.py update "
"script for nonverbose fng-install.sh".format(machine_ap),
ApSubKeys.LANGTEXT: "To be used with packages that contain an "
"fng-install.sh.\n"
"* with --nonverbose mode (new output)\n"
"* Able to do local installation with unset TFTP variable\n"
"* Handle --Paramfile",
ApSubKeys.TYP: "FNGUpdate",
ApSubKeys.ATTRIBUTESET: "Firmware, Bestandteil eines SW-Paketes",
}
ap_dict[ApKeys.YOCTO_FNG_INSTALL] = {
ApSubKeys.MATCH: "fng-install.sh",
ApSubKeys.MATCHCODE: "InstallScript",
ApSubKeys.BEZEICHNUNG: "{} {} Install Script".format(
machine_ap, release_name_local
),
ApSubKeys.LANGTEXT: "",
ApSubKeys.TYP: "US",
ApSubKeys.ATTRIBUTESET: "Firmware, Bestandteil eines SW-Paketes",
}
ap_dict[ApKeys.YOCTO_FS] = {
ApSubKeys.MATCH: "{}.tar.gz".format(machine),
ApSubKeys.MATCHCODE: "OS-Filesystem",
ApSubKeys.BEZEICHNUNG: "{} {} Filesystem".format(
machine_ap, release_name_local
),
ApSubKeys.LANGTEXT: "",
ApSubKeys.TYP: "FS",
ApSubKeys.ATTRIBUTESET: "Firmware, Bestandteil eines SW-Paketes",
}
ap_dict[ApKeys.FNGSYS_UPDATE] = {
ApSubKeys.MATCH: "fngsystem-self-update.sh",
ApSubKeys.MATCHCODE: "TFTP",
ApSubKeys.BEZEICHNUNG: "{} {} Self Update".format(
machine_ap, release_name_local
),
ApSubKeys.LANGTEXT: "",
ApSubKeys.TYP: "TFTP",
ApSubKeys.ATTRIBUTESET: "Firmware, Bestandteil eines SW-Paketes",
}
ap_dict[ApKeys.FNGSYS_INIT] = {
ApSubKeys.MATCH: "fngsystem-self-init.sh",
ApSubKeys.MATCHCODE: "InstallScript",
ApSubKeys.BEZEICHNUNG: "{} {} Init Script".format(
machine_ap, release_name_local
),
ApSubKeys.LANGTEXT: "",
ApSubKeys.TYP: "Updateskript",
ApSubKeys.ATTRIBUTESET: "Firmware, Bestandteil eines SW-Paketes",
}
ap_dict[ApKeys.FNGSYS_FS] = {
ApSubKeys.MATCH: "{}.tgz".format(machine),
ApSubKeys.MATCHCODE: "FS",
ApSubKeys.BEZEICHNUNG: "{} {} Filesystem".format(
machine_ap, release_name_local
),
ApSubKeys.LANGTEXT: "",
ApSubKeys.TYP: "FS",
ApSubKeys.ATTRIBUTESET: "Firmware, Bestandteil eines SW-Paketes",
return {
ApKeys.YOCTO_PKG_PY: {
ApSubKeys.MATCH: "pkg.py",
ApSubKeys.MATCHCODE: "FNGUpdate",
ApSubKeys.BEZEICHNUNG: "{} Flash-N-Go Update general pkg.py update "
"script for nonverbose fng-install.sh".format(machine_ap),
ApSubKeys.LANGTEXT: "To be used with packages that contain an "
"fng-install.sh.\n"
"* with --nonverbose mode (new output)\n"
"* Able to do local installation with unset TFTP variable\n"
"* Handle --Paramfile",
ApSubKeys.TYP: "FNGUpdate",
ApSubKeys.ATTRIBUTESET: "Firmware, Bestandteil eines SW-Paketes",
},
ApKeys.YOCTO_FNG_INSTALL: {
ApSubKeys.MATCH: "fng-install.sh",
ApSubKeys.MATCHCODE: "InstallScript",
ApSubKeys.BEZEICHNUNG: "{} {} Install Script".format(
machine_ap, release_name_local
),
ApSubKeys.LANGTEXT: "",
ApSubKeys.TYP: "US",
ApSubKeys.ATTRIBUTESET: "Firmware, Bestandteil eines SW-Paketes",
},
ApKeys.YOCTO_FS: {
ApSubKeys.MATCH: "{}.tar.gz".format(machine),
ApSubKeys.MATCHCODE: "OS-Filesystem",
ApSubKeys.BEZEICHNUNG: "{} {} Filesystem".format(
machine_ap, release_name_local
),
ApSubKeys.LANGTEXT: "",
ApSubKeys.TYP: "FS",
ApSubKeys.ATTRIBUTESET: "Firmware, Bestandteil eines SW-Paketes",
},
ApKeys.FNGSYS_UPDATE: {
ApSubKeys.MATCH: "fngsystem-self-update.sh",
ApSubKeys.MATCHCODE: "TFTP",
ApSubKeys.BEZEICHNUNG: "{} {} Self Update".format(
machine_ap, release_name_local
),
ApSubKeys.LANGTEXT: "",
ApSubKeys.TYP: "TFTP",
ApSubKeys.ATTRIBUTESET: "Firmware, Bestandteil eines SW-Paketes",
},
ApKeys.FNGSYS_INIT: {
ApSubKeys.MATCH: "fngsystem-self-init.sh",
ApSubKeys.MATCHCODE: "InstallScript",
ApSubKeys.BEZEICHNUNG: "{} {} Init Script".format(
machine_ap, release_name_local
),
ApSubKeys.LANGTEXT: "",
ApSubKeys.TYP: "Updateskript",
ApSubKeys.ATTRIBUTESET: "Firmware, Bestandteil eines SW-Paketes",
},
ApKeys.FNGSYS_FS: {
ApSubKeys.MATCH: "{}.tgz".format(machine),
ApSubKeys.MATCHCODE: "FS",
ApSubKeys.BEZEICHNUNG: "{} {} Filesystem".format(
machine_ap, release_name_local
),
ApSubKeys.LANGTEXT: "",
ApSubKeys.TYP: "FS",
ApSubKeys.ATTRIBUTESET: "Firmware, Bestandteil eines SW-Paketes",
},
ApKeys.FNGSYS_CHECKSUM: {
ApSubKeys.MATCH: "{}.md5".format(machine),
ApSubKeys.MATCHCODE: "TFTP",
ApSubKeys.BEZEICHNUNG: "{} {} Checksum".format(
machine_ap, release_name_local
),
ApSubKeys.LANGTEXT: "",
ApSubKeys.TYP: "TFTP",
ApSubKeys.ATTRIBUTESET: "Firmware, Bestandteil eines SW-Paketes",
},
ApKeys.FNGSYS_UBOOT_UPDATE: {
ApSubKeys.MATCH: "fng-install-uboot.sh",
ApSubKeys.MATCHCODE: "US",
ApSubKeys.BEZEICHNUNG: "{} U-Boot {} Update script".format(
machine_ap, release_name_local
),
ApSubKeys.LANGTEXT: "",
ApSubKeys.TYP: "Updateskript",
ApSubKeys.ATTRIBUTESET: "Firmware, Bestandteil eines SW-Paketes",
},
ApKeys.FNGSYS_UBOOT_IMAGE: {
ApSubKeys.MATCH: "imx-boot",
ApSubKeys.MATCHCODE: "FS",
ApSubKeys.BEZEICHNUNG: "{} U-Boot {} Bootloader Image".format(
machine_ap, release_name_local
),
ApSubKeys.LANGTEXT: "",
ApSubKeys.TYP: "FS",
ApSubKeys.ATTRIBUTESET: "Firmware, Bestandteil eines SW-Paketes",
},
ApKeys.FNGSYS_UBOOT_IMAGETAR: {
ApSubKeys.MATCH: "imx-boot.tar.gz",
ApSubKeys.MATCHCODE: "FS",
ApSubKeys.BEZEICHNUNG: "{} U-Boot {} Bootloader Image".format(
machine_ap, release_name_local
),
ApSubKeys.LANGTEXT: "",
ApSubKeys.TYP: "FS",
ApSubKeys.ATTRIBUTESET: "Firmware, Bestandteil eines SW-Paketes",
},
ApKeys.FNGSYS_UBOOT_CHECKSUM: {
ApSubKeys.MATCH: "imx-boot.md5",
ApSubKeys.MATCHCODE: "TFTP",
ApSubKeys.BEZEICHNUNG: "{} U-Boot {} Checksum".format(
machine_ap, release_name_local
),
ApSubKeys.LANGTEXT: "",
ApSubKeys.TYP: "TFTP",
ApSubKeys.ATTRIBUTESET: "Firmware, Bestandteil eines SW-Paketes",
},
}
ap_dict[ApKeys.FNGSYS_CHECKSUM] = {
ApSubKeys.MATCH: "{}.md5".format(machine),
ApSubKeys.MATCHCODE: "TFTP",
ApSubKeys.BEZEICHNUNG: "{} {} Checksum".format(machine_ap, release_name_local),
ApSubKeys.LANGTEXT: "",
ApSubKeys.TYP: "TFTP",
ApSubKeys.ATTRIBUTESET: "Firmware, Bestandteil eines SW-Paketes",
}
ap_dict[ApKeys.FNGSYS_UBOOT_UPDATE] = {
ApSubKeys.MATCH: "fng-install-uboot.sh",
ApSubKeys.MATCHCODE: "US",
ApSubKeys.BEZEICHNUNG: "{} U-Boot {} Update script".format(
machine_ap, release_name_local
),
ApSubKeys.LANGTEXT: "",
ApSubKeys.TYP: "Updateskript",
ApSubKeys.ATTRIBUTESET: "Firmware, Bestandteil eines SW-Paketes",
}
ap_dict[ApKeys.FNGSYS_UBOOT_IMAGE] = {
ApSubKeys.MATCH: "imx-boot",
ApSubKeys.MATCHCODE: "FS",
ApSubKeys.BEZEICHNUNG: "{} U-Boot {} Bootloader Image".format(
machine_ap, release_name_local
),
ApSubKeys.LANGTEXT: "",
ApSubKeys.TYP: "FS",
ApSubKeys.ATTRIBUTESET: "Firmware, Bestandteil eines SW-Paketes",
}
ap_dict[ApKeys.FNGSYS_UBOOT_IMAGETAR] = {
ApSubKeys.MATCH: "imx-boot.tar.gz",
ApSubKeys.MATCHCODE: "FS",
ApSubKeys.BEZEICHNUNG: "{} U-Boot {} Bootloader Image".format(
machine_ap, release_name_local
),
ApSubKeys.LANGTEXT: "",
ApSubKeys.TYP: "FS",
ApSubKeys.ATTRIBUTESET: "Firmware, Bestandteil eines SW-Paketes",
}
ap_dict[ApKeys.FNGSYS_UBOOT_CHECKSUM] = {
ApSubKeys.MATCH: "imx-boot.md5",
ApSubKeys.MATCHCODE: "TFTP",
ApSubKeys.BEZEICHNUNG: "{} U-Boot {} Checksum".format(
machine_ap, release_name_local
),
ApSubKeys.LANGTEXT: "",
ApSubKeys.TYP: "TFTP",
ApSubKeys.ATTRIBUTESET: "Firmware, Bestandteil eines SW-Paketes",
}
return ap_dict
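The refactor returns one dict literal instead of filling `ap_dict` key by key; the pattern in miniature (entry names here are illustrative, not from the script):

```python
def get_entry_incremental(machine: str) -> dict:
    # Before: build the mapping key by key.
    d = {}
    d["match"] = f"{machine}.tar.gz"
    d["matchcode"] = "OS-Filesystem"
    return d

def get_entry_literal(machine: str) -> dict:
    # After: return a single dict literal, which reads as one unit
    # and cannot be left half-initialized by an early return.
    return {
        "match": f"{machine}.tar.gz",
        "matchcode": "OS-Filesystem",
    }
```

Both forms produce identical mappings; the literal just states the whole structure at once.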
......@@ -94,9 +94,8 @@ def main(args):
for job in job_it:
if options.filter_status is not None and job.status != options.filter_status:
continue
if options.filter_tag is not None:
if not options.filter_tag in job.tag_list:
continue
if options.filter_tag is not None and options.filter_tag not in job.tag_list:
continue
log = bytes.decode(job.trace())
if options.pattern in log:
logging.debug(
......
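The merged condition keeps the job filter flat; the same logic in a self-contained sketch (the `Job` class is a hypothetical stand-in for a GitLab job object):

```python
from dataclasses import dataclass, field

@dataclass
class Job:
    # Stand-in with the two attributes the filter inspects.
    status: str
    tag_list: list = field(default_factory=list)

def job_matches(job, filter_status=None, filter_tag=None):
    # Each active filter rejects independently; folding the nested
    # tag check into one condition avoids a second indentation level.
    if filter_status is not None and job.status != filter_status:
        return False
    if filter_tag is not None and filter_tag not in job.tag_list:
        return False
    return True

jobs = [Job("success", ["docker"]), Job("failed", ["docker"]), Job("success", [])]
kept = [j for j in jobs if job_matches(j, filter_status="success", filter_tag="docker")]
```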
......@@ -35,8 +35,7 @@ verbose = 0
def decode_timestamp(t):
timestamp = datetime.datetime.strptime(t, GITLAB_TIMEFORMAT)
return timestamp
return datetime.datetime.strptime(t, GITLAB_TIMEFORMAT)
class Project:
......@@ -52,9 +51,7 @@ class Project:
)
def __eq__(self, p):
if not p:
return False
return self.project.id == p.project.id
return self.project.id == p.project.id if p else False
class Tag:
......@@ -168,9 +165,7 @@ class Release:
def description(self):
m = self.tag.message
if not m:
return ""
return m
return m or ""
def __str__(self):
return self.tag.name
......@@ -188,8 +183,7 @@ class MergeRequest:
return self.mr.title
def withlink(self):
out = self.mr.title + " [" + self.mr.reference + "](" + self.mr.web_url + ")"
return out
return self.mr.title + " [" + self.mr.reference + "](" + self.mr.web_url + ")"
def main(args):
......@@ -240,14 +234,10 @@ def main(args):
logging.debug(options)
gitlab = gl.Gitlab(options.gitlab_url, private_token=options.token)
projects = []
for project in options.project:
projects.append(Project(gitlab.projects.get(project)))
releases = []
for t in projects[0].project.tags.list(search=options.branch):
releases.append(Release(Tag(t)))
projects = [Project(gitlab.projects.get(project)) for project in options.project]
releases = [
Release(Tag(t)) for t in projects[0].project.tags.list(search=options.branch)
]
# Add dummy release with date today for new unstaged commits
releases.append(
Release(
......
......@@ -27,9 +27,7 @@ def check_value_length(filename: str, data: OrderedDict, keys: list[str], limit:
exceeded = False
for key, value in data.items():
if key in keys:
count = 0
for item in value:
count += len(item)
count = sum(len(item) for item in value)
if count > limit:
exceeded = True
print(
......
......@@ -153,10 +153,10 @@ def wait_until_merge_status_is_set(project: Project, mr: MergeRequest):
def list_commits(commits):
"""Create a list of commits along with the commit messages"""
commit_list = ""
for commit in commits:
commit_list += "\n--\n\nCommit: %s\n\n%s" % (commit.web_url, commit.message)
return commit_list
return "".join(
"\n--\n\nCommit: %s\n\n%s" % (commit.web_url, commit.message)
for commit in commits
)
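The rewritten `list_commits` builds the text with `str.join` over a generator instead of repeated `+=`; in isolation (the commit objects here are stand-ins):

```python
from types import SimpleNamespace

# Hypothetical commit objects with the two attributes the helper uses.
commits = [
    SimpleNamespace(web_url="https://git.example.com/c/1", message="Fix build"),
    SimpleNamespace(web_url="https://git.example.com/c/2", message="Bump version"),
]

def list_commits(commits):
    # str.join allocates the result once, instead of creating a new
    # intermediate string on every `+=` iteration.
    return "".join(
        "\n--\n\nCommit: %s\n\n%s" % (c.web_url, c.message) for c in commits
    )

commit_list = list_commits(commits)
```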
def commit_and_push(
......@@ -278,7 +278,7 @@ def get_repository_file_obj(project: Project, filename, ref=None):
# logging.debug(repository_tree)
fileobj = [f for f in repository_tree if f["name"] == filename]
if len(fileobj) == 0:
if not fileobj:
logging.error("Could not find file %s", filename)
for f in repository_tree:
logging.debug(f["name"])
......
......@@ -183,8 +183,7 @@ def convertmd2html(infile, outfile):
fout = codecs.open(outfile, "w", encoding)
fout.write(HEADER)
extras = {}
extras["tables"] = ""
extras = {"tables": ""}
html = markdown_path(infile, extras=extras)
fout.write(html)
......
......@@ -87,9 +87,8 @@ def main(args):
continue
if options.filter_status is not None and job.status != options.filter_status:
continue
if options.filter_tag is not None:
if not options.filter_tag in job.tag_list:
continue
if options.filter_tag is not None and options.filter_tag not in job.tag_list:
continue
job.delete_artifacts()
logging.debug(
"Deleted artifacts for %s: %s %s.",
......
......@@ -137,28 +137,34 @@ def main(args):
logging.debug(options)
gitlab = gl.Gitlab(options.gitlab_url, private_token=options.token)
if options.path is None:
if options.destination is None:
destination = tempfile.mkstemp()
try:
if options.path is None:
if options.destination is None:
destination = tempfile.NamedTemporaryFile().name
else:
destination = options.destination
print(destination)
filename = download_job_artifacts(
gitlab, destination, options.job, options.project, extract=True
)
print("Downloaded artifacts for job {} to {}".format(options.job, filename))
else:
destination = options.destination
filename = download_job_artifacts(
gitlab, destination, options.job, options.project, extract=True
)
print("Downloaded artifacts for job {} to {}".format(options.job, filename))
else:
if options.destination is None:
destination = tempfile.mkdtemp()
else:
destination = options.destination
filename = download_job_artifact(
gitlab, destination, options.path, options.job, options.project
)
print(
"Downloaded {} for job {} to {}".format(options.path, options.job, filename)
)
if options.destination is None:
destination = tempfile.TemporaryDirectory().name
else:
destination = options.destination
filename = download_job_artifact(
gitlab, destination, options.path, options.job, options.project
)
print(
"Downloaded {} for job {} to {}".format(
options.path, options.job, filename
)
)
except gl.exceptions.GitlabGetError as e:
exit("ERROR: %s" % e)
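`tempfile.mkdtemp` and `tempfile.TemporaryDirectory`, both used in this hunk, differ in cleanup semantics; a stdlib-only sketch of the difference:

```python
import os
import tempfile

# mkdtemp creates a directory and leaves deletion to the caller.
kept_dir = tempfile.mkdtemp()
assert os.path.isdir(kept_dir)
os.rmdir(kept_dir)  # explicit, caller-managed cleanup

# TemporaryDirectory deletes the directory when it is cleaned up
# (or finalized), so the path is only reliable while a reference
# to the object is held.
tmp = tempfile.TemporaryDirectory()
transient_dir = tmp.name
assert os.path.isdir(transient_dir)
tmp.cleanup()
```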
if __name__ == "__main__":
......
......@@ -27,11 +27,11 @@ def new_ap_article(
stueckliste=None,
):
"""Creates a dict/list structure for a new AlphaPlan article"""
position = {}
for idx in range(len(attribute)):
if attribute[idx]:
position["Attribut{:02d}".format(idx + 1)] = attribute[idx]
position = {
"Attribut{:02d}".format(idx + 1): attribute[idx]
for idx in range(len(attribute))
if attribute[idx]
}
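The new dict comprehension pairs indices and values via `range(len(...))`; `enumerate` expresses the same mapping (a sketch with an illustrative helper name):

```python
def build_positions(attribute):
    # Map each non-empty attribute to its 1-based "AttributNN" key,
    # skipping empty slots, exactly as the comprehension above does;
    # enumerate pairs index and value without indexing back into
    # the list.
    return {
        "Attribut{:02d}".format(idx + 1): value
        for idx, value in enumerate(attribute)
        if value
    }

positions = build_positions([None, None, "UBOOT"])
```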
data = {
"Artikel": {
"ID": ap_id_generator(),
......@@ -160,19 +160,17 @@ def generate_fwr_articles(
]
if subarticles_uboot:
stueckliste_uboot = []
for key in subarticles_uboot:
stueckliste_uboot.append(
generate_ap_subarticle(
files,
key,
machine,
machine_ap,
release_name_ap,
md5sums,
)
stueckliste_uboot = [
generate_ap_subarticle(
files,
key,
machine,
machine_ap,
release_name_ap,
md5sums,
)
for key in subarticles_uboot
]
# At the moment there are no attributes specified for uboot FWR
attribute_uboot = [None, None, "UBOOT"]
data_uboot = new_ap_article(
......
......@@ -31,41 +31,46 @@ def generate_metadata(
elif filename.endswith(machine + ".wic"):
image_wic = filename
metadata = dict()
metadata["files"] = []
metadata["version"] = version
metadata["machine"] = machine
metadata["date"] = datetime.now().strftime("%Y-%m-%d")
metadata = {
"files": [],
"version": version,
"machine": machine,
"date": datetime.now().strftime("%Y-%m-%d"),
}
if install_script is not None:
new_file = dict()
new_file["name"] = "Install Script"
new_file["path"] = install_script
new_file = {
"name": "Install Script",
"path": install_script,
}
metadata["files"].append(new_file)
if image_general is not None:
new_file = dict()
new_file["name"] = "Image"
new_file["path"] = image_general
new_file = {
"name": "Image",
"path": image_general,
}
metadata["files"].append(new_file)
if image_wic is not None:
new_file = dict()
new_file["name"] = "SD-Card Image (WIC)"
new_file["path"] = image_wic
new_file = {
"name": "SD-Card Image (WIC)",
"path": image_wic,
}
metadata["files"].append(new_file)
if sdk is not None:
new_file = dict()
new_file["name"] = "SDK"
new_file["path"] = "sdk/" + sdk + ".sh"
new_file = {
"name": "SDK",
"path": "sdk/" + sdk + ".sh",
}
metadata["files"].append(new_file)
if licenses is not None:
new_file = dict()
new_file["name"] = "Licenses"
new_file["path"] = licenses
new_file = {
"name": "Licenses",
"path": licenses,
}
metadata["files"].append(new_file)
with open(output_file, "w", encoding="utf-8") as file:
......
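The metadata hunk repeats the same optional-append pattern once per file; it could be condensed further into one pass over candidates (a sketch; `collect_metadata_files` and its arguments are illustrative, not part of the script):

```python
def collect_metadata_files(install_script=None, sdk=None, licenses=None):
    # The per-file if-blocks reduce to one pass over (name, path)
    # candidates, keeping only the entries that are present.
    candidates = [
        ("Install Script", install_script),
        ("SDK", "sdk/" + sdk + ".sh" if sdk else None),
        ("Licenses", licenses),
    ]
    return [{"name": n, "path": p} for n, p in candidates if p is not None]

files = collect_metadata_files(install_script="fng-install.sh", sdk="toolchain")
```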
......@@ -41,6 +41,9 @@ def get_integration_sources(manifest_project: str, manifest_branch: str, group:
"branch": source_branch,
}
)
# Skip a sourcery suggestion because the way it is written is easier to
# understand than the suggested change.
# sourcery skip: switch
except GitlabGetError as e:
if e.response_code == 404: # not found
pass
......