Here is the documentation: https://geoplateforme.pages.gpf-tech.ign.fr/documentation

@@ -19,6 +19,7 @@ include:
- "/ci/pre-commit-v1.yml"
- "/ci/sonarqube.yml"
- "/ci/package-py.yml"
- "/ci/versioning.yml"
# Change pip's cache directory to be inside the project directory since we can
# only cache local items.
@@ -109,7 +110,7 @@ build:documentation:credits:
- main
- tags
before_script:
- python -m pip install -U pip setuptools wheel
- python -m pip install -U pip "setuptools>=79,<80" wheel
- python -m pip install -U "pip-licenses>=3.5,<5"
- python -m pip install -U "pipdeptree>=2.7,<3"
- python -m pip install -U -r requirements.txt
# CHANGELOG
The format is based on [Keep a Changelog](https://keepachangelog.com/), and this project adheres to [Semantic Versioning](https://semver.org/).
<!--
Unreleased
## {version_tag} - YYYY-MM-DD
### Added
### Changed
### Removed
-->
## [Unreleased]
### Changed
- IGNGPF-4998 : use toolbelt 1.15.5 method to list MD5 files to check + better logs
## [2.7.0] - 2025-06-02
### Changed
- IGNGPF-4708 : limit info logs
## [2.6.2] - 2024-08-28
### Changed
- IGNGPF-3088 : update some logs
### Fixed
- fix setuptools version for documentation
## [2.6.1] - 2024-05-29
### Changed
- Adapt some logs to fix IGNGPF-3088
## [2.6.0] - 2024-01-19
### Changed
- Use Toolbelt 1.8.1, uploading directly to FILESYSTEM
- Specific handling to deal with both S3 and FILESYSTEM uploads
- Adapt some logs (French + user/admin)
## [2.5.1] - 2023-12-21
### Added
- IGNGPF-3076: Log each line of an md5 file
## [2.5.0] - 2023-12-18
### Added
- Use Toolbelt 1.7.1 for filesystem storage
## [2.4.2] - 2023-12-07
### Changed
- Use Toolbelt 1.7.0, uploading directly to FILESYSTEM
- Specific handling to deal with both S3 and FILESYSTEM uploads
- Adapt some logs (French + user/admin)
## [2.4.1] - 2023-07-26
### Changed
- Use Toolbelt 0.19.6 for OpenIO fix
## [2.4.0] - 2023-07-06
### Changed
- Mainly tooling upgrade
## [2.3.0] - 2023-06-27
### Changed
- increase verbosity for end-users
- bump dependencies and dev tooling
- modernize CI
## [2.2.0] - 2023-05-12
### Changed
- Update toolbelt
## [2.1.0] - 2023-04-03
### Added
- S3 parameters in CLI
### Changed
- Files are downloadable from S3 bucket
## [2.0.1] - 2023-03-30
### Changed
- fix uppercase letters in the MD5 hash
## [2.0.0] - 2023-03-20
### Added
- Handle v2 input/output
## [1.0.1] - 2023-03-30
### Changed
- bugfix: uppercase letters in the MD5 hash
## [1.0.0] - 2023-03-17
### Added
- Release 1
## [0.6.2] - 2023-03-17
### Added
- Fix tag release 1
## [0.6.1] - 2023-03-17
### Added
- Wrong version, do not use
## [0.6.0] - 2022-12-20
### Added
- Rename main folder into package name to avoid conflict
- Use gpf-entrepot-toolbelt as main dependency
- Rework packaging and CI tasks
- Rework of documentation
## [0.5.0] - 2022-12-16
### Added
- Fix Status returned
- CI: add artifact for test report and coverage
- Replace (TECHNICAL) ERROR by TECHNICAL_ERROR
- Fix typo
## [0.4.0] - 2022-12-06
### Added
- Make package much more generic
- Load input parameters.json
- Generate an output.json file at the end of execution
- Test coverage to 83%
- CI: add Sonarqube configuration `sonar-project.properties`
## [0.3.1] - 2022-11-23
### Added
- CI: Sonarqube template requires tests to also run on tag refs
## [0.3.0] - 2022-11-22
### Added
- CI: enable Sonarqube analysis
- CI: enable dependency track
- CD: package as Docker image and publish to container registry
- documentation: small improvements
## [0.2.0] - 2022-11-18
### Added
- Use argparse to expose a robust CLI
- Make the CLI installable through pip
- Refactor tests and documentation
## [0.1.3] - 2022-11-16
### Added
- Documentation: add how to publish and install
- Fix: CD job was not using the .pypirc file
## [0.1.2] - 2022-11-16
### Added
- Fix: repository URL was not updated in .pypirc
## [0.1.1] - 2022-11-16
### Added
- Fix: bad repository URL in job release in CI/CD
## [0.1.0] - 2022-11-16
### Added
- First functional version
- Unit tests and coverage
- Packaging with pip
- Documentation
- Tooling : formatter, guidelines, git-hooks, linter...
[Unreleased]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/compare/2.6.2...main
[2.6.2]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/compare/2.6.1...2.6.2
[2.6.1]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/compare/2.6.0...2.6.1
[2.6.0]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/compare/2.5.1...2.6.0
[2.5.1]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/compare/2.5.0...2.5.1
[2.5.0]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/compare/2.4.2...2.5.0
[2.4.2]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/compare/2.4.1...2.4.2
[2.4.1]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/compare/2.4.0...2.4.1
[2.4.0]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/compare/2.3.0...2.4.0
[2.3.0]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/compare/2.2.0...2.3.0
[2.2.0]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/compare/2.1.0...2.2.0
[2.1.0]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/compare/2.0.1...2.1.0
[2.0.1]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/compare/2.0.0...2.0.1
[2.0.0]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/compare/1.0.1...2.0.0
[1.0.1]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/compare/1.0.0...1.0.1
[1.0.0]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/compare/0.6.2...1.0.0
[0.6.2]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/compare/0.6.1...0.6.2
[0.6.1]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/compare/0.6.0...0.6.1
[0.6.0]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/compare/0.5.0...0.6.0
[0.5.0]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/compare/0.4.0...0.5.0
[0.4.0]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/compare/0.3.1...0.4.0
[0.3.1]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/compare/0.3.0...0.3.1
[0.3.0]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/compare/0.2.0...0.3.0
[0.2.0]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/compare/0.1.3...0.2.0
[0.1.3]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/compare/0.1.2...0.1.3
[0.1.2]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/compare/0.1.1...0.1.2
[0.1.1]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/compare/0.1.0...0.1.1
[0.1.0]: https://gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5/releases/tag/0.1.0
@@ -6,82 +6,27 @@
- Network access to:
  - the GitLab instance: <https://gitlab.gpf-tech.ign.fr>
  - the official Python package index: <https://pypi.org/>
- a personal access token (_Personal Access Token (PAT)_) with the `read_api` scope, or at least `read_registry`. See the [Authentication](/usage/authentication) page.
----
## Clone the repository
Example for Oslandia with the user `geojulien`:
```sh
git clone --config 'credential.helper=store' https://geojulien@gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5.git
```
----
### Behind the proxy
Since access to the Usine Logicielle is restricted by an IP filter, people without a fixed IP go through a proxy pointing to a jump server whose fixed IP is allowed.
Example with a SOCKS proxy:
```sh
git clone --config http.proxy='socks5://127.0.0.1:8645' --config 'credential.helper=store' https://geojulien@gitlab.gpf-tech.ign.fr/geoplateforme/scripts-verification/check-md5.git
```
----
## Installation
### Virtual environment
Working in a virtual environment is recommended to ensure the best reproducibility.
Example on an Ubuntu LTS-type Linux distribution:
```sh
# create the virtual environment
python3 -m venv .venv
# activate the virtual environment
source .venv/bin/activate
# update pip and the base packages inside the virtual environment
python -m pip install -U pip setuptools wheel
```
Example on Windows 10+ with PowerShell (remember to [adjust the script execution policy](https://static.geotribu.fr/articles/2020/2020-06-19_setup_python/#autoriser-lutilisation-des-environnements-virtuels) first):
```powershell
# create the virtual environment
py -3 -m venv .venv
# activate the virtual environment
.venv/Scripts/Activate
# update pip and the base packages inside the virtual environment
python -m pip install -U pip setuptools wheel
```
### Quick start
Make sure the virtual environment is activated before installing. Example on an Ubuntu LTS-type Linux distribution accessing the Usine Logicielle without a proxy. Install the dependencies:
```sh
# update pip
python -m pip install -U pip setuptools wheel
# install the base dependencies
python -m pip install -U -r requirements.txt
# install the project in development mode
python -m pip install -e .
```
### Behind a proxy
For SOCKS proxies, first install PySocks in the same environment as everything else:
```sh
python -m pip install -U "PySocks<2"
```
Then install, in turn, the dependencies from <https://pypi.org> followed by those from the Usine Logicielle registry:
```sh
python -m pip install -U -r requirements/base-pypi.txt
python -m pip install -U -r requirements/base-gpf.txt --proxy socks5://localhost:8645 --index-url https://gitlab.gpf-tech.ign.fr/api/v4/groups/55/-/packages/pypi/simple
python -m pip install -r requirements.txt
# pre-commit
pip install pre-commit pre-commit-hooks
pre-commit install
```
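Once installed in editable mode, a quick way to confirm the package is importable is to read back its own metadata (a minimal sketch; it only relies on the `__version__` and `__title_clean__` attributes that `gpf_check_md5.__about__` exposes in the diff below):
```python
# Minimal post-install sanity check (sketch): import the package metadata
# exposed by gpf_check_md5.__about__ and print it.
from gpf_check_md5.__about__ import __title_clean__, __version__

print(f"{__title_clean__} {__version__} installed and importable")
```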
@@ -40,7 +40,7 @@ __uri_repository__ = (
__uri_tracker__ = f"{__uri_repository__}issues/"
__uri__ = __uri_repository__
__version__ = "2.6.2"
__version__ = "2.7.0"
__version_info__ = tuple(
[
int(num) if num.isdigit() else num
@@ -8,6 +8,7 @@ from os import environ, getenv
from pathlib import Path
# 3rd party
from gpf_entrepot_toolbelt.__about__ import __package_name__ as toolbelt_pkg_name
from gpf_entrepot_toolbelt.orchestrator.check_livraison import check_livraison_structure
from gpf_entrepot_toolbelt.orchestrator.models import GpfOrchestratorParameters
from gpf_entrepot_toolbelt.orchestrator.status import Status
@@ -20,6 +21,7 @@ from gpf_check_md5.__about__ import (
__author__,
__cli_usage__,
__executable_name__,
__package_name__,
__summary__,
__title__,
__title_clean__,
@@ -170,7 +172,8 @@ def main(argv: list[str] = None):
else:
args.verbosity = 0
logger = gpf_logger_script(args.verbosity, __title_clean__)
_ = gpf_logger_script(verbosity=args.verbosity, title=toolbelt_pkg_name)
logger = gpf_logger_script(args.verbosity, __package_name__)
# Définition des variables d'environnement
# Pour la toolbelt
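The `main()` change above sets up two named loggers through the toolbelt helper: one for `gpf_entrepot_toolbelt` itself and one for the `gpf_check_md5` package, so messages from both are emitted at the requested verbosity. The internals of `gpf_logger_script()` are not shown in this MR; as a rough stdlib-only illustration of the idea (the logger names and the verbosity-to-level mapping are assumptions):
```python
import logging


def configure_loggers(verbosity: int) -> logging.Logger:
    """Sketch: configure one logger per package, mirroring the two
    gpf_logger_script() calls in the diff above."""
    # 0 -> WARNING, 1 -> INFO, 2+ -> DEBUG (assumed mapping)
    level = logging.WARNING - min(verbosity, 2) * 10
    for name in ("gpf_entrepot_toolbelt", "gpf_check_md5"):
        pkg_logger = logging.getLogger(name)
        pkg_logger.setLevel(level)
        if not pkg_logger.handlers:
            pkg_logger.addHandler(logging.StreamHandler())
    # the script keeps using the logger named after its own package
    return logging.getLogger("gpf_check_md5")


logger = configure_loggers(verbosity=1)
logger.info("logging configured")
```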
@@ -14,14 +14,20 @@ from pathlib import Path
from gpf_entrepot_toolbelt.orchestrator.models import GpfOrchestratorParameters
from gpf_entrepot_toolbelt.orchestrator.status import Status
from gpf_entrepot_toolbelt.utils.check_path import check_path
from gpf_entrepot_toolbelt.utils.directories_utils import (
MD5_FILES_TEMPLATE,
TECHNICAL_FILES_TEMPLATE,
PathType,
list_directory,
)
# package
from gpf_check_md5.__about__ import __title_clean__, __version__
from gpf_check_md5.__about__ import __package_name__, __title_clean__, __version__
# -- GLOBALS
# logs
logger = logging.getLogger(__name__)
logger = logging.getLogger(__package_name__)
# -- FUNCTIONS
@@ -62,7 +68,7 @@ def validate(filename: str, md5digest: str, chunksize: int = 8192) -> Status:
if not check_path(
input_path=filename, must_be_a_file=True, must_exists=True, raise_error=False
):
logger.user_error(f"Le fichier {filename.name} n'existe pas.")
logger.user_error(f"Le fichier {filename} n'existe pas.")
return Status.TECHNICAL_ERROR
result = generate_md5_sum(filename, chunksize).lower() == md5digest.lower()
@@ -71,54 +77,59 @@ def validate(filename: str, md5digest: str, chunksize: int = 8192) -> Status:
return status_return
def check_md5_file(filename: Path, chunksize: int = 8192) -> int:
def check_md5_file(md5_path: Path, chunksize: int = 8192) -> Status:
"""Vérifie un fichier *.md5.
Ce genre de fichier est classiquement géneré par l'utilitaire
md5sum (ou md5 -r) sous unix.
Il est composé d'une chaîne hexadécimale de 32 caractères suivi de
Il est composé d'une chaîne hexadécimale de 32 caractères suivie de
deux espaces et du nom du fichier correspondant au hash md5.
Args:
md5_path(Path): MD5 file path to validate
chunksize(int):
Returns:
0 indique un SUCCESS
1 pour indiquer qu'il y a eu au moins une erreur d'un calcul md5
2 pour indiquer qu'il y a eu au moins une erreur technique
Status: The md5 file validation status
"""
result = 0
result: Status = Status.SUCCESS
try:
with open(filename) as checksum_file:
no_line = 0
with open(md5_path) as checksum_file:
no_line: int = 0
no_error: int = 0
for line in checksum_file:
no_line += 1
line = line.strip()
if len(line) <= 32:
logger.user_error(
f"FAILURE : la longueur de la ligne {line} du fichier "
f"{filename.name} n'est pas conforme : {len(line)}<=32"
f"{md5_path.name} n'est pas conforme : {len(line)}<=32"
)
result |= Status.FAILURE.value
result = Status.FAILURE
continue
checksum = line[:32]
basename = Path(filename).parent
check_filename = line[32:].lstrip()
sourceFilename = str(basename / check_filename)
ret = validate(
filename=sourceFilename, md5digest=checksum, chunksize=chunksize
checksum: str = line[:32]
basename: Path = Path(md5_path).parent
check_filename: str = line[32:].lstrip()
source_filename: str = str(basename / check_filename)
validate_result: Status = validate(
filename=source_filename, md5digest=checksum, chunksize=chunksize
)
if ret != Status.SUCCESS:
if validate_result != Status.SUCCESS:
no_error += 1
logger.user_error(
f"Fichier {filename.name} - vérification ligne {no_line} ({check_filename}): {ret.name}"
)
else:
logger.user_info(
f"Fichier {filename.name} - vérification ligne {no_line} ({check_filename}): {ret.name}"
f"Fichier {md5_path.name} - vérification ligne {no_line} ({check_filename}): {validate_result.name}"
)
result = validate_result
result |= ret.value
logger.user_info(
f"Fichier {md5_path.name} - {no_error} erreur(s) détectée(s) pour {no_line} ligne(s) vérifiée(s)"
)
except OSError as err:
logger.user_error(
@@ -126,7 +137,7 @@ def check_md5_file(filename: Path, chunksize: int = 8192) -> int:
)
exit(os.EX_IOERR)
return result if result < 2 else 2
return result
def run(
@@ -145,8 +156,8 @@ def run(
Returns:
Status: Status of the upload check
"""
# variables
result: int = 0
result: Status = Status.SUCCESS
# getting upload directory
upload = parameters.input_uploads.pop(0)
@@ -173,31 +184,30 @@
f"Vérification {__title_clean__} ({__version__}) pour la livraison {upload_id} "
)
with os.scandir(upload_dir) as it:
for entry in it:
if entry.name.endswith(".md5") and entry.is_file():
result_check = check_md5_file(
filename=upload_dir.joinpath(entry.name),
chunksize=chunk_size,
)
logger.user_info(
f"Vérification de {entry.name} : {Status(result_check).name}"
)
result |= result_check
for filepath in list_directory(
upload_dir,
types={PathType.FILE},
ignored_templates=[TECHNICAL_FILES_TEMPLATE],
included_templates=[MD5_FILES_TEMPLATE],
recursive=False,
):
logger.user_info(f"Vérification de {filepath.name}")
result_check_md5_file: Status = check_md5_file(
md5_path=filepath,
chunksize=chunk_size,
)
logger.user_info(f"Vérification de {filepath.name} : {result_check_md5_file}")
if result_check_md5_file != Status.SUCCESS:
result = Status.FAILURE
result_status = Status(result) if result < 2 else Status.TECHNICAL_ERROR
logger.user_info(f"Résultat global de la vérification : {result_status.name}")
return result_status
logger.user_info(f"Résultat global de la vérification : {result}")
return result
# -- Stand alone execution
if __name__ == "__main__":
from os import getenv
print(
run(
work_dir=getenv("GPF_WORK_DIR", Path("./tests")),
upload_dir_name=getenv("GPF_UPLOAD_DIR", "assets"),
chunk_size=getenv("GPF_CHUNK_SIZE", 8192),
)
)
run(
parameters=GpfOrchestratorParameters(), upload_dir_paths={}
) # required by unittest
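For reference, the `.md5` files handled by `check_md5_file()` follow the classic `md5sum` layout described in the docstring above: a 32-character hexadecimal digest, two spaces, then the name of the file, resolved relative to the `.md5` file. The following standalone sketch reproduces that parsing and the chunked digest computation with the standard library only; it mirrors the logic above but deliberately leaves out the toolbelt `Status` enum and logger, and since `generate_md5_sum()` is not shown in this diff, the chunked reading is an assumption about its behaviour:
```python
import hashlib
from pathlib import Path


def md5_of_file(path: Path, chunksize: int = 8192) -> str:
    """Compute a file's MD5 digest by chunks of `chunksize` bytes."""
    digest = hashlib.md5()
    with path.open("rb") as stream:
        for chunk in iter(lambda: stream.read(chunksize), b""):
            digest.update(chunk)
    return digest.hexdigest()


def verify_md5_file(md5_path: Path, chunksize: int = 8192) -> bool:
    """Parse an md5sum-style file and check every listed file.

    Each line is '<32 hex chars>  <filename>'; the filename is resolved
    relative to the .md5 file, as in check_md5_file() above.
    """
    all_ok = True
    base_dir = md5_path.parent
    for line in md5_path.read_text().splitlines():
        line = line.strip()
        if len(line) <= 32:
            print(f"malformed line: {line!r}")
            all_ok = False
            continue
        expected, filename = line[:32], line[32:].lstrip()
        target = base_dir / filename
        if not target.is_file():
            print(f"missing file: {target}")
            all_ok = False
        elif md5_of_file(target, chunksize).lower() != expected.lower():
            print(f"checksum mismatch for {target}")
            all_ok = False
    return all_ok


if __name__ == "__main__":
    # fixture path taken from the test suite below
    print(verify_md5_file(Path("tests/fixtures/livraisons/good/upload/valid_upload/all.md5")))
```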
# change index-url for next lines
--extra-index-url https://gitlab.gpf-tech.ign.fr/api/v4/groups/55/-/packages/pypi/simple
gpf-entrepot-toolbelt==1.8.1
gpf-entrepot-toolbelt==1.15.5
typing-extensions>=4,<5 ; python_version < '3.11'
typing-extensions==4.14.0 ; python_version=='3.11'
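The `--extra-index-url` line lets pip resolve the pinned `gpf-entrepot-toolbelt` from the Usine Logicielle registry while everything else still comes from pypi.org. For completeness, the same resolution can be requested from a script by passing the flag explicitly (a sketch reusing only the URL and pin shown above):
```python
import subprocess
import sys

# Sketch: install the pinned toolbelt, letting pip fall back to the GPF
# registry declared in the requirements above.
subprocess.run(
    [
        sys.executable, "-m", "pip", "install",
        "--extra-index-url",
        "https://gitlab.gpf-tech.ign.fr/api/v4/groups/55/-/packages/pypi/simple",
        "gpf-entrepot-toolbelt==1.15.5",
    ],
    check=True,
)
```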
{"_id": "1231544456-1546546-164565", "global_variables": {"postgresql": {"pass": "", "user": ""}, "swift": {"auth_url": "", "identity_api_version": "", "password": "", "project_domain_name": "", "region_name": "", "tenant_id": "", "tenant_name": "", "user_domain_name": "", "username": ""}}, "inputs": {"stored_datas": [], "uploads": [{"_id": "invalid_upload", "extent": {"east": 0, "north": 0, "south": 0, "west": 0}, "name": "", "size": 0, "srs": "", "storage": {"_id": "string", "name": "string", "type": "S3", "type_infos": {"pot_name": "upload-test-check-md5"}}, "type": "stringEnum(uploadType)", "type_infos": {}}]}, "job_name": "", "output": null, "parameters": [{"name": "", "value": ""}], "pipeline_status": {"gpf-md5-checker": "SUCCESS", "job_name1": "SUCCESS", "job_name2": "FAILURE"}}
\ No newline at end of file
{"_id": "1231544456-1546546-164565", "inputs": {"stored_datas": [], "uploads": [{"_id": "invalid_upload", "extent": {"east": 0, "north": 0, "south": 0, "west": 0}, "name": "", "size": 0, "srs": "", "storage": {"_id": "string", "name": "string", "type": "S3", "type_infos": {"pot_name": "upload-test-check-md5"}}, "type": "stringEnum(uploadType)", "type_infos": {}}]}, "job_name": "", "output": null, "parameters": [{"name": "", "value": ""}], "pipeline_status": {"gpf-md5-checker": "SUCCESS", "job_name1": "SUCCESS", "job_name2": "FAILURE"}}
\ No newline at end of file
{"_id": "1231544456-1546546-164565", "global_variables": {"postgresql": {"pass": "", "user": ""}, "swift": {"auth_url": "", "identity_api_version": "", "password": "", "project_domain_name": "", "region_name": "", "tenant_id": "", "tenant_name": "", "user_domain_name": "", "username": ""}}, "inputs": {"stored_datas": [], "uploads": [{"_id": "valid_upload", "extent": null, "name": "", "size": 0, "srs": "", "storage": {"_id": "string", "name": "string", "type": "S3", "type_infos": {"pot_name": "upload-test-check-md5"}}, "type": "stringEnum(uploadType)", "type_infos": {}}]}, "job_name": "", "output": null, "parameters": [{"name": "", "value": ""}], "pipeline_status": {"gpf-md5-checker": "SUCCESS", "job_name1": "SUCCESS", "job_name2": "FAILURE"}}
\ No newline at end of file
{"_id": "1231544456-1546546-164565", "inputs": {"stored_datas": [], "uploads": [{"_id": "valid_upload", "extent": null, "name": "", "size": 0, "srs": "", "storage": {"_id": "string", "name": "string", "type": "S3", "type_infos": {"pot_name": "upload-test-check-md5"}}, "type": "stringEnum(uploadType)", "type_infos": {}}]}, "job_name": "", "output": null, "parameters": [{"name": "", "value": ""}], "pipeline_status": {"gpf-md5-checker": "SUCCESS", "job_name1": "SUCCESS", "job_name2": "FAILURE"}}
\ No newline at end of file
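These JSON payloads are the `parameters.json` fixtures consumed by the script. They are loaded through the toolbelt model, as in the test suite below; a short sketch (the fixture path comes from the tests, and `input_uploads` is the attribute that `core.run()` pops above):
```python
from pathlib import Path

from gpf_entrepot_toolbelt.orchestrator.models import GpfOrchestratorParameters

# Load one of the fixture payloads (same call as in the test suite below).
parameters = GpfOrchestratorParameters.from_json(
    Path("./tests/fixtures/livraisons/good/parameters_v2.json")
)

# core.run() pops the first entry of input_uploads, so the fixture has to
# declare at least one upload.
print(parameters.input_uploads)
```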
772ac1a55fab1122f3b369ee9cd31532 md5.txt
b5871a318190397c5878ff2bd9f326d4 oslandia.txt
@@ -16,13 +16,16 @@ from collections import namedtuple
from pathlib import Path
from unittest.mock import patch
# 3rd party
from gpf_entrepot_toolbelt.orchestrator.models import GpfOrchestratorParameters
from gpf_entrepot_toolbelt.orchestrator.status import Status
from gpf_entrepot_toolbelt.utils.gpf_logger import gpf_logger_script
# project
from gpf_check_md5 import core
from gpf_check_md5.__about__ import __title_clean__
# logger
logger = gpf_logger_script(verbosity=0, title=__title_clean__)
@@ -40,41 +43,35 @@ class TestMD5(unittest.TestCase):
"""Test a md5 hash."""
self.assertEqual(
core.generate_md5_sum(
Path(
"./tests/fixtures/livraisons/good/upload/valid_upload/oslandia.txt"
)
"./tests/fixtures/livraisons/good/upload/valid_upload/oslandia.txt"
),
"b5871a318190397c5878ff2bd9f326d3",
)
def test_validate(self):
"""Test validate md5 file."""
self.assertTrue(
self.assertEqual(
core.validate(
Path(
"./tests/fixtures/livraisons/good/upload/valid_upload/oslandia.txt"
),
"./tests/fixtures/livraisons/good/upload/valid_upload/oslandia.txt",
"b5871a318190397c5878ff2bd9f326d3",
)
== core.Status.SUCCESS
),
Status.SUCCESS,
)
self.assertTrue(
self.assertEqual(
core.validate(
Path(
"./tests/fixtures/livraisons/good/upload/valid_upload/oslandia.txt"
),
"./tests/fixtures/livraisons/good/upload/valid_upload/oslandia.txt",
"b5871a318190397c5878ff2bd9f326d2",
)
== core.Status.FAILURE
),
Status.FAILURE,
)
self.assertTrue(
self.assertEqual(
core.validate(
Path(
"./tests/fixtures/livraisons/good/upload/valid_upload/oslandia.tx"
),
"./tests/fixtures/livraisons/good/upload/valid_upload/oslandia.tx",
"b5871a318190397c5878ff2bd9f326d2",
)
== core.Status.TECHNICAL_ERROR
),
Status.TECHNICAL_ERROR,
)
def test_check_md5_file(self):
@@ -84,23 +81,22 @@ class TestMD5(unittest.TestCase):
ret = core.check_md5_file(
Path("./tests/fixtures/livraisons/good/upload/valid_upload/all.md5"),
)
self.assertEqual(ret, core.Status.SUCCESS.value)
self.assertEqual(ret, Status.SUCCESS)
# Technical error: file not found
status = {}
with self.assertRaises(SystemExit) as exc:
core.check_md5_file(
Path("./tests/fixtures/livraisons/good/failed_all"), status
)
core.check_md5_file(Path("./tests/fixtures/livraisons/good/failed_all"))
self.assertEqual(exc.exception.code, 74) # EX_IOERR
def test_script_run_ok(self):
"""test main script run."""
# Given
parameters = GpfOrchestratorParameters.from_json(
Path("./tests/fixtures/livraisons/good/parameters_v2.json")
)
# When
ret = core.run(
parameters=parameters,
upload_dir_paths={
@@ -109,7 +105,9 @@ class TestMD5(unittest.TestCase):
)
},
)
self.assertEqual(ret, core.Status.SUCCESS)
# Then
self.assertEqual(ret, Status.SUCCESS)
def test_script_run_ko(self):
"""Test case where main script should raise an error."""
@@ -123,7 +121,7 @@ class TestMD5(unittest.TestCase):
parameters=parameters,
upload_dir_paths={},
)
self.assertEqual(ret, core.Status.TECHNICAL_ERROR)
self.assertEqual(ret, Status.TECHNICAL_ERROR)
# Bad work dir returned
ret = core.run(
@@ -134,7 +132,7 @@
)
},
)
self.assertEqual(ret, core.Status.TECHNICAL_ERROR)
self.assertEqual(ret, Status.TECHNICAL_ERROR)
# No Md5 returned
with patch.object(core, "check_md5_file", return_value=2):
@@ -146,7 +144,7 @@
)
},
)
self.assertEqual(ret, core.Status.TECHNICAL_ERROR)
self.assertEqual(ret, Status.FAILURE)
# ############################################################################
# ####### Stand-alone run ########
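To run this test suite locally, standard `unittest` discovery is enough (a minimal sketch; CI may well invoke it through pytest or coverage instead, which is not shown here):
```python
import unittest

# Discover and run every test module under the tests/ directory.
suite = unittest.defaultTestLoader.discover("tests")
unittest.TextTestRunner(verbosity=2).run(suite)
```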