diff --git a/README.md b/README.md
index 87c4c80..b44ce28 100644
--- a/README.md
+++ b/README.md
@@ -1,4 +1,4 @@
-# cyperf
+# CyPerf
 CyPerf REST API
 
 This Python package is automatically generated by the [OpenAPI Generator](https://openapi-generator.tech) project:
@@ -8,20 +8,79 @@ This Python package is automatically generated by the [OpenAPI Generator](https://openapi-generator.tech) project:
 - Generator version: 7.7.0
 - Build package: org.openapitools.codegen.languages.PythonClientCodegen
 
-## Requirements.
-
-Python 3.11+
-
 ## Installation & Usage
-### pip install
-
-You can install directly by doing:
-```sh
-pip install .
-```
-
-from the base of this repository.
+### Table of Contents
+
+* [Prerequisites](#prerequisites)
+* [Step 1: Install Python](#step-1-install-python)
+* [Step 2: Create a New Directory and Virtual Environment](#step-2-create-a-new-directory-and-virtual-environment)
+* [Step 3: Clone the CyPerf API Wrapper Repository and Install tshark](#step-3-clone-the-cyperf-api-wrapper-repository-and-install-tshark)
+* [Step 4: Install Dependencies and Build the Package](#step-4-install-dependencies-and-build-the-package)
+
+### Prerequisites
+
+* Python 3.11+ (check the [README.md file](https://github.com/Keysight/cyperf-api-wrapper/blob/main/README.md) for the latest requirement)
+
+### Step 1: Install Python
+
+* Download and install Python from the [official Python website](https://www.python.org/downloads/).
+* Ensure necessary dependencies are installed:
+
+  On Linux (Debian-based):
+  ```bash
+  sudo apt install libssl-dev libffi-dev python3-dev
+  ```
+
+* If installing Python from source, run the following from the extracted source directory:
+  ```bash
+  sudo apt update
+  sudo apt upgrade
+  ./configure --enable-optimizations --with-ssl
+  make -j $(nproc)
+  sudo make altinstall
+  ```
+* Verify the installation (replace `3.X` with the version you installed):
+  ```bash
+  python3.X --version
+  ```
+
+### Step 2: Create a New Directory and Virtual Environment
+* Create a new directory:
+  ```bash
+  mkdir cyperf-api
+  ```
+* Navigate to the directory:
+  ```bash
+  cd cyperf-api
+  ```
+* Create a virtual environment:
+  ```bash
+  python3.X -m venv env1
+  ```
+* Activate the virtual environment:
+  ```bash
+  source env1/bin/activate
+  ```
+
+### Step 3: Clone the CyPerf API Wrapper Repository and Install tshark
+* Clone the repository:
+  ```bash
+  git clone https://github.com/Keysight/cyperf-api-wrapper.git
+  ```
+* Navigate to the repository directory and install tshark:
+  ```bash
+  cd cyperf-api-wrapper
+  sudo apt install tshark
+  ```
+
+### Step 4: Install Dependencies and Build the Package
+* Upgrade pip and install pyshark:
+  ```bash
+  python3 -m pip install --upgrade pip
+  python3 -m pip install pyshark
+  ```
+* Install Poetry:
+  ```bash
+  python3 -m pip install poetry
+  ```
+* Install dependencies:
+  ```bash
+  poetry install
+  ```
+* Build the package (a quick install check follows this list):
+  ```bash
+  poetry build
+  ```
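+
+After a successful `poetry build`, the sdist and wheel are written to `dist/`. As a quick sanity check (a minimal sketch; the exact wheel filename depends on the package version, and the import name `cyperf` is assumed from the package title above), install the built wheel into the active virtual environment and import it:
+
+```bash
+pip install dist/cyperf-*.whl
+python3 -c "import cyperf; print(cyperf.__name__)"
+```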
 
 Then import the package:
 ```python
@@ -117,7 +176,6 @@ Class | Method | HTTP request | Description
 *AgentsApi* | [**start_controllers_set_port_link_state**](docs/AgentsApi.md#start_controllers_set_port_link_state) | **POST** /api/v2/controllers/operations/set-port-link-state | 
 *ApplicationResourcesApi* | [**delete_resources_capture**](docs/ApplicationResourcesApi.md#delete_resources_capture) | **DELETE** /api/v2/resources/captures/{captureId} | 
 *ApplicationResourcesApi* | [**delete_resources_certificate**](docs/ApplicationResourcesApi.md#delete_resources_certificate) | **DELETE** /api/v2/resources/certificates/{certificateId} | 
-*ApplicationResourcesApi* | [**delete_resources_custom_fuzzing_script**](docs/ApplicationResourcesApi.md#delete_resources_custom_fuzzing_script) | **DELETE** /api/v2/resources/custom-fuzzing-scripts/{customFuzzingScriptId} | 
 *ApplicationResourcesApi* | [**delete_resources_flow_library**](docs/ApplicationResourcesApi.md#delete_resources_flow_library) | **DELETE** /api/v2/resources/flow-library/{flowLibraryId} | 
 *ApplicationResourcesApi* | [**delete_resources_global_playlist**](docs/ApplicationResourcesApi.md#delete_resources_global_playlist) | **DELETE** /api/v2/resources/global-playlists/{globalPlaylistId} | 
 *ApplicationResourcesApi* | [**delete_resources_http_library**](docs/ApplicationResourcesApi.md#delete_resources_http_library) | **DELETE** /api/v2/resources/http-library/{httpLibraryId} | 
@@ -151,10 +209,6 @@ Class | Method | HTTP request | Description
 *ApplicationResourcesApi* | [**get_resources_certificate_content_file**](docs/ApplicationResourcesApi.md#get_resources_certificate_content_file) | **GET** /api/v2/resources/certificates/{certificateId}/contentFile | 
 *ApplicationResourcesApi* | [**get_resources_certificates**](docs/ApplicationResourcesApi.md#get_resources_certificates) | **GET** /api/v2/resources/certificates | 
 *ApplicationResourcesApi* | [**get_resources_certificates_upload_file_result**](docs/ApplicationResourcesApi.md#get_resources_certificates_upload_file_result) | **GET** /api/v2/resources/certificates/operations/uploadFile/{uploadFileId}/result | 
-*ApplicationResourcesApi* | [**get_resources_custom_fuzzing_script_by_id**](docs/ApplicationResourcesApi.md#get_resources_custom_fuzzing_script_by_id) | **GET** /api/v2/resources/custom-fuzzing-scripts/{customFuzzingScriptId} | 
-*ApplicationResourcesApi* | [**get_resources_custom_fuzzing_script_content_file**](docs/ApplicationResourcesApi.md#get_resources_custom_fuzzing_script_content_file) | **GET** /api/v2/resources/custom-fuzzing-scripts/{customFuzzingScriptId}/contentFile | 
-*ApplicationResourcesApi* | [**get_resources_custom_fuzzing_scripts**](docs/ApplicationResourcesApi.md#get_resources_custom_fuzzing_scripts) | **GET** /api/v2/resources/custom-fuzzing-scripts | 
-*ApplicationResourcesApi* | [**get_resources_custom_fuzzing_scripts_upload_file_result**](docs/ApplicationResourcesApi.md#get_resources_custom_fuzzing_scripts_upload_file_result) | **GET** /api/v2/resources/custom-fuzzing-scripts/operations/uploadFile/{uploadFileId}/result | 
 *ApplicationResourcesApi* | [**get_resources_flow_library**](docs/ApplicationResourcesApi.md#get_resources_flow_library) | **GET** /api/v2/resources/flow-library | 
 *ApplicationResourcesApi* | [**get_resources_flow_library_by_id**](docs/ApplicationResourcesApi.md#get_resources_flow_library_by_id) | **GET** /api/v2/resources/flow-library/{flowLibraryId} | 
 *ApplicationResourcesApi* | [**get_resources_flow_library_content_file**](docs/ApplicationResourcesApi.md#get_resources_flow_library_content_file) | **GET** /api/v2/resources/flow-library/{flowLibraryId}/contentFile | 
@@ -223,9 +277,7 @@ Class | Method | HTTP request | Description
 *ApplicationResourcesApi* | [**poll_resources_captures_batch_delete**](docs/ApplicationResourcesApi.md#poll_resources_captures_batch_delete) | **GET** /api/v2/resources/captures/operations/batch-delete/{id} | 
 *ApplicationResourcesApi* | [**poll_resources_captures_upload_file**](docs/ApplicationResourcesApi.md#poll_resources_captures_upload_file) | **GET** /api/v2/resources/captures/operations/uploadFile/{uploadFileId} | 
 *ApplicationResourcesApi* | [**poll_resources_certificates_upload_file**](docs/ApplicationResourcesApi.md#poll_resources_certificates_upload_file) | **GET** 
/api/v2/resources/certificates/operations/uploadFile/{uploadFileId} | -*ApplicationResourcesApi* | [**poll_resources_config_export_user_defined_apps**](docs/ApplicationResourcesApi.md#poll_resources_config_export_user_defined_apps) | **GET** /api/v2/resources/configs/{configId}/operations/export-user-defined-apps/{id} | *ApplicationResourcesApi* | [**poll_resources_create_app**](docs/ApplicationResourcesApi.md#poll_resources_create_app) | **GET** /api/v2/resources/operations/create-app/{id} | -*ApplicationResourcesApi* | [**poll_resources_custom_fuzzing_scripts_upload_file**](docs/ApplicationResourcesApi.md#poll_resources_custom_fuzzing_scripts_upload_file) | **GET** /api/v2/resources/custom-fuzzing-scripts/operations/uploadFile/{uploadFileId} | *ApplicationResourcesApi* | [**poll_resources_edit_app**](docs/ApplicationResourcesApi.md#poll_resources_edit_app) | **GET** /api/v2/resources/operations/edit-app/{id} | *ApplicationResourcesApi* | [**poll_resources_find_param_matches**](docs/ApplicationResourcesApi.md#poll_resources_find_param_matches) | **GET** /api/v2/resources/operations/find-param-matches/{id} | *ApplicationResourcesApi* | [**poll_resources_flow_library_upload_file**](docs/ApplicationResourcesApi.md#poll_resources_flow_library_upload_file) | **GET** /api/v2/resources/flow-library/operations/uploadFile/{uploadFileId} | @@ -252,9 +304,7 @@ Class | Method | HTTP request | Description *ApplicationResourcesApi* | [**start_resources_captures_batch_delete**](docs/ApplicationResourcesApi.md#start_resources_captures_batch_delete) | **POST** /api/v2/resources/captures/operations/batch-delete | *ApplicationResourcesApi* | [**start_resources_captures_upload_file**](docs/ApplicationResourcesApi.md#start_resources_captures_upload_file) | **POST** /api/v2/resources/captures/operations/uploadFile | *ApplicationResourcesApi* | [**start_resources_certificates_upload_file**](docs/ApplicationResourcesApi.md#start_resources_certificates_upload_file) | **POST** /api/v2/resources/certificates/operations/uploadFile | -*ApplicationResourcesApi* | [**start_resources_config_export_user_defined_apps**](docs/ApplicationResourcesApi.md#start_resources_config_export_user_defined_apps) | **POST** /api/v2/resources/configs/{configId}/operations/export-user-defined-apps | *ApplicationResourcesApi* | [**start_resources_create_app**](docs/ApplicationResourcesApi.md#start_resources_create_app) | **POST** /api/v2/resources/operations/create-app | -*ApplicationResourcesApi* | [**start_resources_custom_fuzzing_scripts_upload_file**](docs/ApplicationResourcesApi.md#start_resources_custom_fuzzing_scripts_upload_file) | **POST** /api/v2/resources/custom-fuzzing-scripts/operations/uploadFile | *ApplicationResourcesApi* | [**start_resources_edit_app**](docs/ApplicationResourcesApi.md#start_resources_edit_app) | **POST** /api/v2/resources/operations/edit-app | *ApplicationResourcesApi* | [**start_resources_find_param_matches**](docs/ApplicationResourcesApi.md#start_resources_find_param_matches) | **POST** /api/v2/resources/operations/find-param-matches | *ApplicationResourcesApi* | [**start_resources_flow_library_upload_file**](docs/ApplicationResourcesApi.md#start_resources_flow_library_upload_file) | **POST** /api/v2/resources/flow-library/operations/uploadFile | @@ -352,6 +402,7 @@ Class | Method | HTTP request | Description *SessionsApi* | [**create_sessions**](docs/SessionsApi.md#create_sessions) | **POST** /api/v2/sessions | *SessionsApi* | [**delete_session**](docs/SessionsApi.md#delete_session) | **DELETE** 
/api/v2/sessions/{sessionId} | *SessionsApi* | [**delete_session_meta**](docs/SessionsApi.md#delete_session_meta) | **DELETE** /api/v2/sessions/{sessionId}/meta/{metaId} | +*SessionsApi* | [**get_appsec_ui_metadata**](docs/SessionsApi.md#get_appsec_ui_metadata) | **GET** /api/v2/appsec-ui-metadata | *SessionsApi* | [**get_config_docs**](docs/SessionsApi.md#get_config_docs) | **GET** /api/v2/sessions/{sessionId}/config/$docs | *SessionsApi* | [**get_config_granular_stats**](docs/SessionsApi.md#get_config_granular_stats) | **GET** /api/v2/sessions/{sessionId}/config/granular-stats | *SessionsApi* | [**get_config_granular_stats_filters**](docs/SessionsApi.md#get_config_granular_stats_filters) | **GET** /api/v2/sessions/{sessionId}/config/granular-stats-filters | @@ -365,8 +416,8 @@ Class | Method | HTTP request | Description *SessionsApi* | [**patch_session_meta**](docs/SessionsApi.md#patch_session_meta) | **PATCH** /api/v2/sessions/{sessionId}/meta/{metaId} | *SessionsApi* | [**patch_session_test**](docs/SessionsApi.md#patch_session_test) | **PATCH** /api/v2/sessions/{sessionId}/test | *SessionsApi* | [**poll_config_add_applications**](docs/SessionsApi.md#poll_config_add_applications) | **GET** /api/v2/sessions/{sessionId}/config/config/TrafficProfiles/{trafficProfileId}/operations/add-applications/{id} | +*SessionsApi* | [**poll_config_save**](docs/SessionsApi.md#poll_config_save) | **GET** /api/v2/sessions/{sessionId}/config/operations/save/{id} | *SessionsApi* | [**poll_session_config_granular_stats_default_dashboards**](docs/SessionsApi.md#poll_session_config_granular_stats_default_dashboards) | **GET** /api/v2/sessions/{sessionId}/config/operations/granular-stats-default-dashboards/{id} | -*SessionsApi* | [**poll_session_config_save**](docs/SessionsApi.md#poll_session_config_save) | **GET** /api/v2/sessions/{sessionId}/config/operations/save/{id} | *SessionsApi* | [**poll_session_load_config**](docs/SessionsApi.md#poll_session_load_config) | **GET** /api/v2/sessions/{sessionId}/operations/loadConfig/{id} | *SessionsApi* | [**poll_session_prepare_test**](docs/SessionsApi.md#poll_session_prepare_test) | **GET** /api/v2/sessions/{sessionId}/operations/prepareTest/{id} | *SessionsApi* | [**poll_session_test_end**](docs/SessionsApi.md#poll_session_test_end) | **GET** /api/v2/sessions/{sessionId}/operations/testEnd/{id} | @@ -463,7 +514,6 @@ Class | Method | HTTP request | Description - [ActivationCodeInfo](docs/ActivationCodeInfo.md) - [ActivationCodeListRequest](docs/ActivationCodeListRequest.md) - [ActivationCodeRequest](docs/ActivationCodeRequest.md) - - [AddActionInfo](docs/AddActionInfo.md) - [AddInput](docs/AddInput.md) - [AdvancedSettings](docs/AdvancedSettings.md) - [Agent](docs/Agent.md) @@ -497,15 +547,12 @@ Class | Method | HTTP request | Description - [AsyncOperationResponse](docs/AsyncOperationResponse.md) - [Attack](docs/Attack.md) - [AttackAction](docs/AttackAction.md) - - [AttackMetadata](docs/AttackMetadata.md) - - [AttackMetadataKeywordsInner](docs/AttackMetadataKeywordsInner.md) - [AttackObjectivesAndTimeline](docs/AttackObjectivesAndTimeline.md) - [AttackProfile](docs/AttackProfile.md) - [AttackTimelineSegment](docs/AttackTimelineSegment.md) - [AttackTrack](docs/AttackTrack.md) - [AuthMethodType](docs/AuthMethodType.md) - [AuthProfile](docs/AuthProfile.md) - - [AuthProfileMetadata](docs/AuthProfileMetadata.md) - [AuthSettings](docs/AuthSettings.md) - [Authenticate200Response](docs/Authenticate200Response.md) - [AuthenticationSettings](docs/AuthenticationSettings.md) @@ 
-532,6 +579,7 @@ Class | Method | HTTP request | Description - [ConfigCategory](docs/ConfigCategory.md) - [ConfigId](docs/ConfigId.md) - [ConfigMetadata](docs/ConfigMetadata.md) + - [ConfigMetadataConfigDataValue](docs/ConfigMetadataConfigDataValue.md) - [ConfigValidation](docs/ConfigValidation.md) - [Conflict](docs/Conflict.md) - [Connection](docs/Connection.md) @@ -541,7 +589,6 @@ Class | Method | HTTP request | Description - [CountedFeatureConsumer](docs/CountedFeatureConsumer.md) - [CountedFeatureStats](docs/CountedFeatureStats.md) - [CreateAppOperation](docs/CreateAppOperation.md) - - [CreateAppOrAttackOperationInput](docs/CreateAppOrAttackOperationInput.md) - [CustomDashboards](docs/CustomDashboards.md) - [CustomImportHandler](docs/CustomImportHandler.md) - [CustomStat](docs/CustomStat.md) @@ -850,6 +897,5 @@ Authentication schemes defined for the API: ## Author -support@keysight.com - +soumya.neogy@keysight.com diff --git a/poetry.lock b/poetry.lock new file mode 100644 index 0000000..b9c0ab5 --- /dev/null +++ b/poetry.lock @@ -0,0 +1,1145 @@ +# This file is automatically @generated by Poetry 2.1.4 and should not be changed by hand. + +[[package]] +name = "annotated-types" +version = "0.5.0" +description = "Reusable constraint types to use with typing.Annotated" +optional = false +python-versions = ">=3.7" +groups = ["main"] +markers = "python_version < \"3.13\"" +files = [ + {file = "annotated_types-0.5.0-py3-none-any.whl", hash = "sha256:58da39888f92c276ad970249761ebea80ba544b77acddaa1a4d6cf78287d45fd"}, + {file = "annotated_types-0.5.0.tar.gz", hash = "sha256:47cdc3490d9ac1506ce92c7aaa76c579dc3509ff11e098fc867e5130ab7be802"}, +] + +[package.dependencies] +typing-extensions = {version = ">=4.0.0", markers = "python_version < \"3.9\""} + +[[package]] +name = "annotated-types" +version = "0.7.0" +description = "Reusable constraint types to use with typing.Annotated" +optional = false +python-versions = ">=3.8" +groups = ["main"] +markers = "python_version >= \"3.13\"" +files = [ + {file = "annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53"}, + {file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"}, +] + +[[package]] +name = "cachetools" +version = "6.1.0" +description = "Extensible memoizing collections and decorators" +optional = false +python-versions = ">=3.9" +groups = ["dev"] +markers = "python_version >= \"3.13\"" +files = [ + {file = "cachetools-6.1.0-py3-none-any.whl", hash = "sha256:1c7bb3cf9193deaf3508b7c5f2a79986c13ea38965c5adcff1f84519cf39163e"}, + {file = "cachetools-6.1.0.tar.gz", hash = "sha256:b4c4f404392848db3ce7aac34950d17be4d864da4b8b66911008e430bc544587"}, +] + +[[package]] +name = "chardet" +version = "5.2.0" +description = "Universal encoding detector for Python 3" +optional = false +python-versions = ">=3.7" +groups = ["dev"] +markers = "python_version >= \"3.13\"" +files = [ + {file = "chardet-5.2.0-py3-none-any.whl", hash = "sha256:e1cf59446890a00105fe7b7912492ea04b6e6f06d4b742b2c788469e34c82970"}, + {file = "chardet-5.2.0.tar.gz", hash = "sha256:1b3b6ff479a8c414bc3fa2c0852995695c4a026dcd6d0633b2dd092ca39c1cf7"}, +] + +[[package]] +name = "colorama" +version = "0.4.6" +description = "Cross-platform colored terminal text." 
+optional = false +python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,>=2.7" +groups = ["dev"] +markers = "sys_platform == \"win32\" or platform_system == \"Windows\" or python_version >= \"3.13\"" +files = [ + {file = "colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6"}, + {file = "colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44"}, +] + +[[package]] +name = "distlib" +version = "0.4.0" +description = "Distribution utilities" +optional = false +python-versions = "*" +groups = ["dev"] +files = [ + {file = "distlib-0.4.0-py2.py3-none-any.whl", hash = "sha256:9659f7d87e46584a30b5780e43ac7a2143098441670ff0a49d5f9034c54a6c16"}, + {file = "distlib-0.4.0.tar.gz", hash = "sha256:feec40075be03a04501a973d81f633735b4b69f98b05450592310c0f401a4e0d"}, +] + +[[package]] +name = "exceptiongroup" +version = "1.3.0" +description = "Backport of PEP 654 (exception groups)" +optional = false +python-versions = ">=3.7" +groups = ["dev"] +markers = "python_version < \"3.11\"" +files = [ + {file = "exceptiongroup-1.3.0-py3-none-any.whl", hash = "sha256:4d111e6e0c13d0644cad6ddaa7ed0261a0b36971f6d23e7ec9b4b9097da78a10"}, + {file = "exceptiongroup-1.3.0.tar.gz", hash = "sha256:b241f5885f560bc56a59ee63ca4c6a8bfa46ae4ad651af316d4e81817bb9fd88"}, +] + +[package.dependencies] +typing-extensions = {version = ">=4.6.0", markers = "python_version < \"3.13\""} + +[package.extras] +test = ["pytest (>=6)"] + +[[package]] +name = "filelock" +version = "3.12.2" +description = "A platform independent file lock." +optional = false +python-versions = ">=3.7" +groups = ["dev"] +markers = "python_version < \"3.13\"" +files = [ + {file = "filelock-3.12.2-py3-none-any.whl", hash = "sha256:cbb791cdea2a72f23da6ac5b5269ab0a0d161e9ef0100e653b69049a7706d1ec"}, + {file = "filelock-3.12.2.tar.gz", hash = "sha256:002740518d8aa59a26b0c76e10fb8c6e15eae825d34b6fdf670333fd7b938d81"}, +] + +[package.extras] +docs = ["furo (>=2023.5.20)", "sphinx (>=7.0.1)", "sphinx-autodoc-typehints (>=1.23,!=1.23.4)"] +testing = ["covdefaults (>=2.3)", "coverage (>=7.2.7)", "diff-cover (>=7.5)", "pytest (>=7.3.1)", "pytest-cov (>=4.1)", "pytest-mock (>=3.10)", "pytest-timeout (>=2.1)"] + +[[package]] +name = "filelock" +version = "3.19.1" +description = "A platform independent file lock." 
+optional = false +python-versions = ">=3.9" +groups = ["dev"] +markers = "python_version >= \"3.13\"" +files = [ + {file = "filelock-3.19.1-py3-none-any.whl", hash = "sha256:d38e30481def20772f5baf097c122c3babc4fcdb7e14e57049eb9d88c6dc017d"}, + {file = "filelock-3.19.1.tar.gz", hash = "sha256:66eda1888b0171c998b35be2bcc0f6d75c388a7ce20c3f3f37aa8e96c2dddf58"}, +] + +[[package]] +name = "flake8" +version = "5.0.4" +description = "the modular source code checker: pep8 pyflakes and co" +optional = false +python-versions = ">=3.6.1" +groups = ["dev"] +markers = "python_version < \"3.13\"" +files = [ + {file = "flake8-5.0.4-py2.py3-none-any.whl", hash = "sha256:7a1cf6b73744f5806ab95e526f6f0d8c01c66d7bbe349562d22dfca20610b248"}, + {file = "flake8-5.0.4.tar.gz", hash = "sha256:6fbe320aad8d6b95cec8b8e47bc933004678dc63095be98528b7bdd2a9f510db"}, +] + +[package.dependencies] +importlib-metadata = {version = ">=1.1.0,<4.3", markers = "python_version < \"3.8\""} +mccabe = ">=0.7.0,<0.8.0" +pycodestyle = ">=2.9.0,<2.10.0" +pyflakes = ">=2.5.0,<2.6.0" + +[[package]] +name = "flake8" +version = "7.3.0" +description = "the modular source code checker: pep8 pyflakes and co" +optional = false +python-versions = ">=3.9" +groups = ["dev"] +markers = "python_version >= \"3.13\"" +files = [ + {file = "flake8-7.3.0-py2.py3-none-any.whl", hash = "sha256:b9696257b9ce8beb888cdbe31cf885c90d31928fe202be0889a7cdafad32f01e"}, + {file = "flake8-7.3.0.tar.gz", hash = "sha256:fe044858146b9fc69b551a4b490d69cf960fcb78ad1edcb84e7fbb1b4a8e3872"}, +] + +[package.dependencies] +mccabe = ">=0.7.0,<0.8.0" +pycodestyle = ">=2.14.0,<2.15.0" +pyflakes = ">=3.4.0,<3.5.0" + +[[package]] +name = "importlib-metadata" +version = "4.2.0" +description = "Read metadata from Python packages" +optional = false +python-versions = ">=3.6" +groups = ["main", "dev"] +markers = "python_version == \"3.7\"" +files = [ + {file = "importlib_metadata-4.2.0-py3-none-any.whl", hash = "sha256:057e92c15bc8d9e8109738a48db0ccb31b4d9d5cfbee5a8670879a30be66304b"}, + {file = "importlib_metadata-4.2.0.tar.gz", hash = "sha256:b7e52a1f8dec14a75ea73e0891f3060099ca1d8e6a462a4dff11c3e119ea1b31"}, +] + +[package.dependencies] +typing-extensions = {version = ">=3.6.4", markers = "python_version < \"3.8\""} +zipp = ">=0.5" + +[package.extras] +docs = ["jaraco.packaging (>=8.2)", "rst.linker (>=1.9)", "sphinx"] +testing = ["flufl.flake8", "importlib-resources (>=1.3) ; python_version < \"3.9\"", "packaging", "pep517", "pyfakefs", "pytest (>=4.6)", "pytest-black (>=0.3.7) ; platform_python_implementation != \"PyPy\" and python_version < \"3.10\"", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=1.0.1)", "pytest-flake8", "pytest-mypy ; platform_python_implementation != \"PyPy\" and python_version < \"3.10\""] + +[[package]] +name = "iniconfig" +version = "2.0.0" +description = "brain-dead simple config-ini parsing" +optional = false +python-versions = ">=3.7" +groups = ["dev"] +markers = "python_version < \"3.13\"" +files = [ + {file = "iniconfig-2.0.0-py3-none-any.whl", hash = "sha256:b6a85871a79d2e3b22d2d1b94ac2824226a63c6b741c88f7ae975f18b6778374"}, + {file = "iniconfig-2.0.0.tar.gz", hash = "sha256:2d91e135bf72d31a410b17c16da610a82cb55f6b0477d1a902134b24a455b8b3"}, +] + +[[package]] +name = "iniconfig" +version = "2.1.0" +description = "brain-dead simple config-ini parsing" +optional = false +python-versions = ">=3.8" +groups = ["dev"] +markers = "python_version >= \"3.13\"" +files = [ + {file = "iniconfig-2.1.0-py3-none-any.whl", hash = 
"sha256:9deba5723312380e77435581c6bf4935c94cbfab9b1ed33ef8d238ea168eb760"}, + {file = "iniconfig-2.1.0.tar.gz", hash = "sha256:3abbd2e30b36733fee78f9c7f7308f2d0050e88f0087fd25c2645f63c773e1c7"}, +] + +[[package]] +name = "mccabe" +version = "0.7.0" +description = "McCabe checker, plugin for flake8" +optional = false +python-versions = ">=3.6" +groups = ["dev"] +files = [ + {file = "mccabe-0.7.0-py2.py3-none-any.whl", hash = "sha256:6c2d30ab6be0e4a46919781807b4f0d834ebdd6c6e3dca0bda5a15f863427b6e"}, + {file = "mccabe-0.7.0.tar.gz", hash = "sha256:348e0240c33b60bbdf4e523192ef919f28cb2c3d7d5c7794f74009290f236325"}, +] + +[[package]] +name = "mypy" +version = "1.4.1" +description = "Optional static typing for Python" +optional = false +python-versions = ">=3.7" +groups = ["dev"] +files = [ + {file = "mypy-1.4.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:566e72b0cd6598503e48ea610e0052d1b8168e60a46e0bfd34b3acf2d57f96a8"}, + {file = "mypy-1.4.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:ca637024ca67ab24a7fd6f65d280572c3794665eaf5edcc7e90a866544076878"}, + {file = "mypy-1.4.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0dde1d180cd84f0624c5dcaaa89c89775550a675aff96b5848de78fb11adabcd"}, + {file = "mypy-1.4.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:8c4d8e89aa7de683e2056a581ce63c46a0c41e31bd2b6d34144e2c80f5ea53dc"}, + {file = "mypy-1.4.1-cp310-cp310-win_amd64.whl", hash = "sha256:bfdca17c36ae01a21274a3c387a63aa1aafe72bff976522886869ef131b937f1"}, + {file = "mypy-1.4.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:7549fbf655e5825d787bbc9ecf6028731973f78088fbca3a1f4145c39ef09462"}, + {file = "mypy-1.4.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:98324ec3ecf12296e6422939e54763faedbfcc502ea4a4c38502082711867258"}, + {file = "mypy-1.4.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:141dedfdbfe8a04142881ff30ce6e6653c9685b354876b12e4fe6c78598b45e2"}, + {file = "mypy-1.4.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:8207b7105829eca6f3d774f64a904190bb2231de91b8b186d21ffd98005f14a7"}, + {file = "mypy-1.4.1-cp311-cp311-win_amd64.whl", hash = "sha256:16f0db5b641ba159eff72cff08edc3875f2b62b2fa2bc24f68c1e7a4e8232d01"}, + {file = "mypy-1.4.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:470c969bb3f9a9efcedbadcd19a74ffb34a25f8e6b0e02dae7c0e71f8372f97b"}, + {file = "mypy-1.4.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e5952d2d18b79f7dc25e62e014fe5a23eb1a3d2bc66318df8988a01b1a037c5b"}, + {file = "mypy-1.4.1-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:190b6bab0302cec4e9e6767d3eb66085aef2a1cc98fe04936d8a42ed2ba77bb7"}, + {file = "mypy-1.4.1-cp37-cp37m-win_amd64.whl", hash = "sha256:9d40652cc4fe33871ad3338581dca3297ff5f2213d0df345bcfbde5162abf0c9"}, + {file = "mypy-1.4.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:01fd2e9f85622d981fd9063bfaef1aed6e336eaacca00892cd2d82801ab7c042"}, + {file = "mypy-1.4.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:2460a58faeea905aeb1b9b36f5065f2dc9a9c6e4c992a6499a2360c6c74ceca3"}, + {file = "mypy-1.4.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a2746d69a8196698146a3dbe29104f9eb6a2a4d8a27878d92169a6c0b74435b6"}, + {file = "mypy-1.4.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:ae704dcfaa180ff7c4cfbad23e74321a2b774f92ca77fd94ce1049175a21c97f"}, + {file = "mypy-1.4.1-cp38-cp38-win_amd64.whl", hash = "sha256:43d24f6437925ce50139a310a64b2ab048cb2d3694c84c71c3f2a1626d8101dc"}, + {file 
= "mypy-1.4.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:c482e1246726616088532b5e964e39765b6d1520791348e6c9dc3af25b233828"}, + {file = "mypy-1.4.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:43b592511672017f5b1a483527fd2684347fdffc041c9ef53428c8dc530f79a3"}, + {file = "mypy-1.4.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:34a9239d5b3502c17f07fd7c0b2ae6b7dd7d7f6af35fbb5072c6208e76295816"}, + {file = "mypy-1.4.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:5703097c4936bbb9e9bce41478c8d08edd2865e177dc4c52be759f81ee4dd26c"}, + {file = "mypy-1.4.1-cp39-cp39-win_amd64.whl", hash = "sha256:e02d700ec8d9b1859790c0475df4e4092c7bf3272a4fd2c9f33d87fac4427b8f"}, + {file = "mypy-1.4.1-py3-none-any.whl", hash = "sha256:45d32cec14e7b97af848bddd97d85ea4f0db4d5a149ed9676caa4eb2f7402bb4"}, + {file = "mypy-1.4.1.tar.gz", hash = "sha256:9bbcd9ab8ea1f2e1c8031c21445b511442cc45c89951e49bbf852cbb70755b1b"}, +] + +[package.dependencies] +mypy-extensions = ">=1.0.0" +tomli = {version = ">=1.1.0", markers = "python_version < \"3.11\""} +typed-ast = {version = ">=1.4.0,<2", markers = "python_version < \"3.8\""} +typing-extensions = ">=4.1.0" + +[package.extras] +dmypy = ["psutil (>=4.0)"] +install-types = ["pip"] +python2 = ["typed-ast (>=1.4.0,<2)"] +reports = ["lxml"] + +[[package]] +name = "mypy-extensions" +version = "1.0.0" +description = "Type system extensions for programs checked with the mypy type checker." +optional = false +python-versions = ">=3.5" +groups = ["dev"] +markers = "python_version < \"3.13\"" +files = [ + {file = "mypy_extensions-1.0.0-py3-none-any.whl", hash = "sha256:4392f6c0eb8a5668a69e23d168ffa70f0be9ccfd32b5cc2d26a34ae5b844552d"}, + {file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"}, +] + +[[package]] +name = "mypy-extensions" +version = "1.1.0" +description = "Type system extensions for programs checked with the mypy type checker." +optional = false +python-versions = ">=3.8" +groups = ["dev"] +markers = "python_version >= \"3.13\"" +files = [ + {file = "mypy_extensions-1.1.0-py3-none-any.whl", hash = "sha256:1be4cccdb0f2482337c4743e60421de3a356cd97508abadd57d47403e94f5505"}, + {file = "mypy_extensions-1.1.0.tar.gz", hash = "sha256:52e68efc3284861e772bbcd66823fde5ae21fd2fdb51c62a211403730b916558"}, +] + +[[package]] +name = "packaging" +version = "24.0" +description = "Core utilities for Python packages" +optional = false +python-versions = ">=3.7" +groups = ["dev"] +markers = "python_version < \"3.13\"" +files = [ + {file = "packaging-24.0-py3-none-any.whl", hash = "sha256:2ddfb553fdf02fb784c234c7ba6ccc288296ceabec964ad2eae3777778130bc5"}, + {file = "packaging-24.0.tar.gz", hash = "sha256:eb82c5e3e56209074766e6885bb04b8c38a0c015d0a30036ebe7ece34c9989e9"}, +] + +[[package]] +name = "packaging" +version = "25.0" +description = "Core utilities for Python packages" +optional = false +python-versions = ">=3.8" +groups = ["dev"] +markers = "python_version >= \"3.13\"" +files = [ + {file = "packaging-25.0-py3-none-any.whl", hash = "sha256:29572ef2b1f17581046b3a2227d5c611fb25ec70ca1ba8554b24b0e69331a484"}, + {file = "packaging-25.0.tar.gz", hash = "sha256:d443872c98d677bf60f6a1f2f8c1cb748e8fe762d2bf9d3148b5599295b0fc4f"}, +] + +[[package]] +name = "platformdirs" +version = "2.6.2" +description = "A small Python package for determining appropriate platform-specific dirs, e.g. a \"user data dir\"." 
+optional = false +python-versions = ">=3.7" +groups = ["dev"] +markers = "python_version < \"3.13\"" +files = [ + {file = "platformdirs-2.6.2-py3-none-any.whl", hash = "sha256:83c8f6d04389165de7c9b6f0c682439697887bca0aa2f1c87ef1826be3584490"}, + {file = "platformdirs-2.6.2.tar.gz", hash = "sha256:e1fea1fe471b9ff8332e229df3cb7de4f53eeea4998d3b6bfff542115e998bd2"}, +] + +[package.dependencies] +typing-extensions = {version = ">=4.4", markers = "python_version < \"3.8\""} + +[package.extras] +docs = ["furo (>=2022.12.7)", "proselint (>=0.13)", "sphinx (>=5.3)", "sphinx-autodoc-typehints (>=1.19.5)"] +test = ["appdirs (==1.4.4)", "covdefaults (>=2.2.2)", "pytest (>=7.2)", "pytest-cov (>=4)", "pytest-mock (>=3.10)"] + +[[package]] +name = "platformdirs" +version = "4.3.8" +description = "A small Python package for determining appropriate platform-specific dirs, e.g. a `user data dir`." +optional = false +python-versions = ">=3.9" +groups = ["dev"] +markers = "python_version >= \"3.13\"" +files = [ + {file = "platformdirs-4.3.8-py3-none-any.whl", hash = "sha256:ff7059bb7eb1179e2685604f4aaf157cfd9535242bd23742eadc3c13542139b4"}, + {file = "platformdirs-4.3.8.tar.gz", hash = "sha256:3d512d96e16bcb959a814c9f348431070822a6496326a4be0911c40b5a74c2bc"}, +] + +[package.extras] +docs = ["furo (>=2024.8.6)", "proselint (>=0.14)", "sphinx (>=8.1.3)", "sphinx-autodoc-typehints (>=3)"] +test = ["appdirs (==1.4.4)", "covdefaults (>=2.3)", "pytest (>=8.3.4)", "pytest-cov (>=6)", "pytest-mock (>=3.14)"] +type = ["mypy (>=1.14.1)"] + +[[package]] +name = "pluggy" +version = "1.2.0" +description = "plugin and hook calling mechanisms for python" +optional = false +python-versions = ">=3.7" +groups = ["dev"] +markers = "python_version < \"3.13\"" +files = [ + {file = "pluggy-1.2.0-py3-none-any.whl", hash = "sha256:c2fd55a7d7a3863cba1a013e4e2414658b1d07b6bc57b3919e0c63c9abb99849"}, + {file = "pluggy-1.2.0.tar.gz", hash = "sha256:d12f0c4b579b15f5e054301bb226ee85eeeba08ffec228092f8defbaa3a4c4b3"}, +] + +[package.dependencies] +importlib-metadata = {version = ">=0.12", markers = "python_version < \"3.8\""} + +[package.extras] +dev = ["pre-commit", "tox"] +testing = ["pytest", "pytest-benchmark"] + +[[package]] +name = "pluggy" +version = "1.6.0" +description = "plugin and hook calling mechanisms for python" +optional = false +python-versions = ">=3.9" +groups = ["dev"] +markers = "python_version >= \"3.13\"" +files = [ + {file = "pluggy-1.6.0-py3-none-any.whl", hash = "sha256:e920276dd6813095e9377c0bc5566d94c932c33b27a3e3945d8389c374dd4746"}, + {file = "pluggy-1.6.0.tar.gz", hash = "sha256:7dcc130b76258d33b90f61b658791dede3486c3e6bfb003ee5c9bfb396dd22f3"}, +] + +[package.extras] +dev = ["pre-commit", "tox"] +testing = ["coverage", "pytest", "pytest-benchmark"] + +[[package]] +name = "py" +version = "1.11.0" +description = "library with cross-python path, ini-parsing, io, code, log facilities" +optional = false +python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*" +groups = ["dev"] +markers = "python_version < \"3.13\"" +files = [ + {file = "py-1.11.0-py2.py3-none-any.whl", hash = "sha256:607c53218732647dff4acdfcd50cb62615cedf612e72d1724fb1a0cc6405b378"}, + {file = "py-1.11.0.tar.gz", hash = "sha256:51c75c4126074b472f746a24399ad32f6053d1b34b68d2fa41e558e6f4a98719"}, +] + +[[package]] +name = "pycodestyle" +version = "2.9.1" +description = "Python style guide checker" +optional = false +python-versions = ">=3.6" +groups = ["dev"] +markers = "python_version < \"3.13\"" +files = [ + {file = 
"pycodestyle-2.9.1-py2.py3-none-any.whl", hash = "sha256:d1735fc58b418fd7c5f658d28d943854f8a849b01a5d0a1e6f3f3fdd0166804b"}, + {file = "pycodestyle-2.9.1.tar.gz", hash = "sha256:2c9607871d58c76354b697b42f5d57e1ada7d261c261efac224b664affdc5785"}, +] + +[[package]] +name = "pycodestyle" +version = "2.14.0" +description = "Python style guide checker" +optional = false +python-versions = ">=3.9" +groups = ["dev"] +markers = "python_version >= \"3.13\"" +files = [ + {file = "pycodestyle-2.14.0-py2.py3-none-any.whl", hash = "sha256:dd6bf7cb4ee77f8e016f9c8e74a35ddd9f67e1d5fd4184d86c3b98e07099f42d"}, + {file = "pycodestyle-2.14.0.tar.gz", hash = "sha256:c4b5b517d278089ff9d0abdec919cd97262a3367449ea1c8b49b91529167b783"}, +] + +[[package]] +name = "pydantic" +version = "2.5.3" +description = "Data validation using Python type hints" +optional = false +python-versions = ">=3.7" +groups = ["main"] +markers = "python_version < \"3.13\"" +files = [ + {file = "pydantic-2.5.3-py3-none-any.whl", hash = "sha256:d0caf5954bee831b6bfe7e338c32b9e30c85dfe080c843680783ac2b631673b4"}, + {file = "pydantic-2.5.3.tar.gz", hash = "sha256:b3ef57c62535b0941697cce638c08900d87fcb67e29cfa99e8a68f747f393f7a"}, +] + +[package.dependencies] +annotated-types = ">=0.4.0" +importlib-metadata = {version = "*", markers = "python_version == \"3.7\""} +pydantic-core = "2.14.6" +typing-extensions = ">=4.6.1" + +[package.extras] +email = ["email-validator (>=2.0.0)"] + +[[package]] +name = "pydantic" +version = "2.11.7" +description = "Data validation using Python type hints" +optional = false +python-versions = ">=3.9" +groups = ["main"] +markers = "python_version >= \"3.13\"" +files = [ + {file = "pydantic-2.11.7-py3-none-any.whl", hash = "sha256:dde5df002701f6de26248661f6835bbe296a47bf73990135c7d07ce741b9623b"}, + {file = "pydantic-2.11.7.tar.gz", hash = "sha256:d989c3c6cb79469287b1569f7447a17848c998458d49ebe294e975b9baf0f0db"}, +] + +[package.dependencies] +annotated-types = ">=0.6.0" +pydantic-core = "2.33.2" +typing-extensions = ">=4.12.2" +typing-inspection = ">=0.4.0" + +[package.extras] +email = ["email-validator (>=2.0.0)"] +timezone = ["tzdata ; python_version >= \"3.9\" and platform_system == \"Windows\""] + +[[package]] +name = "pydantic-core" +version = "2.14.6" +description = "" +optional = false +python-versions = ">=3.7" +groups = ["main"] +markers = "python_version < \"3.13\"" +files = [ + {file = "pydantic_core-2.14.6-cp310-cp310-macosx_10_7_x86_64.whl", hash = "sha256:72f9a942d739f09cd42fffe5dc759928217649f070056f03c70df14f5770acf9"}, + {file = "pydantic_core-2.14.6-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:6a31d98c0d69776c2576dda4b77b8e0c69ad08e8b539c25c7d0ca0dc19a50d6c"}, + {file = "pydantic_core-2.14.6-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5aa90562bc079c6c290f0512b21768967f9968e4cfea84ea4ff5af5d917016e4"}, + {file = "pydantic_core-2.14.6-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:370ffecb5316ed23b667d99ce4debe53ea664b99cc37bfa2af47bc769056d534"}, + {file = "pydantic_core-2.14.6-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f85f3843bdb1fe80e8c206fe6eed7a1caeae897e496542cee499c374a85c6e08"}, + {file = "pydantic_core-2.14.6-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9862bf828112e19685b76ca499b379338fd4c5c269d897e218b2ae8fcb80139d"}, + {file = "pydantic_core-2.14.6-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:036137b5ad0cb0004c75b579445a1efccd072387a36c7f217bb8efd1afbe5245"}, + {file = "pydantic_core-2.14.6-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:92879bce89f91f4b2416eba4429c7b5ca22c45ef4a499c39f0c5c69257522c7c"}, + {file = "pydantic_core-2.14.6-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:0c08de15d50fa190d577e8591f0329a643eeaed696d7771760295998aca6bc66"}, + {file = "pydantic_core-2.14.6-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:36099c69f6b14fc2c49d7996cbf4f87ec4f0e66d1c74aa05228583225a07b590"}, + {file = "pydantic_core-2.14.6-cp310-none-win32.whl", hash = "sha256:7be719e4d2ae6c314f72844ba9d69e38dff342bc360379f7c8537c48e23034b7"}, + {file = "pydantic_core-2.14.6-cp310-none-win_amd64.whl", hash = "sha256:36fa402dcdc8ea7f1b0ddcf0df4254cc6b2e08f8cd80e7010d4c4ae6e86b2a87"}, + {file = "pydantic_core-2.14.6-cp311-cp311-macosx_10_7_x86_64.whl", hash = "sha256:dea7fcd62915fb150cdc373212141a30037e11b761fbced340e9db3379b892d4"}, + {file = "pydantic_core-2.14.6-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:ffff855100bc066ff2cd3aa4a60bc9534661816b110f0243e59503ec2df38421"}, + {file = "pydantic_core-2.14.6-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1b027c86c66b8627eb90e57aee1f526df77dc6d8b354ec498be9a757d513b92b"}, + {file = "pydantic_core-2.14.6-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:00b1087dabcee0b0ffd104f9f53d7d3eaddfaa314cdd6726143af6bc713aa27e"}, + {file = "pydantic_core-2.14.6-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:75ec284328b60a4e91010c1acade0c30584f28a1f345bc8f72fe8b9e46ec6a96"}, + {file = "pydantic_core-2.14.6-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7e1f4744eea1501404b20b0ac059ff7e3f96a97d3e3f48ce27a139e053bb370b"}, + {file = "pydantic_core-2.14.6-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b2602177668f89b38b9f84b7b3435d0a72511ddef45dc14446811759b82235a1"}, + {file = "pydantic_core-2.14.6-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:6c8edaea3089bf908dd27da8f5d9e395c5b4dc092dbcce9b65e7156099b4b937"}, + {file = "pydantic_core-2.14.6-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:478e9e7b360dfec451daafe286998d4a1eeaecf6d69c427b834ae771cad4b622"}, + {file = "pydantic_core-2.14.6-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:b6ca36c12a5120bad343eef193cc0122928c5c7466121da7c20f41160ba00ba2"}, + {file = "pydantic_core-2.14.6-cp311-none-win32.whl", hash = "sha256:2b8719037e570639e6b665a4050add43134d80b687288ba3ade18b22bbb29dd2"}, + {file = "pydantic_core-2.14.6-cp311-none-win_amd64.whl", hash = "sha256:78ee52ecc088c61cce32b2d30a826f929e1708f7b9247dc3b921aec367dc1b23"}, + {file = "pydantic_core-2.14.6-cp311-none-win_arm64.whl", hash = "sha256:a19b794f8fe6569472ff77602437ec4430f9b2b9ec7a1105cfd2232f9ba355e6"}, + {file = "pydantic_core-2.14.6-cp312-cp312-macosx_10_7_x86_64.whl", hash = "sha256:667aa2eac9cd0700af1ddb38b7b1ef246d8cf94c85637cbb03d7757ca4c3fdec"}, + {file = "pydantic_core-2.14.6-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:cdee837710ef6b56ebd20245b83799fce40b265b3b406e51e8ccc5b85b9099b7"}, + {file = "pydantic_core-2.14.6-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2c5bcf3414367e29f83fd66f7de64509a8fd2368b1edf4351e862910727d3e51"}, + {file = "pydantic_core-2.14.6-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = 
"sha256:26a92ae76f75d1915806b77cf459811e772d8f71fd1e4339c99750f0e7f6324f"}, + {file = "pydantic_core-2.14.6-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a983cca5ed1dd9a35e9e42ebf9f278d344603bfcb174ff99a5815f953925140a"}, + {file = "pydantic_core-2.14.6-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cb92f9061657287eded380d7dc455bbf115430b3aa4741bdc662d02977e7d0af"}, + {file = "pydantic_core-2.14.6-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e4ace1e220b078c8e48e82c081e35002038657e4b37d403ce940fa679e57113b"}, + {file = "pydantic_core-2.14.6-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:ef633add81832f4b56d3b4c9408b43d530dfca29e68fb1b797dcb861a2c734cd"}, + {file = "pydantic_core-2.14.6-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7e90d6cc4aad2cc1f5e16ed56e46cebf4877c62403a311af20459c15da76fd91"}, + {file = "pydantic_core-2.14.6-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:e8a5ac97ea521d7bde7621d86c30e86b798cdecd985723c4ed737a2aa9e77d0c"}, + {file = "pydantic_core-2.14.6-cp312-none-win32.whl", hash = "sha256:f27207e8ca3e5e021e2402ba942e5b4c629718e665c81b8b306f3c8b1ddbb786"}, + {file = "pydantic_core-2.14.6-cp312-none-win_amd64.whl", hash = "sha256:b3e5fe4538001bb82e2295b8d2a39356a84694c97cb73a566dc36328b9f83b40"}, + {file = "pydantic_core-2.14.6-cp312-none-win_arm64.whl", hash = "sha256:64634ccf9d671c6be242a664a33c4acf12882670b09b3f163cd00a24cffbd74e"}, + {file = "pydantic_core-2.14.6-cp37-cp37m-macosx_10_7_x86_64.whl", hash = "sha256:24368e31be2c88bd69340fbfe741b405302993242ccb476c5c3ff48aeee1afe0"}, + {file = "pydantic_core-2.14.6-cp37-cp37m-macosx_11_0_arm64.whl", hash = "sha256:e33b0834f1cf779aa839975f9d8755a7c2420510c0fa1e9fa0497de77cd35d2c"}, + {file = "pydantic_core-2.14.6-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6af4b3f52cc65f8a0bc8b1cd9676f8c21ef3e9132f21fed250f6958bd7223bed"}, + {file = "pydantic_core-2.14.6-cp37-cp37m-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d15687d7d7f40333bd8266f3814c591c2e2cd263fa2116e314f60d82086e353a"}, + {file = "pydantic_core-2.14.6-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:095b707bb287bfd534044166ab767bec70a9bba3175dcdc3371782175c14e43c"}, + {file = "pydantic_core-2.14.6-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:94fc0e6621e07d1e91c44e016cc0b189b48db053061cc22d6298a611de8071bb"}, + {file = "pydantic_core-2.14.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1ce830e480f6774608dedfd4a90c42aac4a7af0a711f1b52f807130c2e434c06"}, + {file = "pydantic_core-2.14.6-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:a306cdd2ad3a7d795d8e617a58c3a2ed0f76c8496fb7621b6cd514eb1532cae8"}, + {file = "pydantic_core-2.14.6-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:2f5fa187bde8524b1e37ba894db13aadd64faa884657473b03a019f625cee9a8"}, + {file = "pydantic_core-2.14.6-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:438027a975cc213a47c5d70672e0d29776082155cfae540c4e225716586be75e"}, + {file = "pydantic_core-2.14.6-cp37-none-win32.whl", hash = "sha256:f96ae96a060a8072ceff4cfde89d261837b4294a4f28b84a28765470d502ccc6"}, + {file = "pydantic_core-2.14.6-cp37-none-win_amd64.whl", hash = "sha256:e646c0e282e960345314f42f2cea5e0b5f56938c093541ea6dbf11aec2862391"}, + {file = "pydantic_core-2.14.6-cp38-cp38-macosx_10_7_x86_64.whl", hash = 
"sha256:db453f2da3f59a348f514cfbfeb042393b68720787bbef2b4c6068ea362c8149"}, + {file = "pydantic_core-2.14.6-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:3860c62057acd95cc84044e758e47b18dcd8871a328ebc8ccdefd18b0d26a21b"}, + {file = "pydantic_core-2.14.6-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:36026d8f99c58d7044413e1b819a67ca0e0b8ebe0f25e775e6c3d1fabb3c38fb"}, + {file = "pydantic_core-2.14.6-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8ed1af8692bd8d2a29d702f1a2e6065416d76897d726e45a1775b1444f5928a7"}, + {file = "pydantic_core-2.14.6-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:314ccc4264ce7d854941231cf71b592e30d8d368a71e50197c905874feacc8a8"}, + {file = "pydantic_core-2.14.6-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:982487f8931067a32e72d40ab6b47b1628a9c5d344be7f1a4e668fb462d2da42"}, + {file = "pydantic_core-2.14.6-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2dbe357bc4ddda078f79d2a36fc1dd0494a7f2fad83a0a684465b6f24b46fe80"}, + {file = "pydantic_core-2.14.6-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2f6ffc6701a0eb28648c845f4945a194dc7ab3c651f535b81793251e1185ac3d"}, + {file = "pydantic_core-2.14.6-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:7f5025db12fc6de7bc1104d826d5aee1d172f9ba6ca936bf6474c2148ac336c1"}, + {file = "pydantic_core-2.14.6-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:dab03ed811ed1c71d700ed08bde8431cf429bbe59e423394f0f4055f1ca0ea60"}, + {file = "pydantic_core-2.14.6-cp38-none-win32.whl", hash = "sha256:dfcbebdb3c4b6f739a91769aea5ed615023f3c88cb70df812849aef634c25fbe"}, + {file = "pydantic_core-2.14.6-cp38-none-win_amd64.whl", hash = "sha256:99b14dbea2fdb563d8b5a57c9badfcd72083f6006caf8e126b491519c7d64ca8"}, + {file = "pydantic_core-2.14.6-cp39-cp39-macosx_10_7_x86_64.whl", hash = "sha256:4ce8299b481bcb68e5c82002b96e411796b844d72b3e92a3fbedfe8e19813eab"}, + {file = "pydantic_core-2.14.6-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:b9a9d92f10772d2a181b5ca339dee066ab7d1c9a34ae2421b2a52556e719756f"}, + {file = "pydantic_core-2.14.6-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fd9e98b408384989ea4ab60206b8e100d8687da18b5c813c11e92fd8212a98e0"}, + {file = "pydantic_core-2.14.6-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:4f86f1f318e56f5cbb282fe61eb84767aee743ebe32c7c0834690ebea50c0a6b"}, + {file = "pydantic_core-2.14.6-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:86ce5fcfc3accf3a07a729779d0b86c5d0309a4764c897d86c11089be61da160"}, + {file = "pydantic_core-2.14.6-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3dcf1978be02153c6a31692d4fbcc2a3f1db9da36039ead23173bc256ee3b91b"}, + {file = "pydantic_core-2.14.6-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:eedf97be7bc3dbc8addcef4142f4b4164066df0c6f36397ae4aaed3eb187d8ab"}, + {file = "pydantic_core-2.14.6-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d5f916acf8afbcab6bacbb376ba7dc61f845367901ecd5e328fc4d4aef2fcab0"}, + {file = "pydantic_core-2.14.6-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:8a14c192c1d724c3acbfb3f10a958c55a2638391319ce8078cb36c02283959b9"}, + {file = "pydantic_core-2.14.6-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0348b1dc6b76041516e8a854ff95b21c55f5a411c3297d2ca52f5528e49d8411"}, + {file = "pydantic_core-2.14.6-cp39-none-win32.whl", hash = 
"sha256:de2a0645a923ba57c5527497daf8ec5df69c6eadf869e9cd46e86349146e5975"}, + {file = "pydantic_core-2.14.6-cp39-none-win_amd64.whl", hash = "sha256:aca48506a9c20f68ee61c87f2008f81f8ee99f8d7f0104bff3c47e2d148f89d9"}, + {file = "pydantic_core-2.14.6-pp310-pypy310_pp73-macosx_10_7_x86_64.whl", hash = "sha256:d5c28525c19f5bb1e09511669bb57353d22b94cf8b65f3a8d141c389a55dec95"}, + {file = "pydantic_core-2.14.6-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:78d0768ee59baa3de0f4adac9e3748b4b1fffc52143caebddfd5ea2961595277"}, + {file = "pydantic_core-2.14.6-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8b93785eadaef932e4fe9c6e12ba67beb1b3f1e5495631419c784ab87e975670"}, + {file = "pydantic_core-2.14.6-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a874f21f87c485310944b2b2734cd6d318765bcbb7515eead33af9641816506e"}, + {file = "pydantic_core-2.14.6-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b89f4477d915ea43b4ceea6756f63f0288941b6443a2b28c69004fe07fde0d0d"}, + {file = "pydantic_core-2.14.6-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:172de779e2a153d36ee690dbc49c6db568d7b33b18dc56b69a7514aecbcf380d"}, + {file = "pydantic_core-2.14.6-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:dfcebb950aa7e667ec226a442722134539e77c575f6cfaa423f24371bb8d2e94"}, + {file = "pydantic_core-2.14.6-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:55a23dcd98c858c0db44fc5c04fc7ed81c4b4d33c653a7c45ddaebf6563a2f66"}, + {file = "pydantic_core-2.14.6-pp37-pypy37_pp73-macosx_10_7_x86_64.whl", hash = "sha256:4241204e4b36ab5ae466ecec5c4c16527a054c69f99bba20f6f75232a6a534e2"}, + {file = "pydantic_core-2.14.6-pp37-pypy37_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e574de99d735b3fc8364cba9912c2bec2da78775eba95cbb225ef7dda6acea24"}, + {file = "pydantic_core-2.14.6-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1302a54f87b5cd8528e4d6d1bf2133b6aa7c6122ff8e9dc5220fbc1e07bffebd"}, + {file = "pydantic_core-2.14.6-pp37-pypy37_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f8e81e4b55930e5ffab4a68db1af431629cf2e4066dbdbfef65348b8ab804ea8"}, + {file = "pydantic_core-2.14.6-pp37-pypy37_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:c99462ffc538717b3e60151dfaf91125f637e801f5ab008f81c402f1dff0cd0f"}, + {file = "pydantic_core-2.14.6-pp37-pypy37_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:e4cf2d5829f6963a5483ec01578ee76d329eb5caf330ecd05b3edd697e7d768a"}, + {file = "pydantic_core-2.14.6-pp38-pypy38_pp73-macosx_10_7_x86_64.whl", hash = "sha256:cf10b7d58ae4a1f07fccbf4a0a956d705356fea05fb4c70608bb6fa81d103cda"}, + {file = "pydantic_core-2.14.6-pp38-pypy38_pp73-macosx_11_0_arm64.whl", hash = "sha256:399ac0891c284fa8eb998bcfa323f2234858f5d2efca3950ae58c8f88830f145"}, + {file = "pydantic_core-2.14.6-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9c6a5c79b28003543db3ba67d1df336f253a87d3112dac3a51b94f7d48e4c0e1"}, + {file = "pydantic_core-2.14.6-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:599c87d79cab2a6a2a9df4aefe0455e61e7d2aeede2f8577c1b7c0aec643ee8e"}, + {file = "pydantic_core-2.14.6-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:43e166ad47ba900f2542a80d83f9fc65fe99eb63ceec4debec160ae729824052"}, + {file = "pydantic_core-2.14.6-pp38-pypy38_pp73-musllinux_1_1_aarch64.whl", hash = 
"sha256:3a0b5db001b98e1c649dd55afa928e75aa4087e587b9524a4992316fa23c9fba"}, + {file = "pydantic_core-2.14.6-pp38-pypy38_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:747265448cb57a9f37572a488a57d873fd96bf51e5bb7edb52cfb37124516da4"}, + {file = "pydantic_core-2.14.6-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:7ebe3416785f65c28f4f9441e916bfc8a54179c8dea73c23023f7086fa601c5d"}, + {file = "pydantic_core-2.14.6-pp39-pypy39_pp73-macosx_10_7_x86_64.whl", hash = "sha256:86c963186ca5e50d5c8287b1d1c9d3f8f024cbe343d048c5bd282aec2d8641f2"}, + {file = "pydantic_core-2.14.6-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:e0641b506486f0b4cd1500a2a65740243e8670a2549bb02bc4556a83af84ae03"}, + {file = "pydantic_core-2.14.6-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:71d72ca5eaaa8d38c8df16b7deb1a2da4f650c41b58bb142f3fb75d5ad4a611f"}, + {file = "pydantic_core-2.14.6-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:27e524624eace5c59af499cd97dc18bb201dc6a7a2da24bfc66ef151c69a5f2a"}, + {file = "pydantic_core-2.14.6-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:a3dde6cac75e0b0902778978d3b1646ca9f438654395a362cb21d9ad34b24acf"}, + {file = "pydantic_core-2.14.6-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:00646784f6cd993b1e1c0e7b0fdcbccc375d539db95555477771c27555e3c556"}, + {file = "pydantic_core-2.14.6-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:23598acb8ccaa3d1d875ef3b35cb6376535095e9405d91a3d57a8c7db5d29341"}, + {file = "pydantic_core-2.14.6-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:7f41533d7e3cf9520065f610b41ac1c76bc2161415955fbcead4981b22c7611e"}, + {file = "pydantic_core-2.14.6.tar.gz", hash = "sha256:1fd0c1d395372843fba13a51c28e3bb9d59bd7aebfeb17358ffaaa1e4dbbe948"}, +] + +[package.dependencies] +typing-extensions = ">=4.6.0,<4.7.0 || >4.7.0" + +[[package]] +name = "pydantic-core" +version = "2.33.2" +description = "Core functionality for Pydantic validation and serialization" +optional = false +python-versions = ">=3.9" +groups = ["main"] +markers = "python_version >= \"3.13\"" +files = [ + {file = "pydantic_core-2.33.2-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:2b3d326aaef0c0399d9afffeb6367d5e26ddc24d351dbc9c636840ac355dc5d8"}, + {file = "pydantic_core-2.33.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:0e5b2671f05ba48b94cb90ce55d8bdcaaedb8ba00cc5359f6810fc918713983d"}, + {file = "pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0069c9acc3f3981b9ff4cdfaf088e98d83440a4c7ea1bc07460af3d4dc22e72d"}, + {file = "pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d53b22f2032c42eaaf025f7c40c2e3b94568ae077a606f006d206a463bc69572"}, + {file = "pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0405262705a123b7ce9f0b92f123334d67b70fd1f20a9372b907ce1080c7ba02"}, + {file = "pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4b25d91e288e2c4e0662b8038a28c6a07eaac3e196cfc4ff69de4ea3db992a1b"}, + {file = "pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6bdfe4b3789761f3bcb4b1ddf33355a71079858958e3a552f16d5af19768fef2"}, + {file = "pydantic_core-2.33.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:efec8db3266b76ef9607c2c4c419bdb06bf335ae433b80816089ea7585816f6a"}, + {file = 
"pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:031c57d67ca86902726e0fae2214ce6770bbe2f710dc33063187a68744a5ecac"}, + {file = "pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:f8de619080e944347f5f20de29a975c2d815d9ddd8be9b9b7268e2e3ef68605a"}, + {file = "pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:73662edf539e72a9440129f231ed3757faab89630d291b784ca99237fb94db2b"}, + {file = "pydantic_core-2.33.2-cp310-cp310-win32.whl", hash = "sha256:0a39979dcbb70998b0e505fb1556a1d550a0781463ce84ebf915ba293ccb7e22"}, + {file = "pydantic_core-2.33.2-cp310-cp310-win_amd64.whl", hash = "sha256:b0379a2b24882fef529ec3b4987cb5d003b9cda32256024e6fe1586ac45fc640"}, + {file = "pydantic_core-2.33.2-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:4c5b0a576fb381edd6d27f0a85915c6daf2f8138dc5c267a57c08a62900758c7"}, + {file = "pydantic_core-2.33.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e799c050df38a639db758c617ec771fd8fb7a5f8eaaa4b27b101f266b216a246"}, + {file = "pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dc46a01bf8d62f227d5ecee74178ffc448ff4e5197c756331f71efcc66dc980f"}, + {file = "pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:a144d4f717285c6d9234a66778059f33a89096dfb9b39117663fd8413d582dcc"}, + {file = "pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:73cf6373c21bc80b2e0dc88444f41ae60b2f070ed02095754eb5a01df12256de"}, + {file = "pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3dc625f4aa79713512d1976fe9f0bc99f706a9dee21dfd1810b4bbbf228d0e8a"}, + {file = "pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:881b21b5549499972441da4758d662aeea93f1923f953e9cbaff14b8b9565aef"}, + {file = "pydantic_core-2.33.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bdc25f3681f7b78572699569514036afe3c243bc3059d3942624e936ec93450e"}, + {file = "pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:fe5b32187cbc0c862ee201ad66c30cf218e5ed468ec8dc1cf49dec66e160cc4d"}, + {file = "pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:bc7aee6f634a6f4a95676fcb5d6559a2c2a390330098dba5e5a5f28a2e4ada30"}, + {file = "pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:235f45e5dbcccf6bd99f9f472858849f73d11120d76ea8707115415f8e5ebebf"}, + {file = "pydantic_core-2.33.2-cp311-cp311-win32.whl", hash = "sha256:6368900c2d3ef09b69cb0b913f9f8263b03786e5b2a387706c5afb66800efd51"}, + {file = "pydantic_core-2.33.2-cp311-cp311-win_amd64.whl", hash = "sha256:1e063337ef9e9820c77acc768546325ebe04ee38b08703244c1309cccc4f1bab"}, + {file = "pydantic_core-2.33.2-cp311-cp311-win_arm64.whl", hash = "sha256:6b99022f1d19bc32a4c2a0d544fc9a76e3be90f0b3f4af413f87d38749300e65"}, + {file = "pydantic_core-2.33.2-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:a7ec89dc587667f22b6a0b6579c249fca9026ce7c333fc142ba42411fa243cdc"}, + {file = "pydantic_core-2.33.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3c6db6e52c6d70aa0d00d45cdb9b40f0433b96380071ea80b09277dba021ddf7"}, + {file = "pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e61206137cbc65e6d5256e1166f88331d3b6238e082d9f74613b9b765fb9025"}, + {file = "pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", 
hash = "sha256:eb8c529b2819c37140eb51b914153063d27ed88e3bdc31b71198a198e921e011"}, + {file = "pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c52b02ad8b4e2cf14ca7b3d918f3eb0ee91e63b3167c32591e57c4317e134f8f"}, + {file = "pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:96081f1605125ba0855dfda83f6f3df5ec90c61195421ba72223de35ccfb2f88"}, + {file = "pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f57a69461af2a5fa6e6bbd7a5f60d3b7e6cebb687f55106933188e79ad155c1"}, + {file = "pydantic_core-2.33.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:572c7e6c8bb4774d2ac88929e3d1f12bc45714ae5ee6d9a788a9fb35e60bb04b"}, + {file = "pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:db4b41f9bd95fbe5acd76d89920336ba96f03e149097365afe1cb092fceb89a1"}, + {file = "pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:fa854f5cf7e33842a892e5c73f45327760bc7bc516339fda888c75ae60edaeb6"}, + {file = "pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:5f483cfb75ff703095c59e365360cb73e00185e01aaea067cd19acffd2ab20ea"}, + {file = "pydantic_core-2.33.2-cp312-cp312-win32.whl", hash = "sha256:9cb1da0f5a471435a7bc7e439b8a728e8b61e59784b2af70d7c169f8dd8ae290"}, + {file = "pydantic_core-2.33.2-cp312-cp312-win_amd64.whl", hash = "sha256:f941635f2a3d96b2973e867144fde513665c87f13fe0e193c158ac51bfaaa7b2"}, + {file = "pydantic_core-2.33.2-cp312-cp312-win_arm64.whl", hash = "sha256:cca3868ddfaccfbc4bfb1d608e2ccaaebe0ae628e1416aeb9c4d88c001bb45ab"}, + {file = "pydantic_core-2.33.2-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:1082dd3e2d7109ad8b7da48e1d4710c8d06c253cbc4a27c1cff4fbcaa97a9e3f"}, + {file = "pydantic_core-2.33.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f517ca031dfc037a9c07e748cefd8d96235088b83b4f4ba8939105d20fa1dcd6"}, + {file = "pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0a9f2c9dd19656823cb8250b0724ee9c60a82f3cdf68a080979d13092a3b0fef"}, + {file = "pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2b0a451c263b01acebe51895bfb0e1cc842a5c666efe06cdf13846c7418caa9a"}, + {file = "pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ea40a64d23faa25e62a70ad163571c0b342b8bf66d5fa612ac0dec4f069d916"}, + {file = "pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0fb2d542b4d66f9470e8065c5469ec676978d625a8b7a363f07d9a501a9cb36a"}, + {file = "pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fdac5d6ffa1b5a83bca06ffe7583f5576555e6c8b3a91fbd25ea7780f825f7d"}, + {file = "pydantic_core-2.33.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:04a1a413977ab517154eebb2d326da71638271477d6ad87a769102f7c2488c56"}, + {file = "pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:c8e7af2f4e0194c22b5b37205bfb293d166a7344a5b0d0eaccebc376546d77d5"}, + {file = "pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:5c92edd15cd58b3c2d34873597a1e20f13094f59cf88068adb18947df5455b4e"}, + {file = "pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:65132b7b4a1c0beded5e057324b7e16e10910c106d43675d9bd87d4f38dde162"}, + {file = "pydantic_core-2.33.2-cp313-cp313-win32.whl", hash = 
"sha256:52fb90784e0a242bb96ec53f42196a17278855b0f31ac7c3cc6f5c1ec4811849"}, + {file = "pydantic_core-2.33.2-cp313-cp313-win_amd64.whl", hash = "sha256:c083a3bdd5a93dfe480f1125926afcdbf2917ae714bdb80b36d34318b2bec5d9"}, + {file = "pydantic_core-2.33.2-cp313-cp313-win_arm64.whl", hash = "sha256:e80b087132752f6b3d714f041ccf74403799d3b23a72722ea2e6ba2e892555b9"}, + {file = "pydantic_core-2.33.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:61c18fba8e5e9db3ab908620af374db0ac1baa69f0f32df4f61ae23f15e586ac"}, + {file = "pydantic_core-2.33.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:95237e53bb015f67b63c91af7518a62a8660376a6a0db19b89acc77a4d6199f5"}, + {file = "pydantic_core-2.33.2-cp313-cp313t-win_amd64.whl", hash = "sha256:c2fc0a768ef76c15ab9238afa6da7f69895bb5d1ee83aeea2e3509af4472d0b9"}, + {file = "pydantic_core-2.33.2-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:a2b911a5b90e0374d03813674bf0a5fbbb7741570dcd4b4e85a2e48d17def29d"}, + {file = "pydantic_core-2.33.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:6fa6dfc3e4d1f734a34710f391ae822e0a8eb8559a85c6979e14e65ee6ba2954"}, + {file = "pydantic_core-2.33.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c54c939ee22dc8e2d545da79fc5381f1c020d6d3141d3bd747eab59164dc89fb"}, + {file = "pydantic_core-2.33.2-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:53a57d2ed685940a504248187d5685e49eb5eef0f696853647bf37c418c538f7"}, + {file = "pydantic_core-2.33.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:09fb9dd6571aacd023fe6aaca316bd01cf60ab27240d7eb39ebd66a3a15293b4"}, + {file = "pydantic_core-2.33.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0e6116757f7959a712db11f3e9c0a99ade00a5bbedae83cb801985aa154f071b"}, + {file = "pydantic_core-2.33.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8d55ab81c57b8ff8548c3e4947f119551253f4e3787a7bbc0b6b3ca47498a9d3"}, + {file = "pydantic_core-2.33.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:c20c462aa4434b33a2661701b861604913f912254e441ab8d78d30485736115a"}, + {file = "pydantic_core-2.33.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:44857c3227d3fb5e753d5fe4a3420d6376fa594b07b621e220cd93703fe21782"}, + {file = "pydantic_core-2.33.2-cp39-cp39-musllinux_1_1_armv7l.whl", hash = "sha256:eb9b459ca4df0e5c87deb59d37377461a538852765293f9e6ee834f0435a93b9"}, + {file = "pydantic_core-2.33.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:9fcd347d2cc5c23b06de6d3b7b8275be558a0c90549495c699e379a80bf8379e"}, + {file = "pydantic_core-2.33.2-cp39-cp39-win32.whl", hash = "sha256:83aa99b1285bc8f038941ddf598501a86f1536789740991d7d8756e34f1e74d9"}, + {file = "pydantic_core-2.33.2-cp39-cp39-win_amd64.whl", hash = "sha256:f481959862f57f29601ccced557cc2e817bce7533ab8e01a797a48b49c9692b3"}, + {file = "pydantic_core-2.33.2-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:5c4aa4e82353f65e548c476b37e64189783aa5384903bfea4f41580f255fddfa"}, + {file = "pydantic_core-2.33.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:d946c8bf0d5c24bf4fe333af284c59a19358aa3ec18cb3dc4370080da1e8ad29"}, + {file = "pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:87b31b6846e361ef83fedb187bb5b4372d0da3f7e28d85415efa92d6125d6e6d"}, + {file = "pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:aa9d91b338f2df0508606f7009fde642391425189bba6d8c653afd80fd6bb64e"}, + {file = "pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2058a32994f1fde4ca0480ab9d1e75a0e8c87c22b53a3ae66554f9af78f2fe8c"}, + {file = "pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:0e03262ab796d986f978f79c943fc5f620381be7287148b8010b4097f79a39ec"}, + {file = "pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:1a8695a8d00c73e50bff9dfda4d540b7dee29ff9b8053e38380426a85ef10052"}, + {file = "pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:fa754d1850735a0b0e03bcffd9d4b4343eb417e47196e4485d9cca326073a42c"}, + {file = "pydantic_core-2.33.2-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:a11c8d26a50bfab49002947d3d237abe4d9e4b5bdc8846a63537b6488e197808"}, + {file = "pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:dd14041875d09cc0f9308e37a6f8b65f5585cf2598a53aa0123df8b129d481f8"}, + {file = "pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:d87c561733f66531dced0da6e864f44ebf89a8fba55f31407b00c2f7f9449593"}, + {file = "pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2f82865531efd18d6e07a04a17331af02cb7a651583c418df8266f17a63c6612"}, + {file = "pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bfb5112df54209d820d7bf9317c7a6c9025ea52e49f46b6a2060104bba37de7"}, + {file = "pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:64632ff9d614e5eecfb495796ad51b0ed98c453e447a76bcbeeb69615079fc7e"}, + {file = "pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:f889f7a40498cc077332c7ab6b4608d296d852182211787d4f3ee377aaae66e8"}, + {file = "pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:de4b83bb311557e439b9e186f733f6c645b9417c84e2eb8203f3f820a4b988bf"}, + {file = "pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:82f68293f055f51b51ea42fafc74b6aad03e70e191799430b90c13d643059ebb"}, + {file = "pydantic_core-2.33.2-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:329467cecfb529c925cf2bbd4d60d2c509bc2fb52a20c1045bf09bb70971a9c1"}, + {file = "pydantic_core-2.33.2-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:87acbfcf8e90ca885206e98359d7dca4bcbb35abdc0ff66672a293e1d7a19101"}, + {file = "pydantic_core-2.33.2-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:7f92c15cd1e97d4b12acd1cc9004fa092578acfa57b67ad5e43a197175d01a64"}, + {file = "pydantic_core-2.33.2-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d3f26877a748dc4251cfcfda9dfb5f13fcb034f5308388066bcfe9031b63ae7d"}, + {file = "pydantic_core-2.33.2-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dac89aea9af8cd672fa7b510e7b8c33b0bba9a43186680550ccf23020f32d535"}, + {file = "pydantic_core-2.33.2-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:970919794d126ba8645f3837ab6046fb4e72bbc057b3709144066204c19a455d"}, + {file = "pydantic_core-2.33.2-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:3eb3fe62804e8f859c49ed20a8451342de53ed764150cb14ca71357c765dc2a6"}, + {file = "pydantic_core-2.33.2-pp39-pypy39_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:3abcd9392a36025e3bd55f9bd38d908bd17962cc49bc6da8e7e96285336e2bca"}, + 
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:3a1c81334778f9e3af2f8aeb7a960736e5cab1dfebfb26aabca09afd2906c039"}, + {file = "pydantic_core-2.33.2-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:2807668ba86cb38c6817ad9bc66215ab8584d1d304030ce4f0887336f28a5e27"}, + {file = "pydantic_core-2.33.2.tar.gz", hash = "sha256:7cb8bc3605c29176e1b105350d2e6474142d7c1bd1d9327c4a9bdb46bf827acc"}, +] + +[package.dependencies] +typing-extensions = ">=4.6.0,<4.7.0 || >4.7.0" + +[[package]] +name = "pyflakes" +version = "2.5.0" +description = "passive checker of Python programs" +optional = false +python-versions = ">=3.6" +groups = ["dev"] +markers = "python_version < \"3.13\"" +files = [ + {file = "pyflakes-2.5.0-py2.py3-none-any.whl", hash = "sha256:4579f67d887f804e67edb544428f264b7b24f435b263c4614f384135cea553d2"}, + {file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"}, +] + +[[package]] +name = "pyflakes" +version = "3.4.0" +description = "passive checker of Python programs" +optional = false +python-versions = ">=3.9" +groups = ["dev"] +markers = "python_version >= \"3.13\"" +files = [ + {file = "pyflakes-3.4.0-py2.py3-none-any.whl", hash = "sha256:f742a7dbd0d9cb9ea41e9a24a918996e8170c799fa528688d40dd582c8265f4f"}, + {file = "pyflakes-3.4.0.tar.gz", hash = "sha256:b24f96fafb7d2ab0ec5075b7350b3d2d2218eab42003821c06344973d3ea2f58"}, +] + +[[package]] +name = "pygments" +version = "2.19.2" +description = "Pygments is a syntax highlighting package written in Python." +optional = false +python-versions = ">=3.8" +groups = ["dev"] +markers = "python_version >= \"3.13\"" +files = [ + {file = "pygments-2.19.2-py3-none-any.whl", hash = "sha256:86540386c03d588bb81d44bc3928634ff26449851e99741617ecb9037ee5ec0b"}, + {file = "pygments-2.19.2.tar.gz", hash = "sha256:636cb2477cec7f8952536970bc533bc43743542f70392ae026374600add5b887"}, +] + +[package.extras] +windows-terminal = ["colorama (>=0.4.6)"] + +[[package]] +name = "pyproject-api" +version = "1.9.1" +description = "API to interact with the python pyproject.toml based projects" +optional = false +python-versions = ">=3.9" +groups = ["dev"] +markers = "python_version >= \"3.13\"" +files = [ + {file = "pyproject_api-1.9.1-py3-none-any.whl", hash = "sha256:7d6238d92f8962773dd75b5f0c4a6a27cce092a14b623b811dba656f3b628948"}, + {file = "pyproject_api-1.9.1.tar.gz", hash = "sha256:43c9918f49daab37e302038fc1aed54a8c7a91a9fa935d00b9a485f37e0f5335"}, +] + +[package.dependencies] +packaging = ">=25" + +[package.extras] +docs = ["furo (>=2024.8.6)", "sphinx-autodoc-typehints (>=3.2)"] +testing = ["covdefaults (>=2.3)", "pytest (>=8.3.5)", "pytest-cov (>=6.1.1)", "pytest-mock (>=3.14)", "setuptools (>=80.3.1)"] + +[[package]] +name = "pytest" +version = "7.4.4" +description = "pytest: simple powerful testing with Python" +optional = false +python-versions = ">=3.7" +groups = ["dev"] +markers = "python_version < \"3.13\"" +files = [ + {file = "pytest-7.4.4-py3-none-any.whl", hash = "sha256:b090cdf5ed60bf4c45261be03239c2c1c22df034fbffe691abe93cd80cea01d8"}, + {file = "pytest-7.4.4.tar.gz", hash = "sha256:2cf0005922c6ace4a3e2ec8b4080eb0d9753fdc93107415332f50ce9e7994280"}, +] + +[package.dependencies] +colorama = {version = "*", markers = "sys_platform == \"win32\""} +exceptiongroup = {version = ">=1.0.0rc8", markers = "python_version < \"3.11\""} +importlib-metadata = {version = ">=0.12", markers = "python_version < \"3.8\""} +iniconfig = "*" +packaging = "*" +pluggy = 
">=0.12,<2.0" +tomli = {version = ">=1.0.0", markers = "python_version < \"3.11\""} + +[package.extras] +testing = ["argcomplete", "attrs (>=19.2.0)", "hypothesis (>=3.56)", "mock", "nose", "pygments (>=2.7.2)", "requests", "setuptools", "xmlschema"] + +[[package]] +name = "pytest" +version = "8.4.1" +description = "pytest: simple powerful testing with Python" +optional = false +python-versions = ">=3.9" +groups = ["dev"] +markers = "python_version >= \"3.13\"" +files = [ + {file = "pytest-8.4.1-py3-none-any.whl", hash = "sha256:539c70ba6fcead8e78eebbf1115e8b589e7565830d7d006a8723f19ac8a0afb7"}, + {file = "pytest-8.4.1.tar.gz", hash = "sha256:7c67fd69174877359ed9371ec3af8a3d2b04741818c51e5e99cc1742251fa93c"}, +] + +[package.dependencies] +colorama = {version = ">=0.4", markers = "sys_platform == \"win32\""} +iniconfig = ">=1" +packaging = ">=20" +pluggy = ">=1.5,<2" +pygments = ">=2.7.2" + +[package.extras] +dev = ["argcomplete", "attrs (>=19.2)", "hypothesis (>=3.56)", "mock", "requests", "setuptools", "xmlschema"] + +[[package]] +name = "python-dateutil" +version = "2.9.0.post0" +description = "Extensions to the standard Python datetime module" +optional = false +python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7" +groups = ["main"] +files = [ + {file = "python-dateutil-2.9.0.post0.tar.gz", hash = "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3"}, + {file = "python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427"}, +] + +[package.dependencies] +six = ">=1.5" + +[[package]] +name = "six" +version = "1.17.0" +description = "Python 2 and 3 compatibility utilities" +optional = false +python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7" +groups = ["main", "dev"] +files = [ + {file = "six-1.17.0-py2.py3-none-any.whl", hash = "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274"}, + {file = "six-1.17.0.tar.gz", hash = "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81"}, +] +markers = {dev = "python_version < \"3.13\""} + +[[package]] +name = "tomli" +version = "2.0.1" +description = "A lil' TOML parser" +optional = false +python-versions = ">=3.7" +groups = ["dev"] +markers = "python_version < \"3.11\"" +files = [ + {file = "tomli-2.0.1-py3-none-any.whl", hash = "sha256:939de3e7a6161af0c887ef91b7d41a53e7c5a1ca976325f429cb46ea9bc30ecc"}, + {file = "tomli-2.0.1.tar.gz", hash = "sha256:de526c12914f0c550d15924c62d72abc48d6fe7364aa87328337a31007fe8a4f"}, +] + +[[package]] +name = "tox" +version = "3.28.0" +description = "tox is a generic virtualenv management and test command line tool" +optional = false +python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,>=2.7" +groups = ["dev"] +markers = "python_version < \"3.13\"" +files = [ + {file = "tox-3.28.0-py2.py3-none-any.whl", hash = "sha256:57b5ab7e8bb3074edc3c0c0b4b192a4f3799d3723b2c5b76f1fa9f2d40316eea"}, + {file = "tox-3.28.0.tar.gz", hash = "sha256:d0d28f3fe6d6d7195c27f8b054c3e99d5451952b54abdae673b71609a581f640"}, +] + +[package.dependencies] +colorama = {version = ">=0.4.1", markers = "platform_system == \"Windows\""} +filelock = ">=3.0.0" +importlib-metadata = {version = ">=0.12", markers = "python_version < \"3.8\""} +packaging = ">=14" +pluggy = ">=0.12.0" +py = ">=1.4.17" +six = ">=1.14.0" +tomli = {version = ">=2.0.1", markers = "python_version >= \"3.7\" and python_version < \"3.11\""} +virtualenv = ">=16.0.0,<20.0.0 || >20.0.0,<20.0.1 || >20.0.1,<20.0.2 || >20.0.2,<20.0.3 || >20.0.3,<20.0.4 || 
>20.0.4,<20.0.5 || >20.0.5,<20.0.6 || >20.0.6,<20.0.7 || >20.0.7" + +[package.extras] +docs = ["pygments-github-lexers (>=0.0.5)", "sphinx (>=2.0.0)", "sphinxcontrib-autoprogram (>=0.1.5)", "towncrier (>=18.5.0)"] +testing = ["flaky (>=3.4.0)", "freezegun (>=0.3.11)", "pathlib2 (>=2.3.3) ; python_version < \"3.4\"", "psutil (>=5.6.1) ; platform_python_implementation == \"cpython\"", "pytest (>=4.0.0)", "pytest-cov (>=2.5.1)", "pytest-mock (>=1.10.0)", "pytest-randomly (>=1.0.0)"] + +[[package]] +name = "tox" +version = "4.28.4" +description = "tox is a generic virtualenv management and test command line tool" +optional = false +python-versions = ">=3.9" +groups = ["dev"] +markers = "python_version >= \"3.13\"" +files = [ + {file = "tox-4.28.4-py3-none-any.whl", hash = "sha256:8d4ad9ee916ebbb59272bb045e154a10fa12e3bbdcf94cc5185cbdaf9b241f99"}, + {file = "tox-4.28.4.tar.gz", hash = "sha256:b5b14c6307bd8994ff1eba5074275826620325ee1a4f61316959d562bfd70b9d"}, +] + +[package.dependencies] +cachetools = ">=6.1" +chardet = ">=5.2" +colorama = ">=0.4.6" +filelock = ">=3.18" +packaging = ">=25" +platformdirs = ">=4.3.8" +pluggy = ">=1.6" +pyproject-api = ">=1.9.1" +virtualenv = ">=20.31.2" + +[[package]] +name = "typed-ast" +version = "1.5.5" +description = "a fork of Python 2 and 3 ast modules with type comment support" +optional = false +python-versions = ">=3.6" +groups = ["dev"] +markers = "python_version == \"3.7\"" +files = [ + {file = "typed_ast-1.5.5-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:4bc1efe0ce3ffb74784e06460f01a223ac1f6ab31c6bc0376a21184bf5aabe3b"}, + {file = "typed_ast-1.5.5-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:5f7a8c46a8b333f71abd61d7ab9255440d4a588f34a21f126bbfc95f6049e686"}, + {file = "typed_ast-1.5.5-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:597fc66b4162f959ee6a96b978c0435bd63791e31e4f410622d19f1686d5e769"}, + {file = "typed_ast-1.5.5-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d41b7a686ce653e06c2609075d397ebd5b969d821b9797d029fccd71fdec8e04"}, + {file = "typed_ast-1.5.5-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:5fe83a9a44c4ce67c796a1b466c270c1272e176603d5e06f6afbc101a572859d"}, + {file = "typed_ast-1.5.5-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:d5c0c112a74c0e5db2c75882a0adf3133adedcdbfd8cf7c9d6ed77365ab90a1d"}, + {file = "typed_ast-1.5.5-cp310-cp310-win_amd64.whl", hash = "sha256:e1a976ed4cc2d71bb073e1b2a250892a6e968ff02aa14c1f40eba4f365ffec02"}, + {file = "typed_ast-1.5.5-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:c631da9710271cb67b08bd3f3813b7af7f4c69c319b75475436fcab8c3d21bee"}, + {file = "typed_ast-1.5.5-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:b445c2abfecab89a932b20bd8261488d574591173d07827c1eda32c457358b18"}, + {file = "typed_ast-1.5.5-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cc95ffaaab2be3b25eb938779e43f513e0e538a84dd14a5d844b8f2932593d88"}, + {file = "typed_ast-1.5.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:61443214d9b4c660dcf4b5307f15c12cb30bdfe9588ce6158f4a005baeb167b2"}, + {file = "typed_ast-1.5.5-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:6eb936d107e4d474940469e8ec5b380c9b329b5f08b78282d46baeebd3692dc9"}, + {file = "typed_ast-1.5.5-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e48bf27022897577d8479eaed64701ecaf0467182448bd95759883300ca818c8"}, + {file = "typed_ast-1.5.5-cp311-cp311-win_amd64.whl", hash = 
"sha256:83509f9324011c9a39faaef0922c6f720f9623afe3fe220b6d0b15638247206b"}, + {file = "typed_ast-1.5.5-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:44f214394fc1af23ca6d4e9e744804d890045d1643dd7e8229951e0ef39429b5"}, + {file = "typed_ast-1.5.5-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:118c1ce46ce58fda78503eae14b7664163aa735b620b64b5b725453696f2a35c"}, + {file = "typed_ast-1.5.5-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:be4919b808efa61101456e87f2d4c75b228f4e52618621c77f1ddcaae15904fa"}, + {file = "typed_ast-1.5.5-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:fc2b8c4e1bc5cd96c1a823a885e6b158f8451cf6f5530e1829390b4d27d0807f"}, + {file = "typed_ast-1.5.5-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:16f7313e0a08c7de57f2998c85e2a69a642e97cb32f87eb65fbfe88381a5e44d"}, + {file = "typed_ast-1.5.5-cp36-cp36m-win_amd64.whl", hash = "sha256:2b946ef8c04f77230489f75b4b5a4a6f24c078be4aed241cfabe9cbf4156e7e5"}, + {file = "typed_ast-1.5.5-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:2188bc33d85951ea4ddad55d2b35598b2709d122c11c75cffd529fbc9965508e"}, + {file = "typed_ast-1.5.5-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0635900d16ae133cab3b26c607586131269f88266954eb04ec31535c9a12ef1e"}, + {file = "typed_ast-1.5.5-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:57bfc3cf35a0f2fdf0a88a3044aafaec1d2f24d8ae8cd87c4f58d615fb5b6311"}, + {file = "typed_ast-1.5.5-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:fe58ef6a764de7b4b36edfc8592641f56e69b7163bba9f9c8089838ee596bfb2"}, + {file = "typed_ast-1.5.5-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:d09d930c2d1d621f717bb217bf1fe2584616febb5138d9b3e8cdd26506c3f6d4"}, + {file = "typed_ast-1.5.5-cp37-cp37m-win_amd64.whl", hash = "sha256:d40c10326893ecab8a80a53039164a224984339b2c32a6baf55ecbd5b1df6431"}, + {file = "typed_ast-1.5.5-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:fd946abf3c31fb50eee07451a6aedbfff912fcd13cf357363f5b4e834cc5e71a"}, + {file = "typed_ast-1.5.5-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:ed4a1a42df8a3dfb6b40c3d2de109e935949f2f66b19703eafade03173f8f437"}, + {file = "typed_ast-1.5.5-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:045f9930a1550d9352464e5149710d56a2aed23a2ffe78946478f7b5416f1ede"}, + {file = "typed_ast-1.5.5-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:381eed9c95484ceef5ced626355fdc0765ab51d8553fec08661dce654a935db4"}, + {file = "typed_ast-1.5.5-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:bfd39a41c0ef6f31684daff53befddae608f9daf6957140228a08e51f312d7e6"}, + {file = "typed_ast-1.5.5-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:8c524eb3024edcc04e288db9541fe1f438f82d281e591c548903d5b77ad1ddd4"}, + {file = "typed_ast-1.5.5-cp38-cp38-win_amd64.whl", hash = "sha256:7f58fabdde8dcbe764cef5e1a7fcb440f2463c1bbbec1cf2a86ca7bc1f95184b"}, + {file = "typed_ast-1.5.5-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:042eb665ff6bf020dd2243307d11ed626306b82812aba21836096d229fdc6a10"}, + {file = "typed_ast-1.5.5-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:622e4a006472b05cf6ef7f9f2636edc51bda670b7bbffa18d26b255269d3d814"}, + {file = "typed_ast-1.5.5-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1efebbbf4604ad1283e963e8915daa240cb4bf5067053cf2f0baadc4d4fb51b8"}, + {file = "typed_ast-1.5.5-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:f0aefdd66f1784c58f65b502b6cf8b121544680456d1cebbd300c2c813899274"}, + {file = "typed_ast-1.5.5-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:48074261a842acf825af1968cd912f6f21357316080ebaca5f19abbb11690c8a"}, + {file = "typed_ast-1.5.5-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:429ae404f69dc94b9361bb62291885894b7c6fb4640d561179548c849f8492ba"}, + {file = "typed_ast-1.5.5-cp39-cp39-win_amd64.whl", hash = "sha256:335f22ccb244da2b5c296e6f96b06ee9bed46526db0de38d2f0e5a6597b81155"}, + {file = "typed_ast-1.5.5.tar.gz", hash = "sha256:94282f7a354f36ef5dbce0ef3467ebf6a258e370ab33d5b40c249fa996e590dd"}, +] + +[[package]] +name = "types-python-dateutil" +version = "2.8.19.14" +description = "Typing stubs for python-dateutil" +optional = false +python-versions = "*" +groups = ["dev"] +markers = "python_version < \"3.13\"" +files = [ + {file = "types-python-dateutil-2.8.19.14.tar.gz", hash = "sha256:1f4f10ac98bb8b16ade9dbee3518d9ace017821d94b057a425b069f834737f4b"}, + {file = "types_python_dateutil-2.8.19.14-py3-none-any.whl", hash = "sha256:f977b8de27787639986b4e28963263fd0e5158942b3ecef91b9335c130cb1ce9"}, +] + +[[package]] +name = "types-python-dateutil" +version = "2.9.0.20250809" +description = "Typing stubs for python-dateutil" +optional = false +python-versions = ">=3.9" +groups = ["dev"] +markers = "python_version >= \"3.13\"" +files = [ + {file = "types_python_dateutil-2.9.0.20250809-py3-none-any.whl", hash = "sha256:768890cac4f2d7fd9e0feb6f3217fce2abbfdfc0cadd38d11fba325a815e4b9f"}, + {file = "types_python_dateutil-2.9.0.20250809.tar.gz", hash = "sha256:69cbf8d15ef7a75c3801d65d63466e46ac25a0baa678d89d0a137fc31a608cc1"}, +] + +[[package]] +name = "typing-extensions" +version = "4.7.1" +description = "Backported and Experimental Type Hints for Python 3.7+" +optional = false +python-versions = ">=3.7" +groups = ["main", "dev"] +markers = "python_version < \"3.13\"" +files = [ + {file = "typing_extensions-4.7.1-py3-none-any.whl", hash = "sha256:440d5dd3af93b060174bf433bccd69b0babc3b15b1a8dca43789fd7f61514b36"}, + {file = "typing_extensions-4.7.1.tar.gz", hash = "sha256:b75ddc264f0ba5615db7ba217daeb99701ad295353c45f9e95963337ceeeffb2"}, +] + +[[package]] +name = "typing-extensions" +version = "4.14.1" +description = "Backported and Experimental Type Hints for Python 3.9+" +optional = false +python-versions = ">=3.9" +groups = ["main", "dev"] +markers = "python_version >= \"3.13\"" +files = [ + {file = "typing_extensions-4.14.1-py3-none-any.whl", hash = "sha256:d1e1e3b58374dc93031d6eda2420a48ea44a36c2b4766a4fdeb3710755731d76"}, + {file = "typing_extensions-4.14.1.tar.gz", hash = "sha256:38b39f4aeeab64884ce9f74c94263ef78f3c22467c8724005483154c26648d36"}, +] + +[[package]] +name = "typing-inspection" +version = "0.4.1" +description = "Runtime typing introspection tools" +optional = false +python-versions = ">=3.9" +groups = ["main"] +markers = "python_version >= \"3.13\"" +files = [ + {file = "typing_inspection-0.4.1-py3-none-any.whl", hash = "sha256:389055682238f53b04f7badcb49b989835495a96700ced5dab2d8feae4b26f51"}, + {file = "typing_inspection-0.4.1.tar.gz", hash = "sha256:6ae134cc0203c33377d43188d4064e9b357dba58cff3185f22924610e70a9d28"}, +] + +[package.dependencies] +typing-extensions = ">=4.12.0" + +[[package]] +name = "urllib3" +version = "2.0.7" +description = "HTTP library with thread-safe connection pooling, file post, and more." 
+optional = false +python-versions = ">=3.7" +groups = ["main"] +markers = "python_version < \"3.13\"" +files = [ + {file = "urllib3-2.0.7-py3-none-any.whl", hash = "sha256:fdb6d215c776278489906c2f8916e6e7d4f5a9b602ccbcfdf7f016fc8da0596e"}, + {file = "urllib3-2.0.7.tar.gz", hash = "sha256:c97dfde1f7bd43a71c8d2a58e369e9b2bf692d1334ea9f9cae55add7d0dd0f84"}, +] + +[package.extras] +brotli = ["brotli (>=1.0.9) ; platform_python_implementation == \"CPython\"", "brotlicffi (>=0.8.0) ; platform_python_implementation != \"CPython\""] +secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"] +socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"] +zstd = ["zstandard (>=0.18.0)"] + +[[package]] +name = "urllib3" +version = "2.5.0" +description = "HTTP library with thread-safe connection pooling, file post, and more." +optional = false +python-versions = ">=3.9" +groups = ["main"] +markers = "python_version >= \"3.13\"" +files = [ + {file = "urllib3-2.5.0-py3-none-any.whl", hash = "sha256:e6b01673c0fa6a13e374b50871808eb3bf7046c4b125b216f6bf1cc604cff0dc"}, + {file = "urllib3-2.5.0.tar.gz", hash = "sha256:3fc47733c7e419d4bc3f6b3dc2b4f890bb743906a30d56ba4a5bfa4bbff92760"}, +] + +[package.extras] +brotli = ["brotli (>=1.0.9) ; platform_python_implementation == \"CPython\"", "brotlicffi (>=0.8.0) ; platform_python_implementation != \"CPython\""] +h2 = ["h2 (>=4,<5)"] +socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"] +zstd = ["zstandard (>=0.18.0)"] + +[[package]] +name = "virtualenv" +version = "20.16.2" +description = "Virtual Python Environment builder" +optional = false +python-versions = ">=3.6" +groups = ["dev"] +markers = "python_version < \"3.13\"" +files = [ + {file = "virtualenv-20.16.2-py2.py3-none-any.whl", hash = "sha256:635b272a8e2f77cb051946f46c60a54ace3cb5e25568228bd6b57fc70eca9ff3"}, + {file = "virtualenv-20.16.2.tar.gz", hash = "sha256:0ef5be6d07181946891f5abc8047fda8bc2f0b4b9bf222c64e6e8963baee76db"}, +] + +[package.dependencies] +distlib = ">=0.3.1,<1" +filelock = ">=3.2,<4" +importlib-metadata = {version = ">=0.12", markers = "python_version < \"3.8\""} +platformdirs = ">=2,<3" + +[package.extras] +docs = ["proselint (>=0.10.2)", "sphinx (>=3)", "sphinx-argparse (>=0.2.5)", "sphinx-rtd-theme (>=0.4.3)", "towncrier (>=21.3)"] +testing = ["coverage (>=4)", "coverage-enable-subprocess (>=1)", "flaky (>=3)", "packaging (>=20.0)", "pytest (>=4)", "pytest-env (>=0.6.2)", "pytest-freezegun (>=0.4.1)", "pytest-mock (>=2)", "pytest-randomly (>=1)", "pytest-timeout (>=1)"] + +[[package]] +name = "virtualenv" +version = "20.34.0" +description = "Virtual Python Environment builder" +optional = false +python-versions = ">=3.8" +groups = ["dev"] +markers = "python_version >= \"3.13\"" +files = [ + {file = "virtualenv-20.34.0-py3-none-any.whl", hash = "sha256:341f5afa7eee943e4984a9207c025feedd768baff6753cd660c857ceb3e36026"}, + {file = "virtualenv-20.34.0.tar.gz", hash = "sha256:44815b2c9dee7ed86e387b842a84f20b93f7f417f95886ca1996a72a4138eb1a"}, +] + +[package.dependencies] +distlib = ">=0.3.7,<1" +filelock = ">=3.12.2,<4" +platformdirs = ">=3.9.1,<5" + +[package.extras] +docs = ["furo (>=2023.7.26)", "proselint (>=0.13)", "sphinx (>=7.1.2,!=7.3)", "sphinx-argparse (>=0.4)", "sphinxcontrib-towncrier (>=0.2.1a0)", "towncrier (>=23.6)"] +test = ["covdefaults (>=2.3)", "coverage (>=7.2.7)", "coverage-enable-subprocess (>=1)", "flaky (>=3.7)", "packaging (>=23.1)", "pytest (>=7.4)", "pytest-env (>=0.8.2)", "pytest-freezer (>=0.4.8) ; platform_python_implementation == 
\"PyPy\" or platform_python_implementation == \"GraalVM\" or platform_python_implementation == \"CPython\" and sys_platform == \"win32\" and python_version >= \"3.13\"", "pytest-mock (>=3.11.1)", "pytest-randomly (>=3.12)", "pytest-timeout (>=2.1)", "setuptools (>=68)", "time-machine (>=2.10) ; platform_python_implementation == \"CPython\""] + +[[package]] +name = "zipp" +version = "3.15.0" +description = "Backport of pathlib-compatible object wrapper for zip files" +optional = false +python-versions = ">=3.7" +groups = ["main", "dev"] +markers = "python_version == \"3.7\"" +files = [ + {file = "zipp-3.15.0-py3-none-any.whl", hash = "sha256:48904fc76a60e542af151aded95726c1a5c34ed43ab4134b597665c86d7ad556"}, + {file = "zipp-3.15.0.tar.gz", hash = "sha256:112929ad649da941c23de50f356a2b5570c954b65150642bccdd66bf194d224b"}, +] + +[package.extras] +docs = ["furo", "jaraco.packaging (>=9)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-lint"] +testing = ["big-O", "flake8 (<5)", "jaraco.functools", "jaraco.itertools", "more-itertools", "pytest (>=6)", "pytest-black (>=0.3.7) ; platform_python_implementation != \"PyPy\"", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=1.3)", "pytest-flake8 ; python_version < \"3.12\"", "pytest-mypy (>=0.9.1) ; platform_python_implementation != \"PyPy\""] + +[metadata] +lock-version = "2.1" +python-versions = "^3.7" +content-hash = "131cde68a6e4a2aa7dffb9d7dc622b0b20282f3813e9d097d058c26174c83868" diff --git a/pyproject.toml b/pyproject.toml index 0c4f9f6..8d8ea19 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -16,6 +16,9 @@ urllib3 = ">= 1.25.3" python-dateutil = ">=2.8.2" pydantic = ">=2" typing-extensions = ">=4.7.1" +pyshark = ">=0.6" +pyyaml = ">=6.0.2" +pandas = ">=2.3.1" [tool.poetry.dev-dependencies] pytest = ">=7.2.1" diff --git a/samples/README.md b/samples/README.md new file mode 100644 index 0000000..b8ba503 --- /dev/null +++ b/samples/README.md @@ -0,0 +1,42 @@ +# AppMix Builder Script + +## Overview + +The appmix_builder.py script creates an application mix from a CSV file containing application names and their corresponding percentages or weights. + +## How it Works +The script reads the application names from the CSV file and searches for corresponding applications in the CyPerf library. If a match is found, the application is added to the CyPerf app mix using the percentage or weight for the corresponding application. + +The script provides a percentage coverage based on the matches. For example, if 10 applications were provided in the CSV and 5 were found in the CyPerf library, the percentage coverage would be (5/10) * 100 = 50%. + +## Features ++ Creates application mix from CSV file ++ Searches for applications in CyPerf library ++ Provides percentage coverage based on matches ++ Option to use capture-to-application converter to increase coverage percentage ++ Option to configure test or abort +## Capture-to-Application Converter + +The user can upload a set of captures (corresponding to applications that were not present in the CyPerf library) to a pre-configured location (folder) and the converter function will create applications and add them to the CyPerf library. This may help to increase the coverage percentage. +## Application Naming Convention +Applications created by the script are named CCA-. All applications can be found in the Resource Library section of the CyPerf controller. +## Steps to Run the Script +### Step 1: Configure Test Parameters + +1. 
Edit the test_parameters.yml file under the "cyperf-api-wrapper/samples" folder.
+
+2. Set the variable named "location_of_folder_containing_captures" to the absolute path of the folder where all captures should be stored.
+
+### Step 2: Run the Script
+
+1. Navigate to the folder: cyperf-api-wrapper.
+
+2. Run the script "appmix_builder.py" using the following command from the CLI:
+   ```bash
+   python3.x samples/appmix_builder.py --controller <controller-ip> --license-server <license-server-ip> --user <username> --password <password>
+   ```
+   Example:
+   ```bash
+   python3.13 samples/appmix_builder.py --controller 10.39.68.180 --license-server 10.39.68.180 --license-user admin --license-password CyPerf\&Keysight#1
+   ```
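+## Example Inputs
+
+The snippets below are illustrative sketches only: the file names and values are hypothetical, but the column and key names are the ones appmix_builder.py actually reads.
+
+A minimal input CSV (use a weight column instead of percentage when running in weight mode; fractional weights such as 2.5, 1.25, 5 are normalised internally to the integer ratio 2, 1, 4):
+```csv
+application,percentage
+facebook,50
+youtube,30
+dns,20
+```
+
+A minimal test_parameters.yml with the keys the script expects:
+```yaml
+location_of_folder_containing_captures: /home/user/captures   # hypothetical path
+name_of_existing_cyperf_configuration: PANW-APPMIX
+csv_path: samples/app_mix.csv
+dictionary_path: samples/app_dictionary.csv
+exact_match: False
+threshold_coverage_percentage: 0
+percentage: "True"
+```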
diff --git a/samples/appmix_builder.py b/samples/appmix_builder.py
new file mode 100644
index 0000000..c20f56d
--- /dev/null
+++ b/samples/appmix_builder.py
@@ -0,0 +1,745 @@
+import cyperf
+import os
+import re
+import utils
+import urllib3; urllib3.disable_warnings()
+import argparse
+import yaml
+import numpy as np  # required by replace_zeros_with_ones below
+from pprint import pprint
+import asyncio
+import csv
+import pandas as pd
+from collections import Counter
+import math
+
+file_path = 'samples/test_parameters.yml'
+
+def replace_zeros_with_ones(lst):
+    arr = np.array(lst)
+    arr[arr == 0] = 1
+    return arr.tolist()
+
+def merge_duplicate_entries(csv_file_path):
+    # Read the CSV file
+    df = pd.read_csv(csv_file_path)
+
+    # Group by the first column and sum the values
+    merged_df = df.groupby(df.columns[0])[df.columns[1]].sum().reset_index()
+
+    return merged_df
+
+def convert_to_integers(numbers):
+    # Calculate the greatest common divisor (GCD) of two numbers
+    def gcd(a, b):
+        while b:
+            a, b = b, a % b
+        return a
+
+    # Find the least common multiple (LCM) of two numbers
+    def lcm(a, b):
+        return a * b // gcd(a, b)
+
+    # Convert floats to fractions
+    fractions = []
+    for num in numbers:
+        if isinstance(num, int):
+            fractions.append((num, 1))
+        else:
+            denominator = 10 ** len(str(num).split('.')[1])
+            numerator = int(num * denominator)
+            gcd_val = gcd(numerator, denominator)
+            fractions.append((numerator // gcd_val, denominator // gcd_val))
+
+    # Find the LCM of the denominators
+    lcm_denominator = 1
+    for _, denominator in fractions:
+        lcm_denominator = lcm(lcm_denominator, denominator)
+
+    # Convert fractions to integers while maintaining ratios
+    integers = []
+    for numerator, denominator in fractions:
+        integers.append(numerator * lcm_denominator // denominator)
+
+    # Find the GCD of the integers
+    gcd_integers = integers[0]
+    for num in integers[1:]:
+        gcd_integers = gcd(gcd_integers, num)
+
+    # Divide all integers by the GCD to get the smallest possible integers
+    integers = [num // gcd_integers for num in integers]
+
+    # Limit the integer values to 10000 (CyPerf limits weights to at most 10K)
+    max_value = max(integers)
+    if max_value > 10000:
+        ratio = max_value / 10000
+        integers = [int(num / ratio) for num in integers]
+
+    # Eliminate zero weights, which are not configurable in CyPerf: convert all zeros to ones.
+    non_zero_integers = [1 if x == 0 else x for x in integers]
+
+    return non_zero_integers
+
+
+def get_file_names(folder_path):
+    file_names = []
+    for filename in os.listdir(folder_path):
+        if os.path.isfile(os.path.join(folder_path, filename)):
+            file_name_without_extension = os.path.splitext(filename)[0]
+            file_names.append(file_name_without_extension)
+    return file_names
+
+def convert_to_next_highest(dictionary):
+    return {key: math.ceil(value) for key, value in dictionary.items()}
+
+
+def remove_values(input_string, separator='-', values_to_remove=['ms', 'base', 'ds', 'as']):
+    words = input_string.split(separator)
+    filtered_words = [word for word in words if word not in values_to_remove]
+    return ' '.join(filtered_words)
+
+def find_key_by_value(dictionary, value):
+    for key, val in dictionary.items():
+        if val == value:
+            return key
+    return None
+
+def extract_first_word(input_string):
+    return input_string.split(' ')[0]
+
+def check_string_position(input_string, target_string):
+    words = target_string.split()
+    if input_string == words[0]:
+        return "Start"
+    elif input_string == words[-1]:
+        return "Last"
+    elif input_string in words[1:-1]:
+        return "Middle"
+    else:
+        return "Not found"
+
+def display_menu():
+    print("Please select one of the following options [1/2/3]")
+    print("[1] Manually upload the captures and use the capture converter to improve the coverage percentage")
+    print("[2] Try to automatically find captures from BPS and use the capture converter to improve the coverage percentage")
+    print("[3] Do not continue with test execution")
+
+def get_user_choice():
+    while True:
+        try:
+            choice = int(input("Please select an option (1, 2, or 3): "))
+            if 1 <= choice <= 3:
+                return choice
+            else:
+                print("Invalid option. Please choose 1, 2, or 3.")
+        except ValueError:
+            print("Invalid input. Please enter a number.")
+
+def process_choice(choice):
+    if choice == 1:
+        print("You chose Option 1")
+    elif choice == 2:
+        print("You chose Option 2")
+    elif choice == 3:
+        print("You chose Option 3")
+
+def find_matching_string(search_str, string_list):
+    matching_strings = [string for string in string_list if search_str in string]
+    return matching_strings
+
+
+def remove_suffix(string, suffix="-base"):
+    if string.endswith(suffix):
+        return string[:-len(suffix)]
+    return string
+
+def find_indices(my_list, target_string, exact_match=False):
+    if exact_match:
+        return [i for i, s in enumerate(my_list) if s == target_string]
+    else:
+        return [i for i, s in enumerate(my_list) if target_string in s]
+
+def common_items(dictionary):
+    # Check if the dictionary is empty
+    if not dictionary:
+        return []
+
+    # Find the intersection of all lists
+    common = set(dictionary[list(dictionary.keys())[0]])
+    for key in dictionary:
+        common = common.intersection(set(dictionary[key]))
+    return list(common)
+
+def select_best_matching_app(dump_app_dict, cyperf_apps, first_word):
+    list_common = common_items(dump_app_dict)
+
+    if not len(list_common):
+        # No common item across the name variants; fall back to matching on the
+        # position of the first word within the CyPerf application names.
+        tmp_list = []
+        for ele in cyperf_apps:
+            posi = check_string_position(first_word, ele)
+            if posi == "Not found":
+                continue
+            tmp_list.append(ele)
+        return sorted(tmp_list, key=len, reverse=False)
+    else:
+        # A common item was found in the intersection; prefer the shortest names.
+        return sorted(list_common, key=len, reverse=False)
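+# Illustrative behaviour of convert_app_names_to_common_names (defined below), traced by hand
+# against the implementation; the input names are hypothetical:
+#   {'ms-office-base': []}  ->  {'ms-office-base': ['office']}   # 'ms' and 'base' stripped
+#   {'dns': []}             ->  {'dns': ['dns']}                 # no hyphen, so kept as-is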
+#This function needs refinement. It is crucial for matching the apps in our library.
+def convert_app_names_to_common_names(application_dict):
+    for item in application_dict:
+        app_name = item
+        # If the user-specified application name does not contain a hyphen ('-'),
+        # include it as-is in the list.
+        if app_name.find("-") == -1:
+            application_dict[item].append(app_name)
+        # If the user-specified application name contains a suffix (-base), a prefix (ms-),
+        # etc., remove them. This helps the search.
+        else:
+            temp_list = re.split(r"[- ]", app_name)
+            # The strings_to_remove list needs to be updated on a regular basis.
+            strings_to_remove = ["base", "ms", "as", "ds"]
+            result = [s for s in temp_list if s not in strings_to_remove]
+            application_dict[item].extend(result)
+    return application_dict
+
+
+#Read a YAML file and return a dictionary.
+def read_yaml_file(file_path):
+    try:
+        with open(file_path, 'r') as file:
+            yaml_data = yaml.safe_load(file)
+            return yaml_data
+    except FileNotFoundError:
+        print(f"File not found: {file_path}")
+        return {}
+    except yaml.YAMLError as e:
+        print(f"Error parsing YAML file: {e}")
+        return {}
+
+#Read the test parameters from the YAML file and populate them into variables.
+yaml_dict = read_yaml_file(file_path)
+
+print(yaml_dict)
+
+#Populate the variables with user inputs.
+capture_folder_path = yaml_dict["location_of_folder_containing_captures"]
+name_of_existing_cyperf_configuration = yaml_dict["name_of_existing_cyperf_configuration"]
+csv_path = yaml_dict["csv_path"]
+exact_match = yaml_dict["exact_match"]
+threshold_coverage_percentage = int(yaml_dict["threshold_coverage_percentage"])
+percentage = yaml_dict["percentage"]
+dictionary_path = yaml_dict["dictionary_path"]
+
+#Load the configuration, or check whether the configuration is already loaded and has an active
+#session id. Only the loading part is implemented. In case the configuration is already loaded
+#and has an active session id, we need to delete the application profile and create a new one
+#based on app_mix.csv (this is TBD).
+class AppMixBuilderTest(object):
+    def __init__(self, capture_folder_path, name_of_existing_cyperf_configuration, csv_path, agent_map={}, re_entry=0):
+        args, offline_token = utils.parse_cli_options()
+        self.utils = utils.Utils(args.controller,
+                                 username=args.user,
+                                 password=args.password,
+                                 refresh_token=offline_token,
+                                 license_server=args.license_server,
+                                 license_user=args.license_user,
+                                 license_password=args.license_password)
+
+        self.capture_folder_path = capture_folder_path
+        self.name_of_existing_cyperf_configuration = name_of_existing_cyperf_configuration
+        self.csv_path = csv_path
+        self.agent_map = agent_map
+        self.local_stats = {}
+        self.re_entry = re_entry
+
+    def __del__(self):
+        self._release()
+
+    def __enter__(self):
+        return self
+
+    def __exit__(self, exception_type, exception_value, exception_traceback):
+        self._release()
+        if exception_value:
+            raise (exception_value)
+
+    def _release(self):
+        try:
+            if self.session:
+                print('Deleting session')
+                self.utils.delete_session(self.session)
+                print('Deleted session')
+                self.session = None
+        except AttributeError:
+            pass
+
+    # Read the input CSV file, which contains two columns: (a) application name, (b) percentage.
+    def read_csv_file(self):
+        data_dict = {}
+        with open(self.csv_path, 'r') as file:
+            reader = csv.reader(file)
+            for row in reader:
+                if len(row) == 2:
+                    key, value = row
+                    data_dict[key] = value
+                else:
+                    print(f"Skipping row: {row}. Expected 2 columns, but got {len(row)}.")
+        return data_dict
+
+    def csv_to_dict(self, filename):
+        data_dict = {}
+        with open(filename, 'r') as file:
+            reader = csv.reader(file)
+            next(reader)  # Skip the header row
+            for row in reader:
+                if len(row) == 2:  # Ensure the row has exactly two columns
+                    data_dict[row[0]] = row[1]
+        return data_dict
+
+    def sum_percentage_and_populate_applications(self):
+        try:
+            df = pd.read_csv(self.csv_path)
+            total_percentage = df['percentage'].sum()
+            applications = df['application'].tolist()
+            percentages = df['percentage'].tolist()
+            return total_percentage, applications, percentages
+        except pd.errors.EmptyDataError:
+            print("The CSV file is empty.")
+            return None, None, None
+        except KeyError as e:
+            print(f"The CSV file is missing the column: {e}")
+            return None, None, None
+
+    def populate_applications_and_weights(self):
+        try:
+            df = pd.read_csv(self.csv_path)
+            applications = df['application'].tolist()
+            weights_raw = df['weight'].tolist()
+            # Process the weights so that they are converted to integers in the proper ratio.
+            weights = convert_to_integers(weights_raw)
+            return applications, weights
+        except pd.errors.EmptyDataError:
+            print("The CSV file is empty.")
+            return None, None
+        except KeyError as e:
+            print(f"The CSV file is missing the column: {e}")
+            return None, None
+
+    def percentages_to_weights(self, percentages):
+        total_percentage = sum(percentages)
+        weights = [round(p / total_percentage * 100) for p in percentages]
+        # Ensure no weight is zero
+        min_weight = min(weights)
+        if min_weight == 0:
+            zero_indices = [i for i, w in enumerate(weights) if w == 0]
+            for i in zero_indices:
+                weights[i] = 1
+            # Adjust weights to ensure they sum up to 100
+            diff = sum(weights) - 100
+            weights[weights.index(max(weights))] -= diff
+        else:
+            # Adjust weights to ensure they sum up to 100
+            diff = 100 - sum(weights)
+            weights[weights.index(max(weights))] += diff
+        return weights
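+    # Illustrative behaviour of percentages_to_weights above, traced by hand against the
+    # implementation (the input is hypothetical; note that Python's round() rounds halves to even):
+    #   percentages_to_weights([60, 39.5, 0.5])
+    #     rounds to [60, 40, 0]; the zero weight is bumped to 1 and the overshoot of 1 is
+    #     taken from the largest weight, giving [59, 40, 1], which sums to 100.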
+    # Configure the applications with their corresponding weights in CyPerf.
+    def configure_appmix_test(self):
+        # Check whether the input CSV (app names and % weights provided by the user) contains any
+        # duplicate entries in the application column. If it does, coalesce them into a single
+        # entry by adding the weights.
+        merged_df = merge_duplicate_entries(csv_path)
+        print(merged_df)
+
+        # Save the merged DataFrame to the CSV file
+        merged_df.to_csv(csv_path, index=False)
+
+        if percentage == "True":
+            total_percentage, applications, percentages = self.sum_percentage_and_populate_applications()
+
+            if total_percentage < 99 or total_percentage > 101:
+                print("You must fix the percentages so that they add up to 100")
+                return False
+            else:
+                # If the user-input percentages add up close to 100, convert them into non-zero weights.
+                weights = self.percentages_to_weights(percentages)
+        else:
+            applications, weights = self.populate_applications_and_weights()
+
+        # Create a dictionary to store the user-input applications and their corresponding weights.
+        input_app_dict_tmp = dict(zip(applications, weights))
+
+        print(input_app_dict_tmp)
+        # Convert the weights to the nearest integer ceiling value.
+        input_app_dict = convert_to_next_highest(input_app_dict_tmp)
+
+        # Search for a configuration by the name provided in the test_parameters.yml file (example: PANW-APPMIX).
+        if self.utils.search_configuration_file(name_of_existing_cyperf_configuration):
+            # Load the configuration and create a test session.
+            try:
+                session_appmix = self.utils.create_session_by_config_name(name_of_existing_cyperf_configuration)
+                print("The base configuration template was loaded successfully!")
+            except Exception:
+                print("The base configuration template was not loaded successfully!")
+                return False
+
+            # Create the CyPerf app dictionary.
+            app_dictionary = self.csv_to_dict(dictionary_path)
+            target_app_mix_dict = {}
+            not_found_apps = []
+            matching_matrix_dict = {}
+            found = 0
+            for item in input_app_dict:
+                cyperf_app_name = app_dictionary.get(item)
+                if cyperf_app_name:
+                    found = found + 1
+                    # Check whether it is already present in the dictionary; in that case the weights must be adjusted.
+                    if cyperf_app_name in target_app_mix_dict:
+                        adjusted_weight = target_app_mix_dict[cyperf_app_name] + input_app_dict[item]
+                        input_app_dict[item] = adjusted_weight
+
+                    target_app_mix_dict.update({cyperf_app_name: input_app_dict[item]})
+                    matching_matrix_dict.update({item: cyperf_app_name})
+                else:
+                    not_found_apps.append(item)
+
+            # Report for users
+            print(" = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = ")
+            print(f"Total applications provided in the CSV file = {len(input_app_dict.keys())}")
+            print(f"Total applications found in the CyPerf library = {found}")
+            print(f"Total non-matching applications = {len(not_found_apps)}")
+            pprint(not_found_apps)
+            print("Matching matrix")
+            print(matching_matrix_dict)
+            print(" = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = \n")
+            # Coverage percentage: the ratio of applications found in the CyPerf library to the total
+            # number of applications provided by the user. A higher coverage ratio indicates that most
+            # of the applications were found in the CyPerf library.
+            coverage_percent = (len(target_app_mix_dict.keys()) / len(input_app_dict.keys())) * 100
+            print(f"Coverage percentage = {coverage_percent}")
+            print(" = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = \n")
+
+            session_deleted = False
+            if threshold_coverage_percentage == 0:
+                # Menu-driven option for the user.
+                decision = input("Do you wish to continue with the existing coverage percentage [Y/N]?: ")
+                if decision.lower() == 'y':
+                    # Start the configuration of the test and subsequently run the test.
+                    self.utils.add_apps_with_weights(session_appmix, target_app_mix_dict)
+                    self.utils.set_objective_and_timeline(session_appmix,
+                                                          objective_type=cyperf.ObjectiveType.SIMULATED_USERS,
+                                                          objective_unit=cyperf.ObjectiveUnit.EMPTY,
+                                                          objective_value=100,
+                                                          test_duration=30)
+                    print("Configuration complete!")
+
+                elif decision.lower() == 'n' and self.re_entry == 0:
+                    while True:
+                        display_menu()
+                        choice = get_user_choice()
+                        process_choice(choice)
+                        cont = input("Do you want to continue? (y/n): ")
+                        if cont.lower() != 'y':
+                            break
+
+                        if choice == 1:
+                            # This part requires automation. TBD.
+                            print("Delete any existing capture files in the capture folder and ensure that the name of each capture file matches the name of the app you provided in the CSV (application, weight).")
+                            user_input = input("Upload the capture files of the missing applications manually to the capture folder and type \"continue\": ")
+                            if user_input.lower() == "continue":
+                                # Start updating the master dictionary after reading the capture folder:
+                                # read the capture folder and gather all the new capture names in a list.
+                                filenames = get_file_names(capture_folder_path)
+                                # Update the master CSV, which contains the dictionary mapping between
+                                # user-input and CyPerf app names. Create a DataFrame from the list.
+                                df = pd.DataFrame({'app-id': filenames, 'cyperf-appname': ['CCA-' + filename for filename in filenames]})
+
+                                # Check if the CSV file exists
+                                try:
+                                    existing_df = pd.read_csv(dictionary_path)
+                                    combined_df = pd.concat([existing_df, df])
+                                    combined_df.to_csv(dictionary_path, index=False)
+                                except FileNotFoundError:
+                                    df.to_csv(dictionary_path, index=False)
+
+                                # Start creating the custom applications.
+                                asyncio.run(capture_convertor_and_custom_app_builder())
+                                print("Exiting the present test. Rerunning the test to check the coverage improvement.")
+                                self.utils.delete_session(session_appmix)
+                                start_appmixbuilder_test(re_entry=1)
+                                break
+                            else:
+                                print("Invalid input!")
+                        elif choice == 2:
+                            user_input = input("Trying to fetch captures from a known repo and creating custom applications")
+                            # Start creating the custom applications.
+                            print("This is under implementation! ... exiting the test!")
+                            if session_appmix.id:
+                                self.utils.delete_session(session_appmix)
+                                session_deleted = True
+                            break
+                        elif choice == 3:
+                            print("Stopping the execution!")
+                            if not session_deleted:
+                                self.utils.delete_session(session_appmix)
+                            break
+                elif decision.lower() == 'n' and self.re_entry == 1:
+                    print("Stopping the execution!")
+                    if not session_deleted:
+                        self.utils.delete_session(session_appmix)
+
+                else:
+                    print("Invalid option selected!")
+
+            # If the threshold percentage is non-zero, the decision to configure and run the test is
+            # driven by the threshold percentage and the coverage percentage.
+            else:
+                if threshold_coverage_percentage <= coverage_percent:
+                    # Add the application mix along with the weights to CyPerf.
+                    self.utils.add_apps_with_weights(session_appmix, target_app_mix_dict)
+                    print("Configuration complete!")
+                    print("Starting test")
+                else:
+                    # The coverage is too low to proceed; stop and clean up.
+                    print("APPLICATION COVERAGE INADEQUATE FOR PROCEEDING WITH TEST CONFIGURATION!!")
+                    self.utils.delete_session(session_appmix)
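+    # The application-dictionary CSV at dictionary_path is expected to contain a header row
+    # followed by user-name -> CyPerf-name pairs; an illustrative (hypothetical) example:
+    #   app-id,cyperf-appname
+    #   facebook,Facebook
+    #   my-custom-app,CCA-my-custom-app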
+        self.utils.set_objective_and_timeline(self.session,
+                                              objective_type=cyperf.ObjectiveType.SIMULATED_USERS,
+                                              objective_value=1000,
+                                              test_duration=30)
+
+class CaptureReplayTest(object):
+    def __init__(self, capture_folder_path, agent_map={}):
+        args, offline_token = utils.parse_cli_options()
+        self.utils = utils.Utils(args.controller,
+                                 username=args.user,
+                                 password=args.password,
+                                 refresh_token=offline_token,
+                                 license_server=args.license_server,
+                                 license_user=args.license_user,
+                                 license_password=args.license_password)
+
+        self.capture_folder_path = capture_folder_path
+        self.agent_map = agent_map
+        self.test_duration = 60
+        self.local_stats = {}
+
+    def __del__(self):
+        self._release()
+
+    def __enter__(self):
+        return self
+
+    def __exit__(self, exception_type, exception_value, exception_traceback):
+        self._release()
+        if exception_value:
+            raise exception_value
+
+    def _release(self):
+        try:
+            if self.session:
+                print('Deleting session')
+                self.utils.delete_session(self.session)
+                print('Deleted session')
+                self.session = None
+        except AttributeError:
+            pass
+
+    def _set_objective_and_timeline(self):
+        # Change the objective type to 'Simulated Users'. 'Throughput' is not yet supported for UDP Stream.
+        self.utils.set_objective_and_timeline(self.session,
+                                              objective_type=cyperf.ObjectiveType.SIMULATED_USERS,
+                                              objective_value=1000,
+                                              test_duration=self.test_duration)
+
+    def get_capture_file_paths(self):
+        try:
+            # Check if the folder exists
+            if not os.path.exists(self.capture_folder_path):
+                print(f"Error: Folder '{self.capture_folder_path}' does not exist.")
+                return []
+
+            # Get a list of all files in the folder
+            file_paths = [os.path.join(self.capture_folder_path, file) for file in os.listdir(self.capture_folder_path) if os.path.isfile(os.path.join(self.capture_folder_path, file))]
+            return file_paths
+
+        except Exception as e:
+            print(f"An error occurred: {str(e)}")
+            return []
+
+    async def configure(self):
+        print('Configuring ...')
+        # read the pcap files
+        list_of_paths_of_pcap_files = self.get_capture_file_paths()
+        list_of_paths_of_pcap_files.reverse()
+        print(list_of_paths_of_pcap_files)
+        # upload all the captures from the specified folder (in the yml file)
+        # to the CyPerf Resource Library for captures
+        while list_of_paths_of_pcap_files:
+            capture_file = list_of_paths_of_pcap_files.pop()
+            print("uploading capture - {} ".format(capture_file))
+            await self.utils.upload_the_capture_file(capture_file)
+        # create applications from the uploaded captures
+        apps_created = self.utils.create_apps_from_captures()
+        print('Conversion from pcaps to applications complete! You may now use the apps created from the pcaps.\nThe custom apps are available under the Resource Library in the CyPerf Controller')
+
+    def _start(self):
+        print('Starting test ...')
+        self.utils.start_test(self.session)
+        print('Started test ...')
+
+    def _process_stats(self, stats):
+        processed_stats = self.local_stats
+        for stat in stats:
+            if stat.snapshots:
+                processed_stats[stat.name] = {}
+                for snapshot in stat.snapshots:
+                    time_stamp = snapshot.timestamp
+                    processed_stats[stat.name][time_stamp] = []
+                    d = {}
+                    for idx, stat_name in enumerate(stat.columns):
+                        d[stat_name] = [val[idx].actual_instance for val in snapshot.values]
+                    processed_stats[stat.name][time_stamp] = d
+        return processed_stats
+
+    def _print_run_time_stats(self, test, time_from, time_to):
+        stat_names = ['client-streaming-rate', 'server-streaming-rate']
+        return self.print_run_time_stats(test, time_from, time_to, stat_names)
+
+    def print_run_time_stats(self, test, time_from, time_to, stat_names):
+        last_monitored_time_stamp = None
+        for stat_name in stat_names:
+            stats = self.utils.collect_stats(test,
+                                             stat_name,
+                                             time_from,
+                                             time_to,
+                                             self._process_stats)
+            if stat_name not in stats:
+                continue
+
+            stats = stats[stat_name]
+            last_time_stamp = max(stats)
+
+            if stat_name in self.last_recorded_time_stamps:
+                last_recorded_time_stamp = self.last_recorded_time_stamps[stat_name]
+            else:
+                last_recorded_time_stamp = 0
+
+            if last_time_stamp != last_recorded_time_stamp:
+                last_stats = stats[last_time_stamp]
+
+                print(f'\n{stat_name} at {self.utils.format_milliseconds(last_time_stamp)}\n')
+                lines = self.utils.format_stats_dict_as_table(last_stats)
+                for line in lines:
+                    print(line)
+
+                self.last_recorded_time_stamps[stat_name] = last_time_stamp
+
+            if last_monitored_time_stamp:
+                last_monitored_time_stamp = min(max(last_time_stamp, time_from),
+                                                last_monitored_time_stamp)
+            else:
+                last_monitored_time_stamp = max(last_time_stamp, time_from)
+
+        return last_monitored_time_stamp
+
+    def _wait_until_stopped(self):
+        self.last_recorded_time_stamps = {}
+        self.utils.wait_for_test_stop(self.session, self._print_run_time_stats)
+        print('Stopped test ...')
+
+    def run(self):
+        self._start()
+        self._wait_until_stopped()
+
+    def collect_final_stats(self):
+        print('Collecting final statistics ...')
+        stat_names = ['client-streaming-statistics', 'server-streaming-statistics']
+        session_api = cyperf.SessionsApi(self.utils.api_client)
+        test = session_api.get_test(session_id=self.session.id)
+        self.print_run_time_stats(test, 0, -1, stat_names)
+        print('Collected final statistics ...')
+
+
+def start_appmixbuilder_test(re_entry=0):
+
+    agents = {
+        'IP Network 1': ['10.39.69.98'],
+        'IP Network 2': ['10.39.69.99']
+    }
+    with AppMixBuilderTest(capture_folder_path, name_of_existing_cyperf_configuration, csv_path, agents, re_entry) as test1:
+        test1.configure_appmix_test()
+        #test1._set_objective_and_timeline()
+        #test.configure()
+        #test.run()
+        #test.collect_final_stats()
+
+
+async def capture_convertor_and_custom_app_builder():
+    agents = {
+        'IP Network 1': ['10.39.68.192'],
+        'IP Network 2': ['10.39.69.229']
+    }
+
+    with CaptureReplayTest(capture_folder_path, agents) as test:
+        await test.configure()
+        #test.run()
+        #test.collect_final_stats()
+
+if __name__ == '__main__':
+    #asyncio.run(capture_convertor_and_custom_app_builder())
+    start_appmixbuilder_test()
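Both samples gate test configuration on the same arithmetic: user percentages are scaled into non-zero integer weights that sum to 100, and the run proceeds only if enough of the requested apps were found in the CyPerf library. Here is a minimal, self-contained sketch of that math (plain Python; `percentages_to_weights` is a compact reimplementation of the sample's method of the same name, and the app names and figures are made up for illustration):

```python
# Illustrative sketch of the samples' weight/coverage arithmetic (not part of the package).

def percentages_to_weights(percentages):
    """Scale percentages to integer weights summing to 100, with no zeros."""
    total = sum(percentages)
    weights = [round(p / total * 100) for p in percentages]
    # CyPerf cannot configure a zero weight, so bump any zero up to 1 ...
    weights = [1 if w == 0 else w for w in weights]
    # ... then push the rounding error onto the largest weight so the sum stays 100.
    weights[weights.index(max(weights))] += 100 - sum(weights)
    return weights

def coverage_percent(input_apps, matched_apps):
    """Share of user-supplied apps that were matched in the CyPerf library."""
    return len(matched_apps) / len(input_apps) * 100

if __name__ == '__main__':
    apps = ['ssl', 'dns-base', 'web-browsing', 'quic-base']   # as read from the (application, percentage) CSV
    pcts = [68.1, 4.5, 25.7, 0.01]
    print(dict(zip(apps, percentages_to_weights(pcts))))       # {'ssl': 68, 'dns-base': 5, 'web-browsing': 26, 'quic-base': 1}
    print(coverage_percent(apps, ['SSL', 'DNS Flood', 'YYLive Microsoft Edge']))  # 75.0
```

A coverage of 75.0 would then be compared against `threshold_coverage_percentage` from `test_parameters.yml` (or, when the threshold is 0, presented to the user through the interactive menu above).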
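`capture_replay.py`, added next, takes a different route when an App-ID is not in the mapping dictionary: it tokenizes the App-ID, drops noise tokens such as `base` and `ms`, collects every CyPerf library name containing any surviving token, and prefers the name that is hit most often. A condensed sketch of that heuristic (plain Python; the library names are hypothetical placeholders, lower-cased as in the sample):

```python
from collections import Counter

NOISE_TOKENS = {"base", "ms", "as", "ds"}   # same noise list the samples strip

def tokens(app_id):
    """Split a PAN App-ID like 'ms-office365-base' into searchable tokens."""
    return [t for t in app_id.split("-") if t not in NOISE_TOKENS] or [app_id]

def best_match(app_id, library):
    """Return the library name hit by the most tokens, or None when nothing matches."""
    hits = [name for t in tokens(app_id) for name in library if t in name]
    return Counter(hits).most_common(1)[0][0] if hits else None

library = ['office365 outlook microsoft edge', 'dns flood', 'ftp']   # hypothetical
print(best_match('ms-office365-base', library))   # 'office365 outlook microsoft edge'
print(best_match('dns-base', library))            # 'dns flood'
print(best_match('quic-base', library))           # None
```

This is a heuristic rather than an exact lookup, which is why the samples report a coverage percentage instead of failing outright on the first miss.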
diff --git a/samples/capture_folder/mssql-db-encrypted.pcap b/samples/capture_folder/mssql-db-encrypted.pcap
new file mode 100644
index 0000000..97e3a2e
Binary files /dev/null and b/samples/capture_folder/mssql-db-encrypted.pcap differ
diff --git a/samples/capture_folder/ssl.pcap b/samples/capture_folder/ssl.pcap
new file mode 100644
index 0000000..4a410c7
Binary files /dev/null and b/samples/capture_folder/ssl.pcap differ
diff --git a/samples/capture_replay.py b/samples/capture_replay.py
new file mode 100644
index 0000000..c4d6085
--- /dev/null
+++ b/samples/capture_replay.py
@@ -0,0 +1,461 @@
+import cyperf
+import os
+import re
+import utils
+import urllib3; urllib3.disable_warnings()
+import argparse
+import yaml
+from pprint import pprint
+import asyncio
+import csv
+import pandas as pd
+from collections import Counter
+
+file_path = '/mnt/c/new_python_automation_panw/cyperf-api-wrapper/samples/test_parameters.yml'
+
+def remove_suffix(string, suffix="-base"):
+    if string.endswith(suffix):
+        return string[:-len(suffix)]
+    return string
+
+def find_indices(my_list, target_string, exact_match):
+    if exact_match:
+        return [i for i, s in enumerate(my_list) if s == target_string]
+    else:
+        return [i for i, s in enumerate(my_list) if target_string in s]
+
+def select_best_matching_app(dump_app_list, cyperf_apps):
+    #print(dump_app_list)
+    # Count the frequency of each string
+    freq = Counter(dump_app_list)
+    # sort so that the most frequent matches come first
+    sorted_list = sorted(dump_app_list, key=lambda x: freq[x], reverse=True)
+    return sorted_list
+
+def convert_app_names_to_common_names(application_dict):
+    for item in application_dict:
+        app_name = item
+        # if the user-specified application name does not contain a hyphen ('-'), then include it as-is in the list
+        if app_name.find("-") == -1:
+            application_dict[item].append(app_name)
+        # if the user-specified application name contains a suffix (-base), prefix (ms-), etc., remove them. This will help in the search.
+        else:
+            temp_list = re.split(r"[- ]", app_name)
+            # This needs to be updated on a regular basis
+            strings_to_remove = ["base", "ms", "as", "ds"]
+            result = [s for s in temp_list if s not in strings_to_remove]
+            application_dict[item].extend(result)
+    return application_dict
+
+
+#read a yml file and return a dictionary
+def read_yaml_file(file_path):
+    try:
+        with open(file_path, 'r') as file:
+            yaml_data = yaml.safe_load(file)
+            return yaml_data
+    except FileNotFoundError:
+        print(f"File not found: {file_path}")
+        return {}
+    except yaml.YAMLError as e:
+        print(f"Error parsing YAML file: {e}")
+        return {}
+
+#read the test_parameters from the yml file and populate them into variables
+yaml_dict = read_yaml_file(file_path)
+
+print(yaml_dict)
+
+#populate the variables with user inputs
+capture_folder_path = yaml_dict["location_of_folder_containing_captures"]
+name_of_existing_cyperf_configuration = yaml_dict["name_of_existing_cyperf_configuration"]
+csv_path = yaml_dict["csv_path"]
+exact_match = yaml_dict["exact_match"]
+
+#Load the configuration, or check if the configuration is already loaded and has an active session id.
+class AppMixBuilderTest(object):
+    def __init__(self, capture_folder_path, name_of_existing_cyperf_configuration, csv_path, agent_map={}):
+        args, offline_token = utils.parse_cli_options()
+        self.utils = utils.Utils(args.controller,
+                                 username=args.user,
+                                 password=args.password,
+                                 refresh_token=offline_token,
+                                 license_server=args.license_server,
+                                 license_user=args.license_user,
+                                 license_password=args.license_password)
+
+        self.capture_folder_path = capture_folder_path
+        self.name_of_existing_cyperf_configuration = name_of_existing_cyperf_configuration
+        self.csv_path = csv_path
+        self.agent_map = agent_map
+        self.local_stats = {}
+
+    def __del__(self):
+        self._release()
+
+    def __enter__(self):
+        return self
+
+    def __exit__(self, exception_type, exception_value, exception_traceback):
+        self._release()
+        if exception_value:
+            raise exception_value
+
+    def _release(self):
+        try:
+            if self.session:
+                print('Deleting session')
+                self.utils.delete_session(self.session)
+                print('Deleted session')
+                self.session = None
+        except AttributeError:
+            pass
+
+    def _set_objective_and_timeline(self):
+        # Change the objective type to 'Simulated Users'. 'Throughput' is not yet supported for UDP Stream.
+        self.utils.set_objective_and_timeline(self.session,
+                                              objective_type=cyperf.ObjectiveType.SIMULATED_USERS,
+                                              objective_value=1000,
+                                              test_duration=self.test_duration)
+
+    #function to read the input CSV file, which contains 2 columns - (a) Application Name (b) percentage
+    def read_csv_file(self):
+        data_dict = {}
+        with open(self.csv_path, 'r') as file:
+            reader = csv.reader(file)
+            for row in reader:
+                if len(row) == 2:
+                    key, value = row
+                    data_dict[key] = value
+                else:
+                    print(f"Skipping row: {row}. Expected 2 columns, but got {len(row)}.")
+        return data_dict
+
+    def sum_percentage_and_populate_applications(self):
+        try:
+            df = pd.read_csv(self.csv_path)
+            total_percentage = df['percentage'].sum()
+            applications = df['application'].tolist()
+            percentages = df['percentage'].tolist()
+            return total_percentage, applications, percentages
+        except pd.errors.EmptyDataError:
+            print("The CSV file is empty.")
+            return None, None, None
+        except KeyError as e:
+            print(f"The CSV file is missing the column: {e}")
+            return None, None, None
+
+    def percentages_to_weights(self, percentages):
+        total_percentage = sum(percentages)
+        weights = [round(p / total_percentage * 100) for p in percentages]
+        # Ensure no weight is zero
+        min_weight = min(weights)
+        if min_weight == 0:
+            zero_indices = [i for i, w in enumerate(weights) if w == 0]
+            for i in zero_indices:
+                weights[i] = 1
+            # Adjust the weights to ensure they sum up to 100
+            diff = sum(weights) - 100
+            weights[weights.index(max(weights))] -= diff
+        else:
+            # Adjust the weights to ensure they sum up to 100
+            diff = 100 - sum(weights)
+            weights[weights.index(max(weights))] += diff
+        return weights
+
+    def configure_test(self):
+        total_percentage, applications, percentages = self.sum_percentage_and_populate_applications()
+
+        if total_percentage < 99 or total_percentage > 101:
+            print("You must fix the percentages such that they add up to 100 ")
+        else:
+            weights = self.percentages_to_weights(percentages)
+
+            input_app_dict = dict(zip(applications, weights))
+
+            #Search for a configuration by the name provided in the test_parameters.yml file (example - PANW-APPMIX)
+            if self.utils.search_configuration_file(name_of_existing_cyperf_configuration):
+                #load the configuration and create a test session
+                try:
+                    session_appmix = self.utils.create_session_by_config_name(name_of_existing_cyperf_configuration)
+                    print("The configuration was loaded successfully!")
+                except Exception:
+                    print("The configuration was not loaded successfully!")
+                    return False
+
+                session_id = session_appmix.id
+                #print(session_id)
+
+                #check if the apps are present pre-canned in CyPerf
+                app_list = applications
+
+                #convert this list to a dictionary where the keys are the app names read from the csv supplied by the customer and the values are lists of
+                #names derived from them (split into tokens, which enables a better search)
+                #For example, if the customer provides a list containing the application name ms-office-base,
+                #then we will store it in a dictionary where the key will be 'ms-office-base' and the value will be [office].
+                # This will enable more hits in the search against the CyPerf applications (pre-canned + custom)
+
+                #create the data structure as described above
+                app_dict = {}
+                for index in range(len(applications)):
+                    app_dict.update({applications[index]: []})
+
+                #The app_dict created above must have the weight as the first element in the list
+                #for example, if ms-office-base has weight 3,
+                #app_dict must have an element with ms-office-base as the key & [3, "office"] as the value
+                for item in app_dict:
+                    weight = input_app_dict[item]
+                    app_dict[item].append(weight)
+
+                #convert the application names mentioned in the csv to common names
+                new_app_dict = convert_app_names_to_common_names(app_dict)
+                #pprint(new_app_dict)
+                found = 0
+                # create a variable called target app mix, which we will eventually use for the app-mix creation in the CyPerf traffic profile
+                target_app_mix_dict = {}
+
+                #search for the applications in the CyPerf library (pre-canned + custom)
+                #collect all the applications from the CyPerf library into a variable
+                cyperf_apps = self.utils.get_apps(session_appmix)
+
+                #convert the CyPerf app names to lower case
+                for i, x in enumerate(cyperf_apps):
+                    if isinstance(x, str):
+                        cyperf_apps[i] = x.lower()
+
+                for item in new_app_dict:
+                    app_list = new_app_dict[item]
+                    dump_app_list = []
+                    for ele in app_list[1:]:
+                        indices = find_indices(cyperf_apps, ele, False)
+                        if len(indices):
+                            temp_list = [cyperf_apps[i] for i in indices]
+                            dump_app_list.extend(temp_list)
+
+                    best_matches = select_best_matching_app(dump_app_list, cyperf_apps)
+                    if best_matches:
+                        best = best_matches[0]
+                        weight = new_app_dict[item][0]
+                        target_app_mix_dict.update({best: weight})
+
+                #Now tag the weight to this app
+                print(" Target appmix is = ")
+                pprint(target_app_mix_dict)
+
+                #coverage percentage
+                coverage_percent = (len(target_app_mix_dict.keys()) / len(new_app_dict.keys())) * 100
+                print(f"coverage percentage = {coverage_percent}")
+
+                #add the applications
+                self.utils.add_apps_with_weights1(session_appmix, target_app_mix_dict)
+                #self.utils.check_if_traffic_enabled(session)
+                print("Done!")
+
+
+class CaptureReplayTest(object):
+    def __init__(self, capture_folder_path, agent_map={}):
+        args, offline_token = utils.parse_cli_options()
+        self.utils = utils.Utils(args.controller,
+                                 username=args.user,
+                                 password=args.password,
+                                 refresh_token=offline_token,
+                                 license_server=args.license_server,
+                                 license_user=args.license_user,
+                                 license_password=args.license_password)
+
+        self.capture_folder_path = capture_folder_path
+        self.agent_map = agent_map
+        self.test_duration = 60
+        self.local_stats = {}
+
+    def __del__(self):
+        self._release()
+
+    def __enter__(self):
+        return self
+
+    def __exit__(self, exception_type, exception_value, exception_traceback):
+        self._release()
+        if exception_value:
+            raise exception_value
+
+    def _release(self):
+        try:
+            if self.session:
+                print('Deleting session')
+                self.utils.delete_session(self.session)
+                print('Deleted session')
+                self.session = None
+        except AttributeError:
+            pass
+
+    def _set_objective_and_timeline(self):
+        # Change the objective type to 'Simulated Users'. 'Throughput' is not yet supported for UDP Stream.
+        self.utils.set_objective_and_timeline(self.session,
+                                              objective_type=cyperf.ObjectiveType.SIMULATED_USERS,
+                                              objective_value=1000,
+                                              test_duration=self.test_duration)
+
+    def get_capture_file_paths(self):
+        try:
+            # Check if the folder exists
+            if not os.path.exists(self.capture_folder_path):
+                print(f"Error: Folder '{self.capture_folder_path}' does not exist.")
+                return []
+
+            # Get a list of all files in the folder
+            file_paths = [os.path.join(self.capture_folder_path, file) for file in os.listdir(self.capture_folder_path) if os.path.isfile(os.path.join(self.capture_folder_path, file))]
+            return file_paths
+
+        except Exception as e:
+            print(f"An error occurred: {str(e)}")
+            return []
+
+    async def configure(self):
+        print('Configuring ...')
+        # read the pcap files
+        list_of_paths_of_pcap_files = self.get_capture_file_paths()
+        list_of_paths_of_pcap_files.reverse()
+        print(list_of_paths_of_pcap_files)
+        # upload all the captures from the specified folder (in the yml file)
+        # to the CyPerf Resource Library for captures
+        while list_of_paths_of_pcap_files:
+            capture_file = list_of_paths_of_pcap_files.pop()
+            print("uploading capture - {} ".format(capture_file))
+            await self.utils.upload_the_capture_file(capture_file)
+        # create applications from the uploaded captures
+        apps_created = self.utils.create_apps_from_captures()
+        print('Configuration complete! You may now use the custom apps created from the pcaps.\nThe custom apps are available under the Resource Library in the CyPerf Controller')
+
+    def _start(self):
+        print('Starting test ...')
+        self.utils.start_test(self.session)
+        print('Started test ...')
+
+    def _process_stats(self, stats):
+        processed_stats = self.local_stats
+        for stat in stats:
+            if stat.snapshots:
+                processed_stats[stat.name] = {}
+                for snapshot in stat.snapshots:
+                    time_stamp = snapshot.timestamp
+                    processed_stats[stat.name][time_stamp] = []
+                    d = {}
+                    for idx, stat_name in enumerate(stat.columns):
+                        d[stat_name] = [val[idx].actual_instance for val in snapshot.values]
+                    processed_stats[stat.name][time_stamp] = d
+        return processed_stats
+
+    def _print_run_time_stats(self, test, time_from, time_to):
+        stat_names = ['client-streaming-rate', 'server-streaming-rate']
+        return self.print_run_time_stats(test, time_from, time_to, stat_names)
+
+    def print_run_time_stats(self, test, time_from, time_to, stat_names):
+        last_monitored_time_stamp = None
+        for stat_name in stat_names:
+            stats = self.utils.collect_stats(test,
+                                             stat_name,
+                                             time_from,
+                                             time_to,
+                                             self._process_stats)
+            if stat_name not in stats:
+                continue
+
+            stats = stats[stat_name]
+            last_time_stamp = max(stats)
+
+            if stat_name in self.last_recorded_time_stamps:
+                last_recorded_time_stamp = self.last_recorded_time_stamps[stat_name]
+            else:
+                last_recorded_time_stamp = 0
+
+            if last_time_stamp != last_recorded_time_stamp:
+                last_stats = stats[last_time_stamp]
+
+                print(f'\n{stat_name} at {self.utils.format_milliseconds(last_time_stamp)}\n')
+                lines = self.utils.format_stats_dict_as_table(last_stats)
+                for line in lines:
+                    print(line)
+
+                self.last_recorded_time_stamps[stat_name] = last_time_stamp
+
+            if last_monitored_time_stamp:
+                last_monitored_time_stamp = min(max(last_time_stamp, time_from),
+                                                last_monitored_time_stamp)
+            else:
+                last_monitored_time_stamp = max(last_time_stamp, time_from)
+
+        return last_monitored_time_stamp
+
+    def _wait_until_stopped(self):
+        self.last_recorded_time_stamps = {}
+        self.utils.wait_for_test_stop(self.session, self._print_run_time_stats)
+        print('Stopped test ...')
+
+
def run(self): + self._start() + self._wait_until_stopped() + + def collect_final_stats(self): + print('Collecting final statistics ...') + stat_names = ['client-streaming-statistics', 'server-streaming-statistics'] + session_api = cyperf.SessionsApi(self.utils.api_client) + test = session_api.get_test(session_id=self.session.id) + self.print_run_time_stats(test, 0, -1, stat_names) + print('Collected final statistics ...') + + +def start_test(): + + agents = { + 'IP Network 1': ['10.39.68.164'], + 'IP Network 2': ['10.39.68.184'] + } + with AppMixBuilderTest( capture_folder_path,name_of_existing_cyperf_configuration,csv_path,agents) as test1: + test1.configure_test() + #test.configure() + #test.run() + #test.collect_final_stats() + + '''with CaptureReplayTest(capture_folder_path,agents) as test: + await test.configure() + #test.configure() + #test.run() + #test.collect_final_stats()''' + + +async def main(): + + agents = { + 'IP Network 1': ['10.39.68.164'], + 'IP Network 2': ['10.39.68.184'] + } + + with CaptureReplayTest(capture_folder_path,agents) as test: + await test.configure() + #test.configure() + #test.run() + #test.collect_final_stats() + +if __name__ == '__main__': + #asyncio.run(main()) + start_test() diff --git a/samples/combined_report.csv b/samples/combined_report.csv new file mode 100644 index 0000000..8b41ec7 --- /dev/null +++ b/samples/combined_report.csv @@ -0,0 +1,87 @@ +application,weight +adobe-echosign,1 +alteryx-base,2 +apple-push-notifications,24 +apt-get,1265 +azure-log-analytics,407 +azure-openai-encrypted,1 +bacnet-write-property,3 +canva-base,1 +capwap,1 +cloudinary-base,36 +datadog,429 +dns-base,4484 +dtls,5 +facebook-base,1 +firebase-cloud-messaging,21 +github-base,62 +github-pages,1210 +google-analytics,7 +google-base,1469 +google-maps,29 +http-proxy,6 +icmp,37 +insufficient-data,59 +jira-base,4 +ldap,161 +linkedin-base,3 +microsoft-dynamics-crm,33 +microsoft-excel,4 +microsoft-intune,2 +modbus-read-holding-registers,1 +mqtt-connect,2 +mqtt-disconnect,1 +ms-ds-smb-base,16 +ms-ds-smbv3,187 +ms-office365-base,509 +ms-onedrive-base,2 +ms-onedrive-business,2 +ms-update,420 +ms-visual-studio-tfs-base,37 +ms-wmi,4 +msrpc-base,163 +mssql-db-base,66 +mssql-db-encrypted,2508 +mssql-db-unencrypted,127 +mssql-mon,52 +mysql,2 +netbios-ns,61 +niagara-fox,44 +ntp-base,178 +ocsp,440 +octopus,2 +okta,580 +open-vpn,6 +openai-api,2 +oracle,90 +oracle-eloqua,39 +outlook-web,82 +postgres,4 +quic-base,1 +rabbitmq,4 +rtcp,2 +rtp-base,2 +rtsp,6 +salesforce-base,67 +sendgrid,216 +service-now-base,36 +sharepoint-online,60 +slack-base,2 +slp,2 +smtp-base,108 +snmp-trap,3 +snmpv1-get-next-request,3 +snmpv1-get-request,2 +snmpv3-get-next-request,2 +snmpv3-get-request,3 +soap,111 +splunk,166 +sproutsocial,9 +ssl,6812 +unknown-tcp,13 +web-browsing,2566 +webex-base,1 +windows-azure-base,139 +youtube-base,1 +zendesk-base,7 +zoominfo,2 diff --git a/samples/dict_of_pan_app-id_to_cyperf_app.txt b/samples/dict_of_pan_app-id_to_cyperf_app.txt new file mode 100644 index 0000000..31f7ef7 --- /dev/null +++ b/samples/dict_of_pan_app-id_to_cyperf_app.txt @@ -0,0 +1,103 @@ +web-browsing: YYLive Microsoft Edge +google-base: Office365 Outlook Microsoft Edge +google-maps: Meraki Microsoft Edge +facebook-base: iTunes Desktop +http-video: YYLive Microsoft Edge +alipay: Ctrip Chrome +alisoft: Alibaba +google-analytics: iTunes Desktop +amazon-chime-conferencing: Amazon Chime +amazon-chime-base: Amazon Chime +appdynamics: AppDynamics +appogee: Appogee +vimeo-base: appointy Microsoft Edge +zendesk-base: 
appointy Microsoft Edge +appointy: appointy Microsoft Edge +amazon-aws-console: AWS Console Microsoft Edge +amazon-cloud-drive-uploading: AWS S3 Microsoft Edge +bittorrent: BitTorrent Upload +blogger-blog-posting: Blogger +Web-browsing: Blogger +youtube-base: GooglePhotos Microsoft Edge +google-hangouts-base: GoogleHangouts Microsoft Edge +google-hangouts-video: Blogger +ms-ds-smbv1: CIFS +cisco-spark-base: Cisco Spark Microsoft Edge +webex-base: Cisco Spark Microsoft Edge +cisco-spark-file-transfer: Cisco Spark Microsoft Edge +jira-base: Jira Service Desk +confluence-downloading: Confluence +dns-base: DNS Flood +http-audio: Skype Microsoft Edge +facebook-voice: Facebook Audio Microsoft Edge +facebook-uploading: FacebookLive Chrome +facebook-posting: FacebookLive Microsoft Edge +facebook-chat: FacebookLive Microsoft Edge +facebook-video: FacebookLive Microsoft Edge +ftp: FTP +gmail-base: Gmail Chrome +gmail-downloading: Gmail Chrome +gmail-posting: Gmail Chrome +gmx-mail: GMX Mail +google: Google Classroom Microsoft Edge +google-drive-web: Google Drive Microsoft Edge +google-play: GooglePhotos Microsoft Edge +google-plus-base: GoogleHangouts Microsoft Edge +google-calendar-base: Google Calendar +google-classroom: Google Classroom Chrome +google-app-engine: Google Cloud Storage +google-safebrowsing: Google Cloud Storage +google-docs-downloading: Google Drive Microsoft Edge +google-docs-posting: Google Slides Microsoft Edge +google-docs-base: Google Slides Microsoft Edge +google-docs-uploading: GoogleHangouts Microsoft Edge +google-hangouts-audio-video: GoogleHangouts Microsoft Edge +google-hangouts-chat: GoogleHangouts Microsoft Edge +google-photos-uploading: GooglePhotos Microsoft Edge +google-photos-downloading: GooglePhotos Microsoft Edge +instagram-base: Google Search +google-update: Google Search +apache-guacamole: Guacamole +hbo: HBOMax +hulu-base: Hulu Microsoft Edge +c37.118-cmd-frame-send-cfg-2: IEEE C37.118 Synchrophasor UDP +jira-posting: Jira Service Desk +qq-games: League of Legends Microsoft Edge +mqtt-disconnect: MQTT Subscriber +mail.ru-base: Mail.ru Microsoft Edge +mms-ics-base: Manufacturing Message Specification (MMS) +ocsp: Meraki Microsoft Edge +new-relic: Meraki Microsoft Edge +windows-azure-base: Microsoft Azure Chrome +modbus-read-fifo-queue: Modbus +mongodb-base: MongoDB +mssql-db-unencrypted: MS-SQL Server +portmapper: NFSv3 +hotmail: Office365 Excel Microsoft Edge +office365-consumer-access: Skype 8 Microsoft Edge +ms-office365-base: Office365 Outlook Microsoft Edge +ms-onedrive-base: Office365 Outlook Microsoft Edge +outlook-web-online: Yammer Microsoft Edge +ms-onedrive-downloading: Office365 OneNote +ms-powerpoint-online: Office365 OneNote +ms-onedrive-uploading: Office365 OneDrive Microsoft Edge +outlook-web: Office365 OneNote +ms-onenote-base: Office365 OneNote +ms-outlook-personal-uploading: Office365 Outlook Microsoft Edge +odnoklassniki-base: OK.ru Microsoft Edge +oracle: Oracle Database +pop3: POP3 +postgres: PostgreSQL +reddit-base: Reddit Microsoft Edge +reddit-posting: Reddit Microsoft Edge +salesforce-base: Salesforce Microsoft Edge +service-now-editing: Service-Now Microsoft Edge +sina-weibo-base: Sina Weibo Microsoft Edge +skype: Skype Microsoft Edge +smtp-exception: SMTP +tubitv: Tubi Microsoft Edge +weather-desktop: TWC Microsoft Edge +vkontakte-base: VKontakte Microsoft Edge +vkontakte-chat: VKontakte Microsoft Edge +yammer-editing: Yammer Microsoft Edge +zoom: Zoom Classroom Teacher diff --git a/samples/invert.py b/samples/invert.py new file 
mode 100644
index 0000000..623d027
--- /dev/null
+++ b/samples/invert.py
@@ -0,0 +1,890 @@
+''' Script to create a csv which maps the App-IDs from Palo Alto to CyPerf Applications. The input to this file is a
+dictionary present at https://bitbucket.it.keysight.com/projects/ISGAPPSEC/repos/appsec-automation/browse/appsec/common/PAN_regression/PanSignatures.json
+The dictionary, which our engineers at Keysight maintain, maps CyPerf AppNames to App-IDs. Here we want to do the reverse, a one-to-one mapping between
+PANW App-IDs & CyPerf AppNames '''
+
+def invert_dictionary(d):
+    inverted_dict = {}
+    for key, values in d.items():
+        for value in values:
+            inverted_dict[value] = key
+    return inverted_dict
+
+def count_values(d):
+    k = []
+    for item in d:
+        for value in d[item]:
+            if value not in k:
+                k.append(value)
+    return k
+
+def print_dict_to_file(dictionary, filename):
+    with open(filename, 'w') as f:
+        for key, value in dictionary.items():
+            f.write(f"{key}: {value}\n")
+
+def print_dict_to_file_csv(dictionary, filename):
+    with open(filename, 'w') as f:
+        for key, value in dictionary.items():
+            f.write(f"{key},{value}\n")
+
+
+if __name__ == '__main__':
+    k = {
+        "Adobe Reader Updates Chrome": [
+            "web-browsing"
+        ],
+        "Adobe Reader Updates Firefox": [
+            "web-browsing"
+        ],
+        "Adobe Reader Updates Internet Explorer": [
+            "web-browsing"
+        ],
+        "Adobe Reader Updates Microsoft Edge": [
+            "web-browsing"
+        ],
+        "ADP Chrome": [
+            "web-browsing"
+        ],
+        "ADP Firefox": [
+            "web-browsing"
+        ],
+        "ADP Internet Explorer": [
+            "web-browsing"
+        ],
+        "ADP Microsoft Edge": [
+            "web-browsing"
+        ],
+        "Airbnb Chrome": [
+            "google-base", "google-maps", "facebook-base", "http-video", "web-browsing"
+        ],
+        "Airbnb Firefox": [
+            "google-base", "google-maps", "facebook-base", "http-video", "web-browsing"
+        ],
+        "Airbnb Internet Explorer": [
+            "google-base", "google-maps", "facebook-base", "http-video", "web-browsing"
+        ],
+        "Airbnb Microsoft Edge": [
+            "google-base", "google-maps", "facebook-base", "http-video", "web-browsing"
+        ],
+        "Alibaba": [
+            "web-browsing", "alipay", "alisoft", "google-base", "google-analytics"
+        ],
+        "Amazon Chime": [
+            "web-browsing", "amazon-chime-conferencing", "amazon-chime-base"
+        ],
+        "AppDynamics": [
+            "web-browsing", "appdynamics", "google-base"
+        ],
+        "Appogee": [
+            "web-browsing", "google-base", "appogee"
+        ],
+        "appointy Chrome": [
+            "web-browsing", "google-base", "vimeo-base", "zendesk-base", "google-maps", "appointy"
+        ],
+        "appointy Firefox": [
+            "web-browsing", "google-base", "vimeo-base", "zendesk-base", "google-maps", "appointy"
+        ],
+        "appointy Internet Explorer": [
+            "web-browsing", "google-base", "vimeo-base", "zendesk-base", "google-maps", "appointy"
+        ],
+        "appointy Microsoft Edge": [
+            "web-browsing", "google-base", "vimeo-base", "zendesk-base", "google-maps", "appointy"
+        ],
+        "AWS Console Chrome": [
+            "web-browsing", "amazon-aws-console"
+        ],
+        "AWS Console Firefox": [
+            "web-browsing", "amazon-aws-console"
+        ],
+        "AWS Console Internet Explorer": [
+            "web-browsing", "amazon-aws-console"
+        ],
+        "AWS Console Microsoft Edge": [
+            "web-browsing", "amazon-aws-console"
+        ],
+        "AWS S3 Chrome": [
+            "amazon-cloud-drive-uploading"
+        ],
+        "AWS S3 Firefox": [
+            "amazon-cloud-drive-uploading"
+        ],
+        "AWS S3 Internet Explorer": [
+            "amazon-cloud-drive-uploading"
+        ],
+        "AWS S3 Microsoft Edge": [
+            "amazon-cloud-drive-uploading"
+        ],
+        "Baidu Chrome": [
+            "web-browsing"
+        ],
+        "Baidu Firefox": [
+            "web-browsing"
+        ],
+        "Baidu Internet Explorer": [
+            "web-browsing"
+        ],
+        "Baidu
Maps Chrome": [ + "web-browsing" + ], + "Baidu Maps Firefox": [ + "web-browsing" + ], + "Baidu Maps Internet Explorer": [ + "web-browsing" + ], + "Baidu Maps Microsoft Edge": [ + "web-browsing" + ], + "Baidu Microsoft Edge": [ + "web-browsing" + ], + "Bilibili Chrome": [ + "http-video", "web-browsing" + ], + "Bilibili Firefox": [ + "http-video", "web-browsing" + ], + "Bilibili Internet Explorer": [ + "http-video", "web-browsing" + ], + "Bilibili Microsoft Edge": [ + "http-video", "web-browsing" + ], + "BitTorrent Download": [ + "bittorrent" + ], + "BitTorrent Upload": [ + "bittorrent" + ], + "Blogger": [ + "blogger-blog-posting", "Web-browsing", "google-base", "youtube-base", "google-analytics", "google-hangouts-base", "google-hangouts-video" + ], + "CCTV Video Mobile": [ + "http-video" + ], + "CIFS": [ + "ms-ds-smbv1" + ], + "Cisco Spark Chrome": [ + "cisco-spark-base", "webex-base", "cisco-spark-file-transfer", "web-browsing" + ], + "Cisco Spark Firefox": [ + "cisco-spark-base", "webex-base", "cisco-spark-file-transfer", "web-browsing" + ], + "Cisco Spark Internet Explorer": [ + "cisco-spark-base", "webex-base", "cisco-spark-file-transfer", "web-browsing" + ], + "Cisco Spark Microsoft Edge": [ + "cisco-spark-base", "webex-base", "cisco-spark-file-transfer", "web-browsing" + ], + "Commvault Chrome": [ + "web-browsing" + ], + "Commvault Firefox": [ + "web-browsing" + ], + "Commvault Internet Explorer": [ + "web-browsing" + ], + "Commvault Microsoft Edge": [ + "web-browsing" + ], + "Confluence": [ + "web-browsing", "jira-base", "confluence-downloading" + ], + "Crawling Wikipedia (Chinese) Chrome": [ + "web-browsing" + ], + "Crawling Wikipedia (Chinese) Firefox": [ + "web-browsing" + ], + "Crawling Wikipedia (Chinese) Internet Explorer": [ + "web-browsing" + ], + "Crawling Wikipedia (Chinese) Microsoft Edge": [ + "web-browsing" + ], + "Ctrip Chrome": [ + "web-browsing", "alipay" + ], + "Dianping": [ + "web-browsing" + ], + "DNS": [ + "dns-base" + ], + "DNS Flood": [ + "dns-base" + ], + "DocuSign Chrome": [ + "web-browsing" + ], + "DocuSign Firefox": [ + "web-browsing" + ], + "DocuSign Internet Explorer": [ + "web-browsing" + ], + "DocuSign Microsoft Edge": [ + "web-browsing" + ], + "Dreambox Chrome": [ + "http-audio", "web-browsing" + ], + "Dreambox Firefox": [ + "http-audio", "web-browsing" + ], + "Dreambox Internet Explorer": [ + "http-audio", "web-browsing" + ], + "Dreambox Microsoft Edge": [ + "http-audio", "web-browsing" + ], + "eBanking Chrome to Apache": [ + "web-browsing" + ], + "eBanking Firefox to IIS": [ + "web-browsing" + ], + "eBanking Internet Explorer to Nginx": [ + "web-browsing" + ], + "eBanking Microsoft Edge to Apache": [ + "web-browsing" + ], + "Ebay": [ + "web-browsing" + ], + "EpixNow Chrome": [ + "web-browsing", "http-video" + ], + "EpixNow Firefox": [ + "web-browsing", "http-video" + ], + "EpixNow Internet Explorer": [ + "web-browsing", "http-video" + ], + "EpixNow Microsoft Edge": [ + "web-browsing", "http-video" + ], + "eShop Chrome to Apache": [ + "web-browsing" + ], + "eShop Firefox to IIS": [ + "web-browsing" + ], + "eShop Internet Explorer to Nginx": [ + "web-browsing" + ], + "eShop Microsoft Edge to Apache": [ + "web-browsing" + ], + "Facebook Audio Chrome": [ + "facebook-voice", "facebook-base" + ], + "Facebook Audio Firefox": [ + "facebook-voice", "facebook-base" + ], + "Facebook Audio Internet Explorer": [ + "facebook-voice", "facebook-base" + ], + "Facebook Audio Microsoft Edge": [ + "facebook-voice", "facebook-base" + ], + "Facebook Chrome": [ + 
"facebook-base", "facebook-uploading" + ], + "Facebook Firefox": [ + "facebook-base", "facebook-uploading" + ], + "Facebook Internet Explorer": [ + "facebook-base", "facebook-uploading" + ], + "Facebook Microsoft Edge": [ + "facebook-base", "facebook-uploading" + ], + "FacebookLive Chrome": [ + "web-browsing", "facebook-base", "facebook-posting", "facebook-chat", "facebook-video", "facebook-posting", "facebook-chat", "facebook-uploading" + ], + "FacebookLive Firefox": [ + "web-browsing", "facebook-base", "facebook-posting", "facebook-chat", "facebook-video", "facebook-posting", "facebook-chat" + ], + "FacebookLive Internet Explorer": [ + "web-browsing", "facebook-base", "facebook-posting", "facebook-chat", "facebook-video", "facebook-posting", "facebook-chat" + ], + "FacebookLive Microsoft Edge": [ + "web-browsing", "facebook-base", "facebook-posting", "facebook-chat", "facebook-video", "facebook-posting", "facebook-chat" + ], + "FTP": [ + "ftp" + ], + "Gab Chrome": [ + "http-video", "web-browsing" + ], + "Gab Firefox": [ + "http-video", "web-browsing" + ], + "Gab Internet Explorer": [ + "http-video", "web-browsing" + ], + "Gab Microsoft Edge": [ + "http-video", "web-browsing" + ], + "Gaode Maps Chrome": [ + "web-browsing" + ], + "Gaode Maps Firefox": [ + "web-browsing" + ], + "Gaode Maps Internet Explorer": [ + "web-browsing" + ], + "Gaode Maps Microsoft Edge": [ + "web-browsing" + ], + "Gettr Chrome": [ + "web-browsing" + ], + "Gmail Chrome": [ + "gmail-base", "gmail-downloading", "gmail-posting" + ], + "GMX Mail": [ + "web-browsing", "gmx-mail", "google-base" + ], + "Google App Engine Chrome": [ + "google" , "google-base", "google-drive-web" , "google-play", "youtube-base" + ], + "Google Calendar": [ + "google-base", "google-plus-base", "google-maps", "google-calendar-base", "youtube-base" + ], + "Google Classroom Chrome": [ + "web-browsing","google-base", "google-drive-web", "google-play", "google-classroom" + ], + "Google Classroom Firefox": [ + "google" + ], + "Google Classroom Internet Explorer": [ + "google" + ], + "Google Classroom Microsoft Edge": [ + "google" + ], + "Google Cloud Storage": [ + "web-browsing","google-base","google-app-engine", "google-safebrowsing", "google-analytics" + ], + "Google Drive Chrome": [ + "google-drive-web", "youtube-base", "google-docs-downloading", "google-base", "google-play", "google-docs-posting", "google-docs-base", "google-docs-uploading" + ], + "Google Drive Firefox": [ + "google-drive-web", "youtube-base", "google-docs-downloading", "google-base", "google-play", "google-docs-posting", "google-docs-base", "google-docs-uploading" + ], + "Google Drive Internet Explorer": [ + "google-drive-web", "youtube-base", "google-docs-downloading", "google-base", "google-play", "google-docs-posting", "google-docs-base", "google-docs-uploading" + ], + "Google Drive Microsoft Edge": [ + "google-drive-web", "youtube-base", "google-docs-downloading", "google-base", "google-play", "google-docs-posting", "google-docs-base", "google-docs-uploading" + ], + "Google Sheets Chrome": [ + "youtube-base", "google-docs-posting", "google-base", "google-docs-base" + ], + "Google Sheets Firefox": [ + "youtube-base", "google-docs-posting", "google-base", "google-docs-base" + ], + "Google Sheets Internet Explorer": [ + "youtube-base", "google-docs-posting", "google-base", "google-docs-base" + ], + "Google Sheets Microsoft Edge": [ + "youtube-base", "google-docs-posting", "google-base", "google-docs-base" + + ], + "Google Slides Chrome": [ + "youtube-base", 
"google-docs-posting", "google-base", "google-docs-base" + ], + "Google Slides Firefox": [ + "youtube-base", "google-docs-posting", "google-base", "google-docs-base" + ], + "Google Slides Internet Explorer": [ + "youtube-base", "google-docs-posting", "google-base", "google-docs-base" + ], + "Google Slides Microsoft Edge": [ + "youtube-base", "google-docs-posting", "google-base", "google-docs-base" + ], + "GoogleHangouts Chrome": [ + "google-analytics", "google-hangouts-audio-video", "google-hangouts-base", "google-base", "google-play", "google-plus-base", "web-browsing", "google-hangouts-chat", "google-docs-uploading" + ], + "GoogleHangouts Firefox": [ + "google-analytics", "google-hangouts-audio-video", "google-hangouts-base", "google-base", "google-play", "google-plus-base", "web-browsing", "google-hangouts-chat", "google-docs-uploading" + ], + "GoogleHangouts Internet Explorer": [ + "google-analytics", "google-hangouts-audio-video", "google-hangouts-base", "google-base", "google-play", "google-plus-base", "web-browsing", "google-hangouts-chat", "google-docs-uploading" + ], + "GoogleHangouts Microsoft Edge": [ + "google-analytics", "google-hangouts-audio-video", "google-hangouts-base", "google-base", "google-play", "google-plus-base", "web-browsing", "google-hangouts-chat", "google-docs-uploading" + ], + "GooglePhotos Chrome": [ + "google-base", "google-photos-uploading", "google-photos-downloading", "google-play", "youtube-base" + ], + "GooglePhotos Firefox": [ + "google-base", "google-photos-uploading", "google-photos-downloading", "google-play", "youtube-base" + ], + "GooglePhotos Internet Explorer": [ + "google-base", "google-photos-uploading", "google-photos-downloading", "google-play", "youtube-base" + ], + "GooglePhotos Microsoft Edge": [ + "google-base", "google-photos-uploading", "google-photos-downloading", "google-play", "youtube-base" + ], + "Google Search": [ + "web-browsing", "google-base", "instagram-base", "google-update" + ], + "Guacamole": [ + "apache-guacamole" + ], + "HBOMax": [ + "web-browsing", "hbo" + ], + "HTTP App": [ + "web-browsing" + ], + "HTTP Excessive GET": [ + "web-browsing" + ], + "HTTP Excessive POST": [ + "web-browsing" + ], + "Hulu Chrome": [ + "hulu-base", "web-browsing" + ], + "Hulu Firefox": [ + "hulu-base", "web-browsing" + ], + "Hulu Internet Explorer": [ + "hulu-base", "web-browsing" + ], + "Hulu Microsoft Edge": [ + "hulu-base", "web-browsing" + ], + "IEEE C37.118 Synchrophasor TCP": [ + "c37.118-cmd-frame-send-cfg-2" + ], + "IEEE C37.118 Synchrophasor UDP": [ + "c37.118-cmd-frame-send-cfg-2" + ], + "Instacart Chrome": [ + "google-base", "facebook-base", "google-analytics", "web-browsing" + ], + "Instacart Firefox": [ + "google-base", "facebook-base", "google-analytics", "web-browsing" + ], + "Instacart Internet Explorer": [ + "google-base", "facebook-base", "google-analytics", "web-browsing" + ], + "Instacart Microsoft Edge": [ + "google-base", "facebook-base", "google-analytics", "web-browsing" + ], + "iTunes Desktop": [ + "web-browsing", "facebook-base", "google-analytics", "google-base" + ], + "Jingdong Chrome": [ + "web-browsing" + ], + "Jingdong Firefox": [ + "web-browsing" + ], + "Jingdong Internet Explorer": [ + "web-browsing" + ], + "Jingdong Microsoft Edge": [ + "web-browsing" + ], + "Jira Chrome": [ + "web-browsing", "jira-base", "jira-posting" + ], + "Jira Firefox": [ + "web-browsing", "jira-base", "jira-posting" + ], + "Jira Internet Explorer": [ + "web-browsing", "jira-base", "jira-posting" + ], + "Jira Microsoft Edge": [ + 
"web-browsing", "jira-base", "jira-posting" + ], + "Jira Service Desk": [ + "web-browsing", "jira-base", "jira-posting" + ], + "League of Legends Chrome": [ + "web-browsing", "qq-games" + ], + "League of Legends Firefox": [ + "web-browsing", "qq-games" + ], + "League of Legends Internet Explorer": [ + "web-browsing", "qq-games" + ], + "League of Legends Microsoft Edge": [ + "web-browsing", "qq-games" + ], + "LwM2M over MQTT Client": [ + "mqtt-disconnect" + ], + "LwM2M over MQTT Server": [ + "mqtt-disconnect" + ], + "Mail.ru Chrome": [ + "web-browsing", "mail.ru-base", "http-audio" + ], + "Mail.ru Firefox": [ + "web-browsing", "mail.ru-base", "http-audio" + ], + "Mail.ru Internet Explorer": [ + "web-browsing", "mail.ru-base", "http-audio" + ], + "Mail.ru Microsoft Edge": [ + "web-browsing", "mail.ru-base", "http-audio" + ], + "Mango TV Chrome": [ + "web-browsing", "http-video" + ], + "Manufacturing Message Specification (MMS) ": [ + "mms-ics-base" + ], + "Meraki Chrome": [ + "google-base", "ocsp", "google-maps", "web-browsing", "new-relic" + ], + "Meraki Firefox": [ + "google-base", "ocsp", "google-maps", "web-browsing", "new-relic" + ], + "Meraki Internet Explorer": [ + "google-base", "ocsp", "google-maps", "web-browsing", "new-relic" + ], + "Meraki Microsoft Edge": [ + "google-base", "ocsp", "google-maps", "web-browsing", "new-relic" + ], + "Mewe Chrome": [ + "web-browsing" + ], + "Mewe Firefox": [ + "web-browsing" + ], + "Mewe Internet Explorer": [ + "web-browsing" + ], + "Mewe Microsoft Edge": [ + "web-browsing" + ], + "Microsoft Azure Chrome": [ + "web-browsing", "windows-azure-base" + ], + "Modbus": [ + "modbus-read-fifo-queue" + ], + "MongoDB": [ + "mongodb-base" + ], + "MQTT Publisher":[ + "mqtt-disconnect" + ], + "MQTT Subscriber":[ + "mqtt-disconnect" + ], + "MS-SQL Server": [ + "mssql-db-unencrypted" + ], + "Netease Music Chrome": [ + "http-video", "http-audio", "web-browsing" + ], + "Netease Music Firefox": [ + "http-video", "http-audio", "web-browsing" + ], + "Netease Music Internet Explorer": [ + "http-video", "http-audio", "web-browsing" + ], + "Netease Music Microsoft Edge": [ + "http-video", "http-audio", "web-browsing" + ], + "NFSv3": [ + "portmapper" + ], + "Office 365 Outlook People Chrome": [ + "hotmail", "office365-consumer-access", "ms-office365-base", "web-browsing", "ms-onedrive-base" + ], + "Office 365 Outlook People Firefox": [ + "hotmail", "office365-consumer-access", "ms-office365-base", "web-browsing", "ms-onedrive-base" + ], + "Office 365 Outlook People Internet Explorer": [ + "hotmail", "office365-consumer-access", "ms-office365-base", "web-browsing", "ms-onedrive-base" + ], + "Office 365 Outlook People Microsoft Edge": [ + "hotmail", "office365-consumer-access", "ms-office365-base", "web-browsing", "ms-onedrive-base" + ], + "Office365 Excel Chrome": [ + "hotmail", "office365-consumer-access", "ms-office365-base", "outlook-web-online", "web-browsing", "ms-onedrive-base", "ms-onedrive-downloading", "google-base" + ], + "Office365 Excel Firefox": [ + "hotmail", "office365-consumer-access", "ms-office365-base", "outlook-web-online", "web-browsing", "ms-onedrive-base", "ms-onedrive-downloading", "google-base" + ], + "Office365 Excel Internet Explorer": [ + "hotmail", "office365-consumer-access", "ms-office365-base", "outlook-web-online", "web-browsing", "ms-onedrive-base", "ms-onedrive-downloading", "google-base" + ], + "Office365 Excel Microsoft Edge": [ + "hotmail", "office365-consumer-access", "ms-office365-base", "outlook-web-online", "web-browsing", 
"ms-onedrive-base", "ms-onedrive-downloading", "google-base" + ], + "Office365 OneDrive Chrome": [ + "ms-powerpoint-online", "office365-consumer-access", "ms-onedrive-uploading", "ms-onedrive-downloading", "ms-office365-base", "outlook-web", "outlook-web-online", "web-browsing", "ms-onedrive-base" + ], + "Office365 OneDrive Firefox": [ + "ms-powerpoint-online", "office365-consumer-access", "ms-onedrive-uploading", "ms-onedrive-downloading", "ms-office365-base", "outlook-web", "outlook-web-online", "web-browsing", "ms-onedrive-base" + ], + "Office365 OneDrive Internet Explorer": [ + "ms-powerpoint-online", "office365-consumer-access", "ms-onedrive-uploading", "ms-onedrive-downloading", "ms-office365-base", "outlook-web", "outlook-web-online", "web-browsing", "ms-onedrive-base" + ], + "Office365 OneDrive Microsoft Edge": [ + "ms-powerpoint-online", "office365-consumer-access", "ms-onedrive-uploading", "ms-onedrive-downloading", "ms-office365-base", "outlook-web", "outlook-web-online", "web-browsing", "ms-onedrive-base" + ], + "Office365 OneNote":[ + "web-browsing", "outlook-web-online", "ms-office365-base", "ms-onedrive-downloading", "ms-powerpoint-online", "outlook-web", "ms-onedrive-base", "office365-consumer-access", "ms-onenote-base" + ], + "Office365 Outlook Chrome": [ + "office365-consumer-access", "ms-office365-base", "http-audio", "google-base", "ms-outlook-personal-uploading", "outlook-web-online", "web-browsing", "ms-onedrive-base", "google-base" + ], + "Office365 Outlook Firefox": [ + "office365-consumer-access", "ms-office365-base", "http-audio", "google-base", "ms-outlook-personal-uploading", "outlook-web-online", "web-browsing", "ms-onedrive-base", "google-base" + ], + "Office365 Outlook Internet Explorer": [ + "office365-consumer-access", "ms-office365-base", "http-audio", "google-base", "ms-outlook-personal-uploading", "outlook-web-online", "web-browsing", "ms-onedrive-base", "google-base" + ], + "Office365 Outlook Microsoft Edge": [ + "office365-consumer-access", "ms-office365-base", "http-audio", "google-base", "ms-outlook-personal-uploading", "outlook-web-online", "web-browsing", "ms-onedrive-base", "google-base" + ], + "OK.ru Chrome": [ + "odnoklassniki-base", "web-browsing" + ], + "OK.ru Firefox": [ + "odnoklassniki-base", "web-browsing" + ], + "OK.ru Internet Explorer": [ + "odnoklassniki-base", "web-browsing" + ], + "OK.ru Microsoft Edge": [ + "odnoklassniki-base", "web-browsing" + ], + "Oracle Database": [ + "oracle" + ], + "Portal Chrome to Apache": [ + "web-browsing" + ], + "Portal Firefox to IIS": [ + "web-browsing" + ], + "Portal Internet Explorer to Nginx": [ + "web-browsing" + ], + "Portal Microsoft Edge to Apache": [ + "web-browsing" + ], + "POP3": [ + "pop3" + ], + "PostgreSQL": [ + "postgres" + ], + "Reddit Chrome": [ + "reddit-base", "reddit-posting", "web-browsing" + ], + "Reddit Firefox": [ + "reddit-base", "reddit-posting", "web-browsing" + ], + "Reddit Internet Explorer": [ + "reddit-base", "reddit-posting", "web-browsing" + ], + "Reddit Microsoft Edge": [ + "reddit-base", "reddit-posting", "web-browsing" + ], + "Salesforce Chrome": [ + "salesforce-base", "web-browsing" + ], + "Salesforce Firefox": [ + "salesforce-base", "web-browsing" + ], + "Salesforce Internet Explorer": [ + "salesforce-base", "web-browsing" + ], + "Salesforce Microsoft Edge": [ + "salesforce-base", "web-browsing" + ], + "Service-Now Chrome": [ + "service-now-editing" + ], + "Service-Now Firefox": [ + "service-now-editing" + ], + "Service-Now Internet Explorer": [ + 
"service-now-editing" + ], + "Service-Now Microsoft Edge": [ + "service-now-editing" + ], + "Sina Weibo Chrome": [ + "http-video", "sina-weibo-base", "web-browsing", "http-video" + ], + "Sina Weibo Firefox": [ + "http-video", "sina-weibo-base", "web-browsing", "http-video" + ], + "Sina Weibo Internet Explorer": [ + "http-video", "sina-weibo-base", "web-browsing", "http-video" + ], + "Sina Weibo Microsoft Edge": [ + "http-video", "sina-weibo-base", "web-browsing", "http-video" + ], + "Skype 8 Chrome": [ + "skype", "web-browsing", "office365-consumer-access" + ], + "Skype 8 Firefox": [ + "skype", "web-browsing", "office365-consumer-access" + ], + "Skype 8 Internet Explorer": [ + "skype", "web-browsing", "office365-consumer-access" + ], + "Skype 8 Microsoft Edge": [ + "skype", "web-browsing", "office365-consumer-access" + ], + "Skype Chrome": [ + "skype", "web-browsing", "http-audio" + ], + "Skype Firefox": [ + "skype", "web-browsing", "http-audio" + ], + "Skype Internet Explorer": [ + "skype", "web-browsing", "http-audio" + ], + "Skype Microsoft Edge": [ + "skype", "web-browsing", "http-audio" + ], + "SMTP": [ + "smtp-exception" + ], + "Socal Network Chrome to Apache": [ + "web-browsing" + ], + "Social Network Firefox to IIS": [ + "web-browsing" + ], + "Social Network Internet Explorer to Nginx": [ + "web-browsing" + ], + "Social Network Microsoft Edge to Apache": [ + "web-browsing" + ], + "Splunk Chrome": [ + "web-browsing" + ], + "Splunk Firefox": [ + "web-browsing" + ], + "Splunk Internet Explorer": [ + "web-browsing" + ], + "Splunk Microsoft Edge": [ + "web-browsing" + ], + "Tubi Chrome": [ + "tubitv" + ], + "Tubi Firefox": [ + "tubitv" + ], + "Tubi Internet Explorer": [ + "tubitv" + ], + "Tubi Microsoft Edge": [ + "tubitv" + ], + "TWC Chrome": [ + "web-browsing", "http-video", "weather-desktop" + ], + "TWC Firefox": [ + "web-browsing", "http-video", "weather-desktop" + ], + "TWC Internet Explorer": [ + "web-browsing", "http-video", "weather-desktop" + ], + "TWC Microsoft Edge": [ + "web-browsing", "http-video", "weather-desktop" + ], + "Video Platform Chrome to Apache": [ + "web-browsing" + ], + "Video Platform Firefox to IIS": [ + "web-browsing" + ], + "Video Platform Internet Explorer to Nginx": [ + "web-browsing" + ], + "Video Platform Microsoft Edge to Apache": [ + "web-browsing" + ], + "VKontakte Chrome": [ + "vkontakte-base", "vkontakte-chat", "web-browsing" + ], + "VKontakte Firefox": [ + "vkontakte-base", "vkontakte-chat", "web-browsing" + ], + "VKontakte Internet Explorer": [ + "vkontakte-base", "vkontakte-chat", "web-browsing" + ], + "VKontakte Microsoft Edge": [ + "vkontakte-base", "vkontakte-chat", "web-browsing" + ], + "Yammer Chrome": [ + "yammer-editing", "outlook-web-online", "web-browsing" + ], + "Yammer Firefox": [ + "yammer-editing", "outlook-web-online", "web-browsing" + ], + "Yammer Internet Explorer": [ + "yammer-editing", "outlook-web-online", "web-browsing" + ], + "Yammer Microsoft Edge": [ + "yammer-editing", "outlook-web-online", "web-browsing" + ], + "YYLive Chrome": [ + "web-browsing", "http-video" + ], + "YYLive Firefox": [ + "web-browsing", "http-video" + ], + "YYLive Internet Explorer": [ + "web-browsing", "http-video" + ], + "YYLive Microsoft Edge": [ + "web-browsing", "http-video" + ], + "Zoom All Hands Participant": [ + "zoom" + ], + "Zoom All Hands Presenter": [ + "zoom" + ], + "Zoom Brainstorming Participant": [ + "zoom" + ], + "Zoom Classroom Student": [ + "zoom" + ], + "Zoom Classroom Teacher": [ + "zoom" + ] +} + + j = invert_dictionary(k) + import 
pprint + pprint.pprint(j) + + print(len(count_values(k))) + + print_dict_to_file(j,"dict_of_pan_app-id_to_cyperf_app.txt") + print_dict_to_file_csv(j,"pan_app_id_to_cyperf_app_mappings.csv") + + diff --git a/samples/pan_app_id_to_cyperf_app_mappings.csv b/samples/pan_app_id_to_cyperf_app_mappings.csv new file mode 100644 index 0000000..40df765 --- /dev/null +++ b/samples/pan_app_id_to_cyperf_app_mappings.csv @@ -0,0 +1,188 @@ +app-id,cyperf-appname +solarwinds,SolarWinds NCM +solarwinds-agent,SolarWinds NCM +solarwinds-base,SolarWinds NCM +solarwinds-msp-anywhere,SolarWinds NCM +solarwinds-npm,SolarWinds NCM +solarwinds-rmm,SolarWinds SAM +solarwinds-sam,SolarWinds SAM +solarwinds-loggly,SolarWinds SAM +solarwinds-loggly-base,SolarWinds SAM +solarwinds-loggly-logout,SolarWinds SAM +google-base,Google App Engine Chrome +oracle,Oracle Database +azure-log-analytics,Microsoft Azure Chrome +splunk,Splunk Internet Explorer +smtp-base,SMTP +dns,DNS +rtp-base,Zoom All Hands Presenter +sharepoint-online,Microsoft SharePoint +ldap,LDAP +windows-azure-base,Microsoft Azure Chrome +okta,Okta Multifactor Authentication +ms-office365-base,Office365 Outlook Firefox +ms-update,Windows Updates +service-now-base,Service-Now Chrome +unknown-udp,UDP Stream +salesforce-base,Salesforce Chrome +github-base,Git over HTTP +github-pages,Git over HTTP +ntp-base,NTP +mysql,MySQL +sendgrid,SendGrid Chrome +mssql-db-base,MS-SQL Server +postgres,PostgreSQL +openai-api,OpenAI API +snmpv2-get-next-request,SNMPv2c +linkedin-base,LinkedIn +microsoft-excel,Office365 Excel Chrome +zoominfo,Zoom Classroom Teacher +ms-onedrive-business,Office365 OneDrive Microsoft Edge +jira-base,Jira Chrome +modbus-read-holding-registers,Modbus +outlook-web,Office365 Outlook Internet Explorer +mqtt-connect,MQTT Subscriber +mssql-mon,SQLMon +canva-base,Canva +ms-onedrive-base,Office365 OneDrive Microsoft Edge +google-analytics,Google Analytics +snmpv1-get-request,SNMPv1 +mqtt-disconnect,MQTT Subscriber +snmpv1-get-next-request,SNMPv1 +ms-ds-smb-base,SMBv2 +capwap,Capwap +facebook-base,Facebook Chrome +youtube-base,Youtube Chrome +niagara-fox,Tridium Niagara Fox +netbios-ns,NetBIOS +bacnet-simple-ack,BACnet-IP +kafka,Kafka +cassandra,Apache Cassandra +nfs,NFSv3 +http,HTTP App +radius,RADIUS +ftp,FTP +outlook,Office365 Outlook Chrome +ms-sql,MS-SQL Server +rtp,Zoom Classroom Teacher +google,Google App Engine Chrome +ms-ds-smbv2,SMBv2 +bacnet-read-property,BACnet-IP +dns-base,DNS +zendesk-base,Zendesk +bacnet-write-property,BACnet-IP +web-browsing,YYLive Microsoft Edge +google-base,Office365 Outlook Microsoft Edge +google-maps,Meraki Microsoft Edge +facebook-base,iTunes Desktop +http-video,YYLive Microsoft Edge +alipay,Ctrip Chrome +alisoft,Alibaba +google-analytics,iTunes Desktop +amazon-chime-conferencing,Amazon Chime +amazon-chime-base,Amazon Chime +appdynamics,AppDynamics +appogee,Appogee +vimeo-base,appointy Microsoft Edge +zendesk-base,appointy Microsoft Edge +appointy,appointy Microsoft Edge +amazon-aws-console,AWS Console Microsoft Edge +amazon-cloud-drive-uploading,AWS S3 Microsoft Edge +bittorrent,BitTorrent Upload +blogger-blog-posting,Blogger +Web-browsing,Blogger +youtube-base,GooglePhotos Microsoft Edge +google-hangouts-base,GoogleHangouts Microsoft Edge +google-hangouts-video,Blogger +ms-ds-smbv1,CIFS +cisco-spark-base,Cisco Spark Microsoft Edge +webex-base,Cisco Spark Microsoft Edge +cisco-spark-file-transfer,Cisco Spark Microsoft Edge +jira-base,Jira Service Desk +confluence-downloading,Confluence +dns-base,DNS Flood 
+http-audio,Skype Microsoft Edge +facebook-voice,Facebook Audio Microsoft Edge +facebook-uploading,FacebookLive Chrome +facebook-posting,FacebookLive Microsoft Edge +facebook-chat,FacebookLive Microsoft Edge +facebook-video,FacebookLive Microsoft Edge +ftp,FTP +gmail-base,Gmail Chrome +gmail-downloading,Gmail Chrome +gmail-posting,Gmail Chrome +gmx-mail,GMX Mail +google,Google Classroom Microsoft Edge +google-drive-web,Google Drive Microsoft Edge +google-play,GooglePhotos Microsoft Edge +google-plus-base,GoogleHangouts Microsoft Edge +google-calendar-base,Google Calendar +google-classroom,Google Classroom Chrome +google-app-engine,Google Cloud Storage +google-safebrowsing,Google Cloud Storage +google-docs-downloading,Google Drive Microsoft Edge +google-docs-posting,Google Slides Microsoft Edge +google-docs-base,Google Slides Microsoft Edge +google-docs-uploading,GoogleHangouts Microsoft Edge +google-hangouts-audio-video,GoogleHangouts Microsoft Edge +google-hangouts-chat,GoogleHangouts Microsoft Edge +google-photos-uploading,GooglePhotos Microsoft Edge +google-photos-downloading,GooglePhotos Microsoft Edge +instagram-base,Google Search +google-update,Google Search +apache-guacamole,Guacamole +hbo,HBOMax +hulu-base,Hulu Microsoft Edge +c37.118-cmd-frame-send-cfg-2,IEEE C37.118 Synchrophasor UDP +jira-posting,Jira Service Desk +qq-games,League of Legends Microsoft Edge +mqtt-disconnect,MQTT Subscriber +mail.ru-base,Mail.ru Microsoft Edge +mms-ics-base,Manufacturing Message Specification (MMS) +ocsp,Meraki Microsoft Edge +new-relic,Meraki Microsoft Edge +windows-azure-base,Microsoft Azure Chrome +modbus-read-fifo-queue,Modbus +mongodb-base,MongoDB +mssql-db-unencrypted,MS-SQL Server +portmapper,NFSv3 +hotmail,Office365 Excel Microsoft Edge +office365-consumer-access,Skype 8 Microsoft Edge +ms-office365-base,Office365 Outlook Microsoft Edge +ms-onedrive-base,Office365 Outlook Microsoft Edge +outlook-web-online,Yammer Microsoft Edge +ms-onedrive-downloading,Office365 OneNote +ms-powerpoint-online,Office365 OneNote +ms-onedrive-uploading,Office365 OneDrive Microsoft Edge +outlook-web,Office365 OneNote +ms-onenote-base,Office365 OneNote +ms-outlook-personal-uploading,Office365 Outlook Microsoft Edge +odnoklassniki-base,OK.ru Microsoft Edge +oracle,Oracle Database +pop3,POP3 +postgres,PostgreSQL +reddit-base,Reddit Microsoft Edge +reddit-posting,Reddit Microsoft Edge +salesforce-base,Salesforce Microsoft Edge +service-now-editing,Service-Now Microsoft Edge +sina-weibo-base,Sina Weibo Microsoft Edge +skype,Skype Microsoft Edge +smtp-exception,SMTP +tubitv,Tubi Microsoft Edge +weather-desktop,TWC Microsoft Edge +vkontakte-base,VKontakte Microsoft Edge +vkontakte-chat,VKontakte Microsoft Edge +yammer-editing,Yammer Microsoft Edge +zoom,Zoom Classroom Teacher +mssql-db-encrypted,CCA-mssql-db-encrypted +ssl,CCA-ssl diff --git a/samples/sample_attack_based_script.py b/samples/sample_attack_based_script.py deleted file mode 100644 index ace63d8..0000000 --- a/samples/sample_attack_based_script.py +++ /dev/null @@ -1,217 +0,0 @@ -import cyperf -from cyperf import * -from cyperf.utils import create_api_client_cli, TestRunner -from time import sleep -import os - -if __name__ == "__main__": - import urllib3;
urllib3.disable_warnings() - - # Enter a context with an instance of the API client - with create_api_client_cli(verify_ssl=False) as api_client: - - # Get some strikes - api_application_resources_instance = cyperf.ApplicationResourcesApi(api_client) - take = 3 # int | The number of search results to return (optional) - skip = 0 # int | The number of search results to skip (optional) - search_col = 'Name' # str | A list of comma-separated columns used to search for the supplied values (optional) - search_val = 'Google Chrome' # str | The keywords used to filter the items (optional) - categories = 'Browser:Chrome' - - strikes = None - try: - print(f"Finding first {take} strikes with \'{search_val}\' in their name, from the \'Browser\' category...") - api_application_resources_response = api_application_resources_instance.get_resources_strikes(take=take, skip=skip, search_col=search_col, search_val=search_val, categories=categories) - strikes = api_application_resources_response.data - if len(strikes) != take: - raise ValueError(f"Couldn't find {take} strikes.") - - print(f"{len(strikes)} strikes found:") - for strike in strikes: - print(f" {strike.name}") - print() - - except Exception as e: - print("Exception when calling ApplicationResourcesApi->get_resources_strikes: %s\n" % e) - - - - all_files = [] - # Add the attacks - for strike in strikes: - - # Find the pre-canned empty config - api_config_instance = cyperf.ConfigurationsApi(api_client) - take = 1 # int | The number of search results to return (optional) - skip = None # int | The number of search results to skip (optional) - search_col = 'displayName' # str | A list of comma-separated columns used to search for the supplied values (optional) - search_val = 'Cyperf Empty Config' # str | The keywords used to filter the items (optional) - filter_mode = None # str | The operator applied to the supplied values (optional) - sort = None # str | A list of comma-separated field:direction pairs used to sort the items where direction must be asc or dsc (optional) - config = api_config_instance.get_configs(take=1, - skip=skip, - search_col=search_col, - search_val=search_val, - filter_mode=filter_mode, - sort=sort) - - if len(config.data) == 0: - raise ValueError("Couldn't find the specified configuration.") - - # Load a pre-canned config - api_session_instance = cyperf.SessionsApi(api_client) - application = None # str | The user-friendly name for the application that controls this session (optional) - config_name = None # str | The display name of the configuration loaded in the session (optional) - config_url = config.data[0].config_url # str | The external URL of the configuration loaded in the session (optional) - index = None # int | The session's index (optional) (readonly) - name = None # str | The user-visible name of the session (optional) - owner = None # str | The user-visible name of the session's owner (optional) (readonly) - sessions = [cyperf.Session(application=application, - config_name=config_name, - configUrl=config_url, - index=index, - name=name, - owner=owner)] - # Create a session - session = None - try: - print("Creating empty session...") - api_session_response = api_session_instance.create_sessions(sessions=sessions) - session = api_session_response[0] - print("Session created.\n") - except Exception as e: - print("Exception when calling SessionsApi->create_sessions: %s\n" % e) # Add an app profile - - # Create an attack profile - print("Creating a Attack Profile...") - attack_profile_name = "Custom Attack Profile" - 
session.config.config.attack_profiles.append(cyperf.AttackProfile(id="1", name="My Attack Profile", attacks=[])) - session.config.config.attack_profiles.update() - print(attack_profile_name + " created succesfully.\n") - - take = 1 # int | The number of search results to return (optional) - skip = 0 # int | The number of search results to skip (optional) - search_col = 'Name' # str | A list of comma-separated columns used to search for the supplied values (optional) - search_val = strike.name # str | The keywords used to filter the items (optional) - - attack = None - try: - print(f"Finding the attack with the same name as the strike...") - api_application_resources_response = api_application_resources_instance.get_resources_attacks(take=take, skip=skip, search_col=search_col, search_val=search_val, categories=categories) - attack = api_application_resources_response.data[0] - print("Attack found.") - - except Exception as e: - print("Exception when calling ApplicationResourcesApi->get_resources_attacks: %s\n" % e) - - session.config.config.attack_profiles[0].attacks.append(Attack(id = "1", name = strike.name, external_resource_url = attack.id)) - session.config.config.attack_profiles[0].attacks.update() - - - print(f"Attack {strike.name} added successfully.\n") - - # Set the objective to single iteration - print("Setting the Iteration Count to 1...") - session.config.config.attack_profiles[0].objectives_and_timeline.timeline_segments[0].iteration_count = 1 - session.config.config.attack_profiles[0].objectives_and_timeline.update() - print("Iteration Count setted to 1 successfully.\n") - - # Create IP Networks - client_ip_network = IPNetwork(name="IP Network 1", id="1", agentAssignments=AgentAssignments(by_tag=[]), minAgents=1) - server_ip_network = IPNetwork(name="IP Network 2", id="2", agentAssignments=AgentAssignments(by_tag=[]), minAgents=1) - - # Append the IP Networks to the Network Profile - session.config.config.network_profiles[0].ip_network_segment = [client_ip_network, server_ip_network] - session.config.config.network_profiles[0].ip_network_segment.update() - print("Client and Server network segments added successfully.\n") - - # Get available agents - api_agents_instance = cyperf.AgentsApi(api_client) - agents = api_agents_instance.get_agents(exclude_offline='true') - - if len(agents) < 2: - raise ValueError("Expected at least 2 active agents") - - # Create an agent map - agent_map = { - 'IP Network 1': [agents[0].id, agents[0].ip], - 'IP Network 2': [agents[1].id, agents[1].ip] - } - - # Assign the agents - print("Assigning agents ...") - for net_profile in session.config.config.network_profiles: - for ip_net in net_profile.ip_network_segment: - if ip_net.name in agent_map: - agent_ip = agent_map[ip_net.name][1] - print(f" Agent {agent_ip} assigned to {ip_net.name}.") - capture_settings = None # str | The capture settings of the agent that is assigned (optional) - interfaces = None # List[str] | The names of the assigned test interfaces for the agent (optional) - links = None # List[APILink] | (optional) - agent_id = agent_map[ip_net.name][0] - agentDetails = [cyperf.AgentAssignmentDetails(agent_id=agent_id, - capture_setting=capture_settings, - id=agent_id, - interfaces=interfaces, - links=links)] - - if not ip_net.agent_assignments: - by_id = None # List[AgentAssignmentDetails] | The agents statically assigned to the current test configuration (optional) - by_port = None # List[AgentAssignmentByPort] | The ports assigned to the current test configuration (optional) - by_tag 
= [] # List[str] | The tags according to which the agents are dynamically assigned - links = None # List[APILink] | (optional) - ip_net.agent_assignments = cyperf.AgentAssignments(by_id=by_id, - by_port=by_port, - by_tag=by_tag, - links=links) - - ip_net.agent_assignments.by_id.extend(agentDetails) - ip_net.update() - print("Assigning agents completed.\n") - - # Start traffic - print("Starting the test ...") - api_test_operation_instance = cyperf.TestOperationsApi(api_client) - api_test_operation_response = api_test_operation_instance.start_test_run_start(session_id=session.id) - api_test_operation_response.await_completion() - - # Wait for the test to be finished - print("Test running ...") - session.refresh() - while session.test.status != 'STOPPED': - sleep(5) - session.refresh() - print("Test finished successfully.\n") - - # Download the test results - print("Downloading test results ...") - api_reports_instance = cyperf.ReportsApi(api_client) - generate_csv_operation = cyperf.GenerateCSVReportsOperation() - api_test_results_response = api_reports_instance.start_result_generate_csv(result_id=session.test.test_id, generate_csv_reports_operation=generate_csv_operation) - file_path = api_test_results_response.await_completion() - - last_separator_index = file_path.rfind("\\") - directory = file_path[:last_separator_index] - file_name = file_path[last_separator_index + 1:] - - print(f"Saved as: '{file_name}' at {directory}\n") - - # Add the file path to the list - all_files.append(file_path) - - - print("Aggregating all CSV files ...") - aggregated_file_path = os.path.join(directory, "aggregated_all_strikes_attack.csv") - - with open(aggregated_file_path, 'w', encoding='ISO-8859-1') as outfile: - for i, file_path in enumerate(all_files): - with open(file_path, 'r', encoding='ISO-8859-1') as infile: - if i == 0: - outfile.write(infile.read()) - else: - next(infile) - outfile.write(infile.read()) - - print(f"Aggregated results saved as: '{aggregated_file_path}'") - - diff --git a/samples/test_parameters.yml b/samples/test_parameters.yml new file mode 100644 index 0000000..f8e89d9 --- /dev/null +++ b/samples/test_parameters.yml @@ -0,0 +1,25 @@ +#Path of the folder where the captures will be stored. These will be used to convert to custom applications +location_of_folder_containing_captures : "/mnt/c/git-new-features/cyperf-api-wrapper/samples/capture_folder" + +#The name of the Base Configuration Template where the application profile needs to be configured. +#This template must not have any pre-configured Application profile. +name_of_existing_cyperf_configuration : "PANW-APPMIX" + +#The path of the csv file +#This file contains input in csv format: [application name, weights/percentage] +csv_path : "/mnt/c/git-new-features/cyperf-api-wrapper/samples/combined_report.csv" + +#Weights or percentages provided for the applications; if percentage is False, direct weights are provided +percentage: False + +#When set to True the user wishes to have exact name matches with the CyPerf Library +#It is recommended to keep the boolean value as False for maximum coverage +exact_match : False + +#The test will start running only if the coverage percentage is equal to or greater than the non-zero threshold value.
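+#For example (illustrative): with threshold_coverage_percentage : 80, the test starts only when at least 80% of the applications listed in the csv could be matched against the CyPerf library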
+#If the threshold value is set to zero, the script enters interactive mode (menu driven) +threshold_coverage_percentage : 0 + +#Dictionary which contains the mapping between app-ids and the CyPerf app names - no need to modify this +dictionary_path: "/mnt/c/git-new-features/cyperf-api-wrapper/samples/pan_app_id_to_cyperf_app_mappings.csv" +#dictionary_path: "/mnt/c/new_python_automation_panw/cyperf-api-wrapper/samples/dictionary-of-app-id_bkp.csv" diff --git a/samples/sample_attacks_load_and_run.py b/samples/test_samples/sample_attacks_load_and_run.py similarity index 100% rename from samples/sample_attacks_load_and_run.py rename to samples/test_samples/sample_attacks_load_and_run.py diff --git a/samples/sample_create_save_and_export_config.py b/samples/test_samples/sample_create_save_and_export_config.py similarity index 99% rename from samples/sample_create_save_and_export_config.py rename to samples/test_samples/sample_create_save_and_export_config.py index a0dc980..e36799c 100644 --- a/samples/sample_create_save_and_export_config.py +++ b/samples/test_samples/sample_create_save_and_export_config.py @@ -44,7 +44,7 @@ # Create a session session = None print("Creating empty session...") - api_session_response = api_session_instance.create_sessions(sessions=sessions) + api_session_response = api_session_instance.create_sessions(session=sessions) session = api_session_response[0] print("Session created.\n") @@ -188,3 +188,4 @@ file_name = file_path[last_separator_index + 1:] print(f"Exported as: '{file_name}' at {directory}\n") + diff --git a/samples/sample_load_and_run_precanned_config.py b/samples/test_samples/sample_load_and_run_precanned_config.py similarity index 99% rename from samples/sample_load_and_run_precanned_config.py rename to samples/test_samples/sample_load_and_run_precanned_config.py index 7e238a4..cf71dce 100644 --- a/samples/sample_load_and_run_precanned_config.py +++ b/samples/test_samples/sample_load_and_run_precanned_config.py @@ -44,7 +44,7 @@ # Create a session session = None print("Creating session from config called 'Not Working From Home Traffic Mix' ...") - api_session_response = api_session_instance.create_sessions(sessions=sessions) + api_session_response = api_session_instance.create_sessions(session=sessions) session = api_session_response[0] print("Session created.\n") diff --git a/samples/sample_udp_streaming_run.py b/samples/test_samples/sample_udp_streaming_run.py similarity index 99% rename from samples/sample_udp_streaming_run.py rename to samples/test_samples/sample_udp_streaming_run.py index a36caff..2c2ce64 100644 --- a/samples/sample_udp_streaming_run.py +++ b/samples/test_samples/sample_udp_streaming_run.py @@ -51,6 +51,7 @@ def _set_objective_and_timeline(self): def configure(self): print('Configuring ...') self.utils.add_app(self.session, 'UDP Stream') + #import pdb;pdb.set_trace() self.utils.disable_automatic_network(self.session) if self.agent_map: self.utils.assign_agents(self.session, self.agent_map) diff --git a/samples/utils.py b/samples/utils.py new file mode 100644 index 0000000..3d8991f --- /dev/null +++ b/samples/utils.py @@ -0,0 +1,699 @@ +import os +import socket +import time +import datetime +import warnings +from pprint import pprint +import sys +import cyperf +import json +import pyshark +import asyncio +import math +from urllib.parse import urlparse + +def extract_path(url): + parsed_url = urlparse(url) + return parsed_url.path + +def extract_last_part_of_path(url): + parsed_url = urlparse(url) + return
os.path.basename(parsed_url.path) + +def turns_coversations_dict_values(input_dict): + # Use dictionary comprehension to create a new dictionary with the updated values + updated_dict = {key: math.ceil(value / 2) for key, value in input_dict.items()} + return updated_dict + +def udp_identify_client_server(pcap_file): + + + # Create a pyshark Capture object + capture = pyshark.FileCapture(pcap_file, display_filter='udp') + # Initialize an empty dictionary to store the client-server information for each TCP stream + client_server_info = {} + + # Iterate through all packets in the capture + for packet in capture: + # Extract the TCP stream index + stream_index = packet.udp.stream + + # Check if the packet is a SYN packet (i.e., the start of a new TCP connection) + #if packet.tcp.flags_syn == 'True' and packet.tcp.flags_ack == 'False': + + # If the stream index is not already in the dictionary, add it with the client-server information + if stream_index not in client_server_info: + client_server_info[stream_index] = { + 'client_ip': packet.ip.src, + 'server_ip': packet.ip.dst + } + + # Close the capture + capture.close() + + return client_server_info + +def identify_client_server(pcap_file): + # Create a pyshark Capture object + capture = pyshark.FileCapture(pcap_file, display_filter='tcp') + + # Initialize an empty dictionary to store the client-server information for each TCP stream + client_server_info = {} + + # Iterate through all packets in the capture + for packet in capture: + # Extract the TCP stream index + stream_index = packet.tcp.stream + + if ((packet.tcp.flags_syn == '1') and (packet.tcp.flags_ack == '0' )): + + # If the stream index is not already in the dictionary, add it with the client-server information + if stream_index not in client_server_info: + client_server_info[stream_index] = { + 'client_ip': packet.ip.src, + 'server_ip': packet.ip.dst + } + + # Close the capture + capture.close() + return client_server_info + +def udp_count_byte_direction_changes_sync(pcap_file): + + conversation_per_udp_stream = {} + + # Create a pyshark Capture object + capture = pyshark.FileCapture(pcap_file, display_filter='udp') + + udp_packet_count_per_stream = {} + + # Initialize a dictionary to store the byte direction change count for each TCP stream + udp_byte_direction_changes = {} + + # Initialize a dictionary to store the previous packet direction for each TCP stream + previous_packet_direction = {} + + # Initialize an empty dictionary to store the client-server information for each TCP stream + udp_client_server_info = udp_identify_client_server(pcap_file) + + # Iterate through all packets in the capture + for packet in capture: + + # Extract the TCP stream index + stream_index = packet.udp.stream + + # Initialize the byte direction change count for this stream if it doesn't exist + if stream_index not in udp_byte_direction_changes: + udp_byte_direction_changes[stream_index] = 0 + + # Initialize the UDP packet count for this stream if it doesn't exist + if stream_index not in udp_packet_count_per_stream: + udp_packet_count_per_stream[stream_index] = 0 + + # Increment the UDP packet count for this stream + udp_packet_count_per_stream[stream_index] += 1 + + # Check if the packet is in the forward direction (i.e., from client to server) + if packet.ip.src == udp_client_server_info[stream_index]['client_ip']: + current_direction = 'forward' + else: + current_direction = 'reverse' + + # Check if the previous packet direction is different from the current packet direction + if stream_index 
in previous_packet_direction and previous_packet_direction[stream_index] != current_direction: + udp_byte_direction_changes[stream_index] += 1 + + # Update the previous packet direction for this stream + previous_packet_direction[stream_index] = current_direction + + # Close the capture + capture.close() + + for stream in udp_packet_count_per_stream: + if ( udp_byte_direction_changes[stream] == 0 ): + conversation_per_udp_stream[ stream]= udp_packet_count_per_stream[stream] - (udp_byte_direction_changes[stream]) + if ( udp_byte_direction_changes[stream] == 1 ): + conversation_per_udp_stream[ stream]= udp_packet_count_per_stream[stream] - (udp_byte_direction_changes[stream]) + if ( udp_byte_direction_changes[stream] > 1 ): + conversation_per_udp_stream[ stream]= udp_packet_count_per_stream[stream] - (udp_byte_direction_changes[stream] - 1 ) + return conversation_per_udp_stream + +def count_byte_direction_changes_sync(pcap_file): + # Create a pyshark Capture object + capture = pyshark.FileCapture(pcap_file, display_filter='tcp && tcp.payload != ""') + + # Initialize a dictionary to store the byte direction change count for each TCP stream + byte_direction_changes = {} + + # Initialize a dictionary to store the previous packet direction for each TCP stream + previous_packet_direction = {} + + # Initialize an empty dictionary to store the client-server information for each TCP stream + client_server_info = identify_client_server(pcap_file) + + # Iterate through all packets in the capture + for packet in capture: + + # Extract the TCP stream index + stream_index = packet.tcp.stream + + # Initialize the byte direction change count for this stream if it doesn't exist + if stream_index not in byte_direction_changes: + byte_direction_changes[stream_index] = 0 + + # Check if the packet is in the forward direction (i.e., from client to server) + if packet.ip.src == client_server_info[stream_index]['client_ip']: + current_direction = 'forward' + else: + current_direction = 'reverse' + + # Check if the previous packet direction is different from the current packet direction + if stream_index in previous_packet_direction and previous_packet_direction[stream_index] != current_direction: + byte_direction_changes[stream_index] += 1 + + # Update the previous packet direction for this stream + previous_packet_direction[stream_index] = current_direction + + # Close the capture + capture.close() + + return byte_direction_changes + +async def count_udp_conversations(pcap_file): + # Run the synchronous code in a separate thread + loop = asyncio.get_running_loop() + conversation_count = await loop.run_in_executor(None, udp_count_byte_direction_changes_sync, pcap_file) + return conversation_count + +async def count_tcp_conversations(pcap_file): + # Run the synchronous code in a separate thread + loop = asyncio.get_running_loop() + conversation_count = await loop.run_in_executor(None, count_byte_direction_changes_sync, pcap_file) + return conversation_count + + + +def prepare_payload(json_string,item): + # Example JSON string + #json_string = '{"AppName":"Custom application Mar 14 2025 01:12","Actions":[{"Name":"Action 1","Captures":[{"CaptureId":"68","Flows":[{"AppFlowId":"all","Exchange":[]}]}]}]}' + + # Load the JSON string into a Python dictionary + data = json.loads(json_string) + + new_app_name = "CCA-"+ item[0].rsplit('.', 1)[0] + c_id = item[1] + + # Substitute AppName in the dictionary + data['AppName'] = new_app_name + data['Actions'][0]['Captures'][0]['CaptureId'] = c_id + + # Convert the dictionary back 
to a JSON string + new_json_string = json.dumps(data) + + return new_json_string + + +def format_warning_cli_issues(message, category, filename, lineno=None, line=None): + return f"{category.__name__}: {message}\n" + + +warnings.formatwarning = format_warning_cli_issues + + +class Utils: + WAP_CLIENT_ID = 'clt-wap' + + def __init__(self, controller, username="", password="", refresh_token="", license_server=None, license_user="", license_password=""): + self.controller = controller + self.host = f'https://{controller}' + self.license_server = license_server + self.license_user = license_user + self.license_password = license_password + + self.configuration = cyperf.Configuration(host=self.host, + refresh_token=refresh_token, + username=username, + password=password) + self.configuration.verify_ssl = False + self.api_client = cyperf.ApiClient(self.configuration) + self.added_license_servers = [] + + self.resource_api = cyperf.ApplicationResourcesApi(self.api_client) + + self.update_license_server() + + self.agents = {} + agents_api = cyperf.AgentsApi(self.api_client) + agents = agents_api.get_agents() + for agent in agents: + self.agents[agent.ip] = agent + + def __del__(self, time=time, datetime=datetime): + if 'time' not in sys.modules or not sys.modules['time']: + sys.modules['time'] = time + self.remove_license_server() + + def update_license_server(self): + if not self.license_server or self.license_server == self.controller: + return + license_api = cyperf.LicenseServersApi(self.api_client) + try: + response = license_api.get_license_servers() + for lServerMetaData in response: + if lServerMetaData.host_name == self.license_server: + if 'ESTABLISHED' == lServerMetaData.connection_status: + pprint(f'License server {self.license_server} is already configured') + return + license_api.delete_license_servers(str(lServerMetaData.id)) + waitTime = 5 # seconds + print(f'Waiting for {waitTime} seconds for the license server deletion to finish.') + time.sleep(5) # How can I avoid this sleep???? 
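+ # Editorial note: the fixed sleep above is a heuristic, since deletion is asynchronous on the controller. + # A more robust approach would be to poll license_api.get_license_servers() until the deleted entry disappears, mirroring the connection_status polling loop below.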
+ break + + lServer = cyperf.LicenseServerMetadata(host_name=self.license_server, + trust_new=True, + user=self.license_user, + password=self.license_password) + print(f'Configuring new license server {self.license_server}') + newServers = license_api.create_license_servers(license_server_metadata=[lServer]) + while newServers: + for server in newServers: + s = license_api.get_license_servers_by_id( + str(server.id)) + if 'IN_PROGRESS' != s.connection_status: + newServers.remove(server) + self.added_license_servers.append(server) + if 'ESTABLISHED' == s.connection_status: + print(f'Successfully added license server {s.host_name}') + else: + raise Exception(f'Could not connect to license server {s.host_name}') + time.sleep(0.5) + except cyperf.ApiException as e: + raise (e) + + def remove_license_server(self): + license_api = cyperf.LicenseServersApi(self.api_client) + for server in self.added_license_servers: + try: + license_api.delete_license_servers(str(server.id)) + except cyperf.ApiException as e: + pprint(f'{e}') + + def load_configuration_files(self, configuration_files=[]): + config_api = cyperf.ConfigurationsApi(self.api_client) + config_ops = [] + for config_file in configuration_files: + config_ops.append(config_api.start_configs_import(config_file)) + + configs = [] + for op in config_ops: + try: + results = op.await_completion() + configs += [(elem['id'], elem['configUrl']) for elem in results] + except cyperf.ApiException as e: + raise (e) + return configs + + def load_configuration_file(self, configuration_file): + configs = self.load_configuration_files([configuration_file]) + if configs: + return configs[0] + else: + return None + + def remove_configurations(self, configurations_ids=[]): + config_api = cyperf.ConfigurationsApi(self.api_client) + for config_id in configurations_ids: + config_api.delete_configs(config_id) + + def remove_configuration(self, configurations_id): + self.remove_configurations([configurations_id]) + + def create_session_by_config_name(self, configName): + configsApiInstance = cyperf.ConfigurationsApi(self.api_client) + appMixConfigs = configsApiInstance.get_configs(search_col='displayName', search_val=configName) + if not len(appMixConfigs): + return None + + return self.create_session(appMixConfigs[0].config_url) + + def create_session(self, config_url): + session_api = cyperf.SessionsApi(self.api_client) + session = cyperf.Session() + session.config_url = config_url + sessions = session_api.create_sessions([session]) + if len(sessions): + return sessions[0] + else: + return None + + def delete_session(self, session): + session_api = cyperf.SessionsApi(self.api_client) + test = session_api.get_session_test(session_id=session.id) + if test.status != 'STOPPED': + self.stop_test(session) + session_api.delete_session(session.id) + + def assign_agents(self, session, agent_map, augment=False): + # Assign agents to the individual network segments based on the input provided + for net_profile in session.config.config.network_profiles: + for ip_net in net_profile.ip_network_segment: + if ip_net.name in agent_map: + mapped_ips = agent_map[ip_net.name] + agent_details = [cyperf.AgentAssignmentDetails(agent_id=self.agents[agent_ip].id, id=self.agents[agent_ip].id) for agent_ip in mapped_ips if agent_ip in self.agents] # why do we need to pass agent_id and id both???? + if not ip_net.agent_assignments: + ip_net.agent_assignments = cyperf.AgentAssignments(ByID=[], ByTag=[]) # Why is ByTag argument a must????
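+ # Editorial note: with augment=False (the default) the mapped agents replace any existing static assignment for this segment; with augment=True they are appended to by_id. + # Hypothetical usage: utils.assign_agents(session, {'IP Network 1': ['192.168.0.11'], 'IP Network 2': ['192.168.0.12']}) - keys are IP network segment names, values are lists of agent IPs known to this Utils instance.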
+ + if augment: + ip_net.agent_assignments.by_id.extend(agent_details) + else: + ip_net.agent_assignments.by_id = agent_details + ip_net.update() + + def disable_automatic_network(self, session): + for net_profile in session.config.config.network_profiles: + for ip_net in net_profile.ip_network_segment: + ip_net.ip_ranges[0].ip_auto = False + ip_net.update() + + def add_apps(self, session, appNames): + # Retrieve the app from precanned Apps + resource_api = cyperf.ApplicationResourcesApi(self.api_client) + app_info = [] + for appName in appNames: + apps = resource_api.get_resources_apps(search_col='Name', search_val=appName) + if not len(apps): + print(f'Couldn\'t find any {appName} app.') + raise Exception(f'Couldn\'t find \'{appName}\' app') + + # Add the app to the App-Mix, which may be empty until now. + app_info.append(cyperf.Application(external_resource_url=apps[0].id, objective_weight=1)) + + if not session.config.config.traffic_profiles: + session.config.config.traffic_profiles.append(cyperf.ApplicationProfile(name="Application Profile")) + session.config.config.traffic_profiles.update() + app_profile = session.config.config.traffic_profiles[0] + app_profile.applications += app_info + app_profile.active = True + app_profile.update() + app_profile.applications.update() + + def add_apps_with_weights(self, session, app_dict): + resource_api = cyperf.ApplicationResourcesApi(self.api_client) + app_info = [] + for appName in app_dict.keys(): + apps = resource_api.get_resources_apps(search_col='Name', search_val=appName) + if not len(apps): + print(f'Couldn\'t find any {appName} app.') + raise Exception(f'Couldn\'t find \'{appName}\' app') + + # The weight used here is a placeholder; the real weights from app_dict are applied below. + app_info.append(cyperf.Application(external_resource_url=apps[0].id, objective_weight=7)) + + if not session.config.config.traffic_profiles: + session.config.config.traffic_profiles.append(cyperf.ApplicationProfile(name="Application Profile")) + session.config.config.traffic_profiles.update() + + app_profile = session.config.config.traffic_profiles[0] + + #It is very important to forcefully update the application Profile + app_profile.active = True + app_profile.update() + #Now update the applications. The order of updates is very important: you must always update the parent, then update the child. +#The update is not recursive as of now; it may become so in the future.
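+ # Editorial note, illustrating the required order: app_profile.update() first (parent), then app_profile.applications.update() (child); reversing the order may leave the parent's changes unsaved.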
app_profile.applications.extend(app_info) + app_profile.applications.update() + + #Update the weights of the individual applications + for appName in app_dict: + + apps = resource_api.get_resources_apps(search_col='Name', search_val=appName) + if not len(apps): + print(f'Couldn\'t find any {appName} app.') + raise Exception(f'Couldn\'t find \'{appName}\' app') + + for i in range(len(app_profile.applications)): + + if((app_profile.applications[i].name).lower() == appName.lower()): + app_profile.applications[i].objective_weight = app_dict[appName] + app_profile.applications[i].update() + break + + def get_session(self, session): + session_api = cyperf.SessionsApi(self.api_client) + return session_api.get_session_by_id(session.id) + + def get_apps(self, session): + resource_api = cyperf.ApplicationResourcesApi(self.api_client) + cyperf_apps = [] + try: + api_response = resource_api.get_resources_apps() + for index in range(len(api_response)): + cyperf_apps.append(api_response[index].name) + + return cyperf_apps + except Exception as e: + print("Exception when calling ApplicationResourcesApi->get_resources_apps: %s\n" % e) + + def add_app(self, session, appName): + self.add_apps(session, [appName]) + + async def upload_the_capture_file(self, pcap_file): + #Validate the pcap first. + #"turns" means the number of times the direction of bytes changes in a TCP stream. + turns_per_stream = await count_tcp_conversations(pcap_file) + udp_conversations_per_stream = await count_udp_conversations(pcap_file) + #Update the values in the dictionary such that turns are replaced by conversations for each tcp stream. The UDP part is already taken care of. + conversation_per_stream = turns_coversations_dict_values(turns_per_stream) + + #Total number of TCP conversations / exchanges - summation across all streams + total_number_of_conversations = sum(value for value in conversation_per_stream.values()) + + #Total number of UDP conversations / exchanges - summation across all streams + udp_total_number_of_conversations = sum(value for value in udp_conversations_per_stream.values()) + + if( total_number_of_conversations > 0 ): + print("\nThe number of tcp conversations for the file {} is {}".format((os.path.basename(pcap_file)), total_number_of_conversations)) + if(udp_total_number_of_conversations > 0 ): + print("\nThe number of udp conversations for the file {} is {}".format((os.path.basename(pcap_file)), udp_total_number_of_conversations)) + + if ((total_number_of_conversations + udp_total_number_of_conversations) > 10000): + print(f"The pcap file - {pcap_file} has more than 10000 exchanges/conversations, which is not supported by CyPerf. \n The max number of conversations supported is 10000.\n This file will be skipped!") + return
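+ # The upload is asynchronous: start_resources_captures_upload_file returns an operation handle whose id is polled via poll_resources_captures_upload_file below until the state reports SUCCESS.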
print(f"\nStarting upload of capture file {os.path.basename(pcap_file)} to the CyPerf Resource Library.") + + response = self.resource_api.start_resources_captures_upload_file(pcap_file) + + flag = True + + while flag: + res = self.resource_api.poll_resources_captures_upload_file(upload_file_id=str(response.id),_request_timeout=300) + if res.state == "SUCCESS": + print("File {} is uploaded SUCCESSFULLY!\n".format(os.path.basename(pcap_file))) + flag = False + else: + print("Uploading file {} is {}".format(os.path.basename(pcap_file), res.state)) + time.sleep(3) + + def create_apps_from_captures(self): + list_of_captures = self.resource_api.get_resources_captures() + user_uploaded_list_of_captures = [(x.name, x.id) for x in list_of_captures if x.owner_id != 'system'] + for item in user_uploaded_list_of_captures: + #Create the json payload + json_string = '{"AppName":"Custom","Actions":[{"Name":"Super-Action","Captures":[{"CaptureId":"00","Flows":[{"AppFlowId":"all","Exchange":[]}]}]}]}' + pp = prepare_payload(json_string, item) + flag = True + create_app_operation_instance = cyperf.CreateAppOperation.from_json(pp) + response = self.resource_api.start_resources_create_app(create_app_operation_instance) + while flag: + res = self.resource_api.poll_resources_create_app(id=response.id,_request_timeout=300) + if res.state == "SUCCESS": + print("App from {} is created SUCCESSFULLY!\n".format(item[0])) + flag = False + else: + print("App from {} - creation is {}".format(item[0], res.state)) + time.sleep(1) + return True + + def set_objective_and_timeline(self, session, + objective_type=cyperf.ObjectiveType.SIMULATED_USERS, + objective_unit=cyperf.ObjectiveUnit.EMPTY, + objective_value=100, + test_duration=600): + primary_objective = session.config.config.traffic_profiles[0].objectives_and_timeline.primary_objective + primary_objective.type = objective_type + primary_objective.unit = objective_unit + primary_objective.update() # How will the customer know that update() has to be called twice (separately)???? + + for segment in primary_objective.timeline: # How will the customer know that primary_objective.timeline has to be updated instead of objectives_and_timeline.timeline_segments???? + if segment.enabled and (segment.segment_type == cyperf.SegmentType.STEADYSEGMENT or segment.segment_type == cyperf.SegmentType.NORMALSEGMENT): + segment.duration = test_duration + segment.objective_value = objective_value + segment.objective_unit = objective_unit + primary_objective.update() + + def start_test(self, session): + test_ops_api = cyperf.TestOperationsApi(self.api_client) + test_start_op = test_ops_api.start_start_traffic(session_id=session.id) + try: + test_start_op.await_completion() + except cyperf.ApiException as e: + raise (e) # The error shown in the GUI is not sent back to the API caller, why???? + + def wait_for_test_stop(self, session, test_monitor=None): + session_api = cyperf.SessionsApi(self.api_client) + monitored_at = None + wait_interval = 0.5 + while True: + test = session_api.get_test(session_id=session.id) + if 'STOPPED' == test.status: # Why don't we have an enum here????
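+ # The test status is exposed as a plain string rather than an enum; 'STOPPED' marks completion - the same value the removed sample script polled for.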
+    def wait_for_test_stop(self, session, test_monitor=None):
+        session_api = cyperf.SessionsApi(self.api_client)
+        monitored_at = None
+        wait_interval = 0.5
+        while True:
+            test = session_api.get_test(session_id=session.id)
+            # NOTE: test.status is a plain string; there is no enum for it.
+            if test.status == 'STOPPED':
+                break
+            if test_monitor:
+                if monitored_at:
+                    monitor_start = monitored_at + 1
+                else:
+                    monitor_start = 0
+                # Anything less than monitor_start means "up to the latest".
+                monitor_upto = monitor_start - 1
+                monitored_at = test_monitor(test, monitor_start, monitor_upto)
+            time.sleep(wait_interval)
+
+    def stop_test(self, session):
+        test_ops_api = cyperf.TestOperationsApi(self.api_client)
+        test_stop_op = test_ops_api.start_stop_traffic(session_id=session.id)
+        try:
+            test_stop_op.await_completion()
+        except cyperf.ApiException:
+            raise
+
+    def collect_stats(self, test, stats_name, time_from, time_to, stats_processor=None):
+        stats_api = cyperf.StatisticsApi(self.api_client)
+        stats = stats_api.get_stats(test.test_id)
+        stats = [stat for stat in stats if stats_name in stat.name]
+        if time_from:
+            if time_to and time_to > time_from:
+                stats = [stats_api.get_stats_by_id(test.test_id, stat.name, var_from=time_from, to=time_to) for stat in stats]
+            else:
+                stats = [stats_api.get_stats_by_id(test.test_id, stat.name, var_from=time_from) for stat in stats]
+        else:
+            stats = [stats_api.get_stats_by_id(test.test_id, stat.name) for stat in stats]
+        if stats_processor:
+            stats = stats_processor(stats)
+        return stats
+
+    def format_milliseconds(self, milliseconds):
+        seconds = int(milliseconds / 1000) % 60
+        minutes = int(milliseconds / (1000 * 60)) % 60
+        hours = int(milliseconds / (1000 * 60 * 60)) % 24
+        return f'{hours:02d}H:{minutes:02d}M:{seconds:02d}S'
+
+    @staticmethod
+    def is_valid_ipv4(ip):
+        try:
+            socket.inet_aton(ip)
+        except Exception:
+            return False
+        return True
+
+    @staticmethod
+    def is_valid_ipv6(ip):
+        try:
+            socket.inet_pton(socket.AF_INET6, ip)
+        except Exception:
+            return False
+        return True
+
+    def format_stats_dict_as_table(self, stats_dict=None):
+        if not stats_dict:
+            return
+
+        stat_names = stats_dict.keys()
+        col_widths = [max(len(str(val)) + 2 for val in val_list + [stat_name]) for stat_name, val_list in stats_dict.items()]
+        header = '|'.join([f'{name:^{col_width}}' for name, col_width in zip(stat_names, col_widths)])
+        line_delim = '+'.join(['-' * col_width for col_width in col_widths])
+
+        lines = ['|'.join([f'{val:^{col_width}}' for val, col_width in zip(item, col_widths)]) for item in zip(*stats_dict.values())]
+        return [line_delim, header, line_delim] + lines + [line_delim]
+
+    def search_configuration_file(self, name):
+        try:
+            api_instance = cyperf.ConfigurationsApi(self.api_client)
+            api_response = api_instance.get_configs()
+            for config in api_response:
+                if config.to_dict()['displayName'] == name:
+                    print('The configuration was found and it will be loaded now.')
+                    return True
+            print('The configuration was not found and will not be loaded. Provide an existing configuration name.')
+            return False
+        except Exception as e:
+            print('Exception when calling ConfigurationsApi->get_configs: %s\n' % e)
+
+
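+# Usage sketch for parse_cli_options() below; the --pcap-dir option is a
+# hypothetical illustration:
+#
+#     args, offline_token = parse_cli_options(extra_options=[
+#         ('--pcap-dir', 'Directory containing capture files to upload', True),
+#     ])
+#
+# Each extra option is an (option, help, required) triple, matching the
+# unpacking loop inside parse_cli_options(). If --user and --password are not
+# both supplied, the function falls back to the CYPERF_OFFLINE_TOKEN
+# environment variable.
+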
+def parse_cli_options(extra_options=()):
+    import argparse
+
+    parser = argparse.ArgumentParser(description='A simple UDP test')
+    parser.add_argument('--controller', help='The IP address or the hostname of the CyPerf controller', required=True)
+    parser.add_argument('--user', help='The username for accessing the controller, needs a password too')
+    parser.add_argument('--password', help='The password for accessing the controller, needs a username too')
+    parser.add_argument('--license-server', help='The IP address or the hostname of the license server, default is the controller')
+    parser.add_argument('--license-user', help='The username for accessing the license server, needed if the controller is not the license server')
+    parser.add_argument('--license-password', help='The password for accessing the license server, needed if the controller is not the license server')
+    for option, help_text, required in extra_options:
+        parser.add_argument(option, help=help_text, required=required)
+    args = parser.parse_args()
+
+    if not args.license_server or args.license_server == args.controller:
+        args.license_server = args.controller
+        args.license_user = None
+        args.license_password = None
+    elif not args.license_user or not args.license_password:
+        parser.error('--license-user and --license-password are mandatory if a different --license-server is provided')
+
+    if args.user and args.password:
+        offline_token = None
+    else:
+        if args.user or args.password:
+            warnings.warn('Only one of --user and --password is provided, looking for an offline token')
+        try:
+            offline_token = os.environ['CYPERF_OFFLINE_TOKEN']
+        except KeyError as e:
+            parser.error(f'Couldn\'t find environment variable {e}')
+
+    return args, offline_token
\ No newline at end of file