Generally speaking, Azure DevOps offers three ways of authenticating:
1. Microsoft Entra ID (with an Entra token)
2. Personal Access Tokens (PATs)
3. Azure DevOps OAuth

In this project all tools should use only the first one. An Entra token is very long compared to a PAT. There are two ways of getting such a token:
1. Use a service connection with Workload Identity Federation, for scripts that run in pipelines in the Azure cloud.
2. Get it from AzCli, for scripts that run on Microsoft developers' machines.

Some tools, such as nuget.exe, need to support both scenarios. We are still exploring how to make nuget.exe work well with Entra ID auth (we need to find a way to support cross-org NuGet publishing).
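For the local-development case, fetching an Entra token through AzCli can be sketched roughly as follows. The helper name is illustrative; the GUID is the well-known Azure DevOps application ID used as the token resource/audience:

```python
# Sketch of fetching an Entra access token for Azure DevOps via AzCli.
# Assumption: 499b84ac-1321-427f-aa17-267ca6975798 is the well-known
# Azure DevOps application ID used as the token resource/audience.
import json
import subprocess

ADO_RESOURCE_ID = "499b84ac-1321-427f-aa17-267ca6975798"

def get_entra_token() -> str:
    """Return a bearer token from the locally logged-in AzCli session."""
    result = subprocess.run(
        ["az", "account", "get-access-token", "--resource", ADO_RESOURCE_ID],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)["accessToken"]
```

The returned token is then sent as an `Authorization: Bearer <token>` header on REST calls to dev.azure.com.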
## delete_ado_pipeline.py
Prerequisites: Install AzCli and get logged in.
This script is designed to completely delete an Azure DevOps pipeline definition and all of its associated data.
The process is as follows:
1. Asynchronously finds and deletes ALL retention leases associated with the pipeline.
2. Asynchronously deletes ALL build runs (history) for the pipeline.
3. Deletes the pipeline definition itself.
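The three steps above can be sketched with the Azure DevOps Build REST API (api-version 7.1). This is a hedged illustration, not the script's actual code; `session` stands for any requests.Session-like object whose calls carry an Entra bearer token:

```python
# Hedged sketch of the three-step deletion via the Azure DevOps Build
# REST API (api-version 7.1). Names here are illustrative.
API = "api-version=7.1"

def delete_pipeline(session, org: str, project: str, definition_id: int) -> None:
    base = f"https://dev.azure.com/{org}/{project}/_apis/build"

    # 1. Remove all retention leases pinning runs of this definition.
    leases = session.get(
        f"{base}/retention/leases?definitionId={definition_id}&{API}"
    ).json().get("value", [])
    if leases:
        ids = ",".join(str(lease["leaseId"]) for lease in leases)
        session.delete(f"{base}/retention/leases?ids={ids}&{API}")

    # 2. Delete every build run recorded for the definition.
    builds = session.get(
        f"{base}/builds?definitions={definition_id}&{API}"
    ).json().get("value", [])
    for build in builds:
        session.delete(f"{base}/builds/{build['id']}?{API}")

    # 3. Delete the pipeline definition itself.
    session.delete(f"{base}/definitions/{definition_id}?{API}")
```

The order matters: retention leases block build deletion, and remaining builds block deleting the definition.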
## upload_symbol.py
This script downloads ONNX Runtime Windows release artifacts for a specified version from GitHub, extracts the PDB symbol files, and uploads them to your Azure DevOps symbol server, simplifying debugging for teams that host symbols there.
This script requires symbols.exe, which can be downloaded by using the following Azure REST API: https://learn.microsoft.com/en-us/rest/api/azure/devops/symbol/client/get?view=azure-devops-rest-7.1
symbols.exe handles authentication.
## run_packaging_pipelines.py
This script triggers Azure DevOps pipelines for a specific GitHub pull request or git branch. ONNX Runtime's release managers should use it to trigger packaging pipelines.
You may also use it to trigger pull request pipelines for external PRs.
### Prerequisites
1. Install AzCli and get logged in.
2. pip install pyyaml
### Usage
It supports two modes:
1. CI Build Mode (Default):
- Triggers pipelines in the 'Lotus' project.
- Filters pipelines based on the following criteria:
- Repository Association: Must be 'https://github.com/microsoft/onnxruntime'.
- Recent Activity: Must have run in the last 30 days.
- Pipeline Type: Must be YAML-based.
- Trigger Type: Must NOT be triggered by another pipeline resource.
- Template Requirement: Must extend from 'v1/1ES.Official.PipelineTemplate.yml@1esPipelines'.
2. Pull Request (PR) Mode:
- Activated by using the '--pr <ID>' argument.
- Triggers pipelines in the 'PublicPackages' project.
   - Filters pipelines based on simplified criteria:
- Repository Association: Must be 'https://github.com/microsoft/onnxruntime'.
- Recent Activity: Must have run in the last 30 days.
- Pipeline Type: Must be YAML-based.
The script also includes a feature to cancel any currently running builds for a matching pipeline on the target branch/PR before queuing a new one.
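Under the hood, queuing a run in either mode comes down to pointing the run at the right git ref in the Pipelines Runs REST request. A hedged sketch (the helper is illustrative; PR builds target the `refs/pull/<ID>/merge` ref that Azure DevOps creates for a pull request):

```python
# Illustrative helper: build the JSON body for
# POST https://dev.azure.com/{org}/{project}/_apis/pipelines/{id}/runs
def build_run_request(pr_id=None, branch=None):
    if pr_id is not None:
        # The merge commit Azure DevOps builds for a pull request.
        ref = f"refs/pull/{pr_id}/merge"
    else:
        ref = f"refs/heads/{branch}"
    return {"resources": {"repositories": {"self": {"refName": ref}}}}
```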
Often a support request will provide only a problematic model and no input data. create_test_dir can be used to create input so the model can be debugged more easily. Random input is generated if none is provided. If expected output is not provided, the model is run with the input and its output is saved as the expected output.
To execute the test once the directory is created, use the onnx_test_runner or onnxruntime_perf_test executables (if you have built ONNX Runtime from source) or the run_test_dir helper. The input can be either the test directory or the model, in case the test directory contains multiple models.
```
  --channels_last       Transpose image from channels first to channels last.
  --add_batch_dim       Prepend a batch dimension with value of 1 to the shape. i.e. convert from CHW to NCHW

random_to_pb:
  random_to_pb specific options

  --shape SHAPE         Provide the shape as comma separated values e.g. --shape 200,200
  --datatype DATATYPE   numpy dtype value for the data type. e.g. f4=float32, i8=int64.
                        See: https://docs.scipy.org/doc/numpy/reference/arrays.dtypes.html
  --min_value MIN_VALUE
                        Limit the generated values to this minimum.
  --max_value MAX_VALUE
                        Limit the generated values to this maximum.
  --seed SEED           seed to use for the random values so they're deterministic.
```
## dump_subgraphs.py
If you're investigating a model with control flow nodes (Scan/Loop/If) the subgraphs won't be displayed in Netron. Run dump_subgraphs to dump the subgraphs as .onnx files that can be viewed individually.
```
usage: dump_subgraphs.py [-h] -m MODEL [-o OUT]

Dump all subgraphs from an ONNX model into separate onnx files.

optional arguments:
  -h, --help            show this help message and exit
  -m MODEL, --model MODEL
                        model file
  -o OUT, --out OUT     output directory (default: <current directory>)
```
Convert an ONNX model to a TensorBoard events file so that the model can be visualized in TensorBoard. This is especially useful for debugging large models that cannot be visualized in Netron.