From 263ef61cb86fce0209083608c728a0017b3b52d9 Mon Sep 17 00:00:00 2001
From: Chris Sommers <31145757+chrispsommers@users.noreply.github.com>
Date: Tue, 2 Aug 2022 16:42:47 -0700
Subject: [PATCH] saithrift client & server integration with DASH bmv2 (Fix #131,158) (#164)

* Build sai-thrift server (not working yet). New dockerfile.
* Removed extraneous step.
* README
* Move Dockerfile under dockerfiles/
* Pass flags to saithrift-build, expecting https://github.com/opencomputeproject/SAI/pull/1514 to merge eventually. Meanwhile can locally modify SAI makefiles.
* Fix CI files for new location of Dockerfile.bmv2, add CI for dash-saithrift docker.
* Fix CI workflow for new dockerfiles.
* Add line to trigger CI.
* Restore CI filename, try to get it to trigger.
* Rename CI file; modify Dockerfile to trigger; add docker targets to Makefile.
* Modify dockerfiles to trigger CI.
* Fix make target names in CI and Makefile
* Update CI to trigger on CI file changes itself.
* Trigger on CI file change.
* Remove thrift source from dash-saithrift-bldr docker image. New tag 220625.
* Enhance saithrift docker to handle dash sai header generation and libsai compilation.
* Remove thrift source and source tarball from docker image; copy python dist tarball to /usr/lib inside container so it can be extracted for deployment to a host.
* Experimenting with reduced Dockerfiles.
* Makefile: p4 source dependencies. dash_pipeline.p4: egress_spec=ingress_spec, comments.
* Fix stray merge conflict.
* Fix merge conflict in CI file.
* Optimized docker build and run images by creating a dedicated gRPC library image and selectively copying to BMV2 builder/runtime image based on available p4lang docker image.
* Fix CI, Makefile for new docker images.
* Optimize dockerfiles, reduced to 3 images totalling ~2.7G, down from ~10G.
* Fix CI files, add Make targets; cleanup.
* CI files, badge
* CI files.
* Split up READMEs into smaller files. Diagram.
* READMEs, diagram, SHA version of p4c
* Add missing CI workflow (p4c); update main CI workflow to trigger on any dockerfile change.
* Fix make command in CI file.
* Pin the SHA version of p4lang/behavioral-model:latest.
* Pin SHA versions of Ubuntu used in base images.
* READMEs - workflows etc.
* README typos, clarifications.
* README
* Use dev fork of SAI which passes flags to gensairpc.pl per https://github.com/opencomputeproject/SAI/pull/1514. Once merged, we can change back to upstream branch of SAI.
* Use fork of SAI with fixes for sairpcgen Make flags, and sairpcgen templates which add static_cast to avoid type mismatches due to sai extensions.
* Add placeholder code for SAI fixed functions, -fpermissive to avoid enum Long error. TODO: P4RT libs missing, otherwise sai-thrift server builds.
* Progress on sai-thrift server, final saiserver linkage still has errors but was able to pass grpc/proto/pi libs into make.
* Created init_switch test program and make target to initialize bmv2 model (P4Runtime SetForwardingPipelineConfig) to allow explicit switch initialization via P4Runtime. Removed sudo from most Make commands. READMEs.
* Remove dependency to avoid rebuilding files while running test in CI. Delete stray file.
* README
* Build saithrift-bldr in Ubuntu-20.04 to avoid glibc conflicts with libgrpc etc. Modify sai-thrift Makefile to accommodate Python3 vs Python2. TODO - saiserver builds but get errors in later stages of make saithrift-build target (ctypesgen...)
* Can build and run saithrift server. Fixed makefiles to call SAI makes with proper env vars etc., fixed install steps.
* Add sai-thrift-server to CI job.
* saithrift server starts up properly - added volume mounts to p4info
* Add DOCKER_FLAGS for sai targets
* Add sai-thrift-server CI test.
* Workaround for git repo ownership in CI.
* Correct trigger path for CI.
* Workaround for SAI submodule - fixed step name.
* Change dir into SAI/SAI to make safe directory
* Try safe directory from /dash
* Remove -u root from CI flags, remove git safe directory.
* Move lib dir creation to Makefile, remove from .py script - permission errors during CI.
* Add back --privileged flag
* Add -u root to make sai in CI
* Change CI docker flags, use sudo for python script to overcome perms.
* Update SAI branch which fixes sairpcgen Makefiles (sai header include path).
* Remove experimental sudo from bash script, clean up overwrite options from python script (leave such things to makefiles.) Still struggling with perms issues in CI script.
* Try sudo in CI make sai step.
* Try -u root in CI file and git config --global --add safe.directory in Makefile
* Use "git config --global --add safe.directory /dash/dash-pipeline/SAI/SAI" per error message, even though it doesn't match actual path IMO.
* Try git config --global --add safe.directory '*' in Makefile.
* Temporarily skip make sai step, see if rest of pipeline runs.
* Restore CI script, try sudo workarounds in Makefile.
* fix chmod command
* Remove -u root, use sudo calling python script.
* fix chmod to a+rw and also commit bash script.
* chown and git safe dirs script attempt
* pass user to sudo
* export user
* Fix chown
* More wrappers and scripts to solve CI git ownership issues.
* Remove chown to try to fix "./checkenumlock.sh ...fatal: Unable to create '/dash/.git/modules/SAI/index.lock': Permission denied"
* remove git safe dirs, getting lockfile error in CI
* Remove sudo calling python script; shotgun sudo for all CI steps.
* Restore sudo for py script; add -u root, --privileged to flags in CI
* Workaround, delete scripts in SAI/meta/Makefile which throw Git permission errors.
* Add sudo for sai-thrift-server target due to -u roots above. Got past git repo errors via previous commit, yay!
* Use sudo to avoid permissions problem in CI pipeline for SAI/rpc dir
* chown in sai-thrift Makefile to overcome CI perms issues.
* Change perms instead of chown in saithrift build.
* Add missing DOCKER_FLAGS to sai-thrift-server target
* Add DOCKER_FLAGS to CI step; remove obsolete wrapper scripts.
* Pass DOCKER_FLAGS to make target for sai-thrift-server test
* Use different SAI submodule branch (better name, synced to master). Add saithrift clean target to clean.
* Remove extraneous permissions workarounds.
* Update SAI submodule.
* Dockerfile for saithrift test clients.
* Add CI for saithrift client docker
* Add prerequisite build steps for sai-thrift-client. Update drawing.
* Build prerequisites first to make docker-saithrift-client
* Add missing requirements.txt, make targets.
* Progress with saithrift-client docker image, pytest, ptf test installation and skeletons.
* Refactor pytest to use common thrift Client class; run sai client tests in CI. Note - need sai_api_query impl to avoid sai thrift server crashes.
* Set docker flags for saithrift client test in CI
* Docs, diagrams. saithrift workflows.
* Use updated SAI dev branch to insert checks for NULL sai API funcs before calling them. Add simple test for switch attribute to prove thrift API working.
* Hand-modified sai_api_query to return SAI_API_DASH_VNET apis. Should be auto-generated.
* Use workaround for saithrift builder per https://github.com/opencomputeproject/SAI/issues/1537. Reorganize nascent pytests to use fixtures to get saithrift client, put in directories, add custom markers, new README.
* mark vnet test x-fail for CI
* Update saithrift client docker image.
* Add empty README for saithrift/ptf
* Clean up make targets.
* Permissions fixes (manually, not merged). Add gdb to sai-thrift-server docker.
* Fix sai-thrift-server; got first RPC working (vnet). Slight refactoring of pytests (imports); fixed sai_api_query & made _impl structs non-static.
* Fix path for sai-thrift-server according to new -w path in Makefile
* Add CI triggers for all .py, .sh, .yml and requirements.txt under test/ dir
* Update to snappi-0.7.38 for fix https://github.com/Azure/DASH/issues/153
* Temporary workaround for container permissions issues per https://github.com/Azure/DASH/issues/143. Executes chmod as required. Permanent fix will require some Docker mods.
* Update snappi to 0.7.38 in saithrift tests
* Fix dangling merge conflicts in README.
* s/sai-thrift/saithrift for consistency.
* Delete sai_test which was accidentally copied from SAI repo.
* Fix https://github.com/Azure/DASH/issues/158 - libsai delete operation failure and log msg. The test for the error code was backwards and the log message used the same write updateType string for all operations. I fixed the compare and used a dynamic enum print method.
* Modify vnet pytest case to create/delete. README. Makefile polishing.
* Add more table create/deletes + cleanup to saithrift vnet test case
* Assign container names in ixia-c deployments (append username)
* Got preliminary PTF and Pytests including saithrift table accessors, packet echo tests. Removed ixia-c test as standalone test; now it runs as a pytest inside saithrift-client container so host tool installs are reduced (snappi, python, pip).
* Add back missing line to build saithrift server.
* Fix CI trigger, a prior merge messed this up.
* Add deploy-ixiac to Makefile deps and CI steps to support pytests.
* Refactor saithrift-client docker image into builder and runtime images; runtime client is always built locally, not pulled. Change python requirements to hardlink to DASH/test/requirements.txt. Change Dockerfile name for consistency.
* Create tests/libsai dir and move "C++" tests there; fix typo in CI file.
* Defer deploying ixia-c until PTF tests complete to see if CI tests fail less often, suspect CI runner limits are being strained. Tests fail sometimes w/o reason.
* Moved saithrift tests under tests/ for consistency.
* Fix path and paste errors in scripts which run tests.
* Move requirements.txt to hardlink under tests (was under saithrift). Delete obsolete file.
* READMEs (saithrift test framework), Makefile (fix dev tests).
* READMEs - test workflows.
* Fix paste error - README
* READMEs - workflows. Diagram. Remove "make run-test," replaced by "make libsai-test."
* Fix CI action; READMEs polish.
* Minor README improvements; fix URL, diagrams (sirius->dash); improve Quick-start instructions.
* Add back URL.
* Remove python, pip install deps.
* READMEs; move saithrift test scripts into subdirs; change verify_packets() to verify_packet() to avoid failing due to junk entering veths from host network.
* Change CI job title Sirius->DASH
* Update SAI submodule to rescind changes to SAI include paths for sairpcgen, wasn't needed after all.
Co-authored-by: Chris Sommers --- .github/workflows/dash-bmv2-ci.yml | 42 ++- .../dash-saithrift-client-bldr-docker.yml | 50 +++ .../dash-saithrift-client-docker.yml | 51 +++ .github/workflows/dash-saithrift-docker.yml | 8 +- .gitignore | 5 +- .gitmodules | 2 +- assets/CI-badge-failing.svg | 18 +- assets/CI-badge-passing.svg | 12 +- assets/scapy-icon.png | Bin 0 -> 4963 bytes dash-pipeline/.dockerignore | 6 +- dash-pipeline/Dockerfile | 120 ------ dash-pipeline/Makefile | 248 ++++++++++--- dash-pipeline/README-common-errors.md | 23 ++ dash-pipeline/README-dash-ci.md | 13 +- dash-pipeline/README-dash-docker.md | 4 +- dash-pipeline/README-dash-workflows.md | 204 ++++++++-- dash-pipeline/README-ptftests.md | 68 ++++ dash-pipeline/README-pytests.md | 133 +++++++ dash-pipeline/README-saithrift.md | 348 ++++++++++++++++++ dash-pipeline/README.md | 106 +++--- dash-pipeline/SAI/README.md | 7 + dash-pipeline/SAI/SAI | 2 +- dash-pipeline/SAI/generate_dash_api.sh | 3 +- dash-pipeline/SAI/sai_api_gen.py | 12 +- dash-pipeline/SAI/saithrift/Makefile | 53 ++- dash-pipeline/SAI/saithrift/Makefile.old | 45 --- dash-pipeline/SAI/templates/Makefile.j2 | 31 +- dash-pipeline/SAI/templates/saiapi.cpp.j2 | 5 +- dash-pipeline/SAI/templates/utils.cpp.j2 | 83 ++++- dash-pipeline/bmv2/dash_pipeline.p4 | 7 + dash-pipeline/dockerfiles/.dockerignore | 6 +- dash-pipeline/dockerfiles/Dockerfile.p4c-bmv2 | 22 +- ...le.saithrift => Dockerfile.saithrift-bldr} | 47 ++- .../dockerfiles/Dockerfile.saithrift-client | 39 ++ .../Dockerfile.saithrift-client-bldr | 37 ++ .../images/dash-p4-bmv2-thrift-workflow.svg | 2 +- dash-pipeline/tests/{ => libsai}/Makefile | 0 dash-pipeline/tests/libsai/README.md | 2 + .../tests/{ => libsai}/init_switch/.gitignore | 0 .../tests/{ => libsai}/init_switch/Makefile | 2 +- .../{ => libsai}/init_switch/init_switch.cpp | 0 .../tests/{ => libsai}/vnet_out/.gitignore | 0 .../tests/{ => libsai}/vnet_out/Makefile | 2 +- .../tests/{ => libsai}/vnet_out/vnet_out.cpp | 29 +- dash-pipeline/tests/requirements.txt | 2 + dash-pipeline/tests/saithrift/README.md | 8 + dash-pipeline/tests/saithrift/ptf/README.md | 4 + .../saithrift/ptf/run-saithrift-ptftests.sh | 5 + .../ptf/thrift/test_thrift_session.py | 83 +++++ .../saithrift/ptf/vnet/test_saithrift_vnet.py | 71 ++++ .../tests/saithrift/pytest/conftest.py | 13 + .../saithrift/pytest/echo/test_echo_port.py | 148 ++++++++ .../tests/saithrift/pytest/pytest.ini | 6 + .../saithrift/pytest/run-saithrift-pytests.sh | 2 + .../saithrift/pytest/saithrift_rpc_client.py | 30 ++ .../pytest/switch/test_saithrift_switch.py | 15 + .../pytest/thrift/test_saithrift_session.py | 11 + .../pytest/vnet/test_saithrift_vnet.py | 73 ++++ test/docs/testbed/README.testbed.Overview.md | 2 +- test/images/dash-test-wflow-p4-saithrift.svg | 2 +- test/images/dash-test-wflow-saithrift.svg | 2 +- .../deployment/ixia-c-deployment.yml | 3 + 62 files changed, 1956 insertions(+), 421 deletions(-) create mode 100644 .github/workflows/dash-saithrift-client-bldr-docker.yml create mode 100644 .github/workflows/dash-saithrift-client-docker.yml create mode 100644 assets/scapy-icon.png delete mode 100644 dash-pipeline/Dockerfile create mode 100644 dash-pipeline/README-common-errors.md create mode 100644 dash-pipeline/README-ptftests.md create mode 100644 dash-pipeline/README-pytests.md create mode 100644 dash-pipeline/README-saithrift.md delete mode 100644 dash-pipeline/SAI/saithrift/Makefile.old rename dash-pipeline/dockerfiles/{Dockerfile.saithrift => Dockerfile.saithrift-bldr} (54%) create mode 100644 
dash-pipeline/dockerfiles/Dockerfile.saithrift-client create mode 100644 dash-pipeline/dockerfiles/Dockerfile.saithrift-client-bldr rename dash-pipeline/tests/{ => libsai}/Makefile (100%) create mode 100644 dash-pipeline/tests/libsai/README.md rename dash-pipeline/tests/{ => libsai}/init_switch/.gitignore (100%) rename dash-pipeline/tests/{ => libsai}/init_switch/Makefile (92%) rename dash-pipeline/tests/{ => libsai}/init_switch/init_switch.cpp (100%) rename dash-pipeline/tests/{ => libsai}/vnet_out/.gitignore (100%) rename dash-pipeline/tests/{ => libsai}/vnet_out/Makefile (92%) rename dash-pipeline/tests/{ => libsai}/vnet_out/vnet_out.cpp (72%) create mode 100644 dash-pipeline/tests/requirements.txt create mode 100644 dash-pipeline/tests/saithrift/README.md create mode 100644 dash-pipeline/tests/saithrift/ptf/README.md create mode 100755 dash-pipeline/tests/saithrift/ptf/run-saithrift-ptftests.sh create mode 100644 dash-pipeline/tests/saithrift/ptf/thrift/test_thrift_session.py create mode 100644 dash-pipeline/tests/saithrift/ptf/vnet/test_saithrift_vnet.py create mode 100644 dash-pipeline/tests/saithrift/pytest/conftest.py create mode 100644 dash-pipeline/tests/saithrift/pytest/echo/test_echo_port.py create mode 100644 dash-pipeline/tests/saithrift/pytest/pytest.ini create mode 100755 dash-pipeline/tests/saithrift/pytest/run-saithrift-pytests.sh create mode 100644 dash-pipeline/tests/saithrift/pytest/saithrift_rpc_client.py create mode 100644 dash-pipeline/tests/saithrift/pytest/switch/test_saithrift_switch.py create mode 100755 dash-pipeline/tests/saithrift/pytest/thrift/test_saithrift_session.py create mode 100644 dash-pipeline/tests/saithrift/pytest/vnet/test_saithrift_vnet.py diff --git a/.github/workflows/dash-bmv2-ci.yml b/.github/workflows/dash-bmv2-ci.yml index 934671e39..450e1e157 100644 --- a/.github/workflows/dash-bmv2-ci.yml +++ b/.github/workflows/dash-bmv2-ci.yml @@ -4,7 +4,7 @@ on: push: branches: [ "**" ] paths: - - '.github/workflows/dash-ci.yml' + - '.github/workflows/dash-bmv2-ci.yml' - 'test/**.py' - 'test/**requirements.txt' - 'test/**.sh' @@ -19,7 +19,7 @@ on: pull_request: branches: [ "main" ] paths: - - '.github/workflows/dash-ci.yml' + - '.github/workflows/dash-bmv2-ci.yml' - 'test/**.py' - 'test/**requirements.txt' - 'test/**.sh' @@ -35,7 +35,7 @@ on: jobs: build: - name: Build and Test Sirius Pipeline + name: Build and Test DASH Pipeline runs-on: ubuntu-20.04 env: docker_fg_flags: -u root --privileged @@ -46,25 +46,35 @@ jobs: steps: - uses: actions/checkout@v3 - name: Pull docker p4c image - run: make docker-pull-dash-p4c + run: make docker-pull-dash-p4c - name: Build P4 software switch (bmv2) and P4Info - run: DOCKER_FLAGS=$docker_fg_flags make p4 + run: DOCKER_FLAGS=$docker_fg_flags make p4 - name: Install SAI submodule - run: git submodule update --init + run: git submodule update --init - name: Pull docker saithrift-bldr image - run: make docker-pull-saithrift-bldr + run: make docker-pull-saithrift-bldr - name: Generate SAI API - run: DOCKER_FLAGS=$docker_fg_flags make sai + run: DOCKER_FLAGS=$docker_fg_flags make sai + - name: Generate saithrift-server + run: DOCKER_FLAGS=$docker_fg_flags make saithrift-server + - name: Generate saithrift-client local docker + run: DOCKER_FLAGS=$docker_fg_flags make docker-saithrift-client - name: Pull docker bmv2-bldr image - run: make docker-pull-bmv2-bldr + run: make docker-pull-bmv2-bldr - name: Build libsai c++ tests - run: DOCKER_FLAGS=$docker_fg_flags make test + run: DOCKER_FLAGS=$docker_fg_flags make test - 
name: Prepare network - run: DOCKER_FLAGS=$docker_fg_flags make network + run: DOCKER_FLAGS=$docker_fg_flags make network - name: Run P4 software switch (bmv2) with P4Runtime - run: DOCKER_FLAGS=$docker_bg_flags make run-switch - - name: Test SAI library - run: DOCKER_FLAGS=$docker_fg_flags make run-test - - name: Ixia-c Traffic Generator test - run: make run-ixiac-test + run: DOCKER_FLAGS=$docker_bg_flags make run-switch + - name: Test SAI library over P4RT to switch + run: DOCKER_FLAGS=$docker_fg_flags make run-libsai-test + - name: Run saithrift server + run: DOCKER_FLAGS=$docker_bg_flags make run-saithrift-server + - name: Run PTF Tests + run: DOCKER_FLAGS=$docker_fg_flags make run-saithrift-ptftests + - name: Deploy ixia-c Traffic Generator + run: DOCKER_FLAGS=$docker_fg_flags make deploy-ixiac + - name: Run Pytests + run: DOCKER_FLAGS=$docker_fg_flags make run-saithrift-pytests diff --git a/.github/workflows/dash-saithrift-client-bldr-docker.yml b/.github/workflows/dash-saithrift-client-bldr-docker.yml new file mode 100644 index 000000000..88b3ac27b --- /dev/null +++ b/.github/workflows/dash-saithrift-client-bldr-docker.yml @@ -0,0 +1,50 @@ +name: DASH-docker-saithrift-client-bldr-image + +on: + push: + branches: [ "**" ] + paths: + - '.github/workflows/dash-saithrift-client-bldr-docker.yml' + - '.github/workflows/dash-saithrift-client-bldr-docker.yml' + - 'dash-pipeline/Makefile' + - 'dash-pipeline/dockerfiles/Dockerfile.saithrift-bldr' + - 'dash-pipeline/dockerfiles/Dockerfile.saithrift-client-bldr' + - 'dash-pipeline/.dockerignore' + - 'dash-pipeline/dockerfiles/.dockerignore' + pull_request: + branches: [ "main" ] + paths: + - '.github/workflows/dash-saithrift-client-bldr-docker.yml' + - 'dash-pipeline/Makefile' + - 'dash-pipeline/dockerfiles/Dockerfile.saithrift-bldr' + - 'dash-pipeline/dockerfiles/Dockerfile.saithrift-client-bldr' + - 'dash-pipeline/.dockerignore' + - 'dash-pipeline/dockerfiles/.dockerignore' + workflow_dispatch: + +jobs: + build: + name: Build dash-saithrift-client-bldr-image + runs-on: ubuntu-20.04 + env: + docker_fg_flags: -u root --privileged + docker_bg_flags: -d -u root --privileged + defaults: + run: + working-directory: ./dash-pipeline + steps: + - uses: actions/checkout@v3 + - name: Pull docker p4c image + run: make docker-pull-dash-p4c + - name: Build P4 software switch (bmv2) and P4Info + run: DOCKER_FLAGS=$docker_fg_flags make p4 + - name: Install SAI submodule + run: git submodule update --init + - name: Build docker saithrift-bldr image + run: make docker-saithrift-bldr + - name: Generate SAI API + run: DOCKER_FLAGS=$docker_fg_flags make sai + - name: Generate SAI-Thrift client and server code and libs + run: DOCKER_FLAGS=$docker_fg_flags make saithrift-server + - name: Build saithrift client docker image + run: DOCKER_FLAGS=$docker_fg_flags make docker-saithrift-client-bldr diff --git a/.github/workflows/dash-saithrift-client-docker.yml b/.github/workflows/dash-saithrift-client-docker.yml new file mode 100644 index 000000000..2256130b2 --- /dev/null +++ b/.github/workflows/dash-saithrift-client-docker.yml @@ -0,0 +1,51 @@ +name: DASH-docker-saithrift-client-image + +on: + push: + branches: [ "**" ] + paths: + - '.github/workflows/dash-saithrift-client-docker.yml' + - 'dash-pipeline/Makefile' + - 'dash-pipeline/dockerfiles/Dockerfile.saithrift-bldr' + - 'dash-pipeline/dockerfiles/Dockerfile.saithrift-client-bldr' + - 'dash-pipeline/dockerfiles/Dockerfile.saithrift-client' + - 'dash-pipeline/.dockerignore' + - 
'dash-pipeline/dockerfiles/.dockerignore' + pull_request: + branches: [ "main" ] + paths: + - '.github/workflows/dash-saithrift-client-docker.yml' + - 'dash-pipeline/Makefile' + - 'dash-pipeline/dockerfiles/Dockerfile.saithrift-bldr' + - 'dash-pipeline/dockerfiles/Dockerfile.saithrift-client-bldr' + - 'dash-pipeline/dockerfiles/Dockerfile.saithrift-client' + - 'dash-pipeline/.dockerignore' + - 'dash-pipeline/dockerfiles/.dockerignore' + workflow_dispatch: + +jobs: + build: + name: Build dash-saithrift-client-image + runs-on: ubuntu-20.04 + env: + docker_fg_flags: -u root --privileged + docker_bg_flags: -d -u root --privileged + defaults: + run: + working-directory: ./dash-pipeline + steps: + - uses: actions/checkout@v3 + - name: Pull docker p4c image + run: make docker-pull-dash-p4c + - name: Build P4 software switch (bmv2) and P4Info + run: DOCKER_FLAGS=$docker_fg_flags make p4 + - name: Install SAI submodule + run: git submodule update --init + - name: Build docker saithrift-bldr image + run: make docker-saithrift-bldr + - name: Generate SAI API + run: DOCKER_FLAGS=$docker_fg_flags make sai + - name: Generate SAI-Thrift client and server code and libs + run: DOCKER_FLAGS=$docker_fg_flags make saithrift-server + - name: Build saithrift client docker image + run: DOCKER_FLAGS=$docker_fg_flags make docker-saithrift-client diff --git a/.github/workflows/dash-saithrift-docker.yml b/.github/workflows/dash-saithrift-docker.yml index 9df6b8944..774807431 100644 --- a/.github/workflows/dash-saithrift-docker.yml +++ b/.github/workflows/dash-saithrift-docker.yml @@ -1,25 +1,25 @@ -name: DASH-docker-saithrift-build-image +name: DASH-docker-saithrift-bldr-build-image on: push: branches: [ "**" ] paths: - '.github/workflows/dash-saithrift-docker.yml' - - 'dash-pipeline/dockerfiles/Dockerfile.saithrift' + - 'dash-pipeline/dockerfiles/Dockerfile.saithrift-bldr' - 'dash-pipeline/.dockerignore' - 'dash-pipeline/dockerfiles/.dockerignore' pull_request: branches: [ "main" ] paths: - '.github/workflows/dash-saithrift-docker.yml' - - 'dash-pipeline/dockerfiles/Dockerfile.saithrift' + - 'dash-pipeline/dockerfiles/Dockerfile.saithrift-bldr' - 'dash-pipeline/.dockerignore' - 'dash-pipeline/dockerfiles/.dockerignore' workflow_dispatch: jobs: build: - name: Build docker dash-saithrift image + name: Build docker dash-saithrift-bldr image runs-on: ubuntu-20.04 defaults: run: diff --git a/.gitignore b/.gitignore index d108f3f3d..ccdd6b781 100644 --- a/.gitignore +++ b/.gitignore @@ -1,5 +1,8 @@ *.bkp *.log +*.pcap __pycache__/ +.pytest_cache/ dash-pipeline/bmv2/dash_pipeline.bmv2/ -dash-pipeline/SAI/lib/ \ No newline at end of file +dash-pipeline/SAI/lib/ +dash-pipeline/SAI/rpc/ diff --git a/.gitmodules b/.gitmodules index 72171505d..487f827ed 100644 --- a/.gitmodules +++ b/.gitmodules @@ -1,4 +1,4 @@ [submodule "SAI"] path = dash-pipeline/SAI/SAI url = https://github.com/chrispsommers/SAI.git - branch = add-gensairpc-flags + branch = saithriftv2-check-null-sai-apis diff --git a/assets/CI-badge-failing.svg b/assets/CI-badge-failing.svg index 6f671a081..3c4d894d2 100644 --- a/assets/CI-badge-failing.svg +++ b/assets/CI-badge-failing.svg @@ -1,5 +1,5 @@ - - Sirius-CI - failing + + DASH-BMV2-CI - failing @@ -12,21 +12,21 @@ - + - + - Sirius-CI + DASH-BMV2-CI - - + + - failing + failing diff --git a/assets/CI-badge-passing.svg b/assets/CI-badge-passing.svg index 17e989ef7..73748ea2c 100644 --- a/assets/CI-badge-passing.svg +++ b/assets/CI-badge-passing.svg @@ -1,5 +1,5 @@ - - Sirius-CI - passing + + DASH-BMV2-CI - 
passing @@ -12,15 +12,15 @@ - + - + - Sirius-CI + DASH-BMV2-CI - +
[p4-clean](#compile-p4-code) | Compiles P4 code and produces both bmv2 and P4Info `.json` files.
Delete p4 artifacts | +| [sai](#build-libsaiso-adaptor-library)
[sai-clean](#build-libsaiso-adaptor-library)| Auto-generate sai headers, sai adaptor code and compile into `libsai.so` library
Cleans up artifacts and restores SAI submodule |
+| [test](#build-libsai-c-client-test-programs) | Compile C++ unit tests under [tests/libsai](tests/libsai)
+| [saithrift-server](#build-saithrift-server) | Auto-generate the saithrift client-server framework and libraries |
+| [docker-saithrift-client](#build-saithrift-client-docker-image) | Build a docker image containing tools, libraries and saithrift test-cases for PTF and Pytest
+
+## Launch Daemons/Containers
+| Target(s) | Description |
+| ---------------------- | --------------------------------------------------|
+| [run-switch](#run-software-switch)
[kill-switch](#run-software-switch) | Run a docker container with bmv2 dataplane & P4Runtime server
Stop the bmv2 container +| [run-saithrift-server](#run-saithrift-server)
[kill-saithrift-server](#run-saithrift-server) | Run a saithrift server which translates SAI over thrift into P4Runtime
Stop the saithrift server container| +| [deploy-ixiac](#startstop-ixia-c-traffic-generator)
[undeploy-ixiac](#startstop-ixia-c-traffic-generator) | Start ixia-c containers (done automatically when running tests)
Stop ixia-c containers (called by `kill-all`)
+
+## Run Tests
+| Target(s) | Description |
+| ---------------------- | --------------------------------------------------|
+| [run-libsai-test](#run-libsai-c-tests) | Run tests under [tests/libsai](tests/libsai) |
+| [run-saithrift-ptftests](#run-saithrift-client-ptf-tests) | Run PTF tests under [tests/saithrift/ptf](tests/saithrift/ptf) using tests built into [docker-saithrift-client](#build-saithrift-client-docker-image) image
+| [run-saithrift-pytests](#run-saithrift-client-pytests) | Run Pytests under [tests/saithrift/pytest](tests/saithrift/pytest) using tests built into [docker-saithrift-client](#build-saithrift-client-docker-image) image
+| [run-saithrift-client-tests](#run-saithrift-client-tests) | Run all saithrift tests |
+| [run-saithrift-dev-ptftests](#run-saithrift-client-ptf-tests)
[run-saithrift-dev-pytests](#run-saithrift-client-dev-pytests)
[run-saithrift-client-dev-tests](#run-saithrift-client-dev-tests) | Like the three targets above, but run tests from host directory `tests/saithrift` instead of tests built into the `saithrift-client` container, for faster test-case code/test cycles.
+
+# Detailed DASH Behavioral Model Build Workflow
This explains the various build steps in more detail. The CI pipeline does most of these steps as well. All filenames and directories mentioned in the sections below are relative to the `dash-pipeline` directory (containing this README) unless otherwise specified.
@@ -40,8 +97,17 @@ The workflows described here are primarily driven by a [Makefile](Makefile) and
* Automated script-based execution in a development or production environment, e.g. regression testing
* Cloud-based CI (Continuous Integration) build and test, every time code is pushed to GitHub or a Pull Request is submitted to the upstream repository.
-See the diagram below. You can read the [dockerfiles](dockerfiles) and all `Makefiles` in various directories to get a deeper understanding of the build process.
+See the [Diagram](#build-workflow-diagram) below. You can read the [dockerfiles](dockerfiles) and all `Makefiles` in various directories to get a deeper understanding of the build process. You generally use the targets from the main [Makefile](Makefile) and not any subordinate ones.
+## TODO
+* Document specific task workflows and required build steps and dependencies, e.g.:
+  * P4 code development
+  * SAI adaptor development
+  * Test-case development
+  * Docker development
+* Assign more uniform `make` target names.
## Docker Image(s)
+>**NOTE** P4 code or test-case developers generally **don't** need to build `p4c`, `saithrift-bldr` or `bmv2` docker images; they are pulled automatically, on-demand, from a registry. They contain static tooling. Developers who create and maintain the Docker images **do** need to build and push new images.
+
Several docker images are used to compile artifacts, such as P4 code, or run processes, such as the bmv2 simple switch. These Dockerfiles should not change often and are stored/retrieved from an external docker registry. See [README-dash-docker.md](README-dash-docker.md) for details. When a Dockerfile does change, it needs to be published in the registry. Dockerfile changes also trigger rebuilds of the docker images in the CI pipeline.
See the diagram below. You can read the [Dockerfile](Dockerfile) and all `Makefiles` to get a deeper understanding of the build process.
@@ -49,7 +115,30 @@ See the diagram below. You can read the [Dockerfile](Dockerfile) and all `Makefi
## Build Workflow Diagram
![dash-p4-bmv2-thrift-workflow](images/dash-p4-bmv2-thrift-workflow.svg)
-
+## Make All
+This make target will build all the artifacts from source:
+* Compile P4 code
+* Auto-generate DASH SAI API header files based on P4Info from the previous step
+* Compile `libsai` for DASH including the SAI-to-P4Runtime adaptor
+* Compile functional tests written in C++ to verify SAI (under `dash-pipeline/tests/libsai`)
+* Auto-generate the saithrift server and client framework (server daemon + client libraries) based on the DASH SAI headers
+* Build a saithrift-client Docker image containing all needed tools and test suites
+```
+make all
+```
+## Cleanup
+This will delete all built artifacts, restore the SAI submodule and kill all running containers.
+```
+make clean
+```
+## Stop Containers
+This will kill one or all containers:
+```
+make kill-switch            # stop the P4 bmv2 switch
+make kill-saithrift-server  # stop the RPC server
+make undeploy-ixiac         # stop the ixia-c containers
+make kill-all               # all of the above
+```
## Compile P4 Code
```
make p4-clean # optional
@@ -62,14 +151,16 @@ The primary outputs of interest are:
* P4-to-SAI header code generation (see next step below)
## Build libsai.so adaptor library
-This library is the crucial item to allow integration with a Network Operating System (NOS) like SONiC. It wraps an implementtion specific "SDK" with standard Switch Abstraction Interface (SAI) APIs. In this case, an adaptor translates SAI API table/attribute CRUD operations into equivalent P4Runtime RPC calls, which is the native RPC API for bmv2.
+This library is the crucial item to allow integration with a Network Operating System (NOS) like SONiC. It wraps an implementation-specific "SDK" with standard Switch Abstraction Interface (SAI) APIs. In this case, an adaptor translates SAI API table/attribute CRUD operations into equivalent P4Runtime RPC calls, which is the native RPC API for bmv2.
```
-make sai-clean  # optional
-make sai
+make sai-headers  # Auto-generate headers & adaptor code
+make libsai       # Compile into libsai.so
+make sai          # Combines steps above
+make sai-clean    # Clean up artifacts and Git Submodule
```
-This target generates SAI headers from the P4Info which was described above. It uses [Jinja2](https://jinja.palletsprojects.com/en/3.1.x/) which renders [SAI/templates](SAI/templates) into c++ source code for the SAI headers corresponding to the DASH API as defined in the P4 code. It then compiles this code into a shared library `libsai.so` which will later be used to link to a test server (Thrift) or `syncd` daemon for production.
+These targets generate SAI headers from the P4Info which was described above. They use [Jinja2](https://jinja.palletsprojects.com/en/3.1.x/), which renders [SAI/templates](SAI/templates) into C++ source code for the SAI headers corresponding to the DASH API as defined in the P4 code. This code is then compiled into a shared library `libsai.so` which will later be used to link to a test server (Thrift) or `syncd` daemon for production.

This consists of two main steps
* Generate the SAI headers and implementation code via the [SAI/generate_dash_api.sh](SAI/generate_dash_api.sh) script, which is merely a wrapper which calls the real workhorse: [SAI/sai_api_gen.py](SAI/sai_api_gen.py). This uses templates stored in [SAI/templates](SAI/templates).
@@ -77,7 +168,7 @@ This consists of two main steps
Headers are emitted into the imported `SAI` submodule (under `SAI/SAI`) under its `inc`, `meta` and `experimental` directories. Implementation code for each SAI accessor is emitted into the `SAI/lib` directory.
-* Compile the implementation source code into `libsai.so`, providing the definitive DASH dataplane API. Note this `libsai` makes calls to bmv2's emdedded P4Runtime Server and must be linked with numerous libraries, see `tests/vnet_out/Makefile` to gain insights.
+* Compile the implementation source code into `libsai.so`, providing the definitive DASH dataplane API. Note this `libsai` makes calls to bmv2's embedded P4Runtime Server and must be linked with numerous libraries, see for example `tests/libsai/vnet_out/Makefile` to gain insights.
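To illustrate the flavor of the template-rendering step above, here is a minimal, self-contained sketch of Jinja2 rendering. The inline template and the `api` dict below are illustrative stand-ins for the real [SAI/templates](SAI/templates) files and the data `sai_api_gen.py` extracts from P4Info; this is a sketch, not the actual generator code.
```python
# Minimal Jinja2 rendering sketch (illustrative only; the real templates
# live in SAI/templates and the input data is parsed from the P4Info file).
from jinja2 import Template

header_template = Template("""\
typedef enum _sai_{{ api.name }}_attr_t {
{%- for attr in api.attrs %}
    SAI_{{ api.name | upper }}_ATTR_{{ attr | upper }},
{%- endfor %}
} sai_{{ api.name }}_attr_t;
""")

# Hypothetical table data, standing in for fields parsed from P4Info.
api = {"name": "vnet", "attrs": ["vni", "action"]}
print(header_template.render(api=api))
```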
### Restore SAI Submodule
As mentioned above, the `make sai` target generates code into the `SAI` submodule (e.g. at `./SAI/SAI`). This "dirties" what is otherwise a cloned Git repo from `opencomputeproject/SAI`.
@@ -87,8 +178,13 @@ make sai-clean
To ensure the baseline code is restored prior to each run, the modified directories under SAI are deleted, then restored via `git checkout -- <path>`. This retrieves the subtrees from the SAI submodule, which is stored intact in the local project's Git repo (e.g. under `DASH/.git/modules/dash-pipeline/SAI/SAI`)
-## Build SAI client test program(s)
-This compiles a simple libsai client program to verify the libsai-to-p4runtime-to-bmv2 stack. It performs table access(es).
+## Build saithrift-server
+This builds a saithrift-server daemon, which is linked to the `libsai` library and also includes the SAI-to-P4Runtime adaptor. It also builds Python thrift libraries and saithrift libraries.
+```
+make saithrift-server
+```
+## Build libsai C++ client test program(s)
+This compiles simple libsai client program(s) to verify the libsai-to-p4runtime-to-bmv2 stack. It performs table access(es).
```
make test
@@ -128,29 +224,79 @@ Switch is initialized.
```
### Use wireshark to decode P4Runtime messages in the SAI-P4RT adaptor
>**Hint:** You can monitor P4Runtime messages using Wireshark or similar. Select interface `lo`, filter on `tcp.port==9559`. Right-click on a captured packet and select "Decode as..." and configure port 9559 to decode as HTTP2 (old versions of Wireshark might lack this choice).
-
-## Run simple SAI library test
-From a different terminal, run SAI client tests. This exercises the `libsai.so` shared library including P4Runtime client adaptor, which communicates to the running `simple_switch_grpc` process over a socket.
+## Run saithrift-server
+>**Note:** the bmv2 switch must be running, see [Run software switch](#run-software-switch).
+When this server is launched, it will establish a P4Runtime session (behind the scenes) to the running `bmv2` switch daemon. The thrift server listens on port `9092` for Thrift messages carrying SAI RPC commands. These commands are dispatched to the SAI library handlers. These handlers translate them into corresponding P4Runtime RPC commands, which are sent to the bmv2 daemon over a socket on the standard P4Runtime port `9559`.
```
-make run-test
+make run-saithrift-server
+```
+When the server starts, the first SAI command it receives will load the `libsai.so` shared library and establish a P4Runtime connection. This results in a console message similar to the one below. Note this message doesn't necessarily appear when the daemon starts. This also loads the bmv2 behavioral model with the P4 "object code" (JSON file), see [Initialize software switch](#initialize-software-switch).
+```
+GRPC call SetForwardingPipelineConfig 0.0.0.0:9559 => /etc/dash/dash_pipeline.json, /etc/dash/dash_pipeline_p4rt.txt
+Switch is initialized.
```
-## Run ixia-c traffic-generator test
-Remeber to [Install docker-compose](#install-docker-compose).
-
-From a different terminal, run [ixia-c](https://github.com/open-traffic-generator/ixia-c) traffic tests. The first time this runs, it will pull Python packages for the [snappi](https://github.com/open-traffic-generator/snappi) client as well as Docker images for the [ixia-c](https://github.com/open-traffic-generator/ixia-c) controller and traffic engines.
-
+To stop it:
```
-make run-ixiac-test
+make kill-saithrift-server
```
### ixia-c components and setup/teardown
-The first time you run ixia-c traffic tests, the `ixiac-prereq` make target will run two dependent targets:
-* `install-python-modules` - downloads and installs snappi Python client libraries
-* `deploy-ixiac` - downloads two docker images (ixia-c controller, and ixia-c traffic engine or TE), then spins up one controller container and two traffic engines.
-ixia-c always requires a dedicated CPU core for the receiver, capable of full DPDK performance, but can use dedicated or shared CPU cores for the transmitter and controller, at reduced performance. In this project, two cores total are required: one for the ixia-c receiver, and one shared core which handles the TE transmitters, controller, and all other processes including the P4 BMV2 switch, P4Runtime server, test clients, etc. This accommodates smaller cloud instances like the "free" Azure CI runners provided by Github [described here](https://docs.github.com/en/actions/using-github-hosted-runners/about-github-hosted-runners#supported-runners-and-hardware-resources)
+## Build saithrift-client docker image
+```
+make docker-saithrift-client
+```
+This will build a docker image which has all libraries needed to talk to the saithrift-server daemon, including:
+* saithrift client libraries (Python)
+* PTF framework from [OCP SAI repo](https://github.com/opencomputeproject/SAI.git), including all test cases
+* The [PTF repo](https://github.com/p4lang/ptf) imported from p4lang
+* Scapy etc.
+
+It also contains all the artifacts under `tests/`, which include PTF and Pytest test-cases. Thus, it comprises a self-contained test resource with tools, libraries and test scripts.
+## Run All Tests
+```
+make run-all-tests
+```
+This will run all the test cases for libsai (C++ programs) as well as saithrift (Pytest and PTF). You must have the bmv2 switch and saithrift-server running.
+## Run saithrift-client tests
+To run all "Production" tests which use the saithrift interface, execute the following. The tests are assumed to be built into the `saithrift-client` docker image. See [Running DASH saithrift tests](README-saithrift.md#running-dash-saithrift-tests). You must have the bmv2 switch and saithrift-server running.
+```
+make run-saithrift-client-tests
+```
+This will launch a saithrift-client docker container and execute tests under `tests/saithrift`, including:
+* Pytests under `tests/saithrift/pytest`
+* PTF tests under `tests/saithrift/ptf`
+### Run saithrift-client PTF tests
+To run all PTF tests which use the saithrift interface, execute the following. You must have the bmv2 switch and saithrift-server running.
+```
+make run-saithrift-client-ptftests
+```
+This will launch a saithrift-client docker container and execute tests under `tests/saithrift/ptf`.
+### Run saithrift-client Pytests
+To run all Pytests which use the saithrift interface, execute the following. You must have the bmv2 switch and saithrift-server running.
+```
+make run-saithrift-client-pytests
+```
+This will launch a saithrift-client docker container and execute tests under `tests/saithrift/pytest`.
+### Run saithrift-client "Dev" Pytests
+You can also run "dev" versions of tests using the following make targets. These use test scripts mounted from the host's filesystem, allowing a faster development workflow. No dockers need to be rebuilt to try out test cases iteratively. Use the following variants of the make targets.
See [Development - Launch container, run tests in one shot](#development---launch-container-run-tests-in-one-shot)
+```
+make run-saithrift-client-dev-pytests   # run Pytests from host mount
+make run-saithrift-client-dev-ptftests  # run PTF tests from host mount
+make run-saithrift-client-dev-tests     # run both suites above
+```
+## Run libsai C++ tests
+This exercises the `libsai.so` shared library with C++ programs. This tests the SAI API handlers and P4Runtime client adaptor, which communicates to the running `simple_switch_grpc` process over a socket.
+```
+make run-libsai-test
+```
+## Start/Stop ixia-c Traffic Generator
+This will start/stop the ixia-c traffic generator, which consists of one container for the Controller and one container per Traffic Engine (1 per port = 2 for DASH).
+```
+make deploy-ixiac    # Start the containers
+make undeploy-ixiac  # Stop the containers
+```
### About snappi and ixia-c traffic-generator
#### Opensource Sites
* Vendor-neutral [Open Traffic Generator](https://github.com/open-traffic-generator) model and API
@@ -209,7 +355,7 @@ The sections below discuss version control of critical components.
## DASH Repo Versioning
The DASH GitHub repo, i.e. [https://github.com/Azure/DASH](https://github.com/Azure/DASH) is controlled by Git source-code control, tracked by commit SHA, tag, branch, etc. This is the main project and its components should also be controlled.
## Submodules
-As discussed in [About Git Submodules](#about-git-submodules), submodules are controlled by the SHA commit of the submodule, which is "committed" to the top level project (see [About Git Submodules](#about-git-submodules)). The versions are always known and explicitly specified.
+As discussed in [About Git Submodules](#about-git-submodules), submodules are controlled by the SHA commit of the submodule, which is "committed" to the top level project (see [About Git Submodules](#about-git-submodules)). The versions are always known and explicitly specified.
## Docker Image Versioning
Docker image(s) are identified by their `repo/image_name:tag`, e.g. `p4lang/dp4c:latest`.
### Project-Specific Images
diff --git a/dash-pipeline/README-ptftests.md b/dash-pipeline/README-ptftests.md
new file mode 100644
index 000000000..537c6fa8c
--- /dev/null
+++ b/dash-pipeline/README-ptftests.md
@@ -0,0 +1,68 @@
+* [README-dash-workflows.md](README-dash-workflows.md) for build workflows and Make targets.
+* [README-saithrift](README-saithrift.md) for saithrift client/server and test workflows.
+* [README-pytests](README-pytests.md) for saithrift Pytest test-case development and usage.
+
+
+# PTF - Packet Test Framework
+## PTF Overview
+The Packet Test Framework (PTF) is a popular tool based on Pyunit, Scapy and some utilities that make testing dataplanes and switching devices easy and convenient.
+- [PTF - Packet Test Framework](#ptf---packet-test-framework)
+  - [PTF Overview](#ptf-overview)
+  - [Learn By Example](#learn-by-example)
+  - [Invoking Tests From Command-line](#invoking-tests-from-command-line)
+  - [Locating PTF Packet Utilities](#locating-ptf-packet-utilities)
+## Learn By Example
+An excellent place to learn test-writing patterns is the existing SAI PTF tests located at https://github.com/opencomputeproject/SAI/tree/master/ptf and also expanded into the DASH workspace at `DASH/dash-pipeline/SAI/SAI/ptf`
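As a starting point, below is a minimal sketch of a PTF test in the style of the DASH tests under `tests/saithrift/ptf`; the class name, port numbers and packet contents are illustrative only, not an actual DASH test case.
```python
# Minimal PTF test sketch (illustrative; not an actual DASH test case).
import ptf
import ptf.testutils as testutils
from ptf.base_tests import BaseTest

class EchoSketchTest(BaseTest):
    def setUp(self):
        BaseTest.setUp(self)
        self.dataplane = ptf.dataplane_instance  # provided by the ptf runner

    def runTest(self):
        pkt = testutils.simple_udp_packet(eth_dst="00:00:00:00:00:01")
        testutils.send_packet(self, 0, pkt)    # send on port 0 (e.g. veth1)
        # verify_packet() checks a single port; unlike verify_packets() it is
        # not tripped up by stray host traffic entering veths on other ports.
        testutils.verify_packet(self, pkt, 1)  # expect it echoed on port 1
```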
+## Invoking Tests From Command-line
+You can exercise `ptf` manually by entering the saithrift-client container in a shell:
+
+```
+DASH/dash-pipeline$ make run-saithrift-client-bash
+...
+root@chris-z4:/tests-dev# ptf -h
+...
+usage: usage: ptf [options] --test-dir TEST_DIR [tests]
+PTF (Packet Test Framework) is a framework and set of tests to test a software switch.
+...
+```
+Typical invocation:
+```
+sudo ptf --test-dir ./ptf --pypath /SAI/ptf \
+    --interface 0@veth1 --interface 1@veth3
+```
+Note that the container is launched with `/SAI/ptf` in the container mounted to the corresponding host directory.
+## Locating PTF Packet Utilities
+The following directory contains source code for packet test utilities:
+```
+DASH/dash-pipeline/SAI/SAI/test/ptf/src/ptf
+```
+Note this directory won't be expanded into your workspace when you first clone DASH. You have to expand the Git submodules under SAI/SAI and SAI/SAI/test/ptf. This is done as part of the first DASH build (`git submodule update --init` and `make all`) as explained in other READMEs.
+
+To clarify: the directory structure below is created after the `SAI/SAI` repo is cloned; then subsequently the `SAI/SAI/test/ptf` repo is cloned inside that repo.
+
+You can also consult the source of PTF: https://github.com/p4lang/ptf. Note that SAI imports a specific commit SHA of PTF via the submodule, so it's best to consult the code which is actually pulled into DASH. The way to check the version is to enter the PTF submodule directory and look at the git branch:
+```
+DASH/dash-pipeline/SAI/SAI/test/ptf$ git branch
+* (HEAD detached at 10a2d4b)
+  master
+```
+
+An example of a utility function you might find inside a typical PTF test is `verify_packets`. This function and many others are inside `DASH/dash-pipeline/SAI/SAI/test/ptf/src/ptf/testutils.py`.
+```
+def verify_packets(test, pkt, ports=[], device_number=0, timeout=None):
+    """
+    Check that a packet is received on each of the specified port numbers for a
+    given device (default device number is 0).
+
+    Also verifies that the packet is not received on any other ports for this
+    device, and that no other packets are received on the device (unless --relax
+    is in effect).
+
+    The parameter timeout will be passed as is for each individual verify calls.
+
+    This covers the common and simplest cases for checking dataplane outputs.
+    For more complex usage, like multiple different packets being output, or
+    multiple packets on the same port, use the primitive verify_packet,
+    verify_no_packet, and verify_no_other_packets functions directly.
+    """
+```
diff --git a/dash-pipeline/README-pytests.md b/dash-pipeline/README-pytests.md
new file mode 100644
index 000000000..0d0cfb67e
--- /dev/null
+++ b/dash-pipeline/README-pytests.md
@@ -0,0 +1,133 @@
+
+* [README-dash-workflows.md](README-dash-workflows.md) for build workflows and Make targets.
+* [README-saithrift](README-saithrift.md) for saithrift client/server and test workflows.
+* [README-ptftests](README-ptftests.md) for saithrift PTF test-case development and usage.
+
+**Table of Contents**
+- [Pytests](#pytests)
+  - [Markers](#markers)
+    - [View markers for tests](#view-markers-for-tests)
+    - [Using Markers](#using-markers)
+      - [Run all pytests](#run-all-pytests)
+      - [Run select pytests](#run-select-pytests)
+      - [Run pytests *except* selected](#run-pytests-except-selected)
+      - [Run pytests - complex selection](#run-pytests---complex-selection)
+- [Debugging](#debugging)
+  - [View thrift protocol using tcpdump](#view-thrift-protocol-using-tcpdump)
+  - [View thrift protocol using Wireshark](#view-thrift-protocol-using-wireshark)
+# Pytests
+## Markers
+### View markers for tests
+Markers can be used to select different tests, e.g.
only bmv2 tests, only vnet tests, etc.
+Custom markers are defined in `pytest.ini` and shown at the top of the list below:
+
+```
+python -m pytest --markers
+
+@pytest.mark.bmv2: test DASH bmv2 model
+
+@pytest.mark.saithrift: test DASH using saithrift API
+
+@pytest.mark.vnet: test DASH vnet scenarios
+
+<...SKIP built-in markers...>
+```
+### Using Markers
+#### Run all pytests
+```
+python -m pytest -s
+```
+#### Run select pytests
+In this example we'll run *only* tests marked with `vnet`:
+```
+python -m pytest -m vnet
+```
+#### Run pytests *except* selected
+In this example we'll run all tests *except* tests marked with `vnet`:
+```
+python -m pytest -m "not vnet"
+```
+
+#### Run pytests - complex selection
+In this example we'll run all tests marked with `bmv2` *except* tests marked with `vnet`:
+```
+python -m pytest -m "bmv2 and not vnet"
+```
+# Debugging
+## View thrift protocol using tcpdump
+Run tcpdump on local loopback port `9092` and select options to dump Hex and ASCII. You can see thrift RPC calls by name. Below is an example of calling `saithrift_get_switch_attribute()`.
+
+Look at the 4th packet (request) and 6th packet (response):
+
+```
+$ sudo tcpdump -enXi lo tcp port 9092
+tcpdump: verbose output suppressed, use -v or -vv for full protocol decode
+listening on lo, link-type EN10MB (Ethernet), capture size 262144 bytes
+14:06:02.443857 00:00:00:00:00:00 > 00:00:00:00:00:00, ethertype IPv4 (0x0800), length 74: 127.0.0.1.40960 > 127.0.0.1.9092: Flags [S], seq 4281156243, win 65495, options [mss 65495,sackOK,TS val 2250594224 ecr 0,nop,wscale 7], length 0
+ 0x0000: 4500 003c 1669 4000 4006 2651 7f00 0001 E..<.i@.@.&Q....
+ 0x0010: 7f00 0001 a000 2384 ff2d 4293 0000 0000 ......#..-B.....
+ 0x0020: a002 ffd7 fe30 0000 0204 ffd7 0402 080a .....0..........
+ 0x0030: 8625 57b0 0000 0000 0103 0307 .%W.........
+14:06:02.443875 00:00:00:00:00:00 > 00:00:00:00:00:00, ethertype IPv4 (0x0800), length 74: 127.0.0.1.9092 > 127.0.0.1.40960: Flags [S.], seq 1458900721, ack 4281156244, win 65483, options [mss 65495,sackOK,TS val 2250594224 ecr 2250594224,nop,wscale 7], length 0
+ 0x0000: 4500 003c 0000 4000 4006 3cba 7f00 0001 E..<..@.@.<.....
+ 0x0010: 7f00 0001 2384 a000 56f5 0ef1 ff2d 4294 ....#...V....-B.
+ 0x0020: a012 ffcb fe30 0000 0204 ffd7 0402 080a .....0..........
+ 0x0030: 8625 57b0 8625 57b0 0103 0307 .%W..%W.....
+14:06:02.443889 00:00:00:00:00:00 > 00:00:00:00:00:00, ethertype IPv4 (0x0800), length 66: 127.0.0.1.40960 > 127.0.0.1.9092: Flags [.], ack 1, win 512, options [nop,nop,TS val 2250594224 ecr 2250594224], length 0
+ 0x0000: 4500 0034 166a 4000 4006 2658 7f00 0001 E..4.j@.@.&X....
+ 0x0010: 7f00 0001 a000 2384 ff2d 4294 56f5 0ef2 ......#..-B.V...
+ 0x0020: 8010 0200 fe28 0000 0101 080a 8625 57b0 .....(.......%W.
+ 0x0030: 8625 57b0 .%W.
+14:06:02.444303 00:00:00:00:00:00 > 00:00:00:00:00:00, ethertype IPv4 (0x0800), length 130: 127.0.0.1.40960 > 127.0.0.1.9092: Flags [P.], seq 1:65, ack 1, win 512, options [nop,nop,TS val 2250594225 ecr 2250594224], length 64
+ 0x0000: 4500 0074 166b 4000 4006 2617 7f00 0001 E..t.k@.@.&.....
+ 0x0010: 7f00 0001 a000 2384 ff2d 4294 56f5 0ef2 ......#..-B.V...
+ 0x0020: 8018 0200 fe68 0000 0101 080a 8625 57b1 .....h.......%W.
+ 0x0030: 8625 57b0 8001 0001 0000 001f 7361 695f .%W.........sai_
+ 0x0040: 7468 7269 6674 5f67 6574 5f73 7769 7463 thrift_get_switc
+ 0x0050: 685f 6174 7472 6962 7574 6500 0000 000c h_attribute.....
+ 0x0060: 0001 0f00 010c 0000 0001 0800 0100 0000 ................ + 0x0070: 0000 0000 .... +14:06:02.444322 00:00:00:00:00:00 > 00:00:00:00:00:00, ethertype IPv4 (0x0800), length 66: 127.0.0.1.9092 > 127.0.0.1.40960: Flags [.], ack 65, win 512, options [nop,nop,TS val 2250594225 ecr 2250594225], length 0 + 0x0000: 4500 0034 8b50 4000 4006 b171 7f00 0001 E..4.P@.@..q.... + 0x0010: 7f00 0001 2384 a000 56f5 0ef2 ff2d 42d4 ....#...V....-B. + 0x0020: 8010 0200 fe28 0000 0101 080a 8625 57b1 .....(.......%W. + 0x0030: 8625 57b1 .%W. +14:06:02.444602 00:00:00:00:00:00 > 00:00:00:00:00:00, ethertype IPv4 (0x0800), length 121: 127.0.0.1.9092 > 127.0.0.1.40960: Flags [P.], seq 1:56, ack 65, win 512, options [nop,nop,TS val 2250594225 ecr 2250594225], length 55 + 0x0000: 4500 006b 8b51 4000 4006 b139 7f00 0001 E..k.Q@.@..9.... + 0x0010: 7f00 0001 2384 a000 56f5 0ef2 ff2d 42d4 ....#...V....-B. + 0x0020: 8018 0200 fe5f 0000 0101 080a 8625 57b1 ....._.......%W. + 0x0030: 8625 57b1 8001 0002 0000 001f 7361 695f .%W.........sai_ + 0x0040: 7468 7269 6674 5f67 6574 5f73 7769 7463 thrift_get_switc + 0x0050: 685f 6174 7472 6962 7574 6500 0000 000c h_attribute..... + 0x0060: 0001 0800 01ff ffff f100 00 ........... +14:06:02.444621 00:00:00:00:00:00 > 00:00:00:00:00:00, ethertype IPv4 (0x0800), length 66: 127.0.0.1.40960 > 127.0.0.1.9092: Flags [.], ack 56, win 512, options [nop,nop,TS val 2250594225 ecr 2250594225], length 0 + 0x0000: 4500 0034 166c 4000 4006 2656 7f00 0001 E..4.l@.@.&V.... + 0x0010: 7f00 0001 a000 2384 ff2d 42d4 56f5 0f29 ......#..-B.V..) + 0x0020: 8010 0200 fe28 0000 0101 080a 8625 57b1 .....(.......%W. + 0x0030: 8625 57b1 .%W. +14:06:02.492966 00:00:00:00:00:00 > 00:00:00:00:00:00, ethertype IPv4 (0x0800), length 66: 127.0.0.1.40960 > 127.0.0.1.9092: Flags [F.], seq 65, ack 56, win 512, options [nop,nop,TS val 2250594273 ecr 2250594225], length 0 + 0x0000: 4500 0034 166d 4000 4006 2655 7f00 0001 E..4.m@.@.&U.... + 0x0010: 7f00 0001 a000 2384 ff2d 42d4 56f5 0f29 ......#..-B.V..) + 0x0020: 8011 0200 fe28 0000 0101 080a 8625 57e1 .....(.......%W. + 0x0030: 8625 57b1 .%W. +14:06:02.493072 00:00:00:00:00:00 > 00:00:00:00:00:00, ethertype IPv4 (0x0800), length 66: 127.0.0.1.9092 > 127.0.0.1.40960: Flags [F.], seq 56, ack 66, win 512, options [nop,nop,TS val 2250594274 ecr 2250594273], length 0 + 0x0000: 4500 0034 8b52 4000 4006 b16f 7f00 0001 E..4.R@.@..o.... + 0x0010: 7f00 0001 2384 a000 56f5 0f29 ff2d 42d5 ....#...V..).-B. + 0x0020: 8011 0200 fe28 0000 0101 080a 8625 57e2 .....(.......%W. + 0x0030: 8625 57e1 .%W. +14:06:02.493088 00:00:00:00:00:00 > 00:00:00:00:00:00, ethertype IPv4 (0x0800), length 66: 127.0.0.1.40960 > 127.0.0.1.9092: Flags [.], ack 57, win 512, options [nop,nop,TS val 2250594274 ecr 2250594274], length 0 + 0x0000: 4500 0034 166e 4000 4006 2654 7f00 0001 E..4.n@.@.&T.... + 0x0010: 7f00 0001 a000 2384 ff2d 42d5 56f5 0f2a ......#..-B.V..* + 0x0020: 8010 0200 fe28 0000 0101 080a 8625 57e2 .....(.......%W. + 0x0030: 8625 57e2 .%W. + +``` +## View thrift protocol using Wireshark +**TODO:** There's rumor of a dissector, yet to be located. 
+
+* Launch Wireshark
+* Enter the following filter: `tcp.dstport==9092`
+* You can see RPC calls being made in the packet data view; the ASCII string names of methods are displayed
\ No newline at end of file
diff --git a/dash-pipeline/README-saithrift.md b/dash-pipeline/README-saithrift.md
new file mode 100644
index 000000000..9b41122d5
--- /dev/null
+++ b/dash-pipeline/README-saithrift.md
@@ -0,0 +1,348 @@
+See also:
+* [README.md](README.md) Top-level README for dash-pipeline
+* [README-dash-workflows.md](README-dash-workflows.md) for build workflows and Make targets.
+* [README-ptftests](README-ptftests.md) for saithrift PTF test-case development and usage.
+* [README-pytests](README-pytests.md) for saithrift Pytest test-case development and usage.
+
+**Table of Contents**
+- [DASH saithrift client and server](#dash-saithrift-client-and-server)
+  - [Overview](#overview)
+- [TODO](#todo)
+- [Running DASH saithrift tests](#running-dash-saithrift-tests)
+  - [Running/Stopping the saithrift server](#runningstopping-the-saithrift-server)
+  - [Production - Launch container, run tests in one shot](#production---launch-container-run-tests-in-one-shot)
+  - [Development - Launch container, run tests in one shot](#development---launch-container-run-tests-in-one-shot)
+- [Developer: Run tests selectively from `bash` inside saithrift-client container](#developer-run-tests-selectively-from-bash-inside-saithrift-client-container)
+  - [Select Directory - Container pre-built directory, or mounted from host](#select-directory---container-pre-built-directory-or-mounted-from-host)
+- [Test aftermath and clearing the switch config](#test-aftermath-and-clearing-the-switch-config)
+- [Tips and techniques for writing tests](#tips-and-techniques-for-writing-tests)
+  - [Workspace File Layout](#workspace-file-layout)
+  - [saithrift Python client modules](#saithrift-python-client-modules)
+  - [Walk-through example of finding saithrift module entities](#walk-through-example-of-finding-saithrift-module-entities)
+    - [How to create a local object?](#how-to-create-a-local-object)
+    - [How to call the SAI create function for our local object?](#how-to-call-the-sai-create-function-for-our-local-object)
+- [Debugging saithrift Server with GDB](#debugging-saithrift-server-with-gdb)
+  - [Run Interactive saithrift-server container](#run-interactive-saithrift-server-container)
+# DASH saithrift client and server
+## Overview
+The DASH saithrift API is used to configure and query a device under test (DUT) as described in [dash-test-workflow-saithrift.md](../test/docs/dash-test-workflow-saithrift.md) and [dash-test-workflow-p4-saithrift.md](../test/docs/dash-test-workflow-p4-saithrift.md).
+
+This document describes how to run the saithrift server and client to run test suites. It also gives some advice for writing tests, debugging, etc.
+# TODO
+* Select saithrift server IP address to allow running client remotely from target.
+# Running DASH saithrift tests
+## Running/Stopping the saithrift server
+```
+make run-saithrift-server
+make kill-saithrift-server
+```
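Once the server is running, a client connects to it on port `9092`. Below is a minimal sketch of opening a session from Python; it assumes the generated `sai_thrift` package is installed (see [saithrift Python client modules](#saithrift-python-client-modules)). The exact RPC names and signatures come from the generated `sai_rpc`/`sai_adapter` modules, so treat this as a sketch rather than a definitive recipe.
```python
# Minimal saithrift session sketch; assumes the generated sai_thrift package
# is installed. RPC names and signatures depend on the generated code.
from thrift.transport import TSocket, TTransport
from thrift.protocol import TBinaryProtocol
from sai_thrift import sai_rpc

transport = TTransport.TBufferedTransport(TSocket.TSocket("localhost", 9092))
protocol = TBinaryProtocol.TBinaryProtocol(transport)
client = sai_rpc.Client(protocol)
transport.open()
# RPCs can now be issued against `client`, e.g. the switch-attribute query
# shown in the tcpdump capture in README-pytests.md.
transport.close()
```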
+```
+make run-saithrift-client-pytests  # run Pytests from container's scripts
+make run-saithrift-client-ptftests # run PTF tests from container's scripts
+make run-saithrift-client-tests    # run both suites above
+```
+## Development - Launch container, run tests in one shot
+You can run tests based on the current state of the `dash-pipeline/tests/` directory without rebuilding the `saithrift-client` docker image. Instead of running tests built into the container and stored under `/tests`, a host volume `dash-pipeline/tests` is mounted to container `/tests-dev` and tests are run from there. This allows rapid incremental test-case development. When doing so, the container's `/tests` directory remains in place with the tests which were copied into the container at image build-time.
+
+You can keep all containers running (switch, saithrift-server, ixia-c) and interactively write and execute test-cases without stopping the daemons. As stated, the saithrift-client container will start and stop each time you run the tests, but the switch and saithrift-server will continue to run.
+```
+make run-saithrift-client-dev-pytests  # run Pytests from host mount
+make run-saithrift-client-dev-ptftests # run PTF tests from host mount
+make run-saithrift-client-dev-tests    # run both suites above
+```
+
+**TODO:** pass params to the container to select tests, etc.
+# Developer: Run tests selectively from `bash` inside saithrift-client container
+Enter the container; this will place you in the `/tests-dev/` directory of the container, which corresponds to the contents of the `DASH/dash-pipeline/tests` directory on the host. In this way you can interactively run test-cases while you're editing them. When doing so, the container's `/tests` directory remains in place with the tests which were copied into the container at image build-time.
+```
+make run-saithrift-client-bash
+root@chris-z4:/tests-dev#
+```
+The running container is also mounted via `-v $(PWD)/tests:/tests-dev`, which mounts the current developer workspace into the running container. You can thereby create and edit new tests "live" from a text editor and see the effect inside the container in real-time. Note, the container image also contains the `/tests` directory which was copied into the Docker image when `make docker-saithrift-client` was last run. This means you have a "production" copy of tests as well as a live "development" host volume simultaneously in the container.
+
+## Select Directory - Container pre-built directory, or mounted from host
+
+* `cd /tests/` - Enter the directory which was prebuilt into the container image; tests are not modifiable "live" from the host. This is good for canned tests.
+* `cd /tests-dev/` - Enter the directory which is mounted to `dash-pipeline/tests` from the host, allowing live editing on the host and running in the container. This is a convenient developer workflow.
+
+To get the desired subdirectory for Pytests or PTF tests, choose the appropriate path, e.g.:
+* `cd /tests/saithrift/pytest`
+* `cd /tests-dev/saithrift/ptf`
+
+You can run all tests inside each respective directory by entering the directory and running the `run-saithrift-xxx` bash scripts, e.g.:
+```
+DASH/DASH/dash-pipeline$ make run-saithrift-client-bash
+...
+root@chris-z4:/tests-dev/saithrift# cd ptf/
+root@chris-z4:/tests-dev/saithrift/ptf# ./run-saithrift-ptftests.sh
+```
+*OR*
+```
+DASH/DASH/dash-pipeline$ make run-saithrift-client-bash
+...
+root@chris-z4:/tests-dev/saithrift# cd pytest/
+root@chris-z4:/tests-dev/saithrift/pytest# ./run-saithrift-pytests.sh
+```
+
+See the relevant documentation for running select PTF or Pytests using `bash` commands. You can pass parameters via the command line to control which test groups are run, using filenames, directories, or filtering on groups (PTF); or marks or match expressions (Pytest).
+# Test aftermath and clearing the switch config
+Sometimes tests leave entries programmed into the switch when they should have cleaned everything up. This can be caused by failing exceptions/assertions which, either inadvertently or unavoidably, leave entries in tables. This might make a subsequent run of the same (or a different) test suite fail. In these cases, it might be best to execute the following sequence to restart the switch and saithrift server, then rerun the test cases:
+```
+make kill-all run-switch            # console 1
+make run-saithrift-server           # console 2
+make run-saithrift-client-tests     # console 3
+make run-saithrift-client-dev-tests # alternative to above
+```
+
+It's strongly recommended to perform proper DUT config cleanup in the code for every testcase and catch exceptions where possible, to ensure a complete cleanup despite failures along the way.
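+
+As an illustration only, below is a minimal, hedged sketch of that cleanup pattern in a PTF-style test. It assumes the generated `sai_adapter.py` accessors and the `ThriftInterfaceDataPlane` base class used elsewhere in this repo's tests; the class name and test body are hypothetical placeholders:
+```
+# Sketch of a self-cleaning test case (PTF style). The saithrift calls
+# are the generated sai_adapter.py accessors covered later in this README.
+from sai_thrift.sai_headers import *
+from sai_base_test import *
+
+class SelfCleaningVnetTest(ThriftInterfaceDataPlane):
+    def runTest(self):
+        dle = sai_thrift_direction_lookup_entry_t(switch_id=0, vni=60)
+        status = sai_thrift_create_direction_lookup_entry(self.client, dle,
+            action=SAI_DIRECTION_LOOKUP_ENTRY_ACTION_SET_OUTBOUND_DIRECTION)
+        assert status == SAI_STATUS_SUCCESS
+        try:
+            pass  # traffic and verification steps would go here
+        finally:
+            # Remove the entry even if the test body raised, so later
+            # suites start from a clean switch config.
+            sai_thrift_remove_direction_lookup_entry(self.client, dle)
+```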
+
+# Tips and techniques for writing tests
+The following information should apply equally well to writing any tests which utilize saithrift as the client library: PTF, Pytest, etc. Please refer to the other READMEs for information specific to the various frameworks.
+## Workspace File Layout
+Below is a selected subset of the DASH repo, highlighting the source and build artifact locations needed for saithrift test development.
+
+Note that the `SAI/SAI` directory is a Git submodule and its contents are modified during `make sai` and `make saithrift-server`.
+```
+DASH
+  dash-pipeline
+    SAI - top-level dir for SAI-related artifacts
+          This is the dash-pipeline SAI directory, not the SAI repo!
+      rpc - output dir for saithrift code generator
+            contains client & server libraries & executable, see below
+      SAI - Git submodule root, imported into DASH repo
+        extensions - DASH extension headers - mix of repo files
+                     generated via "make sai"
+        inc - upstream sai headers
+        meta - generated SAI metadata, scripts, etc.
+        test
+          saithriftv2 - autogenerated saithrift tools, outputs
+    tests - saithrift client/server test cases
+      pytest - DASH tests using Pytests with saithrift
+      ptf - DASH tests using PTF with saithrift
+```
+## saithrift Python client modules
+Here are the detailed contents of `DASH/dash-pipeline/SAI/rpc/usr/local/lib/python3.8/site-packages/sai_thrift`. You need to utilize the APIs and constants inside these modules to write saithrift tests in PTF or Pytest.
+
+When writing tests using particular SAI tables or attributes, use your editor to search the Python modules below to find functions, structures, attributes etc. See [Walk-through example of finding saithrift module entities](#walk-through-example-of-finding-saithrift-module-entities) in the next section.
+
+>**Note:** These artifacts are generated via the `make saithrift-server` target; they are not stored in the DASH repo.
+```
+constants.py   - wrapper for ttypes.py, not interesting
+sai_adapter.py - Main source of APIs for saithrift, e.g. create/remove/get/set
+sai_headers.py - constants, e.g. SAI_STATUS_SUCCESS
+sai_rpc.py     - lower-level thrift marshalling/unmarshalling etc.
+                 Called by sai_adapter.py functions
+ttypes.py      - SAI data types
+```
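+
+For orientation, here is a hedged sketch of how these modules fit together when no framework harness is in play: the PTF/Pytest base classes (e.g. `sai_base_test.py` from `SAI/ptf`) normally establish this thrift session for you and expose it as `self.client`. The server port 9092 is the one seen in the capture and GDB logs elsewhere in this document:
+```
+# Sketch: open a raw saithrift session by hand (normally done by the
+# test framework base classes). Assumes the server is on localhost:9092.
+from thrift.transport import TSocket, TTransport
+from thrift.protocol import TBinaryProtocol
+from sai_thrift import sai_rpc
+
+transport = TTransport.TBufferedTransport(TSocket.TSocket('localhost', 9092))
+protocol = TBinaryProtocol.TBinaryProtocol(transport)
+client = sai_rpc.Client(protocol)  # handle passed to sai_adapter.py functions
+transport.open()
+```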
+
+## Walk-through example of finding saithrift module entities
+See the following code snippet from a PTF test. A Pytest would look nearly identical. We'll briefly describe how you can find things in the Python saithrift library modules. Recall we'll be hunting inside `DASH/dash-pipeline/SAI/rpc/usr/local/lib/python3.8/site-packages/sai_thrift` as explained above.
+```
+        self.switch_id = 0
+        self.eth_addr = '\xaa\xcc\xcc\xcc\xcc\xcc'
+        self.vni = 60
+        self.eni = 7
+        self.dle = sai_thrift_direction_lookup_entry_t(switch_id=self.switch_id, vni=self.vni)
+        self.eam = sai_thrift_eni_ether_address_map_entry_t(switch_id=self.switch_id, address = self.eth_addr)
+        self.e2v = sai_thrift_outbound_eni_to_vni_entry_t(switch_id=self.switch_id, eni_id=self.eni)
+
+        try:
+
+            status = sai_thrift_create_direction_lookup_entry(self.client, self.dle,
+                            action=SAI_DIRECTION_LOOKUP_ENTRY_ACTION_SET_OUTBOUND_DIRECTION)
+            assert(status == SAI_STATUS_SUCCESS)
+
+            status = sai_thrift_create_eni_ether_address_map_entry(self.client,
+                            eni_ether_address_map_entry=self.eam,
+                            eni_id=self.eni)
+            assert(status == SAI_STATUS_SUCCESS)
+
+            status = sai_thrift_create_outbound_eni_to_vni_entry(self.client,
+                            outbound_eni_to_vni_entry=self.e2v,
+                            vni=self.vni)
+```
+### How to create a local object?
+We want to determine the function signature to create the SAI object `direction_lookup_entry_t`.
+
+**Understand the type:**
+
+First, search for the data type itself inside `ttypes.py` to find:
+```
+# From ttypes.py:
+
+class sai_thrift_direction_lookup_entry_t(object):
+    """
+    Attributes:
+     - switch_id
+     - vni
+    """
+
+    def __init__(self, switch_id=None, vni=None,):
+        self.switch_id = switch_id
+        self.vni = vni
+```
+Note the actual type is `sai_thrift_direction_lookup_entry_t` and it has two parameters to create it: `switch_id` and `vni`.
+
+Looking further down into the `write()` method (which serializes into thrift), we get hints about the datatypes of these two parameters. We can see `switch_id` is 64 bits and `vni` is 32 bits:
+```
+# From ttypes.py:
+
+        if self.switch_id is not None:
+            oprot.writeFieldBegin('switch_id', TType.I64, 1)
+            oprot.writeI64(self.switch_id)
+            oprot.writeFieldEnd()
+        if self.vni is not None:
+            oprot.writeFieldBegin('vni', TType.I32, 2)
+            oprot.writeI32(self.vni)
+            oprot.writeFieldEnd()
+```
+**Call the sai_thrift_direction_lookup_entry_t constructor:**
+
+Finally, we see the code in our test case is as below, using the name of the Python class `sai_thrift_direction_lookup_entry_t` and the attributes from the `__init__()` method to form the constructor call:
+```
+        self.dle = sai_thrift_direction_lookup_entry_t(switch_id=self.switch_id, vni=self.vni)
+```
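+
+The same recipe applies to the other entry types in the snippet above. For instance, here is a small, hedged sketch constructing all three local objects with arbitrary example values:
+```
+# Sketch: constructing local saithrift entry objects. Per the write()
+# excerpt above, switch_id serializes as I64 and vni as I32. These are
+# plain local objects; nothing is sent to the server until the
+# sai_adapter.py create functions (next section) are called.
+from sai_thrift.ttypes import *
+
+dle = sai_thrift_direction_lookup_entry_t(switch_id=0, vni=60)
+eam = sai_thrift_eni_ether_address_map_entry_t(switch_id=0,
+                                               address='\xaa\xcc\xcc\xcc\xcc\xcc')
+e2v = sai_thrift_outbound_eni_to_vni_entry_t(switch_id=0, eni_id=7)
+```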
+### How to call the SAI create function for our local object?
+Now that we have a local object, we need to find the RPC call to remotely create it using saithrift.
+
+Search `sai_adapter.py` for the string `direction_lookup_entry_t` and you'll find numerous instances. In particular you'll find the four accessors to `create()`, `remove()`, `set()` and `get()` these objects, as well as `bulk_create()` and `bulk_remove()` versions thereof. In our case, we want to `create()`. The method signature is:
+
+```
+# From sai_adapter.py:
+
+def sai_thrift_create_direction_lookup_entry(client,
+                                             direction_lookup_entry,
+                                             action=None):
+    """
+    sai_create_direction_lookup_entry() - RPC client function implementation.
+
+    Args:
+        client (Client): SAI RPC client
+        direction_lookup_entry(sai_thrift_direction_lookup_entry_t): direction_lookup_entry IN argument
+
+    For the other parameters, see documentation of direction_lookup_entry CREATE attributes.
+```
+The `client` param is a handle to the already-established Thrift session.
+
+The `direction_lookup_entry` was created by our previous steps.
+
+The `action` is a SAI attribute enum whose valid values we can glean from the SAI header file.
+
+**Find attribute enum values from SAI headers:**
+
+We can examine `SAI/experimental/saiexperimentaldash.h` (which was autogenerated from our P4 code) to find:
+```
+// From saiexperimentaldash.h:
+
+typedef enum _sai_direction_lookup_entry_action_t
+{
+    SAI_DIRECTION_LOOKUP_ENTRY_ACTION_SET_OUTBOUND_DIRECTION,
+
+    SAI_DIRECTION_LOOKUP_ENTRY_ACTION_DENY,
+
+} sai_direction_lookup_entry_action_t;
+
+```
+
+Let's find the Python counterparts. Search for `SAI_DIRECTION_LOOKUP_ENTRY_ACTION` inside `sai_headers.py` to find the following, which are clearly the identical constant names from our SAI headers:
+
+```
+# From sai_headers.py:
+
+SAI_DIRECTION_LOOKUP_ENTRY_ACTION_SET_OUTBOUND_DIRECTION = 0# /usr/include/sai/saiexperimentaldash.h: 45
+
+SAI_DIRECTION_LOOKUP_ENTRY_ACTION_DENY = (SAI_DIRECTION_LOOKUP_ENTRY_ACTION_SET_OUTBOUND_DIRECTION + 1)# /usr/include/sai/saiexperimentaldash.h: 45
+```
+
+**Call the remote create() method:**
+
+We now have all the information needed to execute the RPC call:
+```
+            status = sai_thrift_create_direction_lookup_entry(self.client, self.dle,
+                            action=SAI_DIRECTION_LOOKUP_ENTRY_ACTION_SET_OUTBOUND_DIRECTION)
+```
+
+We can also find valid values for `status` inside `sai_headers.py`. Search for `SAI_STATUS` to find entries such as:
+```
+# From sai_headers.py:
+
+# /usr/include/sai/saistatus.h: 50
+try:
+    SAI_STATUS_SUCCESS = 0
+except:
+    pass
+```
+
+Voila! You're ready to become a saithrift power-user.
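+
+Putting the pieces together, here is a hedged end-to-end sketch of the round trip (local object, remote create, status check, cleanup). The `client` handle is whatever your framework provides, e.g. `self.client` in a PTF test; the function name is a hypothetical example:
+```
+# End-to-end sketch using the entities located above: build the local
+# object, create it remotely over saithrift, verify status, remove it.
+from sai_thrift.sai_headers import *  # SAI_STATUS_*, action enum constants
+from sai_thrift.sai_adapter import *  # sai_thrift_create_/remove_* functions
+from sai_thrift.ttypes import *       # sai_thrift_*_t data types
+
+def create_and_remove_direction_lookup(client, vni=60):
+    dle = sai_thrift_direction_lookup_entry_t(switch_id=0, vni=vni)
+    status = sai_thrift_create_direction_lookup_entry(client, dle,
+        action=SAI_DIRECTION_LOOKUP_ENTRY_ACTION_SET_OUTBOUND_DIRECTION)
+    assert status == SAI_STATUS_SUCCESS
+    # ... configure more entries and/or send test traffic here ...
+    status = sai_thrift_remove_direction_lookup_entry(client, dle)
+    assert status == SAI_STATUS_SUCCESS
+```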
+# Debugging saithrift Server with GDB
+`gdb` is built into the saithrift server image for easy debugging. Server code is compiled with the `-g` flag to include debug symbols. The saithrift server source code is available from within the running Docker container via volume mounts. Below is a typical workflow:
+
+## Run Interactive saithrift-server container
+This starts the container and opens a bash session instead of running the server as normal. The working directory `/SAI/rpc/usr/sbin` contains the `saiserver` binary.
+```
+$ make run-saithrift-server-bash
+docker run --rm -it --net=host --name dash-saithrift-server-chris -v /home/chris/chris-DASH/DASH/dash-pipeline/bmv2/dash_pipeline.bmv2/dash_pipeline.json:/etc/dash/dash_pipeline.json -v /home/chris/chris-DASH/DASH/dash-pipeline/bmv2/dash_pipeline.bmv2/dash_pipeline_p4rt.txt:/etc/dash/dash_pipeline_p4rt.txt -v /home/chris/chris-DASH/DASH/dash-pipeline/SAI:/SAI -w /SAI/rpc/usr/sbin -v /home/chris/chris-DASH/DASH/dash-pipeline/SAI/SAI/meta:/meta -e LD_LIBRARY_PATH=/SAI/lib:/usr/local/lib chrissommers/dash-saithrift-bldr:220719 \
+/bin/bash
+chris@chris-z4:/SAI/rpc/usr/sbin$
+```
+Start gdb on the saiserver process:
+```
+chris@chris-z4:/SAI/rpc/usr/sbin$ gdb saiserver
+GNU gdb (Ubuntu 9.2-0ubuntu1~20.04.1) 9.2
+Copyright (C) 2020 Free Software Foundation, Inc.
+License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
+This is free software: you are free to change and redistribute it.
+There is NO WARRANTY, to the extent permitted by law.
+Type "show copying" and "show warranty" for details.
+This GDB was configured as "x86_64-linux-gnu".
+Type "show configuration" for configuration details.
+For bug reporting instructions, please see:
+<https://www.gnu.org/software/gdb/bugs/>.
+Find the GDB manual and other documentation resources online at:
+    <http://www.gnu.org/software/gdb/documentation/>.
+--Type <RET> for more, q to quit, c to continue without paging--c
+
+For help, type "help".
+Type "apropos word" to search for commands related to "word"...
+Reading symbols from saiserver...
+```
+Point gdb to the mounted source directory, which must be built locally via `make saithrift-server`:
+```
+(gdb) dir /meta
+Source directories searched: /meta:$cdir:$cwd
+```
+Set some breakpoints:
+```
+(gdb) b sai_api_query
+Breakpoint 1 at 0x76d90
+(gdb) b create_outbound_eni_to_vni_entry
+Function "create_outbound_eni_to_vni_entry" not defined.
+Make breakpoint pending on future shared library load? (y or [n]) y
+Breakpoint 2 (create_outbound_eni_to_vni_entry) pending.
+```
+Run the process:
+```
+(gdb) r
+Starting program: /SAI/rpc/usr/sbin/saiserver
+warning: Error disabling address space randomization: Operation not permitted
+[Thread debugging using libthread_db enabled]
+Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1".
+[New Thread 0x7f2b50e25700 (LWP 15)]
+[New Thread 0x7f2b48624700 (LWP 16)]
+[New Thread 0x7f2b4bfff700 (LWP 17)]
+GRPC call SetForwardingPipelineConfig 0.0.0.0:9559 => /etc/dash/dash_pipeline.json, /etc/dash/dash_pipeline_p4rt.txt
+```
+The first breakpoint is reached; it's startup behavior. Enter `c` to resume:
+```
+Thread 1 "saiserver" hit Breakpoint 1, sai_api_query (api=SAI_API_UNSPECIFIED, api_method_table=0x558bad22dd30) at utils.cpp:217
+217         _Out_ void **api_method_table) {
+(gdb) c
+Continuing.
+Starting SAI RPC server on port 9092
+[New Thread 0x7f2b4b7fe700 (LWP 18)]
+[New Thread 0x7f2b4affd700 (LWP 19)]
+```
\ No newline at end of file
diff --git a/dash-pipeline/README.md b/dash-pipeline/README.md
index 26e531bc4..d6cab444a 100644
--- a/dash-pipeline/README.md
+++ b/dash-pipeline/README.md
@@ -4,6 +4,9 @@ See also:
 * [README-dash-workflows.md](README-dash-workflows.md) for build workflows and Make targets.
 * [README-dash-ci](README-dash-ci.md) for CI pipelines.
 * [README-dash-docker](README-dash-docker.md) for Docker usage.
+* [README-saithrift](README-saithrift.md) for saithrift client/server and test workflows.
+* [README-ptftests](README-ptftests.md) for saithrift PTF test-case development and usage.
+* [README-pytests](README-pytests.md) for saithrift Pytest test-case development and usage.
 
 # DASH Pipeline
 This is a P4 model of the DASH overlay pipeline which uses the [bmv2](https://github.com/p4lang/behavioral-model) from [p4lang](https://github.com/p4lang). It includes the P4 program which models the DASH overlay dataplane; Dockerfiles; build and test infrastructure; and CI (Continuous Integration) spec files.
@@ -19,12 +22,9 @@ This is a P4 model of the DASH overlay pipeline which uses the [bmv2](https://gi - [Quick-start](#quick-start) - [Prerequisites](#prerequisites) - [Clone this repo](#clone-this-repo) - - [Get the right branch](#get-the-right-branch) - [I feel lucky!](#i-feel-lucky) - - [Build Artifacts](#build-artifacts) - - [Run bmv2 software switch](#run-bmv2-software-switch) - - [Run tests](#run-tests) - [Cleanup](#cleanup) + - [More Make Targets](#more-make-targets) - [Installing Prequisites](#installing-prequisites) - [Install git](#install-git) - [Install docker](#install-docker) @@ -35,20 +35,20 @@ This is a P4 model of the DASH overlay pipeline which uses the [bmv2](https://gi # Known Issues * P4 code doesn't loop packets back to same port. * P4 code mark-to-drop not set when meta.drop is set. +* Permission and ownership issues in Docker images, permanent fix is needed. # TODOs ## Loose Ends Small items to complete given the exsting features and state, e.g. excluing major roadmap items. -* n/a +* Update SAI submodule to upstream when PRs are merged (currently using dev branches for URLs) +* Produce "dev" and "distro" versions of docker images. Dev images mount to host FS and use artifacts built on the host. Distro images are entirely self-contained including all artifacts. ## Desired Optimizations * Build a Docker image automatically when its Dockerfile changes, publish and pull from permanent repo * Use Azure Container Registry (ACR) for Docker images instead of temporary Dockerhub registry * Use dedicated higher-performance runners instead of [free Azure 2-core GitHub runner instances](https://docs.github.com/en/actions/using-github-hosted-runners/about-github-hosted-runners#supported-runners-and-hardware-resources) -* Explore use of [virtualenv](https://virtualenv.pypa.io/en/latest/) to avoid contaiminating the local environment with this project's particular Python requirements. ## Roadmap These are significant feature or functionality work items. * Use modified bmv2 which adds stateful processing. Current version is vanilla bmv2. This will require building it instead of using a prebuilt bmv2 docker image, see [Build Docker dev container](#build-docker-dev-container). [**WIP**] -* Integrate SAI-thrift server from [OCP/SAI](https://github.com/opencomputeproject/SAI) [**WIP**] * Add DASH sevice test cases including SAI-thrift pipeline configuration and traffic tests # Quick-start @@ -62,102 +62,79 @@ See [Installing Prequisites](#installing-prequisites) for details. * git - tested with version 2.25.1 * docker * [docker-compose](#install-docker-compose) (**1.29.2 or later**) -* python3, pip3 ## Clone this repo ``` git clone -git submodule update --init # NOTE --recursive not needed (yet) +cd DASH ``` -## Get the right branch - -**Optional** - if you require a particular dev branch. +**Optional** - if you require a particular dev branch: ``` git checkout ``` -## I feel lucky! -Eager to see it work? Try this: - -In one terminal: +Init (clone) the SAI submodule: ``` -make clean && make all network run-switch +git submodule update --init # NOTE --recursive not needed (yet) ``` -In another terminal: +## I feel lucky! +Eager to see it work? First [clone this repo](#clone-this-repo), then do the following: + +In first terminal (console will print bmv2 logs): ``` -make run-all-tests clean +cd dash-pipeline +make clean && make all run-switch ``` -The final `clean` above will kill the switch, delete artifacts and veth pairs. 
+The above procedure takes a while since it has to pull docker images (once) and build some code.
-Below we break down the steps in more detail.
-
-## Build Artifacts
+In second terminal (console will print saithrift server logs):
 ```
-make clean  # optional, as needed
-make all
+make run-saithrift-server
 ```
-
-## Run bmv2 software switch
-This will also automatically ceate `veth` pairs as needed.
+In third terminal (console will print test results):
 ```
-make run-switch # willrun in foreground with logging
+make run-all-tests
 ```
-
-## Run tests
-Use a different terminal:
+When you're done, do:
 ```
-make run-test # Simple SAI table accessor, no traffic
+make kill-all # just to stop the daemons
+              # you can redo "run" commands w/o rebuilding
 ```
-Follow instructions for [Install docker-compose](#install-docker-compose), then:
+*OR*
 ```
-make run-ixiac-test # Uses SW traffic-generator
+make clean # stop daemons and clean everything up
 ```
-The setup for ixia-c traffic tests is as follows. More info is available [here](README-dash-workflows#about-snappi-and-ixia-c-traffic-generator).
+The final `clean` above will kill the switch, delete artifacts and veth pairs.
+
+
+The tests may use a combination of SW packet generators:
+* Scapy - well-known packet-at-a-time SW traffic generator/capture
+* ixia-c - performant flow-based packet generator/capture
+
+The setup for ixia-c-based traffic tests is as follows. More info is available [here](README-dash-workflows.md#about-snappi-and-ixia-c-traffic-generator).
 
 ![ixia-c setup](../test/third-party/traffic_gen/deployment/ixia-c.drawio.svg)
 
-## Cleanup
+## Cleanup
 This is a summary of most-often used commands, see [README-dash-workflows.md](README-dash-workflows.md) for more details.
 * `CTRL-c` - kill the switch container from within the iteractive terminal
-* `make kill-switch` - kills the switch container from another terminal
-* `make network-clean` - delete veth pairs
-* `make undeploy-ixiac` - kill ixia-c containers
-* `make p4-clean` - delete P4 artifacts
-* `make sai-clean` - delete SAI artifacts. Do this before committing code, see [Here](README-dash-workflows.md#typical-workflow-committing-new-code---ignoring-sai-submodule)
-* `make clean` - does all of the above
+* `make kill-all` - kill all the running containers
+* `make clean` - clean up everything, kill containers
+## More Make Targets
+See [README-dash-workflows.md](README-dash-workflows.md) for build workflows and Make targets. There are many fine-grained Make targets to control your development workflow.
 # Installing Prequisites
-
 ## Install git
 ```
 sudo apt install -y git
 ```
-
 ## Install docker
 Need for basically everything to build/test dash-pipeline.
 See:
 * https://docs.docker.com/desktop/linux/install/
 
-## Install Python 3
-This is probably already installed in your Linux OS, but if not:
-
-See:
-* https://docs.python-guide.org/starting/install3/linux/
-
-```
-sudo apt install -y python3
-```
-
-## Install pip3
-See:
-* https://pip.pypa.io/en/latest/installation/
-
-You can probably use the following command for most cases:
-```
-sudo apt install -y python3-pip
-```
 ## Install docker-compose
 >**NOTE** Use docker-compose 1.29.2 or later! The `.yml` file format changed. Using an older version might result in an error such as:
`ERROR: Invalid interpolation format for "controller" option in service "services": "ixiacom/ixia-c-controller:${CONTROLLER_VERSION:-latest}"` @@ -168,10 +145,11 @@ See also: * https://www.cyberithub.com/how-to-install-docker-compose-on-ubuntu-20-04-lts-step-by-step/ -Installation of `docker-compose` has to be done just once. You can use another technique based on your platform and preferences. The following will download and install a linux executable under `/usr/local/bin`. You should have a PATH to this directory. You can edit the below command to locate it somewhere else as desired, just change the path as needed. +Installation of `docker-compose` has to be done just once. You can use another technique based on your platform and preferences. The following will download and install a linux executable under `/usr/local/bin`. You should have a PATH to this directory. **You can edit the commands below to locate it somewhere else as desired; just change the path as needed.** ``` +sudo mkdir -p /usr/local/bin sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose sudo chmod +x /usr/local/bin/docker-compose ``` diff --git a/dash-pipeline/SAI/README.md b/dash-pipeline/SAI/README.md index 16d3d7201..0ab5f8dfa 100644 --- a/dash-pipeline/SAI/README.md +++ b/dash-pipeline/SAI/README.md @@ -1,3 +1,5 @@ +# dash-pipeline/SAI directory description +## sai_api_gen.py ``` usage: sai_api_gen.py [-h] [--print-sai-lib PRINT_SAI_LIB] [--sai-git-url SAI_GIT_URL] @@ -31,3 +33,8 @@ Example: ``` In this example, the input is a dash_pipeline.json, which is a result of a P4 code compilation. The list of tables to ignore is provided to not generate API for them, because they are representing the underlay. A custom Git URL and branch can be provided. The last argument is a name of the API. + +# requirements.txt +This is used for installing python modules, in particular for [snappi](https://github.com/open-traffic-generator/snappi) and [pytest](https://docs.pytest.org/en/7.1.x/index.html). + +>**NOTE:** This file is a **hardlink** pointing to a single source of truth for test infrastructure. Take care accordingly. Modifying its contents will impact other collections of test scripts. It's a hardlink for convenience in order to pass the Docker context when building the `saithrift-client` images. We can't use a symlink because Docker cannot dereference symlinks in the context passed to it, see https://stackoverflow.com/questions/31881904/docker-follow-symlink-outside-context and https://medium.com/@307/hard-links-and-symbolic-links-a-comparison-7f2b56864cdd. 
\ No newline at end of file diff --git a/dash-pipeline/SAI/SAI b/dash-pipeline/SAI/SAI index 950e45867..3feb459e6 160000 --- a/dash-pipeline/SAI/SAI +++ b/dash-pipeline/SAI/SAI @@ -1 +1 @@ -Subproject commit 950e45867ef18413dfc58ceba2468cdfa9aff7f9 +Subproject commit 3feb459e6cd7984cffb5e39ef5867ce3e4590a5e diff --git a/dash-pipeline/SAI/generate_dash_api.sh b/dash-pipeline/SAI/generate_dash_api.sh index 0c08cf392..60b6c3566 100755 --- a/dash-pipeline/SAI/generate_dash_api.sh +++ b/dash-pipeline/SAI/generate_dash_api.sh @@ -1,6 +1,5 @@ #!/bin/bash -./sai_api_gen.py \ +sudo ./sai_api_gen.py \ /bmv2/dash_pipeline.bmv2/dash_pipeline_p4rt.json \ --ignore-tables=appliance,eni_meter,slb_decap \ - --overwrite=false \ dash \ No newline at end of file diff --git a/dash-pipeline/SAI/sai_api_gen.py b/dash-pipeline/SAI/sai_api_gen.py index 9238914e2..2504f77c3 100755 --- a/dash-pipeline/SAI/sai_api_gen.py +++ b/dash-pipeline/SAI/sai_api_gen.py @@ -354,21 +354,13 @@ def write_sai_files(sai_api): parser.add_argument('apiname', type=str, help='Name of the new SAI API') parser.add_argument('--print-sai-lib', type=bool) parser.add_argument('--ignore-tables', type=str, default='', help='Comma separated list of tables to ignore') -parser.add_argument('--overwrite', type=bool, default=False, help='Restore SAI subdirectories') args = parser.parse_args() if not os.path.isfile(args.filepath): print('File ' + args.filepath + ' does not exist') exit(1) -if os.path.exists('./lib'): - if args.overwrite == False: - print('Directory ./lib already exists. Please remove in order to proceed') - exit(1) - else: - print('Directory ./lib will be deleted...') - shutil.rmtree('./lib') - +# # Get SAI dictionary from P4 dictionary print("Generating SAI API...") with open(args.filepath) as json_program_file: @@ -376,8 +368,6 @@ def write_sai_files(sai_api): sai_apis = generate_sai_apis(json_program, args.ignore_tables.split(',')) -os.mkdir("lib") - # Write SAI dictionary into SAI API headers sai_api_name_list = [] for sai_api in sai_apis: diff --git a/dash-pipeline/SAI/saithrift/Makefile b/dash-pipeline/SAI/saithrift/Makefile index ab7d0b4a9..936d380fc 100644 --- a/dash-pipeline/SAI/saithrift/Makefile +++ b/dash-pipeline/SAI/saithrift/Makefile @@ -1,38 +1,67 @@ -all: sai-thrift-server +all: saithrift-server saithrift-server-install # SAI submodule used by below targets; modified in-place! -SAI=../SAI -# Host dir where sai-thrift artifacts will be installed -LIB=../lib +SAI=$(shell pwd)/../SAI +# Host dir where saithrift artifacts will be installed +LIB=$(shell pwd)/../lib +RPC_INST_DIR=$(shell pwd)/../rpc META=../SAI/meta +SAIRPC_EXTRA_LIBS="\ + -L/lib/x86_64-linux-gnu -Wl,-rpath=/lib/x86_64-linux-gnu -lm \ + -L/usr/local/lib/ -Wl,-rpath=/usr/local/lib \ + -lpthread \ + -lpiprotogrpc \ + -lpiprotobuf \ + -lprotobuf \ + -lgrpc++ \ + -lgrpc \ + -lpiall \ + -lpi_dummy \ + -lpthread \ + -labsl_synchronization \ + -labsl_status \ + -labsl_raw_hash_set \ + -lgpr \ + -lre2 \ + -lssl \ + -laddress_sorting" + # Below based on: https://github.com/opencomputeproject/SAI/blob/088627dd90c3420daf96d294c661b4a152afb01e/ptf/SAI_PTF_user-guide.md # Dependencies are assumed to be installed, e.g. 
in the docker container -sai-thrift-server: +saithrift-server: # Copy headers to /usr/include/sai sudo mkdir -p /usr/include/sai - sudo cp $(SAI)/inc/sai*.h $(SAI)/experimental/*.h /usr/include/sai/ + sudo cp $(SAI)/inc/sai*.h /usr/include/sai/ + # Following is workaround for https://github.com/opencomputeproject/SAI/issues/1537 + sudo cp $(SAI)/experimental/sai*.h /usr/include/sai/ + sudo cp -r $(SAI)/experimental/ /usr/include/ # Install vendor specific SAI library i.e. DASH bmv2 libsai.so in /usr/lib. sudo cp $(LIB)/libsai.so /usr/lib @echo "Build SAI thrift server and libraries..." - cd $(SAI) && export SAITHRIFTV2=y && export GEN_SAIRPC_OPTS="-ve" && make saithrift-build saithrift-install + cd $(SAI) && export SAITHRIFTV2=y && \ + export GEN_SAIRPC_OPTS="-ve" && \ + export SAIRPC_EXTRA_LIBS=$(SAIRPC_EXTRA_LIBS) && \ + make saithrift-build && \ + export DESTDIR=$(RPC_INST_DIR) && make saithrift-install # NOTE: commands below is a workaround (WA) and needed until packaging of SAI python is fixed. # Re-generate python SAI thrift library again - cd $(SAI)/test/saithriftv2 && make install-pylib + cd $(SAI)/test/saithriftv2 && export DESTDIR=$(RPC_INST_DIR) && make install-pylib # Copy auto-generated python SAI thrift library to your Test controller host. - cp $(SAI)/test/saithriftv2/dist/saithrift-0.9.tar.gz $(LIB) + cp $(SAI)/test/saithriftv2/dist/saithrift-0.9.tar.gz $(RPC_INST_DIR) # Copy thrift libs from builder image onto host - cp /thrift-0.11.0/lib/py/dist/* $(LIB) + cp /usr/lib/libthrift*so* /usr/lib/thrift-0.11.0.tar.gz $(RPC_INST_DIR) -sai-thrift-server-clean: +saithrift-server-clean: cd $(SAI) && export SAITHRIFTV2=y && make clean rm -rf $(SAI)/test/saithriftv2/gen-cpp/ rm -rf $(SAI)/test/saithriftv2/obj/ + rm -rf $(RPC_INST_DIR) -clean: sai-thrift-server-clean +clean: saithrift-server-clean diff --git a/dash-pipeline/SAI/saithrift/Makefile.old b/dash-pipeline/SAI/saithrift/Makefile.old deleted file mode 100644 index 49f7aa7ba..000000000 --- a/dash-pipeline/SAI/saithrift/Makefile.old +++ /dev/null @@ -1,45 +0,0 @@ -all:sai-meta sai-thrift-rpc sai-thrift-server - -META=../SAI/meta -RPC_SRCS=$(META)/sai_rpc_frontend.cpp $(META)/sai_rpc_server.cpp - -sai-meta: - @echo "Generating sai meta sources..." - cd $(META) && $(MAKE) - -sai-meta-clean: - @echo "Cleaning sai meta source..." - cd $(META) && $(MAKE) clean - -sai-thrift-rpc: - @echo "Generating sai-thrift RPC client/server sources..." - cd $(META) && ./gensairpc.pl -ve - -sai-thrift-server: $(RPC_SRCS) - @echo "Generating sai-thrift RPC server daemon..." 
- g++ \ - -Wno-conversion \ - -I /SAI/SAI/inc \ - -I /SAI/SAI/experimental/ \ - -I /SAI/SAI/meta/generated/gen-cpp \ - -I /SAI/SAI/meta/ \ - -o saithrift_server \ - $(RPC_SRCS) \ - -Wl,-rpath,/SAI/lib \ - -L/SAI/lib/ \ - -lsai \ - -L/usr/local/lib/ \ - -lpthread \ - -lpiprotogrpc \ - -lpiprotobuf \ - -lprotobuf \ - -lgrpc++ \ - -lgrpc \ - -lpiall \ - -lpi_dummy \ - -lpthread \ - -labsl_synchronization \ - -g - -clean: - rm -rf vnet_out diff --git a/dash-pipeline/SAI/templates/Makefile.j2 b/dash-pipeline/SAI/templates/Makefile.j2 index 27e9a4afb..9cbea736f 100644 --- a/dash-pipeline/SAI/templates/Makefile.j2 +++ b/dash-pipeline/SAI/templates/Makefile.j2 @@ -1,16 +1,45 @@ -libsai.so: {% for api in api_names %}sai{{ api }}.cpp {% endfor %} +# DASH libsai.so Makefile +# THIS MAKEFILE IS AUTO-GENERATED FROM templates/Makefile.j2 +# DO NOT MODIFY + +SAI_DIR=../SAI/meta/ + +SAI_SRCS=saimetadatautils.c \ + saimetadata.c \ + saiserialize.c + +SAI_SRC_PATHS=$(addprefix $(SAI_DIR),$(SAI_SRCS)) +SAI_OBJS=$(SAI_SRCS:.c=.o) + +#GXX_FLAGS=-D_GLIBCXX_USE_CXX11_ABI=0 + +libsai.so: utils.cpp \ + {% for api in api_names %}sai{{ api }}.cpp {% endfor %} + gcc \ + -fPIC \ + -c \ + -I ../SAI/meta/ \ + -I ../SAI/inc/ \ + -I ../SAI/experimental/ \ + $(GXX_FLAGS) \ + $(SAI_SRC_PATHS) + g++ \ -fpermissive \ -c \ + -I ../SAI/meta/ \ -I ../SAI/inc/ \ -I ../SAI/experimental/ \ -fPIC \ -g \ + $(GXX_FLAGS) \ utils.cpp \ {% for api in api_names %}sai{{ api }}.cpp {% endfor %} + g++ \ -shared \ -g \ -o libsai.so \ utils.o \ + $(SAI_OBJS) \ {% for api in api_names %}sai{{ api }}.o {% endfor %} diff --git a/dash-pipeline/SAI/templates/saiapi.cpp.j2 b/dash-pipeline/SAI/templates/saiapi.cpp.j2 index dc236165a..44561c5bc 100644 --- a/dash-pipeline/SAI/templates/saiapi.cpp.j2 +++ b/dash-pipeline/SAI/templates/saiapi.cpp.j2 @@ -282,7 +282,7 @@ sai_status_t sai_remove_{{ table.name }}( {% endfor %} retCode = MutateTableEntry(matchActionEntry, p4::v1::Update_Type_DELETE); - if (grpc::StatusCode::OK != retCode) { + if (grpc::StatusCode::OK == retCode) { delete matchActionEntry; return 0; } @@ -310,7 +310,8 @@ sai_status_t sai_get_{{ table.name }}_attribute( {% endif %} {% endfor %} -static sai_{{ app_name }}_api_t sai_{{app_name }}_api_impl = { +/* TODO [cs] Generate .h file for _impl to use within sai_api_query() */ +sai_{{ app_name }}_api_t sai_{{app_name }}_api_impl = { {% for table in tables %} .create_{{ table.name }} = sai_create_{{ table.name }}, .remove_{{ table.name }} = sai_remove_{{ table.name }}, diff --git a/dash-pipeline/SAI/templates/utils.cpp.j2 b/dash-pipeline/SAI/templates/utils.cpp.j2 index a356702ba..aabe8f18c 100644 --- a/dash-pipeline/SAI/templates/utils.cpp.j2 +++ b/dash-pipeline/SAI/templates/utils.cpp.j2 @@ -3,6 +3,8 @@ #include #include #include +#include +#include #include #include #include "p4/v1/p4runtime.grpc.pb.h" @@ -12,6 +14,7 @@ extern "C" { #include "saiobject.h" #include "saistatus.h" #include "saitypes.h" +#include "saiextensions.h" } #include #include @@ -128,7 +131,13 @@ int GetDeviceId() { return deviceId; } +string updateTypeStr(p4::v1::Update_Type updateType) { + const google::protobuf::EnumDescriptor *descriptor = p4::v1::Update_Type_descriptor(); + return descriptor->FindValueByNumber(updateType)->name(); +} + grpc::StatusCode MutateTableEntry(p4::v1::TableEntry *entry, p4::v1::Update_Type updateType) { + p4::v1::WriteRequest request; request.set_device_id(GetDeviceId()); auto update = request.add_updates(); @@ -140,11 +149,11 @@ grpc::StatusCode MutateTableEntry(p4::v1::TableEntry *entry, 
p4::v1::Update_Type grpc::ClientContext context; grpc::Status status = stub->Write(&context, request, &rep); if (status.ok()) { - LOG("GRPC call Write::add_one_entry OK: "); + LOG("GRPC call Write::" << updateTypeStr(updateType) << " OK" << std::endl); } else { LOG("GRPC ERROR["<< status.error_code() <<"]: " << status.error_message() << ", " << status.error_details()); - LOG("GRPC call Write::add_one_entry ERROR: " << std::endl << entry->ShortDebugString()); + LOG("GRPC call Write::" << updateTypeStr(updateType) << " ERROR: " << std::endl << entry->ShortDebugString()); } //MILIND?? What is this? reference release? memory release? entity->release_table_entry(); @@ -187,12 +196,78 @@ bool RemoveFromTable(sai_object_id_t id) { return true; } -/* [cs] placeholders to satisfy sairpcgen until actual callbacks are written */ +/* TODO [cs] placeholders to satisfy sairpcgen until actual callbacks are written + * Replace with some reasonable minimal code to support PTF tests + * Put in own template files e.g. sai_switch.cpp.j2 sai_switch.h.j2 + * Update sai_api_gen.py accordingly. + */ + +sai_status_t sai_create_switch_dummy( + _Out_ sai_object_id_t *switch_id, + _In_ uint32_t attr_count, + _In_ const sai_attribute_t *attr_list) { + + *switch_id = 0; // + fprintf(stderr, "sai_create_switch_dummy()\n"); + return SAI_STATUS_SUCCESS; +} + +sai_status_t sai_get_switch_attribute_dummy( + _In_ sai_object_id_t switch_id, + _In_ uint32_t attr_count, + _Inout_ sai_attribute_t *attr_list) { + fprintf(stderr, "sai_get_switch_attribute_dummy()\n"); + return SAI_STATUS_SUCCESS; +} +sai_switch_api_t sai_switch_api_impl = { + .create_switch = sai_create_switch_dummy, + .remove_switch = (void *)0, + .set_switch_attribute = (void *)0, + .get_switch_attribute = sai_get_switch_attribute_dummy, + .get_switch_stats = (void *)0, + .get_switch_stats_ext = (void *)0, + .clear_switch_stats = (void *)0, + .switch_mdio_read = (void *)0, + .switch_mdio_write = (void *)0, + .create_switch_tunnel = (void *)0, + .remove_switch_tunnel = (void *)0, + .set_switch_tunnel_attribute = (void *)0, + .get_switch_tunnel_attribute = (void *)0 +}; + + +/* TODO [cs] This should be auto-generated or part of per-API, auto-generated include file */ +extern sai_dash_api_t sai_dash_api_impl; +extern sai_dash_vnet_api_t sai_dash_vnet_api_impl; +extern sai_dash_acl_api_t sai_dash_acl_api_impl; + + +/* TODO [cs] This should be auto-generated */ sai_status_t sai_api_query( _In_ sai_api_t api, - _Out_ void **api_method_table) { return SAI_STATUS_SUCCESS; } + _Out_ void **api_method_table) { + + switch(api) { + case SAI_API_SWITCH: + *api_method_table = (void *)&sai_switch_api_impl; + break; + + case SAI_API_DASH: + *api_method_table = (void *)&sai_dash_api_impl; + break; + + case SAI_API_DASH_VNET: + *api_method_table = (void *)&sai_dash_vnet_api_impl; + break; + + default: + return SAI_STATUS_NOT_SUPPORTED; + + } + return SAI_STATUS_SUCCESS; +} sai_status_t sai_object_type_get_availability( diff --git a/dash-pipeline/bmv2/dash_pipeline.p4 b/dash-pipeline/bmv2/dash_pipeline.p4 index 82a93c361..1b7f58477 100644 --- a/dash-pipeline/bmv2/dash_pipeline.p4 +++ b/dash-pipeline/bmv2/dash_pipeline.p4 @@ -174,7 +174,14 @@ control dash_ingress(inout headers_t hdr, } apply { + + /* Send packet on same port it arrived (echo) by default */ + standard_metadata.egress_spec = standard_metadata.ingress_port; + vip.apply(); + // TODO [cs] shouldn't this also be called at end of ingress? 
+ // Shouldn't it call mark_to_drop(standard_metadata); + if (meta.dropped) { return; } diff --git a/dash-pipeline/dockerfiles/.dockerignore b/dash-pipeline/dockerfiles/.dockerignore index f59ec20aa..4a1624118 100644 --- a/dash-pipeline/dockerfiles/.dockerignore +++ b/dash-pipeline/dockerfiles/.dockerignore @@ -1 +1,5 @@ -* \ No newline at end of file +*.bkp +*.log +*.pcap +__pycache__/ +.pytest_cache/ \ No newline at end of file diff --git a/dash-pipeline/dockerfiles/Dockerfile.p4c-bmv2 b/dash-pipeline/dockerfiles/Dockerfile.p4c-bmv2 index 349ae0afa..729d34ff6 100644 --- a/dash-pipeline/dockerfiles/Dockerfile.p4c-bmv2 +++ b/dash-pipeline/dockerfiles/Dockerfile.p4c-bmv2 @@ -1,9 +1,9 @@ # This Dockerfile builds an image used to compile P4 programs for the bmv2 backend only # It's based on public p4lang/p4c docker but strips out uneeded backends. # See https://docs.docker.com/develop/develop-images/multistage-build/ -# FROM p4lang/p4c:stable as builder +# FROM p4lang/p4c:stable as p4lang-p4c # :stable on 2022-07-03: -FROM p4lang/p4c@sha256:e4e8aa3f38e84cc51acb7df60257309a6ddfd77a2d0a05e0db445750f209be93 as builder +FROM p4lang/p4c@sha256:e4e8aa3f38e84cc51acb7df60257309a6ddfd77a2d0a05e0db445750f209be93 as p4lang-p4c LABEL maintainer="SONIC-DASH Community" LABEL description="DASH p4c-bmv2 compiler, minimal" @@ -24,9 +24,9 @@ FROM amd64/ubuntu:20.04 # Need python for "p4c" wrapper & gcc for the C preprocessor RUN apt update && apt install -y python3 gcc -COPY --from=builder /usr/local/lib/* /usr/local/lib/ +COPY --from=p4lang-p4c /usr/local/lib/* /usr/local/lib/ -COPY --from=builder \ +COPY --from=p4lang-p4c \ /usr/lib/x86_64-linux-gnu/libboost_*so* \ /usr/lib/x86_64-linux-gnu/libgc*so* \ /usr/lib/x86_64-linux-gnu/libisl*so* \ @@ -34,13 +34,13 @@ COPY --from=builder \ /usr/lib/x86_64-linux-gnu/libmpfr*so* \ /usr/lib/x86_64-linux-gnu/ -COPY --from=builder /usr/lib/gcc/x86_64-linux-gnu /usr/lib/gcc/x86_64-linux-gnu/ -COPY --from=builder /usr/bin/cpp /usr/bin/ -COPY --from=builder /usr/local/share/p4c/ /usr/local/share/p4c/ +COPY --from=p4lang-p4c /usr/lib/gcc/x86_64-linux-gnu /usr/lib/gcc/x86_64-linux-gnu/ +COPY --from=p4lang-p4c /usr/bin/cpp /usr/bin/ +COPY --from=p4lang-p4c /usr/local/share/p4c/ /usr/local/share/p4c/ WORKDIR /usr/local/bin -COPY --from=builder \ +COPY --from=p4lang-p4c \ /usr/local/bin/p4c \ /usr/local/bin/p4c-bm2-ss \ /usr/local/bin/ @@ -49,7 +49,7 @@ CMD bash # # Alternate approach - selective remove backends etc. 
# # ~ 726 MB vs 971MG for p4lang/p4c:stable -# FROM p4lang/p4c:stable as builder +# FROM p4lang/p4c:stable as p4lang-p4c # LABEL maintainer="SONIC-DASH Community" # LABEL description="DASH p4c-bmv2 compiler, minimal" @@ -114,9 +114,9 @@ CMD bash # yanglint \ # yangre -# # Copy everything from builder as one layer to avoid the deleted files above +# # Copy everything from p4lang-p4c as one layer to avoid the deleted files above # FROM scratch -# COPY --from=builder / / +# COPY --from=p4lang-p4c / / # CMD bash diff --git a/dash-pipeline/dockerfiles/Dockerfile.saithrift b/dash-pipeline/dockerfiles/Dockerfile.saithrift-bldr similarity index 54% rename from dash-pipeline/dockerfiles/Dockerfile.saithrift rename to dash-pipeline/dockerfiles/Dockerfile.saithrift-bldr index 25bb386c4..3bee87e89 100644 --- a/dash-pipeline/dockerfiles/Dockerfile.saithrift +++ b/dash-pipeline/dockerfiles/Dockerfile.saithrift-bldr @@ -1,15 +1,23 @@ -#FROM amd64/ubuntu:18.04 -# amd64/ubuntu:18.04 on 2022-07-03 -FROM amd64/ubuntu@sha256:8da4e9509bfe5e09df6502e7a8e93c63e4d0d9dbaa9f92d7d767f96d6c20a78a + +FROM chrissommers/dash-grpc:1.43.2 as grpc +FROM chrissommers/dash-bmv2-bldr:220630 as bmv2 +# amd64/ubuntu:20.04 on 2022-07-03 +FROM amd64/ubuntu@sha256:b2339eee806d44d6a8adc0a790f824fb71f03366dd754d400316ae5a7e3ece3e as builder LABEL maintainer="SONiC-DASH Community " LABEL description="This Docker image contains the toolchain to build \ -the sai-thrift server for DASH." +the saithrift client & server + sai-P4Runtime adaptor layer, for DASH." # Configure make to run as many parallel jobs as cores available ARG available_processors ARG MAKEFLAGS=-j$available_processors -ENV SAI_PTF_DEPS sudo git python python-pip wget doxygen graphviz aspell-en \ + +# Set TZ to avoid interactive installer +ENV TZ=America/Los_Angeles +RUN ln -snf /usr/share/zoneinfo/$TZ /etc/localtime && echo $TZ > /etc/timezone +ENV GIT_SSL_NO_VERIFY=true + +ENV SAI_PTF_DEPS sudo git python wget doxygen graphviz aspell-en \ libgetopt-long-descriptive-perl libconst-fast-perl \ libtemplate-perl libnamespace-autoclean-perl libmoose-perl libmoosex-aliases-perl @@ -17,8 +25,7 @@ ENV DASH_SAIGEN_DEPS python3 python3-pip RUN apt-get update -qq && \ apt-get install -y --no-install-recommends $SAI_PTF_DEPS $DASH_SAIGEN_DEPS && \ - pip install ctypesgen && \ - pip3 install jinja2 + pip3 install ctypesgen jinja2 ENV SAI_THRIFT_DEPS automake bison flex g++ git libboost-all-dev libevent-dev libssl-dev libtool make pkg-config @@ -40,6 +47,32 @@ RUN wget http://archive.apache.org/dist/thrift/0.11.0/thrift-0.11.0.tar.gz && \ cd / && \ rm -rf thrift-0.11.0 thrift-0.11.0.tar.gz +# TODO - merge into first RUN layer (or delete?) 
this is for dev only +RUN sudo apt install -y gdb + +# Used to make saithrift server +COPY --from=grpc /usr/local/lib/lib*grpc*.so* \ + /usr/local/lib/libabsl*.so* \ + /usr/local/lib/libgpr*.so* \ + /usr/local/lib/libupb*.so* \ + /usr/local/lib/libre2*.so* \ + /usr/local/lib/libaddress_sorting*.so* \ + /usr/local/lib/libssl*.so* \ + /usr/local/lib/libcrypto*.so* \ + /usr/local/lib/ + +COPY --from=grpc /usr/local/lib/libssl*.so* \ + /usr/local/lib/libcrypto*.so* \ + /lib/x86_64-linux-gnu/ + +# Used to make saithrift server +COPY --from=bmv2 /usr/local/lib/libpiprotogrpc.so* \ + /usr/local/lib/libprotobuf.so* \ + /usr/local/lib/libpiprotobuf.so* \ + /usr/local/lib/libpiall.so* \ + /usr/local/lib/libpi_dummy.so* \ + /usr/local/lib/ + WORKDIR / ARG user diff --git a/dash-pipeline/dockerfiles/Dockerfile.saithrift-client b/dash-pipeline/dockerfiles/Dockerfile.saithrift-client new file mode 100644 index 000000000..a50e205e8 --- /dev/null +++ b/dash-pipeline/dockerfiles/Dockerfile.saithrift-client @@ -0,0 +1,39 @@ + +#FROM amd64/ubuntu:20.04 as builder +# amd64/ubuntu:20.04 on 2022-07-03 +FROM chrissommers/dash-saithrift-client-bldr:220723 +LABEL maintainer="SONiC-DASH Community " +LABEL description="This Docker image contains the toolchain to run\ +the saithrift client and test programs for DASH." + +# +# Install/copy artifacts which can change based on current DASH pipeline code, submodules and test cases +# + +# Copy distro PTF submodule and tools from SAI repo +ADD SAI/SAI/test/ptf /SAI/test/ptf +# Install PTF test framework & test-cases from SAI repo +ADD SAI/SAI/ptf /SAI/ptf/ +# Copy thrift python distro +ADD SAI/rpc/thrift-0.11.0.tar.gz / +# Copy autogenerated saithrift library built from SAI headers for DASH dataplane +ADD SAI/rpc/saithrift-0.9.tar.gz / +# Install the python libraries +RUN cd /saithrift-0.9 && \ + sudo python3 setup.py install && \ + cd / && \ + sudo rm -rf saithrift-0.9 &&\ + cd thrift-0.11.0 && \ + sudo python3 setup.py install && \ + cd / &&\ + sudo rm -rf thrift-0.11.0 && \ + cd /SAI/test/ptf && \ + sudo python3 setup.py install + +# Copy distro test-cases into container, making it standalone (doesn't need host FS contents). +# For dev, host directories can be mounted to another container directory to "see" tests in development +ADD tests/ tests/ + +WORKDIR / + +CMD ["/bin/bash"] diff --git a/dash-pipeline/dockerfiles/Dockerfile.saithrift-client-bldr b/dash-pipeline/dockerfiles/Dockerfile.saithrift-client-bldr new file mode 100644 index 000000000..c639f9352 --- /dev/null +++ b/dash-pipeline/dockerfiles/Dockerfile.saithrift-client-bldr @@ -0,0 +1,37 @@ + +#FROM amd64/ubuntu:20.04 +# amd64/ubuntu:20.04 on 2022-07-03 +FROM amd64/ubuntu@sha256:b2339eee806d44d6a8adc0a790f824fb71f03366dd754d400316ae5a7e3ece3e +LABEL maintainer="SONiC-DASH Community " +LABEL description="This Docker image contains the toolchain to build and install \ +the saithrift client and test programs for DASH. It does not contain thrift/saithrift libraries" +ADD requirements.txt /tests/ + +# Below we build the baseline set of tools to run saithrift client tests +# Contents do not include the thrift and saithrift client libraries, which need +# to be added to form another container which reflects the current DASH sai libraries +# Those contents are built in saithrift-server workflow. 
+RUN apt update && apt install -y python3 python3-pip sudo && \ + sudo update-alternatives --install /usr/bin/python python /usr/bin/python3.8 1 && \ + sudo python3 -m pip install -r /tests/requirements.txt && \ + sudo pip3 install scapy pysubnettree + +WORKDIR / + +ARG user +ARG uid +ARG guid +ARG hostname + +ENV BUILD_HOSTNAME $hostname +ENV USER $user + +RUN groupadd -f -r -g $guid g$user + +RUN useradd $user -l -u $uid -g $guid -d /var/$user -m -s /bin/bash + +RUN echo "$user ALL=(ALL) NOPASSWD:ALL" >>/etc/sudoers + +USER $user + +CMD ["/bin/bash"] diff --git a/dash-pipeline/images/dash-p4-bmv2-thrift-workflow.svg b/dash-pipeline/images/dash-p4-bmv2-thrift-workflow.svg index 082c7786d..0d6c544ee 100644 --- a/dash-pipeline/images/dash-p4-bmv2-thrift-workflow.svg +++ b/dash-pipeline/images/dash-p4-bmv2-thrift-workflow.svg @@ -1,4 +1,4 @@ -
[SVG diagram content omitted: dash-p4-bmv2-thrift-workflow.svg was redrawn. The updated diagram renames the old "sirius" artifacts to their DASH names (sirius-pipeline becomes DASH/dash-pipeline, sirius_pipeline.json becomes dash_pipeline.json), adds the sai-thrift-client and saithrift server flows carrying SAI-Thrift commands, distinguishes build-time from run-time containers, and shows the PTF/Pytest test scripts both built into the container (/test) and mounted from the host dev environment (/test-dev), along with the related make targets (e.g. make run-saithrift_XXXtests, make run-saithrift_dev-XXXtests, make run-sai-thrift-server, make deploy-ixia-c).]
\ No newline at end of file diff --git a/dash-pipeline/tests/Makefile b/dash-pipeline/tests/libsai/Makefile similarity index 100% rename from dash-pipeline/tests/Makefile rename to dash-pipeline/tests/libsai/Makefile diff --git a/dash-pipeline/tests/libsai/README.md b/dash-pipeline/tests/libsai/README.md new file mode 100644 index 000000000..d318bb1fb --- /dev/null +++ b/dash-pipeline/tests/libsai/README.md @@ -0,0 +1,2 @@ +# libsai tests directory +These tests are written in c++ and are intended to test and demonstrate writing DASH API configuration and management code which links to the `libsai` library for DASH. In particular, these programs use the SAI-to-P4Runtime adaptor layer. As such they require many libraries including gRPC, protobuf, P4 PI layer etc. in addition to `libsai` itself. \ No newline at end of file diff --git a/dash-pipeline/tests/init_switch/.gitignore b/dash-pipeline/tests/libsai/init_switch/.gitignore similarity index 100% rename from dash-pipeline/tests/init_switch/.gitignore rename to dash-pipeline/tests/libsai/init_switch/.gitignore diff --git a/dash-pipeline/tests/init_switch/Makefile b/dash-pipeline/tests/libsai/init_switch/Makefile similarity index 92% rename from dash-pipeline/tests/init_switch/Makefile rename to dash-pipeline/tests/libsai/init_switch/Makefile index 5e7db3aa4..786fa3f4f 100644 --- a/dash-pipeline/tests/init_switch/Makefile +++ b/dash-pipeline/tests/libsai/init_switch/Makefile @@ -1,5 +1,5 @@ all:init_switch -init_switch: init_switch.cpp +init_switch: init_switch.cpp /SAI/lib/libsai.so echo "building $@ ..." g++ \ -I /SAI/lib \ diff --git a/dash-pipeline/tests/init_switch/init_switch.cpp b/dash-pipeline/tests/libsai/init_switch/init_switch.cpp similarity index 100% rename from dash-pipeline/tests/init_switch/init_switch.cpp rename to dash-pipeline/tests/libsai/init_switch/init_switch.cpp diff --git a/dash-pipeline/tests/vnet_out/.gitignore b/dash-pipeline/tests/libsai/vnet_out/.gitignore similarity index 100% rename from dash-pipeline/tests/vnet_out/.gitignore rename to dash-pipeline/tests/libsai/vnet_out/.gitignore diff --git a/dash-pipeline/tests/vnet_out/Makefile b/dash-pipeline/tests/libsai/vnet_out/Makefile similarity index 92% rename from dash-pipeline/tests/vnet_out/Makefile rename to dash-pipeline/tests/libsai/vnet_out/Makefile index a7c03ce3f..d96e3c6b1 100644 --- a/dash-pipeline/tests/vnet_out/Makefile +++ b/dash-pipeline/tests/libsai/vnet_out/Makefile @@ -1,5 +1,5 @@ all:vnet_out -vnet_out: vnet_out.cpp +vnet_out: vnet_out.cpp /SAI/lib/libsai.so echo "building $@ ..." 
g++ \ -I /SAI/SAI/inc \ diff --git a/dash-pipeline/tests/vnet_out/vnet_out.cpp b/dash-pipeline/tests/libsai/vnet_out/vnet_out.cpp similarity index 72% rename from dash-pipeline/tests/vnet_out/vnet_out.cpp rename to dash-pipeline/tests/libsai/vnet_out/vnet_out.cpp index 9f80ec668..7a0e1cc71 100644 --- a/dash-pipeline/tests/vnet_out/vnet_out.cpp +++ b/dash-pipeline/tests/libsai/vnet_out/vnet_out.cpp @@ -9,16 +9,22 @@ extern sai_status_t sai_create_direction_lookup_entry( _In_ const sai_direction_lookup_entry_t *direction_lookup_entry, _In_ uint32_t attr_count, _In_ const sai_attribute_t *attr_list); +extern sai_status_t sai_remove_direction_lookup_entry( + _In_ const sai_direction_lookup_entry_t *direction_lookup_entry); extern sai_status_t sai_create_eni_ether_address_map_entry( _In_ const sai_eni_ether_address_map_entry_t *outbound_eni_lookup_from_vm_entry, _In_ uint32_t attr_count, _In_ const sai_attribute_t *attr_list); +extern sai_status_t sai_remove_eni_ether_address_map_entry( + _In_ const sai_eni_ether_address_map_entry_t *outbound_eni_lookup_from_vm_entry); extern sai_status_t sai_create_outbound_eni_to_vni_entry( _In_ const sai_outbound_eni_to_vni_entry_t *outbound_eni_to_vni_entry, _In_ uint32_t attr_count, _In_ const sai_attribute_t *attr_list); +extern sai_status_t sai_remove_outbound_eni_to_vni_entry( + _In_ const sai_outbound_eni_to_vni_entry_t *outbound_eni_to_vni_entry); extern sai_dash_api_t sai_dash_api_impl; @@ -83,9 +89,30 @@ int main(int argc, char **argv) std::cout << "Failed to create ENI To VNI" << std::endl; return 1; } - attrs.clear(); + // Delete everything in reverse order + status = sai_remove_outbound_eni_to_vni_entry(&e2v); + if (status != SAI_STATUS_SUCCESS) + { + std::cout << "Failed to remove ENI To VNI" << std::endl; + return 1; + } + + status = sai_remove_eni_ether_address_map_entry(&eam); + if (status != SAI_STATUS_SUCCESS) + { + std::cout << "Failed to remove ENI Lookup From VM" << std::endl; + return 1; + } + + status = sai_remove_direction_lookup_entry(&dle); + if (status != SAI_STATUS_SUCCESS) + { + std::cout << "Failed to remove Direction Lookup Entry" << std::endl; + return 1; + } + std::cout << "Done." << std::endl; diff --git a/dash-pipeline/tests/requirements.txt b/dash-pipeline/tests/requirements.txt new file mode 100644 index 000000000..649e00354 --- /dev/null +++ b/dash-pipeline/tests/requirements.txt @@ -0,0 +1,2 @@ +snappi==0.7.38 +pytest==6.0.1 diff --git a/dash-pipeline/tests/saithrift/README.md b/dash-pipeline/tests/saithrift/README.md new file mode 100644 index 000000000..41370db59 --- /dev/null +++ b/dash-pipeline/tests/saithrift/README.md @@ -0,0 +1,8 @@ +# saithrift tests directory +This directory contains tests for DASH pipeline using python `saithrift` client libraries. The following frameworks are supported; see the corresponding directories: +* [ptf/](ptf) directory - Tests using the [PTF](https://github.com/p4lang/ptf) or Packet test framework, as used in [SAI/ptf](https://github.com/opencomputeproject/SAI/tree/master/ptf) test cases +* [pytest/](pytest/) diretory - Tests using the [Pytest](https://docs.pytest.org/en/7.1.x/index.html) testing framework + +The tests use the same thrift and saithrift client libraries and in general the configuration and setup of the DASH dataplane will use the same APIs and command sequences. The frameworks differ primarily in how test suites are designed and orcestrated. Each framework has advantages and disadvantages, hence both are supported as first-class citizens. 
+ +In particular, the PTF test framework has a significant body of helper libraries that simplify setup. The corollary is that the PTF libraries make a lot of embedded assumptions about the test target, the environment, and the dataplane SW packet generator (scapy). \ No newline at end of file diff --git a/dash-pipeline/tests/saithrift/ptf/README.md b/dash-pipeline/tests/saithrift/ptf/README.md new file mode 100644 index 000000000..e45f126bf --- /dev/null +++ b/dash-pipeline/tests/saithrift/ptf/README.md @@ -0,0 +1,4 @@ +# DASH saithrift PTF Tests +**TODO - Placeholder** + +This directory can contain PTF tests for DASH bmv2. These would supplement the standard tests under [SAI/ptf](../../SAI/ptf). \ No newline at end of file diff --git a/dash-pipeline/tests/saithrift/ptf/run-saithrift-ptftests.sh b/dash-pipeline/tests/saithrift/ptf/run-saithrift-ptftests.sh new file mode 100755 index 000000000..cc1fb26fb --- /dev/null +++ b/dash-pipeline/tests/saithrift/ptf/run-saithrift-ptftests.sh @@ -0,0 +1,5 @@ +#!/bin/bash +# To be run inside the saithrift-client container; assumes SAI repo portions exist under the /SAI directory +sudo ptf --test-dir . --pypath /SAI/ptf \ + --interface 0@veth1 --interface 1@veth3 + diff --git a/dash-pipeline/tests/saithrift/ptf/thrift/test_thrift_session.py b/dash-pipeline/tests/saithrift/ptf/thrift/test_thrift_session.py new file mode 100644 index 000000000..ffd88eb25 --- /dev/null +++ b/dash-pipeline/tests/saithrift/ptf/thrift/test_thrift_session.py @@ -0,0 +1,83 @@ +from sai_thrift.sai_headers import * +from sai_base_test import * + +class TestSaiThriftSession(ThriftInterfaceDataPlane): + """ Test saithrift client connection only""" + def setup(self): + print ("setup()") + + def runTest(self): + print ("TestSaiThriftSession OK") + + + def teardown(self): + print ("teardown()") + +# TODO - need to implement some DASH switch APIs to get switch attributes etc. +# We need this to run traditional PTF tests which depend upon device attributes etc. + +# class TestSaiThriftSaiHelper(SaiHelper): + +# Using SaiHelper base class, we get this (because vlan default = 0) +# root@chris-z4:/saithrift-host# ./run-saithrift-ptftests.sh +# /usr/local/lib/python3.8/dist-packages/ptf-0.9.1-py3.8.egg/EGG-INFO/scripts/ptf:19: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses +# import imp +# test_thrift_session.TestThriftSession ...
***** Number of available resources ***** +# SAI_SWITCH_ATTR_ECMP_MEMBERS : 0 +# ecmp_members : 0 +# SAI_SWITCH_ATTR_NUMBER_OF_ECMP_GROUPS : 0 +# number_of_ecmp_groups : 0 +# SAI_SWITCH_ATTR_AVAILABLE_IPV4_ROUTE_ENTRY : 0 +# available_ipv4_route_entry : 0 +# SAI_SWITCH_ATTR_AVAILABLE_IPV6_ROUTE_ENTRY : 0 +# available_ipv6_route_entry : 0 +# SAI_SWITCH_ATTR_AVAILABLE_IPV4_NEXTHOP_ENTRY : 0 +# available_ipv4_nexthop_entry : 0 +# SAI_SWITCH_ATTR_AVAILABLE_IPV6_NEXTHOP_ENTRY : 0 +# available_ipv6_nexthop_entry : 0 +# SAI_SWITCH_ATTR_AVAILABLE_IPV4_NEIGHBOR_ENTRY : 0 +# available_ipv4_neighbor_entry : 0 +# SAI_SWITCH_ATTR_AVAILABLE_IPV6_NEIGHBOR_ENTRY : 0 +# available_ipv6_neighbor_entry : 0 +# SAI_SWITCH_ATTR_AVAILABLE_NEXT_HOP_GROUP_ENTRY : 0 +# available_next_hop_group_entry : 0 +# SAI_SWITCH_ATTR_AVAILABLE_NEXT_HOP_GROUP_MEMBER_ENTRY : 0 +# available_next_hop_group_member_entry : 0 +# SAI_SWITCH_ATTR_AVAILABLE_FDB_ENTRY : 0 +# available_fdb_entry : 0 +# SAI_SWITCH_ATTR_AVAILABLE_IPMC_ENTRY : 0 +# available_ipmc_entry : 0 +# SAI_SWITCH_ATTR_AVAILABLE_SNAT_ENTRY : 0 +# available_snat_entry : 0 +# SAI_SWITCH_ATTR_AVAILABLE_DNAT_ENTRY : 0 +# available_dnat_entry : 0 +# SAI_SWITCH_ATTR_AVAILABLE_DOUBLE_NAT_ENTRY : 0 +# available_double_nat_entry : 0 +# FAIL + +# ====================================================================== +# FAIL: test_thrift_session.TestThriftSession +# ---------------------------------------------------------------------- +# Traceback (most recent call last): +# File "/saithrift-host/../../SAI/ptf/sai_base_test.py", line 654, in setUp +# super(SaiHelper, self).setUp() +# File "/saithrift-host/../../SAI/ptf/sai_base_test.py", line 304, in setUp +# self.assertNotEqual(self.default_vlan_id, 0) +# AssertionError: 0 == 0 + +# ---------------------------------------------------------------------- +# Ran 1 test in 0.021s + +# FAILED (failures=1) + +# TODO - temporary until fix per above: +class TestSaiThriftSaiHelper(ThriftInterfaceDataPlane): + """ Test saithrift client connection and basic SaiHelper initialization""" + def setup(self): + print ("setup()") + + def runTest(self): + print ("TestSaiThriftSaiHelper OK") + + def teardown(self): + print ("teardown()") diff --git a/dash-pipeline/tests/saithrift/ptf/vnet/test_saithrift_vnet.py b/dash-pipeline/tests/saithrift/ptf/vnet/test_saithrift_vnet.py new file mode 100644 index 000000000..17b65a755 --- /dev/null +++ b/dash-pipeline/tests/saithrift/ptf/vnet/test_saithrift_vnet.py @@ -0,0 +1,71 @@ +import pytest +import snappi +import scapy + +from sai_thrift.sai_headers import * +from sai_base_test import * +# TODO - when switch APIs implemented: +# class TestSaiThrift_create_outbound_eni_to_vni_entry(SaiHelper): + +class TestSaiThrift_create_outbound_eni_to_vni_entry(ThriftInterfaceDataPlane): + """ Test saithrift vnet outbound""" + def setUp(self): + super(TestSaiThrift_create_outbound_eni_to_vni_entry, self).setUp() + self.switch_id = 0 + self.eth_addr = '\xaa\xcc\xcc\xcc\xcc\xcc' + self.vni = 60 + self.eni = 7 + self.dle = sai_thrift_direction_lookup_entry_t(switch_id=self.switch_id, vni=self.vni) + self.eam = sai_thrift_eni_ether_address_map_entry_t(switch_id=self.switch_id, address = self.eth_addr) + self.e2v = sai_thrift_outbound_eni_to_vni_entry_t(switch_id=self.switch_id, eni_id=self.eni) + + try: + + status = sai_thrift_create_direction_lookup_entry(self.client, self.dle, + action=SAI_DIRECTION_LOOKUP_ENTRY_ACTION_SET_OUTBOUND_DIRECTION) + assert(status == SAI_STATUS_SUCCESS) + + status =
sai_thrift_create_eni_ether_address_map_entry(self.client, + eni_ether_address_map_entry=self.eam, + eni_id=self.eni) + assert(status == SAI_STATUS_SUCCESS) + + status = sai_thrift_create_outbound_eni_to_vni_entry(self.client, + outbound_eni_to_vni_entry=self.e2v, + vni=self.vni) + assert(status == SAI_STATUS_SUCCESS) + + except AssertionError as ae: + # Delete entries which might be lingering from previous failures etc.; ignore failures here + print ("Cleaning up after failure...") + sai_thrift_remove_outbound_eni_to_vni_entry(self.client, self.e2v) + sai_thrift_remove_eni_ether_address_map_entry(self.client, self.eam) + sai_thrift_remove_direction_lookup_entry(self.client, self.dle) + raise ae + + + def runTest(self): + # TODO form a packet related to dataplane config + self.udp_pkt = simple_udp_packet() + # TODO expected packet might be different + self.udp_pkt_exp = self.udp_pkt + print("\nSending packet...", self.udp_pkt.__repr__()) + send_packet(self, 0, self.udp_pkt) + print("\nVerifying packet...", self.udp_pkt_exp.__repr__()) + verify_packet(self, self.udp_pkt_exp, 0) + print ("test_sai_thrift_create_outbound_eni_to_vni_entry OK") + + def tearDown(self): + + # Delete in reverse order + status = sai_thrift_remove_outbound_eni_to_vni_entry(self.client, self.e2v) + assert(status == SAI_STATUS_SUCCESS) + + status = sai_thrift_remove_eni_ether_address_map_entry(self.client, self.eam) + assert(status == SAI_STATUS_SUCCESS) + + status = sai_thrift_remove_direction_lookup_entry(self.client, self.dle) + assert(status == SAI_STATUS_SUCCESS) + + + diff --git a/dash-pipeline/tests/saithrift/pytest/conftest.py b/dash-pipeline/tests/saithrift/pytest/conftest.py new file mode 100644 index 000000000..3728df592 --- /dev/null +++ b/dash-pipeline/tests/saithrift/pytest/conftest.py @@ -0,0 +1,13 @@ +import pytest +from saithrift_rpc_client import SaithriftRpcClient + +myclient = None +@pytest.fixture +def saithrift_client(): + global myclient + print ("Called fixture saithrift_client()") + if myclient is None: + myclient = SaithriftRpcClient().client + return myclient + + diff --git a/dash-pipeline/tests/saithrift/pytest/echo/test_echo_port.py b/dash-pipeline/tests/saithrift/pytest/echo/test_echo_port.py new file mode 100644 index 000000000..bbcd5488f --- /dev/null +++ b/dash-pipeline/tests/saithrift/pytest/echo/test_echo_port.py @@ -0,0 +1,148 @@ +import snappi +import pytest + + +@pytest.mark.bmv2 +def test_udp_unidirectional(): + """ + This script does the following: + - Send a fixed number of packets in each direction between two ports + at a configured rate (see pkt_count and pps below). + - Validate that all transmitted packets are received + + TODO - configure the DUT. This test as originally written relies on an incomplete P4 pipeline + implementation which "happens" not to drop packets; instead it accepts them and echoes them back.
+ """ + # create a new API instance where location points to controller + api = snappi.api(location="https://localhost", verify=False) + # and an empty traffic configuration to be pushed to controller later on + cfg = api.config() + + # add two ports where location points to traffic-engine (aka ports) + p1, p2 = cfg.ports.port(name="p1", location="localhost:5555").port( + name="p2", location="localhost:5556" + ) + + # add layer 1 property to configure same speed on both ports + ly = cfg.layer1.layer1(name="ly")[-1] + ly.port_names = [p1.name, p2.name] + ly.speed = ly.SPEED_1_GBPS + + + # add two traffic flows + f1, f2 = cfg.flows.flow(name="flow p1->p2").flow(name="flow p2->p1") + # and assign source and destination ports for each + f1.tx_rx.port.tx_name, f1.tx_rx.port.rx_name = p1.name, p2.name + f2.tx_rx.port.tx_name, f2.tx_rx.port.rx_name = p2.name, p1.name + + # configure packet size, rate and duration for both flows + f1.size.fixed, f2.size.fixed = 128, 256 + pkt_count=500 + pps=100 + for f in cfg.flows: + # send pkt_count packets and stop + f.duration.fixed_packets.packets = pkt_count + # send pps packets per second + f.rate.pps = pps + + # configure packet with Ethernet, IPv4 and UDP headers for both flows + eth1, ip1, udp1 = f1.packet.ethernet().ipv4().udp() + eth2, ip2, udp2 = f2.packet.ethernet().ipv4().udp() + + # set source and destination MAC addresses + eth1.src.value, eth1.dst.value = "00:AA:00:00:04:00", "00:AA:00:00:00:AA" + eth2.src.value, eth2.dst.value = "00:AA:00:00:00:AA", "00:AA:00:00:04:00" + + # set source and destination IPv4 addresses + ip1.src.value, ip1.dst.value = "10.0.0.1", "10.0.0.2" + ip2.src.value, ip2.dst.value = "10.0.0.2", "10.0.0.1" + + # set incrementing port numbers as source UDP ports + udp1.src_port.increment.start = 5000 + udp1.src_port.increment.step = 2 + udp1.src_port.increment.count = 10 + + udp2.src_port.increment.start = 6000 + udp2.src_port.increment.step = 4 + udp2.src_port.increment.count = 10 + + # assign list of port numbers as destination UDP ports + udp1.dst_port.values = [4000, 4044, 4060, 4074] + udp2.dst_port.values = [8000, 8044, 8060, 8074, 8082, 8084] + + print("Pushing traffic configuration ...") + api.set_config(cfg) + + print("Starting transmit on all configured flows ...") + ts = api.transmit_state() + ts.state = ts.START + api.set_transmit_state(ts) + + print("Checking metrics on all configured ports ...") + print("Expected\tTotal Tx\tTotal Rx") + assert wait_for(lambda: metrics_ok(api, cfg)), "Metrics validation failed!" 
+ + print("Test passed !") + + +def metrics_ok(api, cfg): + # create a port metrics request and filter based on port names + req = api.metrics_request() + req.port.port_names = [p.name for p in cfg.ports] + # include only sent and received packet counts + req.port.column_names = [req.port.FRAMES_TX, req.port.FRAMES_RX] + + # fetch port metrics + res = api.get_metrics(req) + # calculate total frames sent and received across all configured ports + total_tx = sum([m.frames_tx for m in res.port_metrics]) + total_rx = sum([m.frames_rx for m in res.port_metrics]) + expected = sum([f.duration.fixed_packets.packets for f in cfg.flows]) + + print("%d\t\t%d\t\t%d" % (expected, total_tx, total_rx)) + + return expected == total_tx and total_rx >= expected + + +def captures_ok(api, cfg): + import dpkt + + print("Checking captured packets on all configured ports ...") + print("Port Name\tExpected\tUDP packets") + + result = [] + for p in cfg.ports: + exp, act = 1000, 0 + # create capture request and filter based on port name + req = api.capture_request() + req.port_name = p.name + # fetch captured pcap bytes and feed it to pcap parser dpkt + pcap = dpkt.pcapng.Reader(api.get_capture(req)) + for _, buf in pcap: + # check if current packet is a valid UDP packet + eth = dpkt.ethernet.Ethernet(buf) + if isinstance(eth.data.data, dpkt.udp.UDP): + act += 1 + + print("%s\t\t%d\t\t%d" % (p.name, exp, act)) + result.append(exp == act) + + return all(result) + + +def wait_for(func, timeout=60, interval=0.2): + """ + Keeps calling the `func` until it returns true or `timeout` occurs + every `interval` seconds. + """ + import time + + start = time.time() + + while time.time() - start <= timeout: + if func(): + return True + time.sleep(interval) + + print("Timeout occurred !") + return False \ No newline at end of file diff --git a/dash-pipeline/tests/saithrift/pytest/pytest.ini b/dash-pipeline/tests/saithrift/pytest/pytest.ini new file mode 100644 index 000000000..42107e96f --- /dev/null +++ b/dash-pipeline/tests/saithrift/pytest/pytest.ini @@ -0,0 +1,6 @@ +[pytest] +markers = + bmv2: test DASH bmv2 model + saithrift: test DASH using saithrift API + vnet: test DASH vnet scenarios + \ No newline at end of file diff --git a/dash-pipeline/tests/saithrift/pytest/run-saithrift-pytests.sh b/dash-pipeline/tests/saithrift/pytest/run-saithrift-pytests.sh new file mode 100755 index 000000000..9cb433083 --- /dev/null +++ b/dash-pipeline/tests/saithrift/pytest/run-saithrift-pytests.sh @@ -0,0 +1,2 @@ +#!/bin/bash +python3 -m pytest . 
-s diff --git a/dash-pipeline/tests/saithrift/pytest/saithrift_rpc_client.py b/dash-pipeline/tests/saithrift/pytest/saithrift_rpc_client.py new file mode 100644 index 000000000..aad1070a9 --- /dev/null +++ b/dash-pipeline/tests/saithrift/pytest/saithrift_rpc_client.py @@ -0,0 +1,30 @@ +import pytest + +from thrift.transport import TSocket +from thrift.transport import TTransport +from thrift.protocol import TBinaryProtocol +from sai_thrift import sai_rpc + +THRIFT_PORT = 9092 + +class SaithriftRpcClient: + def __init__(self, port=THRIFT_PORT, server = 'localhost'): + self.transport = None + self.port = port + self.server = server + self.createRpcClient() + + def createRpcClient(self): + """ + Set up thrift client and contact RPC server + """ + + print ("making thrift connection to %s:%d" % (self.server, self.port)) + self.transport = TSocket.TSocket(self.server, self.port) + self.transport = TTransport.TBufferedTransport(self.transport) + self.protocol = TBinaryProtocol.TBinaryProtocol(self.transport) + + self.client = sai_rpc.Client(self.protocol) + self.transport.open() + print ("sai-thrift connection established with %s:%d" % (self.server, self.port)) + diff --git a/dash-pipeline/tests/saithrift/pytest/switch/test_saithrift_switch.py b/dash-pipeline/tests/saithrift/pytest/switch/test_saithrift_switch.py new file mode 100644 index 000000000..f36cb7012 --- /dev/null +++ b/dash-pipeline/tests/saithrift/pytest/switch/test_saithrift_switch.py @@ -0,0 +1,15 @@ +import pytest + +from sai_thrift.sai_headers import * +from sai_thrift.sai_adapter import * +from sai_thrift.ttypes import * + +@pytest.mark.saithrift +@pytest.mark.bmv2 +def test_sai_thrift_get_switch_attribute(saithrift_client): + attr = sai_thrift_get_switch_attribute( + saithrift_client, number_of_active_ports=True) + print ("switch_attributes = %s" % attr) + print ("test_sai_thrift_get_switch_attribute OK") + + diff --git a/dash-pipeline/tests/saithrift/pytest/thrift/test_saithrift_session.py b/dash-pipeline/tests/saithrift/pytest/thrift/test_saithrift_session.py new file mode 100755 index 000000000..4b2d97bab --- /dev/null +++ b/dash-pipeline/tests/saithrift/pytest/thrift/test_saithrift_session.py @@ -0,0 +1,11 @@ +import pytest + +from saithrift_rpc_client import SaithriftRpcClient + +@pytest.mark.saithrift +@pytest.mark.bmv2 +def test_saithrift_session(saithrift_client): + """ Test saithrift client connection only""" + print ("test_saithrift_session OK") + + diff --git a/dash-pipeline/tests/saithrift/pytest/vnet/test_saithrift_vnet.py b/dash-pipeline/tests/saithrift/pytest/vnet/test_saithrift_vnet.py new file mode 100644 index 000000000..b74d6bd1e --- /dev/null +++ b/dash-pipeline/tests/saithrift/pytest/vnet/test_saithrift_vnet.py @@ -0,0 +1,73 @@ +import pytest +import snappi +from scapy.all import * + +from sai_thrift.sai_headers import * +from sai_thrift.sai_adapter import * +from sai_thrift.ttypes import * + +@pytest.mark.saithrift +@pytest.mark.bmv2 +@pytest.mark.vnet + +def test_sai_thrift_create_outbound_eni_to_vni_entry(saithrift_client): + + switch_id = 0 + eth_addr = '\xaa\xcc\xcc\xcc\xcc\xcc' + vni = 60 + eni = 7 + + try: + dle = sai_thrift_direction_lookup_entry_t(switch_id=switch_id, vni=vni) + eam = sai_thrift_eni_ether_address_map_entry_t(switch_id=switch_id, address = eth_addr) + e2v = sai_thrift_outbound_eni_to_vni_entry_t(switch_id=switch_id, eni_id=eni) + + status = sai_thrift_create_direction_lookup_entry(saithrift_client, dle, + action=SAI_DIRECTION_LOOKUP_ENTRY_ACTION_SET_OUTBOUND_DIRECTION) + 
assert(status == SAI_STATUS_SUCCESS) + + status = sai_thrift_create_eni_ether_address_map_entry(saithrift_client, + eni_ether_address_map_entry=eam, + eni_id=eni) + assert(status == SAI_STATUS_SUCCESS) + + status = sai_thrift_create_outbound_eni_to_vni_entry(saithrift_client, + outbound_eni_to_vni_entry=e2v, + vni=vni) + assert(status == SAI_STATUS_SUCCESS) + + # TODO form a packet related to dataplane config + + # TODO this is using raw scapy; prefer to use snappi or a wrapper for scapy or snappi + udp_pkt = Ether()/IP()/UDP() + # TODO expected packet might be different + udp_pkt_exp = udp_pkt + print("\nSending packet...", udp_pkt.__repr__()) + sendp(udp_pkt, iface='veth0') + + # TODO need simple pkt verify for Pytest similar to PTF helper (see sketch below) + print("\nTODO: Verifying packet...", udp_pkt_exp.__repr__()) + # verify_packets(self, udp_pkt_exp, [0]) + print ("test_sai_thrift_create_outbound_eni_to_vni_entry OK") + + # Delete in reverse order + + status = sai_thrift_remove_outbound_eni_to_vni_entry(saithrift_client, e2v) + assert(status == SAI_STATUS_SUCCESS) + + status = sai_thrift_remove_eni_ether_address_map_entry(saithrift_client, eam) + assert(status == SAI_STATUS_SUCCESS) + + status = sai_thrift_remove_direction_lookup_entry(saithrift_client, dle) + assert(status == SAI_STATUS_SUCCESS) + + except AssertionError as ae: + # Delete entries which might be lingering from previous failures etc.; ignore failures here + print ("Cleaning up after failure...") + sai_thrift_remove_outbound_eni_to_vni_entry(saithrift_client, e2v) + sai_thrift_remove_eni_ether_address_map_entry(saithrift_client, eam) + sai_thrift_remove_direction_lookup_entry(saithrift_client, dle) + raise ae + diff --git a/test/docs/testbed/README.testbed.Overview.md b/test/docs/testbed/README.testbed.Overview.md index 7a4322567..39c55330f 100644 --- a/test/docs/testbed/README.testbed.Overview.md +++ b/test/docs/testbed/README.testbed.Overview.md @@ -1,6 +1,6 @@ # DASH Testbed Topology -This repository (https://github.com/Azure/DASH/test) contains all the scripts for setting up testbed and running feature/functional tests. Documents under this folder are for explaining the DASH testbed and running tests. +This repository's [DASH/test](../..) directory contains all the scripts for setting up the testbed and running feature/functional tests. Documents under this folder explain the DASH testbed and how to run the tests. diff --git a/test/images/dash-test-wflow-p4-saithrift.svg b/test/images/dash-test-wflow-p4-saithrift.svg index 4405ae547..3df9196ee 100644 --- a/test/images/dash-test-wflow-p4-saithrift.svg +++ b/test/images/dash-test-wflow-p4-saithrift.svg @@ -1,4 +1,4 @@ -
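The pytest vnet test above leaves dataplane verification as a TODO. For illustration only (not part of this patch), a minimal scapy-based send-and-verify helper might look like the sketch below; the receive interface name (veth2) and the timing values are assumptions, and a production helper would also need PTF-style masking of mutable header fields (TTL, checksums):

import time
from scapy.all import AsyncSniffer, raw, sendp

def send_and_verify(pkt, tx_iface="veth0", rx_iface="veth2", timeout=2):
    # Start capturing on the receive side before transmitting.
    sniffer = AsyncSniffer(iface=rx_iface)
    sniffer.start()
    time.sleep(0.5)            # give the sniffer time to attach
    sendp(pkt, iface=tx_iface, verbose=False)
    time.sleep(timeout)        # allow the pipeline to forward the packet
    captured = sniffer.stop()  # stop() joins the thread and returns the capture
    # Loosely mirrors PTF's verify_packet(): exact byte-for-byte match.
    return any(raw(p) == raw(pkt) for p in captured)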
[Diagram text labels from test/images/dash-test-wflow-p4-saithrift.svg, old and new versions. The only substantive changes: the "Sirius P4" component labels (V1 Architecture and PNA Architecture variants) are renamed to "dash-pipeline P4", and the SVG fallback message changes from "Viewer does not support full SVG 1.1" to "Text is not SVG - cannot display". All other labels are unchanged: TDI (Table-Driven Interface; optional, not required by the DASH project, may be used to verify P4 code), P4-DPDK with native PNA arch support (Intel WIP), bmv2 modified for the V1 model with added stateful tracking (community WIP; long-term PNA compliant?), P4RT server, SAI-P4RT adaptor/P4RT client, saithrift server, libsai, traffic veths, PTF or PyTest SAI-thrift test scripts (scalable to line rate using snappi and HW packet generators), P4RT test scripts, GitHub Actions CI/CD (upon commit, any dependency change triggers a build & test), and SW traffic generators (Scapy or Ixia-c). Notes: for P4-DPDK, P4RT and saithrift are alternate & parallel RPCs and TDI is the native interface; for bmv2, P4RT is the native interface and saithrift is translated into P4RT.]
\ No newline at end of file diff --git a/test/images/dash-test-wflow-saithrift.svg b/test/images/dash-test-wflow-saithrift.svg index a37e82611..d22116d96 100644 --- a/test/images/dash-test-wflow-saithrift.svg +++ b/test/images/dash-test-wflow-saithrift.svg @@ -1,4 +1,4 @@ -
[Diagram text labels from test/images/dash-test-wflow-saithrift.svg, old and new versions. The only substantive changes: "Sirius P4 behavioral model (source of truth)" is renamed to "dash-pipeline P4 behavioral model (source of truth)", the repo label "dash/sirius-pipeline" becomes "DASH/dash-pipeline", and the SVG fallback message changes from "Viewer does not support full SVG 1.1" to "Text is not SVG - cannot display". The remaining labels are unchanged and describe the saithrift workflow, reusable for multiple northbound/southbound APIs: the saithrift code generator takes the standard OCP SAI header file subset (underlay) and the DASH SAI header files (overlay) as inputs and generates the thrift server skeleton C++ code and the Python client + helpers; p4c compiles the P4 target, producing P4 Info and generated SAI headers; these feed the target libsai, build image, DUT image, and DUT software/hardware targets; hand-written and/or templated test cases (abstract format, PTF/PyTest) drive saithrift commands and traffic generator commands (Scapy, snappi) through HW/SW traffic generators over traffic veths or cables for automated and repeatable traffic tests; tests are commit-triggered via the DASH CI/CD pipeline or manually triggered by SW devs, with sources pulled via Git from dash/test and opencompute/SAI.]
\ No newline at end of file diff --git a/test/third-party/traffic_gen/deployment/ixia-c-deployment.yml b/test/third-party/traffic_gen/deployment/ixia-c-deployment.yml index f47372d38..df8b85689 100644 --- a/test/third-party/traffic_gen/deployment/ixia-c-deployment.yml +++ b/test/third-party/traffic_gen/deployment/ixia-c-deployment.yml @@ -1,11 +1,13 @@ services: controller: image: ixiacom/ixia-c-controller:${CONTROLLER_VERSION:-latest} + container_name: ixia-c-controller-${USER} command: --accept-eula network_mode: "host" restart: always traffic_engine_1: image: ixiacom/ixia-c-traffic-engine:${TRAFFIC_ENGINE_VERSION:-latest} + container_name: ixia-c-traffic-engine1-${USER} network_mode: "host" restart: always privileged: true @@ -16,6 +18,7 @@ services: - OPT_NO_HUGEPAGES=Yes traffic_engine_2: image: ixiacom/ixia-c-traffic-engine:${TRAFFIC_ENGINE_VERSION:-latest} + container_name: ixia-c-traffic-engine2-${USER} network_mode: "host" restart: always privileged: true
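With container names now suffixed by ${USER}, multiple developers can run this ixia-c deployment side by side on a shared host. As a quick reachability check, one could reuse the controller and traffic-engine locations already hardcoded in test_echo_port.py; the snippet below is an illustrative sketch, not part of this patch:

import snappi

# Controller on https://localhost, traffic engines on localhost:5555/5556,
# matching ixia-c-deployment.yml (host networking) and test_echo_port.py.
api = snappi.api(location="https://localhost", verify=False)
cfg = api.config()
cfg.ports.port(name="p1", location="localhost:5555").port(name="p2", location="localhost:5556")
api.set_config(cfg)  # raises if the controller or a traffic engine is unreachable
print("ixia-c deployment is up")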