Message-ID: <71fcc44f-b283-4868-dc97-14441a040147@mojatatu.com>
Date: Tue, 28 Mar 2023 18:24:02 -0300
From: Pedro Tammela <pctammela@...atatu.com>
To: Davide Caratti <dcaratti@...hat.com>,
Jamal Hadi Salim <jhs@...atatu.com>,
Cong Wang <xiyou.wangcong@...il.com>,
Jiri Pirko <jiri@...nulli.us>,
Ilya Maximets <i.maximets@....org>
Cc: Jakub Kicinski <kuba@...nel.org>, netdev@...r.kernel.org
Subject: Re: [PATCH net-next v3 2/4] selftests: tc-testing: add "depends_on"
property to skip tests
On 28/03/2023 13:45, Davide Caratti wrote:
> currently, users can skip individual test cases by means of writing
>
> "skip": "yes"
>
> in the scenario file. Extend this functionality, introducing 'dependsOn':
> it's an optional property like "skip", but the value contains a command (for
> example, a probe on iproute2 to check if it supports a specific feature).
> If such property is present, tdc executes that command and skips the test
> when the return value is non-zero.
>
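Just to illustrate for the archive, a scenario entry using the new property
would look roughly like the following (the id, name, probe command and match
pattern below are made up, loosely modeled on the existing gact tests):

    {
        "id": "b4d0",
        "name": "Add gact pass action, skipped when the probe command fails",
        "dependsOn": "ip -j link show dev lo > /dev/null 2>&1",
        "category": ["actions", "gact"],
        "setup": [
            ["$TC actions flush action gact", 0, 1, 255]
        ],
        "cmdUnderTest": "$TC actions add action pass index 8",
        "expExitCode": "0",
        "verifyCmd": "$TC actions list action gact",
        "matchPattern": "action order [0-9]*: gact action pass.*index 8",
        "matchCount": "1",
        "teardown": [
            "$TC actions flush action gact"
        ]
    }

If the probe command exits non-zero (say, an old iproute2 without JSON
support), tdc reports the test as skipped instead of running the
setup/cmdUnderTest/teardown sequence.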
> Signed-off-by: Davide Caratti <dcaratti@...hat.com>
> ---
> .../creating-testcases/AddingTestCases.txt | 2 ++
> tools/testing/selftests/tc-testing/tdc.py | 13 +++++++++++++
> 2 files changed, 15 insertions(+)
>
> diff --git a/tools/testing/selftests/tc-testing/creating-testcases/AddingTestCases.txt b/tools/testing/selftests/tc-testing/creating-testcases/AddingTestCases.txt
> index a28571aff0e1..ff956d8c99c5 100644
> --- a/tools/testing/selftests/tc-testing/creating-testcases/AddingTestCases.txt
> +++ b/tools/testing/selftests/tc-testing/creating-testcases/AddingTestCases.txt
> @@ -38,6 +38,8 @@ skip: A completely optional key, if the corresponding value is "yes"
> this test case will still appear in the results output but
> marked as skipped. This key can be placed anywhere inside the
> test case at the top level.
> +dependsOn: Same as 'skip', but the value is executed as a command. The test
> + is skipped when the command returns non-zero.
> category: A list of single-word descriptions covering what the command
> under test is testing. Example: filter, actions, u32, gact, etc.
> setup: The list of commands required to ensure the command under test
> diff --git a/tools/testing/selftests/tc-testing/tdc.py b/tools/testing/selftests/tc-testing/tdc.py
> index 7bd94f8e490a..5fa3fe644bfe 100755
> --- a/tools/testing/selftests/tc-testing/tdc.py
> +++ b/tools/testing/selftests/tc-testing/tdc.py
> @@ -369,6 +369,19 @@ def run_one_test(pm, args, index, tidx):
> pm.call_post_execute()
> return res
>
> + if 'dependsOn' in tidx:
> + if (args.verbose > 0):
> + print('probe command for test skip')
> + (p, procout) = exec_cmd(args, pm, 'execute', tidx['dependsOn'])
> + if p:
> + if (p.returncode != 0):
> + res = TestResult(tidx['id'], tidx['name'])
> + res.set_result(ResultState.skip)
> + res.set_errormsg('probe command failed: test skipped.')
nit: I'd go with 'probe command: test skipped' here.
> + pm.call_pre_case(tidx, test_skip=True)
> + pm.call_post_execute()
> + return res
> +
> # populate NAMES with TESTID for this test
> NAMES['TESTID'] = tidx['id']
>
Other than the above, LGTM.
Reviewed-by: Pedro Tammela <pctammela@...atatu.com>