RE: [PATCH v3 1/3] perf test: Add metric value validation test

From: Wang, Weilin
Date: Sun Jun 18 2023 - 13:43:25 EST




> -----Original Message-----
> From: Ravi Bangoria <ravi.bangoria@xxxxxxx>
> Sent: Friday, June 16, 2023 12:03 AM
> To: Wang, Weilin <weilin.wang@xxxxxxxxx>
> Cc: Kan Liang <kan.liang@xxxxxxxxxxxxxxx>; Alt, Samantha
> <samantha.alt@xxxxxxxxx>; Taylor, Perry <perry.taylor@xxxxxxxxx>; Biggers,
> Caleb <caleb.biggers@xxxxxxxxx>; Peter Zijlstra <peterz@xxxxxxxxxxxxx>; Ingo
> Molnar <mingo@xxxxxxxxxx>; Arnaldo Carvalho de Melo <acme@xxxxxxxxxx>;
> Jiri Olsa <jolsa@xxxxxxxxxx>; Namhyung Kim <namhyung@xxxxxxxxxx>; Hunter,
> Adrian <adrian.hunter@xxxxxxxxx>; Ian Rogers <irogers@xxxxxxxxxx>; linux-
> perf-users@xxxxxxxxxxxxxxx; linux-kernel@xxxxxxxxxxxxxxx; Ravi Bangoria
> <ravi.bangoria@xxxxxxx>
> Subject: Re: [PATCH v3 1/3] perf test: Add metric value validation test
>
> Hi,
>
> > diff --git a/tools/perf/tests/shell/stat_metrics_values.sh
> b/tools/perf/tests/shell/stat_metrics_values.sh
> > new file mode 100755
> > index 000000000000..65a15c65eea7
> > --- /dev/null
> > +++ b/tools/perf/tests/shell/stat_metrics_values.sh
> > @@ -0,0 +1,30 @@
> > +#!/bin/bash
> > +# perf metrics value validation
> > +# SPDX-License-Identifier: GPL-2.0
> > +if [ "x$PYTHON" == "x" ]
> > +then
> > + if which python3 > /dev/null
> > + then
> > + PYTHON=python3
> > + else
> > + echo Skipping test, python3 not detected please set
> environment variable PYTHON.
> > + exit 2
> > + fi
> > +fi
> > +
> > +grep -q Intel /proc/cpuinfo || (echo Skipping non-Intel; exit 2)
>
> This check doesn't seem to be working. On my Zen3 AMD machine:

Thanks for reporting this! I've updated the search string to "GenuineIntel" in v4,
which should address this issue. Please check it out.
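For what it's worth, the log below also shows the `|| ( ...; exit 2)` form falling
through: `exit 2` inside parentheses terminates only the subshell, so the parent
script keeps running after printing "Skipping non-Intel". A small illustration of
the difference (this is just a sketch, not the actual v4 patch; the grep input is
faked so it runs anywhere):

```shell
#!/bin/bash
# Subshell form: exit 2 only leaves the ( ... ) group, so the next
# line is still reached -- matching the observed test output.
false || (echo "Skipping non-Intel"; exit 2)
echo "script continued anyway"

# An if-block form really stops the script when the vendor check fails.
# Here a fake vendor_id line stands in for /proc/cpuinfo.
if ! echo "vendor_id : GenuineIntel" | grep -q GenuineIntel
then
	echo "Skipping test: not an Intel CPU"
	exit 2
fi
echo "vendor check passed"
```

With a non-matching vendor string, the if-block version exits with status 2
before reaching the rest of the script, which is the behavior the skip check
presumably wants.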

>
> $ sudo ./perf test -vvv 105
> 105: perf metrics value validation :
> --- start ---
> test child forked, pid 518035
> Skipping non-Intel
> Launch python validation script ./tests/shell/lib/perf_metric_validation.py
> Output will be stored in: /tmp/__perf_test.program.tnPoW
> Starting perf collection
> ...
>
> Interestingly, it passes :)
>
> ...
> Test validation finished. Final report:
> [
> {
> "Workload": "perf bench futex hash -r 2 -s",
> "Report": {
> "Metric Validation Statistics": {
> "Total Rule Count": 2,
> "Passed Rule Count": 2
> },
> "Tests in Category": {
> "PositiveValueTest": {
> "Total Tests": 12,
> "Passed Tests": 12,
> "Failed Tests": []
> },
> "RelationshipTest": {
> "Total Tests": 0,
> "Passed Tests": 0,
> "Failed Tests": []
> },
> "SingleMetricTest": {
> "Total Tests": 2,
> "Passed Tests": 2,
> "Failed Tests": []
> }
> },
> "Errors": []
> }
> }
> ]
> test child finished with 0
> ---- end ----
> perf metrics value validation: Ok
>
> I haven't yet investigated whether it's genuine or false positive.
>
Based on this final report, the run did validate some basic rules: 12 metrics in the
positive-value test and 2 metrics in the single-metric checks. The test script collects
the metrics supported on the platform and generates validation rules that include only
metrics from that supported list. Therefore, it is not surprising that the test passes
on your system.
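The filtering step can be pictured roughly like this (a hypothetical shell sketch,
not the actual perf_metric_validation.py logic; the metric names and the
supported-list file are made up for illustration):

```shell
#!/bin/bash
# Sketch: keep only candidate rule metrics that appear in the list of
# metrics the platform reports as supported, so rules never reference
# a metric the host cannot collect.
supported="ipc
cpi
branch_misprediction_ratio"

candidates="ipc
smt_2t_utilization
cpi"

# For each candidate metric, emit it only if it is in the supported list.
kept=""
while read -r metric
do
	if echo "$supported" | grep -qx "$metric"
	then
		kept="$kept$metric
"
	fi
done <<EOF
$candidates
EOF

printf '%s' "$kept"
```

On an AMD host the supported list simply contains a different set of metrics, so
the generated rules (here, 12 positive-value and 2 single-metric checks) are all
ones the platform can satisfy.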

Thanks,
Weilin

> Thanks,
> Ravi