
kselftest/alsa: Mark test plan as skipped when no cards are available

Message ID 20230908181242.95714-1-nfraprado@collabora.com (mailing list archive)
State New, archived
Series: kselftest/alsa: Mark test plan as skipped when no cards are available

Commit Message

Nícolas F. R. A. Prado Sept. 8, 2023, 6:12 p.m. UTC
When no soundcards are available, it won't be possible to run any tests.
Currently, when this happens, in both pcm-test and mixer-test, 0
tests are reported, and the pass exit code is returned. Instead, call
ksft_exit_skip() so that the whole test plan is marked as skipped in the
KTAP output and it exits with the skip exit code.

Signed-off-by: Nícolas F. R. A. Prado <nfraprado@collabora.com>

---

 tools/testing/selftests/alsa/mixer-test.c | 7 +++++--
 tools/testing/selftests/alsa/pcm-test.c   | 7 +++++--
 2 files changed, 10 insertions(+), 4 deletions(-)
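
For context, a minimal sketch of the kselftest pattern this change relies on, assuming the usual helpers from tools/testing/selftests/kselftest.h; this is a generic illustration of the flow, not the actual mixer-test/pcm-test main():

#include "../kselftest.h"

int main(void)
{
	int num_tests = 0;	/* e.g. no cards found, nothing enumerated */

	ksft_print_header();

	if (!num_tests)
		/*
		 * With no plan set this reports the whole plan as
		 * skipped and exits with KSFT_SKIP (4).
		 */
		ksft_exit_skip("No soundcard available\n");

	ksft_set_plan(num_tests);
	/* ... run the tests, reporting each via ksft_test_result_*() ... */
	ksft_exit_pass();
}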

Comments

Mark Brown Sept. 8, 2023, 11:08 p.m. UTC | #1
On Fri, Sep 08, 2023 at 02:12:40PM -0400, Nícolas F. R. A. Prado wrote:
> When no soundcards are available, it won't be possible to run any tests.
> Currently, when this happens, in both pcm-test and mixer-test, 0
> tests are reported, and the pass exit code is returned. Instead, call
> ksft_exit_skip() so that the whole test plan is marked as skipped in the
> KTAP output and it exits with the skip exit code.

Why?
Nícolas F. R. A. Prado Sept. 11, 2023, 12:35 p.m. UTC | #2
On Sat, Sep 09, 2023 at 12:08:22AM +0100, Mark Brown wrote:
> On Fri, Sep 08, 2023 at 02:12:40PM -0400, Nícolas F. R. A. Prado wrote:
> > When no soundcards are available, it won't be possible to run any tests.
> > Currently, when this happens, in both pcm-test and mixer-test, 0
> > tests are reported, and the pass exit code is returned. Instead, call
> > ksft_exit_skip() so that the whole test plan is marked as skipped in the
> > KTAP output and it exits with the skip exit code.
> 
> Why?

To better reflect the actual test plan status. If 0 tests were run, it doesn't
really make sense to say that the test plan passed, rather it was skipped since
nothing was run. So with this change, if there's a regression that prevents the
soundcard driver from even probing, the result won't be "pass", but "skip", and
the reason 'No soundcard available' will be in the logs.

Thanks,
Nícolas
Mark Brown Sept. 11, 2023, 2:29 p.m. UTC | #3
On Mon, Sep 11, 2023 at 08:35:37AM -0400, Nícolas F. R. A. Prado wrote:
> On Sat, Sep 09, 2023 at 12:08:22AM +0100, Mark Brown wrote:

> > Why?

> To better reflect the actual test plan status. If 0 tests were run, it doesn't
> really make sense to say that the test plan passed, rather it was skipped since
> nothing was run. So with this change, if there's a regression that prevents the
> soundcard driver from even probing, the result won't be "pass", but "skip", and
> the reason 'No soundcard available' will be in the logs.

So, I would interpret the overall result for the suite as being "No
errors were found in any of the cards discovered" if there is no
configuration file specified which enumerates the set of cards that are
expected (if there is a config file that's a different matter, we know
what we're expecting).  I'm not sure that the different behaviour for 0
cards is super useful.
Nícolas F. R. A. Prado Sept. 12, 2023, 7:23 p.m. UTC | #4
On Mon, Sep 11, 2023 at 03:29:00PM +0100, Mark Brown wrote:
> On Mon, Sep 11, 2023 at 08:35:37AM -0400, Nícolas F. R. A. Prado wrote:
> > On Sat, Sep 09, 2023 at 12:08:22AM +0100, Mark Brown wrote:
> 
> > > Why?
> 
> > To better reflect the actual test plan status. If 0 tests were run, it doesn't
> > really make sense to say that the test plan passed, rather it was skipped since
> > nothing was run. So with this change, if there's a regression that prevents the
> > soundcard driver from even probing, the result won't be "pass", but "skip", and
> > the reason 'No soundcard available' will be in the logs.
> 
> So, I would interpret the overall result for the suite as being "No
> errors were found in any of the cards discovered" if there is no
> configuration file specified which enumerates the set of cards that are
> expected (if there is a config file that's a different matter, we know
> what we're expecting).  I'm not sure that the different behaviour for 0
> cards is super useful.

Right... So what we want to be doing is adding a config file for every platform
defining the card(s) and PCMs expected, so that when they're missing a test
failure will be triggered, which is even more helpful. I've noticed that only
missing PCMs are detected currently, but I imagine it should be possible to
extend the code to detect missing cards as well.

I take it the intention is to expand the conf.d directory with configs for all
platforms currently being tested then? There's only one example file there so I
wasn't sure.

Thanks,
Nícolas
Mark Brown Sept. 13, 2023, 3:33 p.m. UTC | #5
On Tue, Sep 12, 2023 at 03:23:34PM -0400, Nícolas F. R. A. Prado wrote:
> On Mon, Sep 11, 2023 at 03:29:00PM +0100, Mark Brown wrote:

> > So, I would interpret the overall result for the suite as being "No
> > errors were found in any of the cards discovered" if there is no
> > configuration file specified which enumerates the set of cards that are
> > expected (if there is a config file that's a different matter, we know
> > what we're expecting).  I'm not sure that the different behaviour for 0
> > cards is super useful.

> Right... So what we want to be doing is adding a config file for every platform
> defining the card(s) and PCMs expected, so that when they're missing a test
> failure will be triggered, which is even more helpful. I've noticed that only
> missing PCMs are detected currently, but I imagine it should be possible to
> extend the code to detect missing cards as well.

It seems like a reasonable approach for systems that do want to have
this confirmation.

> I take it the intention is to expand the conf.d directory with configs for all
> platforms currently being tested then? There's only one example file there so I
> wasn't sure.

I think it's a question for people working on individual systems whether they
want that coverage. For example, I don't really care about this for my CI
because I have higher-level tooling which will notice any newly missing
tests, so I don't need the test to do anything here.

Patch

diff --git a/tools/testing/selftests/alsa/mixer-test.c b/tools/testing/selftests/alsa/mixer-test.c
index c95d63e553f4..8f45c15a5667 100644
--- a/tools/testing/selftests/alsa/mixer-test.c
+++ b/tools/testing/selftests/alsa/mixer-test.c
@@ -66,8 +66,11 @@  static void find_controls(void)
 	char *card_name, *card_longname;
 
 	card = -1;
-	if (snd_card_next(&card) < 0 || card < 0)
-		return;
+	err = snd_card_next(&card);
+	if (err < 0)
+		ksft_exit_skip("Couldn't open first soundcard. rc=%d\n", err);
+	if (card < 0)
+		ksft_exit_skip("No soundcard available\n");
 
 	config = get_alsalib_config();
 
diff --git a/tools/testing/selftests/alsa/pcm-test.c b/tools/testing/selftests/alsa/pcm-test.c
index 2f5e3c462194..74d9cf8b5a69 100644
--- a/tools/testing/selftests/alsa/pcm-test.c
+++ b/tools/testing/selftests/alsa/pcm-test.c
@@ -161,8 +161,11 @@  static void find_pcms(void)
 	snd_pcm_info_alloca(&pcm_info);
 
 	card = -1;
-	if (snd_card_next(&card) < 0 || card < 0)
-		return;
+	err = snd_card_next(&card);
+	if (err < 0)
+		ksft_exit_skip("Couldn't open first soundcard. rc=%d\n", err);
+	if (card < 0)
+		ksft_exit_skip("No soundcard available\n");
 
 	config = get_alsalib_config();