-
I am using `pytest_generate_tests(metafunc)` to generate my test parameters from a config file. So far so good! However, if I cannot find any matching configurations in this config file to pass as params, there seems to be no clean way of skipping or dropping that test. I have tried using `pytest.skip()`, but then the whole module of tests seems to be skipped, which I do not want.

This is pseudocode for what my conftest.py is doing:

```python
def pytest_generate_tests(metafunc):
    test_requirements = get_test_requirements()
    configurations = get_configurations(test_requirements)
    if configurations:
        metafunc.parametrize('fixture', configurations, indirect=True, ids=lambda d: d.name)
    else:
        ...  # SKIP THIS TEST
```

My solution so far is:

```python
metafunc.parametrize('fixture', [pytest.param(None, marks=pytest.mark.skip(reason="No config for test"))])
```

However, I would like something more elegant if possible. I think it's useful to be able to generate tests at collection time to see which tests should/would be run with the current configuration setup you have. Do you know any way to do this?
Replies: 2 comments 2 replies
-
You can deselect those tests, then they won't show up as skipped: https://github.com/aio-libs/multidict/blob/bc86d23/tests/conftest.py#L189-L210.
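A minimal sketch of that pattern, assuming the question's `get_test_requirements()`/`get_configurations()` helpers and a hypothetical `NO_CONFIG` sentinel (the linked multidict conftest uses the same idea with its own placeholder): parametrize with the sentinel when nothing matches, then drop those items in `pytest_collection_modifyitems` so they are reported as deselected rather than skipped.

```python
# conftest.py -- sketch only; NO_CONFIG and the helper functions are assumptions.
NO_CONFIG = object()  # sentinel parametrized in when no configuration matches


def pytest_generate_tests(metafunc):
    if "fixture" in metafunc.fixturenames:
        configurations = get_configurations(get_test_requirements())  # helpers from the question
        metafunc.parametrize(
            "fixture",
            configurations or [NO_CONFIG],
            indirect=True,
            ids=lambda d: "no-config" if d is NO_CONFIG else d.name,
        )


def pytest_collection_modifyitems(config, items):
    # Split collected items into kept and deselected based on the sentinel.
    selected, deselected = [], []
    for item in items:
        callspec = getattr(item, "callspec", None)
        if callspec is not None and NO_CONFIG in callspec.params.values():
            deselected.append(item)
        else:
            selected.append(item)
    if deselected:
        # Report the deselection to pytest and remove the items from the run.
        config.hook.pytest_deselected(items=deselected)
        items[:] = selected
```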
-
Instead of using an empty parameter set, use a dummy parameter that adds a per-parametrization mark to skip that test. |
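A minimal sketch of that suggestion, assuming the same helpers as in the question (this is essentially the workaround the question already uses):

```python
import pytest


def pytest_generate_tests(metafunc):
    configurations = get_configurations(get_test_requirements())  # helpers from the question
    if configurations:
        metafunc.parametrize("fixture", configurations, indirect=True, ids=lambda d: d.name)
    else:
        # One dummy parameter carrying a per-parametrization skip mark,
        # so only this test is skipped rather than the whole module.
        metafunc.parametrize(
            "fixture",
            [pytest.param(None, marks=pytest.mark.skip(reason="No config for test"))],
        )
```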