Execute 'async def' test functions #2175
Conversation
This patch discriminates against all non-asyncio frameworks.
I am very new to pytest. Does anyone have a good idea for solving the fundamental problem in a better way?
pytest core should either trigger a pytest warning or an xfail on awaitable results. @hpk42 @nicoddemus please chime in; this is a fundamental decision.
IMHO it makes a lot of sense for pytest to support something that is in the Python stdlib, at least on such a low level.
I agree that the current state is not acceptable. At the very least we should issue a warning, or even fail with an error (I'm leaning towards the latter). Also, there's pytest-asyncio already; we might consider integrating it into the core, as @The-Compiler's point is a valid one. There are other questions to address as well, such as the ability to use custom event loops; I will probably use quamash in the near future for Qt applications.
For the short term, perhaps we should issue an error about awaitable objects not being supported at the moment, and point users towards pytest-asyncio.
I have a question about your discussion and a suggestion about pytest-asyncio.

The question: if pytest has to check the results of test functions in order to show errors, is there any reason not to run them to obtain the actual result? I guess the cost for normal tests is almost the same whether we show an error or obtain the actual result, since either way there is a check on the result and a branch.

pytest-asyncio still offers more features, but I think pytest itself could offer the basic features even without full integration. (Not sure; I don't know your policies well.)

(Edited after the comment below.) For now, users must attach

I think the behaviors of pytest and pytest-asyncio would be ideal if:
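The suggestion above (if the result is checked anyway, why not run it?) can be sketched in plain Python. `run_test` is a hypothetical stand-in for the runner, not pytest's actual code:

```python
import asyncio
import inspect

def run_test(func):
    # Hypothetical helper: call a test function and, if it returned a
    # coroutine (as any 'async def' test does), drive it to completion
    # instead of silently discarding it.
    result = func()
    if inspect.iscoroutine(result):
        return asyncio.run(result)
    return result

async def test_async_sum():
    assert sum([1, 2, 3]) == 6
    return "ran"

print(run_test(test_async_sum))  # prints "ran"
```

The extra branch only costs a single `iscoroutine` check for ordinary synchronous tests, which is the commenter's point about the overhead being negligible.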
Well, as of now it is simply impossible to figure out whether an async function is meant for asyncio or not, and simply enforcing asyncio onto async functions of a different framework breaks people's code, for example if a library exposes both curio and asyncio and tests both.
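The point is easy to demonstrate: introspection sees only a plain coroutine function, with nothing identifying the framework it targets. (`uses_curio` below is illustrative only; curio is never imported.)

```python
import asyncio
import inspect

async def uses_asyncio():
    await asyncio.sleep(0)

async def uses_curio():
    # Imagine 'await curio.sleep(0)' here: the framework choice lives
    # only inside the function body, invisible to a test collector.
    ...

# Both look identical to a collector:
print(inspect.iscoroutinefunction(uses_asyncio))  # True
print(inspect.iscoroutinefunction(uses_curio))    # True
```

Running a curio-style coroutine on an asyncio event loop would fail at await time, which is why blanket asyncio support would break other frameworks' tests.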
Thanks for the explanation. Now I get what you meant by discrimination.
pytest didn't use the result of test functions, but `async def` functions return an awaitable object. Although `async def` test functions passed as if they had no problem, their code never actually ran. For users, it was really hard to detect whether the tests really ran or not. This patch checks the result of test functions and, if it is an awaitable object, actually runs it to ensure the test code executes.
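A minimal illustration of the original problem: calling an `async def` function only creates a coroutine object, so a failing assertion inside it is never reached.

```python
import asyncio

async def test_broken():
    assert 1 + 1 == 3  # would fail, but only if the coroutine is run

result = test_broken()              # no error: nothing inside executed
print(asyncio.iscoroutine(result))  # True
result.close()                      # avoid the "never awaited" warning
```

A runner that ignores the returned value reports such a test as passing, which is exactly the silent failure the patch addresses.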
Created #2224 for adding a warning when collecting `async def` test functions. I was about to propose to close this PR because of the problem of supporting other async frameworks when it occurred to me: if we implement support for async functions in the core, an error might make more sense. On the other hand, we might want to keep async support in external plugins, in which case a warning would make more sense.
What do you guys think? I would vote to add the warning, explicitly mentioning the more popular pytest-asyncio plugin.
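What such a warning could look like, sketched outside of pytest (`call_test` is a hypothetical stand-in for the runner; the real hook-level details are omitted):

```python
import inspect
import warnings

def call_test(func):
    # Hypothetical runner step: if the test returned an un-awaited
    # coroutine, emit a warning pointing users at an async plugin.
    result = func()
    if inspect.iscoroutine(result):
        warnings.warn(
            "'async def' test function was not awaited; consider an "
            "async plugin such as pytest-asyncio",
            RuntimeWarning,
        )
        result.close()

async def test_async():
    assert True

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    call_test(test_async)

print(len(caught))  # 1
```

Turning the warning into a hard error would just mean raising instead of calling `warnings.warn`.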
Closing this PR, as we determined the correct action for py.test is to warn or error out. Still, thanks for the proposal; it sparked the discussion and forced us to arrive at a good idea for how the py.test core should handle this.
Thanks, I will wait for the update. And thanks to everyone on this thread for considering my PR seriously and helping me see where I was wrong :)
Thanks for submitting a PR, your contribution is really appreciated!
Here's a quick checklist that should be present in PRs:
- For bug fixes, target `master`; for new features, target `features`;

Unless your change is a trivial documentation fix (e.g., a typo or reword of a small section), please:

- Add yourself to `AUTHORS`;
- Add an entry to `CHANGELOG.rst`. The pytest team likes to have people acknowledged in the `CHANGELOG`, so please add a thank-you note to yourself ("Thanks @user for the PR") and a link to your GitHub profile. It may sound weird thanking yourself, but otherwise a maintainer would have to do it manually before or after merging instead of just using GitHub's merge button. This makes it easier on the maintainers to merge PRs.