
Continuous Testing. Integrate Ozone's test suite with a continuous testing feature to indicate if build is passing or not #130

Closed
Milind220 opened this issue Apr 24, 2022 · 14 comments · Fixed by #156

@Milind220
Collaborator

Current situation

Ozone now has a test suite. The tests can be manually run by running pytest in the root directory of Ozone.

Desired situation

Ozone should have a continuous testing feature, and a badge/shield to go with it in the README. This way, with a single glance, we can tell that Ozone's working perfectly and without errors.

NOTE: Ozone is currently in the process of fixing many bugs that were discovered by the test suite. Ideally let's only add this continuous testing feature once we have those bugs sorted out.
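For reference, once a workflow exists, GitHub serves a status badge per workflow file. A README snippet might look like this (`OWNER/REPO` and the workflow filename `tests.yml` are placeholders, not actual repo values):

```markdown
[![Tests](https://github.com/OWNER/REPO/actions/workflows/tests.yml/badge.svg)](https://github.com/OWNER/REPO/actions)
```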

@ShootGan
Contributor

maybe you can also use codecov.io to show test coverage

@lahdjirayhan
Contributor

Note: with #132 and #133 I think the tests should pass. I've tested the two branches separately on my device, and I think that, combined, they should pass all tests (I'm not 100% sure). You may want to check locally to confirm this after they get merged, @Milind220.

@Milind220
Collaborator Author

Milind220 commented Apr 25, 2022

@lahdjirayhan OH awesome! that's excellent :) I'll check them out soon!

EDIT: Hey, so I ran the tests again, and I'm still getting 2 failures. Here's the summary I see:

[Screenshot: pytest summary showing the 2 remaining failures, 2022-04-26]

@Milind220 Milind220 self-assigned this Apr 25, 2022
@lahdjirayhan
Contributor

@Milind220 Yeah, I get them too. They fail not because of the test logic, but because a request made in the test is not found in the test's corresponding cassette file. By default, pytest-recording uses --record-mode=none, i.e. it prevents any new request that isn't already in the cassette from being made.

The solution is simple: run the following commands on your device:

# Check that the tests pass if pytest-recording is not used at all, i.e. the test uses live connection to WAQI
pytest tests/test_get_city_forecast.py tests/test_get_coordinate_air.py --disable-recording
# Run the tests using live connection to WAQI, and rewrite the cassettes
pytest tests/test_get_city_forecast.py tests/test_get_coordinate_air.py --record-mode=rewrite
# Save the new cassettes into version control
git commit tests/cassettes -m "test: Re-record cassette"
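To make the failure mode concrete, here's a toy model of what --record-mode=none does. This is a simplified sketch, not pytest-recording's actual implementation; `cassette` and `replay` are illustrative names only:

```python
# Toy model of record modes: a "cassette" maps recorded requests to
# saved responses, and the record mode decides what happens on a miss.
cassette = {"GET https://api.waqi.info/feed/london/": {"status": "ok"}}

def replay(request, record_mode="none"):
    """Serve a request from the cassette according to the record mode."""
    if request in cassette:
        return cassette[request]  # replay the saved response
    if record_mode == "none":
        # Default mode: an unrecorded request is an error, which is why
        # the tests fail until their cassettes are re-recorded.
        raise LookupError(f"request not found in cassette: {request}")
    # A "rewrite"-style mode would hit the live API and save the response.
    cassette[request] = {"status": "recorded"}
    return cassette[request]
```

So a test change that alters which requests get made will fail under the default mode until the cassettes are rewritten, exactly as the commands above do.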

Running the first command on my device gives me this (all the previously failing tests pass):

$ pytest tests/test_get_city_forecast.py tests/test_get_coordinate_air.py --disable-recording
============================= test session starts =============================
platform win32 -- Python 3.8.5, pytest-7.1.1, pluggy-1.0.0
rootdir: E:\Git-LINE\milind-ozone\Ozone
plugins: anyio-3.5.0, profiling-1.7.0, recording-0.12.0
collected 17 items

tests\test_get_city_forecast.py .......                                  [ 41%]
tests\test_get_coordinate_air.py ..........                              [100%]

============================== warnings summary ===============================
src\ozone\ozone.py:93
  e:\git-line\milind-ozone\ozone\src\ozone\ozone.py:93: UserWarning: You have not specified a custom save file name. Existing files with the same name may be overwritten!
    warnings.warn(

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
======================= 17 passed, 1 warning in 40.68s ========================

@lahdjirayhan
Contributor

lahdjirayhan commented Apr 25, 2022

Unrelated to this issue, but maybe later we should rethink how the tests are designed so that fixing a failing test doesn't require re-recording a vcrpy cassette (it's a hassle, and frankly not intuitive for newcomers). Re-recording a cassette also shouldn't happen too often.

EDIT: Nope, after more consideration and careful thought, I've concluded that we can't design our way out of re-recording every time a test changes from failing to passing. Or maybe I haven't thought hard enough.

@Milind220
Collaborator Author

@lahdjirayhan I'm not too well versed in this actually, so I'll do some research too!

@Miller2014
Contributor

Do you still want to implement continuous testing/CI as this post suggests? This is new to me on GitHub, but I've been reading about GitHub Actions, which seems to be a good fit for CI/CD. Have you considered it?

@Milind220
Collaborator Author

Yes, continuous testing is still a priority for Ozone. I've never implemented it before, though, so I'm not sure of the best way to proceed. I did notice that GitHub Actions seems to have some options for this.

If you'd like to give it a try, that'd be really awesome! Even just some research into this would be super helpful!

@ShootGan
Contributor

ShootGan commented May 9, 2022

I think GitHub Actions, plus something like codecov.io for showing test coverage and a badge in the README, would be a good option. The pipeline could be complicated, though; I'm not sure how the cassettes work.
Also, the question is whether we want pylint in the pipeline, given that it's already run via pre-commit.

@Milind220
Collaborator Author

@ShootGan codecov + GitHub Actions sounds perfect.
If you want to learn, this is the ideal place to give it a shot: we can help figure out any issues you face, and patch any mistakes in future PRs.

I don't think we need pylint in the continuous testing pipeline. Just the tests should be plenty.
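As a sketch of what that could look like (the workflow filename, Python version, requirements file, and coverage path are assumptions, not anything decided here), a minimal GitHub Actions workflow wired to Codecov might be:

```yaml
# .github/workflows/tests.yml (hypothetical name)
name: Tests

on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: "3.8"
      - name: Install dependencies
        run: pip install -r requirements.txt pytest pytest-cov
      - name: Run test suite with coverage
        run: pytest --cov=src --cov-report=xml
      - name: Upload coverage to Codecov
        uses: codecov/codecov-action@v3
```

Since the tests replay cassettes with pytest-recording's default mode, they shouldn't need live WAQI access in CI.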

@lahdjirayhan
Contributor

lahdjirayhan commented May 18, 2022

Is anyone working on this? @ShootGan @Miller2014 @Milind220

I want to work on this if no one does yet.

@ShootGan
Contributor

Hi, sorry, right now I don't have enough time to do this, but I can send a similar pipeline YAML.

@Milind220 Milind220 linked a pull request May 20, 2022 that will close this issue
@Milind220
Collaborator Author

@ShootGan No worries! We'll always have more issues to work on hahaha. Thanks for your help :)

@Miller2014
Contributor

Sorry for the lack of response on this. Glad to see you implemented it, great job! I'll keep an eye out for any new issues.
