
[devops] TravisCI tests fail more often than not - Need to simplify testing and mock network requests #637

Open
joeyklee opened this issue Oct 16, 2019 · 5 comments

@joeyklee (Contributor)

Dear ml5 community,

I'm submitting a new issue. Please see the details below.

β†’ Step 1: Describe the issue πŸ“

Did you find a bug? Want to suggest an idea for a feature?

  • Want to suggest an idea

Currently the TravisCI tests fail more often than they pass. This is mostly due to flaky network operations: fetching images for analysis, instantiating models that make large requests for pretrained weights from Google or GitHub servers, and so on.

It would be wonderful if we could get this under control so that we can also start building a comprehensive test suite for our features.

@joeyklee (Contributor Author)

Making a note that this is partially improved by #652, but there is still lots of room for improvement, and we still need to add more tests of actual functionality.

@joeyklee (Contributor Author)

joeyklee commented Nov 8, 2019

Making a note that we might consider running mocked tests on CI and the full tests locally.

joeyklee pinned this issue Aug 14, 2020
@joeyklee (Contributor Author)

We should be mocking our async network model requests -- e.g. https://jestjs.io/docs/en/mock-functions. I would definitely add this to our DevOps and testing to-do list.
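A minimal sketch of the idea in plain JavaScript (the function names and result shape below are made up for illustration, not ml5 API; in a real Jest suite, `jest.fn()` would replace the hand-rolled stub):

```javascript
// Hand-rolled version of the mocking idea: tests inject a stub loader
// instead of the real network-backed one. All names here are illustrative.

// Production-style loader: would download pretrained weights over the network.
async function loadModelFromNetwork(url) {
  const response = await fetch(url); // slow and flaky on CI
  return response.json();
}

// Mock loader: resolves instantly with a fake model, no network involved.
async function loadModelMock() {
  return {
    classify: async () => [{ label: "cat", confidence: 0.97 }],
  };
}

// Code under test accepts the loader as a parameter so tests can swap it.
async function classifyImage(loader, image) {
  const model = await loader();
  return model.classify(image);
}

classifyImage(loadModelMock, "fake-image").then((results) => {
  console.log(results[0].label); // "cat"
});
```

Because `classifyImage` takes the loader as an argument, CI can pass `loadModelMock` while local runs pass the real loader.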

@joeyklee (Contributor Author)

Just revisiting some ideas here, which I will jot down:

  • Ideas around moving from mocha/chai to Jest?
  • Employing snapshot tests as a first pass -- though this requires that we mock our model requests.
  • Since models are basically functions (input → output), we could make some generic mock models for testing purposes, something like mockClassificationModel and mockRegressionModel.
  • Along the same lines: maybe something like a mockIO for mocking expected inputs and expected outputs, since one of ml5's main "special sauces" is how we make it easy to pass in a variety of inputs and then structure the outputs in a friendly way.
  • Maybe we also make a tutorial on mocking async functions in ml5 and mocking external model requests 😬, since this is likely a cause of our testing timeouts and also creates barriers for contributors exploring the test code. How wonderful would it be if we could lower the intimidation of writing tests! 😍
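The "models are functions" idea above could look something like this (a sketch only; mockClassificationModel and mockRegressionModel are the hypothetical helpers named in the list, and the output shapes are assumptions, not ml5's actual formats):

```javascript
// Deterministic mock models that mimic the input/output shape of real ones,
// so higher-level code paths can be tested without any network requests.

function mockClassificationModel(labels) {
  // classify() always yields the given labels with evenly split confidences,
  // in a { label, confidence } shape similar to classification results.
  return {
    classify: async (_input) =>
      labels.map((label) => ({ label, confidence: 1 / labels.length })),
  };
}

function mockRegressionModel(value) {
  // predict() always returns the same value, which makes it easy to assert
  // that the surrounding input/output plumbing works.
  return {
    predict: async (_input) => [{ value }],
  };
}

// Usage: exercise a code path with a known, instant, offline "model".
const classifier = mockClassificationModel(["cat", "dog"]);
classifier.classify("any-input").then((results) => {
  console.log(results.length); // 2
});
```

Because the mocks are deterministic and synchronous apart from the promise wrapper, tests built on them cannot time out waiting on external servers.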

@joeyklee (Contributor Author)

joeyklee commented Feb 4, 2022

Noting that I've opened some PRs to start this work:

Some notes:

joeyklee changed the title from "[devops] TravisCI tests fail more often than not" to "[devops] TravisCI tests fail more often than not - Need to simplify testing and mock network requests" Feb 4, 2022
joeyklee added a commit that referenced this issue Apr 5, 2022
refactor(#637): Migrate testing utilities to Jest pt.1/X
joeyklee added a commit that referenced this issue Apr 5, 2022
refactor(#637): Migrate testing utilities to Jest - remove other testing libs pt.2/X
joeyklee added a commit that referenced this issue Apr 6, 2022
refactor(#637): Refactor BodyPix and CharRNN tests to jest