Increase rigor of tests for model changes #23

Closed
5 of 6 tasks
jonrkarr opened this issue Feb 8, 2021 · 1 comment
jonrkarr commented Feb 8, 2021

The tests for model changes currently only verify that a simulation tool does not raise a run-time error when a SED-ML document includes model changes.

The test suite should more rigorously verify that simulation tools implement each kind of model change correctly:

  • changeAttribute
  • addXML
  • removeXML
  • changeXML
  • computeChange
  • setValue
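
The XML-based changes above (changeAttribute, addXML, removeXML, changeXML) amount to XPath-targeted edits of the model document. A minimal sketch of two of them, using Python's standard library and a simplified, un-namespaced stand-in for an SBML model (all names here are illustrative, not the test suite's actual code):

```python
import xml.etree.ElementTree as ET

# Toy model XML: a simplified, un-namespaced stand-in for an SBML document
MODEL_XML = """
<model>
  <listOfSpecies>
    <species id="A" initialConcentration="10"/>
    <species id="B" initialConcentration="5"/>
  </listOfSpecies>
</model>
"""

def change_attribute(root, target_xpath, attr, new_value):
    """Mimic a SED-ML changeAttribute: set an attribute on matched elements."""
    for elem in root.findall(target_xpath):
        elem.set(attr, new_value)

def remove_xml(root, parent_xpath, child_tag, child_id):
    """Mimic a SED-ML removeXML: delete matching child elements."""
    for parent in root.findall(parent_xpath):
        for child in list(parent):
            if child.tag == child_tag and child.get('id') == child_id:
                parent.remove(child)

root = ET.fromstring(MODEL_XML)

# Set species A's initial concentration to 0, and delete species B
change_attribute(root, ".//species[@id='A']", "initialConcentration", "0")
remove_xml(root, ".//listOfSpecies", "species", "B")

print(root.find(".//species[@id='A']").get("initialConcentration"))  # 0
print(len(root.findall(".//species")))  # 1
```

Real SED-ML targets use namespaced XPath expressions against the full SBML schema; this sketch only illustrates the shape of the operations.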

Since the test suite can't directly observe model specifications, the test suite has to verify this by checking that simulation tools produce one or more data sets that demonstrate the impact of the model change. For example, a change could set an initial value of a model variable to zero, and then the test suite could verify that the first value of a data set for a data generator for the variable is zero, indicating that the change was correctly applied. This is challenging to do in a model language and algorithm agnostic way as this test suite aims to do.
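
The check described above could look roughly like the following sketch. The function name and tolerance are assumptions for illustration, not the suite's actual API; the key idea is that the test inspects the simulator's reported data set rather than the (unobservable) modified model file.

```python
TOLERANCE = 1e-9  # assumed numerical tolerance for comparing floats

def check_change_applied(data_set, expected_initial=0.0, tol=TOLERANCE):
    """Verify a model change took effect by inspecting simulator output:
    the first value of the data set for the changed variable must equal
    the value the change was supposed to set."""
    if not data_set:
        raise ValueError("simulator produced no output for the variable")
    return abs(data_set[0] - expected_initial) <= tol

# A simulator that applied the change reports a trajectory starting at 0
assert check_change_applied([0.0, 0.4, 0.9])

# One that ignored the change starts from the model's original initial value
assert not check_change_applied([10.0, 9.1, 8.5])
```

Because the check only compares output values, it stays agnostic to the model language and simulation algorithm, as the issue requires.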


jonrkarr commented Feb 9, 2021

Done, except for repeated tasks, which are covered in #6.

@jonrkarr jonrkarr closed this as completed Feb 9, 2021
@jonrkarr jonrkarr self-assigned this Feb 10, 2021