
chore(api, performance-metrics): clean up performance-metrics tracking #15289

Merged

Conversation

DerekMaggio
Member

@DerekMaggio DerekMaggio commented May 30, 2024

Overview

I originally had all of this work in another PR, then got pulled to other projects. Two months later, that PR was ~250 commits behind edge and a nightmare to get back in sync, so I created a new branch off of edge and cherry-picked the pertinent changes.

This PR covers:

  • Removing the tracking of analysis triggered from the CLI
  • Removing hacky unit tests
  • Fixing the RobotContextTracker.track function
  • Cleaning up some of the typing

Test Plan

  • Unit tests in performance metrics cover the RobotContextTracker.track changes
  • Removed the unit tests for track_analysis because they required a ton of internal property overrides to work. This will be easier to test directly on the robot when it is ready.

Changelog

  • Removed the track_analysis decorator on CLI analyze. That analysis does not run on the robot, so we don't care about tracking it
  • Corrected some typing around the wrapped function calls.
    • Renamed type aliases and type vars to provide more context
    • Narrowed typing of wrapped function
  • Switched some sync calls in the performance_metrics tests to ensure their tracking was not missed
  • Added a couple of tests around exception handling when tracking is disabled. I broke this behavior during development of this PR and it took a while to track down, so tests seemed worthwhile
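The exception-handling concern in the last bullet can be sketched with a self-contained example: when tracking is disabled, the decorator must still let the wrapped function's exceptions propagate. This is an illustration of the pattern only, not the PR's actual RobotContextTracker API.

```python
# Sketch: a tracking decorator whose disabled path must not swallow exceptions.
# The names here (track, should_track) are illustrative, not the real API.
import functools
from typing import Any, Callable, TypeVar

F = TypeVar("F", bound=Callable[..., Any])


def track(should_track: bool) -> Callable[[F], F]:
    def decorator(func: F) -> F:
        if not should_track:
            # No-op path: return the function untouched so exceptions
            # propagate exactly as they would without the decorator.
            return func

        @functools.wraps(func)
        def wrapper(*args: Any, **kwargs: Any) -> Any:
            # ...record timing/state here...
            return func(*args, **kwargs)

        return wrapper  # type: ignore[return-value]

    return decorator


@track(should_track=False)
def boom() -> None:
    raise ValueError("must not be swallowed")


try:
    boom()
except ValueError as e:
    print(f"propagated: {e}")
```

A regression here is easy to miss because the happy path still works; a test only needs to assert that the exception escapes the wrapper.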

Review requests

  • Is it correct to have all my TypeAliases in shared-data, given that they are used in both the api project and the performance-metrics project?

Risk assessment

Low


codecov bot commented May 30, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 63.75%. Comparing base (7de5e83) to head (06fec00).
Report is 39 commits behind head on edge.

Current head 06fec00 differs from pull request most recent head 23cfdae

Please upload reports for the commit 23cfdae to get more accurate results.

Additional details and impacted files


@@            Coverage Diff             @@
##             edge   #15289      +/-   ##
==========================================
+ Coverage   63.20%   63.75%   +0.54%     
==========================================
  Files         287      300      +13     
  Lines       14891    15491     +600     
==========================================
+ Hits         9412     9876     +464     
- Misses       5479     5615     +136     
Flag         Coverage Δ
shared-data  76.33% <100.00%> (+0.26%) ⬆️

Flags with carried forward coverage won't be shown.

Files Coverage Δ
...hon/opentrons_shared_data/performance/dev_types.py 77.50% <100.00%> (+1.18%) ⬆️

... and 15 files with indirect coverage changes

@DerekMaggio DerekMaggio force-pushed the performance-metrics-nightmare-fuel-merge-conflict-resolution branch from d64072a to b73010f on June 3, 2024 13:32
@DerekMaggio DerekMaggio force-pushed the performance-metrics-nightmare-fuel-merge-conflict-resolution branch from 40d1d64 to 70017cd on June 3, 2024 13:42
@DerekMaggio DerekMaggio changed the title from "Performance metrics nightmare fuel merge conflict resolution" to "feat(robot-server, api, performance-metrics): get performance metrics to work on robot" on Jun 3, 2024
@DerekMaggio DerekMaggio changed the title from "feat(robot-server, api, performance-metrics): get performance metrics to work on robot" to "chore(api, performance-metrics): clean up performance-metrics tracking" on Jun 3, 2024
@DerekMaggio DerekMaggio self-assigned this Jun 3, 2024
@DerekMaggio DerekMaggio requested a review from a team June 3, 2024 17:26
@DerekMaggio DerekMaggio marked this pull request as ready for review June 3, 2024 17:27
@DerekMaggio DerekMaggio requested a review from a team as a code owner June 3, 2024 17:27
Member

@sfoster1 sfoster1 left a comment


Almost there, I think we can slightly improve the analysis tracker and then we're good!

api/src/opentrons/util/performance_helpers.py (outdated, resolved)
@DerekMaggio DerekMaggio requested a review from sfoster1 June 5, 2024 17:08
@SyntaxColoring
Contributor

Is it correct to have all my TypeAliases in shared data since they are used in the api project and the performance-metrics project.

Sorry if this was answered in prior discussions, but does any of this need to be in shared-data?

Since api has an optional dependency on performance-metrics, could api do something like:

import typing
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    import performance_metrics

def _handle_package_import() -> "typing.Type[performance_metrics.SupportsTracking]":
    ...

package_to_use: "typing.Type[performance_metrics.SupportsTracking]" = _handle_package_import()

When I think of what belongs in shared-data, I think of things that are used across languages and all up and down the stack, like the JSON protocol schemas.

@DerekMaggio
Member Author

@SyntaxColoring, your explanation of what goes in shared-data clarifies a lot.
I will reevaluate with that in mind and update.

Contributor

@SyntaxColoring SyntaxColoring left a comment


LGTM if Seth's comments are addressed.

Here are some typing-related comments. I don't think any of these, nor the shared-data organizational thing above, need to be done in this PR.

Thank you!

Comment on lines +8 to +14
UnderlyingFunctionParameters = typing.ParamSpec("UnderlyingFunctionParameters")
UnderlyingFunctionReturn = typing.Any
UnderlyingFunction = typing.TypeVar(
    "UnderlyingFunction",
    typing.Callable[..., UnderlyingFunctionReturn],
    typing.Callable[..., typing.Awaitable[UnderlyingFunctionReturn]],
)
Contributor


This is Python's fault for being confusing, but for a few layered reasons, I don't think this is working as intended.

The red flag is that in robot_context_tracker.py, the type-checker is seeing the func_to_track() calls as returning Any. You're currently solving that with a pair of casts, but if everything is working properly, that shouldn't be needed.

There are probably a few ways to fix this, but as a first step, I would do this:

UnderlyingFunctionParameters = typing.ParamSpec("UnderlyingFunctionParameters")
UnderlyingFunctionReturn = typing.TypeVar("UnderlyingFunctionReturn")
UnderlyingFunction = typing.Callable[UnderlyingFunctionParameters, UnderlyingFunctionReturn]
  1. UnderlyingFunctionReturn goes from Any to TypeVar to retain the return type of the wrapped function. This is what fixes the cast problem.
  2. UnderlyingFunction becomes a plain Callable alias instead of a TypeVar because its arg type is a TypeVar now, and I guess you can't have nested TypeVars. (This might be Higher-Kinded TypeVars python/typing#548, but that discussion is mostly over my head.)

Then, in the places using these type aliases, I would do this:

def track(
    self,
    state: RobotContextState,
) -> typing.Callable[
    [UnderlyingFunction[UnderlyingFunctionParameters, UnderlyingFunctionReturn]],
    UnderlyingFunction[UnderlyingFunctionParameters, UnderlyingFunctionReturn]
]:
  1. Any mention of UnderlyingFunction becomes UnderlyingFunction[UnderlyingFunctionParameters, UnderlyingFunctionReturn] because it's a Callable alias now and so its type parameters need to be specified.

Contributor


Separately: I would consider not centralizing these aliases. Instead, maybe mark them as private (like _UnderlyingFunctionReturn) and redeclare them wherever they're needed.

Importing and sharing TypeVars can confuse people because, although they're declared in one central place, the type-checker evaluates them in the local context of wherever they're used. Things can get especially confusing when there's subclassing involved, like SupportsTracking in this PR. The base class's type parameter is not necessarily the same as the subclass's type parameter, even though they're the same TypeVar. I think there's some broken typing in opentrons.protocol_api.core because of stuff like this.

This is, again, Python's fault for being confusing. No other language works like this as far as I know. Thankfully, Python 3.12 fixes it with PEP 695's inline type-parameter syntax, which replaces standalone TypeVar declarations.

Member

@sfoster1 sfoster1 left a comment


Looks good to me although I'm a little surprised to see that annotation on _do_analyze disappear

@@ -198,7 +197,6 @@ def _get_return_code(analysis: RunResult) -> int:
     return 0


-@track_analysis
 async def _do_analyze(protocol_source: ProtocolSource) -> RunResult:
Member


wait wasn't this the point of the exercise? or are you doing this in a followup now

Member Author


Sorry, in a follow-up. Since I am not actually installing this on a robot yet, I am deferring it until I can verify whether it actually works.

@DerekMaggio
Member Author

Added EXEC-508 and EXEC-509 to address the typing feedback in a follow-up PR

@DerekMaggio DerekMaggio merged commit dfbde9f into edge Jun 5, 2024
41 checks passed