
chore(api): handle performance-metrics package not existing #14922

Merged

Conversation


@DerekMaggio DerekMaggio commented Apr 16, 2024

Overview

Closes https://opentrons.atlassian.net/browse/EXEC-407

We do not want performance-metrics to be a required dependency for the publicly available opentrons package on PyPI. But we want to utilize performance-metrics when running the opentrons package on the robot itself.

The issue is that, because the performance-metrics package uses decorators to track running code, those decorators must always exist. This PR handles the case where the performance-metrics package is not installed and injects stubs where necessary.

Changelog

  • Created the SupportsTracking mypy Protocol to define the interface that both RobotContextTracker and the stubbed class, StubbedTracker, implement
  • Added performance-metrics as a dev dependency so tests that use performance-metrics can still run
  • Created performance_helpers.py
    • Contains the StubbedTracker definition
    • Falls back to StubbedTracker if the performance-metrics library fails to load
    • Provides the _get_robot_context_tracker private singleton function for eventual public-facing functions to use
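The changelog items above might fit together roughly as follows. This is a minimal sketch, not the PR's actual code: the constructor signature (`storage_location`, `should_track`) and the `track(state)` argument are assumptions for illustration.

```python
from pathlib import Path
from typing import Callable, Protocol, TypeVar

F = TypeVar("F", bound=Callable[..., object])


class SupportsTracking(Protocol):
    """Interface that both RobotContextTracker and StubbedTracker implement."""

    def __init__(self, storage_location: Path, should_track: bool) -> None:
        ...

    def track(self, state: str) -> Callable[[F], F]:
        ...


class StubbedTracker:
    """No-op tracker injected when performance-metrics is not installed."""

    def __init__(self, storage_location: Path, should_track: bool) -> None:
        # Accept the same constructor arguments as the real tracker, but ignore them.
        pass

    def track(self, state: str) -> Callable[[F], F]:
        def inner(func: F) -> F:
            # Return the decorated function unchanged: tracking is a no-op here.
            return func

        return inner


# The Protocol lets either implementation satisfy the same annotation.
_tracker: SupportsTracking = StubbedTracker(Path("stub-data"), True)
```

Because `SupportsTracking` is a structural Protocol, the real `RobotContextTracker` needs no import-time relationship to the stub; any class with matching methods satisfies it.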

Test Plan

  • Test that the stubbed track decorator returns the decorated function unchanged
  • Validate the singleton logic of _get_robot_context_tracker

Risk assessment

Low; this code is not yet used anywhere.

DerekMaggio and others added 14 commits April 16, 2024 10:06
Closes EXEC-186

If the gantry is not homed and a power cycle occurs, the drop-tip wizard cannot proceed with its flows. An error is raised during the flow, and ultimately a home command is dispatched that can have the side effect of aspirating liquid into the pipette, damaging it. We special-case home errors to prevent this.

The primary functional difference is that now, any time an error occurs, exiting the wizard via the header does not home the gantry. Homing as a result of an error should only occur when the "Confirm removal and home" button is presented and clicked.
@DerekMaggio DerekMaggio changed the title Exec 407 handle performance metrics package not existing chore: handle performance-metrics package not existing Apr 16, 2024
@DerekMaggio DerekMaggio marked this pull request as ready for review April 16, 2024 20:36
@DerekMaggio DerekMaggio requested review from a team as code owners April 16, 2024 20:36
@DerekMaggio DerekMaggio requested a review from a team April 16, 2024 20:36
@DerekMaggio DerekMaggio changed the title chore: handle performance-metrics package not existing chore(api): handle performance-metrics package not existing Apr 17, 2024
Comment on lines +17 to +22
```python
try:
    from performance_metrics import RobotContextTracker

    return RobotContextTracker
except ImportError:
    return StubbedTracker
```
Member Author:

Should I check opentrons.config.IS_ROBOT here?

```python
if IS_ROBOT:
    try:
        from performance_metrics import RobotContextTracker
        return RobotContextTracker
    except ImportError:
        ...  # <raise or log here>
else:
    return StubbedTracker
```

Member:

nah I think this is totally fine

# Overview

Closes https://opentrons.atlassian.net/browse/EXEC-399

Evaluate robot context as `In Analysis` when `_analyze` is running in
`opentrons.cli.analyze`

# Changelog

- Add `performance_metrics_data` directory
- Inside `performance_helpers.py`:
  - Add logic to determine whether tracking should be enabled (based on a feature flag)
  - Add `track_analysis` factory decorator
- Wrap `opentrons.cli.analyze._analyze` with the `track_analysis` decorator
- Rename the attribute on `SupportsTracking` to `storage_location`. Whether the passed `Path` is a directory that needs a file name appended or a direct file path is an implementation detail.
- Have `RobotContextTracker` define its own file name

# Test Plan

- Verify storage with tests in test_cli.py. A follow-up PR will implement storing to a file on the robot; at that point, testing will be done on the robot.
- Verified that the `performance_metrics_data` directory is getting created

# Review requests

None

# Risk assessment

Low; although this wraps production code, it is gated by a feature flag that defaults to false.
@DerekMaggio DerekMaggio merged commit 0940e7c into edge Apr 17, 2024
53 checks passed
Carlos-fernandez pushed a commit that referenced this pull request May 20, 2024
Co-authored-by: Jethary Rader <[email protected]>
Co-authored-by: Jamey Huffnagle <[email protected]>