
EMA FP32 assert classification bug fix #9016

Merged
merged 3 commits into master from update/fix on Aug 18, 2022

Conversation

glenn-jocher (Member) commented Aug 18, 2022

Resolves a v6.2 classification training bug caused by FP16 EMA updates (bad!)

πŸ› οΈ PR Summary

Made with ❀️ by Ultralytics Actions

🌟 Summary

Enhancements to model compatibility, export options, and variable handling in YOLOv5.

πŸ“Š Key Changes

  • Changed class names from a list to a dictionary across multiple scripts for consistency.
  • Removed 'onnx' from the default export formats in export.py.
  • Removed an unused import (Conv) in experimental.py for cleaner code.
  • Added model compatibility updates in the attempt_load function, including stride adjustments and conversion of names to dictionaries.
  • Revised the ModelEMA class to omit the unnecessary FP16 conversion.
  • Ensured that all EMA updates occur in FP32 to maintain consistency within the model.
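
The list-to-dictionary change for class names can be sketched as follows (a hypothetical helper for illustration, not the PR's exact code):

```python
def names_to_dict(names):
    """Convert a class-names list like ['person', 'car'] into an
    index-keyed dict {0: 'person', 1: 'car'}; pass dicts through unchanged."""
    return names if isinstance(names, dict) else dict(enumerate(names))

print(names_to_dict(["person", "car"]))  # → {0: 'person', 1: 'car'}
```

Keying names by integer class index keeps lookups uniform whether a checkpoint stored its names as a legacy list or the newer dict form.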

🎯 Purpose & Impact

  • πŸ“ˆ Improved compatibility and maintenance by standardizing class names to dictionaries, making the codebase more uniform and easier to manage.
  • πŸ”§ Adjusted the default export formats, optimizing the focus for the most commonly used export types, potentially simplifying the export process for users.
  • πŸ› οΈ Code cleanup and compatibility adjustments aid in a smoother experience when loading models and contribute to a reduction in possible future issues due to data type discrepancies.
  • πŸ€– The general codebase cleanup and performance optimization may offer a more streamlined development experience while maintaining backward compatibility with existing models.
  • πŸ’Ύ Consistency in using FP32 for EMA updates improves the reliability and stability of the training process.

@glenn-jocher glenn-jocher self-assigned this Aug 18, 2022
@glenn-jocher glenn-jocher changed the title Return EMA float on classification val EMA FP32 assert Aug 18, 2022
@glenn-jocher glenn-jocher changed the title EMA FP32 assert EMA FP32 assert classification bug fix Aug 18, 2022
@glenn-jocher glenn-jocher merged commit 20049be into master Aug 18, 2022
@glenn-jocher glenn-jocher deleted the update/fix branch August 18, 2022 12:06
ctjanuhowski pushed a commit to ctjanuhowski/yolov5 that referenced this pull request Sep 8, 2022
* Return EMA float on classification val

* verbose val fix

* EMA check