
[video_player_web] Add an alternate mode that renders images directly into the canvas #6286

Open

wants to merge 14 commits into main

Conversation


@jezell jezell commented Mar 8, 2024

This allows the web video player to avoid issues caused by having too many active overlay groups. The mode is enabled as follows:

VideoPlayerPlugin.renderVideoAsTexture = true;

When enabled, the video player uses createImageFromImageBitmap to grab frames as they become available and renders them with RawImage instead of a platform view.
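
For context, a minimal sketch of the capture path described above, mirroring the calls quoted later in the review (promiseToFuture, createImageBitmap, ui_web.createImageFromImageBitmap); the helper name captureVideoFrame and the standalone structure are illustrative rather than the PR's actual code:

```dart
import 'dart:js_util' show promiseToFuture;
import 'dart:ui' as ui;
import 'dart:ui_web' as ui_web;

import 'package:flutter/widgets.dart';
import 'package:web/web.dart' as web;

/// Grabs the video's current frame as a [ui.Image] that Flutter can paint
/// directly, without hosting the <video> element in a platform view.
Future<ui.Image> captureVideoFrame(web.HTMLVideoElement video) async {
  // Ask the browser to decode the currently displayed frame into a bitmap.
  final web.ImageBitmap bitmap = await promiseToFuture<web.ImageBitmap>(
    web.window.createImageBitmap(video),
  );
  // Hand the bitmap to the Flutter web engine, which wraps it as a ui.Image.
  return ui_web.createImageFromImageBitmap(bitmap);
}

/// The captured frame can then be painted with a plain [RawImage] widget.
Widget buildFrame(ui.Image frame) => RawImage(image: frame, fit: BoxFit.cover);
```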

Fixes:

flutter/flutter#144591

If you had to change anything in the flutter/tests repo, include a link to the migration guide as per the breaking change policy.

Pre-launch Checklist

  • [x] I read the Contributor Guide and followed the process outlined there for submitting PRs.
  • [x] I read the Tree Hygiene wiki page, which explains my responsibilities.
  • [x] I read and followed the relevant style guides and ran the auto-formatter. (Unlike the flutter/flutter repo, the flutter/packages repo does use dart format.)
  • [x] I signed the CLA.
  • [x] The title of the PR starts with the name of the package surrounded by square brackets, e.g. [shared_preferences].
  • [x] I linked to at least one issue that this PR fixes in the description above.
  • [x] I updated pubspec.yaml with an appropriate new version according to the pub versioning philosophy, or this PR is exempt from version changes.
  • [x] I updated CHANGELOG.md to add a description of the change, following repository CHANGELOG style.
  • [x] I updated/added relevant documentation (doc comments with ///).
  • [ ] I added new tests to check the change I am making, or this PR is test-exempt.
  • [x] All existing and new tests are passing.

If you need help, consider asking for advice on the #hackers-new channel on Discord.

@jezell jezell requested a review from ditman as a code owner March 8, 2024 03:37
@flutter-dashboard

It looks like this pull request may not have tests. Please make sure to add tests before merging. If you need an exemption to this rule, contact "@test-exemption-reviewer" in the #hackers channel in Chat (don't just cc them here, they won't see it! Use Discord!).

If you are not sure if you need tests, consider this rule of thumb: the purpose of a test is to make sure someone doesn't accidentally revert the fix. Ask yourself, is there anything in your PR that you feel it is important we not accidentally revert back to how it was before your fix?

Reviewers: Read the Tree Hygiene page and make sure this patch meets those guidelines before LGTMing.

@ditman
Member

ditman commented Mar 8, 2024

Do you need both rendering modes at the same time, or do you always use the image-to-canvas mode? If it's only the canvas mode: should this be a separate variant of video_player_web that can be set as a user dependency, similar to video_player_web_hls?

(Also, qq: does this play sound?)

@jezell
Author

jezell commented Mar 8, 2024

@ditman this does play sound.

I think this way of rendering should actually be the default, and the platform view style should eventually be deprecated and removed: it totally crashes the renderer if you have more than 8 videos, and it doesn't work with any of Flutter's built-in features like shaders. I only added a configuration option to switch the behavior because I thought people might be wary of the change until it has seen some real-world use.

If people really want a native video element, they can still get one with HtmlElementView. The web video player is unusable in its current state for a lot of people like us.

@jezell
Author

jezell commented Mar 8, 2024

@ditman I believe these rendering issues are unique to the web player, as the other platforms are already doing something like this to render the video into textures on their respective platforms, so making this the default way that videos work would bring the platforms a bit more in line with each other in terms of behavior. I don't know of a good reason to keep the old behavior around forever, or as the default.

It might actually be better if the setting defaulted to true, or if the old behavior were put into a legacy player package. That would be quite a bit of code duplication just for a toggle on how the rendering happens, though; at the end of the day all paths go through a video element. Maybe the renderer widget itself could be abstracted so packages could choose a rendering strategy instead of having a boolean toggle? That might be a little cleaner.

@jezell
Author

jezell commented Mar 8, 2024

@ditman I refactored a bit to allow the rendering logic to be overridden without swapping out the whole package. This would allow the newer or older portions to be moved into another package without a lot of code duplication. Is there a good reason to retain the old strategy as the default? For safety it's nice to allow configuration in case someone has a reason to keep it (e.g. you are really low on texture memory in Safari and can otherwise avoid having any overlay groups created), but I don't know of a good reason to prefer an HtmlElementView over rendering into the canvas, as overlay groups cause a lot of issues:

  1. Shaders don't work properly with them.
  2. Using more than 7 overlay groups in your app will cause your app to stop working. They could be consumed by a combination of videos, cameras, SVGs, or other things, so burning them by default is asking for trouble until the engine itself changes.

The reason the createImageFromImageBitmap API was introduced was specifically to allow changes like this.

@ditman
Member

ditman commented Mar 8, 2024

@jezell what browser are you targeting? I'm guessing Safari?

This can't be the default method because requestVideoFrameCallback doesn't seem to be supported at all on Firefox?

I guess it can be made to fall back to the HTML element in older browsers, but how can this be more performant than the native tag?

@jezell
Author

jezell commented Mar 9, 2024

@ditman performance is secondary to the fact that overlay groups cause rendering to entirely stop working.

Regardless, the performance issue isn't the native video element itself; it's that the native video element is inserted into an overlay group, and the overlay groups cause issues. That's why the engine has a hard stop at a really low number. Nothing should ever depend on overlay groups or an HtmlElementView in a framework intended to be used generically. If overlay groups didn't exist, the performance would be that of the native element, but overlay groups do exist.

Avoiding the problems with overlay groups while still using a native element is possible, but only if you avoid HtmlElementView entirely and just render the element on top of the canvas manually. IMO, that's probably how overlay groups should work in the first place (rendering always 100% below or 100% above the canvas), but that's not how they work. At the moment, there is a large cost to overlay groups, so anything that forces you to use them makes you pay a performance price.
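
To make the "render the element on top of the canvas manually" idea concrete, here is a rough sketch (not part of this PR; the names are illustrative, and the caller would still have to keep the element's rectangle in sync with the corresponding widget):

```dart
import 'package:web/web.dart' as web;

/// Illustrative only: bypass HtmlElementView (and therefore overlay groups)
/// by absolutely positioning a <video> element above the Flutter canvas.
web.HTMLVideoElement overlayVideo(
    String src, double left, double top, double width, double height) {
  final web.HTMLVideoElement video =
      web.document.createElement('video') as web.HTMLVideoElement;
  video
    ..src = src
    ..autoplay = true;
  video.style
    ..position = 'absolute'
    ..left = '${left}px'
    ..top = '${top}px'
    ..width = '${width}px'
    ..height = '${height}px'
    // Let pointer events fall through to the Flutter canvas underneath.
    ..pointerEvents = 'none';
  web.document.body!.appendChild(video);
  return video;
}
```

The obvious trade-off is that such an element always paints above everything Flutter draws, which is exactly the behavior being discussed here.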

@jezell
Author

jezell commented Mar 9, 2024

@ditman It wouldn't be a bad idea to make the rendering strategy that uses the native player avoid overlay groups entirely. Then you could pick between rendering into the canvas at a reasonable cost or rendering on top of the canvas at no cost. That would still be better than not being able to ever put a video on a page that happens to have 8 SVGs or 8 webrtc cameras on it.

@jezell
Author

jezell commented Mar 9, 2024

> This can't be the default method because requestVideoFrameCallback doesn't seem to be supported at all on Firefox?
>
> I guess it can be made to fall back to the HTML element in older browsers, but how can this be more performant than the native tag?

Most frameworks use requestAnimationFrame when requestVideoFrameCallback isn't available. It's a reasonable strategy, since RAF is used all over the place, including in Flutter itself as a fallback for various missing APIs.
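
For concreteness, a sketch of that fallback using dart:js_util-style interop; the function name and plumbing are illustrative and not necessarily how the PR wires it up:

```dart
import 'dart:js_util' as js_util;

import 'package:web/web.dart' as web;

/// Schedules [onFrame] for the next video frame, preferring
/// requestVideoFrameCallback and falling back to requestAnimationFrame on
/// browsers (e.g. Firefox) that don't expose it.
void scheduleNextFrame(web.HTMLVideoElement video, void Function() onFrame) {
  if (js_util.hasProperty(video, 'requestVideoFrameCallback')) {
    // Fires once per *presented* video frame, with per-frame metadata.
    js_util.callMethod(video, 'requestVideoFrameCallback', <Object>[
      js_util.allowInterop((Object now, Object metadata) => onFrame()),
    ]);
  } else {
    // RAF fires once per display refresh, which is a reasonable approximation
    // when no frame-accurate signal is available.
    js_util.callMethod(web.window, 'requestAnimationFrame', <Object>[
      js_util.allowInterop((num timestamp) => onFrame()),
    ]);
  }
}
```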

@ditman
Member

ditman commented Mar 9, 2024

@jezell I'm not opposed to merging this, because I like that it brings new features that were impossible before (filtering, probably access to the stream of frames), but it's a lot of "experimental" APIs (and more could be used, like the WebCodecs API, I guess) rather than <video>. Let me bring it up with the team; I know there were concerns about transferring images like this in Firefox for the core framework (I linked the performance issues in the related issue of this PR).

@jezell
Author

jezell commented Mar 9, 2024

@ditman added Firefox support via RAF; it works fine for me for playing full-screen videos in Firefox in places where the current player crashes due to overlay group limitations.

@jezell
Author

jezell commented Mar 9, 2024

@ditman Thanks. I understand that this is a bit cutting edge and scary, but it does work better than the current method, and I think keeping the fallback in the plugin for anyone who prefers the old behavior should help address the concern -- or, at the very least, the new logic could be included but gated behind an "experimental" opt-in to start.

@ditman
Member

ditman commented Mar 9, 2024

There are some dart analyze issues, but those should be straightforward to fix.

A couple of tests are complaining, though:

The following PlatformException was thrown running a test:
PlatformException(MEDIA_ERR_SRC_NOT_SUPPORTED,
MEDIA_ELEMENT_ERROR: Format error, The video has been found to be
unsuitable (missing or in a format not supported by your
browser)., null)

@ditman
Member

ditman commented Mar 9, 2024

Another issue with this new technique is that it exposes the video_player_web to the same CORS problems that the NetworkImage has in the canvaskit renderer: videos need to satisfy CORS restrictions, which is not necessarily true with the <video> tag.

See: flutter/flutter#107187 (comment)

I guess we must (need to?) keep the platform-view-based implementation for these cross-domain cases.

…g, capture bitmap at optimum width / height of video
@jezell
Author

jezell commented Mar 9, 2024

@ditman yeah, the crossOrigin attribute is set to anonymous right now to at least allow some things through, but seeing that NetworkImage has the same problem, at least it's consistent with other assets.

I don't think there's a good reason to forbid using the platform layer upon request, but if you opt into it, at least then you'd be choosing the risky behavior, and it could be properly documented to make sure you know what you are signing up for. I'd guess the average consumer doesn't know they have to be very careful about overlay groups until it's too late and their renderer is misbehaving. It could probably also be detected with an automatic fallback, or maybe defaulted based on the renderer you are using. If you're using the HTML renderer, maybe you'd prefer the video element because you don't have the WebGL limits to contend with; since that's the default in places like mobile Safari, which are constrained for texture space, it could actually be a good thing to default to the old way there.

@stuartmorgan
Contributor

> the other platforms are already doing something like this to render the video into textures on their respective platforms

We expect to introduce platform view versions of playback for all of them over time, because the texture approach doesn't work with DRM-protected videos.

> I don't know of a good reason to keep the old behavior around forever

I would expect the same issues with DRM on the web. This seems to be explicitly saying that accessing frames of DRM-protected videos can't be relied on.

@jezell
Author

jezell commented Mar 9, 2024

> We expect to introduce platform view versions of playback for all of them over time, because the texture approach doesn't work with DRM-protected videos.

Good point. Hopefully all platforms will support both rendering into a texture and using a platform view, depending on the needs of the app. At least for us it would not be optimal to get reduced functionality for all videos because some videos can be DRM'd. DRM'd content can likely get away with the limitations of platform views on the web, as apps like Netflix typically don't let you watch more than one video at a time. Other cases like WebRTC, collaboration apps, etc. often do end up with a lot of videos on the screen at once, none of them DRM'd, which is where we are currently running into trouble with platform views on the web.

Another possibility I was thinking could help here on web (at least for videos) would be to only use the platform view when the video is playing, and to swap to a captured RawImage when it is not playing. Then, as long as you don't need multiple videos playing at once, you are only burning a single layer. Wouldn't be a great solution for WebRTC scenarios, but those aren't covered by this plugin anyway.
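
A rough sketch of that swap, with hypothetical names (PausableVideoView); the viewType registration and the paused-frame capture are assumed to happen elsewhere, e.g. via ui_web.platformViewRegistry and a capture helper like the one sketched earlier in this thread:

```dart
import 'dart:ui' as ui;

import 'package:flutter/widgets.dart';

/// Illustrative only: host the native <video> in a platform view while it is
/// playing, and show a previously captured still frame while it is paused, so
/// that paused videos don't consume an overlay group.
class PausableVideoView extends StatelessWidget {
  const PausableVideoView({
    super.key,
    required this.isPlaying,
    required this.viewType, // registered with ui_web.platformViewRegistry
    required this.pausedFrame, // captured when playback stopped
  });

  final bool isPlaying;
  final String viewType;
  final ui.Image? pausedFrame;

  @override
  Widget build(BuildContext context) {
    return isPlaying
        // Playing: the real <video> element, which burns one overlay group.
        ? HtmlElementView(viewType: viewType)
        // Paused: a single captured frame, drawn into the Flutter canvas.
        : RawImage(image: pausedFrame, fit: BoxFit.cover);
  }
}
```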

@stuartmorgan
Contributor

> Hopefully all platforms will support both rendering into a texture and using a platform view, depending on the needs of the app.

That is my current expectation for the long term state, yes.

…dering mode at runtime, allow manual configuration of rendering strategy
@jezell
Author

jezell commented Mar 11, 2024

Updated PR based on feedback in this thread.

The PR now defaults to an auto render mode, which chooses a rendering strategy dynamically at runtime. A specific strategy can also be selected explicitly, so that people can choose platform views if they want things like DRM to work, or texture mode if they want shaders to work and no platform layers to be used. (A rough sketch of the selection logic follows the list below.)

In auto mode:

  • When Flutter is configured with the HTML renderer, auto mode will use platform views, as there are no shaders to take advantage of and exhausting platform views does not misbehave as badly.

  • When Flutter is using the CanvasKit renderer, auto mode will only use a platform view for videos that are playing. When videos are not playing, it will capture an image to a texture. This could offer a better default than the current player with minimal performance impact: it doesn't require the per-frame overhead of createImageBitmap and drastically reduces the number of platform layers being requested. This eliminates, for example, a current issue where simply putting multiple videos in a feed, or trying to preview videos in a grid, can exhaust the available overlay groups and cause rendering issues even when the videos aren't playing.
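
A rough approximation of that selection logic; the PR exposes a VideoRenderMode with an auto value (visible in the diff excerpts later in the thread), but the other enum values and the resolver below are illustrative:

```dart
import 'dart:js_util' as js_util;

import 'package:web/web.dart' as web;

/// The PR exposes an `auto` render mode; the other values here are
/// illustrative placeholders for the explicit strategies described above.
enum VideoRenderMode { auto, platformView, texture }

enum VideoRenderStrategy { platformView, texture, hybrid }

/// The CanvasKit renderer sets this global on `window`.
bool get _isCanvasKit => js_util.hasProperty(web.window, 'flutterCanvasKit');

VideoRenderStrategy resolveStrategy(VideoRenderMode mode) {
  switch (mode) {
    case VideoRenderMode.platformView:
      return VideoRenderStrategy.platformView;
    case VideoRenderMode.texture:
      return VideoRenderStrategy.texture;
    case VideoRenderMode.auto:
      // HTML renderer: platform views are comparatively cheap and there are
      // no shaders to lose, so keep the <video> element.
      // CanvasKit: platform view only while playing, texture while paused.
      return _isCanvasKit
          ? VideoRenderStrategy.hybrid
          : VideoRenderStrategy.platformView;
  }
}
```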

Member

@ditman ditman left a comment

@jezell we're discussing internally whether there's anything else that could be exposed through ui_web to make this more efficient and better integrated with the renderer mechanisms (for example, there's a new skwasm renderer coming that attempts to do off-thread rendering, and I'm not sure how this would play with it). Also, the way the renderer is determined, or the browser detection code, could be vended from the engine so everything is guaranteed to stay in sync with whatever it says.

One quick thing: could you split the _AdapterVideoPlayerRenderer into separate file(s), so we keep video_player_web.dart as "the place that has the platform channel overrides" and the new code that knows how to render the video Widget in different files?

Also, there's no need for classes to be "private" (prefixed by _) in this package, because people shouldn't be using it directly in their code.

}

bool get rendererCanvasKit {
return hasProperty(web.window, 'flutterCanvasKit');
Member

@ditman ditman Mar 11, 2024

(Should be provided by the engine, also take skwasm into consideration?)

Comment on lines +467 to +470
final web.ImageBitmap newSource =
await promiseToFuture<web.ImageBitmap>(
web.window.createImageBitmap(widget.element));
final ui.Image img = await ui_web.createImageFromImageBitmap(newSource);
Member

@ditman ditman Mar 11, 2024

Is this the fastest way?

Author

@ditman It's definitely not the fastest way the browser supports, but it is the only way that's currently exposed to get a frame into the renderer. I have some suggestions here in this thread:

flutter/flutter#144815 (comment)

Ideally, in browsers where a VideoFrame is available, it could be transferred to the worker for rendering. There are some constraints with VideoFrames, though: you can't hold onto them without blocking decoding, so it might require some additional plumbing to work properly. Maybe WebCodecs would be the best option at some point: w3c/webcodecs#43. But that also isn't doable without some work in the Flutter engine.

Comment on lines +537 to +540
final web.CanvasRenderingContext2D context =
canvas!.getContext('2d')! as web.CanvasRenderingContext2D;

context.drawImage(widget.element, 0, 0, canvas!.width, canvas!.height);
Member

@ditman ditman Mar 11, 2024

Is this the fastest way?

Author

@ditman worth some benchmarking, but other than the native element, this is probably the fastest path to get a video frame rendered in FF and Safari. It avoids the transfer to the worker, which is the slow path in FF / Safari, and avoids the GC churn caused by getting bitmaps every frame.

This mode exists only to let the player use a platform layer while playing and a texture while not playing. Ideally there wouldn't be a need for the canvas element, because we could swap from a texture straight to the video element when it starts playing. However, the challenge is that you can't move a video element in the DOM without resetting it, so if you try to move the video element into the platform layer when it is created, it will immediately stop playing.

I haven't experimented with invisible platform layers yet, but a possible alternative could be using an invisible platform layer for the video when it's not playing and then swapping it to a visible layer later. If the video element is not removed from the DOM during a swap from invisible to visible, there's a possibility it could work, but it depends on whether Chrome would allow us to get the frames while the video was in the invisible layer, and whether the element would get recreated by that swap.


VideoRenderMode mode = VideoRenderMode.auto;
}

class _AdapterVideoPlayerRenderer extends StatefulWidget {
Member

(Extract to separate files)

Member

@ditman ditman Mar 11, 2024

This whole file should be provided by the Flutter web engine, so the decision is always in sync?

Author

@ditman yeah, that file is in the browser engine, but can it be imported from here? It would definitely be ideal not to have to implement it twice.

Member

Nope, it can't be imported right now; it would need to be re-exported from the ui_web package to work.

@stuartmorgan
Contributor

From triage: @ditman What's the status of this PR? Is it blocked by potential SDK changes?

@stuartmorgan
Contributor

From triage: @ditman Ping on the question above?

@ditman
Member

ditman commented Jun 7, 2024

Sorry I missed the earlier message!

There are at least two relevant changes to the SDK:

The priority right now is the work to move to canvaskit (js)/skwasm (wasm). Platform views are staying, so the current implementation in the engine stays. I'm still unsure how the image copying would play with the rest of the framework (but there's currently a pattern where people do custom rendering to bytes in Flutter, and I guess that would need to continue working).

I think this should land as an opt-in feature for video_player_web; I'll give it some testing after I wrap up some unrelated stuff.

@jezell
Author

jezell commented Jun 19, 2024

@ditman React Native Skia is using MakeLazyImageFromTextureSource for a faster path here. I really think that API should land in web_ui to allow this to be improved by binding the WebGL texture straight to the video element.

@kevmoo
Contributor

kevmoo commented Jul 26, 2024

Are we blocked here? Or still in discussion...

@jezell
Author

jezell commented Jul 26, 2024

@kevmoo The fix requires some changes that are only in the main branch; how does the plugin release cycle work?

I had pointed out the potential problem here:

flutter/flutter#150592

I think video_player_web is kind of a bad place to put the lower-level video rendering logic, because a lot of places need the same logic and video_player_web is way higher level, so it can't be used by camera_web or livekit_client. I'm not sure what approach the team wants to go with here. It's easy enough to hack it into all these places manually, but it feels wrong having to duplicate the code and then keep updating it as the web engine gets better.

@stuartmorgan
Contributor

> The fix requires some changes that are only in the main branch; how does the plugin release cycle work?

Please see https://github.com/flutter/flutter/blob/master/docs/ecosystem/contributing/README.md#supported-flutter-versions for this repository's support model.
