
Pretty much becomes unusable
If the video is encoded with a codec your hardware can't handle, decoding falls back to the CPU. AV1 on the CPU can slow everything to a crawl. You'd think the browser would be smart about stream selection, though.
That's intentional on YouTube's end: they aim to serve the most bitrate-efficient codec wherever possible, even if it's a heavy burden on a client that lacks hardware acceleration. They'll only fall back to older codecs if the client is completely incapable of handling the modern ones. It's annoying, but at their scale it no doubt saves them a shitload of bandwidth.
Classic externality: at their scale, the power costs offloaded to their clients will also be enormous.
It also nudges users toward upgrading to newer hardware, since older devices are known to get slower as they age due to increasing software complexity and hardware mitigations (yes, those apply to phones too). Most users will just blame the device.
Not saying that's the cause of this slowdown, but since the MPEG-4 patents don't expire till 2027(?) (and one of those patents prevents hardware decode on Linux), we as a society have given Google every incentive to do this, and I welcome them making MPEG-4 irrelevant.
My laptop is held together by gaffer tape. There ain't no upgrade in sight.
I believe Youtube's player is driving codec selection, not the browser (i.e. the player requests a list of supported codecs and then picks the one most beneficial for Google, not the other way around).
That said, I've solved this problem for myself on macOS and Firefox by setting media.webrtc.codec.video.av1.enabled to false on about:config, as all other codecs used by Youtube are hardware accelerated on my Mac.
> I believe Youtube's player is driving codec selection, not the browser (i.e. the player requests a list of supported codecs and then picks the one most beneficial for Google, not the other way around).
The way the browser can still participate in choosing is by e.g. not listing AV1 as supported when there is no hardware decoder on the local system. Both Safari and Edge took (approximately) that style of approach, but it comes with the downside that if the server only has AV1 video then the client gets nothing.
Practically, that downside isn't a big deal until hardware support becomes widespread enough that sites start assuming the codec is always available and stop hosting alternatives.
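The failure mode described above can be sketched as a tiny negotiation function (hypothetical names; real players check support via `MediaSource.isTypeSupported()` rather than a plain list):

```javascript
// Sketch: the client filters the server's rendition list down to the codecs
// it advertises. A Safari/Edge-style client that withholds AV1 (no hardware
// decoder) gets nothing at all if the server only offers AV1.
function negotiateCodec(serverCodecs, clientAdvertised) {
  const usable = serverCodecs.filter(c => clientAdvertised.includes(c));
  return usable.length ? usable[0] : null; // null = playback fails entirely
}

const advertised = ['avc1', 'vp9']; // AV1 deliberately not advertised

console.log(negotiateCodec(['av01', 'vp9', 'avc1'], advertised)); // vp9
console.log(negotiateCodec(['av01'], advertised));                // null
```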
Yes, I think Safari even did this dynamically for a while, based on whether the Mac was plugged into external power, which I think is a nice compromise.
Apparently, there's even an API attribute that indicates whether a given codec is power efficient (https://developer.mozilla.org/en-US/docs/Web/API/MediaCapabi...), which Google must also be ignoring – not their problem, after all. (I wonder if anybody did the math of the opportunity cost of losing a few ad impressions due to the user's battery dying early vs. the incremental bandwidth cost?)
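For what it's worth, a player that did respect that attribute could pick a codec along these lines. The result shape below matches what `navigator.mediaCapabilities.decodingInfo()` resolves to, but the selection logic is a made-up sketch, not YouTube's actual behaviour:

```javascript
// Sketch: prefer the first codec whose decode is reported power-efficient
// (i.e. hardware-accelerated on most platforms), else any supported one.
function pickCodec(candidates) {
  const efficient = candidates.find(c => c.info.supported && c.info.powerEfficient);
  if (efficient) return efficient.codec;
  const supported = candidates.find(c => c.info.supported);
  return supported ? supported.codec : null;
}

// In a real page each `info` would come from (assumed configuration values):
//   await navigator.mediaCapabilities.decodingInfo({
//     type: 'media-source',
//     video: { contentType: 'video/mp4; codecs="av01.0.05M.08"',
//              width: 1920, height: 1080, bitrate: 2_000_000, framerate: 30 },
//   });

// Mocked results: AV1 decodes in software, H.264 in hardware.
const choice = pickCodec([
  { codec: 'av01.0.05M.08', info: { supported: true, powerEfficient: false } },
  { codec: 'avc1.640028',   info: { supported: true, powerEfficient: true } },
]);
console.log(choice); // avc1.640028
```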
I run an extension that automatically requests h.264 streams from YouTube even when AV1 is also available. Saves a lot of CPU, at the cost of some bandwidth.
It’s funny that this kind of browser extension has recurred over the years. Originally it was to replace the awful CPU hog flash player with an HTML5 h.264 player[1], then it was to sidestep YouTube’s insistence on VP* codecs, and now it’s to sidestep AV1.
[1]: https://web.archive.org/web/20110302145602/http://www.vertic...
In a beautiful world, there would be a link to each codec to let the user decide, or the browser itself would override the web site's preference and provide such links. It's sad that we keep having to resort to browser extensions to circumvent terrible web site and browser software.
I guess in a beautiful world we'd have hardware accelerated, patent free codecs.
Which is almost what AV1 is; native hardware decoding support is progressing, slowly but surely.
Assuming you have hardware support for VP9 as well, setting media.webrtc.codec.video.av1.enabled to false on about:config achieves the same outcome without an extension.
What's the name of the addon?
I recall h264ify but not sure about it
Yep, that's what I use. It took YouTube from using 100% CPU, to the point where my little XPS 13 was thermal throttling, down to 50% CPU running 1080p at 2-3x speed.
Out of curiosity, what CPU do you have in that XPS 13?
It's pretty old, from around 2016, so it's only got an i5-7200U (2 cores / 4 threads) @ 2.5 GHz.
Mostly fast enough for what I use it for (content consumption, web browsing, light gaming, and coding). It's mainly limited by its 8 GB of RAM, which isn't upgradable.
What I run is https://github.com/alextrv/enhanced-h264ify
I looked now and noticed that I actually reject VP8 and VP9 and accept AV1. I run Linux on a Ryzen 4750U, for the record. It did not have trouble chewing through VP8 / VP9 without skipping frames, but it ran unpleasantly hot.
Youtube has been turning on AV1 for 1080p content for me. My phone is the only device I own with an AV1 hardware decoder. The impact on battery life and CPU usage has been extreme.
You can tell Youtube to prefer AV1 only for low-quality videos (https://www.youtube.com/account_playback) or you can install an extension that will force h264 playback where supported.
Other playback features such as ambient mode and volume equalisation can also impact performance, though to a much smaller extent, and that partly depends on how fast your web browser is at executing JavaScript.
The substantial bandwidth savings are here to stay, though, so in time I think Youtube will move to AV1 more often.
> You can tell Youtube to prefer AV1 only for low-quality videos (https://www.youtube.com/account_playback)
What option are you referring to here? I don’t see anything that seems related to that.
What it looks like if it's available (AV1 settings section): https://i.imgur.com/rp6Cvkd.png
The wording is a gold-standard dark pattern.
"auto"
"prefer AV1"
"always prefer AV1"
so... which one turns av1 off?
None, but "prefer AV1 for SD" will prevent the stuttering due to the changes mentioned above.
If you don't ever want AV1 then it probably makes more sense to not have the browser advertise it as an option to sites. One can configure Firefox as such, I'm not sure about Chrome.
There is an "AV1 settings" below "Subtitles and Closed Captions" for me. If you don't see it perhaps it's not available for you yet.
For devices with the hardware decoder it'll be in the same ballpark as before. For devices without the hardware decoder the CPU will use significantly more power to decode the video than a hardware decoder would.
How much power does hardware accelerated AV1 consume vs. hardware accelerated H.264? I would think they'd be relatively similar.
I have noticed Google sites showing what looks like some form of memory leak. Typically it's only Microsoft websites doing this.
This memory leak happens to me very often on GitHub; not sure why, and not sure whether it's Microsoft's fault or Firefox's. I actually have no idea how to even start debugging this...
Firefox DevTools > Memory > enable "Record call stacks", reload the page, take a snapshot, wait until you notice memory growing, take another snapshot. Then compare the two: https://firefox-source-docs.mozilla.org/devtools-user/memory.... You can also try a performance recording.
But you probably won't be able to make much sense of the results without a lot of effort, because of all the minification/obfuscation.
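Before diving into snapshots, a cheap sanity check is to confirm the growth is sustained rather than GC noise: sample the heap size periodically and look for mostly-monotonic growth. This is a sketch with made-up thresholds; in Chromium-family browsers the samples could come from `performance.memory.usedJSHeapSize` (Firefox doesn't expose that, hence the snapshot workflow above):

```javascript
// Sketch: flag a likely leak when heap samples keep growing.
// `samples` are periodic heap-size readings in bytes; thresholds are arbitrary.
function looksLikeLeak(samples, minGrowth = 0.5) {
  if (samples.length < 2) return false;
  const first = samples[0];
  const last = samples[samples.length - 1];
  // Count how many consecutive samples increased.
  const increases = samples.slice(1).filter((s, i) => s > samples[i]).length;
  // Mostly-increasing samples plus >50% total growth suggests a leak
  // rather than normal allocate-then-collect churn.
  return increases / (samples.length - 1) > 0.8 && (last - first) / first > minGrowth;
}

console.log(looksLikeLeak([100, 120, 150, 190, 240])); // true  (steady growth)
console.log(looksLikeLeak([100, 140, 90, 130, 95]));   // false (GC sawtooth)
```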