
A lively discussion about open source, security, and who pays the bills has erupted on Twitter.
You may never have heard of FFmpeg, but you’ve used it. This open source program’s robust multimedia framework is used to process video and audio media files and streams across numerous platforms and devices. It provides tools and libraries for format conversion, aka transcoding, playback, editing, streaming, and post-production effects for both audio and video media.
FFmpeg’s libraries, such as libavcodec and libavformat, are essential for media players and software, including VLC, Kodi, Plex, Google Chrome, Firefox, and even YouTube’s video processing backend. It is also, like many other vital open source programs, terribly underfunded.
The debate on Twitter involves Dan Lorenc, CEO and co-founder of the software supply chain security company Chainguard; the FFmpeg project; Google; and security researchers, and centers on security disclosures and the responsibilities of large tech companies in open-source software.
The core of the discussion revolves around how vulnerabilities should be reported, who is responsible for fixing them, and the challenges that arise when AI is used to uncover a flood of potentially meaningless security issues. But at heart, it’s about money.
This discussion has been heating up for some time. In mid-October, FFmpeg tweeted that “security issues are taken extremely seriously in FFmpeg, but fixes are written by volunteers.” This point cannot be emphasized enough. As FFmpeg tweeted later, “FFmpeg is written almost exclusively by volunteers.”
Thus, as Mark Atwood, an open source policy expert, pointed out on Twitter, he had to keep telling Amazon not to do things that would mess up FFmpeg, repeatedly explaining to his bosses: “They are not a vendor, there is no NDA, we have no leverage, your VP has refused to help fund them, and they could kill three major product lines tomorrow with an email. So, stop, and listen to me … ”
The latest episode was sparked after a Google AI agent found an especially obscure bug in FFmpeg. How obscure? This “medium impact issue in ffmpeg,” which the FFmpeg developers did patch, is “an issue with decoding LucasArts Smush codec, specifically the first 10-20 frames of Rebel Assault 2, a game from 1995.”
Wow.
FFmpeg added, “FFmpeg aims to play every video file ever made.” That’s all well and good, but is that a valuable use of an assembly programmer’s time? Oh, right, you may not know: FFmpeg’s performance-critical heart is written in assembly language. As a former assembly language programmer, I can tell you it is not, in any way, shape, or form, easy to work with.
As FFmpeg put it, this is “CVE slop.”
Many in the FFmpeg community argue, with reason, that it is unreasonable for a trillion-dollar corporation like Google, which heavily relies on FFmpeg in its products, to shift the workload of fixing vulnerabilities to unpaid volunteers. They believe Google should either provide patches with vulnerability reports or directly support the project’s maintenance.
Earlier, FFmpeg pointed out that it’s far from the only open source project to face such issues.
Specifically, the project team mentions Nick Wellnhofer, the former maintainer of libxml2, a widely used open source software library for parsing Extensible Markup Language (XML). Wellnhofer recently resigned from maintaining libxml2 because he had to “spend several hours each week dealing with security issues reported by third parties. Most of these issues aren’t critical, but it’s still a lot of work.
“In the long term, this is unsustainable for an unpaid volunteer like me. … In the long run, putting such demands on OSS maintainers without compensating them is detrimental. … It’s even more unlikely with Google Project Zero, the best white-hat security researchers money can buy, breathing down the necks of volunteers.”
What made this a hot issue was that back in July, Google Project Zero (GPZ) announced a trial of its new Reporting Transparency policy. Under this policy, GPZ publicly announces that it has reported an issue to a specific project within a week of discovery, and the industry-standard 90-day disclosure clock then starts, regardless of whether a patch is available.
Many volunteer open source maintainers and developers feel it is massively unfair to put them under such pressure when Google has billions it could use to address the problem itself.
FFmpeg tweeted, “We take security very seriously, but at the same time, is it really fair that trillion-dollar corporations run AI to find security issues in people’s hobby code? Then expect volunteers to fix.”
True, Google does offer a Patch Rewards Program, but as a Twitter user using the handle Ignix The Salamander observed, “FFmpeg already mentioned the program is too limited for them, and they point out the three patches per month limit. Please don’t assume people complain just for the sake of complaining, there is a genuine conflict between corporate security & usage vs open source support IMHO.”
Lorenc argues back, in an e-mail to me, that “Creating and publishing software under an open source license is an act of contribution to the digital commons. Finding and publishing information about security issues in that software is also an act of contribution to the same commons.
“The position of the FFmpeg X account is that somehow disclosing vulnerabilities is a bad thing. Google provides more assistance to open source software projects than almost any other organization, and these debates are more likely to drive away potential sponsors than to attract them.”
The fundamental problem remains that the FFmpeg team lacks the financial and developer resources to address a flood of AI-created CVEs.
On the other hand, security experts are certainly right in thinking that FFmpeg is a critical part of the Internet’s technology framework and that security issues do need to be made public responsibly and addressed. After all, hackers can use AI to find vulnerabilities in the same way Google does with its AI bug finder, Big Sleep, and Google wants to identify potential security holes ahead of them.
The reality is, however, that without more support from the trillion-dollar companies that profit from open source, many woefully underfunded, volunteer-driven critical open-source projects will no longer be maintained at all.
For example, Wellnhofer has said he will stop maintaining libxml2 in December. Libxml2 is a critical library in all major web browsers, web servers, LibreOffice, and numerous Linux packages. We don’t need any more arguments; we need real support for critical open source programs before we have another major security breach.
From TFA this was telling:
Thus, as Mark Atwood, an open source policy expert, pointed out on Twitter, he had to keep telling Amazon not to do things that would mess up FFmpeg, repeatedly explaining to his bosses: “They are not a vendor, there is no NDA, we have no leverage, your VP has refused to help fund them, and they could kill three major product lines tomorrow with an email. So, stop, and listen to me … ”
I agree with the headline here. If Google can pay someone to find bugs, they can pay someone to fix them. How many times have managers said, "Don't come to me with problems, come with solutions"?
I've been a proponent of upstreaming fixes for open source software.
Why?
- It makes continued downstream consumption easier; you don't have to rely on fragile secret patches.
- It gives back to the projects that helped you to begin with; it's a simple form of paying it forward.
- It all around seems like the "ethical" and "correct" thing to do.
Unfortunately, in my experience, there's often a lot of barriers within companies to upstream. Reasons can be everything from compliance, processes, you name it... It's unfortunate.
I have a very distinct recollection of talks about hardware aspirations and upstreaming software fixes at a large company. The cultural response was jarring.
As yet, Valve is the only company I know of doing this, and it's paying dividends both for Linux and for Valve. In just five-ish years of Valve investing people and money into Linux, specifically Mesa and WINE, Linux has gone from being kind of shaky with Windows software to "I can throw a Windows program or game at it and it usually works". Imagine how much further the OSS ecosystem would be if open source hadn't existed, only FOSS, and companies were legally obligated to either publish source code or otherwise invest in the ecosystem.
WINE, CodeWeavers, Mesa, Red Hat, and plenty of others have been pumping money into the Linux graphics subsystems for a very long time. It's cool that Valve was able to use its considerable wealth to build a business off of it. But they came in at a pretty opportune time.
Windows support had gotten a boost from .NET going open source as well as other stuff MS began to relax about. It also helped that OpenGL was put to rest and there was a new graphics API that could reasonably emulate DirectX. I don't know much about the backstory of Mesa, but it's pretty cool tech that has been developing for a long time.
Valve is so successful because it is a private company, and the CEO is the CTO and he is essentially the corporate equivalent of a religious monk. How else can you get 20+ years to slowly build a software business?
As a side note, YC and tech startups themselves have become reality TV. Your goal should be Valve! You should be Gabe Newell! You don't need to be famous! Just build something valuable and be patient.
Ironically, Gabe is more famous than the rest of whoever you're talking about, not because he seeks fame but just because he generally does right by his customers and makes himself accessible. Telling gamers to email him with questions, concerns, comments, anything, and then actually responding. Even though he's apparently spending most of his time hanging out on yachts, people love him because he makes an effort to be tuned in to what his customers want. If you do that, you'll be famous in a better way than what you can get from reality TV.
Steam is the most dominant game tool on the planet and landed when there was not yet a market for it. Very few other projects will get to the level of success it has in any sector, anywhere.
GabeN was also an MS developer back in the day and likely would have been well off regardless, but he didn't need to play the YC A-B-let's-shoehorn-AI bullshit games that are 100% de rigueur for all startups in 2025.
From what I understand, Gabe/Valve almost went bust during Half Life's development. His gamble paid off when that turned into a runaway success, but he still could have lost it all when he bet again on HL2 and Steam; at the time it was extremely controversial to make those a package deal. If Half Life 2 had been not quite as good as it turned out to be, Valve could have ended up a one-hit-wonder studio that burned its goodwill with a sketchy DRM scheme on its second game.
> How else can you get 20+ years to slowly build a software business?
It used to be normal to build a business slowly over 20 years. Now everyone grabs for the venture capital, grows so fast they almost burst, and the venture capital inevitably ends in enshittification as companies are forced by shareholders to go against their business model and shit over their customers in order to generate exponential profit margins.
Credit to WINE and CrossOver for years and years of work as well.
WINE was a thing for years and generally worked okay for a lot of things.
I was playing Fallout 3 on WINE well before Valve got involved with minimal tweaks or DIY effort.
Proton with Steam works flawlessly for most things, including AAA games like RDR2, and it's great, but don't forget that WINE was out there making it work for a while.
> WINE was a thing for years and generally worked okay for a lot of things.
Yes, but Valve's involvement handled "the last 10% takes the 90% of the time" part of WINE, and that's a great impact, IMHO.
Trivia: I remember the WINE guys laughing at the WMF cursor exploit, then finding the exploit worked on WINE too and fixing it in a panic, and then bragging about bug-for-bug compatibility with Windows. It was hilarious.
Also, WINE allowed Linux systems to be carriers for Windows USB flash drive viruses without being affected by them for many years.
But this is a perfect example of one of those "90/10" esque ideas.
Even if Wine was 90% there technologically, the most important 90% is really that last 10.
I'm glad you threw in "I know of", because that part is true.
Feel free to read lore.kernel.org, and sort out where the people contributing many patches actually work.
Can't you just give the information you're hinting at? Other people than OP read this. You're basically telling me to go read thousands of messages on a mailing list just to resolve your rhetorical question. (Answer: Intel, Red Hat, Meta, Google, SUSE, Arm, and Oracle. There are much more efficient ways to find this.) Yes, they are the main kernel contributors and have been for many years. I'm still not sure I understand the comment.
I think GP answered as they did because there are so many examples it's hard to know where to start.
It's not entirely unlike if someone said "the only person I know writing books successfully is Brandon Sanderson." I do think "you ought to go check out your local book store" would be a valid response.
No, "just" having to debunk BS from a BSer who lazily threw out misinformation is not the way to go. It's the BSer that needs to do more work.
I'd say as a counterpoint that just because someone works at, say, Meta or Oracle, and also contributes to OSS projects, that doesn't equate to the company they work at funding upstream projects (at least not by itself).
I don't even have to link the xkcd comic because everyone already knows which one goes here.
Everyone I know who contributes to Linux upstream is paid to do it. It's not really worth the hassle to bother trying if you weren't getting paid. It's also very easy to find companies that will pay you to work on Linux and upstream.
At GOOG you’re required to, by policy.
Linus does...
Well, if they use their work email, doesn't that mean their kernel work is endorsed by their employer?
> Valve is the only company I know of [upstreaming fixes for open source software]
Sorry, that's ridiculous. Basically every major free software dependency of every major platform or application is maintained by people on the payroll of one or another tech giant (edit: or an entity like LF or Linaro funded by the giants, or in a smaller handful of cases a foundation like the FSF with reasonably deep industry funding). Some are better than others, sure. Most should probably be doing more. FFMpeg in particular is a project that hasn't had a lot of love from platform vendors (most of whom really don't care about software codecs or legacy formats anymore), and that's surely a sore point.
But to pretend that SteamOS is the only project working with upstreams is just laughable.
From my time working at a Fortune 100 company, if I ever mentioned pushing even small patches to libraries we effing used, I'd just be met with "try to focus on your tickets". Their OSS library and contribution policies were also super byzantine, seemingly requiring review of everything you'd release, but the few times I tried to do it the official way, I just never heard anything back from the black-hole mailing list you were supposed to contact.
Yes, I've also worked on OpenStack components at a university, and there I see Red Hat or IBM employees pushing up loads of changes. I don't know if I've ever seen a Walmart, UnitedHealth, Chase Bank, or Exxon Mobil (to pick some of the largest companies) email address push changes.
I don't know about ExxonMobil but Walmart, UnitedHealth Group, and JPMorganChase employees do actively contribute to open source projects. Maybe just not the ones you used. They have also published some of their own.
To steelman this: I've never worked at any of the companies you listed, but most likely Red Hat and IBM employees (is there still a difference?) are being paid specifically to work on OpenStack, as they get money from support contracts. When Walmart or Chase use OpenStack, there is a rather small team implementing OpenStack to be used as a platform, and they are then paying IBM/Red Hat for support. There probably isn't the expertise in the OpenStack team at Walmart to be adding patches. Companies differ in how much they spend on in-house technology, and in how much of it they then open source.
Those aren’t tech giants. They're just shit companies. I agree they greatly outnumber Big Tech, in employees if not talent.
Sure, but the parent’s comment perhaps hits on something: all the tech giants contribute more haphazardly, and for their own internal uses.
Valve does seem to be in a somewhat rare position, making a proper Linux distro work well with games. Google’s Chromebooks don’t contribute to the Linux ecosystem in the same holistic fashion, it seems.
They totally broke CSGO Legacy's code to push its sequel CS2 and won't accept fixes for it because it's 'not being supported'.
To be clear, both of those are closed source, proprietary games owned by Valve. It makes sense for them to want to consolidate their player base in one game.
> Unfortunately, in my experience, there's often a lot of barriers within companies to upstream. Reasons can be everything from compliance, processes, you name it... It's unfortunate.
I’ve been at several companies where upstreaming was encouraged for everything. The fewer internal forks we could maintain, the better.
What surprised me was how many obstacles we’d run into in some of the upstream projects. The amount of time we lost to trying to appease a few maintainers who were never happy with code unless they wrote it themselves was mind boggling.
For some projects you can basically forget about upstreaming anything other than an obvious and urgent bug fix because the barriers can be so high.
While there are sometimes maintainer prima-donna egos to contend with, there's also this:
Any patch sent in also needs to be maintained into the future, and most of the time it's the maintainers that need to do that, not the people contributing the patch. Therefore any feature-patches (as opposed to simple bugfixes) are quite often refused, even if they add useful functionality, because the maintainers conclude they will not be able to maintain the functionality into the future (because no one on the maintaining team has experience in a certain field, for example).
The quality bar for a 'drive by patch' which is contributed without the promise of future support is ridiculously high and it has to be. Other peoples' code is always harder to maintain than your own so it has to make up for that in quality.
Not any patch. Sometimes there are patches that are not explicitly fixing defects, but for example they surface a boolean setting that some upstream library started to expose. That setting is exactly like a dozen other settings already there. It's made using the same coding style and has all requisite things other settings have.
Will you be still making a fuss over it?
Maybe, it depends!
Maybe the developer intends to some day change the internal implementation, such that that particular boolean flag wouldn't make sense any more. Or they're considering taking out the option entirely, and thus simplifying the codebase by making it so it only works one way.
Maybe the developer just doesn't care about your use case. If I have a project that works fine for what I do with it, why should I also care about some other use case you have for my work? I'm not your employee. Your product doesn't put a roof over my head.
I don't want a job where I do free work, for a bunch of companies who all make money off my work. That's a bad deal. Its a bad deal even if my code gets better as a result. I have 150 projects on github. I don't want to be punished if any of those projects become popular.
We can't go around punishing projects like ffmpeg or ruby on rails for the crime of being useful.
> Maybe the developer just doesn't care about your use case. If I have a project that works fine for what I do with it, why should I also care about some other use case you have for my work?
Then say you don't expect contributions at all. That's fair game; I'm OK with it. I will then exercise the rights granted by your license in another way (most likely forking and making my own fix). My gripe is with projects that write "PRs welcome" prominently, and then put up enough red tape to signal that nah, not really.
I don't know.
The pattern I have seen is that if you want to contribute a fix into a project, you are expected to "engage with the community", wear their badge, invest into the whole thing. I don't want to be in your community, I want to fix a bug in a thing I'm using and go on with my life.
Given the usual dynamics of online communities, which somehow keep growing more prone to drama, toxicity, tribalism, and polarization, I increasingly want no part in them most of the time.
I think many projects would be better for having a lane for drive-by contributors who could work on fixing bugs that prevent their day-to-day from working without expectations of becoming full-time engaged. The project could set an expectation that "we will rewrite your patch as we see fit so we could integrate and maintain it, if we need/want to". I wouldn't care as long as the problem is taken care of in some way.
> The amount of time we lost to trying to appease a few maintainers who were never happy with code unless they wrote it themselves was mind boggling.
That brings us full circle to the topic, because one important thing that motivates people to accept other people's changes to their code is being paid.
If you work in FOSS side projects as well as a proprietary day job, you know it: you accept changes at work that you wouldn't in those side projects.
In the first place, you write the code in ways you wouldn't otherwise, following conventions you disagree with, in some crap language you wouldn't use voluntarily, and so it goes.
People working on their own FOSS project want everything their way, because that's one of the benefits of working on your own FOSS project.
I've literally had my employer's attorneys tell me I can't upstream patches because it would put my employer's name on the project, and they don't want the liability.
No, it didn't help giving them copies of licenses that have the usual liability clauses.
It seems a lot of corporate lawyers fundamentally misunderstand open source.
Corporate counsel will usually say no to anything unusual because there's no personal upside for them to say yes. If you escalate over their heads with a clear business case, then you can often get a senior executive to overrule the attorneys and maybe even change the company policy going forward. But this is a huge amount of extra unpaid work, and potentially politically risky if you don't have a solid management chain.
I don't know if it would work, but sometimes I consider a "moochers" rule wrt opensource code.
Like, here's the deal: The work is proper, legit opensource. You can use it for free, with no obligations.
But if your company makes a profit from it, you're expected to either donate money to the project or contribute code back in kind. (Eg security patches, bug fixes, or contribute your own opensource projects to the ecosystem, etc).
If you don't, all issues you raise and PRs you open get tagged with a special "moocher" status. They're automatically, by default, ignored or put in a low-priority bin. If your employees attend any events, or join a community Discord or anything like that, you get a "moocher" badge, so everyone can see that you're a parasite or you work for parasites. That's OK; open source licenses explicitly allow parasites. I'm sure you're a nice person. But we don't really welcome parasites in our social spaces, or allow parasites to take up extra time from the developers.
I've spent the last 32 years pushing every employer I've had to contribute back to open source. Because of the sector I work in, more often than not I'm constrained by incredibly tight NDAs.
I can usually stop short of providing code and file a bug that explains the replication case and how to fix it. I've taken patches and upstreamed them pseudonymously on my own time when the employer believed the GPL meant they couldn't own the modifications.
If after all that you still want to label me a moocher at cons, that's your choice.
You can wear your secret cape with pride, don't worry about the moocher badge.
Sounds like your employers attorneys need to be brought to heel by management. Like most things, this is a problem of management not understanding that details matter.
It goes even further sometimes, I've seen someone in the Go community slack announce they are going to dial back their activity because of Very Serious Clauses in their Apple contract.
That seems to imply that Apple employees are prohibited from being good internet citizens and e.g. helping people out with any kind of software issue. This presumably includes contributing to open source, although I'm sure they can get approval for that. But the fact they have to get approval for it is already a chilling effect.
Apple? Not interested in being a good internet citizen? Say it ain't so!
Why would they invest resources - scarce, expensive time of attorneys - in researching and solving this problem? The attorneys' job is to help the company profit, to maximize ROI for legal work. Where is the ROI here? And remember, just positive ROI is unacceptable; they want maximum ROI per hour worked. When the CEO asks them how this project maximized ROI, what do they say?
I believe in FOSS and can make an argument that lots of people on HN will accept, but many outside this context will not understand it or care.
If you fixed something in an open source library you use, and you don't push that upstream, you are bound to re-apply that patch with every library update you do. And today's compliance rules require you to essentially keep all libraries up to date all the time, or your CVE scanners will light up. So fixing this upstream in the original project has a measurable impact on your "time spent on compliance and updates KPI".
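That re-apply burden can be shown with a toy sketch in a throwaway git repo (the file name, patch contents, and "upstream" history are all made up here): once upstream rewrites the patched line, the stale local patch stops applying and has to be reworked by hand.

```shell
#!/bin/sh
# Toy sketch (hypothetical names): why a local, non-upstreamed patch is fragile.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q lib
cd lib
git config user.email you@example.com
git config user.name you
printf 'retries = 1\n' > config.c
git add .
git commit -qm 'upstream v1'
# The local fix, kept as an out-of-tree patch instead of being upstreamed:
printf 'retries = 3\n' > config.c
git diff > ../local.patch
git checkout -q -- config.c
# Upstream ships an update that touches the same line:
printf 'max_retries = 1\n' > config.c
git commit -qam 'upstream v2'
# Re-applying the old patch now fails and needs manual rework:
git apply ../local.patch 2>/dev/null || echo 'patch no longer applies'
```

Multiply that rework across dozens of vendored libraries and every update cycle, and upstreaming the fix once starts to look cheap.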
This touches on what I ended up telling them: maintaining a local patchset is expensive and fragile. Running customized versions of things is a self-inflicted compliance problem.
I still had to upstream anonymously, though.
That is a real benefit, I agree.
I upstreamed a 1-line fix, plus tests, at my previous company. I had to go through a multi-month process of red tape and legal reviews to make it happen. That was a discouraging experience to say the least.
My favorite is when while you were working through all that, the upstream decided they need a CLA. And then you have to go through another round of checking to see if your company thinks it's ok for you to agree to sign that for a 1 line change.
Certainly easier to give a good bug report and let upstream write the change, if they will.
In this scenario, does your employer have strong controls around whether you can write hobby code on your own time?
One of my past employers in the UK added a policy that all software an employee writes during their employment (e.g. on the weekend, on personal hardware) is owned by the company.
Several software engineers left, several didn't sign it.
Yes, the company was very toxic apart from that. And yeah, I should name and shame, but I won't be doxxing myself.
Many years ago an employer tried to do that, and everyone just refused to sign the new contracts. The whole thing sat in standoff limbo for months until the dotcom crash happened and the issue became moot when we were all made redundant.
Generally yes. Or yes, you could just do it yourself in your free time.
This is what I've done in those rare cases I've had to fix a bug in a tool or a library I've used professionally. I've also made sure to do that using online identities with no connection to my employer so that any small positive publicity for the contribution lands on my own CV instead of the bureaucratic company getting the bragging rights.
Even at places that are permissive about hobby code, a company ought to want to put its name on open source contributions. These build awareness in the programming community of the company and can possibly serve as a channel for recruitment leads. But the (usually false) perception of legal risk and misguided ideas about what constitutes productivity usually sink any attempts.
It is amazing how companies want this "marketing" but don't want to put the actual effort to make it possible.
A tech company I worked at once had a "sponsorship fund" to "sponsor causes" that employees wanted; it was actually good money, though a drop in the bucket for the company. A lot of employees voted for sponsoring Vue.js, which is what we used. Eventually, after months of silence, legal/finance decided it was too much work.
But hey, it wasn't an exception. The local animal shelter was the second most voted for, and legal/finance also couldn't figure out how to donate to it.
In the end the money went to nowhere.
The only "developer marketing" they were doing was sending me in my free time to do panels with other developers in local universities and conferences. Of course it was unpaid, but in return I used it to get another job.
I found a tiny bug in a library. A single, trivial, “the docs say this utility function does X, but it actually does Y”. I’m not even allowed to file a bug report. It took me some time to figure out how to even ask for permission, and they referred it to some committee where it’s in limbo.
My team lead once approved me upstreaming some changes to a open source project, so long as I did it using my private account.
Basically I got to do the work on company time&dime, but I couldn't give my employer credit, due to this kind of legal red tape.
I liked that teamlead
Maybe I've just gotten lucky, but at companies I've worked for, I've usually gotten the go-ahead to contribute upstream to open source projects as long as it's something important for what we are working on. The only reason I didn't do a whole lot as part of my work at Google is that most of the open source projects I contributed to at Google were Google projects that I could contribute to from the google3 side, and that doesn't count.
Interesting. Checking the git history of FFmpeg, Google has approximately 643 contributions:
git clone https://git.ffmpeg.org/ffmpeg.git
cd ffmpeg
git log --pretty=format:"%ae" | grep -E "chromium\.org|google\.com" | wc -l
prints 643

> Unfortunately, in my experience, there's often a lot of barriers within companies to upstream. Reasons can be everything from compliance, processes, you name it...
True. In my case I literally had to fight for it. Our lawyers were worried about a weakened patent portfolio and whatnot. In my case at least I won and now we have a culture of upstreaming changes. So don't give up the fight, you might win.
It would probably be easier for these companies to pay Collabora or Igalia.
> Unfortunately, in my experience, there's often a lot of barriers within companies to upstream. Reasons can be everything from compliance, processes, you name it... It's unfortunate.
I sympathize and understand those issues for small companies, but after a certain size those excuses stop being convincing.
Especially for a software company like Google who runs dozens of open source projects, employs an army of lawyers to monitor compliance, and surely has to deal with those issues on a daily basis anyway.
At some point there needs to be pushback. Companies get a huge amount of value from hobbyist open source projects, and eventually they need to start helping out or be told to go away.
This is why all open source software should be copyleft. No discussion to be had: either you upstream changes, or that open source developer's going to get funded via the courts.
Nothing says solid industry better than having 3 major product lines from a trillion-dollar company depending on unpaid volunteer labor.
I think if you look a bit deeper, all product lines from said trillion-dollar company rely on open source to some degree. They should be spending hundreds of millions in sponsorship of OS projects. They should put the maintainers on their payroll. Not even reporting to a manager, just pay them a salary for their OS work.
I have a feeling that if they do this, the economy would be hurt (somehow).
None of us want the economy to be hurt, right?
That's what's going to happen when these corporations extract the last value from OSS and all the maintainers give up, so..
How could ffmpeg maintainers kill three major AWS product lines with an email?
In a follow-up tweet, Mark Atwood elaborates: "Amazon was very carefully complying with the licenses on FFmpeg. One of my jobs there was to make sure the company was doing so. Continuing to make sure the company was was often the reason I was having a meeting like that inside the company."
I interpret this as meaning there was an implied "if you screw this up" at the end of "they could kill three major product lines with an email."
Are you interpreting that as "if we violate the license, they can revoke our right to use the software" ?? And they use it in 3 products so that would be really bad. That would make sense to have a compliance person.
Possibly Twitch, Amazon Prime Video, and another one that escapes my mind (AWS-related?).
Yeah - Amazon Elastic Transcoder which they just shut down and replaced with Elemental MediaConvert is almost certainly just managed "ffmpeg as a Service" under the hood.
Twitch definitely. This whole brouhaha has been brewing for a while, and can be traced back to a spat between Theo and ffmpeg.
In the now deleted tweet Theo thrashed VLC codecs to which ffmpeg replied basically "send patches, but you wouldn't be able to". The reply to which was
--- start quote ---
https://x.com/theo/status/1952441894023389357
You clearly have no idea how much of my history was in ffmpeg. I built a ton of early twitch infra on top of yall.
--- end quote ---
This culminated in Theo offering a 20k bounty to ffmpeg if they remove the people running ffmpeg twitter account. Which prompted a lot of heated discussion.
So when Google Project Zero posted their bug... ffmpeg went understandably ballistic
And Blink. I used to contract with them a few years back, they all rely heavily on open source.
Still doesn't make any sense.
The company that I work at makes sure anything that uses a third-party library, whether in internal tools, shipped products, or hosted products, goes through legal review. And you'd better comply with whatever the legal team asks you to do. Unless you and everyone around you are as dumb as a potato, you are not going to do things that blatantly violate licenses, like shipping a binary with modified but undisclosed GPL source code. And you can be sure that (1) it's hard to use anything GPL or LGPL in the first place, and (2) even if you are allowed to, someone will tell you to be extra careful and do exactly what you are told (or not told) to do.
And as long as Amazon is complying with ffmpeg's LGPL license, ffmpeg can't just stop licensing existing code via an email. Of course, unless there is some secret deal, but again, in that case, someone in the giant corporation will make sure you follow what's in the contract.
Basically, at companies like Amazon where there are functional legal teams, the chance of someone "screwing up" is very small.
Easy: ffmpeg discontinues or relicenses some ffmpeg functionality that AWS depends on for those product lines and AWS is screwed. I've seen that happen in other open source projects.
But if it gets relicensed, they would still be able to use the current version. Amazon definitely would be able to fund an independent fork.
And then the argument for refusing to just pay ffmpeg developers gets even more flimsy.
The entire point here is to pay for the fixes/features you keep demanding, else the project is just going to do as it desires and ignore you.
More and more OSS projects are getting to this point as large enterprises (especially in the SaaS/PaaS spheres) continue to take advantage of those projects and treat them like unpaid workers.
Heard of OpenSearch?
There are many reasons, often good ones, not to pay money for an open source project but instead fund your own projects, from a company's perspective.
Yes, if FFMpeg suddenly made a corporate-incompatible license change, it would 100% make sense to fork it.
I don’t see how that at all relates to the point, but sure, you got me.
Not really. Their whole reason for not funding open source is it essentially funds their competitors who use the same projects. That's why they'd rather build a closed fork in-house than just hand money to ffmpeg.
It's a dumb reason, especially when there are CVE bugs like this one, but that's how executives think.
> Their whole reason for not funding open source is it essentially funds their competitors who use the same projects. That's why they'd rather build a closed fork in-house than just hand money to ffmpeg.
So the premise here is that AWS should waste their own money maintaining an internal fork in order to try to make their competitors do the same thing? But then Google or Intel or someone just fixes it a bit later and wisely upstreams it so they can pay less than you by not maintaining an internal fork. Meanwhile you're still paying the money even though the public version has the fix because now you either need to keep maintaining your incompatible fork or pay again to switch back off of it. So what you've done is buy yourself a competitive disadvantage.
> that's how executives think.
That's how cargo cult executives think.
Just because you've seen someone else doing something doesn't mean you should do it. They might not be smarter than you.
But their competitors also fund them, which makes it a net positive sum.
ffmpeg is LGPL, so they can't make a proprietary fork anyways
Google, AWS, Vimeo, etc can demand all they want. But they’re just another voice without any incentives that aid the project. If they find having an in-house ffmpeg focused on their needs to be preferable, go for it; that’s OSS.
But given its license, they’re going to have to reveal those changes anyways (since many of the most common codecs trigger the GPL over LGPL clause of the license) or rewrite a significant chunk of the library.
Sounds like it would be a lot of churn for nothing; if they can fund a fork, then they could fund the original project, no?
They COULD, but history has shown they would rather start and maintain their own fork.
It might not make sense morally, but it makes total sense from a business perspective… if they are going to pay for the development, they are going to want to maintain control.
If they want that level of control, reimburse for all the prior development too. - ie: buy that business.
As it stands, they're just abusing someone's gift.
Like jerks.
Do they want control or do they really want something that works that they don't have to worry about?
The only reason for needing control would be if it was part of their secret sauce and at that point they can fork it and fuck off.
These companies should be heavily shamed for leaching off the goodwill of the OSS community.
If they can fund a fork, they can continue business as usual until the need arises
Funding ffmpeg also essentially funds their competitors, but a closed fork in-house doesn't. Submitting bugs costs less than both, hence why they still use ffmpeg in the first place.
With a bit of needless work the fixes could be copied and they would still end up funding them.
They can't - it's LGPL 2.1. So the fork would be public essentially.
It still takes expensive humans to do this so they are incentivized to use the free labor.
Yes, definitely. I was just saying that if the license ever did change, they would move to an in-house library. In fact, they would probably release the library for consumer use as an AWS product.
Oh the irony - we don't want to pay for ffmpeg's development, but sure can finance a fork if we have to.
Something more dangerous would be "Amazon is already breaking the license, but the maintainers for now haven't put in the work to stop the infringement."
Wouldn’t that only affect new versions and current versions are still licensed under the old license ?
ffmpeg cannot relicense anything because it doesn't own anything. The contributors own the license to their code.
Relicensing isn't necessary. If you violate the GPL with respect to a work you automatically lose your license to that work.
It's enough if one or two main contributors assert their copyrights. Their contributions are so tangled with everything else after years of development that it can't meaningfully be separated away.
In addition, there is the potential for software users to sue for GPL compliance. At least that is the theory behind the lawsuit against Vizio:
But that's only relevant if AWS (in this example) violates the GPL license, and it doesn't really seem like they have?
They can switch from LGPLv2.1 to GPLv2 or GPLv3 for future development because the license has an explicit provision for that.
I don’t know about ffmpeg, but plenty of OSS projects have outlined rules for who/when a project-wide/administrative decision can be made. It’s usually outlined in a CONTRIB or similar file.
I'd guess Prime Video heavily relies on ffmpeg, then you got Elastic Transcode and the Elemental Video Services. Probably Cloudfront also has special things for streaming that rely on ffmpeg.
The "kill it with an email" probably means that whoever said this is afraid that some usecase there wouldn't stand up to an audit by the usual patent troll mothercluckers. The patents surrounding video are so complex, old and plentiful that I'd assume full compliance is outright impossible.
AWS MediaConvert as well, which is a huge API (in the surface it covers); it's under Elemental but is kinda its own thing. Willing to bet (though I don't know) that that is ffmpeg somewhere underneath.
The API manual for it is nearly 4000 pages and it can do insane stuff[1].
I had to use it at last job(TM), it's not terrible API wise.
[1] https://docs.aws.amazon.com/pdfs/mediaconvert/latest/apirefe... CAUTION: big PDF.
" ... and it can do insane stuff"
That's a pretty good indicator it's likely just ffmpeg in an AWS Hoodie/Trenchcoat.
If you breach the LGPLv2/GPLv2 licence then you lose all rights to use the software.
There's no penalty clause, there's no recovery clause. If you don't comply with the licence conditions then you don't have a licence. If you don't have a licence then you can't use the program, any version of the program. And if your products depend on that program then you lose your products.
The theoretical email would be a notification that they had breached the licence and could no longer use the software. The obvious implication being that AWS was wanting to do something that went contrary to the restrictions in the GPL, and he was trying to convince them not to.
Open up an Amazon media app and navigate around enough, and you'll encounter a page with all their "Third Party Software Licenses."
For instance, here's one for the Amazon Music apps, which includes an FFMpeg license: https://www.amazon.com/gp/help/customer/display.html?nodeId=...
If Google can pay someone to find bugs, they can pay someone to fix them.
Sounds like they'll just throw their employees to work on it rather than monetarily fund it, that way they can aura farm.
As a Googler, I wish I was as optimistic as you. There is an internal sentiment that valuable roles are being removed that aren't aligned with strategic initiatives, even roles that are widely believed to improve developer productivity. See the entire python maintainers team being laid off: https://www.reddit.com/r/AskProgramming/comments/1cem1wk/goo...
Roles fixing FFmpeg bugs would be a hard sell in this environment, imho.
Finding the bug is 95% of the effort. The idea that reporting obscure security bugs is worthless is BS.
> "Don't come to me with problems, come with solutions"
The problem is, the issue in the article is explicitly named as "CVE slop", so if the patch is of the same quality, it might require quite some work anyway.
The linked report seems to me to be the furthest thing from "slop". It is an S-tier bug report that includes a complete narrative, crash artifacts, and detailed repro instructions. I can't believe anyone is complaining about what is tied for the best bug report I have ever seen. https://issuetracker.google.com/issues/440183164?pli=1
It's a good quality bug report.
But it's also a bug report about the decoder for "SANM ANIM v0" - a format so obscure almost all the search results are the bug report itself. Possibly a format exclusive to mid-1990s LucasArts games [1]
Pretty crazy that ffmpeg supports the codec in the first place, IMHO.
I can understand volunteers not wanting to sink time into maintaining a codec to play a video format that hasn't been used since the Clinton administration. gstreamer divides their plugins into 'good', 'bad' and 'ugly' to give them somewhere to stash unmaintained codecs.
[1] https://web.archive.org/web/20250419105551/https://wiki.mult...
It's a codec that is enabled by default at least on major Linux distributions, and that will be processed by ffmpeg without any extra flags. Anyone playing an untrusted video file without explicitly overriding the codec autodetection is vulnerable.
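For callers who control the invocation, one runtime way to narrow that autodetection is libavformat's codec whitelist option; a sketch (the codec names below are examples, substitute whatever your pipeline actually expects, and the filename is a placeholder):

```shell
# Illustrative only: restrict which decoders ffmpeg's autodetection may
# select for an untrusted input, so an obscure format like SANM is
# rejected at demux time rather than decoded. Requires an ffmpeg build
# exposing the libavformat codec_whitelist option.
ffmpeg -codec_whitelist "h264,aac" -i untrusted.mp4 -f null -
```

This doesn't remove the vulnerable code from the binary, of course; it only keeps this particular entry point from reaching it.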
The format being obscure and having no real usage doesn't help when it's the attackers creating the files. The obscure formats are exposing just as much attack surface as the common ones.
> Pretty crazy that ffmpeg supports the codec in the first place, IMHO.
Yes.
Sure, it's a valid bug report. But I don't understand why there has been so much drama over this when all the ffmpeg folks have to do is say "sorry, this isn't a priority for us so we'll get to it as soon as we can" and put the issue in the backlog as a low priority. If Google wants the issue fixed faster, they can submit a fix. If they don't care enough to do that, they can wait. No big deal either way. Instead, ffmpeg is getting into a public tiff with them over what seems to be a very easily handled issue.
Yes, you're very right. They could simply have killed a codec that no one uses anymore. Or put it behind a compile flag, so if you really want, you can still enable it
But no. Intentionally or not, there was a whole drama created around it [1], with folks being criticized [2] for saying exactly what you said above, because of their past (!) employers.
Instead of using the situation to highlight the need for more corporate funding for opensource projects in general, it became a public s**storm, with developers questioning their future contributions to projects. Shameful.
FFMPEG is upset because Google made the exploit public. They preferred that it remained a zero-day until they decided it was a priority.
I don't understand how anyone believes that behavior is acceptable.
I think the answer is pretty simple: ffmpeg is being thin-skinned here. They do care about the vulnerability (despite whining it's an old / obscure format), but they don't want to / have time to fix the issue, and don't want to publicly admit that their software is insecure with lots of attack vectors due to the gazillion codecs they have.
Judging from some online responses I think it's working too. I honestly don't see how ffmpeg's response is remotely acceptable.
Ffmpeg makes it trivial to enable and disable individual codecs at compile time. Perhaps it's the Linux distros that need to make a change here?
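For what it's worth, the compile-time switch mentioned above looks roughly like this; note that `sanm` as the registered name of the SMUSH decoder is my assumption, so a packager should confirm it against the configure output first:

```shell
# Build-configuration sketch for a downstream packager.
# First confirm what the decoder is actually called:
./configure --list-decoders | grep -i sanm
# Then build with it (and any other unwanted legacy decoders) compiled out:
./configure --disable-decoder=sanm
make
```

Distributions that already carry custom ffmpeg configure flags could fold this into their existing build recipes.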
I get that the ffmpeg people have limited time and resources, I get that it would be nice if Google (or literally anyone else) patched this themselves and submitted that upstream. But "everyone else down stream of us should compile out our security hole" is a terrible way to go about things. If this is so obscure of a bug that there's no real risk, then there's no need for anyone to worry that the bug has been reported and will be publicized. On the other hand, if it's so dangerous that everyone should be rebuilding ffmpeg from source and compiling it out, then it really needs to be fixed in the up stream.
Edit: And also, how is anyone supposed to know they should compile the codec out unless someone makes a bug report and makes it public in the first place?
The key point here is: how would a distro know about this vulnerability if Google didn't disclose it? ffmpeg is acting as if Google should have just shut up about it instead of using a well-established timed disclosure mechanism. That means the vulnerability would be private, and downstream users (e.g. distros and individuals) would have no way of knowing said codec is insecure.
Every change breaks somebody's workflow.
There are dozens if not hundreds of issues just like this one in ffmpeg, except for codecs that are infinitely more common. Google has been running all sorts of fuzzers against ffmpeg for over a decade at this point and it just never ends. It's a 20 year old C project maintained by poorly funded volunteers that mostly gives every media file ever the be-liberal-in-what-you-accept treatment, because people complain if it doesn't decode some bizarrely non-standard MPEG4 variant recorded with some Chinese plastic toy from 2008. Of course it has all of the out-of-bounds bugs. I poked around on the issue tracker for like 5 minutes and found several "high impact" issues similar to the one in TFA just from the last two or three months, including at least one that hasn't passed the 90 day disclosure window yet.
Nobody who takes security even remotely seriously should decode untrusted media files outside of a sandboxed environment. Modern media formats are in themselves so complex one starts wondering if they're actually Turing complete, and in ffmpeg the attack surface is effectively infinitely large.
The issue is CVE slop because it just doesn't matter if you consider the big picture.
Some example issues to illustrate my point:
https://issuetracker.google.com/issues/436511754 https://issuetracker.google.com/issues/445394503 https://issuetracker.google.com/issues/436510316 https://issuetracker.google.com/issues/433502298
I don't get why you think linking to multiple legitimate and high quality bug reports with detailed analysis and precise reproduction instructions demonstrates "slop". It is the opposite.
This is software that is directly or indirectly run by millions of people on untrusted media files without sandboxing. It's not even that they don't care about security, it's that they're unaware that they should care. It should go without saying that they don't deserve to be hacked just because of that. Big companies doing tons of engineering work to add defense in depth for use cases on their own infrastructure (via sandboxing or disabling obsolete codecs) doesn't help those users. Finding and fixing the vulnerabilities does.
Anyone running this code with untrusted input needs to sandbox it (which Google has been doing all along).
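Short of a full sandbox, even coarse OS-level resource limits blunt a lot of the damage from a hostile input. A minimal Linux wrapper sketch (not a hardened profile; `timeout` is from coreutils, and the final line uses a harmless stand-in command so the sketch is runnable anywhere):

```shell
# Coarse defense-in-depth: run an untrusted-input tool under a CPU
# limit, an address-space cap, and a wall-clock timeout. The subshell
# keeps the ulimit settings from leaking into the calling shell.
run_limited() {
  ( ulimit -t 10 -v 1048576   # 10 s of CPU, ~1 GiB of address space
    exec timeout 15 "$@" )
}
# Intended use: run_limited ffmpeg -i untrusted.bin -f null -
# Harmless stand-in for demonstration:
run_limited sh -c 'echo sandboxed'
```

A real deployment would add namespace isolation (e.g. bubblewrap or seccomp filters) on top; limits alone don't stop data exfiltration, only resource exhaustion and runaway processes.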
> Google has been running all sorts of fuzzers against ffmpeg for over a decade at this point
Yeah. It's called YouTube... Why run fuzzers if you can get people to upload a few million random videos every day? ;-)
(I wonder if the BigSleep AI was trained on or prompted with YouTube error logs?)
Yeah but as you can see from the bug report ffmpeg automatically triggers the codec based on file magic, so it is possible that if you run some kind of network service or anything that handles hostile data an attacker could trigger the bug.
It feels like maybe people do not realize that Google is not the only company that can run fuzzers against ffmpeg? Attackers are also highly incentivized to do so and they will not do you the courtesy of filing bug reports.
I'm sure that a hacker wouldn't think of trying to use an obscure format...
https://googleprojectzero.blogspot.com/2021/12/a-deep-dive-i...
> If you used the scan to pdf functionality of a [Xerox] like this a decade ago, your PDF likely had a JBIG2 stream in it.
That's not an obscure format, that's an old format. Meanwhile with ffmpeg we're talking about > decoding LucasArts Smush codec, specifically the first 10-20 frames of Rebel Assault 2, a game from 1995.
That's both old and obscure. Your point is still taken, but just to clarify that these are different situations: JBIG2 is included for legacy reasons. The LucasArts codec is included for... completion's sake(?)
The problem is that if you have a process using ffmpeg and an attacker feeds it a video with this codec, ffmpeg will proceed to auto-detect the codec, attempt to decode it, and then break everything.
If the format is old and obscure, and the implementation is broken, it shouldn't be on by default.
Hmmmm. There's probably just one guy who wrote the ffmpeg code for that format. _Maybe_ one or two more who contributed fixes or enhancements?
The ffmpeg project needs to get in touch and get them to assign copyright to the ffmpeg project, then delete that format/decoder from ffmpeg. Then go back to Google with an offer to license them a commercial version of ffmpeg with the fixed SANM ANIM v0 decoder, for the low low price of only 0.0001% of YouTube's revenue every year. That'd likely make them the best-funded open source project ever, if they pulled it off.
Best response would be to drop this codec entirely, or have it off by default. At least distros should do that.
The actual best response would be to run any "unsupported" codecs in a WASM sandbox. That way you are not throwing away work, Google can stop running fuzzers against random formats from 1995, and you can legitimately say that the worst that can happen with these formats is a process crash. Everybody wins.
Google is not paying anyone to find bugs. They are running AIs indiscriminately.
Someone is making the tools to find these bugs. It's not like they're telling ChatGPT "go find bugs lol"
And running those models on large codebases like these isnt anywhere close to free either.
Does it matter? Either it's a valid bug or it's not. Either it's of high importance or it's not.
A human at Google investigates all of the bugs fuzzers and AI find manually and manually writes bug reports for upstream with more analysis. They are certainly paid to do that. They are also paid to develop tooling to find bugs.
I'm not sure what you think you mean when you say "running AIs indiscriminately". It's quite expensive to run AI this way, so it needs to be done with very careful consideration.
Still, they are paying for the computing resources needed to run the AI/agents etc.
Someone started it running, they are responsible for the results.
They certainly paid someone to run the so-called AIs.
I’m an open source maintainer, so I empathize with the sentiment that large companies appear to produce labor for unpaid maintainers by disclosing security issues. But appearance is operative: a security issue is something that I (as the maintainer) would need to fix regardless of who reports it, or would otherwise need to accept the reputational hit that comes with not triaging security reports. That’s sometimes perfectly fine (it’s okay for projects to decide that security isn’t a priority!), but you can’t have it both ways.
If google bears no role in fixing the issues it finds and nobody else is being paid to do it either, it functionally is just providing free security vulnerability research for malicious actors because almost nobody can take over or switch off of ffmpeg.
I don’t think vulnerability researchers are having trouble finding exploitable bugs in FFmpeg, so I don’t know how much this actually holds. Much of the cost center of vulnerability research is weaponization and making an exploit reliable against a specific set of targets.
(The argument also seems backwards to me: Google appears to use a lot of not-inexpensive human talent to produce high quality reports to projects, instead of dumping an ASan log and calling it a day. If all they cared about was shoveling labor onto OSS maintainers, they could make things a lot easier for themselves than they currently do!)
Internally, Google maintains their own completely separate FFMpeg fork as well as a hardened sandbox for running that fork. Since they keep pace with releases to receive security fixes, there’s potentially lots of upstreamable work (with some effort on both sides…)
My understanding from adjacent threads in this discussion is that Google does in fact make significant upstream contributions to FFmpeg. Per policy those are often made with personal emails, but multiple people have said that Google’s investment in FFmpeg’s security and codec support have been significant.
(But also, while this is great, it doesn’t make an expectation of a patch with a security report reasonable! Most security reports don’t come with patches.)
Shouldn't this fork be publicly available as per GPL license?
So your claim is that buggy software is better than documented buggy software?
I think so, yes. Certainly it's more effort to both find and exploit a bug than to simply exploit an existing one someone else found for you.
Yeah it's more effort, but I'd argue that security through obscurity is a super naive approach. I'm not on Google's side here, but so much infrastructure is "secured" by gatekeeping knowledge.
I don't think you should try to invoke the idea of naivete when you fail to address the unhappy but perfectly simple reality that the ideal option doesn't exist, is a fantasy that isn't actually available, and among the available options, even though none are good, one is worse than another.
"obscurity isn't security" is true enough, as far as it goes, but is just not that far.
And "put the bugs that won't be fixed soon on a billboard" is worse.
The super naive approach is ignoring that and thinking that "fix the bugs" is a thing that exists.
If I know it's a bug and I use ffmpeg, I can avoid it by disabling the affected codec. That's pretty valuable.
More fantasy. Presumes the bug only exists in some part of ffmpeg that can be disabled at all, and that you don't need, and that you are even in control over your use of ffmpeg in the first place.
Sure, in maybe 1 special lucky case you might be empowered. And in 99 other cases you are subject to a bug without being in the remotest control over it since it's buried away within something you use and don't even have the option not to use the surface service or app let alone control it's subcomponents.
The bug exists whether it's reported to the maintainers or not, so yeah, it's pretty naive.
You observe that it is better to be informed than ignorant.
This is true. Congratulations. Man we are all so smart for getting that right. How could anyone get something so obvious and simple wrong?
What you leave out is "in a vacuum" and "all else being equal".
We are not in a vacuum and all else is not equal, and there are more than those 2 factors alone that interact.
Given that Google is both the company generating the bug reports and one of the companies using the buggy library, while most of the ffmpeg maintainers presumably aren't using their libraries to run companies with a $3.52 trillion dollar market cap, would you argue that going public with vulnerabilities that affect your own product before you've fixed them is also a naive approach?
Sorry, but this states a lot of assumptions as fact to ask a question which only makes sense if they're all true. I feel Google should assist the project more financially given how much they use it, but I don't think it's a reasonable guess that Google ships products using every codec their open source fuzzer project finds bugs in. I certainly doubt YouTube lets you upload this LucasArts format, or that Chrome compiles ffmpeg with it, as an example. For security issues relevant to their usage via Chrome CVEs etc., they seem to contribute on fixes as needed. E.g. here is one, via fuzzing, for a codec they use and work on internally: https://github.com/FFmpeg/FFmpeg/commit/b1febda061955c6f4bfb...
In regards whether it's a bad idea to publicly document security concerns found regardless whether you plan on fixing them, it often depends if you ask the product manager what they want for their product or what the security concerned folks in general want for every product :).
> I think so, yes. Certainly it's more effort to both find and exploit a bug than to simply exploit an existing one someone else found for you.
That just means the script kiddies will have more trouble, while scarier actors like foreign intelligence agencies will have free rein.
Foreign intelligence has free rein either way. The script kiddies are the only ones that can be stopped by technological solutions.
it’s not a claim it’s common sense that’s why we have notice periods
I like how some coward downvoted with no response when my counterpoint is devastating.
> it functionally is just providing free security vulnerability research for malicious actors because almost nobody can take over or switch off of ffmpeg
At least, if this information is public, someone can act on it and sandbox ffmpeg for their use case, if they think it's worth it.
I personally prefer to have this information be accessible to all users.
This is a weird argument. Basically condoning security through obscurity: If nobody reports the bug then we just pretend it doesn’t exist, right?
There are many groups searching for security vulnerabilities in popular open source software who deliberately do not disclose them. They do this to save them for their own use or even to sell them to bad actors.
It’s starting to feel silly to demonize Google for doing security research at this point.
> It’s starting to feel silly to demonize Google for doing security research at this point.
Aren't most people here demonizing Google for dedicating the resources to find bugs, but not to fix them?
And not giving the maintainers a reasonable amount of time to fix it. This was triggered by a recent change of policy on Google's side.
The timeline is industry standard at this point. The point is make sure folks take security more seriously. If you start deviating from the script, others will expect the same exceptions and it would lose that ability. Sometimes it's good to let something fail loudly to show this is a problem. If ffmpeg doesn't have enough maintainers, then they should fail and let downstream customers know so they have more pressure to contribute resources. Playing superman and trying to prevent them from seeing the problem will just lead to burn out.
Is it industry standard to run automated AI tools and spam the upstream with bug reports? To then expect the bugs to be fixed within 90 days is a bit much.
It's not some lone report of an important bug, it's AI spam that put forth security issues at a speed greater than they have resources to fix it.
"AI tools" and "spam" are knee jerk reactions, not an accurate picture of the bug filed: https://issuetracker.google.com/issues/440183164?utm_source=...
whether or not AI found it, clearly a human refined it and produced a very high quality bug report. There's no AI slop here. No spam.
I guess the question that a person at Google who discovers a bug they don’t personally have time to fix is, should they report the bug at all? They don’t necessarily know if someone else will be able to pick it up. So the current “always report” rule makes sense since you don’t have to figure out if someone can fix it.
The same question applies if they have time to fix it in six months, since that presumably still gives attackers a large window of time.
In this case the bug was so obscure it’s kind of silly.
It doesn't matter how obscure it is if it's a vulnerability that's enabled in default builds.
This was not a case of stumbling across a bug. This was dedicated security research, taking days if not weeks of highly paid employees' time to find.
And after all that, they just drop an issue, instead of spending a little extra time on producing a patch.
It’s possible that this is a more efficient use of their time when it comes to open source security as a whole, most projects do not have a problem with reports like this.
If not pumping out patches allows them to get more security issues fixed, that’s fine!
From the perspective of Google maybe, but from the perspective of open source projects, how much does this drain them?
Making open source code more secure and at the same time less prevalent seems like a net loss for society. And if those researchers could spare some time to write patches for open source projects, that might benefit society more than dropping disclosure deadlines on volunteers.
I’m specifically talking from the perspective of everybody but Google.
High quality bug reports like this are very good for open source projects.
Except users can act accordingly to work around the vulnerability.
For one, it lets people understand where ffmpeg is at so they can treat it more carefully (e.g. run it in a sandbox).
Ffmpeg is also open source. After public disclosure, distros can choose to turn off said codec downstream to not expose this attack vector. There are a lot of things users can do to protect themselves but they need to be aware of the problem first.
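Turning a codec off downstream is cheap at build time. A hedged sketch of what that looks like with FFmpeg's own configure system (the SMUSH decoder name `sanm` is assumed here; verify it against `./configure --list-decoders` before relying on it):

```shell
# Build FFmpeg without the LucasArts SMUSH decoder.
# Decoder name "sanm" is an assumption -- check --list-decoders first.
./configure --disable-decoder=sanm
make

# Or take the allowlist approach distros sometimes use: disable
# everything, then re-enable only what a given product actually needs.
./configure --disable-everything \
            --enable-decoder=h264 \
            --enable-decoder=aac \
            --enable-demuxer=mov \
            --enable-protocol=file
```

The allowlist variant shrinks the attack surface far more aggressively, at the cost of maintaining the list as requirements change.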
Security by obscurity. In 2025. On HN.
My takeaway from the article was not that the report was a problem, but that Google changed its approach: they'll disclose publicly after X days, regardless of whether the project has had a chance to fix it.
To me it's okay to "demand" that a for-profit company (e.g. Google) fix an issue fast, because they have resources. But to "demand" that an OSS project fix something within a certain (possibly tight) timeframe... well, I'm sure you know better than me that that's not how volunteering works.
On the other hand as an ffmpeg user do you care? Are you okay not being told a tool you're using has a vulnerability in it because the devs don't have time to fix it? I mean someone could already be using the vulnerability regardless of what Google does.
>Are you okay not being told a tool you're using has a vulnerability in it because the devs don't have time to fix it?
Yes? It's in the license:
>NO WARRANTY
>15. BECAUSE THE LIBRARY IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY FOR THE LIBRARY, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE LIBRARY "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE.
If I really care, I can submit a patch or pay someone to. The ffmpeg devs don't owe me anything.
Not being told the existence of bugs is different from having a warranty on software. How would you submit a patch on a bug you were not aware of?
Google should provide a fix but it's been standard to disclose a bug after a fixed time because the lack of disclosure doesn't remove the existence of the bug. This might have to be rethought in the context of OSS bugs but an MIT license shouldn't mean other people can't disclose bugs in my project.
Google publicly disclosing the bug doesn't only let affected users know. It also lets attackers know how they can exploit the software.
Holding public disclosure over the heads of maintainers if they don't act fast enough is damaging not only to the project but to end users as well. There was no pressing need to publicly disclose this 25-year-old bug.
How is having a disclosure policy so that you balance the tradeoffs between informing people and leaving a bug unreported "holding" anything over the heads of the maintainers? They could just file public bug reports from the beginning. There's no requirement that they file non-public reports first, and certainly not everyone who does file a bug report is going to do so privately. If this is such a minuscule bug, then whether it's public or not doesn't matter. And if it's not a minuscule bug, then certainly giving some private period, but then also making a public disclosure is the only responsible thing to do.
Come on, we let this argument die a decade ago. Disclosure timelines that match what the software author wants is a courtesy, not a requirement.
That license also doesn't give the ffmpeg devs the right to dictate which bugs you're allowed to find, disclose privately, or disclose publicly. The software is provided as-is, without warranty, and I can do what I want with it, including reporting bugs. The ffmpeg devs can simply not read the bug reports, if they hate bug reports so much.
All the license means is that I can’t sue them. It doesn’t mean I have to like it.
Just because software makes no guarantees about being safe doesn’t mean I want it to be unsafe.
Sorry to put it this bluntly, but you are not going to get what you want unless you do it yourself or you can convince, pay, browbeat, or threaten somebody to provide it for you.
If the software makes no guarantees about being safe, then you should assume it is unsafe.
Have you ever used a piece of software that DID make guarantees about being safe?
Every software I've ever used had a "NO WARRANTY" clause of some kind in the license. Whether an open-source license or a EULA. Every single one. Except, perhaps, for public-domain software that explicitly had no license, but even "licenses" like CC0 explicitly include "Affirmer offers the Work as-is and makes no representations or warranties of any kind concerning the Work ..."
I don't know what our contract terms were for security issues, but I've certainly worked on a product where we had 5 figure penalties for any processing errors or any failures of our system to perform its actions by certain times of day. You can absolutely have these things in a contract if you pay for it, and mass market software that you pay for likely also has some implied merchantability depending on jurisdiction.
But yes, things you get for free come with no guarantees, and there should be no expectations put on the gift giver beyond not being actively, intentionally malicious.
OK, then you can't decode videos.
Anyone who has seen how the software sausage is made knows that. Security flaws will happen, no matter what the lawyers put in the license.
And still, we live in a society. We have to use software, bugs or not.
not possible to guarantee safety
This is a fantastic argument for the universe where Google does not disclose vulnerability until the maintainers had had reasonable time to fix it.
In this world the user is left vulnerable because attackers can use published vulnerabilities that the maintainers are too overwhelmed to fix.
This program discloses security issues to the projects and only discloses them after they have had a "reasonable" chance to fix it though, and projects can request extensions before disclosure if projects plan to fix it but need more time.
Google runs this security program even on libraries they do not use at all, where it's not a demand, it's just whitehat security auditing. I don't see the meaningful difference between Google doing it and some guy with a blog doing it here.
Google is a multi-billion dollar company, which is paying people to find these bugs in the first place.
That's a pretty core difference.
Great, so Google is actively spending money on making open source projects better and more secure. And for some reason everyone is now mad at them for it because they didn't also spend additional money making patches themselves. We can absolutely wish and ask that they spend some money and resources on making those patches, but this whole thing feels like the message most corporations are going to take is "don't do anything to contribute to open source projects at all, because if you don't do it just right, they're going to drag you through the mud for it" rather than "submit more patches"
Why should Google not be expected to also contribute fixes to a core dependency of their browser, or to help funding the developers? Just publishing bug reports by themselves does not make open source projects secure!
They're actively making open source projects less secure by publishing bugs that the projects don't have the volunteers to fix
I saw another poster say something about "buggy software". All software is buggy.
> so Google is actively spending money on making open source projects better and more secure
It looks like they are now starting to flood OSS with issues because "our AI tools are great", but don't want to spend a dime helping to fix those issues.
xkcd 2347
Corporate Social Responsibility? The assumption is that the work is good for end users. I don't know if that's the case for the maintainers though.
The user is vulnerable while the problem is unfixed. Google publishing a vulnerability doesn't change the existence of the vulnerability. If Google can find it, so can others.
Making the vulnerability public makes it easy to find to exploit, but it also makes it easy to find to fix.
If it is so easy to fix, then why doesn't Google fix it? So far they've spent more effort in spreading knowledge about the vulnerability than fixing it, so I don't agree with your assessment that Google is not actively making the world worse here.
I didn't say it was easy to fix. I said a publication made it easy to find it, if someone wanted to fix something.
If you want to fix up old codecs in ffmpeg for fun, would you rather have a list of known broken codecs and what they're doing wrong, or would you rather have to find a broken codec first?
>If Google can find it, so can others.
What a strange sentence. Google can do a lot of things that nobody can do. The list of things that only Google, a handful of nation states, and a handful of Google-peers can do is probably even longer.
Sure, but running a fuzzer on ancient codecs isn't that special. I can't do it, but if I wanted to learn how, codecs would be a great place to start. (in fact, Google did some of their early fuzzing work in 2012-2014 on ffmpeg [1]) Media decoders have been the vector for how many zero interaction, high profile attacks lately? Media decoders were how many of the Macromedia Flash vulnerabilities? Codecs that haven't gotten any new media in decades but are enabled in default builds are a very good place to go looking for issues.
Google does have immense scale that makes some things easier. They can test and develop congestion control algorithms with world wide (ex-China) coverage. Only a handful of companies can do that; nation states probably can't. Google isn't all powerful either, they can't make Android updates really work even though it might be useful for them.
[1] https://security.googleblog.com/2014/01/ffmpeg-and-thousand-...
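The "running a fuzzer on ancient codecs isn't that special" point is easy to make concrete. A minimal sketch of the "dumb" mutation idea in Python, run against a toy length-prefixed decoder; everything here (the mutation strategy, the toy decoder) is illustrative, not ffmpeg's or Google's actual harness:

```python
import random

def mutate(seed: bytes, rng: random.Random) -> bytes:
    # Flip a few random bytes -- the simplest "dumb" mutation strategy.
    data = bytearray(seed)
    for _ in range(rng.randint(1, 4)):
        data[rng.randrange(len(data))] ^= rng.randrange(1, 256)
    return bytes(data)

def fuzz(parse, seed: bytes, iterations: int = 1000) -> list[bytes]:
    # Feed mutated inputs to the parser, collecting any that make it blow up.
    rng = random.Random(0)
    crashes = []
    for _ in range(iterations):
        sample = mutate(seed, rng)
        try:
            parse(sample)
        except Exception:
            crashes.append(sample)
    return crashes

# Toy "decoder" that chokes when its length field exceeds the buffer --
# the shape of bug that fuzzers find in real media parsers constantly.
def toy_decode(data: bytes) -> bytes:
    n = data[0]
    if n > len(data) - 1:
        raise ValueError("length field out of bounds")
    return data[1 : 1 + n]

found = fuzz(toy_decode, seed=bytes([4, 1, 2, 3, 4]))
assert found  # dumb mutation hits the out-of-bounds case within 1000 tries
```

Real fuzzers (libFuzzer, AFL) add coverage feedback and sanitizers on top, but the core loop is this small, which is why old codecs with no coverage get found so quickly.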
Nation-states are a very relevant part of the threat model.
> If Google can find it, so can others.
While true in principle, only Google has Google infrastructure, and this presupposes that 100% of all published exploits would be findable by others.
You'd assume that a bad actor would have found the exploit and kept it hidden for their own use. To assume otherwise is fundamentally flawed security practice.
> If Google can find it, so can others.
Not really. It requires time, ergo money.
Which bad actors would have more of, as they'd have a financial incentive to make use of the found vulnerabilities. White hats don't get anything (financially) in return; it's essentially charity work.
In this world and the alternate universe both, attackers can also use _un_published vulnerabilities because they have high incentive to do research. Keeping a bug secret does not prevent it from existing or from being exploited.
As clearly stated, most users of ffmpeg are unaware that they're using it. Even if they knew about a vulnerability in ffmpeg, they wouldn't know they were affected.
Really, the burden is on those shipping products that depend on ffmpeg: they are the ones who have to fix the security issues for their customers. If Google is one of those companies, they should provide the fix in the given time.
But how are those companies supposed to know they need to do anything unless someone finds and publicly reports the issue in the first place? Surely we're not advocating for a world where every vendor downstream of the ffmpeg project independently discovers and patches security vulnerabilities without ever reporting the issues upstream right?
They could fund both vulnerability scanning and vulnerability fixing (if they don't want to do it in-house, they can sponsor the upstream team); that, to me, is the obvious "how". I'm not sure why you believe there is only one way to do it.
It's about accountability! Once those who ship it to customers care, who actually does the work is on them to figure out (though note that maintainers will still have some burden to review, integrate, and maintain the change anyway).
They regularly submit code and they buy consulting from the ffmpeg maintainers according to the maintainer's own website. It seems to me like they're already funding fixes in ffmpeg, and really everyone is just mad that this particular issue didn't come with a fix. Which is honestly not a great look for convincing corporations to invest resources into contributing to upstream. If regular patches and buying dev time from the maintainers isn't enough to avoid getting grief for "not contributing" then why bother spending that time and money in the first place?
They could be, and the chances of that increase immensely once Google publishes it.
I have about 100x as much sympathy for an open source project getting time to fix a security bug than I do a multibillion dollar company with nearly infinite resources essentially blackmailing a small team of developers like this. They could -easily- pay a dev to fix the bug and send the fix to ffmpeg.
Since when are bug reports blackmail? If some retro game enthusiast discovered this bug and made a blog post about it that went to the front page of HN, is that blackmail? If someone running a fuzzer found this bug and dumped a public bug report into github is that blackmail? What if google made this report privately, but didn't say anything about when they would make it public and then just went public at some arbitrary time in the future? How is "heads up, here's a bug we found, here's the reproduction steps for it, we'll file a public bug report on it soon" blackmail?
In my case, yes, but my pipeline is closed. Processes run on isolated instances that are terminated without delay as soon as the workflow ends. Even if uncaught fatal errors occur, janitor scripts run to ensure instances are terminated on a fast schedule. This isn't something running on my personal device with random content provided by some unknown someone on the interwebs.
So while this might be a high security risk because it possibly could allow RCE, the real-world risk is very low.
> On the other hand as an ffmpeg user do you care? Are you okay not being told a tool you're using has a vulnerability in it because the devs don't have time to fix it?
Yes, because publicly disclosing the vulnerability means someone will have enough information to exploit it. Without public disclosure, the chance of that is much lower.
Public disclosures also means users will know about it and distros can turn off said codec downstream. It's not that hard lol. Information is always better. You may also get third-party contributors who will then be motivated to fix the issue. If no one signs up to do so, maybe this codec should just be permanently shelved.
Note that ffmpeg doesn't want to remove the codec because their goal is to play every format known to man, but that's their goal. No one forces them to keep all codecs working.
Sure, but how?
Let's say that FFMPEG has a severity-10 CVE where a trivially crafted stream can cause an RCE. So what?
We are talking about software commonly deployed by end users to encode their own media, something that rarely comes in untrusted forms. For an exploit to happen, you need a situation where an attacker gets a crafted media file out that people commonly transcode via FFMPEG. Not an easy task.
This sure does matter to the likes of Google, assuming they are using ffmpeg for their backend processing. It doesn't matter at all for just about anyone else.
You might as well tell me that `tar` has a CVE. That's great, but I don't generally go around tarring or untarring files I don't trust.
AIUI, (lib)ffmpeg is used by practically everything that does anything with video, including such definitely-security-sensitive things as Chrome, which people use to play untrusted content all the time.
Then maybe the Google chrome devs should submit a PR to ffmpeg.
Chrome devs frequently do just that, Chrome just doesn’t enable this codec.
Sure. And fund them.
hmm, didn't realize chrome was using ffmpeg in the background. That definitely makes it more dangerous than I supposed.
Looks like firefox does the same.
Firefox has moved some parsers to Rust: https://github.com/mozilla/mp4parse-rust
Firefox also does a lot of media decoding in a separate process.
Pretty much anything that has any video uses the library (incl. youtube)
Ffmpeg is a versatile toolkit used in lot of different places.
I would be shocked if any company working with user-generated video, from the likes of Zoom or TikTok or YouTube down to small apps everywhere, did not have it in their pipeline somewhere.
There are alternatives such as gstreamer and proprietary options. I can’t give names, but can confirm at least two moderately sized startups that use gstreamer in their media pipeline instead of ffmpeg (and no, they don’t use gst-libav).
One because they are a rust shop and gstreamer is slightly better supported in that realm (due to an official binding), the other because they do complex transformations with the source streams at a basal level vs high-level batch transformations/transcoding.
There are certainly features and use cases where gstreamer is better fit than ffmpeg.
My point was that it would be hard to imagine eschewing ffmpeg completely, not that there is no value in other tools or that ffmpeg is better at everything. It is so versatile and ubiquitous that it is hard not to use it somewhere.
In my experience there is almost always some non-core step in the stack where throwing in ffmpeg is simpler and easier, even if there's no proper language binding.
From a security context that wouldn't matter: as long as it touches data, security vulnerabilities are a concern.
It would be surprising, not impossible, to forgo ffmpeg completely. It would be like this site being written in Lisp: not something you would typically expect, but not impossible.
I wasn’t countering your point, I just wanted to add that there are alternatives (well, an alternative in the OSS sphere) that are viable and well used outside of ffmpeg despite its ubiquity.
Upload a video to YouTube or Vimeo. They almost certainly run it through ffmpeg.
ffmpeg is also megabytes of parsing code, whereas tar is barely a parser.
It would be surprising to find memory corruption in tar in 2025, but not in ffmpeg.
If you use a trillion dollar AI to probe open source code in ways that no hacker could, you're kind of unearthing the vulnerabilities yourself if you disclose them.
This particular bug would be easy to find without any fancy expensive tools.
That is standard practice. It is considered irresponsible to not publicly disclose any vulnerability.
The X days is a concession to the developers that the public disclosure will be delayed to give them an opportunity to address the issue.
> That is standard practice.
It's standard practice for commercially-sponsored software, and it doesn't necessarily fit volunteer maintained software. You can't have the same expectations.
Vulnerabilities should be publicly disclosed. Both closed and open source software are scrutinized by the good and the bad people; sitting on vulnerabilities isn't good.
Consumers of closed source software have a pretty reasonable expectation that the creator will fix it in a timely manner. They paid money, and (generally) the creator shouldn't put the customer in a nasty place because of errors.
Consumers of open source software should have zero expectation that someone else will fix security issues. Individuals should understand this; it's part of the deal for us using software for free. Organizations that are making money off of the work of others should have the opposite of an expectation that any vulns are fixed. If they have or should have any concern about vulnerabilities in open source software, then they need to contribute to fixing the issue somehow. Could be submitting patches, paying a contractor or vendor to submit patches, paying a maintainer to submit patches, or contributing in some other way that betters the project. The contribution they pick needs to work well with the volunteers, because some of the ones I listed would absolutely be rejected by some projects -- but not by others.
The issue is that an org like Google, with its absolute mass of technical and financial resources, went looking for security vulnerabilities in open source software with the pretense of helping. But if Google (or whoever) doesn't finish the job, then they're being a piece of shit to volunteers. The rest of the job is reviewing the vulns by hand and figuring out patches that can be accepted with absolutely minimal friction.
To your point, the beginning of the expectation should be that vulns are disclosed, since otherwise we have known insecure software. The rest of the expectation is that you don't get to pretend to do a nice thing while _knowing_ that you're dumping more work on volunteers that you profit from.
In general, wasting the time of volunteers that you're benefiting from is rude.
Specifically, organizations profiting off of volunteer work and wasting their time makes them an extractive piece of shit.
Stop being a piece of shit, Google.
why are the standards and expectation different for google vs an independent researcher? Just because they are richer, doesn't mean they should be held to a standard that isn't done for an independent researcher.
The OSS maintainer has the responsibility to either fix, or declare they won't fix - both are appropriate actions, and they are free to make this choice. The consumer of OSS should have the right to know what vulns/issues exist in the package, so that they make as informed a decision as they can (such as adding defense in depth for vulns that the maintainers chooses not to fix).
They are different because the independent researchers don't make money off the projects that they investigate.
Google makes money off ffmpeg in general but not this part of the code. They're not getting someone else to write a patch that helps them make money, because google will just disable this codec if it wasn't already disabled in their builds.
Also in general Google does investigate software they don't make money off.
> independent researchers don't make money off the projects that they investigate
But they make money off the reputational boost they earn for having their name attached to the investigation. Unless the investigation and report are anonymous and their name not attached (which could be true for some researchers), I can say that they're not doing charity.
I'm an open source maintainer and I have never been in a situation where someone filing a security issue will withhold indefinitely, nor would I ever think of asking them to withhold forever. If there are some complications maybe we can discuss a delayed disclosure but ffmpeg is just complaining about the whole concept of delayed disclosures which seems really immature to me.
As a user of ffmpeg I would definitely want to know this kind of information. The responsibility the issue filer has is not to the project, but to the public.
You disclose so that users can decide what mitigations to take. If there's a way to mitigate the issue without a fix from the developers the users deserve to know. Whether the developers have any obligation to fix the problem is up to the license of the software, the 90 day concession is to allow those developers who are obligated or just want to issue fixes to do so before details are released.
So no one should file public bug reports for open source software?
This is standard practice for Linux as well.
The entire conflict here is that norms about what's considered responsible were developed in a different context, where vulnerability reports were generated at a much lower rate and dedicated CVE-searching teams were much less common. FFmpeg says this was "AI generated bug reports on an obscure 1990s hobby codec"; if that's accurate (I have no reason to doubt it, just no time to go check), I tend to agree that it doesn't make sense to apply the standards that were developed for vulnerabilities like "malicious PNG file crashes the computer when loaded".
The codec is compiled in, enabled by default, and auto detected through file magic, so the fact that it is an obscure 1990s hobby codec does not in any way make the vulnerability less exploitable. At this point I think FFmpeg is being intentionally deceptive by constantly mentioning only the ancient obscure hobby status and not the fact that it’s on by default and autodetected. They have also rejected suggestions to turn obscure hobby codecs off by default, giving more priority to their goal of playing every media format ever than to security.
Yeah, ffmpeg's responses are really giving me a disingenuous vibe, as their argument is completely misleading (and it seems to be working on a decent number of people who don't read further into it). IMO it really damages their reputation in my eyes. If they had handled it maturely I think I would have had a bit more respect for them.
As a user this is making me wary of running it tbh.
I think the discussion on what standard practice should be does need to be had. This seems to be throwing blame at people following the current standard.
If the obscure codec is not included by default, or cannot be triggered by any means other than being explicitly asked for, then it would be reasonable to tag it Won't Fix. If it can be triggered by other means, such as auto file-type detection on a renamed file, then it doesn't matter how obscure the feature is; the exploit would affect everyone.
What is the alternative to a time limited embargo. I don't particularly like the idea of groups of people having exploits that they have known about for ages that haven't been publicly disclosed. That is the kind of information that finds itself in the wrong hands.
Of course companies should financially support the developers of the software they depend upon. Many do this for OSS in the form of having a paid employee that works on the project.
Specifically, FFMPEG seems to have a problem in that much of its resource shortage comes from alienating contributors. This isn't isolated to just this bug report.
FFMPEG autodetects what is inside a file; the extension doesn't really matter. So it's trivial to construct a video file that's labelled .mp4 but really uses the vulnerable codec and triggers its payload upon playing. (Given that ffmpeg is also used to generate thumbnails in Windows if installed, IIRC, just having a trapped video file in a directory could be dangerous.)
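The renamed-file point is the crux, and it's easy to illustrate with a toy content sniffer. The MP4 "ftyp" brand at offset 4 is real; the SMUSH "ANIM" signature is an assumption for illustration, and ffmpeg's actual probing scores many more formats than this:

```python
def sniff_container(data: bytes) -> str:
    """Toy sniffer: decide the format from leading bytes, never the filename."""
    # MP4/QuickTime files carry an "ftyp" brand at byte offset 4.
    if len(data) >= 8 and data[4:8] == b"ftyp":
        return "mp4"
    # LucasArts SMUSH animations assumed here to start with an IFF-style
    # "ANIM" chunk (illustrative magic, not ffmpeg's real probe logic).
    if data[:4] == b"ANIM":
        return "smush"
    return "unknown"

# A file named movie.mp4 whose bytes begin with "ANIM" still sniffs as
# SMUSH: the filename never enters the decision, mirroring ffmpeg's probing.
assert sniff_container(b"ANIM" + b"\x00" * 12) == "smush"
assert sniff_container(b"\x00\x00\x00\x18ftypisom" + b"\x00" * 8) == "mp4"
```

This is exactly why "obscure codec" doesn't imply "unreachable codec": any input that probes as SMUSH routes to the SMUSH decoder, whatever it was named.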
> CVE-searching teams
Silly nitpick, but you search for vulnerabilities not CVEs. CVE is something that may or may not be assigned to track a vulnerability after it has been discovered.
Most security issues probably get patched without a CVE ever being issued.
It is accurate. This is a codec that was added for archival and digital preservation purposes. It’s like adding a Unicode block for some obscure 4000 year old dead language that we have a scant half dozen examples of writing.
Here's the question:
Why is Google deliberately running an AI process to find these bugs if they're just going to dump them all on the FFmpeg team to fix?
They have the option to pay someone to fix them.
They also have the option to not spend resources finding the bugs in the first place.
If they think these are so damn important to find that it's worth devoting those resources to, then they can damn well pay for fixing them too.
Or they can shut the hell up and let FFmpeg do its thing in the way that has kept it one of the https://xkcd.com/2347/ pieces of everyone's infrastructure for over 2 decades.
Google is a significant contributor to ffmpeg by way of VP9/AV1/AV2. It's not like it's a gaping maw of open-source abuse, the company generally provides real value to the OSS ecosystem at an even lower level than ffmpeg (which is saying a lot, ffmpeg is pretty in-the-weeds already).
As to why they bother finding these bugs... it's because that's how Google does things. You don't wait for something to break or be exploited; you load your compiler up with sanitizers and go hunting for bugs.
Yeah this one is kind of trivial, but if the bug-finding infrastructure is already set up it would be even more stupid if Google just sat on it.
So to be clear, if Google doesn't include patches, you would rather they don't make bugs they find in software public so other people can fix them?
That is, you'd rather a world where Google either does know about a vulnerability and refuses to tell anyone, or just doesn't look for them at all, over a world where google looks for them and lets people know they exist, but doesn't submit their own fix for it.
Why do you want that world? Why do you want corporations to reduce the already meager amounts of work and resources they put into open source software even further?
That's not a choice. You can decide if Google files bugs like this or not, you can't force them to fix them.
Many people are already developing and fixing FFmpeg.
How many people are actively looking for bugs? Google, and then the other guys that don't share their findings, but perhaps sell them to the highest bidder. Seems like Google is doing some good work by just picking big, popular open source projects and seeing if they have bugs, even if they don't intend to fix them. And I doubt Google was actually using the Lucas Arts video format their latest findings were about.
However, in my mind the discussion whether Google should be developing FFmpeg (beyond the codec support mentioned elsewhere in the thread) or other OSS projects is completely separate from whether they should be finding bugs in them. I believe most everyone would agree they should. They are helping OSS in other ways though, e.g. https://itsfoss.gitlab.io/post/google-sponsors-1-million-to-... .
I would love to see Google contribute here, but I think that's a different issue.
Are the bug reports accurate? If so, then they are contributing just as if I found them and sent a bug report, I'd be contributing. Of course a PR that fixes the bug is much better than just a report, but reports have value, too.
The alternative is to leave it unfound, which is not a better alternative in my opinion. It's still there and potentially exploitable even when unreported.
But FFmpeg does not have the resources to fix these at the speed Google is finding them.
It's just not possible.
So Google is dedicating resources to finding these bugs
and feeding them to bad actors.
Bad actors who might, hypothetically have had the information before, but definitely do once Google publicizes them.
You are talking about an ideal situation; we are talking about a real situation that is happening in the real world right now, wherein the option of Google reports bug > FFmpeg fixes bug simply does not exist at the scale Google is doing it at.
A solution definitely ought to be found. Google putting up a few millionths of a percent of their revenue or so towards fixing the bugs they find in ffmpeg would be the ideal solution here, certainly. Yet it seems unlikely to actually occur.
I think the far more likely result of all the complaints is that Google simply disengages from ffmpeg entirely and stops doing any security work on it. I think that would be quite bad for the security of the project. If Google can trivially find bugs at a speed that overwhelms the ffmpeg developers, I would imagine bad actors can also find those same vulnerabilities. And if they know those vulnerabilities very much exist, but that Google has stopped searching for them at the ffmpeg project's demand, they would likely have extremely high motivation to go looking in a place where they can be almost certain of finding unreported/unknown vulnerabilities. The likely result would be a lot more 0-day attacks involving ffmpeg, which I do not think anyone regards as a good outcome. (Personally, I would consider "Google publishes a bunch of vulnerabilities ffmpeg hasn't fixed so that everyone knows about them" a much preferable outcome.)
Now, you might consider that possibility fine - after all, the ffmpeg developers have no obligation to work on the project, and thus to e.g. fix any vulnerabilities in it. But if that's fine, then simply ignoring the reports Google currently makes is presumably also fine, no?
If widely deployed infrastructure software is so full of vulnerabilities that its maintainers can't fix them as fast as they're found, maybe it shouldn't be widely deployed, or they shouldn't be its maintainers. Disabling codecs in the default build that haven't been used in 30 years might be a good move, for example.
Either way, users need to know about the vulnerabilities. That way, they can make an informed tradeoff between, for example, disabling the LucasArts Smush codec in their copy of ffmpeg, and being vulnerable to this hole (and probably many others like it).
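Concretely, for anyone building from source, FFmpeg's configure script can drop individual components. A sketch, assuming the Smush decoder/demuxer names below are right for your tree (check `./configure --list-decoders` and `./configure --list-demuxers` for the exact spelling):

```shell
# Rebuild FFmpeg without LucasArts Smush support.
# Component names ("sanm", "smush") are assumptions; verify against
# the output of --list-decoders / --list-demuxers first.
./configure --disable-decoder=sanm --disable-demuxer=smush
make -j"$(nproc)"
```

The same flags work for any other codec a deployment has no use for.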
> But FFmpeg does not have the resources to fix these at the speed Google is finding them.
Google submitting a patch does not address this issue. The main work for maintainers here is making the decision whether or not they want to disable this codec, whether or not Google submits a patch to do that is completely immaterial.
What makes you think the bad actors aren't already finding these bugs? From the looks of it, there isn't really any rocket science going on here. There are equally well-funded bad actors who will and do find these issues.
With Google finding these bugs, at least users can be informed. In this instance, for example, the core problem is that the codec is in *active use*: FFmpeg makes the disingenuous argument that it's old and obscure, but omits that it's still compiled in by default, meaning an attacker can craft a file, send it to you, and still trigger the vulnerable code.
A user (it could be a distro who packages ffmpeg) can use this information to turn off the codec that virtually no one uses today and make their distribution of ffmpeg more secure. Not having this information means they can't do that.
If ffmpeg doesn't have the resources to fix these bugs, at least let the public know so we can deal with it.
Also, just maybe, they wouldn't have that many vulnerabilities filed against them if the project took security more seriously to begin with? It's not a good sign for the software when you get so many valid security reports and just ask them to withhold them.
The actual real alternative is that the ffmpeg maintainers quit, just like the libxml2 maintainer did.
A lot of these core pieces of infrastructure are maintained by one to three middle-aged engineers in their free time, for nothing. Meanwhile, billion dollar companies use the software everywhere, and often give nothing back except bug reports and occasional license violations.
I mean, I love "responsible disclosure." But the only result of billion dollar corporations drowning a couple of unpaid engineers in bug reports is that the engineers will walk away and leave the code 100% unmaintained.
And yeah, part of the problem here is that C-based data parsers and codecs are almost always horrendously insecure. We could rewrite it all in Rust (and I have in fact rewritten one obscure codec in Rust) or WUFFS. But again, who's going to pay for that?
The other alternative is that the ffmpeg developers change the text on their "about" screen from "Security is a high priority and code review is always done with security in mind. Though due to the very large amounts of code touching untrusted data security issues are unavoidable and thus we provide as quick as possible updates to our last stable releases when new security issues are found." to something like "Security is a best-effort priority. Code review is always done with security in mind. Due to the very large amounts of code touching untrusted data security issues are unavoidable. We attempt to provide updates to our last stable releases when new security issues are found, but make no guarantees as to how long this may take. Priority will be given to reports including a proof-of-concept exploit and a patch that fixes the security bug."
Then point to the "PoC + Patch or GTFO" sign when reports come in. If you use a library with a "NO WARRANTY" license clause in an application where you're responsible for failures, it's on you to fix or mitigate the issues, not on the library authors.
> Why is Google deliberately running an AI process to find these bugs if they're just going to dump them all on the FFmpeg team to fix?
This is called fuzzing and it has been standard practice for over a decade. Nobody has had any problem with it until FFmpeg decided they didn’t like that AI filed a report against them and applied the (again, mostly standard at this point) disclosure deadline. FWIW, nobody would have likely cared except they went on their Twitter to complain, so now everyone has an opinion on it.
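For readers unfamiliar with the practice: a mutation fuzzer just perturbs valid inputs and watches the target for crashes. A toy sketch of the idea (the `toy_decoder` and its planted bug are invented for illustration; real pipelines use coverage-guided fuzzers such as libFuzzer/AFL, typically run at scale through OSS-Fuzz):

```python
import random

def mutate(seed: bytes, n_flips: int, rng: random.Random) -> bytes:
    """Return a copy of `seed` with `n_flips` bytes replaced at random."""
    buf = bytearray(seed)
    for _ in range(n_flips):
        buf[rng.randrange(len(buf))] = rng.randrange(256)
    return bytes(buf)

def fuzz(seed: bytes, target, iterations: int = 1000) -> list:
    """Feed mutated inputs to `target`; collect the ones that crash it."""
    rng = random.Random(0)  # seeded so the run is reproducible
    crashers = []
    for _ in range(iterations):
        mutant = mutate(seed, n_flips=3, rng=rng)
        try:
            target(mutant)
        except Exception:
            crashers.append(mutant)
    return crashers

def toy_decoder(data: bytes) -> None:
    # Planted bug: "crashes" whenever a 0xFF byte appears anywhere.
    if 0xFF in data:
        raise ValueError("decoder crash")

crashers = fuzz(b"SMUSHFILE" * 4, toy_decoder)
print(f"{len(crashers)} of 1000 mutated inputs crashed the toy decoder")
```

Coverage-guided fuzzers improve on this brute-force loop by keeping mutants that reach new code paths, which is how they dig up bugs in decoders nobody has exercised in decades.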
> They also have the option to not spend resources finding the bugs in the first place.
The Copenhagen interpretation of security bugs: if you don’t look for it, it doesn’t exist and is not a problem.
> My takeaway from the article was not that the report was a problem, but a change in approach from Google that they’d disclose publicly after X days, regardless of if the project had a chance to fix it.
That is not an accurate description? Project Zero was using a 90 day disclosure policy from the start, so for over a decade.
What changed[0] in 2025 is that they disclose earlier than 90 days that there is an issue, but not what the issue is. And actually, from [1] it does not look like that trial policy was applied to ffmpeg.
> To me its okay to “demand” from a for profit company (eg google) to fix an issue fast. Because they have ressources. But to “demand” that an oss project fix something with a certain (possibly tight) timeframe.. well I’m sure you better than me, that that’s not who volunteering works
You clearly know that no actual demands or even requests for a fix were made, hence the scare quotes. But given you know it, why call it a "demand"?
[0] https://googleprojectzero.blogspot.com/2025/07/reporting-tra..., discussed at https://news.ycombinator.com/item?id=44724287
[1] https://googleprojectzero.blogspot.com/p/reporting-transpare...
When you publicize a vulnerability you know someone doesn't have the capacity to fix according to the requested timeline, you are simultaneously increasing the visibility of the vulnerability and name-calling the maintainers. All of this increases the pressure on the maintainers, and it's fair to call that a "demand" (quotes-included). Note that we are talking about humans who will only have their motivation dwindle: it's easy to say that they should be thick-skinned and ignore issues they can't objectively fix in a timely manner, but it's demoralizing to be called out like that when everyone knows you can't do it, and you are generally doing your best.
It's similar to someone cooking a meal for you, and you go on and complain about every little thing that could have been better instead of at least saying "thank you"!
Here, Google is doing the responsible work of reporting vulnerabilities. But any company productizing ffmpeg usage (Google included) should sponsor a security team to resolve issues in high profile projects like these too.
Sure, the problem is that Google is a behemoth and their internal org structure does not cater to this scenario, but this is what the complaint is about: make your internal teams do the right thing by both reporting, but also helping fix the issue with hands-on work. Who'd argue against halving their vulnerability finding budget and using the other half to fund a security team that fixes highest priority vulnerabilities instead?
> When you publicize a vulnerability you know someone doesn't have the capacity to fix according to the requested timeline
My understanding is that the bug in question was fixed about 100 times faster than Project Zero's standard disclosure timeline. I don't know what vulnerability report your scenario is referring to, but it certainly is not this one.
> and name-calling the maintainers
Except Google did not "name-call the maintainers" or anything even remotely resembling that. You just made it up, just like GP made up the "demands". It's pretty telling that all these supposed misdeeds are just total fabrications.
"When you publicize... you are ... name-calling": you are taking partial quotes out of context, where I claimed that publicizing is effectively doing something else.
> When you publicize a vulnerability you know someone doesn't have the capacity to fix according to the requested timeline, you are simultaneously increasing the visibility of the vulnerability and name-calling the maintainers.
So how long should all bug reporters wait before filing public bugs against open source projects? What about closed source projects? Anyone who works in software knows that to ship software is to always have more things to do than time to do them in. By this logic, we should never make bug reports public until the software's maintainers (whether OSS, Apple, or Microsoft) have a fix ready. Instead of "with enough eyeballs, all bugs are shallow", the new policy going forward will apparently be "with enough blindfolds, all bugs are low priority".
It's funny you come up with that suggestion when I clearly offer a different solution: "make your internal teams do the right thing by both reporting, but also helping fix the issue with hands-on work".
It's a call not to stop reporting, but to equally invest in fixing these.
Hands on work like filing a detailed bug report with suspected line numbers, reproduction code and likely causes? Look, I get it. It would be nice if Google had filed a patch with the bug. But also not every bug report is going to get a patch with it, nor should that be the sort of expectation we have. It's hard enough getting corporations to contribute time and resources to open source projects as it is, to set an expectation that the only acceptable corporate contribution to open source is full patches for any bug reports is just going to make it that much harder to get anything out of them.
In the end, Google does submit patches and code to ffmpeg, they also buy consulting from the ffmpeg maintainers. And here they did some security testing and filed a detailed and useful bug report. But because they didn't file a patch with the bug report, we're dragging them through the mud. And for what? When another corporation looks at what Google does do, and what the response this bug report has gotten them, which do you think is the most likely lesson learned?
1) "We should invest equally in reporting and patching bugs in our open source dependencies"
2) "We should shut the hell up and shouldn't tell anyone else about bugs and vulnerabilities we discover, because even if you regularly contribute patches and money to the project, that won't be good enough. Our name and reputation will get dragged for having the audacity to file a detailed bug report without also filing a patch."
But the vulnerability exists already. You are making it sound like Google invented a problem for the project. Maybe the project should be name-called if it has hundreds of vulnerabilities? Whether it's run by volunteers or not does not matter to a user of the software.
No one is forcing anyone to do anything. Ffmpeg does not have to fix this bug, btw. If they don't have time, just let the disclosure happen.
Also, in this case, the simple fix is to turn off the codec. They just didn't want to do that because they want to have all codecs enabled. This is a conscious choice and no one is forcing them to do that. If the CVE was allowed to disclose without ffmpeg fixing the issue, at least the downstream users can turn off the codec themselves.
Just to be clear here: Googles' responsibility here is to the public (aka the users of ffmpeg), not the project.
Also, let's go back to your "cooked a meal" analogy. If I cook a meal for you, for free, that's nice. But that doesn't entitle me to be careless about hygiene and give you salmonella poisoning because I didn't wash my hands. Doing things for free doesn't absolve me of responsibility.
Publishing the vulnerability is a demand to fix it. It threatens to cause harm to the reputation of the maintainer if left unfixed.
No, publishing the vulnerability is the right thing to do for a secure world because anyone can find this stuff including nation states that weaponize it. This is a public service. Giving the dev a 90 day pre warn is a courtesy.
Expecting a reporter to fix your security vulnerabilities for you is entitlement.
If your reputation is harmed by your vulnerable software, then fix the bugs. They didn’t create the hazzard they discovered it. You created it, and acting like you’re entitled to the free labor of those that gave you the heads up is insane, and trying to extort them for their labor is even worse.
This is all true (maybe not the extortion being worse, that's hard to say), but it doesn't change the fact that publishing the CVE is a demand to fix it.
No, it is a notice to others that your software as-is is insecure in some way. The pre notice is again a courtesy if you want to fix it.
What you do with the notice as a dev is up to you, but responsible ones would fix it without throwing a tantrum.
Devs need to stop thinking of themselves as the main character and things get a lot more reasonable.
No, it is a request to fix it. How the maintainer feels about it is up to them.
CVE!=vulnerability
These two terms are not interchangeable.
Most vulnerabilities never have CVEs issued.
The fact that details of the issue _will_ be disclosed publicly is an implicit threat. Sure it's not an explicit threat, but it's definitely an implicit threat. So the demand, too, is implicit: fix this before we disclose publicly, or else your vulnerability will be public knowledge.
You should not be threatened by the fact that your software has security holes in it being made public knowledge. If you are, then your goals are fundamentally misaligned with making secure software.
I don't think that you understand the point of the delayed public disclosure. If it wasn't a threat, then there'd be no need to delay -- it would be publicly disclosed immediately.
Nobody is demanding anything. Google is just disclosing issues.
This opens up transparency of ffmpeg’s security posture, giving others the chance to fix it themselves, isolate where it’s run or build on entirely new foundations.
All this assuming the reports are in fact pointing to true security issues. Not talking about AI-slop reports.
From TFA:
> The latest episode was sparked after a Google AI agent found an especially obscure bug in FFmpeg. How obscure? This “medium impact issue in ffmpeg,” which the FFmpeg developers did patch, is “an issue with decoding LucasArts Smush codec, specifically the first 10-20 frames of Rebel Assault 2, a game from 1995.”
This doesn't feel like a medium-severity bug, and I think "Perhaps reconsider the severity" is a polite reading. I get that it's a bug either way, but this leaves me with a vague feeling of the ffmpeg maintainer's time being abused.
On the other hand, if the bug doesn't get filed, it doesn't get fixed. Sure Google could spend some resources on fixing it themselves, but even if they did would we not likely see a complaint about google flooding the maintainers with PR requests for obscure 30 year old codec bugs? And isn't a PR even more of a demand on the maintainer's time because now there's actual code that needs to be reviewed, tests that need to be run and another person waiting for a response on the other end?
"Given enough eyeballs, every bug is shallow" right? Well, Google just contributed some eyeballs, and now a bug has been made shallow. So what's the actual problem here? If some retro game enthusiast had filed the same bug report, would that be "abusing" the maintainers' time? I would think not, but then we're saying that a bug report can be "abusive" simply by virtue of who submits it. And I'm really not sure "don't assign employees to research bugs in your open source dependencies, and if you do, certainly don't submit bug reports on what you find, because that's abusive" is the message we want to be sending to corporations that are using these projects.
The vulnerability in question is being severely underestimated. There are many other comments in this thread going into detail. UAF = RCE.
Use-after-free bugs (such as the vulnerability in question, https://issuetracker.google.com/issues/440183164) usually can be exploited to result in remote code execution, but not always. It wouldn't be prudent to bet that this case is one of the exceptions, of course.
I think it’s exceedingly reasonable for a maintainer to dispute the severity of a vulnerability, and to ultimately decide the severity.
Maintainers rarely understand or agree with the severity of a bug until an exploit beats them over the head publicly in a way they are unable to sweep under the rug.
On the other hand, reporters giving a CVE a 10 for a bug in an obscure configuration option that is disabled by default in most deployments is a bit over the top. I've seen security issues reported as world-ending, sitting there for years, without anyone being able to produce an exploit PoC.
Yes, I think a defining aspect of vulnerability disclosure is how perverted the incentives structure is for all parties, including maintainers.
Whether the codec is from 1995 or 2025 does not matter. What matters is that the codec is compiled in and working by default in ffmpeg, as they intend to bundle all codecs for the user. You can just craft a file, send it to a user pretending it's a regular mp4 file, and trigger the bug. It literally wouldn't matter if the codec was this LucasArts one or HEVC. An attacker wouldn't care whether they walk in the front door or through a random broken window in the back.
If it causes a crash, that's denial of service, so medium would be appropriate. But it's true that medium CVEs aren't that bad in most situations.
This bug can most likely lead to RCE, proving that it can’t is generally a very difficult problem.
There’s absolutely no reason to assume that it does not lead to RCE, and certainly no reason whatsoever to invest significant time to prove that one way or the other unless you make a living selling exploits.
If you need this kind of security, build ffmpeg with only decoders you find acceptable
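One way to do that is an allowlist build: disable everything, then enable only the handful of components actually needed. A sketch (the component names are illustrative; enumerate the real ones with `./configure --list-decoders`, `--list-demuxers`, and so on):

```shell
# Allowlist build: nothing is compiled in unless explicitly enabled.
# Component names below are examples, not a recommendation.
./configure --disable-everything \
  --enable-protocol=file \
  --enable-demuxer=mov --enable-demuxer=matroska \
  --enable-parser=h264 --enable-parser=aac \
  --enable-decoder=h264 --enable-decoder=aac
make -j"$(nproc)"
```

This shrinks the attack surface from hundreds of decoders to the few a deployment actually exercises.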
That quote felt pretty disingenuous. OK, so the proof of concept was found in some minor asset of an old video game. But is it an exploitable vulnerability? If so, you will quickly find it on modern-day scummy advertising networks. Will it still be "medium severity"? Not clear either way to me, from the quote.
> But appearance is operative: a security issue is something that I (as the maintainer) would need to fix regardless of who reports it
I think this is the heart of the issue and it boils off all of the unimportant details.
If it's a real, serious issue, you want to know about it and you want to fix it. Regardless of who reports it.
If it's a real, but unimportant issue, you probably at least want to track it, but aren't worried about disclosure. Regardless of who reports it.
If it's invalid, or AI slop, you probably just want to close/ignore it. Regardless of who reports it.
It seems entirely irrelevant who is reporting these issues. As a software project, ultimately you make the judgment call about what bugs you fix and what ones you don't.
But if it's a real, serious issue without an easy resolution, who is the burden on? It's not that the maintainers wouldn't fix bugs if they easily could. FFmpeg is provided "as is"[0], so everyone should be responsible for their side of things. It's not like the maintainers dumped their software on every computer and forced people to use it. Google should be responsible for their own security. I'm not adamant that Google should share the patch with others, but it would hardly be an imposition to Google if they did. And yes, I really do intend that you could replace Google with any party, big or small, commercial or noncommercial. It's painful, but no one has any inherent obligations to provide others with software in most circumstances.
[0] More or less. It seems the actual language is shied from. Is there a meaningful difference?
But if no bug report is filed, then only google gets the ability to "be responsible for their own security", everyone else either has to independently discover and then patch the bug themselves, or wait until upstream discovers the bug.
In no reasonable reading of the situation can I see how anything Google has done here has made things worse:
1) Before hand, the bug existed, but was either known by no one, or known only by people exploiting it. The maintainers weren't actively looking at or for this particular bug and so it may have continue to go undiscovered for another 20 years.
2) Then Google was the only one that knew about it (modulo exploiters) and were the only people that could take any steps to protect themselves. The maintainers still don't know so everyone else would remain unprotected until they discover it independently.
3) Now everyone knows about the issue, and are now informed to take whatever actions they deem appropriate to protect themselves. The maintainers know and can choose (or not) to patch the issue, remove the codec or any number of other steps including deciding it's too low priority in their list of todos and advising concerned people to disable/compile it out if they are worried.
#3 is objectively the better situation for everyone except people who would exploit the issue. Would it be even better if Google made a patch and submitted that too? Sure it would. But that doesn't make what they have done worthless or harmful. And more than that, there's nothing that says they can't or won't do that. Submitting a bug report and submitting a fix don't need to happen at the same time.
It's hard enough convincing corporations to spend any resources at all on contributing to upstream. Dragging them through the mud for not submitting patches in addition to any bug reports they file is in my estimation less likely to get you more patches, and more likely to just get you less resources spent on looking for bugs in the first place.
I wasn't really thinking about the disclosure part, although I probably should have. I was focusing on the patching side of things. I think you're correct that disclosure is good, but in that case, I think it increases the burden of those with resources to collaborate to produce a patch.
Well, it's open source and built by volunteers, so nobody is obligated to fix it. If FFmpeg volunteers don't want to fix it or don't have the time/bandwidth to fix it, then they won't fix it. Like any other bug or CVE in any other open source project. The burden doesn't necessarily need to be on anyone.
They aren't obligated to fix CVEs until one is exploited; then, suddenly, they very much are obligated to fix it, and their image as FLOSS maintainers and as a project is very much tarnished.
If they are unable to fix CVEs in a timely manner, then it is very reasonable for people to judge them (accurately!) as being unable to fix CVEs in a timely manner. Maybe some people might even decide to use other projects or chip in to help out! However, it is dishonest to hide reports and pretend like bugs are being fixed on time when they are not.
I would like them to publicly state that there are not enough hours in their day to fix this, therefore it will have to wait until they get to it.
Please don’t use “CVE” as a stand-in for “vulnerability”, you know much better than this :)
Most vulnerabilities never get CVEs even when they’re patched.
I don't think anyone can force them to fix cve. Software is provided as-is. Can't be more straightforward as that.
So what is Google gonna do if security fixes don't happen in time and the project takes a "reputational hit"? Fork it and maintain it themselves? Why not send in patches instead?
Maintaining a reputation might be enough reward for you, but not everyone is happy to work for free with a billion-dollar corporation breathing down their necks. It's puzzling to me why people keep defending their free lunch.
If ffmpeg maintainers cannot keep up, downstream customers should know so they can help.
FFmpeg is developed almost entirely by volunteers. We have no "customers".
There are people who use and depend on ffmpeg. Maintainers seem to go out of their way to solve issues these folks face. If you don't care, then ignore the bug reports and force them to solve their own problems by contributing.
These people are not customers, though. The maintainers do their best, but overall the project seems understaffed, so actual customers (for example Google, which it seems occasionally chips in) get priority.
Then you and your security friends will create lots of FUD about FFmpeg being "insecure" with lots of red and the word "critical" everywhere.
Why complain about pressure from customers then?
If you've ever read about codependency, "need" is a relative term.
Codependency is when someone accepts too much responsibility, in particular responsibility for someone else or for things outside their control.
The answer is to have a "healthy neutrality".
OTOH they could disclose security issues AND send patches to close them.
The issue at hand is that Google has a policy of making the security issue public regardless of whether a fix has been produced.
Typically disclosures happen after a fix exists.
This isn’t true at all in my experience: disclosures happen on a timeline (60 to 90 days is common), with extensions provided as a courtesy based on remediation complexity and other case-by-case considerations. I’ve been party to plenty of advisories that went public without a fix because the upstream wasn’t interested in providing one.
For OSS projects or commercial ones? I feel it's not the same when one has trillion in market cap and the other has a few unpaid maintainers.
The norm is the same for both. Perhaps there’s an argument that it should be longer for OSS maintainers, but OSS maintainers also have different levers at their disposal: they can just say “no, I don’t care” because nobody’s paying them. A company can’t do that, at least not without a financial hit.
To my original comment, the underlying problem here IMO is wanting to have it both ways: you can adhere to common notions of security for reputational reasons, or you can exercise your right as a maintainer to say “I don’t care,” but you can’t do both.
I wonder if language plays a large role in the burden imposed on the maintainers.
Sure, but this is about Google funding FFmpeg not providing bug fixes.
True - if we're talking about actual security bugs, not the "CVE slop"
P.S. I'm an open source maintainer myself, and I used to think, "oh, OSS developers should just stop whining and fix stuff." Fast forward a few years, and now I'm buried under false-positive "reports" and overwhelmed by non-coding work (deleting issue spam, triage, etc.)
P.P.S. What's worse, when your library is a security component the pressure’s even higher - one misplaced loc could break thousands of apps (we literally have a million downloads at nuget [1] )
Please speak openly about that on your dev page. Manage expectations.
I feel this comment is far too shallow a take. I would expect that you know better than most of HN exactly how much of a reputation security has as a cost center. Google uses ffmpeg internally; how many millions would they have to spend if they were required not only to create, but to maintain, ffmpeg themselves? How significant would that cost be at Google's scale?
I don't agree that the following framing is accurate, but I can mention it because you've already said the important part (that this issue exists, and merely knowing about it doesn't create required work). By announcing it and registering a CVE, Google is starting the clock. By some metrics it was already running, but the reputational risk clearly was not. This does change priorities and requires an urgent context switch; neither is a free action, especially not within FOSS.
To me, as someone who believes that everyone, individuals and groups alike, has a responsibility to contribute fairly, Google's behavior gives the appearance of weaponizing their cost center externally: this is something Google could easily fix, but instead they shirked that responsibility onto unfunded volunteers.
To be clear, I think Google (Apple, Microsoft, etc.) can and should fund more of the OSS they depend on. But this doesn’t change the fact that vulnerability reports don’t create work per se, they just reveal work that the project can choose to act on or not.
Hopefully, until that changes, more people with influence will keep saying it, and always say it until it stops being true, and important.
So thank you for saying the important thing too! :)
I see you didn't read the article.
The problem isn't Google reporting vulnerabilities. It's Google using AI to find obscure bugs that affect 2 people on the planet, then making a CVE out of it, without putting any effort into fixing it themselves or funding the project. What are the ffmpeg maintainers supposed to do about this? It's a complete waste of everybody's time.
> The latest episode was sparked after a Google AI agent found an especially obscure bug in FFmpeg. How obscure? This “medium impact issue in ffmpeg,” which the FFmpeg developers did patch, is “an issue with decoding LucasArts Smush codec, specifically the first 10-20 frames of Rebel Assault 2, a game from 1995.”
I don't think that's an accurate description of the full scope of the problem. The codec itself is mostly unused, but the code path can be triggered through ffmpeg's automatic format probing, so a maliciously crafted payload (e.g. any run of ffmpeg that touches user input without this codec disabled) could possibly be exploited.
> that affect 2 people on the planet
Wrong. The original files only affect 2 people. A malicious file could be anywhere.
Do you remember when certain sequences of letters could crash iphones? The solution was not "only two people are likely to ever type that, minimum priority". Because people started spreading it on purpose.
Mark it low, and estimate when it can be fixed (3-4 months from now). Advise Google of it. Google does not need to disclose this bug that fast. If I do something on the side or as a hobby and big corp comes by to tell me to hurry up, I feel inclined to say no thanks.
Close the bug, if they don’t care. They don’t want to do that because then people will yell at them for not caring.
Fully on the FFmpeg team's side here. Many companies' approach to FOSS is to engage only when it looks good for their marketing karma, and leech otherwise.
Most of them would just have pirated in the old days, and most FOSS licences give them a clear conscience to behave as they always have.
Google is, at no cost to FFMPEG:
1) dedicating compute resources to continuously fuzzing the entire project
2) dedicating engineering resources to validating the results and creating accurate and well-informed bug reports (in this case, a seriously underestimated security issue)
3) doing so even for codecs that Google likely does not internally use or compile, purely for the greater good of FFMPEG's user base
Needless to say, while I agree Google has a penny to spare to fund FFMPEG, and should (although they already contribute), I do not agree with funding this maintainer.
Google is:
- choosing to do this of their own volition
- effectively just using their resources to throw bug reports over the wall unprompted.
- benefiting from the bugs getting fixed, but not contributing to them.
> - benefiting from the bugs getting fixed, but not contributing to them.
I would be very surprised if Google builds this codec when they build ffmpeg. If you run a/v codecs (like ffmpeg) in bulk, the first thing to do is sandbox the hell out of it. The second thing you do is strictly limit the containers and codecs you'll decode. Not very many people need to decode movies from old LucasArts games, for video codecs, you probably only want mpeg 1-4, h.26x, vp8, vp9, av1. And you'll want to have fuzzed those decoders as best you can too.
Nobody should be surprised that there's a security problem in this ancient decoder. Many of the eclectic codecs were written to mimic how the decoders that shipped with content were written, and most of those codecs were written assuming they were decoding a known good file, because why wouldn't they be. There's no shame, that's just how it is... there's too much to proactively investigate, so someone doing fuzzing and writing excellent reports that include diagnosis, specific location of the errors, and a way to reproduce are providing a valuable contribution.
Could they contribute more? Sure. But even if they don't, they've contributed something of value. If the maintainers can't or don't want to address it, that'd be reasonable too.
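The codec whitelisting suggested above is normally done at ffmpeg's configure step rather than at runtime. A minimal sketch, assuming a build that only ever needs a handful of modern formats (the exact decoder/demuxer set is illustrative; check `./configure --list-decoders` and `--list-demuxers` in your source tree for the names available):

```shell
# From an ffmpeg source checkout: start from nothing, then opt in to
# only the demuxers, decoders, and protocols you actually need.
# The obscure LucasArts-era decoders are simply never compiled in.
./configure \
  --disable-everything \
  --enable-demuxer=mov --enable-demuxer=matroska \
  --enable-decoder=h264 --enable-decoder=hevc \
  --enable-decoder=vp8 --enable-decoder=vp9 --enable-decoder=av1 \
  --enable-protocol=file
make -j"$(nproc)"
```

This is a build-time allowlist, so a maliciously crafted file in a disabled format fails probing outright instead of reaching vulnerable decode paths.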
FFmpeg and a thousand fixes:
https://j00ru.vexillium.org/2014/01/ffmpeg-and-the-tale-of-a...
"While reading about the 4xm demuxer vulnerability, we thought that we could help FFmpeg eliminate many potential low-hanging problems from the code by making use of the Google fleet and fuzzing infrastructure we already had in place"
Google is a contributor to FFMPEG.
FFMPEG, at no cost to Google, provided a core piece of their infrastructure for multiple multi-billion dollar product lines.
"At no cost to Google" seems difficult to substantiate, given that multiple sources indicate that Google is sponsoring FFmpeg both with engineering resources (for codec development) and cold hard cash (delivered to the FFmpeg core team via their consulting outfit[1]).
This is excellent, to be clear. But it's not compatible with the yarn currently being spun of a purely extractive relationship.
[1]: https://fflabs.eu
Yes, according to the license selected by ffmpeg. And Google, according to this license selected by ffmpeg, paid them nothing. And then they did some additional work, beneficial to ffmpeg.
And this is why Google contributes back.
They could, but there is really no requirement on them to do so. The security flaw was discovered by Google, but it was not created by them.
Equally there is no requirement on ffmpeg to fix these CVEs nor any other.
And, of course, there is no requirement on end-users to run software from projects which do not consider untrusted-input-validation bugs to be high priority.
> They could, but there is really no requirement on them to do so.
I see this sort of sentiment daily. The sentiment that only what is strictly legal or required is what matters.
Sometimes, you know, you have to recognise that there are social norms and being a good person matters and has intrinsic value. A society only governed by what the written law of the land explicitly states is a dystopia worse than hell.
What's "strictly legal or required" of Google here is absolutely nothing. They didn't have to do any auditing or bug hunting. They certainly didn't have to validate or create a proper bug report, and there's no requirement whatsoever that they tell anyone about it at all. They could have found the bug, found it was being actively exploited, made their own internal patch and sat quietly by while other people remained vulnerable. All of that is well within what is "strictly legal or required".
Google did more than what is "strictly legal or required", and what they did was submit a good and valid bug report. But for some reason we're mad because they didn't do even more. Why? The world is objectively a better place for having this bug report, at least now people know there's something to address.
> Google did more than what is "strictly legal or required", and what they did was submit a good and valid bug report. But for some reason we're mad because they didn't do even more. Why?
The Copenhagen Interpretation of Ethics is annoyingly prevalent (https://forum.effectivealtruism.org/posts/QXpxioWSQcNuNnNTy/...)
"I noticed your window was broken, so I took the liberty of helping you, working for free, by posting a sign that says UNLOCKED WINDOW HERE with exact details on how it was broken. I did lots of gratis work for you which you do not need to do yourself now. The world is safer now. Why are you not grateful?"
You're correct, but it's the social norms -- or at least, the norms as I perceive them -- that I am talking about here.
If you find yourself with potentially serious security bugs in your repo, then the social norm should be for you to take ownership of that because, well, it's your repo.
The socially unacceptable activity here should be treating security issues as an irritation, or a problem outside your control. If you're a maintainer, and you find yourself overwhelmed by genuine CVE reports, then it might be worth reflecting on the root cause of that. What ffmpeg did here was to shoot the messenger, which is non-normative.
Justice is more than just following laws.
> And, of course, there is no requirement on end-users to run software from projects which do not consider untrusted-input-validation bugs to be high priority.
What's this even saying?
Then they're free to fork it and never use the upstream again.
Where do you draw the line? Do you want Google to just not inspect any projects that it can't fully commit to maintaining?
Providing a real CVE is a contribution, not a burden. The ffmpeg folks can ignore it, since by all indications it's pretty minor.
Personally, I want the $3.5 Trillion company to do more. So the line should be somewhere else.
So you don't have a line, you just want to move the goalposts and keep moving them?
It is my understanding that the commenters in FFMPEG's favor believe that Google is doing a disservice by finding these security vulnerabilities, as they require volunteer burden to patch, and that they should either:
1) allow the vulnerabilities to remain undiscovered & unpatched zero-days (stop submitting "slop" CVEs.)
2) supply the patches (which i'm sure the goalpost will move to the maintainers being upset that they have to merge them.)
3) fund the project (including the maintainers who clearly misunderstand the severity of the vulnerabilities and describe them as "slop") (no thank you.)
This entire thread defies logic.
Yep, that's clearly what I was saying. I want to just keep moving the goalposts (which I didn't even know I had set or moved in the first place) again and again.
Or I just want the $3.5 trillion company to also provide the patches to OSS libraries/programs/etc that their projects with hundreds of millions or billions in funding happen to find.
Crazy, I know.
location of goalposts scales with market cap
> Providing a real CVE is a contribution, not a burden.
Isn't a real CVE (like any bug report) both a contribution and a burden?
What is the mission of Project Zero? Is it to build a vulnerability database, or is it to fix vulnerabilities?
If it's to fix vulnerabilities, it seems within reason to expect a patch. If the reason Google isn't sending a patch is because they truly think the maintainers can fix it better, then that seems fair. But if Google isn't sending a patch because fixing vulns "doesn't scale" then that's some pretty weak sauce.
Maybe part of the solution is creating a separate low priority queue for bug reports from groups that could fix it but chose not to.
It's neither. Wikipedia says:
> After finding a number of flaws in software used by many end-users while researching other problems, such as the critical "Heartbleed" vulnerability, Google decided to form a full-time team dedicated to finding such vulnerabilities, not only in Google software but any software used by its users.
It did that but it did not decide to form a team dedicated to fixing issues in software that it uses? That's the misallocation of funds that's at play here.
The ideal outcome is that Project Zero sends its discoveries off to a team who triage and develop patches for the significant vulnerabilities, and then the communication with the project is a much more helpful one.
Project Zero is an offensive security team. Its job is to find vulnerabilities.
In their own words:
> Our mission is to make the discovery and exploitation of security vulnerabilities more difficult, and to significantly improve the safety and security of the Internet for everyone.
> We perform vulnerability research on popular software like mobile operating systems, web browsers, and open source libraries. We use the results from this research to patch serious security vulnerabilities, to improve our understanding of how exploit-based attacks work, and to drive long-term structural improvements to security.
If you are deliberately shipping insecure software, you should stop doing that. In ffmpeg's case, that means either patching the bug, or disabling the codec. They refused to do the latter because they were proud of being able to support an obscure codec. That puts the onus on them to fix the bug in it.
I can tell you with 100% certainty that there are undiscovered vulnerabilities in the Linux kernel right now. Does that mean they should stop shipping?
I do think that contributing fuzzing and quality bug reports can be beneficial to a project, but it's just human nature that when someone says "you go ahead and do the work, I'll stand here and criticize", people get angry.
Rather than going off and digging up ten time bombs which all start counting down together, how about digging up one and defusing it? Or even just contributing a bit of funding towards the team of people working for free to defuse them?
If Google really wants to improve the software quality of the open source ecosystem, the best thing they could do is solve the funding problem. Not a lot of people set out to intentionally write insecure code. The only case that immediately comes to mind is the xz backdoor attempt, which again had a root cause of too few maintainers. I think figuring out a way to get constructive resources to these projects would be a much more impressive way to contribute.
This is a company that takes a lot of pride in being the absolute best of the best. Maybe what they're doing can be justified in some way, but I see why maintainers are feeling bullied. Is Google really being excellent here?
Nah, ffmpeg volunteers dont owe you or Google anything. They are giving you free access to their open project.
The ffmpeg authors aren't "shipping" anything; they're giving away something they make as a hobby with an explicit disclaimer of any kind of fitness for purpose. If someone needs something else, they can pay an engineer to make it for them.
To build on that - if it "doesn't scale" for one of the wealthiest companies in the world, it certainly doesn't scale for a volunteer project...
> Providing a real CVE is a contribution, not a burden. The ffmpeg folks can ignore it, since by all indications it's pretty minor.
Re-read the article. There are CVEs and then there are CVEs. This one is the junk kind, and they're shoving tons of those down the throats of unpaid volunteers while contributing nothing back.
What Google's effectively doing is like a food safety inspection company sourcing its corporate cafeteria from the local food bank just to save a buck, then calling the health department on a monthly basis to report any and all health violations it thinks it might have seen, while contributing nothing of help back to the food bank.
I have read the article. The expectation for a tool like ffmpeg is that regardless of what kind of file you put into it, it safely handles it.
This is an actual bug in submitted code. It doesn't matter that it's for some obscure codec, it's technically maintained by the ffmpeg project and is fair game for vulnerability reports.
Given that Google is also a major contributor to open-source video, this is more like a food manufacturer making sure that grocery stores are following health code when they stock their food.
Mind you, the grocery store has no obligation to listen to them in this metaphor and is free to just let the report/CVE sit for a while.
I am unsure why this untruth is being continuously parroted. It is false.
This is exploitable on a majority of systems as the codec is enabled by default. This is a CVE that is being severely underestimated.
This is why many have warned against permissive licences like MIT. Yes, it gives you the source code and gets easily incorporated into a lot of projects, but that comes at the cost of potential abuse.
Yes, GPLv3 carries a lot of ideology, but it was trying to limit excessive leeching.
Now that I have opened the floodgates of a 20-year-old debate, time to walk away.
Google Project Zero just looks for security issues in popular open source packages, regardless of whether Google itself even uses those packages.
So I'm not sure what GPLv3 really has to do with it in this case. If it was under a "no billion-dollar company allowed" non-free-but-source-available license, this same thing would have happened, provided the project was popular enough for Project Zero to have looked at it for security issues.
The difference is that Google does use it, though. They use it heavily. All of us in the video industry do - Google, Amazon, Disney, Sony, Viacom, or whoever. Companies you may have never heard of build it into their solutions that are used by big networks and other streaming services, too.
Right, Google absolutely should fund ffmpeg.
But opening security issues here is not related to that in any way. It's an obscure file format Google definitely doesn't use, the security issue is irrelevant to Google's usages of it.
The critique would make sense if Google was asking for ffmpeg to implement something that Google wanted, instead of sending a patch. But they don't actually care about this one, they aren't actually asking for them to fix it for their benefit, they are sending a notice of a security issue that only affects people who are not Google to ffmpeg.
Google absolutely does fund ffmpeg via SPI.
"and Google provided substantial donations to SPI's general fund".
The amounts don't appear to be public (and what is enough!?)
Opening a security issue is not the problem. A public disclosure so soon when there are so many machine-assisted reports for such obscure issues is the problem.
If Google wants to force a faster turnaround on the fixes, they can send the reports with patches or they can pay for prioritization.
Three months is "soon"? What do you think is reasonable?
And like so many posters in this thread, you seem to be under the impression that Google needed this fixed at some specific timeline. In reality the fix timeline, or even a total lack of a fix, makes no impact to them. They almost certainly already disable these kinds of codecs in their build. They reported this for the good of the ecosystem and the millions of users who were vulnerable.
Google does not "want this fixed", this isn't a bug report from a team using ffmpeg, it's a security analysis from a whitehat security project.
If there really are all these machine-generated, meaningless security reports, then wasting time on that sounds like a very sensible complaint; but then they should point at that junk as the problem.
But for the specific CVE discussed it really looks to me like they are doing everything right: it's a real, default-configuration exploitable issue, they reported it and ffmpeg didn't fix or ask for any extension then it gets publicly disclosed after 90 days per a standard security disclosure policy.
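The 90-day clock referenced above is simple date arithmetic. A trivial sketch (the function name and example dates are mine, not from any Google tooling; the only assumption from the thread is the standard 90-day window):

```python
from datetime import date, timedelta

# 90-day window, per the standard coordinated-disclosure policy discussed above.
DISCLOSURE_WINDOW = timedelta(days=90)

def disclosure_deadline(reported: date) -> date:
    """Date a report becomes public if unfixed and no extension is requested."""
    return reported + DISCLOSURE_WINDOW

# Illustrative dates only: a report filed 1 Aug 2025 goes public 30 Oct 2025.
print(disclosure_deadline(date(2025, 8, 1)))  # 2025-10-30
```

In practice such policies also allow grace periods and negotiated extensions, which is exactly the "ask for an extension" path mentioned in the comment above.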
What in GPLv3 or MIT mandates that Google fix this bug and submit a PR, rather than simply sending a bug report and walking away? I don't see how this applies at all.
AGPL, with no CLA that lets the owners relicense. Then we'll see if the using corporation fully believes in open source.
There's a reason Google turned into year-2000 Microsoft ("it's viral!") regarding the AGPL. They're less able to ignore the intent of the license and lock away their changes.
There is nothing whatsoever that the GPL would do to change this situation. Bringing up the permissive license debate is a non-sequitur here.
ffmpeg is already LGPL / GPLv2. How does the license choice factor into this at all?