Hacker News

sudhakaran88
FFmpeg at Meta: Media Processing at Scale engineering.fb.com

dewey 5 hours ago

> As our internal fork became increasingly outdated, we collaborated with FFmpeg developers, FFlabs, and VideoLAN to develop features in FFmpeg that allowed us to fully deprecate our internal fork and rely exclusively on the upstream version for our use cases.

Some comments seem to gloss over the fact that they did give back, and they are not the only ones benefiting from this. Could they give more? Sure, but this is exactly one of the benefits of open source: everyone benefits from changes that were upstreamed or financially supported by an entity instead of being re-implemented internally.

sergiotapia 3 hours ago

One thing people can't fault Meta for is that they contribute back to the community at large.

We're using React Native, hello!?

We're using React!

Tons of projects, we should be very grateful they give so much tbh.

kindkang2024 an hour ago

Let alone PyTorch, which greatly boosted the entire LLM wave. Thanks, Meta.

Those who benefit others deserve to be benefited in return — and if we could, we should help make them more fit.

jcul 22 minutes ago

zstd and the Folly C++ library are two that come to mind.

popalchemist 39 minutes ago

Yes, they do that, but it's not out of altruism. Gratitude may be the wrong word when Meta and Zuck have actively worked to erode people's trust in society and reality, while actualizing a technofeudalist vision of serfdom; literally a 21st-century scheme for world domination and subjugation of the poors.

j45 an hour ago

It's a positive development, but we can't minimize or ignore the conditions that precipitated it: for a long time, giving back mattered less to them than hanging onto the changes for private benefit.

Still, Meta has also put a lot out there in open source; from a differentiation perspective, it doesn't seem to go unnoticed.

vmaurin 4 hours ago

A gentle reminder that all the big tech companies would not exist without open source projects.

kccqzy 4 hours ago

Would Microsoft not exist without open source projects? Microsoft was founded in 1975, but the GPL license only appeared in 1989, with BSD licenses appearing at roughly the same time because of the Unix Wars.

Big tech companies can easily hire manpower to make proprietary versions of software, or just pay licensing fees for other proprietary software. They don’t rely on open source. Microsoft bought 86-DOS to produce MS-DOS; Microsoft paid the Unix license to produce Xenix; and when Microsoft hired former DEC people to make NT, it later paid DEC.

Instead, modern startups wouldn’t exist without open source.

golfer 2 hours ago

Indeed, open source exists despite Microsoft trying its hardest to kill it. Microsoft was (and still is) a ruthless, savage competitor. Their image has softened as of late but I'll never forget the BS they did under Bill Gates and Steve Ballmer.

ok123456 2 hours ago

Microsoft wouldn't exist without the theft of CPU time on time-shared computers.

cedws 4 hours ago

I think they would due to massive financial incentive. On the other hand, a lot more developers might actually be getting compensated for their work, instead of putting their code on the internet for free and then complaining on social media that they feel exploited.

izacus 2 hours ago

And a gentle reminder that most of open source you use was developed and is maintained by tech companies.

Take a glance at the contributor lists for your projects sometime.

dirasieb 2 hours ago

It's the exact opposite, but alright: take a look at who's funding and sending code to the Linux kernel if you want an example.

EdNutting 4 hours ago

Yes, they contributed to open source - this is a good thing.

But personally, I took issue with the tone of the blog post, characterised by this opening framing:

>For many years we had to rely on our own internally developed fork of FFmpeg to provide features that have only recently been added to FFmpeg

Could they not have upstreamed those features in the first place? They didn't integrate with upstream and now they're trying to spin this whole thing as a positive? It doesn't seem to acknowledge that they could've done better (e.g. the mantra of 'upstream early; upstream often').

The attempt to spin it ("bringing benefits to Meta, the wider industry, and people who use our products") just felt tone-deaf. The people reading this post are engineers - I don't like it when marketing fluff gets shoe-horned into a technical blog post, especially when it's trying to put lipstick on a story that is a mix of good and not so good things.

So yeah, you're right, they've contributed to OSS, which is good. But the communication of that contribution could have been different.

pdpi 4 hours ago

> e.g. the mantra of 'upstream early; upstream often'

This is the gold standard, sure. In practice, you end up maintaining a branch simply because upstream isn't merging your changes on your timescale, or because you don't quite match their design — this is completely reasonable on both sides, because they have different priorities.

dewey 4 hours ago

> Could they not have upstreamed those features in the first place?

Hard to say without being there, but in my experience it's very easy to go from "we'll just patch this thing quickly for this use case" to applying a bunch of hacks in various places, and then end up with an out-of-sync fork. As a developer I've been there many times.

It's a big step to go from patching one specific company internal use case to contributing a feature that works for every user of ffmpeg and will be accepted upstream.

EdNutting 4 hours ago

I've also had that experience of patching an OSS project internally, with the best intention of upstreaming externally-useful improvements in the future (when allowed).

However, my interpretation of the article was that they did a lot more than just patching pieces. They, perhaps, could have taken a much earlier opportunity to work with the core maintainers of ffmpeg to help define its direction and integrate improvements, rather than having to assist a significant overhaul now (years later).

Aurornis 3 hours ago

Getting something accepted upstream is orders of magnitude harder than patching it internally.

The typical situation is that you need to write a proof of concept internally and get it deployed fast. Then you can iterate on it and improve it through real world use. Once it matures you can start working on aligning with upstream, which may take a lot of effort if upstream has different ideas about how it should be designed.

I’ve also had cases where upstream decided that the feature was good but they didn’t want it. If it doesn’t overlap with what the maintainers want for the project then you can’t force them to take it.

Upstreaming is a good goal to aim toward but it can’t be a default assumption.

summerlight 3 hours ago

I guess it is much more common to maintain internal patches than to do all the merging work into upstream, especially when the feature is non-trivial. Merging upstream consumes more time, externally and internally, and many developers are working on aggressive timelines. I don't think it is fair to criticize them for not doing the ideal thing from the beginning.

p-o 2 hours ago

>For many years we had to rely on our own internally developed fork of FFmpeg to provide features that have only recently been added to FFmpeg

I really wonder if they couldn't have run the fork as an open source project. They present their options as binary when in fact they had many different options from the get-go. They could have run the fork in an open-source fashion, so that FFmpeg's developers could see what their work was and understand the features they were working on.

Keeping everything closed source and then contributing back X years later feels a little bit disingenuous.

kevincox 4 hours ago

I find it hard to be too upset; better late than never. Would it have been better to upstream shortly after they wrote the code? Yes. Would it have been better if they also made a sizable contribution to ffmpeg? Yes. But at the end of the day they did contribute back valuable code, and that is worth celebrating even if it was done purely because of the benefit to them. Let's hope that this is a small step and they do even more in the future.

EdNutting 4 hours ago

As I said, the contribution is good, it's the communication via this blog post that I don't entirely like. It could have been different. It could have acknowledged better ways of engaging with ffmpeg (that would've benefitted both Meta and ffmpeg/the community, not _just_ ffmpeg).

But corporate blog posts often go this way. I'm not mad at them or anything. Just a mild dislike ;)

kevincox 4 hours ago

Yeah, I see what you mean. It basically shows that they contributed to ffmpeg purely because it helped them, but then they wrote this post to get good will for that contribution.

EdNutting 4 hours ago

:thumbs-up:

arcfour 3 hours ago

I'm glad to know that outcomes are affected by having pure intentions. /s

zer0zzz 3 hours ago

> Could they not have upstreamed those features in the first place?

Often when you are working on a downstream code base, either you are inheriting the laziness of others who didn't upstream, or you are dealing with an upstream code base that's really opinionated and doesn't want many of your team's patches. It can vary, and I definitely empathize.

xienze 3 hours ago

> Could they not have upstreamed those features in the first place?

This can be harder than you think. Some time ago I worked at $BIGCORP, and internally we used an open source library with some modifications to allow it to fit better into our architecture. In order to get things upstreamed we had to become official contributors AND lobby to get everyone involved to see the usefulness of what we were trying to do. This took a lot of back-and-forth and rethinking the design to make it less specific to OUR needs and more generally applicable to everyone. It's a process. I'm not surprised that Facebook's initial approach would be an internal fork instead of trying to play the political games necessary to get everything upstreamed right off the bat. That's exactly the situation we were in, so I get it.

neutrinobro 5 hours ago

> At the same time, new versions of FFmpeg brought support for new codecs and file formats, and reliability improvements, all of which allowed us to ingest more diverse video content from users without disruptions.

While it is good that they worked to get their internal improvements into upstream (and this is certainly better behavior than some other unmentioned tech giants), it makes one wonder, since they are presumably running it tens of billions of times per day, whether they were involved in supporting these improvements all along. If not, why not?

tcbrah an hour ago

Tens of billions of executions per day is insane. I run ffmpeg a few thousand times daily for automated video assembly, and even at that scale the process startup overhead is noticeable. The single-decode multi-output trick alone saved me about 40% wall time when I switched to it. Can't imagine what those savings look like multiplied by ten billion.
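For readers who haven't used it, the single-decode multi-output trick mentioned above amounts to giving one ffmpeg process a single input and several encoder outputs, so the source is decoded once rather than once per rendition. A minimal sketch of building such a command line (the file names, heights, and bitrates here are hypothetical, not anyone's production settings):

```python
def single_decode_multi_output(src, renditions):
    """Build one ffmpeg command that decodes `src` once and feeds
    every rendition its own scaler and encoder in the same process,
    instead of spawning one ffmpeg (and one decode) per output."""
    cmd = ["ffmpeg", "-i", src]
    for height, bitrate, out in renditions:
        cmd += [
            "-map", "0:v",                # reuse the single decoded stream
            "-vf", f"scale=-2:{height}",  # per-output scaling
            "-c:v", "libx264", "-b:v", bitrate,
            out,
        ]
    return cmd

cmd = single_decode_multi_output(
    "in.mp4",
    [(720, "3000k", "out_720.mp4"), (360, "800k", "out_360.mp4")],
)
print(" ".join(cmd))
```

One process with two outputs means one demux and one decode; the alternative of two separate ffmpeg runs decodes the input twice, which is roughly where the wall-time savings the parent describes come from.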

HumblyTossed 26 minutes ago

I hope when Fabrice Bellard retires, he's able to do so quite comfortably. So much money has been made on the back of his software creations.

kevincox 4 hours ago

> By running all encoder instances in parallel, better parallelism can be obtained overall.

This makes a lot of sense for the live-streaming use case, and some sense for just generally transcoding a video into multiple formats. But I would love to see time-axis parallelization in ffmpeg: quickly split the input video into keyframe-aligned chunks, then encode each chunk in parallel. This would allow excellent parallelization even when only producing a single output (and without lowering video quality, as most intra-frame parallelization does).
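Something close to this can already be scripted around ffmpeg today: stream-copy the input into keyframe-aligned segments (when stream-copying, the segment muxer can only cut at keyframes), encode the chunks concurrently, then stream-copy concat the results. A rough sketch of the three stages as command builders, with illustrative (not production-tuned) settings:

```python
def split_cmd(src, pattern="chunk_%03d.ts"):
    # Stream copy, so splitting is cheap and cuts land on keyframes.
    return ["ffmpeg", "-i", src, "-c", "copy", "-f", "segment", pattern]

def encode_cmd(chunk, out):
    # Each chunk is an independent encode job, safe to run in parallel.
    return ["ffmpeg", "-i", chunk, "-c:v", "libx264", "-crf", "23", out]

def concat_cmd(list_file, out):
    # list_file contains lines like: file 'chunk_000.enc.ts'
    return ["ffmpeg", "-f", "concat", "-safe", "0", "-i", list_file,
            "-c", "copy", out]

# One encode job per chunk; in practice you would hand these to
# subprocess.run via a ProcessPoolExecutor.
jobs = [encode_cmd(f"chunk_{i:03d}.ts", f"chunk_{i:03d}.enc.ts")
        for i in range(4)]
```

The trade-off is that per-chunk rate control can't look across chunk boundaries, which is presumably part of why this hasn't landed as a built-in ffmpeg feature.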

infogulch 4 hours ago

Encoders do some interframe analysis (motion, etc) as part of encoding P/B-frames; I wonder if this work could be done once and reused for all the encodings.

Melatonic 2 hours ago

I would guess that if you have it export multiple files from the same source in one go, it already does, but I could be wrong.

infogulch an hour ago

This post is about how Meta engineers just recently submitted a patch that avoids starting a new process for every output encoding, so they can share the decoding step. Maybe that also includes sharing the motion estimation step, but I would be careful making such assumptions; FFmpeg has a lot of low-hanging optimization work that hasn't been done simply because no one has done it yet.

hparadiz 5 hours ago

> As our internal fork became increasingly outdated

Oof. That is so relatable.

Also, ffmpeg 8 is finally handling HDR and SDR color mapping for HDR content perfectly, as of my last recompile on Gentoo :)

treyd 5 hours ago

Oh this is so nice, I had huge annoyances with figuring out how to automatically copy over the metadata in the past.

randall 4 hours ago

sweet.

I've been out of the game for a bit but it's great to hear.

comrade1234 6 hours ago

Germany's sovereign tech fund has donated more to FFmpeg than Meta.

ecshafer 2 hours ago

Germany's sovereign tech fund donated a bit more than $150k. What is a year of Meta engineer time? $300k? More? If they spent a year of engineering time, Meta gave more than double Germany's sovereign tech fund. My guess is that they have a team, probably including more than just junior and mid-level software engineers, working on media encoding and upstreaming patches, so I wouldn't be surprised if they are providing at least $1M a year in terms of work.

Maxious 6 hours ago

"while the funding mentioned in the [Meta] post is appreciated, it's not enough to sustain the project" https://x.com/FFmpeg/status/2029053011314786701

dewey 5 hours ago

It wouldn't really be a good fundraising move to tell everyone that Meta took care of everything this fundraising year.

petcat 5 hours ago

Do we know how much Meta donated to ffmpeg? A quick search shows that the German STF donated €157,580.00 for 2024/2025.

BonoboIO 4 hours ago

What a joke … Meta, making billions and saving millions by not brewing their own stuff, cannot give more than some national fund?

qalmakka 6 hours ago

Isn't this like telling the world you ate a full meal by eating samples at Costco? Meta is raking in billions as we speak; the least they could do is ensure the FOSS projects they rely on are properly funded, instead of shovelling cash into bullshit datacentre developments. Otherwise we're basically guaranteed to end up with another XZ fiasco when some tired, unpaid FOSS maintainer ends up trusting a random Jia Tan in their desperation.

semiquaver 5 hours ago

This post is all about how they upstreamed their improvements!

If you get mad when a company makes good use of open source and contributes to a project’s betterment, you do not understand the point of open source, you’re just fumbling for a pitchfork.

golfer 2 hours ago

I'd say this post reads more like them beating their chest about how great their improvements are.

gruez 5 hours ago

>Isn't this like telling the world you ate a full meal by eating samples at Costco?

The analogy fails because free samples cost Costco (or whatever the vendor is) money. Raking Meta over the coals for using ffmpeg instead of paying for some proprietary alternative makes as much sense as raking every tech company over the coals for using Linux. Or maybe you'd do that too, I can't tell.

acedTrex 4 hours ago

I mean, they contributed their fixes upstream. That's the most important thing they could do here.

theultdev 5 hours ago

Meta is the sole reason PHP is still alive. Also a big reason we're not in MVC hell.

They bet on open source and they open source a lot of technology.

It's one of the best companies when it comes to open source.

I don't know how much total they donate, but I've seen tons of grants given to projects from them.

pmontra 5 hours ago

I think that WordPress is still big enough to keep PHP alive. Furthermore, the sheer number of developers who started coding web apps with PHP around the year 2000, plus or minus 5 years, is large enough to give PHP critical mass for the next 20 years.

dotancohen 4 hours ago

Is Automattic contributing back to PHP? I think that WordPress benefits because PHP is available, but does not significantly contribute to PHP development.

theultdev 5 hours ago

WordPress is keeping PHP alive now.

But PHP wouldn't be here today if it weren't for Meta and its support.

pmontra an hour ago

WordPress is from 2003 and has been very successful since the beginning. Facebook is from 2004. Both were PHP apps because the late 90s and early 2000s were the years of PHP CMSes and ecommerce platforms. Even if Facebook had not happened, PHP would still have been one of the top 5 languages of that age. PHP was popular because of web hosts and the simplicity of Apache + mod_php. It was not big in hype because it was a really bad language until about version 7, and few people would admit to liking it.

Actually, Facebook worked against WordPress and the adoption of PHP, because a number of people who could have used a WP instance to blog or to market a product started using a FB page instead. Ecommerce went from self-hosted (Magento, WooCommerce, PrestaShop) to hosted platforms, or to Amazon and also FB.

ianhawes 5 hours ago

> Meta is the sole reason PHP is still alive.

This could not be more wrong. Meta is still using PHP AFAIK but I'm not sure it's modern. They created the Hack programming language ~10 years ago but it doesn't look like it's been updated in several years. Most of the improvements they touted were included in PHP 7 years ago.

theultdev 5 hours ago

I never said they were still using it (they are in some cases)

But when the backend world was either Java or ASP, FB chose PHP and helped us other small companies out.

They eventually went Hack, the rest went Node for the most part.

But during those PHP years they gave us HHVM and many PHP improvements to get us through.

righthand 5 hours ago

Yeah we’re in React SPA hell instead. I’d rather be in MVC hell.

cheema33 3 hours ago

> Yeah we’re in React SPA hell instead. I’d rather be in MVC hell.

I am guessing the world moved to React because the developer community in general does not feel the same way.

righthand 2 hours ago

No they moved to Reactjs because it was evangelized as the only framework available. There are plenty of people who hate reactjs, don’t worry.

ecshafer 2 hours ago

As a react hater, I share DHH's opinion that React was driven by ZIRP. So many giant, slow, react apps out there that are super slow to develop with. IMO HTMX is a 10x dev time reducer over React.

theultdev 5 hours ago

That's a common take here but I'd take React any day.

Been doing this for 20 years. React/JSX is the easiest (for me)

embedding-shape 5 hours ago

Yeah, same. Not sure if everyone is as traumatized as us when it comes to dealing with 100K LOC large Backbone.js codebases though, or before that where we kept state in the DOM itself and tried to wrangle it all with jQuery.

React and JSX really did help a lot compared to how it used to be, which was pretty unmanageable already.

jamesnorden 5 hours ago

Where's the big donation?

hrmtst93837 an hour ago

If you expect a press-release-sized check, don't hold your breath. Big companies usually prefer to buy leverage instead, by upstreaming engineering time, sponsoring CI runners, donating hardware for NVENC and VideoToolbox tests, or funding maintainers rather than cutting a single headline check.

Concrete things that actually reduce risk are paying for continuous fuzzing with OSS-Fuzz on libavcodec, funding multi-arch CI that covers macOS, Windows, ARM and Nvidia GPU tests, and committing to upstream fixes instead of maintaining an internal fork. If a company does those three things you'll likely see fewer regressions, fewer security surprises, and much lower downstream maintenance cost than from a one-off bank transfer and a press release.

mghackerlady 5 hours ago

they'll get one when the openbsd maintainers become millionaires

touwer 4 hours ago

Happily for the ffmpeg machines, it's all lightweight content. Something heavier would overload them.

EdNutting 5 hours ago

Same HN post from 6 days ago: https://news.ycombinator.com/item?id=47224355

[Edit: Why is anyone downvoting me linking to the previous post of this? What possible objection could you have to this particular comment?]

WalterGR 3 hours ago

My understanding is that reposts are fine until there’s measurable discussion, as long as said reposts aren’t by the same user.

EdNutting 2 hours ago

And what in my comment said anything about there being a problem??

EdNutting 2 hours ago

This is the modern internet at work. People read a completely neutral statement that (to reword my original comment into a longer form) says "here's a link to the same post from before, because you might want to see the discussion there", assume there's some kind of complaint or problem, take offense at their own assumption, and then downvote.

It's completely the opposite of HN "assume good faith" policy. Sigh.

thiago_fm 4 hours ago

Wish they gave FFmpeg a decent chunk of money; words are cheap.

cheema33 3 hours ago

They did contribute code. For open-source projects, good code contributions may be more valuable.

BorisMelnik 4 hours ago

Meta, this is just SAD. Mark, your company would be nothing without FF. Do the right thing and write a check today.

randall 4 hours ago

This is the least informed take I've ever seen.

I worked at fb, and I'm 100% certain we sponsored VLC and OBS at the time. It would be strange if we didn't sponsor FFMPEG, but regardless (as the article says) we definitely got out of our internal fork and upstreamed a lot of the changes.

I worked on live, and everyone in the entire org worships ffmpeg.

Suckseh 4 hours ago

You make a lot of money, Meta makes A LOT of money.

It doesn't matter how much you worship ffmpeg if a company that makes billions by destroying our society gives only a little bit of a handout back.

So good for you? Bad for ffmpeg, society and the rest of the world.

tt24 3 hours ago

Meta has made more positive contributions to society and the world than every HN commenter combined, and more than most of the other FAANGs (Amazon being the exception).

JambalayaJimbo 38 minutes ago

While contributing back to ffmpeg is great, this is insanely hyperbolic lol. Do you genuinely think Instagram and Facebook are positive contributions to society?

tsumnia 2 hours ago

Damned for virtue signalling if they make posts about their contributions, damned for destroying tech when they don't. I love these kinds of articles and share them with students all the time.

BorisMelnik 4 hours ago

I never said they didn't sponsor them; it just isn't enough, not even close.

And I know the teams love ffmpeg; there are some great folks at Meta, just not a lot in the C-suite.

hn-front (c) 2024 voximity