
Meta is using the Linux scheduler designed for Valve's Steam Deck on its servers

701 points | 2 months ago | phoronix.com
Fiveplus2 months ago

Valve is practically singlehandedly dragging the Linux ecosystem forward in areas that nobody else wanted to touch.

They needed Windows games to run on Linux, so we got massive Proton/Wine advancements. They needed better display output for the Deck, and we got HDR and VRR support in Wayland. They also needed smoother frame pacing, and we got a scheduler that Zuck is now using to run data centers.

It's funny to think that Meta's server efficiency is being improved because Valve paid Igalia to make Elden Ring stutter less on a portable Linux PC. This is the best kind of open source trickle-down.

kshri242 months ago

Game development is STILL a highly underrated field. Plenty of advancements/optimizations (in both software and hardware) can be directly traced back to game development. Hopefully, with RAM prices shooting up the way they are, we go back to keeping optimizations front and center and reduce all the bloat that has accumulated industry-wide.

hinkley1 month ago

A number of my tricks are stolen from game devs and applied to boring software. Most notably, resource budgets for each task. You can’t make a whole system fast if you’re spending 20% of your reasonable execution time on one moderately useful aspect of the overall operation.
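
A minimal sketch of the budget idea in C, assuming a hypothetical do_optional_work() step and made-up budget numbers; the point is just that each stage gets an explicit allowance and is skipped or degraded when the overall budget is already running hot:

    #include <stdio.h>
    #include <time.h>

    /* Milliseconds elapsed since a starting timestamp. */
    static double elapsed_ms(const struct timespec *start)
    {
        struct timespec now;
        clock_gettime(CLOCK_MONOTONIC, &now);
        return (now.tv_sec - start->tv_sec) * 1000.0 +
               (now.tv_nsec - start->tv_nsec) / 1e6;
    }

    int main(void)
    {
        const double total_budget_ms = 16.0;  /* e.g. one 60 Hz frame, or a request deadline */
        const double polish_budget_ms = 3.0;  /* allowance for a nice-to-have step */
        struct timespec start;
        clock_gettime(CLOCK_MONOTONIC, &start);

        /* ... mandatory work happens here ... */

        /* Only run the optional step if it still fits inside the overall budget. */
        if (elapsed_ms(&start) + polish_budget_ms < total_budget_ms) {
            /* do_optional_work();  (hypothetical) */
        } else {
            printf("skipping optional work, %.2f ms already spent\n", elapsed_ms(&start));
        }
        return 0;
    }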

ksec1 month ago

I think one could even say gaming as a sector single-handedly moved most of the personal computing platform forward since the 80s and 90s. Before that it was probably military and corporate. From the DOS era, overclocking CPUs to push benchmarks, DOOM, 3D graphics APIs from 3dfx Glide to DirectX, faster HDDs for faster game load times. And for 10-15 years it was gaming that carried CUDA forward.

abustamam2 months ago

Yes please! Stop making me download 100+ GB patches!

ffsm82 months ago

The large file sizes are not because of bloat per se...

It's a technique (duplicating assets on disk to cut down seek times) which supposedly helped at one point to reduce loading times, Helldivers being the most notable example of a game removing this "optimization".

However, this is by design - specifically as an optimization. You can't really call that bloat in the parent's context of inefficient resource usage.

flohofwoe1 month ago

This was the reason in Helldivers; other games have different reasons - like uncompressed audio (which IIRC was the cause of the CoD install-size drama a couple of years back) - but the underlying reason is always the same: the dev team not caring about asset size (or, more likely, they would like to take care of it but are drowned in higher-priority tasks).

+3
thanksgiving2 months ago
+1
SkiFire131 month ago
abustamam1 month ago

Interesting, today I learned!

MarleTangible2 months ago

Over time they're going to touch things that people had been waiting on Microsoft to do for years. I don't have an example in mind at the moment, but it's a lot better to make the changes yourself than to wait for the OS or console manufacturer to take action.

asveikau2 months ago

I was at Microsoft during the Windows 8 cycle. I remember hearing about a kernel feature I found interesting. Then I found out Linux had already had it for a few years at that point.

I think the reality is that Linux is ahead on a lot of kernel stuff. More experimentation is happening.

wmf2 months ago

I was surprised to hear that Windows only just added native NVMe support, which Linux has had for many years. I wonder if Azure has been paying the SCSI emulation tax this whole time.

stackskipton2 months ago

Probably, most of stuff you see in Windows Server these days is backported from Azure improvements.

athoneycutt2 months ago

It was always wild to me that their installer was just not able to detect an NVMe drive out of the box in certain situations. I saw it a few times with customers when I was doing support for a Linux company.

+1
pantalaimon2 months ago
mycall2 months ago

Linux is behind Windows w.r.t. (hybrid) microkernel vs. monolith design, which helps with having drivers and subsystems in user mode and supporting multiple personalities (the Win32, POSIX, OS/2 and WSL subsystems). Linux can hot-patch the kernel, but replacing core components is risky, and drivers and filesystems cannot be restarted independently.

7bit2 months ago

And behind on a lot of stuff. Microsoft's ACLs are nothing short of one of the best-designed permission systems there are.

On the surface, they can be as simple as Linux's UGO/rwx stuff if you want them to be, but you can really, REALLY dive into the technology and apply super-specific permissions.

nunez2 months ago

The file permission system on Windows allows for super granular permissions, yes; administering those permissions was a massive pain, especially on Windows file servers.

torginus2 months ago

And they work on everything. You can have a mutex, a window handle or a process protected by an ACL.
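
A rough C illustration of that uniformity, using the Win32 security-descriptor APIs. The mutex name is made up, and a real program would attach a proper DACL with explicit ACEs; the NULL (allow-everyone) DACL below just keeps the sketch short:

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        SECURITY_DESCRIPTOR sd;
        SECURITY_ATTRIBUTES sa;

        /* Every securable kernel object takes the same SECURITY_ATTRIBUTES;
           a NULL DACL grants everyone full access (illustration only). */
        InitializeSecurityDescriptor(&sd, SECURITY_DESCRIPTOR_REVISION);
        SetSecurityDescriptorDacl(&sd, TRUE, NULL, FALSE);

        sa.nLength = sizeof(sa);
        sa.lpSecurityDescriptor = &sd;
        sa.bInheritHandle = FALSE;

        /* The same pattern applies to events, file mappings, processes,
           registry keys, and so on. */
        HANDLE mutex = CreateMutexA(&sa, FALSE, "Global\\ExampleMutex");
        if (!mutex) {
            printf("CreateMutexA failed: %lu\n", GetLastError());
            return 1;
        }
        CloseHandle(mutex);
        return 0;
    }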

+3
jandrese2 months ago
bbkane2 months ago

Do you have any favorite docs or blogs on these? Reading about one of the best designed permissions systems sounds like a fun way to spend an afternoon ;)

Eggpants1 month ago

And yet, it requires kernel-level anti-cheat to stop a game mod from reading and writing memory in a running process. It's a toy operating system if it can't even prevent that. It's why corporate machines are so locked down. Then there's the fact that video drivers run in ring 0 and are allowed to phone home… but hey, you can prevent Notepad++ from running, FTW.

+3
trueismywork2 months ago
+6
dabockster2 months ago
b00ty4breakfast2 months ago

when the hood is open for anyone to tinker, lots of little weirdos get to indulge their ideas. Sometimes those ideas are even good!

+1
ethbr12 months ago
IshKebab2 months ago

Yeah, and Linux is waaay behind in other areas. Windows has had a secure attention sequence (Ctrl-Alt-Del to log in) for several decades now. Linux still doesn't.

+3
roblabla2 months ago
marcodiego2 months ago

Please check the related Wikipedia article, updated to reflect the recent secure attention key work in the Linux world: https://en.wikipedia.org/wiki/Secure_attention_key

+3
dangus2 months ago
+1
ttctciyf2 months ago
fleroviumna2 months ago

[dead]

pjmlp1 month ago

And behind in anything related to kernel security, sandboxing, user space drivers, and 3D graphics drivers.

Without Proton there would be no "Linux" games.

It would be great if Valve actually continued Loki Entertainment's work.

dijit2 months ago

yeah, but you have IO Completion Ports…

io_uring is still a pale imitation :(

asveikau2 months ago

io_uring does more than IOCP. It's more like an asynchronous syscall interface that avoids the overhead of trapping into the kernel for every operation, which sidesteps some overheads IOCP cannot. I'm rusty on the details, but the NT kernel has since introduced an imitation: https://learn.microsoft.com/en-us/windows/win32/api/ioringap...
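
For a feel of the submit/complete model being described, a minimal liburing sketch in C (link with -luring; error handling is trimmed and the file path is just an example):

    #include <fcntl.h>
    #include <stdio.h>
    #include <liburing.h>

    int main(void)
    {
        struct io_uring ring;
        char buf[4096];

        /* One shared ring; submissions and completions flow through it
           rather than through one syscall per I/O operation. */
        io_uring_queue_init(8, &ring, 0);

        int fd = open("/etc/hostname", O_RDONLY);   /* example file */
        struct io_uring_sqe *sqe = io_uring_get_sqe(&ring);
        io_uring_prep_read(sqe, fd, buf, sizeof(buf), 0);
        io_uring_submit(&ring);

        struct io_uring_cqe *cqe;
        io_uring_wait_cqe(&ring, &cqe);
        printf("read %d bytes\n", cqe->res);
        io_uring_cqe_seen(&ring, cqe);

        io_uring_queue_exit(&ring);
        return 0;
    }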

+1
loeg2 months ago
+1
senderista2 months ago
6r172 months ago

Tbh I'm starting to think that I do not see Microsoft being able to keep its position in the OS market. With Steam doing all the hard work and having a great market to play with, the vast range of distributions to choose from, and, most importantly, how easy it has become to create an operating system from scratch, they have not only lost all possible appeal, they seem stuck on a really weird fetishism with their taskbar and just haven't given me any kind of reason to be excited about Windows.

Their research department rocks, however, so it's not a full bash on Microsoft at all - I just feel like they are focusing on other, way more interesting stuff.

Arainach2 months ago

Kernel improvements are interesting to geeks and data centers, but open source is fundamentally incompatible with great user experience.

Great UX requires a lot of work that is hard but not algorithmically challenging. It requires consistency and getting many stakeholders to buy in. It requires spending lots of time on things that will never be used by more than 10-20% of people.

Windows got a proper graphics compositor (DWM) in 2006 and made it mandatory in 2012. macOS had one even earlier. Linux fought against Compiz, and while Wayland feels inevitable, vocal forces still complain about and argue against it. Linux has a dozen incompatible UI toolkits.

Screen readers on Linux are a mess. High contrast is a mess. Setting font size in a way that most programs respect is a mess. Consistent keyboard shortcuts are a mess.

I could go on, but these are problems that open source is not set up to solve. These are problems that are hard, annoying, not particularly fun. People generally only solve them when they are paid to, and often only when governments or large customers pass laws requiring the work to be done and threaten to not buy your product if you don't do it. But they are crucially important things to building a great, widely adopted experience.

jraph2 months ago

Your comment gives the impression that you think open source software is only developed by unpaid hobbyists. This is not true; it's quite an outdated view. Many things are worked on by developers paid full time. It also implies that people are mostly interested in algorithmically challenging stuff, which I don't think is the case.

Accessibility does need improvement; it seems severely lacking. Although your link makes it look like it's actually not that bad - I would have expected worse.

+3
einr2 months ago
embedding-shape2 months ago

> Tbh I'm starting to think that I do not see Microsoft being able to keep its position in the OS market

It's a big space. Traditionally, Microsoft has held the multimedia, gaming, and many of the professional segments, but with Valve pushing hard into the first two and Microsoft not even giving it a half-hearted try, it might just be that corporate computers continue using Microsoft, people's home media equipment is all Valve, and hipsters (and others...) keep on using Apple.

+1
thewebguyd2 months ago
pjmlp1 month ago

Game developers still target Windows, which Valve then runs on top of Proton.

pjmlp1 month ago

First Valve has to actually start pushing for proper native Linux games; until then, Windows can keep enjoying its 70% market share, with game studios using Windows business as usual.

Also, Raspberry Pis are the only GNU/Linux devices most people can find at retail stores.

m4rtink2 months ago

Add to that all the bullshit they have been pushing on their customers lately:

* OS-level ads

* invasive AI integration

* dropping support for 40% of their installed base (Windows 10)

* forcing useless DRM/trusted-computing hardware - a TPM - as a requirement to install the new and objectively worse Windows version, with even more spying and worse performance (Windows 11)

With that I think their prospects are bleak, and I have no idea who would install anything other than SteamOS or Bazzite in the future with this kind of Microsoft behavior.

benoau2 months ago

"It just works" sleep and hibernate.

"Slide left or right" CPU and GPU underclocking.

dijit2 months ago

“it just works” sleep was working, at least on basically every laptop I had the last 10 years…

until the new s2idle stuff that Microsoft and Intel have foisted on the world (to update your laptop while sleeping… I guess?)

+1
dabockster2 months ago
QuiEgo2 months ago

Power management is a really hard problem. It's the stickiest of programming problems: a multi-threaded sequence where timing matters across threads (sometimes down to the ns). I'm convinced only devices that have hardware and software made by the same company (Apple, Android phones, Steam Deck, maybe Surface laptops) have a shot in hell at getting it perfect. The long-tail/corner cases and testing are a nightmare.

As an example, if you have a mac, run "ioreg -w0 -p IOPower" and see all the drivers that have to interact with each other to do power management.

chocochunks2 months ago

It never really worked in games, even with S3 sleep. The new connected-standby stuff created new issues, but sleeping a laptop while gaming was always a roulette wheel. SteamOS and the like actually work; maybe 1 in 100 times I've run into an issue. Windows was 50/50.

pmontra2 months ago

Sleep and hibernate only "just work" on Windows because Microsoft works with laptop and board manufacturers to make Windows play nice with all those drivers. It's inevitable that it's hit and miss on any other OS that manufacturers don't care much about. Apple does nearly everything inside their own walls; that's why it just works.

Insanity2 months ago

“It just works” sadly isn’t true across the Apple Ecosystem anymore.

Liquid Glass ruined multitasking UX on my iPad. :(

Also my macbook (m4 pro) has random freezes where finder becomes entirely unresponsive. Not sure yet why this happens but thankfully it’s pretty rare.

+5
pbh1012 months ago
ls6122 months ago

Sleep has always worked on my desktop with a random Asus board from the early 2020s with no issues aside from one Nvidia driver bug earlier this year (which was their fault not MS's). Am I just really lucky?

Krssst2 months ago

On my Framework 13 AMD : Sleep just works on Fedora. Sleep is unreliable on Windows; if my fans are all running at full speed while running a game and I close the lid to begin sleeping, it will start sleeping and eventually wake up with all fans blaring.

devnullbrain2 months ago

I don't understand this comment in this context. Both of these features work on my Steam Deck. Neither of them have worked on any Windows laptop my employers have foisted upon me.

tremon2 months ago

That requires driver support. What you're seeing is Microsoft's hardware certification forcing device vendors to care about their products. You're right that this is lacking on Linux, but it's not a slight on the kernel itself.

seba_dos12 months ago

Both of these have worked fine for the last 15 years or so on all my laptops.

packetlost2 months ago

Kernel level anti-cheat with trusted execution / signed kernels is probably a reasonable new frontier for online games, but it requires a certain level of adoption from game makers.

dabockster2 months ago

This is a part of Secure Boot, which Linux people have raged against for a long time, mostly because the main key-signing authority was Microsoft.

But here's the rub: no one else bothered to step up to be a key signer. Everyone has instead whined for 15 years and told people to disable Secure Boot and the loads of trusted-computing tech that depends on it, instead of actually building and running the necessary infra for everyone to have a Secure Boot authority outside of big tech. Not even Red Hat/IBM, even though they have the infra to do it.

Secure Boot and signed kernels are proven tech. But the Linux world absolutely needs to pull their heads out of their butts on this.

+2
ndriscoll2 months ago
codeflo2 months ago

There are plenty of locked down computers in my life already. I don't need or want another system that only runs crap signed by someone, and it doesn't really matter whether that someone is Microsoft or Redhat. A computer is truly "general purpose" only if it will run exactly the executable code I choose to place there, and Secure Boot is designed to prevent that.

mhitza2 months ago

I don't know about the ecosystem overall, but Fedora has been working for me with Secure Boot enabled for a long time.

Having the option to disable Secure Boot was probably due to backlash at the time and antitrust concerns.

Aside from providing protection against "evil maid" attacks (right?), Secure Boot is in the interest of software companies. Just like platform "integrity" checks.

packetlost2 months ago

I'm pro Secure Boot FWIW and have had it working on my Linux systems for a while.

esseph2 months ago

I'm not giving a game ownership of my kernel; that's fucking insane. That will lead to nothing but other companies using the same tech to enforce other things, like what software you can run on your own stuff.

No thanks.

duped2 months ago

> I don't have an example in mind at the moment

I do: MIDI 2.0. It's not that they're not doing it, just that they're doing it at a glacial pace compared to everyone else. They have reasons for this (a complete rewrite of the Windows media services APIs and internals), but it's taken years and delays to do something that shipped on Linux over two years ago and on Apple more like five (although there were some protocol changes over that time).

mstank2 months ago

Valve... please do GitHub Actions next

xmprt2 months ago

I wonder what Valve uses for source control (no pun intended) internally.

harrisoned2 months ago
shantara2 months ago

I've heard from several people who game on Windows that the Gamescope side panel, with OS-wide tweakables for overlays, performance, power, frame limiters and scaling, is something they miss after playing on a Steam Deck. There are separate utilities for each on Windows, but nothing as simple and accessible as Gamescope.

amlib2 months ago

A good one is shader pre-caching with Fossilize; Microsoft is only now getting around to it, and their version still pales in comparison to Valve's solution for Linux.

guidopallemans2 months ago

Surely a gaming handheld counts

theLiminator2 months ago

Imagine if Windows moved to the Linux kernel and then used Wine/Proton to serve their own userspace.

m4rtink2 months ago

It kinda looked like this was the future around the time they introduced WSL, released .NET for Linux and started contributing to the Linux kernel - all the while making bank with Azure, mostly thanks to running Linux workloads.

But then they decided it was better to show ads at the OS level, rewrite the OS UI as a web app, force hardware DRM for their new OS version (the TPM requirement), and automatically capture the contents of your screen to feed to AI.

layer82 months ago

The Linux kernel and Windows userspace are not very well matched on a fundamental level. I’m not sure we should be looking forward to that, other than for running games and other insular apps.

+1
theLiminator2 months ago
delusional2 months ago

> Valve is practically singlehandedly dragging the Linux ecosystem forward in areas that nobody else wanted to touch.

I'm loving what Valve has been doing, and their willingness to shove money into projects that have long been underinvested in, BUT: please don't forget all the volunteers who developed these systems for years before Valve decided to step up. All of this is only possible because a ton of different people spent decades slowly building a project that for most of its lifetime seemed like a dead-end idea.

Wine as a software package is nothing short of miraculous. It has been monumentally expensive to build, but is provided to everyone to freely use as they wish.

Nobody, and I do mean NOBODY, would have funded a project that spent 20 years struggling to run Office and Photoshop. Valve took it across the finish line into a commercially useful project, but they could not have done that without the decade-plus of work before that.

aeyes2 months ago

Long before Valve there was CrossOver, which sold a polished version of Wine that made a lot of Windows-only enterprise software work on Linux.

I'm sure there have been more commercial contributors to Wine besides Valve and CodeWeavers.

mixmastamyk2 months ago

Like giving the Han Solo award to the Rebel Fleet. ;-)

cosmic_cheese2 months ago

One would've expected one of the many desktop-oriented distros (some with considerable funding, even) to have tackled these things already, but somehow desktop Linux has been stuck in the awkward midway of "it technically works, just learn to live with the rough edges" until finally Valve took initiative. Go figure.

johnny222 months ago

Please don't erase all the groundwork they've done over the years to make it possible for these later enhancements to happen. It wasn't like they were twiddling their thumbs this whole time!

cosmic_cheese2 months ago

That's not my intention at all. It's just frustrating how little of it translates to impact that's readily felt by end users, including those of us without technical inclination.

+1
johnny221 month ago
rapind2 months ago

It's not just Valve taking the initiative. It's mostly because Windows has become increasingly hostile and just plain horrible over the years. They'll be writing textbooks on how badly Microsoft screwed up their operating system.

pwthornton2 months ago

I'm a Mac user, but I recently played around with a beefy laptop at work to see how games ran on it, and I was shocked at how bad and user-hostile Windows 11 is. I had previously used Windows 98, 2000, XP, Vista, and 7, but 11 is just so janky. It's festooned with Copilot/AI jank, and seems to be filled with ads and spyware.

If I didn't know better, I'd assume Windows was a free, ad-supported product. If I ever pick up a dedicated PC for gaming, it's going to be a Steam Machine and/or Steam Deck. Microsoft is basically lighting Xbox and Windows on fire to chase AI clanker slop.

+1
defrost2 months ago
WackyFighter2 months ago

That isn't it. Generally, whatever the majority of users tend to use is where the majority of focus goes.

The vast majority of people using Linux on the desktop before 2015 were either hobbyists, developers, or people who didn't want to run proprietary software for whatever reason.

These people generally didn't care about a lot of the fancy tech mentioned, so this stuff didn't get fixed.

cosmic_cheese2 months ago

There’s some truth to that, but a lot of (maybe most) Linux desktop users are on laptops and yet there are many aspects of the Linux laptop experience that skew poor.

I think the bigger problem is that commercial use cases suck much of the air out of the room, leaving little for end user desktop use cases.

WackyFighter1 month ago

Which laptops though? Most people end up getting either ThinkPads, old Dell business laptops, or something like a Framework.

Most people learn that using some craptop will leave you with stuff on the laptop not working, e.g. volume buttons, Wi-Fi buttons, etc.

All of these just work with Linux.

iknowstuff2 months ago

There's far more of that, starting with the lack of a stable ABI in GNU/Linux distros. Eventually Valve or Google (with Android) is gonna swoop in with a user-friendly OS that devs can actually target as a single platform.

thewebguyd2 months ago

The enterprise distros do provide that, somewhat.

That's why RHEL, for example, has such a long support lifecycle. It's so you can develop software targeting RHEL specifically and know you have a stable environment for 10+ years. RHEL sells a stable (as in unchanging) OS to target for X number of years.

nineteen9992 months ago

And if you want to follow the RHEL-shaped bleeding edge, you can develop on the latest Fedora. I'll often do this: develop/package on Fedora and then build on RHEL as well.

cosmic_cheese2 months ago

I don't have a whole lot of faith in Google, based on considerable experience with developing for Android. Put plainly, it's a mess, and even with improvements in recent years there's enough low-hanging fruit for improving its developer story that much of it has fallen off the tree and stands a foot thick on the ground.

MarsIronPI2 months ago

Except that Android doesn't have a fixed target either. Google Play requires apps to be rebuilt against the latest Android API level all the time; they have about a year after each release to update or be removed.

+3
api2 months ago
ninth_ant2 months ago

Ubuntu LTS is currently on track to be that. Both in the server and desktop space, in my personal experience it feels like a rising number of commercial apps are targeting that distro specifically.

It’s not my distribution of choice, but it’s currently doing exactly what you suggest.

mips_avatar2 months ago

I just installed Ubuntu again after a few years, and it’s striking how familiar the pain points are—especially around graphics. If Ubuntu LTS is positioning itself as the standard commercial Linux target, it has to clearly outperform Windows on fundamentals, not just ideology. Linux feels perpetually one breakthrough release away from actually displacing it.

+2
cosmic_cheese2 months ago
LeFantome2 months ago

Valve has been pretty clear that Win32 is the platform.

singron2 months ago

Isn't that the Steam Linux Runtime? Games linked against the runtime many years ago still run on modern distros.

LeFantome2 months ago

What desktop Linux distro has “considerable funding”?

MrDrMcCoy2 months ago

Red Hat, SuSE, and Ubuntu.

bilekas2 months ago

I do agree. It's also thanks to gaming that the GPU industry was in such a good state to be consumed by AI now. Game development used to always be the frontier of software optimisation techniques and ingenious approaches to the constraints.

baq2 months ago

I low-key hope the current DDR5 prices push them to drag Linux memory and swap management into the 21st century too, because hard-locking on low memory got old a while ago.

the_pwner2242 months ago

It takes a solid 45 seconds for me to enable zram (compressed RAM as swap) on a fresh Arch install. I know that doesn't solve the issue for 99% of people who don't even know what zram is / have no idea how to do it / are trying to do it for the first time, but it would be pretty easy for someone to enable that in a distro. I wouldn't be shocked if it is already enabled by default in Ubuntu or Fedora.

m4rtink2 months ago

Zram has been enabled on Fedora by default since 2020:

https://fedoraproject.org/wiki/Changes/SwapOnZRAM

MrDrMcCoy2 months ago

Zswap is arguably better. It confers most of the benefits of zram swap, plus being able to evict to non-RAM if cache becomes more important or if the situation is dire. The only times I use zram are when all I have to work with for storage is MMC, which is too slow and fragile to be written to unless absolutely necessary.

johnny222 months ago

That just pushes the problem away, it doesn't solve it. I still hit that limit when I ran a big compile while some other programs were using a lot of memory.

ahepp2 months ago

what behavior would you like to see when primary memory is under extreme pressure?

baq2 months ago

See macOS or Windows: grow swap automatically up to some sane limit, show a warning, give the user an option to kill stuff; on headless systems, kill stuff. Do not page out critical system processes like sshd or the compositor.

A hard lock which requires a reboot or, god forbid, power cycling is the worst possible outcome; literally anything else which doesn't start a fire is an improvement, TBH.

+1
jpc02 months ago
jhasse2 months ago

Same as Windows. Instead the system freezes.

marcodiego2 months ago

I thought that was fixed after MGLRU.

stdbrouw2 months ago

I feel like all of the elements are there: zram, zswap, various packages that improve on default oom handling... maybe it's more about creating sane defaults that "just work" at this point?

gf0002 months ago

I think it's more of a user space issue, that the UI doesn't degrade nicely. The kernel just defaults to a more server-oriented approach.

PartiallyTyped2 months ago

To be fair, Proton is based on DXVK, which started as one guy's project because he wanted to play Nier: Automata on Linux.

The guy is Philip Rebohle.

foresto2 months ago

Yes, and when Valve caught wind of his early efforts, they paid him to work on it full time.

https://www.gamingonlinux.com/2018/09/an-interview-with-the-...

robotnikman2 months ago

And thanks to him I was able to play and finish Nier Automata on the Steam Deck!

raverbashing2 months ago

Let's be honest

Linux (and its ecosystem) sucks at having focus and direction.

They might get something right here and there, especially related to servers, but an awful lot of effort goes into spinning wheels.

See how slow Wayland progress has been. See how some distros moved to it only after a lot of kicking and screaming.

See how a lot of "newer" peripherals (sometimes a model that's been 2 or 3 years on the market) only barely work in a newer distro, or have weird bugs.

"but the manufacturers..." "but the hw producers..." "but open source..." whine

Because Linux lacks a good hierarchy for isolating responsibility, instead going for "every kernel driver can do whatever it wants" together with "interfaces that keep flip-flopping at every new kernel release" - notable (good) exception: USB userspace drivers. And don't even get me started on the whole mess that is Xorg drivers.

And then you have a Rube Goldberg machine in the form of udev, D-Bus and whatnot, or whatever newer solution that solves half the problems and creates a new collection of bugs.

cosmic_cheese2 months ago

Honestly I can't see it remaining tenable to keep things like drivers in the kernel for too much longer… both due to the sheer speed at which the industry moves and due to the security implications involved.

captn3m02 months ago

My favourite is the Windows futex primitives being shipped on Linux: https://lwn.net/Articles/961884/

foresto2 months ago

They needed less stuttering in games and we got an optimized shader compiler for the open-source graphics stack.

https://steamcommunity.com/games/221410/announcements/detail...

GZGavinZhao2 months ago

Next thing I want them to work on is Linux suspend(-to-RAM) support!

jpetso1 month ago

You don't feel like the Steam Deck does a pretty good job with suspend and resume, even while playing games?

asdff2 months ago

I wish Valve didn't abandon the Mac as a platform, honestly. As nice as these improvements are for Linux and Deck users, they have effectively abandoned their Mac ports: they never updated them to 64-bit like the Linux and Windows builds, so they can't run on new Macs at all. You can coax them into running with Wine on a Mac, but it is a very tricky experience. My Kegworks Wine wrapper for TF2 is currently broken as of last month because the game update download from Wine Steam keeps corrupting, and I'm at a bit of a loss at this point how to work around it. Even when it was working, performance was not great and subject to regular lag spikes whenever too many explosions went off.

ux2664782 months ago

I totally get why they did, having had to support Mac for an in-house engine. Apple is by far the most painful platform to support out of the big 3 if you're not using turnkey tools, and they don't make up for it with sales outside of iOS. The extra labor is hard to justify already, and then we get to technical deficiencies like MoltenVK, plus social deficiencies like terrible support. It's just a really hard sell all around.

ndsipa_pomu1 month ago

It was likely about control. Valve saw that Microsoft was becoming more controlling about the Windows platform, and that's what pushed them towards developing SteamOS on Linux, since that means Valve can put resources into fixing anything they want. The Apple platform is also under the control of a single entity, so it doesn't make much sense for Valve to care about that (as well as Apple not being known as a gaming platform).

What you should do is just buy a Steam Deck for gaming.

hulitu1 month ago

> They also needed smoother frame pacing and we got a scheduler that Zuck is now using to run data centers.

There has been a lot of work in the Linux scheduling space over the years; Con Kolivas' BFS was one example. The issue was that Linus had his own ideas about kernel scheduling which, unfortunately, were very different from those of the Linux community. And yes, the default Linux scheduler sucks.

Plagman1 month ago

The Elden Ring stutter work was unrelated to this effort, it was work in vkd3d-proton by Hans-Kristian Arntzen as part of our open-source graphics effort.

irusensei2 months ago

If I'm not mistaken, this has been greatly facilitated by the recent BPF-based extension mechanism (sched_ext) that allows developers to go crazy creating schedulers and other functionality through a sandboxed virtual machine mechanism provided by the kernel.
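
For a flavor of what that looks like, here's a rough BPF-side sketch modeled on the upstream scx_simple example; helper names and macros have shifted between kernel releases, so treat it as illustrative rather than copy-paste ready:

    #include <scx/common.bpf.h>

    char _license[] SEC("license") = "GPL";

    /* Pick a CPU; if an idle one is found, dispatch straight to its local queue. */
    s32 BPF_STRUCT_OPS(minimal_select_cpu, struct task_struct *p,
                       s32 prev_cpu, u64 wake_flags)
    {
        bool is_idle = false;
        s32 cpu = scx_bpf_select_cpu_dfl(p, prev_cpu, wake_flags, &is_idle);

        if (is_idle)
            scx_bpf_dispatch(p, SCX_DSQ_LOCAL, SCX_SLICE_DFL, 0);
        return cpu;
    }

    /* Otherwise park the task on the shared global dispatch queue. */
    void BPF_STRUCT_OPS(minimal_enqueue, struct task_struct *p, u64 enq_flags)
    {
        scx_bpf_dispatch(p, SCX_DSQ_GLOBAL, SCX_SLICE_DFL, enq_flags);
    }

    SEC(".struct_ops.link")
    struct sched_ext_ops minimal_ops = {
        .select_cpu = (void *)minimal_select_cpu,
        .enqueue    = (void *)minimal_enqueue,
        .name       = "minimal",
    };

The whole point of sched_ext is that a buggy or misbehaving scheduler like this gets verified and, if it stalls tasks, kicked out in favor of the default scheduler instead of hanging the machine.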

thdrtol2 months ago

I have a feeling this will also drag Linux mobile forwards.

Currently almost no one is using Linux on mobile because of the lack of apps (banking, for example) and bad hardware support. When developing for Linux becomes more and more attractive this might change.

thewebguyd2 months ago

> When developing for Linux becomes more and more attractive this might change.

If one (or maybe two) OSes win, then sure. The problem is there is no "develop for Linux" unless you are writing for the kernel.

Each distro is a standalone OS. It can have any variety of userland. You don't develop "for Linux" so much as you develop "for Ubuntu" or "for Fedora" or "for Android" etc.

Root_Denied1 month ago

There's always appimages or flatpaks that could fill that cross-distro gap, though I suspect a lot of development work would need to be done to get that to a point where either of those are streamlined enough to work in the phone ecosystem.

Zetaphor1 month ago

This is addressed (imperfectly) by Flatpak

znpy2 months ago

If anything it’s crazy that a company as large as meta is doing such a shitty job that it has to pull in solutions from entirely different industries … but that’s just my opinion

HexPhantom1 month ago

Yeah, it's a great example of demand-driven open source work actually landing in places that matter

rcbdev2 months ago

In game development, you encounter most of the hard computer science problems.

teekert2 months ago

They also sponsor bcachefs.

znpy2 months ago

Gaben is our lord and saviour.

downrightmike2 months ago

Man, if only Meta would give back. Oh, and also stop letting scammers use their AI to scam our parents - but hey, that accounted for 10% of their revenue this last year; that's $16 BILLION.

phatfish2 months ago

Valve seemingly has no qualms about using the same tactics casinos perfected to hook people (and their demographics skew young). They are not at Meta's level of societal harm, but they are happy to be a gateway into gambling for kids. Not that this is unusual in gaming, unfortunately.

justapassenger2 months ago

Like them or not, when it comes to the Linux kernel they have been one of the biggest contributors for many years now.

ls6122 months ago

Gaben does nothing: Wins

Gaben does something: Wins Harder

7bit2 months ago

He's the person I want to meet the least out of all the people in the world - that's how much of a hero he is to me.

dabockster2 months ago

> This is the best kind of open source trickledown.

We shouldn't be depending on trickle-down anything. It's nice to see Valve contributing back, but we all need to remember that they could totally evaporate or vanish behind proprietary licensing at any time.

dymk2 months ago

They have to abide by the Wine license, which is the LGPL, so unless they're going to make their own from scratch, they can't make the bread and butter of their compat layer proprietary.

nextaccountic2 months ago

That's why the anti-GPL push is so harmful, especially in the Rust ecosystem.

+3
bigstrat20032 months ago
stavros2 months ago

How? It's GPL.

jact2 months ago

Can it vanish behind proprietary licensing? Pretty sure most of Valve’s stuff is under GPL so they can’t exactly evaporate that away.

mikkupikku2 months ago

> SCX-LAVD has been worked on by Linux consulting firm Igalia under contract for Valve

It seems like every time I read about this kind of stuff, it's being done by contractors. I think Proton is similar. Of course that makes it no less awesome, but it makes me wonder about the contractor to employee ratio at Valve. Do they pretty much stick to Steam/game development and contract out most of the rest?

ZeroCool2u2 months ago

Igalia is a bit unique in that it serves as a single corporate entity for organizing a lot of sponsored work on the Linux kernel and open source projects. You'll notice in their blog posts that they have collaborations with a number of other large companies seeking to sponsor very specific development work. For example, Google works with them a lot. I think it really just simplifies a lot of logistics for paying folks to do this kind of work, plus the Igalia employees get shared efficiencies and savings for things like benefits, etc.

butlike2 months ago

Oh ok, so Igalia owns the developer sweatshops now. Got it.

dan-robertson2 months ago

This seems to be a win-win where developers benefit from more work in niche areas, companies benefit by getting better developers for the things they want done, and Igalia gets paid (effectively) for matching the two together, sourcing sufficient work/developers, etc.

ksynwa1 month ago

I don't know much about Igalia, but they are worker-owned and I always see them working on tasks with high skill requirements. Makes me wish I were good enough to work for them.

the_mitsuhiko2 months ago

It's a cooperative sweatshop in that sense.

saagarjha2 months ago

And the developers own Igalia.

zipy1241 month ago

Just because work is 'out-sourced' to contractors does not mean it is a sweatshop....

chucky_z2 months ago

This isn't explicitly called out in any of the other comments, in my opinion, so I'll state it: Valve as a company is incredibly focused internally on its business. Its business is games, game hardware, and game delivery. For anything outside of that purview, instead of trying to build a huge internal team, they contract out. I'm genuinely curious why other companies don't do this more often, because it seems incredibly cost effective. They hire top-level contractors to do top-tier work on hyper-specific areas and everyone benefits.

I think this kind of work is why Valve gets a free pass to do some real heinous shit (all the gambling stuff) and maintain incredible goodwill. They're a true "take the good with the bad" kind of company. I certainly don't condone all the bad they've put out, and I also have to recognize all the good they've done at the same time.

Back to the root point: small company focused on core business competencies, extremely effective at contracting out non-core business functions. I wish more businesses functioned this way.

javier22 months ago

Yeah, I suppose this workflow is not for everyone. I can only imagine Valve has very specific issues or requirements in mind when they hire contractors like this. When you hire like this, I suspect what you really pay for is a well-known name that will be able to push something important to you into upstream Linux. It's the right way to do it if you want it resolved quickly; if you come in as a fresh contributor, landing features upstream can take years.

smotched2 months ago

What are the bad practices Valve is engaging in around gambling?

crtasm2 months ago

Their games and systems tie into huge gambling operations on 3rd party sites

If you have 30mins for a video I recommend People Make Games' documentary on it https://www.youtube.com/watch?v=eMmNy11Mn7g

+2
trinsic22 months ago
mewse-hn2 months ago

Loot-box-style underage gambling in their live-service games - TF2 hats, Counter-Strike skins, "trading cards", etc.

msh2 months ago

Loot boxes come to mind.

butlike2 months ago

Small company doesn't have the capital to contract out library work like that. Same story as it's always been

tayo422 months ago

I feel like I rarely see contracting out work go well. This seems like an exception.

OkayPhysicist2 months ago

The .308 footgun with software contracting stems from a misunderstanding of what we pay software developers for. The model under which contracting seems like the right move is "we pay software developers because we want a unit of software", like how you pay a carpenter to build you some custom cabinets. If the union of "things you have a very particular opinion about, and can specify coherently" and "things you don't care about" completely cover a project, contracting works great for that purpose.

But most of the time you don't want "a unit of software", you want some amorphous blob of product and business wants and needs, continuously changing at the whims of business, businessmen, and customers. In this context, sure, you're paying your developers to solve problems, but moreover you're paying them to store the institutional knowledge of how your particular system is built. Code is much easier to write than to read, because writing code involves applying a mental model that fits your understanding of the world onto the application, whereas reading code requires you to try and recreate someone else's alien mental model. In the situation of in-house products and business automation, at some point your senior developers become more valuable for their understanding of your codebase than their code output productivity.

The context of "I want this particular thing fixed in a popular open source codebase that there are existing people with expertise in", contracting makes a ton of sense, because you aren't the sole buyer of that expertise.

magicalhippo2 months ago

If you have competent people on both sides who care, I don't see why it wouldn't work.

The problem seems, at least from a distance, to be that bosses treat it as a fire-and-forget solution.

We haven't had any software built by outsiders yet, but we have hired consultants to help us on specifics, like changing our infra and helping move local servers to the cloud. They've been very effective and helped us a lot.

We had talks first, though, so we found someone we could trust had the knowledge, and we were knowledgeable enough ourselves to determine that. We then followed up closely.

tayo422 months ago

I think your first two sentences are pretty common issues though.

stackskipton2 months ago

Most companies that hire a ton of contractors are doing it for business/financial reporting reasons. Contractors don't show up as employees, so investors don't see the employee count rise, the "revenue per employee" metric does not get dragged down, and contractors can be cut immediately with no further expenses. Laid-off employees take about a quarter to be truly shed from the books, between severance, vacation payouts and unemployment insurance.

TulliusCicero2 months ago

Valve contracts out to actually competent people and companies rather than giant bodycount consulting firms.

m4rtink2 months ago

Not to mention the code is open source and needs to be accepted upstream to be actually useful in the long term.

zipy1241 month ago

This is mostly because the title of contractor has come to mean many things. In its original form - outsourcing temporary work to experts in the field - it still works very well. Where it fails is when a business contracts out business-critical work, or contracts a general company rather than experts.

to11mtm2 months ago

I've seen both good and bad contractors in multiple industries.

When I worked in the HFC/Fiber plant design industry, the simple act of "Don't use the same boilerplate MSA for every type of vendor" and being more specific about project requirements in the RFP makes it very clear what is expected, and suddenly we'd get better bids, and would carefully review the bids to make sure that the response indicated they understood the work.

We also had our own 'internal' cost estimates (i.e. if we had the in house capacity, how long would it take to do and how much would it cost) which made it clear when a vendor was in over their head under-bidding just to get the work, which was never a good thing.

And, I've seen that done in the software industry as well, and it worked.

That said, the main 'extra' challenge in IT is that many of the good players aren't going to be the ones beating down your door the way the Big 4 or a WITCH consultancy will.

But really, at the end of the day, what often happens is that business people who don't really know (or necessarily -care-) enough about the specifics are unfortunately the ones picking things like vendors.

And worse, sometimes they're the ones writing the spec and not letting engineers review it. [0]

[0] - This once led to an off-shore body shop getting a requirement along the lines of 'the stored procedures and SQL called should be configurable' and sure enough the web.config had ALL the SQL and stored procedures as XML elements, loaded from config just before the DB call, thing was a bitch to debug and their testing alone wreaked havoc on our dev DB.

WD-422 months ago

Igalia isn’t your typical contractor. It’s made up of competent developers that actually want to be there and care to see open source succeed. Completely different ball game.

abnercoimbre2 months ago

Nope. Plenty of top-tier contractors work quietly with their clientele and let the companies take the credit (so long as they reference the contractor to others, keeping the gravy train going.)

If you don't see it happening, the game is being played as intended.

tapoxi2 months ago

Valve is actually extremely small, I've heard estimates at around 350-400 people.

They're also a flat organization, with all the good and bad that brings, so scaling with contractors is easier than bringing on employees that might want to work on something else instead.

sneak2 months ago

300 people isn’t “extremely small” for a company. I don’t work with/for companies over 100 people, for example, and those are already quite big.

zipy1241 month ago

300 is extremely small for a company of their size in terms of revenue and impact. Linus Media Group and their other companies, for instance, are over 100 people, and are much smaller in impact and revenue than a company like Valve, despite not being far off in number of employees (within an order of magnitude)...

tester7562 months ago

300 people running Steam, creating games and maintaining Steam Deck / Linux and stuff?

Yes, 300 is quite small.

frakkingcylons2 months ago

I think a better way to think of it is in terms of revenue per employee. Valve is WAY up there.

PlanksVariable2 months ago

Of course smaller companies exist — there are 1 person companies! But in a world where many tech companies have 50,000+ employees, 300 is much closer to 100 or 10 and they can all be considered small.

And then you consider it in context: a company with huge impact, brand recognition, and revenue (about $50M/employee in 2025). They’ve remained extremely small compared to how big they could grow.

+2
sneak2 months ago
hatthew2 months ago

the implied observation is that valve is extremely small relative to what it does and how big most people would expect it to be

mindcrash2 months ago

Proton is mainly a joint effort between in-house developers at Valve (with support on specific parts from contractors like Igalia), developers at CodeWeavers, and the wider community.

For contextual, super-specific, super-specialized work (e.g. SCX-LAVD, the DirectX-to-Vulkan and OpenGL-to-Vulkan translation layers in Proton, and most of the graphics driver work required to make games run on the upcoming ARM-based Steam Frame) they like to subcontract to orgs like Igalia, but that's about it.

everfrustrated2 months ago

Valve is known to keep their employee count as low as possible. I would guess anything that can reasonably be contracted out is.

That said, something like this which is a fixed project, highly technical and requires a lot of domain expertise would make sense for _anybody_ to contract out.

treyd2 months ago

They seem to be doing it through Igalia, which is a company based on specialized consulting for the Linux ecosystem, as opposed to hiring individual contractors. Your point still stands, but from my perspective this arrangement makes a lot of sense while the Igalia employees have better job security than they would as individual contractors.

izacus2 months ago

This is how "Company funding OSS" looks like in real life.

There have been demands to do that more on HN lately. This is how it looks like when it happens - a company paying for OSS development.

wildzzz2 months ago

It would be a large effort to stand up a department that solely focuses on Linux development, just as it would be to shift game developers to writing Linux code. Much easier to just pay a company to do the hard stuff for you. I'm sure the Steam Deck hardware was the same: Valve did the overall design and requirements, but another company did the actual hardware development.

koverstreet2 months ago

Speaking for myself, Valve has been great to work with - chill, and they bring real technical focus. It's still engineers running the show there, and they're good at what they do. A real breath of fresh air from much of the tech world.

FartyMcFarter1 month ago

What sort of stuff did you work on with them, if you don't mind me asking?

jvanderbot2 months ago

They probably needed some point expertise on this one, as they build out their teams.

Brian_K_White2 months ago

I don't know what you're trying to suggest or question. If there is a question here, what is it exactly, and why is that question interesting? Do they employ contractors? Yes. Why was that a question?

mikkupikku2 months ago

Wut.

bogwog2 months ago

Valve has a weird obsession with maximizing their profit-per-employee ratio. There are stories from ex-employees out on the web about how this creates a hostile environment, and perverse incentives to sabotage those below you to protect your own job.

I don't remember all the details, but it doesn't seem like a great place to work, at least based on the horror stories I've read.

Valve does a lot of awesome things, but they also do a lot of shitty things, and I think their productivity is abysmal based on what you'd expect from a company with their market share. They have very successful products, but it's obvious that basically all of their income comes from rent-seeking from developers who want to (well, need to) publish on Steam.

wocram2 months ago

There are numerous other ways to publish games. Is it really rent-seeking to own and maintain the most popular game publishing platform?

redleader552 months ago

It's worth mentioning that sched_ext was developed at Meta. The schedulers are developed collaboratively by several companies, not just Meta or Valve or Igalia, and the development happens in a shared GitHub repo: https://github.com/sched-ext/scx.

9999000009992 months ago

That's the magic of open source. Valve can't say ohh noes you need a deluxe enterprise license.

senfiaj2 months ago

In this case yes, but on the other hand Red Hat won't publish the RHEL source unless you have the binaries. The GPLv2 license requires you to provide the source code only to those you distribute the compiled binaries to. In theory, Meta could apply its own proprietary patches to Linux and not publish the source code, as long as it only runs that patched Linux on its own servers.

dralley2 months ago

RHEL source code is easily available to the public - via CentOS Stream.

For any individual RHEL package, you can find the source code with barely any effort. If you have a list of the exact versions of every package used in RHEL, you could compose it without that much effort by finding those packages in Stream. It's just not served up to you on a silver platter unless you're a paying customer. You have M package versions for N packages - all open source - and you have to figure out the correct construction for yourself.

cherryteastain2 months ago

Can't anyone get a RHEL instance on their favorite cloud, dnf install whatever packages they want sources of, email Redhat to demand the sources, and shut down the instance?

dfedbeef2 months ago

RHEL specifically makes it really annoying to see the source. You get a web view.

+2
tremon2 months ago
Aperocky2 months ago

Don't forget RH is owned by IBM.

+1
OsrsNeedsf2P2 months ago
kstrauser2 months ago

I'm more surprised that the scheduler made for a handheld gaming console is also demonstrably good for Facebook's servers.

giantrobot2 months ago

Latency-aware scheduling is important in a lot of domains. Getting video frames or controller input delivered on a deadline is a similar problem to getting voice or video packets delivered on a deadline. Meanwhile housecleaning processes like log rotation can sort of happen whenever.
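
On mainline Linux, the closest first-class expression of that idea is SCHED_DEADLINE. A hedged C sketch of opting a thread into it via the raw sched_setattr syscall; the runtime/deadline/period values are arbitrary examples, and the struct is declared by hand the way the man page example does since libc headers don't always expose it:

    #define _GNU_SOURCE
    #include <stdint.h>
    #include <stdio.h>
    #include <unistd.h>
    #include <sys/syscall.h>
    #include <linux/sched.h>   /* SCHED_DEADLINE */

    /* Kernel UAPI layout, spelled out manually as in the sched_setattr(2) example. */
    struct sched_attr {
        uint32_t size;
        uint32_t sched_policy;
        uint64_t sched_flags;
        int32_t  sched_nice;
        uint32_t sched_priority;
        uint64_t sched_runtime;
        uint64_t sched_deadline;
        uint64_t sched_period;
    };

    int main(void)
    {
        struct sched_attr attr = {
            .size           = sizeof(attr),
            .sched_policy   = SCHED_DEADLINE,
            /* "Give me 2 ms of CPU every 10 ms, finished within the 10 ms window." */
            .sched_runtime  = 2  * 1000 * 1000,
            .sched_deadline = 10 * 1000 * 1000,
            .sched_period   = 10 * 1000 * 1000,
        };

        if (syscall(SYS_sched_setattr, 0, &attr, 0) != 0) {
            perror("sched_setattr");  /* typically needs root or CAP_SYS_NICE */
            return 1;
        }
        /* ... periodic latency-sensitive work here ... */
        return 0;
    }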

bigyabai2 months ago

I mean, part of it is that Linux's default scheduler is braindead by modern standards: https://en.wikipedia.org/wiki/Completely_Fair_Scheduler

3eb7988a16632 months ago

Part of that is the assumption that Amazon/Meta/Google all have dedicated engineers who should be doing nothing but tuning performance for 0.0001% efficiency gains. At the scale of millions of servers, those tweaks add up to real dollar savings, and I suspect little of how they run is stock.

Anon10962 months ago

This is really just an example of survivorship bias and the power of Valve's good brand value. Big tech does in fact employ plenty of people working on the kernel to make 0.1% efficiency gains (for the reason you state), it's just not posted on HN. Someone would have found this eventually if not Valve.

And the people at FB who worked to integrate Valve's work into the backend and test it and measure the gains are the same people who go looking for these kernel perf improvements all day.

accelbred2 months ago

CFS was replaced by EEVDF, no?

0x1ch2 months ago

I vaguely remember reading when this occurred. It was very recent no? Last few years for sure.

> The Linux kernel began transitioning to EEVDF in version 6.6 (as a new option in 2024), moving away from the earlier Completely Fair Scheduler (CFS) in favor of a version of EEVDF proposed by Peter Zijlstra in 2023 [2-4]. More information regarding CFS can be found in CFS Scheduler.

jorvi2 months ago

Ultimately, CPU schedulers are about choosing which attributes to weigh more heavily. See this[0] diagram from GitHub. EEVDF isn't a straight upgrade over CFS, nor is LAVD over either.

It's just that, traditionally, Linux schedulers have been rather esoteric to tune, and by default they've been optimized for throughput and fairness over everything else. Good for workstations and servers, bad for everyone else.

[0] https://tinyurl.com/mw6uw9vh

phdelightful2 months ago

Parent's article says

> Starting from version 6.6 of the Linux kernel, [CFS] was replaced by the EEVDF scheduler.[citation needed]

ranger2072 months ago

A lot of scheduler experimentation has been enabled by sched_ext: https://lwn.net/Articles/922405/

jorvi2 months ago

I mean... many SteamOS flavors (and Linux distros in general) have switched to Meta's Kyber I/O scheduler to fix microstutter issues... the knife cuts both ways :)

bronson2 months ago

Kyber is an I/O scheduler. Nothing to do with this article.

Brian_K_White2 months ago

The comment was perfectly valid and topical and applicable. It doesn't matter what kind of improvement Meta supplied that everyone else took up. It could have been better cache invalidation or better usb mouse support.

HexPhantom1 month ago

Exactly. Once the work is upstream and open, it stops being "Valve's thing" and just becomes part of the commons

sintax2 months ago

Well if you think about it, in this case the license is the 30% cut on every game you purchase on steam.

Sparkyte2 months ago

I've been using Bazzite Desktop for 4 months now and it has been my everything. Windows is just abandonware now even with every update they push. It is clunky and hard to manage.

aucisson_masque2 months ago

Isn't Bazzite a gaming-focused distribution? It seems weird to install it on a PC that does 'my everything'.

I wouldn't make Excel spreadsheets on the Steam Deck, for instance.

0x1ch2 months ago

Bazzite is advertised for gamers; however, from my understanding it's just Fedora Atomic wrapped up to work well on Steam Deck-adjacent hardware, with gaming as a top priority. You'd still be receiving the same level of quality you would expect from Fedora/RHEL (I would think).

Sparkyte2 months ago

Precisely. I like its commitment to Fedora Atomic. Fedora is, in my opinion, the best user-experience Linux out there, and not just because Linus Torvalds said it was his favorite. Probably not the best for servers or for basing a console OS on, but as a daily driver, consistency is more important. Keeping things in Flatpaks makes it easy to manage what is installed too.

pawelduda2 months ago

Why not? It has full desktop mode with Plasma and can be docked like PC

Sparkyte2 months ago

Gaming or not, stability is important. An OS that focuses on gaming will typically focus on stability: neither bleeding edge nor lagging behind in support. It has to update enough to work with current games while staying behind enough to avoid weird support issues.

So Bazzite, in my opinion, is probably one of the best user-experience flavors of Fedora around.

And yes, you can do more than gaming on Bazzite.

hinkley1 month ago

I think you've forgotten, or aren't aware, that before 3D graphics cards took over, people would buy new video cards ostensibly to make Excel faster but then use them to play video games. It was an interesting time with interesting justifications for buying upgrades.

tra32 months ago

I'm curious how this came to be:

> Meta has found that the scheduler can actually adapt and work very well on the hyperscaler's large servers.

I'm not at all in the know about this, so it would not even occur to me to test it. Is it the case that if you're optimizing Linux performance you'd just try whatever is available?

laweijfmvo2 months ago

almost certainly bottom-up: some eng somewhere read about it, ran a test, saw positive results, and it bubbles up from there. this is still how lots of cool things happen at big companies like Meta.

balls1872 months ago

How well does Linux handle game streaming? I'm just now getting into it, and now that Windows 10 is dead, I want to move my desktop PC over to Linux and formally end my relationship with Microsoft.

Kholin2 months ago

It works well. I've used Sunshine as the streaming server and Moonlight as the client to play games on my Steam Deck; my PC runs openSUSE Tumbleweed with KDE Plasma. There may be some key-binding issues, but they can be solved with a little setup.

tayo422 months ago

Interesting to see server workloads take ideas from other areas. I saw recently that some of the k8s-specific OSes do their updates like Android devices.

esseph2 months ago

You mean immutable?

tayo422 months ago

That wasn't what I was thinking of. There's a phrase for it, involving active and backup partitions, but I can't find what it's called.

jraph2 months ago

A/B updates?

tayo421 month ago

Yeah, that's what I was thinking of. Looks like it's also called seamless updates, which I think was the phrase I couldn't come up with.
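
For anyone curious, the A/B ("seamless") scheme boils down to keeping two system images and only ever writing to the inactive one. A tiny sketch of the flow in Python, with made-up slot names and a stubbed health check:

    slots = {"a": {"version": "1.0"}, "b": {"version": None}}
    active = "a"

    def health_check(slot):
        # stand-in for whatever the OS uses to decide the new image boots and works
        return slots[slot]["version"] is not None

    def apply_update(new_version):
        global active
        standby = "b" if active == "a" else "a"
        # 1. write the new image to the standby slot; the active slot keeps serving
        slots[standby] = {"version": new_version}
        # 2. reboot into the standby slot and check it came up healthy
        if health_check(standby):
            active = standby      # 3a. promote the new slot
        # 3b. otherwise the bootloader falls back to the old slot automatically
        return active

    print("active slot:", apply_update("2.0"), slots)

The appeal for server fleets is the same as on Android or the Steam Deck: the update is atomic, and a bad image means a fallback rather than a half-upgraded machine.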

ahartmetz2 months ago

I keep being puzzled by the unwillingness of developers to deal with scheduling issues. Many developers avoid optimization, almost all avoid scheduling. There are some pretty interesting algorithms and data structures in that space, and doing it well almost always improves user experience. Often it even decreases total wall-clock time for a given set of tasks.

HexPhantom1 month ago

Something built to shave off latency on a handheld gaming device ends up scaling to hyperscale servers, not because anyone planned it that way, but because the abstraction was done right

erichocean2 months ago

Omarchy should adopt the SCX-LAVD scheduler as its default; it helps conserve power on laptops.

shmerl2 months ago

Can't find scxctl in Debian. Was it never packaged?

binary1322 months ago

I'm struggling to understand what workloads Meta might be running that are _this_ latency-critical.

commandersaki2 months ago

The video linked somewhere in this thread indicates it's WhatsApp Erlang workers that want sub-ms latency.

tayo422 months ago

If you have 50,000 servers for your service and you can reduce that by 1 percent, you save 500 servers. Multiply that by maybe $8k per server and you've saved $4M; you've just paid for yourself for the year several times over. With Meta the numbers are probably even bigger.
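
Spelling the (entirely hypothetical) numbers out:

    fleet = 50_000              # servers running the service
    reduction = 0.01            # 1% fewer servers needed
    cost_per_server = 8_000     # rough all-in hardware cost, USD

    servers_saved = fleet * reduction
    print(servers_saved, servers_saved * cost_per_server)   # 500.0 4000000.0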

binary1321 month ago

yes, but latency-optimized schedulers tend to have _worse_ throughput, not better.

pixelbeat__2 months ago

LOL (I used to work for Meta, so appreciate the facetious understatement)

bongodongobob2 months ago

That's not how it works though. Budgets are annual. A 1% savings in CPU cycles doesn't show up anywhere; it's a rounding error. They don't have a guy who pulls the servers and sells them ahead of the projection. You bought them for 5 years and they're staying. 5 years from now, that 1% got eaten up by other shit.

Anon10962 months ago

You're wrong about how services that cost 9+ figures to run annually are budgeted. 1% CPU is absolutely massive and well measured and accounted for in these systems.

tayo422 months ago

You don't buy servers once every 5 years. I've done purchasing every quarter and forecasted a year out. You reduce your services budget for hardware by the amount saved for that year.

bongodongobob2 months ago

5 years is the lifecycle. You're not going to get rid of a 4-year-old server because you're using fewer cycles than you thought you would. You already bought it. You find something else for it to do, or you have a little extra redundancy. If I increase the mpg of my semi fleet, that doesn't mean I can sell off some of my semis just because the cost per trip goes down.

Pr0Ger2 months ago

It's definitely for ads auctions

dabockster2 months ago

It's Meta. They always push to be that fast on paper, even when it's costly to do and they don't really need it.

stuxnet792 months ago

Meta is a humongous company. Any kind of latency has to have a business impact.

loeg2 months ago

[flagged]

fph2 months ago

Life becomes a lot better the moment you stop considering Youtube videos valid primary sources.

loeg2 months ago

It’s a recording of a talk. Feel free to point out other sources, but there doesn’t seem to be much to object to here.

fph1 month ago

https://lpc.events/event/19/contributions/2099/ is a much better reference in my view. It is the original conference website, it contains all the material in text format as well, and it does not force you to watch a video (and maybe an ad or two before that, idk, I use adblock). I call this link "primary" and the Youtube video "secondary" (as well as Phoronix).

hobobaggins2 months ago

Phoronix is blogspam?!

webdevver2 months ago

Yeah, that's kinda harsh. Phoronix is a good OSS news aggregator at the very least, and the PTS is a huge boon for "what's the best bang-for-buck LLVM build box" type questions (which is very useful!)

zipy1241 month ago

It certainly is not; I'm not sure where the commenter gets that view. Most likely it's because, alongside their primary journalistic content, they also produce secondary reporting like these short pieces, disseminating niche content to a wider audience. I can see how it might be easy to read that as blogspam, given such pieces make up most of their output, but it shouldn't be misconstrued as such in this case.

loeg1 month ago

Yes.

MrDrMcCoy2 months ago

Blogspam is very disingenuous. Phoronix covers a lot of content in the open source world that isn't well tracked elsewhere, and does some of the best and most comprehensive benchmarking of hardware and software you'll find anywhere on the internet.

alecco2 months ago

[flagged]

hoppyhoppy22 months ago

Generated comments are not allowed on HN.

alecco2 months ago

[flagged]

Boxxed2 months ago

Posting an AI summary is about as useful as posting Google search results. We can all do it, we don't need anyone to do it for us.

mikkupikku2 months ago

As well as the points already raised by others, I'd like to make the point that we should be encouraging people to prompt LLMs themselves rather than just accepting the outputs of others. As a social norm, this will make society more robust to misinformation and deception, as it will result in fewer people trusting outputs without knowing how the LLM was actually prompted.

This probably doesn't really matter in this context, but I think it's a general best practice worth reinforcing whenever possible.

littlestymaar2 months ago

If someone wants to ask an LLM about something, good for them, but there's no need to paste its output all over the internet, disclosed or not.

TZubiri2 months ago

I feel the intent of these rules is to forbid undisclosed AI-generated comments.

If you ban disclosed usage of AIgen, you will get covert usage of AIgen

bigyabai2 months ago

> you will get covert usage of AIgen

We get that regardless of how we ban disclosed usage.

mort962 months ago

You can actually ban both.

wizzwizz42 months ago

Such behaviour is extremely obvious. Anyone capable of hiding it is also capable of just… not using AIgen.

DebugDruid2 months ago

Looks like open source helped create Silicon Valley, while the absence of IP enforcement made Shenzhen. Sharing seems to really drive industry growth, so maybe the US and EU should rethink their IP laws?