
PlayStation 2 Recompilation Project Is Absolutely Incredible

425 points · 16 hours ago · redgamingtech.com
pwdisswordfishs · 11 hours ago

> The PlayStation 2’s library is easily among the best of any console ever released, and even if you were to narrow down the list of games to the very best, you’d be left with dozens (more like hundreds) of incredible titles. But the PS2 hardware is getting a bit long in the tooth

Besides the library, the PS2 is the most successful video game console of all time in units shipped. It stayed on the market for over ten years, featured a DVD drive, and at one point was positioned by Sony not just as an entertainment appliance but as a personal computer, complete with an official PS2 Linux distribution.

In a more perfect world, this would have:

(a) happened with a hypothetical hardware platform released after the PS2 but before the PS3, with specs lying in between the two: a smidge better than the former, but not quite as exotic as the latter (with its Cell CPU and weird form factor; the PS2's physical profile, by comparison, was perfect, in both the original and Slim versions), which could have:

(b) resulted in a sort of industry standardization like what happened with the IBM PC and its market of clones. Other vendors would have continued manufacturing semi-compatible units even if/when Sony discontinued it themselves, periodically revving the platform (doubling the memory here, tapping higher clock speeds there) while maintaining backwards compatibility. You could then go out today and buy a brand-new, $30, bargain-bin, commodity "PS2 clone" that can do basic computing tasks (not including running a modern Web browser or Electron apps), can play physical media, and supports all the original games plus any new games that explicitly target(ed) the same platform. Or you could pay Steam Machine 2026 prices for the latest-gen "PS2" that retains native support for the original titles of the very first platform revision, and also unlocks the ability to play those of every intermediate rev.

anonymous908213 · 9 hours ago

> (a) happened with a hypothetical hardware platform released after the PS2 but before the PS3, with specs lying in between the two

I would argue strongly that the weak hardware is why the PS2, and other old consoles, were so good, and that by improving the hardware you cannot replicate what they accomplished (which is why, indeed, newer consoles have never managed to be as iconic as older consoles). You can make an equally strong case that the Super Famicom is the best console of all time, with dozens of 10/10 games that stand the test of time. I think the limitations of the hardware played a pivotal role in both: they demanded good stylistic decisions to create aesthetically appealing games with limited resources, and they demanded significant work in curating and optimizing the game design. Every aspect of a game consumed limited resources, so bad ideas had to be culled, leaving a well-polished remainder of the best ideas, in a sort of Darwinian sense.

> (b) resulted in a sort of standardization in the industry like what happened to the IBM PC and its market of clones, with other vendors continuing to manufacture semi-compatible units

Unlike the PC market, the comprehensive list of "other vendors" is two entries long. Is it a more perfect world if Nintendo manufactures knockoff Playstations instead of its variety of unique consoles? I don't think so.

trashb · 2 hours ago

This reminded me of the quote "limitation breeds creativity": the PS2's limitations were instrumental to its success.

The PS2 was in many ways a great improvement on the PS1, but it was not easy to develop for: it could do certain things very well and other things not so well. One example is graphics, due to the unusual architecture of the Emotion Engine (the console's CPU, whose vector units handled much of the graphics work). I think this forced developers to consider what their games really required and where they wanted to spend the development effort, one of the key ingredients of good game design.

Additionally, the release hype for the PS2 was quite big, and the graphics that were achievable were very good for the time, so developers were willing to go through the development pains to create games for this console.

And let's not forget: besides the mountain of great titles for the PS2, there is also a mountain of flopped games that faded into obscurity.

vlunkr · 7 hours ago

I love retro consoles as much as the next middle aged software developer, but realistically, the reason those consoles are so iconic is because we were children. Every console generation is that special generation for one group of kids.

I do agree that sometimes limitations breed creativity, but that’s not the only thing that can make the magic work.

surgical_fire · 2 hours ago

I disagree.

I routinely revisit old games with a critical mind. It is an interesting thing to do.

I find that quite a few games I really loved as a kid are special because I played them at a formative age, yes. Some are better left in the past.

But I find some that still manage to impress me to this day. They are not good only as a memory, they are just really good.

And a second counter is that my all-time favorite consoles are the SNES and the Switch. I have been gaming ever since the Atari 2600 days. The Switch was released well into my 30s. I have no nostalgia for it.

PetitPrince · 3 hours ago

I'll join in disagreeing with this. While some games can indeed be rose-tinted (I have fond memories of that Game Boy Spider-Man game, and it's a terrible shovelware title), many of them are trailblazers (as in, they invented a genre) or still stand on their own very well.

anonymous908213 · 7 hours ago

I know it's easy to trot out "nostalgia", but do you not think it's possible that older games can genuinely be better than newer games? I very much think it is common to find such games, even games I had never played in my youth. There were bad games then too, of course, and good games now, but I think the ratio of hits was higher. Particularly now that modern game development is so sloppy. Microtransaction-infested games rule the world, and while the indie scene does still produce excellent gems, most of them tend to be significantly less polished and rougher around the edges.

vlunkr · 6 hours ago

Yeah, I think individual retro games can be incredible and stand the test of time. For me, Super Metroid and Symphony of the Night are timeless. As a whole, though, it's hard to measure. Today we have microtransactions; in the past we had games that threw in one bullshit level so you couldn't beat them during a rental (looking at you, Battletoads), plus bad movie tie-ins, lazy arcade ports, etc. There's always going to be trash.

One thing retro games obviously don't have is hindsight. Shovel Knight feels like the best NES games but lacks crap like lives and continues, because it learned from later games like Dark Souls that you can make death punishing without making it un-fun. Hollow Knight builds on my favorite games with a couple of decades of lessons on how to make platformers more interesting and less frustrating.

pjmlp · 3 hours ago

I belong to the 8- and 16-bit home computer generation, which grew up alongside those consoles, yet for those in my circle, consoles weren't special; home computers were.

Hence why I find the remarks that "PC gaming" is growing funny; for my crowd it has always been there, since the 1990s.

Barrin92 · 6 hours ago

>the reason those consoles are so iconic is because we were children

If you spend some time on YouTube and watch people too young to even have been around play through those games, it becomes evident very quickly how wrong that assessment is. There's an energy even among young audiences playing games like Metal Gear Solid 1 & 2 for the first time that you hardly see for anything coming out today.

There was a level of artistic talent in that generation, also in the animation of the time, that simply has no real parallel today, and brushing it off as nostalgia has a lot to do with the inability of people to recognize that there's no linear progress in art. Talent can be lost, some periods are better than others, and just having more CPU and GPU cycles available does not produce better art.

The fact that, almost 30 years after games like MGS, it's still Kojima and a lot of now-graying Japanese developers who end up getting the awards and pushing the envelope should tell you something.

sapphicsnail · 2 hours ago

I think people forget there were a ton of shit SNES/PSX/whatever games. I personally have a soft spot for the 16-bit era, but there are plenty of indie games coming out that are just as beautiful and creative. There's also way more exploration of narrative structure now than there was back then.

mistercheph · 5 hours ago

Will people ever be nostalgic for the xbox one? For the iphone 14?

I doubt it. These products might even be good, but they are unlike their early ancestors in several significant ways that will relegate them to the footnotes of history. Most importantly, they are difficult to distinguish from both their immediate predecessors and their immediate successors. I don't mean to say that people won't have treasured experiences from this time that they long for in 20 years, just that I doubt the console will play as significant a role in the memory.

pjerem · 4 hours ago

Just for laughs: I own the OG Xbox One, and it's the only console I hated from day one.

I clearly remember plugging it into my TV with excitement and being greeted by gigabytes of mandatory updates. Then I discovered that you couldn't play games from the disc: you had to install them to the hard drive. And then I discovered that the disc drive was actually slower than my fiber connection, which meant it was faster to download a game from the online store than to install it from a physical disc.

I think I had to wait for at least a full hour just to play my first game.

And on top of that, the performance was actually not that good: 30 fps everywhere. It was worse than the Nintendo games on Wii/GameCube, which usually ran at 50/60 fps.

I still own this shit but I never liked it. At least it was useful some months ago when I had to update my Xbox controller firmware (though since I hadn't powered it on for years, I also had to wait for updates :) ).

JoeyJoJoJr · 9 hours ago

This might be a nitpick, but I could probably only count 5-10 SNES games that would be considered 10/10 IMO, and not many that I think are worth sinking decent time into these days, compared to something like Burnout Revenge - a great game but certainly not a 10/10 game.

Still, I do find the SNES library, and 16-bit games in general, quite astounding from a creative and artistic perspective, just not so much from a player's perspective.

anonymous908213 · 9 hours ago

A Link to the Past, Super Mario World, Yoshi's Island, Kirby Super Star, Donkey Kong Country 1-3, Super Metroid, Megaman X series, Dragon Quest series, Final Fantasy series, Chrono Trigger, Earthbound... just off the top of my head, are all very much worth playing today.

vimoses · 7 hours ago

> This might be a nitpick, but I could probably only count 5-10 SNES games that would be considered 10/10 IMO

Firstly, this seems like a pretty flawed standard for evaluating a console's library, no? But secondly, five to ten 10/10s seems like a pretty good number for any console's library anyway, unless you value a "10/10" less than I guess I would.

JoeyJoJoJr · 5 hours ago

I’m not criticizing the library of SNES. I have very fond memories playing SNES games. It was more in response to the statement that there are dozens of 10/10 games on SNES. Let me clarify, there are not many 10/10 games on SNES (or any system for that matter), let alone dozens.

anthk · 3 hours ago

Between JRPGs, platformers, Super Mario Kart, and the Top Gear games you can count more than 20.

delaminator · 11 hours ago

> and at one point was positioned by Sony not just as an entertainment appliance but as a personal computer with their own official PS2 Linux distribution.

to avoid EU import taxes

pjmlp · 4 hours ago

As an owner of the PS2 Linux distribution and related hardware: it was sort of OK.

Sony intended it to be the evolution of the PS1's Net Yaroze program, fostering indie development; instead, people mostly used it to run emulators on the PS2, which is why the PS3 version lost access to hardware-accelerated graphics.

PS2 Linux did have hardware acceleration; the only difference was that its OpenGL-inspired API did not expose all the capabilities of a regular devkit.

The community proved that the development effort wasn't worth it.

The Xbox Live Arcade and ID@Xbox programs have also taken these lessons into account, which is why you only see everyone running emulators on rooted Xboxes, not on developer-mode ones.

The market of IBM PC clones only happened because of an IBM mistake; it was never supposed to happen, and IBM tried to take back control with the PS/2 line and Micro Channel Architecture (MCA), but Pandora's box was already open, and Compaq had been clever in the way they reverse-engineered the BIOS.

joshu · 10 hours ago

it was a dreadful, useless computer, even then

nick238 · 8 hours ago

Unlike the PS3, of which the US Air Force bought 1,760 units and clustered them into the 33rd most powerful supercomputer** at the time.

(**Distributed computing is very cheat-y compared to a "real" supercomputer, which has insane RDMA capabilities.)

mywittyname · 7 hours ago

We had clusters of them in university too.

If all you needed to do was vector math, a dedicated vector processor with eight cores that could run as fast as the extremely wide bus could feed them data was the way to do it. You couldn't buy anything close to its capabilities (for that specific task) for the money.

I remember the course we used them in being hard as hell, and the professor didn't really have any projects prepared that would really push the system.

emodendroket · 14 hours ago

This is cool, but of course only a small handful of titles will ever receive this kind of attention. Still, I have been blown away that sub-$300 Android handhelds are now more than capable of emulating the entire PS2 library, often with upscaling if you prefer.

observationist · 13 hours ago

Moore's law never ceases to amaze (the vulgar version where we're talking compute per dollar, not the transistor-count doubling rate). It won't be too long before phones are running AI models with performance equal to or better than current frontier models running on $100 million clusters. It's hard to even imagine the things that will be running on billion-dollar clusters in 10 years.

freedomben · 12 hours ago

I do hope you're right, but I'm quite skeptical. As mobile devices get more and more locked down, all that capacity gets less and less usable. I'm sure it will be accessible to Apple and Google models, but models that obey the user? Not likely.

timschmidt · 12 hours ago

As state-of-the-art machines continue to chase the latest node, capacity on older nodes has become much less expensive, more openly documented, and actually accessible to individuals. Open source FPGA and ASIC synthesis tools have also immensely improved in quality and capability. The Raspberry Pi Pico's RP2350 contains an open source RISC-V core designed by an individual. And 4G cell phones like the https://lilygo.cc/products/t-deck-pro are available on the market, built around the very similar ESP32. The latest and greatest will always be behind a paywall, but the rising tide floats all boats, and hobbyist projects are growing more sophisticated. Even a $1 ESP32 has dual 240 MHz 32-bit cores, 8 MB of RAM, and fast network interfaces, which blow away the 8-bit micros I grew up with. The state of the open-source art may be a bit behind the state of the proprietary art, but it is advancing as well.

It's really fun to have useful hardware that's easy to program at the bare metal.

raincole · 8 hours ago

> compute/dollar

That's ironic, because building a PC has, for the first time, gotten more expensive than it was the year before.

anonymars · 5 hours ago

Heh, well, they didn't say memory/dollar

spookie · 4 hours ago

> ... It won't be too long before phones are running AI models with performance equal to or better than current frontier models running on $100 million dollar clusters.

Maybe; perhaps phones will have the compute power... but not enough memory, if things continue the way they are. Great for the AI firms: they'll have their moat.

cubefox · 2 hours ago

DRAM prices actually haven't decreased much over the last 10 to 15 years. In the decades before, there was a huge increase in memory capacity per dollar, perhaps even exponential, as with transistor counts.

pants2 · 4 hours ago

In the same way we have websites running on disposable vapes, it may not be long before such a device could run a small local LLM, and lots of appliances could have a local voice interface - so you literally talk to your microwave!

heliumtera · 9 hours ago

I don't think you're going to see phones with 512gb VRAM+RAM in your lifetime.

bentcorner · 6 hours ago

When I was a kid, I recall my cousin upgrading his computer to 1 or 2 MB of RAM so that we could get some extra features when playing Wing Commander 1. That was 1990.

35 years later, burner phones regularly come with 4 GB of RAM. That's a three-order-of-magnitude difference, not taking into account miniaturization and speed improvements.

In another 35 years who knows what will happen. Yeah things can't improve at the same pace forever but I would be surprised if anyone back in 1990 could predict the level of technology you can get at every corner store today.

Maybe it's not that everyone gets an RTX 5090 in their pocket; maybe it's that LLMs can now run on a Raspberry Pi. Realistically it's probably something in the middle.

pants2 · 4 hours ago

This is a joke, right? Not even 10 years ago the first phones with 4 GB of RAM came out; today there are quite a few phones with 24 GB. At that rate we'll be at 512 GB by around 2040.
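A back-of-envelope sketch of that extrapolation (taking the comment's 4 GB to 24 GB over roughly ten years at face value; the data points are the comment's, not verified):

```cpp
#include <cmath>

// Implied compound annual growth: (24/4)^(1/10), about 1.196x per year.
// Returns the years needed to go from now_gb to target_gb at that same rate.
double years_to_reach(double start_gb, double now_gb,
                      double years_elapsed, double target_gb) {
    double annual = std::pow(now_gb / start_gb, 1.0 / years_elapsed);
    return std::log(target_gb / now_gb) / std::log(annual);
}
```

`years_to_reach(4, 24, 10, 512)` comes out to roughly 17 years, i.e. the early 2040s, consistent with the "around 2040" guess.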

cubefox · 2 hours ago

I don't think there are "quite a few" phones with 24GB. For example, even the Samsung Galaxy S25 Ultra, which is one of the most expensive ones out there, only has 12GB DRAM.

anthk · 3 hours ago

When I was a kid in elementary school we used DOS computers with maybe 4 MB of RAM, and the PlayStation wasn't many times more powerful. A few years (two or three) later we got Windows 95/98 machines with 128 times more RAM. A few years after that, computers could more or less emulate the PSX and the N64; all within six years.

cubefox · 2 hours ago

The PlayStation 5 (16GB) has only twice as much RAM as the PlayStation 4 (8GB), and the PlayStation 6 will likely have just 1.5x as much as the PS5: 24GB. And even that might be optimistic with the recent explosion of memory price.

cvs268 · 8 hours ago

A tech-optimist would perceive this as a death-threat! :,-)

deadbabe · 10 hours ago

They will not build that phone because then you won’t subscribe to AI cloud platforms.

jkingsman · 13 hours ago

It really is incredible. I've been playing through my childhood games on retro handhelds, and recently jumped from sub-$100 handhelds to a Retroid Pocket Flip. I've been playing Wii U and PS2 games flawlessly at 2x resolution, and even tackling some lighter Switch games on it.

reactordev · 13 hours ago

It truly is. My issue, though (as in 2010, when I built an arcade cabinet capable of playing everything), is that you eventually just run out of interest in it all. Not even the nostalgia keeps my attention, with the exception of a small handful of titles.

- Excite Bike (it’s in its own league) NES

- Punchout (good arcade fun) NES

- TMNT 4-P Coop Mame Version

- NBA Jam Mame Version

- Secret of Mana SNES

- Chrono Trigger SNES

- Breath of Fire 2 SNES

- Mortal Kombat Series SEGA32X

- FF Tactics PS1

I know these can all basically be run in a browser at this point, but even Switch or Dreamcast games were meh. N64/PS1/PS2/Xbox was peak, and it's been rehashed franchises ever since. Shame. The only innovative thing that has happened since storytelling died has been battle-royale looter shooters.

Novosell · 13 hours ago

Outer Wilds, Baba is You, Blue Prince, Hades 1&2, Disco Elysium, Hollow Knight, Slay the Spire, Vampire Survivors, Clair Obscur, What Remains of Edith Finch, 1000xResist, Return of the Obra Dinn, Roboquest, Rocket League, Dark Souls, etc. I could go on, and on, and...

Not rehashes. Original, phenomenal games covering damn near every genre, and if there is a genre you're missing, I can find a modern game to match.

Do you actually engage with modern games?

haunter · 12 hours ago

>The only innovative thing that has happened since storytelling died

lol

There are countless already-classic modern story-driven games that push the boundaries of video games forward.

I know nostalgia is a very strong drug, and I also love the games I grew up with in the 90s, but it's pure ignorance to say that (1) "storytelling died" and (2) no innovation has happened in video games in modern times (whatever that even means).

leguminous · 11 hours ago

I disagree. There are some new (sub-) genres and great games since that period.

* Roguelites have proliferated: Hades is the most obvious example, but there are a variety of sub-genres at this point.

* Vampire Survivors (itself a roguelite) spawned survivors-likes. Megabonk is currently pretty popular.

* Slay the Spire kicked off a wave of strategy roguelites.

* There are "cozy" games like Unpacking.

* I don't recall survival games like Subnautica or Don't Starve being much of a thing in the PS2 era.

* There are automation games like Factorio and Satisfactory.

* Casual mobile games are _huge_.

* There are more experimental games, sometimes in established genres, like Inscryption, Undertale, or Baba Is You.

Not to mention that new games in existing genres can be great. Hollow Knight is a good example. Metroidvanias were established by the SNES and PS1 era, but Hollow Knight really upped the stakes.

I'm sure I'm forgetting things and people will have some criticism, but I really don't believe games have stagnated in general.

mlyle · 13 hours ago

For the oldies but goodies in my list:

- Any one of the 194_ games

- Legend of Zelda: A Link To The Past

- Super Mario World

- Final Fantasy VI, VII, IX

- Chrono Trigger (agree)

- Street Fighter 2 Championship Edition

- Metal Gear Solid 1-3, MGS: Peace Walker

But I think there's been good stuff since.

- The Super Mario Galaxy games

- Super Monkey Ball

- MGS4, MGS5

- Witcher 3

- The Bioshock games

- Minecraft: probably the game with the most replay value of all time.

I don't know what will stand the test of time. I don't want to play any of these games now, since I've burnt them out, but at some point I'll likely want to play them again...

- Undertale

- Bravely Default

- The Octopath games

- Dispatch

- AstroBot

- Clair Obscur

sbinnee · 9 hours ago

Playing Metal Gear Solid 2 is one of the fondest memories I cherish. I could only play it at the taekwondo gym I attended, and I couldn't finish it because I only had a couple of hours at the gym and could play only during break time. Oh, and I was always waiting for break time!

reactordev · 12 hours ago

Street Fighter 2 Championship Edition (whichever one had the most characters) as well as Street Fighter Alpha were great on the arcade machine.

Most of my buddies at the time would come over, grab a beer, immediately hang it in the boat-coozy cup holders (the ones that gimbal), and go to town shoulder to shoulder playing SF2. The cup holders' gimbal action kept the beers from spilling as the arcade cabinet rocked back and forth from two grown men having a virtual fist fight. Best times.

RGamma · 11 hours ago

Baldur's Gate 3 has awesome storytelling by video game standards. Plan 100+ hours for a reasonably complete first playthrough, though.

chongli · 11 hours ago

If you're struggling to keep your attention, you ought to try making a list of games you never finished (or never played) and commit to playing through them in order. I have been doing that with NES games and really enjoying it. I alternate between RPGs/adventures and action games to mix things up a bit.

Recently, I have played through Faxanadu, Dragon Warrior, Blaster Master, and am now working through Fire Emblem (translated from Japanese).

bluescrn · 11 hours ago

It's called getting older.

As a grown adult, nothing can recreate the feeling of exploring a new game as a child or teen, especially during the 80s/90s, when gaming as a whole was new and rapidly evolving.

But revisiting old favourites for the nostalgia can still be enjoyable.

techpression · 12 hours ago

What? The Dreamcast was a marvel when it came to games: Crazy Taxi, Virtua Tennis, Power Stone, Jet Set Radio, Grandia, SoulCalibur, etc.

fragmede · 12 hours ago

Paradox of choice. When you were single-digit or low-double-digit years old and you only had 3 games, you had to play the shit out of them. With every game available at your fingertips, there's no such compunction.

reactordev · 12 hours ago

Blockbuster and FuncoLand gave me all the titles I could get my 7-year-old fingers on.

irishcoffee · 13 hours ago

> N64/PS1/PS2/Xbox was peak and it’s been rehashed franchises ever since. Shame. The only innovative thing that has happened since storytelling died has been Battle Royale Looter Shooters.

I was a kid when ps1/n64 came out so I also have a lot of nostalgia about that era of gaming.

However…

There are a ton of great games out there from this era. Hell, the Uncharted series and Expedition 33 will get you 100-200 hours of excellent gameplay; Elden Ring is another 200. Lies of P is a fantastic game, 50-100 more. The Lego Star Wars and Lego Harry Potter games are a lot of fun to play with kids, and Breath of the Wild / Tears of the Kingdom are the Zelda games we wanted on the N64 as kids. I love those games. And they're not a rehash, at all.

There’s a lot of fun things out there to play if you poke around. Your local library might surprise you with the collection for completely free games you can borrow. Modern games even.

wahnfrieden · 13 hours ago

The Demon's Souls lineage of titles is another valuable innovation (I understand the earlier inspirations it had, but those aren't playable the way these modern ones are).

For MAME, I recommend trying Pang and Super Buster Bros.

pjmlp · 12 hours ago

And then folks waste all that power away on embedded-webview applications.

My Android phone is more powerful than the four PCs I owned from 1990 to 2002 (386SX, P75, P166, Athlon XP) with all their CPU, GPU, RAM, and disk space added together.

PlatoIsADisease · 12 hours ago

I sit here with a laggy Windows 11 computer with an Nvidia GPU and wonder: WTF.

It's fine with Fedora, but Windows 11 is terrible.

pjmlp · 11 hours ago

Another one full of WebView2 instances, because new hires apparently cannot code anything else.

They aren't to blame, management is.

grimgrin · 13 hours ago

I'll take a long bet with you that this project or its successors tackle more than a small handful of titles.

We live in interesting times

lysace · 12 hours ago

There is so much work in hunting down the proper upscaled/improved texture packs, though. Supposedly.

PlatoIsADisease · 12 hours ago

I gave up video games, but I remember that being a huge reason why I picked Android a decade-plus ago. Emulators :D

Apparently the iPhone allows it now. Eventually Apple gets features that are standard elsewhere. Veblen goods...

Onavo · 13 hours ago

I suspect we will see a proliferation of emulator development in the next few years.

In a lot of ways, emulators are the perfect problem for vision LLMs. It's like all those web-browser projects popping up on HN: you have a very well-defined problem with existing reference test cases. It's not going to be fun for Nintendo's lawyers in the future when everybody can crowdfund an emulator by simply running a VLM against a screen recording of gameplay (barring non-deterministic elements).

They can't oppress the software engineering masses any longer through lawfare.

flykespice · 14 hours ago

What the dev of AetherSX2 did to make games run smoothly, even on my midrange 2019 Android phone, is a wonder.

Too bad the dev, a very emotionally unstable person, abandoned his port despite his great talent.

dottjt · 14 hours ago

On the flip side, maybe those traits are what led to the existence of the emulator in the first place. Better something than nothing.

Sarkie · 13 hours ago

Wasn't he hounded by users as usual?

siev · 13 hours ago

Yeah, and he didn't want to keep receiving death threats for working on a passion project. Which I guess counts as being "emotionally unstable".

bananaboy · 13 hours ago

Link to the actual project, rather than just a news article about it: https://github.com/ran-j/PS2Recomp

OneDeuxTriSeiGo · 13 hours ago

On the topic of ports/recomps, there's also OpenGOAL [1], a FOSS desktop-native implementation of GOAL (Game Oriented Assembly Lisp) [2], the language Naughty Dog used to develop a number of their famous PS2 titles.

Since they were able to port the language and its runtime, they have been able to rapidly port these titles even with a small volunteer team.

1. https://opengoal.dev/

2. https://en.wikipedia.org/wiki/Game_Oriented_Assembly_Lisp

ZX8301 · 12 hours ago

90% of the PS2's floating-point throughput is in the two vector units, not the R5900 conducting them. Concentrating on the latter, as the article does, seems as futile as focusing on the 68000 rather than the Amiga PAD in a 16-bit context (ignoring the EE's 16-bit RAMBUS bottleneck).

However, that approach will probably suit the least ambitious PC-to-PS2 ports (by studios that didn't appreciate the difference), rather as an ST emulator was a shortcut to running the simplest Amiga games.

corysama · 6 hours ago

Hey! I can speak here.

Back in the day, I wrote a simulator for the PS2’s vector units because Sony did not furnish any debugger for them. A month after I got it working, a Sony 2nd party studio made their VU debugger available to everyone… Anyway…

The good news is that the VU processors are actually quite simple as far as processors go. Powerful, and complicated to use, but not complicated to specify.

This is made much simpler by the fact that the only documentation Sony provided was written by the Japanese hardware engineers. It laid out the bit-by-bit details of the instruction set. And, the bitwise inputs, outputs, delays and side effects of each instruction.

No guidance on how to use it. But, awesome docs for writing a simulator (or recompiler).
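To give a flavor of how such a spec translates into a simulator, here is a toy lane-wise multiply-add in the spirit of the VU's 4-lane float registers. The names (Vec4, vu_madd) and the simplified semantics are illustrative assumptions, not Sony's documented mnemonics or encodings:

```cpp
#include <cstddef>

// VU-style register: four single-precision lanes (x, y, z, w).
struct Vec4 {
    float v[4];
};

// Toy model of a multiply-add instruction: out = acc + fs * ft, per lane.
// A real simulator would also model status flags, pipeline delays, and
// the other side effects the hardware manual specifies per instruction.
Vec4 vu_madd(const Vec4& acc, const Vec4& fs, const Vec4& ft) {
    Vec4 out{};
    for (std::size_t i = 0; i < 4; ++i)
        out.v[i] = acc.v[i] + fs.v[i] * ft.v[i];
    return out;
}
```

Each instruction is a small pure function over register state, which is why a bit-exact hardware spec, with no usage guidance at all, is still ideal input for a simulator.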

wmf · 14 hours ago

An application of the first Futamura projection. https://en.wikipedia.org/wiki/Partial_evaluation

jszymborski · 14 hours ago

I read this as Futurama way too many times

suprjami · 14 hours ago

So did I. Considering there is a PS2 Futurama game, it seems a reasonable mistake.

jszymborski · 13 hours ago

honestly I kept thinking of this https://theinfosphere.org/Futurama_theorem

masfuerte · 13 hours ago

Is it? It would be if it partially evaluated a MIPS emulator on a particular game. But it doesn't seem to work like that.

wmf · 13 hours ago

"Decoding the MIPS R5900 instructions in each function Translating those instructions to equivalent C++ code Generating a runtime that can execute the recompiled code The translated code is very literal, with each MIPS instruction mapping to a C++ operation." It sounds like a MIPS interpreter that gets statically unrolled.

masfuerte12 hours ago

Yes, it's like the result of unrolling a MIPS interpreter, but there never was an actual MIPS interpreter.

I thought the point of the Futamura projection was that there was actually partial evaluation happening, i.e. you take a real interpreter and specialize it in some automated fashion. That's what makes it interesting.

But I could well be wrong about the naming. It doesn't really matter what it's called if we're all clear about what's actually happening.

xnx14 hours ago

Emulation is already amazing. What can be done with recompilation is magic: https://github.com/Zelda64Recomp/Zelda64Recomp

Koshkin9 hours ago

So… What’s the magic? (In theory, interpretation/emulation and compilation should produce identical behavior.)

corysama6 hours ago

The magic is that now you can modify the source code of the game and recompile that.

Folks have been optimizing Super Mario 64 to run much faster on actual N64 hardware. And there is a project that has ported it to run on the PlayStation 1 - much weaker hardware that has no hope of emulating the N64.

firegodjr9 hours ago

Identical behavior, sure, but much less overhead and fewer restrictions on e.g. resolution than you'd get on a general purpose emulator

zeroq8 hours ago

I absolutely love the idea!

As a movie geek I'm personally offended when someone says "oh, it's from 2017, it's an old movie!" or "I don't want to see anything from the 90s, yuck" - and that's pretty common.

Of course, "Nosferatu, eine Symphonie des Grauens" is not for everyone, but I firmly believe that you can watch the new Dune and Lawrence of Arabia back to back and have a similarly enjoyable time.

Fallout 1 and 2 are miles ahead of Fallout 3 (mostly due to the uncanny valley phenomenon). Sure, the medium has changed a lot and modern consumers are used to a more streamlined experience - my favorite example is the endless stream of Baldur's Gate "modern reimplementations" or rehashes, like Pillars of Eternity, that stayed too close to the original source - and then, suddenly, someone came up with Divinity, basically a Baldur's Gate clone but with a modern UI and QoL improvements.

But consoles are different.

This can truly be a window for the next generation to look back in the past.

echelon8 hours ago

> As a movie geek I'm personally offended when someone says "oh, it's from 2017, it's an old movie!" or "I don't want to see anything from the 90s, yuck" - and that's pretty common.

Now I feel old. I was thinking you might say 1960 or something.

vkazanov4 hours ago

I recently had a chat with a colleague who had never heard of Quake.

He also never watched Lock, Stock and Two Smoking Barrels.

And Half-life is just something-something-let-me-check.

Oh, well...

Decabytes6 hours ago

I hope the Steam Machine 2.0 can be a good target for developers for years to come like the PS2 was

coopykins4 hours ago

My all time favorite console. I keep coming back to it. This to me is a fantastic way to preserve gaming history.

bri3d12 hours ago

See also: XenonRecomp, which does the same thing for Xbox 360, and N64:Recompiled which does the same thing for N64.

Note that this "recompilation" and the "decompilation" projects like the famous Super Mario 64 one are almost orthogonal approaches in a way that the article failed to understand; this approach turns the assembly into C++ macros and then compiles the C++ (so basically using the C++ compiler as a macro re-assembler / emulation recompiler in a very weird way). The famous Super Mario 64 decompilation (and openrct and so on) use the output from an actual decompiler which attempts to reconstruct C from assembly, and then modify that code accordingly (basically, converting the game's object code back into some semblance of its source code, which this approach does NOT do).

sbinnee9 hours ago

I have a samurai game, Kengo 3, that I really liked on PS2. I still have that disc at my parents'. Can anyone recommend a PS2 emulator?

MobiusHorizons9 hours ago

If you have a Mac, AetherSX2 works great on Apple silicon

colordrops12 hours ago

> So yes, currently playing PS2 games on PC via emulator is still absolutely fantastic, but native ports would be the holy grail of game preservation.

I would think that emulation of the original game as closely as possible would be the gold standard of preservation, and native ports would be a cool alternative. As described in the article, native ports are typically not faithful reproductions but enhanced to use the latest hardware.

snvzz10 hours ago

Indeed, the focus for preservation would be to increase the accuracy of emulators.

pcsx2 is pretty good today in terms of running games (the list of games it does not run is down to single digits), but it's far from accurate to the hardware.

Porting to current systems via recompilation is cool, but it has very little to do with preservation.

matthewfcarlson9 hours ago

I’ve been meaning to start decompiling one of my favorite games of the era (Hulk Ultimate Destruction) after watching the decomp of other games. Perhaps this is a sign to start?

flykespice13 hours ago

I wonder how they will tackle the infamous non-conformant PS2 floating-point behavior, which is the biggest hurdle in emulating the PS2.

mikepurvis13 hours ago

Some context for others who were unaware: https://github.com/PSI-Rockin/DobieStation/issues/51

EDIT here's potentially a better link: https://www.gregorygaines.com/blog/emulating-ps2-floating-po...

toast012 hours ago

As of now, it looks like they're ignoring it:

https://github.com/ran-j/PS2Recomp/blob/91678d19778891b4df85...

   #define FPU_ADD_S(a, b) ((float)(a) + (float)(b))
(etc)

But if you wanted to handle it, you'd presumably macro expand the floating point operations to something that matches the PS2 fpu (or comes closer).
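As a rough sketch of that direction: the PS2's FPU has no NaN or infinity — overflowing results clamp to the largest representable magnitude, and denormals flush to zero. The helper below approximates that (it is not cycle- or rounding-accurate PS2 behavior, just closer than plain IEEE 754; all names are invented):

```cpp
#include <cmath>

// Flush denormals to (signed) zero, as the PS2 FPU does.
static float ps2_flush(float v) {
    if (std::fpclassify(v) == FP_SUBNORMAL)
        return std::signbit(v) ? -0.0f : 0.0f;
    return v;
}

// Clamp to the largest finite float instead of overflowing to infinity.
static float ps2_clamp(double r) {
    const float MAX = 3.4028235e38f; // FLT_MAX
    if (r > MAX)  return MAX;
    if (r < -MAX) return -MAX;
    return (float)r;
}

// A stricter stand-in for FPU_ADD_S: compute in double, then clamp.
static float fpu_add_s(float a, float b) {
    return ps2_clamp((double)ps2_flush(a) + (double)ps2_flush(b));
}
```

With IEEE arithmetic, `3e38f + 3e38f` is infinity; here it clamps to FLT_MAX, which is the divergence games like the ones discussed below depend on.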

kmeisthax12 hours ago

PS2 floating-point behavior is one of the few hardware misfeatures so awful it affects emulation of competing systems[0]. The game True Crime: New York City is so dependent on PS2 floating point that the GameCube port installs an error handler just to make 1/0 = 0. Which isn't even PS2 hardware behavior. But it is "close enough" that the game does not immediately throw you into the void every time you step on a physics object.

[0] https://dolphin-emu.org/blog/2021/11/13/dolphin-progress-rep...

realusername12 hours ago

Probably the same way as the emulators themselves, with a list of titles needing the real PS2 floating point.

A lot of titles don't actually need it and work fine with standard IEEE floating point.

ChrisMarshallNY13 hours ago

This sounds very cool, but I can practically hear the IP lawyers sharpening their buzz-axes...

zerocrates6 hours ago

They haven't been all that aggressive against the decompile/recompile projects, interestingly. They're sometimes/often set up so you need the original to grab assets etc., but that code is copyrighted too and I'd have to imagine a decompile that purposely compiles to an identical binary would be a derivative work.

My best guess is that for them it's not worth the hassle or any possibility of a negative result in court as long as people have to jump through some hoops by providing an original, and for the projects that don't do that, you have very straightforward easy infringement cases without even getting into the decomp stuff. Though really even ROMs seem to be tacitly tolerated to some extent lately. Maybe there's an attitude that keeping people involved with the franchise is worth it, again so long as it doesn't become too easy.

chippiewill10 hours ago

Sony have actually been fairly chill about emulators etc. so I'd be surprised if lawyers got involved here.

They actually used an open source Playstation emulator when they released the "Playstation Classic" in 2018.

karel-3d6 hours ago

Sony is not Nintendo.

doublerabbit13 hours ago

Or as in cartoons, IP lawyers with dollar symbols in their eyes.

denkmoon13 hours ago

Only in terms of their own salaries and bonuses. For all their litigiousness over emulation I can't imagine it really makes them money.

dylan60412 hours ago

Do IP cases ever make anyone other than outside counsel money?

hn_user_987612 hours ago

This is amazing for preservation. Being able to run these classics on modern hardware with native recompilation is a huge step forward.

AtlasBarfed7 hours ago

The N64, as I understand it, has some self-rewriting code that makes this hard

whywhywhywhy5 hours ago

N64 is the one leading the way on recompilation: Mario 64, Perfect Dark, Zelda, Mario Kart, etc. have all been done

imtringued14 hours ago

As far as I know, static recompilation is thwarted by self-modifying code (primarily JITs) and by the ability to jump to arbitrary code locations at runtime.

The latter means that even in the absence of a JIT, you would need to achieve 100% code coverage (akin to unit testing or fuzzing) to perform static recompilation, otherwise you need to compile code at runtime at which point you're back to state of the art emulation with a JIT. The only real downside of JITs is the added latency similar to the lag induced by shader compilation, but this could be addressed by having a smart code cache instead. That code cache realistically only needs to store a trace of potential starting locations, then the JIT can compile the code before starting the game.
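The coverage problem described above usually shows up at indirect jumps, and one common mitigation is a dispatch table: statically recompiled functions are registered by guest address, and any jump target the static pass never discovered falls back to interpretation (or a JIT). A minimal sketch, with all names invented:

```cpp
#include <cstdint>
#include <unordered_map>

struct Cpu { uint32_t pc = 0; bool fell_back = false; };
using RecompiledFn = void (*)(Cpu&);

// Guest address -> ahead-of-time recompiled function.
std::unordered_map<uint32_t, RecompiledFn> g_recompiled;

// Slow path for targets static analysis never covered (stubbed here).
void interpret_from(Cpu& cpu, uint32_t pc) { cpu.pc = pc; cpu.fell_back = true; }

void dispatch(Cpu& cpu, uint32_t target) {
    auto it = g_recompiled.find(target);
    if (it != g_recompiled.end())
        it->second(cpu);                 // fast path: statically recompiled
    else
        interpret_from(cpu, target);     // runtime fallback
}
```

If the fallback is never hit in practice, you effectively have a static recompile; if it is hit often, you have converged back to a JIT-style emulator, which is the point being made above.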

bluGill13 hours ago

Yes, but in practice that isn't a problem. People do write self-modifying code and jump to random places today, but it is much less common than in the past. It is safe to say that most games are developed and run on the developer's PC and then ported to the target system. If developers know the target system they will make sure it works there from day one, but most prefer to run their latest changes on their current system over sending them to the target. If you really need to take advantage of the hardware you can't do this, but most games don't.

Many games are written in a high-level language (like C...) which doesn't give you easy access to self-modifying code (even higher-level languages like Python do, but they are not compiled and so not part of this discussion). Likewise, jumping to arbitrary code is limited to function calls for most programmers.

Many games just run on a game engine, and the game engine is something we can port or rewrite to other systems and then enable running the game.

Be careful of the above: most games don't become popular. It is likely the "big ticket games" people are most interested in emulating had the development budget and need to take advantage of the hardware in the hard ways. That is the small minority of exceptions are the ones we care about the most.

bri3d12 hours ago

This is PS2 emulation, where most engines were still bespoke and every hack in the book was still on the table.

edflsafoiewq3 hours ago

I believe the main interest in recompilation is in using the recompiled source code as a base for modifications.

Otherwise, yeah, a normal emulator JIT basically points a recompiler at each jump target encountered at runtime, which avoids the static analysis problem. AFAIK translating small basic blocks and not the largest reachable set is actually desirable since you want frequent "stopping points" to support pausing, evaluating interrupts, save states, that kind of stuff, which you'd normally lose with a static recompiler.

bri3d12 hours ago

JIT isn't _that_ common in games (although it is certainly present in some, even from the PS2 era), but self-modifying or even self-referencing executables were a quite common memory-saving trick that lingered into the PS2 era - binaries that swapped different parts in and out from disk were widespread, and some developers kept using really old-school space-saving tricks like reusing partial functions as code gadgets, although this was dying out by the PS2 era.

Emulation actually got easier after around the PS2 era because hardware got a little closer to commodity and console makers realized they would need to emulate their own consoles in the future and banned things like self-modifying code as policy (AFAIK, the PowerPC code segment on both PS3 and Xbox 360 is mapped read only; although I think SPE code could technically self-modify I'm not sure this was widespread)

The fundamental challenges in this style of recompilation are mostly offset jump tables and virtual dispatch / function pointer passing; this is usually handled with some kind of static analysis fixup pass to deal with jump tables and some kind of function boundary detection + symbol table to deal with virtual dispatch.

duskwuff13 hours ago

How many PS2-era games used JIT? I would be surprised if there were many of them - most games for the console were released between 2000 and 2006. JIT was still considered a fairly advanced and uncommon technology at the time.

whizzter4 hours ago

I'd say practically none; we were quite memory-starved most of the time, and even regular scripting engines were a hard sell at times (perhaps more due to GC than to interpretation performance).

Games on PS2 were C or C++ with some VU code (asm or some specialized HLL) for most parts, often with Lua (due to low memory usage) or similar scripting added for minor parts, with bindings to native C/C++ functions.

"Normal" self-modifying code went out of favour a few years earlier, in the early-to-mid 90s. It was perhaps most useful on CPUs like the 6502 or x86 that had few registers, so patching constants directly into inner loops was worthwhile (the PS2's MIPS CPU has plenty of registers, so no need for that).

However, by the mid/late 90s CPUs like the PPro already added penalties for self-modifying code, so it was already frowned upon; also, PS2-era games often ran with PC versions developed side by side, so you didn't want more platform dependencies than needed.

Most PS2 performance tuning we did was around resources/memory, the VUs, and DMA chains.

Self-modifying code might've been used for copy protection, but that's another issue.

bri3d12 hours ago

A lot of PS2-era games unfortunately used various self-modifying executable tricks to swap code in and out of memory; Naughty Dog games are notorious for this. This got easier in the Xbox 360 and PS3 era where the vendors started banning self-modifying code as a matter of policy, probably because they recognized that they would need to emulate their own consoles in the future.

The PS2 is one of the most deeply cursed game console architectures (VU1 -> GS pipeline, VU1 microcode, use of the PS1 processor as IOP, etc) so it will be interesting to see how far this gets.

duskwuff12 hours ago

Ah - so, not full-on runtime code generation, just runtime loading (with some associated code-mangling operations like applying relocations). That seems considerably more manageable than what I was thinking at first.

bri3d10 hours ago

Yeah, at least in the case of most Naughty Dog games the main ELF binary is in itself a little binary format loader that fixes up and relocates proprietary binaries (compiled GOAL LISP) as they are streamed in by the IOP. It would probably be a bit pointless to recompile Naughty Dog games this way anyway though; since the GOAL compiler didn’t do a lot of optimization, the original code can be recovered fairly effectively (OpenGOAL) and recompiled from that source.

brcmthrowaway10 hours ago

What's the best PS2 game of all time?

keyle11 hours ago

Side note, are we at the level where tech blogs and news sites can't even write <a href> links properly?

2 out of 4 links in the article are messed up, that's mind boggling... On a tech blog!

Is that how far deep we've sunk to assert it wasn't written by AI?

simondotau10 hours ago

A more accurate version of the famous idiom:

Those who can, do (and sometimes become teachers when they get older). Those who can’t become journalists.