Laevatein’s Campfire Tales: The PS4 Announcement

Nothing makes bigger waves in this industry than new console announcements, and Sony sure dropped a big one with the PS4. Announcing it at their own conference last week put Sony in the perfect position to wow the general gaming audience: they could control all the spectacle and hype, and they didn’t have to compete with another console launch. In fact, the PS4 announcement didn’t really have any notable news to compete with, and I’m pretty sure just about everyone has some opinion on it, myself included. At first glance, the PS4 looks like a pretty slick machine, but that’s what this industry is good at: style over substance. That’s not to say there isn’t any substance here, but whatever substance there is, Sony did a pretty good job of hiding it; it’s hard to imagine they wanted you to see it.

PS4 controller

I want to get out of the way right now that “supercharged PC architecture” is a meaningless buzzword, much like “blast processing.” It means absolutely nothing; architecture isn’t something you can “supercharge.” Have you ever heard of anything implementing “supercharged electric circuits”?

Terrible buzzwords aside, there’s some pretty interesting tech in the PS4. I can’t say I know much about the graphics unit the PS4 will be using, but you don’t see CPUs like the PS4’s too often. The PS4’s CPU has eight cores and is an x86 chip (specifically, an x86-64 AMD chip). Eight cores is rather impressive, but like clock speed, core count isn’t the be-all, end-all for CPUs. There are a number of factors in CPU performance, though Sony did mention they are using a “next-generation” chip, so we’ll see. We’ll also have to see if developers can take advantage of all eight cores.
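Since I brought up whether developers can actually use all eight cores, here’s a rough idea of what that looks like in practice. This is a minimal, generic C++ sketch, nothing to do with Sony’s actual development kit, and the workload is a made-up stand-in: it asks the system how many hardware threads it has and hands each one a slice of an array to sum.

```cpp
// Generic sketch of spreading work across CPU cores (not PS4 SDK code).
#include <cstddef>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    // On an eight-core chip this would typically report 8.
    unsigned cores = std::thread::hardware_concurrency();
    if (cores == 0) cores = 1;  // fallback when the count isn't known

    const std::size_t n = 8000000;  // stand-in workload: sum 8M floats
    std::vector<float> data(n, 1.0f);
    std::vector<double> partial(cores, 0.0);
    std::vector<std::thread> workers;

    // Each core gets its own slice of the array, so no locking is needed.
    for (unsigned c = 0; c < cores; ++c) {
        workers.emplace_back([&, c] {
            const std::size_t begin = n * c / cores;
            const std::size_t end   = n * (c + 1) / cores;
            partial[c] = std::accumulate(data.begin() + begin,
                                         data.begin() + end, 0.0);
        });
    }
    for (auto& w : workers) w.join();

    const double total = std::accumulate(partial.begin(), partial.end(), 0.0);
    std::cout << "Summed " << n << " floats on " << cores
              << " threads: " << total << "\n";
}
```

The catch is that most game code doesn’t split this cleanly across threads, which is exactly why “we’ll see if developers can take advantage of all eight cores” is worth saying.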

The x86 bit is by far the most important. What does it mean? Simply that the PS4’s processor will be a lot more like a standard PC’s. That should make developing for the PS4 much easier than developing for the PS3 allegedly was, and perhaps even easier than developing for most other consoles. As a result, we likely won’t see very many “inferior PS4 ports,” if any at all.

The other important bit is the 8 GB of GDDR5 RAM. 8 GB is a lot of RAM, about as much as the average gaming PC has. GDDR5 is an unusual choice, though, as today it’s used almost exclusively in graphics cards. In the PS4’s case, the GDDR5 RAM is unified memory, so any part of the system can use it. I can’t say I’ve ever heard of GDDR5 used anywhere outside of GPUs, since it’s tuned for bandwidth rather than the low latency general-purpose processing prefers, but it does have a number of advantages over DDR3. On top of that, Sony seems to be placing the memory as close to the processing units as possible, which should speed things up in general by shortening the distance signals have to travel.
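To give a concrete sense of what unified memory buys developers, here’s a purely conceptual C++ sketch, again not tied to any real console or graphics API. On a typical split-memory PC, an asset gets built in system RAM and then copied into the graphics card’s own memory before the GPU can touch it; the second buffer below just simulates that pool. With one unified pool of GDDR5, that extra allocation and copy simply wouldn’t need to exist, because the CPU and GPU would be reading the same memory.

```cpp
// Conceptual sketch of the extra hop unified memory removes.
// The "simulated_vram" buffer only stands in for a discrete card's memory;
// no real graphics API is used here.
#include <chrono>
#include <cstddef>
#include <cstring>
#include <iostream>
#include <vector>

int main() {
    const std::size_t bytes = 256u * 1024u * 1024u;  // say, a 256 MB asset

    // Split-memory model: the asset lives in system RAM first...
    std::vector<unsigned char> system_ram(bytes, 0x42);
    // ...and must be duplicated into "video memory" before the GPU sees it
    // (in real code this would be a glBufferData-style upload).
    std::vector<unsigned char> simulated_vram(bytes);

    const auto t0 = std::chrono::steady_clock::now();
    std::memcpy(simulated_vram.data(), system_ram.data(), bytes);
    const auto t1 = std::chrono::steady_clock::now();

    std::cout << "Simulated upload of " << bytes / (1024 * 1024) << " MB took "
              << std::chrono::duration<double, std::milli>(t1 - t0).count()
              << " ms\n";

    // Unified-memory model: one allocation, zero copies -- the GPU would read
    // system_ram directly, so the upload above disappears entirely.
}
```

Skipping that kind of round trip (and the duplicate allocation) is the unglamorous but very real benefit of putting everything in one pool, on top of whatever bandwidth GDDR5 brings.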

Killzone 4

8 GB of unified RAM should not be used to support a “consoles can do more with less” argument, however. While it’s true a PC uses more background resources than a console does, I imagine the PS4 and a PC will have comparable amounts of memory overhead, as the PS4 now has a number of new features that need to run in the background. My guess is games will only get to use 4-6 GB of it at most.

There could be more details to come, I suppose, though I can easily see many developers going wild over what we’ve heard already, since they can stick with what they know from coding for x86 systems. That still leaves me with a sense of disappointment. Though consoles existing solely to play games are very much a thing of the past, it’s a little sad to see consoles moving further and further from their origins and turning into systems that are more like media boxes or, well, PCs.

At the same time, many of the social features announced (such as watching your friends play games in real time, or the ability to take control of their games) seem like attempts to replicate the feeling of the local multiplayer of old. Of course, all of those features combined can hardly replicate the feeling of actual, local multiplayer. It’s easy to dismiss them as meaningless (and I’m saying that as someone who is very much not a fan of social gaming), but in some ways, the PS4’s social features seem more relevant to actual gaming than social features usually are.

I want to say I’m excited for the games, I really do. But after Sony’s bait and switch with Killzone 2 at E3 2005, I’m almost entirely convinced that most of the games shown were pre-rendered in some way, or, if they were rendered by an engine, that what we saw is certainly not indicative of the final product. I say almost because a playable demo of Killzone 4 was shown a day later, and it’d be hard to argue that what we saw of it wasn’t gameplay; for something like Capcom’s game, though, I’m going to say it’s not gameplay, even if it was rendered by their engine. Still, it wouldn’t be a stretch to expect The Witcher 2-level production values across the board.

Most importantly, I didn’t like the general vibe of the conference. There was this attitude that “bigger = better,” that we need better hardware to make better games. That we need higher production values to make better games. That we need no restrictions to make better games. That mindset couldn’t be further from the truth, quite frankly. Granted, everyone there may have just been saying that to build the required spectacle, but it’d be very scary if they sincerely believed what they were saying. Even if they didn’t, it’s sad to know that we’ve gotten to the point where games need to become spectacles.

Infamous Second Son

No one at the conference convinced me that the actually important things, like game design or mechanics, need the better hardware and social features. In fact, I wasn’t particularly convinced that developers could improve upon secondary concerns such as physics or AI (then again, in an industry where no game has managed to beat a game from 2005 in AI, I’m not expecting much).

On the whole, I think it’s great that Sony is making what amounts to a PC, but considering what they showed us at the conference, I’m not seeing where the game experiences stand to improve. I suppose one could argue that better hardware helps developers make more immersive games, but I’ve always found strong game design and mechanics far more immersive than prettier graphics. I guess I’m really just arguing against spectacle here, and if there’s one thing I’ve learned about this industry, it’s that spectacle is a dominant force, one that will likely never go away.

Laevatein

A mad scientist who's so cool!

One Comment:

  1. I couldn’t care less about the fancy-schmancy upgrades. It’s a new console, and it’s supposed to be better than its predecessor. What has, and continues to, influence which consoles I buy each generation is its library. Why do you think I have not, and will not, buy an Xbox…ever! It’s Nintendo or Sony for me.
