PS4 is revealed

Status
Not open for further replies.
Let me see, think I have the proper response down here somewhere... ah here it is...

Fuck you with a broomstick.
Is that a reaction to paraphrasing what you already said, or do you feel misinterpreted?

Anyway, enjoy your railroad tunnel movies, on whatever platform you choose. You're welcome :thumbsup:.
 
Actually the C++ error was pretty widespread for both the Firefall beta and the LoL build at the time. A minority of players, sure, but it wasn't exactly 1 in a million.
I'd point you to the first year of Xbox 360 and PS3 release issues, but it's arguing a moot point. Your issues in the past are as abundant as console owners have had with their own systems. There's literally no major issue here.
 
I'd point you to the first year of Xbox 360 and PS3 release issues, but it's arguing a moot point. Your issues in the past are as abundant as console owners have had with their own systems. There's literally no major issue here.
Except possibly that Microsoft manages to suck no matter the platform.
 
I'd point you to the first year of Xbox 360 and PS3 release issues, but it's arguing a moot point. Your issues in the past are as abundant as console owners have had with their own systems. There's literally no major issue here.
I don't think anything can match the RROD issue that plagued the 360 through multiple iterations. The failure rate was nigh criminal.
 
Yeah, the fact that it even happened on the Elites and beyond is baffling to me. When my Elite died I was livid.
It's because the Elite was still basically the same system, just a new production run. The actual physical changes necessary to combat the real problem didn't happen until they released the S and rebuilt the motherboard around a new spec.
 
Really, it comes down to this. Consoles are like a BMW. Very nice to drive, dependable, will get you where you need to go and do it with comfort. The PC is like a custom Italian supercar. Sure, it requires more maintenance, it requires you to fine-tune gear ratios and weight distributions, but it appeals to people that want to get the most out of their machine, and tear around a track at 200 mph with the roar of an engine.

Does that make the supercar drivers better? Well, clearly yes, but we try not to make a big deal about it. We all share the same road.
 
Really, it comes down to this. Consoles are like a BMW. Very nice to drive, dependable, will get you where you need to go and do it with comfort. The PC is like a custom Italian supercar. Sure, it requires more maintenance, it requires you to fine-tune gear ratios and weight distributions, but it appeals to people that want to get the most out of their machine, and tear around a track at 200 mph with the roar of an engine.

Does that make the supercar drivers better? Well, clearly yes, but we try not to make a big deal about it. We all share the same road.
And yet those Italian supercars keep getting wrapped around telephone poles.

Edit: Funny story. I went on Google images and looked up "Italian car accidents." The first image was Snooki in a neckbrace. I guess I misspelled "car accident" as "train wreck."
 

figmentPez

Staff member
Most can't use more than 4GB because they're 32-bit executables. Their address space hits a wall at that point. There are tricks to use more than that amount of memory in a 32-bit process (PAE and AWE, from the server days before the 64-bit transition), but those never hit consumer-level products. And even though 90% of us have 64-bit processors, if you're on XP (XP-64 is a bastard that shall not be spoken of) you can't use 64-bit binaries. Thus, everything is still compiled for 32-bit, and the 64-bit people can run those too.

I will celebrate when I no longer see 32-bit executables in my task manager.
Could this generation of consoles be what pushes the final shift to 64-bit binaries for games? If the PS4 and the next Xbox both have 8+ GB of RAM and developers are making games that use all of that RAM, then it might end up being too much extra work to re-jigger the code for 32-bit XP (either that or PCs will end up with a port of the WiiU port of the game *shudder*)
 
I'm not finding any concrete pictures of the PS4 online, only rumored mockups and straight-up forgeries. People are even showing black Xboxes and photoshopping PS logos onto them, trying to pass them off as "official" pictures.

I'd really like to know where the legit pictures are.

According to Engadget, not even Sony has settled on the final design yet.
 
If I were Sony I'd shape it like a large roast ham or a miniature telephone pole or something else random like that, just to mess with people.
 
Could this generation of consoles be what pushes the final shift to 64-bit binaries for games? If the PS4 and the next Xbox both have 8+ GB of RAM and developers are making games that use all of that RAM, then it might end up being too much extra work to re-jigger the code for 32-bit XP (either that or PCs will end up with a port of the WiiU port of the game *shudder*)
If you're coding sanely, there should be no difference NOW. Only if you're violating tons of best practices is it more than a re-compile. But hey, if it's what causes it to happen, then WOOT!
 

figmentPez

Staff member
If you're coding sanely, there should be no difference NOW. Only if you're violating tons of best practices is it more than a re-compile. But hey, if it's what causes it to happen, then WOOT!
Is it possible to re-compile AI routines and map data and other stuff like that to use up less memory? I don't exactly know what gets kept in main system memory for a game. I know textures are a memory hog, but they're kept in the video RAM. What is the advantage to having more main system RAM, and can that be easily scaled down to fit in the 2GB/2GB limit that PCs have? (Without special considerations, I'm pretty sure that the 4GB memory limit on 32-bit executables is actually broken down to 2GB of RAM and 2GB of swap file.)
 
Is it possible to re-compile AI routines and map data and other stuff like that to use up less memory? I don't exactly know what gets kept in main system memory for a game. I know textures are a memory hog, but they're kept in the video RAM. What is the advantage to having more main system RAM, and can that be easily scaled down to fit in the 2GB/2GB limit that PCs have? (Without special considerations, I'm pretty sure that the 4GB memory limit on 32-bit executables is actually broken down to 2GB of RAM and 2GB of swap file.)
Your "pretty sure" is unfortunately wrong. In Windows executables, it's actually just a binary flag that says "use the last bit of memory or not," because generally everything that's ultra-high in the memory space is where the memory-mapped addresses for accessing hardware devices live. So you start getting into weird overlaps if you start using too much of that memory space in 32-bit applications. Also, some bit-shift operations behave differently, again if you were being lazy and assuming that the highest bit is 0, and thus not bothering to tell the difference between an arithmetic and a logical bit shift.

Swap files are something completely different. If a process creates a swap file for itself, fine, but it doesn't have anything to do with the memory space. The OS itself can also "swap" some of the memory of idle processes to disk, but again, this doesn't impact the address space barriers we're talking about.
 

figmentPez

Staff member
Your "pretty sure" is unfortunately wrong. In Windows executables, it's actually just a binary flag that says "use the last bit of memory or not," because generally everything that's ultra-high in the memory space is where the memory-mapped addresses for accessing hardware devices live. So you start getting into weird overlaps if you start using too much of that memory space in 32-bit applications. Also, some bit-shift operations behave differently, again if you were being lazy and assuming that the highest bit is 0, and thus not bothering to tell the difference between an arithmetic and a logical bit shift.

Swap files are something completely different. If a process creates a swap file for itself, fine, but it doesn't have anything to do with the memory space. The OS itself can also "swap" some of the memory of idle processes to disk, but again, this doesn't impact the address space barriers we're talking about.
I was going by what I remembered from this article from years ago: A Messy Transition: Practical Problems With 32bit Addressing In Windows a piece about how Supreme Commander was a game that was already crashing due to reaching the 2GB memory limit in 32-bit applications. My memory was completely wrong, it's not a 2GB RAM / 2GB swap file division. It is, according to the article, that "in designing Windows Microsoft opted to split up the virtual address space of an application in half; 2GB goes to Windows (kernel space) and 2GB goes to the application (user space). Under normal circumstances this 2GB of space is all a 32bit application has to work with, this is the 2GB barrier and as we'll see is the cause of the problems with Supreme Commander." This was an article from 2007 and I have no idea what sorts of work-arounds there have been since that time.

Either way, there is a limit that 32-bit programs hit, and have been hitting for some time now, and I'm really unclear on what type of programming adjustments must be made in order to fit a game that was originally designed to have well over 4GB of RAM containing AI data, map data, physics and what-not into less than 4GB for a 32-bit limited system. I realize that the executable file just needs to be recompiled, but what happens to all the resources that the game is using? Do enemies get dumber with less memory space to process AI? Do maps have less detail? Is everything just slowed down with tons of pop-up because everything has to be swapped in and out all the time? At what point does it become impractical to make a 32-bit version?
 
Well, at least Sony has their heads NOT up their ass about one thing: used games.

Also it looks like their copyright protection will focus on how long it takes to load a game - probably to deal with disc swapping from mod chips, I guess. That's still a thing, right? That's what I did with my ps2 anyway...
I haven't heard of modding like that since XBox/PS2. At least, not in the terms of opening it up and soldering a chip in there like I did with my old PS1.
 

GasBandit

Staff member
I haven't heard of modding like that since XBox/PS2. At least, not in the terms of opening it up and soldering a chip in there like I did with my old PS1.
On my PS2, I used a solderless mod chip that basically was put in the line between the power button assembly and DVD reader and the motherboard. It let you boot up with a boot disk, then hot swap out to a DVD-R. Booting took longer naturally, but hey.

So, if that's not the case now, I dunno what the load time has to do with piracy.
 
On my PS2, I used a solderless mod chip that basically was put in the line between the power button assembly and DVD reader and the motherboard. It let you boot up with a boot disk, then hot swap out to a DVD-R. Booting took longer naturally, but hey.

So, if that's not the case now, I dunno what the load time has to do with piracy.
Yeah, the one we had was the "ghost chip" that would get bypassed by games with anti-piracy protection like FF8 (the game wouldn't load if it detected the regular chip).
 
I think if it was real it would be all over every gaming site.

There are tons of mock ups out there though.
 
I was going by what I remembered from this article from years ago: A Messy Transition: Practical Problems With 32bit Addressing In Windows a piece about how Supreme Commander was a game that was already crashing due to reaching the 2GB memory limit in 32-bit applications. My memory was completely wrong, it's not a 2GB RAM / 2GB swap file division. It is, according to the article, that "in designing Windows Microsoft opted to split up the virtual address space of an application in half; 2GB goes to Windows (kernel space) and 2GB goes to the application (user space). Under normal circumstances this 2GB of space is all a 32bit application has to work with, this is the 2GB barrier and as we'll see is the cause of the problems with Supreme Commander." This was an article from 2007 and I have no idea what sorts of work-arounds there have been since that time.
And that's essentially correct, but incomplete. You can set a flag in your compiler options, and it becomes a flag in the output .exe that says "I want to use 3.xGB (I don't remember the exact amount) and I know what I'm doing!" So the "kernel space" (not exactly) is smaller, but still, all within 4GB of space. By default, yes, split into two 2GB chunks.
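The flag being described is MSVC's /LARGEADDRESSAWARE linker option, which marks the .exe as safe to receive addresses above 2GB (up to roughly 3GB on a 32-bit Windows booted with the /3GB option, and the full 4GB when the 32-bit exe runs on 64-bit Windows). A sketch of the toolchain commands involved:

```shell
# Set the bit at link time...
cl /c game.c
link /LARGEADDRESSAWARE game.obj

# ...or flip it on an already-built binary:
editbin /LARGEADDRESSAWARE:YES game.exe

# Verify -- dumpbin shows it in the file header characteristics,
# reporting something like "Application can handle large (>2GB) addresses":
dumpbin /headers game.exe | findstr /C:"large"
```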
 
So, if that's not the case now, I dunno what the load time has to do with piracy.
If you're playing legitimately, you're suffering through 17 pre-load videos and warnings against piracy. If you're using illegal means to avoid those videos, you're pirating and must be punished. :p
 
Could this generation of consoles be what pushes the final shift to 64-bit binaries for games? If the PS4 and the next Xbox both have 8+ GB of RAM and developers are making games that use all of that RAM, then it might end up being too much extra work to re-jigger the code for 32-bit XP (either that or PCs will end up with a port of the WiiU port of the game *shudder*)
Consoles are special-purpose machines, and unlike general-purpose machines they probably would rather give the programmers a few more specialized instructions to handle more memory than go to a 64-bit architecture, which really doesn't buy them much.

Chances are good a few GB are set aside for the graphics chipset, and it's possible that each core in the machine has some dedicated memory beyond the usual caches. It wouldn't surprise me if each core still only had a 32-bit address space.

The graphics processor likely has 128-bit-wide or larger access to the RAM, and like general-purpose computers, they probably give specialized access to the GPU for parallel processing in physics and graphics work, where the regular processors would be at a disadvantage.

A real 64-bit data bus really won't buy a console much, and would cost more. Better, for them, to simply make the existing 32-bit bus faster, keep the instruction set small, and allow for a few instructions that deal with the larger memory space. Since memory size is fixed, it won't really be a problem.

But I'm still thinking each of the "8 cores" has dedicated memory, with a shared memory pool they can all access. This way each core would only see 32 bits of address space: some their own, some shared, and some controlling hardware.

Pure speculation, though.
 