PoetPhilosopher
Veteran Member
Video games these days look pretty good, don't get me wrong. But a PlayStation 4 is 50-750 times as powerful as the PS1, and its games won't look 50-750 times as good, especially if the PS1 is hooked up with something better than a composite cable as its TV connection, which it supports.
Here are the reasons I've found for why:
PlayStation 4 games are coded in easier languages and with less hand-crafted tools, costing 2-4 times the CPU performance with no real benefit to the end user.
Programmable shaders are a newer invention that gives a little more control over graphics, but they can easily cost well over 10x the performance, sometimes 50x in certain areas of the graphics chip (though not necessarily in every last area).
Screen resolution is now about 30 times as high in terms of pixel count, which, though it might make graphics look something like twice as good in some scenarios, costs pretty much 30 times the performance on the graphics chip (rough math in the sketch after this list).
Consoles now run bigger, more full-featured operating systems, which consume extra processor power even when (pretty much) idle and need RAM held back for them.
If you make the resolution 30 times as high, you typically want to include higher-resolution textures too, or it still won't look good; it may even look worse. As an example, textures two bumps up in detail tend to cost around 16 times the RAM (see the sketch below). And this still ignores the fact that it's something you more or less have to do, rather than something that actively improves the picture. The real way to improve things, in my opinion, is better art direction.
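To put rough numbers on the resolution and texture points above, here's a minimal back-of-the-envelope sketch (the specific resolutions, texture size, and bytes per pixel are illustrative assumptions on my part, not exact figures for either console):

```python
# Pixel-count comparison (assumed resolutions, for illustration only):
ps1_pixels = 320 * 240          # a common PS1 render resolution
ps4_pixels = 1920 * 1080        # 1080p, a typical PS4 output
print(ps4_pixels / ps1_pixels)  # 27.0 -- roughly 30x the pixels per frame

# Texture memory: each "bump" in detail doubles both width and height,
# so one bump costs 4x the RAM and two bumps cost 4 * 4 = 16x.
base_bytes = 256 * 256 * 4                # hypothetical 256x256 texture, 4 bytes per pixel
bumped_bytes = (256 * 4) * (256 * 4) * 4  # the same texture two bumps up, at 1024x1024
print(bumped_bytes / base_bytes)          # 16.0
```

Filling roughly 27 times the pixels every frame eats up most of a raw performance gap before any of the extra power goes toward making each individual pixel look better.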
My post in layman's terms: there comes a point where a more powerful computer only noticeably solves your problems when you have 30+ times the performance and can implement a better method. But even then, human creativity, human talent, and things done by hand remain a bottleneck.