CPU vs. video card

PoetPhilosopher

Veteran Member
I'm continuing my trend of making topics on technology considering it's a subject I like and it got some positive response.

Video cards are very, very important for gamers. They make a big difference. However, if forced to choose between a low-end video card and a low-end CPU for gaming, you may still want to accept the low-end video card, despite what the world says, and choose the better CPU instead.

I have tried three PCs where the CPU met a game's minimum system requirements but not its recommended requirements, and the result was occasional microstuttering in games and longer in-game loading times.

Whereas the only side effect of a worse video card was, quite often, having to turn down the graphics settings to maintain a stable framerate. That isn't the same as microstuttering: the framerate may be lower overall, but unless it drops too low, there is no microstutter-like effect making the game feel unsmooth the way a weak CPU does.
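
To put rough numbers on the difference, here is a small Python sketch (the frame times are illustrative only, not measurements from those three PCs):

    # Illustrative frame times, in milliseconds, over 60 frames:
    # a CPU-limited machine that mostly hits 16.7 ms but spikes now and then
    # feels "stuttery", while a GPU-limited machine that takes 25 ms every
    # frame just feels like a steady 40 FPS.
    cpu_limited = [16.7] * 57 + [50.0] * 3   # occasional spikes = microstutter
    gpu_limited = [25.0] * 60                # uniformly slower, but smooth

    def summarize(frame_times_ms):
        avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
        worst_fps = 1000 / max(frame_times_ms)   # worst single frame
        return round(avg_fps, 1), round(worst_fps, 1)

    print(summarize(cpu_limited))   # ~(54.5, 20.0): decent average, ugly hitches
    print(summarize(gpu_limited))   # (40.0, 40.0): lower average, but steady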

I have also seen a YouTube video regarding the latest PC Tomb Raider game that shows the same. I'm trying to locate the video again to post as stronger evidence.
 

dybmh

דניאל יוסף בן מאיר הירש
So a high-end video card will need a fast CPU in order to be 'smooth'?

That makes sense.
 

PoetPhilosopher

Veteran Member
So a high-end video card will need a fast CPU in order to be 'smooth'?

That makes sense.

My points were a bit broader than that even, but yeah, pretty much.

I further stated that there is a certain ceiling of CPU power you have to reach for modern games, one that depends only mildly on the video card you have. This is where my points push the envelope rather than repeating conventional wisdom.
 

Bob the Unbeliever

Well-Known Member
I'm continuing my trend of making topics on technology considering it's a subject I like and it got some positive response.

Video cards are very, very important for gamers. They make a big difference. However, if forced to choose between a low-end video card and a low-end CPU for gaming, you may still want to accept the low-end video card, despite what the world says, and choose the better CPU instead.

I have tried three PCs where the CPU met a game's minimum system requirements but not its recommended requirements, and the result was occasional microstuttering in games and longer in-game loading times.

Whereas the only side effect of a worse video card was, quite often, having to turn down the graphics settings to maintain a stable framerate. That isn't the same as microstuttering: the framerate may be lower overall, but unless it drops too low, there is no microstutter-like effect making the game feel unsmooth the way a weak CPU does.

I have also seen a YouTube video regarding the latest PC Tomb Raider game that shows the same. I'm trying to locate the video again to post as stronger evidence.

Interesting. I see I am so behind the latest information-- the last box I built from scratch, was ... 8? years ago. That's like "dinosaur age" in computing terms. :)

I did about a month of research before buying my Alienware laptop, though-- I wanted a "real" GPU, not an extension of Intel's CPU chips. I'm not a fan of Intel Graphics.

Ironically, you cannot purchase a CPU without a GPU built in-- so my laptop sports "dual" video architecture.... most amusing. It reminds me of a build I did, ages ago, that had a built-in GPU, but to which I added a graphics card.

On that older build, in contrast to my laptop, the BIOS was of the "either or" variety, and I had to disable the built-in GPU in BIOS.

My current machine (the aforementioned laptop) lets both be "enabled" but you select in the OS which one is to the fore.

This is actually useful: Windoze is "smart" enough to switch to the Intel GPU, if I mess up the drivers on the NVIDIA GPU. (I'd say 'card' but that's not exactly correct either). I know this is the case, as that actually happened once... :D

I rather like that feature... instead of being confronted with a blank screen, and some disturbing "beeps" I get a warning message on a working screen...

Idiotproof? Well... no... nothing is idiot proof... not even idiots are idiot-proof... ;)
 

ChristineM

"Be strong", I whispered to my coffee.
Premium Member
Go for the better CPU.

I am not a gamer, but I have been involved in computer graphics all my working life. Games tend to recommend high minimum frames-per-second rates, which I am sure must enhance the gaming experience; at least, the manufacturers of games and graphics cards tell us so.

My reason for recommending you go for the better CPU is this: the human eye cannot really see any difference over 18fps and simply cannot detect more than 60fps, although manufacturers will tell you otherwise.

Note that films are projected at 24fps.
 

PoetPhilosopher

Veteran Member
Go for the better CPU.

I am not a gamer, but I have been involved in computer graphics all my working life. Games tend to recommend high minimum frames-per-second rates, which I am sure must enhance the gaming experience; at least, the manufacturers of games and graphics cards tell us so.

My reason for recommending you go for the better CPU is this: the human eye cannot really see any difference over 18fps and simply cannot detect more than 60fps, although manufacturers will tell you otherwise.

Note that films are projected at 24fps.

I think we can tell the difference between 24FPS and 60FPS in a game. A slight difference. But that doesn't refute your points at all.

One reason to have 60FPS rather than 30FPS is that when the game inevitably dips in framerate (and it happens), it will drop to about 35-40FPS for a while rather than 18-20FPS.

Personally that's the main reason I aim for a device that can push 40-60FPS.
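
Rough arithmetic behind that (a sketch; the ~35% dip is just an assumed figure for illustration):

    # Frame time at a given FPS, and where a ~35% dip lands from each baseline.
    def frame_time_ms(fps):
        return 1000 / fps

    for base in (30, 60):
        dipped = base * 0.65   # assumed ~35% dip during heavy scenes
        print(base, round(frame_time_ms(base), 1), round(dipped), round(frame_time_ms(dipped), 1))

    # 30 FPS baseline: 33.3 ms per frame, dips to ~20 FPS (51.3 ms per frame)
    # 60 FPS baseline: 16.7 ms per frame, dips to ~39 FPS (25.6 ms per frame)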

There's a certain yin-yang to computer graphics and how it's done: waste in performance on one side, efficient methods on the other.
 

Bob the Unbeliever

Well-Known Member
Go for the better CPU.

I am not a gamer, but I have been involved in computer graphics all my working life. Games tend to recommend high minimum frames-per-second rates, which I am sure must enhance the gaming experience; at least, the manufacturers of games and graphics cards tell us so.

My reason for recommending you go for the better CPU is this: the human eye cannot really see any difference over 18fps and simply cannot detect more than 60fps, although manufacturers will tell you otherwise.

Note that films are projected at 24fps.

Agreed.

Not to be dismissed, either: Throughput of Data.

The bottleneck can also be a slow hard disk, or the type of channel the hard drive uses to communicate with the rest of the system.

I'm amazed at the difference between my SSD (2.5 form factor), running on a more traditional hard drive channel, and my M.2 drive.

The former is 1000GB, whereas the latter is 500GB, but both are Samsung Evo.

The M.2 is noticeably faster than the SSD on SATA. My machine is new enough that it has the latest iteration of SATA, on both the drive and the motherboard.

But, as I understand it, M.2 uses a dedicated channel direct to the CPU, specifically optimized for storage, whereas SATA is now considered "old school", or "mature".
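
If you want to see the difference for yourself, a quick-and-dirty sequential-read timer like this does the trick (a minimal sketch; the file path is a placeholder, and the OS cache will flatter the numbers on a second run):

    import time

    def read_throughput_mb_s(path, chunk_mb=64):
        # Crude sequential-read benchmark: read the file in big chunks, time it.
        chunk = chunk_mb * 1024 * 1024
        total = 0
        start = time.perf_counter()
        with open(path, "rb") as f:
            while data := f.read(chunk):
                total += len(data)
        return total / (1024 * 1024) / (time.perf_counter() - start)

    # e.g. compare a large file on the SATA SSD vs. one on the M.2 drive:
    # print(read_throughput_mb_s("D:/some_big_file.bin"))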

Ain't it fun?

As an aside, though? M.2 drives run significantly hotter than 2.5 SSD. Something to consider, when designing your system-- a small, silent fan blowing across your array of M.2's can extend their life.

Excess heat is the Destroyer Of All Things Electronic.

:D

"Heat, I christen thy name 'Godzilla'. A pox upon thy house."
 

PoetPhilosopher

Veteran Member
Agreed.

Not to be dismissed, either: Throughput of Data.

The bottleneck can also be a slow hard disk, or the type of channel the hard drive uses to communicate with the rest of the system.

I'm amazed at the difference between my SSD (2.5 form factor), running on a more traditional hard drive channel, and my M.2 drive.

The former is 1000GB, whereas the latter is 500GB, but both are Samsung Evo.

The M.2 is noticeably faster than the SSD on SATA. My machine is new enough that it has the latest iteration of SATA, on both the drive and the motherboard.

But, as I understand it, M.2 uses a dedicated channel direct to the CPU, specifically optimized for storage, whereas SATA is now considered "old school", or "mature".

Ain't it fun?

As an aside, though? M.2 drives run significantly hotter than 2.5 SSD. Something to consider, when designing your system-- a small, silent fan blowing across your array of M.2's can extend their life.

Excess heat is the Destroyer Of All Things Electronic.

:D

"Heat, I christen thy name 'Godzilla'. A pox upon thy house."

I can make things even more complicated. Oftentimes nongamers think they need a better CPU for their apps, when they'd benefit more from a faster hard drive.
 

PoetPhilosopher

Veteran Member
Correct me if I'm wrong, but I believe pricey 240-frames-per-second gaming monitors exist now, and I really think buying one is based on a certain fallacy. People see a slight difference between 30FPS and 60FPS in their games and are really happy about it, then think, "Imagine the difference 4x 60FPS would make!!!!11111oneone"

My 2c.
 

Bob the Unbeliever

Well-Known Member
Correct me if I'm wrong, but I believe pricey 240-frames-per-second gaming monitors exist now, and I really think buying one is based on a certain fallacy. People see a slight difference between 30FPS and 60FPS in their games and are really happy about it, then think, "Imagine the difference 4x 60FPS would make!!!!11111oneone"

My 2c.

I agree with you, here.

However, I will pass on some of the claims made in favor of faster monitors--- tearing and other artifacts.

Supposedly, in a slower monitor, you can see video artifacts created by lag, due to a slower frame rate.

I can see their argument, but I've never experienced tearing, even when watching Blu-ray movies.

I will point out that my TV died several years ago, and I've been using computer-grade display panels ever since. And even the worst panel I've used was capable of 60Hz minimum. The better panel (which I used most for TV) could do 70+.

My current panels are high-frame-rate capable, and I pretty much ignore it. I've never experienced the problems that are complained about elsewhere.

Of course... something else to consider. Back in the day? Video was uncompressed signal from the card to the device. There was very little electronics within the monitor.

Even the first digital panels, used DVI, which was digital to digital, but again, little or no compression.

Now? We use display port, or HDMI or other compressed signal methods. And so the panel has to process the compressed signal, expanding it to show the image.

I have watched the el-cheapo TV's in the shops, and yes-- you can see image tearing and other artifacts.
 

PoetPhilosopher

Veteran Member
I agree with you, here.

However, I will pass on some of the claims made in favor of faster monitors--- tearing and other artifacts.

Supposedly, in a slower monitor, you can see video artifacts created by lag, due to a slower frame rate.

I can see their argument, but I've never experienced tearing, even when watching Blu-ray movies.

I will point out that my TV died several years ago, and I've been using computer-grade display panels ever since. And even the worst panel I've used was capable of 60Hz minimum. The better panel (which I used most for TV) could do 70+.

My current panels are high-frame-rate capable, and I pretty much ignore it. I've never experienced the problems that are complained about elsewhere.

Of course... something else to consider. Back in the day? Video was uncompressed signal from the card to the device. There was very little electronics within the monitor.

Even the first digital panels, used DVI, which was digital to digital, but again, little or no compression.

Now? We use display port, or HDMI or other compressed signal methods. And so the panel has to process the compressed signal, expanding it to show the image.

I have watched the el-cheapo TV's in the shops, and yes-- you can see image tearing and other artifacts.

My respectful counterpoint, for the purpose of discussion, wouldn't be to disagree with you, but to also point out that the video card performance needed depends not only on the number of pixels on screen but also on the number of frames per second.

In the case of CPUs, pixels hardly matter, but frames per second make a difference as well.

My point: a game running at 240 frames per second takes 2.5-4 times the CPU performance, and a full 4x the video card performance, of a game running at 60 frames per second.
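
Back-of-the-envelope version of that (the 2.5-4x CPU range is my own rough estimate, not a measured figure; the video card side scales with the number of frames drawn):

    base_fps, target_fps = 60, 240
    gpu_factor = target_fps / base_fps               # 4.0: four times as many frames to draw
    cpu_factor_range = (2.5, 4.0)                    # assumed range; CPU work rarely scales perfectly with FPS
    frame_budget_ms = [round(1000 / f, 2) for f in (base_fps, target_fps)]
    print(gpu_factor, cpu_factor_range, frame_budget_ms)   # 4.0 (2.5, 4.0) [16.67, 4.17]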

Expect to spend thousands in hardware.
 

sun rise

The world is on fire
Premium Member
It's a complicated question because of course what the machine will be used for varies.

I use a computer at a place where I volunteer to transcode videos. One add-on that I have wants a fast GPU for that purpose. Since the graphics card is not up to snuff, the CPU is used and the process takes longer than it should.
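
For illustration only -- this uses ffmpeg as a stand-in, not the actual add-on I use -- the difference is basically whether the encode runs on the video card or falls back to the CPU:

    import subprocess

    def transcode(src, dst, use_gpu=True):
        # NVENC (GPU) encode if available, otherwise the libx264 CPU encoder.
        codec = "h264_nvenc" if use_gpu else "libx264"
        cmd = ["ffmpeg", "-y", "-i", src, "-c:v", codec, "-c:a", "copy", dst]
        subprocess.run(cmd, check=True)

    # transcode("input.mkv", "output.mp4", use_gpu=False)  # the slow CPU path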

Cryptocurrency "mining" is another area where the graphics card, GPU, matters.

So, as noted above, the first question is usage. The second is the minimum recommended for every component of the system. Then, how much money one has to go above the minimum comes into play.
 

ecco

Veteran Member
CPU vs. graphics card, at least in games, depends on the authors of the games. Code can be written to put more stress on the video card or on the CPU.

A good monitor is important in either case.

The only solution is to have lots of money and buy expensive everything.
 

Tumah

Veteran Member
I use a computer at a place where I volunteer to transcode videos. One add-on that I have wants a fast GPU for that purpose. Since the graphics card is not up to snuff, the CPU is used and the process takes longer than it should.
Well, at least it's not being wasted. Someone who will remain anonymous bought a laptop with an extra NVIDIA card (along with Intel's integrated graphics) to do some GPU-intensive processing in a VM. Only, VMware Player doesn't support PCI passthrough, so you just can't. And CPU processing takes a lot longer.
 

PoetPhilosopher

Veteran Member
CPU vs. graphics card, at least in games, depends on the authors of the games. Code can be written to put more stress on the video card or on the CPU.

Yes, but suppose a scenario where the developer firmly believes the code needs around 2GHz of CPU power, and then it is run on a 4GHz CPU. The unpredictability of executing code means you may still see some small benefit from the 4GHz CPU, not to mention cases where something like an antivirus program you didn't know about was running in the background, slowing things down while you were running your game.

It's a more exact science with video cards. Things are drawn per frame. Even where a video card has to run shader code on its arithmetic logic units, it's small code executed per pixel. Because video card performance can be estimated mathematically, per pixel or per frame, if you know the values, you don't run into as many problems.
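
For example, here's a rough sketch of the per-pixel arithmetic I mean (the resolutions and framerates are just sample numbers):

    # Pixels the video card has to shade per second at a given resolution and framerate.
    def pixels_per_second(width, height, fps):
        return width * height * fps

    base = pixels_per_second(1920, 1080, 60)    # 1080p at 60 FPS
    high = pixels_per_second(2560, 1440, 144)   # 1440p at 144 FPS
    print(round(high / base, 2))                # 4.27: about 4.3x the shading work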
 

dybmh

דניאל יוסף בן מאיר הירש
Well, at least it's not being wasted. Someone who will remain anonymous bought a laptop with an extra NVIDIA card (along with Intel's integrated graphics) to do some GPU-intensive processing in a VM. Only, VMware Player doesn't support PCI passthrough, so you just can't. And CPU processing takes a lot longer.
Did you find a VM solution that supported the passthrough?
 

Tumah

Veteran Member
Did you find a VM solution that supported the passthrough?
Not a free one! I ended up having to install the relevant programs separately on the host, which is kind of annoying since they already come standard on the guest, but not the end of the world.
 