Are integrated graphics better for gaming?

Integrated GPUs, dedicated GPUs, and everything else you need to know for your gaming needs.


People buy computers and laptops for a plethora of reasons. More and more of life, including work, is moving online, and depending on the tasks at hand, your computer needs an adequate level of hardware to keep up.

Gaming, however, almost universally demands more capable hardware. Modern game visuals are approaching photorealism and only grow more intense and complex over time, so hardware requirements keep climbing as well, with a high-quality graphics card chief among them.

When it comes to gaming, a system’s graphics card is the single most important piece of hardware to consider. Other components matter too, but the graphics card determines the visual quality you’ll get and how smoothly your game will run.

Hardware experts will likely tell you to get the best graphics card you can afford and cut costs elsewhere when upgrading your PC. Depending on your gaming needs, however, you may not need to buy a high-end graphics card at all.

But making that decision requires knowing the difference between the two kinds of graphics: integrated and dedicated.

What is integrated graphics?


To put it simply, a computer with integrated graphics is one in which the graphics processing unit (GPU) is built on the same chip as the CPU. 

Modern computer graphics, with high-definition video and detailed 3D rendering in games and professional applications alike, are no easy task for a system to handle. A CPU can render graphics, but it lacks the means to do so quickly and efficiently.

This is why GPUs exist: they are built to excel at calculating the color values of millions of pixels many times every second. Nearly every computing device ships with integrated graphics these days, from desktops to smartphones.

An integrated GPU has to share everything with the CPU, which is why it is often called an onboard GPU. It exists in the same processor package, is cooled by the same heat spreader and heatsink, and shares the same system memory as the CPU. While the motherboard provides the display output hardware, the meat of the GPU exists inside the CPU’s package.

One thing to keep in mind is that integrated graphics share memory with the rest of the system, which is why they are sometimes referred to as shared graphics. If your computer has 8 GB of RAM and reserves 1 GB of it as shared graphics memory, only 7 GB remains for everything else.
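If you are curious how much memory your own system sets aside, the sketch below is one rough way to check. It assumes a Linux machine, where the MemTotal field in /proc/meminfo reports the RAM left for the operating system after firmware reservations, including memory taken for integrated graphics; the 8 GB figure is a hypothetical spec-sheet value for comparison.

```python
# Rough sketch, assuming a Linux system. INSTALLED_GB is a
# hypothetical figure; substitute what your spec sheet says.
INSTALLED_GB = 8.0  # hypothetical installed RAM

with open("/proc/meminfo") as f:
    # MemTotal is reported in kB and already excludes firmware
    # reservations such as memory "stolen" by an integrated GPU.
    mem_total_kb = next(
        int(line.split()[1]) for line in f if line.startswith("MemTotal:")
    )

usable_gb = mem_total_kb / 1024 / 1024  # kB -> GB
print(f"Installed RAM: {INSTALLED_GB:.1f} GB")
print(f"Usable RAM:    {usable_gb:.1f} GB")
print(f"Reserved (incl. shared graphics): {INSTALLED_GB - usable_gb:.1f} GB")
```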

There are several benefits to integrated graphics. They are compact, use less energy, and will not burn a hole in your wallet. Integrated graphics are now more than enough for everyday computer usage, including light gaming and 4K movie viewing. That being said, they are not the best choice for graphically heavy applications, which includes the majority of modern gaming.

What is dedicated graphics?


Dedicated graphics, also known as discrete graphics or a video card, is a separate piece of hardware devoted entirely to a computer’s graphics performance.

When you buy a laptop or desktop computer, the specification sheet will often say whether the system comes with dedicated graphics or not. There are many kinds of graphics cards, but they all include a GPU in its own processor package, dedicated video memory, and usually a fan or other cooling solution.

In a desktop computer, the dedicated GPU comes on its own circuit board, more commonly known as a graphics card. In laptops, the dedicated GPU is soldered directly to the mainboard, but it remains a separate component from the CPU, with its own cooling, memory, and power delivery.
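One rough way to see which GPU, or GPUs, a machine actually has is to ask the PCI bus. The sketch below assumes a Linux system with the lspci utility installed; a laptop that pairs integrated and dedicated graphics will typically show two entries, with the dedicated laptop GPU often listed as a "3D controller".

```python
import subprocess

# Minimal sketch: list every display adapter the PCI bus reports.
# Assumes a Linux machine with lspci available on the PATH.
out = subprocess.run(
    ["lspci"], capture_output=True, text=True, check=True
).stdout

for line in out.splitlines():
    # Integrated GPUs usually appear as "VGA compatible controller";
    # dedicated laptop GPUs often appear as "3D controller".
    if "VGA compatible controller" in line or "3D controller" in line:
        print(line)
```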

Advantages of integrated graphics

Aside from some high-end desktop CPUs, essentially every CPU sold today includes an integrated GPU. Systems built around an integrated GPU are probably the most common kind, and there are a couple of good reasons for that.

First comes the cost. Embedding a GPU into the CPU adds very little to the price of the chip, while eliminating the separate card, memory, and cooling a dedicated GPU needs. That makes systems with an integrated GPU much cheaper than those with a dedicated one.

Secondly, integrated graphics are incredibly convenient for portable systems such as laptops, smartphones, and tablets. Advances in fabrication technology let chipmakers produce integrated chips with a smaller footprint, which in turn lets device manufacturers build smaller, thinner devices. Some of these portable systems, especially flagship models, are even capable of running graphics-intensive applications.

Furthermore, integrated GPUs generate less heat than dedicated GPUs, which is critical in portable devices with little room for cooling hardware. Less heat also means more consistent performance and less susceptibility to hardware damage. High-end dedicated GPUs require additional cooling, which adds cost, an expense integrated GPUs avoid entirely.

Lastly, integrated graphics draw less power than dedicated graphics, which makes them ideal for battery-powered devices.

Disadvantages of integrated graphics

Integrated graphics have their uses and their advantages, but they also have their fair share of areas where they falter.

Quite obviously, integrated graphics do not hit the mark in graphically intense use cases, most notably gaming. Modern gaming asks the GPU to sustain high frame rates at high resolutions, render complex three-dimensional scenes, and handle time-critical work such as high-resolution video encoding, demands that integrated graphics cannot meet adequately.

While a number of integrated graphics processors can handle video games, those are usually titles from previous generations, played at moderate graphics settings at most; the current generation of games is generally out of reach.

Additionally, integrated graphics cannot be removed, which means they cannot be upgraded. Keep in mind that the integrated GPU is fabricated as part of the processor itself, so upgrading the graphics capabilities of a device with integrated graphics means replacing the whole chip.

As games get more and more graphically intensive, upgrades become necessary to keep up with the rising system requirements of operating systems and software applications. In graphics processing, examples of these changes include richer, more demanding user interfaces and high-definition photo and video content.

Are integrated graphics better for gaming?

The answer depends on what you play. If you stick to cinematic, story-based games that do not warrant graphically intense settings, integrated graphics will suffice. These games will comfortably run at 30 FPS at 720p, and you may be able to push them to 1080p if you adjust the graphics settings accordingly.
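Some quick arithmetic shows why that jump to 1080p forces the settings down. A 1080p frame has 2.25 times as many pixels as a 720p frame, so at the same 30 FPS target the GPU must shade 2.25 times as many pixels within the same 33.3 ms frame budget. A back-of-the-envelope sketch:

```python
# Back-of-the-envelope arithmetic: per-second pixel workload at a
# fixed 30 FPS target, comparing 720p against 1080p.
def pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

p720 = pixels_per_second(1280, 720, 30)    # 720p at 30 FPS
p1080 = pixels_per_second(1920, 1080, 30)  # 1080p at 30 FPS

print(f"720p  @ 30 FPS: {p720:>11,} pixels/s (33.3 ms per frame)")
print(f"1080p @ 30 FPS: {p1080:>11,} pixels/s ({p1080 / p720:.2f}x the work)")
```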

For modern games, especially competitive online titles such as Apex Legends or Fortnite, you want the best GPU possible, and that means a dedicated GPU, since these games demand precise input and high FPS. A handful of competitive games, such as VALORANT, are optimized to run on low-budget hardware, however, and there integrated graphics may just do the job.

While integrated graphics have drawn complaints from gamers over the years, the built-in GPU variant has improved considerably in recent times. CPU transistors have become far more efficient while occupying less space, leaving room on the chip for higher-quality integrated graphics.

AMD in particular has delivered never-before-seen value with its Accelerated Processing Units (APUs). You can easily buy a Ryzen APU with Vega graphics, such as a 4000G-series chip, and get GPU performance that rivals some of the budget graphics cards out there.

Even Intel, which is infamous for keeping its integrated graphics rather tame over the years, has introduced its Xe architecture for integrated graphics, which delivers graphics performance that can pull its own weight when it comes to gaming.

All in all, integrated graphics are not better than dedicated graphics for gaming if “gaming” for you means the competitive, graphically intense games of today.

If battery life, cost, heat management, and noise matter to you, and you plan on dabbling only in games with entry-level to moderate graphics, then a system with integrated graphics will do wonders for you. A perfect option here is AMD’s Ryzen 5 3400G.

If you are looking to play competitive, modern games with high-end graphics, you will almost certainly need to buy a dedicated GPU. If you want to stay on integrated graphics, 1080p gaming is best served by a chip like the Ryzen 7 5700G.

If you don’t mind low detail settings while keeping FPS relatively high, the Ryzen 5 3400G will do an amazing job too, while also letting you cut costs.