Best graphics cards in 2023

The best graphics card is objectively Nvidia's RTX 4090. Subjectively, you're going to want to weigh up the pros and cons of spending $1,500 or more on a GPU. It's not for everyone. That's why we've been testing every new GPU out of the Nvidia, AMD, and now Intel stables to find the best card across multiple price points. The results might surprise you.

For the high-end gamer, you've plenty of new cards to choose from. Nvidia has its RTX 40-series led by the RTX 4090, and that thing really is a beast of massive proportions. No, seriously, it's huge. Then there's the RTX 4080, which is a bit too pricey for us, and the RTX 4070 Ti. The RTX 4070 Ti isn't as cheap as we'd like either, but for a perfectly 4K-capable card it's at least more reasonably priced than Nvidia's finest.

On the other end of the market, there's not much new to write about. Neither Nvidia nor AMD has released its new generation of budget desktop GPUs yet. Most recently, we saw Intel enter the game with two more affordable graphics cards, the Arc A770 and Arc A750. These two Alchemist-generation cards are going even cheaper today, and the same goes for AMD's more impressive budget graphics cards, like the RX 6600 and RX 6650 XT.

We suggest avoiding the high-end RX 6950 XT and RTX 3090 Ti nowadays, as these cards are generally being pushed out by similarly priced newer options. And we also don't recommend Nvidia's budget GPUs right now, such as the RTX 3060, as AMD and even Intel have them roundly beat on price. That said, you can find many RTX 3060 gaming PCs going cheap in the best prebuilt gaming PCs, and they're often still good value for money, so all is not lost.

Best graphics card

The best graphics card: Nvidia GeForce RTX 4090

Specifications

Shaders: 16,384
Base clock: 2,235MHz
Boost clock: 2,520MHz
TFLOPs: 82.58
Memory: 24GB GDDR6X
Memory clock: 21Gbps
Memory bandwidth: 1,008GB/s

Reasons to buy

+
Excellent gen-on-gen performance
+
DLSS Frame Generation is magic
+
Super-high clock speeds

Reasons to avoid

-
Massive
-
Ultra-enthusiast pricing
-
Non-4K performance is constrained
-
High power demands

There's nothing subtle about Nvidia's GeForce RTX 4090 graphics card. It's a hulking great lump of a pixel pusher, and while there are some extra curves added to what could otherwise look like a respin of the RTX 3090 shroud, it still has that novelty graphics card aesthetic.

It looks like some semi-satirical plastic model made up to skewer GPU makers for the ever-increasing size of their cards. But it's no model, and it's no moon, this is the vanguard for the entire RTX 40-series GPU generation and our first taste of the new Ada Lovelace architecture.

A hell of an introduction to the sort of extreme performance Ada can deliver.

On the one hand, it's a hell of an introduction to the sort of extreme performance Ada can deliver when given a long leash, and on the other, a slightly tone-deaf release in light of a global economic crisis that makes launching a graphics card for a tiny, very loaded minority of gamers feel a bit off.

But we can't ignore it for this guide to the best GPUs around simply because, as it stands today, there's no alternative to the RTX 4090 that can get anywhere close to its performance. It's unstoppable, and will stay ahead of the pack as we now know AMD's highest performance graphics card, the RX 7900 XTX, is well and truly an RTX 4080 competitor.

This is a vast GPU that packs in 170% more transistors than even the impossibly chonk GA102 chip that powered the RTX 3090 Ti. And, for the most part, it makes the previous flagship card of the Ampere generation look well off the pace. That's even before you get into the equal mix of majesty and black magic that lies behind the new DLSS 3.0 revision designed purely for Ada.
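If you want to sanity-check that transistor claim, the maths is simple enough. Here's a quick sketch using the publicly quoted transistor counts for AD102 and GA102; those two reference figures aren't from this guide, so treat them as assumptions:

```python
# Rough check of the '170% more transistors' claim. The AD102 and GA102 counts
# below are the publicly quoted figures, assumed here rather than taken from
# our own testing.

ad102_transistors = 76.3e9   # Ada Lovelace flagship die (RTX 4090)
ga102_transistors = 28.3e9   # Ampere flagship die (RTX 3090 Ti)

increase = (ad102_transistors - ga102_transistors) / ga102_transistors
print(f"AD102 packs {increase:.0%} more transistors than GA102")  # ~170%
```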

Look, it's quick, okay. With everything turned on, with DLSS 3 and Frame Generation working its magic, the RTX 4090 is monumentally faster than the RTX 3090 that came before it. The straight 3DMark Time Spy Extreme score is twice that of the big Ampere core, and before ray tracing or DLSS come into it, the raw silicon offers twice the 4K frame rate in Cyberpunk 2077, too.

There's no denying it is an ultra-niche, ultra-enthusiast card, and that almost makes the RTX 4090 little more than a reference point for most of us PC gamers. We're then left counting the days until Ada descends to the pricing realm of us mere mortals, which still hasn't happened despite the launch of the RTX 4070 Ti.

In itself, however, the RTX 4090 is an excellent graphics card and will satisfy the performance cravings of every person who could ever countenance spending $1,600 on a new GPU. That's whether they're inconceivably well-heeled gamers, or content creators not willing to go all-in on a Quadro card. And it will deservedly sell, because there's no other GPU that can come near it right now.

Read our full Nvidia GeForce RTX 4090 review.

The best high-end graphics card: Nvidia GeForce RTX 4070 Ti

Specifications

Shaders: 7,680
Base clock: 2,310MHz
Boost clock: 2,610MHz
TFLOPs: 40.09
Memory: 12GB GDDR6X
Memory clock: 21Gbps
Memory bandwidth: 504.2GB/s

Reasons to buy

+
RTX 3090-level performance
+
Frame Generation remains a potent weapon
+
Cool-running
+
Very efficient

Reasons to avoid

-
Weak memory system
-
More expensive than we'd like

The unlaunching and subsequent rebadging and repricing of the RTX 4080 12GB was the best thing to happen to this third-tier Ada GPU. Now and forever to be known as the RTX 4070 Ti, this is the card that now makes it impossible to recommend AMD's RX 7900 XT.

Possibly the most impressive thing about the RTX 4070 Ti is that it's regularly level with, or faster than, an RTX 3090. When you consider that was the $1,500 GPU of the last generation, that looks like a great gen-on-gen uptick in performance, especially at 4K.

What's maybe less exciting is that, when you're talking in straight rasterized gaming terms, it's not a whole lot faster than the old, cheaper RTX 3080 10GB at 4K. It is faster, especially when you bring those third-gen RT Cores into the equation, but it's clear the higher clocks and heftier L2 cache are having to work hard to give it the lead over the older Ampere card in raw frame rate terms.

Where it looks far more positive is up against the new AMD RDNA 3 cards, the RX 7900 XTX and RX 7900 XT. It is generally slower than the top Radeon GPU, but against the still more expensive RX 7900 XT the RTX 4070 Ti regularly posts higher 4K performance.

The power of Nvidia's upscaling tech is preposterously good.

Without any of the DLSS 3/Frame Gen stuff in attendance the RTX 4070 Ti is a very capable performer, but once again the power of Nvidia's upscaling tech is preposterously good. I keep trying to see where the Frame Generation technology fails but I can't do it. Every time I'm like 'aha, there it is, the tell-tale artefact of fake AI frames!' I then check out the native rendering and it looks exactly the same. If not worse.

And with the extra genuine performance of the upscaled frames and the interpolated smoothness of the AI-generated frames, the performance improvement is spectacular where DLSS 3 is available. Which should be more and more often, with Nvidia's Streamline SDK offering devs a one-stop option for enabling it and other vendors' upscaling tech too.
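To put rough numbers on why that combination is so potent, here's an illustrative sketch. The uplift and overhead figures below are assumptions for the sake of the maths, not measurements from our benchmarks:

```python
# Illustrative only: how upscaling and Frame Generation stack into a much
# higher presented frame rate. None of these numbers are real benchmark results.

native_fps = 60                  # assumed native-resolution frame rate
upscaling_uplift = 1.6           # assumed gain from DLSS upscaling alone
frame_gen_efficiency = 0.9       # assumed cost of the interpolation work

upscaled_fps = native_fps * upscaling_uplift
# Frame Generation presents one AI-generated frame for every rendered frame.
presented_fps = upscaled_fps * 2 * frame_gen_efficiency

print(f"Native: {native_fps} fps")
print(f"DLSS upscaling: {upscaled_fps:.0f} fps")
print(f"Upscaling + Frame Generation: {presented_fps:.0f} fps")
```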

In gaming terms, the 4K performance of the RTX 4070 Ti is impressive even without upscaling, and is rather astounding with it.

There's definitely an argument to be made that in its own price bracket the RTX 4070 Ti is a better option for anyone looking to spend the best part of a grand on their new GPU. With its lower price point the RTX 4070 Ti makes more sense than it did as an RTX 4080 at $899, and with RTX 3090-level gaming performance there's a lot of power under that triple-fan shroud.

Read our full Nvidia GeForce RTX 4070 Ti review.

The best mid-range graphics card: AMD Radeon RX 6700 XT

Specifications

Shaders: 2,560
Base clock: 2,321MHz
Boost clock: 2,581MHz
TFLOPs: 13.21
Memory: 12GB GDDR6
Memory clock: 16Gbps
Memory bandwidth: 384GB/s

Reasons to buy

+
Cheaper than the competition
+
High frame rates at 1440p
+
High clock speeds
+
Effective cooler

Reasons to avoid

-
Slows at 4K
-
RTX 3070 is faster
-
Can drop to RTX 3060 Ti performance

There's a certain level of pomp and excitement that comes with every major architectural overhaul, though perhaps we're not giving enough love to what comes after. Those more affordable graphics cards that actually bring that new technology to the masses are just as important, if not more so to many gamers eyeing up an upgrade. The AMD Radeon RX 6700 XT was the beginning of that journey for RDNA 2: The GPU with the grunt of a next-gen console for under $500.

AMD has released its RDNA 3 architecture now, but with the cheapest card costing $899 it's not for the budget gamer. So RDNA 2 it is, then.

We're not talking the cheapest of chips here, either. The Radeon RX 6700 XT is still a high-ish-end card by most counts. But its price tag is slipping into the more affordable end of the market by the week, and that's high up on a list of things we absolutely love to see.

There's more to the Radeon RX 6700 XT than a simple halving of silicon from AMD's top RDNA 2 chip, the Radeon RX 6900 XT. In some ways, sure, it's a straight slice down the middle. The RX 6700 XT features 40 compute units (CUs) for a total of 2,560 RDNA 2 cores and is equipped with 64 ROPs—exactly half of the maximum configuration of the Navi 21 GPU—but the card comes with more than its fair share of memory and Infinity Cache.
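For the curious, here's a quick back-of-the-envelope check of that 'half of Navi 21' configuration. It assumes 64 stream processors per RDNA 2 compute unit and Navi 21's full 80 CU / 128 ROP layout, reference figures that aren't listed in this guide:

```python
# Sanity check of the RX 6700 XT's layout against a full Navi 21 (RX 6900 XT).
# The 64-shaders-per-CU figure and the 80 CU / 128 ROP Navi 21 config are assumptions.

SHADERS_PER_CU = 64

navi21_cus, navi21_rops = 80, 128    # full Navi 21 configuration (RX 6900 XT)
navi22_cus, navi22_rops = 40, 64     # RX 6700 XT, as listed above

print(navi22_cus * SHADERS_PER_CU)                         # 2560 shaders, matching the spec list
print(navi22_cus / navi21_cus, navi22_rops / navi21_rops)  # 0.5 0.5 -> exactly half
```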

Often quite a reasonable amount cheaper than the RTX 3060 Ti.

A headline feature of AMD's RDNA 2 lineup has been bigger-than-thou memory capacities compared with rival RTX 30-series GeForce GPUs, and the RX 6700 XT doesn't buck that trend. There's 12GB of GDDR6 packed onto this card: an attempt at what we optimistically call 'future-proofing'. That's a greater VRAM capacity than the RTX 3060 Ti and RTX 3070, and a match for the RTX 3060 12GB.

With a price tag closer to the GeForce RTX 3070, yet performance between it and the GeForce RTX 3060 Ti, most often closer to the latter, the Radeon was a solid alternative but hadn't been my first port of call for this sort of cash for a good portion of its life. That's all changed now that it's often quite a reasonable amount cheaper than the RTX 3060 Ti or anything the newer generation can manage.

That said, I still think you could pick up an RTX 3060 Ti right now and be happy with it, if only because that card was possibly the best value proposition of the entire RTX 30-series, and remains close to it even with pricing all out of whack. If ray tracing doesn't bother you and you'd prefer the extra memory, however, the RX 6700 XT is the better all-round GPU to buy right now.

Read our full AMD Radeon RX 6700 XT review.

The best budget graphics card: Intel Arc A750

Specifications

Shaders: 3,584
Base clock: 2,050MHz
Boost clock: 2,400MHz
TFLOPs: 17.20
Memory: 8GB GDDR6
Memory clock: 16Gbps
Memory bandwidth: 512GB/s

Reasons to buy

+
Competitive with RX 6600 at 1080p
+
Faster at 1440p
+
Ray tracing performance
+
AV1 encoding
+
Runs quiet and cool

Reasons to avoid

-
Don't buy it if your PC doesn't support Resizable BAR
-
Software package still needs work
-
High power draw

You might be surprised to see the Intel Arc A750 as our pick for the best budget graphics card right now. I'm quite surprised myself. But when I thought about what might sit in this spot in the guide, really thought about it, I came up short for another graphics card that can match its feature set right now.

The Arc A750 offers a moderately cut-down version of the Arc A770's GPU, the G10, which you can read more about in my Intel Arc A770 review. It comes with 28 Xe-cores, the building blocks of the Xe-HPG architecture, which is only four off the 32 Xe-cores found in the Arc A770. For that, it's not massively off the pace of that card in terms of performance either, though it does run slightly slower and has half the memory capacity.
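On paper, that deficit is smaller than it might sound. A minimal sketch, assuming 128 shader ALUs per Xe-core (an Alchemist architecture figure not listed in this guide):

```python
# How close the Arc A750 sits to the A770 on paper. The 128 ALUs-per-Xe-core
# figure is an assumption about the Alchemist architecture.

ALUS_PER_XE_CORE = 128

a750_xe_cores, a770_xe_cores = 28, 32

print(a750_xe_cores * ALUS_PER_XE_CORE)                               # 3584 shaders, matching the spec list
print(f"{a750_xe_cores / a770_xe_cores:.0%} of the A770's Xe-cores")  # 88%
```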

Intel has more graphics card to offer for the money than the competition.

But the memory spec on the Arc A750, for a card of its price, is immense. Mostly that's because the Arc A750 is more graphics card than it should be for the money. Intel invested in some beefy specs for its first-generation GPUs, including that massive memory system, but ultimately the drivers couldn't get to where Intel wanted them to be. That means some games don't play nicely on the Arc A750, but it's not all bad: it also means the card has a lot of untapped potential.

Potential that Intel is gradually exploiting with every new driver update.

In the meantime, Intel and its partners (such as ASRock) have been dropping the price of the Arc A750 to rival AMD's RX 6600. The RX 6600 was my previous pick for this spot, but ultimately I feel the Intel card has more to offer for the money than the competition. And there are a few more things to sweeten the deal.

The Arc A750 comes with impressive AV1 acceleration for encoding in the new, bandwidth-savvy codec. That's a big deal if you're a streamer or content creator looking to improve the quality of your videos. The A750's ray tracing ability is similarly beefed up compared to the competition's.

There will be cases where the A750 is way off the mark, and that pretty much rules this card out for anyone on an older system without Resizable BAR support, but generally I think it's a savvy buy at its new lower price and definitely a bit of a budget underdog right now. At a time when graphics cards are often underwhelming for the money and dreadfully prescriptive at the lower end, I find myself getting more and more onboard with the Arc A750 as a great option.

Who'd have thought? Not me.

Read our full Intel Arc A750 review.

The best AMD graphics card: AMD Radeon RX 7900 XTX

Specifications

Shaders: 6,144
Base clock: 1,855MHz
Boost clock: 2,499MHz
TFLOPs: 61.42
Memory: 24GB GDDR6
Memory clock: 20Gbps
Memory bandwidth: 960GB/s

Reasons to buy

+
Much faster than an RX 6950 XT at 4K
+
$999 price tag
+
Much improved ray tracing capabilities
+
Frickin' chiplets!

Reasons to avoid

-
Not a consistent RTX 4080 competitor
-
Runs real hot

The AMD Radeon RX 7900 XTX has a lot going for it. We're used to seeing GPU generations that arrive on smaller process nodes, redesigned architectures, larger caches, reworked shaders, more memory—the list goes on. But all of that, all at once? That's what RDNA 3 delivers: the whole lot in one fell swoop.

The RX 7900 XTX is the best example of the all-encompassing upgrade to the Radeon DNA, and it's a mighty 4K card for those improvements.

The RX 7900 XTX makes quick work of 4K gaming. It's the sort of speedy card that lets you boot up any old game in 4K and expect perfectly playable performance. The sort of graphics power that allows you to be extremely lazy and 'optimise' your game settings by turning everything up to ultra and never diving into the settings menu ever again. I love that sort of brute strength—if only because I don't like to mess with my game settings unless there's a problem.

You can be fairly liberal with the ray tracing options on this AMD graphics card, too, which is the text-based equivalent of thunderous applause for AMD's second-generation RT acceleration. While Nvidia still tends to pull a lead in ray-traced games, you no longer need to be so careful about which ray tracing effects you enable, or how high you crank them, with AMD's latest.

The difficulty we've had with AMD's RX 7900 XTX, however, is a hotspot temp issue that we just can't shake on AMD's own-designed version of the card. That saw our card run a little slower on average than it should have, and our attempts to get another to test saw the same issue arise again. It's not a problem to worry about with third-party designed GPUs, at least, so we're only recommending those non-AMD-built cards for the time being.

The RX 7900 XTX makes quick work of 4K gaming.

Then there's the competitive aspect. Ultimately, the RTX 4080 is the target that AMD has been aiming for with the XTX, and in my testing it just doesn't reach it consistently enough to make that a favourable comparison. Not often enough, anyways. 

It's that age-old thing: the XTX, in the right circumstances, will make swift work of the RTX 4080, but those games are few and far between. The RTX 4080 is generally the faster card in my testing.

And the RTX 4080 is made all the more threatening by DLSS 3. While limited in scope due to a lack of support in all too many games today, the frame generation tech inherent to the newer upscaling technology is incredibly impressive. It adds heaps of performance that AMD just can't quite match right now without its own frame generation tech. Don't worry, that's coming, but it's not here yet.

But AMD does have its own software stack, and a capable one at that. And this is a performant Radeon graphics card like no other. If you're really keen to ditch team green, or save some cash otherwise spent on an RTX 4080, there's a case to be made for the XTX. Though I have a feeling many will instead opt for the RTX 4090 or RTX 4070 Ti, depending on how frivolous they're feeling that day.

Read our full AMD Radeon RX 7900 XTX review.

GPU hierarchy

Every new GPU generation offers new features and possibilities. But rasterized rendering is still the most important metric for general gaming performance across the PC gaming world. Sure, Nvidia GPUs might well be better at the ray tracing benchmarks they more or less instigated, but when it comes to standard gaming performance AMD's latest line up can certainly keep pace.

It's also worth noting that older generations of graphics cards still have something to offer, with the likes of the GTX 1080 Ti still able to outpace a more modern RTX 3050 in most benchmarks.

We're not saying you should buy an older card—Intel or AMD's budget options are a much better deal today—but it's worth knowing where your current GPU stacks up, or just knowing the lie of the land. But there is also the fact there will be gaming rigs on sale with older graphics cards over the coming days, and if they're cheap enough they may still be worth a punt as a cheap entry into PC gaming.

We've benchmarked all the latest GPUs of this generation, and have tracked their performance against the previous generation in terms of 3DMark Time Spy Extreme scores. Where we don't have the referential numbers for an older card we have used the average index score from the UL database. These figures track alongside an aggregated 1440p frame rate score from across our suite of benchmarks.
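For a sense of how an aggregated score like that can be put together, here's a minimal sketch. The geometric mean and the per-game numbers below are assumptions for illustration, not our actual benchmark data or the exact methodology behind the chart:

```python
# Minimal sketch of aggregating per-game 1440p results into a single index.
# The game names and frame rates are hypothetical, not real benchmark data.

from math import prod

def aggregate_fps(results: dict) -> float:
    """Geometric mean of per-game average frame rates."""
    return prod(results.values()) ** (1 / len(results))

hypothetical_1440p_results = {
    "Game A": 112.0,
    "Game B": 168.0,
    "Game C": 124.0,
}

print(f"Aggregated 1440p score: {aggregate_fps(hypothetical_1440p_results):.1f} fps")
```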

The alternative: AMD's Radeon RX 6950 XT

Once again I'm turning to AMD's RX 6950 XT for our alternative here. I do think if you can find a great, and I mean great, deal on an RTX 3080 of any variety then maybe it's worth a shot for the green team fan, but generally the RX 6950 XT is the all-round card to buy if you're looking for a bit more might for your cash. Just don't even think of spending anything close to $1,000 on it; you might as well step up to the RX 7000-series if you're going to do that.

Here's a list of the manufacturer-set retail prices (MSRP), or recommended retail prices (RRP), for most of the latest graphics cards. For the most part, these are the set prices for the stock or reference versions of these cards, where applicable, and not representative of overclocked or third-party graphics cards, which may well be priced higher.

Nvidia

  • RTX 4090 - $1,599 | £1,699
  • RTX 4080 - $1,199 | £1,269
  • RTX 4070 Ti - $799 | £799
  • RTX 3090 Ti - $1,999 | ~£1,999
  • RTX 3090 - $1,499 | £1,399
  • RTX 3080 Ti - $1,199 | £1,049
  • RTX 3080 - $699 | £649
  • RTX 3070 Ti - $599 | £529
  • RTX 3070 - $499 | £469
  • RTX 3060 Ti - $399 | £349
  • RTX 3060 - $329 | £299
  • RTX 3050  - $249 | £239

AMD

  • RX 7900 XTX - $999 | ~£999
  • RX 7900 XT - $899 | ~£899
  • RX 6950 XT - $1,099 | ~£1,060
  • RX 6900 XT - $999 | ~£770
  • RX 6800 XT - $649 | ~£600
  • RX 6800 - $579 | ~£530
  • RX 6750 XT - $549 | ~£530
  • RX 6700 XT - $479 | ~£420
  • RX 6650 XT - $399 | ~£389
  • RX 6600 XT - $379 | ~£320
  • RX 6600 - $329 | ~£299
  • RX 6500 XT - $199 | ~£180

Graphics card FAQ

Which is better GTX or RTX?

The GTX prefix denotes older Nvidia graphics cards that don't have the extra AI and ray tracing silicon the RTX-level cards do. The RTX prefix was introduced with the RTX 20-series, and highlights which cards have GPUs sporting both the Tensor Cores and RT Cores necessary for real-time ray tracing and Deep Learning Super Sampling (DLSS).

Nowadays you'll only find older 16-series GPUs with the GTX prefix attached, so it's pretty much RTX all the way.

Is ray tracing only for RTX cards?

The RTX prefix is only used to denote cards which house Nvidia GPUs with dedicated ray tracing hardware, but AMD's RDNA 2 GPUs and RDNA 3 GPUs also support real-time ray tracing acceleration.

Intel's Alchemist graphics cards support ray tracing too, though as more budget-focused offerings you can't expect super-high frame rates with it enabled. Otherwise, Intel's ray tracing acceleration is pretty good.

Is SLI or CrossFire still a thing?

If you were chasing maximum performance, you used to run two cards in SLI or CrossFire. However, it's become increasingly common for major games to ignore multi-GPU setups completely, and that includes all DXR games. There's also the fact that fewer and fewer modern cards actually support linking two cards together.

So, no. It's not a thing.

Do I need a 4K capable graphics card?

The obvious answer is: only if you have a 4K gaming monitor. But there are other things to consider here, such as what kind of games you play. If frame rates are absolutely king for you, and you're into ultra-competitive shooters, then you want to be aiming for super-high fps figures. And, right now, you're better placed to do that at either 1440p or 1080p.

That said, the more games that incorporate upscaling technologies, such as DLSS, FSR, and XeSS, the more cards will be capable of making a close approximation of 4K visuals on your 4K monitor, but at higher frame rates.
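If you're wondering what that trade actually looks like in pixels, here's a rough sketch using commonly quoted per-axis scale factors for the main upscaler quality modes. The exact factors vary by technology, so treat these as assumptions rather than figures from this guide:

```python
# Internal render resolutions behind a 4K output at typical upscaler quality modes.
# The per-axis scale factors are commonly quoted values, assumed for illustration.

TARGET_W, TARGET_H = 3840, 2160   # 4K output

SCALE_FACTORS = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

for mode, scale in SCALE_FACTORS.items():
    w, h = round(TARGET_W * scale), round(TARGET_H * scale)
    print(f"{mode:>17}: renders {w}x{h}, presents {TARGET_W}x{TARGET_H}")
```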

What's a Founders Edition graphics card?

The Founders Edition cards are simply Nvidia's in-house designs for its graphics cards, as opposed to those designed by its partners. These are usually reference cards, meaning they run at stock clocks. 

Briefly, for the RTX 20-series, Nvidia decided to offer Founders Editions with factory overclocks. That made it a little difficult to compare cards, as Founders Edition cards usually give us a baseline for performance, but Nvidia has since returned to producing them as reference cards.

Intel also offers something similar with its Limited Edition Arc Alchemist cards featuring its own in-house cooler design, as does AMD with its reference cards. 

Jacob Ridley
Senior Hardware Editor

Jacob earned his first byline writing for his own tech blog from his hometown in Wales in 2017. From there, he graduated to professionally breaking things as hardware writer at PCGamesN, where he would later win command of the kit cupboard as hardware editor. Nowadays, as senior hardware editor at PC Gamer, he spends his days reporting on the latest developments in the technology and gaming industry. When he's not writing about GPUs and CPUs, however, you'll find him trying to get as far away from the modern world as possible by wild camping.
