Top 5: The Best Graphics Cards of 2019
Hunting for a new GPU for gaming, multi-display, or something else?
Here's everything you need to know to shop the latest Nvidia GeForce and AMD Radeon video cards with confidence.
How to Buy the Best GPU for Gaming
If you're a PC gamer, or a content creator who lives and dies by the speed of your graphics-accelerated software, your video card is the engine that powers what you can do—or how lustily you can brag.
And until just a few months ago, that bragging was really expensive.
Over the last two years, at times buying a video card felt like dishing out for a rare flower bulb, not a PC component, amid some 21st-century tulip frenzy.
The cryptomining crazes of 2017 and 2018 drove wild demand for graphics horsepower—the kind of computing muscle best suited to amateur and professional digital currency mining—and thus for certain video cards.
Prices soared even for modest mainstream cards.
For a time, the market went downright bonkers.
Some cards traded for double or more than their list prices, if you could find them in stock at all.
Here in '19, the frenzy has subsided and prices have returned to normal—at least for the moment.
Our guide will help you sort through the best video-card options for your desktop PC, what you need to know to upgrade a system, and how to evaluate whether a particular card is a good buy.
We'll also touch on some upcoming trends—they could affect which card you choose.
Even with cryptomania on the wane for now, you do need to choose with care.
It's easy to overpay or underbuy.
We won't let you do that, though.
Who's Who in GPUs
First off, what does a graphics card do?
And do you really need one?
If you're looking at any given pre-built desktop PC on the market, unless it's a gaming-oriented machine, PC makers will de-emphasize the graphics card in favor of promoting CPU, RAM, or storage options.
Indeed, sometimes that's for good reason; a low-cost PC may not have a graphics card at all, relying instead on the graphics-acceleration silicon built into its CPU (an "integrated graphics processor," commonly called an "IGP").
There's nothing inherently wrong with relying on an IGP—most business laptops, inexpensive consumer laptops, and budget-minded desktops have them—but if you're a gamer or a creator, the right graphics card is crucial.
A modern graphics solution, whether it's a discrete video card or an IGP, handles the display of 2D and 3D content, drawing the desktop, and decoding and encoding video content in programs and games.
All of the discrete video cards on the consumer market are built around large graphics processing chips designed by one of two companies: AMD or Nvidia.
These processors are referred to as "GPUs," for "graphics processing units," a term that is also applied, confusingly, to the graphics card itself.
The two companies work up what are known as "reference designs" for their video cards, a standardized version of a card built around a given GPU.
Sometimes these reference-design cards are sold directly to consumers by Nvidia or, less often, by AMD.
More often, though, they are duplicated by third-party card makers (companies referred to in industry lingo as AMD or Nvidia "board partners"), such as Asus, EVGA, MSI, Gigabyte, Sapphire, XFX, and Zotac.
Depending on the chip in question, these board partners may sell their own self-branded versions of the reference card (adhering to the design and specifications set by AMD or Nvidia), or they will fashion their own custom products, with different cooler designs, slight overclocking done from the factory, or features such as LED mood illumination.
Some board partners will do both—that is, sell reference versions of a given GPU, as well as their own, more radical designs.
Who Needs a Discrete GPU?
We mentioned integrated graphics processors (IGPs) above.
IGPs are capable of meeting the needs of most general users today, with three broad exceptions.
Professional users who work with CAD software or in video and photo editing will still benefit greatly from a discrete GPU.
Some of their key applications can transcode video from one format to another or perform other specialized operations using resources from the GPU instead of, or in addition to, those of the CPU.
Whether this is faster will depend on the application in question, which specific GPU and CPU you own, and other factors.
Productivity-Minded Users With Multiple Displays.
People who need a large number of displays can also benefit from a discrete GPU.
Desktop operating systems can drive displays connected to the IGP and discrete GPUs simultaneously.
If you've ever wanted five or six displays hooked up to a single system, you can combine an IGP and a discrete GPU to get there.
That said, you don't necessarily need a high-end graphics card to do that.
If you're simply displaying business applications, multiple browser windows, or lots of static windows across multiple displays (that is, not actually gaming on them), a modest card will do.
If you're showing four web browsers across four display panels, a GeForce RTX 2080 card, say, won't confer any greater benefit than a GeForce GTX 1660 card with the same supported outputs.
And of course, there's the gaming market, for which the GPU is arguably the most important component.
RAM and CPU choices both matter, but if you had to choose between a top-end system circa 2016 with a 2019 GPU or a top-end system today using the highest-end GPU you could buy in 2016, you'd want the former.
Graphics cards fall into two distinct classes: consumer cards meant for gaming and light content creation work, and dedicated cards meant for professional workstations and geared toward scientific computing, calculations, and artificial intelligence work.
This guide, and our reviews, will focus on the former, but we'll touch on workstation cards a little bit, later on.
The key sub-brands you need to know across these two fields are Nvidia's GeForce and AMD's Radeon RX (on the consumer side of things), and Nvidia's Titan and Quadro, as well as AMD's Radeon Pro and Radeon Instinct (in the pro workstation field).
As recently as 2017, Nvidia had the very high end of the consumer graphics-card market more or less to itself, and it still dominates there.
We'll focus here on the consumer cards.
Nvidia's consumer card line in 2019 is broken into two distinct classes, both united under the long-running GeForce brand: GeForce GTX, and GeForce RTX.
AMD's consumer cards, meanwhile, comprise the Radeon RX and Radeon RX Vega families, as well as the new Radeon VII.
Before we get into the individual lines in detail, though, let's outline a very important consideration for any video-card purchase.
Target Resolution: Your First Consideration
Resolution is the horizontal-by-vertical pixel count at which your video card will drive your monitor.
This has a huge bearing on which card to buy, and how much you need to spend, when looking at a video card from a gaming perspective.
If you are a PC gamer, a big part of what you'll want to consider is the resolution (or resolutions) at which a given video card is best suited for gaming.
Nowadays, even low-end cards will display everyday programs at lofty resolutions like 3,840 by 2,160 pixels (a.k.a. 4K).
But for strenuous PC games, those cards will not have nearly the power to drive smooth frame rates at high resolutions like those.
In games, the video card is what calculates positions, geometry, and lighting, and renders the onscreen image in real time.
For that, the higher the in-game detail level and monitor resolution you're running, the more graphics-card muscle is required.
The three most common resolutions at which today's gamers play are 1080p (1,920 by 1,080 pixels), 1440p (2,560 by 1,440 pixels), and 2160p, or 4K (3,840 by 2,160 pixels).
Generally speaking, you'll want to choose a card suited for your monitor's native resolution.
The "native" resolution is the highest supported by the panel, and the one at which the display looks the best.
You'll also see ultra-wide-screen monitors with in-between resolutions (3,840 by 1,440 pixels is a common one); you can gauge these versus 1080p, 1440p, and 2160p by calculating the raw number of pixels for each (multiply the vertical number by the horizontal one) and seeing where that screen resolution fits in relative to the common ones.
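The pixel-count comparison described above is simple arithmetic; here's a quick sketch (in Python, purely for illustration) of how the common resolutions stack up:

```python
# Raw pixel counts for the resolutions discussed above.
# Multiply horizontal by vertical, then compare the totals.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "ultra-wide": (3840, 1440),  # a common in-between panel
    "4K/2160p": (3840, 2160),
}

for name, (w, h) in resolutions.items():
    print(f"{name:>10}: {w * h:,} pixels")
```

Run it, and the 3,840-by-1,440 ultra-wide lands at about 5.5 million pixels, between 1440p (roughly 3.7 million) and 4K (roughly 8.3 million), so a card comfortable at 1440p is the floor for driving it well.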
Now, of course, you can always dial down the detail levels for a game to make it run acceptably at a higher-than-recommended resolution, or dial back the resolution itself.
But to an extent, that defeats the purpose of a graphics card purchase.
A secondary consideration nowadays, though, is running games at ultra-high frame rates to take advantage of the extra-fast refresh abilities of some new monitors; more on that later.
Let's look at the graphics card makers' lines first, and see which ones are suited for what gaming resolutions.
Let's look at Nvidia's first.
Nvidia's Lineup, 2019
The company's current line is split between cards using its last-generation "Pascal" architecture and its current-generation "Turing" architecture.
Here's a quick rundown of the currently relevant card classes in the Pascal and Turing families, their rough pricing, and their usage cases.
If you are a keen observer of the market, you may notice that many of the familiar GeForce GTX Pascal cards are not listed above.
They are being allowed to sell through and are largely going off the market in 2019 in favor of their GeForce RTX successors.
We expect this to happen soon for the GeForce GTX 1060 due to the release of the GeForce GTX 1660 and 1660 Ti, and eventually, the lesser Pascal cards.
We expect the GeForce GTX 1050 Ti to be replaced by a possible GeForce GTX 1650 Ti later in the year, if rumors are to be taken at face value.
The RTX 2080 and RTX 2080 Ti cards, finally, we'd call a new "elite class."
For most gamers, the Titans won't be of interest due to their pricing.
AMD's Lineup, 2019
As for AMD's card classes, here in early '19 the company competes more strongly with Nvidia's low-end and mainstream cards than with its high-end ones, and it puts up no resistance against the elite class.
The Radeon RX 550 and 560 comprise the low end, while the Radeon RX 570 through RX 590 make up the midrange and are ideal for 1080p gaming.
The Radeon RX Vega 56 and Vega 64 cards, the latter good for both 1080p and 1440p play, were hit particularly hard by the crypto craze but have come back down to earth.
The Radeon VII is AMD's sole player in the elite bracket; it trades blows with the GeForce RTX 2080 at 4K but generally performs less well at lower resolutions in games.
Graphics Card Basics: Understanding the Core Specs
Now, the charts above should give you a good idea of which card families you should be looking at, based on your monitor and your target resolution.
A few key numbers are worth keeping in mind when comparing cards, though: the graphics processor's clock speed, the onboard VRAM (that is, how much video memory it has), and—of course!—the price.
And then there's adaptive sync.
Clock Speed
When comparing GPUs from the same family, a higher base clock speed (that is, the speed at which the graphics core works) and more cores signify a faster GPU.
Again, though: That's only a valid comparison between cards in the same product family.
For example, the base clock on the venerable GeForce GTX 1080 is 1,733MHz, while the base clock is 1,759MHz on a factory-overclocked Republic of Gamers Strix version of the GTX 1080 from Asus in its out-of-the-box Gaming Mode.
Note that this base clock measure is distinct from the graphics chip's boost clock.
The boost clock is the speed to which the graphics chip can accelerate temporarily when under load, as thermal conditions allow.
This can also vary from card to card in the same family.
It depends on the robustness of the cooling hardware on the card and the aggressiveness of the manufacturer in its factory settings.
The top-end partner cards with giant multifan coolers will tend to have the highest boost clocks for a given GPU.
Onboard Memory
The amount of onboard video memory (sometimes referred to by the rusty term "frame buffer") is usually matched to the requirements of the games or programs that the card is designed to run.
In a certain sense, from a PC-gaming perspective, you can count on a video card to have enough memory to handle current demanding games at the resolutions and detail levels that the card is suited for.
In other words, a card maker generally won't overprovision a card with more memory than it can realistically use; that would inflate the pricing and make the card less competitive.
But there are some wrinkles to this.
A card designed for gameplay at 1,920 by 1,080 pixels (1080p) these days will generally be outfitted with 4GB or 6GB of RAM, while cards geared more toward play at 2,560 by 1,440 pixels (1440p) or 3,840 by 2,160 pixels (2160p, or 4K) tend to deploy 8GB or more.
Usually, for cards based on a given GPU, all of the cards have a standard amount of memory.
The wrinkles: In some isolated but important cases, card makers offer versions of a card with the same GPU but different amounts of VRAM.
The key ones to know nowadays: cards based on the GeForce GTX 1060 (some lesser versions offer 3GB, versus the full-fat 6GB) and the Radeon RX 570 and RX 580 (4GB versus 8GB).
The cheaper versions will have less.
AMD has stepped up to 8GB on its RX Vega cards, with 16GB on its Radeon VII, while Nvidia is using 6GB or 8GB on most, with 11GB on its elite GeForce RTX 2080 Ti.
Either way, sub-4GB cards should only be used for secondary systems, gaming at low resolutions, or simple or older games that don't need much in the way of hardware resources.
Memory bandwidth is another spec you will see.
It refers to how quickly data can move into and out of the GPU.
More is generally better, but again, AMD and Nvidia have different architectures and sometimes different memory bandwidth requirements, so numbers are not directly comparable.
Pricing: How Much Should You Spend?
Generations of cards come and go, but the price bands were constant for years—at least, when the market was not distorted by cryptocurrency miners.
If a card is a certain amount costlier than another, the increase in performance is usually proportional to the increase in price.
In the high-end and elite-level card stacks, though, this rule falls away; spending more money yields diminishing returns.
Once a Religious Issue: FreeSync vs. G-Sync
Should you buy a card based on whether it supports one of these two venerable specs for smoothing gameplay?
It depends on the monitor you have.
FreeSync (AMD's solution) and G-Sync (Nvidia's) are two sides to the same coin, a technology called adaptive sync.
With adaptive sync, the monitor displays at a variable refresh rate led by the video card; the screen redraws at a rate that scales up and down according to the card's output capabilities at any given time in a game.
Under adaptive sync, the monitor draws a full frame only when the video card can deliver a whole frame.
The monitor you own may support FreeSync or G-Sync, or neither one.
FreeSync is much more common, as it doesn't add to a monitor's manufacturing cost; G-Sync requires dedicated hardware inside the display.
You may wish to opt for one GPU maker's wares or the other's based on this, but know that the tides are changing on this front.
At CES 2019, Nvidia announced driver support that allows FreeSync-compatible monitors to use adaptive sync with late-model Nvidia GeForce cards; qualifying displays are certified by Nvidia as "G-Sync Compatible."
Upgrading a Pre-Built Desktop With a New Graphics Card
Assuming the chassis is big enough, most pre-built desktops these days have enough cooling capability to accept a new discrete GPU with no problems.
The first thing to do before buying or upgrading a GPU is to measure the inside of your chassis for the available card space.
In some cases, you've got a gulf between the far right-hand edge of the motherboard and the hard drive bays.
In others, you might have barely an inch.
Next, check your graphics card's height.
Make certain that if your chosen card has an elaborate cooler design, it's not so tall that it keeps your case from closing.
Finally: the power supply unit (PSU).
Your system needs to have a PSU that's up to the task of giving a new card enough juice.
This is something to be especially wary of if you're putting a high-end video card in a pre-built PC that was equipped with a low-end card, or no card at all.
Doubly so if it's a budget-minded or business system; these PCs tend to have underpowered or minimally provisioned PSUs.
The two most important factors to be aware of here are the number of six-pin and eight-pin cables on your PSU, and the maximum wattage the PSU is rated for.
Most modern systems, including those sold by OEMs like Dell, HP, and Lenovo, employ power supplies that include at least one six-pin power connector meant for a video card, and some have both a six-pin and an eight-pin connector.
Midrange and high-end graphics cards will require a six-pin cable, an eight-pin cable, or some combination of the two to provide working power to the card.
The lowest-end cards draw all the power they need from the PCI Express slot.
Make sure you know what your card needs in terms of connectors.
Nvidia and AMD both outline recommended power supply wattage for each of their graphics-card families.
Take these guidelines seriously, but they are just guidelines, and they are generally conservative.
If AMD or Nvidia says you need at least a 500-watt PSU to run a given GPU, don't chance it with the 300-watter you may have installed, but know that you don't need an 800-watt PSU to guarantee enough headroom, either.
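That wattage rule of thumb reduces to a trivial comparison; here's an illustrative sketch (the 300/500/800-watt figures echo the hypothetical example above, not any real card's spec):

```python
def psu_meets_recommendation(installed_watts: int, recommended_watts: int) -> bool:
    """True if the installed PSU meets or exceeds the GPU maker's recommendation."""
    return installed_watts >= recommended_watts

# A 300W budget-PC unit falls short of a hypothetical 500W recommendation;
# a 500W unit meets it exactly; an 800W unit is fine but buys no extra speed.
for watts in (300, 500, 800):
    print(watts, psu_meets_recommendation(watts, 500))
```

Remember also that wattage is only half the story: the PSU must have the right six-pin and eight-pin connectors, as described above.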
Ports and Preferences: Understanding Video Card Connections
Three kinds of port are common on the rear edge of a current graphics card: DVI, HDMI, and DisplayPort.
Some systems and monitors still use DVI, but it's the oldest of the three standards and is being phased out on many high-end cards here in 2019.
Most cards have several DisplayPorts often three and one HDMI port.
When it comes to HDMI versus DisplayPort, note some differences.
First, if you plan on using a 4K display, now or in the future, your card needs to support at least HDMI 2.0 or DisplayPort 1.2.
It's fine if the GPU supports anything above those labels, like HDMI 2.0b or DisplayPort 1.4.
The latest-gen cards from both makers will be fine on this score.
Also, for now, only DisplayPort 1.4 handles the most demanding monitor configurations, such as 4K at high refresh rates.
You should make sure you're buying a card with at least DisplayPort 1.4 if such a monitor is in your plans.
Note that some of the very latest cards from Nvidia in its GeForce RTX series employ a new port, called VirtualLink.
This port looks like and can serve as a USB Type-C port that also supports DisplayPort over USB-C.
What the port is really designed for, though, is attaching future generations of virtual-reality (VR) headsets, providing power and bandwidth adequate to the needs of VR head-mounted displays (HMDs).
It's nice to have, but no VR hardware supports it yet.
Looking Forward: Graphics Card Trends
Nvidia has been in the consumer video card driver's seat for a few years now, but 2019 should see more action than any year in recent memory to shake things up between the two big players.
But that could shift as 2019 progresses, with AMD's next-generation "Navi" cards expected later this year.
Based on a new 7nm manufacturing process, these cards could change AMD's fortunes in the graphics space.
VR: New Interfaces, New HMDs?
As we alluded to with VirtualLink, VR is another consideration.
VR's requirements are slightly different than those of simple monitors.
Both of the mainstream VR HMDs, the Oculus Rift and the HTC Vive, have an effective resolution across both eyes of 2,160 by 1,200.
That's significantly lower than 4K, and it's the reason why midrange GPUs like AMD's Radeon RX 580 or Nvidia's GeForce GTX 1060 can be used for VR.
On the other hand, VR demands higher frame rates than conventional gaming.
Low frame rates in VR (anything below 90 frames per second is considered low) can result in a bad gaming experience.
Coming cards in 2019, we suspect, may push the VR bar lower.
That said, as the year progresses we're going to see at least two new headsets that up the power requirements.
The Oculus Rift S, due to hit store shelves in May, will raise the bar to a resolution of 2,560 by 1,440, while the hotly anticipated Valve Index should release shortly after with an increased resolution, as well.
No firm details on exactly what kind of pixel density the Index will push, but expect it to at least meet the Rift S, if not exceed it when it hits shelves sometime in Q2.
High-Refresh: A New Frontier for Serious Gamers
Finally, bear in mind a further trend gaining momentum on the monitor side of things: high-refresh gaming monitors.
For ages, 60Hz (60 screen redraws a second) was the panel-refresh ceiling for most PC monitors.
We're seeing the emergence of lots of models now with higher refresh ceilings, designed especially for gamers.
These panels may support up to 120Hz, 144Hz, or more for smoother gameplay.
This ability can also be piggybacked with FreeSync or G-Sync adaptive sync to enable smooth frame rates when the card is pushed to its limit.
What this means: If you have a video card that can consistently push frames in a given game in excess of 60fps, you may be able to see those formerly "wasted" frames in the form of smoother game motion.
Most casual gamers won't care, but the difference is marked if you play fast-action titles, and competitive e-sports hounds will find the fluidity a competitive advantage.
See our picks for the best gaming monitors, including high-refresh models.
In short: Buying a powerful video card that pushes high frame rates can be a boon nowadays even for play at a pedestrian resolution like 1080p, if paired with a high-refresh monitor.
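To see why the refresh ceiling matters, compare frame times with refresh intervals; the sketch below (illustrative numbers, not benchmarks of any real card) converts both to milliseconds:

```python
def interval_ms(rate_hz: float) -> float:
    """Milliseconds between frames (or screen redraws) at a given rate."""
    return 1000.0 / rate_hz

card_fps = 100  # a card averaging 100fps in some game, as an example
for refresh_hz in (60, 144):
    panel = interval_ms(refresh_hz)
    frame = interval_ms(card_fps)
    print(f"{refresh_hz}Hz panel: redraw every {panel:.1f}ms; "
          f"{card_fps}fps card: frame every {frame:.1f}ms")
```

A 60Hz panel redraws only every 16.7ms, so a card delivering a frame every 10ms has frames that go unseen; a 144Hz panel redraws every 6.9ms and can display every one of them.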
Ready for Our Recommendations?
The GPUs below span the spectrum of budget to high-end, representing a wide range of the best cards that are available now.
We'll update this story as the graphics card landscape changes, so check back often for the latest products and buying advice.
Note that we've factored in just a sampling of third-party cards here; many more fill out the market.
You can take our recommendation of a single reference card in a given card class like the GeForce RTX 2060, or Radeon RX Vega 64 as a similar endorsement of the family as a whole.
Dual eight-pin power connectors and higher power rating.
Two-zone RGB LED lighting.
Cons: Cooling design exhausts air into chassis.
Ray-tracing and DLSS features remain underutilized, as with all RTX cards.
Bottom Line: A massive air cooler and dual eight-pin power connectors make MSI's GeForce RTX 2080 Gaming X Trio one of the most robust RTX 2080 partner cards we've seen.
Compact enough for easy fitting in your PC's case.
Headroom for mild overclocking.
Cons: Unlike Nvidia RTX 2060 Founders Edition, lacks a VirtualLink USB Type-C port.
Priced higher than the GeForce GTX 1060 it replaces.
Bottom Line: The best value yet in the GeForce RTX 20-series lineup, the RTX 2060 is a worthy, if more expensive, successor to the venerable GTX 1060.
Excellent 1080p and 1440p gaming performance.
Cons: Priced higher than GTX 1070 it replaces.
Not powerful enough for maxed-out 4K gaming in every game.
Bottom Line: MSI's GeForce RTX 2070 Armor graphics card has great cooling, plus overclocking headroom to spare.
It's solid for 1440p play, but it performs much like the outgoing GTX 1080, so step up to an RTX 2080 if you want a generational performance gain.
Beats previous-generation GTX cards on both sides of its price.
Cons: Not great for 4K gaming.
MSI's upclocked card is closer in pricing to the RTX 2060 Founders Edition than most GTX 1660 Ti cards are.
Bottom Line: MSI's GeForce GTX 1660 Ti Gaming X 6G is exceptional at doing what it's designed to do: deliver a moderate-cost GPU option to gamers in search of high refresh rates at 1080p.
Runs cool and quiet.
Includes ray tracing and DLSS support for future games.
Cons: Hiked-up price, versus GTX 1080 Founders Edition.
Hard to judge value of ray tracing and DLSS until games come to market.
Cooling design exhausts most air into case, not out.
Bottom Line: An exceptionally powerful graphics card, the GeForce RTX 2080 Founders Edition is a home run for gaming at 4K or high refresh rates.
Only its pricing and the lack of games supporting ray tracing and DLSS keep it from being a grand slam right from launch.
Supports ray tracing and DLSS for future games.
Easy to attain at least modest overclocks.
Cons: Games will take time to adopt ray tracing and DLSS.
Bottom Line: A Ferrari among gaming GPUs, Nvidia's GeForce RTX 2080 Ti Founders Edition represents the fastest class of consumer graphics card that money can buy.
Just be ready to pay a supercar price to enjoy this luxury ride for 4K and high-refresh gaming.
Cons: Not much faster than Radeon RX 480, despite being larger, power-hungrier.
XFX model we tested is big for a card in this class.
XFX card had tricky, recessed eight-pin power connector.
Bottom Line: AMD's "refined" Polaris card trades blows with Nvidia's competing GTX 1060.
It's a solid pick for 1080p gaming at high settings, or 1440p play if 60fps isn't your aim.
Good for high-fps 1080p and 1440p gaming.
Includes three free AAA games at this writing.
Cons: High power requirements.
Physically large for a mid-level graphics card.
The Radeon RX 580 is considerably cheaper and not much slower.
Bottom Line: If you ignore power consumption, the Radeon RX 590 is the best-performing midrange card you can buy (as tested in this XFX model), showing double-digit gains over the GeForce GTX 1060.
However, the existing Radeon RX 580 has the economic edge.
Smooth performance at high resolutions.
Cons: High power demands compared with the competition.
Frame rates fall behind the GTX 1080, particularly at resolutions below 4K.
Bottom Line: While the Radeon RX Vega 64 can go toe-to-toe with Nvidia's GTX 1080, it's a bit late to the party and has much higher power requirements.
True two-slot case fitment.
Cons: Generally, falls behind GeForce RTX 2080 for 1080p and 1440p gaming.
Priced higher than its predecessor.
High board power consumption.
Bottom Line: AMD's new Radeon flagship graphics card, the Radeon VII is a worthwhile if power-hungrier alternative to the GeForce RTX 2080 for 4K gaming, but it generally isn't as fast at 1080p or 1440p resolutions.
John is PCMag's executive editor for hardware.
A veteran of the popular tech site and magazine Computer Shopper from 1993 to 2017, he has covered just about every kind of computer gear—from the 386SX to 18-core processors—in his long tenure as an editor, a writer, and an advice columnist.