What Kind of System is Required for 4k Gaming?

4k, or "UHD," monitors and TVs have been on the market for a while. If your current video card has a DisplayPort or HDMI output, you can likely hook a 4k monitor up to your current setup. For watching movies, that should be all you need. As a desktop monitor for non-gaming purposes, it would work, but might occasionally show a slight lag. For gaming, though, you need one of the current generation of video cards, or relatively recent cards in SLI or Crossfire. And to really shine for gaming, you need to carefully optimize your system for that purpose.

Refresh rates

The root problem with 4k is refresh rates. For watching video, 30Hz is sufficient. For desktop applications, 30Hz is borderline: you may notice that the mouse seems to jump a bit, or you may spot tearing or lag (depending on whether you have v-sync enabled) when doing things like moving a window. For gaming, 30Hz can be fairly annoying. It can significantly reduce immersion in a game, or even introduce enough lag to affect your accuracy and timing in game. Gamers typically shoot for 60Hz or higher.

One thing to note: you often see stats about the frame rate a particular card can sustain with a particular game at ultra settings. Those stats can be misleading. If you're really pushing the limits of a video card, the frame rate may not be at all consistent. You might get 60Hz inside buildings and then drop to 20Hz when you step outside or when a particularly complex effect appears on screen. What you really care about isn't the maximum frame rate, or even the average, but the worst frame rate the card ever hits in that game.
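To make that concrete, here is a small sketch (the frame times are invented sample data, not a real benchmark) showing how an average can hide exactly the stutter you'd feel in play:

```python
# Sketch: why the minimum frame rate matters more than the average.
# The frame times below are made-up sample data, not a real benchmark.
frame_times_ms = [16.7] * 90 + [50.0] * 10  # mostly 60fps, with 10% slow frames

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
worst_fps = 1000 / max(frame_times_ms)

print(f"average: {avg_fps:.0f} fps")  # looks respectable on paper
print(f"worst:   {worst_fps:.0f} fps")  # what you actually feel as stutter
```

A card that averages 50fps but dips to 20fps will feel far worse than one that holds a steady 40fps, which is why benchmark sites increasingly report "1% low" figures alongside averages.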

Connectivity

4k displays generally have two connection options: HDMI and DisplayPort. You can run a 4k display off of older versions of either, but the older versions will limit your refresh rate to 30Hz. If you want a 60Hz refresh rate, you need either HDMI 2.0 or DisplayPort 1.2 on both your video card and your display. The most widely deployed version of HDMI is 1.4, which does not support 4k at 60Hz. HDMI 2.0 only started becoming commercially available very recently. DisplayPort 1.2, on the other hand, is the most common version of DisplayPort and was originally released in 2009.
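The 30Hz limit is simple arithmetic. A rough estimate (ignoring blanking intervals, so the true requirement is somewhat higher) shows 4k at 60Hz overwhelming HDMI 1.4's roughly 8.2 Gbit/s of effective video bandwidth, while DisplayPort 1.2 carries about 17.3 Gbit/s:

```python
# Back-of-the-envelope check of why HDMI 1.4 tops out at 30Hz for 4k.
# This ignores blanking intervals and assumes 24-bit color, so the real
# requirement is somewhat higher than the estimate.
width, height, bits_per_pixel = 3840, 2160, 24

def gbps(refresh_hz):
    """Approximate raw pixel bandwidth in gigabits per second."""
    return width * height * bits_per_pixel * refresh_hz / 1e9

print(f"4k @ 30Hz: {gbps(30):.1f} Gbit/s")  # fits under HDMI 1.4's ~8.2
print(f"4k @ 60Hz: {gbps(60):.1f} Gbit/s")  # needs HDMI 2.0 or DP 1.2
```

At 30Hz the stream squeezes under HDMI 1.4's ceiling; at 60Hz it needs roughly 12 Gbit/s before overhead, which only HDMI 2.0 and DisplayPort 1.2 can carry.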

You can also technically hit 60Hz by using multiple 30Hz cables, each driving a portion of the screen, although displays that support this are relatively hard to come by, and a video card that has neither DisplayPort 1.2 nor HDMI 2.0 is unlikely to have enough power to fuel a 4k display at 60Hz while gaming anyway.

My recommendation would be not to invest in a 4k monitor unless you already have, or can get your hands on, a video card with either DisplayPort 1.2 or HDMI 2.0, even if you intend to use the monitor only for desktop applications and video. If you intend to use your system for gaming, I consider support for one of these connectivity options an absolute requirement.

Video cards

Buying a video card is always a tricky balance. In addition to meeting the connectivity requirements above, you ideally want a card that can drive the most demanding games at ultra settings and still have a bit of headroom, because two years from now games will be more demanding still, and you'll have to back off of ultra pretty quickly if you didn't have much headroom to start with. The catch is that game designers tend to calibrate their ultra settings to make use of whatever the top-of-the-line cards of the day can do, so to get decent headroom you may have to consider SLI or Crossfire configurations, and given how pricey the top-of-the-line cards are, that can get quite costly (not to mention loud and hot).

If you are shooting to maintain 60Hz with a single card at ultra settings, your options are quite limited: the nVidia Titan X, the nVidia 980 Ti, or the Radeon R9 Fury X. The best bet is probably the 980 Ti. It performs nearly as well as the Titan X, and better than the R9 Fury X, and costs distinctly less. Although the Titan X remains nVidia's flagship, the 980 Ti is a bit newer and, in some ways, improved.

These cards can generally hit, or at least get near, 60Hz on the most demanding games at the most demanding settings. But, as discussed above, they can also bog down significantly in certain situations, such as when you can see far into the distance. If you really want a smooth 4k experience at the highest graphical settings, you would actually need two of these cards running in SLI or Crossfire.

It is important to note that you have another option for achieving that kind of flawless 4k play: you can wait. nVidia has announced its upcoming Pascal line of cards (the 1000 series), and all indications are that they will be a dramatic upgrade, likely capable of comfortably running any game at ultra settings in 4k while consistently maintaining 60Hz. However, the current estimate for a release date is Q2 of 2016, and you may then need to wait a bit longer before you can actually get your hands on one. That is quite a while. And, heck, by then you'll probably be reading articles about how you should wait for the generation after that for 8k support... Also, you don't necessarily need to run everything at maximum settings to get a really great gaming experience. Anti-aliasing in particular is less important on a 4k display, where the pixels are much smaller, than on an ordinary display.

Monitors

The choice of monitor is largely driven by what you plan to use it for. Different monitors are optimized for size, color precision, price, viewing angle, or refresh speed. You can spend obscene amounts of money on the monitors professional graphic designers use, but the truth is, most of us wouldn't really notice the difference. Still, you should look around at a number of monitors to get a sense of how they differ.

That said, in my view, as a gamer, you have one basic choice to make: size vs. performance. You probably already know which is more important to you. When you first heard about 4k, did you think "OMG, I can have a HUGE display now!" or did you think "OMG, the picture will be SO sharp!"? That's probably your answer right there.

It also depends partly on what types of games you like. First-person shooters aren't necessarily improved much by an obscenely large display: it takes more time to run your eyes from one corner to another, which might slow you down, and the relatively limited set of things you need to watch for fits in a smaller area. Strategy games, on the other hand, are often greatly improved by having more of the world on screen at any given time. Games that involve lots of maps, radar screens, build orders, inventories, and whatnot benefit from being able to leave many things open at once, and hence are better suited to larger displays. You should also consider what you do other than gaming and whether it would benefit from a massive amount of screen real estate. And, of course, you need to think about where you're going to put the thing. A 40 inch monitor is a very large object to fit on your desk.

If you are looking for a normal-sized but extremely high-performance gaming monitor, your best bet right now is the 28 inch Acer XB280HK. 28 inches is a good size for most types of gaming: plenty big to be deeply immersive without being so big that you fail to notice things happening in far corners. It is reasonably priced and has solid specifications. But one relatively rare feature it has is something called G-Sync. G-Sync is an nVidia technology that allows the video card to tell the monitor when to display a frame. That is useful because video cards render frames at a variable speed (more rapidly when the load is light, more slowly when it is heavy) while monitors traditionally display frames at a fixed rate, which causes tearing or stuttering when your frame rate drops: the monitor either skips frames when the card isn't ready in time or wastes frames when the card is moving faster than the monitor. G-Sync harmonizes the two and gives a notably smoother image, especially at lower frame rates. Videos demonstrating the difference are easy to find online and worth watching. The XB280HK has two significant limitations: G-Sync only works with nVidia cards, and the XB280HK only has a DisplayPort input. But if you have an nVidia card with a DisplayPort output, and you're not looking for a huge monitor, this is the way to go.
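A toy model makes the pacing difference easier to see. This sketch (with invented render times, and simplified in that it lets the GPU keep rendering regardless of the display) contrasts a fixed 60Hz monitor, where each finished frame waits for the next refresh tick, with a G-Sync-style display that refreshes the moment a frame is ready:

```python
# Toy model of frame pacing: fixed 60Hz refresh vs. G-Sync-style
# variable refresh. Render times are invented for illustration.
import math

REFRESH_MS = 1000 / 60  # ~16.7ms per tick on a fixed 60Hz monitor
render_times_ms = [14, 15, 20, 35, 17, 18, 40, 15]  # variable GPU load

def fixed_refresh_display_times(renders):
    """Each finished frame waits for the next fixed refresh tick."""
    shown, t = [], 0.0
    for r in renders:
        t += r  # frame finishes rendering at time t
        shown.append(math.ceil(t / REFRESH_MS) * REFRESH_MS)  # next tick
    return shown

def gsync_display_times(renders):
    """The monitor refreshes the moment each frame is ready."""
    shown, t = [], 0.0
    for r in renders:
        t += r
        shown.append(t)
    return shown

# Gaps between displayed frames: fixed refresh quantizes them to
# multiples of ~16.7ms (slow frames stutter to 33 or 50ms), while
# variable refresh tracks the actual render times smoothly.
for name, times in [("fixed", fixed_refresh_display_times(render_times_ms)),
                    ("gsync", gsync_display_times(render_times_ms))]:
    gaps = [b - a for a, b in zip(times, times[1:])]
    print(name, [round(g, 1) for g in gaps])
```

Under fixed refresh, a frame that misses its tick by even a millisecond is held a full extra 16.7ms, which is exactly the juddering you see when a card hovers just under 60fps.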

Personally, what excites me more about 4k is the ability to have a huge screen. I love screen real estate, both for gaming and for desktop applications, and have been frustrated for years that the biggest I could really get was 30 inches. To me, that is the biggest problem 4k solves. Note, of course, that spreading those pixels across 40 inches instead of 28 means significantly less sharpness. A 40 inch 4k monitor still has finer pixels than a 30 inch 2560x1600 monitor, so it is still a very sharp display, just not as mind-blowingly sharp as when you pack all those pixels into 28 inches. If your priority is size, I recommend the 40 inch Seiki Pro SM40UNP. Note that none of the 40 inch 4k monitors have G-Sync yet, so you may not be able to push the settings quite as hard on this monitor as you could on the Acer. However, with the Seiki you get a very large, high-quality display that runs at up to 60Hz for not a whole lot more money than the 28 inch Acer costs.
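The sharpness comparison is easy to check with a quick pixel-density calculation (diagonal pixel count divided by diagonal inches):

```python
# Pixel density (pixels per inch) for the displays discussed above.
def ppi(width_px, height_px, diagonal_in):
    """Pixels along the diagonal, divided by the diagonal in inches."""
    return (width_px**2 + height_px**2) ** 0.5 / diagonal_in

print(f'28 inch 4k:        {ppi(3840, 2160, 28):.0f} ppi')  # Acer XB280HK
print(f'40 inch 4k:        {ppi(3840, 2160, 40):.0f} ppi')  # Seiki SM40UNP
print(f'30 inch 2560x1600: {ppi(2560, 1600, 30):.0f} ppi')  # older 30 inch
```

The 40 inch 4k panel comes out around 110 ppi, still ahead of a 30 inch 2560x1600 display at roughly 101 ppi, but well short of the 28 inch 4k panel's roughly 157 ppi.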