Since the dawn of computer gaming, framerate has been a major constraint on game developers and hardware producers. But the number of frames per second (fps) a player needed to “enjoy” a game was, to a certain extent, a matter of opinion. Consoles settled into the comfort zone of 30 fps for most titles, while the PC community, with its more powerful (but more expensive) and more customizable hardware, pushed the framerate of each successive game higher. The rigs that PC gaming enthusiasts built were capable of running games well beyond 60 fps, the “sweet spot” most of the PC community agreed was preferable for the best experience with a game.
But for many years, these custom-built gaming computers weren’t actually doing much more than driving up electricity bills. This is due to a monitor constraint that has only really been investigated and addressed within the past few years, with the rise of competitive gaming tournaments. Computer monitors all run at a certain factory-set frequency, measured in hertz. This frequency is the number of times a monitor refreshes per second, which essentially locks each monitor to a maximum framerate.
Until recently, nearly all consumer-available monitors ran at either 30 or 60 hertz, meaning these monitors could only display anything, games included, at either 30 or 60 frames per second. So if a PC gaming enthusiast with the latest graphics card and processor was running a game at 100 fps on a 60 hertz monitor, 40 of those frames were processed by the computer but never displayed. Some people tried overclocking their monitors to achieve higher refresh rates, but the gains were usually small and the risk to the monitor was large.
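The arithmetic here is simple enough to sketch. Assuming no frame synchronization, a fixed-refresh panel can show at most its refresh rate in frames per second, and everything the GPU renders beyond that never reaches the screen (the function name and numbers below are illustrative, not from any real tool):

```python
def wasted_frames(rendered_fps: float, refresh_hz: float) -> float:
    """Frames per second rendered by the GPU but never shown on screen,
    given a monitor locked to refresh_hz refreshes per second."""
    return max(0.0, rendered_fps - refresh_hz)

# The example from the text: 100 fps on a 60 Hz monitor.
print(wasted_frames(100, 60))   # 40.0 frames per second go undisplayed
print(wasted_frames(100, 144))  # 0.0 -- a 144 Hz panel shows all 100 frames
```

This is of course a simplification: without vertical sync, the extra frames can still cause tearing rather than vanishing cleanly, but the basic point stands that the monitor's refresh rate caps what the player actually sees.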
This refresh rate constraint began to change as competitive games gained popularity and professional players grew more skilled. Players involved in very fast, reflex-based games (particularly shooters) began to recognize the advantage they might gain if their monitors could actually display the wasted frames their graphics cards were pumping out. Gaming at a higher framerate could give them a smoother viewing experience and allow them to aim with greater precision.
And so the market for what are called “high refresh rate” monitors was born. Companies like Asus and BenQ began to manufacture panels that could run at 100 hertz, and the monitors built around them became available to consumers (although they came at a steep price). Manufacturers continued pushing the limits of LCD technology, releasing monitors that ran at 120, 144, 165, and just recently 200 hertz. In fact, just a couple of weeks ago at the Consumer Electronics Show, Asus unveiled a 240 hertz monitor, which would allow for framerates that are laughably unachievable in many AAA titles, even with the most recent and powerful hardware. Consumers and manufacturers are now left to ponder whether further improvement is necessary, or whether the refresh rate barrier has been officially conquered.