Back to the future: the second coming of overclocking

Tuesday, 09 December 2014 14:46

Tall gas cylinders stand like sentinels behind anxious devotees who tend to motherboards laced with wires.

Fluorescent tubes protrude from the machines, seeming to feed their depths with elaborate drips of sickly neon. Liquid nitrogen spills into makeshift sinks crowded with whirring components, coolly billowing fog over the benches.

At a glance, you could mistake this scene for the production department of a made-for-TV sci-fi relic. The outrageous machines could easily be failed prototypes for a Dalek’s innards or Doc Brown’s DeLorean. But this is real life: Taipei, Taiwan, 2014.

The room is one of many at this year’s Computex expo, and enthusiasts are using jerry-rigged PCs to overclock Intel’s new processor, codenamed ‘Devil’s Canyon’.


Wait, what?

The practice of overclocking dates back to the early 90s. It involves pushing a computer component, such as a central processing unit (CPU), to operate faster than the manufacturer’s prescribed frequency. Higher frequencies can deliver a boost in CPU performance, which results in faster applications.
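
To put rough numbers on that claim: for a task that is genuinely limited by the CPU, the best-case gain scales with the clock speed. A minimal Python sketch, using purely hypothetical figures, illustrates the arithmetic:

```python
# Illustrative arithmetic only: the figures are hypothetical, and real
# workloads rarely scale perfectly with clock speed (memory, storage and
# graphics bottlenecks all get in the way).
stock_ghz = 3.5          # hypothetical factory clock speed
overclocked_ghz = 4.2    # hypothetical overclocked speed

speedup = overclocked_ghz / stock_ghz
print(f"Best-case throughput gain: {speedup - 1:.0%}")                   # -> 20%
print(f"A 60-second CPU-bound job could drop to ~{60 / speedup:.0f}s")   # -> ~50s
```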

However, a CPU running faster than its factory specification typically demands extra power, which makes it run hotter and can shorten its lifespan. In some unfortunate instances, high operating temperatures can destroy the component entirely.

This potential for disaster has always given overclocking an edge of risk, which probably accounts for the tension among the event’s attendees. It’s also the reason for the clouds of fog, tubing and fans that encompass many of the motherboards – though the neon lighting is arguably only good for dramatic effect.

What’s the appeal?

Overclocking has long been considered a dark art in the world of computing, but it still attracts fans for many different reasons.

Any application that involves complex visuals is likely to benefit from higher CPU output. For this reason, people working in 3D animation or video production sometimes overclock their machines in search of peak performance – and greater productivity.

However, the most common and widely acknowledged use for overclocking is in competitive gaming. Impoverished gamers brave the risks of overclocking $200 CPUs in search of $1,000 performance levels, or to push aging PCs until they can handle modern games.


Others overclock more expensive gear to get the best frame rates possible for online gaming. Though such advantages are frowned upon in professional competitions, it’s this application that’s most familiar to Tony Trubridge, Managing Director of Australian e-sports crew Team Immunity.

“The best hardware gives you the best opportunity to win,” he says. “If your opposition only has 60 frames per second in the game and you’ve got 500, you’re seeing 440 more refreshes per second than they are. Physics for most of these games is CPU-based, so when there’s a lot happening, if you don’t have a good processor, you notice the slowdown.”

YouTube comparisons of overclocked PCs running CPU-intensive games show that the benefits are fairly subtle, especially to non-gamers. For those stalking opponents in the battle-scarred scenes of Titanfall, on the other hand, even an extra 20 frames per second can be the deciding factor in who pulls the trigger first.
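
To make that margin concrete, a quick back-of-the-envelope calculation (a Python sketch with illustrative figures, not measurements) shows how much sooner each frame arrives:

```python
# Back-of-the-envelope frame-time arithmetic with illustrative figures.
def frame_time_ms(fps):
    """Time between successive rendered frames, in milliseconds."""
    return 1000.0 / fps

baseline_fps = 60
boosted_fps = baseline_fps + 20   # the "extra 20 frames per second" above

advantage_ms = frame_time_ms(baseline_fps) - frame_time_ms(boosted_fps)
print(f"{baseline_fps} -> {boosted_fps} fps: each frame arrives ~{advantage_ms:.1f} ms sooner")  # ~4.2 ms
```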

Lingering risks

With great power comes great responsibility. Pushing a CPU beyond the manufacturer’s specs can bring better performance, but may also void your computer’s warranty altogether. And that’s before you consider the range of things that can go wrong at the hardware level.

Without a heat sink to carry away the extra heat, a CPU can fail at high temperatures. In the days when not all processors came with a thermal cut-off, highly overclocked components could become hot enough to melt through motherboards.

Stress-testing and monitoring software means there is a much lower risk of misadventure for today’s overclockers, but anyone considering more than a marginal tweak is wise to invest in some kind of cooling system.
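
For a flavour of what that monitoring involves, here is a minimal Python sketch built on the third-party psutil library. It simply polls clock speed, load and core temperatures; the temperature readout assumes an operating system that exposes thermal sensors (as most Linux systems do), and the ‘coretemp’ sensor name is typical for Intel CPUs rather than guaranteed:

```python
# Minimal monitoring loop using psutil; run it alongside a stress test to
# watch clock speed, load and temperatures. Sensor support varies by OS.
import time
import psutil

def log_cpu_state(samples=10, interval=1.0):
    for _ in range(samples):
        freq = psutil.cpu_freq()                  # current/min/max clock in MHz (may be None)
        load = psutil.cpu_percent(interval=None)  # utilisation since the last call, in %
        sensors = psutil.sensors_temperatures() if hasattr(psutil, "sensors_temperatures") else {}
        core_temps = [t.current for t in sensors.get("coretemp", [])]
        hottest = f"{max(core_temps):.0f}°C" if core_temps else "n/a"
        current_mhz = freq.current if freq else float("nan")
        print(f"{current_mhz:.0f} MHz | load {load:.0f}% | hottest core {hottest}")
        time.sleep(interval)

if __name__ == "__main__":
    log_cpu_state()
```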

“It depends what you want to get out of it,” says Trubridge. “If you’re an entry-level overclocker at home, stick to air cooling. Using the Intel i7 4790K as an example, you’d probably get between 4.3GHz and 4.6GHz on air safely.”


Water cooling is appropriate for more ambitious overclocking goals but introduces the possibility of flooding thousands of dollars’ worth of hardware. One of Trubridge’s early overclocking peers discovered this the hard way.

“He had just installed a new water cooling loop and hadn’t tested whether it leaked,” he says. “Generally, you set a cooling system up before you even install other hardware, and then run it to make sure it’s watertight. But he hadn’t done that. He just threw it in there and it started leaking. It shorted his motherboard and a $1,100 video card.”

Aside from poorly fitted cooling systems, the most dramatic failures usually involve extreme voltages. Push too much power through a system and the capacitors in its power supply can explode – loudly – and fill the chassis with acrid smoke.

These catastrophes may distress readers who once considered computer hardware their most prized possession. However, after the grief subsides, the only tangible thing today’s overclockers stand to lose is money. And given the price of overclockable processors these days, cost is far from the obstacle it used to be.

Overclocking for profit and fun

It’s easy to understand why someone would risk their gear for greater performance. But it’s harder to appreciate that many of the most devoted overclockers don’t actually use the computers they’re working on.

Around the world, there are entire communities of technical fanatics who treat overclocking as an end in itself. These overclockers are the purists, the die-hards. They typically have bigger budgets than their cash-strapped gaming peers and are happy to spend thousands of dollars to overclock just because they can, pushing high-end CPUs to breaking point for fun and, sometimes, glory.

Many in the Computex overclocking space fall into this category. Each figure hunched intently over the long benches is part of a team competing to set new records in a competition hosted by Intel.


Before the day is out, one crew manages to overclock the new Intel chip to a frequency of 5.498GHz in the air/liquid cooling category – a world record for which they take $4,000 of a larger prize pool.


Another team uses liquid nitrogen cooling to achieve 6.331GHz. Both results are produced using a flagship processor that ships with a rated ‘turbo’ frequency of 4.4GHz.

“It’s a different kind of competition,” says Trubridge. “A lot of these guys are just hardcore, like I used to be many years ago. They don’t care if they melt a $1,000 processor; they just want to see what they can do with the hardware and whether they can potentially hit some world records. It’s also a test of their personal level of knowledge. At the higher levels, there’s a real art to it.”

Simpler times

Trubridge started overclocking in his early teens. His goal: to run the latest and greatest games at their highest settings and gain a competitive edge while playing first-person shooters with friends.

“Overclocking is like throwing a turbo on your car,” he says. “Back in the day, you’d use one of the really cheap, entry-level Pentium chips. You’d get an $80 chip, throw a decent water cooling or air cooling kit on it, dial up the voltage and frequency, and turn it into something that would normally cost you $400. It was great.”

To a teenage gamer in the 90s, these performance improvements seemed like a revelation. But they came with a price. Apart from the initial cost of the components (a substantial outlay for a 14-year-old), success depended on the strength of your acquired knowledge.


At the time, overclocking was an obscure subculture. Without YouTube tutorials to fall back on, reliable ‘how to’ information was scarce. One wrong move could destroy the family PC and (more importantly) shatter dreams of holding the high scores on local Quake servers.

Trubridge was lucky in this respect: his next-door neighbour was an avid overclocker, and showed him how to hot-rod his first CPU.

“It’s so much easier than it was,” he says. “Way back when, you’d spend days trying to eke everything you could out of your system. There were so many variables and parameters. You’d even go as far as voltage mods to provide additional power – literally hot-wiring the motherboard by hand with solder and wires to try and get more voltage through to the processor.”

This kind of in-depth modification still happens today, says Trubridge, but tends to be the domain of those who also use liquid nitrogen and dramatic lighting. For users who just want a modest bump in clock speed, modern overclocking is quite straightforward – provided you are willing to void warranties and risk losing components.

“These days, a lot of people take the value option and overclock an entry-level K-series chip,” says Trubridge. “If you’re prepared to spend two minutes in BIOS changing your frequency and voltage values, that’s all you really need to do to bring a basic Intel CPU up to four-plus gigahertz. It’s absolutely the best and most cost-effective way to get better performance.”
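
As a rough illustration of what those frequency values involve on an unlocked Intel chip – the core clock is the base clock multiplied by a CPU multiplier – here is a hypothetical helper, not a real tuning tool, that works out the multiplier for a target speed and flags anything beyond the 4.3–4.6GHz air-cooling range Trubridge mentions:

```python
# Hypothetical helper for illustration only: safe values vary from chip to
# chip, and higher clocks usually need a small core-voltage bump as well,
# which is deliberately left out here.
BCLK_MHZ = 100                 # typical base clock on these Intel platforms
AIR_COOLING_CEILING_GHZ = 4.6  # rough upper bound quoted for air cooling

def multiplier_for(target_ghz, bclk_mhz=BCLK_MHZ):
    """Whole-number CPU multiplier needed to reach target_ghz."""
    return round(target_ghz * 1000 / bclk_mhz)

for target_ghz in (4.2, 4.5, 4.8):
    note = "" if target_ghz <= AIR_COOLING_CEILING_GHZ else "  (beyond typical air cooling)"
    print(f"{target_ghz} GHz -> set multiplier to x{multiplier_for(target_ghz)}{note}")
```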

In 2014, the barriers to entry for overclocking are lower than they have ever been. Coupled with Intel’s recent launch of the Pentium Anniversary Edition – an entry-level chip designed specifically for overclockers – this may herald a new age for the hobby.

One in which yesterday’s would-be elite gamers can finally afford the tools to realise their lofty teenage aspirations. Where animators can cost-effectively render complex weather environments without fear of hardware failure, and hardcore overclockers can plough through processors with greater gusto than ever before. It could well be the beginning of something beautiful.
