Tuesday, December 18, 2012

Soldering On - The Future of the Desktop CPU

Back in November, a slide was leaked to the press detailing Intel's CPU roadmap for the next couple of years. What made the leak so controversial was the suggested move away from the Land Grid Array (LGA) sockets currently used to seat processors. The slide seems to indicate that in 2014, a Ball Grid Array (BGA) package would be used to mount Broadwell CPUs directly to the motherboard with solder.

Shortly after the leak, there was a flurry of articles posted online about the future of the (desktop) PC as we know it, which culminated in Intel making a statement to Maximum PC, essentially denying the move away from LGA packaging. In his article, Maximum PC's Deputy Editor Gordon Mah Ung made some good points about why it would be difficult for Intel, and the desktop PC industry as a whole, to switch to BGA packaging. However, demand for the traditional desktop form factor has decreased significantly in the consumer marketplace, and I felt that this recent insight into a possible future was worthy of a post.

Standards vs. Style

For decades now, the desktop PC form factor (specifically, the AT and ATX standards) has allowed people to design, build, upgrade and maintain their own systems using a slew of components produced by dozens of manufacturers. With the move towards smaller and more portable devices, we have seen the general consumer eschew this bulky device in favour of laptops and, more recently, tablets and smartphones. Those consumers who do opt for a desktop system are often drawn towards all-in-one devices like the Apple iMac or HP Omni machines.

In many of these new, slimmer devices, the CPU is often attached directly to the motherboard or difficult to replace. In the case of the iMac, it appears you can upgrade the CPU, but the new processor can't deviate too much from the original CPU's TDP. This makes sense; Apple don't want consumers meddling with their systems' interiors, so why add support for other processors? For the most part, processors are not considered "upgradable" in slimmer/portable devices; people will just buy a whole new device when their existing one becomes too slow.

But it hasn't stopped at the CPU: in the quest to produce the most portable, slim and desirable devices, manufacturers have been directly attaching other components to systems' motherboards, such as RAM and storage (solid-state NAND chips). This has produced some extremely thin machines, but at the expense of user-serviceability (in fact, it's produced some of the most environmentally unfriendly machines ever). Purchasing one of these machines is akin to buying a phone or tablet; you need to ensure the system's specs will be sufficient for the entire life cycle of the machine. I think this is a bit unfair, considering most people upgrade their phones yearly, whereas I would expect a laptop purchase to last an absolute minimum of three years (I actually have a seven-year-old laptop at the heart of my CCTV system).

The End is Nigh?

Given the trend of smartphone/tablet SoCs slowly absorbing more functionality, and portable computers becoming harder to maintain and upgrade, I felt comfortable knowing that my trusty (yes, big and bulky) desktop would continue to provide me with an upgradable, customisable and truly personal computer. That was, until the slide was leaked to the press and all the speculation started!

It was while listening to the Anandtech podcast (and later reading the Maximum PC article) that things were put into perspective. On the show, the possibility was raised that Broadwell may only be intended for mobile devices and that desktop users would have to wait until the following major architectural change (a "tock" in Intel's release cadence). It was even pointed out that Intel has set a precedent for this already with the move from Nehalem to Sandy Bridge; no six-core, high-end part was released to replace the older equivalent, which is why so many people delayed upgrading their Gulftown-based systems (i7-970, 980X, 990X) until Sandy Bridge-E was announced. If you wanted an eight-core part using the newer architecture, it would be necessary to splash out on a Xeon system; a much more expensive proposition.

For the enthusiast market, it has been argued that processor upgrades are a rarity. Given Intel's socket lifecycle, upgrading one's PC will usually involve a new motherboard purchase anyway, to accommodate a new CPU architecture. While this is true, I know many people who bought/built a system and had to settle for a lower-spec CPU, but were later able to afford a higher-end part thanks to the naturally falling prices of product lines as they age. Beyond that, there are two other issues that I, and many others in the system-builder community, have concerns over:

  • Repairing/replacing damaged components - the obvious issue. If I have an expensive CPU attached to a motherboard, it's not easy to replace either one in the event of hardware failure. A motherboard has a lot of additional components; should I need to replace it, why should I be lumbered with the cost of replacing a (perfectly good) CPU?

  • High-end motherboards paired with high-end CPUs/confusing product line-ups - it's not clear how this integrated CPU approach will work from a buyer's perspective. Will motherboard manufacturers attempt to produce a variant of each board with each CPU, or will they simply put the high-end CPUs on the high-end boards? The former would probably be costly for the manufacturer and confusing for the consumer, while the latter would reduce consumer choice; currently, it's possible to pair a high-end motherboard with a lower-end CPU, taking advantage of the board's features (additional SATA and USB ports, RAID, Wi-Fi, etc.) to create a useful workstation or home server.

Brave New World

As more and more motherboard functionality is absorbed by the CPU, it will be interesting to see how long the above two issues cause problems for system builders. It's pretty clear that the ATX form factor isn't a priority any more. Without AMD providing competition in the enthusiast/performance market, Intel can focus on the battle with ARM-based systems.

If the ATX form factor is slowly becoming extinct, will we see an alternative for the hardware enthusiast? Personally, I hope so, and the rise in popularity of Micro-ATX and Mini-ITX systems is promising. These smaller form factors tend to require more forward planning and are more complex to build in, sometimes requiring case modifications, especially if the build uses higher-end components or liquid-cooling solutions.

Something I found particularly interesting in the Anandtech podcast was the mention of discussions with industry players (before this recent leak) who had revealed a possible future: boards with CPU, RAM, etc. directly attached, producing modular systems. This would fall in line with Intel's Next Unit of Computing (NUC) initiative, and so perhaps the future for home desktops is consumer blade systems; upgrading or replacing a computing module in your home server would be as simple as swapping out a NUC. I could see specialised NUCs being developed for gaming or GPU compute, high-I/O workloads, etc., which would allow hardware enthusiasts to customise their builds to their needs.

Maybe the outlook isn't as bleak as I first imagined... Still, I might brush up on my soldering skills and invest in re-flowing equipment, just to be on the safe side!