News


AMD Announces K12 Core: Custom 64-bit ARM Design in 2016


In 2015 AMD will launch project SkyBridge, a pair of pin-compatible ARM and x86 based SoCs. Leveraging next generation Puma+ x86 cores or ARM’s Cortex A57 cores, these SoCs form the foundation of the next phase in AMD’s evolution, where ARM and x86 are treated as equal-class citizens. As I mentioned in today’s post, however, both of these designs really aim at the lower end of the performance segment. To address a higher performance market, AMD is doing what many ARM partners have done and is leveraging an ARM architecture license to design its own microarchitecture.

In 2016 AMD will release its first custom 64-bit ARMv8 CPU core, codenamed K12. Jim Keller is leading the team that is designing the K12, as well as a corresponding new 64-bit x86 design. AMD is pretty quiet about K12 details at this point given how far away it is. Given the timing I’m assuming we’re talking about a 14/16nm FinFET SoC. On the slide above we see that AMD is not only targeting servers and embedded markets, but also ultra low power client devices for its 64-bit ARM designs (presumably notebooks, chromebooks, tablets). AMD has shied away from playing in the phone market directly, but it could conceivably play in that space with its semi-custom business (offering just a CPU/GPU core with other IP). Update: AMD added that server, embedded and semi-custom markets are obvious targets for K12. 

There’s also this discussion of modularity, treating both ARM and x86 cores as IP modules rather than discrete designs. AMD continues to have a lot of expertise in SoC design; all it really needs is a focus on improving single threaded performance. I can only hope (assume?) that K12 won’t be Bulldozer-like and will instead prioritize single threaded performance. It’s important to point out that there hasn’t been a single reference to the Bulldozer family of CPU cores in any of these announcements either…

Update: Jim Keller added some details on K12. He referenced AMD’s knowledge of doing high frequency designs as well as “extending the range” that ARM is in. Keller also mentioned he told his team to take the best of the big and little cores that AMD presently makes in putting together this design. 

AMD Announces Project SkyBridge: Pin-Compatible ARM and x86 SoCs in 2015, Android Support


This morning AMD decided to provide an update on its CPU core/SoC roadmap, particularly as it pertains to the ARM side of the business. AMD already committed to releasing a 28nm 8-core Cortex A57 based Opteron SoC this year. That particular SoC is aimed at the enterprise exclusively and doesn’t ship with an on-die GPU.

Next year, AMD will release a low-power 20nm Cortex A57 based SoC with integrated Graphics Core Next GPU. The big news? The 20nm ARM based SoC will be pin compatible with AMD’s next-generation low power x86 SoC (using Puma+ cores). The ARM SoC will also be AMD’s first official Android platform.

I don’t expect we’ll see standard socketed desktop boards that are compatible with both ARM and x86 SoCs, but a pin compatible design will have some benefits for embedded, BGA solutions. AMD expects to target embedded and client markets with these designs, not servers.

AMD’s motivation behind offering both ARM and x86 designs is pretty simple. The TAM (Total Addressable Market) for x86 is decreasing, while it’s increasing for ARM. AMD is no longer married to x86 exclusively and by offering OEMs pin compatible x86/ARM solutions it gets to play in both markets, as well as benefit if one increases at the expense of the other.

Note that we’re still talking about mobile phone/tablet class CPU cores here (Cortex A57/Puma+). AMD has yet to talk about what it wants to do at the high end, but I suspect there’s a strategy there as well.

Corsair’s AX1500i Released: A 1500W 80 Plus Titanium PSU


Shopping around for a power supply on a tight budget can be a bit of an ordeal.  On forums, everyone will have their own opinion of what constitutes a good power supply, and similarly to mechanical HDDs, a single bad experience can put a user off a brand forever.  My golden rule, unless you need a specific feature/amperage on the power lines for unique GPUs, is to take the total power draw of your system and add 40%.  My analogy is thus: a car whose top speed is 80mph will squeak and rattle if you run it every day at 70mph, whereas a car whose top speed is 130mph will hum along nicely at 70mph.  Others may disagree, but I find this is a nice guideline when building systems for family and friends.
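For the arithmetic-minded, that rule of thumb is easy to sketch. The component wattages below are purely hypothetical examples for illustration, not measurements from any particular system:

```python
# Rough sketch of the "total draw + 40%" guideline described above.
# The component figures are hypothetical, not measurements.
component_draw_watts = {
    "CPU": 120,
    "GPU": 250,
    "motherboard/RAM/storage/fans": 80,
}

total_draw = sum(component_draw_watts.values())   # 450 W estimated load
recommended = total_draw * 1.40                   # add 40% headroom

print(f"Estimated load: {total_draw} W")
print(f"Suggested PSU rating: {recommended:.0f} W")  # 630 W
```

For this hypothetical build, the guideline points to a ~650W unit rather than one that would run near its ceiling every day.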

Most desktop systems bought and sold today are very basic, with integrated graphics or a low-end graphics card, making power requirements very low.  However the extreme is also true, with users wanting to make the most out of three or four high-end GPUs with a good deal of overclocking.  If you can recall our Gaming CPU article from April 2013, we used a 24-thread dual-processor system with four 7970 GPUs, lightly overclocked, which drew 1550W at load. This is why power supplies north of 1000W exist, and it can be very difficult to make these units highly efficient.  To that end, Corsair is today releasing the AX1500i, a 1500W model certified with 80 Plus Titanium qualifications.

80 Plus Titanium is a newer addition to the 80 Plus certification program, derived from server requirements and first realised back in 2012.  As with all 80 Plus specifications, it requires a minimum efficiency at 20%, 50% and 100% load (efficiency between these load points is not specified), although Titanium also adds a requirement at 10% load.  For the AX1500i, this means a minimum efficiency of 90/92/94/90% at 10/20/50/100% load in 110-120V regions and 90/94/96/91% in 220V+ regions.
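To illustrate what those certification figures mean in practice, the sketch below converts a certified load point into worst-case wall draw, using the 110-120V Titanium minimums quoted above. The function name and structure here are mine, not part of the 80 Plus specification:

```python
# Minimum efficiencies for 80 Plus Titanium at 110-120V, as quoted above,
# keyed by fraction of the PSU's rated load.
titanium_115v = {0.10: 0.90, 0.20: 0.92, 0.50: 0.94, 1.00: 0.90}

def max_wall_draw(rated_watts, load_fraction):
    """Worst-case AC draw at a certified load point: DC output / min efficiency."""
    dc_output = rated_watts * load_fraction
    return dc_output / titanium_115v[load_fraction]

# A 1500W unit at 50% load delivers 750W DC while being at least 94% efficient:
print(f"{max_wall_draw(1500, 0.50):.0f} W from the wall")  # ~798 W
```

In other words, at half load the AX1500i may waste at most ~48W as heat, which is part of why the fan can stay off at moderate loads.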

The Corsair design implements their Zero-RPM Fan technology, meaning the power supply fan will only activate when a 450W load or above is detected. 

The supply comes with ten connectors for PCIe devices, is fully modular, and has native USB support for Corsair Link for monitoring the power supply.  This includes real-time temperature, power use and efficiency ratings in the operating system.  The AX1500i blows the ATX specification out of the water in terms of size, measuring 225mm (8.86in) long, which is still shorter than a big GPU.

The price is not for the faint hearted: $450 MSRP, to be initially available direct from Corsair followed by worldwide distributors in late May.  This price is indicative of the high power rating combined with the high efficiency certification, as well as a 7-year warranty.  I have already seen interest online from extreme overclockers and modders designing hardcore top-end desktop machines, which indicates the niche that Corsair believes this supply will fit into.

A Discussion on Material Choices in Mobile


Within the past four years, the smartphone market has changed drastically. Displays have dramatically increased in quality. Battery life has improved. As OEMs converge on largely similar platforms, the material design of a phone has become increasingly important. Almost every OEM has made a major shift in the material design of their devices as the market becomes increasingly saturated and competitive. This is especially true at the high end, where the upgrade cycle has lengthened. As people find less and less reason to upgrade to the latest and greatest, OEMs have to change things to stave off decreasing growth. Overall, this seems to mean going “back to basics” with their new devices, which often entails improved material design.

This seems to make very little sense, especially when there are a great many people who simply aren’t concerned with materials. It’s not unusual to hear the argument that because everyone uses a case, the design of the device shouldn’t matter. It’s often said that aluminum devices are less durable and heavier, and have worse radio reception, than those made of polycarbonate. Other issues often cited include uncomfortable skin temperatures under load. Higher cost is also a problem, one that OEMs will often cite internally. With glass, it’s almost universally understood that any drop risks shattering the brittle back. So the question remains: why do OEMs continue to push material design?

Without a doubt, this is a complex topic. Material choices entail a huge number of trade-offs. There isn’t any one material that has the best compromises either. For the most part, there are three key materials that smartphones are made from. These three materials are plastic, glass, and metal.

Plastic

Within plastics, the most commonly used material is polycarbonate, which offers high impact resistance, relatively good temperature resistance, and extreme flexibility. A great example of polycarbonate would be the battery door of the Galaxy S and Note lines, or Nokia’s Lumia devices. In general, it’s almost impossible to point to a phone with a polycarbonate external build that has reception issues, as polycarbonate effectively doesn’t attenuate radio signals, as shown on page 38 of this study of radio propagation differences. As the market becomes squeezed by decreasing profit margins, the low price of polycarbonate relative to glass or metal is also a significant advantage that can’t be overlooked.

While these advantages are reason enough to make a smartphone or tablet with a polycarbonate casing, there are disadvantages as well. Polycarbonate is a poor conductor of heat, which means that in today’s thermally constrained devices, sustained CPU and GPU clock speeds on the SoC will be lower than in a phone or tablet made of a metal such as aluminum or magnesium. The same is generally true when comparing a polycarbonate device with a glass one. For reference, the thermal conductivity of aluminum is 205 watts per meter kelvin (W/m·K), versus 156 W/m·K for magnesium, 0.8 W/m·K for ordinary glass, and 0.22 W/m·K for polycarbonate. The unit expresses the rate at which heat flows through a material of a given thickness for a given temperature difference across it. This means that in today’s phones and tablets, one made from plastic will generally run slower in intensive games than one made of metal or glass, all else being equal.
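To put those conductivity figures in perspective, a back-of-the-envelope application of Fourier’s law of heat conduction (Q = k·A·ΔT/t) shows the gulf between the materials. The panel dimensions and temperature difference below are hypothetical, chosen only for illustration:

```python
# One-dimensional steady-state conduction (Fourier's law): Q = k * A * dT / t.
# Conductivities are the values quoted above; the geometry is a hypothetical
# 50mm x 100mm panel, 1mm thick, with a 10 K difference across it.
conductivity = {          # W/(m*K)
    "aluminum": 205,
    "magnesium": 156,
    "glass": 0.8,
    "polycarbonate": 0.22,
}

area = 0.05 * 0.10        # m^2
thickness = 0.001         # m
delta_t = 10.0            # K

for material, k in conductivity.items():
    watts = k * area * delta_t / thickness
    print(f"{material:13s} conducts {watts:8.2f} W through the panel")
```

Under these assumptions aluminum moves heat roughly 900 times faster than polycarbonate, which is why a metal casing can double as a heat spreader while a plastic one effectively insulates the SoC.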

On top of this, while polycarbonate is extremely impact resistant, the flexibility of the material is a major issue for smartphones that have to be as thin and compact as possible. People often bring up the car analogy to argue that polycarbonate protects a phone or tablet better, but there’s no such thing as a crumple zone in a phone. Even the back cover serves a purpose, as antennas are inserted into the back cover in order to have the space for the huge number of frequencies supported. Bending the back cover into the phone is often a dangerous problem, as it will affect the delicate antenna connectors, which are often small, spring-like pieces that make contact with the back cover. This is an issue because, while the metal antenna connectors are elastic to a certain extent, once stretched too far they won’t spring back. One of the most notable examples of this issue can be seen with the Tegra 3 variant of the HTC One X. This variant of the One X would often lose all WiFi and Bluetooth reception due to crushed antenna connectors. Fixing the problem required additional reinforcement to prevent the back cover from bending in too far. As seen in the photo from iFixit’s teardown of the Galaxy S4 below, the connectors are small gold-plated nubs that touch parts of the back cover.

Metal

On the other end, metal is often hailed by reviewers as a superior material. However, most reviewers focus upon look and feel, rather than the technical advantages and disadvantages of using metal. Of course, metal is an extremely broad term; metallic elements comprise around 80% of the known elements. For this discussion, the key metal used for the outer casing is aluminum. Magnesium is another commonly-used material, but is mostly limited to the midframe.

Of course, aluminum alloys have their advantages as well. With the stiff alloys used in smartphones and tablets, there is a significant structural advantage that helps to protect internal components. To go back to the car analogy, because there aren’t crumple zones in a compact phone or tablet, there is only the safety cell. The safety cell is made to be as rigid as possible to prevent crushing the contents within the safety cell. In short, aluminum is actually more durable, not less. As pointed out back in the iPhone 4 review, the external antennas required by all-aluminum designs can give better reception and performance than internal antennas. Even today, it’s possible to see better reception performance from an all-metal device. For example, the Sprint One M8 has a higher effective isotropically radiated power (EIRP) on 1900 MHz CDMA than the Sprint Galaxy S5. Higher EIRP generally translates into better radio reception, although it also takes effective isotropic sensitivity (EIS) to see the full picture.

Because of aluminum’s greater surface hardness, it’s also harder to scratch. However, with anodization treatment, scratches are more likely to be visible if they expose the untreated surface underneath. Another key advantage is the much higher thermal conductivity of aluminum, which allows for better performance in situations where a device is thermally limited. After all, heat sinks are made of aluminum and/or copper, not polycarbonate or glass. The best example of this can be seen in the comparison between the Galaxy S5 and HTC One (M8) in the T-Rex rundown test, as the frame rate of the One (M8) is significantly higher than the Galaxy S5’s, as seen in the graph below.

Like every other material, aluminum is not a perfect choice for a mobile device. As a result of making the device from metal, it’s impossible to use internal antennas unless plastic/glass “windows” are used to allow signal in and out of the phone. This means that the device will be less isotropic (direction-independent) in its reception of radio signals. Even with external antennas that turn parts of the metal casing into an antenna, the detuning that occurs when a hand touches the antenna or bridges it to another conductive body is a major problem, as is the need to support multiple frequencies with an external body that isn’t necessarily able to change; the iPhone 5s and HTC One (M8) can’t look radically different from operator to operator. While the use of multiple antennas (receive/transmit diversity) and active antenna tuners have made all-metal designs possible, there is still a noticeable difference in radio reception. Whether this difference is for better or worse depends upon the frequency used.

Outside of radio reception, aluminum alloys’ lower limit of elastic deformation means that while the casing is better at protecting internal components, it’s more likely to receive cosmetic damage. On the other hand, polycarbonate is more likely to come out of a drop without dents or gouges. Aluminum bodies are also significantly more expensive, as the time and cost associated with working the material into a final product means that the difference in price can be as great as an order of magnitude. This added cost can take away from the budget for other aspects of the device. Finally, while aluminum is far more effective at dissipating heat than polycarbonate, this also means that a polycarbonate device will have lower perceived skin temperatures under load. In other words, it’s more comfortable to hold a polycarbonate-bodied device even if the internals are at higher temperatures. It also means that low ambient temperatures will cause an aluminum-bodied device to feel much colder than a polycarbonate-bodied one.

Of course, magnesium changes things up as well. It’s lighter than aluminum due to its lower density, more RF transmissive than aluminum, and in general, carries many of the advantages that aluminum does over plastics such as polycarbonate and aluminosilicate glass, which include high thermal conductivity, relatively high rigidity, and relatively better scratch resistance. In theory then, magnesium would be better than aluminum.

Unfortunately, from a mass production standpoint magnesium casings are generally impractical, although not impossible. This is primarily due to the reactive nature of magnesium in oxygenated environments, and due to outgassing that occurs during the baking process. Without surface treatment, magnesium also corrodes rapidly. This means that it’s not currently feasible to use magnesium as an external casing, although many manufacturers use it for the midframe.

Glass

Not to be forgotten, glass is another possibility for the external casing of a tablet or smartphone. It is the most rigid of all three materials and resists scratches the best. However, it is also the most brittle and susceptible to shattering. This is because glass can only deform elastically. Aluminosilicate glass, more commonly referred to as Gorilla Glass (when made by Corning) is the most common type of glass used for the external casing of a phone. It is between aluminum and polycarbonate when it comes to thermal conductivity. However, it is only slightly more conductive than polycarbonate, and far less conductive than aluminum. It also doesn’t significantly attenuate radio signals, which means that internal antennas can be used. Of course, the disadvantage is that glass is incredibly fragile, and can pose a major safety hazard. The shape of the phone is also significantly constrained. This is why glass-bodied devices have generally been small and the glass portion of the device is generally a flat sheet.

As mentioned previously, there are plenty of complications as well once you factor in the actual layout of the device. Thermal dissipation of a polycarbonate-bodied device can be improved by using a magnesium midframe that dissipates heat into the display and other components. This increases the rate of heat transfer from internal components to the air/hand. Wall thicknesses and different types of plastic, metal, and glass can significantly decrease the severity of issues associated with the disadvantages of various materials. For example, adding ABS plastic to polycarbonate can significantly increase the rigidity of the material. Applying anti-shatter film to glass can catch shards in case the glass shatters to reduce the hazard involved in shattered glass. New antenna tuning technologies can enable all-metal devices.

Conclusion

Of course, the question still remains: why does all of this matter? After all, Apple didn’t have to worry about thermal dissipation with the iPhone 4 because the SoC didn’t generate enough heat, but it used a steel side ring and glass back cover. While the glass back cover and stainless steel ring were more effective at protecting the internal components, minor improvements to drop protection and possible improvements to reception wouldn’t be strong justifications for pursuing such a design. So why would Apple do this?

The answer lies in industrial design. While it’s all too easy to conflate this with pure aesthetics, industrial design is a crucial aspect of any device. After all, smartphones and tablets are touched all the time, and while we look primarily at the display, the shape, look, and feel all dramatically affect the experience. If it fits the hand better, feels better, and looks better, it is better. Unnecessary elements hurt the focus of the design; honest design helps it. Good design is obvious and invisible. It’s only when we use something poorly designed that we appreciate what is well-designed. Advances in technology can and do fix the issues that materials have, but nothing can fix bad design. While most of these qualities are subjective, as the mobile industry reaches saturation, both industrial design and material design will become crucial differentiators. If anything, they already have.

Running an NVIDIA GTX 780 Ti Over Thunderbolt 2


A common issue for laptop users is the lack of GPU power. Even the fastest mobile GPUs in SLI or CrossFire cannot reach the echelons of performance of a higher-end desktop, mainly due to power consumption and heat generation.  Not only that, but in order to cope with heat generation, laptops with high-end mobile GPUs tend to be far from portable. Sure, they are still easy to carry around compared to a full-size desktop system, but not many are willing to carry one around on a daily basis. In other words, if you want a laptop that’s relatively portable, you are left with mediocre GPU performance that usually won’t satisfy an active gamer.

Ever since the original Thunderbolt was released back in 2011, there has been a lot of discussion about the potential of using Thunderbolt for external GPUs. Today’s mobile CPUs are more than capable of driving desktop GPUs, and as Thunderbolt is essentially just PCIe and DisplayPort in a single interface, a laptop with an external GPU makes almost too much sense.

SilverStone’s/ASUS’ Thunderbolt eGPU enclosure at CES

So far a handful of companies, such as MSI and SilverStone, have showcased their external Thunderbolt GPU enclosures at trade shows, but due to issues such as performance and hot-plug behavior, none has made it to retail. Intel’s decision to double the bandwidth with Thunderbolt 2 negated the launch of the original Thunderbolt-based designs, although with any luck TB2 should be an appropriate drop-in. Especially with GPUs, bandwidth can make a dramatic difference in performance, and given the niche of external Thunderbolt GPUs, many users wouldn’t have been satisfied with a product that didn’t provide at least near the maximum performance.

Another big issue is obviously driver and operating system support. To make matters worse, nearly all Thunderbolt-equipped devices are Macs, and traditionally Apple likes to keep very tight control over drivers and other elements of the OS, making it hard (or even impossible) to develop an external GPU that would also function under OS X. In the PC arena, a few motherboards and other products offer Thunderbolt support, and it is primarily up to Intel, working with Microsoft, to develop Windows drivers.

DIY to the Rescue!

Disclaimer: All information and results here are based on a forum post and a (now private) YouTube video. We cannot guarantee that the results are accurate, thus any and all purchase decisions must be made at your own risk, with the possibility that the results may or may not be on par with what is reported below.

As no company has yet commercialized such a product, enthusiasts have been looking for a do-it-yourself method to drive an external GPU over Thunderbolt. I came across a very interesting setup over at the Tech Inferno forums today and thought I would share it with a larger readership here. Forum member squinks has managed to run an NVIDIA GTX 780 Ti over Thunderbolt 2 using Sonnet’s Echo Express III-D chassis with Corsair’s RM450 power supply dedicated to the GPU.

Courtesy of Tech Inferno forum user squinks

Update: The video of the setup in action originally went private right before our post but it has now been made public again and can be seen here.

The results are certainly auspicious. Based on squinks’ own tests and GTX 780 Ti reviews posted online, the performance seems to be around 80-90% of full desktop performance in synthetic benchmarks (3DMark and Unigine Heaven). Given that Thunderbolt 2 offers only 20Gbit/s of bandwidth while a PCIe 3.0 x16 slot offers 128Gbit/s, getting 80-90% of the performance is a lot more than expected. This will vary depending on the game: based on our own PCIe scaling tests, the PCIe bandwidth may make little to no difference in some games, while in others the drop can be close to 50%. The more bandwidth a game demands of the PCIe connection, the worse the performance.  Either way, Thunderbolt 2 shows potential for external GPUs, even the most powerful ones that should also require the most bandwidth.
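A quick back-of-the-envelope comparison of the two nominal link bandwidths quoted above shows just how surprising that result is:

```python
# Nominal link bandwidths quoted above, in Gbit/s.
thunderbolt2_gbps = 20
pcie3_x16_gbps = 128     # 16 lanes of PCIe 3.0

fraction = thunderbolt2_gbps / pcie3_x16_gbps
print(f"Thunderbolt 2 provides {fraction:.1%} of a PCIe 3.0 x16 link")
```

Despite having under a sixth of the bandwidth of a full-length slot, the setup reportedly retains 80-90% of desktop performance in synthetic tests, which underlines how lightly many GPU workloads actually use the PCIe bus.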

According to the forum posts, the setup is also pretty much plug and play, but only as long as the GPU is connected to an external monitor. Once everything has been connected and drivers installed, the GTX 780 Ti will be recognized as a GPU just like it would be in any desktop system. Getting the external GPU to drive the internal display is also possible, although there appear to be some limitations with this homebrew method. First off, it only works if the computer doesn’t already have a discrete GPU, as NVIDIA Optimus can then be used to enable GPU switching. If there is already a discrete GPU in place (like the GT 750M in the high-end 2013 rMBP), then Optimus cannot be used and unfortunately you’ll be limited to an external monitor. Secondly, there seems to be some loss in performance (~5-20% on top of the original loss from Thunderbolt 2) when driving the internal display, which is likely due to Optimus and its limitations.

The big question is whether such a setup is reasonably affordable. Currently, the short answer is no. The Sonnet Echo Express III-D chassis alone costs $979, and you’ll need to add the cost of the GPU and power supply to that. The chassis also weighs 7.5lb (3.4kg) without the GPU or the power supply, hardly making it all that portable.  In total this means ~$1500 minimum if you are going with a higher-end GPU (which you should, given the cost of the chassis). For comparison’s sake, I quickly gathered parts for a decent gaming rig on Newegg and the total came to $764.94 (without GPU and PSU). That’s with a Core i7-4770K, an ASUS Z87 motherboard, 8GB of DDR3-1600, a 120GB SSD, a 1TB hard drive and a mid-price case, so we are not even dealing with a budget system. In other words, you can build a higher performance system for over $200 less and take full advantage of your GPU.

Update 2: As some of you mentioned in the comments, there are cheaper alternatives available that provide about 70-90% of the desktop performance. What you need is a Thunderbolt to ExpressCard adapter (like Sonnet’s Echo ExpressCard Pro) and an ExpressCard to PCIe adapter (like the BPlus PE4L V2.1), which together come in at $240 when bought straight from the manufacturers’ online stores. Add a cheap ~400W power supply to that and the total is less than $300 (without the GPU, of course). If you are interested in external Thunderbolt GPUs, I recommend that you take a good look at Tech Inferno forums as they have several guides and other resources from troubleshooting to benchmarks.

All in all, it is fun to see that an external GPU connected via Thunderbolt 2 can actually get up and running. The price and DIY nature are currently factors that won’t exactly allure the masses, but there is a potential market for a retail product designed specifically for this. Pricing is, of course, a major factor: at $200-$300 I could see external GPUs gaining popularity, but once you go over $500 it becomes, in most cases, more viable to build a dedicated gaming rig.