Bluetooth bugs are making contact tracing apps spit out tons of errors

At the heart of all Bluetooth-based contact tracing apps is one fundamental issue: whether they actually help in the struggle to slow the spread of Covid-19. Their effectiveness can be judged in two ways: epidemiologically and technically. But, months into the pandemic, there are still no clear answers.

“They’re quite unreliable,” says Douglas Leith, the chair of computer systems at Trinity College Dublin, who has been analysing how well devices can use Bluetooth to measure distance. Leith says the ability of apps to accurately detect other devices around them and calculate their distance can vary widely. Bluetooth wasn’t designed for contact tracing, but in the pandemic era it’s being held up as a potential solution to the crisis.

Contact tracing apps using Bluetooth Low Energy – which is the majority of apps around the world, including those using the Apple and Google protocols – detect other devices around them using Received Signal Strength Indication (RSSI). In short, this is the signal strength: the loss in signal strength between the transmitter in one phone and the receiver in another is used to estimate the distance between the two handsets.
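As a rough illustration of how that conversion works, the textbook approach is the log-distance path-loss model. This is a minimal sketch, not any specific app’s implementation, and both constants are assumptions that vary per handset and environment:

```python
def estimate_distance(rssi_dbm: float,
                      rssi_at_1m_dbm: float = -60.0,
                      path_loss_exponent: float = 2.0) -> float:
    """Rough distance estimate (in metres) from a BLE RSSI reading.

    Standard log-distance path-loss model. rssi_at_1m_dbm varies per
    handset model, and path_loss_exponent varies per environment (~2 in
    free space, higher indoors) -- which is exactly why these estimates
    are so unreliable in practice.
    """
    return 10 ** ((rssi_at_1m_dbm - rssi_dbm) / (10 * path_loss_exponent))

# The same -75 dBm reading maps to very different distances depending
# on the assumed environment:
print(estimate_distance(-75.0, path_loss_exponent=2.0))  # ~5.6 m
print(estimate_distance(-75.0, path_loss_exponent=3.0))  # ~3.2 m
```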

The distance between phones is calculated every two to five minutes – the interval varies between countries – and is used to work out whether someone’s proximity to another person increases the chances of Covid-19 transmission. Apps are based on the understanding that the virus is more likely to pass from one person to another if they’re within two metres of that person for more than 15 minutes. (Although, as the variety of social distancing rules around the world shows, the science behind the distances at which transmission risk changes is far from settled.)
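In code terms, that heuristic reduces to something like the sketch below. It is a deliberately simplified illustration of the two-metre, 15-minute rule rather than any real app’s logic; real systems such as Apple and Google’s work on weighted Bluetooth attenuation buckets rather than raw distances:

```python
from dataclasses import dataclass

@dataclass
class ContactSample:
    timestamp_s: float   # when the sample was taken
    distance_m: float    # estimated from RSSI, as above

def is_risky_exposure(samples: list[ContactSample],
                      max_distance_m: float = 2.0,
                      min_duration_s: float = 15 * 60) -> bool:
    """Illustrative check against the 2-metre / 15-minute heuristic.

    Apps only sample every few minutes, so total close-contact time is
    approximated by summing the gaps between consecutive close samples.
    """
    close_time = 0.0
    for prev, cur in zip(samples, samples[1:]):
        if prev.distance_m <= max_distance_m and cur.distance_m <= max_distance_m:
            close_time += cur.timestamp_s - prev.timestamp_s
    return close_time >= min_duration_s
```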

“There are at least four things that can have a big effect on the performance,” Leith says. None of these come as a surprise to engineers, he adds, as Bluetooth works on the same frequency band as microwave ovens and the frequencies are well studied. The type of phone, including the antennas, chipsets and the way they’re combined, can all affect Bluetooth performance. How the app was tested before being released to the public can also have an impact, as can the materials of the environment where the app is being used – think thick concrete walls versus thin glass windows. All of these alter the registered RSSI.

Phone manufacturers have been trying to iron out some of the differences in how Bluetooth performs on their devices. Google is hard at work too and has created a method that lets device makers calibrate Bluetooth settings more easily. This involves sticking phones on a tripod, rotating them and collecting lots of data.

A spreadsheet listing more than 12,000 models of phone – including manufacturers from all the big names right down to Coolpad and Tecno – shows how much variance there is. RSSI corrections range from 19 to -29, and transmit power values span a range of up to 45 points.
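That spreadsheet exists so that readings taken on wildly different hardware can be normalised before any distance maths happens. A hypothetical sketch of the idea follows; the model names and offsets below are invented for illustration, not values from the real calibration data:

```python
# Hypothetical per-model calibration table (made-up values).
CALIBRATION = {
    "phone_model_a": {"rssi_correction": 5, "tx_power_dbm": -25},
    "phone_model_b": {"rssi_correction": -12, "tx_power_dbm": -15},
}

def normalise_rssi(raw_rssi_dbm: float, receiver_model: str) -> float:
    """Apply the receiver's RSSI correction so readings from different
    handsets become roughly comparable before estimating distance."""
    cal = CALIBRATION.get(receiver_model, {"rssi_correction": 0})
    return raw_rssi_dbm + cal["rssi_correction"]
```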

If that weren’t complicated enough, one of the most influential factors on signal strength is simply where a device is placed. “You see fluctuations due to small changes in the relative orientation of the phone,” Leith says. If your phone is in your back pocket, the signal transmitted to someone sitting or standing in front of you will be very different from the signal when it’s in your front pocket.

An experiment conducted on a tram by Leith and colleague Stephen Farrell found that signal strength was not an effective proxy for distance in that scenario. They used the detection rules of the Italian, Swiss and German apps for their experiment, and found that similar ranges of signal strength were observed both between handsets less than two metres apart and handsets more than two metres apart – all the way up to five metres. Such differences could have a big impact on Covid-19 transmission – but Bluetooth isn’t sensitive enough to spot them.

In the tram scenario, the lack of accuracy is caused by the metal walls, floor and ceiling, all of which reflect radio signals. “There are some places where we’re fairly sure it just won’t work. And that’s in environments where there’s a lot of metal,” Leith adds.

Bluetooth’s inability to precisely measure how far apart people are is an open secret. And incorrect measurements can result in false notifications being sent. Telling people to self-isolate when there’s little risk of them having coronavirus can be economically damaging and erode trust in the overall track and trace system. On the flip side, not identifying people who have a greater risk of contracting the virus means it could spread more easily.

In August, ahead of the launch of the NHS Covid-19 app in England and Wales, the National Cyber Security Centre (NCSC) discussed “the Bluetooth problem”. It said the issue of false positives – phones thinking their owners had been close to others when they hadn’t been within two metres – was something that was being worked on.

“If you were between 2m and 4m from an infected person, there’s currently a false positive rate of about 45 per cent,” the NCSC’s technical director Ian Levy wrote in a blog post. This means some people are sent notifications of potential infection despite being more than two metres away from someone who has tested positive for Covid-19.

“By way of a simple illustration, during the recent Leicester outbreak, the app would have generated ~50 false positives a day in a population of 330,000,” Mark Briers, from the Alan Turing Institute, wrote in a blog post in August about the app’s modelling. That modelling has not yet been published.

Researchers are scrambling to improve the false positive rate. A spokesperson for the Department of Health and Social Care (DHSC), the government department responsible for the app, says that since the NCSC analysis was published the false positive rate for people between two and four metres apart has been reduced by around 25 per cent. The proportion of people the app judges to be at high risk of transmission who are actually low-risk has fallen from 4.5 in ten to three in ten, the spokesperson says.

They add that notifications from the app are only triggered once someone has been confirmed to have tested positive. “We are very clear, everyone who is contacted will have been in close contact with someone who has a confirmed case of coronavirus,” the spokesperson explains.

Key to the operation of the NHS Covid-19 app is a risk-scoring algorithm that combines Bluetooth distance and time data with when the person who tested positive developed symptoms. The DHSC and NHS can change the levels of risk that trigger notifications as more scientific advice becomes available. Briers at the Turing Institute and the NCSC are working to further refine the algorithm and detection methods used.
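The article doesn’t spell out the algorithm itself, but the shape of such a risk score is easy to sketch: closer contact, longer exposure and proximity to symptom onset should all push the score up, and the notification threshold is a knob the health authorities can retune. Every weight and threshold below is invented for illustration:

```python
def risk_score(distance_m: float, duration_min: float,
               days_from_symptom_onset: int) -> float:
    """Toy risk score: closer, longer, and nearer to symptom onset all
    raise the score. Weights are illustrative inventions, not the real
    NHS Covid-19 app algorithm."""
    proximity = max(0.0, (4.0 - distance_m) / 4.0)                 # 0 beyond 4 m
    infectiousness = max(0.0, 1.0 - abs(days_from_symptom_onset) / 10)
    return proximity * duration_min * infectiousness

NOTIFY_THRESHOLD = 15.0  # the kind of level that can be tuned over time

if risk_score(distance_m=1.5, duration_min=30,
              days_from_symptom_onset=2) >= NOTIFY_THRESHOLD:
    print("send exposure notification")
```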

“The only way that you could minimise or decrease false positives is to decrease the sensitivity,” says Gary Hatke, an engineer at MIT and part of the PACT team working on contact tracing apps and the underlying technology. “Or you’d have to get Apple and Google to change the way the API is working.” Google says it has worked to make Android battery impact as low as possible and, along with Apple, is continually improving the notification system.

Other Bluetooth devices could tackle distance measurement better than phones. Singapore, which was first to develop a contact tracing app for Covid-19, is now trialling a wearable device. “It is possible that wearable devices could, in general, perform better than smartphones in helping to estimate likely exposure events,” says Ken Kolderup, the chief marketing officer at Bluetooth SIG, the group that develops the standard. “In many cases, wearables such as wristbands are worn on the body in locations that will result in less interference, and thereby lead to more accurate estimations of the true distance between people.”

But, for the time being, Bluetooth contact tracing will be done by millions of phones, not a smattering of wearables. Hatke says the MIT team has shown that Bluetooth reliability can be improved on devices: apps can take more frequent RSSI measurements to help determine how far away other devices are, although this could require more battery power to be effective.
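A minimal sketch of that first idea: take a burst of readings and use a robust statistic, here a median, to damp the swings caused by reflections and momentary blocking. The choice of a median and the sample count are our assumptions, not MIT’s published method:

```python
from statistics import median

def smoothed_rssi(burst_dbm: list[float]) -> float:
    """Median of several RSSI readings taken in quick succession.

    A median damps the large swings caused by reflections and body
    blocking far better than a single reading can, at the cost of
    waking the radio more often (hence the battery trade-off above).
    """
    return median(burst_dbm)

print(smoothed_rssi([-88.0, -71.0, -74.0, -95.0, -73.0]))  # -74.0
```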

The other way, Hatke says, is “to do some self awareness on the phone. Every time I take a measurement, can I tell whether or not I’m in a pocket or not”. This could be through proximity sensors that can tell whether a phone’s surface is covered or near a body, or light sensors. “You could encode that information in the signal that is being transmitted,” Hatke says.
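A hypothetical sketch of that second idea: fold a one-byte context flag into the advertised payload so that receivers can discount or correct readings that came from a pocketed phone. The payload layout here is entirely made up; it is not the Apple/Google beacon format:

```python
IN_POCKET = 0x01  # hypothetical flag bit

def build_payload(rolling_id: bytes, proximity_covered: bool) -> bytes:
    """Append a context byte to the rolling identifier so receivers
    know whether the sender believed it was in a pocket or bag."""
    flags = IN_POCKET if proximity_covered else 0x00
    return rolling_id + bytes([flags])
```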

But Hatke and his colleague Marc Zissman still question what, exactly, contact tracing apps are being developed to do – and this ambiguity makes them harder to optimise. Are they meant to help find people who are unknown to those who have been infected (for instance, people on a train), or to notify people to self-isolate faster than human contact tracing efforts can reach them?

Seven months into the pandemic, there’s still no obvious answer. Distance and time are only two indicators of transmission risk. Airflow, whether people are indoors or outdoors, and the volume at which people are speaking are some of the other variables that affect Covid-19’s ability to spread. But none of these can be measured easily by an app that collects very little data.

“We really need the data from the jurisdictions that are running this,” Zissman says. Without it, researchers have little to go on when working out how contact tracing apps can be made most effective. “There is no published data where we can take the country or state and tell you what these values are.”

Intel quietly introduces the Core i3-10100F processor

Intel has quietly rolled out the 10th generation Intel Core i3-10100F processor. HEXUS regulars will already be well aware, from the nomenclature, that this is one of Intel’s processors with a disabled on-board GPU (‘F’ suffix). However, the best feature of the new Intel Core i3-10100F can’t be decoded from the name: it boasts Hyper-Threading, as well as a slightly faster boost clock and double the RAM capacity support, compared to its predecessor (the Core i3-9100F), at the same price.

The Intel Core i3-10100F processor is part of the Comet Lake-S line of desktop processors built by Intel on a 14nm process. It packs 4C/8T with a base/turbo boost clock of 3.6/4.3GHz, with 6MB of Intel Smart Cache and a 65W TDP. The Core i3-10100F supports up to 128GB of DDR4-2666 RAM in dual channel mode, for a max bandwidth of 41.6GB/s. It isn’t a straight upgrade from the Intel Core i3-9100F as the socket is of course LGA1200 now, rather than LGA1151.

If you want to do an A/B comparison with the Intel Core i3-10100(F) vs the Core i3-9100F processor, you can head on over to this Intel Ark link, which lists specs side by side. However, I think I’ve covered all the key differences above and a truncated version of the table can be seen below.

Such an entry-level processor, which lacks on-board graphics, is obviously meant to be paired with an appropriate dGPU. You save between US$25 and $45 by choosing this iGPU-less part, so you can put that towards an add-in card. Most buyers are likely to want to spend under $200 on a partner GPU for PC gaming, so may have the likes of the GTX 1650 or RX 5500 on their shortlists. Next year one might be able to find a suitable Ampere/Navi2 graphics card to pair with this Comet Lake-S processor to create a balanced system.

The cheapest 4K monitor right now could replace all your displays

Walmart is currently offering up the most affordable 4K monitor (one that doubles as a TV) on the market – even cheaper than bargain basement Amazon Prime Day displays.

At just $158, the Sceptre U425CV-U even undercuts most refurbished displays we’ve seen to date, and this one comes with a full one-year warranty. It costs around $14 a year to run and uses a 42.5-inch LED panel, supported by two feet and surrounded by an extremely thin bezel.

Uniquely among ultra-cheap televisions and monitors, it sports four (yes, four) HDMI inputs, meaning you could connect up to four computers at once, should you fancy such an elaborate setup.

An absurdly cheap deal

There’s also a USB port to connect an external storage device and power smart devices such as an Amazon Fire Stick or Chromecast, plus there’s a VESA mount should you wish to mount it to your wall. 

Its rated contrast ratio of 3000:1, 250cd/m^2 brightness and 60Hz refresh rate won’t set any world records, but should be good enough for most use cases. And it’s also worth noting there are two 10W speakers with ARC support and an SPDIF port.

It doesn’t do split screen – which is unsurprising at this price point – but if you use Windows 10, Snap Assist will achieve almost the same thing.

Spreadsheet users, professionals and multitaskers will love the massive real estate, which is equivalent to four 21.5-inch full HD monitors.

Samsung Galaxy Z Fold 2 5G review

Samsung hopes to change everyone’s mind with the Galaxy Z Fold 2 5G, the successor to its seminal foldable. This time, Samsung’s ditched much of what made the first Fold meh, like the dinky front display and the dodgy side-notches, and smartened up the design.

Good start, yes. But this is a £1,799 phone, so saying it has to be good is an understatement. To get a recommendation for anyone without a huge amount of disposable income, the Samsung Galaxy Z Fold 2 has to be near-perfect. So, is it?

DESIGN AND SCREEN: FOLD EVERYTHING

With a 6.23-inch cover display at 816 x 2260 resolution, the Fold 2 is an instant improvement. The front screen is tall, with a 25:9 aspect ratio, so it’s almost as novel as the inner screen. While it doesn’t have a nippy refresh rate, it is punchy, packing Super AMOLED tech.

Clasp the top half of the Fold 2 5G with one hand, the bottom half with another, ease the phone open, and as you engage the smooth hinge mechanism, the magic happens – a big, bright, bold, notch-free tab gloriously presents itself.

The updated inner screen is slightly recessed for protection, and clocks in at a high-impact 7.6 inches with an almost square shape. Packing a resolution of 1768 x 2208 (373 pixels in every inch), it’s sharp too, and with HDR10+ support and a 120Hz refresh rate, every swipe and tap is seriously special.

Yes – there’s a crease. Unlike the Motorola RAZR 5G, Samsung’s flexible display doesn’t curl inwards when folded, so you see a visible line down the middle when viewed off-angle. You get used to it, but it’s definitely worth noting. Another point: the inner screen is plastic, flexible, and susceptible to damage when knocked – a given with all foldables. As a package, though, it feels more secure than its predecessor, the Huawei Mate Xs and the Motorola RAZR 5G.

Unfolded, the Z Fold 2 is a svelte 6.9 mm thin. Folded, that number more than doubles, making it one of the chunkiest phones around at 16.8 mm. Skinny jeans are out of the question, but that doesn’t mean the Fold 2 doesn’t look stunning – clever distraction technique Samsung.

With a similar matte glass finish to that of the Samsung Galaxy Note 20 Ultra 5G, it keeps fingerprints at bay, catches the light in a demure, diffuse way, and proudly shows off the buffed hinge, which sports a subtle Samsung insignia etched into its metal spine. The fingerprint scanner is in the power button, there’s a USB-C port at the base, and the top and bottom of the phone are bookended by stereo speakers.

The design highlight has got to be just how well the hinge mechanism locks in place throughout its range of motion. Set it half-open, Nokia Communicator style (remember those?), and it props itself up like a mini laptop – oh so satisfying.

CAMERAS: FOLD FAITHFUL

You won’t find the 108MP resolution the S20 Ultra packs here. Instead, you get three traditional 12MP cameras. The primary camera has an f/1.8 aperture and the biggest sensor of the bunch. Alongside it, there’s an ultrawide camera and a 2x zoom equivalent telephoto camera. The primary and telephoto cameras sport OIS and phase detection autofocus.

Samsung’s shooting modes are comprehensive, with AI photography detecting the scene you’re in and optimising snaps accordingly. There’s a handy Pro mode too, as well as the Single Take mode introduced on the S20 series, which makes a suite of almost perfect content to distribute across your TikGrams and InstaToks.

Photos taken on the Fold 2 5G have Samsung’s trademark punch to them, clearly sharpened with contrast boosted, but they don’t look bad. In low light, your camera time will be helped along by a decent night mode that zaps grain, though it falls way behind that of the Huawei P40 Pro and Google Pixel 4.

The Fold 2 5G shoots up to 4K resolution video at 60fps, and supports Pro Video capture, so you can get manual with your moving images too. The most exciting thing about the camera experience is using the gorgeous tablet screen as a viewfinder – maximum novelty-factor. Using the Z Fold 2 half-folded as its own ‘tripod’ is also fun and handy.

Not content with one selfie camera, the Galaxy Z Fold 2 5G packs in two, one on the front and another inside. They both shoot 4K resolution video at 60fps, making them exceptional on paper, and very good in the flesh.

SOFTWARE: NOT QUITE FLEXIBLE ENOUGH

Samsung’s had to get heavy-handed with its customisations to make Android 10 play nicely with a folding form-factor, and there are both wins and woes here.

Many apps seamlessly transition from the external screen to a tablet view when the Fold 2 5G is opened up, but not all. Those that don’t need to be force-closed and reopened to display in the right view.

You can multitask across up to four apps simultaneously on the tablet – a bit much, but still cool. That said, when you close the Fold 2 and open it again, you’ve lost your carefully orchestrated layout. Again, annoying.

Then there’s the core UI. When in the camera app, if you half-open the Fold, the interface reorients to occupy the two halves of the screen beautifully. Go back to the home screen, however, and you’re forced into portrait mode.

The orientation can be changed in the settings but, even when corrected for, the home screen doesn’t divide neatly across the display as the camera app does, so you lose elements in the crease. This lack of TLC given to using the Fold in its ‘laptop’ orientation is a massively missed trick from an ‘ooooh’ and ‘ahhhh’ point of view.

Thankfully, Google Play Store access means lots of apps, Samsung’s UI optimisation coupled with the 120Hz screen means smooth interaction, and the fact the Galaxy Z Fold 2 doesn’t sport an Exynos processor means decent heat management.

PERFORMANCE AND BATTERY LIFE: 22-CARAT FOLD

What you do get from the Galaxy Z Fold 2 5G, however, is beautifully nippy performance. Powered by a Qualcomm Snapdragon 865+ paired with 12GB RAM, it’s one of the most powerful Android phones money can buy right now. You’re paying for it with that £1,799 price tag, but the internals make the Fold 2 5G much easier to stomach than the midrange-powered Motorola RAZR 5G (£1,399).

Gaming is a treat on the Fold, with the stereo speakers booming, both displays shining brightly and the power under the hood able to keep framerates up and lag down. After an hour of playing Genshin Impact, the phone warmed up, though not uncomfortably so. Unlike most Samsung phones, there’s no microSD card slot here, though the 256GB starting capacity should suffice for most.

Finally, the 4500mAh battery lasts a full day very easily. You’ll get great mileage from it if you mainly use the front screen, which is entirely doable now it isn’t awful. Sorry, original Fold. Open it up a bit more often and by 8pm 10-15 per cent should be left in the tank if you’re anything like us: WhatsApping, watching, and gaming a bit too.

Another highlight – the Fold sports 25W fast charging and 11W fast wireless charging (plus 4.5W reverse wireless charging).

SAMSUNG GALAXY Z FOLD 2 5G VERDICT

The Samsung Galaxy Z Fold 2 5G is a glimmer of hope that foldable phones can be excellent. It isn’t the best camera phone around, nor does it charge the fastest – but who cares? It folds and doesn’t do anything badly.

Everything the Z Fold 2 does, it does at least well. Its reassuring (glorious-looking) design delights every time you look at it, and the masterful hinge makes the two-in-one functionality we’ve been craving a reality. A handsome front display only adds to the utility and enjoyment the phone delivers.

At £1,799 for the 256GB storage version, it doesn’t matter how good it is; the Galaxy Z Fold 2 won’t appeal to most people. Nevertheless, the annals of technology history will remember the Samsung Galaxy Z Fold 2 as the first foldable that was great – all-fold, no caveats.

A Razer Edition motherboard is the cherry on top of your RGB battlestation

The final frontier for Razer in the PC space is to offer actual components, having already tackled keyboards, mice, headsets, controllers, and a range of other products (even toasters). Taking a step in that direction, Razer intends to sell a line of AM4 motherboards for AMD’s Ryzen processors.

This was one of the many announcements at its inaugural RazerCon event over the weekend. It is also part of a bigger play to RGB all the things, as it has been doing. And it ties in with its Chroma Connect program consisting of over 50 third-party hardware partners, including new additions WD Black, Seagate Gaming, Yeelight, and Twinkly.

Razer’s goal is to expand its Chroma RGB lighting ecosystem far beyond its own peripherals. And so “to help natively control the growing list of connectable devices,” Razer announced it is partnering with ASRock to bring the first Razer Edition motherboards to market.

These will basically be custom versions of ASRock’s Taichi boards. Razer plans to launch multiple models, all based on AMD’s X570 and B550 chipsets, at least initially. The company did not say if it intends to offer Intel compatible motherboards at some point.

“The boards are the first of their kind, offering native Razer Chroma RGB support and universal compatibility with thousands of addressable RGB (ARGB) components. DIY PC builders with multiple ARGB components can now easily sync every component with their Razer peripherals with just the Razer Synapse 3 app,” Razer says.

Razer’s timing comes on the heels of AMD announcing its Ryzen 5000 series CPUs (Zen 3), which are set to hit retail next month. In anticipation, motherboard makers have already started pushing out BIOS updates to ensure compatibility with their 500-series boards.

In any event, if you’re a die hard Razer fan, you could stick this into a Razer case, kick back on your Razer Iskur gaming chair, and tell everyone about your setup by speaking into your Razer Seiren Mini microphone that is plopped next to your Razer Sneki Snek plushie.

Now that’s build quality: Microsoft Surface Book takes a bullet for its owner, saving a life

Twitter user @itsExtreme_ (Aaron) reported that a Microsoft Surface Book saved his life by taking a stray bullet a tenant shot from the floor below. Aaron tweeted an image of his Surface Book after the incident. The convertible laptop was definitely worse for wear, with the bullet having penetrated the display and the touchpad.

Unsurprisingly, the device no longer works. However, @itsExtreme_ has reached out to Microsoft support about the possibility of getting his laptop serviced or replaced. Considering the attention the original tweet received on Twitter, it will be interesting to see how Microsoft responds. 

The Microsoft Surface Book (and indeed, the rest of the Surface lineup) has received considerable praise for its high build quality, with a tight hinge mechanism and a magnesium alloy chassis. While the convertible laptop was certainly not designed to stop bullets, Microsoft’s impeccable build quality is clearly on display. We’re not sure if bullet resistance will be enough to sway MacBook Pro fans, though. 

Honor MagicBook 14 with Ryzen 5 4500U review

Introduction

Huawei and its subsidiary Honor have been expanding their presence in the laptop world for years now but have really hit their stride in the last 12 months. We’ve now covered the ultra premium Huawei MateBook X Pro 2020, the ultra powerful Huawei MateBook 14 2020 AMD, the affordable 16.1-inch Honor MagicBook Pro, and its more portable sibling – the Honor MagicBook 14 with the 12nm quad-core Ryzen 5 3500U that was initially available.

Now we have our hands on the upgraded version of the last one with the new Ryzen 5 4500U processor, based on AMD’s newer Zen 2 architecture and built on the more efficient 7nm process. It’s a six-core chip that runs cooler and is friendlier to the battery.

Beyond the CPU change, the new MagicBook 14 is largely identical to its predecessor, so you might want to read the review of that one. Here we’ll mostly focus on what’s new.

Display and keyboard

As its name suggests, the MagicBook 14 has a 14-inch display. It’s an IPS LCD with a matte finish and a classic 1920x1080px resolution. Honor advertises a brightness of 250 nits and 800:1 contrast.

In our review of the first MagicBook 14 we measured a screen brightness of 277 nits, quite respectable. The model we have here is brighter still at 294 nits, better than Honor’s claims. The panel has slim bezels, which help it achieve an 84% screen-to-body ratio.

The screen is perfect for indoor use and we appreciate the reflection-free matte finish. Colors are nice and punchy and brightness is fine for all but the sunniest environments.

The full-sized keyboard is just as nice to type on as on the original MagicBook 14. Keys are big and nicely spaced out and they have good travel, without being mushy. The touchpad is a good size, a bit wider than on most other machines of this size. It has a plastic surface but is smooth and responsive. Multi-touch gestures work flawlessly.

The power button doubles as a fingerprint scanner with support for caching – you press it once and it will boot up Windows and log you in with your fingerprint, with no need to authenticate a second time.

Performance and battery life

This is the most exciting bit. The original MagicBook 14 we tested had a Ryzen 5 3500U processor with 4 physical cores and 8 threads, built on AMD’s 12nm node and running on a 15W TDP. The model we have here is the Ryzen 5 4500U with 6 physical cores and 6 threads, built on AMD’s new 7nm node. The Ryzen 5 4500U has twice the L3 Cache at 8MB and retains the 15W TDP.

In Geekbench 5 the Ryzen 5 4500U CPU scored around 20% higher in single core and 30% higher in multi-core than the Ryzen 5 3500U.

And here’s Cinebench R20.

We tested the 512GB M.2 NVMe SSD by SSSTC and got a good 1,947MB/s sequential read and 1,381MB/s write. That’s solid performance, although you can find faster drives, namely the 512GB Western Digital drive in the 16.1-inch MagicBook Pro, which posted 3,400MB/s and 2,700MB/s, respectively.

We performed a stress test on both the CPU and the Vega GPU to evaluate thermal performance. The CPU, which idled at around 27 degrees Celsius at 1,397MHz, ramped up to 75 degrees and 3,443MHz, before throttling down and settling at 65 degrees and 3,143MHz. We didn’t find a difference in behavior between Normal and Performance mode. The GPU only reached 62 degrees.

During the stress test the single fan worked constantly to keep temperatures down, but we didn’t so much as hear a tiny whistle from it. The MagicBook 14 is a very quiet machine, practically silent under normal load. And what’s even better is that the chassis didn’t feel warm anywhere but at the very top right, just under the screen. That’s excellent thermal performance.

Battery life was outstanding from the 56Wh cell inside the MagicBook 14.

We ran a YouTube video test and a browsing test, both at 100% brightness and 100% volume (around 70dB) in the video test. The MagicBook 14 scored 11 hours and 20 minutes of constant YouTube video playback and 12 hours and 30 minutes of non-stop browsing.

Verdict

At the time of writing, the Honor MagicBook 14 costs €530 for the configuration with a 12nm Ryzen 5 3500U with 8GB of RAM and 256GB of storage. The model we have here with the 7nm Ryzen 5 4500U is €750, but you get double the storage at 512GB. We’d say that’s a reasonable markup for twice the storage, 20%-30% faster CPU performance, not to mention the better efficiency due to the newer manufacturing process.

The Honor MagicBook 14 is a perfect proposition for the current stay-at-home work culture. It can go all day on a single charge, is very lightweight and portable, and is perfect for typing and productivity work. It can even handle a fair amount of photo editing or coding with ease.

You could save some money and go with the cheaper 12nm MagicBook 14, but investing in a 7nm laptop now is the sensible thing as it will be much more future-proof. Another easy recommendation. Great job, Honor!

COLORFUL Launches CVN Guardian and WARHALBERD DDR4 Memory Series

October 12th, 2020, Shenzhen, China – Colorful Technology Company Limited, professional manufacturer of graphics cards, motherboards, all-in-one gaming and multimedia solutions and high-performance storage, launches the CVN Guardian RGB DDR4 gaming memory for the latest AMD platforms and the WARHALBERD DDR4 memory for the 3rd generation AMD Ryzen platform and Intel desktop platforms. The COLORFUL CVN Guardian RGB Gaming Memory sports an eye-catching heat sink with an RGB lightbar that supports motherboard RGB sync. On the other hand, the WARHALBERD is a value-oriented DDR4 memory featuring CXMT memory chips for a wide range of applications, from gaming to office PCs.

COLORFUL CVN Guardian RGB Gaming Memory

The CVN Guardian RGB DDR4 gaming memory is the latest addition to the CVN family. Designed for gamers, the CVN Guardian sports a futuristic high-performance heat spreader with superb heat dissipation to ensure stable gaming performance. COLORFUL uses specially selected Hynix CJR memory chips that offer stable performance and excellent overclocking capabilities. The CVN Guardian Gaming Memory supports the XMP 2.0 automatic overclocking function for a quick performance boost.

The CVN Guardian gaming memory has an addressable RGB lightbar that supports ARGB synchronization via the iGame Center. It also supports motherboard ARGB sync technologies including ASUS Aura Sync, MSI Mystic Light, GIGABYTE RGB Fusion, and ASRock Polychrome Sync. Furthermore, the CVN Guardian’s RGB lighting is complemented with laser engraved heat spreaders with a brushed aluminum finish.

The COLORFUL CVN Guardian gaming memory goes through rigorous testing, including a 48-hour strict quality test, to ensure stability and reliability. COLORFUL covers the CVN Guardian memory with a limited lifetime warranty. It is available in three single-module packs at various speeds – DDR4-2666 8G, DDR4-3200 8G, and DDR4-3200 16G.

Specifications

Model           | DDR4-2666 8G | DDR4-3200 8G | DDR4-3200 16G
Module Capacity | 8GB          | 8GB          | 16GB
Frequency       | DDR4 2666MHz | DDR4 3200MHz | DDR4 3200MHz
Timing Sequence | 19-19-19-43  | 16-18-18-38  | 16-20-20-40
Voltage         | 1.2V         | 1.35V        | 1.25V

COLORFUL WARHALBERD DDR4 Memory

The COLORFUL WARHALBERD Series is a value-oriented DDR4 memory for various applications, from gaming to home PCs. The WARHALBERD memory modules are designed for mainstream Intel and AMD platforms, and are available in DDR4-2666 8G and DDR4-3000 8G single-module packs. The WARHALBERD memory is subjected to the same rigorous testing, including a 48-hour strict quality test, and has passed with excellent compatibility on current-generation and previous-generation motherboards. It also supports the XMP 2.0 automatic overclocking function.

The COLORFUL WARHALBERD memory features CXMT (ChangXin Memory Technologies) DDR4 memory chips.

Pricing and Availability

The COLORFUL CVN Guardian DDR4 gaming memory will be available in selected regions worldwide with an MSRP of $49 for DDR4-2666 8G, $69 for DDR4-3200 8G, and $99 for DDR4-3200 16G.

ASUS ROG Strix XG438Q 43in 120Hz Gaming Monitor Review

The XG438Q is based around a VA panel, with the usual characteristics of this technology. It has very high 4,000:1 contrast, and quite high 450cd/m2 brightness. On the downside, the pixel response is 4ms gray-to-gray, which is par for the course for VA. Brightness and contrast are sufficient to enable HDR-10 support, with the ability to display 90 per cent DCI-P3 gamut to go with it.

Since this is such a big screen, it’s not packed with adjustments. You can tilt it forwards and backwards, but otherwise the position is fixed on the stand, with no swivel or height adjustment. There are three HDMI 2.0 inputs and a single DisplayPort 1.4, so you can plug in multiple games consoles, media streamers and your PC all at the same time. There’s a USB 3.0 hub, but only with two downstream connections.

However, a screen this size, with this resolution and feature set, doesn’t come cheap. Priced over £1,000, the XG438Q is a premium screen for a premium gaming experience. Let’s find out if it delivers.

Specification:

Screen size: 43-inch, 16:9 aspect

Native resolution: 3,840 x 2,160

Curvature: None

Refresh rate: 120Hz, AMD FreeSync 2

Panel type: VA

Contrast ratio: 4,000:1 (typical)

Brightness: 450cd/m2

Response time: 4ms Gray-to-Gray

Display inputs: 3 x HDMI 2.0, 1 x DisplayPort 1.4

USB hub: Yes, 2 x USB 3.0

Tilt: 5 degrees forward, 10 degrees backward

Raise: No

Swivel: No

Portrait: No

Other: Audio output, audio input, 10W stereo speakers

The ASUS ROG Strix XG438Q is an absolutely huge screen that comes in a huge box.

Inside you get the usual array of video cables for each type of connection (HDMI, DisplayPort), USB upstream for the hub, and a power cable with external supply. There’s also an analog minijack audio cable, plus a remote control. The final inclusion is a little lighting device for projecting the ROG logo via the ASUS Aura Sync system.

The fixed stand makes the XG438Q look more like a TV than a gaming monitor. There is Republic of Gamers branding strategically placed front and rear, but the design is completely overwhelmed by the size of this panel.

The only adjustments available are tilting five degrees forward or ten degrees back.

On the side of the screen, in a place that’s more convenient to reach, are the analog audio mini-jacks for input and for output to headphones or external speakers. The upstream and two downstream ports for the USB 3.0 hub are also located here. Finally, one of the HDMI 2.0 ports faces sideways, making temporary connections easier. The remaining two HDMI 2.0 connections, DisplayPort 1.4, MiniUSB for Aura Sync devices and power input are found beneath a plastic cover.

The menu is operated by four large buttons on the rear right corner of the panel, with a joystick at the top.

The lowest button simply turns the monitor off and on, so we’ll move up to the second-from-bottom button.

This button reveals quick menu access to the GameVisual preset modes, which include a lot of different game types. There’s a more generic Scenery Mode, then Racing Mode, Cinema Mode, RTS/RPG Mode, FPS Mode, sRGB Mode, and a separate MOBA Mode, where many monitors will expect you to use the RTS option for MOBAs. Finally, there’s a slot for user-configurable settings.

The next button up lets you choose between the four different video inputs manually.

The third button up simply exits other menus, so we move swiftly to pushing in the joystick, which reveals a menu you can use to reach the same functions as the buttons, as well as the main menu. Note that pushing the joystick in any direction other than inwards does nothing until this menu appears.

If you push the joystick in again, you get the main menu, with the first subsection being the GameVisual modes once more.

Next along is GamePlus, where you can place a crosshair onscreen – for practice purposes only, we always stress. You can also superimpose a timer, hardware FPS counter, and grid to help with aligning multiple screens. However, you’d have to be truly minted to have more than one XG438Q monitor.

Next down is the Color section, which includes the usual controls for brightness and contrast as well as saturation. Color Temp has Cool, Normal and Warm presets as well as the ability to adjust via separate red, green and blue values. The Gamma options include 1.8, 2.2 and 2.5.

The Image section provides pixel overdrive options under OD. Aspect Control will only be available with non-native resolutions. ASCR is a smart contrast system. You can enable FreeSync here, and add up to four levels of Blue Light Filter to help counter eye strain. HDR is only available with an appropriate video signal, while Dynamic Dimming is another way to improve contrast. Finally, Shadow Boost improves contrast only in dark areas – handy for helping you to pick out enemies who hide in poor illumination, without blowing out brighter portions of the screen.

A screen this big, with this many video inputs, is a prime candidate for picture-in-picture. You could, for example, have your set-top box input superimposed within your Windows desktop.

The main menu also has a section for switching manually between the four video inputs.

Under System Setup, you find all the sundry configuration options such as how the USB hub behaves, which input is supplying sound, which menu options the quick buttons point to, how the OSD itself behaves, and which language the interface uses. You can also view information about the input signal and screen setup. However, at the top can be found Aura Sync and Aura RGB, which is where you alter the behaviour of the external projector and rear lighting.

Finally, MyFavorite provides the ability to save and load two full custom configurations.

Overall, the OSD has plenty to offer and makes it all quite easy to find with a logical layout. Unlike some manufacturers, ASUS keeps presets for multimedia and games in one place, although there aren’t any work-related options here. There is a good array of game-specific functions available. This is essentially the standard set of ASUS gaming monitor OSD features, and we never really have much to complain about in this area. We particularly like the wide selection of presets for specific game types, including the ever-popular MOBA.

Our main test involves using a DataColor SpyderX Colorimeter to assess a display’s image quality. The device sits on top of the screen while the software generates colour tones and patterns, which it compares against predetermined values to work out how accurate the screen is.

The results show –

A monitor’s maximum brightness in candelas or cd/m2 at various levels set in the OSD.

A monitor’s contrast ratio at various brightness levels in the OSD.

The brightness deviation across the panel.

The black and white points.

The colour accuracy, expressed as a Delta E ratio, with a result under 3 being fine for normal use, and under 2 being great for colour-accurate design work.

The exact gamma levels, with a comparison against preset settings in the OSD.

We first run this test with the display in its default, out-of-the-box state, with all settings on default after a factory reset. We then calibrate the screen using the Spyder software and run the test again.

We always test the display subjectively on the Windows desktop, using it for general tasks such as browsing and word processing, and with games as well, even if the display is not intended solely for that purpose.

We pay careful attention to any artefacts, ghosting or motion blur, and enable any gaming-specific features, such as adaptive-sync settings like G-Sync or FreeSync, using a compatible graphics card in our test PC.

We performed the quality tests on the ASUS ROG Strix XG438Q at its native 3,840 x 2,160 resolution in the default mode, after resetting the OSD, which sets the refresh to 60Hz, over a DisplayPort connection. Our test system was equipped with an AMD Radeon Vega Frontier Edition graphics card, which supports FreeSync, so we enabled that for games testing.

The gamut is very good, with the usual 100 per cent sRGB we now expect from all monitors, plus an excellent 86 per cent AdobeRGB and an also excellent 94 per cent DCI-P3, which is actually above specification.

Brightness uniformity is not so impressive, however, as we’ve often seen with very large monitors. The brightness differs from the centre by double-digit percentages across a lot of areas of the screen.

Colour uniformity is a bit more respectable at most brightness levels, although not the best we’ve seen.

The XG438Q actually exceeds its brightness specification of 450cd/m2, managing 462.3cd/m2 at its maximum setting. Contrast hits a very commendable 3,270:1 at this setting too, which is a little behind the 4,000:1 rating, but we don’t test with contrast all the way up anyway; we use the default value. The white point isn’t that uniform, ranging from 6600K at zero brightness to 7000K at 100 per cent, which is quite a wide variation.

You get a fair amount of options with the different GameVisual presets. Scenery Mode is high brightness and contrast, offering 428.8cd/m2 for the former and 3,790:1 for the latter, with a mid-range 7000K white point. Racing Mode (the default, for some reason) is still quite bright at 407.9cd/m2 and high contrast, offering 3,610:1, with the same 7000K white point. Cinema Mode uses a much cooler 8900K white point, with a lower 334.6cd/m2 brightness and still high 2,960:1 contrast.

The RTS/RPG and FPS modes are almost identical, with 406.7cd/m2 brightness, 7000K white point and 600:1 or 590:1 contrast respectively. The differences between these modes are likely to be in other settings, such as shadow boosting and pixel overdrive. MOBA has very similar brightness at 407.5cd/m2, the same 7000K white point but a much higher contrast of 3,610:1. Finally, sRGB goes for a relatively low 182.5cd/m2 brightness, lower but still strong 1,850:1 contrast, and slightly cooler 7200K white point.

The gamma modes are consistently 0.1 above their nominal value, so you can’t achieve a true 2.2, but at least you know what you’ll get with each and the increments are uniform.

Colour accuracy at default settings, however, is not that great. The average deviation of 3.13 isn’t terrible, but we’d expect more from a monitor this expensive, making the XG438Q a prime candidate for calibration. So we fired up the SpyderX again.

After calibration, which consisted mostly of dropping the brightness down a lot from the default 90 per cent, the gamut has gone down slightly. You still get 100 per cent sRGB, but AdobeRGB and P3 have both dropped two per cent to 84 and 92 per cent respectively. These are still great values, though.

The Gamma 2.2 preset is now spot on the 2.2 value it’s supposed to provide.

Calibration has improved colour accuracy notably, to an average deviation of 1.7. This is good but still not as exceptional as a number of the monitors we’ve tested recently.

Overall, the ASUS ROG Strix XG438Q offers good but not outstanding performance.

Of course, this is a gaming monitor so we also had to fire up our usual suite of games including CS:GO, Rainbow 6 Siege and League of Legends. We enabled the FPS Gaming Mode preset for the first two games, and MOBA for LoL. We also made sure FreeSync was enabled.

The first thing we should say is that this is an absolutely amazing monitor for a MOBA. There is so much space that you can get a truly excellent view of the game terrain. We also tried Civilization VI and the huge space was superb for this as well (we switched to RTS Mode for Civ).

Our AMD Radeon Vega Frontier Edition graphics card could only drive CS:GO at high enough frame rates in 4K to get close to the 120Hz refresh. It was smooth and eminently playable. Rainbow 6 Siege was only hitting around 60fps so not making much use of the higher refresh. Sitting at desk distance from a screen this huge for FPS play is quite mesmerising, and if we’re honest a bit too immersive. We didn’t have a 4K console to hand, but we suspect that this screen will be absolutely in its element for gaming when you’re a little bit further from the display itself.

The ASUS ROG Strix XG438Q is a monster of a screen. Paying over a grand for a monitor sounds expensive, but looking at other 4K monitors with this screen size (of which there aren’t that many), it’s around what you’d expect. The performance out of the box isn’t quite as outstanding as we would have hoped, but it’s acceptable once you calibrate, so this screen is capable of decent quality if you set it up correctly.

The ASUS OSD, as always, delivers a fine array of features that allow the gaming user to tweak the configuration for the style of play desired, including useful options like a hardware FPS counter. There’s not that much here if you did want to cross over into work territory, other than the sRGB GameVisual mode. You can have some fun with the light projection system and Aura Sync for a truly blinged-out gaming experience.

We do wonder why there are only two USB ports, although they are usefully placed on the side. But you get a wealth of video connectivity, which points to the direction where this screen is best suited. If you’ve got consoles, video set-top boxes and a PC all to use through one screen, preferably all in 4K, then the XG438Q is a decent choice. However, it is worth pointing out its HDMI ports are only HDMI 2.0, not HDMI 2.1. This means next-gen consoles won’t be able to utilise the 120Hz refresh at 4K, though PC users can of course utilise a DisplayPort connection instead.

You will also need a hefty graphics card to get the most out of the 120Hz ability with the latest AAA PC titles. But if you’re serious about your gaming screen size, the ASUS ROG Strix XG438Q is well worth considering.

The ASUS ROG Strix XG438Q is available from Overclockers UK for £1,078.99.

Pros:

Huge 43in 4K desktop.

120Hz refresh.

AMD FreeSync 2 adaptive sync.

Reasonable price for a 4K screen this big.

Plenty of adjustments available in OSD, particularly for gaming.

Lots of video inputs.

Aura ROG logo projector included.

Built-in USB 3.0 hub.

Cons:

Mediocre colour accuracy before calibration.

Only two USB ports on hub.

Almost no ergonomic adjustment.

Mediocre brightness uniformity.

HDMI 2.0 ports, not 2.1.

AMD Zen 3 Ryzen 5000 Price, Specs, Release Date, Performance, All We Know

AMD has officially announced that the Zen 3 architecture will land this year and outlined the new Ryzen 5000 models, setting the stage for a new wave of powerful chips based upon a newer version of AMD’s most successful architecture to date. The new Zen 3 microarchitecture will power AMD’s full lineup of next-gen chips, including the Ryzen 5000 “Vermeer” desktop processors that will soon vie for a spot on our list of Best CPUs, the Ryzen 5000 laptop chips, and the EPYC Milan data center processors.

The first four new Ryzen 5000 models come as chips for the desktop PC, and they stretch from the $299 Ryzen 5 5600X up to the $799 Ryzen 9 5950X. The chips will be on shelves on November 5th and could represent a big shift in the AMD vs Intel CPU wars. 

AMD says Zen 3 features a ground-up rethinking of the architecture that finally allows it to take the 1080p gaming performance lead from Intel. Paired with a 19% boost to instructions per cycle (IPC) throughput and peak boost speeds of up to 4.9 GHz, AMD may just have the magic 7nm bullet that finally unseats Intel from its position at the top of our gaming performance benchmarks. In fact, given what we’ve seen so far, it looks like AMD could soon enjoy a dominant position in the desktop PC market unlike anything we’ve seen since the Athlon 64 days.

AMD’s Zen 3 Ryzen 5000 stack begins with the impressive 16-core 32-thread Ryzen 9 5950X that will retail for $799. This chip boosts up to 4.9 GHz, has 64MB of unified L3 cache, and a 105W TDP rating. AMD says this chip is faster than Intel’s 10-core Core i9-10900K in pretty much everything, which isn’t surprising — Intel has no equivalent for the mainstream desktop.

The $549 Ryzen 9 5900X slots in as the more mainstream contender, at least by AMD’s definition of ‘mainstream,’ with 12 cores and 24 threads that boost up to 4.8 GHz. AMD says this chip beats the Core i9-10900K by even more impressive margins in gaming. Further down the stack, we find the 8C/16T Ryzen 7 5800X for $449 and the 6C/12T Ryzen 5 5600X for $299.

Intel is stuck with its Comet Lake chips for five long months to try to fend off the Ryzen 5000 lineup until Rocket Lake arrives in Q1 2021, which doesn’t bode well. 

As odd as it sounds, Intel may have one hidden advantage — pricing. AMD now positions Ryzen 5000 as the premium brand and says it has the benchmarks to prove it. As a result, AMD has pushed pricing up by $50 across the stack compared to its Ryzen XT models. However, the XT family doesn’t really represent AMD’s best value chips; its own Ryzen 3000 series, which comes at much lower price points, holds that crown.

As a result, Intel’s Comet Lake chips now have comparatively lower price points than AMD’s Ryzen 5000 lineup. However, AMD says it still maintains the performance-per-dollar lead. We won’t know the full story until the chips land in our labs, but that obviously won’t be long — AMD says the full roster of Ryzen 5000 chips will be available at retail on November 5.

If one thing is certain, it’s that the Zen microarchitecture has completely redefined our expectations for mainstream desktop chips, and it’s rational to expect more of the same with Zen 3. Let’s cover what we know about Zen 3 so far.

AMD Zen 3 Ryzen 5000 At A Glance

1080p gaming performance leadership

Ryzen 9, 7, and 5 models

From 6C/12T up to 16C/32T

Same optimized 7nm process as Ryzen XT models

Zen 3 microarchitecture delivers 19% IPC improvement

24% gen-on-gen power efficiency improvement — 2.8X better than 10900K

Higher peak frequencies for most models — 4.9 GHz on Ryzen 9 5950X

Lower base frequency for all models, offset by increased IPC

L3 cache now unified in a single 32MB cluster per eight-core chiplet (CCD)

Higher pricing across the stack (~$50)

No bundled cooler with Ryzen 9 and Ryzen 7 models

Drop-in compatible with the AM4 socket

No new chipset/motherboards launched

Current-gen 500-series motherboards work now (caveats below)

Beta support for 400-series motherboards begins in January 2021

All Zen 3 desktop, mobile, and APU chips will carry Ryzen 5000 branding

Same 142W maximum power for AM4 socket as previous-gen

Same 12nm GlobalFoundries I/O Die (IOD)

Here we can see the full Ryzen 5000 product stack, and how the new chips stack up against Intel’s Comet Lake. The first big thing you’ll notice is the increased Precision Boost clock rates, which now stretch up to 4.9 GHz. However, we also see a broad trend of lower base frequencies for the Ryzen 5000 series compared to the previous-gen chips, but that isn’t too surprising considering the much higher performance-per-watt that we’ll outline below.

AMD obviously leans on its improved IPC rather than raw clock speeds, thus boosting its power efficiency and reducing heat generation. The Ryzen 5 5600X is the best example of that — despite only a slight reduction to the base frequency, the chip drops to a 65W TDP compared to its predecessor’s 95W. 
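As a rough worked example (using base clocks we’ve pulled in for reference rather than figures from AMD’s slides: 3.7 GHz for the 5600X against 3.8 GHz for the 95W Ryzen 5 3600XT), single-core throughput scales with IPC times frequency: 1.19 × (3.7 / 3.8) ≈ 1.16, so the new 65W chip should still deliver roughly 16% more base-clock throughput than its 95W predecessor.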

What’s not as impressive? AMD has continued with the precedent it set with its Ryzen XT series: Bundled coolers no longer come with processors with a TDP higher than 65W. That means the Ryzen 5 5600X will be the only Ryzen 5000 chip that comes with a cooler in the box. AMD said it decided to skip bundled coolers in higher-TDP models largely because it believes most enthusiasts looking for high-performance chips use custom cooling anyway. AMD also still specs a 280mm (or greater) AIO liquid cooler for the Ryzen 9 and 7 chips, which significantly adds to the overall platform costs.

AMD continues to only guarantee its boost frequencies on a single core, and all-core boosts will vary based on the cooling solution, power delivery, and motherboard BIOS. The Ryzen 5000 chips still expose the same 20 lanes of PCIe 4.0 to the user (another four are dedicated to the chipset), and stick with DDR4-3200 memory. We’re told that memory overclocking capabilities remain the same as we see with the Ryzen XT models, so AMD hasn’t changed its guidance on that front. 

AMD Ryzen 5000 Zen 3 Performance Benchmarks and Comparisons 

AMD Ryzen 9 5950X Gaming and Application Performance Benchmarks 

Before we get into the benchmarks, be aware that AMD provided them. Like all benchmarks provided by any company, they could be (and probably are) heavily skewed toward games and applications that favor the company’s products.

Also, AMD tested all processors (both the Ryzen 5000 and Intel models) with DDR4-3600 memory. For reference, DDR4-3200 is the stock configuration for the AMD processors, and DDR4-2933 is stock for the Core i9-10900K. AMD also used a Noctua NH-D15s, a high-end air cooler, for all tested platforms (which is fine), and an Nvidia GeForce RTX 2080 Ti. (It probably couldn’t buy a GeForce RTX 3080 or GeForce RTX 3090 either.)

What do the 16-core 32-thread Ryzen 9 5950X and its eye-watering $799 price tag get you? The first slide pits AMD’s 5950X against the previous-gen Ryzen 9 3950X and shows 20%+ performance gains in the tested games, though the deltas do vary. 

AMD also says the Ryzen 9 5950X scores 640 points in the single-threaded Cinebench R20 benchmark, which is much higher than the Core i9-10900K’s 544 points. The content creation benchmarks show the 5950X with solid gains in lightly-threaded apps, like CAD, Adobe Premiere, and compilation. 

However, performance gains in the heavily-threaded V-Ray application are a bit less pronounced. AMD says the Ryzen 5000 processors still have to adhere to the 142W power limit of the AM4 socket, which reduces performance gains in heavily-threaded applications. 

On the brighter side, AMD says those performance gains come at the same level of power consumption, which means the chips are more power-efficient. It will also be interesting to see how that looks when we lift the power limits in our own tests.

The second slide shows the 5950X against the Intel Core i9-10900K in several games and applications. The benchmarks show what is basically a dead heat with the 10900K, but the Ryzen 9 5900X is actually the faster gaming chip, so you’ll see bigger deltas over the Core i9-10900K in the benchmarks below.

Of course, the RTX 2080 Ti could be the main bottleneck even at 1080p ultra. We joked about AMD not having RTX 3080 or RTX 3090 testing results, but in all seriousness, anyone upgrading to Zen 3 for gaming purposes is likely eyeing Nvidia’s Ampere or AMD’s Big Navi as well. That’s something we’ll be testing once we have hardware in our labs. 

AMD Ryzen 9 5900X Gaming and Application Performance Benchmarks 

Here’s a quick look at the improvement in AMD’s favorite single-threaded benchmark, Cinebench R20. AMD likes this test because it is extremely favorable to its Zen microarchitecture.

The Ryzen 9 5900X scored 631 points, while the Core i9-10900K weighed in at 544 points. That works out to an outstanding 16% advantage for the Ryzen 9 5900X, but bear in mind this occurs in a single benchmark, so take it with a grain of salt. 

We scored 535 points with the 10900K in the same test, albeit obviously with a different test platform and conditions. AMD didn’t show the Ryzen 9 5900X’s multi-threaded Cinebench score, but measured the Core i9-10900K at 6,354 points. That’s close enough to call a tie with our own measurement of 6,356 points.

AMD bills the Ryzen 9 5900X as the fastest gaming CPU on the market, a claim it says is based on the average fps across 40 PC games at 1920x1080 with maxed-out settings.

Here we see a spate of AMD’s 1080p performance benchmarks with the Ryzen 9 5900X up against the Ryzen 9 3900XT. Overall, the 5900X provides a 26% average fps performance improvement, which is pretty stellar for an in-socket upgrade. Notably, the processor notches higher gains in some titles — to the tune of 50% in League of Legends and 46% in CS:GO. Other titles, like Battlefield V and Total War, see low single-digit gains. 

The second slide pits the Ryzen 9 5900X against the Core i9-10900K in a selection of games at 1080p with high fidelity settings. AMD recorded a slight loss in Total War, and some single-digit performance increases in a few titles. However, League of Legends and CS:GO, both of which are older titles, received significantly higher fps measurements.

We’ll obviously have to see these titles tested on our own test systems, and Intel could gain a bit more performance from overclocking. The jury is still out on Ryzen 5000’s overclockability, but the chips use the same process as the existing Ryzen XT models, so we don’t expect much headroom.

AMD Ryzen 7 5800X Gaming and Application Performance Benchmarks 

AMD didn’t share dedicated benchmarks for the Ryzen 7 5800X and Ryzen 5 5600X — the higher-end models are obviously in the spotlight for today’s announcements. However, the company did share performance-per-dollar slides, which you can see above for the Ryzen 7 5800X. We’ll add more benchmarks as we learn more. 

AMD Ryzen 5 5600X Gaming and Application Performance Benchmarks 

Here we can see AMD’s performance-per-dollar projections for the Ryzen 5 5600X. Given that this and the Ryzen 7 5800X are single-chiplet designs, we expect them to be incredibly competitive in gaming at the lower price ranges. 

AMD Zen 3 Ryzen 5000 Motherboards

AMD didn’t launch a new chipset with the Ryzen 5000 series; instead, the chips drop right into existing 500-series chipsets, like X570, B550 and A520 models. These boards require an AGESA 1.0.8.0 (or newer) BIOS to boot a Zen 3 processor, but AMD has been quietly shipping supporting BIOSes since the summer. As a result, every 500-series motherboard on the market should have a downloadable BIOS available.

While the early BIOS revisions ensure the chips will work on the most basic level, you’ll have to update to an AGESA 1.1.0.0 (or better) BIOS for the best performance. These revisions will be available for all 500-series motherboards by the November 5th Ryzen 5000 launch date.

AMD originally announced it wouldn’t provide Zen 3 support for 400-series motherboards, but due to concerns from the enthusiast community, the company reversed course. Now AMD will also provide support for 400-series chipsets, but the BIOS updates are still under development; the first beta BIOSes will be available in January 2021.

However, a series of important restrictions applies to 400-series upgraders, which you can read more about here. Here’s the short version from AMD:

We will develop and enable our motherboard partners with the code to support “Zen 3”-based processors in select beta BIOSes for AMD B450 and X470 motherboards.

These optional BIOS updates will disable support for many existing AMD Ryzen Desktop Processor models to make the necessary ROM space available.

The select beta BIOSes will enable a one-way upgrade path for AMD Ryzen Processors with “Zen 3,” coming later this year. Flashing back to an older BIOS version will not be supported.

To reduce the potential for confusion, our intent is to offer BIOS download only to verified customers of 400 Series motherboards who have purchased a new desktop processor with “Zen 3” inside. This will help us ensure that customers have a bootable processor on-hand after the BIOS flash, minimizing the risk a user could get caught in a no-boot situation.

Timing and availability of the BIOS updates will vary and may not immediately coincide with the availability of the first “Zen 3”-based processors.

This is the final pathway AMD can enable for 400 Series motherboards to add new CPU support. CPU releases beyond “Zen 3” will require a newer motherboard.

AMD continues to recommend that customers choose an AMD 500 Series motherboard for the best performance and features with our new CPUs.

Note: You lose support for PCIe 4.0 on 400-series boards, but most gamers will not, and should not, care — PCIe 4.0 makes no meaningful performance difference in gaming. 

AMD Zen 3 Ryzen 5000 Pricing and Availability

The Ryzen 5000 series will come to market on November 5th, 2020. We expect to learn more information, like performance benchmarks, for the Ryzen 7 and 5 models in the interim. We also expect to eventually hear about Threadripper 5000 products with the Zen 3 architecture, but we aren’t sure when AMD will bring the new design to its ultra-powerful high-end desktop lineup.

The Zen 3 Ryzen 5000 processors do come with a recommended $50 markup across the product stack. AMD’s suggested pricing often has little to do with what we see at retail; you can expect the chips to eventually retail for far lower than MSRP. 

The change comes as AMD positions itself as a premium chip supplier as opposed to its long history as the value alternative. The continued absence of bundled coolers also serves to drive up the platform cost – in most cases, you’ll need to invest at least $40 to find a cooler that’s as capable as AMD’s stock coolers. The company specs a 280mm AIO cooler (or equivalent air cooler) for the chips, so plan accordingly. 

That’s led to plenty of complaints, and Intel’s Comet Lake lineup actually has lower pricing in critical price bands. We do have to take performance into account, though, and we have yet to do our own testing. That means the jury is out on the price-to-performance ratio for Ryzen 5000. 

AMD’s Zen 3 pricing in the market will be largely predicated upon how it performs relative to Intel’s chips. Given the big performance gains we expect with the Zen 3 generation, it’s possible the numbers could work out in AMD’s favor despite the higher launch prices.

If Zen 3 lives up to its billing, it looks like AMD’s only constraint will be production capacity at TSMC. AMD will sell every Ryzen 5000 chip it punches out, at least until Rocket Lake arrives – and we still don’t know if Intel’s new 14nm design can keep pace with AMD’s 7nm chips. AMD’s ecosystem of 500- and 400-series motherboard partners has plenty of relatively affordable options, so we don’t foresee any problems with motherboard supply.

On that front, AMD will undoubtedly meet with stiff demand for the Ryzen 5000 chips at launch, and the company says it is working with retailers to avoid the plague of purchasing bots that exacerbated Nvidia’s now-infamous Ampere launch. And AMD hasn’t been free of shortages at launch either, with the Ryzen 9 3950X being relatively difficult to purchase for the first couple of months after it launched.

Should I Buy a Ryzen 5000 Series Zen 3 CPU?

The jury is still out on just how AMD’s Zen 3 Ryzen 5000 series chips will perform in the real world: We won’t know until the silicon lands in our labs, but you can bet that will be soon given the November 5th, 2020 launch date.

The performance does look promising; AMD has made plenty of alterations that should boost performance significantly. Here’s what we know about the Zen 3 microarchitecture:

AMD Zen 3 Ryzen 5000 Microarchitecture 

AMD shared many new details about the Zen 3 microarchitecture, but the company says it will share even more information in a future briefing, so we’ll have a lot more information for our forthcoming Ryzen 5000 reviews.

AMD embarked on what it describes as a ground-up redesign of the Zen 2 microarchitecture to deliver the gains we would normally see with an entirely new design. In fact, the company’s ~19% increase in IPC represents its largest single-generation increase in the ‘post-Zen’ era (Zen+, Zen 2). We certainly haven’t seen an increase of this magnitude for desktop chips from Team Blue in the recent past, either — the initial Skylake architecture achieved a similar boost, but everything since has been nearly static.

AMD calculates its 19% IPC number from the geometric mean of 25 workloads measured with two eight-core chips locked at 4.0 GHz. The impressive IPC gains required a ‘front-to-back’ series of modifications to the design, including (but not limited to) the cache subsystem, front end, branch predictor, execution engine, and load/store elements, all with a focus on boosting single-threaded performance while wringing out better instruction level parallelism (ILP). The result is improved performance across the board in both single- and multi-threaded integer and floating point workloads. However, the 142W power limit imposed by the AM4 socket does restrict the scope of performance gains in heavily-threaded workloads, though there are some advances there, too.
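For clarity, here’s a minimal sketch of the geometric-mean calculation AMD describes, using invented per-workload IPC ratios rather than AMD’s actual 25-workload measurements:

```python
import math

# Hypothetical per-workload IPC ratios (Zen 3 / Zen 2 at a fixed 4.0 GHz).
# These five values are placeholders, not AMD's measured data.
ipc_ratios = [1.22, 1.15, 1.31, 1.18, 1.12]

# Geometric mean: the nth root of the product of n ratios.
geomean = math.prod(ipc_ratios) ** (1.0 / len(ipc_ratios))
print(f"Geomean IPC uplift: {geomean - 1.0:.1%}")  # ~19% with these placeholders
```

The geometric mean is the standard way to average ratios across workloads, since a single outlier result skews it far less than it would a simple arithmetic mean.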

AMD says it uses the same enhanced version of TSMC’s 7nm process node that it used for the Ryzen XT series, but it still hasn’t provided specifics. AMD’s ‘special recipe’ for 7nm is largely kept confidential, but the firm specified that it doesn’t use TSMC’s 7nm+ (an EUV node). That likely means AMD either uses the standard N7 node from Zen 2 with improved design rules, or that the chips use the N7P node.

AMD’s end goal is to have undisputed best-in-class performance across the full spectrum of applications, and gaming performance was a particular focus, which brings us to the changed cache hierarchy. 

As with the Zen 2 processors, Zen 3 uses the same 12nm I/O die (IOD) paired with either one or two chiplets in an MCM (Multi-Chip Module) arrangement. In the image above, we can see the large I/O die and the two smaller eight-core chiplets. 

AMD chose to stick with this basic design for its Zen 3 Ryzen 5000 chips. And just like we see with the previous-gen Zen 2 chips, processors with six or eight cores come with one chiplet, while chips with 12 or 16 cores come with two chiplets. 

While the overall package design is the same three-chiplet design, AMD made drastic changes to the internals of the two eight-core chiplets. In the Zen 2 architecture (left), each Zen compute chiplet (CCD) contained two four-core clusters (CCXes) with access to an isolated 16MB slice of L3 cache. So, while the entire chiplet contained 32MB of cache, not all cores had access to all of the cache in the chiplet.

To access an adjacent slice of L3 cache, a core had to communicate with the other quad-core cluster by issuing a request that traversed the Infinity Fabric to the I/O die. The I/O die then routed the request to the second quad-core cluster, even though it was contained within the same chiplet. To fulfill the request, the data had to travel back over the fabric to the I/O die, and then back into the quad-core cluster that issued the request.

On the right side of the slide, we can see that the chiplet now contains one large unified 32MB slice of L3 cache, and all eight cores within the chiplet have full access to the shared cache. This improves not only core-to-cache latency, but also core-to-core latency within the chiplet.

While all eight cores can access the L3 cache within a single compute chiplet, in a dual-chiplet Zen 3 chip, there will be times that the cores will have to communicate with the other chiplet and its L3 cache. In those cases, the compute chiplet’s requests will still have to traverse the Infinity Fabric via signals routed through the I/O die, which incurs latency.

Still, because an entire layer of external communication between the two four-core clusters inside each chiplet has been removed, the Infinity Fabric will naturally carry far less traffic. This results in less contention on the fabric, thus simplifying scheduling and routing, and it could also increase the amount of bandwidth available for this type of traffic. All of these factors should result in faster (i.e., lower-latency) communication between the two eight-core chiplets, and possibly remove some of the overhead on the I/O die, too. We imagine there could also be other advantages, particularly for main memory latency, but we’ll wait for more details. We do know that the default fabric speeds haven’t changed, though.
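To illustrate the topology change, here’s a toy latency model of the two access paths described above. The nanosecond figures are hypothetical placeholders picked for illustration, not measured Zen 2 or Zen 3 latencies:

```python
# Placeholder latencies (illustrative only, not real Zen measurements).
LOCAL_L3_NS = 10    # core -> L3 slice it can reach directly
FABRIC_HOP_NS = 30  # one traversal of the Infinity Fabric via the I/O die

def zen2_l3_access_ns(same_ccx: bool) -> int:
    # Zen 2: reaching the other four-core cluster's L3 slice means a
    # round trip over the fabric through the I/O die, even within a chiplet.
    return LOCAL_L3_NS if same_ccx else LOCAL_L3_NS + 2 * FABRIC_HOP_NS

def zen3_l3_access_ns(same_ccd: bool) -> int:
    # Zen 3: all 32MB of L3 in a chiplet is directly reachable; only
    # cross-chiplet requests still traverse the fabric.
    return LOCAL_L3_NS if same_ccd else LOCAL_L3_NS + 2 * FABRIC_HOP_NS

print(zen2_l3_access_ns(same_ccx=False))  # 70: cross-CCX hop under these assumptions
print(zen3_l3_access_ns(same_ccd=True))   # 10: that same access is now local on Zen 3
```

The key takeaway is that the class of accesses that used to pay the round-trip penalty within a chiplet now resolves locally; only true cross-chiplet traffic still pays it.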

All of this is important because games rely heavily on the memory subsystem, both on-die cache and main memory (DDR4). A larger pool of cache resources keeps more data closer to the cores, thus requiring fewer high-latency accesses to the main memory. Additionally, lower cache latency can reduce the amount of time a core communicates with the L3 cache. This new design will tremendously benefit latency-sensitive applications, like games — particularly if they have a dominant thread that accesses cache heavily (which is common). 
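As a rough illustration of that point, a simple average memory access time (AMAT) calculation shows why a larger, lower-latency L3 pays off. The hit rates and latencies below are assumptions for the sake of the example, not measured figures:

```python
# AMAT = hit_rate * cache_latency + miss_rate * memory_latency
def amat_ns(l3_hit_rate: float, l3_latency_ns: float, dram_latency_ns: float) -> float:
    return l3_hit_rate * l3_latency_ns + (1.0 - l3_hit_rate) * dram_latency_ns

# A unified 32MB L3 plausibly raises the hit rate for a cache-hungry game thread.
print(amat_ns(0.80, 12.0, 80.0))  # 25.6 ns with an assumed 16MB-slice hit rate
print(amat_ns(0.90, 11.0, 80.0))  # 17.9 ns with a higher hit rate and lower latency
```

Even a modest bump in hit rate cuts the number of trips to DRAM, which is where the big latency cost lives.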

Naturally, power efficiency will improve as a function of reduced traffic on the Infinity Fabric, but that’s probably a small fraction of the performance-per-watt gains AMD has extracted from the architecture. Increased IPC and other SoC-level optimizations obviously factor in here. Still, the net result is that AMD managed to stay within the same thermal and electrical (TDP) ranges as the Ryzen 3000 chips while delivering more performance.

AMD Ryzen 5000 Zen 3 Power Consumption and Efficiency

AMD says it has not increased power consumption by a single watt — the maximum power draw for the AM4 socket still stands at 142W — which naturally will lead to impressive efficiency gains. AMD’s chart above uses the first-gen Ryzen 7 1800X as a comparison point, and here we see a 2X improvement by moving to the 7nm Zen 2 architecture. That isn’t too surprising considering the move from the older 14nm process to 7nm with that generation of chips. 

The more important reduction comes from extracting more efficiency from the ‘same’ 7nm node, which is far more difficult and requires a combination of both better design methodologies and architectural improvements. As a result of these factors, AMD says it wrung out another 24% gen-on-gen efficiency improvement with the Ryzen 9 5900X over the Zen 2-powered Ryzen 9 3900XT. That’s impressive. Intel’s most recent Comet Lake chips had to increase power draw quite a bit and still had far lower performance improvements.
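For reference, at a fixed power budget an efficiency gain translates directly into a performance gain. Here’s a trivial sketch with placeholder performance numbers:

```python
# The AM4 power limit is unchanged, so performance and perf-per-watt scale together.
POWER_LIMIT_W = 142            # unchanged AM4 socket maximum

perf_3900xt = 100.0            # normalized Zen 2 performance (placeholder)
perf_5900x = 124.0             # ~24% more work in the same envelope (placeholder)

efficiency_gain = (perf_5900x / POWER_LIMIT_W) / (perf_3900xt / POWER_LIMIT_W) - 1.0
print(f"Perf-per-watt gain: {efficiency_gain:.0%}")  # 24%
```

Because the power term cancels out, AMD’s 24% efficiency claim implies roughly 24% more throughput at the same 142W limit.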

What does that mean to you? Faster, cooler, and quieter performance for your PC compared to AMD’s previous chips – and those models already posed a stiff challenge to Intel’s Comet Lake. 

The Ryzen 5000 Zen 3 chips arrive at retail on November 5th, 2020. We’ll update as we learn more.