Intel may have a 6-core Comet Lake S CPU to challenge AMD in desktops

Intel may have moved on to 10nm process nodes for its Ice Lake processors (CPUs), but it looks like 14nm is still where its focus is for desktop. And that may not be such a bad thing, as a recent leak has shown what could be a new Intel Core i5 CPU with 6 cores and 12 threads using the new Comet Lake-S architecture, HotHardware reports.

The leak shows an Intel CPU tested on an H400-series motherboard, one we understand to be designed for desktop Comet Lake-S processors using the LGA 1200 socket. Beyond the chip’s 6 cores and 12 threads, it’s shown as having a base clock speed of 2.0GHz, a total of 12MB of L3 cache, and 3MB of L2 cache.

New life for 14nm

Intel has already introduced 10nm Ice Lake CPUs and 14nm Comet Lake CPUs for mobile devices, but we haven’t seen desktop versions of these CPUs show up yet. And, while it’s unfortunate that we’re seeing yet another 14nm chip for desktop, Intel appears to be upping the ante to better challenge AMD this time.

AMD’s big push at the introduction of Ryzen, and now with Ryzen 3000, has been dazzling core counts and multi-threading to boost performance that much further. If Intel introduces a 6-core/12-thread Intel Core i5, as this leak suggests, it could be better competition for AMD’s chips than past Core i5 processors, which lacked hyper-threading.

The 2.0GHz clock speed may be a concern, but the leaked chip is likely an engineering sample that’s not running at the speeds production models will hit. Running at a higher speed could help it compete with something like the Ryzen 5 3600, which also boasts 6 cores and 12 threads and runs at a 3.6GHz base clock speed (though it has a much heftier 32MB of L3 cache).

When it comes to the top of the pack, though, we’re still waiting for the 10-core Comet Lake CPU spotted in earlier leaks.

Mac running Catalina? Apple has released a new update to macOS 10.15

Just under a week ago, Apple released a supplemental update to macOS Catalina with various bug fixes and performance improvements. Now, Apple has made a revised version of that same supplemental update available to users.

On its developer website, Apple says that a new version of the macOS Catalina supplemental update has been released today. If you installed the original supplemental update released last week, you might not even receive today’s revised version, as Apple is focusing on people who hadn’t yet installed the initial supplemental update.

The release notes for today’s update, build 19A603, are exactly the same as last week’s:

Improves installation reliability of macOS Catalina on Macs with low disk space

Fixes an issue that prevented Setup Assistant from completing during some installations

Resolves an issue that prevents accepting iCloud Terms and Conditions when multiple iCloud accounts are logged in

Improves the reliability of saving Game Center data when playing Apple Arcade games offline

The revised version of the macOS Catalina supplemental update likely includes very minor changes and fixes. Apple is also currently beta testing macOS Catalina 10.15.1, which may have provided our first look at the forthcoming 16-inch MacBook Pro.

SK hynix introduces 1Znm DDR4 memory

The company claims productivity of the new 1Znm memory modules has been improved by approximately 27% over the previous-generation 1Ynm line. However, the manufacturing process does not require expensive extreme ultraviolet (EUV) lithography, so 1Znm production will be cost-effective.

SK hynix 1Znm memory supports up to 3200Mbps data transfer rates, the fastest speed the DDR4 interface offers. The new 1Znm memory modules are also more power efficient, reducing power consumption by around 40% compared to modules of the same density built on the previous 1Ynm 8Gb DRAM process.
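For context on what a 3200Mbps per-pin rate means in practice, peak bandwidth of a single DDR4 channel works out as below. This is illustrative arithmetic only; the 64-bit channel width is the DDR4 standard, not something SK hynix states here.

```python
# Peak DDR4 channel bandwidth = transfer rate (MT/s) x channel width (bytes).
# Illustrative arithmetic; 64 bits is the standard DDR4 channel width.
def ddr4_peak_bandwidth_gbs(transfer_rate_mts, bus_width_bits=64):
    """Return peak bandwidth in GB/s for a single DDR4 channel."""
    return transfer_rate_mts * 1e6 * (bus_width_bits / 8) / 1e9

print(ddr4_peak_bandwidth_gbs(3200))  # 25.6 GB/s per channel at 3200 MT/s
```

At DDR4-3200 that is 25.6 GB/s per channel, versus 21.3 GB/s at the DDR4-2666 speed common in the previous generation.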

A new substance, not used in the previous generation, has been applied to the manufacturing process, which maximises the capacitance of the 1Znm product. Capacitance is the amount of electrical charge a capacitor can store, which is a key element in DRAM operation. A new design to increase operational stability has also been introduced into the process.

“1Znm DDR4 DRAM boasts the industry’s highest density, speed, and power efficiency, making it the best product to meet the changing demands of customers seeking high-performance/high-density DRAM,” said Lee Jung-hoon, Head of 1Z TF of DRAM Development & Business. “SK hynix will start mass production and full-scale delivery next year to actively respond to market demand.”

SK hynix has plans in place to extend the 1Znm technology process to a range of applications, including next-generation mobile LPDDR5 DRAM and HBM3, which will be the fastest DRAM in the future.

Razer Raptor 27

Razer isn’t known as a company that makes gaming monitors—mainly because it never has made one before. But if the company’s first model, the $699 Raptor 27, is any indication of what we can expect going forward, it won’t be long before it’s arm-wrestling with the top gaming-monitor manufacturers in the market. The Raptor 27 wows us with exceptional color results, ultra-low input lag, and a unique design that is wholly Razer, all of which are enough to earn the display our Editors’ Choice. It’s one of the best gaming monitors of 2019.

A Mostly Intelligent Design

On the design side of things, everything about this IPS-based, 2,560-by-1,440-pixel panel stands out from anything I’ve seen in the gaming monitor space before. (It also features a peak 144Hz refresh rate.)

First, there’s the stand, which is both wider and heavier than any other display I’ve tested in this category. That’s not a bad thing, however, and it only adds to the overall sturdiness and solid profile of this 25.4-pound monitor. As long as you have a stable enough desk, you won’t catch this panel wobbling during intense gaming sessions anytime soon.

The one catch? The rectangular base is set in lockstep with the horizontal plane of the monitor, which means any ergonomically fussy gamers like myself, who need to keep their keyboards at an off-angle to play effectively, will be forced to scoot the monitor back on their desks a little to set themselves up for multiplayer domination.

Next, there’s the utterly unique back of this unit, where the stand also doubles as a cable manager. The rear portion of the stand enables you to thread the (admittedly very neon green) cables that Razer provides with the monitor from the bottom of the screen, down the stand through precut channels, and out to your PC. Also, the monitor’s rear panel is wrapped in fabric—another first in gaming displays—and although it takes some getting used to, I’m a big fan of the materials choice. There’s just an ineffable elegance to it, something that needs to be seen (and felt) to be understood.

Finally, though I’m normally not a fan of RGB lighting on gaming monitors due to its tendency to be too bright and wash out the images on screen, I’m impressed with the restraint that Razer has shown in this department. A single, subtle RGB LED strip wraps around the base of the stand, and it reflects softly off your desk. You can program it to any customizable pattern you want; it’s controlled through the Razer Synapse 3 software suite.

The monitor is vertically adjustable, up to a maximum distance of 5.25 inches from the bottom of the display to the desk, and the stand supports a huge tilt range: up to 90 degrees. Why so far? Well, this is where one of the best design aspects of the monitor comes in: simplified port access. (Every other monitor maker, please pay attention.)

Because of where monitors need to place their circuit boards internally, on most monitor chassis the majority of the video inputs and USB ports are positioned underneath the monitor, facing downward. This can make it mighty tricky to get cables plugged in, naturally. More often than not, you might find yourself manhandling a monitor or positioning it screen-down just to get everything slotted in its right spot.

Razer’s solution for this problem? Let the display tilt fully flat, perpendicular to your desk. That allows all the ports to face directly outward and makes it easier than ever to plug in everything you need.

As for that port mix? The Raptor 27 supports just enough, with DisplayPort 1.4b and HDMI 2.0 inputs (one of each), one USB Type-C input (it supports DisplayPort 1.4 and USB upstream capabilities), and two USB 3.0 ports. No headphone-jack passthroughs here, which is a small disappointment but not enough to ding the Raptor 27 much.

Five-Way Is the Only Way

With so many solid choices made in the design of the Raptor 27 chassis, I guess I shouldn’t be surprised to see that Razer opted for the best way of navigating through the onscreen display (OSD) menus: a five-way joystick.

Using this joystick, you can change the many settings on offer, including brightness and contrast adjustments; switching between different color modes; and turning on the overdrive response-time booster and the motion-blur reduction feature. The monitor also supports both picture-in-picture and picture-by-picture functionality, just in case you’re doubling the Raptor 27 as a business or productivity display and want to hook up multiple PCs or other video sources to it at the same time.

The one drawback here: Unlike on many other gaming monitors, you can’t control the OSD settings via the Razer Synapse 3 software. This might change in the future (again, this is Razer’s first gaming monitor), but for now if you want to change any settings you’ll have to do it through the OSD alone. At least you have the joystick.

Surprises All Around

As a gaming monitor first and foremost, I didn’t set any heavy expectations in the color-reproduction department for the Raptor 27. So you can imagine my shock when the display started posting top marks left and right.

First, there are the sRGB results, which nailed the top of the curve with a perfect 100 percent score. sRGB mainly determines how well a display will handle internet-based content, and on this test, the Raptor 27 needed no help getting to the top of the list.

Next, there’s the Adobe RGB measurement, which generally tells you how well (or poorly) a monitor will do when handling content-creation tasks such as video or photo editing, or 3D modeling. Here, the monitor did falter ever so slightly, coming in at 89.2 percent overall coverage of the gamut. Still, that’s top-end for a gaming monitor, if a little on the low side for professional panels in this space.

Next was the DCI-P3 test, which measures how accurately a monitor will display movies and TV-show content. Razer rates the monitor to hit 95 percent DCI-P3 coverage, and it came up just short in our evaluation, at 94 percent. Not a major miss, by any means, and again still far above the grade set by any other gaming monitors we’ve tested this year.

Then there’s the luminance and contrast testing, where, again, the Raptor 27 continued to exceed expectations. Though it’s rated only as an HDR400 monitor (with a peak brightness of 420 nits, according to the company), the display jumped over that line to deliver 453 nits in HDR mode, while SDR results came in at a more reasonable reading of 358 nits. Compared against its black level of 0.27 nits, the Razer once again leapfrogged its own contrast-ratio rating (1,000:1) to deliver a contrast ratio of 1,678:1.
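That contrast figure follows directly from the two measurements: peak white luminance divided by black level. A quick sanity check on the arithmetic (illustrative only, not part of the review’s test methodology):

```python
# Contrast ratio = peak white luminance / black level, both measured in nits.
def contrast_ratio(white_nits, black_nits):
    """Return the contrast ratio as a single number (i.e. the N in N:1)."""
    return white_nits / black_nits

ratio = contrast_ratio(453, 0.27)  # HDR peak brightness over measured black level
print(f"{ratio:.0f}:1")            # ~1678:1, matching the measured result
```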

Finally, using our HDFury 4K Diva for input-lag testing, we recorded a near-record-low input-lag time of just 1.7ms, which puts the Raptor 27 neck-and-neck with the HP Omen X 25f, and just short of our record holder to date, the MSI Optix MPG341CQR, with its result of 1.4ms.

Anecdotal Testing

As I do for every gaming monitor, I put the Razer Raptor 27 through its paces in a few rounds of Counter Strike: Global Offensive to test how well the overdrive, adaptive sync, and motion-blur reduction features add to my multiplayer experience. Here, I found no issues with screen tearing or lag, and the monitor is well-equipped to do exactly what it says it will: help you game at 144Hz as fast as your fingers and twitch response will let you.

We tested both AMD’s FreeSync and Nvidia’s G-Sync adaptive sync technologies on the Razer Raptor 27 using both an AMD Radeon RX 5700 XT and an Nvidia GeForce RTX 2080 Ti, and in each instance the display showed no signs of tearing or slowing down the action.

HDR video testing was also great, and although 27 inches is a little on the small side for a monitor that I’d recommend you do the majority of your movie watching on, it could still sub in temporarily for a larger HDR TV in between gaming matches, if it really needs to.

Razer? Really, Razer!

Razer has a bit of a reputation in the gaming community—“polarizing” would be the word. Depending on which side of the Razer fence you land on, the company’s products tend to be all hits, or mostly misses.

Though I’m an avid PC gamer, I haven’t personally purchased a Razer product in well over a decade. (Not for any special reason, mind you.) But if the Raptor 27 is the kind of kit I’ve been missing out on, then it’s a big regret on my part. Given that this is the first gaming monitor Razer has produced, I wasn’t expecting the world out of this display, which is why I’m that much more blown away to see a first-timer perform so far above the grade. Razer has burst out of the gate with a spectacular gaming monitor here, showing that the company listens to gamers. On most accounts, it delivers on what that segment of the market looks for in its next big purchase.

At $699, the monitor is right around the top end of what you should spend on a display with similar specs at its 27-inch screen size. But for me, it’s worth it. With a gorgeously functional design, eye-wateringly good color results, and smooth gaming performance, the Raptor 27 goes above and beyond to earn our Editors’ Choice badge. It may be Razer’s first (and only) gaming monitor to date, but if the company keeps up this level of quality and innovation with future panels, this will be just the first in a series of award wins.

Nvidia GeForce GTX 1660 Super Specs Seemingly Confirmed

A listing on JD.com reveals specifications for the unreleased Nvidia GeForce GTX 1660 Super graphics card, as spotted by longtime hardware leaker @momomo_us.

There have been many leaks about the GTX 1660 Super already, so expected specs have been out for a while. However, this is the first official listing of a custom model, and it helps corroborate early reports.

Let’s start with what hasn’t changed. The GTX 1660 Super is expected to retain the same number of CUDA cores, 1,408, as the original GTX 1660. The clock speeds are reportedly the same too, which means we’re looking at a 1,530 MHz base clock and 1,785 MHz boost clock. Like the GTX 1660, the GTX 1660 Super’s TDP (thermal design power) is rated for 120W, and both use a single 8-pin PCIe power connector, according to the JD listing.

The most obvious difference between the two graphics cards is the memory type. The GTX 1660 debuted with GDDR5 memory clocked at 2,000 MHz (8,000 MHz effective). The Super version will seemingly have speedier GDDR6 memory at 1,750 MHz (14,000 MHz effective). Although both graphics cards are still restricted to a 192-bit memory interface, the memory swap should let the GTX 1660 Super deliver 75% more memory bandwidth than its sibling.
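That 75% figure follows straight from the per-pin data rates: peak memory bandwidth scales with bus width times effective rate, and the bus width is unchanged between the two cards. A quick sanity check (illustrative arithmetic only):

```python
# Peak memory bandwidth = bus width (bits) / 8 x effective data rate (Gbps per pin).
def mem_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Return peak bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

gtx_1660       = mem_bandwidth_gbs(192, 8)   # GDDR5 at 8 Gbps  -> 192 GB/s
gtx_1660_super = mem_bandwidth_gbs(192, 14)  # GDDR6 at 14 Gbps -> 336 GB/s
print(gtx_1660_super / gtx_1660 - 1)         # 0.75, i.e. 75% more bandwidth
```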

JD lists the Maxsun GeForce GTX 1660 Super Terminator for 1,899 yuan (~$268.36). The non-Super Terminator version sells for 1,499 yuan (~$211.83). That’s about a 26.7% premium over the GTX 1660.
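The quoted premium is simple arithmetic on the two listed prices (illustrative only; the dollar conversions depend on the exchange rate at the time):

```python
# Price premium = (new price / old price) - 1, computed on the listed yuan prices.
super_price, non_super_price = 1899, 1499
premium = super_price / non_super_price - 1
print(f"{premium:.1%}")  # ~26.7% premium for the Super model
```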

With October 29 being the rumored launch date for the GTX 1660 Super, hopefully we’ll soon find out if the upgrade to GDDR6 memory is enough to deserve our consideration.

Gigabyte Reveals Threadripper 3 Motherboard With Chipset Fan, Plethora of PCIe and M.2 slots

Gigabyte South Africa has revealed an image of a new Threadripper motherboard, which is likely part of the Threadripper 3 launch in November. Threadripper 3 boards are rumored to use the TRX40 chipset and even a new socket (which is expected to be physically identical to the TR4 socket, but different nonetheless). This board from Gigabyte is almost certainly one of the high end offerings, likely an AORUS or a motherboard more in line with a workstation-focused brand.

From the image, we can see four full-sized PCIe brackets (all of which are reinforced), three M.2 ports all equipped with heat shields, a debug display, robust passive cooling for some of the VRMs, and everyone’s favorite X570 feature, the chipset fan. Just by looking at this board, it’s pretty obvious it’s going to be one of the most expensive ones available.

The chipset fan is interesting because that would imply that TRX40 is using the same chipset as X570, which is also the same silicon that AMD uses for the I/O die inside Ryzen 3000 chips. One of Twitter’s best known leakers, @KOMACHI_ENSAKA, commented on the fan saying, “Well, basically it’s the same “AMD Premium (2019) Chipset” as the X570, so it will need a chipset fan.” According to him, the rumored TRX40 chipset and the X570 are likely one and the same.

While this is hardly confirmation, it would explain the chipset fan. It’s also interesting to note that Threadripper 3 should get the Epyc I/O die (which is over 400mm²); Threadripper 3 and its TRX40 chipset should be an interesting combination if these assumptions are correct. But when it comes to PCIe lanes, the CPU itself will be doing most of the heavy lifting; the chipset will probably only provide more USB connectivity, some SATA ports, and a few more PCIe lanes.

VideoCardz previously reported Threadripper 3 and its motherboards would be revealed on November 5 and would launch on November 19, save for one CPU which the publication believes to be a 64 core processor. That one will apparently launch in January of 2020. AMD says Threadripper 3 is slated for November; let’s hope it hasn’t been delayed.

Intel’s ‘Element’ Comes To Life In Ghost Canyon NUC

A Chinese user from the KoolShare forums has reviewed and dismantled an engineering sample of Intel’s upcoming NUC 9 Extreme, codenamed Ghost Canyon. It appears that the NUC (Next Unit of Computing) actually uses Intel’s ‘The Element’ module that was revealed earlier this month.

The Intel NUC 9 Extreme measures 238 x 216 x 96mm (L x W x H) and is the chipmaker’s largest NUC yet. Intel will reportedly offer three 45W processor options that span from four cores up to eight cores. You can pick between the Core i9-9980HK, Core i7-9750H and Core i5-9300H.

The NUC’s front panel houses the power button, SD card reader, two USB 3.1 Gen 2 Type-A ports, and a single 3.5mm audio jack. The device features honeycomb side panels that allow fresh air to enter from both sides to cool off the internal components. Furthermore, Intel has placed a pair of 80mm cooling fans up top to expel the hot air from the NUC.

Once inside, you’ll notice that the Intel NUC 9 Extreme flaunts a new layout. The 500W power supply from FSP is located at the bottom of the NUC. A small backplane sits on top of the power supply. This backplane has two PCIe 3.0 x16 slots, one PCIe 3.0 x4 slot and an M.2 22110 slot, which has its own heatsink.

The top PCIe 3.0 x16 slot is evidently reserved for The Element module while the second PCIe 3.0 x16 slot is there for you to install a discrete graphics card. In terms of clearance, the Intel NUC 9 Extreme can accommodate dual-slot graphics cards with a maximum length of 20.32cm.

Apparently, Intel has christened The Element module the “NUC Compute Element” board. For the sake of simplicity, we’ll stick to the original name.

As we previously covered, The Element takes the form of a dual-slot PCIe card. It houses the processor, memory and storage devices. Besides the PCIe slot, The Element also draws power from a conventional 8-pin PCIe connector. Intel’s goal with The Element is to offer a swappable and inexpensive solution for NUC owners to upgrade their systems without buying a completely new device.

The processor is cooled with a passive copper heatsink. A small cooling fan, like the kind you would typically find inside your laptop, helps with heat dissipation.

The Element comes with two SO-DIMM DDR4 slots. You can use up to 64GB of DDR4-2400 memory or 32GB of DDR4-2666 memory. It also has one M.2 2280 port and one M.2 22110 port. In total, you can install up to three M.2 PCIe or SATA III SSDs on the Intel NUC 9 Extreme: two inside The Element and one on the backplane. RAID 0 and 1 arrays are supported.

Connectivity on The Element includes four USB 3.1 Gen 2 Type-A ports and two Thunderbolt 3 ports. There’s one HDMI 2.0a port, which is powered by the 9th-generation Core processor’s integrated Intel UHD Graphics 630 solution.

The Element provides you with different options to connect to the Internet. There are two Gigabit Ethernet ports, one based on the i219-LM controller and the other on the i210-AT controller. It also has Wi-Fi 802.11ax and Bluetooth 5 connectivity thanks to the presence of Intel’s Wi-Fi 6 AX200 module.

The Intel NUC 9 Extreme seemingly comes with a limited three-year warranty. There is no word on the official pricing yet. If this leaked NUC roadmap is accurate, the NUC 9 Extreme should land next year.

Android 11: what we want to see

The 2020 Android update is set to be Android 11 – that’s not speculation, but fact, since Google has confirmed the name itself. That follows naturally, since with Android 10 Google pledged to switch to a numerical naming system for its operating systems.

That means while Android 9 was Android Pie, Android 10 wasn’t Android Quiche, and Android 11 won’t be Android Rhubarb and Custard (sadly) – but it will be packed with updates and, likely, a few long-awaited features.

Android 11, Google’s next operating system, is set to be revealed in mid-2020, before being released for Pixel devices around September, and then rolled out for other Android devices from late 2020 through well into 2021.

Mid-2020 is a way off, but we’ve already started hearing rumors and news surrounding the upcoming Android operating system. Here we’ll collect everything we know about Android 11.

Cut to the chase

What is it? The newest Android operating system

When is it out? Likely September 2020, at least on Pixel phones

How much will it cost? Android 11 will be available for free

Android 11 release date

Google tends to unveil some of the features of its newest operating system at Google IO in May, before releasing the OS for Pixel phones in September – this year, for example, it debuted on the Pixel 4.

After that, other smartphone brands will make their devices eligible for the update gradually over time. The OnePlus 7T and 7T Pro were the first non-Google smartphones that came with Android 10 in the box, but it could be a different company for Android 11.

Not all smartphone manufacturers upgrade immediately, however, and it’ll likely be some time in 2021 before the update is available on every smartphone that can receive it.

Android 11 features

There’s only one Android 11 feature we know of so far – it’s confirmed for the operating system though, so it’s not just speculation.

This feature is ‘Scoped Storage’, which was initially meant to be available in Android 10, but Google pushed it back due to complaints from Android developers.

Scoped Storage is a new way of controlling which information and files your apps can read – in short, it speeds up storage access, improves your security, and stops you from having to grant every new app blanket storage permissions.

That’s the only Android 11 feature we know of, but since Android 10 isn’t available on most devices yet, it’ll be a while before we get to hear about more.

What we want to see in Android 11

These are the improvements we’d like to see in Android 11, that we still don’t have in the latest build of Google’s operating system.

1. A new NFC file-sharing alternative

With Android 10 came the removal of Android Beam, which let phones share files over NFC. All you had to do was tap phones together to share files, but it’s gone now.

It’d be useful to have some easy-to-use alternative that lets you share photos, videos, music, or other files just by pressing the phones against each other – at the moment it can be quite a hassle to use a specific app to do so, and Apple’s AirDrop is a useful feature that has no equivalent in Android.

2. Improved dark mode

Dark mode isn’t for everyone, but those who use it love it – or they want to, at least, but in Android 10 it has a lot of problems.

Firstly, not that many apps have their own dark modes, not even all Google apps, so using your smartphone with dark mode on can often see you jumping straight from dark apps to light ones, and back again.

Secondly, on apps that do have dark mode, some text isn’t color-swapped and can remain invisible – this actually happens on Google search, so you can’t see suggested results.

For dark mode to be useful on Android smartphones, it needs to actually be usable for most apps, instead of just an optional toggle here and there.

3. Extended chat ‘bubbles’

One of the new features of Android 10 was the introduction of chat ‘bubbles’, similar to Facebook Messenger, laid over the top of other apps, that let you see all your conversations over several apps all in one easy-to-access format.

Not all apps are compatible though; of the apps you use to message people, many aren’t, with only the main text app, Messages, and Google Hangouts compatible. That means if you want to have conversations over various apps, for the most part you’re still going to be using those apps.

It’d be really useful if more apps became compatible – WhatsApp, Facebook Messenger, Instagram, Twitter, and perhaps even dating apps! Then, it’d be incredibly easy to keep on top of your various conversations across all the different apps that fill up your phone. 

PIXELBOOK GO LISTING IN THE GOOGLE STORE HINTS AT ADOBE PREMIERE RUSH RELEASE

Adobe Premiere Rush is the dangling carrot that has been just out of reach for Chromebook users everywhere for well over a year at this point. To be fair, Adobe only previewed the cross-platform application when it debuted in June of 2018, and it took nearly a year from that debut for Premiere Rush to actually land on devices. When it did arrive in May of 2019, Chromebook users were again told it was “coming soon,” only to sit around and wait for months on end for Adobe to actually deliver on its promises.

Nearly six months since that actual product debut, we’ve heard next to nothing from Adobe on the Chromebook front, but if a thumbnail image in the Google Store is any indicator, we could be seeing Premiere Rush on Chromebooks in the very near future. Take a look at the listing for the Core i7 variant of the new Pixelbook Go below:

There is no doubt that the app being displayed on the Pixelbook Go on the right is Adobe Premiere Rush. Clearly, Google decided what image they wanted to accompany each different model in the Pixelbook Go lineup. All the other versions have screens displaying apps that already exist on Chromebooks, and each one increases in complexity as the models go up in price. The entry-level Core m3 model displays a generic website, the upgraded Core i5 model has YouTube TV, the Core i5 with 16GB of RAM displays G Suite apps, and the final i7 model is shown with Adobe Premiere Rush.

This choice in screens would make sense, as Google is likely trying to convey that the higher-end models have more processing power for more intense tasks. The question is, why would they use Premiere Rush if they didn’t think this app was going to be a part of the Chromebook experience very soon?

In short, I don’t think they would. There are literally millions of apps and billions of websites they could have chosen for this photo, yet they chose to go with Adobe Premiere Rush. If I were betting on it, I’d say we’ll hear something about Rush landing on Chrome OS around October 28th when the Pixelbook Go is expected to start shipping out. I could see this being a Pixelbook Go exclusive for a month or so, but I hope that isn’t the case. There are countless Chromebooks up to the task of running this app, and I really think it has the ability to fundamentally change the audience Chromebooks can be geared towards.

Will Premiere Rush make Chromebooks the ultimate video editing machines? No way. But the addition of this capability on the type of scale promised by Adobe Premiere Rush is basically unheard of right now on Chromebooks. With a video editor that already has a massive user base and desktop versions in play, legit video editing on a Chromebook never looked so close.

AMD Ryzen Threadripper 3960X, 3970X, and 3990X expected to be announced on November 5, require new TRX40 chipsets and new sockets

AMD has three new Threadripper processors lined up for a November launch, according to information obtained by Videocardz. These include the Threadripper 3960X, 3970X, and the 3990X. The Threadripper 3960X and 3970X will be unveiled on November 5, with the review embargo lifting on November 19. The Threadripper 3990X will be officially launched during CES 2020, although it will still be teased alongside the other two Threadrippers. The Threadripper 3980X can also be expected to be unveiled during CES 2020.

The specifications of the 3rd generation Threadrippers are all speculation as of now, but there is some credible information doing the rounds online. This generation will also see the introduction of 280W TDP SKUs, up from the 250W ones seen in Threadripper 2. The base SKU will be the Threadripper 3960X with 24 cores, 48 threads, and a 250W TDP. Next comes the Threadripper 3970X with 32 cores and 64 threads (250W), followed by the Threadripper 3980X with 48 cores and 96 threads (280W). Finally, the flagship SKU will be the Threadripper 3990X with 64 cores and 128 threads and a 280W TDP. Base and boost clocks are still not known at this point.

The Threadripper 3rd generation will also see a new socket and possibly separate platforms for ‘Enthusiast’ and ‘Workstation’ use cases. Reports seem to indicate that Threadripper 3000 will not be backward compatible with the current X399 chipset and instead will require a new TRX40, TRX80, or WRX80 chipset. The new chipsets will also feature a new socket with additional power pins required for PCIe Gen4. Check out the images below for specifications and thermal requirements of the chipsets as revealed by GamersNexus last month.

With such massive core counts, it will be interesting to see the kind of multi-core benchmark records the Threadripper 3000 series will be setting in the coming weeks. Although we’ve seen the 18-core Intel Cascade Lake-X Core i9-10980XE surpassing the Threadripper 2950X in Geekbench, we have also seen it fall significantly behind an alleged 32-core Threadripper 3000 processor in the same benchmark. Intel currently has no HEDT offerings to counter the extreme core counts expected in upcoming Threadripper processors, and if priced right, AMD could have a winner once again.