Arctic’s massive Liquid Freezer II 420 cooler is built for CPUs that like to blaze

For the most part, all-in-one liquid coolers come in three general sizes: 120mm (single fan), 240mm (two fans), and 360mm (three fans). That includes Arctic’s Liquid Freezer II line of AIO coolers, but apparently 360mm was not big enough for some customers, who have been urging the company to go even bigger. And it did, with its new Liquid Freezer II 420.

This is still a three-fan cooling unit, but with larger 140mm fans attached to the radiator instead of the 120mm fans found on the Liquid Freezer II 360 model. It exists because, according to Arctic, fans kept asking, “Will Arctic ever release the Liquid Freezer II in bigger sizes?” Why yes, yes it will.

From end to end, the radiator is actually 458mm long, but 420 sounds cooler (and matches the combined size of the three 140mm Arctic P fans). Arctic says it is the company’s largest and most powerful cooler to date, but that’s selling it short. It’s also one of the biggest AIO coolers ever made. The only other 420mm-class AIO I can think of is Alphacool’s massive and refillable Eisbaer unit. I think a cooling shootout is in order.

Arctic introduced its Liquid Freezer II line last year. The Liquid Freezer II 280 earned an Editor’s Choice award from our friends at Tom’s Hardware, and I can only imagine how much better the new 420 model performs, with a bigger radiator and another fan attached. This is especially enticing as Intel plays fast and loose with power draws in pursuit of higher clocks on its aging 14nm node.

The Liquid Freezer II 420 is compatible with all the modern sockets, among them AM4 for AMD’s Ryzen CPUs and LGA 1200 for Intel’s Comet Lake processors (plus others like LGA 115x). It also has sleeved cables, integrated cable management, and a PWM-controlled pump that Arctic says it developed in-house.

While big in size, the Freezer II 420 is only moderate in price—it costs $140 (€120), which undercuts pricing on some of the best CPU coolers. For example, Corsair’s H115i RGB Platinum, our top pick at the moment, is smaller and sells for around $160.

Apple is almost definitely launching its first non-Intel Mac on November 10

Mac systems built around Apple’s own custom silicon based on Arm are coming, as the company embarks on a two-year plan to phase Intel chips from its lineup. But when will it happen in earnest? One week from today.

I can’t say that with 100 percent certainty, but I’m 99.99 percent convinced. That’s because Apple has announced a third fall event (via MacRumors) for next Tuesday, November 10, to unveil “one more thing.”

The previous two events, in September and October, saw the launch of the iPhone 12, a new iPad Air, the HomePod mini speaker, the Apple Watch Series 6 and Apple Watch SE, and some software announcements.

Apple has already stated it intends to “ship the first Mac with Apple silicon by the end of the year,” so we can go ahead and disregard any notion that next week’s event is dedicated to a refreshed Apple TV set-top box. It’s going to be focused on new Macs, and it is a big deal for a couple of reasons.

For one, the transition away from Intel to custom Arm-based processors is a bold gamble. Apple made the switch from IBM’s PowerPC hardware to Intel’s x86 processors around 15 years ago, and a shift to Arm raises all kinds of questions, mostly about performance. To that end, Apple CEO Tim Cook is exuding supreme confidence (when does he not, though?).

“With its powerful features and industry-leading performance, Apple silicon will make the Mac stronger and more capable than ever. I’ve never been more excited about the future of the Mac,” Cook said in June.

Moving to Arm will give Apple a common architecture across its many product lines, and there is some merit in going that route. But as much as Apple’s custom hardware designs have benefited the iPhone and iPad, will the approach work out as well for its MacBooks and other Mac systems? We’re about to find out.

The other intriguing element is a more recent development—Nvidia’s $40 billion acquisition of Arm (provided it clears regulatory hurdles). Apple made the decision to abandon x86 hardware and move to Arm-based designs before Nvidia announced it had hammered out a deal to buy Arm. Apple’s relationship with Nvidia is somewhat strained, and has been for a long time. And now Apple is going all-in with IP (intellectual property) that could very well soon belong to Nvidia.

Granted, none of this directly affects PC gaming. But it is interesting nonetheless. And depending on how things work out, it is conceivable that there could be a bigger push towards Arm designs on Windows, as has already happened to some extent.

How to get productive with Full HD extended displays on the Lenovo Chromebook Duet

Since the Lenovo Chromebook Duet arrived on the scene at CES back in January 2020, most users have been a bit enamored with the diminutive tablet and its bright, beautiful display and very low starting price. After all, getting a great tablet, kickstand cover, and keyboard/trackpad combo in the box for $299 is pretty amazing and makes for a great overall Chromebook experience. Add to that mix the fact that we’re starting to see pretty deep discounts on this package for the shopping season, and picking up a Duet has never been more attractive.

There’s one issue that’s been a bit of a thorn in many sides, however: this Chromebook doesn’t behave like others when hooked into a USB-C docking station. In fact, it behaves unlike just about any other device we’ve ever seen with a USB-C port on board. I won’t go any further into a rant here, as I’ve done that quite a bit already, but I’m still a bit baffled as to why this little Chrome OS tablet didn’t get the simple extended display treatment that far less prominent Chromebooks have had for years. It was an odd omission for sure.

Verified: how to get a proper extended display with the Duet

Not too long ago we began hearing about users having some luck with different display adapters, extending the Lenovo Duet’s display out to multi-monitor setups of up to two additional screens. Despite hearing these anecdotes, try as I might, I could not get a workable resolution output to any extended display with the handful of USB-C adapters around our office. To be honest, I’d all but given up on the idea of extending the Duet’s display in any workable fashion.

However, a comment in the bug thread for this issue prompted me to rethink my methods and dig up an older display output tech that I’d not considered in years: DisplayLink. Chromebooks began supporting DisplayLink back in 2016, so this isn’t anything new. However, with the arrival of USB Type C on most Chromebooks in the past few years, the need for DisplayLink has nearly vanished. For those unfamiliar, DisplayLink can create display outputs over standard USB 3.0 ports and requires special drivers to do so on Windows and macOS systems. Chrome OS has these drivers built in, so DisplayLink works just fine on any Chromebook you have lying around at this point.

After seeing the comment in the bug report about a user having success with a DisplayLink adapter, I realized we have a couple of those lying around the office, and I set out to give it a go. You know what? I’m typing this article on the Lenovo Chromebook Duet extended to a portable 1080p monitor at full resolution and a full 60Hz refresh rate. The setup isn’t pretty, requiring a USB A to C adapter (DisplayLink adapters are USB Type A) to plug into the Duet and a DVI to HDMI adapter to get to the right output for most monitors. But it is working.

With a better adapter (like the ones we’ll link below), you could clean this up quite a bit. You’ll still need the USB A to C adapter to get the right output from the Duet, but after that point you’ll have a clean HDMI port to hook into. For desktop monitors, this works OK, but you still need more adapters to charge your Chromebook at the same time since the Duet only has one USB Type C port. For a fully-mobile setup, you’ll also need external power for the monitor since DisplayLink doesn’t carry power along the connection like we see with the single cable USB Type C setup.

I did test my simple USB Type C Samsung dock and verified that I could run the DisplayLink connection from the USB Type A port there while also having a spot for power to pass through to my tablet, but that doesn’t really fix the monitor power issue. If this is a setup you’d like to use on the go, you are 100% going to need a battery pack in your bag at all times to get some juice to the display. If you are planning on using this at your desk most times, no worries there as your monitor obviously stays plugged in.

One other consideration before you go to assemble your workstation: I’d highly recommend a Bluetooth mouse and keyboard. Again, with the Duet’s limited port selection, hooking up an external keyboard and mouse is far simpler with Bluetooth, and our favorite pair is still the Logitech K580 and M355 as they are both designed to work perfectly with Chrome OS and still get the job done every time they are needed.

Is this the most perfect desktop setup? Not at all. In fact, I’d only go this route if forced to. With DisplayLink being a technology that taxes the processor a good bit, performance isn’t great here. The Duet is already a slow Chromebook, so this setup just makes it feel a bit more sluggish. However, compared with completing this post on the Duet’s included keyboard/trackpad and 10.1-inch screen, I can definitely say that having a larger extended display made my task far simpler to accomplish.

I’m still hoping that a Snapdragon 7c Chrome OS tablet, or something with MediaTek’s upcoming 6nm MT8195, may be that tiny, light Chromebook detachable I’ve been after for both desktop work and leisurely browsing, but those devices are still a ways off. While I wouldn’t want to bind myself to the Duet on a daily basis, it gives me all sorts of hope for what this setup could be with a Chromebook that simply has proper display output via USB Type C and a bit more oomph under the hood. Lenovo didn’t see fit to make that happen with the Duet, but that doesn’t mean you can’t work around it if you are bound and determined to make this tablet a productivity machine, too. Hope this helps make that a reality for some of you looking to do the same.

This Windows 10/macOS combination is the perfect desktop operating system

We know that Microsoft is planning to give Windows 10 a makeover in the future, but it’s unlikely that the software giant will really go to town on the design, with the focus of its UI revamp — codenamed Sun Valley — being more on tweaking things and expanding on the Fluent Design language currently in use.

While some people are more than happy with how Windows 10 looks, it’s pretty bland when compared to macOS. But what if Apple and Microsoft joined forces to produce a Windows/macOS hybrid?

Over on Reddit, Kaiwave33x has put his design skills to good use to add macOS elements to his copy of Windows 10.

The customized OS, which he’s showcased in Light and Dark Mode, has a dock at the top, rather than the traditional Windows taskbar at the bottom, with tiles for all of the most-accessed apps and widgets sitting on the right.

He has put together a step-by-step guide for anyone interested in replicating it, but this has currently been flagged as spam on Reddit and removed. He is, however, planning to create an install script in the future.

In the meantime, you can grab the wallpapers he used for Dark Mode and Light Mode.

Kamer Kaan Avdan, who has previously released videos for alternative versions of Windows 95, Windows XP, Windows 7, Windows 11 and Windows 20, came up with his own vision of Apple’s Windows 10 late last year, although sadly that was just a concept, and not something that could actually be installed, unlike this.

What do you think of Kaiwave33x’s hybrid? Share your comments below.

This water-cooled RTX 3080 is the most cyberpunk design Gigabyte’s made yet

Gigabyte is making a splash with a new water-cooled GeForce RTX 3080 design, now entering the maelstrom of custom Ampere graphics cards being swiftly swept away by the tide. The new Aorus Xtreme Waterforce WB (noted by Computerbase) is overclocked by 8 percent, meaning it’ll be throwing up boost clocks of 1,845MHz, and it’ll do so in a full-cover, water-cooled wet suit with impeccable cyberpunk style.

Gigabyte also claims more efficient heat dissipation thanks to the “shortened distance between the water channel and the GPU”, and the broad copper base plate covers the whole surface of the water channels. So it should (in theory) be pretty darn effective. As for connectivity, we’re looking at three DisplayPort 1.4a outputs, two HDMI 2.1, and a single HDMI 2.0, so you can run your 4K displays at 120Hz or 8K at 60Hz.
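For a rough sense of what those modes actually demand, here’s a quick back-of-the-envelope calculation of our own (not from Gigabyte’s spec sheet). It assumes 8-bit, 4:4:4 color and ignores blanking overhead, so real signals need a little more; it also hints at why 8K60 typically leans on DSC compression or chroma subsampling even over HDMI 2.1’s roughly 42Gbps of effective bandwidth.

```python
# Uncompressed video bandwidth for the display modes the card's outputs target.
# Assumes 8 bits per color channel, 4:4:4 chroma, and ignores blanking intervals.
def raw_gbps(width, height, refresh_hz, bits_per_channel=8, channels=3):
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

print(f"4K @ 120Hz: {raw_gbps(3840, 2160, 120):.1f} Gbps")  # ~23.9 Gbps
print(f"8K @  60Hz: {raw_gbps(7680, 4320, 60):.1f} Gbps")   # ~47.8 Gbps
```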

There are a host of alternative water-cooling solutions available as separate units for the RTX 3080 and 3090 already, such as the EK-Quantum Vector, but there are some real benefits of investing in a GPU with a pre-installed water block. You could save a bunch of time and effort, and pre-installed water cooling means relief from the stress of self-assembly and the possibility of voiding your GPU warranty by jamming a water block into it. Just ask our Jacob who bled for his custom water-cooled PC build.

Really, though, from a value-for-money perspective, pre-installed water blocks may not be the most cost-effective solution available.

That said, the Aorus Xtreme Waterforce WB 10G looks the absolute bomb with its subtle RGB LEDs and cyberpunk look, and it should integrate easily into existing water cooling systems by way of two G1/4 threaded connections. It also features all-important leak detection, complete with flashing warning lights, in case you somehow manage to get the install wrong.

If you’re worried about such cockups, the GPU comes with a four-year warranty according to the product page, but there’s not a wisp of info about pricing. With the Aorus Waterforce-cooled versions of the previous-gen RTX 2080 Ti coming in at $150 above MSRP, we’re expecting the Aorus Xtreme Waterforce RTX 3080 WB cards to be something in the region of $850.

But that remains to be seen, and good luck getting your hands on one!

G.Skill launches extreme low-latency memory that’s ideal for AMD Ryzen gaming PCs

G.Skill has announced two brand new kits of extreme low-latency DDR4-3600 memory—a 32GB kit smack bang in the sweet spot for gaming and a 64GB kit for memory-heavy applications, such as content creation.

The new low-latency memory kits will be rated at CL14-15-15-35, which matches the Team Xtreem ARGB C14 kits currently sitting atop our best RAM for gaming guide. G.Skill will be offering its extreme low-latency kits under its Trident Z Neo, Trident Z Royal, Trident Z RGB, and Ripjaws V series—plenty of options from bling to bare.
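For anyone wondering what a CL14 rating actually buys you, here’s a rough, illustrative conversion from CAS latency (in clock cycles) to first-word latency (in nanoseconds). DDR4 transfers data twice per clock, so the real clock is half the effective data rate; this is our own sketch, not anything from G.Skill’s materials.

```python
# Convert a CAS latency rating (clock cycles) into first-word latency (ns).
# DDR4 is double data rate, so the actual clock is data_rate / 2.
def first_word_latency_ns(cas_cycles, data_rate_mts):
    return cas_cycles * 2000 / data_rate_mts

for cl, rate in [(14, 3600), (16, 3600), (16, 3200)]:
    print(f"DDR4-{rate} CL{cl}: {first_word_latency_ns(cl, rate):.2f} ns")
# DDR4-3600 CL14: 7.78 ns, DDR4-3600 CL16: 8.89 ns, DDR4-3200 CL16: 10.00 ns
```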

All kits are built using Samsung B-Die chips, a fan favourite for overclocking.

Fitting the criteria for AMD’s Zen 2 chips perfectly, the new kits from G.Skill should be a great match for Ryzen builds. As you can see in Chris Szewczyk’s performance testing across multiple RAM speeds, a heady mix of high frequency and low latency offers the most consistently high performance in a range of games.

The new AMD Ryzen 5000 processors are reportedly going to run at their best with 4,000MHz (effective) memory, but there’s unlikely to be much in it between that and a low-latency 3,600MHz kit. Similarly, Intel systems don’t care quite as much about memory speed and latency, but even so, low-latency kits often outperform those rated 1,000MHz (effective) higher.
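To illustrate why there’s unlikely to be much in it, here’s a hedged comparison. The DDR4-4000 CL18 figure below is an assumption for a typical kit at that speed, not any specific product’s spec: the faster kit gains raw bandwidth, but the low-latency DDR4-3600 kit still gets data moving sooner.

```python
# Latency vs bandwidth sketch. The CL18 rating for DDR4-4000 is an
# illustrative assumption, not a quoted spec.
def latency_ns(cl, rate):
    return cl * 2000 / rate        # first-word latency in nanoseconds

def bandwidth_gbs(rate):
    return rate * 8 * 2 / 1000     # dual-channel, 64-bit channels, GB/s

for cl, rate in [(14, 3600), (18, 4000)]:
    print(f"DDR4-{rate} CL{cl}: {latency_ns(cl, rate):.2f} ns, "
          f"{bandwidth_gbs(rate):.1f} GB/s dual-channel")
# DDR4-3600 CL14: 7.78 ns, 57.6 GB/s    DDR4-4000 CL18: 9.00 ns, 64.0 GB/s
```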

A 16GB kit is currently the go-to choice for gaming PCs, but if you want to keep your PC a little ahead of the game, it’s a good time to pick up a 32GB kit as they’re often going for as little as 16GB kits were just a year back. The higher capacity will ensure you’re more than equipped for future games and memory hogs, and it’s really satisfying to fill all four motherboard memory slots.

Apple to Launch MacBooks With Own Chips Next Week

Apple Inc.’s 15-year relationship with Intel Corp. will officially begin to unwind next week when new Mac computers are revealed.

The Cupertino, California-based technology giant said on Monday that it will hold an online event dubbed “One more thing” on Nov. 10. That “thing” will be Macs with main processors designed by Apple for the first time, replacing Intel chips that have been a mainstay since 2006. An Apple spokesman declined to comment.

Apple and overseas suppliers are ramping up production of three Mac laptops with Apple processors: new 13-inch and 16-inch MacBook Pros and a new 13-inch MacBook Air, according to people familiar with the matter. Foxconn, known also as Hon Hai Precision Industry Co., is assembling the two smaller laptops, while Quanta Computer Inc. is building the larger MacBook Pro. The smaller models are further ahead in production and at least those two laptops will be shown at next week’s event. Beyond the processor switch, the devices won’t have significant design changes.

Apple has less than 10% of the market for personal computers, so the direct impact on Intel sales may be limited. However, the change highlights a crisis engulfing the world’s largest chipmaker. It has delayed a new manufacturing process, giving rivals a chance to catch up. These problems are at least partly behind Apple’s decision to move to in-house chips, although the company has been steadily shifting to this approach for years.

The partnership between Apple and Intel started in 2005, when Steve Jobs outlined a move away from PowerPC processors. Intel helped Apple catch up to Windows computers, some of which were more powerful at the time. In tandem, though, Apple was working on more energy-efficient chips for mobile devices based on Arm Ltd. designs and continues to use those to power the iPhone and iPad.

On Apple’s recent earnings call, Chief Executive Officer Tim Cook hinted at the Mac launch by saying, “without giving away too much, I can tell you that this year has a few more exciting things in store.” The company generated a record $9 billion in revenue from the Mac in its fiscal fourth quarter.

The first Mac processors from Apple will be based on the A14 chip found in the latest iPhones and iPad Air, and tests inside Apple indicate improved power efficiency over the Intel parts they are replacing. The new machines will also have Apple-designed graphics and machine-learning processors.

Apple said in June that the transition away from Intel chips will take two years. After updating its laptop line, Apple will still have until 2022 to update desktop computers — the iMac, iMac Pro, Mac Pro and Mac mini — with its own processors.

The company is already at work on a redesigned iMac, its all-in-one desktop, and a new Mac Pro model, Apple’s highest-end desktop, other people familiar with the company’s plans said.

Apple engineers are currently developing a new Mac Pro that looks like the current design at about half the size. It’s unclear if that Mac will replace the current Mac Pro or if it’s an additional model. Apple’s chip designs could help the company reduce the size of its computers due to increased power efficiency, but the current Mac Pro is large, in part, to fit components like additional storage drives and graphics chips.

Apple’s test Mac for developers to write apps for the new processors is a Mac mini with an iPad Pro processor, but the company will still need to roll out a proper update to that model with a Mac-specific chip. The new Macs require macOS Big Sur, a redesigned Mac operating system that makes the software more like iPhones and iPads.

When announcing the transition in June, Apple said the move would provide a common architecture across all of its devices. That means future iPhones, Macs, iPads and Apple Watches will run a variation of the same chip. That will allow devices to work together better and let iPhone apps run natively on Macs for the first time.

Bloomberg first reported on the transition away from Intel in 2018 and again in April. The new Macs will mark Apple’s third major product release this fall. In September, the company announced new Apple Watches and iPads. The iPhone 12 and HomePod mini lines debuted in October.

Samsung Galaxy Z Fold 2 folding phone will have an LG rival that looks LIGHT YEARS ahead

The world of smartphone design is starting to get pretty wild, with manufacturers looking to deviate from the candy bar form factor thanks to new tech like flexible panels and under-screen cameras.

LG, however, is setting itself apart from the foldable devices that are becoming the new standard with its Project B handset, set to sport a rollable display and following in the footsteps of its uniquely designed swivel smartphone, the LG Wing. 

LG has been working on a rollable display for a while, and we initially heard about its Project B smartphone last November, with its extendable screen, as opposed to the flexible, foldable displays of the Galaxy Z Fold 2 or Galaxy Z Flip.

Now, LetsGoDigital reports, the company has been awarded a patent for a device with a retractable display.

At first glance, the handset looks like a standard smartphone, but the frame can be pulled out to both the left and the right to double the screen real estate. This is achieved thanks to an inner, flexible frame that slides out of the device’s metal housing. 

The design has some advantages over foldable phones on the market now, the biggest being the absence of a hinge or crease; a panel with a fold in it will see the crease put under strain, potentially affecting the structural integrity of the display.

What’s more, the rollable can adapt to how the smartphone is being used – in much the same way as the LG Wing. In landscape mode, for example, expanding the screen while playing a video or streaming music could reveal the controls so that your viewing remains uninterrupted. There’s also the option of multitasking with a split-screen mode.

LG is reportedly skipping straight over the foldable trend, and is looking to launch its rollable smartphone in March 2021, according to Neowin. It’s possible that the handset will only be extendable on one side, as opposed to both as depicted in the patent, but it’s a radical concept and we’re intrigued to see a company bucking the foldable trend to try something new.

Nvidia RTX 3060 Ti GPU could be unleashed on November 17

Nvidia’s GeForce RTX 3060 Ti, which is supposedly the next GPU to arrive in the Ampere range, could be launched in less than three weeks’ time on November 17, going by the latest speculation.

This comes from MyDrivers (via Videocardz), along with the suggestion that, according to third-party graphics card manufacturers, stock levels of the 3060 Ti will be similar to those witnessed with the RTX 3070 that just hit the shelves – in other words, it’ll be very thin on the ground.

But that wouldn’t really be any surprise given that all of Nvidia’s RTX 3000 launches have failed to meet demand by some margin. That said, treat this speculation, and indeed this purported launch date, with a great deal of caution.

It is, however, true that the November 17 date more or less falls in line with what we’ve heard from the grapevine previously, after Chinese retailers put the RTX 3060 Ti up for pre-order, and a mid-to-late November timeframe was suggested for arrival.

Rumored spec and cost

The rumor mill reckons that the 3060 Ti graphics card will weigh in at under $400 when it debuts, and it’ll boast 4,864 CUDA cores with 8GB of GDDR6 VRAM. It’ll use the same GPU as the RTX 3070, in fact, just a cut-down version (naturally).
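As a quick back-of-the-envelope illustration (the RTX 3070’s 5,888 CUDA cores are the known reference point; the 3060 Ti figure remains a rumor), that core count would leave the 3060 Ti with roughly 83 percent of the 3070’s shader hardware:

```python
# Rough comparison of the rumored RTX 3060 Ti core count against the RTX 3070,
# which the rumor says shares the same underlying GPU.
rtx_3070_cores = 5888      # shipping RTX 3070 spec
rtx_3060_ti_cores = 4864   # rumored, not confirmed by Nvidia
print(f"{rtx_3060_ti_cores / rtx_3070_cores:.1%} of the RTX 3070's CUDA cores")
# -> 82.6%
```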

As mentioned, the recently launched GeForce RTX 3070 GPU failed to provide sufficient inventory to retailers and sold out very swiftly, just like the RTX 3080 and 3090 before it. That wasn’t entirely unexpected, but hopes were up that the 3070 would fare better given that Nvidia delayed release by a fortnight in order to help resolve stock issues.

If you were wondering where the vanilla RTX 3060 is, this current rumor notes that there is no clear release timeframe yet, but that it will arrive at a later date, which is pretty much what we’ve heard previously (it’ll be based on a different GPU to the 3060 Ti).

Moto G 5G leaks suggest another budget 5G Motorola handset is on the way

We’ve seen the Moto G 5G Plus already this year (called the Motorola One 5G in the US), and if the rumors are correct then the standard Moto G 5G is going to be the next affordable 5G phone to arrive from Motorola.

As per information obtained by XDA Developers and an image posted to Slashleaks, the handset is going to come running the brand new mid-range Snapdragon 750G chipset, with 6GB of RAM and 128GB of storage on board.

Further information revealed on Twitter points to a 6.66-inch display running at a resolution of 2400 x 1080 pixels, plus a 5,000 mAh battery. As the name suggests, this looks like being a slightly less powerful version of the Moto G 5G Plus.

The display is said to run at the standard 60Hz refresh rate, though there’s no word yet on whether it uses LCD or OLED technology. The Moto G 5G Plus has a 6.7-inch, 2520 x 1080 pixel LCD display with a faster 90Hz refresh rate.

The 5G future

Around the back there are reportedly three cameras: a 48MP primary sensor, an 8MP telephoto lens, and a 2MP macro lens. As for the front-facing camera, that’s apparently going to be a single 16MP selfie camera.

Also included in the leaked specs is a mention of a dedicated Google Assistant button, which is something we saw on the Moto G 5G Plus smartphone – a button that we found was a bit too easy to accidentally push.

There’s no word yet on when we might see the Moto G 5G appear, though a launch doesn’t appear to be too far off. Motorola has previously been happy to push out multiple handsets in a short space of time, so it might show up before the end of the year.

Another budget phone with 5G on board would certainly be welcome: at the moment, phones with the next-gen connectivity on board tend to cost significantly more, but that will change over time as the technology becomes adopted more widely.