Cheap Elo ELO-IN250 Li-ion Laptop battery, Brand New ELO-IN250 replacement battery for ELO ELO-IN250 7167103 2I(CM)P8/67/103

5650mAh/41.8Wh 7.4V Elo ELO-IN250 battery for ELO ELO-IN250 7167103 2I(CM)P8/67/103. This Elo ELO-IN250 laptop battery is brand new and a 100% compatible replacement for the original. Purchase the ELO-IN250 wholesale or retail, with high quality at a low price!
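As a quick sanity check, the listed ratings are internally consistent: watt-hours equal amp-hours multiplied by the nominal voltage. A minimal sketch of that arithmetic, using only the figures from this listing:

# Cross-check the listed capacity: Wh = (mAh / 1000) * V
capacity_mah = 5650           # rated capacity from the listing
voltage_v = 7.4               # nominal pack voltage from the listing
energy_wh = capacity_mah / 1000 * voltage_v
print(f"{energy_wh:.1f} Wh")  # 41.8 Wh, matching the 41.8Wh rating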

ELO-IN250 Battery elo Li-ion 7.4V 5650mAh/41.8Wh

ELO-IN250

Specifications

  • Brand: ELO
  • Capacity: 5650mAh/41.8Wh
  • Voltage: 7.4V
  • Type: Li-ion
  • Battery Cell Quality: Grade A
  • Description: Brand-new replacement battery – 1 Year Warranty! 30-Day Money Back! Fast Shipping!

How we test this Elo ELO-IN250 Battery Li-ion 7.4V 5650mAh/41.8Wh

Step 1: Confirm the customer ordered the correct battery.
Step 2: Check the battery's appearance and connector.
Step 3: Test the battery's charge and discharge functions.
Step 4: Charge the battery to 100% and discharge it to 0% to measure its real capacity.
Step 5: Use an EV2300 to check the voltage difference between each group of cells.
Step 6: Charge the battery to more than 30% before shipping.
Step 7: Package the battery carefully and send it out.

Compatible Part Numbers:

ELO-IN250 7167103 2I(CM)P8/67/103

Compatible Model Numbers:

ELO ELO-IN250 7167103 2I(CM)P8/67/103

How much do you know about keeping your laptop running well wherever you are? The following tips will help protect your battery and extend its life.


1). Recharge or replace your laptop battery when its power runs low.
2). A Li-ion replacement Elo ELO-IN250 laptop battery will run your notebook longer than a non-Li-ion one.
3). Defragment your hard drive regularly; less drive activity helps battery life.
4). To reduce power consumption, let the optical drive and hard drive spin down when idle.
5). Put your laptop in sleep or standby mode when you are not using it for a while; this both saves the replacement Elo ELO-IN250 laptop battery's power and extends its service life.
6). Store the battery in a cool, dry place when it is not in use.
7). If you usually run the laptop on fixed mains power, remove the battery to avoid shortening its life.

Hot Products

  • 11.1V/12.4V 2300mAh BOSE 404600 for Bose SOUNDLINK I II III
  • 16.8V/20V 400mA/2200mAh/32Wh BOSE 300769-003 for Bose Sounddock Portable Digita
  • 3.8V 1300mAh AMAZON MC-305070 for AMAZON Kindle Voyage
  • 3.85V/4.4V 3080mAh/11.86Wh ALCATEL TLp030JC for Alcatel A3 XL 9008j
  • 3.8V 2930mAh/11.1Wh NETGEAR W-7 for Netgear AirCard 790S 790SP 810
  • 3.8V/4.35V 2000mAh/7.6Wh ALCATEL TLI020F7 for Alcatel Onetouch Pixi 4 (5) 50
  • 7.7V 37Wh/4810mAh ACER AP16M5J for ACER A315-51-51SL N17Q1 SERIES
  • 3.8V 4000mAh/15.2Wh LENOVO L15D1P31 for Lenovo Yoga Tab3 Pro YT3-X90L
  • 3.7V/4.2V 1750mAh/6.5Wh AMAZON GP-S10-346392-0100 for AMAZON KINDLE 3 3G WIFI Kindle
  • 11.4V 84Wh DELL 4GVGH for DELL XPS 15 9550 4GVGH 1P6KD

Cheap T-gee BS101 Li-ion tablet battery, Brand New BS101 replacement battery for T-GEE WINMATE M101B M101H M101M8 M101BT

38Wh/5140mAh 7.4V T-gee BS101 batteries for T-GEE WINMATE M101B M101H M101M8 M101BT. This T-gee BS101 tablet battery is brand new and a 100% compatible replacement for the original. Purchase the BS101 wholesale or retail, with high quality at a low price!

BS101 Battery t-gee Li-ion 7.4V 38Wh/5140mah

BS101

Specifications

  • Brand: T-GEE
  • Capacity: 38Wh/5140mAh
  • Voltage: 7.4V
  • Type: Li-ion
  • Battery Cell Quality: Grade A
  • Description: Brand-new replacement battery – 1 Year Warranty! 30-Day Money Back! Fast Shipping!

How we test this T-gee BS101 Battery Li-ion 7.4V 38Wh/5140mah

Step 1: Confirm the customer ordered the correct battery.
Step 2: Check the battery's appearance and connector.
Step 3: Test the battery's charge and discharge functions.
Step 4: Charge the battery to 100% and discharge it to 0% to measure its real capacity.
Step 5: Use an EV2300 to check the voltage difference between each group of cells.
Step 6: Charge the battery to more than 30% before shipping.
Step 7: Package the battery carefully and send it out.

Compatible Part Numbers:

BS101 2ICP65/54/130-1

Compatible Model Numbers:

T-GEE WINMATE TABLET M101B M101H M101M8 M101BT

How much do you know about keeping your tablet running well wherever you are? The following tips will help protect your battery and extend its life.


1). Recharge or replace your tablet battery when its power runs low.
2). A Li-ion replacement T-gee BS101 tablet battery will run your device longer than a non-Li-ion one.
3). Defragment your drive regularly; less drive activity helps battery life.
4). To reduce power consumption, let idle drives spin down.
5). Put your device in sleep or standby mode when you are not using it for a while; this both saves the replacement T-gee BS101 tablet battery's power and extends its service life.
6). Store the battery in a cool, dry place when it is not in use.
7). If you usually run the device on fixed mains power, remove the battery to avoid shortening its life.


This fast Alienware gaming monitor with G-Sync is on sale for $340

Since your monitor is the portal to actually using your PC, it makes sense to invest in a good one. Of course, the best gaming monitor for your needs depends on how you use your PC, and if ultra-fast framerates are your thing, check out this deal on the Alienware 25 (AW2518H).

The display isn’t very big at 24.5 inches, and the resolution stands at just 1920×1080. However, that makes it easier for your hardware to drive the panel's rapid native 240Hz refresh rate, making this a great choice for less demanding esports games (Counter-Strike: Global Offensive comes to mind). And as for the size, not everyone has the space for a hulking display.

As you might have guessed, the AW2518H is built around a TN panel (240Hz refresh rates are no longer the exclusive domain of TN displays, but 240Hz IPS monitors are much fewer in number). Compared to a typical IPS display, TN panels have a tighter viewing sweet spot and a narrower color gamut, so keep that in mind.

While professional photographers will want to look elsewhere, competitive gamers have reason to zero in on this one. It has a 1ms response time to facilitate quick reactions, and is built for speed and smooth gameplay. To that latter point, it supports Nvidia’s G-Sync technology with an actual G-Sync hardware module inside (as opposed to being a “G-Sync Compatible” model).

The AW2518H is also fairly generous in terms of connectivity. The underbelly is home to an HDMI 1.4 port, a DisplayPort 1.2 connector, two 3.5mm jacks (headphone and audio out), and four USB 3.0 downstream ports.

In The Lab: ASRock Radeon RX 5700 XT Taichi X 8G OC+ Graphics Card

Since AMD unveiled its products on the 7 nm process node, we’ve seen numerous CPU launches and announcements with the Ryzen 3000 series of processors. And not only processors, but graphics cards too, with AMD’s launch of the RX 5700 and RX 5700 XT back in July. Aimed at the mid-range consumer graphics market, both of the new Navi 10 graphics cards are based on AMD's RDNA architecture, which is completely reworked compared with the company's previous releases such as the Polaris-based 12 nm cards. As part of our ASRock X570 Aqua build, the company also sent us one of its aftermarket-cooled RX 5700 XT cards, specifically the ASRock RX 5700 XT Taichi X 8G OC+ graphics card with 8 GB of GDDR6 VRAM, a boost clock of up to 2025 MHz, and an effective memory speed of 14 Gbps. We’re not doing a full analysis here, but we have taken a look at what’s under the hood and done some testing to see how the card compares with what we have in the motherboard testing lab.

The ASRock Radeon RX 5700 XT Taichi X 8G OC+

The ASRock Radeon RX 5700 XT Taichi X 8G OC+ is a premium version of AMD’s reference RX 5700 XT with a few key refinements aimed at pushing mid-range performance to new levels. The reference model is well-known for its unique ‘dent’ along the top of the card’s cooler, but even more distinctive is ASRock’s Taichi X design. The RX 5700 XT Taichi X cooler uses a 2.5-slot design, which makes it beefy but still a comfortable fit for ATX motherboards, which usually leave three slots of spacing between the two full-length PCIe slots.

As we highlighted in our launch day AMD Radeon RX 5700 XT and RX 5700 review, this isn’t AMD's first 7 nm graphics card (that was the Radeon VII), but it is the first consumer 7 nm graphics card produced in high volumes, and one that looks to redefine the mid-range graphics card sector. For comparison, the NVIDIA GeForce RTX 2060 Super is the rival card from team green at the $399-499 price point.

The ASRock Taichi design is something we have seen implemented successfully on its motherboard range, e.g. the ASRock Z390 Taichi, the ASRock X399 Taichi, and more recently, the ASRock X570 Taichi. Transitioning the design to its fairly new VGA range, the fabled Taichi cogwheel is found all over the card, including on the fans and the all-metal backplate. The ASRock RX 5700 XT Taichi X also benefits from integrated ARGB LEDs which add to the design, and the silver fan shroud is styled to fit in with ASRock's motherboards. It’s now commonplace for vendors that manufacture both motherboards and graphics cards to align their designs for brand awareness, and it also gives users the ability to create a uniform look when building a new system. For connectivity there are two HDMI 2.0b and four DisplayPort 1.4 video outputs.

Dissecting the design of the ASRock Taichi X 8G cooler, the fan shroud houses three Taichi cooling fans: two 90 mm fans and a single 80 mm fan in the middle, each with nine blades. The inside of the shroud around the fans is black, while the outer edging of the shroud’s structure is silver. Fully assembled, the ASRock 5700 XT Taichi X 8G is 12.7 inches long and occupies 2.5 PCIe slots when installed in a motherboard.

In between the PCB and cooler of the ASRock 5700 XT Taichi X 8G is a metal frame which serves two main purposes. Firstly, the brace helps reinforce the PCB to reduce the bending and flex commonly associated with GPU sag when the card is installed in a system. Secondly, the metal frame improves the Taichi X cooler’s thermal performance by providing a wider base area for heat dissipation.

On the rear of the card is a 3D metal backplate which, together with the metal frame on the other side of the PCB, reduces PCB flex and hopefully stops bending altogether. The design of the Taichi X backplate is rather interesting, with a black and silver background carrying ASRock’s Taichi branding and cogwheel illustrations. The gold cogwheels are 3D-moulded into the backplate with a nice finish, while the black cogwheels are printed on.

Analyzing the PCB of the ASRock Radeon RX 5700 XT Taichi 8G OC+, we can see that it uses a 12-phase power delivery in an 11+1 configuration. This is an improvement over the reference AMD RX 5700 XT, which has a 7+1 design. Regulating the power delivery is an International Rectifier IR35217 PWM controller, while two 8-pin PCIe power inputs provide power directly from the power supply. Surrounding the 7 nm GPU die are eight Micron D9WCW GDDR6 memory chips which operate at 1750 MHz, or 14 Gbps effective.
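Those memory figures line up with some quick arithmetic: GDDR6 transfers eight bits per pin per memory clock, and multiplying the effective rate by the RX 5700 XT's 256-bit memory bus gives the card's theoretical bandwidth. A rough sketch (the 256-bit bus width is taken from AMD's reference specification, not from ASRock's material):

# GDDR6 effective data rate and theoretical memory bandwidth
memory_clock_mhz = 1750                        # memory clock quoted above
effective_gbps = memory_clock_mhz * 8 / 1000   # GDDR6: 8 transfers per clock -> 14 Gbps per pin
bus_width_bits = 256                           # RX 5700 XT memory bus width (AMD reference spec)
bandwidth_gbs = effective_gbps * bus_width_bits / 8
print(effective_gbps, bandwidth_gbs)           # 14.0 Gbps, 448.0 GB/s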

When compared directly to the reference model, the ASRock RX 5700 XT Taichi X 8G OC+ comes with a hefty out-of-the-box overclock: a 12% increase in base clock frequency, a 10% bump to the game clock, and a jump of just over 6% in maximum boost clock. The ASRock RX 5700 XT Taichi X 8G OC+ has a current selling price of $480, an $81 premium over the reference RX 5700 XT at $399. NVIDIA’s competing graphics card in this price segment is the GeForce RTX 2060 Super, with aftermarket versions available for roughly the same price as aftermarket RX 5700 XTs.
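To put the boost figure in context, the card's rated 2025 MHz boost can be compared against the reference RX 5700 XT's boost clock. A minimal check, assuming AMD's published 1905 MHz reference boost clock (that reference figure is an assumption taken from AMD's spec sheet, not something measured here):

# Boost-clock uplift of the Taichi X over the reference card
reference_boost_mhz = 1905   # AMD reference RX 5700 XT boost clock (assumed)
taichi_boost_mhz = 2025      # ASRock Taichi X 8G OC+ rated boost clock
uplift_pct = (taichi_boost_mhz / reference_boost_mhz - 1) * 100
print(f"{uplift_pct:.1f}% boost-clock uplift")   # ~6.3%, i.e. just over 6%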

Test Bed and Gaming Performance

As mentioned, today we are comparing against what we have in the motherboard lab. Unfortunately our GPU lab is on the other side of the world, but we were lucky enough to get hold of a few other cards for comparison.

Gaming: Grand Theft Auto V

The highly anticipated iteration of the Grand Theft Auto franchise hit the shelves on April 14th 2015, with both AMD and NVIDIA in tow to help optimize the title. GTA doesn’t provide graphical presets, but opens up the options to users and extends the boundaries by pushing even the hardest systems to the limit using Rockstar’s Advanced Game Engine under DirectX 11. Whether the user is flying high in the mountains with long draw distances or dealing with assorted trash in the city, when cranked up to maximum it creates stunning visuals but hard work for both the CPU and the GPU.

For our test we have scripted a version of the in-game benchmark. The in-game benchmark consists of five scenarios: four short panning shots with varying lighting and weather effects, and a fifth action sequence that lasts around 90 seconds. We use only the final part of the benchmark, which combines a flight scene in a jet followed by an inner city drive-by through several intersections followed by ramming a tanker that explodes, causing other cars to explode as well. This is a mix of distance rendering followed by a detailed near-rendering action sequence, and the title thankfully spits out frame time data.
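Because the benchmark spits out per-frame frame times, the average and 95th-percentile figures quoted in these results can be derived directly from that log. A minimal sketch of that reduction in Python (the frame-time list below is placeholder data for illustration, not output from this test):

# Reduce a list of frame times (ms) to average-FPS and 95th-percentile-FPS figures
import statistics

frame_times_ms = [16.2, 17.0, 15.8, 18.4, 16.6, 21.3, 16.1, 17.5]  # placeholder data

avg_fps = 1000 / statistics.mean(frame_times_ms)
p95_ms = statistics.quantiles(frame_times_ms, n=100)[94]  # 95th-percentile frame time
p95_fps = 1000 / p95_ms
print(f"average: {avg_fps:.1f} fps, 95th percentile: {p95_fps:.1f} fps")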

It’s no secret that NVIDIA cards perform better in Grand Theft Auto V than AMD’s counterparts, and our results follow this trend. The ASRock RX 5700 XT Taichi X 8G performs around 2 frames per second better than the reference Radeon RX 5700 XT in our testing.

Gaming: Strange Brigade (DX12)

Strange Brigade is set in 1903 Egypt and follows a story very similar to that of the Mummy film franchise. This third-person shooter is developed by Rebellion Developments, which is more widely known for games such as the Sniper Elite and Alien vs Predator series. The game follows the hunt for Seteki the Witch Queen, who has arisen once again, and the only ‘troop’ that can ultimately stop her. Gameplay is cooperative-centric, with a wide variety of levels and many puzzles to be solved by the British colonial Secret Service agents sent to put an end to her barbaric and brutal reign.

The game supports both the DirectX 12 and Vulkan APIs and houses its own built-in benchmark which offers various options up for customization including textures, anti-aliasing, reflections, draw distance and even allows users to enable or disable motion blur, ambient occlusion and tessellation among others. AMD has boasted previously that Strange Brigade is part of its Vulkan API implementation offering scalability for AMD multi-graphics card configurations.

In Strange Brigade, the results are different from GTA V, with the ASRock RX 5700 XT Taichi X 8G flexing its muscles with a comfortable lead over NVIDIA’s RTX 2060 Super reference card. Compared with the reference RX 5700 XT, the ASRock RX 5700 XT Taichi X 8G comfortably leads by 3 frames per second both on average and at the 95th percentile.

Gaming: F1 2018

Aside from keeping up to date with the Formula One world, F1 2017 added HDR support, which F1 2018 has maintained; otherwise, we should see any newer versions of Codemasters’ EGO engine find their way into F1. Graphically demanding in its own right, F1 2018 keeps a useful racing-type graphics workload in our benchmarks.

We use the in-game benchmark, set to run on the Montreal track in the wet, driving as Lewis Hamilton from last place on the grid. Data is taken over a one-lap race.

In F1 2018, the AMD RX 5700 XT models decimate the NVIDIA GeForce RTX 2060 Super. The ASRock RX 5700 XT Taichi X 8G leads the NVIDIA RTX 2060 Super by around 21 frames per second on average, and by around 16 frames per second in our 95th percentile testing.

Final Words

The ASRock RX 5700 XT Taichi X 8G OC+ is one of many custom-designed Radeon RX 5700 series models to hit the market in a large wave. Within our three games, the ASRock RX 5700 XT Taichi X 8G OC+ comfortably won two of them, with the NVIDIA GeForce RTX 2060 Super taking the crown in Grand Theft Auto V; this is due to in-game optimizations which favor NVIDIA graphics cards.

One of the most interesting aspects is the design; the ASRock Taichi design is one of its most marketable and well-known brands. Coupled with a Z390 Taichi or X570 Taichi motherboard, the ASRock RX 5700 XT Taichi X 8G OC+ makes a natural partner with its similar styling, and the RX 5700 XT series is a solid mid-range option. While a reference AMD Radeon RX 5700 XT costs $399, the uprated and better-cooled ASRock RX 5700 XT Taichi X 8G OC+ costs $480, a rise of $81 for an aftermarket card over the reference design.

It has style, it has a good cooler, and the hardware itself is capable of overclocking beyond its already aggressive factory overclock. For that reason alone, it’s worth it over the reference model. The other alternatives in this price segment are the NVIDIA GeForce RTX 2060 Super reference card at $399 and various vendor models featuring aftermarket coolers.

16-Inch MacBook Pro vs Razer Blade 15 Studio: Photo Editing Shootout

Last month, we reached out to Razer to see if they would lend us a Razer Blade 15 Studio Edition for testing. With the 16-inch MacBook Pro just announced and on its way in as well, it seemed like the perfect time to see which top-of-the-line laptop would take the photo editing crown.

Introduction

There was a time when all new “creatives” naturally drifted towards Mac. In 2007, I was… Maccurious, and so I tried switching from a Sony Vaio to that iconic white plastic MacBook.

I was hooked, and for years you couldn’t shake Apple off of my personal computing pedestal no matter how hard you tried. The workflow was so seamless, and Windows wouldn’t do anything resembling “catching up” on the usability front until 2015, when Windows 10 was released. Vista was crap, Windows 7 was okay, and Windows 8 was such a touchscreen-optimized train wreck it took Windows 9 down with it.

But then came Windows 10 in 2015, just as Adobe was moving over to the Creative Cloud and Apple was making a few “brave” design mistakes.

Since then, it’s been a painful waiting game for those of us who weren’t sure what would become of Apple’s “Pro” computing lineup. Windows laptops kept getting better and better, while Apple issued semi-apologies, reiterated that the SD card slot would never come back, and ran into increasingly frustrating performance issues by putting Intel’s hottest chips in the thinnest MacBook Pro design yet.

Over the past year, Apple finally got serious again—releasing the iMac Pro, Mac Pro, and the much-improved 16-inch MacBook Pro—but so has their competition in the creative space.

Windows has been courting creatives more adamantly than ever before. Thanks, in part, to the sheer hardware demands (and crappy optimization…) of Adobe’s photo and video editing apps, as well as the continued explosion of VFX and 3D rendering workflows, PC companies realized that the raw horsepower and cooling advantage they traditionally have over Macs had become more valuable.

Combine this with the evolution of Windows 10 into something approaching user-friendly, NVIDIA’s rift with Apple, and a few years of Apple removing ports and snubbing pro users, and you’ve got an entire swath of Apple power users still relying on their mid-2015 MacBook Pros, unsure if their next laptop is going to be a Mac or Windows machine.

I am one of those users. For years, I’ve held on to my mid-2015 MBP with its usable keyboard, reasonably powerful specs, all the ports I want, and magsafe connector, unsure if Apple could lure me back with their next proper redesign. Now I’m at a crossroads.

The Razer Blade 15 Studio Edition is an almost blow-for-blow competitor for the brand new 16-inch MacBook Pro, with the same exceptional build quality, an almost identical price at a similar spec, more I/O, and (allegedly) more raw editing power. So I switched to the Razer as my daily driver for a solid month, before running both machines through a battery of tests that were suggested by professional photographers.

Can the Razer beat the MacBook Pro at its own game, or does Apple’s latest redesign keep it in the top spot? Let’s find out.

Specs and Price Point

Despite Apple’s reputation for making “overpriced” laptops, the fact remains that you have to pay for the kind of build quality and quality hardware that you’ll find in a flagship MacBook Pro. The Razer Blade 15 Studio Edition—or any of the Blade 15 laptops, really—is a good example of this.

The two computers we’ve been testing go blow-for-blow in almost every category. Both use an aluminum unibody design and exceptional glass trackpads, both pack 32GB of RAM and powerful graphics, and both feature impressive multi-core Intel CPUs that promise to churn through both single and multi-threaded workloads. Here’s a spec breakdown of the Razer Blade 15 Studio and 16-inch MacBook Pro we’ll be comparing:

That’s right, the Razer is more expensive than the MacBook Pro despite offering half the storage, a 6-core i7 instead of an 8-core i9, and a smaller battery. In fact, if you drop the MacBook Pro down to the same 6-core i7 used in the Razer and match the storage, the 16-inch MBP comes out even further ahead at just $3,200… albeit with a weaker GPU.

If you’re willing to lose the SD card slot, cut your RAM in half, downgrade the storage to 512GB, and downgrade your graphics card to an NVIDIA RTX 2080 Max-Q with 8GB of VRAM, you can drop the price of the Razer down to $3,000 (sale price) by getting a fully loaded Razer Blade 15 Advanced GeForce Studio Edition… but now you’re competing with a 16-inch MBP that costs just $2,600 if you match the two computers spec-for-spec (again, except graphics).

Build Quality

I can confidently say that there is no significant difference in build quality between these two machines. In fact, since the Razer Blade 15 Studio Edition is only available in the Mercury White variant, they even look extremely similar.

Both use an aluminum unibody design that’s rigid and stands up well to scuffs and fingerprints, both look sleek and professional, and both are pretty hefty—an understandable drawback thanks to the 80Wh battery in the Razer and the 100Wh battery in the MacBook Pro.

While the 16-inch MacBook Pro is a bit longer in one dimension, it’s also thinner than the Razer, and both will slot into the majority of “made for 15-inch laptop” sleeves you’ll find in backpacks or messenger bags.

This category is a dead tie.

Usability

We’ll discuss operating systems shortly. For now, when I say “usability” I’m talking about the design elements that affect the day-to-day use of each laptop. In other words: what ports do you have, how do the keyboard and trackpad function and feel, how do the displays compare, and what kind of battery life can you expect.

I/O

In terms of I/O (ports) the Razer is the clear, breath-of-fresh-air winner. You’ve got multiple USB-A ports, you’ve got a Thunderbolt 3 port, you’ve got a Mini DisplayPort, and Razer has put an SD card slot back into this specific top-of-the-line Studio Edition variant of the Razer Blade 15.

While Apple’s Phil Schiller is adamant that pros don’t need an SD card slot, our recent deep dive into that question disagrees, and we’re glad Razer is listening to creatives who don’t want to live “that dongle life.”

The tradeoff, of course, is the versatility of the ports you do have. Any of the four USB Type-C ports on the MacBook can be used for charging the device and give you full Thunderbolt 3 transfer speeds. By contrast, there is only one USB Type-C port on the Razer, and while it is Thunderbolt 3, it cannot be used to charge the device.

Still, Razer takes this category by a landslide.

Keyboard and Trackpad

Before receiving the Razer Blade, I’d heard a lot of complaining online about the keyboard. I assumed it was misguided nitpicking. I was wrong.

The keyboard layout is a travesty. In order to fit in full-sized arrow keys (ostensibly for RGB and key customization purposes?) Razer has put the up arrow right next to the right-hand Shift key and slapped a function key at the very bottom right. I used the laptop exclusively for a month and never stopped making mistakes because of this.

If I press Shift>Up one more time when I mean to type a question mark, I’m gonna lose my mind.

In terms of feel, the keyboard is okay, but it’s mushier and less pleasant to type on than the new (read: old) scissor-switch keyboard on the new MacBook Pro. It’s a shame Razer didn’t put their new optical keyboard onto this laptop, but that’s only available in the Razer Blade Advanced.

The trackpad, on the other hand, is the best I’ve used on a PC. Apple is the undisputed king of high-quality trackpads thanks to their fantastic feel, responsiveness, and intuitive multi-touch gestures, and Apple still beats the Razer’s version today, but the Blade Studio Edition has nearly closed this gap. The feel of the Razer’s glass-topped trackpad is almost identical, and undeniably premium.

Apple’s new-old keyboard takes a resounding win here, and the company’s trackpad maintains its crown as the one to beat… but not by much.

Display

Both of these displays are fantastic. Both cover the P3 color gamut, both are very bright, and both are more-than-high-enough resolution for high-end photo editing. That said, Razer’s panel with “retina shattering accuracy” takes this round for two important reasons:

  • It uses an OLED panel, where the MacBook Pro is IPS, earning the display an HDR 400 certification and an extremely crisp contrast ratio.
  • It’s 4K, where the MacBook Pro is 3072 x 1920.

The win goes to Razer, no question. Both displays look great, but the additional resolution and higher contrast ratio made possible by the OLED display gives Razer a noticeable edge.

Battery Life

The 100Wh battery in the 16-inch MacBook Pro—the FAA limit for taking on planes in the United States—lives up to its wattage and pulls the Mac way ahead here.

The MBP regularly provided me with between 4 and 5 hours of high-productivity time. That means two Chrome windows with 10-15 tabs each, Photoshop, Slack, Email, Feedly, a notes app, and Spotify all actively used, with other apps coming and going as I work. By comparison, the Razer would crap out on me in 1.5 to 2 hours.
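Those runtimes map directly onto average power draw: divide each battery's watt-hour capacity by the hours it lasted. A back-of-the-envelope comparison using the figures above (taking the midpoint of each quoted range is my own simplification):

# Average power draw implied by battery capacity and observed runtime
mbp_wh, mbp_hours = 100, 4.5        # 16-inch MacBook Pro, midpoint of the 4-5 hour range
razer_wh, razer_hours = 80, 1.75    # Razer Blade 15 Studio, midpoint of the 1.5-2 hour range

print(f"MacBook Pro: ~{mbp_wh / mbp_hours:.0f} W average draw")      # ~22 W
print(f"Razer Blade: ~{razer_wh / razer_hours:.0f} W average draw")  # ~46 W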

The Mac also offers a much more convenient 96W power adapter that can be used as a generic USB Type-C power brick and can power your laptop from an airplane’s A/C port. The Razer’s massive (in comparison) 230W power adapter uses a dedicated charging port on the laptop and demands more than double the power a typical airplane seat outlet can deliver.

Sure, the Razer charges more than twice as fast when you’re hooked up to the mains (on the ground), but it drains twice as fast and may not be able to pull enough juice to last you through a long flight, even plugged in.

This is a clear win for the Mac.

Operating System

This is, admittedly, a highly subjective topic that people will yell at me about in the comments, but it’s hard to argue that Windows isn’t clunkier than macOS when it comes to basic usability. That’s why 90% of die hard Mac users, even the pissed off ones, won’t switch.

While I disagree with silly statements like “All PCs suck because they run Windows,” I have to admit that most of the redeeming qualities I find in Windows 10 are a version of something on macOS: Virtual Desktops, for example, are their take on Mission Control, and the recent decision to make taking screenshots a three-key shortcut replaces the inane Print Screen system.

I also have to mention that the Razer has locked up on me at least once every few days: either a single app like Photoshop would freeze completely, or the whole system would jam. At one point, Photoshop froze, and every time I opened Task Manager to try and kill it, Task Manager would freeze. Seriously.

I would have taken a screenshot, but that wasn’t working either. Awesome…

Windows 10 has come a long way, and Razer’s bloatware-free implementation is the best I’ve used. Credit where credit is due. At times I would even forget that I was on Windows for stretches of time… until some part of the system would lock up or some incredibly simple feature would jolt me out of my workflow. Even with all of its bugs, macOS Catalina hasn’t caused me nearly as much frustration in the months since it came out.

Performance

Given the recent GPU Acceleration improvements that Adobe made to Lightroom Classic, we were very curious to see how these tests turned out. Would the 8-core CPU in the Mac win out, or would the more powerful GPU (with twice the VRAM) in the Razer take down Apple? The CPU/GPU distinction wound up being very important…

But first, let’s cover some housekeeping.

All of the tests were run with both computers fully charged and hooked up to A/C power, with no other applications running, using the exact same images and (where applicable) identical edits on the latest versions of Lightroom and Photoshop.

We went into the settings on both computers and made sure GPU acceleration was on for both Lightroom and Photoshop, and ensured Photoshop was able to take advantage of the maximum amount of RAM available. Finally, we also went into the Razer’s performance settings in the brand’s Synapse app and put both CPU and GPU performance at max.

Lightroom Import

We ran two import tests on each machine, throwing the biggest files we could at Lightroom Classic and timing how long it took for the program to import the files off of the computer’s own hard drive and generate Standard Previews.

First, we imported 110 RAW files from the 61MP Sony a7R IV, and then followed up with 150 RAW files from a 100MP PhaseOne XF.

Lightroom Export

We ran six export tests in all. For each batch of photos, we performed heavy global edits, copied that same edit across the whole batch, and then exported all of those images as 100% quality JPEGs (sRGB), uncompressed 16-bit TIFFs (Adobe RGB), and DNGs with Medium JPEG previews.

HDR Merge

Next up, we ran two different HDR Merges. Both were shot with the Sony a7R IV: the first was a traditional 3-shot bracket, while the second was six identical frames shot in a dark cave, with the lighting moved around in order to try and capture a fully-lit composite.

Photoshop Benchmark

Photoshop is incredibly hard to benchmark. While many of the filters do take some time, the difference between these two machines when, say, merging a panorama or applying a GPU accelerated filter is only a few seconds here or there. Added up, though, it makes a difference.

That’s why we chose to use this Photoshop benchmark released by Puget Systems. A huge shoutout to them for putting this benchmark together, because it saved us hours and hours of work timing each of these tasks ourselves. The benchmark tests 21 different tools in a row, timing each task, and then runs that same benchmark three times before assigning an overall score, a GPU score, a General score, a Filter score, and a Photomerge score.

This is where we expected the Razer to beat Apple. It was on the latest drivers, has 8GB more VRAM, and a GPU that should eat the AMD Radeon Pro 5500M for lunch. We … were surprised. We ran each benchmark ten times in a row. Here are the top PugetBench scores for each machine.

The only category that the Razer won was Photomerge, which really surprised us. Filters are GPU accelerated, but apparently not enough to overcome the CPU limitation. It wasn’t a slaughter by any means, but the Mac finished the benchmark sooner and scored better every single time–a result we did not expect.

A Story of GPU vs CPU

By this point, you’ve probably got the performance picture: while the GPU advantage of the Razer and other PCs might give them an edge in 3D rendering or some video editing tasks, it does very little for photographers. Whether you’re exporting hundreds of edited files for a client or editing hundreds of layers together in Photoshop, the Mac wins in almost every single category, losing only in the Photomerge test, and even then the difference is about 10 seconds total.

Conclusion

For Windows Users

If you’re a long-time Windows user looking for a well-built, exceptionally powerful machine with zero bloatware, the Razer Blade 15 Studio Edition is a fantastic choice. Don’t let the company’s gaming history and three-headed snake logo deter you: with its industry-leading graphics, I/O for days, a display to die for, and the cleanest possible install of Windows, this is the best PC laptop I’ve ever used.

HOWEVER

If you’re willing to sacrifice on the overall build quality, you will get better (read: faster) overall photo editing performance from a machine with a better CPU and worse GPU… and you’ll save money, too.

Take a look at the top spec Asus Zenbook Pro Duo or the Dell XPS 15 7590—both come with the same 8-core Intel i9 CPU that the MacBook Pro has, which means faster exports and faster file handling than the Razer, and you can get the Asus specced with a GPU that will more-than-keep-up with the 5500m in the MacBook Pro.

For Mac Users

If you’re a long-time Mac user wondering if now is the time to switch, you can rest assured that Apple saved themselves with the new 16-inch MacBook Pro.

Apple knocked this redesign out of the park. Sure, the Touchbar is still lame and I wish they’d thrown the SD card slot back on the machine (or at least another Thunderbolt 3 port), but all of my biggest complaints about the previous MacBook Pros have been addressed: keyboard, cooling, escape key… in that order.

As tempting as some of the NVIDIA Studio Edition laptops are—and the Razer Blade 15 is near the top of that list—there are simply too many drawbacks and not nearly enough advantages if you’re talking about a PC with similar specs and equivalent build quality. Even if you’re fairly familiar with Windows, it’ll slow you down in a million mind-numbing little ways that will eat away any performance advantage you do get, and the performance boost that these laptops boast comes with a big fat asterisk for photo editors: the most time-consuming task for most professional photographers, batch exporting, doesn’t even use the GPU in the first place.

G.Skill announces a bevy of high-performance DDR4 RAM kits up to 256GB

G.Skill is closing out the year with dozens of new DDR4 memory kits designed for Intel’s X299 and AMD’s TRX40 Threadripper platforms, under its Trident Z Royal and Trident Z Neo product lines. Several of the kits are based on its new low latency 32GB modules, including a bunch of 256GB capacity kits at different speeds.

This raises the question: do RAM speed and capacity matter for gaming? The answer is yes, up to a certain point. Generally speaking, there is a definite improvement in upgrading from 8GB to 16GB of RAM, though the jump to 32GB and beyond is less substantial. A similar trend applies to frequencies on AMD systems—going from a DDR4-2666 kit to a DDR4-3600 kit can yield double-digit performance gains, depending on the game (Intel setups don’t benefit as much).
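The frequency side of that claim is easy to quantify: theoretical peak bandwidth scales linearly with the transfer rate. A rough comparison, assuming a typical dual-channel desktop configuration (DDR4 moves 8 bytes per channel per transfer):

# Theoretical peak DDR4 bandwidth in GB/s
def ddr4_bandwidth_gbs(transfer_rate_mts, channels=2, bytes_per_transfer=8):
    return transfer_rate_mts * bytes_per_transfer * channels / 1000

print(ddr4_bandwidth_gbs(2666))  # ~42.7 GB/s
print(ddr4_bandwidth_gbs(3600))  # ~57.6 GB/s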

Having said all that, most of G.Skill’s new kits are overkill for a PC that is strictly used to play games. For example, the company’s newly minted 256GB (32GBx8) DDR4-4000 kit at 18-22-22-42 is legendary on paper, but much of the performance potential will remain untapped for gaming, general purpose computing, and most things outside of content creation and certain workstation chores. Simply put, the best RAM for gaming is not a 256GB kit.

Still, I applaud G.Skill for pushing the envelope here. All of its new kits are high-capacity options, ranging from 32GB on up to 256GB.

From a technical standpoint, the new DDR4-3200 kits based on ultra-low latency 32GB modules are the most interesting, with 128GB and 256GB kits sporting 16-18-18-38 timings at 1.35V. Timings are even tighter on the 32GB and 64GB kits that use 8GB modules, at 14-15-15-35 at 1.45V.
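Those timings matter because absolute first-word latency, in nanoseconds, is the CAS latency divided by half the transfer rate, so a lower CL at a given speed is a real latency reduction rather than just a spec-sheet number. A quick comparison of the DDR4-3200 kits described here and the 256GB DDR4-4000 kit mentioned above:

# First-word latency in nanoseconds: CL cycles at a clock of (transfer rate / 2) MHz
def cas_latency_ns(cl, transfer_rate_mts):
    return cl / (transfer_rate_mts / 2) * 1000

print(cas_latency_ns(16, 3200))  # 10.0 ns  (CL16 128GB/256GB kits)
print(cas_latency_ns(14, 3200))  # 8.75 ns  (CL14 32GB/64GB kits)
print(cas_latency_ns(18, 4000))  # 9.0 ns   (the 256GB DDR4-4000 kit)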

G.Skill uses hand-selected memory chips for its Trident Z Neo and Royal memory lines. This process, otherwise known as binning, means memory components are individually tested to hit and maintain certain speed and performance parameters, which is one of the things that separates a high-performance RAM kit from a generic one.

My assumption is these are using Samsung B-die memory ICs, as are commonly found in premium RAM kits, particularly those designed for overclocking. I’ve reached out to G.Skill to see if that’s the case and will update this article when I hear back.

These kits will be available in the first quarter of 2020. There’s no mention of price (G.Skill typically doesn’t share MSRPs because of fluctuations in the memory market), but I don’t expect any of these will be cheap, especially in 256GB territory—the few 256GB kits out there all run north of $1,200.

GIGABYTE Unveils MU71-SU0 and MD71-HB0 Xeon Motherboards For Professionals

Outside of the plethora of AMD-related announcements of late, GIGABYTE has announced a pair of Intel motherboards for its workstation and server board users. The GIGABYTE MU71-SU0 is designed for Intel’s Xeon W-3200 processor family and is based on the C621 single-socket chipset. The other model is the GIGABYTE MD71-HB0, which is dual-socket on the C622 chipset and supports Intel’s Xeon Scalable processor product stack.

MU71-SU0

Starting with the GIGABYTE MU71-SU0, the single-socket Intel C621 model has plenty of workstation features designed for use with Intel’s Xeon W-3200 processors, which range from 8-core models all the way up to 28-core options. The flexible C621 platform benefits from AVX-512 support and Intel’s VROC RAID key utility, and the MU71-SU0 specifically adds an ASPEED AST2500 remote management controller.

The GIGABYTE MU71-SU0 has six full-length PCIe 3.0 x16 slots which operate at x8/x16/x8/x16/x8/x16, plus a half-length PCIe 3.0 slot locked down to x4. Storage includes a single PCIe 3.0 x4 M.2 slot, with two SlimSAS connectors providing eight SATA ports with support for RAID 0, 1, 5, and 10 arrays. There are eight memory slots which support both 64 GB RDIMMs and 128 GB LRDIMMs, with a maximum speed of DDR4-2933 in hex-channel mode and up to 2 TB with Xeon W ‘M’ high-memory processors. On its website, GIGABYTE states that the board supports Intel’s Optane DCPMM modules; however, we were told by Intel that Xeon W-3200 does not support them, so we are looking into this. The rear panel offers two Intel I210-AT Gigabit Ethernet controllers, an additional Ethernet port dedicated to remote management, four USB 3.1 G1 Type-A ports, a D-Sub output driven by the IPMI, and a serial port.

MD71-HB0

Moving onto the second of GIGABYTE’s new professional mobos, the MD71-HB0: this dual-socket server model is designed for use with Intel’s Xeon Scalable processor family and offers a more high-end feature set than its single-socket counterpart. The GIGABYTE MD71-HB0 has a similar PCIe slot arrangement which operates at x16/x16/x16/x8/x16/x8/x16, with five full-length slots and two half-length. Storage options include two PCIe 3.0 x4 M.2 slots, with three SlimSAS connectors supporting up to twelve SATA ports with the usual RAID options.

Its feature set includes twelve memory slots, with support for 64 GB RDIMMs and 128 GB LRDIMMs at speeds up to DDR4-2933. As with other C621 and C622 boards, this model uses hex-channel memory configurations. On the rear panel is a dual-port Intel X557-AT2 10 GbE controller, a further two Intel Gigabit Ethernet ports, two USB 3.1 G1 Type-A ports, and a D-Sub video output for the ASPEED AST2500 IPMI interface.

Both models feature similar designs with a blue PCB, blue memory slots, and standard non-reinforced PCIe slots. GIGABYTE hasn’t shared any pricing or availability as of yet, but both the MD71-HB0 and MU71-SU0 are expected to join the rest of GIGABYTE’s server offerings.

G.Skill’s New DDR4 Kits With 32GB Modules Have Very Low Latency

G.Skill today announced new DDR4 RAM kits built from 32GB modules, with capacities ranging from 64GB in dual-module kits to 256GB in kits with eight DIMMs. The best part is that these modules feature very low timings for their capacity, meaning the kits should offer both high capacity and snappy memory when they arrive in Q1 2020.

The kits are offered across a handful of lineups, the Trident Z RGB, Trident Z Royal, and Trident Z Neo lines, so there will be plenty of options.

The 32GB modules will run at DDR4-3200 speeds with CL14 timings. More specifically, the timings are CL14-18-18-38. G.Skill has validated these clocks and timings on the latest Intel and AMD HEDT (high-end desktop) platforms, as well as AMD’s mainstream X570 platform, which doesn’t always happen at these performance limits.

It should be noted though that G.Skill is known for pushing its modules to their absolute limits with its testing, so we’d hold off on putting any numbers on real-life performance until we do some testing of our own. With Intel XMP 2.0 profiles, however, chances are you’ll be fine.

No word on pricing yet, but we’ll stay tuned to see these RAM kits arrive early next year. 

What Is a Favored Core? Intel CPUs With Turbo Boost Max Technology 3.0 Explained

Like we’ve proven with AMD processors, not all CPU cores are created equal, and that applies to Intel’s silicon, too. The silicon wafer fabrication process leads to discrepancies in each core’s voltage, power, and thermal characteristics, leaving some cores capable of higher performance than others. These faster cores are identified to the operating system as favored cores.

The term favored core first came with the arrival of Intel’s Broadwell-E line of HEDT (high-end desktop) CPUs in 2016.

Intel Turbo Boost Max Technology 3.0

Intel Turbo Boost Max Technology 3.0 is a feature used by some Intel CPUs to improve performance of lightly- or single-threaded applications by pushing those workloads to the processor’s two favored (or fastest) cores. Using both hardware and software, Intel claims Turbo Boost Max Technology 3.0 delivers “more than 15% better single-thread performance.”

Released in September 2019, Intel Turbo Boost Max Technology 3.0 works by using a driver and CPU-stored information to select and move workloads to the processor’s favored cores.

Windows natively supports this feature, and it’s enabled automatically. You need Windows 10 x64, RS5 or later.

Similarly, CPUs that support Intel Turbo Boost Max Technology 3.0 have the feature enabled in their hardware and pcode. That means you don’t have to activate it in the BIOS or elsewhere. Additionally, you can monitor the feature with Intel’s Extreme Tuning Utility (XTU) software.

According to Intel, the latest version of its Turbo Boost technology allows for “higher frequencies with single-core turbo” than its predecessor, Intel Turbo Boost Technology 2.0.
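Because the favored cores are exposed to the operating system as cores with a higher maximum frequency, you can spot them without Intel's tooling. On a Linux machine with the intel_pstate driver, for example, the per-core limits show up in sysfs; the sketch below is illustrative and assumes that interface is present (it is not part of Intel's own software):

# List the CPU cores the kernel reports with the highest max frequency
# (on a Turbo Boost Max 3.0 system these are the favored cores)
import glob
import re

max_freqs = {}
for path in glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq/cpuinfo_max_freq"):
    cpu = int(re.search(r"cpu(\d+)", path).group(1))
    with open(path) as f:
        max_freqs[cpu] = int(f.read())   # reported in kHz

top = max(max_freqs.values())
favored = sorted(cpu for cpu, freq in max_freqs.items() if freq == top)
print(f"favored cores (max {top / 1e6:.2f} GHz): {favored}")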

However, the effectiveness of Turbo Boost Max Technology 3.0 can be impacted by the following:

  • Your CPU’s active core count
  • CPU temperature
  • Estimated current and power consumption levels
  • Workload type
  • Driver support

Which CPUs Support Intel Turbo Boost Max Technology 3.0?

Intel Turbo Boost Max Technology 3.0 works with certain Intel Core X-series CPUs on the Intel X299 chipset.

The following Intel CPUs can direct workloads to a favored CPU core using Intel Turbo Boost Max Technology 3.0:

  • Intel Core i7-69xx/68xx series
  • Intel Core i9-7900X
  • Intel Core i9-7920X
  • Intel Core i9-7940X
  • Intel Core i9-7960X
  • Intel Core i9-7980XE
  • Intel Core i7-7820X
  • Intel Core i7-9800X
  • Intel Core i9-9820X
  • Intel Core i9-99x0XE
  • Intel Core i9-99x0X
  • Intel Xeon E5-1600 v4 series (single CPU socket only)

OnePlus launches bug bounty program

After falling victim to a data breach back in November, OnePlus promised it would launch a bug bounty program by the end of the year in order to further secure its phones.

Now the Chinese smartphone maker has launched its bug bounty program to help prevent future breaches, but at this time only select security researchers will be invited to look for vulnerabilities in its systems.

OnePlus will also be highlighting the work of the program’s top contributors by updating a leaderboard which will feature the top three contributors on the main page of its bug bounty program.

OnePlus bug bounty program

According to a page offering more details on its bug bounty program, OnePlus will pay up to $7,000 for special cases, $750-$1,500 for critical vulnerabilities, $250-$750 for high tier vulnerabilities, $100-$250 for medium tier ones and $50-$100 for low severity bugs.

While the tiers show the level of rewards researchers can expect to receive, it is still unclear as to what the criteria are for each tier. OnePlus just says that the rewards researchers will receive are “determined based on vulnerability severity and actual business impact.”

Just as NordVPN did with its recently launched bug bounty program, OnePlus will also be partnering with HackerOne to launch its program.

The company’s bug bounty program is currently in a private test phase but OnePlus has said that it will launch a public version of the program in 2020.