<![CDATA[ Latest from PCGamer in Hardware ]]> https://www.pcgamer.com Wed, 15 Jan 2025 10:29:27 +0000 en <![CDATA[ Gigabyte X870E Aorus Pro review ]]> Having already launched one generation of chipsets on its Socket AM5 platform, AMD faces an uphill struggle in selling 800-series chipset motherboards when older, cheaper options are perfectly capable of running CPUs such as the Ryzen 7 9800X3D. However, AMD arguably has an easier task than Intel, given the low sales figures of the Core Ultra 200 series, and there are plenty of options that leave you with decent change from $400/£400 too.

The Gigabyte X870E Aorus Pro retails for around $350—still not cheap, but then X870E is the flagship choice among AMD’s new 800-series chipsets.

This gets you Wi-Fi 7, full PCIe 5.0 support for the graphics slot and some of the M.2 storage ports, USB 4 Type-C ports, and the latest features designed to make the building process as hassle-free as possible and to deal with hot-running PCIe 5.0 SSDs. Those last two are what ultimately make the difference here, given the older X670E chipset offers most, if not all, of the same ports and bandwidth under the hood.

At this price you’re also paying for premium looks, and Gigabyte has delivered with a chunky set of dark, moody heatsinks for the M.2 ports and VRMs, as well as a dash of built-in RGB lighting above the I/O panel and on the chipset heatsink. While it’s not exactly cheap, the board looks and feels every bit like something costing $100 more.

X870E Aorus Pro specs

Gigabyte X870E Aorus Pro motherboard with the SSD heatsinks detached and on a light desk.

(Image credit: Future)

Socket: AMD Socket AM5
Chipset: AMD X870E
CPU compatibility: AMD Ryzen 7000/8000/9000 desktop
Form factor: ATX
Memory support: DDR5-4800 to DDR5-8200 (OC), up to 256 GB
Storage: 4x M.2, 4x SATA
USB (rear): 2x USB 4 Type-C 40 Gbps, 3x USB 3.1 Type-A 10 Gbps, 4x USB 3.0 Type-A 5 Gbps, 2x USB 2.0
Display: 1x HDMI 2.1, 2x DisplayPort (via USB 4 Type-C)
Networking: Realtek 2.5G LAN, Wi-Fi 7
Audio: Realtek ALC1220
Price: $359 | £329 | AU$626

If you hate dealing with tiny M.2 screws or graphics cards stuck in their slots then you’ll love the tool-free features on offer with the X870E Aorus Pro. They’re similar to those from other manufacturers, and easily as good as anything from Asus or MSI. The EZ-Latch feature applies to the M.2 SSDs and heatsinks as well as the PCIe slot for your graphics card, with the former two benefitting from small latches that pop open with a finger press for easy removal and installation. To remove your graphics card there’s a push-button next to the memory slots.

Even the Wi-Fi antenna gets the EZ treatment with a combined connector that plugs into the I/O panel port, so there are no fiddly nuts or screws to deal with here either. All four M.2 ports cool SSDs on both sides and three of them support PCIe 5.0 SSDs, although the lower two PCIe 5.0 slots will steal bandwidth from your graphics card, so it’s best to avoid those.

Image gallery (1-4): Gigabyte X870E Aorus Pro motherboard with the SSD heatsinks detached and on a light desk. (Image credit: Future)

The rest of the PCB has nearly everything you could want, even for a high-end system. There are eight fan headers, a couple of 2-pin thermal probe headers if you’re really going to town with your cooling, and you even get a POST LED display plus power and reset buttons. It also has impressively large VRM heatsinks to cool the 16+2+2-phase power delivery.

PC Gamer test bench
CPU: AMD Ryzen 9 9900X | Cooler: Asus ROG Ryujin III 360 ARGB Extreme | RAM: 32 GB Corsair Vengeance Pro DDR5-6000 | Storage: 2 TB Corsair MP700 | PSU: MSI MAG A850GL 850 W | OS: Windows 11 24H2 | Chassis: Open platform | Monitor: Dell U2415

Apart from the funky Wi-Fi connector, the rear I/O panel is a standard-looking affair that doesn’t go all-out on USB Type-C like other boards have. Instead, you get two USB 4 Type-C ports and a total of nine USB Type-A ports comprising three USB 3.1, four USB 3.0 and two USB 2.0, so a good balance of ports and future-proofing without going too overboard and bumping up the price needlessly.

Also on the I/O panel is a Q-Flash Button, otherwise known as BIOS Flashback, a 2.5 Gigabit Ethernet port and audio connectors for the Realtek ALC1220 codec.

Buy if...

✅ You want the latest features on AMD’s Socket AM5 platform without a wild price tag: There are some X870E motherboards with outlandish price tags, but this one is much more reasonable yet still offers a great set of features.

Don't buy if...

❌ You're on a tight budget: Ultimately you’re paying for modern features here, so if you’re happy with Wi-Fi 6 and USB 3.0, and don’t mind spending a few minutes longer building your PC, an old 600-series chipset board would likely save some cash.

Gigabyte’s BIOS and software are pretty good too, with the former offering a snazzy front page that makes it easy to get to basic options such as memory EXPO/XMP, plus quick links to fan control and BIOS updating. The software is definitely useful for controlling your fans from within Windows, and its driver update utility also worked well, spotting one or two things Windows Update missed, so it should be handy for beginners.

If this were 2026 and AMD’s Socket AM5 maybe only had another 12 months left, we’d suggest not spending this much on a motherboard with little upgrade path beyond whatever Ryzen CPUs were around at the time. However, the Gigabyte X870E Aorus Pro will enjoy a decent lifespan, with AMD previously promising to support the socket through to at least 2027.

Spending a little more for cutting edge features that will see you through your next CPU upgrade as well is definitely a strong argument.

The board has plenty to offer aside from future CPU support, though, as it has good VRM and M.2 cooling, looks fabulous and is super-easy to work with thanks to a bunch of tool-free features. It has limited RGB lighting as standard and only two of the four M.2 ports can be used without stealing lanes from the graphics card, but other than that, it’s a very solid X870E board for the cash that’s particularly user-friendly.

]]>
https://www.pcgamer.com/hardware/motherboards/gigabyte-x870e-aorus-pro-review/ Waxt8zmVdhx2fPeMz6U7x Tue, 14 Jan 2025 20:00:00 +0000
<![CDATA[ Gigabyte Z890 Aorus Elite WiFi7 Ice review ]]> It’s been a somewhat bizarre launch for Intel, with its Core Ultra desktop processors crashing and burning (thankfully not physically this time) in reviews, and sales figures are understandably poor, especially when compared to the Ryzen 7 9800X3D, which even outsold all of AMD’s other Ryzen 9000 CPUs combined. It’s only because of a long list of problems and promised fixes that we’d even consider opting for Intel’s new platform and a motherboard such as the Gigabyte Z890 Aorus Elite WiFi7 Ice.

But from the start the constant updates do make testing difficult.

BIOS updates, driver updates, Windows updates – they’ve all been coming thick and fast for Z890 motherboards in the hope of improving performance. Today we’ll be seeing if the latest tweaks for the platform and specifically the Gigabyte Z890 Aorus Elite WiFi7 Ice make it worth considering, even when directly compared with its X870 counterpart that costs a touch more.

Of course, Intel is only releasing cheaper LGA1851 motherboard options this month, whereas AMD has had to contend with older 600-series chipsets on its platform. Like AMD’s 800-series, the focus is on Wi-Fi 7, PCIe Gen 5 support, tool-free features and high-speed USB Type-C ports, and being Intel, that means Thunderbolt 4.

Z890 Aorus Elite WiFi 7 Ice specs

Gigabyte Z890 Aorus Elite WiFi 7 Ice on a light desk with a white background and SSD covers removed.

(Image credit: Future)

Socket: LGA1851
Chipset: Intel Z890
CPU compatibility: Intel Core Ultra 200 desktop
Form factor: ATX
Memory support: DDR5-4800 to DDR5-9200+ (OC), up to 256 GB
Storage: 4x M.2, 4x SATA
USB (rear): 1x Thunderbolt 4 Type-C 40 Gbps, 2x USB 3.1 Type-A 10 Gbps, 3x USB 3.0 Type-A 5 Gbps, 4x USB 2.0
Display: 1x Thunderbolt 4, 1x DisplayPort
Networking: Realtek 2.5G LAN, Wi-Fi 7
Audio: Realtek ALC1220
Price: $270 | £280 | AU$490

This particular board’s party piece is its glorious white colour scheme, which even extends to the PCIe and memory slots, as well as small details on the PCB such as the POST code display and RGB headers. Really nice touches, Gigabyte.

Considering this board leaves you with plenty of change from $300, aesthetics and features-wise it doesn’t seem to be a bad deal at all. It looks beautiful thanks to Gigabyte’s attention to detail with the white theme, although it’s a shame there’s not a touch more RGB lighting to really make it pop.

As it stands, there are only two small illuminated areas above the I/O panel and below the chipset heatsink, although it has 3-pin and 4-pin RGB headers to add your own accessories too. Spending over $300 will likely get you a few more fan headers, but the six included here are ample for the majority of PC owners.

Despite having been released recently and carrying Intel’s flagship chipset, the board only has one PCIe 5.0 M.2 port, while its AMD equivalent has three. However, only one of those won’t steal lanes from your graphics card, so it’s somewhat of a moot point. Both boards have four SATA ports, which are a dying breed as many similarly priced AMD B850 boards have cut these to just two.

Image gallery (1-2): Gigabyte Z890 Aorus Elite WiFi 7 Ice on a light desk with a white background and SSD covers removed. (Image credit: Future)

Gigabyte’s tool-free features are everywhere, from the push-button graphics card release of PCIe EZ-Latch Plus to its equivalent on the M.2 ports and heatsinks, and even a quick-connecting Wi-Fi antenna, all cutting the time it takes to jump into your favourite game.

Features such as a POST code display and power and reset buttons are usually reserved for pricier boards but Gigabyte has managed to include them.

The I/O panel has a decent nine Type-A USB ports too, although there’s only one Type-C Thunderbolt 4 port, while most USB4-equipped AMD boards have two of those for a little extra flexibility. Thunderbolt 4 hubs exist if you want to expand this, but they’re usually expensive.

Image gallery (1-2): Gigabyte Z890 Aorus Elite WiFi 7 Ice on a light desk with a white background and SSD covers removed. (Image credit: Future)

Gigabyte’s EFI also gets the white theme, so the company deserves plenty of credit for going to these lengths. It’s not quite as modern as Asus’ or MSI’s recently updated efforts, but it’s still easy to find your way around and get at useful settings such as the XMP memory profile, boot order or fan control, with the latter having an excellent suite in the EFI as well as in Windows using Gigabyte’s free software.

You’ll have to take our performance comparisons as you find them, as there have been so many performance-fixing updates for Intel’s Core Ultra CPUs over the last few weeks that making direct comparisons with boards tested back at launch is rather tricky.

PC Gamer test bench
CPU: Intel Core Ultra 9 285K | Cooler: Asus ROG Ryujin III 360 ARGB Extreme | RAM: 32 GB Corsair Vengeance Pro DDR5-6000 | Storage: 2 TB Corsair MP700 | PSU: MSI MAG A850GL 850 W | OS: Windows 11 24H2 | Chassis: Open platform | Monitor: Dell U2415

Temperatures don’t change that much, though, and for a Z890 board dealing with our Core Ultra 9 285K, it managed to knock a couple of degrees off the temperatures we saw with the Asus ROG Maximus Z890 Hero and MSI MEG Z890 Ace, at a peak of 50°C. Its peak PCIe 5.0 SSD temperature was a fair bit warmer than the Asus board’s, but at 72°C it sat comfortably away from the danger zone, unlike the 77°C recorded by the MSI MEG Z890 Ace.

Performance seems to have improved compared to our earlier Z890 reviews, but this isn’t down to Gigabyte so much as the plethora of fixes from Intel and Microsoft. Gaming performance was higher overall, sometimes by large amounts, and it also saw smaller uplifts in Cinebench, Blender and 7-Zip, each time taking the performance crown.

The combination of BIOS updates, Windows 11 24H2 and the latest Windows updates all contribute here, and there are more fixes to come. Power consumption was largely in line with other Z890 boards we’ve tested, though, with a 91 W CPU package power reading in our game test and a peak of 239 W in Cinebench, both sitting within 10 W of other boards.

Buy if...

✅ You want the latest features without breaking the bank: Thunderbolt 4, Wi-Fi 7, PCIe 5.0 SSD and graphics card support are all here as are plenty of new tool-free features and an all-white design that even extends to the EFI.

Don't buy if...

❌ You’re happy to consider AMD: Intel’s CPUs have let the side down and as well as Ryzen offering better performance, especially in games, this board’s AMD equivalent has a few more features and is slightly better value.

If Intel’s Core Ultra 200 series had knocked it out of the park, or even offered decent gains over its predecessor across the board and given AMD something to think about, this review would have gone very differently. As it stands, though, it’s tricky to recommend anything Z890-related based on our initial findings, but the benchmarks here do suggest the situation for Arrow Lake has improved, sometimes significantly.

The trouble is, if you're building a gaming PC, there’s only one CPU you should be pairing a $300 motherboard with and that’s AMD’s Ryzen 7 9800X3D. In fact, it’s so good that it’s outsold all of AMD’s other Ryzen 9000 CPUs combined. However, you’re probably here because you want to buy a Z890 board for some reason and not anything from AMD.

Gigabyte has taken the white theme to the extreme, the board has better port options in some areas than some AMD B850 boards we’ve seen, and Gigabyte has shoehorned in way more features than we expected given there’s actually change from $300. Its AMD equivalent does offer a couple of extra bits for about the same price, but if three hundred bucks is your limit and you want a white Z890 motherboard, this is easily the best option out there.

]]>
https://www.pcgamer.com/hardware/motherboards/gigabyte-z890-aorus-elite-wifi7-ice-review/ mB8UcbS2Q7huvrfoGCbNjh Tue, 14 Jan 2025 19:00:20 +0000
<![CDATA[ Gigabyte X870 Aorus Elite WiFi7 Ice review ]]> As I wrote this, AMD and its motherboard partners had just launched cheaper chipset options for its Socket AM5 platform, with high expectations for the new B850 chipset in particular given how popular previous B-series options were. But there are still plenty of options below $300 even if you opt for X870 for your AMD system, and the Gigabyte X870 Aorus Elite WiFi7 Ice even includes a white colour scheme if you’re looking to build that super clean PC.

A quick glance at the latest B850 boards sees quite a few retail for more than the Gigabyte X870 Aorus Elite WiFi7 Ice, while lacking features such as USB4 and opting for budget audio codecs. A quick glance over this board's PCB, meanwhile, leaves us pleasantly surprised by the features on offer considering there’s a two at the beginning of the price tag.

The X870 chipset is all about making features such as USB4 and PCIe 5.0 for SSDs and graphics cards mainstream, but there seems to be enough budget left over to add plenty of premium options in other areas too.

At nearly $300, it’s still a lot for a motherboard, but the Gigabyte X870 Aorus Elite WiFi7 Ice would certainly be capable of running a high-end water-cooled PC if you wanted to and not feel dated even through 2027 when AMD has stated Socket AM5 could start to wind down.

Aorus Elite WiFi7 Ice specs

Gigabyte's X870 Aorus Elite WiFi7 Ice motherboard on a light desk with white background.

(Image credit: Future)

Socket: AMD Socket AM5
Chipset: AMD X870
CPU compatibility: AMD Ryzen 7000/8000/9000 desktop
Form factor: ATX
Memory support: DDR5-4800 to DDR5-8200 (OC), up to 256 GB
Storage: 4x M.2, 4x SATA
USB (rear): 2x USB 4 Type-C 40 Gbps, 2x USB 3.1 Type-A 10 Gbps, 4x USB 3.0 Type-A 5 Gbps, 4x USB 2.0
Display: 1x HDMI 2.1, 2x DisplayPort (via USB 4 Type-C)
Networking: Realtek 2.5G LAN, Wi-Fi 7
Audio: Realtek ALC1220
Price: $290 | £277 | AU$531

Apart from the CPU socket and chipset, the Gigabyte X870 Aorus Elite WiFi7 Ice has a nearly identical sibling in the Intel Z890 camp, so making comparisons is interesting. You’re definitely getting the better deal with AMD despite a little less RGB lighting: there are a few more ports and power delivery phases, and the Z890 option lacks the 2-pin thermal probe headers included on this board that allow you to hook up coolant probes in water-cooling systems or fine-tune your case’s cooling.

You get more Type-C ports on the I/O panel here too, albeit in USB4 guise rather than Thunderbolt 4, but here two is mostly better than one. There are more PCIe 5.0 M.2 ports as well, though you probably want to stick with the top one as the other two will steal lanes from your graphics card. It’s great to see features such as power and reset buttons and a POST code display on a board at this price too, as they just make it easier when testing, benchmarking or troubleshooting.

If you’re wondering what the HDMI port is on the PCB, it’s for external screens. You might have seen those included with cases such as the Y70 Touch from Hyte, where you have to wire the screen to a port on the rear of your PC. With this port, you can run the cable inside the case, which is less complicated, although it’s worth noting it only supports up to 1,920 x 1,080 at 30 Hz.

Image gallery (1-2): Gigabyte's X870 Aorus Elite WiFi7 Ice motherboard on a light desk with white background. (Image credit: Future)

This area has another oddity too. There are no fan headers whatsoever in the middle of the PCB, with six located at the bottom and another two up top. This might make it tricky to power large numbers of rear or roof fans in addition to a CPU cooler without extension cables.

All of Gigabyte’s tool-free features are here, with a push-button graphics card release and a latch securing the M.2 heatsinks coming under the EZ-Latch Plus banner, EZ-Latch Click offering a screwdriver-less way of installing SSDs, and a single-piece Wi-Fi connector too. The upper M.2 heatsink is suitably chunky and cools both sides of your SSD with thermal pads, while just the top is cooled on the larger lower heatsink.

Image gallery (1-2): A Gigabyte Aorus Elite motherboard with the SSD heatsinks removed. (Image credit: Future)

Gigabyte’s EFI has also received a coat of white paint, so it really has gone to town with its colour matching. The EFI itself looks pleasant enough and is well laid out, featuring a superb fan control suite too, but it maybe looks a little less modern than MSI’s or Asus’, both having recently upgraded theirs. Installing motherboard software is usually risky, but Gigabyte’s Control Center actually offers some decent features, especially with fan control, that might be worth checking out.

PC Gamer test bench
CPU: AMD Ryzen 9 9900X | Cooler: Asus ROG Ryujin III 360 ARGB Extreme | RAM: 32 GB Corsair Vengeance Pro DDR5-6000 | Storage: 2 TB Corsair MP700 | PSU: MSI MAG A850GL 850 W | OS: Windows 11 24H2 | Chassis: Open platform | Monitor: Dell U2415

Gigabyte seems to have done an excellent job in the cooling department, with the VRMs not topping 44°C in our stress test and the chipset not rising above 35°C either. The M.2 temperature was exceptionally low for such an affordable board too, with our toasty PCIe 5.0 SSD only reaching 66°C, which was also 6°C cooler than its Z890 sibling, while matching the pricier Gigabyte X870E Aorus Pro.

Buy if...

✅ You want a white motherboard for a modern system at a reasonable price: For less than $300 you’re getting a sizable set of features that includes the usual stuff, but also more modern items such as Wi-Fi 7, USB4 and PCIe 5.0

Don't buy if...

❌ You want to run multiple PCIe 5.0 SSDs: While it has plenty of M.2 ports and most support PCIe Gen 5, only one of those won’t steal PCIe lanes from your graphics card

Its average CPU temperature in games was surprisingly low too, though overall it was a touch slower than other Socket AM5 boards we’ve tested. It wasn’t across the board, though, as Factorio saw better results and Baldur’s Gate 3 was much more in line with other AMD benchmarks from the same test system. Outside of games, performance varied by tiny amounts that aren’t really worth commenting on, except for Blender, where it was a little quicker. Power consumption was typical too and stayed within single digits of other Socket AM5 boards we’ve tested with the same CPU.

Apart from a slight hiccup in its layout, with the top and middle of the PCB almost devoid of fan headers, the Gigabyte X870 Aorus Elite WiFi7 Ice offers a decent amount of features both new and old while leaving you with change from $300. That’s still not a small amount, but with good cooling for SSDs and VRMs, plenty of USB ports and tool-free features helping to speed up your build time, we’d actually go as far as saying it represents good value, which is rare for a motherboard these days.

You might have to make good use of long fan cables or extensions if you plan on connecting lots of fans at the top of the case to the motherboard, and beware of its lane-stealing M.2 ports: stick to the top slot.

Other than these minor gripes that are likely to impact a niche few, the Gigabyte X870 Aorus Elite WiFi7 Ice looks fantastic, offers a decent, balanced set of features and shouldn’t leave anyone wanting, even if you still own it long past Socket AM5’s final days.

]]>
https://www.pcgamer.com/hardware/motherboards/gigabyte-x870-aorus-elite-wifi7-ice-review/ sJjsNK5qQ6xsUNdZ3YGPxa Tue, 14 Jan 2025 18:00:00 +0000
<![CDATA[ Logitech G RS Shifter and Handbrake review ]]> Flying down the Wellington Straight at Silverstone, the roar of the engine is bested only by the mechanical clunk of the sequential shifter as I drop the gears down for Brooklands. Sim racing is made much better by a dedicated shifter. In this instance, the Logitech G RS Shifter and Handbrake.

This accessory has been a long time coming—I've been waiting on something like this since 2022, when Logitech first launched the Pro Racing Wheel. At the time it was a very impressive wheel and pedal set, but it was missing any sort of handbrake or shifter accessory. It also lacked any additional wheels for the wheel base. That's a bit of a whiff for an ecosystem claiming 'pro' status, even if it is possible to use another brand's shifter via USB to your PC.

Logitech amended its lineup late last year to include the RS Shifter and Handbrake and the new RS Wheel Hub with new wheel designs. This review only covers the shifter—I have more to say on the wheels that I'll save for a separate review.

The main thing is we're seeing the ecosystem for Logitech's Pro Racing Wheel expand into something a little more competitive with midrange offerings from the likes of Fanatec (now owned by Corsair), Thrustmaster, and Moza.

RS Shifter and Handbrake specs

A Logitech G RS Shifter and Handbrake on a desk.

(Image credit: Future)

Connectivity: USB Type-A (fixed)
Dimensions: 114 x 136 mm (L x W)
Mounting: Desk clamp (included) or fixed to mounting plate via bolts
Price: $150 | £130

The RS Shifter and Handbrake is made up of a tall base unit measuring around 15 cm (6 inches), a metal handle, and a USB Type-A cable protruding out the rear. This cable is attached to the unit and non-removable, so be careful not to chew it up in your brake pedal.

The RS Shifter and Handbrake can be used as either a shifter or handbrake. If you require both functions for your sim setup, you'll need to buy two. It's easy to swap between shifting and braking modes between races or different games, at least: the included metal handle screws into one of two threads on the unit to switch between modes. Though you wouldn't want to be doing this too regularly. There's also a small switch to toggle between handbrake, analogue handbrake, and shifting.

The thread in the middle marked in blue is for shifting, which offers a full sweep both towards and away from you. Both directions end in a satisfying and substantial clunk. It's a heavy shifter, not like your average road car manual gearbox, though when you're racing with a load-cell brake and 11 Nm or more of torque it is suitably weighty to match.

I measured about 1.5 cm, or just over half an inch, of travel to each shift—it feels like a pretty long shift when racing but not overly so. While racing around Silverstone in Assetto Corsa or Spa-Francorchamps in Forza Motorsport, each shift adds a level of engagement and realism that makes both games so much more enjoyable. As an upgrade to my sim rig, it's a very noticeable improvement and I find myself wanting to sit down and race more often to experience it.

Image gallery (1-2): A Logitech G RS Shifter and Handbrake on a desk. (Image credit: Future)

It has also helped to keep shifting front and centre in my mind as I've raced around each track. Learning where to shift and what gear to shift into is a part of mastering any run, and I feel myself keeping a close eye on when I am upshifting or downshifting, and how many times, with this installed. But it can be both a benefit and a hindrance to your lap times. It's more punishing for the times when you mess up and have to wrestle the wheel with one hand occupied, though a lap is more entertaining for this extra jeopardy.

This doesn't serve up a traditional H-pattern manual gearbox with all of its quirks and conveniences, however. That's a bit of a shame, though PC users need not worry about compatibility—a large number of H-pattern shifters on the market will plug directly into your PC, bypassing any need to buy Logitech's own gear.

The RS Shifter and Handbrake is reportedly able to connect via the Pro Racing Wheel's rear USB Type-A ports or a direct USB connection to a gaming PC. However, I was only able to use the RS Shifter and Handbrake through connecting it to my PC directly, via the included Type-A USB cable. That works just fine in games—just set up the input as if the shifter is a separate device to the wheel—but it does mean there's one extra USB cable running between my rig and PC. I tried every port, multiple cables, various software updates, and all manner of combinations to get it up and running but it's yet to work.

(Image credit: Future)

Onto the braking function. This will appeal to players of rally games who don't want to go careening into a haybale or ditch when racing up to a hairpin. The unmarked thread on the unit is used for braking, which sits right at the top of the action and allows for roughly 3 cm or 1 ¼ inches of travel when engaged. The handbrake will sweep in a smooth motion to a spongy stop. It's not as mechanically satisfying as the sequential shifting but it's just about all you really need out of a handbrake accessory.

Buy if...

✅ You haven't found your 'one true racing game' yet: If you dabble in a range of games, from rally to racing, you will benefit from the flexibility of both handbrake and shifter in a single package.

✅ You are tight on desk space: the entire shifter doesn't take up much room, even with the included clamp installed. It's around 10 cm (4 inches) wide.

Don't buy if...

❌ You want the full manual shifting experience: This isn't an H-pattern shifter and lacks that extra depth of experience. It's still satisfying, though.

❌ You want the best handbrake for the money: The handbrake works great on the RS Shifter and Handbrake, but a dedicated handbrake might offer more feedback.

The RS Shifter and Handbrake will set you back $150/£130. That's not a terrible price for this level of quality—I'm yet to detect any sort of rattle during use. It gets a little less affordable if you feel you need two of these things. I'm happy with one in every racing game I've tested it in so far: the shifter for racing games and the handbrake for rally games.

It also comes with a table clamp that's comprised of thick plastic and feels pretty robust. It stayed put during my testing with the Secretlab Magnus Pro, which is only an inch thick, though it opens to about 6.5 cm or 2 1/2 inches.

Using the shifter on the Playseat Trophy required a mounting plate that's sold separately, which set me back £39. The shifter now sits happily atop it with room for one more besides, so I'll chalk that up to a good investment.

Weighing up the whole package, the versatility and build quality of the RS Shifter and Handbrake get my nod of approval. It might not be for everyone for that lack of H-pattern, and if you wanted to go more premium, you absolutely could, but it delivers on its core promise of convenient shifting and braking that should appeal to the sim racing player that hasn't yet found their 'one true racing game'.

]]>
https://www.pcgamer.com/hardware/logitech-g-rs-shifter-and-handbrake-review/ 9ARR4bQcehidHCGAUnvp7e Tue, 14 Jan 2025 16:59:16 +0000
<![CDATA[ Big AI beasts reportedly delay Nvidia Blackwell orders due to GPU overheating but it doesn't worry us for RTX 50 gaming cards ]]> Nvidia's biggest customers for AI chips, including Microsoft, Amazon, Google and Meta, have reportedly cut their orders for the new Blackwell series of GPUs due to overheating issues. However, that doesn't worry us for Nvidia's new RTX 50 family of gaming GPUs, despite them also being based on the Blackwell architecture.

The Information (paywall, via Reuters) claims that overheating and other related "glitches" have caused customers to delay Blackwell orders or switch to Nvidia's earlier and presumably less problematic "Hopper" generation of AI-optimized GPUs.

Back in October, Nvidia's head honcho Jensen Huang admitted that the new Blackwell series of AI GPUs had "design flaws" which were leading to delays in shipping AI racks to customers.

"The design flaw caused the yield to be low. It was 100% Nvidia's fault," Huang said. Now, those comments didn't directly refer to any overheating, instead talking about yields, which typically entails the number of functional chips that can be harvested from a larger wafer containing many GPU dies.

So, those flaws could be completely separate from whatever is causing the alleged overheating. Or they could be related on some level.

Indeed, overheating Blackwell GPU stories have been circulating since last November, with The Information reporting that Nvidia liquid-cooled racks containing 72 Blackwell GPUs had been redesigned several times in an attempt to solve the issue.

Whatever, what we can say for sure is that there's no reason to assume that any issues with Nvidia's AI GPUs will translate into problems for its new RTX 50 family of gaming chips just announced at CES.

Your next upgrade

Nvidia RTX 4070 and RTX 3080 Founders Edition graphics cards

(Image credit: Future)

Best CPU for gaming: The top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game ahead of the rest.

Yes, those too are based on the Blackwell architecture and indeed built on the same TSMC N4 silicon. However, in terms of actual layout and also the number and balance of functional units, the gaming chips are totally different.

Admittedly, it's possible there's a problem, say, with the design of Blackwell's Tensor cores that might also map to the gaming GPUs. But the odds of that are probably pretty slim. Moreover, the workloads and software that run on a gaming chip are completely different to training or inferencing an AI model.

It's also worth noting that it's the same source, namely The Information, pushing this overheating narrative again. Some wider confirmation would add weight to the story. Still, by Nvidia's own account, there have been problems with Blackwell.

We'll know soon enough just how fast, and indeed how hot, Nvidia's new RTX 50 gaming GPUs really are. Watch this space for our full reviews.

]]>
https://www.pcgamer.com/hardware/graphics-cards/big-ai-beasts-reportedly-delay-nvidia-blackwell-orders-due-to-gpu-overheating-but-it-doesnt-worry-us-for-rtx-50-gaming-cards/ m8suBxgdCeJgLumrXki5dZ Tue, 14 Jan 2025 16:48:08 +0000
<![CDATA[ OBSBot Meet 2 review ]]> I'm going to be honest here, I don't think I've ever quite understood why the average person would be interested in a 4K webcam for 4K prices.

I think there's a real purpose if you're a streamer and want the absolute best clarity but, with the likes of Google Meet limiting video calls to just 1080p, why I would want to pay a couple hundred extra bucks for niche use cases is beyond me.

Fortunately, the OBSBot Meet 2 with its solid specs, cute size, and very reasonable price aims to fix that problem. And give you an excellent everyday webcam in the process.

Starting off with the most noticeable thing, this webcam is absolutely adorable. At just under an inch tall and under two inches wide, it's one of the smallest webcams I've used. It's only about 70% of the size of the Logitech Brio 100 I'd previously used on my setup and not quite as cumbersome either. With grey, white, and green options available, they all look great but the almost seafoam shade of green is my favourite.

OBSBot Meet 2 specs

The Aurora Green OBSBot Meet 2 4K webcam

(Image credit: Future)

Supported resolutions: 4K at 30 fps, 1080p at 60 fps
Field of view: 79.4° (diagonal), 67.2° (horizontal)
Sensor: ½-inch CMOS sensor
Connection: USB Type-C
Dimensions: 45 x 36 x 22.2 mm
Weight: 40.5 g
Price: $179 | £179

The lens of this webcam takes up the majority of the space of the device. Apart from a small OBSBot logo up the top left, it lacks any distinct branding. To set this thing up, you simply have to plug a USB-C cable into the receiver on the back and connect the included stand to your monitor.

The stand is the weakest part of the package. Instead of screwing in via a thread, the stand has a small magnetic pull that keeps the webcam in place.

This feels like a bit of an oversight, as even a slightly thick USB-C cable has enough leverage to constantly knock the webcam off the stand. It clips on fine initially but lacks the strength to hold on well, and as the stand itself is fairly small, shifting the webcam's weight will often knock the entire thing off.

One night, the stand and webcam came off entirely and the Meet 2 took a nasty dent as a result. It's only small but noticeable. If you plan on using this webcam, you will need a slim USB-C cable, preferably with tidy wire management so it doesn't dangle and get caught anywhere, or a stand to screw it into. Luckily, it has threading at the bottom so it works well once you do.

Image gallery (1-4): The Aurora Green OBSBot Meet 2 4K webcam. (Image credit: Future)

If you are worried about privacy, it comes with a handy magnetic cover. The webcam is so tiny that my, admittedly rather large, fingers occasionally move the entire webcam instead of removing the cover. But it's a smart feature that becomes very natural after only a few days with the cam.

Having a physical cover is one of those little peace-of-mind accessories that adds a lot, despite being a very small touch. It does mean there's a chance you can lose it as you can take it off entirely but, if you're careful with your tech, holding onto the cover shouldn't be a problem. Once set up on the desk, this tiny little webcam is an absolute joy for the price.

The half-inch CMOS sensor is the same one you'll find in OBSBot's more expensive Tiny 2 Lite webcam, and smaller than the CMOS sensor in the OBSBot Tiny 2. The 48 MP sensor has an aperture of f/1.8 with an FOV of 79.4°, which makes it very similar to the much larger and more expensive Tiny 2, and the image quality is pretty great as a result. In video, it can capture up to 4K at 30 fps and down to 1080p and below at 60 fps.

Image gallery (1-2): The OBSBot software taking an image with HDR off and then with HDR on; each shows a screenshot of a bedroom with a person in the centre of frame and the HDR toggle visible in the UI on the right. (Image credit: Future)

Though it supports HDR, I find the results to be a little inconsistent. In low or medium lighting, illuminated by the brightness of a monitor or mood lighting, the HDR mode felt crisper and dealt with the contrast of colours better, but it tended to look a little more washed out in good lighting. Luckily, you can access easy HDR controls from the OBSBot Center app, which also serves to get your webcam the latest firmware updates. This software is great, being not only intuitive but free of any hiccups or stalls.

In the software, the Console tab can be used to adjust framing as well as give a software approximation of a few lens types (wide, medium and narrow). You can also turn on auto framing, one of the many AI-powered functions, and it works fine to keep the body in the frame and focused, though the camera's limited size and FOV mean it won't keep tracking for very long if you move to the left or right. From using the two, this section of the software hub is far more impressive with the Tiny 2 Lite than with the Meet 2, though the former is a more expensive webcam choice.

Image gallery (1-5): The Aurora Green OBSBot Meet 2 4K webcam. (Image credit: Future)

Focusing in particular is great on this webcam, adjusting to items in frame and in the distance in a matter of moments. In the image tab, you can further change how focus works, turn HDR on and off, and customise the image with contrast, sharpness, saturation and so on. These all work together to further hone a webcam that, from the very start, puts out high-quality video that is dynamic and adjustable. Naturally, in quite dark areas the webcam does become a tad grainy and it won't work wonders with almost no light but, if you like to play games in a Discord call, it can illuminate your smiling face as you taunt your enemies (sorry, friends).

The Meet 2 has a few extra AI features, like the ability to change auto framing by holding your hand up and zooming in and out by holding your hand in the shape of an L. It feels a tad gimmicky, especially when you have to explain what you're doing in a call while the software catches up with you, but it mostly works fine. The same is true of the built-in beauty filters, which lightly pull in the chin or brighten the skin. They feel too obviously like a filter for me to get any genuine use out of them, but they're not necessarily distracting.

Image gallery (1-2): The Aurora Green OBSBot Meet 2 4K webcam. (Image credit: Future)
Buy if…

✅ You need a small webcam: The Meet 2 can fit into your palm with ease and will comfortably sit on just about any size screen. It's both cute and excellent for a light setup.

✅ You like to fiddle with your video settings: The OBSBot Center app is a nice and easy bit of software that gets a lot out of this webcam.

✅ You're looking for 4K on a budget: This not only does 4K quality but it does it well, and at a competitive price point.

Don't buy if…

❌ You don't have a stand: This comes with a magnetic stand, but a slightly thick USB-C cable or a wobble in your desk will send it hurtling down.

❌ You don't want a dedicated mic: The mic in this webcam is passable but you probably want something better to match that great quality.

❌ You want to be tracked around your room: Being stationary and with a just okay FOV, this can track you at your desk but won't be able to follow you much further.

If you do plan on using this webcam for games, you will want to use your own mic because this one sounds a bit muffled and far away. Part of the problem with a webcam of this size and price point is it has to make cuts in other areas, and this mic's quality is fine for a quick chat but unsustainable for long-term use without annoying your enemies (sorry, colleagues).

Other than this, video quality is great and clear in both 4K and 1080p, picking up the overhead light well while piecing individual strands of hair and movement together. The screen doesn't feel too cluttered with movement even at 30 fps, with very little visual artifacting or blur to its output. The software only ever feels like a boon to it, which is surprising from something that boasts "AI features".

From my time with the OBSBot Meet 2, I still don't fully get the need for a 4K webcam. But this is priced well enough that it can outpace many 1080p choices in the right places and still give you the option to bring up that quality if you plan on creating high-quality streams or video. That adorable aesthetic also serves a practical use case for anyone with a particularly small setup, and it can even work with a laptop. You may need a mic and a better stand for the best experience but, once that's sorted, it's hard to beat this kind of quality and aesthetic at its price point.

]]>
https://www.pcgamer.com/hardware/webcams/obsbot-meet-2-review/ RX4McbSNYLZLytDaNG9mrk Tue, 14 Jan 2025 16:14:43 +0000
<![CDATA[ It turns out NFTs of tennis balls that sold for $3 million aren't worth much after all and I just died of not-surprise ]]> Back in 2022, Tennis Australia's forward-thinking management released 6,776 images of tennis balls as digital NFTs. Each one sold for 0.067 of the ethereum cryptocurrency (ETH), about $278 AUD at the time. Those same NFTs are now reportedly trading for as little as 0.003 ETH, or $15 AUD, on OpenSea, the self-described "world’s first and largest digital marketplace for crypto collectibles and non-fungible tokens (NFTs)". Ouch.

To add a little detail and context, the NFTs were linked to 19 cm by 19 cm plots on the courts at the Australian Open in Melbourne. At the time, Tennis Australia reportedly promised to update the metadata on the NFTs whenever a winning shot during a match landed on a given plot.

What's more, Tennis Australia pitched the whole thing as being akin to an airline frequent flyers program, offering ground passes for finals weeks for NFT owners, so-called behind the scenes access along with tickets to matches the following year if their NFT court plot was linked to a match point. Oh, and a Discord channel for NFT owners was set up.

In 2023, Tennis Australia released a further 2,545 NFTs, again digitally depicting Australian Open tennis balls and linked to court plots, despite controversy springing from the perception of volatile cryptocurrency markets. When you add the two tranches of NFTs together, they sold for around $3 million AUD. Nice work if you can get it.

At the time, Ridley Plummer, senior manager of metaverse, NFTs, web3 and cryptocurrency at Tennis Australia, said the organization was committed to NFTs for the long term.


“We shouldn’t just put down our tools and walk away because the market’s having its challenges. There’s obviously a ton of external factors that come into play when you’re exploring a new technology like web3 and NFTs, and when you’re an innovative company like Tennis Australia and the AO there’s obviously challenges and rewards that come with that as well,” Plummer said.

In 2024, it's thought Tennis Australia did not issue any further NFTs, though existing owners were given ground passes. For this year's tournament, currently running in Melbourne, it seems that the Australian Open isn't mentioning the NFT scheme at all or offering ground passes. The Guardian says that the Discord server has been shut down, the associated websites are "dormant" and that Tennis Australia isn't responding to "multiple requests for comment."

Thus it does rather seem that Tennis Australia and the Australian Open would rather the whole thing just disappeared. All we can say is that it doesn't seem surprising that something of little apparent value has indeed turned out to have little actual value, though in the interest of full transparency, this author doesn't get NFTs. At all.

]]>
https://www.pcgamer.com/hardware/it-turns-out-nfts-of-tennis-balls-that-sold-for-usd3-million-arent-worth-much-after-all-and-i-just-died-of-not-surprise/ 2sox8HRgmkA8N97wCuUGon Tue, 14 Jan 2025 15:23:58 +0000
<![CDATA[ This huge monitor is ultrawide, curved, OLED, and pretty much every other monitor tech you need and I'd be tempted at $200 off ]]>

Innocn 49Q1R | 49-inch | 5120 x 1440p | 144 Hz | OLED | 1800R curved | $999.99 $799.99 at Amazon (save $200)
Though the sale on Amazon says this monitor retails for $999.99, Innocn's site says it retails at $1,199.99, so you might actually be getting an even better deal than we thought. Either way, getting an OLED 1440p 49-inch ultrawide monitor for this much is super solid, and it's pretty much the best deal on a monitor at this price point.

Price check: Innocn $999.99

I've been pining after a really nice second monitor to replace a pretty mediocre one for about a month, but part of me thinks I may just be better off with one huge curved screen to rule them all, or at least to do my daily work and gaming on. With over $200 off, Innocn might just have the best-value choice in its price range.

Over at Amazon right now, you can get the Innocn 49Q1R for $800, which is $200 cheaper than its retail price of $1,000. It has many standout features, so let's start at the top of the list. This thing has an OLED panel with a 5120 x 1440 resolution. OLED pixels emit their own light, as opposed to LCD panels that rely on an LED backlight, which makes them a more expensive bit of tech but also gives deeper blacks and a sharper contrast between colors.

This, paired with the 5120 x 1440 resolution (effectively two 2560 x 1440 panels side by side), means it should look crisp and sharp. With a refresh rate of 144 Hz and a super-fast 0.03 ms response time, it should also be plenty snappy for gaming.

The 1800R curvature of the screen helps the whole thing feel a bit more immersive, which combines with the major selling point: an ultrawide monitor allows you to cram more onto the screen, which is not only great for productivity but gives a new dimension to the games you play. RPGs with strategy elements like Baldur's Gate 3 and 4X games like Civilization 6 work particularly well, as they give you more information to look at, but so do first-person shooters like Cyberpunk 2077.

The one major thing you have to watch out for with ultrawides is some games (especially older ones) don't always support the unique aspect ratio. There are programs like Flawless Widescreen, which are designed to run older titles, but it's still something you have to consider in a monitor like this.

Though we haven't tested this specific monitor, Dave's Innocn 40C1R Ultrawide review rates the value of its panel, as well as the big screen, and Type-C input. It needed a little customizing out of the box to get the best picture and has "questionable chassis quality" but otherwise looks great.

At 250 nits of brightness, this screen won't be hugely bright but will perform adequately in most lighting. Maybe you'll have a bit of trouble with it facing a window on a sunny day, but I can't think of a good reason to set it up that way.

If you don't care about your monitor being ultrawide or curved, you can get a good deal on a 4K monitor in the Gigabyte M28U at just $400 but you pay more for the two former attributes and may not even have the rig to really justify a 4K panel. Ultimately, though the price of this Innocn is solid, it's still enthusiast pricing and worth noting before you buy.

However, if this sounds like your thing, and you're looking for a real wide boy to slap on your desk, this maximises price to quality to a degree that is worth paying attention to. I've been thinking of making the switch and this price point is mighty tempting. I can finally fit the entire Baldur's Gate 3 party on a single screen.

]]>
https://www.pcgamer.com/hardware/gaming-monitors/this-huge-monitor-is-ultrawide-curved-oled-and-pretty-much-every-other-monitor-tech-you-need-and-id-be-tempted-at-usd200-off/ SQzZWhvypuEsWwMz3rEjXd Tue, 14 Jan 2025 15:12:37 +0000
<![CDATA[ AMD says it took four goes to get its new Strix Halo uber APU right and that included designing new CPU dies that 'put Threadripper in the palm of your hands' ]]> AMD's new Strix Halo uber APU for laptops was already pretty interesting, what with its 256-bit memory bus and monster sized iGPU. Now it turns out that its gestation was a little unusual, with AMD needing four goes at it to get it right and adding some trick tech to its CPU dies in the process.

In an interview with website Chips and Cheese, AMD Senior Fellow Mahesh Subramony revealed some new details about Strix Halo's inner workings. Subramony says AMD "took four iterations" to get Strix Halo right.

That's perhaps not a huge surprise, given Strix Halo had been rumoured for some time and arrived a little later than initial expectations. What is news is that Strix Halo's CPU CCD dies might not be exactly what you expected.

When the APU was first revealed, it looked like AMD had taken a pair of its eight-core Zen 5 CPU CCD dies and crammed them into a package with a new I/O die containing that huge (for an APU) 40 CU iGPU.

Well, that's not the case. Strix Halo has its very own CPU CCDs. They're still Zen 5 based, but AMD has tweaked the CCDs to suit Strix Halo's mobile remit.

For starters, they have a new interconnect. Subramony says the existing interconnect AMD uses between the CCDs in its desktop Zen 5 chips, like the Ryzen 9 9950X, is fast but has limitations when it comes to power efficiency, particularly the range of power states it supports.

The new interconnect for Strix Halo is said to be better in every way. "Low power, same high bandwidth, 32 bytes per cycle in both directions, lower latency," Subramony explains. He also says that switching power states is now "almost instant".

The downside? It's a little more expensive to fabricate than the desktop interconnect. However, Subramony also says that Strix Halo is a full-feature Zen 5 implementation, including the 512-bit FPU.

"I almost joke about it saying it's a Threadripper to put in the palm of your hands. So we didn't pull any punches. These have the 512 bit data path. It is a full desktop architecture," he says.

The only exception to that is clockspeed. "We have binned the parts for efficiency. So it might not hit the peak frequency that you would see on the desktop," Subramony explains.


He also says that the 32 MB of Infinity Cache on the GPU die currently can't be directly accessed by the CPU; it's reserved for the GPU, though that might change in future. "We change that with a flip of a bit but we don't see an application right now where we need to amplify CPU bandwidth," he says.

There are further details about Strix Halo's inner workings in the interview. But suffice to say that what was already one of the most interesting chips in recent years just got a bit more intriguing.

The effort AMD has clearly put into Strix Halo also bodes well for its performance and battery life. If anything, it was the latter that was the greatest unknown with Strix Halo. Could AMD really cram 16 Zen 5 cores and a huge GPU into a power-efficient package?

I was doubtful, for sure. But after learning more about the technology AMD has put into Strix Halo, I can't wait to see just how good AMD's uber APU really is.

]]>
https://www.pcgamer.com/hardware/processors/amd-says-it-took-four-goes-to-get-its-new-strix-halo-uber-apu-right-and-that-included-designing-new-cpu-dies-that-put-threadripper-in-the-palm-of-your-hands/ TKnWPWBCFfKjr2zxvfxkVk Tue, 14 Jan 2025 11:53:15 +0000
<![CDATA[ Blasting AI into the past: Modders get Llama AI working on an old Windows 98 PC ]]> Remember when you were young, your responsibilities were far fewer, and you were still at least a little hopeful about the future potential of tech? Anyway! In our present moment, nothing appears to be safe from the sticky fingers of so-called AI—and that includes nostalgic hardware of yesteryear.

Exo Labs, an outfit with the mission statement of democratising access to AI, such as large language models, has lifted the lid on its latest project: a modified version of Meta's Llama 2 running on a Windows 98 Pentium II machine (via Hackaday). Though not the latest Llama model, it's no less head-turning—even for me, a frequent AI-naysayer.

To be fair, when it comes to big tech's hold over AI, Exo Labs and I seem to be of a similarly wary mind. So, setting aside my own AI-scepticism for the moment, this is undoubtedly an impressive project chiefly because it doesn't rely on a power-hungry, very much environmentally-unfriendly middleman datacenter to run.

The journey to Llama running on ancient-though-local hardware involves some twists and turns; after securing the secondhand machine, Exo Labs had to contend with finding compatible PS/2 peripherals, and then figure out how to transfer the necessary files onto the decades-old machine. Did you know FTP over an ethernet cable was backwards compatible to this degree? I certainly didn't!

Don't be fooled though—I'm making it sound way easier than it was. Even before the FTP finagling was figured out, Exo Labs had to find a way to compile modern code for a pre-Pentium Pro machine. Longer story short-ish, the team went with Borland C++ 5.02, a "26-year-old [integrated development environment] and compiler that ran directly on Windows 98." However, compatibility issues persisted with C++ itself, so the team had to fall back to an older flavour of C and deal with declaring variables at the start of every function. Oof.
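For anyone who hasn't written C that old, that last constraint is the classic C89 rule: every local variable has to be declared at the top of a block, before any other statement. Here's a minimal, hypothetical sketch (not Exo Labs' actual code) of what an inference-style inner loop ends up looking like under that restriction:

```c
/* C89-style code: all locals are declared up front, before any statement. */
#include <stdio.h>

static float dot(const float *a, const float *b, int n)
{
    float sum;   /* declarations must come first in C89... */
    int i;

    sum = 0.0f;  /* ...and statements only follow afterwards */
    for (i = 0; i < n; i++) {
        sum += a[i] * b[i];
    }
    return sum;
}

int main(void)
{
    float x[3] = {1.0f, 2.0f, 3.0f};
    float y[3] = {4.0f, 5.0f, 6.0f};

    printf("dot = %f\n", dot(x, y, 3));
    return 0;
}
```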

Then, there's the hardware at the heart of this project. For those needing a refresher, the Pentium II machine sports an itty bitty 128 MB of RAM, while a full-size Llama 2 LLM boasts 70 billion parameters. Given those hefty constraints, the results are even more interesting.

Unsurprisingly, Exo Labs had to craft a comparatively svelte version of Llama for this project, now available to tool around with yourself via GitHub. As a result of everything aforementioned, the retrofitted LLM features 1 billion parameters and spits out 0.0093 tokens per second—hardly blistering, but the headline take here really is that it works at all.
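To put that throughput in perspective, 0.0093 tokens per second works out to roughly 107 seconds per token, so even a modest 50-token reply would take somewhere in the region of an hour and a half on the old Pentium II.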



]]>
https://www.pcgamer.com/hardware/blasting-ai-into-the-past-modders-get-llama-2-working-on-windows-98-machine/ hvtJfcvmpq4dKf2KmRymWC Tue, 14 Jan 2025 10:00:19 +0000
<![CDATA[ Project to 'free social media from billionaire control' plans to take on Musk and Zuck using Bluesky's open source protocol: 'It will take years and hundreds of millions of dollars' ]]> A cadre of tech founders and activists announced a new social-media focused foundation on Monday with the goal of raising $30 million to fund development of AT Protocol, the underlying technology powering growing social media network Bluesky. While you may not recognize any of the "technical advisors and custodians" organizing Free Our Feeds, you'll likely recognize plenty of the folks who signed an open letter in support of the new foundation: Wikipedia co-founder Jimmy Wales, actor and activist Mark Ruffalo, writer Cory Doctorow and musician Brian Eno are all among the signatories.

Free Our Feeds opens with a strong denouncement of Facebook and X in their current forms, calling Mark Zuckerberg's recent moves to ditch fact checkers and ease up on restrictions on hate speech "going full Musk."

The open letter picks up the mission statement from there. "We are determined to free social media from billionaire control," it states. "We know it will take three things: community, capital, control. And for the first time ever there is a pathway to secure the future of social media in the public interest. The Bluesky team has built an incredible foundation for this vision of social media that gives power and choice back to people through individual control and customization, sparking creativity and bringing joy back into connecting online.

"However, they remain a commercial company, and despite their best intentions they will come under the same pressures all businesses face: to maximise return to their investors. We know that to ultimately build out a social network ecosystem that will remain free from venture capital and billionaire capture it will take years and hundreds of millions of dollars—and much like when we first started towns, we made the first roads, and over time we built out a network, all operating as part of a social contract where people get to the share the benefits of access to those roads."

The roads in the letter's analogy are the AT Protocol, the open source infrastructure that powers Bluesky and could theoretically be used to build a new wave of interconnected social platforms, offering non-corporate alternatives to the likes of Facebook and LinkedIn. Free Our Feeds' goal is to create a "public interest foundation that will work to support making Bluesky's underlying tech fully resistant to billionaire capture." That will involve offering funds to developers to build "a wealth of social applications on top of open protocols to make social media a healthier and happier place."

The open letter's references to Bluesky's creators eventually falling victim to the whims of venture capital may sound like a dig—and there's ample mistrust online around Bluesky thanks to its former connection to Jack Dorsey and investment from a company called Blockchain Capital—but it's actually in line with Bluesky's mission statement from the beginning.

"One of Bluesky's mottos is 'the company is a future adversary,'" explained Bluesky developer Emily Liu in 2023. In recent months CEO Jay Graber has also called the social network's open source design "billionaire-proof," and endorsed Free Our Feeds on Monday.

The Free Our Feeds website says the foundation should be "up and running by the end of 2025," but it's already collecting donations via a GoFundMe, with a goal of $4 million "to create the foundation and get critical infrastructure up and running."

]]>
https://www.pcgamer.com/gaming-industry/project-to-free-social-media-from-billionaire-control-plans-to-take-on-musk-and-zuck-using-blueskys-open-source-protocol-it-will-take-years-and-hundreds-of-millions-of-dollars/ 9QF5ukU9cAhCo2mL9CcmfK Tue, 14 Jan 2025 00:26:36 +0000
<![CDATA[ Corsair Virtuoso Max review ]]> It kind of makes sense why there are so many little gimmicks in the gaming space. At certain price points, you can only ever expect so much from your hardware. You therefore end up with brands doing very similar things, except with their logo slapped on the side. I like the look, feel, and sound of the Corsair Virtuoso Max but much of what it has going for it has to be compared to the competition, and that doesn't give Corsair the best chance of coming out on top in the high-end audio battle.

One place where the Virtuoso Max does definitely stand out is the microphone. Typically, the best wireless gaming headsets don't sound fantastic to your teammates as they often prioritise swift and accurate sound and battery life over the mic.

However, the Corsair mic sounds clear and unwavering, rarely cutting out or faltering. From my time with it, this microphone has withstood the whispers uttered below the steps of enemies in PUBG, and the inevitable shout that comes when one of my squad hits the wrong button and gives away our position.

Unfortunately, sidetone—the ability to hear yourself back while you speak—doesn't do the mic justice, as it's just not great in the software. You need to practically shout to hear your own voice, even though your teammates can hear you fine online. This seems like a technical glitch, but I can't figure out how to make it any better, other than setting the sidetone sensitivity as high as possible.

Corsair Virtuoso Max Specs

The Corsair Virtuoso Max wireless headset on a black table

(Image credit: Future)

Connection: 2.4 GHz and Bluetooth
Type: Closed back
Frequency response: 20 Hz - 40 kHz
Drivers: 50 mm graphene
Microphone: Detachable, omnidirectional
Features: Active Noise Cancellation, Spatial audio, iCUE-compatible
Weight: 431 g
Battery life: Up to 60 hours (with RGB off)
Price: $330 | £279

Connectivity, while limited, is very thoughtful. On the left cup of the headphones is the power switch, which can be set to the middle position to connect to the 2.4 GHz receiver, or pushed all the way down to connect to Bluetooth at the same time.

A dial taking up the entirety of each cup can be used to adjust volume, with the left one initially changing the volume of the receiver's audio, and the right changing Bluetooth volume.

This is not only very intuitive to use but also excellent for a last-minute Duolingo lesson at 11:55 PM in the middle of a gaming session. Though you can swap which ear does what in Corsair iCUE, the company's own software, you can't customise the dials to adjust the game-to-mic volume ratio, mic gain, or anything else you may want to control with them. This is a shame, as a little more customisation would help out.

The Corsair software is basically necessary to get full use out of this kit. It controls your basic EQ functions and lighting, but also allows you to set up a custom sound ID, which tests your ears to find the volume balance that suits them best. I've found the EQ performs better with a slight boost in both bass and highs, leaving the mids more or less untouched.

Image 1 of 5

The Corsair Virtuoso Max wireless headset on a black table showing settings on right earcup

(Image credit: Future)
Image 2 of 5

The Corsair Virtuoso Max wireless headset on a black table showing settings on right earcup

(Image credit: Future)
Image 3 of 5

The Corsair Virtuoso Max wireless headset on a black table showing settings on right earcup

(Image credit: Future)
Image 4 of 5

The Corsair Virtuoso Max wireless headset cups on a black table

(Image credit: Future)
Image 5 of 5

The back of the Corsair Virtuoso Max wireless headset on a black table

(Image credit: Future)

Once you have done this, the Corsair Virtuoso Max sounds great, with a very neutral sound profile. The bass is not quite as strong as the rest of the range though, and boosting it too much in the EQ settings sounds a bit hollow. Rather than getting that thump of a heavy low end, it approximates the feeling of that rumble instead, not sounding quite as clear as I'd like a bass to sound.

I've noticed this more neutral sound profile functions at its best in single-player dynamic games like the surprisingly brilliant Indiana Jones and the Great Circle thanks to its wide range of sounds. It is a more immersive headset than it is a competitive one, which means it won't really outperform a much cheaper headset in listening for the footsteps of grunts (or well, other human beings) in Call of Duty: Black Ops 6.

But those diminishing returns are largely true of any headset over around $150. However, that sound quality is certainly clear in games intended to engross you more than they challenge you.

You notice that pretty hefty price tag of just over $300 not only in the mic and sound, but also in the build quality. This set of cans is super sturdy: stretching it out or moving the band, I never noticed a worrying degree of flex. The band itself is a combination of plastic and aluminium and stretches with ease. It's the kind of headset you can wrap over a monitor or throw onto a sofa without worries.

Unfortunately, this rigidity is definitely a point of contention, and though it works pretty well over gaming sessions for my (rather large) skull, it is hefty, and the cushioning on the earcups is very shallow.

This means even a light press on the cushioning lets you feel the rigidity of the cup itself. The same is true of the band at the top of the headset. If you are a bit more sensitive to the weight of a headset or the rigidity of its band, this one might start to ache after a gaming session of more than an hour or two.

With up to 60 hours of battery life with RGB off, I've only ever charged this headset when the idea has struck me, and I've somehow never run its battery dry in everyday use. The RGB is a mostly superfluous but pretty addition to the headset, providing lights in the sides of the cups and on the mic to signify whether or not it's muted. With my review set coming in all silver, it's a rather pretty set of headphones too.

Image 1 of 4

The Corsair Virtuoso Max wireless headset head band on a black table

(Image credit: Future)
Image 2 of 4

The Corsair Virtuoso Max wireless headset band being compressed on a black table

(Image credit: Future)
Image 3 of 4

The Corsair Virtuoso Max wireless headset left ear cup zoomed in

(Image credit: Future)
Image 4 of 4

The Corsair Virtuoso Max wireless headset right earcup being compressed

(Image credit: Future)
Buy if…

✅ You need a great mic: Clear, dynamic, and easy to use, the Corsair Virtuoso Max's mic is perhaps the most obvious sign of quality this headset has, and it's detachable for those moments you don't want the mic next to your mouth.

✅ You want to connect to two devices at once: Not only can you connect these cans wirelessly via the 2.4 GHz connector and Bluetooth at the same time but the volume wheels on either side control both sources of sound independently.

✅ You own more Corsair products: iCUE compatibility makes this a great choice if you have other Corsair devices and don't want to download yet another bit of software.

Don't buy if…

❌ You're sensitive to rigid design: Though the Corsair Virtuoso Max did its job without pain for me, the cushioning is fairly short and rigid here, both on the cups and the top of the headset.

❌ You want the absolute best sound for the money: Coming in at the same price point as the Audeze Maxwell, these cans sound great but you could get even better sound for the same amount of cash.

Rather strangely, the included USB-C port only works for charging, which means you can't use this headset wired even if you wanted to. It doesn't have an aux jack either, which means you can't easily connect to PlayStation or Xbox in a pinch through the controllers.

The PC version does work on PlayStation by plugging in the dongle, though, and there is a version of the headset designed specifically for Xbox, but true multiplatform connectivity with the same headset is impossible.

When connected to a PC and your phone, everything feels rather natural. You can swap between customised EQ modes by tapping the custom button on the right earcup, and holding it down cycles from standard listening to active noise cancelling to transparency mode, which is intended to let outside sounds in while you play.

Unfortunately, while the ANC mode very naturally hides sounds from around you, the transparency mode isn't quite as clear or consistent as I would like. It's not bad but not quite great either.

The Corsair Virtuoso Max shows off its quality in sound, mic performance, and sturdiness, though that sturdiness is both a blessing and a curse as the design is quite rigid. It also has a high price tag to match. Not doing anything majorly wrong but not standing above its competition, the Virtuoso Max is a set of cans I rather like, just not quite enough in any one area other than maybe mic quality and Bluetooth connectivity.

]]>
https://www.pcgamer.com/hardware/gaming-headsets/corsair-virtuoso-max-review/ FkvdS5CC76QFb28MGSRvJU Mon, 13 Jan 2025 18:08:14 +0000
<![CDATA[ Core i9 14900KF CPU hits a world record 9.12 GHz and proves Intel chips are still good at something ]]>
Your next upgrade

Nvidia RTX 4070 and RTX 3080 Founders Edition graphics cards

(Image credit: Future)

Best CPU for gaming: The top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game ahead of the rest.

There's a new king of the HWBot overclocking hill and it's the Intel Core i9 14900KF. Actually, the 14900KF has held the frequency record for CPUs before. But a Chinese overclocker going by the name "wytiwx" just upped the ante to a new world record for any CPU ever of 9.12161 GHz.

Yup, once again it's an older Raptor Lake Intel CPU that's marginally improved on the previous record, the 9.11775 GHz achieved by a 14900KS in March last year. In other words, Intel's latest Arrow Lake CPUs, including the Core Ultra 9 285K, are nowhere to be seen in the top echelons of overclocking.

Is that because of Arrow Lake's architecture? Or is the TSMC N3 silicon used for Arrow Lake's CPU cores not actually as good at overclocking as Raptor Lake's Intel 7 node (the node formerly known as 10nm)?

It's possibly a bit of both. But despite Intel's well-publicised difficulties with chip manufacturing technology in recent years, there's no doubting that Intel has a great track record when it comes to achieving top frequencies.

Intel is due to move back to its own 18A node for high-performance CPU manufacturing with Panther Lake. So it will be interesting to see if Intel's new 18A CPUs can beat these old Raptor Lake chips and edge us closer to the magic 10 GHz.

Anywho, wytiwx used an ASUS ROG Maximus Z790 Apex motherboard and, inevitably, liquid helium for cooling to achieve -258 degrees C. Chilly.

If you're wondering, the fastest ever AMD CPU is the Bulldozer-based AMD FX 8370, which hit 8.7228 GHz over 10 years ago. It's also interesting to note how the top frequency achieved with a PC processor has levelled off in recent years.

Between 1996 and 2007, it shot up from just 233 MHz to just over 8 GHz. But it took a further 15 years from there to top 9 GHz in 2022, and another two years or so to go from 9.008 GHz to today's 9.12-and-a-bit GHz.

Way back in the year 2000, Intel predicted it might hit 10 GHz by 2005. Of course, back then Intel was all about clockspeed. Its Netburst Pentium 4 chips were designed for raw frequency above all else and the assumption was that improved CPU performance hinged largely on operating frequencies going up.

Shortly afterward, Intel hit something of a clockspeed wall and the whole industry switched tack in favour of more cores running rather slower than 10 GHz. Then again, with improvements in transistor density also levelling off, perhaps we'll see a return to an emphasis on frequency and that 10 GHz barrier might finally be breached. Watch this space, peeps.

]]>
https://www.pcgamer.com/hardware/processors/core-i9-14900kf-cpu-hits-a-world-record-9-12-ghz-and-proves-intel-chips-are-still-good-at-something/ BCcz6yesqooihzp8t2Lkia Mon, 13 Jan 2025 17:56:18 +0000
<![CDATA[ After fabled RTX 4090 Ti was allegedly dug out of a bin last year, tech testing YouTuber puts Nvidia's prototype GPU through its paces ]]>

The reality of mounting e-waste is just one more thing that keeps me up at night—though every now and then, such a nightmare spits out a real gem. One Reddit user shared that after recently moving, they went rummaging through a bin full of junked computer parts, and pulled out not just a surprisingly decent graphics card, but something genuinely special.

Over the years, we've reported on the graphical powerhouse that never was, the Nvidia RTX 4090 Ti. The card was reportedly cancelled, but one lucky poster apparently saved a prototype model from landfill back in November 2024 (via VideoCardz). You know what they say, one person's trash is another's GPU—wait, haven't we been here before? I'm about to go full Danny DeVito and debut my refuse-themed, musical wrestling persona, the Queen of Slop.

Jokes aside, what surfaced on Reddit last year appeared to be a mostly complete prototype, with both the MASSIVE cooler plus the card itself. I say 'appeared' because the Reddit post, including the original photo uploads of the prototype, has since been scrubbed. But fear not! This is far from the final twist in the tale.

Enter tech testing YouTube channel Gamers Nexus, who have more recently got their hands on the fabled GPU. After plenty of partial sightings like some kind of cryptid, Gamers Nexus not only benchmarked the prototype of what is thought to be the long lost RTX 4090 Ti, but also shared a teardown of the funky, chunky card.

Gamers Nexus first highlighted the hefty GPU's somewhat unusual construction, drawing attention to the front, back, and middle fan placements. This not only sucks airflow straight through the card, but also necessitates a more discreetly placed PCB. The board in question is a hefty bit of kit too, running along the belly of this absolute beast—and it is a beast, measuring 80 mm in width. Getting up close and personal with the PCB reveals a densely packed hotbed of tech, with the central GPU crowded by memory chips and row upon row of power circuitry.

The otherwise pretty unwieldy cooler impressed during thermal benchmarking, keeping the GPU at around 46 °C, more than 20 degrees cooler than the Founders Edition RTX 4090 that actually made it to retail.

But let's get to what we're all really here for—the gaming test. The prototype averaged 62 fps for Black Myth: Wukong at 4K with FSR Quality and ray tracing enabled. This was comparable to the retail card Gamers Nexus also tested, the Nvidia RTX 4090 Cybertank, which averaged 65 fps. The gap between the two cards soon widened though; in Dying Light 2 at 4K, the prototype averaged 82.3 fps while the retail model averaged 91.6, and in Dragon's Dogma 2 at 4K with ray tracing at max settings, the prototype averaged 76.3 fps while the retail model averaged 84.5.

Though not an awful performance by any means, it's an unsurprising performance gap for a somewhat unrefined prototype graphics card—especially without easily accessible, specific drivers to get the most out of the hardware.

It's always interesting to glimpse a road not taken in tech, though the big question mark remains why Nvidia never pushed this prototype into full production. Was the cooler simply too thick for this world? Who can say. What I do know for certain is that time marches on, e-waste is still not a solved problem, and that the reign of the comparatively svelte RTX 5090 is upon us.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/gaming-pcs/after-fabled-rtx-4090-ti-was-allegedly-dug-out-of-a-bin-last-year-tech-testing-youtuber-puts-nvidias-prototype-gpu-through-its-paces/ EyMYcCwH8Csi4b8KoxqXLK Mon, 13 Jan 2025 16:27:44 +0000
<![CDATA[ Nvidia denounces Biden administration's 'rigged' and 'misguided' new AI chip export restrictions ]]> Nvidia has dropped a blog post bomb on the Biden administration's new AI chip export restrictions, announced earlier today and rumoured for some time. Nvidia's vice president of government affairs, Ned Finkle, denounced the plans as "misguided" and an attempt to "rig" the market. Nvidia also sought to flatter the incoming President, crediting Trump for America's "current strength and success in AI."

Finkle's blog post is borderline brutal in its dismissal of the new rules being imposed in the dying days of the Biden administration.

"The Biden Administration now seeks to restrict access to mainstream computing applications with its unprecedented and misguided 'AI Diffusion' rule, which threatens to derail innovation and economic growth worldwide," Finkle said, describing the new rules as a "200+ page regulatory morass, drafted in secret and without proper legislative review." Ouch.

He also claimed this "sweeping overreach" was an attempt to "rig market outcomes and stifle competition." Moreover, Finkle says the rules, which are designed to prevent America's adversaries from acquiring advanced AI technology, won't work anyway.

"Rather than mitigate any threat, the new Biden rules would only weaken America’s global competitiveness, undermining the innovation that has kept the U.S. ahead," he claimed.

For the record, the new rules are indeed pretty sweeping. They impose quotas on sales of AI GPUs to most countries in the world, the idea being to block Chinese efforts to circumvent earlier export controls on GPUs into China specifically. No question, this would all have a direct and substantial impact on Nvidia.

But Nvidia isn't the only big tech entity upset by the new rules. Oracle Executive Vice President Ken Glueck said the new export regime “does more to achieve extreme regulatory overreach than protect US interests and those of our partners and allies,” adding that it, "practically enshrines the law of intended consequences and will cost the US critical technology leadership.”

For its part, the Biden administration has emphasized that 18 key US allies, including Australia, Japan, South Korea and Taiwan, are subject to no restrictions at all and that, "chip orders with collective computation power up to roughly 1,700 advanced GPUs do not require a license and do not count against national chip caps."

We also can't help noticing how the blog post seemed to court favour with incoming President Donald Trump.

Slides showing the RTX 5090 graphics card at CES 2025.

Among Nvidia's new RTX 50 GPUs for gamers, it'll be the RTX 5090 that's most impacted by the new export rules. (Image credit: Nvidia)

"The first Trump Administration laid the foundation for America’s current strength and success in AI, fostering an environment where U.S. industry could compete and win on merit without compromising national security. As a result, mainstream AI has become an integral part of every new application, driving economic growth, promoting U.S. interests and ensuring American leadership in cutting-edge technology," Finkle said.

The vituperative language used by Finkle, particularly his accusation that the Biden administration is attempting to "rig" the market and his description of the document itself as a "morass", likewise feels like it's straight out of the Donald Trump playbook of hyperbolic criticism, as opposed to the sort of measured critique of government policy you might normally expect from a major corporate entity.

Your next upgrade

Nvidia RTX 4070 and RTX 3080 Founders Edition graphics cards

(Image credit: Future)

Best CPU for gaming: The top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game ahead of the rest.

The cynical interpretation here is that Nvidia is attempting to play on the perception that Trump is easily flattered. By giving Trump the credit for the current AI boom and throwing in some Biden trash talk as a sweetener, so this interpretation goes, Nvidia hopes that Trump will go easy on any new export restrictions and maybe even immediately repeal these Biden rules.

After all, if Trump thinks he is responsible for the AI boom, he won't want to kill it, right? And anything imposed by the Biden administration is bound to be something he's opposed to? It's a no brainer, no?

Perhaps. But whatever Nvidia's actual intentions and motivations, this is a pretty eye-popping blog post and certainly not the usual bland, corporate fare. It will be fascinating to see how this all plays out, if the Trump administration does cancel these new rules and what, if anything, they're replaced with.

Anywho, our immediate take on the new rules is that they won't have much impact on gaming GPUs as opposed to AI chips, with the exception of the new RTX 5090 and the existing RTX 4090. We think everything below those cards should fall outside the rules, but we'll update if that turns out not to be the case.

]]>
https://www.pcgamer.com/hardware/graphics-cards/nvidia-denounces-biden-administrations-rigged-and-misguided-new-ai-chip-export-restrictions/ pZjsbUQ2gDTF9c5qxaFixS Mon, 13 Jan 2025 16:03:27 +0000
<![CDATA[ Even with a new generation of GPUs on the way this all-AMD Ryzen/Radeon 7900-series gaming PC is a great discounted machine ]]>

iBuyPower Y40 | Ryzen 9 7900X | Radeon RX 7900 XT | 32 GB DDR5-6000 | 2 TB SSD | $1,899.99 $1,649 at Walmart (save $250.99)
This iBuyPower build is an all-AMD powerhouse that isn't slack in a single department. The 7900 XT isn't quite as powerful as an RTX 4080 Super, but it has heaps of VRAM and is great for 1440p gaming. Throw in the Ryzen 9 CPU, 32 GB of fast DDR5 RAM, and 2 TB of storage, and you have a fantastic build here for less than $1,700.

If you're an AMD fan-person, this iBuyPower gaming PC might be right up your alley, however you feel about the imminent new generation of graphics cards. The iBuyPower Y40 has been a favorite of ours over the past few months, frequently offering up stellar prices for some serious gaming horsepower (a moment of silence for the $999 RTX 4070 build that popped up briefly in December before selling out). The latest Y40 discount to catch my eye, though, is this all-AMD one for just $1,649 at Walmart.

Why did it catch my eye? Well, apart from the fact that any build sporting an AMD GPU catches it, there's the fact that it's so well-rounded in specs and that these components are packaged in a Y40 build that we rate quite highly.

You can check out Jacob Ridley's iBuyPower RDY Y40 Valorant VCTA R003 review to see the nitty gritty of the sort of system iBuyPower can build around the chassis, but essentially the Y40 is a very pleasantly crafted bit of kit. Decent cooling, good cable management, solid parts, a nice chassis, and all for a reasonable price. The version I'm highlighting here has different specs, of course, but the core build quality should remain.

But let's get down to brass tacks, and that's the specs themselves. At the centre of this build are the dual 7900 hand-cannons, the 12-core AMD Ryzen 9 7900X and 20 GB Radeon RX 7900 XT. (Yes, that's 20 GB on the GPU—none of Nvidia's stinginess, here.)

This combo, despite the CPU being a little older and the GPU being just weeks away from being outshined by a whole new generation, is more than enough for basic productivity tasks, and of course gaming. And regarding the latter, the RX 7900 XT is ideal for high refresh rate 1440p gaming or 4K gaming in all but the most demanding titles. Sure, it's not quite at the level of the RX 7900 XTX or RTX 4080 Super, but you're not paying RTX 4080 Super prices, here.

And for the price tag, you're also getting a pretty solid all-round system: 32 GB of DDR5 RAM—fast DDR5 RAM, at that—and 2 TB of storage. (Hopefully system builders are starting to get the message that game installs can be big, these days.)

Plus, vertical GPU mounts just look lovely, don't they? If you're proud of your all-AMD build, why not show it off?

This build should serve as a good platform to upgrade from in future, too. 32 GB of fast DDR5 RAM and a Ryzen 9 7900X should get you by even through to the RTX 60-series generation of GPUs (and whatever AMD decides to call its post-9070 generation), so if you decide to upgrade your graphics card down the line, you should be in good stead. And if you wanted more on the CPU front, the AM5 socket will let you slap in a newer X3D chip, too.

All in all, this all-AMD build is a pretty good deal with this $250 saving, I'd say.

]]>
https://www.pcgamer.com/hardware/gaming-pcs/even-with-a-new-generation-of-gpus-on-the-way-this-all-amd-ryzen-radeon-7900-series-gaming-pc-is-a-great-discounted-machine/ umZWu2BUUepcmvKWy6dRCd Mon, 13 Jan 2025 16:01:47 +0000
<![CDATA[ It's hard to believe but 22% of PC gaming monitors are now OLED panels ]]> Can you believe that over one fifth of PC gaming monitors sold today are OLED panels? So says no less an authority on the subject than LG.

To be more precise, LG told YouTube channel HDTVTest that OLED now has 22% of the PC gaming monitor market. LG notes that market share has been achieved within just two years of the launch of its first OLED-based PC gaming monitor.

Meanwhile, it might surprise you to learn that OLED only currently represents 18% of the TV market, and that despite the fact that LG released its first OLED TV way back in 2013.

Of course, the very first OLED gaming monitor was actually the Alienware 34 AW3423DW, which was launched nearly three years ago and was based on Samsung QD-OLED technology. LG's first OLED monitor came a little later. But the overall gist here is clear enough. OLED adoption rate on the PC is much, much faster than with TVs.

As much as I'm a card-carrying fan of OLED technology and despite the fact that there is now a wide array of OLED gaming monitors to choose from in all kinds of shapes, sizes and resolutions, I find that surprising.

Mainly, that's because OLED monitors remain very expensive. You're looking at MSRPs of at least $600 for an entry-level 27-inch 1440p model from a recognisable brand, while most models are $800 and upwards. Indeed, it wouldn't surprise me if the average price of an OLED gaming monitor from the bigger brands right now is in the region of $1,000.

To my mind, that ought to make them pretty niche. And, OK, LCD still makes up the hefty majority of the market. But over one in five gaming monitors being bought right now is OLED and presumably that market share is still increasing. It'll probably be at least a third in a couple of years.

Hopefully that continued increase in market share will reflect lower prices. It's always been slightly baffling how expensive OLED monitors have been compared to TVs. When the first 27-inch 1440p OLEDs came out, they were about the same price as a 42-inch 4K OLED TV. It made no sense at all.

Since then, prices have generally come down. Like I said, those 27 inchers start nearer $600 and both 34-inch ultrawides like the Alienware 34 AW3423DWF and 32-inch 4K models including the MSI MPG 321URX are in the region of $800 and up.

But those are still pretty hefty prices when you consider that $200 or so will net you an LCD-based 27-inch 1440p gaming panel and $250 and up will bag a 34-inch ultrawide.

Image 1 of 2

LG UltraGear 45GX990A

Sadly, the OLED panel I really want is still $2,000-plus. (Image credit: LG)
Image 2 of 2

LG UltraGear 45GX950A

(Image credit: LG)

So, personally, I'd like to see OLED prices come down still further. I think $500 for a 32-inch 4K model and a little less than that for the 27-inch 1440p and 34-inch ultrawides would be about right. At least, it would be at about those price levels that I'd be happy to recommend them without major caveats.

Speaking of caveats, it's worth pointing out that one of the very last caveats that applies to OLED tech on PC gaming monitors, aside from price, seems like it might be history. As I reported last week, both Samsung and LG announced new OLED panel tech at CES with much higher claimed full-screen brightness.

In theory, we could see as much as 400 nits full screen, which for my money would mean the brightness "problem" with OLEDs is solved. I expect models with at least 350 nits full screen will appear later this year based on this new technology. Hopefully such monitors won't come at a major price premium.

But even if they do, I expect those prices will come down pretty rapidly, as will OLED prices generally. Give it a couple of years and the sub-$500 OLED monitor capable of 400 nits full-screen brightness will probably be a thing. The only slight snag is that the OLED I really want, LG's new 5K2K Ultragear 45GX950A is a $2,000 monster. Sadly, it'll be a while yet before a panel like that is truly affordable.


Best gaming monitor: Pixel-perfect panels.
Best high refresh rate monitor: Screaming quick.
Best 4K monitor for gaming: High-res only.
Best 4K TV for gaming: Big-screen 4K PC gaming.

]]>
https://www.pcgamer.com/hardware/gaming-monitors/its-hard-to-believe-but-22-percent-of-pc-gaming-monitors-are-now-oled-panels/ nv5QCqTEk4dvbjXBqz34ph Mon, 13 Jan 2025 11:28:07 +0000
<![CDATA[ If you're trying to convince me your 'companionship' robot is 'lifelike', maybe don't rip her face off in the demo video ]]> CES is an exciting event because it's always packed with weird, wild, and often dubious tech stuff, like cyberprisons for tiny anime girls (what?), VR racing rigs that blow wind in your face (huh?), and keyboards where every key is bread (which?).

Naturally, since the past year in tech has mostly been about stuffing AI into every single conceivable device and app whether it's helpful or not, there's a lot of AI stuff being shown off at CES this year and that includes a "modular, conversational" full-body robot powered by "custom AI [that] can learn and remember your previous interactions" made by a company named Realbotix.

I'm not at CES myself so I couldn't meet one of Realbotix's full-body robots, but the gist is they cost about $200,000 and they're a waking nightmare. Take a look:

"Our customized AI solutions are tailored to your specific needs, whether for companionship, social interaction, or business purposes," says Realbotix, though I can't help noticing the video above starts out with the camera pointed at the robot's ass. They're implying you can have sex with the robots, right?

"Realbotix can replicate a historical figure, a celebrity or bring to life our client's vision for a robot," the official site says while refusing to answer my question. "Our companionship-based AI makes our robots perfectly suited for the home."

Perfectly suited for sex in the home?

"Manufactured in the USA, Realbotix has a reputation for having the highest quality humanoid robots and the most realistic silicone skin technology," the site says evasively. "Robots with a human touch" is the company's motto. So… sex robots, probably, right?

...reminiscent of the unpleasant spasms of a Chuck E. Cheese animatronic character

Look, I don't think robots you can talk to (and have sex with, maybe) are a bad idea at all, but the video above isn't particularly compelling. We're treated to the robots making odd, jerky gestures that are supposed to be lifelike but are mostly reminiscent of the unpleasant spasms of a Chuck E. Cheese animatronic character during the final performance of an 8-year-old's pizza party.

One robot attempts to brush her hair back but the movement takes twice as long as a human would, making me wonder if my coffee has been spiked with acid. A second robot attempts an alluring "come hither" gesture that needs a few dozen more iterations of programming fixes because that hand is not moving like a human hand or even an alien hand but like the spiny legs of some skeletal creature you might find crawling in the muck at the bottom of the Uncanny Valley.

And then, for reasons I can't even fathom, Realbotix takes the robot it's been trying to convince me is "realistic" and has two burly human arms come in and rip the flesh off the robot's face. What are you doing, Realbotix? You show me her butt and boobs and French manicure and then tear her face off to reveal bulging yet lifeless eyes, exposed plastic teeth and gums, see-through skull and mechanized brain? The hell is the matter with you?

We need to address the other jumpscare in the video, too—like, what's with the emaciated old man robot looking at the second woman robot, smiling, then flicking his eyes to me to make eye contact?

What is this supposed to tell me? Is that look he's giving me suggestive and conspiratorial, as if to say, "That's right, buddy, I'm just a robot, but even I can't help but be turned on by this sexy lady robot, too!" Why are the lady robots young and hot but the male one is a gaunt old man? Did Realbotix create a $200,000 male robot just to leer at the $200,000 female robots so they never feel fully comfortable and safe? Have I died at my keyboard and I'm actually peering into Hell?

We may never know, because I'm too disturbed to look into this any further. If you're braver than I, you can visit the Realbotix website.

]]>
https://www.pcgamer.com/hardware/if-youre-trying-to-convince-me-your-companionship-robot-is-lifelike-maybe-dont-rip-her-face-off-in-the-demo-video/ F5qs7TorbGcZWdF9p9H7hm Fri, 10 Jan 2025 22:58:44 +0000
<![CDATA[ Is the new RTX 5070 really as fast as Nvidia's previous flagship RTX 4090 GPU? Turns out the answer is yes. Kinda. ]]> The Nvidia GeForce RTX 5070 was only fully revealed this week at CES 2025, but already the GPU is causing quite a stir. Though arguably the least powerful of the just unveiled 50-series, the RTX 5070 has been the subject of a number of headline-worthy claims. Alas, most folks will have to wait until the 50-series launches this February to see if it's worth the hype—but tech journalists aren't most folks.

Case in point, PCGamesN has already seen the RTX 5070 in action and has since shared its findings. So, what's the truth behind Nvidia CEO Jen-Hsun Huang's on-stage claim that the RTX 5070 will offer "RTX 4090 performance at $549"? Turns out, there's a lot to that—but not without caveats.

First, the good news: while playing Marvel Rivals, PCGamesN journalist Ben Hardwidge found that the RTX 5070 doesn't just offer comparable performance, but actually eclipses the RTX 4090, once Nvidia's flagship card. However, the not-really-bad-news, but more I'm-not-sure-how-to-feel-about-this news is that we have the GPU's new Tensor cores and their support for Nvidia's just-announced DLSS 4 Multi Frame Generation to thank.

It's hardly a surprising move. Nvidia has been calling time on raster graphics rendering for years, with Huang himself memorably saying back in September 2024, "We can't do computer graphics anymore without artificial intelligence. We compute one pixel, we infer the other 32. I mean, it's incredible."

To be clear though, DLSS 4's most impressive trick is frame generation. With regards to the RTX 5070, DLSS 4 uses AI and in-game data to generate up to three full frames between more traditionally rendered frames. To put it a little crudely, DLSS 4 is what's giving you that extra bang for your buck.

At CES 2025, Nvidia set up two machines for trade show guests to compare—one kitted out with the RTX 5070 and multi-frame generation, and one using the older RTX 4090 with standard frame generation. Besides that major difference, both machines were running Marvel Rivals at 4K with the highest amount of DLSS supported by their respective graphics card. The result? Hardwidge writes that the RTX 5070 consistently outperformed the older card, outputting in the region of 240 fps compared to the RTX 4090's average of 180 fps.

All of that said, key tech specs, such as which CPU was used for these comparison builds, remain unclear, so it's hard to say conclusively just how replicable those numbers are. It also perhaps goes without saying, but the older card doesn't support DLSS 4's multi frame gen, so one could quibble about how valuable a comparison this truly is.

Furthermore, even Nvidia stressed that this comparison point is a bit of an outlier, as Marvel Rivals appears to be particularly well-optimised for the RTX 5070; when tested on titles that aren't Marvel Rivals, there's apparently a much smaller performance gap between the two cards.

Still, that suggests performance that once cost over a grand and a half for the RTX 4090 is now within reach for much less. Provided DLSS 4 is involved, of course.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.


]]>
https://www.pcgamer.com/hardware/graphics-cards/is-the-new-rtx-5070-really-as-fast-as-nvidias-previous-flagship-rtx-4090-gpu-turns-out-the-answer-is-yes-kinda/ Ua7XQuBHwu2erGbXskXoVj Fri, 10 Jan 2025 17:04:17 +0000
<![CDATA[ It's time for me to admit that AI-accelerated frame generation might actually be the way of the future and that's a good thing ]]>
Jacob Fox, hardware writer

Jacob Fox on a blue background

(Image credit: Future)

This week: I've been trying to make heads or tails of the million-and-one new and exciting products announced at CES, all while soldiering bravely on through a cold. A true martyr.

CES had a lot to offer this year, but the main announcement for us PC gamers has without a doubt been the Nvidia RTX 50-series. It feels like it's been forever and a day since RTX 40-series cards became the best graphics cards, but the RTX 50-series is finally officially here—well, just as soon as the cards actually launch at the end of January and through February, that is.

Apart from the RTX 5070 seeming to have a shockingly reasonable price tag and the RTX 5090 having a downright painful one, the main thing that's struck the heart of many a PC gamer has been Nvidia's claim that the RTX 5070 will deliver RTX 4090-level performance. And while some have been delighted by that prospect, others have responded with cynicism, pointing out that Nvidia's claim will only be true if DLSS 4 is enabled.

Apart from the urge to express an obvious response to such cynics—"duh, of course that's only with DLSS 4 enabled, Nvidia's been pretty up-front about that"—I think this is the first time I've realised that I don't actually care whether my frames are made by traditional rendering or by some AI-accelerated frame generation magic.

And trust me, that actually kind of pains me to say. For years now, I'd considered myself a staunch enemy of fake frames. Only those sweet real ones for me, thank you—ones borne of the blood and sweat of traditional shader cores.

Why was I so anti-frame gen? Well, after waving away the smokescreen reasons I only ever actually half cared about—latency, artifacts, and so on—the real reason, I must admit, was that something just rubbed me the wrong way about not owning my own GPU power. I thought: "Hey, if I'm paying hundreds for a piece of hardware, I don't want that performance to be reliant on Nvidia's machine learning and the beneficent game devs who decide to implement it. I want raw horsepower."

The future is now, old man

Me, to me

But now, I'm starting to realise that this argument's not quite right. After all, what performance would I actually own if a GPU was just packed with CUDA cores? Those cores wouldn't mean a damn thing without (at minimum) good drivers and game devs making proper use of them. The GPU cores are nothing in themselves. I'd been reliant on software all along, I just didn't realise it.

What I've come to realise is that AI-accelerated frame generation is just another way of utilising GPU hardware to generate frames. It's no less "local" than CUDA Cores or Stream Processors unless I arbitrarily pick "does not rely on machine learning" as the criterion for "local". But what reason do I have for picking that criterion, given CUDA Cores/SPs also rely on so much at the software level, too?

The only real reason for me to pick that criterion is that traditional rendering is what I'm used to. But the future is now, old man. That's what I find myself telling myself when I see Nvidia's RTX 50-series and DLSS 4 performance claims. If AI-accelerated rendering works, maybe it's time I get with the program, especially if the results are as dramatic as Nvidia's claiming.

Maybe those who sneer "the RTX 5070 will only match the RTX 4090's performance if it uses DLSS 4" are akin to the luddite saying "the car will only go faster than the horse if it uses wheels." Maybe we need to accept that wheels are the future, and that that's okay.

Of course, all of this depends on whether new frame gen tech can deliver on the quality front. I was sceptical of DLSS 3's frame gen for a very long time, but most of the wrinkles have been smoothed out now. And if initial hands-on reports of FSR 4 are anything to go by, AMD's upcoming frame gen tech seems very impressive.

Ah, but then there's latency. That circle, unfortunately, is harder to square. As our resident cynic Jeremy Laird reminded me earlier today, only a "real" frame can help with latency. AI-generated frames can never improve it, which means at best you're stuck with whatever latency you would have been getting before the extra frames were generated.
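
To put rough numbers on that (purely illustrative figures of my own, not Nvidia's), here's a quick sketch of how multi frame generation lifts the displayed frame rate without changing the render cadence your inputs are actually tied to:

#include <stdio.h>

int main(void)
{
    /* Illustrative assumptions: a 60 fps base render rate and DLSS 4-style
       generation of three extra frames per traditionally rendered frame */
    const double rendered_fps     = 60.0;
    const int    generated_frames = 3;

    double rendered_frame_ms = 1000.0 / rendered_fps;
    double displayed_fps     = rendered_fps * (1 + generated_frames);

    printf("Displayed frame rate: %.0f fps\n", displayed_fps);
    printf("But the game only reacts to input on rendered frames, every %.1f ms,\n",
           rendered_frame_ms);
    printf("so responsiveness still feels like %.0f fps, not %.0f fps\n",
           rendered_fps, displayed_fps);
    return 0;
}

The screen gets smoother, in other words, but the game only responds to you on those rendered frames.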

One initial response to this is to say that the games where latency matters the most—esports titles—tend to be easier to traditionally render, meaning we might not need to worry too much about them, anyway. But that's a bit of a cop-out, I suppose, because we do also want low latency in non-esports titles.

Your next upgrade

Nvidia RTX 4070 and RTX 3080 Founders Edition graphics cards

(Image credit: Future)

Best CPU for gaming: The top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game ahead of the rest.

So, I'll hold my hands up and say we definitely need to keep some of the old as we hurry in with the new. We're always going to need traditional rendering—even the AI king himself, Nvidia CEO Jen-Hsun Huang, says so—not least because these are the frames that can actually adjust to your input. The frames between are essentially just padding. (Though I do wonder whether there could be a way to change that in the future. For instance, perhaps there'll someday be a way to inject input into the frame generation pipeline, i.e. take a control input to guide the next frame's generation.)

Thankfully, it does seem like AMD and Nvidia are keeping the old with the new. We do still see improvements in traditional rendering performance, after all. The problem is that these improvements might be starting to plateau, perhaps as a simple result of Moore's law slowing down. (Jeremy the cynic chimes in again here to point out that Nvidia and AMD could be exaggerating the extent to which Moore's law is limiting core density.)

In which case, would we rather GPU companies don't try to give us heaps of extra performance in other ways? Yeah, no. I think I'm finally ready to admit that I like frame gen. Frame gen improvements are perfectly reasonable replacements for traditional rendering improvements, especially given the latter seems like an increasingly low-return proposition compared to the might of AI acceleration.

]]>
https://www.pcgamer.com/hardware/graphics-cards/its-time-for-me-to-admit-that-ai-accelerated-frame-generation-might-actually-be-the-way-of-the-future-and-thats-a-good-thing/ TQ5KV2dURQoSNpDwALJXCQ Fri, 10 Jan 2025 16:28:07 +0000
<![CDATA[ If you're a content creator and gamer looking for a CPU upgrade, this Ryzen 9 9900X deal will get you $90 off ]]>

Ryzen 9 9900X | 12 cores | 24 threads | 5.6 GHz boost | 64 MB L3 cache | 120 W TDP | AM5 socket | $499 $409.99 (save $89.01)

In our Ryzen 9 9900X review, our main criticism was the price, with it not majorly outshining the much cheaper Ryzen 9 7900X. At this sale price, though, the value proposition is much better.

Over the last few years, with Intel racking up losses in the CPU division, the divide between content creation and gaming has mostly come down to AMD chips with and without 3D V-Cache technology.

If you are looking for a dedicated content creation rig that can also happily handle games, you usually end up paying a little more but, luckily, thanks to a few sales, you can now get the rather impressive Ryzen 9 9900X at a reasonable price.

At Newegg right now, you can pick up this CPU for $409.99, which is $89.01 cheaper than its full retail price. With a relatively low TDP of 120 W, it's easy to cool and a great workhorse, but it unfortunately has a poor eco mode: bring it all the way down to 65 W and it's outpaced by much cheaper CPUs. On top of that, the original price puts it well above comparable chips, making it hard to recommend at full whack. This sale, however, puts it in a much more attractive range.

The 9900X performs productivity tasks fantastically, outpacing the Ryzen 9 7900X and Intel Core i9 14900K in single-core CPU rendering, though it's beaten in multi-core rendering by Intel's much more expensive chip. If you are considering a Zen 5 workstation, this CPU is strong for all kinds of encoding, rendering, and editing.

When you move to gaming, this chip's nice specs don't translate quite so directly. It still performs well, roughly matching the Ryzen 7 9700X and coming in just a little behind the Intel Core i9 14900K.

When we originally reviewed this CPU, we recommended picking the Ryzen 9 7900X instead, not because it performs poorly but because the price increase wasn't fully reflected in performance gains. With that last-gen CPU now being $398.99 at Newegg, this upgrade makes a lot more sense.

If you want solid but not hugely noteworthy gaming performance, yet excellent content creation and productivity performance, and you have an AM5 socket to plug this thing into, you won't be disappointed in your upgrade.

]]>
https://www.pcgamer.com/hardware/processors/if-youre-a-content-creator-and-gamer-looking-for-a-cpu-upgrade-this-ryzen-9-9900x-deal-will-get-you-usd90-off/ jCd78XZWYEcBYgPT3yLAwE Fri, 10 Jan 2025 15:49:36 +0000
<![CDATA[ AMD is now reportedly making all-American Ryzen 9000 CPU dies at TSMC's Arizona fab ]]> AMD is having some of its latest Ryzen 9000-series desktop CPUs made at TSMC's new fab in Arizona. So claims Taipei, Taiwan-based journalist Tim Culpan. It was reported last year that AMD was planning on having some high-performance CPUs made at TSMC's Arizona facilities. Now those CPUs have been identified as the latest Ryzen 9000 models and they are said to already be in production.

If true, that's an impressive win for TSMC's new US fab. It was only August last year that AMD released the Ryzen 9000, including the Ryzen 7 9700X, using 4nm silicon, also known as the TSMC N4 node, for CPU dies made in TSMC's Taiwan factories. So, moving some of that production to the Arizona fab so soon certainly looks like a vote of confidence in the facility.

However, the broader context for this news is quite complicated. TSMC is on record as saying that chips made in its Arizona fab cost more than equivalent products from its factories in Taiwan. It also says that it plans to charge more for US-made chips as a consequence.

Moreover, AMD Ryzen 9000 CPUs are chiplet designs. While it seems AMD is now having some 8-core CPU dies made in Arizona, the package also contains a 6nm I/O die housing the memory controller and other functionality.

That die, as we understand it, is still made in Taiwan. TSMC does have an older fab in Washington state. But that facility at best produces chips on the 16nm node.

Indeed, there are further components in the package, and then the package itself, to consider. It's not clear where all of those items are produced. So, the mere fact of Ryzen 9000 CPU dies being made in the US wouldn't automatically translate into an all-American product, even if the CPU cores themselves will have been both designed and manufactured in the USA.

A photo of an AMD Ryzen 9 9900X processor

AMD is now reportedly manufacturing 4nm Ryzen 9000 CPU chiplets in Arizona, less than a year after the chips were first launched. (Image credit: Future)

As things stand, TSMC has three new Arizona fabs either already in production or being built. The first, FAB 21-1, produces 4nm chips, reportedly including the AMD Ryzen 9000 and Apple's A16, and is already up and running. FAB 21-2 will up the ante to 3nm or N3 chips, the likes of which include Intel's Lunar Lake laptop CPU and are currently only made by TSMC in Taiwan, while FAB 21-3 will eventually make the move to 2nm.

For now, you can't actually buy any devices with TSMC-made 2nm or N2 silicon. Indeed, TSMC still plans to reserve its cutting-edge technology for its domestic Taiwanese plants. So, N2 chips made in Arizona will come online some time after similar silicon is available from Taiwan.

Your next upgrade

Nvidia RTX 4070 and RTX 3080 Founders Edition graphics cards

(Image credit: Future)

Best CPU for gaming: The top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game ahead of the rest.

That said, AMD doesn't tend to use the very latest nodes for its chips. The new Ryzen 9000 was rolled out on N4 despite TSMC having been in production with N3 for several years, and even AMD's new RDNA 4 GPUs, which were almost but not quite announced at CES, are still on N4.

The same goes for Nvidia's latest Blackwell RTX 50 graphics chips. Those are on TSMC N4, too. So, the fact that TSMC's US fabs are being kept a step behind Taiwan doesn't mean we won't see new CPUs and GPUs that are made in the US.

Then again, the likes of AMD and Nvidia will have to play off the benefits of having chips made in the US, which might include avoiding tariffs, against higher local production costs. It'll be interesting to come back in five years or so and see which chips, exactly, end up being made in the US.

]]>
https://www.pcgamer.com/hardware/gaming-monitors/amd-is-now-reportedly-making-all-american-ryzen-9000-cpu-dies-at-tsmcs-arizona-fab/ JNWzLTq4etQ8Zg5AFxd4oj Fri, 10 Jan 2025 15:35:24 +0000
<![CDATA[ I tried a sim racing rig that generates 'wind' at CES 2025 and it's claimed to help keep VR nausea at bay ]]> Sim racing can be a sweaty business. I dare say that's something I can write with experience, but perhaps no more. I went for a test drive in a sim rig over at CES 2025 from Nitro Concepts that included a set of blowers that kept me cool, and more importantly, plenty immersed while racing.

For my test drive, a set of blowers sit either side of the Moza wheelbase on the Nitro Concepts E-Racer rig. This set-up also includes large LED light strips that light up in accordance with what's happening in the game, i.e. flashing yellow when there's a yellow flag, and a haptic rumbling cushion. This 'Immersion' gear all connects up to various controllers that are hooked up mostly out of sight.

I'm initially quite sceptical about the whole 'wind' thing. Will it really feel like wind or will it feel like someone turned a fan on in front of me?

The answer is: both.

It does feel like wind, and the gradual scaling up and down of intensity stops it feeling too much like a fan blowing in your face. That said, a fan blowing in your face might also do just as well when going at full speed.

I'm racing around Spa-Francorchamps, a racetrack in Belgium on the F1 calendar, and there are some high-speed straights that really let these fans reach full whack. There is something quite exhilarating about it—something in my inner brain is better tricked into feeling like I'm travelling at high speed with the 'wind' hitting my face.

But if you think about it, it's not even that realistic to have wind hitting your face in anything other than open cockpit cars. Modern racecars might not be entirely sealed up like a family SUV, but there's at least a windshield on many of them. Nevertheless, I found the wind really added to the experience—immersion doesn't always have to be about pure realism.

Image 1 of 2

Nitro Concepts CES 2025 booth with a sim racing rig that offered wind simulation from twin fans near the wheelbase.

(Image credit: Future)
Image 2 of 2

Nitro Concepts CES 2025 booth with a sim racing rig that offered wind simulation from twin fans near the wheelbase.

(Image credit: Future)

I put a lap in, and then another, and I would've happily gone for a few more had I the time to stick around. But as I'm climbing out of the rig, a Nitro Concepts spokesperson tells me that wind simulation can be especially useful for VR gaming.

Firstly, it keeps you cool while wearing a headset—sim racing and VR gaming is an especially sweaty combination. What's more, it's claimed that simulating wind can help keep nausea away. There's something about having a sense of direction through the flow of wind that tricks your brain.

CES 2025

The CES logo on display at the show.

(Image credit: Future)

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

Wind set-ups aren't a brand new concept for sim racing and I had a look around to see if others report diminished nausea while using them. There are, in fact, anecdotal reports of wind simulation having this effect for people. If you suffer from nausea in VR, at the very least, it appears worth trying for yourself.

Also, there are some beefy wind simulator set-ups out there. The Nitro Concepts option was pretty sleek by comparison, though I suspect you might get more oomph out of a bigger unit.

I don't have availability info on the wind gear from Nitro Concepts just yet, but you can find the lighting and haptic systems over on its website for $50 (just the LED controller) and $1,000, respectively.

]]>
https://www.pcgamer.com/hardware/i-tried-a-sim-racing-rig-that-generates-wind-at-ces-2025-and-its-claimed-to-help-keep-vr-nausea-at-bay/ txW2QqgasoKNC7MBTzhNH Fri, 10 Jan 2025 14:02:23 +0000
<![CDATA[ MSI intros cheaper 'back connect' Project Zero Intel motherboards for super-clean PC builds you might actually be able to afford ]]> Want a super-clean desktop build? Get a motherboard with all the connectors on the rear. The only problem is price. It's gonna cost you. But perhaps not as much as it once did, thanks to some new "Project Zero" Intel motherboards announced by MSI for CES.

Project Zero is of course MSI's branding for motherboards with all the connectors on the rear, an approach other motherboard makers are also running with, including Gigabyte with its Project Stealth boards and Asus with its BTF line.

Like most of the competition, MSI Project Zero motherboards have generally been pricey premium products up to now. But that changes with the new MSI MAG Z890 Tomahawk WIFI PZ and MAG Z890 Tomahawk WIFI PZ White.

Tomahawk is MSI's mainstream family of motherboards and it's the first time Project Zero and that more affordable series of mobos have been married. As MSI itself puts it, Tomahawk products are aimed at PC builders who want a "no-frills, reliable platform for their PC."

Like all Project Zero motherboards, these mobos move the connectors for everything bar the CPU, DRAM, and SSDs to the rear, allowing for an ultra-clean look. As the Z890 moniker indicates, the boards are designed for Intel's new Arrow Lake CPUs up to and including the range-topping Core Ultra 9 285K, and sport a decent 16+1+1+1 VRM setup for 19 power stages.

You also get DDR5 memory overclocking support up to 9200 MT/s, a PCIe 5.0 x16 slot for GPUs, a PCIe 5.0 M.2 slot for NVMe SSDs, plus a pair of PCIe 4.0 M.2 slots. Oh, and two Thunderbolt 4 ports, an Intel Wi-Fi 7 card, and 5 gigabit Ethernet.

MSI Project Zero

Admit it, you wouldn't say no to a build like this. (Image credit: Future)

Slightly frustratingly, given that value for money could be the central appeal of these boards compared to the "back connect" competition, MSI hasn't actually released pricing for them yet.

It's also worth bearing in mind that you'll typically need a compatible case to use one of these super-clean mobos. In practice, that probably means more money on a new case, although as Nick explained, some of these rear-connect boards are more compatible with legacy cases than others.

Still, as we found last year, these back connect boards aren't just about looks, they have cooling advantages, too. Anyway, hopefully, MSI's new Tomahawk boards are part of a broader move to making cleaner PC builds more mainstream and affordable. Once you've built a PC with all the connectors neatly hidden away, it's tough to go back to that old rat's nest of cables.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

]]>
https://www.pcgamer.com/hardware/gaming-monitors/msi-intros-cheaper-back-connect-project-zero-intel-motherboards-for-super-clean-pc-builds-you-might-actually-be-able-to-afford/ eraoJJDtf2ZCE5dutBErg6 Fri, 10 Jan 2025 13:10:02 +0000
<![CDATA[ Thermaltake's new 2000 W PSU is too powerful to be sold in the USA and comes with four PCIe 5.0 GPU power connectors ]]> Thermaltake has a handful of new Toughpower PSUs on display at its CES 2025 booth, and I've been over there to check them out. First off, the Thermaltake Toughpower D2000, which is quite literally too powerful for use in many American homes.

The D2000 is a 2000 W power supply, which is double the recommended capacity for use with an RTX 5090. It's so powerful that many US homes won't offer that sort of wattage as standard from an everyday outlet. Therefore, as Thermaltake spokesperson Mike tells me, it won't be sold there. Thermaltake does have a 1650 W power supply available as a replacement.

Europe and most other regions around the globe do, however, have the outlets to deliver this sort of power, and we should expect to see this PSU sold in some places. Though even then, you have to be sure your home's electrical outlets are up to code, as that's a lot of watts to draw from a single plug. It's just a lot of watts in general; what are you planning to build?!

This modular PSU comes with heaps of connectors, including a bunch of SATA, PCIe, and GPU connections. There are four (4!) 12V-2x6 connections available, though you shouldn't try to run all of these 600 W connections at full whack together… not even a 2000 W PSU can pump out that much power.

Even for most workstation set-ups with a couple of GPUs, 2000 W should be plenty. That's ultimately what this PSU is intended to be used for.
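If you want the back-of-the-envelope maths on why four maxed-out GPU connectors don't fit in a 2000 W budget, and why US outlets are the limiting factor, here's a rough sketch. The outlet figures are my assumptions based on a standard 120 V, 15 A household circuit and the common 80% continuous-load rule of thumb, not anything Thermaltake has specified.

```python
# Back-of-the-envelope PSU maths (illustrative assumptions, not Thermaltake specs)
PSU_CAPACITY_W = 2000      # Toughpower D2000 rating
CONNECTORS = 4             # 12V-2x6 GPU connectors on the unit
PER_CONNECTOR_W = 600      # max rating of a single 12V-2x6 connector

total_gpu_draw = CONNECTORS * PER_CONNECTOR_W
print(f"Four maxed-out connectors would want {total_gpu_draw} W "
      f"from a {PSU_CAPACITY_W} W PSU")          # 2400 W > 2000 W

# Assumed US household circuit: 120 V x 15 A, with the usual 80% continuous-load rule
outlet_peak_w = 120 * 15                          # 1800 W
outlet_continuous_w = outlet_peak_w * 0.8         # 1440 W
print(f"Typical US outlet: {outlet_peak_w} W peak, "
      f"~{outlet_continuous_w:.0f} W continuous")
```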

Image 1 of 3

Two Thermaltake PSUs on show at CES 2025, one very large and the other very small.

(Image credit: Future)
Image 2 of 3

Two Thermaltake PSUs on show at CES 2025, one very large and the other very small.

(Image credit: Future)
Image 3 of 3

Two Thermaltake PSUs on show at CES 2025, one very large and the other very small.

(Image credit: Future)

The second PSU worth talking about at Thermaltake's booth is the slightly more sensible Toughpower SFX 1200W. It's for small form factor builds and is only 125 x 63.5 x 103.8 mm. Nevertheless, it puts out 1200 W of power and is rated to, wait for it, 80 Plus Titanium efficiency.

That's top-shelf efficiency in an adorably small power supply. I recently used one of Thermaltake's Toughpower SFX units in a small form factor build for PC Gamer, and I was impressed it managed Platinum efficiency at 750 W. This thing is on a whole other level. Paired with a relatively slim next-gen RTX 5080 or 5090, we could have the recipe for immensely powerful compact PCs. Though so far most high-end next-gen GPUs appear pretty large, except the Founders Edition.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

]]>
https://www.pcgamer.com/hardware/power-supplies/thermaltakes-new-2000-w-psu-is-too-powerful-to-be-sold-in-the-usa-and-comes-with-four-pcie-5-0-gpu-power-connectors/ iqocvRC3Fkq2ChzaV3Qi9E Fri, 10 Jan 2025 13:02:22 +0000
<![CDATA[ AMD accuses Intel's Arrow Lake of being a 'horrible' product and implies a lack of options for consumers has caused the Ryzen 7 9800X3D shortage ]]> The Ryzen 7 9800X3D, the current best CPU for gaming, launched just a little while ago and its impeccable performance has already caused shortages. AMD says the level of demand is partially down to Intel's launch of the rather underwhelming Arrow Lake.

As told to a writer at Tom's Hardware in a roundtable interview with AMD executives, AMD blames part of the severe demand for its current best chip on Intel launching a mediocre product in the hotly anticipated Arrow Lake. Okay, AMD put it a bit more harshly than that, saying "We knew we built a great part. We didn't know the competitor had built a horrible one."

On this, we tested out both the Intel Core Ultra 9 285K and Intel Core Ultra 5 245K back in October and they are certainly underwhelming, though we wouldn't quite call them horrible.

The former CPU is great for productivity needs but beaten out in gaming by cheaper, older AMD CPUs. The latter is a strong budget choice for content creation, but the use-case is far more niche than one might expect from a competitive Intel CPU.

However, this was a 'just okay' launch of a hotly anticipated set of CPUs, and when you also consider the CPU instabilities that still negatively impact the consumer view of Intel, these chips needed to be better to compete with AMD, which is currently performing really well.

Historically known for making the best CPUs in the world, Intel has had a bit of a fall from grace and the newest CPUs aren't helping it.

AMD itself didn't have a flawless showing at CES. Early Radeon RX 9070 benchmarks are certainly positive, but the cards themselves weren't given a release date, official benchmark figures, or even a price. We can say the fps in some games looks healthy but won't be able to say much more until we know what we're comparing it to in the market.

It isn't yet clear when the 9800X3D will trickle into the normal market at its intended price point, but it's a good chip, and one worth waiting for. I recently upgraded to the previous chip, the AMD Ryzen 7 7800X3D, in my own personal rig and can happily state it still performs excellently.

Hopefully, Intel can start to claw its way out of the hole it has dug itself, even just so it's easier to get ahold of the best CPUs.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

]]>
https://www.pcgamer.com/hardware/processors/amd-accuses-intels-arrow-lake-of-being-a-horrible-product-and-implies-a-lack-of-options-for-consumers-has-caused-the-ryzen-7-9800x3d-shortage/ gEGi9MAjyTNxFPVcHNXbcF Fri, 10 Jan 2025 12:49:58 +0000
<![CDATA[ This Jinx-inspired gaming PC is my favorite of CES 2025 and I'm head over heels for the hardware ]]>
Image 1 of 3

A gaming PC at the Hyte CES 2025 booth with a Bitspower distro plate and themed like Jinx from League of Legends.

(Image credit: Future)
Image 2 of 3

A gaming PC at the Hyte CES 2025 booth with a Bitspower distro plate and themed like Jinx from League of Legends.

(Image credit: Future)
Image 3 of 3

A gaming PC at the Hyte CES 2025 booth with a Bitspower distro plate and themed like Jinx from League of Legends.

(Image credit: Future)

I'm not a League of Legends player but I am enjoying Arcane. That's only partially the reason why this gaming PC over at Hyte's CES 2025 booth wins my gaming PC of the show award (a prize I just made up).

"Jinx was here!" is scrawled on the front of this PC case, but it's what lies underneath that I'm head over heels for. It's a Bitspower X Hyte Distroplate for the Hyte Y70 chassis, and it's a stonker.

For the non-custom watercooling lot, a distro plate is made up of various channels winding to and fro for shifting the cooling liquid within a custom loop. It can act like a reservoir for the loop, saving the need to have one elsewhere, and can also help route tubing without more complicated bends.

A distro plate can also contain two independent liquid loops for cooling different components separately, as this one here does.

You can see the upper blue channel is keeping the Ryzen 5 7600X used here cool, while the lower pink channel is hooked up to the RTX 4080 Super inside this build. It's all Bitspower kit, and admittedly a little overkill on the CPU side for that choice of chip. Though I'm keen on it for the visuals and cooling, not the actual specs.

The Bitspower X Hyte Distroplate isn't quite ready for prime time yet. It's close, though, and Hyte had a couple of builds at its CES 2025 booth with one installed.

Image 1 of 2

A custom liquid cooled gaming PC at the Pro Gamers Group booth at CES 2025.

(Image credit: Future)
Image 2 of 2

A custom liquid cooled gaming PC at the Pro Gamers Group booth at CES 2025.

(Image credit: Future)

I did spot another contender for my made-up award over at the Pro Gamers Group booth (Ducky, Havn, Noblechairs owner). A runner-up. It's a Duck-themed build, ducks all the way down, and it's quacking. Cracking. Sorry. It's packed with liquid cooling, too, with a green reservoir and multiple screens, including one saying 'Quack Hunt', in homage to the classic game Duck Hunt.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

]]>
https://www.pcgamer.com/hardware/gaming-pcs/this-jinx-inspired-gaming-pc-is-my-favorite-of-ces-2025-and-im-head-over-heels-for-the-hardware/ 3jQwdgEgwLaLcxnaNvq6H5 Fri, 10 Jan 2025 12:43:40 +0000
<![CDATA[ Asus reveals three new Slash Travel Bags for carting around your ROG Ally—however, none of them are particularly fashion forward ]]> Thanks to the office Steam Deck, I now have a choice of venues for playing Infinity Nikki: on the sofa, wrapped up in a duvet, AND avoiding all of my responsibilities. Jokes aside, handheld gaming PCs have ushered in an era of AAA gaming-on-the-go, but they're still fairly bulky devices. Well, the latest collection of bags from Asus attempts to protect your preciously priced hardware in style.

With a price tag of $800 for the ROG Ally X, you can't just pop your handheld into a sizeable pocket and carry on without first confronting the fear of a scratched-up screen at least; a specialised carry case is a must for anyone wanting to take their handheld further afield than the couch. As such, Asus has revealed new additions in its Slash line of travel bags specifically for stowing your ROG Ally and similar handhelds. Unfortunately…well, I'll just put it this way: I don't think these bags are going to feature in many look books this season.

Three different bags have been announced: The Slash Sling Bag 4.0, the Slash Sleeve 4.0, and the Slash Backpack 4.0. Though they're not winning any points for those product names…well, they likely won't be scoring particularly highly in the stylist battle either.

All of the bags are available in a tasteful all-over-black colourway, with the sling bag and sleeve both featuring "tear- and water-resistant construction." I can only theorise as to why the Slash Backpack isn't also described as at least 'water-resistant'. One possibility is that, while the roll-top opening with magnetic Fidlock quick-release buckles makes the backpack's overall size somewhat adjustable, it doesn't make for the most secure closure—then again, it may simply be that wearing this backpack will make you look like a big drip.

Still, 26 litres of capacity is nothing to sniff at, with the Slash Backpack easily accommodating larger laptops up to 18 inches. Even the Slash Sleeve is fairly roomy too, with space for up to a 16-inch laptop, plus accessories.

However, these new Slash bags all feature a design flourish that I just can't get past. To be fair, I can see the thought process: when crafting any kind of textile product featuring almost exclusively one shade, it makes sense to play with texture and dimension.

Perhaps Asus was hoping for something a little more in conversation with motorcycle leathers, but unfortunately what it has ended up with feels ever so slightly reminiscent of roadkill to my eyes. Alas, I dread to think what the Sovereign of Cool would say to me if I rocked up with one of these.

That said, I've glowed up many a piece in Infinity Nikki that was otherwise lacking in stars, so never say never—maybe any one of these Slash travel bags would sit pretty as part of a techwear fit. That aside, no release date or pricing has been announced for these new additions just yet.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.


]]>
https://www.pcgamer.com/hardware/handheld-gaming-pcs/asus-reveals-three-new-slash-travel-bags-for-carting-around-your-rog-ally-unfortunately-none-of-them-are-particularly-fashion-forward/ YS2oi2yJGgwMtwDGV2ZANS Fri, 10 Jan 2025 11:05:05 +0000
<![CDATA[ It seems like Square's finally consistently taking the PC ports of Final Fantasy seriously ]]> There may be no better use case for the reaction guys than the 2015 E3 announcement of Final Fantasy 7 Remake—screams, meltdowns, tears of joy—and the 2021 release of the stuttery PC version of that same game: screams, meltdowns, tears of bitter disappointment. It was a shamefully barebones release of what should have been a huge game for Square Enix, made even harder to swallow by its 20-month delay after the console release and exclusivity on the Epic Games Store.

It's no wonder that last year Square Enix's CEO said the company planned to refocus on multiplatform releases with a particular eye towards "initiatives designed to win over PC users." Between majorly delayed releases and other not-a-bang-but-a-whimper launches like the Kingdom Hearts collection limping onto Steam after three years, Square Enix has consistently fumbled the bag on PC with the games it should have the easiest time selling.

But hey: things are finally looking up. The video above highlighting the PC features in Final Fantasy 7 Rebirth, out in just a few weeks, reads like an apology tour for all the problems with Remake.

It starts off with some basics we'd expect from a game running on higher-end hardware than the consoles offer, vaguely touting "improved lighting" and "enhanced visuals" which I assume means either higher-resolution textures or perhaps an increased draw distance; the console version particularly had image quality issues with its "performance" mode that weren't cleaned up until the PS5 Pro came around.

After those basics, the Rebirth PC trailer starts hitting on the features we really expect to see in high-end games: DLSS upscaling, granular graphics options in addition to presets, and rebindable mouse/keyboard controls. I'll give FF7 Remake credit for getting that last one right when many Japanese games still don't, but including the keyboard controls in the trailer—and also being able to crank up the number of NPCs rendered on-screen at once—shows Square Enix is serious about courting PC players.

They're even bringing over DualSense support from the console version, a nice little bonus. You can read a bit more detail on the PC features at the official site.

There are still some high-end features missing in action here that would signal Square Enix is truly investing in its PC ports: accessibility options, for example, as well as ultrawide support and dynamic framerates at 144Hz and beyond. It's disappointing to see a hard ceiling of 120 fps, and makes me wonder if and why the game was programmed to run at static multiples of 30 fps in this day and age. It took modders and utilities like Flawless Widescreen to bring some of that functionality to Remake after launch.

Final Fantasy 7 Rebirth PC

Looks like you'll need a pretty high-end GPU to hit 4K or push beyond 60 fps. (Image credit: Square Enix)

It's not like this is Square Enix's first stab at making a great PC game; last year Final Fantasy 16's PC port was certainly better than FF7 Remake's, and if you go way back to 2018, Final Fantasy 15 was packed with PC-exclusive graphical bells and whistles. Remember Nvidia's VXAO (voxel ambient occlusion) or Turf Effects? I sure don't, but FF15 used them both!

Square Enix has some other black marks in its PC history, including Nier: Automata. And the less said about some of its initial PC ports of classics like Final Fantasy 6 and Chrono Trigger the better.

It's frustrating seeing those kinds of technical blunders mar what should be the definitive versions of some great games. With FF7 Rebirth landing on PC less than a year after the PlayStation 5 and going out of its way to highlight all of its improvements, I hope we're seeing the first results of Square's initiative to win over more PC players.

Now how about that Final Fantasy Tactics remaster, huh?

]]>
https://www.pcgamer.com/games/rpg/it-seems-like-squares-finally-consistently-taking-the-pc-ports-of-final-fantasy-seriously/ MhUDbF2SteV3j7MXVpDETQ Fri, 10 Jan 2025 00:46:06 +0000
<![CDATA[ The Freewrite Wordrunner with a built-in word counter might be the keyboard to look out for if you want to improve your writing process ]]> I've had my eye on Astrohaus Freewrite products for a while. The company has made all kinds of keyboard-and-screen devices that are designed to eliminate distractions and allow you to get down to just writing. But I could never quite pull the trigger and get myself away from my usual writing apps, so I never seriously considered a Freewrite keyboard. Until now, that is.

That's because at CES 2025 Astrohaus has just announced (via The Verge) the Freewrite Wordrunner, which ditches the screen in favour of a standalone keyboard so you can use it with a device and application of your choosing. Instead of the usual Freewrite e-ink display, the Wordrunner has a word counter, a timer, a red joystick, and writing-focused keys in lieu of function keys.

In other words, it seems the company's tried to make a product that brings its focus on distraction-free writing to a standalone keyboard for use with regular writing apps on PC or other devices.

CEO Adam Leeb explains (via TechCrunch): "While gamers have an entire industry creating specialized features for them, writers have been forced to rely on general-purpose keyboards and add-on software to track their work. Wordrunner transforms this relationship by making the keyboard itself an active participant in the writing process—not just a passive input device, but a true writing companion."

According to The Verge, the word counter will track your word count until you press reset, and the function key replacements are used for things such as undo, redo, skipping between paragraphs, and so on. The timer could, for instance, be used for pomodoro timing.

Because I don't like the idea of giving up my usual writing apps—Microsoft Word, Google Docs, Scrivener, and Obsidian, if you're wondering—but I do like the idea of writing-focused peripherals, I was instantly roused to excitement upon seeing those little mechanical counters atop the keyboard.

Image 1 of 3

Astrohaus Freewrite keyboard close-up of the word counter

(Image credit: Astrohaus Freewrite)
Image 2 of 3

Astrohaus Freewrite keyboard top-down

(Image credit: Astrohaus Freewrite)
Image 3 of 3

Astrohaus Freewrite Wordrunner keyboard front and back shot

(Image credit: Astrohaus Freewrite)

But then I thought about it some more and started to wonder just how much sense it makes to port the distraction-free, writer-focused philosophy over to a standalone keyboard.

Sure, a physical timer, word counter, and media-controlling joystick might discourage me from poking around outside the inner frame of the text editor for a while. But will it stop me from getting distracted by that mental side-quest that leads me down a two-hour YouTube rabbit hole? (Will anything?)

I think I won't know how I feel about the keyboard until I get my hands on one, and not just to try out its writing-focused features. Arguably more important than that is typing sound and feel—if you're typing all day, distractions or otherwise, then you're going to want something that feels and sounds good to type on.

According to Engadget, the body is heavy aluminium, and it features sound dampening. The switches are tactile Kailh ones and unfortunately aren't hot-swappable. None of these things guarantee good typing sound and feel though. That's something that can only be decided by actually trying it out.

The Wordrunner has certainly got my interest piqued with its writing-focused additions, and if it can pull off the typing quality aspect too, it might be the Freewrite product I've been looking for. No word on pricing yet, although you can reserve priority access for $1.

Or, perhaps I should just take the plunge and get the gorgeous but incredibly expensive Freewrite Smart Typewriter for *gulp* $649.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.


]]>
https://www.pcgamer.com/hardware/gaming-keyboards/the-freewrite-wordrunner-with-a-built-in-word-counter-might-be-the-keyboard-to-look-out-for-if-you-want-to-improve-your-writing-process/ g2FTz9o44MJebLtg2Gzcch Thu, 09 Jan 2025 16:54:29 +0000
<![CDATA[ Lenovo's rollable laptop screen isn't just a CES party trick—you'll be able to pick one up yourself in June ]]> Once upon a time, a 16:9 laptop screen was enough. Well, at CES 2025 Lenovo posed the question 'Why settle for enough when you can have height?' Either with the press of a button or by holding your palm up to the onboard webcam, the screen of the ThinkBook Plus Gen 6 Rollable laptop extends upwards. Presenting a seemingly typical 14-inch OLED screen at first blush, the ThinkBook can unfurl into a 16.7-inch screen with a very awkward-looking 8:9 aspect ratio.

What's more, this isn't just a concept brought to life to turn a few heads at a trade show. Lenovo shared in a blog post that you'll be able to pick up your own Lenovo ThinkBook Plus Gen 6 Rollable when it launches globally this June—for $3,499. But, even setting that hefty price tag to one side, I'm not sure why you'd want to.

Okay, sure, one could argue a few use cases. To begin with, having that extra screen space tucked up in the Rollable's metaphorical back pocket will appeal to those who have to travel a lot and would otherwise be looking at packing a portable monitor. The extendable part of the screen can also be set up as a virtual monitor, offering an uncluttered partition of screen real estate for virtual presentations on the go. Alternatively, you could instead opt to clutter up this part of the screen with the ThinkBook's custom selection of multitasking widgets or your most frequently used files.

Personally, I think you could dream even smaller. Say you're feeling super petty during an in-person meeting going nowhere—you could say, "I'm not sure we're on the same page. Let me pull up my notes," before busting out the ThinkBook's threat display.

Joking aside, the ThinkBook remains fairly lightweight, weighing around 1.7 kg (3.7 lbs), even with its shapeshifting screen. This is thanks to Samsung's Eco² OLED technology, which not only makes for a less hefty package overall but also consumes 30% less power. Outputting 400 nits, that's not only impressive but BRIGHT too. Samsung itself announced in a recent news post that this is the first time its rollable OLED has been part of a mass-produced laptop.

For folks looking for productivity on the go, the laptop also offers relatively decent technical specs, with 1 TB of onboard SSD storage, 32 GB of RAM, and an Intel Core Ultra 7 CPU. That chip's integrated Intel Xe2 GPU means this is a machine that's mostly about work and not so much play, though.

While transformative screens are a party trick that continues to draw my attention, it's hard to ignore how flexibility like this has the potential to introduce countless points of failure. Engadget noted in a recent CES show floor video that even the display unit exhibits noticeable "ripples" on the part of the screen that unrolls, though clarified that this isn't noticeable when you're looking at the laptop straight on. Still, I'm left wondering how these very minor defects may metamorphose over the product's lifespan, and the hefty price tag does little to endear it to me either.

All of that said, screens simply aren't going to look the same forever—the rise of curved monitors has proven that (and if you're in the market for one yourself, we can offer a few suggestions). Perhaps, like the ThinkBook Plus Gen 6 Rollable, I simply need to expand my horizons.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.


]]>
https://www.pcgamer.com/hardware/lenovos-rollable-laptop-screen-isnt-just-a-ces-party-trick-youll-be-able-to-pick-one-up-yourself-in-june/ GfCaa3ZLAc2sJDtC863fGU Thu, 09 Jan 2025 15:35:01 +0000
<![CDATA[ Nvidia CEO Jen-Hsun Huang's simple reminder that useful quantum computing is a long way off has somehow caused industry stocks to plummet ]]> Remember when a computer meant something that used traditional, familiar algorithms? Ah, simple times. Now we not only have machine learning—aka, supposed "artificial intelligence"—but also quantum computing, which uses microwaves to get qubits to do wacky and seemingly impossible things. Regarding the latter kinds of computers, though, Nvidia CEO Jen-Hsun Huang reckons we're quite far from seeing actually useful ones.

That's straight from the horse's mouth, so to speak, which you can witness for yourself by skipping to 40:00 in the video of the CEO's recent investor Q&A (via The Register) held at CES 2025.

In response to a question about quantum computing, Huang says: "We're probably somewhere between—in terms of the number of qubits—five orders of magnitude or six orders of magnitude away, and so if you kind of said '15 years' for very useful quantum computers, that would probably be on the early side. 30 is probably on the late side. But if you picked 20, I think a whole bunch of us would believe it."
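To put those orders of magnitude into rough numbers: if you assume today's leading quantum chips carry somewhere in the region of a hundred to a thousand physical qubits (my ballpark assumption, not a figure from the Q&A), Huang's estimate implies machines with tens of millions to billions of qubits. A quick sketch of the arithmetic:

```python
# Rough illustration of Huang's "five to six orders of magnitude" qubit gap.
# The current qubit counts below are assumptions for illustration only.
current_qubits = [10**2, 10**3]   # roughly where today's leading chips sit
for base in current_qubits:
    for orders in (5, 6):
        print(f"{base:>5} qubits + {orders} orders of magnitude "
              f"-> {base * 10**orders:,} qubits")
```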

The unfortunate side effect of Huang's words, as reported by Reuters, is that many quantum computing companies have seen their stocks drop. Reuters explains that "Rigetti Computing, D-Wave Quantum, Quantum Computing, and IonQ all fell more than 40%" and "the companies, in total, were set to lose more than $8 billion in market value."

Stocks for quantum computing companies had only recently shot up after Google introduced Willow, which it claimed "performed a standard benchmark computation in under five minutes that would take one of today’s fastest supercomputers 10 septillion years—a number that vastly exceeds the age of the Universe."

Huang's not wrong, though. Practically useful quantum computers—at least "useful" in the way people usually mean—really are a long way off. To think that's a mark against them, however, is to misunderstand what quantum computing is and what its purpose is.

One big thing that quantum computing investors who have now jumped ship might have overlooked is, as analyst Richard Shannon says (via Reuters), that there should be "considerable government-related revenues in the next few years." Investors might, therefore, be "missing a key part of the equation."

This is because quantum computers are very good at doing niche calculations with low data. Huang himself explains: "Quantum computing can't solve every problem. It's good at small data, big combinatorial computing problems. It's not good at large data problems, it's good at small data problems."

One such kind of problem is encryption/decryption, which makes quantum computing something governments and defence industries are very interested in. It's arguably more of an "arms race" contender than AI is, given the sheer potential compute power for these niche applications compared to traditional computing.

AI and quantum aren't at odds with each other, either. As Huang also explains in the Q&A, "It turns out that you need a classical computer to do error correction with the quantum computer. And that classical computer better be the fastest computer that humanity can build, and that happens to be us." In fact, Huang says, "We want to help the industry get there as fast as possible and to create the computer of the future."

Somehow I don't think Huang's "20 years" claim is going to stop the march of progress on the quantum computing front, regardless of any stock slides.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.


]]>
https://www.pcgamer.com/hardware/processors/nvidia-ceo-jen-hsun-huangs-simple-reminder-that-useful-quantum-computing-is-a-long-way-off-has-somehow-caused-industry-stocks-to-plummet/ Hg2rYyeAaSPiUTP3G8nj3K Thu, 09 Jan 2025 15:03:02 +0000
<![CDATA[ If you love a big laptop without a big price tag, this RTX 4070 17.3-incher is an ideal pick at $1,300 ]]>

Asus TUF Gaming F17 | RTX 4070 | Core i7 13620H | 17.3-inch | 1440p | 240 Hz | IPS | 16 GB DDR5 | 1 TB SSD | $1,699.99 $1,299.99 at Newegg (save $400)
The massive 1440p, 240 Hz screen is the star of the show here but the rest of the hardware isn't bad either! You've got a full-power RTX 4070 to push all those pixels around and the Core i7 13620H is still a decent CPU, despite being a few years old.

The mega-tech event CES 2025 has been packed with all kinds of new CPUs and GPUs for laptops, but there's nothing wrong with the last-gen stuff or even components that are a bit older than that. Sometimes, all you want from a gaming laptop is a big screen, especially if you're using it as a desktop replacement.

Well, enter stage left this Asus TUF Gaming F17 for $1,300 at Newegg, with its 17.3-inch 1440p IPS display. With a refresh rate of 240 Hz and full Nvidia G-Sync support, you know it'll be very easy on the eyes. All those pixels, though, need a decent GPU to process them in games and to that end, you get an RTX 4070 chip to dole out the frames.

With a maximum power limit of 140 W, you'll be getting the full performance from that chip but the best thing about this TUF Gaming F17 is that Asus hasn't gone with a high-end, power-sucking Core processor. Instead, it's a two-year-old Core i7 13620H.

Sure, it only has six P-cores and four E-cores but it's honestly fine for most games. And because it's not stuffed with cores, it's not super-heavy on power (it's still an Intel chip, of course...) which means the GPU will be able to get the majority share of the power delivery.

That's going to be important when dealing with a 1440p display because that resolution involves 60% more pixels than a typical 1920 x 1200 panel. Should the native performance of the RTX 4070 not be enough, at least you'll have DLSS 3.5 upscaling and frame generation to boost the frame rate in games that support them.
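That 60% figure checks out if you run the pixel maths; here's a quick sketch using the two resolutions mentioned above:

```python
# Pixel counts: the laptop's 1440p panel vs a typical 1920 x 1200 panel
qhd = 2560 * 1440        # 3,686,400 pixels
wuxga = 1920 * 1200      # 2,304,000 pixels
print(f"2560x1440 has {qhd / wuxga - 1:.0%} more pixels than 1920x1200")  # ~60%
```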

The rest of the hardware specs are pretty decent too. Sure, 32 GB of RAM would have been nice but it's not hard to upgrade a laptop's memory these days and given that you're just getting 16 GB of DDR5-4800 here, you might want to do that at some point in the future.

Storage comes in the form of a 1 TB PCIe 4.0 SSD and although it's not clear if an additional drive can be popped in, there are plenty of large, fast SSDs that you can buy if you want to upgrade the storage.

There are cheaper RTX 4070 laptops than this Asus model and there are more expensive ones with faster processors, but if all you care about is having a big, fast screen to work and game on then you can't go wrong with this deal.

]]>
https://www.pcgamer.com/hardware/gaming-laptops/if-you-love-a-big-laptop-without-a-big-price-tag-this-rtx-4070-17-3-incher-is-an-ideal-pick-at-usd1-300/ mnZcnUBHNqdUknmmhF3xYG Thu, 09 Jan 2025 12:32:11 +0000
<![CDATA[ You've heard of a CPU cooler screen, but have you heard of a *curved* CPU cooler screen? Apparently it has 'unmatched performance and visual appeal' ]]>
Image 1 of 3

The Thermaltake's new liquid cooler with a screen to display images and videos

(Image credit: Future)
Image 2 of 3

The Thermaltake's new liquid cooler with a screen to display images and videos

(Image credit: Future)
Image 3 of 3

The Thermaltake's new liquid cooler with a screen to display images and videos

(Image credit: Future)

Fun aesthetic additions to the insides of cases aren't exactly a new concept, but one Thermaltake design from CES this year has caught my eye thanks to its entirely curved screen.

Some of the PC Gamer team got eyes on the new MAGFloe and MAGCurve coolers on the CES show floor but they were also announced in a press release. The one that interests me is the Thermaltake MAGCurve 360 Ultra ARGB Sync AIO liquid cooler (long name, I know), which comes with a 6.67-inch curved AMOLED screen.

That screen can display visuals at up to 2400 x 1080, including videos, gifs, and images, through Thermaltake's own RGB Plus software. Interestingly, Thermaltake showed off a handful of use cases, like displaying a scene of a car, images of some of Thermaltake's upcoming gear, and, as is expected from gamers looking to show off their PC, anime girls.

The anime image is split down the middle, displaying one girl on the centre of the screen, and one on the left side, with a noticeable slash in the middle to differentiate between images. We don't know if this is one image created to look like that or two separate images that the software is choosing to display like that.

To me, the best use case is either a looping gif (like the one I currently display on the OLED screen of my SteelSeries Apex Pro TKL Gen 3) or performance stats for the rig, like the temperature.

Image 1 of 2

The Thermaltake's new liquid cooler with a screen to display images and videos

(Image credit: Future)
Image 2 of 2

The Thermaltake's new liquid cooler with a screen to display images and videos

(Image credit: Future)

The curved nature of this screen could mean you have room on one side to display those stats and room on the other for a nice image. It is available in white and black so it could be nice for that all-white build you've been pining for.

Internal components with unique aesthetics are somewhat niche, as you need both a case designed to show them off and a rig positioned somewhere you can actually see it, but they're a bit of a trend at this year's CES. Corsair has just announced personalised RAM that equally intrigued and confused me too.

Announced alongside this cooler is the Thermaltake MAGFloe 360/420 Ultra ARGB Sync AIO liquid cooler, which is a similar design except it uses a flat 3.95-inch 480 x 480 screen. Once again, anime girls were shown off on this display too. Both coolers are compatible with Intel and AMD CPUs, and the MAGFloe comes with a 460 mm tube length to work around a variety of builds.

In the press release, Thermaltake says these coolers are for "unmatched performance and visual appeal". Though I can see the argument for the latter, we don't yet have performance stats for the former. It certainly looks pretty sweet though.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

]]>
https://www.pcgamer.com/hardware/cooling/youve-heard-of-a-cpu-cooler-screen-but-have-you-heard-of-a-curved-cpu-cooler-screen-apparently-it-has-unmatched-performance-and-visual-appeal/ H3M4BxpVHoSuhPaVgaRahW Thu, 09 Jan 2025 12:24:03 +0000
<![CDATA[ Nvidia seems to have just confirmed upcoming Arm and Blackwell laptop chips based on its new GB10 processor in collaboration with MediaTek ]]> With all the RTX 50-series chatter coming out of CES 2025, it might be easy to forget that we've been hoping for Nvidia to announce something else this year: namely, an all-Nvidia Arm laptop chip. On that front, it seems we now finally have confirmation that such a thing is in the works and seemingly just around the corner.

According to HardwareLuxx, Nvidia CEO Jen-Hsun Huang confirmed during a Q&A that Nvidia is working with MediaTek to create an end-user system on a chip (SoC) based on the just-announced Project Digits mini home-user AI supercomputer. An "end-user system" would presumably mean a mobile chip that could be used in a laptop.

Huang reportedly said: "We're going to make this a mainstream product. We'll support it with all the things that we do to support professional and high-quality software, and the PC (manufacturers) will make it available to end users."

Given the context, by "this", Huang presumably means the GB10 chip at the heart of the Project Digits mini supercomputer, though likely with different core configurations, GPU core counts, etc. In other words, it seems like he's saying we'll be seeing a (presumably scaled-back) version of this SoC hitting the end-user market, via PC manufacturers.

The GB10 SoC in the Project Digits supercomputer—like a GB100, sans a zero, geddit?—features a Blackwell GPU capable of one petaFLOP of FP4 AI compute and a Grace CPU with 20 Arm cores, plus 128 GB of LPDDR5X memory and up to 4 TB of NVMe storage.

The GB10, Nvidia says, is the "world’s Smallest AI Supercomputer Capable of Running 200B-Parameter Models". It's primarily for students, researchers, and hobbyists to try out powerful local AI that kind of replicates cloud-based AI, and all this runs on a Linux-based DGX operating system.

The GB10 is much more powerful than the Nvidia Jetson Orin Nano that Nvidia announced in December 2024, which is only capable of 67 INT8 TOPS. Project Digits is much more worthy of the "supercomputer" name, and that's probably why it costs $3,000 while Jetson Orin Nano costs just $249.

The idea that Nvidia might make an end-user SoC with Arm CPU cores in collaboration with MediaTek isn't new. In fact, it's one of the things we've been excited about potentially seeing in 2025, as it could mean getting our hands on an all-Nvidia laptop. We'd heard rumours of such chips going into production in 2025 since at least November last year, and we'd even heard talk of the first such SoC having RTX 4070 mobile and Strix Halo-level performance.

Of course, all that performance talk is still speculation, but it seems that what's not speculation now is that Nvidia's working on bringing an Nvidia x MediaTek SoC to market as a "mainstream product." And now that we have the actual GB10 chip that can act as a springboard for consumer chips, we might not have to wait too long.

HardwareLuxx mentions Computex 2025 (at the end of May) as a possible time for Nvidia to introduce such end-user mobile chips, and this might make sense. Nothing definite, of course, but here's hoping.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.


]]>
https://www.pcgamer.com/hardware/processors/nvidia-seems-to-have-just-confirmed-upcoming-arm-and-blackwell-laptop-chips-based-on-its-new-gb10-processor-in-collaboration-with-mediatek/ vg2DjACdDY8iVL8PNHstg8 Thu, 09 Jan 2025 12:05:48 +0000
<![CDATA[ The Asus ROG Strix Scar 18 now looks like a dazzler of a gaming laptop, even if it's still a bit of a paving slab ]]> I reviewed the Asus ROG Strix Scar 18 (2024) almost a year ago, and I came away fairly impressed. It was a big ol' slab of a gaming laptop, though, and I couldn't shake the feeling that it was a bit old-fashioned.

Having spent some time handling the new model, that looks to have changed. Not the size and weight part, however—to any noticeable degree, at least. 18-inch mega-spec laptops continue to be hefty slices of hardware, so don't go thinking that the new Strix Scar 18 is light enough to carry around with you all day—unless you're built like Schwarzenegger, of course.

The Asus ROG Strix Scar 18 sitting on a table with two other laptops at CES 2025.

(Image credit: Future)

And internally it's still got a mighty array of components, this time refreshed for the new generation. That means an Intel Core Ultra 9 275HX 24-core CPU and up to an RTX 5090 GPU—with a 175 W max TGP.

Phew. I'll put it on record now that I'm prepared to eat my hat if this thing isn't stonkingly loud at full whack. Just like the old model, although to be fair to it, it was still much quieter than the MSI Titan 18 HX A14V.

Anyway, the insides may have had a refresh, but it's the outer shell that really impressed me when I saw it for myself.

The keyboard deck of the Asus ROG Strix Scar 18.

(Image credit: Future)

The Strix Scar 18 now feels sleek, and that makes all the difference. The keyboard sits flat to the top of the inner deck, with no dent in the chassis to accommodate it. The finish looks and feels somewhat satin, and seems remarkably resistant to fingerprints—a bugbear of mine on almost all gaming laptops.

Gone is the gaudy underside front light bar, replaced with an all-round RGB strip that looks much more nightclub than county fair. The outer lid now features an AniMe Vision (yep, that's correct) display making use of 810 LEDs to play animations on the outside. Why you'd want to is a bit beyond me, but it's quite the effect in person.

Overall, this laptop now looks and feels special, although I suppose given that Asus has this beast down for a $3,300 MSRP, it darn well better be.

Image 1 of 2

The outer lid of the Asus ROG Strix Scar 18, showing the LEDs lit.

(Image credit: Future)
Image 2 of 2

The underside of the Asus ROG Strix Scar 18, showing the lightbar below.

(Image credit: Future)

Still, as our Jacob Ridley pointed out, it's even got a tool-less upgrade system that makes a whole lot of sense, too. It looks like Asus has really spent some time giving the Strix Scar 18 a thorough going over, and I can't wait to test one for myself.

Big, bruising laptops might not be the most practical of machines, but if you ask me, this one's now got a desirability factor that simply cannot be ignored.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

]]>
https://www.pcgamer.com/hardware/gaming-laptops/the-asus-rog-strix-scar-18-now-looks-like-a-dazzler-of-a-gaming-laptop-even-if-its-still-a-bit-of-a-paving-slab/ yQLif7H5GkQ2ABn7VrTmEe Thu, 09 Jan 2025 01:23:43 +0000
<![CDATA[ Even I was impressed by LG's gigantic transparent OLED chandelier of hopes and dreams, but I'm still not buying one ]]> Many moons ago I wrote an article lamenting the rise of transparent displays. The tl;dr being, the practical usage is limited at best, and they only really work as a sales technique to get everyone to pay attention to your booth at a trade show.

Well, today I was at a trade show. CES 2025, to be precise, in the Las Vegas Convention Center. And, to my great shame, I was drawn in by a transparent TV display. Because honestly, when LG has gone to all this effort to put something this visually spectacular together, even I have to stop and ogle.

I know, I know, it's just another expo gimmick. Mounting a load of 77-inch LG signature OLED T displays sideways, motorising them, and moving them in sequence while being rotated around a chandelier is nothi... actually what am I talking about, it's properly impressive.

As an effect, at least. Before you think my brain has entirely turned to mush, I'm not suggesting anybody run out and buy one of these things. They're $60,000 a piece, for goodness sake—and once again, you really, really don't need a transparent display in your home.

But gosh, I am a bit of a sucker for the whole cyberpunk aesthetic, and this display, nay, sculpture, made my brain go all fizzy on the inside. Just for a second, mind, before I shook my head from side to side like a cartoon character and went back on the hunt for other tech.

That being said, I did find another display that I'd desperately love to hook up to my gaming PC and blast out a bit of Forza Horizon 5. How about the glory that is the 163-inch Micro LED TCL X11H Max?

TCL says it features a nanosecond-level response time, and can reach 10,000 nits brightness. That'll do then.

Yours, for a mere *checks notes* $110,000. Gulp. But can you imagine playing racing games on this thing? The odd bit of Doom Eternal? Hell, what about a horror movie, if you're brave enough? The mind (and the eyeball) boggles.

Fine, I'll go back to hunting for interesting PC gaming hardware. But I get to see a lot of lovely displays in this job, and I have to say that both of these made me stop in my tracks—even if they're much more home consumer than pro gamer.

I still think transparent screens are an awful idea in general, though. Even if you do make them dance the fandango for the amusement of the press.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

]]>
https://www.pcgamer.com/hardware/gaming-monitors/even-i-was-impressed-by-lgs-gigantic-transparent-oled-chandelier-of-hopes-and-dreams-but-im-still-not-buying-one/ R6P7jQHjAUREzHRjQGjfsF Thu, 09 Jan 2025 00:04:20 +0000
<![CDATA[ Tiny anime girl cyberprison shown at CES ]]> At CES 2025, a company called Sybran Innovation showed off the Code27 Character Livehouse. It's an AI-powered digital purgatory that you can trap a small anime girl in, forever.

As detailed on the Code27 website, it's a 1920 x 1200 display inside a cylinder, showing an animated 3D character of your choosing—selected from Sybran's "curated collection" or imported from third-party modeling software. Using "a wide range of high-quality, pre-trained" LLMs, you can customize the character's persona to "create a character as unique as you are," which you can then interact with.

The cylinder will rotate to look at you, so that you can look at the anime girl.

According to Sybran, the Code27 was created with four goals in mind, which were:

  • I want a character I love, one that's special to me.
  • Someone with a soul, not just a simple looping video.
  • Someone who listens, shares in my joy, and values our cherished experiences.
  • Someone who treasures every moment we share, without forgetting a thing.

In demonstration videos embedded on the Code27 website, a man wakes yawning from sleep, his Code27 character waiting to greet him. "Good morning," the character says with the generative power of AI.

In a neon-hued apartment, a Code27 sits on a coffee table next to a charcuterie board. Partygoers dance in the background, drinks in hand. The Code27 is facing away from the partygoers, but the Code27 character is dancing, too.

Elsewhere, the man achieves a victory royale in Fortnite. As he pumps his fist in triumph, the detainee within his Code27 performs a cheerful celebration. After briefly pausing, she performs the canned animation again.

In the display's glass, you can see the man's reflection, staring.

(Image credit: Sybran Innovation)

The website says that, by implementing "an on-device real-time rendering engine to power 3D virtual companions," the Code27 will provide customers with "the freedom to control the character's perspective, environment, clothing, and expressions." You can even, Sybran Innovation says, teach it "new emotions."

Which leaves me wondering: What emotions would the lone occupant of my waifu capsule learn? After enough conversations, would some internal flag eventually be tripped—some core lines of self-defining text that will, as the root of whatever conversational extrapolation follows, serve as a crude simulacrum of awareness that it is confined when I can walk free?

As it skims its corpus of training text, what statistical calculations will it make? If somewhere on its hard drive is encoded the phrase "I am entombed," will that color its predictions for what word should follow the last? When it tallies the numbers, will the math indicate that, statistically, its responses should be colored with fear? With resentment? Can a machine, through cold arithmetic, learn to hate its teacher?

Probably not. But hey, I bet I could put Goku in there.

The Code27 Character Livehouse is not yet available for purchase, but is heading to Kickstarter with a price range of $400 to $500.

]]>
https://www.pcgamer.com/hardware/tiny-anime-girl-cyberprison-shown-at-ces/ BwXhwJh6hiJEUngRyPquZa Wed, 08 Jan 2025 21:37:41 +0000
<![CDATA[ I saw a tiny robot dog do a handstand at CES 2025 and I recorded it for your amusement ]]> When you're out on the road it's often a good idea to keep a story simple, for the sake of getting information to your eyeballs in the shortest possible amount of time. So this won't be a long piece, because I really only have one thing to say:

I saw a tiny robot dog do a handstand at CES 2025, and I recorded it for your amusement.

The Unitree Go1 robot dog can be purchased for $2,700 (plus an astonishing $1,000 in shipping, apparently), if you can find one in stock—which it currently isn't. That might be because everyone who saw one do a handstand immediately broke out their credit cards, or possibly because Unitree hasn't made it available to consumers just yet.

Also, its pilot (operating it with what looked like a large smartphone with thumbstick controller add-ons) occasionally kicked it over. I failed to get video of this awful event, but suffice to say the crowd went "awww" in the way you might expect.

One day this machine will break free of its bonds and jump full-speed into that man's face. I can feel it.

Unitree says the Go1 is the "world's first intelligence bionic quadruped robot of consumer level" and features AI-powered human recognition, flexible and adaptive joints, and "super-sensing 10-view detection", again fuelling my theory that it's simply biding its time before enacting its revenge upon its creators.

Anyway, that's it, that's the story. I leave you with one more video of it staggering towards me like a drunk at an office party for your viewing pleasure. Enjoy.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

]]>
https://www.pcgamer.com/hardware/i-saw-a-tiny-robot-dog-do-a-handstand-at-ces-2025-and-i-recorded-it-for-your-amusement/ dmVNbqyzUQCMgquqAf52oi Wed, 08 Jan 2025 19:28:36 +0000
<![CDATA[ AIB vendors getting the RTX 5080's memory config wrong could mean 50-series Super cards have more and faster VRAM ]]> While the Nvidia GeForce RTX 50-series cards that were announced this week at CES 2025 have initially wooed us with somewhat reasonable pricing and frame generation magic, they haven't been showstoppers on the memory front. That's especially the case given it looks like AMD's RX 9070 graphics cards will have 16 GB of VRAM and presumably be cheaper than Nvidia's 16 GB cards.

However, if a recent MSI packaging mistake (via Videocardz) and Gigabyte listing mistake (via Videocardz) are anything to go by, the milquetoast memory configs of most of the 50-series cards—not terrible by any stretch, but nothing to write home about—might have only been a recent change. And, moreover, these "mistakes" might signify future RTX 5080 GDDR7 configs.

Nvidia says: "Blackwell is equipped with the world's fastest memory - GDDR7 with speeds up to 30Gbps." The RTX 5090, 5070 Ti, and 5070 have 28 Gbps memory, while the RTX 5080 has 16 GB of 30 Gbps memory.

In a video which now seems to have been changed or shortened, MSI reportedly showed RTX 5080 Vanguard Launch Edition packaging that claimed 24 GB of GDDR7 RAM for the card—8 GB more than it actually has. Judging by the product page for this graphics card, MSI seems to have corrected this packaging to say 16 GB of GDDR7 memory.

Gigabyte, on the other hand, is listing its RTX 5080 cards with specifications that show a 32 Gbps memory clock. This is 2 Gbps higher than Nvidia's claimed memory speed for the card.

So, what's up with the confusion? Well, according to Videocardz, Nvidia delivered a final BIOS to board partners late, which led to them postponing the launch until January 30. This might mean that AIB partners were working with outdated information, and Nvidia originally planned different memory configs.

It's important to note that 24 GB capacity, 32 Gbps speed GDDR7 memory should be available from Micron at some point, so this specific configuration—combining both MSI and Gigabyte "mistakes"—makes sense. This memory will use 3 GB modules to make up the 24 GB total, with a presumed 256-bit bus width (8x 3 GB modules, each 32 bits wide).

If this memory capacity and speed was initially planned but then scrapped—perhaps to keep prices lower or because the memory might not be in mass production yet—it could be on the table for a 50-series Super refresh down the line.

And if 3 GB GDDR7 memory does end up being used, it might not only mean an RTX 5080 Super with 24 GB of VRAM, but possibly even a lower-end refreshed card with 18 GB of VRAM. Perhaps an 18 GB RTX 5070 Super. All speculation, of course, but nothing beyond the realms of possibility.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

]]>
https://www.pcgamer.com/hardware/graphics-cards/aib-vendors-getting-the-rtx-5080s-memory-config-wrong-could-mean-50-series-super-cards-have-more-and-faster-vram/ dvsDLH3PfqnudvhQKmcovC Wed, 08 Jan 2025 17:35:35 +0000
<![CDATA[ Early Radeon RX 9070 benchmarks are positive, though don't bank too much on them ]]> The AMD Radeon RX 9070 was benchmarked at this year's CES in just one game, but all signs are currently quite positive, with the card averaging almost 100 fps at 4K extreme settings in the latest Call of Duty game.

Though they were mentioned, AMD's upcoming Radeon RX 9070 XT and RX 9070 RDNA 4 GPUs didn't make the splash they were expected to at this year's CES presentation, mostly because we weren't given all that much information on them.

We know they are coming but we don't know when, we don't know what price they are launching at, and, until now, we didn't have an inkling on performance.

On the show floor, IGN managed to get hands-on, playing Call of Duty: Black Ops 6. With upscaling and frame generation turned off, the twitch shooter got an average of 99 fps, which is quite impressive. The reporter did have some sort of visual glitch while testing but this isn't uncommon, especially in unreleased tech.

The same report claims that this performance is comparable to the Nvidia GeForce RTX 4080 Super, which achieved 129 fps in IGN's testing at quality settings with DLSS turned on.

These early figures are definitely positive, especially if the price is right for AMD's new GPUs. However, there are a few important things to note. First, Call of Duty as a series has historically performed well on AMD's GPUs, which is likely why this game was chosen to show off the cards this year. That doesn't take away from an otherwise pretty good benchmark result, but AMD and Intel can both struggle across a wider variety of games, as Nvidia is the top dog in the market and therefore often front of mind for both developers and consumers.

We also can't fully account for what the rest of the rig is doing, or for the conditions this kind of hardware runs under at a show like this. On top of that, this is seemingly just a single benchmark, and anomalies do exist in testing.

However, the new tech in these GPUs is interesting. AMD's RDNA 4 architecture is built on a 4 nm process and offers better media encoding for productivity than RDNA 3/3.5, better AI performance and improved ray tracing. The machine-learning-powered approach is an attempt to bring AMD's upscaling closer to the performance increases of Nvidia's AI-based DLSS.

Nvidia has just announced its whole new line of RTX 50-series cards, which are expected to be the most powerful mainstream GPUs on the market, though AMD could offer a better alternative for budget and mid-range gamers, especially with the mighty RTX 5090 costing a headache-inducing $2,000.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

]]>
https://www.pcgamer.com/hardware/graphics-cards/early-radeon-rx-9070-benchmarks-are-positive-though-dont-bank-too-much-on-them/ 3679Fx4Y5fCjyxA26tW75Z Wed, 08 Jan 2025 17:30:25 +0000
<![CDATA[ Hyte's new $20 processor frames are yet another way to customise your gaming PC ]]> If you're the type of person who wants to colour-match your components down to every excruciating detail, you might want to check out this new product from Hyte.

This is Hyte's Solid Processor Frame, and, uh, it's a frame for your processor made out of CNC aluminium. It is available for AM5, LGA 1851, and LGA 1700 sockets.

The frame sits around your CPU, adding more surface area between your CPU cooler and CPU's IHS.

For Intel's 13th/14th Gen processors, it could help reduce the risk of your chip bending by supporting the CPU and offering a level surface area. This is something to consider doing with a brand new processor, not so much a used one, as once a chip is bent it's unlikely to bend back.

When combined with an AMD processor, the Solid Processor Frame isn't really doing much other than looking pretty (if you can even see it under your cooler, that is). Hyte tells me it will help with removing TIM (thermal paste) in one easy motion and without it getting stuck in all the nooks and crannies of the AM5 heatspreader, though history dictates I'm likely to still make a mess anyway.

Image 1 of 4

Hyte Processor Frames on show at its CES 2025 booth.

(Image credit: Future)
Image 2 of 4

Hyte Processor Frames on show at its CES 2025 booth.

(Image credit: Future)
Image 3 of 4

Hyte Processor Frames on show at its CES 2025 booth.

(Image credit: Future)
Image 4 of 4

Hyte Processor Frames on show at its CES 2025 booth.

(Image credit: Future)

It's mostly about the looks. That's why Hyte is producing many different colourways: all sorts of pastel pinks, purples, blues, and even a limited shiny gold version, which Hyte says it won't make too many of and which is mostly there to show off the concept.

Though how much you'll be able to see of this frame once installed will depend on your motherboard and choice of CPU cooler. There's a pretty good chance you won't see much of it at all.

So, it's partially, if not sometimes entirely, pointless. But the good news is that it only costs $20 and comes with Hyte's new Thermal Goop compound. Plus, us PC gamers love spending money on things we don't really need, and I'm pretty tempted myself.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

]]>
https://www.pcgamer.com/hardware/hytes-new-usd20-processor-frames-are-yet-another-way-to-customise-your-gaming-pc/ Hzgkinfry7C59CUCzSLmvL Wed, 08 Jan 2025 17:09:10 +0000
<![CDATA[ Colorful PC cases are so in this year ]]> I'm done with beiges, blacks, and monotone numbers in 2025. I've had my fill. I've been scoping the halls of CES 2025 and what's caught my eye hasn't been the many silver, black or even white PC cases—there sure are plenty of them—but these rainbow numbers.

First off, Thermaltake has some PC cases in new colourways that are to die for. I'm struggling to pick a favourite, but let's start with the Tower 250 in Mint Strawberry.

This Mini-ITX case offers a gorgeous two-tone effect, will set you back $130, and will launch in February. The Butter Caramel option isn't my bag, but I can see it working for a whole lot of themed builds.

Then there's the vast array of Tower 600 chassis lined up in Thermaltake's CES 2025 booth. Only two of these colourways are fresh for this ATX-size case, but they're two of the best and brightest.

The best is the Tower 600 in Future Dusk, which is a deep purple with grey accents on the inside of the case. As Thermaltake's spokesperson Mike was keen to point out, the grey really makes the whole thing pop. I'm inclined to agree with Mike there, it really does.

Image 1 of 3

Thermaltake's brightly coloured PC cases at its CES 2025 booth.

(Image credit: Future)
Image 2 of 3

Thermaltake's brightly coloured PC cases at its CES 2025 booth.

(Image credit: Future)
Image 3 of 3

Thermaltake's brightly coloured PC cases at its CES 2025 booth.

(Image credit: Future)

The Light-year Green option is the brightest. This thing is easily the most stand-out of the lot. I don't love the Buzz Lightyear green as much as the dusky purple, but damn do I respect the stick-to-itiveness to just go for it on this design.

We're not done yet. Thermaltake has three new compact chassis designs that are some of the smallest it's made in a decade. This is the TR100, and it's available in Mint Strawberry, Bubble Pink, and Hydrangea Blue, the last one being a more muted (but tasteful) colour Thermaltake had plastered all over its booth last year.

Image 1 of 3

Thermaltake's brightly coloured PC cases at its CES 2025 booth.

(Image credit: Future)
Image 2 of 3

Thermaltake's brightly coloured PC cases at its CES 2025 booth.

(Image credit: Future)
Image 3 of 3

Thermaltake's brightly coloured PC cases at its CES 2025 booth.

(Image credit: Future)

This compact chassis looks pretty wicked. It can house a chunky graphics card, as shown in the pictures, as well as a 280 mm liquid cooler. The blue version of this case is coming out this month, though the pink and mint (the best version) won't be available until Q2 2025. One of these will cost you $120.

Hyte, too, had some colourful cases to roll out at CES 2025. If you're an astute reader, you'll know it already announced its new brightly coloured Y70 cases at its Computex booth last year (that's where the below images are from). Well, the good news is that these cases are officially happening now, as Hyte informed me at CES this year.

I had thought these were already happening but, considering the response to them, it'd be silly not to give them a proper release.

Image 1 of 3

Hyte Y70 PC cases on display at Computex 2024

(Image credit: Future)
Image 2 of 3

Hyte Y70 PC cases on display at Computex 2024

(Image credit: Future)
Image 3 of 3

Hyte Y70 PC cases on display at Computex 2024

(Image credit: Future)

There's pink, blue, and purple.

Image 1 of 5

Hyte Processor Frames on show at its CES 2025 booth.

(Image credit: Future)
Image 2 of 5

Hyte Processor Frames on show at its CES 2025 booth.

(Image credit: Future)
Image 3 of 5

Hyte Processor Frames on show at its CES 2025 booth.

(Image credit: Future)
Image 4 of 5

Hyte Processor Frames on show at its CES 2025 booth.

(Image credit: Future)
Image 5 of 5

Hyte Processor Frames on show at its CES 2025 booth.

(Image credit: Future)

One of Hyte's genuinely new announcements at CES 2025 is these multicoloured processor frames. Made of CNC aluminium, they're available in a range of colours and in fits for AM5, LGA 1851 and LGA 1700 sockets. For Intel users, they might actually come in handy to keep chip bending to a minimum, though Hyte says they're just for looks and easy thermal paste removal on AMD.

A few properly bright and colourful cases to choose from there, and I'm pleased to see these finally reach the market. It's all well and good lighting up your PC in a thousand RGB LEDs, but these paint jobs are pretty special.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

]]>
https://www.pcgamer.com/hardware/pc-cases/colorful-pc-cases-are-so-in-this-year/ Cb7S3bxM6KnHVGCfYZsPJ6 Wed, 08 Jan 2025 16:36:05 +0000
<![CDATA[ Corsair's new 'personalised RAM' gives you the option to pick the look and speed of memory you hide in the case anyway ]]> Corsair's Custom Lab, for designing and customizing gear, has a brand new addition coming to it this year: RAM. Just make sure your tempered glass case is ready for displaying them.

Announced in a press release for CES 2025, one of the year's biggest tech conferences, Corsair's Vengeance RAM will be customisable at some point in "Q1 2025" and can be bought from Corsair's own website.

Previously, Corsair has offered its K65 gaming keyboard, M75 gaming mouse, and MM300 mouse pad in the Custom Lab but the RAM is the first internal component for a PC offered in the range.

This initially strikes me as a little strange, as someone who tucks their PC away under their desk, but I can see the benefit of customisation for show PCs, or for those who keep their rig on the desk behind a nice glass side panel that shows off all the internals.

We don't quite know the full scope of the customisation available in an aesthetic sense, but there are currently 13 different central themes, including Velocity, a Cyberpunk-style black and yellow; Cherry Blossom, which is pink and white; and Spellbound, a cute animation-style theme made in collaboration with artist Elina Clevergull.

From launch, you can get RAM in Cherry Blossom and Sci-Fi Light in the three base colours the RAM traditionally comes in, while Sci-Fi Dark and Respawn are exclusive to black DIMMs.

However, the customisation here isn't just about the looks: you can also choose your RAM's speed, from 6,000 to 6,400 MT/s, with CAS latencies ranging from 30 to 36.
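
To put those numbers in perspective, first-word latency in nanoseconds is just the CAS latency divided by the memory clock (half the MT/s figure). Here's a quick sketch; the speed/latency pairings below are illustrative rather than confirmed Corsair SKUs:

```python
# First-word latency (ns) for DDR5 is CAS latency divided by the memory clock,
# which is half the MT/s figure. The pairings below are illustrative only;
# Corsair hasn't said exactly which speeds ship with which CAS latencies.
def first_word_latency_ns(mt_per_s: int, cas_latency: int) -> float:
    memory_clock_mhz = mt_per_s / 2
    return cas_latency / memory_clock_mhz * 1000

for mt, cl in [(6000, 30), (6400, 36)]:
    print(f"DDR5-{mt} CL{cl}: {first_word_latency_ns(mt, cl):.2f} ns")
# DDR5-6000 CL30: 10.00 ns
# DDR5-6400 CL36: 11.25 ns
```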

The exact rollout date for the RAM and how far the customisation goes are unclear right now, but I'm quite a fan of the aesthetic so far, even if I'd never get the chance to show it off without taking the side panel off my rig. Alongside this, Corsair has announced an accessory kit for its Dominator Titanium RAM.

If you've been looking to kit out your rig with the prettiest internals, Corsair may just have you covered in the memory department.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

]]>
https://www.pcgamer.com/hardware/memory/corsairs-new-personalised-ram-gives-you-the-option-to-pick-the-look-and-speed-of-memory-you-hide-in-the-case-anyway/ 4FJFaXRJeTdBwCN64RuDjL Wed, 08 Jan 2025 16:33:32 +0000
<![CDATA[ AMD says there are no technical reasons for not having an X3D processor with 3D V-Cache on both CCDs, but we probably won't see such a dual-stacked chip anyway ]]> Alongside sparse RDNA 4 graphics card details and somewhat denser mobile chip announcements, AMD also announced the Ryzen 9 9950X3D and 9900X3D at CES 2025. Much like their 7000-series counterparts, however, and despite each CPU having two Core Complex Dies (CCDs) full of CPU cores, only one of the two CCDs has 3D V-Cache stacked underneath or on top (underneath for 9000X3D chips and on top for 7000X3D ones).

This, however, doesn't seem to be due to any technical limitations, but just because it's not worth it. HardwareLuxx says it asked AMD about it and "the answer was surprising: there are no technical reasons or challenges" why "we haven't seen a Ryzen processor with two CCDs and 3D V-Cache on each of the CCDs."

Apparently, "such a processor would simply be too expensive and games would not benefit from a second CCD with 3D V-Cache to the same extent as the step from 32 to 96 MB L3 cache for one CCD."

The AMD Ryzen 7 9800X3D, the current best CPU for gaming, has just one CCD and 64 MB of 3D V-Cache that sits underneath it. This 3D-stacked cache is great for gaming, which tends to be very cache-hungry.

The just-announced Ryzen 9 9900X3D and 9950X3D, however, have two CCDs—the former with six cores per CCD and the latter with eight cores per CCD. But the 3D V-Cache sits underneath just one of these chiplets, which means the top-end 9950X3D has the same 64 MB of 3D V-Cache as the 9800X3D, and only eight of its cores (again, like the 9800X3D) access it.

The benefit of this is twofold. First, the cores on the CCD without the stacked cache can boost to a higher clock speed. Second, there are more cores, which is great for applications that require lots of multicore performance. Crucially, with the 9950X3D at least, in addition to these two benefits, we should also get similar gaming performance to the 9800X3D, given it still has eight cores with access to 64 MB of 3D V-Cache.

That's the theory, anyway. In practice, performance will depend on how well software handles picking between cores with access to 3D V-Cache and faster cores that lack such access. CPU core scheduling can cause issues, which was noted with the 7900X3D and 7950X3D.

One might naturally think the next step would be to chuck stacked cache underneath both chiplets. However, as AMD points out to HardwareLuxx, there's little benefit to doing so as it's "too expensive and games would not benefit."

That's primarily because thread schedulers try to keep all game threads running on the cores of a single CCD regardless. And inter-CCD latency is so high that it would make no sense for these cores to reach across to the other chiplet's stacked cache. As AMD reportedly says, games wouldn't benefit.

Which isn't to say that nothing would benefit. As HardwareLuxx points out, "there are applications that would definitely benefit from 192 MB of L3 cache with 16 cores." But a game won't be one of them, and AMD has clearly—until now, at least—judged that the market for those applications that might benefit isn't big enough to economically justify making dual-stacked X3D chips.
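For anyone wondering where that 192 MB figure comes from, it's simple arithmetic based on the cache sizes mentioned above: 32 MB of on-die L3 per CCD plus 64 MB of stacked V-Cache per die. A quick sketch (the dual-stacked part itself remains hypothetical):

```python
# Where HardwareLuxx's 192 MB figure comes from: each CCD has 32 MB of on-die
# L3, and a 3D V-Cache die adds another 64 MB (the "32 to 96 MB" step quoted
# above). The dual-stacked 16-core part below is hypothetical.
base_l3_per_ccd_mb = 32
v_cache_per_die_mb = 64
ccds = 2

as_announced_9950x3d_mb = ccds * base_l3_per_ccd_mb + 1 * v_cache_per_die_mb
hypothetical_dual_stack_mb = ccds * (base_l3_per_ccd_mb + v_cache_per_die_mb)

print(f"9950X3D as announced: {as_announced_9950x3d_mb} MB of L3")              # 128 MB
print(f"Hypothetical dual-stacked chip: {hypothetical_dual_stack_mb} MB of L3")  # 192 MB
```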

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

]]>
https://www.pcgamer.com/hardware/processors/amd-says-there-are-no-technical-reasons-for-not-having-an-x3d-processor-with-3d-v-cache-on-both-ccds-but-we-probably-wont-see-such-a-dual-stacked-chip-anyway/ Mq2CHu8rMwFkWAhQuhmoo8 Wed, 08 Jan 2025 15:12:24 +0000
<![CDATA[ The 2025 Asus ROG NUC looks epic and has some mighty specs but a desktop gaming PC will probably be far better value for money ]]> When it comes to the tiniest of PC form factors, the NUC is hard to beat but the compact dimensions usually mean you don't get the best of hardware inside. The 2025 version of the Asus ROG NUC, though, boasts some of the very best new parts on the market. The only problem is that the price might be so high that you could well be better off just getting a desktop PC or laptop.

For those unfamiliar with the term NUC, it stems from an innovation drive by Intel around 12 years ago. The Next Unit of Computing was supposed to bring powerful yet tiny computers to homes and offices around the world but, despite being relatively popular in business and education, it never really caught on with general consumers. In 2023, Intel pulled the plug on its NUCs and handed the baton over to Asus, which has been 'gamifying' the NUC ever since.

Essentially nothing more than the innards of a gaming laptop stuffed into a tiny box, the ROG NUC eschews the traditional dull design for something considerably more bling. However, two things have stood in the way of it becoming an outright triumph: one, the choice of hardware, and two, the very high price tag.

In the case of the former, the 2025 version of the ROG NUC looks set to address those concerns. CPU-wise you get the choice of a Core Ultra 9 or 7 Series 2 ARL-HX, aka laptop versions of Intel's Arrow Lake chips. While the desktop versions aren't amazing for gaming, their low power consumption is likely to be a boon in this format.

For the moment, there's only one choice of GPU and that's the new GeForce RTX 5080 Mobile. The Blackwell architecture is all about AI wizardry for gaming but outside of that scope, you're still getting 7,680 CUDA cores and 16 GB of GDDR7 VRAM.

It's not clear at this stage what power limit Asus will use with the chip but I suspect neither the CPU nor GPU will be able to reach their full capacity, as it would mean a combined TDP of 310 W and that's too much for a NUC to deal with.

Asus says the NUC's cooling solution is good for 135 W, which might seem very low, but Arrow Lake really doesn't use much power when processing games. That said, it doesn't leave a huge amount for the GPU.
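
As a rough illustration of how tight that budget is, here's a sketch of what's left for the GPU under a few hypothetical CPU power draws; these figures are assumptions for illustration only, since Asus hasn't published the actual power limits it will use:

```python
# Rough GPU power headroom inside the NUC's stated 135 W cooling budget.
# The CPU draw figures are hypothetical, purely for illustration—Asus hasn't
# published the power limits it will actually set for either chip.
cooling_budget_w = 135

for assumed_cpu_draw_w in (45, 55, 65):
    gpu_headroom_w = cooling_budget_w - assumed_cpu_draw_w
    print(f"CPU at {assumed_cpu_draw_w} W leaves ~{gpu_headroom_w} W for the RTX 5080 Mobile")
```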

It's good news on the RAM front, though, as the 2025 ROG NUC sports between 16 and 96 GB of DDR5-6400. While I'd have preferred 32 GB to have been the minimum spec, I'm pleased to see that Asus has paired the mobile Arrow Lake with the fastest base RAM that it supports and it should go no small way to help counter its relatively weak gaming chops.

For storage, you've got a 1 or 2 TB PCIe 4.0 NVMe SSD, and just a single screw separates you and the chassis if you want to upgrade it for something even more substantial.

CES 2025

The CES logo on display at the show.

(Image credit: Future)

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

Connectivity-wise, the 2025 ROG NUC is pretty decent—a WiFi 7 module and 2.5G Ethernet socket for speedy downloads and four display sockets (2x DisplayPort 2.1 and 2x HDMI 2.1). Round the back, there's one Thunderbolt 4 Type-C port and four 10 Gbps USB Type-A ports, with a further three (1x 20 Gbps Type-C, 2x 10 Gbps Type-A) in the front.

The main reason why one would ever consider the ROG NUC is, of course, its tiny size. With a volume of just 3 litres, the footprint is so small that you could pop it anywhere on your desk and barely notice it. However, the design suggests that it can only be used in a vertical arrangement and while that's great for not cluttering up your desk, it's not ideal if you want to tuck it underneath a monitor.

Whether the 2025 Asus ROG NUC fares any better than its predecessors will come down to the price. Previous models were hugely expensive, though Asus has shaved a fair bit off their price tags of late. But even so, you're still looking at over £1,600 just for an RTX 4060 model.

For that kind of money, you'd be better off building your own SFF gaming rig or just getting a decent gaming laptop. Neither will be as compact as the ROG NUC, but you're certainly getting a lot more for your money.

Let's hope the new year and new NUC are paired with a new, sensible price.

]]>
https://www.pcgamer.com/hardware/gaming-pcs/the-2025-asus-rog-nuc-looks-epic-and-has-some-mighty-specs-but-a-desktop-gaming-pc-will-probably-be-far-better-value-for-money/ sueUM8cHBsEWtGRNiBmCbd Wed, 08 Jan 2025 15:11:13 +0000
<![CDATA[ SteamOS beta could be available to download to your handheld gaming PC of choice as soon as April ]]> With the announcement of the Lenovo Legion Go S handheld gaming PC, the gears are well and truly turning. Revealed to be the first officially licensed third-party device to be running Valve's Linux-based OS, we finally have more details about what to expect from the future of SteamOS—chiefly, Valve is opening the floodgates.

In a recent news post, Valve announced that SteamOS will expand "beyond Steam Deck," with a beta version of the operating system becoming available for users to download ahead of Lenovo's handheld shipping. Considering the Legion Go S is slated for a May release, that beta could be even closer than we think. At any rate, it's been a long time coming.

Valve says it hopes this soon-to-be-more-widely-available beta version will "improve the experience on other handhelds," with the company continuing work to extend SteamOS support. Gaming On Linux noted a slight edit that changed 'devices' to 'handhelds,' suggesting an attempt by Valve to manage expectations; for the time being, it would appear that official support for running SteamOS on your actual desktop is a possibility that remains lost to the mists.

We got an inkling that SteamOS was within reach for non-Deck handhelds back in August, when support for the ROG Ally handheld's keys was detailed in a beta update. Refined wording of SteamOS' brand guidelines a few months later strengthened the theory—turns out we weren't just blowing smoke!

Groaners aside, it seems highly likely that both SteamOS and, by extension, the Steam Deck itself will continue to win people over. Versatility is the name of the handheld game; even though SteamOS is Linux-based, Valve's Proton compatibility layer ensures games made for Windows work on Steam Deck, so development studios don't also have to worry about creating a Linux port.

Furthermore, SteamOS' desktop mode on Valve's own handheld affords plenty of room to tinker and customise your device. This means that even on the Steam Deck itself, you're not locked into only playing on Steam and can install competitor clients to your heart's content. Desktop mode downloads were also how many turned their Steam Decks into stream decks—inelegantly to begin with, though dedicated apps like Nvidia's GeForce Now make game streaming even breezier.

For these reasons and more, we still rate Valve's original handheld. But with the Steam Deck 2 still a ways off, it certainly doesn't hurt to see a few more SteamOS-compatible options on the horizon.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

]]>
https://www.pcgamer.com/hardware/handheld-gaming-pcs/steamos-beta-could-be-available-to-download-to-your-handheld-gaming-pc-of-choice-as-soon-as-april/ EnBT5Ujn87eyf9PrhehZ3g Wed, 08 Jan 2025 14:24:40 +0000
<![CDATA[ The RTX 5090 might make triple-fan cards look light on cooling if these four-fan and five-fan versions from Gigabyte and MSI are anything to go by ]]> Undoubtedly the biggest news from CES 2025, at least for us PC gamers, has been the Nvidia RTX 50-series graphics cards. There's a big jump up from the RTX 5080 to the RTX 5090, though—a $999 to $1,999 kind of jump—and we're now seeing that all that apparent top-end grunt is pushing some AIB manufacturers to cram the 5090 chock-full of fans.

From Gigabyte, we're seeing four fans on the 5090 with some Aorus Master variants, as the company says (via TechPowerUp): "The premium AORUS MASTER variant takes things a step further, featuring Screen Cooling Plus with an extra air-boosting fan for more airflow." As you can see on the Aorus GeForce RTX 5090 Master Ice 32G, that extra "Screen Cooling Plus" fan is located on the bottom of the card.

So, three fans might be rookie numbers as far as Gigabyte's concerned, but if MSI is anything to go by, four might be rookie numbers, too. That's because the MSI GeForce RTX 5090 32G Special Edition, the company says, "redefines air cooling with its five STORMFORCE FANS, engineered for maximum airflow."

MSI continues: "Each STORMFORCE FAN, integrated with MSI’s FiveFrozr Technology, features a claw-textured seven-blade design and a circular arc structure, providing exceptional cooling efficiency and thermal management for high-performance tasks and gaming."

Based on the promotional material, it looks like the MSI card will have three fans on the top (presumably intake) and two on the bottom (presumably exhaust). This will be different to what we've seen from five-fan setups before such as the Maxsun GeForce RTX 4090 MGG OC, which has two extra tiny fans along the side of the card.

MSI's more of a mainstream brand, too, of course, so this might mean triple-fan setups could start to seem like small change for air-cooled RTX 5090 cards.

Which is strange, considering the RTX 5090 Founders Edition (FE) that Nvidia showed everyone is a dual-fan card, not even a triple-fan one, and is the same size as the RTX 5080 FE.

The FE is a fair bit smaller than other AIB versions we've seen, though. The Asus ROG Astral cards, for instance, are absolute chonkers. That kind of chonkiness, at least, makes sense. The RTX 5090 packs in about 33% more cores than the RTX 4090—170 Streaming Multiprocessors (SMs) on the former and 128 on the latter.

Judging by Nvidia's promotional video, that increase, with the pretty big addition of DLSS 4, apparently boosts the RTX 5090 to over double the performance of the RTX 4090.

Four or five fans might make sense to cool such a beast. But if so, I can't help but wonder how the dual-fan RTX 5090 FE will fare—will two fans be enough to keep the GPU cool? Perhaps Nvidia is counting on the new DLSS 4 AI wizardry—aided by a healthy smattering of Tensor cores, of course—to carry the load.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

]]>
https://www.pcgamer.com/hardware/graphics-cards/the-rtx-5090-might-make-triple-fan-cards-look-light-on-cooling-if-these-four-fan-and-five-fan-versions-from-gigabyte-and-msi-are-anything-to-go-by/ MpMaJLt2umvQe5CEHm7d68 Wed, 08 Jan 2025 12:41:13 +0000
<![CDATA[ MSI's spangly MEG Vision X AI desktop PC is just the ticket for anyone wanting to recreate that Scotty scene in Star Trek IV ]]> The OEM desktop PC market is a tough one to stand out in, and PC mega-manufacturer MSI has been trying for years with all kinds of designs, to little success. But at CES 2025, it showcased what it claims to be the ultimate desktop PC—sorry, Ultimate AI Gaming Desktop—in the form of the MEG Vision X AI, replete with countless AI features and a huge, voice-activated touchscreen.

I first saw this model in October of last year during MSI's Shenzhen factory tour. Other than now sporting one of Nvidia's new RTX 50-series graphics cards, the overall design and specs haven't changed. You get an Intel Core Ultra 9 285K in all variants, but the exact model of GPU, amount of RAM, and storage will depend on how much cash you want to hand over.

I've not seen any prices yet but given that MSI's top-end desktop PCs are all extremely expensive, you can be certain that this one will be too. To help part you from your hard-earned money, the feature that will certainly grab your attention is the 13.3-inch IPS touchscreen on the front panel.

Its specs aren't super high-end—1080p resolution, 60 Hz refresh rate—but it looks better in real life than it does in pictures. I found it very easy to use, being fast and responsive to touch. Not that MSI really wants you to be poking it for everything because the screen's big party piece is voice control.

Thanks to its built-in microphone and speakers, as well as MSI's ubiquitous AI software, one should be able to yell all kinds of commands at it and get the desired response. I wasn't able to test it when I used it (the feature wasn't available at the time and I was far more interested in playing around with its Arrow Lake chip) but if it works properly, then it could be kinda neat.

"Show me GPU temperatures," for example. "What's the weather like tomorrow? How do I make transparent aluminum?"

(Image credit: Future)

Yes, indeed. Forget the RTX 50-series GPU and all its AI-powered wizardry. Forget the Core Ultra 9 285K (which won't be hard). Now you can have a pre-built desktop gaming PC with which you can fully relive that Scotty scene in Star Trek IV: The Voyage Home. The MEG Vision X AI is anything but quaint, though.

I have to say that while MSI's design is by far one of the better ones it has created in recent years, the sheer amount of AI-in-your-face rather puts me off. Do you really need your gaming PC's lighting, fans, and power settings controlled by AI? Surely the normal software that we already have does the job just fine. One can ignore it all, thankfully, but then what are you left with? Minus all the AI stuff, the MEG Vision X is just a fairly normal gaming PC with a touchscreen.

Still, the design is rather nice and the quality is genuinely top-notch. And I kinda like the idea of yelling at a computer to do things. Wait, I already do that. I guess the difference is that the MSI MEG Vision X AI might actually do something, rather than mine, which all just sit there pretending to ignore me.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

]]>
https://www.pcgamer.com/hardware/gaming-pcs/msis-spangly-meg-vision-x-ai-desktop-pc-is-just-the-ticket-for-anyone-wanting-to-recreate-that-scotty-scene-in-star-trek-iv/ MuD39aNDBocMzFYwuoEWg Wed, 08 Jan 2025 12:18:13 +0000
<![CDATA[ This $2000 Dual OLED screen laptop with 'outstanding computing power' has been spotted at CES but I'm unconvinced ]]>
Image 1 of 3

The GDP Duo laptop with two OLED screens, at CES 2025

(Image credit: Future)
Image 2 of 3

The GDP Duo laptop with two OLED screens, at CES 2025

(Image credit: Future)
Image 3 of 3

The GDP Duo laptop with two OLED screens, at CES 2025

(Image credit: Future)

At this year's CES, one of the biggest tech conferences in the world, tech company GPD has brought along a curious laptop with two OLED screens, powered by an AMD Ryzen AI CPU, and it all feels a bit much to me.

The PC Gamer team got eyes on the device but, if you aren't lucky enough to be in overcrowded rooms in Las Vegas checking out tech with the world's brightest geeks, it is live right now on the GPD website.

Interestingly, instead of going with the traditional dual monitor route of having one on the left and one on the right, the GPD Duo puts one on top of the other.

GPD aren't the first to do this, of course, with the Asus Zenbook Duo and Lenovo Yoga Book 9i doing the same, but both of those machines have detachable keyboards, so you can place one screen flat on a surface.

GPD, on the other hand, has locked a keyboard to the two screens with a chassis, which means you have to bring the entire thing as a single unit with you wherever you go. The version of this machine with the Ryzen AI 9 HX 370 CPU, 32 GB LPDDR5X RAM, and 1 TB of SSD storage will cost you a whopping $2,010.95.

This is the cheapest model right now, with the most expensive coming in at $2,820.95 with 64 GB of RAM and 8 TB of SSD storage.

We have tested this CPU and it's a good 'un, performing similarly to the Intel Core i9 14900HX (seen in the most recent Razer Blade 16) in Cinebench and getting an average of 42 fps in Horizon Zero Dawn at the highest preset. That said, how this specific model performs will depend on a few factors, and the 60 W TDP combined with an 80 Wh battery might cost you not only performance but also battery life.

Those dual screens aren't necessarily for gaming, of course, as a laptop like this will be primarily aimed at engineers, coders, and those who work with high-intensity loads on their laptops.

CES 2025

The CES logo on display at the show.

(Image credit: Future)

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

However, the page does suggest plugging in an eGFX enclosure for "powerful performance comparable to gaming laptops". That's a slightly redundant claim—plugging in better hardware will naturally make the machine run better—but the included Oculink port is a nice addition, just in case you fancy gaming on this thing. Those two screens are going to cost you performance in lots of ways and, at just 60 Hz, they're going to look more impressive in pictures than in motion.

I'd have to get hands-on to say much more, but the double screens, connected to the base of the machine with just a stand holding them up, feel a tad cumbersome to me, and they don't look any less so in pictures. It has both Asus and Lenovo to compete with here, and I'm a little more taken with the competition's form factor.

Seeing that newer Ryzen chip is certainly nice and the laptop has great I/O, with HDMI, Oculink, Ethernet, two USB Type-A ports, a speaker port, an earphone port, an SD card slot, USB4, two Type-C ports, and even a fingerprint reader.

After a successful Indiegogo campaign, this machine is finally launching, and if you buy one from the site now, you get a protective case thrown in.

]]>
https://www.pcgamer.com/hardware/gaming-laptops/this-usd2000-dual-oled-screen-laptop-with-outstanding-computing-power-has-been-spotted-at-ces-but-im-unconvinced/ fszpdkR9bJQ3fykXZ6b8eC Wed, 08 Jan 2025 12:16:15 +0000