Everything you need to know about Intel’s new GPU, from ray tracing support to its DLSS competitor.
Intel’s highly anticipated Arc GPU is expected to give gamers a much-needed third option beyond AMD and Nvidia when it’s released. Because we know you do more reading about gaming GPUs than actually using them these days, we’ve compiled everything we’ve learned about Intel’s first entry into high-end discrete graphics.
We’ll keep this story updated with more information as we get closer to the launch of Arc.
Why is it called Intel Arc?
Until this week, Intel’s discrete graphics effort had been touted as Xe-HPG, for High Performance Gaming. Although the Xe-HPG microarchitecture will be somewhere on the box, the GPU now takes on the far more memorable ‘Arc’ branding, which, we’re told, comes from the “story arcs” in games. We just know it’s easier to remember than Xe-HPG.
Will it support hardware ray tracing?
When Intel first surfaced its intentions to compete with Nvidia and AMD, the road to making ray tracing a desirable feature was just getting started. Today, a lack of ray tracing support is seen as a weakness.
Intel’s Alchemist Arc will have hardware ray tracing support. Each “slice” will feature a ray tracing unit of unknown performance, which will enable key features for ray traversal, bounding box intersection, and triangle intersection. In short, Arc will have hardware ray tracing out of the gate. The first Arc will also feature mesh shading tier 2 and pixel sampling, and be DirectX 12 Ultimate-compatible.
How will Arc deal with Nvidia’s DLSS?
Gamers and the media initially viewed advanced ray tracing features with skepticism and fear of slowdowns. Nvidia’s AI-based DLSS 2.0 turned the tide of opinion by delivering impressive 4K gaming from frames rendered at 1080p and upscaled.
With XeSS, Intel will use machine learning and temporal feedback techniques to let Arc GPUs render a game at 1080p before increasing the resolution to 4K. Intel says it’s possible to run an Unreal Engine demo with “no visible quality loss” using XeSS 4K vs. 4K native, while doubling frame rates. From what we’ve seen, it looks pretty impressive. But you won’t believe it until you see it yourself, so we’ve uploaded it to YouTube for you.
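If you’re wondering where that frame-rate headroom comes from, the back-of-the-envelope math is simple: a 4K frame has four times the pixels of a 1080p frame, so shading at 1080p and letting XeSS reconstruct the rest slashes per-frame shading work. (Real-world gains are smaller than 4x, since the upscaling pass itself costs GPU time, hence Intel’s claimed roughly doubled frame rates.) Here’s the arithmetic as a quick sketch:

```python
# Why rendering at 1080p and upscaling to 4K helps: count the pixels.

def pixels(width: int, height: int) -> int:
    """Total pixels in a frame at the given resolution."""
    return width * height

native_4k = pixels(3840, 2160)      # 8,294,400 pixels per frame
rendered_1080p = pixels(1920, 1080)  # 2,073,600 pixels per frame

# The GPU shades 4x fewer pixels per frame before XeSS reconstructs
# the 4K image from that 1080p input plus temporal data.
print(native_4k / rendered_1080p)  # 4.0
```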
Does XeSS work only on Intel’s Arc GPU?
One drag on Nvidia’s DLSS 2.0 is that you need Nvidia hardware to run it. Intel, seemingly having nothing to lose, said XeSS will work on any hardware that supports DP4A or INT8 instructions, which most modern Nvidia and AMD GPUs do. Intel said this will let XeSS reach a wider audience, although it won’t look quite as good nor run as fast on other platforms. Such a move also makes the feature more palatable to game developers, who simply want as many customers as possible for any feature they implement.
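If you’re curious what a DP4A instruction actually computes, here’s a rough illustrative sketch (ours, not Intel’s code): a dot product of four packed 8-bit integers accumulated into a 32-bit result. Machine-learning upscalers lean on exactly this kind of cheap low-precision math for their inference passes.

```python
# Illustrative model of the DP4A operation: multiply four pairs of
# 8-bit integers and add the products into a 32-bit accumulator.
# GPUs do this in a single instruction; we spell it out here.

def dp4a(a: list, b: list, acc: int = 0) -> int:
    """Dot product of two 4-element int8 vectors, plus an accumulator."""
    assert len(a) == len(b) == 4
    for x, y in zip(a, b):
        assert -128 <= x <= 127 and -128 <= y <= 127  # int8 range
        acc += x * y
    return acc

print(dp4a([1, 2, 3, 4], [5, 6, 7, 8]))  # 5 + 12 + 21 + 32 = 70
```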
Are a lot of game developers supporting XeSS?
Having a new hardware feature is one thing; getting developers to support it is another. For XeSS to go anywhere, Intel has to secure developer support for it, as well as its cards. Right now, the company isn’t saying how many are jumping aboard the XeSS or Arc bandwagons.
How many EUs does Arc have?
The answer is zero, because EUs are now gone. Intel has previously used execution units to describe the graphics engines in its graphics chips. For example, an 11th-gen Core i7-1185G7 features Iris Xe graphics with 96 EUs. Intel said EUs were getting too large and unwieldy to use as a descriptor, so it will now adopt Xe-cores. So, that Core i7-1185G7 would now have six Xe-cores. Each Xe-core contains 16 vector engines and 16 matrix engines. Four Xe-cores make up a slice, with the first Arc GPU expected to support up to eight slices. Each of the slices is connected through a high-bandwidth L2 fabric.
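For the curious, the conversion math from Intel’s stated building blocks works out neatly (figures taken from this article; the totals are our arithmetic, not an Intel spec sheet):

```python
# Intel's stated building blocks for Arc (per this article).
VECTOR_ENGINES_PER_XE_CORE = 16  # one Xe-core maps to 16 old-style EUs
XE_CORES_PER_SLICE = 4
MAX_SLICES = 8  # up to 8 slices expected in the first Arc GPU

# The 96-EU Core i7-1185G7 restated in the new terminology:
eus_i7_1185g7 = 96
print(eus_i7_1185g7 // VECTOR_ENGINES_PER_XE_CORE)  # 6 Xe-cores

# The biggest expected Arc configuration:
xe_cores = XE_CORES_PER_SLICE * MAX_SLICES
print(xe_cores)                                 # 32 Xe-cores
print(xe_cores * VECTOR_ENGINES_PER_XE_CORE)    # 512 EU-equivalents
```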
How power-efficient is Arc?
Intel didn’t discuss Arc’s power efficiency in detail, but it did say Xe-HPG’s performance-per-watt and frequency increased by 1.5X when compared to the Xe Max used in laptops. The performance and clock scaling improvements were a group effort, the company said, with an assist from Intel’s manufacturing partner.
Is TSMC making Arc GPUs?
In a first for Intel, the company will tap external foundries to make its leading-edge products. That means its Alchemist Arc graphics silicon will be made by TSMC on its N6 node.
Are the Arc codenames from games?
Intel dumped its usual boring code names for its GPU launch. The first Arc is code-named ‘Alchemist,’ with ‘Battlemage,’ ‘Celestial,’ and ‘Druid’ to follow. The names, the company said, are based on character classes from various games over the years. All we know is it’s an improvement over code-naming it ‘Los Angeles River,’ which you’d have to see in person to know what we’re talking about.
When can I buy Intel Arc?
We know, we know, you just want to know when you can buy it. Unfortunately, you’ll have to wait until Q1 2022.
Will Arc be as fast as AMD and Nvidia GPUs?
Arc’s performance is very much an unknown. With AMD and Nvidia GPUs harder to get these days than a Willy Wonka Golden Ticket, it may not matter to PC gamers desperate to get any gaming-capable GPU.
How much will Intel Arc cost?
Intel hasn’t said how much it will charge for its first high-end GPU. We suspect you won’t find that information until the day it launches. The real question is how much it should charge: Intel could match the prices of comparable AMD and Nvidia GPUs, or it could lowball them, effectively dropping a pricing grenade on AMD and Nvidia.
Will there be Linux drivers for Intel Arc?
Intel has pledged support for Linux, and it even plans to support Vulkan ray tracing once Arc cards ship.
Will Intel Arc run Crysis?
Yes, gamers: Arc will run Crysis, the meme-spawning game so punishing that people still whine about it 14 years later. Even better, it’s Crysis Remastered that Intel showed off in video captured on pre-production silicon.
One of the founding fathers of hardcore tech reporting, Gordon has been covering PCs and components since 1998.