ASRock announces souped up Radeon RX 5600 XT Challenger Pro 6G OC

by Emily Smith

ASRock has added another Radeon RX 5600 XT to its roster, promising some significant factory overclocks. 

The Radeon RX 5600 XT Challenger Pro 6G OC bears a striking resemblance to Gigabyte’s Radeon RX 5600 XT Gaming OC 6G thanks to its triple-fan design, but ASRock has pushed the clocks a fair bit further than its competitor. It boasts a base clock of 1,420MHz, a game clock of 1,615MHz and a boost clock of up to 1,750MHz. Compared to the reference Radeon RX 5600 XT, that works out at a more than 25 percent improvement on the base clock, a 17.5 percent increase on the game clock, and 12.2 percent on the boost clock.

The 6GB of GDDR6 on a 192-bit bus has seen a similar improvement: its memory clock is set at 1,750MHz, running at 14 Gbps effective and working out around 17 percent faster than the reference speed. Of course, the whole thing is based around AMD’s Navi 10 silicon, meaning 2,304 Stream Processors.

ASRock reckons this makes it the perfect card for gamers looking for great 1080p performance. It’s fortunate, then, that the card utilises that noticeable triple-fan design. There’s also the matter of a cooler so long that it extends past the card’s PCB – something to bear in mind if your PC case is a little cluttered. Note that the triple-fan setup is semi-passive, so it shouldn’t make too much noise unless you’re really pushing the card’s potential.

There’s no mention of the card’s TBP, but it draws power through a single 8-pin PCIe power connector, so a regular 550W power supply should be plenty here. When it comes to outputs, you get one HDMI 2.0b port and up to three DisplayPort 1.4 outputs.

ASRock hasn’t yet announced a price for the card or when to expect its release. We’d expect the release to be imminent, but price will be an important factor, with the card arriving at a time when we’re on the brink of a new generation of GPUs.
