Techzim

Zimbabwe and regional technology news and updates


Intel Arc GPUs bringing the fight to Nvidia

This one is for the tech heads. When you're shopping for a performance PC, you'll know that the more serious the machine, the less integrated the system. What do I mean? Well, lower-tier computers have a single processor handling everything, and by everything I mean both the computation tasks and the graphics.

When you get two processors, each dedicated to its own task, that's when you know your system means business: a powerful processor for computational tasks like compiling code, and a separate powerful processor just for graphics, for those running 3D rendering software, like architects and 3D animators.

The GPU side of things is where we see the difference between a powerful computer and a very powerful machine, with the two names to look out for being AMD Radeon and Nvidia GeForce. And it's been like this for the longest time. Intel made some great processors. AMD made good-enough processors, until recently when they started making some pretty desirable ones, and Nvidia made the best GPUs, with AMD again also making good-enough GPUs.

Intel has been making dedicated GPUs for some time

So you can bet I dropped everything when I heard that Intel is now making actual GPUs to take the fight straight to Nvidia and AMD. What? To be clear, Intel has been making GPUs for Ultrabooks for quite a while now. You may have seen a laptop with a Core i5 or i7 sticker plus another one that reads Intel Xe graphics. That part right there was Intel's first step in its quest to make GPUs. The graphics performance of the Xe graphics in these laptops is a lot better than the integrated Intel HD or UHD graphics we were used to getting. In fact, it's comparable to Nvidia's power-efficient MX series of GPUs used in these same laptops.

Intel Arc

The names. They are calling them Intel Arc and there are three tiers: Arc 3, Arc 5, and Arc 7. If you are paying attention, the nomenclature is pretty much the same as their CPUs, and if that is so, maybe there will be an all-conquering Arc 9? Maybe.

So Arc 3 will be the entry-level card, with Arc 7 being the most powerful. There are two variants of the Arc 7, an A750 and an A770, with the A770 being more powerful than the A750 and offering an option to double the video memory from 8GB to 16GB.

Performance: Intel Arc A750 vs Nvidia GeForce RTX3060

How does it perform compared to the biggest name in GPUs, Nvidia? It seems to hold its own, actually. Intel ran the Arc 7 A750 through some benchmarks against the GeForce RTX 3060, and it consistently performed better than the establishment. Looking at frame rates in a selection of games that Intel used as benchmarks, the A750 was pushing out about 12% more performance than the RTX 3060. The RTX 3060 is a solid card for gaming at QHD resolution and 120fps with quality settings on high AND ray tracing, so on the user experience side Intel is definitely here to play.

Where things get a bit interesting is power consumption. The performance gains the A750 brings to the table come at a hefty cost in power draw. Nvidia quotes the RTX 3060's power consumption at 170W, whilst Intel quotes the A750's at 225W. So you are getting a 12% boost in performance at the price of roughly 32% higher power consumption. And if we are to guess that the top-tier A770 will have similar performance to the RTX 3070, the difference in power draw there is more negligible, with Nvidia drawing just 5W less than Intel.
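To put those two numbers side by side, here is a quick back-of-the-envelope performance-per-watt calculation. It only uses the figures quoted above (the ~12% benchmark uplift and the vendor TDP numbers); it is simple arithmetic, not measured data.

```python
# Rough performance-per-watt comparison using the figures quoted above.
# Inputs: ~12% average uplift from Intel's benchmarks, vendor TDP numbers.

def perf_per_watt(relative_perf: float, power_w: float) -> float:
    """Relative performance divided by power draw (higher is better)."""
    return relative_perf / power_w

rtx3060 = perf_per_watt(1.00, 170)   # RTX 3060 as the baseline, 170 W
a750 = perf_per_watt(1.12, 225)      # A750: ~12% faster, 225 W

print(f"RTX 3060: {rtx3060:.5f} perf/W")
print(f"A750:     {a750:.5f} perf/W")
print(f"A750 vs RTX 3060: {(a750 / rtx3060 - 1) * 100:.1f}% perf/W")
```

Run the numbers and the A750 comes out roughly 15% worse on performance per watt, so the raw frame-rate win does cost you at the wall.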

Something to note is that across the range, the Nvidia cards come with more vRAM than the Intel cards. In the tests Intel performed, the vRAM deficit does not seem to introduce any performance differences in gaming scenarios. The A750 Intel benchmarked had 8GB of onboard video memory versus 12GB on Nvidia's RTX 3060. I suspect that in memory-heavy applications like 3D rendering in architectural software or 3D animation, the extra vRAM in the RTX 3060 will pay dividends over the A750.

Intel focuses on the masses

So here is my thinking about Intel's GPUs. If the A750 is being benchmarked against the RTX 3060, a lower-mid-tier GPU, the A770 should be similar in performance to the RTX 3070, which is a mid-tier GPU. That would make the A580 an RTX 3050-level card and the A380 a lower performer than the RTX 3050.

So, at least for the time being, Intel looks to be targeting low to mid-tier systems with their cards, offering more options for someone looking for a dedicated card on a budget. AMD could be sweating right now because not only is their ray tracing not yet up to scratch with Nvidia's, but they also recently decided to price their cards higher than Nvidia.

Pricing

Speaking of pricing, the A380 seems to be the one that's available to the public right now, going for US$140 on Newegg. Some rumors suggest that the A580 will go for around US$280, the A750 for around US$350, and the A770 for around US$400. Not too bad, is it? Oh, and there are some laptops already shipping with the mobile variant of Intel's Arc GPUs, but all of those listed on Intel's website are running the lower-tier laptop GPUs, the A350s and A370s. A5 and A7 series GPUs are not yet available in either desktop or laptop variants.
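If the rumored A750 price holds, you can sketch a rough price-to-performance comparison against the RTX 3060. Note the assumptions: the US$350 figure is a rumor quoted above, the 1.12 factor is the ~12% uplift from Intel's own benchmarks, and the US$329 baseline is Nvidia's widely reported RTX 3060 launch MSRP (street prices have often been higher).

```python
# Hypothetical price-to-performance sketch. Assumptions:
#   - A750 at its rumored US$350 price with ~12% more performance
#   - RTX 3060 at its widely reported US$329 launch MSRP as the baseline

def dollars_per_perf(price_usd: float, relative_perf: float) -> float:
    """US dollars paid per unit of relative performance (lower is better)."""
    return price_usd / relative_perf

rtx3060 = dollars_per_perf(329, 1.00)  # launch MSRP, baseline performance
a750 = dollars_per_perf(350, 1.12)     # rumored price, benchmark uplift

print(f"RTX 3060: ${rtx3060:.0f} per perf unit")
print(f"A750:     ${a750:.0f} per perf unit")
```

On those assumptions the A750 works out to roughly US$313 per performance unit versus US$329 for the RTX 3060, so the value proposition holds up, as long as you can stomach the extra power draw.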

What a time to be alive. Intel making competitive GPUs. Who would have guessed? Let me know what you think about this move. Will Intel overtake AMD?


5 thoughts on "Intel Arc GPUs bringing the fight to Nvidia"

  1. It’s not just as simple as that, even if they are any good. The question is how many games, 3D suites or AI/ML frameworks support or will support the GPUs. “Support will build up over time” is not a guaranteed approach these days.

  2. I think AMD is just starting to integrate some of its recent CPU learnings into its GPU lines, so at least in the short term, their stick is long enough to beat off competition from Intel. Still, for a first-gen commercial outing, Intel seems to be here to play. They will, however, have to bring all their influence and muscle to bear on developers to build up support. If they can do that and build an 'Intel Advantage', they will be a serious threat.
