Intel has revealed pricing for the remaining GPUs in its Arc A700 series: when the lineup launches on October 12, the A750 will be priced below $300.



As the retail launch of Intel’s highest-end graphics card range draws near, we are getting more answers to critical market questions like pricing, launch dates, performance, and availability.

Today, Intel provided additional answers to questions about the A700-series GPUs, along with assertions that every card in the Arc A700 series is competitive with Nvidia’s RTX 3060, which has been on the market for 18 months.

After announcing earlier this week that its A770 GPU will cost $329, Intel reiterated that it will debut three products in its A700 series on October 12:

- The previously mentioned Arc A770, priced at $329 with 8GB of GDDR6 memory
- The Arc A770 Limited Edition, priced at $349 with 16GB of GDDR6 memory at a slightly higher memory bandwidth, but otherwise identical specifications
- The slightly less powerful Arc A750 Limited Edition, priced at $289


If you missed the memo on that sub-$300 GPU when it was announced: the A750 LE is essentially a binned version of the chipset found in the A770, with 87.5 percent of the shading units and ray tracing (RT) units enabled, along with an ever-so-slightly downclocked boost clock (2.05 GHz, compared to 2.1 GHz on both A770 models).
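As a sanity check on those cut-down figures, the arithmetic can be sketched in a few lines. The Xe-core counts below (32 for the A770, 28 for the A750) are the widely reported configurations, not numbers stated in this article, so treat them as an assumption:

```python
# Sanity-check the "87.5 percent of shading units" figure using the
# widely reported Xe-core counts (an assumption; not from the article).
A770_XE_CORES = 32
A750_XE_CORES = 28

core_ratio = A750_XE_CORES / A770_XE_CORES
print(f"A750 enables {core_ratio:.1%} of the A770's Xe-cores")  # 87.5%

# The boost-clock gap, by contrast, is tiny (clocks from the article):
clock_ratio = 2.05 / 2.10
print(f"A750 boost clock is {clock_ratio:.1%} of the A770's")
```

With 28 of 32 cores enabled, the 87.5 percent figure falls out exactly, while the clock difference amounts to only a couple of percent.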

Intel had previously confirmed that any new purchases of GPUs from the Arc A700 series made before January 2023 would come with a collection of downloadable games and software. These would include the remake of Call of Duty: Modern Warfare II that was released this year, Gotham Knights, and other titles.

In lieu of objective benchmarks, Intel is leaning on a murky GPU statistic: “performance-per-dollar.”

Intel representatives declined to clarify initial shipment counts for its first three A700-series GPUs during a conference call with the press, other than to suggest low stock for the larger-memory A770 LE. “I suspect we’re going to sell out of that one very quickly,” Intel Graphics Fellow Tom Petersen told Ars.

He was hesitant to provide a definitive answer about whether he anticipated an early sellout of Intel’s A700 GPUs:

“We don’t know if we’re going to have a supply problem or a demand problem. I hope we have a demand problem.” 

After that, he confirmed that Intel intends to produce its own in-house GPU models over time, rather than ceasing production of the “LE” cards while demand may still be present in the market.

Unfortunately, Intel has not confirmed which add-in board (AIB) partners will be part of the A700 series’ October rollout, adding more confusion to the availability question.

Petersen kicked the can farther down the road, suggesting that those third-party GPU makers will make their own announcements, then stating an interest in growing Intel’s list of Arc-powered AIBs.

Intel’s recent presentation contains gaming benchmark measurements that directly compare the 8GB A750 with an EVGA variant of the RTX 3060, which packs 12GB of GDDR6 RAM. Ars Technica has not yet independently verified Intel’s testing results.

The above chart and a few others use a confusing “performance-per-dollar” metric that obscures raw frame-rate comparisons, listing neither raw frame rates nor clear percentage differences.

But Intel seems determined to make that performance-per-dollar metric quite loud in the A700 series’ promotional push. For example, it has advertised that the higher-end A770, priced at $349, nets “42 percent” more performance-per-dollar, on average, than an RTX 3060, which sells at retailers for an average of $418. The same fuzzy-math sales pitch suggests that the $289 A750 will net “53 percent” more average performance-per-dollar than the same RTX 3060 model.

We look forward to someone in the Ars comments section breaking down that incomplete algebra to establish the true performance disparity between each product, at least according to Intel’s own internal testing methodology. In the meantime, we’ll keep an eye out. The result may line up with prior remarks from Intel, which put the A750 somewhere between 3 and 5 percent faster than the RTX 3060.
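Taking a first stab at that algebra: if we assume Intel computed performance-per-dollar simply as average frame rate divided by street price (Intel has not confirmed its formula), the advertised gains can be inverted to recover the implied raw performance gap. Prices come from the article; everything else is an assumption:

```python
# Invert Intel's "performance-per-dollar" claims to estimate the implied
# raw performance ratio versus the RTX 3060. Assumes perf-per-dollar is
# (average frame rate) / (street price), which Intel has not confirmed.

RTX_3060_PRICE = 418  # average retail price cited by Intel


def implied_perf_ratio(card_price, perf_per_dollar_gain):
    """Given a card's price and its claimed perf-per-dollar advantage over
    the RTX 3060 (e.g. 0.42 for "+42 percent"), return the implied
    raw-performance ratio versus the 3060."""
    return (1 + perf_per_dollar_gain) * card_price / RTX_3060_PRICE


a770 = implied_perf_ratio(349, 0.42)  # Intel's claim for the $349 A770
a750 = implied_perf_ratio(289, 0.53)  # Intel's claim for the $289 A750

print(f"A770 implied performance vs RTX 3060: {a770:.1%}")
print(f"A750 implied performance vs RTX 3060: {a750:.1%}")
```

Under that assumption, the A770 comes out roughly 19 percent faster than the RTX 3060 and the A750 roughly 6 percent faster; the latter is at least in the neighborhood of Intel’s earlier 3-to-5-percent claim.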

Intel continues to concede the Arc series’ most significant first-generation teething issue: the A700 series’ drivers and hardware do not yet keep up with the RTX 3060 in DirectX 11 performance.

Although Intel claims that a few DX11 games perform nearly identically, or even better, on the Arc A770 compared to the RTX 3060, its representatives admit that Nvidia has a generally noticeable lead in those older games.

When asked further about how each GPU compared to the more favorably rated RTX 3060 Ti, Petersen pushed back, again appearing focused on the price disparity between GPUs: 

“Pricing on the 3060 Ti is absolutely ridiculous, so we didn’t want to include it in our research,” he stated.

As I’ve mentioned in the past, the RTX 3060 launched with a significant performance deficit compared to the RTX 3060 Ti. However, if Intel manages to push meaningful gains in general rasterization, specific ray tracing workloads, and XeSS-powered image reconstruction, its price-to-performance pitch may pan out for anyone interested in an Nvidia alternative. (So long as it’s in stock at your favorite retailer, anyway.)

