Published in Graphics

Nvidia partners working around GTX 1070 Ti factory-overclock ban

27 October 2017


One way or the other

By now, it is obvious that Nvidia has blocked factory overclocking of the GTX 1070 Ti for its partners, but those partners appear to be finding different ways to get factory-overclocked versions to market anyway.

According to Videocardz.com, partners are not allowed to modify BIOSes to ship the GTX 1070 Ti with higher GPU clock speeds. Strangely, Nvidia itself officially claims that the GTX 1070 Ti was designed with plenty of headroom for overclocking. This has put Nvidia AIC partners in an awkward position, forcing them to find a way to overclock their custom versions without breaking their agreement with Nvidia.

For example, Asus is currently the only Nvidia partner that has actually listed higher GPU clocks on its site. While its ROG Strix GTX 1070 Ti works at the reference 1607MHz base and 1683MHz boost clocks, it takes a single click in the GPU Tweak app to switch it into OC Mode, which pushes the base clock up to 1683MHz and the boost clock up to 1759MHz.

Zotac does not have the same luxury as Asus, as its FireStorm utility does not support custom profiles. Instead, Zotac simply listed a "factory tested OC", saying that the GTX 1070 Ti AMP! Extreme can be pushed to +150MHz on the GPU and +200MHz on the memory.
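Since the card still ships at reference clocks, those offsets have to be applied manually by the user. As a rough back-of-the-envelope illustration (a sketch, not Zotac's FireStorm tool), adding the quoted +150MHz GPU offset to the reference clocks mentioned above works out as follows:

```python
# Hypothetical sketch: adding a manual overclock offset to the
# GTX 1070 Ti reference clocks quoted in this article.

REFERENCE_BASE_MHZ = 1607   # reference base clock
REFERENCE_BOOST_MHZ = 1683  # reference boost clock

def apply_offset(reference_mhz: int, offset_mhz: int) -> int:
    """Return the clock after adding a manual overclock offset."""
    return reference_mhz + offset_mhz

GPU_OFFSET_MHZ = 150  # the "+150MHz" GPU offset Zotac quotes

print(apply_offset(REFERENCE_BASE_MHZ, GPU_OFFSET_MHZ))   # -> 1757
print(apply_offset(REFERENCE_BOOST_MHZ, GPU_OFFSET_MHZ))  # -> 1833
```

The actual boost clock a card reaches will vary with GPU Boost, temperature, and power limits, so these figures are only the nominal targets after the offset.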

We already wrote about EVGA's custom GTX 1070 Ti graphics cards earlier; this partner only listed reference clocks, saying that there is room for overclocking but not by how much.

The GTX 1070 Ti should be available on retail/e-tail shelves on November 2nd, and we will then have a clearer idea of how far these custom versions can be pushed. For now, all graphics cards, no matter how beefy their VRMs are or what kind of cooler they use, ship with reference clocks.

This will probably take its toll on the number of custom versions from Nvidia AIC partners, but the GTX 1070 Ti is already too close to the GTX 1080, and that is something that Nvidia simply can't allow.

 

Last modified on 27 October 2017