HWBOT Community Forums

GT220 Shader clock: 3.4GHz


Massman


Trying to get people motivated to test and report!

 

Might be good to read the first 'interesting bits' page as well for background information.

 

Interesting bits: shader clock world record?!

 

Fairly disappointed by the memory failure below -50°C, I decided to give the card a whirl anyway and looked for the maximum stable core frequency, knowing the performance would be poor. What's important about the picture below is the frequencies.

 

[Image attachment: attachment.php?attachmentid=580&stc=1&d=1257890131]

 

The core frequency increased almost 1-to-1 with the temperature drop. As indicated on the previous page, this card wasn't stable at 1300MHz because of the memory failure below -100°C. I'm sure that if we could take this card to -160°C or lower, it could run a core frequency of 1.4GHz or more! Not that it would matter much, though, as the card is heavily bottlenecked by the memory frequency.
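Just to illustrate where the 1.4GHz+ estimate comes from: with roughly 1 MHz of extra core clock per degree of extra cooling, a simple linear extrapolation gets you there. This is only a sketch; the anchor point (1300MHz at -60°C) and the exact MHz/°C slope are illustrative assumptions, not measured data from the article.

```python
# Minimal sketch of the linear scaling argument: core clock gains
# roughly 1 MHz per extra degree of cooling (illustrative slope).
def extrapolate_core_mhz(anchor_temp_c, anchor_mhz, target_temp_c,
                         mhz_per_degc=1.0):
    """Estimate the stable core clock at target_temp_c, assuming a
    linear ~1:1 MHz/°C relationship between clock and temperature."""
    extra_cooling = anchor_temp_c - target_temp_c  # degrees colder
    return anchor_mhz + extra_cooling * mhz_per_degc

# Hypothetical anchor: 1300 MHz stable at -60 °C; extrapolate to -160 °C.
estimate = extrapolate_core_mhz(-60, 1300, -160)
print(f"Estimated core clock at -160 °C: {estimate:.0f} MHz")  # ~1400 MHz
```

Of course, this only holds while nothing else (here: the memory coldbug) fails first, which is exactly the limitation described above.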

 

Another very important thing to note is the shader frequency of 3.4GHz, the highest I've ever seen on any Nvidia graphics card, so until proven otherwise I guess this is a world record. I know this frequency seems quite unbelievable, and I'm sure some critics will call this a bugged run. However, here are the arguments that defend the legitimacy of this overclock:

 

1) Three different applications report the working frequency as 3.4GHz.

2) The frequency goes up to 3.4GHz under 3D load and drops back to the 2D idle clocks when sitting at the desktop.

3) This type of graphics card hasn't been tested this thoroughly in the past; Nvidia's 40nm process is unknown terrain for most of us, so the lack of precedents is normal.

4) There's definitely an end to this overclock: 3.45GHz was no longer 3D stable, even at -90°C and 1.6V.

5) The performance scaling when increasing the shader frequency is in line with expectations based on the scaling graphs on the third page of this article, as shown in the chart below.

 

[Image attachment: attachment.php?attachmentid=581&stc=1&d=1257890182]

 

6) Last but not least, we have to return to the topic of temperature-related memory failure. I assume the cold causes the memory to underclock itself to 2D-mode clock frequencies, because the card still works (it renders a 3D image) but the performance drops significantly. Given that previous observations show shader overclockability is linked to the memory frequency, it's very possible that the 3.4GHz shader frequency is also a consequence of the memory not performing. In other words, the high shader frequency is partially caused by the memory coldbug.

 

(Source: http://www.madshrimps.be/?action=getarticle&articID=968)
