
Stelaras has strange results ...


K404


His 9800GT 3D06:

 

http://hwbot.org/submission/2253449_stelaras_3dmark06_geforce_9800_gt_25501_marks

 

2850MHz shader

 

 

 

His 9800GT 3D01:

 

http://hwbot.org/submission/2253446_stelaras_3dmark2001_se_geforce_9800_gt_125019_marks

 

2900MHz shader

 

 

 

Neither of these shader values is possible on G92. GPU-Z reports the actual shader clock, not the value you set, and the nearest valid steps are 2862 and 2916MHz.

 

Also: his memory at "1240"MHz. That value does not exist either; the controller sets 1242MHz. I'm not sure about 1220MHz either. The standard values are 1215 and 1224MHz, but I will double-check that.
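
To make the quantisation concrete, here's a quick Python sketch. The 54MHz shader step and 9MHz memory step are my own inference from the values above (2862, 2916, 1215, 1224 and 1242 all sit on those grids); it's nothing official from NVIDIA, so treat it as an illustration only:

```python
# Toy model of how the clock controller quantises a requested clock.
# ASSUMPTION (inferred from the values in this thread, not from any
# NVIDIA documentation): at these frequencies the shader domain moves
# in 54MHz steps and the memory domain in 9MHz steps, both multiples
# of the 27MHz reference.

SHADER_STEP = 54  # MHz: 2862 and 2916 are adjacent steps
MEMORY_STEP = 9   # MHz: 1215, 1224 and 1242 all fit; 1240 does not

def snap(requested_mhz, step):
    """Snap a requested clock to the nearest step the controller can
    actually set; GPU-Z reports this snapped value, not the request."""
    return round(requested_mhz / step) * step

for domain, req, step in [("shader", 2850, SHADER_STEP),
                          ("shader", 2900, SHADER_STEP),
                          ("memory", 1240, MEMORY_STEP),
                          ("memory", 1220, MEMORY_STEP)]:
    print(f"{domain}: requested {req}MHz -> controller sets {snap(req, step)}MHz")
```

If the controller really behaves like this, a screenshot showing exactly 2850, 2900, 1240 or 1220 shouldn't be possible.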

 

Even weirder, his 3D05:

 

http://hwbot.org/image/731882.jpg

 

Shader at 2970MHz and memory at 1230MHz. It doesn't make any consistent sense at all.

 

 

 

Here is a very good example from MTP that shows the normal/expected values:

 

http://hwbot.org/submission/2160810_mtp04_3dmark03_geforce_8800_gts_512_mb_68776_marks

 

 

Also, WTF? 1170MHz core on a card that coldbugs at -45°C? He would need some serious BIOS and driver tweaking to get around that.

 

 

I call complete BS.


And it's not possible to get these values by changing the clocks in NiBiTor? I clocked my 8800GT 1GB cards by flashing, and ended up with 734/1000/1885MHz as 01-stable clocks. 1885 is not a "valid" shader frequency either.

 

That said, his 9800GT has the original clocks, so I doubt BIOS frequency modding is what happened here.


I will try that tonight to confirm, but I don't think so. I think the clock controller overrides everything else.

 

His stock speeds are the other interesting thing. If he says he didn't use the eVGA ePower, his scores are even more suspect. A 555MHz core suggests an MSI "Green" 9800GT, which has no PCI-E power plug and a crappy 2-phase PWM.


Ok, I tried some BIOS flashing. I used 197.45 drivers and GPU-Z 0.5.8 to match Stelaras.

 

When I flashed random in-between values, the RivaTuner MHz adjustment screen showed exactly what I had flashed in the BIOS, but the RivaTuner graphs AND GPU-Z showed the normal "quantised" MHz.

 

I also tried again to get "1220" and "1230" MHz memory and could not do it.

 

 

Anyone else can try this as well, of course. I can film my experiment, no problem.
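
For anyone who wants to replicate the check quickly, here's a trivial grid test in Python. It uses the same assumed step sizes (54MHz shader, 9MHz memory) as the sketch earlier in the thread, so it's only as good as that assumption:

```python
# Check whether a reported clock lies on the controller's step grid.
# Step sizes are ASSUMED (54MHz shader, 9MHz memory), inferred from
# this thread rather than taken from any NVIDIA documentation.

def on_grid(reported_mhz, step):
    """True if the reported clock is a multiple of the step size."""
    return reported_mhz % step == 0

for domain, mhz, step in [("shader", 2850, 54), ("shader", 2900, 54),
                          ("shader", 2970, 54), ("memory", 1240, 9),
                          ("memory", 1230, 9), ("memory", 1220, 9)]:
    verdict = "settable" if on_grid(mhz, step) else "NOT SETTABLE"
    print(f"{domain} {mhz}MHz: {verdict}")
```

Interestingly, 2970 does land on the assumed 54MHz shader grid, so in his 3D05 shot it would be the 1230MHz memory reading that's the impossible one.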

 

 

 

 

I don't know how the .3dr file can be "alright". Can someone ask Futuremark to look into it?


I agree that this definitely does not look good. Neither drivers nor a BIOS nor any tool can change the way that clocks are set on the G80/G92/G200 cards.

 

The only question that remains is whether there is a tool that sets clocks in such a way that it "fools" GPU-Z as well. I think there's a slight chance that Afterburner could do this on the G200 cards, but I'll have to test to confirm.

 

I think it's still a bit early to say definitively that he's cheating, but we (HWBOT staff) should probably have a chat with him...


I think it would be a good idea to adjust the thread title. "Blablah is cheating" isn't something a member should claim as long as there are only hints that something unusual happened, be it a cheat, a bug, or a tweak. Things like that can cause reputational damage and aren't necessary at all. If HWBOT officially decides there was cheating going on, there's plenty of time to call somebody a cheater afterwards. It's easy enough to choose a neutral thread title. Just a small remark; I'm not taking sides or anything.


  • 2 weeks later...

I don't have LN2 or DICE here to test such high clocks on the GPU, but at least I can confirm what you posted below:

 

Also: his memory at "1240"MHz. That value does not exist either; the controller sets 1242MHz. I'm not sure about 1220MHz either. The standard values are 1215 and 1224MHz, but I will double-check that.

 

I'm also getting 1242MHz, but 1240 is impossible for me. Tested on two 8800GTs.

 

The question is: is there really no way to get this clock? Maybe we should just ask him.


  • 5 weeks later...
  • 3 months later...

Try absolutely anything you want. If those MHz can be recreated, it has to be by something random or unexpected. As far as I remember, these are the ONLY G92 results I've ever seen showing incorrect MHz.

 

IMO, these results have been weird from start to finish. He said he'd found a CB fix that he couldn't share, he had pics of the rig that he couldn't share or had lost, and he's using the least OC-friendly 9800GT to start with (an MSI Green). eVGA ePower or not, I can't see why ANYONE would look at that particular card and think "hardware gold". It can't even be pre-tested to show its potential because it doesn't have an extra power plug.


I forgot to report this here, but I had the same thing as Stelaras when I benched my 8800GTS G92 using the ePower board. About 2 out of 10 times the clocks were different. I have no idea how, but it was also not a GPU-Z bug, because I rechecked with RivaTuner.


