K404 Posted February 13, 2012

His 9800GT 3D06: http://hwbot.org/submission/2253449_stelaras_3dmark06_geforce_9800_gt_25501_marks - 2850MHz shader

His 9800GT 3D01: http://hwbot.org/submission/2253446_stelaras_3dmark2001_se_geforce_9800_gt_125019_marks - 2900MHz shader

Neither of these shader values is possible on G92. GPU-Z reports the actual shader value, not the value that was set, and the steps in that range are 2862 and 2916MHz.

Also: his memory. "1240"MHz does not exist either; the controller sets 1242MHz. I'm not sure about 1220MHz either. The standard values are 1215 and 1224MHz, but I will double-check that.

Even weirder, his 3D05: http://hwbot.org/image/731882.jpg - shader at 2970 and memory at 1230MHz. It's not making any consistent sense at all.

Here is a very good example from MTP that shows the normal/expected values: http://hwbot.org/submission/2160810_mtp04_3dmark03_geforce_8800_gts_512_mb_68776_marks

Also: WTF?? 1170MHz core on a card that coldbugs at -45°C? He would need some serious BIOS and driver tweaking to get around that. I call complete BS.
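To make the quantisation argument above easier to follow, here is a rough sketch (mine, not anything from GPU-Z or the NVIDIA driver) that checks reported clocks against the step values quoted in this thread. The step lists are only the neighbouring values mentioned in the posts, not a full model of the G92 clock generator, so treat it purely as an illustration:

```python
# Minimal sketch: flag reported clocks that don't land on the quantised steps
# discussed in this thread. The step lists below are only the neighbouring
# values quoted in the posts (2862/2916MHz shader, 1215/1224/1242MHz memory);
# they are NOT an exhaustive model of what the G92 clock controller can set.

KNOWN_SHADER_STEPS_MHZ = [2862, 2916]
KNOWN_MEMORY_STEPS_MHZ = [1215, 1224, 1242]

def matches_known_step(reported_mhz, known_steps, tolerance_mhz=1.0):
    """True if the reported clock sits within tolerance of a known valid step."""
    return any(abs(reported_mhz - step) <= tolerance_mhz for step in known_steps)

reported = [
    ("3DMark06 shader", 2850, KNOWN_SHADER_STEPS_MHZ),
    ("3DMark01 shader", 2900, KNOWN_SHADER_STEPS_MHZ),
    ("memory",          1240, KNOWN_MEMORY_STEPS_MHZ),
    ("memory",          1230, KNOWN_MEMORY_STEPS_MHZ),
]

for label, mhz, steps in reported:
    verdict = "matches a known step" if matches_known_step(mhz, steps) else "not a known step"
    print(f"{label}: {mhz}MHz -> {verdict}")
```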
knopflerbruce Posted February 13, 2012

And it's not possible to change the values by changing the clocks in NiBiTor? I clocked my 8800GT 1GB cards by flashing and ended up with 734/1000/1885MHz as 01-stable clocks, and 1885 is not a "valid" shader frequency either. That said, his 9800GT is at its original clocks, so I doubt BIOS frequency modding is what happened here.
K404 Posted February 13, 2012

I will try that tonight to confirm, but I don't think so. I think the clock controller overrides everything else.

His stock speeds are the other interesting thing. If he says he didn't use the eVGA ePower, his scores are even more suspect. A 555MHz core suggests the MSI "Green" 9800GT, which has no PCI-E power plug and a crappy 2-phase PWM.
K404 Posted February 13, 2012

OK, I tried some BIOS flashing. I used the 197.45 drivers and GPU-Z 0.5.8 to match Stelaras. If I flash random in-between values, the RivaTuner MHz adjustment screen shows what I flashed in the BIOS, but the RivaTuner graphs AND GPU-Z show the normal "quantised" MHz. I also tried again to get "1220" and "1230"MHz memory and could not do it. Anyone else can try this as well, of course; I can film my experiment, no problem.

I don't know how the .3dr file is "alright". Can someone ask Futuremark to look into it?
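The behaviour described in that experiment (flash an in-between value, the hardware reports the nearest real step) is easy to model. The sketch below assumes a fixed 54MHz shader step near 2.9GHz, which is only inferred from the 2862/2916MHz values earlier in the thread, so it illustrates the snapping idea rather than the actual PLL:

```python
# Illustration only: snap a requested shader clock to the nearest step,
# assuming a fixed 54MHz spacing near 2.9GHz (inferred from the 2862/2916MHz
# values in this thread; the real G92 clock generator may behave differently).

STEP_MHZ = 54
REFERENCE_STEP_MHZ = 2862  # a known-valid step used to anchor the grid

def snap_to_step(requested_mhz):
    """Return the nearest clock on the assumed 54MHz grid."""
    steps_from_ref = round((requested_mhz - REFERENCE_STEP_MHZ) / STEP_MHZ)
    return REFERENCE_STEP_MHZ + steps_from_ref * STEP_MHZ

for requested in (2850, 2880, 2900):
    print(f"flashed {requested}MHz -> hardware reports {snap_to_step(requested)}MHz")
```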
K404 Posted February 15, 2012

I am very aware that the bug has given HWBot some priority work to do. Of course that comes first.

http://hwbot.org/submission/2253452_stelaras_unigine_heaven___basic_preset_%28dx9%29_geforce_9800_gt_1887.36_dx9_marks

If the .hwbot file contains as much info as people say, it should hopefully shed more light on how valid his results are.
Gautam Posted February 17, 2012

I agree that this definitely does not look good. Neither drivers nor a BIOS nor any tool can change the way that clocks are set on the G80/G92/G200 cards. The only question that remains is whether there is a tool that sets clocks in such a way that it "fools" GPU-Z as well. I think there's a slight chance that Afterburner could do this on the G200 cards, but I'll have to test to confirm. I think it's still a bit early to say definitively that he's cheating, but we (HWBot staff) should probably have a chat with him...
Hyperhorn Posted February 17, 2012

I think it would be a good idea to adjust the thread title. "blablah is cheating" isn't something a member should claim as long as there are only hints that something unusual happened, be it a cheat, a bug or a tweak. Things like that can result in reputational damage and aren't necessary at all. In case HWBot officially decides there's some cheating going on, there's plenty of time to call somebody a cheater afterwards. It's so easy to choose a neutral thread title. Just a small remark; I'm not taking sides or anything.
K404 Posted February 17, 2012

Hyperhorn... you're right. I tried to change the thread title just after I posted, but I couldn't/can't.

EDIT: Thanks
knopflerbruce Posted February 27, 2012

Has anyone else done some more testing?
K404 Posted March 2, 2012

I can -kind of- recreate variations of "weird" MHz with G96 silicon, but NOT with G92.
der8auer Posted March 2, 2012

I don't have LN2 or dry ice here to test such high clocks on the GPU, but at least I can confirm what you posted:

"Also: His memory. "1240"MHz. This does not exist either. The controller sets 1242MHz. I'm not sure about 1220MHz either. The standard values are 1215MHz and 1224, but I will double check that."

I'm also getting 1242MHz, but 1240 is impossible for me. Tested on two 8800GTs. The question is: is there really no way to get this clock? Maybe we should just ask him.
knopflerbruce Posted March 3, 2012

Yeah, if it happens in several screenshots, then it's something he does every time that causes it.
K404 Posted July 23, 2012

Was it decided that these results are "fine"?
knopflerbruce Posted July 25, 2012

I'll give this a shot myself, although I'm a noob at 3D, so... not sure if my efforts will really help. What do you think I should try in order to recreate these weird frequency numbers? Did you check his other subs to see if there are similar results out there?
K404 Posted July 25, 2012

Try absolutely anything you want. It has to be something random or unexpected if the MHz can be recreated. As far as I remember, these are the ONLY G92 results I've ever seen showing incorrect MHz.

IMO, these results have been weird from start to finish. He said he'd found a coldbug fix that he couldn't share, he had pics of the rig that he couldn't share or had lost, and he's using the least OC-friendly 9800GT to start with (an MSI Green). eVGA ePower or not, I can't see why ANYONE would look at that particular card and think "hardware gold." It can't even be pretested to show its potential because it doesn't have an extra power plug.
der8auer Posted July 25, 2012

I forgot to report this here, but I had the same thing as Stelaras when I benched my 8800GTS G92 using the ePower board. About 2 out of 10 times the clocks were different. I have no idea how, but it was also not a GPU-Z bug, because I rechecked with RivaTuner.
K404 Posted July 25, 2012

That's REALLY weird, but I appreciate you posting, Roman. Did you get any screengrabs of the weirdness?