You can prove that it's not exponential as well. It's the first thing that you typed!
All of us have a ton of real world data on how it's not exponential, thanks to OC
If I increase vCore by 60% without changing MHz... what % does the power consumption increase by?
If I increase vCore by 60%, without changing MHz... AND hold the temperature steady...what % does the power consumption increase by?
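For the record, here's what the textbook CMOS dynamic-power model (P ≈ C·V²·f) predicts for that question. This is a minimal sketch assuming that model, with capacitance and frequency held constant and leakage ignored; the function name is just for illustration.

```python
# Sketch, assuming the standard CMOS dynamic-power model P ≈ C·V²·f
# (capacitance C and clock f held constant; leakage/temperature ignored).

def power_increase(voltage_factor: float, freq_factor: float = 1.0) -> float:
    """Fractional power increase for given voltage and frequency scaling."""
    return voltage_factor ** 2 * freq_factor - 1.0

# +60% vCore at the same MHz:
print(f"{power_increase(1.60):.0%}")  # quadratic model -> +156%
```

So under that model, +60% on vCore alone is roughly +156% power: quadratic, not exponential, but a long way from linear.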
We can argue this either way.
EDIT...
I guess our typical voltage testing range is too small to show the exponent. Take a small enough interval and everything is a straight line......
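That "small enough interval" point checks out numerically. A quick sketch, using a hypothetical 1.10–1.30 V test window, comparing V² against the straight line through the window's endpoints:

```python
# Sketch: over a typical vCore test window, V squared really does look straight.
# Compare v*v against the chord (linear interpolation) between the endpoints.

lo, hi = 1.10, 1.30          # hypothetical test range, volts
steps = 20
max_rel_err = 0.0
for i in range(steps + 1):
    v = lo + (hi - lo) * i / steps
    quad = v * v
    lin = lo * lo + (hi * hi - lo * lo) * (v - lo) / (hi - lo)  # chord
    max_rel_err = max(max_rel_err, abs(lin - quad) / quad)

print(f"max deviation from a straight line: {max_rel_err:.2%}")
```

The worst-case deviation over that window is under 1%, which is well inside normal measurement noise, so the curve is easy to mistake for a line.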
But P=IV is linear.
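P=IV only reads as linear if I is held fixed, though; for a roughly resistive load, I itself rises with V, which brings the square back. A minimal sketch assuming a simple resistive load with a hypothetical effective resistance R:

```python
# Sketch, assuming a purely resistive load: P = I*V with I = V/R,
# so P = V*V/R even though "P = I*V" reads as linear in V.

R = 0.05  # hypothetical effective load resistance, ohms

def power(v: float) -> float:
    i = v / R        # current rises with voltage
    return i * v     # P = I*V = V^2/R

base = power(1.0)
print(f"{power(1.6) / base:.2f}x")  # +60% voltage -> roughly 2.56x power
```

Real silicon isn't a plain resistor, but the point stands either way: the current term hides another factor of V.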
The one thing missing from your test data is the result of increased MHz and increased voltage. THAT will get the power consumption up
Using 1.175v // 274fps as a baseline, a 17% increase in voltage takes ~6.2% off the score. The other data points show less.
My theory of shenanigans is, so far, inaccurate
Power consumption shouldn't increase exponentially with voltage.
Does % performance loss match (well.... mirror) % voltage increase?
If so, I call shenanigans.
(really nice find, though! .....and thank you for posting it up!)
The difference with those is that they are on different PCBs with different RAM, but what I said was mostly joking.
Buy the same card again for another 2-point category, because nV couldn't be bothered putting any effort in. I'll wait until they're £10 on eBay. I know, no-one forces me to buy this stuff
http://hwbot.org/submission/2350187_k404_3dmark_vantage___performance_geforce_gt_630_gddr5_8442_marks
Thank you. I thought the points status was kept unless deliberately changed by a mod
I think I had one of my scores reported and it was checked by a mod. Now, that score gets points, despite me having them disabled.
I know, on an importance scale of 1-10, that this is -3, but, well, it's technically a bug