HWBOT Community Forums
Christian Ney

HWBOT GPUPI for CPU - 100M Benchmark Validation Regulations

Recommended Posts

21 hours ago, Leeghoofd said:

So what did you change from when it was working? That's the main question you need to ask yourself.

The OS remains the same; only the RAM and CPU tuning has changed.

And it was never 100% stable before, so I don't think we can ever be sure what is causing these problems.

It could even be the process priority (Realtime or High) that affects it. Who knows, except the people involved with this app?


I believe my problem is squarely Intel's fault. I'm using an EVGA X299 Dark with DDR4-4133. I have been able to run GPUPI 3.2 and 3.3 on the CPU (using Intel's OpenCL) with an i9-7900X and an i9-9820X. The only change has been upgrading to an i9-10940X, and now the program crashes before the test even starts. It has completed the bench using AMD's OpenCL driver. I have not tried this on Win-10.

Thank You :) 


What Intel OpenCL version are you using? I have tried 18.1 on an i9-10980XE and it's working fine.


I've tried 18.1 and just updated to the 2020 DL version. It could also be from where I have aggressively pushed my timings to 4000 CL 12-11-11-28/220 1T. I have also tried to make it do 12-11-10-28/220 from another sub. I may have damaged my Win-7 install :( from way too many BSODs :( . I have made many runs with 4000 12-11-28/220 1T on GPU benches; I have to credit this to Teamgroup DDR4-4133 and m||rk||r88 for helping me fine-tune the memory 🙂 I'm using A2 B-die memory modules, with a set of A0s reserved for my EVGA Z390 🙂

Thanks for the help :)

GPUPI Eror Message.png


The error occurs in Intel's threading library. If this also happens at stock settings on air, it's fairly safe to say this is a bug in Intel's OpenCL implementation. A bug report on the Intel forums would help get them to take a look at it.


Hey, is there going to be an update soon?

If not, maybe it's better to close the section entirely.

It's not fair that we can't save any newer scores; pushing for better results becomes a waste of time and effort.

15 hours ago, viper said:

Hey, is there going to be an update soon?

There is going to be a major BenchMate release first. All my time and effort is going into that.

GPUPI 4 will still arrive this year though.

15 hours ago, viper said:

It's not fair that we can't save any newer scores; pushing for better results becomes a waste of time and effort.

I have no clue what that means. Is this about 3.2 and 3.3?

Edited by _mat_

45 minutes ago, _mat_ said:

There is going to be a major BenchMate release first. All my time and effort is going into that.

GPUPI 4 will still arrive this year though.

I have no clue what that means. Is this about 3.2 and 3.3?

Well, there are many previous posts about this matter.

3.2 definitely has the problem of not saving the data file; I have tried again with a new cooling system, and it always breaks the saving process by force-closing the app.


Use BenchMate; it fixes all problems with result saving: https://benchmate.org

The concept of having result saving, encryption and uploading inside each benchmark is flawed, especially when the latest benchmark versions, like GPUPI 3.3, are not allowed.
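To illustrate why that concept is flawed, here is a minimal sketch (not GPUPI's or BenchMate's actual code; the function names and the key below are made up for illustration): when a benchmark both signs and verifies its own result files with a key embedded in its binary, anyone who extracts that key can forge a "valid" result.

```python
import hashlib
import hmac
import json

# Hypothetical in-benchmark signing scheme. The key necessarily ships
# inside the executable, so it is only as secret as the binary itself.
EMBEDDED_KEY = b"shipped-with-the-binary"  # assumption, not a real key

def sign_result(result: dict) -> dict:
    """Serialize the result and attach an HMAC-SHA256 signature."""
    payload = json.dumps(result, sort_keys=True).encode()
    sig = hmac.new(EMBEDDED_KEY, payload, hashlib.sha256).hexdigest()
    return {"result": result, "signature": sig}

def verify_result(blob: dict) -> bool:
    """Recompute the signature and compare it in constant time."""
    payload = json.dumps(blob["result"], sort_keys=True).encode()
    expected = hmac.new(EMBEDDED_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, blob["signature"])

# Anyone who extracts EMBEDDED_KEY can sign an impossible score,
# and the verifier will accept it, because both sides share the key.
forged = sign_result({"benchmark": "GPUPI 100M", "time_s": 0.001})
assert verify_result(forged)
```

This is the core of the argument: a per-benchmark scheme stands or falls with a single binary, whereas an external, maintained validation layer can be updated independently of every benchmark it covers.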


Why can't we just upload the full screenshot as we do for so many other benchmarks?

6 hours ago, viper said:

Why can't we just upload the full screenshot as we do for so many other benchmarks?

To quote myself from the mission statement on benchmate.org:


Benchmarking is in a horrible spot right now. Even on HWBOT, the world's biggest result database, most results have to be submitted as screenshots and system data is entered manually. Due to undetectable time measurement bugs and easily abusable vulnerabilities of benchmarks, results cannot be correctly judged by the inconclusive data available. Additionally, manual submission is error-prone, inconsistent, time-consuming and hard to learn while still being far from trustworthy. Although there are better submission methods and online rankings available, like 3DMark's Hall of Fame or the Geekbench Browser, they all set their own opaque standards for validation and are only available for their (latest) products.

 

