HWBOT Community Forums
Christian Ney

HWBOT Y-Cruncher Pi-25m Benchmark Validation Regulations


There's no good reason why the screenshot must be obtained using the integrated screenshot functionality. A screenshot is a screenshot. I don't think my score should be disqualified just because the integrated screenshot functionality sometimes crashes my computer, which is why I always take a separate screenshot before using it. But whatever. I'm going to post my scores with whatever screenshot I have, and if that ain't good enough, so be it.


LOL

It ain't good enough, and there is a reason.

Datafile required.

Nice attitude and first post BTW.

Edited by Mr.Scott


Well...there are three things funny about that.

1. I posted that, and my name isn't mr.wulten. Nor was it my first post. BTW, I was using an older version that didn't have the "Make Datafile" option, and trying to submit scores using "Submit to HWBOT!" wasn't working. I eventually got that figured out. But anywho...

2. There's no need to obtain a screenshot of any kind (using whatever functionality) when the datafile already contains one. IF the datafile saves itself properly, that is. I keep getting datafiles that are missing the screenshot. So I try submitting them first to see if the screenshot is there, then resave until it is...if it isn't. Learned that lesson the hard way... :/

EDIT: OK...now I get it. You HAVE to use the integrated screenshot functionality FIRST, or the datafile won't have a screenshot. Which is stupid. It should take the screenshot automatically when you click "Make Datafile". "Take Screenshot" is worthless/pointless and only adds unnecessary complexity. Everything else I've said below (that isn't struck through) is still valid.

EDIT 2: Wait...you can point the app to any screenshot (via the "Screenshot Path"), obtained using whatever functionality, and have the datafile contain that screenshot when it's made. So...unstrike everything I said below...IT'S ALL VALID!!!

EDIT 3: Evidently the screenshot needs to be named using a specific numeric naming scheme: YearMonthDay-HourMinuteSecond. It might need to be a PNG image too (but I doubt it). Don't know...don't care (see EDIT 4). But you most certainly CAN use ANY PNG image with a suitable name and have it work just as well as one obtained with the integrated screenshot functionality.

EDIT 4: It does NOT need to be a PNG image. JPG will also work (as will many other formats, I presume).
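The timestamp naming scheme described above (YearMonthDay-HourMinuteSecond) can be produced for any existing image. The exact format y-cruncher expects is an assumption based on this post, so treat this as a sketch:

```python
# Copy an arbitrary screenshot to a YYYYMMDD-HHMMSS name in the same folder,
# keeping its extension. The naming scheme is as described in this post; the
# real requirements of y-cruncher's screenshot lookup are an assumption here.
import datetime
import pathlib
import shutil

def rename_to_timestamp(src, when=None):
    """Copy `src` next to itself under a timestamp name and return the new name."""
    when = when or datetime.datetime.now()
    src_path = pathlib.Path(src)
    dst = src_path.with_name(when.strftime("%Y%m%d-%H%M%S") + src_path.suffix)
    shutil.copyfile(src_path, dst)
    return dst.name
```

Pointing "Screenshot Path" at the resulting file should then behave the same as a screenshot taken with the built-in functionality, per the observations above.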

3. In actuality...it's just a PNG image. One PNG image is the same as another PNG image. There's nothing special about it. Thus I reiterate: if a screenshot were to be required, THERE IS ABSOLUTELY NO REASON WHY IT MUST BE OBTAINED USING THE INTEGRATED SCREENSHOT FUNCTIONALITY. If it's a full screenshot in PNG format, there's no way to tell whether it was obtained using the integrated screenshot functionality or some other way. REGARDLESS...that clause needs to be removed from the rules, as it DOES NOT APPLY when the datafile is required, since the datafile SHOULD already contain the mandatory screenshot. Just having a screenshot obtained using the integrated screenshot functionality (or any other full screenshot in PNG format) is not proof enough for validation of a score, and will in fact get you nowhere if that's all you have to submit. Nor can it be submitted under "verification" when submitting your score via the datafile, since that option isn't available when supplying the datafile (see note below). You can only add it as a "picture of your system".

Also of note: if you submit a datafile without a screenshot, it will say "Using supplied data file. No further verification is required", and there is no place to add a "verification" screenshot (like I said, it can only be added as a "picture of your system"). So that also needs fixing, if we're going to be fussy about requiring a verification screenshot: make sure the datafile contains the required screenshot before it can be verified. How can the site not tell whether there's a screenshot in the datafile or not? And does it really even matter? I like the idea of requiring the screenshot, but I shouldn't be able to submit a datafile that's missing one and have it be verified. Which I've done numerous times just to see if it would work, and every time it has. Don't make me prove it...
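The server-side check being asked for here could be as simple as scanning the datafile's raw bytes for an embedded image signature. The real datafile format is not documented in this thread, so the following is a generic illustration, not HWBOT's actual logic:

```python
# Scan a blob for the magic bytes of an embedded PNG or JPEG image. This is
# an illustrative sketch of the "does the datafile contain a screenshot?"
# check; the actual datafile layout is not known from this thread.
PNG_MAGIC = b"\x89PNG\r\n\x1a\n"   # standard 8-byte PNG file signature
JPEG_MAGIC = b"\xff\xd8\xff"       # JPEG SOI marker plus first segment byte

def contains_screenshot(blob):
    """Return True if `blob` appears to embed a PNG or JPEG image."""
    return PNG_MAGIC in blob or JPEG_MAGIC in blob
```

A rejection at upload time ("datafile contains no screenshot") would remove the loophole described above without any extra work for the moderators.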

Edited by MrGenius


To clear a few things up here from a technical standpoint.

tl;dr Version:

  • The validation file by itself (and thus the datafile) is sufficient to verify that the computation happened on the associated hardware.
  • A screenshot is only useful for capturing additional information such as the CPUz frequency, which itself isn't that useful anymore.

All the important information is captured in the validation file - thus a screenshot is not necessary from a technical standpoint. But if HWBOT wants to require a screenshot, that's up to them. It isn't really that useful for the purpose of validation. But it's still nice to have for illustration purposes.

Long Version:

The validation file captures the following information (among others):

  • Benchmark score
  • Hardware involved (CPU model, cores enabled, RAM, etc.)
  • Computational settings
  • Program version
  • The reference clock (TSC, HPET, etc.)

This information is protected with a hash so it cannot be tampered with without breaking the protection. This information is sufficient to verify that the computation did happen and that it wasn't faked or tampered with.

This validation file is then used to generate the datafile that is submitted to HWBOT. Embedded in the datafile is the original validation file itself. So moderators can examine it if they suspect anything.
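The hash protection described above can be sketched with an HMAC over the recorded fields. y-cruncher's real format and key handling are not disclosed here, so the field names, the key, and the HMAC construction below are all illustrative assumptions:

```python
# Illustrative sketch of a hash-protected validation record: any change to a
# recorded field invalidates the hash. Not y-cruncher's actual scheme; the
# key, field names, and JSON serialization are assumptions for this example.
import hashlib
import hmac
import json

SECRET = b"key-embedded-in-the-benchmark"  # hypothetical secret

def seal(record):
    """Attach an HMAC over the record's fields so tampering is detectable."""
    payload = json.dumps(record, sort_keys=True).encode()
    sealed = dict(record)
    sealed["hash"] = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return sealed

def verify(sealed):
    """Recompute the HMAC over everything except the hash and compare."""
    record = {k: v for k, v in sealed.items() if k != "hash"}
    payload = json.dumps(record, sort_keys=True).encode()
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(sealed.get("hash", ""), expected)
```

With this structure, editing the score (or any other field) after the run breaks verification, which is the property the post is describing.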

On the other hand, the validation file does not record the following:

  • True CPU frequency

For all practical purposes, it cannot do that. It requires a kernel-mode driver, so it's not practical for any but the most hard-core benchmark writers.

Thus the only thing that a screenshot can do that the validation file cannot is to capture CPUz output. But that's actually kind of useless:

  • CPU frequency fluctuates too much to be useful.
  • A screenshot is easily faked.

When you're taking the screenshot, the CPU is probably idling at maybe 1.2 GHz. Or perhaps you locked it to 5.0 GHz, but because of the AVX(512) offsets, the benchmark actually ran at 4.5 GHz instead of the 5.0 GHz that CPUz is showing. Or perhaps it's a power-throttled laptop whose frequency fluctuated between 2.2 and 3.2 GHz for the whole benchmark. IOW, CPU frequency is becoming useless, and the days of low-clock benchmarks are over, as they are neither practical nor enforceable.
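The offset point above can be put in one line of arithmetic: the clock you locked is not necessarily the clock the benchmark ran at. The offset values below are made up for the example; real AVX/AVX-512 offsets are board- and BIOS-specific:

```python
# The clock shown in CPUz vs. the clock the workload actually ran at.
# Offset values are hypothetical; real offsets depend on the board and BIOS.
def effective_ghz(locked_ghz, workload):
    offsets = {"scalar": 0.0, "avx2": 0.25, "avx512": 0.5}
    return locked_ghz - offsets[workload]
```

So a screenshot showing a 5.0 GHz lock says nothing about an AVX-512 run that was actually executing half a gigahertz lower.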

Screenshots are easily faked.

  • You can run the computation on one computer, transfer the validation file to another, and submit from there with a completely different CPUz window.
  • The console window output is also easily faked by writing a small program that gives the same output, and it would be accurate to the pixel. Thus screenshots of y-cruncher's console output are 100% useless.
  • Screenshots of the HWBOT submitter UI are also useless. Not only can they be faked, but the score and timer information are already captured in the validation file + datafile, so they aren't even needed.
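The console-output bullet above takes only a few lines to demonstrate: a handful of print statements reproduce a plausible result block, accurate to the character (and, in the same terminal font, to the pixel). The lines below are a made-up format, not y-cruncher's actual output:

```python
# Fabricate a console "result" that could be screenshotted. The format here
# is invented for illustration; the point is that any text output can be
# reproduced exactly by a trivial program.
def fake_result(digits, seconds):
    return "\n".join([
        f"Computation of Pi to {digits} digits:",
        f"Total Computation Time: {seconds:.3f} seconds",
        "Validation: Passed",
    ])

print(fake_result("25,000,000", 9.876))
```

This is why only the hash-protected validation file, not a picture of text, carries any evidentiary weight.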

Other Comments:

There's no point in requiring that the screenshot be taken using the built-in functionality. There is no difference at all. And it doesn't provide any additional security. It is there only for convenience.

 

Edited by Mysticial


I hear you, and you are completely right. If you think it through to the end, and that's what we devs usually have to do, a screenshot is for viewing pleasure only and should have nothing to do with validation.

Did you have a look at my latest project, BenchMate? It would provide your benchmark with the necessary hardware detection: frequency, voltage, temperatures, and so on. It also adds another protection layer to the benchmark process (and its child processes), specifically DLL injection for timer API hijacking. Last but not least, you would not have to bother with timer reliability issues: there are functions provided by a low-level driver that give access to HPET without enabling it for the whole OS (as the system clock source).
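Why timer APIs need protecting in the first place can be shown in a few lines. This is not how BenchMate's DLL injection works, just the underlying idea: if the benchmark's clock can be swapped for one that runs slow, the reported elapsed time shrinks and the score inflates.

```python
# Conceptual sketch: a benchmark that trusts whatever clock it is handed.
# Swapping in a clock that ticks at quarter speed makes the same real work
# appear roughly four times faster. BenchMate's actual protection (hooking
# Windows timer APIs) is not shown here; this only illustrates the threat.
import time

def run_benchmark(clock=time.perf_counter):
    start = clock()
    sum(i * i for i in range(500_000))  # stand-in workload
    return clock() - start

def slowed_clock(factor=0.25, _real=time.perf_counter):
    return _real() * factor  # a "hijacked" timer ticking at quarter speed

honest = run_benchmark()
cheated = run_benchmark(slowed_clock)
```

Guarding the timer (and verifying which clock source actually backed it) closes exactly this hole.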

What language is your wrapper written in? I am aiming to provide an SDK for benchmark developers to add BenchMate's features directly. It will still take some time before that's finished, but integration can also be added externally (by hijacking API functions).

Let me know if you are interested.


Hey! We haven't spoken in a while.

I'm aware of BenchMate. Nice work! But I've been way too busy to even look at it beyond reading the forums.

I'm glad someone has the time to attack this. Thank you!

 

Quote

What language is your wrapper written in? I am aiming to provide an SDK for benchmark developers to add BenchMate's features directly. It will still take some time before that's finished, but integration can also be added externally (by hijacking API functions).

Let me know if you are interested.

Java. But there's a possibility it will involve C++ as well depending on where it needs to interface with the benchmark.

Yeah, I'm interested. Let's start a new thread for this. I have a LOT of questions and potential ideas.

 

Edited by Mysticial

25 minutes ago, _mat_ said:

Let's do it! :)

Link me to the new thread once you've made it. And I'll start with some questions. :)

New 2020 scores: require CPU-Z 1.91 or newer version.
New 2020 scores: full desktop screenshot required (no clipping).

How exactly do we run this benchmark for 25M, 1B, etc.? I have a command-line app called y-cruncher.exe (v0.7.8.9503) and an app called HWBOT Submitter (v1.0.2.135). And then what?


Run the submitter file and the y-cruncher HWBOT submitter will open (if you have Java installed, of course). Press Run Benchmark, and a new Run Benchmark window will open. Select the benchmark version, press Run Benchmark in that window, and off you go!

ycruncher.jpg


This will work great for most people and is the best option 99% of the time. If you really want to tweak it, you can try the custom run options; full guides for what they do and how to configure them are here: http://www.numberworld.org/y-cruncher/ Also remember that you'll need some form of Java installed to run the submitter, which is required to get a datafile to submit.


If you just installed it, it is normal that the submitter list is empty. I made a GIF; I hope it's clearer now.

ycruncher.gif


On 2/5/2020 at 11:23 PM, viper said:

Thanks guys, and it sounds simple, but it's not. The submitter is empty, so there is no benchmark to run. I also can't find any way to load any lists through it.

The part that is empty is where it shows your scores; obviously, if you haven't run it yet, you won't have scores ;)

Once you've run a bench, if you left the submitter up during the run, you'll have to hit the refresh button for the score to populate.


Even if HPET is enabled, the program doesn't always detect it. In that case, it will show as blank - which means (unknown).

The next release (no ETA) will be able to detect more "variants" of HPET.

This is where the rules get a little tricky. Personally, I think anything that's not "TSC" should be fine. But the HWBOT rules require that it shows "HPET". Unfortunately, you will never get it to show "HPET" if it's one of the variants that y-cruncher does not currently recognize.
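The rule situation just described fits in two tiny checks. The clock-source strings ("TSC", "HPET", blank for unknown) follow this post; real y-cruncher builds may report other variants:

```python
# Two readings of the reference-clock rule, as described in this thread.
# A blank string means the program could not identify the clock source.
def passes_hwbot_rule(reference_clock):
    """HWBOT's written rule: the validation must literally show HPET."""
    return reference_clock.strip().upper() == "HPET"

def passes_relaxed_rule(reference_clock):
    """The relaxed reading suggested above: anything that is not TSC."""
    return reference_clock.strip().upper() != "TSC"
```

The gap between the two functions is exactly the unrecognized-HPET-variant case: it fails the literal rule even though the relaxed reading would accept it.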

