HWBOT Community Forums

Democratise HWBoints


Towards the end of the R5 development cycle, we want to introduce a tool that allows the community to control which benchmarks generate points and which no longer generate points. This would allow new, popular benchmarks to rise by community request.


Practically, we will move to a system with a fixed number of point-generating benchmarks (10, 15, 20 - not sure yet), and every 6 months a certain number of these benchmarks can be replaced by others. A couple of benchmarks will always count for points. The exact implementation is still under debate. Currently we're considering cycles of 6 months. Also still up for debate is whether to do this for global points only, or for both global and hardware points. Also under consideration is which leagues it would affect - probably just XOC and EL, maybe the Teams League too.


An example:


Cycle 1:


(fixed) = fixed benchmarks (bold in the original post)


- Spi 1M

- Spi 32M (fixed)

- Wprime 32M

- Wprime 1024M (fixed)

- CPU frequency (fixed)

- 3D01 (fixed)

- 3DM03

- Heaven DX11 (fixed)

- Vantage

- Memory Clock


Cycle 2:


Spi 1M, Wp 32M, 3DM03, Vantage and Memory Clock are up for replacement. After a vote, the 5 benchmarks the community most wants to earn points are:


- Borandi's SystemCompute v0.4

- 3Dmark06

- Spi 1M

- Wp 32M

- CatZilla
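The rotation in this example can be sketched in a few lines (purely illustrative - the function name, slot count, and the choice of fixed benchmarks are assumptions based on the example above, not an actual HWBOT implementation):

```python
# Hypothetical sketch of the proposed rotation. A fixed core always
# scores; the remaining slots are refilled from the community vote each
# cycle, and outgoing benchmarks may be re-elected.

FIXED = ["SuperPi 32M", "wPrime 1024M", "CPU Frequency",
         "3DMark01", "Heaven DX11"]          # always award points

def next_cycle(vote_results, total_slots=10):
    """Fill the non-fixed slots with the most-voted benchmarks."""
    open_slots = total_slots - len(FIXED)
    rotating = [b for b in vote_results if b not in FIXED][:open_slots]
    return FIXED + rotating

# Vote outcome from the Cycle 2 example: SuperPi 1M and wPrime 32M are
# re-elected, three newcomers take the remaining slots.
vote = ["SystemCompute v0.4", "3DMark06", "SuperPi 1M",
        "wPrime 32M", "CatZilla"]
cycle2 = next_cycle(vote)
print(cycle2)
```

Note that benchmarks voted out (3DM03, Vantage, Memory Clock in this example) simply don't appear in the new cycle's list; nothing else changes.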


Your thoughts?


Removing points from benchmarks which earned points in the past is a very bad idea, to be honest. We have a lot of hardware benchers who spent years benching every card and CPU. Removing the points from these results would cause a lot of trouble.


There is no advantage for the community if you add this tool/rule. At least I don't see one =)


At the moment we have a lot of benchmarks and everybody can choose what to bench. By limiting profiles to the best 20 results we already have the best solution, so nobody has to bench everything.


Why do you want to change the current system? I actually really like it and don't see a disadvantage.




There are some benchmarks which I'd call the "main" benchmarks. For me:


- 3DMark 2001

- 3DMark 2003

- 3DMark 2005

- 3DMark 2006

- Aquamark

- 3DMark Vantage


- SuperPi 1M

- SuperPi 32M

- PiFast

- wPrime 32M

- wPrime 1024M


These are by far the most popular benchmarks. If you keep them as fixed benchmarks it will be fine for me and I guess 99% of the community.

Edited by der8auer

For global points: sure, not the worst idea I've seen :D If HW points are included: NO way.


It's interesting, but I feel the current system is better. Perhaps you can add one or two that do not receive globals on a permanent basis, in addition to the ones that already give global points. Don't know if that's a better thought or not.


The main issue is that it is becoming very difficult to introduce new benchmarks without saturating the bot. We can't continue as we do now and end up with 100 benchmarks that earn points ... but we do need to be open to new benchmarks too. The rule that "everyone can choose what to bench" is relatively fine, but it does mean that everyone will just stick to what they like and (almost) never try something new. I like the idea that we can push new benchmarks that may seem boring in the beginning but become really interesting and competitive after a while.


The points wouldn't be "removed" as such, they'd just be inactive for 6 months. Visually, they would still be there and if after 6 months the benchmark is voted back in, the points will just become active again. If we do this only for global benchmarks, I think it wouldn't be that big of a problem? I do think the community would love to have more say in this kind of stuff :).
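The "inactive, not removed" idea amounts to a flag toggle rather than a delete. A minimal sketch, assuming a simple per-submission record (the field names are hypothetical, not the real HWBOT schema):

```python
# Hypothetical sketch: when a benchmark is voted out, its points are only
# flagged inactive for the cycle, never deleted, and come back if it is
# voted in again.

submissions = [
    {"bench": "3DMark03", "points": 42.5, "active": True},
    {"bench": "SuperPi 1M", "points": 17.0, "active": True},
]

def set_bench_active(subs, bench, active):
    """Toggle point visibility; the underlying score is untouched."""
    for s in subs:
        if s["bench"] == bench:
            s["active"] = active

def total_points(subs):
    return sum(s["points"] for s in subs if s["active"])

set_bench_active(submissions, "3DMark03", False)  # voted out for 6 months
assert total_points(submissions) == 17.0
set_bench_active(submissions, "3DMark03", True)   # voted back in
assert total_points(submissions) == 59.5          # nothing was lost
```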


We have talked about the issue that, when new benchmarks are added, there is just more and more work to do to earn all the points needed to get on top. I think there was no chance to make any real modification to this system in earlier revisions, so maybe it will work in the new one.


I do agree that new benchmarks should be added and would need to get some exposure. In the current system, every new benchmark just makes life more difficult, and it is annoying.


In the upcoming PRO OC league.. or whatever the name will be.. there is no need to vote on which benchmarks are used, because it will have seasons and HWBot can decide which benchmarks are used in season 1, 2 and so on. In the other leagues there is a need for selection, but I really don't know what the best way to do it is. A public vote might work, but usually only 2-5% of the whole community answers polls and the like, so it might be hard to make the right decisions based on them.


Life normally shouldn't become more difficult, as only the top 20 results count toward the personal profile. So if you don't like a benchmark, you can just leave it out.


If you take a look at 3DMark Vantage and 3DMark11, they reached good popularity without older benchmarks being disabled.


Pieter, one more thing: I know that a lot of people don't like that HWBot is so inconsistent. Most benchers bench maybe once a month and therefore need a lot of time to deal with changes to the bot.

Changing the benchmarks every 6 months is way too much. Most people have only just handled the change from rev4 to rev5 this year.


rev5 should be the last revision for now. I know that progress comes with changes, but you should still tune HWBot to be more consistent.


Let's not make any benchmarks "sacred". I know a lot of experienced benchers like 01 and 32M, for example, but if the masses want to throw those out of the loop with this system, so be it. :)


...or we could just use popularity to decide which benchmarks to remove: if we remove 5 benchmarks, we choose the ones with the fewest benchers submitting during the past year. I also suggest we start a cycle on "proper" dates. Any thoughts here? 1.1.13 would be excellent, but probably too soon, I guess. Or we can make the first round shorter than a year and end it on the next New Year's Eve. Not the best dates bench-wise, but I'm not sure how much that matters in reality. People can still sandbag and post on New Year's Eve if they like :P Or they can use the days after Christmas for benching, as I hope to do (if I can get LN2 I will!). :D
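The popularity-based alternative is easy to sketch: rank benchmarks by how many distinct benchers submitted to them over the past year and drop the bottom N. The counts below are made-up illustration data, not real HWBOT statistics:

```python
# Hypothetical submitter counts for the past year (illustration only).
submitters_last_year = {
    "SuperPi 32M": 1850, "3DMark01": 1200, "wPrime 1024M": 900,
    "Heaven DX11": 600, "3DMark Vantage": 550, "Memory Clock": 300,
    "PiFast": 250, "Aquamark": 200,
}

def least_popular(counts, n):
    """Return the n benchmarks with the fewest distinct submitters."""
    return sorted(counts, key=counts.get)[:n]

print(least_popular(submitters_last_year, 3))
# Aquamark, PiFast and Memory Clock have the fewest submitters here,
# so they would be the ones rotated out.
```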


I like this idea. I would only do this for global points, though! Keep hardware points for the hardware junkies out there.


Currently we have:


- 7 benchmarks for CPUs which earn global points

- 8 benchmarks for GPUs which earn global points


Why not keep this number fixed? When a new benchmark, e.g. 3DMark12, is ready, the community has to vote on which older benchmark to remove from global points. They could, for example, vote for 3DMark2005, which would then still earn hardware points but no globals anymore.


One year between these elections should be enough, as there are not that many benchmarks coming out all the time, and users would have enough time to deal with the change.


Also you should add to the vote whether the community even wants the new benchmark(s).


Keeping 7 x 2D and 8 x 3D at all times also helps keep the consistency I was talking about.


This way the community can decide what to bench and whether they even want the new benchmark. The best way imo.

Edited by der8auer

Sorry guys, but PCMark05 is out, no? :)


You need a sys bench, and 2004/Vantage/7 are not popular, nor will they ever be.


infinitex CPU and infinitex GPU with hardware pts :D


Yes, please don't touch HW points, I could care less for the globals I don't earn anyway :D




Instead of fixed / timed cycles, maybe the following:

* TOP-TIER benches: 10 benches, our usual favs like 32MB/01/2005/Vantage/Uni, all points awarded.

* PROPOSED benches: whatever we decide by vote / suggestions, with half-points perhaps.

* LEGACY: whatever was in top-tier and replaced by a Proposed bench, no globals.


I would not replace benchmarks just to replace them. Currently we have a good base of benchmarks which is working nice. Why touch a system which is running fine?


If there is a new one coming out, we can think about replacing an older benchmark with it. As long as there are no good alternatives, there is no need to change anything.


For example, once 3DMark12 is ready and tested, we can publish a poll and ask the community about the replacement. If you find a decent 2D benchmark, we can for example take away the global points from PiFast.



2D needs special treatment imo. We have single- and multi-threaded benchmarks, so a single-threaded benchmark should replace a single-threaded benchmark to keep the variety of benches.

Edited by der8auer


Quality over quantity


I really think that, if anything, there are already too many benchmarks. In the hardware leagues, though, more benchmarks make sense.


In the pro/OC league I think having the highest-quality results possible is important; sometimes a single result can take two weeks to do properly. Personally, I'd rather see fewer benchmarks but higher-quality results and competition.


Bruce, I think lots of people would be very sad if 01/32M ever disappeared, myself included.

