HWBOT Community Forums

boobteg4642 (Members, 40 posts)

Posts posted by boobteg4642

  1. 10 minutes ago, cbjaust said:

    Well, I'd hope so, otherwise what would be the point of competitive overclocking? Give it 1.35v and see.

    I can't choose 2500 MHz for my NB in my BIOS, only 2400 or 2600, so I opted for 2400 to be safe and it booted. It just doesn't seem like going from 2200 to 2400 should require a jump from 1.2 to 1.35 volts. But obviously I don't know that area well.

  2. 7 minutes ago, avalanche said:

    Well, you can get going now doing scores. You just need to search out scores in your range that you can hit.

    I bet if I could get my clocks up to your speed, I would get the same score. I jumped from 455 to 597 going from stock to 4400 MHz. I don't know how to make it POST past 4700 MHz though; voltage alone doesn't get it to boot. Any idea what voltage the NB should be at for 2500 MHz? Maybe 1.25?

  3. 1 minute ago, avalanche said:

    Started uploading scores here on the 4790K. They kicked ass and still do. 3DMark2001 SE was my favorite back then, with busted-ass $20 GPUs that were oven-baked beauties.

    Fun times :)

    I broke some records with two GTX 780s in SLI with the 4790K. Best DDR3 processor Intel made, in my opinion. That's the reason it still costs $350.

  4. Just now, cbjaust said:

    Memory can have a lot to do with stability. My cooling was just a Thermalright TRUE Black with one or maybe two fans push-pull. Is the FX-8300 Bulldozer? If so it will not clock quite as well as Vishera.

    No probs, ask away.

    Here's my RAM; it's stable. I ran Prime95 with it like that for a couple of hours. It didn't fail and never got above 30 degrees, although I know that reading is inaccurate on AMD.

    Cinebench.PNG

  5. Just now, avalanche said:

    All good man. Lots of experience in these forums. I had AMD up till the 1100T, then switched over to dirty Wintel.

    The 3570K and the onboard iGPU interested me for QuickSync action.

    I used to have the i7-4790K with a GTX 780; it blew this system out of the water in every respect: gaming, threading, there wasn't one thing the FX was better at.

  6. 13 minutes ago, avalanche said:

    For your GPU you can try the Heaven benchmark. I know with the 970 I can run fewer cores on the 8086K and it still goes alright in old 3D benches too.

    It does great in 3DMark Pro, with Time Spy and all; I just can't do Extreme. But other than that it at least holds the GPU clock at 1493. I seem to be getting low scores there as well though, compared to my past benchmark runs. The GPU is working great; it's the CPU that I think something is up with.

  7. 3 minutes ago, cbjaust said:

    Vishera was fine. My daily is an FX-8350 still. What cb do you get under the later Cinebenches?

  8. 1 minute ago, cbjaust said:

    The other scores you see with higher numbers are from the other versions of Cinebench: R15, R20, and the older Cinebench 2003. All of those versions give the score in "cb", not "pts". HWBOT is only concerned with the CPU score from Cinebench; the OpenGL score is a GPU test.

    Vishera was fine. My daily is an FX-8350 still.

    I got 455 cb in Cinebench R15.

    • I can't figure out what is going wrong, and I could sure use a little help. I am a pretty experienced overclocker and thought I knew everything about these chips, but...
    • This is my system; I built it for fun out of spare parts (if you need any more information by the end, please ask).
    • FX-8300 (brand new in the box the other day), Gigabyte AM3+ board, 8 GB DDR3 @ 1600 9-9-9-24 1T, GTX 970 graphics, SSD, reservoir, Swiftech DDC-style water pump, 240mm radiator mounted in front, and decent waterblocks for both the GPU and CPU.
    • Like everyone else, I am able to hit about 4600 MHz stable at 1.400 volts. My GPU is stable at 1493 MHz @ 1.212v and 195 watts max.
    • My problem is with Cinebench R11.5. At stock speeds I get about 41.04 in OpenGL. I can see in the OSD that my GPU clock stays down at its default of 1177 MHz; Cinebench doesn't stress the GPU enough to make it boost, only about 30% utilization. And under PerfCap Reason in GPU-Z during the OpenGL test it still shows utilization. Is this common?
    • The CPU comparison says mine is worse than Pentiums paired with something like a GTX 280 or 480, and the CPU test gives an even more disappointing 5.18 pts. Overclocked, I can pull off about 57 fps and get about 6 pts on the CPU test.
    • I have seen many posts saying an 8300 or 8350 got 80-90 fps with a GTX 580 or similar card, at lower clock rates. I saw one guy who usually gets between 9 and 10 on the CPU score with an 8300.
    • To recap: stock clocks - CPU 5.18 pts; 4600 MHz - CPU ~6 pts; Cinebench R15 - 455 cb.
    • Does this mean the FX processors were truly just that awful? I don't remember them being that bad, even though they are about nine years old now with zero support.

    -Bob

    Capture.PNG

  9. Update:

    I ran the AIDA64 stress test for some time and averaged about 33 degrees. I settled on 4219 MHz at 1.320 volts for the FX-8300. If I run it at 3500 MHz at 1.172 volts, I can get the temperature down to single digits. I am in a cold room.

    I have had a lot of experience with NVIDIA BIOS flashing in the past, and Maxwell wasn't much different. For the GTX 970 I modified the factory BIOS, adjusted the wattage limits, and raised the voltage by 0.12v. Through a lot of trial and error, I found that simply doing this automatically raises the boost clock to 1480 MHz. I add +13 MHz on top of that in the BIOS, so the card runs at 1493 MHz. 39 degrees max after 3DMark sessions.

    Anything higher than that in terms of clock speed and the card becomes unstable. I did a lot of cd C:/nvflash and nvflash -6 xxx.rom runs and saw what didn't work. It seems like if I raise the voltage any more, even if I flash it in, it doesn't raise the actual voltage. So wattage isn't the limit; I've raised the power cap as far as I can and still have plenty of headroom left over. If I could give it more voltage somehow, I could raise the clock a ton. Open to suggestions for overclocking.
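    For anyone trying the same thing, the rough sequence is something like this from an elevated command prompt (backup.rom is just a placeholder name here, and the exact flags can vary between nvflash versions, so check your version's help output first):

        cd C:\nvflash
        nvflash --save backup.rom     (save a copy of the stock BIOS before touching anything)
        nvflash --protectoff          (turn off the EEPROM write protection)
        nvflash -6 xxx.rom            (flash the edited ROM; -6 overrides the board ID mismatch check)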

    Gaming like WoT takes the card up to about 48 degrees on max graphics settings. If I drop the graphics down to medium, the card goes down to 40. The CPU is never above 37.

    5100.PNG

    fx83001.PNG    fx8300.PNG

    Capture.PNG

    8300.PNG

  10. I know I just posted this 23 hours ago, but I have since upgraded to an FX-8300 and got the waterblock installed on my card. I also upgraded from a 1600 RPM pump to a 3000 RPM PWM-controllable pump. Now, running 3DMark Time Spy at 1351 MHz and 1.2 volts, the card never got above 39 degrees, and the 8-core 8300 never went above 22 during the 3D test, then got up to 27 during the render test.

    With the clock speed at 1447 MHz on stock voltage, the card got up to 39 degrees in Time Spy, and the CPU got up to 27 in the render test but only 22 during the 3D test. Amazing what a difference water makes.

    I used something on my graphics card I have never used before: a graphite thermal pad. It seems to be working extremely well. I am pumping directly into the 240mm radiator, which goes to the GPU, then the CPU, then the reservoir, and back down to the pump. You can tell the difference when a water loop is plumbed well: when I fill it with water, I barely have to bleed it. Hardly anything sucks down from the reservoir when I first turn it on; the whole loop fills up as I fill it, all the way back to the reservoir. I only had to prime it once, and it's a tiny reservoir.

    Here are pictures below of the install. Can't wait to see what this card can really do now! Thanks for reading!

    IMG_20191018_114633.jpg

    IMG_20191018_142330.jpg

    IMG_20191018_142420.jpg

    IMG_20191018_143242.jpg

    IMG_20191018_152155.jpg

    IMG_20191018_165611.jpg

    All of the fans are on the same PWM fan controller. I like positive case pressure so the dust gets caught on the filter instead of inside my PC: three intake fans (two on the radiator) and one exhaust fan. I know I should be pulling the air in from the other side instead of blowing it through the front with the fans, but I don't have the screws yet. I will tomorrow!

    IMG_20191018_230622.jpg

    IMG_20191018_230455.jpg

    IMG_20191018_230510.jpg

  11. I have an issue. As a test, and as a programmer, I just wanted to see if this would work, and it did. I simply memory-injected XTU and uploaded the result; it gave me whatever score I wrote in with my tools, and the same goes for other popular benchmarks. That means a vast number of XTU submissions could ultimately be fake. Don't believe everything you read online about these scores. If there is a workaround, people are sure to take advantage of it. Given how many benchmarking applications there are, I can assure you they are exploitable, especially if you know a thing or two about the OS hiding behind the monitor and all the games you play. Bummed to know they create such costly benchmarking programs without any R&D in security.
