HWBOT Community Forums

Everything posted by boobteg4642

  1. This guy got a better score with his NB at 2200 and core at 4400 than I did at NB 2500 and CPU 4600. There is something going on that I don't know about, or some setting I don't know to change. He even has the exact same processor, with slower RAM. https://hwbot.org/submission/3944223_oconsauce_cinebench___r15_fx_8300_713_cb/
  2. I can't choose 2500MHz for my NB in my BIOS, only 2400 or 2600, so I opted for 2400 to be safe, and it booted. It just doesn't seem like going from 2200 to 2400 should require a jump from 1.2 to 1.35 volts. But obviously I don't know that well.
  3. K, gonna try 1.35 now. I don't know what the difference is between VID and vcore in HWiNFO64. I can change the VID with AMD OverDrive but not in my BIOS; in my BIOS I can change vcore but not VID.
  4. I bet if I could get my clocks up to your speed, I would get the same score. I jumped from 455 to 597 going from stock to 4400MHz. I don't know how to make it POST past 4700MHz though; voltage alone doesn't get it to boot. Any idea what voltage the NB should be at 2500MHz? Maybe 1.25?
  5. Thank you, yeah, you are right, I am not fully aware of the rules. I will research them.
  6. I have two accounts here; they might be under boobteg4444 also. Mind you, this was a long time ago, and they are no longer records.
  7. Hmm. I just overclocked to 4400MHz and still only got a 592 cb in R15. Definitely better than the 455 earlier at stock clocks, but still. Maybe the 8350 was just that much better than the 8300.
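For what it's worth, that 592 cb is roughly where simple clock scaling predicts it should land. This is illustrative arithmetic only, assuming the FX-8300's 3.3 GHz base clock and that Cinebench R15 scales linearly with core clock (in practice it scales slightly below linear, since memory and NB speed stay put):

```python
# Sanity-check the R15 jump (455 cb stock -> 592 cb at 4400 MHz)
# against naive linear clock scaling. Assumes a 3.3 GHz base clock.

def expected_score(base_score, base_mhz, target_mhz):
    """Score predicted if the benchmark scaled linearly with core clock."""
    return base_score * target_mhz / base_mhz

predicted = expected_score(455, 3300, 4400)
print(f"predicted at 4.4 GHz: {predicted:.0f} cb")          # ~607 cb
print(f"observed 592 cb is {592 / predicted:.0%} of linear")
```

At about 98% of the linear prediction, the 4400 MHz result looks healthy; the gap to the linked 713 cb submission would have to come from NB/RAM tuning or background load rather than the chip itself.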
  8. I broke some records with two GTX 780s in SLI with the 4790K. Best DDR3 processor Intel made, in my opinion. That's the reason it still costs $350.
  9. Do you think I will notice a difference? If so, what voltage should I set the NB to at 2500MHz? Here's what I have now.
  10. I am going to restart, lock my clocks, turn off power management, and see what my numbers are now overclocked. Be right back.
  11. Here's my RAM; it's stable. I ran Prime95 with it like that for a couple of hours. It didn't fail and never got above 30 degrees, although I know that reading is inaccurate on AMD.
  12. I used to have the i7-4790K with a GTX 780; it blew this system out of the water in every aspect, gaming, threading, everything. There is not one thing the FX was better at.
  13. By the way, I want to thank you both for taking the time to even respond to me. I know this is an old processor, and I didn't expect much support here. So thanks.
  14. I can't pull 4888MHz, at least not on multiplier and core voltage alone. My cooling is fine; any tips on how you stabilized above 4800MHz?
  15. It does great in 3DMark Pro, with Time Spy and all. I just can't do Extreme. Other than that, it at least holds the GPU clock at 1493MHz. I seem to be getting low scores there as well, though, compared to my benchmarks in the past. The GPU is working great; it's the CPU that I think something is up with.
  16. I can't figure out what is going wrong, and I sure could appreciate a little help. I am a pretty experienced overclocker, and thought I knew everything about these chips, but...

      This is my system; I built it for fun out of spare parts (if you need any more information by the end, please ask): FX-8300 (was brand new in the box the other day), Gigabyte AM3+ board, 8 GB DDR3 @ 1600 9-9-9-24, GTX 970 graphics, SSD, reservoir, Swiftech Xtreme DDC-style water pump, 240mm radiator mounted in front, and decent waterblocks for both the GPU and CPU. Like everyone else, I am able to hit about 4600MHz stable at 1.400 volts. My GPU is stable at 1493MHz @ 1.212V and 195 watts max.

      My problem is with Cinebench R11.5. At stock speeds, in OpenGL, I get about a 41.04. I can see in the OSD that my GPU clock stays down at the default 1177MHz. Cinebench doesn't stress the GPU enough to make it boost up or something; it's only at about 30% utilization. And under PerfCap Reason in GPU-Z during OpenGL, it still says utilization. Is this common? I see the CPU comparison says mine is worse than Pentiums and the like paired with a GTX 280 or 480. And the CPU test reveals an even more disappointing 5.18 pts. Overclocked, I can pull off about 57 fps and get about a 6 CPU score. I know I have seen many posts saying their 8300 or 8350 got 80-90 fps with a GTX 580 or similar card, at lower clock rates. I saw one guy who usually gets between 9 and 10 CPU score with an 8300.

      To recap the scores: stock clocks - CPU 5.18 pts; 4600MHz - CPU 6; Cinebench R15 gets a 455 cb. Does this mean the FX processors were truly just that awful? I don't seem to remember them being, even though they are like 9 years old now with zero support. -Bob
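One way to frame the R11.5 numbers above: check them against naive linear clock scaling. This is rough arithmetic only, assuming the FX-8300's 3.3 GHz base clock; if the overclocked score lands well under the linear prediction, that points at throttling, power management, or a scheduling problem rather than the chip simply being that bad:

```python
# Compare the quoted R11.5 CPU results (5.18 pts stock, ~6 pts at 4.6 GHz)
# to a linear-scaling prediction from a 3.3 GHz base clock (assumption).

base_score, base_mhz = 5.18, 3300   # stock result from the post
oc_score, oc_mhz = 6.0, 4600        # overclocked result from the post

linear_prediction = base_score * oc_mhz / base_mhz
shortfall = 1 - oc_score / linear_prediction

print(f"linear prediction at 4.6 GHz: {linear_prediction:.2f} pts")
print(f"observed 6.0 pts is {shortfall:.0%} below linear scaling")
```

A roughly 17% shortfall versus linear is large for a CPU-bound render test, which is consistent with the later replies about locking clocks and disabling power management before benchmarking.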
  17. Update: I ran the AIDA64 stress test for some time and averaged about 33 degrees. I settled on 4219MHz at 1.320 volts for the FX-8300. If I run it at 3500MHz at 1.172 volts, I can get the temperature down to single digits. I am in a cold room.

      I have had a lot of experience in the past with NVIDIA BIOS flashing, and Maxwell wasn't much different. For the GTX 970, I hacked up the factory BIOS, adjusted the wattage limits, and raised the voltage by 0.12V. Through a lot of trial and error, I found that simply doing this automatically raises the boost clock to 1480MHz. I add +13 on top of that in the BIOS, so the card runs at 1493MHz: 39 degrees max after 3DMark sessions. Anything higher than that in clock speed and the card becomes unstable. I did a lot of cd C:/nvflash and nvflash -6 xxx.rom and saw what didn't work. It seems like if I raise the voltage any more, even if I flash it in, it doesn't raise the actual voltage. In terms of wattage, I'm using everything I can and have plenty of headroom left over. If I could give it more voltage somehow, I could raise the clock a ton. Open to suggestions for overclocking.

      Gaming like WoT takes the card up to about 48 degrees on max graphics settings. If I drop the graphics down to medium, the card goes down to 40. The CPU is never above 37.
  18. I know I just posted this 23 hours ago, but I have since upgraded to an FX-8300 and got the waterblock installed on my card. I also upgraded my pump from a 1600 RPM pump to a 3000 RPM PWM-controllable one.

      Now, running 3DMark Time Spy at 1351MHz and 1.2 volts, the card never got above 39 degrees, and the 8-core 8300 was never more than 22 during the 3D test, then got up to 27 during the render test. With the clock speed at 1447MHz on stock voltage, it got up to 39 degrees in Time Spy, and the CPU got up to 27 in the render test but only 22 during the 3D test. Amazing the difference water makes.

      I used something on my graphics card I have never used before: a graphite thermal pad. It seems to be working extremely well. I am pumping directly into the 240mm radiator, which goes to the GPU, then the CPU, then the reservoir, and back down to the pump. It makes a difference when your water loop is really functional. When I fill it with water, I don't really have to bleed it at all. Hardly any water sucks down from the reservoir when I first turn it on; it just fills up the whole loop when I fill it, all the way back to the reservoir. I only had to prime it once. And it's a tiny reservoir.

      Here are pictures of the install below. Can't wait to see what this card can really do now! Thanks for reading!

      All of the fans are on the same fan controller, by PWM. I like positive case pressure so the dust gets stuck on the filter instead of inside my PC: 3 intake fans (2 on the radiator) and one exhaust fan. I know I should be pulling the air in from the other side instead of blowing it through the front with the fans, but I don't have the screws yet. I will tomorrow!
  19. I have an issue. As a test, and as a programmer, I just wanted to see if this would work. And it did: I simply memory-injected XTU and uploaded the result. It gave me whatever score I wrote in with my tools, and the same goes for other popular benchmarks. That means a vast number of XTU submissions could ultimately be fake. Don't believe everything you read online about these scores. If there is a workaround, people are sure to take advantage of it. Since there are so many benchmarking applications, I can assure you they are exploitable, especially if you know a thing or two about the OS hiding behind the monitor and all the games you play. Bummed to know they create such costly benchmarking programs without any R&D in security.
  20. Errrrr... Try again with your score, bud. Click Submit at the top and it will show your hwbot.org score, not just your read bandwidth.