HWBOT Community Forums

Dancop

Everything posted by Dancop

  1. Dafuck... is this real, man? I was trying to reach 36K with a 1080... but you just destroyed all my dreams! Awesome work, buddy
  2. Thanks a lot... I love this one a bit more: Dancop`s 3DMark11 - Performance score: 32688 marks with a GeForce GTX 1080 Sent from my HUAWEI NXT-L29 using Tapatalk
  3. 25xx 3d stable Dancop`s 3DMark11 - Performance score: 32688 marks with a GeForce GTX 1080
  4. Everything you guys do is at your own risk, and I don't really recommend using this BIOS on other cards. No idea what it does to other PCBs or VRMs. I'd rather keep the Founders Edition as it is; its VRM is not the strongest, though. Sent from my HUAWEI NXT-L29 using Tapatalk
  5. Update: more soon, I'll try another card to see if these things are common
  6. Thanks a lot, guys... No matter who'll win, for today I see no chance to OC
  7. I've had the same problem, wanted one soooooo much...so many complaints about 1080 and oc... tbh, it seems to work, doesn't it?
  8. Quick follow up on LN2 Dancop`s GPUPI - 1B score: 13sec 492ms with a GeForce GTX 1080 More tomorrow...need a nap
  9. You're welcome Sent from my HUAWEI NXT-L29 using Tapatalk
  10. As said above... I'm currently testing and will report later. I just wanted to open this thread as soon as possible to provide initial experiences and, especially, the really amazing new BIOS! This is with 1.32V on air... EASY!
  11. Hi fellas, here's some info about the new ASUS Strix GTX 1080. I'll keep this thread as short as possible. Luckily (for me, anyway) this card doesn't seem to need a soldering iron, so the typical soldering part, done and inspired by XA, won't be in this thread, unless you play with hotwire.

Here's the card:

Here's the original BIOS, NVFLASH and the XOC BIOS: 1080strixXOC.rar beim Filehorst - filehorst.de

Just drag and drop the XOC file onto the prepared NVFLASH shortcut. This BIOS has no power target and a fixed measured voltage around 1.24V. If you try this BIOS on air, you'll notice a very fast rise in temperature, so be careful. I did some initial testing with the original BIOS and found that 1.24V is good enough for 2250 to 2300 MHz. I'll play with the card a bit more and leave some space for upcoming info.

I don't think I have to show pictures of the plastidipped card, but here is the explanation and photo for the hotwire mod. Thanks to TechPowerUp for the photo (https://www.techpowerup.com/reviews/ASUS/GTX_1080_STRIX/3.html). The readouts (top) and hotwire points (bottom) are, from left to right, PLL/MEM/VGPU. Furthermore, you'll have to short the former LN2 pad to activate hotwire and remove the small resistors shown in the picture. Do this only if you want to use the hotwire method!!! Instead of hotwire, you can also use the following VRs (still, you have to remove the resistors and short the former LN2 pad!):

Hotwire point to ground:
GPU: 20k VR
MEM: 50k VR
PLL: 100k VR

Tests on air (original BIOS only, because the XOC BIOS hits the temp limit):

And here with 1.32V adjusted and measured on air!

First tests on cold showed:
1. No CB
2. No CBB

Hell yes... great news! So, the first tests were kind of tricky and fun as well. Sure, you can't count GPUPI as a mega-intense GPU test, which is why my clocks were so high; for heavy 3D stuff we're talking about much lower, but still interesting, clocks. Let's start by explaining some things:

1. The 1.25V barrier - tbh, there is no specific barrier, you'll have to test each card separately. What happens with high voltage I'll explain later in the LN2 HOW-TO.
2. Full-pot benching - yes, you can... but here is the biggest issue, briefly now and in more detail later: at a certain temperature, around minus 95 for my card, the internal temp sensor of the GPU detects positive temperatures.
3. Memory clocks decreasing with a cold PCB - no! They got better, on my card at least. On air my max is 5750; on LN2 I can run 5850 easily.

LN2 - HOW TO
1. Start with the pot at positive temps, around 10 degrees. Now try to figure out the max stable clocks in GT1 of 3DMark 11, which was by far the hardest benchmark for my card. Don't touch the voltage yet.
2. Once you've found your max stable clocks, go lower with the temp and leave the GPU-Z renderer running while you do that.
3. Once you hit minus 40 on the GPU, you'll notice a drop in your frequency of around 100 to 120 MHz! Here we have the first temp issue with the GPU. It seems to think "I am too hot!", so it downclocks to the minimum boost clock. From now on, you'll have to add exactly this amount to the adjusted offset.
4. Keep the above in mind and go lower with the temp... once you get a black screen, you'll have to reboot. Keep in mind the temperature where this happens!!! IMPORTANT
5. After the reboot, you'll notice positive temps in idle, even at full pot; see the pictures below. The first picture shows the IDLE behaviour at full pot. Full pot is so cold that even under LOAD, the GPU detects positive temperatures. The second picture shows the IDLE and LOAD behaviour at minus 120. In IDLE the sensor is too cold to work right, but under LOAD the GPU heats up internally, so the sensor works properly and goes back to minus 40, which is the lowest Afterburner can detect. I tried to get around this with a constant load through the GPU-Z renderer, but that load is not high enough to heat the GPU to the point where it detects the right temp. Now we come back to the third bullet point... when the GPU sensor detects a temp higher than minus 40, you get your 100 to 120 MHz back... and there's your crash. Instead of 2320 you get 2440; that's way too much for detected temps around plus 70, and the GPU shuts off. Look at the picture below - IDLE = high temps, 3D11 GT1 = good temps, GPU-Z renderer = wrong temps.
6. So, to conclude: everything below minus 95 degrees is no good for reaching nice stable clocks. So let's go back to minus 90 and start OVERCLOCKING.
7. Now we'll try to find the highest working voltage. For this I recommend starting with +200mV, which is around 1.29 to 1.31V under load. Now adjust the offset to 2300 MHz core freq. Let 3D11 GT1 run and see how far you can go with the voltage. If you get an instant black screen, your voltage is too high, which might also be a temp problem, because I was able to run a higher voltage at a lower temperature.
8. Once we've found our max voltage, we can start finding the max core freq. Mem freq should be at least the same as on air. Sometimes you'll see the card run a certain freq easily and then not again after a crash, so reset the driver or reboot to get rid of this. These cards are pretty consistent; I was able to bench exactly the same freq even after 4 hours of benching.

When you have your temps right, you see this:

And here are some results: :banana: :banana:

I saw no problems with adjusting clocks at any temp, like we all know from the 980 Ti.

Some personal thoughts and tips:
1. The card is really quiet!
2. I tried some games in 4K. I'm not really a gamer, but with a stock boost around 2050MHz I can play all the new games at a minimum of 40FPS, which is really great!
3. The original BIOS is already really good and quite efficient. This is perfect for gamers and ambient overclockers! If you want to play at colder temps, starting from water, give the XOC BIOS a try. I really love it, very good work ASUS!
4. On air, always keep watching your temperatures. This is the main way to lose efficiency, because the GPU loves to clock down at certain temperature points...
5. Watercooling with GPU-only cooling: be careful to get the best possible contact. NVIDIA hasn't used any frame around the GPU since Pascal. Furthermore, the VRM should be cooled properly, so keep that in mind!
6. For LN2, I recommend using the backplate. It distributes the mounting pressure the safest way!
7. Pascal has no DVI/VGA converter anymore, so VGA won't work. Just use any other cable; they all worked at full pot for me!

SIDENOTE!!!! Everything written above is just my own experience. If you see other things, let me know; if you kill your hardware, don't let me know, because you do all the mods at your own risk. Neither I nor ASUS is responsible for anything... be aware that you void your warranty by soldering on the card (or any kind of hardware!!!).
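The offset bookkeeping in the LN2 HOW-TO above can be sketched as a tiny calculator. This is purely illustrative, not an official tool: the 100-120 MHz downclock at minus 40 and the minus 95 sensor-wrap point are my observations on my specific card, and the function name and constants are made up for this example; your card's numbers will differ.

```python
# Illustrative sketch of the LN2 clock-offset bookkeeping described above.
# DOWNCLOCK_MHZ and SENSOR_WRAP_C are one card's observed values, not constants.

DOWNCLOCK_MHZ = 120    # boost drop observed once the die sensor reads -40 C
SENSOR_WRAP_C = -95    # below this, the sensor wraps to bogus positive temps

def required_offset(target_mhz: int, base_boost_mhz: int, die_temp_c: float) -> int:
    """Return the clock offset needed to actually run target_mhz at die_temp_c."""
    if die_temp_c <= SENSOR_WRAP_C:
        # step 6 above: below roughly -95 C the sensor misreads, clocks jump
        # back up unpredictably and the card crashes - warm the pot up instead
        raise ValueError("sensor wraps below -95 C: back off to around -90 C")
    offset = target_mhz - base_boost_mhz
    if die_temp_c <= -40:
        # step 3 above: the card silently sheds ~100-120 MHz in this range,
        # so the adjusted offset must include that loss
        offset += DOWNCLOCK_MHZ
    return offset

# Example: stock boost 2050 MHz, aiming for 2320 MHz at -90 C
print(required_offset(2320, 2050, -90))  # 390
```

It also shows why the crash in step 5 happens: if the sensor wraps to positive temps, the +120 MHz compensation is still applied, so a card set up for 2320 suddenly runs 2440.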
  12. Got your advice. Now just put it into the 2x category... Very nice score, btw