I don't have any OC; everything is stock. Well, I did have a Dell rebate in my pocket and a nice chunk of Dell Rewards that were close to expiring, so all of that made me jump on the purchase. But certainly the main factor was not being able to find a 3080/90 anywhere on the planet.

2 | Jasper 2472 | 01-21-2021 06:27 PM
Aurora R11, RTX 3080, constantly hitting power/voltage limit

Hello, I monitored 20 minutes of gaming in Cyberpunk using MSI Afterburner and noticed that the GPU is constantly hitting its power and voltage limits (see the first sketch at the end of this page for a way to read those limiter flags yourself). Money was never the issue here; it was the availability of the GPU I wanted. The bottom line seems to be that you're stuck with the ridiculous 320-watt power limit and old tech for the PCIe slot. I even ordered the Corsair fans and AIO cooler that other users had good results with. I really want to return mine at this point.

How simple would that be, and is it something that could be done through drivers or some type of update? Is there any way to bypass the restriction? Whether it voids the warranty or not is no concern of mine.

You will never get PCIe x16 out of it unless they somehow remove that limitation. Just call it what it is: a PCIe x8 slot. They knew what they were doing; it's pretty disgusting and disappointing. On the one hand they tell you "for optimal graphics performance, use a PCI-Express X16 slot for connecting the graphics card". Go ahead and throw in the "Resizable BAR" thing while you are at it.

I think 90% of users just want the latest Nvidia GPU (on a nice, properly cooled board) and have it running at full speed. Thing is, SLI is slowly being deprecated. Right, x8 is fine for SLI (even if it's a BIOS option, DIP switch, jumper, etc.).

Under the 'Limiting Policies' tab, 'GPU voltage limit reached' is highlighted in red. I've not been invited to any Alienware desktop betas lately, but if I was, I would Zonk it.

GPU voltage limit reached

Greetings, I've been testing my GPU usage on my GTX 970. Still, it's hard to argue AGAINST (single-slot) wider bandwidth. Sure, if it is actually PCIe 4.0 running at x8 now, that is a little better. But if the Area-51 is not coming back, I think these new Nvidia cards could use the extra-bandwidth interface. When the Area-51 was still being sold, I used to explain it like this:

Now I am assuming this is a chipset limitation, but I guess it could be for another reason, because that document does not explain why. Since these machines use the latest, most powerful cards, and mine came with PCIe 4.0, it is disappointing to see this in that price range. With PCIe 4.0 it will not be a performance hit, since 4.0 x8 has identical bandwidth to PCIe 3.0 x16, and that is still enough for these cards (the arithmetic is in the second sketch below). It is going to be a performance hit with PCIe 3.0 to use x8 instead of x16. I can understand dual cards dropping down to x8, but a single card? Either way, it does not take away the point that it's rather disappointing for a gaming machine costing $3,000 or more to have a chipset configuration that cannot run at least the main PCIe slot at x16.
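For anyone who wants to verify the throttling outside of Afterburner, here is a minimal sketch that polls NVIDIA's NVML library through the `pynvml` Python bindings (the `nvidia-ml-py` package) for the driver's throttle-reason flags. This is my own sketch, not anything Dell or the thread provides: it assumes the NVIDIA driver and `pynvml` are installed, and note that NVML only exposes power/thermal limiters; the separate "voltage limit" flag that Afterburner and GPU-Z show comes from NVAPI, which this does not query.

```python
# Sketch (assumption): poll NVML once per second while a game runs and print
# which limiters are active. Requires the NVIDIA driver and nvidia-ml-py
# ("pip install nvidia-ml-py"). This is the same class of information that
# Afterburner's limit graphs surface, minus the NVAPI-only voltage flag.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# Human-readable names for the bits in NVML's throttle-reason bitmask.
REASONS = {
    pynvml.nvmlClocksThrottleReasonSwPowerCap:           "SW power cap (board power limit)",
    pynvml.nvmlClocksThrottleReasonHwSlowdown:           "HW slowdown",
    pynvml.nvmlClocksThrottleReasonSwThermalSlowdown:    "SW thermal slowdown",
    pynvml.nvmlClocksThrottleReasonHwThermalSlowdown:    "HW thermal slowdown",
    pynvml.nvmlClocksThrottleReasonHwPowerBrakeSlowdown: "HW power brake",
}

try:
    for _ in range(60):  # sample for one minute
        mask  = pynvml.nvmlDeviceGetCurrentClocksThrottleReasons(gpu)
        watts = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0        # mW -> W
        limit = pynvml.nvmlDeviceGetEnforcedPowerLimit(gpu) / 1000.0
        active = [name for bit, name in REASONS.items() if mask & bit]
        label = ", ".join(active) if active else "no limiter"
        print(f"{watts:6.1f} W / {limit:.0f} W limit  ->  {label}")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```

If the posts above are right about the 320 W cap, you would expect the "SW power cap" line to stay lit almost continuously during Cyberpunk, matching what Afterburner shows.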
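And for the x8-versus-x16 argument, a second sketch (same assumptions: NVIDIA driver plus `pynvml`) reads the PCIe link the card actually negotiated and works through the arithmetic behind the claim that PCIe 4.0 x8 matches PCIe 3.0 x16 on paper. GPU-Z's "Bus Interface" field reports the same link state if you prefer a GUI.

```python
# Sketch (assumption): report the negotiated PCIe link and compare the
# theoretical bandwidth of the configurations discussed in the thread.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

cur_gen   = pynvml.nvmlDeviceGetCurrPcieLinkGeneration(gpu)
max_gen   = pynvml.nvmlDeviceGetMaxPcieLinkGeneration(gpu)
cur_width = pynvml.nvmlDeviceGetCurrPcieLinkWidth(gpu)
max_width = pynvml.nvmlDeviceGetMaxPcieLinkWidth(gpu)
# NOTE: cards drop the link to Gen1 at idle to save power, so query this
# under load (GPU-Z's "?" render-test button exists for the same reason).
print(f"link: Gen{cur_gen} x{cur_width} (card supports up to Gen{max_gen} x{max_width})")

def usable_gbs(gigatransfers_per_lane: float, lanes: int) -> float:
    """Usable GB/s: per-lane GT/s, 128b/130b encoding, 8 bits per byte."""
    return gigatransfers_per_lane * (128 / 130) / 8 * lanes

print(f"PCIe 3.0 x16: {usable_gbs(8, 16):.2f} GB/s")   # ~15.75 GB/s
print(f"PCIe 4.0 x8 : {usable_gbs(16, 8):.2f} GB/s")   # ~15.75 GB/s, identical
print(f"PCIe 3.0 x8 : {usable_gbs(8, 8):.2f} GB/s")    # half, the R11 worry

pynvml.nvmlShutdown()
```

The numbers back up the thread: on a PCIe 3.0 board the x8 wiring halves the available bandwidth, while on PCIe 4.0 the same x8 link lands at the familiar ~15.75 GB/s of 3.0 x16.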