PC GAMING GEAR THREAD
DIPSET last edited by DIPSET
I only know enough about PCs to get by, so I implore you to research further, but it seems like an 850W PSU is recommended. Will need a new PSU if you want to avoid using the adapter. Unclear how much power this will draw at any given moment.
Nvidia confirms 12-pin 'included adapter' for the Ampere Founders Edition cards
Looks like a next gen upgrade won't be as easy.
We don't actually know specs yet, but the Seasonic adapter box that got sent out says 850W is recommended. That doesn't really mean much; PSU manufacturers overestimate this kind of thing all the time. It depends on what you've got now, what the TDP/power draw of the individual card you're getting is, and what your current power overhead is.
The 3090 Founders Edition is RUMORED to draw 350W at stock. The new 12-pin adapter is just 2x8-pin connectors with a different terminus on the GPU side. 2x8-pin 12V power is 300 watts, and you can draw about 75 watts from the board. So there's a maximum draw of 375W with the power adapter.
So 850W might be recommended to leave you plenty of overhead (to avoid power throttling). It's probably not REQUIRED for non-OC cards, depending on what you've got in your PC right now. I have a brand new 750W and I think I'll probably be fine. The rest of my system uses like <250W, so even with a maxed-out 375W card, I'm still at around 625W max, leaving about 125W overhead.
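That napkin math can be sketched out like this (all wattages here are the rumored/assumed numbers from this post, and the function names are just for illustration):

```python
# Rough PSU headroom estimate using the numbers discussed above.
# All wattages are rumored/estimated, not measured values.

PCIE_8PIN_W = 150   # each 8-pin PCIe connector is rated for 150 W
PCIE_SLOT_W = 75    # the PCIe slot itself can supply up to 75 W

def max_gpu_draw(num_8pin: int) -> int:
    """Worst-case GPU draw through the connectors plus the slot."""
    return num_8pin * PCIE_8PIN_W + PCIE_SLOT_W

def headroom(psu_watts: int, rest_of_system_watts: int, num_8pin: int) -> int:
    """PSU wattage left over after the GPU and the rest of the system."""
    return psu_watts - rest_of_system_watts - max_gpu_draw(num_8pin)

# The 12-pin adapter feeds from 2x8-pin: 2*150 + 75 = 375 W max
print(max_gpu_draw(2))        # 375
# 750 W PSU, ~250 W for the rest of the system -> ~125 W of headroom
print(headroom(750, 250, 2))  # 125
```

Plug in your own PSU rating and system draw to see whether you actually need the 850W.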
The AIB cards with highly overclocked chips and ram, etc. might be another story, but those aren't going to use that 12 pin, they will prob use 3x8 pin.
9am PST tomorrow, we find out how big the dent in my credit card is gonna be.
Shoulderguy last edited by
Prices and release dates for the new Nvidia GPUs:
RTX 3090 - $1499 - Sept 24
RTX 3080 - $699 - Sept 17
RTX 3070 - $499 - October
Phbz last edited by
@shoulderguy Can't wait for PS6/Xbox Macarena (?)!
3090 power draw/rec PSU 750w:
The price on the 3070/3080 is actually... good?*
*compared to 2000 series cards
I think I bought a 750W in 2016 because I thought 1000W was overkill, but damn, I'm pushing the edge just a few years later.
I mean, for most things 1000W is overkill. For this card, 1000W is overkill. 750W is recommended, but depending on your setup, not entirely necessary. Like I said earlier, I've still likely got somewhere in the neighborhood of 125-150W of overhead on my 750W PSU. And I've got a 3900X with 32GB of RAM, an AIO, and RGB fans/keyboard/mouse being powered. I'm barely cracking 250W at peak with all that nonsense.
The cards are actually more efficient than the 2000 series, being on a smaller process; they just have twice the number of cores. I can't believe the 3090 has over 10k cores on die, that's nuts!
Digital Foundry got a 3080 and tested it in a limited fashion.
DIPSET last edited by DIPSET
I'm considering upgrading my monitor. I currently have the ASUS VA32AQ which is a 32" 1440p, 60Hz monitor. I love the size and I don't think I could return to anything other than 28-32" ever again. It's just TOO practical for both work and recreational gaming. But my god the colours suck on this monitor.
I currently have an ASUS GTX 1080 from 2016 and I can't justify upgrading it when I haven't even pushed it to its full potential yet. It's capable of 4K gaming, so it's outclassing this bad monitor.
Does anybody recommend a really nice gaming display? 4K @ 144Hz kinda thing? I'll take good colours over 1ms response time and stuff like that. I'm pretty sure my computer can handle the bump just fine and I want to maximize my Cyberpunk experience in November.
I guess the question is: why? 4K 60Hz in like a 27 or 32 inch gaming monitor? Prioritizing color accuracy over framerate? This is kind of the opposite of the market. You CAN get 4K 60fps monitors, sure - but they offer little to no value over a 1440p high refresh rate monitor.
At desktop size, ultrawide 1440p is really the way to go IMO (which is what I did).
If you really want to do 4k 60fps gaming, you're probably better off looking at good gaming TVs like the Vizio P Series Quantum X or LG CX OLED. But then you're talking like 48-65 inches, which will likely not fit most desks.
If you're really dead set on a monitor sized 4k experience, something like this is probably what I'd go with.
It's got good color accuracy, a decent local dimming array, decent peak nit brightness for HDR, good refresh and response time, and all the typical bells and whistles. I still think it's a waste of money, but hey - you do you. :p
Thanks man. I think I need to re-evaluate what I want then. I assumed 4K gaming became more popular over the past few years but clearly it’s still niche to a degree considering the prices of monitors that have 4K resolution AND a high refresh rate. I thought maybe BestBuy.ca was lacking cause of Covid but it seems like maybe there aren’t that many of these types of monitors out there?
Mainly, I currently have a monitor that’s good for everything except gaming and photo editing. I’ll edit a photo in Lightroom then view it on my Mac, iPhone, or work PC and it’s like a completely different photo. That’s why I’m placing so much emphasis on colour.
Maybe I’ll look into a 1440p monitor with 144Hz refresh and some serious colour accolades. You’re probably right, it’s a waste to get a jack of all trades monitor...
It's not even that 4K is niche - if the latest Steam survey is to be believed, 1440p is still niche!
Something like 65% of the market is still at 1080p.
Again, at 27-32 inches, higher resolution is of limited importance. 4k is mostly for tv gaming where there's a big, noticeable difference. With the screen real estate of a desktop monitor, there's usually not enough of an upgrade from 1440 that matters more than framerate. Which is why ASUS's new monitor is 1080p/360Hz.
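For a rough sense of the numbers, here's the pixel-density math (the display sizes below are illustrative assumptions, not anyone's actual setup):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a display of the given resolution and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)  # pixels along the diagonal
    return diagonal_px / diagonal_in

# 27" 1440p desktop monitor vs 32" 4K monitor vs 65" 4K TV
print(round(ppi(2560, 1440, 27), 1))  # ~108.8
print(round(ppi(3840, 2160, 32), 1))  # ~137.7
print(round(ppi(3840, 2160, 65), 1))  # ~67.8
```

The 4K TV is actually the least dense of the three, which is why resolution matters so much there: at couch size, 4K is what pulls the density back up, while at desk size 1440p is already sharp enough that framerate is the better spend.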
For what it's worth, gaming on my 3440x1440 ultrawide is delightful. As is gaming on my 65 inch 4K TV. Happily representing the 0.9% and the 2.24% demographics.
Oscillator last edited by
My current (el cheapo LCD backup) monitor runs at 1366x768 60hz. My good monitor (a 19" CRT waiting repair) would be running either 1280x960 85hz or 1600x1200 85hz depending on the task.
@oscillator You are in the second highest Steam demographic!
I haven't posted my system in here yet?
This is my gaming setup that I built a few months ago:
Ryzen 9 3900x with an EK AIO 360 cooler
MSI MAG X570 Tomahawk
32GB G.Skill Trident Z Neo 16-19-19-39
Currently have a 2060 KO Ultra that I bought as a stopgap card until the 3000 series dropped, will be going into my media server after.
Prob gonna get a 3080 next week.
Samsung 970 EVO NVMe 1 TB M.2 SSD
Seasonic Focus PX-750
LG 34GK950F-B 34" Ultrawide primary monitor installed on a Monoprice Workstream arm.
Dell UltraSharp 2405FPW 1920x1200 10-year-old monitor for chat windows/stream monitoring
Corsair K95 w Cherry MX Brown keys
Razer Deathadder mouse
Rode NT1 USB on the PSA1 Studio boom arm.
Cyberpunk edition MS controller and charging cradle
Thrustmaster T16000m HOTAS (just arrived today!)
Autonomous Smartdesk 2 Premium adjustable height desk in Bamboo/Black
Autonomous Ergochair 2
SMSL AD18 80Wx2 USB DSP DAC/AMP
(currently in the process of being made, shipping at the end of the month) Beyerdynamic DT 177X from Drop
(eventually) Vanatoo Transparent Zero desktop speakers
Scotty last edited by
8 GB RAM
RX 590 OC
Probably will wait to build a completely new system towards the end of 2021.
Hey @TokyoSlim mind if I ask you more unsolicited PC questions?
I was looking into some of these ultrawide gaming monitors, and a lot of the G-Sync ones seem to be a lot more expensive than the FreeSync ones. I noticed the LG monitor you have scored really well on Rtings, and they mention how FreeSync tech is compatible with Nvidia, but are there any noticeable downsides you know of to using a FreeSync monitor with an Nvidia card?
I'm sure the G-sync monitor paired with an Nvidia card is more or less a bell and whistle, but I figured I'd ask.
As far as I can tell, there's nothing specifically that is a downside of using FreeSync with an NVIDIA card. The main difference between the two is that FreeSync is a SOFTWARE implementation of an open standard, while SOME G-Sync monitors have additional hardware built into the display that allows some more "advanced features" to work. That's what makes them more expensive. If what you're looking for is adaptive refresh rates without screen tearing as a baseline - both of them seem to work fine.
Essentially, it's just two (It's actually like 6, with the Free/G-sync Pro/Advanced/etc) competing standards for the same thing, like HDR. (Dolby Vision vs HDR10+, etc)
So, basically G-Sync is more expensive because it's got additional G-Sync chips in the monitor. Mine is a FreeSync monitor, but it essentially works with both "formats". I don't think I can tell the difference though.