Stickman
Feb 1, 2004

sauer kraut posted:

The 5500s are tiny mobile chips with 128-bit memory buses, idk what's there to get antsy about? Unless you're building cheap Fortnite boxes for teenage relatives.
Why AMD refuses to challenge the popular 1660 Super/Ti segment with a mid-sized 192-bit chip is beyond me.

It's supposed to come in 4GB and 8GB flavors, just like the 570/580. GDDR6 means it'll have roughly the same memory bandwidth as the 570/580/590 despite the narrower interface.
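
Napkin math on that bandwidth claim (a quick Python sketch; the 14 Gbps GDDR6 figure is the rumored spec, not confirmed):

```python
def mem_bandwidth_gb_s(rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate x bus width / 8 bits per byte."""
    return rate_gbps * bus_width_bits / 8

print(mem_bandwidth_gb_s(14, 128))  # rumored RX 5500: 14 Gbps GDDR6, 128-bit -> 224.0 GB/s
print(mem_bandwidth_gb_s(7, 256))   # RX 570: 7 Gbps GDDR5, 256-bit -> 224.0 GB/s
print(mem_bandwidth_gb_s(8, 256))   # RX 580/590: 8 Gbps GDDR5, 256-bit -> 256.0 GB/s
```

So the narrower bus lands right on the 570's 224 GB/s and a bit under the 580/590's 256 GB/s.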


Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

Cygni posted:

you can get a 590 Red Dragon at $180 right now, which is faster than a 5500 will be and has 8GB of RAM vs 4GB. So yeah.

If you had a hard cap at $150, I guess maybe? But the rumors are that consumer cards are still a ways off as the partners burn through much cheaper Polaris stock. I mean, MSI just announced a brand new RX 580 design but not an RX 5500... so yeah.

Yeah sure go for the 590 if the 230W power draw doesn't mean anything to you

sauer kraut
Oct 2, 2004

SwissArmyDruid posted:

That is exactly what I am building.

The 1650 Super is due in 2 days and should be what you're looking for. 100W card with 1060/580 performance, and the new Turing video block if that's relevant to you.

fat bossy gerbil
Jul 1, 2007

The main issue with the 1650 Super is Nvidia being stingy with RAM like always. I'd rather not get another 4GB card, but you know team green won't give you anything else at that price point.

GRINDCORE MEGGIDO
Feb 28, 1985


sauer kraut posted:

Why AMD refuses to challenge the popular 1660 Super/Ti segment with a mid-sized 192-bit chip is beyond me.

Assuming they can't match the performance with 1660S sized chips / mem bus?

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

fat bossy gerbil posted:

The main issue with the 1650 Super is Nvidia being stingy with RAM like always. I'd rather not get another 4GB card, but you know team green won't give you anything else at that price point.

It's not like the 1650 really has the oomph to do much at levels you'd need 8GB for, anyhow. As a pretty much laptop-only part, it's not terrible. What's more obnoxious to me is the giant gap between the 1650 and 1660Ti in laptops. There's plenty of room for a 1650Ti/1660, whichever, but they aren't offering it for some reason.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

DrDork posted:

It's not like the 1650 really has the oomph to do much at levels you'd need 8GB for, anyhow. As a pretty much laptop-only part, it's not terrible. What's more obnoxious to me is the giant gap between the 1650 and 1660Ti in laptops. There's plenty of room for a 1650Ti/1660, whichever, but they aren't offering it for some reason.

There are quite a number of games where 4GB is definitely going to be a bottleneck for very high/ultra textures, even at 1080p. Texture quality is one of the most impactful settings in any game, and it's basically 'free' in terms of performance if you have the VRAM.

Craptacular!
Jul 9, 2001

Fuck the DH

Happy_Misanthrope posted:

There are quite a number of games where 4GB is definitely going to be a bottleneck for very high/ultra textures, even at 1080p. Texture quality is one of the most impactful settings in any game, and it's basically 'free' in terms of performance if you have the VRAM.

This is the “used 1070ti” market.

My 1070 is falling behind the $200 Black Friday cards, but it still has 8GB of RAM, so I keep the nice textures, go for shittier shadows and reflections to boost frames, and let G-Sync handle the rest. I haven't felt any significant difference even though my settings are overall lower than they were at 1080p.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Interview with Raja about Xe development.

Confirmed, Raja picked the name "Ponte Vecchio" because he wanted a junket to Italy :lol:

quote:

IC: Have you discussed how the eventual discrete graphics launch is going to happen?

RK: Not really. We are so much focused on execution right now. But I will tell you a funny story about the 'Ponte Vecchio' name. At Intel we have a policy that engineering code names are places or things you can find on a map. We have had too many 'lakes' and I wanted to do bridges. I wanted to pick a place that I don't mind going to for a launch! Florence in Italy has some of the best gelato in the world, and I love Florence and the art and architecture there as well.

Intel is working on MCM for graphics chiplets as well:

quote:

IC: Turning to gaming solutions, because there is a lot of interest in how Intel is going to attack the gaming space: what we've seen today is a compute GPU based on chiplets. Moving from a monolithic graphics chip to a chiplet design is a tough paradigm to solve, so does working on chiplets help solve the 'multi-GPU' issue on graphics? Is the future of graphics still consigned to single GPU, or should we expect multi-GPU scaling to become easier to manage?

RK: That’s a great question. As you know, solving the multi-GPU problem is tough – it has been part of my pursuits for almost 15 years. I’m excited, especially now, because multiple things are happening. As you know, the software aspect of multi-GPU was the biggest problem, and getting compatibility across applications was tough. So things like chiplets, and the amount of bandwidth now going on between GPUs, and other things makes it a more exciting task for the industry to take a second attempt. I think due to these continual advances, as well as new paradigms, we are getting closer to solving this problem. Chiplets and advancement of interconnect will be a great boost on the hardware side. The other big problem is software architecture. With many interesting cloud-based GPU efforts, I am optimistic that we will solve the software problems as well.

"Gen" architecture is done:

quote:

IC: As Xe pushes on and products come out, will Intel continue to develop Gen as a separate architecture line?

RK: All of our GPU teams are working on variants of the Xe architecture at the moment. We don’t see a reason for Gen anymore – Xe-LP, our low powered variant, covers the market that Gen covered.

Xe sounds like it could be an evolution of the Larrabee design again? (Might be why they re-hired some of that team.)

quote:

The Xe architecture is actually a narrower width machine - the variable vector width that we have and the ability to switch between SIMT mode and SIMD mode and combine them gives the software guys lots of tools to do more. Now having said that, the tools will take some time to mature. What we are seeing today is that we’re being more productive than prior attempts in the industry. We are also putting the software out ahead of the hardware for productive performance enablement.
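
For anyone who hasn't bumped into the SIMD/SIMT distinction that quote leans on, here's a rough Python/NumPy analogy (purely illustrative; nothing to do with Intel's actual ISA or tooling):

```python
import numpy as np

data = np.arange(16, dtype=np.float32)

# "SIMD mode": the programmer issues one explicit wide operation; the vector
# width (16 lanes here) is visible in the code itself.
simd_result = data * 2.0 + 1.0

# "SIMT mode": the programmer writes scalar, per-thread code; the hardware
# groups the threads and runs them in lockstep on the same vector units.
def kernel(tid: int, out: np.ndarray) -> None:
    out[tid] = data[tid] * 2.0 + 1.0

simt_result = np.empty_like(data)
for tid in range(data.size):  # a GPU would run these "threads" in parallel
    kernel(tid, simt_result)

assert np.allclose(simd_result, simt_result)  # same answer, different programming model
```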

Paul MaudDib fucked around with this message at 21:58 on Nov 20, 2019

Mr.PayDay
Jan 2, 2004
life is short - play hard
Enthusiast impressions follow:
The latest DirectX 12 titles show the Pascal cards falling further behind the Turing cards, as the Turing architecture is simply newer and slightly superior.
The 1080Ti is already 10% behind a 2080 in The Division 2 under DirectX 12, for example.

With a 3080Ti nowhere in sight for the next few months, my brother didn't want to wait and made the jump from an OC'd 1080Ti to a 2080Ti Aorus Waterforce, and what the actual gently caress. :dogbutton:

His fps at 3440x1440 with a 5930K @ 4.2GHz (5 years old), 1080Ti OC vs 2080Ti OC:
RDR2, ultra + manually maxed benchmark: 35 to 62
Forza Horizon 4: 75 to loving 108
Far Cry New Dawn, ultra: 75 to 102
Ghost Recon Wildlands, ultra: 49 to 65

So his 2080Ti Waterforce pushes at 3440x1440 (4.9 million pixels) almost the fps I get at 2560x1440 (3.6 million pixels)... and I've got a way better CPU and faster RAM. :stonk:
What a beast, but that's what 1479 Euro buys you over here.

It is, depending on the game and engine, a 33% to 77% fps jump for him. We completely underestimated that OC'ing and watercooling a 2080Ti leads to such insane numbers.
Once he gets a newer CPU and faster DDR4 RAM, he might match my 1440p fps at 3440x1440.
My next GPU will definitely be watercooled; it simply leads to 10-15% more fps, at least.
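
Spot-checking those uplifts (a quick sketch; fps pairs copied from the post above):

```python
# (1080Ti OC, 2080Ti OC) fps at 3440x1440, from the post above
results = {
    "RDR2":                  (35, 62),
    "Forza Horizon 4":       (75, 108),
    "Far Cry New Dawn":      (75, 102),
    "Ghost Recon Wildlands": (49, 65),
}

for game, (before, after) in results.items():
    gain = (after / before - 1) * 100
    print(f"{game}: {before} -> {after} fps (+{gain:.0f}%)")
# prints +77%, +44%, +36%, +33% respectively
```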

Mr.PayDay fucked around with this message at 23:17 on Nov 20, 2019

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
Yup. Getting a G10+AIO two generations ago was one of the better moves I've made in PC gaming. It's the gift that keeps on giving!

VelociBacon
Dec 8, 2009

Mr.PayDay posted:

Enthusiast impressions follow:
The latest DirectX 12 titles show the Pascal cards falling further behind the Turing cards, as the Turing architecture is simply newer and slightly superior.
The 1080Ti is already 10% behind a 2080 in The Division 2 under DirectX 12, for example.

With a 3080Ti nowhere in sight for the next few months, my brother didn't want to wait and made the jump from an OC'd 1080Ti to a 2080Ti Aorus Waterforce, and what the actual gently caress. :dogbutton:

His fps at 3440x1440 with a 5930K @ 4.2GHz (5 years old), 1080Ti OC vs 2080Ti OC:
RDR2, ultra + manually maxed benchmark: 35 to 62
Forza Horizon 4: 75 to loving 108
Far Cry New Dawn, ultra: 75 to 102
Ghost Recon Wildlands, ultra: 49 to 65

So his 2080Ti Waterforce pushes at 3440x1440 (4.9 million pixels) almost the fps I get at 2560x1440 (3.6 million pixels)... and I've got a way better CPU and faster RAM. :stonk:
What a beast, but that's what 1479 Euro buys you over here.

It is, depending on the game and engine, a 33% to 77% fps jump for him. We completely underestimated that OC'ing and watercooling a 2080Ti leads to such insane numbers.
Once he gets a newer CPU and faster DDR4 RAM, he might match my 1440p fps at 3440x1440.
My next GPU will definitely be watercooled; it simply leads to 10-15% more fps, at least.

I didn't find that much of a difference watercooling the GPU (2080ti EVGA XC Ultra), unless the Waterforce is more than an AIO cooler on the GPU?

The main difference is basically acoustics: if you run the fans so everything is the same loudness you get slightly cooler temps, but it's nothing crazy. If I did a comparison maxing the fan speed before and after putting the AIO on my card, I'd think I maybe gain around 5% FPS at most.

I think what you're seeing is the difference between a 1080ti and a 2080ti, with all that VRAM managing the high-res textures at 3440x1440.

B-Mac
Apr 21, 2003
I'll never catch "the gay"!
Yeah, water cooling isn't going to get you 10-15% more performance without modifying power limits. I gained 5-7% putting an AIO on my 2080 Ti, at a slightly lower noise level compared to the stock Gaming X Trio cooler, and that's a card with a 330W power limit out of the box.

Mr.PayDay
Jan 2, 2004
life is short - play hard
:thunk:
Maybe my math is wrong, but I can't get my OC core speed anywhere near stable at the Waterforce's 2150 core / 8100 memory clocks. That's definitely way above my Zotac 2080Ti's clocks: compared to the 1950/7100 I run before I get freezes, that's an additional ~10% core clock and ~14% on the VRAM clock.
That should translate to well above 5% more fps output.
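
Checking that clock math (sketch, clocks taken from the post):

```python
core_before, core_after = 1950, 2150  # MHz core clock, stable before vs Waterforce
mem_before, mem_after = 7100, 8100    # MHz memory clock, same comparison

print(f"core: +{(core_after / core_before - 1) * 100:.1f}%")  # core: +10.3%
print(f"mem:  +{(mem_after / mem_before - 1) * 100:.1f}%")    # mem:  +14.1%
```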

He gets almost my fps with 1:1 graphics settings at 3440x1440 vs my 1440p. That's ridiculous, in a positive way.

Cygni
Nov 12, 2005

raring to post

Someone tested an OEM RX 5500 unit out of an HP tower, and it's slower than a 580.

https://videocardz.com/newz/amd-radeon-rx-5500-oem-version-tested

e: the good news is that Videocardz did dig up a regulatory listing for two "RX 5500 XTs" from Giga, so it looks like the higher-performance (8GB?) version may be coming. Although AMD weirdly didn't announce that part with the other 5500s?

Cygni fucked around with this message at 19:05 on Nov 21, 2019

Seamonster
Apr 30, 2007

IMMER SIEGREICH
I'd gladly trade 10-15% of that performance for a slot-powered card...

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

Cygni posted:

Someone tested an OEM RX 5500 unit out of an HP tower, and it's slower than a 580.

https://videocardz.com/newz/amd-radeon-rx-5500-oem-version-tested

e: the good news is that Videocardz did dig up a regulatory listing for two "RX 5500 XTs" from Giga, so it looks like the higher-performance (8GB?) version may be coming. Although AMD weirdly didn't announce that part with the other 5500s?

It's actually... not that bad? I mean, it draws 133W at full load but only lags behind in some of the tests to the tune of 3-6 FPS; even the Firestrike stuff is only a few hundred points behind a newer 1660 or older 580. I think it'd be a good competitor in the 1660/Ti market with a moderate overclock, especially if it stays under 150W power draw. I'm sure other vendors could squeeze more out of it with custom boards/cooling too, plus we haven't seen the XT yet.

Rooted Vegetable
Jun 1, 2002

Seamonster posted:

I'd gladly trade 10-15% of that performance for a slot-powered card...

I'm in a unique situation where I need to shove my new card into an old server with no spare 8-pin power cables and a restricted PCIe card length (6.59in max). Unless something changes before BF, I'll have to get a straight 1650, likely Zotac's super-compact one.

I'm fine with that, as this is going to be occasional gaming (a few hours in the evening if I'm not too tired) and I don't mind the sacrifice.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Doesn't quite fit any thread, but: are Thunderbolt cards reasonably "generic" and able to work in any PC with the appropriate header, or are motherboards tied to specific whitelisted PCIe IDs or something like that?

eames
May 9, 2009

Since there was VR talk on the last few pages: Valve just announced "Half-Life: Alyx" for March 2020. VR only. It's a prequel set between the first two titles and totally not HL3. That should sell a few GPUs and headsets.

SCheeseman
Apr 23, 2003

From David Speyrer, one of the devs:

quote:

While it does take place before the events of Half-Life 2, we actually recommend that you play through Half-Life 2: Episode 2 before you play Half-Life: Alyx, for reasons that will become clear as you progress.

It's basically Half-Life 3.

Lambert
Apr 15, 2018

by Fluffdaddy
Fallen Rib
Considering this isn't a continuation of the story, no.

taqueso
Mar 8, 2004


:911:
:wookie: :thermidor: :wookie:
:dehumanize:

:pirate::hf::tinfoil:

Half Life: Sell Some drat VR Hardware Edition

ILikeVoltron
May 17, 2003

I <3 spyderbyte!

eames posted:

Since there was VR talk on the last few pages: Valve just announced "Half-Life: Alyx" for March 2020. VR only. It's a prequel set between the first two titles and totally not HL3. That should sell a few GPUs and headsets.

Yeah, I picked up a Valve Index 'cause of this. So I'd say it worked.

Enos Cabell
Nov 3, 2004


Lambert posted:

Considering this isn't a continuation of the story, no.

What else to say but "lol".

wolrah
May 8, 2006
what?

Paul MaudDib posted:

Doesn't quite fit any thread, but: are Thunderbolt cards reasonably "generic" and able to work in any PC with the appropriate header, or are motherboards tied to specific whitelisted PCIe IDs or something like that?

At least the Gigabyte cards are reported to work on a variety of other systems, as long as you have the right header.

Klyith
Aug 3, 2007

GBS Pledge Week

ILikeVoltron posted:

Yeah, I picked up a Valve Index 'cause of this. So I'd say it worked.

violating the "don't pre-order games" rule harder than it's ever been violated before


I know Half-Life is behind Valve's break-in-case-of-emergency glass and they're not just gonna toss off a low-effort VR "experience"... but I'm not positive Valve even remembers how to make games anymore.

Lambert
Apr 15, 2018

by Fluffdaddy
Fallen Rib

Enos Cabell posted:

What else to say but "lol".

You shouldn't be so mean to the person that picked up an Index for this.

But yeah, lol.

VelociBacon
Dec 8, 2009

Okay, since we were just talking about it: I'll test the EVGA hybrid kit temps and clocks on my 2080ti XC Ultra in case anyone was thinking about picking it up.

At the default-ish fan curve, which gives me 56% fan speed at full load (dead silent in a Fractal Define R5 case on my floor), temps are around 65C with the GPU clock at 2055MHz.

When I run the fans at 100% I get 62C and a GPU clock of 2070MHz.

This is with the stock fans and +140 on the GPU clock, no offset on the memory clock right now, as I'm still trying to find the highest stable GPU clock offset I can.

Tested in the new Jedi game with a 9900K at 5GHz all-core and a 1440p 144Hz monitor, getting around 120-140fps.
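
If anyone wants to capture the same temp/clock numbers on their own card, here's a minimal sketch that polls nvidia-smi's query interface (the field names are standard nvidia-smi query fields):

```python
import subprocess
import time

FIELDS = "temperature.gpu,clocks.current.graphics,clocks.current.memory,fan.speed,power.draw"

def sample() -> str:
    # One CSV line per GPU, e.g. "65, 2055 MHz, 7000 MHz, 56 %, 280.50 W"
    result = subprocess.run(
        ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

# Poll once per second while the benchmark runs; Ctrl-C to stop.
while True:
    print(sample())
    time.sleep(1)
```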

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

VelociBacon posted:

Okay, since we were just talking about it: I'll test the EVGA hybrid kit temps and clocks on my 2080ti XC Ultra in case anyone was thinking about picking it up.

At the default-ish fan curve, which gives me 56% fan speed at full load (dead silent in a Fractal Define R5 case on my floor), temps are around 65C with the GPU clock at 2055MHz.

When I run the fans at 100% I get 62C and a GPU clock of 2070MHz.

This is with the stock fans and +140 on the GPU clock, no offset on the memory clock right now, as I'm still trying to find the highest stable GPU clock offset I can.

Tested in the new Jedi game with a 9900K at 5GHz all-core and a 1440p 144Hz monitor, getting around 120-140fps.

What's your max power limit in watts, and what fan RPM are you running at 56 percent?

I put an Alphacool Eiswolf 240mm AIO on my 2080 Ti and it hits 55C at 1000 RPM (Noctua A12x25) with a 330W power limit. In-game clocks vary by game but aren't far off from yours, with the same type of monitor.

VelociBacon
Dec 8, 2009

B-Mac posted:

What's your max power limit in watts, and what fan RPM are you running at 56 percent?

I put an Alphacool Eiswolf 240mm AIO on my 2080 Ti and it hits 55C at 1000 RPM (Noctua A12x25) with a 330W power limit. In-game clocks vary by game but aren't far off from yours, with the same type of monitor.


[screenshot: HWiNFO64 fan sensor readings, current/minimum/maximum]

Max power is 347W as per HWiNFO64. It's unclear why the fan RPM shot to 3k, but I think it's a glitch, because I observed both fans at the same % with the same RPM. Assuming Fan1 is the correct RPM, 56% is about 1,111 RPM.

Vasler
Feb 17, 2004
Greetings Earthling! Do you have any Zoom Boots?
Hi folks - with a (hopefully decent) sale around the corner, how does this card look? There are also "XC Gaming" and "XC Ultra Gaming" options for this card. Is either of those something I should consider instead of the basic "Black" model?

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
The XC, I think, is a better-binned chip with a little factory overclock; the XC Ultra has a deeper two-fan setup that takes up an extra card slot but is cooler and quieter.

I have the 2070 super XC ultra and I am quite happy with it.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

Vasler posted:

Hi folks - with a (hopefully decent) sale around the corner, how does this card look? There are also "XC Gaming" and "XC Ultra Gaming" options for this card. Is either of those something I should consider instead of the basic "Black" model?

Generally speaking, unless you care about RGB, the only good reason to pay extra over the basic two-fan offering is a better-cooling triple-fan version.

ILikeVoltron
May 17, 2003

I <3 spyderbyte!

Klyith posted:

violating the "don't pre-order games" rule harder than it's ever been violated before


I know Half-Life is behind Valve's break-in-case-of-emergency glass and they're not just gonna toss off a low-effort VR "experience"... but I'm not positive Valve even remembers how to make games anymore.

If it makes you feel any better, it's the first game I've preordered in like 5 years and my friends have been trying to get me to play VR for months. I was really on the fence until I saw the announcement video. It looks AMAZING

treasure bear
Dec 10, 2012

Klyith posted:

violating the "don't pre-order games" rule harder than it's ever been violated before


I know Half-Life is behind Valve's break-in-case-of-emergency glass and they're not just gonna toss off a low-effort VR "experience"... but I'm not positive Valve even remembers how to make games anymore.

I think that's why they bought Campo Santo, the Firewatch studio. Some of them were leads on the first Telltale Walking Dead season.

eames
May 9, 2009

The German site Notebookcheck has info on a mobile Turing (Super) refresh in March 2020. They're typically not a site that just publishes every rumour they find on Twitter.

https://www.notebookcheck.com/Exklusiv-2020-kommen-Super-Grafikkarten-in-Notebooks.443238.0.html

I find it hard to believe that Nvidia isn't going to launch anything new given the huge number of AAA releases next year, but here we are.

BTW, the $1k Valve Index is sitting at #1 on Steam's best-seller list. I preordered the original Steam Controller and I learned my lesson.

Lambert
Apr 15, 2018

by Fluffdaddy
Fallen Rib

Vasler posted:

Hi folks - with a (hopefully decent) sale around the corner, how does this card look? There are also "XC Gaming" and "XC Ultra Gaming" options for this card. Is either of those something I should consider instead of the basic "Black" model?

Check out the 5700 XT as well if you're price-sensitive.

Lambert fucked around with this message at 11:30 on Nov 22, 2019

Vasler
Feb 17, 2004
Greetings Earthling! Do you have any Zoom Boots?

Lambert posted:

Check out the 5700 XT as well if you're price-sensitive.

I'm not price-sensitive, but I don't want to waste money by buying something stupid. There are so many models with what seem like very marginal differences.


Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
Yeah, that's EVGA. I'd say buy the Black edition, or if you want a better fan (and probably a marginally faster chip), get the XC Ultra.
