The Illusive Man
Mar 27, 2008

~savior of yoomanity~

Hasturtium posted:

Those fetch real money these days. No kidding.

NGL, I'd pay more than I'm willing to admit for quality retro PC parts to throw together a Win98/WinXP build.

Cygni
Nov 12, 2005

raring to post

slidebite posted:

Oh, found an in box voodoo 5500agp in my garage lol. Too bad its agp!


Hasturtium posted:

Those fetch real money these days. No kidding.

legit $500+, sheesh

runaway dog
Dec 11, 2005

I rarely go into the field, motherfucker.
I got a gtx 460 only 200 bux

gasman
Mar 21, 2013

hey now
I think my dad still has one of the first Voodoo cards, he never gets rid of anything.
Had to connect it via the vga port to another card iirc.

Is it worth anything? I may have to borrow it for "science".

slidebite
Nov 6, 2005

Good egg
:colbert:

e: That would likely be a V1 or V2. Both were awesome in their time.

Ironic if it's worth more than the 1080ti!

I hate ebay... but might post it locally and see if anyone wants it. I thought it might be worth a bit, but not quite that much.

AARP LARPer
Feb 19, 2005

THE DARK SIDE OF SCIENCE BREEDS A WEAPON OF WAR

Buglord
I have an old ati 9700 something-or-other card in great condition and the shroud has a picture of an elf with enormous pointy ears and she's holding a laser gun. I'm not sure a price can be put on something like this -- it's not like you can snap your fingers and crank out a picasso in a factory.

Bad Munki
Nov 4, 2008

We're all mad here.


I have three (3) nvidia branded “do not disturb, I’m gaming” door hangers with some sort of edgy snarling dog on them, $50 each, don’t waste my time with lowball offers, I know what I got

AARP LARPer
Feb 19, 2005

THE DARK SIDE OF SCIENCE BREEDS A WEAPON OF WAR

Buglord

Bad Munki posted:

I have three (3) nvidia branded “do not disturb, I’m gaming” door hangers with some sort of edgy snarling dog on them, $50 each, don’t waste my time with lowball offers, I know what I got

I realize they're door-hangers, but we're talking nVidia here -- are you forced to install GeForce Experience in order to use them?

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Space Racist posted:

NGL, I'd pay more than I'm willing to admit for quality retro PC parts to throw together a Win98/WinXP build.

Hit me up, I might be able to help. :shrug:

Some poo poo I have has quite literally never been used, like a BFG 6800GT AGP card, or an SB Audigy with the breakout box. I think I also have a 7900GT.

If you wanna go super fancy I know I have UW160 SCSI controllers and drives. Those are/were used, though. I think I've got two 15k drives (first and second gen) and one 146.7GB Atlas 10K, the revision of which I don't recall. They haven't been used in nearly 20 years.

BIG HEADLINE fucked around with this message at 08:18 on Jan 28, 2021

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

slidebite posted:

I was going to put my 1080ti in Mrs Slidebites pc when i get my ampre next week (shes still chugging along with a 780gtx in her 2500k) but man, i might end up selling it.

Oh, found an in box voodoo 5500agp in my garage lol. Too bad its agp!

yeah lmao nice tease, post ur voodoos in samart already

edit: edited

Paul MaudDib fucked around with this message at 08:37 on Jan 28, 2021

Shipon
Nov 7, 2005
How picky are buyers about having the original packaging? I have a 1080Ti I'd like to get rid of while the going's good, but I foolishly shipped the original box off to someone with an older card I gave them, because I had nothing else to ship it in at the time.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Shipon posted:

How picky are buyers about having the original packaging? I have a 1080Ti I'd like to get rid of while the going's good, but I foolishly shipped the original box off to someone with an older card I gave them, because I had nothing else to ship it in at the time.

if you listed a blower or basic model on SA-Mart at like $400+shipping it'd probably be gone in a day. If it's a nice one, $450+sh, maybe $500+sh for an exceptional model.

once you advance past maybe a generation behind (2070s is a generation behind, 1080 ti is two) nobody gives a poo poo about packaging. Like I guess it's nice to have but it doesn't really matter.

Just make sure they have an antistatic bag or something if possible. Even if not, prolly ok.

Paul MaudDib fucked around with this message at 18:22 on Jan 28, 2021

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
Edited.

BIG HEADLINE fucked around with this message at 09:01 on Jan 28, 2021

OhFunny
Jun 26, 2013

EXTREMELY PISSED AT THE DNC

FlamingLiberal posted:

Just sold my old 1060 6 GB for $250, what is even happening

https://twitter.com/PCMag/status/1354553548789460993?s=19

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
Quote is not edit. :cripes:

Scalping is only going to get worse as I'm gonna just assume the r/WSB and buttcoiner crowd are one of those near perfect circle Venn Diagram deals.

BIG HEADLINE fucked around with this message at 09:03 on Jan 28, 2021

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
https://videocardz.com/newz/iris-xe-intels-first-desktop-discrete-gpu-will-not-work-with-amd-processors

quote:

A special UEFI BIOS is required for Intel Iris Xe discrete desktop graphics cards to work, Legit Reviews revealed yesterday. The announcement of Intel’s first dGPU for system integrators and OEMs has sparked joy among GPU enthusiasts, finally, after so many years we get to see a third player entering the game.

Intel Iris Xe graphics are equipped with DG1 GPU based on Xe-LP architecture. This is an entry-level performance with only 80 Execution Units (640 Shading Units), well under anything gaming oriented. The Iris Xe graphics cards are not targeted at gamers though. These cards are meant to be offered for low-power and low-cost multimedia machines, home entertainment systems, or business consumers.

Only with Intel’s Xe-HPG based DG2 GPU, we might finally see true competition in gaming markets, which is now needed more than ever, as both NVIDIA and AMD are facing supply issues. Intel had already confirmed though that DG2 will be manufactured by an external foundry, which could also be affected by wafer constraints.

When it comes to Iris Xe, it has been confirmed by Intel that the graphics card will only work with certain Intel systems, namely Coffee Lake-S and Comet Lake-S. A special motherboard chipset is also required, and neither of them is new. More importantly though, the Iris Xe graphics card will not work on AMD systems.

The Iris Xe discrete add-in card will be paired with 9th gen (Coffee Lake-S) and 10th gen (Comet Lake-S) Intel® Core™ desktop processors and Intel(R) B460, H410, B365, and H310C chipset-based motherboards and sold as part of pre-built systems. These motherboards require a special BIOS that supports Intel Iris Xe, so the cards won’t be compatible with other systems.

— Intel Spokesman to Legit Reviews

To be clear, it doesn't seem to be a deliberate snub so much as a limitation of the roll-out: if you're only going to ever pair them with Intel CPUs and Intel boards because the cards are only ever meant to be shipped as part of prebuilts by OEMs, then there's no reason to work on enabling broader compatibility.

gradenko_2000 fucked around with this message at 09:23 on Jan 28, 2021

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
Watch Xe be some hyperminer anomaly or something equally batshit.

Alchenar
Apr 9, 2008

redreader posted:


Also I just plain cannot notice any FPS increases over about 60 or 70, even with my 144hz monitor. I couldn't notice them before but I also never had a 144hz monitor before. I do try to get my FPS as high as possible for multiplayer FPS's, on the off-chance that an FPS increase I don't notice, will still make me play better. But I really can't tell if you ask me.

Also: g-sync on/off is absolutely not 'night and day'.

I'm writing this in the hopes that it makes some people stop panicking and trying to buy a card, and realise that what they have might be good enough already. But I can say for sure that my 980 was not cutting it with a 1440p screen.


You have made sure your Windows monitor refresh setting is set to above 60, right?

exquisite tea
Apr 21, 2007

Carly shook her glass, willing the ice to melt. "You still haven't told me what the mission is."

She leaned forward. "We are going to assassinate the bad men of Hollywood."


I honestly have a hard time seeing the difference between 90 and 100fps myself. You can feel higher refresh rates in the form of input lag or less ghosting in very active scenes, but in terms of the overall picture there's not much of a visual difference to me above 60-70fps. I will certainly always notice drops in framerate though, which is why frame pacing and stability are more important past a certain threshold. Sometimes games play a lot nicer if you simply add an FPS limit to avoid a huge variance in frame times.
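The frame-pacing point above can be put into rough numbers: frame time is just 1000/fps, so the absolute gain per frame shrinks as the rate climbs, while an fps cap narrows the frame-time swing. A minimal sketch (the example rates are illustrative, not measurements):

```python
# Frame time in milliseconds for a given frame rate: 1000 / fps.
def frame_time_ms(fps):
    return 1000.0 / fps

# The absolute gap shrinks as fps climbs, which is why 90 vs 100 fps
# is hard to see while a drop from 60 to 45 is obvious.
print(round(frame_time_ms(60) - frame_time_ms(70), 2))   # 60 -> 70 fps saves ~2.38 ms/frame
print(round(frame_time_ms(90) - frame_time_ms(100), 2))  # 90 -> 100 fps saves ~1.11 ms/frame

# An uncapped game swinging between, say, 80 and 140 fps spans
# 12.5 ms down to ~7.1 ms per frame -- over 5 ms of jitter --
# while a 90 fps cap pins it near a steady ~11.1 ms.
print(round(frame_time_ms(80) - frame_time_ms(140), 2))  # ~5.36 ms of swing
```

Which is the arithmetic behind "frame pacing and stability matter more past a certain threshold": the cap trades peak rate for a consistent frame interval.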

Threadkiller Dog
Jun 9, 2010
I intend to just set max fps to 80 or 90 and forget about it. Maybe it will even leave some thermal or boost headroom for better minimum fps or w/e

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Alchenar posted:

You have made sure your Windows monitor refresh setting is set to above 60, right?

oh my god you mean I haven't actually been seeing 75 Hz this whole time???

runaway dog
Dec 11, 2005

I rarely go into the field, motherfucker.

gradenko_2000 posted:

oh my god you mean I haven't actually been seeing 75 Hz this whole time???

at least you're not my brother in law who had a 144hz for almost a year before I was over there one day like "man your windows is running at 60fps"

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

exquisite tea posted:

I honestly have a hard time seeing the difference between 90 and 100fps myself. You can feel higher refresh rates in the form of input lag or less ghosting in very active scenes, but in terms of the overall picture there's not much of a visual difference to me above 60-70fps. I will certainly always notice drops in framerate though, which is why frame pacing and stability are more important past a certain threshold. Sometimes games play a lot nicer if you simply add an FPS limit to avoid a huge variance in frame times.

I'm the same; I limit my framerate to 100 in games despite having a 144hz display - I can definitely feel 144hz in general browsing and Windows things, but in games 100 is about the top limit before I can't tell whether it's higher or not. It also saves on power consumption, something I feel isn't considered all that often in these kinds of discussions.

For example, I went back to a desktop PC after several years on the couch with a 65" and it took £5 a month off my electricity bill. It all adds up.

Ever since then, I've always been conscious of power consumption, I have display backlights set fairly low, power saving modes on, screensavers, that kind of thing.

Zedsdeadbaby fucked around with this message at 12:49 on Jan 28, 2021
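The £5/month figure is plausible on back-of-envelope math. A quick sketch; the wattage delta, play time, and tariff below are assumptions for illustration, not numbers from the post:

```python
# Rough monthly cost of a GPU power delta, assuming (hypothetically)
# ~120 W saved by an fps cap, 4 hours of gaming a day, and a UK-ish
# tariff of ~0.19 GBP per kWh circa 2021.
watts_saved = 120
hours_per_day = 4
price_per_kwh = 0.19  # GBP, assumed

kwh_per_month = watts_saved / 1000 * hours_per_day * 30  # 14.4 kWh
monthly_saving = kwh_per_month * price_per_kwh
print(round(monthly_saving, 2))  # ~2.74 GBP/month from the cap alone
```

The cap alone covers a decent slice of that £5; dropping the big-TV setup presumably accounts for the rest.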

FuturePastNow
May 19, 2014


It will be very funny if Xe-HPG has the same restrictions as Xe-LP

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

gradenko_2000 posted:

https://videocardz.com/newz/iris-xe-intels-first-desktop-discrete-gpu-will-not-work-with-amd-processors


To be clear, it doesn't seem to be a deliberate snub so much as a limitation of the roll-out: if you're only going to ever pair them with Intel CPUs and Intel boards because the cards are only ever meant to be shipped as part of prebuilts by OEMs, then there's no reason to work on enabling broader compatibility.

This just seems like an iGPU with extra steps.

Hasturtium
May 19, 2020

And that year, for his birthday, he got six pink ping pong balls in a little pink backpack.

gasman posted:

I think my dad still has one of the first Voodoo cards, he never gets rid of anything.
Had to connect it via the vga port to another card iirc.

Is it worth anything? I may have to borrow it for "science".

They worked via a passthrough VGA cable at the time - you'd take a small cable that came with the Voodoo, connect that to your 2D card, then connect the monitor to the VGA out on the Voodoo. The passthrough was analog so you'd take a small hit to your 2D quality. These days I'd just connect a VGA cable directly to the Voodoo and switch inputs on the connected monitor, but the late '90s were really different.

If you've got a spare computer with a vanilla PCI slot running XP or older, and wanna do some driver spelunking to run Glide games, have at. 3dfxzone.it still has a bunch of downloads.

Weird Pumpkin
Oct 7, 2007

redreader posted:

So I've had my evga 3080 xc3 gaming black since about 10 November and I just want to talk about it a bit. My other specs are Ryzen 3600 and 16gb ram, and 2 ssd's for games. I can't hear any noise from my card at all. it runs at 40-65 (40 degrees idle right now)

I don't really overclock it, I have MSI afterburner and have a saved OC setting of +139 core +150 memory, and apply it sometimes and sometimes not. I haven't tested higher overclocks, I don't really care too much. Anyway it is good, and I like it, and I can finally play FPS games on my 1440p@144hz monitor. BUT. I can't play anything at 144hz on high settings, really. Here are a couple of figures I get:

Hitman 2, max, 90 fps
COD: blops cold war: 110 or so with ray-tracing off and about 70 with it on (I can't remember, but this is about right), otherwise most settings are max but I think I may have turned one or two down.

On the other hand, sekiro couldn't get over 100FPS at 480p on my 980 card, so this is a massive improvement! I also just plain couldn't play FPS games at 1440p after upgrading my monitor from 1920x1200. But I just want to point out that you can't run everything on max at 1440p even with the 3080. It's a good card! I like it! But I also think ray-tracing is pretty much a scam i.e. not worth buying a card for. If you have a 10 or 20 series and you get decent FPS with it (over 60 in my opinion) then don't upgrade.

Also I just plain cannot notice any FPS increases over about 60 or 70, even with my 144hz monitor. I couldn't notice them before but I also never had a 144hz monitor before. I do try to get my FPS as high as possible for multiplayer FPS's, on the off-chance that an FPS increase I don't notice, will still make me play better. But I really can't tell if you ask me.

Also: g-sync on/off is absolutely not 'night and day'.

I'm writing this in the hopes that it makes some people stop panicking and trying to buy a card, and realise that what they have might be good enough already. But I can say for sure that my 980 was not cutting it with a 1440p screen.

Someone else might have responded to this, but go into your Nvidia settings panel and check your monitor. Make sure it's not secretly limited to 60hz output (and that gsync is turned on)

I had a 120hz monitor for years (back when Nvidia had their weird 3d graphics thing, it was actually pretty cool but gimmicky!) and never realized that I'd had the monitor set to 60 hz forever.

Personally I just stick to 60 FPS 4k for games though these days, running the higher frame rates looks nice but actually getting them to run that fast is such a pita

Inept
Jul 8, 2003

4000 Dollar Suit posted:

at least you're not my brother in law who had a 144hz for almost a year before I was over there one day like "man your windows is running at 60fps"

It's frankly really stupid that Windows doesn't read the monitor's info and automatically adjust the refresh rate, or at least prompt the user to increase it.

fknlo
Jul 6, 2009


Fun Shoe

4000 Dollar Suit posted:

Tried tweaking the clocks and voltage but no dice.

Yeah furmark is what I'm using, also notice that i can drag the window size and the coil whine will actually change pitch depending on how big the window is.

My 1080ti would get really bad whine when the mouse cursor passed over text.

Withnail
Feb 11, 2004

Hasturtium posted:

They worked via a passthrough VGA cable at the time - you'd take a small cable that came with the Voodoo, connect that to your 2D card, then connect the monitor to the VGA out on the Voodoo.

And proceed to poo poo your pants at how good glquake looked because it was your first 3d card... :corsair:

triple sulk
Sep 17, 2014



Is something like $1150-1200 for a 3080 FE (via local sale) really stupid assuming I could sell my 1080ti for $500 or so which looks to be the case via eBay?

There are a lot of size issues with mini-ITX cases and it just seems like the easiest and best looking solution, and it feels like the prices are only going to get a lot worse.

The Grumbles
Jun 5, 2006
My 3070 whines a lot when playing any games that hit the tensor cores/raytracing hard. I think trying to eliminate whine is a dark path to go down because there are so many weird factors that can contribute, so the best thing to do is really just to put your PC case somewhere where it's not right next to your face. The recent trend for ultra-quiet fans, glass panel/mesh cases, and just beefier graphics cards makes it more of a noticeable thing. I'd really just encourage making peace with it: use headphones, crank up your speaker volume, or have a think about where your PC sits relative to your seating position. Coils gonna whine.

Tensokuu
May 21, 2010

Somehow, the boy just isn't very buoyant.
I went to the MicroCenter here in Madison Heights today; got there just as they opened and was about 16th in line for video cards. They got a single 3080 in today — a Zotac — which the first guy in line bought. I ended up getting a Gigabyte Aorus 3070 Master because they had no EVGA and I believe they had sold out of STRIX by the time I got up to the front of the line.

Haven’t plugged it in yet but I’m fairly content. Wish I had gotten an 80 but honestly after spending so much time nabbing myself a PS5 and having spent the last few weeks stalking Falcodrin’s twitch and multiple discords, I’m just glad to be done.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

triple sulk posted:

Is something like $1150-1200 for a 3080 FE (via local sale) really stupid assuming I could sell my 1080ti for $500 or so which looks to be the case via eBay?

There are a lot of size issues with mini-ITX cases and it just seems like the easiest and best looking solution, and it feels like the prices are only going to get a lot worse.

The FE is by far the smallest card, so if you're worried about fitment then it's the way to go. As for price...up to you. You can try to get in on the BB drops as they're still going for $700 there, but obviously no guarantees on that. $1100 isn't a terrible scalper price, though (if they bitch that they could get more on eBay, remind them that after fees they'd need to sell it for $1300 to clear $1100 and risk the brick-in-a-box return).
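The "$1300 to clear $1100" figure works out if fees plus shipping take roughly 13-15% off the top. A rough sketch; the fee rate and shipping cost below are assumptions, not eBay's actual schedule:

```python
# What a sale must gross to clear a target amount, assuming
# (hypothetically) ~12.5% marketplace/payment fees plus ~$15 shipping.
def required_sale_price(target, fee_rate=0.125, shipping=15.0):
    # net = price * (1 - fee_rate) - shipping  ->  solve for price
    return (target + shipping) / (1 - fee_rate)

print(round(required_sale_price(1100)))  # ~1274, close to the $1300 quoted
```

Under those assumed rates a seller nets their $1100 only by listing near $1300, before accounting for return risk.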

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

gradenko_2000 posted:

https://videocardz.com/newz/iris-xe-intels-first-desktop-discrete-gpu-will-not-work-with-amd-processors


To be clear, it doesn't seem to be a deliberate snub so much as a limitation of the roll-out: if you're only going to ever pair them with Intel CPUs and Intel boards because the cards are only ever meant to be shipped as part of prebuilts by OEMs, then there's no reason to work on enabling broader compatibility.

this is something of a bizarre decision since it cuts off HTPC users and people just looking for more video connectors for their multimonitor rig. Retrofits like that are a pretty important part of the market for this type of card; the GeForce 710 and similar are perennially popular cards and they're mostly DIY/retrofits into existing rigs.

like it's basically an integrated GPU on a board... but you also need to buy it as part of a new rig in a specially configured package with custom BIOS/etc? So it's... basically only going to be sold with -F style iGPU-less CPUs I guess? So you can replace the iGPU they disable on those SKUs, with the added bonus of wasting a PCIe slot? Those are some wack product-development decisions.

I've heard the idea advanced that maybe they want to minimize the support headache by doing essentially a "limited release" in specific configurations. Or that maybe it's not really PCIe but actually running DMI or something instead (given that it's derived from an iGPU). Weird.

Paul MaudDib fucked around with this message at 20:09 on Jan 28, 2021

mA
Jul 10, 2001
I am the ugly lover.

triple sulk posted:

Is something like $1150-1200 for a 3080 FE (via local sale) really stupid assuming I could sell my 1080ti for $500 or so which looks to be the case via eBay?

No, it's in line with what most resellers are slanging 3080 FEs for, which is around 1.5x-2x MSRP. I'd recommend trying to sell the card on SA-Mart or r/hardwareswap and save yourself a good chunk in eBay fees.

Hasturtium
May 19, 2020

And that year, for his birthday, he got six pink ping pong balls in a little pink backpack.

Paul MaudDib posted:

this is something of a bizarre decision since it cuts off HTPC and people just looking for more video connectors for their multimonitor rig. Retrofits like that are a pretty important part of the market for this type of card, GeForce 710 and similar are perenially popular cards and they're mostly DIY/retrofits into existing rigs.

like it's basically an integrated GPU on a board... but you also need to buy it as part of a new rig in a specially configured package with custom BIOS/etc? So it's... basically only going to be sold with -F style iGPU-less CPUs I guess? So you can replace the iGPU they disable on those SKUs? Those are some wack decisions for product management.

I've heard the idea advanced that maybe they want to minimize the support headache by doing essentially a "limited release" in specific configurations. Or that maybe it's not really PCIe but actually running DMI or something instead (given that it's derived from an iGPU). Weird.

It sounds like the graphics team is having a rough time adjusting to life after unspoken tight integration. The test rollout hypothesis sounds as good as anything, but not being able to thump an appealing sub-50 watt card into my Sandy Bridge Linux frankenputer is a bummeroo for me.

Sagebrush
Feb 26, 2012

So the EVGA line I'm in has not moved in over a month because I, like a dumbass, assumed back in September that the company would probably manufacture all the products they offer for sale, right? Not just the single most expensive model with the highest profit margin? lmao oops

So since gently caress Newegg and their stupid lotteries and bundles, I think my best chance now is the Best Buy drops I guess. Does anyone have a link to a post on how to get in on those?

flakeloaf
Feb 26, 2003

Still better than android clock

ATR-Stonks reported movement in the EVGA 3070 line just this afternoon

Sagebrush
Feb 26, 2012

Great, but I signed up for a 3080, and on September 17th I only clicked the XC3 models because hey I'm sure I'll get a card within a couple of weeks, a month tops, no need to go overboard with the super gamer poo poo.

As far as I can tell they literally have not shipped a single XC3 3080 since December 21st.
