fishmech

GRINDCORE MEGGIDO posted:

How's the CPU usage for the Vive? Are there threads for handling sensor feedback and such that benefit from more cores?

I'm just waiting to see the min framerate it gets in gaming, particularly in VR (but not many people seem to cover that).

The Vive and other VR headsets really do want as many cores and as much GPU as you can throw at them, though the GPU certainly helps much more. Having more CPU cores does help in getting more consistent framerates, which is vital for the 3D effect not to become unpleasant.
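
To put numbers on why consistency matters: at the Vive's 90 Hz refresh, every frame has roughly an 11 ms budget, and a single frame that overruns it gets dropped or reprojected. A quick back-of-envelope sketch in C (the frame times are made up for illustration):

```c
/* Back-of-envelope: per-frame budget at 90 Hz and what an overrun means.
 * The frame times below are made up for illustration. */
#include <stdio.h>

int main(void) {
    const double budget_ms = 1000.0 / 90.0;  /* ~11.1 ms per frame at 90 Hz */
    const double frame_ms[] = {10.2, 10.8, 10.5, 15.0, 10.4};

    for (int i = 0; i < 5; i++) {
        if (frame_ms[i] > budget_ms)
            printf("frame %d: %.1f ms blows the %.1f ms budget -> dropped/reprojected\n",
                   i, frame_ms[i], budget_ms);
    }
    return 0;
}
```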

fishmech

FuturePastNow posted:

Zen doesn't appear to have many flaws but one that's apparent is that, in order to reach the clock speeds AMD wants, these things are being volted to within an inch of their lives and pushing more power through them makes a lot more heat for a very tiny gain.

I wonder what the binning process really looks like for the initial production runs, considering this.

fishmech

Junior Jr. posted:

I never had to care about AMD's CPU architecture until Ryzen.

If it does turn out to be really good, what are the best AM4 motherboards for it, and is AMD making any budget Ryzens in the future?

Honestly you should wait until it's actually been out for a few months and motherboard manufacturers have had time to work out the inevitable bugs.

fishmech

buglord posted:

How successful is unlocking cores? I've read about people doing this, but I also hear that most of the time the cores are faulty or something that can cause crashing? Is any of that true?

Either way, would be cool to see~

We have no idea, because nobody's gotten their hands on the lower-end chips with locked-off cores yet. And they won't be out for quite some time after the full-core versions are.

fishmech

Would be pretty funny to have your space computer with two huge meter-square wings of radiator material jutting out the back into space.

fishmech

Potato Salad posted:

ISS uses 386. Hubble is on 486. Certification for flight takes a long rear end time.

Also, reminder that the aerospace industry loves Ada.

http://www.ada-auth.org/cpl/lists/CPLbase.html

And it's worth remembering that the laptops and such which they use for the research are just normal modern ThinkPads most of the time. They've got about 50 or so ThinkPads across the station.


FaustianQ posted:

Why not an ARM solution? Or is legacy code holding them back? God NASA needs an actual loving budget.

ARM solution for what? The air processing control and altitude stability controls etc. already work; there's no point in replacing them. Perhaps when another module gets added onto the station they might use some ARM processors in its systems, but those will still need to be able to communicate with the rest of the station.

fishmech

Rastor posted:

My understanding was that the rad hardened processors of choice have usually not been 386 or 486 but rather POWER based architectures, stuff like the RAD6000 and RAD750.

You see those more often on unmanned craft, like space probes or rovers. For instance, all the Mars rovers that are currently active are based on radiation-hardened POWER CPUs.

There's actually a paper from NASA in 1991 going over the pros and cons of whether the Space Station Freedom design (the predecessor project to the ISS, which formed the basis for most of the modules the US would put up for the ISS) should use 386s or 486s: https://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/19910016373.pdf

fishmech

EdEddnEddy posted:

The Space Shuttle ran on Intel 8086 chips lol. They even had to go to eBay to pick up some for the last few space shuttle launches, I heard back in the day.


Also the New Horizons probe uses a modified Playstation 1 CPU.

Nah, by the time eBay was around the space shuttles were being upgraded to 386-based systems, which they then retained for the rest of the time they were in operation.

Also New Horizons is only using a modified PlayStation 1 CPU in the same way that the RAD750 in the latest Mars rover is a modified GameCube CPU - New Horizons uses a modified MIPS R3000 core, just as the PlayStation uses a modification of the R3000 core, but they're entirely different modifications with little relation to each other. Notably, the variant used to create the processor in the New Horizons probe derives from refinements made for embedded use and adds very few instructions, while the PSX CPU had more radical changes.

Similarly, the CPU in the GameCube (and from that the Wii and Wii U) is derived from a modified PowerPC 750 (sold in Macs as the "PowerPC G3" CPU), and so is the RAD750 on the latest Mars rover. But the GameCube version adds a lot of instructions and subtracts a few, and ultimately derives from one revision of the series, while the RAD750 is derived from a different, later revision with very few instruction changes; the majority of effort instead went into radiation hardening and improved processes so that it uses less power.

fishmech

EdEddnEddy posted:

Well it was a while ago, but I remember reading about it when it was news.* I could have sworn it was an Engadget article or something but alas, time has passed, as has the SS.



*Whoever is the editor who allowed no proper capitalization should be shot.

That's covering a different issue: NASA needed 8086s for on-the-ground equipment used to test various components of the shuttles and their rocket boosters between launches and in the stages leading up to launches. After all, they couldn't have used regular consumer or even most military 8086s in the space shuttle, as they wouldn't be properly radiation hardened. But the ground equipment could use any 8086 just fine.

The first shuttle to have the main computers upgraded from 8086 to 80386 was the Atlantis in 1997, and the remaining 4 shuttles had the 386s installed one at a time, roughly one shuttle each year. The intention of that staggered schedule, which took until 2001, was that a shuttle with the old system would be ready to fly at any time in case the 386s turned out to have issues.

fishmech

EdEddnEddy posted:

Was that around the time they went to a more glass-cockpit setup as well? It's crazy some of the early pics of the SS cockpit vs the later stuff.

And as bad/crazy as the SS was for NASA, it did some amazing things and really was impressive even if always dangerous in one form or another.

The one thing I regret is never seeing a Space Shuttle launch in person.

Yeah a major impetus behind moving to the 386 systems was that the extra processing power was needed to drive the "glass cockpit" sort of displays.

fishmech

FaustianQ posted:

My original concern about moving from the old 486/386 processors to something like ARM wasn't about performance but rather how much they could drive down the thermals. My understanding of x86 is that it's really hard to push down into the milliwatt range, especially such old designs.

Interesting that they use POWER designs as well.

That's not really all that useful in these applications if it can't provide the appropriate performance, especially since radiation-hardening procedures can easily result in higher power draw as part of ensuring robustness. For perspective, the RAD750 hardened PowerPC chip draws up to 5 watts of power. Normal embedded versions of the PowerPC 750 it was based on drew around 4.5-4.9 watts while the original CPU drew 7 watts when first launched in 1997 (the RAD750 design variant came out in 2001).

There happen to be a few ARM-based CPUs out there that are radiation hardened right now, but they tend to be pretty slow compared to existing radiation-hardened CPUs. RAD750s installed in functioning spacecraft run at up to 200 MHz at the moment, while most of the ARM Cortex based chips available from companies like Vorago run at speeds like 50 MHz or 75 MHz, with less usable computation power per cycle as well - although those chips do only use 0.25-0.5 watts at full load. (For what it's worth, those ARM chips are comparable in performance to the modified 20 MHz 386s used for main system control in the ISS and space shuttle, but that's not considered suitable for new projects, particularly when image processing is needed.)
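
Running the numbers from the figures above makes the tradeoff clearer - better efficiency per clock on the ARM parts, but much less absolute throughput. A rough sketch (MHz per watt ignores work done per cycle, so if anything it flatters the slower chips):

```c
/* Rough clock-per-watt comparison using the figures quoted above.
 * MHz/W ignores per-cycle work, so it flatters the slower ARM parts. */
#include <stdio.h>

int main(void) {
    const double rad750_mhz = 200.0, rad750_w = 5.0; /* RAD750 figures above */
    const double arm_mhz = 75.0,     arm_w = 0.5;    /* upper-end Vorago-style ARM */

    printf("RAD750: %5.0f MHz/W\n", rad750_mhz / rad750_w); /* 40 MHz/W  */
    printf("ARM:    %5.0f MHz/W\n", arm_mhz / arm_w);       /* 150 MHz/W */
    /* Several times the efficiency, but 75 MHz of absolute throughput still
     * can't stand in for 200 MHz when the mission needs the performance. */
    return 0;
}
```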

NASA and the ESA are currently working on projects with industry to develop radiation-hardened ARM CPUs fast enough to replace existing CPUs in spacecraft/probes/rovers, but they do not expect the projects to produce a workable product until 2020 at the earliest, which means waiting until like 2025 for craft using those CPUs to actually get designed and launched.

fishmech

I love cats posted:

Lisa Su apparently takes gamers for complete idiots. If you up the resolution, that workload falls on the GPU, not the CPU. What the gently caress?

Am I stupid for thinking they have marketed their CPU the wrong way targeting it at gamers when this CPU is clearly better somewhere else? Also, who would buy a 350-500 dollar CPU for 1080p gaming when most people rarely spend more than 700 bux on a desktop, anyway?

They're desperate to portray it as useful for games, because that's somewhere they've been really hosed for years and years. Unfortunately it doesn't really work to fix the problems they had there, although it is still way better than the Bulldozer garbage.

fishmech

FaustianQ posted:

Uh, the 1800X is 8% behind the 6900K stock to stock, so a 12% clockspeed advantage to be 8% slower? When in reality that makes it 25% slower clock for clock? That's just bad, just not as horrifically bad as before.


The truth isn't somewhere in the middle here, it's a low clock-speed, high efficiency processor with Ivy Bridge-E capability in TYOOL2017 (that like Ivy can keep up but has to be pushed beyond reasonable limits). AMD moved from awful to mediocre. This might be acceptable for mobile or APU/NPU use, for performance and even server I don't see it.

What is the word on when mobile Ryzen stuff is coming out anyway? There hasn't been a worthwhile AMD laptop outside of a certain kind of low end device ever, after all.

fishmech

Pawn 17 posted:

All performance aside though, what really matters is sales. If enough people buy Ryzen then it doesn't really matter if Intel processors are better by whatever measure. I hope Ryzen is a huge success in terms of sales and we see an end to Intel thinking it's "normal" to sell $1k+ consumer chips.


I think you fundamentally misunderstand the business if you think the existence of chips that are expensive but not officially labeled "enterprise" rather than "consumer" is really a problem. And also if you think expensive chips are going to go away.

fishmech

Deuce posted:

Go away? No. The Best of the Best will always have a premium. But that premium can shrink when a solid competitor is present.

You know, because business.

Didn't happen the last time AMD was at all competitive; why would it happen now?

fishmech

Woden posted:

Because Intel isn't still doing the same poo poo that led to the 1.45 billion dollar fine?

And that's going to lead to not still selling very expensive high end processors because: _____

fishmech

Competition doesn't lower prices for luxury bullshit, jesus dude. This shouldn't be hard to understand: Intel has consistently sold high-end but not officially "enterprise" CPUs at ridiculous prices since the mid-90s at least. No amount of competition gets rid of that. They've never been worth their price premium - it's usually poo poo like spending $300-$500 extra just to get 1% better performance - but a certain kind of idiot will buy them regardless.

fishmech

Pawn 17 posted:

At least read up on the basics of microeconomics before you postulate something like that. Competition does affect luxury good pricing (competition shifts the supply curve).

And you can think of anything and there will always be some small portion of people who are willing to pay much more for it than others because it provides them with a lot of utility. This is part of the demand curve that Intel is capturing with $1k+ chips and doesn't change the fact that those same chips would be cheaper if there was a similar alternative.

There are plenty of studies and articles out there talking about Intel's monopoly pricing. I believe this study was used in AMD's antitrust lawsuit against Intel where they had to pay $1.25B.

We have already seen competition with Intel multiple times. It never made Intel stop selling their needlessly expensive super-high-end chips at $1000 or more. Remember we are talking about the chips that are only marginally better than the normal top of the line, but carry prices hundreds of dollars higher. They've consistently been on sale since the 90s.

You're somehow not getting that just shouting your economics 101 level understanding doesn't change history, and is not going to tell you about the future in this specific product category.

fishmech

Twerk from Home posted:

Where is AVX2 expected to matter and do any of these reviews look at AVX2 256-bit wide commands?

AVX and AVX2 are primarily intended to improve handling of certain video encoding, database processing, and general server workloads. The areas they're intended to help are similar to the areas the various SSE revisions were intended to help.
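
For a concrete picture of what those wide instructions buy you, here's a minimal C sketch (mine, not from the thread) of a 256-bit AVX2 integer add - eight 32-bit additions in a single instruction. Compile with `gcc -mavx2`:

```c
/* Minimal sketch of a 256-bit AVX2 integer add: eight int32 additions
 * performed by one instruction. Compile with: gcc -mavx2 avx2_add.c */
#include <immintrin.h>
#include <stdio.h>

int main(void) {
    int a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    int b[8] = {10, 20, 30, 40, 50, 60, 70, 80};
    int out[8];

    __m256i va = _mm256_loadu_si256((__m256i *)a); /* load 8 ints at once */
    __m256i vb = _mm256_loadu_si256((__m256i *)b);
    __m256i vc = _mm256_add_epi32(va, vb);         /* 8 adds in one instruction */
    _mm256_storeu_si256((__m256i *)out, vc);

    for (int i = 0; i < 8; i++)
        printf("%d ", out[i]);
    printf("\n");
    return 0;
}
```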

fishmech

FuturePastNow posted:

laptop makers segregate AMD APUs to a ghetto of flimsy plastic laptops with bad screens and poo poo batteries, and Zen probably won't change that

Because in systems where you're willing to start paying extra for graphics, you also want higher CPU performance, which AMD just hasn't been able to deliver in the laptop space for like, gently caress, 15 years now? Unless you were willing to go for a chunky full-desktop-replacement laptop, because the AMD laptop stuff was simply more power-hungry for the performance.

Now that Zen has ok performance and ok power consumption, it might actually become worthwhile for anything besides low end garbage.

fishmech

K8.0 posted:

The steam hardware survey is meaningless at this point when it comes to westerners because it's probably >80% Russian/Chinese/Brazilian DOTA 2 players with computers from 2005.

That doesn't make any sense if you look at the actual results they get.

If we look at a system built out of the most common attribute for each parameter, we get the most common Steam device as:
NVIDIA graphics (59.8%)
DirectX 12 supported (74.89%)
Intel CPU (78.15%)
2.3 GHz to 2.69 GHz CPU (20.41%)
4 real cores (47.74%)
Windows 10 64-bit OS (47.71%)
8 GB RAM (34.22%)
1 GB VRAM (33.35%)
1920x1080 primary display (43.23%)
Hard drive space above 1 TB (30.44%)
OS language English (43.67%)
SSE4.1 support on CPU (88.21%) (SSE4.1 is only available in systems from 2007 or later in Intel-land, starting with the Penryn Core 2 Duo, and Bulldozer or later in AMD-land, so 2011 and later)

That last part basically means that less than 12% of systems surveyed by Steam are older than 10 years, and since there were still computers getting sold or built new with CPUs lacking SSE4.1 up til like 2009 or so on the Intel side and until a few years ago on the AMD side (because the Phenom chips were often still better than Bulldozer), even less of the non-supporting set would be really old.
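
For what it's worth, you can check the same thing the survey reports on your own box: GCC and Clang expose CPUID-based feature tests through __builtin_cpu_supports. A minimal sketch:

```c
/* Minimal sketch: test for SSE4.1 the way software actually would,
 * via GCC/Clang's CPUID-based __builtin_cpu_supports. */
#include <stdio.h>

int main(void) {
    if (__builtin_cpu_supports("sse4.1"))
        printf("SSE4.1 present: Penryn (2007) or later on the Intel side\n");
    else
        printf("No SSE4.1: pre-Penryn Intel or pre-Bulldozer AMD\n");
    return 0;
}
```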

fishmech

Paul MaudDib posted:

Remember that like 70%+ of all Steam systems are either ancient (pre-DX12) or integrated graphics or both lol

Laptops, as everywhere, are totally dominant in percent terms.

Last time I checked the most common resolution on Steam was 1366x768... By a decent bit

According to the February 2017 steam hardware survey:

74.89% of systems support DirectX 12 on the GPU
Adding up the listed video cards, blatantly mobile video cards do not seem to form a majority, but a lot of cards also show up under "other", which makes it hard to tell.
43.23% of primary monitors are 1920x1080 (only 24.09% are 1366x768). Also, for users with multiple monitors the most common configuration is dual 1920x1080 monitors, placed side by side, at 32.31%.

FaustianQ posted:

The whole ridiculous video lengths thing also has to do with taking advantage of YouTube's recommendation algorithm getting changed. Basically longer videos are more likely to get recommended or some poo poo.

It's more complicated than that. Long videos also get bigger penalties in the recommendation system for people not finishing most of the video. So just making a long video will actually hurt you if it's not interesting enough for your viewers to watch most of the way through.

Granted a bunch of YouTubers don't actually understand that last part and so put out overly long videos that hurt them anyway...

fishmech

SwissArmyDruid posted:

Hey guys, know what's a lovely resolution? 1680 x 1050. Dell E207WFP.

Could be worse. You could be using some horrible sub-900 vertical pixel resolution trash like 1366x768 or 1280x800.

I outright refuse to buy any sort of new laptop that doesn't have like at least 1600x900 and really 1920x1080 minimum. It's unfortunate that so many laptops of all sorts of sizes still do the x768 poo poo. Maybe someone can just go burn down the factory that makes that specific resolution cheap so we can move on.

fishmech

gwrtheyrn posted:

Man I got my stuff put together, and I have no idea what I'm doing on overclocking. I haven't touched this stuff in a couple years and nothing looks familiar now

Well none of the new AMD CPUs are any good for overclocking, due to already being sold very close to their potential top speeds. So if you really want to mess around with overclocking, you might want to return them.

fishmech

Quad RAID 0 sounds like you're just begging to lose all your data though.

fishmech

MaxxBot posted:

They're not lazy, it's that desktop CPUs are less of a revenue generator than server and laptop CPUs, so it makes little economic sense to focus on them, combined with the fact that competition from AMD has been basically nonexistent for the past several years until now. From that position it makes sense to milk desktop users for minimal effort because what else are they gonna do?

You're not addressing the actual statement though. It's unlikely any amount of special research effort would have resulted in Intel being able to put out consumer price range CPUs with 12 full cores and 3x the IPC on each of them, even though a naive projection based on the progression from the Pentium 4 to the early Core i series chips might have indicated that would happen.

fishmech

SourKraut posted:


I think what a lot of people actually wish, is that Intel followed the console market approach, where slight improvements continue throughout the console's lifespan but prices continue to go down.

Er, but that's exactly what they've done, and people complained about it: console revisions pretty much only reduce the power used to get the same performance. You normally don't get any improvements in speed. The PS4 Pro and Xbox Scorpio buck the trend by being actually improved speed/performance devices, rather than just being process shrinks and higher chip integration the way the revisions to the PS3/360 or PS2 or PS1 were.

fishmech

wargames posted:

So amd unfucked their stuff?

For some games yes, for others no.

fishmech

Obsurveyor posted:

Why couldn't the Zen microarchitecture be used for a new APU for the next console generation?

It's not being used right now, and it's not being used in the upcoming Xbox "Scorpio" project, which will be to the Xbox One as the PS4 Pro is to the PS4. That means at least 3 or 4 years from now until any Zen architecture is showing up in new consoles, and even more lag time before Zen-optimized console games make it to PC.

There's no guarantee that the next consoles will even tap AMD for the CPU; they could switch to an Intel solution if Intel's willing to offer the right incentives.

fishmech

MaxxBot posted:

How do you know Scorpio won't use Zen?

Because games wouldn't be able to handle the differing layout of cores and general CPU aspects? The whole point of "Scorpio" is to be like the PS4 Pro in that all software will run correctly without any performance penalty, while most software can run a bit faster and some software will be updated to take full advantage of the new power. Current games on the Xbox One rely on not encountering an inter-core communication bottleneck of the type the current Zen CPUs have between cores 0-3 and cores 4-7.

Unless AMD secretly has Zen CPUs with all 8 cores in a single unit, unlike the CPUs they've actually released, which split the 8 cores into two separate 4-core units, I don't see how that's possible to implement.

In a normal PC OS environment, these differences wouldn't matter much - the OS and drivers would smooth over them. But in the console world, where things are programmed against very specific hardware and an OS that does much less, it can cause all sorts of havoc.
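
To make that concrete: on a desktop OS the scheduler is free to migrate threads across the two 4-core groups and eat the latency, while console code effectively bakes core placement in. The closest PC analogue is explicit pinning - a minimal Linux sketch, with core numbers that are illustrative rather than anything AMD documents:

```c
/* Sketch of the kind of core placement a console title effectively bakes in:
 * pin a worker thread to cores 0-3 so its traffic never has to cross to the
 * other four-core group. Core numbering here is illustrative. Build with
 * -pthread on Linux. */
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>
#include <stdio.h>

static void *worker(void *arg) {
    cpu_set_t set;
    CPU_ZERO(&set);
    for (int core = 0; core < 4; core++)
        CPU_SET(core, &set);               /* restrict this thread to cores 0-3 */
    pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
    /* ... latency-sensitive work that assumes cheap core-to-core talk ... */
    return NULL;
}

int main(void) {
    pthread_t t;
    pthread_create(&t, NULL, worker, NULL);
    pthread_join(t, NULL);
    puts("worker ran pinned to one four-core group");
    return 0;
}
```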

Beautiful Ninja posted:

Lisa Su has said, multiple times, that custom Zen SoCs would not be available until 2018. So unless she's been deliberately misleading investors and we get a surprise Zen in Scorpio, it seems rather unlikely.

There's this too, but the immediate issue of software compatibility is more pressing.

fishmech

eames posted:

Somebody please explain why some of you people act like switching from Jaguar to Zen is like switching from PowerPC to x86? :confused:

They're both 8 core x64 architectures. Optimizing for Vega instead of GCN is probably going to be a bigger deal.

Nobody said that, where did you even pull that out of your rear end?

Just being 8-core x64 architectures is hardly good enough to ensure strict 100% compatibility in the console realm. Again, it's fine in PCs because the OS handles far more hardware abstraction; the way consoles do things, you can't go all the way from the Jaguar APUs to a theoretical and non-existent Zen-based APU and expect everything to work correctly at the same speed or better.

fishmech

EdEddnEddy posted:

Well, the Xbox One also is literally Windows 10, so whatever they do to fix Zen to work on desktop Windows 10 can potentially be backported to fix whatever changes Zen would be from Jaguar.

No. Most games on the system do not run in the full Windows-based OS (formerly built from Windows 8, now built from Windows 10) used for the applications.

The Xbox One runs a real-time operating system at the root which contains a hypervisor. This hypervisor then hosts a fairly full-featured version of Windows 10 to handle the applications, the main system menu, and certain game functionality like networking - at its heart it's capable of running every application on the Windows 8/10 app store, although tons of those apps are not actually allowed on the system. There's then another very stripped-down OS that the majority of the games run on, which is built from the Windows codebase but is not really Windows 8 or 10 compatible - it's fairly easy to port games from this environment to a desktop and vice versa though.

fishmech

Paul MaudDib posted:

Interesting, I was just going to ask you to expound on this a little, thanks for doing so.

Is there really no scheduler at all in the game-mode though? It's just 1 core = 1 thread forever? Nobody over-schedules the processor a bit to help deal with bubbles in the pipeline or fires off some async tasks or anything like that?

I never really thought about it but this is interesting stuff. Got any sources to read more on this?

There are schedulers and things like that, but it's not really the same as what goes on in normal Windows 10. It's also not clear if the OS partition used for most games ever got "updated" to "Windows 10" the way that the OS partition used for the system software and now UWP apps did.

This is a short overview of the original architecture from launch: http://wccftech.com/xbox-one-architecture-explained-runs-windows-8-virtually-indistinguishable/

The updates in fall 2015 brought in the Windows 10 upgrade for the system OS, Microsoft Edge to replace Internet Explorer as the browser, and UWP/Windows App Store support, as well as the backwards compatibility program for selected 360 games.

More in-depth info on how all this works is restricted to licensed developers - the "Dev Mode" environment that any Xbox One owner is allowed to use is restricted to UWP applications and games that run on the Windows 10 based system partition instead of the full capabilities of the normal game partition.

fishmech

Lungboy posted:

It's a 2 year old Sony KDL32W706BSU. Some MKVs work but not all. I figured it was an issue with my server but the same file plays fine on my phone app. Original MKV wasn't my encode.

Can you check the codecs in an MKV that works versus one that doesn't? MediaInfo (https://mediaarea.net/en/MediaInfo) is a good open source tool that you can just drop a media file on to have all the codec and bitrate info laid out.

You'll probably find out that what your player doesn't like is something with the audio codec.
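
If you'd rather script the comparison than eyeball it, something like this works - a minimal sketch that just shells out to the MediaInfo CLI, assuming it's installed; the file names are placeholders:

```c
/* Minimal sketch: shell out to the MediaInfo CLI for two files so the
 * codec listings can be compared side by side. Assumes the `mediainfo`
 * command-line tool is installed; file names are placeholders. */
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    const char *files[] = {"works.mkv", "broken.mkv"};
    for (int i = 0; i < 2; i++) {
        char cmd[256];
        snprintf(cmd, sizeof cmd, "mediainfo \"%s\"", files[i]);
        printf("=== %s ===\n", files[i]);
        if (system(cmd) != 0)
            fprintf(stderr, "mediainfo failed for %s\n", files[i]);
    }
    return 0;
}
```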

fishmech

wargames posted:

my question at this point is how well does Zen do with emulation? I want to say not well because of the low clock rate.

Emulation of what? It's going to have more than enough performance for things up to like the PS2 and Wii; even Bulldozer chips could handle that.

It might not be good for the currently very early PS3/360/Wii U emulators being made, but those also barely run well on the highest end Intel chips at the moment.

fishmech

wargames posted:

on my 4690, Citra the 3DS emulator isn't great.

Citra is slow crap at the moment on almost every CPU. It'll get a lot better in another year or so, but at the moment they're just focused on getting games to run correctly even if they're quite slow. Later, they'll move on to getting them running at a playable speed.

When it's in a more usable state it should run fine on your current hardware, and on the current Ryzen line.

Much like the people working on the 360 and PS3 emulators, the team for Citra wants to make sure they get everything done right the first time, rather than hacking things to run fast from the start, which would mean more work later to get everything to run correctly.

fishmech

wargames posted:

Didn't Intel help develop the NVMe standard? If so, I would assume they would have better drivers for it.

That's an excuse for 5 years ago when NVMe drives were first released. It's not an excuse that makes sense now. Even the latest revision in use is nearly 3 years old at this point.

fishmech

Truga posted:

hot take: who gives a poo poo about 200 or 300 megs per second random access to 4k sized files? That's still thousands of files per second, you're not going to notice a difference in any real world scenario ever.


Anyone who'd be buying an NVMe drive for a desktop in the first place? On laptops, sure, it's sometimes the only form of storage available. But if you're buying any of the current systems that use a Ryzen CPU, it's a normal-size desktop that also has SATA ports, where you can buy cheaper drives that move data more slowly over the SATA connection if you didn't care about the speeds.

fishmech

SwissArmyDruid posted:

But solar killed nuclear just the same way it's killing coal

Not at all? Coal's dying because of cheap natural gas, which is even relatively cheap to swap existing coal plants over to.
