Panty Saluter
Jan 17, 2004

Making learning fun!
top is theory, bottom is practice

I prefer this one because puppies :kimchi:

movax
Aug 30, 2008

Panty Saluter posted:

top is theory, bottom is practice

I prefer this one because puppies :kimchi:

That's terrific -- especially puppies :3:

WhyteRyce
Dec 30, 2001

http://www.siliconbeat.com/2015/04/15/intel-said-to-offer-500-million-for-san-diego-based-via-telecom/

gently caress Altera, Intel can buy VIA. Wonder what Cyrix would cost.

Darkpriest667
Feb 2, 2015

I'm sorry I impugned
your cocksmanship.


Probably not much, since they've been defunct for almost 20 years. The last Cyrix processor I owned was a soldered-on BGA 486, I believe. I imagine you could get Cyrix for under a million bucks unless they do something else now. I don't know why Intel or AMD would want to own Cyrix. VIA wouldn't want to own them, and neither would Qualcomm; they don't even possess an x86 license to my knowledge.

evensevenone
May 12, 2001
Glass is a solid.
Amazingly, the Cyrix MediaGX microarchitecture lives on in the AMD Geode GX/LX ultra-low-power processors, not due to be EOLed until Q4 2017. Cyrix was bought by Nat Semi, then sold to VIA, and VIA sold Geode to AMD.

I've seen some Geodes in system-on-module form factors; I'm guessing they go into ATMs, digital signs, that kind of thing.

http://www.amd.com/en-us/products/embedded/processors/lx

PC LOAD LETTER
May 23, 2005
WTF?!
MediaGX is just branding now. They replaced the original Cyrix design with what amounts to an old K7 Athlon core soon after they acquired it.

When they were first released, though, MediaGX was kind of interesting as the first chipset with an integrated IGP + south bridge that had almost-OK performance for pretty cheap. You could get a whole system for around $400-500 less than an Intel-based system. Too bad the CPU wasn't good even for its time, so they had to use a customized version of Windows optimized for that chip to get good performance.

I don't know if you could even boot a regular version of windows on one.

SlayVus
Jul 10, 2009
Grimey Drawer

Panty Saluter posted:

http://www.userbenchmark.com/UserRun/201044

I don't know how much I trust this benchmark. The worst performing component in my computer is my 3TB storage drive and that's just "OK", not terrible. Yet for some reason I have a 35% total score? I don't know how they derived that average.

So I ran this benchmark because in my DBC Task Manager (essentially the Win8 task manager for Win7) my CPU speed shows as 0.80 GHz. Thankfully, though, I was not having problems like the previous user with his 0.80 GHz CPU.

However, my benchmark showed something else funny. My SSD had a read speed of 505 MB/s. My Western Digital Green 3TB drive? 4,692 MB/s read.

It says it has RAM caching enabled on it. I don't have any kind of RAM-caching software installed on my PC. I have a Mushkin Chronos, so I definitely don't have Samsung Magician.
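For what it's worth, the inflated number is easy to reproduce: once a file has been read, the OS page cache serves it straight from RAM, so a naive timed re-read "benchmarks" the cache, not the disk. A rough sketch of the effect (plain Python, not anything userbenchmark actually runs):

```python
import os
import tempfile
import time

def read_speeds_mbps(path, passes=3):
    """Time sequential reads of a file. After the first pass the OS page
    cache usually holds the data, so later passes report speeds far above
    what a spinning disk can physically deliver."""
    size = os.path.getsize(path)
    speeds = []
    for _ in range(passes):
        start = time.perf_counter()
        with open(path, "rb") as f:
            while f.read(1 << 20):  # read in 1 MiB chunks
                pass
        speeds.append(size / (time.perf_counter() - start) / 1e6)
    return speeds

# Demo on a 64 MB scratch file
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(os.urandom(64 * 1024 * 1024))
speeds = read_speeds_mbps(f.name)
os.unlink(f.name)
print([round(s) for s in speeds])  # later passes are typically the fastest
```

A benchmark that wants honest disk numbers has to evict or bypass the cache first, which is exactly the step a RAM-caching layer defeats.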

http://www.userbenchmark.com/UserRun/216509

Lowen SoDium
Jun 5, 2003

Highen Fiber
Clapping Larry

PC LOAD LETTER posted:

MediaGX is just branding now. They replaced the original Cyrix design with what amounts to an old K7 Athlon core soon after they acquired it.

When they were first released, though, MediaGX was kind of interesting as the first chipset with an integrated IGP + south bridge that had almost-OK performance for pretty cheap. You could get a whole system for around $400-500 less than an Intel-based system. Too bad the CPU wasn't good even for its time, so they had to use a customized version of Windows optimized for that chip to get good performance.

I don't know if you could even boot a regular version of windows on one.

I don't know anything about them needing a special version of Windows, nor can I find any info about that. We sold some systems with MediaGX chips at a computer shop I worked at back in the day, and we just used regular Windows 95 (and maybe 98, but I don't remember for sure) on them.

PC LOAD LETTER
May 23, 2005
WTF?!
That was what I remembered about them.

Apparently it's not really correct. There was some bug with installing Win98 on them, and a default Win98 install disc won't work without some shenanigans.

Lowen SoDium
Jun 5, 2003

Highen Fiber
Clapping Larry

PC LOAD LETTER posted:

That was what I remembered about them.

Apparently it's not really correct. There was some bug with installing Win98 on them, and a default Win98 install disc won't work without some shenanigans.

Now that you mention it, I do kind of remember some issue like that, but not the details. Something like needing a driver on the install disc that didn't come on it by default.

loudog999
Apr 30, 2006

Sorry for the dumb question, but realistically, if I'm just using my computer for gaming, how much life does this i7 2600K have in it? I really don't want to upgrade my motherboard + CPU to game, so when it comes to that I will probably just switch to consoles again.

evilweasel
Aug 24, 2002

loudog999 posted:

Sorry for the dumb question, but realistically, if I'm just using my computer for gaming, how much life does this i7 2600K have in it? I really don't want to upgrade my motherboard + CPU to game, so when it comes to that I will probably just switch to consoles again.

More than enough, CPUs don't get that much increase in processing power each generation these days and virtually nothing caps it out.

loudog999
Apr 30, 2006

evilweasel posted:

More than enough, CPUs don't get that much increase in processing power each generation these days and virtually nothing caps it out.

Great, thanks. This stuff is still all Greek to me and you guys are always a bunch of help. Cheers

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva
If it's still stock you can get a ton of mileage out of it via overclocking. I still have mine at 4.6 GHz from day 1 and it runs quiet and cool under a massive gently caress-off HR-02.

the1onewolf
Dec 19, 2013

Architect of all things Timey-Wimey
Just going to put this in here.

What kind of processor do you guys recommend for Autocad?
Intel Core i7-4790K?

Kazinsal
Dec 13, 2011



evilweasel posted:

More than enough, CPUs don't get that much increase in processing power each generation these days and virtually nothing caps it out.

I wonder how soon we're going to start seeing Moore's Law degrade rapidly as the per-generation increase in transistors per square inch approaches "zero, unless you like massive electromigration and quantum tunneling". Is 10nm going to be it? IIRC you start to see gnarly quantum tunneling happening at around that point.
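Just for scale, the density-to-feature-size arithmetic is simple: density scales as 1/feature², so each density doubling shrinks features by √2. A back-of-envelope sketch (the 14 nm starting point and 2-year cadence are illustrative, not anyone's actual roadmap):

```python
def feature_size_after(years, start_nm=14.0, doubling_years=2.0):
    """Project minimum feature size if transistor density doubles every
    `doubling_years` years; density ~ 1/feature**2, so each doubling
    shrinks the feature size by sqrt(2)."""
    doublings = years / doubling_years
    return start_nm / (2 ** (doublings / 2))

for y in (0, 2, 4, 6, 8):
    print(f"+{y} years: ~{feature_size_after(y):.1f} nm")
# Eight years of doublings takes a hypothetical 14 nm process to ~3.5 nm,
# well into the range where tunneling leakage becomes a serious problem.
```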

mayodreams
Jul 4, 2003


Hello darkness,
my old friend

the1onewolf posted:

Just going to put this in here.

What kind of processor do you guys recommend for Autocad?
Intel Core i7-4790K?

A fast CPU doesn't mean much if you have a lovely GPU. While a fast processor is important, make sure you have an appropriate GPU (Quadro or FireGL) and a lot of fast RAM if you plan on doing professional workflows.

canyoneer
Sep 13, 2005


I only have canyoneyes for you

Kazinsal posted:

I wonder how soon we're going to start seeing Moore's Law degrade rapidly as the per-generation increase in transistors per square inch approaches "zero, unless you like massive electromigration and quantum tunneling". Is 10nm going to be it? IIRC you start to see gnarly quantum tunneling happening at around that point.

Moore's original paper had a forecast horizon of 10 years. For the last 50 years, the industry leaders (including Intel) have said that they had line of sight to continue Moore's Law for the next 10 years, but no idea beyond that. All throughout, analysts have been predicting the death of Moore's law.
I haven't heard Intel announce or say anything that indicates they don't have line of sight to Moore's Law over the next 10 years.

I mean, before immersion lithography was developed some people thought that sub 65nm was never going to work. And there are papers that were published 10 years ago saying that EUV probably wouldn't be able to work in a high volume setting so :shrug:

evilweasel
Aug 24, 2002

canyoneer posted:

Moore's original paper had a forecast horizon of 10 years. For the last 50 years, the industry leaders (including Intel) have said that they had line of sight to continue Moore's Law for the next 10 years, but no idea beyond that. All throughout, analysts have been predicting the death of Moore's law.
I haven't heard Intel announce or say anything that indicates they don't have line of sight to Moore's Law over the next 10 years.

I mean, before immersion lithography was developed some people thought that sub 65nm was never going to work. And there are papers that were published 10 years ago saying that EUV probably wouldn't be able to work in a high volume setting so :shrug:

The coming problems are increasingly basic-physics barriers, rather than just the difficulty of building accurately at that scale.

pmchem
Jan 22, 2010


For anyone who cares about upcoming tech and some science/engineering behind it, I highly recommend this: http://www.realworldtech.com/intel-10nm-qwfet/

the1onewolf
Dec 19, 2013

Architect of all things Timey-Wimey

mayodreams posted:

A fast CPU doesn't mean much if you have a lovely GPU. While a fast processor is important, make sure you have an appropriate GPU (Quadro or FireGL) and a lot of fast RAM if you plan on doing professional workflows.

Is having one of the "designer" card like the quadro or firegl important or will something like the GTX 970 be okay?

SlayVus
Jul 10, 2009
Grimey Drawer

the1onewolf posted:

Is having one of the "designer" card like the quadro or firegl important or will something like the GTX 970 be okay?

Quadro and FireGL cards have drivers specially designed for those programs. That's what gives them the advantage over a normal desktop GPU.

A K2200 Quadro is $430. http://www.newegg.com/Product/Product.aspx?Item=N82E16814133559

mayodreams
Jul 4, 2003


Hello darkness,
my old friend
You don't have to go crazy on a Quadro/FireGL card, but like SlayVus said, the drivers are validated for professional applications and will work better than consumer cards. Professional cards focus on accuracy and stability rather than just speed.

Mr SoupTeeth
Jan 16, 2015

mayodreams posted:

You don't have to go crazy on a Quadro/FireGL card, but like SlayVus said, the drivers are validated for professional applications and will work better than consumer cards. Professional cards focus on accuracy and stability rather than just speed.

Indeed, if you're doing nothing but 2D raster and no compute, the only advantage enterprise cards have is support. AutoCAD's recommended specs are a bit of a joke: sure guys, you totally need that slow 6-core Xeon for a single-threaded application that only benefits from pure CPU/memory clock speed (a baseline amount of CPU cache is necessary, of course; anything above an i3 is fine). A consumer card with generous amounts of VRAM and a boatload of system RAM will usually suffice (24-32 GB appears to be the point of diminishing returns for my office's workload; we work entirely in Civil 3D).

I built my personal PC as a proof of concept for the office with a 4690K/GTX 970/DDR3-2400 setup; naturally, when overclocked it dominates our HP Z420s for nearly a quarter of the cost. Long-term reliability under our workloads is an unknown, but our HPs have quite frankly been huge overpriced pieces of poo poo that were clearly rushed.
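The single-threaded point is just Amdahl's law in action: if almost none of the work can be parallelized, extra cores buy nearly nothing while clock speed pays off directly. A toy illustration (the 5% parallel fraction is a made-up stand-in for AutoCAD; I don't know its real number):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup from `cores` workers when only
    `parallel_fraction` of the runtime can actually run in parallel."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Mostly serial workload: six cores are barely better than one...
print(round(amdahl_speedup(0.05, 6), 2))   # ~1.04x
# ...while a 20% clock bump speeds up the serial part by 20% directly.
```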

Mr SoupTeeth fucked around with this message at 17:36 on Apr 23, 2015

Proud Christian Mom
Dec 20, 2006
READING COMPREHENSION IS HARD
AutoCAD barely supports multiple cores anyway

Mr SoupTeeth
Jan 16, 2015

go3 posted:

AutoCAD barely supports multiple cores anyway

The Xeon recommendation made more sense when consumer chipsets couldn't utilize as much RAM but that's no longer an issue unless you need a comical amount.

I've wondered how well a clocked-to-hell G3258 would chew through AutoCAD, since more threads and cores don't do poo poo; the low cache is probably a dealbreaker, but it'd be interesting nonetheless.

Mr SoupTeeth fucked around with this message at 18:12 on Apr 23, 2015

mayodreams
Jul 4, 2003


Hello darkness,
my old friend

Mr SoupTeeth posted:

I built my personal PC as proof of concept for the office with a 4690k/GTX 970/DDR3 2400 setup, naturally when overclocked it dominates our HP Z420's for nearly a 1/4 of the cost. Long term reliability under our workloads is an unknown, but our HP's have quite frankly been huge overpriced pieces of poo poo that were clearly rushed.

I worked with a small 2D/3D graphics company as they started growing, and I had to work very hard to break them from overclocking their servers and workstations. Overclocking is fine for gaming, where the workload comes in short to medium bursts, but running a workstation into the ground by overclocking it for real work is a very bad idea.

What you are paying for in an OEM workstation-class machine is reliability and fast turnaround on hardware support. Woe is you the first time one of your overclocked gamin' rigs breaks during a production issue and the lovely motherboard manufacturer won't do an advance RMA.

Mr SoupTeeth
Jan 16, 2015

mayodreams posted:

I worked with a small 2D/3D graphics company as they started growing, and I had to work very hard to break them from overclocking their servers and workstations. Overclocking is fine for gaming, where the workload comes in short to medium bursts, but running a workstation into the ground by overclocking it for real work is a very bad idea.

What you are paying for in an OEM workstation-class machine is reliability and fast turnaround on hardware support. Woe is you the first time one of your overclocked gamin' rigs breaks during a production issue and the lovely motherboard manufacturer won't do an advance RMA.

To be fair, it doesn't really require obscene voltage to smash the Xeons in IPC, and just having the unlocked multiplier and support for memory frequencies above DDR3-1866 gives it a pretty big advantage. A properly overbuilt setup like my personal machine would have to try real hard to fail as often as the cut-rate HP garbage we're using, and we don't exactly push our machines hard. Support is definitely the biggest reason to go with a workstation, as long as it isn't HP (this goddamn company sent out the Z420s with recovery software that doesn't work, because they never tested it and learned of the AHCI legacy/UEFI bullshit). Their hardware turnaround time has been relatively decent, when they get around to shipping us the right goddamn part.

Prescription Combs
Apr 20, 2005

mayodreams posted:

I worked with a small 2D/3D graphics company as they started growing, and I had to work very hard to break them from overclocking their servers and workstations. Overclocking is fine for gaming, where the workload comes in short to medium bursts, but running a workstation into the ground by overclocking it for real work is a very bad idea.

What you are paying for in an OEM workstation-class machine is reliability and fast turnaround on hardware support. Woe is you the first time one of your overclocked gamin' rigs breaks during a production issue and the lovely motherboard manufacturer won't do an advance RMA.

At that point it's still probably cheaper to have various spares on hand if you're competent at replacing hardware in a small business scenario. Big corporation? gently caress it, use what they give you and make the IT dept deal with broken machines. :v:

mayodreams
Jul 4, 2003


Hello darkness,
my old friend

Prescription Combs posted:

At that point it's still probably cheaper to have various spares on hand if you're competent at replacing hardware in a small business scenario.

This. I am just trying to throw some caution from experience out there. I only build my own machines, both desktops and servers, but you should really never roll your own in a business environment unless everyone understands the risks involved. That 25% performance increase sounds great until something bad happens at a deadline and poo poo gets real. If you buy good hardware you can have a good experience with white-box parts, but the stakeholders need to know what it will take to resolve issues.

I built out some workstation-class machines for my client using the EVGA dual-socket 1366 boards and Xeon processors, and they were rock solid and ran forever. They had far fewer issues than the boutique "graphics" OEM that did stupid poo poo like liquid cooling and overclocking. Not to mention my builds were about 2/3 the cost of the boutique's, but still easily $3k each. Sure it can be done, but I was charging $50/hr to build, test, and deploy the machines, so that adds up too.

I don't want to turn this into an I.T. thread, but there is a huge difference in use cases and approaches between building a gamin' rig at home and depending on workstations to make money, particularly when deadlines and idle engineers/artists are involved.

Number19
May 14, 2003

HOCKEY OWNS
FUCK YEAH


mayodreams posted:

I worked with a small 2D/3D graphics company as they started growing, and I had to work very hard to break them from overclocking their servers and workstations. Overclocking is fine for gaming, where the workload comes in short to medium bursts, but running a workstation into the ground by overclocking it for real work is a very bad idea.

What you are paying for in an OEM workstation-class machine is reliability and fast turnaround on hardware support. Woe is you the first time one of your overclocked gamin' rigs breaks during a production issue and the lovely motherboard manufacturer won't do an advance RMA.

To reinforce this: I work at a game dev studio. We used to use custom built gaming PCs for dev work but the reliability problem became a major liability. I got us switched over to actual OEM workstations and while we lost a bit of flexibility in terms of system configs and cost we gained so much more in terms of reliability and tech support. The productivity gain from not having custom PCs break all the time by far outweighs the extra system cost. It's not even all that close actually.

For a Serious Work Computer it's almost always worth it to buy a real workstation class computer.

Mr SoupTeeth
Jan 16, 2015

Number19 posted:

To reinforce this: I work at a game dev studio. We used to use custom built gaming PCs for dev work but the reliability problem became a major liability. I got us switched over to actual OEM workstations and while we lost a bit of flexibility in terms of system configs and cost we gained so much more in terms of reliability and tech support. The productivity gain from not having custom PCs break all the time by far outweighs the extra system cost. It's not even all that close actually.

For a Serious Work Computer it's almost always worth it to buy a real workstation class computer.

I don't disagree, but the massive recent decline in certain OEMs' quality and support makes me hesitant to recommend OEM workstations over rolling your own in a small-business environment (consumer or enterprise hardware) if the expertise is available to build and maintain them. The difference in component and configuration quality between our HPs from 4 years ago and now is apparent, and the perpetual failures and baffling configurations are a testament to this. When spending thousands on a machine it's nice to know you're getting the best components for the money, rather than the knowledge that if they can save a quarter of a cent by cutting corners they will. Our biggest liabilities have been our newest workstations. Conversely, when we had a local boutique make custom workstations years ago, they were even bigger liabilities.

Number19
May 14, 2003

HOCKEY OWNS
FUCK YEAH


All my recent Z-Series have been bulletproof. Even the older ones I kept around have been going 24/7 for a long time and I've only had one keel over.

Sure I've had a hard drive or two fail but that's well within expectations.

We're a pretty small shop and it was still worth it.

Proud Christian Mom
Dec 20, 2006
READING COMPREHENSION IS HARD

Mr SoupTeeth posted:

I don't disagree, but the massive recent decline in certain OEMs' quality and support makes me hesitant to recommend OEM workstations over rolling your own in a small-business environment (consumer or enterprise hardware) if the expertise is available to build and maintain them. The difference in component and configuration quality between our HPs from 4 years ago and now is apparent, and the perpetual failures and baffling configurations are a testament to this. When spending thousands on a machine it's nice to know you're getting the best components for the money, rather than the knowledge that if they can save a quarter of a cent by cutting corners they will. Our biggest liabilities have been our newest workstations. Conversely, when we had a local boutique make custom workstations years ago, they were even bigger liabilities.

HP is garbage hth

~Coxy
Dec 9, 2003

R.I.P. Inter-OS Sass - b.2000AD d.2003AD
We have HP Z8XX towers here and the driver support is an absolute joke. The users have no end of random intermittent problems with them.

Not sure how to apportion blame between the SOE and HP, but it's probably at least a little on each side.

computer parts
Nov 18, 2010

PLEASE CLAP

Kazinsal posted:

I wonder how soon we're going to start seeing Moore's Law degrade rapidly as the per-generation increase in transistors per square inch approaches "zero, unless you like massive electromigration and quantum tunneling". Is 10nm going to be it? IIRC you start to see gnarly quantum tunneling happening at around that point.

The "Xnm" term you see isn't actually accurate, it's a marketing term.

atomicthumbs
Dec 26, 2010


We're in the business of extending man's senses.

evilweasel posted:

More than enough, CPUs don't get that much increase in processing power each generation these days and virtually nothing caps it out.

Except Dwarf Fortress :unsmigghh:

I bought a 4790K and now I get 150 FPS!

calusari
Apr 18, 2013

It's mechanical. Seems to come at regular intervals.
some leaked skylake benches

Lowen SoDium
Jun 5, 2003

Highen Fiber
Clapping Larry
I thought there wasn't going to be a Skylake K chip. I thought it was going to be Skylake-S?

edit: I found an article that answers my question and lists the specs of the different chips expected at launch.
http://wccftech.com/intels-6th-generation-skylake-s-processor-lineup-leaked-core-i7-6700k-leads-pack-10-skus-detailed-samples-spotted/

Lowen SoDium fucked around with this message at 19:34 on Apr 25, 2015

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

Wccftech is the Daily Mail of tech news.
