Shirkelton
Apr 6, 2009

I'm not loyal to anything, General... except the dream.

Laserface posted:

2560x1440 is 'literally' double the amount of pixels as 1280x720.

1080P is around 55% more pixels, roughly.

You do know the p in 1080p doesn't stand for pixel, right?

quote:

The main difference between 720p and 1080p lies in the number of pixels that make up a 720p image and 1080p image. For 720p the number of pixels that make up the image is about 1 million (equivalent to 1 megapixel in a digital still camera) and about 2 million pixels for 1080p. This means that a 1080p image has the potential to display a lot more detail than a 720p image.

http://hometheater.about.com/od/hometheatervideobasics/qt/720p-Vs-1080p.htm


Midee
Jun 22, 2000

But I thought it was just one pixel stretched across the screen 1080 times. :ohdear:

DoombatINC
Apr 20, 2003

Here's the thing, I'm a feminist.





Laserface posted:

2560x1440 is 'literally' double the amount of pixels as 1280x720.

1080P is around 55% more pixels, roughly.



Do PC players actually play computer games or do they just endlessly loving masturbate over pixel counts and settings? God drat I am so glad I grew out of that poo poo.

2560x1440 is quadruple the amount of pixels as 1280x720.
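
For the record, here's the arithmetic in a quick Python sketch (nothing assumed beyond the standard resolutions):

code:
# Pixel counts for the resolutions being argued over, relative to 720p.
resolutions = {
    "720p  (1280x720)":  (1280, 720),
    "1080p (1920x1080)": (1920, 1080),
    "1440p (2560x1440)": (2560, 1440),
}

base = 1280 * 720
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>9,} pixels  ({pixels / base:.2f}x 720p)")

# 720p  (1280x720):    921,600 pixels  (1.00x 720p)
# 1080p (1920x1080): 2,073,600 pixels  (2.25x 720p)
# 1440p (2560x1440): 3,686,400 pixels  (4.00x 720p)

So 1440p really is 4x the pixels of 720p, and 1080p is 2.25x (125% more, not 55%).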

Laserface
Dec 24, 2004

I am bad at math.

my second point still stands. gently caress Autistic cunts who actually give a poo poo about pixel counts more than "is the game fun to play?"

(USER WAS PUT ON PROBATION FOR THIS POST)

Shirkelton
Apr 6, 2009

I'm not loyal to anything, General... except the dream.
Not really. It's an aspect of a game just like any other.

Moongrave
Jun 19, 2004

Finally Living Rent Free

Laserface posted:

I am bad at math.

my second point still stands. gently caress Autistic cunts who actually give a poo poo about pixel counts more than "is the game fun to play?"

When one thing has more of a thing than another thing, and that thing is something good to have more of, someone wanting that makes them autistic.

Got it.

MJBuddy
Sep 22, 2008

Now I do not know whether I was then a head coach dreaming I was a Saints fan, or whether I am now a Saints fan, dreaming I am a head coach.
E: tired and reading badly. I'll give this a shot in the morning.

A shrubbery!
Jan 16, 2009
I LOOK DOWN ON MY REAL LIFE FRIENDS BECAUSE OF THEIR VIDEO GAME PURCHASING DECISIONS.

I'M THAT MUCH OF AN INSUFFERABLE SPERGLORD

Laserface posted:

I am bad at math.

my second point still stands. gently caress Autistic cunts who actually give a poo poo about pixel counts more than "is the game fun to play?"

Literally nobody is saying this, though. Yes, if a game is fun to play, the graphics don't matter. Look at Minecraft.
What people are saying, though, is that if the same game is released on 2 systems, and on one system it plays at a lower resolution with fewer graphical effects, and at a less immersive frame-rate, then the same fun-to-play game is inherently more enjoyable with nicer graphics and a smoother display. Regardless of whether you can tell the difference between 720p/1080p and 30/60fps just by looking, it's a natural, subconscious thing that your eyes do for you.
If anything, it's more 'Autistic' to say "I don't care about pixel counts, therefore it is not a problem for anybody."

A shrubbery! fucked around with this message at 09:28 on Mar 3, 2014

qnqnx
Nov 14, 2010

Laserface posted:

I am bad at math.

my second point still stands. gently caress Autistic cunts who actually give a poo poo about pixel counts more than "is the game fun to play?"
Uh yeah, let's all go back to Ataris, since consoles only went up in numbers and only autistic cunts care about better numbers, right?
To pull out the fun excuse, you should actually have fun games first, by the way.


A shrubbery! posted:

Literally nobody is saying this, though. Yes, if a game is fun to play, the graphics don't matter. Look at Minecraft.
What people are saying, though, is that if the same game is released on 2 systems, and on one system it plays at a lower resolution with fewer graphical effects, and at a less immersive frame-rate, then the same fun-to-play game is inherently more enjoyable with nicer graphics and a smoother display. Regardless of whether you can tell the difference between 720p/1080p and 30/60fps just by looking, it's a natural, subconscious thing that your eyes do for you.
If anything, it's more 'Autistic' to say "I don't care about pixel counts, therefore it is not a problem for anybody."
Also this, there's a point when unintentionally bad graphics are just distracting.

DancingShade
Jul 26, 2007

by Fluffdaddy

Laserface posted:

2560x1440 is 'literally' double the amount of pixels as 1280x720.

1080P is around 55% more pixels, roughly.



Do PC players actually play computer games or do they just endlessly loving masturbate over pixel counts and settings? God drat I am so glad I grew out of that poo poo.

Take a guess. If you don't appreciate high resolution gaming with all the bells and whistles turned on then I guess you can happily game away on intel graphics.

Most games should still be quite playable on low settings at minimum resolution. Certainly 720p. I know I can get UE3 games running at full 1080p on my intel graphics thinkpad if I turn various graphical settings down. ~30 FPS too. If you're all about the gameplay that shouldn't be an issue? Of course I think it looks like rear end so I stick to gaming on systems where everything is set to ultra, all the time.

To put this into a console gaming perspective I used to play Modern Warfare 2 on a laptop with a 1366x768 LCD. It was very playable and looked pretty decent. However the lack of resolution made ascertaining targets at range a problem (since most hardcore mode players used the perks that hid the name above their head, so identifying friend from foe was a big deal). When I moved my install to a desktop with a proper screen I just couldn't go back to low resolution gaming. I can't comprehend why anyone would prefer it unless they simply don't know what they're missing.

Captain Matchbox
Sep 22, 2008

BOP THE STOATS
I played saints row 4 on my PC before upgrading and sat a metre or more away from a 24" monitor. Flicking from 720p up to 1080p messed with my frame rate, but there was an extremely noticeable difference in graphics quality. This was also just before I got glasses so my vision was pretty terrible at the time too. It might be different when someone is playing on a 50" TV from a couch but I'd say there'd at least be a slightly noticeable difference.

Laserface posted:

I am bad at math.

my second point still stands. gently caress Autistic cunts who actually give a poo poo about pixel counts more than "is the game fun to play?"

Good work on the gendered insult too, you'll fit right in to the xbox live ecosystem.

Great Joe
Aug 13, 2008

Laserface posted:

2560x1440 is 'literally' double the amount of pixels as 1280x720.

1080P is around 55% more pixels, roughly.



Do PC players actually play computer games or do they just endlessly loving masturbate over pixel counts and settings? God drat I am so glad I grew out of that poo poo.

So, I take it you'll be playing Titanfall on the 360 instead of buying an Xbone?

Aphrodite
Jun 27, 2006

Wheany posted:

Video game graphics are generally sharper/have higher contrast edges than movies. Movies are always "perfectly antialiased", so differences in resolution are not as obvious. Telephone lines never "shimmer" in movies.

Plus movies are shot at way above 1080p and scaled down.
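
A minimal sketch of the point Wheany and Aphrodite are making, assuming nothing beyond NumPy: averaging blocks of a higher-resolution image down to the target size is exactly supersampling, which is why downscaled film never has hard, shimmering edges.

code:
# A hard slanted edge rendered at 4x the target resolution, then box-filtered
# down to the target size -- the downscale itself does the antialiasing.
import numpy as np

factor = 4                          # capture/render at 4x in each dimension
size = 64 * factor
y, x = np.mgrid[0:size, 0:size]
hi_res = (x > 1.5 * y).astype(float)  # only 0.0 / 1.0: a hard, jagged edge

# Average each factor x factor block into one output pixel (box downscale).
lo_res = hi_res.reshape(size // factor, factor, size // factor, factor).mean(axis=(1, 3))

print(np.unique(hi_res))      # [0. 1.]
print(np.unique(lo_res)[:6])  # fractional grey levels -- the edge is now smooth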

Wiggly Wayne DDS
Sep 11, 2010



Aphrodite posted:

Plus movies are shot at way above 1080p and scaled down.
Not to mention the amount of time they have to process a frame is longer than how long that frame is displayed.

Tiny Timbs
Sep 6, 2008

Laserface posted:

gently caress Autistic cunts

Seriously? Are you letting an internet fight get to you this much?

eyebeem
Jul 18, 2013

by R. Guyovich
<- Bone Owner

Bought the Bone because I had the 360 last time, I prefer the controller to the DS4, and $100 is inconsequential. Figured there would be parity of sorts between the two. I'm older, I don't pore over the numbers like I did when I was a teenager reading about the upcoming 32X add-on for my Genesis.

1080p is better than 720p
60fps is better than 30fps
1080p/60fps is way better than 720p/30fps

$400 is better than $500

PS+ is way better than Gold

I'm not going to sperg about it, but the PS4 is the better gaming console this round. The Bone is fine, I'm certainly happy with it and I enjoyed the hell out of the Titanfall beta (and have it pre-ordered), but I would have been just as happy playing it on the 360 I gave to my son. Anyone saying "Graphics don't matter!" when we are mostly talking about games on both consoles is defending their purchase. MS dropped the ball this time, and that's fine! It happens! I didn't buy a PS3 until 2 years ago because Sony dropped the ball with it in the beginning. What's truly sad is that all of the extra slop that was added to the Bone is basically useless to me. We'll use the boxee box or chromecast for netflix/amazonprime/etc. The FiOS app isn't very good, so we don't use it. We've ignored optical media for movies by now, so the BD is useless.

I should have bought a PS4 and got used to the DS4, and I will. The Xbox One is inferior as a gaming platform, and I don't really see an argument against that. If you want an "all in one" and the features of the xbox appeal to you, it's really a great system. Personally, I already had components that do everything the Bone does, but better. If it played local content like my boxee box, I'd probably feel differently.

dont be mean to me
May 2, 2007

I'm interplanetary, bitch
Let's go to Mars


Aphrodite posted:

Plus movies are shot at way above 1080p and scaled down.

Yup.

720p and 1080i are pretty much reserved for live TV (or cheap serial TV). You might occasionally see the 480s on public-access or local TV that had to be dragged into DTV kicking and screaming.

1080p cameras were common back when 1080p was still kind of a big deal, but it was also recording from REAL LIFE, which has about twenty-five thousand times* more image density than a 'retina display', which means the initial transfer into the camera is going to be pretty graceful almost regardless of its recording ability, and it'll be left to post and broadcast/distribution not to balls it up.

Now they have 4K and 8K** cameras and production-quality film is at that level or better.

Whereas for games, barring supersampling, the target resolution is the best you'll ever have, so you pretty much have to baby it for best effect.

*Very approximately. Photons don't work like video signals any more than retinas do. However, a 'retina display' pixel is around 80,000 nanometers wide, while visible light has a wavelength of around 400 to 700 nanometers, and this can be treated as its minimum recording width because of the diffraction limit (the Rayleigh criterion). Microprocessor foundries do some pretty fancy poo poo to etch silicon, but for visual recording and playback, when you have more data than physics will probably ever let you use you can let a bit of it go.
**This is about the limit of 20/20 vision, assuming an ideal image and a comfortable viewing distance - less for typical quality and longer distance. Roughly double each dimension for panoramic or full-field displays.
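
Putting rough numbers on those footnotes (a sketch only; the 80,000 nm pixel width and the viewing angles are the post's own ballpark figures, and 550 nm is just the middle of the visible range):

code:
# Footnote *: how much denser "real life" is than a 'retina display' pixel grid.
pixel_nm = 80_000        # ballpark width of a 'retina display' pixel
wavelength_nm = 550      # middle of the 400-700 nm visible range

linear = pixel_nm / wavelength_nm
print(f"~{linear:.0f}x per dimension, ~{linear**2:,.0f}x by area")
# ~145x per dimension, ~21,157x by area -- the "twenty-five thousand times, very approximately"

# Footnote **: 20/20 vision resolves roughly 1 arcminute. A screen spanning
# ~60 degrees of the visual field is 60 * 60 = 3600 arcminutes across (4K
# territory); double the field of view and you're in 8K territory.
for fov_deg in (60, 120):
    print(f"{fov_deg} deg field of view -> ~{fov_deg * 60} resolvable pixels across")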

Explain How!
Dec 14, 2013
There really are two fantastic controllers this time around. The DS4 is amazing. I can't believe people lived with the PS controller setup for almost 20 years, not even just the DS3.

Wiggly Wayne DDS
Sep 11, 2010



eyebeem posted:

<- Bone Owner

Bought the Bone because I had the 360 last time, I prefer the controller to the DS4, and $100 is inconsequential. Figured there would be parity of sorts between the two. I'm older, I don't pore over the numbers like I did when I was a teenager reading about the upcoming 32X add-on for my Genesis.

1080p is better than 720p
60fps is better than 30fps
1080p/60fps is way better than 720p/30fps

$400 is better than $500

PS+ is way better than Gold

I'm not going to sperg about it, but the PS4 is the better gaming console this round. The Bone is fine, I'm certainly happy with it and I enjoyed the hell out of the Titanfall beta (and have it pre-ordered), but I would have been just as happy playing it on the 360 I gave to my son. Anyone saying "Graphics don't matter!" when we are mostly talking about games on both consoles is defending their purchase. MS dropped the ball this time, and that's fine! It happens! I didn't buy a PS3 until 2 years ago because Sony dropped the ball with it in the beginning. What's truly sad is that all of the extra slop that was added to the Bone is basically useless to me. We'll use the boxee box or chromecast for netflix/amazonprime/etc. The FiOS app isn't very good, so we don't use it. We've ignored optical media for movies by now, so the BD is useless.

I should have bought a PS4 and got used to the DS4, and I will. The Xbox One is inferior as a gaming platform, and I don't really see an argument against that. If you want an "all in one" and the features of the xbox appeal to you, it's really a great system. Personally, I already had components that do everything the Bone does, but better. If it played local content like my boxee box, I'd probably feel differently.
Yeah there are going to be people who walk into this situation. For what it's worth, the other thread is mainly mocking the ones who have gone in after multiple warnings, and making sure everyone else is sufficiently informed. The explanations from people defending their purchases combined with the PR problems have kept the other threads alive since before E3. The reasons you've mentioned more or less showcase why the consensus is that sales will flatline. There are a fair number of supporters of the 360 controller who say the PS4 controller is equal or better, and that the Xbone one is a major step back.

For people really used to that asymmetric layout I'm sure a third-party version will appear at some point.
e: Noticed this was the wrong thread, changed context.

Wiggly Wayne DDS fucked around with this message at 16:09 on Mar 3, 2014

Stux
Nov 17, 2006
Probation
Can't post for 3 days!

Explain How! posted:

There really are two fantastic controllers this time around. The DS4 is amazing. I can't believe people lived with the PS controller setup for almost 20 years, not even just the DS3.

Honestly, I was a 360 owner last gen and loved the pad, but I got a ps3 this xmas for exclusives and the DS3 really isn't that bad at all. The 360 pad is definitely better, but it's not like the DS3 is some unusable abomination, it's just not as good.

Laserface
Dec 24, 2004

Captain Matchbox posted:

Good work on the gendered insult too, you'll fit right in to the xbox live ecosystem.

Words hurt my feelings :cry:

YourAverageJoe posted:

So, I take it you'll be playing Titanfall on the 360 instead of buying an Xbone?

I won't be playing that game on either platform, at least not until there are Australian servers. I was bored of the beta after a day.

DancingShade posted:

Take a guess. If you don't appreciate high resolution gaming with all the bells and whistles turned on then I guess you can happily game away on intel graphics.

Most games should still be quite playable on low settings at minimum resolution. Certainly 720p. I know I can get UE3 games running at full 1080p on my intel graphics thinkpad if I turn various graphical settings down. ~30 FPS too. If you're all about the gameplay that shouldn't be an issue? Of course I think it looks like rear end so I stick to gaming on systems where everything is set to ultra, all the time.

To put this into a console gaming perspective I used to play Modern Warfare 2 on a laptop with a 1366x768 LCD. It was very playable and looked pretty decent. However the lack of resolution made ascertaining targets at range a problem (since most hardcore mode players used the perks that hid the name above their head, so identifying friend from foe was a big deal). When I moved my install to a desktop with a proper screen I just couldn't go back to low resolution gaming. I can't comprehend why anyone would prefer it unless they simply don't know what they're missing.

wow, man. Really? Ultra? All the time? I am impressed. Allow me to bow and heap praise on you, for you truly are the hardest core of gamers. :rolleyes:

Why do people who don't own an Xbone, nor intend to, read this thread? Why do you post here? Nothing you are providing is new information. I would like to discuss news with other people who want to discuss news, rather than circle back around to old discussions that have been done to death a million loving times for the sake of a select few who are here purely to stir the pot.

Shirkelton
Apr 6, 2009

I'm not loyal to anything, General... except the dream.

Laserface posted:

Why do people who don't own an Xbone, nor intend to, read this thread? Why do you post here? Nothing you are providing is new information. I would like to discuss news with other people who want to discuss news, rather than circle back around to old discussions that have been done to death a million loving times for the sake of a select few who are here purely to stir the pot.

Do words hurt your feelings too?

Wiggly Wayne DDS
Sep 11, 2010



Laserface posted:

Words hurt my feelings :cry:


I won't be playing that game on either platform, at least not until there are Australian servers. I was bored of the beta after a day.


wow, man. Really? Ultra? All the time? I am impressed. Allow me to bow and heap praise on you, for you truly are the hardest core of gamers. :rolleyes:

Why do people who don't own an Xbone, nor intend to, read this thread? Why do you post here? Nothing you are providing is new information. I would like to discuss news with other people who want to discuss news, rather than circle back around to old discussions that have been done to death a million loving times for the sake of a select few who are here purely to stir the pot.
Look, buddy, I don't know why you're so firmly attached to making personal attacks. News is rarely posted here because posters like yourself get upset. Due to that, posters use the other thread, where people aren't probated for "quasi concern trolling" when having a friendly discussion. We can't really explain problems when you'd just mash report over derails in a general discussion thread.

Wheany
Mar 17, 2006

Spinyahahahahahahahahahahahaha!

Doctor Rope

Laserface posted:

Why do people who don't own an Xbone, nor intend to, read this thread?

Laserface posted:

I would like to discuss news with other people who want to discuss news

You assume these are two different groups of people.

VincesUndies
Jul 18, 2002

Vince McMahon's Underpants
Xbox One could get photo-realistic rendering system

This is the link: click it if you want or don't, if you don't want to

Phil Spencer confirms Microsoft has been experimenting with real-time ray-tracing: "We've done experiments with Realtime Raytracing. A ton of potential with this tech, amazing visuals"

Edited to make Sedisp happy

VincesUndies fucked around with this message at 17:51 on Mar 3, 2014

A shrubbery!
Jan 16, 2009
I LOOK DOWN ON MY REAL LIFE FRIENDS BECAUSE OF THEIR VIDEO GAME PURCHASING DECISIONS.

I'M THAT MUCH OF AN INSUFFERABLE SPERGLORD
Photo-realistic graphics like this?

Seriously though, is this something the Xbone would be able to utilize? I don't know how hardware-intensive that kind of rendering is; what are the potential benefits for the Bone? The article suggests "The Cloud", which can only mean OnLive/Gaikai-style streaming of games being rendered server-side, right?

big mean giraffe
Dec 13, 2003

Eat Shit and Die

Lipstick Apathy
That's really nothing more than bullshit PR speak. There's no way either system could pull off graphics like that, and the cloud is worthless for anything real-time.

dont be mean to me
May 2, 2007

I'm interplanetary, bitch
Let's go to Mars


VincesUndies posted:

Xbox One could get photo-realistic rendering system for "amazing visuals"

Phil Spencer confirms Microsoft has been experimenting with real-time ray-tracing

Phil Spencer drumming up irrational exuberance with basically no details or evidence provided. Probably everyone experiments with ray-tracing, because the visuals very nearly can't be beat. Yet if it pulled off such a feat it'd be the first platform to put it into general circulation.

And there is the double problem that 1) the Cloud is, at a minimum, a not-great distance away because of network latency and the speed of light (more like 70% of it in practice, since signals through fiber and copper travel slower than light in a vacuum) and 2) unless it can do ray-tracing through a 32MB aperture it's still stuck with its performance envelope.
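
To put a number on the latency half of that, a rough sketch assuming a hypothetical 1,000 km one-way distance to the nearest data centre and ~70% of c propagation (switching and queueing delays come on top of this):

code:
# One frame at 60 fps is ~16.7 ms; see how much of it the wire alone eats.
C_KM_PER_MS = 299_792.458 / 1000   # speed of light in vacuum, km per millisecond
distance_km = 1000                 # hypothetical one-way distance to the server
propagation = 0.7                  # signals in fiber/copper travel at ~70% of c

one_way_ms = distance_km / (C_KM_PER_MS * propagation)
round_trip_ms = 2 * one_way_ms
frame_budget_ms = 1000 / 60

print(f"one-way propagation: {one_way_ms:.1f} ms")       # ~4.8 ms
print(f"round trip:          {round_trip_ms:.1f} ms")    # ~9.5 ms
print(f"60 fps frame budget: {frame_budget_ms:.1f} ms")  # ~16.7 ms

Over half the frame budget is gone before the server has rendered a single pixel, and that's with optimistic assumptions.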

Probad
Feb 24, 2013

I want to believe!
Ray-tracing is still just vaporware in my mind. People have been talking about it for years, and I don't buy that the XBone is going to be the device to bring it about when it's struggling with full HD resolutions on tech that actually exists.

Aphrodite
Jun 27, 2006

Things have gotten more powerful since then, of course, but...

quote:

On June 12, 2008 Intel demonstrated a special version of Enemy Territory: Quake Wars, titled Quake Wars: Ray Traced, using ray tracing for rendering, running in basic HD (720p) resolution. ETQW operated at 14-29 frames per second. The demonstration ran on a 16-core (4 socket, 4 core) Xeon Tigerton system running at 2.93 GHz

That's what it took to run Enemy Territory at 720p and it couldn't even hit 30fps.
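
For scale, a crude sketch of how far that demo is from 1080p at 60 fps, counting pixels only (taking ~20 fps as a middle-of-the-range figure from the quote; scene complexity, bounces and memory behaviour all ignored):

code:
# Pixel throughput needed for 1080p60 vs. what the 2008 demo delivered at 720p.
demo_pixels_per_sec = 1280 * 720 * 20      # ~20 fps at 720p, counting pixels only
target_pixels_per_sec = 1920 * 1080 * 60   # 1080p at 60 fps

print(f"{target_pixels_per_sec / demo_pixels_per_sec:.2f}x more throughput needed")
# 6.75x -- and that was on a 16-core Xeon box, not a console APU.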

univbee
Jun 3, 2004




Ray Tracing is extremely resource-intensive. Intel made a special build of Quake Wars which used ray tracing. A 24-core system with each core running over 2 GHz could only manage 720p at 20-35 frames per second. It's not even close to happening on expensive gaming PCs, let alone on any console this gen or probably another few gens.

Wheany
Mar 17, 2006

Spinyahahahahahahahahahahahaha!

Doctor Rope

Sir Unimaginative posted:

Phil Spencer drumming up irrational exuberance with basically no details or evidence provided.

Hey, to be fair, Xbox One could get photo-realistic rendering system for "amazing visuals"

Great Joe
Aug 13, 2008

On that note, probably the closest we'll get this gen is abstracted approaches like UE4's sparse voxel octree cone tracing (SVOGI). I'm not sure it's ready for prime time yet, but it IS targeted at this generation.

Sedisp
Jun 20, 2012


VincesUndies posted:

Phil Spencer confirms Microsoft has been experimenting with real-time ray-tracing

At least edit the link so it's not so click-baity.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
Can GPUs not do ray tracing?

VincesUndies
Jul 18, 2002

Vince McMahon's Underpants

Sedisp posted:

At least edit the link so it's not so click baity.

Thanks. I fixed it for you.

Aphrodite
Jun 27, 2006

Don Lapre posted:

Can GPUs not do ray tracing?

They can, but CPUs are better at it right now apparently.

GPUs are faster though.

beejay
Apr 7, 2002

Don Lapre posted:

Can GPUs not do ray tracing?

They can, but nowhere close to fast enough for gaming, yet.

Ein
Feb 27, 2002
.
Universal real time ray tracing is the cold fusion of computer graphics.


univbee
Jun 3, 2004




Most GPUs designed for ray tracing are workstation cards, like the Quadro and FireGL series, which makes sense since games don't really do it at this time. Of course, at this level you're measuring things in seconds per frame rather than frames per second.
