Nobody’s perfect. But where do you draw the line?
- Larisa: I know how much you care about it, Landon, but I have to be honest with you.
- Larisa: I could barely see any difference between the game running with 30 fps at 900p and the one running with 60 fps at 1080p.
- Larisa: I hope this won’t become a burden on our relationship.
- Landon: I… think I need some space.
Dump her….
Awfully inconsiderate of Landon. She’s gonna be blind before too long, remember. It might just be that.
That, or she’s been looking at the flames too long.
Well they were playing Minecraft, so…
I know this is supposed to be a joke, but the problem is I don’t know which one is the joke.
I can tell the FPS difference, but the resolution difference is much harder to spot.
Depending on the game, that could make almost no difference, or all the difference in the world.
I’m with her on this one. Human vision can’t even see 60fps for real anyway. It is possible to tell them apart, but you aren’t actually catching all the frames. The reason is that the higher the fps, the more consistent the number of frames your eye “skips” is, so it looks smoother. But past a certain point, this stops mattering.
And I can’t even tell 720 and 1080 apart unless they are side by side or on a very large screen.
Larisa: And I can’t really see the point of 4k UHD. Sure, it looks nice, but only if you have a ridiculously oversized screen… and is anyone ever going to produce anything real to play on it?
Landon: …If that’s true, set light to me now. There’s no point in carrying on. ;_;
Larisa: See?! I knew we’d match up, and come out stronger! *hums to herself as she douses Landon with fuel*
>:=)>
Show Larisa a flame or explosion at said resolutions and framerates and I’m sure she will see the difference between the two.
Blasphemous! How could she not see the differences?! It is like night and day!!!
@ Whirlwound:
Remember, far as we know he doesn’t know about her condition.
I can tell the difference between 30 and 60 FPS, and resolutions basically never, but it’s never really affected my ability to enjoy content. I mean, if I’m playing a game, and enjoying myself, does it really matter what resolution it is in?
I can’t speak for those who have ACCESS to resolution like that (my computer averages 15 fps…), but I can vouch for the difference in quality based on comparisons with others’ computers. Oh, and as a huge Larisa fan, I have to compliment whoever it was that did the drawing of her as a succubus with Kevin. That was great, nice job 🙂
Framerates this high do matter if there are lots of fast-moving objects in the game—and in most of them, there are. Our vision does blur things together above 30 fps or so, but it’s important whether it blurs fast movement into a continuous trail or several standing images. Jiggle your cursor really fast and you’ll see what I mean.
I had the same reaction she did with HD TV. To me, 1080P is just as good as 4K. But I’m fine with SD, as far as that goes.
Maybe they played minesweeper…?
This level of casual is highly disturbing.
Most people can’t.
When it comes to resolution, the size of your screen, and the distance from your face, matter a lot. What you REALLY need to measure is the angle per pixel, and that can vary significantly depending on the dpi and distance, even in the same resolution. A good way to demonstrate this is VR, where having the screen essentially in your face makes what would be extreme overkill for a normal screen to be far too low.
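To put rough numbers on that, here’s a quick Python sketch (my own, with made-up example screens and distances, so treat it as a ballpark illustration only) of how big one pixel looks, in arcminutes, at different sizes and viewing distances:

import math

def arcmin_per_pixel(diag_in, px_w, px_h, distance_in):
    # Physical width of a panel with the given diagonal and aspect ratio
    aspect = px_w / px_h
    width_in = diag_in * aspect / math.hypot(aspect, 1.0)
    pixel_pitch = width_in / px_w                       # size of one pixel, in inches
    angle = 2.0 * math.atan(pixel_pitch / (2.0 * distance_in))
    return math.degrees(angle) * 60.0                   # arcminutes per pixel

# Hypothetical examples: a 24" 4K monitor at arm's length, a 65" 4K TV at ten feet,
# and a small 1080p panel two inches from the eye (a crude stand-in for VR that
# ignores the headset's lenses).
print(arcmin_per_pixel(24, 3840, 2160, 28))
print(arcmin_per_pixel(65, 3840, 2160, 120))
print(arcmin_per_pixel(5.5, 1920, 1080, 2))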
Please don’t ruin this, those two have been through so much together.
I’m with you, Larisa, I don’t see any difference either… I’m not even sure what all this 4K is about…
As someone with a 65-inch 4K screen, I can see the difference. Also, I trained myself to track fast-moving objects, but with a bad framerate things don’t move smoothly enough to be tracked. It’s rare to see games that make good use of full 4K, though; maybe the FFVII remake will.
@ d.artemis:
Her optic nerve is deteriorating due to Wolfram’s syndrome.
“But… but… they were entirely different GAMES… I was comparing Call of Duty to Yoshi’s Woolly World…”
“Yeah, I couldn’t keep them straight.”
“… Yoshi’s Woolly World is the very flammable one.”
“OH! Now I get it.”
(In slightly more seriousness, Landon should have just showed her videos of fire, one at each resolution/FPS, and had her compare that.)
SCANDALOUS!!!!!
https://youtu.be/inQjv6Cd6kI
She’d have noticed.
Trimutius wrote:
Because everybody is thinking about it wrong. 4K is a fine resolution for a 24-inch computer monitor at a distance of one standard arm length. A device on which you read and write lots of text at small font sizes.
As for a 65-inch television screen ten feet away displaying video, well, I don’t know either.
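For what it’s worth, a quick back-of-the-envelope calculation (my own numbers, purely to illustrate the point) shows why the same 3840×2160 feels so different on those two screens:

def ppi(diag_in, px_w=3840, px_h=2160):
    # Pixels per inch of a 16:9 panel with the given diagonal size
    aspect = px_w / px_h
    width_in = diag_in * aspect / (aspect ** 2 + 1.0) ** 0.5
    return px_w / width_in

print(ppi(24))   # roughly 184 ppi: dense enough for small text at arm's length
print(ppi(65))   # roughly 68 ppi: the viewing distance has to do the rest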
900p vs 1080p I don’t care; 30fps vs 60fps, please, not 30.
Larisa truly is a vile creature of hell!!!
This comment section seems kinda crazy… no verification of identity at all…
I mean, I could use someone else’s email and Gravatar-listed user name and post something like
“Hitler’s only mistake was not gassing more Jews” or “Xbox is better than PC” or some other stupid, bigoted, evil and/or blatantly wrong statement.
Larisa, being the voice of reason? LARISA????
You can’t tell the difference between 30fps and 60fps if you never play in 60fps.
@ GJT:
I know what you mean; we got a “Hi-def” TV when we replaced our old set. Honestly, it’s kind of cool that I can walk up to the set and see the pores on somebody’s face, but I watch the set from 8 feet away.
Besides, I’m an old codger who’s already had cataracts, and has “epi-retinal membrane”, and “retina resolution” is getting cheaper every year, if you catch my drift.
@ GJT:
Yes, after 200 or so fps…
And you don’t really “see” the difference, you can only “feel” it.
Personally, I feel the difference at least up to 75fps, which is my monitor’s max, but I imagine I would feel it even at 100+ if I had a monitor that supported it.
30fps is horrible; after an hour or two I feel sick. 60fps is fine, but my head feels a bit heavy after a few hours. 75fps feels better, but I reckon not many people would play enough hours to appreciate the difference from 60fps.
tl;dr: better fps = less strain. 60fps should suffice for most people. The 30fps limit should die in a fire.
To be fair to her, she could “barely see a difference”, meaning she still saw one, so nothing’s wrong with her.
@ GJT:
In order for a computer-generated image where frames aren’t interpolated to look smooth, we need more than 30fps, though. 60fps at least, or even better 120. (Try playing Minecraft at 30fps and then at 120. If you can’t see the difference, then please visit a doctor.)
The source of the “we don’t need more than 30fps” myth is in fact analog video and movie recordings (analog as in recording the real world, not a digitally generated movie). There we really don’t need more than 30fps, the reason being that the camera doesn’t capture static frames but a bit of motion blur, with which our brain can properly interpolate multiple images into a smooth motion. Because that natural blur is missing in computer-generated images, they seem to stutter at lower frame rates. E.g. pause a movie in a scene with lots of motion and you won’t get a sharp image, but if you take a screenshot of a video game, everything is sharp. (At least when it doesn’t add some artificial blur, which is one method, especially on consoles, to mask lower framerates. Most of the time it just looks bad, and it’s really annoying for people like me who get motion sick quite fast.)
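If it helps, here’s a toy Python sketch (entirely my own illustration, not taken from any engine) of that difference: a camera averages an object’s motion across its shutter interval, so each frame contains a smear, while a game renders one sharp instant per frame.

def camera_frame(frame_index, fps=30, shutter_samples=16, speed=600.0):
    # Average many sub-positions across the exposure -> a motion-blurred span
    t0 = frame_index / fps
    dt = 1.0 / fps
    positions = [speed * (t0 + dt * i / shutter_samples) for i in range(shutter_samples)]
    return min(positions), max(positions)      # the object is smeared over this range

def game_frame(frame_index, fps=30, speed=600.0):
    # Render a single instant -> one sharp position, no smear
    return speed * (frame_index / fps)

for f in range(3):
    print("camera smear:", camera_frame(f), " game position:", game_frame(f))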
Somehow I’m reminded of the time I was admiring my wife’s new iPad, now with Retina display. She was over the moon about the new display and had them both side-by-side to demonstrate the difference. I picked up the iPad, used it a bit to scroll back and forth, and yes, I did notice a decent difference.
While continuing to read and scroll I told her how much better this new one was. She smiled, and then suddenly froze.
In a very chilly voice, she said “you’re holding the *old* iPad.”
“Um. As I was saying, this old display is really hard to read and…”
On the other hand, we’re still married.
It will take him 5 to 10 minutes to make peace with her imperfection and return to the status quo.
As long as she doesn’t burn anything during this time, of course.
Moep wrote:
Which is why good CGI in movies includes motion blur for CGI elements. I expect that video games will include motion blur soon, and probably some already do.
@ Agarax:
A good number of games already either have motion blur or can be modded to have it.
Landon, kiss her.
This means you can load twice as many mods onto her versions of the game.
It should be noted that old screens didn’t actually display just 30fps either: while the source video was (and is) 30fps animation, they effectively showed more by creating a sort of approximate interpolation between the frames.
Speaking of interpolation, I should note that many games, at least first-person shooters, actually run their simulation at something like 10 updates per second; all the additional frames are actually an interpolation between the current and previous simulation state.
One last thing to note: creating accurate motion blur in a game is akin to rendering at a MUCH higher framerate, as you would need to render the in-between frames and blur them together. There are, of course, tricks that allow “faking it” without having to use so many resources.
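For anyone curious what that interpolation looks like in practice, here’s a minimal Python sketch (names and numbers are mine, not from any particular game) of a fixed-timestep simulation whose rendered frames blend the previous and current states:

import time

SIM_HZ = 10               # low, fixed simulation rate
SIM_DT = 1.0 / SIM_HZ

def simulate(pos, dt):
    return pos + 100.0 * dt          # toy physics: move 100 units per second

def lerp(a, b, alpha):
    return a + (b - a) * alpha

prev_pos = curr_pos = 0.0
accumulator = 0.0
last = time.perf_counter()

for frame in range(120):             # pretend this is the render loop
    time.sleep(1.0 / 60.0)           # emulate a ~60fps renderer
    now = time.perf_counter()
    accumulator += now - last
    last = now
    while accumulator >= SIM_DT:     # advance the simulation in fixed steps
        prev_pos, curr_pos = curr_pos, simulate(curr_pos, SIM_DT)
        accumulator -= SIM_DT
    alpha = accumulator / SIM_DT     # progress between the two sim states
    rendered = lerp(prev_pos, curr_pos, alpha)
    if frame % 30 == 0:
        print(f"frame {frame}: drawn at {rendered:.1f}")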
@ GJT:
Well, seeing individual frames kind of breaks down around 16 FPS, when the illusion of movement appears, which is the minimum framerate for film.
However, most people who play games that require high-speed, accurate movements (quite a few games do) will know that a game at 60 FPS is much easier to control than that same game at 30 FPS. Our hand-eye coordination is somewhat impeded by the fact that the picture remains in place, albeit briefly, while our hand keeps moving.
When I got the opportunity to work with the Oculus Rift, I realised that some aspects of our image processing are even more sensitive. The first machine I used was not powerful enough to run the Oculus Rift at the recommended 90 FPS, and the developers and testers almost all got nauseous when using it. Once we upgraded to a more powerful machine, only highly susceptible people became nauseous.
This seems to happen because at 60 FPS the fact that your vision stays in place while your head is moving is already a problem for your visual and vestibular (balance) senses.
I always said, “What is 60fps good for?” That is, until I realized that the TV at my friend’s place looked weird to me, because it was displaying the movie at 60fps and didn’t provide the choppiness my brain was trained to expect.
Everyone is discussing frame rates and I’m here annoyed that this is Landon’s first speaking role in 200 pages.
Also, everyone comparing video games to movies: stop it. Movies will always look better at a lower frame rate due to having predetermined paths, which allows for more realistic visual effects, whereas games have to generate the image on the fly.
Also, you all may want to look at the Nyquist-Shannon sampling theorem.
Interesting comments.
So to bring them all together … if Larisa was looking over Landon’s shoulder as he played the game, she would: (a) be further from the screen where the resolution has less impact, and (b) not be engaged in focusing on the fast-moving things on the screen, meaning that the background was more highly weighted in her evaluation, and (c), not be subject to the feedback loop between controls and screen that can engage those feelings of nausea.
This probably happens to a lot of people, actually. And then you add the connoisseur effect – which is a key part of the joke, but it’s also very real – and your average Joe finds themselves dismissing gamers’ desire for the latest and greatest as pettiness or one-upmanship rather than a sensitivity to a real quality difference.
Me, I play games where graphics don’t matter. Turn-based, for preference.
It’s weird to me how people are actually disturbed by lower frame rates/resolutions. I really don’t notice unless it’s really different. I can’t tell the difference between 30fps and 60fps. I don’t see how anyone else can see it either. I also don’t understand how it could bother other people so much.
On CRT computer monitors I had to use high refresh rates. As monitors got bigger and phosphor persistence got shorter (to satisfy gamers), I had to keep going up. I was OK at 75Hz on 17″ monitors, but on the last CRT monitor I had, which was a 20″ Iiyama, I had to use 100Hz. If I didn’t do this I got various symptoms: splitting headaches, motion sickness, eye strain. I realise refresh rate on the phosphor is not the same as a game’s rendering frame rate; what I couldn’t cope with was probably the periods when the phosphor dots were going dark.
LCD monitors were a revelation for me; the issue has gone away completely. Standard refresh rates are fine. I believe this is because LCD is inherently a “sample and hold” technology: the pixels don’t fade, they stay the same until changed.
I manually turn off motion blur every time for video games because it makes me feel sick.