Tactically Inept

Monitor Discussion


Jedi2155


  • 1 month later...

The more important points I saw mentioned were:

You can play at slightly lower framerates with a smoother, more acceptable experience.

Other visual artifacts become more noticeable.

These effects are still present even at 120/144Hz.

 

I really want to see just how this plays out. I understand it's Nvidia tech, but if it becomes prevalent and there is no AMD equivalent, I feel this could be another massive advantage for Nvidia. Hopefully in the next few years we'll begin to see monitors with this technology being sold.


  • 4 weeks later...

Which is really sad for multiple reasons. DisplayPort is a very solid interface that I wish would become a bigger standard in the PC space. Unfortunately, since all the home audio manufacturers seem to be enamored with HDMI, we'll most likely stay stuck using DVI and HDMI on most computer systems, at least for the foreseeable future.


So OK, I basically should have made sure what I was going to say was accurate before typing anything so I wouldn't look like a doofus. Too late. I was wrong.

 

There are other details I didn't mention. Part of this is apparently that it's in an upcoming DisplayPort standard, and there is some interesting, fairly technical discussion (the good kind) around the web about FreeSync vs. G-sync. The FreeSync demos on laptops use a different physical connection type than desktops, and Nvidia is saying their monitor hardware bypasses the scaler, which has its own advantages. Interesting stuff.


  • 1 month later...

I'm going to hijack this thread slightly.

 

I wanted to post up some information, especially about 4K monitors, mostly because I have often wondered how well they perform and whether it would be reasonable to use them for 1080p gaming. Well, here's some of what I found.

 

This is how Windows detects 4K displays over DisplayPort 1.2 (you also need a DisplayID 1.3 compatible GPU and drivers).
[Image: display_settings.jpg]

Important 4k Consideration - Updated 15/1/14 - the UP3214Q is actually seen by Windows by default as two displays when running at 60Hz refresh rate, and if you look in the 'display properties' section (shown above) you will see the single screen appears as if it were two 1920 x 2160 resolution displays. You may need to select the "extend these displays" option here, and you might need to switch round which "display" is primary, and which is secondary in the menu, if the taskbar/start menu appear on the wrong half of the screen. This is the same as all other current 4k screens at 60Hz which use MultiStream to drive 60Hz refresh rate. If you switch back to 30Hz refresh rate the screen shows as a single display again by default.

http://www.tftcentral.co.uk/reviews/dell_up3214q.htm

 

Later he goes on to mention that you can merge the displays into a single display at 60Hz, but it requires using Eyefinity on AMD and Mosaic on Nvidia. He also mentions that there are some compatibility issues when trying to switch to 1080p, mostly because of how the dual displays are recognized.

 

The short answer is that 4K doesn't work very well on current tech. Manufacturers are having to fudge how the display is detected to get around bandwidth constraints, and HDMI 1.4 can't carry 4K at 60Hz at all. When HDMI 2.0 comes out, manufacturers can properly push 4K at 60Hz with no need for multi-display shenanigans.


  • 4 weeks later...

I wanted to update my post with regard to 4K displays. Anandtech has a nice review of a Dell with the exact same issues I mentioned in the post above. He also mentions that the firmware is pretty hit or miss, and that the monitor often requires power cycling after it's put to sleep. In general, 4K displays are not at a usable level right now, and certainly not for gaming.

http://anandtech.com/show/7906/dell-up3214q-review/2

 

The author also mentions various issues at the OS level with regard to DPI and text/icon scaling. It seems like 4K doesn't play well alongside other, lower-resolution monitors either.

 

Currently the display market is a bit weird. I find it hard to just suggest that people go for a standard 1080p display and not something a bit fancier like a 120Hz model or a 1440p display. That being said, I'm still waiting around to see just how good G-sync really is. Everyone who has seen demos of it has been awestruck by the technology. The real question is whether any of the OEMs are going to push G-sync displays at resolutions other than 1080p, and whether those displays will be 120Hz capable.

 

While I love my 1440p display, I would seriously consider moving back to a 1080p display with G-sync, especially if that means I could look into a nice three-monitor setup for occasional ridiculousness.


  • 2 months later...
  • 2 weeks later...

There have been 4K monitors at 60Hz before; it's just that they required MultiStream. This is one of the first that actually uses a single stream over DisplayPort.

 

If someone were primarily interested in screen real estate, I guess that would be a good choice. My issue is that most single-GPU setups cannot drive high-end games at 60fps at 4K. Part of me almost wants to go back to a 1080p monitor because of how standard that resolution is, and because of the performance increase I would see compared to driving my 1440p monitor.


One guy reported this in the Amazon reviews:

 

I wish I could give more input on an AMD GPU setup but unfortunately, I don't own any. I can however, tell you that Nvidia works beautifully with this monitor, especially using SLI. Battlefield 4, for example: with all Ultra settings, 2x MSAA and SSAO, I was consistently getting 70-90 frames per second and never dropped below 65. One single 780 Ti had playable framerates at Medium settings, no AA and SSAO, with between 50-70 fps, which is still not bad considering the resolution. And that was on multiplayer, which I felt was the best place to test the frames instead of boasting about ridiculously high frames in the campaign. Another example is Skyrim: I was able to completely max out this game at 2160p with a single card. BF4 really is the game that beats up your GPU at this resolution, no matter what you have. With everything maxed out it did introduce a very small amount of screen tearing, and it does, expectedly, take a few seconds for the textures to load when you first fire up a match.


  • 2 weeks later...

A few points from the review.

You cannot use it at higher fps than 60 when it's in 1080p. The only resolution that supports higher fps is 1440p (this is a slight concern).

They also have a different mode besides G-sync called Ultra Low Motion Blur (ULMB), which only works when G-sync is disabled. ULMB also only works at 85, 100, and 120Hz (not 144Hz). Supposedly this mode is also impressive.

 

Their breakdown is that G-sync is best used for games that run between 45-60 fps, and ULMB is best used for games running consistently above 85 fps.

 

The viewing angles are really the only major issue I'm seeing, and that is almost entirely due to the panel being a TN. Every other metric seems very solid.

 

I am seriously considering making a large upgrade to this monitor and a new GPU (when the next cycle releases). I just wish I could see a G-sync demo in person.


When I look at monitors to "upgrade" to, I mainly see 3 options right now for myself and they are each very different choices.

 

1) Asus ROG Swift PG278Q

  • True 8-bit panel. Unique for a TN.
  • 1440p. Unique for a TN. This brings up the issue of GPU horsepower. I maintain 60fps in most of my games but I'm not sure how much "over" that I would be. I am planning a GPU upgrade near the Maxwell release. With G-sync, though, this isn't as much of an issue. EDIT: ....because I would expect many games to run in the 70-90fps range.
  • G-sync. Unique for anything, and especially combined with the first two bullet points. As I already said in this thread, I think this tech is really cool.
  • $800 is a real kick in the balls. I could afford it but I would not be happy about it.
  • ULMB or G-sync but not both. That sucks.
  • ULMB or 144Hz but not both. That sucks.
  • ULMB has a pretty big hit on brightness compared to other strobing/motion blur tech on other monitors. That sucks.
  • Panel uniformity could have been better.

2) Overlord Tempest X270OC

  • IPS panel so colors will be much improved. I mainly use a TN, and I've lived with it for quite a while now, but I have an IPS right next to it that I have also had for a long time, and the superior colors on the IPS have been noticeable since the day I got the TN.
  • 27", 90-120Hz guaranteed, 1440p.
  • No strobing tech, and the response time of the panel is going to be a little slower, so there will probably be some "smearing" on fast movement compared to the other options, but it would still be faster than my current monitor.
  • Same resolution horsepower problem as Option 1.

3) BenQ XL Series XL2411Z

  • The cheapest by a good margin at right around $300.
  • 24", 144Hz, 1080p. Since this is 1080p I should be able to drive most games at high FPS (current monitor 1200p). After a GPU upgrade even more so.
  • Semi-official 3rd party tool allows for powerful adjustment of strobing tech to make it one of the best.
  • After calibration, color and contrast are a bit above average for a TN.
  • Input latency and response time both very low.
  • This feels like a solution that has 80% of the features of the Swift at <50% of the cost. Keep in mind that I don't put that much value on resolution past 1080p.
  • If I am desperate for pure size, there is a 27" model with the same tech.

 

That is all I can think to type up right now....

 

A few points from the review.

You cannot use it at higher fps than 60 when it's in 1080p. The only resolution that supports higher fps is 1440p (this is a slight concern).

They also have a different mode besides G-sync called Ultra Low Motion Blur (ULMB), which only works when G-sync is disabled. ULMB also only works at 85, 100, and 120Hz (not 144Hz). Supposedly this mode is also impressive.

 

Their breakdown is that G-sync is best used for games that run between 45-60 fps, and ULMB is best used for games running consistently above 85 fps.

 

The viewing angles are really the only major issue I'm seeing, and that is almost entirely due to the panel being a TN. Every other metric seems very solid.

 

I am seriously considering making a large upgrade to this monitor and a new GPU (when the next cycle releases). I just wish I could see a G-sync demo in person.

 

I hate to call you out on here, but please check your units. FPS and Hz are not interchangeable; they are totally independent of each other, even with vsync (normal or adaptive).

 

I always run at native resolution and decrease graphical quality rather than drop the resolution, so that isn't a concern for me. I would be interested in three of these monitors.

 

G-sync does not work in Surround at this time (official Nvidia reply).


You're right to call me out on the units; that's sloppy on my part, my bad.

 

Some points regarding ULMB and G-sync. Since they both seem to be effective at different fps ranges, I'm not sure you would need both.

It should be noted that the real benefits of G-sync really come into play when viewing lower frame rate content, around 45 - 60fps typically delivers the best results compared with Vsync on/off. At consistently higher frame rates as you get nearer to 144 fps the benefits of G-sync are not as great, but still apparent. There will be a gradual transition period for each user where the benefits of using G-sync decrease, and it may instead be better to use the ULMB feature discussed in the following section which is not available when using G-sync.

I would imagine most of the new AAA games would land anywhere between 45-60 fps at 1440p. My personal experience at that resolution is that only some games can be maxed out at 60+ fps. I would not even meet the minimum requirements for ULMB most of the time (the lowest setting starts at 85Hz). Personally I would probably just leave the monitor in G-sync for the most part; while the appreciable benefits of G-sync become less and less apparent at higher fps ranges, I would imagine that to reach those frame rates you would need an SLI setup anyway.
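
As a rough rule of thumb, here's how I'd think about which mode to use. This is just a sketch using the thresholds from the review, nothing official:

def pick_mode(avg_fps):
    # Review's guidance: ULMB needs at least 85Hz, so only consider it when the game
    # can consistently stay above ~85fps; otherwise G-sync gives the bigger benefit.
    return "ULMB" if avg_fps >= 85 else "G-sync"

print(pick_mode(55))   # G-sync (typical AAA title at 1440p)
print(pick_mode(120))  # ULMB (older/lighter games)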

 

I agree with you on the BenQ XL Series XL2411Z; in fact, part of the reason I still consider dropping back down to a 1080p display is that I would be capable of running higher frame rates and making use of the 120/144Hz capability more effectively than at 1440p. While I've been very happy with my 1440p display for quite some time, I know for a fact that I run at either lower fps or lower settings (or both) in certain games like BF4. Here's a great example:

BF4 Ultra settings at 1080p: GTX 780 = 67.6fps and 780 Ti = 81.6fps

BF4 Ultra settings at 1440p: GTX 780 = 44.5fps and 780 Ti = 53.7fps

Source: http://anandtech.com/bench/GPU14/999

That's roughly a 34% drop. You can see the massive impact the extra resolution has on graphically intensive games.
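
If anyone wants to check the percentage, here's the quick math on the Anandtech numbers above (just arithmetic on the two data points):

# Frame rate drop going from 1080p to 1440p, using the bench numbers above
fps_1080p = {"GTX 780": 67.6, "GTX 780 Ti": 81.6}
fps_1440p = {"GTX 780": 44.5, "GTX 780 Ti": 53.7}

for card, fps in fps_1080p.items():
    drop_pct = (fps_1440p[card] - fps) / fps * 100
    print(f"{card}: {drop_pct:.1f}%")   # about -34% for both cards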


Some points regarding ULMB and G-sync. Since they both seem to be effective at different fps ranges, I'm not sure you would need both.

 

I would prefer to have both for the situations where I play games my system can easily pump out 100+ FPS in (Shadowrun, for example, is very light), and then not have to think about turning anything off when I'm about to play a game that sits more in the 45-65 FPS range (like BF4). The technologies should not be mutually exclusive. Just because strobing has almost no benefit in lower-FPS scenarios doesn't mean I want it forced off.

 

Here's a great example:

BF4 Ultra settings at 1080p: GTX 780 = 67.6fps and 780 Ti = 81.6fps

BF4 Ultra settings at 1440p: GTX 780 = 44.5fps and 780 Ti = 53.7fps

Source: http://anandtech.com/bench/GPU14/999

That's roughly a 34% drop. You can see the massive impact the extra resolution has on graphically intensive games.

 

That's a really good example of why I'm a little hesitant to move to 1440p. I don't get a solid 60fps all the time in BF4 even at 1200p.


Let me clear up my point on G-sync and ULMB. I am in favor of both technologies and would like to see them work together seamlessly. Unfortunately they seem to be mutually exclusive at this point. I don't know the technological limitations of the G-sync standard or how they would interact with something like ULMB. If you could find a way to allow variable timing on the backlight it might work, but for now they're separate technologies. It would also be nice if ULMB would automatically detect and activate at 85+ Hz, but I have no idea what sort of overhead that would require either. As far as switching between G-sync and ULMB goes, I agree with you that it's not a perfect system; ideally you want your consumer to have a seamless experience. However, I am not sure the technology is at that point yet; maybe the second generation of G-sync displays will accommodate ULMB more effectively.

 

Right now I'm looking at what the Swift can do; it isn't a perfect display, but the compromises it makes are pretty reasonable. If you can swap between G-sync and ULMB fairly quickly, then I don't have too many issues with how their system currently works; I would probably leave it on G-sync and swap to ULMB when playing less intensive games.

 

I'm also curious to hear any news about the BenQ G-sync monitors that are supposedly releasing soon. They have 24" and 27" variants at 1080p 144Hz.


Disclaimer: I wrote this kind of fast and did some silly MSPaint diagrams, so this isn't my most organized post. Apologies if it's hard to follow.

Trying to expand on something I said earlier.

I've been thinking over G-sync and strobing technologies and how they all fit together, and I'm wondering if I've missed something along the way. I will be referencing the images here a lot: http://www.anandtech.com/show/7582/nvidia-gsync-review. I don't want to lean too heavily on these images, because they are marketing images from Nvidia, but they do a decent job of illustrating the tech.

I fully understand, from first-hand experience, that tearing is awful and that there needs to be some sort of synchronization between the GPU and the screen. I also understand that G-sync is basically the inverse of v-sync, in the sense that G-sync makes the screen wait on the GPU, whereas v-sync makes the GPU wait on the screen.

What I don't quite understand is why this has to be mutually exclusive with strobing, and why there is not some kind of middle ground that involves both technologies.

BenQ XL2720Z review: http://www.tftcentral.co.uk/reviews/benq_xl2720z.htm

In the review linked above, you can read the section on Blur Reduction, but the only point I want you to take home is that blur reduction/strobing techs DO work (Nvidia's versions are called LightBoost and ULMB). I am just assuming at this point that the reason Nvidia has made strobing tech mutually exclusive with G-sync is that timing the "in-betweens" is difficult. Purely an assumption.

Imagine a scenario where you are using G-sync in a game that is running at 75fps (arbitrary choice). Let us say the screen's full processing time (input lag + response time) is 6ms, not uncommon, with a max refresh rate of 120Hz. So the screen's refresh rate at that moment is tied to the fps, making it 75Hz, but you are not getting frames faster than 6ms apart because of the limits of the panel. Look at the first image in the Anandtech article from Nvidia, the one titled "Buy V-SYNC OFF Causes...". Note that the panel squares are uniform for all 3 images. In that first image, how would G-sync handle that scenario? The FPS can't be displayed faster than the panel. See image below.

 


[Image: 6ETht4S.jpg]

 


The label for the x-axis is cut off on Anandtech, but on other sites it shows "Time (ms)". 5 frames in 80ms works out to roughly 60Hz. If the screen were running at 120Hz at all times (double what Nvidia showed), perhaps making the implementation of strobing tech easier, then even if the game is still running at 75fps, you would still only be getting frames every 6ms at best. But what about tearing? Don't make the GPU wait (avoiding lag), but don't display a frame before the next panel refresh. You still need a hard cap at 120fps. See image below. 60Hz and 120Hz shown.

[Image: XmOYRAB.jpg]

144Hz would mean even more opportunities, but I would worry about brightness reduction or color shift.

tldr: no sync = eye-cancer tearing; v-sync makes the GPU wait on the screen = lag; G-sync makes the screen wait on the GPU = strobing tech hard to implement?; strobing tech works; so why not a hybrid where no one waits on anyone, the screen just runs as fast as it can at all times and displays only full frames, but still has a hard cap?
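
If it helps, here's a rough timing sketch of the hybrid idea. These are my own illustrative numbers: a fixed 120Hz panel, a game rendering at a steady 75fps, and each refresh simply shows the newest finished frame.

# Hypothetical hybrid: panel refreshes at a fixed 120Hz, GPU renders at a steady 75fps,
# and every refresh scans out the most recently completed frame (never a partial one).
panel_hz = 120.0
game_fps = 75.0
refresh_ms = 1000.0 / panel_hz   # ~8.33ms between refreshes
frame_ms = 1000.0 / game_fps     # ~13.33ms between finished frames

for refresh in range(10):
    t = refresh * refresh_ms
    newest_frame = int(t // frame_ms)       # latest frame finished by time t
    age = t - newest_frame * frame_ms       # how stale that frame is at scan-out
    print(f"refresh {refresh} at {t:6.2f}ms -> frame {newest_frame} ({age:.2f}ms old)")

Some refreshes repeat a frame and the worst-case staleness is bounded by one frame period, but the GPU never blocks and nothing tears.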

 

EDIT: A lot of thoughts turning over in my head related to lag. I'm going to keep thinking about it.


My own semi-rambling reply:

 

From my own limited understanding, the current version of backlight strobing (LightBoost, ULMB, etc.) is set on a fixed interval, whether that's 85/100/120/144Hz. There is no "dynamic" mode for backlight strobing. The issue is that these technologies seem to have been originally developed for static refresh rates. If you tied the backlight strobe to each new frame, you'd begin to run into some problems.

 

First, you don't want to be strobing below 60Hz; this is tied to the amount of time the backlight needs to be "on" and the amount of time the backlight is "off." The lowest settings on the new Asus Swift monitor start at 85Hz. I would also recommend looking through some of this article: http://www.tftcentral.co.uk/articles/motion_blur.htm

 

This is what the most aggressive LightBoost setting looks like: 1.5ms "on" and 6.875ms "off", or basically a 120Hz refresh rate. That's the setting with the largest amount of "off" time on the backlight. If you tried to apply that same logic to something running at, say, 60fps, you'd have much longer periods of "on" time (around 9.8ms).
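
Rough numbers to sanity-check that, assuming the off period stays fixed at 6.875ms:

# LightBoost-style strobe: one "on" pulse plus one "off" period per refresh
on_ms, off_ms = 1.5, 6.875
print(1000.0 / (on_ms + off_ms))   # ~119.4 -> roughly a 120Hz refresh rate

# If the strobe were tied to a 60fps game and the off period stayed the same,
# the backlight would have to stay on for the rest of each ~16.7ms frame:
frame_ms = 1000.0 / 60
print(frame_ms - off_ms)           # ~9.79ms of "on" time, i.e. the ~9.8ms above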

 

From what I understand, G-sync functions by syncing the GPU's render of a new frame to the refresh signal on the display. If you linked that to the backlight, there would also have to be considerations for the maximum amount of "off" time, so occasionally you would get situations where the panel is stuck holding a previous frame while a newer frame is being pushed by the GPU.

 

You are suggesting using some funky form of interpolation, but that would require you to fake a higher refresh rate by adding in another period of "off" time and another period of "on" time with no image change. I don't know if that would ruin the effectiveness of backlight strobing.

 

This video is neat:


Disclaimer: I wrote this kind of fast and did some silly MSPaint diagrams, so this isn't my most organized post. Apologies if it's hard to follow.

 

Trying to expand on something I said earlier.

 

I've been thinking over G-sync and strobing technologies and how they all fit together, and I'm wondering if I've missed something along the way. I will be referencing the images here a lot: http://www.anandtech.com/show/7582/nvidia-gsync-review. I don't want to lean too heavily on these images, because they are marketing images from Nvidia, but they do a decent job of illustrating the tech.

 

I fully understand, from first-hand experience, that tearing is awful and that there needs to be some sort of synchronization between the GPU and the screen. I also understand that G-sync is basically the inverse of v-sync, in the sense that G-sync makes the screen wait on the GPU, whereas v-sync makes the GPU wait on the screen.

 

What I don't quite understand is why this has to be mutually exclusive with strobing, and why there is not some kind of middle ground that involves both technologies.

 

BenQ XL2720Z review: http://www.tftcentral.co.uk/reviews/benq_xl2720z.htm

 

In the review linked above, you can read the section on Blur Reduction, but the only point I want you to take home is that blur reduction/strobing techs DO work (Nvidia's versions are called LightBoost and ULMB). I am just assuming at this point that the reason Nvidia has made strobing tech mutually exclusive with G-sync is that timing the "in-betweens" is difficult. Purely an assumption.

 

Imagine a scenario where you are using G-sync in a game that is running at 75fps (arbitrary choice). Let us say the screen's full processing time (input lag + response time) is 6ms, not uncommon, with a max refresh rate of 120Hz. So the screen's refresh rate at that moment is tied to the fps, making it 75Hz, but you are not getting frames faster than 6ms apart because of the limits of the panel. Look at the first image in the Anandtech article from Nvidia, the one titled "Buy V-SYNC OFF Causes...". Note that the panel squares are uniform for all 3 images. In that first image, how would G-sync handle that scenario? The FPS can't be displayed faster than the panel. See image below.

 

 

[Image: 6ETht4S.jpg]

 

 

The label for the x-axis is cut off on Anandtech, but on other sites it shows "Time (ms)". 5 frames in 80ms works out to roughly 60Hz. If the screen were running at 120Hz at all times (double what Nvidia showed), perhaps making the implementation of strobing tech easier, then even if the game is still running at 75fps, you would still only be getting frames every 6ms at best. But what about tearing? Don't make the GPU wait (avoiding lag), but don't display a frame before the next panel refresh. You still need a hard cap at 120fps. See image below. 60Hz and 120Hz shown.

 

[Image: XmOYRAB.jpg]

 

144Hz would mean even more opportunities, but I would worry about brightness reduction or color shift.

 

tldr: no sync = eye-cancer tearing; v-sync makes the GPU wait on the screen = lag; G-sync makes the screen wait on the GPU = strobing tech hard to implement?; strobing tech works; so why not a hybrid where no one waits on anyone, the screen just runs as fast as it can at all times and displays only full frames, but still has a hard cap?

 

EDIT: A lot of thoughts turning over in my head related to lag. I'm going to keep thinking about it.

 

Sorry for the long quote, but something occurred to me today related to the "hybrid" system I was thinking of. In the diagram I doubled the panel rate to show 120Hz, but 144Hz is pretty widely available too, so that would mean even less lag between a ready frame and a ready panel refresh. But why limit the panel at all? Since the panel would be running entirely on its own, bandwidth doesn't matter for it either. The BenQ XL2720Z review I linked in the above post measured a full lag time (signal processing + response time) of <5ms. If we round up to 5ms, that means the screen itself is capable of roughly 200Hz. Bandwidth limitations may cap incoming frames at 144fps, but that's fine.
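
Quick math on that, using the review's numbers:

# Signal processing + response time, rounded up to 5ms
full_lag_ms = 5.0
print(1000.0 / full_lag_ms)   # 200.0 -> the panel could in principle keep pace with ~200Hz

# Frames arriving at 144fps are ~6.9ms apart, comfortably more than the 5ms the panel needs
print(1000.0 / 144)           # ~6.94ms per frame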

 

I'll try to make some diagrams later because I'm not sure I'm communicating this too well. Someone else MUST have thought of this before.... it makes a lot of sense to me.....


  • 1 month later...

Kind of a response to your comment in the other thread, Malaphax: don't go charging off to buy a ROG Swift monitor just yet. QC problems are creeping up more and more: flickering and halos after a few days, blinking/black screens when setting 144Hz, G-sync turning on and off, etc. I would let some of this die down or get resolved before considering making that investment.

 

I'm still thinking I might buy a cheap 144Hz monitor and wait until more G-sync options are available.


I want to get more info about this Acer model before I make any purchases:

http://www.tomshardware.com/news/acer-g-sync-xb270habprz,27496.html

 

I'm not planning on buying a G-sync monitor right now, or even technically anytime soon. I would want to see multiple reviews before making a leap like that. I will say that getting a different display is pretty high on my list, because it's almost completely independent of everything else computer-related.

 

I'm not sure I would go your route of buying a cheap 144Hz monitor right now.

