Recently nVidia held a two-day press event, during which they announced several new technologies for PC gaming and one new graphics card. In other words, they did roughly what a lot of people probably expected (in a general sense, as one could not necessarily predict the specifics of the announcements). At least I was not surprised by any of it. AMD is in every next-gen console, so naturally nVidia is going to want to focus on PC gaming. To that end they announced an improved PhysX system, a new lighting system, a new piece of monitor technology, and a new graphics card. Let's talk about two of those.
First up, the new GPU, because that's easy. It is named the GTX 780 Ti, which implies it is an improved version of the current GTX 780. As it is not labelled the GTX Titan Ti or something similar, the implication would appear to be that it will fall between the 780 and the Titan, hopefully in both ability and price. (Or better yet, it launches at the price of the 780 and everything below it gets a price cut. That probably won't happen, but it would be better, wouldn't it?) Honestly, I'm not sure I have any real thoughts on this hardware. I mean, it's just an improved 780, right? Sure, that will be impressive, but it is not an 800 series card. I welcome it to the family, but it is still a big brother to something we already have, not the next generation.
The new piece of monitor technology is called G-Sync, because nVidia likes 'G.' The idea is definitely intriguing, but I see some long-term flaws with it. The idea behind G-Sync is that it gives a monitor a variable refresh rate, instead of the constant refresh rate that is the current standard, and varies it to match the output of the GPU. Think of it like this: the monitor and GPU are like two people walking side by side. If they are in step with each other, taking their steps in time, then everything is great. That does not always happen though (for many reasons): the GPU changes the pace of its steps, while the monitor's pace never changes. This means the GPU has to compensate for either falling behind or getting ahead of the monitor, and that compensation causes ugly artifacts like tearing and stutter. What G-Sync will do is let the monitor know the GPU's steps and stay in step with it. The result should be a much smoother experience for everyone (and a drop in input lag, but we don't need to go into that).
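To make the stepping analogy concrete, here is a minimal sketch of the timing involved. This is just a toy model in Python with made-up frame times, my own illustration rather than anything from nVidia: one function snaps each frame to a fixed 60 Hz refresh grid (what a normal monitor does with V-Sync), and the other shows a frame the moment it is ready (the G-Sync idea).

```python
# Toy model of display timing (my illustration, not nVidia's implementation).
# All numbers are made up; the point is the shape of the output, not the values.

import random

REFRESH_MS = 1000 / 60  # a standard fixed 60 Hz monitor refreshes every ~16.7 ms

def gpu_frame_times(n, seed=1):
    """Simulated GPU render times: hovering around 60 fps, but uneven."""
    rng = random.Random(seed)
    return [rng.uniform(12.0, 22.0) for _ in range(n)]

def fixed_refresh(frame_times):
    """Fixed-rate monitor with V-Sync: a finished frame waits for the next
    refresh, so on-screen pacing snaps to multiples of REFRESH_MS.
    (Simplified: assumes the GPU never blocks waiting on buffers.)"""
    shown, done, next_scan = [], 0.0, 0.0
    for ft in frame_times:
        done += ft                   # when the GPU finishes this frame
        while next_scan < done:      # first refresh at/after completion
            next_scan += REFRESH_MS
        shown.append(next_scan)
        next_scan += REFRESH_MS      # that refresh is now used up
    return shown

def variable_refresh(frame_times):
    """G-Sync-style monitor: it refreshes when the frame is ready,
    so on-screen pacing equals GPU pacing."""
    shown, t = [], 0.0
    for ft in frame_times:
        t += ft
        shown.append(t)
    return shown

def gaps(times):
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

times = gpu_frame_times(8)
print("GPU render times  :", [round(t, 1) for t in times])
print("gaps, fixed 60 Hz :", gaps(fixed_refresh(times)))
print("gaps, variable    :", gaps(variable_refresh(times)))
```

Run it and every gap in the fixed-refresh row comes out as a multiple of 16.7 ms, so a frame that takes, say, 18 ms to render gets held on screen for a full 33.3 ms, while the variable-refresh row simply mirrors the render times. That pacing difference is the stutter G-Sync is meant to remove.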
This technology seems to be a natural evolution of the FCAT technology nVidia developed some time ago, but only released in the past few months. The natural evolution of G-Sync though, I believe, is that it will become a standard, or that a similar standard will be developed and integrated into all monitors and the DisplayPort and/or HDMI specs. At that point G-Sync will be obsolete, as it requires special hardware in special monitors and is only offered through partnerships with nVidia. Of course that will take time, potentially years, and during that time nVidia will be able to claim some increased revenue, plus one other thing that I am sure they will value. G-Sync's potential is only going to be realized on high-refresh-rate monitors, which means it could encourage increased development and more purchases of those monitors. Now that is something nVidia marketing would love to claim responsibility for (any marketing wing of a company would).
Anyway, that's some of what was announced, and my take on it. You can visit nVidia's site if you want more info.