Video games are getting more realistic, but better graphics alone won’t make them better: faster syncing is also part of the formula.

The next generation of gaming might not come from consoles alone, as PCs get smaller and smaller and video cards improve. In fact, better video cards won’t just lend themselves to more impressive graphics, but to game play that doesn’t disconnect you from the action.

One way this disconnection occurs is when a game appears to “tear”, which is another way of saying the graphics load into place in blocks, as if the image is refreshing on the screen in torn strips.

Tearing is one of those things that’s hard to avoid at the moment. Even if you have a next-gen console or a high-end PC, at some point the graphics will tear, with blocks of the image falling into place almost as if the scene is loading piece by piece.

It’s quick, and most won’t notice it, especially if they’re already immersed in frenetic gaming, but it generally comes from the monitor not being synchronised with what’s happening on the graphics card. There’s more to it than that, but synchronisation is a big issue in gaming, and you’ll even see it with sound, with some video games asking you to synchronise audio by delaying it by a number of milliseconds.

Video tearing is a little like this, and is one of the problems graphics card and monitor makers are only now beginning to solve.
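To see why a mismatch between refresh rate and frame rate causes tears, consider a toy simulation (illustrative only, not real driver code, with made-up rates): a fixed-refresh monitor redraws the screen on its own fixed schedule, so any frame that finishes while the screen is mid-redraw splits the image in two.

```python
# Toy model of tearing on a fixed-refresh monitor (illustrative only).
# The panel redraws every 1/60 s no matter when the GPU finishes a
# frame; a frame that lands mid-scanout splits the screen in two.
from fractions import Fraction

SCANOUT = Fraction(1, 60)         # fixed 60 Hz refresh
GPU_FRAME_TIME = Fraction(1, 47)  # GPU rendering at an uneven 47 fps

def count_tears(duration=Fraction(1)):
    tears = 0
    frame_ready = GPU_FRAME_TIME
    refresh_start = Fraction(0)
    while refresh_start < duration:
        refresh_end = refresh_start + SCANOUT
        # a new frame arrived while the screen was mid-scanout: tear
        if refresh_start < frame_ready < refresh_end:
            tears += 1
        # skip ahead to the first frame finishing after this refresh
        while frame_ready < refresh_end:
            frame_ready += GPU_FRAME_TIME
        refresh_start = refresh_end
    return tears

print(count_tears())  # 46 of the 47 frames in one second tear
```

Because 47 frames per second never lines up with 60 redraws per second, almost every frame in this model arrives mid-scanout, which is the splitting you see on screen.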

Blink and you’ll miss it: see the line in the image? That’s tearing, as the screen tries to keep the imagery in sync.

One solution comes from AMD and is called “FreeSync”, which relies on a graphics card and DisplayPort to get the monitor and the graphics card talking to each other the way they should be.

Tested this week, the solution has two parts: a compatible AMD graphics chip — generally the R7 and R9 series of graphics cards — and a compatible monitor with a DisplayPort.

Both of these are necessary: the graphics card kicks the technology into gear, while the monitor needs FreeSync support loaded onto the display, allowing the two devices to talk to each other over DisplayPort and keep everything synchronised.
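Conceptually, that handshake can be sketched as a toy model (illustrative only, not AMD’s actual implementation, with a made-up 40–75Hz panel range): the monitor holds off its next refresh until the graphics card signals a frame is ready, as long as the wait stays within what the panel can physically do.

```python
# Toy model of adaptive sync (illustrative only, not AMD's code).
# The monitor refreshes when a frame is ready, clamped to the range
# a hypothetical panel supports (40-75 Hz here).
from fractions import Fraction

MIN_INTERVAL = Fraction(1, 75)  # fastest the panel can refresh
MAX_INTERVAL = Fraction(1, 40)  # slowest the panel can refresh

def next_refresh(last_refresh, frame_ready):
    """Schedule the next refresh for when the frame is ready,
    clamped to what the panel can physically do."""
    wait = frame_ready - last_refresh
    wait = max(MIN_INTERVAL, min(wait, MAX_INTERVAL))
    return last_refresh + wait

# A 47 fps frame sits inside the 40-75 Hz window, so the refresh
# lines up with it exactly: no buffer swap happens mid-scanout.
print(next_refresh(Fraction(0), Fraction(1, 47)) == Fraction(1, 47))
```

Because each refresh now starts only when a whole frame is available (whenever the frame rate sits inside the panel’s range), the swap never lands mid-scanout, which is what eliminates the tear.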

The Dell on the left lacks FreeSync, while the LG on the right includes it. Notice how the monitor on the left cuts up the laser fire from the ship because it lacks the technology.

Once the two are working together, however, games don’t tear and graphics stay clear, with gamers able to wander a world and play titles without the screen splitting up as images tear into and out of place, something we saw this week when we demoed a recent Dell monitor that wasn’t FreeSync enabled alongside an LG monitor that was.

The monitor is a bit of a catch, though: not only does it have to support AMD’s FreeSync technology, but it must also have a DisplayPort, communicating with the PC or console over DisplayPort instead of HDMI.

That means the current “next-gen” consoles miss out on this technology, even though both run on the right sort of AMD graphics, because the Microsoft Xbox One and the Sony PlayStation 4 use HDMI, not DisplayPort.
