PC Perspective met with AMD at CES 2015 and got to see a demonstration of the company's FreeSync technology. One of the new things the site learned is that a minimum frequency has to be taken into account, as the monitor needs to maintain a certain refresh rate to avoid artifacts and flickering; for the LG 34UM67, for instance, this is 40 Hz.
The explanation below implies that for FreeSync to work as intended, the frame rate needs to stay above 40 FPS at all times, and that you need to cap the frame rate so the maximum refresh rate of your screen isn't exceeded.
What happens below and above that window differs from what NVIDIA has chosen to do. With FreeSync (and the Adaptive Sync standard as a whole), when a game renders at a frame rate above or below this VRR window, the V-Sync setting is enforced. That means on a 60 Hz panel, if your game runs at 70 FPS, you have the option to enable or disable V-Sync: you can either force a 60 FPS cap or allow 70 FPS with screen tearing. If your game drops below the 40 Hz bottom limit, say to 30 FPS, you get the same choice: V-Sync on or V-Sync off. With it off you get tearing but optimal input/display latency; with it on you reintroduce frame judder when you cross between V-Sync steps.
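The behavior described above can be summarized as a simple decision: inside the VRR window the refresh rate tracks the frame rate; outside it, the V-Sync setting takes over. Here is a minimal sketch of that logic, assuming the article's example of a 40-60 Hz window (the function name and output strings are illustrative, not from any actual driver API):

```python
# Illustrative sketch of the Adaptive Sync behavior described in the article.
# The 40-60 Hz window matches the LG 34UM67 / 60 Hz panel example;
# the function and its output strings are hypothetical.

VRR_MIN_HZ = 40  # lower bound of the panel's VRR window
VRR_MAX_HZ = 60  # upper bound (panel's maximum refresh rate)

def display_behavior(fps: float, vsync_enabled: bool) -> str:
    """Return how the panel handles a given render rate."""
    if VRR_MIN_HZ <= fps <= VRR_MAX_HZ:
        # Inside the VRR window: refresh rate follows the frame rate.
        return f"variable refresh at {fps:.0f} Hz (no tearing, no judder)"
    if vsync_enabled:
        if fps > VRR_MAX_HZ:
            # Above the window with V-Sync on: frame rate is capped.
            return f"capped to {VRR_MAX_HZ} FPS (V-Sync on)"
        # Below the window with V-Sync on: judder between V-Sync steps.
        return "quantized to a V-Sync step (judder possible)"
    # Outside the window with V-Sync off: tearing, but lowest latency.
    return f"{fps:.0f} FPS with tearing (lowest latency)"

print(display_behavior(50, True))   # inside the window
print(display_behavior(70, True))   # above the window, V-Sync on
print(display_behavior(70, False))  # above the window, V-Sync off
print(display_behavior(30, True))   # below the window, V-Sync on
```

This is only a model of the tradeoff the article describes, not how a display driver is actually implemented.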