Yeah, it depends on how the GPU is being utilized. For gaming, the general trend is that you'll see more relative slowdown as you move from GPU-limited to CPU-limited frame rates — in other words, as resolution and quality settings go down and frame rates go up.
If you want the equivalent of a GPU's native performance (i.e., what it would do seated in a PCIe 3.0 x16 slot in an otherwise identically specced PC), the rule of thumb is that you need to go up one "class" of GPU. For example, my GTX 1080 as an eGPU does almost exactly what a GTX 1070 in a desktop can do at 3440x1440 in Rise of the Tomb Raider at Ultra settings (49.56 vs. 49.7 FPS).
So, if you want:
Native GTX 1080 Ti performance from an eGPU: use an RTX 2080 Ti. This is your best shot at 4K @ 60 FPS ultra settings in modern games.
Native GTX 1080 performance from an eGPU: use an RTX 2080 or GTX 1080 Ti. This should be good for ultrawide 3440x1440 @ 60+ FPS ultra settings in modern games, and is slightly overkill for 2560x1080 or even QHD (2560x1440).
Native GTX 1070 performance from an eGPU: use a GTX 1080, RTX 2070, or Vega 64. This should be good for 1080p @ 60+ FPS ultra settings in modern games.
Native GTX 1060 performance from an eGPU: use a GTX 1070 or Vega 56. This should be fine for 1080p at high settings.
A Radeon RX 590 will be fine for gaming at medium settings @ 1080p.
These are of course just guidelines. Performance will vary between individual titles and between different GPU models.
These guidelines do not apply if you're using the eGPU for something else like OpenCL/CUDA.
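The "go up one class" rule above can be sketched as a simple lookup. This is just the card pairings from my list turned into code — the dictionary and helper name are my own, and the pairings are rough guidelines, not benchmark data:

```python
# Rough "go up one class" rule of thumb for Thunderbolt eGPUs:
# to match a given card's native (desktop, PCIe 3.0 x16) performance,
# use one of the listed stronger cards in the enclosure.
# Pairings taken from the list above; treat them as guidelines only.
EGPU_CLASS_UP = {
    "GTX 1080 Ti": ["RTX 2080 Ti"],
    "GTX 1080": ["RTX 2080", "GTX 1080 Ti"],
    "GTX 1070": ["GTX 1080", "RTX 2070", "Vega 64"],
    "GTX 1060": ["GTX 1070", "Vega 56"],
}

def egpu_candidates(native_target):
    """Return cards expected to deliver roughly `native_target`'s
    desktop-level performance when run as an eGPU."""
    return EGPU_CLASS_UP.get(native_target, [])
```

So, for example, `egpu_candidates("GTX 1070")` suggests a GTX 1080, RTX 2070, or Vega 64 in the enclosure if you want desktop-1070-level frame rates.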
I'm pretty happy with my 4790K/EVGA 980 Ti in flight simulation at 1080p — it's pretty close in performance to a 1070 in my PC — but it sounds like I might be less happy with it as an eGPU. I was thinking about moving it to an enclosure, but maybe I'd be better off selling the rig I have and buying a stronger GPU instead.