I wanted to ask this on Quora, but they no longer allow you to enter background/elaboration on a question.

I'm looking to build or purchase a workstation to use from home for research/data analysis (and probably some recreation, but anything that can handle my work needs will run a game or two with ease). The most demanding applications it will be used for are image processing (specifically, 3D- and/or Z-projections of high-res, 16-bit Z-stacks, plus deconvolution) and deep learning (training DNNs and GANs). I'm looking at NVIDIA graphics cards, since it's much easier to push parallel-processing tasks to the GPU with something like CUDA than to jury-rig a workaround for an AMD card. Specifically, I'm trying to decide between a GTX and an RTX series card.
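
For concreteness, here's a minimal sketch of the kind of CUDA-backed workload I mean (in PyTorch, which I'd likely use; the volume shape and the single 3D convolution are just placeholders standing in for a projection/deconvolution step):

```python
import torch
import torch.nn as nn

# Use the GPU if CUDA is available, otherwise fall back to CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Stand-in for one 16-bit Z-stack: batch=1, channel=1, 64 slices of 512x512,
# cast to float32 for the convolution (the shape is made up).
volume = torch.randn(1, 1, 64, 512, 512, device=device)

# A single 3D convolution as a placeholder for a projection/deconvolution step.
conv = nn.Conv3d(in_channels=1, out_channels=8, kernel_size=3, padding=1).to(device)

with torch.no_grad():
    out = conv(volume)

print(out.shape, out.device)
```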

I know the primary difference is that RTX cards can do ray-tracing, while GTX cannot -- but it's not immediately apparent to me whether I should care, given that I won't be using the GPU for its ability to render realistic real-time action scenes.

Given that the GTX line is considerably more affordable, I'd like to know whether the RTX equivalents outperform their GTX counterparts in tasks like neural-network training by enough to justify the price difference, or whether I should bite the bullet and spring for a Quadro (though I sincerely hope that won't be the case).
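
For what it's worth, the training workload would look roughly like the mixed-precision step below (again a PyTorch sketch with a placeholder model and random data), since I gather this kind of FP16 math is what the RTX cards' Tensor Cores are meant to accelerate:

```python
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Placeholder model, optimizer, and data; the real networks would be DNNs/GANs.
model = nn.Sequential(nn.Linear(1024, 256), nn.ReLU(), nn.Linear(256, 10)).to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
scaler = torch.cuda.amp.GradScaler(enabled=(device.type == "cuda"))

inputs = torch.randn(32, 1024, device=device)
targets = torch.randint(0, 10, (32,), device=device)

# One mixed-precision training step: FP16 math where safe, FP32 elsewhere.
optimizer.zero_grad()
with torch.cuda.amp.autocast(enabled=(device.type == "cuda")):
    loss = loss_fn(model(inputs), targets)
scaler.scale(loss).backward()
scaler.step(optimizer)
scaler.update()
print(loss.item())
```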
