RAM size does not directly affect deep learning performance. However, too little RAM can make it difficult to run your GPU code smoothly, without swapping to disk. You should have enough RAM to work with your GPU comfortably, which means having at least as much RAM as the memory of your largest GPU.
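As a minimal sketch of that rule of thumb (assuming PyTorch and psutil are installed; both library calls are standard, but the check itself is just an illustration, not part of the original text):

```python
import psutil
import torch

# Total system RAM in GB.
ram_gb = psutil.virtual_memory().total / 1e9

if torch.cuda.is_available():
    # Memory of the largest visible GPU in GB.
    largest_gpu_gb = max(
        torch.cuda.get_device_properties(i).total_memory / 1e9
        for i in range(torch.cuda.device_count())
    )
    if ram_gb < largest_gpu_gb:
        print(f"RAM ({ram_gb:.1f} GB) is below the largest GPU's memory "
              f"({largest_gpu_gb:.1f} GB); expect swapping under heavy workloads.")
    else:
        print(f"RAM ({ram_gb:.1f} GB) covers the largest GPU ({largest_gpu_gb:.1f} GB).")
else:
    print(f"No CUDA GPU detected; system RAM is {ram_gb:.1f} GB.")
```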
Memorization, on the other hand, is essentially overfitting: it refers to a model's failure to generalize to previously unseen data. The model has become too closely tailored to the data it is learning from. Memorization is more likely to occur in a DNN's deeper hidden layers.
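A minimal sketch of how memorization shows up in practice, using scikit-learn's MLPClassifier as a stand-in network and randomly generated labels (the dataset and network sizes here are illustrative assumptions): the model fits the training points almost perfectly, yet stays near chance level on held-out data, because it has memorized rather than generalized.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 20))      # random features
y = rng.integers(0, 2, size=600)    # random labels with no real signal

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# A network large enough to memorize the training points.
net = MLPClassifier(hidden_layer_sizes=(256, 256), max_iter=2000, random_state=0)
net.fit(X_train, y_train)

print(f"train accuracy: {net.score(X_train, y_train):.2f}")  # close to 1.0
print(f"test accuracy:  {net.score(X_test, y_test):.2f}")    # around 0.5 (chance)
```

A large gap between training and validation accuracy like this is the usual symptom to watch for when diagnosing memorization.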