Hi. At present there are two main families of quantum algorithms: the quantum-gate (circuit) approach and the quantum-annealing approach. Classical algorithms such as support vector machines can be implemented with either quantum method, but they run with roughly the same efficiency as their classical counterparts, so the overall gain is small. More sophisticated machine learning models, such as convolutional neural networks for image processing and transformers for language, are a much more promising area for improving the efficiency of hybrid quantum-classical methods. For the gate approach one needs access to a computer with many qubits, especially for language models, so we will have to wait for that hardware. But the real question is what exactly a quantum device should learn inside a neural network. The bottleneck of neural networks is parameter optimization, which still relies on classical approaches because qubit measurements return only 0 or 1. Once we understand more deeply how to optimize continuous parameters, we will be able to apply quantum algorithms more effectively.
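To make the hybrid loop concrete, here is a minimal sketch of the variational-circuit idea: a quantum circuit holds a continuous parameter (a rotation angle), measurements yield only 0/1 statistics, and a classical optimizer updates the angle using the parameter-shift rule. Everything is simulated with one qubit in plain NumPy; the function names and hyperparameters are illustrative, and a real workload would use a framework such as PennyLane or Qiskit.

```python
import numpy as np

def expectation_z(theta: float) -> float:
    """<Z> after applying RY(theta) to |0>.

    The state is [cos(theta/2), sin(theta/2)], so
    <Z> = P(measure 0) - P(measure 1) = cos(theta).
    On real hardware this would be estimated from many 0/1 shots.
    """
    amp0 = np.cos(theta / 2)  # amplitude of |0>
    amp1 = np.sin(theta / 2)  # amplitude of |1>
    return amp0**2 - amp1**2

def parameter_shift_grad(theta: float) -> float:
    """Parameter-shift rule: the exact gradient of <Z> with respect to
    theta from two extra circuit evaluations, no backpropagation needed."""
    plus = expectation_z(theta + np.pi / 2)
    minus = expectation_z(theta - np.pi / 2)
    return 0.5 * (plus - minus)

# Classical gradient-descent loop driving the quantum parameter
# (illustrative starting point and learning rate).
theta = 0.1
learning_rate = 0.4
for _ in range(100):
    theta -= learning_rate * parameter_shift_grad(theta)

# Minimizing <Z> = cos(theta) drives theta toward pi, i.e. the state |1>.
print(round(expectation_z(theta), 3))  # → -1.0
```

The point of the sketch is the division of labor the answer describes: the quantum side only ever produces measurement statistics, while the continuous parameter lives on, and is updated by, the classical side.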
Adding to Andrey Fionov's great answer: at some point there will be a natural limit to how small a transistor can be made for classical computer components like GPUs. When that happens, alternative hardware will be needed if we want to keep increasing ML model sizes (parameter counts), representing them in physical memory, and computing with them. At the moment, quantum computing looks like the most promising alternative to classical silicon computers that could one day reach that capacity.