It seems that using machine/deep learning to solve PDEs is very popular (not only in scientific computing, but across many fields). I want to understand the reasons behind this. And are the prospects promising?
There are many different PDEs. Some of these were solved long before the invention of the digital computer. While there may be some exciting new developments in this area, the two are not necessarily linked. Because your initial statement is not necessarily true, the question that follows is problematic or ill-posed. Perhaps you might ask if there are any important PDEs that defy analytical (or existing numerical) solution, but might be solved using advanced computational means not previously available?
Partial differential equations (PDEs) are indispensable for modeling many physical phenomena and are also commonly used for solving image processing tasks. In the latter area, PDE-based approaches interpret image data as discretizations of multivariate functions and the output of image processing algorithms as solutions to certain PDEs. Posing image processing problems in the infinite-dimensional setting provides powerful tools for their analysis and solution. Over the last few decades, the reinterpretation of classical image processing problems through the PDE lens has produced multiple celebrated approaches that benefit a vast range of tasks including image segmentation, denoising, registration, and reconstruction. In this paper, we establish a new PDE interpretation of a class of deep convolutional neural networks (CNN) that are commonly used to learn from speech, image, and video data. Our interpretation includes convolutional residual neural networks (ResNet), which are among the most promising approaches for tasks such as image classification and have improved the state-of-the-art performance in prestigious benchmark challenges. Despite their recent successes, deep ResNets still face critical challenges associated with their design, immense computational costs and memory requirements, and lack of understanding of their reasoning. Guided by well-established PDE theory, we derive three new ResNet architectures that fall into two new classes: parabolic and hyperbolic CNNs. We demonstrate how PDE theory can provide new insights and algorithms for deep learning and demonstrate the competitiveness of three new CNN architectures using numerical experiments.
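To make the ResNet–differential-equation connection in that abstract concrete, here is a minimal NumPy sketch (my own illustration, not the paper's code; the layer function and step size are arbitrary choices). A residual update x + h·f(x) has exactly the form of one forward Euler step of the ODE dx/dt = f(x), so stacking residual blocks resembles integrating the ODE over time:

```python
import numpy as np

def f(x, W):
    # A toy "layer": a linear map followed by a tanh nonlinearity.
    return np.tanh(W @ x)

def residual_block(x, W, h=0.1):
    # ResNet update: x_{k+1} = x_k + h * f(x_k), which is identical in
    # form to one forward Euler step of the ODE  dx/dt = f(x).
    return x + h * f(x, W)

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4)) * 0.1
x = rng.standard_normal(4)

# Stacking many residual blocks ~ integrating the ODE forward in time.
for _ in range(50):
    x = residual_block(x, W)
```

This is the viewpoint that lets PDE theory (e.g. stability of parabolic vs. hyperbolic equations) say something about the behavior of very deep networks.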
Differential equations are often important for a variety of machine learning approaches, mostly through comparisons with mathematical models from physics.
Differential equations are a crucial tool in physics for modelling a system's dynamics. A differential equation essentially ties the rate of change of one quantity to other quantities of the system (with many variations on this theme).
So you are naturally led to differential equations whenever you model the learner itself as a dynamical system (for example, a network of neural weights that evolve according to some update rule).
Unfortunately, solving a differential equation is not as easy as solving an (ordinary) algebraic equation. Rather, you need some theory about how different classes of differential equations give rise to different types of solutions.
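As a concrete instance of the "learner as a dynamical system" view, here is a small sketch (my own illustration; the quadratic loss and step size are arbitrary choices) treating plain gradient descent as the forward Euler discretization of the gradient-flow ODE dw/dt = -∇L(w):

```python
import numpy as np

def grad_L(w, A, b):
    # Gradient of the quadratic loss L(w) = 0.5 * ||A w - b||^2.
    return A.T @ (A @ w - b)

def gradient_descent(w, A, b, h=0.1, steps=500):
    # w_{k+1} = w_k - h * grad L(w_k): forward Euler applied to the
    # gradient-flow ODE  dw/dt = -grad L(w).
    for _ in range(steps):
        w = w - h * grad_L(w, A, b)
    return w

A = np.array([[2.0, 0.0], [0.0, 1.0]])
b = np.array([2.0, 3.0])
w = gradient_descent(np.zeros(2), A, b)
# The discrete flow converges to the minimizer w* with A w* = b,
# here w* = (1, 3), provided the step size h is small enough.
```

The "theory about types of solutions" matters even in this toy case: if the step size h is too large relative to the curvature of L (the eigenvalues of AᵀA), the discretization diverges even though the continuous flow always converges.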
There is also a large body of literature on numerical methods for differential equations, which can be quite overwhelming. Most examples I know come from computer vision, for example when the optical flow of an image is to be computed from image sequences. But differential equations can also pop up any time a learning algorithm is derived from an analogy with a physical system.