He’s making a point about a recent trend toward a new kind of software that is somewhat broader than what Deep Learning typically refers to (“assembling networks of parameterized functional blocks and by training them from examples using some form of gradient-based optimization”), though it certainly still includes Deep Learning.

He may be right about that, but he’s certainly not saying that Deep Learning itself is actually dead.
