There are some innovative ideas in deep algorithms beyond the neural-network-based ones. In general, a combination of algorithms arranged in several steps (layers) that perform progressively more abstract tasks, such as feature extraction followed by classification or regression, may be considered deep. One example is deep support vector machines (DSVM): https://www.rug.nl/research/portal/files/19535038/DSVM_extended_abstract.pdf
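As a rough illustration of what a layered, non-NN model can look like, here is a minimal sketch (assuming scikit-learn) of a two-layer "stacked" SVM: the decision values of several first-layer SVMs act as learned features for a second-layer SVM. This is in the spirit of deep SVMs, not the exact DSVM algorithm from the linked paper.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# First layer: several SVMs trained on random feature subsets.
rng = np.random.RandomState(0)
subsets = [rng.choice(X.shape[1], size=8, replace=False) for _ in range(5)]
layer1 = [SVC(kernel="rbf").fit(X_tr[:, s], y_tr) for s in subsets]

# Their decision values become the "latent features" for the second layer.
Z_tr = np.column_stack([m.decision_function(X_tr[:, s]) for m, s in zip(layer1, subsets)])
Z_te = np.column_stack([m.decision_function(X_te[:, s]) for m, s in zip(layer1, subsets)])

# Second layer: an SVM trained on the first layer's outputs.
layer2 = SVC(kernel="rbf").fit(Z_tr, y_tr)
print("stacked-SVM accuracy:", layer2.score(Z_te, y_te))
```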
Metalearning is the study of principled methods that exploit metaknowledge to obtain efficient models and solutions by adapting machine learning and data mining processes. Metalearning monitors the automatic learning process itself, in the context of the learning problems it encounters, and tries to adapt its behaviour to perform better.
In deep learning the layers of a neural network are heterogeneous and tend to deviate widely from biologically informed connectionist models, for the sake of efficiency, trainability and understandability.
Then, by analogy with metalearning, it is possible to combine more than one model in a machine learning pipeline, that is, to "deepen the learning".
For example, I have used PCA to reduce the dimensionality of the data and genetic algorithms to improve the interpretation of the results by filtering the coefficients of the principal components, that is, to enhance the feature selection. I have also used an unsupervised NN to classify the data, followed by a wrapper technique for feature selection, as sketched below.
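Here is a minimal sketch of that kind of layered pipeline, assuming scikit-learn and using stand-ins for my actual components: KMeans takes the place of the unsupervised NN, SequentialFeatureSelector plays the role of the wrapper step, and the genetic-algorithm filtering of the PC coefficients is omitted for brevity.

```python
from sklearn.datasets import load_wine
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.neighbors import KNeighborsClassifier

X = StandardScaler().fit_transform(load_wine().data)

# Layer 1: project the data onto a handful of principal components.
Z = PCA(n_components=6, random_state=0).fit_transform(X)

# Layer 2: unsupervised "classification" of the reduced data (cluster labels).
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Z)

# Layer 3: wrapper feature selection, using the cluster labels as pseudo-targets
# to pick the components that best explain the discovered structure.
selector = SequentialFeatureSelector(
    KNeighborsClassifier(n_neighbors=5), n_features_to_select=3
).fit(Z, labels)
print("selected components:", selector.get_support(indices=True))
```

Each stage consumes the previous stage's output, which is the sense in which stacking heterogeneous models "deepens" the learning even without a deep neural network.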