No, because that doesn't make sense. The whole reason one uses multilayer neural networks is that one is interested in associations that can't be represented by a single-layer or two-layer network.
I think what you're trying to do falls under the term "pruning". Note, though, that reducing the network depth from 12 layers to 2 would most likely make it almost unusable for any non-trivial task.
Compressing a deep, multi-layer neural network into a shallower one generally lowers the accuracy of your predictive model. But if you still want to go ahead, try "Pruning", "Weight Sharing", or "Neural Architecture Search" techniques. You can perform pruning using the TensorFlow Model Optimization library for Python.
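To make "pruning" concrete, here is a minimal sketch of unstructured magnitude pruning in plain NumPy: zero out the fraction of weights with the smallest absolute values. (The TensorFlow Model Optimization library wraps this idea for Keras models via `tfmot.sparsity.keras.prune_low_magnitude`; the function and sizes below are just toy illustrations, not that library's internals.)

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.5):
    """Zero out the `sparsity` fraction of weights with the smallest magnitude."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)       # number of entries to prune
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    mask = np.abs(weights) > threshold  # keep only weights above the cutoff
    return weights * mask

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))             # toy weight matrix
pruned = magnitude_prune(w, sparsity=0.5)
print(np.mean(pruned == 0))             # fraction of zeroed weights (0.5 here)
```

Note that pruning makes individual layers sparser; it does not by itself reduce the network's depth, which is why it only partially addresses the original question.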
I am working on a similar problem. I want to remove layers because I want to analyze the model theoretically. Although compressing the model into an exactly equivalent two-layer neural network is impossible, is it feasible to reduce the number of layers and increase the width to approximate the original model?
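Approximating a deep model with a wider shallow one is essentially function approximation (distillation). A minimal NumPy sketch of the idea, under toy assumptions: a hypothetical fixed 4-layer tanh "teacher" generates targets, and a wide two-layer "student" with random hidden features fits its output layer by least squares (a simplification of full distillation training).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical fixed "deep" teacher: a 4-layer tanh MLP, 1 input -> 1 output.
teacher_ws = [rng.normal(size=(8, 1)), rng.normal(size=(8, 8)),
              rng.normal(size=(8, 8)), rng.normal(size=(1, 8))]

def teacher(x):                        # x has shape (1, n_samples)
    h = x
    for W in teacher_ws[:-1]:
        h = np.tanh(W @ h)
    return teacher_ws[-1] @ h          # linear output layer

# Wide two-layer student: fixed random hidden layer, trained linear readout.
width = 256                            # much wider than the teacher's layers
W1 = rng.normal(size=(width, 1))
b1 = rng.normal(size=(width, 1))

X = rng.uniform(-2.0, 2.0, size=(1, 512))   # sample the input domain
Y = teacher(X)                               # targets come from the deep model

H = np.tanh(W1 @ X + b1)                     # hidden activations, (width, n)
# Solve for the output layer so the student mimics the teacher on these inputs.
W2 = np.linalg.lstsq(H.T, Y.T, rcond=None)[0].T

mse = float(np.mean((W2 @ H - Y) ** 2))
print(mse)  # small relative to the variance of the teacher's outputs
```

This illustrates the trade-off in the question: the wide shallow net can approximate the deep one closely on the sampled input region, but the match is approximate and holds only over the domain you trained on, not as an exact layer-for-layer equivalence.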