The answer depends on your personal 'appetite' for developing your own ML applications. Although there are some renowned frameworks available, a significant number of people use Python to build their own.
Python has a rich ecosystem of 'packages' which you can import into your code where necessary, comparable to the Toolboxes in Matlab. There are two foundation-level Python packages you may want to explore:
TensorFlow
Theano
By foundation level I mean that these are 'all inclusive' and provide all the functionality you need. Together with this high degree of flexibility, however, comes additional complexity which you may not need. In that case you can import 'Keras', which rides on top of the two packages mentioned above but simplifies their use significantly.
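To give a feel for that difference, here is a minimal sketch of a tiny binary classifier written against the Keras Sequential API. The layer sizes and the random placeholder data are assumptions for illustration only, not a recommended setup:

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Placeholder data: 256 samples with 100 features and a binary label.
X = np.random.random((256, 100))
y = np.random.randint(2, size=(256, 1))

# A two-layer network declared in a few lines; Keras builds the
# underlying TensorFlow (or Theano) graph for you.
model = Sequential()
model.add(Dense(32, activation='relu', input_dim=100))
model.add(Dense(1, activation='sigmoid'))
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

model.fit(X, y, epochs=5, batch_size=32)
```

Writing the same model directly in TensorFlow or Theano would mean defining the variables, the graph operations and the training loop yourself, which is exactly the extra complexity that Keras hides.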
I would suggest TensorFlow, because it is backed by Google and there is a large community and many groups supporting it. Theano has officially announced the end of its development and support.
That depends on what you are mining for. Pre-processing is one of the most important data mining steps, and it looks very different for text analysis than for multimedia mining. The ML part itself is domain independent, so in my experience it doesn't really matter that much whether you use TensorFlow or WEKA; they are all excellent ML toolkits. Pre-processing, however, is where different toolkits often offer different capabilities. For text analysis LingPipe worked well for me; for speech or face recognition TensorFlow might be a better choice.
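As a small illustration of what text pre-processing can involve before any ML toolkit comes into play, here is a sketch that turns a handful of made-up documents into TF-IDF feature vectors. I am using scikit-learn here purely as an example; it is not one of the toolkits mentioned above:

```python
from sklearn.feature_extraction.text import TfidfVectorizer

# Made-up example documents standing in for a real text corpus.
docs = [
    "Data mining extracts patterns from large data sets.",
    "Speech recognition maps audio signals to words.",
    "Face recognition identifies people in images.",
]

# Typical text pre-processing: lowercasing, tokenising, dropping English
# stop words, and weighting terms by TF-IDF.
vectorizer = TfidfVectorizer(lowercase=True, stop_words="english")
X = vectorizer.fit_transform(docs)

print(X.shape)                      # (number of documents, vocabulary size)
print(len(vectorizer.vocabulary_))  # how many distinct terms survived
```

A speech or image pipeline would replace this step with something entirely different (computing spectrograms, resizing and normalising pixels, and so on), which is why the pre-processing support often matters more than the choice of ML toolkit itself.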