I have been a great fan of MATLAB for a long time, and I certainly recommend it. However, the right tool depends on the application you have in mind. Sometimes you can find very nice R packages for a specific task. I suggest you specify your field of interest so that you can get more informative responses from fellow researchers.
I would undoubtedly go with MATLAB. Not so much for its capabilities in this area, which are much the same as R's (though I would claim MATLAB is easier to read than R), but because of its surrounding infrastructure, which means you can deliver your results in a properly engineered application. You might like to have a look at David Barber's book and accompanying MATLAB package, Bayesian Reasoning and Machine Learning.
It depends on your preference and your previous experience with the software.
If you're using R, many of its features are oriented toward statistics rather than ML, though there are plenty of references and forums that discuss and explore ML algorithms further. Basically, both programming and statistics knowledge are needed.
MATLAB, on the other hand, is an easy-to-use tool that covers both ML and statistics, though it is not as statistics-oriented as R. If you don't want to build everything from scratch just to demonstrate your algorithm, MATLAB is the easiest and most convenient route, with many feature-rich toolboxes for ML.
MATLAB is very good because it contains many toolboxes. When you use MATLAB for machine learning, you often need image processing, statistics, linear algebra, computer vision, etc., and these are all found in one place in MATLAB. Also, MATLAB is very simple, and it is easy to get research results in little time.
If you intend to *implement* algorithms that are going to scale to very large data sets, then none of these environments is the best choice. The best choice is to implement the algorithm in a language known for its efficiency in scientific computing, such as C/C++ (Java or Fortran would be good second choices; Python etc. come in as very distant third choices), and then provide bindings to MATLAB etc., so that people who cannot program in C/C++ can still use your implementation from within MATLAB, R, SAS, and so on. That's what the Shogun machine learning toolbox does, and it's what most "industrial-scale" libraries do.
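Just to make the "compiled core plus scripting-language bindings" idea concrete, here is a minimal sketch that calls a compiled C library from Python via ctypes; the C standard math library stands in for a hypothetical C/C++ ML core, and a Unix-like system is assumed.

```python
# Minimal sketch: a scripting language calling into compiled C code.
# Here libm (the C math library) plays the role of a hypothetical fast ML core.
import ctypes
import ctypes.util

libm_path = ctypes.util.find_library("m")   # locate the C math library (Unix-like systems)
libm = ctypes.CDLL(libm_path)

# Declare the C signature of cos() so ctypes converts arguments/results correctly.
libm.cos.argtypes = [ctypes.c_double]
libm.cos.restype = ctypes.c_double

print(libm.cos(0.0))   # 1.0, computed by compiled C code
```

Real projects would typically wrap the compiled core with something like SWIG, pybind11, MEX (for MATLAB), or Rcpp (for R) rather than raw ctypes, but the division of labour is the same: fast compiled internals, convenient scripting front end.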
If you are not good at C/C++, or you don't care about testing your implementation on large data sets, then you can choose any of the environments/languages you mentioned. Most engineers tend to be familiar with MATLAB; R and the other environments are not as popular in the engineering community. Statisticians, on the other hand, tend to be more fluent in R, SAS, and SPSS than in MATLAB.
Please, let me stand out from the crowd. Abandon all of them and embrace Python.
Python for scientific computing is getting more and more popular due to its simplicity, power, and gentle learning curve. If you know MATLAB, it will take you five minutes to start working with Python. Look at this website: http://www.scipy.org/. All the tools are free, even for commercial applications, and are widely adopted in industry. The most prestigious universities in the world contribute to its development and... well, have a look yourself. You will love it!
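To illustrate how directly MATLAB habits carry over, here is a tiny NumPy sketch; the array sizes and the least-squares problem are made up purely for illustration.

```python
# A MATLAB-flavoured snippet in Python/NumPy.
import numpy as np

A = np.random.rand(100, 3)              # like rand(100, 3) in MATLAB
b = A @ np.array([1.0, 2.0, 3.0])       # matrix-vector product, like A*x
w, *_ = np.linalg.lstsq(A, b, rcond=None)  # least squares, like A\b
print(w)                                # recovers roughly [1, 2, 3]
```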
Of the three I'd have to say MATLAB. I like R, but in terms of speed and ease of use, MATLAB is probably better for neural nets.
That being said, if all you need is a simple ANN, R (with packages like neuralnet) will do just fine.
And yes, there are other languages like Python, C, etc., but if you only need a simple backpropagation ANN for data that isn't that big, it won't make a difference; you can just stay in whatever environment you're comfortable with.
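For what it's worth, here is a bare-bones sketch of such a simple backpropagation ANN in Python/NumPy (one hidden layer of sigmoid units, trained on XOR); the layer size, learning rate, and iteration count are arbitrary choices for the toy example.

```python
# A minimal backpropagation network: one hidden layer, sigmoid units, squared loss.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))   # input  -> hidden
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))   # hidden -> output
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(10000):
    h = sigmoid(X @ W1 + b1)                 # forward pass
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)      # output-layer error term
    d_h = (d_out @ W2.T) * h * (1 - h)       # backpropagated hidden error
    W2 -= 0.5 * h.T @ d_out;  b2 -= 0.5 * d_out.sum(0, keepdims=True)
    W1 -= 0.5 * X.T @ d_h;    b1 -= 0.5 * d_h.sum(0, keepdims=True)

print(np.round(out, 2))   # typically close to [[0], [1], [1], [0]]
```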
Before implementing algorithms yourself, first consider ones that are already implemented. In Python there is a nice module, scikit-learn (see http://scikit-learn.org/). I know Python is not on your list, but it stands out from those three as a general-purpose language (not only for statistics) with "batteries included" (many things are already implemented as packages).
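As a hedged example of that advice, here is roughly what using an off-the-shelf scikit-learn classifier looks like; the data set and model choice are arbitrary.

```python
# Fitting an already-implemented classifier instead of coding the algorithm yourself.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())   # mean 5-fold CV accuracy
```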
I have used all of these, but I prefer R, for a few reasons. It provides flexible interoperability with Java and Python. Its implementation and programming style are quite similar to MATLAB's. It also lets you share your results online in real time using tools such as Yhat and Slidy, as well as Shiny apps.
Python appears to be a popular choice, but of the options you listed in your question I personally recommend MATLAB. MATLAB works fine for me in my research when implementing EMO (evolutionary multi-objective optimization) algorithms and running experiments and statistical analysis.
However, when it comes to an industrial application or deployment of an EMO algorithm, I would advise against the use of MATLAB.
Also, to give a shout-out to jMetal (http://jmetal.sourceforge.net/): an object-oriented, Java-based framework for multi-objective optimization with metaheuristics.
Let me take a step back before saying which is best. As a software engineer, I would point out that if you can't present your results or push your machine learning models into production, all that work goes to waste without any actionable results.
Today, most of these tools (Python, R, SAS, and MATLAB) provide API interfaces so that your ML models can communicate with other programs and models. Next, consider the trade-off between open-source and paid software. Open source has far more contributors than paid software today, and contributors play an important role in ML algorithm development.
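As a hypothetical sketch of what such an interface can look like, here is a minimal Flask service that exposes a trained scikit-learn model over HTTP; the model, route name, and payload format are illustrative choices, not a standard.

```python
# A toy HTTP prediction API so other programs (in any language) can query a model.
from flask import Flask, jsonify, request
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

app = Flask(__name__)
X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000).fit(X, y)   # stand-in for a real model

@app.route("/predict", methods=["POST"])
def predict():
    features = request.get_json()["features"]         # e.g. [5.1, 3.5, 1.4, 0.2]
    return jsonify(prediction=int(model.predict([features])[0]))

if __name__ == "__main__":
    app.run(port=5000)
```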
Finally, R is a good statistical tool, Python is a good application tool, and MATLAB is a good signal-processing tool. That said, you can actually run Python scripts from R and R scripts from Python, and the same is true for most tools today.
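For example, here is a minimal sketch of running R code from Python via the rpy2 package (assuming both rpy2 and R are installed); the reticulate package plays the analogous role on the R side.

```python
# Run an R snippet from Python and pull the results back as plain Python values.
import rpy2.robjects as robjects

robjects.r('''
    x <- 1:10
    y <- 2 * x + rnorm(10)
    fit <- lm(y ~ x)
''')
coefs = list(robjects.r('coef(fit)'))
print(coefs)   # [intercept, slope] as Python floats
```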
If you are planning to learn a tool, learn one tool and excel at it.