For really computationally heavy problems, it's clearly either Fortran or C: they are by far the fastest languages in common scientific use, and they parallelize well on HPC clusters via MPI.
If you are more interested in prototyping and testing various approaches, Python plus its package ecosystem is very useful, since you can write code much faster.
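To illustrate the prototyping point (a minimal sketch; NumPy is just one example of the packages you might lean on): a dense linear solve that would require hand-written LU factorization code in Fortran or C is a single library call in Python.

```python
import numpy as np

# Prototype: solve A x = b with one library call instead of
# hand-writing an LU decomposition as you would in Fortran/C.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 100))
b = rng.standard_normal(100)

x = np.linalg.solve(A, b)

# Quick correctness check before porting any hot loops
# to a compiled language.
assert np.allclose(A @ x, b)
```

Once the approach is validated like this, only the performance-critical kernels need to be rewritten in a compiled language.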
Another option would be Julia, which tries to combine Python-like simplicity of syntax with the speed of Fortran/C. However, I am not sure about its flexibility and package support.
Otherwise, I would recommend taking your cue from your peers in the field: if 90% of them are using Fortran, then going down that path is probably a wise choice.
I agree with Pesantez Jorge. During my thesis I worked mostly in Fortran, but when I learned other languages I realized that Python, for example, offers a wealth of packages and built-in functions.
However, if you really want to program everything in detail yourself, go old school: Fortran.
I had only a very brief look at it, but basically it is just an opinion, one that fits well with the emerging fields of computational science...
I can give you my opinion, and it will tie into the article.
Languages such as R, Python, and MATLAB have made it increasingly easy to write code and have therefore attracted a much wider audience. This also means that the average researcher is less familiar with old-school programming languages such as Fortran and C.
Since your average researcher is less likely to have the skills to work comfortably with libraries such as MPI to parallelize code, other options have to be explored.
Spark is one such solution: it gives you easy access to distributed computing and is very fast for prototyping, which ties into the appeal of Python/R/MATLAB. For the majority, it will therefore be the preferred choice.
This is probably also the reason the author of the article is pushing towards Spark or Chapel: they put distributed computing within reach of a wider audience than the more complex tools do. Whether the resulting performance is actually good is another question.
I am pretty sure you will get much better performance from Fortran/C with MPI and OpenMP than from Spark, but you will need a lot longer to write the code. So use whatever is better suited to your problem.
Today Python is widely used as a platform for testing numerical methods and analytic simulation, but C++ is still a powerful tool for experts. So if you consider yourself an expert, C++ is the way to go.
Thanks everyone for the answers to this topic, it's been illuminating. I want to ask a tangent question specifically towards Spark and HPC C/C++ programs. Spark is by default highly fault-tolerant by design, whereas I've heard horror stories from C/C++ developers on HPC machines about hardware faults requiring complete re-runs and how difficult it can be to program around these potential issues. Is this a pertinent issue?
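For context on the C/C++ side of that question: the usual mitigation there is application-level checkpoint/restart, i.e. periodically writing the simulation state to disk so that a hardware fault costs you at most one checkpoint interval rather than the whole run. A minimal sketch of the idea (in Python for brevity, with a hypothetical file name; real HPC codes typically write binary or HDF5 checkpoints and handle atomic renames):

```python
import os
import pickle

CHECKPOINT = "state.pkl"  # hypothetical checkpoint file name

def run(n_steps, checkpoint_every=100):
    # Resume from the last checkpoint if one exists;
    # otherwise start from scratch.
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT, "rb") as f:
            step, total = pickle.load(f)
    else:
        step, total = 0, 0.0

    while step < n_steps:
        total += step * 0.5   # stand-in for one expensive simulation step
        step += 1
        if step % checkpoint_every == 0:
            # Persist the full state; a crash now loses at most
            # checkpoint_every steps of work.
            with open(CHECKPOINT, "wb") as f:
                pickle.dump((step, total), f)
    return total
```

If the job dies mid-run, simply resubmitting it resumes from the last checkpoint and repeats at most `checkpoint_every` steps. Spark, by contrast, gets its resilience automatically by recomputing lost partitions from the recorded lineage, which is exactly the convenience-versus-control trade-off discussed above.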