Get access to the data via one node of the Earth System Grid Federation (ESGF); see for example http://www.dkrz.de/daten-en/IPCC-DDC_AR5. You may have to create an account before you can actually download the data. If a colleague at your institute has done this before, ask them, to save time.
Prepare a data selection mask (details below), so that only the values of the catchment are taken into account.
Process the data, and you are done.
If you get the impression that the global data of the CMIP5 simulations are too coarse for your question of interest, you may consider exploring the data from the WCRP CORDEX archive, where regional models of higher spatial resolution are driven by the typical CMIP5 scenarios. See for example http://cordex.org
Steps 2-3 in more detail
Anyhow, the general steps are similar regardless of the data set. I would first download a data set for one year, for instance, together with the information on where each grid point is located in terms of latitude and longitude. I would then construct a mask in which each model grid point lying entirely within your catchment gets the value one (1) and all model grid points outside the catchment get the value zero (0). The grid points that are only partly covered by the catchment can be treated in two ways. The simple one, for a first test, is as follows: if the fractional coverage is above 50%, use one, else zero. In a later version you may use the more advanced way of exploiting the actual fractional coverage, ranging from 0 (completely outside) to 1 (entirely inside) your catchment.
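To make the thresholding idea concrete, here is a minimal Python/numpy sketch. Everything in it is hypothetical: the rectangular test "catchment" stands in for a real outline (which you would read from a shapefile), and the fractional coverage of each cell is estimated simply by sub-sampling points inside the cell and counting how many fall inside the polygon.

```python
import numpy as np

def point_in_polygon(x, y, poly):
    """Even-odd ray-casting test; poly is a list of (x, y) vertices."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge crosses the horizontal line through y
            xcross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < xcross:
                inside = not inside
    return inside

def build_mask(lons, lats, catchment, threshold=0.5, nsub=4):
    """0/1 mask on a regular lon/lat grid (cell centres in lons/lats).

    For each cell, sample nsub x nsub sub-points and use the fraction of
    sub-points inside the catchment polygon as the fractional coverage;
    cells above the threshold get 1, the rest 0 (the simple 50% rule).
    """
    dlon = lons[1] - lons[0]
    dlat = lats[1] - lats[0]
    offs = (np.arange(nsub) + 0.5) / nsub - 0.5  # offsets within one cell
    mask = np.zeros((len(lats), len(lons)))
    for j, lat in enumerate(lats):
        for i, lon in enumerate(lons):
            hits = sum(point_in_polygon(lon + ox * dlon, lat + oy * dlat, catchment)
                       for ox in offs for oy in offs)
            if hits / nsub**2 > threshold:
                mask[j, i] = 1.0
    return mask

# hypothetical example: a 4 x 4 degree rectangular "catchment" on a 1-degree grid
catchment = [(10.0, 45.0), (14.0, 45.0), (14.0, 49.0), (10.0, 49.0)]
lons = np.arange(8.5, 16.0, 1.0)   # cell centres
lats = np.arange(43.5, 51.0, 1.0)
mask = build_mask(lons, lats, catchment)
```

For the more advanced variant you would store `hits / nsub**2` itself instead of thresholding it, giving a fractional mask between 0 and 1.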
Afterwards you may apply the mask to your test data set and inspect whether the results are plausible. Then you can download the entire data set covering your period of interest and repeat the steps for the whole period. For these operations you can either use your favorite tool or consider trying the CDOs (climate data operators: https://code.zmaw.de/projects/cdo/wiki/Cdo). They are very handy if you operate on netCDF output from common climate models. On common Linux distributions they can be easily installed; in Ubuntu, for example, the command would be: apt-get install cdo (at least as long as you have administrative rights; if not, ask your system administrator).
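Applying the mask then reduces to a weighted average over the masked grid points. The sketch below uses synthetic arrays in place of real model output (with a real file you would read the field via netCDF4 or xarray first); the cos(latitude) factor is the usual area weight on a regular lon/lat grid, and all sizes and values here are made up for illustration.

```python
import numpy as np

# stand-in for one time step of model output (e.g. precipitation in mm/day)
# on a hypothetical 8 x 8 regular grid
rng = np.random.default_rng(0)
precip = rng.uniform(0.0, 10.0, size=(8, 8))

# stand-in for the 0/1 catchment mask built in the previous step
mask = np.zeros((8, 8))
mask[2:6, 2:6] = 1.0

# area weights ~ cos(latitude) for a regular lon/lat grid
lats = np.arange(43.5, 51.0, 1.0)
weights = np.cos(np.deg2rad(lats))[:, None] * np.ones((1, 8))

# area-weighted catchment mean: only masked cells contribute
catchment_mean = (precip * mask * weights).sum() / (mask * weights).sum()
```

With a fractional (0 to 1) mask the same formula applies unchanged; the partial cells simply contribute proportionally to their coverage.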
Once you have done the job for one model and want to compare the results to another model, you have to redo all the steps above, since the grid probably differs between models. Here, scripting the entire process (csh, tcsh, bash, ksh, …) can save you a lot of work in the end. However, the construction of the mask and the plausibility tests should be done for each model before the script is run; otherwise the results might be misleading and a lot of working hours wasted without proper testing.
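The per-model loop can be sketched as follows. Here the model names, grid spacings, and data are invented purely to illustrate the structure; the key point is that the mask is rebuilt on each model's own grid (a trivial bounding-box mask stands in for the real mask construction) before the same averaging step is applied.

```python
import numpy as np

# hypothetical models with different grid spacings (degrees), to show why
# the mask cannot simply be reused between them
models = {"model_A": 1.0, "model_B": 2.5}

def box_mask(lons, lats, west, east, south, north):
    """1 where the cell centre lies inside the lon/lat box, else 0
    (a trivial stand-in for a proper catchment mask)."""
    in_lon = (lons >= west) & (lons <= east)
    in_lat = (lats >= south) & (lats <= north)
    return np.outer(in_lat, in_lon).astype(float)

results = {}
for name, dx in models.items():
    # each model brings its own cell centres ...
    lons = np.arange(0.0 + dx / 2, 20.0, dx)
    lats = np.arange(40.0 + dx / 2, 55.0, dx)
    # ... and its own output field (random numbers stand in for real data)
    rng = np.random.default_rng(1)
    data = rng.uniform(0.0, 10.0, (lats.size, lons.size))
    # rebuild the mask on this model's grid, then average over the catchment
    mask = box_mask(lons, lats, 10.0, 14.0, 45.0, 49.0)
    results[name] = (data * mask).sum() / mask.sum()
```

`results` then holds one catchment mean per model, which is the quantity you would compare across the ensemble.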
For Australia-centric data I've been using http://nrm-erddap.nci.org.au. Most of the data there is global, but as it is a set of data derived from NOAA, it may be possible to find more specific regional data suitable for your needs.