I don't have a definitive answer to this, but I can offer some observations that should help in choosing between the approaches. It sounds as though you may be using the 'Factor' program to conduct the analysis? I have some experience of using it, and of comparing its results with parallel analysis conducted in SPSS (using O'Connor's macros) and in R (via the menu developed by Basto).
Firstly, the R implementation and the SPSS macro give the same results (assuming equivalent settings for the factoring method and so on). The results from Factor under its default Timmerman/Lorenzo-Seva method tend to differ, and to indicate fewer factors. I put this down to the different way it analyses the variance accounted for; Timmerman and Lorenzo-Seva describe this in their publication on the method and argue that it is the better approach.
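If you want to check this on your own data, here is a minimal sketch of Horn-style parallel analysis in R using the psych package (which I believe is what the Basto menu relies on under the hood). `mydata` is a placeholder for your own data frame of item scores, and the settings shown are only illustrative, not a recommendation:

```r
# Minimal sketch: Horn-style (PCA-based) parallel analysis in R via psych.
# 'mydata' is a placeholder for your own data frame of item responses.
library(psych)

set.seed(123)  # resampling is random, so fix the seed for reproducibility
pa <- fa.parallel(mydata,
                  fa = "pc",      # principal-components eigenvalues, as in Horn's original method
                  n.iter = 100,   # number of simulated/resampled data sets
                  quant = 0.95)   # compare against the 95th percentile of the random eigenvalues

pa$ncomp  # number of components suggested by the criterion
```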
Secondly, if you are using Factor, note that its Horn option is based on principal components analysis (PCA). That may not be an appropriate model for your application: there is a lot of discussion in the literature about whether PCA or true factor analysis methods should be used in parallel analysis. Factor offers no flexibility on this point when using the Timmerman approach either, so you would need to turn to alternative software if PCA is not what you need.
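The same psych routine can also resample common-factor eigenvalues alongside the component eigenvalues, which lets you see how much the PCA-versus-factor-analysis choice actually matters for your data. Again, `mydata` is a placeholder and this is only a sketch:

```r
# Sketch: parallel analysis reporting both component and common-factor solutions.
# 'mydata' is a placeholder for your own data frame.
library(psych)

set.seed(123)
pa_both <- fa.parallel(mydata,
                       fa = "both",    # report PCA-based and common-factor-based results
                       fm = "minres",  # minimum-residual factoring for the factor-based version
                       n.iter = 100)

pa_both$ncomp  # components suggested by the PCA-based criterion
pa_both$nfact  # factors suggested by the common-factor-based criterion
```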
My view is that the way Factor handles variance accounted for under the Timmerman/Lorenzo-Seva method is better at avoiding trivial factors, which is why it recommends fewer of them. If true factor analysis is what you need, it may well be the better approach. Either way, I would strongly recommend using more than one method to investigate the question: Velicer's MAP test has a good reputation (although it, too, is based on PCA rather than common factors), and R offers some further options.
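As an example of a second method, here is a sketch of Velicer's MAP test in R via the psych package's VSS function. `mydata` is again a placeholder, and the number of candidate factors (`n = 8`) is arbitrary, so adjust it for your item set:

```r
# Sketch: Velicer's MAP test as a second opinion, via psych::VSS.
# 'mydata' is a placeholder for your own data frame.
library(psych)

vss_out <- VSS(mydata, n = 8, fm = "minres", rotate = "varimax", plot = FALSE)

vss_out$map             # MAP criterion for each number of factors tried (lower is better)
which.min(vss_out$map)  # number of factors at which MAP reaches its minimum
```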
I look forward to seeing what others have to say on this question.