It depends on the specific kernel algorithm you have in mind. In the case of Support Vector Machines, there are a few alternatives to grid search (not necessarily better than it), for example:
1) The gradient-based approach of [1], which is an iterative procedure.
2) Heuristic criteria, such as the inter-cluster distance in the feature space presented in [2] (a minimal sketch of that idea follows this list).
3) Tuning by means of evolutionary algorithms, possibly jointly with the other SVM hyperparameters, e.g. [3] (see the second sketch below).
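
To illustrate the flavour of option 2, here is a minimal sketch (not the exact statistic from [2]): for an RBF kernel, the squared distance between the two class means in the induced feature space can be computed purely from kernel evaluations, and one can pick the gamma that maximises this separation without ever training an SVM. The dataset and candidate range below are placeholders.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel

# Toy binary problem; X_pos / X_neg are the two classes.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)
X_pos, X_neg = X[y == 1], X[y == 0]

def class_separation(gamma):
    """Squared distance between the class means in the RBF feature space,
    computed from kernel evaluations only (no explicit feature map)."""
    k_pp = rbf_kernel(X_pos, X_pos, gamma=gamma).mean()
    k_nn = rbf_kernel(X_neg, X_neg, gamma=gamma).mean()
    k_pn = rbf_kernel(X_pos, X_neg, gamma=gamma).mean()
    return k_pp + k_nn - 2.0 * k_pn

# Pick the gamma maximising the separation criterion -- cheap, since no SVM is trained.
candidate_gammas = np.logspace(-4, 2, 50)
best_gamma = max(candidate_gammas, key=class_separation)
print(f"selected gamma: {best_gamma:.4g}")
```

Note that this still scans candidate values, but the point of such criteria is that each evaluation is far cheaper than a full cross-validated SVM fit.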
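
And for option 3, a much simpler evolution strategy than the one used in [3]: a toy (1+lambda) scheme that mutates log10(C) and log10(gamma) with Gaussian noise and keeps the offspring with the best cross-validated accuracy. All the constants (population size, mutation scale, number of generations) are arbitrary choices for illustration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=10, random_state=0)
rng = np.random.default_rng(0)

def fitness(log_params):
    """Cross-validated accuracy of an RBF-SVM given (log10(C), log10(gamma))."""
    C, gamma = 10.0 ** log_params
    return cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=5).mean()

# (1+lambda) evolution strategy: mutate the current best point in log-space
# and replace it only if some offspring improves the fitness.
parent = np.array([0.0, -1.0])            # start at C = 1, gamma = 0.1
parent_fit = fitness(parent)
for generation in range(20):
    offspring = parent + rng.normal(scale=0.5, size=(8, 2))
    fits = [fitness(child) for child in offspring]
    best = int(np.argmax(fits))
    if fits[best] > parent_fit:
        parent, parent_fit = offspring[best], fits[best]

C, gamma = 10.0 ** parent
print(f"C={C:.3g}, gamma={gamma:.3g}, CV accuracy={parent_fit:.3f}")
```

The same loop extends naturally to more parameters (e.g. one gamma per feature), which is where evolutionary methods become attractive compared to grid search.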
[1] Chapelle, Olivier, et al. "Choosing multiple parameters for support vector machines." Machine Learning 46.1-3 (2002): 131-159.
[2] Wu, Kuo-Ping, and Sheng-De Wang. "Choosing the kernel parameters for support vector machines by the inter-cluster distance in the feature space." Pattern Recognition 42.5 (2009): 710-717.
[3] Friedrichs, Frauke, and Christian Igel. "Evolutionary tuning of multiple SVM parameters." Neurocomputing 64 (2005): 107-117.