Performance prediction is essential for optimally mapping workloads and their inputs onto a particular machine or accelerator in computing systems. Different predictors (e.g., analytical models or AI-based predictors) come with different trade-offs in complexity, accuracy, and overhead. Which ones are best?