Does the local temperature rise around an antenna, caused by its own radiation or by a long operating run, have an impact on its performance? If so, could anyone suggest a way to obtain a graph of temperature vs. antenna performance in CST?
An antenna placed inside the body is often used to heat (ablate) tissue. The heating changes the tissue properties, which in turn changes the antenna's match (return loss). That is one example of what you describe; there can be others.
It can also happen in other situations, where the antenna gets hot and the properties of the materials it is made of change, or the antenna physically expands.
To simulate these effects you need to know the relevant temperature-dependent properties of all the parts and materials involved, and you need to be able to simulate or otherwise know the temperature distribution.
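To make that concrete, here is a minimal Python sketch of the kind of post-processing you could do alongside CST: it treats a patch length and the substrate permittivity as linear functions of temperature and sweeps an estimated resonant frequency. Every number in it (the length L0, the permittivity EPS_R0, the CTE, the permittivity temperature coefficient TC_EPS) is an illustrative placeholder, not data for any particular material; you would substitute datasheet values or results exported from a CST thermal/EM co-simulation.

```python
# Minimal sketch: estimate how a patch antenna's resonance drifts with
# temperature via substrate expansion and a (hypothetical) permittivity tempco.
# All material numbers below are placeholders -- take yours from a datasheet.

import numpy as np

C0 = 299_792_458.0          # speed of light, m/s

L0 = 28.0e-3                # patch length at the reference temperature, m (illustrative)
EPS_R0 = 4.3                # substrate relative permittivity at the reference temperature
CTE_L = 1.4e-5              # in-plane coefficient of thermal expansion, 1/K
TC_EPS = 2.0e-4             # assumed fractional change of eps_r per kelvin
T_REF = 25.0                # reference temperature, degC

def resonant_freq(temp_c):
    """Very rough patch resonance f ~ c / (2 L sqrt(eps_r)), with both L and
    eps_r modelled as linear functions of temperature."""
    dT = temp_c - T_REF
    L = L0 * (1.0 + CTE_L * dT)
    eps_r = EPS_R0 * (1.0 + TC_EPS * dT)
    return C0 / (2.0 * L * np.sqrt(eps_r))

if __name__ == "__main__":
    # Sweep temperature and print the estimated resonance so it can be plotted
    # against antenna performance (e.g. return loss at the nominal frequency).
    for T in np.arange(25.0, 126.0, 25.0):
        f = resonant_freq(T)
        print(f"{T:6.1f} degC  ->  {f / 1e9:.4f} GHz")
```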
This question is not fully clear. Please mention the type of antenna, the type of material, and the resonant frequency of the antenna. The temperature change hardly affects the impedance-matching profile of the antenna and the system as a whole.
Coefficient of thermal expansion, x-axis: 1.4×10⁻⁵ K⁻¹
Coefficient of thermal expansion, y-axis: 1.2×10⁻⁵ K⁻¹
Coefficient of thermal expansion, z-axis: 7.0×10⁻⁵ K⁻¹
This means that if it gets 100 degrees hotter it expands by roughly 0.1% in-plane. The change in thickness won't affect the impedance much, but the antenna will also get longer and wider, so the resonant frequency could drop by about 5 MHz and the match could change, depending on the bandwidth.
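As a quick sanity check on those numbers, here is a small sketch of the arithmetic. The 3.5 GHz operating frequency is an assumption on my part (the question does not state one); with it, a 100 K rise gives about 0.14% in-plane expansion and roughly a 5 MHz downward shift in resonance.

```python
# Back-of-envelope check: fractional expansion for a 100 K rise, and the
# resulting resonance shift. The 3.5 GHz design frequency is a placeholder,
# since the question does not give one -- scale to your own band.

cte = 1.4e-5          # in-plane CTE, 1/K
dT = 100.0            # temperature rise, K
f0 = 3.5e9            # assumed resonant frequency, Hz

strain = cte * dT     # ~1.4e-3, i.e. about 0.14 %
df = -f0 * strain     # resonance scales roughly as 1 / length

print(f"expansion: {strain * 100:.2f} %   frequency shift: {df / 1e6:.1f} MHz")
```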