Incidence rate ratio (IRR) and hazard ratio (HR) are closely related measures of association, but they are not the same.
Incidence rate ratio is the ratio of the incidence rate in the exposed group to the incidence rate in the unexposed group. Incidence rate is the number of new cases of an event that occur during a specific period of time, divided by the total person-time at risk (for example, person-years) accumulated by the population during that period.
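As a quick worked example (the event counts and person-time figures below are invented purely for illustration), the calculation is just two rates and a division:

```python
# Hypothetical study data: counts and person-years are made up for illustration.
events_exposed, persontime_exposed = 30, 1_000      # 30 events over 1,000 person-years
events_unexposed, persontime_unexposed = 20, 2_000  # 20 events over 2,000 person-years

rate_exposed = events_exposed / persontime_exposed        # 0.030 events per person-year
rate_unexposed = events_unexposed / persontime_unexposed  # 0.010 events per person-year

irr = rate_exposed / rate_unexposed
print(f"IRR = {irr:.1f}")  # 3.0: the event rate is three times higher in the exposed group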
Hazard ratio is the ratio of the hazard of experiencing an event in the exposed group to the hazard of experiencing the event in the unexposed group. Hazard is the instantaneous rate at which an event occurs, given that the individual has not yet experienced the event.
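In practice the hazard ratio is usually estimated from a time-to-event model, most commonly Cox proportional hazards regression. The sketch below assumes the third-party Python package lifelines and a made-up ten-person dataset with illustrative column names; it is meant only to show the general shape of that workflow.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Toy data: follow-up time, whether the event was observed (1) or censored (0),
# and an exposure indicator. All values are invented for illustration.
df = pd.DataFrame({
    "time":    [5, 6, 6, 2, 4, 4, 8, 9, 10, 3],
    "event":   [1, 0, 1, 1, 1, 0, 0, 1, 0, 1],
    "exposed": [1, 1, 1, 1, 1, 0, 0, 0, 0, 0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
print(cph.hazard_ratios_)  # exp(coef) for "exposed" is the estimated hazard ratio
```

The model uses each person's follow-up time and censoring status, which is exactly the information an aggregate incidence rate throws away.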
In simple terms, both ratios answer the question "how much more frequently does the event occur in the exposed group?", but on different time scales: the IRR compares average event rates accumulated over the whole follow-up period, while the HR compares the instantaneous rates at which events occur at each point in time among people who are still event-free.
It is tempting to call one an "absolute" measure and the other a "relative" measure, but both are relative: each is a ratio comparing the exposed group to the unexposed group, and neither tells you the absolute probability of developing the event. The more useful distinction is how they are estimated. An IRR is typically computed from aggregate counts of events and person-time (for example, with Poisson regression), whereas an HR comes from a time-to-event model such as Cox regression, which uses each individual's follow-up time and censoring status.
In many settings, IRR and HR are numerically similar, especially when the event is rare or the effect of exposure is roughly constant over follow-up. They can diverge when the hazard ratio changes over time, when follow-up differs substantially between groups, or when a large share of participants eventually has the event, because the IRR averages the comparison over the entire study period while the HR describes it at each instant. For example, if a drug halves the event rate and that effect is constant over time, both the IRR and the HR will be close to 0.5; but if the drug mainly delays events rather than preventing them, the HR may be well below 1 early in follow-up even though the IRR over the whole study is closer to 1.
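A small simulation illustrates the constant-hazard case: when event times are exponential (constant hazard) and the only censoring is administrative, the incidence rate estimated from events and person-time recovers the hazard, so the estimated IRR lands on top of the true HR. The hazards, follow-up length, and sample size below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
followup = 5.0  # years of administrative censoring

# Constant hazards: 0.10/yr in the unexposed group vs 0.05/yr in the exposed
# group, so the true hazard ratio is 0.5.
t_exposed = rng.exponential(1 / 0.05, n)
t_unexposed = rng.exponential(1 / 0.10, n)

def rate(times, end=followup):
    # Events per person-year: count events before `end`, sum censored follow-up.
    events = (times <= end).sum()
    person_time = np.minimum(times, end).sum()
    return events / person_time

irr = rate(t_exposed) / rate(t_unexposed)
print(f"Estimated IRR = {irr:.3f} (true HR = 0.5)")
```

With a time-varying effect (for example, a treatment that only postpones events), the same comparison would show the IRR drifting away from the early-follow-up HR.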
It is important to keep this distinction in mind when interpreting the results of medical studies. The IRR is the natural summary when only aggregate event counts and person-time are available (as in registry data or Poisson models), while the HR is the natural summary when individual event times and censoring are modeled (as in Cox regression). Neither measure, on its own, conveys the absolute risk of the event.