Hazard ratios describe the increase or decrease in the hazard associated with a change in a predictor, all other predictors being held constant.
Although the log hazard ratios are mathematically easier to work with when fitting models, the hazard ratios themselves are more natural for interpretation. A hazard ratio is defined as the hazard for one individual (or group) divided by the hazard for another individual or group. A value of 1 indicates no change in the hazard, values between 0 and 1 indicate a decrease in the hazard, and values greater than 1 indicate an increase in the hazard.
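As a concrete illustration (a sketch, assuming a Cox proportional hazards model, in which the fitted coefficients are log hazard ratios), the hazard ratio comparing covariate vectors $x_1$ and $x_2$ is

```latex
\mathrm{HR}
  = \frac{h(t \mid x_1)}{h(t \mid x_2)}
  = \frac{h_0(t)\,\exp(x_1^\top \beta)}{h_0(t)\,\exp(x_2^\top \beta)}
  = \exp\bigl((x_1 - x_2)^\top \beta\bigr),
```

so the baseline hazard $h_0(t)$ cancels and the ratio does not depend on time.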
When the predictor is a categorical variable, the hazard ratio is the increase or decrease in hazard relative to the baseline (reference) category.
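A minimal sketch using the lifelines library in Python, with its bundled Rossi recidivism data, shows this for a binary predictor (the interpretation of `fin` as financial aid vs. none follows the dataset's documentation):

```python
import numpy as np
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

# Rossi recidivism data: 'week' is follow-up time, 'arrest' is the event indicator.
rossi = load_rossi()

# Fit a Cox model; all remaining columns are used as predictors.
cph = CoxPHFitter()
cph.fit(rossi, duration_col="week", event_col="arrest")

# Coefficients are log hazard ratios; exponentiate to get hazard ratios.
hazard_ratios = np.exp(cph.params_)

# 'fin' is a 0/1 indicator, so its hazard ratio is relative to the
# baseline category fin == 0 (roughly 0.7: aid lowers the hazard of re-arrest).
print(hazard_ratios["fin"])
```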
When a predictor is a continuous variable, the hazard ratio is the increase or decrease in hazard for a given change in the predictor. By default this is a 1-unit change, although it may be more meaningful to report the ratio for a larger change, such as 10 units of the predictor.
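Continuing the same sketch, the hazard ratio for a continuous predictor (here `age`, in years) can be rescaled to a larger unit either by exponentiating a multiple of the coefficient or by rescaling the column before fitting; the two give identical results:

```python
# Hazard ratio for a 1-year increase in age.
hr_per_year = np.exp(cph.params_["age"])

# Hazard ratio for a 10-year increase: exponentiate 10 times the log hazard ratio...
hr_per_decade = np.exp(10 * cph.params_["age"])

# ...or, equivalently, rescale the predictor and refit.
rossi["age_decades"] = rossi["age"] / 10
cph10 = CoxPHFitter()
cph10.fit(rossi.drop(columns="age"), duration_col="week", event_col="arrest")
print(np.exp(cph10.params_["age_decades"]))  # matches hr_per_decade
```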
When a predictor is involved in an interaction, the hazard ratio depends on the values of the interacting variables, so you will need multiple hazard ratios to describe its effect at the different levels of interest.
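For example (again a sketch, assuming a Cox model with two predictors and their interaction), the hazard ratio for a one-unit increase in $x_1$ depends on the value at which $x_2$ is held:

```latex
h(t \mid x_1, x_2) = h_0(t)\,\exp(\beta_1 x_1 + \beta_2 x_2 + \beta_3 x_1 x_2)
\quad\Longrightarrow\quad
\mathrm{HR}_{x_1 \to x_1 + 1} = \exp(\beta_1 + \beta_3 x_2).
```

A single number is therefore not enough; report $\exp(\beta_1 + \beta_3 x_2)$ at several representative values of $x_2$.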