Of course, it can and does happen. Most model assumptions exist to ensure that the coefficient estimates are unbiased and, loosely speaking, that inferences about the model parameters and the overall fit are reliable; they say nothing about whether the parameters will turn out significant. So it can happen that one or more (or even all) of the estimated coefficients, including the intercept, are insignificant. Note also that the primary aim of a regression model is to describe the relationship between the independent variable(s) and the dependent variable(s); in that respect the regression coefficients play the crucial role, not the intercept. Though, the goodness of the model still needs to be evaluated - e.g. by means of an ANOVA.
Both previous answers are correct, and give you the information you require.
I only add one further point that may be of some help in interpreting the significance of the constant term. The significance test for the constant asks whether the constant is significantly different from zero (just as it does for the other estimates in the model). For the explanatory/independent variables in the model this has an intrinsic meaning; but for the constant there is no intrinsic meaning, because there is generally no hypothesis that the constant should or should not be zero (or any other value). The constant is simply the value at which the regression line crosses the y-axis.
When reporting regression results I never report p values or significance for the constant, as doing so implies testing a null hypothesis (this is a personal preference; I don't insist that others omit it from their results). But the important point is that there is usually no hypothesis/null hypothesis to test regarding the value of the constant.
In a regression model, the constant or intercept term is almost always vital to include, even though it is rarely worth interpreting. The constant is the value where the regression line crosses the y-axis: the expected value of the dependent variable when all independent variables equal zero. The constant can be insignificant even when everything else about the model is fine, especially if the independent variables never take values near zero. In such cases the intercept has no intrinsic meaning, but it is still needed to obtain unbiased estimates of the slopes and to calculate accurate predicted values. If a meaningful intercept is important, centering the independent variables will produce one: the intercept then equals the expected value of the dependent variable at the means of the predictors.
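A minimal numpy sketch of the centering point above, using made-up data (the variable names and values are illustrative, not from the answer). Centering the predictor leaves the slope untouched but moves the intercept to the mean of the dependent variable, where it is interpretable:

```python
import numpy as np

# Hypothetical data where x is never anywhere near zero,
# so the raw intercept is a meaningless extrapolation.
rng = np.random.default_rng(0)
x = rng.uniform(150, 200, size=50)            # e.g. heights in cm
y = 2.0 * x + 10 + rng.normal(0, 5, size=50)

def fit_line(x, y):
    """Ordinary least squares for y = b0 + b1*x; returns (intercept, slope)."""
    X = np.column_stack([np.ones_like(x), x])
    b0, b1 = np.linalg.lstsq(X, y, rcond=None)[0]
    return b0, b1

b0_raw, b1_raw = fit_line(x, y)               # intercept extrapolates to x = 0
b0_cent, b1_cent = fit_line(x - x.mean(), y)  # intercept = mean of y

print(np.isclose(b1_raw, b1_cent))            # slope unchanged by centering
print(np.isclose(b0_cent, y.mean()))          # centered intercept is y-bar
```

This is an exact algebraic property of least squares, not an approximation: shifting x shifts only where the fitted line crosses the y-axis.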
I don't think it's a good idea to let that depend on the data. Before you run a regression, YOU should decide whether there should be an intercept. If you know for certain that y = 0 when x = 0 (e.g. x = voltage and y = current, or x = mass and y = volume, etc.), then you should force the intercept to 0.
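A short sketch of regression through the origin for the mass/volume example above, with simulated data (the density value and noise level are assumptions for illustration). With no intercept column, the least-squares slope has the closed form sum(xy)/sum(x^2):

```python
import numpy as np

# Simulated measurements: mass proportional to volume, passing through (0, 0).
# Density of roughly 2.7 g/cm^3 (aluminium) is an illustrative choice.
rng = np.random.default_rng(1)
volume = rng.uniform(1.0, 10.0, size=40)             # cm^3
mass = 2.7 * volume + rng.normal(0, 0.05, size=40)   # g

# No-intercept OLS: minimize ||mass - b * volume||^2.
slope = (volume @ mass) / (volume @ volume)

print(slope)  # should recover a value close to the true density
```

Forcing the intercept to zero here encodes the physical knowledge that zero volume means zero mass, rather than letting noise in the data place the line slightly off the origin.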
And consider other model functions instead of the usual linear one!