Can R decrease as more variables are added to the model? Any example support would be appreciated
If by R you mean R-squared, then:
R-squared (unadjusted) will never decrease as independent variables are added to the model. If you add enough variables (up to n − 1 predictors for n observations), you can force it all the way to 1, even with completely useless predictors.
Adjusted R-squared takes degrees of freedom into account: it penalizes you for each additional independent variable, so it cannot be forced to 1 simply by adding nonsense regressors, and it never exceeds unadjusted R-squared. If you add a relatively “useless” variable, adjusted R-squared will typically fall even though unadjusted R-squared rises — specifically, it falls whenever the added variable’s t-statistic is below 1 in absolute value — reflecting that the variable isn’t pulling its weight. Remember, too, that adjusted R-squared can be negative (which, for practical purposes, means 0).
When you see a model with a high (unadjusted) R-squared, and a much lower adjusted R-squared, you know that some of those independent variables are not useful for predicting the DV.
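Here is a quick simulation sketching the point. The data and variable names are made up for illustration; R² and adjusted R² are computed by hand from an ordinary least-squares fit so nothing beyond numpy is assumed:

```python
# Sketch: what happens to R^2 vs. adjusted R^2 when pure-noise
# predictors are added to a simple regression. Toy data only.
import numpy as np

rng = np.random.default_rng(0)
n = 100
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(size=n)  # y truly depends only on x

def r2_and_adj(X, y):
    """Fit OLS with an intercept; return (R^2, adjusted R^2)."""
    X = np.column_stack([np.ones(len(y)), X])      # prepend intercept
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    ss_res = resid @ resid
    ss_tot = ((y - y.mean()) ** 2).sum()
    r2 = 1.0 - ss_res / ss_tot
    k = X.shape[1] - 1                             # predictors, excluding intercept
    adj = 1.0 - (1.0 - r2) * (len(y) - 1) / (len(y) - k - 1)
    return r2, adj

X = x.reshape(-1, 1)
r2_base, adj_base = r2_and_adj(X, y)

# Append 10 predictors that are pure noise, unrelated to y.
noise = rng.normal(size=(n, 10))
r2_noise, adj_noise = r2_and_adj(np.column_stack([X, noise]), y)

print(f"1 predictor:   R^2 = {r2_base:.4f}, adj R^2 = {adj_base:.4f}")
print(f"11 predictors: R^2 = {r2_noise:.4f}, adj R^2 = {adj_noise:.4f}")
```

Unadjusted R² goes up (it cannot go down), while adjusted R² stays at or below it — and a widening gap between the two is exactly the signal described above that some of the regressors are useless.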
Can R-squared (not adjusted) decrease by adding extra explanatory variables?
Think about it: if the coefficients on the new variables were all zero, R² would remain unchanged — and least squares will never choose coefficients that fit worse than that, since the all-zero fit is always still available. So that is the worst case; R² can only stay the same or increase when you add a new variable.
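A minimal sketch of that worst-case argument, on made-up data: after appending an arbitrary new column, the old fit (with a zero coefficient on the new column) is still in the search space, so the residual sum of squares cannot increase and R² cannot fall.

```python
# The residual sum of squares of an OLS fit can only shrink (or stay
# the same) when a column is added, because setting the new coefficient
# to zero reproduces the old fit. Toy data, illustrative only.
import numpy as np

rng = np.random.default_rng(1)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # intercept + one regressor
y = X @ np.array([1.0, 3.0]) + rng.normal(size=n)

def ss_res(X, y):
    """Residual sum of squares of the least-squares fit of y on X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return r @ r

extra = rng.normal(size=(n, 1))                         # an arbitrary new regressor
ss_old = ss_res(X, y)
ss_new = ss_res(np.column_stack([X, extra]), y)

print(f"RSS before: {ss_old:.4f}, RSS after: {ss_new:.4f}")
```

Since R² = 1 − RSS/TSS and the total sum of squares doesn’t change, a non-increasing RSS means a non-decreasing R².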