The effect of grain size on void swelling has its origin in the intrinsic property of grain boundaries as neutral and unsaturable sinks for both vacancies and self-interstitial atoms (SIAs). The phenomenon was investigated as early as the 1970s, and it was demonstrated that the grain size dependent void swelling measured under irradiation producing only Frenkel pairs could be satisfactorily explained in terms of the standard rate theory (SRT) and dislocation bias. Experimental results reported in the 1980s demonstrated, on the other hand, that the effect of grain boundaries on void swelling under cascade damage conditions was radically different and could not be explained in terms of the SRT. In an effort to understand the source of this significant difference, the effect of grain size on void swelling under cascade damage conditions has been investigated both experimentally and theoretically in pure copper irradiated with fission neutrons at 623 K to a dose level of approximately 0.3 dpa (displacements per atom). The post-irradiation defect microstructure, including voids, was investigated using transmission electron microscopy and positron annihilation spectroscopy. The evolution of void swelling was calculated within the framework of the production bias model (PBM) and the SRT. The grain size dependent void swelling measured experimentally is in good accord with the theoretical results obtained using the PBM. Implications of these results for the modelling of void swelling under cascade damage conditions are discussed.
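To make the grain-boundary sink effect invoked above concrete, the following is a minimal numerical sketch of the textbook SRT sink-competition form of the swelling rate, with a grain-boundary sink strength that grows as the grain shrinks. All parameter values (dislocation density, void sink strength, bias factor, dose rate) and the lowest-mode estimate k_gb² ≈ π²/R² for a spherical grain of radius R are illustrative assumptions, not values taken from the study or from the PBM calculations reported here.

```python
import math

def swelling_rate(grain_diameter_m,
                  rho_d=1e14,       # dislocation density [m^-2], assumed value
                  k_void2=1e14,     # void sink strength [m^-2], assumed value
                  bias=0.02,        # dislocation bias for SIAs, assumed value
                  dose_rate=1e-7):  # defect production rate [dpa/s], assumed
    """Illustrative SRT swelling rate with a grain-boundary sink term.

    Uses the standard sink-competition form
        dS/dt ~ G * B * k_d^2 * k_c^2 / (k_d^2 + k_c^2 + k_gb^2)^2
    where k_d^2 is the dislocation sink strength (~ rho_d), k_c^2 the
    void sink strength, and k_gb^2 = pi^2 / R^2 approximates the
    grain-boundary sink strength of a spherical grain of radius R.
    """
    R = grain_diameter_m / 2.0
    k_d2 = rho_d
    k_gb2 = math.pi**2 / R**2
    k_tot2 = k_d2 + k_void2 + k_gb2
    return dose_rate * bias * k_d2 * k_void2 / k_tot2**2

# Swelling is suppressed in fine-grained material: as the grain size
# shrinks, the neutral grain-boundary term dominates the denominator.
rates = [swelling_rate(d * 1e-6) for d in (0.3, 1.0, 10.0, 100.0)]
assert all(a < b for a, b in zip(rates, rates[1:]))
```

The sketch reproduces only the qualitative SRT trend (less swelling in smaller grains); it deliberately contains no cascade effects, which is precisely why, as the abstract notes, the PBM rather than the SRT is needed under cascade damage conditions.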