We theoretically study the adiabatic preparation of an antiferromagnetic phase in a mixed Mott insulator of two bosonic atom species in a one-dimensional optical lattice. In such a system, a tunable parabolic inhomogeneity can be engineered by controlling the difference between the trapping potentials felt by the two species. Using numerical simulations, we predict that a finite parabolic potential can assist the adiabatic preparation of the antiferromagnet. The optimal strength of the parabolic inhomogeneity depends sensitively on the number imbalance between the two species. We also find that, during the preparation, finite-size effects play a crucial role for a system of realistic size. The experiment we propose can be realized, for example, using atomic mixtures of rubidium-87 with potassium-41, or of ytterbium-168 with ytterbium-174.