Although base station antenna downtilt is a simple and commonly applied radio optimization technique, its impact on practical network performance is not well understood. Most published studies based on empirical path loss models report tilt angles and performance gains that are far higher than practical experience suggests. In this paper we demonstrate, for a practical LTE scenario, that the discrepancy lies partly in the path loss model, and show that a more detailed semi-deterministic model leads both to lower gains in terms of SINR, outage probability, and downlink throughput, and to lower optimum tilt settings. Furthermore, we show that a simple geometrically based tilt optimization algorithm can outperform other tilt profiles, including the settings applied by the cellular operator in this specific case. In general, network performance is not highly sensitive to the tilt settings, including the choice of electrical and/or mechanical antenna downtilt, so multiple near-optimum tilt profiles can be found in a practical case. A broader implication of this study is that care must be taken when using the 3GPP model to evaluate advanced adaptive antenna techniques, especially those operating in the elevation dimension.
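The abstract does not specify the geometric tilt rule itself; a common formulation (used here purely as an illustrative sketch, not necessarily the algorithm evaluated in the paper) aims the antenna main lobe at the intended cell edge, optionally adding half the vertical half-power beamwidth so the upper -3 dB point reaches the edge. All parameter values below are hypothetical:

```python
import math

def geometric_downtilt(h_bs, h_ue, cell_radius, vert_hpbw=0.0):
    """Geometric downtilt rule: point the main lobe at the cell edge.

    h_bs:        base station antenna height above ground (m)
    h_ue:        user terminal height (m)
    cell_radius: intended cell-edge distance (m)
    vert_hpbw:   vertical half-power beamwidth (deg); adding half of it
                 places the upper -3 dB point at the cell edge (optional)
    Returns the downtilt angle in degrees (positive = below horizon).
    """
    boresight = math.degrees(math.atan((h_bs - h_ue) / cell_radius))
    return boresight + vert_hpbw / 2.0

# Hypothetical example: 30 m antenna, 1.5 m terminals, 500 m cell radius,
# 6.5 deg vertical beamwidth
tilt = geometric_downtilt(30.0, 1.5, 500.0, vert_hpbw=6.5)
```

Such a rule depends only on site geometry, which is why it can be applied per-sector without exhaustive simulation; the paper's point is that its performance is close to that of other tilt profiles because the network is not highly sensitive to the exact setting.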