Due to increasing competition for water resources among urban, industrial, and agricultural users, the proportion of water allocated to agriculture is gradually decreasing. To maintain or increase agricultural production, new irrigation systems, such as surface or subsurface drip irrigation, will need to provide higher water use efficiency than those traditionally used. Several models have been developed to predict the dimensions of wetting patterns, which are important for designing optimal drip irrigation systems, using variables such as the emitter discharge, the volume of applied water, and the soil hydraulic properties. In this work, we evaluated the accuracy of several approaches for estimating wetting zone dimensions by comparing their predictions with field and laboratory data: the numerical HYDRUS-2D model, the analytical WetUp software, and selected empirical models. The soil hydraulic parameters for the HYDRUS-2D simulations were estimated using either Rosetta (for the laboratory experiments) or inverse analysis (for the field experiments). The mean absolute error (MAE) was used to compare model predictions with observations of wetting zone dimensions. The MAE for different experiments and directions varied from 0.87 to 10.43 cm for HYDRUS-2D, from 1.0 to 58.1 cm for WetUp, and from 1.34 to 12.24 cm for the empirical models.
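As a minimal sketch of the evaluation metric used above, the MAE between predicted and observed wetting zone dimensions can be computed as follows; the radius values here are hypothetical illustration data, not results from the study:

```python
def mean_absolute_error(predicted, observed):
    """Mean absolute error between model predictions and measurements (cm)."""
    if len(predicted) != len(observed):
        raise ValueError("predicted and observed must have equal length")
    return sum(abs(p - o) for p, o in zip(predicted, observed)) / len(predicted)

# Hypothetical wetted-radius values (cm) for four observation times;
# not data from the experiments reported here.
predicted = [12.0, 15.5, 18.2, 21.0]
observed = [11.0, 16.0, 19.0, 20.5]
mae = mean_absolute_error(predicted, observed)
```

A lower MAE indicates closer agreement between a model's predicted wetting pattern and the measured one in a given direction (e.g., horizontal or vertical).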