We use a Bayesian method, optimal interpolation, to improve satellite-derived irradiance estimates at the city scale using ground sensor data. Optimal interpolation requires error covariances for the satellite estimates and the ground data, which determine how information from the sensor locations is distributed across a larger area. We describe three methods for choosing these covariances, including a covariance parameterization that depends on the relative cloudiness between locations. Results are computed with ground data from 22 sensors over a 75×80 km area centered on Tucson, AZ, using two satellite-derived irradiance models. The improvements in standard error metrics for both satellite models indicate that our approach is applicable to other satellite-derived irradiance models. We also show that optimal interpolation can nearly eliminate mean bias error and reduce root mean squared error by 50%.
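To make the method concrete, the following is a minimal sketch of a generic optimal interpolation (BLUE) update: a background field (here, satellite-derived estimates) is corrected toward point observations using a gain built from the two error covariances. The covariance shapes, length scale, and numbers below are illustrative assumptions, not the parameterizations studied in the paper.

```python
import numpy as np

def optimal_interpolation(background, obs, H, B, R):
    """Blend a background field with point observations (BLUE update).

    background : (n,) background estimates (e.g. satellite-derived irradiance)
    obs        : (m,) ground sensor measurements
    H          : (m, n) observation operator mapping the field to sensor sites
    B          : (n, n) background error covariance
    R          : (m, m) observation error covariance
    """
    innovation = obs - H @ background        # sensor-minus-satellite misfit
    S = H @ B @ H.T + R                      # innovation covariance
    K = B @ H.T @ np.linalg.inv(S)           # gain matrix
    return background + K @ innovation       # analysis field

# Toy 1-D example: 5 grid cells, sensors at cells 1 and 3.
n = 5
background = np.full(n, 500.0)               # W/m^2, satellite estimate
obs = np.array([550.0, 540.0])               # ground measurements
H = np.zeros((2, n))
H[0, 1] = H[1, 3] = 1.0

# Hypothetical covariances: exponentially decaying spatial correlation for
# the background errors, uncorrelated sensor errors.
x = np.arange(n, dtype=float)
B = 20.0**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / 2.0)
R = 5.0**2 * np.eye(2)

analysis = optimal_interpolation(background, obs, H, B, R)
```

Because the background error covariance is spatially correlated, the correction is spread from the sensor cells to neighboring cells, which is how information from a handful of sensors can improve estimates over the whole area.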