While eXplainable Artificial Intelligence (XAI) has significant potential to turn Deep Learning into a glass box, applying it in the domain of Geospatial Artificial Intelligence (GeoAI) poses challenges. A land use case study highlights these challenges, which include the difficulty of selecting reference data and models, the shortcomings of gradients as explanations, the limited semantics and knowledge scope of the GeoAI explanation process, and underlying GeoAI processes that are not amenable to XAI. We conclude with possibilities for achieving Geographical XAI (GeoXAI).