Thanks for writing back.
> Apologies! I thought you meant that Geocoding API gets involved when there are missing cache addresses.
- Technically, yes, that is what I meant. Whenever the system encounters a human-readable address whose lat/long coordinates are not already stored in the map addresses cache table, it fetches those coordinates from the Geocoding API and stores them in the cache table for future use.
(in the database you'll find a table named "{table-prefix}_toolset_maps_address_cache" where these coordinates are saved)
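To make that flow more concrete, here is a minimal sketch in Python (the plugin itself is written in PHP, so this is only an illustration, not the actual implementation): the dictionary stands in for the "{table-prefix}_toolset_maps_address_cache" table, the endpoint is Google's Geocoding API, and the API key is a placeholder.

import json
import urllib.parse
import urllib.request

# Stand-in for the "{table-prefix}_toolset_maps_address_cache" table:
# it maps a human-readable address to its (lat, lng) pair.
address_cache = {}

GEOCODE_URL = "https://maps.googleapis.com/maps/api/geocode/json"
API_KEY = "YOUR_GOOGLE_MAPS_API_KEY"  # placeholder key

def get_coordinates(address):
    """Return (lat, lng) for an address, calling the Geocoding API only on a cache miss."""
    if address in address_cache:
        return address_cache[address]  # cache hit: no API request is made

    # Cache miss: exactly one Geocoding API request, then the result is stored.
    query = urllib.parse.urlencode({"address": address, "key": API_KEY})
    with urllib.request.urlopen(f"{GEOCODE_URL}?{query}") as response:
        data = json.load(response)

    location = data["results"][0]["geometry"]["location"]
    address_cache[address] = (location["lat"], location["lng"])
    return address_cache[address]

# The first call triggers an API request; the second is served from the cache.
print(get_coordinates("1600 Amphitheatre Parkway, Mountain View, CA"))
print(get_coordinates("1600 Amphitheatre Parkway, Mountain View, CA"))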
> I just want to know if these missing addresses that are listed which has a note that a cron will process these, does this cron job that processes these missing addresses contribute to heavy Geocoding API requests?
- Let me share some use cases where the Geocoding API gets involved, so that you have a clearer picture of how it works:
1). When a new address is saved in any "address" type custom field (either through the backend post edit screen or through a frontend form) and that address's lat/long coordinates are not already saved in the map's cache table.
2). When you have a View's search form with a distance-based search field, where the visitor can enter an address/location to use as the search center. If that searched address is not already in the map's cache, a Geocoding API call is made to get and store its coordinates.
As discussed in another recent support thread, if spam bots are attacking the website's search form, this can cause an unusually high number of API calls:
https://toolset.com/forums/topic/google-maps-api-private-api-key-404s-via-toolset-backend/
> I noticed that there are around 1000 missing cache addresses (total for both dev and live site).
- The fact that you're seeing the message about a large number of addresses missing from the cache makes me lean more toward the two possibilities that I shared in my last message, rather than the two cases listed above.
In your reply, you mentioned that you're not bulk-importing the posts. So for any addresses saved in the 'address' type custom fields, their cache entries should have been maintained properly as the relevant posts were created or modified.
(provided that the Google Maps API key was properly connected at the time these posts were created or modified)
And if the address coordinates were added to the cache table, nothing explains such a high number of missing entries unless they were somehow removed later or were never added in the first place.
The best way to monitor this would be to let all the missing addresses get cached once, so that the count of missing items becomes zero, and then periodically check the number of records in the "{table-prefix}_toolset_maps_address_cache" table.
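In case it is useful, here is a small Python sketch of that periodic record count; the "wp_" table prefix and the database credentials are assumptions that you'd adjust for your own setup.

# Requires the "mysql-connector-python" package (pip install mysql-connector-python).
import mysql.connector

# Hypothetical connection details; replace them with your site's database credentials.
connection = mysql.connector.connect(
    host="localhost",
    user="db_user",
    password="db_password",
    database="wordpress",
)

cursor = connection.cursor()
# "wp_" is the default WordPress table prefix; use your site's actual prefix.
cursor.execute("SELECT COUNT(*) FROM wp_toolset_maps_address_cache")
(cached_addresses,) = cursor.fetchone()
print(f"Cached addresses: {cached_addresses}")

cursor.close()
connection.close()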
> Does this mean that when the cron runs, one missing address is equivalent to 1 Geocoding API requests?
- Yes, that is correct.
> How many times does this cron run everyday?
- There is no fixed schedule or number of times this cron job is set to run each day. Instead, here is how it works:
When the system encounters missing addresses and their number is within the threshold of 50, it generates the geocoding API requests for those addresses right away, without setting any cron job.
If there are more than 50 missing addresses, it generates the geocoding API requests for the first 50 and sets a cron job to look for the remaining missing entries again after 1 hour.
When that cron job runs after 1 hour, it geocodes up to 50 more missing addresses, and if some are still left, it sets another cron job to run again after 1 hour. This cycle continues until no more missing addresses are left, at which point no further cron job is scheduled.
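Here is a simplified Python sketch of that cycle (the real plugin does this in PHP through one-off WordPress cron events, so the delay and the geocode() helper below are stand-ins, not the actual implementation):

import time

BATCH_LIMIT = 50  # maximum number of addresses geocoded in a single pass

def geocode(address):
    """Placeholder for the per-address Geocoding API request (see the earlier sketch)."""
    print(f"Geocoding: {address}")

def process_missing_addresses(missing_addresses, delay_seconds=60 * 60):
    """Geocode up to BATCH_LIMIT addresses, then reschedule if any remain."""
    batch = missing_addresses[:BATCH_LIMIT]
    for address in batch:
        geocode(address)  # one missing address = one Geocoding API request

    remaining = missing_addresses[BATCH_LIMIT:]
    if remaining:
        # The plugin schedules a one-off cron event for 1 hour later;
        # time.sleep() merely stands in for that scheduling here.
        time.sleep(delay_seconds)
        process_missing_addresses(remaining, delay_seconds)
    # When nothing is left, no further run is scheduled.

# Hypothetical demo: 120 missing addresses => 50 now, 50 an "hour" later, then 20.
# delay_seconds is shortened to 1 second so the sketch finishes quickly.
process_missing_addresses([f"Address {n}" for n in range(1, 121)], delay_seconds=1)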
I hope this explanation helps. Please let me know if you have any follow-up questions or observations.