
[Resolved] High volume of Geocoding API Requests

This is the technical support forum for Toolset - a suite of plugins for developing WordPress sites without writing PHP.

This topic contains 16 replies and has 2 voices.

Last updated by Waqar 1 year, 8 months ago.

Assisted by: Waqar.

#2540091
live-missing-addresses.png
dev-missing-addresses.png

The site is consistently showing approximately 1,200 Geocoding API hits every day (36,000+ over the last 30 days), which is beyond the free usage quota - and I'm wondering if there's any chance of bringing that down.

It seems unlikely that there are 1,200 real people hitting this API every single day with address lookups - I think it's more likely there are only a few dozen real people using it each day at the most.

What is the process behind these geocoding requests? Is one request made per address search?

We've checked the maps settings and found these missing addresses: there are 181 missing cache entries on the live site and 891 on the dev site. There's also a note below: '50 cache entries processed. Remaining cache entries will be processed automatically using cron'. Do these missing addresses contribute to the high volume of Geocoding API requests?

#2540363

Hi,

Thank you for contacting us and I'd be happy to assist.

A Geocoding API request gets involved when there are addresses saved in the address fields whose location (lat/long) coordinates haven't been cached or saved in Toolset's maps database.

Is your website using any import mechanism that periodically adds or updates posts with 'address' type fields?

Or is there any database optimization or cleanup operation scheduled on the website which could remove or limit the entries in Toolset's maps database?

These are the possibilities I can think of that could be linked to these missing address entries in the maps cache.

Regards,
Waqar

#2540437

Additional Question:
Does this mean that this cron job runs every hour? Does this also mean that every time the cron job runs, 1 missing address is equivalent to 1 Geocoding API request?

Answer to your questions:
Is your website using any import mechanism that periodically adds or updates posts with 'address' type fields?
- No, we're not using any plugins other than Toolset for addresses or for posts with address fields.

Or is there any database optimization or cleanup operation scheduled on the website which could remove or limit the entries in Toolset's maps database?
- None.

#2540683

Hi! I reread your reply, and I think I understood it wrong. Apologies! I thought you meant that the Geocoding API gets involved when there are missing cache addresses.

I just want to know: for these missing addresses that are listed with a note that a cron will process them, does that cron job contribute to the heavy Geocoding API requests? I noticed that there are around 1,000 missing cache addresses (total for both dev and live sites). Does this mean that when the cron runs, one missing address is equivalent to 1 Geocoding API request? How many times does this cron run every day?

#2541985

Thanks for writing back.

> Apologies! I thought you meant that the Geocoding API gets involved when there are missing cache addresses.

- Technically, yes, that is what I meant. Whenever the system encounters a human-readable address whose lat/long coordinates are not already stored in the map address cache table, it fetches its lat/long coordinates from the API and stores them in the cache table for future use.
( in the database you'll find a table named "{table-prefix}_toolset_maps_address_cache" where these coordinates are saved )
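To illustrate the idea in rough code (this is only a simplified sketch and not the plugin's actual implementation - the function name and the cache table's column names here are placeholders), the flow looks roughly like this:

// Rough sketch of the "check the cache first, only then call the Geocoding API" flow.
// Assumes WordPress is loaded; the 'address', 'lat' and 'lng' columns are hypothetical.
function my_get_coordinates( $address ) {
    global $wpdb;
    $table = $wpdb->prefix . 'toolset_maps_address_cache';

    // 1. Look for the address in the cache table first - no API request is needed here.
    $cached = $wpdb->get_row(
        $wpdb->prepare( "SELECT lat, lng FROM {$table} WHERE address = %s", $address )
    );
    if ( $cached ) {
        return array( 'lat' => $cached->lat, 'lng' => $cached->lng );
    }

    // 2. Not cached: this is where one Geocoding API request is spent.
    $response = wp_remote_get( add_query_arg( array(
        'address' => rawurlencode( $address ),
        'key'     => 'YOUR_GOOGLE_MAPS_API_KEY',
    ), 'https://maps.googleapis.com/maps/api/geocode/json' ) );

    if ( is_wp_error( $response ) ) {
        return null;
    }

    $data     = json_decode( wp_remote_retrieve_body( $response ), true );
    $location = isset( $data['results'][0]['geometry']['location'] ) ? $data['results'][0]['geometry']['location'] : null;
    if ( ! $location ) {
        return null;
    }

    // 3. Store the coordinates so future lookups for the same address skip the API.
    $wpdb->insert( $table, array(
        'address' => $address,
        'lat'     => $location['lat'],
        'lng'     => $location['lng'],
    ) );

    return $location;
}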

> I just want to know: for these missing addresses that are listed with a note that a cron will process them, does that cron job contribute to the heavy Geocoding API requests?

- Let me share the use cases where the Geocoding API gets involved, so that you have a better picture of how it works:

1). When a new address is saved in any "address" type custom field (either through the backend post edit screen or through a frontend form) and that address's lat/long coordinates are not already saved in the maps cache table.

2). When you have a View search form with a distance-based search field, where the visitor can enter an address/location to use as the search center. If that searched address is not already in the maps cache, a Geocoding API call is made to fetch and store its coordinates.

As discussed in another recent support thread, spam bots attacking the website's search form can also be a cause of an unusually high number of API calls:
https://toolset.com/forums/topic/google-maps-api-private-api-key-404s-via-toolset-backend/

> I noticed that there are around 1,000 missing cache addresses (total for both dev and live sites).

- The fact that you're seeing the message about a large number of addresses missing from the cache makes me more inclined towards the two possibilities I shared in my last message, rather than the two cases shared above.

In your reply, you mentioned that you're not bulk-importing posts. So for any addresses saved in the "address" type custom fields, their cache entries should have been maintained properly as the relevant posts were created or modified.
(provided that the Google Maps API key was properly connected at the time when these posts were created or modified)

And if the address coordinates were added to the cache table, then nothing explains such a high number of entries missing from the cache, unless they were somehow removed later or were not added in the first place.

The best way to monitor this would be to let all the missing addresses get cached once, so that the count of missing items becomes zero, and then periodically check the number of records in the "{table-prefix}_toolset_maps_address_cache" table.
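For example, you could run something like this from time to time (e.g. via WP-CLI with "wp eval-file count-cache.php") and note how the count changes; the file name here is just an example:

<?php
// count-cache.php - prints the number of addresses currently cached by Toolset Maps.
// Run with WordPress loaded, for example: wp eval-file count-cache.php
global $wpdb;
echo $wpdb->get_var( "SELECT COUNT(*) FROM {$wpdb->prefix}toolset_maps_address_cache" ) . "\n";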

> Does this mean that when the cron runs, one missing address is equivalent to 1 Geocoding API request?

- Yes, that is correct.

> How many times does this cron run every day?

- There is no fixed schedule or number of times this cron job is set to run each day. Here is how it works instead:

When the system encounters missing addresses and their number is under the threshold of 50, it generates the API requests for geocoding those addresses there and then, without setting any cron job.

If there are more than 50 such addresses, it generates the API requests for geocoding the first 50 and sets a cron job to look for the remaining missing entries again after 1 hour.

After 1 hour, when that cron job runs, it geocodes 50 more missing addresses and, if there are still some addresses left, sets another cron job to run after 1 hour. This cycle continues until no more missing addresses are left, at which point no further cron job is set.
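In rough code (a simplified sketch of the cycle described above, not the plugin's exact implementation - the hook name and the helper functions are placeholders), the logic works like this:

// Simplified sketch of the batch-of-50 geocoding cycle described above.
// The hook name and the two helper functions are hypothetical, for illustration only.
define( 'MY_GEOCODE_BATCH_SIZE', 50 );

function my_get_missing_addresses() {
    // Hypothetical: in the real plugin this would collect addresses that have no cache entry.
    return array();
}

function my_geocode_and_cache( $address ) {
    // Hypothetical: in the real plugin this would spend one Geocoding API request and cache the result.
}

function my_process_missing_addresses() {
    $missing = my_get_missing_addresses();
    $batch   = array_slice( $missing, 0, MY_GEOCODE_BATCH_SIZE );

    foreach ( $batch as $address ) {
        my_geocode_and_cache( $address ); // one Geocoding API request per missing address
    }

    // If more than 50 addresses were missing, schedule another run in 1 hour.
    if ( count( $missing ) > MY_GEOCODE_BATCH_SIZE ) {
        wp_schedule_single_event( time() + HOUR_IN_SECONDS, 'my_geocode_missing_addresses_event' );
    }
    // Otherwise no cron job is set, and the cycle stops here.
}
add_action( 'my_geocode_missing_addresses_event', 'my_process_missing_addresses' );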

I hope this explanation helps and please let me know if you have any follow-up questions or observations.

#2545847

Hi Waqar, thank you for a very detailed explanation. This is very much appreciated! Since these missing addresses may have contributed to the heavy requests, what can we do to fix them?

#2546371

Glad that I could help.

> Since these missing addresses may have contributed to the heavy requests, what can we do to fix them?

- Did you manage to geocode all the missing addresses, so that any future high influx of API requests can be monitored?
( as I suggested in my last reply )

"The best way to monitor this would be to let all the missing address get cached once so that the count of missing items become zero and then periodically check the number of records in the "{table-prefix}_toolset_maps_address_cache" table."

#2548355

Hi Waqar. How can I find the missing addresses? I can only see the total number of missing addresses in the settings.

#2549481

The "Check for missing cache entries" option in the settings, shows the total number of missing addresses in the cache and not the individual missing addresses.
( ref: https://toolset.com/course-lesson/data-caching-for-maps-addresses/ )

Do you see the number of missing addresses decrease over time when you use that option? If a correctly configured Google Maps API key is set in the settings and there is no huge influx of new posts with "address" type custom fields, that number should keep decreasing and eventually reach '0'.

#2549689
mssing_entries.png

Unfortunately, the number of missing addresses has not decreased. It's still 181. What can we do to make this 0?

#2550709
missng-entries.png

Hi! I checked again, and the number of missing entries is the same.

#2551563

It is strange that the number of missing addresses in the cache is not decreasing.

Do you see any error or warning (on screen or in the server's error log) when you use the 'Check API' button or the 'Check for missing cache entries' button on the Toolset Maps settings page?

And is there any custom code or plugin on the website which could restrict or block the WordPress cron (WP-Cron) feature?
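For example, one common culprit is this line in wp-config.php - when it is present, WordPress does not trigger its scheduled (cron) jobs on normal page loads, so the hourly geocoding job will only run if a server-side cron is calling wp-cron.php instead:

// In wp-config.php: if this constant is set to true, WordPress will not fire
// its scheduled (WP-Cron) events on page loads, so the hourly geocoding job may never run
// unless the server's own cron is configured to call wp-cron.php.
define( 'DISABLE_WP_CRON', true );

A plugin such as WP Crontrol can also be used to check whether the scheduled events exist and are firing as expected.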

In case the issue still persists, you're welcome to share temporary admin login details in reply to this message.

Note: Your next reply will be private and making a complete backup copy is recommended before sharing the access details.

#2554927

Thank you for sharing the admin access details.

I've noticed that the Google Maps API key seems to be configured correctly, yet the number of missing map cache entries is not changing over time.

Do I have your permission to download a clone/snapshot of the website to investigate this deeper on a different server? This way the actual website won't be affected.

#2555981

Hi! Yes. Can you delete it after the investigation? Thank you!

#2556623

Thank you for the permission. I've deployed the website's clone on my test server.

I'll be performing some tests and will share the findings as soon as this testing completes.
( the clone will be deleted after this investigation concludes )

Thank you for your patience.