We have been getting hit with huge spikes in Google Maps API usage a couple of times a week for at least a few months. We've just narrowed the spikes down to Toolset Maps Backend (the private API key used on the backend).
Why could this be happening?
Attached is an image showing the spikes.
NOTE: This is the key underneath the words:
"For added protection of your API keys, you may want to set up a 2nd key for server-side requests"
NOTE: This is only happening on the site that's open to public front end usage, not our development/staging sites.
NOTE: This API key IS restricted to the domain running TS maps
We really are confused: Google keeps associating these spikes with 404 errors, it doesn't make sense that the map could be getting used on the backend anywhere near as much as the API usage suggests, and the times don't match up either. There is zero indication that we have been hacked.
TS Maps v2.0.12, Views v3.6.3; everything else is up to date, and the changelogs after these versions don't seem to include anything that could be related.
Thank you for contacting us and I'd be happy to assist.
The second Google Maps API key (for server-side requests) gets involved when addresses saved in address fields don't yet have their location (lat/long) coordinates cached in Toolset's maps database.
Is your website using any import mechanism for adding or updating posts with 'address' type fields?
Or is there any database optimization or cleanup operation scheduled on the website which could remove or limit the entries in Toolset's maps database?
These are a few possibilities I can think of that could be linked to these periodic spikes.
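To make the cache-miss path described above concrete, here is a minimal sketch of the logic. The function and structure names are illustrative stand-ins, not Toolset's actual internals: the point is only that every address without cached coordinates costs one billed server-side Geocoding request.

```python
# Illustrative sketch of a geocoding cache-miss path.
# `cache` stands in for Toolset's maps database; `geocode` stands in
# for a call to Google's Geocoding API using the server-side key.

def lookup_cached_coordinates(cache, address):
    """Return (lat, lng) if this address was geocoded before, else None."""
    return cache.get(address.strip().lower())

def resolve_address(cache, address, geocode):
    """Serve coordinates from cache when possible; on a miss, the
    server-side Google Maps key is billed for one Geocoding request."""
    coords = lookup_cached_coordinates(cache, address)
    if coords is not None:
        return coords, False               # cache hit: no API usage
    coords = geocode(address)              # cache miss: server-side key billed
    cache[address.strip().lower()] = coords
    return coords, True
```

Under this model, a bulk import of new addresses, or a stream of searches for never-before-seen locations, produces one billed request per entry, which is one way usage can spike without any visible backend activity.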
Thank you for the response. That's all worth considering, but we are struggling to think of a process that could cause something like this...
We run back-ups, and use a security monitor. The site doesn't use a CDN. There is no import / update of stored 'Google' addresses that we're aware of.
Once every six months we manually query the database to remove only the cached geolocation coordinates, because they eventually build up and cause LFD warnings. We leave the entries with actual locations (addresses) alone. My understanding is that the coordinates are less likely to be pulled from the cache, since they can be accurate down to which part of a room someone is standing in, making a second search of that exact spot from a mobile device unlikely. An address, city, or zip code, on the other hand, can be reused by others who wouldn't stand in that exact spot.
In any case, it is seldom we actually do this, so there likely isn't a connection to the spikes.
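For what it's worth, the distinction being drawn here (coordinate-only cache entries vs. address entries) can be expressed as a simple filter. The key format below is an assumption about how raw coordinate lookups might be stored, not Toolset's actual schema:

```python
import re

# Assumed key shapes: a raw coordinate lookup like "51.5074,-0.1278"
# vs. a human-entered address like "SW1A 1AA" or "Chicago, IL".
COORD_KEY = re.compile(r'^-?\d+(\.\d+)?\s*,\s*-?\d+(\.\d+)?$')

def is_coordinate_entry(key):
    """True if the cached key looks like a bare lat,lng pair."""
    return bool(COORD_KEY.match(key.strip()))

def split_cache_entries(keys):
    """Partition cached keys into (coordinate-only, address) lists,
    mirroring the 'remove coordinates, keep addresses' cleanup."""
    coords = [k for k in keys if is_coordinate_entry(k)]
    addrs = [k for k in keys if not is_coordinate_entry(k)]
    return coords, addrs
```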
It seems more likely an attack is involved. Perhaps an attempted SQL injection attacking the URL is triggering the 404s shown in the image.
We are still having problems with this. Our API was hit again going into the weekend. We have quotas to shut them out, and we use Wordfence to ban IPs that engage in suspicious activity, but we need to understand why this is happening, especially for the API key handling backend functions.
It's very much looking like attacks at this point, but we need to know how these APIs are so vulnerable and what's going on with the 404s.
If you can match the spikes in Google Maps API requests with the website's page visits/views, then most likely your website is being targeted by DDoS attacks.
The URL with the parameters that you've shared is generated when a search is performed through a Toolset view. Based on the information we have so far, an attacker could be searching with different, random map locations as the center, which would force the second API key to be engaged in the background for geocoding those uncached locations.
Have you considered placing the search forms behind a captcha or a logged-in-user wall?
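Short of a captcha or a login wall, one common mitigation is a per-IP throttle in front of the geocoding step, so that one visitor (or bot) can't trigger unlimited cache misses. This is a minimal in-memory sketch of that idea, not something Toolset or Wordfence ships as-is; a production setup would use Wordfence rules or a persistent store:

```python
import time
from collections import defaultdict, deque

class SearchThrottle:
    """Allow at most `limit` geocoding searches per IP per `window` seconds."""

    def __init__(self, limit=10, window=60.0):
        self.limit = limit
        self.window = window
        self.hits = defaultdict(deque)   # ip -> timestamps of recent searches

    def allow(self, ip, now=None):
        now = time.monotonic() if now is None else now
        q = self.hits[ip]
        while q and now - q[0] > self.window:   # drop hits outside the window
            q.popleft()
        if len(q) >= self.limit:
            return False                        # throttled: skip geocoding
        q.append(now)
        return True
```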
"which would force the second API to get engaged in the background, for the Geocoding of those uncached locations"
I think I understand now, but what's with all the 404 (or 4.X.X) errors?
Even if I purposely type in total nonsense/gibberish, it still returns the AJAX response with the [no items found] default data I set up. Is the system returning a 404 on the backend every time a location isn't pulled from the cache?
As for the user login, that's way too much to ask. We need this map to make it easy for every casual, would-be customer to find a dealer in their area as quickly and easily as possible. We even use a server-side IP API to feed the view an approximate default location for users not on a mobile IP.
Do you think that adding reCAPTCHA v3 to view queries and forms would greatly help with this very costly issue?
Correct me if I'm wrong, but reCAPTCHA v3 doesn't require any clicking unless there's an issue. It's been out since 2018.
Waqar asked for my input here, and I spoke with the Maps developer about the sources of server API requests and showed him this thread.
He thinks Waqar's assessment is accurate: an attack targeting the address search forms with locations or coordinates that were not previously cached would explain the high volume of requests.
The 404s are a bit of a puzzle, but the developer conjectures that, with the server under such heavy load during these attacks, it may not be able to service the requests. (Have you inspected the server logs to see what is generated at the times you see high volumes?)
As for limiting the attacks, if that's what this is, then you may need to protect the distance filters, yes. (If you take frequent backups, you may be able to determine whether the address cache is being filled with just coordinate searches or actual location searches by comparing the cache before and after. I assume the bots would be able to use the "Use my location" option and pass dummy coordinates, or perhaps they detect Google autocomplete on the location input and provide dummy addresses.)
In any case, Google reCAPTCHA v3 is likely your best bet for protecting those inputs, given that it is relatively unobtrusive.
I'm afraid that's something you will need to implement yourself, though (or have a developer do it if needed). V3 has more options and produces a score rather than the binary result of previous versions, and our developers, when reviewing it, considered it too complex to implement in general-purpose tools like Toolset.
Directions for adding it can be found here: hidden link
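For context on the score-based behavior mentioned above: v3's server-side check POSTs the client token to Google's documented siteverify endpoint, which returns a success flag plus a 0.0–1.0 score, and your code picks the cutoff. A minimal sketch of that decision step (the endpoint and response fields are Google's documented API; the 0.5 threshold is just a common starting point, not a Toolset setting):

```python
import json
import urllib.parse
import urllib.request

VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"

def verify_token(secret, token):
    """POST the reCAPTCHA token to Google's siteverify endpoint and
    return the parsed JSON response ({"success": ..., "score": ...})."""
    data = urllib.parse.urlencode({"secret": secret, "response": token}).encode()
    with urllib.request.urlopen(VERIFY_URL, data=data) as resp:
        return json.load(resp)

def allow_search(verify_response, threshold=0.5):
    """v3 returns a score (1.0 = very likely human); the site owner
    chooses the threshold below which searches are rejected."""
    ok = bool(verify_response.get("success"))
    score = verify_response.get("score", 0.0)
    return ok and score >= threshold
```

Hooking this into the view's AJAX search would mean rejecting the request (and skipping any geocoding) whenever `allow_search` returns False.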
The topic ‘[Closed] Google Maps API – private API key – 404s via Toolset Backend…’ is closed to new replies.