This is the technical support forum for Toolset - a suite of plugins for developing WordPress sites without writing PHP.
Everyone can read this forum, but only Toolset clients can post in it. Toolset support works 6 days per week, 19 hours per day.
I am facing two issues which might be related to each other. The error log is showing the following duplicate entry error:
AH01071: Got error 'PHP message: WordPress database error Duplicate entry '9 Riverview Terrace Riverside BRIDGWATER TA6 3JL' for key 'PRIMARY' for query INSERT INTO ncGmi2Ndr_toolset_maps_address_cache ( `address_passed`, `address`, `point` ) VALUES ( '9 Riverview Terrace Riverside BRIDGWATER TA6 3JL', '9 Riverview Terrace, Riverside, Bridgwater TA6 3JL, UK', ST_PointFromText('POINT(-3.003766 51.134377)') ) made by require('wp-blog-header.php'), require_once('wp-includes/template-loader.php'), include('/themes/understrap/page.php'), get_template_part, locate_template, load_template, require('/themes/understrap/loop-templates/content-page.php'), the_content, apply_filters('the_content'), WP_Hook->apply_filters, do_shortcode, preg_replace_callback, do_shortcode_tag, WP_Views->short_tag_wpv_view, WP_Views->render_view_ex, WP_Views->render_view, wpv_do_shortcode, apply_filters('wpv-pre-do-shortcode'), WP_Hook->apply_filters, Toolset_Addon_Maps_Views->run_shortcodes, do_shortcode, preg_replace_callback, do_shortcode_tag, ...', referer: hidden link
This error occurs on the following page:
This error is occurring because of newly created listings with duplicate addresses, which were published on 16th July. See the screenshot for reference:
The issue is that users are allowed to add multiple listings for the same location: it's a property-related site, a single address can contain multiple property types, and each property type must be added separately per the site design.
There is another issue which might be caused by this: I imported around 8k listings a few days ago, but the search page shows only 93 total listings. I have a premium Google Maps API tier, so higher limits are allowed. See the screenshots for reference:
So why is it taking so long to geocode all the addresses? Can we speed up this process? See the screenshot of the 'toolset_maps_address_cache' table:
It shows 217 entries, but the search page shows only 93 results. I was using W3 Total Cache and Redis for object and page caching; I have disabled them now.
Looking forward to a quick resolution!
So why is it taking so long to geocode all the addresses?
My guess is it's not a matter of taking too much time, rather that geocoding has not been triggered for these addresses. The system isn't designed to handle imported addresses very well, it's designed to accept addresses entered manually. Simply importing addresses does not trigger a geocoding event. The way the system is built is such that geocoding is triggered when the address is needed to place a marker on a map. If the address does not currently exist in the maps cache, it gets sent to the Google Maps API for geocoding. In the standard workflow of a site, this isn't a problem because someone is entering address data in wp-admin or in a Form and chooses a mappable address from an autocomplete field. Then the address is immediately mapped, geocoding is triggered, caching happens, and the necessary data is put in place.
When addresses are imported, the mapping step is skipped and geocoding doesn't automatically occur. So then the geocoding event won't occur until the address is needed to place a marker on a map somewhere on the site. If all your front-end site map markers are populated by Views with distance filters set up, this is a problem because distance calculations also rely on the geocoded results (the cached results). Without cached, geocoded results, distance calculations cannot be performed on that listing. So in a bit of a catch-22, that listing can never be marked on a map with a distance filter, hence no geocoding event is triggered for that address. Therefore, if all your map markers are placed by Views that have distance filters, addresses that are imported will likely never be included in those maps unless you physically open those posts in wp-admin. That will trigger a map geocoding event because a map with a marker is shown in the post editor.
So how do we trigger the geocoding event programmatically for imported addresses? Open all 8000 imported posts in wp-admin? Obviously not. The only practical way I've found around this is to create a map with markers coming from a View with no filters. Load all the results, or, better, filter it somehow to load only the imported results. Let's say it loads 500 results per page in a paginated View. Drop that map in a page somewhere on your site and run a cron that hits each paginated URL of the unfiltered map to trigger the geocode process for the imported addresses. You may need to configure the cron to hit each page multiple times to trigger a geocode event for all the addresses, depending on your geocoding API limits on addresses per second.
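As a rough illustration of that cron idea, here is a minimal sketch of a script that visits each paginated URL of the unfiltered map View so that each page load triggers geocoding for its batch of addresses. The base URL and the `wpv_paged` query parameter name are assumptions here; check the actual URLs your paginated View produces and adjust accordingly.

```python
# Sketch: warm each paginated URL of an unfiltered map View so the page
# render triggers geocoding for imported addresses. BASE_URL and the
# "wpv_paged" parameter are assumptions -- match your site's real URLs.
import time
import urllib.request

BASE_URL = "https://example.com/geocode-trigger-page/"  # hypothetical page


def page_url(base, page):
    """Build the URL for one paginated set of View results."""
    return f"{base}?wpv_paged={page}"


def warm_pages(base, pages, delay=120):
    """Load each paginated page with a pause between requests."""
    for page in range(1, pages + 1):
        with urllib.request.urlopen(page_url(base, page)) as resp:
            resp.read()       # rendering the page triggers geocoding
        time.sleep(delay)     # stay under the geocoding API rate limits


# Example (commented out so the sketch has no side effects):
# warm_pages(BASE_URL, pages=20)
```

You could run this from a server-side cron entry instead of hitting a single URL repeatedly; the delay between requests is there to respect per-second geocoding limits.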
It's still possible that invalid or unmappable addresses were imported and cannot be geocoded, so posts with those addresses will never be included in results of a View with a distance filter. If you import addresses, you bypass the internal validation system and the validation falls on your external process. The only way to guarantee good addresses is to use a Google Maps Autocomplete widget wherever your data is generated.
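Since importing bypasses the autocomplete validation, one option is to pre-validate addresses against the Google Geocoding API before importing. The endpoint and response shape below follow Google's documented Geocoding API; the API key is a placeholder, and this is only a sketch of the idea, not part of Toolset itself.

```python
# Sketch: pre-validate addresses with the Google Geocoding API before
# importing, since an import bypasses Toolset's autocomplete validation.
# API_KEY is a placeholder you would replace with your own key.
import json
import urllib.parse
import urllib.request

API_KEY = "YOUR_API_KEY"
ENDPOINT = "https://maps.googleapis.com/maps/api/geocode/json"


def parse_geocode_response(payload):
    """Return the normalized address Google resolved, or None if unmappable."""
    if payload.get("status") != "OK" or not payload.get("results"):
        return None
    return payload["results"][0]["formatted_address"]


def validate_address(address):
    """Geocode one address; returns the canonical form or None."""
    query = urllib.parse.urlencode({"address": address, "key": API_KEY})
    with urllib.request.urlopen(f"{ENDPOINT}?{query}") as resp:
        return parse_geocode_response(json.load(resp))
```

Importing the `formatted_address` value instead of the raw input would also give you the same normalized capitalization the autocomplete workflow produces.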
With all that being said, I can try to replicate the issue you've described where identical addresses are imported, producing a primary key error in the cache table.
Thanks for your informative reply. I have added a View without any filters and have set up a cron script which fetches the page every 2 minutes. I can see that 3 or 4 listings are getting geocoded on each run.
Yes, please look into the duplicate entry issue. It could cause performance issues in the long run.
Okay, after several rounds of testing I am a bit closer to understanding the problem. It seems to be related to capitalization differences in similar addresses more so than identical addresses. In my first few tests I was able to successfully import up to 200 posts with identical addresses without producing any errors, but then I realized the error message you shared contains variations of an identical address with differences in capitalization (BRIDGWATER vs. Bridgwater). I'm now able to replicate this consistently in a local test, even without a View, when importing posts with addresses that differ only by capitalization.
Let me ask my 2nd tier team to investigate and determine if this is something we will address in our software. The problem here is that the imported addresses have not been validated by the Google Maps API, which would normally prevent this problem when the addresses are chosen from autosuggest options. Our system expects to work with validated, geocodable addresses. Those entries would be normalized to have consistent capitalization in the normal workflow, and the duplicate key problem would not occur. Since there is no system in place to resolve geocoding errors in imported addresses, the issue requires manual intervention by editing the posts with similar addresses. I'll escalate and let you know what I find out.
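Until there is a fix, one workaround is to scan the import file for addresses that differ only by capitalization before importing, since those are the ones that collide on the cache table's primary key under a case-insensitive MySQL collation. This is only a sketch; the CSV column name "address" is an assumption about your import file's layout.

```python
# Sketch: find addresses in an import file that differ only by
# capitalization, which collide on the address cache's primary key when
# the table uses a case-insensitive collation. The "address" column
# name is an assumption about the import file.
import csv


def case_insensitive_duplicates(addresses):
    """Group addresses that become identical when case is folded away."""
    seen = {}
    for addr in addresses:
        seen.setdefault(addr.casefold(), []).append(addr)
    # Keep only groups with two or more distinct spellings.
    return {k: v for k, v in seen.items() if len(set(v)) > 1}


def check_import_file(path, column="address"):
    """Return the colliding address groups found in a CSV import file."""
    with open(path, newline="") as f:
        rows = csv.DictReader(f)
        return case_insensitive_duplicates(r[column] for r in rows)
```

Normalizing the colliding spellings to a single form before importing should avoid the duplicate key error entirely.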
This issue needs to be handled in the Toolset software, because a lot of users will be importing listings like I did, and they will face the same issues.
I have added a page without any filters and have set a limit of 3500 entries. Here is the shortcode I am using:
[wpv-view name="all-map-listings-geocoding" limit="3500" cached="off"]
and page link:
I have also configured a cron which runs every 3 minutes, but it looks like only a single listing gets geocoded on each page visit, which is too slow for 5k+ entries. So far 1072 results are appearing on the page. I have also checked the Google Cloud console and it shows 0 quota errors, so there is no issue related to the API quota. How can I speed up this process further? Please hide the links I have posted above from the public, as they contain sensitive data.
Is it okay for me to add pagination to this View? It's not very easy for me to see what's going on. A fair number of these are invalid (some are web URLs, for example) and I'm not able to easily tell how many should be mappable to get a clear idea of how the system is breaking down. If I add pagination I can get a better idea of the results for each paginated map set.
Yes, sure. It's just for testing purposes, so you can do whatever you need to do. So far only 1077 results are appearing, which means the imported listings are not getting geocoded.
Yes, there are a few spam entries, but the majority of listings are spam-free.
Ugh this is painfully slow, even after adding pagination. I'm seeing the same thing you see, each time I refresh the page loads one or two more markers. As a test, I would like to try importing the same dataset you imported and run it on my local machine. I'd like to see if I can replicate the same problems you are experiencing, so I can ask a developer for some advice. Do you have a CSV or XML doc I can use to run the same import? Please post to Dropbox or Drive somewhere and provide a download link here. All URLs you share in the forum are hidden from other Users for your privacy, so only supporters will have access to that download link you share.
Here you go:
Thanks, I have downloaded the file so you can remove it from Dropbox if you'd like. I'll continue testing and try to get some feedback from the maps team. Please stand by.
Okay my support team leader and I have been running some tests, and we're seeing results similar to the results we see on your site in the paginated map View: hidden link
We're seeing around 5 new results geocoded per page load as well. We've found that pagination of around 50 results per page in the View seems to be the most efficient at maximizing the number of new geocoded markers per page load. We presented this to our maps developer for some feedback, and his response is that this limitation on receiving new geocoded markers comes from Google Maps geocoding API throttling. I've asked for some additional details and have yet to get a response, but I'll keep you posted.
My team leader has found a place in our software where we impose some throttling on the number of addresses submitted for geocoding per batch. He changed that limitation and provided a patch file here: hidden link
Please download the patch file and extract it, replacing wp-content/plugins/toolset-maps/includes/toolset-common-functions.php
Once the patch is applied, I can run a few more tests using the paginated listings pages on your site and see if the patch has provided any improvement to batch geocoding. Let me know when you're ready and I'll start.
Thanks for your patience!
I have uploaded the patch as requested, so please test on your end. I don't see any improvement, though.
Feel free to use the same code for your own tests. For each of the URLs I tested, all 50 result markers were displayed:
The next URL I tried only showed 49 markers:
I clicked "zoom" next to each result in the list until I found the missing marker: the one unmappable address was "Apartment 27 Old Tannery Bingley BD16 4JJ"
Seeing all markers displayed for all these pages, I decided to change tactics to determine exactly how many addresses are being geocoded each time I hit the page. So I went to Toolset > Settings > Maps and loaded the list of cached data. I used jQuery to count the number of rows in the list, and found 1921 items. Next, I hit this URL:
I found 48 markers displayed (see below for more information about that), and then visited the cached data list again. This time I counted 1961 rows, meaning 40 items were added to the cache when I loaded the page. I'd say that is a major improvement over the previous 1 or 2 additional markers per page load.
As far as why only 48 items were displayed on the 35th page of results, I visited each marker by clicking "zoom" in the list of results. I found two items in the list are not mappable:
Ohudoniko Ijirufo Ufenik OTEDESIVE
Idyrizu Ugexoryt Ynyjy UNEKOZES
So I'm curious how you came to the conclusion that the speed of geocoding has not improved. Can you tell me more about the process you are using to reach this conclusion?
I was just loading the first page, which was not showing any improvement. Your observation looks correct. So do I need to load each paginated page to geocode the addresses? How can I do this quickly?
The topic ‘[Closed] Search page not showing all entries and duplicate entry error in error log’ is closed to new replies.