After importing 24,000 posts with addresses, the response times are very long or requests do not finish at all. What can I do?
Distance-based searches are processor-intensive. You can add a caching plugin to improve the response times of these queries somewhat, and you can lower the number of results shown per page to help as well. You can also install Query Monitor and inspect each individual database query. If any single query takes longer than 0.5s, we can investigate in more detail.
How can I optimize the whole process, or reduce the number of requests to Google?
Each map address must be geocoded individually, so I do not know of any way to work around the Geocoding API quotas with batch processing. Toolset Maps uses a caching system that stores each geocoded address after it has been processed once by the Geocoding API. Once an address has been geocoded, the cached result is used for all future requests. So at first you will have many requests, but over time the number drops as the cache builds up; only new addresses require requests to the API.
In a View that shows many results with maps, I would consider switching to static maps: simple map images instead of full interactive maps. On the single-result page, show the full interactive map. The Static Maps API is a separate API that must be activated in the Google API Console, and it has its own quotas and restrictions: https://developers.google.com/maps/documentation/maps-static/usage-and-billing
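For reference, a static map is just an image fetched from a URL. The sketch below builds such a URL using the Maps Static API endpoint; `YOUR_API_KEY` is a placeholder, and the coordinates and parameters are illustrative only, not anything specific to Toolset.

```python
# Build a Maps Static API image URL for a single marker.
from urllib.parse import urlencode

def static_map_url(lat, lng, zoom=14, size="600x300", key="YOUR_API_KEY"):
    """Return a static map image URL centered on one marker."""
    params = {
        "center": f"{lat},{lng}",
        "zoom": zoom,
        "size": size,                 # width x height in pixels
        "markers": f"{lat},{lng}",
        "key": key,
    }
    return "https://maps.googleapis.com/maps/api/staticmap?" + urlencode(params)

url = static_map_url(51.5034, -0.1276)
```

Embedding such a URL in an `<img>` tag is far cheaper to render than a full interactive map, which is why it suits listing pages with many results.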