Suppose you have a travel planning tool, like Wanderlog, and you need to display the details for all the places in a plan. Fetching the place details for every item, multiplied by the number of users, multiplied by how often an average user views or edits their plan, means that even at a medium or small scale the API bill alone could bankrupt the startup.
Now, suppose you have the money to pay: since Google's Place Details API does not support bulk fetching, you need to fetch the details for each place in the plan separately. The only performance-viable option is to spawn multiple async tasks and fetch those details in parallel. However, you would hit the rate limit very quickly.
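For concreteness, here's a minimal sketch of that "parallel with bounded concurrency" approach in TypeScript, assuming the classic Place Details web-service endpoint. The helper names, the concurrency cap, and the env-var name are my own illustrative choices, not anything official:

```typescript
// Sketch: fetch Place Details for many place IDs with bounded concurrency.
// Endpoint/params follow the public Place Details web service; adjust to
// whichever API version you actually use. MAX_CONCURRENT is illustrative.

const API_KEY = process.env.GOOGLE_MAPS_API_KEY ?? "";
const MAX_CONCURRENT = 5; // stay well under your per-second quota

async function fetchPlaceDetails(placeId: string): Promise<unknown> {
  const url =
    `https://maps.googleapis.com/maps/api/place/details/json` +
    `?place_id=${encodeURIComponent(placeId)}&key=${API_KEY}`;
  const res = await fetch(url);
  if (!res.ok) throw new Error(`Place Details failed: ${res.status}`);
  return res.json();
}

// Simple worker-pool pattern: N workers pull place IDs off a shared index,
// so at most MAX_CONCURRENT requests are in flight at any time.
async function fetchAllDetails(placeIds: string[]): Promise<unknown[]> {
  const results: unknown[] = new Array(placeIds.length);
  let next = 0;

  async function worker(): Promise<void> {
    while (next < placeIds.length) {
      const i = next++; // safe: increment happens before any await
      results[i] = await fetchPlaceDetails(placeIds[i]);
    }
  }

  await Promise.all(Array.from({ length: MAX_CONCURRENT }, worker));
  return results;
}
```

Even with a cap like this, a plan with dozens of stops still costs dozens of billable Details calls every time it's rendered, which is the heart of the cost problem.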
What about caching?: Unlike a few years ago, the current caching terms of use allow you to cache only the latitude/longitude and the place ID (not even the name!).
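For illustration, here's a sketch of what a terms-compliant cache entry could look like under that reading; the field and function names are hypothetical, and you should verify the allowed fields and retention periods against the current Google Maps Platform terms yourself:

```typescript
// Assuming only the place ID and coordinates may be persisted: everything
// else (name, photos, rating, opening hours) would have to be re-fetched
// from the API on demand.
interface CachedPlace {
  placeId: string;  // stable identifier
  lat: number;      // coordinates
  lng: number;
  cachedAt: number; // so entries can be expired per whatever period applies
}

const placeCache = new Map<string, CachedPlace>();

function cachePlace(placeId: string, lat: number, lng: number): void {
  placeCache.set(placeId, { placeId, lat, lng, cachedAt: Date.now() });
}
```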
So either companies like Wanderlog/TripIt/etc. simply cache all the details (which is against the terms), or there is another solution that I'm missing here.
Would love to hear your experiences or simply opinions on the matter.
Google tends to have a greater number of commercial POIs (since it caters to potential advertisers), but OSM is catching up in this area as well.
Full disclosure: GeoDesk develops a spatial database engine for OpenStreetMap data, so I'm obviously biased on this issue.
They could of course charge 1M USD per request or whatever, but then nobody would use their APIs. And if that's what they want, it's easier to just stop providing third-party access, no?
I mean, the goal is obviously (?) to make money, I think. But if you price it such that only very big players can afford it... then... I feel like I'm missing a basic economics explanation here.
Sure, greed and all that, but... I'm still baffled, so clearly I don't understand.
Hopefully someone can shed some light on this.