This is a great write-up, a good read.
I always love these analyses.
All these years on, even with all these improvements (for Californians), it still seems like a futile effort on Apple's part to even try to compete with Google in this area.
The level of detail Google has about every business is just insanely deep; I constantly use info like opening times for tiny shops in my neighborhood.
Just realized that my title is somewhat weird with the “(again)”. I meant to reference an even more eye-opening analysis by Justin from last year (2017), showing the beauty of a well-played long game in the mapping field:
“[…] In other words, Google’s buildings are byproducts of its Satellite/Aerial imagery. And some of Google’s places are byproducts of its Street View imagery... ...so this makes AOIs a byproduct of byproducts:
This is bonkers, isn’t it?
Google is creating data out of data.”
Great write-up, but it baffles me that Apple is still working on basic cartography. The vegetation stuff is cool, but the accuracy is troublesome. At least now you'll know there's a tree there when Apple Maps tells you to take a wrong turn into the forest.
I hope they are getting it for free and not spending too much time on it... It looks like road and path visibility suffer a bit due to the reduced contrast.
Ahh yes, because we all use maps to see where trees are.