I think it will be ubiquitous too, but I can't see it crossing over into the mums'n'dads/family mainstream like phones have. It's like when you see sci-fi films and people have to wave their arms around in the air to get the simplest tasks done. It's silly.
Much as there is a big side of me that likes tech for tech's sake, gimmicks and all sorts of things that I expect to remain very niche or nerdy, that's not what I associate with ubiquity. For AR to take off in the way I was suggesting very much requires various flavours of the mainstream to have found compelling uses for this stuff.
The world of AR and spatial computing on phones is still at the clunky stage. The low-hanging and obvious fruit in terms of applications started with things like measuring apps, and companies salivating at the prospect of customers being able to put virtual versions of their furniture into the context of their homes before they buy, etc.
I can't predict when enough of the clunky aspects will be out of the way for this stuff to become a more obviously winning proposition with a whole bunch of killer uses. And I don't really want to go crazy coming up with all sorts of theoretical applications yet, because of how far-fetched or silly they may seem at this stage, when some of the complicated problems are yet to be solved and the hardware form-factors remain unclear.
I don't like to predict what sorts of tech will end up appealing to what sorts of people, because I grew up with computer games etc having generational connotations that are increasingly untrue. For example, the Wii did not end up being some game-changing development in the grand scheme of things, but it was a pleasure to see it introduce gaming to various groups who hadn't engaged that much before, due to its different input mechanism and the momentum the platform briefly had in terms of party & sports games etc. For years my Mum went to occasional small 'Wii nights' with her retirement-age friends. Things like the Apple Watch have useful features for all sorts of different ages too.
And when it comes to how we interface with computing devices, and what catches on and what doesn't, I prefer to just wait and see. Sometimes it's a long wait - for decades I was not sure if people would ever take to voice-controlled stuff or not. Most indications in the clunky decades were not positive, but it was hard to know how much of that was down to the clunkiness/inaccuracy/tedium of the immature tech, and how much was down to other factors such as some possible psychological aversion to talking to machines. It still wasn't really clear to me in the early years after the clunkiness was much reduced and the accuracy much increased, when the likes of Siri first arrived. But now that we have all these digital assistant things like Alexa all over the place, they seem to have caught on enough, I suppose. I say that begrudgingly because I don't use them myself and I don't know many people who do, but they certainly resemble a mainstream thing.