There are some startups trying to attack the problem of making your memory portable, things like mem0.ai. But otherwise, I think the number of people who will want to run models locally is quite small, especially as the capabilities of the applications on top of the models in these platforms grow.
I agree in general: users/consumers are 'lazy' - the lowest-friction, most comprehensive product UX always wins. The current smartphone OS landscape is a good example, though Android/Google Pixel is catching up fast.
Except that this time around, we are picking a Life OS, or at least that is the direction I see things trending towards. That is what triggered my context/history portability question: the thought of my thoughts, search queries, projects, and purchase history being held hostage in a single platform scares me... the same reason I opted out of iOS for my personal devices long ago.
Worse yet, it doesn't seem like any privacy or consumer protection law will catch up with the rapid development and adoption of AI assistants, be it in the form of an app or a wearable, any time soon.
On the other hand, as a product person, I can see context/memories becoming the most differentiating feature of any consumer-facing AI product: having visibility into and access to one's entire online activity as input for fine-tuning is how the model becomes more personal and, uniquely, you. From there it is just a self-perpetuating flywheel of user value > usage > feature/product enhancement, with the platform lock-in effect.
If I may quote Charles Dickens, it is the best of times and the worst of times to be a founder right now.