Mark Zuckerberg thinks you’re going to need a pair of AI-powered glasses to stay competitive in the near future. No glasses, no superpowers. His argument is that without them, you’ll be at a “significant cognitive disadvantage.”
That sounds like a CEO trying to sell you something, but unfortunately, it is not far-fetched.
Why glasses, though?
Zuck’s reasoning is simple. AI assistants are becoming increasingly powerful, capable of holding conversations, identifying what you’re looking at, and even summarising your emails or helping you write code.
But interacting with these assistants via your phone or laptop is still not as straightforward as breathing. You have to stop what you’re doing, open an app, type or talk, and then wait.
Smart glasses eliminate all that friction. You’re wearing the AI. It sees what you see, hears what you hear, and talks to you like a voice in your head.
It becomes an assistant that’s available in real time, not a tool you reach for only when you need help.
In Zuckerberg’s view, this kind of seamless interface is the next step. Those without it will fall behind.
What’s Meta doing?
Meta already sells Ray-Ban Meta smart glasses that let you talk to Meta AI, take pictures, play music, and even livestream. It’s basic stuff for now, but they want to take it even further.
They are talking about glasses that can visually overlay information, act as memory aids, or help you with tasks just by observing your environment.
They’re calling it “personal superintelligence”, which is just a fancy way of saying an AI that knows you super well and helps you in real time, every day, everywhere.
It actually makes sense
Phones don’t usually have context. They don’t know what you’re looking at unless you point the camera. Smart speakers don’t follow you around. Smartwatches give you basic notifications and that’s about it.
Glasses, on the other hand, sit on your face, facing the world with you. So they have the ultimate context about what you’re up to.
You could be walking through an OK supermarket and ask your glasses, “What’s the price of this same product at Pick n Pay?” or say, “Remind me which brand of cooking oil my wife said not to buy.”
Or better still, in a work meeting, your glasses could transcribe everything and highlight action points as the meeting happens.
And yes, that sounds insane until you realise that AI is already capable of this. The only thing missing is the delivery method, and glasses could well be the best one for it.
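For what it’s worth, the pieces already exist in today’s AI tools. Here’s a rough sketch of the meeting example, using OpenAI’s public Python SDK. The file name, model choices and prompt are placeholders, and a real pair of glasses would do this continuously rather than on a saved recording:

```python
# Sketch: turn a recorded meeting into action points with OpenAI's public API.
# Assumes: `pip install openai`, an OPENAI_API_KEY in the environment,
# and a recording saved as meeting.wav (all placeholder names).
from openai import OpenAI

client = OpenAI()

# Step 1: speech-to-text on the meeting recording.
with open("meeting.wav", "rb") as audio_file:
    transcript = client.audio.transcriptions.create(
        model="whisper-1",
        file=audio_file,
    )

# Step 2: ask a chat model to pull out the action points.
summary = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model choice
    messages=[
        {
            "role": "system",
            "content": "Extract the action points from this meeting transcript "
                       "as a bullet list, noting owners where mentioned.",
        },
        {"role": "user", "content": transcript.text},
    ],
)

print(summary.choices[0].message.content)
```

The hard part was never the AI; it’s getting the audio and the answers to and from you without you pulling out a phone.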
The Zimbabwean reality
Now, here’s the problem. We’re in Zimbabwe.
Let’s say Meta or Apple eventually makes AI glasses affordable. Maybe $199 for a pair. That’s still a tough sell in a country where most people don’t even own a basic smartphone.
Worse still, it’s not just about the price. It’s about infrastructure, internet accessibility, and even electricity availability.
But there’s another, less obvious disadvantage: language.
Most of these AI assistants, and the glasses that act as their eyes and ears, are trained to understand English and a few other major languages.
They don’t yet understand Shona, Ndebele, or any of the other local languages that we use in real-world conversations in Zimbabwe.
Even in corporate settings, Shona often sneaks into boardroom discussions, client meetings, and negotiations. If your AI assistant cannot understand what’s being said, it cannot help. So the “always listening” feature becomes far less useful here.
We tried asking Google’s Gemini if it understands spoken Shona, and it said no. The same applies to ChatGPT, Alexa, and the rest of them. They may handle written Shona to some degree, but real-time audio? Not yet.
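If you want to poke at this yourself, open speech models make it easy to try. The sketch below assumes you have a short Shona clip saved as shona_clip.wav, plus the transformers library and ffmpeg installed. It runs the clip through OpenAI’s open Whisper model, whose language list nominally includes Shona, and prints whatever it thinks was said:

```python
# Rough test: can an off-the-shelf speech model transcribe spoken Shona?
# Assumes: `pip install transformers torch`, ffmpeg on the system,
# and a short recording saved as shona_clip.wav (placeholder name).
from transformers import pipeline

# whisper-small is a placeholder choice; larger variants generally do better.
asr = pipeline("automatic-speech-recognition", model="openai/whisper-small")

# Tell Whisper to treat the audio as Shona instead of auto-detecting the language.
result = asr(
    "shona_clip.wav",
    generate_kwargs={"language": "shona", "task": "transcribe"},
)

# Expect plenty of errors: low-resource languages get far less training data.
print(result["text"])
```

The output tends to make the point better than any benchmark: the model will have a go, but the results are nowhere near what English speakers get for free.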
This becomes a real limitation. Even if Zimbabweans manage to get their hands on the glasses, they’ll be speaking a language the assistant doesn’t speak back.
So the “cognitive disadvantage” Zuckerberg warned about is not just about not wearing glasses. It is also about whether your language is even recognised by the AI powering them.
It becomes another layer of exclusion. Those in well-connected economies walk around with real-time AI assistants that understand their every word. The rest of us are still manually translating our thoughts so the machine can catch up.
Still, it sounds impressive
Even with all that said, Zuck has a point. If you’re already plugged in, own a smartphone, use AI apps like ChatGPT, and do most of your work online, then glasses might genuinely be the next step.
They solve the latency problem. They’re hands-free. They know the context. And unlike your phone, you’re unlikely to leave them in a kombi or drop them in the toilet.
This isn’t about whether Meta wins the race. Apple, Google, and even OpenAI are all chasing the same thing. The question is whether we, as Africans, are preparing to participate in this new wave or be left behind again.
Fortunately, we’re seeing some work to get our local languages supported in AI tools.
Zuckerberg’s warning might sound dramatic, but he’s right to say that the way we interact with AI is about to change completely. Glasses could be the smartphone replacement we didn’t know we needed.
Whether we get them or not is a different story.