Ray-Ban Meta Glasses (Gen 2)
Today I met with an offshore operator in Singapore. They charter boats, move crews, move cargo, keep offshore platforms running. Someone in the meeting said something simple and true: no matter how much technology you give people, people remain a major factor in the equation.
He is right. There is a lot of tacit knowledge in operating a ship that never shows up in documentation. A skilled crew member still knows how to do things much better than AI can today.
Safety has the same dynamic. Many organizations are moving away from behavior-based safety (BBS) toward human performance. BBS measures near misses through an accounting lens. Human performance changes the process instead of blaming the individual. It is about improving the system itself.
But something bigger is coming that most industries have not absorbed yet. Wearables.
We already see them in consumer life: rings that track sleep, watches that track runs, bikes connected to the internet. And now the Ray-Ban Meta glasses. This may be the first form factor that can replace the phone as the primary device we use.
You do not understand it until you put them on. These glasses feel like the BlackBerry moment. Not the iPhone yet, but the first credible step toward it.
Gen 1 felt gimmicky so I skipped it. Gen 2 is different. Eight hours of battery. A case that extends it. Good pictures and good recordings. It has replaced my iPhone as my primary camera. I now take more pictures with my glasses than with my phone.
It is also replacing my AirPods. AirPods slip; the glasses do not, and they are more comfortable. The audio is not as clear because the speakers sit farther from my ears, and people with larger heads may find the frame tight. But I still prefer using the glasses.
Talking on the phone is not the biggest shift. The glasses are becoming one of my primary interfaces to LLMs. For almost two years ChatGPT has been my main LLM because it has my memories and I can access it everywhere. But the glasses are even more convenient because they are on my face. I say "Hey Meta" and begin a conversation. Those conversations get stored. It is a real threat to ChatGPT.
I still prefer ChatGPT because GPT 5.1 is stronger and its speech capability is the best. Meta is better than Siri and better than Gemini, but OpenAI leads in speech and conversation. ChatGPT's real-time internet search is better than Grok's or Gemini's; Gemini still relies on older data. Everyone will eventually catch up. The point is I do not want to pick up my phone anymore. I want to talk to my glasses.
Two examples show the impact. When I rode my bike with my kids and talked to my parents on WhatsApp, I used to keep the camera off because my phone was in my pocket. Now I turn the camera on through the glasses and they see what I see. They see my kids riding. A maintenance worker in the field could do the same thing.
At the airport I held my plane ticket up to my eyes and asked my glasses what my gate was. It struggled at first, but after two or three tries it figured it out. It recognized the ticket, ran OCR, and returned the gate number. A small example, but a real wearable doing a real-time computer vision task.
Most processing is still done on the phone. The glasses do minimal processing and store pictures. The phone connects to the internet for full LLM power.
Ships all over the world are connected via Starlink. Underground mines have WiFi. 5G is everywhere. Wearables will follow. We will see massive changes, bigger than we expect. Even difficult tasks where workers are blamed for mistakes will shift. Workers will adopt these technologies because they make them safer and more efficient. The incentives change, and the Big Brother fear shrinks.
Companies that understand this, and act on it without spending hundreds of millions or damaging their balance sheets, will be ready. They can become more than a shipping company with ships and crews. They can become a company with ships, crews, and a platform that differentiates them and supports organic and inorganic growth.