Tantalizing details of Jony Ive's AI device leak after OpenAI meeting
As OpenAI buys Jony Ive and Sam Altman's AI startup, the two have shared with staff a few hints about what their highly secret device will do.

Jony Ive
After weeks of speculation, OpenAI has bought Jony Ive and Sam Altman's AI startup for $6.5 billion. Publicly, OpenAI has posted a video discussing Ive and Altman's partnership in general terms, but now further details of their work have been revealed.
According to the Washington Post, Altman told OpenAI staff in a meeting on Wednesday, May 21, 2025, that the aim is to ship 100 million AI "companions." He described it as "the chance to do the biggest thing we've ever done as a company here."
"We're not going to ship 100 million devices literally on day one," Altman said. But he continued that OpenAI would ship that number of devices "faster than any company has ever shipped 100 million of something new before."
What the device will and won't be
Altman and Ive then reportedly told staff that the plan was for their device to be a user's third one, something they would put on their desks after an iPhone and a MacBook Pro. It would be able to go on a desk or in a pocket, and it would be unobtrusive.
The device is said to be entirely aware of a user's surroundings, and even their life. Despite speaking chiefly of a device, singular, Altman also said it would be a "family of devices," and Ive referred to it as "a new design movement."
Speaking of their original plans and the eventual need to join forces with OpenAI, the two men said that they had intended for Ive's startup to build and sell its own device, using OpenAI technology. Because the device is not just an accessory but a central facet of using OpenAI, Altman said, the two companies had to combine.
"We [Ive and Altman] both got excited about the idea that, if you subscribed to ChatGPT, we should just mail you new computers, and you should use those," said Altman.
However, Ive and Altman had reportedly concluded that existing devices would not work. Specifically, current laptops and websites would not be sufficient.
Altman said that current use of AI "is not the sci-fi dream of what AI could do to enable you in all the ways that I think the models are capable of."
It's already been reported that the device will not be a smartphone, and Altman has now said that it isn't a pair of glasses, either. Jony Ive is further said to be skeptical about making a wearable device.
What happens next
It's already known that the device by Jony Ive's "io" company has reached a prototype stage, because Altman revealed publicly that he has been able to take one home, and "live with it."
In this meeting, Altman reportedly told staff that his goal is to release a device by late 2026.
He further suggested that acquiring Ive's "io" could potentially add $1 trillion in value to OpenAI.
Prior to OpenAI buying the company, Jony Ive was reported to be seeking funding of $1 billion for the device. Investors including Laurene Powell Jobs are said to have invested.
Comments
Brushed metal (caressible) or glass (lickable).
Internal wireless connectivity (no ports) and using mics/speakers from other devices.
Cloud processing plus subscription. Bingo!
Honestly, the 'sci-fi dream' is what it's always been: AI with an interface of some kind. Ideally voice and a screen. That's it.
Would a humanoid robot be a plus? Yes for some situations, but if it's going to sell in hundreds of millions, I'll rule that option out.
Rotten Apple will feel the pressure. Better now than late: Tim Cook needs to step down as he can’t afford to doom Apple further and further since Vision Pro.
There's a pattern of Apple ex-employees failing on their own. Not saying the pattern won't be broken, but it exists.
Guessing it’s Magic Ball pen ;-)
I think there's a very distinct possibility that it will fail hard. Even if it ends up being a good product, Apple will likely copy it within a year or two and take over the market.
If this does end up being a tricorder type device, privacy/security concerns will be immense. Apple's strong brand in privacy/security combined with PCC can give them a leg up. Also, I wonder if it will turn out that a separate device really isn't necessary for people who already have multiple apple devices. Maybe OpenAI sees a need for a device because they currently sell no devices.
I suspect the overriding impetus for these clowns is to be billionaires.
Their AI device, "a user's third one", will hijack your iPhone and Mac like a zombie fungus?
Last thing I'd want is the ship computer voice interface or a phaser.
My record is indeed broken here. If this thing is a voice-only interface, it fails. Like, is the life cycle complete for the Amazon Echo and smart speakers? Nobody likes to discuss them anymore? This OpenAI thing will be a smart speaker, but with a set of cameras? Amazon already has the hardware and is just waiting on its chatbot to mature?
I've been thinking about voice interfaces that people like, and I'm coming up empty. We have experienced a voice interface, for most of our lives, in the form of automated telephone help systems. Everyone hates them, right? Chatbots won't fix that because their purpose is to drive down costs to do customer support. With chatbots, it will just be a nicer, more insidious way not to give you the support you want.
Is anyone calling smart speakers a success? Whenever I hear media talk about them today, it really seems to be about sound quality, not voice interface capability. In 2015 and 2016, though, the narrative was similar to today's. Amazon was all in on making Alexa a super useful voice assistant, able to do translation, buy stuff, do things for you. Apple was way behind, and if they didn't do anything, they would be doomed. What's the narrative now on Alexa? Also-ran? Boring because of LLM chatbots?
So the Siri, Alexa, and Google Assistant hardware, basically version 1 of chat interfaces, ended up being not much of anything? Nobody made any profits, not even Apple. The rumor is that they are breaking even on the hardware, while everyone else took a loss.
The hype train was going for the LLM chatbot assistants the last couple of years with the Humane AI Pin, the Rabbit R1, and who knows what else. That ran its course in about a month after the hardware came out, and nobody found a use for them. At least not a use good enough to overcome the deficiencies of v1.0 wearable hardware. OpenAI is trying again. What's going to be different this time?
If it is a voice-only user interface, it fails. LLM chatbots are a service, like search is a service. It's not a hardware product. If it didn't have a text or prompt interface, I think it would be failing too. Fortunately, it does have a text interface. Apple is uniquely averse to CLI interfaces. It's been a detriment for them. So I'd hope they would change from this perspective, but hardware-wise, they have their bases covered.