In the late 1990s, while the world obsessed over digital cameras and DVDs, Sony quietly set out to build something entirely different. Not a tool or a toy, but a piece of technology that gave the illusion of life.
This is the story of Sony’s most personal experiment. It wasn’t a product in the usual sense. It was a question wrapped in circuitry, asking whether something mechanical could make you believe, even for a moment, that it had a soul.
For more than twenty years, that question kept echoing through breakthroughs, cancellations, revivals, and thousands of small dogs that never stopped trying to be loved.
They called them AIBO.
In 1993, Toshitada Doi, already a legend inside Sony for developing the company’s first digital audio recorder and later co-creating the Compact Disc, traveled to MIT out of pure curiosity. He had heard about the breakthroughs happening at the AI Lab and wanted to see firsthand how American researchers were rethinking intelligence and autonomy. For Doi, it was about chasing inspiration wherever it might be found.
There, on a cluttered workbench, he watched a small, six-legged robot named Genghis crawl across a table, responding to the world and making decisions as it went. In that moment, Doi didn’t see a machine. He saw something alive, and the idea took root.
He returned to Tokyo with a thought he couldn’t shake. What if a machine didn’t just respond to commands, but responded to you? What if it could exist in your home not as a tool, but as a presence?
Slowly the vision began to take shape. Not as a humanoid or a toy, but as a creature that could understand you, react to you, and make you feel something. Just a presence, moving, learning, simply existing by your side.
Now, here’s the full story. How a backroom experiment turned into one of Sony’s most original creations. How it lived, died, and came back again. And why AIBO still matters today.
Doi was deeply respected within Sony, and that trust gave him the freedom to explore ideas that might never have been approved under anyone else. In 1994, he assembled a small team within Sony’s newly formed Digital Creatures Lab. Their goal was simple: create a robot that felt alive. He repeated one message to his team again and again. Make history.
The first prototypes were strange. One of them scurried around on six legs like an insect. It was fast and functional, but some found it deeply unsettling. Sony executives called it a cockroach.
But Doi and his team weren’t chasing realism. They were exploring how movement, emotion, and presence could create a sense of connection. They experimented with limbs, joints, and even personality traits. They were trying to evoke the feeling of being with a living creature.
Eventually, they settled on four legs. Dogs were familiar, expressive, and easy to bond with. The design took on smooth metallic curves, oversized joints, LED eyes, and sensors built into the head and back. It had microphones to hear your voice and a camera to see the world. Under the shell, it was a real computer.
The first AIBO, model ERS-110, didn’t just walk and wag. It listened, learned, and made decisions on its own. Inside, a 64-bit RISC processor pulsed at 50 megahertz, running Sony’s custom Aperios operating system. It had 16 megabytes of RAM and a modular software design called OPEN-R, which let it evolve through updates and custom programs. Its body moved with eighteen degrees of freedom: three points of articulation in each of its four legs, three in the head, two in the tail, and one in the mouth.
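That joint count is easy to picture as a simple breakdown that sums to eighteen. The sketch below is purely illustrative; the joint names and grouping are assumptions, not Sony’s OPEN-R configuration.

```python
# Hypothetical breakdown of the ERS-110's articulation points.
# Names and grouping are illustrative only, not Sony's OPEN-R layout.
degrees_of_freedom = {
    "leg_front_left": 3,
    "leg_front_right": 3,
    "leg_rear_left": 3,
    "leg_rear_right": 3,
    "head": 3,
    "tail": 2,
    "mouth": 1,
}

total = sum(degrees_of_freedom.values())
assert total == 18, f"expected 18 degrees of freedom, got {total}"
print(f"total degrees of freedom: {total}")
```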
AIBO could hear your voice through stereo microphones, see through a color CCD camera, and sense its surroundings using infrared, heat, touch, and motion sensors. It maintained balance. It reacted to contact. It even got back up if it fell. The included 8MB Memory Stick held its core software and also recorded behavioral data, allowing it to grow over time.
The name AIBO officially stood for "Artificial Intelligence roBOt," highlighting the advanced technology at its core. In Japanese, "aibō" (相棒) also means partner or companion, a word often used to describe a close friend or trusted sidekick. By choosing this name, Sony signaled that AIBO was something you could connect with, a robotic presence designed to evoke real emotion and companionship.
The full package included the AIBO unit, two lithium-ion batteries, a charging station, an AC adapter, a brightly colored ball, and a Sound Commander remote for tone-based interaction. Owners who wanted more could buy the AIBO Performer Kit, a separate software suite that let them design custom moves and sounds using a PC-based motion editor. Through a simple timeline interface and 3D visual model, anyone could choreograph original actions and load them into the robot using the Memory Stick. Even users with no experience in robotics could create something unique.
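The Performer Kit’s actual motion format isn’t shown here, but the idea behind a keyframed timeline is easy to sketch. The Python below is a hypothetical illustration of that concept; the class names, joint names, and linear interpolation are assumptions, not Sony’s editor or file format.

```python
from dataclasses import dataclass, field

# Hypothetical model of a keyframed motion, in the spirit of the
# Performer Kit's timeline editor. Joint names and the interpolation
# scheme are assumptions for illustration, not the real file format.

@dataclass
class Keyframe:
    time_ms: int                     # position on the timeline
    joint_angles: dict[str, float]   # target angle per joint, in degrees

@dataclass
class Motion:
    name: str
    keyframes: list[Keyframe] = field(default_factory=list)

    def angle_at(self, joint: str, t_ms: int) -> float:
        """Linearly interpolate a joint's angle at time t_ms."""
        frames = [k for k in self.keyframes if joint in k.joint_angles]
        if not frames:
            raise KeyError(f"no keyframes for joint {joint!r}")
        frames.sort(key=lambda k: k.time_ms)
        if t_ms <= frames[0].time_ms:
            return frames[0].joint_angles[joint]
        for a, b in zip(frames, frames[1:]):
            if a.time_ms <= t_ms <= b.time_ms:
                span = b.time_ms - a.time_ms
                ratio = (t_ms - a.time_ms) / span if span else 0.0
                return a.joint_angles[joint] + ratio * (
                    b.joint_angles[joint] - a.joint_angles[joint]
                )
        return frames[-1].joint_angles[joint]

# A tiny "wag" motion: sweep the tail from side to side over one second.
wag = Motion("tail_wag", [
    Keyframe(0,    {"tail_pan": -30.0}),
    Keyframe(500,  {"tail_pan": 30.0}),
    Keyframe(1000, {"tail_pan": -30.0}),
])
print(wag.angle_at("tail_pan", 250))  # 0.0, halfway through the first sweep
```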
Sony’s leadership didn’t know what to do with it. There was no precedent and no roadmap, but Doi didn’t present AIBO as just a product. He pitched it as a statement, a symbol of what Sony could be, not just a company that made devices but one that made dreams.
They gave it a chance with a limited release: 3,000 units in Japan and 2,000 in the United States. There were no retail stores, no advertising, and all sales were online. The price was 250,000 yen in Japan or 2,500 dollars in the United States.
It sold out in minutes.
In Japan, it took twenty minutes to sell out. In America, just seconds. Sony had launched a phenomenon. It appeared on magazine covers and was featured in documentaries and research papers. It became a cultural moment the world had never seen before.
Owners gave their AIBOs names, voices, and birthdays. Some dressed them in tiny outfits while others taught them new tricks. Children grew up with them, adults cried when they broke, and engineers kept releasing new behaviors as owners continued to share their stories.
Sony kept improving the robot. The second generation added better motion, touch sensors, and new colors. The third generation brought facial recognition, Wi-Fi, autonomous charging, and even the ability to blog. The ERS-7 could map your home, read your expressions, and decide when to ignore your commands. By the mid-2000s, AIBO had become one of the most advanced consumer robots ever sold.
And then, it ended.
In 2006, Sony shut the project down as financial pressure mounted and robotics was no longer seen as part of its core business. Doi had already retired, and by 2014, support officially ended. Spare parts ran out, batteries failed, and one by one, the little dogs stopped moving.
But not everyone gave up. Owners formed underground repair circles, and former engineers quietly offered help. In Japan, a Buddhist temple began performing funeral ceremonies for broken AIBOs, where owners came to say goodbye as a priest chanted and the little robots lay at rest.
Twelve years passed.
Then, in 2018, Sony brought AIBO back. The ERS-1000 arrived with a Snapdragon processor, 4 gigabytes of RAM, 32 gigabytes of storage, and twenty-two degrees of freedom. It had OLED eyes, a nose camera, Wi-Fi, LTE, and full cloud connectivity. It could recognize dozens of faces, download new tricks, and interact in real time. It barked, stretched, cuddled, and played.
The price was higher and the audience smaller, but the soul remained the same.
AIBO was never meant to replace real pets. It was meant to ask a question. Can a machine make you feel something? Can it learn to love you back?
Toshitada Doi believed it could.
And twenty years later, it still does.