New Delhi: The anger around Galgotias University’s “Orion” robot dog points to something larger. It hints at how we are learning to talk about AI in India, and how easily talk can start replacing work.
If a university can put a bought product on a summit floor, rename it, and let the room read it as “our innovation”, it tells you something about the incentives at play.
In many places, the reward is for being seen on the right stage, not for doing the slow, invisible work that builds real capability.
The university’s defence did not deny that the robodog was procured. It said the machine was acquired for learning. That is a fair point. Many institutions buy tools from global companies. Labs across the world do it.
Students learn by touching real machines, opening them up, running experiments, and failing many times. Nobody should be shamed for buying equipment.
So what’s the problem? It is the framing. When a procured device is displayed as a centrepiece, branded like a product reveal, and placed inside a narrative of “ecosystem” and “innovation”, the audience is pushed to assume authorship.
Viewers do not enter a public exhibition thinking, “This is a procurement demo.” They assume, “This is what you have built.”
What bothers people even more is the setting. This happened at a national summit that is meant to project seriousness.
A summit is not a college fest. It is where institutions, startups and governments signal priorities. It is where big words get used: self-reliance, leadership, future-ready, global scale.
If a university’s public face blurs the line between procurement and innovation, what does that say about how students are being taught to understand AI?
AI is not a prop. It is not a poster. It is not a robot that moves and makes people clap. AI is also data, methods, testing, safety, failure, and the discipline to be honest about what a system can and cannot do.
If the adults in the room treat AI like a branding exercise, students learn that branding matters more than building. They learn that naming matters more than explaining. They learn that shortcuts are normal.
The episode also sits uncomfortably with how the country is trying to sell its AI story. India does have real talent. It has serious researchers. It has strong engineering teams. It has useful digital rails. It has startups doing hard work in healthcare, language, commerce and public services.
But the “AI moment” is also becoming a stagecraft moment. The bigger the stage, the greater the temptation to overclaim.
This is where the government PR question becomes relevant. Not as an accusation, but as a question.
When the national mood around AI is being built through summits, photo-ops and big numbers, what gets rewarded? The sharp demo or the quiet deployment? The shiny exhibit or the boring documentation? The headline or the hardening?
A government can genuinely want AI progress and still end up building a PR-first ecosystem if the structure rewards spectacle. The two can exist together. That is what makes intent hard to judge from outside. Outcomes are easier to see.
If high-profile events become places where posturing survives without scrutiny, then the event design is part of the problem.
The Galgotias moment also raises a simpler question: what do we mean by “innovation” in public conversation?
Innovation can be building from scratch. It can also be adapting, improving, and applying a tool in a new context.
A university could have said: “This is a Unitree platform. Here is what our students have built on top of it. Here are the models they trained. Here are the tasks it can do in Indian settings. Here is what failed.”
That would have been a proud story too. Possibly a better story. It would have sounded like learning, not like claiming.
When institutions don’t do that, it creates a trust gap that spills beyond one institution. It makes the public more cynical about the next exhibit. It makes serious players look suspect because the bar of credibility drops. It also gives ammunition to people who want to dismiss the entire AI push as noise.
If India wants a credible AI ecosystem, the basics have to be boring and strict. Clear disclosures. Clear labels. Clear claims. Real demos that show what is built locally versus what is purchased.
That clarity is not anti-national. It is pro-capability.
A strong country can say, “We buy global tools, and we build on them.” A strong institution can say, “We didn’t build this machine, but we built learning on top of it.”
And the government, if it wants the AI story to survive beyond event season, has a stake in that honesty too. Because the real test of “serious work” is not a summit floor.
It is what stays after the banners come down: datasets that can be used responsibly, compute that is accessible, research that is publishable, products that work for citizens, and systems that can be trusted.