Groundbreaking AI Surgery: Johns Hopkins’ Flawless Gallbladder Removals
I came across this fascinating video covering some of the freshest developments in physical AI, and honestly, what grabbed me most was the AI-powered surgical robot developed by researchers at Johns Hopkins. According to the video, this robot performed complete, unassisted gallbladder removals flawlessly across eight surgeries on synthetic human models that closely mimic real anatomy. The team named their system the Surgical Robot Transformer Hierarchy (SRT-H), which builds on the well-known da Vinci Research Kit but adds machine learning so the robot can learn like a medical student: by watching hours of real surgical videos, without step-by-step instructions.
What’s wild here is how the robot handled 17 individual tasks from identifying tiny ducts to placing microscopic clips and even cutting tissue with scissors. It dynamically adapted to unexpected differences in tissue, demonstrating real-time judgment. Plus, it understood verbal cues from the team—like a nurse suggesting a clip be checked—which speaks volumes about how far AI interaction has come. The results were impressive: a 100% success rate with no errors. Sure, it was a bit slower than a human surgeon, but the precision clearly matched years of practice. The lead researcher put it plainly: this isn’t just about repeating programmed steps; the robot actually understands and makes judgment calls. To me, that’s a game-changer in surgical robotics. It’s not hard to imagine this technology expanding from synthetic models to real patients in the near future.
Autonomous Robots Take the Field: China’s All-Robot Football Match
Switching gears from operating rooms to sports fields, the video also spotlighted China's first autonomous robot football match, held in Beijing's Yizhuang zone. Here, four teams of fully independent humanoid robots went head-to-head with no human joysticks allowed. Each team had three active bots plus a substitute, playing two 10-minute halves and managing to spot the ball, track teammates, and decide on passes or shots with over 90% accuracy. While the skill level was compared to kindergarteners (awkward stumbles and all), the autonomy is the real takeaway: the robots made every decision during the game themselves, a milestone for AI and robotics combined.
Founder Cheng Hao is already envisioning mixed human-robot games but emphasizes safety first. Still, with the speed at which the vision and control algorithms are improving, a crossover game between humans and bots feels much closer than sci-fi. Watching humanoid robots in a sports setting isn't just cool; it shows how AI is maturing in unstructured, real-world environments.
Amazon’s Deep Fleet Brain and Intel’s New Robotics Powerhouse
The video also touched on Amazon’s massive robot fleet milestone: their one millionth production robot just joined the floor in Japan. Robots and humans now have about a 1:1 ratio in over 300 fulfillment centers worldwide. Amazon’s new “Deep Fleet” AI model orchestrates every shuttle’s path, anticipating traffic and reshuffling tasks on the fly. This coordination cuts travel time by 10%, meaning packages move faster to conveyors and eventually to your doorstep. What I appreciated hearing here was Amazon’s stance on workers—these robots aren’t there to replace humans but to offload heavy, repetitive lifting while upskilling staff into technical roles. Since 2019, 700,000 workers have passed through training programs to maintain and program these robots. It’s a good reminder that robotics and AI often work hand-in-hand with human labor, at least for now.
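The video didn't go into how Deep Fleet actually works under the hood, but orchestration like this boils down to traffic-aware routing: scoring a shuttle's path not just by distance but by predicted congestion along the way. Here's a toy sketch of that idea; the graph, congestion values, and numbers are made up for illustration and have nothing to do with Amazon's real system:

```python
import heapq

def route(graph, congestion, start, goal):
    """Dijkstra over a warehouse grid where each edge's cost is its
    base travel time inflated by a predicted congestion factor."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue
        for nxt, base_cost in graph[node]:
            # Inflate cost on aisles the model expects to be busy.
            cost = base_cost * (1.0 + congestion.get((node, nxt), 0.0))
            nd = d + cost
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                prev[nxt] = node
                heapq.heappush(heap, (nd, nxt))
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return list(reversed(path)), dist[goal]

# Toy warehouse: A-B-D is shortest on paper, but B->D is predicted
# to be congested, so the router detours through C instead.
graph = {
    "A": [("B", 1.0), ("C", 1.5)],
    "B": [("D", 1.0)],
    "C": [("D", 1.0)],
    "D": [],
}
congestion = {("B", "D"): 2.0}  # predicted traffic triples B->D's cost
path, cost = route(graph, congestion, "A", "D")
print(path, cost)  # ['A', 'C', 'D'] 2.5
```

Re-running this every few seconds with fresh congestion predictions is, in spirit, the "reshuffling tasks on the fly" the video describes.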
Intel took a different but equally interesting angle by spinning off its RealSense division into a standalone company, backed by $50 million in fresh funding. RealSense is well known for depth-sensing cameras used in drones and autonomous machines, and incoming CEO Nadav Orbach promises new products focused on safety and plug-and-play ease. The move signals that the physical AI space is ripe for investment and innovation, with major players eager not to be left behind.
AI-Powered Art, Open-Source Desktop Robots, and Smarter Robot Training
The video wasn't only about big industry news; it also delved into more creative and community-friendly innovations. One standout was Ai-Da, a humanoid robot with eerily lifelike features that just unveiled an oil painting of King Charles called "Algorithm King." With the ability to swap tools and painstakingly recreate brushstrokes, Ai-Da's art sparks debate around what counts as true creativity in the AI era. Its creator, Aidan Meller, frames the project as an ethical experiment meant to widen the conversation rather than replace human artists.
On the open-source front, Hugging Face introduced Reachy Mini, a tiny desktop robot priced at $299. It's designed for hobbyists and kids to tinker with, supporting Python programming and even Scratch and JavaScript. The real kicker? Every hardware and software detail is open on GitHub, encouraging users to share custom motion packs and teach the bot new tricks. Projects like this democratize robotics in a way that's really exciting for community builders and AI enthusiasts alike.
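I haven't dug into the actual SDK, so take this purely as a hypothetical sketch of what a shareable "motion pack" could look like in Python; none of these class or function names come from the real Reachy Mini library:

```python
# Hypothetical sketch only: Keyframe and play() are NOT the real
# Reachy Mini API, just an illustration of how a community motion
# pack might be structured as plain data plus a tiny player.
from dataclasses import dataclass

@dataclass
class Keyframe:
    head_yaw: float    # degrees, left/right
    head_pitch: float  # degrees, up/down
    duration: float    # seconds to reach this pose

def play(motion_pack, send_pose=print):
    """Step through a list of keyframes, handing each pose to the
    robot (stubbed out here with print)."""
    for frame in motion_pack:
        send_pose(f"yaw={frame.head_yaw} pitch={frame.head_pitch} "
                  f"over {frame.duration}s")

# A toy "look around" pack someone might publish on GitHub.
look_around = [
    Keyframe(-30, 0, 0.5),
    Keyframe(30, 0, 1.0),
    Keyframe(0, 10, 0.5),
]
play(look_around)
```

The appeal of the open model is exactly this: a motion pack is just data, so teaching the bot a new trick can be as simple as sharing a file.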
Training robots safely remains a big challenge, but researchers from the University of Sydney and NVIDIA showcased a clever method called QStack. It combines model predictive control with deep reinforcement learning, and its innovation is generating safety-aware cost maps on the fly, without manual tuning. The result? Around 80% task performance with far fewer training samples, and a real-world fruit-picking success rate above 93%. Efficient, safety-conscious training like this could impact everything from warehouse logistics to autonomous vehicles navigating busy streets.
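The video stayed high-level, but the general idea of folding a safety cost map into trajectory scoring is easy to illustrate. This is a toy example of the generic pattern, not the actual QStack method; the cost map here is hard-coded, whereas the paper's whole point is generating it automatically:

```python
import math

def score_trajectory(traj, goal, cost_map, safety_weight=5.0):
    """MPC-style evaluation: distance from the trajectory's endpoint
    to the goal, plus accumulated safety cost along the way."""
    gx, gy = goal
    x, y = traj[-1]
    goal_cost = math.hypot(gx - x, gy - y)
    safety_cost = sum(cost_map.get(cell, 0.0) for cell in traj)
    return goal_cost + safety_weight * safety_cost

# Two candidate rollouts on a grid; cell (1, 1) sits near an
# obstacle, so the (here hand-written) cost map penalizes it.
cost_map = {(1, 1): 1.0}
goal = (2, 2)
direct = [(0, 0), (1, 1), (2, 2)]           # short but unsafe
detour = [(0, 0), (0, 1), (1, 2), (2, 2)]   # longer but clear
best = min([direct, detour],
           key=lambda t: score_trajectory(t, goal, cost_map))
print(best)  # the safer detour wins despite being longer
```

Swap the hand-written `cost_map` for one produced by a learned model and you get the flavor of "safety-aware cost maps on the fly" described above.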
Figure AI’s Bold Predictions: Humanoids in Our Homes Soon?
Finally, Brett Adcock from Figure AI made a bold claim on the "Around the Prompt" podcast: in just a few years, we'll have humanoid robots helping out in homes and offices with logistics and other tasks. Their robot, driven by the Helix AI model, already performs an hour of nonstop work at near-human pace. With over $2 billion raised and growing interest in humanoid robotics from giants like Tesla and Boston Dynamics, Adcock argues the real hurdle isn't feasibility but scaling production and deployment. He envisions a future where humanoid robots are as common as humans on sidewalks, serving as the ideal platform for artificial general intelligence. Whether you buy into that or find it overly optimistic, it's food for thought about where physical AI is heading.
Which Development Surprised You Most?
Watching this range of advancements, from surgical bots that grasp nuance and execute delicate procedures to football-playing humanoids and democratized desktop robots, gives a clear sense of how multifaceted AI in robotics is today. Was it the flawless AI surgeries? The autonomy of the robot footballers? Or maybe Ai-Da's elegant paintings? Personally, I'm still wrapping my head around the surgical robot's ability to adjust on the fly and understand verbal commands; that's something I hadn't quite pictured AI doing so soon.
What about you? Drop your thoughts and let’s chat about which breakthrough excites or surprises you the most.