Aiholics: Your Source for AI News and Trends
AI futurology · Research

Robot, know thyself: How vision is teaching machines to understand their bodies

Neural Jacobian Fields (NJF) enables robots to learn their own 3D shape and movement purely through visual observation.

By Daniel Reed, AI Research, Safety & Ethics Analyst
Published: July 31, 2025 · 6 Min Read

Robots that truly know their own bodies — it sounds like a sci-fi dream, but recent research from MIT's CSAIL team is making it real. I came across a fascinating breakthrough called Neural Jacobian Fields (NJF), a new way for robots to learn how their bodies move using just a single camera, without relying on any onboard sensors or pre-programmed models. This isn't about building smarter physical parts — it's about teaching machines to understand themselves visually, much like how we learn to control our own fingers by observing and experimenting.

Imagine a soft robotic hand curling its fingers around an object, but instead of a maze of sensors or complex programming, it simply ‘watches’ itself with a camera and figures out how its movements work. NJF flips the traditional robotics approach on its head. Instead of forcing robots to conform to rigid, sensor-laden designs so humans can control them, robots can now learn their own internal models from visual feedback alone. This opens the door to flexible, affordable robots with embodied self-awareness — an ability that could revolutionize how machines interact with messy, real-world environments.

“The main barrier to affordable, flexible robotics isn’t hardware — it’s control of capability, which could be achieved in multiple ways.”


Why vision over sensors?

Traditional robots often rely on rich sensor suites and pre-coded mathematical models to know where their parts are and how to move them. This works well for rigid arms on factory lines, but it's limiting if you want robots to be soft, deformable, or bio-inspired in shape — areas where sensors can be costly or impractical. I found it interesting that NJF removes these constraints by using purely vision-based learning. The system trains a neural network to jointly capture a robot's 3D shape and how each part moves in response to motor commands, learned by observing random motions recorded by cameras.

Building on neural radiance fields (NeRF), which reconstruct 3D scenes from images, NJF goes a step further. It learns a Jacobian field — a fancy term for mapping how every point on the robot’s body responds to control inputs. What’s remarkable is that the system discovers this relationship without any human supervision or prior models. It’s like watching someone fumble with a new gadget until they figure out what each button does, but here, the robot figures out which motor controls which part of its body all by itself.
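The core idea can be sketched in a few lines of linear algebra. As a purely illustrative toy (the function names, shapes, and the fake "field" below are my own, not the authors' code), suppose the learned field returns a small matrix for each point on the robot's body: multiplying it by a motor command predicts that point's motion, and stacking these matrices lets you solve the inverse problem — which command produces a desired motion — by least squares:

```python
import numpy as np

def jacobian_field(point):
    """Toy stand-in for the learned neural field: returns a 3xM matrix
    (M = 2 motor commands here) for a query point on the robot's body.
    Deterministically seeded so repeated queries agree."""
    seed = abs(hash(tuple(np.round(point, 3)))) % 2**32
    rng = np.random.default_rng(seed)
    return rng.standard_normal((3, 2))

def predict_motion(points, command):
    """Forward model: predicted 3D displacement of each surface point
    for a command u, i.e. delta_x = J(x) @ u evaluated point-wise."""
    return np.stack([jacobian_field(p) @ command for p in points])

def solve_command(points, desired_motion):
    """Inverse control: stack the per-point Jacobians and solve a
    least-squares problem for the command that best realizes the
    desired point motions."""
    J = np.concatenate([jacobian_field(p) for p in points])  # (3N, M)
    target = desired_motion.reshape(-1)                      # (3N,)
    u, *_ = np.linalg.lstsq(J, target, rcond=None)
    return u

points = np.array([[0.0, 0.0, 0.1], [0.1, 0.0, 0.2]])
u = np.array([0.5, -0.3])
moved = predict_motion(points, u)
u_hat = solve_command(points, moved)  # recovers the command from its effect
```

The real system learns the field from video rather than fabricating it, but the forward/inverse structure is the point: once the Jacobian field exists, control becomes a solvable linear problem.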

Testing across robot types demonstrates broad potential

The team put NJF through its paces on various robots — from a soft pneumatic hand that can pinch and grasp, to a rigid 3D-printed arm and even a rotating platform without any embedded sensors. Each time, the system learned the robot's shape and control responses using just visual input and random movements. After an initial training period with multiple cameras, the robot only needs a single monocular camera to perform real-time control at 12 Hz, allowing for responsive and adaptive behavior.
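That real-time loop can be sketched too. This is a hypothetical illustration under simplifying assumptions (a hand-built stand-in for the learned field, reachable targets, no camera noise), not the authors' controller: each tick stacks the per-point Jacobians, measures the error toward target point positions, and takes a damped least-squares step:

```python
import numpy as np

RATE_HZ = 12  # real-time control frequency reported for NJF

def control_step(field, points, targets, gain=0.5):
    """One control tick: stack per-point Jacobians from the (learned)
    field and take a damped pseudoinverse step toward the targets."""
    J = np.concatenate([field(p) for p in points])  # (3N, M)
    error = (targets - points).reshape(-1)          # (3N,)
    return gain * (np.linalg.pinv(J) @ error)       # command u

# Toy stand-in for the learned field: a fixed 3x2 Jacobian per tracked
# point (z rows are zero, so each point keeps its height).
J_true = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0],
                   [0.5, 0.3], [0.0, 1.0], [0.0, 0.0]])

def field(p):
    return J_true[0:3] if p[2] < 0.15 else J_true[3:6]

points = np.array([[0.0, 0.0, 0.1], [0.0, 0.0, 0.2]])
# Pick targets the toy robot can actually reach (displacement = J @ u*):
u_star = np.array([0.1, -0.05])
targets = points + (np.concatenate(
    [field(p) for p in points]) @ u_star).reshape(-1, 3)

for _ in range(60):  # 60 ticks at 12 Hz ~= 5 s of simulated control
    u = control_step(field, points, targets)
    points = points + (np.concatenate(
        [field(p) for p in points]) @ u).reshape(-1, 3)
```

With the damping gain of 0.5, the residual error halves every tick, which is why a modest 12 Hz loop is enough for responsive behavior.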

Why does this matter for us outside the lab? The technology promises to enable robots that can work in complicated, unstructured environments without expensive sensor arrays. Think agricultural robots that precisely localize plants in a field, or construction site assistants navigating chaos without carefully installed GPS or tracking systems. It also hints at applications like indoor drones or legged robots negotiating uneven terrain, all powered by the robot’s ability to visually understand its body.


Challenges on the horizon and an exciting future

Of course, NJF has limits. Training currently requires multiple cameras and must be repeated for each new robot. The system also doesn't yet generalize across different robot models or handle force and tactile sensing, which are important for tasks involving contact and touch. But the researchers are actively exploring ways to overcome these hurdles, improving generalization and extending the model's spatial and temporal reasoning.

What really sticks with me is the broader shift this represents in robotics: moving away from rigid programming toward teaching robots through observation and interaction. This vision-based self-awareness mimics how humans develop control over their bodies — by experimenting, sensing visually, and adapting — rather than by memorizing detailed mechanical rules.

As one researcher put it, the goal is to make robotics more affordable, adaptable, and accessible, lowering the barriers created by costly sensors and complex coding. We stand at the cusp of a new era in which robots won't just follow instructions; they'll understand their own movements and can be shown what to do instead of being meticulously programmed. That's truly exciting for anyone passionate about the future of AI-driven machines.

In the end, NJF offers a glimpse of robots with a kind of bodily self-awareness — shaping the future of soft robotics, bio-inspired machines, and adaptable automation. I can’t wait to see where this vision-led control system takes us next.

Tagged: AI, AI Models, AI research, MIT, vision

About the author: Daniel Reed, AI Research, Safety & Ethics Analyst
Daniel Reed currently works as an AI Research, Safety & Ethics Analyst at Aiholics, writing about how changes in artificial intelligence are affecting and will affect scholarship, society, and human civilization. He reports on breakthroughs in AI research, the development of safety frameworks, discussion of long-term risks, and ethical challenges; he also reports on global shifts in policy and governance. Daniel aims to make complex research papers and long-term thinking accessible to the everyday reader without sacrificing nuance. With his thoughtful and analytical style of writing, Daniel translates advanced topics into clear language. He targets questions that really matter: how safe are today's AI systems, what kind of ethical boundaries do we need, and how could exponential progress affect the way education, jobs, governance, and human values are shaped? His articles are often not just expert opinions but also balanced views and insight into emerging debates that define AI's place in the world. Daniel believes responsible AI development begins with awareness, transparency, and informed public conversation. In terms of his work with Aiholics, he encourages readers to look beyond headlines to understand the promise of artificial intelligence but also some of its consequences.