News

The robot in question is a skid-steer 4-wheeled toy, to which he has added an ADNS3080 mouse sensor fitted with a lens, an H-bridge motor driver board, and a Wemos D1 Mini microcontroller board.
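The ADNS3080 in that build is an optical-flow sensor: it reports signed 8-bit x/y displacement counts that firmware can integrate into a rough odometry estimate. A minimal sketch of that integration step, with the actual SPI register read left out and an assumed (hypothetical) mm-per-count scale factor:

```python
# Sketch: dead-reckoning from ADNS3080-style optical-flow counts.
# The real sensor is read over SPI; here we take raw delta_x/delta_y
# register values directly. The scale factor is an assumption -- it
# depends on the lens and the sensor's height above the surface.

MM_PER_COUNT = 0.17  # assumed calibration constant, not from the article

def to_signed8(value: int) -> int:
    """Interpret a raw 8-bit register value as a signed count."""
    return value - 256 if value > 127 else value

def accumulate(deltas, scale=MM_PER_COUNT):
    """Integrate raw (delta_x, delta_y) register pairs into a position in mm."""
    x = y = 0.0
    for raw_dx, raw_dy in deltas:
        x += to_signed8(raw_dx) * scale
        y += to_signed8(raw_dy) * scale
    return x, y

# Three motion reports, one negative (0xFF reads back as -1 counts):
print(accumulate([(10, 0), (0xFF, 0), (0, 5)], scale=1.0))  # -> (9.0, 5.0)
```

On real hardware the firmware would poll the sensor's motion register and burst-read the deltas each loop iteration; the arithmetic above is the same either way.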
A new “machine vision sensor” can adapt to extreme lighting conditions much faster than the human eye, in about 40 seconds.
Tangram Vision, a startup building software and hardware for robotic perception, unveiled a new 3D depth sensor today called HiFi that packs powerful computer vision capabilities into an off-the ...
Scientists have developed a robot eye that can adjust to sudden changes in light much faster than humans. This new sensor ...
The system contains a sensor, chip and tiny AI model inspired by biological eyes and brains and uses a tenth of the energy of a camera-based system.
In blinding bright light or pitch-black dark, our eyes can adjust to extreme lighting conditions within a few minutes. The human vision system, including the eyes, neurons, and brain, can also learn ...
The sensor is an adaptation of a technology called GelSight, which was developed by the lab of Edward Adelson, the John and Dorothy Wilson Professor of Vision Science at MIT, and first described ...
Robots need love, too. That's why MIT researchers have added a touch-force sensor to the robotic Baxter, allowing him to register gentle caresses, tender hand-holding, and the sense that he is ...
The advanced robot vision is being developed as part of a European project that uses a sensor that employs a complicated digital imaging process known as ‘3D foveation’.
An open-source project aims to give a rudimentary eye to robots with the help of a camera that can detect, identify and track the movement of specific objects. The Pixy camera sensor board, being ...
Most robots that use camera sensors are confined to 2D perception. Wanna know how restrictive that is? Grab a racket, close one eye, and try to get through a set of squash. A company called PIXMAP ...