Google and NASA will make smarter flying spheres with Kinect sensors

Remember in “Star Wars” when Luke Skywalker deflects lasers from a floating orb with his lightsaber? Google and NASA are bringing something like that orb to life.

The floating robots, or SPHERES (Synchronized Position Hold, Engage, Reorient, Experimental Satellites), will roam around the International Space Station equipped with Google’s Project Tango technology.

There are three SPHERES currently aboard the ISS, each one with its own propulsion and power systems contained in a free-flying satellite about the size of a volleyball.

Right now, they use ultrasound and infrared technology to navigate their way around. It’s not exactly the most advanced system, which is where Google comes in. In February, the company unveiled Project Tango, its initiative to put 3-D mapping technology inside an Android smartphone.
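Ultrasound positioning of this kind typically works by timing how long a chirp takes to travel from fixed beacons to the robot, converting those times to ranges, and solving for position. The sketch below illustrates that idea with least-squares trilateration; the beacon layout, speed of sound, and function names are illustrative assumptions, not details of the actual SPHERES system.

```python
import numpy as np

# Hypothetical sketch of ultrasound time-of-flight trilateration.
# Beacon positions and the speed of sound are illustrative values.

SPEED_OF_SOUND = 343.0  # m/s in cabin air (approximate)

def trilaterate(beacons, tof):
    """Estimate a 3D position from time-of-flight to four or more known beacons.

    beacons: (N, 3) array of beacon coordinates in metres
    tof:     (N,) array of ultrasound travel times in seconds
    """
    beacons = np.asarray(beacons, dtype=float)
    d = SPEED_OF_SOUND * np.asarray(tof, dtype=float)  # ranges in metres
    # Subtract the first range equation from the rest to linearise:
    # 2 (p_i - p_0) . x = |p_i|^2 - |p_0|^2 + d_0^2 - d_i^2
    A = 2.0 * (beacons[1:] - beacons[0])
    b = (np.sum(beacons[1:] ** 2, axis=1) - np.sum(beacons[0] ** 2)
         + d[0] ** 2 - d[1:] ** 2)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Example: four beacons at the corners of a 2 m cube, target at (1.0, 0.5, 0.25)
beacons = [(0, 0, 0), (2, 0, 0), (0, 2, 0), (0, 0, 2)]
target = np.array([1.0, 0.5, 0.25])
tof = [np.linalg.norm(np.array(p) - target) / SPEED_OF_SOUND for p in beacons]
estimate = trilaterate(beacons, tof)  # recovers roughly (1.0, 0.5, 0.25)
```

With only a handful of beacons and noisy echoes, position fixes like this are coarse and slow, which is exactly the gap the Tango hardware is meant to close.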

Each SPHERE satellite is self-contained with power, propulsion, computing and navigation equipment as well as expansion ports for additional sensors and appendages, such as cameras and wireless power transfer systems. This is where the SPHERES’ smartphone upgrades are attached.

By connecting a smartphone, the SPHERES become Smart SPHERES. They are now more intelligent because they have built-in cameras to take pictures and video, sensors to help conduct inspections, powerful computing units to make calculations, and Wi-Fi connections to transfer data in real time to the computers aboard the space station and at mission control.

“With this latest upgrade, we believe the Smart SPHERES will be a step closer to becoming a ‘mobile assistant’ for the astronauts,” said DW Wheeler, lead engineer with SGT Inc. in the Intelligent Robotics Group at Ames. “This ability for Smart SPHERES to independently perform inventory and environmental surveys on the space station can free up time for astronauts and mission control to perform science experiments and other work.”

“The Project Tango prototype incorporates a particularly important feature for the Smart SPHERES – a 3-D sensor,” said Terry Fong, director of the Intelligent Robotics Group at Ames. “This allows the satellites to do a better job of flying around on the space station and understanding where exactly they are.”

Later this month, Ames engineers will fly the prototype phone several times aboard an airplane that simulates microgravity by flying a parabolic flight path. The team has modified Google’s motion-tracking and positioning code, which tells the phone where it is, to work in the microgravity conditions of the space station. To verify that the modified software works, they must take the phone out of the lab at Ames and test it in a true microgravity environment.

Project Tango uses a Kinect-like camera and sensors galore to instantly map its position and the world around it, thousands of times per second.

Google’s current prototype is a 5-inch Android phone containing highly customized hardware and software designed to track the full three-dimensional motion of the device as you hold it, while simultaneously creating a map of the environment. These sensors allow the phone to make over a quarter million 3D measurements every second, updating its position and orientation in real time and combining that data into a single 3D model of the space around you.
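The core geometric step behind building such a model can be sketched simply: each depth measurement is back-projected through the camera intrinsics into a 3D point, then transformed by the tracked device pose into a shared world frame, where the points accumulate into a map. The sketch below is a minimal illustration of that pipeline, not Tango’s actual implementation; the intrinsics and pose values are assumed.

```python
import numpy as np

# Minimal sketch (not Tango's actual pipeline): back-projecting depth pixels
# through a tracked camera pose to accumulate a world-frame point cloud.

fx = fy = 500.0        # focal lengths in pixels (assumed)
cx, cy = 320.0, 240.0  # principal point for a 640x480 sensor (assumed)

def backproject(u, v, depth):
    """Pixel (u, v) with depth in metres -> 3D point in the camera frame."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

def to_world(point_cam, rotation, translation):
    """Apply the tracked device pose to move a camera-frame point into the world frame."""
    return rotation @ point_cam + translation

# Example pose: device rotated 90 degrees about the vertical axis,
# translated 1 m along the world x-axis.
R = np.array([[0.0, 0.0, 1.0],
              [0.0, 1.0, 0.0],
              [-1.0, 0.0, 0.0]])
t = np.array([1.0, 0.0, 0.0])

p_cam = backproject(320, 240, 2.0)    # centre pixel seen 2 m away -> (0, 0, 2)
p_world = to_world(p_cam, R, t)       # that point expressed in world coordinates
```

Doing this for hundreds of thousands of measurements per second, while simultaneously estimating the pose itself, is what makes the phone’s tracking hardware demanding.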
