NASA’s R5 – or Valkyrie – was designed and built by the Johnson Space Center (JSC) Engineering Directorate to compete in the 2013 DARPA Robotics Challenge (DRC) Trials. The humanoid robot was designed to be rugged and capable of operating in degraded environments, such as those it would encounter in space.
Valkyrie is different from earlier NASA robots like Curiosity, Opportunity, and Spirit, which relied on wheels to move around while collecting information. Replicating a human’s ability to walk on an even surface is hard enough for a robot; walking on rough, uneven terrain has been the goal with Valkyrie.
To this end, Valkyrie’s creators partnered with the Florida Institute for Human and Machine Cognition (IHMC) to implement IHMC’s walking algorithms on NASA hardware. The finished product looks more like Iron Man: Valkyrie weighs 300 pounds and stands six feet, two inches tall.
All about Valkyrie
The semi-autonomous humanoid robot can run from wall power or from its onboard battery. Valkyrie’s head houses a sensor suite whose main perceptual sensor is the Carnegie Robotics MultiSense SL. There are also “hazard cameras” on the front and back of the torso.
Each arm has seven joints: the upper arm contains four series elastic rotary actuators, and rotary actuators in the forearm turn the wrist. Valkyrie’s simplified humanoid hands have three fingers and a thumb.
The pelvis houses three series elastic rotary actuators: the waist rotation joint and the hip rotation joint of each leg. The pelvis is considered the robot’s base frame and includes two IMUs. Each upper leg contains five series elastic rotary actuators, and each ankle is driven by two series elastic linear actuators working in concert.
Inside the robot’s infrared-transparent faceplate, you will find a whirring LIDAR sensor that constantly scans the surroundings for objects and obstacles. The robot’s dual brain consists of two Intel Core i7 computers, which make sense of the sensors’ input, according to Mashable.
Getting a robot to walk
At the DARPA challenge in 2013, Boston Dynamics’ Atlas robot could walk, but its steps had to be determined by a human operator and entered through a user interface. Since then, technology and AI have come a long way in improving bipedal robots’ agility.
New software from IHMC surveys the environment using the robot’s sensors and segments it into sections. Each section is interpreted as a series of polygons, allowing the robot to “see” a model of its environment. This way, the robot can plan its steps from a starting point to its goal.
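The polygon-based world model described above can be sketched in a few lines of Python: walkable regions are stored as 2D polygons, and a candidate foothold counts as valid only if it lies inside one of them. The region data and the ray-casting test below are purely illustrative assumptions, not IHMC's actual data structures or algorithm.

```python
# Sketch: a planar-region world model as 2D polygons (illustrative only).

def point_in_polygon(point, polygon):
    """Ray-casting test: is (x, y) inside the polygon (list of vertices)?"""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray cast rightward from the point.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def valid_foothold(point, regions):
    """A foothold is valid if it lies inside any walkable planar region."""
    return any(point_in_polygon(point, region) for region in regions)

# Two hypothetical planar regions: a floor tile and a stepping stone.
regions = [
    [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)],  # floor tile
    [(1.3, 0.3), (1.7, 0.3), (1.7, 0.7), (1.3, 0.7)],  # stepping stone
]
print(valid_foothold((0.5, 0.5), regions))  # True: on the floor tile
print(valid_foothold((1.1, 0.5), regions))  # False: in the gap between them
```

A real planner works in 3D with height and surface-normal data, but the same idea applies: the robot only considers footholds that fall inside a detected planar region.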
Valkyrie can now plan how to reach its goal a few steps at a time, allowing it to adjust quickly to the terrain. Valkyrie and other bipedal robots will now be able to walk over flat ground, stepping stones, stairs, and piles of cinder blocks.
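Planning "a few steps at a time" can be sketched as a receding-horizon loop: plan a short window of footsteps toward the goal, execute only the first one, then replan from the new stance. The step length, window size, and greedy step selection below are assumptions for illustration, not IHMC's actual planner.

```python
import math

MAX_STEP = 0.4  # assumed maximum step length, in meters

def plan_window(stance, goal, is_safe, window=3):
    """Greedily plan up to `window` footsteps from `stance` toward `goal`."""
    steps = []
    x, y = stance
    for _ in range(window):
        dx, dy = goal[0] - x, goal[1] - y
        dist = math.hypot(dx, dy)
        if dist < 1e-9:
            break  # already at the goal
        # Step toward the goal, capped at the maximum step length.
        scale = min(MAX_STEP, dist) / dist
        candidate = (x + dx * scale, y + dy * scale)
        if not is_safe(candidate):
            break  # stop planning when no safe foothold is found
        steps.append(candidate)
        x, y = candidate
    return steps

def walk(start, goal, is_safe, max_steps=50):
    """Replan after every executed step until the goal is reached."""
    pos = start
    path = [pos]
    for _ in range(max_steps):
        window = plan_window(pos, goal, is_safe)
        if not window:
            break
        pos = window[0]  # execute only the first planned step, then replan
        path.append(pos)
        if math.hypot(goal[0] - pos[0], goal[1] - pos[1]) < 1e-9:
            break
    return path

# Demo on flat, obstacle-free ground: every foothold is safe.
path = walk((0.0, 0.0), (1.0, 0.0), is_safe=lambda p: True)
print(path)  # a short sequence of footsteps ending at the goal
```

Because only the first step of each window is committed, the robot can react to newly sensed terrain at every step, which is the practical payoff of planning a few steps at a time.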