P12201: Humanoid Platform



This section outlines guidelines for the continuation of the humanoid platform.


SolidWorks Design

The best practice when designing sheet metal parts in SolidWorks is to use the Sheet Metal design toolbox. These features (most importantly Base Flange and Edge Flange) build bend deduction compensation into the flat pattern of your sheet metal part, which is critical for making accurate bends in any sheet metal part.

Computer Aided Manufacturing

We used MasterCam X5 to generate the G-code required by the CNC machines. The flat pattern configurations of our sheet metal parts were imported from SolidWorks. Often the parts needed their planes reoriented so that the flat face pointed upward; this was done by changing the default WCS in the View Manager. Next, the stock was set up under Operations Manager > Toolpaths > Machine Group-1 > Properties > Stock Setup. It is also a good idea to move the origin (i.e., where the CNC will start its path) to the desired location; this is done in the Xform menu.

Finally, it is time to create toolpaths. First select the machine type (default mill), then add a new contour path. This is fairly straightforward: after selecting Contour, select the edges you would like milled and hit the check mark when finished. Note that sometimes you cannot see what you are selecting due to graphics issues; this is the case on the computers in the Advanced Systems Integration Lab (IE department). Also remember to cut interior holes before the perimeter of the part; this can be done with two separate contour toolpath operations (drag to reorder them in the toolpath list if necessary).

It is critical to set up the parameters after selecting the edges to be milled. First select your tool size (e.g., a 1/8" flat endmill), then set your feed rate. We generally used a feed rate of 1 ± 0.25 fpm for our 0.08"-thick aluminum because the spindle speed was limited to 2000 rpm. Rapid Retract is a nice option to select here for quicker machining. Other important parameter pages are Lead In/Out (we disabled this) and Linking Parameters (set retract distances, zero the top of the material, and set the plunge depth using a negative number). After you have made paths for all contours needing cutting and ordered them appropriately, export the operations to G-code by selecting the G1 button atop the toolpath list.
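As a sanity check on feed and spindle settings, the chip load (material removed per cutting edge per revolution) can be computed from feed rate, spindle speed, and flute count. The sketch below is illustrative only: it assumes the feed is expressed in inches per minute and that a two-flute cutter is used, neither of which is stated above.

```python
def chip_load(feed_ipm: float, rpm: float, flutes: int) -> float:
    """Chip load in inches per tooth = feed / (spindle speed * flute count)."""
    return feed_ipm / (rpm * flutes)

# With the spindle capped at 2000 rpm and an assumed feed of 1 in/min,
# a two-flute end mill sees a very light chip load:
print(chip_load(1.0, 2000, 2))  # 0.00025 in/tooth
```

If the feed rate is raised, this ratio shows how far the chip load grows before exceeding the cutter manufacturer's recommendation.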

CNC Milling

The key is to remember how you set up your model in MasterCam. The sheet metal was clamped on top of a piece of particle board, and the particle board was fixed to the ways of the mini CNC mill in the Multi-Agent Robotics Lab (located in the Erdle Commons). After the stock was clamped firmly, the Mach 3 software was booted up on the PC connected to the CNC controller via parallel port. This program requires administrative access; appropriate permissions will have to be set. The taig2000 profile was used, and the G-code generated by MasterCam (file extension *.nc) was loaded in.

Next, the controller box connected to the mill was switched on and the Reset button was pressed in the Mach 3 software. The arrow keys and the Page Up/Down keys should now control the movement of the mill; use them to move the mill to the origin point chosen in the MasterCam setup. Once you have found a suitable location on the stock for the origin, zero the X and Y coordinates. A good practice for zeroing the Z axis without damaging the material is to lay a sheet of paper on top of the stock and lower the mill until you can no longer move the paper without tearing it; zero the Z axis at this position. If you are unsure of where to zero the XYZ coordinates, zero the Z coordinate high above the material and do a test run. Alternatively, pick a test location for the X and Y coordinates, zero them, and move to the edges of the material shown in the preview window while ensuring your path does not go off the edge of the stock or ram into the clamping devices.

When you are ready to go, flip the switch on the top right side of the mill to power the spindle and hit Run in the Mach 3 software. Make sure to periodically apply oil or coolant to the bit as it is cutting.

Sheet Metal Bending

The thing to know about sheet metal brakes (or benders) is that when bending, the material will not stretch equally on both sides of the bend line. The side clamped in the vise will hardly stretch at all, while the other side will stretch noticeably. The only way to find the sweet spot for the bend line is to experiment or to use more sophisticated equipment (such as a hydraulic brake with guides, dies, and ruled stops). Also note that holes located too near a bend line will become distorted.
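The uneven stretching described above is exactly what bend deduction compensates for: the flat pattern must account for the arc length of the neutral axis through each bend. A minimal sketch of that bend allowance calculation follows; the K-factor of 0.44 is a commonly used default for air-bent aluminum, not a value measured on our material, and should be tuned experimentally as described above.

```python
import math

def bend_allowance(angle_deg: float, inside_radius: float,
                   thickness: float, k_factor: float = 0.44) -> float:
    """Arc length of the neutral axis through a bend.

    k_factor locates the neutral axis as a fraction of the material
    thickness measured from the inside face. 0.44 is a common default
    for air-bent aluminum (an assumption here, not a measured value).
    """
    return math.radians(angle_deg) * (inside_radius + k_factor * thickness)

# 90-degree bend in 0.08"-thick aluminum with a 0.1" inside radius:
print(round(bend_allowance(90, 0.1, 0.08), 4))  # 0.2124 in
```

This is the same arithmetic SolidWorks performs internally when Base Flange and Edge Flange generate the flat pattern.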


Assembly

For an easy, robust assembly, it is important to use only one or two types of bolts. We chose M2 and M3 screws with standard nuts and nylon-insert lock nuts. Lock nuts are critical pieces of hardware: they allow the robot to move without vibrating the nuts loose. Once the design is finalized, rivets would be a viable option where applicable.

Various bearings were used in our assembly. Ball bearing hubs from Lynxmotion were used to attach components that needed to be rigidly attached but still rotate independently with minimal resistance (like the front of the hips and ankles). Flanged ball bearings were used where servos needed a rotation axis on both sides for support (like in the knees). One recommendation here is to use enough washers between the bolt and the servo to make this a very rigid connection.

The general order of operations for assembly is as follows: start with a servo bracket, attach the fully assembled ball bearing hub (if necessary), then attach the servo to the bracket's four mounting holes using lock nuts. If this servo/bracket/hub subassembly is going inside a U-bracket, make sure to attach the other servos to the U-bracket first and the subassembly second.

Future Mechanical Expansion

The number one thing that comes to mind when considering future alterations or expansions to the Tigerbot is the support of the servo horns on some joints, such as the top of the hips and the shoulder joints. There is a large amount of mechanical play at those locations due to the way the servo horns attach to the servos; they were not designed to handle tangential loads. A simple thrust bearing similar to the ones used in the hips and ankles (but a much wider version to allow for the servo on the inside) attached to the body and the U-bracket would eliminate the problem. Another major improvement would be to replace all of the plastic servo horns with metal ones.

The links and brackets that the Tigerbot is composed of are very robust. They were designed this way to ensure that the robot's linkages are sturdy, but this has the side effect of additional weight. A potential future expansion of this project is to design lighter, less bulky joints that still provide the rigidity needed to balance.

Another future expansion would be to upgrade as many of the servos as possible to the stronger, more robust, high-torque HSR-5990TG servos. Due to budgetary constraints, two different servo strengths were used. For walking and balancing, any servo can require full torque at any given time, so using all or mostly HSR-5990TG servos (at least in the legs) would be helpful. Essentially, anything that lowers the center of gravity, lowers the overall weight, or increases the stability of the robot would be an improvement and a potential future expansion.


Expanding Movement / Lookup Tables:

The ARC32's final programming is equipped with many functions, chief of which is controlling the servos. Movements are performed with successive calls to HSERVO, with pauses and waits in between to give the movement a fluid flow. This method ties up the processor; the only thing it can respond to during a movement is interrupts. A better-constructed program that does not hold up the system should be built. This would allow the IMU, once working, to be sampled during movement.

In a previous version of the program, a queue was used to send values to the servos. Each movement function would load values into the queue, and a timer would send the values to a single HSERVO command. This may be the only way to have a responsive system that does not hold up the processor with pauses and waits; we suggest looking into this approach further.
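The queue-and-timer architecture described above could be sketched as follows (in Python rather than the ARC32's Basic, purely to illustrate the structure; the channel numbers, pulse widths, and the `send_command` stand-in for HSERVO are all hypothetical):

```python
from collections import deque

class ServoQueue:
    """Non-blocking movement queue: movement functions enqueue
    (channel, position, ms) tuples, and a periodic timer tick
    dequeues one entry and issues the single low-level servo
    command, so the main loop is never stalled by pause/wait calls."""

    def __init__(self, send_command):
        self.queue = deque()
        self.send = send_command  # stand-in for the HSERVO call

    def enqueue_move(self, channel: int, position: int, ms: int) -> None:
        self.queue.append((channel, position, ms))

    def tick(self) -> bool:
        """Called from a timer interrupt; returns False when idle."""
        if not self.queue:
            return False
        self.send(*self.queue.popleft())
        return True

# Demonstration with hypothetical channel/pulse values:
sent = []
q = ServoQueue(lambda ch, pos, ms: sent.append((ch, pos, ms)))
q.enqueue_move(0, 1500, 500)
q.enqueue_move(1, 1320, 500)
while q.tick():
    pass
print(sent)  # [(0, 1500, 500), (1, 1320, 500)]
```

Because the processor only does work inside `tick()`, it remains free between timer interrupts to service the IMU or other sensors.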

In addition to improving the program's architecture, the movement commands need to be finished. Currently, wave, punch, and high-five are the only movement commands besides the home position, which stands the robot on its feet. Note that this position must be tuned with the batteries in the body because of the change in weight.

Movement testing:

Finding the angles required for a desired motion can be tricky to calculate mathematically. Thus, a program was developed for working out new command sets; it also provides a manual command that moves individual servos. The desired movements are all found and tested at different speeds and times in order to create smooth, seamless motion. The timings that move the servo values can be adjusted so that all the joints move at the same time, mimicking human motion. As a rule of thumb, joints closer to the body are typically moved more slowly than the outer joints. For linear movement, the robot needs to find the middle ground between the joint timings and angles; this is easily calculated by making each servo's movement angle directly proportional to its move time.
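That proportionality rule can be sketched directly: given each joint's angular travel and a common move time, the per-joint speeds follow, so every joint starts and finishes together. The joint names and angles below are illustrative, not values from an actual gait.

```python
def joint_speeds(deltas_deg, move_time_s):
    """Per-joint speeds (deg/s) so every joint completes its travel in
    the same move_time_s, making speed proportional to angle and
    letting all joints arrive simultaneously."""
    return [abs(d) / move_time_s for d in deltas_deg]

# Example: hip, knee, and ankle travel 30, 60, and 15 degrees
# during one 2-second step:
print(joint_speeds([30, 60, 15], 2.0))  # [15.0, 30.0, 7.5]
```

The knee, with the largest travel, moves fastest; the ankle, with the smallest, moves slowest, which matches the rule of thumb above.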

As part of the expandability theme of the project, a MATLAB program was also developed that calculates the servo values from the servo offset, the home position in angles, and the differential change required at the joint in centimeters.
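A rough Python analogue of that MATLAB calculation might look like the sketch below. The pulse-width scale, the arcsine conversion from centimeters to degrees, and the link length are all assumptions for illustration, not values taken from the actual program.

```python
import math

def servo_value(offset_us: float, home_deg: float, delta_cm: float,
                link_cm: float, us_per_deg: float = 10.0) -> float:
    """Illustrative servo-value calculation (all constants assumed):
    converts a desired end-point shift in cm to an angle via the link
    length, adds it to the home angle, and maps the result to a pulse
    width relative to the servo's offset."""
    delta_deg = math.degrees(math.asin(delta_cm / link_cm))
    return offset_us + (home_deg + delta_deg) * us_per_deg

# A 2 cm shift at the end of a hypothetical 10 cm link:
print(round(servo_value(1500, 0.0, 2.0, 10.0), 1))  # 1615.4
```

The real program would substitute the measured link lengths and each servo's calibrated offset and scale.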


Balance:

Several factors affected balance. One factor was the mechanical play in the low-torque servos. To fix this, we suggest adding a bearing to the servo's axle and then attaching the servo horn to it.

Another factor was the lack of torque in the roll of the ankle. To fix this, we suggested moving high-torque servos from the arms to the ankles.

Finally, a few bolts in the legs continued to loosen. To fix this, lock nuts could be placed on all bolts to ensure they do not come loose from movement.

Integration of IMU:

The IMU was originally planned to be used with the ARC32 over I2C. BasicMicro provided I2C commands, and these were used to interface with the IMU initially. During integration, it was noticed that these commands caused the servos to jerk. The direct cause is currently unknown; however, BasicMicro did say that there is I2C hardware present on the ARC32 that could be used to continue interfacing with the IMU. Two other suggestions: move the IMU to the Arduino, since the IMU is not needed to directly balance the robot, or buy an IMU that comes with its own microcontroller. Such an IMU might be able to connect directly over SPI or could go through another microcontroller.


IR and Touch Sensors:

There are plans to implement IR sensors and touch sensors for object detection. The Arduino will be used to process this information and inform the Android phone of objects being approached. To complete this process, the IR sensors need to be mounted to the legs, arms, and chest. The leg mounting holes are already present; however, screws need to be purchased. The chest and arm mounts still need to be created.

After mounting is completed, code will need to be written to detect objects. During the detailed design review, an IR sensor was mapped to a distance function; this process can be repeated to find the distance functions for the new IR sensors. The touch sensors are hooked up and ready to be programmed.
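The resulting distance function could take a form like the sketch below. Analog IR rangefinders are often well modeled by an inverse fit; the constants here are placeholders, not measured values, and must be refit per sensor by recording readings at known distances, just as was done for the original sensor.

```python
def ir_distance_cm(adc: int, k: float = 4800.0, c: float = 20.0) -> float:
    """Map a raw ADC reading to centimeters with an inverse fit,
    distance = k / (adc - c). Both k and c are placeholder constants:
    refit them for each sensor from readings at known distances."""
    return k / (adc - c)

# With the placeholder fit, a reading of 500 maps to 10 cm:
print(round(ir_distance_cm(500), 2))  # 10.0
```

Once fitted, the Arduino can evaluate the same expression and report the distance to the Android phone.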

Arduino-EasyVR Voice Command Training/Re-training:

The current voice recognition code uses the EasyVR's built-in word sets and its built-in training for those word sets. During bench testing of the EasyVR's recognition accuracy on its built-in word sets, the module was able to accurately recognize spoken commands. However, after integrating the voice module into the robot and powering the robot up, it was found that the high-pitched whine coming from the servos under load caused the EasyVR to barely or not at all recognize the programmed commands. This is most likely because the training for the built-in words, as well as our bench testing, was done in a sound stage or quiet room.

A solution to this problem was proposed but has not yet been implemented. Because the built-in word commands cannot be retrained, the solution is to program custom commands and then allow retraining of those custom commands under various sound environments and conditions. A command called "train" should be implemented on both the WiFi and voice recognition sides. When this command is received, the intelligence system should prompt the user to speak each command, starting from the first programmed custom command and incrementing forward by one each time. After all the custom commands are retrained, the EasyVR module should pick up valid voice commands more easily, because the spoken commands will have been re-recorded with the particular background noises that were originally causing problems.

Arduino-EasyVR Microphone Filtering:

In addition to the above suggestion to make commands retrainable, some sort of physical filter should be designed to fit over the EasyVR's microphone that blocks out background or ambient noise but still allows a user speaking loudly to be detected. These two features working in conjunction should provide a very robust solution that greatly increases the functionality of the voice recognition module.

Arduino-EasyVR Synthesizing Custom Sound Bites:

The EasyVR module can be programmed with many different sound bites that it can play back. Currently, only the pre-programmed "beep" sound is used by the system to audibly notify the user when a spoken command was or was not recognized. We suggest utilizing this powerful feature of the EasyVR module. A sound bite can be a full sentence, so the module could be programmed to announce when the robot has fallen over on its back or front, or to prompt the user when a spoken command was not recognized, in a much clearer and more concise way.

Android - Displaying Debug Info & Stats:

When the Android device is hooked up to the IOIO, it is very difficult to get adb (Android Debug Bridge) information from the device should there be a problem with the Android code or a crash. Currently, to get around that issue, variables whose values or contents need to be examined are printed to the application's text area. This is a costly operation, as updating fields on the GUI side requires firing off events and spawning a new helper thread to handle the GUI update. In addition, the information to be displayed is tightly compacted into a short string, and the current implementation requires commenting out code in order to turn off the debug printing.

Ideally, a debug feature that can be toggled on and off from the Android application's GUI would provide the right balance between uninterrupted, smoothly running code and the ability to debug major issues without disconnecting the device from the robot intelligence system. It would work even better if this debug feature informed all slave devices that debug mode is active, so that they begin transmitting feedback information to the Android device. This feature is highly recommended: more and clearer debug messages can be displayed when needed and toggled off when system communication is working well.

Remote GUI - Executable JAR:

It would be highly desirable for the remote GUI not to depend on executing from a command-line shell. Currently, the remote UI can only be started from a command prompt in Windows or a terminal in Mac and Linux. Ideally, the remote program should be made into a GUI to avoid dependencies on native operating system tools. The remote GUI can simply be made with a field for the IP address or hostname, a port number, an input box for user commands, and a text field to display echoed information between the remote client and the server, all using Java libraries. This would eliminate the OS dependencies and rely solely on whether the computer running the remote client has the JRE (Java Runtime Environment) installed.


Wiring Improvements:

A few suggestions for improving the robot have been proposed. One chief example would be to build a connector board, either a PCB or a perf board, to replace the loose wiring. This would help connect the boards and integrate all the systems better without so many excess wires. The wiring could be cleaned up, allowing for more storage space in the chest of the robot. The servo connector wires are still required; however, all other wires can be relocated. This should also improve the connectivity between the boards and simplify troubleshooting.