After playing with some of the demo programs, I noticed there was a Processing library on the Leap developer site (can be found here). The control program for my hexapod is already written in Processing, so adding the Leap was relatively easy.
All I have done currently is map the x, y, z, yaw, pitch & roll of the user's hand onto the body of the hexapod. Smoothing and inverse kinematics are then applied, and the result is sent to the hexapod over TCP/IP. A Raspberry Pi on the hexapod receives this with a Python script and outputs the values to the servos. Hand tracking with the Leap is enabled or disabled by a key-tap gesture with any finger.
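The mapping-plus-smoothing step could look something like the sketch below. This is not the actual project code: the axis names, filter constant, and 1:1 mapping are illustrative assumptions, and a real version would read the hand pose from the Leap library each frame and pass the smoothed pose into the inverse kinematics before sending it over TCP/IP.

```python
# Illustrative sketch of mapping a tracked hand pose onto the hexapod body
# with a simple exponential (low-pass) smoothing filter per axis.
# All names and values here are hypothetical, not from the original project.

def smooth_pose(prev, target, alpha=0.2):
    """Move each axis a fraction `alpha` of the way toward the target:
    new = prev + alpha * (target - prev)."""
    return {axis: prev[axis] + alpha * (target[axis] - prev[axis])
            for axis in prev}

# Hypothetical hand pose from the Leap (positions in mm, angles in degrees),
# mapped one-to-one onto the hexapod body pose.
hand = {"x": 10.0, "y": 120.0, "z": -5.0,
        "yaw": 4.0, "pitch": -2.0, "roll": 1.0}

body = {axis: 0.0 for axis in hand}  # body starts at a neutral pose
for _ in range(50):                  # repeated frames converge on the hand pose
    body = smooth_pose(body, hand)
```

With `alpha = 0.2` the body pose closes about 20% of the remaining gap each frame, which trades a little lag for much less jitter from the raw hand tracking.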