A proof of concept we developed for a tractor manufacturer. We used motion capture to record an expert assembling a coffee table and generated digital training material from the capture.
The pipeline used Shadow live streaming for preview, Python for data post-processing, and Unreal Engine for visualization.
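A minimal sketch of the kind of Python post-processing involved, assuming the take was exported as a CSV of timestamps plus per-joint rotation channels. The file name, column layout, and smoothing window here are hypothetical, not the actual Shadow export format or the production script.

```python
import numpy as np

def smooth_take(path="take.csv", window=5):
    """Load a capture and smooth every joint channel with a moving average."""
    # Column 0 is assumed to be a timestamp; the rest are joint channels.
    data = np.loadtxt(path, delimiter=",", skiprows=1)
    time, channels = data[:, 0], data[:, 1:]

    # Simple moving-average filter to knock down sensor jitter before the
    # animation goes into Unreal Engine.
    kernel = np.ones(window) / window
    smoothed = np.column_stack(
        [np.convolve(c, kernel, mode="same") for c in channels.T]
    )
    return time, smoothed
```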
The high-quality, easy-to-use, professional mocap suit: 17 inertial sensors embedded in a wearable sensor network.
A kinematic model simulates your body motion and directly outputs a character animation. Use it for live performance, or record clips for animation and motion analysis.
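As a rough illustration of what the kinematic model does, the sketch below composes per-joint rotations down a two-joint chain into world-space positions that drive a character. The chain, offsets, and angles are made up for the example; this is not the actual Shadow skeleton or solver.

```python
import numpy as np

def rot_z(deg):
    """3x3 rotation about the Z axis."""
    r = np.radians(deg)
    c, s = np.cos(r), np.sin(r)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def forward_kinematics(local_rotations, offsets):
    """Walk a single joint chain, accumulating rotation and translation."""
    world_rot = np.eye(3)
    world_pos = np.zeros(3)
    positions = []
    for R, offset in zip(local_rotations, offsets):
        world_rot = world_rot @ R
        world_pos = world_pos + world_rot @ offset
        positions.append(world_pos.copy())
    return positions

# Example: a shoulder bent 30 degrees and an elbow bent 45 degrees.
print(forward_kinematics(
    [rot_z(30.0), rot_z(45.0)],
    [np.array([1.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])],
))
```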
Miniature inertial measurement unit (IMU). A 3D rotation sensor with USB connectivity.
**These projects are from college circa 2003, so please bear with me.**
Character animations for movies and 3D games are generally created using linear blend skinning, an algorithm that animates a mesh of vertices based on a sequence of skeletal poses.
I worked on a research project with UW-Graphics to find easier and more intuitive ways to edit the blend weights of a mesh. The weights associate each vertex in the mesh with a set of joints in the skeleton.
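For context, here is a minimal numpy sketch of linear blend skinning itself: each deformed vertex is the weighted sum of that vertex transformed by every joint's skinning matrix. The array names and shapes are illustrative, not taken from the project code.

```python
import numpy as np

def linear_blend_skinning(vertices, weights, joint_matrices):
    """
    vertices:       (V, 3) rest-pose positions
    weights:        (V, J) blend weights, each row sums to 1
    joint_matrices: (J, 4, 4) joint transforms (current pose * inverse bind)
    returns:        (V, 3) deformed positions
    """
    # Homogeneous coordinates so the 4x4 matrices can translate as well as rotate.
    V = len(vertices)
    homogeneous = np.hstack([vertices, np.ones((V, 1))])        # (V, 4)

    # Transform every vertex by every joint: (J, V, 4).
    per_joint = np.einsum("jab,vb->jva", joint_matrices, homogeneous)

    # Blend the per-joint results with the per-vertex weights: (V, 4).
    blended = np.einsum("vj,jva->va", weights, per_joint)
    return blended[:, :3]
```

The (V, J) `weights` array in this sketch is the data the project looked at editing more intuitively.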
Ambient occlusion is a rendering technique that approximates global illumination, which is too time-consuming to compute directly. In this project, we used ray casting to generate a new set of normals in a pre-processing step. Those normals can then be used to render the scene at real-time frame rates with this approximation of global illumination.
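A rough sketch of that pre-processing pass, assuming we have per-vertex positions and normals plus some ray-versus-scene intersection test. The `intersects_scene` callback and the sample count are hypothetical stand-ins for whatever the original project used.

```python
import numpy as np

def sample_hemisphere(normal, count, rng):
    """Uniform random directions on the hemisphere around `normal`."""
    dirs = rng.normal(size=(count, 3))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    # Flip any sample that points into the surface.
    dirs[dirs @ normal < 0.0] *= -1.0
    return dirs

def bake_occlusion_normals(positions, normals, intersects_scene, samples=64):
    """For each vertex, average the unoccluded ray directions into a new normal."""
    rng = np.random.default_rng(0)
    baked = []
    for p, n in zip(positions, normals):
        dirs = sample_hemisphere(n, samples, rng)
        # Keep only the directions whose rays escape the scene.
        open_dirs = [d for d in dirs if not intersects_scene(p + 1e-4 * n, d)]
        if open_dirs:
            new_normal = np.mean(open_dirs, axis=0)
            new_normal /= np.linalg.norm(new_normal)
        else:
            new_normal = n
        baked.append(new_normal)
    return np.array(baked)
```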
Wavelet Analysis of Musical Signals
Wavelets are pretty cool. They can be used to analyze the frequency composition of a signal while also showing how that composition changes over time.
In this report, I decomposed some musical signals into their frequency components. The report includes plots of the original signal, its frequency spectrum (Fourier transform), and the time-frequency breakdown.
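As a small illustration of that kind of decomposition, the sketch below computes a Fourier spectrum and a Morlet-style continuous wavelet transform for the time-frequency view, assuming a mono signal array and its sample rate. The wavelet parameters and test tone are illustrative, not from the original report.

```python
import numpy as np

def fourier_spectrum(signal, sample_rate):
    """Magnitude spectrum of the whole signal (frequency content, no timing)."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return freqs, spectrum

def morlet_cwt(signal, sample_rate, freqs, cycles=6.0):
    """Time-frequency breakdown: convolve with a Morlet wavelet per frequency."""
    rows = []
    for f in freqs:
        sigma = cycles / (2.0 * np.pi * f)            # envelope width in seconds
        tau = np.arange(-4 * sigma, 4 * sigma, 1.0 / sample_rate)
        wavelet = np.exp(2j * np.pi * f * tau) * np.exp(-tau**2 / (2 * sigma**2))
        wavelet /= np.sum(np.abs(wavelet))
        rows.append(np.abs(np.convolve(signal, wavelet, mode="same")))
    return np.array(rows)                              # shape (len(freqs), len(signal))

# Example: a 440 Hz tone that jumps to 880 Hz halfway through.
rate = 8000
t = np.arange(0.0, 2.0, 1.0 / rate)
tone = np.where(t < 1.0, np.sin(2 * np.pi * 440 * t), np.sin(2 * np.pi * 880 * t))
scalogram = morlet_cwt(tone, rate, freqs=np.linspace(200.0, 1000.0, 40))
```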
Email me: luketokheim@gmail.com
Find me on GitHub: https://github.com/luketokheim