If you go to the documentation section, you can find a list of all the poses that need to be done for the best result. To speed up the tutorial, we are going to open a ZBrush file where we have already created all the blendshapes. You can see the result of a blendshape by setting its value to 1. When we are done with it, go to the ZPlugin menu – Maya Blend Shapes – and press Export Blend Shapes.

Now Maya is open, and in the Outliner we can find the head with the blendshapes already connected. If we select the head's geometry and go to the Channel Box, there is a list of them under the blendshape node. You can also create blendshapes in Maya or import them from any other 3D application, such as 3ds Max, Blender, or others.

Now we are ready to connect the MocapX data to the blendshapes. First, we need to create a real-time device. If we go to the Attribute Editor, we can choose either Wi-Fi or USB. If you have the MocapX app running on your iPhone or iPad Pro, just click Connect, and Maya will connect to the app.

Next, we need to open the Connection Editor. On the left side, we load the Real-Time device, and on the right side, we select and load our blendshape node. Now we simply connect the attributes one by one, by name. You can speed things up with the script we included in the description below. Now we are live streaming data to our character.

Next we will connect the translate and rotate attributes from the Real-Time device to our head: select the head geometry and load it on the right side. The last step is to add the eye movement. Load the eye geometry on the right side and connect the rotation from the Real-Time device.

To record, go to the Real-Time device, select a 15-second clip, and hit Record. By doing this, we record the past 15 seconds to the clip. If we go to the time slider, we can now scrub and see our clip.
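Connecting dozens of blendshape attributes one by one in the Connection Editor is tedious, which is why a connect-by-name script helps. We have not reproduced the actual MocapX script here; the sketch below only illustrates the name-matching idea in plain Python. The function and attribute names are made up for the example; inside Maya, each matched pair would be wired up with `maya.cmds.connectAttr`.

```python
# Hypothetical sketch of a "connect by name" helper. This is NOT the
# script from the description; it only shows the pairing logic.

def match_attributes_by_name(source_attrs, dest_attrs):
    """Return (source, destination) pairs whose attribute names match."""
    dest_set = set(dest_attrs)
    return [(name, name) for name in source_attrs if name in dest_set]

# Example channel/target names (invented for illustration).
device_channels = ["jawOpen", "eyeBlinkLeft", "eyeBlinkRight", "headRoll"]
blendshape_targets = ["jawOpen", "eyeBlinkLeft", "eyeBlinkRight", "mouthSmile"]

pairs = match_attributes_by_name(device_channels, blendshape_targets)
for src, dst in pairs:
    # Inside Maya, this loop would instead call something like:
    #   cmds.connectAttr("mocapxDevice.%s" % src, "blendShape1.%s" % dst)
    print("%s -> %s" % (src, dst))
```

Only the channels that exist on both sides get connected; anything without a matching name (here `headRoll` and `mouthSmile`) is skipped, which mirrors what connecting "one by one by name" does manually.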