Weekly Assignment – Shimin Gu

WEEK 01 – MOODBOARD & LAB DOCUMENTATION

Moodboard for Unreal Inspirations

Pinterest: https://pin.it/67rxwkom6sltry

I collected different kinds of pictures for my moodboard; some are in very dark tones, while others are really bright and warm. I also like some Japanese art styles, which feel peaceful to me, and I like their use of color. In general, I'm drawn to environments that are surreal and enormous compared to the humans or creatures living in that world. I'd like to use stylized, shader-based assets instead of photo-realistic materials.

Because I took the narrative VR class last semester and am doing some personal projects in Unreal, I realized motion capture could make the characters in my work move more naturally. I used Mixamo for character animation before. It is a nice and convenient tool, but it somewhat limits what I can do, so I feel it would be great to learn motion capture and create customized animations. I also know that big-budget VFX films and games have been using motion capture for a long time. I'm a big fan of them, so I'm curious about their motion capture workflow.

Mocap Lab Documentation

The first step before calibration is to check whether there are any reflective objects in the room; we can sometimes see stray red dots on the screen. As Izzy said, it doesn't matter as long as we mask them out and save the project. Then we can do the wanding, letting each camera collect more than 10,000 samples, and then calculate and apply the result. The Motive system feels user-friendly to me: when you see that a camera's sample count is falling behind, you can select that camera in the software and its light turns yellow as an indication. The calculation is quite fast, by the way.

IMG_1838.JPG
IMG_2860.JPG

The next step is to set the ground plane and its offset. We use the customized marker platform to do that. Its Z axis should point in the same direction the performers will face.

IMG_8717.JPG
IMG_0014.JPG

Now we can take the rigid-body markers out of the locker, and they immediately show up in Motive. Select the markers and create rigid bodies from them; renaming them is a good idea for later use. Then we can start recording. The recording process was quite fun; the four of us were doing whatever we wanted during that time. After stopping the recording, it's also a good idea to rename each take, so that when there are a lot of different takes they are easy to tell apart.
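We streamed into Unreal through the plugin (next step), but as a reference for how the rigid bodies we just defined look on the wire, here is a rough sketch using OptiTrack's NatNet C++ SDK. It is based on my reading of the SDK samples, not something we ran in the lab, and the two IP addresses are placeholders.

```cpp
// Sketch of reading streamed rigid bodies with the NatNet SDK
// (names follow the SDK sample code; details may vary by version).
#include <cstdio>
#include <NatNetClient.h>
#include <NatNetTypes.h>

// Called once per streamed frame; each rigid body defined in Motive
// arrives with its ID, position, and orientation.
void NATNET_CALLCONV OnFrame(sFrameOfMocapData* data, void* /*user*/)
{
    for (int i = 0; i < data->nRigidBodies; ++i)
    {
        const sRigidBodyData& rb = data->RigidBodies[i];
        std::printf("rigid body %d: pos(%.3f, %.3f, %.3f)\n",
                    rb.ID, rb.x, rb.y, rb.z);
    }
}

int main()
{
    NatNetClient client;

    sNatNetClientConnectParams params;
    params.serverAddress = "192.168.1.2"; // placeholder: machine running Motive
    params.localAddress  = "192.168.1.3"; // placeholder: this machine

    client.SetFrameReceivedCallback(OnFrame, &client);

    if (client.Connect(params) != ErrorCode_OK)
    {
        std::printf("could not connect to Motive\n");
        return 1;
    }

    std::getchar(); // stream until a key is pressed
    client.Disconnect();
    return 0;
}
```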

Next we want to stream the motion capture into Unreal. Open data streaming in Motive and an Unreal 4.19 project at the same time. We first dragged an OptiTrack client origin into the scene and set the IP address. We also need to make sure the bone naming convention in Motive corresponds with what Unreal expects, so we set it to Motive. Then we dragged in an OptiTrack object and a static mesh. At first it didn't work because we made the mesh the parent; instead, the OptiTrack object should be the root. We also tried adding the static mesh as a component of the OptiTrack object, and that works as well.
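To make the parenting lesson concrete, here is a minimal, hypothetical UE4 C++ actor sketch. A plain USceneComponent stands in for the plugin's tracked component; the point is simply that the tracked component is the root and the static mesh is attached under it, so the mesh inherits the streamed transform.

```cpp
// TrackedProp.h - hypothetical sketch of the hierarchy that worked for us:
// tracked component as root, visible mesh as its child.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/SceneComponent.h"
#include "Components/StaticMeshComponent.h"
#include "TrackedProp.generated.h"

UCLASS()
class ATrackedProp : public AActor
{
    GENERATED_BODY()

public:
    ATrackedProp()
    {
        // The component receiving the streamed transform must be the root.
        TrackedRoot = CreateDefaultSubobject<USceneComponent>(TEXT("TrackedRoot"));
        RootComponent = TrackedRoot;

        // The mesh rides along as a child, inheriting the mocap transform.
        Mesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("Mesh"));
        Mesh->SetupAttachment(TrackedRoot);
    }

    UPROPERTY(VisibleAnywhere)
    USceneComponent* TrackedRoot;

    UPROPERTY(VisibleAnywhere)
    UStaticMeshComponent* Mesh;
};
```

With the mesh as the parent, the streamed transform only moves the child component, so the visible geometry stays put; with the tracked component as root, the whole actor follows the mocap data.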

IMG_3261.JPG
IMG_1681.JPG

When setting a different ID for each OptiTrack object, we ran into problems because we didn't realize the numbering starts from 1 instead of 0, which is quite uncommon in programming. We also forgot to hit Play in Motive, so when we simulated in Unreal the objects didn't move. So: always hit Play, whether you are streaming recorded data or live.
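Since the off-by-one tripped us up, a tiny hypothetical helper like this makes the convention explicit whenever IDs are assigned from a zero-based loop:

```cpp
// Hypothetical helper: Motive's rigid-body IDs start at 1,
// so convert a zero-based loop index before assigning it in Unreal.
int32 ToMotiveId(int32 ZeroBasedIndex)
{
    return ZeroBasedIndex + 1; // 0 -> 1, 1 -> 2, ...
}
```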

We faced a lot of problems, but most of them were just small points you need to keep in mind. It was great that we ran into them and solved them right away. We finally made it!

mocap.gif