We had a really good post mortem with the faculty. It's interesting what you learn when you review the last few months. We came a long way with the project; I think we could have done better, but we did well.
The Most Awesome Prototyping Class Ever
Tuesday, December 15, 2015
Tuesday, October 6, 2015
We now have all of our inputs working in the same build...
And we are on Oculus...
Tone of voice, hand motion, and distance all play into the NPC's anger level.
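A minimal sketch of how those three inputs could feed one anger level. The function name, weights, and thresholds here are illustrative assumptions, not the project's actual values:

```python
# Hypothetical sketch: combining the three inputs into one anger level.
# Names, weights, and thresholds are assumptions for illustration only.

def update_anger(anger, tone_stress, hand_speed, distance_m, dt):
    """Raise or lower NPC anger based on the officer's behavior this frame.

    tone_stress: 0..1 from voice analysis (0 = calm, 1 = shouting)
    hand_speed:  controller motion in meters/second
    distance_m:  distance between officer and NPC in meters
    dt:          seconds since the last update
    """
    pressure = 0.0
    pressure += tone_stress * 0.6                  # loud, harsh voice escalates
    pressure += min(hand_speed / 2.0, 1.0) * 0.3   # fast gestures escalate
    if distance_m < 1.0:                           # inside personal space
        pressure += 0.4
    # Anger drifts toward the current pressure level over time.
    anger += (pressure - anger) * min(dt * 0.5, 1.0)
    return max(0.0, min(anger, 1.0))
```

Smoothing toward a "pressure" target rather than snapping keeps the NPC's mood from flickering frame to frame.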
Thursday, October 1, 2015
Saturday, September 19, 2015
Here is a rough version of our level.
This is our level. It matches the company's stage that we are working with. The simulation starts in the bottom middle and works around to the character by the car. We did this for the element of surprise, and so that the officer has to walk over to the person to initiate conversation with him. He will be kneeling down, facing away from you and grumbling. When you get close, he stands up in your face and the decision tree starts.
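The opening beats described above can be sketched as a tiny state machine. The state names and trigger distance are assumptions for illustration, not the project's actual values:

```python
# Hypothetical sketch of the scripted opening of the simulation.
# State names and the trigger distance are illustrative assumptions.

KNEELING, STANDING, DIALOGUE = "kneeling", "standing", "dialogue"

def step_encounter(state, officer_distance_m, trigger_m=1.5):
    """Advance the NPC through the opening of the encounter."""
    if state == KNEELING and officer_distance_m <= trigger_m:
        return STANDING      # NPC stands up in the officer's face
    if state == STANDING:
        return DIALOGUE      # the decision tree takes over from here
    return state
```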
New Character
In order to meet our need for a character that can simultaneously support:
-Code-driven animation cycles that are baked from motion capture or hand animation.
-Code-driven phoneme animation that is driven by recorded audio files.
-Code-driven expressions that fire depending on his anger level.
In order to have this level of control over the character, we need to bake the body animations and leave the anger morph targets free so they can be driven by code.
I was not willing to put in the time to test the existing products, mainly because I can build one that fits our needs and control how it is exported from Maya.
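One way to drive the anger expressions from code while the body animation stays baked is to blend morph target weights from the current anger level. The expression names, thresholds, and ramp width below are assumptions, not the actual rig:

```python
# Hypothetical sketch: mapping the code-driven anger level (0..1) to
# expression morph target weights. Names and thresholds are assumptions.

EXPRESSIONS = [
    ("annoyed", 0.3),   # starts ramping in once anger passes 0.3
    ("angry",   0.6),
    ("furious", 0.9),
]

def expression_weights(anger):
    """Compute a morph target weight (0..1) for each anger expression."""
    weights = {}
    for name, threshold in EXPRESSIONS:
        # Ramp each expression in over the 0.3 band above its threshold.
        w = (anger - threshold) / 0.3
        weights[name] = max(0.0, min(w, 1.0))
    return weights
```

Blending over a band instead of snapping at the threshold lets expressions layer smoothly as the anger level rises.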
This is our low-poly model, based on my face scan. I'm also showing a couple of the expressions we are using.