Facial Motion Capture


Just testing version 10 with Timeline. It seems the eyes-wide and brow movements don't import properly into Timeline. The live animation looks great for the eyes, but once in Timeline most of the movements I made are gone. Again, the live capture looks fantastic.
Thank you, I will take a look.
 
Hello. Noob here. My phone broke, so I thought I could buy something cheaper, an iPhone 7 for example, and try this plugin. But as I said, I am a noob and I have never had an iPhone or smartphone before. So I would like to know if this works on an iPhone 7, since the description says ON NEWER IOS and I have no idea what newer iOS means (do I need an iPhone 11?). Thanks for the answer :sneaky:
 
I have not tried it on anything older than an iPhone X myself. The LIVE Face app (link in the plugin description) lists the 7 as supported, but I cannot verify that with my own testing.
 
You will need an iPhone with the TrueDepth camera system; I think those are only in the iPhone X and above (y)
 
I was just testing this and had some issues with calibration. Sometimes it seems to get into a state where the head is tilted to one side or turned even when I am looking straight at the phone. The head still turns the right amount when I turn, but it looks like an extra rotation has been added, offsetting everything.

The other thing I saw is that a lot of the time when I have a neutral expression, the character's face will not be neutral (often it looks angry).

I've used the LIVE Face app with iClone and had some similar issues, but it has a "calibrate neutral expression" button that fixes them. You just look straight ahead with a neutral expression on your face (like the character has with the morphs zeroed out), click, and it's fixed. This helps fix other capture problems too: if you have a droopy eyelid, or narrow or really wide eyes, the calibration step makes up for that.

I think all this does is read the state of all the morphs coming from the phone when you click the button and then subtract them from future values. So for example, if your head is tilted slightly up when you calibrate, that tilt gets subtracted so the character's head reads as zero tilt.
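
In rough code terms, I imagine it is something like this (just my sketch of the idea, not anyone's actual implementation; all names are made up):

using System.Collections.Generic;
using UnityEngine;

// Sketch of "calibrate neutral expression": capture whatever the phone
// reports while you hold a neutral face, then subtract that baseline
// from every later frame.
public class NeutralCalibration {
    private Dictionary<string, float> _baseline = new Dictionary<string, float>();

    // Call once while the user holds a neutral expression.
    public void Calibrate(IDictionary<string, float> currentBlendshapes) {
        _baseline = new Dictionary<string, float>(currentBlendshapes);
    }

    // Apply to every incoming frame afterwards. Clamp so a large
    // baseline can't push blendshape values negative.
    public Dictionary<string, float> Apply(IDictionary<string, float> frame) {
        var adjusted = new Dictionary<string, float>();
        foreach (var kv in frame) {
            float offset;
            _baseline.TryGetValue(kv.Key, out offset);
            adjusted[kv.Key] = Mathf.Max(0f, kv.Value - offset);
        }
        return adjusted;
    }
}

Head rotation would presumably work the same way, just without the clamp, since an offset can go in either direction.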
 
Oh, a calibrate neutral expression is a great idea. I do save the original state of the head and all expressions in the bowels of the plugin, but actually calibrating the morphs / head position against those sounds like a great feature.

I will add calibrate to my list of TODO items (low on time at the moment because of school and work).
 
I was playing in iClone and found a few more nice things about the "calibrate neutral expression" feature.
1. You can put the phone on top of, below, or to the side of the monitor, look straight at the monitor (not the phone), calibrate, and the character will be looking straight ahead.
2. If you want more open eyes, you can just close yours a bit, calibrate, and the character's eyes will be more open.
3. Same with the mouth: my captures always seemed to have a slight frown, so I frowned a bit, calibrated, and the character now has a better expression.

Really a handy feature. Epic's face capture app has it built in, I think, but it uses the Live Link protocol to talk to the app. I think that app can also record reference video/audio; the LIVE Face app doesn't do that.
 
With the latest version of VaM, I get a security prompt to confirm every time I try to adjust any of the settings sliders.
[Attachment: face cap error.PNG]
 
I've successfully connected the iPhone app to the plugin and gotten facial responses. I've read the readme file on the GitHub page, but I'm still wondering two basic things:
1) If, in order to translate recorded facial movements into mocap float parameters, the user needs to assign certain morphs to each recorded movement, where does a newbie like me get started? Is there a sample config file someone is willing to share with me?
2) How do I import a mocap recorded using the in-plugin recording option (saved as a .json) into Acidbubbles' Timeline (using the Timeline plugin UI)? The current mocap import function seems to only import mocaps made using the standard VaM interface, so I always end up getting an error message that no mocaps are present for import.

Thanks for helping!
 
1) If, in order to translate recorded facial movements into mocap float parameters, the user needs to assign certain morphs to each recorded movement, where does a newbie like me get started? Is there a sample config file someone is willing to share with me?

Reasonable defaults are assigned for you. If you would like to modify them, take a look at the VAM/Saves/PluginData/lfe_facialmotioncapture.json file and edit it with Notepad or VS Code. In there you can edit the morph, and the strength of that morph, that is mapped to each iPhone blendshape. Sorry, there is no UI for that yet =(. Once you save those changes, reload the plugin in VaM and they will be applied.
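
If it helps to picture what the file is doing, the mapping is conceptually just blendshape name -> (morph name, strength). Here is a sketch of the idea in C#; the names and values are examples only, not the plugin's real schema:

using System.Collections.Generic;

// Conceptual sketch only; the real lfe_facialmotioncapture.json
// schema and default values may differ.
public class MorphMapping {
    public string MorphName; // which VaM morph to drive
    public float Strength;   // multiplier applied to the incoming value
}

public static class MappingSketch {
    // iPhone blendshape name -> morph + strength (illustrative values).
    static readonly Dictionary<string, MorphMapping> Map =
        new Dictionary<string, MorphMapping> {
            { "jawOpen",     new MorphMapping { MorphName = "Mouth Open Wide", Strength = 1.0f } },
            { "browInnerUp", new MorphMapping { MorphName = "Brow Inner Up",   Strength = 0.8f } },
        };

    // On each frame: morph value = blendshape value * strength.
    public static float MorphValue(string blendshape, float value) {
        MorphMapping m;
        return Map.TryGetValue(blendshape, out m) ? value * m.Strength : 0f;
    }
}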


2) How do I import a mocap recorded using the in-plugin recording option (saved as a .json) into Acidbubbles' Timeline (using the Timeline plugin UI)? The current mocap import function seems to only import mocaps made using the standard VaM interface, so I always end up getting an error message that no mocaps are present for import.

Once you have recorded a file and it writes it to Saves/animations/mocap/...

1. Disconnect from your phone in the mocap plugin.
2. Add VamTimeline.AtomPlugin to your person if you have not already.
3. Click the "Open Custom UI..." button.
4. Click the "More..." tab at the top.
5. Click the "Import / export animations..." button on the right-hand side near the top.
6. Click "Import animation(s)".

(At this point Timeline has you in a different folder. I'll update my plugin to write to that folder soon, but for now:)

7. Navigate to Saves/animations/mocap.
8. Select the .json file you recorded.

It should now work and you can crop, edit, and so on in Timeline.
 
I like the idea of capturing facial expressions that can be associated with a specific character. This could add personality and maybe "wake" the person to life more if included in plugins like E-motion or Life.
Is there some way to capture expressions from a picture too? And will the captured expression need to be saved as a morph?
Maybe this is already possible?
Sorry for the questions, as I am still quite new to VaM.
 
Am I doing something wrong? When I raise my right eyebrow, it raises both eyebrows rather than just the one. Is this just how it works, or is it a bug? Would it be possible to add this as a feature? Asymmetric eyebrows, I mean.
 
It may just be how it works, but I am not an expert on this. I do know that Apple supports only a limited number of "blendshapes", and it could also be a limitation of the software that is sending the information to VaM. It is hard to say from this side of the internet.

I recommend watching the colors change in the sliders as you move individual eyebrows. If you see green colors getting brighter, that indicates a more intense value being sent. If you notice more intense values in areas that seem to control individual brows, try turning the multipliers for those higher to see if it is just a matter of tuning the strengths.
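
If you want to dig deeper, you could also log the per-side values as frames arrive and see whether the phone is actually sending different left/right numbers. A hypothetical snippet for inside a plugin method (for what it's worth, Apple's ARKit set does define separate browOuterUpLeft / browOuterUpRight shapes, while browInnerUp is a single combined shape):

using System.Collections.Generic;

// Hypothetical diagnostic: dump per-side brow values to see whether
// left and right actually differ in the incoming data.
// SuperController.LogMessage is VaM's message-log call.
void LogBrowValues(IDictionary<string, float> frame) {
    float l, r;
    frame.TryGetValue("browOuterUpLeft", out l);
    frame.TryGetValue("browOuterUpRight", out r);
    SuperController.LogMessage(string.Format("browOuterUp L={0:F2} R={1:F2}", l, r));
}

If the two numbers track each other exactly, the limitation is upstream of the plugin; if they differ, it is a matter of tuning the per-side multipliers.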
 
I've never gotten this error before today; what does it look like to you? I think it might be a memory issue, but it seems to happen sometimes when I put my hand in front of my face while connected to the app.
!> System.ArgumentOutOfRangeException: Argument is out of range.
Parameter name: size
at System.Net.Sockets.Socket.Receive (System.Byte[] buffer, Int32 offset, Int32 size, SocketFlags flags) [0x00000] in <filename unknown>:0
at LFE.FacialMotionCapture.Devices.RealIllusionLiveFaceClient.<Connect>m__0 () [0x00000] in <filename unknown>:0
(the same exception repeated several more times)
 
I will try to reproduce it; thank you for the stack trace.
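
My guess, and it is only a guess, is that the stream desynced (maybe when tracking drops out) and a bad length value was passed to Socket.Receive; Receive throws ArgumentOutOfRangeException when the size is negative or bigger than the buffer. If that is what is happening, a defensive read would look roughly like this (the 4-byte little-endian length prefix is an assumption on my part, not the documented LIVE Face format):

using System;
using System.Net.Sockets;

// Sketch of a defensive length-prefixed read.
static byte[] ReadFrame(Socket socket, byte[] buffer) {
    var header = new byte[4];
    ReadExactly(socket, header, 4);
    int size = BitConverter.ToInt32(header, 0);

    // Socket.Receive throws ArgumentOutOfRangeException when size is
    // negative or larger than the buffer, so validate before reading.
    if (size < 0 || size > buffer.Length)
        throw new InvalidOperationException("Stream desynced, reconnect");

    ReadExactly(socket, buffer, size);
    var frame = new byte[size];
    Array.Copy(buffer, frame, size);
    return frame;
}

// Loop until exactly count bytes have arrived; a single Receive may
// return fewer bytes than requested.
static void ReadExactly(Socket socket, byte[] buffer, int count) {
    int read = 0;
    while (read < count) {
        int n = socket.Receive(buffer, read, count - read, SocketFlags.None);
        if (n == 0) throw new SocketException((int)SocketError.ConnectionReset);
        read += n;
    }
}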
 
I was running a lot of RAM-heavy apps, so I really think it was that. I thought at some point the error said something about an invalid shape, but I couldn't get that to pop up again.
 
Any way to make it work on Android?
If you can find a motion capture client on Android that sends data over the network, let me know. I did not find one, but I did not look hard either. This plugin only reads data coming over a network, so it is dumb: it does not actually interact with phones directly.
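
In other words, anything that serves up the same bytes would work; the plugin has no idea a phone is involved. A toy sketch of a sender, purely to illustrate the point (the port number and payload format here are made up, not the real protocol):

using System.Net;
using System.Net.Sockets;
using System.Text;

// Toy illustration only: the plugin just consumes bytes from a TCP
// stream, so anything that serves the expected format could feed it.
class ToyFaceServer {
    static void Main() {
        var listener = new TcpListener(IPAddress.Any, 999); // port invented
        listener.Start();
        using (var client = listener.AcceptTcpClient()) {
            var payload = Encoding.UTF8.GetBytes("jawOpen=0.5"); // payload invented
            client.GetStream().Write(payload, 0, payload.Length);
        }
        listener.Stop();
    }
}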
 
Maybe try this? It was the first thing that came up in a search. Please let me know if it works.
 