
facial-ar-remote's Introduction

Facial AR Remote (Preview)

Notice

This project has been discontinued and replaced by the Live Capture package and Unity Face Capture app. This repository is no longer supported or maintained.

About

Facial AR Remote is a tool that allows you to capture blendshape animations directly from a compatible iOS device to the Unity Editor. Download the Facial AR Remote Integration Project if you want a Unity project with all dependencies built in.

Experimental Status

This repository is tested against the latest stable version of Unity and requires the user to build the iOS app to use as a remote. It is presented on an experimental basis - there is no formal support.

Download

Install the package through the Package Manager using the Git URL.

How To Use/Quick Start Guide

This repository uses Git LFS, so make sure you have LFS installed to get all of the files. This also means the large files are not included in GitHub's "Download ZIP" option, and the example head model, among other assets, will be missing.

iOS Build Setup

  1. Set your build target to iOS
  2. In Project Settings > Player Settings go to Other Settings > Camera Usage Description and type a description of why you are requesting camera access. This will be presented when you first open the app.
  3. Set the Client.scene as your build scene in the Build Settings and build the Xcode project.
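The player settings in step 2 can also be applied from an editor script. A minimal sketch, assuming a script placed in an `Editor` folder; the description string and menu path are just examples:

```csharp
using UnityEditor;

// Editor-only sketch: sets the iOS camera usage description and switches
// the build target before building the client.
public static class ClientBuildSettings
{
    [MenuItem("Build/Configure iOS Client Settings")]
    public static void Configure()
    {
        // Shown to the user the first time the app requests camera access.
        PlayerSettings.iOS.cameraUsageDescription =
            "Camera access is required for ARKit face tracking.";

        EditorUserBuildSettings.SwitchActiveBuildTarget(
            BuildTargetGroup.iOS, BuildTarget.iOS);
    }
}
```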

Editor Animation Setup

Install and Connection Testing

  1. (Optional) Install the Sloth example from the Package Manager by selecting the ARKit Facial Remote package and installing the sample.

  2. Be sure your device and editor are on the same network. Launch the app on your device and press play in the editor.

  3. Set the Port number on the device to the same Port listed on the Stream Reader component of the Stream Reader game object.

  4. Set the IP of the device to one listed in the console debug log.

  5. Press Connect on the device. If your face is in view, you should now see your expressions driving the character on screen. Note: you need to be on the same network, and you may have to disable any active VPNs and/or firewalls on the ports you are using; this may be necessary on your computer and/or on the network. Note: our internal setup used a dedicated wireless router attached to the editor computer, or a Lightning-to-Ethernet adaptor.
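If the device fails to connect, it can help to rule out basic reachability first. A minimal sketch in plain .NET (not part of this package) that attempts a TCP connection to the editor machine; the IP and port below are placeholders, so use the IP printed in the Unity console and the port on the Stream Reader component:

```csharp
using System;
using System.Net.Sockets;

// Standalone reachability check: tries to open a TCP connection to the
// editor machine and reports whether it succeeded within the timeout.
public static class ReachabilityCheck
{
    public static void Main()
    {
        using (var client = new TcpClient())
        {
            // Placeholder address and port; substitute your own values.
            var result = client.BeginConnect("192.168.1.2", 9000, null, null);
            bool reachable =
                result.AsyncWaitHandle.WaitOne(TimeSpan.FromSeconds(2));
            Console.WriteLine(
                reachable ? "Port reachable" : "Connection timed out");
        }
    }
}
```

If this times out from another machine on the same network, the problem is a firewall or network configuration rather than the app itself.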

Known Issues

  1. Character Rig Controller does not support Humanoid Avatar for bone animation.

  2. Animation Baking does not support Humanoid Avatar for avatar bone animation.

  3. Stream source can only connect to a single stream reader.

  4. Some network setups cause an issue with DNS lookup for getting IP addresses of the server computer.

facial-ar-remote's People

Contributors

bradweiers, foobraco, jonathan-unity, mtschoen, mtschoen-unity


facial-ar-remote's Issues

Suffix being added to blendshape names

I solved this issue but feel it should be documented, as I ran into it many times when exporting or importing the model into Unity. If the model is an FBX version greater than 2011, the model's name is appended to all blendshape names, creating names that will not be recognized, such as blendshapes2.jawOpen.
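One way to work around this without re-exporting the model is to strip the mesh-name qualifier before matching against the ARKit names. A hypothetical helper (not part of the package's actual mapping code):

```csharp
// Hypothetical helper: reduces a qualified name like "blendshapes2.jawOpen"
// to "jawOpen" so it can be matched against the ARKit blendshape list.
public static class BlendShapeNameUtil
{
    public static string StripMeshName(string shapeName)
    {
        int dot = shapeName.LastIndexOf('.');
        return dot >= 0 ? shapeName.Substring(dot + 1) : shapeName;
    }
}
```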

ImportFBX errors

Hi there

I just had FBX import errors on the Unity master end. It seems like a broken FBX model; would you be able to help? Error screenshot attached below. Thanks heaps, and I look forward to your reply.

(screenshot: Unity FBX import errors)

Unable To Connect

Hi, I downloaded and launched the package, but am unable to connect my iOS device to the editor.
I have tried disabling my Windows firewall. I am running the app on an iPad Pro.

Steps tried to execute the project :

  1. Run editor
  2. Check that the IP and port number printed in the console are the same on the iOS device
  3. Press connect.
    I have tried printing out the error message in the Client, and it read "exception: Connection timed out". I've tried searching for information about facial-ar-remote and can't find much, let alone for this error.

Thankful for any insights to my connecting issue, cheers.

Facial AR Remote compatible Device

We are actually using an iPhone XR for testing, and it shows that the device is not compatible. Is an iPhone X strictly required for using the script? We did, however, try it with an iPhone X, and it worked.

BlendShapesController streamReader coming up Null

Hi, in the BlendShapesController and CharacterRigController scripts, the IStreamReader streamReader is returning null for one of my new meshes. I have added two custom meshes before this one and they never had this problem. I'm not sure what to look into or change on the new one so that streamReader gets filled like the others. Does anyone know why it wouldn't be found?

Invalid AR Blendshapes?

I have imported a model with blendshapes. Some of them work perfectly, but a great number (mostly mouth and eye movements) don't, and I get a lot of this error:

Blend shape BS_node.eyeBlinkLeft is not a valid AR blend shape
UnityEngine.Debug:LogWarningFormat(String, Object[])
Unity.Labs.FacialRemote.BlendShapesController:UpdateBlendShapeIndices(IStreamSettings) (at Assets/facial-ar-remote/Remote/Scripts/BlendShapesController.cs:178)
Unity.Labs.FacialRemote.BlendShapesController:Update() (at Assets/facial-ar-remote/Remote/Scripts/BlendShapesController.cs:121)


Does anyone know why this might be happening?

FBX Export?

I see that when you bake the animation it saves as a .anim. Is there any way to save it as an FBX instead? I want to be able to edit the animation after the capture.

Missing Mesh

I'm trying to test the amazing Facial AR Remote sample.

I'm following all the documentation steps (maybe I've done something wrong or missed something).

The App works on my iPhone X: I see the mask over my face and all the gestures. GREAT!

On the Unity Editor I open the scene SlothBlendShapes.

My IP is 192.168.1.31 PORT: 9000

I open the app on my iPhone and press Connect, but... nothing ever happens :(

I only receive two warnings ( not errors ) on the Unity Console:

Missing mesh in Sloth_Head2 (UnityEngine.SkinnedMeshRenderer)
UnityEngine.Debug:LogWarning(Object)
Unity.Labs.FacialRemote.BlendShapesController:Start() (at Assets/Remote/Scripts/BlendShapesController.cs:72)

In the other issue you told me about git-lfs, but I didn't clone from GitHub; I downloaded the zip directly, decompressed it, and dragged & dropped it into my ARKit 2.0 project.

I think this is the origin of the problem, because the zip probably doesn't include the large files.

About the Versions:

I'm using Unity 2018.2.2f1 Personal (64-bit) running on macOS High Sierra 10.13.4

Eye driver only works if character looks towards -Z

When using the Drive Eyes setting in Character Rig Controller, the pitch of the eye transforms is inverted if the character looks towards positive Z (as in the Sloth Example).

To reproduce

  • Open the scene Examples/Scenes/SlothBlendShapes
  • Add shapes for left and right eyes and add them as Left Eye / Right Eye in Character Rig Controller
  • Set Drive Eyes to True, and set the eye look distance to something positive (it should be positive if the character looks towards positive Z, right?)

Exception Errors

Hi!

I'm getting errors after establishing connection.
One time:
NullReferenceException: Object reference not set to an instance of an object
Unity.Labs.FacialRemote.BlendShapesController.UpdateBlendShapeIndices (IStreamSettings settings) (at Assets/facial-ar-remote-master/Remote/Scripts/BlendShapesController.cs:106)
Unity.Labs.FacialRemote.BlendShapesController.Update () (at Assets/facial-ar-remote-master/Remote/Scripts/BlendShapesController.cs:76)

and a lot of:

KeyNotFoundException: The given key was not present in the dictionary.
System.Collections.Generic.Dictionary`2[UnityEngine.SkinnedMeshRenderer,Unity.Labs.FacialRemote.BlendShapeIndexData[]].get_Item (UnityEngine.SkinnedMeshRenderer key) (at /Users/builduser/buildslave/mono/build/mcs/class/corlib/System.Collections.Generic/Dictionary.cs:150)
Unity.Labs.FacialRemote.BlendShapesController.Update () (at Assets/facial-ar-remote-master/Remote/Scripts/BlendShapesController.cs:82)

What could it be?
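The KeyNotFoundException means a SkinnedMeshRenderer is being used as a dictionary key before an entry was registered for it (for example, if the controller's setup never ran for that renderer). A defensive sketch of the lookup; the field and variable names here are assumptions for illustration, not the package's actual code:

```csharp
// Sketch: guard the dictionary lookup so unregistered renderers are
// skipped instead of throwing KeyNotFoundException every frame.
// m_Indices and renderer are assumed names for illustration.
BlendShapeIndexData[] indices;
if (!m_Indices.TryGetValue(renderer, out indices))
    return; // renderer was never registered; skip it this frame

// ... apply blendshape weights using indices ...
```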

Cannot run

After I load the .unitypackage into Unity 2018.1.1f1, the Console shows
"Assets/facial-ar-remote/Remote/Scripts/ClientGUI.cs(5,7): error CS0246: The type or namespace name 'TMPro' could not be found. Are you missing an assembly reference?" and "Assets/facial-ar-remote/Remote/Scripts/ClientGUI.cs(49,9): error CS0246: The type or namespace name 'TMP_InputField' could not be found. Are you missing an assembly reference?"
Could you help to fix it? I appreciate that. :)

I can not press [ Bake Animation Clip ] button

hi

Facial AR Remote is wonderful.

I want to save an animation clip from the StreamReader using the [Bake Animation Clip] button,

but the [Bake Animation Clip] button does not become active, so I cannot bake the clip.

If you know something about how to use this, please let me know.

thank you.

Error occurs when building with Release

I used the .unitypackage file published in Releases to set up the project. However, an error occurs when I use Xcode to build the iOS client side.

../Libraries/UnityARKitPlugin/Plugins/iOS/UnityARKit/NativeInterface/ARSessionNative.mm:739:74: Cannot initialize a parameter of type 'id _Nonnull' with an rvalue of type 'Class'

The error is the same when I use either ARKit 1.5 or ARKit 2.0.

How can I fix this?

Can't find TouchGirl_Demo

Hello, I want to use the iPhone X's ARKit to drive facial expressions, but it doesn't work with my own head model. Could you share a working head model? @gabriele

Can't connect to Remote

It seems impossible to connect to the remote from the Unity console. I've tried using the IP shown in the console, but it never connects and seems to just time out. The PC is on a wired connection and the iPhone is on Wi-Fi, but the Connect button doesn't seem to do anything when pressed; it doesn't hang or appear to be attempting to connect.
I will try a Lightning-to-Ethernet adapter, but I'm sure others have had this problem, so any support would be great.
Error:
Failed to connect to player ip: 192.168.144.100 UnityEditorInternal.ProfilerDriver:DirectIPConnect(String) UnityEngine.GUIUtility:ProcessEvent(Int32, IntPtr)

Use different blendshape names

Hi,

I've been able to build the iOS client and run the Sloth demo. I'm now trying to do the same with a custom character, but the blendshape names are different (although they perform the same morphs required by ARKit). What's the best way to map these names from the StreamReader to my SkinnedMeshRenderer?

I see a mapping.cs script and a BlendShapeUtils.cs file that contains a list of ARKit blendshapes, but I'm not sure if modifying these files is the right way to do it.

Thanks

Note: modifying the original 3D file is not an option, since it's a bought character from a library
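Until there is an official remapping workflow, one approach is a small lookup table that translates the model's names to the ARKit names before the index lookup. A hypothetical sketch (the class, table, and entries are assumptions for illustration, not the package's API):

```csharp
using System.Collections.Generic;

// Hypothetical remapping table: translates a custom rig's blendshape
// names to the ARKit names the stream expects, without touching the FBX.
public static class BlendShapeRemap
{
    static readonly Dictionary<string, string> s_CustomToARKit =
        new Dictionary<string, string>
    {
        // Example entries; fill in your own rig's names.
        { "Mouth_Open", "jawOpen" },
        { "Blink_L", "eyeBlinkLeft" },
        { "Blink_R", "eyeBlinkRight" },
    };

    public static string ToARKitName(string customName)
    {
        string arkitName;
        return s_CustomToARKit.TryGetValue(customName, out arkitName)
            ? arkitName
            : customName; // already an ARKit name, or unmapped
    }
}
```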

Compiled App on App store

Has anyone just compiled this and put it on the App Store? I've compiled it a few times, only for it to stop working for some reason, and now I don't have my MacBook to push it to the iPhone. Has anyone put it up on the App Store?

Project doesn't work

Is this project still being maintained?

I'm attempting to get it to work with an iPhone 11 Pro and Unity 2019.2.
I had to remove the iPhone X limitation in the client code and disable my Windows firewall, and I finally managed to get it to connect, but the stream data doesn't affect the sloth.

facial-ar-remote 1.1 and ARKit 2.0: tongueOut not working

I'm using facial-ar-remote 1.1 and ARKit 2.0 (server and client), but tongueOut doesn't work. I checked the name of the tongue blendshape, dragged it onto the Blend Shape Controller component, and changed the count to 52.
Everything looks like it's working, but when I stick out my tongue it doesn't seem to be tracked. Is there any step I'm missing?

Latency getting bigger over time

Tested today with a fresh iPhone XR. Everything feels very snappy when I press play in the Editor, but latency builds up pretty visibly and quickly - if I have it running for, say, 60s, I can already perceive a latency of approx. half a second (compared to basically nothing in the beginning).

Any ideas? Looks like something is improperly queueing up results internally.
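A common cause of this pattern is consuming one packet per frame while the device sends faster than the editor drains them, so the backlog (and latency) grows without bound. A sketch of the usual fix; the queue and consumer names here are assumptions for illustration, not the package's actual buffer code:

```csharp
// Sketch: drain the whole backlog each frame and apply only the newest
// sample so latency stays bounded instead of accumulating.
// m_FrameQueue and ApplyFrame are assumed names for illustration.
byte[] latest = null;
byte[] next;
while (m_FrameQueue.TryDequeue(out next))
    latest = next;

if (latest != null)
    ApplyFrame(latest);
```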

Can this build a Windows program?

UnityEditor.BuildPlayerWindow+BuildMethodException: 2 errors
at UnityEditor.BuildPlayerWindow+DefaultBuildMethods.BuildPlayer (BuildPlayerOptions options) [0x0021f] in /Users/builduser/buildslave/unity/build/Editor/Mono/BuildPlayerWindowBuildMethods.cs:187
at UnityEditor.BuildPlayerWindow.CallBuildMethods (Boolean askForBuildLocation, BuildOptions defaultBuildOptions) [0x0007f] in /Users/builduser/buildslave/unity/build/Editor/Mono/BuildPlayerWindowBuildMethods.cs:94
UnityEngine.GUIUtility:ProcessEvent(Int32, IntPtr)

Examples/Scenes/SlothBlendShapes.scene
Building a Windows program on a macOS system


Model issues

Hi, I have imported a new model and the blendshapes work fine. However, I'm having issues making the model's head turn. I have checked against the provided sloth model and don't see anything different. Any insight is helpful. Also, when looking for a model, are there any considerations I should watch out for?

Eye driver using bones runs backwards on some models.

The eyes on some models (I tested with Mixamo/Fuse models) end up looking back into the skull.

This can obviously be fixed by editing the offset from -0.5 to 0.5, but then the up and down is reversed.

After some digging I found where this is updated, and added a bool to reverse the pitch.

I recommend adding a bool to flip the pitch to the driver options.
(example)

// ~line 412 of CharacterRigController.cs - duplicated for right eye just below
var leftEyePitch = Quaternion.AngleAxis((m_EyeLookUpLeft - m_EyeLookDownLeft) * m_EyeAngleRange.x, m_FlipEyePitch ? Vector3.left : Vector3.right);

Is this awesomeness still a live project with you guys at Labs?

Also, regardless: is there a list of the various blendshapes, or a document detailing the workflow, that we can put in front of our artists so we end up with our own character art that works with it?

(Just saw this at Unite and was wowed by it)

Questions about releasing applications using facial-ar-remote.

Dear Unity, I've encountered a problem: if I want to put facial-ar-remote into my Unity projects, what do I need to do? Is it necessary to get a developer account, and must the application be published through a MacBook? If not, could you please recommend some detailed introductions and tutorials about it? An API reference would also be great for me.
Thanks. Looking forward to receiving your reply.

Is it possible to use this tool outside a local network?

Hi,

First of all, thanks for the tool!

I'm working on a project where we would like to use the face data from ARKit to animate some characters on the PC, in real-time. I've been testing it locally with this tool, and so far, so good.
We'd like to also animate using data from iPhones that would be outside of the network the PC is in. Is this possible?

I'm assuming it might be outside the scope of this tool but, any help on how to achieve this would be appreciated. :)

QuaternionToEuler: Input quaternion was not normalized - on baking animation clip

//Error occurs when baking animation clip:
QuaternionToEuler: Input quaternion was not normalized
UnityEditor.AnimationUtility:SetEditorCurve(AnimationClip, EditorCurveBinding, AnimationCurve)
Unity.Labs.FacialRemote.ClipBaker:ApplyAnimationCurves() (at Assets/facial-ar-remote-master/Remote/Scripts/Editor/ClipBaker.cs:171)
Unity.Labs.FacialRemote.StreamReaderEditor:BakeClipLoop() (at Assets/facial-ar-remote-master/Remote/Scripts/Editor/StreamReaderEditor.cs:299)
Unity.Labs.FacialRemote.StreamReaderEditor:OnInspectorGUI() (at Assets/facial-ar-remote-master/Remote/Scripts/Editor/StreamReaderEditor.cs:253)
UnityEngine.GUIUtility:ProcessEvent(Int32, IntPtr)

//Followed by this error:
Assertion failed on expression: 'IsFinite(rot)'
UnityEditor.AnimationUtility:SetEditorCurve(AnimationClip, EditorCurveBinding, AnimationCurve)
Unity.Labs.FacialRemote.ClipBaker:ApplyAnimationCurves() (at Assets/facial-ar-remote-master/Remote/Scripts/Editor/ClipBaker.cs:171)
Unity.Labs.FacialRemote.StreamReaderEditor:BakeClipLoop() (at Assets/facial-ar-remote-master/Remote/Scripts/Editor/StreamReaderEditor.cs:299)
Unity.Labs.FacialRemote.StreamReaderEditor:OnInspectorGUI() (at Assets/facial-ar-remote-master/Remote/Scripts/Editor/StreamReaderEditor.cs:253)
UnityEngine.GUIUtility:ProcessEvent(Int32, IntPtr)

Baking Animation Records only first Keyframe

I'm able to use the remote with the sloth character, record animation and playback using the player on the stream recorder component. When I bake the animation, however, the Animation Clip produced contains only one keyframe at the head of each blendshape / animation track.

Considering that the Playback Buffer contains the animation, which can be played back on the character, the step that's failing us is the bake. Might someone have some insight on how to remedy this? Perhaps something not in the documentation that we're missing?

Thank you in advance for any help you can offer on this front!

How to detect iPhone on Unity Console on Windows?

@jonathan-unity Can I use Unity Windows instead of Mac?
I built the ARKit Remote scene using VMware macOS and was able to connect to the iPhone in the Console.
The frame rate was very low because the virtual machine is slow.

So I want to run the remote app and connect to Unity on Windows.
But when I open Unity on Windows and try to connect to the iPhone, I don't find the iPhone device in the Console list.

How do I detect the iPhone in the Unity Console on Windows? (I see the iPhone fine in Windows' iTunes.)

Freezing and lagging

It appears that the program lags when in use. At first it's completely fine, but as time goes on it freezes more and more. It could be heat related, but I believe it might also be something in the code. If this is a known issue, is there any documentation on it?

Missing steps in docs

Hey, just spotted a couple of steps that are missed on the build settings docs.

  1. Import TMP Essential Resources for TextMesh Pro
  2. Enable "ARKit Uses Facetracking" on UnityARKitPlugin > Resources > UnityARKitPlugIn > ARKitSettings

I wasn't familiar with either, so I spent a bit of time working this out. Hopefully it will save some others the headache. Otherwise, great work guys, I'm going to get a lot of use out of this.

Suggestions for ideal head and neck bone rig?

Thanks for this project! Our company is using Unity Pro and a fork of this for an iOS app we're shipping January 2019.

We do have some questions about the head and neck bone setup if you could help us out.


  1. Should one bone/joint be a parent of the other? e.g. turning neck bone turns head or vice versa?
  2. What is the ideal location for these bones? e.g. neck bone in the center of the neck mass at shoulder height, head bone in the center of the head mass at eye level?
  3. Roughly what are suggested vertex assignments? e.g. all of neck up to jawline assigned to neck bone, entire rest of head assigned to head bone?

Thanks again, any pointers at all would be very very appreciated! 🙌
