ARCore tutorial
Try-On makeup with Augmented Faces
How to use the Augmented Faces functionality of ARCore to create an Android try-on beauty app
YouTube recently released a new AR beauty try-on feature that lets users try on makeup while watching a video tutorial. In this tutorial, you will learn how to create a similar experience using ARCore and Augmented Faces.
Augmented Faces is a subsystem of ARCore that lets your app identify different areas of a face and overlay those areas with textures and 3D models.
Prepare texture material
To build a lipstick try-on app, a texture is required. We will use the UV texture from the reference face model canonical_face_mesh.fbx. You should get a texture like the one in the image below. Based on this texture, you can create any overlays you wish.
Adding functionality to the Android app
- Dependencies:
implementation "com.google.ar.sceneform.ux:sceneform-ux:1.10.0"
- Add the Camera permission and the AR feature to AndroidManifest.xml:
<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera.ar" android:required="true"/>
- Add meta-data under application:
<meta-data android:name="com.google.ar.core" android:value="required" />
- Configure the AR session and set the augmented face mode to MESH3D:
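A minimal sketch of this step in Kotlin, following the pattern of the Sceneform Augmented Faces sample: a custom ArFragment subclass (the name FaceArFragment is our placeholder) that requests the front camera and enables the 3D face mesh.
import com.google.ar.core.Config
import com.google.ar.core.Session
import com.google.ar.sceneform.ux.ArFragment
import java.util.EnumSet

// Use this fragment in your layout instead of the plain ArFragment.
class FaceArFragment : ArFragment() {

    // Augmented Faces tracks faces with the front-facing camera.
    override fun getSessionFeatures(): Set<Session.Feature> =
        EnumSet.of(Session.Feature.FRONT_CAMERA)

    // Ask ARCore to generate the 3D face mesh so it can be textured.
    override fun getSessionConfiguration(session: Session): Config =
        Config(session).apply {
            augmentedFaceMode = Config.AugmentedFaceMode.MESH3D
        }
}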
- Initialize a texture for the face filter and add your texture image to the drawable folder:
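A minimal sketch of loading the texture with Sceneform, assuming the makeup image was saved as res/drawable/makeup_texture.png (the file name and the faceTexture field are our placeholders):
import android.content.Context
import com.google.ar.sceneform.rendering.Texture

// Holds the loaded filter texture once the async build completes.
private var faceTexture: Texture? = null

private fun loadFaceTexture(context: Context) {
    // Texture.builder() loads the drawable asynchronously.
    Texture.builder()
        .setSource(context, R.drawable.makeup_texture)
        .build()
        .thenAccept { texture -> faceTexture = texture }
}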
- Add the texture to a newly detected face:
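A minimal sketch of this step, assuming faceTexture comes from the previous step and faceNodes is our own bookkeeping map; on every frame the listener attaches a textured AugmentedFaceNode to each newly tracked face and removes nodes for faces that are no longer tracked.
import com.google.ar.core.AugmentedFace
import com.google.ar.core.TrackingState
import com.google.ar.sceneform.ux.ArFragment
import com.google.ar.sceneform.ux.AugmentedFaceNode

// Keeps one renderable node per tracked face.
private val faceNodes = HashMap<AugmentedFace, AugmentedFaceNode>()

// Call once (e.g. in onCreate) with the FaceArFragment from the layout.
private fun trackFaces(arFragment: ArFragment) {
    val sceneView = arFragment.arSceneView
    sceneView.scene.addOnUpdateListener {
        val session = sceneView.session ?: return@addOnUpdateListener
        val texture = faceTexture ?: return@addOnUpdateListener

        // Attach a face node with the makeup texture to every new face.
        for (face in session.getAllTrackables(AugmentedFace::class.java)) {
            if (!faceNodes.containsKey(face)) {
                faceNodes[face] = AugmentedFaceNode(face).apply {
                    setParent(sceneView.scene)
                    setFaceMeshTexture(texture)
                }
            }
        }

        // Clean up nodes for faces that are no longer tracked.
        val iterator = faceNodes.entries.iterator()
        while (iterator.hasNext()) {
            val (face, node) = iterator.next()
            if (face.trackingState == TrackingState.STOPPED) {
                node.setParent(null)
                iterator.remove()
            }
        }
    }
}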
Conclusion
Based on the Augmented Faces sample from ARCore, it is pretty straightforward to track specific face areas and overlay them with any texture, using the UV mapping texture from canonical_face_mesh.fbx.
This tutorial was built using Sceneform. If you want to learn how to build a native Augmented Faces app with Kotlin and WITHOUT Sceneform, check out the codelabs on ARCore.how:
You can also find a mini tutorial on try-on makeup here:
You can find a screenshot from a demo “try-on lipstick” app below:
Source code:
Next tutorial on ARCore Augmented Faces: