
iOS AR Engineer
$35/hr
- Posted:
- Proposals: 4
- Remote
- #4481377
- Open for Proposals
Description
Experience Level: Expert
The work is technical and precise: we render directly onto a tracked face mesh at 60 fps, and every pixel matters.
You need to have shipped something real with ARKit. Not a demo. A product where face tracking drove rendering decisions at the vertex level.
The work involves:
- Face mesh geometry and UV mapping
- Real-time texture generation from geometric contours
- Sub-pixel optical edge detection against live camera frames
- Coordinate-space transformations between 3D, image, and texture space
- Threading discipline across ARKit, Vision, CoreGraphics, and SceneKit
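For a sense of the coordinate-space work, here is a minimal sketch (names and structure are illustrative, not our codebase) of lifting one face-mesh vertex from the anchor's local space through world space into camera image space:

```swift
import ARKit

// Sketch only: project a single face-mesh vertex into camera image space.
// Assumes a running ARFaceTrackingConfiguration session.
func imagePoint(forVertex vertexIndex: Int,
                faceAnchor: ARFaceAnchor,
                frame: ARFrame,
                viewportSize: CGSize) -> CGPoint {
    // The vertex lives in the face anchor's local coordinate space.
    let local = faceAnchor.geometry.vertices[vertexIndex]
    // Lift into world space via the anchor's transform.
    let world4 = faceAnchor.transform * simd_float4(local.x, local.y, local.z, 1)
    let world = simd_float3(world4.x, world4.y, world4.z)
    // ARCamera applies the intrinsics and viewport mapping for us.
    return frame.camera.projectPoint(world,
                                     orientation: .portrait,
                                     viewportSize: viewportSize)
}
```

Texture-space (UV) lookups are a separate transform again, which is exactly why the same vertex can land on different texels as the head rotates.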
You are the right person if:
- You have debugged a rendering artifact by reasoning about coordinate spaces, not by guessing blend modes
- You know what happens to Vision landmark coordinates when you pass the wrong orientation to VNImageRequestHandler
- You can explain why the same ARKit vertex produces different UV values depending on head angle
- Working with raw pixel buffers feels normal to you
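"Raw pixel buffers feel normal" means things like the following sketch (illustrative only; assumes the buffer has already been converted to 32-bit BGRA, since ARKit's captured frames arrive as YCbCr):

```swift
import CoreVideo

// Sketch only: read one BGRA pixel directly from a CVPixelBuffer.
func bgraPixel(in buffer: CVPixelBuffer, x: Int, y: Int)
    -> (b: UInt8, g: UInt8, r: UInt8, a: UInt8)? {
    // Guard the format assumption explicitly.
    guard CVPixelBufferGetPixelFormatType(buffer) == kCVPixelFormatType_32BGRA
    else { return nil }
    CVPixelBufferLockBaseAddress(buffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(buffer, .readOnly) }
    guard let base = CVPixelBufferGetBaseAddress(buffer) else { return nil }
    // Rows are padded: always index via bytesPerRow, never width * 4.
    let bytesPerRow = CVPixelBufferGetBytesPerRow(buffer)
    let p = base.advanced(by: y * bytesPerRow + x * 4)
                .assumingMemoryBound(to: UInt8.self)
    return (p[0], p[1], p[2], p[3])  // BGRA byte order
}
```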
You are not the right person if:
- Your ARKit experience stops at placing virtual objects in a room
- You have never read BGRA pixel values directly from a CVPixelBuffer
- Catmull-Rom splines and even-odd fill rules are unfamiliar
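If Catmull-Rom splines are unfamiliar, this is the level of math involved: a sketch of uniform Catmull-Rom interpolation between p1 and p2, with p0 and p3 as neighboring control points (apply per coordinate to smooth a contour through landmark points):

```swift
// Sketch only: uniform Catmull-Rom interpolation for t in [0, 1].
// Passes through p1 at t = 0 and p2 at t = 1.
func catmullRom(_ p0: Double, _ p1: Double,
                _ p2: Double, _ p3: Double, t: Double) -> Double {
    let t2 = t * t
    let t3 = t2 * t
    return 0.5 * ((2 * p1)
        + (-p0 + p2) * t
        + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t2
        + (-p0 + 3 * p1 - 3 * p2 + p3) * t3)
}
```

Even-odd fill is the companion piece: when the closed contour is rasterized (e.g. with CGContext's `.evenOdd` fill rule), self-intersections and holes resolve predictably.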
- Projects completed: 40
- Freelancers worked with: 36
- Projects awarded: 48%
- Last project: 13 Mar 2026
- United Kingdom