> **Note:** This repository was archived by the owner on Jul 1, 2024 and is now read-only.

⚠️ The FCRN mlmodel was removed by Apple and replaced with Depth Anything V2 ⚠️

# Depth estimation using FCRN-CoreML

## iOS and macOS

Depth-estimation sample apps for iOS and macOS, built around the FCRN-DepthPrediction models Apple provided on its model page.

*Screenshots: fcrn16-iOS, fcrn16-macOS*

- Loads an image and crops it to the input size required by the FCRN model
- A single helper class, `ImagePlatform`, provides hardware-accelerated processing for both iOS and macOS images and buffers
- Supports both the FCRN-16 and FCRN-32 models
- Lets you post the predicted depth map of a cropped portrait photo to Facebook as a 3D photo directly from your iPhone
- Alternatively, you can post the cropped predicted depth map together with the cropped input photo to Facebook as a 3D photo using a browser on your Mac or PC. See this guide for more information.
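The load–crop–predict flow described above can be sketched with the Vision framework. This is a minimal, hypothetical example and not the project's actual code: it assumes the `FCRNFP16.mlmodel` has been added to the Xcode project (Core ML then generates an `FCRNFP16` class); the function name is illustrative.

```swift
import CoreML
import Vision

// Hypothetical sketch: run an FCRN depth prediction on a CGImage.
// Assumes the FCRNFP16 class was auto-generated from FCRNFP16.mlmodel.
func predictDepth(for image: CGImage, completion: @escaping (MLMultiArray?) -> Void) {
    guard let coreMLModel = try? FCRNFP16(configuration: MLModelConfiguration()).model,
          let visionModel = try? VNCoreMLModel(for: coreMLModel) else {
        completion(nil)
        return
    }
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // The FCRN models return the depth map as an MLMultiArray.
        let depthMap = (request.results?.first as? VNCoreMLFeatureValueObservation)?
            .featureValue.multiArrayValue
        completion(depthMap)
    }
    // Let Vision crop and scale the input to the size the model expects.
    request.imageCropAndScaleOption = .centerCrop
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```

Using `imageCropAndScaleOption = .centerCrop` delegates the cropping step to Vision; the sample app's `ImagePlatform` helper performs its own cropping instead, so it can reuse the cropped image when composing the 3D-photo post.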

Before you try the sample app, you need to download a model and save it in the `mlmodel` folder.

You can download the FCRN-DepthPrediction Core ML models from https://developer.apple.com/machine-learning/models/

You only need one of them; both work with this project. Choose which one to use by selecting the relevant build target in Xcode.
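Selecting a model per build target can be done with a compilation condition. The sketch below is an assumption about how this might look, not the project's actual mechanism: the `USE_FP16` flag is a hypothetical name you would define under the FP16 target's "Active Compilation Conditions" build setting, and `FCRN`/`FCRNFP16` are the classes Core ML generates from the corresponding `.mlmodel` files.

```swift
import CoreML

// Hypothetical sketch: alias the generated model class per build target.
// USE_FP16 is an illustrative flag defined in one target's build settings.
#if USE_FP16
typealias DepthModel = FCRNFP16   // half-precision weights, ~127 MB
#else
typealias DepthModel = FCRN       // full-precision weights, ~255 MB
#endif

let depthModel = try DepthModel(configuration: MLModelConfiguration())
```

The rest of the code can then refer to `DepthModel` without caring which precision variant the target ships with.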

| Model | Weight precision | Size | Download |
| --- | --- | --- | --- |
| `FCRN.mlmodel` | Full precision (32-bit floating point) | 254.7 MB | https://docs-assets.developer.apple.com/coreml/models/Image/DepthEstimation/FCRN/FCRN.mlmodel |
| `FCRNFP16.mlmodel` | Half precision (16-bit floating point) | 127.3 MB | https://docs-assets.developer.apple.com/coreml/models/Image/DepthEstimation/FCRN/FCRNFP16.mlmodel |