SafeImage

CAUTION: THERE ARE SOME NUDE PICTURES IN THIS REPO FOR TESTING.

The goal of this repo is to use Google Cloud Vision and AWS Rekognition to identify inappropriate images, and to compare which of these services is better able to identify them.
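To compare the two services, their responses first have to be reduced to a common verdict. The sketch below is an illustration, not the repo's actual code: Google Vision's SafeSearch annotation reports a likelihood per category (adult, racy, violence, ...), while Rekognition's DetectModerationLabels returns labels with a 0-100 confidence score. The function names and thresholds here are assumptions.

```python
# Hedged sketch: normalise each service's moderation response into a
# single flagged/not-flagged verdict so the two can be compared.
# Function names and thresholds are illustrative, not from the repo.

# Google Vision SafeSearch likelihood values, in increasing order.
VISION_LIKELIHOODS = ["UNKNOWN", "VERY_UNLIKELY", "UNLIKELY",
                      "POSSIBLE", "LIKELY", "VERY_LIKELY"]

def vision_flagged(safe_search, threshold="LIKELY"):
    """Flag if any SafeSearch category (adult, racy, ...) meets the
    likelihood threshold. `safe_search` maps category -> likelihood."""
    bar = VISION_LIKELIHOODS.index(threshold)
    return any(VISION_LIKELIHOODS.index(v) >= bar
               for v in safe_search.values())

def rekognition_flagged(moderation_labels, min_confidence=80.0):
    """Flag if any Rekognition moderation label is confident enough.
    Each label dict carries a 0-100 'Confidence' value."""
    return any(label["Confidence"] >= min_confidence
               for label in moderation_labels)
```

With both services mapped to a boolean, comparing them over the test images is just counting agreements and disagreements per image.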

This is for testing these services. I have also used Google's text detection API (OCR) to identify swear words, loading a list of curse words from a folder.
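The curse-word check described above can be sketched as follows. This is an assumption about how the repo works, not its actual code: the word-list files are assumed to hold one word per line, and the OCR output is treated as plain text.

```python
# Hedged sketch of the OCR swear-word check: load curse words from
# files in a folder, then test OCR-detected text against the list.
# The folder layout and one-word-per-line format are assumptions.
from pathlib import Path

def load_curse_words(folder):
    """Read every file in `folder`; one word per line, case-insensitive."""
    words = set()
    for path in Path(folder).iterdir():
        if path.is_file():
            words.update(line.strip().lower()
                         for line in path.read_text().splitlines()
                         if line.strip())
    return words

def contains_swear_word(ocr_text, curse_words):
    """True if any token of the OCR text appears in the word list."""
    tokens = (t.strip(".,!?;:").lower() for t in ocr_text.split())
    return any(t in curse_words for t in tokens)
```

In use, the OCR text would come from the text detection response's full-text annotation, and `load_curse_words` would be called once at startup.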
