
Kiwibp/NYC-DSA-Bootcamp--Final-Project


Data Wrangling & NLP on Local Used Items


Overview

I wrangled 10,000 used-item listings from my area, scraped from Craigslist.org, letgo.com, and Facebook Marketplace. I then performed exploratory data analysis (EDA) with Python and Tableau, and built a few simple extractor and classifier NLP models using the MonkeyLearn API. You can view the interactive visualizations I created in Tableau here: https://public.tableau.com/profile/keenan.burke.pitts#!/vizhome/NYCDSAFinalProject_0/LocalUsedItemsAnalysis
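A large part of wrangling listings scraped from several marketplaces is normalizing inconsistent fields. As a minimal sketch (the formats shown are illustrative assumptions, not the exact strings in the dataset), a helper like this coerces free-form price strings into numbers:

```python
import re

def parse_price(raw):
    """Extract a numeric price from a scraped listing string, or None.

    Listings mix formats like "$1,200", "60$", or "Free" (assumed
    examples); anything without a digit is treated as unpriced.
    """
    match = re.search(r"\$?\s*(\d[\d,]*(?:\.\d+)?)", raw)
    if not match:
        return None
    # Strip thousands separators before converting.
    return float(match.group(1).replace(",", ""))
```

A pure function like this is easy to apply across all three sources before merging them into one table.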

Installation

To reproduce this project you will need:

- The Scrapy framework (https://scrapy.org/) to scrape listings from letgo.com and craigslist.org. A paid Crawlera account (https://scrapinghub.com/crawlera) may be needed to avoid being blocked by Craigslist.
- Selenium (https://selenium-python.readthedocs.io/) to scrape Facebook Marketplace.
- Tableau (https://public.tableau.com/en-us/s/download) to open the visualizations locally, or you can simply interact with them on my public profile.
- A MonkeyLearn account (https://monkeylearn.com/) to connect to their API and run the NLP models.
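Connecting to the MonkeyLearn API amounts to an authenticated POST of listing texts to a classifier endpoint. A minimal sketch using only the standard library, assuming MonkeyLearn's v3 REST API shape; the API key and model ID are placeholders you would take from your own MonkeyLearn dashboard:

```python
import json
import urllib.request

# Placeholders (assumptions): substitute your own credentials and model ID.
API_KEY = "YOUR_API_KEY"
MODEL_ID = "cl_XXXXXXXX"

def build_classify_request(texts, model_id=MODEL_ID, api_key=API_KEY):
    """Build (but do not send) a classify request for a list of texts."""
    url = f"https://api.monkeylearn.com/v3/classifiers/{model_id}/classify/"
    payload = json.dumps({"data": texts}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=payload,
        headers={
            "Authorization": f"Token {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Sending is a one-liner once the request is built:
# with urllib.request.urlopen(build_classify_request(["Leather couch"])) as r:
#     results = json.load(r)
```

Separating request construction from sending keeps the network call out of the way while you batch and clean listing texts.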
