Crawler API is a service that parses a URL and returns metadata from that web page. It uses Redis to cache results for sites that have already been visited, ensuring fast responses.
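The caching behavior can be sketched as a cache-aside lookup keyed by URL. The sketch below is illustrative only, assuming a hypothetical `MetaCache` abstraction over the Redis client and a stand-in `fetchAndParse` scraper; neither name comes from the actual codebase.

```typescript
// Hypothetical sketch of the cache-aside lookup the service performs.
// `MetaCache` abstracts the Redis client; `fetchAndParse` stands in for
// the real page scraper. Both are assumptions, not the project's code.

interface PageMeta {
  title: string;
  description: string;
  source: string;
}

interface MetaCache {
  get(key: string): Promise<string | null>;
  set(key: string, value: string): Promise<void>;
}

async function getMetas(
  url: string,
  cache: MetaCache,
  fetchAndParse: (url: string) => Promise<PageMeta>,
): Promise<PageMeta> {
  // Serve from the cache when the site has already been visited.
  const cached = await cache.get(url);
  if (cached !== null) {
    return JSON.parse(cached) as PageMeta;
  }

  // Cache miss: scrape the page, then store the result for next time.
  const meta = await fetchAndParse(url);
  await cache.set(url, JSON.stringify(meta));
  return meta;
}
```

Only the first request for a given URL hits the network; repeat requests are served straight from the cache.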
- Create a `.env` file from the `.env.example` file (`cp .env.example .env`) and fill in the necessary environment variables.
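The variable names below are guesses based on what this README mentions elsewhere (a configurable port and a Redis dependency); check `.env.example` for the authoritative list.

```
# Hypothetical values — consult .env.example for the real variable names
PORT=5200
REDIS_URL=redis://localhost:6379
```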
- Ensure Docker is installed on your local machine (refer to the Docker Guide).
- Once Docker is installed, run `docker-compose build` from the root of the application.
- Start the development server by running `docker-compose up -d crawler`. This brings up the app and its dependencies. It can take a few seconds for your local environment to become accessible via the browser, even after you see 'done' on the console.
- Now access the server on `localhost:3000` or `127.0.0.1:3000`. This depends on the port specified in the `.env` file; it defaults to 5200 if none is specified.
- Run `docker-compose -f docker-compose-test.yml up` to test the app.
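For reference, a compose file for a setup like this might look as follows. This is a sketch, not the project's actual `docker-compose.yml`; the image names and port mapping are assumptions.

```yaml
# Hypothetical sketch — the project's real docker-compose.yml is authoritative.
version: "3"
services:
  crawler:
    build: .
    ports:
      - "5200:5200"   # the README notes the port defaults to 5200
    env_file: .env
    depends_on:
      - redis
  redis:
    image: redis:alpine
```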
- Create a `.env` file from the `.env.example` file (`cp .env.example .env`) and fill in the necessary environment variables.
- Run `yarn` to install dependencies.
- Run `yarn run start:dev` to start the development server.
- Now access the server on `localhost:3000` or `127.0.0.1:3000`. This depends on the port specified in the `.env` file; it defaults to 5200 if none is specified.
- Run `yarn run test` to run the tests.
Import this Postman collection OR visit the App Demo and use the query below in the playground:
```graphql
{
  getMetas(url: "https://graphql.org/") {
    title
    description
    source
    image {
      url
      width
      height
    }
  }
}
```
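The query above should return a JSON body shaped like the interfaces below. These types are inferred from the query's fields, not taken from the service's published schema, so treat them as a sketch.

```typescript
// Response shape inferred from the getMetas query fields — a sketch,
// not the official schema.
interface ImageMeta {
  url: string;
  width: number;
  height: number;
}

interface GetMetasResponse {
  data: {
    getMetas: {
      title: string;
      description: string;
      source: string;
      image: ImageMeta;
    };
  };
}

// Parse a raw GraphQL response body into the metadata payload.
function parseGetMetas(body: string): GetMetasResponse["data"]["getMetas"] {
  const parsed = JSON.parse(body) as GetMetasResponse;
  return parsed.data.getMetas;
}
```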