caffeine - minimum viable backend

A very basic REST service for JSON data - enough for prototyping and MVPs!

Features:

  • no need to set up a database, all data is managed automagically
  • REST-paradigm CRUD for multiple entities/namespaces
  • JWT authentication
  • realtime notifications (HTTP/SSE)
  • schema validation
  • autogenerated Swagger/OpenAPI specs
  • search using jq-like syntax (see https://stedolan.github.io/jq/manual/)
  • CORS enabled
  • easy to deploy as a container

Currently supports:

  • in-memory database (map)
  • sqlite
  • postgres
  • filesystem storage

For a sample Vue app using caffeine see: https://gist.github.com/calogxro/6e601e07c2a937df4418d104fb717570

How to

Simply start the server with:

go run caffeine.go

Optional params are:

Usage of caffeine:
  -AUTH_ENABLED=false: enable JWT auth
  -DB_TYPE="memory": db type to use, options: memory | postgres | fs | sqlite
  -DB_PATH="./data": path of the file storage root or sqlite database
  -IP_PORT=":8000": ip:port to expose
  -PG_HOST="0.0.0.0": postgres host (port is 5432)
  -PG_PASS="": postgres password
  -PG_USER="": postgres user
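For example, to run against a local sqlite file on a different port (the file path and port here are illustrative):

```shell
# run caffeine with sqlite storage, listening on port 9000
go run caffeine.go -DB_TYPE=sqlite -DB_PATH=./caffeine.db -IP_PORT=":9000"
```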

Store a new "user" with an ID and some JSON data:

> curl -X POST -d '{"name":"jack","age":25}' http://localhost:8000/ns/users/1
{"name":"jack","age":25}

The value must be valid JSON, but can otherwise be anything.

Retrieve it later with:

> curl http://localhost:8000/ns/users/1
{"name":"jack","age":25}

All operations

Insert/update

> curl -X POST -d '{"name":"jack","age":25}' http://localhost:8000/ns/users/1
{"name":"jack","age":25}

Delete

> curl -X DELETE http://localhost:8000/ns/users/1

Get by ID

> curl http://localhost:8000/ns/users/1
{"name":"jack","age":25}

Get all values for a namespace

> curl http://localhost:8000/ns/users | jq
[
  {
    "key": "2",
    "value": {
      "age": 25,
      "name": "john"
    }
  },
  {
    "key": "1",
    "value": {
      "age": 25,
      "name": "jack"
    }
  }
]

Get all namespaces

> curl http://localhost:8000/ns
["users"]

Delete a namespace

> curl -X DELETE http://localhost:8000/ns/users
{}

Search by property (jq syntax)

> curl 'http://localhost:8000/search/users?filter=select(.name=="jack")' | jq
{
  "results": [
    {
      "key": "1",
      "value": {
        "age": 25,
        "name": "jack"
      }
    }
  ]
}
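If you want to experiment with filter expressions before sending them to the server, the same syntax can be tried locally with the jq binary (assuming it is installed; the `.[].value` prefix below is an assumption about how the filter is applied to stored values):

```shell
# try a filter locally against sample data shaped like caffeine's output
echo '[{"key":"1","value":{"name":"jack","age":25}}]' \
  | jq -c '.[].value | select(.name=="jack")'
# prints {"name":"jack","age":25}
```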

JWT Authentication

A first implementation of JWT authentication is available. See the project's JWT documentation for details.
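As a sketch of what an authenticated call might look like with auth enabled: the Bearer scheme and the way you obtain the token are assumptions here, so check the JWT documentation for the actual flow.

```shell
# start the server with auth enabled
go run caffeine.go -AUTH_ENABLED=true

# hypothetical: pass a previously obtained JWT as a Bearer token
curl -H "Authorization: Bearer $TOKEN" http://localhost:8000/ns/users/1
```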

Realtime Notifications

Using HTTP Server-Sent Events (SSE), you can get notified when data changes; just listen on the /broker endpoint:

curl http://localhost:8000/broker

For every insert or delete, an event is emitted:

{"event":"ITEM_ADDED","namespace":"test","key":"1","value":{"name":"john"}}
...
{"event":"ITEM_DELETED","namespace":"test","key":"1"}
...

Swagger/OpenAPI specs

After you add some data, you can generate the specs with:

curl http://localhost:8000/openapi.json

or you can just go to http://localhost:8000/swaggerui/ and use it interactively!

Schema Validation

You can add a schema for a specific namespace; afterwards, only JSON data matching the schema will be accepted.

To add a schema for the namespace "user", use the one available in schema_sample/:

curl --data-binary @./schema_sample/user_schema.json http://localhost:8000/schema/user

Now only validated "users" will be accepted (see user.json and invalid_user.json under schema_sample/).
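A schema can also be posted inline. The minimal JSON Schema below is illustrative only, not the actual contents of user_schema.json:

```shell
# hypothetical minimal schema requiring a string "name" and an integer "age"
curl --data-binary '{
  "type": "object",
  "required": ["name", "age"],
  "properties": {
    "name": { "type": "string" },
    "age":  { "type": "integer" }
  }
}' http://localhost:8000/schema/user
```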

Run as container

docker build -t caffeine .

and then run it:

docker run --publish 8000:8000 caffeine

Run with Postgres

First run an instance of Postgres (for example with docker):

docker run -e POSTGRES_USER=caffeine -e POSTGRES_PASSWORD=mysecretpassword -p 5432:5432 -d postgres:latest

Then run caffeine with the right params to connect to the db:

DB_TYPE=postgres PG_HOST=0.0.0.0 PG_USER=caffeine PG_PASS=mysecretpassword go run caffeine.go

(params can be passed as ENV variables or as command-line ones)

A quick way to run both caffeine and Postgres on Docker is with docker-compose:

docker-compose up -d
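The repository ships its own docker-compose.yml; as a rough idea of how caffeine gets wired to Postgres, a minimal compose file might resemble the sketch below (service names and values are illustrative, not the actual file):

```yaml
# illustrative sketch, not the repository's actual docker-compose.yml
services:
  db:
    image: postgres:latest
    environment:
      POSTGRES_USER: caffeine
      POSTGRES_PASSWORD: mysecretpassword
  caffeine:
    build: .
    ports:
      - "8000:8000"
    environment:
      DB_TYPE: postgres
      PG_HOST: db
      PG_USER: caffeine
      PG_PASS: mysecretpassword
    depends_on:
      - db
```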