A discovery service for matching people to the libraries that serve them.
This is a LYRASIS-maintained fork of the NYPL Library Simplified Library Registry.
Docker images are available at:
You will need both this repository and the separate front end repo in order to build the local development images. The registry front end repo should be checked out into a directory named `registry_admin` in the same parent directory as the `library-registry` repo itself. If it is not, you will need to change the host mount instructions in the `docker-compose.yml` file to accommodate its location. To get them both in the same parent directory, execute the following from that directory:
```shell
git clone https://github.com/thepalaceproject/library-registry.git
git clone https://github.com/thepalaceproject/library-registry-admin.git
```
These environment variables are generally applicable, regardless of installation method, and are included here because they are not discussed elsewhere in this document.

- `EMAILER_RECIPIENT_OVERRIDE`: If set, `emailer` will send all non-test email to this email address.
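To illustrate the intended behavior of this override, here is a hypothetical helper (the actual `emailer` implementation may differ):

```python
import os


def resolve_recipient(intended_recipient, is_test=False):
    """Return the address mail should actually go to.

    If EMAILER_RECIPIENT_OVERRIDE is set, all non-test email is
    redirected to that address; test email is delivered normally.
    (Hypothetical helper -- the real emailer may differ.)
    """
    override = os.environ.get("EMAILER_RECIPIENT_OVERRIDE")
    if override and not is_test:
        return override
    return intended_recipient


os.environ["EMAILER_RECIPIENT_OVERRIDE"] = "[email protected]"
print(resolve_recipient("[email protected]"))             # redirected to the override
print(resolve_recipient("[email protected]", is_test=True))  # test mail delivered as-is
```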
If you are not using Docker, skip to the section entitled "Installation (non-Docker)".
Because the Registry runs in a Docker container, the only required software is Docker Desktop. The database and webapp containers expect to be able to operate on ports 5432 and 80, respectively; if those ports are already in use, you may need to amend the `docker-compose.yml` file to use alternate ports.
Note: If you would like to use the `Makefile` commands you will also need `make` in your `PATH`. They are purely convenience methods, so they aren't strictly required. If you don't want to use them, just run the commands from the corresponding task in the `Makefile` manually. You can run `make help` to see the full list of commands.
While they won't need to be changed often, there are a couple of environment variables set in the `Dockerfile` that are referenced within the container:

- `LIBRARY_REGISTRY_DOCKER_HOME` is the app directory.
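For illustration, code running inside the container could look this variable up as follows (a minimal sketch; the `/app` fallback is a hypothetical placeholder, not the value actually baked into the `Dockerfile`):

```python
import os

# LIBRARY_REGISTRY_DOCKER_HOME points at the app directory inside the
# container; "/app" below is a hypothetical fallback for illustration only.
app_home = os.environ.get("LIBRARY_REGISTRY_DOCKER_HOME", "/app")
print(app_home)
```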
Local development uses two Docker images and one persistent Docker volume (for the PostgreSQL data directory). To create the base images:

```shell
cd library-registry
make build
```
You can start up the local compose cluster in the background with:

```shell
make up
```

Alternatively, if you want to keep a terminal attached to the running containers, so you can see their output, use:

```shell
make up-watch
```
- `make stop` to stop (but not remove) the running containers
- `make start` to restart a stopped cluster
- `make down` to stop and remove the running containers
- `make clean` to stop and remove the running containers and delete the database container's data volume
While the cluster is running, you can access the containers with these commands:

- `make db-session` - Starts a `psql` session on the database container as the superuser
- `make webapp-shell` - Opens a shell on the webapp container
The Library Registry listens (via Nginx) on port 80, so once the cluster is running you should be able to point a browser at `https://localhost/admin/` and access it with the username/password `admin`/`admin`.
The Library Registry Admin front end is implemented as a Node package. The name and version of this package are configured in `admin/config.py`. In addition, either or both may be overridden via environment variables. For example:

```shell
TPP_LIBRARY_REGISTRY_ADMIN_PACKAGE_NAME=@thepalaceproject/library-registry-admin
TPP_LIBRARY_REGISTRY_ADMIN_PACKAGE_VERSION=1.0.0
```
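As a sketch of how such environment overrides typically resolve against configured defaults (the default values below are stand-ins for illustration; the actual logic and values live in `admin/config.py`):

```python
import os

# Hypothetical defaults standing in for the real values in admin/config.py.
DEFAULT_PACKAGE_NAME = "@thepalaceproject/library-registry-admin"
DEFAULT_PACKAGE_VERSION = "1.0.0"


def admin_package():
    """Resolve the admin client package name and version, letting either
    environment variable override its default independently."""
    name = os.environ.get(
        "TPP_LIBRARY_REGISTRY_ADMIN_PACKAGE_NAME", DEFAULT_PACKAGE_NAME
    )
    version = os.environ.get(
        "TPP_LIBRARY_REGISTRY_ADMIN_PACKAGE_VERSION", DEFAULT_PACKAGE_VERSION
    )
    return name, version


# Overriding only the version leaves the name at its default.
os.environ["TPP_LIBRARY_REGISTRY_ADMIN_PACKAGE_VERSION"] = "1.1.0"
print(admin_package())
```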
The default configuration will result in the admin client being served from a content delivery network. To enable use of a local copy to support development/debugging, ensure that this repo and that of the admin UI have the same parent directory and then perform the following from the base of this repo:

```shell
(cd admin && npm link ../../library-registry-admin)
```

This will link the admin UI project into the `admin` directory in a manner that is compatible with both Docker and non-containerized development. If the package is properly linked, admin UI assets will be served from the linked package rather than the CDN.
To install the registry locally, you'll need the following:
- PostgreSQL 12+
- PostGIS 3
- Python 3.6+ (3.9 is the build target for the Docker install)
- Appropriate system dependencies to build the Python dependencies, which may include:
With a running PostgreSQL/PostGIS installation, you can create the required test and dev databases by executing:

```sql
CREATE DATABASE simplified_registry_dev;
CREATE USER simplified WITH PASSWORD 'simplified';
GRANT ALL PRIVILEGES ON DATABASE simplified_registry_dev TO simplified;

CREATE DATABASE simplified_registry_test;
CREATE USER simplified_test WITH PASSWORD 'simplified_test';
GRANT ALL PRIVILEGES ON DATABASE simplified_registry_test TO simplified_test;

\c simplified_registry_dev
CREATE EXTENSION fuzzystrmatch;
CREATE EXTENSION postgis;

\c simplified_registry_test
CREATE EXTENSION fuzzystrmatch;
CREATE EXTENSION postgis;
```
The database configuration is exposed to the application via environment variables:

```shell
SIMPLIFIED_TEST_DATABASE=postgresql://simplified_test:simplified_test@localhost:5432/simplified_registry_test
SIMPLIFIED_PRODUCTION_DATABASE=postgresql://simplified:simplified@localhost:5432/simplified_registry_dev
```

For development work, you should create a `.env` file in the project directory that includes these variables set to the appropriate values for your environment.
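If you don't use a tool that loads `.env` files automatically, a minimal loader can be sketched like this (an illustration only; tools such as `python-dotenv` handle quoting and edge cases properly):

```python
import os


def load_dotenv(path=".env"):
    """Minimal .env loader: read KEY=VALUE lines into os.environ,
    skipping blanks and comments. Illustration only -- use a real
    tool such as python-dotenv for anything beyond local hacking."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())


# Write an example .env so the loader has something to read.
with open(".env", "w") as f:
    f.write(
        "# local development settings\n"
        "SIMPLIFIED_TEST_DATABASE="
        "postgresql://simplified_test:simplified_test@localhost:5432/simplified_registry_test\n"
    )

load_dotenv()
print(os.environ["SIMPLIFIED_TEST_DATABASE"])
```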
The project expects to use `poetry` for dependency and virtualenv management, so first install that. Having done so, you should be able to run the following in the project directory to install all dependencies.

For a development environment:

```shell
poetry install --no-root -E pg-binary
```

For a production environment:

```shell
poetry install --no-dev --no-root -E pg
```

To start the registry inside the virtualenv that `poetry` creates:

```shell
FLASK_APP=app.py poetry run flask run
```
This project runs all the unit tests through GitHub Actions for new pull requests and when merging into the default `main` branch. The relevant file can be found in `.github/workflows/test-build.yml`. When contributing updates or fixes, the test GitHub Action must pass for all Python environments. Run the `tox` command locally before pushing changes so you catch any failing tests before committing them.
We lint our code with `black` and `isort`. These are automatically run by both `tox` and GitHub Actions, and the linters must pass before code is committed.

You can run `isort` through `tox` with the command:

```shell
tox -e isort
```

This will lint the code with `isort` but not make any changes. If you want `isort` to automatically reformat your code, you can run:

```shell
tox -e isort-reformat
```

Similar to `isort`, you can run `black` through `tox` with the command:

```shell
tox -e black
```

This will lint the code with `black` but not make any changes. If you want `black` to automatically reformat your code, you can run:

```shell
tox -e black-reformat
```
GitHub Actions runs our unit tests against different Python versions automatically using `tox`. To run the `pytest` unit tests locally, install `tox`:

```shell
pip install tox
```

Tox has an environment for each Python version and an optional `-docker` factor that will automatically use Docker to deploy the service containers used for the tests. You can select the environment you would like to test with the `tox -e` flag.
| Environment | Python Version |
| ----------- | -------------- |
| py36        | Python 3.6     |
| py37        | Python 3.7     |
| py38        | Python 3.8     |

All of these environments are tested by default when running `tox`. To test one specific environment you can use the `-e` flag.

Test Python 3.8:

```shell
tox -e py38
```
You need to have the Python versions you are testing against installed on your local system. `tox` searches the system for installed Python versions, but does not install new Python versions. If `tox` doesn't find the Python version it's looking for, it will give an `InterpreterNotFound` error.

Pyenv is a useful tool for installing multiple Python versions, if you need to add missing Python versions to your system for local testing.
If you install `tox-docker`, tox will take care of setting up all the service containers necessary to run the unit tests and pass the correct environment variables to configure the tests to use these services. Using `tox-docker` is not required, but it is the recommended way to run the tests locally, since it runs the tests the same way they are run on GitHub Actions.

```shell
pip install tox-docker
```

The Docker functionality is included in a `docker` factor that can be added to the environment. To run an environment with a particular factor, you add it to the end of the environment name.

Test with Python 3.8 using Docker containers for the services:

```shell
tox -e py38-docker
```
If you wish to pass additional arguments to `pytest`, you can do so through `tox`. The default argument passed to `pytest` is `tests`, however you can override this. Every argument passed after a `--` on the `tox` command line will be passed to `pytest`, overriding the default.

Only run the `test_app.py` tests with Python 3.6 using Docker:

```shell
tox -e py36-docker -- tests/test_app.py
```