Testbed for testing Readium LCP integration in Circulation Manager.
The project consists of the following modules:
- `elasticsearch` is a Dockerized version of Elasticsearch with the pre-installed analysis-icu plugin required by Circulation Manager
- `lcp-collection` is an ONIX collection of 3 books used for testing
- `lcp-docker` is a Docker-based implementation of Readium LCP
- `lcp-docker-conf` is a Dockerized application generating LCP configuration for the LCP License and Status servers using confd
- `lcp-import` is a Docker image based on `nypl/circ-exec` that includes a bash script for importing the LCP collection into Circulation Manager
- `proxy` is a Dockerized nginx reverse proxy based on the docker-nginx-with-confd GitHub project
- Update all the submodules:
```
git submodule init
git submodule update --remote --recursive
```
- Update the following host names in the `.env` file (see the example below the list):
  - `READIUM_LSDSERVER_HOSTNAME`
  - `READIUM_FRONTEND_HOSTNAME`
  - `MINIO_HOSTNAME`
  - `CM_HOSTNAME`
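A minimal sketch of the corresponding `.env` entries, assuming each variable maps to the matching host name from the `/etc/hosts` snippet in the next step:
```
READIUM_LSDSERVER_HOSTNAME=lsdserver.lcp.hilbertteam.net
READIUM_FRONTEND_HOSTNAME=testfrontend.lcp.hilbertteam.net
MINIO_HOSTNAME=minio.lcp.hilbertteam.net
CM_HOSTNAME=cm.lcp.hilbertteam.net
```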
- Point all of these host names at `127.0.0.1` in the `/etc/hosts` file:
```
127.0.0.1 lsdserver.lcp.hilbertteam.net
127.0.0.1 testfrontend.lcp.hilbertteam.net
127.0.0.1 minio.lcp.hilbertteam.net
127.0.0.1 cm.lcp.hilbertteam.net
```
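To confirm the entries are picked up, one quick check (using the standard `getent` tool available on most Linux systems) is:
```
getent hosts cm.lcp.hilbertteam.net
```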
- Run `lcp-conf` first to generate the configuration required by `lcpserver`, `lsdserver`, and `testfrontend`:
```
docker-compose run lcp-conf
```
- Build the images:
```
docker-compose build
```
⚠️ Please note that this testbed uses a development version of the `nypl/circ-webapp` Docker image and a custom version of the `circ-exec` Docker image.
- Run all the containers:
```
docker-compose up -d
```
- Use `docker-compose ps` to confirm that all the containers started successfully. It may take some time for `mariadb` to start, which can negatively affect `lcpserver`, `lsdserver`, and `testfrontend`. In this case, wait until `mariadb` finishes the initialization process (you can check the logs using `docker-compose logs mariadb`) and then start all the remaining containers:
```
docker-compose up -d
```
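A small sketch for automating that wait, assuming `mariadb` logs the usual "ready for connections" message once its initialization is complete:
```
# Poll the mariadb logs until it reports readiness, then start the remaining containers
until docker-compose logs mariadb 2>/dev/null | grep -q "ready for connections"; do
  echo "Waiting for mariadb..."
  sleep 5
done
docker-compose up -d
```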
- Make sure that Elasticsearch started correctly. Sometimes, when disk capacity is low, Elasticsearch marks shards as read-only, which prevents it from being used properly. Check the logs using `docker-compose logs es` and, if you see something suspicious, execute the following requests to fix it:
```
docker-compose exec es bash
curl -XPUT -H "Content-Type: application/json" http://localhost:9200/_cluster/settings -d '{ "transient": { "cluster.routing.allocation.disk.threshold_enabled": false } }'
curl -XPUT -H "Content-Type: application/json" http://localhost:9200/_all/_settings -d '{"index.blocks.read_only_allow_delete": null}'
```
Also, sometimes Elasticsearch keeps failing because of out-of-memory errors even though there is enough RAM. In this case, please try to increase `vm.max_map_count`:
```
sysctl -w vm.max_map_count=262144
```
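After applying either fix, a quick way to confirm Elasticsearch is healthy is to query the cluster health API from inside the `es` container; the reported status should be green or yellow:
```
docker-compose exec es curl -s 'http://localhost:9200/_cluster/health?pretty'
```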
- Log into MinIO's administrative interface located at `MINIO_HOSTNAME` using `AWS_S3_KEY` and `AWS_S3_SECRET` defined in the `.env` file as credentials.
- Create the following buckets as it's shown on the picture below (they can also be created from the command line, see the sketch after this list):
  - `covers` - public access bucket containing book covers
  - `encrypted-books` (`READIUM_S3_BUCKET`) - public access bucket containing encrypted books
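A command-line alternative to creating the buckets in the web UI, reusing the `mc` container and alias from the next step (bucket names as above):
```
# Start MinIO's command line client on the MinIO container's network
docker run -it --entrypoint=/bin/sh --network container:circulation-lcp-test_minio_1 minio/mc
# Inside the client: register the instance and create both buckets
mc alias set local http://minio:9000 minioadmin minioadmin # Please use the credentials set in .env file
mc mb local/covers
mc mb local/encrypted-books
```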
- Grant public access to the buckets created before:
```
# Start the MinIO's command line client
docker run -it --entrypoint=/bin/sh --network container:circulation-lcp-test_minio_1 minio/mc
# Authenticate against the running MinIO instance
mc alias set local http://minio:9000 minioadmin minioadmin # Please use the credentials set in .env file
# Grant public access to the covers bucket
mc policy set public local/covers
# Grant public access to the encrypted-books bucket
mc policy set public local/encrypted-books
```
- Open Circulation Manager's administrative interface located at `CM_HOSTNAME`.
- Set up a new administrative account.
- Create a new library using `LCP` as its short name, as it's shown on the picture below. `LCP` is used in the LCP configuration and shouldn't be changed.
- Set up a search service using `http://es:9200` as Elasticsearch's URL.
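To double-check that the `es` host name resolves and Elasticsearch responds on the Compose network, a simple probe (run from the `es` container itself) is:
```
docker-compose exec es curl -s http://es:9200
```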
- Set up a new MinIO storage as it's shown on the pictures below, using `AWS_S3_KEY` and `AWS_S3_SECRET` as credentials.
- Set up a new LCP collection:
  - The LCP License Server's input directory has to be `/opt/readium/files` because it's defined in the Dockerfile
  - lcpencrypt's output directory has to be the value of `CM_REPOSITORY`; it points to the intermediate repository
- Run the import script via the `nypl/circ-exec` Docker image:
```
docker-compose -f docker-compose.yml -f docker-compose.import.yml run import
```
- Go to the Circulation Manager dashboard, select *A Dictionary in Hindi and English*, and borrow it.
- Find the downloaded file. Please note that this file must have an `.lcpl` extension (this should be fixed in this PR). Until it's fixed, please change its extension manually.
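For example, a trivial rename (the downloaded file name here is hypothetical):
```
mv dictionary-in-hindi-and-english dictionary-in-hindi-and-english.lcpl
```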
If the list of books you see in Circulation Manager is outdated or simply wrong, you may have to truncate the cached feeds in CM's database. To do that, please execute the following steps:
```
docker exec -it circulation-lcp-test_postgres_1 bash
psql -U simplified simplified_circulation_dev # Please use the credentials from the .env file
truncate cachedfeeds;
```
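The same cleanup can be done non-interactively in a single command, assuming the container name and credentials shown above:
```
docker exec -it circulation-lcp-test_postgres_1 psql -U simplified simplified_circulation_dev -c 'truncate cachedfeeds;'
```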