The following examples require initialization described in Demonstrate using Command Line Interface.
This example shows how to load a file of JSON Lines onto a RabbitMQ queue using the `json-to-rabbitmq` subcommand.
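As context, the input file is expected to contain one JSON record per line (JSON Lines). Below is a hypothetical two-record file; the field names are illustrative assumptions, not mandated by this document:

```console
$ head -2 /path/to/my/records.json
{"DATA_SOURCE": "TEST", "RECORD_ID": "1", "PRIMARY_NAME_FULL": "Jane Example"}
{"DATA_SOURCE": "TEST", "RECORD_ID": "2", "PRIMARY_NAME_FULL": "Juan Ejemplo"}
```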
- `--help` will show all options for the subcommand. Example:

  ```console
  ~/stream-producer.py json-to-rabbitmq --help
  ```
- ✏️ Identify the file of JSON records on the local system to push to the RabbitMQ queue. Example:

  ```console
  export SENZING_INPUT_URL=/path/to/my/records.json
  ```

  ✏️ or identify a URL. Example:

  ```console
  export SENZING_INPUT_URL=https://s3.amazonaws.com/public-read-access/TestDataSets/loadtest-dataset-1M.json
  ```
- ✏️ Identify RabbitMQ connection information. Example:

  ```console
  export SENZING_RABBITMQ_HOST=localhost
  export SENZING_RABBITMQ_QUEUE=senzing-rabbitmq-queue
  export SENZING_RABBITMQ_USERNAME=user
  export SENZING_RABBITMQ_PASSWORD=bitnami
  ```
- 🤔 Optional: To limit how many records are sent, set the maximum number of records; to load all records in the file, set the value to "0". For more information, see SENZING_RECORD_MAX. Example:

  ```console
  export SENZING_RECORD_MAX=5000
  ```
- Run `stream-producer.py`. Example:

  ```console
  ~/stream-producer.py json-to-rabbitmq \
    --input-url ${SENZING_INPUT_URL} \
    --rabbitmq-host ${SENZING_RABBITMQ_HOST} \
    --rabbitmq-password ${SENZING_RABBITMQ_PASSWORD} \
    --rabbitmq-queue ${SENZING_RABBITMQ_QUEUE} \
    --rabbitmq-username ${SENZING_RABBITMQ_USERNAME} \
    --record-max ${SENZING_RECORD_MAX}
  ```
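One way to confirm the messages arrived is to query the RabbitMQ management HTTP API. This is a hedged sketch, not part of the documented procedure: it assumes the management plugin is enabled on its default port 15672 and that the queue lives on the default `/` vhost (URL-encoded as `%2F`).

```console
# Returns JSON describing the queue; the "messages" field is the current queue depth.
curl -s -u ${SENZING_RABBITMQ_USERNAME}:${SENZING_RABBITMQ_PASSWORD} \
  http://${SENZING_RABBITMQ_HOST}:15672/api/queues/%2F/${SENZING_RABBITMQ_QUEUE}
```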
This example shows how to load a file of JSON Lines onto an AWS SQS queue using the `json-to-sqs` subcommand.
- `--help` will show all options for the subcommand. Example:

  ```console
  ~/stream-producer.py json-to-sqs --help
  ```
- ✏️ For AWS access, set environment variables. For more information, see How to set AWS environment variables. (A credential check is sketched after this list.) Example:

  ```console
  export AWS_ACCESS_KEY_ID=$(aws configure get default.aws_access_key_id)
  export AWS_SECRET_ACCESS_KEY=$(aws configure get default.aws_secret_access_key)
  export AWS_DEFAULT_REGION=$(aws configure get default.region)
  ```
- ✏️ Identify the file of JSON records on the local system to push to the AWS SQS queue. Example:

  ```console
  export SENZING_INPUT_URL=/path/to/my/records.json
  ```

  ✏️ or identify a URL. Example:

  ```console
  export SENZING_INPUT_URL=https://s3.amazonaws.com/public-read-access/TestDataSets/loadtest-dataset-1M.json
  ```
- ✏️ Identify the AWS SQS queue. Example:

  ```console
  export SENZING_SQS_QUEUE_URL=https://sqs.us-east-1.amazonaws.com/000000000000/queue-name
  ```
- 🤔 Optional: To limit how many records are sent, set the maximum number of records; to load all records in the file, set the value to "0". For more information, see SENZING_RECORD_MAX. Example:

  ```console
  export SENZING_RECORD_MAX=100
  ```
- Run `stream-producer.py`. Example:

  ```console
  ~/stream-producer.py json-to-sqs \
    --input-url ${SENZING_INPUT_URL} \
    --record-max ${SENZING_RECORD_MAX} \
    --sqs-queue-url ${SENZING_SQS_QUEUE_URL}
  ```
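As a hedged verification sketch (not part of the documented procedure), the standard AWS CLI can confirm the credentials set above are active and report the approximate queue depth after the run:

```console
# Confirm the exported credentials resolve to a real AWS identity.
aws sts get-caller-identity

# Report the approximate number of messages now on the queue.
aws sqs get-queue-attributes \
  --queue-url ${SENZING_SQS_QUEUE_URL} \
  --attribute-names ApproximateNumberOfMessages
```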
- More example CLI invocations can be seen in Tests.
The following examples require initialization described in Demonstrate using Docker.
- Example docker and docker-compose invocations can be seen in Tests.
- There is a tutorial showing AWS SQS usage.
- `docker run` command for populating Amazon SQS. Example:

  ```console
  docker run \
    --env AWS_ACCESS_KEY_ID=AAAAAAAAAAAAAAAAAAAA \
    --env AWS_DEFAULT_REGION=us-east-1 \
    --env AWS_SECRET_ACCESS_KEY=aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa \
    --env SENZING_INPUT_URL="https://s3.amazonaws.com/public-read-access/TestDataSets/loadtest-dataset-1M.json" \
    --env SENZING_RECORD_MAX=100 \
    --env SENZING_SQS_QUEUE_URL="https://sqs.us-east-1.amazonaws.com/000000000000/queue-name" \
    --env SENZING_SUBCOMMAND=json-to-sqs \
    --interactive \
    --rm \
    --tty \
    senzing/stream-producer
  ```
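For docker-compose users, a minimal sketch equivalent to the `docker run` command above. This is an illustration, not a file from this repository; the service name is arbitrary, and real credentials and queue URL must be substituted.

```yaml
# docker-compose.yaml (hypothetical; mirrors the docker run example above)
services:
  stream-producer:
    image: senzing/stream-producer
    environment:
      AWS_ACCESS_KEY_ID: AAAAAAAAAAAAAAAAAAAA
      AWS_DEFAULT_REGION: us-east-1
      AWS_SECRET_ACCESS_KEY: aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
      SENZING_INPUT_URL: "https://s3.amazonaws.com/public-read-access/TestDataSets/loadtest-dataset-1M.json"
      SENZING_RECORD_MAX: "100"
      SENZING_SQS_QUEUE_URL: "https://sqs.us-east-1.amazonaws.com/000000000000/queue-name"
      SENZING_SUBCOMMAND: json-to-sqs
```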