pulibrary/solr_load_testing

Instructions for developing tests

  1. Install jmeter:
jmeter
  2. Start jmeter. This opens the GUI, which should only be used for building or editing tests, not for actual load testing:
jmeter
  3. From the GUI, select File > Open, then browse to the test plan you want to work on (e.g. solr_test_plan.jmx).
  4. You may need to press the "plus" buttons to the left of the headers to see the full plan.
  5. Go to the User Defined Variables under the top-level "Solr test plan" and make sure the variables match the environment you want to work against (see User Defined Variables by Environment below).
  6. To run the current test plan, click the green arrow button.
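A saved plan can also be run headless from a terminal with the same jmeter binary. A minimal dry-run sketch (the run_plan helper and the derived .jtl name are illustrative, not part of this repo):

```shell
# Illustrative helper: build the non-GUI jmeter command for a plan file.
# It echoes the command as a dry run; drop the echo to actually execute it.
run_plan() {
  plan="$1"
  echo jmeter -n -t "$plan" -l "${plan%.jmx}-results.jtl"
}

run_plan solr_test_plan.jmx
# → jmeter -n -t solr_test_plan.jmx -l solr_test_plan-results.jtl
```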

Running from loadtest1.lib.princeton.edu

  • Ensure the User Defined Variables in the Solr test plan are correct and saved for the environment (see User Defined Variables by Environment below)

  • This project is checked out at /home/deploy/solr_load_testing

  • You can make changes locally, commit and push them to a branch, then check that branch out on the server

  • SSH onto the box

ssh deploy@loadtest1.lib.princeton.edu
  • Run the jmeter test from loadtest1.lib.princeton.edu
# To test reads:
jmeter -n -t /home/deploy/solr_load_testing/solr_test_plan.jmx -e -l /home/deploy/solr_load_testing/test_report-$(date +"%Y-%m-%d:%H:%M:%S").jtl -o /home/deploy/solr_tests/test_results-$(date +"%Y-%m-%d:%H:%M:%S")/
# To test writes:
JVM_ARGS="-Xms2048m -Xmx2048m" jmeter -n -t /home/deploy/solr_load_testing/solr_write_test_plan.jmx -e -l /home/deploy/solr_load_testing/write_test_report-$(date +"%Y-%m-%d:%H:%M:%S").jtl -o /home/deploy/solr_tests/write_test_results-$(date +"%Y-%m-%d:%H:%M:%S")/
# To test facet.contains queries:
jmeter -n -t /home/deploy/solr_load_testing/solr_facet_contains_test_plan.jmx -e -l /home/deploy/solr_load_testing/facet_contains_test_report-$(date +"%Y-%m-%d:%H:%M:%S").jtl -o /home/deploy/solr_tests/facet_contains_test_results-$(date +"%Y-%m-%d:%H:%M:%S")/
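Each command above expands $(date ...) twice, so the .jtl log and the report directory can end up with timestamps a second apart. A sketch that captures the timestamp once (paths shortened for readability; substitute the full /home/deploy paths from above):

```shell
# Capture one timestamp so the results log and report directory names agree.
STAMP=$(date +"%Y-%m-%d:%H:%M:%S")
# Dry run: echo prints the command; remove the echo to execute it.
echo jmeter -n -t solr_test_plan.jmx -e \
  -l "test_report-$STAMP.jtl" \
  -o "test_results-$STAMP/"
```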

User Defined Variables by Environment

Development

host orangelight.dev.solr.lndo.site
port 80
solr_core orangelight-core-dev
kw_expected_result_count 1
sitemap_expected_result_count 255
keyword_file_full_path [full path to this repo + /keywords.csv]

From loadtest1.lib.princeton.edu - against Staging

host lib-solr8d-staging.princeton.edu
port 8983
solr_core catalog-performance (for read test) or catalog-write-performance (for write test)
kw_expected_result_count 12941
sitemap_expected_result_count 18170086
keyword_file_full_path /home/deploy/keywords.csv
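Before a long run it can be worth confirming that host, port, and solr_core point at a live core. A hedged sketch using Solr's standard admin ping endpoint (values taken from the staging table above; the curl line is commented out so nothing is contacted until you opt in):

```shell
# Build the ping URL from the same variables the test plan uses.
host=lib-solr8d-staging.princeton.edu
port=8983
solr_core=catalog-performance
url="http://$host:$port/solr/$solr_core/admin/ping"
echo "$url"
# curl -fsS "$url"   # uncomment to actually ping; a healthy core returns 200
```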

Generate HTML report

  1. Run the test using CLI mode.
  2. Once the test has completed, open the JMeter GUI.
  3. Find the user.properties file for your install of JMeter.
  • I used Homebrew to install locally; to find the actual location:
brew --prefix jmeter
  • That prints the install prefix (e.g. /opt/homebrew/opt/jmeter, a symlink to ../Cellar/jmeter/5.6.3).
  • For Homebrew installs, the user.properties file is in the symlinked directory, e.g. /opt/homebrew/Cellar/jmeter/5.6.3/libexec/bin/user.properties.
  4. In JMeter, go to Tools > Generate HTML report.
  5. For "Results file", select the newly created test_results/simple_data_writer.csv file.
  6. For "user.properties file", enter the path you found above.
  7. For "Output directory", create and select an empty directory.
  8. Click "Generate report".
  9. Open the generated index.html file in your browser.
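The dashboard can also be rebuilt without the GUI: jmeter's -g flag generates the HTML report from an existing results file. A dry-run sketch (results.jtl and report-dir stand in for your actual paths; the output directory must be empty or not yet exist):

```shell
# Rebuild the HTML dashboard from an existing .jtl/.csv results log.
jtl=results.jtl
out=report-dir
echo jmeter -g "$jtl" -o "$out"   # drop the echo to run it
```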

Record the trial in the spreadsheet

Record what experiment you did, and a link to the report that was generated, in this spreadsheet.

Tips on planning a useful test

  • When we deploy a new solr configuration with pul_solr, it clears the solr caches (Filter Cache, Query Cache, etc). If you want to compare two different configurations, make sure to deploy them both fresh before running your test (unless your test is specifically investigating what happens with a full cache).

Tips on analyzing the data

  • If you are comparing between two tests, the dashboard's median response time and percentiles can be useful metrics to compare.
  • The charts tend not to be very useful unless you are trying to see the impact of a particular event that happened during your test (e.g. a cache gradually filled, or the solr box hit some resource limit)

About

A repository for jmeter tests for testing Solr
