
Adding Algaroba to SMTCOMP 2024 #42

Merged
8 commits merged into SMT-COMP:master on Jun 18, 2024
Conversation

amarshah1
Contributor

Adding Algaroba for SMT-COMP 2024, for the Single Query track and the QF_DT and QF_UFDT logics. Let me know if there are any issues with this submission.

github-actions bot commented May 21, 2024

Summary of modified submissions

Algaroba

@bobot bobot added the submission Submissions for SMT-COMP label May 21, 2024
},
"website": "https://github.com/uclid-org/algaroba",
"system_description": "https://github.com/uclid-org/algaroba/files/15255360/Algaroba_at_the_2024_SMTCOMP.pdf",
"command": ["algaroba"],
Contributor
@amarshah1 Thanks for submitting to SMT-COMP! I have tested the submission on our infrastructure, and it seems that the specified binary algaroba is not present in the archive. The archive contains the folder algaroba-1.0.0/bin/ with the contents algaroba.ml and dune. Am I doing something wrong, or is the binary indeed not in the archive?

If you fix the command, please also use the relative path to the executable, given that the archive is unpacked into the current working directory (in your case by calling tar -xzf v1.0.0.tar.gz). For the current archive, the command should probably be something like algaroba-1.0.0/bin/algaroba.
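As a sanity check, the unpack-and-invoke flow the runner performs can be reproduced locally with a mock archive (the echo payload below is a stand-in, not the real solver binary):

```shell
# Build a mock archive mirroring the reported layout
# (algaroba-1.0.0/bin/algaroba); the "binary" here is a stand-in script.
mkdir -p algaroba-1.0.0/bin
printf '#!/bin/sh\necho sat\n' > algaroba-1.0.0/bin/algaroba
chmod +x algaroba-1.0.0/bin/algaroba
tar -czf v1.0.0.tar.gz algaroba-1.0.0
rm -rf algaroba-1.0.0

# What the runner effectively does: unpack into the current working
# directory, then call the submitted command by its relative path.
tar -xzf v1.0.0.tar.gz
algaroba-1.0.0/bin/algaroba   # prints "sat"
```

If the relative path in the "command" field does not resolve after extraction, the invocation fails exactly as seen on the infrastructure.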

Contributor Author

Hi Martin,

Thanks for your reply! I am a little confused about how I should do this. What environment will SMT-COMP be run on, so that I can create an executable that runs in that environment?

Contributor

You can find more details about the execution environment in the Final Call for Solvers and on the SMT-COMP website.

It is also possible to locally emulate and test the computing environment on the competition machines using the following instructions:
https://gitlab.com/sosy-lab/benchmarking/competition-scripts/#computing-environment-on-competition-machines

Basically, the machines run Ubuntu 22.04 with a few extra packages (for details, see the link above). You can test your solver using the referenced Dockerfile, which should be almost identical to the machines used. If you rebase your pull request onto the latest version of master, your solver will also be executed in this environment as part of the CI checks.
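For a rough local approximation (a sketch only: plain ubuntu:22.04 is a simplification, since the real image built from the linked competition-scripts Dockerfile carries additional packages), one could mount the archive into a container and check that the binary's shared-library dependencies resolve there:

```shell
# Hedged sketch: unpack and inspect the solver inside plain Ubuntu 22.04
# to approximate the competition machines. Use the linked Dockerfile for
# a faithful reproduction; this only checks dynamic linking with ldd.
if command -v docker >/dev/null 2>&1; then
  docker run --rm -v "$PWD:/work" -w /work ubuntu:22.04 \
    sh -c 'tar -xzf v1.0.0.tar.gz && ldd algaroba-1.0.0/bin/algaroba'
fi
```

A "not found" line in the ldd output points at a library that exists on the build machine but not in the competition environment.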

Contributor Author

Thank you! I couldn't get Docker working, but I compiled the executable on Ubuntu 22.04 and have updated the link.

Contributor Author

I just committed again with an executable compiled in that computing environment, but it still seems to fail the tests. Do you have any insight into what is happening? Do you know what the trivial_bench files look like? I can try running them locally.

Contributor Author

Thank you, this is really helpful. I think the z3 opam library cannot be statically linked (see Z3Prover/z3#2213), so the only solution seems to be to ship the z3 dylib file alongside my executable and add its directory to LD_LIBRARY_PATH. Would this be alright?

There might be a better way to do this, but I cannot think of one.

Contributor

Yes, you can do that; you will need to wrap the LD_LIBRARY_PATH export in a script.

Contributor Author

Is there a way to run a script once at the beginning, rather than having a script run every time the solver is invoked?

Contributor

@bobot commented Jun 14, 2024

No, but a little bash (even better, dash, the default sh) script is very fast to execute. Really it is just:

#!/bin/sh
export LD_LIBRARY_PATH=...
exec "$(dirname "$0")"/....

I haven't tested it.
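The wrapper pattern above can be demonstrated end to end with a mock layout (all names here are hypothetical: algaroba.bin and the lib/ directory stand in for wherever the archive actually places the binary and the z3 shared library):

```shell
# Hypothetical layout for illustration; not the real archive contents.
mkdir -p pkg/lib
# A stand-in "real binary" that just reports what it sees.
printf '#!/bin/sh\necho "LD_LIBRARY_PATH=$LD_LIBRARY_PATH"\n' > pkg/algaroba.bin
chmod +x pkg/algaroba.bin

# The wrapper that the submission's "command" field would point at.
cat > pkg/algaroba <<'EOF'
#!/bin/sh
HERE=$(dirname "$0")
# Prepend the bundled shared-library directory, preserving any existing value.
export LD_LIBRARY_PATH="$HERE/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
# Replace the shell with the real binary, forwarding all arguments.
exec "$HERE/algaroba.bin" "$@"
EOF
chmod +x pkg/algaroba

pkg/algaroba   # prints LD_LIBRARY_PATH=pkg/lib (plus any pre-existing entries)
```

Because the wrapper uses exec, no extra shell process lingers while the solver runs, so the per-invocation overhead is a single script startup.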

Contributor Author

Ok, I got it to pass all of the checks. Thank you for all your help!

martinjonas pushed a commit that referenced this pull request Jun 10, 2024
#84: Create cvc5-cloud
#74: Draft STP submission
#70: draft yicesQS submission
#68: Create STP-CNFLS
#66: Yices2 SMTCOMP 2024 Submission
#65: Z3-alpha draft PR
#64: Solver submission: cvc5
#63: submission iProver
#61: OSTRICH 1.4
#60: SMT-RAT submission
#57: Amaya's submission for SMT-COMP 2024
#55: plat-smt submission
#54: Add 2024 Bitwuzla submission.
#53: 2024 solver participant submission: OpenSMT
#52: Z3-Noodler submission
#51: Submission Colibri
#45: Submission for smtinterpol
#42: Adding Algaroba to SMTCOMP 2024
@martinjonas martinjonas merged commit 0fabdbb into SMT-COMP:master Jun 18, 2024
5 checks passed
Labels: submission Submissions for SMT-COMP

3 participants