Replies: 11 comments
-
My two cents are that, if we use an external dependency, it should be tailored toward optimization. I'm not thrilled about the idea of using Boost. That's the only stake I have in this conversation though.
-
I was going to pull out the
-
I wasn't even aware that Boost provides something like that...
-
Right now I am more in favor of self-implementation. Does anyone know of any research on transport maps and optimization?
-
I’m also in support of self-implementation. I have some trust region and line search methods we can adapt from MUQ. Just for completeness, here are a couple of other good C++ optimization packages. NLopt is probably where I’d start if we decide to use an external optimization package, because it’s relatively easy to use, doesn't come with a lot of baggage, and has many different algorithms.
-
I added them to the list above.
-
Some more thoughts: GSL multimin is GPL licensed, which is undesirable for gaining traction with others. For TAO, you'd have to install the entirety of PETSc from my understanding (I could be wrong, but I can't find a standalone TAO library), which goes against my previously shared opinion on having something that's exclusively optimization. Similarly, ROL is a package within the entire Trilinos library. I've used NLopt and Ipopt several times in the past in Julia and I'm happy with them; the fact that NLopt doesn't require LAPACK or anything else makes it nice as a dependency, so I'd probably agree with Matt about starting with NLopt if we sourced our optimization externally.
-
@dannys4 are you in favor of an external package or self-implementation?
-
I could go either way. I don't know the ins and outs of where we could gain from using our own transport map-oriented code, but theoretically the gain would have to be pretty large to justify the person cost of building out an optimization suite on that reason alone. Also, it would be great to have something that's rigorously tested out of the box for this. On the other hand (and @mparno can correct me if I'm wrong), using an external suite would prevent an efficient GPU-based implementation, unless I'm mistaken. After writing all this out, I think I might just barely lean toward self-implementation, if only because of the GPU factor. But maybe using NLopt as a guide and a placeholder would be a good idea? I.e., we can spend a few hours now to link with NLopt and use that until we have a robust and tested suite of optimization tools, then we can let users pick and choose (admitting that our optimizers might be less desirable on the CPU than a battle-hardened and widely known suite of tools).
-
Consensus: create a framework so we can add a self-implementation of optimization later, but for now use an NLopt backend interfacing with this new framework.
-
Problem with self-implementation: Levenberg-Marquardt (and most optimization methods) uses linear algebra, so our options are to include cuSOLVER directly (and deal with some pretty gnarly code) or pull in a new dependency (e.g. MAGMA) that makes it easier. Leaving this problem open for the future.
-
Possible third-party packages (order without ranking):
- C++
- Python

Advantages of self-implementation (these are more or less the disadvantages of third-party packages, and vice versa):
- e.g. the existing `LevenbergMarquadtSolver` in `MonotoneLeastSquares.cpp`

Disadvantages of self-implementation: