Callback not available when gradient is passed to VQE #10700
Comments
I think what you are seeing here is more a behavior of ADAM than a problem with the VQE callback. If you switch to, say, SLSQP you will see something similar. One note first: you don't set `initial_point`, so each run starts from a different random point.

The built-in finite difference used by SLSQP has a much smaller `eps` by default, and it uses the objective function, which is where the callback originates, to compute the gradient too. If you plot the values you can see a staircase effect: the values computed while evaluating the gradient differ only very slightly from one another. So the first callback will also end up showing calls made by the optimizer for whatever it does, including any gradient computation; the objective function has no idea what it is being used for and simply fires the callback on each use. With the second, you only see calls to the objective function itself, with whatever is done for the gradient happening off to the side, so to speak.

ADAM uses the gradient, and when it sees a minimal change in the parameters it stops and makes a single call to the objective function. You can see that at the end of the `minimize` method here: https://github.com/Qiskit/qiskit/blob/57137ff7f2cdd6d7671aa0e6889258f28a70e5a1/qiskit/algorithms/optimizers/adam_amsgrad.py#L268C37-L268C37. If you do not supply a gradient, ADAM uses a custom finite-difference method built on the objective function, which is why in the first case you see many callbacks: the objective function ends up being used for the gradient as well.

The behavior you see is expected: when you supply a gradient, the computation done for it does not show up in the callback from the objective function. With ADAM doing its work using only the gradient and the parameters, and explicitly calling the objective function just once to return the minimum value, the difference is especially extreme.
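The mechanism described above can be illustrated without any Qiskit code. In this minimal sketch (all names hypothetical), the "callback" lives inside the objective function, so a finite-difference gradient fires it at every probe point, while a user-supplied analytic gradient bypasses it entirely:

```python
# Toy illustration (not Qiskit code): the callback is tied to the objective
# function, so it fires whenever the objective is evaluated - including the
# evaluations done only to estimate a gradient by finite differences.
calls = []

def objective(x):           # stands in for VQE's energy evaluation
    calls.append(x)         # this is where the VQE-style callback fires
    return (x - 3.0) ** 2

def finite_diff_grad(f, x, eps=1e-6):
    # Central difference: two extra objective calls per gradient evaluation.
    return (f(x + eps) - f(x - eps)) / (2 * eps)

def analytic_grad(x):       # "off to the side": never touches the objective
    return 2.0 * (x - 3.0)

def gradient_descent(grad, x=0.0, lr=0.1, steps=50):
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Finite differences: the objective (and hence the callback) fires twice per step.
calls.clear()
gradient_descent(lambda x: finite_diff_grad(objective, x))
n_finite_diff = len(calls)

# Analytic gradient: the objective is never called during optimization.
calls.clear()
gradient_descent(analytic_grad)
n_analytic = len(calls)

print(n_finite_diff, n_analytic)  # 100 0
```

The counts make the asymmetry concrete: 50 steps cost 100 callback-visible objective calls with finite differences, and zero with a supplied gradient.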
With `qiskit.algorithms` being moved to `qiskit_algorithms`, and the tutorials being moved around as well, I can only point you to an example notebook (it has yet to be re-published as HTML, but will be soon): https://github.com/qiskit-community/qiskit-algorithms/blob/main/docs/tutorials/02_vqe_advanced_options.ipynb. There you can see the staircase effect I was talking about with SLSQP. COBYLA there is not gradient-based, so its plot looks different in that regard.
I see, that makes sense. Thank you for the detailed reply. Maybe this issue should become a feature request then; adding some sort of callback access to gradient calls would be useful in these kinds of scenarios. ADAM is a popular choice for VQE and other VQAs, and if the callback is not called during optimization with custom gradients, there is no obvious way to get mid-optimization convergence information.
There is also access to that and other internal information from the optimizer itself; they (mostly) have a callback of their own.

As far as any specific new feature request, that would need to be raised on the `qiskit_algorithms` repository now. As of the last release of Qiskit, `qiskit.algorithms` is deprecated and new features will not be considered here; there is more information on the move in the prelude text of the last release notes.

As for the optimizer callback, it does depend on the optimizer, so it can be a bit more work to use if you are trying out different optimizers. An issue was raised a while back to unify things, qiskit-community/qiskit-algorithms#60, now transferred to the new repo. If you want to comment on that issue, or raise something else for discussion there, please feel free; I pointed it out since I think it most closely relates to the sort of information you are looking for. I am going to close the issue here on the basis that anything further can be brought up on the `qiskit_algorithms` repo.
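An optimizer-level callback, as discussed above, can be sketched with a toy gradient-descent optimizer (a hypothetical API, not the `qiskit_algorithms` one): the optimizer loop itself reports per-iteration convergence data, so the information is available regardless of how, or whether, the objective function was involved in computing the gradient.

```python
def minimize_with_callback(fun, grad, x0, lr=0.1, steps=25, callback=None):
    """Toy optimizer whose callback is driven by the optimizer loop itself."""
    x = x0
    for it in range(steps):
        x = x - lr * grad(x)
        if callback is not None:
            # The optimizer decides when and what to report, independent of
            # whether the objective was ever evaluated for the gradient.
            callback(it, x, fun(x))
    return x

history = []
x_min = minimize_with_callback(
    fun=lambda x: (x - 3.0) ** 2,
    grad=lambda x: 2.0 * (x - 3.0),   # analytic gradient: no objective calls
    x0=0.0,
    callback=lambda it, x, f: history.append((it, x, f)),
)
print(len(history))  # 25 - one convergence record per iteration
```

Because the callback belongs to the optimizer rather than the objective, a VQE-style run with a custom gradient would still yield one record per iteration here.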
Environment
What is happening?
The `algorithms.minimum_eigensolvers.VQE` algorithm does not call `callback` during optimization when a gradient is explicitly provided. This seems unexpected, and even if this is the intended design, it might be a good feature to give `callback` access when a gradient is provided. This would allow, e.g., comparison of convergence for different gradient types.

How can we reproduce the issue?
The following code reproduces the issue; the output includes many calls to `callback_func1` but only a single call to `callback_func2` (I think this call comes from when the optimizer result is built?).

What should happen?
The output of the two VQE runs should be the same (or as similar as possible).
Any suggestions?
A possible solution would be to call `callback` from `VQE.evaluate_gradient` in `VQE._get_evaluate_gradient`, although the Estimator values and metadata will not be accessible.