It would be useful to have a .solve() method that returns an arbitrary model for the sentence. At its simplest, this could be:
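A minimal sketch of what such a naive implementation could look like (illustrative only; it simply takes the first model that .models() yields):

```python
def solve(self):
    """Return an arbitrary model of the sentence, or None if it's unsatisfiable."""
    # Naive approach: take the first model that .models() yields.
    return next(iter(self.models()), None)
```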
But this would be inefficient, because some of the .models() implementations don't start yielding models until all of them have been computed (they aren't lazy).
- If kissat is enabled, then nnf.kissat.solve() already implements this for CNF.
- If it isn't, then the naive SAT solver is already lazy, and the above solution will be efficient.
- ._models_deterministic() isn't lazy, but can easily be (and should be) adjusted so that it is (see the first sketch after this list).
- ._models_decomposable() isn't lazy and can't trivially be made lazy, so it should be complemented by a new method.
- The fallback implementation is lazy, but otherwise very inefficient. Converting to CNF first and solving that may be better, but you have to take care to remove the newly-introduced aux variables from the model (see the second sketch after this list).
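On the laziness point: a hypothetical before/after pattern (not the actual ._models_deterministic() code) showing the kind of adjustment meant, so that a caller like .solve() can stop after the first model:

```python
from typing import Dict, Iterable, Iterator, List

Model = Dict[str, bool]

def models_eager(assignments: Iterable[Model]) -> List[Model]:
    # Eager: the full list is built before anything is returned,
    # so .solve() would have to wait for every model to be computed.
    return [m for m in assignments]

def models_lazy(assignments: Iterable[Model]) -> Iterator[Model]:
    # Lazy: each model is yielded as soon as it is found,
    # so .solve() can return right after the first one.
    for m in assignments:
        yield m
```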
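And for the fallback case, a rough sketch of the CNF route, assuming a Tseitin-style conversion (here nnf.tseitin.to_CNF) and using .vars() to identify which variables belong to the original sentence; the helper name and the exact filtering are assumptions, not a decided API:

```python
from nnf import NNF, tseitin

def solve_via_cnf(sentence: NNF):
    """Sketch: solve by converting to CNF, then strip the auxiliary variables."""
    cnf = tseitin.to_CNF(sentence)           # conversion introduces aux variables
    model = next(iter(cnf.models()), None)   # or nnf.kissat.solve(cnf) when available
    if model is None:
        return None                          # unsatisfiable
    original_vars = sentence.vars()
    # Keep only the assignments to variables of the original sentence.
    return {name: value for name, value in model.items() if name in original_vars}
```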