I know there's a billion things you already want to do with model-checker but if this sounds like a good idea I can (time permitting) try to implement it:
Say you give model-checker some premises and conclusions, and it gives you a model for them. Say you want to edit the model slightly to make it more natural for interpretation. You do your hand calculations to check if it's a model (or, if you're lazy, you don't) and want to have model-checker check if that model is indeed a model for the entailment.
Basically: implement functionality where the user specifies a model and Z3 tells the user whether it is in fact a model. I don't think it'd be that hard. We'd just need to tell Z3 "check if a model for this entailment exists with these assignments." It doesn't seem like it'd be too hard to go from user input (even in something as simple as a string) |A| = < {a, a.b}, {c} > to a Z3 constraint And(Iff(verifies(A,bv), Or(bv==1, bv==3)), Iff(falsifies(A,bv), Or(c))) (where Iff(A,B) abbreviates And(Implies(A,B), Implies(B,A))). If I'm not mistaken, all we'd need to do is feed Z3 the constraints it normally uses to check for the existence of a model, plus these assignment constraints, which seem very mechanical to generate.
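To illustrate how mechanical the translation is, here's a rough sketch of going from an assignment string to constraint text. Everything here is assumed rather than taken from model-checker's actual internals: the atomic-state bitvector encoding (a → 1, b → 2, so the fusion a.b → 3), the verifies/falsifies function names, and the parsing format.

```python
# Hypothetical sketch: turn "|A| = < {a, a.b}, {c} >" into Z3-style
# constraint text. The state encoding below is an assumption; the
# real encoding in model-checker may differ.
import re

STATE_BITS = {"a": 1, "b": 2, "c": 4}  # assumed atomic-state bits

def state_value(state):
    """Fusion of atomic states, e.g. 'a.b' -> 1 | 2 == 3."""
    value = 0
    for atom in state.split("."):
        value |= STATE_BITS[atom]
    return value

def parse_assignment(text):
    """Parse '|A| = < {...}, {...} >' into (letter, verifiers, falsifiers)."""
    m = re.match(r"\|(\w+)\|\s*=\s*<\s*\{(.*?)\}\s*,\s*\{(.*?)\}\s*>", text)
    split = lambda s: [x.strip() for x in s.split(",") if x.strip()]
    return m.group(1), split(m.group(2)), split(m.group(3))

def to_constraint(text):
    """Emit the biconditional constraint described above, as a string."""
    letter, vers, fals = parse_assignment(text)
    disj = lambda states: "Or(" + ", ".join(
        f"bv == {state_value(s)}" for s in states) + ")"
    return (f"And(Iff(verifies({letter}, bv), {disj(vers)}), "
            f"Iff(falsifies({letter}, bv), {disj(fals)}))")

print(to_constraint("|A| = < {a, a.b}, {c} >"))
```

In a real implementation you'd of course build the z3py expressions directly (with Implies/And/Or over the solver's own verifies/falsifies functions) rather than strings, and then just add them to the same solver instance alongside the usual frame constraints.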
I think this could also be a pedagogical tool: the same way that, when you're learning to integrate, you might do an integral by hand and then check it with software that can solve integrals, you might want to solve a model by hand (from scratch, even) and then see whether your way of finding countermodels is actually a (or the) right way of finding countermodels. (The thought came up while I was trying to give an interpretation to a countermodel and found myself wanting to minimally change it.) Also, for those who are wary of technology or think this kind of methodology will make people lazy, it would let model-checker facilitate the pedagogical goal of finding models by hand, by allowing instructors to check their students' manually-found models much faster than they otherwise could.
This is different from telling it simply to find a different model from the one it's already found (which is the iterate functionality).
This would also be very easy to do in a Jupyter notebook, further increasing the usefulness of those. (I'm sure it could be done in the CLI as well.)
Thoughts? Is this already possible? Would this be useful? (Or would it encourage laziness too much?) If you think it's a good idea, I think it'd be best to implement it as a function in model-checker, so it can be called from the API, and then adapt that function to the CLI (it has seemed that the API translates naturally to the CLI, but not necessarily vice versa).