I have been telling people that we need to work on guided synthesis rather than verification. I believe that guided synthesis is the only practical way to create correct programs. Verification is just too hard, because you have to write the actual program, write the specification, and then develop the verification proof. Maintenance is difficult, because if any of these parts changes you have to update the other two. I think guided synthesis would work better, because you can write down the specification, then write down the steps and design decisions that convert it into efficient code. I can imagine doing maintenance on such a structure. It would feel like programming to me, given the right tool support.
In the short term, I think we need to focus on specialized executable specification languages.
To me, it seems that many hard battles in executable specification have to do with context rather than sheer implementation. Direct interpretation in a dynamic environment holds context fairly easily, but many synthesis applications lead to a more static transformation.
Debuggability is a primary example:
* base case: source transformations move semantics to new positions that must be tracked back to the original source, especially once the system gains optimization and efficiency transforms that obfuscate the target code as a side effect.
This case is familiar to users of the most common transformational system: a compiler. Synthesis magnifies this standard issue by layering transformations through representations that were not designed with transformation-targeting in mind.
* challenge case: tracking nonlocal semantics. As synthesis systems become richer, debugging "eerie action at a distance" becomes ever more of a problem.
This case is familiar to the users of complex, but still common, systems like BNF-based parser generators and Prolog.
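The base case above can be sketched concretely. The snippet below is my own minimal illustration, not taken from any real system; the `Node` type, the `fold_constants` transform, and the `src_line` field are all hypothetical names. It shows why every node a transform manufactures must inherit a source position, so that later errors can still be reported against the original specification rather than the obfuscated target code.

```python
from dataclasses import dataclass

# A hypothetical tiny IR: each node remembers the source line it came from.
@dataclass
class Node:
    op: str        # "const" or "add"
    args: tuple    # child nodes, or a literal value for "const"
    src_line: int  # line number in the ORIGINAL specification

def fold_constants(node: Node) -> Node:
    """An 'optimization' transform: it rewrites the tree, and every new
    node it creates must inherit a source line, or debuggability is lost."""
    if node.op == "add":
        left, right = (fold_constants(a) for a in node.args)
        if left.op == "const" and right.op == "const":
            # This node did not exist in the source; point it back at
            # the position of the expression it replaced.
            return Node("const", (left.args[0] + right.args[0],), node.src_line)
        return Node("add", (left, right), node.src_line)
    return node

# (1 + 2), written on line 7 of the original specification
tree = Node("add", (Node("const", (1,), 7), Node("const", (2,), 7)), 7)
folded = fold_constants(tree)
# A later diagnostic at `folded` can still name line 7 of the source,
# even though the node was manufactured by the transform.
print(folded.op, folded.args[0], folded.src_line)  # const 3 7
```

A real synthesis pipeline stacks many such transforms, so the location a node carries may need to be a chain of positions, one per layer, which is exactly the "much richer tracking semantics" at issue.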
There are other contextual problems, such as interacting with build systems. Generated code makes for headaches in static dependency declarations. (One easy answer is, of course, "don't do that", but requiring a complete reimplementation of one's entire computing context is not usually compatible with wide adoption.)
I do recognize the power of the narrower case that you describe, where a single developer builds both the specification and the transformation. However, once one uses rich, layered transformation systems built by others, debugging and maintenance can become fantastically opaque. That is, once the user does not entirely grasp the transformational system, effective debugging requires much richer tracking semantics.
I'm sorry, but I was mixing up two topics. I am working on dynamic interpretation of *specialized* executable specification languages. In this world, debugging and nonlocal semantics are fairly easy. On the other hand, my student Srinivas is working on algorithm synthesis. This is a more static process, and it requires a lot more work before it is practical. Interestingly, any complete system will probably need to include both specialized languages and algorithm synthesis, so we need them both.