Lately I have been asked this question in interviews: how would you develop a test framework?
In my opinion, the answer would be: first we need to make a model/architecture/flow diagram of how our test cases will be executed. I have worked on a few architectures before, like table-driven and data-driven frameworks.
Whichever we choose, we should have a clear idea of what our input will be and where we will pick it from, and the design should be such that the code doesn't change when the input changes.
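As a rough illustration, here is a minimal data-driven sketch in Python. The file name tests.csv, its columns, and the toy operations are all assumptions made up for this example; the point is that adding a new test case means adding a row to the data file, not touching the code.

```python
import csv

def run_case(operation, a, b, expected):
    """Execute a single test case described entirely by input data."""
    ops = {"add": lambda x, y: x + y, "mul": lambda x, y: x * y}
    result = ops[operation](int(a), int(b))
    return result == int(expected)

# Each row of tests.csv is one test case: operation,a,b,expected
with open("tests.csv") as f:
    for row in csv.DictReader(f):
        ok = run_case(row["operation"], row["a"], row["b"], row["expected"])
        print(f"{row['operation']}({row['a']}, {row['b']}) -> {'PASS' if ok else 'FAIL'}")
```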
Next would be to develop the core functionality of how to test. We may follow simple or more complex algorithms, but the central goal should be that the code is simple to read and understand. It should be modular, with no hardcoded values, and properly documented. The more modular the approach (packaging, shared functions, or include files), the better the code. A layered architecture also helps later in the project lifecycle, since we only need to change a few modules when the project requirements change.
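One way to picture that layering, sketched in Python with hypothetical class and module names: a thin test layer calls a business-level action layer, which in turn calls a driver layer. When a requirement changes (say, an endpoint moves), usually only one layer has to change.

```python
# Driver layer: the only place that knows how to talk to the system under test.
class HttpDriver:
    def get(self, path):
        # A real driver would issue an actual HTTP request here.
        return {"status": 200, "path": path}

# Action layer: business-level operations built on the driver, so endpoint
# strings and transport details never leak into the tests themselves.
class UserActions:
    LIST_USERS = "/users"  # kept in one place, not hardcoded in tests

    def __init__(self, driver):
        self.driver = driver

    def list_users(self):
        return self.driver.get(self.LIST_USERS)

# Test layer: short, readable, and unaware of how requests are made.
def test_list_users_returns_ok():
    actions = UserActions(HttpDriver())
    assert actions.list_users()["status"] == 200
```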
While developing the core functionality, we also need to take care of how our outputs are verified. Verifying one or more pieces of functionality usually means developing some analysis tools, which again should be easy to use. These analyzers may be in-house or third-party, but the APIs they expose should be comprehensive, flexible, understandable, and easy to use. Providing more than one way of passing parameters to these modules helps in expanding our test framework at a later stage.
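To make "more than one way of passing parameters" concrete, here is a hypothetical analyzer entry point in Python (the function and parameter names are invented for this sketch). The same call accepts defaults, keyword overrides, or a whole config dict, which keeps the API flexible as the framework grows.

```python
def analyze(log_path, *, threshold=0.0, pattern="ERROR", **extra):
    """Hypothetical analyzer entry point with keyword-based configuration."""
    print(f"analyzing {log_path}: threshold={threshold}, "
          f"pattern={pattern}, extra={extra}")

# 1. Plain call relying on defaults.
analyze("run1.log")

# 2. Keyword arguments for one-off overrides.
analyze("run2.log", threshold=0.5, pattern="FATAL")

# 3. A config dict unpacked into the same call, handy when
#    settings come from a file rather than from code.
config = {"threshold": 0.9, "pattern": "WARN", "retries": 3}
analyze("run3.log", **config)
```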
Lastly, we should make sure of how our test case output is reported. A nice approach is to group all our test cases into test suites and then publish the results as XML output or through a web-page-based interface.
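A minimal sketch of that in Python, assuming the third-party unittest-xml-reporting package is installed (pip install unittest-xml-reporting); the test classes and assertions here are placeholders. The suite's results come out as JUnit-style XML that a CI server or web dashboard can render.

```python
import unittest
import xmlrunner  # third-party: pip install unittest-xml-reporting

class LoginTests(unittest.TestCase):
    def test_login_succeeds(self):
        self.assertTrue(True)  # placeholder assertion

class SearchTests(unittest.TestCase):
    def test_search_returns_results(self):
        self.assertEqual(1 + 1, 2)  # placeholder assertion

if __name__ == "__main__":
    # Collect the test cases into one suite, then emit XML reports.
    loader = unittest.TestLoader()
    suite = unittest.TestSuite([
        loader.loadTestsFromTestCase(LoginTests),
        loader.loadTestsFromTestCase(SearchTests),
    ])
    xmlrunner.XMLTestRunner(output="test-reports").run(suite)
```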
Above all, the environment setup to execute any test should be easy. We should consider writing small shell/Perl scripts to set up the environment rather than typing raw Linux/Unix commands each time.
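The same idea works in Python too; here is a sketch of a one-command setup wrapper (the paths and variable names TEST_HOME and TEST_CONFIG are made up for illustration).

```python
#!/usr/bin/env python3
"""One-command environment setup instead of ad-hoc shell commands."""
import os
import subprocess
import sys

def setup_env():
    # Hypothetical locations; adjust for your own project layout.
    os.environ["TEST_HOME"] = os.path.expanduser("~/testfw")
    os.environ["TEST_CONFIG"] = os.path.join(os.environ["TEST_HOME"], "config.ini")
    os.makedirs(os.environ["TEST_HOME"], exist_ok=True)

def main():
    setup_env()
    # Run whatever test command was passed in, inside the prepared environment.
    sys.exit(subprocess.call(sys.argv[1:] or ["echo", "environment ready"]))

if __name__ == "__main__":
    main()
```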