Technical computing can seem very different from a standard software environment. After all, you are delivering an analysis, not serving thousands of users directly (though your analysis may ultimately affect thousands or millions of users). That makes testing and validation even more critical, not less. The product of technical computing is often a single number or a chart, and the computing was done in the first place because the result is not obvious to the customer. Errors, then, are hard to detect. Are you off by a factor of 2 somewhere? Was your source data parsed incorrectly? That kind of error can escape detection for a long time...
First and foremost, the project can't be "priced to perfection". Technical computing isn't cheap, and discount work is worth nothing. You need time to produce quality results. If everything must go right for your project to be delivered on time and on budget, it will be hard to put time into verifying accuracy, and hard-to-detect but costly errors will cause knock-on problems down the road.
Mathematics or Formula Testing
Any mathematical formula should have a unit test written around it, checked against a known value or an independent source to verify it is working correctly. Low-level errors can haunt you for a long time, and they will never "crash" the program (unfortunately). Constants in a simulation should be reviewed for correctness, as should applied loads.
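As a minimal sketch of this idea, here is a unit test wrapped around a single formula (the function name and the beam-deflection example are illustrative, not from any particular project). The reference value is computed by hand from the textbook formula, and a second test checks a physical property (linearity in the load) that must hold regardless of the inputs:

```python
import unittest

def beam_deflection(load_n, length_m, e_pa, i_m4):
    """Maximum deflection of a simply supported beam under a central
    point load: delta = P * L^3 / (48 * E * I)."""
    return load_n * length_m ** 3 / (48 * e_pa * i_m4)

class TestBeamDeflection(unittest.TestCase):
    def test_hand_computed_value(self):
        # Reference worked by hand: P=1000 N, L=2 m, E=200e9 Pa, I=8e-6 m^4
        expected = 1000 * 2 ** 3 / (48 * 200e9 * 8e-6)
        self.assertAlmostEqual(
            beam_deflection(1000, 2.0, 200e9, 8e-6), expected)

    def test_linearity_in_load(self):
        # Doubling the load must double the deflection.
        self.assertAlmostEqual(
            beam_deflection(2000, 2.0, 200e9, 8e-6),
            2 * beam_deflection(1000, 2.0, 200e9, 8e-6))

if __name__ == "__main__":
    unittest.main()
```

Property-style checks like the linearity test are cheap to write and catch sign and exponent mistakes even when you have no independent reference value handy.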
Testing the full system on a known set of inputs with an expected result gives you confidence that the whole pipeline is working correctly. Known results from experimental data can be matched against simulation output to verify that the framework works properly.
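One way to sketch such an end-to-end check (the cooling model and the "measured" value below are made-up stand-ins for a real simulation and a real experiment): run the full model on the conditions of a known experiment and assert agreement within experimental tolerance.

```python
import math

def simulate_cooling(t0_c, ambient_c, k_per_s, t_s):
    """Newton's law of cooling: T(t) = T_amb + (T0 - T_amb) * exp(-k t).
    Stand-in for a full simulation pipeline."""
    return ambient_c + (t0_c - ambient_c) * math.exp(-k_per_s * t_s)

# Hypothetical experimental record: a body starting at 90 C in a 20 C
# room with k = 0.01 1/s was measured at 31.6 C after 180 s.
MEASURED_C = 31.6
TOLERANCE_C = 0.5  # experimental uncertainty

predicted_c = simulate_cooling(90.0, 20.0, 0.01, 180.0)
assert abs(predicted_c - MEASURED_C) < TOLERANCE_C, (
    f"simulation {predicted_c:.2f} C disagrees with measurement")
```

The point is not the physics: any pipeline can be pinned to a known input/output pair this way, so a refactor that silently changes results fails loudly.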
Always Keep Your Raw Data
Never modify your raw measurements or data if you can avoid it. Keep the raw data, and save the processing parameters needed to reproduce a given result. If your processing is computationally expensive, you can build a caching system that saves intermediate steps. If you need help with this process, Laurium Labs has experience implementing this sort of system.
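A minimal sketch of such a caching layer (the function names, cache directory, and smoothing step are all illustrative assumptions): key each intermediate result by a hash of the raw input plus the processing parameters, so the raw data is never touched and a rerun with the same settings is free.

```python
import hashlib
import json
import pickle
from pathlib import Path

CACHE_DIR = Path("cache")  # hypothetical cache location

def cached_step(name, func, raw_data, params):
    """Run an expensive processing step, caching its result on disk.
    The key hashes the step name, the raw input, and the parameters,
    so changing any of them recomputes; otherwise the saved result
    is loaded and the raw data stays untouched."""
    CACHE_DIR.mkdir(exist_ok=True)
    key = hashlib.sha256(
        (name + repr(raw_data) + json.dumps(params, sort_keys=True)).encode()
    ).hexdigest()
    path = CACHE_DIR / f"{key}.pkl"
    if path.exists():
        return pickle.loads(path.read_bytes())
    result = func(raw_data, params)
    path.write_bytes(pickle.dumps(result))
    return result

# Usage: a trailing-window moving average stands in for an expensive step.
def smooth(data, params):
    w = params["window"]
    return [sum(data[max(0, i - w + 1): i + 1]) / (i - max(0, i - w + 1) + 1)
            for i in range(len(data))]

smoothed = cached_step("smooth", smooth, [1.0, 2.0, 4.0, 8.0], {"window": 2})
```

Because the key covers the parameters, every cached artifact is traceable back to exactly which settings produced it, which is the reproducibility guarantee the raw-data rule is after.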