Parabolas, Rectangles, Triangles & Snakes (Part 9)

Part 9 – Verification

Part of the verification process in developing a spreadsheet or software tool involves what I like to term idiot proofing. Some end users have to be treated as potential idiots, and asking yourself to think like that idiot can be a challenge, as it doesn’t come naturally to everyone. Luckily, as engineers we are good at solving problems, identifying risk and considering all eventualities, so we can either test for these eventualities or provide suitable error handling to avoid handing back an incorrect, but potentially believable, result to the end user.

You’ll essentially need to consider things no sane person would do, and then do those things to make sure your code, calculations, etc, all work as intended. You know that guy who uses a spreadsheet to design outside of the fundamental assumptions it is based on, or intentionally decides some course of action is conservative when it really isn’t. Think like this guy: whatever can go wrong will go wrong, and you’re trying to head that eventuality off at the pass, so to speak.

The average user may treat your software like a big black box, with a high potential of SISO (shit in, shit out) occurring. Because they don’t fully understand what they are doing, or don’t fully understand the basic fundamentals the software is based on, it’s hard for them to tell whether an answer even seems reasonable.

Your primary job here is to do your best to protect that person from themselves, by putting suitable measures in place to prevent them inadvertently doing something idiotic.

The idiotic things people do might surprise you if you are just starting out in engineering. Once you’ve been around for a while I’d like to say it gets better, but that might be stretching the truth a bit. Get a few of the wrong type of people together and there is a higher chance you’ll suffer a case of multiplicative idiocy! The idiots are always innovating, finding new avenues for idiocy. Staying one step ahead isn’t always enough; solve one issue and they will find more ways to prove your faith in the human race wrong!

Proof of idiocy…

To demonstrate, I always remember one guy who was using a spreadsheet I created that relied quite heavily on macros to iteratively solve for the capacity of a concrete beam member and do some other checks. I thought I was being quite smart, in so far as I had added some application events in Excel to detect a change in the inputs, at which point the macros would automatically fire and update the outputs so the user didn’t need to manually click a button every time they changed an input. This all worked perfectly and avoided ‘I didn’t click the button’ syndrome, unless of course you happened to have macros disabled, and that’s where the fatal flaw in my plan was exposed.

So anyway, this guy designed several members with macros turned off, oblivious to the fact. Now I will give him credit, he eventually noticed that his little mini 300mm deep beam with minimal reinforcement miraculously had the same capacity as his 1000mm deep transfer beam that was reinforced up the wazoo. Naturally he complained my spreadsheet was rubbish because it didn’t give the right answers. Funny that, it might tend to do that when it’s not functioning as intended!

Proof you need to turn your brain on sometimes, at the very least! But this was me basically underestimating the actions of the end user, and not foreseeing this particular eventuality as a possibility. I was relying on the fact that Excel warns you and asks each and every time whether you want to enable macros, implying that macros might be required for the operation of the spreadsheet, and that the user would in all likelihood enable them.

I came to the rescue by writing a few more macros (the irony) to warn the user when macros were not enabled. Using application events, a watermark gets added just before saving and removed again straight after the save, which seemed to address the issue nicely. If a user then opened the file with macros disabled they would see this watermark; otherwise another macro would hide the watermark again and the user would be none the wiser, having never seen anything happen from their perspective.

It worked a treat; I never had another issue along those lines come to my attention again. Instead, I got questions about some big warning popping up and how to get rid of it, at which point I could explain a few things to them about life inside my spreadsheets. At least I was speaking their language now.


So this brings me around to exactly how we propose to verify this analysis tool. I am assuming that by the time I have finished, it will be complicated enough that it’s going to need some checks before anyone will feel confident using it. The average user probably isn’t going to check it themselves; they’ll just dive in and go nuts.

So I’m outlining my current thinking below:-

Continually performing checks along the way

Part of self-verification (and I’d say it is a big part of verification overall) is performing checks at every stage of the calculations as you produce the code. This is a sure thing; it has to get done, or you’d end up with one giant broken mess of code where it becomes increasingly hard to work out exactly where things went wrong.

Test at every stage, basically: add code, test code, add code, test code. Is the answer as expected? If not, why not?
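To make that loop concrete, here is a minimal sketch of the idea using pytest. The function name stress_block_force() is purely hypothetical, standing in for whatever piece of the tool was just written, and the expected value is the sort of number you’d get from a quick hand calculation of the same simple case.

```python
# Minimal "add code, test code" sketch (hypothetical function and numbers).
import pytest

def stress_block_force(b, d_c, f_cd):
    """Toy rectangular stress block: force = stress x compressed area."""
    return f_cd * b * d_c

def test_rectangular_block_force():
    # 300 mm wide, 100 mm deep block at 20 MPa -> 600 kN by hand
    assert stress_block_force(300, 100, 20) == pytest.approx(600e3)
```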

Verifying against known examples

You can verify the final output results against known examples. That’s pretty easy for any sort of rectangular stress block, as you’ll find 1001 worked examples out there, or you can compare directly to most commercial design software.

But for the parabolic stress block curve, I honestly couldn’t find anything that didn’t make some assumptions around discretising the curve into discrete points, crude average steps, or other such things that I’ve outlined previously.

So the full and final formulation might not be able to be checked in its entirety against a published example, but if you believe in the maths you can check the individual constituent calculations. The expectation that the answer for the parabolic-rectangular stress block isn’t going to be vastly different from a rectangular block is also a good measure of correctness; if they do turn out completely different, you have to ask yourself whether that is realistic.

Hand checks and comparisons

You could do some hand checks, though again that’s pretty hard if we can’t solve the full integrals by hand. Keep in mind here we are trying to prove our formulation is correct, not necessarily working to improve our integration skills to elite status. We just want to prove its correctness.

One thing to watch for is that this needs to be independent to a degree. If someone simply translates the formulas from the spreadsheet or code onto paper, they are potentially making the same mistake, getting the same wrong answer, and thinking everything is golden.

Often the best approach is to do the hand comparison calculations before reviewing the actual spreadsheet or code. If the hand checks and the spreadsheet disagree, one or the other contains an error, which forces you to do further checks until the two initially independent approaches agree.

Isolation and checking of various independent aspects

This leads me towards isolating certain aspects and testing each of these independently. If snippets of code work independently, then as a collective they should also work, with some faith in the interconnecting mathematics doing the business:-

  • For example, if the stress is constant then the formulation for the concrete stress block simplifies down to finding the area in compression and the centroid of this area. This test confirms our force/centroid formulation is working correctly (in other words, that our reliance on the width term is correct). The shapely package offers an alternative means of finding the area and centroid of the entire compression block in one hit for comparison (in fact I plan to implement this for constant stress regions as a faster means of calculation at some point, once it’s all working as expected). See the first sketch after this list.
  • Similarly, for a rectangular shaped section with bending about a primary axis and using the parabolic stress block from EC2, the width term simplifies to a constant and we can actually solve the integral symbolically and compare forces and centroids. This test confirms that the reliance on the height term is correct when using the parabolic part of the EC2 curve. See the second sketch after this list.
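To illustrate the first check, here is a rough sketch of the constant stress comparison. The polygon and the width_at() helper are illustrative stand-ins rather than the real tool’s API; the point is that integrating the width term over the depth should reproduce shapely’s area and centroid.

```python
# Constant-stress cross-check: integrate the width term over the depth and
# compare against shapely's area/centroid (illustrative geometry only).
import numpy as np
from shapely.geometry import Polygon, LineString

# Hypothetical compression block: 300 mm wide x 200 mm deep rectangle
block = Polygon([(0, 0), (300, 0), (300, 200), (0, 200)])

def width_at(poly, y):
    """Length of the horizontal slice through the polygon at height y."""
    minx, _, maxx, _ = poly.bounds
    return LineString([(minx - 1, y), (maxx + 1, y)]).intersection(poly).length

ys = np.linspace(0.0, 200.0, 2001)
widths = np.array([width_at(block, y) for y in ys])

# Trapezoidal integration of the width term for the area and first moment
dy = ys[1] - ys[0]
area_num = np.sum(0.5 * (widths[:-1] + widths[1:])) * dy
moment_num = np.sum(0.5 * (widths[:-1] * ys[:-1] + widths[1:] * ys[1:])) * dy

print(area_num, block.area)                      # both ~60000 mm^2
print(moment_num / area_num, block.centroid.y)   # both ~100 mm
```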
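And a sketch of the second check, using sympy to do the parabola-rectangle integrals in closed form over a constant width. The n = 2, εc2 = 0.002 and εcu2 = 0.0035 values are the usual EC2 ones for normal strength concrete; treat this as an assumed setup rather than the tool’s actual implementation. The resulting 17/21 ≈ 0.81 average stress factor and ≈ 0.416 centroid factor are the sort of closed-form numbers the numerical integration should reproduce.

```python
# Symbolic check of the EC2 parabola-rectangle block over a constant width
# (assumed n = 2, eps_c2 = 0.002, eps_cu2 = 0.0035 for normal strength concrete).
import sympy as sp

z, x, b, fcd = sp.symbols('z x b f_cd', positive=True)
eps_c2, eps_cu2, n = sp.Rational(2, 1000), sp.Rational(35, 10000), 2

eps = eps_cu2 * z / x                # linear strain, z measured from the neutral axis
sigma_parab = fcd * (1 - (1 - eps / eps_c2) ** n)
z_c2 = x * eps_c2 / eps_cu2          # depth at which the strain reaches eps_c2

force = b * (sp.integrate(sigma_parab, (z, 0, z_c2)) + sp.integrate(fcd, (z, z_c2, x)))
moment = b * (sp.integrate(sigma_parab * z, (z, 0, z_c2)) + sp.integrate(fcd * z, (z, z_c2, x)))

print(sp.simplify(force / (fcd * b * x)))      # 17/21 ~ 0.81 average stress factor
print(sp.simplify(1 - moment / (force * x)))   # ~0.416, centroid depth from compressed face / x
```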

User testing

You can also get someone else to use it and give feedback on how intuitive it is, on the user interface experience, or even to check the code itself.

Often if you’re the only one looking at it for weeks on end, you won’t see the obvious mistakes, or appreciate alternative ways of getting to the same answer more efficiently. Pick your guinea pig wisely: it should be someone familiar with whatever engineering design process you are automating, with some knowledge of the coding language or tools you are using.

Handling exceptions for all eventualities

Lastly, add sufficient exceptions in the code to provide feedback on things like incorrect inputs. Capture all those eventualities I was talking about; it’s quite important to test all the code logic.

In our case, something like having no internal holes defined sends the code down an alternative path, and that path needs confirmation that things aren’t going to break in this design situation.
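As a rough illustration (with made-up names), the kind of input checking I have in mind looks something like this: bad input raises a clear error rather than silently producing a believable but wrong answer, and the no-holes case is a deliberate path of its own.

```python
# Hypothetical input validation sketch: fail loudly on bad input,
# and make sure the "no internal holes" path is exercised too.
def build_section(outline, holes=None):
    if len(outline) < 3:
        raise ValueError("Section outline needs at least 3 vertices")
    holes = holes or []              # no holes defined: the simpler path
    for hole in holes:
        if len(hole) < 3:
            raise ValueError("Each internal hole needs at least 3 vertices")
    return {"outline": outline, "holes": holes}

# Both paths should be tested during verification
solid = build_section([(0, 0), (300, 0), (300, 500), (0, 500)])
hollow = build_section([(0, 0), (300, 0), (300, 500), (0, 500)],
                       holes=[[(100, 100), (200, 100), (150, 300)]])
```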

I’ve seen it plenty of times where an engineer gives another engineer the task of verifying a spreadsheet, and they cherry-pick some nice easy inputs that don’t even test all the possible outcomes.

This is completely the wrong approach. The verifier should be doing everything they can to break the spreadsheet: throw unexpected inputs at it, insert rows and columns. Hard-coded cell addresses in Excel macros will then often refer to the wrong cells; don’t be that guy who doesn’t use named cell references to head this one off.

I really hate that guy; he makes things too hard to follow. Basically, anything that tests both the numerical accuracy and the robustness of all code eventualities should be checked.

End users will find programming errors pretty fast, but end users finding errors isn’t an effective overall verification strategy. It’s almost inevitable that some fringe cases will not be envisaged during development; I guess this is the beauty of contributing to open source projects.

Some examples coming up next…

In the next post I’ll go over some specific examples checking some of the items noted above to illustrate that all those damn integrals are in fact correct!
