Rule of Thumb: Use Purposefully Weakened Methods

posted by Craig Gidney on April 9, 2013

Last week I talked about checking preconditions, which is a well-known tenet of defensive programming. This week I want to talk about another “obvious” rule of thumb: using methods that have been purposefully weakened.

When a method does more than you need, and you’ll be using it several times, it’s easy to misuse its extra features by accident. Wrapping the method so that it does exactly what you want strips away the abstractions and redundancies you don’t need, and prevents those mistakes from ever happening.

This rule may seem obvious. However, I see code that violates it all the time, and bugs caused by that code. It’s worth going over some examples.

Simple Example: SkipLast

A common operation to perform on a list or a string is to extract a contiguous subset of the items/characters. Most languages give you a way to do this as part of their standard libraries. A common variant of this operation is to take all but the last N items/characters of a list/string. When your language doesn’t have a standardized way to do this, you can do it by using the more general method:

var allButLastFive = text.Substring(0, text.Length - 5);

(Notice how the above code mentions ‘text’ twice. Redundancy is a common side effect of using a method that does more than you need.)
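For instance, suppose some other string, say a hypothetical title variable, is also in scope. One slip of the fingers produces code that compiles and usually appears to work:

// Oops: the length comes from the wrong (hypothetical) variable.
var allButLastFive = text.Substring(0, title.Length - 5);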

Dropping the last five characters of a string with Substring is by no means difficult, but all redundancy creates opportunities for stupid mistakes. If you accidentally specify the wrong variable name, as above, the code will still limp along and appear to work. We can avoid the issue entirely by introducing a method that does exactly what we need:

// Returns 'text' with its last 'skippedLength' characters removed.
public static string SkipLast(this string text, int skippedLength) {
    if (text == null) throw new ArgumentNullException("text");
    if (skippedLength < 0) throw new ArgumentOutOfRangeException("skippedLength", "skippedLength < 0");
    if (skippedLength > text.Length)
        throw new ArgumentOutOfRangeException("skippedLength", "skippedLength > text.Length");
    return text.Substring(0, text.Length - skippedLength);
}
var allButLastFive = text.SkipLast(5);

Alternatively, you could have been using Python the whole time:

all_but_last_five = text[:-5]

The benefits of this change are small, but the opportunities to use it or similar changes are many.

Practical Example: Crypto APIs

Cryptography is hard. Seemingly minor things can seriously reduce security. For example, using the same key for both sending and receiving data may allow replay attacks. You might also mistakenly use a hash function instead of a message authentication code, and end up authenticating nothing. It’s the perfect place for purposefully weakened methods.

Some cryptographic libraries are very flexible. They give you a lot of control. You get to choose what hash function to use for passwords. You get to decide how salts are generated and stored and retrieved, or if they’re used at all. You get to decide what the work factor should be. You get to decide how many times to shoot off your own foot.

Other cryptographic libraries are not flexible. They are purposefully limited/weakened. You can ask for high-level functionality, and that’s it. They make all the decisions. They choose the hash function, appropriately. They generate a secure random salt, correctly. They include the salt (and the details of the hash function) in the hash output, so that you can’t possibly fail to store it. They provide a function to verify that a password matches a hash, one that isn’t vulnerable to timing attacks like yours is (because you used memcmp, didn’t you?).

Suffice it to say that you’re a lot less likely to mistakenly create giant security holes when using a cryptographic library that exposes a purposefully limited API. That’s why, for example, PHP is transitioning to a purposefully limited password hashing API (despite usually being a shining example of what not to do).
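To make the contrast concrete, here is a minimal sketch of what such a limited surface can look like in C#, built on the framework’s Rfc2898DeriveBytes (PBKDF2) class. The method names, iteration count, and output format are illustrative choices of mine, not any particular library’s API; in practice you’d reach for a vetted library that already exposes this shape.

using System;
using System.Security.Cryptography;

public static class PasswordHashing {
    // Illustrative parameters; a real library would pick and version these for you.
    private const int SaltBytes = 16;
    private const int HashBytes = 32;
    private const int Iterations = 10000;

    // The entire API surface: hash a password, verify a password. No other knobs.
    public static string HashPassword(string password) {
        if (password == null) throw new ArgumentNullException("password");
        using (var pbkdf2 = new Rfc2898DeriveBytes(password, SaltBytes, Iterations)) {
            var salt = pbkdf2.Salt;
            var hash = pbkdf2.GetBytes(HashBytes);
            // The salt and work factor travel with the hash, so they can't be lost or mismatched.
            return Iterations + ":" + Convert.ToBase64String(salt) + ":" + Convert.ToBase64String(hash);
        }
    }

    public static bool VerifyPassword(string password, string storedHash) {
        if (password == null) throw new ArgumentNullException("password");
        if (storedHash == null) throw new ArgumentNullException("storedHash");
        var parts = storedHash.Split(':');
        if (parts.Length != 3) return false;
        var iterations = int.Parse(parts[0]);
        var salt = Convert.FromBase64String(parts[1]);
        var expected = Convert.FromBase64String(parts[2]);
        using (var pbkdf2 = new Rfc2898DeriveBytes(password, salt, iterations)) {
            var actual = pbkdf2.GetBytes(expected.Length);
            return ConstantTimeEquals(expected, actual);
        }
    }

    // Examines every byte regardless of where the first mismatch occurs.
    private static bool ConstantTimeEquals(byte[] a, byte[] b) {
        if (a.Length != b.Length) return false;
        var difference = 0;
        for (var i = 0; i < a.Length; i++) {
            difference |= a[i] ^ b[i];
        }
        return difference == 0;
    }
}

Notice that callers never see a salt, a hash function choice, or a raw digest to compare with memcmp; VerifyPassword is the only way to check a password.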

Common Example: Immutability

(Analogously: const in C/C++.)

An example of purposeful weakening that you’ve certainly heard about is the avoidance of in-place modification. I’m sure most programmers have experienced the shame of storing a list, expecting it to stay the same, and then modifying an alias of it somewhere else. Using immutable data structures makes that sort of mistake impossible.
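As a quick C# sketch of the mistake (the names are hypothetical), consider a class that hands out a reference to its private list; a far-away caller can then mutate it. Exposing a read-only view instead turns that mutation into a compile-time error. (ReadOnlyCollection is only a read-only wrapper rather than a truly immutable structure, but it already blocks the aliasing slip.)

using System.Collections.Generic;
using System.Collections.ObjectModel;

public class Playlist {
    private readonly List<string> songs = new List<string>();

    // Risky: callers can mutate our private state through the returned alias.
    public List<string> GetSongsMutable() { return songs; }

    // Purposefully weakened: a read-only view over the same list.
    public ReadOnlyCollection<string> GetSongs() { return songs.AsReadOnly(); }
}

// ...somewhere far away...
// playlist.GetSongsMutable().Clear(); // compiles, silently empties the playlist
// playlist.GetSongs().Clear();        // does not compile: ReadOnlyCollection has no Clear()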

Since this is such well-covered ground, I won’t waste time going any deeper.

Notes

Purposefully weakening methods is a technique that applies everywhere. You use it when designing good APIs, when deciding to encapsulate, and to an extent every time you decide to extract a function. In a sense, it is extracting functions (But weaker. Har har.).

As always, you make exceptions to the rule when speed is absolutely necessary. If the compiler is failing to inline a weakened method, and you need every bit of speed, then don’t use it. You should also beware going overboard, and ending up with a code base where half your methods do nothing but rearrange arguments.

One Response to “Rule of Thumb: Use Purposefully Weakened Methods”

  1. Kristoffer Ryhl-Johansen says:

It’s perfectly possible to make a comparison method that uses memcmp and is invulnerable to timing attacks; you just have to do memcmp(hash(x), hash(y)).

