Look at this equality:

\( (a + b) + c = a + (b + c) \), or this one:

\( a \cdot (b + c) = a \cdot b + a \cdot c \).

They are true **structurally**. In principle, you can simply replace one side of the equality with the other, without any further comment. However, we usually like to describe these equalities in terms of processes. For the first, we might say that *adding* the first two numbers and then adding the sum to the third is the same as adding the first to the sum of the last two. For the second, we might say that from left to right it is *multiplying* through a set of parentheses, and from right to left it is *factoring* something out. The italics mark the language of processes. So when does the structure itself matter? Here is a strange example that I observed while teaching differential equations.

There is a technique in which you multiply both sides of a first-order equation by something called an “integrating factor”. The point is to write one side of the equation as the derivative of a single function. For that, you need the product rule for derivatives: the derivative of a product of two functions is the derivative of the first times the second, plus the derivative of the second times the first. In symbols, we can write the product rule as \( (fg)' = f'g + g'f \), among many other ways. Moving from left to right, we are *doing* something: we *find* the derivative, we *multiply*, we *add*. However, it seems that when moving from right to left, we are not doing anything. We just replace \( f'g + g'f \) with \( (fg)' \). Indeed, the mere fact that we can simply replace the right side of the product rule with the left side was quite unacceptable to one of the best students in my class (hence he couldn’t see the logic behind the integrating factor). I would like to see more examples in which understanding structure really matters.
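To make the role of the right-to-left replacement concrete, here is a minimal worked sketch of the integrating-factor technique; the particular equation \( y' + 2y = e^x \) is my own illustrative choice, not one from the discussion above.

```latex
% Example first-order linear equation (chosen for illustration):
%   y' + 2y = e^x
% The integrating factor is \mu(x) = e^{\int 2\,dx} = e^{2x}.
% Multiplying both sides by \mu(x):
\[
e^{2x} y' + 2 e^{2x} y = e^{3x}.
\]
% The left side has the shape f'g + g'f, with f = y and g = e^{2x}.
% Replacing it with (fg)' -- the product rule read from right to
% left, the purely structural step -- collapses it to one derivative:
\[
\left( e^{2x} y \right)' = e^{3x}
\quad\Longrightarrow\quad
e^{2x} y = \tfrac{1}{3} e^{3x} + C
\quad\Longrightarrow\quad
y = \tfrac{1}{3} e^{x} + C e^{-2x}.
\]
```

The whole method hinges on the middle step: nothing is computed there; a sum is merely *recognized* as having the structure of the right side of the product rule and replaced by the left side.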