Or is this something which is imprecise because it's imprecise when mathematicians write it by hand?
It's exactly that. In fact I remember Feynman complaining about it in Surely You're Joking – he invented his own idiosyncratic notation for a lot of maths which didn't have all those ambiguities in which f(x) looks like f times x and dy/dx tempts you to cancel the ds. (And naturally he got used to using this in his own jottings, and then used it in front of someone else, who got completely confused, at which point he realised that maths notation is actually for *communicating with other people* and so he'd have to go back to the substandard normal notation after all.)
Some of the alternatives he described looked quite sensible, but others made me think he hadn't thought enough about futureproofing. For example, he had replacement notations for trig functions which looked like square root-type symbols, with a bar extending over the entire argument of the function. That's fine as far as it goes, but one's first question is 'OK, now how do you plan to extend this to an open-ended set of further mathematical functions that people will end up needing to use?' It's all very well if sin, cos and tan are the *only* things you'll ever need to do this to, but in practice you've now replaced the problem of picking a different *word* for each function with the much hairier one of picking a recognisably different *symbol*. (And then getting them all into Unicode / LaTeX / whatever...)
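(For the curious: the bar-over-the-argument idea is easy enough to fake in ordinary LaTeX with `\overline`, though the macro name here is made up and the leading letter is just a placeholder, not Feynman's actual glyph:

```latex
\documentclass{article}
\begin{document}
% Hypothetical macro: a sine-like symbol whose bar spans the whole
% argument, approximated with \overline rather than a bespoke glyph.
\newcommand{\feynsin}[1]{s\overline{\,#1\,}}
$\feynsin{x+\theta}$ \quad vs.\ \quad $\sin(x+\theta)$
\end{document}
```

Of course this sketch immediately illustrates the futureproofing problem: every further function needs its own visually distinct leading symbol.)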
(Aww – I'd never looked before, but naturally someone has tried to implement Feynman's notation in TeX :-)