The previous two answers show exactly why you should not try to bias an amp 'by current' without knowing a lot more about it.
First, you must know (not assume) the plate voltage to have any chance of telling whether the tube will even take it, let alone bias correctly at some particular current/dissipation.
And second, the 70% "rule of thumb" is questionable anyway. It's an upper limit, not a target.
The right current is above the point where crossover distortion appears on a full-signal clean tone, and below the point where the tube's ratings are exceeded when the amp is driven fully into distortion. Sometimes this range is narrow, sometimes wide. Often - with well-known amp/tube combinations - you can pick a figure that will land in the right range without having to measure anything else directly.
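To see why the plate voltage matters so much, here is a minimal sketch of the arithmetic behind the "70% of maximum dissipation" ceiling. The tube type and numbers are hypothetical (a 25 W-rated output tube, two example plate voltages); the point is only that the same percentage ceiling implies different currents at different voltages:

```python
# Sketch with hypothetical numbers: a tube rated at 25 W max plate dissipation.
# The 70% figure is an upper limit on idle dissipation, not a target.

def max_idle_current_ma(plate_voltage_v, max_dissipation_w, limit_fraction=0.70):
    """Upper-limit idle plate current (mA) so that idle dissipation
    stays at or below limit_fraction of the rated maximum."""
    return 1000.0 * (limit_fraction * max_dissipation_w) / plate_voltage_v

# The same 70% ceiling gives different currents at different plate voltages:
print(round(max_idle_current_ma(450.0, 25.0), 1))  # 38.9 mA at 450 V
print(round(max_idle_current_ma(500.0, 25.0), 1))  # 35.0 mA at 500 V
```

Without measuring the actual plate voltage, a current figure alone tells you nothing about where you sit relative to the dissipation limit.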
Another problem is that the common plug-in bias probes measure cathode current, which includes screen current (usually a few mA). In some ways this is actually helpful: the true plate current is slightly lower than the measured figure, so the setting ends up a little more conservative - but most people don't seem to know this, let alone account for it. Either way, quoting bias figures to +/- 1mA, or anything like that, is inherently flawed.
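The screen-current offset is easy to put in numbers. This sketch uses hypothetical figures (a 35 mA probe reading at 450 V with an assumed 3 mA of screen current) just to show the size of the error:

```python
# Sketch: cathode-style bias probes read plate current plus screen current.
# All figures below are hypothetical examples, not measurements.

def plate_dissipation_w(measured_ma, plate_voltage_v, screen_ma=0.0):
    """Plate dissipation (W) from a cathode-current reading,
    optionally subtracting an estimated screen current."""
    return plate_voltage_v * (measured_ma - screen_ma) / 1000.0

naive = plate_dissipation_w(35.0, 450.0)           # 15.75 W from the raw reading
corrected = plate_dissipation_w(35.0, 450.0, 3.0)  # 14.4 W actual plate dissipation
print(naive, corrected)
```

A few mA of screen current shifts the computed dissipation by over a watt here - larger than the +/- 1mA precision people like to quote, which is why that precision is illusory.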