I was brought in on an assignment to write business processes for testing the quality of high-speed Internet access lines. The company wanted to grow, and needed to hire many more people to test phone lines to match the planned increase in new customers each month. The few available experts (at the height of the dot-com boom) were needed for expert work, not for simple repetitive tasks such as testing phone line quality.
Experts in the department had spent years testing, diagnosing, and troubleshooting equipment in the industry. They had designed equipment for the industry. Testing the phone wires was so simple for them that they did it almost as an art: one glance at the test result numbers and they knew whether there was a problem, what type of problem it was, approximately how far along the phone lines it lay, and what would need to be done to fix it.
They had no idea how to pass all that knowledge on to novices without years of education. They knew there was no simple way to interpret test results. There were too many ways the physical characteristics of the phone wires could interact in combination, too many exceptions to any rule. Training would take months or years, and they didn't have time for that.
I could understand the problem of teaching new people, because listening to the experts was confusing. They would state what range of numbers indicated a good phone line, and then give exceptions (both for good and for defective phone lines!). Listening to novices who had been testing phone lines for a few weeks, still in training, I could tell the training confused them, and different people interpreted what they had been taught differently.
There was a lot to teach novices before they could interpret the test results the way the experts did.
The experts did not have time to help novices interpret test results for each new customer. Even if novices could never accurately interpret every test result, I had to find ways to reduce the number of tests that required an expert's interpretation. The experts knew all the complexity. I had to find the simplicity: what test results are a definite "good", what simple combinations of results are a definite "good", what results are a definite "defective", and what results need an expert to interpret them?
In numerous interviews over two weeks, I pinned the experts down: for each measurement, what range of values was "definitely good", "definitely bad", or "see an expert"? Of course, each expert gave slightly different numbers, so I had to get them to agree on a single set of values.
I produced a one-page table showing, for each measurement they took in testing, the range of results that was "definitely good", "ask an expert", or "definitely bad".
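A table like that maps naturally onto a simple range lookup. The sketch below shows the idea; the measurement names and threshold values here are invented for illustration, since the real numbers came from the experts' agreed ranges.

```python
# A sketch of the one-page table as a range lookup.
# Measurement names and thresholds are hypothetical examples,
# standing in for the experts' agreed values.
THRESHOLDS = {
    # measurement: (definitely-good range, definitely-bad range);
    # anything outside both ranges is "ask an expert".
    "loop_resistance_ohms": ((0, 900), (1500, float("inf"))),
    "noise_margin_db": ((12, float("inf")), (0, 6)),
}

def classify(measurement: str, value: float) -> str:
    """Return the verdict for one test measurement."""
    good, bad = THRESHOLDS[measurement]
    if good[0] <= value < good[1]:
        return "definitely good"
    if bad[0] <= value < bad[1]:
        return "definitely bad"
    return "ask an expert"
```

The point of the design is that the ambiguous middle ground is routed to an expert explicitly, rather than being guessed at by a novice.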
That one page fit with the step-by-step procedures I wrote for performing the tests.
I wrote documentation that showed what each screen on their computers looked like at the start of a step, which fields needed to be filled in and where they were on the screen, what to type into each field, and what the completed screen looked like.
I used the exact wording of every menu item and every field, including the underlines indicating keyboard shortcuts.
I showed the error screens that could appear (whether from operator errors or line test errors), and documented the steps to recover from each error as a separate process.
I watched people learning the old ways of testing and documented the mistakes they made, including what appeared on screen when they made a mistake, and wrote a separate procedure for getting back on track.
If a step had options, I wrote a procedure for determining which option to follow, and a separate procedure for each option.
Each procedure was step by step: no need to remember which option you had chosen, and no skipping past steps that didn't apply to what was in front of you (skipping that sometimes landed people in the wrong place in the documentation).
No mystery, no interpreting my instructions, no confusion.
In a few weeks, I had complete step-by-step instructions for the entire testing process. I had each procedure reviewed by several experts for accuracy, and had the procedures tested both by people who had been doing the job a few weeks and by people brand new to the job.
Training time was reduced dramatically, usually to under one day. The company could now hire anyone with basic computer skills who could follow instructions and type the right thing in the right place. My procedures meant new employees in this job could start today, rather than after weeks or months of expensive training.
I later estimated what my procedures were worth to the company. About 20 people were following the procedures, and if those 20 people meant 5 fewer experts were needed, the salary savings alone would easily exceed $500,000 a year. And at the height of the dot-com boom, those 5 experts were simply not available to hire at any price.
To the company, those 20 people testing phone lines meant that new customer installations were completed and revenue came in each month for Internet service. Once an installation was complete, essentially no further work was needed beyond payment processing and occasional customer service.