Subject: FWD: Techs stop reading manual
From: "Geoff Hart (by way of \"Eric J. Ray\" <ejray -at- raycomm -dot- com>)" <geoff-h -at- MTL -dot- FERIC -dot- CA>
Date: Wed, 25 Feb 1998 09:42:57 -0700

Jonathan Soukup reported <<The technical manuals that we produce are
used mostly by our field service technicians. We are finding that
after about a year, a new tech will know his job well enough to stop
using the manual... [but] we are constantly updating the manual with
new and changing information that they are not learning. Instead they
make costly long distance phone calls to field service support when
they run into a problem.>>

I know this may sound heretical, but strictly speaking, this isn't a
documentation problem. We've discussed the problem of updated
information several times before on techwr-l, and the consensus was
that people don't reliably read "new info: read me first" or "please
insert this page in your binder" information. That being the case, a
better solution is to hold a monthly or semi-monthly meeting (or phone
conference if your techs are scattered around the world) to update
your techs on what has changed. This works very well if everyone works
out of the same office; it's a bit trickier if you have several
regional offices, but you can still do it.

Barclay Blanchard noted <<We're going to send our manuals to a
bindery for printing for the next software release. The binder says
that we can't deliver Word files because its color scheme is RGB,
whereas printers use CMYK. He says that PageMaker uses CMYK.>>

The binder is misleading you. All colors that appear on a computer
screen are indeed in RGB format... so strictly speaking, any software
package inherently produces RGB. The real problem is that all
printing on paper is based on CMYK or some derivative thereof, and
the color gamuts of the computer screen (which emits light when
electrons strike its phosphor coating) and the printed page (which
reflects the light not absorbed by the four CMYK inks) do not
overlap completely. What the binder should probably have told you is
that they can't guarantee a match between your on-screen colors and
your on-paper colors unless you use some form of color matching
software to provide a close match between the screen and the paper. I
doubt the Word development team has even considered building color
matching into Word, so in that sense, the binder is correct;
moreover, service bureaus tend to get a case of the twitching awfuls
if you even mention Word files to them. However...
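
(To make the mismatch concrete: below is the textbook formula for
converting RGB to CMYK with no color management at all. It's a
minimal Python sketch, not what any real prepress package does,
precisely because it ignores the gamut differences described above.)

    def rgb_to_cmyk(r, g, b):
        """Naive RGB -> CMYK conversion, with no color management."""
        rp, gp, bp = r / 255.0, g / 255.0, b / 255.0
        k = 1.0 - max(rp, gp, bp)
        if k == 1.0:                  # pure black: avoid dividing by zero
            return 0.0, 0.0, 0.0, 1.0
        c = (1.0 - rp - k) / (1.0 - k)
        m = (1.0 - gp - k) / (1.0 - k)
        y = (1.0 - bp - k) / (1.0 - k)
        return c, m, y, k

    # A fully saturated screen green maps to 100% cyan + 100% yellow
    # ink, a combination that can't reproduce the screen's luminous
    # green -- the ink just isn't capable of it.
    print(rgb_to_cmyk(0, 255, 0))     # (1.0, 0.0, 1.0, 0.0)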

That being said, you can buy color management software for most
operating systems. On the Mac, it's built in (ColorSync), and any
software that's been updated to the current system software release
supports color matching more or less well. On the PC, Microsoft has
been playing catch-up with Apple and ships its own Image Color
Management (ICM) with recent versions of Windows; even where the
operating system doesn't provide it, most prepress applications on
the PC (including Photoshop and PageMaker) have some form of color
matching available.
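
To see what such software actually does, here's a sketch of an
ICC-profile conversion using the Python Imaging Library's ImageCms
module. The image and profile file names are placeholders; your
service bureau can supply profiles describing its actual presses.

    from PIL import Image, ImageCms

    # Placeholder names -- substitute your own artwork and the ICC
    # profiles your print shop supplies.
    src = Image.open("cover_art.png").convert("RGB")

    # profileToProfile() maps colors through the two device profiles,
    # pulling out-of-gamut screen colors to the nearest printable CMYK.
    cmyk = ImageCms.profileToProfile(
        src,
        "sRGB.icc",        # profile for the source (screen) space
        "press_cmyk.icc",  # profile for the target press
        outputMode="CMYK",
    )
    cmyk.save("cover_art.tif")  # TIFF can carry CMYK data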

Gina Hertel wondered <<How do all of you measure QUALITY in a
quantitative way? The only OBJECTIVE metric I can think of so far is:
decreased number of support calls.>>

Even that isn't a very good metric, at least not without lots of
tweaking. Have your users simply given up trying to reach technical
support? Stopped using features they don't understand? Developed
inefficient but effective workarounds for problems? Hired in-house
support staff? I'd also stay well clear of the "objective" part of
the definition, since quality is inherently subjective. What's more
important is that you come up with clear, measurable criteria that
actually reflect user needs.

My favorite definition of quality is "fitness for purpose"; you can
certainly split hairs and define the word more narrowly, but this
definition works just fine for most real-world practitioners. Starting
from that definition, you can define as many quality criteria as you
need, and define them as simply or as elaborately as you desire. For
example, a simple criterion might be "users must be able to save a
file"; if they succeed, no matter how long it takes and how many
errors they make, then you've produced a quality product by this
definition. A more complex and realistic version of this criterion
might be "users must be able to save a file with no errors, within 10
seconds of deciding that they need to save the file". Prepare a series
of these criteria and you're well on your way to creating a
quantitative quality index.
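
As a sketch of how such an index might work in practice, you could
score each criterion across a sample of users and roll the results up
into a single weighted number. The criteria, weights, and pass rates
below are invented for illustration; weighting lets the make-or-break
criteria dominate the index.

    # Each entry: (criterion, weight, share of test users who passed).
    criteria = [
        ("Save a file without errors",             3, 18 / 20),
        ("Save within 10 seconds of deciding to",  2, 15 / 20),
        ("Find the print dialog unaided",          1, 19 / 20),
    ]

    total_weight = sum(w for _, w, _ in criteria)
    index = sum(w * passed for _, w, passed in criteria) / total_weight

    print(f"Quality index: {index:.0%}")   # prints "Quality index: 86%"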

Can you provide us with more information on your goals? Without a
context (e.g., quality improvement, retaining or firing your
developers), it's hard to come up with more specific suggestions.
--Geoff Hart @8^{)}
geoff-h -at- mtl -dot- feric -dot- ca



