Product Testing
Over the years I have come to appreciate
those publishers who have implemented extensive product testing
procedures. Without well-planned product testing, the software code
could “go south” at any moment, and take the customers and
publisher down with it. Here are a couple of stories which help explain
this concept.
The Macola Product Testing Lab
For the past fifteen
years, Macola Progression has offered perhaps the finest low-cost
manufacturing accounting software solutions in the industry. However,
the product suffered from serious bug issues – too many to ignore. I can
remember receiving almost two dozen Macola CDs in the mail
during 1996, as the company relentlessly released new versions every
other week in an effort to combat bugs. The following spring, I had the
opportunity to have dinner with the company’s President, Bruce
Hollinger, where I callously remarked, “I’ve invented a device that
catapults my Macola CDs across the Chattahoochee River so that I can
shoot them with my shotgun – ‘Pull’, ‘Boom’, ‘Pull’, ‘Boom’.” I thought
that I was being funny, but my comment didn’t raise a single smile, much
less a laugh.
Two years later, I
visited Bruce Hollinger again in Marion, Ohio. Bruce invited me to take
a ride with him where he took me on a tour of a new office location
where Macola had set up a product testing laboratory. Along the walls
were fifty computers busy running 1.5 million lines of test code against
the Macola product. In the offices were twenty personnel whose sole job
it was to read the testing reports, identify problems and bugs, and
communicate those problems back to the programmers. Bruce told me that
they had visited the Microsoft Excel team specifically to ask them how
they tested their product. Then Macola modeled their testing lab after
the Microsoft Excel testing lab. Bruce told me that the testing lab had
been one of the most difficult and expensive things he had ever done,
but also one of the most beneficial. He said, “I used to
think that we were shipping pretty clean code, but I was only fooling
myself. With this new testing lab, I now know that we are shipping good
clean code, and I can look customers in the eye with a good conscience.” My
reply to Bruce was, “Great Plains has known this for years; they have 700
computers running 3,600 test macros around the clock, producing 193,400+
electronic reports that are checked electronically each night.”
Since that time, I can
verify that the Macola Progression product has shipped much cleaner code.
I’ve talked to dozens of Macola consultants, resellers, and customers
who have confirmed that the testing lab has made a world of difference
for Macola. Except for one minor and unconfirmed complaint I received in
May 2001 concerning Macola’s Warranty Repair module, I have not heard a
single other bug complaint since 1998. My hat’s off to Macola.
Solomon Software Fights Bug Issues
I should note
that Solomon has always been a favorite product of mine; it was the very
first product I worked with, back in 1985. I like the people and the
product.
In 1997, I visited
Solomon Software in Findlay, Ohio. During my visit, the Solomon folks
were excited that they had just implemented procedures to start
compiling their product code on a nightly basis, rather than on a
monthly basis as they had done in the past. I did not say so at the
time, but I was rather appalled that they had not been doing this all
along. I had just traveled from Fargo, North Dakota the previous day
where I toured the Great Plains Dynamics testing lab. As I mentioned
above, Great Plains had 700 computers running 3,600 test macros around
the clock producing 193,400+ electronic reports that are checked
electronically each night. Therefore, the fact that Solomon was just now
taking steps to compile code nightly was not very impressive, compared
to Great Plains’ efforts.
The following year,
Solomon released new product code which unfortunately, was buggy. Over
the course of the year, I encountered seven separate instances in which
an attendee in my audience stood up and complained bitterly about
Solomon bugs. I can vividly recall one such episode as I was lecturing
in Puerto Rico at an AICPA conference. I had just demonstrated the
Solomon product to several hundred participants when the CFO of a
California-based timber company stood up in front of everybody and
proceeded to slam Solomon without mercy. He explained that his company
had spent more than $300,000 on the product, and it never worked. They
eventually got their money back. I too had problems. Working with my
colleague at the time - Randy Johnston, we worked for days in an effort
to get Solomon up and running on our laptop computers – but to no avail.
Finally, Randy shipped his laptop off to Solomon and after several
weeks, he was told that they could not get it to work either. The grand
finale came in September, as I attended a Texas Rangers game as a guest
of ePartners (great seats, by the way). During the third inning,
ePartners’ management announced to me that they had stopped recommending
Solomon IV. At first I was a little shocked, but then it really sank in
as to just what they were telling me. You see, at the time ePartners was
the number one Solomon reseller in the world. I asked them why, and the
reply was “Solomon’s idea of testing their product is to throw it out
there to their customers and let them suffer.” [Please keep in mind that
this was not an official comment from ePartners; it was merely idle
conversation over hotdogs and beer at the ballpark. ePartners had a
huge vested interest in the Solomon product, and I don’t blame them for
being frustrated. ePartners is a fine organization.] [Also, from the
rumor department: some internal programmers at Solomon have suggested to
me that the root of the Solomon bug issues was a direct result of
changes that Microsoft made within their database design, and that the
fix for these problems lay in Microsoft’s hands, not Solomon’s. I have
no idea whether this is true, but because we are talking ancient history
here, it is interesting to ponder.]
As you can see, there
was a fairly decent amount of evidence that Solomon’s lack of formal
testing had finally caught up with them. Prior to this episode, Solomon
had produced some of the most rock-solid code in the marketplace – as if
they were immune to bug issues. I am sure that their decade-long success
to date helped lull them into a false sense of security. Solomon Software
has since been purchased by Great Plains, which has since been purchased
by Microsoft. The folks at Great Plains made it their first priority to
repair the Solomon code, which was done fairly easily using the Great
Plains testing lab model. Unfortunately, the image takes a little longer
to repair, especially with accounting software pundits like me who spout
off these old stories. I am pleased to report that for the past several
years, Solomon IV has enjoyed good, clean, dependable code. I have been
recommending Solomon with complete confidence and continue to do so.
Conclusion
I think that these
stories speak for themselves. Product testing is very important. Each
accounting software publisher should maintain a separate group dedicated
solely to product testing. Don’t be fooled by publishers who claim that
“our programmers test their work.” Of course they should review their
own work – who doesn’t proofread or review their own work? However, it
is a different story to have independent personnel dedicated
specifically to this function. Ask about it. Ask for names. Ask the
publisher to describe their product testing procedures. If the company
doesn’t have such procedures, you will be able to tell.
Final Note
- I am not sure how the folks at Macola,
ePartners, or Solomon will feel about me mentioning their names in these
stories above. In the end, this should be a good story for all. Macola
and Solomon both implemented superior testing and now have better code
than ever before. Also, my hat goes off to ePartners for doing the right
thing by not recommending a buggy product to their customers. My
purpose in recounting these stories is to help us all learn from past
mistakes.
- END -