Troubleshooting Professional Magazine
Exploiting Perceived Similarities
With regard to Steve Litt's quotation "I never let lack of knowledge and experience stand in the way of creating something great.", you may use this quotation anywhere you want, provided you quote it verbatim and attribute it to Steve Litt. The quote first appeared in the January 2001 Troubleshooting Professional Magazine, published 1/3/2001.
How do we give meaning to such a life? One way is to volunteer a strong pair of shoulders for others to stand atop, so they can skip the decades of grunt work we had to put in, and reach higher than our time allowed.
Although I'm nowhere near my "smartest moment", I offer this issue of Troubleshooting Professional as a strong pair of shoulders you can stand on. If you can instantly know what took me decades to learn, imagine how high you can go! All I ask is that as you learn, you pass it on.
Troubleshooting Process was easy. I documented the basics of Troubleshooting Process after less than 2 years in an environment where Troubleshooting productivity determined my paycheck. Developing a method to transfer that knowledge was harder -- 20 years in development and still not complete.
And then there's my Rapid Learning process, which took about a decade, from the first question, to the creation of a book that can teach that skill.
How does one guy come up with 2 original and unrelated concepts? Isn't that a little like lightning striking twice? How did I do that?
Believe me, it's not intelligence -- I'm average. You might think it's persistence, and at one level you'd be right. But persistence doing what? Ahhh, that's the question leading to the desired answer.
I can't yet answer that question (in 5 years I'll probably tell you the answer is obvious :-). But I can tell you one piece of the answer. That piece is simply the fact that I perceive and exploit inobvious similarities.
If I do my job in this issue of Troubleshooting Professional, you'll be able to exploit perceived similarities the way I do. You might need some practice to perfect it, but that practice will be measured in months, not decades.
The remaining obvious question is "what does this have to do with Troubleshooting?". The simple answer is if you use the Universal Troubleshooting Process, your career will rapidly advance to the point where you're doing much more than restoring systems to their as-designed state and behavior. At that point you'll need to expand your abilities beyond the limits of your current knowledge. Exploiting perceived similarities is a powerful ability expanding tool.
So kick back and relax. Enjoy and think. This is your magazine.
The women in the 100+ person crowd surrounding The Artist raved about the beautiful paintings, and how much they wanted one. The men, every one of whom had done the same timings and calculations as I, wondered aloud if they had chosen the right career. Everyone in that crowd recognized they were seeing something extraordinary. And at least one person saw opportunity.
I didn't question my choice of career. I find it challenging to draw a person who isn't a stick figure. Art is out of the question for me. But I had enough computer program design experience to know that The Artist was actually designing his art, and then implementing it. So the question I asked was "Could I design computer programs the same way, and if I could, would I reproduce his extraordinary productivity?"
I was further helped by the life experience of participating in the Quality movement. I had learned that most human endeavor is done by following a process. This led me to ask "What is his process?". I observed him for hours, watching his efficient handling of the crowd of potential customers, and his swift, sure creation of the paintings.
He could paint a school of multi-colored fish in 15 seconds. Whip out a fish school stencil and can of spray paint to make the bodies. Center another stencil and grab a different color spray paint to paint the fishes' highlights.
A picture of the sun took 20 seconds. Spray a large blob of yellow. Place an empty spray can upright on the yellow, and spray sky color all around. Lift the can, and smear with a rag to get the sun's radiance.
A comet took only 5 seconds. Spray white into a round stencil, lift and smear upwards with a rag.
After 2 hours it was time to go and I still hadn't figured it out. As my wife drove the 75 miles back to our home, I worked on synthesizing The Artist's process. From my study of accounting as well as reading self improvement books, I knew there must be a goal -- otherwise the result is random. Hence the question -- "What's The Artist's goal?".
Introspection made his goal, and the process by which he deduced his goal, obvious. He'd spend a minute interviewing the customer as to what she (most but not all were female) wanted. There would be a little back and forth as he molded her vision into what he could do quickly. Everyone got what they perceived as a custom painting, but all the paintings were actually within some pretty tight parameters. No skyscrapers, no football players. Just ocean, sky, fish, birds, sun, stars, air and water. In the first minute or two, he knew exactly what was to be painted, and had placed it in the framework of what he could do quickly. He had his goal after a 2 minute interview with the customer.
Another obvious fact was that the man made expert use of tools. Stencils, spray cans (both for spraying and for blanking out circles with the bottom of the can), rags, brushes. He always knew where they were, always cleaned and cared for them, and always put them exactly where they belonged. For The Artist, finding the right tool was always less than a second.
Big deal! We all have tools, but we all don't produce the way The Artist does. What was his secret for using tools?
Cruising Interstate 101, in my mind's eye I replayed The Artist's every move. In every painting he made the sun the same way. Every comet was produced by the same series of moves. Every school of fish was the same series of actions. Everything he did was almost a reflex action. I remembered back to my guitar days, and the use of riffs. Yep, just like a guitarist, The Artist used the same riffs over and over again to produce a result which overall looked (sounded) like something new. So The Artist's methodology looked something like this:
I recognized right away that programming was not a good candidate for The Artist's methodology. Unfortunately, the customer usually dictates the tools the programmer uses, and there's really no way around that in the programming marketplace. So I turned my sights to my other vocation, technical writing. And struck gold!
What were The Artist's stencils if not boilerplates? And unlike The Artist, I could make myself some nice little macros to automatically lay down several stencils (scuse me, boilerplates) at once. I created lots of boilerplates, practiced with each, and with combinations, until their use became riff. I especially concentrated on time consuming tasks like shooting and importing screenshots.
Within a month I'd doubled my technical writing productivity. When Troubleshooters.Com was born two years later, I instantly created tools and riffs to achieve my goals, both in creating content and in marketing the site. I'm often called prolific. Chalk it up to The Artist, and a tourist who saw a connection between quick paintings and technical writing.
The story doesn't end there. I never stopped asking how I could apply The Artist's techniques to computer programming, in spite of the fact that every client requires the programmer to use different languages, components and frameworks.
In 1999 I recognized that the set of software tools shipped with Linux could form a ubiquitous toolbox for everyone, and that this toolbox could be brought to the customer site. I wrote about this in the August 1999 Troubleshooting Professional Linux Log column, mentioning that by using UNIX commands piped together with Perl, awk and sed programs, a "programmer" could quickly cobble together a major application. But I never accomplished that, none of my friends accomplished it, and I heard of no Troubleshooters.Com readers accomplishing it. I kept looking.
Last week I was invited to a client site to write a heavily geeky network management program. I did it in 3 days using Perl plus a network centric Perl module. The client plans to do it over again in Java, using a crew of several programmers and over a month's time. The Java version will be much better than mine, especially because my program took away the time pressure that might have tempted them to do inadequate design in favor of quick rollout, and because they'll learn from my code, and from complaints about my code. For the price of 3 programmer days they got what we all wish for -- the ability to rewrite version 1 programs.
When I hit the road espousing system independent troubleshooting in 1990, many audiences treated me to the equivalent of throwing tomatoes. Saying that troubleshooting process was more important than system expertise was considered blasphemy back then. I couldn't convince them of my point of view. Nor could they convince me that their system centric troubleshooting methods were right. For I had been using system independent Troubleshooting process successfully for over a decade. I had discovered system independent troubleshooting in 1979, by recognizing a similarity.
It was my first such experience.
In the months before the discovery, I had found a great method of fixing stereo equipment. Rather than repeating that story, you can read it in an article called "The Loser" in the February 2000 Troubleshooting Professional. Suffice it to say I was able to use binary search on a stereo system to quickly narrow the root cause to a single component. Then my television broke.
I had absolutely no training in television theory or repair. But there were obvious similarities between stereos and televisions. Both used transistors, employed printed circuit boards, contained recognizable circuits (diff amps, class A, oscillators and the like), required soldering irons, and required solution of DC problems before high frequency problems. I would have considered myself a wuss if I hadn't at least tried to fix that TV.
Not knowing how a TV worked, I bought and read "Sams Photofact Television Course". The introduction to the book contained a block diagram of a television. Working by instinct garnered in my audio repair work, I used that block diagram to decide on measurements to take, and to section off the root cause. I fixed the TV.
In the glow of triumph experienced after successfully concluding a repair (some day that glow would be step 9 in the Universal Troubleshooting Process), it occurred to me that the binary search that made me a stereo repair ninja had allowed me to fix a television for which I had no training, no schematic, and no business being able to fix. The block diagram was different, but the methodology was the same. Perceiving the similarity between the stereo and the TV, I made the cognitive leap to the fact that I could use binary search to fix ANYTHING for which I had a block diagram. Realizing that binary search and the block diagrams were the two tools I needed, I named the binary search technique "Divide and Conquer" and the block diagram (or any other organized representation of the system) the "Mental Model". All that remained was to test the hypothesis that Troubleshooting was system independent.
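As a rough illustration (my own sketch for this article -- the stage names and the test function are hypothetical, not from any TPM issue), Divide and Conquer over a linear block diagram amounts to a binary search. With a good Mental Model, a handful of measurements localizes the bad stage:

```python
def find_bad_stage(stages, signal_ok_after):
    """Return the first stage whose output is bad.

    stages          -- ordered list of stage names, input to output
    signal_ok_after -- signal_ok_after(i) is True if the signal is still
                       good after stage i (0-based); assumes the input to
                       stage 0 is known good and the final output is bad.
    """
    lo, hi = 0, len(stages) - 1      # the bad stage lies in stages[lo..hi]
    while lo < hi:
        mid = (lo + hi) // 2         # measure roughly halfway through
        if signal_ok_after(mid):
            lo = mid + 1             # fault is downstream of mid
        else:
            hi = mid                 # fault is at mid or upstream
    return stages[lo]

# Toy example: a 6-stage stereo chain where the driver stage is bad.
chain = ["tuner", "preamp", "tone", "driver", "power amp", "speaker"]
bad = 3  # index of the (hypothetically) faulty stage
print(find_bad_stage(chain, lambda i: i < bad))   # -> driver
```

Three measurements instead of six: that logarithmic payoff is why Divide and Conquer scales to systems with dozens of blocks.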
I got that opportunity when I became a professional programmer in 1984. Every time troubleshooting or debugging was necessary, I made sure to either have or make a block diagram of the system -- even if the system was entirely software. And I made sure to use binary search. Although I was a junior programmer, within 6 months I was out-troubleshooting hardware and software pros with over a decade's experience.
But the discovery of a third tool led to the question "Could there be even more tools?".
A few days later my Intermediate Accounting teacher outlined the basic principles of accounting, including the "Objectives" of accounting, which are threefold:
During the next 2 weeks I made it my business to perform commensurate with the impression I had made during the interview. I worked more hours than I billed, worked at night so they wouldn't see me programming with the manual on my lap, and used my Troubleshooting and program design skills as crutches for my less than stellar RBASE skills. After 2 weeks my RBASE skills were sufficient to take on the client's most difficult RBASE work, produce superior results, and to bill for every hour I spent. The client was thrilled. It was the beginning of a 10 year business relationship.
In subsequent years, I used that same terminology-centric method to
obtain contracts in Assembly/TSR, Clarion, Windows help authoring, and
many, many more skills. In every case, the client loved my work. One
thing was obvious. I never let lack of knowledge and experience stand in
the way of creating something great.
It's quite humorous that a guy who achieved guru level RBASE performance in days required 10 years to fully understand his own learning process. In 1998 I got to thinking about all the contracts I'd obtained by smooth use of terminology. Obviously I had used a similar sales process in every one. But I perceived, and then exploited, another similarity. In every case, my results were those of a person with the experience I portrayed, not the experience I actually had. What process did I use to learn so quickly?
Once confronted with my series of successful contracts, it doesn't take a rocket scientist to realize that my study of terminology jump-started the learning process. After all, the originators of any concept give that concept, and its sub-concepts, names. These names form the technical terminology and give the concept tangibility. To attempt to master the concepts without first mastering the originators' technical terminology is to re-invent the wheel.
Upon reflection, it's obvious that terminology-first learning speeds
mastery several fold. But it was obvious only after seeing a long series
of such incidents, and perceiving their similarities. My exploitation of
that perceived similarity was the full documentation of my process, which
required 315 pages in "Rapid Learning: Secret Weapon of the Successful
Technologist". Obviously I can't even begin to summarize that book in this
article, but what I can do is give you the flowchart of the Rapid Learning process.
If you're concerned how hard it is to hire trained technologists, and how hard it is to keep them trained in this age of 18 month obsolescence, consider using Rapid Learning in your organization. If you're a technologist bone tired from running the training treadmill, Rapid Learning just might be your salvation.
If you use Rapid Learning and like it, remember that Rapid Learning owes its existence to the exploitation of the perceived similarity between a sales technique and evidence of hyperproductive learning.
Like I said, this client likes my work, and they don't let my lack of knowledge or experience get in the way of hiring me to create something great.
I got the XYZ Perl module off the net (interestingly, not from CPAN), spent the first day learning XYZ terminology and creating a proof of concept with the XYZ module. I spent the second day writing the app, finishing everything but error handling. I completed the app on the third day. I heard it through the grapevine that they had expected it to take twice as long (presumably for an XYZ expert :-). And coolest of all, they had a crew of several Java guys working on a more complete and featureful Java version of the same app, and expected that app to take over a month.
So for the price of five programmer days (they felt it wise to have me spend one additional day on documentation and an additional day on site for live testing), they got an immediate working app, thus relieving the Java team's time pressure and giving them the time necessary for a quality design process. They also got a working prototype app to attract criticism, thus pointing out any potential design flaws before such flaws were solidified in the Java app.
That's all well and good, but certainly not big news. The big news is that I perceived a similarity between the outcome of that contract, and Jon "maddog" Hall's "Value Added" revenue model for Open Source, and an August 1998 Troubleshooting Professional article entitled "Linux Log: The Lesson of the Artist". The 8/1998 TPM article discussed the street artist from Port Hueneme, in great detail, including proposing a way to implement his methods using Linux, shellscripting, and Linux utilities. Distilling Jon "maddog" Hall's "Value Added" Open Source revenue model to a single sentence, you make money by using Open Source software as a tool. If one has several such tools, and creates riffs to use those tools, one can realize The Artist's methodology with Open Source. Walking to my car after finishing my XYZ enabled Perl app, I realized that I had used The Artist's methodology to write code. Cooler still, the tools I based my riffs on (the networking Perl module) were Open Source, thereby realizing Jon "maddog" Hall's Value Added revenue model.
And what a result. I received an hourly rate significantly above the norm for my geographic area. My client got twice the productivity they expected. The team doing the Java app got a reprieve from instant rollout. The client will end up with a superior long term (Java) product. Everyone won.
Even the guys who wrote the Open Source module won. Because they can use their tool even better than I, so they can get even better contracts. Heck, that's probably the reason they wrote it.
Had I not seen The Artist in action, and had not Jon "maddog" Hall told me of his Value Added revenue model, I wouldn't have recognized what I had done, and might not have done it again. But once again, exploitation of perceived similarity has led to new heights in productivity.
By the way, if your company is in Central Florida and you want someone
to come in and bang out your app in advance of the team doing it in Java,
C++, or whatever, I can do the same for you.
Maybe it can and maybe it can't. Only a few months of testing will answer that question. Until I do such testing, I cannot say with any certainty whether my hypothesized methodology, which I have named HDIFO, works. And therefore I will not describe it at this time, even to the extent of telling you what HDIFO stands for. You may never hear of this again.
Or you may see a future issue of Troubleshooting Professional Magazine entirely devoted to HDIFO, and maybe a book.
The main point is that if I hadn't seen a similarity between my outline and programmatic functional decomposition, I never would have recognized that outline as a possible generic problem solving methodology.
When I discovered System Independent Troubleshooting I needed to fix a television, and would have looked like a real wuss if I had brought it to someone else for repair.
My discovery of the Fix the Right Problem tool occurred during the writing of "Troubleshooting: Tools, Tips and Techniques" -- a book that had to go where no other Troubleshooter had gone before.
The transformation from the cynical buzzwording sales technique to the productive Rapid Learning process occurred because I had gotten tired of the cynicism of my (now out of print) "Job Seekers Guide: Untold Truths of the Job Market", and wanted to focus on a more positive concept of "Rapid Learning".
My discovery of CPAN Gold occurred because of my continuing need to program faster, and because I really want to tell the world about a realistic Open Source revenue model so that Open Source will thrive.
In each case, necessity was the mother of invention (actually, discovery :-).
I see two very good reasons why necessity is the mother of invention. First, unless a new fact is a priority, why would you remember it or research it further? So much information, so little time. Second, without a pressing necessity, where would you file the information? What would you link it to? But if there's a necessity, you file it in your mental file containing that necessity.
So exploitation of the perceived similarities leads me to the conclusion that necessity is more important than exploitation of perceived similarities. Go figure :-)
Did I just hear you mention exploitation of perceived similarities? Years ago the necessity of fast, quality program design led to the learning of functional decomposition, and the recognition that the leaf elements have a fundamental difference from their ancestors. Today I created a freeform outline to solve a sales problem, noted that there's a point at which to stop decomposing, and remembered the similarity with program design. Superimposing them, I created a new problem solving methodology (which I haven't tested yet, so this is vaporware right now) in a day instead of ten years.
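To make the stop-decomposing point concrete, here's a toy sketch (the task tree and function names are my invented example, not the untested methodology itself): decomposition recurses until a task is directly doable, exactly as program design recurses until a function is simple enough to write.

```python
def decompose(task, subtasks_of, is_doable):
    """Return the leaf (directly doable) tasks of `task` as a flat list.

    subtasks_of -- maps a task to its child tasks (assumed given)
    is_doable   -- True when a task needs no further decomposition
    """
    if is_doable(task):
        return [task]           # leaf element: stop decomposing here
    leaves = []
    for sub in subtasks_of(task):
        leaves.extend(decompose(sub, subtasks_of, is_doable))
    return leaves

# Hypothetical sales problem decomposed into actionable steps.
tree = {
    "close the sale": ["qualify prospect", "present solution"],
    "present solution": ["demo product", "handle objections"],
}
leaves = decompose("close the sale",
                   lambda t: tree.get(t, []),
                   lambda t: t not in tree)
print(leaves)  # -> ['qualify prospect', 'demo product', 'handle objections']
```

The leaves are fundamentally different from their ancestors: they're actions you can take, while every ancestor is only a grouping of actions.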
In fact, the process of exploiting a perceived similarity is remarkably consistent:
Look around you at all the knowledge considered obvious today. How much of that knowledge was nonexistent or considered blasphemy 10 years ago? Much of that obvious information was uncovered using exploitation of perceived similarities.
The final commonality I find from the discoveries listed in this article is that exploitation of perceived similarities is just one method of thinking. It's a tool, not a solution. There are many other vital thinking tools, including but not limited to differential thinking and introspection.
Tradition is wonderful, but in time most traditions outlive their usefulness. So it is with the "revolutionary" theme of TPM's anniversary issues. The Troubleshooting Revolution is over, and we won. Troubleshooting Process is taught and practiced widely. Pockets of resistance surrender daily.
That's not to say there aren't still battlegrounds. A single company, Microsoft, continues to produce complex, non-modular, intermittent products that are close to irreparable. Unbelievably, they're still a threat even after Judge Jackson declared them an illegal monopoly and ordered divestiture. Luckily, Linux and Open Source software in general have given us reliable and modular alternatives to almost all Microsoft offerings. As a Troubleshooter, it's very much in your interest to tell your employers and clients which technologies you believe to be most reliable.
My statement that the revolution is over is an oversimplification. In fact, Jim Roach, Eddy Belew and the crew from Intelliworxx are continuing to convert the world to Era 4 Troubleshooting, as discussed extensively in the December 2000 issue of TPM. Look for Era 4 troubleshooting tools to assume an increasing share of Troubleshooting over the next several years.
But unless the operation of businesses changes radically, Era 4 will never take over completely. Era 4 Troubleshooting Tools are system dependent, so each system requires its own tool, with info, scripts and the like. With obsolescence now at the 18 month level, and time to market ruling our economics, it's unlikely that manufacturers of computers, software and consumer products will take the time and expense to create an Era 4 tool for their products. So I advise you to become a member of the Era 4 revolution if you can, but if you don't have that opportunity don't worry -- Era 3 will be king of computer systems for at least several years.
And who knows what Era 5 will bring, or who will lead it.
You'll notice that the main beacon of Era 3 Troubleshooting, Troubleshooters.Com, has changed. Much new and detailed Universal Troubleshooting Process information has been added in the form of Troubleshooting Professional Magazine issues:
February 1997: Tools and Solutions
August 1997: Superstition, Positive Thinking and Luck
September 1997: Team Troubleshooting
October 1997: The Attitude
November 1997: Gavin Gray
February 1998: General Maintenance
March 1998: Bottleneck Analysis
December 1998: Intermittents
February 1999: When the Going Gets Tough
February 2000: Natural Born Troubleshooters
March 2000: Take Pride
August 2000: Do the Easy Ones First
December 2000: The Many Faces of Troubleshooting and Problem Solving

Links to these TPM issues now appear on the Universal Troubleshooting Process page, so there's a single source on the 'net for Troubleshooting information. In addition, Troubleshooting Professional Magazine is starting to address issues in the larger generic problem solving arena. The December 2000, January 2001 and February 2001 issues all subscribe to this genre.
Perhaps the most noticeable change in Troubleshooting Professional Magazine is that we're becoming a little less editorial. We continue guiding you in benefiting from Open Source, but with fewer negative comments about Microsoft. Restraining my loathing of Microsoft was a tough decision, especially in light of the fact that every time I comment negatively on Microsoft, Troubleshooters.Com visitorship skyrockets. But we're now reaching out to an audience beyond the (already in the choir) Open Source advocates. We're now reaching out to technologists in general. If technologists think of me as a rabid fanatic, they're less likely to find my howto articles credible, and less likely to adopt Open Source. So I'm counting to 10 before discussing my opinion of Microsoft. Of course, I retain the option of slapping Uncle Billy and his sidekick Stevie whenever they do something truly stupid or immoral :-)
So how's the market for Troubleshooters? There's an old Chinese curse: "May you live in interesting times". In 2001, that curse is upon us.
A recession is coming, and word is out it might be bad. That's bad news for everyone who works. Given what's happened to NASDAQ and the dotcoms, the news is especially bad for technologists. With many workers competing for decreased work, transferable skills may make the difference. Certainly sales and software analysis and design are such transferable skills. And so is Troubleshooting skill, which has the effect of making the technologist look very smart and very indispensable. The past few years were times when anyone could get a job. Now being a Troubleshooter will greatly enhance your job security.
Be VERY glad you're a Troubleshooter in 2001.
This year's lone best article voter has quite a story to tell. He was
at a party in Berlin, Germany, and was disappointed that he'd missed voting
for the third consecutive year. Then a fellow partier pointed out it was
still December 31 in the United States, where Troubleshooters.Com originates.
So the lone voter rushed to his computer and voted. He voted both best
issue and best article, both for 2000 and for all time. Here are his picks,
in his own words:
1. My all time favorite issue is "Self documenting code", August 1999.
2. My favorite this year is the last one, "The many faces of troubleshooting and problem solving".
3. My all time favorite article is "Rapid learning: one man's story" in the December 1997 issue.
4. My favorite article this year is "Why easiest first boosts productivity", August.
Even more intriguing is the Best Issue vote. There was a voter who voted best issue without voting best article. His vote for best issue was the November issue, Annual Linux Showcase!.
So that's it. By a margin slimmer than the Florida presidential vote (and yet by 100% of the ballots cast), the best article was August's Why Easiest First Boosts Productivity. The best issue is "too close to call" -- a tie between December 2000, The Many Faces of Troubleshooting and Problem Solving, and November 2000, Annual Linux Showcase!.
Speaking of tradition, for the fourth year in a row I will name my favorite issue and top 5 favorite articles :-).
To me the best issue is obvious -- December 2000, The Many Faces of Troubleshooting and Problem Solving. It discusses generic problem solving, the subset relationship of system independent troubleshooting to generic problem solving, and several generic problem solving processes and tools. In my opinion, the December issue is absolutely unique on the Internet, and a much needed resource.
The most visited Troubleshooting Professional of 2000 was April 2000, Apache, Apachecon and PHP, which contains a complete tutorial for a PHP/Postgres web app. May 2000, Troubleshooting Automotive Overheating, is heavily visited as one of the web's most comprehensive references on the automotive cooling system and its problems, diagnoses, and solutions. The November issue, Annual Linux Showcase!, is essential reading for anyone wanting to make money within the realm of Open Source.
2000 offered several strong issues whose every article was strong, but there were few articles rising above the crowd. That makes it hard to choose best articles. Once again, there were no Troubleshooting short stories like Gavin Gray and The Man who Banned General Maintenance. Troubleshooting Professional Magazine has become more informative and less editorial over the years. There was a funny Linux advocacy poem (Linux Log: I'm Bringin My Box) that just missed the top 5.
So without further ado, the top five (as I see them) TPM stories for 2000:
1. Finally, the myth of the one size fits all problem solving methodology is debunked by a simple analogy.

2. A detailed explanation of the ever popular Theory of Constraints, both in a machine context and in its native factory context.

3. Data Enabled App Using PostgreSQL: Soup to nuts construction of a simple PHP/Postgres web app you can do on your Linux box.

4. Linux Log: The Unexpected Security Hole: Somebody had to say it. Not all exploits come from traditional crackers. Sometimes your worst security nightmare comes in the form of the big software vendors' lawyers, and from the government.

5. ... and Inline Skates: A simple speedskating analogy debunks the common assertion that "Linux will never make it on the desktop".
A warm thank you for a tough job well done goes out to this year's sole guest author, Steve Epstein, for his July 2000 article, "Using Ipchains".
Last but not least, here are the answers to our trivia contest:
Quote -- Article containing the quote

"Gary Kildall has reached out of the grave and grabbed Bill Gates' ankle." -- Gary's Revenge, June 1999 TPM (More Heroes, and a Trip to Linux Expo).

"Then, just for a second, a big red thing with three silver eyeballs and huge yellow fangs appears in the closet door." -- If you're not part of the solution..., January 1998 TPM (The Revolution, One Year Later).

"A few days later, as Communism crumbled throughout Europe, The Attitude became the third Troubleshooting Tool." -- The Discovery of the Attitude, October 1997 TPM (The Attitude).

"I've been the worst, and I've been the best, and the only difference is what I learned about Troubleshooting Process." -- The Loser, February 2000 TPM (Natural Born Troubleshooters).

"There's slushy snow on the ground, and like a couple fourth graders the tycoon and the pauper trudge through the muck." -- Gavin Gray, November 1997 TPM (Gavin Gray).

"Like so many before me, I rumbled down the highways paralleling the old route 66 with nothing but an ancient overpacked car and the age-old dream of jobs and sunshine where old 66 ended at a sign reading "Santa Monica Pier"." -- Editors Desk, November 1999 TPM (Outlines: The Do Everything Tool).

"The girls are all high school seniors (some for the second time), who look 30, walk with too much slink and smile with too much smirk." -- A Legend Before His Time, March 1997 TPM (Teaching Troubleshooting Process).

"Smiles becomes wide-eyed surprise as the wiseguys in the Mustang see me pass, with the front tires a few inches off the ground." -- A Supercomputer in Every Kitchen, May 1998 TPM (Free Software).

"And above all, if you don't want to see him cry, don't tell him you set up an Intranet with virtual hosting, DNS and full CGI (which can interface to Oracle or Sybase or free PostgreSQL), in four hours on a $600 computer." -- Wrapup, November 1998 TPM (Linux Issue).

"Richard Stallman wrote the manifesto. Linus Torvalds proved it worked." -- Linus, May 1999 TPM (Where Have All the Heroes Gone?).
In my case, I had the advantage of working with a ninja systems analyst, who verbally laid out the design to a T -- clear and right on the money. Obviously the 3 days would have ballooned to a week with a lesser systems analyst, or a committee, or a not-too-bright user. Yep, the guy who gave me the specs was one of the best systems analysts I've encountered. So the question becomes: would it still be worth doing the Perl version first in such a situation?
You bet it would! Imagine if a lesser systems analyst, committee, or not-too-bright user gave specifications directly to the Java team without first doing the quick Perl version! Can you say Death March? Widowmaker? Career Limiter? By creating a quick initial Perl version the organization would uncover the specification flaws.
I'm convinced that a first version in a super quick language, like Perl or Python, done by an ace programmer, is a precursor to success.
So how can my results be reproduced?
First, you need an excellent programmer familiar not only with Perl,
but also with the modules that come with Perl, most of which can be found
in the Perl distribution itself or on CPAN.
The programmer must have experience whipping out quick apps. And unless the programmer is lucky enough to be paired with an extraordinary systems analyst, he must also have excellent analysis and design capabilities.
The programmer must have close and frequent access to the organization's subject matter expert(s). There's no way the programmer can walk in and know the behind-the-scenes technology, or how the organization wants to use that technology.
The programmer must have access to those who can give frequent and reasonable feedback. By reasonable I mean feedback on show stoppers, not on small aesthetic matters. That's because a main benefit of the Perl-first design paradigm (I've always wanted to use that word in a non-pretentious way, and almost succeeded :-) is early dead-end analysis. If the project is going to fail, or encounter a major regrouping, it's better to find out before spending countless days and months on little user interface tweaks.
The programmer must be able to convey to management a confidence that he can produce results quickly. There are many "quack" programmers running around, so management is rightfully skeptical.
Unless the programmer is one of the few who can also do documentation, the programmer must be provided with, and give his time to, a local documentation expert.
The programmer must work hand in hand with the local team who will eventually do "the real program". The programmer must not view the "real programmers" as his competition, but instead as the next guy in a relay race. The programmer must fully convey to the team doing "the real program" that he's working with them, not against them. The quick Perl programmer being described in this article makes his money on lots of quick jobs, not on a job that lasts for months or years.
The programmer must be paid commensurate with his ability to work miracles.
First, be sure you're familiar with whatever Perl modules are available to do the magic they're asking for. You don't have time to write this stuff yourself. Each of the CPAN modules is a tool helping you disassemble a storage crate containing gold. E-gold, if you will. Here are just a few of the 24 carat modules available on CPAN. Note that later versions may be available:
If you're into web apps, be sure to look at CGI.pm-2.72.tar.gz, CGI-Lite-2.0.tar.gz, CGI-QuickForm-1.90.tar.gz, CGI-Screen-0.122.tar.gz, CGI-WebsiteGenerator-0.3201.tar.gz, CGI-XML-0.1.tar.gz, CGI-XMLForm-0.10.tar.gz, CGI-modules-2.76.tar.gz, CGI.pm-2.74.tar.gz.
For mime messages, check out MIME-Lite-2.106.tar.gz, MIME-tools-5.410.readme.
If you want to search and/or pull content from the net, consider libwww-perl-5.48.tar.gz, WWW-Search-2.15.tar.gz, WWW-Search-AlltheWeb-1.5.tar.gz, sitemapper-1.019.tar.gz.
For working with Uniform Resource Identifiers, see URI-Find-0.04.readme, URI-Bookmarks-0.92.tar.gz, URI-1.09.readme.
Want to put a quickie GUI interface in a program? Try Tk800.022.tar.gz. Note there are a lot of other tk modules you can use for things like Tk-XMLViewer-0.12.tar.gz to make a graphical XML tree viewer, Tk-TreeGraph-1.023.tar.gz for drawing trees, Tk-SlideShow-0.06.tar.gz for making slideshows, Tk-ProgressBar-1.0.tar.gz for building a progress bar, Tk-Clock-0.06.tar.gz for making a clock, Tk-ContextHelp-0.06.tar.gz for context sensitive help for your app.
Click here to see a few additional perl modules available on CPAN.
Hopefully you're familiar with many of the CPAN modules, as well as other handy Perl modules located elsewhere. After all, you're a professional hired gun.
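One quick trick for the hired gun: before promising anything, find out which of these modules the client's machine already has. A minimal sketch (the module list below is just a sample drawn from the lists above; substitute whatever your project needs):

```perl
#!/usr/bin/perl -w
# Report which handy CPAN modules are already installed on this machine.
# The list is illustrative; swap in whatever modules your project needs.
use strict;

my @wanted = qw(CGI MIME::Lite LWP::UserAgent Tk);
for my $mod (@wanted) {
    # require at runtime; eval traps the die if the module is missing
    if (eval "require $mod; 1") {
        print "$mod: installed\n";
    } else {
        print "$mod: fetch from CPAN\n";
    }
}
```

Run this on day one and you know immediately whether you'll spend your first hour coding or installing from CPAN.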
As you write down your spec, or translate it to pseudocode, assume you'll be able to do the geeky stuff, and make stub functions or classes to handle the geeky stuff. Be sure to give the stubs meaningful names.
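For instance (the spec and names here are hypothetical, purely for illustration), pseudocode like "pull the open orders, total them, mail the report" might become a skeleton of meaningfully named stubs that runs end to end before any of the geeky work is done:

```perl
#!/usr/bin/perl -w
# Spec translated to meaningfully named stubs. Each stub returns dummy
# data so the program's skeleton runs end to end from day one.
use strict;

sub get_open_orders {
    # STUB: will eventually query the real order system
    return ({id => 1, amount => 100}, {id => 2, amount => 250});
}

sub total_orders {
    my @orders = @_;
    my $total  = 0;
    $total += $_->{amount} for @orders;
    return $total;
}

sub mail_report {
    # STUB: will eventually hand the total to MIME::Lite or similar
    my ($total) = @_;
    print "REPORT: open order total is \$$total\n";
}

mail_report(total_orders(get_open_orders()));
```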
The first day on the job, isolate the critical technical tasks to be performed and create "hello world" code to prove their concepts. Be sure the person you report to sees your proofs of concept, as that will give you the credibility and support to work another day. For a hired gun, 1 day is about the maximum "honeymoon period" you can expect. These critical technical tasks will eventually be mapped to the stub routines or classes you made during translation of the specification, or else they'll be mapped to brand new functions.
When coding Perl, I always have a subroutine called main() that serves as the main routine. I do that to localize variables, but it provides another advantage as well. During initial coding I might be working on or testing a single function or class. In that case, I'll comment out the call to main() and simply code a call to the new class or function at the bottom. Once the class or function operates correctly, I'll place a call to it in its proper place in main(), delete the test code at the bottom, and re-enable the call to main() at the bottom. I iterate through the major functions and objects until the program's actions and behavior begin to resemble the specified actions and behavior. At that point my task starts to resemble maintenance programming, as I add one new piece of functionality at a time.
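A minimal sketch of that habit (the function and its job are made up for illustration):

```perl
#!/usr/bin/perl -w
# The main() habit: all real work lives in main(), localizing variables.
use strict;

sub parse_line {
    # the new function currently under development
    my ($line) = @_;
    my ($name, $value) = split /=/, $line, 2;
    return ($name, $value);
}

sub main {
    my ($name, $value) = parse_line("color=blue");
    print "$name is $value\n";
}

# While developing parse_line(), comment out main() below and
# exercise the new function directly:
#   my @pair = parse_line("size=large");
#   print "@pair\n";

# Once parse_line() works, delete the test call and re-enable main():
main();
```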
Once you get the program working substantially as specified, show it off and get feedback. This is where specification changes are pointed out. Keep iterating spec changes and further development until the client has exactly what he wants. Document if you can, or offer to help the client's in-house documentation expert. If the client needs hand holding, stay long enough to hold his hand. Once the project is done, collect your money, get everyone's email addresses (these are good professional references), and leave. Write thank-you emails soon after.
|WARNING: I am not a lawyer. Everything expressed in this section is my personal interpretation. This is not a substitute for hiring your own lawyer. Use my information at your own risk. I'm not responsible for any problems or losses you incur through the use of my information, even if this information is blatantly incorrect.|
If your business dealings are anything like mine, clients using you as a work-for-hire contract programmer or an employee will not be particularly anxious to make the work-for-hire Open Source. And unless you're in a tremendous bargaining position, you're not going to spend much time convincing them to Open Source your work-for-hire. If you're anything like me, you'll spend all your bargaining chips negotiating away hold-harmless clauses, non-compete clauses, non-disclosure agreements which are truly non-compete clauses, and, if you're an employee, clauses stating anything you discover, invent or publish belongs to your employer (didn't Lincoln free the slaves?). And last but not least, you need to negotiate your salary or hourly rate.
The preceding paragraph is an introduction to the concept that you may not be able to use GPL software as tools. With its aggressive (some call it viral) "copyleft" copyright, using a single GPL tool in an in-house app could make the entire app GPL. And if there's any anticipation that the app will ever be loaded outside the corporation, be careful. The GPL license says that a redistributed app must be handed over to the other entity, with source code. If that code contains trade secret business rules, they're no longer secret. Unfortunately, the very same copyleft licensing that makes GPL so resistant to proprietary corporate snarfs lessens its usefulness as a tool used in work-for-hire software.
Most Perl modules are distributed under a disjunctive license featuring
both GPL and the Artistic license. What that means is you can redistribute
them under either, or both. I don't like the Artistic license because it
basically says "you can contribute to my project, but I keep all control".
But the Artistic License contains a clause that's ideal for those wanting
to use its code as a tool in a non-open source program:
|8. Aggregation of this Package with a commercial distribution is always permitted provided that the use of this Package is embedded; that is, when no overt attempt is made to make this Package's interfaces visible to the end user of the commercial distribution. Such use shall not be construed as a distribution of this Package.|
Looks to me like the Artistic license specifically permits use as a tool, thus fulfilling Jon "maddog" Hall's Value Added Model, and the lessons of The Artist.
As a personal aside, I'd love to see a license (maybe called the Tool
License) that is the GNU GPL with the Artistic License's clause 8 added.
That would forever protect the tool itself, while allowing programmers
to make good money *using* the tool.
|GNU GPL||Yes||Aggressive copyleft most likely to be interpreted as making any software that links to it or accesses it via an include statement GPL also. A GPL-like license with a clause like clause #8 of the Artistic license IMHO would be perfect for a tools license, but no such thing exists.|
|GNU Lesser General Public License||Yes||Pro: The only license to ban your tool from containing code that depends on the (proprietary) app. Con: The license language requiring that the customer be able to recompile is difficult to comply with.|
|The License of Python 1.6a2 and earlier versions||Yes||Contains a clause saying interpreted and governed by Virginia law. Virginia is a UCITA state. Be VERY afraid!||If nothing else were available.||http://www.handle.net/python_licenses/python1.6_9-5-00.html|
|The X11 license||Yes||Simple, and appears to allow use within proprietary tools. However, no clear definition of "modification" exists, so it's possible for a case to be made that the entire encompassing app is merely a modification of the tool, and therefore X11 licensed software itself. I doubt it would come to that.||Yes||http://www.x.org/terms.htm|
|Cryptix General License||Yes||Simple, and appears to allow use within proprietary tools. However, no clear definition of "modification" exists, so it's possible for a case to be made that the entire encompassing app is merely a modification of the tool, and therefore Cryptix licensed software itself. I doubt it would come to that.||Yes||http://www.cryptix.org/docs/license.html|
|The modified BSD license||Yes||Simple, and appears to allow use within proprietary tools. However, no clear definition of "modification" exists, so it's possible for a case to be made that the entire encompassing app is merely a modification of the tool, and therefore BSD licensed software itself. I doubt it would come to that. Caution: Earlier versions of the BSD license contained an obnoxious advertising clause and were not GPL compatible.|
|The license of ZLib||Yes||Simple, and appears to allow use within proprietary tools. However, no clear definition of "modification" exists, so it's possible for a case to be made that the entire encompassing app is merely a modification of the tool, and therefore Zlib licensed software itself. I doubt it would come to that.||Yes||ftp://ftp.freesoftware.com/pub/infozip/zlib/zlib_license.html|
|Artistic license and the GNU GPL disjunctive||Yes||The Artistic license's clause #8 guarantees the right to use this as a tool. It's unfortunate that the Artistic license places restrictions on modifications, making it unpalatable to contribute to Artistic tools. However, the (disjunctive) GPL license enables one to contribute, while the (disjunctive) Artistic license gives one the ability to use it as a tool in any app, open or proprietary. If your app is written in Perl, this is an excellent choice. If you write a Perl tool, this is an excellent choice.|
|Non-GPL compatible licenses||No||These may be useful as tools, but their use muddies the licensing waters. I recommend that before you use a tool with one of these licenses, you make a concerted effort to find a tool licensed with a tool-practical GPL compatible license. Note also you must carefully evaluate the license to determine whether the license allows you to use the tool as a tool in a proprietary or different-licensed app.||Maybe|
** If, and only if, my client could assure me that the encompassing app would be used only in-house, never placed on any machine outside the organization, never sold or redistributed, or if the client assured me the encompassing app could be licensed GPL, I would use GPL licensed tools.
|Cheaper software||We all want concisely engineered, full featured software written in a compiled language. But sometimes the organization can't pay for it. At such times it's better to have a lesser app, costing a few programmer days, than no app at all.|
|As preparation for concisely engineered, full featured software written in a compiled language|
|Lower total design cost||Many times the organization can afford concisely engineered, full featured software written in a compiled language, and will settle for nothing less. Even in such cases, a quick lesser app can save money. The quick app gives guidance to the crew writing the full featured app, and user feedback on the quick app helps the full featured crew avoid pitfalls. In many cases more is saved on the writing of the full featured app than was spent on the quick app.|
|Better final design||Time pressure has a way of compromising good design. Because the quick app is implemented in a few days, the crew doing the full featured app do not need to instantly roll out their product. They can fully design the product, thus providing a better final product.|
|Better customer service||No matter how quickly the full featured crew designs and codes, they'll never match the rollout of a hired gun with open source tools and riffs to support them. The earlier rollout satisfies the customer.|
1999 is gone. No more IPOs, instant dot com billionaires, and living off the largess of stockholders. No more blank check budgets with which to pay the Microsoft tax and $2000/seat development environments. There's a recession on the way, and it looks like it will be a nasty one. Corporations can no longer afford the sloppy luxury of multi-programmer-year behemoths that are obsolete before rollout.
Today's successful IT department must pursue value. Who better to deliver or contribute to software development value than a programmer conversant in the methodology of The Artist, and using tools conforming to Jon "maddog" Hall's Value Added Revenue Model?
After years of seemingly senseless rejection of value in favor of monopolistic junk, our day may be at hand.
Open Sourcing the work-for-hire program lends some extra credentials to the work, and if the resulting software is made GPL, it allows you to use GPL tools. Just be sure the client knows they must provide source for the program to anyone to whom the program is given.
In my opinion, Open Sourcing the app isn't the main goal. The main goal is for you to make money, the client to get what they want, and for Open Source to be strengthened. I truly believe that just the use of Open Source tools strengthens Open Source. But if the client agrees to Open Source the application, that's some mighty sweet icing on the cake.
But next month's Linux Log will discuss some possibilities. In the meantime, please email me with your suggestions on selling this type of service. I need all the help I can get.
To become part of this revolution, learn an interpreter and several important modules, then sell your services. Beware of possible licensing gotchas, especially the fact that GPL tools probably cannot be linked to proprietary apps.
Perhaps the easiest route to becoming a software hired gun is Perl, combined with the immense collection of modules available on CPAN. There really is gold in them CPAN hills.
By submitting content, you give Troubleshooters.Com the non-exclusive, perpetual right to publish it on Troubleshooters.Com or any A3B3 website. Other than that, you retain the copyright and sole right to sell or give it away elsewhere. Troubleshooters.Com will acknowledge you as the author and, if you request, will display your copyright notice and/or a "reprinted by permission of author" notice. Obviously, you must be the copyright holder and must be legally able to grant us this perpetual right. We do not currently pay for articles.
Troubleshooters.Com reserves the right to edit any submission for clarity or brevity. Any published article will include a two sentence description of the author, a hypertext link to his or her email, and a phone number if desired. Upon request, we will include a hypertext link, at the end of the magazine issue, to the author's website, providing that website meets the Troubleshooters.Com criteria for links and that the author's website first links to Troubleshooters.Com. Authors: please understand we can't place hyperlinks inside articles. If we did, only the first article would be read, and we can't place every article first.
Submissions should be emailed to Steve Litt's email address, with subject line Article Submission. The first paragraph of your message should read as follows (unless other arrangements are previously made in writing):