
Saturday, July 07, 2018

What were some jobs that existed 50 years ago but have largely disappeared today?

We often hear that we're in a time of change, but this observation isn't really news. We've been in a time of change for my whole lifetime, and well before that. Many jobs that once existed are no longer available, and many have even disappeared from memory.

We were challenged recently to recall some jobs that have disappeared in the past 50 years, and it was great fun reading all the answers, many of which described jobs I once held back in my youth. I go back a bit more than 50 years, though, so I have a few more to add.

The first, most obvious omission that popped into my mind was the iceman. In the 1930s, my family had an icebox (not a refrigerator, but an actual box that held a block of ice). The iceman’s horse-drawn wagon would come around and be surrounded by us kids, hoping to get free shards of ice created when he cut up blocks to fit our iceboxes.

Another job only briefly mentioned was typesetting. I never held that job, but I was trained for manual typesetting for a semester in high school. At least I know where terms like upper-case and lower-case come from.

Someone also mentioned keypunch operator, a task (not a job) that was often done by prisoners who were literally chained to their keypunch machines. What wasn't mentioned, however, was the key verifier operator. Not many people today have ever seen a verifier machine, let alone know what one was.

Even before my time, there were jobs that disappeared, but which I read about in a nineteenth century book about jobs for women. The final two chapters in the book were about a couple of sure-fire women’s jobs for the future (1900 was then the future).

First chapter was about telegraph operators. The chapter “proved” that there was a great future for women because they could operate a telegraph key at least as fast as men (and the telephone had yet to be invented).

Second chapter was about picture tinters. There was, of course, no color photography, and it wasn’t really even conceived of. Women were supposedly much better at coloring photos because of their “artistic bent” and their more delicate hands. Though there are a few photo tinters still around today for special jobs, it’s not a career with a great future.

It's fun to think about these forgotten jobs, but they're also a source of important knowledge, or perhaps even wisdom. Job disappearance is not some new phenomenon caused by computers. It's always gone on through history. True, some jobs lasted a long time, so long that they were passed down from generation to generation, even becoming family names, such as Smith, Turner, Eisenhower, Baker, and Miller. (See, for example, <Meaning of Surnames> for hundreds of examples.)

Some of those jobs still exist, though often modified by new technology. Do you still recognize Fuller, Chandler, or Ackerman? And many others have largely disappeared, remaining only in some special niche, like photo tinters. Do you know anybody named Armbruster who still makes crossbows? Well, you probably know a few Coopers, but how many of them still make barrels?

So, what's the lesson for your own future? If you're as old as I am, you probably don't have to worry about your job disappearing, but even my "job" as a writer is changing rapidly with new technology. Even if your type of job doesn't disappear entirely, you will be faced with changes.

I think your preparation for job changes will be the same as your preparation for changed jobs: increased adaptability. Today's market tends to reward specialization, but when you become totally specialized, you become the victim of change. Think what's happened to all those COBOL experts from a few years ago.

I'd suggest that you take advantage of the rewards of specialization but invest a small percentage of your time in learning something new. Always. Keep your mind flexible for a future none of us can predict.

p.s. Minutes after I posted this blog, several readers wrote:

Your first job, "computer," also disappeared. How long was that job around? (Kind of surprised you did not mention it in the blog post.)
----
Well, that shows I'm a human being. What's that saying about shoemakers' children going barefoot? It never occurred to me to consider my 'computer' job as disappearing, but of course it has been largely taken over by machines. Thanks, readers.

Oh, and they named some more, including switchboard operator, another job I once held.

Maybe you folks could add more via comments here.

Thursday, July 13, 2017

Guaranteeing a Consultant's Future


I'm frequently asked, "What is the best thing a consultant can do to guarantee his or her future?"

Start by recognizing that there's no way to guarantee anything about the future. If a giant asteroid hits the earth, I doubt that many consultants will survive.

A giant asteroid may be unlikely, but, for example, a giant world-wide Depression is a possibility we've already experienced more than once.

"What about joining one of the large, established consulting firms?"

Yes, there's a certain stability in an established firm, but nothing guaranteed. Lots of consultants get themselves fired from such firms, and the firms themselves sometimes fold.

The same dangers apply to founding your own firm. There's some safety in numbers, but no guarantees. By and large, you have to take care of your future yourself. There are a few strong things any consultant can do to help ensure that future, but again, nothing is guaranteed.

Definitely save your money. Although you cannot absolutely guarantee anybody’s future, having financial resources will come as close as anybody can come.

Money in the bank will even guarantee your present—protecting you from the temptation to take the kind of unwholesome assignments that kill a consulting practice.

However, there’s one thing even more important than money. Health. Make it your first priority to do whatever you need to do to remain healthy.

Think of it this way: You are the number one tool of your consulting business, so never compromise your health for your business. You cannot receive a second body to replace the one you were given, so take care.




Sunday, April 02, 2017

Complexity: Why We Need General Systems Thinking


It isn’t what we don’t know that gives us trouble, it’s what we know that ain’t so. - Will Rogers

The first step to knowledge is the confession of ignorance. We know far, far less about our world than most of us care to confess. Yet confess we must, for the evidences of our ignorance are beginning to mount, and their scale is too large to be ignored!

If it had been possible to photograph the earth from a satellite 150 or 200 years ago, one of the conspicuous features of the planet would have been a belt of green extending 10 degrees or more north and south of the Equator. This green zone was the wet evergreen tropical forest, more commonly known as the tropical rain forest. Two centuries ago it stretched almost unbroken over the lowlands of the humid Tropics of Central and South America, Africa, Southeast Asia and the islands of Indonesia.

... the tropical rain forest is one of the most ancient ecosystems ... it has existed continuously since the Cretaceous period, which ended more than 60 million years ago. Today, however, the rain forest, like most other natural ecosystems, is rapidly changing. ... It is likely that, by the end of this century very little will remain. - Karl Deutsch 

This account may be taken as typical of hundreds filling our books, journals, and newspapers. Will the change be for good or evil? Of that, we can say nothing—that is precisely the problem. The problem is not change itself, for change is ubiquitous. Neither is the problem in the man-made origin of the change, for it is in the nature of man to change his environment. Man’s reordering of the face of the globe will cease only when man himself ceases.

The ancient history of our planet is brimful of stories of those who have ceased to exist, and many of these stories carry the same plot: Those who live by the sword, die by the sword. The very source of success, when carried past a reasonable point, carries the poison of death. In man, success comes from the power that knowledge gives to alter the environment. The problem is to bring that power under control.

In ages past, the knowledge came very slowly, and one man in his life was not likely to see much change other than that wrought by nature. The controlled incorporation of arsenic into copper to make bronze took several thousand years to develop; the substitution of tin for the more dangerous arsenic took another thousand or two. In our modern age, laboratories turn out an alloy a day, or more, with properties made to order. The alloying of metals led to the rise and fall of civilizations, but the changes were too slow to be appreciated. A truer blade meant victory over the invaders, but changes were local and slow enough to be absorbed by a million tiny adjustments without destroying the species. With an alloy a day, we can no longer be sure.

Science and engineering have been the catalysts for the unprecedented speed and magnitude of change. The physicist shows us how to harness the power of the nucleus; the chemist shows us how to increase the quantity of our food; the geneticist shows us how to improve the quality of our children. But science and engineering have been unable to keep pace with the second-order effects produced by their first-order victories. The excess heat from the nuclear generator alters the spawning pattern of fish, and, before adjustments can be made, other species have produced irreversible changes in the ecology of the river and its borders. The pesticide eliminates one insect only to the advantage of others that may be worse, or the herbicide clears the rain forest for farming, but the resulting soil changes make the land less productive than it was before. And of what we are doing to our progeny, we still have only ghastly hints.

Some have said the general systems movement was born out of the failures of science, but it would be more accurate to say the general systems approach is needed because science has been such a success. Science and technology have colonized the planet, and nothing in our lives is untouched. In this changing, they have revealed a complexity with which they are not prepared to deal. The general systems movement has taken up the task of helping scientists unravel complexity, technologists to master it, and others to learn to live with it.

In this book, we begin the task of introducing general systems thinking to those audiences. Because general systems is a child of science, we shall start by examining science from a general systems point of view. Thus prepared, we shall try to give an overview of what the general systems approach is, in relation to science. Then we begin the task in earnest by devoting ourselves to many questions of observation and experiment in a much wider context. 

And then, having laboriously purged our minds and hearts of “things we know that ain’t so,” we shall be ready to map out our future general systems tasks, tasks whose elaboration lies beyond the scope of this small book.

[Thus begins the classic, An Introduction to General Systems Thinking]

Monday, February 20, 2017

How Long Can I Remain a [Ruby, Java, C++, Python, …] Programmer?

Several respondents to an earlier post have asked me about the future prospects for workers in one programming language or another. Here's my best answer.

As others have said, "I can predict anything but the future." But also others have said that the only things we know about the future are what we know from the past. Therefore, you might get some idea of your future as a [Ruby …] programmer from the answers to a recent Quora question, "What were some jobs which existed 50 years ago but have largely disappeared today?"

It was great fun reading all these answers, many of which described jobs I held back then. I go back a bit more than 50 years, though, so I have a few more to add. Most obvious omission was the iceman. In the 1930s, we had an icebox (not a refrigerator, but an actual box that held a block of ice). The iceman’s horse-drawn wagon would come around regularly and be surrounded by us kids, hoping to get free shards of ice caused when he cut up little blocks to fit our iceboxes.

Another job only briefly mentioned was typesetting. I never held that job, but I was trained for manual typesetting for a semester in high school. At least I know where terms like upper-case and lower-case come from.

Someone also mentioned keypunch operator, a task (not a job) that was often done by prisoners who were literally chained to their machines. What wasn't mentioned, however, was the key verifier operator. Not many people today have ever seen a verifier, let alone know what one was.

Even before my time, there were jobs that disappeared, but which I read about—for instance, in a nineteenth century book about jobs for women. The final two chapters in the book were about a couple of sure-fire women’s jobs for the future (1900 was then the future).

First chapter was about telegraph operators. The chapter “proved” that there was a great future for women because they could operate a telegraph key at least as fast as men (and the telephone had yet to be invented).

Second chapter was about picture tinters. There was, of course, no color photography, and it wasn’t really even conceived of. Women were supposedly much better at coloring photos because of their “artistic bent” and their more delicate hands. Though there are a few photo tinters still around today for special jobs, it’s not a career with a great future.

By the way, one future job for women that wasn't even mentioned in the book was typist (or amanuensis), in spite of the then-recent exciting invention of the typewriter. Other sources explained that women would never be typists because everyone knew that women were not good with machines.

It's fun to think about these forgotten jobs, but they're also a source of important knowledge, or perhaps even wisdom. Job disappearance is not some new phenomenon caused by computers. It's always gone on through history. True, some jobs lasted a long time, so long that they were passed down from generation to generation, even becoming family names, such as Smith, Turner, Eisenhower, Baker, and Miller. (See, for example, <surnames.behindthename.com> for hundreds of examples.)

Some of those jobs still exist, though often modified by new technology. Do you still recognize Fuller, Chandler, or Ackerman? And many others have largely disappeared, remaining only in some special niche, like photo tinters. Do you know anybody named Armbruster who still makes crossbows? Well, you probably know a few Coopers, but how many of them still make barrels?

No job is guaranteed. Nothing entitles you to hold the same job for life, let alone pass the job down to your children. So, for example, if you think of yourself as a "[Ruby…] Programmer," perhaps you'd better prepare yourself for a future with a more general job description, such as "programmer" or "problem-solver."

In fact, there's a whole lot of people out there who think (or hope) the job of "programmer" will disappear one of these days. Some of them have been building apps since the time of COBOL that would "eliminate programmers." I've mocked these overblown efforts for half a century, but history has tried to teach me to be a bit more humble. Whether or not they succeed in your lifetime, you might want to hedge your bets and keep learning additional skills. Perhaps in your lifetime we'll still need problem-solvers and leaders long after we've forgotten the need for Chamberlains and Stringers.


Wednesday, August 24, 2016

Should I write my best book first?

The questioner wrote: "I have an idea for a great book, but I only have this one idea. I worry that if I write this book and it doesn’t go well, I might end up discouraged, with no more great ideas to write about."


Someone once said, “There’s nothing as dangerous as an idea—especially if it’s the only idea you have.” 

By themselves, great ideas for books are essentially worthless. Hardly a month goes by without some eager soul telling me they have a great idea for a book that they want me to write in partnership with them. They simply don’t understand that it’s not the idea that’s worth anything, but only the writing of it.

99+% of “great ideas” never get written. Even though I’ve published over 100 books, I have had hundreds more “great ideas” that I’ve not (yet) written. I certainly don't need somebody else's great idea in order to have something great to write about.

So, either stop asking meaningless questions and write your book, or just file this “great idea” in the wastebasket and get on with your life. Chances are you’ll have hundreds of other great ideas in your life.


If you decide to write your book, or if you're simply trying to decide, you’ll want to read Weinberg on Writing: the Fieldstone Method.

Tuesday, August 02, 2016

Better Estimating Can Lead to Worse Performance


This blog post is a new chapter, just added to my book, Do You Want To Be a (Better) Manager.

One of the (few) advantages of growing old is gaining historical perspective, something sorely lacking in the computing business. Almost a lifetime ago, I wrote an article about the future dangers of better estimating. I wondered recently if any of my predictions came to pass.

Back then, Tom DeMarco sent me a free copy of his book, Controlling Software Projects: Management, Measurement and Estimation. Unfortunately, I packed it in my bike bag with some takeout barbecue, and I had a little accident. Tom, being a generous man, gave me a second copy to replace the barbecued one.

Because Tom was so generous, I felt obliged to read the book, which proved quite palatable even without sauce. In the book, Tom was quite careful to point out that software development was a long way from maturity, so I was surprised to see an article of his entitled "Software Development—A Coming of Age." Had something happened in less than a year to bring our industry to full growth?

As it turned out, the title was apparently a headline writer's imprecision, based on the following statement in the article:

"In order for the business of software to come of age, we shall have to make some major improvements in our quantitative skills. In the last two years, the beginnings of a coherent quantitative discipline have begun to emerge…"

The article was not about the coming of age of software development, but a survey of the state of software project estimation. After reviewing the work of Barry Boehm, Victor Basili, Capers Jones, and Lawrence Putnam, DeMarco stated that this work

"…provides a framework for analysis of the quantitative parameters of software projects. But none of the four authors addresses entirely the problem of synthesizing this framework into an acceptable answer to the practical question: How do I structure my organization and run my projects in order to maintain reasonable quantitative control?"

As I said before, Tom is a generous person. He's also smart. If he held such reservations about the progress of software development, I'd believe him, and not the headline writer. Back then, software development had a long way to go before coming of age.

Anyway, what does it mean to "come of age"? When you come of age, you stop spilling barbecue sauce on books. You also stop making extravagant claims about your abilities. In fact, if someone keeps bragging about how they've come of age, you know they haven't. We could apply that criterion to software development, which has been bragging about its impending maturity now for over forty years.

Estimates can become standards

One part of coming of age is learning to appraise your own abilities accurately—in other words, to estimate. When we learn to estimate software projects accurately, we'll certainly be a step closer to maturity—but not, by any means, the whole way. For instance, I know that I'm a klutz, and I can measure my klutziness with high reliability. To at least two decimal places, I can estimate the likelihood that I'll spill barbecue sauce on a book—but that hardly qualifies me as grown up.

The mature person can not only estimate performance, but also perform at some reasonably high level. Estimating is a means mature people use to help gain high performance, but sometimes we make the mistake of substituting means for ends. When I was in high school, my gym teacher estimated that only one out of a hundred boys in the school would be able to run a mile in under ten minutes. When he actually tested us, only 13 out of 1,200 boys were able to do this well. One percent was an accurate estimate, but was it an acceptable goal for the fitness of high school boys? (Back in those days, the majority of us boys were smokers.)

This was a problem I was beginning to see among my clients. Once they learned to estimate their software projects reasonably well, there was a tendency to set these estimating parameters as standards. They said, in effect: "As long as we do this well, we have no cause to worry about doing any better." This might be acceptable if there was a single estimating model for all organizations, but there wasn't. DeMarco found that no two of his clients came up with the same estimating model, and mine were just as diverse.

Take the quality measure of "defects per K lines of code." Capers Jones had cited organizations that ranged on this measure from 50 to 0.2. This range of 250-1 was compatible with what I found among my own clients who measured such things. What I found peculiar was that both the 50-defect clients and the 0.2-defect clients had established their own level as an acceptable standard.

Soon after noticing this pattern, I visited a company that was in the 150-defect range. I was fascinated with their manager's reaction when I told him about the 0.2-defect clients. First he simply denied that this could be true. When I documented it, he said: "Those people must be doing very simple projects, as you can see from their low defect rates."

When I showed that they were actually doing objectively harder projects, he reacted: "Well, it must cost them a lot more than we're able to pay for software." 

When I pointed out that it actually cost them less to produce higher quality, he politely decided not to contract for my services, saying: "Evidently, you don't understand our problem."

Of course, I understood his problem only too well—and he was it. He believed he knew how to develop software, and he did—at an incredibly high cost to his users.

His belief closed his mind to the possibility of learning anything else about the subject. Nowadays, lots of managers know how to develop software—but they each know different things. One of the signs of immaturity is how insular we are, and how insecure we are with the idea of learning from other people.

Another sign of immaturity is the inability to transfer theoretical knowledge into practice. When I spill barbecue sauce on books, it's not because I think it will improve the books. I know perfectly well what I should do. But I can't seem to carry it out. When I was a teenage driver, I knew perfectly well I shouldn't have accidents, but on the way home from earning my driver's license, I smashed into a parked car. (I had been distracted by a teenage girl on the sidewalk.) I was immature because even though I knew better than to gawk at pretty girls while driving, I had an accident anyway. 

The simple fact was that we already knew hundreds of things about software development, but we were not putting those ideas into practice. Forty years later, we're still not putting many of them into practice. Why not? The principal reason is that our managers are often not very good at what they are supposed to do—managing. In Barry Boehm's studies, the one factor that stood above all the others as a cause of costly software was "poor management." Yet neither Boehm nor any of the other writers on estimating had anything to say on how to make good managers—or get rid of bad ones.

Better estimating of software development could give us a standard for detecting the terrible managers. At the same time, however, it may give us a standard behind which the merely mediocre managers can hide.

Using estimates well

So, if you want to be a better manager, how do you use estimates effectively?

Any estimate is based on a model of what you're estimating. The model will have a number of parameters that characterize the situation. In the case of estimating a software project, parameters of the estimate might include

• number of programming languages to be used

• experience level of development staff

• use of formal code reviews

• characteristic error rate per function point

• many other factors

Suppose, for example, the project has six teams, each of which prefers to use a different programming language. Up to now, you've tolerated this mixture of languages because you don't want to offend any of the teams. Your estimating model, however, tells you that reducing the number of languages will reduce risk and speed up the project. On the other hand, if you try to eliminate one of the languages, your model tells you that a team with less experience in a new language will increase risk. By exploring different values of these parameters, you can learn whether it's worthwhile to convert some of the teams to a common language.

To take another example, you've avoided using formal code reviews because you believe they will add time to the schedule. Studying your estimating tool, however, shows you that use of formal reviews will reduce the rate of errors reaching your testing team. The estimating model can then show you how the rate of errors influences the time spent testing to find and correct those errors.
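The what-if exploration in these two examples can be sketched as a toy parametric model. Everything here (the parameter names, the coefficients, the base productivity rate) is invented purely for illustration; as noted above, no two organizations share the same model, so a real one must be calibrated against your own project history.

```python
# A toy parametric estimating model. All coefficients are invented for
# illustration; calibrate against your own organization's history.

from dataclasses import dataclass

@dataclass
class ProjectParameters:
    function_points: int     # size of the job
    num_languages: int       # programming languages in use
    staff_experience: float  # 0.0 (novice) .. 1.0 (expert) in those languages
    formal_reviews: bool     # are formal code reviews used?
    errors_per_fp: float     # characteristic error rate per function point

def estimate_effort_months(p: ProjectParameters) -> float:
    """Rough effort estimate: base productivity adjusted by the parameters."""
    base = p.function_points / 10.0  # assume ~10 function points per staff-month
    language_penalty = 1.0 + 0.05 * (p.num_languages - 1)  # coordination overhead
    experience_factor = 1.5 - 0.5 * p.staff_experience     # experts ~1.0, novices ~1.5
    # Formal reviews cut the error rate reaching test; fewer errors, less rework.
    residual_errors = p.errors_per_fp * (0.4 if p.formal_reviews else 1.0)
    rework = residual_errors * p.function_points * 0.02    # months spent in test/fix
    return base * language_penalty * experience_factor + rework

# Explore the language-consolidation trade-off: fewer languages reduce
# coordination overhead, but retraining temporarily lowers experience.
current = ProjectParameters(500, 6, 0.9, False, 0.1)
converted = ProjectParameters(500, 2, 0.6, False, 0.1)
print(estimate_effort_months(current), estimate_effort_months(converted))
# prints: 66.625 64.0
```

Running the same comparison with `formal_reviews=True` answers the second question: the review overhead is offset by the reduced rework in test. The point is not these particular numbers, but that the model lets you test a management decision on paper before you test it on your teams.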


Many poor managers believe an estimating model is a tool for clubbing their workers over the head to make them work faster. Instead, it's a tool for clubbing yourself over the head, a guide to making wise, large-scale management decisions.

Monday, April 11, 2011

How Fast is Fast Writing?

See my guest post on Ellis Vidler's blog: "How Fast is Fast Writing?" http://ow.ly/4ycjq

Sunday, February 27, 2011

Who Can Alienate Readers Better?

I'm an author who's old enough to remember when the people who ran "Big Publishing" were book people—people who had some fairly decent intuition about books and the people who read them (in other words, their products and their customers). My first book was published by McGraw-Hill. They were the biggest of the big, but they treated me with respect. For example, when I spotted trouble on my royalty statement, the situation was handled personally by the company president (one of the McGraws).

Four McGraw-Hill books later, the company was having some trouble over a bogus Howard Hughes biography, and turned down every new project for a year—including my latest manuscript, The Psychology of Computer Programming. I was naive enough to be shocked that a publisher might turn down a good book, so I thought I must have done something wrong. After moping in self-doubt for a year, I recovered sufficiently to circulate the book to four publishers and was offered a contract by each of them. I chose Van Nostrand.

A year later, when the printed book was delivered, I went down to NYC to receive my first copy from the hand of my editor (a ritual I had practiced with McGraw-Hill). When I suggested we go to my editor's office to sit down and talk, he told me he didn't have an office—because he had just been fired.

Turns out he'd been fired by the corporate executives for publishing my book. In the interval since contract signing, Van Nostrand had been purchased by Litton Industries, along with (as I recall) four other publishers. The idea was to convert publishing to a "proper" business model—and this was the first such acquisition/consolidation, the one that began this new era in the publishing industry.

This new model included taking editorial responsibility out of the hands of the editors (real book people) and putting it into the hands of the executives (real business people).

Apparently their business intuition told them the book wouldn't sell, but apparently that intuition didn't work. In spite of fantastic order fulfillment screw-ups (another byproduct of the acquisition/consolidation, but that's another story), The Psychology of Computer Programming outsold all other similar books in Van Nostrand's inventory. It's still selling steadily after almost 40 years—over 250,000 copies in a dozen languages. (I got the rights back—another stupid business decision by the executives—and it will be out soon as an eBook.)

And, after 40 years, these business executives are still clueless about that "book business," as opposed to their "book business." If you don't believe that, watch them screwing up the eBook business in just about every imaginable way. (Nobody said they weren't creative.) For instance, here’s what Macmillan CEO John Sargent recently had to say about libraries and ebooks:

    "That is a very thorny problem," said Sargent. In the past, getting a book from libraries has had a tremendous amount of friction. You have to go to the library, maybe the book has been checked out and you have to come back another time. If it’s a popular book, maybe it gets lent ten times, there’s a lot of wear and tear, and the library will then put in a reorder. With ebooks, you sit on your couch in your living room and go to the library website, see if the library has it, maybe you check libraries in three other states. You get the book, read it, return it and get another, all without paying a thing. "It’s like Netflix, but you don’t pay for it. How is that a good model for us?"

    "If there’s a model where the publisher gets a piece of the action every time the book is borrowed, that’s an interesting model." - from http://go-to-hellman.blogspot.com/2010/03/ebooks-in-libraries-thorny-problem-says.html


If you don't understand what's wrong with this statement, take a look at the article and comments, "Friday Alert: HarperCollins in cagematch with Macmillan to see who can alienate readers better." <http://dearauthor.com/wordpress/2011/02/25/friday-alert-harpercollins-in-cagematch-with-macmillan-to-see-who-can-alienate-readers-better/>

Or, if that's not helping, take a look at past history—for example, the reaction of the Western Union executives when the technology for voice-over-wire (telephone) became available. Or, study the music industry executives' bungling of the digital music scene.

Whichever example you choose, it's always the same pattern of response to new science or new technology: The people on top of the existing industry always try to stifle the new in order to preserve the old. They bungle, and that opens the door for all sorts of brash newcomers. Brash, that is, until they become the fat cats and play the same bungling role when the next innovation comes along—as it always does.

The only question is "Who will be the brash newcomers this time around?"

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Find my eBook novels and nonfiction listed at these stores

• Barnes and Noble bookstore: http://tinyurl.com/4eudqk5

• Amazon Store: http://amazon.com/-/e/B000AP8TZ8

• Apple Store: http://apple.com

• Smashwords Store:
http://www.smashwords.com/profile/view/JerryWeinberg?ref=JerryWeinberg
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Tuesday, August 17, 2010

Attendance Too Regular? Try This!

Inspired by Ajay Balamurugadas's blog at

http://enjoytesting.blogspot.com/2010/08/aware-of-other-side-of-your-application.html

The blog is titled Enjoy Testing, and the post starts with:

"For the past few months, I left office at sharp 6 p.m. I felt I should not invest more hours just because someone's estimate was wrong. So, I always took the 6 p.m. cab to home instead of the 8 p.m. or 10 p.m. cab."

And Michael Bolton commented:

"...I perceive that resolution to the trickiest part of the problem starts with recognizing people."

Michael is right. It starts with people. And guess who is the first person to recognize?

Yourself, of course.

The very first thing that struck me about the (quite fine) post was the regularity with which you come and go to work. Ordinarily, such regularity is a highly valued trait. For example, people can count on knowing when you'll be there and when you won't. Very good contribution to communication--and thus very high on every tester's list.

However, as an experienced tester, you already know that too-regular, too-predictable behavior is a way to miss a great many bugs--and that's true of regularity in attendance, too.

I would suggest you come in a couple of hours early on some random day next month, and (on a different day, probably) leave quite late. And, if you have people who work night shifts, arrange to be around for one or two of those.

I probably don't have to explain why, but some of Ajay's readers may be less experienced than others. Experienced testers can probably all tell stories of a time they came in early or left late (or were somewhere they weren't usually expected to be, or were even prohibited from being) and, because of that, noticed something that led to a bug they never would have seen otherwise. (Perhaps something they were totally unaware of.)

I myself can tell many such stories, including one that may well have saved astronauts' lives, so I regularly practice being somewhat irregular in my behavior as a consultant (yes, I know that's a paradox).

Monday, February 23, 2009

Three Lessons from a Thirty-Year Bug

Reader Michael Bolton writes:

I'm reading General Principles of Systems Design, and enjoying it. I'm confused by something, and I think it's because of an error in the text.

On page 106, there's a matrix that is intended to describe the bathtubs illustrated in Figure 5.1 and diagrammed in figure 5.2. My interpretation is that the last row of the matrix should read

0 0 1 1 0

The text suggests

0 1 1 1 0

I interpret this as meaning that bathtub 1 could supply water directly to K, the sink; but neither Figure 5.1 nor 5.2 suggest that. Am I misunderstanding, or is there an error?

My response

It's an error in the text, previously unreported.

It seems to have been sitting there for 30 years, through tens of thousands of readers.
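As an aside, the figure-versus-matrix cross-check that Michael did by eye lends itself to mechanical testing. Here's a minimal sketch; the part names and edges are hypothetical stand-ins for whatever Figure 5.2 actually shows, since the point is only that a matrix derived from the diagram can be diffed against the matrix as printed.

```python
# A sketch of how this kind of erratum could be caught mechanically.
# The part names and edges below are hypothetical stand-ins for the
# contents of Figure 5.2; only the checking idea matters.

parts = ["p0", "p1", "p2", "p3", "p4"]  # p1 playing the role of K, the sink

# Edges read off the diagram: (source, target) means
# "source supplies water directly to target".
diagram_edges = {("p4", "p2"), ("p4", "p3")}

# Derive the connection matrix the diagram implies.
derived = [[1 if (r, c) in diagram_edges else 0 for c in parts]
           for r in parts]

# The last row as printed in the text.
printed_last_row = [0, 1, 1, 1, 0]

# Any mismatch is either a misreading of the figure or an erratum.
mismatches = [i for i, (d, p) in enumerate(zip(derived[-1], printed_last_row))
              if d != p]
print(mismatches)  # → [1]: the spurious 1 in the K column
```

Had the figures and the matrix both lived in machine-readable form, a check like this would have flagged the bad entry before the first printing, never mind the thirtieth year.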

Moral Number One:

There are several morals here, but to me the most important one is about testability. Figures 5.1 and 5.2 are in the previous chapter, so they're difficult to look at while you're looking at the matrix. That makes testing quite difficult. I should have repeated one of the figures so it appeared on the same page as the matrix.

Moral Number Two:

In writing, as in software development, there's no such thing as perfection. (For more on this subject, see my book, Perfect Software, and other illusions about testing.) Just because nobody's found a bug for 29 years doesn't mean one won't turn up in year 30. If you start believing in perfection, you may be in for a nasty shock.

Moral Number Three:

It would have been easy to blame my readers for being careless, inattentive, or just plain dumb not to have detected and reported this bug (which actually appears twice). If I did that, however, I would have shielded myself from learning the first moral, which puts the responsibility squarely on me. If you don't take responsibility for your mistakes, learning doesn't happen.

Slow Learning is Better than No Learning

That's three lessons in thirty years. I'm a pretty slow learner, but at least three is better than zero. So I'm going to be proud of myself for learning at all.

Saturday, February 07, 2009

Estimating (continued)

Back on April 17, 2006, I posted an entry called "Estimating Projects: A History of Failure." I try to make entries that are not ephemeral, so I'm happy to see that people are still commenting on this one almost three years later. I suggest you go back on the blog and read the latest comment, from "Will."

Will has many wise things to say about how to do, and not do, estimating. Will, if you're writing a book on project management, give us the title, publisher, and estimated date we can expect to see it.

Take a look, and if you have comments, add them here.

Thursday, September 18, 2008

The Sense of Smell

I've been in less than tip-top condition lately, so you haven't seen any posts here for a long time. I just don't have the wherewithal to compose new stuff that's not whiney, so I think I'll start posting some really worthwhile stuff when it comes in from my correspondents (with their permission, of course).

The following is from Jim Batterson, who is currently teaching classes in China. Here are his words:

---------------------------------------------

I thought of a story and a half and I wanted to share it with someone, and you (Jerry) came to mind.

Let's begin with a simple experiment that I remember from a psychology class years ago. A guy thinks he is one of eight test subjects in an experiment. They all sit on a panel, and he is number seven. The exercise is simple: they are shown two lines and asked to say which line is longer, A or B. B is clearly longer, but the lines are at different angles and such. One through six all say A is longer, and our poor subject goes along with the crowd and agrees with them. It's not clear whether he is appeasing them or doubts his own judgement or what. This repeats many times. Of course, one through six are shills planted by the experimenter.


Scenario two: same as the first, except number two gives the right answer. That's all the support our guy needs to give the right answer every time.


So there I was in New Jersey, sitting around a table for a project meeting. I was only tangentially on the project. I maintained the system that was being partially replaced, and had written an extract program that fed data to the system under development. When I sent them a test file, they were supposed to run it through their edits and the rest of the system, and I got some feedback at the end, but I really wasn't getting anything meaningful back.


Nevertheless, they went around the room and everyone seemed to be reporting that all systems were go, ready to fly in about a month. They'd been working on it for over a year. I, too, had done everything I was supposed to do and could have reported the same, but the manager detected a bit of hesitation in my voice (not very subtle) and asked me what I thought. She was a pretty sharp woman. I wasn't sure what to say, because I really was hardly involved in the project at all.


I told her that I had been on a lot of projects before, which she already knew. I didn't have any facts or data or even examples to point to, but "this project doesn't smell like a project that is a month away from going live."


I think that her senses, too, were telling her the same thing, but she needed another voice to confirm her suspicions. My sense of smell was all the confirmation she needed to drop the delivery date and push things back three months. She was still off by a month, but the system went live five months later.

Jim

Monday, June 18, 2007

How Good Are Expert Predictions?

Magazines are ephemeral, but some of my friends compulsively keep stacks of copies of old magazines. I've always wondered what possible use these collections can be, but here's a lovely contribution one of my readers sent, taken from Popular Science of May, 1967, page 93.

"Time sharing, most experts agree, is the key to the computer's future, at least for general use. A few years ago, when people thought about household computers at all, they thought of some small, inexpensive, individual unit that would keep track of the family checking account and automatically type out Christmas-card labels. Now we know it won't be like that at all.

"The reason is economic. The bigger and faster the computer, the cheaper it makes each computation. Consequently, it will be far cheaper to build one monster computer with thousands or even millions of customers hooked into it than to have small, individual machines in individual homes."

Now we know that "most experts" were wrong: we know it would be like that, because today, 40 years later, it is like that. I was something of an "expert" in 1967, and I'm proud to say that I wasn't one of those who made such a piss-poor prediction. That's probably because I don't make predictions—except the prediction that almost all of the predictions we make today will turn out to be piss-poor 40 years later.

Why do I make such a meta-prediction? Well, I've researched the past, and, as Patrick Henry said, "I only know the future from the past." But don't take Patrick's or my word for it. Here's how you can find out for yourself. Beg, borrow, or steal a copy of some old computer magazine. Spend as much time reading it as you typically spend on this month's issue of the same publication (or an equivalent one, if the old one is no longer around). I guarantee that the time spent on the old one will be more productive.

Because I was an "expert" in the 1960s, I published a number of articles in the leading computer magazine of the time, Datamation. I do save my old articles, so I happen to have a copy of Datamation from September, 1962. My article in that issue is entitled "How to Automate Demonstrations."

Although the print magazine Datamation itself shuffled off this mortal coil in 1997, I'm proud to say that my 1962 article would stand up pretty well even today. Perhaps even better today. Now that hardly any part of the computer moves, demonstrations are much more challenging to create. Of course, this was supposed to be a humorous article, though not everyone realized it at the time. I received a dozen requests for the Demonstration Compiler—that is, the compiler that compiled fake demonstrations. (Hmm, is there any other kind?)

On page 79 of that issue of Datamation, there's an advertisement from Computer Dynamics of Silver Spring, Maryland. (Whatever happened to them?)

"MEMO Re: COMPUTER TIME
Solve your computer problems efficiently and economically by using our 32K, 10-tape IBM 7090 at $450 per hour." (That's about $5,000 per hour or more in today's dollars.)

Today, 45 years later, I own five computers, each of which is far more powerful than that 7090. As for their value, I've thrown away more computing power than that because nobody wanted it. Yes, the ten tape drives would still be a bit expensive today, but why would I want them? I own more than a dozen disk drives, each of which stores far more than those ten tapes.

The list of advertisers from that issue contains many forgotten names of companies selling computers, plus a few companies that are still around but no longer selling computers. Here are some examples:

PHILCO "Philco's on the move."

RCA "What's new at RCA is news in EDP."

GENERAL PRECISION (Surely everyone remembers the RPC-4000.)

ASI "More computation per dollar—on the ASI-210."

GENERAL ELECTRIC "Progress is our most important product."

FRIDEN "This is Practimation."

AUTONETICS "It's called RECOMP III."

TRW "Be operational now with the TRW-130 (AN/UYK-1)"

BENDIX "Is your programming career in a closed loop?"

Bendix didn't actually advertise their machine (no, it wasn't a washing machine), but they were crying out for programmers. And so were most of the others, "from $7,000 on up."

Even IBM (which, at last look, was still around) was desperate for programmers to "shape the future of a new technology." Sound familiar? Although machines are millions of times faster and cheaper, some things—human things, mostly—don't seem to change in 45 years:

"IBM programmers ... are devising programs that in turn use machine capability for formulating new programs. They are creating programs that enable computers to diagnose their own faults through self-checking. And they are helping to design the systems that will let scientists and engineers 'talk' to machines in the everyday language of science and engineering."

Gee, I hope they finish these projects soon. I've been waiting a long time to talk to my computers.

Perhaps, in the end, all this flux of companies and jargon and sales promises is merely an illusion. Perhaps it's what doesn't change that teaches us the most important things about ourselves.

And what is it that doesn't change?

Us.

Oh, the faces change. The names change. But the behavior, the hopes, the visions, the gullibility—they don't change. Maybe that's a prediction you can safely make.