Jerry’s Story: The Roles of Jerry Weinberg

This is the fourth installment of the series of posts starting here: First Interactions: The Beginning of an Influential Career.

You may be wondering, “Who is Jerry Weinberg, anyway?” I have much more to learn before I can competently answer that question, but for the sake of my blog readers who are following along as I put the story together, I’m going to set a foundation as best I can. I expect that I’ve been able to uncover some surprises even for people who know him well.


Jerry in his office in Corrales, New Mexico (May 2017)

His given name is “Gerald Marvin Weinberg,” and the name that you’ll see on his books and articles is “Gerald M. Weinberg.” But if you talk to him, you’ll learn that he’s always just gone by “Jerry.” My first exposure to his work was reading one of his most popular books, The Psychology of Computer Programming. First published in 1971, that was his fifth book. I discovered it 25 years later.

I decided to learn more from him by taking the Problem Solving Leadership (PSL) workshop that was developed and taught by Jerry and his colleagues. My employer at the time wouldn’t pay for it, so I satisfied myself with joining the more affordable online SHAPE forum (an acronym for “Software as a Human Activity Performed Effectively”), which Jerry moderated. Many people who had learned from Jerry hung out there and shared their wisdom.

A few years later, I finally arranged to participate in PSL. I followed up by participating in the “Change Shop” workshop as well. I enjoyed these experiential workshops so much that I attended a workshop on how to design experiential workshops. I have been in touch with Jerry off and on ever since.

Here are the roles that I think best explain who Jerry Weinberg is:

The Programmer
Jerry started his career as a programmer, and he continued to work with software programming, testing, architecture, and management throughout his career. He is the designer of the world’s first multiprogrammed operating system, used for NASA’s Project Mercury.

The Author
Many people discover Jerry through one of his books, which are the way he has been able to reach his largest audience. My most conservative count is 36 non-fiction books and 16 fiction books that he has authored or co-authored. This doesn’t count books that were later split into multiple volumes, translations, republished books, new editions, his doctoral thesis, short stories, or books he edited or contributed to. All told, Jerry estimates that he has been through the book publishing process about 100 times. He started out writing books for computer programmers. Jerry reports that his first book, Computer Programming Fundamentals, was the best-selling computer book of all time not long after it was published in 1961.

The Student
Jerry wants always to be learning, so much so that if he feels he isn’t learning new things fast enough, he gets out of whatever situation is stifling his learning, no matter how lucrative or prestigious it may be. His love of learning is at the center of most of the things he does.

The Teacher
Jerry’s love for learning also applies to seeing other people learn. He has a rich history of teaching, starting while he was still a student. He found opportunities to teach while he was a programmer, as a college professor, and as a consultant.

“I always learned more through teaching than sitting through conventional classroom boredom.” -Jerry Weinberg

The Consultant
After working for IBM for about 12 years and teaching at SUNY Binghamton for three years, Jerry started working full-time as a consultant, a role which has defined the bulk of his career.

The Counselor
Jerry is known not only for consulting with high-tech organizations, but also for helping other consultants improve their craft. He is often called the “Consultant’s Consultant.” His mentor, Virginia Satir, also taught him a lot about family counseling, and his high-tech clients sometimes took him aside to ask for more personal advice. Jerry translated many of Satir’s counseling techniques into a form that engineers could use on the job. He also founded Consultants’ Camp, which still runs to this day.

The Human
I’ve discovered some interesting stories that help us to see Jerry as human. He is a son, brother, husband, father, and grandfather. He was once arrested for vagrancy. He paid for half of his college expenses with his gambling earnings. He briefly ran a computer dating service without actually using a computer. And he helped organize teach-ins in the 1960s.

I’m going to explore all of these topics more thoroughly in future posts. Stay tuned!

Jerry’s Story: Jerry, the Real Programmer

This is the third installment of Jerry’s Story. You might want to start with the first two: First Interactions: The Beginning of an Influential Career and Jerry’s Story: An aspiring auto mechanic changes course.

Jerry Weinberg had worked as a shoe salesman, dishwasher, and camp counselor, among many other jobs. But he was finally getting to work with the “giant brains” that he had hoped to get his hands on. He asked to start two weeks early because, he says, “I had a wife and 1.66 kids by then, so I needed the two weeks’ pay.” Still living in Berkeley, he commuted across the bay to San Francisco on the F-train.

The first machine he used on the job was the two-ton IBM Type 607 Electronic Calculating Punch, which IBM also referred to as an “electronic calculator.” The term “computer” was in use at this point in the 1950s, but it doesn’t seem to have yet become the most common term for these machines. The job ad he had responded to referred to “electronic data processing machines.” It didn’t matter to Jerry what they were called—he was fascinated by these machines.

He hadn’t known that unit record equipment like the 607 had been in use since the late 19th century. Unlike in the era of the IBM PC, which made IBM a household name in the 1980s, IBM wasn’t marketing these data processing machines to consumers, so their existence wasn’t common knowledge.

Jerry was IBM’s first applied science representative in San Francisco. His first assignment was to teach a programming course to the three other applied science representatives who were starting two weeks later. He would soon be providing technical sales support to help salesmen sell leases for data processing machines and other IBM products. IBM would also send him to new customers at no additional charge to teach them how to program the computers, which usually also involved writing their first program for them.


Plugboard, system type unknown. Photo by Simon Claessen.

There was no training available for learning how to program, but there was a set of manuals. Jerry dug into the manuals for the 607 and mastered the machine in a week. This was a wired program machine, programmed by plugging wires into a plugboard. He still easily remembers the technical details—20 wired program steps (this was expanded from the base model with 10 steps) and one signed ten-digit number of data storage. This was a big advance over the desk calculators he had used before. He also got familiar with other machines, like the keypunch, verifier, reproducing punch, collator, printer, and sorter. Some of these could also be programmed in limited ways, such as formatting and adding totals with a wired program on the printer, or defining the formatting for the cards on the keypunch using a special program card.

Jerry earned a reputation as a whiz kid by making the 607 do tricks. He won a dollar bet by turning on all the lights on the 607 control panel, which no one else in the office had figured out how to do. Jerry sums up his feelings about his new job: “I was being paid $450 a month for playing with the world’s greatest toy, a job I would gladly have paid $450 a month to do—though I wisely didn’t tell IBM that.”

Jerry moved on to learning how to program the IBM 650 Magnetic Drum Data-Processing Machine before the San Francisco office had one available. This was the world’s first mass-produced computer, and the first stored program machine that he encountered. The machine arrived a short time later.

There is a legendary story about Mel Kaye, the “real programmer.” Mel wrote hand-optimized programs for the Librascope/Royal McBee LGP-30 and RPC-4000 computers, which had storage on a rotating drum like the IBM 650. Mel used tricks like placing instructions on the drum so that by the time one instruction finished executing, the next instruction had just rotated under the read head. This eliminated the need to wait for the drum to rotate around to the next instruction before reading it. There was an optimizing assembler, but Mel’s hand-coded machine language programs always ran faster than the equivalent automatically optimized assembly programs.
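
To make the trick concrete, here is a rough sketch of the placement arithmetic in Ruby, with entirely made-up numbers (the real IBM 650 and LGP-30 timings differ):

# Illustrative only: choose the drum address for the next instruction so it
# arrives under the read head just as the current instruction finishes.
WORDS_PER_TRACK = 50                  # assumed word positions per revolution

def best_next_address(current_address, execute_word_times)
  # While the current instruction executes, the drum advances past
  # execute_word_times word positions; placing the next instruction there
  # means zero rotational delay.
  (current_address + 1 + execute_word_times) % WORDS_PER_TRACK
end

puts best_next_address(17, 4)         # => 22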

To save space on the drum, when Mel needed a constant in a calculation, he would look for an instruction already in the program whose numeric value happened to equal that constant, so he could refer to that memory location rather than adding another storage location to hold the data. The ultimate optimization described in Mel’s story involved using features of the machine that weren’t documented at all.

To learn about his competition, Jerry later wrote a few programs for the LGP-30, though he didn’t have a chance to run them. Recalling the IBM 650, which was similar in some ways to the LGP-30, Jerry says that the kinds of manual optimization described in Mel’s story were very common, and necessary if you wanted an efficient program. There was an assembler called “SOAP” (Symbolic Optimizing Assembly Program) that tried to place instructions in optimal locations on the drum, but the optimizer might even cause a correctly written program to crash the computer. Jerry says, “SOAP was okay for many programs, but for critical apps, we optimized by hand.”

Ed Nather, the person who wrote Mel’s story, was the programmer who tried to understand Mel’s code. It was a convoluted mess, almost impossible to decipher. Most of Mel’s optimizations made the code harder to change, but even so, Ed wrote of his awe of Mel’s programming prowess. Programmers at that time did not consider maintainability an important attribute for their programs.

One of the optimizations that Jerry himself used involved the IBM 704. There were many possible values for machine instructions that were not documented. Programmers were expected to use only the codes (often called “opcodes”) that were documented and supported. But of course, some of the programmers were curious, so they experimented with the undocumented opcodes and found that a few actually were useful. One of them was a single instruction that would clear a memory word—the supported way to do this required two instructions. They coined a name for it: Store Zero (STZ).

Someone added the STZ opcode to the assembler that was distributed by the SHARE user group, so all 704 programmers could use it and reduce their program size. Later, when IBM produced the 709 and claimed it was compatible with the 704, Jerry and other programmers found that the STZ instruction no longer worked. They pressured IBM to support STZ on the 709; IBM complied, but if it hadn’t, the programmers would have faced a big job modifying every program that used it.


Less than a month after Jerry started the job, IBM chairman Thomas J. Watson Sr. died. This inspired a lot of talk in the office that informed Jerry about company history. It’s interesting to trace IBM’s roots, and the machines that came with them, forward to what Jerry experienced there starting in 1956. IBM started life as the Computing-Tabulating-Recording Company, which itself was formed from the merger of four companies in 1911:

  • Computing Scale Company of America—manufactured “computing scales,” scales similar in function to modern butcher scales, which weighed a product and calculated its price at the same time. They also made meat slicers (you can watch a video of a hand-cranked IBM meat slicer). Jerry only encountered these scales and slicers via the stories people shared about the history of the company.
  • International Time Recording Company and Bundy Manufacturing Company—these two interrelated companies both manufactured time clocks, tracing their roots to the world’s first time card recording company. Jerry never got involved in selling time clocks, but he does remember being required to punch a time card, simply because it was “part of our business.” When IBM sold the time recorder business to the Simplex Time Recorder Company in 1958, Jerry hopefully asked if he could stop punching a time card that didn’t really help anyone. The answer was “no,” but in fact the practice of punching in and out did fade away in his office. Bundy also got involved in producing adding machines, but these didn’t seem to make their way into IBM’s DNA.
  • The Tabulating Machine Company—founded by Herman Hollerith, who built a tabulator that successfully processed the data for the 1890 U.S. census. This machine was the first in a long line of unit record equipment, a line that includes the IBM 607 Jerry started with and that evolved into modern computers.

One later acquisition is also worth mentioning here:

  • Electromatic Typewriters, Inc.—IBM entered the typewriter business with this acquisition in 1933. Jerry helped sell “a slew of typewriters,” which established his reputation in the office. He remembers a typewriter salesman who was making a pitch to a biologist doing genetic research. The biologist wanted to buy several IBM Selectric typewriters, but he needed to be able to type the standard symbols for male and female. The salesman couldn’t find a typeball in his catalog that had those symbols, so he sought out an applied science representative to help. Jerry remembered that these symbols were also the astronomical symbols for Mars and Venus, and with that clue, the salesman sold five typewriters. After that, salesmen started to bring a variety of problems for Jerry to help solve.

The next installment in this series is Jerry’s Story: The Roles of Jerry Weinberg.

What, us a good example?

I was pleased to hear Alan and Brent respond to my post “The black box tester role may be fading away” in AB Testing – Episode 43 (starting at 9:52). Their primary response was to my claim:

“The project structures they describe seem to be on the leading edge of the future of the software testing role. In my limited view of the software industry, I don’t see many companies that are anywhere near Microsoft in their evolution.”

They don’t think they’re really on the leading edge. I suspected that they would protest, because it’s human nature – everything is relative. If the organization I’m working with has a long way to go to achieve something that theirs is already doing, and I feel that theirs is a good enough exemplar, I don’t need to seek out something that’s even better. But from their point of view, they want to improve with the help of others who have gone even further, so their sights are set beyond where they are now. I’ve seen many software organizations recently that are so far behind in their evolution that it’s easy for me to point to Microsoft (which, granted, I’ve also found many reasons to malign) and say that it’s far ahead of the pack.

These organizations that are behind the curve are generally surviving, and their companies are usually still making a profit. I saw this many times in my consulting. They are more or less successful and often don’t feel a strong need to make significant changes. This is where the traditional approach to software testing will hang on for a long time and keep the black-box testing role alive. At least for me, though, these are often not desirable places to work, and they may eventually find that it’s difficult to hire talented testers. Note that I’m more concerned about some of the antiquated traditional practices like scripted manual testing than I am with the black-box tester role itself, for now.

By the way, I’m amused that Alan introduced me as “our good friend,” but then didn’t seem to know whether that was okay to say. I’ve found that calling someone a friend often makes it so. I owe you guys a hug.

 

The black box tester role may be fading away

Is the traditional software tester role fading away? A recent blog post from Cem Kaner helped to reinforce this idea for me. You’ll find his comments inside this long post, in the “2. Oracles are heuristic” section, under the heading “A supplementary video”: Updating to BBST 4.0: What Should Change. Incidentally, the topic of the whole post is updating a course on software testing, which I think I attended a very early variation of around the turn of the millennium.

Cem’s parenthetical notes on tester careers are interesting. He suggests that traditional black-box testing (whether exploratory or scripted) will give way to piecework, where a tester will be paid by the number of completed tests. I’ve seen this model already underway in outsourcing companies like Rainforest QA and its competitors, where the manually executed test step is the basic unit that you’re paying for. It’s much easier to see this happening for scripted tests than for exploratory tests, and I argued against using this kind of service when an executive asked me to consider using it, because I see little value in scripted manual tests.

You can imagine that this kind of piecework will not pay testers very well. Cem noted that he already sees a significant pay differential between black box testers who don’t do any programming and those who have jobs that require some kind of programming. He suggests a few skills, like programming, that testers could add to become more marketable. I have a few items of diversification I can point to, including programming and automation skills, though I don’t often focus on automation. I’m familiar with testing web apps and mobile apps (and even mobile web apps :). My experience with embedded systems often gets the attention of recruiters.

I’ve added a few items to my resume this year that I’m happy about. I took a training course and became a Certified Scrum Master so I could better understand the leadership aspect of agile processes. Perhaps the most promising, given the current job market for security, is the Certified Ethical Hacker course I took and passed the exam for. I know that passing a certification exam doesn’t really prove anything, but these particular certifications do seem to carry some weight that might help my career. Both are subject areas I already have experience in, and I was happy to round out my knowledge in the classes.

I’ve been following Microsoft employees Alan Page and Brent Jensen on their AB Testing podcast. They both have had fairly traditional testing roles, but now are in roles that seem to be much more future-proof. Brent is now a data scientist, specifically, Principal Data Scientist Manager. Alan, Principal Software Engineer, describes himself as a helper, doing the odd (but challenging) tasks that don’t easily fit the developer roles on his team, which is something that appeals to me. Both are still involved in the testing process. The project structures they describe seem to be on the leading edge of the future of the software testing role.

In my limited view of the software industry, I don’t see many companies that are anywhere near Microsoft in their evolution. If the traditional black-box tester role is fading, I think it’s going to happen very slowly. I think it will require a very broad view of the industry to track a slow evolution like this, and I’m curious if you’ve heard from anyone who is in a good position to see it.

 

 

 

Jerry’s Story: An aspiring auto mechanic changes course

Part 1 of my telling of Jerry Weinberg’s story was First Interactions: The Beginning of an Influential Career, where he had started his college studies in September of 1950. But let’s go back to the summer of 1950, when Jerry had no plans to attend college at all.

Jerry had graduated from Omaha Central High School, and he felt disgusted with school. He found many of the subjects in high school to be trivial, so he had skipped most of those classes and still got good grades. He did enjoy a few classes, however, especially auto shop. He told me “I just loved cars, driving them, working on them, even washing them—plus doing body work and painting in my father’s shop. I never really had any other career idea than working with cars in some way.” Though he was fascinated with computers, there were so few jobs available to work with them at the time (and none that he was aware of) that he didn’t even consider a computer job a viable option.

After graduating from Central High, Jerry applied for a job as a mechanic. The owner of the garage offering the mechanic job, however, wouldn’t let him start until after the next school term started. He suspected that Jerry was just looking for a summer job, but he really wanted a long-term employee. Jerry decided to wait out the summer so he could get that job, and in the interim, he worked as a summer camp counselor for a camp sponsored by the Omaha Jewish Community Center. At the camp, another counselor encouraged him to go to college so he could meet young women. Jerry had a keen interest in women, and hadn’t before considered this particular benefit of the college experience. So he determined to go to college instead of taking the mechanic job.

A few days before classes started in the Fall, Jerry showed up at the University of Nebraska in Lincoln to register. The counselors were not happy that he hadn’t registered in advance, but because he had graduated from a Nebraska high school, state law required that they admit him. The counselors were even less happy to find that they had to give him a scholarship because he had graduated in the top ten percent of his class. So he began his studies.

While at the university, Jerry got a job in the Physics department—the job title was “computer.” It turned out that Jerry would be a computer years before he programmed one. He used a Friden electromechanical calculator along with pencil, paper, and eraser to invert 10 by 10 matrices for faculty members. Just as a computing device doesn’t know the ultimate reason it does its work, he doesn’t recall ever knowing why they wanted the inverted matrices. Jerry told me about what he learned from this job:

I recall that it took me upwards of an hour to invert a 10 × 10, and of course the inversion time tends to grow as the square of the size. Going to 11 × 11 would have raised my computation time by over 20%, and increased my chance of making an error somewhere along the line. That was the first time I became aware of non-linear computation times and also the significance of error. It was a good start to my career: my understanding of these factors, which many programmers today don’t seem to appreciate.
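
(A quick check of that recollection: under the squared-growth assumption Jerry describes, an 11 × 11 inversion takes (11/10)² ≈ 1.21 times the work of a 10 × 10 one, a bit over 20%, just as he says.)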

He also offered his services as a tutor for any subject, primarily for failing athletes, and he worked grading English papers. He was a Physics teaching assistant and was told he was the first undergrad to get that job, at the ripe old age of 17.

Jerry was out sick with Crohn’s disease for most of his second year. He went home to Omaha to recover. While there, he took a few courses at the University of Omaha (now known as the University of Nebraska Omaha), including Mathematics of Finance. He thought that computers would be used in the course, but he had no such luck. There was most likely no computer on campus at all.

When the course progressed to more advanced subjects like probability, statistics, and risk, Jerry found out he knew more about them than the professor did, so he helped teach the class. This impressed the instructor, who was an associate of Warren Buffett. The professor recommended that Jerry meet Buffett, who was seeking bright math students to work with. Jerry wasn’t able to arrange a meeting, however, because he had to return to the hospital for surgery.

During his stay in Omaha, Jerry did manage to meet with the chief actuary at Mutual of Omaha. Jerry was impressed with the luxurious office, but not impressed with the actuary job itself.

He returned to the University of Nebraska and completed his Bachelor of Science degree, magna cum laude and with honors, with four majors: Physics, Math, Philosophy, and English. He then moved to California to study Physics at the University of California, Berkeley. A year later, he had passed his comprehensive exam, finished his thesis experiments, and was on track to earn a PhD in record time. He had a few months of work left to finish writing up his thesis when he found the opportunity he had been looking for since he was 11 years old.

Jerry was out of cash, and supporting a wife and a child, with a second child on the way. He read an ad in Physics Today from IBM looking for applied science representatives. It didn’t say that the job involved computers, though there was a picture of a roomful of data processing equipment. It wasn’t the first computer-related job ad that Physics Today had run, but it was the first that he noticed. He had no doubt that this was what he wanted to pursue. He wrote to IBM to apply, interviewed in Oakland, and was offered a job on the spot. He also interviewed at Boeing and got an offer for more than twice what IBM offered, but the job did not involve computers.

Accepting either job would mean not finishing his PhD. Jerry says “The degrees were irrelevant to me, but came along as a side effect of my hanging around. My advisor actually cried when I told him I was leaving.” He received a Master’s degree in Physics from UC Berkeley as a consolation prize. Jerry was hired for his dream job as an applied science representative at IBM on June 1, 1956.

I’m sure that Jerry would have found a way to play with computers before long, even if it weren’t for that wary garage owner, the fortuitous advice from his fellow camp counselor, or the worry about paying his family’s expenses. But I was fascinated to see the path that he took to realize his dream.

An excerpt from the ad in Physics Today, which ran in the January 1956 and March 1956 issues. You can see a full scan of a very similar ad from the February 1956 issue of Scientific American.


Can you help to provide additional details from your own knowledge of this era or from your interactions with Jerry? Please comment here or contact me on Twitter.

The next installment in this series is Jerry’s Story: Jerry, the Real Programmer.

A bit of advocacy helps to earn a bug bounty

I have been working on honing my security testing skills. I asked Don Ankney‘s advice on how to do this, and one of his suggestions was to participate in bug bounty programs. Many companies encourage security researchers to report security vulnerabilities to them, and in some cases, they offer monetary rewards to the first person who reports each one.

My first bug bounty report for Instagram, which wasn’t accepted, was discussed here: “Username Enumeration – Ho, Hum!” This time, though, I was more successful. I found that none of Instagram’s cookies on its web interface had the “secure” flag set, including the session cookie that identifies a logged-in user. Like username enumeration, a missing secure flag on cookies is another “ho, hum” issue that’s often excluded from bug bounty programs. But the Facebook Bug Bounty Program (which also covers Instagram) doesn’t mention such an exclusion, so I decided to report the vulnerability.

I spent some time crafting an attack scenario. I found that the attack didn’t work if I used “instagram.com” instead of “www.instagram.com.” I also found that if the insecure page http://instagram.com was in the browser cache, the browser used the cached page, and then there was no vulnerability. And for reasons I haven’t figured out, I was not able to complete the attack successfully if the victim was using Firefox. I was able to prove that hijacking an Instagram session was a simple matter of setting just the captured sessionid cookie. This is the bug report I sent:

Description and Impact

The secure flag is not set on any of Instagram’s cookies, including sessionid. When a user with an active session types “www.instagram.com” in their browser to go to the site, they will first hit the insecure site and transmit all of their cookies in the clear. An attacker monitoring their network packets will be able to hijack their session easily. Assuming there is no need to send cookies in the clear at any point, this is easily fixed by setting the secure flag in the cookies.

Reproduction Instructions / Proof of Concept

I implemented a proof of concept using Safari 8.0.8 on Mac OS 10.10.5 and Chrome 49 on Windows Vista Home Basic for the victim. I haven’t been able to reproduce it yet with Firefox.

  1. Make sure you’re not logged in to Instagram. Clear the browser cache.
  2. Go to https://www.instagram.com.
  3. Click “Log in with Facebook”, and enter valid Facebook login credentials. This logs you in to Instagram.
  4. …an arbitrary amount of time may pass, as long as the Instagram session is still valid when continuing.
  5. Go to a public network that someone is snooping on.
  6. Open a tab in the same browser as before and go to http://www.instagram.com (not https). The sessionid cookie is sent in the clear and has been captured by the attacker. Even though the server returns a 301 redirect to a secure site, the cookie has already been sent in the clear.
  7. Attacker hijacks the Instagram session by setting the sessionid cookie in their browser.
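
As an aside, the fix proposed in the report is typically a one-line change on the server. Here is a minimal sketch of setting the flag in a Rack-style Ruby app (illustrative only, not Instagram’s actual code):

require 'rack'

app = lambda do |env|
  response = Rack::Response.new('Hello')
  # With secure: true, the browser only sends the cookie over HTTPS, so it
  # can't be captured from a plain-HTTP request; httponly: true additionally
  # keeps it away from JavaScript.
  response.set_cookie('sessionid',
                      value: 'example-session-token',   # placeholder value
                      secure: true,
                      httponly: true,
                      path: '/')
  response.finish
end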

I got a reply five days later, saying “This is currently intentional behavior in our product…” I wasn’t surprised that another “ho, hum” bug was rejected, but I was surprised that they considered it a feature. So I replied, saying that I intended to publicly disclose the issue (which is standard practice after the report is closed, whether fixed or not) and I asked for further information about how the site needs this behavior in order to function, to inform my continued testing. I call this sort of response my “Just one more thing” reply, inspired by the TV character Columbo. This sort of followup is routine for professional software testers, but I don’t know how many security penetration testers put bug advocacy skills to use.

The next reply came quickly, saying that though many people had already reported this issue, they would go ahead and discuss it with the product team and try to fix it. And lo and behold, about three weeks later, I got notice that the issue was resolved, and I was pleasantly surprised to hear that they would pay me a bug bounty. The reasoning was fascinating – the site previously used http (I’m not clear how long ago) and then later switched to https. All the previous reports about this issue had come in while the site still used http, when asking for the secure flag made no sense: the flag would have kept the browser from sending the cookies to the server at all. This explains their earlier pat rejection of bug reports about the secure flag, even though that response had become obsolete with the change to https.

They determined that I was the first to report the vulnerability since they switched to https, and so I qualified for the bounty. I am impressed with the amount of care that Facebook/Instagram took in handling this report. I’m eager now to dig deeper and apply more of my bug advocacy skills if necessary.

 

 

First Interactions: The Beginning of an Influential Career

My friend Jerry Weinberg was present at the dawn of the age of computers. He can describe first-hand what that was like, but much of his story has never been told. I have started to collect Jerry’s stories. Here is a small sampling, which I prompted by asking him “What were your first interactions with a computing device?”

The first thing that Jerry used that we could call a computing device is a slide rule that his father, Harry, gave him when he was about 7 years old. Harry worked for more than 20 years helping to improve processes at Sears, Roebuck & Co. He bought slide rules in quantity to give to the young ladies who computed customer bills. They used slide rules to check their multiplication, for example, when multiplying price times quantity. This practice caught an enormous number of errors before the bills were sent to customers.

Jerry had a more interesting use for his slide rule, though. He was a sports fan, so he used the slide rule to compute baseball batting averages. Jerry says “It’s the easiest thing in the world. A 7-year old could do it.” He still has that slide rule.

Jerry’s slide rule

He has a “UNIQUE” brand slide rule, made in England. Jerry describes it as small and cheap, with a table of “Trigonometric Ratios” on the back, which he didn’t yet know how to use when he first got it. Later, though, he remembers using tables of sines, cosines, and logarithms, which could also be considered computing devices of a sort. He used the tables in math classes and also for experimenting with numbers for fun.

Jerry remembers his first introduction to the concept of computers being a Time magazine article. This may have been “Science: A Machine that Thinks,” in the July 23, 1945 issue, when he was 11 years old. That article discusses Dr. Vannevar Bush’s “memex,” a conceptual idea of a machine that stores facts for easy recall, which the Time article refers to as a “brain robot.”

The first book that Jerry read about computers was Giant Brains, or Machines That Think, by Edmund C. Berkeley, published in 1949. This book had a strong influence on Jerry, and he considers Berkeley one of his heroes. Much later, he met Berkeley and had long conversations with him, and Berkeley was delighted to learn that Jerry had been inspired by his book.

Jerry may also have seen “Science: The Thinking Machine,” the landmark Time cover story on January 23, 1950, when he was 16 years old. The cover artist for that issue, as for many issues of Time, was Boris Artzybasheff, a detail that Jerry still recalls. The article discussed the Harvard Mark I, named “Bessie” (which coincidentally is also Jerry’s mother’s name). This was an electromechanical computer that had been in operation since 1944. The article also discussed the Harvard Mark III, a hybrid electronic/electromechanical computer produced in 1950, and it went into detail comparing computers to the human brain.

Jerry was an avid reader. He explained just how avid: “I usually had breakfast alone, with cereal, so there was the box to read. I’m not saying it was my preferred reading, but just that I read everything that appeared in front of me. Like the see-food diet: I see food, I eat it. So, I see print, I read it.” He probably heard about computers from other sources during his youth. He remembers sitting at his father’s feet as his father read the newspaper and offered his commentary on a wide array of topics.

Jerry had been labeled as a “brainy” kid, and he yearned to learn more about brains, especially these “giant brains.” Early on, he knew he wanted his life’s work to be with computers. He didn’t yet know anyone who had ever seen a computer, let alone used one. He watched and waited for signs of a computer, but went all through high school without seeing one, with perhaps one exception: he had a summer night job in a large bakery computing recipe requirements for the following day’s orders, using a Monroe adding machine.

When he entered college at age 16, Jerry told his counselors that he wanted to work with computers, but none of them knew anything about computers except that they had something to do with electrical engineering and physics. They decided he should major in physics because he was good at math, which they thought would be wasted in electrical engineering.

One day, Jerry saw a notice for a brief “computing course” using Monroe adding machines, given by the Monroe company. He already knew most of the material better than the instructor. He passed, earning a certificate that he’s lost somewhere along the way. It’s the only computing course he ever took, and the only “degree” in computing that he ever earned.

If you’re interested in hearing more of Jerry’s story, please let me know. He has much to tell. Note that many of the words above are his, and I decided to tell the story in third-person. Consider it a collaboration.

The next installment in this series is Jerry’s Story: An aspiring auto mechanic changes course.

Username Enumeration – Ho, Hum!

I used to think that checking for username enumeration vulnerabilities was important to do. Based on what I’ve observed, now I’m not so sure.

When I’m conducting a broad test of a software system, I tend to check for basic security holes like username enumeration. “Username enumeration” vulnerabilities occur when software with login accounts lets an outsider build a list of valid accounts, for example by providing a way to query whether a guessed username is valid. Once you have this list, you can try to launch a brute-force attack to guess the passwords.
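
To illustrate the idea, here is a minimal sketch of an enumeration probe against a hypothetical login endpoint (the URL and error text are invented; the leak exists whenever the response for an unknown user differs from the response for a wrong password):

require 'net/http'
require 'uri'

# Returns true if the guessed username appears to exist, judged by the
# difference in the login error message (hypothetical endpoint and text).
def username_exists?(username)
  uri = URI('https://example.com/api/login')
  res = Net::HTTP.post_form(uri, 'username' => username,
                                 'password' => 'definitely-wrong')
  !res.body.include?('No such user')
end

%w[alice bob mallory].each do |name|
  puts "#{name}: #{username_exists?(name) ? 'probably valid' : 'unknown'}"
end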

Testing for username enumeration is one of the mitigations recommended in the 2013 OWASP Top 10. Cigital gives it even more prominence in their Top Web Application Security Vulnerabilities Compared to the OWASP Top 10, with username enumeration ranking as the 9th most commonly found, even when only considering the vulnerability via password reset features. My experience, though, is that companies aren’t very interested in fixing these vulnerabilities. It may be commonly found because it isn’t commonly fixed.

Once I did get a fix for an obvious vulnerability, but when I found I could still do enumeration by checking the response time from the server, that didn’t get fixed. I often see no fix at all when I report a username enumeration problem. For example, the main web login form for Instagram has this vulnerability, but Instagram considers your username to be public information, and they confirmed when I contacted them that they’re not going to close this hole. A significant fraction of the companies that participate in the HackerOne bug bounty program specifically state that they exclude username enumeration from the program.

I’ve found a few ways that companies have indirectly mitigated this issue, which may be contributing to some of the “ho-hum” response:

  1. Rate-limiting on the vulnerable feature based on IP address. In my testing, it wasn’t uncommon for a feature that allowed me to enumerate to temporarily lock me out after several tries in rapid succession. This would only succeed in locking me out if it was based on my network presence, not on the username, since I was trying a different username each time.
  2. Similarly, a few vulnerable sites use CAPTCHA to defeat automated enumeration attempts. After trying several different usernames, I would get a CAPTCHA challenge that stopped a script from continuing.

In both of these cases, I can easily determine if a particular username is in use. But if I want to compile a large number of usernames, it may take a script months of running at the maximum allowed rate. There may be other ways to defeat these measures, such as frequently changing the public-facing IP address, using a botnet, or trying to use a database of known CAPTCHA responses.
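
To put a rough number on “months”: if, say, a rate limit allows one probe per minute from a given address, checking 100,000 candidate usernames from a single IP takes about 70 days of continuous running (numbers invented purely for illustration).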

None of this matters for Instagram, because the vulnerability on the web login is not rate-limited in any way. And that isn’t even the easiest vector – the issue with incremental userIDs mentioned here has not been fixed: InstaBrute: Two Ways to Brute-force Instagram Account Credentials.

One remaining mitigation comes to mind – the account lockout based on failed password attempts. Once we have a list of good accounts, we haven’t gained anything until we guess the password. I haven’t tried cracking that nut yet. My first thought is that if we have a large number of usernames, the time it takes to try one particular password for each of them may not trigger an account lockout at all by the time we roll back to the top of the list for the next password to try, as long as the lockout feature automatically resets itself after some fairly short period of time. But perhaps I’m being naïve about how quickly we would need to progress through a long list of possible passwords.
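
For instance, with made-up numbers: given 100,000 valid usernames and one guess per second, a single pass through the list takes roughly 28 hours, so if failed-login counters reset after, say, 30 minutes, no individual account ever accumulates more than one recent failure per pass.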

I would be curious to hear what your experience has been with reporting username enumeration problems, especially with companies that set a high priority on closing these holes.

Image credit: Christiaan Colen (CC BY-SA 2.0)

 

Better Bisecting


Bisection is a procedure that software testers can use to isolate a defect. I’ve been having fun building a tool to assist with bisection, and I’m writing about it in order to get feedback on whether it may be useful.

This work was inspired by the “bisect up” and “bisect down” features that James Bach developed for the perlclip tool. These features are tied to the counterstring generator. You start by creating counterstrings of two different sizes (perhaps vastly different sizes), chosen so that, typically, a test with the smaller string passes and a test with the larger string fails. The task is then to determine precisely where the boundary is between passing and failing. You use the “u” (bisect up) and “d” (bisect down) commands, depending on whether the last test passed or failed, to generate additional test strings that bisect the remaining possibilities until you converge on the boundary.
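
For readers who haven’t used perlclip: a counterstring is a self-describing string in which the digits just before each asterisk record that asterisk’s position, so whatever portion of it survives in a field tells you its own length. Here is a minimal generation sketch in Ruby (my reading of the format, not perlclip’s or testclip’s actual code):

# Build a counterstring of the requested length, working backwards from the
# end so each '*' lands exactly at the position its preceding digits name.
def counterstring(length)
  pieces = []
  pos = length
  while pos > 0
    piece = "#{pos}*"
    piece = '*' * pos if piece.length > pos   # not enough room left; pad
    pieces.unshift(piece)
    pos -= piece.length
  end
  pieces.join
end

puts counterstring(10)   # => "*3*5*7*10*" (the '*' at position 10 follows "10")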

I’ve used this feature many times – it’s really helpful. But I often find that I get confused trying to keep track of whether I should go up or down. If I make a mistake, I have to go back to the last two values I’m confident in, create two counterstrings that I don’t actually use, and start over bisecting from there.

Here is my redesign that I think makes bisection easier to do. This is implemented in testclip, my work-in-progress port/enhancement of perlclip using Ruby. I’m going to demonstrate the tool by finding a bug in Audacity (version 2.1.2 on Mac OS).

I created a new Audacity project and looked for a text field that would be a good candidate for long string testing. I clicked File > Edit Metadata and found what I needed. I fired up testclip and made a counterstring for a happy-path test:

$ ./testclip.rb 
Ready to generate. Type "help" for help.

cs 10
counterstring 10 characters long loaded on the clipboard

Then I pasted the data into the first field on the Metadata Tags dialog in Audacity:

[Screenshot: the Metadata Tags dialog with the 10-character counterstring pasted into the first field]

I clicked “OK,” then opened the dialog again, and the data was preserved without any errors. So I established that the counterstring format is valid for this field (some input fields don’t allow asterisks or numbers, so it’s good to check). I recorded the result of the test in testclip:

pass
10 recorded as pass

That was boring. Next I wanted to try a large number. In my experience, 10,000 characters is very fast to generate and larger than what most input fields need, so that’s usually where I start. I asked testclip for a 10,000-character string.

cs 10000
counterstring 10000 characters long loaded on the clipboard

This was the result of the test:

[Screenshot: the Metadata Tags dialog after pasting the 10,000-character counterstring]

When I tried to move the cursor further to the right, the cursor moved off the window, and I couldn’t see any more of the string. Looks like I found a bug. I recorded the result in testclip, choosing the tag “obscured” to identify this particular failure. This may become important later if I find a different failure mode.

fail obscured
10000 recorded as fail obscured

Due to the nature of the counterstring, I already had it pretty well isolated – the boundary was likely between 5690 and 5691 characters. But let’s make sure. I generated a 5690-character counterstring, found that it worked fine, and recorded that as a pass.

cs 5690
counterstring 5690 characters long loaded on the clipboard

pass
5690 recorded as pass

I can ask testclip to report the test results it knows about and automatically identify where the boundaries between different results are.

status
10: pass
5690: pass
--boundary 1
10000: fail obscured

Next I tried 5691. This failed just as the 10,000-character string had. I recorded this as the same type of failure and showed the status again, which shows that testclip puts 5691 and 10,000 in the same equivalence class, just as it did with 10 and 5690 (I’m hoping it’s not confusing to call this an “equivalence class”, a term usually used to describe expected results, not actual results).

cs 5691
counterstring 5691 characters long loaded on the clipboard

fail obscured
5691 recorded as fail obscured

status
10: pass
5690: pass
--boundary 1
5691: fail obscured
10000: fail obscured

So, I hadn’t needed to bisect anything yet. I decided to make the test more interesting by saving the project and opening it again, to see if the long string would be saved and loaded properly. I went back to 5690 and did the save-and-load test. Note that generating a new counterstring would reset the bisection status in perlclip, but all the test results are retained in testclip, so I can track multiple failure points. And in fact, the test failed: after I opened the saved file, the field was completely empty.

So now I abuse the tool just a bit, changing the result of the 5690 test. I’m actually running a slightly different test now, but I think I can keep it all straight. I tag this new failure “empty”. I now have two boundaries:

cs 5690
counterstring 5690 characters long loaded on the clipboard

fail empty
5690 result changed from pass to fail empty

status
10: pass
--boundary 1
5690: fail empty
--boundary 2
5691: fail obscured
10000: fail obscured

I have no clue where the new boundary 1 lies, so I’ll use bisection to find it:

bisect 1
highest value for 'pass': 10
lowest value for 'fail empty': 5690
2850 characters loaded on the clipboard

This test also failed, so I recorded the result and bisected again.

fail empty
2850 recorded as fail empty

bisect 1
highest value for 'pass': 10
lowest value for 'fail empty': 2850
1430 characters loaded on the clipboard

To complete the bisection, I repeated this process: do the next test, record the result, and bisect again. This was the end result:

bisect 1
Boundary found!
highest value for 'pass': 1024
lowest value for 'fail empty': 1025

status
10: pass
720: pass
897: pass
986: pass
1008: pass
1019: pass
1024: pass
--boundary 1
1025: fail empty
1027: fail empty
1030: fail empty
1075: fail empty
1430: fail empty
2850: fail empty
5690: fail empty
--boundary 2
5691: fail obscured
10000: fail obscured
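
For the curious, the heart of a bisect step is just a midpoint calculation between the highest value recorded with one result and the lowest value recorded with the other. A minimal illustration in Ruby (simplified, not testclip’s actual code):

# results maps tested lengths to their recorded tags.
def next_bisection(results, low_tag, high_tag)
  low  = results.select { |_, tag| tag == low_tag  }.keys.max
  high = results.select { |_, tag| tag == high_tag }.keys.min
  return nil if high - low <= 1      # boundary already pinned down
  (low + high) / 2                   # next length to try
end

results = { 10 => 'pass', 5690 => 'fail empty' }
puts next_bisection(results, 'pass', 'fail empty')   # => 2850, as in the session above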

I reported the two bugs to the Audacity project, and called it a success. Note: this was a case of a moving boundary – on two previous days of testing, the failure point was at 1028 and 1299, though within each test session I didn’t observe the boundary moving.

Besides adding the rest of the perlclip features and porting to Windows, the next things I’d like to implement for testclip are snapping the bisection to powers of 2 and multiples of 10, since bugs tend to lurk in those spots, and finishing a feature that can bisect on integer values in addition to counterstrings.

For more about perlclip, see “A Look at PerlClip” (free registration required).

photo credit: Wajahat Mahmood

Amplifying the Comment Challenge


An ongoing topic on Twitter is resonating with me – the #CommentChallenge, described in this blog post by Kristīne Corbus – Comment Challenge. The challenge is basically to leave a comment on at least one blog post a week that’s relevant to your work.

This is something I tend to do anyway. Whether I’m reading an article, book, discussion forum, or blog post, or listening to a podcast or any other medium, I’m always looking for a way to engage with the author. If an author gives me something useful, I have an opportunity to make it a richer experience with the author’s help, and possibly strengthen my network. Also, if I expect I may be posting a comment, I find that I read the information more carefully and am therefore more likely to retain it.

When I’m reading a blog, the form of my feedback may be a blog comment, but that’s not generally a great platform for an extended discussion. So if I really want to get into the topic and I know where the author hangs out online, I may start a discussion elsewhere. I’ll probably also post some sort of comment on the blog, because that helps to show the public that the blog has engaged readers (especially if there are no comments yet), which helps the author. A habit that serves me well is just trying to be helpful.

These are the types of situations where I tend to offer comments:

  • When I have a question about something the author said or a closely related topic, something I’d really like to learn – either the author’s opinion, or facts that are hard to find elsewhere.
  • When the author asked a question and I have a potentially useful answer.
  • When I have something to add to what the author said that I think will be highly valuable, even if the author didn’t ask for this type of feedback. I try not to do this very often.
  • When I disagree with something the author said. I think carefully before I do this. Doing this can often earn the respect of the author, and we’re both likely to learn something from the exchange, but when done in the wrong way, it can damage both my relationship with the author and my public reputation. I won’t try to elaborate on all the subtleties here. Often I just ask a question instead of stating directly that the author is wrong. I may find out that I misunderstood something they said, and I don’t actually disagree with them.
  • When I want to give kudos to the author for making an important point or for making a point particularly well. This type of feedback is less useful than the rest, so it’s best to combine it with one of the items above.

I still don’t comment on everything I read – only those things that I have a useful reaction to that I can share.

My challenge to myself is a bit different from the Comment Challenge. I tend to let my learning habit fizzle, so that I stop taking the time to read or otherwise learn new things. So my challenge is to expose myself to new things every week. When I do that, the comments will naturally follow.