Why should anyone do a crossword puzzle? At best, you wind up with the answer you could find in the back of the book or in next week’s magazine. But it’s just because there is an answer that you do it. As on multiple-choice exams, you seek not The Truth but the answer you imagine the puzzle-maker/examiner/deity had in mind.
American crosswords rarely get past the content of the words. French puzzles recognize linguistic form (the signifiant) to the point of playing on multiple meanings and homonyms, scrambling a word or inverting its direction. I find those of Robert Scipion in the Nouvel Observateur the most intellectually satisfying of the genre. But the most ingenious crossworders are the British, who work on a grid in which only about half the letters cross, sparing them the need to fill it out with silly crossword-puzzle words. The text of the clue contains both a straightforward definition of the keyword and a cryptic allusion to the anagrammatic tricks by means of which it can alternatively be constructed. It is like a detective story in miniature. (No accident, of course, that the Brits also invented the whodunit. Did Agatha Christie do crossword puzzles?) A relatively straightforward example: “Mocked, put out and ran away outside” (7) = FLOUTED (“out” with “fled” outside it; “mocked” is the straight definition).
| F | L | O | U | T | E | D |
These examples of “national character” reflect the different relationships between the cultural elite and the society as a whole. American puzzles are democratic, lowest-common-denominator: common knowledge plus a crossword-puzzle dictionary. I have seen some less-than-scintillating local imitations of British puzzles; should we anticipate a conversion to European sophistication analogous to the Starbucks phenomenon? Among the Euros, there is an interesting distinction between French and British elitism. The French have a courtly culture where wit is the highest value; the British are given to schooled ingenuity. A Frenchman would consider it beneath his dignity to play childish games like putting “out” inside “fled,” but British culture from Dickens to Harry Potter has focused on ingenious children. (What used to be called the “classics” were tales for pre-adolescents–not The Red and the Black but David Copperfield, not Penguin Island but Treasure Island.)
Solving puzzles provides you with the more-than-personal satisfaction of having seen through the author’s mystifications and penetrated the opacity of the world of signs. For a moment, you feel that human minds can be transparent to each other. (This is the charm of detective stories as well, with the added point that what we divine in the other’s mind is a project of violence.) Yet finally the only understanding you achieve in the process of doing a crossword puzzle is of how to do crossword puzzles.
I sometimes do jigsaw puzzles too. These are so time-consuming, and hence so guilt-inducing, that I don’t dare risk the temptation at home, reserving them for family trips to Kansas City, where the living-room table has been adapted for this purpose. A friend recently made an extension board to accommodate ever-larger puzzles: first 1000 pieces, now 2000. Jigsaws demonstrate that the intentionality to which the puzzler responds need not descend into detail. A crossword is composed word by word by a Scipion or a Margaret Farrar; a jigsaw puzzle is just chosen by someone for its pleasing appearance and stamped out on a cutting board. Yet our struggle to find the right piece is a search for the triumph of order over chaos, and it activates the same social instincts as other types of puzzles.
Jigsaws make up in the esthetic realm what they lack in shared wit. A filled-in crossword makes no prettier a picture than an empty one, but consider the almost-finished jigsaw: the missing pieces are gaping holes in a totality, begging to be filled. The jigsaw-maker is indifferent to the details of découpage, but he assures the puzzler that all those little pieces can be brought together in a finished product, a complete form revealing a complete content.
It’s hard not to participate in a jigsaw puzzle. The purely cumulative nature of the activity invites cooperation; if I add a piece to your puzzle, I can’t possibly be frustrating your effort. Their extended topography makes jigsaw puzzles the most communal form of puzzling, but an unfinished crossword similarly invites participation. Whoever you are, you are part of the community whose purpose is to find that answer or fill that hole. Yet the satisfactions of puzzling are incommunicable to non-participants. There is no crossword- or jigsaw-puzzle-solving narrative. Imagine how a friend would enjoy the story of how you solved a jigsaw puzzle: “I looked through all the normal-shaped pieces until I found one with a thin bottom tab with a wide and pointy left side and a dot of white on the lower right edge. It was still a bit too wide on the bottom tab, so. . . The next piece I found was. . .”
It’s hard for me to resist a puzzle of any kind: crosswords, jigsaws, cryptograms, logic problems, magazine quizzes, even those inane word-search puzzles. But more than any of these I enjoy computer programming. Writing a program offers all the stimulation of puzzling outside the confines of someone else’s mind. The solution does not accompany the problem; it has to be invented step by step. (Hacking in its new, sinister sense of seeking to crack a security system is too much like doing a crossword puzzle.)
The puzzle-like interactivity of computer programming makes debugging a program even more enjoyable than writing it. I tend to abandon good, dull programming practice (writing small, easily verified routines and combining them) and rush through the whole project to the point where it more or less works but all sorts of errors remain in the code. Then you can run it and see whether you get “run-time” error messages or output that isn’t what you wanted from your input. It’s like a crossword where you can test your entries and learn from your mistakes. Once in a while, you get that special high from a program that runs perfectly the first time.
When my colleague Sara Melzer first introduced me to the 4 MHz Kaypro 10, I couldn’t wait to learn Basic. The little I had seen about programming fascinated me; as an ex-math major, I couldn’t understand how you could write “x = x + 1”. The instruction “print” (no longer used in the graphics era) also mystified me; why did they keep “printing” the thing on every line? Whence the unforgettable feeling of empowerment when I got my first program to “print” a pattern of green letters on the little black screen.
It went something like:
10 for i=1 to 19                     ' 19 rows
20 print tab(1+3*abs(10-i));         ' indent grows away from the middle row
30 for j=1 to 10-abs(10-i)           ' 1 blurk at the edges, 10 in the middle
40 print "blurk ";
50 next
60 print                             ' end the row
70 next
The output:
                           blurk
                        blurk blurk
                     blurk blurk blurk
                  blurk blurk blurk blurk
               blurk blurk blurk blurk blurk
            blurk blurk blurk blurk blurk blurk
         blurk blurk blurk blurk blurk blurk blurk
      blurk blurk blurk blurk blurk blurk blurk blurk
   blurk blurk blurk blurk blurk blurk blurk blurk blurk
blurk blurk blurk blurk blurk blurk blurk blurk blurk blurk
   blurk blurk blurk blurk blurk blurk blurk blurk blurk
      blurk blurk blurk blurk blurk blurk blurk blurk
         blurk blurk blurk blurk blurk blurk blurk
            blurk blurk blurk blurk blurk blurk
               blurk blurk blurk blurk blurk
                  blurk blurk blurk blurk
                     blurk blurk blurk
                        blurk blurk
                           blurk
My main programming activity in the 8-bit era was writing system utilities in Z80 assembly language–for example, a keyboard macro system using the Esc key on a keyboard lacking either function keys or Alt–but my favorite programming projects were the games I wrote in Basic, my “native” computer language. Back around 1985 I wrote a computer-vs.-user domino game that, ported to Java, still [no longer in 2018] pursues its existence at www.humnet.ucla.edu/humnet/french/faculty/gans/java/domino.htm, where it gets more hits than the entire rest of the French Department. My most recent project in this vein was a VB improvement on the Windows solitaire FreeCell: see www.humnet.ucla.edu/humnet/french/faculty/gans/java/home.html for details.
The most popular microcomputer language in those days was MBasic, “M” standing for Microsoft, at the time a small outfit whose sole product was, I believe, that Basic interpreter, including an Apple version on the MOS Technology 6502. Each line had a number. Full-screen editing had not yet arrived; you retyped any lines you needed to change and the interpreter would reinsert them for you in numerical order (type “list” for a listing). With all the advances since in hardware and software, I have never known a more user-friendly programming environment than good old MBasic, where you could write your program and just type “run” to see how it worked, modifying it line by line until you got it right. To exit the interpreter, you used the old mainframe command “system”. In CP/M, you usually rebooted after each program. A little Basic joke was to do this by typing “call mom”; the call command would jump to the routine at the address given in the following variable, and since you hadn’t defined mom, it would be set to 0, the address of the CP/M “warm boot” routine.
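Spelled out at the keyboard, the whole gag took one line (a sketch, assuming MBasic’s “Ok” prompt and CP/M’s “A>” prompt):

Ok
call mom
A>

Since mom was never assigned, it holds 0; call hands control to address 0, the machine warm-boots, and CP/M’s prompt returns as if nothing had happened.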
Visual Basic is pretty friendly too, but first you have to get past all those forms and controls and get used to event-driven programming. The old programs always kept moving; if they awaited input, there was a location in the program where they ran in place while waiting for it. When I ported my old CP/M games to MS-DOS (the next stage in the triumph of the “M”), I incorporated a mouse, a Mac-like luxury in the DOS era. You played by clicking on a “card” (implemented first as [2h] or [Ks], then–in color!–as [2♥] or [K♠]), and after each play the program came back to the same spot. Even the first version of Java dominoes incorporated a waiting loop. It took me a long time to realize that a program doesn’t have to be doing anything; it can just wait for input. The Bronx Romantic finds it particularly hard to believe that you can just stay there and do nothing, his own existence having made him so much more familiar with turning around in a loop.
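In MBasic terms, that location could be a single line–a minimal sketch, assuming the INKEY$ of the era, which returns the empty string until a key is pressed:

100 a$ = inkey$ : if a$ = "" then 100    ' run in place until a key arrives
110 print "you pressed "; a$

Line 100 spins thousands of times a second while going nowhere; event-driven programming simply hands that vigil over to the system.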
There’s not enough vocabulary for jigsaw puzzles, but there’s too much for computer programming. I have learned the hard way that nothing is more boring to spouses, friends, and assorted acquaintances than explaining how a program works. Does anyone care what a “pointer” is, that C-derived languages are case sensitive, don’t initialize variables, and begin their arrays with 0, while Basic isn’t, does, and begins its arrays with 1, or that DOS strings used to be terminated with “$” before they became like UNIX strings, terminated by 0–binary 0, not the character “0”? Can anyone appreciate this old routine that converts binary numbers to hexadecimal ASCII without jumps–here in 8080 code, but it still works on a Pentium:
Starting with:                        6              12
adi 90  (adds 90)                    96             102
daa     (gets rid of the 100s)       96               2   (carries the "1")
aci 40  (add 40 with carry)         136              43   (adds 41)
daa     (as above)             36 = "6"        43 = "C"   (12 in hexadecimal)
Numbers are from C and letters are from Basic: the numbers begin at Hex 30 = “0”, whereas the letters begin at Hex 41 = “A”. To get around the gap, adding 90 followed by daa–which stands for “Decimal Adjust for Addition”–treats the step from Hex 39 to Hex 40 as a single (“decimal”) digit and hands the letters an extra carry, so that 9 becomes Hex 39 = “9” while 10 picks up the carry and lands at Hex 41 = “A”.
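To watch the carry jump the gap without an 8080 at hand, here is a mock-up of the same arithmetic in Basic–an illustration invented for this page, not a routine from the era, with daa’s two adjustments spelled out as if-statements (153 is Hex 99, the threshold for the “decimal” carry):

10 for n = 0 to 15                       ' the sixteen hex digits
20 a = n + 144 : c = 0                   ' adi 90 (144 = Hex 90)
30 if (a and 15) > 9 then a = a + 6      ' daa: adjust the low nibble...
40 if a > 153 then a = a + 96 : c = 1    ' ...then the high one, setting the carry
50 a = (a + 64 + c) and 255              ' aci 40 (64 = Hex 40) adds the carry back
60 if (a and 15) > 9 then a = a + 6      ' daa again
70 if a > 153 then a = a + 96
80 print n; chr$(a and 255)              ' 0-9 give "0"-"9", 10-15 give "A"-"F"
90 next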
Have I really made you care about aci 40 and daa? What about dad sp and pchl? Or nop, “no operation,” the hacker’s favorite instruction, used to patch over a test for the password or license he doesn’t have. An instrument without cultural resonance, the jargon of current and bygone assembly language still gives me the thrill of the exotic, perhaps because I learned it relatively late in life. I used to have a Honda Civic DX with the license plate XORDXDX–get it?
Computer culture
In a time when people are writing books to demonstrate that computers will soon be smarter than we are, does programming reflect a desire to join with them against us? We never stop playing the game of problem and solution whereas computers, like Visual Basic programs, just stand there and wait until one of us provides them with input. No doubt anything we can understand about ourselves can be programmed into a computer. Just as the word processor “wants to” go to the next line when you type beyond the margin or the program in a guided missile “wants to” follow its moving target, computers of 2050 or 2100 (can we even begin to imagine the computers of 2200? of 3000?) will be programmed to “want to” play tennis, make love, what have you. But this lends no credence to the horror stories about computers taking over the world. The learning modules that will make computers “superior” to us will be of our own design. The idea that computers will keep secrets from their users and scheme together to replace them is just one more paranoid fantasy on the model of alien invasions, one more distraction from our real worst danger, which is ourselves. You can program resentment on a computer, but you can’t make a computer resentful. I’ll take HAL over Saddam anytime.
Of all human creations, computers are the most insistent indicators of progress. They have shown the most rapid sustained improvement of any technology; doubling in every aspect of performance something like every eighteen months over a span of twenty years or so, they give us a taste of a future in which the material aspects of life will increasingly be controlled by the transfer of information. Rather than seeing humanity defeated by its own creation, we are witnessing the expansion of the semiotic human feedback mechanism to the material world as a whole. Computers have brought the whole world together, the developed part of it at any rate. The early confraternity of computer freaks has expanded and diversified into the majority who use email, handle a little word-processing, and surf the Internet; the nerds who know the best websites, how to change margins, and how to recalculate spreadsheets; the geeks who can write, say, card games in Visual Basic; and the supergeeks who still call themselves hackers, stay up all night drinking Mountain Dew, and know enough about interrupts and the Windows API to break into security systems.
Despite all the inventions of the Middle Ages and Renaissance–gunpowder, the compass, clocks, printing–the men of the eighteenth century still saw themselves as Greeks and Romans. The industrial nineteenth century showed that human history was not merely “perfectible” but open-ended and irreversible. Now the information era has made the reality of this irreversibility–improvement and obsolescence–an object of immediate experience. Purchasing a computer or any information-processing device is like changing dollars in a land of runaway inflation; buy in the morning and you’re sorry you didn’t wait till afternoon.
In contrast with the ever-improving hardware, programming is relatively conservative. The major operating systems, DOS-Windows, UNIX-Linux, even the Mac OS, have all been around for over fifteen years. A Visual Basic program today is still recognizably a Basic program, just as a C++ or Java program is still recognizably a C program with semicolons and matching braces. And the Intel Pentium II-III assembler code that underlies the objects and methods of the latest languages derives with few changes–and still with daa–from the good old 8080, just as the Mac’s chips carry on the Motorola line that Apple adopted with the 68000. 32-bit EAX just extends AX, which extended the 8-bit “accumulator” A. And if strange academic languages like Forth and Snobol seem to have disappeared, generation-old Cobol and Fortran programs are still running.
That my domino game has survived the transition from CP/M to Windows 2000, from 4 MHz to 1 GHz, from Microsoft Basic to Microsoft omnipotens, allows me a not-quite-Proustian sense of defeating time. Writing a computer program is solving a puzzle to which no prior solution existed. Isn’t that true as well of the originary hypothesis?