How To Overcome Your Anxiety And Enjoy The New Millennium.
By Donald A. Falk
JANUARY 3, 2000: AS THE MILLENNIUM moment approaches, it is apparent that a great deal of free-floating anxiety is accumulating around this event. Various people have made various tidy sums of money predicting everything from war and revolution to apotheosis, not to mention such everyday annoyances as power failures and a crash of the global stock market. Such high levels of anxiety are not healthy in a society with its twitching finger on the "go" button of a nuclear arsenal the size of Jupiter. Thus, as a public service I offer two easy methods that you can use safely in the privacy of your very own home to reduce your anxiety about the coming "millennium."
1. It's not really the Year 2000. Let's attack this "millennium" nonsense directly by stating what must be painfully obvious: the date is totally arbitrary. The rest of the Universe is unaware that some (but not all) of us on Planet Earth started counting 2,000 years ago, nor does the Universe much care. Indeed, many Terrans start their count of years at different points. For example, to the world's Jews, this year (which is 5760) started a few weeks ago in mid-autumn; nothing to feel anxious about, is there? The Jewish calendar dates to the traditional symbolic date of creation, which is the equivalent of 3761 B.C. So the Hebrew calendar passed Y2K in the equivalent of 1762 B.C. Did the world end in that year? No, it did not.
People have devised many other starting dates to the world's timekeeping. To Muslims, A.D. 622 is Year One. The Chinese Calendar began its first cycle in 2637 B.C. (Happy 4636, everyone!). Not surprisingly, Roman society followed the tradition of dating to the founding of Rome, the anno urbis conditae, in 753 B.C. Y2K in the Roman calendar was A.D. 1247; we seem to have survived that. Then there was the Era of Nabonassar, King Of Babylon, which began on the equivalent of February 26, 747 B.C. And so on.
Nor is the length of the calendar year an absolute. The Greeks and Romans altered the number of days in the year at least three times, creating successively over time the calendars we know as Julian, Augustan and Gregorian. In A.D. 325, the Council of Nicaea noted with dismay that people were observing Easter at a different time every year, and recommended that the length of the year be fixed in relation to the solar year, which as any first grader knows is 365 days, 5 hours, 48 minutes, and 45.6 seconds long (on average). A mere 1,257 years later, in 1582, Pope Gregory XIII decreed adoption of the Gregorian calendar, which we use today, incorporating such innovations as Leap Year and Ground Hog's Day. However, the new calendar couldn't be reconciled with the Julian system that had been in use for the previous millennium, because of "extra" solar days that had accumulated. The solution? The day after October 4, 1582 was called "October 15." Incidentally, this happened again two centuries later in England; when the new calendar system was adopted, the day after September 2, 1752 was declared "September 14." Pope Gregory also declared January 1 to be the start of the year, which until then had been March 25 (this is true), as decreed by the monk Dionysius Exiguus in A.D. 532. And so on. There is no spot in the Earth's annual orbit around the Sun marked "start here."
Incidentally, nearly all calendars devised around the world have grappled with the inconveniently uneven number of days in the solar year. Remarkably, the same solution -- an extra day every now and then, as in our Leap Year -- has been arrived at independently several times in human history, including by the Egyptians, Chinese and Mayans as well as in Greco-Roman culture.
But the dominant culture has adopted the Gregorian calendar dating to the birth of Jesus, whose arrival caused the pressing of the reset button, right? (I never did have much luck selling those genuine Roman coins dated 237 B.C.). Well, not completely right. Even Christians can disagree about what year it is. Although what we use as A.D. 1 was selected by the monk Dionysius Exiguus, most biblical scholars place Jesus's birth around the year 4 B.C. (by convention there is no year "A.D. 0," an annoyance primarily to paleontologists and followers of Nabonassar). In case this all strikes you as just a teeny bit arbitrary -- you're right. The Millennium is an artifact of history, not an absolute fact of the Universe. No worries.
By the way, if you would like to feel even more reassured that "2000" is an arbitrary number, feel free to contact the International Fixed Calendar League, founded in 1923 (A.D.) by Moses B. Cotsworth. Better yet, write your slacker of a congressperson and demand support for one of the many bills introduced over the years into the House of Representatives that would establish the Perpetual Calendar devised by Willard Edwards of Hawaii as the official calendar of the United States. The Perpetual Calendar has many notable advantages, including the elimination of all Fridays the 13th.
2. We don't have to count in Base 10. Not only does the Universe not care when we started making scratch marks on the walls of our caves: it doesn't care a whit how we keep track of the numbers. We are so accustomed to counting in Base 10 that it rarely occurs to us that there is an alternative. Yes, most of us have ten fingers and all that -- but do you still count on your fingers? Since our counting system is at the heart of the whole Y2K anxiety attack, if we can do a little numerical jiu jitsu, we can avoid the whole mess.
Bases work something like this. Let's say you have eleven carrots. In our counting system, we have nine unique characters (1 through 9) plus zero. When we get to the tenth carrot, we don't use a unique character for it; instead we adopt the convention of saying that we now have "one" of the whole number series -- that is, "ten" -- and we begin counting from zero again (if you have trouble with this concept, ask your 5-year-old for help). Abacus users record this by sliding over a counter that means "one set of ten things has been recorded"; in our decimal system, we write one in the "tens" column. Then the "eleventh" carrot is the first in a new series of counting from 0 to 9, and so on.
But if your counting system had enough unique characters, you could count all the carrots without having to go to the "tens" column (for example, 1, 2, 3, 4, 5, 6, 7, 8, 9, &). If your counting system had fifty unique characters, you could write the number that we record "50" using whatever character was chosen to represent that number. On the other hand, if your counting system stopped at 8, then the ninth carrot would be written "10" (one set of nine), the tenth carrot would be written "11" (one set of nine plus one extra), and so on. Bases are just a convenient way to summarize large numbers without having to remember so many different characters, and they make computation easier, too.
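The carrot arithmetic above can be checked mechanically. Here is a minimal sketch in Python (the function name `to_base` is my own invention, not anything sanctioned by the Duodecimal Society):

```python
def to_base(n, base):
    """Write a non-negative integer as a string of digits in the given base.

    Works for bases 2 through 10, where the ordinary digit characters suffice.
    """
    if n == 0:
        return "0"
    digits = []
    while n:
        n, remainder = divmod(n, base)   # peel off one "column" at a time
        digits.append(str(remainder))
    return "".join(reversed(digits))

# In a counting system that stops at 8 (Base Nine), the ninth carrot
# is written "10" and the tenth carrot "11", just as described above.
print(to_base(9, 9))    # -> 10
print(to_base(10, 9))   # -> 11
print(to_base(11, 10))  # -> 11 (the familiar eleventh carrot)
```

Each pass through the loop slides one abacus counter over, so to speak: the remainder is the digit left in the current column, and the quotient is what carries to the next.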
Lest you think this is merely academic, it turns out that many societies throughout history have used bases other than 10, including 4, 5 and 20. The first known written counting system (a decimal system, incidentally) was developed in Egypt about 5,000 years ago. The Babylonian sexagesimal counting system, devised more than 3,600 years ago, marked numbers up to 10, then six sets of 10 up to 60. From this we derive our hours (and degrees of the compass), divided into 60 minutes of 60 seconds. Shortly after the French Revolution, a committee was established to recommend the best counting system for the new era. Some of its members favored the duodecimal system (that is, Base 12), which along with Base Eight (octal) has many advantages in multiplication and division. The Duodecimalists did not prevail, for reasons of politics, not mathematics. But the flame of their noble cause is still tended by the (are you ready?) Duodecimal Society of America, founded in 1944 (or as they would say in Base 12, 1160).
In fact, bases other than 10 are already widely used in our society. Computers, idiot savants that they are, count in Base Two -- that is, the binary system. What Base 10 counts as "1, 2, 3, 4, 5," a computer counts as "1, 10, 11, 100, 101." Many widely used computer and engineering languages use hexadecimal, which has fifteen unique characters (1 through 9 and A through F) plus zero. Any number can be converted to another base with relatively little effort using a simple algorithm, and they all mean pretty much the same thing: representation of a particular unique number.
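Modern languages will happily humor you on this point. A sketch using only Python's standard built-ins:

```python
# A computer's count to five, rendered in Base Two.
print([format(n, "b") for n in (1, 2, 3, 4, 5)])  # -> ['1', '10', '11', '100', '101']

# The dreaded year in hexadecimal (Base 16) and binary (Base 2) --
# and the round trip back to Base 10 recovers the very same number.
print(format(2000, "x"))        # -> 7d0
print(format(2000, "b"))        # -> 11111010000
print(int("7d0", 16))           # -> 2000
print(int("11111010000", 2))    # -> 2000
```

The point of the round trip is the article's own: "7d0", "11111010000" and "2000" are three costumes on one and the same number.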
What does all this have to do with Y2K madness? It's very simple, really. The fact is, "2000" is just an artifact of Base 10. Not ready for the millennium yet? OK, just convert to, say, Base 11, in which next year is 1559 -- hardly anything to make a body nervous, and in any event you have several hundred years to get ready for the next "millennium."
An alternative approach (which I happen to favor) is to convert to a base smaller than 10, because then the "millennium" has already happened! In Base Nine, for instance, Y2K happened years ago, in what everyone else called "1458." According to my extensive research in the library and several local Public Houses, the world did not end in that year. On the other hand, the Turks did sack the Acropolis, and the Spanish poet Marqués de Santillana died, so maybe the Millennialists are right in their own way.
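Both escape hatches check out arithmetically, as a short Python sketch confirms (the helper `to_base` is my own, with the conventional letters A through F standing in for any digits past nine):

```python
def to_base(n, base):
    """Render a positive integer as a string of digits in the given base (2-16)."""
    digits = "0123456789ABCDEF"
    out = ""
    while n:
        n, remainder = divmod(n, base)
        out = digits[remainder] + out
    return out or "0"

# Not ready for the millennium? In Base 11, the year 2000 is written "1559".
print(to_base(2000, 11))   # -> 1559

# Prefer to have it behind you? The year written "2000" in Base Nine
# is 2 x 729 = 1458 in ordinary decimal -- the year the Turks took the Acropolis.
print(int("2000", 9))      # -> 1458
```

So the column keeps its promises: 1559 in Base 11 and 1458 via Base Nine, exactly as claimed.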
We can all agree that the original decision to drop the century digits from dates in so many computer programs has had the most spectacularly bad benefit/cost ratio of any decision in human history (Benefit: save two bytes of computer memory per date. Cost: in the trillions of dollars, even in Base 10). But let's not get overwrought. With a few simple changes of perspective, the whole fuss goes away. See you on March 25, 1559.
© 1995-99 DesertNet, LLC . Tucson Weekly