Sunday, June 29, 2008
Transformers as literature?
I was thinking about it tonight (ok, ok, I admit I watched it for the 43,455th time) and realized how intensely violent the movie is.... I mean, it's sheer carnage everywhere, and it almost never stops for the whole 90-minute show. Most of the Autobots and Decepticons I'd seen on a daily basis for the last two years of devotedly watching the cartoon got killed, some in decidedly gruesome ways-- Ironhide dies from an execution-style shot to the back of the head. Starscream gets incinerated. Optimus Prime dies on the operating table. Whole planets get wiped out, and the survivors fed to Sharkticons or thrown into acid pits and melted, both of which you get to watch. Ultra Magnus says 'dammit,' which was, at that point, the first obscenity I had ever heard on television.
This is heavy stuff when you're nine years old. Up to that point, having grown up on thoroughly sanitized, bowdlerized, and wholesomeificated (I think I just created a new word) American children's programming, you took certain things as immutable. The good guys won. There was a lot of shouting and explosions and some drama, but everyone survived.
The Transformers movie changed that, and suddenly the world became a bit darker and less certain. Suddenly the good guys DIDN'T always win, and everyone didn't just get up and have their arms and legs welded back on. It was one small step towards adulthood.
On the whole, I think this movie had more of a lasting impact on me than most of elementary school, which probably says a lot about the US educational system. From the standpoint of appreciating literature, for example, I had (and still have) nothing but contempt for The Indian in the Cupboard or The Phantom Tollbooth, but I still mutter the defiant "spare me this mockery of justice" line under my breath when I'm feeling wronged, because I think it speaks much more directly to the human (or alien robot) condition than a story about a magic cupboard that brings dolls to life. The fact that the line was delivered by a character who had just seen his planet eaten, and who threw it into the teeth of beings who were about to sentence him to death, just added weight to it. That is pathos one does not get from Louisa May Alcott or Judy Blume.
The problem with most children's literature, I contend, is that it is mostly written by people who assume children to be naive, stupid, and in need of being cushioned from reality, and who write the books accordingly.
From the standpoint of a positive role model, on the other hand, it was hard to find a better one on TV at that time than Optimus Prime. Leadership, charisma, nobility, self-sacrifice, and putting the needs of the many ahead of his own-- these are traits you don't often find in cartoon characters. How much does it say about me that my two biggest childhood role models were a space pirate and a giant alien robot that transformed into a truck?
Just as a side note...
It's still a bit of a surprise to me that the great Orson Welles' last role was in this movie... were irony not passé, I would say it's ironic that the guy who gave us Citizen Kane, the 1938 War of the Worlds radio broadcast, and Chimes at Midnight went out with, as his last credit, voice-acting a giant planet that turned into a robot and ate other planets.
Saturday, June 7, 2008
It's Evolution, Baby!
One of the things I've been puzzling over for most of my life—in between Incredible Hulk-style fits of rage at my boss (where is my paycheck, dammit?)—is the question of 'what if.' What if Richard III had won at Bosworth Field, what if Hitler had been a successful artist rather than a disgruntled and resentful failure, and so on. It's basically a question of how history would have turned out differently if some crucial element in the past had happened in a different way.
Heck, I was rereading Neuromancer the other day; this book was supposed to be an uber-cutting-edge view of the future when it came out in 1984, and it's basically the ur-text for cyberpunk. It popularized a lot of concepts like artificial intelligence, virtual reality, cyberspace, etc. Then I got to a part in the book where one of the characters is trying to sell on the black market three megabytes of RAM that he stole out of a Toshiba laptop. I just about fell out of the chair laughing. Three MB must have seemed like a lot then, when 640 KB was the cutting edge, but right now I can't think of anyone who would blink at three megs. I have 512 MB in my computer right now, and the first PC I bought, ten years ago, had 64.
Evolution—both biological and technological—works much like history, which is in a sense an account of the evolution of human culture. We tend to assume that whatever happened previously was essentially a stepping-stone or intermediate stage toward what we have now—liberal capitalism, Homo sapiens, and all of Homo's stuff, from V-8 engines to iPods.
This 'Whig' view of history is a very popular one; it's very pervasive, even at a sort of subconscious cultural level. Part of this popularity is due to the apparent simplicity of the idea—it just makes sense to think that all of history, in one way or another, took place so that the world could be the way it is now. You'd be surprised how many people's brains get hung up on ideas like the ultimately successful revolt of some British colonies in North America being fated to start in 1775, regardless of what happened in 1774. The worst offenders are often historians or pundits—for example Thomas Friedman, Francis Fukuyama, and Thomas Macaulay, all of whom sort of wind up their narratives in "and they all lived happily ever after" land.
What we don't always consider—unless we're paid to—is the room for improvement in what we have now, or the other forms that things could have taken. As a general thing, people assume that what we have now is the best we could possibly have, even though we could easily have wound up with something totally different doing the same job.
When it comes right down to it, there's any number of 'also-rans' in the history of technology (I can think of dozens from just the last century and a half) that aren't necessarily failures or dead-ends in a technical sense—they worked, and sometimes very well—but which for whatever reason fell by the wayside. These aren't the Cambrian-era critters from the Burgess Shale finds like Nectocaris or Anomalocaris, species that represent paths of evolution that dead-ended hundreds of millions of years before the first dinosaurs. This is just stuff that didn't sell, or was outcompeted, or happened to have a run of bad luck or come onto the market at a bad time.
Nearly everyone has heard of the 'format war' between VHS and Betamax—it's the stuff of pop culture legend and countless term papers by undergraduate business majors. In short, the Sony corporation developed the first affordable home video system—a VCR, essentially—in the mid-1970s. Rather than simply adopt Sony's Betamax format for its own equipment, though, the JVC Corporation, one of the major producers of televisions and the like, decided to put a big chunk of money into developing its own video format, what became known as VHS, rather than be tied to Sony's format and watch all the money for that market go to Sony. As things played out during the 1980s, JVC managed to sell more units in VHS than Sony could in Beta, so that economies of scale (it's easier and cheaper to produce many of something than a few) helped make VHS cheaper to produce and, therefore, to sell. A certain amount of arm-twisting of retailers and video manufacturers also occurred—some movies were simply never released on Betamax, and some stores never carried Betamax machines.
By the end of the 80s, thanks to clever marketing, "Betamax" sounded like as much of a laughable dinosaur as the eight-track tape cartridge had a few years earlier.
The fact is, using comparable equipment, Betamax's sturdier videotape initially produced better-quality images than flimsy VHS. For this reason, the Betamax tape technology—in its professional incarnation, Sony's Betacam—was enthusiastically adopted by the broadcast industry, for uses like TV news camera crews, where its compatibility with Sony's popular camcorders made it the industry standard. Ironically, it continues in use in this application, almost ten years after DVDs began eclipsing VHS. Every time you see Dave Barsky or Doug Glover changing a tape on Dirty Jobs, it's probably still a Sony cassette.
One of the most important technologies of modern life is the internal-combustion engine, which made possible everything from cars, trucks, and airplanes to aircraft carriers and diesel generators and who knows what else. These engines—developed in the 1800s and likely to dominate until well into the 21st century—are so pervasive and so central to modern life in the developed world that the matter of fuel for them, crude oil or other sources, has become the linchpin of modern global economics, as well as of political and military policy for many countries.
The question is, then: do we have the best internal combustion engines that we could have? Most of the recent attention to energy matters has been devoted to alternative fuels, such as ethanol, biodiesel, or hydrogen, but relatively little thought has been expended on the engines those fuels would feed. Since the early 1900s, virtually all engines of this type have been of the familiar reciprocating type, where pistons driven by exploding mixtures of fuel vapor and air turn a crank, rather inefficiently converting chemical energy into mechanical energy—all those pistons and cranks have lots of inertia and are prone to friction, which takes away from the amount of energy they transfer to whatever is next up the mechanical chain. Are there better alternatives to the engines themselves, rather than just the fuels? We've all heard the urban legends about secret fuel additives or mysterious pills you can put in a car's tank to drive fifty thousand miles on a single tank of gas, except that they've all been buried by ExxonMobil or General Motors. Diesel engines were originally designed to run on powdered coal, alcohol, or vegetable oil, but petroleum was so cheap for most of the 20th century that few other fuels could compete, and so abundant that few in the general public took seriously the possibility of it running out or being cut off.
So what else was out there, besides the reciprocating engine with its cumbrous entanglements of radiators and baths of lubricating and cooling fluids?
Knox air-cooled engine
One of the more interesting types of engine ever to drive a motor vehicle was the one developed by Harry Knox for the Knox vehicles manufactured in Springfield, MA in the early 1900s. At the time, the Springfield area was home to dozens of factories producing machinery and consumer goods of almost any imaginable type, as well as the enormous federal armory (home of both types of 'Springfield rifle,' in addition to the M-1 Garand and M-14), textile mills, and innumerable small machine shops. In the days before Detroit and the kakocracy of the Big Three, Springfield also produced many motor vehicles—the Duryea Motor Wagon Company, the first American automotive manufacturer, was established there in 1895, and Rolls-Royce had a manufacturing plant there. Knox was the largest of the many local automotive companies.
Most motorcars built at that time were relatively fragile and erratic things, but Knox built his vehicles like forts, with heavy-duty transmissions and gearing, and branched out into relatively niche markets such as fire engines and heavy trucks. His vehicles quickly became favorites of fire departments across the country, at a time when reliability was a rare and sought-after quality. Knox vehicles regularly trumped all rivals when it came to long-distance road rallies or more taxing challenges like hill climbs, capturing two of the four prizes in a grueling Boston-to-New-York contest in 1902 and regularly running courier services from Boston to cities around New England. A Knox never held a speed record, though—that honor belonged to the contemporary Stanley Steamer, which set a record of 127 miles per hour in 1906—but it was among the first cars to climb Mount Washington.
Also of note is that most Knox vehicles managed about twenty miles per gallon of gas with a full load, a commendable feat considering that this was at a time when gasoline was comparatively hard to find, and well before the addition of tetraethyl lead or other octane boosters that increased engine efficiency.
One of the key parts of the Knox story was the unique 8-horsepower air-cooled engine, a metallurgically and thermodynamically sophisticated system that avoided the problems associated with water-cooled engines, which were heavier and wouldn't run without coolant water, which in turn had to be refilled constantly in warm weather and could freeze in the wintertime. The Knox engine was nicknamed "Old Porcupine" because it bled off heat through 1,750 steel rods screwed into the engine's casing, increasing the engine's surface area (and thus the rate at which heat bled off) by a factor of 32, further augmented by a fan that blew air across the cooling surfaces. Later vehicles had two such systems. Knox marketed their vehicles as "the car that never drinks."
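Just for fun, here's a back-of-the-envelope Python sketch of how a porcupine's worth of rods multiplies cooling surface. Every dimension in it is my own guess (I have no idea what the actual rod or cylinder sizes were), so treat it as an illustration of the principle, not a spec sheet for the Knox engine:

```python
import math

# Rough sketch of pin-fin cooling: how much surface do 1,750 rods add?
# All dimensions below are assumptions for illustration, not Knox specs.

NUM_PINS = 1750          # from period descriptions of the Knox engine
pin_diameter = 0.005     # m (~3/16 in) -- assumed
pin_length = 0.05        # m (~2 in)    -- assumed

# Bare cylinder barrel, assumed dimensions
cyl_diameter = 0.12      # m
cyl_height = 0.15        # m

bare_area = math.pi * cyl_diameter * cyl_height  # lateral surface only

# Each pin adds its own lateral surface, but shades the spot it screws into
per_pin = math.pi * pin_diameter * pin_length - math.pi * (pin_diameter / 2) ** 2
finned_area = bare_area + NUM_PINS * per_pin

print(f"bare area:   {bare_area:.3f} m^2")
print(f"finned area: {finned_area:.3f} m^2")
print(f"multiplier:  {finned_area / bare_area:.1f}x")
```

With those guessed numbers you get a multiplier in the mid-twenties, which is at least the same order of magnitude as the factor of 32 claimed for the real thing.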
Harry Knox himself—an innovative mechanical engineer but not a businessman—was forced out of the company by his financial backers by 1908; they ran the company into the ground, and the firm went bankrupt in 1915. From then until the arrival of the Volkswagen, air-cooled engines remained a rarity in American cars. Harry went on to start other businesses, including the successful Atlas Motor Truck Company, and designed armored vehicles for the US government during the Second World War.
Rotary engine
Ever hear of an internal-combustion engine with only one moving part? Check out the Wankel rotary engine, which first ran successfully in 1957. If you have a jetski, you might already have one. Unfortunately, the only real use of them in production autos was by Mazda (the RX-7 and RX-8) and a few European manufacturers like Citroen and NSU. In many respects, they're better than the reciprocating (back-and-forth piston) engines that have become the dominant type: they're very simple and easy to maintain, because they have only one moving part, a sort of trefoil-shaped rotor. They also produce very little vibration and are quieter (noise and heat are signs of inefficiency), more fuel-efficient, more mechanically efficient, more compact, and lighter than reciprocating engines of the same size and horsepower rating. Current designs even meet California's efficiency and emissions standards.
As to why it never took off in the US, well, that's not exactly clear, but it appears that when the idea was marketed to the major US auto firms in the 1960s and 1970s, General Motors and company were reluctant to devote money and manufacturing capacity to something they viewed as an untried European novelty. GM bought the rights to manufacture and sell the design in the US, but then sat on them—AMC (which built Jeeps at the time) intended to release its compact Pacer with a rotary engine, but when GM didn't deliver the engines, AMC had to shoehorn a reciprocating engine into the car. The combination of economic downturn, higher gas prices during the 1970s oil crisis, and new emissions standards caused the automotive industry to drop any ideas about new types of engines in favor of sticking catalytic converters onto existing ones.
The basic design of automotive engines hasn't changed much since the early 1900s, and all the hard-won increases in efficiency the engineers manage to squeeze out through closer manufacturing tolerances, better fuels, or lighter materials get swallowed up by the increased mass of cars—in the case of luxury sedans, SUVs, and trucks—or by the myriad electrical components that cars now include. Your air conditioner burns gas, as do your sound system, your GPS, and those heated or chilled seats that you just can't live without.
Speaking of electrical things, let's change scope here, from the level of a car's system to that of a city's power grid.
AC/DC
Plug something electrical into the wall. Ok, that's simple. Now think about where the electricity comes from—power plants, transmission lines, distribution substations, etc. That puts things into a different perspective. Now contemplate all the thought and experimentation and engineering and hard work that went into setting up the power grid in the first place. That's a big deal. Why 120-volt current, the standard used in the US and the rest of the Americas? Why not 230 or 240 volts, which are used almost everywhere else? Why AC rather than DC? Because that's how the infrastructure was created, and in the long term it's been easier to build things that work on the current we have than to change the current.
Most of it boiled down not so much to the necessities of engineering—no one current is really intrinsically better than any other—as to differences between the designs developed by the first couple of generations of inventors and industrialists in the electrical supply business, and to the preferences of the people who made things that used electricity. The classic case study is the 'War of the Currents' that developed in the late 19th century between the two major national electrical suppliers in the United States.
On the one side was Thomas Edison, the Bill Gates of the electrical industry and founder of what became General Electric, who for some years had enjoyed a virtual monopoly on electrical technology because he held most of the US patents for anything involving direct current (DC) power, and enforced those patents with an iron will. Just as an aside, please note that most of the innovations credited to Edison personally were actually the product of Edison the corporation; his greatest innovation, so to speak, was to establish the world's first industrial research and development laboratory, and he had an army of engineers, mechanics, chemists, and others working for him.
On the other side were the industrialist and engineer George Westinghouse and his partner, the electromagnetic boy wonder Nikola Tesla, who had developed a new electrical technology based on alternating current (AC). Not only did the AC system offer some major advantages over DC, but it also provided another sort of value—a means to circumvent many of Edison's patents and break the monopoly on electricity.
There was also a personal element to the conflict- the Serbian-born Tesla had once been an employee of Edison's, and had attempted to introduce the AC concept to Edison himself, only to be insultingly dismissed. That bit of arrogance would cost Edison dearly; Tesla carried a grudge against his former employer for the rest of his days.
The major difference was infrastructural. Edison and General Electric liked the idea of distributed generation—lots of small local power plants supplying neighborhoods—an arrangement suited to the limitations of DC power. The technological crux of the matter was 'voltage drop': 110-volt DC current, the standard in the Edison system, could only be sent a mile or two before the resistance of the wires weakened it so much that it became useless. Distributed generation was also in keeping with the prevalent way of doing things at the time—local gasworks (mostly coal gas at that point) and steam plants supplying a few city blocks or neighborhoods, or large mill complexes picking up scratch on the side by selling their power plants' excess capacity to neighboring small businesses. From a logistics and economic standpoint, Edison's approach wasn't necessarily a bad idea, though it would probably have cost more in the long run, given the expense of building, manning, and supplying all those plants. The cost of copper for the wiring was also a concern, and many engineers and accountants spent a great deal of time arguing over such things.
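To put some rough numbers on that voltage-drop problem, here's a quick Python sketch. The load, the distance, and the wire gauge are all assumptions I picked for illustration, not figures from any actual Edison installation:

```python
# A minimal sketch of why Edison's 110 V DC couldn't travel far.
# The load, wire gauge, and distance below are illustrative assumptions.

RHO_COPPER = 1.68e-8     # ohm-metres, resistivity of copper

def voltage_drop(power_w, volts, one_way_m, wire_area_m2):
    """Voltage lost in a two-wire copper feeder at a given load."""
    current = power_w / volts                                 # I = P / V
    resistance = RHO_COPPER * (2 * one_way_m) / wire_area_m2  # out and back
    return current * resistance

# 10 kW of lighting load, one mile out, on heavy 4/0 AWG copper (~107 mm^2)
drop = voltage_drop(10_000, 110, 1_609, 107e-6)
print(f"drop: {drop:.0f} V of 110 V ({drop / 110:.0%} lost)")
# -> roughly 46 V, i.e. over 40% of the supply gone as heat in the wires
```

Even on very heavy copper, a single mile of line eats something like forty percent of the supply as heat, which is why an Edison-style system needed a generating plant every mile or two.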
Tesla and Westinghouse liked the idea of a smaller number of large plants (though they would still be considered small by modern standards) that could supply whole cities or counties. They had found a leg up on Edison in electrical transformers, which Westinghouse held most of the patents on, and which are basically devices for shifting electrical current from one voltage to another (I'll skip the complexities). Transformers could step the voltage up for transmission and then step it down on arrival, converting it into a more user-friendly form; high-voltage, low-current power is ideal for transmission over long distances, but dangerous to use in appliances or other devices. The ability to transmit power over long distances—tens or hundreds of miles, with the right equipment—was a major advantage, since it opened up the possibility of selling power in rural areas that couldn't afford their own power plants. Mayberry didn't need a power plant of its own, because it could just buy electricity piped in over the power lines from a plant fifty miles away. AC was also far more flexible. Rather than building new power plant capacity in an area of increased demand, you simply bought more electricity from elsewhere, perhaps upgrading the transmission lines and substations, both of which were much cheaper than a whole new plant. Likewise, if demand in an area dropped, you weren't out as much of an investment—you could sell the power elsewhere. And if you needed, say, 9-volt or 18-volt DC current, or 220-volt AC current, you could use a transformer (plus a rectifier, for the DC) to turn 120-volt AC into what you wanted.
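The payoff from stepping the voltage up falls straight out of basic circuit math: for a fixed amount of power, the current is P/V, and the resistive loss in the line is I²R, so every tenfold increase in voltage cuts the loss a hundredfold. Here's a hedged sketch with made-up line figures:

```python
# A minimal sketch, with assumed line figures, of why transformers won:
# for fixed power P, line current is I = P/V, and resistive line loss
# is I^2 * R -- so raising the voltage tenfold cuts the loss a hundredfold.

def line_loss_watts(power_w, volts, line_ohms):
    current = power_w / volts
    return current ** 2 * line_ohms

POWER = 100_000   # 100 kW of load -- assumed
R_LINE = 2.0      # total line resistance in ohms -- assumed

for volts in (110, 1_100, 11_000):
    loss = line_loss_watts(POWER, volts, R_LINE)
    print(f"{volts:>6} V: {loss:>12,.1f} W lost ({loss / POWER:.2%} of the load)")
# At 110 V the 'loss' exceeds the load itself -- the line simply can't
# deliver the power. At 11,000 V the same line loses a fraction of a percent.
```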
Edison launched a massive public-relations campaign to discredit AC power as dangerous, even going so far as to back the development of the AC-powered electric chair, an execution device, in order to demonstrate how much more lethal AC was than Edison's own DC. The first execution, in 1890, didn't go well—the engineers underestimated the necessary power, and the prisoner had to be electrocuted repeatedly in order to kill him. Popular opinion was horrified—so much for Edison's advertising meme of a humane and scientific method of execution—but the method stuck. Electrocution has been plagued with misfires and failures ever since.
The first big battle of the War of the Currents was fought over Niagara Falls. After a furious, years-long war of bids and bribes, in 1893 Westinghouse won the contract to construct a hydroelectric plant at the falls, a major defeat for Edison.
In 1898, Edison endorsed the publication of a book titled Edison's Conquest of Mars, an unauthorized sequel to H.G. Wells' War of the Worlds. This book cast Edison as Earth's champion, a scientific hero who virtually singlehandedly built a space fleet (complete with death rays and space suits, all presumably DC-powered) and personally led an invasion of Mars in retaliation for the Martian attack on Earth. It was Edison's idea of PR, but it was also one of the earliest examples of space-opera science fiction.
Leon Czolgosz, the Polish-American anarchist who assassinated President McKinley, was executed via electrocution in October 1901; Edison's film company recorded a staged reenactment of the occasion for posterity.
The Wizard of Menlo Park eventually took things to the extent of publicly killing animals with AC current to demonstrate how dangerous it could be. The most notorious such stunt was the controversial electrocution of a Coney Island circus elephant named Topsy in 1903. Granted, Topsy had already killed three of her handlers and was scheduled to be euthanized as a dangerous animal; one 'expert' had actually put forth a plan for hanging her from an enormous gallows. One wonders why simply shooting the animal with a large firearm—the proverbial 'elephant gun' of the big-game hunters—was not the chosen method.
Edison staged the event in front of an audience of 1,500 people and filmed the proceedings; he later released the footage to theaters under the blunt title "Electrocuting an Elephant." As any good engineer or PR man would, however, Edison had a Plan B—just in case the electrocution misfired, Topsy was dosed beforehand with over a pound of extremely deadly potassium cyanide.
The war of the currents went on, with the tide slowly turning in favor of AC, as the economic advantages of an AC system over a DC system became overwhelming—by the mid-20th century, even those cities that had had DC electrical networks were converting to AC. Consolidated Edison continued to supply DC power to dwindling numbers of clients in New York City until 2007, and DC remained standard on most ships until well after the Second World War, when the increased use of shipboard electronic sensors and computers made a switch to AC necessary.
DC remains common, though, in applications such as batteries, automobiles, wind and solar power systems, and emergency power systems. High-voltage direct current (HVDC) transmission is a relatively new application that uses solid-state converter equipment to send DC over long distances at greatly boosted voltages.
A word about Niagara Falls: this was, you understand, back in the days when majestic nature was looked upon mostly as raw material to be converted into something useful to mankind—that was what 'progress' looked like at the time, taming nature to improve the life of man. As it happened, the ready availability of cheap electricity made the Buffalo area extremely attractive to energy-intensive industries, especially the manufacture of industrial quantities of chlorine—the Ur-chemical for most toxic waste—along with caustic soda, phenol, pesticides, and other chemicals.
It's A Bird, It's A Plane….?
The Ekranoplan was one of the stranger products of the Soviet Union, spawned by the febrile mind of an engineer named Rostislav Alexeyev, who made his name designing hydrofoils before essentially inventing the field of 'ground effect vehicles.' The name literally means "screen plane," and this unusual craft was intended to act somewhat like a hovercraft, racing across water (or, theoretically, ice or very flat land) on a cushion of compressed air, taking advantage of an aerodynamic phenomenon usually referred to as "wing-in-ground" effect.
The most famous version, the KM, first flew in 1966 and was promptly nicknamed "the Caspian Sea Monster" by NATO intelligence, because it was being tested on the Caspian Sea, moved at terrific speeds, and nobody could figure out what it was—it looked like an airplane, and was in fact much bigger than any contemporary aircraft (almost 330 feet long), but it stayed close to the surface.
In some respects, the Ekranoplan could have been a deadly opponent in a fight—flying at high speed only meters above the water, it could have stayed below an enemy fleet's radar while still closing at speeds approaching those of a conventional airplane (300-400 mph) while loaded with antiship missiles. Ekranoplans were also much bigger than most combat aircraft and able to carry heavy loads (thirty tons or more, translating to perhaps an entire marine company), so the Soviet military also explored their possible use as fast troop transports. The Lun (Hen Harrier), which entered service with the Soviet Navy's Caspian Flotilla in the late 1980s, was the size of a jumbo jet, carried six extremely deadly SS-N-22 Sunburn antiship missiles, and had a tested speed of 300 miles per hour. Unfortunately (or, from the perspective of nervous NATO admirals, fortunately) only one of this type was ever built.
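For the curious, the aerodynamics textbooks have a standard rule of thumb (McCormick's approximation) for how much flying that low cuts a wing's induced drag. Here's a little Python sketch of it; the wingspan figure is my rough guess at the KM's, not an official spec:

```python
# A hedged sketch of why skimming the surface pays off. McCormick's
# textbook approximation scales a wing's induced drag by phi in
# ground effect:
#   phi = (16*h/b)^2 / (1 + (16*h/b)^2)
# where h is height above the surface and b is wingspan.

def ground_effect_factor(height_m, span_m):
    x = 16 * height_m / span_m
    return x ** 2 / (1 + x ** 2)

SPAN = 40.0  # metres -- assumed, roughly the KM's wingspan
for h in (2, 5, 10, 40):
    phi = ground_effect_factor(h, SPAN)
    print(f"at {h:>2} m altitude: induced drag x {phi:.2f}")
```

Skimming a couple of meters above the sea, the wing pays well under half the induced-drag bill it would pay in free flight, which is the whole trick.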
Unfortunately, the most famous Ekranoplan, the prototype KM, crashed due to pilot error in 1980; in a fit of political embarrassment, the Kremlin then basically sacked Alexeyev, and without him at the helm the whole project spiraled down the drain. The existing models soldiered on, but nothing new was produced before the collapse of the USSR.
Two smaller Ekranoplans, circa the mid-1980s.
On a final and somewhat lighter note, consider the humble bass guitar. There have been instruments playing in the bass range for as long as there is any record of music. Leo Fender developed the first commercially produced electric bass guitar in 1951, with the objective of giving musicians a bass instrument that could keep up with the rest of a big band—horn sections, drummers, or electric guitarists, whose volume often drowned out the band's bassist. The first thing off his production line was the original Fender Precision bass, which with subsequent refinements became the industry standard for electric bass guitars.
Oh, a quick word about the coal tar thing….
Wednesday, June 4, 2008
Lead and Hillary
I discovered a funny thing yesterday. For all the complaining about how dangerous lead is to children, and all the howling about Chinese toys full of lead, we're not too careful about how we regulate it. In MA, under the Department of Public Health's Lead Poisoning Prevention and Control Regulation (105 CMR 460.00), "dangerous levels of lead" for the category of stuff that includes children's toys are defined as "equal to or more than 5,000 parts per million (ppm) or equal to or more than 0.5% by dry weight, as measured by atomic absorption spectrophotometry [i.e. certified laboratory testing]."
The Mass DEP's reportable concentrations and risk assessment standards for soil at residential properties, parks, schools, etc. are both 300 ppm—if you have more than that, you're required to perform a cleanup.
The upshot of this, which I hashed out with Mass DEP and DPH staff, is that a child's toy can legally contain more than sixteen times as much lead as would be allowed in soil contaminated by hazardous waste.
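Don't take my word for it; the arithmetic fits in a few lines of Python (the two limits are straight from the regulations quoted above):

```python
# The arithmetic behind the comparison -- limits taken from the two
# Massachusetts regulations quoted above.

toy_limit_ppm = 5_000      # 105 CMR 460.00: "dangerous level" in toys
soil_standard_ppm = 300    # Mass DEP residential soil standard

print(f"0.5% by dry weight = {0.005 * 1_000_000:.0f} ppm")  # sanity check
print(f"ratio: {toy_limit_ppm / soil_standard_ppm:.1f}x")   # ~16.7x
```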
Ummmmmm?
Oh, just to clear up another thing that's been bugging me the last couple of days. Hillary Clinton, despite her widely-bellowed claims, did *not* win the popular vote. I defer to RealClearPolitics here, for their table of tallies. Do the math.
Without the IA, NV, ME, WA caucuses and without the Michigan primary, Barack ended up with 33,916 more votes than Hillary Clinton.
With the IA, NV, ME, WA caucuses and without Michigan, Barack ended up with 144,138 more votes than Hillary Clinton.
Barack won by every standard there is. That's it. More to the point, Barack won by the rules and procedures established by the Democratic Party, which included not campaigning in Florida and Michigan after those states' primaries broke the rules despite being warned in advance and were decertified, and taking his name off the Michigan ballot entirely. Hillary didn't campaign either, but she did keep her name on the ballots. Strictly speaking, by party rules, none of the votes in Florida and Michigan should be counted, a position initially supported by the Clinton camp until they realized they needed those illegitimate results to stave off defeat and started screeching about how every vote must be counted. If the people in Michigan and Florida want to complain to anyone, they should tear their state governments a new one for ruining things for them.
One of the more annoying things I noticed about the Clinton campaign was the way they continually glossed over contests they didn't win, although that eventually backed them into a corner when they got to Pennsylvania—claiming "he doesn't have a chance if he can't win Pennsylvania" just doesn't work, either mathematically or in the court of common sense. The blogger Markos Moulitsas Zuniga, who founded DailyKos, described her whole plan as the "insult forty states" strategy. I thought that was pretty accurate—she gambled everything on big wins in the big left-leaning states and didn't bother to plan for a long game anywhere else; she essentially wrote off everything she didn't think was important, and boy did that come back to f*ck her in the ass.
Hell, Al Gore didn't make this much of a fuss after the Supreme Court declared W the winner in Florida and thus Presidunce, even though Gore had 543,816 more popular votes than Georgie Porgie.
And John McCain? Four words for you. "George. Bush's. Third. Term."
He's complaining that Obama's 'naive' and accusing him of repeating things that aren't true?
Cue the McCain greatest hits, Al....
"I believe... that the Iraqi people will greet us as liberators." (March 20, 2003)
"Saddam Hussein is on a crash course to construct a nuclear weapon." (October 10, 2002)
"This conflict is... going to be relatively short." (March 23, 2003)
"I would argue that the next three to six months will be critical." (September 10, 2003)
"I think the initial phases of [the war] were so spectacularly successful that it took us all by surprise." (October 31, 2003)
"Only the most deluded of us could doubt the necessity of this war." (August 30, 2004)
"We will probably see significant progress in the next six months to a year." (December 4, 2005)
"We can know fairly well [whether the surge is working] in a few months." (February 4, 2007)
Plus the guy needs Lieberman to remind him of the difference between Sunni and Shi'a, and which one Iran is run by.... Forty-odd years of government service and this is the best he can do? Newer model, please.
Tuesday, June 3, 2008
Narf?
Incidentally, guess which particular job floated back to the surface today, like a turd that refuses to go down with the flush? That's right, the glove factory. Never mind that we haven't heard a peep from the client on this job since December, and that he still owes everyone money..... The hilarious part is, I had a shortcut to the report folder on my desktop for quick access, and this guy called up about resurrecting the job not FIVE MINUTES after I deleted said shortcut to clean things up.
Speaking of turds and unpleasantness.....
I really wish Hillary would just admit it and quit.... this 'refusal to admit defeat' thing is getting embarrassing, and the sheer amount of lying, obfuscation, bad-mouthing, rule-twisting, and hypocrisy in the Clinton campaign isn't doing anyone any good—she's made herself look like a Republican in a very real sense. The last two or three months from her may as well have been photocopied out of Karl Rove's playbook. Face it. YOU LOST.
Face it, Hillary..... Just because Bill was president for eight years does NOT entitle you to hold the same office.