'Barely half -- 52 percent -- now believe Bush is "honest and trustworthy," down 7
percentage points since late October and his worst showing since the question was first
asked, in March 1999. At his best, in the summer of 2002, Bush was viewed as honest by 71 percent.'
'In fact, those 27 photos grace one of the four most dishonest budgets in the nation's
history -- the other three are the budgets released in 2001, 2002 and 2003. Just to give you
a taste: remember how last year's budget contained no money for postwar Iraq -- and how
administration officials waited until after the tax cut had been passed to mention the small
matter of $87 billion in extra costs? Well, they've done it again: earlier this week the
Army's chief of staff testified that the Iraq funds in the budget would cover expenses only
But when administration officials are challenged about the blatant deceptions in their
budgets -- or, for that matter, about the use of prewar intelligence -- their response,
almost always, is to fall back on the president's character. How dare you question Mr.
Bush's honesty, they ask, when he is a man of such unimpeachable integrity? And that
leaves critics with no choice: they must point out that the man inside the flight suit bears
little resemblance to the official image.
There is, as far as I can tell, no positive evidence that Mr. Bush is a man of
exceptional uprightness. When has he even accepted responsibility for something that went
wrong? On the other hand, there is plenty of evidence that he is willing to cut corners
when it's to his personal advantage. His business career was full of questionable deals, and
whatever the full truth about his National Guard service, it was certainly not glorious.
Old history, you may say, and irrelevant to the present. And perhaps that would be true if
Mr. Bush was prepared to come clean about his past. Instead, he remains evasive. On "Meet
the Press" he promised to release all his records -- and promptly broke that promise.
I don't know what he's hiding. But I do think he has forfeited any right to cite his
character to turn away charges that his administration is lying about its policies. And that
is the point: Mr. Bush may not be a particularly bad man, but he isn't the paragon his
handlers portray. '
'Still, we may be on our way to an election in which Mr. Bush is judged on his record,
not his legend. And that, of course, is what the White House fears.'
There is no meat to Bush. People merely want to believe that the President who was
in office during 9/11 would represent us all.
'Finally! Finally, someone had reported serving with Bush! But there was one small problem
with [Bill] Calhoun's claim. His account contradicted the basic chronology of the case--a
timeline that has been clear and unchallenged for the past four years. [Mike Allen] Allen and
his editors [at the Washington Post] --hopeless incompetents--seemed ignorant of the story's
God freaking damn! It's about freaking time! She and Irving are finally engaged! The
buildup has been incredible, but we've been so used to Cathy disappointments that we're
still shocked -- almost as shocked as her mother!
'Children will be taught about atheism during religious education classes under official
plans being drawn up to reflect the decline in churchgoing in Britain. Non-religious beliefs
such as humanism, agnosticism and atheism would be covered alongside major faiths such as
Christianity or Islam under draft guidelines being prepared by the Qualifications and
Curriculum Authority, which regulates what is taught in schools in England.'
" 'The whole thing is terribly biased in favour of religion right now - it's all about
encouraging an identification with religion,' said Ben Rogers, author of the report for the
Institute for Public Policy Research thinktank. 'There are huge numbers of people who are
atheists or whose families are atheists and who are coming into a class where their family's
view is not acknowledged. You should be able to have a conversation about ethics that
doesn't collapse into a conversation about religion.' "
Excellent! This sounds perfectly fair to me. Government cannot avoid religion, but it
should not favor particular religions.
My wife and 2 kids were in the car this morning when 5-year-old Connie asked for some
assistance singing the "10 Days of Christmas". Julia said that it's really the
"12 Days of Christmas", but yes we can help. So Julia started singing it. Julia is much better
at singing than I am and she has a much larger repertoire, but the "12 Days of Christmas" is one
of the songs that I actually know better than she does. So of course I eagerly jumped in too.
It was a jolly time and we were going along swimmingly, but when we got to the 11th day of
Christmas, nearly-3-year-old York suddenly shouted out: "Shut up!"
Julia and I started laughing, but Connie was distraught about the interruption in the song. So
we immediately jumped back in and finished the song strongly. Everything ended up just fine.
'The same security compound was attacked two days earlier by gunmen just as the top U.S.
commander in the Middle East, Gen. John Abizaid, was visiting the site in Fallujah.'
'The attackers freed 75 prisoners held at the station, killing the guards and shooting
open the cell doors, police Lt. Col. Jalal Sabri said. The prisoners were criminals, most
arrested for murder or theft, and none of them were suspected of involvement in the anti-U.S.
insurgency, Sabri said.'
I just finished the 12 week course "Introduction to the Rapier" at the
Chicago Swordplay Guild. It's been a blast!
There are many aspects to it but the thing about it that hit me is that it is much more
satisfying to skewer a person than it is to pull a trigger and kill someone. Swordsmanship
requires more athleticism, more training, and more skill. I've joined the Guild itself and will
continue to practice with all manner of weapons.
From left to right: Back row: Phil, George, Jim, and John. Front row: Ashley, Jain, and Nora.
'The real shame in this whole thing is that there's a chance that innovation could have
prevented it. This was highlighted in a
Times column by James Flanigan who compares the supermarkets with Wal-Mart and
Costco. The supermarkets are pointing to Wal-Mart as the bad guy because their labor costs
are lower, which allows them to offer lower prices. However, equally successful Costco pays
higher wages and benefits than either Wal-Mart or the supermarkets. Even union leaders hail
it as the best in the retail industry. Costco's employee turnover is 20% -- one third of the
industry average, a factor that some industry experts state could save 20% on labor costs
for every 10% reduction in turnover. Plus Costco is refusing to follow the other corporate
fad of offshoring jobs such as call centers.
Why? CEO Jim Sinegal says it's not altruism, "It's good business."
Costco developed a strategy that fosters higher employee productivity and yields enormous
customer satisfaction and loyalty. Sinegal also states, "I don't see what's wrong with an
employee earning enough to be able to buy a house or having a health plan for the family.
We're trying to build a company that will be here 50 years from now." Which highlights
another ingredient in this stew: Wall Street's short-term mindset. Because Costco makes 1.7
cents per dollar of sales compared to 2.5 cents for the supermarkets and 3.5 cents for
Wal-Mart, Wall Street considers this a shoddy performance. Bless Sinegal for staying the course.'
'The bottom line is always important, and the bottom line here is that perhaps what's
wrong with supermarkets isn't employees' salaries but rather the lack of creative thought in
management, and, very probably, a management team whose compensation is based on short-term
Wall Street performance rather than a more long-term human, and humane, approach.'
So Costco.com has good ethics. So does
Levis.com. That alone gives me brand loyalty. People,
'The Asian "population explosion" was actually a "health explosion" -- it was fueled almost
entirely by declining mortality due to dramatic improvements in life expectancy. That same
"population explosion" has been defused by ongoing changes in childbearing patterns.'
Clarity. 'Now, with the nomination seemingly within his reach, the Massachusetts senator
must begin to more fully explain where he stands on the major challenges facing the
Scientists develop new hydrogen reactor. 'The reactor is a relatively tiny 2-foot-high
apparatus of tubes and wires that creates hydrogen from corn-based ethanol. A fuel cell, which
acts like a battery, then generates power.'
'The research is a more complicated version of a long-studied problem: how tightly
identical spheres can be packed together. Neatly stacked, as in a pyramid of oranges at a
grocery store, the spheres occupy 74 percent of the available volume. Arranged randomly,
however, the spheres fill only 64 percent of the space. In the new research, the
scientists considered spheroids -- spheres stretched into cigar shapes or squashed into M&M
shapes. Stacked neatly, the spheroids still take up 74 percent of the space, just like
spheres. But in random arrangements, computer simulations and experiments with M&M's showed
that spheroids could be packed much more densely, filling up to 71 percent of the space.'
'If the spheroids are deformed in a second direction, into ellipsoids (in other words,
stretched or squashed so the M&M shape is no longer circular when viewed from above), then
the maximum packing density increases to 77 percent, more tightly than the simple neat
Ha ha! We dealt with stuff like this all the time in Chemical Engineering.
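That 74 percent figure for neat stacking isn't a measurement, by the way; it's the face-centered-cubic packing density, which has the closed form pi/sqrt(18). A quick sanity check in Python:

```python
import math

# Density of neatly stacked ("pyramid of oranges") spheres: pi / sqrt(18)
fcc_density = math.pi / math.sqrt(18)
print(round(fcc_density * 100, 1))  # 74.0, matching the figure in the article
```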
'If anyone's ever promised you the sun, the moon and the stars, tell 'em you'll settle
for BPM 37093. The heart of that burned-out star with the no-nonsense name is a sparkling
diamond that weighs a staggering 10 billion trillion trillion carats. That's one
followed by 34 zeros.'
'The diamond is a massive chunk of crystallized carbon that lies about 300 trillion
miles from Earth, in the constellation Centaurus. The galaxy's largest diamond is formally
known as a white dwarf, or the hot core of a dead sun.'
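The carat arithmetic checks out: "10 billion trillion trillion" (US short-scale names) is 10 x 10^9 x 10^12 x 10^12, which really is a 1 followed by 34 zeros:

```python
# "10 billion trillion trillion" carats, in US short-scale units
carats = 10 * 10**9 * 10**12 * 10**12
print(carats == 10**34)      # True
print(len(str(carats)) - 1)  # 34 zeros after the leading 1
```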
'Dunham and others spent hours looking for clues in the code, a mix of assembler, C
and C++ programming languages. The leaked Windows 2000 code contained 30,915 files and a
whopping 13.5 million lines of code, he said. And the Windows NT breach had 95,103 files
and 28 million lines. Both were available as zip files being exchanged readily on the
Internet, Dunham said.'
'Because Mainsoft used only select portions of the Windows source for MainWin,
Microsoft may find itself more worried about the egg on its face than possible exposure
of its flagship operating system; Windows 2000 served as the foundation for Windows XP
and Windows Server 2003.'
'In the struggle to meet deadlines, I think pretty much all programmers have put in
comments they might later regret, including swearwords and acerbic comments about other code
or requirements. Also, any conscientious coder will put in prominent comments warning others
about the trickier parts of the code. Comments like "UGLY TERRIBLE HACK" tend to indicate
good code rather than bad: in bad code ugly terrible hacks are considered par for the
course. It would therefore be both hypocritical and meaningless to go through the comments
looking for embarrassments. But also fun, so let's go.'
'In short, there is nothing really surprising in this leak. Microsoft does not steal
open-source code. Their older code is flaky, their modern code excellent. Their programmers
are skilled and enthusiastic. Problems are generally due to a trade-off of current quality
against vast hardware, software and backward compatibility.'
Brilliant stuff. It really looks at the roots to go beyond the present. She proves what
I've believed: that programming right now is low-level, like working in the sewers.
Eventually programming will become easier, more high-level. It has to be that way: to do
anything of complexity, the complexity must be encapsulated and hidden from the bosses/users,
and yet the complexity must be uncompromisingly correct and well designed.
"And here's what's really sad -- the overwhelming majority of so-called "successful"
development projects produce mediocre software. Take almost any corporate accounting
application, and you'll find it poor in quality, unimpressive in capabilities, difficult to
extend, misaligned with other enterprise systems, technologically obsolete by the time of
release, and functionally identical to dozens of other accounting systems. Hundreds of
thousands of dollars are spent on development, and millions afterwards on maintenance -- and
for what? From an engineering standpoint, zero innovation and zero incremental value have
"The correlation of the size of the software with its quality is overwhelming and very
suggestive. I think his observations raise numerous questions: Why are big programs so
buggy? And not just buggy, but buggy to a point beyond salvation. Is there an inherent
complexity factor that makes bugs grow exponentially, in number, severity, and in how
difficult they are to diagnose? If so, how do we define complexity and deal with it?"
"I can see two reasonable ways to create complex programs that are less susceptible to
bugs. As in medicine, there is prevention and there is recovery. Both the objectives and the
means involved in prevention and recovery are so different that they should be considered
"Having said that, these technological advances are still inadequate in dealing with
many categories of bugs. You see, a "bug" is often just a sign of recognition that a program
is behaving undesirably. Such "undesirability" may indeed be caused by mechanical problems
in which code does something different from what it was intended to do. But all too often
the code is doing exactly what the programmer wanted at the time, which (in the end) turned
out to be a really bad idea. The former is a programming bug, and the latter a design bug,
or in some exceptionally lethal cases, an architectural bug. The constant security-related
problems associated with Microsoft's products are due to its fundamental platform
architecture. Java technology, in contrast, enjoys exceptional immunity to viruses because
of its sandbox architecture."
"I don't believe that future advances in software engineering will prevent developers
from making mistakes that lead to design bugs. Over time, any successful software evolves to
address new requirements. A piece of code that behaved appropriately in previous versions
suddenly turns out to have deficiencies -- or bugs. That's OK! The reality of the program
domain has changed, so the program must change too. A bug is simply a manifestation of
the newly discovered misalignment. It must be expected to happen, really! From that
vantage point, it's not the prevention of bugs but the recovery -- the ability to gracefully
exterminate them -- that counts. In regard to recovery, I can't think of a recent
technological breakthrough. Polymorphism and inheritance help developers write new
classes without affecting the rest of the program. However, most bug fixes require some
degree of refactoring, which is always dangerous and unpredictable. "
'Q: What about the notion of complexity as the primary reason for software bugs? Do you
have any concrete ideas on how to reduce complexity?
A: Well, I see two principal weapons. One is the intuitiveness of the programming
experience from the developer's point of view. Another is the ability to decompose the whole
into smaller units and aggregate individual units into a whole. Let me start with the
programming experience first.
Things appear simple to us when we can operate intuitively, at the level of consciousness
well below fully focused, concentrated, strenuous thinking. Thus, the opposite of complexity
-- and the best weapon against it -- is intuitiveness. Software engineering should flow from
the intuitiveness of the programming experience. A programmer who works with complex
programs comfortably does not see them as complex, thanks to the way our perception and
cognition work. A forest is a complex ecosystem, but for the average hiker the woods do not
"Object-oriented programming allowed developers to create industrial software that is
far more complex than what functional programming allowed. However, we seem to have reached
the point where OO is no longer effective. No one can comfortably negotiate a system with
thousands of classes. So, unfortunately, object-oriented programming has a fundamental
flaw, ironically related to its main strength. "
"In object-oriented systems, "object" is the one and only basic abstraction. The
universe always gets reduced to a set of pre-defined object classes, some of which are
structural supersets of others. The simplicity of this model is both its blessing and its
curse. Einstein once noted that an explanation should be as simple as possible, but no
simpler. This is a remarkably subtle point that is often overlooked. Explaining the world
through a collection of objects is just too simple! The world is richer than what can be
expressed with object-oriented syntax."
"Processes are extremely common in the real world and in programming. Elaborate
mechanisms have been devised over the years to handle transactions, workflow, orchestration,
threads, protocols, and other inherently "procedural" concepts. Those mechanisms breed
complexity as they try to compensate for the inherent time-invariant deficiency in OO
programming. Instead, the problem should be addressed at the root by allowing
process-specific constructs, such as "before/after," "cause/effect," and, perhaps, "system
state" to be a core part of the language. I envision a programming language that is a notch
richer than OO. It would be based on a small number of primitive concepts, intuitively
obvious to any mature human being, and tied to well-understood metaphors, such as objects,
conditions, and processes. I hope to preserve many features of the object-oriented systems
that made them so safe and convenient, such as abstract typing, polymorphism, encapsulation
and so on. The work so far has been promising. "
"Hierarchies and collections are pretty much the only tools we've got to define how
things relate to each other and how they should be organized into manageable structures.
Hierarchical aggregation fits well with the fractal nature of many organic and artificial
systems, and it is intuitively obvious to most people. Plus, the depth of the aggregation
scales linearly with the exponential growth of elements, which is hugely important.
Collections are similarly plentiful in the natural and virtual worlds, fit well with
peer-to-peer systems, and once again, are totally intuitive. Unfortunately, this
wonderfully simple division of structures into hierarchies and collections is, again, too
simple for our needs. "
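The scaling point about aggregation depth can be made concrete: in a balanced hierarchy with a fixed branching factor, depth grows only logarithmically in the number of elements, so each extra level multiplies the capacity. A small illustration in Python (the branching factor of 2 is an arbitrary choice):

```python
import math

# Depth needed for a balanced hierarchy with branching factor 2:
# each additional level doubles how many elements it can hold.
for elements in (1_000, 1_000_000, 1_000_000_000):
    depth = math.ceil(math.log2(elements))
    print(f"{elements:>13} elements -> depth {depth}")
```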
"Equipped with such a powerful component architecture, a new theory of reuse may be
developed, this time addressing the entire software lifecycle over a project's lifetime in a
graceful, truly evolutionary way. Refactoring will no longer be a brutal, destructive
operation. Instead, a safe, almost organic rejuvenation of the old components by the new
ones -- guaranteed at compile time to be semantically, as well as syntactically, correct --
will become possible, analogous to the cyclical rejuvenation found in every corner of
"Software is truly amazing media, unlike anything else found in nature or created by
humankind. Like information in general, software is not an entirely physical substance,
for it has no mass, volume, or density. Neither is it an entirely metaphysical concept, for
it interacts with real, physical entities, and causes very concrete physical impacts, such
as the rotation of a turbine, the flow of electricity, or the imprint of an image on the
page. Software is a product of our imagination, like a book, a painting or a movie, designed
to synthesize a particular representation of the real world. But unlike all other forms of
pure art, software is constructed for utilitarian purposes to do more than merely reflect
the real world; software interacts with the world and in many cases even controls it. And
what is truly amazing -- software is replicable: instantaneously, in arbitrary numbers, at
zero cost! "
Advice to developers:
"Don't take everything you've been told about good software engineering as gospel
truth. Don't be bamboozled. Maintain your sense of skepticism and look for more
intuitive metaphors. "
"The complacency around C/C++ and the Java language is pervasive. C#, the first new
programming language in years, looks more like the Java language. Enormous productivity
gains remain to be uncovered and difficult problems are yet to be solved. The world has
gone crazy with XML and then web services; SOAP and UDDI are getting enormous attention,
and, yet, from a software engineering standpoint, they seem to me a setback rather than
a step forward."
'Most web-developers know that IE has fallen behind in the race for standards and being able
to show the latest and greatest. Many
CSS2 properties are unsupported. Some of the more useful ones are properties such as
min-width and finally min-height. I will argue how max-width is a crucial property when it
comes to online readability, and then I will show you how to make IE emulate the behavior of
max-width, and in turn, how to make it emulate many other properties that Internet Explorer
for Windows is not directly capable of.'
I agree 100% that Microsoft needs to catch up with the W3C standards, including
max-width. However, I personally prefer to adjust column widths myself by resizing the
windows. One particular use for this is if I'm viewing multiple windows at once and I want to
read a window that has a large width but a short height. Besides, I like to read as wide as
possible (forgive the pun) if it will cut down on page scrolls. E.g.: many pages have
paragraphs with a lead sentence/link in bold. I like to scan the bold and ignore the rest if
I can. Wide pages allow me to do this with fewer page downs.
There have been many articles praising Google. But this is the first to make me remember
that there was a world before Google. Oh, as far as the question of "what's next?" goes, it's
just that old semantic web crap.
'The transition into the Google Era has not occurred without some anguish. The stacks of a
university library can be a rather lonely place these days. Library circulation dropped about 20
percent at major universities in the first five years after Internet search engines became
popular. For most students, Google is where all research begins (and, for the frat boys, ends).'
'Students typically search only the most obvious parts of the Web, and rarely venture into
what is sometimes called the "Dark Web," the walled gardens of information accessible only
through specific databases, such as Lexis-Nexis or the Oxford English Dictionary. And most old
books remain undigitized. The Library of Congress has about 19 million books with unique call
numbers, plus another 9 million or so in unusual formats, but most have not made it onto the
Web. That may change, but for the moment, a tremendous amount of human wisdom is invisible to
researchers who just use the Internet.'
'What is innovative software? Before you discovered it, you did not feel that you were
missing out; there was no obvious void. However, after you discover it, its use becomes so
second-nature that you wonder how you lived without it.'
'The next generation of software engineers, who will be producing software in the next
twenty-odd years, are simply not able to produce innovative software. Thirty years ago,
programming was a niche area, an art, under constant evolution and requiring intellect and
ability. New software was really just that -- completely new. There was money to be made and
there were obvious needs to be fulfilled. Nowadays, anyone can write a program. Why is this a
bad thing? Well, if I am going to spend three or four years at university studying computer
science, yet not be able to offer any significant advantage to a major software development
house compared to a simple 'code-monkey' who can churn out lots of code at a very low wage,
where is my incentive to do software development? Sure, I may have been taught better
programming principles -- object-orientated programming, and the like -- but there's nothing
special that I can do. Why would I want to spend my time doing nothing but churning out code,
when there is better money to be made in less monotonous fields?'
Oh, come on, dude! Things happen in bunches. There's nothing wrong with a lull here and there.
On the other hand anything that encourages innovation is good.
SVG (Scalable Vector Graphics) is a W3C recommendation for 2D graphics driven by XML and is
fully scriptable. It's been up-and-coming for a while (competing against Macromedia's Flash
(.swf) and Microsoft's VML (Vector Markup Language)), but I think it is finally gaining
momentum in certain areas. SVG will be very good for data-driven stuff, while Flash will
probably reign for design-driven stuff.
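As a taste of the data-driven side, here's a minimal sketch that emits a tiny SVG bar chart from a dictionary. I'm using Python just to build the markup; the data values and pixel sizes are made up:

```python
# Generate a tiny data-driven SVG bar chart (hypothetical data, arbitrary sizes).
data = {"Mon": 30, "Tue": 80, "Wed": 55}

bars = []
for i, (label, value) in enumerate(data.items()):
    # One <rect> per data point: x positions the bar, height encodes the value.
    bars.append(
        f'<rect x="{i * 40}" y="{100 - value}" width="30" height="{value}"/>'
    )

svg = (
    '<svg xmlns="http://www.w3.org/2000/svg" width="120" height="100">'
    + "".join(bars)
    + "</svg>"
)
print(svg)
```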