No, I will not add you to my whitelist

December 21st, 2006

I am so sick of companies that scream at me to add their addresses to my whitelist so they can get through my spam filters. Here’s one example from Hewlett-Packard:

To ensure you properly receive your HP Technology at Work newsletter, and driver and support alerts, please add us-news@your.hp.com to your address book. If you also receive the following e-mails from HP, then add these addresses to your address book:

  • HP Monthly promotions newsletter: us-specials@your.hp.com
  • Events and other general HP customer communications: us-bulletins@your.hp.com
  • Order and support e-mail confirmations: Hewlett-Packard@confirm.hp.com

I have news for the corporate zombies and clueless marketdroids who design these sites and their e-mail bots:

If you’re getting caught in my spam filter, it’s your own damn fault!
Read the rest of this entry »

Murphy’s Law of Co-occurrence Constraints

December 20th, 2006

Co-occurrence constraints are a perennial topic at XML conferences because the classic schema languages (DTDs and W3C Schemas) can’t handle them, though RELAX NG can. Consequently they’re a fertile source of papers and talks like XML 2006’s keynote from Paolo Marinelli on Co-constraint Validation in a Streaming Context.

However, I mentioned in hallway conversation that I wasn’t sure how common or necessary co-occurrence constraints really were. In fact, I didn’t think I’d ever found one in the real world. Naturally, two days later I stumbled across several of them in a very common real-world example.
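
For anyone who hasn’t bumped into one: a co-occurrence constraint ties the permitted content of an element to the value of one of its attributes. Here’s a minimal invented illustration in RELAX NG compact syntax (the element and attribute names are mine, not the real-world example this entry teases): a payment element may carry type="cash" by itself, but type="card" must co-occur with a card number and an expiration date.

# Hypothetical co-occurrence constraint, RELAX NG compact syntax.
# The children required inside <payment> depend on the value of
# its type attribute. RELAX NG states this as an ordinary choice
# between patterns, which DTDs and W3C Schemas cannot do.
start = element payment {
    attribute type { "cash" }
  | (attribute type { "card" },
     element number { text },
     element expires { text })
}

A DTD can only approximate this with two differently named elements; in W3C Schemas you typically have to loosen the content model and push the check into application code.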
Read the rest of this entry »

The 90 Second Rule

December 1st, 2006

While reading Paco Underhill’s Why We Buy, I was struck by his discovery of the 90 second rule:

We’ve interviewed lots of shoppers on the subject and have found this interesting result: When people wait up to about a minute and a half, their sense of how much time has elapsed is fairly accurate. Anything over ninety or so seconds, however, and their sense of time distorts—if you ask how long they’ve been waiting, their honest answer can often be a very exaggerated one. If they’ve waited two minutes, they’ll say it’s been three or four. In the shopper’s mind, the waiting period goes from being a transitional phase in a larger enterprise (purchasing goods) to being a full-fledged activity of its own. That’s when time becomes very bad. Taking care of a customer in two minutes is a success; doing it in three minutes is a failure.

I suspect the rule applies to a lot more than merely shopping.
Read the rest of this entry »

RELAX Wins

November 26th, 2006

Among the XML cognoscenti, the debate is effectively over. Everyone is choosing RELAX NG as their schema language, and compiling to DTDs or W3C XML Schemas as necessary. I don’t know of a single project in the last couple of years that considered both RELAX NG and W3C Schemas and chose to go with the latter. Certainly, there’ve been a lot of W3C Schema adoptions. However, those seem to have been made mostly by people who didn’t know they had a choice. In particular, the W3C imprimatur seems very appealing to larger, more bureaucratic organizations such as government agencies.
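
In case “compiling” sounds mysterious: the usual tool is James Clark’s trang, which picks its input and output formats from the file extensions. A sketch of the workflow, assuming the master schema lives in a hypothetical book.rnc:

java -jar trang.jar book.rnc book.rng   # compact syntax to XML syntax
java -jar trang.jar book.rnc book.xsd   # to W3C XML Schema
java -jar trang.jar book.rnc book.dtd   # to DTD, with some loss of precision

The RELAX NG file serves as the master because its content models are the most expressive of the three; the generated DTD and W3C Schema are looser approximations.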

With that in mind, I thought it might be useful to list some of the groups (including some of the W3C’s own working groups) who have chosen to do their work in RELAX NG:
Read the rest of this entry »

Don’t Hide From Google

November 17th, 2006

Here’s a robots.txt file from a company whose software I’m currently evaluating:

User-agent: *
Disallow: /cgi/
Disallow: /cgi-bin/
Disallow: /mantis/
Disallow: /forum/
Disallow: /stats/
Disallow: /synk/unreg.html
Disallow: /synk/de/unreg.html
Disallow: /synk/fr/unreg.html
Disallow: /synk/it/unreg.html
Disallow: /synk/email.psn
Disallow: /synk/help/

This is from a small company whose main product is experiencing solid growth. In fact, they’re growing so fast that they’re having trouble responding to support e-mails, and are consequently asking users to check the FAQ list and read the forums before e-mailing them. Keeping that in mind, can you tell what’s wrong with this robots.txt?
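
Hint: think about where those support answers live. A friendlier version, sketched here on the assumption that the forums and /synk/help/ are where users would actually find them, keeps the infrastructure rules and stops hiding the support content from search engines:

User-agent: *
# /forum/ and /synk/help/ are deliberately no longer blocked,
# so Google can index the answers users are told to look for.
Disallow: /cgi/
Disallow: /cgi-bin/
Disallow: /mantis/
Disallow: /stats/
Disallow: /synk/unreg.html
Disallow: /synk/de/unreg.html
Disallow: /synk/fr/unreg.html
Disallow: /synk/it/unreg.html
Disallow: /synk/email.psn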
Read the rest of this entry »