Well said. Yet another absolutely stupid legal case. Now, how do they enforce this in Massachusetts?
The internet is global, people. Start getting that through your heads.
In Denmark today, a judge ruled against a search engine that respects the robots.txt convention, stopping it from “deep linking” into sites run by the Danish newspaper association. All these court cases are as stupid as dirt. Several good technical preventatives exist. First, if the search engine supports robots.txt, you can simply edit that file on your site and save the lawyers’ fees. If it doesn’t support robots.txt, raise the issue in public first, and the tech weblogs will get right on it. If that doesn’t work, add a simple script to your server that looks at the Referer header on the HTTP request; if it isn’t from your site, redirect to your deep-linking policy page (a sketch follows below). We know for sure that when a company goes to court over “deep linking,” it isn’t talking to, or listening to, its technical people. BTW, “deep linking” is an oxymoron; there’s only one kind of linking on the Web. Why would you ever point to the home page of a news-oriented site?
[Scripting News]
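For the robots.txt route, the convention is just a plain text file served from the root of the site. A hypothetical entry barring a specific crawler from an article directory (the crawler name and path are made up for illustration) could look like:

    # robots.txt, served from the site root
    User-agent: NewsCrawler
    Disallow: /articles/

And here is a minimal sketch of the Referer check the post describes, written as WSGI middleware in Python. This is one possible shape under stated assumptions, not the author's actual script; the hostname, policy path, and function names are all hypothetical.

    from urllib.parse import urlparse

    OUR_HOST = "www.example-paper.dk"          # hypothetical: your own hostname
    POLICY_URL = "/deep-linking-policy.html"   # hypothetical policy page

    def referer_gate(app):
        """Wrap a WSGI app; bounce off-site referers to the policy page."""
        def gated(environ, start_response):
            referer = environ.get("HTTP_REFERER", "")
            host = urlparse(referer).netloc
            # Redirect only when a Referer is present and points off-site;
            # direct visits (no Referer) pass through. Skip the policy page
            # itself to avoid a redirect loop.
            if host and host != OUR_HOST and environ["PATH_INFO"] != POLICY_URL:
                start_response("302 Found", [("Location", POLICY_URL)])
                return [b""]
            return app(environ, start_response)
        return gated

Comparing the parsed hostname rather than doing a substring match on the raw Referer keeps a crawler from slipping past the check by embedding your domain somewhere in its own URL.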