
Terms of Service and scraping...

Scraping seems to be OK in Florida: AP: Florida U.S. District Court Judge Steven D. Merryday has ruled that YachtBroker.com's use of a software program to harvest yacht listings, photos and product descriptions from Boats.com represented lawful use of facts that weren't protected by copyright law. As a result, Boats.com will be changing the terms of use for its site to make sure it has "the correct protections" for its content. A key point, noted by Judge Merryday, was that the rights to the photos and descriptions listed on Boats.com were held by individual yacht brokers, not by the site itself.

As a "technologist," this is starting to really scrape my hide. That is, if you display content in a publicly-available forum and that content is not protected by a legally-accepted intellectual property regime (copyright, trademark and/or patent rights... note: it's no longer a trade secret if you post it on the net!), you shouldn't be able to protect it by concocting some arcane Terms of Service document. There are accepted technological recourses for sites that want to control the use of their content to this degree... like a robots.txt file in your site's root directory, or any number of more complicated ways of making sure that a human, rather than a spider script, is accessing your content.
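For what it's worth, the robots.txt mechanism is trivial for both sides to use. A minimal sketch (the site name, bot name, and disallowed path here are all hypothetical, and this assumes Python's standard-library robots.txt parser):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt a site could serve to keep spiders
# away from its listings while leaving the rest crawlable.
robots_txt = """\
User-agent: *
Disallow: /listings/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A well-behaved spider checks before fetching each URL.
blocked = rp.can_fetch("SomeScraperBot", "https://example.com/listings/yacht123")
allowed = rp.can_fetch("SomeScraperBot", "https://example.com/about")
print(blocked)  # False: listings are off-limits to all bots
print(allowed)  # True: everything else is fair game
```

Of course, robots.txt only works against polite spiders; a site that wants a hard guarantee has to reach for server-side blocking or human-verification tricks instead.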

What does everyone else think? Are you with me? That is, is running to your lawyers merely whining in this respect? Or should sites not have to rely on technology to lock down their content?

Posted by joebeone at April 10, 2004 10:34 AM | TrackBack