Best Practices for Managing the Risks of Big Data and Web Scraping

The $60.5 million judgment that craigslist obtained in April 2017 against RadPad, a third party that collected data from craigslist’s site through automated means, highlights some of the issues faced by entities that collect – or engage others to collect – data through automated means for commercial purposes. The judgment was based on various claims relating to RadPad’s use of sophisticated techniques to evade detection and harvest content from craigslist’s site, as well as its distribution of unsolicited commercial emails marketing RadPad’s own apartment rental listing service to craigslist users. In a guest article, Proskauer partners Jeffrey D. Neuburger, Joshua M. Newville and Robert G. Leonard provide an overview of big data and web scraping, outline potential sources of liability for hedge fund managers that collect big data and describe best practices for navigating several areas of potential liability. See also “Using Big Data Legally and Ethically While Leveraging Its Value (Part One of Two)” (May 17, 2017); Part Two (May 31, 2017).
