
Lumen Researcher Interview Series – Daniel Seng sheds light on his research methodology in analysing takedown notices

October 6, 2021 | Research

The Lumen Researcher Interview Series features conversations between the Lumen team and a wide variety of researchers who have used Lumen’s data in their work. Associate Professor Daniel Seng, Director of the Centre for Technology, Robotics, Artificial Intelligence & the Law (TRAIL), is the second individual to be profiled in this series. The Lumen Project is part of the Berkman Klein Center for Internet & Society at Harvard University.

Daniel Seng obtained his doctoral degree from Stanford Law School, where he used machine learning, natural language processing, and big data techniques to conduct research on copyright takedown notices. His doctoral thesis was the beginning of his decade-long association with the Lumen Project, then known as Chilling Effects.

As part of his research, Daniel has worked on five papers that draw on data in the Lumen Database. The bulk of Professor Seng's focus has been on empirical and quantitative legal analysis, for which he has analysed 203 million takedown complaints and 5.66 billion URLs in the Database. In this interview, Adam Holland, Project Manager at the Lumen Project, and Shreya Tewari, Lumen's 2021 Research Fellow, spoke with A/P Seng about his research, his methodology, and how the Lumen Database catalysed his work.

The interview may be viewed here. Some of Daniel's writings in this area include: "The State of the Discordant Union" (2014), "Copyrighting Copywrongs" (2015, published 2021), and "Who Watches the Watchmen" (2015).

——————

ABOUT LUMEN

Conceived and developed in 2002 by then-Berkman Center Fellow Wendy Seltzer, Lumen (until recently known as Chilling Effects) was nurtured with help from law school clinics at Berkeley, Stanford, University of San Francisco, University of Maine, George Washington School of Law, Santa Clara University School of Law, and Harvard Law School’s Cyberlaw Clinic (based at the Berkman Klein Center).

Lumen collects and studies online content removal requests, providing transparency and supporting analysis of the Web’s takedown “ecology,” in terms of who sends requests, why, and to what ends. Lumen seeks to facilitate research about different kinds of complaints and requests for removal – legitimate and questionable – that are being sent to Internet publishers, platforms, and service providers and, ultimately, to educate the public about the dynamics of this aspect of online participatory culture.

Initially focused on requests submitted under the United States' Digital Millennium Copyright Act (DMCA), Lumen now includes complaints of all varieties, including those concerning trademark, defamation, and privacy, both domestic and international. Currently, the Lumen database contains millions of removal requests, and grows by more than 20,000 notices per week, from companies such as Google, Twitter, YouTube, Wikipedia, Reddit, Medium, Github, Vimeo, and WordPress. Because of recent dramatic increases in notice volume, in 2014 the project upgraded to a more robust, scalable website that provides more granular data and API access for notice submitters and researchers.
