GitHub Copilot is “unacceptable and unfair,” says Free Software Foundation


GitHub Copilot, a Visual Studio Code extension that uses artificial intelligence to help developers write code, has drawn the ire of the Free Software Foundation (FSF), which is calling for white papers that address the legal and philosophical issues raised by the technology.

GitHub Copilot is “unacceptable and unfair, from our perspective,” the FSF wrote in a blog post calling for white papers on the implications of Copilot for the free software community. This is because Copilot requires running software that is not free, such as Microsoft’s Visual Studio IDE or Visual Studio Code editor, the FSF argues, and because it is a “service as a software substitute,” meaning it is a way to gain power over other people’s computing.

Built by GitHub in conjunction with OpenAI, Copilot is a Visual Studio Code extension that uses machine learning trained on open source, freely licensed software to suggest lines of code or functions to developers when they write software. Copilot is currently available as a limited technical preview.

The FSF said there were legal issues surrounding Copilot that may not have been previously tested in court. The organization is therefore funding a call for white papers to examine the legal and ethical questions raised by Copilot, copyright, machine learning, and free software. The FSF said that Copilot’s use of open source software has many implications for the free software community and that it has received many inquiries about its position on these issues.

“Developers want to know if training a neural network on their software can be considered fair use. Others who might want to use Copilot wonder if code snippets and other elements copied from GitHub-hosted repositories could result in copyright infringement. And even if everything might be legally copacetic, activists wonder if there isn’t something fundamentally unfair about a proprietary software company building a service out of their work,” the FSF wrote.

The FSF cited the following questions as being of particular interest:

  • Does Copilot’s training on public repositories infringe copyright? Is it fair use?
  • How likely is the release of Copilot to generate actionable claims for violations of GPL-licensed works?
  • Can developers using Copilot comply with free software licenses like the GPL?
  • How can developers ensure that code to which they hold the copyright is protected against violations generated by Copilot?
  • If Copilot generates code that gives rise to an infringement of a free software licensed work, how can this infringement be discovered by the copyright holder?
  • Is a trained AI/ML model protected by copyright? Who holds the copyright?
  • Should organizations like the FSF advocate for a change in copyright law in relation to these issues?

GitHub, responding to the FSF’s protest, said it was open to discussing these concerns. “This is a new space, and we want to engage with developers on these topics and lead the industry in setting appropriate standards for training AI models,” GitHub said.

The FSF will pay $500 for each white paper it publishes and will also consider funding requests for further research leading to a follow-up paper. Submissions are accepted until Monday, August 21, at the following email address: [email protected] Guidelines for papers can be found at

Copyright © 2021 IDG Communications, Inc.