image: Suriya Phosri/Pixabay
In a recent move redefining legal contours, The New York Times has decided to sue OpenAI and Microsoft. The lawsuit alleges copyright infringement by artificial intelligence systems that scrape thousands of journalistic works, produce content of their own from them, and then profit from it. These systems take the works without any prior consent or compensation.
The Times accuses these companies of free-riding on its “massive investment in journalism”. This, the paper argues, reduces readers’ need to visit its site and violates the intellectual property of the journalists actually doing the work. The Times says it has contacted the companies, but negotiations have reached a standstill. The lawsuit does not specify a monetary demand, but states that the defendants should be held responsible for “billions of dollars in statutory and actual damages” over the “unlawful copying and use of The Times’ uniquely valuable works”. It also addresses the problem of “hallucination”, in which The Times is credited for material it never published or that is simply wrong. OpenAI argues that because the copyrighted material is used for “transformative” purposes, its actions should be permissible under “fair use”. The New York Times disagrees.
The New York Times may be the first major media organization to bring a case on these grounds, but concerns about how far AI and its capabilities should reach have been mounting for some time. A prominent group of authors, including George R.R. Martin and Jodi Picoult, has filed suit over generative AI (GenAI). Their lawsuit claims that “at the heart of these algorithms is a systemic theft of massive scale”. Authors such as Margaret Atwood and Philip Pullman have signed an open letter addressed to companies including OpenAI, Meta, and Alphabet, demanding that they stop using writers’ works without permission or credit.
The letter demands that companies “compensate writers fairly for the use of our works in AI output whether or not the outputs are infringing under the current law”. Beyond copyright infringement, these cases raise the question of how historically marginalized creators fare in situations like this. Creating art of any kind draws on subjective experiences of culture, society, and community. Discarding all of that to regurgitate content stitched together from thousands of other works strips it of its humanness and of the toil and turmoil behind any creation.
Last year’s strike by SAG-AFTRA, the American actors’ union, also had the use of AI at its heart. Screenwriters and actors demanded better regulation of artificial intelligence so that their roles are not erased or replaced altogether. Getty Images has likewise sued Stability AI, claiming the company used its library of images to train its software; the case is set to go to trial. There is also a new trend of AI-generated songs in the voices of well-known singers, which raises the question of whether singers might one day be replaced entirely.
The AI in question is a powerful tool that can produce poems, recipes, or even musings on existential dread. The general lack of understanding of nebulous concepts like algorithms and the inner workings of AI leaves people feeling powerless. OpenAI CEO Sam Altman opines that AI is set to revolutionize the economy and generate a great deal of content quickly and cheaply, which means it will generate a great deal of wealth. Does that ultimately affect us? It does. The same shift will force many writers and artists out of work, creating staggering unemployment, while the wealth in question goes into the pockets of the corporations.
AI tools like ChatGPT produce polished essays or paragraphs on demand by scraping data, and the people who build them assume they have permission to do so because data, to them, is immaterial and abstract. As a tool scours the internet and takes bits from countless articles and research papers, there is no proper way to determine who should be compensated for the final product. Consider a writer, especially one from an oppressed section of society, who writes about their experience of racism: an AI will not recognize the essence of that work the way a human would, and will strip it of its character. So much goes into producing a film and building up its world: camerapersons dealing with light, makeup, costume, direction, set design, and so on. Reducing all of that collective effort to a single AI tool leaves actual humans without work and robs the result of the spontaneity that comes from experience.
It is also important to mention that AI is only sometimes correct. It is built by people who carry their own prejudices and biases, and that is reflected in how the AI functions. The world does not run on binaries the way computers do, and not everything is black and white. The capacity to understand nuance and subtext is something only humans have; no AI in the world can grasp systemic oppression or emulate human experience.
Copyright laws, which largely centre on the printing press, are not equipped to deal with the mammoth that is artificial intelligence. As the workings and potential of AI grow ever more opaque to the average person, it becomes easier for the developers of such tools to evade the purview of the law.
The lawsuit by The New York Times opens up a new legal frontier that could have massive ramifications for AI companies. They may need to properly compensate people for their work and be more cautious and responsible when using someone else’s work to improve their software. Justice, equality, and fairness should be the central thought and motivation behind every technology that is built.
Edited by: Georgiana Madalina Jureschi