Artists are among the many groups who will feel the effects of AI over the next few years, but it's not doom and gloom for everyone. A group of artists has organized an open letter to Congress, arguing that generative AI isn't so bad and, more importantly, that the creative community should be included in talks about how the technology should be regulated and defined.
The full letter and list of signatories are here; the gist is that AI, machine learning and algorithmic or automated tools have been used in music, art and other media for decades, and that this is simply another such tool.
As such, those who use these tools, whether as software engineers or painters, should be consulted in the process of guiding their development and regulation.
Here’s an edited snippet of the letter:
Just like previous innovations, these tools lower barriers in creating art — a career that has been traditionally limited to those with considerable financial means, abled bodies, and the right social connections.
Unfortunately, this diverse, pioneering work of individual human artists is being misrepresented. Some say it is about merely typing in prompts or regurgitating existing works. Others deride our methods and our art as based on ‘stealing’ and ‘data theft.’ …many individual artists are afraid of backlash if they so much as touch these important new tools.
Sen. Schumer and Members of Congress, we appreciate the ongoing hearings, ‘Insight Forums,’ and other initiatives focused on regulating generative AI systems and that your goal is to be inclusive, pulling from a range of ‘scientists, advocates, and community leaders’ who are actively engaged with the field. Ultimately, that must mean including artists like us.
We see a unique opportunity in this moment to shape generative AI’s development responsibly. The broad concerns around human artistic labor being voiced today cannot be ignored. All too often, major corporations and other powerful entities use technology in ways that exploit artists’ labor and undermine our ability to make a living. If you seek to ensure generative AI’s revolutionary trajectory benefits humanity as a whole, it would be a gross oversight to exclude those in our society who are working within its potential and its limitations.
There's certainly reason and wisdom in these words: if the government intends to form a diverse and representative group to advise its deliberations around AI, it ignores the creator community at its peril.
But the letter, despite being published under the auspices of Creative Commons, conspicuously mischaracterizes the most serious criticism of the AI systems artists object to: that they were created through wholesale IP theft that even now leverages artists’ work for commercial gain, without their consent and certainly without paying them. It’s a strange oversight for an organization dedicated to navigating the complex world of digital copyright and licensing.
While some may subjectively deride AI-assisted art as mere prompt engineering or what have you, many who object do so because the companies behind these tools built them in ways that exploited artists. Whether the art such systems produce is derivative or original, it's reasonable to consider it the fruit of a poisoned tree.
Just as authors decry some large language models as having obviously been trained on their own work, among the complaints artists can, and probably will, bring to any hearing or forum in Congress is that companies unethically, and perhaps illegally, ingested copyrighted work against the wishes of its creators and to the detriment of their livelihoods.
We're only at the very beginning of the AI-influenced era of art and industry, so there is plenty of room for both disagreement and collaboration. While this open letter is only one perspective, it is a valuable one, and likely also one that will receive significant pushback from other artists who feel their own work or positions are being misrepresented. By this time next year the world and its attendant conflicts will have moved on yet again as today's models and methods are abandoned, but we'll be talking about these questions for a long, long time.
Source @TechCrunch