A response to the “AI pause” letter

Worth a read:

What we need is regulation that enforces transparency. Not only should it always be clear when we are encountering synthetic media, but organizations building these systems should also be required to document and disclose the training data and model architectures. The onus of creating tools that are safe to use should be on the companies that build and deploy generative systems, which means that builders of these systems should be made accountable for the outputs produced by their products.

While we agree that “such decisions must not be delegated to unelected tech leaders,” we also note that such decisions should not be up to the academics experiencing an “AI summer,” who are largely financially beholden to Silicon Valley. Those most impacted by AI systems, the immigrants subjected to “digital border walls,” the women being forced to wear specific clothing, the workers experiencing PTSD while filtering outputs of generative systems, the artists seeing their work stolen for corporate profit, and the gig workers struggling to pay their bills should have a say in this conversation.
— Read on www.dair-institute.org/blog/letter-statement-March2023