
The FTC Chair argues that certain sensitive data should be excluded from AI model training. The EU AI Act mandates transparency for general-purpose AI systems, requiring compliance with EU copyright law and detailed summaries of training content. Commentators also question companies that assert fair use for AI training on publicly available data while declining to disclose that data, underscoring the need for model providers to document and disclose what they trained on.
One of the simplest but most useful and appropriate pieces of AI regulation to adopt at the moment is to require model providers to document the training data they used. This is something that the @EU_Commission AI Act gets right … on p.62 of its 272 pages (!). https://t.co/M65nyIZawJ https://t.co/YnQ6XesVL3
FTC Chair Asserts Certain Sensitive Data Should Be Excluded from Training AI Models https://t.co/RVaf4WRyCX #FTC #AI #Data @FTC https://t.co/XK3bKjucnL
The fact that AI companies are so sheepish about disclosing what data they trained on is really telling. If it’s fair use, then why can’t you be transparent about training data sets?
