rhatto via nettime-l on Thu, 2 Nov 2023 21:45:59 +0100 (CET)



Re: <nettime> The Copy Far "AI" license


On Thu, Oct 26, 2023 at 12:10:10PM +0200, Javier de la Cueva wrote:
> On 06/10/2023 17:47, rhatto wrote:
> > On Tue, Oct 03, 2023 at 11:47:00AM +0200, Javier de la Cueva wrote:
> > > On 23/09/2023 17:48, rhatto via nettime-l wrote:
> > > > Hello nettimers, this might be of interest:
> > > > 
> > > >     Copy Far "AI" - A license close to copyleft, but far from the so-called
> > > >     "Artificial Intelligences": https://copyfarai.itcouldbewor.se
> > > > 
> > > > ps: I'm not in the list (please Cc).
> > > 
> > > Thank you for your email. Unfortunately, I have doubts about the
> > > enforceability/validity of the license.
> > 
> > Thanks for raising these concerns.
> 
> [...]
> 
> > You seem to be right about what's at stake in the legal arena.
> > 
> > From an ethical/moral/political standpoint, though, this license might
> > still be useful, regardless of the "fair use" outcome or its validity
> > in courts.
> 
> Dear Rhatto,
> 
> Perhaps this is another case where the solution to the
> ethical/moral/political issues is not the law but technology. Just in case
> you have not read it yet, I enjoyed the idea of "poisoning" the data input
> of AI [1][2].
> It is also interesting to see how MIT Technology Review uses biased
> language:
> 
> "The tool, called Nightshade, messes up training data in ways that could
> cause serious damage to image-generating AI models."

This is very interesting, thanks for sharing.

One could even imagine this method yielding dogs, cats and fantastic
animals holding Copy Far "AI" disclaimers as digital graffiti in
generated images.

The applicability of copyleft licenses is not restricted to the law; they
can also serve as political statements in many contexts.

> Damage? I would rather say dadaist implementations.

:)

> [1] https://www.technologyreview.com/2023/10/23/1082189/data-poisoning-artists-fight-generative-ai/
> [2] https://venturebeat.com/ai/meet-nightshade-the-new-tool-allowing-artists-to-poison-ai-models-with-corrupted-training-data/
> 
> --
> Kind regards,
> Javier de la Cueva

-- 
https://fluxo.info
-- 
# distributed via <nettime>: no commercial use without permission
# <nettime> is a moderated mailing list for net criticism,
# collaborative text filtering and cultural politics of the nets
# more info: https://www.nettime.org
# contact: nettime-l-owner@lists.nettime.org