Art won't eat itself

2023-11-28

Two weeks ago, the inaugural Culture Crush event took place in Dublin, and the biggest concern there was "how will AI alter the creative world?" or "will the machine eat our work?"

The tl;dr is ... it's going to be a useful tool.

The quote that sticks with me is how AI is already changing panelist Jennifer Ely's work. When she takes on illustration work, say for a cover, she often had to wade through publisher and artist descriptions and make multiple attempts to capture their vision.

Now, they approach her with their own multiple attempts from an AI image generator such as Midjourney, and those attempts are wrong. Bits are right, but they're bits scattered across multiple attempts: lots of "almost right", nothing accurate. By showing what they want with their collage of attempts, and pointing out what isn't quite right, they let a person with talent get to the fine-tuning stage in fewer attempts.

Barry Scannell, another of the panelists and an expert in artificial intelligence and Irish law, would point out that this is a prompt-engineering problem, and so will be sorted out soon. I don't agree.

AI can do a lot. It can create a fully working iPhone game, it can make my feeble dance moves glide correctly, it can create a laughing "Mona Lisa", but it's hard to get exactly the image you have in your head out of your head and available to the world. It can get close, though ... and AI, or spicy autocorrect, is a stepping-off point for creation.

This is why the legal battles will continue. The Writers Guild of America's strike won its battle over keeping AI away from scripts. It's hard to get something coherent that works at a decent length when you start with an incoherent concept and no life experience.

And why the Screen Actors Guild - American Federation of Television and Radio Artists had more difficulty. If you start with a good scan of a person, you can get them to react and move realistically enough to fool an audience for an effect. Virtual Spider-Men have swung through virtual New Yorks. It is already being introduced for ADR (additional dialogue replacement) and for spoken machine translation with automated lip-syncing.

While it would be easy to get an actor to accept lip-motion correction or eye-level changes using technology, it's a different matter, legally, to replace them completely, which is possible (we think).

It is harder to embody a concept than it is to make a body move according to your ideas. It's easier to replicate an existing actor than it is to create one from scratch. Just ask any actor's parents.

I've seen this while working in software design, where the step-by-step, exactly detailed requirements and designs created after a conversation with a customer do not match the concepts in the customer's head. It is very hard to execute exactly what they need when the person with the idea speaks a different design language to the people trying to implement it. It takes a lot of attempts. Agile sprints exist for this reason.

Having said that, technology improves. But technology can only do what it's instructed to. Using the right words is always hard.