How is AI impacting the creative industries?

Earlier in the week, Alice and Shalini attended an insightful Byte the Book panel on how Artificial Intelligence is affecting the creative industries. As new technologies emerge, more and more industries have to adapt and learn to work with new tools, and it was inspiring to hear how this is already happening in the creative world. The panellists came from very different sectors (from fashion to audiobook publishing) and made some fascinating points about how AI is changing creative processes, and how it will keep shaping art in the future.

Taylan Kamis, CEO and co-founder of the audiobook publishing company DeepZen, explained how they are using AI technology to “clone” the voices of real-life actors. They are not taking away the actors’ jobs, but working with them and the technology to increase the number of audiobooks available. According to DeepZen, an average book takes about four to six weeks to be made into an audiobook and costs up to £5,000. This means that only a small percentage of books ever become available in audio format, and usually only bestsellers from the bigger publishers make the cut. DeepZen is trying to close this market gap by working with smaller and mid-sized publishers to increase the variety of audiobooks available. Taylan has noticed that publishing is moving more slowly and cautiously towards AI than other sectors (say, the music industry), and that it will be difficult for publishers to move forward without partnering with other businesses and building mutual trust.

DeepZen still pays actors royalties for their intellectual property, but in many cases AI is raising new legal and copyright issues. Alex Hardy, a partner at Harbottle & Lewis, a law firm serving the creative and tech industries, points out that most devices and digital services have some level of AI working in the back end, but the law has yet to catch up with these new technologies. When building machines that use AI, the question is: “Who is responsible for the data collected by AI, the creators or the machine itself?” Under GDPR, whoever collects personal data needs to be able to explain what data they are collecting, how and why. However, some AI tools that gather huge amounts of data do so through complicated algorithms that may not be able to provide all of that information. To refine their analysis, machines try to collect as much information as possible through “deep learning”; anyone who creates content can choose not to make it accessible to machines, but this risks cutting them out of the cultural conversation. If machines only analyse creative content that is freely accessible online, their results will inevitably ignore a great deal of work, and the data they provide will be culturally biased.

From a legal perspective, a lot has been done over the past two years to future-proof legislation as much as possible and to account for machines in potential legal scenarios. The new EU Copyright Directive, for example, specifically mentions possible infringements by machines. In the end, though, ethics and self-regulation will play a very important role, and the people in charge of those machines will be held responsible.

The academic publishing industry already makes great use of AI in detecting plagiarism and in peer-reviewing content. In some cases the technology is even being used to let readers experience content in entirely new forms, such as through online platforms for sharing academic articles and journal content. Now that we understand a bit more about the new worlds this technology is opening up, as well as the issues it brings with it, we truly can’t wait to see what the future looks like for the creative industries!