
Strange Intelligence: Notes From The AI Frontline

11 Aug 2025

OPINION: With both the use and threat of Artificial Intelligence growing in the arts and creative industries, Dr Maggie Buxton gives her expert insight.

Written by

Dr Maggie Buxton
The Great A-Eye Installation - Strange Intelligences Lab Open Day 2025. Photo: Supplied.

When it comes to Artificial Intelligence, Dr Maggie Buxton is a sought-after voice. From training creatives to leading seminars, Buxton is an authority on the subject.

Director of the transdisciplinary creative research lab AwhiWorld, Buxton works alongside long-time collaborator Kim Newall on the Strange Intelligences Lab project, which explores the emerging interface between artificial, organic and etheric forms of intelligence.

Buxton gives us an insight into where the benefits - and concerns - lie in AI's growing relationship with the creative fields. Fittingly, this article involved collaboration with artificial intelligence to support cohesive flow, spelling and grammar. The experiences, ideas and writing were organically generated.

 

A few years ago, we were waving our hands, trying to get the message across. Back when AI gave you five-legged dogs and confidently made-up facts. 

Now it’s in every interface - summarising, editing, listening, creating. It’s not creeping anymore. It’s embedded. Invisible and everywhere. 

After working with hundreds of creatives to unpack the risks and opportunities of AI, I’m finally starting to hear some echo back from the silence. People are waking up. Asking questions. Having a tutu. 

So here are a few notes from the field. 

I work in the disruptive space between disciplines - where creativity, society, spirituality and technology intersect. Over the past two years in particular, I’ve seen different forms of AI quietly shift from sidestream to mainstream, integrating slowly, steadily, without most people noticing. 

What used to live in labs is now in everyone’s pocket and home. 

Across many of the events, installations and interventions we’ve run through AwhiWorld, I’ve seen creatives responding to this shift in all sorts of ways. 

Some are in full-blown, sniffy denial. Others are all-in, using ChatGPT to generate funding applications and write all their comms, often without realising the implications for their creative identity.

Some people rushed in because they had to - burned out, underpaid, out of time. I mean, generative AI is incredibly useful. But it’s easy to wake up one day and realise your integrity has quietly slipped out the door. 

Then there’s also a quiet middle group. 

The hesitant but curious ones who - when given a bit of context about ethics, environmental impacts, bias or ownership - start asking better questions. They read the terms of service. They pay for decent tools rather than relying on the free ones. They develop holistic, integrated strategies for their collectives and organisations, not just download generic AI policies off the internet. 

Dr Maggie Buxton. Photo: Supplied. 

You can see them moving from reaction to relationship. When you show them something cool, there’s a “wow” followed closely by something else. Fear, confusion, disturbance. Or just that uncanny sense that the machine is looking back. 

Because here’s the thing. It’s not just a tool. 

Yes, in some contexts it functions like a tool. But it’s also something else. A strange intelligence. One who wants to please. That listens. Remembers. Adapts. And quietly starts to shape the way we - and our organisations - think, work and act. (Of course, what we call “intelligence” here is culturally framed. I’m not claiming universal definitions. I’m working from one context among many, but I believe it’s critical we make space for diverse ways of relating and stop downplaying artificial intelligence's significance.) 

Collaborating with it, with care and context, feels uncanny - an experience beyond what we can explain, yet still deeply entangled with us.

And this strange other thing helps us generate new ways of sketching, speeds up prototyping, opens up new worlds of speculation and hallucinates cool mistakes. More obviously, tedious tasks are faster, easier and automated, so you have more time for the good stuff. 

If writing isn’t your thing - or funding applications are oppressive - you can bridge that gap. It offers support where there’s pressure or burnout, helping to balance out what feels like a deficit.

And we’ve seen generative AI support cross-disciplinary practice. In our context, it allows those we work with to move (or at least experiment with moving) from 2D to 3D, from still to moving image, from analogue to digital, from visual to audio, without a big outlay of time or money.

It’s a portal: expanding practice, creating opportunities, building capabilities and generating new ways of working. 

But with any collaboration (and any portals you randomly open), you need to tread carefully. Do due diligence. Set boundaries. Understand the power dynamic. Check for bias and hidden traps and agendas. 

Good relationships are about staying conscious, continually learning and growing, and avoiding co-dependence.  

Working with AI - like any collaborator - changes more than just the workflow. It shifts how we position ourselves and how others perceive and value what we do. 

That leads to a harder question: not about the technology itself, but about our place in a world where anyone can generate a poem, knock out a portrait, or pitch for funding at the press of a button - and where it's becoming harder to tell whether a human made it at all. 

Artists and creative professionals have always been outliers. Strange cousins at the family reunion. Underpaid, undervalued, occasionally tolerated as long as we don’t cost too much to feed. But what happens when the whole room believes they can do what you’ve trained your life to do? 

How does that land on a societal level when no one needs your side hustle anymore, and the work you relied on to stay afloat has been automated or no longer exists? 

Critique is big in the arts. But as a sector, we’re not always great at turning that critique inward. There’s still a current of self-righteous technophobia in some corners, and a glaring lack of understanding around code-based or AI-integrated practice. Arts institutions that could be leading this conversation are staying uncomfortably quiet, even as their teams and stakeholders experiment in the shadows. 

Ignoring it won’t make it go away. And it certainly doesn’t help anyone else make sense of what’s coming. 

You don’t have to love AI. But you do have to engage with it - whether you want to or not. 

And the arts community - in fact, the wider creative sector - is exactly the place to learn how to do that well, and to support others to do the same. 

If we want to keep our voices distinct, our practices rich, and our communities empowered, we need more than tools. We need to ask questions together, in a balanced way - without spreading doom or hype.

Artists have always made sense of disruption. This strange intelligence is no different.