Artificial Intelligence―Human Anxiety

FOR YEARS I’ve stood firmly in the anti-AI camp. There are sound reasons for that.

The more I’ve observed the digital world evolve, the more concerned I’ve become about the ways technology could be (and is being) used to mislead, manipulate and entrap.

We know that social media algorithms nudge people toward outrage and division. Deepfakes (AI-generated fake audio or video) smudge the already blurry line between truth and fabrication. And vast data systems furtively learn far more about us than we ought to be comfortable with.

None of this has inspired confidence.

So when artificial intelligence tools began appearing everywhere, my instinct was not curiosity but resistance. Here we had yet another all-powerful techno-tool, I thought — one with enormous potential for abuse. Recently, however, my stance has gradually softened from outright resistance into wary caution.

To be fair, I still consider my concerns justified. The potential for cyberspace chicanery is all too real. All too evident.

But recently, out of personal necessity, I’ve found myself using AI ― namely, ChatGPT ― for something I hadn’t anticipated: as a kind of informal therapy aid.

Before I continue, permit me to qualify that last statement:

ChatGPT is not a therapist. It cannot diagnose conditions, provide treatment, or replace trained professionals. Furthermore, the system itself will say so if asked.

But I was surprised to find that, when used in the right way, it can function as something rather different — a calm and readily available thinking partner when you’re trying to make sense of your own wayward thoughts.

It is that experience which prompted me to adjust my perspective ― whilst maintaining a healthy degree of caution.

Because like most tools humanity invents — from printing presses to the internet itself — artificial intelligence isn’t inherently good or bad. It simply is what it is. Its impact depends largely on how people choose to use it.

In the wrong hands, technology can mislead and manipulate. That’s a given. We see it time after time. But used carefully, it can also support, clarify and occasionally steady a troubled mind.

And that, rather unexpectedly, is what I discovered.

AI as an Out-of-Hours Therapy Aid

DESPITE THE limitations alluded to earlier (AI cannot replace the expertise of trained clinicians), I’ve found that it can be surprisingly useful when navigating the kind of emotional turbulence many of us experience from time to time.

What I’m talking about isn’t therapy in the formal sense. It’s more like having a thoughtful sounding board available whenever you need it.

And sometimes, that’s exactly what you need. One of the simplest benefits is sheer availability.

Human support networks are invaluable, but they’re not always accessible at the moment you need them. Friends are busy. Family members are asleep. Therapists are booked weeks in advance.

ChatGPT, by contrast, is available whenever you open a browser.

That means if a worry pops into your mind at 2:00 in the morning — the kind that insists on trudging up and down your thoughts, robbing you of valuable sleep — you can write it out immediately. Even the act of formulating the question can be clarifying.

In my own case, I’ve found it helpful when trying to untangle things like:

  • intrusive worries
  • confusing physical symptoms that may or may not be stress-related
  • patterns in my thinking that feel circular or unhelpful

Often I’m not looking for answers so much as perspective. Clarity.

Externalising the Mind Chatter

ANYONE WHO has dealt with anxiety or emotional stress will recognise the mental loop: the same thoughts circling again and again with no resolution.

The simple act of typing those thoughts out changes something.

Once the worry exists as text on a screen, it becomes external. You can look at it more objectively, and ChatGPT can respond with suggestions, reframes, or simply structured explanations.

For example, you might ask:

  • ‘How can I tell the difference between anxiety symptoms and physical illness?’
  • ‘What can I do to break rumination when my mind keeps looping?’
  • ‘Why does stress sometimes show up as physical tension?’

You’re not receiving medical advice, but you are receiving structured information and calmly presented ideas. That alone can lower the emotional temperature of a situation.

Another unexpected benefit is that ChatGPT encourages a kind of structured self-reflection.

If you present a problem vaguely, it will often ask clarifying questions. Those questions nudge you into thinking more carefully about what you’re actually experiencing.

In a sense, the tool becomes a mirror for your thinking.

Sometimes the process goes something like this:

  1. You arrive with a vague sense that ‘something feels wrong.’
  2. You try to explain it.
  3. ChatGPT responds with possibilities or frameworks.
  4. You realise the original problem isn’t quite what you thought.

That kind of clarification is something therapists often do very well. An AI can’t replace that relationship, but it can sometimes help you prepare for it.

Some Specifics

I’VE NOT BEEN using AI in this way for long, but in my limited experience, ChatGPT is particularly helpful for a few specific things:

1. Psychological education
Understanding the mechanics of stress, sleep disruption, anxiety, and mood can be empowering. The more you understand what might be happening, the less mysterious it feels.

2. Practical coping ideas
You can ask for lists of strategies — breathing exercises, journaling prompts, grounding techniques, ways to improve sleep routines, and so on.

3. Symptom tracking and organisation
AI is surprisingly good at helping you structure your thoughts before a medical appointment. I used this facility myself prior to a recent hospital visit. You can ask it to summarise symptoms, timelines, or questions you want to raise with your clinician.

4. Perspective
Sometimes you simply need a calm voice saying: ‘Here are a few possible explanations. Here’s what tends to help. And here’s when it might be wise to seek professional help.’

That alone can stop your imagination running away with you.

A Word on Limitations

Of course, this kind of tool has real limitations, and it’s good to understand these from the outset.

AI doesn’t know you personally. It doesn’t see facial expressions, hear tone of voice, or understand the deeper context of your life the way a trained therapist can.

More importantly, it cannot diagnose conditions or provide clinical treatment.

If someone is experiencing severe mental distress, suicidal thoughts, or a mental health crisis, professional help — doctors, therapists, crisis services — is essential.

AI can complement support, but it should never replace it.

I prefer to think of it less as a therapist and more as an articulate notebook that talks back. In that sense, it resembles journaling, but with a conversational element.

And sometimes, a calm conversation with something — even a machine — can be surprisingly grounding.

In summary, when used wisely, and alongside proper professional care when needed, it can function as a reflective companion — a sounding board that helps organise thoughts, explore coping strategies, and bring a little clarity to moments of emotional fog.

Closing Thoughts

IF SOMEONE had told me a year ago that I would one day be writing about artificial intelligence in a broadly positive light, I would have laughed in their face.

To be clear, my instincts about media technology haven’t fundamentally changed. The risks surrounding AI are real, and the potential for misuse is obvious. Any powerful tool can be abused, and recent history has shown us plenty of examples of technology being used to manipulate attention, distort information, and quietly shape behaviour.

That concern hasn’t vanished.

What has changed, however, is my appreciation that the story doesn’t end there.

Because the same tool that can mislead can also illuminate. The same system that has the capacity to amplify noise can, if used carefully and wisely, also help someone make sense of it.

Used sensibly, ChatGPT can function as a kind of reflective companion — not a therapist, not a doctor, and certainly not a substitute for real human support — but a readily available place to organise thoughts, ask questions, and explore possible explanations for what might be going on in your own mind.

And sometimes, when the mind is crowded with worries, that small space for reflection can be surprisingly helpful.

So yes, I remain cautious about artificial intelligence. But I’m no longer entirely against it.

Like most man-made tools, its value will depend on what we choose to do with it.
