Slow down AI now?

This might well be the end of us… was my initial reaction to the recent avalanche of AI developments.

Then I entered a consciously positive space and explored how partnering with an AI could be incredibly helpful, not just for coaches and their clients.

And then, yesterday, I found myself immersed in two presentations by people I respect, and they opened up a few thoughts that brought back that initial concern. I’m not quite sure what to do with this, as it seems impossible to keep up with the nauseatingly fast pace of change in this landscape. But given the recent, increasingly loud calls to slow down the release of new AI tech into the public sphere, combined with human beings generally being quite shit at slowing down once they smell an exciting opportunity, I’m now concerned again.

Funny, really, as I’m usually the big-picture thinker, and much of the excitement at the moment lies in the detail and the short-term effects that AI opens up. It took a historian to remind me what’s at stake!

Yuval Harari, in this excellent presentation, points out that humans operate fundamentally through language, and that science fiction likely had it wrong when it assumed that machines would need to threaten us physically in order for humanity to be at risk. In reality, we’re not far from a world in which a significant amount of content we consume will be produced by non-humans. And since culture curates our reality, and content makes up our culture, we’re at risk of changing our perception of reality in profound ways.

Yuval also warns that democracy rests on the premise of a conversation across the population about how we govern ourselves. Once that conversation gets hijacked by AI and optimised for particular outcomes, it’ll be impossible to stop or slow down AI development through democratic means.

Tristan Harris and Aza Raskin, who feature prominently in the documentary “The Social Dilemma”, dug into the AI landscape in this presentation and similarly warn us that right now is the time to intervene. If we’d had the chance to add more oversight to Facebook’s or TikTok’s addiction-inducing and election-tipping algorithms while they were still in development, wouldn’t we have wanted to take it?!

They also asked whether you’d board a flight if 50% of the engineers who built the plane told you there was a 10% or higher chance of dying on board. Those are the numbers from a survey of researchers on AI safety (although there’s some debate about the exact figures).

Sam Altman, co-founder and CEO of OpenAI, along with a few other experts, just testified before the US Congress, kicking off a series of hearings designed to set rules for AI development. I didn’t get to listen to it in full, and frankly, I don’t have time to sit through the three hours, so what it comes down to for me, as for so many people I reckon, is whether we can trust our governments and regulating institutions. Now I’m happy to be back in Germany, where my level of trust in government is reasonably high. I can’t say the same for the US or the UK, which is pretty worrying.

And that’s when I arrived home, gave my daughter a big long hug, and wished her the best of luck!

It can be scary and exciting to sit with the possibilities that AI holds, and it’s hard to even begin to imagine how the world might change over the next 12 months…

And that’s where I’ll leave it today. With lots of uncertainty and little clarity. I can feel the urge to put a comforting spin on this and emphasise the positives. But I also think it’s important to pay attention to the whole spectrum of what’s at hand here, so I invite you to sit in some of this uncertainty with me.

With Love
Yannick

_________________________________________________________________________

New Content: Coaching “out in the real world”?


Coaching is often presented as a well-defined practice with clear guidelines and competencies. Yet once coaches are released “into the wild”, their coaching styles often change and evolve, to the point where their practice can feel quite different from what’s taught at schools and training institutions. To shed some light on these developments, Siawash and I decided to go live on Facebook to record this special episode of our podcast, inspired by, and celebrating, the Coaching Lab. Sorry to have missed you, Nikki! You were there in spirit.

You can listen to this episode or watch it on YouTube.