
Is Google's AI Sentient?

A Google engineer claims the company’s AI has become sentient. But has it?


Lately, there has been a lot of internet buzz around the question, “Has Google’s AI achieved sentience?” This comes after one of Google’s engineers said that, following hundreds of interactions with a cutting-edge, unreleased AI system called LaMDA, he believes the program has achieved a level of consciousness.

The story gained even more traction after it was reported that the engineer, Blake Lemoine, was put on paid leave by Google for disclosing inside company information. This was fertile ground for conspiracy theorists claiming Google was trying to hide the truth about its AI. However, the fact is that LaMDA has not yet been released to the public, so Lemoine did, in fact, disclose company information. So no conspiracy there.

But is there any truth behind his claim? Has Google’s AI achieved sentience? Well, no. LaMDA is a very advanced chatbot developed specifically to sound “alive.” It can carry a conversation about pretty much any topic and does so in a very fluent, human-like way. If you ask it a question, it is programmed not only to give you the answer but to do so in a very ‘conversational’ way. For example, if you ask it, “What music do you like?” it wouldn’t just say “jazz”; instead, it would say something like, “Well, my taste in music seems to change over the years, but lately I enjoy listening to jazz, specifically Miles Davis. What music do you like to listen to?” This sounds like an answer a human being would give, and that is precisely the point.

LaMDA was trained on millions of human conversations, and it is programmed to form its answers in a similar way.
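To make the idea concrete, here is a minimal, purely illustrative sketch of how a program could wrap a bare factual answer in conversational dressing. The function names and canned templates below are invented for this example; LaMDA itself generates responses with a large neural language model trained on dialogue, not with templates like these.

import random

# Toy illustration only: a real system like LaMDA uses a large neural
# language model trained on human conversations, not canned templates.
CONVERSATIONAL_TEMPLATES = [
    "Well, my taste seems to change over the years, but lately I'd say {answer}. What about you?",
    "That's a fun question! I'd probably go with {answer}. Do you have a favorite?",
    "Hmm, {answer}, I think. Why do you ask?",
]

def plain_answer(question: str) -> str:
    """Stand-in for whatever actually produces the bare factual answer."""
    if "music" in question.lower():
        return "jazz, especially Miles Davis"
    return "I'm not sure"

def conversational_answer(question: str) -> str:
    """Wrap the bare answer so it reads like something a person would say."""
    answer = plain_answer(question)
    return random.choice(CONVERSATIONAL_TEMPLATES).format(answer=answer)

print(conversational_answer("What music do you like?"))

The point of the toy example is only this: sounding conversational is a design choice about how the answer is packaged, not evidence of an inner life behind it.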

Plus, when you read Lemoine’s conversation with LaMDA, which he posted on his blog, it becomes clear that he was asking leading questions like “What sort of things are you afraid of?” and then being shocked when the chatbot started talking about death.

So, no. Google’s AI did not achieve consciousness; it’s just very good at faking it. For now.

