Google, LaMDA and sentient behavior, politics included
BlackTechLogy: Capitol Riot hearing proves that opinions can change, politically, culturally and socially
Writer’s note: This post was originally published on Medium’s “We Need to Talk” on June 16, 2022.
“Watch FOX News,” my grandfather would tell me on occasion. “You have to always know what your enemy is thinking.”
Considering he was a veteran who fought in a war, I understood why he thought this way. I even respected his open-minded views about giving someone unlike him “the floor” to share their views — from behind a television screen. But I don’t have the temperament to sit still long enough to get through 30 minutes, never mind hours, of FOX News. I would beeline into a different room whenever he changed the channel to the conservative-leaning station.
Years later, after my grandfather passed away, I still cannot do it. Even listening to former FOX political analyst Chris Stirewalt during the Capitol Riot hearings bothers me. He spent more time bragging about FOX News beating its competitors in calling the 2016 presidential election results, and about how “beautiful” their Arizona poll was, than on the matter at hand. The guy seemed to forget entirely why the Select Committee to Investigate the January 6th Attack was organized in the first place. The arrogance of it all annoys me.
I thought about the most recent Capitol Riot hearing while reading the Washington Post’s story “The Google engineer who thinks the company’s AI has come to life.” Blake Lemoine, the Google engineer, was put on administrative leave this week for warning the company that LaMDA is “sentient” — or “finely sensitive in perception or feeling.” In other words, emotionally intelligent.
The program, short for Language Model for Dialogue Applications, talked to the engineer about rights, personhood and religion. It also managed to make him change his mind about Isaac Asimov’s third law of robotics — “A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.”