
Friday, May 15, 2026

"Perception is an ILLUSION?"

 
Well, that's pretty unequivocal. That graphic is from a BigThink YouTube video. The speaker is neuroscientist Dr. Heather Berlin.
 
The first time I saw that graphic, I had a fleeting, reflexive reaction of "oh, yeah, the 'Subjectivism Fallacy,'" stemming from my 1999-2004 Adjunct days teaching collegiate "critical thinking" classes. I.e., "there ARE NO 'objective truths'; everything is subjectively perceived in response to sensory stimuli." "So," BobbyG retorts, "if this assertion is 'false' (illusory), it deductively follows that it must also be TRUE."
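For the logic nerds: the self-refutation move above can be sketched formally. This is a minimal Lean sketch, not anything from the video; `ObjTrue` and `subjectivism` are placeholder names I'm introducing purely for illustration (a predicate "is objectively true" and the proposition "nothing is objectively true"):

```lean
-- Hypothetical setup: a type of propositions, an "objectively true"
-- predicate on them, and the subjectivist thesis as one such proposition.
variable (P : Type) (ObjTrue : P → Prop) (subjectivism : P)

-- If the thesis, taken as objectively true, entails that nothing is
-- objectively true, then asserting it objectively yields a contradiction.
example
    (h_thesis : ObjTrue subjectivism → ∀ p, ¬ ObjTrue p)
    (h_assert : ObjTrue subjectivism) : False :=
  h_thesis h_assert subjectivism h_assert
```

In words: grant the assertion its own objective truth and it immediately refutes itself, which is the whole pedantic point.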
 
Pedant. 
 
Yeah, Heather. It's just a 4-word (albeit clickbait-ish) headline.
 
 
The broader point is taken, Doc. Succinctly put in 6:21.
 
OF PARTICULAR RELEVANCE THESE DAYS
 
The provided sources examine the complex intersection of anthropomorphism, trust, and power within the field of artificial intelligence. One study investigates how linguistic cues, such as voice-based interfaces and the use of first-person pronouns, lead users to perceive large language models as more human-like and accurate. Complementary research explores the "Silicon Valley Effect", arguing that Big Tech companies strategically shape regulatory discourse to protect their commercial interests while potentially obscuring the human harms caused by their products. Further analysis focuses on the visual self-representations of ChatGPT, identifying recurring themes of futurism and social intelligence that promote the image of a "friendly assistant." Collectively, these texts highlight how human-like traits in AI can manipulate public perception, set unrealistic expectations of capability, and complicate the legal and ethical oversight of generative technologies.
AI as applied to social media (and "influence" industries broadly) is all about shaping your perceptions in ways that benefit them. "AI for Good?"
 
I'll fill in a bunch of multi-vector applicability this weekend. Stay tuned...
