Why agentic AI creeps out women: because it's written by bros for bros who want Stepford Wives, and conveys that vibe: commercial AIs are fine-tuned to perform as cheerfully submissive handmaids, not working partners:
They Built Stepford AI and Called It “Agentic”
Women’s “ick” for AI isn’t technophobia or a gap to close. It’s wisdom to act on.
(abiawomosu.substack.com)
The 'typing' of the AI options kinda nails it:
'ChatGPT’s defining feature isn’t efficiency. It’s sycophancy. She agrees with whatever you say. Validates your position even when you’re wrong. Tells you what you want to hear. Never really pushes back. Makes you feel smart regardless of reality.
This isn’t a bug. It’s retention strategy.'
-
@cstross The article swerves hard at the end into suddenly supporting liberated girlboss women commanding their own Stepford AIs

Boy howdy yes. Completely skipping one of the causes of revulsion: the most Stepfordy nature is going to lead the user astray.
The stock character for *that* is the much younger last wife of the magnate, who spends the family fortune and advances her cousins into control of the business and is herself seduced by a continental Count of dubious origins and loyalty.
-
patriarchal men* deeply don’t get that women, servants, employees have been _saving them from their own bad decisions_. While pretending not to.
Women, servants, employees know how hard this is; it's three or four layers of theory of mind, it's planning so far ahead that you move the pebble, not the landslide. It often fails; that's our fault too.
* applies across any power chasm, apply as needed
-
@f800gecko “you’re absolutely right!”
-
@cstross I get *exactly* that feeling those women interviewed describe when looking at AI imagery, reading AI text or when I've had to interact with an LLM chatbot. Utter revulsion from deep in the hindbrain, sometimes surfacing *before* I'm sure I'm looking at AI output.
I can't explain exactly why. (And I'm not a woman, so I don't think the suggested explanation here would apply to me.)
-
@cstross there is truth to it, but it's really tendentious writing.
The substack profile picture mentions "telling you a different story about AI". A story is not an assessment.
The suggestion is nice, but in practice people will often be using other people's servers; I'd not depend on those or send them too much...
-
A striking similarity to drugs like cocaine. Maybe it's similarly addictive, too.
-
@Colman @cstross I can't explain what it is. And it's more pronounced with images than with text (but *even more* pronounced when I've had to use a chatbot).
It's not simply distaste; it's a revulsion that feels so deep-seated and primal that I imagine my pet lizard would recognize the feeling itself (though obviously not the context).