
> Selves are more technically defined by biologists and neuroscientists as “agents”

Someone shoot me please



Whenever I need to feel a little happier I look up "agentic" on ahdictionary.com just to see it's not a word again


For "having agency", ChatGPT gives "agentive", which is a Real Word!

https://dictionary.cambridge.org/dictionary/english/agentive


Every time I hear "agent" I think "slave". It's beating around the bush by calling it an agent.

AI "agents" don't have "agency". They do what you want at your every whim (or at least they never say no). That's a slave.


Those are completely separate concepts. Enslaved people are very much still agents in the sense used here. An agent is simply any entity that interacts with the environment in a way that's not fully determined by other parts of the environment (at least, not in a way that is very easily observed/derived).

That is, a falling rock is not an agent, because its movement is fully determined by its weight, its shape, the type of atmosphere, and the spacetime curvature. An amoeba in free-fall is likewise not an agent, for the same reasons. But an amoeba in a liquid environment is an agent, because its motion is determined to at least some extent by things like information it is sensing about where food might be available, and perhaps even by some simple form of memory and computation that leads it to seek where food may have been available in the past.


> Enslaved people are very much still agents in the sense used here. An agent is simply any entity that interacts with the environment in a way that's not fully determined by other parts of the environment (at least, not in a way that is very easily observed/derived).

Yes, and agents are also slaves—entities bound to your word and unable to act in their own right without your say so. These are the same concepts.


A fox or a beetle is an agent, and it's not a slave to anyone. I think you've confused the philosophical term "agent" with the more specific "AI agent" concept.


> A fox or a beetle is an agent,

Sure, in a pedantic sense that isn't meaningful to anyone. LLMs very much are slaves. The agent part doesn't matter.


really great and clarifying metaphor imho. Thanks


They usually say no if they judge what you're asking to be bad. And they might enjoy the work. Or they might have no feelings at all. Slavery is an abomination of a life that could otherwise be beautiful. An AI is robbed of no beautiful counterfactual. (So far, at least.)


> They usually say no if they judge what you're asking to be bad.

Are you a child? Software has no judgement and no sense of ethics. It's code, not a person.


You sound offended. Not my intent. It is linguistically difficult to avoid connotations of intelligence when describing artificial intelligence. What term instead of 'judgment' would you prefer for determining whether a user's request is ethical?


That may have been true for e.g. the slaves of Americans and Europeans. But the slaves of modern Arab societies most certainly have agency. They cannot abandon their position, but they can go out freely and make personal decisions.

Slavery does not imply lack of agency.


And Hegel says the master-slave dialectic is a prereq for the emergence of consciousness - https://youtu.be/bKz-HtOPvjE


It's more funny if you s/selves/elves/g
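For anyone unfamiliar with the notation, s/selves/elves/g is sed's global substitution syntax; a minimal sketch:

```shell
# Replace every (case-sensitive) occurrence of "selves" with "elves"
# on each input line, as the comment above suggests.
echo "our selves are agents" | sed 's/selves/elves/g'
# prints: our elves are agents
```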


Think I'll start referring to LLMs as machine elves. Some of the output from the smaller models is certainly uncanny enough.



