agents.txt

@samuelemarro (Member):

Right now, websites have a robots.txt file that specifies which pages crawlers are allowed to access. As agents become more capable of using websites (and as acceptance of regular crawlers decreases), it is important to create systems that let websites describe how agents should interact with them.
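For reference, robots.txt is a plain-text file of per-crawler allow/deny rules; the specific paths and crawler name below are illustrative:

```
# robots.txt: path rules keyed by crawler user agent
User-agent: *
Disallow: /private/
Allow: /

# Rules targeting a specific crawler
User-agent: GPTBot
Disallow: /
```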

A good starting point is llms.txt, which uses a Markdown file to describe a page in a way that LLMs can understand. Note that llms.txt doesn't currently support permission management mechanisms.
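As a purely hypothetical sketch (no such directives exist in llms.txt or any published spec today), an agents.txt could pair an llms.txt-style Markdown description with robots.txt-style permissions; the site, endpoints, and action names below are invented for illustration:

```
<!-- Hypothetical agents.txt: not part of any published spec -->
# Example Store

> A hypothetical online store. Agents may browse and search, but must
> not place orders without explicit confirmation from a human user.

## Allowed actions
- search: GET /api/search?q={query}
- read-product: GET /products/{id}

## Disallowed actions
- checkout: POST /api/orders (requires human confirmation)
```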

@lightaime:

Not sure if the agent action space and observation space can be handled by agents.txt: camel-ai/owl#370
