
[CopilotClient] aiohttp "Chunk too big" error on large responses in microsoft-agents-copilotstudio-client #268

Description

@LeoBERTAU

Describe the bug

I am using microsoft-agents-copilotstudio-client (v0.6.1). When the Copilot agent returns a large response (e.g., a long summary > 64KB), the underlying aiohttp client crashes with a "Chunk too big" error.

The library does not expose a way to configure the underlying aiohttp.ClientSession or to raise the StreamReader readuntil limit, which defaults to 64KB (2 ** 16 bytes). That is often insufficient for generative AI responses delivered as single-line JSON or SSE events.
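For reference, aiohttp already exposes this knob when you construct the session yourself: the read_bufsize parameter (available since aiohttp 3.7) sets the per-response StreamReader limit. A minimal sketch of what the SDK could pass through:

import aiohttp

# read_bufsize sets the StreamReader limit for each response
# (default 2 ** 16 = 64KB). A session built this way would not
# hit the "Chunk too big" ValueError on large bodies.
session = aiohttp.ClientSession(read_bufsize=10 * 1024 * 1024)  # 10MB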

To Reproduce

  1. Initialize CopilotClient.
  2. Send a query that generates a large text response (> 64KB).
  3. The client raises ValueError("Chunk too big") from aiohttp.streams.StreamReader.readuntil (see the sketch below).
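In code, the repro looks roughly like this. The import path, the ConnectionSettings fields, and the start_conversation/ask_question methods follow the SDK samples as I understand them; treat them as assumptions, not exact signatures:

import asyncio

# Import path per the SDK samples; adjust for your version.
from microsoft.agents.copilotstudio.client import ConnectionSettings, CopilotClient

async def main() -> None:
    # Field names are assumptions based on the samples.
    settings = ConnectionSettings(
        environment_id="<environment-id>",
        agent_identifier="<agent-schema-name>",
    )
    client = CopilotClient(settings, "<access-token>")

    # Steps 1-2: open a conversation, then ask for something verbose.
    conversation_id = ""
    async for activity in client.start_conversation():
        if activity.conversation:
            conversation_id = activity.conversation.id

    # Step 3: once the buffered response body exceeds the 64KB limit,
    # aiohttp raises ValueError("Chunk too big").
    async for activity in client.ask_question(
        "Write a detailed summary of every document you know about.",
        conversation_id,
    ):
        print(activity.text)

asyncio.run(main())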

Expected behavior

The client should either let callers configure the session's read limit or default to a higher limit (e.g., 10MB) to handle verbose LLM outputs.
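One possible shape for the fix, purely as a sketch (the session parameter below is hypothetical and does not exist in v0.6.1):

import aiohttp
from microsoft.agents.copilotstudio.client import CopilotClient

# Hypothetical: accept a caller-supplied session so the read buffer limit
# (plus proxies, timeouts, etc.) stays under the caller's control.
# `session=` is NOT a real CopilotClient parameter today; settings and
# token are as in the repro sketch above.
session = aiohttp.ClientSession(read_bufsize=10 * 1024 * 1024)
client = CopilotClient(settings, token, session=session)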

Workaround

I am currently using this monkeypatch to fix it:

import aiohttp.streams

_original_init = aiohttp.streams.StreamReader.__init__

# Patch the default StreamReader limit from 64KB to 10MB.
def _patched_init(self, *args, **kwargs):
    # StreamReader(protocol, limit, *, timer=None, loop=None): the limit may
    # arrive positionally or as a keyword; bump it only when it is still at
    # the 64KB default (2 ** 16 = 65536).
    if len(args) >= 2 and args[1] == 65536:
        args = (args[0], 10 * 1024 * 1024) + args[2:]
    elif kwargs.get('limit', 65536) == 65536:
        kwargs['limit'] = 10 * 1024 * 1024
    _original_init(self, *args, **kwargs)

aiohttp.streams.StreamReader.__init__ = _patched_init
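Since aiohttp constructs a fresh StreamReader per response, the patch only has to run once, before the first request is made; importing it at application startup, ahead of creating the CopilotClient, is enough. The obvious downside is that it is global and affects every aiohttp consumer in the process, which is why a first-class configuration option would be preferable.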

