
Conversation

@RJPercival

What type of PR is this? (check all applicable)

  • Feature
  • Refactor
  • Bug Fix
  • Optimization
  • Documentation Update
  • Other

Description

When a test fails within the scope of RequestsMock (e.g. an exception is raised or an assertion fails), responses skips asserting that all requests were fired. This can hide useful information for debugging the test, as the test failure may have been a side effect of an expected request not being called (e.g. the mocked URL was wrong). This leaves users with an awkward choice: perform all of their assertions outside the RequestsMock scope so they always see when expected requests were not fired (losing the context those assertions would otherwise provide), or perform the assertions within the RequestsMock scope and lose the information about which requests were not fired.

This PR introduces a new assert_on_exception parameter on the RequestsMock class and the @responses.activate decorator. It eliminates the awkward choice above by letting users opt in to the "all requests were fired" assertion running even when the test body has already raised. The resulting test failure message mentions both the failing responses assertion and the original error, giving the user the most context as to why the test is failing.

Key Changes

  • New Parameter: Added assert_on_exception boolean parameter to RequestsMock constructor
  • Decorator Support: Extended @responses.activate decorator to accept the new parameter
  • Enhanced Debugging: When assert_on_exception=True, request assertions are raised even during exceptions, providing context about which mocked requests were or weren't called
  • Backward Compatible: Default behavior remains unchanged (assert_on_exception=False)

Usage Examples

Context Manager Usage:

import requests
import responses

with responses.RequestsMock(
    assert_all_requests_are_fired=True,
    assert_on_exception=True
) as rsps:
    rsps.add(responses.GET, "http://example.com/users", body="test")
    rsps.add(responses.GET, "http://example.com/profile", body="test")  # Not called
    requests.get("http://example.com/users")
    raise ValueError("Something went wrong")

# Output:
# ValueError: Something went wrong
#
# During handling of the above exception, another exception occurred:
#
# AssertionError: Not all requests have been executed [('GET', 'http://example.com/profile')]
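
The chained output above comes from raising inside the context manager's exit. A toy sketch of how that can work (illustrative only; `MockScope` is a hypothetical class, not the responses implementation):

```python
class MockScope:
    """Toy context manager illustrating exit-time assertion chaining."""

    def __init__(self, assert_on_exception=False):
        self.assert_on_exception = assert_on_exception
        self.unfired = []  # URLs registered but not yet requested

    def __enter__(self):
        return self

    def add(self, url):
        self.unfired.append(url)

    def get(self, url):
        if url in self.unfired:
            self.unfired.remove(url)

    def __exit__(self, exc_type, exc_value, tb):
        # Normally the check is skipped while an exception is propagating;
        # assert_on_exception=True opts into running it anyway. Raising here
        # chains the AssertionError onto exc_value via __context__.
        if exc_type is None or self.assert_on_exception:
            if self.unfired:
                raise AssertionError(
                    f"Not all requests have been executed {self.unfired}"
                )
        return False  # never swallow the original exception
```

Because `__exit__` raises while the original exception is still propagating, Python chains the two implicitly, producing the "During handling of the above exception, another exception occurred" output shown above.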

Decorator Usage:

@responses.activate(
    assert_all_requests_are_fired=True,
    assert_on_exception=True
)
def test_my_api(): ...

Benefits

  • Improved Test Debugging: Developers can always see which HTTP requests weren't made when a test fails
  • Exception Context: Python's exception chaining shows both the original error and the unfired request information
  • Optional Feature: Preserves existing behavior by default to avoid masking original exceptions

Related Issues

PR checklist

Before submitting this pull request, I have done the following:

Added/updated tests?

Current repository has 100% test coverage.

  • Yes
  • No, and this is why:
  • I need help with writing tests

Test Coverage: Comprehensive tests added covering:

  • Context manager behavior with and without exceptions
  • Decorator functionality with new parameter
  • Exception chaining verification
  • Backward compatibility validation

When set to True, assertions about unfired requests are raised even
when an exception occurs in the context manager. This provides valuable
context about which mocked requests were or weren't called when a test
fails.

By default (assert_on_exception=False), the assertion is suppressed to
avoid masking the original exception.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
@RJPercival RJPercival changed the title Assert on exception Add assert_on_exception parameter to add context to test failures Nov 3, 2025
@RJPercival RJPercival marked this pull request as ready for review November 3, 2025 11:15
@RJPercival RJPercival requested a review from markstory as a code owner November 3, 2025 11:15
raise ValueError("Main error")

# The AssertionError should mention the unfired request
assert "not-called.com" in str(assert_exc_info.value)

Check failure — Code scanning / CodeQL: Incomplete URL substring sanitization. The string not-called.com may be at an arbitrary position in the sanitized URL.
m.add(responses.GET, "http://not-called.com", body=b"test")
requests.get("http://example.com")

assert "not-called.com" in str(assert_exc_info2.value)

Check failure — Code scanning / CodeQL: Incomplete URL substring sanitization. The string not-called.com may be at an arbitrary position in the sanitized URL.
test_with_assert_on_exception()

# The AssertionError should mention the unfired request
assert "not-called.com" in str(assert_exc_info.value)

Check failure — Code scanning / CodeQL: Incomplete URL substring sanitization. The string not-called.com may be at an arbitrary position in the sanitized URL.
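
One way to address the CodeQL finding (a suggestion, not part of this PR) is to match the full quoted URL in the assertion message rather than a bare substring, so an unrelated URL that merely contains `not-called.com` cannot satisfy the check:

```python
import re

# Example assertion message of the shape responses raises on exit:
message = "Not all requests have been executed [('GET', 'http://not-called.com/')]"

# Anchoring the pattern on the quotes and scheme (and escaping the dots)
# means a URL like "http://evil.example/not-called.com" will not match.
assert re.search(r"'http://not-called\.com/?'", message)
```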
Comment on lines +1226 to +1233
with pytest.raises(ValueError) as value_exc_info:
with responses.RequestsMock(
assert_all_requests_are_fired=True, assert_on_exception=False
) as m:
m.add(responses.GET, "http://example.com", body=b"test")
m.add(responses.GET, "http://not-called.com", body=b"test")
requests.get("http://example.com")
raise ValueError("Main error")
Member

It is arguable that this behaviour is a bug. And that assert_on_exception doesn't need to exist.

Author

The problem with changing this behaviour without an opt-in feature flag is that it'd be a breaking change, as the tests demonstrate (the exception leaving the RequestsMock block would change).

Member

I disagree that the exception type changing because the library has become more strict/correct constitutes a breaking change. Following that line of thinking would imply that no additional exception types could be added as they could 'break' compatibility.

with pytest.raises(AssertionError) as assert_exc_info:

@responses.activate(
assert_all_requests_are_fired=True, assert_on_exception=True
Member

It looks like assert_on_exception is a no-op when used by itself, which feels like a sub-optimal design choice.

Author @RJPercival, Nov 4, 2025

I guess we could make assert_on_exception imply assert_all_requests_are_fired, but I'm not sure that's better. We could raise an exception if assert_all_requests_are_fired is False but assert_on_exception is True? That's trivial to do when instantiating RequestsMock; a bit trickier to do with the decorator since it mocks the attributes on RequestsMock.
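
The constructor-time validation suggested above could look like this (a sketch under the stated assumption; `RequestsMockSketch` is hypothetical and this check is not in the PR):

```python
class RequestsMockSketch:
    """Sketch: reject the flag combination in which assert_on_exception
    would silently be a no-op."""

    def __init__(self, assert_all_requests_are_fired=False,
                 assert_on_exception=False):
        if assert_on_exception and not assert_all_requests_are_fired:
            raise ValueError(
                "assert_on_exception=True has no effect unless "
                "assert_all_requests_are_fired=True"
            )
        self.assert_all_requests_are_fired = assert_all_requests_are_fired
        self.assert_on_exception = assert_on_exception
```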

RJPercival and others added 2 commits November 4, 2025 10:52
The @responses.activate decorator now accepts an assert_on_exception
parameter, providing a convenient way to enable assertion checking
even when exceptions occur:

    @responses.activate(
        assert_all_requests_are_fired=True,
        assert_on_exception=True
    )
    def test_my_api():
        ...

This is consistent with the existing decorator support for
assert_all_requests_are_fired and registry parameters.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
Document the new assert_on_exception parameter in version 0.26.0.

This is a minor version bump (not patch) because we're adding new
functionality to the public API, even though it's fully backward
compatible with existing code.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>