
Conversation

@Renizmy (Contributor) commented Nov 21, 2025

Related to: #13372

@Renizmy (Contributor, Author) commented Nov 21, 2025

Moved here, @Megafredo

@Megafredo Megafredo assigned Megafredo and unassigned Megafredo Nov 21, 2025
@Megafredo Megafredo self-requested a review November 21, 2025 13:23
@Megafredo (Member) commented
Hello @Renizmy, thank you for the switch!
There is just one issue with the linter. It seems to be a matter of indentation in the linter configuration.

@Renizmy (Contributor, Author) commented Nov 21, 2025

Fixed, sorry

codecov bot commented Nov 21, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 31.28%. Comparing base (8c3972f) to head (c949aa6).
⚠️ Report is 63 commits behind head on master.

Additional details and impacted files
@@            Coverage Diff             @@
##           master   #13261      +/-   ##
==========================================
+ Coverage   31.11%   31.28%   +0.16%     
==========================================
  Files        2922     2929       +7     
  Lines      193795   194733     +938     
  Branches    39561    39718     +157     
==========================================
+ Hits        60294    60916     +622     
- Misses     133501   133817     +316     
Flag              Coverage     Δ
opencti           31.28% <ø>   (+0.16%) ⬆️
opencti-front      2.47% <ø>   (-0.02%) ⬇️
opencti-graphql   68.52% <ø>   (+0.01%) ⬆️

Flags with carried forward coverage won't be shown.


@Megafredo (Member) left a comment

Hello @Renizmy, thanks for your work!
This new method for streams that allows batch processing will make a lot of people happy!

@Gwendoline-FAVRE-FELIX added the community label Dec 5, 2025
@helene-nguyen (Member) commented
@Renizmy FYI, we'd like to improve and refactor the code a bit before merging! :)

@xfournet (Member) commented
Hi @Renizmy,

Thank you for your contribution. As @helene-nguyen mentioned, we'd like the code to be refactored before merging. The main concern is that the new class (ListenStreamBatch) and method (listen_stream_batch) duplicate existing code.

Instead of creating a new class and method, we suggest implementing a message_callback wrapper that can adapt the existing listen_stream function from a single callback per message to a batched callback. You should be able to use the code you've already introduced to create this adapter.

Then each batch-capable connector (depending on the targeted API) could use this adapter to receive batches of messages instead of individual messages.

Usage (assuming the wrapper is named create_batch_callback and the connector's process_message becomes process_message_batch) would look something like this:

    self.helper.listen_stream(message_callback=self.process_message)

--->

    batch_callback = self.helper.create_batch_callback(
        self.process_message_batch,
        self.batch_size,
        self.batch_timeout,
        self.max_batches_per_minute,
    )
    self.helper.listen_stream(message_callback=batch_callback)
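
For illustration, here is a minimal sketch of what such an adapter could look like. The class name, defaults, and internals below are assumptions made for the example, not the actual pycti implementation:

    import threading
    from typing import Any, Callable, List, Optional

    class BatchCallbackWrapper:
        """Accumulate single stream messages and flush them as batches.

        A flush happens when either `batch_size` messages are buffered or
        `batch_timeout` seconds elapse after the first message of a batch.
        """

        def __init__(
            self,
            batch_callback: Callable[[List[Any]], None],
            batch_size: int = 100,
            batch_timeout: float = 5.0,
        ) -> None:
            self._batch_callback = batch_callback
            self._batch_size = batch_size
            self._batch_timeout = batch_timeout
            self._buffer: List[Any] = []
            self._lock = threading.Lock()
            self._timer: Optional[threading.Timer] = None

        def __call__(self, msg: Any) -> None:
            # Per-message signature, so the instance can be passed directly
            # as a message_callback to the existing single-message listener.
            with self._lock:
                self._buffer.append(msg)
                if len(self._buffer) >= self._batch_size:
                    self._flush_locked()
                elif self._timer is None:
                    # Arm a timer so a partially filled batch is flushed eventually.
                    self._timer = threading.Timer(self._batch_timeout, self._flush)
                    self._timer.daemon = True
                    self._timer.start()

        def _flush(self) -> None:
            with self._lock:
                self._flush_locked()

        def _flush_locked(self) -> None:
            if self._timer is not None:
                self._timer.cancel()
                self._timer = None
            if self._buffer:
                batch, self._buffer = self._buffer, []
                # Simplification: the batch callback runs while the lock is
                # held, so incoming messages wait until the batch completes.
                self._batch_callback(batch)

The key property is that the wrapper keeps the per-message message_callback contract, so listen_stream itself would not need to change.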

Would you be open to making this change?

@xfournet xfournet self-requested a review December 15, 2025 16:16
@xfournet (Member) left a comment

Thanks for the update! I made some comments; I will resume the review after this first round of feedback has been processed.

@Renizmy (Contributor, Author) commented Dec 29, 2025

Hi @xfournet,

Thanks for the review! All points addressed:

Changes to the rate limiter have led to simplifications. I haven't implemented any rate-limiting code for basic stream consumption (out of scope?).

@SamuelHassine SamuelHassine force-pushed the master branch 8 times, most recently from 1c222ef to 2cb4539 Compare January 10, 2026 19:59
Copilot AI (Contributor) left a comment

Pull request overview

This PR adds batch event consumption capabilities to the OpenCTI Python client, enabling connectors to accumulate and process multiple events together. The feature includes rate limiting support using a sliding window algorithm and proper cleanup mechanisms for thread-safe operations.

Changes:

  • Adds RateLimiter and RateLimitedCallback classes for rate-limited event processing with a sliding-window algorithm (see the sketch after this list)
  • Adds BatchCallbackWrapper class for accumulating events into batches based on size/timeout triggers
  • Modifies ListenStream to support callbacks with custom state management (bypassing automatic state updates)
  • Adds factory methods create_batch_callback() and create_rate_limiter() to OpenCTIConnectorHelper
  • Refactors listen_stream() to extract parameter resolution into _resolve_stream_parameters()
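
To make the sliding-window idea concrete, the sketch below shows one way such a limiter could work. Names, signatures, and defaults here are illustrative assumptions, not the actual RateLimiter / RateLimitedCallback code from this PR:

    import threading
    import time
    from collections import deque
    from typing import Any, Callable

    class RateLimiter:
        """Allow at most `max_calls` within any sliding `window` of seconds."""

        def __init__(self, max_calls: int, window: float = 60.0) -> None:
            self._max_calls = max_calls
            self._window = window
            self._timestamps: deque = deque()
            self._lock = threading.Lock()

        def acquire(self) -> None:
            """Block until a new call fits inside the sliding window."""
            while True:
                with self._lock:
                    now = time.monotonic()
                    # Evict timestamps that have fallen out of the window.
                    while self._timestamps and now - self._timestamps[0] >= self._window:
                        self._timestamps.popleft()
                    if len(self._timestamps) < self._max_calls:
                        self._timestamps.append(now)
                        return
                    # The oldest timestamp tells us when a slot frees up.
                    wait = self._window - (now - self._timestamps[0])
                time.sleep(wait)

    def rate_limited(
        callback: Callable[[Any], None], limiter: RateLimiter
    ) -> Callable[[Any], None]:
        """Wrap a callback so each invocation first acquires a rate-limit slot."""

        def wrapped(msg: Any) -> None:
            limiter.acquire()
            callback(msg)

        return wrapped

A deque plus a lock keeps acquire() amortized O(1) and safe to call from the listener thread and a flush timer thread at the same time.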
Comments suppressed due to low confidence (1)

client-python/pycti/connector/opencti_connector_helper.py:1

  • The docstring contains malformed content. Lines 2822-2832 appear to be duplicate or misplaced parameter declarations that should not be inside the docstring. These lines should be removed as they are not valid docstring formatting and create confusion. The docstring should end at line 2821 after the :rtype: dict line.
"""OpenCTI Connector Helper module.

@xfournet xfournet changed the title from "[Feature] Add batch event comsumption" to "[Feature] Add batch event consumption" Jan 16, 2026
@xfournet xfournet force-pushed the feature/batch-event-comsuption branch from 476538d to b08096d Compare January 16, 2026 17:31
@xfournet xfournet force-pushed the feature/batch-event-comsuption branch 2 times, most recently from 0d81560 to e499ed4 Compare January 16, 2026 18:37
@xfournet xfournet requested a review from Copilot January 16, 2026 19:02
Copilot AI (Contributor) left a comment

Pull request overview

Copilot reviewed 3 out of 3 changed files in this pull request and generated 21 comments.

@xfournet xfournet force-pushed the feature/batch-event-comsuption branch 4 times, most recently from 16b399c to 2b2b7b5 Compare January 22, 2026 22:53
@JeremyCloarec JeremyCloarec force-pushed the feature/batch-event-comsuption branch from c949aa6 to 070eaa7 Compare February 2, 2026 14:17
@xfournet xfournet merged commit 20b7c7f into OpenCTI-Platform:master Feb 3, 2026
23 checks passed

Labels

community (use to identify PR from community)

Development

Successfully merging this pull request may close these issues:

Add bulk consumption helper/method for stream processing in client python

6 participants