Meeting detection in all apps + calendars integration #3478
Code Review
This pull request introduces a new meeting detection feature for the macOS app by monitoring system logs for microphone activity. When a meeting is detected, a floating "nub" UI is displayed. The implementation is spread across three new Swift files: MeetingDetector, NubManager, and NubWindow.
My review focuses on improving the correctness and maintainability of the new code. I've identified a potential logic issue in how meeting start/end events are processed, and a rendering bug in the progress bar UI. The suggested changes aim to make the feature more robust and simplify the code.
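For context, here is a minimal sketch of what log-based mic detection with debouncing might look like. The log substrings, type names, and the 5-second grace period are illustrative assumptions, not the PR's actual code; the real `log stream` messages vary by macOS version.

```swift
import Foundation
import Dispatch

// Hypothetical classifier for lines read from a `log stream` subprocess.
// The matched substrings below are assumptions for illustration only.
enum MicEvent { case started, stopped }

func classifyLogLine(_ line: String) -> MicEvent? {
    if line.contains("kInputMuteStateChangeProperty: false") { return .started }
    if line.contains("kInputMuteStateChangeProperty: true") { return .stopped }
    return nil // unrelated log line
}

// Debounce stop events so a brief mic toggle doesn't flap the nub UI:
// a stop only takes effect if no new start arrives within the grace period.
final class MeetingStateDebouncer {
    private(set) var inMeeting = false
    private var pendingStop: DispatchWorkItem?

    func handle(_ event: MicEvent, gracePeriod: TimeInterval = 5) {
        switch event {
        case .started:
            pendingStop?.cancel()   // cancel a scheduled "meeting ended"
            pendingStop = nil
            inMeeting = true
        case .stopped:
            let work = DispatchWorkItem { [weak self] in self?.inMeeting = false }
            pendingStop = work
            DispatchQueue.main.asyncAfter(deadline: .now() + gracePeriod, execute: work)
        }
    }
}
```

Splitting the line classifier from the debouncer keeps the start/end state machine testable in isolation, which is where the review's "logic issue in how meeting start/end events are processed" would surface.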
…ebouncing to avoid false positives
/gemini summary
Summary of Changes

This pull request delivers a significant enhancement by integrating calendar functionality and advanced meeting detection into the macOS application. The system monitors local calendars for scheduled meetings and observes microphone activity across all running applications. This dual approach enables the app to intelligently prompt users to record relevant conversations via a new "nub" UI and automatically associate these recordings with their respective calendar events, providing richer context for subsequent AI processing.
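The "associate recordings with calendar events" step described above amounts to a time-window match. A minimal sketch, where the type names and the ±15-minute tolerance are assumptions for illustration (not values from the PR):

```swift
import Foundation

// Hypothetical linking step: when a conversation starts, find a calendar
// event whose window (padded by a tolerance) contains the start timestamp.
struct CalendarMeeting {
    let title: String
    let start: Date
    let end: Date
}

func matchingMeeting(for conversationStart: Date,
                     in meetings: [CalendarMeeting],
                     tolerance: TimeInterval = 15 * 60) -> CalendarMeeting? {
    meetings.first { m in
        conversationStart >= m.start.addingTimeInterval(-tolerance) &&
        conversationStart <= m.end.addingTimeInterval(tolerance)
    }
}
```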
Gemini, you forgot to include that we don't rely solely on mic activity for detection. For browsers we also use the macOS Accessibility API to check the title of the open window, which lets us figure out which platform the user is using for the meeting in the browser.

Calendar events are synced to Firestore in a meetings collection. Whenever a conversation starts, we check whether there's a meeting in the calendar around that time and link it (storing the link in Redis if there is). This might not be the best approach, but it's in place for now.
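The browser case above boils down to mapping a window title to a meeting platform. A sketch of just the matching step, assuming the title has already been read (e.g. via the Accessibility API's `AXTitle` attribute); the title markers below are illustrative guesses, not the PR's actual patterns:

```swift
import Foundation

// Hypothetical platform detection from a browser window title.
enum MeetingPlatform: String {
    case zoom = "Zoom"
    case meet = "Google Meet"
    case teams = "Microsoft Teams"
}

func platform(fromWindowTitle title: String) -> MeetingPlatform? {
    let t = title.lowercased()
    if t.contains("zoom meeting") || t.contains("zoom.us") { return .zoom }
    if t.contains("meet.google.com") || t.contains(" - google meet") { return .meet }
    if t.contains("microsoft teams") { return .teams }
    return nil // not a recognized meeting tab
}
```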
What is the best alternative solution for keeping the app sandboxed, and what are its trade-offs?
We can rely on NSWorkspace to detect whether the user is running Zoom/Teams etc., but it won't be as exact and accurate as relying on mic usage. On the other hand, this approach doesn't require disabling sandboxing, and it can be improved further since we already request the screen capture permission. Overall it won't be as good as the current approach, but it will work; one limitation is that we may not be able to auto-stop recording when a meeting ends in a browser.
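The NSWorkspace alternative is essentially a bundle-identifier check against running applications. A minimal sketch; the bundle-ID list is illustrative (`us.zoom.xos` is Zoom's real identifier, the rest are examples to extend):

```swift
import Foundation

// Illustrative bundle identifiers for native meeting apps.
let meetingBundleIDs: Set<String> = [
    "us.zoom.xos",          // Zoom
    "com.microsoft.teams2", // Microsoft Teams (new client)
]

func isMeetingApp(bundleID: String?) -> Bool {
    guard let id = bundleID else { return false }
    return meetingBundleIDs.contains(id)
}

#if canImport(AppKit)
import AppKit

// NSWorkspace needs no special entitlements, so the sandbox can stay on,
// but this only proves the app is open, not that a call is actually live.
func meetingAppIsRunning() -> Bool {
    NSWorkspace.shared.runningApplications.contains {
        isMeetingApp(bundleID: $0.bundleIdentifier)
    }
}
#endif
```

This is exactly the trade-off described above: sandbox-friendly, but "app is running" is a much weaker signal than "mic is in use".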
e00a9dd replaces the logstream (separate process) and Accessibility API usage with NSWorkspace and Core Graphics APIs to detect meetings both natively and in browsers. This approach is inspired by Aside, which is pretty cool and App Store safe; I added extra logic using Core Graphics APIs to cover many browser-related cases and to make auto-stop recording possible.
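The Core Graphics part of this approach can be sketched as follows: enumerate on-screen windows via `CGWindowListCopyWindowInfo` and look for browser windows whose titles indicate an active meeting (window titles are only populated once the Screen Recording permission, which the app already requests, is granted). The browser names and title markers below are illustrative assumptions:

```swift
import Foundation
#if os(macOS)
import CoreGraphics
#endif

struct WindowInfo {
    let ownerName: String
    let title: String
}

// Illustrative lists; the PR's actual matching logic may differ.
let browserOwners: Set<String> = ["Google Chrome", "Safari", "Arc", "Firefox"]
let meetingTitleMarkers = ["Meet - ", "Zoom Meeting", "Microsoft Teams"]

// Pure matching step: find a browser window that looks like a live meeting.
// Returning nil can drive auto-stop once the meeting tab disappears.
func activeBrowserMeeting(in windows: [WindowInfo]) -> WindowInfo? {
    windows.first { w in
        browserOwners.contains(w.ownerName) &&
        meetingTitleMarkers.contains { w.title.contains($0) }
    }
}

#if os(macOS)
// Gathering the live window list (macOS only).
func currentWindows() -> [WindowInfo] {
    let raw = CGWindowListCopyWindowInfo(
        [.optionOnScreenOnly, .excludeDesktopElements],
        kCGNullWindowID) as? [[String: Any]] ?? []
    return raw.compactMap { dict in
        guard let owner = dict[kCGWindowOwnerName as String] as? String,
              let title = dict[kCGWindowName as String] as? String else { return nil }
        return WindowInfo(ownerName: owner, title: title)
    }
}
#endif
```

Polling this list and watching for the matched window to vanish is what makes auto-stop feasible in browsers, which the pure NSWorkspace approach could not do.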
I see. Please use logstream. Feel free to merge it, then open a new PR to add support:
Then we move the omi macOS app to .dmg only.
#3465
demo_detection.mp4