
Conversation

Contributor

@Kitenite Kitenite commented Oct 1, 2025

Description

Related Issues

Type of Change

  • Bug fix
  • New feature
  • Documentation
  • Refactor
  • Other (please describe):

Testing

Screenshots (if applicable)

Additional Notes


Important

Introduces message queuing for chat, enhancing multi-message handling and UI for managing queued messages.

  • Behavior:
    • Introduces message queuing in use-chat/index.tsx to handle multiple submissions and edits.
    • Adds processNextInQueue to process queued messages sequentially.
    • Updates sendMessage to queue messages if streaming or pending messages exist.
  • UI:
    • Adds QueueItems and QueuedMessageItem components to display and manage queued messages in chat-input.
    • Updates ChatInput to include a collapsible panel for queued messages.
  • Refactor:
    • Removes suggestions from ChatInputProps and related logic.
    • Updates useChat to manage message queue state and processing.
  • Misc:
    • Exports queue from message/index.ts.

This description was created by Ellipsis for 3fb5aac. You can customize this summary. It will automatically update as commits are pushed.


Summary by CodeRabbit

  • New Features

    • Introduced chat message queue: queue multiple prompts, see pending items, and process them automatically.
    • Added collapsible “Queued messages” panel with counts and the ability to remove individual items.
    • Context-aware processing improves reliability when editing or streaming messages.
    • Dynamic control: Send button switches to Stop while streaming.
  • Style

    • Updated Chat input layout: reduced padding and refined spacing for better density.
    • Minor visual tweaks to context pills for improved alignment.


supabase bot commented Oct 1, 2025

This pull request has been ignored for the connected project wowaemfasoptxrdjhilu because there are no changes detected in apps/backend/supabase directory. You can change this behaviour in Project Integrations Settings ↗︎.


Preview Branches by Supabase.
Learn more about Supabase Branching ↗︎.


vercel bot commented Oct 1, 2025

The latest updates on your projects. Learn more about Vercel for GitHub.

Project Deployment Preview Comments Updated (UTC)
web Ready Ready Preview Comment Oct 1, 2025 6:53am
1 Skipped Deployment
Project Deployment Preview Comments Updated (UTC)
docs Skipped Skipped Oct 1, 2025 6:53am


coderabbitai bot commented Oct 1, 2025

Caution

Review failed

The pull request is closed.

Note

Other AI code review bot(s) detected

CodeRabbit has detected other AI code review bot(s) in this pull request and will avoid duplicating their findings in the review comments. This may lead to a less comprehensive review.

Walkthrough

Introduces a queue-based chat sending flow in useChat, adds queued message types to models, and updates UI to display and manage queued messages. ChatInput and ChatTab wiring are adjusted to use the queue. Minor UI tweaks and prop/type changes accompany the new flow.
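
For orientation, a minimal sketch of the queued-message shape implied by the snippets quoted later in this thread; the field names are inferred from usages such as nextMessage.content, nextMessage.type, nextMessage.context, and message.id, not copied from the actual packages/models/src/chat/message/queue.ts definition.

// Sketch only: inferred shape, with placeholder types standing in for the real
// ChatType and MessageContext exports from the models package.
type ChatType = 'ask' | 'edit';
type MessageContext = Record<string, unknown>;

interface QueuedMessage {
    id: string;                // used by removeFromQueue(id)
    content: string;           // the prompt text
    type: ChatType;            // which chat flow the message belongs to
    context: MessageContext[]; // captured at enqueue time, refreshed before send
    timestamp: number;         // when the message was enqueued
}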

Changes

Cohort / File(s) Summary
Hook: queue-based messaging
apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx
Reworks send flow to a message queue with processing helpers; adds context-aware processing; integrates FinishReason; exports queuedMessages and removeFromQueue; introduces types (ProcessMessage, MessageContext, QueuedMessage).
Chat input wiring & UI
apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx, .../action-buttons.tsx
Replaces suggestions with queue UI; adjusts props to accept queuedMessages/removeFromQueue; integrates Stop/Send logic; layout tweaks; makes ActionButtons.disabled optional with default false.
Queue UI components (new)
.../chat-input/queue-items/index.tsx, .../chat-input/queue-items/queue-item.tsx
Adds QueueItems and QueuedMessageItem components to list and remove queued messages with collapsible UI and per-item delete.
Chat tab integration
.../chat-tab-content/index.tsx
Wires useChat’s queuedMessages/removeFromQueue into ChatInput; removes suggestions prop.
Models: message types & exports
packages/models/src/chat/message/index.ts, packages/models/src/chat/message/message.ts, packages/models/src/chat/message/queue.ts
Adds QueuedMessage type; switches re-exports to extensionless paths; changes ChatMetadata.finishReason to FinishReason; new export of queue module.
Minor UI adjustment
.../context-pills/input-context-pills.tsx
Adds px-1 pt-1 to container classes.
Misc cleanup
.../chat-tab/controls.tsx
Removes unused useChatContext import.

Sequence Diagram(s)

sequenceDiagram
  autonumber
  participant U as User
  participant CI as ChatInput
  participant HC as useChat (hook)
  participant Q as Queue (state)
  participant AI as AI Transport

  U->>CI: Type message / Press Send
  CI->>HC: processMessage(content, type)
  alt Hook idle
    HC->>AI: send(content, context)
    AI-->>HC: stream tokens / finishReason
    HC->>HC: onFinish -> processNextInQueue()
  else Hook busy
    HC->>Q: enqueue(QueuedMessage)
  end
  opt Queue drain
    HC->>Q: dequeue next
    HC->>AI: send(next.content, next.context)
    AI-->>HC: stream tokens / finishReason
    HC->>HC: onFinish -> processNextInQueue()
  end
sequenceDiagram
  participant CI as ChatInput
  participant QI as QueueItems
  participant HC as useChat

  CI->>QI: Render queuedMessages
  QI-->>CI: removeFromQueue(id) click
  CI->>HC: removeFromQueue(id)
  HC->>HC: Update queue state

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

Possibly related PRs

Poem

I lined up thoughts in a tidy queue,
Each hop, a message, crisp and new.
With whiskers twitching, I press “Send”—
One by one, they reach the end.
If streams must stop, I’ll calmly wait,
Then thump—next chat proceeds, first-rate!
🐇➡️💬

Pre-merge checks and finishing touches

❌ Failed checks (1 warning)
Description Check: ⚠️ Warning
Explanation: Although the PR description includes the repository’s section headings, each section remains filled with placeholder comments and lacks the actual content required by the template; the substantive summary is instead placed outside the structured “Description” area, and there are no linked issues, testing details, screenshots, or supplementary notes within their respective sections.
Resolution: Please populate each template section with the appropriate content by moving the summary of new features and refactors into the “Description” section, linking any related issue numbers under “Related Issues,” detailing verification steps under “Testing,” and adding screenshots or extra context in the “Screenshots” and “Additional Notes” sections to ensure full compliance with the repository’s PR guidelines.
✅ Passed checks (2 passed)
Title Check: ✅ Passed. The title “feat: allow queuing messages” succinctly captures the main change of adding a message queuing feature, clearly indicating a new capability rather than unrelated modifications, and is concise enough for teammates to understand the core purpose at a glance.
Docstring Coverage: ✅ Passed. No functions found in the changes. Docstring coverage check skipped.

📜 Recent review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between ced498c and 3fb5aac.

📒 Files selected for processing (3)
  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx (5 hunks)
  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/queue-items/index.tsx (1 hunks)
  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/queue-items/queue-item.tsx (1 hunks)

Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.


Comment @coderabbitai help to get the list of available commands and usage tips.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 6

🧹 Nitpick comments (2)
PENDING_QUEUE_PLAN.md (2)

276-297: Consider exposing clearQueue function.

Line 55 defines clearQueue(), but it's not included in the hook's return interface. If users need to clear all queued messages at once (e.g., "Clear All" button), this function should be exposed.

   // Queue management functions
   removeFromQueue,
+  clearQueue,
 };

Note: Phase 2 (line 308) mentions "no clear all button", so this may be intentional—but the function was still implemented, suggesting the plan might be incomplete.


299-316: Align implementation with phase descriptions.

Line 308 explicitly states "no clear all button," yet line 55 implements clearQueue() and the UI design could benefit from bulk operations. Consider either:

  1. Adding clear-all functionality to Phase 2 if it provides value, or
  2. Removing clearQueue() from the implementation to match the stated scope.
📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between b4e5c86 and 11e324e.

📒 Files selected for processing (3)
  • PENDING_QUEUE_PLAN.md (1 hunks)
  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/controls.tsx (0 hunks)
  • apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx (0 hunks)
💤 Files with no reviewable changes (2)
  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/controls.tsx
  • apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx
🧰 Additional context used
🪛 markdownlint-cli2 (0.18.1)
PENDING_QUEUE_PLAN.md

230-230: Fenced code blocks should have a language specified

(MD040, fenced-code-language)

🔇 Additional comments (8)
PENDING_QUEUE_PLAN.md (8)

3-8: Clear objectives.

The goals are well-articulated and user-focused, establishing a solid foundation for the feature design.


26-41: Solid state design.

Capturing MessageContext[] at queue time is the right approach—it preserves the state when the user queued the message, avoiding stale context issues later.


133-142: Appropriate UI dependencies.

The component imports from @onlook/ui are standard and sufficient for the proposed queue interface.


144-189: Well-designed queue UI component.

The expandable design with local expansion state is appropriate, and the use of existing UI primitives (Collapsible, Badge, Card) maintains consistency.


241-274: Clean integration design.

The prop additions are minimal and the queue box placement above the textarea provides clear visual hierarchy without disrupting existing UI flow.


317-336: Thorough testing strategy.

The three-tier test approach (unit, integration, UX) with specific edge cases (empty queue, concurrent operations, large queues) demonstrates solid planning for quality assurance.


121-130: Verify effect dependencies and delay rationale

  • Unable to locate the implementation of processNextInQueue. Please confirm whether it’s memoized (e.g., wrapped in useCallback) or add it to the effect’s dependency array to avoid stale closures.
  • Clarify or document why a 500 ms timeout was chosen (or make it configurable based on message complexity/context).

354-372: Performance target validation required. The <10 ms queue operation goal may not be achievable with the current O(n) filter implementation on 100-message queues—benchmark or profile on representative hardware to confirm.

Comment on lines 59 to 69
const processNextInQueue = async () => {
    if (queuedMessages.length === 0) return;

    const [nextMessage, ...remaining] = queuedMessages;
    setQueuedMessages(remaining);

    // Refresh context before sending - takes and returns MessageContext[]
    const refreshedContext: MessageContext[] = await editorEngine.chat.context.getRefreshedContext(nextMessage.context);

    await sendMessageImmediately(nextMessage.content, nextMessage.type, refreshedContext);
};

⚠️ Potential issue | 🔴 Critical

Critical: Risk of message loss and race conditions.

The processNextInQueue function has several issues:

  1. Message loss on failure: Line 63 removes the message from the queue before attempting to send it. If getRefreshedContext or sendMessageImmediately fails, the message is permanently lost.
  2. No error handling: No try/catch wrapping the async operations.
  3. Race condition: If processNextInQueue is called concurrently (e.g., user manually triggers while auto-process is running), the same message could be processed twice, or queue state could become inconsistent.
  4. No streaming guard: If streaming starts while processing, this could violate the queue's invariant.

Consider this safer implementation:

 const processNextInQueue = async () => {
   if (queuedMessages.length === 0) return;
+  if (isStreaming) return; // Don't process if streaming started
+  if (isProcessing) return; // Prevent concurrent processing
+  
+  setIsProcessing(true);
   
   const [nextMessage, ...remaining] = queuedMessages;
-  setQueuedMessages(remaining);
   
-  // Refresh context before sending - takes and returns MessageContext[]
-  const refreshedContext: MessageContext[] = await editorEngine.chat.context.getRefreshedContext(nextMessage.context);
-  
-  await sendMessageImmediately(nextMessage.content, nextMessage.type, refreshedContext);
+  try {
+    const refreshedContext: MessageContext[] = await editorEngine.chat.context.getRefreshedContext(nextMessage.context);
+    await sendMessageImmediately(nextMessage.content, nextMessage.type, refreshedContext);
+    // Only remove from queue after successful send
+    setQueuedMessages(remaining);
+  } catch (error) {
+    console.error('Failed to process queued message:', error);
+    // Optionally: add retry logic or move to failed queue
+  } finally {
+    setIsProcessing(false);
+  }
 };
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
Original:

const processNextInQueue = async () => {
    if (queuedMessages.length === 0) return;
    const [nextMessage, ...remaining] = queuedMessages;
    setQueuedMessages(remaining);
    // Refresh context before sending - takes and returns MessageContext[]
    const refreshedContext: MessageContext[] = await editorEngine.chat.context.getRefreshedContext(nextMessage.context);
    await sendMessageImmediately(nextMessage.content, nextMessage.type, refreshedContext);
};

Suggested:

const processNextInQueue = async () => {
    if (queuedMessages.length === 0) return;
    if (isStreaming) return; // Don't process if streaming started
    if (isProcessing) return; // Prevent concurrent processing
    setIsProcessing(true);
    const [nextMessage, ...remaining] = queuedMessages;
    try {
        const refreshedContext: MessageContext[] =
            await editorEngine.chat.context.getRefreshedContext(nextMessage.context);
        await sendMessageImmediately(nextMessage.content, nextMessage.type, refreshedContext);
        // Only remove from queue after successful send
        setQueuedMessages(remaining);
    } catch (error) {
        console.error('Failed to process queued message:', error);
        // Optionally: add retry logic or move to a 'failed' queue
    } finally {
        setIsProcessing(false);
    }
};
🤖 Prompt for AI Agents
In PENDING_QUEUE_PLAN.md around lines 59 to 69, the current processNextInQueue
removes the message from queuedMessages before performing async work which
causes message loss on failure, lacks error handling, allows concurrent runs to
race, and doesn't guard against streaming; fix it by performing dequeue
atomically (or peek) but only call setQueuedMessages to remove the message after
successful send, wrap getRefreshedContext/sendMessageImmediately in a try/catch
and on failure requeue or leave the message in place and surface/log the error,
add a processing flag or mutex to prevent concurrent invocations (return early
if already processing), and check/await any streaming guard before starting so
streaming cannot start while processing (or abort processing if streaming
begins).

Comment on lines 59 to 69
const processNextInQueue = async () => {
    if (queuedMessages.length === 0) return;

    const [nextMessage, ...remaining] = queuedMessages;
    setQueuedMessages(remaining);

    // Refresh context before sending - takes and returns MessageContext[]
    const refreshedContext: MessageContext[] = await editorEngine.chat.context.getRefreshedContext(nextMessage.context);

    await sendMessageImmediately(nextMessage.content, nextMessage.type, refreshedContext);
};
Contributor

The processNextInQueue function lacks error handling for the context refresh operation. If editorEngine.chat.context.getRefreshedContext() fails, the message will be removed from the queue but never sent, resulting in message loss. Consider adding a try/catch block to either:

  1. Return the message to the queue if context refresh fails:

try {
    const refreshedContext = await editorEngine.chat.context.getRefreshedContext(nextMessage.context);
    await sendMessageImmediately(nextMessage.content, nextMessage.type, refreshedContext);
} catch (error) {
    // Put message back in queue
    setQueuedMessages(prev => [nextMessage, ...prev]);
    console.error("Failed to process queued message:", error);
}

  2. Or fall back to sending without refreshed context:

try {
    const refreshedContext = await editorEngine.chat.context.getRefreshedContext(nextMessage.context);
    await sendMessageImmediately(nextMessage.content, nextMessage.type, refreshedContext);
} catch (error) {
    // Send with original context as fallback
    await sendMessageImmediately(nextMessage.content, nextMessage.type, nextMessage.context);
}
Suggested change
Original:

const processNextInQueue = async () => {
    if (queuedMessages.length === 0) return;
    const [nextMessage, ...remaining] = queuedMessages;
    setQueuedMessages(remaining);
    // Refresh context before sending - takes and returns MessageContext[]
    const refreshedContext: MessageContext[] = await editorEngine.chat.context.getRefreshedContext(nextMessage.context);
    await sendMessageImmediately(nextMessage.content, nextMessage.type, refreshedContext);
};

Suggested:

const processNextInQueue = async () => {
    if (queuedMessages.length === 0) return;
    const [nextMessage, ...remaining] = queuedMessages;
    setQueuedMessages(remaining);
    try {
        // Refresh context before sending - takes and returns MessageContext[]
        const refreshedContext: MessageContext[] = await editorEngine.chat.context.getRefreshedContext(nextMessage.context);
        await sendMessageImmediately(nextMessage.content, nextMessage.type, refreshedContext);
    } catch (error) {
        // Put message back in queue
        setQueuedMessages(prev => [nextMessage, ...prev]);
        console.error("Failed to process queued message:", error);
    }
};

Spotted by Diamond



@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 3

♻️ Duplicate comments (5)
PENDING_QUEUE_PLAN.md (5)

224-268: Add explicit prop types for QueueBox.

Type props to improve safety and DX. Ensure callers pass removeFromQueue.

-const QueueBox = ({ queuedMessages, removeFromQueue }) => {
+const QueueBox = ({
+  queuedMessages,
+  removeFromQueue,
+}: {
+  queuedMessages: QueuedMessage[];
+  removeFromQueue: (id: string) => void;
+}) => {

51-53: Performance claim vs. implementation mismatch.

removeFromQueue is O(n) via filter; doc claims O(1)/O(log n). Either change data structure (Map + order array) or update the claim.

Prose tweak (if keeping array):

  • “Queue removals are O(n) with the current array approach; acceptable for <=100 items. For O(1) removals, use Map<string, QueuedMessage> plus an ordered array.”

Also applies to: 425-428
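
If the O(1) claim is kept instead, here is a rough sketch of the Map-plus-ordered-array idea suggested above; the class and method names are illustrative, not code from the PR.

// Illustrative sketch of O(1) removal by id; not part of the PR.
class MessageQueue<T extends { id: string }> {
    private byId = new Map<string, T>();
    private order: string[] = []; // preserves FIFO order of enqueued ids

    enqueue(item: T): void {
        this.byId.set(item.id, item);
        this.order.push(item.id);
    }

    remove(id: string): void {
        this.byId.delete(id); // O(1); stale ids are skipped during dequeue
    }

    dequeue(): T | undefined {
        while (this.order.length > 0) {
            const id = this.order.shift()!;
            const item = this.byId.get(id);
            if (item) {
                this.byId.delete(id);
                return item;
            }
        }
        return undefined;
    }

    get size(): number {
        return this.byId.size;
    }
}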


39-41: Prevent message loss and concurrent processing in processNextInQueue.

Dequeue happens before async work; failures lose messages. No guard against concurrent runs or streaming. Add isProcessing flag, stream guard, try/catch, and only remove after success.

-// Simple state additions to useChat hook
-const [queuedMessages, setQueuedMessages] = useState<QueuedMessage[]>([]);
+// Simple state additions to useChat hook
+const [queuedMessages, setQueuedMessages] = useState<QueuedMessage[]>([]);
+const [isProcessing, setIsProcessing] = useState(false);
-const processNextInQueue = async () => {
-  if (queuedMessages.length === 0) return;
-  
-  const [nextMessage, ...remaining] = queuedMessages;
-  setQueuedMessages(remaining);
-  
-  // Refresh context before sending - takes and returns MessageContext[]
-  const refreshedContext: MessageContext[] = await editorEngine.chat.context.getRefreshedContext(nextMessage.context);
-  
-  await sendMessageImmediately(nextMessage.content, nextMessage.type, refreshedContext);
-};
+const processNextInQueue = async () => {
+  if (queuedMessages.length === 0) return;
+  if (isStreaming || isProcessing) return;
+  setIsProcessing(true);
+  const [nextMessage, ...remaining] = queuedMessages;
+  try {
+    const refreshedContext: MessageContext[] =
+      await editorEngine.chat.context.getRefreshedContext(nextMessage.context);
+    await sendMessageImmediately(nextMessage.content, nextMessage.type, refreshedContext);
+    setQueuedMessages(remaining); // remove only on success
+  } catch (error) {
+    console.error('Failed to process queued message:', error);
+    // optional: retry/backoff or move to failed queue
+  } finally {
+    setIsProcessing(false);
+  }
+};

Also applies to: 59-69


128-143: Remove the first sendMessage variant; it references undefined addToQueue.

It duplicates the complete version below and calls an undefined helper.

-const sendMessage: SendMessage = useCallback(
-  async (content: string, type: ChatType) => {
-    // If AI is streaming, add to queue instead
-    if (isStreaming) {
-      await addToQueue(content, type);
-      return getUserChatMessageFromString(content, [], conversationId); // Return placeholder
-    }
-    
-    // Send immediately if not streaming
-    return sendMessageImmediately(content, type);
-  },
-  [isStreaming, addToQueue]
-);

272-306: QueuedMessageItem uses removeFromQueue but doesn't declare it.

Add the prop and type it.

-const QueuedMessageItem = ({ message, index }: { message: QueuedMessage, index: number }) => {
+const QueuedMessageItem = ({
+  message,
+  index,
+  removeFromQueue,
+}: {
+  message: QueuedMessage;
+  index: number;
+  removeFromQueue: (id: string) => void;
+}) => {
🧹 Nitpick comments (4)
PENDING_QUEUE_PLAN.md (4)

201-208: Stabilize auto-processing effect dependencies.

processNextInQueue is used but not in deps; include it or memoize it to avoid stale closures.

-useEffect(() => {
+useEffect(() => {
   // When streaming ends, process the next message in queue
   if (!isStreaming && queuedMessages.length > 0) {
     const timer = setTimeout(processNextInQueue, 500); // Small delay
     return () => clearTimeout(timer);
   }
-}, [isStreaming, queuedMessages.length]);
+}, [isStreaming, queuedMessages.length, processNextInQueue]);

Optionally wrap processNextInQueue in useCallback with the minimal dependency set.
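
A hedged sketch of that memoization, reusing the safer body suggested earlier; it assumes the hook's state and helpers (queuedMessages, isStreaming, isProcessing, sendMessageImmediately, editorEngine) are in scope and that useCallback is imported from react.

// Sketch only: same guards and try/catch as the suggestion above, wrapped in
// useCallback so the auto-processing effect can list it as a dependency.
const processNextInQueue = useCallback(async () => {
    if (queuedMessages.length === 0 || isStreaming || isProcessing) return;
    setIsProcessing(true);
    const [nextMessage, ...remaining] = queuedMessages;
    try {
        const refreshedContext = await editorEngine.chat.context.getRefreshedContext(nextMessage.context);
        await sendMessageImmediately(nextMessage.content, nextMessage.type, refreshedContext);
        setQueuedMessages(remaining); // remove only after a successful send
    } catch (error) {
        console.error('Failed to process queued message:', error);
    } finally {
        setIsProcessing(false);
    }
}, [queuedMessages, isStreaming, isProcessing, sendMessageImmediately, editorEngine.chat.context]);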


31-37: Consider serializable timestamp type.

If persisting queue, prefer timestamp: number (epoch) to avoid Date/string inconsistencies; parse on read in UI.
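
A small illustration of the epoch-based approach, with formatting deferred to render time (variable names are assumptions):

const timestamp: number = Date.now();                   // stored on the queued message
const label = new Date(timestamp).toLocaleTimeString(); // formatted only when rendering the queue UI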


55-57: Unused clearQueue helper.

Plan says no "clear all" UI; either wire it into UX (with confirm) or drop it for now.


169-170: Clarify sendMessage return contract when enqueued.

Returning a placeholder chat message may mislead callers; consider returning a lightweight status (e.g., { queued: true, id }) or void and rely on QueueBox for display.
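
One possible shape for that lightweight status, purely illustrative; ChatMessage below is a placeholder for the real message type.

// Illustrative only: a discriminated union so callers can tell sent vs. enqueued.
type ChatMessage = unknown; // placeholder for the real chat message type
type SendResult =
    | { queued: false; message: ChatMessage } // sent immediately
    | { queued: true; id: string };           // enqueued; id works with removeFromQueue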

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 11e324e and 360c51c.

📒 Files selected for processing (1)
  • PENDING_QUEUE_PLAN.md (1 hunks)
🧰 Additional context used
🪛 markdownlint-cli2 (0.18.1)
PENDING_QUEUE_PLAN.md

308-308: Fenced code blocks should have a language specified

(MD040, fenced-code-language)

Comment on lines 114 to 120
void regenerate({
    body: {
        chatType,
        conversationId,
        agentType,
    },
});

⚠️ Potential issue | 🟠 Major

Include context in edit regenerate call for parity with send path.

sendMessageImmediately passes context; editMessageImmediately does not. Add context: updatedContext to regenerate body.

   void regenerate({
     body: {
       chatType,
       conversationId,
+      context: updatedContext,
       agentType,
     },
   });
📝 Committable suggestion


Suggested change
Original:

void regenerate({
    body: {
        chatType,
        conversationId,
        agentType,
    },
});

Suggested:

void regenerate({
    body: {
        chatType,
        conversationId,
        context: updatedContext,
        agentType,
    },
});
🤖 Prompt for AI Agents
In PENDING_QUEUE_PLAN.md around lines 114 to 120, the regenerate call's body
lacks the context field, causing editMessageImmediately to diverge from
sendMessageImmediately; update the regenerate body to include updatedContext
(e.g., body: { chatType, conversationId, agentType, updatedContext }) so the
edit flow passes the same context as the send path.

Comment on lines 158 to 167
if (isStreaming) {
    // AI is running - add to bottom of queue (normal queueing)
    setQueuedMessages(prev => [...prev, newMessage]);
} else if (queuedMessages.length > 0) {
    // AI is stopped but there are queued messages - add to top of queue (priority)
    setQueuedMessages(prev => [newMessage, ...prev]);
} else {
    // No queue and not streaming - send immediately
    return sendMessageImmediately(content, type);
}

⚠️ Potential issue | 🟠 Major

Fix immediate-send context and hook dependencies; clarify FIFO ordering.

  • Pass context into sendMessageImmediately.
  • Include conversationId in deps.
  • Consider FIFO for all queued cases to avoid reordering surprises.
-    } else if (queuedMessages.length > 0) {
-      // AI is stopped but there are queued messages - add to top of queue (priority)
-      setQueuedMessages(prev => [newMessage, ...prev]);
+    } else if (queuedMessages.length > 0) {
+      // Keep FIFO even when stopped to avoid reordering older items
+      setQueuedMessages(prev => [...prev, newMessage]);
     } else {
       // No queue and not streaming - send immediately
-      return sendMessageImmediately(content, type);
+      return sendMessageImmediately(content, type, context);
     }
     
     return getUserChatMessageFromString(content, [], conversationId);
   },
-  [isStreaming, queuedMessages.length]
+  [isStreaming, queuedMessages.length, conversationId]
 );

Also applies to: 169-172

🤖 Prompt for AI Agents
In PENDING_QUEUE_PLAN.md around lines 158-167 (and similarly 169-172), the
immediate-send call must receive the current context and the hook should include
conversationId in its dependency list; change sendMessageImmediately(content,
type) to sendMessageImmediately(content, type, context) (or the actual context
variable in scope), add conversationId to the useEffect/useCallback deps array,
and make queued behavior consistently FIFO by appending newMessage
(setQueuedMessages(prev => [...prev, newMessage])) for both streaming and
non-streaming queued branches instead of prepending in the non-streaming branch.

@vercel vercel bot temporarily deployed to Preview – docs October 1, 2025 00:39 Inactive
async (content: string, type: ChatType) => {
    // If AI is streaming, add to queue instead
    if (isStreaming) {
        await addToQueue(content, type);
Contributor

The addToQueue function is referenced in the plan but not implemented in the provided code. Consider adding the implementation for this function or replacing the call with the direct queue management code that adds messages to the queue state. This would ensure the queuing functionality works as expected when the PR is merged.

Suggested change
Original:

await addToQueue(content, type);

Suggested:

// Add the content directly to the queue state
queue.push({ content, type, timestamp: Date.now() });
setQueue([...queue]); // Update the queue state if using React state

Spotted by Diamond


@vercel vercel bot temporarily deployed to Preview – docs October 1, 2025 00:54 Inactive
@vercel vercel bot temporarily deployed to Preview – docs October 1, 2025 02:16 Inactive
<div className="flex items-center gap-3 group hover:bg-transparent">
    <div className="size-3 rounded-full border border-muted-foreground/50 flex-shrink-0 bg-transparent" />

    <p className="flex-1 min-w-0 text-small text-muted-foreground truncate">
Contributor

Using 'text-small' may be nonstandard. Consider using 'text-sm' (or ensure 'text-small' is a defined custom utility) for consistency with Tailwind conventions.

Suggested change
<p className="flex-1 min-w-0 text-small text-muted-foreground truncate">
<p className="flex-1 min-w-0 text-sm text-muted-foreground truncate">


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 3

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx (1)

383-410: Fix Send/Stop button logic.

The condition isStreaming && inputEmpty (line 383) causes the Stop button to disappear if the user types while AI is streaming. Users cannot stop the stream once they've entered text, which is confusing and frustrating.

The Stop button should be shown whenever streaming, regardless of input state:

-                    {isStreaming && inputEmpty ? (
+                    {isStreaming ? (
                         <Tooltip open={actionTooltipOpen} onOpenChange={setActionTooltipOpen}>
                             <TooltipTrigger asChild>
                                 <Button
♻️ Duplicate comments (2)
apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/queue-box.tsx (1)

46-46: Duplicate: Hardcoded user-facing text flagged in previous review.

The string "Remove from queue" at line 46 (and "in queue" at line 78) should be externalized using next-intl messages/hooks for proper internationalization.

As per coding guidelines.

apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx (1)

108-136: Fix inconsistent context in return value.

Line 133 returns a placeholder message with an empty context array [], but the captured context from line 112 should be used instead. This inconsistency means the returned message doesn't match what was actually queued.

Apply this diff:

-        return getUserChatMessageFromString(content, [], conversationId);
+        return getUserChatMessageFromString(content, context, conversationId);
🧹 Nitpick comments (2)
apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/queue-box.tsx (1)

25-51: Remove unused index prop.

The index parameter is declared (line 21) and passed (line 89) but never used in the component body. This clutters the interface.

Apply this diff to remove the unused prop:

 interface QueuedMessageItemProps {
     message: QueuedMessage;
-    index: number;
     removeFromQueue: (id: string) => void;
 }

-const QueuedMessageItem = ({ message, index, removeFromQueue }: QueuedMessageItemProps) => {
+const QueuedMessageItem = ({ message, removeFromQueue }: QueuedMessageItemProps) => {

And update the usage at line 86-90:

                     {messages.map((message, index) => (
                         <QueuedMessageItem
                             key={message.id}
                             message={message}
-                            index={index}
                             removeFromQueue={removeFromQueue}
                         />
                     ))}
apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx (1)

186-205: Consider surfacing queue processing errors to users.

The implementation correctly prevents message loss by keeping failed messages in the queue (line 199 only removes after success). However, errors are only logged to the console (line 201). Users won't know a queued message failed to send.

Consider showing a toast notification on failure so users can take action (e.g., retry manually or check the queue):

         } catch (error) {
             console.error('Failed to process queued message:', error);
+            toast.error('Failed to send queued message. It will remain in the queue.');
         } finally {

You'll need to import toast from @onlook/ui/sonner at the top of the file.

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 991e0d4 and 780163f.

📒 Files selected for processing (5)
  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx (5 hunks)
  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-tab-content/index.tsx (2 hunks)
  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/queue-box.tsx (1 hunks)
  • apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx (7 hunks)
  • packages/models/src/chat/message/message.ts (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (2)
  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-tab-content/index.tsx
  • packages/models/src/chat/message/message.ts
🧰 Additional context used
📓 Path-based instructions (6)
apps/web/client/src/app/**/*.tsx

📄 CodeRabbit inference engine (AGENTS.md)

apps/web/client/src/app/**/*.tsx: Default to Server Components; add 'use client' when using events, state/effects, browser APIs, or client‑only libraries
Do not use process.env in client code; import env from @/env instead

Avoid hardcoded user-facing text; use next-intl messages/hooks

Files:

  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx
  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/queue-box.tsx
  • apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx
apps/web/client/src/**/*.{ts,tsx}

📄 CodeRabbit inference engine (AGENTS.md)

apps/web/client/src/**/*.{ts,tsx}: Use path aliases @/* and ~/* for imports that map to apps/web/client/src/*
Avoid hardcoded user-facing text; use next-intl messages/hooks instead

Use path aliases @/* and ~/* for imports mapping to src/*

Files:

  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx
  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/queue-box.tsx
  • apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx
apps/web/client/src/**/*.tsx

📄 CodeRabbit inference engine (AGENTS.md)

apps/web/client/src/**/*.tsx: Create MobX store instances with useState(() => new Store()) for stable references across renders
Keep the active MobX store in a useRef and perform async cleanup with setTimeout(() => storeRef.current?.clear(), 0) to avoid route-change races
Avoid useMemo for creating MobX store instances
Avoid putting the MobX store instance in effect dependency arrays if it causes loops; split concerns by domain

apps/web/client/src/**/*.tsx: Create MobX store instances with useState(() => new Store()) for stable identities across renders
Keep the active MobX store in a useRef and clean up asynchronously with setTimeout(() => storeRef.current?.clear(), 0)
Do not use useMemo to create MobX stores
Avoid placing MobX store instances in effect dependency arrays if it causes loops; split concerns instead
observer components must be client components; place a single client boundary at the feature entry; child observers need not repeat 'use client'

Files:

  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx
  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/queue-box.tsx
  • apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx
**/*.{ts,tsx}

📄 CodeRabbit inference engine (AGENTS.md)

Do not use the any type unless necessary

Files:

  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx
  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/queue-box.tsx
  • apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx
apps/web/client/src/app/**/*.{ts,tsx}

📄 CodeRabbit inference engine (CLAUDE.md)

Default to Server Components; add 'use client' only when using events, state/effects, browser APIs, or client-only libs

Files:

  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx
  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/queue-box.tsx
  • apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx
{apps,packages}/**/*.{ts,tsx}

📄 CodeRabbit inference engine (CLAUDE.md)

Avoid using the any type unless absolutely necessary

Files:

  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx
  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/queue-box.tsx
  • apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx
🧠 Learnings (2)
📚 Learning: 2025-09-14T01:44:21.209Z
Learnt from: CR
PR: onlook-dev/onlook#0
File: AGENTS.md:0-0
Timestamp: 2025-09-14T01:44:21.209Z
Learning: Applies to apps/web/client/src/**/*.{ts,tsx} : Avoid hardcoded user-facing text; use next-intl messages/hooks instead

Applied to files:

  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/queue-box.tsx
📚 Learning: 2025-09-16T19:22:52.461Z
Learnt from: CR
PR: onlook-dev/onlook#0
File: CLAUDE.md:0-0
Timestamp: 2025-09-16T19:22:52.461Z
Learning: Applies to apps/web/client/src/app/**/*.tsx : Avoid hardcoded user-facing text; use next-intl messages/hooks

Applied to files:

  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/queue-box.tsx
🧬 Code graph analysis (3)
apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx (4)
packages/models/src/chat/message/message.ts (1)
  • ChatMessage (19-19)
apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx (1)
  • SendMessage (17-17)
packages/models/src/chat/message/queue.ts (1)
  • QueuedMessage (4-10)
apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/queue-box.tsx (1)
  • QueueBox (58-97)
apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/queue-box.tsx (5)
packages/models/src/chat/message/queue.ts (1)
  • QueuedMessage (4-10)
packages/ui/src/components/tooltip.tsx (3)
  • Tooltip (72-72)
  • TooltipTrigger (72-72)
  • TooltipContent (72-72)
packages/ui/src/components/button.tsx (1)
  • Button (57-57)
packages/ui/src/components/icons/index.tsx (1)
  • Icons (138-3667)
packages/ui/src/components/collapsible.tsx (3)
  • Collapsible (19-19)
  • CollapsibleTrigger (19-19)
  • CollapsibleContent (19-19)
apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx (4)
packages/models/src/chat/message/queue.ts (1)
  • QueuedMessage (4-10)
packages/models/src/chat/message/context.ts (1)
  • MessageContext (53-59)
apps/web/client/src/app/project/[id]/_hooks/use-chat/utils.ts (1)
  • getUserChatMessageFromString (18-35)
apps/web/client/src/components/store/editor/chat/index.ts (1)
  • sendMessage (43-49)
🔇 Additional comments (7)
apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx (4)

82-106: LGTM: Context-aware message processing.

The optional context parameter correctly supports both immediate sends (with fresh context) and queued processing (with refreshed context), addressing the stale context concern from previous reviews.


138-180: LGTM: Edit flow with context refresh.

Correctly refreshes context from the previous message and maintains message metadata consistency.


182-184: LGTM: Clean queue removal.

Simple and correct filter-based removal.


226-287: LGTM: Finish effect with queue draining.

Correctly triggers queue processing after successful completion (finishReason === 'stop'), with a reasonable 500ms delay to allow UI updates.

apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/chat-input/index.tsx (3)

29-36: LGTM: Props updated for queue integration.

The interface correctly adds queuedMessages and removeFromQueue to support the new queue feature.


333-336: LGTM: QueueBox correctly integrated.

The component is properly positioned and receives the required props.


86-136: Keep suggestion handling—it's still in use.

The Suggestions component and SuggestionsRef are defined in suggestions.tsx and referenced by chat-input/index.tsx (e.g. lines 87–94, 119–131), so the suggestionRef and its keyboard handlers are required.

Likely an incorrect or invalid review comment.

Comment on lines +207 to +224
const editMessage: EditMessage = useCallback(
    async (messageId: string, newContent: string, chatType: ChatType) => {
        posthog.capture('user_edit_message', { type: ChatType.EDIT });

        if (isStreaming) {
            // Stop current streaming immediately
            stop();

            // Process edit with immediate priority (higher than queue)
            const context = await editorEngine.chat.context.getContextByChatType(chatType);
            return await processMessageEdit(messageId, newContent, chatType);
        }

        // Normal edit processing when not streaming
        return processMessageEdit(messageId, newContent, chatType);
    },
    [processMessageEdit, posthog, isStreaming, stop, editorEngine.chat.context],
);

🛠️ Refactor suggestion | 🟠 Major

Remove redundant context fetch during streaming edits.

Line 216 fetches context but never uses it—processMessageEdit (line 217) refreshes context internally at line 151. This causes unnecessary async work during the edit priority path.

Remove the unused context fetch:

             if (isStreaming) {
                 // Stop current streaming immediately
                 stop();

-                // Process edit with immediate priority (higher than queue)
-                const context = await editorEngine.chat.context.getContextByChatType(chatType);
                 return await processMessageEdit(messageId, newContent, chatType);
             }
📝 Committable suggestion


Suggested change
Original:

const editMessage: EditMessage = useCallback(
    async (messageId: string, newContent: string, chatType: ChatType) => {
        posthog.capture('user_edit_message', { type: ChatType.EDIT });
        if (isStreaming) {
            // Stop current streaming immediately
            stop();
            // Process edit with immediate priority (higher than queue)
            const context = await editorEngine.chat.context.getContextByChatType(chatType);
            return await processMessageEdit(messageId, newContent, chatType);
        }
        // Normal edit processing when not streaming
        return processMessageEdit(messageId, newContent, chatType);
    },
    [processMessageEdit, posthog, isStreaming, stop, editorEngine.chat.context],
);

Suggested:

const editMessage: EditMessage = useCallback(
    async (messageId: string, newContent: string, chatType: ChatType) => {
        posthog.capture('user_edit_message', { type: ChatType.EDIT });
        if (isStreaming) {
            // Stop current streaming immediately
            stop();
            return await processMessageEdit(messageId, newContent, chatType);
        }
        // Normal edit processing when not streaming
        return processMessageEdit(messageId, newContent, chatType);
    },
    [processMessageEdit, posthog, isStreaming, stop, editorEngine.chat.context],
);
🤖 Prompt for AI Agents
In apps/web/client/src/app/project/[id]/_hooks/use-chat/index.tsx around lines
207 to 224, remove the unused await
editorEngine.chat.context.getContextByChatType(chatType) call inside the
isStreaming branch (it's redundant because processMessageEdit refreshes context
internally), and update the useCallback dependency array to remove
editorEngine.chat.context so it no longer references that unused property; keep
processMessageEdit, posthog, isStreaming, and stop as dependencies.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

♻️ Duplicate comments (2)
apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/queue-box.tsx (2)

38-40: Avoid hardcoded user-facing text; use next-intl instead.

The tooltip contains the hardcoded string "Remove from queue". This should be externalized using next-intl messages/hooks for proper internationalization support.

As per coding guidelines.


66-68: Avoid hardcoded user-facing text; use next-intl instead.

The text "chats in queue" is hardcoded. This should be externalized using next-intl messages/hooks with proper pluralization support for internationalization.

As per coding guidelines.

Example implementation:

const t = useTranslations('chat');
// ...
<span className="text-xs">
    {t('queuedChats', { count: messages.length })}
</span>

With a message entry like:

{
  "queuedChats": "{count, plural, one {# chat in queue} other {# chats in queue}}"
}
🧹 Nitpick comments (3)
apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/queue-box.tsx (3)

10-14: Remove unused index parameter.

The index prop is destructured in QueuedMessageItem (line 16) but never used in the component implementation.

Apply this diff to remove the unused parameter:

 interface QueuedMessageItemProps {
     message: QueuedMessage;
-    index: number;
     removeFromQueue: (id: string) => void;
 }

20-20: Consider using standard Tailwind class text-sm instead of text-small.

The class text-small may be a custom utility. For consistency with Tailwind conventions, consider using the standard text-sm class unless text-small is intentionally defined in your design system.

Apply this diff if text-small is not a custom utility:

-            <span className="text-small truncate w-full text-left text-muted-foreground group-hover:text-foreground mr-2">
+            <span className="text-sm truncate w-full text-left text-muted-foreground group-hover:text-foreground mr-2">

74-80: Remove unused index prop being passed to QueuedMessageItem.

The index prop is passed to QueuedMessageItem but is never used within that component (as noted in an earlier comment).

Apply this diff to remove the unused prop:

                     {messages.map((message, index) => (
                         <QueuedMessageItem
                             key={message.id}
                             message={message}
-                            index={index}
                             removeFromQueue={removeFromQueue}
                         />
                     ))}
📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 79426e2 and ced498c.

📒 Files selected for processing (1)
  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/queue-box.tsx (1 hunks)
🧰 Additional context used
📓 Path-based instructions (6)
apps/web/client/src/app/**/*.tsx

📄 CodeRabbit inference engine (AGENTS.md)

apps/web/client/src/app/**/*.tsx: Default to Server Components; add 'use client' when using events, state/effects, browser APIs, or client‑only libraries
Do not use process.env in client code; import env from @/env instead

Avoid hardcoded user-facing text; use next-intl messages/hooks

Files:

  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/queue-box.tsx
apps/web/client/src/**/*.{ts,tsx}

📄 CodeRabbit inference engine (AGENTS.md)

apps/web/client/src/**/*.{ts,tsx}: Use path aliases @/* and ~/* for imports that map to apps/web/client/src/*
Avoid hardcoded user-facing text; use next-intl messages/hooks instead

Use path aliases @/* and ~/* for imports mapping to src/*

Files:

  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/queue-box.tsx
apps/web/client/src/**/*.tsx

📄 CodeRabbit inference engine (AGENTS.md)

apps/web/client/src/**/*.tsx: Create MobX store instances with useState(() => new Store()) for stable references across renders
Keep the active MobX store in a useRef and perform async cleanup with setTimeout(() => storeRef.current?.clear(), 0) to avoid route-change races
Avoid useMemo for creating MobX store instances
Avoid putting the MobX store instance in effect dependency arrays if it causes loops; split concerns by domain

apps/web/client/src/**/*.tsx: Create MobX store instances with useState(() => new Store()) for stable identities across renders
Keep the active MobX store in a useRef and clean up asynchronously with setTimeout(() => storeRef.current?.clear(), 0)
Do not use useMemo to create MobX stores
Avoid placing MobX store instances in effect dependency arrays if it causes loops; split concerns instead
observer components must be client components; place a single client boundary at the feature entry; child observers need not repeat 'use client'

Files:

  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/queue-box.tsx
**/*.{ts,tsx}

📄 CodeRabbit inference engine (AGENTS.md)

Do not use the any type unless necessary

Files:

  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/queue-box.tsx
apps/web/client/src/app/**/*.{ts,tsx}

📄 CodeRabbit inference engine (CLAUDE.md)

Default to Server Components; add 'use client' only when using events, state/effects, browser APIs, or client-only libs

Files:

  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/queue-box.tsx
{apps,packages}/**/*.{ts,tsx}

📄 CodeRabbit inference engine (CLAUDE.md)

Avoid using the any type unless absolutely necessary

Files:

  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/queue-box.tsx
🧠 Learnings (2)
📚 Learning: 2025-09-14T01:44:21.209Z
Learnt from: CR
PR: onlook-dev/onlook#0
File: AGENTS.md:0-0
Timestamp: 2025-09-14T01:44:21.209Z
Learning: Applies to apps/web/client/src/**/*.{ts,tsx} : Avoid hardcoded user-facing text; use next-intl messages/hooks instead

Applied to files:

  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/queue-box.tsx
📚 Learning: 2025-09-16T19:22:52.461Z
Learnt from: CR
PR: onlook-dev/onlook#0
File: CLAUDE.md:0-0
Timestamp: 2025-09-16T19:22:52.461Z
Learning: Applies to apps/web/client/src/app/**/*.tsx : Avoid hardcoded user-facing text; use next-intl messages/hooks

Applied to files:

  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/queue-box.tsx
🧬 Code graph analysis (1)
apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/queue-box.tsx (4)
packages/models/src/chat/message/queue.ts (1)
  • QueuedMessage (4-10)
packages/ui/src/components/tooltip.tsx (3)
  • Tooltip (72-72)
  • TooltipTrigger (72-72)
  • TooltipContent (72-72)
packages/ui/src/components/button.tsx (1)
  • Button (57-57)
packages/ui/src/components/collapsible.tsx (3)
  • Collapsible (19-19)
  • CollapsibleTrigger (19-19)
  • CollapsibleContent (19-19)
🔇 Additional comments (3)
apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/queue-box.tsx (3)

1-8: LGTM!

The imports and 'use client' directive are correctly structured. The directive is appropriate since the component uses React state, and path aliases follow the coding guidelines.


47-50: LGTM!

The interface is well-typed and correctly defines the component's props with appropriate types from the models layer.


52-86: Good implementation of the queue display logic.

The component correctly uses the queuedMessages prop (avoiding the stub data issue from earlier iterations), implements proper early return for empty queues, and uses appropriate keys in the map function.

Comment on lines 25 to 35
<Button
    variant="ghost"
    size="icon"
    className="text-muted-foreground hover:text-foreground absolute right-0 px-2.5 py-2 top-1/2 -translate-y-1/2 w-fit h-fit opacity-0 group-hover:opacity-100 !bg-background-onlook hover:!bg-background-onlook z-10 transition-none cursor-pointer"
    onClick={(e) => {
        e.stopPropagation();
        removeFromQueue(message.id);
    }}
>
    <Icons.Trash className="w-4 h-4" />
</Button>

⚠️ Potential issue | 🟠 Major

Ensure keyboard accessibility for the remove button.

The remove button uses opacity-0 group-hover:opacity-100, which hides it from view until hover. This creates an accessibility barrier for keyboard-only users who cannot hover. The button should be visible or become visible when focused.

Apply this diff to ensure the button is visible on focus:

                     <Button
                         variant="ghost"
                         size="icon"
-                        className="text-muted-foreground hover:text-foreground absolute right-0 px-2.5 py-2 top-1/2 -translate-y-1/2 w-fit h-fit opacity-0 group-hover:opacity-100 !bg-background-onlook hover:!bg-background-onlook z-10 transition-none cursor-pointer"
+                        className="text-muted-foreground hover:text-foreground absolute right-0 px-2.5 py-2 top-1/2 -translate-y-1/2 w-fit h-fit opacity-0 group-hover:opacity-100 focus-visible:opacity-100 !bg-background-onlook hover:!bg-background-onlook z-10 transition-none cursor-pointer"
                         onClick={(e) => {
                             e.stopPropagation();
                             removeFromQueue(message.id);
                         }}
                     >
📝 Committable suggestion


Suggested change
Original:

<Button
    variant="ghost"
    size="icon"
    className="text-muted-foreground hover:text-foreground absolute right-0 px-2.5 py-2 top-1/2 -translate-y-1/2 w-fit h-fit opacity-0 group-hover:opacity-100 !bg-background-onlook hover:!bg-background-onlook z-10 transition-none cursor-pointer"
    onClick={(e) => {
        e.stopPropagation();
        removeFromQueue(message.id);
    }}
>
    <Icons.Trash className="w-4 h-4" />
</Button>

Suggested:

<Button
    variant="ghost"
    size="icon"
    className="text-muted-foreground hover:text-foreground absolute right-0 px-2.5 py-2 top-1/2 -translate-y-1/2 w-fit h-fit opacity-0 group-hover:opacity-100 focus-visible:opacity-100 !bg-background-onlook hover:!bg-background-onlook z-10 transition-none cursor-pointer"
    onClick={(e) => {
        e.stopPropagation();
        removeFromQueue(message.id);
    }}
>
    <Icons.Trash className="w-4 h-4" />
</Button>
🤖 Prompt for AI Agents
In
apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/queue-box.tsx
around lines 25 to 35, the remove Button is hidden with opacity-0 until hover
which prevents keyboard-only users from accessing it; update the button's
className to also reveal it on focus (e.g., add focus:opacity-100 and
focus-visible:opacity-100) and ensure an accessible focus indicator (like
focus:outline or focus:ring classes) is present so keyboard users can see and
activate the button while keeping the existing onClick behavior that stops
propagation.

@vercel vercel bot temporarily deployed to Preview – docs October 1, 2025 06:50 Inactive
@Kitenite Kitenite merged commit 55f1935 into main Oct 1, 2025
6 of 8 checks passed
@Kitenite Kitenite deleted the feat/message-queue branch October 1, 2025 06:52

export const QueuedMessageItem = ({ message, removeFromQueue }: {
    message: QueuedMessage;
    index: number;
Contributor

The 'index' prop is declared but not used in QueuedMessageItem; consider removing it if unnecessary.

Suggested change
index: number;

