Conversation
* fix(sharing): fixed folders not appearing when sharing workflows * cleanup * fixed error case
… throughout (#620) * use cascade deletion * fix lint --------- Co-authored-by: Vikhyath Mondreti <vikhyathmondreti@vikhyaths-air.lan>
* chore: use t3-env as source of truth * chore: update mock env for failing tests
* feat(logs): enhanced logging system with cleanup and theme fixes - Implement enhanced logging cleanup with S3 archival and retention policies - Fix error propagation in trace spans for manual executions - Add theme-aware styling for frozen canvas modal - Integrate enhanced logging system across all execution pathways - Add comprehensive trace span processing and iteration navigation - Fix boolean parameter types in enhanced logs API * add warning for old logs * fix lint * added cost for streaming outputs * fix overflow issue * fix lint * fix selection on closing sidebar * tooltips z index increase --------- Co-authored-by: Vikhyath Mondreti <vikhyathmondreti@vikhyaths-air.lan> Co-authored-by: Waleed Latif <walif6@gmail.com>
…igrated logs (#624) * fix(frozen canvas): don't error if workflow state not available for old logs * fix lint --------- Co-authored-by: Vikhyath Mondreti <vikhyathmondreti@vikhyaths-air.lan>
* fix(reddit): change tool to use oauth token * fix lint * add contact info * Update apps/sim/tools/reddit/get_comments.ts Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com> * Update apps/sim/tools/reddit/hot_posts.ts Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com> * Update apps/sim/tools/reddit/get_posts.ts Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com> * fix type error --------- Co-authored-by: Vikhyath Mondreti <vikhyathmondreti@Vikhyaths-MacBook-Air.local> Co-authored-by: Vikhyath Mondreti <vikhyathmondreti@Vikhyaths-Air.attlocal.net> Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
* added tool re-ordering in agent block * styling
…nused local storage code (#628) * fix(oauth): fixed oauth state not persisting in credential selector * remove unused local storage code for oauth * fix lint * selector clearance issue fix * fix typing issue * fix lint * remove cred id from logs * fix lint * works --------- Co-authored-by: Vikhyath Mondreti <vikhyathmondreti@Vikhyaths-Air.attlocal.net>
* fix: memory deletion * fix: bun run lint --------- Co-authored-by: Adam Gough <adamgough@Adams-MacBook-Pro.local>
* added turbopack to prod builds * block access to sourcemaps * revert changes to docs
* fix response format non-json input crash bug * fix lint --------- Co-authored-by: Vikhyath Mondreti <vikhyathmondreti@Vikhyaths-Air.attlocal.net>
…sing separate endpoint (#633) * fix(revert-deployed): revert deployed functionality with separate endpoint * fix lint --------- Co-authored-by: Vikhyath Mondreti <vikhyathmondreti@Vikhyaths-Air.attlocal.net>
…#634) * fix(dropdown): simplify & fix tag dropdown for parallel & loop blocks * fixed build
…, and chat client (#637) * add response format structure to tag dropdown * handle response format outputs for chat client and chat panel, implemented the response format handling for streamed responses * cleanup
#638) * keep warning until refresh * works * fix sockets server sync on reconnection * infinite reconn attempts * fix lint --------- Co-authored-by: Vikhyath Mondreti <vikhyathmondreti@Vikhyaths-Air.attlocal.net>
…anges, add read-only offline mode (#641) * force user to refresh on disconnect in order to make changes, add read-only offline mode * remove unused hook * style * update tooltip msg * remove unnecessary useMemo around log
…ading socket server, fixed persistence issue during streaming back from LLM response format, removed unused events (#642) * fix(sockets): added debouncing for sub-block values to prevent overloading socket server, fixed persistence issue during streaming back from LLM response format, removed unused events * reuse existing isStreaming state for code block llm-generated response format
* fix: chunk search bar fix * fix: fixed reload and refresh * fix: fixed structure * fix: need to fix persisting in knowledge search * fix: adding page as query param * fix: bun run lint (#557) * added instantaneous client-side search, added fuzzy search & text highlighting --------- Co-authored-by: Adam Gough <adamgough@Mac.attlocal.net> Co-authored-by: Waleed Latif <walif6@gmail.com>
Co-authored-by: Vikhyath Mondreti <vikhyathmondreti@Vikhyaths-Air.attlocal.net>
PR Summary
Comprehensive updates to enhance offline mode handling, credential management, and response format processing across the application.
- Major rework of the `useSubBlockValue` hook in `apps/sim/app/workspace/[workspaceId]/w/[workflowId]/components/workflow-block/components/sub-block/hooks/use-sub-block-value.ts`, adding intelligent debouncing and streaming support for better collaborative performance
- Removed OAuth storage functionality from credential selectors, centralizing the auth flow in `OAuthRequiredModal` for improved security and maintainability
- Added enhanced offline mode detection in `WorkspacePermissionsProvider` with a 6-second connection timeout and user-friendly tooltips
- Introduced safe response format parsing with the new `StreamingResponseFormatProcessor` in `apps/sim/executor/utils.ts` for robust handling of streaming JSON responses
- Added Reddit OAuth integration with proper scopes and token management across multiple files (`tools/reddit/*.ts`, `lib/auth.ts`, `lib/oauth/oauth.ts`)
78 files reviewed, 49 comments
```typescript
content?: string
output?: Partial<NormalizedBlockOutput>
replaceOutput?: NormalizedBlockOutput // New field for complete replacement
```
logic: Consider the race condition between partial 'output' and 'replaceOutput' fields. Add validation to ensure only one is used at a time.
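The mutual-exclusion check the reviewer asks for could look like the sketch below. The `BlockLogUpdate` interface name and the `validateUpdate` helper are hypothetical; only the three field names come from the diff.

```typescript
// Hypothetical names; only content/output/replaceOutput appear in the PR diff.
interface BlockLogUpdate {
  content?: string
  output?: Record<string, unknown> // partial output, merged into existing state
  replaceOutput?: Record<string, unknown> // complete replacement of the output
}

// Reject updates carrying both a partial and a replacement output, so the
// merge semantics stay unambiguous.
function validateUpdate(update: BlockLogUpdate): void {
  if (update.output !== undefined && update.replaceOutput !== undefined) {
    throw new Error('Provide either "output" or "replaceOutput", not both')
  }
}
```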
```typescript
const getTooltipMessage = (defaultMessage: string) => {
  if (disabled) {
    return userPermissions.isOfflineMode ? 'Connection lost - please refresh' : 'Read-only mode'
  }
  return defaultMessage
}
```
style: Consider using an enum or constant for the different states/messages to avoid string literals and ensure consistency
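One way to centralize the strings, as suggested; the constant name and the extra parameters are illustrative, not from the codebase (the real hook reads `disabled` and `userPermissions` from scope).

```typescript
// Illustrative constants object; the real code reads these flags from scope.
const TOOLTIP_MESSAGES = {
  offline: 'Connection lost - please refresh',
  readOnly: 'Read-only mode',
} as const

const getTooltipMessage = (
  defaultMessage: string,
  disabled: boolean,
  isOfflineMode: boolean
): string => {
  if (disabled) {
    return isOfflineMode ? TOOLTIP_MESSAGES.offline : TOOLTIP_MESSAGES.readOnly
  }
  return defaultMessage
}
```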
```typescript
export interface ResponseFormatStreamProcessor {
  processStream(
    originalStream: ReadableStream,
    blockId: string,
    selectedOutputIds: string[],
    responseFormat?: any
  ): ReadableStream
}
```
style: Consider documenting the parameters of processStream method, especially what responseFormat can contain
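A documentation sketch for the interface above. The parameter descriptions are inferred from how the processor appears to be used in this PR and should be checked against the actual schema.

```typescript
// Documentation sketch; parameter descriptions are inferred, not confirmed.
export interface ResponseFormatStreamProcessor {
  /**
   * Wraps the model's raw output stream and re-emits only the values of the
   * response-format fields the user selected.
   *
   * @param originalStream    raw LLM output stream (chunks may be partial JSON)
   * @param blockId           id of the block whose outputs are being streamed
   * @param selectedOutputIds output ids chosen in the UI for this block
   * @param responseFormat    optional JSON-schema-like object describing the
   *                          structured response; undefined for plain text
   */
  processStream(
    originalStream: ReadableStream,
    blockId: string,
    selectedOutputIds: string[],
    responseFormat?: unknown
  ): ReadableStream
}
```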
```typescript
import { useCallback } from 'react'
import { Tooltip, TooltipContent, TooltipTrigger } from '@/components/ui/tooltip'
import { cn } from '@/lib/utils'
import { useUserPermissionsContext } from '@/app/workspace/[workspaceId]/w/components/providers/workspace-permissions-provider'
```
style: Consider using a more specific path alias (e.g. @/providers) instead of the deeply nested import path
```typescript
try {
  // Refresh the token if needed
  const { accessToken } = await refreshTokenIfNeeded(requestId, credential, credentialId)
```
style: Add a success log here to complete the request lifecycle logging
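Illustrative only: `refreshTokenIfNeeded`, `logger`, and the wrapping function below are stand-ins for the app's real helpers, shown to mark where the success log would go.

```typescript
// Stand-in logger and refresh helper; the real implementations live in the app.
const logger = { info: (msg: string) => console.log(msg) }

async function refreshTokenIfNeeded(
  requestId: string,
  credential: object,
  credentialId: string
): Promise<{ accessToken: string }> {
  return { accessToken: 'fresh-token' } // stub for the real refresh logic
}

async function getAccessToken(requestId: string, credential: object, credentialId: string) {
  const { accessToken } = await refreshTokenIfNeeded(requestId, credential, credentialId)
  // The success log the reviewer asks for, closing the request lifecycle
  logger.info(`[${requestId}] Token refresh succeeded for credential ${credentialId}`)
  return accessToken
}
```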
```typescript
if (!blockConfig) {
  if (connection.type === 'loop') {
    Icon = RepeatIcon as typeof Icon
    bgColor = '#2FB3FF' // Blue color for loop blocks
  } else if (connection.type === 'parallel') {
    Icon = SplitIcon as typeof Icon
    bgColor = '#FEE12B' // Yellow color for parallel blocks
  }
}
```
style: This large conditional block handling multiple connection types should be extracted into a separate function
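A sketch of that extraction. The string icon names stand in for the imported `RepeatIcon`/`SplitIcon` components; the colors come from the diff above, and the function name is illustrative.

```typescript
// Icon names are string stand-ins for the real React icon components.
type ConnectionVisual = { icon: string; bgColor: string }

function getConnectionVisual(connectionType: string): ConnectionVisual | null {
  switch (connectionType) {
    case 'loop':
      return { icon: 'RepeatIcon', bgColor: '#2FB3FF' } // blue for loop blocks
    case 'parallel':
      return { icon: 'SplitIcon', bgColor: '#FEE12B' } // yellow for parallel blocks
    default:
      return null // caller falls back to blockConfig-driven styling
  }
}
```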
```typescript
const refreshChunksData = async (options?: {
  search?: string
  limit?: number
  offset?: number
  preservePage?: boolean
}) => {
```
style: Extract the inline options shape into a named TypeScript interface for reuse and clearer types
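Naming the options shape could look like this; the interface name and the default values are illustrative, and the body is elided in the diff.

```typescript
// Illustrative interface name; fields mirror the inline type in the diff.
interface RefreshChunksOptions {
  search?: string
  limit?: number
  offset?: number
  preservePage?: boolean
}

const refreshChunksData = async (options: RefreshChunksOptions = {}) => {
  // Defaults here are assumptions; the real body is elided in the diff.
  const { search = '', limit = 50, offset = 0, preservePage = false } = options
  return { search, limit, offset, preservePage }
}
```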
```typescript
private processStreamingChunk(buffer: string, selectedFields: string[]): string | null {
  // For streaming response format, we need to parse the JSON as it comes in
  // and extract only the field values we care about

  // Try to parse as complete JSON first
  try {
    const parsed = JSON.parse(buffer.trim())
    if (typeof parsed === 'object' && parsed !== null) {
      // We have a complete JSON object, extract the selected fields
      // Process all selected fields and format them properly
      const results: string[] = []
      for (const field of selectedFields) {
        if (field in parsed) {
          const value = parsed[field]
          const formattedValue = typeof value === 'string' ? value : JSON.stringify(value)
          results.push(formattedValue)
        }
      }

      if (results.length > 0) {
        // Join multiple fields with newlines for readability
        const result = results.join('\n')
        return result
      }

      return null
    }
  } catch (e) {
    // Not complete JSON yet, continue buffering
  }

  // For real-time extraction during streaming, we'd need more sophisticated parsing
  // For now, let's handle the case where we receive chunks that might be partial JSON

  // Simple heuristic: if buffer contains what looks like a complete JSON object
  const openBraces = (buffer.match(/\{/g) || []).length
  const closeBraces = (buffer.match(/\}/g) || []).length

  if (openBraces > 0 && openBraces === closeBraces) {
    // Likely a complete JSON object
    try {
      const parsed = JSON.parse(buffer.trim())
      if (typeof parsed === 'object' && parsed !== null) {
        // Process all selected fields and format them properly
        const results: string[] = []
        for (const field of selectedFields) {
          if (field in parsed) {
            const value = parsed[field]
            const formattedValue = typeof value === 'string' ? value : JSON.stringify(value)
            results.push(formattedValue)
          }
        }

        if (results.length > 0) {
          // Join multiple fields with newlines for readability
          const result = results.join('\n')
          return result
        }

        return null
      }
    } catch (e) {
      // Still not valid JSON, continue
    }
  }

  return null
}
```
style: Duplicate JSON parsing logic between processStreamingChunk and processCompleteJson. Extract common field processing into a shared method
Suggested change:

```typescript
private processFields(parsed: Record<string, any>, selectedFields: string[]): string | null {
  const results: string[] = []
  for (const field of selectedFields) {
    if (field in parsed) {
      const value = parsed[field]
      const formattedValue = typeof value === 'string' ? value : JSON.stringify(value)
      results.push(formattedValue)
    }
  }
  if (results.length > 0) {
    return results.join('\n')
  }
  return null
}

private processStreamingChunk(buffer: string, selectedFields: string[]): string | null {
  // Try to parse as complete JSON first
  try {
    const parsed = JSON.parse(buffer.trim())
    if (typeof parsed === 'object' && parsed !== null) {
      return this.processFields(parsed, selectedFields)
    }
  } catch (e) {
    // Not complete JSON yet, continue buffering
  }

  // For real-time extraction during streaming, we'd need more sophisticated parsing
  // For now, let's handle the case where we receive chunks that might be partial JSON

  // Simple heuristic: if buffer contains what looks like a complete JSON object
  const openBraces = (buffer.match(/\{/g) || []).length
  const closeBraces = (buffer.match(/\}/g) || []).length
  if (openBraces > 0 && openBraces === closeBraces) {
    // Likely a complete JSON object
    try {
      const parsed = JSON.parse(buffer.trim())
      if (typeof parsed === 'object' && parsed !== null) {
        return this.processFields(parsed, selectedFields)
      }
    } catch (e) {
      // Still not valid JSON, continue
    }
  }

  return null
}
```
```typescript
if (!tag.includes('.') && ['index', 'currentItem', 'items'].includes(tag)) {
  processedTag = `${blockGroup.blockType}.${tag}`
} else {
  // It's already a properly formatted tag (like blockname.results)
  processedTag = tag
}
```
style: Consider extracting contextual tag prefix logic into a utility function for reuse
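A possible extraction of the contextual-tag logic. `CONTEXT_TAGS` mirrors the hard-coded list in the diff; the function name is illustrative.

```typescript
// Mirrors the hard-coded tag list in the diff; function name is illustrative.
const CONTEXT_TAGS: readonly string[] = ['index', 'currentItem', 'items']

function prefixContextualTag(tag: string, blockType: string): string {
  // Bare loop/parallel context tags get the block type prepended; already
  // qualified tags (like "blockname.results") pass through unchanged.
  if (!tag.includes('.') && CONTEXT_TAGS.includes(tag)) {
    return `${blockType}.${tag}`
  }
  return tag
}
```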
```typescript
vi.mock('@/stores/workflows/subblock/store', () => ({
  useSubBlockStore: vi.fn(() => ({
    getValue: vi.fn(() => null),
    getState: vi.fn(() => ({
      getValue: vi.fn(() => null),
    })),
  })),
}))
```
style: Mock store initialization should be in a beforeEach block for better test maintainability
✅ No security or compliance issues detected. Reviewed everything up to c65384d.
v0.2.10: fix
Description
Type of change
Checklist:
- `bun run test` passes

Security Considerations: