
feat: Support large images #498

Open
guoyingtao wants to merge 6 commits into master from feat/large-image

Conversation

@guoyingtao
Owner

@guoyingtao guoyingtao commented Feb 26, 2026

Fixes #95

Summary by CodeRabbit

  • New Features

    • Large-image mode with automatic downsampling and cached preview for smoother interactive editing
    • UI toggle to switch between normal and large image display
    • Sample large image included
  • Configuration

    • New setting to configure the pixel threshold that enables large-image mode
    • Interactive edits use a downsampled preview while final crop preserves full resolution
  • Performance Improvements

    • More memory-efficient loading and cropping using optimized image pipelines

guoyingtao and others added 2 commits February 26, 2026 00:45
When maxImagePixelCount > 0 and the image exceeds the threshold:
- Display uses ImageIO-downsampled image to prevent UIImageView black screen
- Crop uses memory-efficient CIImage pipeline (CIPerspectiveCorrection)
  that only decodes needed source pixels via GPU tiling
- Full original resolution is preserved in crop output
- Default (maxImagePixelCount = 0) leaves all existing code paths untouched
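The threshold and downsampling behavior described above can be sketched in plain Swift (a minimal sketch: the pixel-count comparison and the sqrt-based scale factor follow this PR's description, but the type and function names here are illustrative, not Mantis API):

```swift
import Foundation

/// Illustrative policy: decides whether an image enters large-image mode
/// and, if so, how much to scale it down for display.
struct LargeImagePolicy {
    /// 0 disables large-image mode, leaving existing code paths untouched.
    let maxImagePixelCount: Int

    func isLargeImage(pixelWidth: Int, pixelHeight: Int) -> Bool {
        maxImagePixelCount > 0 && pixelWidth * pixelHeight > maxImagePixelCount
    }

    /// Uniform scale factor so that (w * s) * (h * s) ≈ maxImagePixelCount.
    /// Scaling both dimensions by sqrt(ratio) shrinks the pixel area by ratio.
    func displayScaleFactor(pixelWidth: Int, pixelHeight: Int) -> Double {
        guard isLargeImage(pixelWidth: pixelWidth, pixelHeight: pixelHeight) else {
            return 1.0
        }
        let ratio = Double(maxImagePixelCount) / Double(pixelWidth * pixelHeight)
        return ratio.squareRoot()
    }
}

let policy = LargeImagePolicy(maxImagePixelCount: 4_000_000)
let large = policy.isLargeImage(pixelWidth: 8000, pixelHeight: 6000)   // 48 MP > 4 MP
let scale = policy.displayScaleFactor(pixelWidth: 8000, pixelHeight: 6000)
// 8000 * scale by 6000 * scale has an area of about 4,000,000 pixels
```

The full-resolution original is still handed to the crop pipeline; only the displayed image is scaled by this factor.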

Changes:
- CropViewConfig: add maxImagePixelCount property
- UIImageExtensions: add exifOrientation, downsampledIfNeeded (public),
  cropWithCIImage lazy pipeline supporting all transforms
- Mantis.swift: downsample display image in buildCropView
- CropView: add isLargeImageMode, branch crop/asyncCrop/update paths
- CropViewController+CropAPI: branch crop(by:) for large images
- DemoViewController: show downsampled preview, pass original to crop

Co-Authored-By: Claude <noreply@anthropic.com>
Add a UISwitch on the left side of the navigation bar to toggle between
sunflower.jpg (default) and large.jpg for testing large image support.

Co-Authored-By: Claude <noreply@anthropic.com>
@guoyingtao guoyingtao requested a review from Copilot February 26, 2026 06:02
@coderabbitai

coderabbitai bot commented Feb 26, 2026

Note

Reviews paused

It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior by changing the reviews.auto_review.auto_pause_after_reviewed_commits setting.

Use the following commands to manage reviews:

  • @coderabbitai resume to resume automatic reviews.
  • @coderabbitai review to trigger a single review.


No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info

Configuration used: defaults

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between f879ca6 and 0967d21.

📒 Files selected for processing (1)
  • Sources/Mantis/Extensions/UIImageExtensions.swift

📝 Walkthrough


Adds configurable large-image support: images may be downsampled for display while a CIImage-based crop path preserves full-resolution output; demo app gets a Large Image toggle and per-config maxImagePixelCount to control behavior.

Changes

  • Config & core image handling (Sources/Mantis/CropViewConfig.swift, Sources/Mantis/Mantis.swift): Adds maxImagePixelCount to the config and applies downsampling to images before creating image containers.
  • Crop view & controllers (Sources/Mantis/CropView/CropView.swift, Sources/Mantis/CropView/CropView+Crop.swift, Sources/Mantis/CropViewController/CropViewController+CropAPI.swift): Adds isLargeImageMode and routes cropping to the CIImage-based cropWithCIImage(by:) when the image exceeds the threshold; the display path uses a downsampled image while crop output preserves full resolution.
  • UIImage extensions, large-image support (Sources/Mantis/Extensions/UIImageExtensions.swift): Adds exifOrientation, downsampledIfNeeded(maxPixelCount:sourceData:) using ImageIO, and cropWithCIImage(by:) using a CI pipeline for memory-efficient cropping and perspective handling.
  • Example / demo & assets (Example/DemoViewController.swift, Example/MantisExample.xcodeproj/project.pbxproj): Adds a large.jpg resource, a Large Image UI toggle, and logic (useLargeImage, cachedDisplayImage) to switch the image source, downsample for display, and reset transforms.

Sequence Diagram

sequenceDiagram
    participant User
    participant DemoVC as Demo ViewController
    participant UIImageExt as UIImage Extensions
    participant CropView
    participant CI as CoreImage

    User->>DemoVC: select image / toggle Large Image
    DemoVC->>UIImageExt: downsampledIfNeeded(maxPixelCount)
    UIImageExt-->>DemoVC: cached display image (possibly downsampled)
    DemoVC->>CropView: update(display image)
    User->>DemoVC: request crop (cropInfo)
    DemoVC->>CropView: crop(originalImage, cropInfo)
    CropView->>CropView: evaluate isLargeImageMode (maxImagePixelCount)
    alt Large Image Mode
        CropView->>UIImageExt: cropWithCIImage(cropInfo)
        UIImageExt->>CI: build & run CI pipeline (perspective/transform)
        CI-->>UIImageExt: cropped full-res image
    else Normal Mode
        CropView->>UIImageExt: crop(by: cropInfo)
    end
    UIImageExt-->>CropView: cropped UIImage
    CropView-->>DemoVC: CropOutput (full-res)
    DemoVC-->>User: deliver cropped image

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~50 minutes

Poem

🐰 I nibble pixels, big and small,
I downsample views to keep them all,
CI sculpts edges with mindful art,
Full-res kept while previews depart,
Hop, crop, deliver — a rabbit's call 🥕📷

🚥 Pre-merge checks | ✅ 4 | ❌ 1

❌ Failed checks (1 warning)

  • Docstring Coverage (⚠️ Warning): Docstring coverage is 40.00%, below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them to satisfy the coverage threshold.
✅ Passed checks (4 passed)
  • Description Check (✅ Passed): Check skipped; CodeRabbit's high-level summary is enabled.
  • Title Check (✅ Passed): The title "feat: Support large images" clearly and concisely summarizes the main change: adding support for handling large images in Mantis.
  • Linked Issues Check (✅ Passed): The changes address all core objectives from issue #95: downsampling for display memory efficiency, preserving original data for full-resolution outputs, and handling large image allocation failures.
  • Out of Scope Changes Check (✅ Passed): All changes are directly related to large image support: adding the maxImagePixelCount config, downsampling utilities, CIImage-based cropping, and UI controls for testing the feature.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.



@coderabbitai coderabbitai bot left a comment


🧹 Nitpick comments (4)
Sources/Mantis/CropViewController/CropViewController+CropAPI.swift (1)

14-22: Code duplication with CropView.isLargeImageMode.

The pixel count check duplicates the logic in CropView.isLargeImageMode (lines 62-66 in CropView.swift). Consider reusing the existing computed property to ensure consistency:

♻️ Proposed refactor to reuse isLargeImageMode
     public func crop(by cropInfo: CropInfo) {
-        let croppedImage: UIImage?
-        let maxPixels = config.cropViewConfig.maxImagePixelCount
-        let pixelW = Int(cropView.image.size.width * cropView.image.scale)
-        let pixelH = Int(cropView.image.size.height * cropView.image.scale)
-        if maxPixels > 0 && pixelW * pixelH > maxPixels {
-            croppedImage = cropView.image.cropWithCIImage(by: cropInfo)
-        } else {
-            croppedImage = cropView.image.crop(by: cropInfo)
-        }
+        let croppedImage = cropView.isLargeImageMode
+            ? cropView.image.cropWithCIImage(by: cropInfo)
+            : cropView.image.crop(by: cropInfo)

Note: This would require exposing isLargeImageMode as internal or higher visibility on CropView, or adding a method on CropViewProtocol.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In Sources/Mantis/CropViewController/CropViewController+CropAPI.swift around
lines 14-22: the pixel-count branching in CropViewController+CropAPI.swift
duplicates CropView.isLargeImageMode; replace the manual maxPixels/pixelW*pixelH
check by using cropView.isLargeImageMode (or add an isLargeImageMode accessor to
CropViewProtocol and use that) to decide between cropWithCIImage(by:) and
crop(by:); if visibility prevents direct use, change CropView.isLargeImageMode
to internal (or add a protocol method like isLargeImageMode()) and update the
call site to use that instead so the logic is centralized in CropView.
Sources/Mantis/Extensions/UIImageExtensions.swift (2)

379-384: Fix SwiftLint colon spacing warnings.

Static analysis flagged inconsistent colon spacing in the array literal.

🔧 Proposed fix for colon spacing
         let screenCorners = [
-            CGPoint(x: -halfCropW, y: -halfCropH),
-            CGPoint(x:  halfCropW, y: -halfCropH),
-            CGPoint(x:  halfCropW, y:  halfCropH),
-            CGPoint(x: -halfCropW, y:  halfCropH)
+            CGPoint(x: -halfCropW, y: -halfCropH),
+            CGPoint(x: halfCropW, y: -halfCropH),
+            CGPoint(x: halfCropW, y: halfCropH),
+            CGPoint(x: -halfCropW, y: halfCropH)
         ]
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In Sources/Mantis/Extensions/UIImageExtensions.swift around lines 379-384:
the array literal assigned to screenCorners contains inconsistent spacing around
colons in the CGPoint initializers; update each CGPoint(initializer:) to use a
single space after each colon (e.g., "x: value, y: value") so all four entries
are consistently formatted. Locate the screenCorners constant in
UIImageExtensions (the CGPoint(...) entries) and normalize the colon spacing
throughout the array literal to satisfy SwiftLint.

313-343: Memory consideration in downsampling approach.

The method converts the UIImage to JPEG/PNG data before creating the thumbnail. This temporarily requires the full image data in memory before ImageIO can create the thumbnail. For extremely large images, this could still trigger memory pressure.

For future optimization, consider accepting a file URL or Data parameter directly when the source is a file, allowing ImageIO to stream from disk without fully loading the UIImage first. However, this is acceptable for the current opt-in implementation.
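The streaming approach suggested here can be sketched with ImageIO's thumbnail API (a sketch under the assumption of a file-backed source; the function name and parameters are illustrative and not part of Mantis):

```swift
import ImageIO
import UIKit

// Illustrative helper: downsample directly from disk so the full-resolution
// bitmap is never decoded into memory.
func downsampledImage(at url: URL, maxPixelSize: Int) -> UIImage? {
    // Don't cache the decoded full-size image in the source.
    let sourceOptions: [CFString: Any] = [kCGImageSourceShouldCache: false]
    guard let source = CGImageSourceCreateWithURL(url as CFURL,
                                                  sourceOptions as CFDictionary) else {
        return nil
    }
    let thumbnailOptions: [CFString: Any] = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceCreateThumbnailWithTransform: true,  // bake in EXIF orientation
        kCGImageSourceShouldCacheImmediately: true,        // decode now, once
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ]
    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0,
                                                            thumbnailOptions as CFDictionary) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```

Because ImageIO reads from the file incrementally, peak memory scales with the thumbnail size rather than the source image size.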

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In Sources/Mantis/Extensions/UIImageExtensions.swift around lines 313-343:
the downsampledIfNeeded(maxPixelCount:) implementation currently calls
jpegData(compressionQuality:) and pngData() on UIImage which forces the full
image into memory before ImageIO can create a thumbnail; update the API and
implementation to accept an optional Data or URL input so callers can provide
on-disk image data (or a file URL) to
CGImageSourceCreateWithData/CGImageSourceCreateWithURL instead of always
serializing the UIImage, and when UIImage-only input is used keep the existing
fallback but document the memory caveat; locate the method
downsampledIfNeeded(maxPixelCount:) and the code paths that call
jpegData/pngData and replace them with branching to use provided Data/URL (using
CGImageSourceCreateWithURL when URL present) while preserving existing behavior
when no external source is supplied.
Sources/Mantis/CropView/CropView+Crop.swift (1)

12-30: Consider moving the crop operation to the background thread for large images.

The cropping logic (lines 14-20) runs on the main thread before dispatching to the background. For large images, cropWithCIImage may cause UI blocking when CIContext.createCGImage() renders the final output. Consider wrapping the entire crop operation in the background dispatch:

♻️ Proposed refactor to move cropping to background
 func asyncCrop(_ image: UIImage, completion: @escaping (CropOutput) -> Void) {
     let cropInfo = getCropInfo()
-    let croppedImage: UIImage?
-    if isLargeImageMode {
-        croppedImage = image.cropWithCIImage(by: cropInfo)
-    } else {
-        croppedImage = image.crop(by: cropInfo)
-    }
-    let cropOutput = (croppedImage, makeTransformation(), cropInfo)
-
+    let transformation = makeTransformation()
+    let isLarge = isLargeImageMode
+    
     DispatchQueue.global(qos: .userInteractive).async {
+        let croppedImage: UIImage?
+        if isLarge {
+            croppedImage = image.cropWithCIImage(by: cropInfo)
+        } else {
+            croppedImage = image.crop(by: cropInfo)
+        }
+        let cropOutput = (croppedImage, transformation, cropInfo)
         let maskedCropOutput = self.addImageMask(to: cropOutput)
         DispatchQueue.main.async {
             self.activityIndicator.stopAnimating()
             self.activityIndicator.isHidden = true
             completion(maskedCropOutput)
         }
     }
 }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In Sources/Mantis/CropView/CropView+Crop.swift around lines 12-30: the crop
work is happening on the main thread inside asyncCrop; move the heavy cropping
into the background block to avoid UI jank. In asyncCrop, call getCropInfo()
(cheap) on main if needed, but perform the image cropping (use
image.cropWithCIImage(by:) when isLargeImageMode is true, otherwise
image.crop(by:)) inside DispatchQueue.global(qos: .userInteractive).async, then
build cropOutput, call addImageMask(to:) on that background thread, and only
dispatch back to DispatchQueue.main to stop activityIndicator and invoke
completion(maskedCropOutput). Ensure any UI access (activityIndicator) remains
on the main thread and do not touch UI from the background.

ℹ️ Review info

Configuration used: defaults

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 901ad4d and 85992b8.

⛔ Files ignored due to path filters (1)
  • Example/large.jpg is excluded by !**/*.jpg
📒 Files selected for processing (8)
  • Example/DemoViewController.swift
  • Example/MantisExample.xcodeproj/project.pbxproj
  • Sources/Mantis/CropView/CropView+Crop.swift
  • Sources/Mantis/CropView/CropView.swift
  • Sources/Mantis/CropViewConfig.swift
  • Sources/Mantis/CropViewController/CropViewController+CropAPI.swift
  • Sources/Mantis/Extensions/UIImageExtensions.swift
  • Sources/Mantis/Mantis.swift


Copilot AI left a comment


Pull request overview

Adds a “large image mode” to Mantis to avoid UIImageView black rendering / CGContext allocation failures when working with very large images (Fixes #95).

Changes:

  • Introduces CropViewConfig.maxImagePixelCount to enable downsampling for display and a CIImage-based cropping path for oversized inputs.
  • Updates crop flows to switch between the existing CGContext-based crop and the new CIImage pipeline based on pixel count.
  • Updates the example app to include a large-image demo asset and a UI toggle.

Reviewed changes

Copilot reviewed 8 out of 9 changed files in this pull request and generated 7 comments.

Per-file summary:
  • Sources/Mantis/Mantis.swift: Uses a downsampled image for the display ImageContainer when configured.
  • Sources/Mantis/Extensions/UIImageExtensions.swift: Adds a downsampling helper and a CIImage crop implementation for large images.
  • Sources/Mantis/CropViewController/CropViewController+CropAPI.swift: Switches the crop pipeline based on the configured max pixel threshold.
  • Sources/Mantis/CropViewConfig.swift: Adds the maxImagePixelCount configuration knob and documentation.
  • Sources/Mantis/CropView/CropView.swift: Switches the crop output pipeline and downsample-on-update display behavior.
  • Sources/Mantis/CropView/CropView+Crop.swift: Updates async crop to use the CIImage pipeline for large images.
  • Example/MantisExample.xcodeproj/project.pbxproj: Adds large.jpg to the example project resources.
  • Example/DemoViewController.swift: Adds a UI toggle and uses a downsampled display image in the example.


let scaleFactor = sqrt(ratio)
let maxDimension = max(CGFloat(pixelWidth), CGFloat(pixelHeight)) * scaleFactor

guard let imageData = jpegData(compressionQuality: 1.0) ?? pngData(),

Copilot AI Feb 26, 2026


downsampledIfNeeded() builds the CGImageSource from jpegData(...) ?? pngData(). For images with alpha, jpegData will succeed but discard transparency (flattening pixels), which can change how transparent PNGs render in the crop UI. Prefer preserving alpha by choosing PNG when the source has an alpha channel (or avoid JPEG altogether for this helper).

Suggested change
-        guard let imageData = jpegData(compressionQuality: 1.0) ?? pngData(),
+        let imageData: Data?
+        if let cgImage = self.cgImage {
+            let alphaInfo = cgImage.alphaInfo
+            let hasAlpha: Bool = (
+                alphaInfo == .first ||
+                alphaInfo == .last ||
+                alphaInfo == .premultipliedFirst ||
+                alphaInfo == .premultipliedLast
+            )
+            if hasAlpha {
+                imageData = self.pngData()
+            } else {
+                imageData = self.jpegData(compressionQuality: 1.0)
+            }
+        } else {
+            // Fallback when cgImage is unavailable: prefer PNG to preserve potential transparency.
+            imageData = self.pngData() ?? self.jpegData(compressionQuality: 1.0)
+        }
+        guard let imageData,

Owner Author

@guoyingtao guoyingtao Feb 26, 2026


Use CGImage.dataProvider (zero-copy for file-backed images) instead of
jpegData()/pngData() which forces a full decode then re-encode. Also add
optional sourceData parameter for callers that have original compressed
data available. Re-encoding is now only a last-resort fallback.

Co-Authored-By: Claude <noreply@anthropic.com>

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In Sources/Mantis/Extensions/UIImageExtensions.swift:
- Around line 397-400: The CGPoint literals inside screenCorners have extra
spaces after the parameter labels (e.g., "x:  halfCropW") causing SwiftLint
colon spacing violations; update the four CGPoint initializers in the
screenCorners code to use a single space after each colon for both x: and y:
(e.g., "x: -halfCropW, y: -halfCropH") so the colon spacing conforms to
SwiftLint rules.
- Around line 333-344: The current else-if chain sets imageSource using
sourceData or providerData and skips the jpegData()/pngData() fallback if
provider-based CGImageSourceCreateWithData(providerData, ...) returns nil;
change the logic in UIImageExtensions (around the imageSource initialization) to
try each approach independently: first attempt CGImageSourceCreateWithData for
sourceData, if that yields nil then attempt CGImageSourceCreateWithData for
providerData (from cgImage.dataProvider.data), and if that also yields nil then
fall back to creating encoded data via jpegData(compressionQuality:) ??
pngData() and pass that to CGImageSourceCreateWithData; ensure imageSource is
assigned the first non-nil CGImageSource result so the re-encode fallback always
runs when provider-based creation fails.

ℹ️ Review info

Configuration used: defaults

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 85992b8 and 1c14a46.

📒 Files selected for processing (1)
  • Sources/Mantis/Extensions/UIImageExtensions.swift

Change displayImage from a computed property to a cached stored property
that updates via didSet when image changes, preventing redundant
downsampledIfNeeded calls on every access.

Co-Authored-By: Claude <noreply@anthropic.com>

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🧹 Nitpick comments (1)
Example/DemoViewController.swift (1)

93-97: Consider localizing and labeling the Large Image toggle.

At Line 93, the text is hardcoded. If this demo is user-facing, prefer NSLocalizedString and set an explicit accessibility label/value on the switch.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In Example/DemoViewController.swift around lines 93-97: replace the hardcoded
label text and add accessibility metadata: use NSLocalizedString for
switchLabel.text (e.g., NSLocalizedString("Large Image", "toggle label for large
image demo")) instead of the literal string in DemoViewController, and set
explicit accessibility properties on largeImageSwitch such as accessibilityLabel
(matching the localized string) and accessibilityValue/state so screen readers
can convey the toggle state; update references to switchLabel and
largeImageSwitch accordingly.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In Example/DemoViewController.swift:
- Around line 16-18: The synchronous call to
image?.downsampledIfNeeded(maxPixelCount: maxImagePixelCount) inside the image
didSet blocks the main thread; change didSet to perform downsampling on a
background queue (e.g., DispatchQueue.global), capture the current image/local
maxImagePixelCount at task start, then when the downsampled result is ready
dispatch back to the main queue to assign cachedDisplayImage only if the source
image hasn’t changed (stale-result guard comparing the captured image or using
an incrementing token). Ensure all UI/state writes (cachedDisplayImage) happen
on the main thread and avoid retaining long-lived work for cancelled/stale
images.

---

Nitpick comments:
In Example/DemoViewController.swift:
- Around line 93-97: Replace the hardcoded label text and add accessibility
metadata: use NSLocalizedString for switchLabel.text (e.g.,
NSLocalizedString("Large Image", "toggle label for large image demo")) instead
of the literal string in DemoViewController, and set explicit accessibility
properties on largeImageSwitch such as accessibilityLabel (matching the
localized string) and accessibilityValue/state so screen readers can convey the
toggle state; update references to switchLabel and largeImageSwitch accordingly.

ℹ️ Review info

Configuration used: defaults

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 1c14a46 and 789fbd1.

📒 Files selected for processing (1)
  • Example/DemoViewController.swift

…der yields raw pixel data

CGImage.dataProvider.data can return decoded bitmap data rather than
an encoded image format. The previous else-if chain would accept the
data but fail at thumbnail creation, skipping the jpegData/pngData
fallback. Restructure as independent if-blocks so each approach is
tried end-to-end before moving to the next.

Co-Authored-By: Claude <noreply@anthropic.com>
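The restructuring this commit describes, trying each data source end-to-end and falling through on any failure instead of committing to the first non-nil data, can be sketched generically (a minimal sketch with illustrative names and string stand-ins; the real code works with CGImageSource and image data):

```swift
import Foundation

/// Tries each candidate end-to-end: a source may yield non-nil data that still
/// fails at the thumbnail-creation step, in which case we move on to the next
/// source. An else-if chain keyed only on data availability would stop too early.
func firstUsableThumbnail(sources: [() -> Data?],
                          makeThumbnail: (Data) -> String?) -> String? {
    for source in sources {
        // Independent if-block per approach: both steps must succeed together.
        if let data = source(), let thumbnail = makeThumbnail(data) {
            return thumbnail
        }
    }
    return nil
}

// Mirrors the bug: the data provider yields raw bitmap data (non-nil) that the
// thumbnail step rejects, so the re-encode fallback must still be attempted.
let sourceData: () -> Data? = { nil }                         // no original compressed data
let providerData: () -> Data? = { Data([0x00]) }              // raw pixels, not decodable
let reencoded: () -> Data? = { "encoded".data(using: .utf8) } // jpegData/pngData fallback

let result = firstUsableThumbnail(
    sources: [sourceData, providerData, reencoded],
    makeThumbnail: { data in
        String(data: data, encoding: .utf8).flatMap { $0 == "encoded" ? "thumb" : nil }
    }
)
// The chain falls through the first two sources and succeeds on the fallback.
```

The earlier else-if version would have bound `providerData`'s non-nil result and returned nil, never reaching the re-encode path.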

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🧹 Nitpick comments (1)
Sources/Mantis/Extensions/UIImageExtensions.swift (1)

443-444: Reuse a shared CIContext for repeated crops.

Creating a new CIContext per crop is avoidable overhead; a shared context improves throughput for repeated operations.

Proposed refactor
 extension UIImage {
+    private static let sharedCIContext = CIContext(options: [.useSoftwareRenderer: false])
+
     func cgImageWithFixedOrientation() -> CGImage? {
         if imageOrientation == .up {
             return cgImage
@@
-        let context = CIContext(options: [.useSoftwareRenderer: false])
+        let context = Self.sharedCIContext
         guard let resultCG = context.createCGImage(scaledImage, from: scaledImage.extent) else {
             return nil
         }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In Sources/Mantis/Extensions/UIImageExtensions.swift around lines 443-444:
The code currently allocates a new CIContext for each crop (the local let
context = CIContext(...) used before calling createCGImage), which is expensive;
refactor by adding a shared static CIContext (e.g., a private static let
sharedCIContext = CIContext(options: [.useSoftwareRenderer: false]) on the
UIImageExtensions/UIImage extension) and replace the per-call context with that
shared context when calling createCGImage(scaledImage, from: ...), ensuring any
references to the local variable 'context' are updated to use the shared
property (e.g., Self.sharedCIContext or UIImageExtensions.sharedCIContext).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In Sources/Mantis/Extensions/UIImageExtensions.swift:
- Around line 357-359: The fallback data creation currently prefers
jpegData(compressionQuality: 1.0) over pngData(), which can strip alpha; change
the selection to prefer pngData() first (i.e., use pngData() ??
jpegData(compressionQuality: 1.0)) so transparent images retain their alpha
before passing to
CGImageSourceCreateWithData/CGImageSourceCreateThumbnailAtIndex; update the
variable/condition that sets fallbackData accordingly in UIImageExtensions.swift
where fallbackData is created.

---

Nitpick comments:
In Sources/Mantis/Extensions/UIImageExtensions.swift:
- Around line 443-444: The code currently allocates a new CIContext for each
crop (the local let context = CIContext(...) used before calling createCGImage),
which is expensive; refactor by adding a shared static CIContext (e.g., a
private static let sharedCIContext = CIContext(options: [.useSoftwareRenderer:
false]) on the UIImageExtensions/UIImage extension) and replace the per-call
context with that shared context when calling createCGImage(scaledImage, from:
...), ensuring any references to the local variable 'context' are updated to use
the shared property (e.g., Self.sharedCIContext or
UIImageExtensions.sharedCIContext).

ℹ️ Review info

Configuration used: defaults

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 789fbd1 and f879ca6.

📒 Files selected for processing (1)
  • Sources/Mantis/Extensions/UIImageExtensions.swift

…annel

Co-Authored-By: Claude <noreply@anthropic.com>


Development

Successfully merging this pull request may close these issues.

Support for large images

2 participants