Reduce iOS Image Memory Usage with Downsampling

Downsample images before display to cut per-image memory from about 26MB to roughly 2MB on iOS.

A 2.6MB JPEG file consumes 26MB of RAM when decoded. File size and memory size are different things, and ignoring this distinction is one of the most common causes of memory spikes in image-heavy apps.

Why Images Use So Much Memory

When iOS displays an image, it decompresses pixels into a bitmap buffer. The formula is straightforward:

Memory = width x height x 4 bytes (RGBA)

A 3024 x 2160 photo:

3024 x 2160 x 4 = 26,127,360 bytes (~26MB)

Display that in a 300 x 200 point UIImageView on a 3x device (900 x 600 actual pixels) and iOS still decodes the full 26MB. The GPU downscales at render time, but the memory is already allocated.
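
To see the gap in practice, you can estimate the decoded footprint of any UIImage from its pixel dimensions. A minimal sketch, assuming a 4-bytes-per-pixel RGBA buffer (the common case, though grayscale, wide-color, and HDR content differ); estimatedDecodedBytes is a hypothetical helper, not a UIKit API:

import UIKit

extension UIImage {
    // Rough estimate of the bitmap buffer this image needs once decoded,
    // assuming 4 bytes per pixel (RGBA).
    var estimatedDecodedBytes: Int {
        let pixelWidth = Int(size.width * scale)
        let pixelHeight = Int(size.height * scale)
        return pixelWidth * pixelHeight * 4
    }
}

// A 3024 x 2160 photo at scale 1.0 works out to 26,127,360 bytes, or about 26MB.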

The Fix: byPreparingThumbnail

iOS 15 adds UIImage.preparingThumbnail(of:) and its async counterpart byPreparingThumbnail(ofSize:), both of which decode only the pixels you need:

import UIKit

extension UIImage {
    func downsampled(to pointSize: CGSize, scale: CGFloat = UIScreen.main.scale) -> UIImage? {
        let pixelSize = CGSize(
            width: pointSize.width * scale,
            height: pointSize.height * scale
        )
        return preparingThumbnail(of: pixelSize)
    }
}

Use the async variant for background loading:

import UIKit

func loadThumbnail(from url: URL, targetSize: CGSize) async -> UIImage? {
    guard let image = UIImage(contentsOfFile: url.path) else { return nil }
    let pixelSize = CGSize(
        width: targetSize.width * UIScreen.main.scale,
        height: targetSize.height * UIScreen.main.scale
    )
    return await image.byPreparingThumbnail(ofSize: pixelSize)
}

For a 300 x 200 point view on a 3x display, memory drops from 26MB to roughly 2.16MB (900 x 600 x 4 bytes).
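
A usage sketch (PhotoViewController and its imageView are hypothetical; loadThumbnail is the helper above). Calling it from a Task lets the thumbnail preparation run asynchronously while the image view update still lands on the main actor:

import UIKit

final class PhotoViewController: UIViewController {
    private let imageView = UIImageView()  // layout setup omitted

    func showPhoto(at url: URL) {
        Task {
            // Decode only the pixels needed for a 300 x 200 point view.
            let thumbnail = await loadThumbnail(from: url,
                                                targetSize: CGSize(width: 300, height: 200))
            imageView.image = thumbnail
        }
    }
}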

ImageIO Alternative for Pre-iOS 15

CGImageSourceCreateThumbnailAtIndex gives you the same result without the iOS 15 requirement:

import UIKit
import ImageIO

func downsample(imageAt url: URL, to pointSize: CGSize, scale: CGFloat = UIScreen.main.scale) -> UIImage? {
    let maxDimensionInPixels = max(pointSize.width, pointSize.height) * scale

    let options: [CFString: Any] = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceShouldCacheImmediately: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: maxDimensionInPixels
    ]

    guard let imageSource = CGImageSourceCreateWithURL(url as CFURL, nil),
          let cgImage = CGImageSourceCreateThumbnailAtIndex(imageSource, 0, options as CFDictionary) else {
        return nil
    }

    return UIImage(cgImage: cgImage)
}

Key options:

  • kCGImageSourceShouldCacheImmediately decodes and caches the bitmap at thumbnail-creation time, on whatever thread you call from, instead of deferring the decode until first render on the main thread (see the background-queue sketch after this list)
  • kCGImageSourceCreateThumbnailWithTransform applies EXIF orientation so images display correctly
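
Because the decode runs wherever you call the function, it is easy to keep off the main thread. A minimal sketch, assuming the downsample(imageAt:to:scale:) function above; loadDownsampledImage is a hypothetical wrapper, not a system API:

import UIKit

func loadDownsampledImage(at url: URL,
                          pointSize: CGSize,
                          completion: @escaping (UIImage?) -> Void) {
    // Read the screen scale on the calling thread, then decode on a
    // background queue so the expensive work never blocks the main thread.
    let scale = UIScreen.main.scale
    DispatchQueue.global(qos: .userInitiated).async {
        let image = downsample(imageAt: url, to: pointSize, scale: scale)
        DispatchQueue.main.async {
            completion(image)
        }
    }
}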

Comparison

Approach                               Memory (300 x 200 @3x)   Minimum iOS
UIImage(named:)                        ~26MB                    -
preparingThumbnail(of:)                ~2.16MB                  15.0
CGImageSourceCreateThumbnailAtIndex    ~2.16MB                  4.0

Apply downsampling in collection views and grids where dozens of images load simultaneously. A feed displaying 20 full-resolution photos burns 520MB of memory; downsampled thumbnails bring that under 44MB.
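
A collection view sketch tying it together (FeedViewController, PhotoCell, and photoURLs are hypothetical; loadThumbnail is the async helper above). A production version would also guard against setting the image on a cell that has been reused for another item:

import UIKit

final class PhotoCell: UICollectionViewCell {
    let imageView = UIImageView()  // layout setup omitted
}

final class FeedViewController: UIViewController, UICollectionViewDataSource {
    var photoURLs: [URL] = []

    func collectionView(_ collectionView: UICollectionView,
                        numberOfItemsInSection section: Int) -> Int {
        photoURLs.count
    }

    func collectionView(_ collectionView: UICollectionView,
                        cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
        let cell = collectionView.dequeueReusableCell(withReuseIdentifier: "PhotoCell",
                                                      for: indexPath) as! PhotoCell
        cell.imageView.image = nil  // avoid showing a stale image on reuse
        let url = photoURLs[indexPath.item]
        Task {
            // Decode a 150 x 150 point thumbnail instead of the full photo.
            cell.imageView.image = await loadThumbnail(from: url,
                                                       targetSize: CGSize(width: 150, height: 150))
        }
        return cell
    }
}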
