Swiftpack.co - yangKJ/Harbeth as Swift Package

yangKJ/Harbeth 0.2.0
Metal API for a GPU-accelerated graphics, video, and camera filter framework. 🎨 (Image, video, and camera filter framework)
⭐️ 104
🕓 2 weeks ago
.package(url: "https://github.com/yangKJ/Harbeth.git", from: "0.2.0")




Harbeth is a tiny set of utils and extensions over Apple's Metal framework, dedicated to making your Swift GPU code much cleaner and letting you prototype your pipelines faster.

Graphics processing and filter production. 👒👒👒

English | 简体中文


🟣 At the moment, the most important features of the Metal module can be summarized as follows:

  • Supports operator-chain filtering.
  • Supports quick filter design.
  • Supports merging multiple filter effects.
  • Supports fast expansion of output sources.
  • Supports camera capture effects.
  • Supports adding filter effects to video.
  • Supports matrix convolution.
  • The filter part is roughly divided into the following modules:
    • ☑ Blend: Image blending filters.
    • ☑ Blur: Blur effects.
    • ☑ ColorProcess: Basic pixel-level color processing of images.
    • ☑ Effect: Effect processing.
    • ☑ Lookup: Lookup table filters.
    • ☑ Matrix: Matrix convolution filters.
    • ☑ Shape: Image shape and size adjustments.
    • ☑ Visual: Visual dynamic effects.
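The operator-chain idea from the feature list can be sketched in plain Swift. This is an illustrative, self-contained model of the pattern (the protocol and filter names here are hypothetical, not Harbeth's actual API, and it operates on a single Float instead of a texture):

```swift
import Foundation

// Minimal sketch of a chainable filter protocol.
protocol PixelFilter {
    func apply(_ pixel: Float) -> Float
}

struct Brightness: PixelFilter {
    var offset: Float = 0.1
    // Shift the value and clamp to the valid [0, 1] range.
    func apply(_ pixel: Float) -> Float { min(1, max(0, pixel + offset)) }
}

struct Invert: PixelFilter {
    func apply(_ pixel: Float) -> Float { 1 - pixel }
}

infix operator ->>: AdditionPrecedence

// Feed a value through a filter, so stages chain left to right.
func ->> (value: Float, filter: PixelFilter) -> Float {
    filter.apply(value)
}

let result: Float = 0.5 ->> Brightness(offset: 0.25) ->> Invert()
print(result) // (0.5 + 0.25) inverted -> 0.25
```

Because `->>` is left-associative, each stage receives the previous stage's output, which is exactly the reading order of the chained examples below.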

#### A total of 100+ kinds of filters are currently available. ✌️

  • Zero-intrusion injection of filter functionality into existing code.
// Original code:
ImageView.image = originImage

// Injection filter code:
let filter = C7ColorMatrix4x4(matrix: Matrix4x4.sepia)

var filter2 = C7Granularity()
filter2.grain = 0.8

var filter3 = C7SoulOut()
filter3.soul = 0.7

let filters = [filter, filter2, filter3]

// Use:
ImageView.image = try? originImage.makeGroup(filters: filters)

// OR Use:
let AT = C7FilterTexture.init(texture: originImage.mt.toTexture()!)
let result = AT ->> filter ->> filter2 ->> filter3
ImageView.image = result.outputImage()

// Even:
var texture = originImage.mt.toTexture()!
filters.forEach { texture = texture ->> $0 }
ImageView.image = texture.toImage()
  • Camera capture generates pictures.
// Inject an edge detection filter:
var filter = C7EdgeGlow()
filter.lineColor = UIColor.red

// Inject a particle filter:
var filter2 = C7Granularity()
filter2.grain = 0.8

// Generate camera collector:
let camera = C7CollectorCamera(callback: { [weak self] (image) in
    self?.ImageView.image = image
})
camera.captureSession.sessionPreset = AVCaptureSession.Preset.hd1280x720
camera.filters = [filter, filter2]


  • Core, the basic core section

    • C7FilterProtocol: Filter designs must follow this protocol.
      • modifier: Encoder type and corresponding function name.
      • factors: Set the modification parameter factors; values must be converted to Float.
      • otherInputTextures: Multiple input source extension, an array of MTLTexture.
      • outputSize: Change the size of the output image.
  • Outputs, output section

    • C7FilterOutput: Output content protocol; all outputs must implement this protocol.
      • make: Generate data based on filter processing.
      • makeGroup: Combine multiple filters. Note that the order in which filters are added may affect the generated image.
    • C7FilterImage: Image input source based on C7FilterOutput. The following modes only support encoders based on parallel computing.
    • C7FilterTexture: MTLTexture input source based on C7FilterOutput; the input texture is converted into a filter-processed texture.
    • C7CollectorCamera: The camera data collector generates images directly and then returns them on the main thread.
    • C7CollectorVideo: Adds the filter effect to video image frames and generates images for display.


  • For example, how to design a soul filter. 🎷

  1. Conform to C7FilterProtocol

    public struct C7SoulOut: C7FilterProtocol {
        public var soul: Float = 0.5
        public var maxScale: Float = 1.5
        public var maxAlpha: Float = 0.5
        public var modifier: Modifier {
            return .compute(kernel: "C7SoulOut")
        }
        public var factors: [Float] {
            return [soul, maxScale, maxAlpha]
        }
        public init() { }
    }
  2. Configure additional required textures.

  3. Configure the passed parameter factors; only the Float type is supported.

    • This filter requires three parameters:
      • soul: The adjusted soul, from 0.0 to 1.0, with a default of 0.5
      • maxScale: Maximum soul scale
      • maxAlpha: The transparency of the max soul
  4. Write a kernel function shader based on parallel computing.

    kernel void C7SoulOut(texture2d<half, access::write> outputTexture [[texture(0)]],
                          texture2d<half, access::sample> inputTexture [[texture(1)]],
                          constant float *soulPointer [[buffer(0)]],
                          constant float *maxScalePointer [[buffer(1)]],
                          constant float *maxAlphaPointer [[buffer(2)]],
                          uint2 grid [[thread_position_in_grid]]) {
        constexpr sampler quadSampler(mag_filter::linear, min_filter::linear);
        const half4 inColor = inputTexture.read(grid);
        const float x = float(grid.x) / outputTexture.get_width();
        const float y = float(grid.y) / outputTexture.get_height();
        const half soul = half(*soulPointer);
        const half maxScale = half(*maxScalePointer);
        const half maxAlpha = half(*maxAlphaPointer);
        const half alpha = maxAlpha * (1.0h - soul);
        const half scale = 1.0h + (maxScale - 1.0h) * soul;
        const half soulX = 0.5h + (x - 0.5h) / scale;
        const half soulY = 0.5h + (y - 0.5h) / scale;
        const half4 soulMask = inputTexture.sample(quadSampler, float2(soulX, soulY));
        const half4 outColor = inColor * (1.0h - alpha) + soulMask * alpha;
        outputTexture.write(outColor, grid);
    }
  5. Simple to use: since my design is based on a parallel computing pipeline, images can be generated directly.

    var filter = C7SoulOut()
    filter.soul = 0.5
    filter.maxScale = 2.0
    /// Display directly in ImageView
    ImageView.image = try? originImage.make(filter: filter)
  6. As for the animation above, it is also very simple: add a timer, change the value of soul over time, and you are done.
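The per-pixel math in the kernel above can be checked on the CPU. This is a plain-Swift illustration of the same formulas (a hypothetical helper, not part of Harbeth), computing where the ghost layer samples from and how strongly it blends:

```swift
import Foundation

// CPU-side illustration of the C7SoulOut kernel math, for one
// normalized coordinate (x, y) in [0, 1].
func soulSample(x: Float, y: Float, soul: Float,
                maxScale: Float = 1.5,
                maxAlpha: Float = 0.5) -> (x: Float, y: Float, alpha: Float) {
    // The overlay fades out as soul grows toward 1...
    let alpha = maxAlpha * (1 - soul)
    // ...while it scales up toward maxScale.
    let scale = 1 + (maxScale - 1) * soul
    // Sample the scaled copy around the image center (0.5, 0.5).
    let soulX = 0.5 + (x - 0.5) / scale
    let soulY = 0.5 + (y - 0.5) / scale
    return (soulX, soulY, alpha)
}

// At soul == 0 the overlay sits exactly on the original pixel
// at full maxAlpha strength: (0.25, 0.75, 0.5).
let s = soulSample(x: 0.25, y: 0.75, soul: 0)
```

Animating `soul` from 0 to 1, as step 6 suggests, moves the sample point toward the center and fades the overlay out, which produces the "soul out" ghosting effect.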

Advanced usage

  • Operator chain processing
/// 1.Convert to BGRA
let filter1 = C7ColorConvert(with: .color2BGRA)

/// 2.Adjust the granularity
var filter2 = C7Granularity()
filter2.grain = 0.8

/// 3.Adjust white balance
var filter3 = C7WhiteBalance()
filter3.temperature = 5555

/// 4.Adjust the highlight shadows
var filter4 = C7HighlightShadow()
filter4.shadows = 0.4
filter4.highlights = 0.5

/// 5.Combination operation
let AT = C7FilterTexture.init(texture: originImage.mt.toTexture()!)
let result = AT ->> filter1 ->> filter2 ->> filter3 ->> filter4

/// 6.Get the result
filterImageView.image = result.outputImage()

  • Batch processing
/// 1.Convert to RBGA
let filter1 = C7ColorConvert(with: .color2RBGA)

/// 2.Adjust the granularity
var filter2 = C7Granularity()
filter2.grain = 0.8

/// 3.Soul effect
var filter3 = C7SoulOut()
filter3.soul = 0.7

/// 4.Combination operation
let group: [C7FilterProtocol] = [filter1, filter2, filter3]

/// 5.Get the result
filterImageView.image = try? originImage.makeGroup(filters: group)

Both methods can handle multiple filter schemes, depending on your mood.✌️
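Conceptually, the group form is just a fold of the filter list over the input. The sketch below models that in self-contained Swift (hypothetical names, a Float standing in for the texture), and shows why the README's warning about filter order matters:

```swift
import Foundation

// Minimal model of batch (makeGroup-style) filter processing.
protocol ValueFilter {
    func apply(_ value: Float) -> Float
}

struct Scale: ValueFilter {
    var factor: Float
    func apply(_ value: Float) -> Float { value * factor }
}

struct Offset: ValueFilter {
    var amount: Float
    func apply(_ value: Float) -> Float { value + amount }
}

// Apply every filter in order: a left fold over the filter array.
func applyGroup(_ input: Float, filters: [ValueFilter]) -> Float {
    filters.reduce(input) { current, filter in filter.apply(current) }
}

// Order matters: (2 * 3) + 1 = 7, but (2 + 1) * 3 = 9.
let a = applyGroup(2, filters: [Scale(factor: 3), Offset(amount: 1)])
let b = applyGroup(2, filters: [Offset(amount: 1), Scale(factor: 3)])
print(a, b)
```

The chain form (`->>`) evaluates the same fold one operator at a time, which is why both approaches give identical results for the same filter order.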


CocoaPods

  • To import the Metal module, add this to your Podfile:
pod 'Harbeth'
  • To import the OpenCV image module, add this to your Podfile:
pod 'OpencvQueen'

Swift Package Manager

Swift Package Manager is a tool for managing the distribution of Swift code. It’s integrated with the Swift build system to automate the process of downloading, compiling, and linking dependencies.

Xcode 11+ is required to build Harbeth using Swift Package Manager.

To integrate Harbeth into your Xcode project using Swift Package Manager, add it to the dependencies value of your Package.swift:

dependencies: [
    .package(url: "https://github.com/yangKJ/Harbeth", .upToNextMajor(from: "0.1.15")),
]

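In context, a minimal complete manifest might look like the sketch below (the package and target names `MyApp` are placeholders for your own project):

```swift
// swift-tools-version:5.5
import PackageDescription

let package = Package(
    name: "MyApp",
    dependencies: [
        // Pull in Harbeth, allowing any version below the next major release.
        .package(url: "https://github.com/yangKJ/Harbeth", .upToNextMajor(from: "0.1.15")),
    ],
    targets: [
        .target(name: "MyApp", dependencies: ["Harbeth"]),
    ]
)
```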

That is the general process. The demo is also written in great detail; you can check it out for yourself. 🎷


Tip: If you find it helpful, please help me with a star. If you have any questions or needs, feel free to open an issue.


About the author


Harbeth is available under the MIT license. See the LICENSE file for more info.



Release Notes

13 weeks ago
  1. Add oil painting filter
  2. Rename Visual Module
