yangKJ/Harbeth 0.4.7
Metal API for GPU accelerated Image and Video and Camera filter framework. Support macOS & iOS. 🎨 图像、视频、相机滤镜框架
⭐️ 175
🕓 2 days ago
iOS macOS watchOS tvOS
.package(url: "https://github.com/yangKJ/Harbeth.git", from: "0.4.7")

Harbeth


Carthage compatible · CocoaPods compatible · Platform

Harbeth is a tiny set of utils and extensions over Apple's Metal framework dedicated to making your Swift GPU code much cleaner and letting you prototype your pipelines faster.

Graphics processing and filter production.👒👒👒


English | 简体中文

Features

🟣 At the moment, the most important features of the Metal module can be summarized as follows:

  • Supports iOS and macOS.
  • Supports operator chaining of filters.
  • Supports UIImage, CIImage, CGImage, CMSampleBuffer and CVPixelBuffer as input.
  • Supports quick filter design.
  • Supports merging multiple filter effects.
  • Supports fast extension of output sources.
  • Supports camera capture effects.
  • Supports adding filter effects to video.
  • Supports matrix convolution.
  • Supports MetalPerformanceShaders.
  • Compatible with CoreImage.
  • The filter part is roughly divided into the following modules:
    • ☑ Blend: Image blend filters.
    • ☑ Blur: Blur effects.
    • ☑ Pixel: Basic pixel processing of images.
    • ☑ Effect: Effect processing.
    • ☑ Lookup: Lookup table filters.
    • ☑ Matrix: Matrix convolution filters.
    • ☑ Shape: Image shape and size related.
    • ☑ Visual: Visual dynamic effects.
    • ☑ MPS: MetalPerformanceShaders.

A total of 100+ kinds of filters are currently available.✌️

  • Zero-intrusion injection of the filter function into existing code.
// Original code:
ImageView.image = originImage

// Injection filter code:
let filter = C7ColorMatrix4x4(matrix: Matrix4x4.sepia)

var filter2 = C7Granularity()
filter2.grain = 0.8

var filter3 = C7SoulOut()
filter3.soul = 0.7

let filters = [filter, filter2, filter3]

// Use:
let dest = BoxxIO.init(element: originImage, filters: filters)
ImageView.image = try? dest.output()

// OR Use:
ImageView.image = try? originImage.makeGroup(filters: filters)

// OR Use Operator:
ImageView.image = originImage ->> filter ->> filter2 ->> filter3
  • Camera capture that generates filtered pictures.
// Inject an edge detection filter:
var filter = C7EdgeGlow()
filter.lineColor = UIColor.red

// Inject a particle filter:
var filter2 = C7Granularity()
filter2.grain = 0.8

// Generate camera collector:
let camera = C7CollectorCamera.init(delegate: self)
camera.captureSession.sessionPreset = AVCaptureSession.Preset.hd1280x720
camera.filters = [filter, filter2]

extension CameraViewController: C7CollectorImageDelegate {
    func preview(_ collector: C7Collector, fliter image: C7Image) {
        DispatchQueue.main.async {
            self.originImageView.image = image
        }
    }
}
  • Filters are easily applied to local or network video.
    • 🙄 For details, See PlayerViewController.
    • You can also extend this by using BoxxIO to filter the collected CVPixelBuffer.
lazy var video: C7CollectorVideo = {
    let videoURL = URL.init(string: "Link")!
    let asset = AVURLAsset.init(url: videoURL)
    let playerItem = AVPlayerItem.init(asset: asset)
    let player = AVPlayer.init(playerItem: playerItem)
    let video = C7CollectorVideo.init(player: player, delegate: self)
    let filter = C7ColorMatrix4x4(matrix: Matrix4x4.sepia)
    video.filters = [filter]
    return video
}()

self.video.play()

extension PlayerViewController: C7CollectorImageDelegate {
    func preview(_ collector: C7Collector, fliter image: C7Image) {
        self.originImageView.image = image
        // Simulated dynamic effect.
        if let filter = self.tuple?.callback?(self.nextTime) {
            self.video.filters = [filter]
        }
    }
}

Overview

  • Core, the basic core module

    • C7FilterProtocol: Filter designs must follow this protocol.
      • modifier: Encoder type and corresponding function name.
      • factors: Sets the modification parameter factors; values must be converted to Float.
      • otherInputTextures: Extension for multiple input sources, an array of MTLTexture.
      • outputSize: Changes the size of the output image.
      • setupSpecialFactors: Special types of parameter factors, such as a 4x4 matrix.
      • coreImageApply: Compatibility support for CoreImage.
      • parameterDescription: Parameter description.
  • Outputs, output section

    • BoxxIO: Multi-function output; supports UIImage, CGImage, CIImage, MTLTexture, CMSampleBuffer, CVPixelBuffer and so on.
    • Outputable: Output content protocol; all outputs must implement this protocol.
      • make: Generates data based on filter processing.
      • makeGroup: Combines multiple filters. Note that the order in which filters are added may affect the generated image.
    • C7CollectorCamera: The camera data collector generates images directly and then returns them on the main thread.
    • C7CollectorVideo: Adds the filter effect to video image frames to generate the displayed image.
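The point that filter order in makeGroup can change the result is easy to demonstrate. The toy value "filters" below are illustrative stand-ins, not the Harbeth API:

```swift
// Toy sketch (NOT the Harbeth API): two scalar "filters" composed in both
// orders, showing why the order passed to makeGroup matters.
func brighten(_ v: Float) -> Float { min(v + 0.5, 1.0) }  // add, then clamp to 1
func invert(_ v: Float) -> Float { 1.0 - v }

let a = invert(brighten(0.8))  // brighten first: clamps to 1.0, then inverts to 0.0
let b = brighten(invert(0.8))  // invert first: about 0.2, then brightens to about 0.7
```

Because the clamp in `brighten` is lossy, swapping the order produces a very different value; pixel filters such as blur and blend compose the same way.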

Usages

  • For example, how to design a soul filter.🎷

  1. Conform to C7FilterProtocol

    public struct C7SoulOut: C7FilterProtocol {
        public var soul: Float = 0.5
        public var maxScale: Float = 1.5
        public var maxAlpha: Float = 0.5
        
        public var modifier: Modifier {
            return .compute(kernel: "C7SoulOut")
        }
        
        public var factors: [Float] {
            return [soul, maxScale, maxAlpha]
        }
        
        public init() { }
    }
    
  2. Configure additional required textures.

  3. Configure the parameter factors to pass; only the Float type is supported.

    • This filter requires three parameters:
      • soul: The adjusted soul, from 0.0 to 1.0, with a default of 0.5
      • maxScale: Maximum soul scale
      • maxAlpha: The transparency of the max soul
  4. Write a kernel function shader based on parallel computing.

    kernel void C7SoulOut(texture2d<half, access::write> outputTexture [[texture(0)]],
                          texture2d<half, access::sample> inputTexture [[texture(1)]],
                          constant float *soulPointer [[buffer(0)]],
                          constant float *maxScalePointer [[buffer(1)]],
                          constant float *maxAlphaPointer [[buffer(2)]],
                          uint2 grid [[thread_position_in_grid]]) {
        constexpr sampler quadSampler(mag_filter::linear, min_filter::linear);
        const half4 inColor = inputTexture.read(grid);
        const float x = float(grid.x) / outputTexture.get_width();
        const float y = float(grid.y) / outputTexture.get_height();
        
        const half soul = half(*soulPointer);
        const half maxScale = half(*maxScalePointer);
        const half maxAlpha = half(*maxAlphaPointer);
        
        const half alpha = maxAlpha * (1.0h - soul);
        const half scale = 1.0h + (maxScale - 1.0h) * soul;
        
        const half soulX = 0.5h + (x - 0.5h) / scale;
        const half soulY = 0.5h + (y - 0.5h) / scale;
        
        const half4 soulMask = inputTexture.sample(quadSampler, float2(soulX, soulY));
        const half4 outColor = inColor * (1.0h - alpha) + soulMask * alpha;
        
        outputTexture.write(outColor, grid);
    }
    
  5. Simple to use: since the design is based on a parallel compute pipeline, images can be generated directly.

    var filter = C7SoulOut()
    filter.soul = 0.5
    filter.maxScale = 2.0
    
    /// Display directly in ImageView
    ImageView.image = try? originImage.make(filter: filter)
    
  6. As for the animation above, it is also very simple: add a timer, change the value of soul on each tick, and you are done.
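As a sanity check, the per-pixel math of the kernel in step 4 can be mirrored in plain Swift. This is an illustrative sketch only (the shader itself works in half precision on the GPU):

```swift
// CPU mirror of the C7SoulOut kernel math (illustrative; Float instead of half).
struct SoulParams {
    var soul: Float = 0.5
    var maxScale: Float = 1.5
    var maxAlpha: Float = 0.5
}

/// For a normalized pixel coordinate (x, y), returns the blend alpha and the
/// coordinate at which the enlarged "soul" layer is sampled.
func soulSample(x: Float, y: Float, _ p: SoulParams = SoulParams()) -> (alpha: Float, sx: Float, sy: Float) {
    let alpha = p.maxAlpha * (1.0 - p.soul)        // soul layer fades as soul grows
    let scale = 1.0 + (p.maxScale - 1.0) * p.soul  // soul layer scales up with soul
    let sx = 0.5 + (x - 0.5) / scale               // sample pulled toward the center
    let sy = 0.5 + (y - 0.5) / scale
    return (alpha, sx, sy)
}

let r = soulSample(x: 0.75, y: 0.25)
// With the defaults: alpha = 0.25 and (sx, sy) ≈ (0.7, 0.3).
```

This makes the effect easy to reason about: as soul rises, the sampled copy grows (scale up) while its contribution fades (alpha down), producing the "soul leaving the body" look.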


Advanced usage

  • Operator chain processing
/// 1.Convert to BGRA
let filter1 = C7ColorConvert(with: .color2BGRA)

/// 2.Adjust the granularity
var filter2 = C7Granularity()
filter2.grain = 0.8

/// 3.Adjust white balance
var filter3 = C7WhiteBalance()
filter3.temperature = 5555

/// 4.Adjust the highlight shadows
var filter4 = C7HighlightShadow()
filter4.shadows = 0.4
filter4.highlights = 0.5

/// 5.Combination operation
let texture = originImage.mt.toTexture()!
let result = texture ->> filter1 ->> filter2 ->> filter3 ->> filter4

/// 6.Get the result
filterImageView.image = result.toImage()
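How a chain operator like `->>` can work is easy to see in a self-contained sketch. The toy protocol and filters below are illustrative only, not Harbeth's actual implementation:

```swift
// Toy sketch of a left-associative chain operator (NOT Harbeth's real ->>).
infix operator ->>: AdditionPrecedence

protocol ToyFilter {
    func apply(_ input: Float) -> Float
}

struct Gain: ToyFilter {
    let k: Float
    func apply(_ input: Float) -> Float { input * k }
}

struct Offset: ToyFilter {
    let b: Float
    func apply(_ input: Float) -> Float { input + b }
}

// Feeding a value into a filter yields a value, so chains read left to right.
func ->> (lhs: Float, rhs: ToyFilter) -> Float { rhs.apply(lhs) }

let out = 0.5 ->> Gain(k: 2.0) ->> Offset(b: 0.25)  // (0.5 * 2.0) + 0.25 = 1.25
```

Because each application returns the same type that the operator accepts on its left side, any number of filters can be chained without intermediate variables, which is exactly what makes the `texture ->> filter1 ->> filter2` style above read so cleanly.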

  • Batch processing
/// 1.Convert to RBGA
let filter1 = C7ColorConvert(with: .color2RBGA)

/// 2.Adjust the granularity
var filter2 = C7Granularity()
filter2.grain = 0.8

/// 3.Soul effect
var filter3 = C7SoulOut()
filter3.soul = 0.7

/// 4.Combination operation
let group: [C7FilterProtocol] = [filter1, filter2, filter3]

/// 5.Get the result
filterImageView.image = try? originImage.makeGroup(filters: group)

Both methods can handle multiple filter schemes; use whichever suits your mood.✌️


CocoaPods

  • If you want to import the Metal module, add to your Podfile:
pod 'Harbeth'
  • If you want to import the OpenCV image module, add to your Podfile:
pod 'OpencvQueen'

Swift Package Manager

Swift Package Manager is a tool for managing the distribution of Swift code. It’s integrated with the Swift build system to automate the process of downloading, compiling, and linking dependencies.

Xcode 11+ is required to build Harbeth using Swift Package Manager.

To integrate Harbeth into your Xcode project using Swift Package Manager, add it to the dependencies value of your Package.swift:

dependencies: [
    .package(url: "https://github.com/yangKJ/Harbeth.git", branch: "master"),
]
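In context, a minimal consuming Package.swift might look like the sketch below. The package name "MyApp" and the platform minimums are placeholders, not requirements stated by Harbeth:

```swift
// swift-tools-version:5.5
import PackageDescription

let package = Package(
    name: "MyApp",                              // placeholder name
    platforms: [.iOS(.v11), .macOS(.v10_13)],   // illustrative minimums
    dependencies: [
        // Pinning to a version range is usually safer than tracking a branch:
        .package(url: "https://github.com/yangKJ/Harbeth.git", from: "0.4.7"),
    ],
    targets: [
        // Remember to list Harbeth in the target's dependencies as well,
        // or the module will not be importable from your sources.
        .target(name: "MyApp", dependencies: ["Harbeth"]),
    ]
)
```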

Remarks

The general process is roughly as described above. The demo is also written in great detail; you can check it out for yourself.🎷

HarbethDemo

Tip: If you find it helpful, please help me with a star. If you have any questions or needs, you can also open an issue.

Thanks.🎇

About the author


License

Harbeth is available under the MIT license. See the LICENSE file for more info.

