Harbeth is a tiny set of utilities and extensions over Apple's Metal framework, dedicated to making your Swift GPU code much cleaner and letting you prototype your pipelines faster.
Graphics processing and filter production.👒👒👒
English | 简体中文
🟣 At the moment, the most important features of the Metal module can be summarized as follows:
A total of 100+ kinds of filters are currently available.✌️
// Original code:
ImageView.image = originImage
// Injection filter code:
let filter = C7ColorMatrix4x4(matrix: Matrix4x4.sepia)
var filter2 = C7Granularity()
filter2.grain = 0.8
var filter3 = C7SoulOut()
filter3.soul = 0.7
let filters = [filter, filter2, filter3]
// Use:
ImageView.image = try? originImage.makeGroup(filters: filters)
// OR Use:
let AT = C7FilterTexture.init(texture: originImage.mt.toTexture()!)
let result = AT ->> filter ->> filter2 ->> filter3
ImageView.image = result.outputImage()
// Even:
var texture = originImage.mt.toTexture()!
filters.forEach { texture = texture ->> $0 }
ImageView.image = texture.toImage()
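If the same chain is applied in several places, it can be wrapped in a small helper. The sketch below is illustrative (the applyFilters function is not part of Harbeth); it only assumes the makeGroup(filters:) API shown above.

import UIKit
import Harbeth

/// Illustrative helper: applies a chain of filters and falls back to the
/// original image if filtering fails.
func applyFilters(_ filters: [C7FilterProtocol], to image: UIImage) -> UIImage {
    return (try? image.makeGroup(filters: filters)) ?? image
}

// Usage with the filters defined above:
ImageView.image = applyFilters(filters, to: originImage)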
// Inject an edge detection filter:
var filter = C7EdgeGlow()
filter.lineColor = UIColor.red
// Inject a particle filter:
var filter2 = C7Granularity()
filter2.grain = 0.8
// Generate camera collector:
let camera = C7CollectorCamera(callback: { [weak self] (image) in
    self?.ImageView.image = image
})
camera.captureSession.sessionPreset = AVCaptureSession.Preset.hd1280x720
camera.filters = [filter, filter2]
The module is divided into two main parts:
- Core: the basic core board.
- Outputs: the output section.
Filter parameters are passed as Float factors, and image data is handled as MTLTexture.

To create your own filter, implement the C7FilterProtocol:
public struct C7SoulOut: C7FilterProtocol {
    public var soul: Float = 0.5
    public var maxScale: Float = 1.5
    public var maxAlpha: Float = 0.5

    public var modifier: Modifier {
        return .compute(kernel: "C7SoulOut")
    }

    public var factors: [Float] {
        return [soul, maxScale, maxAlpha]
    }

    public init() { }
}
- Configure any additional required textures.
- Configure the parameter factors that are passed in; only the Float type is supported:
  - soul: the adjusted soul, from 0.0 to 1.0, with a default of 0.5.
  - maxScale: the maximum soul scale.
  - maxAlpha: the transparency of the max soul.
- Write a kernel function shader based on parallel computing.
kernel void C7SoulOut(texture2d<half, access::write> outputTexture [[texture(0)]],
                      texture2d<half, access::sample> inputTexture [[texture(1)]],
                      constant float *soulPointer [[buffer(0)]],
                      constant float *maxScalePointer [[buffer(1)]],
                      constant float *maxAlphaPointer [[buffer(2)]],
                      uint2 grid [[thread_position_in_grid]]) {
    constexpr sampler quadSampler(mag_filter::linear, min_filter::linear);
    const half4 inColor = inputTexture.read(grid);
    // Normalized coordinates of the current pixel.
    const float x = float(grid.x) / outputTexture.get_width();
    const float y = float(grid.y) / outputTexture.get_height();

    const half soul = half(*soulPointer);
    const half maxScale = half(*maxScalePointer);
    const half maxAlpha = half(*maxAlphaPointer);

    // The "soul" layer fades out and scales up as `soul` grows.
    const half alpha = maxAlpha * (1.0h - soul);
    const half scale = 1.0h + (maxScale - 1.0h) * soul;

    // Sample the scaled-up copy around the image center.
    const half soulX = 0.5h + (x - 0.5h) / scale;
    const half soulY = 0.5h + (y - 0.5h) / scale;
    const half4 soulMask = inputTexture.sample(quadSampler, float2(soulX, soulY));

    // Blend the scaled copy over the original.
    const half4 outColor = inColor * (1.0h - alpha) + soulMask * alpha;
    outputTexture.write(outColor, grid);
}
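Putting the three steps together, here is a hedged sketch of a hypothetical custom filter. MyInvert is not part of Harbeth; the sketch only assumes the C7FilterProtocol, Modifier, and buffer layout conventions shown above, and the kernel name must match the one passed to .compute(kernel:).

public struct MyInvert: C7FilterProtocol {
    /// Blend between the original image (0.0) and the fully inverted image (1.0).
    public var intensity: Float = 1.0

    public var modifier: Modifier {
        return .compute(kernel: "MyInvert")
    }

    public var factors: [Float] {
        return [intensity]
    }

    public init() { }
}

And a matching kernel function, following the same parameter layout as C7SoulOut above:

kernel void MyInvert(texture2d<half, access::write> outputTexture [[texture(0)]],
                     texture2d<half, access::sample> inputTexture [[texture(1)]],
                     constant float *intensityPointer [[buffer(0)]],
                     uint2 grid [[thread_position_in_grid]]) {
    const half4 inColor = inputTexture.read(grid);
    const half intensity = half(*intensityPointer);
    // Invert the RGB channels and blend with the original by `intensity`; alpha is kept.
    const half4 inverted = half4(1.0h - inColor.rgb, inColor.a);
    const half4 outColor = mix(inColor, inverted, intensity);
    outputTexture.write(outColor, grid);
}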
It is simple to use: since the design is based on a parallel-computing pipeline, filtered images can be generated directly.
var filter = C7SoulOut()
filter.soul = 0.5
filter.maxScale = 2.0
/// Display directly in ImageView
ImageView.image = try? originImage.make(filter: filter)
As for the animation above, it is also very simple: add a timer, change the value of soul each tick, and you are done.
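A minimal sketch of such an animation, continuing the snippet above (it assumes the make(filter:) API shown earlier; originImage, ImageView, and the timer interval are placeholders):

var soulFilter = C7SoulOut()

let timer = Timer.scheduledTimer(withTimeInterval: 1.0 / 30.0, repeats: true) { _ in
    // Cycle `soul` from 0.0 to 1.0 and re-run the filter each tick.
    soulFilter.soul += 0.05
    if soulFilter.soul >= 1.0 { soulFilter.soul = 0.0 }
    ImageView.image = try? originImage.make(filter: soulFilter)
}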
/// 1.Convert to BGRA
let filter1 = C7ColorConvert(with: .color2BGRA)
/// 2.Adjust the granularity
var filter2 = C7Granularity()
filter2.grain = 0.8
/// 3.Adjust white balance
var filter3 = C7WhiteBalance()
filter3.temperature = 5555
/// 4.Adjust the highlight shadows
var filter4 = C7HighlightShadow()
filter4.shadows = 0.4
filter4.highlights = 0.5
/// 5.Combination operation
let AT = C7FilterTexture.init(texture: originImage.mt.toTexture()!)
let result = AT ->> filter1 ->> filter2 ->> filter3 ->> filter4
/// 6.Get the result
filterImageView.image = result.outputImage()
/// 1.Convert to RBGA
let filter1 = C7ColorConvert(with: .color2RBGA)
/// 2.Adjust the granularity
var filter2 = C7Granularity()
filter2.grain = 0.8
/// 3.Soul effect
var filter3 = C7SoulOut()
filter3.soul = 0.7
/// 4.Combination operation
let group: [C7FilterProtocol] = [filter1, filter2, filter3]
/// 5.Get the result
filterImageView.image = try? originImage.makeGroup(filters: group)
Both methods can handle multiple filter schemes, depending on your mood.✌️
To integrate Harbeth using CocoaPods, add the following to your Podfile:

pod 'Harbeth'
pod 'OpencvQueen'
Swift Package Manager is a tool for managing the distribution of Swift code. It’s integrated with the Swift build system to automate the process of downloading, compiling, and linking dependencies.
Xcode 11+ is required to build Harbeth using Swift Package Manager.
To integrate Harbeth into your Xcode project using Swift Package Manager, add it to the dependencies value of your Package.swift:

dependencies: [
    .package(url: "https://github.com/yangKJ/Harbeth", .upToNextMajor(from: "0.1.15")),
]
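For reference, a complete Package.swift might look like the sketch below (MyApp is a hypothetical package/target name; only the Harbeth URL and version come from this README):

// swift-tools-version:5.5
import PackageDescription

let package = Package(
    name: "MyApp",
    dependencies: [
        .package(url: "https://github.com/yangKJ/Harbeth", .upToNextMajor(from: "0.1.15")),
    ],
    targets: [
        // Declare the dependency on the "Harbeth" product for your own target.
        .target(name: "MyApp", dependencies: ["Harbeth"]),
    ]
)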
The general process is roughly as described above. The demo is also written in great detail; you can check it out for yourself.🎷
Tip: If you find this project helpful, please give it a star. If you have any questions or needs, feel free to open an issue.
Thanks.🎇
Harbeth is available under the MIT license. See the LICENSE file for more info.