Core Image

Now that we have our photos posting properly to Twitter, wouldn’t it be fun to jazz them up a little? There are all kinds of apps out on the market to apply filters to your photos, and it would be really cool if we could do that, too. Well, guess what, we can!

Core Image is a framework that was introduced in iOS 5, though it existed on Mac OS X before that. Core Image contains over a hundred photo filters (objects that can manipulate image data) for you to use and incorporate into your projects. For iOS 9, Apple has rewritten the iOS and OS X versions of Core Image to share a common code base, so nearly all of the Apple-provided filters are now available on both platforms. And if that's not enough, it's also possible to write your own image filters.

Important Core Image Classes

Core Image is made up of a small number of classes that create the basis of the functionality of the framework. These classes are like the Lego blocks you use to build your Core Image applications.

The first class we are going to explore is CIImage. A CIImage does not contain any image data. Rather, it is a recipe: a set of instructions telling a CIContext how to produce an image. If we took a source image and applied two filters to it, the CIImage would essentially contain a pointer to the source image data plus the list of filters to be applied.
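To make that concrete, here's a small sketch (assuming a UIImage named photo is already in scope). Building a CIImage and chaining a filter onto it draws nothing; it only accumulates instructions:

```swift
// Sketch only: assumes a UIImage named `photo` is in scope.
// Nothing is rendered here -- ciImage is just a recipe.
let ciImage = CIImage(image: photo) // failable: nil if photo has no usable data

// Chaining a filter still draws no pixels; it only extends the recipe.
let blurred = ciImage?.imageByApplyingFilter("CIGaussianBlur",
    withInputParameters: ["inputRadius": 4.0])
```

Only when a CIContext is eventually asked to render `blurred` does any pixel processing happen.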

The next class you need to know about is CIFilter. Filters, as we've discussed, are objects that operate on images and produce a changed version of their contents. A CIFilter holds a dictionary of the attributes it needs to perform its work. For example, if you had a filter that adjusted RGBA color values, your CIFilter instance would hold a value for each of those attributes.
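Here's a sketch of that attribute dictionary in action (assuming a CIImage named ciImage is in scope; the filter name and values are just examples). You create a filter by its string name and set its attributes one key at a time:

```swift
// Sketch only: assumes a CIImage named `ciImage` is in scope.
// CIColorControls adjusts saturation, brightness, and contrast.
if let filter = CIFilter(name: "CIColorControls") {
    filter.setValue(ciImage, forKey: kCIInputImageKey)
    filter.setValue(1.2, forKey: kCIInputSaturationKey)
    filter.setValue(0.1, forKey: kCIInputBrightnessKey)
    let output = filter.outputImage // still instructions, not pixels
}
```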

The final class we are going to talk about is the CIContext class. Most of the iOS frameworks that do drawing use contexts. If you were to move on from here and work with Core Animation or OpenGL, you would encounter their own flavors of context. A context is basically the thing the drawing is performed on. Since we want to draw the photo as altered by the filter, we need a place to do that, and that place is the context.

Any time we want to add a filter to an image, we will go through the following steps:

  1. Create a CIImage object. We can do this several different ways: by URL reference, by loading image data directly, or by receiving image data from a Core Video pixel buffer.

  2. Create a CIContext to render your CIImage into. CIContext objects are expensive to create and should be reused, so you don't need a new context for every single thing you are doing.

  3. Create a CIFilter instance to apply to your image. This is the step where you will set all of the properties, the number of which will vary based on which filter you are using.

  4. Receive the filter output. This is the end of the processing pipeline where you will take possession of your shiny new filtered image.
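Put together, the four steps look something like this sketch (the function name, filter, and parameter values are just examples, not code from our project):

```swift
// Sketch of the four-step pipeline applied to a UIImage.
func sepiaVersionOf(photo: UIImage) -> UIImage? {
    // 1. Wrap the source image in a CIImage.
    guard let input = CIImage(image: photo) else { return nil }
    // 2. Create a context to render into (reuse this in real code).
    let context = CIContext(options: nil)
    // 3. Create and apply a filter.
    let filtered = input.imageByApplyingFilter("CISepiaTone",
        withInputParameters: ["inputIntensity": 0.8])
    // 4. Render the output and take possession of the result.
    let cgImage = context.createCGImage(filtered, fromRect: filtered.extent)
    return UIImage(CGImage: cgImage)
}
```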

Spiffy! We have a brand-new image. Wait, what do we do with it? An image isn’t like a car coming off the assembly line. We can’t touch, hold, or feel it. How do we retain this image after it pops off the conveyor belt?

There are several ways of doing this, but since we already know that we are not working on a video or exporting this to an OpenGL project, the best way for us to take possession of our filtered image is to use CIContext's createCGImage(_:fromRect:). We could also use UIImage's init(CIImage:), which is slightly easier, but it performs poorly. It's important to consider the most efficient way of doing something, not just the easiest way.
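The difference looks like this in code (a sketch, assuming filtered is a CIImage and ciContext is a reusable CIContext):

```swift
// Sketch only: assumes `filtered` (CIImage) and `ciContext` (CIContext) exist.

// Efficient: render once, up front, through a context you control and reuse.
let cgImage = ciContext.createCGImage(filtered, fromRect: filtered.extent)
let fastImage = UIImage(CGImage: cgImage)

// Convenient but slower: rendering is deferred, and a context may be
// created behind the scenes each time the image is drawn.
let slowImage = UIImage(CIImage: filtered)
```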

Filters and Filter Documentation

So far we have been talking about all of these awesome filters that exist on iOS, but we haven’t actually seen any of them yet. It’s time to dig into what some of these filters are and what they do.

There are well over a hundred filters, but they are broken down into a few categories. Here are some of the more useful ones:

  • Blurs: These are the famous (or infamous, depending on which way you look at it) effects that were central to the iOS 7 design aesthetic but could not be implemented easily by third parties until iOS 8. Oops.

  • Color Adjustments and Effects: These filters allow you to adjust your colors in a controlled way to either correct your projects or let you do complex effects with color.

  • Compositing: If you have not played with compositing blend modes, either in programming or in Photoshop, you are in for some fun. These are powerful filters that allow you to create some complex effects. One of the authors used these blend modes to add color and complex shading to a black-and-white manga scan, with one layer holding the color and an underlying layer containing the black-and-white drawing. It was really cool.

A complete list of these filters is available in the Core Image Filter Reference in the Xcode documentation. If you are interested in seeing the kind of code associated with how these effects are created, check out GPUImage.[2] GPUImage is an open source framework for image processing that contains many filters similar to the ones in Core Image. The difference is that you can read how its shaders were written, get an idea of how that awesome effect you are using was put together, and learn how to modify it and roll your own. Writing shaders is beyond the scope of this book, but if you are interested in doing so, this is an invaluable resource.

Adding a Filter to Our Photos

All right, enough talk. Let’s go ahead and add our filter to our project.

We need to import the framework we are using. Go to the top of the root view controller and import the Core Image framework:

 import CoreImage

We need to modify our closure within createTweetForAsset. Replace the resultHandler closure with the following:

 1: if let image = image, var ciImage = CIImage(image: image)
      where SLComposeViewController.isAvailableForServiceType(
        SLServiceTypeTwitter) {
      ciImage = ciImage.imageByApplyingFilter("CIPixellate",
 5:     withInputParameters: ["inputScale": 25.0])
      let ciContext = CIContext(options: nil)
      let cgImage = ciContext.createCGImage(ciImage,
        fromRect: ciImage.extent)
      let tweetImage = UIImage(CGImage: cgImage)
10:   let tweetVC = SLComposeViewController(forServiceType:
        SLServiceTypeTwitter)
      tweetVC.setInitialText("Here's a photo I tweeted. #pragsios9")
      tweetVC.addImage(tweetImage)
      dispatch_async(dispatch_get_main_queue(), { () -> Void in
15:     self.presentViewController(tweetVC, animated: true,
          completion: nil)
      })
    }
  })
  1. We need to start with a CIImage based on the UIImage we received from the Photos framework. The CIImage initializer that takes a UIImage is failable, meaning it could give us back nil, so we add it to the if let on line 1. This CIImage is the starting point for applying our filter to our photo.

  2. Lines 4--5 create and apply our filter. For this example, we chose the easy-to-use (and easy-to-see!) CIPixellate, but there are over a hundred filters to choose from. If you want to apply a different filter, feel free to do so; just look up its string name and its required parameters in the Core Image Filter Reference, and substitute them in the imageByApplyingFilter call.

  3. On line 6, we need to create a CIContext. Without a context, we won’t be able to draw anything to our screen because our CIImage doesn’t actually contain any pixels. It is a set of instructions to be passed to our CIContext, and if we don’t have one, our work will go nowhere.

  4. A CIContext can’t create a UIImage directly, but it can provide its lower-level bitmap equivalent, the CGImage, on lines 7--8. From that, line 9 can easily create a UIImage, tweetImage.

Finally, we mustn’t forget to have the SLComposeViewController use our new tweetImage. Update the call to addImage like this:

 tweetVC.addImage(tweetImage)

Run your application and try posting another photo. The photo you see should have the pixelation filter applied to it.
