
Object Overlay on Video

We always welcome guest bloggers on Cocoanetics.com. This edition was prepared by Tejas Jasani and covers how to overlay objects on video using Brad Larson’s GPUImage project.


The main objective of this post is to describe how to overlay objects such as text and images on a video. Since iOS 4 came out there has been a better approach for doing this, and it is high time I showed you how.

In the following demo I describe how to overlay text and an image on a video at a particular time, using the GPUImage framework.

Step 1:

Create a new Xcode project and name it ObjectOverlayOnVideoDemo. It contains one UIViewController in the Main.storyboard file.

Step 2:

You can download the GPUImage framework from https://github.com/BradLarson/GPUImage. After downloading, copy the Source folder found inside the framework folder and add it to your project’s ThirdParty folder.

GPUImage Framework

The GPUImage framework is used for image and video processing. It is built on OpenGL ES and provides a range of classes for applying filters to images and videos.

It allows you to apply filters to still images, video files, the live video camera and the live still camera.
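As a quick taste of the API before we start, here is a minimal sketch of filtering a still image. The asset name is hypothetical, and imageByFilteringImage: is the convenience method GPUImage’s filter classes inherit from GPUImageOutput:

```objc
#import "GPUImage.h"

// Apply a sepia filter to a still image in one call.
// "sample.png" is a hypothetical asset in the app bundle.
UIImage *inputImage = [UIImage imageNamed:@"sample.png"];
GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
UIImage *filteredImage = [sepiaFilter imageByFilteringImage:inputImage];
```

The same filter objects can just as easily sit in a chain that processes movie files, which is what we do in this tutorial.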

Step 3:

Add the ThirdParty folder to your application by dragging it in from the Finder.

Step 4:

Add a video to your project’s main bundle. In our example it is Missile_Preview.m4v.

Step 5:

Now you can use the GPUImage classes by importing the GPUImage.h header. Import GPUImage.h in your ViewController.h file.

#import "GPUImage.h"

Now declare GPUImageMovie, GPUImageFilter, GPUImageMovieWriter and GPUImageUIElement instance variables in your ViewController.h file.

GPUImageMovie *movieFile;
GPUImageFilter *filter;
GPUImageMovieWriter *movieWriter;
GPUImageUIElement *uiElementInput;

The GPUImageMovie class is used to load the movie to which we want to apply a filter. The initWithURL: method loads the movie; pass the URL of the movie you want to load as its parameter.

The GPUImageFilter class is used to apply a specific filter to a video, an image or a live camera stream. GPUImageFilter is the superclass of all filter classes, such as GPUImageBrightnessFilter and GPUImageSepiaFilter.

The GPUImageMovieWriter class is used to write the movie to the Documents directory after the filter has been applied to the GPUImageMovie. Provide the URL where you want to save the final movie in the initWithMovieURL:size: initializer of GPUImageMovieWriter, along with the size of the output movie in its size parameter.

The GPUImageUIElement class is used to overlay any UIKit element or view on the video or image.
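Before wiring things up in the next steps, it may help to see how these classes connect. In GPUImage, sources and filters are chained with addTarget:; the following sketch mirrors the chain we build later in this tutorial:

```objc
// Sketch of the processing chain assembled in Steps 6 through 11:
//
//   movieFile --> filter --> blendFilter --> movieWriter
//   uiElementInput ------->  blendFilter
//
// The alpha blend filter receives two inputs, the filtered video frames
// and the rendered overlay view, and composites them before the movie
// writer encodes the result to disk.
```

Keeping this picture in mind makes the individual addTarget: calls below much easier to follow.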

Step 6:

Initialize movieFile with the Missile_Preview.m4v URL. The playAtActualSpeed property is set to NO so that, when writing to a file, the movie is processed as fast as possible rather than at real-time playback speed; runBenchmark logs the processing time per frame.

NSURL *sampleURL = [[NSBundle mainBundle] URLForResource:@"Missile_Preview" withExtension:@"m4v"];

movieFile = [[GPUImageMovie alloc] initWithURL:sampleURL];
movieFile.runBenchmark = YES;
movieFile.playAtActualSpeed = NO;

Step 7:

Initialize the GPUImageFilter object as a GPUImageBrightnessFilter with a brightness of 0, so the filter has no visible effect.

filter = [[GPUImageBrightnessFilter alloc] init];
[(GPUImageBrightnessFilter *)filter setBrightness:0.0];

Initialize a GPUImageAlphaBlendFilter and set its mix property to 1, so the overlay is blended at full strength.

GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
blendFilter.mix = 1.0;

Step 8:

Initialize a UIView that contains the image and text we want to overlay on the video. Set its background color to clearColor so that only its subviews show up in the composited frames.

UIView *contentView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, [[UIScreen mainScreen] bounds].size.width, [[UIScreen mainScreen] bounds].size.height-20)];
contentView.backgroundColor = [UIColor clearColor];

Initialize a UIImageView and set its frame to wherever you want the image to appear on the video. Then add it to contentView as a subview.

UIImageView *ivTemp = [[UIImageView alloc] initWithFrame:CGRectMake(20, 20, 147, 59)];
ivTemp.image = [UIImage imageNamed:@"logo.png"];
[contentView addSubview:ivTemp];

Initialize a UILabel, set its frame, text, font size, color and other properties, and set its hidden property to YES. Then add it to contentView as a subview.

UILabel *lblDemo = [[UILabel alloc] initWithFrame:CGRectMake(100, 100, 100, 30)];
lblDemo.text = @"Blast";
lblDemo.font = [UIFont systemFontOfSize:30];
lblDemo.textColor = [UIColor redColor];
lblDemo.tag = 1;
lblDemo.hidden = YES;
lblDemo.backgroundColor = [UIColor clearColor];
[contentView addSubview:lblDemo];

Step 9:

Initialize a GPUImageUIElement object with contentView. Add blendFilter as a target of filter, add blendFilter as a target of the GPUImageUIElement object, and add filter as a target of movieFile.

uiElementInput = [[GPUImageUIElement alloc] initWithView:contentView];
[filter addTarget:blendFilter];
[uiElementInput addTarget:blendFilter];
[movieFile addTarget:filter];

Step 10:

Inside setFrameProcessingCompletionBlock, unhide lblDemo once the frame time reaches 2 seconds, so the label appears in the video from that point on. Also call the update method of the GPUImageUIElement object on every frame, so the overlay is re-rendered in sync with the video.

[filter setFrameProcessingCompletionBlock:^(GPUImageOutput *filter, CMTime frameTime) {
   if (frameTime.value / frameTime.timescale == 2) {
      [contentView viewWithTag:1].hidden = NO;
   }
   [uiElementInput update];
}];

Step 11:

Define the output video path and call unlink (declared in <unistd.h>) so that, if a video already exists at that path, it is deleted before the new video is written.

NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
unlink([pathToMovie UTF8String]);
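unlink is the low-level BSD call; if you prefer a Cocoa-style alternative, an equivalent sketch using NSFileManager (not part of the original demo) would be:

```objc
// Remove any previous output file at the same path; the error can be
// ignored when the file simply does not exist yet.
NSError *removeError = nil;
[[NSFileManager defaultManager] removeItemAtPath:pathToMovie error:&removeError];
```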

Initialize the movieWriter object with the URL for that path. Then add movieWriter as a target of filter and blendFilter.

movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:[NSURL fileURLWithPath:pathToMovie] size:CGSizeMake(640.0, 480.0)];
[filter addTarget:movieWriter];
[blendFilter addTarget:movieWriter];

Configure movieWriter for audio synchronization. Then call movieWriter’s startRecording method and movieFile’s startProcessing method.

movieWriter.shouldPassthroughAudio = YES;
movieFile.audioEncodingTarget = movieWriter;
[movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];
[movieWriter startRecording];
[movieFile startProcessing];

Step 12:

Inside movieWriter’s setCompletionBlock, save the video to the device’s camera roll.

[movieWriter setCompletionBlock:^{
   UISaveVideoAtPathToSavedPhotosAlbum(pathToMovie, nil, nil, nil);
}];

Step 13:

With all the video editing code in place, add a UIButton to the UIViewController, set its title to StartProcessing, and on its Touch Up Inside event call the editVideo method and hide the StartProcessing button.

- (IBAction)btnStartProcessingClicked:(id)sender
{
   [sender setHidden:YES];
   [self editVideo];
}

Download the demo project from GitHub.

Note: Please run this demo on a device, not in the simulator, for correct results.

Author Bio:

Tejas Jasani is the founder and CEO of the iOS development company The APP Guruz. His major focus is on improving mobile users’ smartphone experience through the development of mobile games and apps.


Categories: Recipes
