ios - Swift - Real-time images from camera to server -


I hope you can help me! I'm making an app that sends frames from the camera to a server, and the server processes them. The app sends 5-8 images per second (as NSData).

I have tried different ways to do that. Two methods work, but each has a different problem. Let me explain both situations; maybe you can help me.

The first approach I tried uses AVCaptureVideoDataOutput.

The code is below:

let captureSession = AVCaptureSession()
captureSession.sessionPreset = AVCaptureSessionPresetiFrame960x540
captureSession.addInput(AVCaptureDeviceInput(device: captureDevice, error: &error))

let output = AVCaptureVideoDataOutput()
output.videoSettings = [kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_32BGRA]

let cameraQueue = dispatch_queue_create("cameraQueue", DISPATCH_QUEUE_SERIAL)
output.setSampleBufferDelegate(self, queue: cameraQueue)
captureSession.addOutput(output)

videoPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
videoPreviewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
videoPreviewLayer?.frame = view.layer.bounds
viewPreview?.layer.addSublayer(videoPreviewLayer!)

captureSession.startRunning()

The view controller adopts the AVCaptureMetadataOutputObjectsDelegate and AVCaptureVideoDataOutputSampleBufferDelegate protocols, and implements the delegate method:

    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBufferRef!, fromConnection connection: AVCaptureConnection!) {
        let imagen: UIImage = imageFromSampleBuffer(sampleBuffer)
        let dataImg: NSData = UIImageJPEGRepresentation(imagen, 1.0)
        // Here I send the NSData to the server correctly.
    }
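For reference, the "send the NSData to the server" step could look something like this minimal sketch using NSURLSession (the upload URL and the sendFrame function name are placeholders of mine, not part of the original code):

```swift
import Foundation

// Sketch only: POSTs one JPEG frame to a hypothetical endpoint.
func sendFrame(dataImg: NSData) {
    let request = NSMutableURLRequest(URL: NSURL(string: "http://example.com/upload")!)
    request.HTTPMethod = "POST"
    request.setValue("image/jpeg", forHTTPHeaderField: "Content-Type")

    // uploadTaskWithRequest streams the body data off the main thread.
    let task = NSURLSession.sharedSession().uploadTaskWithRequest(request, fromData: dataImg) {
        (data, response, error) in
        if error != nil {
            println("Upload failed: \(error)")
        }
    }
    task.resume()
}
```

At 5-8 frames per second it is worth reusing one shared session like this rather than creating a new connection object per frame.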

That method calls imageFromSampleBuffer, which converts the sample buffer into a UIImage:

    func imageFromSampleBuffer(sampleBuffer: CMSampleBufferRef) -> UIImage {
        let imageBuffer: CVImageBufferRef = CMSampleBufferGetImageBuffer(sampleBuffer)
        CVPixelBufferLockBaseAddress(imageBuffer, 0)
        let baseAddress: UnsafeMutablePointer<Void> = CVPixelBufferGetBaseAddressOfPlane(imageBuffer, Int(0))

        let bytesPerRow: Int = CVPixelBufferGetBytesPerRow(imageBuffer)
        let width: Int = CVPixelBufferGetWidth(imageBuffer)
        let height: Int = CVPixelBufferGetHeight(imageBuffer)

        let colorSpace: CGColorSpaceRef = CGColorSpaceCreateDeviceRGB()

        let bitsPerComponent: Int = 8
        let bitmapInfo = CGBitmapInfo(rawValue: CGBitmapInfo.ByteOrder32Little.rawValue | CGImageAlphaInfo.PremultipliedFirst.rawValue)
        let newContext: CGContextRef = CGBitmapContextCreate(baseAddress, width, height, bitsPerComponent, bytesPerRow, colorSpace, bitmapInfo)

        let imageRef: CGImageRef = CGBitmapContextCreateImage(newContext)
        let resultImage = UIImage(CGImage: imageRef, scale: 1.0, orientation: UIImageOrientation.Right)!

        // Note: the base address must be unlocked again; without this the
        // pixel buffers are never recycled and memory keeps growing.
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0)
        return resultImage
    }

That's the whole first method. The problem is runaway memory use: the app crashes after about two minutes.

I debugged it, and the problem seems to be in UIImageJPEGRepresentation(imagen, 1.0). Is there some way to release the memory after using that method?
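One thing worth trying (a sketch, not something I have verified against your exact project): UIImageJPEGRepresentation and the intermediate UIImage are autoreleased objects, and on a busy serial capture queue the autorelease pool may not drain between frames. Wrapping the per-frame work in an explicit autoreleasepool forces the temporaries to be freed after every frame:

```swift
func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBufferRef!, fromConnection connection: AVCaptureConnection!) {
    // Drain the temporaries created for this frame as soon as it is handled,
    // instead of letting them pile up on the capture queue.
    autoreleasepool {
        let imagen: UIImage = self.imageFromSampleBuffer(sampleBuffer)
        let dataImg: NSData = UIImageJPEGRepresentation(imagen, 1.0)
        // send dataImg to the server as before
    }
}
```

Independently of the pool, a compression quality below 1.0 (say 0.6-0.8) would also shrink each NSData considerably, which helps both memory pressure and upload bandwidth.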

The second approach (and I think the best one I found) uses AVCaptureStillImageOutput.

The code is below:

var stillImageOutput: AVCaptureStillImageOutput = AVCaptureStillImageOutput()
if session.canAddOutput(stillImageOutput) {
    stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
    session.addOutput(stillImageOutput)
    self.stillImageOutput = stillImageOutput
}

var timer = NSTimer.scheduledTimerWithTimeInterval(0.2, target: self, selector: Selector("methodToBeCalled"), userInfo: nil, repeats: true)

func methodToBeCalled() {
    dispatch_async(self.sessionQueue!, {
        // Update the orientation on the still image output video connection before capturing.
        let videoOrientation = (self.previewView.layer as! AVCaptureVideoPreviewLayer).connection.videoOrientation
        self.stillImageOutput!.connectionWithMediaType(AVMediaTypeVideo).videoOrientation = videoOrientation
        self.stillImageOutput!.captureStillImageAsynchronouslyFromConnection(self.stillImageOutput!.connectionWithMediaType(AVMediaTypeVideo), completionHandler: {
            (imageDataSampleBuffer: CMSampleBuffer!, error: NSError!) in
            if error == nil {
                let dataImg: NSData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)
                // Here I send the NSData to the server correctly.
            } else {
                println(error)
            }
        })
    })
}

This works without memory leaks, but every time the app captures an image the phone plays the typical "photo taken" shutter sound, and I can't allow that. Is there a way to capture without the sound?
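Worth knowing: the still-capture shutter sound is played by the system, and in some regions it cannot legally be silenced, so the usual workaround is exactly your first approach - AVCaptureVideoDataOutput is silent. To get the 5-8 fps rate there without an external timer, you could throttle inside the delegate by presentation timestamp (a sketch; the lastSentTime property is my own addition, not from the original code):

```swift
var lastSentTime = kCMTimeZero  // assumed stored property on the view controller

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBufferRef!, fromConnection connection: AVCaptureConnection!) {
    let timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)

    // Forward a frame only if at least ~0.15 s passed since the last one (~6-7 fps).
    if CMTimeGetSeconds(CMTimeSubtract(timestamp, lastSentTime)) < 0.15 {
        return
    }
    lastSentTime = timestamp

    // convert and send the frame as in the first approach
}
```

This keeps the camera running at full rate for the preview while only a throttled subset of frames is converted and uploaded.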

If anyone needs more code, I can share the links where I found it.

Thanks a lot!

Did you ever manage to solve this problem yourself?

I stumbled upon this question because I'm converting an Objective-C AVCaptureSession project to Swift. My code differs from yours in that it discards late frames in the AVCaptureVideoDataOutput, and the lack of that is perhaps what causes your memory problem:

output.alwaysDiscardsLateVideoFrames = true

Insert that line right after you define the video data output and before you create the queue:

let output = AVCaptureVideoDataOutput()
output.videoSettings = [kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_32BGRA]
output.alwaysDiscardsLateVideoFrames = true
let cameraQueue = dispatch_queue_create("cameraQueue", DISPATCH_QUEUE_SERIAL)

I am, of course, referring to the first of your two solutions.
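To confirm the setting is doing its job, the same AVCaptureVideoDataOutputSampleBufferDelegate protocol also has a callback for frames that were discarded because processing fell behind. A small sketch (the droppedFrames counter is a hypothetical property of mine):

```swift
var droppedFrames = 0  // hypothetical counter property on the view controller

func captureOutput(captureOutput: AVCaptureOutput!, didDropSampleBuffer sampleBuffer: CMSampleBufferRef!, fromConnection connection: AVCaptureConnection!) {
    // Called on the capture queue whenever a late frame is thrown away.
    droppedFrames++
    println("Dropped frame #\(droppedFrames) - processing is falling behind")
}
```

If this fires constantly, the per-frame JPEG conversion is too slow for the configured preset and lowering the resolution or compression work would help.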

