[AS3+ffmpeg] PLAY MULTIPLE VIDEO FORMATS IN AIR

May 5, 2011 at 4:54 AM (Actionscript)


This is an attempt to build an AIR video player that can play not only FLV or MP4 but also AVI, MOV, MPEG, or, to put it simply, whatever ffmpeg can decode (to FLV).
This is achieved by using the NativeProcess API to run ffmpeg from Adobe AIR: the video file path is sent to ffmpeg along with the transcoding settings, and ffmpeg is asked to output FLV. The output from ffmpeg is received by AIR through standard output, and the received data is played in a Video object through NetStream's new appendBytes() method.

Download ffmpeg to try this demo. If you have downloaded the Windows build, extract the archive and look for ffmpeg.exe in the bin folder.

First, let's see how to run ffmpeg from AIR and pass it the needed arguments.

Start by checking whether your platform supports NativeProcess:

function checkForNPSupport():Boolean
		{
			//NativeProcess is only available when the AIR app is packaged with the extendedDesktop profile
			return NativeProcess.isSupported;
		}

If NativeProcess is supported, we proceed with the application; otherwise we could close the app with an alert, or revert to being a normal FLV/MP4 player (though in this demo nothing happens when NativeProcess is not supported).
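
As a rough illustration of that fallback idea (not part of this demo), assuming a Video object named video already placed on the stage, a plain progressive player is just a few lines:

function playDirectly(url:String):void
{
	//hypothetical fallback used when NativeProcess is unavailable:
	//play an FLV/MP4 file directly, no transcoding involved
	var nc:NetConnection = new NetConnection();
	nc.connect(null);
	var ns:NetStream = new NetStream(nc);
	ns.client = {onMetaData:function(info:Object):void{}};
	video.attachNetStream(ns);
	ns.play(url);//e.g. "sample.flv" or the url of a local file
}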


var process:NativeProcess;//NativeProcess instance
var pInfo:NativeProcessStartupInfo;
var args:Vector.<String>;//Arguments to be passed to the process 
//
var stream:NetStream;
var connection:NetConnection;
//
var pFile:File;//File Object for ffmpeg
var vfile:File;//File Object for video file
var file:File = File.applicationDirectory;
//
function init_the_process():void
{

	pFile = file.resolvePath("ffmpeg.exe");//ffmpeg.exe has been placed in the application directory
	pInfo = new NativeProcessStartupInfo();
	pInfo.executable = pFile;
	process = new NativeProcess();
	process.addEventListener(ProgressEvent.STANDARD_ERROR_DATA,onErrorS);
	process.addEventListener(IOErrorEvent.STANDARD_INPUT_IO_ERROR,onError);
	process.addEventListener(IOErrorEvent.STANDARD_ERROR_IO_ERROR,onError);
	process.addEventListener(IOErrorEvent.STANDARD_OUTPUT_IO_ERROR,onError);
	process.addEventListener(ProgressEvent.STANDARD_OUTPUT_DATA, onOutputData);
	//process.addEventListener(ProgressEvent.STANDARD_INPUT_PROGRESS, inputProgressListener);
	start_netConnection();
}
function onError(evt:IOErrorEvent):void
{
//Any error in writing input/output pipes will trigger this function
	trace(evt.text);
}
function onErrorS(evt:ProgressEvent):void
{
	//Any error or information from ffmpeg during decoding is sent through STANDARD_ERROR_DATA
	//(ffmpeg deliberately writes its log and progress output to stderr so that stdout stays free for the media data)
	var n:Number = process.standardError.bytesAvailable;
	var s:String = process.standardError.readUTFBytes(n);
	//debug_info.appendText("INFO -"+s+"\n");
}
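//A hedged sketch, not part of the original demo: the stderr text read above can be
//parsed for basic media info, e.g. the "Duration: HH:MM:SS.xx" line that ffmpeg prints.
//parseDuration is a hypothetical helper name; call it from onErrorS with the string 's'.
function parseDuration(stderrText:String):Number
{
	var re:RegExp = /Duration:\s*(\d+):(\d+):(\d+(?:\.\d+)?)/;
	var m:Object = re.exec(stderrText);
	if (m == null)
	{
		return -1;//the duration line was not in this chunk of output
	}
	return Number(m[1]) * 3600 + Number(m[2]) * 60 + Number(m[3]);//duration in seconds
}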
function inputProgressListener(evt:ProgressEvent):void
{
}
function onOutputData(evt:ProgressEvent):void
{
//The decoded flv data is received here
	if (process.running)
	{
		var videoStream : ByteArray = new ByteArray();
		process.standardOutput.readBytes(videoStream,0,process.standardOutput.bytesAvailable);
//Since the netstream is in data generation mode you can pass a ByteArray into the netStream instance
		stream.appendBytes(videoStream);
	}
}
function start_netConnection():void
{	
	connection = new NetConnection();
	connection.addEventListener(NetStatusEvent.NET_STATUS, onNetstatusHandler);
	connection.addEventListener(SecurityErrorEvent.SECURITY_ERROR, securityErrorHandler);
	connection.connect(null);		
}
function onNetstatusHandler(evt:NetStatusEvent):void
{
	
	//debug_info.appendText("INFO -"+(evt.info.code)+"\n");
	switch (evt.info.code)
	{
		case "NetConnection.Connect.Success" :
			start_decode_process();
			break;
		case "NetStream.Play.StreamNotFound" :
			//Stream not found initiate necessary action
			break;
	}
}
function securityErrorHandler(event:SecurityErrorEvent):void
{
	trace(event.toString());
}
function start_decode_process():void
{
	stream = new NetStream(connection);
	stream.addEventListener(NetStatusEvent.NET_STATUS, onNetstatusHandler);
	stream.client = {onMetaData:metaDataHandler};
//video - Video object placed on the stage
	video.attachNetStream(stream);
//Now put the NetStream instance into data generation mode by passing null when calling 'play'...
	stream.play(null);
	args = new Vector.<String>();
	//-i  the input file; -sameq keeps the output quality the same as the input; -f flv  forces FLV as the output format;
	//"-" means the output should be sent through stdout
	//for quality settings and other arguments refer to the <a href="http://www.ffmpeg.org/ffmpeg.html">ffmpeg documentation</a>
	args.push("-i",vfile.nativePath,"-sameq","-f","flv","-");
	pInfo.arguments = args;
	if (process.running)
	{
		process.closeInput();
		process.exit();
	}
	//all the info needed to start the executable is fed into NativeProcessStartupInfo and passed as an argument when starting the NativeProcess
	process.start(pInfo);
}
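
Note: newer ffmpeg builds have removed the -sameq flag (see the comments below this post). The arguments above are equivalent to running "ffmpeg -i <input> -sameq -f flv -" on the command line. A hedged variant for recent ffmpeg versions, using -qscale as a rough quality substitute as one commenter suggests, could be:

	//assumption: a recent ffmpeg build where -sameq no longer exists
	args = new Vector.<String>();
	args.push("-i",vfile.nativePath,"-qscale","0","-f","flv","-");
	pInfo.arguments = args;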

Now let's select a video file and play it in the Video object added to the stage:

function browseVideo(evt:MouseEvent):void
{
	if (! vfile)
	{
		vfile = new File();
	}
	vfile.addEventListener(Event.SELECT, onSelectHandler);
	vfile.browseForOpen("Select Video File");
}
function onSelectHandler(event : Event):void
{
	if (process && process.running)
	{
		process.closeInput();
		process.exit();
	}
	init_the_process();
}

You can get all the video info from the onMetaData callback. Now let's adjust the Video object's dimensions to fit our stage:

var screen_width:Number = 480;
var screen_height:Number = 360;
function metaDataHandler(infoObject:Object):void
{
	var w:Number = infoObject.width;
	var h:Number = infoObject.height;
	var whR:Number = h / w;
	var hwR:Number = w / h;
	var vw:Number;
	var vh:Number;
	if (w >= h)
	{
		vw = Math.min(screen_width, w);
		vh = vw * whR;
	}
	else
	{
		vh = Math.min(screen_height, h);
		vw = vh * hwR;
	}
	video.width = vw;
	video.height = vh;
	video.x = (screen_width - vw) / 2;
	video.y = (screen_height - vh) / 2;
}

Now here is the full source of this demo. This is the document class; if you are trying it out in Flash, create a Video object, a debug text field and a file-browse button on the stage (named video, debug_info and open_btn), or modify the class if you want to create them dynamically in script (a sketch of that follows).
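
If you would rather create those instances in code, here is a minimal sketch (hypothetical, not part of the demo; it assumes you also import flash.text.TextField and declare video, debug_info and open_btn as class members instead of stage instances):

	private function buildUI():void
	{
		video = new Video(480,360);
		addChild(video);
		debug_info = new TextField();
		debug_info.width = 480;
		debug_info.height = 100;
		debug_info.y = 360;
		debug_info.multiline = true;
		addChild(debug_info);
		//any DisplayObject with a click listener will do as the open button; a Sprite is used here for simplicity
		open_btn = new Sprite();
		open_btn.graphics.beginFill(0x333333);
		open_btn.graphics.drawRect(0,0,100,24);
		open_btn.graphics.endFill();
		open_btn.y = 470;
		open_btn.buttonMode = true;
		open_btn.addEventListener(MouseEvent.CLICK,browseVideo);
		addChild(open_btn);
	}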

package 
{
	import flash.display.Sprite;
	import flash.events.*;
	import flash.utils.*;
	import flash.media.*;
	import flash.filesystem.File;
	import flash.desktop.NativeProcess;
	import flash.desktop.NativeProcessStartupInfo;
	import flash.net.*;
	public class Main extends Sprite
	{
		private var pFile:File;
		private var vfile:File;
		private var process:NativeProcess;
		private var pInfo:NativeProcessStartupInfo;
		private var args:Vector.<String>;
		//
		//var video:Video;
		private var stream:NetStream;
		private var connection:NetConnection;
		private var file:File = File.applicationDirectory;
		public function Main()
		{
			// constructor code
			if (checkForNPSupport())
			{
				pFile = file.resolvePath("ffmpeg.exe");
				//open_btn - button object created on stage
				open_btn.buttonMode = true;
				open_btn.addEventListener(MouseEvent.CLICK,browseVideo);
			}
			else
			{
				open_btn.enabled = false;
				debug_info.appendText("Multi Video Format Player Not Supported!\n");
				//To Do - show an alert that the application is not supported and close it
			}
		}
		private function checkForNPSupport():Boolean
		{
			//NativeProcess is only available when the AIR app is packaged with the extendedDesktop profile
			return NativeProcess.isSupported;
		}
		private function init():void
		{
			if (process && process.running)
			{
				process.closeInput();
				process.exit();
			}
			init_the_process();			
		}
		private function init_the_process():void
		{
			debug_info.appendText("PROCESS_INIT\n");
			pInfo = new NativeProcessStartupInfo();
			debug_info.appendText(pFile.nativePath+" file path\n");
			pInfo.executable = pFile;
			debug_info.appendText(pFile.nativePath+"\n");
			process = new NativeProcess();
			process.addEventListener(ProgressEvent.STANDARD_ERROR_DATA,onErrorS);
			process.addEventListener(IOErrorEvent.STANDARD_INPUT_IO_ERROR,onError);
			process.addEventListener(IOErrorEvent.STANDARD_ERROR_IO_ERROR,onError);
			process.addEventListener(IOErrorEvent.STANDARD_OUTPUT_IO_ERROR,onError);
			process.addEventListener(ProgressEvent.STANDARD_OUTPUT_DATA, onOutputData);
			//process.addEventListener(ProgressEvent.STANDARD_INPUT_PROGRESS, inputProgressListener);
			//debug_info.appendText("PROCESS_INIT_END\n")
			start_netConnection();
		}
		private function onError(evt:IOErrorEvent):void
		{
			trace(evt.text);
			//debug_info.text+=evt.text+"\n"
			debug_info.appendText(evt.type+":"+evt.text+"\n");
		}
		private function onErrorS(evt:ProgressEvent):void
		{
			//trace(evt.text)
			//debug_info.text+=evt.text+"\n"
			var n:Number = process.standardError.bytesAvailable;
			var s:String = process.standardError.readUTFBytes(n);
			//debug_info.appendText("INFO -"+s+"\n");
		}
		private function inputProgressListener(evt:ProgressEvent):void
		{
			//debug_info.appendText("inputProgressListener\n");
			//process.closeInput();
		}
		private function onOutputData(evt:ProgressEvent):void
		{
			if (process.running)
			{
				var videoStream : ByteArray = new ByteArray();
				process.standardOutput.readBytes(videoStream,0,process.standardOutput.bytesAvailable);
				stream.appendBytes(videoStream);
			}
		}
		private function start_netConnection():void
		{


			connection = new NetConnection();
			connection.addEventListener(NetStatusEvent.NET_STATUS, onNetstatusHandler);
			connection.addEventListener(SecurityErrorEvent.SECURITY_ERROR, securityErrorHandler);
			connection.connect(null);

		}
		function onNetstatusHandler(event:NetStatusEvent):void
		{
			trace(event.toString());
			debug_info.appendText("INFO -"+(event.info.code)+"\n");
			switch (event.info.code)
			{
				case "NetConnection.Connect.Success" :
					start_decode_process();
					break;
				case "NetStream.Play.StreamNotFound" :
					trace("Stream not found: ");
					debug_info.appendText("Stream not found: \n");
					break;
			}
		}

		function securityErrorHandler(event:SecurityErrorEvent):void
		{
			trace(event.toString());
		}
		function start_decode_process():void
		{
			stream = new NetStream(connection);
			stream.addEventListener(NetStatusEvent.NET_STATUS, onNetstatusHandler);
			stream.client = {onMetaData:metaDataHandler};
			video.attachNetStream(stream);
			stream.play(null);
			args = new Vector.<String>();
			args.push("-i",vfile.nativePath,"-sameq","-f","flv","-");
			pInfo.arguments = args;
			debug_info.appendText(pInfo.arguments.length+"\n");
			if (process.running)
			{
				process.closeInput();
				process.exit();
			}
			process.start(pInfo);
		}
		function browseVideo(evt:MouseEvent):void
		{
			if (! vfile)
			{
				vfile = new File();
			}
			vfile.addEventListener(Event.SELECT, onSelectHandler);
			vfile.browseForOpen("Select Video File");
		}
		function onSelectHandler(event : Event):void
		{			
			init();
		}

		function metaDataHandler(infoObject:Object):void
		{
			var w:Number = infoObject.width;
			var h:Number = infoObject.height;
			var whR:Number = h / w;
			var hwR:Number = w / h;
			var vw:Number;
			var vh:Number;
			if (w >= h)
			{
				vw = Math.min(480,w);
				vh = vw * whR;
			}
			else
			{
				vh = Math.min(360,h);
				vw = vh * hwR;
			}
			video.width = vw;
			video.height = vh;
			video.x=(480-vw)/2;
			video.y=(360-vh)/2;
		}



	}

}

20 Comments

  1. shefali said,

    i m getting error [NetStatusEvent type=”netStatus” bubbles=false cancelable=false eventPhase=2 info=[object Object]] when i choose file…

    • sathesh said,

      hi,
      can you trace and post the message of event.info.code from ‘onNetstatusHandler’ function, it will help to understand the nature of error

      Thanks,
      -sathesh

      • shefali said,

        hi, thanks for replying.

        i changed this

        args.push("-i",vfile.nativePath,"-sameq","-f","flv","-");

        to this

        args.push("-i",vfile.nativePath,"-ar","22050","-b:v","2048k","-f","flv","-");

        in above code , now its working. i guess ffmpeg’s new version dont support use of ‘-sameq’ . Instead of that we can use -qscale 0 for equivalent quality.

  2. jeevs said,

    Hi,

    Very nice stuff!

    I’m just wondering how can I seek the video? And how can I get the length of video?

    • shefali said,

      u can get information like creation time, duration of video, bitrate etc using ffmpeg command “-i” “video_location.mp4”

  3. Kirill said,

    Hi. Thank you for this post. What about seek position? How to rewind the video?

  4. bool said,

    Hi,nice job,works fine,plz can we use ffpeg to stream mms live video into flash, if yes plz help, thx

  5. Chad said,

    This is awesome but i have a question…i would like to do two things here.

    1.) use webcam instead of file
    2.) add args to the FFpeg to pipe to another .exe

    also really no need to view just transcode.

    i have seen other ways but i like as3 if possible

    • sathesh said,

      Hi,

      >>1.) use webcam instead of file

      Unlike microphone actionscript’s Camera API doesn’t have option to capture video data, so you don’t have a direct way of doing this..
      But if you really want a solution then there is a workaround by using ffmpeg itself to capture from webcam and transcode it…
      arguments to achieve this will be: args.push("-r","15","-s","320x240","-f","vfwcap","-i",'0',"-f","flv","-",path+"/webcam.avi");
      here it captures the webcam and one output transcode it to flv and sends the stream to AIR app and another output transcode it to avi and save it to defined path
      -r 15 –> framerate
      -s 320×240 –> size
      -f vfwcap –> VfW (Video for Windows) capture input device[if u r using linux then go for video4linux2 instead of vfwcap]
      -i —> input [‘0’ indicates the capture driver number, ranging from 0 to 9 ]
      -f flv –> output format
      “-” –> output to stdout[we are doing this to display the capture video in air app]
      path+”/webcam.avi” –> saves second output as avi in the given path

      [Remember this captures only video, if you need to capture audio to then we need to use direct show capture device instead of vfwcap]

      >>2.) add args to the FFpeg to pipe to another .exe
      Am not sure about this let me try out and get back

      -sathesh

      • Chad said,

        the version of ffmpeg i have will use direct show audio and video in this type of format

        ffmpeg -f dshow -i video=”USB Video Device” -f dshow -i audio=”Microsoft® LifeCam Cinema(TM)” …… – | vlc

        the pipe would be at the end somthing like above as well but where you see ” – | vlc ” that is piping to vlc or another program

        you have my email could you drop me a line? i would like to know more on how you got the ” args” …

  6. jeevs said,

    Hi,

    Excellent post!!!

    but can u plz elaborate on the arguments?

    args.push("-i",vfile.nativePath,"-sameq","-f","flv","-");

    why “-sameq” is used? I searched in the ffmpeg docs but couldn’t find it there.
    And why “-” is used at the end?

    • sathesh said,

      hi,

      -sameq is used to transcode the output file with same quality as input file
      and “-” is used to tell the ffmpeg to send the output via STDOUT

      -sathesh

  7. Vineet said,

    Thank you for this post. Though I have a doubt. What you have done here is that you trans-coded the video to a flash supported format on the fly and then decoded it in flash video object.

    One other way to approach this problem could be that you create a buffer for the decoded frames(rather than trans-coding) by ffmpeg, get a memory reference to them and directly display them on screen.

    What do you think??

  8. Polaco said,

    Have you ever tried of writing the video file bytes to the stdin of the native process?
    I mean instead of:
    args.push("-i",vfile.nativePath,"-sameq","-f","flv","-");
    doing something like this:
    args.push("-i","-","-sameq","-f","flv","-");
    and then sending the video file bytes through the stdin.
    I have had no luck with it yet.

    thanks

    • justflash said,

      yes, i have tried that and it worked fine for me…send me your code i’ll check on what causing the issue for you…

  9. Polaco said,

    If you would like to pause the conversion/playback of the video, one option would be to kill the process and restart it again passing it the playback “from” value. Do you think there would be a better way?

    • justflash said,

      hi, we can simply call stream.pause() and stream.resume() to achieve pause and resume of video playback…

      • Polaco said,

        I agree, but suppouse you want to stream the output of ffmpeg to a remote flashclient (a tablet client for instance) so the remote client could ask the “server” for more data when it’s buffer is getting empty and then the server continues the conversion of the file for the next X bytes sends it to the client and pauses the conversion. I guess that maybe we could use some temp file to store the output of ffmpeg and stream it as needed.
        I think that several problems could happen if I send all the video data to the tablet client at once since if it’s a long video could blow up the tablet client. So that’s why I want to stream by chunks.

    • justflash said,

      Though i haven’t tried on this I’m sure there will a better way to achieve this…will try out and let you know if i could find any better solution for this…

  10. Polaco said,

    Nice! Very interesting stuff!
