Starling Video extension

Update 2014-08-25: VideoTexture with StageVideo support for Windows will be released with the upcoming Adobe AIR Beta.

Update 2014-10-02: Read more about the new VideoTexture in AIR 15:

Update 2014-07-22: Since Adobe won’t fix the bug I reported and unfortunately also dropped the FLV format last month, I won’t support this extension anymore.

To cut things short: the usage is similar to the camera extension. You just have to provide the NetStream (and, if needed, an optional Rectangle for clipping the video to the Texture’s size). There is no draw/upload strategy, since Videos do not have methods for drawing; BitmapData.draw() is the only way to accomplish what we want. As with an ordinary Video (which is used internally), you control playback through the NetStream instance. The provided play(), pause() and stop() functions merely control the draw/upload methods.
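For example (a minimal sketch assuming the netStream and video instances from the full example below):

	// playback itself is driven by the NetStream ...
	netStream.pause();    // pauses decoding and audio
	netStream.resume();   // resumes playback

	// ... while the extension's play()/pause()/stop() only toggle the draw/upload loop
	video.pause();        // the NetStream keeps running, but the texture stops updating
	video.play();         // resume drawing and uploading frames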

Since Videos do not dispatch Event.VIDEO_FRAME like a Camera, the instance has to check the undocumented netStream.decodedFrames property on every frame, but will only update the BitmapData/Texture if a new frame is available.
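Conceptually, that check boils down to something like the following sketch (a simplified illustration, not the extension’s actual code; lastFrameCount, nativeVideo, bitmapData and flashTexture are hypothetical members):

	private var lastFrameCount:Number = 0;

	private function enterFrameHandler(e:Event):void
	{
		// only redraw/upload when a new frame has actually been decoded
		if (netStream.decodedFrames == lastFrameCount)
			return;
		lastFrameCount = netStream.decodedFrames;
		bitmapData.draw(nativeVideo);                  // the internal flash.media.Video attached to the NetStream
		flashTexture.uploadFromBitmapData(bitmapData); // flash.display3D.textures.Texture
	}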

Note: Just like the WebcamVideo extension, I built this one for the very rare cases where you need a Video within/behind Starling. Even though there seems to be no faster approach, performance-wise this is even worse than pushing a camera image to the GPU. That’s simply because a video has to be decoded first, and especially mobile devices have a hard time with that alone. As a result you’ll soon run into frame drops and audio glitches on mobile devices if the quality of the source is too high.

To make this clearer, here is a quote from Tinic Uro:

It’s not recommended that you mix video playback and Stage3D at the same time.

Of course this situation is even worse on mobile where mixing the two will essentially end up as a slide show. There are also technical limitations which prevent us from making this work properly on the most common mobile chipsets even if we had a dedicated code path for video.

It does not make a difference performance-wise if you use BitmapData.draw or have something like Video.bitmapData. It would be API sugar, nothing else. BitmapData.draw is very fast if you render at 100% scale. In the end you still need to either 1. decode in software or 2. copy back from the video card and 3. do the YUV2RGB conversion in software. That’s where 90% of the time is spent.

The only way we could ever make this reasonably fast is with a Context3DTextureFormat.YUV. And that’s simply not an option today since support for that is even more fragmented and generally broken compared to compressed texture formats.

If you want to do video stick with StageVideo. But don’t mix Stage3D and StageVideo at the same time.

So you’d better think twice about whether to use this extension.

Starling Video live demo


Start the demo featuring the trailer for Nina Paley’s »Sita Sings the Blues« (720p, licensed under the CC0 license, “Public Domain”)

Actually, I’m not sure how fast my server’s connection is. I hope it works out for you, since I didn’t add any buffering or controls.

If it doesn’t, you can test the code below locally or check the following link: webcam chat demo (more on that and the filters soon)

Starling Video Example Code

Basically, all you have to do is pass the NetStream to the Video and add it to the stage. For example, your Starling class could look like this:

	import de.flintfabrik.starling.display.Video;
	import flash.events.TimerEvent;
	import flash.net.NetConnection;
	import flash.net.NetStream;
	import flash.utils.Timer;
	import starling.display.Sprite;
	import starling.events.Event;
	import starling.events.ResizeEvent;
	import starling.text.TextField;

	public class StarlingVideoExample extends Sprite
	{
		private var video:Video;
		private var statsTextField:TextField;
		private var statsTimer:Timer = new Timer(1000, 0);
		private var netConnection:NetConnection;
		private var netStream:NetStream;

		public function StarlingVideoExample()
		{
			addEventListener(Event.ADDED_TO_STAGE, addedToStageHandler);
		}

		private function addedToStageHandler(e:Event):void
		{
			removeEventListener(Event.ADDED_TO_STAGE, addedToStageHandler);
			netConnection = new NetConnection();
			netConnection.connect(null); // null connection for progressive/local playback
			netConnection.client = { };
			netConnection.client.onMetaData = function ():void { };
			netStream = new NetStream(netConnection);
			netStream.client = netConnection.client; // onMetaData is dispatched on the stream's client
			netStream.play("example.mp4");

			video = new Video(netStream);
			addChild(video);
			statsTextField = new TextField(200, 100, "CAMERA DEBUG", "Arial", 10, 0xFF00FF, true);
			statsTextField.hAlign = "left";
			statsTextField.vAlign = "top";
			statsTextField.y = 26;
			addChild(statsTextField);

			statsTimer.addEventListener(TimerEvent.TIMER, statsTimer_timerHandler);
			statsTimer.start();
			video.addEventListener(Event.RESIZE, resizeHandler);
			stage.addEventListener(ResizeEvent.RESIZE, stageResizeHandler);
		}

		private function stageResizeHandler(e:ResizeEvent):void
		{
			video.height = e.height;
			video.scaleX = video.scaleY; // keep aspect ratio
			video.x = (e.width - video.width) * .5;
			video.y = (e.height - video.height) * .5;
		}

		private function resizeHandler(e:Event = null):void
		{
			video.height = stage.stageHeight;
			video.scaleX = video.scaleY; // keep aspect ratio
			video.x = (stage.stageWidth - video.width) * .5;
			video.y = (stage.stageHeight - video.height) * .5;
		}

		private function statsTimer_timerHandler(e:TimerEvent):void
		{
			statsTextField.text = "decoded/dropped frames:\t" + netStream.decodedFrames + "/" + netStream.info.droppedFrames
				+ "\nFPS:\t" + netStream.currentFPS.toFixed(1)
				+ "\nvideo:\t" + video.width + "x" + video.height
				+ "\ntextureClass: " + video.texture.root.base
				+ "\ntexture:\t" + video.texture.root.nativeWidth + "x" + video.texture.root.nativeHeight
				+ "\ndraw:\t" + video.drawTime.toFixed(2) + " ms" + "\nupload:\t" + video.uploadTime.toFixed(2) + " ms"
				+ "\ncomplete:\t" + (video.drawTime + video.uploadTime).toFixed(2) + " ms";
		}
	}


Recording Area

Again, if you’re using Flash 11.7 / AIR 3.7 or lower, NPOT textures/RectangleTextures are not supported. In that case it’s best to crop the video image, as explained in the previous articles. To do that you can simply pass a Rectangle. This will “zoom” the image, but can increase performance tremendously.

video = new Video(netStream, new Rectangle(32, 36, 256, 128));

Video Events

The events dispatched by the Starling Video extension are the same as those of the camera class. Since a video always contains the same content (unlike a live camera feed), I doubt it makes much sense to access the BitmapData, but of course you could. However, I think there are better reasons to listen for this event, e.g. to update a cached filter, or to sync Starling content to the current video frame (though in that case you’d also have to check the netStreamInfo.droppedFrames property to get correct results).
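As a rough illustration (the extension’s actual event type constant isn’t shown here, so "frameDrawn" below is just a placeholder):

	// "frameDrawn" is a placeholder: use the event constant the extension actually dispatches
	video.addEventListener("frameDrawn", function (e:Event):void
	{
		// e.g. invalidate a cached filter, or sync Starling content to the video;
		// compare decodedFrames with netStream.info.droppedFrames to stay accurate
		trace("decoded:", netStream.decodedFrames, "dropped:", netStream.info.droppedFrames);
	});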

Scout Data


Again, a little snapshot of a Scout session. As you can see there is “a lot more yellow”, but that’s still quite okay. As in the previous example, advanced telemetry is enabled, so you can profile it yourself.


Sources are available for download at GitHub: Starling-Video

Note: Unfortunately there is a conflict between the playback of H.264 video files and Stage3D on mobile devices. So you will have to stick with .flv files on mobile until this bug is fixed.
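If you target both desktop and mobile, you could simply switch the source file at runtime (a sketch; the file names are placeholders and the check uses flash.system.Capabilities):

	// use FLV on mobile to work around the H.264/Stage3D conflict
	var platform:String = Capabilities.version.substr(0, 3); // e.g. "AND", "IOS", "WIN", "MAC"
	var isMobile:Boolean = (platform == "AND" || platform == "IOS");
	netStream.play(isMobile ? "example.flv" : "example.mp4");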