The amazing adventures of Doug Hughes

Viewing MJPEG Streams in Flex

So, I’m learning Flex at long last. The reason I’m finally taking up this challenge is that I found a project I believe it’s well suited to. You see, my family has lots of digital photos. About 9,000 at last count. We keep these stored on a NAS in my office, and all of our computers are set to show these photos as our screen savers. However, I’ve wanted a nice wireless digital picture frame that would randomly pull from this pool of images. Well, to make a long story short, I was unable to find a single wireless picture frame that could read from a Samba share. (And trust me, I’ve tried a lot of them.)

So, I finally became frustrated enough to take this into my own hands. I plan to make my own ideal digital picture frame. I’m currently working on an AIR application that will run in full screen and provide access to a range of picture frame components. Obviously, part of this will display pictures. Other components will display weather information, stats on Alagad, and pretty much anything else I want to wire into my picture frame. Eventually, I’ll buy a nice, thin, touch-screen tablet PC, configure it to run my AIR app at startup, get it professionally framed, and hang it on the wall.

Now, one of the components I want this picture frame to show is a video stream from my daughter’s baby monitor camera. Actually, it’s not really a baby monitor camera; it’s a more-or-less generic wireless surveillance camera. I don’t have the model information anymore and it’s not branded at all, but from the research I’ve done there seem to be a lot of these cameras running the same software.

The camera runs a basic web server, so I can pull up the camera and see either a still photo or a video stream in a Java applet or ActiveX control. I did a little playing around with the camera and discovered that the still photo is returned from a file called "image.jpg". Any time you request this file you get the latest image from the camera. So, my first attempt at hooking into the video camera was simply to request this photo over and over again and use it as the source for an Image object. This worked to an extent, but the Image object wouldn’t refresh quickly enough and would flicker as a result. Beyond that, I had to make a complete HTTP request for each frame of video, so the refresh rate was very slow. I was able to solve the flickering problem by putting two images on my canvas, alternately setting the source of one image while hiding the other.
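
Roughly, here’s what that double-buffered polling looked like. This is a simplified sketch, not the exact code: imageA and imageB stand for two mx:Image controls stacked in the same spot on the canvas, and the camera URL and refresh interval are placeholders for my setup.

import flash.events.Event;
import flash.events.TimerEvent;
import flash.utils.Timer;
import mx.controls.Image;

private var showingA:Boolean = true;
private var pollTimer:Timer = new Timer(500); // ask for a new frame every 500ms

private function startPolling():void {
    // Swap the images whenever the hidden one finishes loading its new frame.
    imageA.addEventListener(Event.COMPLETE, swapImages);
    imageB.addEventListener(Event.COMPLETE, swapImages);
    pollTimer.addEventListener(TimerEvent.TIMER, requestFrame);
    pollTimer.start();
}

private function requestFrame(event:TimerEvent):void {
    // Load the next frame into whichever Image is currently hidden.
    // The cache-busting query string forces a fresh request each time.
    var hidden:Image = showingA ? imageB : imageA;
    hidden.source = "http://mycamera.local/image.jpg?t=" + new Date().time;
}

private function swapImages(event:Event):void {
    // The hidden Image now holds the newest frame; show it and hide the other.
    imageA.visible = !showingA;
    imageB.visible = showingA;
    showingA = !showingA;
}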

I was happy to have accomplished that, but it just wasn’t very nice, so I decided to do a little more research. The Java applet on the camera was called xplug.class. Googling that turned up a few people who had decompiled the class and used it to discover that the camera actually streams video from a file named mjpeg.cgi. If you access this file directly you’ll simply get a long stream of binary data which makes up the video feed from the camera. Obviously the Java applet and ActiveX control could read this stream and use it to display the video feed.

A little more research revealed that, sadly, MJPEG is not a standard but more of a technique: MJPEG is really just a concatenated stream of JPEG images, one after another. I’m familiar with the JPEG format (at least to some degree) from my work on the Image Component. At the very least, I knew the first few bytes that mark the start of a JPEG image (FF D8). So, I figured I’d look at the data in the video stream to see if I could figure it out. I captured a few seconds of the video to disk using curl and opened it up in a hex editor. The first thing I noticed was that each frame of the video was delimited by the ASCII string "--video boundary--". Looking a bit further into the file, I found the markers that start an image. I tried cutting the bytes from where I thought a JPEG started through the last byte before the next video boundary into a new file, saved it as a JPEG, and was able to open it!
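
To give an idea of what that scan looks like in code, here’s a small helper that does the same thing I was doing by eye in the hex editor: walk a ByteArray of captured stream data and report where a JPEG’s start-of-image marker (FF D8) begins.

import flash.utils.ByteArray;

// Returns the offset of the first JPEG start-of-image marker (FF D8)
// at or after startAt, or -1 if no marker has arrived yet.
private function findJpegStart(buffer:ByteArray, startAt:int = 0):int {
    for (var i:int = startAt; i < buffer.length - 1; i++) {
        if (buffer[i] == 0xFF && buffer[i + 1] == 0xD8) {
            return i;
        }
    }
    return -1;
}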

I knew I could read the feed by simply starting at the top of the stream, finding the start and end of an image, cutting that image out, displaying it, and repeating the process forever.

Simeon Bateman filled in the last little piece of information I needed to make this work: you can simply assign binary data to the source property of an Image object and Flex will display the image.

So, after figuring out how to make a socket connection from Flex, I was able to parse the video feed and display it! I still had the flickering problem, but I solved it in a similar manner.
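
In outline, the component’s data handler does something like the following on each chunk of socket data. Again, this is a simplified sketch rather than the component’s exact code: videoImage stands for the Image control being updated, findJpegStart is the helper shown above, findJpegEnd is the same scan looking for the end-of-image marker (FF D9), and the real component also deals with the HTTP request and the two-image swap. Cutting at the "--video boundary--" string works the same way.

import flash.events.ProgressEvent;
import flash.net.Socket;
import flash.utils.ByteArray;

private var socket:Socket;
private var buffer:ByteArray = new ByteArray();

private function handleSocketData(event:ProgressEvent):void {
    // Append whatever just arrived to the running buffer.
    socket.readBytes(buffer, buffer.length);

    // Do we have a complete JPEG (FF D8 ... FF D9) in the buffer yet?
    var start:int = findJpegStart(buffer);
    if (start == -1) return;
    var end:int = findJpegEnd(buffer, start + 2);
    if (end == -1) return;

    // Cut the frame out of the stream and hand the raw bytes to the Image.
    var frame:ByteArray = new ByteArray();
    buffer.position = start;
    buffer.readBytes(frame, 0, (end + 2) - start);
    videoImage.source = frame;

    // Keep whatever follows the frame; it's the start of the next one.
    var rest:ByteArray = new ByteArray();
    if (buffer.bytesAvailable > 0) {
        buffer.readBytes(rest);
    }
    buffer = rest;
}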

Attached to this blog entry is the final Flex component, which connects to the camera by URL and streams the feed. Anyone who has a use for it may use it, assuming your video feed is the same format as mine. This is my first real bit of AS3/Flex development, though, so don’t be offended if I’m not doing things in the best way possible.

Here’s an example usage of the component:

<webcam:webcamImage left="0" top="0" right="0" bottom="0" host="mycamera.com" port="80" />

You can also supply username and password properties to access password protected cameras.
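
For a password-protected camera, that looks something like this (the credentials here are obviously placeholders):

<webcam:webcamImage left="0" top="0" right="0" bottom="0"
    host="mycamera.com" port="80"
    username="admin" password="secret" />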

Comments on: "Viewing MJPEG Streams in Flex" (29)

  1. From someone who knows little to nothing about the capabilities of digital frames: are there digital frames that can run AIR apps? Or, for that matter, ones that support the Flash Player? I know some of the Sony frames run Windows, but it’s unclear what you can actually do with them in terms of apps.

  2. Doug Hughes said:

    I’ve not seen anything that runs a real version of Windows. I don’t believe there are any standard consumer frames that can run AIR apps.

  3. Thanks for confirming. An "AirFrame" would be nice, but the hardware would probably end up being at least a netbook (eeePC).

  4. Joo Fernandes said:

    Doug, your attachment seems to be broken. I guess the web-tier compiler is trying to compile it instead of allowing the download.

  5. Doug Hughes said:

    @Joo – Thanks, I didn’t test the download. I’ve zipped it and you can get it now.

  6. Gordon Smith said:

    The camera data is multipart HTTP, in this case also known as MJPEG. There is no need to guess at the JPEG location in the stream. See “Motion JPEG Formats” section of http://www.jpegcameras.com/. Thanks for the component, I’m also new to Flex and needed something like this.
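
    Roughly, the start of such a response looks like this (the boundary token and headers vary from camera to camera, and not every camera sends Content-Length):

    HTTP/1.0 200 OK
    Content-Type: multipart/x-mixed-replace; boundary=myboundary

    --myboundary
    Content-Type: image/jpeg
    Content-Length: 12345

    ...one JPEG frame (FF D8 ... FF D9)...
    --myboundary
    Content-Type: image/jpeg
    ...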

  7. Hi,
    I have a problem accessing streams from a remote web server. The crossdomain.xml file is OK; I can access other media files, such as normal JPGs, via the Image component. But your component fails with Error #2044: Unhandled securityError:. text=Error #2048: Security sandbox violation: http://mysite/cross/CrossRoads.swf cannot load data from mysite:80.
    at webcamImage()[C:\Java\workspace\FlexCrossRoads\src\webcamImage.mxml:15]
    at mx.core::Container/createComponentFromDescriptor()[C:\autobuild\3.2.0\frameworks\projects\framework\src\mx\core\Container.as:3579]
    at mx.core::Container/createComponentsFromDescriptors()[C:\autobuild\3.2.0\frameworks\projects\framework\src\mx\core\Container.as:3493]
    at mx.containers::ViewStack/instantiateSelectedChild()[C:\autobuild\3.2.0\frameworks\projects\framework\src\mx\containers\ViewStack.as:1140]
    at mx.containers::ViewStack/commitProperties()[C:\autobuild\3.2.0\frameworks\projects\framework\src\mx\containers\ViewStack.as:664]
    at mx.core::UIComponent/validateProperties()[C:\autobuild\3.2.0\frameworks\projects\framework\src\mx\core\UIComponent.as:5807]
    at mx.managers::LayoutManager/validateProperties()[C:\autobuild\3.2.0\frameworks\projects\framework\src\mx\managers\LayoutManager.as:539]
    at mx.managers::LayoutManager/doPhasedInstantiation()[C:\autobuild\3.2.0\frameworks\projects\framework\src\mx\managers\LayoutManager.as:689]
    at Function/http://adobe.com/AS3/2006/builtin::apply()
    at mx.core::UIComponent/callLaterDispatcher2()[C:\autobuild\3.2.0\frameworks\projects\framework\src\mx\core\UIComponent.as:8628]
    at mx.core::UIComponent/callLaterDispatcher()[C:\autobuild\3.2.0\frameworks\projects\framework\src\mx\core\UIComponent.as:8568]
    at flash.utils::Timer/_timerDispatch()
    at flash.utils::Timer/tick()

  8. Doug Hughes said:

    @Bedy: I’ve got to be honest, I’m not really quite sure why this would be. I use this component to access data on a different domain from the one where the video is displayed. I haven’t used it in a while, but I don’t recall having to supply a crossdomain.xml file or anything else.

  9. Rodolfo Reis said:

    Hi Doug, give me a hand, buddy!

    I’m not an expert in Flex. I followed your tips exactly, but I still can’t see images on the screen.

    I created a new project and a component with your code. After that, I created an application file that has an instance of this component (like you wrote!).

    But when I run the script, nothing shows up on the screen, no images! Debugging, I can see that the script is working fine, but nothing appears on the screen… Can you explain how to properly make the images visible, please?

    Thanks!
    Rodolfo Reis

  10. Hi Doug,

    Could you please tell me, what is HOST in your code?
    The thing is, all I have is a URL which returns an MJPEG stream. That URL is "http://81.198.212.174/axis-cgi/mjpg/video.cgi?resolution=640x480&compression=30".
    Is it possible to stream it with Flash/Flex using the same approach you’ve been using?

    Thanks in advance.

  12. Nice work.
    The multipart response does tell you what the boundary string for the document will be, so you could parse that instead of assuming
    "--video boundary--"
    In my experience, "--my boundary" is far more common.
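
    For example, something along these lines would do it (a rough sketch, assuming the response headers have already been read into a string):

    // Pull the boundary token out of the multipart Content-Type header, e.g.
    // "Content-Type: multipart/x-mixed-replace; boundary=myboundary"
    private function parseBoundary(headers:String):String {
        var match:Array = headers.match(/boundary="?([^";\r\n]+)"?/i);
        return match ? "--" + match[1] : "--video boundary--"; // fall back to the old assumption
    }

    (Some cameras already include the leading dashes in the declared boundary, so you may need to check for that.)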

    Regards,
    Erik

  13. Steve Robinson Hakkabee said:

    Found the SWF file on my Axis 213 PTZ in the same folder as the MJPEG stream:
    baseurl/mjpg/video.mjpg
    can be:
    baseurl/mjpg/video.swf
    and it works easily!

  14. Hello,
    I installed the component in /src/components and loaded it using:
    xmlns:components="components.*"

    and I call the component using

    My network camera is visible using this in a browser:
    http://127.0.0.1/axis-cgi/mjpg/video.cgi?resolution=640x480

    But the Flex page is not showing anything.
    What am I doing wrong?
    Thanks a lot

  15. The problem will only be solved if it’s possible to add a crossdomain.xml file to the camera’s web server, OR by making the Flash Player think the stream is coming from your own server by proxying through it, such that http://myserver.com/crossdomain.xml serves up the policy file and a pass-through proxy accesses the camera at something like http://myserver.com/mycamera.mjpeg

    The reason it worked for the author is that the server was accessed from localhost. Flash allows socket access from localhost without treating it as a security violation.

    Hope this helps.

  16. I think Bill is exactly correct here. As indicated in the post, I’m fairly new to Flex programming. I never even finished this project, beyond creating the webcamImage component. Thus, I never tried to host this on the web or anything of the sort. It DID work for me. But I think that it’s because it was an AIR application running locally.

    Like Bill says, you should be able to use something like Apache’s mod_proxy to proxy the request to the camera through a domain you can add the crossdomain.xml file to.

    Sorry for the general abandonware of this component! I hope some people are able to get some use out of it.

  17. This is an old post, but I just thought I’d let you know that instead of looking for a video boundary to find the end of a JPG, you can just look for the JPEG end-of-image bytes (FF D9), the same way you look for the beginning bytes (FF D8). (There’s a sketch of that scan at the end of this comment.)

    The problem I have while using this is speed. I’m not sure if my camera sends back data faster or bigger images than yours, but whenever I have the connection open the GUI freezes, and it only responds again once I close the connection.

    Also, shouldn’t the line
    httpRequest += "Host: localhost:80\r\n";
    be
    httpRequest += "Host:" + host + ":" + port"\r\n";
    ?
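
    For what it’s worth, the end-marker scan is just the mirror image of the start-marker scan. A minimal sketch:

    // Returns the offset of the JPEG end-of-image marker (FF D9)
    // at or after startAt, or -1 if the frame isn't complete yet.
    private function findJpegEnd(buffer:ByteArray, startAt:int = 0):int {
        for (var i:int = startAt; i < buffer.length - 1; i++) {
            if (buffer[i] == 0xFF && buffer[i + 1] == 0xD9) {
                return i;
            }
        }
        return -1;
    }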

  18. asefsefsef said:

    httpRequest += "Host:" + host + ":" + port + "\r\n";

  19. Hi, how can I use this in IE? Do you have any example? I tried using this in IE and can’t see anything, but if I execute it in Flash MX it works well.

  20. Hi, I wrote this entry more than a year ago and have since pretty much ignored it. However, it still gets a good number of comments on a regular basis. I never expected this. A lot of people seem to have problems with the component. I can probably expand on this, but I want to know how people are using it. What do you need? What do you want? Why MJPEG streams?

    And @Andre, I think your question was addressed in earlier comments. Specifically, the component, when served via the web, won’t be able to make requests to hosts other than the one that served it. You might be able to use something like a proxy configuration in Apache to forward the request to your camera.

    Anyhow, people: let me know what you need and I’ll see if I can’t make something a bit more useful.

  22. The component download link is broken. Is the component still available anywhere?

  23. Better late than never: I’ve fixed the download link.

  24. Thank you for updating the link!
    I got it working, even though it’s a bit slow and flickery. Looking for the JPEG end marker, or using the header’s Content-Length value, might improve speed (see the sketch at the end of this comment).
    Anyone using a Foscam: the video boundary is "--ipcamera", as in:
    --ipcamera
    Content-Type: image/jpeg
    Content-Length: 38648

    As for the Flash Player security: as long as you run this app from your own PC, as opposed to hosting it on a web server, everything is fine. As soon as you publish the SWF on a server, the webcam will need to provide a valid policy file.
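
    For the Content-Length idea mentioned above, something along these lines would do (a rough sketch; partHeaders would be the header block of one part, read up to the blank line):

    // Parse the frame size out of a part's headers, e.g.
    //   Content-Type: image/jpeg
    //   Content-Length: 38648
    private function parseContentLength(partHeaders:String):int {
        var match:Array = partHeaders.match(/Content-Length:\s*(\d+)/i);
        return match ? int(match[1]) : -1;
    }

    Once you know the frame size you can wait until that many bytes have arrived after the headers and read exactly one frame with readBytes, instead of scanning for markers.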

  25. Does this code really work? I downloaded and tested it and it doesn’t work at all!
    But if it works for you, good luck!

  26. I’ve tried a version of this and have made it work in a Flex AIR application, which is not bound by crossdomain issues. It also works well on Android 🙂

    If anyone ever finds a way to also capture the raw .wav file that some cameras provide (sound)… let me know

    Two classes here: MJPEG.as and Base64

    package
    {
    import com.dynamicflash.util.Base64;

    import flash.display.Loader;
    import flash.events.Event;
    import flash.events.IOErrorEvent;
    import flash.events.ProgressEvent;
    import flash.media.Sound;
    import flash.net.Socket;
    import flash.utils.ByteArray;

    //import mx.utils.Base64Encoder;
    //import mx.utils.Base64Decoder;

    /**
    * This is a class used to view a MJPEG
    * @author Josh Chernoff | GFX Complex
    *
    */
    public class MJPEG extends Loader
    {
    private var _user:String; //Auth user name
    private var _pass:String; //Auth user password

    private var _host:String; //host server of stream
    private var _port:int; //port of stream
    private var _file:String; //Location of MJPEG
    private var _start:int = 0; //marker for start of jpg

    private var webcamSocket:Socket = new Socket(); //socket connection
    private var imageBuffer:ByteArray = new ByteArray(); //image holder
    private var outputSnd:Sound = new Sound();
    //private var myEncoder:Base64Encoder = new Base64Encoder();
    //private var tim = myEncoder.toString();

    /**
    * Creates a new instance of the MJPEG class. Note that due to a sandbox security restriction, unless you can place a crossdomain.xml
    * on the host server you will only be able to use this class in your AIR applications.
    *
    * @example import MJPEG;
    * var cam:MJPEG = new MJPEG("192.168.0.100", "/img/video.mjpeg", 80);
    * addChild(cam);
    *
    * @param host:String | Host of the server. Do not include protocol
    * @param file:String | Path to the file on the server. Start with a forward slash
    * @param port:int | Port of the host server;
    * @param user:String | User name for Auth
    * @param pass:String | User password for Auth
    */
    public function MJPEG (host:String, file:String, port:int = 80, user:String = null, pass:String = null )
    {
    _host = host;
    _file = file;
    _port = port;
    _user = user;
    _pass = pass;

    webcamSocket.addEventListener(Event.CONNECT, handleConnect);
    webcamSocket.addEventListener(IOErrorEvent.IO_ERROR, unableToConnect);
    webcamSocket.addEventListener(ProgressEvent.SOCKET_DATA, handleData);
    //outputSnd.addEventListener(SampleDataEvent.SAMPLE_DATA, processSound);
    webcamSocket.connect(host, port);
    }

    private function unableToConnect(e:Event):void {
    trace("error");
    dispatchEvent(new Event("videoError"));
    }

    private function handleConnect(e:Event):void
    {
    // we're connected; send the HTTP request
    var httpRequest:String = "GET " + _file + " HTTP/1.1\r\n";
    httpRequest += "Host: localhost:80\r\n";
    if (_user != null && _pass != null) {
    var source:String = String(_user + ":" + _pass);
    var auth:String = Base64.encode(source);
    httpRequest += "Authorization: Basic " + auth.toString() + "\r\n"; // NOTE: THIS MAY NEED TO BE EDITED TO WORK WITH YOUR CAM
    }

    httpRequest += "Connection: keep-alive\r\n\r\n";
    webcamSocket.writeMultiByte(httpRequest, "us-ascii");
    }

    private function handleData(e:ProgressEvent):void {
    //trace("Got Data!" + e);
    // get the data that we received.
    // append the data to our imageBuffer
    webcamSocket.readBytes(imageBuffer, imageBuffer.length);
    if (imageBuffer.length > 1) { // [the original condition, and the declarations of x and startMarker, were partly eaten by the comment form]
    if(_start == 0){
    //Check for start of JPG
    for (x; x < imageBuffer.length - 1; x++) {
    // get the first two bytes.
    imageBuffer.position = x;
    imageBuffer.readBytes(startMarker, 0, 2);

    // Check for start of JPG (FF D8)
    if (startMarker[0] == 255 && startMarker[1] == 216) {
    _start = x;
    break;
    }
    }
    }
    // [... the second scan loop and the rest of handleData, the end of the MJPEG class, and
    // the start of the Base64 class (com.dynamicflash.util.Base64: its class declaration,
    // BASE64_CHARS constant, and the opening of its encode routine) were eaten by the
    // comment form; the encode loop picks up again here ...]
    while (data.bytesAvailable > 0) {
    // Create new data buffer and populate next 3 bytes from data
    dataBuffer = new Array();
    for (var i:uint = 0; i < 3 && data.bytesAvailable > 0; i++) {
    dataBuffer[i] = data.readUnsignedByte();
    }

    // Convert to data buffer Base64 character positions and
    // store in output buffer
    outputBuffer[0] = (dataBuffer[0] & 0xfc) >> 2;
    outputBuffer[1] = ((dataBuffer[0] & 0x03) << 4) | ((dataBuffer[1]) >> 4);
    outputBuffer[2] = ((dataBuffer[1] & 0x0f) << 2) | ((dataBuffer[2]) >> 6);
    outputBuffer[3] = dataBuffer[2] & 0x3f;

    // If the data buffer was short (i.e. not 3 characters) then set
    // end character indexes in data buffer to index of '=' symbol.
    // This is necessary because Base64 data is always a multiple of
    // 4 bytes and is padded with '=' symbols.
    for (var j:uint = dataBuffer.length; j < 3; j++) {
    outputBuffer[j + 1] = 64;
    }

    // Loop through output buffer and add Base64 characters to
    // encoded data string for each character.
    for (var k:uint = 0; k < outputBuffer.length; k++) {
    output += BASE64_CHARS.charAt(outputBuffer[k]);
    }
    }

    // Return encoded data
    return output;
    }

    public static function decode(data:String):String {
    // Decode data to ByteArray
    var bytes:ByteArray = decodeToByteArray(data);

    // Convert to string and return
    return bytes.readUTFBytes(bytes.length);

    }

    public static function decodeToByteArray(data:String):ByteArray {
    // Initialise output ByteArray for decoded data
    var output:ByteArray = new ByteArray();

    // Create data and output buffers
    var dataBuffer:Array = new Array(4);
    var outputBuffer:Array = new Array(3);

    // While there are data bytes left to be processed
    for (var i:uint = 0; i < data.length; i += 4) {
    // Populate data buffer with position of Base64 characters for
    // next 4 bytes from encoded data
    for (var j:uint = 0; j < 4 && i + j < data.length; j++) {
    dataBuffer[j] = BASE64_CHARS.indexOf(data.charAt(i + j));
    }

    // Decode data buffer back into bytes
    outputBuffer[0] = (dataBuffer[0] << 2) + ((dataBuffer[1] & 0x30) >> 4);
    outputBuffer[1] = ((dataBuffer[1] & 0x0f) << 4) + ((dataBuffer[2] & 0x3c) >> 2);
    outputBuffer[2] = ((dataBuffer[2] & 0x03) << 6) + dataBuffer[3];

    // Add all non-padded bytes in output buffer to decoded data
    for (var k:uint = 0; k < outputBuffer.length; k++) {
    if (dataBuffer[k+1] == 64) break;
    output.writeByte(outputBuffer[k]);
    }
    }

    // Rewind decoded data ByteArray
    output.position = 0;

    // Return decoded data
    return output;
    }

    public function Base64() {
    throw new Error("Base64 class is static container only");
    }
    }
    }

  27. Hmm, where did Tim’s comment go? (Which I got from the "Notify me of follow-up comments via e-mail" option.) Even though that code didn’t work, and was severely altered in the process of posting/mailing.

  28. There it is..

Comments are closed.
