4 Replies Latest reply on Oct 3, 2016 2:34 PM by Intel Corporation

    Capture Video Stream from Edi-Cam Server using openCV or urllib in Python?

    H_E_L_P

      Greetings, I am trying to do some processing on video being streamed from an Edi-Cam server, which is written in Node.js. I have been trying to wrap my head around its structure and syntax, hoping to modify it, but it's a bit too much at the moment. My main goal is to capture the video stream in Python and process it with the OpenCV library. Looking online, there are two main solutions that should work, but neither works for me. I think my URL is incomplete.

       

      1) Use OpenCV's VideoCapture class.

       

      import cv2

      cap = cv2.VideoCapture('http://192.168.1.137:8080')
      while True:
          ret, frame = cap.read()
          if not ret:
              break  # no frame decoded
          cv2.imshow('Video', frame)
          if cv2.waitKey(1) == 27:  # Esc to quit
              exit(0)

       

      2) Use urllib2.

       

      import cv2

      import numpy as np

      import urllib2

       

      stream = urllib2.urlopen('http://192.168.1.137:8080')

      bytes = ''
      while True:
          bytes += stream.read(1024)
          a = bytes.find('\xff\xd8')  # JPEG start-of-image marker
          b = bytes.find('\xff\xd9')  # JPEG end-of-image marker
          print bytes  # debug: inspect what the server is actually sending
          if a != -1 and b != -1:
              jpg = bytes[a:b + 2]
              bytes = bytes[b + 2:]
              # cv2.CV_LOAD_IMAGE_COLOR was removed in OpenCV 3; use IMREAD_COLOR
              i = cv2.imdecode(np.fromstring(jpg, dtype=np.uint8), cv2.IMREAD_COLOR)
              cv2.imshow('i', i)
              if cv2.waitKey(1) == 27:
                  exit(0)
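      For what it's worth, the marker-scanning logic in option 2 can be factored into a pure function and sanity-checked without a live stream. This is just a sketch; the helper name extract_jpeg is mine, and the sample buffer is made up. The 0xFFD8/0xFFD9 bytes are the standard JPEG start-of-image and end-of-image markers.

```python
# Hypothetical helper that pulls the first complete JPEG out of a
# growing byte buffer, returning (jpeg, remainder). Works on bytes in
# Python 3 and str in Python 2.
def extract_jpeg(buf):
    a = buf.find(b'\xff\xd8')  # start-of-image marker
    b = buf.find(b'\xff\xd9')  # end-of-image marker
    if a != -1 and b != -1:
        return buf[a:b + 2], buf[b + 2:]
    return None, buf  # no complete frame buffered yet

# Simulated chunked stream: boundary noise, one frame, start of the next.
buf = b'--boundary\r\n\xff\xd8FRAMEDATA\xff\xd9\r\n\xff\xd8partial'
jpg, buf = extract_jpeg(buf)
print(jpg)  # the complete frame, markers included
print(buf)  # leftover bytes carried into the next read
```

      Running it on the real stream would just mean feeding stream.read(1024) chunks into the buffer instead of the made-up bytes.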

       

      Looking at what bytes contains, it's the source code of the web page, and it doesn't change; there is no video stream in it. Other examples have addresses like:

      cap = cv2.VideoCapture('http://192.168.1.137:8080/frame.mjpg')

      cap = cv2.VideoCapture('http://192.168.1.137:8080/?action=stream?dummy=param.mjpg')

       

      From my limited understanding of Edi-Cam, it uses ffmpeg to encode the video and then sets up a server on the Edison and a client in the browser, written in Node.js so as to be non-blocking. I see that three ports are used: one listening on the hardware port for video (8020), one listening for clients (8040), and one serving clients the video page (8080). It is not clear to me what else is needed in the URL to access the MPEG stream. Is there a way to scan the page for any stream URLs?
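      As a crude way to hunt for the stream URL, one option is to fetch the page served on :8080 and grep its source for anything that looks like an endpoint. A sketch of that idea, run on a made-up page source rather than the real Edi-Cam page (the helper name find_stream_candidates is mine):

```python
import re

# Hypothetical helper: scan page source for likely stream endpoints --
# src/href attribute values and ws:// WebSocket URLs.
def find_stream_candidates(html):
    pattern = r'''(?:src|href)\s*=\s*["']([^"']+)["']|(wss?://[^\s"']+)'''
    hits = []
    for m in re.finditer(pattern, html):
        hits.append(m.group(1) or m.group(2))
    return hits

# Made-up page source for illustration -- not the actual Edi-Cam page.
sample = '<img src="/frame.mjpg"><script>var s = new WebSocket("ws://192.168.1.137:8040");</script>'
print(find_stream_candidates(sample))  # both the image path and the WebSocket URL
```

      In practice you would pass it the string returned by urllib2.urlopen('http://192.168.1.137:8080').read(). If the candidates turn out to be WebSocket URLs rather than plain HTTP paths, that would also explain why cv2.VideoCapture and urllib2 find nothing.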

       

      I am using Python 2.7 and OpenCV 3.0. I am not sure if I have ffmpeg on my PC. I am running Windows 10.

       

      I am quite new to this stuff; the only similar thing I have done was set up a simple TCP server/client using the Edison to send sensor data.

       

      I have seen a clean solution to this problem on the Raspberry Pi, which uses its raspivid tool to read the camera and encode the video, and netcat to set up a TCP pipe on the Pi: Wirelessly streaming a video from a Raspberry to a remote laptop - YouTube

      I imagine all that is needed is to capture video from the camera, encode it, send it over TCP, receive it, and read a frame. Any advice on getting Edi-Cam to work, or on making another solution work, would be appreciated. I like the Edi-Cam approach because it performs very well.
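      On the roll-your-own TCP idea: raw TCP is a byte stream, so the receiver needs some way to find frame boundaries. One common sketch is to length-prefix each encoded frame; the names pack_frame/read_frame below are mine, not from any library, and the wire is simulated with an in-memory stream instead of a socket:

```python
import struct
from io import BytesIO

# Length-prefix framing: each frame goes on the wire as a 4-byte
# big-endian length followed by the encoded JPEG bytes.
def pack_frame(jpeg_bytes):
    return struct.pack('>I', len(jpeg_bytes)) + jpeg_bytes

def read_frame(stream):
    header = stream.read(4)
    if len(header) < 4:
        return None  # stream closed
    (length,) = struct.unpack('>I', header)
    return stream.read(length)

# Simulate the wire with BytesIO; a socket makefile() object would work the same way.
wire = BytesIO(pack_frame(b'\xff\xd8frame-1\xff\xd9') + pack_frame(b'\xff\xd8frame-2\xff\xd9'))
print(read_frame(wire))  # first frame's bytes
print(read_frame(wire))  # second frame's bytes
```

      Each recovered frame could then go through cv2.imdecode exactly as in option 2, which would sidestep the Edi-Cam server entirely.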