UVC #4

Open
edbek opened this issue Dec 16, 2020 · 36 comments
Labels
help wanted Extra attention is needed

Comments

@edbek commented Dec 16, 2020

Is it possible, with the help of this stack, to implement interaction with a webcam through UVC? If so, how?

@xiaocq2001 (Contributor)

Unfortunately there is no online documentation for now. The basic UVC workflow is as follows:

  1. Initialize USBX with UVC
    /*========================================================================*/
    /*= Initialize USBX with UVC supported.  */
    /*========================================================================*/

    /* Initialize USBX system. */
    status = ux_system_initialize(memory_pointer, USBX_MEMORY_SIZE, usbx_cache_safe_memory, USBX_CACHE_SAFE_MEMORY_SIZE);
    if (status != UX_SUCCESS)
        error_handler();

    /* Initialize USBX Host Stack.  */
    status =  ux_host_stack_initialize(NULL);
    if (status != UX_SUCCESS)
        error_handler();

    /* Register video class.  */
    status =  ux_host_stack_class_register(_ux_system_host_class_video_name, _ux_host_class_video_entry);
    if (status != UX_SUCCESS)
        error_handler();

    /* Register EHCI HCD.  */
    status = ux_host_stack_hcd_register(_ux_system_host_hcd_ehci_name, _ux_hcd_ehci_initialize, EHCI_BASE, 0x0);
    if (status != UX_SUCCESS)
        error_handler();
  2. Wait for a UVC device connection
    /*========================================================================*/
    /*= Wait until UVC device is connected.  */
    /*========================================================================*/

    /* Find the main video container.  */
    status = ux_host_stack_class_get(_ux_system_host_class_video_name, &host_class);
    if (status != UX_SUCCESS)
        error_handler();

    /* We get the first instance of the video device.  */
    while (1)
    {
        status = ux_host_stack_class_instance_get(host_class, 0, (void **) &inst);
        if (status == UX_SUCCESS)
            break;

        tx_thread_sleep(10);
    }

    /* We still need to wait for the video status to be live */
    while (inst -> ux_host_class_video_state != UX_HOST_CLASS_INSTANCE_LIVE)
    {
        tx_thread_sleep(10);
    }
    video = inst;
  3. Set up parameters
    /* Set video parameters to MJPEG, W x H resolution, .. fps. */
    status = ux_host_class_video_frame_parameters_set(video,
                                                      UX_HOST_CLASS_VIDEO_VS_FORMAT_MJPEG,
                                                      CAMERA_RESOLUTION_WIDTH,
                                                      CAMERA_RESOLUTION_HEIGHT,
                                                      TEN_FRAMES_PER_SECOND);
    if (status != UX_SUCCESS)
        error_handler();
  4. Start streaming
    /*========================================================================*/
    /*= Start UVC streaming.  */
    /*========================================================================*/

    /* Start video transfer. */
    status = ux_host_class_video_start(video);
    if (status != UX_SUCCESS)
        error_handler();

#if HIGH_BANDWIDTH_EHCI /* HCD driver must support adding a request list.  */
    /* Build buffer list.  */
    for (i = 0; i < VIDEO_BUFFER_NB; i ++)
        video_buffers[i] = video_buffer[i];

    /* Issue transfer request list to start streaming.  */
    status = ux_host_class_video_transfer_buffers_add(video, video_buffers, VIDEO_BUFFER_NB);
    if (status != UX_SUCCESS)
        error_handler();
#elif NORMAL_BANDWIDTH_OHCI /* Driver adds requests one by one.  */
    /* Allocate space for video buffer. */
    for(buffer_index = 0; buffer_index < VIDEO_BUFFER_NB; buffer_index++)
    {
        /* Add buffer to the video device for video streaming data. */
        status = ux_host_class_video_transfer_buffer_add(video,
                                                         video_buffer[buffer_index]);
        if (status != UX_SUCCESS)
            error_handler();
    }
#endif
  5. Handle frame data and reuse frame buffers, assuming there is a transfer-done callback that puts a semaphore (a minimal sketch of such a callback follows this code):
    /* Set transfer callback (do before start transfer). */
    ux_host_class_video_transfer_callback_set(video,
                                              video_transfer_done);

    /* Wait transfer done and re-use buffers.  */
    buffer_index = 0;
    while (1)
    {

        /* Suspend here until a transfer callback is called. */
        status = tx_semaphore_get(&data_received_semaphore, TX_WAIT_FOREVER);
        if (status != UX_SUCCESS)
            error_handler();

        /* Received data. The callback function needs to obtain the actual
           number of bytes received, so the application routine can read the
           correct amount of data from the buffer. */

        /* Application can now consume video data while the video device stores
           the data into the other buffer. */

        /* Add the buffer back for video transfer. */
        status = ux_host_class_video_transfer_buffer_add(video,
                                                         video_buffer[buffer_index]);
        if (status != UX_SUCCESS)
            error_handler();

        /* Increment the buffer_index, and wrap to zero if it exceeds the
           maximum number of buffers. */
        buffer_index = (buffer_index + 1);
        if(buffer_index >= VIDEO_BUFFER_NB)
            buffer_index = 0;
    }
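
For reference, a minimal sketch of the transfer-done callback assumed in step 5 (the semaphore name is an assumption here; the full demo later in this thread uses the same pattern):

    /* Video data received callback: runs in stack/interrupt context, so it
       only signals the application thread waiting on the semaphore.  */
    static VOID video_transfer_done(UX_TRANSFER *transfer_request)
    {
        tx_semaphore_put(&data_received_semaphore);
    }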

@edbek (Author) commented Dec 18, 2020

Thank you for the detailed answer!

@yuxin-azrtos (Contributor)

Closing this issue. Feel free to reopen if you have questions.

@yuxin-azrtos added the "help wanted" label Feb 8, 2021
@bSquaredOlea

Are there any more complete examples for this yet? I used the above code to get to the point where I can set parameters, but my device doesn't stream.

@yuxin-azrtos reopened this Dec 14, 2021
@xiaocq2001 (Contributor)

@bSquaredOlea There is no complete example project yet. Whether the stream works actually depends on your hardware and host controller driver, since isochronous (ISO) transfer is quite different from bulk and interrupt transfer. If your HCD is not ready for ISO transfer, there is no stream.

@xiaocq2001 (Contributor)

You can try the following steps to build a video example that enumerates and starts streaming on a USB 2.0 high-speed webcam (tested with "Microsoft LifeCam Studio(TM)"):

Get MIMXRT1060 Examples

Modifications in sample_usbx_host_mass_storage.c

  • Add include file
#include "ux_host_class_video.h"
  • Add global variables
/* Define the number of buffers used in this demo. */
#define VIDEO_BUFFER_NB (UX_HOST_CLASS_VIDEO_TRANSFER_REQUEST_COUNT - 1)

UX_HOST_CLASS_VIDEO                 *video;

#pragma location="NonCacheable"
UCHAR                               video_buffer[UX_HOST_CLASS_VIDEO_TRANSFER_REQUEST_COUNT][3072];

/* This semaphore is used by the callback function to signal the application
 thread that video data is received and can be processed. */
TX_SEMAPHORE data_received_semaphore;
  • Add instance check function (before demo_thread_entry)
static UINT  demo_class_video_check(void)
{

UINT status;
UX_HOST_CLASS               *host_class;
UX_HOST_CLASS_VIDEO         *inst;


    /* Find the main video container.  */
    status = ux_host_stack_class_get(_ux_system_host_class_video_name, &host_class);
    if (status != UX_SUCCESS)
        while(1); /* Error Halt  */

    /* We get the first instance of the video device.  */
    while (1)
    {
        status = ux_host_stack_class_instance_get(host_class, 0, (void **) &inst);
        if (status == UX_SUCCESS)
            break;

        tx_thread_sleep(10);
    }

    /* We still need to wait for the video status to be live */
    while (inst -> ux_host_class_video_state != UX_HOST_CLASS_INSTANCE_LIVE)
    {
        tx_thread_sleep(10);
    }

    video = inst;
    return(UX_SUCCESS);
}
  • Add video transfer done callback (before demo_thread_entry)
/* video data received callback function. */
static VOID video_transfer_done (UX_TRANSFER * transfer_request)
{

UINT status;

    status = tx_semaphore_put(&data_received_semaphore);
    if (status != UX_SUCCESS)
        while(1); /* Error Halt.  */
}
  • Add class registration (in demo_thread_entry)
    /* Register video class.  */
    status =  ux_host_stack_class_register(_ux_system_host_class_video_name, _ux_host_class_video_entry);
    if (status != UX_SUCCESS)
        return;
  • Replace the while loop code block in demo_thread_entry
UINT i;
UINT buffer_index;
UCHAR *video_buffers[VIDEO_BUFFER_NB];

    /* Assume video points to a valid video instance. */
    /* Create the semaphore for signaling video data received. */
    status = tx_semaphore_create(&data_received_semaphore, "payload semaphore", 0);
    if (status != UX_SUCCESS)
        while(1); /* Error Halt.  */

    /* Wait for camera to be plugged in.  */
    demo_class_video_check();

    /* Set transfer callback. */
    ux_host_class_video_transfer_callback_set(video,
                                              video_transfer_done);

    /* Set video parameters to MJPEG, 176x144 resolution, 30 fps (frame interval 333333). */
    status = ux_host_class_video_frame_parameters_set(video,
                                                      UX_HOST_CLASS_VIDEO_VS_FORMAT_MJPEG, 176, 144, 333333);
    if (status != UX_SUCCESS)
        while(1); /* Error Halt.  */

    /* Start video transfer. */
    status = ux_host_class_video_start(video);
    if (status != UX_SUCCESS)
        while(1); /* Error Halt.  */

    /* Build buffer list.  */
    for (i = 0; i < VIDEO_BUFFER_NB; i ++)
        video_buffers[i] = video_buffer[i];

    /* Issue transfer request list to start streaming.  */
    status = ux_host_class_video_transfer_buffers_add(video, video_buffers, VIDEO_BUFFER_NB);
    if (status != UX_SUCCESS)
        while(1); /* Error Halt.  */

    buffer_index = 0;
    while (1)
    {

        /* Suspend here until a transfer callback is called. */
        status = tx_semaphore_get(&data_received_semaphore, TX_WAIT_FOREVER);
        if (status != UX_SUCCESS)
            while(1); /* Error Halt.  */

        /* Received data. The callback function needs to obtain the actual
           number of bytes received, so the application routine can read the
           correct amount of data from the buffer. */

        /* Application can now consume video data while the video device stores
           the data into the other buffer. */

        /* Add the buffer back for video transfer. */
        status = ux_host_class_video_transfer_buffer_add(video,
                                                         video_buffer[buffer_index]);
        if (status != UX_SUCCESS)
            while(1); /* Error Halt.  */

        /* Increment the buffer_index, and wrap to zero if it exceeds the
           maximum number of buffers. */
        buffer_index = (buffer_index + 1);
        if(buffer_index >= VIDEO_BUFFER_NB)
            buffer_index = 0;
    }

@yuxin-azrtos (Contributor)

@bSquaredOlea: does the sample code help?

@bSquaredOlea commented Feb 8, 2022 via email

@xiaocq2001 (Contributor)

@bSquaredOlea Thanks for sharing the progress.
For the transactions, please note that UVC transactions use an isochronous endpoint, which works differently from control requests and bulk transfers, so only an HCD with isochronous transfer support can get video data from the device (this has been done in the EHCI HCD for the 1060). If you are working on some other chip, your HCD still needs modifications for isochronous transfer to make things right.

@xianghui-renesas

Hi, we can stream video through the Ethernet port to a PC. Is there a recommended PC-side application that can receive from the Ethernet port and display the video? Thanks!

@xiaocq2001 (Contributor)

Not for the raw video stream.

@xianghui-renesas commented Apr 28, 2023

Hi @xiaocq2001, thanks for the comment. What about MPEG format? We can output MPEG format to the PC.
Can some webcam applications be used to display the video collected through USBX, e.g. webcamiod? If not, what are some of the limiting factors? Thanks!

@xiaocq2001 (Contributor) commented Apr 28, 2023

@xianghui-renesas, I'm not sure a directly forwarded USB video stream can be recognized by webcamiod; maybe you can try. I think the video stream must be rearranged/repackaged by some web streaming protocol to allow a PC application to play it.

@xianghui-renesas

Hi @xiaocq2001, thanks! I had a quick try with webcamiod and found it primarily looks for a USB video streaming device and is unaware of the host's video packet format.
I have a specific question on the definition of TEN_FRAMES_PER_SECOND. How does it relate to fps? Appreciate any comment you can provide. Thanks!

    status = ux_host_class_video_frame_parameters_set(video,
                                                      UX_HOST_CLASS_VIDEO_VS_FORMAT_MJPEG,
                                                      CAMERA_RESOLUTION_WIDTH,
                                                      CAMERA_RESOLUTION_HEIGHT,
                                                      TEN_FRAMES_PER_SECOND);

@xiaocq2001 (Contributor)

Yes, that's the frame rate.

@xianghui-renesas commented May 8, 2023

Hi @xiaocq2001, thanks! How is an input of 333333 converted to 30 fps? What is the unit of this argument in the API? Thanks!

    /* Set video parameters to MJPEG, 176x144 resolution, 30fps. */
    status = ux_host_class_video_frame_parameters_set(video,
                                                      UX_HOST_CLASS_VIDEO_VS_FORMAT_MJPEG, 176, 144, 333333);

I can see 10000000/30 = 333333. It seems the unit of this argument is a tenth of a microsecond. Could you explain?

@xiaocq2001 (Contributor) commented May 9, 2023

Please refer to [Universal Serial Bus Device Class Definition for Video Devices: Frame Based Payload], where you can see frame intervals are in 100 ns units.
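
In other words, interval = 10,000,000 / fps. A helper like the following (my own macro, not part of USBX) captures the conversion:

    /* Convert frames per second to a UVC frame interval in 100 ns units.
       E.g. 30 fps -> 333333, 10 fps -> 1000000.  */
    #define UVC_FPS_TO_FRAME_INTERVAL(fps)    (10000000UL / (fps))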

@xianghui-renesas

Thanks @xiaocq2001 for the reference information. We are trying to stream the video to a PC through a UDP port. The PC-side application we are trying to use is VLC:
https://docs.videolan.me/vlc-user/3.0/en/advanced/streaming/stream_over_udp.html
Do you have any experience using the host video class with VLC?
One of the video formats we identified using VLC is UX_HOST_CLASS_VIDEO_VS_FORMAT_H264; how do we set up the bandwidth?
Also, a general question: how is the color channel format defined? We do not see information about this in the stack.

@xiaocq2001 (Contributor)

Unfortunately, there is no H.264 format demo for now. Maybe you can trace some existing H.264 camera for reference; there is also an H.264 format spec on usb.org (Universal Serial Bus Device Class Definition for Video Devices: H.264 Payload).

In general, USB bandwidth is managed by switching between different alternate settings.

@xianghui-renesas

Thanks @xiaocq2001, could you explain how the color channel encoding is defined in the USBX host video stack? If we collect images using the uncompressed format, what is the format of the data collected in the video buffer?
For example, with this configuration, how are the color coding and buffer data format defined?

    ux_host_class_video_frame_parameters_set(video,
                                             UX_HOST_CLASS_VIDEO_VS_FORMAT_UNCOMPRESSED,
                                             160, 120,
                                             333333);

The max payload for this setting is 384 (identified from ux_host_class_video_max_payload_get); can you help explain the format of this data so we can repackage it to send to the PC program?

@xiaocq2001 (Contributor)

Check the uncompressed format spec in https://usb.org/sites/default/files/USB_Video_Class_1_5.zip.

The supported pixel codings are listed in the spec [spec table: uncompressed formats such as YUY2 and NV12].

The format is reported by a GUID [spec table: format descriptor GUID field].

Each payload is composed of a header plus the actual data [spec table: payload header layout].

You can refer to the spec for more details.
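
As a rough orientation (my sketch from the UVC 1.5 payload spec, not code from USBX), the header at the start of each payload looks like this:

    /* Sketch of the UVC payload header preceding the video data in each
       payload (per the UVC 1.5 class spec; the type name is mine).  */
    typedef struct UVC_PAYLOAD_HEADER_SKETCH
    {
        UCHAR bHeaderLength;  /* Header length in bytes, including this field. */
        UCHAR bmHeaderInfo;   /* Bit 0: FID - frame ID, toggles every frame.   */
                              /* Bit 1: EOF - last payload of the frame.       */
                              /* Bit 2: PTS - presentation time stamp present. */
                              /* Bit 3: SCR - source clock reference present.  */
                              /* Bit 4: RES - payload specific.                */
                              /* Bit 5: STI - still image.                     */
                              /* Bit 6: ERR - error during transmission.       */
                              /* Bit 7: EOH - end of header.                   */
        /* Optional fields follow when flagged: dwPresentationTime (4 bytes),
           scrSourceClock (6 bytes); the video data comes after the header.  */
    } UVC_PAYLOAD_HEADER_SKETCH;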

@BluesharkPD

Hi @xiaocq2001, is there any UVC host application running on an STM32H7 target?

@xianghui-renesas

Hi @xiaocq2001, does the USBX video host stack support still image capture? Do you have example code, if it is supported?

@xiaocq2001 (Contributor)

@xianghui-renesas, still image capture is not supported currently.

@xianghui-renesas commented Jun 5, 2023

Hi @xiaocq2001, I tried to piece together the packets collected on the MCU to display them using a feature in our e2studio IDE, and found the packets are out of order in the packet buffers. I have 96 packet buffers. If the MCU is not providing buffers fast enough for the frame rate, will the MCU start to skip packets? Do you have experience with this? Thanks!

@xianghui-renesas

Hi @xiaocq2001, your example so far uses a stream-based protocol. Does the stack support a frame-based protocol, and do you have an example of a frame-based implementation? I think it may be easier to look at the raw image in the MCU buffer with a frame-based implementation.

@xiaocq2001 (Contributor)

@xianghui-renesas, do you mean how a video frame boundary is detected in the USB packets? For Motion JPEG, if you check the spec section on the payload header sent with each USB packet, there is an EOF bit that indicates the end of a frame [spec table: bmHeaderInfo bit fields].
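
A minimal sketch of that check (assuming payload points to one received packet and length is the actual byte count for it):

    /* Return 1 when this UVC payload carries the EOF bit (end of frame).  */
    static UINT uvc_payload_ends_frame(UCHAR *payload, ULONG length)
    {
        /* Byte 0 is bHeaderLength, byte 1 is bmHeaderInfo (bit 1 = EOF).  */
        if (length < 2 || payload[0] < 2 || payload[0] > length)
            return(0); /* No valid header, so not a frame end.  */
        return((payload[1] & 0x02u) ? 1u : 0u);
    }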

@MaheshAvula-Alifsemi commented Oct 9, 2023

Hi @xiaocq2001, @yuxin-azrtos,
I am working on USB host isochronous support. With a single transaction per microframe I am able to see the video streaming and it works fine, but multiple transactions per microframe are not working, and I am not seeing any valid frames.

  1. I am using the demo app above. Do I need to change the application, or could you please suggest how I can approach this further?

I really appreciate your help on this.

Thanks
Mahesh

@xiaocq2001 (Contributor)

@Mahesh-dev-avula I think the application is fine for multiple transactions per microframe. Maybe you can check whether multiple transactions per microframe are supported by your host controller, or whether the host controller driver needs modification to support them.

@MaheshAvula-Alifsemi

@xiaocq2001,
Thank you for responding to my query.
Yes, my host controller supports multiple transactions per microframe; the same hardware works on Linux, and my driver code is also implemented from the Linux reference (xHCI driver).
The difference I found between Linux and Azure RTOS is that Linux prepares multiple buffers at a time and then sends the command to the hardware, while in the RTOS the application requests only one buffer at a time, which works for a single transaction per microframe. I was thinking that maybe we need to prepare multiple buffers at a time for multiple transactions.
Please correct me if I am wrong.

Thanks
Mahesh

@xiaocq2001 (Contributor)

For isochronous requests, the HCD is supposed to support request-list input; that is, requests linked through ux_transfer_request_next_transfer_request, so that multiple linked requests (and thus multiple buffers) can be accepted.

In the current EHCI implementation, a single buffer is used for multiple transactions; a maximum of 3072 (3 * 1024) bytes can be transferred per request buffer.
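
To illustrate the request-list input (the next/data/length field names are real UX_TRANSFER members; queue_iso_transfer() is a placeholder for whatever your HCD does with each request):

    /* Sketch: an HCD consuming a linked list of isochronous transfer requests. */
    UX_TRANSFER *transfer_request = first_transfer_request;
    while (transfer_request != UX_NULL)
    {
        /* Schedule this request's buffer on the hardware (placeholder call).  */
        queue_iso_transfer(transfer_request->ux_transfer_request_data_pointer,
                           transfer_request->ux_transfer_request_requested_length);
        transfer_request = transfer_request->ux_transfer_request_next_transfer_request;
    }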

@MaheshAvula-Alifsemi

@xiaocq2001,
Thank you for responding to my query.
Actually, my hardware doesn't support EHCI; I have implemented an xHCI driver and am testing with it.
There are a few scenarios I have tested:

  1. When the UVC device sends a payload length of 1024 bytes or less: in this scenario I request a 3072-byte payload buffer from my application, and it works.
  2. When I increase the resolution to 1280x720 at 30 fps: the UVC device commits a payload length of more than 1024 bytes, which means three transactions per microframe. In this scenario my host controller receives only the first 1024 bytes and then throws DATA BUFFER ERROR and RING OVERRUN ERROR.

Any suggestions to resolve this issue?

Thanks
Mahesh

@xiaocq2001 (Contributor)

I haven't checked the xHCI spec in detail, but I see the following may relate to high-bandwidth multiple-transaction support:

  1. Mult, Max Packet Size and Max Burst Size in 6.2.3 Endpoint Context.
  2. TRB Transfer Length in 6.4.1.3 Isoch TRB.

From the description, TRB Transfer Length can be 3072.
From the description, Max Burst Size shall be set to the number of additional transactions, and Mult shall be 0 for high-speed.
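
Putting numbers on that (my reading of USB 2.0 §9.6.6 and xHCI §6.2.3, not from this thread): a high-speed, high-bandwidth isochronous endpoint encodes the additional transactions per microframe in bits 12..11 of wMaxPacketSize, so the Endpoint Context fields can be derived like this:

    /* Sketch: derive xHCI Endpoint Context fields from a high-speed isochronous
       endpoint's wMaxPacketSize. For 3 transactions x 1024 bytes per microframe,
       wMaxPacketSize = 0x1400, and one Isoch TRB can carry 3072 bytes.  */
    UINT max_packet = wMaxPacketSize & 0x7FF;        /* Bits 10..0: packet size (1024 here).      */
    UINT max_burst  = (wMaxPacketSize >> 11) & 0x3;  /* Bits 12..11: additional transactions (2). */
    UINT mult       = 0;                             /* Shall be 0 for high-speed endpoints.      */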

@MaheshAvula-Alifsemi

@xiaocq2001,
Thank you so much for your inputs; it's working fine after programming the Endpoint Context array.

I really appreciate your help. Thanks!

Mahesh

@chrisrhayden84

Hi @xiaocq2001, is there any update on when/if still image capture and H.264 payload will be supported?

@xiaocq2001 (Contributor)

For still images, you can extract a YUV frame from the isochronous video stream.
For the H.264 payload, you can handle that in the application.
