Message ID | 20200430113809.14872-1-stanimir.varbanov@linaro.org (mailing list archive) |
---|---|
State | New, archived |
Series | [RFC] docs: dev-decoder: Add two more reasons for dynamic change |
On Thursday, 30 April 2020 at 14:38 +0300, Stanimir Varbanov wrote:
> Here we add two more reasons for dynamic-resolution-change state (I think the name is misleading espesially "resolution" word, maybe dynamic-bitstream-change is better to describe).
>
> The first one which could change in the middle of the stream is the bit-depth. For worst example the stream is 8bit at the begging but later in the bitstream it changes to 10bit. That change should be propagated to the client so that it can take appropriate action. In this case most probably it has to stop the streaming on the capture queue and re-negotiate the pixel format and start the streaming again.
>
> The second new reason is colorspace change. I'm not sure what action client should take but at least it should be notified for such change. One possible action is to notify the display entity that the colorspace and its parameters (y'cbcr encoding and so on) has been changed.

Just to help with the use case: colorspace changes need to be communicated to the following HW or software in your media pipeline. Let's consider a V4L2-only use case:

m2m decoder -> m2m color transform -> ...

Userspace needs to be aware in time, so that it can reconfigure the color transformation parameters. The V4L2 event is a poor fit though, as it does not tell exactly which buffer will start having this new colorspace. So in theory, one would have to:

- drain
- send the new CSC parameters
- resume

I'm not sure if our drivers implement resuming after CMD_STOP, do you have information about that? We could also go through a streamoff/on cycle in the meantime. Most codecs won't allow changing these parameters on delta frames, as that would force the decoder to do CSC conversion of the reference frames during decoding, which seems an unrealistically complex requirement.

That being said, please keep in mind that in VP9 reference frames do not have to be of the same size. You can change the resolution at any point in time. No one has managed to decode the related test vectors [0] with our current event-based resolution change notification.

[0] FRM_RESIZE https://www.webmproject.org/vp9/levels/

> Signed-off-by: Stanimir Varbanov <stanimir.varbanov@linaro.org>
> ---
>  Documentation/userspace-api/media/v4l/dev-decoder.rst | 6 +++++-
>  1 file changed, 5 insertions(+), 1 deletion(-)
>
> diff --git a/Documentation/userspace-api/media/v4l/dev-decoder.rst b/Documentation/userspace-api/media/v4l/dev-decoder.rst
> index 606b54947e10..bf10eda6125c 100644
> --- a/Documentation/userspace-api/media/v4l/dev-decoder.rst
> +++ b/Documentation/userspace-api/media/v4l/dev-decoder.rst
> @@ -906,7 +906,11 @@ reflected by corresponding queries):
>
>  * visible resolution (selection rectangles),
>
> -* the minimum number of buffers needed for decoding.
> +* the minimum number of buffers needed for decoding,
> +
> +* bit-depth of the bitstream has been changed,
> +
> +* colorspace (and its properties) has been changed.
>
>  Whenever that happens, the decoder must proceed as follows:
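The drain-and-resume sequence described above maps roughly onto the ioctl calls below. This is only an illustrative sketch, assuming blocking I/O, MMAP CAPTURE buffers and a single already-streaming stateful decoder file descriptor; it is not part of the patch under review.

```c
#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

/* Sketch: drain the decoder, apply new downstream CSC parameters,
 * then resume.  'fd' is an already-streaming stateful decoder. */
static int drain_and_resume(int fd)
{
	struct v4l2_decoder_cmd cmd;
	struct v4l2_buffer buf;
	struct v4l2_plane planes[VIDEO_MAX_PLANES];

	/* 1. drain: process everything queued on the OUTPUT queue so far */
	memset(&cmd, 0, sizeof(cmd));
	cmd.cmd = V4L2_DEC_CMD_STOP;
	if (ioctl(fd, VIDIOC_DECODER_CMD, &cmd))
		return -1;

	/* 2. dequeue CAPTURE buffers until the one flagged LAST */
	do {
		memset(&buf, 0, sizeof(buf));
		memset(planes, 0, sizeof(planes));
		buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE;
		buf.memory = V4L2_MEMORY_MMAP;
		buf.m.planes = planes;
		buf.length = VIDEO_MAX_PLANES;
		if (ioctl(fd, VIDIOC_DQBUF, &buf))
			return -1;
		/* ... hand the decoded frame to the next pipeline element ... */
	} while (!(buf.flags & V4L2_BUF_FLAG_LAST));

	/* 3. reconfigure the downstream color transform here */

	/* 4. resume decoding; CAPTURE buffers are kept as-is */
	memset(&cmd, 0, sizeof(cmd));
	cmd.cmd = V4L2_DEC_CMD_START;
	return ioctl(fd, VIDIOC_DECODER_CMD, &cmd);
}
```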
On Fri, May 1, 2020 at 4:19 PM Nicolas Dufresne <nicolas@ndufresne.ca> wrote:
>
> On Thursday, 30 April 2020 at 14:38 +0300, Stanimir Varbanov wrote:
> > Here we add two more reasons for dynamic-resolution-change state (I think the name is misleading espesially "resolution" word, maybe dynamic-bitstream-change is better to describe).
> >
> > The first one which could change in the middle of the stream is the bit-depth. For worst example the stream is 8bit at the begging but later in the bitstream it changes to 10bit. That change should be propagated to the client so that it can take appropriate action. In this case most probably it has to stop the streaming on the capture queue and re-negotiate the pixel format and start the streaming again.
> >
> > The second new reason is colorspace change. I'm not sure what action client should take but at least it should be notified for such change. One possible action is to notify the display entity that the colorspace and its parameters (y'cbcr encoding and so on) has been changed.
>
> Just to help with the use case: colorspace changes need to be communicated to the following HW or software in your media pipeline. Let's consider a V4L2-only use case:
>
> m2m decoder -> m2m color transform -> ...
>
> Userspace needs to be aware in time, so that it can reconfigure the color transformation parameters. The V4L2 event is a poor fit though, as it does not tell exactly which buffer will start having this new colorspace. So in theory, one would have to:
>
> - drain
> - send the new CSC parameters
> - resume
>
> I'm not sure if our drivers implement resuming after CMD_STOP, do you have information about that? We could also go through a streamoff/on cycle in the meantime. Most codecs won't allow changing these parameters on delta frames, as that would force the decoder to do CSC conversion of the reference frames during decoding, which seems an unrealistically complex requirement.
>
> That being said, please keep in mind that in VP9 reference frames do not have to be of the same size. You can change the resolution at any point in time. No one has managed to decode the related test vectors [0] with our current event-based resolution change notification.
>
> [0] FRM_RESIZE https://www.webmproject.org/vp9/levels/
>

Agreed. The event mechanism is certainly suffering from some design issues, but that's just the tip of the iceberg. I think we can only solve this problem by adding the ability to query driver state on a per-buffer basis, so that one could query the format of a particular dequeued frame.

Best regards,
Tomasz

> >
> > Signed-off-by: Stanimir Varbanov <stanimir.varbanov@linaro.org>
> > ---
> >  Documentation/userspace-api/media/v4l/dev-decoder.rst | 6 +++++-
> >  1 file changed, 5 insertions(+), 1 deletion(-)
> >
> > diff --git a/Documentation/userspace-api/media/v4l/dev-decoder.rst b/Documentation/userspace-api/media/v4l/dev-decoder.rst
> > index 606b54947e10..bf10eda6125c 100644
> > --- a/Documentation/userspace-api/media/v4l/dev-decoder.rst
> > +++ b/Documentation/userspace-api/media/v4l/dev-decoder.rst
> > @@ -906,7 +906,11 @@ reflected by corresponding queries):
> >
> >  * visible resolution (selection rectangles),
> >
> > -* the minimum number of buffers needed for decoding.
> > +* the minimum number of buffers needed for decoding,
> > +
> > +* bit-depth of the bitstream has been changed,
> > +
> > +* colorspace (and its properties) has been changed.
> >
> >  Whenever that happens, the decoder must proceed as follows:
Hi Nicolas,

On 5/1/20 5:19 PM, Nicolas Dufresne wrote:
> On Thursday, 30 April 2020 at 14:38 +0300, Stanimir Varbanov wrote:
>> Here we add two more reasons for dynamic-resolution-change state (I think the name is misleading espesially "resolution" word, maybe dynamic-bitstream-change is better to describe).
>>
>> The first one which could change in the middle of the stream is the bit-depth. For worst example the stream is 8bit at the begging but later in the bitstream it changes to 10bit. That change should be propagated to the client so that it can take appropriate action. In this case most probably it has to stop the streaming on the capture queue and re-negotiate the pixel format and start the streaming again.
>>
>> The second new reason is colorspace change. I'm not sure what action client should take but at least it should be notified for such change. One possible action is to notify the display entity that the colorspace and its parameters (y'cbcr encoding and so on) has been changed.
>
> Just to help with the use case: colorspace changes need to be communicated to the following HW or software in your media pipeline. Let's consider a V4L2-only use case:
>
> m2m decoder -> m2m color transform -> ...
>
> Userspace needs to be aware in time, so that it can reconfigure the color transformation parameters. The V4L2 event is a poor fit though, as it does not tell exactly which buffer will start having this new colorspace. So in theory, one would have to:
>
> - drain
> - send the new CSC parameters
> - resume
>
> I'm not sure if our drivers implement resuming after CMD_STOP, do you have information about that?

According to the spec, after an implicit drain the decoder is stopped and the client has two options:

1. streamoff -> reconfigure queue -> streamon
2. decoder command start

#2 would be the case with colorspace changes.

> We could also go through a streamoff/on cycle in the meantime. Most codecs won't allow changing these parameters on delta frames, as that would force the decoder to do CSC conversion of the reference frames during decoding, which seems an unrealistically complex requirement.

Shouldn't such changes be preceded by an IDR (or whatever is applicable for the codec)?

> That being said, please keep in mind that in VP9 reference frames do not have to be of the same size. You can change the resolution at any point in time. No one has managed to decode the related test vectors [0] with our current event-based resolution change notification.
>
> [0] FRM_RESIZE https://www.webmproject.org/vp9/levels/

I'd like to try those test streams.

So, if I understood your comments correctly, the colorspace change event in the stateful decoder spec isn't needed?

>> Signed-off-by: Stanimir Varbanov <stanimir.varbanov@linaro.org>
>> ---
>>  Documentation/userspace-api/media/v4l/dev-decoder.rst | 6 +++++-
>>  1 file changed, 5 insertions(+), 1 deletion(-)
>>
>> diff --git a/Documentation/userspace-api/media/v4l/dev-decoder.rst b/Documentation/userspace-api/media/v4l/dev-decoder.rst
>> index 606b54947e10..bf10eda6125c 100644
>> --- a/Documentation/userspace-api/media/v4l/dev-decoder.rst
>> +++ b/Documentation/userspace-api/media/v4l/dev-decoder.rst
>> @@ -906,7 +906,11 @@ reflected by corresponding queries):
>>
>>  * visible resolution (selection rectangles),
>>
>> -* the minimum number of buffers needed for decoding.
>> +* the minimum number of buffers needed for decoding,
>> +
>> +* bit-depth of the bitstream has been changed,
>> +
>> +* colorspace (and its properties) has been changed.
>>
>>  Whenever that happens, the decoder must proceed as follows:
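Option 1 above (streamoff -> reconfigure queue -> streamon) could look roughly like the sketch below; the buffer count and V4L2_MEMORY_MMAP are placeholder assumptions, and real code would also mmap or export and then queue the new buffers. Option 2 is just the V4L2_DEC_CMD_START call already shown in the earlier drain sketch.

```c
#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

/* Sketch of option 1: tear the CAPTURE queue down and bring it back up
 * with the new format (needed when 8-bit buffers no longer fit 10-bit data). */
static int reconfigure_capture(int fd)
{
	enum v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE;
	struct v4l2_requestbuffers req;
	struct v4l2_format fmt;

	if (ioctl(fd, VIDIOC_STREAMOFF, &type))
		return -1;

	memset(&req, 0, sizeof(req));
	req.type = type;
	req.memory = V4L2_MEMORY_MMAP;
	req.count = 0;				/* free the old buffers */
	if (ioctl(fd, VIDIOC_REQBUFS, &req))
		return -1;

	memset(&fmt, 0, sizeof(fmt));
	fmt.type = type;
	if (ioctl(fd, VIDIOC_G_FMT, &fmt))	/* new pixelformat / bit depth / colorimetry */
		return -1;

	req.count = 4;				/* re-allocate for the new format */
	if (ioctl(fd, VIDIOC_REQBUFS, &req))
		return -1;

	/* ... mmap and queue the new buffers here ... */

	return ioctl(fd, VIDIOC_STREAMON, &type);
}
```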
On Saturday, 2 May 2020 at 12:33 +0300, Stanimir Varbanov wrote:
> Hi Nicolas,
>
> On 5/1/20 5:19 PM, Nicolas Dufresne wrote:
> > On Thursday, 30 April 2020 at 14:38 +0300, Stanimir Varbanov wrote:
> > > Here we add two more reasons for dynamic-resolution-change state (I think the name is misleading espesially "resolution" word, maybe dynamic-bitstream-change is better to describe).
> > >
> > > The first one which could change in the middle of the stream is the bit-depth. For worst example the stream is 8bit at the begging but later in the bitstream it changes to 10bit. That change should be propagated to the client so that it can take appropriate action. In this case most probably it has to stop the streaming on the capture queue and re-negotiate the pixel format and start the streaming again.
> > >
> > > The second new reason is colorspace change. I'm not sure what action client should take but at least it should be notified for such change. One possible action is to notify the display entity that the colorspace and its parameters (y'cbcr encoding and so on) has been changed.
> >
> > Just to help with the use case: colorspace changes need to be communicated to the following HW or software in your media pipeline. Let's consider a V4L2-only use case:
> >
> > m2m decoder -> m2m color transform -> ...
> >
> > Userspace needs to be aware in time, so that it can reconfigure the color transformation parameters. The V4L2 event is a poor fit though, as it does not tell exactly which buffer will start having this new colorspace. So in theory, one would have to:
> >
> > - drain
> > - send the new CSC parameters
> > - resume
> >
> > I'm not sure if our drivers implement resuming after CMD_STOP, do you have information about that?
>
> According to the spec, after an implicit drain the decoder is stopped and the client has two options:
>
> 1. streamoff -> reconfigure queue -> streamon
> 2. decoder command start
>
> #2 would be the case with colorspace changes.

Agreed. I just wanted to underline that, since no userspace makes any use of it, CMD_START might currently be broken in many places. That being said, if we only use it in the context of a new event, it can't cause any harm.

> > We could also go through a streamoff/on cycle in the meantime. Most codecs won't allow changing these parameters on delta frames, as that would force the decoder to do CSC conversion of the reference frames during decoding, which seems an unrealistically complex requirement.
>
> Shouldn't such changes be preceded by an IDR (or whatever is applicable for the codec)?

VP9 does not have this limitation, so reference frames and the output frame can be of different sizes. It's likely unique to VP9.

> > That being said, please keep in mind that in VP9 reference frames do not have to be of the same size. You can change the resolution at any point in time. No one has managed to decode the related test vectors [0] with our current event-based resolution change notification.
> >
> > [0] FRM_RESIZE https://www.webmproject.org/vp9/levels/
>
> I'd like to try those test streams.

You might also want to be aware of:

http://www.itu.int/net/itu-t/sigdb/spevideo/Hseries-s.htm
https://chromium.googlesource.com/webm/vp8-test-vectors/

It would be a nice and easy project to write a little runner for these compliance streams and integrate it into kernel CI; for stateful codecs it is trivial, really. I'm working on GStreamer/GstValidate runners, as we want something generic across OSes and codec APIs, but the kernelci folks didn't seem very keen on having a framework in their rootfs (please correct me if I'm wrong on this).

> So, if I understood your comments correctly, the colorspace change event in the stateful decoder spec isn't needed?
>
> > > Signed-off-by: Stanimir Varbanov <stanimir.varbanov@linaro.org>
> > > ---
> > >  Documentation/userspace-api/media/v4l/dev-decoder.rst | 6 +++++-
> > >  1 file changed, 5 insertions(+), 1 deletion(-)
> > >
> > > diff --git a/Documentation/userspace-api/media/v4l/dev-decoder.rst b/Documentation/userspace-api/media/v4l/dev-decoder.rst
> > > index 606b54947e10..bf10eda6125c 100644
> > > --- a/Documentation/userspace-api/media/v4l/dev-decoder.rst
> > > +++ b/Documentation/userspace-api/media/v4l/dev-decoder.rst
> > > @@ -906,7 +906,11 @@ reflected by corresponding queries):
> > >
> > >  * visible resolution (selection rectangles),
> > >
> > > -* the minimum number of buffers needed for decoding.
> > > +* the minimum number of buffers needed for decoding,
> > > +
> > > +* bit-depth of the bitstream has been changed,
> > > +
> > > +* colorspace (and its properties) has been changed.
> > >
> > >  Whenever that happens, the decoder must proceed as follows:
On 30/04/2020 13:38, Stanimir Varbanov wrote:
> Here we add two more reasons for dynamic-resolution-change state (I think the name is misleading espesially "resolution" word, maybe dynamic-bitstream-change is better to describe).

espesially -> especially

> The first one which could change in the middle of the stream is the bit-depth. For worst example the stream is 8bit at the begging but later in the bitstream it changes to 10bit. That change should be propagated to the client so that it can take appropriate action. In this case most probably it has to stop the streaming on the capture queue and re-negotiate the pixel format and start the streaming again.
>
> The second new reason is colorspace change. I'm not sure what action client should take but at least it should be notified for such change. One possible action is to notify the display entity that the colorspace and its parameters (y'cbcr encoding and so on) has been changed.
>
> Signed-off-by: Stanimir Varbanov <stanimir.varbanov@linaro.org>
> ---
>  Documentation/userspace-api/media/v4l/dev-decoder.rst | 6 +++++-
>  1 file changed, 5 insertions(+), 1 deletion(-)
>
> diff --git a/Documentation/userspace-api/media/v4l/dev-decoder.rst b/Documentation/userspace-api/media/v4l/dev-decoder.rst
> index 606b54947e10..bf10eda6125c 100644
> --- a/Documentation/userspace-api/media/v4l/dev-decoder.rst
> +++ b/Documentation/userspace-api/media/v4l/dev-decoder.rst
> @@ -906,7 +906,11 @@ reflected by corresponding queries):
>
>  * visible resolution (selection rectangles),
>
> -* the minimum number of buffers needed for decoding.
> +* the minimum number of buffers needed for decoding,
> +
> +* bit-depth of the bitstream has been changed,
> +
> +* colorspace (and its properties) has been changed.

For this I want to have a new source change flag:

V4L2_EVENT_SRC_CH_COLORIMETRY

Changing colorimetry without changing resolution/bit depth does not require buffers to be re-allocated, it just changes how the pixel data is interpreted w.r.t. color. And that is important to know.

Regards,

Hans

>
>  Whenever that happens, the decoder must proceed as follows:
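To illustrate how a client might dispatch on such a flag, the sketch below assumes the client has already subscribed to V4L2_EVENT_SOURCE_CHANGE with VIDIOC_SUBSCRIBE_EVENT. Note that V4L2_EVENT_SRC_CH_COLORIMETRY is only being proposed in this thread, so the macro name and value in the sketch are hypothetical.

```c
#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

/* Hypothetical: this flag is only proposed in this thread and is not part
 * of the mainline UAPI, so both name and value here are placeholders. */
#ifndef V4L2_EVENT_SRC_CH_COLORIMETRY
#define V4L2_EVENT_SRC_CH_COLORIMETRY	(1 << 1)
#endif

static void handle_source_change(int fd)
{
	struct v4l2_event ev;

	memset(&ev, 0, sizeof(ev));
	if (ioctl(fd, VIDIOC_DQEVENT, &ev) || ev.type != V4L2_EVENT_SOURCE_CHANGE)
		return;

	if (ev.u.src_change.changes & V4L2_EVENT_SRC_CH_RESOLUTION) {
		/* resolution and/or coded format changed: the CAPTURE queue
		 * may need to be torn down and re-allocated (option 1). */
	}
	if (ev.u.src_change.changes & V4L2_EVENT_SRC_CH_COLORIMETRY) {
		/* only the interpretation of the pixels changed: drain,
		 * update the downstream color handling, then resume with
		 * V4L2_DEC_CMD_START -- no re-allocation needed. */
	}
}
```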
On Tue, May 26, 2020 at 12:26 PM Hans Verkuil <hverkuil@xs4all.nl> wrote:
>
> On 30/04/2020 13:38, Stanimir Varbanov wrote:
> > Here we add two more reasons for dynamic-resolution-change state (I think the name is misleading espesially "resolution" word, maybe dynamic-bitstream-change is better to describe).
>
> espesially -> especially
>
> > The first one which could change in the middle of the stream is the bit-depth. For worst example the stream is 8bit at the begging but later in the bitstream it changes to 10bit. That change should be propagated to the client so that it can take appropriate action. In this case most probably it has to stop the streaming on the capture queue and re-negotiate the pixel format and start the streaming again.
> >
> > The second new reason is colorspace change. I'm not sure what action client should take but at least it should be notified for such change. One possible action is to notify the display entity that the colorspace and its parameters (y'cbcr encoding and so on) has been changed.
> >
> > Signed-off-by: Stanimir Varbanov <stanimir.varbanov@linaro.org>
> > ---
> >  Documentation/userspace-api/media/v4l/dev-decoder.rst | 6 +++++-
> >  1 file changed, 5 insertions(+), 1 deletion(-)
> >
> > diff --git a/Documentation/userspace-api/media/v4l/dev-decoder.rst b/Documentation/userspace-api/media/v4l/dev-decoder.rst
> > index 606b54947e10..bf10eda6125c 100644
> > --- a/Documentation/userspace-api/media/v4l/dev-decoder.rst
> > +++ b/Documentation/userspace-api/media/v4l/dev-decoder.rst
> > @@ -906,7 +906,11 @@ reflected by corresponding queries):
> >
> >  * visible resolution (selection rectangles),
> >
> > -* the minimum number of buffers needed for decoding.
> > +* the minimum number of buffers needed for decoding,
> > +
> > +* bit-depth of the bitstream has been changed,
> > +
> > +* colorspace (and its properties) has been changed.
>
> For this I want to have a new source change flag:
>
> V4L2_EVENT_SRC_CH_COLORIMETRY
>
> Changing colorimetry without changing resolution/bit depth does not require buffers to be re-allocated, it just changes how the pixel data is interpreted w.r.t. color. And that is important to know.

FWIW, the visible resolution (i.e. compose rectangle) change that is already defined doesn't require buffers to be re-allocated either. Backwards compatibility requires V4L2_EVENT_SRC_CH_RESOLUTION to be set, but perhaps we could have further flags introduced, which would mean visible resolution and stream format (pixelformat, resolution) exclusively?

Best regards,
Tomasz
Hi Hans,

On 5/26/20 1:26 PM, Hans Verkuil wrote:
> On 30/04/2020 13:38, Stanimir Varbanov wrote:
>> Here we add two more reasons for dynamic-resolution-change state (I think the name is misleading espesially "resolution" word, maybe dynamic-bitstream-change is better to describe).
>
> espesially -> especially
>
>> The first one which could change in the middle of the stream is the bit-depth. For worst example the stream is 8bit at the begging but later in the bitstream it changes to 10bit. That change should be propagated to the client so that it can take appropriate action. In this case most probably it has to stop the streaming on the capture queue and re-negotiate the pixel format and start the streaming again.
>>
>> The second new reason is colorspace change. I'm not sure what action client should take but at least it should be notified for such change. One possible action is to notify the display entity that the colorspace and its parameters (y'cbcr encoding and so on) has been changed.
>>
>> Signed-off-by: Stanimir Varbanov <stanimir.varbanov@linaro.org>
>> ---
>>  Documentation/userspace-api/media/v4l/dev-decoder.rst | 6 +++++-
>>  1 file changed, 5 insertions(+), 1 deletion(-)
>>
>> diff --git a/Documentation/userspace-api/media/v4l/dev-decoder.rst b/Documentation/userspace-api/media/v4l/dev-decoder.rst
>> index 606b54947e10..bf10eda6125c 100644
>> --- a/Documentation/userspace-api/media/v4l/dev-decoder.rst
>> +++ b/Documentation/userspace-api/media/v4l/dev-decoder.rst
>> @@ -906,7 +906,11 @@ reflected by corresponding queries):
>>
>>  * visible resolution (selection rectangles),
>>
>> -* the minimum number of buffers needed for decoding.
>> +* the minimum number of buffers needed for decoding,
>> +
>> +* bit-depth of the bitstream has been changed,
>> +
>> +* colorspace (and its properties) has been changed.
>
> For this I want to have a new source change flag:

OK, I can drop colorimetry and prepare a patch for src_change. Is the bit-depth one fine?

> V4L2_EVENT_SRC_CH_COLORIMETRY
>
> Changing colorimetry without changing resolution/bit depth does not require buffers to be re-allocated, it just changes how the pixel data is interpreted w.r.t. color. And that is important to know.
>
> Regards,
>
> Hans
>
>>
>>  Whenever that happens, the decoder must proceed as follows:
On 04/06/2020 16:46, Stanimir Varbanov wrote:
> Hi Hans,
>
> On 5/26/20 1:26 PM, Hans Verkuil wrote:
>> On 30/04/2020 13:38, Stanimir Varbanov wrote:
>>> Here we add two more reasons for dynamic-resolution-change state (I think the name is misleading espesially "resolution" word, maybe dynamic-bitstream-change is better to describe).
>>
>> espesially -> especially
>>
>>> The first one which could change in the middle of the stream is the bit-depth. For worst example the stream is 8bit at the begging but later in the bitstream it changes to 10bit. That change should be propagated to the client so that it can take appropriate action. In this case most probably it has to stop the streaming on the capture queue and re-negotiate the pixel format and start the streaming again.
>>>
>>> The second new reason is colorspace change. I'm not sure what action client should take but at least it should be notified for such change. One possible action is to notify the display entity that the colorspace and its parameters (y'cbcr encoding and so on) has been changed.
>>>
>>> Signed-off-by: Stanimir Varbanov <stanimir.varbanov@linaro.org>
>>> ---
>>>  Documentation/userspace-api/media/v4l/dev-decoder.rst | 6 +++++-
>>>  1 file changed, 5 insertions(+), 1 deletion(-)
>>>
>>> diff --git a/Documentation/userspace-api/media/v4l/dev-decoder.rst b/Documentation/userspace-api/media/v4l/dev-decoder.rst
>>> index 606b54947e10..bf10eda6125c 100644
>>> --- a/Documentation/userspace-api/media/v4l/dev-decoder.rst
>>> +++ b/Documentation/userspace-api/media/v4l/dev-decoder.rst
>>> @@ -906,7 +906,11 @@ reflected by corresponding queries):
>>>
>>>  * visible resolution (selection rectangles),
>>>
>>> -* the minimum number of buffers needed for decoding.
>>> +* the minimum number of buffers needed for decoding,
>>> +
>>> +* bit-depth of the bitstream has been changed,
>>> +
>>> +* colorspace (and its properties) has been changed.
>>
>> For this I want to have a new source change flag:
>
> OK, I can drop colorimetry and prepare a patch for src_change. Is the bit-depth one fine?

Yes.

Hans

>
>> V4L2_EVENT_SRC_CH_COLORIMETRY
>>
>> Changing colorimetry without changing resolution/bit depth does not require buffers to be re-allocated, it just changes how the pixel data is interpreted w.r.t. color. And that is important to know.
>>
>> Regards,
>>
>> Hans
>>
>>>
>>>  Whenever that happens, the decoder must proceed as follows:
Hi Hans,

On 5/26/20 1:26 PM, Hans Verkuil wrote:
> On 30/04/2020 13:38, Stanimir Varbanov wrote:
>> Here we add two more reasons for dynamic-resolution-change state (I think the name is misleading espesially "resolution" word, maybe dynamic-bitstream-change is better to describe).
>
> espesially -> especially
>
>> The first one which could change in the middle of the stream is the bit-depth. For worst example the stream is 8bit at the begging but later in the bitstream it changes to 10bit. That change should be propagated to the client so that it can take appropriate action. In this case most probably it has to stop the streaming on the capture queue and re-negotiate the pixel format and start the streaming again.
>>
>> The second new reason is colorspace change. I'm not sure what action client should take but at least it should be notified for such change. One possible action is to notify the display entity that the colorspace and its parameters (y'cbcr encoding and so on) has been changed.
>>
>> Signed-off-by: Stanimir Varbanov <stanimir.varbanov@linaro.org>
>> ---
>>  Documentation/userspace-api/media/v4l/dev-decoder.rst | 6 +++++-
>>  1 file changed, 5 insertions(+), 1 deletion(-)
>>
>> diff --git a/Documentation/userspace-api/media/v4l/dev-decoder.rst b/Documentation/userspace-api/media/v4l/dev-decoder.rst
>> index 606b54947e10..bf10eda6125c 100644
>> --- a/Documentation/userspace-api/media/v4l/dev-decoder.rst
>> +++ b/Documentation/userspace-api/media/v4l/dev-decoder.rst
>> @@ -906,7 +906,11 @@ reflected by corresponding queries):
>>
>>  * visible resolution (selection rectangles),
>>
>> -* the minimum number of buffers needed for decoding.
>> +* the minimum number of buffers needed for decoding,
>> +
>> +* bit-depth of the bitstream has been changed,
>> +
>> +* colorspace (and its properties) has been changed.
>
> For this I want to have a new source change flag:
>
> V4L2_EVENT_SRC_CH_COLORIMETRY
>
> Changing colorimetry without changing resolution/bit depth does not require buffers to be re-allocated, it just changes how the pixel data is interpreted w.r.t. color. And that is important to know.

I'm going to create a patch for this event, but I started to wonder: do we need a new buffer flag for this?

Something like the below sequence:

- client receives the SRC_CH_COLORIMETRY event
- client issues G_FMT(CAPTURE queue)
- at that point the client has to know the last buffer with the previous colorimetry and thus the buffer with the new colorimetry

How will the client know the buffer with the new colorimetry?
On 09/06/2020 11:44, Stanimir Varbanov wrote:
> Hi Hans,
>
> On 5/26/20 1:26 PM, Hans Verkuil wrote:
>> On 30/04/2020 13:38, Stanimir Varbanov wrote:
>>> Here we add two more reasons for dynamic-resolution-change state (I think the name is misleading espesially "resolution" word, maybe dynamic-bitstream-change is better to describe).
>>
>> espesially -> especially
>>
>>> The first one which could change in the middle of the stream is the bit-depth. For worst example the stream is 8bit at the begging but later in the bitstream it changes to 10bit. That change should be propagated to the client so that it can take appropriate action. In this case most probably it has to stop the streaming on the capture queue and re-negotiate the pixel format and start the streaming again.
>>>
>>> The second new reason is colorspace change. I'm not sure what action client should take but at least it should be notified for such change. One possible action is to notify the display entity that the colorspace and its parameters (y'cbcr encoding and so on) has been changed.
>>>
>>> Signed-off-by: Stanimir Varbanov <stanimir.varbanov@linaro.org>
>>> ---
>>>  Documentation/userspace-api/media/v4l/dev-decoder.rst | 6 +++++-
>>>  1 file changed, 5 insertions(+), 1 deletion(-)
>>>
>>> diff --git a/Documentation/userspace-api/media/v4l/dev-decoder.rst b/Documentation/userspace-api/media/v4l/dev-decoder.rst
>>> index 606b54947e10..bf10eda6125c 100644
>>> --- a/Documentation/userspace-api/media/v4l/dev-decoder.rst
>>> +++ b/Documentation/userspace-api/media/v4l/dev-decoder.rst
>>> @@ -906,7 +906,11 @@ reflected by corresponding queries):
>>>
>>>  * visible resolution (selection rectangles),
>>>
>>> -* the minimum number of buffers needed for decoding.
>>> +* the minimum number of buffers needed for decoding,
>>> +
>>> +* bit-depth of the bitstream has been changed,
>>> +
>>> +* colorspace (and its properties) has been changed.
>>
>> For this I want to have a new source change flag:
>>
>> V4L2_EVENT_SRC_CH_COLORIMETRY
>>
>> Changing colorimetry without changing resolution/bit depth does not require buffers to be re-allocated, it just changes how the pixel data is interpreted w.r.t. color. And that is important to know.
>
> I'm going to create a patch for this event, but I started to wonder: do we need a new buffer flag for this?
>
> Something like the below sequence:
>
> - client receives the SRC_CH_COLORIMETRY event
> - client issues G_FMT(CAPTURE queue)
> - at that point the client has to know the last buffer with the previous colorimetry and thus the buffer with the new colorimetry
>
> How will the client know the buffer with the new colorimetry?

You still need to drain the capture queue, so only after draining and a V4L2_DEC_CMD_START (or a STREAMOFF/ON pair) will the new colorimetry become active. But the big difference with CH_RESOLUTION is that no new capture buffers have to be allocated for just a colorimetry change.

Regards,

Hans
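Put together, a colorimetry-only change could then be handled roughly as in the sketch below, once the buffer flagged V4L2_BUF_FLAG_LAST has been dequeued. The multi-planar CAPTURE type is an assumption, and the new colorimetry fields would be forwarded to whatever consumes the decoded frames.

```c
#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

/* Sketch: after the drain completed, read the new colorimetry and restart
 * the decoder without re-allocating any CAPTURE buffers. */
static int apply_new_colorimetry(int fd)
{
	struct v4l2_format fmt;
	struct v4l2_decoder_cmd cmd;
	struct v4l2_pix_format_mplane *pix = &fmt.fmt.pix_mp;

	memset(&fmt, 0, sizeof(fmt));
	fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE;
	if (ioctl(fd, VIDIOC_G_FMT, &fmt))
		return -1;

	/* The new interpretation of the decoded pixels; pass these on to the
	 * next element in the pipeline (display, m2m CSC, ...). */
	(void)pix->colorspace;
	(void)pix->ycbcr_enc;
	(void)pix->quantization;
	(void)pix->xfer_func;

	/* No buffer re-allocation: just restart the decoder.  A
	 * STREAMOFF/STREAMON pair on the CAPTURE queue would also work. */
	memset(&cmd, 0, sizeof(cmd));
	cmd.cmd = V4L2_DEC_CMD_START;
	return ioctl(fd, VIDIOC_DECODER_CMD, &cmd);
}
```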
diff --git a/Documentation/userspace-api/media/v4l/dev-decoder.rst b/Documentation/userspace-api/media/v4l/dev-decoder.rst
index 606b54947e10..bf10eda6125c 100644
--- a/Documentation/userspace-api/media/v4l/dev-decoder.rst
+++ b/Documentation/userspace-api/media/v4l/dev-decoder.rst
@@ -906,7 +906,11 @@ reflected by corresponding queries):
 
 * visible resolution (selection rectangles),
 
-* the minimum number of buffers needed for decoding.
+* the minimum number of buffers needed for decoding,
+
+* bit-depth of the bitstream has been changed,
+
+* colorspace (and its properties) has been changed.
 
 Whenever that happens, the decoder must proceed as follows:
Here we add two more reasons for dynamic-resolution-change state (I think the name is misleading espesially "resolution" word, maybe dynamic-bitstream-change is better to describe).

The first one which could change in the middle of the stream is the bit-depth. For worst example the stream is 8bit at the begging but later in the bitstream it changes to 10bit. That change should be propagated to the client so that it can take appropriate action. In this case most probably it has to stop the streaming on the capture queue and re-negotiate the pixel format and start the streaming again.

The second new reason is colorspace change. I'm not sure what action client should take but at least it should be notified for such change. One possible action is to notify the display entity that the colorspace and its parameters (y'cbcr encoding and so on) has been changed.

Signed-off-by: Stanimir Varbanov <stanimir.varbanov@linaro.org>
---
 Documentation/userspace-api/media/v4l/dev-decoder.rst | 6 +++++-
 1 file changed, 5 insertions(+), 1 deletion(-)