Re: [PATCH v2 12/29] venus: add common capability parser

From: Stanimir Varbanov
Date: Mon Jul 02 2018 - 07:00:59 EST


Hi Tomasz,

On 07/02/2018 01:05 PM, Tomasz Figa wrote:
> On Mon, Jul 2, 2018 at 6:59 PM Stanimir Varbanov
> <stanimir.varbanov@xxxxxxxxxx> wrote:
>>
>> Hi Tomasz,
>>
>> On 07/02/2018 12:23 PM, Tomasz Figa wrote:
>>> On Thu, May 31, 2018 at 4:06 PM Tomasz Figa <tfiga@xxxxxxxxxxxx> wrote:
>>>>
>>>> On Thu, May 31, 2018 at 1:21 AM Stanimir Varbanov
>>>> <stanimir.varbanov@xxxxxxxxxx> wrote:
>>>>>
>>>>> Hi Tomasz,
>>>>>
>>>>> On 05/24/2018 05:16 PM, Tomasz Figa wrote:
>>>>>> Hi Stanimir,
>>>>>>
>>>>>> On Tue, May 15, 2018 at 5:08 PM Stanimir Varbanov <
>>> [snip]
>>>>>>
>>>>>>> + break;
>>>>>>> + }
>>>>>>> +
>>>>>>> + word++;
>>>>>>> + words_count--;
>>>>>>
>>>>>> If data is at |word + 1|, shouldn't we increment |word| by |1 + |data
>>>>>> size||?
>>>>>
>>>>> yes, that could be possible, but the firmware packets have variable
>>>>> data length and I don't want to make the code too complex.
>>>>>
>>>>> The idea is to search for HFI_PROPERTY_PARAM* key numbers. Yes, it is
>>>>> not optimal, but this enumeration happens only once, during driver probe.
>>>>>
>>>>
>>>> Hmm, do we have a guarantee that we will never find a value that
>>>> matches HFI_PROPERTY_PARAM*, but would be actually just some data
>>>> inside the payload?
>>>
>>> Ping?
>>
>> OK, you are right there is guarantee that we're not mixing keywords and
>
> Did the auto-correction engine in my head get this right as "no
> guarantee"? :)

yes, your engine works better than mine :)

>
>> data. I can make the parse_* functions return how many words they
>> consumed and increment the 'word' pointer by that count.
>
> Yes, that, or maybe just returning the pointer to the first word after
> the consumed data. Most of the looping functions already seem to have
> this value, so it would just have to be returned. (vs. having to
> subtract from the start pointer)

One possible issue could be with unparsed params; there I have to
advance by one word because the payload size is unknown.
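
Something along these lines, as a rough sketch (the parser signature
and the property handling below are only illustrative, not the final
hfi_parser code):

/*
 * Each parse_* function consumes its own payload and returns a
 * pointer to the first word after the consumed data, so the main
 * loop cannot mistake payload words for HFI_PROPERTY_PARAM* keys.
 */
static u32 *parse_profile_level(struct venus_core *core, u32 *word,
				u32 *end)
{
	u32 num = *word++;	/* number of profile/level pairs */

	while (num-- && word + 2 <= end)
		word += 2;	/* skip one profile/level pair */

	return word;
}

and the main loop becomes:

	while (word < end) {
		switch (*word++) {
		case HFI_PROPERTY_PARAM_PROFILE_LEVEL_SUPPORTED:
			word = parse_profile_level(core, word, end);
			break;
		/* ... other known properties ... */
		default:
			/*
			 * Unknown property: the payload size is not
			 * known, so fall back to advancing one word
			 * at a time and keep scanning for keys.
			 */
			break;
		}
	}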

--
regards,
Stan