Message-ID: <44ac8977-cf98-46a5-be15-1bec330c6a2e@norik.com>
Date: Mon, 25 Mar 2024 09:55:23 +0100
From: Primoz Fiser <primoz.fiser@...ik.com>
To: Andrej Picej <andrej.picej@...ik.com>, Jonathan Cameron <jic23@...nel.org>
Cc: devicetree@...r.kernel.org, conor+dt@...nel.org, lars@...afoo.de,
krzysztof.kozlowski+dt@...aro.org, imx@...ts.linux.dev,
linux-iio@...r.kernel.org, festevam@...il.com, s.hauer@...gutronix.de,
upstream@...ts.phytec.de, linux-kernel@...r.kernel.org, haibo.chen@....com,
kernel@...gutronix.de, shawnguo@...nel.org, robh@...nel.org,
linux-arm-kernel@...ts.infradead.org
Subject: Re: [Upstream] [PATCH 0/2] i.MX93 ADC calibration settings
Hi Jonathan,
On 25. 03. 24 09:32, Andrej Picej wrote:
> Hi Jonathan,
>
> On 24. 03. 24 14:55, Jonathan Cameron wrote:
>> On Wed, 20 Mar 2024 11:04:04 +0100
>> Andrej Picej <andrej.picej@...ik.com> wrote:
>>
>>> Hi all,
>>>
>>> we had some problems with failing ADC calibration on the i.MX93 boards.
>>> Changing the default calibration settings fixed this. The board where
>>> these patches are useful is not yet upstream but will be soon (hopefully).
>>
>> Tell us more. My initial instinct is that this shouldn't be board
>> specific. What's the trade-off we are making here? Time vs precision
>> of calibration or something else? If these are set to a level by
>> default that doesn't work for our board, maybe we should just change
>> them for all devices?
>>
The imx93_adc driver is quite new.
If you look at line #162, you will find a comment by the original author:
> /*
> * TODO: we use the default TSAMP/NRSMPL/AVGEN in MCR,
> * can add the setting of these bit if need in future.
> */
URL:
https://github.com/torvalds/linux/blob/master/drivers/iio/adc/imx93_adc.c#L162
So, for most use cases the default settings should work, but why not make
them configurable? This patch series just implements what was missing from
the beginning / was planned for later.
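
To illustrate the idea (just a rough sketch, not the actual patch code:
the property name, register offset and field layout below are
placeholders), the driver could do something like this before it kicks
off calibration, so the hardware defaults stay untouched unless DT asks
for something else:

#include <linux/bitfield.h>
#include <linux/bits.h>
#include <linux/device.h>
#include <linux/io.h>
#include <linux/property.h>

/* Placeholder offset/masks, NOT the real imx93_adc.c defines. */
#define ADC_MCR			0x00
#define ADC_MCR_NRSMPL_MASK	GENMASK(12, 11)
#define ADC_MCR_AVGEN		BIT(13)

static void adc_apply_calib_settings(struct device *dev, void __iomem *regs)
{
	u32 nrsmpl, sel, mcr;

	/* Optional property; keep the hardware default (512) when absent. */
	if (device_property_read_u32(dev, "nxp,calib-avg-samples", &nrsmpl))
		return;

	/* Assumed encoding 32/128/256/512 -> 0..3, purely for illustration. */
	switch (nrsmpl) {
	case 32:  sel = 0; break;
	case 128: sel = 1; break;
	case 256: sel = 2; break;
	case 512: sel = 3; break;
	default:
		dev_warn(dev, "unsupported averaging sample count %u\n", nrsmpl);
		return;
	}

	mcr = readl(regs + ADC_MCR);
	mcr &= ~ADC_MCR_NRSMPL_MASK;
	mcr |= FIELD_PREP(ADC_MCR_NRSMPL_MASK, sel);
	mcr |= ADC_MCR_AVGEN;
	writel(mcr, regs + ADC_MCR);
}

The actual binding and bit handling are of course defined by the patches
themselves; the point is only that nothing changes for boards that do not
set the new properties.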
BR,
Primoz
>
> So we have two different boards with the same SoC. On one, the
> calibration works with the default values; on the second one, the
> calibration fails, which makes the ADC unusable. What the ADC lines
> measure differs between the boards, though. But the implementation is
> nothing out of the ordinary.
>
> We tried different things, but the only thing that helped was using
> different calibration properties. We also tried deferring the probe and
> calibration until later in boot and after boot, but it did not help.
>
> In the Reference Manual [1] (chapter 72.5.1) it is written:
>
>> 4. Configure desired calibration settings (default values kept for
>> highest accuracy maximum time).
>
> So your assumption is correct: longer calibration time (more averaging
> samples) -> higher precision. The default values go for high accuracy.
> And since we use an NRSMPL (Number of Averaging Samples) of 32 instead of
> the default 512, we reduce the accuracy so that the calibration values
> pass the internally defined limits.
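
Just to put rough numbers on that trade-off (back-of-the-envelope only,
assuming uncorrelated noise, so take it as an illustration rather than a
statement about the i.MX93 calibration internals): averaging N samples
cuts the random noise by roughly sqrt(N), while the time spent grows
linearly with N. Going from 512 down to 32 samples therefore means about

	512 / 32 = 16	-> ~16x less time spent averaging
	sqrt(16) = 4	-> ~4x more residual noise

which matches the accuracy vs. time wording of the RM step quoted above.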
>
> I'm not sure that changing the default values is the right solution here.
> We saw the default values work with one of the boards. And since NXP kept
> these values adjustable, I think there is a reason behind it.
>
> Note: when I say one of the boards, I mean one board form: the same board
> version, but different HW.
>
> Best regards,
> Andrej
>
> [1] i.MX 93 Applications Processor Reference Manual, Rev. 4, 12/2023
> _______________________________________________
> upstream mailing list
> upstream@...ts.phytec.de
> http://lists.phytec.de/cgi-bin/mailman/listinfo/upstream