Hello,
I'm trying to understand how and when the automatic offset calibration occurs, and I'm thoroughly confused.
According to SLUA664:
1. The automatic calibration, Board Offset, is done the first time the I2C Data and Clock lines are low for more than 20 seconds; this is a much more accurate calibration.
2. During normal gas gauge operation, when the I2C clock and data lines are low for more than 5 seconds and Average Current is less than Sleep Current in mA, an automatic CC Offset calibration is performed. This takes approximately 16 seconds and is much more accurate than the method in Calibration mode.
3. When Average Current is less than Sleep Current or greater than (–)Sleep Current in mA, the bq34z100 enters SLEEP mode if the feature is enabled (Op Config [SLEEP] = 1). The bq34z100 does an analog-to-digital converter (ADC) calibration and then goes to sleep.
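To make sure I'm reading these three triggers correctly, here is how I would model them as a simple decision function. This is only my interpretation of the quoted text, not TI's actual firmware logic; all the names, thresholds, and the priority order are my own assumptions.

```python
# My reading of the three automatic-calibration triggers quoted above.
# Names, units (seconds / mA), and the priority order are my assumptions.

def calibration_action(bus_low_s, avg_current_ma, sleep_current_ma,
                       board_offset_done=False, sleep_enabled=True):
    """Return which automatic action I *think* the gauge takes."""
    # 1) Board Offset: the first time SCL/SDA are held low for > 20 s
    if bus_low_s > 20 and not board_offset_done:
        return "board_offset_calibration"
    # 2) CC Offset: bus low > 5 s and |Average Current| < Sleep Current
    if bus_low_s > 5 and abs(avg_current_ma) < sleep_current_ma:
        return "cc_offset_calibration"  # takes ~16 s per SLUA664
    # 3) SLEEP entry: |Average Current| < Sleep Current and [SLEEP] = 1;
    #    an ADC calibration is done before going to sleep
    if abs(avg_current_ma) < sleep_current_ma and sleep_enabled:
        return "adc_calibration_then_sleep"
    return "none"

print(calibration_action(25, 0, 10))  # board_offset_calibration
print(calibration_action(25, 0, 10, board_offset_done=True))  # cc_offset_calibration
print(calibration_action(0, 2, 10))   # adc_calibration_then_sleep
```

Is this roughly the right mental model, or do the three cases interact differently (e.g., is nr. 3 independent of the bus state entirely)?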
Thus, to obtain a Board Offset calibration, should I drive both SCL and SDA low for more than 20 s, keeping the bus stalled?
What kind of calibration is nr. 3? Since it says nothing about the I2C bus state, I expect it to be something different from nr. 1 and nr. 2.
The datasheet, on the other hand, states:
The gas gauge performs a single offset calibration when (1) the interface lines stay low for a minimum of Bus Low Time and (2) Vsr > Deadband.
What's Bus Low Time? Is it tBUSERR? In any case, what's its value? 20s? I can't find it specified.
Does the "Vsr > Deadband" condition overlap with the previously mentioned "Average Current is less than Sleep Current in mA" condition (meaning that if the current is 0 and Deadband is not 0, no calibration will be started)?
Also, Deadband is documented as a current (mA), yet here it is compared to a voltage (Vsr).
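To illustrate my units confusion: Vsr is the voltage across the sense resistor, so a current threshold only becomes a voltage once a sense-resistor value is fixed. The conversion I would expect is below; the 5 mΩ Rsense is purely my own example value, not something taken from the datasheet.

```python
# Vsr = I * Rsense: converting a current threshold (mA) into the
# equivalent sense-resistor voltage (uV). Rsense = 5 mOhm is MY
# illustrative assumption, not a datasheet value.

R_SENSE_OHM = 0.005  # 5 mOhm, assumed for illustration only

def vsr_uv(current_ma, r_sense_ohm=R_SENSE_OHM):
    """Voltage across the sense resistor in microvolts for a given current."""
    return current_ma / 1000 * r_sense_ohm * 1e6

# If Deadband really were a current of, say, 5 mA, the equivalent
# sense-resistor voltage with this assumed Rsense would be:
print(vsr_uv(5))  # 25 uV
```

So is Deadband actually stored as a Vsr threshold (in µV or ADC counts) and merely documented in mA for a nominal Rsense, or is it genuinely a current?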
Can you please clarify the way automatic calibration starts?
Thank you,