Posts: 19
Registered: 06-17-2016

delay between switching voltage reference, analog input and attenuation

Hi everybody, I'm using an EFM8BB10F8G.

I have a situation where I have to take ADC measurements of two input pins, the internal temperature sensor, and VDD.

 

For measuring VDD I plan to use the 0.5X attenuator at the input of the ADC; for the other measurements I may have to switch between the 1.65V and 2.4V internal references for better accuracy. Here's the sequence:

 

Perform measurement #1: connect ADC to VDD, use 0.5X, ref = 2.4V internal
Perform measurement #2: connect ADC to P0.6, use 1X, ref = 1.65V internal
Perform measurement #3: connect ADC to P0.7, use 1X, ref = 1.65V internal
Perform measurement #4: connect ADC to temp sensor, use 1X, ref = 1.65V internal

 

When I change the ADC input, the reference voltage, or the attenuation factor, what kind of delay should I insert before starting the conversion?
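In code, the sequence I have in mind looks roughly like this. It's only a sketch: the register and mask names are what I see in the EFM8 SDK device header, the MUX_* / REF_* / GAIN_* values are placeholders I still have to look up in the reference manual, and settle() is exactly the delay I'm asking about.

/* Sketch of the planned sequence (Keil C51, EFM8BB10F8G).
 * Assumes ADC0 is already enabled (ADEN = 1) and clocked, and that a
 * conversion is triggered by writing ADBUSY. All MUX_* / REF_* / GAIN_*
 * values below are placeholders -- fill them in from the reference manual. */
#include <SI_EFM8BB1_Register_Enums.h>

#define MUX_VDD      0x00  /* placeholder: ADC0MX code for VDD          */
#define MUX_P06      0x00  /* placeholder: ADC0MX code for P0.6         */
#define MUX_P07      0x00  /* placeholder: ADC0MX code for P0.7         */
#define MUX_TEMP     0x00  /* placeholder: ADC0MX code for temp sensor  */
#define REF_1P65     0x00  /* placeholder: REF0CN value, 1.65V internal */
#define REF_2P4      0x00  /* placeholder: REF0CN value, 2.4V internal  */
#define GAIN_0P5_BIT 0x01  /* placeholder: ADC0CF bit for the 0.5X gain */

static void settle(void)               /* the delay this thread is about */
{
    volatile unsigned char i;
    for (i = 0; i < 50; i++) { }       /* tune once the settling time is known */
}

static unsigned int adc_read(void)
{
    ADC0CN0 &= ~ADC0CN0_ADINT__BMASK;  /* clear conversion-complete flag */
    ADC0CN0 |= ADC0CN0_ADBUSY__BMASK;  /* start one conversion           */
    while (!(ADC0CN0 & ADC0CN0_ADINT__BMASK)) { }  /* wait until done    */
    return ((unsigned int)ADC0H << 8) | ADC0L;
}

void measure_all(unsigned int result[4])
{
    REF0CN  = REF_2P4;                 /* #1: VDD, 0.5X, 2.4V internal   */
    ADC0CF |= GAIN_0P5_BIT;
    ADC0MX  = MUX_VDD;
    settle();
    result[0] = adc_read();

    REF0CN  = REF_1P65;                /* #2: P0.6, 1X, 1.65V internal   */
    ADC0CF &= ~GAIN_0P5_BIT;
    ADC0MX  = MUX_P06;
    settle();
    result[1] = adc_read();

    ADC0MX  = MUX_P07;                 /* #3: P0.7, same ref and gain    */
    settle();
    result[2] = adc_read();

    ADC0MX  = MUX_TEMP;                /* #4: temp sensor                */
    settle();
    result[3] = adc_read();
}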

 

thanks

 

 

P.S. At each step I multiplex the input to the ADC and, at the same time, change the attenuation.

Posts: 7,951
Registered: 08-13-2003

Re: delay between switching voltage reference, analog input and attenuation


I have no answer, but since you see a delay, why not just divide VDD with an external resistor divider and keep the same reference? That would also allow autoscan.
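For example (my numbers, just to illustrate: assuming VDD can be as high as 3.6V and you stay on the 1.65V internal reference at 1X), the divider ratio must be at least 3.6 / 1.65 ≈ 2.2. Something like 39k over 27k gives VDD × 27/66 ≈ 0.41 × VDD, at most about 1.47V, which leaves margin below the reference. A divider that high in impedance may need a small capacitor across the bottom resistor so the ADC sampling capacitor can charge in time.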

erik
Posts: 19
Registered: 06-17-2016

Re: delay between switching voltage reference, analog input and attenuation

thanks @erikm

I think there has been some misunderstanding. I'm assuming that when I change the voltage reference or the attenuation factor, the analog circuitry will need some time to settle before I can safely start the ADC conversion. How long do you think that delay should be?

The same goes for the attenuator: is there any settling time, or can I just start the ADC conversion immediately after switching from 0.5X to 1X?

 

thanks!

Posts: 7,951
Registered: 08-13-2003

Re: delay between switching voltage reference, analog input and attenuation

It just seems to me that you are overcomplicating the issue (and adding more code than needed) when a simple change of a resistor could make all ADC reads work with the same parameters.

erik
jmg
Posts: 1,107
Registered: 04-27-2004

Re: delay between switching voltage reference, analog input and attenuation


"I'm assuming that when I change the voltage reference or the attenuation factor, the analog circuitry will need some time to settle before I can safely start the ADC conversion. How long do you think that delay should be?"


If the datasheet data is sparse, I look around for other timing indications.

 

I can see ADC enable times of 1.2μs, TempSense enable of ~1.8μs, and VRef enable of <1.5μs.

 

So I'd suggest starting with 10x those, and once everything is working nicely, see if you can reduce the delays.
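As a sketch of that approach (my numbers: roughly 10x the largest figure above is ~20μs; the loop count assumes a 24.5 MHz SYSCLK and Keil C51, so calibrate it rather than trust it):

/* Crude guard delay of very roughly 20us (about 10x the largest enable
 * time above). The loop count is a guess that depends on compiler,
 * optimization level, and SYSCLK -- calibrate it with a scope, or
 * replace it with a timer-based delay. */
static void adc_guard_delay(void)
{
    volatile unsigned char i;
    for (i = 0; i < 60; i++) { }   /* ~20us assumed at 24.5 MHz */
}

Call it after every ADC0MX / REF0CN / gain change to begin with, then shorten or drop individual calls while checking the readings against a known source.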