04-13-2017 04:58 PM
Hi everybody, I'm using an EFM8BB10F8G.
I have a situation where I have to take ADC measurements of two input pins, the internal temperature sensor, and VDD.
For measuring VDD I plan to use the 0.5X attenuator at the input of the ADC; for the other measurements I may have to switch between the 1.65V and 2.4V internal references for better accuracy. Here's the sequence:
Perform measurement #1: connect ADC to VDD, use 0.5X, ref = 2.4V internal
Perform measurement #2: connect ADC to P0.6, use 1X, ref = 1.65V internal
Perform measurement #3: connect ADC to P0.7, use 1X, ref = 1.65V internal
Perform measurement #4: connect ADC to temp sensor, use 1X, ref = 1.65V internal
When I change the ADC input, reference voltage, or attenuation factor, what kind of delay should I insert before performing the conversion? I multiplex the input to the ADC and change the attenuation at the same time.
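For concreteness, here is a minimal sketch of what that sequence might look like in code. This is only an illustration under assumptions: the register names follow Silicon Labs' SI_EFM8BB1_Register_Enums.h header for the EFM8BB1 family, but the mux codes, REF0CN/ADC0CF bit values, and the delay length are placeholders to verify against the EFM8BB1 Reference Manual, not tested values (note also that the on-chip temp sensor has its own enable bit, TEMPE, with its own startup time).

```c
/* Sketch of the four-step sequence. The mux codes and the
 * REF0CN / ADC0CF values passed in are placeholders -- check them
 * against the ADC0MX, REF0CN, and ADC0CF tables in the EFM8BB1 RM. */
#include <SI_EFM8BB1_Register_Enums.h>

static void settle_delay(void)
{
    /* Crude busy-wait so the mux/reference/gain can settle before
     * the conversion starts; tune the count once timings are known. */
    volatile uint8_t i;
    for (i = 0; i < 50; i++) { }
}

static uint16_t adc_read(uint8_t mux, uint8_t refcn, uint8_t adccf)
{
    ADC0MX = mux;           /* select input (VDD / P0.6 / P0.7 / temp) */
    REF0CN = refcn;         /* select 1.65V or 2.4V internal reference */
    ADC0CF = adccf;         /* SAR clock and 1X / 0.5X gain setting    */
    settle_delay();         /* let the analog front end settle         */

    ADC0CN0_ADINT  = 0;     /* clear conversion-complete flag */
    ADC0CN0_ADBUSY = 1;     /* start one conversion           */
    while (!ADC0CN0_ADINT) { }
    return ADC0;            /* 16-bit result register (ADC0H:ADC0L) */
}
```

Measurements #1 through #4 then become four adc_read() calls with the corresponding mux/reference/gain arguments.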
04-13-2017 05:14 PM - edited 04-13-2017 07:13 PM
I have no answer, but since you anticipate a delay, why not just change the VDD divider and keep the same reference? That would also allow autoscan.
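To put rough numbers on that (my arithmetic, assuming the usual 2.2-3.6V supply range for this part): with the 1.65V internal reference at 1X gain, an external divider of at least 3.6V / 1.65V ≈ 2.2:1 (say, two resistors giving divide-by-3) keeps the divided VDD below the reference, so all four measurements could share ref = 1.65V at 1X and be scanned with identical ADC settings.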
04-13-2017 05:39 PM
I think there has been some misunderstanding. I'm assuming that when I change the voltage reference or the attenuation factor, the analog circuitry will require some transient delay to settle before I can safely start the ADC conversion. How long do you think that delay should be?
The same goes for the 0.5X attenuator: is there any settling time, or can I start the ADC conversion immediately after switching from 0.5X to 1X?
04-13-2017 07:12 PM
It just seems to me that you are overcomplicating the issue (and adding more code than needed) when a simple change of a resistor could make all ADC reads work with the same parameters.
04-14-2017 02:25 AM
I'm assuming that when I change the voltage reference or the attenuation factor, the analog circuitry will require some transient delay to settle before I can safely start the ADC conversion. How long do you think that delay should be?
When the datasheet is sparse on a number, I look around for other timing indications.
I can see an ADC enable time of 1.2μs, a TempSense enable time of ~1.8μs, and a VRef enable time of < 1.5μs.
So I'd suggest starting with 10x those, and once everything is working nicely, see if you can reduce them.
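If it helps, here's how I'd seed those numbers in code. The constants just encode the 10x starting point suggested above; delay_us() is a hypothetical busy-wait helper whose inner count must be calibrated to your actual SYSCLK, not a library function.

```c
#include <stdint.h>

/* Settle delays seeded at ~10x the enable times quoted above;
 * trim them down once everything works. */
#define SETTLE_ADC_US     12u   /* ~10 x 1.2us ADC enable         */
#define SETTLE_TSENSE_US  18u   /* ~10 x 1.8us TempSense enable   */
#define SETTLE_VREF_US    15u   /* ~10 x 1.5us VRef enable        */

static void delay_us(uint16_t us)
{
    /* Hypothetical helper: calibrate the inner count for your
     * SYSCLK (e.g. by toggling a pin and checking on a scope). */
    while (us--) {
        volatile uint8_t n = 3;   /* tune for ~1us per outer loop */
        while (n--) { }
    }
}
```

For example, call delay_us(SETTLE_VREF_US) right after writing REF0CN and before starting the conversion.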