10-02-2017 08:22 PM
Hello again good people,
We have a system with multiple channels, where each channel has an excitation source and a receiver feeding into a 1 nF sampling capacitor. The capacitor is shorted most of the time, with the line configured as a digital IO in Open-Drain mode.
Our read cycle is currently as follows:
1. Turn off all excitation sources.
2. Short the sampling capacitors.
3. Delay 10 ms.
4. Turn on the excitation source.
5. Delay 3 ms.
6. Remove the short from the sampling cap for the channel.
7. Delay 40 ms.
8. Read the channel's ADC line.
9. Delay another few milliseconds.
10. Short the sampling cap.
11. More delay.
12. Turn off the LED.
13. Repeat steps 3-12 on the next channel.
14. After all channels are complete, repeat steps 3-13 without turning on the excitation source.
15. Process results.
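For concreteness, the cycle above can be sketched in C. This is a host-side sketch only: every hardware call (`led_on`, `cap_short`, `cap_release`, `adc_read_channel`, `delay_ms`) is a hypothetical stub standing in for the real GPIO/ADC code, and `NUM_CHANNELS` and the dummy ADC counts are assumptions, not our actual firmware.

```c
#define NUM_CHANNELS 4

/* Hypothetical hardware stubs -- stand-ins for the real GPIO/ADC code. */
static void delay_ms(unsigned ms)        { (void)ms; }
static void led_on(int ch)               { (void)ch; }  /* excitation on  */
static void led_off(int ch)              { (void)ch; }  /* excitation off */
static void cap_short(int ch)            { (void)ch; }  /* latch low: short the 1 nF cap */
static void cap_release(int ch)          { (void)ch; }  /* release short: cap charges    */
static unsigned adc_read_channel(int ch) { return 100u + (unsigned)ch; }  /* dummy counts */

/* Steps 3-12 for one channel; 'lit' distinguishes the excitation-on
 * pass from the dark pass of step 14. */
static unsigned read_channel(int ch, int lit)
{
    unsigned counts;
    delay_ms(10);                    /* step 3  */
    if (lit) led_on(ch);             /* step 4  */
    delay_ms(3);                     /* step 5  */
    cap_release(ch);                 /* step 6  */
    delay_ms(40);                    /* step 7  */
    counts = adc_read_channel(ch);   /* step 8  */
    delay_ms(2);                     /* step 9  */
    cap_short(ch);                   /* step 10 */
    delay_ms(2);                     /* step 11 */
    led_off(ch);                     /* step 12 */
    return counts;
}

/* Full cycle: steps 1-2 once, a lit pass over all channels (step 13),
 * then a dark pass (step 14). Step 15 (processing) is left out. */
static void read_cycle(unsigned lit[], unsigned dark[])
{
    int ch;
    for (ch = 0; ch < NUM_CHANNELS; ch++) {  /* steps 1-2 */
        led_off(ch);
        cap_short(ch);
    }
    for (ch = 0; ch < NUM_CHANNELS; ch++)
        lit[ch] = read_channel(ch, 1);
    for (ch = 0; ch < NUM_CHANNELS; ch++)
        dark[ch] = read_channel(ch, 0);
}
```

Calling `read_cycle(lit, dark)` yields one lit and one dark reading per channel; the first-channel anomaly we see would show up as `lit[first]` and `dark[first]` reading high.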
What we're observing is that the first channel read is higher than the others, no matter which channel we select first. The effect is more obvious on the second pass, with the excitation source off.
In some variants, step 14 above also includes reads of the forward bias on the excitation sources, the temperature, and the supply voltage (this last involving a gain change on the fly). However, removing these reads leaves the behaviour essentially unchanged.
The capacitor shorting line is also the ADC read line, and all of these sense lines are on Port 0. I initialise the sense lines as digital inputs in Open-Drain mode. To read them, our code was simply setting the port pin high (turning off the pull-down transistor) and doing a read. It seemed to return reasonable values, even though the reference manual (diagram 11.2 of the BB2 RM) says it shouldn't. I've now re-coded it to re-initialise the pin as an Analog Input for the read, but the behaviour is unchanged.
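The shorting/reading switch-over might look something like the sketch below. On the real part `P0`, `P0MDIN`, `P0MDOUT` and `P0SKIP` are SFRs; here they are plain variables so the sequence can be checked off-target. The bit conventions follow the EFM8 family (a `P0MDIN` bit of 1 means digital, 0 means analog; a `P0MDOUT` bit of 0 means open-drain), but treat the exact register usage as an assumption to verify against the BB2 RM, not a verified driver.

```c
/* SFR stand-ins; on the real EFM8 these are the Port 0 SFRs. */
static unsigned char P0 = 0xFF, P0MDIN = 0xFF, P0MDOUT = 0x00, P0SKIP = 0x00;

/* Short the sampling cap: digital open-drain, latch low pulls the line down. */
static void cap_short(unsigned char mask)
{
    P0MDIN  |=  (unsigned char)mask;   /* digital mode         */
    P0MDOUT &= (unsigned char)~mask;   /* open-drain           */
    P0      &= (unsigned char)~mask;   /* latch low: short cap */
}

/* Prepare for an ADC read: true analog input, not just "latch high". */
static void cap_to_analog(unsigned char mask)
{
    P0      |=  (unsigned char)mask;   /* latch high first (pull-down off)            */
    P0MDIN  &= (unsigned char)~mask;   /* analog: disconnects digital input circuitry */
    P0SKIP  |=  (unsigned char)mask;   /* keep the crossbar off this pin              */
}
```

The point of `cap_to_analog()` is exactly the re-initialisation described above: clearing the `P0MDIN` bit takes the pin out of the digital input path before the conversion, rather than merely releasing the open-drain latch.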
I have confirmed in my code and in the debugger that the ground reference is set to the power supply GND pin.
As a trial, I added an extra step before every sensor reading: repeatedly read the ADC Ground channel until it drops to 5 counts or below. This has reduced the overvoltage, but not significantly. Worse, it sometimes reaches the read limit (arbitrarily set at 250) without ever dropping that low. We would expect it to drop to 1 count or less within a single read; instead, it often takes tens or even hundreds of reads to get down to that range. If I raise the threshold to 10 counts, a single read suffices. Why does the GND channel not read 0?
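The settle loop in question is essentially the following. `adc_read_gnd()` is a stub returning a decaying count sequence; on hardware it would select the internal GND channel on the mux and perform a conversion. The 250-read cap and 5-count threshold are from the description above; the function names are assumptions.

```c
#define SETTLE_LIMIT      250   /* arbitrary cap on repeated reads */
#define SETTLE_THRESHOLD    5   /* counts */

/* Stub: residual counts that halve on each read, standing in for
 * whatever charge is bleeding off the sampling network. */
static unsigned fake_counts = 40;
static unsigned adc_read_gnd(void)
{
    unsigned c = fake_counts;
    fake_counts /= 2;
    return c;
}

/* Returns the number of reads taken to settle, or 0 if the limit
 * was hit without the GND channel dropping to the threshold. */
static unsigned settle_gnd(void)
{
    unsigned i;
    for (i = 1; i <= SETTLE_LIMIT; i++) {
        if (adc_read_gnd() <= SETTLE_THRESHOLD)
            return i;
    }
    return 0;   /* never settled: flag for the caller */
}
```

With the stub's 40, 20, 10, 5 sequence the loop settles on the fourth read; on the real board we are seeing the equivalent of a far slower decay.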
I have posted a lot of details of my ADC and workflow code in this forum discussion (Changing ADC Gain On The Fly).
10-03-2017 05:29 PM
Thanks for the response.
They're all good questions.
The receiver is a photodiode. I've attached a portion of the schematic that I also posted to the other thread. The data sheet doesn't tell me directly what the impedance is, and it's been too many decades since my EE degree for me to try deriving it from the charts.
The datasheet says the photodiode is "Over-current-proof", but the short-circuit safety is an interesting question. We're not expecting a long lifetime from this particular product (hence the very cheap devices being used). Having said that, we've been using the prototype boards for some months now.
On the dev kit, there's about +/-50mV (peak) of noise on the power supply lines when powered from USB. I'd expect quite a lot less on our custom board when powered by battery (which is how the measurements are taken). Our electronics and systems team members say there was "not enough to be concerned about." We're also very careful to turn on just one LED at a time, and not to take measurements until the lines have had time to settle after switching.
10-04-2017 02:12 PM
Can you share the complete schematic? Have you been able to observe the input to the ADC with a scope to ensure that the voltage isn't actually changing on the first sample? My first two guesses are that either the voltage is higher for the first sample, or the reference is lower. Unfortunately you can't drive the reference voltage to a pin to monitor it on this part. Do you see the same behavior if you switch your reference to VDD or an external reference?
10-04-2017 05:19 PM
Hi Joe and thanks for the response,
There's not much to the schematic, but I'm reluctant to share too much on a public forum.
Realistically, the key part of the schematic is that we have several of those photodiodes and matching LEDs to light them up, plus a few LEDs for user interaction, a UART port and a battery. The important/interesting parts of the schematic are in the .PNG file as shared.
All the ADC lines are on P0, together with the UART.
All the LED lines are on P1.
There are no spare pins available.
I've tried switching to the 1.65V reference, but it didn't change the behaviour significantly.
I haven't tried using Vdd as a reference - yet.
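For the suggested VDD trial, the change is just the reference-select field in `REF0CN`. Below, `REF0CN` is modelled as a plain variable, and the field position and encodings are placeholders to be checked against the BB2 RM before use; only the shape of the read-modify-write is the point.

```c
static unsigned char REF0CN = 0x00;   /* SFR stand-in */

#define REFSL_MASK   0x18   /* assumed REFSL field position (bits 4:3) */
#define REFSL_VDD    0x08   /* assumed encoding: VDD as reference      */
#define REFSL_INT    0x18   /* assumed encoding: internal 1.65 V       */

/* Select the ADC voltage reference without disturbing other REF0CN bits. */
static void adc_set_reference(unsigned char refsl)
{
    REF0CN = (unsigned char)((REF0CN & (unsigned char)~REFSL_MASK)
                             | (refsl & REFSL_MASK));
}
```

If the first-sample error shrinks with VDD or an external reference selected, that would point at the reference rather than the input network.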
We do suspect the voltage might be varying. In our test setup, it shouldn't be - particularly when we turn off the lights in the optics lab to do the tests - but we haven't ruled out the possibility. It's a matter of working out why this might be the case.
If I get a chance today, I'll put our actual board on the CRO to check this. The difficulty is excluding the effects of ambient light, including light from the CRO's display.
10-04-2017 09:15 PM
With the equipment available at my desk, I've managed to observe the waveforms.
They're not particularly clean, due to ambient light, radiated mains noise, grounding issues, cheap CRO, etc.
However, the waveforms show very similar levels regardless of trigger timing, channel selection, etc.