IMO it's not quite the same as C, where you don't know what the outcome will be from one implementation to the next. Once you know what any particular C implementation will actually do, the outcome becomes more predictable. As with hardware, it's possible to have an expectation of what will happen. Granted, we are in the realm of "undefined behaviour" (much like undefined behaviour in C -- those dark, dusty corners).
The usual expectation when reading a floating input is that it will return some value affected by external noise, but one which normally averages around some baseline and rarely strays too far from it. Having measured that, one has a band which the readings mostly stay within. The expectation is then that any read of the ADC will deliver something within that band, with occasional readings randomly falling above or below it -
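That "baseline plus a band" expectation can be sketched as a toy model. This is plain Python, not Pico code; the baseline value, noise level and 3-sigma band are assumptions for illustration only:

```python
import random

random.seed(42)

# Toy model of a floating ADC input: readings cluster around a baseline
# with Gaussian noise. Baseline and noise level are assumed, not measured.
BASELINE = 800      # hypothetical 12-bit ADC counts
NOISE_SD = 15

def read_adc():
    return int(random.gauss(BASELINE, NOISE_SD))

# Characterise the band from an initial batch of readings
samples = [read_adc() for _ in range(1000)]
mean = sum(samples) / len(samples)
sd = (sum((s - mean) ** 2 for s in samples) / len(samples)) ** 0.5
band = (mean - 3 * sd, mean + 3 * sd)

# Most subsequent reads should fall inside the band; a few stray outside
later = [read_adc() for _ in range(1000)]
outliers = sum(1 for s in later if not band[0] <= s <= band[1])
print(f"band approx {band[0]:.0f}..{band[1]:.0f}, outliers: {outliers}/1000")
```

With a real floating pin the noise won't be neatly Gaussian, but the idea is the same: characterise the band once, then treat anything well outside it as an anomaly.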
Code:
|...........................................................
| _ _ _ _-_ __-_
|--_-__-- -__--- -_--_-_ __- --_- ---_---_ _--_-  --_-
|       -             -_-
|...........................................................

I think everyone would agree some kind of leakage is affecting things over time. This is one theoretical model for what we are seeing -
Code:
[ASCII schematic lost in transfer: Ain switched between Vref and 0V, with diodes and capacitance coupling the ADC input node towards 3V3 through an unknown path marked '?', all referenced to 0V]

A software parallel would be memory leakage: memory leakage which isn't seen if a 'ps' or 'htop' is issued at less than 13-minute intervals, but is seen if the intervals are longer, with the leakage becoming greater the longer the interval is.
Or like having a watch which gains time, but only while you haven't looked at it for 13 minutes, and then reads correctly until it next goes unwatched for 13 minutes.
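The leakage model can be caricatured in a few lines of plain Python. The 13-minute threshold comes from the observation above; the baseline, rail value and drift rate are assumptions purely for illustration:

```python
# Caricature of the proposed leakage model: the input node drifts towards
# a rail while left alone, and a read brings it back to baseline.
BASELINE = 800          # hypothetical ADC counts when read regularly
RAIL = 4095             # value the node drifts towards (assumed 3V3 rail)
THRESHOLD_MIN = 13      # no visible drift until this long between reads
DRIFT_PER_MIN = 40      # assumed drift rate once leakage dominates

def read_after(minutes_idle):
    """ADC reading as a function of time since the previous read."""
    if minutes_idle <= THRESHOLD_MIN:
        return BASELINE                     # looks perfectly normal
    drift = (minutes_idle - THRESHOLD_MIN) * DRIFT_PER_MIN
    return min(BASELINE + drift, RAIL)      # leakage pulls it upward

print(read_after(5))    # read often enough: baseline, 800
print(read_after(30))   # left alone too long: drifted reading, 1480
```

The key property it captures is that sampling often enough hides the effect entirely, just like the watch: the drift only ever becomes visible when the gap between reads exceeds the threshold.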
Statistics: Posted by hippy — Thu Nov 21, 2024 3:23 pm