I'm seeing some confusing behavior for timer1 on an ATTiny85. My end goal is to make a rig that allows for precise autocalibration of the internal oscillator by providing an external clock to sync against.
I'm using an approach I found here: http://www.avrfreaks.net/index.php?name ... 92&start=0. The basic gist is:
- 32.768kHz crystal hooked up to timer0
- timer1 set to free run with overflows counted
- after timer0 has overflowed, count the total number of ticks of timer1 (by shifting up the overflows)
- if the number of clocks was too high, drop OSCCAL a little; if it was too low, bump OSCCAL up a little
- repeat until we're within an error bound
This all seems like it should work, and indeed, I was able to play around and manually adjust the OSCCAL value until the clock was precisely tuned. However, when I try to do it automatically, it doesn't work. After some debugging, I determined that the number of ticks counted by timer1 was way too low - something like 1/16th of what I expected.
This led me to try to figure out why timer1 isn't doing what I expect. I set it up to output its signal directly to the OC1A pin so I could measure its frequency, and I was surprised to find that it was only 15.7kHz! As far as I can tell, I have completely disabled all prescalers and CTC functions, so the output should just be a dump of the system clock. (When I configure timer0 in the same fashion, my oscilloscope's frequency counter gives up in terror, which leads me to believe that timer is doing what I expect.)
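For reference, this is roughly the Timer1 configuration I mean (a sketch from memory rather than my exact code; register and bit names are from the ATtiny85 datasheet):

```c
#include <avr/io.h>

/* Sketch: Timer1 free-running from the system clock, no prescale,
 * toggling OC1A so the frequency can be measured on a scope. */
void timer1_raw_output(void)
{
    PLLCSR &= ~_BV(PCKE);   /* clock Timer1 from the system clock, not the PLL */
    TCCR1 = _BV(COM1A0)     /* toggle OC1A (PB1) on compare match */
          | _BV(CS10);      /* CS1[3:0] = 0001 -> no prescaling (CK/1) */
                            /* CTC1 left clear, so the counter free-runs 0-255 */
    GTCCR = 0;              /* no PWM1B, no forced output compare */
    OCR1A = 0;              /* compare value for the toggle */
    DDRB |= _BV(PB1);       /* drive OC1A out so the scope can see it */
}
```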
Has anyone seen this sort of behavior before? Is there any sort of secret prescaler or setting that might have been turned on that would be causing this?
One last piece of information: I'm using the Arduino tiny core (from http://code.google.com/p/arduino-tiny/) and have made modifications that I believe disable the millis timer behavior, just for testing purposes. It's entirely possible that someone more experienced than I am would spot something in the timer bootup sequence that is hard to disable.