I'm thinking about programming a PIC microcontroller to detect two frequencies, 10 Hz and 20 Hz, on one input.

The challenge is making it as efficient as possible, so that it can be applied in series to a number of inputs without degrading the results.

The input is binary (1 or 0) but can change at random. This means that if both a 20 Hz and a 30 Hz source are present, you may end up seeing 10 Hz as a product of their interference (the beat frequency, |30 - 20| = 10 Hz), in which case you must simply do nothing.

I'm currently thinking of doing zero detection and timing how long it has been since the previous zero to determine the frequency, f = 1/t.
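
Roughly what I have in mind, as a quick C sketch (just a PC-side simulation, not real PIC code; the 1 kHz sample rate, the +/-2 Hz tolerance and the read_input() stub are all made-up numbers for illustration):

    #include <stdio.h>

    #define SAMPLE_HZ   1000      /* assumed polling rate            */
    #define TOLERANCE   2         /* +/- Hz accepted around 10 or 20 */

    /* Stand-in for reading the input pin; here it fakes a 20 Hz square wave. */
    static int read_input(unsigned long tick)
    {
        return (tick / (SAMPLE_HZ / 20 / 2)) & 1;
    }

    int main(void)
    {
        int prev = 0;
        unsigned long last_edge = 0;

        for (unsigned long tick = 0; tick < 5 * SAMPLE_HZ; tick++) {
            int cur = read_input(tick);

            if (cur && !prev) {                      /* rising edge */
                unsigned long period = tick - last_edge;
                last_edge = tick;

                if (period > 0) {
                    long freq = SAMPLE_HZ / (long)period;   /* f = 1/t */

                    if (freq >= 10 - TOLERANCE && freq <= 10 + TOLERANCE)
                        printf("tick %lu: ~10 Hz\n", tick);
                    else if (freq >= 20 - TOLERANCE && freq <= 20 + TOLERANCE)
                        printf("tick %lu: ~20 Hz\n", tick);
                    /* anything else: do nothing, per the interference rule */
                }
            }
            prev = cur;
        }
        return 0;
    }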

But my question, or challenge to anyone on the board, is: does anyone have a better or more optimal way of getting the job done?

I don't need source or anything (MASM-wise), unless you feel so inclined. I'm more or less looking for a better solution if there is one. Thanks again!

NaN
Posted on 2001-12-23 01:00:52 by NaN
I've never worked with PICs, but I've played with an Atmel 8-bit RISC processor. That microcontroller had internal counters (two actually, one 8-bit and one 16-bit) that could be clocked externally: every time the input was activated, the counter was incremented by one.
Then you could catch the overflow interrupt that occurred when the counter reached its maximum.
If your PIC has a counter of the same kind, you can wait until the overflow happens, then measure how many clock ticks have passed since the last overflow. I think this is more accurate than measuring a single cycle.
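
In numbers, something like this (a throwaway C calculation; the 1 MHz timebase and the tick count are just assumed values):

    #include <stdio.h>

    int main(void)
    {
        /* Assumed numbers: a 1 MHz timebase, and 20480 ticks measured
           between two overflows of the 8-bit (256-count) input counter. */
        const double tick_rate_hz    = 1e6;
        const double ticks_elapsed   = 20480.0;
        const double counts_per_wrap = 256.0;

        /* 256 input pulses took ticks_elapsed / tick_rate_hz seconds,
           so the input frequency is counts divided by time. */
        double freq = counts_per_wrap / (ticks_elapsed / tick_rate_hz);
        printf("input frequency: %.1f Hz\n", freq);   /* 12500.0 Hz */
        return 0;
    }
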
I once created a little frequency counter (single frequency, not like yours). It used the 8-bit counter to count input pulses, and on the overflow interrupt it incremented a counter in memory. The 16-bit counter was used as a timer, which simply caused an interrupt after one second or so had passed. Then the memory counter was read and multiplied by 256 (each 8-bit overflow is 256 pulses); the result was the input frequency.
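
For what it's worth, the mechanics looked roughly like this (a plain C simulation, not the real Atmel code; the pulse count and the one-second gate are only illustrative):

    #include <stdio.h>

    int main(void)
    {
        /* Simulated hardware: an 8-bit counter clocked by the input pin. */
        unsigned char hw_counter = 0;   /* wraps at 256 like the real one   */
        unsigned long overflows  = 0;   /* bumped by the overflow interrupt */

        /* Pretend the input produced 12345 pulses during the 1 s gate
           opened by the 16-bit timer. */
        const unsigned long input_pulses = 12345;

        for (unsigned long i = 0; i < input_pulses; i++) {
            hw_counter++;
            if (hw_counter == 0)        /* counter rolled over -> "interrupt" */
                overflows++;
        }

        /* When the 1 s timer interrupt fires, read out the overflow count:
           each overflow stands for 256 input pulses. */
        unsigned long freq = overflows * 256UL;
        printf("about %lu Hz (exact count was %lu)\n", freq, input_pulses);
        return 0;
    }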

Thomas
Posted on 2001-12-23 05:51:20 by Thomas