Microcontroller: ATmega328P on an Arduino Uno
Clock frequency: 16 MHz
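Both functions below run Timer1 from the /1024 prescaler, so one timer tick is 1024 / 16 MHz = 64 µs (15625 ticks per second); that is where the 0.000064f constant in the code comes from.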
void timeDelay_CTC(float sec, unsigned char times)   // 0.1 <= sec <= 4
{
    OCR1A  = (sec / 0.000064f) - 1;   // compare value = requested delay in 64 us ticks, minus 1
    TCCR1A = 0b00000000;
    TCCR1B = 0b00001101;              // CTC mode (WGM12 = 1), prescaler = 1024
    for (unsigned char i = 1; i <= times; i++)
    {
        while ((TIFR1 & (1 << OCF1A)) == 0);   // wait for a compare match
        TIFR1 |= (1 << OCF1A);                 // clear the flag by writing a 1 to it
    }
    TCCR1A = 0;                       // stop the timer
    TCCR1B = 0;
}
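For reference, I exercise the function with a minimal bare-metal test sketch along these lines (the LED on PB5, the Uno's on-board LED, is just an example pin):

#include <avr/io.h>

void timeDelay_CTC(float sec, unsigned char times);   // defined above

int main(void)
{
    DDRB |= (1 << DDB5);            // PB5 = on-board LED (digital pin 13) as output
    for (;;)
    {
        PORTB ^= (1 << PORTB5);     // toggle the LED
        timeDelay_CTC(0.5f, 2);     // two 0.5 s compare periods -> roughly a 1 s delay
    }
}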
timeDelay_CTC computes the number of delay ticks and then produces the delay in CTC mode. It works well. Now I want to write a similar function in Normal mode. Here is the code.
void timeDelay_NORM(float sec, unsigned char times)
{
    unsigned int cycle = (sec / 0.000064f);   // requested delay in 64 us ticks
    TCNT1  = 65534 - cycle;                   // preload so the counter overflows after "cycle" ticks
    TCCR1A = 0b00000000;
    TCCR1B = 0b00000101;                      // Normal mode, prescaler = 1024
    for (unsigned char x = 1; x <= times; x++)
    {
        while ((TIFR1 & (1 << TOV1)) == 0);   // wait for the overflow flag
        TIFR1 |= (1 << TOV1);                 // clear the flag by writing a 1 to it
    }
    TCCR1A = 0;                               // stop the timer
    TCCR1B = 0;
}
However, when the parameter "times" is greater than 1, this Normal-mode function delays much longer than expected. So I tried the following code.
void timeDelay_NORM(float sec, unsigned char times)
{
    //unsigned int cycle = (sec / 0.000064f);
    //TCNT1 = 65534 - cycle;
    TCNT1  = 49910;                    // cycles for 0.5 s
    TCCR1A = 0b00000000;
    TCCR1B = 0b00000101;               // Normal mode, prescaler = 1024
    //for( unsigned char x = 1; x <= 2; x++ )
    //{
        while ((TIFR1 & (1 << TOV1)) == 0);   // run the 0.5 s wait two times to delay 1 s
        TIFR1 |= (1 << TOV1);
        while ((TIFR1 & (1 << TOV1)) == 0);
        TIFR1 |= (1 << TOV1);
    //}
    TCCR1A = 0;                        // stop the timer
    TCCR1B = 0;
}
I found that when it runs the following two instructions the second time, the delay is much longer than expected: the whole function takes about 5 seconds instead of 1 second.
while( (TIFR1 & (1<<TOV1)) == 0 );
TIFR1 |= (1<<TOV1);
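One observation, in case it helps: a full 16-bit overflow period is 65536 ticks, and 65536 × 64 µs ≈ 4.19 s, which is roughly the size of the extra delay I am seeing on the second wait.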
Could you show me how to make this work, or give me some hints?
Thank you for your help!