Significant difference between millis() and micros()
When I run the code below, there is quite a variance between the results from millis() and micros().
With millis(), the time shown varies between 2 and 3 milliseconds.
However, if I change it to micros(), the time shown is about 8700 microseconds, or 8.7 milliseconds.
Can anyone explain why there is such a big difference?
Code:
unsigned long gettime_millis;
unsigned long newtimemillis;
unsigned long timelapsemillis;

void loop()
{
  // start timer
  gettime_millis = millis();    // or micros();
  drawscreen1();                // draw on the TFT screen
  // calculate the time taken by the call
  newtimemillis = millis();     // or micros();
  timelapsemillis = newtimemillis - gettime_millis;
  // need a bit of time to see the result
  delay(1000);
}
With such short times, I can imagine that happening.
The millis() value is not accurate to within a few milliseconds (read: it is inaccurate at this scale). It is updated in an interrupt, so its resolution is roughly one millisecond.
We don't have your whole sketch, so we don't know whether you used the right variable types (they should be unsigned long).
Perhaps you can try calling drawscreen1() in a loop, 20 or 100 times. Then millis() and micros() should measure the same delay.
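For example, a minimal sketch along those lines (assuming drawscreen1() is the drawing function from your sketch and Serial.begin() has been called in setup()) could time 100 calls with micros() and print the average:

void loop()
{
  // time 100 calls to drawscreen1() so the ~1 ms resolution of millis()
  // and the jitter of a single call average out
  unsigned long start = micros();
  for (int i = 0; i < 100; i++)
  {
    drawscreen1();
  }
  unsigned long elapsed = micros() - start;   // total time for 100 calls

  Serial.print("average per call (us): ");
  Serial.println(elapsed / 100);              // unsigned long division

  delay(1000);                                // give yourself time to read the result
}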