Today I had to measure the performance of a function running on some obscure ARM-derived hardware. Normally I'd just run it in a loop and measure how long it takes in total and per iteration, but due to its complexity and dependence on other factors, that wasn't an option. Since it took less than 1 ms, I couldn't use regular timers, and for odd reasons I couldn't use the high-resolution clock. What did I do? I used a god damn oscilloscope to measure the time the function kept a debug pin in the high state. It set the pin high upon entry and reset it to low before exiting.
And it worked perfectly and gave me stable and reliable timings in μs.
[#]programming #embedded
=> More information about this toot | More toots from branch300bpm@vivaldi.net