I once built an application that crashed after roughly 24 days of continuous use.
Testing it was... interesting.
So this was a soda dispenser interface ("Pepsi Spire", still in use).
It used Flash for the animation; it was my last work in Flash.
The machine was a very low-power NUC.
I spent a TON of time profiling it and optimizing the hell out of it. Everything was GPU-driven. I made sure draw calls were kept to a minimum, that there were NO memory leaks of any kind, used tricks like spritesheets and object pools, etc.
This was a consumer facing UI for 24h restaurants. It needed to be resilient.
Anyway, I came up with this "test" interface that would just let the application run and press around the UI (injecting mouse events, basically) to simulate operation. Like a webdriver kind of thing, all in real time, so I could monitor memory usage over time.
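For the curious, the idea was something like this. A minimal sketch in ActionScript 3 (the class name and details are hypothetical, not the actual harness): a Timer periodically picks a random point on the stage, finds whatever sits under it, and dispatches the mouse events a real tap would produce.

```
// Hypothetical sketch of a self-driving "monkey" harness in AS3.
package {
	import flash.display.DisplayObject;
	import flash.display.Stage;
	import flash.events.MouseEvent;
	import flash.events.TimerEvent;
	import flash.geom.Point;
	import flash.utils.Timer;

	public class MonkeyDriver {
		private var _stage:Stage;
		private var _timer:Timer;

		public function MonkeyDriver(stage:Stage, intervalMs:int = 1000) {
			_stage = stage;
			_timer = new Timer(intervalMs);
			_timer.addEventListener(TimerEvent.TIMER, onTick);
			_timer.start();
		}

		private function onTick(e:TimerEvent):void {
			// Pick a random stage coordinate and find the topmost object under it
			var global:Point = new Point(Math.random() * _stage.stageWidth, Math.random() * _stage.stageHeight);
			var under:Array = _stage.getObjectsUnderPoint(global);
			var target:DisplayObject = under.length > 0 ? under[under.length - 1] as DisplayObject : _stage;
			var local:Point = target.globalToLocal(global);
			// Simulate a full press: down, up, then click, all bubbling like a real tap
			target.dispatchEvent(new MouseEvent(MouseEvent.MOUSE_DOWN, true, false, local.x, local.y));
			target.dispatchEvent(new MouseEvent(MouseEvent.MOUSE_UP, true, false, local.x, local.y));
			target.dispatchEvent(new MouseEvent(MouseEvent.CLICK, true, false, local.x, local.y));
		}
	}
}
```

Synthetic MouseEvents exercise the same listeners a real touch would, which is what makes long-run memory monitoring meaningful.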
Luckily I had a separate unit, so I'd just deploy the interface there and leave it running permanently to make sure it didn't crash. It was also a nice conversation piece - I had a fountain machine UI self-operating at my desk 24/7.
I spent a lot of time debugging it. I even ran into a severe Flash bug with string interning that I had to circumvent (it made me appreciate languages that handle strings well!).
Eventually I got to the point where the interface ran perfectly for many days, with no leaks or increases in memory use. In fact, it basically allocated all the GPU/CPU memory it needed right at the start - made me appreciate the meaning behind "unused memory is wasted memory". Then it just recycled and reused some stuff.
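That "recycled and reused" part is essentially object pooling: allocate everything once at startup, then borrow and return instances, so steady-state memory stays flat. A minimal sketch of the pattern, not the actual code - "PooledSprite" is a hypothetical stand-in for whatever display object gets reused:

```
// Minimal object-pool sketch in AS3: all allocation happens in the
// constructor, none during normal operation.
package {
	public class SpritePool {
		private var _free:Vector.<PooledSprite> = new Vector.<PooledSprite>();

		public function SpritePool(size:int) {
			for (var i:int = 0; i < size; i++) {
				_free.push(new PooledSprite()); // preallocate everything up front
			}
		}

		public function acquire():PooledSprite {
			return _free.length > 0 ? _free.pop() : null; // never allocate mid-run
		}

		public function release(s:PooledSprite):void {
			s.reset();       // restore to a clean initial state before reuse
			_free.push(s);   // back into the pool
		}
	}
}

import flash.display.Sprite;

// Hypothetical pooled display object; stands in for whatever gets reused.
class PooledSprite extends Sprite {
	public function reset():void {
		x = y = 0;
		alpha = 1;
		visible = false; // parked until acquired again
	}
}
```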
But.
It turned out that after 24 days or so, everything sorta went haywire. Animations slowed down to a crawl (and the interface was always animating). I didn't understand why at first.
Eventually I figured it out: to control animation (and a bunch of other things), I used ActionScript's getTimer(), which is "the number of milliseconds that have elapsed since the Flash runtime virtual machine (..) started".
And that's an integer.
So the problem becomes apparent: the maximum value of a 32-bit signed int is 2,147,483,647 (2^31-1), which is roughly 24.8 days' worth of milliseconds. After that, getTimer() overflows and goes negative, so any time measured relative to when an animation started becomes negative too, and things just break apart.
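To make the failure concrete, here's roughly what the animation code would observe. The values are simulated (you can't fast-forward getTimer() itself), but the arithmetic is exact:

```
// Simulated values showing why elapsed-time math breaks at overflow.
// 2,147,483,647 ms / 86,400,000 ms per day ≈ 24.86 days.
var startTime:Number = 2147480000;  // ms: an animation started just before the limit
var now:Number = -2147477296;       // what an overflowed getTimer() reports ~10 s later
trace(now - startTime);             // -4294957296: "elapsed" time is hugely negative
```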
Eventually I tried wrapping getTimer() to reinterpret the int32 overflow as a uint32, so it could last for ~49 days.
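The wrapper was along these lines - a sketch of the idea as described, not the actual code (the function name is hypothetical):

```
import flash.utils.getTimer;

// Reinterpret getTimer()'s signed overflow as an unsigned 32-bit
// count: once the int goes negative, adding 2^32 maps it back onto
// [2^31, 2^32), buying ~49.7 days before it wraps again.
function safeGetTimer():Number {
	var t:Number = getTimer();
	return t < 0 ? t + 4294967296 : t;
}
```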
Unfortunately, I never got to test it. It relied on the real system time, so verifying the fix meant waiting 24+ days.
For several reasons I never got the time to dedicate to fixing that. The partners who built the machine also added a system where it would reboot every once in a while (to prevent other issues), so any problem with the UI's lifetime turned out to be low priority.
But I'm proud of the work I did there, and am always reminded of how unstable things are today.
Websites and apps break often, and we're expected to reload/restart when that happens - "turn it off and on again". It's no big deal.