It is bonkers how good #curl is on a bad connection. I can’t load Gmail, Reddit, or DuckDuckGo, and it takes me 2-3 tries to post to Mastodon, but I managed to (eventually) download a 200-megabyte binary with curl. It took 20 minutes, but it succeeded on the first try. Firefox times out or gives HTTPS errors trying to load a basic static documentation site, but it takes less than a second to just curl down the HTML pages and open them locally. (Without CSS, unfortunately, but I’ll take what I can get.)
This week has really taught me how fragile most software has become in the face of network problems.
=> More information about this toot | More toots from jnkrtech@treehouse.systems
@jnkrtech Yeah, for sure. So much of the internet does not work on a slow connection. Luckily there are parts that do, and unsurprisingly they work well.
Yeah, just being able to download something at all has been getting worse over the years. Lots of software doesn’t resume failed downloads, for instance.
Adding the various retry flags to curl makes it much more resilient, even on good networks. But most people don’t use them these days.
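For example, a download along these lines is much harder to kill. This is a sketch, not anyone’s exact command: the URL is made up, the flags are standard curl options, and resuming assumes the server honours range requests.

```
# A resilient-download sketch; the URL is hypothetical, the flags are real curl options.
# --retry N            retry transient failures (timeouts, 5xx) up to N times
# --retry-delay S      wait S seconds between attempts
# --retry-max-time S   stop retrying after S seconds in total
# --retry-all-errors   retry on any error, not just transient ones (curl 7.71.0+)
# --retry-connrefused  treat a refused connection as transient too (curl 7.52.0+)
# -C -                 resume a partial download from where it left off
curl --retry 5 --retry-delay 2 --retry-max-time 600 \
  --retry-all-errors --retry-connrefused \
  -C - -L -O https://example.com/big-binary.tar.gz
```

If the connection drops entirely, rerunning the same command with -C - picks up from the bytes already on disk instead of starting over.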
=> More information about this toot | More toots from renedudfield@fosstodon.org