Reminds me of the UK’s Government Digital Service, who want to digitise government processes but also have a responsibility to keep those services as accessible and streamlined as possible, so that even a homeless person using a £10 phone on a 2G data service still has an acceptable experience.
An example: here they painstakingly remove jQuery (most modern frameworks are far too big) from the site and shave 32 kB off the page size.
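For a sense of what such a removal looks like in practice: many jQuery utility calls now have built-in equivalents and can be swapped one-for-one. A minimal, purely illustrative sketch (hypothetical examples, not taken from the GOV.UK codebase, which also had to migrate DOM and event handling):

```javascript
// Illustrative one-for-one swaps of jQuery utilities for built-ins.
// (Invented examples for demonstration only.)

// $.extend({}, defaults, overrides)  ->  Object.assign({}, defaults, overrides)
const defaults = { timeout: 30, retries: 3 };
const options = Object.assign({}, defaults, { retries: 5 });

// $.map(arr, fn)  ->  arr.map(fn)
const sizesInBytes = [10, 32, 10].map((kb) => kb * 1024);

// $.inArray(x, arr) !== -1  ->  arr.includes(x)
const shipsJQuery = ["app.js", "jquery.js"].includes("jquery.js");

// In the browser, $("#menu").addClass("open") becomes
// document.querySelector("#menu").classList.add("open").

console.log(options.retries, sizesInBytes[1], shipsJQuery);
```

Each swap removes a reason to ship the library at all, which is where the 32 kB comes from.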
That’s the most professional comment section I’ve ever fucking seen.
Hasn’t been linked to reddit yet probably.
Getting away from reddit has shown me that there are unspoiled places in the digital world out there: communities of people who actually care about the topic, not performative posturing and internet attention.
a) don’t let in people who act like petulant children, and b) give adults an outlet for the occasional outburst that would make them sound like petulant children
Website is amazingly responsive as well, seems to be working.
The issue with UK services is that they’re all fucking random and plenty of sections don’t work. There are billions of logins, bugs, and sometimes you just get redirected to some bloody nightmare portal from the 1990s. And EU citizens couldn’t log in to the HMRC portal for years after Brexit, what a fucking joke! And all they do is spend time removing jQuery, good fucking job!
At a certain point it makes more sense to subsidize better low-end hardware than to make every web site usable on a 20-year-old flip phone. I’d argue that if saving 32 kB is considered a big win, you’re well past that point. Get that homeless guy a £50 phone and quit wasting the time of a bunch of engineers who make more than that in an hour.
Get that homeless guy a home.
Also, if you are in a basement, the mountains, or the middle of Siberia, waiting for 32 kB takes quite some time.
I’m all for ending homelessness, but that’s really a different problem than we were discussing. I’m pretty confident jQuery isn’t stopping anyone from being housed.
Anyway, there’s no way you’re gonna convince me 32 kB is a lot of data. It’s just not. Even the slowest 3G connections can download that much in half a second. Just the text of this thread is probably more than 32 kB. If you can’t download that much data, you only technically have Internet service at all.
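A back-of-envelope check of that claim, assuming the original UMTS R99 downlink of 384 kbit/s as "slowest 3G" and ignoring latency and packet loss:

```javascript
// Time to transfer 32 kB over a 384 kbit/s link, ignoring round trips.
const payloadBits = 32 * 1024 * 8;    // 32 kB expressed in bits
const linkBitsPerSecond = 384 * 1000; // assumed worst-case 3G downlink
const seconds = payloadBits / linkBitsPerSecond;
console.log(seconds.toFixed(2)); // "0.68" - a bit over half a second
```

So "half a second" is in the right ballpark, though real connections add round-trip latency and retransmissions on top.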
Even 3G is not always available, and even 3G is sometimes slower than 2G.
32 kB here, 32 kB there, and boom: you have Bitbucket.
At least in the US, the reason 3G isn’t available is that it has been phased out, as has 2G. You may as well complain about how slow it is to send data with smoke signals, because 4G is table stakes for an internet-capable device now.
The US? The US is a wild place. A lot of people are still on ADSL, but 2G and 3G equipment gets thrown away with a “lol, your problem, buy a new phone”. I wouldn’t be surprised if there are a lot of places where the internet is less stable than on a train going through a tunnel under a mountain in the middle of Siberia. Which means no internet.
I wonder what happens to internet connections in rural areas of the USA, since you suddenly started talking about it.
In Europe (or at least in my part of Europe) there are places where mobile internet is overloaded, like the subway system and the city center, and places where mobile internet is very unstable, like my house in a suburban area and, again, trains.
And, as I mentioned, Bitbucket. It struggles to load even on an average PC.
Wow, nice. Sexcellent.
You mean eggcellent?
It’s a TF2 meme.
It’s a T9 meme.
I make sure my own web game can run smoothly on crappy hardware. It runs well on my gaming laptop downclocked to 400 MHz with a 4x slowdown set in Chrome. It also loads in a couple of seconds on a typical crappy Internet connection of 200 kbps and >10% packet loss. However, it doesn’t run smoothly on my Snapdragon 425 phone or my old Core 2 Duo laptop. Is this my game or just browser overhead?
When you see what ONE coder was able to do in the 80s, with 64 kB of RAM, on a 4 MHz CPU, and in assembly, it’s quite incredible. I miss my Amstrad CPC6128 and all its good games.
Still happens.
Animal Well was coded by one guy, and it was ~35 MB on release (I think it’s above 100 MB at this point after a few updates, but still). The game is massive and pretty complex. And it’s the size of an SNES ROM.
Dwarf Fortress has to be one of the most complex simulations ever created, developed by two brothers and given away for free for several decades. Prior to adding actual graphics, DF was ~100 MB, and the Steam version is still remarkably compact.
I am consistently amazed by people’s ingenuity with this stuff.
SNES ROMs were actually around 4 MB. People always spoke about them being 32 Meg or whatever, but they meant megabits.
I did like Animal Well, but gave up after looking at one of the bunny solutions and deciding I didn’t have the patience for that.
I think most of the size of games is just graphics and audio. The code for most games is pretty small, but for some godforsaken reason it’s really important that they include incredibly detailed doorknobs and 50 hours of high-quality speech for a dozen languages in raw format.
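The raw-audio point holds up to rough arithmetic (hypothetical but typical numbers: 16-bit stereo PCM at 44.1 kHz):

```javascript
// Size of 50 hours of uncompressed speech per language, times 12 languages.
const seconds = 50 * 3600;                    // 50 hours
const bytesPerSecond = 44100 * 2 * 2;         // 44.1 kHz * 16-bit * stereo
const perLanguageGB = (seconds * bytesPerSecond) / 1e9;
const totalGB = 12 * perLanguageGB;
console.log(perLanguageGB.toFixed(1), totalGB.toFixed(0)); // "31.8" "381"
```

Hundreds of gigabytes for dialogue alone, which is why shipped games normally use compressed formats rather than raw PCM.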
Planned obsolescence is one of the major engines that keep our current system of oligarchic hypercapitalism alive. Won’t anybody think of the poor oligarchs?!?
Stop making him cry, ask him some Rampart questions!
FOR THE LOVE OF GOD, THINK OF THE SHAREHOLDERS!
I like how KDE has been getting faster and faster as time went on. Like Lisa Simpson’s perpetuum mobile.
antiX made my old Chromebooks usable. Old tech is fun.
The thing is that developers tend to keep things as simple as possible and even over-optimize stuff; when you find bloatware, it’s usually because some manager decided to have it.
It’s the marketing. Always the marketing. Especially the SEO guys.
One SEO guy we worked with told us not to cache our websites because he was convinced it would help our traffic. He badgered us about it for weeks, showed us some bullshit graphs and whatever. One day we got fed up and told him we’d disabled the cache and that he should keep an eye out for any improvements in traffic. Obviously we didn’t actually do anything of the sort, because we are not fucking idiots. A couple of days later the SEO wizard sent us another bunch of figures and said “see, I told you it would help, I know my stuff”. He did not, in fact, know his stuff.
Ahaha no way.
Reminds me of a funny story I heard Tom Petty once tell. Apparently, he had a buddy with a POS car and a crappy stereo, and Tom insisted that all his records be mixed and mastered not so that they sounded great on the studio’s million-dollar equipment, but so that they sounded good in his friend’s car.
Reminds me of the ass audio mixing in movies, where it’s only enjoyable in a 7.1 cinema or your rich friend’s home theater, but not on your own setup.
It seems we’ve lost sight of reality there.
As we don’t intend to go to the cinema much any more, I hope they bring back essentially a Dolby noise switch for movies. I don’t want to sacrifice too much, but booming noise followed by what comes out as whispered dialogue really cheapens the experience.
I hope they can find a process that gives us back a soundtrack for the sub-17:7 sound system.
Dynamic Range Compression. VLC has it, possibly under a different name though. Set it up on my theater PC, and I almost don’t need subtitles anymore.
On Windows: https://www.fxsound.com/ (now free and open source)
On old Linux: PulseEffects
On new Linux: EasyEffects
Those really make your crappy speakers or headphones go the extra mile.
Resources are just way cheaper than developers.
It’s a lot cheaper to buy double the RAM than it is to pay someone to optimize your code.
And if you’re working with code that requires resource optimization that serious, you’ll invariably end up with low-level code libraries that are hard to maintain.
… But fuck the always-on internet connection and DRM, for sure.
If you consider only the RAM on the developers’ PCs, maybe. If you count the thousands of customer PCs, then optimizing the code outperforms hardware upgrades pretty fast. If millions of people have to buy new hardware because of a new Windows feature, that’s pretty disastrous from a sustainability point of view.
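Toy numbers to make the customer-fleet point concrete (every figure here is invented for illustration):

```javascript
// One developer-month of optimization vs. a RAM bump across the install base.
const optimizationCost = 160 * 100;   // 160 hours at an assumed $100/hour
const customers = 50000;              // hypothetical install base
const ramCostPerMachine = 40;         // say, +8 GB per customer PC
const fleetUpgradeCost = customers * ramCostPerMachine;
console.log(optimizationCost, fleetUpgradeCost); // 16000 2000000
```

The "RAM is cheaper than developers" logic only holds as long as someone else, namely the customer, is paying for the RAM.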
Last time I checked, your personal computer wasn’t a company cost.
Until it is, nothing changes. And to be totally frank, the last thing I want is to be on a corporate machine at home.
When I was last looking for a fully remote job, a lot of companies gave you a “technology allowance” every few years where they give you money to buy a computer/laptop. You could buy whatever you wanted but you had that fixed allowance. The computer belonged to you and you connected to their virtual desktops for work.
Honestly, I see more companies going in this direction. My work laptop has an i7 and 16 GB of RAM. All I do is use Chrome.
It’d be nice to have that, yeah. My company issued me a laptop with only 16 GB of RAM to try to build Android projects.
I don’t know if you’re familiar with Gradle builds, but a multi-module project regularly consumes 20+ GB of RAM during a build. Despite the cost difference being paid for in productivity gains within a month, it took 6 months and a lot of fighting to get a 32 GB laptop.
My builds immediately went from 8-15 minutes down to 1-4.
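For anyone hitting the same wall: Gradle’s memory and parallelism knobs live in gradle.properties. A sketch of typical settings for a beefy machine (values are guesses and depend entirely on module count and plugins):

```properties
# Hypothetical settings for a large multi-module build; tune to taste.
org.gradle.jvmargs=-Xmx8g -XX:MaxMetaspaceSize=1g
org.gradle.parallel=true
org.gradle.caching=true
```

Raising the daemon heap only helps if the physical RAM is actually there, which was the whole fight.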
I always felt that this is where cloud computing should be. If you’re not building all the time, then 32GB is overkill.
I know most editing and rendering of TV shows happens on someone’s computer and not in the cloud, but wouldn’t it be more efficient to push the work to the cloud, where you can create instances with a ton of RAM?
I have to believe this is a thing. If it isn’t, someone should take my idea and then give me a slice.
It’s how big orgs like Google do it, sure. Working there I had 192 GB of RAM on my cloudtop.
That’s not exactly reducing the total spend on dev RAM though; quite the opposite. It’s making more RAM available to the devs than you can fit in a single device.
But you can’t have it both ways: you can’t bitch and moan about “always-on internet connections” and simultaneously push for an always-online IDE to do your builds.
I want to be able to work offline whenever I need to. That’s not possible if my resource starved terminal requires an Internet connection to run.
RAM is dirt cheap and only getting cheaper.
“Use cloud if available”?