- GNOME removed all UI controls for setting solid color backgrounds, but still technically supports it if you manually set a bunch of config keys — which seem to randomly change between versions (see: https://www.tc3.dev/posts/2021-09-04-gnome-3-solid-color-bac...).
The pattern here seems pretty clear: a half-baked feature kept alive for niche users, rather than either properly supporting or cleanly deprecating it. Personally, I’d love to simply set an RGB value without needing to generate a custom image. But given the state of things, I’d rather have one solid, well-maintained wallpaper system than flaky background color logic that’s barely hanging on.
tough 4 minutes ago [-]
It kinda seems to me this is better solved by a webapp that lets you generate any background image and download and you set it up on your OS.
gjsman-1000 10 seconds ago [-]
When you say it that way, it's actually baffling that there are still separate code paths for solid background colors.
Just offer a background color picker, and have it generate a 1x1 PNG for the selected color. Just like that, you can use the image-background code path for everything, and the code for generating the image almost certainly exists in the OS already.
woolion 6 hours ago [-]
I checked in KDE, since I'm generally confused as to why it's not more popular now: in the wallpaper settings you choose `wallpaper type: plain color` and it gives you a color picker to set it.
It also shows you the screen you're setting it for, and a checkbox to set it for all screens at once.
cogman10 3 hours ago [-]
I think it's historic reasons.
KDE used to be the "bloated" desktop way back when (I know, pretty silly and laughable now given the current state of things).
That cemented Gnome/Mate into a lot of major distros as the primary DE, Ubuntu being the most famous.
The Qt licensing situation is also a bit of a bizarre quagmire. There are certainly people who don't like KDE for more ideological reasons.
Personally, none of this bothers me and it's what I use on my personal computer. KDE is just so close to how I got used to interacting with computers growing up through the Win95 era. It is so close to the Windows experience you want to have.
eadmund 35 minutes ago [-]
> KDE used to be the "bloated" desktop
That’s not my recollection. I believe that the non-free license you mention was the major factor, in addition to the fact that KDE was written in C++ at a time when the free software community still preferred to write software primarily in C.
GNOME was written using a free software toolkit, and it was written in C, and it was associated with the FSF.
throwup238 9 minutes ago [-]
There were several eras of user and developer consternation. I definitely had the impression that the GP describes in the mid to late 2000s.
queenkjuul 3 hours ago [-]
KDE still seems pretty bloated
jlpom 3 hours ago [-]
I rather think the right word is clunky: one of the devs is attached to server-side decorations and against CSD for some reason (none of his arguments make sense), so every stock app is difficult to read and takes up unneeded screen space. It's just bad UX.
skyyler 3 hours ago [-]
One person's "bloat" is another person's "batteries included".
gymbeaux 2 hours ago [-]
KDE Wallet though?
woolion 32 minutes ago [-]
Funny, I used to think it was bloat but then I got to use it to store passwords for remote servers accessed with ssh, and now it's a nice 'batteries included' for me, as GP mentioned. It has become so because it is nicely and seamlessly integrated.
tremon 1 hour ago [-]
What about it?
greenavocado 5 hours ago [-]
Those of us that use KDE don't necessarily broadcast it
LastMuel 5 hours ago [-]
Why? KDE is awesome.
shashashank 23 minutes ago [-]
I'm a recent convert from Gnome. Mostly cause Gnome seemed to have too many mysterious crashes—waking from sleep, switching between windows when video was playing—so much so that it was just easier to switch to something modern (as opposed to sway/i3) and not have to learn/rewrite keybindings.
greenavocado 5 hours ago [-]
Too busy building
skyyler 4 hours ago [-]
Too busy to discuss your preferences. Not busy enough to not discuss your being too busy to discuss your preferences on internet forums.
I'm not trying to be mean here, I'm just fascinated by what people will consider to be a waste of time.
greenavocado 4 hours ago [-]
Evangelizing KDE is not something I care about
Suppafly 42 minutes ago [-]
>Evangelizing KDE is not something I care about
This. Sometimes it's enough for things to just work as you expect so you can do your actual work. I don't really understand why so many people are unpaid part-time evangelizers.
skyyler 3 hours ago [-]
Hm. It seems like you care about it at least a little, otherwise why would this thread exist? It's okay to care about things.
artursapek 39 minutes ago [-]
Yet you’re here
queenkjuul 3 hours ago [-]
Well, those that aren't you, apparently
myfonj 7 hours ago [-]
Last time I used an Android (Galaxy) phone, to have a solid pitch-black background (which I thought made sense for modern phone displays energy-wise, besides looking pretty swell), I had to download — yes, D-O-W-N-L-O-A-D — some black image from some "Galaxy Store" thing or whatnot to achieve that. It was free, but it seemed like an exception there.
Something that should be a default option, or a single-tap switch in settings, turned into a chore consisting of a period of agonising disbelief, doubt, denial, search, and eventually bitter acceptance.
whynotmaybe 6 hours ago [-]
I had to take a picture with my finger on the camera to have a black image to use as background.
kjkjadksj 1 hour ago [-]
I don’t know if that would do it or just return your dark current image.
The best thing to do is just create a file matching your screen resolution in your favorite image editor and fill it with actual true black. Save it as a PNG and send it to the phone.
myfonj 1 hour ago [-]
I wasn't sure whether the sensor in pitch dark would catch some other radiation and output the pixel "grain" you see at high ISO, but from what I tried, the photo looks pretty black visually, so the method is surprisingly usable.
As for "same image as your screen resolution": screenshot sounds like the exact fitting thing here. As a challenge, tried making screenshot black using stock Samsung "Gallery" and it seems that repeated Edit - Brightness: -100 - Save as copy, then open the copy and goto back to Edit can do the trick as well, after four or so copies. (Copies, because there is no way to re-apply same effect on the same photo, apparently.)
drewolbrich 1 hours ago [-]
Another way is to do a Google image search for "black".
gymbeaux 2 hours ago [-]
A black background on an AMOLED display (something Samsung Galaxy phones tended to have) would use less energy because "pitch black" on AMOLED literally turns the underlying pixels off. With LED-backlit LCDs, that's not possible.
layer8 20 minutes ago [-]
It’s proportional to brightness on OLED. You save a lot with a dark background/dark mode already, it doesn’t need to be specifically black.
graemep 5 hours ago [-]
I use an alternative home screen app to deal with stuff like this.
wlesieutre 2 hours ago [-]
Also in macOS recently, I've set a solid color and it has reverted itself to some default forest photo several times.
I suspect this is related to the System Preferences rebuild, since it's worked fine for 20+ years of OS X before that.
hnlmorg 4 hours ago [-]
KDE works pretty well here. I set a solid colour of black on one PC which powers a projector.
layer8 23 minutes ago [-]
I keep a single-color image around for that reason.
RajT88 4 hours ago [-]
I too prefer a solid color.
However, I've noticed there's not much point in changing it. Showing the desktop is a waste of screen real estate because of generations of abuse of desktop shortcuts. Even if you are careful, it becomes a cluttered wasteland (on Windows, anyway). I just learned to never use the desktop for anything and to always have windows up on various monitors.
hennell 52 minutes ago [-]
My Windows desktop remains pretty organised; occasionally an app icon might appear. My Mac, however, I gave up on: it's just a mess of screenshots and files you have to drag 'n' drop from somewhere, and the desktop is just where that ends up. I used to have a script that moved the screenshots, but it's easier to just live in chaos.
duffyjp 17 minutes ago [-]
On Mac I disable icons from appearing on the desktop and instead add one of those fan-out folder links to my Dock, sorted newest first. I just checked and I have 547 screenshots dating back to around this time last year. Maybe it's time for a purge. :)
I thought Windows programs generally asked if you want to make a desktop icon for them. (But I only use Windows as a video game console).
RajT88 3 hours ago [-]
Not always. It's up to their installer. And the installer doesn't have to ask (it can just do it).
The situation is better these days with Windows Store apps. Still, I developed the habit of just never using the desktop in the XP days, when things were really bad.
There was a war over your eyeballs, with shady software vendors fighting over desktop space, start menu space, taskbar space, even fucking file associations. I recall that for a little while, RealPlayer and Windows Media Player used to yank file associations back and forth each time they ran, even if you tried to make them stop.
rpd9803 9 minutes ago [-]
You should view this page in Microsoft Edge for the best experience! \s
queenkjuul 3 hours ago [-]
No, one of my biggest complaints about Windows is the sheer number of apps that add an icon without asking. Sometimes it's even worse than an app, and Nvidia or AMD will add one in a driver update. Drives me nuts.
sixothree 3 hours ago [-]
I would love to store documents in the My Documents folder if applications actually had respect for me. Windows should never allow an application to just dump stuff in the Documents or Desktop folder without my permission.
layer8 16 minutes ago [-]
Just create and use any other folder you like under %USERPROFILE% (usually C:\Users\username)? My Documents is a default location, but you can ignore it. Simply use your user folder as you would under Linux or whatever.
RajT88 2 hours ago [-]
Don't get me started... Office these days adds friction if you want to save documents anywhere but the Documents folder (where they get uploaded to OneDrive if you have it set up).
They've also disabled auto-save if you don't have the documents backed up by OneDrive, which is the most egregious for me.
layer8 11 minutes ago [-]
Pro tip: Press F12 to directly open the traditional Save As dialog.
Office has never had auto-save for local documents; it only had (and still has) periodic recovery saves. The primary reason they added auto-save for cloud documents is to facilitate multiplayer online editing.
CRConrad 4 hours ago [-]
Windows 10 has a setting to allow you to choose if it should show or hide desktop icons. Dunno about 11.
imzadi 3 hours ago [-]
Yes, you can hide icons in windows 11 just by right clicking on the desktop and going to view > show desktop icons.
andrepd 7 hours ago [-]
> GNOME removed all UI controls for setting solid color backgrounds, but still technically supports it if you manually set a bunch of config keys — which seem to randomly change between versions
There's the peak GNOME experience.
AHTERIX5000 6 hours ago [-]
Or the display sleep menu, which offers timeout choices of 1, 2, 5, 10, 15, and 30 min, but no more than that unless you use an external config editor.
amlib 3 hours ago [-]
30 minutes? That's a luxury! For me it has always been limited to 15 minutes in the GUI.
lozf 5 hours ago [-]
You can also use `gsettings set ...` on the CLI.
CRConrad 4 hours ago [-]
God forbid an input box in a GUI config dialog.
nullc 1 hour ago [-]
My analogous gnome experience was that on my tv-computer I was using 4x scaling, because TV and because my distance vision stinks.
At some point they decided 2x (3x?) scaling was enough for anyone and took away 4x. I didn't notice because I was already set to 4x and it continued working. Somewhat later they took away the backend, and then my system crashed at login with no error message.
After much troubleshooting had replaced a movie night, I inquired about the functionality being removed and was berated for using an undocumented/unsupported feature (because I had kept using it after the interface to set it was removed, without my knowledge).
I'll never use gnome again if I can help it.
sixothree 3 hours ago [-]
The number of times my solid color preference has been replaced with all black over the last 10 years is absolutely astounding. I have no understanding of why this seems to be so difficult.
nandomrumber 11 hours ago [-]
I recently acquired a ThinkStation P910, a dual-CPU Xeon E26xx machine with 64GB RAM and a GTX 1080.
Quite a capable machine for my uses.
Not supported in Windows 11. Maybe with some additional config? Can't be bothered with something that might turn out to be fragile and need more maintenance than I can be bothered with. That's a young man's game.
Ok, I’m about due to give Linux another tickle anyways.
Hmm, which distro… can always give a few a spin.
Keep it simple, Pop!_OS.
Installed fast, no issues, runs fine, stable. Seems entirely usable.
Customisations? Nah, keep it simple.
I’ll set a black background though.
Nope.
jeroenhd 8 hours ago [-]
I wouldn't go with Pop!_OS with that hardware. The Nvidia GPU isn't supported by the new Nvidia driver, and because System76 is hard at work writing Cosmic, their repositories are quite outdated. Support for things like Wayland is quite mediocre in the old drivers.
Switching to upstream (Ubuntu) with KDE would probably be more your speed.
jessekv 6 hours ago [-]
I recently updated Pop OS to the Cosmic Alpha and it is much more to my liking than Gnome 3 ever was. I have an older Nvidia GPU though.
RedShift1 11 hours ago [-]
Par for the course with Gnome though, if you like customization, KDE is better.
loftsy 10 hours ago [-]
Just make a black png and use it as the background?
doublerabbit 8 hours ago [-]
Sure but why should a workaround be required for a feature that should work?
dizhn 7 hours ago [-]
If they can't get colors to work, the software should just create the image itself and fake it.
ohgr 10 hours ago [-]
As much as I'd like a machine like that, my 5-year-old random Lenovo 10500 desktop is probably more useful as a daily-driver machine than an older workstation-class machine, at the cost of giving up ECC RAM. I bought it when it was 3 years old and will use it for 4 years, then get rid of it before it hits the tail end and the power supply dies or something else goes wrong. Running it like that, you avoid all the weird problems, the depreciation, and the energy costs. And you gain things like relatively competent NVMe slots, USB-C, and other luxuries. And the single-core performance is better than Xeons of that era and earlier.
win11 ltsc works perfectly on it. With a solid background :D
methuselah_in 11 hours ago [-]
Go with GNOME on Fedora, and the rest will be history. Or Debian stable.
MartinGAugustin 10 hours ago [-]
Replace Windows with an alternative operating system for performance computing.
KDE Plasma (on any distro I guess) has clear, easy to reach and easy to use settings for this.
necovek 13 hours ago [-]
It's funny to see this: after avoiding the Windows world for the last 25 years, and being back in the corporate world for the last few, I see this pattern with Microsoft tools all the time.
Teams not loading due to security issues, but notifications coming through with full content of messages. Ability to type a handful of words in cloud version of Word (or paste full documents) before security check catches up and requires me to set up a sensitivity label. Etc.
It mostly tells me about MS doing very bad software architecture for web apps, though apparently the desktop apps are not immune to it either.
Enginerrrd 13 hours ago [-]
It's not just MS. I think they might have fixed it now, but my personal favorite was when Google Photos would send me a notification with a preview of an AI-generated album of my photos they made for me, even though the app never had permission (on Android) to look at said photos. And it too would then "catch up" and ask permission to see my files, and I'd say "no", and then the preview would go away.
kn0where 12 hours ago [-]
Similar with Google Docs: if you share a link to a doc, even if the doc is access-restricted, anyone can see the thumbnail icon with the contents of page 1.
harrall 12 hours ago [-]
I swear 70% of my value at work is pointing out details like this during meetings when no one else will… before we build it.
zerkten 4 hours ago [-]
Product managers will decide to show the thumbnail in these situations because it results in more click-throughs. In many cases they'll have done their research and know that many customers take steps to restrict what they share (think profitable but conservative companies), but will choose to show the thumbnails anyway.
Some customers will push back and have enough leverage to get an exception, but the default answer will be that this can't be disabled. You'll have some sales engineer challenged about the product behavior as part of an RFP, and they'll try to convince you that nothing is leaked, all while knowing the financial opportunity with these customers would be much larger if there were more concern for the customer.
necovek 12 hours ago [-]
In this particular issue, MS has an opposite problem: you grab a document link, grant a permission to someone, and they still can't access the document through the original link (you need to fetch a new link just for them).
antgiant 5 hours ago [-]
Kind of. The default behavior is to create a new link. So when you grant someone access you are actually creating a new link. However, you can find the buried manage access settings and change the permissions on the original link. If you do that then they can use the original link.
(Teams makes this Byzantine in the extreme, as you have to go find the folder it drops all shared files into in order to reach the manage-access settings. But it does allow you to retroactively change access, even for things shared in Teams.)
necovek 47 minutes ago [-]
Yeah, many things are definitely possible but to me it seems like the user experience is driven by the technical implementation and architecture, instead of vice versa.
From the outside looking in, it's the age old organizational problem when there are no good synergies between customer experience and development teams.
Possibly they refer to this: https://news.ycombinator.com/item?id=39172527
“I received a link to a Google doc on slack recently, but the owner had forgotten to share permissions with me. Though I couldn't view the doc when I clicked it, I did notice that I could view the first page of the doc in the link preview. It was very high res and I could view the text clearly.”
If so, pasting that link into Slack may reveal its first page.
necovek 44 minutes ago [-]
It could even be a Google Docs Slack app with a bug that generates a preview if the sharer has permissions on the doc (and they usually do) and has preview generation enabled.
areyourllySorry 7 hours ago [-]
Permissions might have changed after the link preview was generated.
Spare_account 11 hours ago [-]
(Not the same guy but) I've definitely heard about this bug in the past, but I assume it is fixed now. I can't actually find a reference for it. If I find one within the hackernews comment edit window I'll add it here.
SonOfLilit 9 hours ago [-]
I can't
dns_snek 11 hours ago [-]
That's odd, were you synchronizing your photos to Google Photos[1] in any way, from any device? Presumably they would've had to be synchronized to Google at some point for them to generate an album of said photos.
However, when you're inside a note (which, BTW, can also be converted into checkboxes, aka very simple TODOs), Google Keep, the note-taking app from search giant Google, doesn't have search functionality within that specific note.
Besides the many small bugs, sometimes the missing functionality in Google apps is mind boggling.
tsimionescu 7 hours ago [-]
In a similar vein, Microsoft's OneNote, which is of course part of the famed and expensive Office suite, still doesn't support Find and Replace. But they do have a meticulously written official support article that suggests you copy the replacement text, then do Find, double-click the text it finds, press Ctrl+V, and repeat...
The team that wrote the preview portion of the app is a different team to the one that wrote the permission requesting part. They communicate asynchronously (as a team/org, but this probably is reflected in the app's architecture!), which means the outcome is eventually consistent! But you managed to observe one of those inconsistent cases!
xigoi 12 hours ago [-]
“Any organization that designs a system (defined broadly) will produce a design whose structure is a copy of the organization's communication structure.” —Melvin Conway
bandrami 12 hours ago [-]
I've heard this was the secret to AWS's taking off twenty years ago: Bezos told the various teams they can only interact with each other as if they were vendors and customers to each other.
mzi 12 hours ago [-]
It was formulated a little differently. But this was the 2002 mandate:
1. All teams will henceforth expose their data and functionality through service interfaces.
2. Teams must communicate with each other through these interfaces.
3. There will be no other form of interprocess communication allowed: no direct linking, no direct reads of another team’s data store, no shared-memory model, no back-doors whatsoever. The only communication allowed is via service interface calls over the network.
4. It doesn’t matter what technology they use. HTTP, Corba, Pubsub, custom protocols — doesn’t matter.
5. All service interfaces, without exception, must be designed from the ground up to be externalizable. That is to say, the team must plan and design to be able to expose the interface to developers in the outside world. No exceptions.
6. Anyone who doesn’t do this will be fired.
7. Thank you; have a nice day!
jodrellblank 10 hours ago [-]
Source: Steve Yegge’s “Amazon understand platforms and Google doesn’t” rant - copy found at https://gist.github.com/chitchcock/1281611 among others, since it was originally posted on Google+ and link-rotted.
Number 7 is a joke, etc.
rob74 10 hours ago [-]
Then maybe the default value for "permission to access photos" should be no, so they can only start accessing them after you give them permission. But yeah, with stuff like this it's always "opt-out", never "opt-in", unless someone forces them to...
garbagewoman 12 hours ago [-]
The team that wrote the preview portion just accessed the photos with elevated permissions if permission wasn’t granted yet? That doesn’t make any sense
TeMPOraL 12 hours ago [-]
I imagine the preview was generated server-side, where permissions granted to apps don't matter.
marssaxman 12 hours ago [-]
Oh, god, no, it makes complete sense. Somebody has to code the permissions in, after all...
mbrumlow 12 hours ago [-]
Yah. But I would think the permissions would be an OS-level thing that can't be bypassed simply because Google also wrote the app.
TeMPOraL 10 hours ago [-]
Google Photos is not a mobile app. Google Photos is a SaaS webapp that happens to have a companion app for Android. Whatever OS-level settings affect the Android app itself, they have no bearing on what Google Photos the SaaS can or cannot do.
chii 12 hours ago [-]
it's very easy to imagine the scenario where this happens.
Those photos may have already been uploaded to google's web servers (from my understanding, this happens with google photos by default?), from which a preview has been generated. The permission is at the android app level, and is requested at some point to ensure that the permission model is respected from the POV of the user. I can imagine the permission request being out of sync!
marssaxman 12 hours ago [-]
What OS? The one Google wrote, underlying these services?
necovek 12 hours ago [-]
Yes! It's been observed so many times that there's a name for it (Conway's law): teams having limited touchpoints obviously leads to such impedance mismatches.
fhd2 9 hours ago [-]
I like to joke that Microsoft stuff is always 80%. Works very well for the obvious use cases, but then you are bound to run into weird issues as soon as you run into some edge case they haven't covered.
Makes me think it must have something to do with their corporate culture and how they work, since their developers, to my knowledge, never had a reputation for being weak. Maybe it's just that they have such a gigantic user base for everything they do that 80% is the best business value. Though I doubt that a bit; as a third-party developer, it makes me avoid their products where I can.
theandrewbailey 7 hours ago [-]
> Maybe it's just because they have such a gigantic user base for everything they do, that 80% is the best business value.
I was thinking about something similar recently. 80% of features take 20% of the time. For the hobby stuff I program, I want to make the best use of my time, so I skip the last 20% of features to make more room for stuff that matters.
I call it PDD: Pareto-Driven Development. Looks like you think Microsoft is similar.
conductr 4 hours ago [-]
They have time and resources, though. I think what you say makes sense for a release: at some point you just need to stop adding things and ship what you have; even at their scale this is true. The pattern I see is that they just never polish things after releasing them, although, in theory, the whole team is still there. Internally, I think the culture is to immediately look to the next version instead of polishing what users already have and tackling the 20%. So in a sense, they stay busy constantly building just the 80% part over and over again (e.g., right now I imagine they're busy porting the 80% to Windows 12 or whatever is next for them, which is probably a big rewrite, because they like to change the entire style guide on each release).
duped 2 hours ago [-]
I mean, Microsoft has always had a reputation for shipping bad software; what that reflects about their developer teams is up to the audience. Personally, I'd say it's reflective of a bad engineering culture that doesn't care about quality or craftsmanship, with no incentive to ship good software but every incentive to ship software at scale, quality be damned.
Like in this article alone, that's a change that should never be made. It's an understandable bug but it's indicative of carelessness, both managerial (how does this hold up to code review?) and QA (why is no one asking why login splashes take 30 seconds?).
hulitu 8 hours ago [-]
> I like to joke that Microsoft stuff is always 80%.
Only when they "release" it. And then they start again from 10%. /s
keyringlight 48 minutes ago [-]
The ongoing saga of the migration from Control Panel to the Settings app is as good an example as any. They've been working on that since Win8, which released in 2012. Counting the time it took to make that release, it's been 15 years and it's still ongoing; it very much seems like MS doesn't care about that area of the OS.
I can understand that there's a lot of old/legacy third party stuff that would rely on the .cpl capabilities so you can't rip it out in one go, but "it's a lot of work" seems like a poor excuse for a company with resources like MS to not at least get all their own moved across with equivalents. They could even leave the old deprecated ones available in the background for anything legacy that needs to interact with them.
cornholio 10 hours ago [-]
The most annoying one is that Windows machines have lost the ability to enter deep sleep. Laptops that slept perfectly 5 years ago are now left as 24/7 zombies, with the CPU, fans, and hard disks running non-stop.
I'm certain that some idiotic change, just like the ones suggested in the article, destroyed this perfectly working feature, and nobody has bothered to fix it because it would impact the latest harebrained scheme to make my 10-year-old laptop do AI in its sleep and suggest helpful ads about the things it couldn't help overhearing while "sleeping".
Moru 9 hours ago [-]
I have never had a Windows computer that was able to do any sort of sleep without crashing, either in sleep mode or at some random time after waking up. The first thing I do is disable it; I have lost enough time trying to find the cause, and the crashes usually stop if I don't use sleep modes.
Most of my computers and friends computers have been ASUS though, maybe that is a connection.
(Windows user since 3.11 but I don't think those had sleep modes :-)
p_ing 3 hours ago [-]
ACPI implementations have been terrible for a very long time; ASUS is no exception. This is where a good portion of sleep-related issues stem from. The other side is garbage drivers. I have a joystick from Virpil with its own drivers (for extensible configuration) that prevented sleep.
That said, if I remove the joystick from the picture, my ASUS-based AM4 system sleeps just fine.
wsc981 5 hours ago [-]
I believe Asus used to produce Apple's laptops for a long time (probably not anymore, but I'm quite sure about the white G3/G4 iBooks), and sleep always worked fine on those machines.
It can't just be the hardware, I think.
Peanuts99 8 hours ago [-]
I'd be willing to bet this is more to do with chipset drivers and associated software than Windows itself.
cornholio 8 hours ago [-]
You may lose that bet. Nobody is writing or pushing driver updates for 10-year-old hardware, yet Microsoft's strategy of crippling S3 sleep to compete with mobile OSes is well documented, with the, again, widely documented effect of setting laptops on fire when their owners believed they were "shut down".
smittywerben 38 minutes ago [-]
Hell, even today in 2025, I installed my Wacom tablet drivers (USB drawing tablet) and the installer says "You must restart your system... Note: Shut down is not the same as Restart", like, what does that even mean? It's a classic Microsoft move, I'd say.
lewantmontreal 9 hours ago [-]
They do have some sort of sleep, but it's very inefficient, so Lenovo ThinkPads actually go into hibernation after an hour or so of sleep to avoid the user waking up to an empty battery.
Izkata 2 hours ago [-]
On Windows, at least when it was first introduced to consumers, "Sleep" was the name of that suspend/hibernate combination. Before that you had both options "Stand By" (suspend without hibernate) and "Hibernate" in the shutdown menu.
bartread 8 hours ago [-]
This has given me flashbacks. I went down this rabbit hole prepandemic with a Dell laptop I had for work at the time. I got tired of getting on the train to find my laptop dead.
Back then Windows would default to a crap version of sleep but you could still disable it in the BIOS and by tweaking a couple of other settings, thus forcing it to use proper sleep. I’m pretty sure I wrote a lengthy response on HN about this including the specifics at the time.
That worked well until I got a new Dell laptop that removed support for the good sleep mode entirely.
So then I’d make sure to always leave the machine plugged in and switched on overnight before any travel… which is how I discovered that the machine had a bug where sometimes it would fail to charge whilst plugged in and switched on but not awake, so occasionally I’d still end up with a dead laptop on the train.
So then I’d start up the machine as soon as I got out of bed so it’d get at least 30 - 45 minutes of charging with not much load on it whilst I was getting ready to leave.
I absolutely hate Dell.
For my own use I’ve been buying Apple laptops since 2011 and, although they went through a grim period meaning I kept a machine from 2015 to 2024, I never had this sort of nonsense with them.
skydhash 6 hours ago [-]
I like PC laptops. But only the business line, so that you get sensible admin options and good hardware.
voidUpdate 7 hours ago [-]
Windows kept waking up my old laptop at random times, draining the battery. Which led to some embarrassment when I had replaced the default sounds with a comedy Portal pack, and my backpack randomly yelled that the core temperature was critical when the battery nearly died.
oblio 8 hours ago [-]
I just use hibernate instead. Marginally longer start up time and no more worries. For shorter periods I just leave it on (like when taking the laptop to a meeting room).
Vilian 2 hours ago [-]
Hibernate is a lot harder to make work correctly, and can lead to problems in more niche cases, so it's not a guarantee
InDubioProRubio 6 hours ago [-]
Somewhere, there is some intern's commented-out test for this feature.
analog31 14 hours ago [-]
This relates to something that might have started around that time: the practice of displaying the splash screen for a fixed time period, then showing the user environment before the software had fully started. It was suspected that both OSes and apps were doing this, because people notice when "the app takes too long to load."
Now you have to guess whether the software has really loaded or not before you start using it.
cosmicgadget 14 hours ago [-]
Or, in the corporate information systems world, we have to wait for the half-dozen security and monitoring systems to get done scanning memory and blocking while painstakingly logging to the cloud. Only then do you get to watch a catatonic UI thread idly wait for every last piece of the underlying software to load.
ffsm8 14 hours ago [-]
Not only do they forbid devs from using Linux on their dev machines, they then proceed with cb.exe etc. Nothing shows how they value your time quite as much as these artificial slowdowns they love to introduce. (Along with gigantic privacy issues, as it allows the employer to essentially look at a live feed of your desktop whenever they want.)
I could understand it if your device needed special access (VPN to prod etc), but you usually can't do that either from the dev machines - and need to first connect to a virtual machine (via browser or rdp) to be able to do that...
bregma 7 hours ago [-]
Well, we use Linux dev machines but IT has loaded them up with enough panopticon software that several times a week for an hour or so I can not get enough CPU or resident RAM to do productive work. But at least there are no forced system reboots at inconvenient times during the work day.
analog31 14 hours ago [-]
Yes and everything hangs because the security software is the last to load.
wizzwizz4 1 hour ago [-]
Ctrl+Alt+Delete, log off, then mash space to cancel the logoff. Kills the security software while leaving the rest of the system running. (Windows provides APIs to prevent this, but nobody writing "security" software uses them; in the age of AI, I expect this to get even worse.)
grishka 12 hours ago [-]
It's clear to me that the intent with this 30-second timeout was that it's better for you to have a possibly half-broken but at least somehow usable desktop than be stuck with the loading screen forever, having to boot into a different OS to try to fix your main one.
chii 12 hours ago [-]
> showing the user environment before the software was fully started.
and it has migrated to web apps today - where doing something causes the UI to show a loading/progress wheel, but it takes forever in actuality (or on start up of the webpage, you get a blank screen with placeholder bars/blurred color images etc).
And this is the so-called responsive design...
caseyohara 5 hours ago [-]
> And this is the so-called responsive design...
I’m not sure if this was meant to be a pun, but “responsive design” has nothing to do with how quickly a UI loads. It’s about adapting to different screen sizes.
kjkjadksj 1 hour ago [-]
Plain html could do that too. People forget the ancient knowledge.
presbyterian 42 minutes ago [-]
If everything was plain HTML, the web would still be only for academics and a handful of nerds, and most of the people on HN would have to find other jobs.
chii 11 minutes ago [-]
Well, the plain html version of gmail (which google recently removed) works so much better and runs so much faster than the fully interactive webapp version.
I would imagine that to be the case for a lot of webapps out there.
forgotusername6 12 hours ago [-]
What would you suggest? Is it better to wait until the whole app is loaded to show anything? Or is the only solution to fix loading times in the first place?
arkh 11 hours ago [-]
> Or is the only solution to fix loading times in the first place?
Ding! Ding! Ding! We got a winner!
Yeah, maybe we could expect machines that have had 40 years of Moore's law to give an experience at least as snappy as what you got with DOS apps.
72deluxe 9 hours ago [-]
Yes, I am baffled at how painfully slow modern apps are. Everything seems to include the Chromium Embedded Framework and therefore has an entire browser running. There is sadly a generation of people who grew up after .NET was introduced who think it's perfectly reasonable for a VM to spool up as part of an app's load, or that loading a browser is fine too, and who have no idea how speedy Windows 95 used to be, how loading an app took less than 1 second, or how easy Delphi apps were to create.
It's honestly very sad.
mike_hearn 4 hours ago [-]
Let's not go too far with the rose tinted glasses. Win95 apps are speedy if you run them on modern hardware but at the time they were all dog slow because the average Win95 machine was swapping like crazy.
Loading apps on it definitely did not take one second. The prevalence of splash screens was a testament to that. Practically every app had one whereas today they're rare. Even browsers had multi-second splash screens back then. Microsoft was frequently suspected to be cheating because their apps started so fast you could only see the splash for a second or two, and nobody could work out how they did it. In reality they had written a custom linker that minimized the number of disk seeks required, and everything was disk seek constrained so that made a huge difference.
Delphi apps were easier to create than if you used Visual C++/MFC, but compared to modern tooling it wasn't that good. I say that as someone who grew up with Delphi. Things have got better. In particular, they got a lot more reliable. Software back then crashed all the time.
72deluxe 2 hours ago [-]
I suppose you are right. I worked with MFC/C++ and COM and it was horrible. Delphi and C++ Builder were nicer to use but fell by the wayside, particularly after Borland lost their focus, didn't bother supporting the VCL correctly with themes, and had major issues with their C++ compiler. They suffered a brain drain.
I remember Windows Explorer opening swiftly back in the day (fileman even faster - https://github.com/microsoft/winfile now archived sadly) and today's Explorer experience drives me insane as to how slow it is. I have even disabled most linked-in menu items as the evaluation of these makes Explorer take even longer to load; I don't see why it can't be less than 1 second.
Anyway, I do recall Netscape taking a long time to load, but then I did only have a 486 DX2 66MHz and 64MB of RAM... The disk churning did take a long time, now that you remind me...
I think using wxWidgets on Windows and Mac was quite nice when I did that (with wxFormBuilder); C++ development on Windows using native toolkits is foreign to me today as it all looks a mess from Microsoft unless I have misunderstood.
In any case, I can't see why programs are so sluggish and slow these days. I don't understand why colossal JS toolkits are needed for websites and why the average website size has grown significantly. It's like people have forgotten how to write good speedy software.
duped 2 hours ago [-]
> Yes I am baffled how modern apps are painfully slow.
People underestimate how slow the network is, and put a network between the app and its logic to make the app itself a thin HTTP client and "application" a mess of networked servers in the cloud.
The network is your enemy, but people treat it like reading and writing to disk because it happens to be faster at their desk when they test.
72deluxe 2 hours ago [-]
I think all developers should test against a Raspberry Pi 3 for running their software (100mbps network link) just to concentrate on making it smaller and faster. This would eradicate the colossal JS libraries that have become the modern equivalent of many DLLs.
kjkjadksj 1 hour ago [-]
It is really amazing how big flagship GUI apps like the Office suite or the Adobe suite seem slower than they did in 2001. And really, they don't do anything different from those old tools, maybe a few extra functions (like content-aware fill in PS), but the bread and butter is largely the same. So why is it so slow?
It is almost like they realized users are happy to wait 30-60 seconds for an app to open in 2001 and kept that expectation even as the task remained the same and computers got an order of magnitude more powerful in that time.
oblio 7 hours ago [-]
When I want to make old devs cry I send them this link[1]:
I know it's very simple, I know there isn't a lot of media (and definitely no tracking or ads), but it shows what could be possible on the internet. It's just that nobody cares.
[1] Yes, Hacker News is also quite good in terms of loading speed.
skydhash 6 hours ago [-]
Private torrent trackers are generally fast too. My pet peeve is when news websites are slow, because the only content you have is 90% text.
kjkjadksj 1 hour ago [-]
That remaining 10% of adware is heavy stuff
Aanok 9 hours ago [-]
I think at the very least individual widgets should wait to be fully initialized before becoming interactable. The number of times I've, say, tried to click on a dropdown menu entry just to have it change right under my cursor, making me click on something else because the widget was actually loading data asynchronously without giving me any notice at all, is frankly ridiculous.
It's the right thing to do to load resources asynchronously in parallel, but you shouldn't load the interface piecemeal. Even on web browsers.
I'd much rather wait for an interface to be reliable than have it interactive immediately but having to make a guess about its state.
tonyedgecombe 12 hours ago [-]
I'd be happy to get a progress wheel, half the time it is a blank page.
skydhash 6 hours ago [-]
There used to be loading progress shown by the browser, but today it's when that's done that the real loading starts.
cowsandmilk 14 hours ago [-]
Eh, in a case like this, without the 30 second “assume loaded” timeout, the system would be forever stuck in the loading screen for those impacted by the bugs. Sometimes it is better for your users to be optimistic that the system did indeed load.
mattnewton 14 hours ago [-]
Then maybe they would have caught the error in a preview release / QA
analog31 14 hours ago [-]
Of course I'm not your typical user, but I'd rather see the error log.
90s_dev 15 hours ago [-]
> Also, I tend to stick with default configurations because it makes bug filing easier.
I've learned to use default configurations pretty much everywhere. It's far too much of a hassle to maintain customizations, so it's easiest to just not care. The exception is my ~50 lines of VS Code settings I have sync'd in a mysterious file somewhere that I've never seen, presumably on github's servers, but not anywhere I can see?
skydhash 14 hours ago [-]
I only depend on a handful of tools (emacs, vim, lf, mpv, fish, foot, …), so I took the time to configure them and then just store the config in a git repo I sync everywhere. For personal computers I use stow. For remote machines, I just copy-paste. The nice thing is that those tools are so stable I could move to Debian stable and be OK.
TheDong 14 hours ago [-]
Reproducible self-contained configurations give most of the same benefit for bug filing.
Just your regular reminder that nix is good actually.
"I have a bug, you can get a full VM that reproduces it with 'nixos-rebuild build-vm --flake "github:user/repo#test-vm" && ./result/bin/run-*-vm'"
And the code producing that VM isn't just a binary blob that's a security nightmare, it's plain nix expressions anyone can read (basically json with functions).
And of course applying it to a new machine is a single command too.
nothrabannosir 13 hours ago [-]
<3
(Would it be pedantic of me to say that I receive my fair share of bug reports on nix code I maintain, and when someone sends me their entire nixosConfig the very first thing I do is punt it back with a "can you please create a minimal reproducible configuration"? :D but your point stands. I think. I like to think.)
squigz 15 hours ago [-]
> It's far too much of a hassle to maintain customizations
Is it? The vast majority of the time, I change settings/set things up the way I want, and then... leave them for literally years. Hell, I can directly restore a backup I have of Sublime Text from years ago and my customizations will work.
optymizer 14 hours ago [-]
It is. I used to customize everything. On Windows 95/98/2000/XP - custom cursors, themes, icon packs, custom Windows loading screen, the works. When I used KDE (and Gnome for a while) and compiz came out, I enjoyed flaming windows. Same story - custom icon packs, make grub menu look nice, hell, custom kernels compiled for my CPU, etc.
Somewhere along the way I lost interest in customizing the OS. These days I routinely switch between MacOS, Windows and various Linux flavors on lots of computers. The only thing I may customize is I write my .vimrc from memory.
On my Android phones, I change the wallpaper and I disable animations. Otherwise, stock everything.
Now that I think about it, it can't be the time saved, surely I waste more time on HN. It likely correlates more with using computers for work as opposed to for fun and learning. Even the learning I do these days is rather stressful - if I can steal an hour or two on the weekend, I feel lucky, so spending time to customize the environment seems like a waste.
Maybe if life slows down, I'll find joy in customizing my OSes again.
90s_dev 14 hours ago [-]
Yes, I have the same history customizing everything! From Windows 3.11 to XP to Linux, and then giving up because life gets busy.
On the note of programming not being fun anymore, that's exactly why I'm making my secret project that I hope to release very very soon, maybe in a week or so. I want to make programming fun again, in a similar way that pico8 did, but x100.
squigz 14 hours ago [-]
> It is. I used to customize everything. On Windows 95/98/2000/XP - custom cursors, themes, icon packs, custom Windows loading screen, the works.
> I have the same history customizing everything! ... then giving up because life gets busy.
I think this might be why some people have such different experiences. I don't try to customize "everything" - just what needs to be. Like, yeah, I would expect it to be difficult to maintain random Explorer customizations. I would not expect it to be difficult to maintain customization for a popular IDE.
progmetaldev 13 hours ago [-]
I am with you, customizing your daily tools and saving those configurations is far different from customizing every visual element of your operating system. Being able to reproduce your environment is definitely a worthy goal, but at the same time, I think you'd be limiting yourself by not configuring your production software to your own liking/most-efficient way you work.
3036e4 12 hours ago [-]
One of the few things I do bother to configure is my window manager, but only because it happens to be well designed and make it easy to store all configuration in my config git repo.
Too much software puts host-specific stuff in settings files (like absolute paths) or just isn't stable enough in general for it to be worth trying to maintain a portable configuration.
Larrikin 14 hours ago [-]
Eh, I find a computer not customized to my work flow to be a waste of time. The amount of time I spend using someone else's computer is such a small amount of time.
90s_dev 14 hours ago [-]
Most of the time, yes. I maintained my vimrc for maybe 10-15 years before I gave up on it.
The hard part of maintaining a config is that there's no such thing as cost-free usage, it always takes a mental toll to change a config, to learn a new config, to remember which configs are in use and what they do, to backup configs, or at least to setup and maintain a config-auto-backup flow.
By far, the easiest mental model is just learning how everything works out of the box, and getting used to it. Then again, sometimes what people want is to do things the hard way for its own sake. That's probably part of why I kept going back to C over and over for so many years.
aftbit 4 hours ago [-]
Except then when I type ":Wq" by mistake, I get an error instead of vim doing what I expect. This happens about 20 times a day for me. One line of config maintenance is well worth the end of this daily annoyance.
3036e4 12 hours ago [-]
I don't know vim well. Why is it not easy to just keep using the same settings?
The oldest parts of my emacs config go back at least 30 years and I have had it in a git repo for ~15. I keep my entire .emacs.d versioned, including all third-party packages I depend on (so no fetching of anything from the cloud).
Have had to do at most minimal changes when upgrading to newer versions and with some tiny amount of work the same config still works for emacs from around version 21 to 31 (but features are of course missing in older versions).
Izkata 1 hours ago [-]
The same is true of vim. They're talking about effort changing it and keeping it around across computers. I don't think I've really changed my vimrc for 5-10 years.
eviks 12 hours ago [-]
Using bad defaults is also a hassle, and you do it way more often than maintaining customizations
akst 14 hours ago [-]
Even in the best-crafted systems, I think you'll also find there are just more synergies between different system features in the default configuration.
lp0_on_fire 15 hours ago [-]
Agreed. I had a professor once who would say “The defaults were put there by people who probably know more about the software than you”. As long as you understand what the defaults are doing sometimes it’s more hassle messing with every option under the sun.
meroes 15 hours ago [-]
Same except my teacher’s version was “just hit next/yes” for every option when installing software, in an era before that’d get you Adobe Reader and McAfee malware.
userbinator 10 hours ago [-]
These days the defaults are almost certainly oriented towards controlling or extracting the most value from you, be it invasive spyware, constant intrusions, or sub-optimal UI.
bandie91 10 hours ago [-]
those are the same people who let the user change those settings.
user3939382 15 hours ago [-]
It’s probably best decided as a function of frequency. For tools I’m using every day, I know every setting.
hulitu 7 hours ago [-]
> Agreed. I had a professor once who would say “The defaults were put there by people who probably know more about the software than you”.
He surely didn't use any Microsoft product. /s
1vuio0pswjnm7 2 hours ago [-]
Many, many years ago, when I still used Windows recreationally, I used to edit the value of a specific Windows registry key, replacing "explorer.exe" with "cmd.exe". This would prevent Windows from running explorer.exe and launching the "desktop" with wallpaper, icons, etc. This produced what seemed more like a UNIX window manager: a solid background, where each window is a Microsoft cmd.exe shell, a classic Windows black box with a blue bar across the top and thin grey borders. I could then launch applications, such as the ones in the C:\windows\system32 folder, from a command prompt, e.g., taskmgr.exe. For me, this made Windows feel much faster and more robust than when using explorer.exe. Certainly it was lighter weight.
More recently, long after I stopped using Windows but still many years ago, I was reading an article about Arthur Whitney. It had a photo which seemed to be taken at home, maybe in a furnished garage, and in the background was a desktop computer running Windows. The only window open was a cmd.exe. I am not suggesting anything. It is just something I always remember.
Perusing some recent Microsoft documentation I noticed this:
There is/was a mode for this in Windows Server. I also seem to remember a low-cost/free Windows for server or embedded, meant for IoT or whatever, that featured cmd only, without Explorer.
zeeebo 41 minutes ago [-]
Sounds similar to WinPE
phkahler 4 hours ago [-]
This is the kind of problem that can be prevented by using "better" coding patterns. Not to say it will be prevented, but some ways people like to structure code are more prone to these kinds of bugs. I put "better" in quotes because I work with a competent but less experienced guy who tends to write more complex code, or what some claim is just a different style than I would prefer. I make claims that "this way is better" and it's often very difficult to articulate why. Sometimes I am convincing, other times not... I'm not immune to these kinds of bugs either, but I shudder to think I might one day inherit this other guy's code and a collection of these weird bugs.
jameshart 4 hours ago [-]
Let’s be honest this is the kind of thing that can happen to anyone.
We all like to think we have picked up habits that immunize us from certain kinds of error but large software systems are complex and bugs happen.
The number of people in here taking ‘Raymond Chen tells an anecdote about the time a dumb bug shipped in Windows and was fixed two weeks later’ as an indictment of Microsoft’s engineering culture is frankly embarrassing. Trading war stories is how we all get better.
It would be better for us all if culturally, the reaction to a developer telling a story of how they once shipped a stupid bug were for everyone to reply with examples of worse stuff they’ve done themselves, not to smugly nod and say ‘ah yes, I am too smart to make such a mistake’.
phkahler 3 hours ago [-]
>> Let’s be honest this is the kind of thing that can happen to anyone.
I didn't say I'm immune to doing this myself, nor did I condemn anything about the particular scenario in the blog. My pain is in trying to articulate why some ways are better when any code that works is in some sense just fine.
>> We all like to think we have picked up habits that immunize us from certain kinds of error but large software systems are complex and bugs happen.
We sure do, although "immunize" is too strong. We try to minimize the likelihood of these kinds of things. Experience is valuable, and sometimes it's hard to articulate why.
jameshart 3 hours ago [-]
That’s been my experience of what it means to try to do ‘software engineering’ as a discipline; it’s the ongoing process of developing ways to articulate - beyond just ‘this way feels better to me’ - what attributes we are trying to build into the software we construct, what methods we use to imbue the software with those attributes, and why certain ways of doing so are better than others.
It still feels more like craftsmanship than actual engineering. A lot of the time it’s more like how a carpenter learns to use certain tools in certain ways because it’s safer or less prone to error, than how an engineer knows how constructing a particular truss ensures particular loads are distributed in particular ways.
And one of the best tools we have for developing these skills is to learn from the mistakes others have made.
So I agree - I think your instinct here was to look at this error and try to think whether you have engineering heuristics already that would make you unlikely to fall into this error, or do you need to adjust your approach to avoid making the same mistake.
My criticism here was more directed to others in the thread who seem to see this more as an opportunity to say ‘yeah, Windows was always buggy’ rather than to see it as an example of a way things can fail that they need to beware of.
josephg 10 hours ago [-]
To me, this falls in the category of bugs I think of as "systemic bugs" or "type bugs". If login components were passed a token, then you could make the token's destructor automatically flag that the process is done. Then this bug would be more-or-less impossible to write.
Because they made it a runtime thing - "components just have to remember to do this", the code structure itself affords this bug.
There was a similar bug at Facebook years ago where the user's notification count would say you had notifications - and you click it, and there aren't any notifications. The count was updated by a different code path than the code which inserted notifications into the list, and they got out of sync. They changed the code so both the notification count and the list were managed by the same part of the system, and all instances of the bug went away forever.
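The shape of that fix, as a tiny hypothetical sketch (names mine): derive the count from the list, so the two can never disagree.

    class Notifications:
        def __init__(self):
            self._items = []  # items are dicts like {"text": ..., "read": bool}

        def add(self, item):
            self._items.append(item)

        @property
        def unread_count(self):
            # derived on demand, never stored separately, so it cannot drift
            return sum(1 for item in self._items if not item.get("read"))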
nly 9 hours ago [-]
Bad notification icons happen on Reddit all the time. I've always assumed it was just bad caching
hbn 2 hours ago [-]
For as long as I've used the official app on iOS (since they killed third party apps... which also got me using reddit SIGNIFICANTLY less) it's had the issue where it'll send a push notification for a message reply or whatever, you can click it, view the message in the thread, reply to it, whatever. The push notification will be gone from your iPhone, but it never actually clears the notification from within the app. So you'll still see the notification in your in-app inbox so you have to go in there manually and click it again.
nubinetwork 15 minutes ago [-]
I could swear that this also affected Windows 10...
alex-mohr 14 hours ago [-]
The code in question reminds me a lot of my favorite Kubernetes bug:
if (request.authenticationData) {
    ok := validate(etc);
    if (!ok) {
        return authenticationFailure;
    }
}
// BUG: a request with no authenticationData at all skips the outer
// block entirely and falls through as if it had been authenticated.
Turns out the same meme spans decades.
Link to the patch fixing it: https://github.com/kubernetes/kubernetes/commit/7fef0a4f6a44...
snackbroken 9 hours ago [-]
This is a nice example of why one should parse, not validate. If every function that requires some kind of permission takes that permission as an argument, say a token type that can only be obtained by actually passing the check (pseudocode below), then reaching privileged code without authenticating becomes impossible to express rather than merely discouraged.
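A minimal Python sketch of that pattern; the type and names are my own illustration, not the commenter's original pseudocode:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass(frozen=True)
    class Authenticated:
        principal: str

    def parse_auth(header: Optional[str]) -> Optional[Authenticated]:
        if not header:
            return None  # missing credentials fail closed, not open
        if header != "Bearer valid-token":  # stand-in for real validation
            return None
        return Authenticated(principal="alice")

    def handle_request(auth: Authenticated) -> None:
        ...  # privileged work; callers must already hold a parsed token

The Kubernetes bug above can't be written in this shape: with no credentials you never obtain an Authenticated value, so there is nothing to pass to handle_request.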
Of course, we'd already fixed other issues like Kubelet listening on a secondary debug port with no authentication. Those problems stemmed from its origins as a make-it-possible hacker project and it took a while to pivot it to something usable in an enterprise.
DHRicoF 10 hours ago [-]
I don't know where you can read about this, but you are on the right track.
If there is no authenticationData, then the `if (!ok)` is never run and the code continues execution as if it were authenticated.
grumpyprole 11 hours ago [-]
The way software is built hasn't changed in decades.
hulitu 55 minutes ago [-]
> The way software is built hasn't changed in decades.
Correct. The only thing that changed is the number of level of abstractions.
est 10 hours ago [-]
Off-topic: I really liked the auto-refreshing "Windows Spotlight" wallpaper on the logon screen. I even wrote a script to sync it as my desktop wallpaper.
But on my Win10 it stopped working idk why, so I wrote a script to download Bing Image of the Day instead: https://blog.est.im/2025/stdout-03
adithyassekhar 10 hours ago [-]
11 can set spotlight as the desktop wallpaper.
JBiserkov 6 hours ago [-]
But Windows 11 is not supported on millions of otherwise working systems.
xnx 10 hours ago [-]
There's an official Bing Wallpaper App (https://www.bing.com/apps/wallpaper) for this, but it has all kinds of nuisances / dark patterns to switch your default browser to Edge and other nefarious things.
est 10 hours ago [-]
there's RSS and JPEG directly available for download so nah, I'd stick with few lines of script instead of a bloated app.
Kholin 9 hours ago [-]
On KDE Plasma, that's a built in feature.
est 8 hours ago [-]
Win7 used to support dynamic wallpapers with an RSS feed inside some .theme file. It didn't last long. Looks like the whole "theme" idea was abandoned by Microsoft.
Then we have double context menus on Win11. Sigh!
pdpi 14 hours ago [-]
A bit meta, but I've come to look forward to the "Why did <bizarre behaviour> happen with <windows version>" headlines that herald a Raymond Chen post. These are always fascinating.
theandrewbailey 7 hours ago [-]
Agreed. The Old New Thing is such a wonderful collection of Windows oddities. It's the only blog that has explained away computing myths surrounding Windows that I've held since childhood.
sightofcorbie 16 hours ago [-]
“Comfort food”. That’s so funny. I still use the Motif window manager with a steelblue4 desktop and wheat xterm background, carried from AIX into Linux. That was my first default in college in 1989, and nothing has improved since. (Gnome, KDE and the like make me want to upchuck).
kirenida 13 hours ago [-]
"Nothing has improved since". That's so funny.
toast0 12 hours ago [-]
What's the biggest thing that's improved? We have 4x the pixels, so we spend 4x the rendering time to draw everything with 4x as many pixels, when it works, and complain when it doesn't.
Would have been easier to stick with the pixel density we had.
Oh, and we have to wait a frame to see everything because of compositing that I still don't quite understand what it's supposed to do? Something something backing store?
CorpOverreach 15 hours ago [-]
My "comfort food" in this article is the realization that no matter how big, how advanced a team can be -- we all make (and ship) really dumb changes to production. A bolted-on wrapper if() statement that spans a bit too far is classic.
flomo 10 hours ago [-]
You just have awful taste, but so do a lot of my friends. :)
90s_dev 15 hours ago [-]
I have a certain nostalgia for bb4win, which I learned about during college, and was my first introduction to linux. Nostalgia is a powerful drug.
ryao 15 hours ago [-]
What will you do when you want to use a 4K monitor? This is not to be dismissive. I am genuinely curious if HiDPI works on motif.
hulitu 35 minutes ago [-]
> What will you do when you want to use a 4K monitor? This is not to be dismissive. I am genuinely curious if HiDPI works on motif.
Yes, xrandr --scale. Works fine for everything. Even better than Windows (which, for some reason, only scales some programs, not all).
suyula 13 hours ago [-]
I can't speak for Motif but Fluxbox works fine on my 4K monitor
tedunangst 12 hours ago [-]
You would set an appropriate size in .Xresources or somesuch.
bluedino 4 hours ago [-]
Machines were often so slow back then that 30-second logins were 'normal' (or maybe even quick), especially once you loaded a bunch of crap like AV and other enterprise junk.
If I had a dollar for every minute of my life I spent troubleshooting random group policy quirks during my previous life as a sysadmin...
> Personally, I use a solid color background. It was the default in Windows 95,¹ and I’ve stuck with that bluish-green background color ever since.
My thoughts exactly, but I think it goes back to the Mac LC's we used in a school computer lab, and the palette of colors you could have with 16-bit graphics was so vast compared to the 16 color PC's I was used to.
Plus, you always have so much stuff open you're never going to see your wallpaper anyway. That's what screensavers are (were) for: rotating through a folder full of images.
scblock 2 hours ago [-]
On my fairly recent corporate laptop a 30 second login would be massively faster than my corpo-malware-ridden experience. 5 minutes to Windows 10 desktop this morning.
A similar (slightly older) laptop I own boots from fully off to the KDE desktop in 25 seconds total including typing my password.
shriracha 15 hours ago [-]
Great read. Raymond Chen's blog is absolutely legendary.
theandrewbailey 7 hours ago [-]
I feel that Bruce Dawson's blog falls into the same/similar category, though he hasn't had a great technical post in a while. I wonder what multithreading bug he'll find next.
https://randomascii.wordpress.com/
https://randomascii.wordpress.com/2024/10/01/life-death-and-...
This is the kind of thing that leads technical folks like me to do absurd things, like creating a 1x1-pixel JPG/PNG of the color we want, just so it gets put into the normal (and thus tested and maintained) parts of the code.
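If you want that absurd workaround with minimal effort, a hypothetical one-liner with Pillow (assuming it's installed; the teal value is just an example):

    from PIL import Image

    # a single colored pixel; the wallpaper engine scales it to fill the screen
    Image.new("RGB", (1, 1), "#008080").save("solid.png")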
geocar 1 hours ago [-]
You guys see your desktop background?
Nition 13 hours ago [-]
I've noticed a current problem in Win 11 where if you start your PC with the monitor off, let it get to the login screen, then turn the monitor on, you get a terribly low-res version of your login window background. I suspect it's taking some fallback default resolution like 800x600 and serving up an "optimised" version of the background for that res, then not updating it.
herbst 9 hours ago [-]
I've spent a few hours reading and installing random software I'd never heard of before to get my Windows machine into the basic state of "I can boot it without getting heavily triggered".
- Half of each boot was wasted on a Copilot 360 dialog. On every fucking boot, with no Copilot and no Office installed. Or rather, Copilot installed itself without notice and started to spam me
- In several places the OS would show me "news" like death messages, war updates and economy details. Definitely far from a productive environment and honestly heavily triggering. I don't read news anywhere, but my PC is full of it and there is no option to disable it? What about kids?
- I have updates or a broken system about every second time I boot the PC. I know it's because I just cut the power, but I hate that it asks 3 times if I actually want to shut down (and then still breaks, or never actually shuts down)
- I constantly end up in a variety of login screens that want me to log in to a Microsoft account I don't have and don't want
- There are soooo many ads. I've been on Linux for years; instead of traditional TV I almost always stream with an ad blocker. The country I live in isn't plastered with ads either. But this shithole of an operating system is. It literally pops up ad notifications above apps by default.
If anyone wonders, most problems were solved with "ShutUp10", others with ChatGPT and regedit. It was actually pretty hard when you have no idea about this OS and its dark patterns.
On my Linux machines I don't even change the wallpaper, but Windows defaults are unbearable and outright productivity killers.
hbn 2 hours ago [-]
On top of the news headlines being shoved in your face everywhere in Windows, they do the same with stocks, showing you trade values of random stocks in various places which I can't help but find hilarious.
I think they're trying to emulate Apple who has had stocks integration by default for years, including being alongside the other pre-installed apps like SMS and Mail on the first iPhone. I imagine Apple did it to cement themselves as a high-class lifestyle brand, even though I'm sure there was never a time where most iPhone users were doing a lot of day trading.
I wonder what percentage of Windows users rely on the stock ticker in the start menu though...
greenavocado 5 hours ago [-]
If you're willing to take the risk and trust the developer there is a fully prepared Windows distribution called Windows 11 24H2.3775 16in1 x64 - Integral Edition 2025.4.9 on zone94
Their other distributions are very good as well, especially for Windows XP because they bundle a lot of important drivers for old software to work correctly
herbst 2 hours ago [-]
I am using Windows only for rekordbox. I tried a few like that, like Tiny10, but in the end I would always run into driver issues.
matejn 10 hours ago [-]
On my Windows 10 work machine, solid color backgrounds sometimes get reset to black after connecting to it via RDP. I "fixed" it by creating a solid color .png and using it as a background picture ...
bandie91 10 hours ago [-]
The RDP client actually has an option to remove the remote desktop's wallpaper.
matejn 8 hours ago [-]
True, but at least in my attempts, that setting only affects picture backgrounds and is ignored when using solid color backgrounds.
jacobgorm 8 hours ago [-]
Or why your software should never use arbitrary timeouts. Without the 30s timeout this bug would have been caught and fixed before release.
berkes 8 hours ago [-]
In one project, we had an ENV var (a few actually) for timeouts of network requests. Most places would raise an exception if they hit this timeout.
In test and CI we had this set to a very low number. In acceptance (manual testing, smoke testing) to a very high number.
This was useful because it showed three things:
- Network- and services- configuration bugs, would immediately give a crash and thus failing test. E.g. firewalls, wrong hosts, broken URIs etc.
- Slow services would cause flaky ("flickering") tests. Almost always a sign that some service/configuration/component had performance problems or was misconfigured itself. The quick fix would be to increase the timeout, but re-thinking the service - e.g. replacing it with a mock if we couldn't control it, or fixing its performance issues - was the proper (and often not that hard) fix. Or re-thinking the use of the service, e.g. by pushing it to async or a job queue.
- Stakeholders going through the smoke-test and acceptance test would inevitably report "it's really slow" showing the same issues as above but in a different context and with "real" services like some external PSP, or SMTP.
It was really a very small change: just some wrappers around http-calls and other network calls, in which this config was used, next to a hard rule to never use "native" clients/libs in the code but always our abstraction. This then turned out to offer so much more benefits than just this timeout: error reporting, debugging, decoupling.
It wasn't javascript (Ruby, some Python, some Typescript) but in JS it would be as easy as `function fooFetch(resource, options) { return fetch(resource, options) }` from day one, then slowly extended and improved with said logging, reporting, defaults etc.
I've since always introduced such "anti-corruption" layers (facades, proxies, ports/adapters) very early on, because once you have `requests.get("http://example.com/foo/bar")` calls all throughout your Python code, there's no way to ever migrate away from "requests" if (when!) it gets deprecated, or to add said timeout throughout the code. It's really a tiny task to add my own file/module that simply imports "requests" and then calls it, on day one, and then use that instead.
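In Python, that facade might start as small as this sketch (module layout, env-var name, and the logging hook are my own invention, not the commenter's code):

    import logging
    import os

    import requests  # the only place the real client is imported

    # Tiny in CI so misconfiguration fails fast; generous in acceptance
    # testing so humans feel the slowness and report it.
    DEFAULT_TIMEOUT = float(os.environ.get("HTTP_TIMEOUT_SECONDS", "5"))

    log = logging.getLogger("http_client")

    def get(url, **kwargs):
        """Project-wide wrapper: all HTTP GETs go through here."""
        kwargs.setdefault("timeout", DEFAULT_TIMEOUT)
        log.debug("GET %s (timeout=%s)", url, kwargs["timeout"])
        return requests.get(url, **kwargs)

From there, error reporting, retries, and an eventual migration off "requests" all live in one file.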
bornfreddy 7 hours ago [-]
I wonder what happens if some operation genuinely takes more than 30s to complete in some cases? I'm sure the system handles it gracefully. /s
jayd16 3 hours ago [-]
Load curtains can be pretty annoying if you handle them like this.
One pattern I've had success with is using handles that need to be returned. If you never grab a handle you don't have to worry about returning it. Seems to work better than the fire-and-wait-for-side-effects approach.
vlovich123 12 hours ago [-]
This is a structural problem in the code. The passed-in handle to `DesktopIconsReady` should auto-Report as soon as it's dropped. That completely obviates any of the problems of conditional logic.
emmelaich 10 hours ago [-]
I really love the last paragraph. I tell this to people fiddling with their system all the time.
Typical response is "Well it should just work anyway!". Which is theoretically true -- the worst kind of true.
zombot 10 hours ago [-]
If-Then Considered Harmful.
Seeing how that complicated if-then logic is just too stiff a challenge to your average developer, we should probably just dispense with it.
redbell 8 hours ago [-]
As a software developer, I’ve occasionally felt ashamed when it takes me hours—or even days—to fix a stubborn bug. But whenever I see that even big tech companies struggle with similar issues, I regain confidence in myself. After all, this is what being a software engineer often looks like: wrestling with unpredictable behavior until something finally clicks.
Just yesterday, I ran into a bizarre bug on Windows where the mouse cursor would move every time I pressed the arrow keys—almost like I was controlling the mouse with the keyboard. It drove me nuts. I checked all the usual mouse and keyboard settings, but everything looked normal. At one point, I even wondered if my machine had been infected by a virus.
Desperate, I Googled "mouse pointer moving on arrow keys". The first result had only one answer, which blamed... Microsoft Paint. I was skeptical—Paint? Really? That couldn’t possibly be it. Still, with no other leads, I gave it a shot. As it turned out, I did have Paint open in another Desktop View, where I’d been cropping a screenshot. The moment I closed it, the problem vanished. Instantly.
I still can’t believe that was the cause—and I’m a little embarrassed to admit it, even though no one was around to see it.
_____________________
1. https://superuser.com/questions/1467313/mouse-pointer-moving...
Ah, the glamorous life of a software dev—where we spend half our time building the future and the other half wondering why the future’s on fire.
Years ago, I had a bug so bizarre I nearly convinced myself the machine was haunted. My mouse pointer started drifting—not randomly, but only when I pressed the arrow keys. Up arrow? Cursor nudged north. Down arrow? There it went again. I was convinced some accessibility setting or keyboard remap had gone haywire. I rebooted. I checked drivers. I even briefly entertained the idea that my codebase was cursed.
Three hours in, I realized the true culprit: MSPaint. I had opened it earlier, and because the canvas was selected, the arrow keys were actually moving the selection box—which, by delightful Windows design, also moved the mouse cursor. I wasn’t losing my mind. Just... slowly drawing rectangles in the background every time I hit an arrow key.
I closed MSPaint, and poof—my “haunting” ended. I haven’t trusted that application since. Great for pixel art, less great for your sanity.
You and ChatGPT sound identical.
Some additional examples beyond the OP:
- In the latest macOS, trying to set a custom solid color background just gives you a blinding white screen (see: https://discussions.apple.com/thread/256029958?sortBy=rank).
- GNOME removed all UI controls for setting solid color backgrounds, but still technically supports it if you manually set a bunch of config keys — which seem to randomly change between versions (see: https://www.tc3.dev/posts/2021-09-04-gnome-3-solid-color-bac...).
The pattern here seems pretty clear: a half-baked feature kept alive for niche users, rather than either properly supporting or cleanly deprecating it. Personally, I’d love to simply set an RGB value without needing to generate a custom image. But given the state of things, I’d rather have one solid, well-maintained wallpaper system than flaky background color logic that’s barely hanging on.
I'm not trying to be mean here, I'm just fascinated by what people will consider to be a waste of time.
This, sometimes it's enough for things to just work as you expect so you can do your actual work. I don't really understand why so many people are unpaid part time evangelizers.
Something that should be a default option, or a single-tap switch in settings, turned into a chore consisting of a period of agonising disbelief, doubt, denial, search, and eventually bitter acceptance.
The best thing to do is just create an image the same size as your screen resolution in your favorite image editor and fill it with actual true black. Save it as a PNG and send it to your phone.
As for "same image as your screen resolution": screenshot sounds like the exact fitting thing here. As a challenge, tried making screenshot black using stock Samsung "Gallery" and it seems that repeated Edit - Brightness: -100 - Save as copy, then open the copy and goto back to Edit can do the trick as well, after four or so copies. (Copies, because there is no way to re-apply same effect on the same photo, apparently.)
I suspect this is related to the System Preferences rebuild, since it's worked fine for 20+ years of OS X before that.
However, I've noticed, there's not much point in changing it. Showing the desktop is a waste of screen real estate because of the generations of abuse of desktop shortcuts. Even if you are careful, it becomes a cluttered wasteland (on Windows anyways). I just learned to never use the desktop for anything and always have windows up on various monitors.
# Disable icons on the Desktop (relaunch Finder afterwards for the change to take effect)
defaults write com.apple.finder CreateDesktop false
killall Finder
The situation is better these days, with windows store apps. Still, I developed the habit of just never using the desktop in the XP days when things were really bad.
There was a war over your eyeballs, which had shady software vendors warring over desktop space, start menu space, taskbar space, even fucking file associations. I recall for a little while RealPlayer and Windows Media Player used to yank file associations back and forth each time they ran, even if you tried to make them stop.
They've also disabled auto-save if you don't have the documents backed up by OneDrive, which is the most egregious for me.
https://thetechmentors.com/f12-a-better-alternative-to-the-s...
There's the peak GNOME experience.
At some point they decided 2x (3x?) scaling was enough for anyone and took away 4x, I didn't notice because I was already set at 4x and it continued working. Somewhat later they took away the backend, and then my system crashed with no error message immediately at login.
After much troubleshooting had replaced a movie night, I inquired about the functionality being removed and was berated for using an undocumented/unsupported feature (because I had continued to use it after the interface to set it was removed, without my knowledge).
I'll never use gnome again if I can help it.
Quite a capable machine for my uses.
Not supported in Windows 11. Maybe with some additional config? Can't be bothered with something that might turn out to be fragile and need more maintenance than I can be bothered with. That's a young man's game.
Ok, I’m about due to give Linux another tickle anyways.
Hmm, which distro… can always give a few a spin.
Keep it simple, Pop!_OS.
Installed fast, no issues, runs fine, stable. Seems entirely usable.
Customisations? Nah, keep it simple.
I’ll set a black background though.
Nope.
Switching to upstream (Ubuntu) with KDE would probably be more your speed.
win11 ltsc works perfectly on it. With a solid background :D
Based on: Arch. Init: systemd. https://cachyos.org
Based on: Debian. Init: Non-systemd. https://www.devuan.org
Based on: Arch. Init: systemd. https://garudalinux.org
Based on: Independent. Init: Non-systemd. https://www.gentoo.org
Based on: Red Hat Fedora. Init: systemd. https://nobaraproject.org
Teams not loading due to security issues, but notifications coming through with full content of messages. Ability to type a handful of words in cloud version of Word (or paste full documents) before security check catches up and requires me to set up a sensitivity label. Etc.
It mostly tells me about MS doing very bad software architecture for web apps, though apparently the desktop apps are not immune to it either.
Some customers will push back and have enough leverage to get an exception, but the default answer will be that this can't be disabled. You'll have some sales engineer challenged about the product behavior as part of an RFP and they'll try to convince you that nothing is leaked while knowing the financial opportunity would be much larger with these customers, if there was more concern for the customer.
(Teams makes this Byzantine in the extreme to accomplish as you have to go find the folder it drops all shared files in to gain access to manage access settings. But it does allow you to retro change access even for things shared in Teams)
From the outside looking in, it's the age old organizational problem when there are no good synergies between customer experience and development teams.
If so, pasting that link into Slack may reveal its first page.
[1] https://photos.google.com/
However when you're inside a note (which BTW, can also be converted into checkboxes, aka very simple TODOs), Google Keep, the note taking app from search giant Google, doesn't have search functionality for that specific note.
Besides the many small bugs, sometimes the missing functionality in Google apps is mind boggling.
https://support.microsoft.com/en-us/office/find-and-replace-...
1. All teams will henceforth expose their data and functionality through service interfaces.
2. Teams must communicate with each other through these interfaces.
3. There will be no other form of interprocess communication allowed: no direct linking, no direct reads of another team’s data store, no shared-memory model, no back-doors whatsoever. The only communication allowed is via service interface calls over the network.
4. It doesn’t matter what technology they use. HTTP, Corba, Pubsub, custom protocols — doesn’t matter.
5. All service interfaces, without exception, must be designed from the ground up to be externalizable. That is to say, the team must plan and design to be able to expose the interface to developers in the outside world. No exceptions.
6. Anyone who doesn’t do this will be fired.
7. Thank you; have a nice day!
Number 7 is a joke, etc.
Those photos may have already been uploaded to google's web servers (from my understanding, this happens with google photos by default?), from which a preview has been generated. The permission is at the android app level, and is requested at some point to ensure that the permission model is respected from the POV of the user. I can imagine the permission request being out of sync!
Makes me think it must have something to do with their corporate culture and how they work, since their developers, to my knowledge, never had a reputation for being weak. Maybe it's just because they have such a gigantic user base for everything they do, that 80% is the best business value. Though I doubt that a bit, as a third party developer, it makes me avoid their products where I can.
I was thinking about something similar recently. 80% of features take 20% of the time. For the hobby stuff I program, I want to make the best use of my time, so I skip the last 20% of features to make more room for stuff that matters.
I call it PDD: Pareto-Driven Development. Looks like you think Microsoft is similar.
Like in this article alone, that's a change that should never be made. It's an understandable bug but it's indicative of carelessness, both managerial (how does this hold up to code review?) and QA (why is no one asking why login splashes take 30 seconds?).
Only when they "release" it. And then they start again from 10%. /s
I can understand that there's a lot of old/legacy third party stuff that would rely on the .cpl capabilities so you can't rip it out in one go, but "it's a lot of work" seems like a poor excuse for a company with resources like MS to not at least get all their own moved across with equivalents. They could even leave the old deprecated ones available in the background for anything legacy that needs to interact with them.
I'm certain that some idiotic change just like the ones suggested in the article destroyed this perfectly working feature, and nobody is bothered to fix it because it would impact the latest harebrained scheme to make my 10 year laptop do AI in its sleep and suggest helpful ads about the things it couldn't help overhear while "sleeping".
Most of my computers and friends computers have been ASUS though, maybe that is a connection.
(Windows user since 3.11 but I don't think those had sleep modes :-)
That said, if I remove the joystick from the picture, my ASUS-based Am4 system sleeps just fine.
It can't just be the hardware, I think.
Back then Windows would default to a crap version of sleep but you could still disable it in the BIOS and by tweaking a couple of other settings, thus forcing it to use proper sleep. I’m pretty sure I wrote a lengthy response on HN about this including the specifics at the time.
That worked well until I got a new Dell laptop that removed support for the good sleep mode entirely.
So then I’d make sure to always leave the machine plugged in and switched on overnight before any travel… which is how I discovered that the machine had a bug where sometimes it would fail to charge whilst plugged in and switched on but not awake, so occasionally I’d still end up with a dead laptop on the train.
So then I’d start up the machine as soon as I got out of bed so it’d get at least 30 - 45 minutes of charging with not much load on it whilst I was getting ready to leave.
I absolutely hate Dell.
For my own use I’ve been buying Apple laptops since 2011 and, although they went through a grim period meaning I kept a machine from 2015 to 2024, I never had this sort of nonsense with them.
Now you have to guess whether the software has really loaded or not before you start using it.
I could understand it if your device needed special access (VPN to prod etc), but you usually can't do that either from the dev machines - and need to first connect to a virtual machine (via browser or rdp) to be able to do that...
and it has migrated to web apps today - where doing something causes the UI to show a loading/progress wheel, but it takes forever in actuality (or on start up of the webpage, you get a blank screen with placeholder bars/blurred color images etc).
And this is the so-called responsive design...
I’m not sure if this was meant to be a pun, but “responsive design” has nothing to do with how quickly a UI loads. It’s about adapting to different screen sizes.
I would imagine that to be the case for a lot of webapps out there.
Ding! Ding! Ding! We got a winner!
Yeah, maybe we could expect machines which got 40 years of Moore's law to give you an experience at least as snappy as what you got on DOS apps.
It's honestly very sad.
Loading apps on it definitely did not take one second. The prevalence of splash screens was a testament to that. Practically every app had one whereas today they're rare. Even browsers had multi-second splash screens back then. Microsoft was frequently suspected to be cheating because their apps started so fast you could only see the splash for a second or two, and nobody could work out how they did it. In reality they had written a custom linker that minimized the number of disk seeks required, and everything was disk seek constrained so that made a huge difference.
Delphi apps were easier to create than if you used Visual C++/MFC, but compared to modern tooling it wasn't that good. I say that as someone who grew up with Delphi. Things have got better. In particular they got a lot more reliable. Software back then crashed all the time.
I remember Windows Explorer opening swiftly back in the day (fileman even faster - https://github.com/microsoft/winfile now archived sadly) and today's Explorer experience drives me insane as to how slow it is. I have even disabled most linked-in menu items as the evaluation of these makes Explorer take even longer to load; I don't see why it can't be less than 1 second.
Anyway, I do recall Netscape taking a long time to load but then I did only have a 486 DX2 66MHz and 64MB of RAM.... The disk churning did take a long time, now you remind me...
I think using wxWidgets on Windows and Mac was quite nice when I did that (with wxFormBuilder); C++ development on Windows using native toolkits is foreign to me today as it all looks a mess from Microsoft unless I have misunderstood.
In any case, I can't see why programs are so sluggish and slow these days. I don't understand why colossal JS toolkits are needed for websites and why the average website size has grown significantly. It's like people have forgotten how to write good speedy software.
People underestimate how slow the network is, and put a network between the app and its logic to make the app itself a thin HTTP client and "application" a mess of networked servers in the cloud.
The network is your enemy, but people treat it like reading and writing to disk because it happens to be faster at their desk when they test.
It is almost like they realized users are happy to wait 30-60 seconds for an app to open in 2001 and kept that expectation even as the task remained the same and computers got an order of magnitude more powerful in that time.
https://forum.dlang.org/
I know it's very simple, I know there isn't a lot of media (and definitely no tracking or ads), but it shows what could be possible on the internet. It's just that nobody cares.
[1] Yes, Hacker News is also quite good in terms of loading speed.
It's the right thing to do to load resources asynchronously in parallel, but you shouldn't load the interface piecemeal. Even on web browsers.
I'd much rather wait for an interface to be reliable than have it interactive immediately but having to make a guess about its state.
I've learned to use default configurations pretty much everywhere. It's far too much of a hassle to maintain customizations, so it's easiest to just not care. The exception is my ~50 lines of VS Code settings I have sync'd in a mysterious file somewhere that I've never seen, presumably on github's servers, but not anywhere I can see?
Just your regular reminder that nix is good actually.
"I have a bug, you can get a full VM that reproduces it with 'nixos-rebuild build-vm --flake "github:user/repo#test-vm" && ./result/bin/run-*-vm'"
And the code producing that VM isn't just a binary blob that's a security nightmare, it's plain nix expressions anyone can read (basically json with functions).
And of course applying it to a new machine is a single command too.
(Would it be pedantic of me to say that I receive my fair share of bug reports on nix code I maintain, and when someone sends me their entire nixosConfig the very first thing I do is punt it back with a "can you please create a minimal reproducible configuration"? :D but your point stands. I think. I like to think.)
Is it? The vast majority of the time, I change settings/set things up the way I want, and then... leave them for literally years. Hell, I can directly restore a backup I have of Sublime Text from years ago and my customizations will work.
Somewhere along the way I lost interest in customizing the OS. These days I routinely switch between MacOS, Windows and various Linux flavors on lots of computers. The only thing I may customize is I write my .vimrc from memory.
On my Android phones, I change the wallpaper and I disable animations. Otherwise, stock everything.
Now that I think about it, it can't be the time saved, surely I waste more time on HN. It likely correlates more with using computers for work as opposed to for fun and learning. Even the learning I do these days is rather stressful - if I can steal an hour or two on the weekend, I feel lucky, so spending time to customize the environment seems like a waste.
Maybe if life slows down, I'll find joy in customizing my OSes again.
On the note of programming not being fun anymore, that's exactly why I'm making my secret project that I hope to release very very soon, maybe in a week or so. I want to make programming fun again, in a similar way that pico8 did, but x100.
> I have the same history customizing everything! ... then giving up because life gets busy.
I think this might be why some people have such different experiences. I don't try to customize "everything" - just what needs to be. Like, yeah, I would expect it to be difficult to maintain random Explorer customizations. I would not expect it to be difficult to maintain customization for a popular IDE.
Too much software put host-specific stuff in settings files (like absolute paths) or just are not stable enough in general that it is worth trying to maintain a portable configuration.
The hard part of maintaining a config is that there's no such thing as cost-free usage, it always takes a mental toll to change a config, to learn a new config, to remember which configs are in use and what they do, to backup configs, or at least to setup and maintain a config-auto-backup flow.
By far, the easiest mental model is just learning how everything works out of the box, and getting used to it. Then again, sometimes what people want is to do things the hard way for its own sake. That's probably part of why I kept going back to C over and over for so many years.
The oldest parts of my emacs config go back at least 30 years and I have had it in a git repo for ~15. I keep my entire .emacs.d versioned, including all third-party packages I depend on (so no fetching of anything from the cloud).
Have had to do at most minimal changes when upgrading to newer versions and with some tiny amount of work the same config still works for emacs from around version 21 to 31 (but features are of course missing in older versions).
He surely didn't use any Microsoft product. /s
More recently, long after I stopped using Windows but still many years ago, I was reading an article about Arthur Whitney. It had a photo which seemed to be at home, maybe in a furnished garage, and in the background was a desktop computer running Windows. The only window open was a cmd.exe. I am not suggesting anything. It is just something I always remember.
Perusing some recent Microsoft documentation I noticed this:
https://learn.microsoft.com/en-us/windows/configuration/shel...