Hacker News | creshal's comments

Eh, kinda. Work forces me to have Jira, Confluence, Gitlab, Copilot, the other Copilot formerly known as Outlook, the other other Copilot formerly known as Teams, as well as Slack of course, and a dozen other webslop apps open… and it still all fits in <8GB RAM.

Which is a lot worse than the <1GB you'd get with well-optimized native tools, but try running Win11 with "only" 8GB RAM.


I'm convinced the next Windows GUI will just be an Electron app that runs Copilot as the desktop, forcing you to argue with it to open a file or run a program. No titlebars, window buttons, or taskbar; just one big Copilot bar at the bottom that you can ask what's already running or to close an app. All of it written in JavaScript, of course.


Indeed. Much of a modern Linux desktop runs inside one of multiple not-very-well-optimized JS engines: Gnome uses JS for various desktop interactions, and all major desktops run a different JS engine as a different user to evaluate polkit authorizations (so exactly zero RAM can be shared between those engines, even if they were identical, which they aren't). Then half your interactions with GUI tools happen inside browser engines, either directly in a browser or indirectly via Electron. (And typically each Electron tool bundles its own slightly different version of Electron, so even if they all run under the same user, each is fully independent.)
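For reference, those polkit authorizations are themselves written in JavaScript: admin rules are small JS callbacks evaluated by polkit's embedded engine (duktape or mozjs, depending on the build). A sketch of what such a rule looks like; the file path, action ID, and group name here are illustrative, not prescriptive:

```javascript
// Hypothetical rule, installed as e.g. /etc/polkit-1/rules.d/50-storage.rules:
// let members of the "storage" group mount system filesystems
// without an admin password prompt.
polkit.addRule(function (action, subject) {
    if (action.id === "org.freedesktop.udisks2.filesystem-mount-system" &&
        subject.isInGroup("storage")) {
        return polkit.Result.YES;
    }
    return polkit.Result.NOT_HANDLED; // defer to later rules and defaults
});
```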

Or you can ignore all that nonsense and run openbox and native tools.


Which makes it baffling that they chose it: I remember there being memory leaks because GObject uses a reference-counted model, so cycles running from GObject into JS and back were impossible to collect.

They did hack around this with heuristics, but they never did solve the issue.

They should've stuck with a scripting language with strong support for embedding, like Lua.
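The cycle problem above is inherent to pure reference counting, whatever the embedded language. A quick sketch in Python (used purely as an illustration, since CPython combines refcounting with an optional tracing cycle collector) shows why a cycle never dies on refcounts alone:

```python
import gc
import weakref

class Node:
    pass

# Build a two-object reference cycle, analogous to a GObject holding
# a JS wrapper that in turn holds a reference back to the GObject.
a, b = Node(), Node()
a.peer = b
b.peer = a
probe = weakref.ref(a)  # observe liveness without keeping `a` alive

gc.disable()                 # simulate a pure refcounting runtime
del a, b                     # drop the only external references
assert probe() is not None   # the cycle keeps both objects alive

gc.collect()                 # a tracing collector can break the cycle
assert probe() is None
gc.enable()
```

GJS's heuristics mentioned above were essentially an attempt to approximate that tracing step across the GObject/JS boundary.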


A month with CrunchBang Plus Plus (a really nice Openbox-based distribution) and you'll appreciate how quick and well put together Openbox and its text-based config files are.


> Much of a modern Linux desktop e.g. runs inside one of multiple not very well optimized JS engines

A couple of years ago I saw a talk by Sophie Wilson, the designer of the ARM chip. She had been amused by someone saying there was "an ARM" inside every iPhone: she pointed out that there were 6-8 asymmetric ARM cores in the CPU section of the SoC (some big and fast, some small and power-frugal), an ARM core in the Bluetooth controller, another in the Wifi controller, several in the GSM/mobile controller, at least one in the memory controller, several in the flash memory controller...

It wasn't "an ARM chip". It was half a dozen ARMs in early iPhones, and maybe dozens in modern ones. More in anything with an SD card slot, as SD cards typically contain an ARM core or several to manage the blocks of storage, with yet more ARMs in the interface talking to those.

Wheels within wheels: multiple very similar cores, running different OSes and RTOSes and chunks of embedded firmware, all cooperatively running user-facing OSes with a load of duplication. Like a shell in one JavaScript engine launching Firefox, which contains a different version of the same JavaScript engine, plus another in Thunderbird, plus another embedded in Slack, and another copy embedded in VSCode.

Insanity. Make a resource cheap and it is human nature to squander it.


I've found that Gnome works about as well as other "lighter" desktop environments on some hardware I have that is about 15 years old. I don't think it using a JS engine really impacts performance as much as people claim. Memory usage might be a bit higher, but the main memory hog on a machine these days is your web browser.

I have plenty of complaints about gnome (not being able to set a solid colour as a background colour is really dumb IMO), but it seems to work quite well IME.

> Or you can ignore all that nonsense and run openbox and native tools.

I remember mucking about with OpenBox and similar WMs back in the early 2000s and I wouldn't want to go back to using them. I find Gnome tends to expose me to less nonsense.

There is nothing specifically wrong with Wayland either. I am running it on Debian 13 with a triple-monitor setup without issues. Display scaling works properly on Wayland (it doesn't on X11).


> I find Gnome tends to expose me to less nonsense.

IMHO, I find the reverse. It feels like a phone/tablet interface. It's bigger and uses way more disk and memory, but it gives me less UI, less control, less customisation, than Xfce which takes about a quarter of the resources.

Example: I have 2 screens. One landscape on the left, one portrait on the right. That big mirrored L-shape is my desktop. I wanted the virtual-desktop switcher on the right of the right screen, and the dock thing on the left of the left screen.

GNOME can't do that. They must be on your primary display, and if that's a little laptop screen but there is a nice big spacious 2nd screen, I want to move some things there -- but I am not allowed to.

If I have 1 screen, keep them on 1 screen. If I have 2, that pair is my desktop, so put one panel on the left of my desktop and one on the right, even if those are different screens -- and remember this so it happens automatically when I connect that screen.

This is the logic I'd expect. It is not how GNOME folks think, though, so I can't have it. I do not understand how they think.


> IMHO, I find the reverse. It feels like a phone/tablet interface. It's bigger and uses way more disk and memory, but it gives me less UI, less control, less customisation, than Xfce which takes about a quarter of the resources.

I've used Xfce quite a lot in the past and quite honestly most of the "customisation" in it is confusing to use and poorly thought out.

I've also found these "light DEs" to be less snappy than Gnome. I believe this is because it takes advantage of the GPU acceleration better, but I am not sure tbh. The extra memory usage I don't really care about. My slowest laptop that I use regularly has 8GB RAM and it is fine. Would I want to use this on a sub-4GB machine? No. But realistically you can't do much with that anyway.

Also Gnome (with Wayland) does a lot of stuff that Xfce can't do properly. This is normally to do with HiDPI scaling and different refresh rates. It all works properly.

With Xfce, I had to mess about with DPI hacks and other things.

> Example: I have 2 screens. One landscape on the left, one portrait on the right. That big mirrored L-shape is my desktop. I wanted the virtual-desktop switcher on the right of the right screen, and the dock thing on the left of the left screen.

> If I have 1 screen, keep them on 1 screen. If I have 2, that pair is my desktop, so put one panel on the left of my desktop and one on the right, even if those are different screens -- and remember this so it happens automatically when I connect that screen.

I just tried the workspace switcher. I can switch virtual desktops with Super + Scroll on any desktop. I can also choose virtual desktops on both screens using Super + A, and then there is a virtual desktop switcher on each screen.

I just tried it on Gnome 48 on Debian 13 right now. It is pretty close to what you are describing.

> This is the logic I'd expect. It is not how GNOME folks think, though, so I can't have it. I do not understand how they think

I think people just want to complain about Gnome because it is opinionated. I also don't like KDE.

I install two extensions on my desktop: the Dash to Dock and AppIndicators plugins. On the light DEs and window managers, I was always messing about with settings and things always felt off.


This is quite interesting. As before, what you find is the reverse of what I find.

> I've used Xfce quite a lot in the past and quite honestly most of the "customisation" in it is confusing to use and poorly thought out.

In places, it can be. For instance, the virtual-desktop switcher: you choose how many desktops in one place, how many rows to show in the panel in another, and how to switch in a third. This shows it evolved over time. It's not ideal, but it works.

But the big point is, it's there. I'd rather have confusing customisation (as Xfce can be) than no customisation like GNOME.

> I've also found these "light DEs" to be less snappy than Gnome.

I find the reverse.

> I believe this is because it takes advantage of the GPU acceleration better

Some do, yes. But I avoid dedicated GPUs for my hardware, and most of the time, I run in VMs where GPU acceleration is flakey. So I'd rather have tools that don't need hardware for performance than tools that require it.

Here's some stuff I wrote about that thirteen years ago.

https://liam-on-linux.livejournal.com/33987.html

I really have been working with this for a while now. I am not some kid who just strolled in and has Opinions.

> The extra memory usage I don't really care about.

You should. More code = more to go wrong.

When I compared Xfce and GNOME for an article a few years ago I compared their bug trackers.

GNOME: about 45,000 open bugs.

Xfce: about 15,000 open bugs.

This stuff matters. It is not just about convenience or performance.

> But realistically you can't do much with that anyway.

News: yeah you can. Billions have little choice.

The best-selling single model range of computers since the Commodore 64 is the Raspberry Pi range, and the bulk of the tens of millions of them they've sold have 1GB RAM -- or less. There is no way to upgrade.

> Also Gnome (with Wayland) does a lot of stuff that Xfce can't do properly.

I always hear this. When I worked for SUSE, I had to sit down with a colleague pumping this BS and, step by step, function by function, prove to him that Xfce could do every single function he could come up with in KDE and GNOME put together.

> This is normally to do with HiDPI scaling,

Don't care. I am 58. I can't see the difference, so I do not own any HiDPI monitors. Features that only young people with excellent eyesight can even see are ageist junk.

> different refresh rates.

Can't see them either. I haven't cared since LCDs replaced CRTs. I can't see any flicker, so I don't care. See my comment above.

> I just tried the workspace switcher. I can switch virtual desktops with Super + Scroll on any desktop. I can also choose virtual desktops on both screens using Super + A, and then there is a virtual desktop switcher on each screen.

You're missing the point, and you are reinforcing the GNOME team's habit of taking away my choices. I told you that I can't arrange things where I want them -- even with extensions. Your reply is "it works anyway".

I didn't say it didn't work. I said I hate the arrangement and it is forced on me and I have no choice.

> I just tried it on Gnome 48 on Debian 13 right now. It is pretty close to what you are describing.

It is not even similar.

> I think people just want to complain about Gnome because it is opinionated. I also don't like KDE.

I complain about GNOME because I have been studying GUI design and operation and human-computer interaction for 38 years and GNOME took decades of accumulated wisdom and experience and threw it out because they don't understand it.

> I install two extensions on desktop. Dash to Dock and Appindicators plugins. On the light DEs and Window Managers, I was always messing about with settings and thing always felt off.

So you are happy with it. Good for you. Can you at least understand that others hate it and have strong valid reasons for hating it and that it cripples us?


There is so much wrong here I don't really know where to start: a bunch of the usual flawed assumptions about things that haven't been relevant in decades. So I am going to pick the most egregious examples.

> But the big point is, it's there. I'd rather have confusing customisation (as Xfce can be) than no customisation like GNOME.

Those Gnome plugins and extensions I install, I must have imagined. I am sure there will be some reason why this isn't good enough, but I can customise my desktop absolutely fine.

https://extensions.gnome.org/

> Some do, yes. But I avoid dedicated GPUs for my hardware, and most of the time, I run in VMs where GPU acceleration is flakey. So I'd rather have tools that don't need hardware for performance than tools that require it.

I am not sure why you wouldn't want GPU acceleration that works properly.

Your example of VMs: Gnome works fine in a VM (I used it yesterday), over Remote Desktop, and even Citrix. I used Gnome in a Linux VM over RDP and Citrix for 2 years at work. It worked quite well in fact, even over WAN.

I don't care what the situation was 13 years ago (I doubt it was true even then, btw, because I was using a CentOS 7 VM).

EDIT: I just read the article. You are complaining about enabling a bloody checkbox.

> The best-selling single model range of computers since the Commodore 64 is the Raspberry Pi range, and the bulk of the tens of millions of them they've sold have 1GB RAM -- or less. There is no way to upgrade.

I guarantee you people aren't using these 1GB models as desktops. They are using them for things like a Pi-hole, Home Assistant, a 3D printer, Kodi, retro gaming emulators, or embedded applications.

People do run KDE, Gnome and Cinnamon on the 4GB/8GB/16GB models, or buy a Pi 400/500.

> I always hear this. I had to sit down with a colleague pumping this BS when I worked for SUSE and step by step, function by function, prove to him that Xfce could do every single function he could come up with in KDE and GNOME put together.

I was quite obviously talking about HiDPI support. You didn't read what I said.

This stuff works properly on Gnome and not on Xfce.

> Don't care. I am 58. I can't see the difference, so I do not own any HiDPI monitors. Features that only young people with excellent eyesight can even see are ageist junk.

I do fucking care. I use a HiDPI monitor. Fonts are rendered better. The games I run on my desktop look better. I like it.

I am 42. I can see the difference. While I am younger, I am not that young.

Why you are bringing ageism into what is essentially more pixels on a screen, I have no idea. It is baffling that you are taking exception because I want scaling to work properly on the monitors I purchased. BTW, my monitors are over a decade old now; HiDPI is not novel.

> It is not even similar.

It is exactly what you described. I literally read what you said and compared it to what I could do on my Gnome desktop. So I can only assume that you can't actually describe the issue properly. That isn't my issue; that is yours.

> So you are happy with it. Good for you. Can you at least understand that others hate it and have strong valid reasons for hating it and that it cripples us?

No. You literally repeated all the usual drivel that isn't true (I know, because I've actually used Gnome) and complaints that boil down to "I don't like how it works" or "the developers said something I didn't like and now I hate them forever". It is tiresome and trite. I would expect such things from someone in their early 20s, yet you are almost 60.


> I am sure there will be some reason why this isn't good enough,

Installing extensions is not customisability. It is code patching on the fly and it breaks when the desktop gets upgraded.

Not good enough.

> Gnome works fine in a VM

Again you translate "does not do something well" into "it does not work". Yes, it can run in a VM. It doesn't do it very well, and it only does it if the VM is powerful and on a fast host.

Just a few years ago it did not work.

> EDIT: I just read the article. You are complaining about enabling a bloody checkbox.

You didn't understand it, then. It is really about what settings to enable and what extensions you must install.

> I guarantee you people aren't using these 1GB models as desktops.

Then you're wrong. I did myself not long ago. Most of the world is poor, most of the world doesn't have high-end tech.

> I was quite obviously talking about HiDPI support. You didn't read what I said.

I read it. I replied. I don't care.

The GNOME developers destroyed an industry-standard user interface -- https://en.wikipedia.org/wiki/IBM_Common_User_Access -- which I am willing to bet you've never heard of and don't know how to use -- just to avoid getting sued by Microsoft 20 years ago.

A bunch of entitled kids who don't know how to use a computer with keyboard alone and who don't give a fsck about the needs of disabled and visually impaired people ripped out menu bars and a tonne more to make their toy desktop, but they threw in features to amuse audiophiles and people with fancy monitors, and you don't understand why I am pissed off.

You ripped out my computer's UI and replaced it with a toy so you could have higher refresh rates and shinier games.

> It is baffling

It's only baffling because you've never heard before from anyone inconvenienced by it and never thought before about other people's needs and use cases -- which is GNOME all over.

> It is exactly what you described.

No it is not.

Tell me what extensions will put the GNOME favourites bar on the left of the left screen and a vertical virtual desktop switcher on the right of the right screen.

You didn't understand my blog post about GUI acceleration in VMs, and you don't understand my comments either.

I have used every single version of GNOME released since 2.0 and I know my way round it pretty well -- same as I am an atheist and know the Bible better than all but about three so-called Christians I've met in six decades. Know your enemy.

I have been getting hatred and personal abuse from the GNOME team and GNOME fans, every time I ever criticise it, for over a decade now. It is the single most toxic community I know in Linux.


> same as I am an atheist and know the Bible better than all but about three so-called Christians I've met in six decades. Know your enemy.

I missed this the first time around. The fact that you see Christians as enemies (your words btw) is quite telling about this entire interaction/conversation we've had.

I honestly think that if you haven't learned why this attitude of yours is a problem at almost 60 years old, you never will.

I won't be responding to you again on any topic.


> Installing extensions is not customisability. It is code patching on the fly and it breaks when the desktop gets upgraded.

This is nonsense.

1) It changes how things work to how I prefer them, so that is customising it. 2) I've used the same extensions for ages. Nothing has ever broken.

Basically, what you and a lot of people want is hundreds of options for setting trivial things. OK, fine: then don't use Gnome. Nobody is forcing you to use it.

As I said I install dash to dock and appindicator icons.

> Again you translate "does not do something well" into "it does not work".

It seems that you are getting hung up on the phrase "works fine" and want to get into some stupid semantic argument.

I found that it does do it well. You didn't read what I said. I used it for 2 years. It worked perfectly fine for the duration.

So I know for a fact that what you are saying is incorrect.

> You didn't understand it, then. It is really about what settings to enable and what extensions you must install.

I was being flippant when I said "enable a checkbox". I've done what was described in your blog post myself, in VirtualBox, in the past.

It isn't difficult, and pretending it is would be asinine. I haven't used VirtualBox in years, but I am quite familiar with the general procedure from when I did.

> I read it. I replied. I don't care.

Right. So why are you replying at all? And why should I care about your opinion if you aren't willing to consider mine?

You said you were 58 years old; I expect someone who is 58 (and clearly articulate) to behave better, tbh.

> A bunch of entitled kids who don't know how to use a computer with keyboard alone and who don't give a fsck about the needs of disabled and visually impaired people ripped out menu bars and a tonne more to make their toy desktop, but they threw in features to amuse audiophiles and people with fancy monitors, and you don't understand why I am pissed off.

I have HiDPI monitors for work. You keep on making assumptions about people and then come to the wrong conclusions.

Also, I actually have a blind friend, and he says that Gnome works reasonably well (he installed it in a VM on his Mac).

He says it isn't as good as macOS, and thus he still uses his Mac. But he has used Gnome and Unity, and he says they are "ok".

As for PipeWire/Pulse: I had some issues with it a while ago, but it all seems to be fixed now.

So I am going to assume that you don't know what you are talking about.

> You ripped out my computer's UI and replaced it with a toy so you could have higher refresh rates and shinier games.

This is absolute nonsense. I did nothing of the sort. I just customised the default UI that happened to come with CentOS 7 at work, happened to like it, and have usually returned to it since.

Gnome is actually known for not working well with games. I am actually making a YouTube video about it. You have to install Gamescope to sandbox the compositor.

This is another case of you not knowing what you are on about quite frankly.

> I have been getting hatred and personal abuse from the GNOME team and GNOME fans, every time I ever criticise it, for over a decade now. It is the single most toxic community I know in Linux.

Says the person who just told me he didn't care about my needs or whether my hardware works, and who then blames me for something I never did. The toxicity isn't coming from me.

BTW, none of this was done by me. I use Gnome; I am not part of the community. I've done exactly one YouTube video, for a friend, to show him how to configure some stuff in Gnome as he was new to Linux. Oh, and I think I may have once logged a bug on their issue tracker.

It seems to me that you are arguing with the wrong person. You need to direct your anger elsewhere. I did find the accusations quite hilarious, though, so thanks for the giggles.


COSMIC is gaining ground as a JS-free alternative to current desktops, so hopefully you won't be limited to openbox and such.


Openbox isn't limiting me; Wayland still has no advantages for what I do with desktops.


Wherever it is they're finding the people in charge of canteen menus.


And coffee didn't make the jump until around the same time, either. No wonder Europeans wanted to be anywhere on the planet except Europe.


Coffee in Europe predates 1492 I think.


Coffee was around in Ethiopia and Yemen before that, but it didn't really spread in the Muslim world before 1500, and didn't spread from there to Europe until even later.


It really depends on how you tackle gradual typing at the project level. The easiest way to sabotage it is an "any new code must be fully type checked" requirement, because it often means you also need to add type hints to any code you call, which leads to Optional[Union[Any]] nonsense if you let juniors (or coding assistants) go wild.

As always, no fancy new tech is a substitute for competent project management.


That'd be even more reason for them to have a solid PR plan prepared, to grind down opposition and gaslight everyone into giving up. Leaving all messaging about the issue to upset users is the worst way to handle it. Even just closing the issue would've been less damaging at this point.


"Industry hearsay" in this case was probably Sony telling game devs how awesome the PS5's custom SSD was gonna be, and nobody bothered to check their claims.


the industry hearsay is about concerns over HDD load times tho


HDD load times compared to......?


are we not reading the same post and comments?

the "industry hearsay" from two replies above mine is about deliberate data duplication to account for the spinning platters in HDDs (which isn't entirely correct, as the team on Helldivers 2 has realized)


What are you talking about?

This has nothing to do with consoles, and only affects PC builds of the game


HD2 started as a PlayStation exclusive, and was retargeted mid-development for a simultaneous release.

So the PS5's SSD architecture was what developers were familiar with when they tried to figure out what changes would be needed to make the game work on PC.


If what they were familiar with was a good SSD, then they didn't need to do anything. I don't see how anything Sony said about their SSD would have affected things.

Maybe you're saying the hearsay was Sony exaggerating how bad hard drives are? But they didn't really do that, and the devs would already have experience with hard drives.


What Sony said about their SSD was that it enabled game developers to stop duplicating assets the way they did for rotating storage. One specific example I recall from Sony's presentation was a mailbox asset in a Spider-Man game, with hundreds of copies of that mailbox duplicated on disk because the game divided Manhattan into chunks and tried to store all the assets for each chunk more or less contiguously.

If the Helldivers devs were influenced by what Sony said, they must have misinterpreted it and taken away an extremely exaggerated impression of how much on-disk duplication was being used for pre-SSD game development. But Sony did actually say quite a bit of directly relevant stuff on this particular matter when introducing the PS5.


Weird, since that's a benefit of any kind of SSD at all. The stuff their fancy implementation made possible was per-frame loading, not just convenient asset streaming.

But uh if the devs didn't realize that, I blame them. It's their job to know basics like that.


By far the most important thing about the PS5 SSD was the fact that it wasn't optional, and developers would no longer have to care about being able to run off mechanical drives. That has repercussions throughout the broader gaming industry because the game consoles are the lowest common denominator for game developers to target, and getting both Xbox and PlayStation to use SSDs was critical. From the perspective of PlayStation customers and developers, the introduction of the PS5 was the right time to talk about the benefits of SSDs generally.

Everything else about the PS5 SSD and storage subsystem was mere icing on the cake and/or snake oil.


Yeah, that's what I was trying to get at. Sony was extremely deceptive in how they marketed the PS5 to devs, and the Helldivers dev don't want to admit how completely they fell for it.


It's incompetence if they "fell for" such basic examples being presented in the wrong context. 5% of the blame can go to Sony, I guess, if that's what happened.

And on top of any potential confusion between a normal SSD and a fancy SSD: a mailbox is a super tiny asset, and the issue in the Spider-Man game was very rapidly cycling city blocks in and out of memory. That's quite different from Helldivers level loading.


I don't really understand your point. You're making a very definitive statement about how the PS5's SSD architecture is responsible for this issue -- when the issue is on a totally different platform, where they have _already_ attempted (poorly, granted) to handle the different architectures.


No. Please try reading more carefully.


> executives conveniently stopped using the term "AGI," preferring weasel-words like "transformative AI" instead.

Remember when "AGI" was the weasel word because 1980s AI kept on not delivering?


But how much are you paying for these services?


My family? Same as they pay for Google


See also FreeBSD: plenty of commercial offerings around it, and no source for most of them, because the license doesn't require it. For example, there's no source for the PlayStation kernels/userlands released by Sony. They only upstream some bug fixes that would be too onerous to keep in their private fork.


> They only upstream some bug fixes that would be too onerous to keep in their private fork.

Are you arguing that more good things would go upstream if it were licensed non-permissively, or are you giving an example where it works well enough?


They're privatizing their profits and socializing their losses.

It's not healthy.

