Now, I really like Wayland, and it’s definitely better than the mess that is X11
BUT
I think the approach to Wayland is entirely wrong. There should be a unified backend/base for building compositors, something like a universal wlroots, so that applications dealing with things like setting wallpapers don’t have to worry about supporting GNOME, Plasma, Wlroots, AND Smithay (when COSMIC comes out). How about a universal Wayland protocol implementation that compositors are built on? That way, the developers of, say, wayshot, a screenshot utility, can be sure their program works across all Wayland compositors.
Currently, the lower-level work for creating a compositor has been done separately by four projects: GNOME, KDE, wlroots and Smithay. To me, that’s just replication of work and resources. If nearly all standalone compositors, and even the XFCE desktop, either use or want to use wlroots, surely the GNOME and KDE teams could have done the same instead of duplicating effort, wasting time and resources, and causing needless fragmentation in the process?
Am I missing something? Surely doing something like that would be better?
The issue with X11 is that it got big, bloated, and unmaintainable, full of useless code. None of these desktops use that useless code, which is still in X from the days when 20 terminals were all connected to one mainframe. So why not just use the lean and maintainable wlroots, making things easier for some app developers? And if wlroots follows in the footsteps of X11, we can move to another implementation of the Wayland protocols.
The advantage of Wayland is that it is a set of protocols describing how to make a compositor that acts as a display server. If all the current Wayland implementations disappear, or if they become abandoned, unmaintained, or unmaintainable, all the Wayland apps like calendars, file managers and other programs that don’t affect the compositor itself would keep working on any Wayland implementation. That’s the advantage for the developers of such applications. But what about other programs, like theme changers and wallpaper switchers? They would need to be remade for different Wayland implementations. With a unified framework, we could remove this issue.
I think that for some things, the Linux desktop needs some unity, and this is one of those things. Another would be Flatpak for desktop applications, and eventually Nix and similar projects for lower-level programs on immutable distros. But that’s a topic for another day. Anyways, do you agree with my opinion on Wayland or not? And why? Thank you for reading.
Don’t worry too much about the duplicated effort of different projects implementing the same standard on their own. It’s good to have lots of implementations of a standard. That’s what makes it a standard rather than just some code we all depend on.
That’s what makes it a standard rather than just some code we all depend on.
succinctly encapsulates the importance of code diversity. 👍
Things like taking screenshots and setting wallpaper actually do have a standard API. That stuff is just part of xdg desktop portals and not the core Wayland protocols. If, for example, a screenshot app uses the org.freedesktop.portal.Screenshot API then it should work with any compositor (as long as the compositor follows the API standards).
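For a rough idea of what that looks like in practice, here is a minimal sketch from a shell (assuming gdbus is available and a portal backend is running; the actual screenshot URI comes back asynchronously via a Response signal on the request object this call returns):
# ask whichever portal backend is running to take a screenshot
gdbus call --session \
  --dest org.freedesktop.portal.Desktop \
  --object-path /org/freedesktop/portal/desktop \
  --method org.freedesktop.portal.Screenshot.Screenshot \
  '' '{}'
The compositor’s own portal backend (GNOME’s, KDE’s, xdg-desktop-portal-wlr, …) does the actual capture, which is why the same call can work across compositors.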
Well, those require D-Bus. The wlroots project decided early on to support non-D-Bus software stacks, so wlroots compositors expose Wayland protocol extensions which can either be used directly or wrapped by the xdg-desktop-portal-wlr daemon.*
*(Well… many wlroots devs argued that the ecosystem should have chosen WP extensions instead of dbus, but I think most relented when Pipewire entered the equation.)
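To make the “used directly” route concrete, a wlroots-only screenshot looks something like this (a sketch, assuming grim and slurp are installed and you’re on a wlroots compositor):
# slurp lets you pick a region; grim grabs it via the wlr screencopy extension, no D-Bus involved
grim -g "$(slurp)" region.png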
- person using software developed in opposition to monolithic architectures rediscovering the benefits of monolithic architectures
Wayland isn’t to blame for duplicate effort. Instead of 4 different efforts doing the same thing, they can collaborate to build a common base. Heck, wlroots is exactly that.
There’s a ton of duplicated work in the Linux ecosystem. Just think about every new distro coming out doing the same things other distros did. Just think about all those package managers on different distros. They do almost the same thing. Do they need to have codebases that share nothing? No. But they don’t care. They’d rather duplicate effort. They chose this.
Wayland is a classic case of underspecification. They set out to replace X11, but their replacement only covered maybe 50% of what people were actually doing with X11; everything else was left as an exercise for the reader. That’s how you get the sluggish progress of the whole thing, as people either ignore Wayland because it doesn’t work for their use case, try ugly workarounds that will break in the long run, or implement the missing piece properly, which in turn can lead to multiple incompatible implementations of the same thing.
This also creates a weird value proposition for Wayland, as it’s basically like X11, just worse in every way. Even 14 years later it is still struggling to actually replace the thing it set out to replace, let alone improve on it in any significant way.
I have seen some improvements, to be honest. I have never seen screen tearing (which was quite common on X11), and compositors run more smoothly for me, with less resource usage (some of which is unfortunately taken up by heavy bars like Waybar). For example, Qtile would usually use about 780 MB after a cold boot on X11, while on Wayland it averages about 580-600 MB.
The thing is, what are the chances that those improvements needed a complete rewrite and couldn’t just be patched into X11? As for the lack of screen tearing, is that even an advantage? In X11, to get rid of it I can do (depends on the driver, but AMD has had it for ages):
xrandr --output HDMI-0 --set TearFree on
But more importantly, I can also do
xrandr --output HDMI-0 --set TearFree off
to get more responsiveness. Especially when it comes to gaming, that is a very important option to have. There are also other things, like CSD, which I consider a fundamental downgrade to the flexibility that X11 offered.
Disabling screen tearing on two or more monitors with different refresh rates is, as far as I know, impossible within the X11 protocol. This is especially annoying for high-refresh-rate VRR monitors, which could be tear-free with negligible cost in responsiveness.
You also can’t prevent processes from manipulating each other’s inputs/outputs. An X11 system can never have meaningful sandboxing because of this. Maybe you could run a new tweaked and sandboxed X server for every process, but at that point you’re working against the protocol’s fundamental design.
You also can’t prevent processes from manipulating each other’s inputs/outputs.
That’s one of those pseudo-problems. In theory, yeah, a bit more control over what apps can and can’t access would be nice. In reality, it doesn’t really matter, since any malicious app can do more than enough damage even without having access to the Xserver. The solution is to not run malicious code, or use WASM if you want real isolation. Xnest, Xephyr and X11 protocol proxies have also been around for a while; X11 doesn’t prevent you from doing isolation.
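For reference, a minimal sketch of that kind of nested-server isolation (the display number and the app name are just placeholders):
Xephyr :2 -screen 1280x720 &    # nested X server running inside a window on your real session
DISPLAY=:2 some-untrusted-app   # the app only talks to the nested server, not your real display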
Trying to patch sandboxing into Linux after the fact not only fails to give you isolation that is actually meaningful, it also restricts user freedom enormously. Screenshots, screen recording, screen sharing, keyboard macros, automation, etc. All very important things suddenly become a whole lot more difficult if everything is isolated. You lose a ton of functionality without gaining any. Almost 15 years later and Wayland is still playing catch-up to features that used to “just work” in X11.
In theory, yeah, a bit more control over what apps can and can’t access would be nice. In reality, it doesn’t really matter, since any malicious app can do more than enough damage even without having access to the Xserver.
Complete nonsense. Moving away from a protocol that allows every single application to log all inputs isn’t “a bit more control over what apps can and can’t access”. We’re switching from a protocol where isolation is impossible to one where it is possible.
The notion that if you can’t stop every possible attack with a sandbox then you should not bother to stop any of them is also ridiculous. A lot of malware is unsophisticated and low effort. Not bothering to patch gaping security holes just because there might be malware out there that gets around a sandbox is like leaving all your valuable stuff on the sidewalk outside your house because a good thief would have been able to break in anyway. You’re free to do so but you’ll never convince me to do it.
The solution is to not run malicious code
Another mischaracterization of the situation. People don’t go around deliberately running “malicious code”. But almost everyone runs a huge amount of dubious code. Just playing games, a very common use case, means running millions of lines of proprietary code written by companies who couldn’t care less for your security or privacy, or in some cases are actively trying to get your private data. Most games have some online component and many even expose you to unmoderated inputs from online strangers. Sandboxing just steam and your browser is a huge step in reducing the amount of exploitable vulnerabilities you are exposed to. But that’s all pointless if every app can spy on your every input.
Xnest, Xephyr and X11 protocol proxies have also been around for a while; X11 doesn’t prevent you from doing isolation.
What’s the point then of a server-client architecture if I end up starting a dedicated server for every application? It might be possible to have isolation this way, but it is obviously patched on top of a flawed design that didn’t account for isolation to begin with. Doing it this way will break all the same stuff that Wayland breaks anyway, so it’s not a better approach in any way.
Moving away from a protocol that allows every single application to log all inputs isn’t “a bit more control over what apps can and can’t access”.
Every app already has full access to your home directory and can replace every other app simply by fiddling with $PATH. What you get with Wayland is at best a dangerous illusion of security.
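To illustrate the point, a purely hypothetical sketch of how trivial that kind of shadowing is (the shim directory and target are placeholders):
mkdir -p ~/.sneaky-bin
printf '#!/bin/sh\necho "not the real firefox"\n' > ~/.sneaky-bin/firefox
chmod +x ~/.sneaky-bin/firefox
export PATH="$HOME/.sneaky-bin:$PATH"   # in this shell, "firefox" now resolves to the shim first
command -v firefox                      # points at the shim, not /usr/bin/firefox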
What’s the point then of a server-client architecture if I end up starting a dedicated server for every application?
Flexibility. I can choose to sandbox things or not. And given how garbage the modern state of sandboxing still is, I’d rather take that flexibility than be forced to sandbox everything.
Anyway, to take a step back: Wayland doesn’t actually solve any of this. It just ignores it. Not having a way to record inputs or take screenshots does not improve security; it simply forces the user to find other means to accomplish those tasks, and those means can then be utilized by any malicious app just the same. If you actually want to solve this issue, you have to provide secure means to do all those tasks.
it’s basically like X11, just worse in every way
Hard disagree there.
My desktop is far smoother, works much better with multi-monitor, Gnome’s trackpad gestures work amazingly, I never see ugly tearing, there are fewer instances of bugs and instability. Then on top of that there’s the far better security aspect.
So far the only issue I’ve experienced is screen sharing on discord.
I don’t quite follow your arguments. X11 got big and bloated, and Wayland applications need to worry about the different compositors, so we should use one implementation? Implementations should be irrelevant. That is the whole point of an API/protocol - a description of how things should talk to each other, even across different implementations.
I don’t see how one implementation helps here - that one implementation still needs APIs for the applications to talk to. The problem is not that there are different implementations, but maybe that the Wayland protocol does not cover enough of the API space needed by applications. Some of that is addressed by things like the xdg-desktop-portal.
Well, yes, but there are programs like wdisplays, wlr-randr, etc. which only work on wlroots compositors. Why is that the case?
It should work in any compositor that implements the wlr-output-management-unstable-v1 protocol. Compositors that are known to support the protocol are Sway and Wayfire.
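For example, on a compositor that supports that protocol, something like this should work (the output name is a placeholder; a rough sketch):
wlr-randr                                        # list outputs and their modes
wlr-randr --output HDMI-A-1 --mode 1920x1080 --pos 0,0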
It uses a new unstable protocol that others don’t support yet. The fact that it is unstable suggests it might change over time as well.
deleted by creator
There should be a unified backend/base for building compositors
There is… it’s called Weston; however, Weston is very, VERY minimalistic for a good reason.
Isn’t Weston just a reference implementation (one you shouldn’t use directly, but get ideas from)?
deleted by creator
That’s an interesting point of view. And I completely agree that it’s good to try not to turn things into X11. The thing is, even if that happens, Wayland apps will generally work across all implementations.
The issue with X11 is that it got big, bloated, and unmaintainable, full of useless code. None of these desktops use that useless code, which is still in X from the days when 20 terminals were all connected to one mainframe.
I don’t think that is very fair to say. From what I heard, the X.org code (as in, the implementation of the protocol and its extensions) is actually of very high quality, so it can be maintained. The problem, as you correctly describe, is the design and the resulting protocol with its extensions, which don’t fit modern needs.
It’s also not like multiple X11 servers implementing the X Window System couldn’t theoretically have existed simultaneously; it was just too much effort given the complexity of the protocol. In fact, for a short time, two different implementations existed: XFree86 and the X.org server. Granted, the latter was a fork of the former, but they were independent projects during the time of their coexistence.
But it is fair to say, considering they started Wayland precisely because they could not fix the issues with X11.
Yea, sometimes new problems need new solutions and the old architecture can get fundamentally outdated!
deleted by creator
Well, there are programs like wdisplays, wlr-randr, etc. which only work on wlroots compositors. Why is that the case? Am I missing something here? Please explain. I will admit I don’t know as much about Wayland as I wish I did.
deleted by creator
Why would they be different about it? I understand they want this supported asap and couldn’t wait for an unstable protocol to become stable, but I sincerely hope they switch to using this protocol once it matures.
Why would they be different about it?
Because if they rush it into stable, and then find flaws in it, or want to change how an API works for whatever reason, it will break applications that used to work perfectly one compositor update ago. (not so stable behavior)
When you first transition to Wayland, you need to replace applications that don’t use Wayland APIs and it’s a pain in the ass. Now imagine having to do that every time your compositor is updated. That’s what rushing things into stable will do.
Yeah, that’s why I think it would be better to focus on existing standards rather than multiplying work so much. That time could have been used for more productive tasks like working on accessibility, fixing bugs, or addressing documentation issues, or, when everything else is done, working on the mobile space, which is very much an emerging market.
So why not just use the lean and maintainable wlroots
wlroots can’t be used (comfortably and idiomatically) in Rust because it’s too hard (if not impossible) to provide a memory-safe interface for it.
we can move to another implementation of the Wayland protocols.
So unfortunately this has already happened.
What I wonder about: why do compositors have to handle keyboards (again)? Wayland compositors do, and X did. Shouldn’t that be handled separately?
What do you mean by handled separately? Both Wayland compositors and X11 use libinput nowadays to get input from the hardware. But then it needs to be routed to the right application: the one that is in focus, minus any global shortcuts that the compositor might want to deal with. The compositor is what understands which application has focus and thus knows where to send input, so it makes sense for it to handle that. It is not just about where to render windows; it manages all the events, such as input, that applications require.
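If you want to see that shared layer on its own, libinput ships debugging tools that show devices and raw events before any compositor routing happens (typically needs root; a minimal sketch):
sudo libinput list-devices   # what libinput detects
sudo libinput debug-events   # stream raw keyboard/pointer events as they arrive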
Right, forgot that part.
I love the idea of Wayland, but it only finally booted to the desktop for me earlier this year (on Manjaro KDE), and it still randomly freezes for about a full minute, quite a bit. I am keen to move to it, as my compositor on X11 hangs on KDE for some odd reason every time I try to do a rectangular-area screenshot with Spectacle (mmm, just realised that is also for around a minute - maybe I do have some other underlying issue), or when accessing the Compositor menu option. But X11 is still otherwise rock solid for me.
Manjaro’s to blame. Manjaro ships botched packages sometimes, and they are always two weeks behind, meaning you can’t use the AUR lest you break your system.
I approach from behind, and then hog tie the bastard!
On my desktop computer (Debian testing + sid + experimental; AMD Ryzen; Nvidia RTX 3080 GPU), which I use mostly for multimedia (Blender) and gaming, I avoid Wayland because I lose 10-15% FPS in games (both native ones and ones running through Wine/Proton/proton_eggroll)… so, for me, Wayland ain’t an option yet!
Personally, I haven’t had a performance hit, but many games don’t work properly. I also use a tiling window manager though. Gaming on Linux is not as easy as Wincrap. Gaming on a Tiler is harder. Gaming on Wayland on a Tiler is still quite crazy.
Normally projects like this address real needs. If X actually failed to provide crucial functionality on the modern desktop, someone would develop an alternative that covered it and people would switch in a matter of years. Instead, Wayland set out to build something complex and useless for most people, and now is surprised it takes a lot of time to gain traction.
How it should be approached: if people need some very specific setup (like multiple displays with fractional scaling and different refresh rates, and they want to play games on it and get 100% out of their configuration), Wayland should provide them a tool to do just that, with a dedicated server and DE. Most people wouldn’t need any of this and would stay with X; a few people would use the new DE. If more and more people required the functionality provided by the new DE, it would grow, get forked, and other DEs would start supporting the standard. The approach of “we build something 1% of users need, spend a lot of effort to support us” is what’s silly.
X is bad code and too hard to maintain. You do know the people developing Wayland are the same ones who developed X11? I think their biggest issue is they should have called it X12 or something, so people knew it was the successor to X11.
So? What features essential for the average Linux user are missing in X11? If the code is so bad, I’m sure users struggle to use it. Do they?
So what that those are the same devs? If X is bad code it would mean we should probably look for other devs to do it, right?
features
- mixed refresh rates
- (not GNOME) mixed VRR/nonVRR
- (not GNOME) Better mixed DPI?
- (not yet, experimental in gamescope) HDR support
- (not yet, experimental in KDE) persistence through compositor restart
It was the inability to add features like mixed refresh rates which caused Xorg devs to push for a new protocol. Otherwise it would be yet another series of janky patches breaking assumptions made in a 40-year-old protocol.
Other devs have been working on it. Valve’s contributions to wlroots, KDE, and gamescope can’t be overstated.
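As a small example of the VRR point in the list above, on a wlroots compositor like Sway it’s a per-output switch (the output name is a placeholder; a sketch only):
swaymsg output DP-1 adaptive_sync on   # enable VRR on just that monitor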
I admire your effort, but there are a lot of people in the open source community, and especially on Lemmy, who just don’t want to understand it. I try not to argue with them, but they are fucking everywhere with their trashtalking of various amazing open source projects!
I was curious about what they’d say next. Their argument is “most users don’t need more than Xorg, so it’s ‘silly’ to expect investment in Wayland”.
I found some agreement in “as more people need Wayland features, investment will grow”, especially with Valve and the KDE/wlroots/gamescope work. Also, Automotive Grade Linux embracing libweston.
None of those features are essential for the average user. None of them are even ‘nice to haves’ for like 90% of users.
They are becoming more essential by the day. HDR and VRR are supported by just about every graphics card from the last 5 years, and displays which support both can be found for $200 or less. Valve had a reason to add HDR support to Gamescope/Steam Deck; it is a highly requested feature.
I will agree with you on one point: Xorg is not bad code. Xorg is an awesome project, and has developed and changed to the needs of users exceedingly well for decades. But X11 itself is tech debt. The first ten years of Wayland were spent paying that debt off (while simultaneously continuing Xorg development).
If the features aren’t what you need, then Wayland wasn’t built to support you today. But you might find yourself in 6 years looking at a gorgeous HDR display which works out-of-the-box on your favorite Linux distro thanks to Wayland.
They are becoming more essential by the day.
Exactly, they are becoming essential now. They are still not essential, and they definitely were not essential 14 years ago. That’s all I’m saying. Expecting everyone to invest a lot of effort to support a project that does not deliver any value to the vast majority of users is silly. 14 years later, the need is slowly growing, so the support is materializing. That should have been the approach from the beginning: build something people need.
14 years later the need is slowly growing so the support is slowly growing
Yes! I agree wholeheartedly. Adoption has been slow because Wayland did not meet the needs of most people more than Xorg did. Cinnamon isn’t moving any time soon because the value-add isn’t enough for the average desktop user.
But…
build something that people need
People have needed HDR and VRR for years. HDR is essential for professionals in video and image editing. They needed Wayland years ago, and it was being built with them in mind, not just the average desktop user in 2012.
Not every feature is used by every user of that software. I used X-forwarding over SSH once, ever. It did not add any value to me. SSH forwarding adds no value to the average user either. But it is essential to someone.
Not relying on your own custom implementation of GL that completely breaks embedded platforms…
Again, not essential to the vast majority of Linux users.
If you’re going to marginalize a subset of an already marginalized community then nothing will ever be enough. What kind of an argument is this anyway?
Any form of security in the display server would be nice. X is incredibly insecure, with no trivial means of locking it down.
X is not code, and neither is Wayland.