
Friday, 10 March 2023

Make sure you commit anything you want to end up in the KDE Gear 23.04 releases to the release branches.

We're already past the dependency freeze.

The Feature Freeze and Beta is next week, on Thursday 16 March.

More interesting dates  
  March 30: 23.04 RC (23.03.90) Tagging and Release
  April 13: 23.04 Tagging
  April 20: 23.04 Release

https://community.kde.org/Schedules/KDE_Gear_23.04_Schedule

Let’s go for my web review for the week 2023-10.


You Are Not a Parrot. And a chatbot is not a human. And a linguist named Emily M. Bender is very worried what will happen when we forget this.

Tags: tech, ai, gpt, machine-learning, cognition, linguistics, politics, ecology, ethics

This is an excellent piece. Very nice portrait of Emily M. Bender, a really gifted computational linguist and a real badass if you ask me. She’s out there asking all the difficult questions about the current moment regarding large language models, and so far the answers are (I find) disappointing. We collectively seem to be way too fascinated by the shiny new toy and the business opportunities to pay real attention to the impact of all of this on the social fabric.

https://nymag.com/intelligencer/article/ai-artificial-intelligence-chatbots-emily-m-bender.html


Did You Miss My Comment or What? Understanding Toxicity in Open Source Discussions

Tags: tech, sociology, foss, github

Early days for this type of research, so there are a couple of limitations to keep in mind while reading this paper. Most notably: a rather small sample is explored (it’s a qualitative study) and it tends to conflate GitHub with “the Open Source community”. The latter especially matters since the vibe can be very different outside of GitHub. That being said, there are very interesting findings in there. Some validate my experience with GitHub. It’s clear that compared to other spaces there’s much more entitled behavior from some people. Interestingly the words seem on average less violent (although it does happen of course) than on other platforms… still this is important to keep in check since it could have implications for prospective contributors. The last point in their discussion section is promising. Some of the current manual interventions from maintainers seem to have good results (encouraging) and it seems possible to at least semi-automate the handling of toxic comments, which could help with maintainers’ well-being.

https://cmustrudel.github.io/papers/osstoxicity22.pdf


diziet | Never use git submodules

Tags: tech, git, tools

I tend to agree with this quite a lot. Git submodules tend to create lots of strange issues and a rather bad developer experience. Even worse, it’s not necessarily spotted straight away: you notice the real pain only after having invested in them quite a bit. There are alternatives worth exploring though.

https://diziet.dreamwidth.org/14666.html


Safety and Soundness in Rust

Tags: tech, rust, safety

People tend to be fixated on the “unsafe” keyword and assume that not using it will make their code devoid of memory safety bugs. Well, it’s a bit more subtle than that. It helps you know where such bugs can hide, but it can’t completely prevent them all the way down the stack.

https://jacko.io/safety_and_soundness.html


Indices point between elements - Made of Bugs

Tags: tech, programming

Neat way to think about array indices; if it were widespread it would simplify a few things in documentation, I think.

https://blog.nelhage.com/2015/08/indices-point-between-elements/


A discussion between Casey Muratori and Robert C. Martin about Clean Code

Tags: tech, architecture, performance, craftsmanship

Very interesting conversation between Uncle Bob and one of the recent critics of his work regarding performance. I like how he admits some faults in the way he presents things and tries to improve for later rather than trying to be right. Some people should learn from that. There’s clearly a tension between performance and what is described in Clean Code; it’d be pointless to deny it.

https://github.com/unclebob/cmuratori-discussion/blob/main/cleancodeqa.md


An Alternative to Dependency Injection Frameworks – Software the Hard way

Tags: tech, architecture, complexity, java

Indeed, in some types of projects people tend to turn to Dependency Injection frameworks a bit blindly (especially true in the Java world). Still, there are other patterns which give similar benefits with fewer headaches. It’s worth investigating whether they fit your context before picking up a framework.

https://software.rajivprab.com/2018/11/06/an-alternative-to-dependency-injection-frameworks/


Why You Should Send a Weekly Summary Email | by Jens-Fabian Goetzmann | Feb, 2023 | Medium

Tags: tech, organization, team, note-taking

The advice is sound. Having more written records of such things definitely helps teams. It can also have benefits in other forms (notes or todos) if you do it just for yourself.

https://jefago.medium.com/why-you-should-send-a-weekly-summary-email-1c556149ed42


How to hire engineering talent without the BS · Jesal Gadhia

Tags: tech, hr, interviews

Hiring and interviewing isn’t simple. There is good advice in this piece. In particular I strongly agree that leet coding is probably not it and that having something guided and scripted is necessary.

https://jes.al/2023/03/how-to-hire-engineering-talent-without-the-bs/


Want an unfair advantage in your tech career? Consume content meant for other roles

Tags: tech, management, empathy, culture, team, learning

This is definitely worthy advice with lots of interesting side effects. For me the main motive, beyond sheer curiosity, is developing more empathy towards others with different roles.

https://matthewgrohman.substack.com/p/want-an-unfair-advantage-in-your


The Lost Art of Lacing Cable - The Broadcast Bridge - Connecting IT to Broadcast

Tags: tech, networking, history, culture

Fascinating old school way to manage cables. And indeed the result looks pretty as well.

https://www.thebroadcastbridge.com/content/entry/12400/the-lost-art-of-lacing-cable



Bye for now!

Thursday, 9 March 2023

There have been some recent discussions about how KDE applications (or Qt apps in general) should look and feel outside of the Plasma desktop, particularly in a GNOME environment.

During this discussion I noticed two major disconnects between the involved parties. One of them is technical in nature, where (understandably) not everyone involved has deep knowledge about how Qt and KDE apps work. The other one is cultural in nature, where there are opposing views about who gets to decide how an application should look and feel on a given platform.

I can’t do much about the cultural issue, but I can help the conversation by giving some much needed overview of how any of this works on a technical level. Everyone being on the same page technically could help foster a more productive conversation about this complex topic.

First of all it’s important to note that Qt at its core is an abstraction across various platforms (most important here are Linux, Windows, and macOS, but also to some degree Android and iOS). Whenever possible Qt tries to use the platform’s native facilities for anything, whether that’s rendering, file dialogs, widget styles, etc. This becomes somewhat messy when you consider that “Linux” isn’t exactly a single, well-defined “platform”. Qt does usually have non-native fallbacks for things like file dialogs and widget styles, but they aren’t necessarily something you want a user to have to see. It’s also important to mention that Qt has two somewhat competing ways of defining UIs: the traditional QtWidgets, and the more recent QtQuick/QML.

There are several somewhat independent pieces involved in how a Qt application looks and feels. Jan Grulich already talked about some of them in the context of GNOME and QGnomePlatform, but there are also things specific to KDE applications that aren’t mentioned.

The first piece is the “Qt Platform Theme” (QPT). Despite the name it doesn’t have much to do with the visual style. It is responsible for applying settings from the platform. This for example includes font settings, the double click interval, or whether a file should be opened on single or double click. It also defines what standard dialogs look like, most importantly the file picker dialog, but also dialogs like a color picker. Third, it defines the color palette (QPalette) the application is using. More on that later. Qt itself ships platform themes for non-Linux platforms as well as somewhat generic Linux platform themes for GNOME and Plasma. Notable out-of-tree plugins exist, like plasma-integration, which you are using right now if you are on Plasma, the aforementioned QGnomePlatform targeted towards GNOME (and to some degree similar environments), and qt5ct, which isn’t aligned to a specific environment and provides generic control over platform theme things.
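
To make this a bit more tangible, here is a minimal sketch (my own example, not taken from any particular app) of the kind of call that ends up going through the platform theme: the application just asks for a file dialog, and the active QPT decides whether that is the KDE picker, the GNOME one, or Qt’s built-in fallback.

  #include <QApplication>
  #include <QFileDialog>
  #include <QString>

  int main(int argc, char *argv[])
  {
      QApplication app(argc, argv);
      // This convenience call is routed through the Qt Platform Theme when it
      // provides a native dialog; Qt's own dialog is only the fallback.
      const QString path = QFileDialog::getOpenFileName(nullptr, QStringLiteral("Open File"));
      return 0;
  }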

The second, and perhaps most well-known, knob is the widget style (also called QStyle). It controls the majority of the appearance of a QtWidgets application. Well-known examples include Breeze (the current Plasma default), Oxygen (the KDE4 default), adwaita-qt, as well as built-in styles for Windows/macOS. Qt also comes with a built-in Fusion style. QStyles are implemented as C++ plugins. Whenever the app needs to render some piece of UI, e.g. a button, it defers that to the style plugin. Some styles, e.g. the Windows one, then use platform-native APIs to render widgets; others like Breeze draw the widgets from scratch. Application developers can also include custom styles for complete control over the appearance.
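
As a small, hedged illustration of how these plugins are picked up, an application can request a QStyle by name; the names below are only examples, and which ones resolve depends entirely on the style plugins installed on the system.

  #include <QApplication>
  #include <QStyle>
  #include <QStyleFactory>

  int main(int argc, char *argv[])
  {
      QApplication app(argc, argv);
      // "Fusion" ships with Qt itself; "Breeze" or "adwaita-qt" would only be
      // found if the corresponding style plugin is installed.
      if (QStyle *style = QStyleFactory::create(QStringLiteral("Fusion"))) {
          QApplication::setStyle(style);
      }
      // ... widgets created from here on are painted by the chosen style ...
      return app.exec();
  }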

The third important concept is QPalette. A QPalette is a set of colors used to draw UI elements. The palette is defined by the platform theme(!). For example Plasma uses this to apply the color scheme set in System Settings. QGnomePlatform uses it to apply Adwaita-like colors. The selected QStyle may (or may not!) use this palette when drawing controls. The application developer can also manually query colors from the palette for drawing custom widgets while still respecting the platform’s wanted colors. A platform theme may only offer a single palette this way, or include light and dark variants, or allow the user to configure arbitrary color sets (like we do on Plasma). It is also possible for application developers to override the system-provided palette, for example to offer an in-app dark mode switch.
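
For illustration, this is roughly what “manually query colors from the palette” looks like in practice; the widget below is purely hypothetical, the point is only that it never hard-codes a color and therefore follows whatever palette the platform theme (or the application) has installed.

  #include <QPainter>
  #include <QPalette>
  #include <QWidget>

  // Hypothetical custom widget, just to show the pattern.
  class Badge : public QWidget
  {
  public:
      using QWidget::QWidget;

  protected:
      void paintEvent(QPaintEvent *) override
      {
          QPainter p(this);
          // Colors come from the QPalette set up by the platform theme (or
          // overridden by the app), never from hard-coded values.
          p.fillRect(rect(), palette().color(QPalette::Highlight));
          p.setPen(palette().color(QPalette::HighlightedText));
          p.drawText(rect(), Qt::AlignCenter, QStringLiteral("42"));
      }
  };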

For applications using QML there is another relevant component: the Qt Quick Controls 2 style. For reasons I’m not going to go into, QtQuick Controls don’t use QStyle for their styling. Instead they come with their own styling system, which is itself based on QML. In Qt5, QML apps only have a very basic and broken default theme that should never be used. In Qt6 they use Fusion by default.
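
For comparison, a hedged sketch of the QML side: a Qt Quick application can pick its Controls style explicitly before any control is instantiated (the QML file path here is just a placeholder).

  #include <QGuiApplication>
  #include <QQmlApplicationEngine>
  #include <QQuickStyle>
  #include <QUrl>

  int main(int argc, char *argv[])
  {
      QGuiApplication app(argc, argv);
      // Must happen before the first Qt Quick Controls item is created.
      // "Fusion" is built in; on Plasma the qqc2-desktop-style described further
      // below is what typically gets used instead.
      QQuickStyle::setStyle(QStringLiteral("Fusion"));

      QQmlApplicationEngine engine;
      engine.load(QUrl(QStringLiteral("qrc:/main.qml"))); // placeholder path
      return app.exec();
  }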

These are the relevant knobs every Qt app has. Some app developers choose to use them to control the appearance of their apps themselves, but many others leave it to the environment to apply a suitable look and feel. Furthermore, there are some relevant KDE-additions to this that are important to understand.

One such addition is KColorScheme. You can think of KColorScheme as a superset of QPalette, i.e. it provides additional color roles and thus finer-grained control over colors. When changing the Colors setting in Plasma’s System Settings you are picking a color scheme. This gets applied to QPalette via the plasma-integration QPT, but can also be queried directly by the application developer for custom painting. Contrary to QPalette, a KColorScheme is not programmatically filled based on platform values (that happens only on Plasma), but is a static, textual list of colors. Here we have the first problem for running KDE applications under e.g. GNOME. When running a KDE app on GNOME, QGnomePlatform will apply Adwaita colors using QPalette. However, this does not affect colors the application directly pulls from KColorScheme, which unless explicitly configured has a default that resembles Breeze. This means we get mixtures of two different color sets, giving unpleasant results. This is especially noticeable when using a dark system theme combined with the light default colors from KColorScheme.
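
To make the mismatch easier to picture, here is a minimal sketch of what “pulling colors directly from KColorScheme” can look like; these calls bypass QPalette entirely, so under QGnomePlatform they return Breeze-like defaults while the rest of the UI uses Adwaita colors.

  #include <KColorScheme>
  #include <QColor>
  #include <QPalette>

  // Example: fetch a color for an error indicator. On Plasma this reflects the
  // configured color scheme; elsewhere it falls back to a Breeze-like default,
  // regardless of what the QPalette currently says.
  QColor errorTextColor()
  {
      const KColorScheme scheme(QPalette::Active, KColorScheme::View);
      return scheme.foreground(KColorScheme::NegativeText).color();
  }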

How do we solve this? Well, I’ve been banging my head against that problem for a while. Short of removing the concept of KColorScheme entirely I see two realistic options, not necessarily mutually exclusive. QGnomePlatform could create a KColorScheme definition with Adwaita-like colors and apply that to the application. If executed correctly it would likely give very good results, but obviously only on platforms that use QGnomePlatform. The other option would be to programmatically derive a KColorScheme definition from a QPalette, which is likely much harder because KColorScheme is a superset of QPalette, but it would be a generic solution for all platforms.

The second noteworthy thing for KDE applications affects QML apps in particular. I’ve mentioned that QML has a separate theming system compared to QtWidgets. Because maintaining two style definitions for different systems is no joy, KDE maintains a “hack” around this. qqc2-desktop-style implements a Qt Quick Controls style that fetches style information from a QStyle, which means all the existing QStyles out there keep working for QML apps. It works amazingly well, until it doesn’t. One of the shortcomings of this approach is that qqc2-desktop-style internally relies heavily on KColorScheme, which makes the aforementioned mismatch between QPalette and KColorScheme much more prominent. Possible solutions are the same as mentioned before.

I hope this gives some much needed overview of the technology and terminology of the involved components and helps with a productive way forward for addressing the problems we have. You are welcome to join this discussion. There are some other relevant things to talk about, like icon loading, theming, and rendering, but that’s for another day.

Wednesday, 8 March 2023

Screenshot of version 0.5.3 with gradients

We're excited to announce the release of Glaxnimate 0.5.3! This update includes a number of new features, bug fixes, and improvements to the user experience.

Editing

Glaxnimate 0.5.3 introduces several new editing features:

There's a new keyframe preset called "Fast" that basically has the reverse effect of "Ease".

Additionally, this release adds support for conical gradients.

Users can now select the first Bezier node from the node's context menu.

Version 0.5.2 introduced the ability to animate along a path; 0.5.3 makes this easier with a new context menu entry for position properties that shows a dialog to select a shape to follow.

User experience

In addition to editing improvements, Glaxnimate 0.5.3 includes several UI enhancements. We've made color values in tree views more visually appealing, and removed extra items from the "Move To" dialog.

Changing fill, stroke, and gradient properties now applies to the whole selection rather than the last selected object.

We've also revamped the context menu for properties, making them consistent between the timeline and the canvas.

Editing tools are more forgiving now when you release modifier keys before finishing the shape.

Users can now choose whether the timeline scrolls vertically or horizontally without modifiers, and we've added new layout presets to better accommodate a variety of screen sizes.

File Formats

This release also includes updates to Glaxnimate's import and export functionality.

The main new feature here is the ability to import and export Android Vector Drawables, the animation format used when making animated icons for Android.

We've also fixed various issues with the SVG parser, improving support for animated paths.

Opening raster images now uses the file basename as layer name, and we've resolved a bug affecting plugin export.

Bug Fixes

Finally, Glaxnimate 0.5.3 includes a number of bug fixes. We've addressed an issue with layers created by drawing tools not having an end frame, and fixed several other small bugs affecting the user experience.

We encourage all users to upgrade to Glaxnimate 0.5.3 to take advantage of these new features and improvements. You can download the latest version of Glaxnimate from the download page.

Tuesday, 7 March 2023

For many years I have provided up-to-date builds of KDE/Plasma for Debian stable, testing, and unstable. It is now more than a year since I stopped using Debian. Time to send this off.

As already mentioned in some comments to various blog posts here, I will not invest more work into the current repositories. I invite anyone with interest in continuing the work to contact me. I will also write up a short howto guide on what I generally did and how I worked with this amount of packages.

I feel sad about leaving this behind, but also relieved from the amount of work, not to speak of the insults (“You are a Nazi” etc.) I often got from the Debian side. I also feel sorry for all of you who have relied on these packages for a long time and have given valuable feedback and helpful comments.

It was a nice and long run.

So long, and thanks for all the fish.

Monday, 6 March 2023

Precisely one month ago I joined KDE e.V., the non-profit organization behind KDE, as Software Platform Engineer. This is part of three positions in KDE’s “Make a living” initiative.

The exact scope of this position is a bit vague. I like to describe it as “Taking care of everything needed so that people can build and enjoy awesome software”. A large part of that is taking care of foundational libraries such as Qt and KDE Frameworks, but it can be really anything that helps people do awesome things. This is pretty much what I’ve been doing as a volunteer for the last couple of years anyway.

So what have I been up to this past month? A lot, but also not a lot that’s worth mentioning individually right now. As you probably know we are heading full steam towards using Qt6 for all our products. This is something that started almost four years ago (and I’ve been involved from the start) and is getting ever closer to being finished. Last week we switched Plasma master to use Qt6 exclusively, completing an important milestone for the whole transition. This involved a ton of small to medium-sized changes and fixes across the stack.

Instead of listing all the changes I have done as part of that let’s focus on the outcome instead: I’m typing this post running on a full Plasma session running against Qt6. There are still some rough edges, but overall it’s quite usable already. Definitely enough to get involved and hack on it. I’d show you a screenshot, but that would be pretty boring, it looks exactly the same as before!

So what does the future hold? The transition towards Qt6/KF6 is going to stay my focus for a while, but once that settles down I’m eventually going to focus on other areas of our software platform. If you have ideas about what it would make sense for me to focus on, please get in touch.

This position is enabled and financed by KDE e.V. To allow me to keep this position in the long term (and perhaps even hire additional people), please consider donating to KDE e.V.

Sunday, 5 March 2023

I was first introduced to Linux when my outdated computer could not run Windows anymore. I had previously heard some information about Linux while surfing the internet, but my initial impression was that it was highly sophisticated and exclusively for programmers, or mainly hackers :). I finally started using Linux Mint, and was moved by how quick and configurable it was 🤩.

After a few distro-hops, I finally settled on KDE Neon, amazed by how easily customizable it was ❣️

Motivation for applying to SoK’23

I was very much content with KDE Plasma, but I always found some minor annoying bugs here and there. It was also around this time that I got started with programming and was looking for ways to get better at it, so I decided that I would contribute to KDE 😇. Like every other beginner, I didn’t know how to navigate KDE’s large repositories. I then learned about the Season of KDE program, which helps onboard new contributors to KDE, so I decided to apply.

My Project

My project for SoK’23 is improving the accessibility of Tokodon by writing Appium tests. My project involves two parts: the first one is making Tokodon work without an internet connection so that it is ready for tests, and the second part is writing GUI tests to improve accessibility. You can find my full project proposal here: SoK Proposal for Tokodon.

Work done so far🤗

Week 1-2:

In my first week, I researched how I would run Tokodon without network connectivity. I tried to reverse engineer the existing unit tests and created a new start file, offline-main.cpp. By the end of the second week, I could start Tokodon without network connectivity, albeit with some broken UI.

offline-tokodon

Week 3-4:

The next step was writing an Appium test for the search functionality. For this, I first fixed the broken search UI by reverse engineering the already written unit test for search, after which I wrote my first test covering the search GUI. The final result is shown in the gif below:

searchboxtest

Week 5-6:

In these weeks, with the help of the maintainers of Tokodon, I fixed the failing pipelines of tokodon-offline and also wrote another Appium test covering the different types of timeline statuses.

What I will be doing in the upcoming weeks

In the upcoming weeks I plan to add more Appium tests and fix broken UI elements in tokodon-offline.

Friday, 3 March 2023

Let’s go for my web review for the week 2023-09.


Godot 4.0 sets sail: All aboard for new horizons

Tags: tech, 3d, gaming, godot

This is a huge release. Lots of very strong and needed features to make it a competitive engine. Congrats!

https://godotengine.org/article/godot-4-0-sets-sail/


Nokia launches DIY repairable budget Android phone | Nokia | The Guardian

Tags: tech, mobile, nokia, ecology

Coming from Zombie Nokia, but still I think we need more options like this. It is the number one way to reduce the ecological footprint of computing.

https://www.theguardian.com/technology/2023/feb/25/nokia-launches-diy-repairable-budget-android-phone


OpenAI Is Now Everything It Promised Not to Be: Corporate, Closed-Source, and For-Profit

Tags: tech, ai, gpt, ethics, business

When they changed their statutes it was the first sign… now it’s clear all ethics went out the window. It’s about fueling the hype to drive money home.

https://www.vice.com/en/article/5d3naz/openai-is-now-everything-it-promised-not-to-be-corporate-closed-source-and-for-profit


Keep your AI claims in check | Federal Trade Commission

Tags: tech, ai, criticism

That’s a good set of questions to ask ourselves when in contact with a product claiming the use of “AI”.

https://www.ftc.gov/business-guidance/blog/2023/02/keep-your-ai-claims-check


New C++23 features I’m excited about - twdev.blog

Tags: tech, c++

This newer standard brings interesting features again. I’m especially interested in std::expected myself.

https://twdev.blog/2022/10/cpp23/


SymPy makes math fun again

Tags: tech, mathematics, python

This really looks like a nice library for symbolic maths. Keep in mind it’s Python-based, but it goes all the way to generating solutions to the given problem in various languages.

https://wordsandbuttons.online/sympy_makes_math_fun_again.html


Game Asset Storage, Loading, Compression and Caching | PH3 Blog

Tags: tech, gaming, compression, tests

Interesting new compression format around the corner. Might turn out useful in some cases. I could definitely have used it last year for a test harness with very large reference data (so no, not gaming).

https://ph3at.github.io/posts/Asset-Compression/


The Great Gaslighting of the JavaScript Era | The Spicy Web

Tags: tech, web, frontend, react, complexity

A bit of a rant, so brace yourselves. Still, it’s very much aligned with the current backlash against the “everything must be an SPA” trend and makes very good points on how it happened. This indeed turned into a popularity contest based on false premises. Meanwhile… complexity increased dramatically on the web frontend side and performance is bad for most users.

https://www.spicyweb.dev/the-great-gaslighting-of-the-js-age/


Visual design rules you can safely follow every time

Tags: tech, gui, design, ux

Very nice set of rules. They are very simple to apply individually. The art is in respecting them all, of course.

https://anthonyhobday.com/sideprojects/saferules/


Clever Code Considered Harmful

Tags: tech, complexity, maintenance, craftsmanship

Good musing about simple code and complexity. We definitely should avoid unwarranted complexity in our code, or at least try to prevent it from spreading.

https://www.joshwcomeau.com/career/clever-code-considered-harmful/


Stop saying “technical debt” - Stack Overflow Blog

Tags: tech, technical-debt, maintenance, craftsmanship

Definitely this. I think this could have been a good term, until it got used for everything under the sun. It’s about maintainability first, not just about what you like or not.

https://stackoverflow.blog/2023/02/27/stop-saying-technical-debt/


The lone developer problem

Tags: tech, programming, craftsmanship, team

Development is and has to be a team sport indeed.

https://evanhahn.com/the-lone-developer-problem/


A thorough team guide to RFCs. A reference guide to implement RFCs as… | by Juan Pablo Buriticá | Feb, 2023 | Juan’s And Zeroes

Tags: tech, decision-making, product-management

They’re definitely a powerful tool. I see them used in a few places but definitely not enough.

https://buriti.ca/a-thorough-team-guide-to-rfcs-8aa14f8e757c


6 qualities that make a great engineer | Inside Intercom

Tags: tech, quality, culture

A bit too much written in superlatives for my taste. Still, this is an interesting set of qualities indeed. Definitely things to aim for.

https://www.intercom.com/blog/traits-of-exceptional-engineers/


The Missing Semester of Your CS Education

Tags: tech, university, craftsmanship, tools

Having taught quite a bit at the university, and having interviewed quite a few junior developers… I have to agree that what’s proposed here is missing from most curricula. I wish this would be taught more systematically. If not, at least students everywhere should know this online course exists.

https://missing.csail.mit.edu/


Improving Students’ Learning With Effective Learning Techniques: Promising Directions From Cognitive and Educational Psychology - John Dunlosky, Katherine A. Rawson, Elizabeth J. Marsh, Mitchell J. Nathan, Daniel T. Willingham, 2013

Tags: teaching, learning

Very interesting but long state of the art and evaluation of learning techniques. This is definitely something students should look at to pick better techniques. The way I design my trainings and coaching sessions seems to be mostly aligned with the findings; they tend to foster the right learning techniques… Still, it’s up to the students to seize the opportunity instead of repeating the usual inefficient patterns.

https://journals.sagepub.com/stoken/rbtfl/Z10jaVH/60XQM/full



Bye for now!

Thursday, 2 March 2023

This Monday, I was in Brussels to attend a stakeholder workshop for the Digital Markets Act (DMA) organized by the European Commission. For those who don’t know what the DMA is, it’s a new law that the European Parliament voted on recently, and one of its goals is to force some interoperability between messaging services by allowing small players to communicate with users of the so-called gatekeepers (e.g., WhatsApp).

I attended this meeting as a representative of KDE and NeoChat. NeoChat is a client for the Matrix protocol (a decentralized and end-to-end encrypted chat protocol). I started developing it with Tobias Fella a few years ago during the covid lockdown.

I learned about this workshop thanks to NLnet, who funded previous work on NeoChat (end-to-end encryption). They put Tobias Fella and me in contact with Jean-Luc Dorel, the program officer for NGI0 at the European Commission. I would never have imagined sitting in a conference room in Brussels thanks to my contributions to open-source projects.

Visitor pass

Regarding the workshop itself, this was quite enlightening for me, as I am just a minor player there and only work on NeoChat and many other KDE applications in my free time as a volunteer. I expected a room full of lawyers and lobbyists, which was partially true. A considerable part of the room consisted of people who were silent during the entire workshop, representing big companies and mostly taking notes.

Fortunately, a few good folks with more technical knowledge were also in the room. With, for example, people from Element/Matrix.org, XMPP, OpenMLS, Open Source Initiative (OSI), NlNet, European Digital Rights (EDRi), and consumer protection associations.

Photo of the room

The workshop consisted of three panels. The first was more general, and the latter two more technical.

In the first panel, the topic was the scope, the trade-offs, and the potential challenges of Article 7 of the DMA. This panel was particularly well represented by a consumer protection organization, European Digital Rights, and a university professor, who were all in favor of the DMA and the interoperability part of it. Regarding the scope, one topic of discussion started by Simon Phipps was whether gatekeepers like Meta should be forced to also interoperate with small self-hosted XMPP or Matrix instances, or whether this would only be about relatively big players. Unfortunately, I also learned that, while it was once part of the draft of the DMA, social networks are not required to interoperate. If Elon had bought Twitter earlier, this would probably have been part of the final text too. From this panel, I particularly appreciated the remarks of Jan Penfrat from EDRi, who mentioned that this is not a technical or standardization problem and pointed out that some possible solutions like XMPP or Matrix have already existed for a long time. There were also some questions left unanswered, like how to force gatekeepers to cooperate, as some people in the audience feared that they would make it needlessly difficult to interoperate.

After this panel, we had a short lunch, and this was the occasion for me to connect a bit with the Matrix, XMPP and NLnet folks in the room.

The second panel was more technical and was about end-to-end encryption. This panel had people from both sides of the debate. Paul Rösler, a cryptography researcher, tried to explain how end-to-end encryption works for the non-technical people in the audience, which I think was done quite well. Next, we had Eric Rescorla, the CTO of Mozilla, who also gave some additional insight into end-to-end encryption.

Cisco was also there, and they presented their relative success integrating other platforms with Webex (e.g., Teams and Slack), with, for example, the possibility of creating Webex calls from Slack. This ‘interoperability’ between big players is definitively different from the direction of interoperability I want to see. But it is also a good example showing that when two big corporations want to integrate with each other, there are suddenly no technical difficulties anymore. They are also working on a new messaging standard (which reminds me a bit of xkcd 927) as part of the MIMI working group of the IETF, and they have already deployed it in production.

Afterward, it was the turn of Matrix and Matthew Hodgson, the CEO/CTO at Element. Matthew showed a live demo of client-side bridging. This is their proposed solution for bridging end-to-end-encrypted messages across protocols without having to decrypt the content on a third-party server. This would be a temporary solution; ideally, services would converge on an open standard protocol like Matrix, XMPP, or something new. He pointed out that Apple is already doing this with iMessage and SMS. I found this particularly clever.

Last, Meta sent a lawyer to represent them. The lawyer read from a piece of paper in a very blank tone. He spent the entirety of his allocated time telling the Commission that interoperability represents a very clear risk for their users, who trust Meta to keep their data safe and end-to-end encrypted. He ignored Matthew’s previous demo and told us that bridging would break their encryption. He also envisioned a clear opt-in policy for interoperability, so that users are aware that this will weaken their security, and expressed a clear need for consent popups when interacting with users of other networks. This is quite ironic coming from Meta, which in the context of the GDPR and data protection was arguing against an opt-in policy and against consent. And as someone pointed out in the audience, while WhatsApp is end-to-end encrypted, this isn’t the case for Messenger and Instagram conversations, which are both also Meta products. The lawyer quickly dismissed that and explained that he only represented WhatsApp here and couldn’t answer this question for other Meta products. As you might have guessed, the audience wasn’t convinced by these arguments. Still, something to note is that Meta at least had the courage to put something in front of the audience, unlike other big gatekeepers like Microsoft, Apple and Google, who were also in the room but didn’t participate at all in the debate.

Finally, after another small coffee break, the last panel was about abuse prevention, identity management, and discovery. With Meta on the panel again, consent was again a hot subject of discussion. Some argued that each time someone from another server joins a room, each user should consent to this new server being able to read their messages. This sounds very impractical to me, but I guess the goal is to make interoperability impractical. It also reminds me very much of the GDPR popups, which privacy-invading services try to optimize using dark patterns so that users click on the “Allow” button; just that in this case, the aim would be to convince the user to click on the “Don’t connect with this user coming from this untrusted and scary third-party server” button.

Slides about efficient design for effective interoperability

There was also some discussion about whether it is the server’s role to decide whether to allow connections from a third-party server, or the user’s. The former would mean that big providers would only allow access to their service for other big providers and block access to small self-hosted instances, while the latter would give users a choice. Another topic was identifiers: as someone from the audience pointed out, the phone numbers used by WhatsApp, Signal and Telegram are currently not perfect as identifiers, since they are not unique across services and might require some standardization.

In the end, the European Commission tried to summarize all the information they got today and sounded quite happy that so many technical folks were in the room and active in the conversation.

And finally, after the last panel, I went to a bar next to the conference building with a few people from XMPP, EDRi, NLnet and OpenMLS to get beers and Belgian fries.

Me outside of the building

Since I am planning to go on vacation very soon (actually, just idling away the time until I head for the airport…), I think it is a good time to wrap up the recent changes in the KDE Yocto area.

As you might have noticed, four weeks ago there finally was an in-person FOSDEM again. It was a great event, many interesting people were there, and I had really good chats. One of the main topics for me was discussing the next steps for our Yocto efforts, in particular as Volker and Hannah were also around.

Our current state is as follows:

  1. meta-kf5 is updated to the latest KF5 release, 5.103.0. We plan to update that branch further as more KF5 releases appear. But it will stick with KF5, and KF6 is packaged separately (see below).
  2. meta-kde also got an update, and a much bigger one than meta-kf5: here, you will now find KDE Plasma 5.27.0 as well as support for Yocto Langdale and all needed backports in the Wayland area to make it work with Kirkstone. I expect this layer to switch to Qt6/KF6 support at some point, and at the latest once I am back, a kf6-staging branch will appear.
  3. meta-kf6 is probably the most interesting point on the list. Following the Yocto project’s approach of having one repository per layer, there now is a new meta-kf6 repository. In this repository we use branch names that follow the supported Yocto releases, to make it easier for device creators to pick compatible versions.
    This meta-kf6 repository now has an initial set of Git master hashes that are known to build against Langdale, and scripting is prepared so that we can do semi-regular updates of the hashes as long as no official KF6 releases exist. Stressing the meaning of “Git master hashes”: please do not in any way expect these to be stable 😉 The only safeguard at the moment before updating the hashes is that we check that everything is building on Yocto Langdale. Currently, the main purpose is to catch packaging regressions early and support KF6 development with build fixes for more exotic setups.

Talking about “exotic setups”, I was very glad about the interest in our Plasma Bigscreen @ RISC-V/VisionFive2 demo at FOSDEM. All the WIP changes which were needed for this demo have finally landed in the KDE Yocto layers. There is only one remaining (very big) MR pending for meta-riscv which needs to get merged. The overall state of the board image, though, is quite basic and there are a lot of things still to do (if you know Yocto and have such a board, help is welcome 🙂 ). For example, RAM size is reported incorrectly by the kernel, the screen looks too pink, and GStreamer and KWin do not yet want to work together with the GStreamer-OMX backend from StarFive…
Recreating my setup should be fairly simple: just use this manifest file, add meta-riscv with the branch of the above-mentioned MR, and do “MACHINE=visionfive2 . ./setup-environment && bitbake kde-demo-image-bigscreen”.