July 26, 2017

Wednesday is the third and, for many people, the last day of BoFs, as people start to head off home. However, hacking and some smaller meetings will happen tomorrow among those still here.


July 25, 2017

Welcome to the unboxing of the 13-inch Slimbook Pro ultrabook with KDE Neon, my latest acquisition. A laptop I think I will keep for many years, since its finish is perfect, or almost perfect. I have only had it for a couple of days and the feel of it is extraordinary.

Unboxing of the 13-inch Slimbook Pro ultrabook with KDE Neon

From Akademy-es and Akademy 2017 I took home some extraordinary memories and the realization that KDE Blog is going to become a bit more international; I will go into the details in future posts. But besides these intangible things, I also took home (bought, naturally) an ultrabook from the Slimbook brand.

Specifically, I acquired a 13-inch Slimbook Pro ultrabook, a beautiful and sturdy machine that I hope to get as much or more out of than my Slimbook Classic (unless the folks at Slimbook release something better and make my mouth water).

For now, and I think for a long time to come, the chosen operating system is KDE Neon, since it brings me the KDE Community's latest developments in record time. So my laptop is already running a Plasma 5.10.4 desktop, the most advanced one available.

Unboxing of the 13-inch Slimbook Pro ultrabook with KDE Neon

So, just as I did a couple of years ago, I decided to do an unboxing in front of a camera to promote the brand (you already know I firmly believe in their project) and show you its features first-hand.

It is too early for a detailed review of my experience with it, but for now I can tell you that this model improves on the previous one in every way: keyboard, touchpad, screen, graphics, webcam, connectivity, etc. It must also be said that it is a bit more expensive.

Only one aspect has gotten worse, for reasons of internal space, since the machine offers the option of fitting two hard drives: the battery. However, for now this does not worry me, since over the past 2 years the number of power outlets in my work areas has increased and I almost always have one available.


By the way, in case anyone wants to know without visiting the KDE Blog YouTube channel, the basic specs of my little marvel are the following:

  • Intel i5-7200U processor, 2.5 GHz (Turbo Boost 3.1 GHz), 2 cores / 4 threads, 3 MB cache
  • Intel HD Graphics 620 graphics card
  • 8 GB of DDR4 RAM
  • 250 GB solid-state drive
  • 13.3″ FullHD 1920×1080 screen
  • Synaptics touchpad
  • 2 USB 3.0 ports
  • 1 USB Type-C port
  • 1 HDMI port
  • LED backlit keyboard
  • 3100 mAh battery
  • 720p front camera
  • Bluetooth 4.0
  • SD and MMC card reader
  • Aluminium chassis

The second day of Akademy BoFs, group sessions and hacking has just finished. There is a wrapup session at the end so that what happened in the different rooms can be shared with everyone including those not present.


Hello!

Sidenote: I'm working on Go language support in KDevelop. KDevelop is a cross-platform IDE with awesome plugin support and the possibility to implement support for various build systems and languages. Go is a cross-platform, open-source, statically-typed compiled language which aims to be simple and readable, and mainly targets console apps and network services.

During the last week I worked on fixing some bugs:

  • Wrong struct declaration color - the problem was a misuse of the declaration's isTypeAlias property (setting it to true) and not inheriting StructureType in the class used for handling structs;
  • A bug which caused syntax highlighting to sometimes disappear after a restart;
  • Some failures related to retrieving info from method declarations.
Also, I worked on adding support for formatting using "go fmt" - the Go language's standard tool for formatting code. After implementing this I noticed that source highlighting disappeared every time I applied formatting - this was the bug I mentioned before. Without formatting I could rarely reproduce that bug, but given the ability to easily reproduce the situation I was able to fix it.
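
For reference, this is roughly how the tool is invoked on the command line (the file name is a placeholder; gofmt rewrites files in place with -w):

    # format a single file in place
    gofmt -w main.go
    # or format a whole package tree
    go fmt ./...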

After that I worked on linting support via Go Meta Linter - a tool which runs various linters / analyzers / normalizers and aggregates their output. Using that tool means not having to worry about the different output formats of the different analyzers, and makes it easy to call them concurrently. Sadly, it has its own downsides - it isn't yet able to detect all duplicates among errors/warnings - but it tries to do so (see the first error on the screenshot; the list of linters that detected that problem is in braces).
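
If you want to try the tool yourself, this is roughly how it was installed and run at the time (the ./... pattern lints the package and everything below it):

    go get -u github.com/alecthomas/gometalinter
    gometalinter --install
    gometalinter ./...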

Go Meta Linter output
The screenshot shows some examples of Go Meta Linter output - an error regarding an unused variable (yep, that's an error in the Go world) and a warning about a missing documentation comment on a Test function which is exported to the outer scope.

P.S. Also, now it's possible to build kdev-go using clang. :)

Looking forward to next week!

Here at Akademy in Almería, we have moved from the conference portion — two days of talks at the University — to the hacking week portion. The conference days were very busy; most of the talks were recorded and those recordings will be up when they’re done post-processing. I haven’t heard a date for that yet.

Cake is a theme. I don’t know who bakes them or where they come from, but the shout on IRC that there is fresh-baked, still-warm cake and coffee in the hall near the team room is an event to drop everything for.

I missed BoFs yesterday that I wanted to go to — I guess I was wrapped up in hacking and talking to people outside of the scheduled activities. Today I spotted a Calamares BoF — not one I scheduled, mind, but apparently people think I should be doing more work.

At the end of each hack-week day, there is a BoF wrap-up for those attendees who could not be everywhere at once, and also for people outside of Akademy who want to know what has been worked on. Yesterday’s wrap-up was lots of “we discussed this-and-that”; I imagine today is going to start moving into “we built something”. So, for everyone at Akademy: be in the big hall downstairs at 18:00; for others, catch the video later (yay, shameless self-promotion!)

Sebastian Kügler writes on his blog about Plasma's vision statement, which names durability, usability and elegance as its cornerstones.

Riddell: “Hey, Ade, don’t touch your quassel for a minute, ok? Right, switch to a different channel.” Me: “Sure” Riddell: “OK, see the Quassel notification icon?” Me: “Yeah, it’s throbbing.”

He walks over, checks that the notification icon is throbbing in the systray, and that I’ve got a konsole window at hand. “So, run top.” I switch windows, type “top” and hit enter.

Silence. “Um, what operating system is this?”

So Plasma 5 on FreeBSD looks sufficiently indistinguishable from KDE Neon that it can fool even the connoisseur. But top(1) is different enough. This makes me really happy, since it shows that packaging vanilla upstream KDE software is the right thing to do for FreeBSD. Even better, the KDE Neon bug Riddell was trying to illustrate to me is also present on FreeBSD with Plasma 5 on Intel graphics (although I use the scfb driver for now). Achieving bug parity is quite a milestone.

Plasma — Durable, Usable, Elegant.
Over the past weeks, we (KDE’s Plasma team) have distilled the reasons why we do what we do, and what we want to achieve into a vision statement. In this article, I’d like to present Plasma’s vision and explain a bit what is behind it. Let’s start with the statement itself, though:

Plasma is a cross-device work environment by the KDE Community where trust is put on the user’s capacity to best define her own workflow and preferences.

Plasma is simple by default, a clean work area for real-world usage which intends to stay out of your way.
Plasma is powerful when needed, enabling the user to create the workflow that makes her more effective to complete her tasks.

Plasma never dictates the user’s needs, it only strives to solve them. Plasma never defines what the user is allowed to do, it only ensures that she can.

Our motivation is to enable actual work to happen, across devices, across different platforms, using any application needed.

We build to be durable, we create to be usable, we design to be elegant.

I’ve marked a few bits which are especially important in a bold font, let’s get into a bit more detail:

Cross-device — Plasma is a work environment for different classes of devices; it adapts to the form factor and offers a user interface suitable for the device’s characteristics (input methods such as touchscreen, mouse, keyboard) and constraints (screen size, memory and CPU capabilities, etc.).

Define the workflow — Plasma is a flexible tool that can be set up as the user wishes and needs to make her more effective, to get the job done. Plasma is not a purpose in itself, it rather enables and gets out of the way. This isn’t to say that we’ll introduce every single option one can think of, but we strive to serve many users’ use cases.

Simple by default means that Plasma is self-explanatory to new users, and that it offers a clean and sober interface in its default state. We don’t want to overwhelm the user, but present a serene and friendly environment.

Powerful when needed, on the other hand, means that under the hood, Plasma offers tremendous power that allows almost any job to get done efficiently and without flailing.

We build to be durable, we create to be usable, we design to be elegant. — The end result is a stable workspace that the user can trust, that is friendly and easy to use, and that is beautiful and elegant in how it works.

Qt now provides a new module named Qt 3D. In this very short talk Giuseppe D’Angelo introduces some of the design ideas behind Qt 3D, discusses its use cases, and shows how simple it is to get 3D content into an application when using the Qt 3D APIs.…

The post Qt 3D Short Presentation appeared first on KDAB.

Introduction

This article is part of a blog series about Clang Tidy. In the previous article we learned about the general usage of Clang Tidy to automatically refactor source code for projects using the CMake build system. In this particular episode we’ll discuss using Clang Tooling on projects using different build systems with the help of Bear.

Motivation: So you want to use Clang Tooling on your project — but what if your particular project of interest is using …

The post Clang Tidy, part 2: Integrate qmake and other build systems using Bear appeared first on KDAB.

This month I have been working on fixing bugs found by static code analyzers in KStars. All of the tools are open-source, or can be used for free by open-source projects:

- cppcheck: C++ source code analyzer.
- Clazy: Qt-oriented static code analyzer by KDE for C++, based on Clang.
- Clang Static Analyzer: a Clang-based static code analyzer for C++.
- Krazy: Code analyzer by KDE.
- Coverity: a commercial C++ static code analyzer from Synopsys, but free for open-source projects.

If you want to give these analyzers a try, you can pick up the build scripts from the tools directory of the KStars Git repository and use them in your own project, following the wiki page linked at the bottom.

Meanwhile I keep fixing memory-handling bugs that I find when I test KStars built with runtime sanitizers, and adding C++ smart pointers in certain places to make pointer handling safer. I also try to minimize unnecessary includes across the codebase to cut the build time, although sometimes I break the Jenkins CI build for FreeBSD or some other esoteric platform. Oops. :) But I fix those problems as soon as possible.

More details are on our wiki page:
https://techbase.kde.org/Projects/Edu/KStars/C%2B%2B_developer_tools_with_KStars_on_Linux



Here comes another KStars release: v2.8.0 for Windows, MacOS, and Linux.

This is a minor bugfix release to increase the stability of KStars on all supported platforms. Nevertheless, there were a few significant updates:

 

Also, thanks to the work of VictorHck (don't miss his blog), the podcast is now available on archive.org.

 

I hope you liked it. If so, you know the drill: thumbs up, share, and don't forget to visit and subscribe to the KDE España YouTube channel.

As always, we look forward to your comments, which I assure you are very valuable to the developers, even when they are constructive criticism (the other kind is never good for anyone). We would also like to know which topics you would like us to discuss in upcoming podcasts.

I'll take this opportunity to invite you to subscribe to the KDE España podcast channel on Ivoox, which will soon be up to date.

July 24, 2017


Antonio Larrosa advocates building
intracommunity relationships.

Sunday was a busy day and the talks were as varied as the speakers.

Antonio Larrosa kicked off the morning with his talk on The KDE Community and its Ecosystem. He expressed concern about what he perceived as an increase in the isolation of certain communities and laid out the advantages of working on intra-community relationships.

Later in the day, Kevin Ottens gave his audience a taste of what Qt's 3D API can do in his talk Advances in Qt 3D. More and more applications rely on 3D every day, especially with the increase in popularity of virtual reality. Ottens introduced the tools available to Qt developers looking to include 3D in their programs, and even treated attendees to a preview of a feature that is still in the works and helps manage shader code.

Dan Leinir Turthra Jensen gave a very entertaining talk on Supporting Content Creators or Satisfying Your Inner Capitalist. Leinir laid out ways app developers could make enough money to be able to sometimes eat, while at the same time still feed their craving for developing cool stuff under free licenses.

Jonathan Riddell gave a demonstration of what Neonception would be like by running Neon inside Neon using Docker images. The point being that, apart from looking cool, since Neon comes in various experimental flavours, developers can run unstable versions in a container without endangering their main setup.

Jonathan gives a good explanation himself in the following video:

Ivan Čukić rounded off the day, invoking the Cthulhu of programming languages. In his talk C++17 and 20, he reviewed some of the more interesting features included in C++17, as well as those planned for C++20. Although some attendees would probably prefer C++ lie dreaming a few aeons more, new things like ranges, concepts, and coroutines may just win some developers over.

About Akademy

For most of the year, KDE—one of the largest free and open software communities in the world—works on-line by email, IRC, forums and mailing lists. Akademy provides all KDE contributors the opportunity to meet in person to foster social bonds, work on concrete technology issues, consider new ideas, and reinforce the innovative, dynamic culture of KDE. Akademy brings together artists, designers, developers, translators, users, writers, sponsors and many other types of KDE contributors to celebrate the achievements of the past year and help determine the direction for the next year. Hands-on sessions offer the opportunity for intense work bringing those plans to reality. The KDE Community welcomes companies building on KDE technology, and those that are looking for opportunities. Join us by registering for the 2017 edition of Akademy today.

For more information, please contact the Akademy Team.

Akademy has had its first full day of BoFs, group sessions discussing our plans for the next year. The wrapup session has just finished so watch the video to find out about what Plasma devs are working on, what tutorials happened and how we avoided a fist fight to the finish.

Yesterday I picked up my new KDE Slimbook. It comes with KDE Neon pre-installed. Of course it also works well with openSUSE, and Manjaro, and Netrunner Linux (for some of these I’ve at least booted the Live CD). But for me, “will it run FreeBSD” is actually the most important bit.

Yes. Yes it does, and it does so beautifully.

Photo of two laptops

That is at least one advantage of choosing a Free Software friendly laptop, one designed for GNU/Linux: it is likely to be supported by many more operating systems than you might expect. No, I have not tried OpenSolaris / Illumos on it .. there’s really no desktop distro in that corner anymore.

So, here’s how to break (er, upgrade) your Slimbook to FreeBSD (no warranty implied):

  • Resize the installed partition to make space for FreeBSD. I chopped 40GB off the end of the main Linux partition. This may break various cryptsetup things, so be careful. For resizing, I actually used the Manjaro installer, told it to resize an existing partition, then cancelled the install during unsquash and deleted the partition it had made. You can probably use resize2fs and gparted to good effect, too (a rough sketch follows).
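
    If you go the resize2fs route instead, the sequence looks roughly like this (device name and target size are placeholders; shrink the filesystem from a live system first, then shrink the partition in gparted to match):

    e2fsck -f /dev/sda2
    resize2fs /dev/sda2 80G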
  • Install FreeBSD 12-CURRENT. I went for UFS on a single partition, no swap: that’s the easiest to get right, and avoids weirdness like zpools on a partition. It ended up in ada0p4, or sda4, or (hd0, gpt4) depending on what nomenclature you use for naming disks.
  • Oh, yeah .. don’t install a boot manager for FreeBSD. We’ll let the existing GRUB deal with it.
  • Reboot and let GRUB start Linux again. We’ll configure GRUB to (also) start FreeBSD. Add an OS entry for FreeBSD in /etc/grub.d/40_custom:


    menuentry "FreeBSD" --class freebsd --class bsd --class os {
    insmod ufs2
    insmod bsd
    set root=(hd0,gpt4)
    chainloader /boot/boot1.efi
    }

    Also recommended: set timeouts so you can actually pick an OS, in /etc/grub.d/custom.cfg:


    set timeout=5
    set timeout_style=menu

  • Reboot, hit escape at the right moment to get the GRUB menu, and choose FreeBSD. Boot into FreeBSD.
  • Configure wireless networking on FreeBSD for the Slimbook. I have one with an Intel 7265 wireless card, so I needed to set that up. Since the firmware comes from the filesystem, I ended up following the quick-start guide and futzing with rc.local to load the driver. Here’s my rc.local:


    #! /bin/sh
    /sbin/kldload if_iwm
    /sbin/ifconfig wlan0 create wlandev iwm0
    :

    and this is a bit of my rc.conf:


    # Wireless
    wlans_iwm0="wlan0"
    ifconfig_wlan0="WPA SYNCDHCP"

    Bear in mind that NetworkManager and other fancy bits are not available: configure wpa_supplicant.conf by hand and add SSIDs and PSKs there.
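
    For example, a minimal /etc/wpa_supplicant.conf entry for a WPA2-PSK network looks something like this (SSID and passphrase are placeholders):

    network={
        ssid="myssid"
        psk="mypassphrase"
    }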

  • Right now, Intel IGP after Broadwell (and the Slimbook is a Skylake) isn’t fully supported by the xf86-video-intel driver, so instead use scfb. This loses acceleration and some other features, but it gives you X11 right now, as opposed to sometime later when the newer drivers are merged.

    pkg install xf86-video-scfb

    Add some explicit, manual, X.org configuration:


    Section "Device"
    Identifier "Card0"
    Driver "scfb"
    EndSection

  • After that, follow my earlier Plasma 5 on FreeBSD HOWTO, including adding the Area51 repo. However, since this is 12-CURRENT, you need to use a different pkg repository URL.

This concludes my laptop-futzing-about at Akademy this year: I have a laptop that dual-boots Linux and FreeBSD, and gives me an up-to-date Plasma 5 Desktop and KDE Applications on both — but that leaves me free to hack on whatever my work requires in the OS best suited to it each day of the week.

Every year at Akademy we celebrate some of our hardest working achievers in the community. The prizes are selected and awarded by the previous year's winners. The winners this year are:


Kai-Uwe

Application Award

Kai Uwe Broulik for their valuable work on Plasma.




Cornelius Schumacher

Non-Application Contribution Award

Cornelius Schumacher for their long term contributions to KDE.



 
Olaf and Martin

Jury Award

Martin Konold & Olaf Schmidt-Wischhöfer for their work on the KDE Free Qt Foundation.
Honourable mentions also go to Lars Knoll and Tuukka Turunen for their work on the Qt side of the foundation which controls Qt's licensing.




Thanking the Akademy organisers

The organising team were also given a certificate and thanked for their hard work. The organisation has been led by Kenny Duffus who has helped for over a decade making the events appear to run smoothly. The local team have worked hard this year led by Rubén Gómez and Ismael Olea to bring us a fabulous event.


Lukas Hetzenecker

Akademy continues for another four days with meetings, workshops and hacking from teams within KDE to discuss their work over the forthcoming year.

One final announcement was made at the end of the talks. The location for next year's Akademy was announced to be in Vienna, Lukas Hetzenecker introduced what he assured us was a beautiful and welcoming city.


Books, badges and new functionalities are coming!

Hey WikiToLearn-ers! It has been a while, but now “What’s going on?” is back.

This last period was really difficult for all of us, but we never forgot about WikiToLearn. In spite of personal commitments and exams, the members of our community kept working day after day, and now we see the results!

Electromagnetism is now ready!

In the first episode of “Meet the authors” we got to know Dan, a long-term contributor of ours. It is now a pleasure to announce that his book about electromagnetism is ready! Dan is one of the most active editors in our community, and on our website he has written several books on the Italian portal: analysis, physics I, mechanics and electromagnetism! Kudos Daniele for your strong dedication to the project and for your contributions!

We have badges!

The tech team never stops, the platform is always at its best and our sysadmins work hard to provide you with great services. When we had problems, we were able to solve them very quickly and get back on our feet! In this period our devs worked on a long-requested feature for our website: badges! Yeah, now WikiToLearn has badges! Since our project was born, we have been asked for a way to certify an imported or reviewed book and to give its author/reviewer the proper credit. We are very happy to announce that this feature is now ready! When you donate a book to be imported, a proper tag will be associated with it on the website.

Professors have always asked us for a tag to certify their books' authority. To accommodate this request we now have the “Reviewed” tag. Are you the author of a specific book, and would you like it to be certified? Now with WikiToLearn you can! One step further to guarantee high-quality content and to monitor revisions of our dynamic textbooks!

Let’s celebrate!

We never forget where we belong: that's why we remind you that Akademy 2017 is taking place this week in Almería! Akademy is the annual conference of the KDE community. WikiToLearn was born under the KDE umbrella, and still today we feel part of the KDE family. This year WikiToLearn is represented at Akademy by Vasudha, a GSoC student of ours working on Ruqola. Today we remember Akademy with extreme pleasure: two years ago, during Akademy 2015, our project was officially born!

Since the first official announcement to the public, so much has happened. We can consider ourselves satisfied with the hard work we did and with the initial outreach we had. People's feedback pushed us to work better and better, to improve functionality and to satisfy users' needs. In this period we had the occasion to spread the word about WikiToLearn, and we obtained substantial involvement in our projects. We are extremely grateful to all the members of our community for the events organized, the books donated for importing, and the reviews and creation of new material on the platform.

For us, now it's time to work even harder. During these two years we came to realize what we were doing properly and what has to be modified. In the next few months we will be working on communication and style improvements to enlarge our community.

Content creation and usability are the two main issues we are facing right now. Any kind of user should be aware of what the platform offers and should be encouraged to write on it. WikiToLearn collects dynamic textbooks, and going forward this innovative feature of our product has to become our strong point!

 

We are working for you, WikiToLearn-ers! Stay tuned, spread the word, join.wikitolearn.org and start sharing your knowledge in a completely innovative way!

 

The article Wiki, what’s going on? (Part 24 - Badges and books) appeared first on Blogs from WikiToLearn.

In the previous episode we presented how to uncover 32 Qt best practices at compile time with clazy. Today it’s time to show 5 more and other new goodies present in the freshly released clazy v1.2.

New checks

1. connect-not-normalized

Warns when the content of SIGNAL(), SLOT(), Q_ARG() and Q_RETURN_ARG() is not normalized. Using normalized signatures allows you to avoid unneeded memory allocations.

Example:

    // warning: Signature is not normalized. Use void mySlot(int) instead of void mySlot(const int)
    connect(sender, SIGNAL(mySignal(int)), receiver, SLOT(mySlot(const int)));

The post clazy 1.2 released appeared first on KDAB.

Hello, this is the report for the second phase of GSoC. The last month was not easy. Some things had to be rewritten because they were not very well written. For example, I wrote a system of “sensors” whose logic was placed in the destructors of objects....

Welcome to the evening chronicle of the second day of Akademy-es 2017 in Almería, the second part of a slightly strange day for me, since I missed the morning session. The reason is that I was at the KDE e.V. General Assembly, of which I have recently become a member. In the afternoon I was more active at Akademy-es, although not as much as on the first afternoon; it showed on social media. By the way, yesterday (Sunday) was so intense that it was impossible to write a proper post and I could barely publish an image. A thousand apologies.

Evening chronicle of the second day of Akademy-es 2017 in Almería

  • Translating for KDE – Adrián Chaves, translator of KDE into Galician

The day began with the talk moved over from the morning, in which Adrián explained the basics of getting started with translation, one of the most important tasks within the KDE ecosystem.

Let's not forget that KDE is translated into more than 70 languages, one of the things that sets Konqi's system apart from its competitors.

 

  • Lightning Talks

Due to time constraints these talks were not promoted, so we did not get the chance to see those small projects.

  • How to start programming with Qt – Jesús Fernández, developer at The Qt Company

One of the talks the programmers enjoyed most was this one by Jesús, who in a relaxed and almost casual way explained some details for those who want to get into the world of Qt programming.

  • Break and group photo

  • 5 wonders of Plasma 5 – Baltasar Ortega, editor of KDE Blog

It was my turn to give my second talk, a demo talk explaining just 5 things that amaze me about Plasma 5 and its KDE applications: the desktop, System Settings, Dolphin and Gwenview… and the fifth one I am keeping for when I upload the slides.

Unfortunately, this talk could not be recorded, so I apologize in advance.

  • How to test applications without having to change distribution – Aleix Pol, Vice President of KDE e.V.

To conclude the series of heavyweight talks, Aleix appeared in front of the projector to explain the path that led to the creation of Flatpak, and aspects of its future. A technology that will surely keep people talking in the times to come.

  • What awaits us… Akademy 2017 – Rubén Gómez, Akademy organizer, and Closing ceremony – Antonio Larrosa, President of KDE España

I'll cover these two in one, since there is little to tell. In the first, Rubén explained what awaits us during the almost week-long Akademy: two days of talks and 4 of hacking, not forgetting the social part that starts that very night.

For his part, Antonio thanked the event's organizers and sponsors for everything they did to make Akademy-es possible, and invited the attendees to Akademy and to come back to join us for the 2018 edition.

See you at Akademy-es 2018!

 

Remember: use the #akademyes hashtag when you talk about our event on Twitter or Facebook.

It has been more than a month since I last blogged. This is not really a good situation. Nevertheless, for the remaining period of GSoC I will try to be more regular.

In this post I intend to report the whereabouts of my project. First of all, my not posting any updates was due to two problems that showed up when I was two weeks into the coding period. One, which I had anticipated, was to decide from where to show a warning dialog during the brief period of time when privileges are elevated. The problem was that showing the prompt from KIO::Slave resulted in repetition, and to show it from KIO::JobUiDelegate the permissions of the destination folder were needed beforehand, which required additional computation. So I decided to add a signal in KIO::Slave and all the necessary code for the additional prompts in KIO::Job. This way the KIO slave emits the signal whenever it encounters an ACCESS DENIED error, and then the job decides whether or not to show the prompt. The other problem was to figure out how to modify files created by a privileged process from an underprivileged one. By the way, the latter was completely uncalled-for and it took me around two weeks to decide on a solution. To send data between processes I tried every possible IPC mechanism involving shared memory, pipes and sockets. At last I decided on sharing a file descriptor between the privileged and under-privileged process, and to accomplish that I used Unix local domain sockets.

Now fast-forward to today: I have found solutions to my problems and I can say I have finally made some real progress. To be precise, in the current state of the project Dolphin's context menu is very much usable inside a read-only folder. Most of the actions in Dolphin's context menu - which include copy/cut/paste, rename, creating a new file, creating a new folder, creating links, trash and delete - are working without any issues whatsoever. The actions that are not working are creating desktop files, creating a Link to Application (technically both involve creating desktop files, but they have two separate menu options), compressing files, drag and drop, undoing, and renaming multiple files in one go. The first two need changes in KIO::KPropertiesDialog in order to work. For compressing to work, changes have to be made on Ark's side. DnD requires changes in KIO::DropJob. Undoing and renaming many files somewhat challenge my current solution, but I have explained their case and a possible workaround at the end of this post.

I have created a Phabricator task in which I have listed all the related revisions and also tried to explain how everything is going to work. If anyone has a KDE build environment set up and is interested in trying out the changes, then apply the patches listed in this repo. Unlike my previous patches, which required a separate branch for every file operation, all of my current patches can be applied at once, and the changes can be tested without having to check out a new branch.

Now coming back to the issue with undoing and renaming files. My design assumes one top-level parent job and many sub-jobs. However, while undoing changes and renaming files, jobs are created in a loop. This means the creation of many top-level jobs (like KIO::CopyJob) and as many sub-jobs (like KIO::SimpleJob). So a possible workaround is to introduce a new KIO::Job that can be placed outside the loop body and would serve as the parent job for all the jobs created inside the loop. But I will consider this only after my current changes are reviewed by the KIO developers.

Till then cheers.

July 23, 2017

I started to implement multiple cursor and selection support in KDE’s famous text editor kate a while ago, but eventually didn’t quite have the time to finalize it. I am currently at Akademy in Spain, KDE’s annual developer conference, and decided that would be a good time to pick it up again and make it actually work. Here it is in action:

Multiple cursors in kate

What does it do?

It allows you to have an arbitrary amount of cursors and selections in KTextEditor. They all mirror what you do with the primary one — text input, text removal, navigation, text selection, …

Features include:

  • Place any amount of cursors with mouse and keyboard shortcuts.
  • Have any amount of (disconnected, i.e. non-continuous) selections. Each selection has exactly one cursor at either its start or its end, but the selection for a cursor is allowed to be empty. Selections do not need to have the same size.
  • Freeze and unfreeze your secondary cursors, allowing you to move only the primary cursor, or all of them simultaneously.
  • Perform most editing, text selection and text navigation features on all cursors simultaneously.

What is it good for?

You decide.

Multiple selections of different sizes

How do I use it?

Usage is relatively simple; the shortcut for controlling multicursors is currently Ctrl+Meta (Meta is the key with the Windows icon on it). Press Ctrl+Meta and click in your document to place a secondary cursor. Then, just do whatever you would normally do with the keyboard. Press Esc to clear all secondary cursors.

You can place cursors with just the keyboard by pressing Ctrl+Meta+D (“toggle secondary cursor at current position”). Doing that will freeze all secondary cursors, and the keyboard now only moves the primary cursor until you unfreeze them again with Ctrl+Meta+F.

You can also create multiple and additional selections by pressing Ctrl+Meta, and then just using the mouse to select text.

Multiple cursors in a Kate document

What’s the state?

Most things work, but there will be some issues I’m not aware of. What is at the moment completely broken is persistent selection, and the block selection mode. Both just do random things. I will need to fix that — or do you want to help? Assistance is very welcome.

If you want to test things, I’m sure you can find issues around static and dynamic word wrap, and folding.

How do I try it?

Check out the “multicursor” branch in the ktexteditor repo and build it, then start kate or any other application using the katepart editor component. Or, get a kate AppImage with multicursor support from here: http://files.svenbrauch.de/kate-linux/multicursor/
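
If you haven't built KTextEditor before, the steps look roughly like this (assuming the usual KDE anongit layout for the repo URL, KDE Frameworks development packages installed, and a local install prefix as a placeholder; adjust to your setup):

    git clone git://anongit.kde.org/ktexteditor
    cd ktexteditor
    git checkout multicursor
    mkdir build && cd build
    cmake -DCMAKE_INSTALL_PREFIX=$HOME/kde/usr ..
    make && make install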

Please leave feedback in the comments if you try it out!


Akademy-ES 2017

On the 20th and 21st of July, KDE España held, with the invaluable help of UNIA, HackLab Almería and the University of Almería, and with the sponsorship of Opentia, its 12th annual gathering: Akademy-es 2017.

As always happens when Akademy takes place in Spain, Akademy-es 2017 became a prelude to the international event and many well-known KDE developers attended.

Throughout two days, talks were offered covering many different topics, including Plasma, programming (C++, Qt, mobile), exciting projects like Kirigami, proposals for the future such as KDE on automobile, encouragement to use KDE software and contribute to KDE, and information about KDE España.

People who could not attend should not be worried as videos of the talks will be available online.


Akademy makes the news

The local newspaper stopped by for a photo shoot and to write a story on the world gathering of KDE developers that was about to happen.

Attendees also got a chance to play around with Slimbook Ultrabooks such as the well-known KDE flavour or their new Pro edition.

As usual, KDE España members gathered to celebrate their AGM. If you wish to find out what goes on in there, or if you wish to help us out organizing events like Akademy-es and getting the word out in Spain about KDE, please consider joining KDE España. It is now easier than ever!


KDE España board: Baltasar, Adrián, Antonio and José, commonly known as Los Guapos


Slimbook Talk by Alejandro

Yesterday I picked up my new KDE Slimbook from the Slimbook.es stand at Akademy.

Photo of slimbook being handed over

First thing I did, of course, was boot it with my FreeBSD 11.0 SD card, to see if it works with my favorite operating system (with Plasma 5 desktop, of course). Nope: 11.0 hangs after finding acpi_ec0, so I will write about that later this week.

Second thing I did was boot KDE Neon (pre-installed) on it, to see how it works out-of-the-box. I collected a bunch of tiny-little-irritations, papercuts if you will, from the basic installation — which have disappeared after an update and reboot.

It’s a really nice and slick machine. I wanted a machine that would still fit in the train or plane, for work, but a little larger than my Thinkpad x121e. I bought that machine in 2012(?) from Hettes, a Dutch shop specializing in hardware with Linux preinstalled (now gone, since they could no longer source hardware without a Windows license). So I’m really happy to buy a new machine from a Free Software supporting shop in 2017.

There’s a bit of a weird-ass dongle in the box for wired ethernet, SD card slot, HDMI and two USB ports on the sides, and a DC in — I don’t think I will miss USB-C at all, although that would be neat for a refresh. I have not tried the webcam, which is in the bezel at the top of the screen (no nostril shots like some Dell machines). Speaking of bezels, they’re pretty wide compared to current “design” laptops, though not any wider than the x121e’s, so relatively narrower.

The touchpad is a big change for me personally, since I am — or shortly will have been — an IBM TrackPoint™ fan. On the other hand, the touchpad is solid and clicky. The keyboard is nice, with perhaps a little too much flex in the right-hand alt and delete keys. The arrow keys are arrowy, not the fat-left-and-right that (I think) HP uses.

So!

Having discovered that the machine is shiny and nice and fast and works well .. my next step is to try to break it. With the blessing of Alejandro and César — it’s good to have the best possible tech support right at hand.

(Oh, I forgot to mention: I’m pleased as punch with the ordering process, too, for instance the special “deliver to Akademy” shipping option, and the fact that I got email informing me of progress as the laptop was assembled and installed.)

Hey people,

The second phase of the Google Summer of Code evaluation is near! And I have started to work on the second part of my project. That is, I have completed the first part of my project. Yeayy!

What’s on hand right now?

I have started working on the Bundle Manager, creating the UI that was decided on. At the moment I am also adding extra features, like deleting created bundles, and building search functionality.

Deleting the bundles

In Krita, we cannot just delete a created bundle. The created bundles are saved as KisResource objects in a QList. We have to remove the bundle from that list, and then, obviously, remove it from the list widget where the bundle is shown. Then we have to blacklist the file. From there, we can remove the blacklisted bundles the way we empty a recycle bin ;).

Searching the bundles.

I was reading the tag management code most of the time, so I haven't coded much this week beyond these two functionalities. But I have a rough idea of how to do the search: since this is a QList, I am thinking of implementing a search based on the bundle's name, or else I will work out a way using hashing.

I was reading the code for the tagging management and have to look out for things which need fixes. After understanding the codebase I will start implementing features and fixing bugs.

Planned works for next week.

  • Get Second phase review done.
  • Implement features/fix bugs of Tag management.
  • Fix bugs/Implement features of Resource Manager.
  • Document/test if necessary.
  • Get more inputs on the tag management from the community.

Hope to hack in again in better ways.

Cheers.


Docker can help you to quickly set up a development environment for working on a software project. This is great for getting new contributors up to speed quickly. I've recently discovered how nice this is and want to share how I use Docker.

KDE projects often have desktop services, mimetypes and plugins. Ideally, these are installed system-wide. During development it is not convenient to change your main environment just to test your software. This is where Docker comes in. Docker is the most popular Linux container system. It gives you a system that is separate from your main system and which you can rewind when you break it.

Dolphin in KDE neon

The simplest way to get going is to run a fresh Plasma system. KDE neon provides Docker images.

After installing Docker, you can get a system going like this:

# give the docker application access to X
xhost +si:localuser:$USER
# run a throwaway session with dolphin
docker run --rm \
    -e DISPLAY=$DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix \
    -ti \
    kdeneon/plasma:dev-stable dolphin

--rm

cleans up the container after use.

-ti

makes the session interactive.

-e DISPLAY=$DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix

gives the container access to your X session.

Dolphin running in a throwaway Docker container. No home directory was set up.

Dolphin in openSUSE

Instead of kdeneon/plasma:dev-stable you might want to use a different distribution. You can build your own images by creating your own Dockerfile. Here is a simple Dockerfile for a Docker image for Dolphin based on openSUSE.

FROM opensuse

# opensuse image is minimal, so add plasma and dolphin
RUN zypper --non-interactive install plasma5-desktop dolphin \
  && zypper clean

# set up an environment for a user
RUN useradd --create-home user
USER user
ENV KDE_FULL_SESSION=true
ENV XDG_RUNTIME_DIR=/run/user
WORKDIR /home/user

CMD ["bash", "-l"]

To create a Docker image, write this text into a file named Dockerfile and run

xhost +si:localuser:$USER
docker build -t opensuse-dolphin .
docker run --rm \
    -e DISPLAY=$DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix \
    -ti \
    opensuse-dolphin dolphin

Now a Docker image called opensuse-dolphin has been created and run. At each uppercased command like RUN, a snapshot is created. The last step is to tag the final snapshot with the name opensuse-dolphin. That name can be used to run a throwaway container with Dolphin.

Dolphin running in a throwaway Docker container based on openSUSE.

Adding a home

To do any coding, you'd like to provide files to your container and have the ability to store files. This is possible. You can give a container access to a local folder with the -v flag, for example -v $HOME/calligra-dev/home:/home/user which mounts your folder $HOME/calligra-dev/home to the folder /home/user in the container.

I like to manage this container home with git, so I can start with a clean folder by running git clean -fd.

mkdir -p $HOME/calligra-dev/home
cd $HOME/calligra-dev/home
git init
echo 'git clean -fd' > .bash_profile
echo -e 'src\nbuild\ninstall' > .gitignore
git add .bash_profile .gitignore
git commit -a -m 'Initial commit'

So the development environment is managed by Docker on the system level and by git on the level of work directory.

Developing Calligra in Docker

After this introduction, here is the Dockerfile that I'm using for Calligra with QtCreator or KDevelop and the clazy quality checker. Clazy has not been packaged, so the Dockerfile needs a build step for it.

FROM kdeneon/plasma:dev-stable

USER root

RUN apt-get update && apt-get dist-upgrade -y

RUN apt-get install -y --no-install-recommends \
  cmake extra-cmake-modules g++ gettext git kdoctools-dev kross-dev \
  libboost-all-dev libeigen3-dev libetonyek-dev libfontconfig1-dev \
  libfreetype6-dev libgit2-dev libgsl-dev libkf5activities-dev \
  libkf5archive-dev libkf5kcmutils-dev libkf5kdelibs4support-dev \
  libkf5notifications-dev libkf5notifyconfig-dev libkf5parts-dev \
  libkf5wallet-dev libkf5xmlgui-dev libodfgen-dev libpoppler-qt5-dev \
  libqca-qt5-2-dev libqt5opengl5-dev libqt5svg5-dev libqt5x11extras5-dev \
  librevenge-dev libwpd-dev libwpg-dev libwps-dev ninja-build pkg-config \
  sudo

# requirements for clazy
RUN apt-get install -y --no-install-recommends \
  clang llvm-dev libclang-3.8-dev

# build and install clazy
RUN git clone git://anongit.kde.org/clazy \
  && cd clazy \
  && cmake -DCMAKE_INSTALL_PREFIX=/usr -DCMAKE_BUILD_TYPE=Release -GNinja \
  && ninja install

# dependencies for development
RUN apt-get install -y --no-install-recommends \
  cmake-curses-gui \
  less vim strace qtcreator kdevelop valgrind gdb

USER neon
CMD ["/bin/bash", "-l"]
docker run -v $HOME/calligra-dev/home:/home/neon \
  -v /tmp/.X11-unix:/tmp/.X11-unix \
  -e DISPLAY=$DISPLAY \
  -e XDG_CURRENT_DESKTOP=$XDG_CURRENT_DESKTOP \
  --network none \
  -ti calligra-clazy

Parallel development environments

Docker is easy to set up and lets you have tight control over your development and testing environment. You can have many environments present on your system and reset and start and stop them quickly. Finally, here is a screenshot of my desktop running development environments for Calligra 2 and Calligra 3 in parallel. Both are running in QtCreator. This is a screenshot from my Akademy presentation on Calligra.

Two versions of Calligra running in Docker containers.

Helping contributors to get started

Getting started in KDE can be challenging. Docker images or Dockerfiles for complete development environments can get new contributors started more quickly. In KDE, neon already uses Docker images. Jonathan Riddell is giving a presentation about that this Akademy. Because this is so easy, I predict that we'll see more Dockerfiles being published by KDE projects.

July 22, 2017

During the first day at the Akademy, everything went according to plan and nearly everything was on time. Kudos to the organisers.

The weather was balmy at the beginning of the day and, although Aleix Pol said it was not hotter than a hot day in Barcelona, many of the Scandinavian and Scottish attendees were visibly wilting under the sun. Fortunately for them, the venue is equipped with air-conditioning.

Little known fact about Almería: it is situated in the biggest desert in Europe, the Desert of Tabernas. A better known fact is that that same desert has been used as a location for many spaghetti westerns, including the seminal Sergio Leone movies "A Fistful of Dollars" and "The Good, the Bad and the Ugly". What is more interesting for some KDE members is that Tabernas has also been used in the filming of at least one Doctor Who episode ("A Town Called Mercy"). Unsurprisingly, the whovians amongst us quickly got busy and organised a trip to the place of the shoot for later in the week.

The Talks


Robert Kaye has managed to woo both an active
community of volunteers and the industry with MusicBrainz.

Robert Kaye did not disappoint and delivered an entertaining keynote on how MusicBrainz, a community-powered non-profit, has managed to be THE database of musical metadata. MusicBrainz's data is used by Google, the BBC, YouTube, Amazon, and nearly everyone else (including most FLOSS media players).

Jean-Baptiste Mardelle introduced us to the new features, back end and interface of Kdenlive, KDE's video editing software. Apart from having cleaner code and being more stable, upcoming versions of Kdenlive will sport intelligent clip cutting, resizing and inserting, making life for video editors much easier.

As expected Aditya Mehra's talk on the Mycroft plasmoid was another of the highlights of the day. The topic, after all, is intrinsically interesting -- there is something about issuing voice commands to an AI assistant on your desktop that appeals to everybody.

During the mid-afternoon Ask Us Anything session, attendees had the chance to... well, ask anything to the KDE e.V. board members. Questions ranged from governance to how donations were used, passing through the process of getting elected to the board. Talking of which, it was a chance to properly meet the new board member, Eike Hein, who stepped in for Marta Rybczynska.

Eike, among other things, maintains and develops Konversation, a user-friendly IRC client for KDE. He is also in charge of Yakuake, an original spin on the traditional terminal. Yakuake sits hidden at the top of your desktop and you can unfold it like a blind when you need it. He discovered KDE when test running Corel Linux (does anybody else remember that rather bizarre distro?) back in the 90s and started contributing in 2005.

In the evening, Timothée Giet gave us an update on GCompris, the suite of educational activities and games for young children. The improvements in design and to the number of features are turning GCompris into a free, safe and privacy-protecting suite of educational programs as opposed to some of the proprietary alternatives out there.


Eike, the new member of the KDE e.V. board,
answering attendees' questions.

Agustín Benito, on the other hand, pointed to new sectors KDE should probably be looking into. Agustín has been working on Free Software on embedded devices for the automotive industry for some time now and reckons this is an area in which KDE could grow and even become a mainstream technology.

At the very end of the day, in the very last session, there was a lively debate on writing and how developers could better describe their projects to a larger audience. The discussion was animated enough to make us forget the time and, finally, we were all thrown out.

Day 2 promises to be equally fun.

About Akademy

For most of the year, KDE—one of the largest free and open software communities in the world—works on-line by email, IRC, forums and mailing lists. Akademy provides all KDE contributors the opportunity to meet in person to foster social bonds, work on concrete technology issues, consider new ideas, and reinforce the innovative, dynamic culture of KDE. Akademy brings together artists, designers, developers, translators, users, writers, sponsors and many other types of KDE contributors to celebrate the achievements of the past year and help determine the direction for the next year. Hands-on sessions offer the opportunity for intense work bringing those plans to reality. The KDE Community welcomes companies building on KDE technology, and those that are looking for opportunities. Join us by registering for the 2017 edition of Akademy today.

For more information, please contact the Akademy Team.

I've used ngrx/store and ngrx/effects for a while, and like the pattern. The new version has been released, and here are some of the changes I had to make in my code.

Pre v4 you built your reducers as follows, passing an object of stateslice: reducerfunction pairs.

const productionReducer = combineReducers(reducers);
export function reducers(state: any, action: any) {
    return productionReducer(state, action);
}

With the new version, don't use combineReducers. I was running into the situation where the reducers weren't being initialized and the state was null.

Now you create your object of reducers (typed against your root state interface, assumed here to be called AppState):

export const reducer: ActionReducerMap<AppState> = {
    contactprimary: fromPrimary.reducer,
    contactselection: fromSelectionList.reducer,
    contactlist: undoable(fromContactList.reducer),
    relations: relationUndoable(fromRelations.reducer),
    ...

and in your app.module imports: [] section


StoreModule.forRoot(reducer),

The second big change is that the Action type has been redefined.

export interface Action {
  type: string;
}

This breaks any of your code where you use action.payload. To fix it you have to define Action in each reducer. First define your action classes.

import {Action} from "@ngrx/store";
import {PrintItem} from "../../../models/print.model";


export const ADD_TO_QUEUE = '[Printing] Add to queue';
export const REMOVE_FROM_QUEUE = '[Printing] Remove from queue';
export const MARK_STATUS = '[Printing] mark status';

export class Print implements Action {
    type = ADD_TO_QUEUE;
    constructor(public payload: PrintItem) {}
}

export class PrintDone implements Action {
    type = REMOVE_FROM_QUEUE;
    constructor(public payload: PrintItem) {}
}

export class PrintStatus implements Action {
    type = MARK_STATUS;
    constructor(public payload: PrintItem) {}
}

export type Actions
    = Print
    | PrintStatus
    | PrintDone;


Note that the payload has a type in each one, and it is the same. You can call it whatever you like, or have multiple properties. The type Actions is exported.

Then in your reducer.

import {PrintState} from "../../../models/print.model";
import {ActionReducer} from "@ngrx/store";
import * as printingActionTypes from "../actions/printing.actions";

export type Action = printingActionTypes.Actions;
const initialState: PrintState = {
    queue: []
};
export const reducer: ActionReducer<PrintState> = (
    state = initialState, action: Action) => {
    switch (action.type) {
....


In this reducer, action is of the type defined in your action classes. The same action needs to be defined in the Effects as well.

This requires refactoring much of your reducers and effects. If you have different payloads for each of your defined action classes, lots of changes.

The advantage is strict typing of the data being passed around. It caught a couple of minor flaws in my code, and I will be rewriting some reducers to take advantage of the added protection.

Otherwise it all works. The documentation is very helpful, especially with the specific initialization requirements.

I will update this blog after I have refactored the feature module reducers. There are also some improvements in the way selectors can be grouped.

In this article, I am outlining an idea for an improved process of deploying software to Linux systems. It combines advantages of traditional, package-management-based systems with containerized software through systems such as Flatpak, Snap, or AppImage. An improved process allows us to make software deployment more efficient across the whole Free software community, have better-supported software on users' systems, and allow for better quality at the same time.

Where we are going
In today’s Linux and Free software ecosystems, users usually receive all their software from one source. That usually means the software is well integrated with the system, can be tested in combination, and support comes from a single vendor. Compared to systems in which individual software packages are downloaded from their individual vendors and then installed manually, this has huge advantages, as it makes it easy to get updates for everything installed on your system with a single command. The base system and the software come from the same hands and can be tested as a whole. This ease of upgrading is almost mind-boggling to people who are used to a Windows world, where you’d download 20 .exe installer files post-OS-install and have to update them individually, a hugely time-consuming process, and at times outright dangerous, as software easily gets out of date.

Traditional model of software deployment
There are also downsides to how we handle software deployment and installation currently; most of them revolve around update cycles. There is always a middle man who decides when and what to upgrade. This results in applications getting out of date, which leads to a number of problems, as security and bug fixes do not make it to users in a timely fashion:

  • It’s not unusual that software installed on a “supported” Linux system is outdated and not at all supported upstream anymore on the day it reaches the user. Worse, policies employed by distributions (or more generally, operating system vendors) will prevent some software packages from ever getting an update other than the most critical security fix within the whole support cycle.
  • Software out in the wild, with all its problems, isn’t supported upstream; bug reports reaching the upstream developers are often invalid and have been fixed in newer versions, or users are asked to test the latest version, which most of the time isn’t available for their OS — this makes it harder to address problems with the software and it’s frustrating for both users and developers.
  • Even if bugs are fixed in a timely fashion, the likelihood of users of traditional distributions actually receiving these updates without manually installing them is small, especially if users are not aware of it.
  • Packaging software for a variety of different distributions is a huge waste of time. While this can be automated to some extent, it’s still less than ideal as work is duplicated, packaging bugs do happen simply because distribution packagers do not fully understand how a specific piece of software is built and best deployed (there’s a wide variety of software after all) and software stacks aren’t often well-aligned. (More on that later!)
  • Support cycles differ, leading to two problems:
  • Distros need to guarantee support for software they didn’t produce
  • Developers are not sure how much value there is in shipping a release and subsequent bugfix releases, since it takes usually at least months until many users upgrade their OS and receive the new version.
  • Related to that, it can take a long time until a user confirms a bug fix.
  • There is only a small number of distributions who can package every single piece of useful software available. This essentially limits the user’s choice because his niche distro of choice may simply not have all needed software available.

The value of downstreams

One argument that has been made is that downstreams do important work, too. An example of this: legal or licensing problems are often found during reviews at SUSE, one of KDE’s downstream partners. These are often fed back to KDE’s developers, where the problems can be fixed and made part of upstream. This doesn’t have to change at all; in fact, with a quicker deployment process, we’re actually able to ship these fixes to users more quickly. Likewise, QA that currently happens downstream should shift more to upstream, so fixes get integrated and deployed quicker.

One big problem that we are currently facing is the variety of software stacks our downstreams use. An example that often bites us is that Linux distributions are combining applications with different versions of Qt. This is not only problematic on desktop form-factors, but has been a significant problem on mobile as well. Running an application against the same version of Qt that developers developed or tested it against means fewer bugs due to a smaller matrix of software stacks, resulting in less user-visible bugs.

In short: We’d be better off if work happening downstream happens more upstream, anyway.

Upstream as software distributor

Software directly from its source
So, what’s the idea? Let me explain what I have in mind. This is a bit of a radical idea, but given my above train of thoughts, it may well solve a whole range of problems that I’ve explained.

Linux distributors supply a base system, but most of the UI layers, i.e. the user-visible parts, come directly from KDE (or other vendors, but let’s assume KDE for now). The user gets to run a stable base that boots a system supporting all their hardware and gets updated according to the user’s flavor, but the apps and relevant libraries come from upstream KDE and are maintained, tested and deployed from there. For many of the applications, the middle-man is cut out.

This leads to

  • vastly reduced packaging efforts of distros as apps are only packaged once, not once per distro.
  • much, much shorter delays until a bug fix reaches the user
  • stacks that are carefully put together by those that know the apps’ requirements best

Granted, for a large part of the user’s system that stays relatively static, the current way of using packaged software works just fine. What I’m talking about are the bits and pieces that the users relies on for her productivity, the apps that are fast moving, where fixes are more critical to the user’s productivity, or simply where the user wants to stay more up to date.

Containerization allows systemic improvements

In practice, this can be done by making containerized applications more easily available to users. Discover, Plasma’s software center, can allow the user to install software directly supplied by KDE and keep it up to date. Users can pick where to get software from, but distros can make smart choices for users as well. Leaner distros could even rely entirely on KDE (or other upstreams) shipping applications, and fully concentrate on the base system and advancing that part.

Luckily, containerization technologies now allow us to rethink how we supply users with our software and provide opportunities to let native apps on Linux systems catch up with much shorter deployment cycles and less variety in the stack, resulting in higher quality software on our users’ systems.


Older blog entries


Planet KDE is made from the blogs of KDE's contributors. The opinions it contains are those of the contributor. This site is powered by Rawdog and Rawdog RSS. Feed readers can read Planet KDE with RSS, FOAF or OPML.