April 19, 2019

Kipi Plugins is a set of application plugins for manipulating images. They use libkipi, which is released as part of KDE Applications. Kipi Plugins used to get standalone releases and was then moved to be part of Digikam releases. Since Digikam 6 they have been deprecated by Digikam in favour of its new plugin framework, DPlugins. Meanwhile, in KDE Frameworks, the Purpose framework is another, newer project covering similar features.

However, Kipi Plugins are still used by the KDE apps KPhotoAlbum, Gwenview and Spectacle, so they shouldn’t disappear yet.

I’ve made a new release available for download now.


It is versioned 5.9.1 because it is little changed from the previous release, 5.9.0, which was made as part of Digikam.

Tagged commit b1352149b5e475e0fbffb28a7b5fe13503f24dfe

Sha256 Sum: 04b3d31ac042b901216ad8ba67dafc46b58c8a285b5162b51189833f6d015542

Signed by me Jonathan Riddell <jr@jriddell.org>

This will become part of KDE Applications in its next release scheduled for August and will follow the KDE Applications version numbers.


We are happy to announce the next release of LabPlot! As usual, in this release announcement we want to introduce the major points of the new release. Some of the new developments were already described in previous blog posts, where we reported on ongoing progress. Many other smaller and bigger improvements, bug fixes and new features also made it into this release. The full list of noteworthy changes is available in our changelog.

LabPlot already has quite a good feature set that allows creating 2D Cartesian plots with many editing possibilities and support for a good variety of data sources. The analysis functionality also becomes more extensive and mature with every release. Building on this solid foundation, it is now time to take care of other plot types and visualization techniques as well. As part of this release, 2.6, we ship the histogram:
Main Window with a Histogram
We have already dedicated a separate blog post to this important new feature, where the different options for this visualization technique are described.
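For readers new to the technique, the computation behind a histogram is simple to sketch. The following is a plain-Python illustration of the binning (illustrative only, not LabPlot code; the function name is made up):

```python
# Split the data range [lo, hi] into equal-width bins and count the
# values that fall into each bin.
def histogram(data, bins, lo, hi):
    width = (hi - lo) / bins
    counts = [0] * bins
    for x in data:
        if lo <= x <= hi:
            # the last bin is closed on the right, as in most plotting tools
            i = min(int((x - lo) / width), bins - 1)
            counts[i] += 1
    return counts

print(histogram([1.0, 1.2, 2.5, 2.7, 3.1, 3.3, 3.4, 4.8], 4, 1.0, 5.0))
# -> [2, 2, 3, 1]
```

LabPlot's histogram offers far more options (bin-count heuristics, orientations, different value representations) than this minimal binning loop.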

The second big new feature is support for the MQTT protocol. With LabPlot 2.6 it is now possible to read live data from MQTT brokers. This feature was contributed by Ferencz Kovács during Google Summer of Code 2018. His blog contains a lot of information about the progress of his project as well as many demos.

The number of supported file formats that can be imported into LabPlot was further extended in this release. We now support ROOT‘s trees and tuples, and especially ROOT’s histograms, which were already introduced in a previous blog post:

Further newly supported formats are Ngspice raw files (ASCII and binary) and JSON (JSON arrays and objects).

Ngspice raw files and ROOT files can also be used as a live data source. In this mode, LabPlot monitors the external files, re-reads them completely on changes and updates the visualization of the data. See this blog post for a video demoing such a workflow.
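The live-data mode described above boils down to watching a file's modification time and re-importing the whole file when it changes. A hedged sketch of that loop (illustrative only; the function name and callback are not LabPlot API):

```python
import os
import time

# Poll the file's modification time; on every change, re-read the file
# completely and hand its contents to the caller.
def watch_and_reload(path, on_change, polls=3, interval=0.1):
    last_mtime = None
    for _ in range(polls):
        mtime = os.stat(path).st_mtime
        if mtime != last_mtime:
            with open(path) as f:
                on_change(f.read())  # full re-read on change
            last_mtime = mtime
        time.sleep(interval)
```

A real implementation would run indefinitely and use file-system notifications rather than polling, but the re-read-everything-on-change behaviour is the same.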

The import of data from SQL databases was already implemented in the previous release. Unfortunately, the handling of ODBC connections was not implemented correctly in the past; this is fixed in this release.

The “File Info” dialog was extended and now shows format-specific information for the selected file, such as the number of attributes, dimensions and variables for netCDF, etc.:

File Information Dialog

This release also saw good progress in the area of data analysis functions in LabPlot. Convolution/deconvolution and cross-/autocorrelation of data sets, with many different options (sampling interval, linear/circular, normalization, etc.), joined the family of analysis functions in 2.6.
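To make the new analysis functions concrete, here is a small plain-Python sketch of linear convolution and autocorrelation (illustrative only, not LabPlot's implementation, which offers many more options such as circular mode and normalization):

```python
# Full linear convolution; the output has length len(x) + len(h) - 1.
def convolve(x, h):
    y = [0.0] * (len(x) + len(h) - 1)
    for i, xi in enumerate(x):
        for j, hj in enumerate(h):
            y[i + j] += xi * hj
    return y

# Correlation of a signal with itself shifted by `lag` samples.
def autocorrelation(x, lag):
    return sum(a * b for a, b in zip(x, x[lag:]))

signal = [0.0, 1.0, 2.0, 1.0, 0.0]
smoothing = [1 / 3, 1 / 3, 1 / 3]   # simple moving-average kernel
print(convolve(signal, smoothing))  # smoothed signal, length 7
print(autocorrelation(signal, 0))   # 6.0, the sum of squares
```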

For the worksheet, a couple of new features and improvements were developed. Here we want to mention the new ability to rotate plot legends, the option to use different border shapes for labels (rectangle, ellipse, etc.), and better positioning of rotated axis tick labels.

While working with plots it is sometimes desirable to avoid accidental panning and zooming triggered by mouse drag and wheel events. For this we added a new option that makes plots non-interactive.

In the spreadsheet, in addition to the already available methods for data generation (filling with constant values, mathematical function values, random values, etc.), there are now functions for quick data manipulation: adding or subtracting a value, and multiplying or dividing by a value. Similar functionality was also implemented for matrix data containers.
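The quick manipulation functions amount to applying one arithmetic operation to every cell of the selection. A minimal sketch in plain Python (illustrative, not LabPlot code; each helper returns a transformed copy of the column):

```python
# One arithmetic operation applied to every cell of a selected column.
def add_value(column, v):      return [x + v for x in column]
def subtract_value(column, v): return [x - v for x in column]
def multiply_value(column, v): return [x * v for x in column]
def divide_value(column, v):   return [x / v for x in column]

column = [10.0, 20.0, 30.0]
print(multiply_value(add_value(column, 5.0), 2.0))  # [30.0, 50.0, 70.0]
```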

When exporting a worksheet to a text file, it is now possible to specify the number format in the export options, similarly to how this is done during import. Furthermore, a spreadsheet can now be exported to an SQLite database.

The integration of Cantor – KDE’s frontend for open-source computer algebra systems and programming languages – saw further improvements. It is now possible to modify the settings for the different systems directly in LabPlot:

CAS Settings

Cantor itself saw great improvements in recent months. The recent blog post Cantor 18.12 – KDE way of doing mathematics described the progress. Linux users benefit from their distributions shipping current, up-to-date versions of LabPlot and Cantor. For Windows users, we ship the current version of Cantor as part of our Windows installer, with working support for Octave, Python and Maxima:

Maxima worksheet in LabPlot on Windows

The Octave part still has problems with the integrated plots, and we plan to address this for the next release. Also for the next release, we plan to have a working combination of LabPlot and Cantor for Mac OS X users. Even though the development of LabPlot and Cantor is done on Linux, we continue to increase our effort and attention for these platforms, too.

All in all, it took some time to finalize the release. Again, many features were implemented, but a lot is still in the backlog and already under development. Also, Google Summer of Code 2019 is coming soon and we have a couple of nice projects for this summer. So stay tuned, give LabPlot a try and send us your feedback.

KDE Applications 19.04 was released yesterday, the new major revision of the KDE Community's application ecosystem, packed with new features and soon available in your favourite distribution thanks to the work of the packagers.

KDE Applications 19.04 released

On April 18, 2019, the developers of the KDE project announced the release of KDE Applications 19.04.
In the developers' words:

“Our community works continuously to improve the software included in the KDE Applications series. Along with new features, we improve the design, usability and stability of all the utilities, games and creativity tools. Our goal is to make your life easier by making KDE software more enjoyable to use. We hope you like all the new improvements and bug fixes you'll find in 19.04!”

In total, more than 150 bugs have been resolved, which means the return of inactive features, normalized keyboard shortcuts, and plenty of bug fixes, making the KDE application suite friendlier and letting us be more productive and efficient.

What's new in KDE Applications 19.04

KDE Applications 19.04 offers many new features, which have been summarized in a video. Note that videos of this kind usually show only some of the new functionality, so I recommend reading the full list of changes.

Today I just want to give a quick overview of the main highlights, to be covered in more complete posts in the future:

  • Kitinerary: Kontact's new travel assistant, which helps you obtain location information and advises you on the route to follow.
  • Dolphin: The number of thumbnail types the file manager can show by default has been extended: Microsoft Office files, .epub and .fb2 eBooks, Blender files and PCX files.

    Kdenlive has been partially rewritten to improve its usability.

  • Kdenlive: 60% of the non-linear video editor's internal code has been rewritten, improving its overall operation.
  • Okular: Now allows viewing and verifying digital signatures in PDF files.
  • KMail: Automatic detection of phone numbers in e-mails, which can be dialed directly via KDE Connect.
  • KOrganizer: Recurring Google Calendar events sync correctly again.
  • Kate: Improved context menus with new actions available.
  • Konsole: Tab management has received several improvements that will help you be more productive. New tabs can be created with a middle-click on empty parts of the tab bar.
  • Gwenview: Full touchscreen support introduced, with gestures and zoom.
  • Spectacle: New options added in the Rectangular Region mode.
  • KmPlot: Added the ability to zoom using the Ctrl key and the mouse wheel.
  • Kolf: Sound restored.

April 18, 2019

We are happy to announce version 1.13.0 of the Qbs build tool. This is the last version to be released under the auspices of the Qt Company, but certainly not the least.


Transparent pkg-config support

Qbs projects can now make use of pkg-config modules. Syntax-wise, the same dependency mechanism as for Qbs’ own modules is used. For instance, on a typical Linux machine with an OpenSSL development package installed, the following is enough to let a Qbs project build against it:

Depends { name: "openssl" }

Internally, this functionality is implemented on top of module providers, a powerful new feature that allows creating Qbs modules on demand.

Automatic Qt detection

Projects with a Qt dependency now set up the required Qt-specific modules (such as Qt.core) automatically at build time, using either the PATH environment variable or the moduleProviders.Qt.qmakeFilePaths property. It is therefore no longer strictly necessary to create a profile using the setup-qt command. Behind the scenes, this feature is also implemented using module providers.

Job Pools

The global limit for the number of concurrently running jobs does not make sense for all types of commands. For instance, linkers are typically I/O-bound and take up a lot of memory, so it often makes sense not to parallelize them as much as compilers. Here’s how job-specific limits are set on the command line:

$ qbs --job-limits linker:2,compiler:8

These limits can also be provided via preferences and in project files. The details are explained here.

What else is new?

Language Improvements

Rules are no longer required to specify output artifacts. As a result, rules whose main purpose is in their “side effect” will look more natural now, as they don’t need to declare a dummy output file anymore.

It is no longer necessary to start each project file with an “import qbs” line.

The Application, DynamicLibrary and StaticLibrary items have new properties install and installDir for more convenient installation of target binaries.

We introduced Process.atEnd() and FileInfo.canonicalPath().

C/C++ Support

GNU linker scripts are now scanned recursively to catch changes to included linker scripts. Thanks to Ola Røer Thorsen for this contribution!

The new cpp.linkerVariant property allows forcing the use of ld.gold, ld.bfd or lld for linking.

Qt Support

We introduced the new property Qt.core.enableBigResources for the creation of “big” Qt resources.

Static builds now pull in the default set of plugins as specified by Qt, and the user can specify the set of plugins by type.

Android Support

The AndroidApk item has been deprecated. Instead, a normal Application item can and should be used now.

Building Qt apps is properly supported now. Some small changes to Qt’s androiddeployqt tool were necessary to achieve this, so you need at least Qt 5.12.

Autotest Support

There is an autotest module now, which you can use to specify arguments and working directory per test.

Various things

We introduced the texttemplate module, a facility similar to qmake’s QMAKE_SUBSTITUTES feature.

We added basic support for Google Protocol Buffers (for C++ and Objective-C). Thanks to Ivan Komissarov for this contribution!

Try It!

The Open Source version is available on the download page, and you can find commercially licensed packages on the Qt Account Portal. Please post issues in our bug tracker. You can also find us on IRC in #qbs on chat.freenode.net, and on the mailing list. The documentation and wiki are also good places to get started.

Qbs is also available on a number of packaging systems (Chocolatey, MacPorts, Homebrew) and updated on each release by the Qbs development team. It can also be installed through the native package management system on a number of Linux distributions including but not limited to Debian, Ubuntu, Fedora, and Arch Linux.

Qbs 1.13.0 is also included in Qt Creator 4.9.0, which was released earlier this week.

So, what now?

Preparations for handing over the project to the community are ongoing. Stay tuned for further announcements.

The post Qbs 1.13 released appeared first on Qt Blog.

KDE’s bundle of apps KDE Applications 19.04 has been released. Here at KDE neon the build servers have built all the .debs and the QA servers are now checking over them before publishing shortly.

That’s great if you run KDE neon but what about every other distro? Well for the first time ever you can install 50-odd apps from the bundle from one of the new cross-distro Linux App Stores, the Snap Store.

Our software store app Discover also supports snaps, so they should be available through it if you have snapd installed.

Get it from the Snap Store

Snap is a new container-based format from Canonical, and snaps can be installed on any Linux distro. The spec and software are all open source, managed from a centralised store in much the same way as Google Play, Steam or F-Droid. Here at KDE neon we have been experimenting with this format for some time, waiting for the needed features to be added so we could give a great experience. There are still a couple of rough edges, such as printer support or which directory the File Open dialog uses by default, which is mostly down to Qt and how it supports the xdg-portals spec. The store also does not yet pick up all the metadata, such as icons and screenshots, from AppStream metadata files.

Of the 50 apps we have ready today, most are simple ones; games and education apps feature a lot. Hopefully we can get some of the more flagship apps up before long.

This is an exciting change in the way we deliver our software to you, the user. No more having to go through a third-party distro before you can install it; now we can release directly. Hopefully other app stores will be supported in the near future too, such as Flatpak/Flathub and Appstream/Appstreamhub.

The new format puts app authors and maintainers in charge of their software. Currently it's done through KDE neon, but there's no reason why that needs to be the case; it can and should be done through the same KDE repos the apps live in, with continuous integration and deployment done from our new GitLab setup, invent.kde.org. Watch out for blog posts with details of how it works shortly.

Kubuntu 19.04 has been released, featuring the beautiful Plasma 5.15 desktop from the KDE community.

Code-named “Disco Dingo”, Kubuntu 19.04 continues our proud tradition of integrating the latest and greatest open source technologies into a high-quality, easy-to-use Linux distribution.

The team has been hard at work through this cycle, introducing new features and fixing bugs.

Under the hood, there have been updates to many core packages, including a new 5.0-based kernel, Qt 5.12, KDE Frameworks 5.56, Plasma 5.15.4, and KDE Applications 18.12.3.

Kubuntu has seen some exciting improvements, with newer versions of Qt, updates to major packages like Krita, Kdeconnect, Kstars, Latte-dock, Firefox and LibreOffice, and stability improvements to KDE Plasma.

For a list of other application updates, upgrading notes and known bugs, be sure to read our release notes:


Download 19.04 or read about how to upgrade from 18.10.

Qt 5.9.8 is released today. As a patch release Qt 5.9.8 does not add any new functionality, but provides security fixes and other improvements.

Compared to Qt 5.9.7, the new Qt 5.9.8 contains multiple security fixes, updates to some of the 3rd party modules and close to 20 bug fixes. In total there are around 130 changes in Qt 5.9.8 compared to Qt 5.9.7. For details of the most important changes, please check the Change files of Qt 5.9.8.

Qt 5.9.8 can be installed using the maintenance tool of the online installer. For new installations, please download the latest online installer from the Qt Account portal or from the qt.io Download page. Offline packages are available for commercial users in the Qt Account portal and at the qt.io Download page for open-source users.

The post Qt 5.9.8 Released appeared first on Qt Blog.

As you may know if you follow this blog, on April 20 at 21:00 (UTC) we have a new edition of the Maratón Linuxero, this time subtitled the "Flisol 2019 edition". So this week I have devoted myself to promoting it as much as I can. Yesterday I talked about the programme, and today it's time to share the available information on the contest and sponsors of the Maratón Linuxero Flisol 2019 edition, two important aspects of this event.

Contest and sponsors of the Maratón Linuxero Flisol 2019 edition

An important aspect of the Maratón Linuxero is that it tries to bring together all the protagonists of Free Software: users, listeners, newcomers, multimedia artists, web administrators, programmers, promoters, entrepreneurs, activists, designers, etc.

And to that end it offers a place for each of them in a variety of ways: actively participating in the podcast, designing banners, preparing the broadcasting system, etc.

That said, to encourage listener participation and draw in newcomers, the Maratón Linuxero has prepared a contest that will raffle four packs of Free-Software-related goods donated by some of the companies sponsoring the event.

To take part, you must send a short audio clip answering the question "Why is it important to organize a FLISoL?" to maratonlinuxero@disroot.org.

And speaking of sponsors, it's worth dedicating a few lines to each of them:

  • Neodigit: Web hosting provider closely linked to Free Software.
  • Slimbook: Assembler of devices 100% compatible with GNU/Linux distributions (laptops, mini PCs, towers, all-in-ones, etc.).
  • VantPC: Another company selling computers with GNU/Linux.
  • LibreBit: Business-management software solutions built with Free Software.
  • Linux Español: Website that provides its users with access to information and services offered by Tecnologic Team to people and organizations interested in them.

What is the Maratón Linuxero?

Maratón Linuxero 1.1

The Maratón Linuxero is a project created by GNU/Linux podcasters and listeners who want to hold a live event using free-software applications and services.

It originated in trying to see whether it was possible to pull off live broadcasts, as other organizations have done, but without resorting to proprietary systems, or at least using ones aligned with Free Software or open source.

It's not only podcasters who collaborate, but also system administrators, developers, designers and artists, who managed to provide services such as the website, blog, communication channels, posters, promos and videos for the project.


What is Flisol?

Although many of you will already know, it never hurts to explain things. FLISoL (Festival Latinoamericano de Instalación de Software Libre) is one of the largest Free Software outreach events in Latin America, aimed at all kinds of audiences: students, academics, entrepreneurs, workers, public officials, enthusiasts, and even people without much computer knowledge.

In other words, it is an event where spreading Free Knowledge takes precedence over technical aspects, making it the perfect gateway for new Free Software sympathizers.


The first new applications bundle release of the year adds stability, coherence and new features that help users become more comfortable and more productive using KDE software.

KDE's file manager, Dolphin, can now show previews of more types of files, including Microsoft Office files, Blender 3D scenes, and EPUB eBooks. Continuing in the same department, previews of text files containing code or markup will now show syntax highlighting. All of the above will help users identify the contents of a file even before opening it. Other usability improvements include the option to choose which split to close when clicking the Close Split button, smarter tab placement, and a more practical way of tagging files.

Okular, KDE's document viewer, now lets users verify PDFs that have been digitally signed. Digital signatures are used in official documents to confirm that the document comes from the right source, and that it has not been tampered with. Okular also lets you edit LaTeX documents directly in the viewer and has improved touchscreen compatibility, making it much easier to use it in presentation mode.

Kontact has also improved, with KMail receiving most of the changes. Starting with this release, KMail can check the grammar in the text of your messages. A new thing you will find in this version of Kontact is KItinerary, a travel assistant developed by Volker Krause that advises you on how to best get to your destination using metadata from your e-mails.

Many other apps have been improved: Konsole has made using tabs easier, Spectacle gives you more options when taking and saving screenshots, and Kdenlive has overhauled the timeline, making editing video easier and more fun, just to mention a few changes.

As always, you can read more about KDE Applications 19.04 in the official announcement. Many distributions are currently packaging Applications 19.04 ready for update and install. For the first time, you can install many of the apps on any distro directly from the Snap Store.


April 17, 2019

One of the things that makes Plasma so attractive is the officially supported option to also customize the style, beyond colors and wallpaper, allowing users to personalize the look to their liking. Designers have picked up on that and created a good set of custom designs (store.kde.org lists 454 themes at the time of writing).

Intended changes and unintended changes

In the 11 years since Plasma was first released (January 11, 2008) there has been some evolution of the theming options. Sadly, regressions in the support of older options were sometimes introduced as well, mainly in the early phase of Plasma 5, when code was ported to QtQuick 2 and the new default theme Breeze did not expose the regressions, or even triggered some changes for its special needs or for the simplifications it allowed.

Given that Breeze as the default was pleasing to most, that some alternative theme designers were possibly lost during the long transition from a pleasing KDE4 to an again pleasing Plasma 5, and that new theme designers made their themes directly against the latest Plasma 5 theme support, the piled-up regressions were not a real issue for those contributing.

Well, not a real issue unless one started to restore an old theme like “Fluffy Bunny”, and wondered why it looks broken. And then wondered the same while playing with all the other old themes still available on store.kde.org.

So much partially good artwork being broken made my heart bleed, and in the absence of design abilities of my own, it was time to document the regressions and fix some code where possible to restore the previous variability.

Teaching Porting to Plasma 5

So, theme designers, please find a new page on techbase.kde.org about Porting Themes to latest Plasma 5. It lists the changes known so far and what to do about them. Please extend the list if something is still missing. You can test the fixes in the code against your themes with the latest development versions of Plasma using Live images.

Fresh old Air & Oxygen

And even the Oxygen & Air themes, the default Plasma themes during KDE4 times, while being maintained as official themes as part of current Plasma releases, had suffered some small regressions (e.g. the analog clock's hand rotation point or the progressbar height). No more: with Plasma 5.16 the visual experience of Oxygen & Air should be closer to the original.

Luxury on the borders results in bad looks

Sadly there is one thing which is not so easy to fix: Plasma code now assumes for any frames (like the borders of panels & pop-up dialogs) that the actual border visuals are negligible, so the margin from the outside to the content is the same whether the border on a side is enabled or not. With luxuriously decorated borders (like Fluffy Bunny or Spoons Original) this sadly results in bad-looking, bigger margins on sides where the border is disabled (see details). Which is sad especially given that high-DPI displays would actually allow for more fanciness, even more so when, in the future, Flat design gets replaced by whatever new trend comes along. My passive dreams & hopes are pinned on Plasma 6 🙂

Separating Plasma 5 compatible themes on store.kde.org

And while there is now a list of things to do to make old themes work again, many themes on store.kde.org are no longer maintained. There is talk about separating those out, e.g. at the time of the Plasma 5.16 release, so there would be a clean list of working and maintained themes for the latest Plasma 5.


Like every year, a number of KDE PIM developers met in Toulouse for a bit of bugfixing.

Discussions and decisions
There were a number of those, the most important ones being about food, of course.
Among the topics of lesser importance: turning some PIM libraries into KF5 frameworks (that's just their way to dump more work on me, clearly... but it also means a lot of cleanup work for Volker, first), outreach to the Plasma Mobile PIM team, how to increase the number of attendees for this kind of sprint, how to make it easier to start contributing to KDE PIM, how to blog more often about progress.

In terms of the actual work done, the list is quite long, here is my view on things.

Moving an event from one calendar to another is now possible in KOrganizer (the combobox in the editor dialog was disabled due to an old bug; it all seems to work now).

Library cleanups (so they can become frameworks)
Most of this happened in KCalCore and KContacts. The plan is to submit them for inclusion into KF5 after the 19.08 release (Sep/Oct 2019). Other libs were discussed; a longer and more specific sprint would be needed to untangle some of the dependencies and do a major cleanup of those libraries.

Deadlock handling
Thanks to Dan's insights, I was able to greatly improve the handling of unavoidable database deadlocks. It's a feature of all database implementations: when multiple transactions happen at the same time and end up locking the same rows, the database forcefully rolls back one of the transactions and asks the application to retry. But retrying doesn't mean running the exact same SQL statements (as the code was doing). If we retry inserting into a table, we might get a different auto-increment ID than the first time, so subsequent queries (which might use this ID) have to be adjusted. Therefore we now rerun the whole handling of the current command at the C++ level instead.
The new "retry" logic has its limits though: some transactions are currently created from another process (in ItemSync). Dan started to redesign syncing so that it happens more on the server side and we don't need those TransactionJobs anymore. To be continued...
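The retry pattern described above can be sketched in a few lines (plain Python, purely illustrative; Akonadi does this at the C++ level with its real command handlers):

```python
# On a deadlock the *whole* command handler is re-run, so that
# auto-increment IDs and any queries depending on them are recomputed
# afresh, instead of replaying the exact same SQL statements.
class DeadlockError(Exception):
    """Raised when the database rolled back our transaction."""

def run_command(handler, max_attempts=5):
    for attempt in range(max_attempts):
        try:
            return handler()  # re-runs the full command handling
        except DeadlockError:
            if attempt == max_attempts - 1:
                raise  # give up after too many forced rollbacks
```

The key design choice is that `handler` encapsulates everything from parsing the command to issuing the SQL, so a retry naturally picks up fresh IDs.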

Multiple merge candidates
Despite a lot of head scratching and digging through possible scenarios, we still don't know exactly how this error can happen.
But at least when it does happen, the sync no longer gets stuck forever.

Many thanks to all our sponsors and donors for making such sprints possible, and I hope we do this again, with more people and fewer critical bugs :-)

Just three days before the event, we already have the programme for the Maratón Linuxero Flisol 2019 edition, which is sure to give us some good hours of shared knowledge and free projects.

Programa del Maratón linuxero edición Flisol 2019

As I mentioned a few days ago, on April 20, starting at 23:00 Spanish peninsular time, a new edition of the Maratón Linuxero kicks off, an edition linked to another great free event: Flisol 2019.

On this occasion, the members of the Maratón Linuxero wanted to take part in Flisol 2019 by producing a special edition of their podcast offering 3 hours of content.


The first two hours of the Maratón Linuxero will feature seven guests, all of them FLISOL organizers in different countries, who will not only talk about the event but also tell us about the free projects they participate in.

The last hour will be devoted to the traditional round table of the Maratón Linuxero team, where its organizers will delight us with their wisdom.

By the way, raffles will also be held in this edition: four packs (one per sponsor: Neodigit, Slimbook, VantPC, LibreBit, Linux Español), awarded for sending a short audio clip answering "Why is it important to organize a FLISoL?" to maratonlinuxero@disroot.org.


Qt 5.12.3, the third patch release of Qt 5.12 LTS, is released today. While not adding new features, the Qt 5.12.3 release provides a number of bug fixes, as well as performance and other improvements.

Compared to Qt 5.12.2, the new Qt 5.12.3 provides almost 200 bug fixes. For details of the most important changes, please check the Change files of Qt 5.12.3.

Qt 5.12 LTS will receive many more patch releases throughout the coming years, and we recommend that all actively developed projects migrate to Qt 5.12 LTS. Qt 5.9 LTS is currently in the ‘Strict’ phase and receives only selected important bug and security fixes, while Qt 5.12 LTS currently receives all bug fixes. Qt 5.6 support ended in March 2019, so all active projects still using Qt 5.6 LTS should migrate to a later version of Qt.

Qt 5.12.3 is available via the maintenance tool of the online installer. For new installations, please download the latest online installer from the Qt Account portal or from the qt.io Download page. Offline packages are available for commercial users in the Qt Account portal and at the qt.io Download page for open-source users. You can also try out the commercial evaluation option from the qt.io Download page.

The post Qt 5.12.3 Released appeared first on Qt Blog.

April 16, 2019

As you can read in the official Creator 4.9.0 release announcement, Qt Creator now uses the KSyntaxHighlighting Framework for providing the generic highlighting.

This is a nice step for the wider adoption of this MIT licensed part of the KDE Frameworks.

And this is not just a one-way consumption of our work.

The framework actively received patches back that make it more usable for other consumers, too, like Kate ;=)

If you want concrete examples, take a look at:

I hope this cooperation will continue in the future. I thank the people working on Qt Creator who made this integration possible. I hope the initial effort will pay off with less code for them to maintain on their own and more improvements to the framework for all users.

Last week I flew to Brussels for EuroLLVM followed by Bristol for ACCU.

At both conferences I presented the work I’ve been doing to make it easier for regular C++ programmers to perform ‘mechanical’ bespoke refactoring using the clang ASTMatchers tooling. Each talk was prepared specifically for the particular audience at that conference, but both were very well received. The features I am working on require changes to the upstream Clang APIs in order to enable modern tooling, so I was traveling to EuroLLVM to try to build some buy-in and desire for those features.

I previously delivered a talk on the same topic about AST Matchers at code::dive 2018. This week I presented updates to the tools and features that I have worked on during the 6 months since.

One of the new features I presented is a method of debugging AST Matchers.

Part of the workflow of using AST Matchers is an iterative development process. For example, the developer wishes to find functions matching a particular pattern, and creates an ever-more-complex matcher to find all desired cases without false positives. As the matcher becomes more complex, it becomes difficult to determine why a particular function is not found as desired.

The debugger features I wrote for AST Matchers intend to solve that problem. It is now possible to create, remove and list breakpoints, and then enable debugger output to visualize the result of attempting to match at each location. A simple example of that is shown here.

When using a larger matcher it becomes obvious that the process of matching is short-circuited, meaning that the vertically-last negative match result is the cause of the overall failure to match the desired location. The typical workflow with the debugger is to insert break points on particular lines, and then remove surplus breakpoints which do not contribute useful output.

This feature is enabled by a new interface in the Clang AST Matchers, but the interface is also rich enough to implement some profiling of AST Matchers in the form of a hit counter.

Some matchers (and matcher sub-trees) are slower/more expensive to run than others. For example, running a matcher like `matchesName` on every AST node in a translation unit requires creation of a regular expression object, and comparing the name of each AST node with the regular expression. That may result in slower runtime than trimming the search tree by checking a parameter count first, for example.

Of course, the hit counter does not include timing output, but can give an indication of what might be relevant to change. Comparison of different trees of matchers can then be completed with a full clang-tidy check.
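The idea can be illustrated outside of Clang. The following Python sketch (hypothetical, not the actual AST Matchers API — the matcher names are just borrowed for flavor) shows how a hit counter can piggy-back on short-circuiting matcher composition, and why ordering a cheap check before an expensive one reduces how often the expensive one runs:

```python
from collections import Counter

hits = Counter()

def counted(name, predicate):
    # Wrap a predicate so every invocation is recorded in the hit counter.
    def matcher(node):
        hits[name] += 1
        return predicate(node)
    return matcher

def all_of(*matchers):
    # Like allOf() in AST Matchers: evaluation short-circuits on the
    # first sub-matcher that fails.
    def matcher(node):
        return all(m(node) for m in matchers)
    return matcher

# Hypothetical "AST nodes": (function name, parameter count).
nodes = [("main", 0), ("helperA", 2), ("helperB", 2), ("x", 5)]

cheap = counted("parameterCountIs(2)", lambda n: n[1] == 2)
expensive = counted("matchesName(helper.*)", lambda n: n[0].startswith("helper"))

# Putting the cheap check first means the expensive one runs less often.
matches = [n for n in nodes if all_of(cheap, expensive)(n)]
print(matches)     # the two helper functions
print(dict(hits))  # the expensive matcher ran only on nodes passing the cheap one
```

With the order reversed, the `matchesName`-style predicate would run on all four nodes instead of two; the hit counter makes exactly this kind of difference visible.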

There is much more to say about both conferences and the tools that I demoed there, but that will be for a future blog post. I hope this tool is useful and helps discover and debug AST Matchers!

April 15, 2019

Today I ran into an old problem again: you need to archive a lot of small and large files inside a single Git repository and have no Git LFS support available. You have done this for several years and now ended up in a state where cloning and working with the repository is unbearably slow.

What now? Last time I ran into this, I archived the overfull repository to some “rest in peace” space and used git filter-branch to filter out no-longer-needed and too-large objects from a repository copy that then replaced the old one for daily use.

There are a lot of guides available on how to use git filter-branch for that. All the variants I ever used were complex to do and took very long, especially if you need several tries to arrive at a sane set of things to remove to gain enough space savings.

This time, I searched once more and stumbled on the BFG Repo-Cleaner. And yes, it does what it promises on its web site, and it seems trusted enough to be recommended by e.g. GitHub, too.

Just following the steps described on their landing page lets you shrink your repository nicely and without a lot of round-trip time.

If you are still in the “experimenting” phase, trying to see which space savings you can achieve with which file size filter (or which files you want to purge by removing them from master before running the tool), I recommend swapping the step

git reflog expire --expire=now --all && git gc --prune=now --aggressive

with just

git reflog expire --expire=now --all && git gc --prune=now

so that you don't wait potentially hours for the aggressive GC. For me that was good enough to get an estimate of the final size for my experiments before I settled on final settings and did the real run.

And as always, if you touch your Git history: do it only if you really need to, keep backups, check carefully that the repository is in a sensible state afterwards (git fsck --strict is your friend) and inform everybody using the repository that they will need to do a fresh full clone.

Could you tell us something about yourself?

I’m just someone who likes to create all the things! I actually have a focus in 3D, but constantly utilize my illustrations to help me concept 3D pieces. Oh, and if you don’t want to call me Insane you can call me Missy.

Do you paint professionally, as a hobby artist, or both?

I suppose it depends on what someone defines as ‘hobby’. My job requires use of 3D knowledge, but my painting skills come in handy with material creation or editing some of them. Outside of my day job I create illustrations to sell online. I think it’s more of a side gig.

What genre(s) do you work in?

I really like creatures and monsters, non human stuff. Lately I’ve been on a bit of a fantasy kick, but I don’t think I can really settle on just one genre. I like to experiment with them all!

Whose work inspires you most — who are your role models as an artist?

This one is hard. I don’t have any specific role model, I just like certain aspects of other artists whether it be how they approach a specific subject, or portray something in their art. I keep a good healthy mix of artists I follow who inspire me. A few would be Chris Sanders, Danny Mac, Madeleine-Scott Spencer, Bruce Timm, Creature Box…the list goes on.

How and when did you get to try digital painting for the first time?

I actually don’t know why or what started it. Probably back in the early days of deviantart or the oekaki boards. I always had sketchbooks to scan in and post my art, but when I saw digital art, something drew me to it. Seeing a movie in 3D probably really cemented digital art for me, and I always preferred being on a computer, so I think at some point just before high school it all just clicked together that this was the start of my digital painting experience.

What makes you choose digital over traditional painting?

For just me, I think it saves on paper and waste (of course if someone prefers traditional I wouldn’t stop them!). I have used oils, acrylics and watercolor, but the supplies can get expensive and I’ve moved around too much to really justify the space it takes up. It also meant my younger siblings couldn’t scribble on my traditional art.

How did you find out about Krita?

I never used Photoshop to paint, I used Corel Painter. My version was becoming outdated and the new version was out of my price range. I tried looking into free alternatives, and I don’t remember how, but I think it just magically appeared on my computer one day, opened itself and told me to use it.

What was your first impression?


What do you love about Krita?

It’s open-source. It lets me paint however I want. It’s built by a community. It HAS a community. It’s always improving and fixing bugs.

What do you think needs improvement in Krita? Is there anything that really annoys you?

I think a lot of improvements just revolve around fixing bugs. Except text. It’s made leaps and bounds but I would love to see continued improvement with that tool. Oh, and recent documents doesn’t actually show ‘recent’, like the last file I opened all the time. But it could also be user error on my part. ¯\_(ツ)_/¯

What sets Krita apart from the other tools that you use?

Programs I truly support revolve around one thing: community. When a software package has developers that listen to its users and works with them, it really adds value to that software. I’ve watched Krita grow and I only see it getting bigger and better.

If you had to pick one favourite of all your work done in Krita so far, what would it be, and why?

As of late probably my Celestial Goldfish. I watch a lot, and I mean A LOT, of fish hobby videos. A goldfish seller posted a photo of that cute little guy and naturally I had to take it and exaggerate it. I don’t even own goldfish! If anyone ever needs a weird expression to inspire them goldfish have a million of them.

What techniques and brushes did you use in it?

I love the ink brushes. I tweaked one of the defaults to meet my needs and preferences. The watercolor brushes are becoming a favorite, and I’m hoping to use them in a large illustration soon!

As for techniques, a lot has changed with 4.0. I use the color mask editing tool in most of my work now,

Where can people see more of your work?

I think my instagram has more finished pieces, but if you don’t mind being spammed with twenty WIP images, random updates on my life and the occasional The Office gif, you can see it all on my Twitter, too.
All the links can be found on my website: justcallmeinsane.com
For quick access to my Krita tutorials: https://www.youtube.com/c/justcallmeinsane

Anything else you’d like to share?

Every time you draw something you’ve never drawn before, it’s like learning how to hold a pencil all over again. Don’t stop and keep going.

April 14, 2019

Changing routines to stay productive

As a remote worker, you need to find ways to keep productivity levels high. No matter how exciting your work is, there are times when you struggle to keep up the pace. Looking back at my performance during the last couple of weeks of January and the first few days of February, I discovered that I was getting into a productivity valley, something that had never happened to me after coming back from a couple of weeks of vacation. I decided to do something about it before the issue had any impact on my overall performance.

  • I have a little place back on my home island, La Palma, Canary Islands, Spain, where I can work comfortably. So when I feel I am not being very productive at home, I move there for a couple of weeks. It works well for me. This time, though, I decided to try something different.
  • I go to the office in Manchester, UK several times a year. Those weeks are intense and change whatever dynamic I am in. I had already spent a few days there early in February, but I decided to go back again the last week of March.
  • Once in a while, especially on Fridays, I go to coworking spaces in Málaga, especially when I have been at home over three weeks in a row. This past March I decided to join The Living Room coworking space in Málaga. My idea is to work there once or twice a week over the coming three months. So far the experience is positive.
  • The first week of March, after the Embedded World 2019 and the CIP Technical Steering Committee face to face meeting in Nürnberg, I went to Prague for a week. I rented an apartment there and a hot desk at Impact Hub coworking space. The trip gave me the opportunity to break the routine while enjoying some of the many activities that the city offers. The classical music, opera, jazz and blues scene in Prague is rich. There are plenty of theaters and clubs to go to every night. Live music is one of my passions and Prague is perfect for it.

Impact Hub Prague is a big and busy coworking space with great facilities. It is maybe too crowded for people who have concentration difficulties or many video chats, but very good for entrepreneurs and freelancers. Those who have a fixed desk there work in quieter areas. People were very friendly and I had a productive week.

I am in the process of slightly changing some well-established routines when working from home, like starting a little earlier, shortening lunch time and trying to finish a little earlier. I might also increase the amount of time I work standing up.

It seems that all these measures are working so far although I will need a few more weeks to confirm this impression.

Scale Summit 2019

On Friday 29th March I attended Scale Summit 2019 in London, UK, an unconference run under the Chatham House Rule that brings together “professionals from the operations and software development communities who have a particular interest in scalable, high performance systems”.

It is a fantastic event, one of the best I have been to. This is the second time I have attended and it will not be the last. The quality of the discussions is very high and people go there to learn rather than to shine, sharing experiences and asking questions.

FOSS North 2019

I was invited by the FOSS North organizers to give a talk on Tuesday 9th April. This is a two-day, two-track event with 260 participants that takes place in Gothenburg, Sweden. This was the fourth edition; the next one, at the same venue, will take place on March 30th and 31st, 2020.

It was my first time at this event and my first time in Gothenburg. FOSS North is well organized and vibrant, it takes place in good facilities, there were great speakers (Adriaan de Groot, Chris Simmonds, Mirko Boehm, Molly de Blanc, Michael Kerrisk, Chris Lamb, etc.), some interesting talks, the food was great… and I liked Gothenburg. Videos of every session will be available. The speakers dinner was fun and interesting, which is not always the case.

Thanks Johan for the invitation and thanks Codethink Ltd for sponsoring my trip there.

I would recommend you watch Mirko Boehm’s keynote “Open Source, Standards Development and Patents in Europe“. I found especially interesting the view he provided of Open Source projects from the standardization bodies’ point of view. It is always good to receive criticism through somebody else’s eyes. Especially controversial was the point about meritocracy.

I heard for the very first time about Property-based Testing, which is an alternative approach to Example-based Testing (like Unit Testing). I believe it is a kind of structured way of doing Fuzz Testing. I will read more about this topic because I liked it. I initially find it especially useful for regulated environments (Contract Driven Development).

I got a good amount of feedback about my talk, which had a couple of slides some found controversial. I will deliver this talk again (in Spanish) this coming May at OpenSouthCode. Some of the comments will help me improve the talk, which is always a good thing. I will link the video in the Conference section of this site as soon as it is available. Meanwhile, I have already published the slides.

As part of the Community Day, on Sunday 7th, I attended the KDE workshop run by Adriaan de Groot. I took with me my RPi3 and RPi screen with Plasma Mobile and my 10″ Lenovo tablet with openSUSE Tumbleweed and Plasma. Adriaan brought his 7″ device, also with Plasma, so the room looked like an embedded-oriented workshop.

There were several other activities from different communities in different locations across Gothenburg that same Sunday, with several companies and non-profit organizations hosting these workshops. This warm-up activity was considered a success, so it will take place again in the 2020 edition. There were a few sponsor booths at the event; most of the companies present were recruiting.

If you are in Scandinavia, consider attending this event next year. It is a good one.

In May I will attend OpenSouthCode and J On The Beach, both in the Málaga area. See you there!

Latte Dock v0.8.8 has been released, containing important fixes and improvements!

Go get v0.8.8 from download.kde.org or store.kde.org*

* archive has been signed with gpg key: 325E 97C3 2E60 1F5D 4EAD CF3A 5599 9050 A2D9 110E

  • multi-screen fix: properly unload explicit screen docks when their screen is no longer available


You can find Latte at Liberapay if you want to support it: Donate using Liberapay

or you can split your donation between my active projects in kde store.

Last month, I attended FOSSASIA’s annual conference, held in Singapore. The conference was a collection of amazing, life-changing experiences: it was my first experience as a speaker, and it taught me so much about open-source culture. The summit took place from 14th to 17th March in the beautiful city of Singapore. It was my second foreign trip as well; the first was to San Francisco as part of the Student Startup Exposure Program.

My flight was scheduled for 12th March from Jaipur, with a five-hour layover at Chennai. I reached Changi Airport in the early morning of the 13th. The airport is quite scenic and is ranked as the top airport in the world.


Changi Airport has fairly good transport connectivity; I took the MRT to Bendemeer, where I was going to spend the next few days. I stayed at a hostel located in the heart of the city. It was the first time I was staying in a shared room, with 12 others from different countries. Luckily, I found some other participants and speakers of the FOSSASIA Summit staying in the same hostel (I was not the only techie here, yay!). It was a leisure day, so I planned to go sightseeing around the city with my new friends. We went to Marina Bay, the Night Safari and Gardens by the Bay. Being a vegetarian was quite hard there, as I could barely find vegetarian food around; there was a lot of seafood available. It was a hot day, and we finally got back to the hostel.


The next day, I had breakfast early and reached the Lifelong Learning Institute, where the summit was organized. There were many talks on various tracks from experts. I was most interested in the Cloud, Containers, DevOps and Science and Education tracks. There were also quick lightning talks on a variety of topics.


The next day was similar to the first. I spent a lot of time interacting with developers from around the world and got to know about many new technologies and interesting projects. Another thing I love about these conferences is the goodies and swag; I got a lot of goodies from various open source organizations. And finally, I found some vegetarian food to eat. In the evening, we decided to explore the city more and went to Sentosa Island. The transport facilities were really great; the whole city is well connected. After some fun and adventures, we finally came back to the hostel.


The next day, there were talks on Artificial Intelligence and Web Technologies, and I attended a few of them. A UNESCO Hackathon, whose theme was indigenous languages, was also organized, and I participated in it. We formed a team of five and built a crowd-sourced corpus generator for translating indigenous-language sentences into English.

It was day 3 of the conference and I was very excited, as I was going to deliver my talk “GCompris – The open source Educational suite”. I ended up speaking faster than I had planned, leaving more than enough time for a Q&A round. My previous contribution to the project as a Google Summer of Code student helped me answer questions about the tech stack and how to contribute. I received some suggestions for improving the current activities and met someone who was interested in translating the activities into Thai. I also met Shantanu Tushar and Sinny Kumari, who are KDE contributors.


After the talk, I interacted with many more people, and it was a great overall experience for me. Later, we got back to working on our hackathon project and made good progress. After a lot of hacking and interaction, we went to the market for some shopping, street exploring and dinner.

The best part of our hostel was the rooftop, where I used to go at night to interact with people from all around the world who were staying there. I made some new friends from Canada, Russia and Singapore. I also met a person from India who was working as an agile DevOps coach in Singapore. He took me to the nearest Gurudwara, where I had dinner.

It was the last day of the conference, and I was also excited about the results of the hackathon. To our surprise, we won three out of five categories. We got first place in the AI Dev and Audience Attraction categories and third place in Cloud Tech (Yayyyy!!). It was a great moment of joy for us. We got some electronic gadgets, including a Bluetooth speaker, earphones and badges. We also got SUSI.ai and PSLab devices, which are FOSSASIA’s open source hardware projects.


After four days of amazing experiences and involvement with open source, the conference finally came to an end with its closing ceremony.

My flight back home was on 19th March. The first leg was from Singapore to Bangkok, and after a two-and-a-half-hour layover, the final leg was to Jaipur. I landed in Jaipur with mixed emotions: fresh memories of Singapore and pending college assignments that I was going to do as soon as I got home.

The FOSSASIA Summit was a great trip and learning experience for me. I hope I can make more trips like this in the future!

At the end of last month I attended the KDE privacy goal sprint in Leipzig. Together with Sandro I continued to look into tooling for identifying and fixing insecure HTTP links, an issue I have written about earlier. The result of this can be found in D19996.

The first tool we built is httpcheck, a scanner for http: URLs in whatever files you point it to. This is optimized for high speed and therefore doesn’t do any online validation etc.

Obviously something like this can never be perfect, so it has a few features to deal with the common problems we encounter:

  • There is a global exclusion list for known URIs, such as those used in XML namespaces, where the http: is part of the identifier and is not resolved as a network address (see also the last post on this issue).
  • There is an exclusion list for services known not to support transport encryption (yes, those still exist in 2019), as well as for URLs that would just produce an unmanageable amount of warning noise for now (mainly the gnu.org addresses commonly found in license headers).
  • Like other code checkers, this supports inline and per-module overrides to suppress warnings. It is, for example, quite important not to touch code that deals with adjusting http: to https: URLs and therefore might validly contain parts of what looks like an insecure URL.
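To make the above concrete, here is a minimal Python sketch of such a scanner; the regular expression, the exclusion entries and the override marker are all hypothetical illustrations, not the actual httpcheck code or its real configuration:

```python
import re

# Hypothetical sketch, not the actual httpcheck implementation.
HTTP_URL = re.compile(r"http://[^\s\"'<>)]+")

# Known identifier-style URIs (e.g. XML namespaces) that must stay http:.
EXCLUSIONS = [
    "http://www.w3.org/",
    "http://www.gnu.org/licenses/",
]

# Inline suppression marker (name assumed for this sketch).
OVERRIDE_MARKER = "check:exclude=insecurenet"

def find_insecure_urls(text):
    """Return (line number, URL) pairs for unexcluded http: URLs."""
    warnings = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        if OVERRIDE_MARKER in line:
            continue  # explicitly suppressed, e.g. code rewriting http: to https:
        for match in HTTP_URL.finditer(line):
            url = match.group(0)
            if any(url.startswith(prefix) for prefix in EXCLUSIONS):
                continue  # identifier-style URI, not a network address
            warnings.append((lineno, url))
    return warnings

sample = """\
See http://example.org/docs for details.
xmlns="http://www.w3.org/2000/svg"
url.replace("http://old.example", new)  # check:exclude=insecurenet
"""
print(find_insecure_urls(sample))  # [(1, 'http://example.org/docs')]
```

Only the first line is reported: the XML namespace hits the exclusion list, and the rewrite line is suppressed by its inline override, which matches the behavior described in the bullet points above.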

A tool like this is mainly useful to prevent new issues from being introduced, and there are two ideas on how to deploy it:

  • As a unit test injected by ECM into all projects (as it’s currently done for the appstream test, and that’s also why the code for this is in ECM).
  • As a commit hook, similar to the license checks run at commit time.

Before rolling this out we need to fix the current code base first, though, so we don't drown in warnings and test failures.

And that brings us to the second tool, httpupdate, which is meant to automate the migration to https: URLs. It consumes the same overrides and exclusion list as httpcheck, so it won't touch anything explicitly marked as intentionally using http:. It also doesn't simply replace http: with https:; it first validates that the corresponding service actually supports secure connections.

A side-effect of this is that it also identifies dead links or no longer existing services, and therefore helps to maintain e.g. our documentation.

Of course this is also imperfect and the result always needs manual review, but it nevertheless massively speeds up the process compared to doing all of this manually.
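To give an idea of how a tool like that can work (this is a hypothetical Python sketch, not the actual httpupdate code), the rewrite step can be separated from the reachability check; a real implementation would perform the check with an HTTP library, while this sketch takes it as a parameter:

```python
import re

def upgrade_links(text, https_ok):
    """Rewrite http:// URLs to https:// whenever https_ok(url) confirms
    the secure variant actually answers; keep the rest for manual review."""
    def replace(match):
        url = match.group(0)
        secure = "https://" + url[len("http://"):]
        return secure if https_ok(secure) else url
    return re.sub(r"http://[^\s\"'<>)]+", replace, text)

# Stub reachability check for the example: pretend only kde.org speaks TLS.
stub = lambda url: url.startswith("https://kde.org")
result = upgrade_links("see http://kde.org/a and http://legacy.example/b", stub)
print(result)  # see https://kde.org/a and http://legacy.example/b
```

Because the check probes each candidate service, a failing `https_ok` (or a failing plain-HTTP probe added next to it) is also where dead links and vanished services surface as a side effect.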


How much does this help with the overall privacy of our users, though? How often do you click on links in the documentation or CMake output, let alone in license headers? And even then, don't HSTS-enabled browsers and properly configured web servers redirect to secure connections anyway? In most cases this is probably true, and the practical impact is limited.

However, during test runs of the tools at the sprint we found two possible data leaks this way (one in the use of a URL shortening service, one for a pastebin service), among hundreds of probably less impactful insecure links. So I think this is worth it, even if it just helps us spot a potential high-impact issue among the many harmless ones.


As mentioned above, before it makes sense to roll out the continuous checks for this, we need to fix the current state. That means going through all repositories to see what these tools find, fixing things, and improving the tools and their exclusion lists along the way. So there's plenty of opportunity to help :)

Here’s week 66 in KDE’s Usability & Productivity initiative, and some major features have landed! Check it out:

New Features

Bugfixes & Performance Improvements

User Interface Improvements

Finally I’d like to give a shout-out to KDE developer Matthieu Gallien, who is developing KDE’s next-generation music player Elisa, which is shaping up nicely. I’m close to using it as my daily driver. Give it a try via KDE’s nightly Flatpak builds, or compile it from source.

Eagle-eyed readers will no doubt notice many visual papercuts, but they’re being worked on!

Next week, your name could be in this list! Not sure how? Just ask! I’ve helped mentor a number of new contributors recently and I’d love to help you, too! You can also check out https://community.kde.org/Get_Involved, and find out how you can help be a part of something that really matters. You don’t have to already be a programmer. I wasn’t when I got started. Try it, you’ll like it! We don’t bite!

If you find KDE software useful, consider making a donation to the KDE e.V. foundation.

Dear digiKam fans and users, after the first digiKam 6 release published in February 2019, we received lots of user feedback consolidating this important stage of the project, started 2 years ago. We are proud to quickly announce the new digiKam 6.1.0, with plenty of new features and fixes. The New Plugins Interface “DPlugins”: with 6.0.0, the digiKam project had already left KIPI plugins support behind and embedded all tools as native solutions, without easily permitting customization and extension of the application.

April 12, 2019

During my web site upgrade, I reviewed the old stuff I had hosted on my long-gone web sites but still archived here locally. An interesting thing I stumbled on is the set of KDE 3 -> 4 porting screenshots of Kate I saved in 2005.

They actually show pretty nicely how far we have come since 2005 with our development stack.

The KDE 3 -> 4 transition was a large hassle. It took weeks of work just to get Kate back into a usable state.

We first started with a trivial KTextEditor container (a mini KWrite) for the porting, to get KTextEditor doing anything at all. That started out as something that didn't even have working menus or toolbars:

This evolved after two days into a kind of working KWrite port (icons still randomly missing):

After the KTextEditor part worked “reasonably”, we started with Kate:

And ended up with an initial ported prototype three days later:

And after that, a long period of actual polishing for KDE 4.0 started. Kate was actually one of the first applications ported during the KDE 3 -> 4 transition.

The KDE 4 -> KF5 transition was much nicer, even though the Frameworks split cost a lot of time and resources. But the actual changes to the application code bases were not that radical.

And where did we end up ~14 years later? Actually, just here:

It is still recognizably the same application, though I hope some progress is visible :=)

April 11, 2019

Elisa is a music player developed by the KDE community that strives to be simple and nice to use. We also recognize that we need a flexible product to account for the different workflows and use-cases of our users.

We focus on a very good integration with the Plasma desktop of the KDE community without compromising the support for other platforms (other Linux desktop environments, Windows and Android).

We are creating a reliable product that is a joy to use and respects our users privacy. As such, we will prefer to support online services where users are in control of their data.

I have been quiet for some months but during those months, Elisa has seen many improvements by existing and new contributors and a new stable version is planned in the coming weeks.

I will publish some blog posts about the many new features implemented in the master branch.

Support for libVLC

Today, I would like to talk about the added support for playing audio through libVLC.

One of my goals has been to offer the best possible first impression when starting Elisa. With the current stable version, Flatpak builds and Windows builds are not able to support many audio formats. This makes for a pretty bad first experience.

The big advantage of using libVLC (apart from its nice and easy-to-use API) is the ability to easily bundle support for many audio formats. Thanks a lot to the VideoLAN project for their work.

Support for progress bar on Plasma Desktop taskbar entries

Elisa has long had support for showing track progress when playing on Windows platform.

The next stable version will also be able to show progress when running inside Plasma workspace.

Progress bar on Plasma workspace

Improved party mode

Elisa has featured a party mode since the 0.3 release. It now also features a simplified playlist view.

Improved party mode

In party mode, one can now switch to any track just by clicking on it. This is especially nice for laptops with touchscreens: anybody can quickly switch to a specific track with a simple touch.


Quite a few other features are already done. They will be the subject of other posts.

There are still some features under review for the next stable release. This is the main reason no firm date is currently set for the next release.

I would like to be able to make the best possible release. In order to do that, feedback would be very welcome.

The easiest way to do that is by using the Flatpak or Windows installers produced by the KDE continuous build servers. I know that some distributions also have packages built on top of Git.

In the course of its twenty-two years of existence, the KDE community has created an impressive amount of documentation, from user manuals to internal project discussions to developer tutorials. Like any fast-growing garden, there is a tendency for some things to get a bit out of hand when left unattended for some time. It can be easy to get lost in the outgrowth, especially for first time visitors. It’s time to put on the gloves, take out the tools, and do a bit of gardening!

(Documentation Konqi courtesy of the amazing Tyson Tan)

Last month, I had the privilege of coming on board as a documentation specialist to take a closer look at KDE’s developer documentation and to later come up with strategies to make them better than ever. I have talked with some of the community’s developers to get their feedback on some of the areas that need updating or fixing when it comes to technical documentation. But that’s only one part of the story.

Our dev docs are also meant for new developers. That applies to both new contributors in the KDE community as well as external developers who want to use our software, particularly our excellent KDE Frameworks. In that regard, we’re also looking for feedback on the areas where interested budding rockstar coders and passionate KDE contributors are having trouble getting into the community.

Having clear, up-to-date, and relevant documentation goes a long way in encouraging and helping new developers get involved with the community with as little friction as possible. It even helps those already manning the ship become familiar with other parts of our software they may not have used before. I would love to hear some thoughts and suggestions, especially from interested KDE hackers, so let me know in the comments below.


We are happy to announce the new Qt Installer release, based on the Installer Framework 3.1. The main reason for a new installer was that we wanted to provide a more intuitive and streamlined user experience.

We simplified the component tree view by introducing package categories. This has significantly improved the metadata download performance, as there is no need to load the metadata of all the packages anymore. In addition, it is more intuitive for new users to pick the right packages instead of selecting everything just in case.

The updated Qt Installer contains four package categories:

  • Latest releases shows the latest supported releases, e.g. Qt 5.12.2. Additional releases may be visible if Qt products have dependencies on them.
  • Preview shows the latest unofficial release previews, including snapshots, alpha, beta, and RC releases.
  • Archive contains all supported releases.
  • LTS contains the latest Long-Term Support releases, such as Qt 5.12.2 and 5.9.7.


Qt Installer no longer contains unsupported releases. This reduces the download size of the metadata even further, which makes the installation process faster and smoother. Another reason why we removed the unsupported versions is that unsupported releases have known security issues that will remain unaddressed. Thus, we no longer recommend using them. Lastly, having fewer releases available significantly lowers the amount of data our mirrors need to store.

The older versions of Qt have not completely disappeared. You can still find them in the archive on http://download.qt.io, where we keep a full history of all our releases. In addition, commercial users will still be able to see old releases in the Qt Installer, as they can purchase extended support.

The Qt Installer is not yet perfect. We are considering improving the login/sign-up experience and adding a default installation option to improve the on-boarding of new users. However, right now we are researching how to separate the installer UI from the component model, as the tight coupling of the UI and the package repository model has made UX improvements challenging.


Updates in Maintenance Tool

Qt Installer’s Maintenance Tool has a few improvements as well. Signing is now enabled on Windows. Note, however, that the old Maintenance Tool itself breaks the signing: if you update from Qt Maintenance Tool 3.0.6 or older to 3.1, the tool will still lack signing. After you have updated to 3.1, future Maintenance Tool updates will include signing.

In addition, we have enabled the use of Qt Installer and Maintenance Tool even if there are some broken components. Previously, “Cannot find missing dependency..” or “SHA mismatch detected..” errors aborted the installation. In the new version, you can continue the installation despite some broken components. You will notice that these broken components are greyed out in the component tree and cannot be selected for installation.

Feeling curious? Try it out for yourself here.

The post Updated Qt Installer Released appeared first on Qt Blog.

It’s April already… We’re long overdue for a development update, especially since we haven’t had a new release for quite some time.

The reason for both those facts is that our maintainer has had some health issues since December that seriously slowed down not just his part of the development work, but also made it really hard to create new releases — all releases are still prepared by one person, and if that person temporarily loses the use of one arm, things that should happen, don’t happen. The arm is back in action, but that wasn’t the only issue, and things are still a bit slowed down.

Apart from that, we’re making quite good progress towards our next release, which should happen next month.

In the first place, we’ve got a new full-time developer! Tiar, who is well known in Krita’s Reddit community, graduated from university just when the increased income from Steam made it possible to hire someone to help out with this year’s big goal: bug fixing! Tiar started March 1st, and has already fixed more than a dozen tough bugs. Krita now finally has a real Nearest Neighbour scaling method, for instance.

We had an intermezzo in between fixing bugs: the HDR functionality. This was mainly developed by Dmitry, in cooperation with Intel and gives Krita unique possibilities: there is no other application where you can create HDR images and actually see what you’re doing.

Since we finished that feature, though, we’ve been doing nothing but triaging and fixing bugs.

Not that all bugs are our own doing: we updated our builds to use Qt 5.12, the new LTS release of our development platform. And unfortunately, the ride has not been smooth. We’ve had problems with tablet support, with painting images and with Python support. And more. We now have a host of patches in Qt’s review system, patches we apply to Qt when building our binaries.

These patches are so essential that as soon as a Linux distribution moves to Qt 5.12, their version of Krita is utterly broken, no matter whether it’s the last stable release for Linux (4.1.18) or a build from our master branch. If you’re using Krita on Linux, you have to use the appimage.

But many bugs are our own fault, of course. It’s not possible to develop software and not break things. Sometimes software grows almost organically, to the breaking point. That’s the case with Krita’s resources system. Resources are things like brushes and gradients. We’ve always had bugs creating, modifying, saving, deleting resources or tags. So we started a big project to improve the situation last year. It looks like the project won’t be finished for 4.2, even though our maintainer already has put in several months of work on it. Our new target for the rewritten resources system is the 4.3 release, which we aim to release in September.

If you want to take a sneak peek at what’s coming in 4.2, apart from bug fixes, check out the draft of the release notes.

If you’re using Windows or Linux, you can use the nightly builds (Windows | Linux) to test and help us find more issues before we release. Thanks to Ivan Yossi’s work, we also might have nightly builds for macOS soon.


April 10, 2019

Each month KDAB schedules time for me to maintain the Qt for Android port. Usually I use it to review pending patches or fix bugs, but sometimes there is a quiet month and I have time to add new functionality or improve existing features.

April was quite quiet so I decided to rewrite the Android sensors plugin (no, this is not a late April fool’s joke).

Why on earth did I want to do that? Why try to fix something that works? After all, the current implementation faithfully served us for years!

I decided to rewrite it after I did some performance tests that showed some surprising numbers. When I used the NDK API to get sensor data, the CPU usage dropped from over 60% to less than 20% (it fluctuates from 15% up to 30%)! Yup, that’s a surprising performance improvement, which TBH I hadn’t expected at all!

I expected it to be faster, as the NDK API doesn’t need to move the data to Java and then back to the C/C++ world (via JNI), but I never dreamed it would be that fast!

Are you wondering why we didn’t use the NDK API in the first place? It’s because the sensors NDK API was not introduced until API 16, which means that when we first implemented the sensors API, we were supporting Android from API 9. However, starting with Qt 5.9, the minimum API level was raised to 16, so now I can safely use the NDK API for this job.
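For the curious, the NDK path looks roughly like this. This is a minimal sketch of the NDK sensor API only, not the actual Qt plugin code; error handling is omitted and the polling loop is simplified:

```cpp
#include <android/sensor.h>
#include <android/looper.h>

// Minimal sketch: read accelerometer events directly via the NDK,
// with no JNI round-trip through Java. Not the actual Qt plugin code.
void readAccelerometer()
{
    ASensorManager *manager = ASensorManager_getInstance();
    const ASensor *accel =
        ASensorManager_getDefaultSensor(manager, ASENSOR_TYPE_ACCELEROMETER);

    ALooper *looper = ALooper_prepare(ALOOPER_PREPARE_ALLOW_NON_CALLBACKS);
    ASensorEventQueue *queue =
        ASensorManager_createEventQueue(manager, looper, /*ident=*/1,
                                        nullptr, nullptr);

    ASensorEventQueue_enableSensor(queue, accel);
    ASensorEventQueue_setEventRate(queue, accel, /*usec=*/20000); // ~50 Hz

    ASensorEvent event;
    while (ALooper_pollOnce(/*timeoutMillis=*/-1, nullptr, nullptr, nullptr) >= 0) {
        while (ASensorEventQueue_getEvents(queue, &event, 1) > 0) {
            // event.acceleration.{x,y,z} arrives straight in C++.
        }
    }

    ASensorEventQueue_disableSensor(queue, accel);
    ASensorManager_destroyEventQueue(manager, queue);
}
```

The whole event path stays on the native side, which is exactly why the JNI marshalling cost disappears.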

Here you can find the pending patch. It targets dev branch which will be part of Qt 5.14.

The post Qt sensors just got better on Android appeared first on KDAB.

April 08, 2019

For several years, the kate-editor.org & cullmann.io pages were hosted on a Hetzner root server. To reduce costs and move away from old hardware, they have now been moved to an OpenVZ-based virtual server at Host Europe.

Both servers run CentOS 7.x, which has always provided a stable foundation for the web services.

As with any server move in the past, I always need to search for how best to move the data/config from one server to the other. To document this for myself and others, here is the quick way to move the basic things needed for web services using just plain Apache & MariaDB.

The following steps assume you have installed the same packages on both machines and the new machine is allowed to ssh as root to the old one. If you have non-system users, you should create them with the same IDs as on the old server.

For the following shell commands, the old server address is $SERV and the MariaDB root password is $PASS on both machines. Best to use the raw IP as the address if you are updating your DNS entries in parallel, to avoid confusion (and wrong syncs).

Attention: Wrong syncing of stuff can have disastrous consequences! Check all commands again before executing them, don’t trust random people like me without verification!
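For instance, one might export the two variables once up front so the commands below can be pasted as-is. The values here are placeholders, not my real server details:

```shell
# Hypothetical setup: export the old server's address and the MariaDB
# root password once for the whole session. Placeholder values only!
export SERV=203.0.113.10   # raw IP of the old server (placeholder)
export PASS='s3cret'       # MariaDB root password, same on both machines (placeholder)
echo "will sync from root@$SERV"
```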

  • sync your data, assuming it is in /home and /srv/(ftp/www)

rsync --delete -av root@$SERV:/home/ /home
rsync --delete -av root@$SERV:/srv/ftp /srv
rsync --delete -av root@$SERV:/srv/www /srv

  • transfer your databases

ssh root@$SERV "mysqldump -u root -p$PASS --all-databases > /root/db.sql"
scp root@$SERV:/root/db.sql /root/
mysql -u root -p$PASS < /root/db.sql

  • sync configs (you might need more, this is just apache & vsftp)

rsync --delete -av root@$SERV:/etc/httpd /etc
rsync --delete -av root@$SERV:/etc/letsencrypt /etc
rsync --delete -av root@$SERV:/etc/vsftpd /etc

  • get crontabs over for later re-use, store them in the root home

rsync --delete -av root@$SERV:/var/spool/cron /root

Now everything should be there, and after some service restarts, e.g. WordPress-powered pages should be up and running again.
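After syncing, a quick way to gain confidence that a tree arrived intact is to compare per-file checksums on both sides. A minimal local sketch (demoed with two local directories so it is self-contained; on the real setup the first listing would be produced over ssh on $SERV, and the copy would be the rsync runs above):

```shell
# Build a tiny demo tree and copy it, standing in for the rsync'ed data.
mkdir -p /tmp/old_www /tmp/new_www
echo '<h1>hi</h1>' > /tmp/old_www/index.html
cp -a /tmp/old_www/. /tmp/new_www/

# Checksum every file on both sides, sorted so the lists are comparable.
(cd /tmp/old_www && find . -type f -exec sha256sum {} + | sort) > /tmp/old.sums
(cd /tmp/new_www && find . -type f -exec sha256sum {} + | sort) > /tmp/new.sums

# diff is silent and exits 0 when the trees match.
diff /tmp/old.sums /tmp/new.sums && echo 'trees match'
```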

I hope this short how-to helps others and allows me to avoid searching stuff in the future once again from scratch.

Older blog entries

Planet KDE is made from the blogs of KDE's contributors. The opinions it contains are those of the contributor. This site is powered by Rawdog and Rawdog RSS. Feed readers can read Planet KDE with RSS, FOAF or OPML.