
Wednesday, 31 October 2018

Dear digiKam fans and users, following the first beta release published in September, we are proud to announce the second beta of digiKam 6.0.0.

Exiv2 0.27 support

With this new release we are tracking the Exiv2 project, which is in the release stage for its next stable version 0.27, planned for December 2018. This library is a main component of digiKam, used to interact with file metadata: populating database contents, updating item textual information, and handling XMP sidecars for read-only files.

Tuesday, 30 October 2018

Hell yeah!

Today I was updating my LinkedIn account and saw that I have been with KDE for three years and one month. And what a blast!

Life with KDE is responsible for a lot of changes in my life, including in my personality. I can't thank this amazing community enough.

Today I am part of Atelier, the printer host project; 3D printing was what first brought me into KDE, and that changed my life in so many ways… I really hope to get back to the project soon… I am also part of the Fundraising WG…

Quick Timeline:

2015: Joined the Community -> 2016: Lakademy/Atelier/GSoC(Umbrello)/Randa Meetings -> 2017: Fundraising WG and Campaigns/ Atelier talks -> 2018: AtCore 1.0 launched/Akademy – Vienna…

I have a few plans for my contributions to KDE, a couple of them related to the community's web presence, which needs a lot of love, and unlike Atelier I don't need much setup to work on it (I'm still missing a 3D printer…).

I can only hope that the years to come will be as enjoyable as these three years have been.

Thanks, KDE for having me!

Thanks, Tomaz Canabrava for bringing me in!

Thanks to everyone who works to make KDE the best community ever!

That’s all folks!

It’s been a while since I posted anything related to the Akonadi EWS resource, but that does not mean that the project is abandoned – development is ongoing, albeit a little slow.

Today’s episode of “what’s going on in the EWS world” is all about authentication.

From the early days of Akonadi EWS, interacting with Microsoft authentication mechanisms has been a bit on the hard side. Anybody wondering why is invited to look into Samba's history and development, which should give a good enough indication of the problems we're dealing with here. This mostly affected NTLM authentication, which had to be patched in KIO in order to work properly (NTLM version 2).

OAuth2

With the move to the cloud and the introduction of Office 365, Microsoft moved to more modern authentication methods. One of them is OAuth2, which was already widely used by other services, such as Gmail.

One specific thing about OAuth2 is that you need to register the application on whose behalf authentication takes place. In the case of Office 365 this means an Azure account is needed to register an application.

Evolution, which has supported OAuth2 for a while, chose to let users provide their own registered application, which effectively means that its OAuth2 authentication backend is non-functional out of the box. User intervention is needed in order to enable it.

For Akonadi EWS I chose to register my own app, which means that OAuth2 should work without any further user action besides supplying a password.

The application registration should work for everyone, as the app was registered globally. It is however possible that an administrator of a corporate Office 365 account may either explicitly block it or use a whitelist of apps allowed to access Exchange. Should this ever happen, a back door is left in the EWS resource – similarly to Evolution, it is possible to configure a different app identifier to be used instead of the default one. This can be done by manually editing the ~/.config/akonadi_ews_resource_?rc file (? will be a number starting from 0) and adding a new line:

OAuth2AppId=<app-uuid>

In the case of very stubborn admins, the ultimate workaround is to use the app id of Microsoft Outlook itself (d3590ed6-52b3-4102-aeff-aad2292ab01c).

Strong authentication

Experienced sysadmins have long noticed that using just a username and password for authentication is no longer sufficient. For that reason additional authentication steps, such as MFA (Multi-Factor Authentication), have been deployed.

Another option is to establish some form of trust in the device that is used for authentication, which ensures that access from a company-owned or managed device is considered safe.

For the latter purpose Microsoft defined an additional authentication protocol called the Public Key Authentication Protocol (PKey), described in [MS-PKAP]. It is used during OAuth2 authentication to prove possession of a device-specific private key. This key is generated and registered with the Azure Directory during device enrolment into the corporate directory, which can be done by performing a Workplace Join from Windows 10 or using the Intune app on Android.

From what I can find it looks like Akonadi EWS is the first open-source implementation of the PKey authentication protocol.

Why is this so important? Depending on the security policy, the corporate administrator may have decided to block access to Office 365 from outside the company network for unregistered devices, or alternatively to require additional authentication steps (MFA) when connecting from outside the office. By setting up private key authentication it is possible to be seen as trusted in the same way as Windows users are.

So far there are no open-source tools to register such a certificate/private key with the Azure Directory. Fortunately, however, the protocol has been published, and I will be working on such a tool/script.

Both OAuth2 and Public/Private Key Authentication are implemented and will hopefully be part of KDE Applications 18.12 release.

What’s next on the table

SAML

I’m not done with authentication yet. There is one more authentication goodie in the pipeline.

While OAuth2 is a nice and generic solution, it brings a slight drawback – you need to type your password and/or MFA token every two weeks or so.

For Office 365 and Azure Directory users there is however another way in – based on ADFS SAML authentication. This method is non-interactive and can be used with just the username/email and password. There is a catch however – if the administrator has chosen to require strong authentication when connecting from outside the trusted network, this method will not work unless it is used together with PKey authentication.

EWS SOAP API refactoring

In the current implementation all EWS SOAP requests have been written manually. They are in many ways copies of each other with lots of boilerplate code.

The plan is for this to be replaced by generated code.

This is a significant task, but once it's done it will open up possibilities for the long-awaited full calendar support and more.

This blog post shows how Qt applications can be built with Cargo. The goal is to make compiling them as simple as installing Qt and running Cargo.

Rust Qt Binding Generator (Logo by Alessandro Longo)

The crates qrep and mailmodel are examples. You can try qrep by installing it with Cargo.

qrep is a minimal GUI for ripgrep.

qrep

mailmodel is a proof-of-concept mail reader.

mailmodel

You can get started quickly with your own Rust Qt application by copying the folder templates/qt_quick_cargo from the Rust Qt Binding Generator repository.

Why Cargo?

In previous blog posts we built applications with CMake. CMake is probably the most widely used tool to compile Qt software; in that setup, CMake invokes Cargo to build the Rust parts.

CMake is a familiar tool for Qt developers, but not for Rust developers. For them, CMake is an unneeded hurdle. Rust Qt Binding Generator can be used with only Cargo. To do so you use the Rust way of building C++ code: build.rs.

build.rs

build.rs contains Rust code that is compiled and run by Cargo to build an application. Here is a simple build.rs file:
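It can look roughly like this – a minimal sketch modelled on the qt_quick_cargo template, assuming the Build helper from the rust_qt_binding_generator crate and the file names used in that template (bindings.json, qml.qrc, src/main.cpp):

extern crate rust_qt_binding_generator;

use rust_qt_binding_generator::build::Build;

fn main() {
    // Cargo tells the build script where generated files should go
    let out_dir = std::env::var("OUT_DIR").unwrap();
    Build::new(&out_dir)
        // generate interface.rs and the C++ glue from the binding description
        .bindings("bindings.json")
        // embed the QML files listed in the Qt resource file
        .qrc("qml.qrc")
        // compile the C++ entry point; more files can be added with further cpp() calls
        .cpp("src/main.cpp")
        // build everything into a library called my_project
        .compile("my_project");
}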

This file is placed directly in your project folder. The last command, compile("my_project"), compiles a library called my_project.

Cargo.toml

build.rs and the name of the library should be added to Cargo.toml:

[package]
name = "my_project"
version = "0.1.0"
build = "build.rs"    # use build.rs for custom build steps
links = "my_project"  # and link to the resulting library

[dependencies]
libc = "0.2"

[build-dependencies]
rust_qt_binding_generator = "0.2"

src/main.rs

main.rs is the entry point for Rust applications. In these applications, two things should happen:

  • include the generated Rust code interface.rs.

  • call into the C++ code.

Qt applications have an event loop. Starting this loop requires some C++ code. We start the event loop in a C++ function. The name of the application is passed to this function.
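A rough sketch of such a main.rs, based on the template (the main_cpp entry point and the location of the generated interface.rs are taken from that template and may differ in your setup):

// libc types are used by the generated interface.rs
extern crate libc;

use std::ffi::CString;
use std::os::raw::c_char;

// hand-written Rust implementation of the objects exposed to QML
mod implementation;

// Rust side of the bindings generated by build.rs into OUT_DIR
pub mod interface {
    include!(concat!(env!("OUT_DIR"), "/src/interface.rs"));
}

extern "C" {
    // defined in src/main.cpp: sets up QML and runs the Qt event loop
    fn main_cpp(app: *const c_char);
}

fn main() {
    // pass the application name to the C++ side and start the event loop
    let app = CString::new("my_project").unwrap();
    unsafe {
        main_cpp(app.as_ptr());
    }
}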

src/main.cpp

The template contains one C++ file. That is usually all you need. But it’s possible to add more C++ files by adding more calls to Build::cpp.

This file loads the GUI from a QML file and starts the event loop.

qml.qrc

Qt applications can contain resources. These are embedded files that are accessible via qrc: URLs. The template application contains two resources: main.qml and MainForm.qml. Other resources such as translations or application icons can be added as well.

Concluding

The option to build Qt applications with Cargo should please Rust programmers. I’ve ported my personal projects to build this way. Using CMake is also still supported and is recommended for C++ projects that currently use CMake and would like to introduce Rust.

This feature is new and not widely tested. Bugs can be filed here.

Monday, 29 October 2018


A few weeks ago I travelled to the Netherlands to be part of the Krita October Sprint. During this sprint we decided to focus on bug fixing; my tasks included some simple bugs and a couple of more convoluted ones. I started with the simple ones in order to gain speed. The first was about modifiers not working on OSX: the bug was simple enough but puzzling, as with the missing logic the code shouldn't have worked on Linux either, yet it did. The second bug was related to the event logic in the preferences dialog: my first approach was good but not simple, and talking with the team led me to change the solution to something much simpler.

Hard working Krita Sprinters

The next days showed me how deep the rabbit hole goes in Krita's code. My bug was in the color inversion code: some color spaces didn't show the correct/expected result. A quick dive showed that there was separate code for every color space's invert operation, and the wrong results appeared where implementations were missing. This also made a direct fix not very portable, as the combinations of color spaces and color depths suggested I would need to implement 18 color inverters. A short consultation showed me that an invert operation was already implemented per pixel depth, so refactoring one class to use these converters to invert the input colors made the invert filter work as expected, except for the CMYK and Lab spaces at 16-bit float depth. After a couple of days of digging into the code and testing, we found that there is a bug in the way CMYK and Lab values are processed: normalized values are not returned in places where they should be.

As this was my first Krita sprint I was very nervous, but I was even more excited to meet the team. In a way it was also my first time working in a code-only environment, which made it very fruitful, as it showed me that code is not made by super-coding geniuses, but by little changes made by a coordinated team of normal people.

Sunday, 28 October 2018

Also available in Italian and Taiwanese Mandarin.

Hey Chakra users!

On your next system upgrade you will receive the latest versions of KDE's Plasma, Applications and Frameworks, in addition to the usual package updates. All of these have been compiled against the latest Qt 5.11.2.

We are making Plasma 5.14 available in its second bug-fix release, a brand new series that introduces many new features to our favorite desktop environment.

For more details and the full changelogs on KDE’s software releases, you can read the official announcements:

With this update, we are also removing the kde-runtime and kdelibs packages from our repositories. To achieve this, we had to remove some outdated and nowadays mostly obsolete applications. Make sure to go through both bug reports to find out more about the packages that have been removed, updated or replaced. As always, we are not forcing this removal on our users, so you will have to remove these packages manually if you no longer need them.

It should be safe to answer yes to any replacement question from the package manager. If in doubt, or if you face any other issue in relation to this update, please ask or report it below.

Most of our mirrors take 12-24 hours to synchronize with the central repositories on the origin server. Use the mirror status web page to see when your mirror of choice last synchronized.

Enjoy!


Friday, 26 October 2018

TL;DR: Beware of connections to function objects accessing class members, which could be triggered during execution of subclass destructor methods.

Oh, those name-based signal/slot connections feel outdated!

So you are, like myself, a happy embracer of Qt's new way of coding QObject signal/slot connections, based on function pointers or functors. The compile-time checking of signals and slots feels just so much better. And thus you also port existing code to it, including some which uses the pimpl approach for public classes, borrowing Qt's macros Q_PRIVATE_SLOT, Q_D & Co.:

class MyWidgetPrivate;

class MyWidget : public QWidget
{
    Q_OBJECT
public:
    explicit MyWidget(QWidget *parent);
    // [...]
    // setting a QWidget-subclass to be used as custom form in this widget
    void setForm(Form *form);
private:
    Q_DECLARE_PRIVATE(MyWidget)
    const QScopedPointer<MyWidgetPrivate> d_ptr;
    Q_PRIVATE_SLOT(d_func(), void handleFormDeleted())
};

// implementation side:

class MyWidgetPrivate
{
// [...]
public:
    void handleFormDeleted() { /*...*/ }
};

MyWidget::MyWidget(QWidget *parent)
    : QWidget(parent)
    , d_ptr(new MyWidgetPrivate)
{
    // [...]
}

void MyWidget::setForm(Form *form)
{
    Q_D(MyWidget);
    // [...]
    connect(form, SIGNAL(destroyed()), this, SLOT(handleFormDeleted()));
}

Got some time, let’s modernize the code

The old code calls out to be changed to use a connection from the destroyed signal to a lambda expression calling handleFormDeleted() directly on the private object, with the MyWidget instance as the context object, thus removing the need for that Q_PRIVATE_SLOT:

class MyWidgetPrivate;

class MyWidget : public QWidget
{
    Q_OBJECT
public:
    explicit MyWidget(QWidget *parent);
    // [...]
    // setting a QWidget-subclass to be used as custom form in this widget
    void setForm(Form *form);
private:
    Q_DECLARE_PRIVATE(MyWidget)
    const QScopedPointer<MyWidgetPrivate> d_ptr;
};


// implementation side:

class MyWidgetPrivate
{
// [...]
public:
    void handleFormDeleted() { /*...*/ }
};

MyWidget::MyWidget(QWidget *parent)
    : QWidget(parent)
    , d_ptr(new MyWidgetPrivate)
{
    // [...]
}

void MyWidget::setForm(Form *form)
{
    Q_D(MyWidget);
    // [...]
    connect(form, &QObject::destroyed,
            this, [this] { Q_D(MyWidget); d->handleFormDeleted(); });
}

Looks fine & compiles. The code feels more future-proof, with the compiler now warning if some signal or slot got changed/removed.

Ooops, crashing now?

Just… noooo, it sometimes crashes now, in the destructor of MyWidget. How can that be, with this innocent-looking change?

Reading once more closely the documentation of QMetaObject::Connection QObject::connect(const QObject *sender, PointerToMemberFunction signal, const QObject *context, Functor functor, Qt::ConnectionType type = Qt::AutoConnection) we notice the remark:

The connection will automatically disconnect if the sender or the context is destroyed. However, you should take care that any objects used within the function object are still alive when the signal is emitted.

This subtly hints at the problem we now have: if the form instance is set as a child widget of the MyWidget instance, it will be deleted when ~QWidget() is run as part of the MyWidget destructor, and will then emit the destroyed signal. At that point in time, the this seen by the function object is no longer a proper MyWidget instance. And things go *boom*.

The old string-based connection as well as the member-function-pointer-based one handle that case for us, by some QObject magic using virtual methods which notice that the receiver is no longer a MyWidget and simply drop the slot call (I got lost in the code details, but it is something like this).
With the new function-object-based connection, on the other hand, the connection only becomes inactive automatically once the ~QObject destructor of either the sender or the receiver is reached. It thus has a longer lifetime, which can come a bit unexpected to some.

Fixing the modern times, unsure how

Lesson learned: do not blindly port code to the context & function object based connection. Instead, beware of the additional traps that exist because the function object is an independent complex object and not just a member function pointer. I will have to revisit quite some code where I might have missed this trap with subclass destructor methods :/
As I seem not to be the only one hit by this, I filed QTBUG-71432: "API dox of context & function object using QObject::connect should hint about destructor issues", so other people like me might be saved from this from the start.

Curious to learn about best practices for private slots and non-string-based connections. Thus happy to hear about proposals/hints in the comments.

(Update: text now using C++ standard lingo term “function object” instead of “functor”)

Wednesday, 24 October 2018

We will be holding a Bug Day on October 30th, 2018, focusing on Konsole. Join at any time, the event will be occurring all day long!

This is a great opportunity for anyone, especially non-developers to get involved!

  1. Check out our Bug Triaging guide for a primer on how to go about confirming and triaging bugs.
  2. Log into KDE Phabricator and join the Bugsquad!
  3. Join the #kde-bugs IRC channel on Freenode to chat with us in real-time as we go through the list.
  4. Open the shared Etherpad for this event (use your KDE Identity login) to select your block of bugs and cross them off.

If you need any help, contact me!

Monday, 22 October 2018

Almost two weeks ago we had the seventh edition of LaKademy, an event that has been held in Brazil since 2012. As you may know, LaKademy's main goal is to bring together the Latin American contributors of the KDE community and to attract new ones. We don't have talks like at Akademy, because the idea of the event is to be a space for sprints, so people work in small groups on specific tasks like fixing bugs, developing new features or translating software and documentation.

LaKademy banner.

LaKademy 2018 was the biggest one so far. This year we had 23 participants, of which 12 were newcomers. This is an impressive number of new people attending the event, the highest in all these years. We were very happy to have them all working on KDE projects in Florianópolis, an amazing island located in the south of Brazil.

LaKademy 2018 group photo.

Among these new people we had a group from UFF (Universidade Federal Fluminense), a public university from Rio de Janeiro state, Brazil, who are working on the KDE Edu project. One of them, Maria Edoarda, made an amazing pixel art of Konqi that was printed using a 3D printer we had there. That was a cute gift for the community on its 22nd birthday ❤

Edoarda proud of her pixel art.

Speaking of the anniversary, we also had a cake to celebrate the date. We bought a traditional cake from the south of Brazil called "cuca", which we preferred to call "kuka" 🙂

Kuka and Konqi 😀

The group celebrating KDE's 22nd birthday with a "kuka".

During LaKademy I worked on the KDE timeline project that I developed with Camila Ayres in 2016 for KDE's 20-year celebration. The timeline was outdated and needed new info about the last two years added, so I made some changes, such as in the header, to make the project follow the history of the community beyond the 20th anniversary. When I had the idea to create it, my first thought was just to use it for the celebrations of the 20th birthday, but now I think the project might be useful to keep a record of the evolution of our community.

We made Konqi mugs for the attendees 🙂

There is a moment at LaKademy where everybody participates in discussions about promo matters, such as how to attract more users and contributors to the community, what our actions for the next year should be, and which place we should choose for the next LaKademy. During our promo meeting this year we discussed the possibility of holding the event outside Brazil. This is an old idea, and we've been trying to do it since the second edition.

The main obstacle that prevents us from doing this is the cost of bringing everybody to another country. As the majority of the contributors are in Brazil, it's cheaper for us to just hold the event here. On the other hand, we need to connect with people from the other Latin American countries, otherwise the name of this event makes no sense. We do have people from outside Brazil attending LaKademy, but they have never been numerous. Bringing the event to another country might help us increase this participation.

As part of the anniversary celebrations we also went to a karaoke bar at night. That was one of the funniest moments of LaKademy. It was a very nice opportunity for new and old contributors to get to know each other better, have a break from the tiring routine of the sprints, and of course show their talent 😀

Nerds just wanna have fun 😀

LaKademy band ❤

Our community now has a person who is very good with communication and promo work, and she is doing a very good job. Thank you, Barbara! She made this nice video to show the world some moments from LaKademy in one minute:

I would like to thank Patrick for his support in Florianópolis, organizing the event there and helping us with everything. I also want to say thank you to KDE e.V., which has supported us all these years.

LaKademy is always a very reinvigorating moment for me, especially right now, when Brazil is facing a difficult period of growth of the far right and fascism. Being there with all these nice people, sharing code but also sharing dreams and love, is a form of resistance to this wave of hate. I hope that more new people can come to this amazing event and see for themselves how worthwhile it is to be in a friendly community like KDE. Happy 22 years, my community! ❤

Friday, 19 October 2018

In August of last year, i wrote a blog entry about my experience at Akademy 2017 in the amazing Almería, and in that blog entry, amongst many other things, i wrote about an effort which had been slowly brewing, conceptually, for about a year by then: Tagging support in the Open Collaboration Services API. Now, what does that have to do with the KDE Store, you might say? Well, that is the API used by the KNewStuff framework to interface with the store, and that in turn is what is used in the many various places in our software which show shiny, new content for downloading (or to put it in a different way: used by our software to let users Get Hot New Stuff).

For Your Immediate Consumption

I am proud to announce that as of KDE Frameworks 5.51.0, a major patch for KNewStuff was merged, which expands substantially on some basic tag data handling functionality previously added to the Attica framework. One thing it includes, amongst others, is a test tool. Have a screenshot, because those are shiny and make people look at blog entries ;)

A usable test tool for KNewStuff would make testing KNewStuff easier, you say? Well, in that case, have a usable test tool for KNewStuff.

Some of you brave people running Frameworks from Neon's nightly packages saw an explosion when using Discover a few weeks ago, and i'd just like to also extend an apology to you, as that was my fault for temporarily introducing a binary incompatibility in the first merged version of that patch. Thank you, also, for actually running this, as without you we might have not found this bug before releasing, at which point it would have been too late to fix. So, thank you for your invaluable testing and reporting work! This double merge of the patch is also why you might notice two entries of that patch being mentioned in the changelog.

Immediate Culminations

So, apart from shiny new test tools, what sort of shiny things can you, as a user or developer of KDE software, expect when running it on top of KF5.51? Well, one important thing you will notice (or, rather, hopefully not notice too much) is that the content offered to you in, for example, Plasma's Get New Wallpapers dialogue or Kdenlive's templates is going to be both installable and usable. This does require intervention by the KDE Store's moderators, who are the ones who can mark content as something KNewStuff should hide by default, which is why a call went out for assistance there a couple of months ago, so we could prepare for the arrival of this patch. Incidentally, if you find anything off on the store, please do tell us about it and we'll get right on it!

One very important point i feel i need to make before continuing: The basic filtering functionality I'm about to describe is entirely backward compatible, with no side effects other than the filtering just not happening if an application using it is run on top of an older version of KNewStuff. This means if you want to add this to your software, you won't need to wait for your favourite distros to get all up to date with Frameworks.

As an example of something slightly more involved than just hiding those bits explicitly marked as unwanted on the server, have a couple of screenshots of a bit more of the functionality in this patch. On the left we have the test category Comics on share.krita.org, with one comic (supplied as an ePub file in this case), one non-downloadable comic (still technically a comic, but it's a link to a website - technically fine for this category, but not downloadable), and one spam entry (fairly sure this stuff isn't a comic book of any kind...). On the right, the same data is shown in Peruse,  but with the two non-usable entries filtered out for having either no comic to download, or for being spam and explicitly excluded by a moderator.

No modifications were done in Peruse's code itself either, only the knsrc configuration file, which had the following line added to it:

DownloadTagFilter=data##mimetype==application/x-cbz,data##mimetype==application/x-cbr,data##mimetype==application/x-cb7,data##mimetype==application/x-cbt,data##mimetype==application/x-cba,data##mimetype==application/vnd.comicbook+zip,data##mimetype==application/vnd.comicbook+rar,data##mimetype==application/vnd.ms-htmlhelp,data##mimetype==image/vnd.djvu,data##mimetype==image/x-djvu,data##mimetype==application/epub+zip,data##mimetype==application/pdf

This somewhat unsightly chunk means, fairly simply, that there should be a filter on the content item's download items, which should accept only entries in which at least one of those download items had one of the listed entries for the data##mimetype tag. The documentation for the filtering of content items can be found right over here and here for download item tags, alongside KNewStuff's other API documentation.

If you want to do something more involved than what is possible using a static list of tags like that, you can absolutely add the filters manually through code. Do this by calling the KNSCore::Engine::addTagFilter and addDownloadTagFilter functions, using the formats listed in TagsFilterChecker's documentation.

Future Prospects

What does the future hold? Well, for KNewStuff itself, the functionality as it stands now is pretty powerful already, but if you have ideas for enhancements, please do get in touch, either directly to me (i'm not difficult to find, to the best of my knowledge i'm the only leinir around), or on IRC (i'm leinir on many of KDE's various channels on Freenode and on our Telegram groups, on Twitter and so on), or even better, surprise us with a patch over on Phabricator.

What's this? AppImages, filtered by architecture? Well, then, that's certainly something we should perhaps be doing more of now that it's possible to do so... ;)

One future prospect which is very immediate is going to be enhancing the KDE Store support in Discover. Right now, Discover's support for content coming through KNewStuff is limited to, effectively, just showing the items offered by all the knsrc files on the system and managing their installation and removal. This is already lovely, but enhancing this functionality by adding such things as explicit, user-specified filtering, or browsing through tags supplied by the creators, or by computer architecture and the like for things which need to run, would be very handy (for example for supporting downloading and installing AppImages from the AppImage section on the store).

Thank You!

The future, then, is very much full of shiny possibilities, and while i am under no illusion that anybody is going to be quite as enthusiastic as someone who has been working (on and off) on this functionality for over two years, i do hope that some of my excitement might have rubbed off on you.


The word (/abbreviation) of the day is: SABA (because having supplied air breathing apparatus would be handy with the bitumen removal chemicals being used in our house at the moment)