
Monday, 6 August 2018

In my previous post I played with the team size and activity metrics of several communities to see what would come out of it. Funnily enough, to me it wasn’t necessarily the most interesting thing I’ve posted (it’s rather basic in what it presents), but somehow it’s the one which triggered the most comments, especially in the KDE community. Looks like I struck a nerve. :-)

Anyway, it got quite a lot of good comments, so I thought it deserved a follow-up post with a different tone. For the record, I generally try to avoid putting too much of my own personal opinion in posts where I present metrics. I think it’s sane to try to shield the facts in the data from my own biased position. It’s obviously super hard, if not impossible: at a minimum I’m forced to mention potential events in the time frame considered (if I know them)… it’s risky, but I still do it because otherwise things would be very dry and super annoying to read! And I think that’s why the previous post struck a nerve, but more on that below.

In the rest of this post, I’ll pick extracts from the comments I got, in no particular order, followed by my own opinion. So contrary to my usual “data posts”, the cursor between factual presentation and opinion piece will very much point toward “opinion piece”. Be warned! ;-)

More Information About the Rust Community

I got a very nice comment from Florian Gilcher, I won’t address it all, but I’ll add my reactions to some of the extracts.

This doesn’t mean that we have an explanation for everything. But I can certainly say that the slowdown in 2015 is a couple of people going on holidays and taking some time off for once. There was a lot of exhaustion and also tension in the community at that time.

Thanks a lot for the confirmation! When looking at the history of the project from the outside it seemed the most logical conclusion, glad to see I wasn’t far off for once. ;-)

While I agree that we will probably not hold this growth forever, we are doing a lot of intentional work to make that happen. One of them is constantly reorganising the project and actively pulling people in.

That’s what I got from attending a couple of RustFests. I was very much impressed by the effort going into growing the community. It’s very proactive and welcoming, while KDE has a more passive stance, even though it’s a very welcoming community as well.

We take a very clear stance that we want to recognise all work that people have done for the project and invest time in that, for example with projects like https://thanks.rust-lang.org/.

Wow, I didn’t know about that page! It’s a great idea, and I wonder if KDE has something similar; it’s probably worth producing. Each application has an about box listing some people, but there’s nothing structured for the frameworks. I also like the breakdown per version that Rust uses. Probably something I could script, I’ll look into it I think.

We try to keep all processes as lean as possible and especially don’t try to impose a huge process on first-time or small contributors. Contribution effort should scale with the size of it.

This is an interesting concern: instead of going for a one-size-fits-all contribution model, the effort required scales with the size of the contribution. To be kept in mind for sure.

Also, I don’t want to say that we do everything right - the contrary. But we have gotten our project used to reflection and change.

Right, I think it’s important to maintain a culture which allows passing on the wisdom of older contributors while also being open to change.

Debating the Impact of Tooling

Still from Florian Gilcher, we got a point about tooling:

Talking a little about your readings here: In contrast to other comments, I don’t find it unreasonable to attribute the bump in KDE to the change of tooling. […] Free contribution also means that leaving is easy. And some people don’t want to pay the mental tax of learning new tooling.

Of course I agree with that, and I think that’s why KDE lost some of the old guard and quite a bit of the drive-by contributions (Git was really painful at the time). Also, on top of the mental tax of learning the new tooling, there’s Conway’s law to take into account: a switch means organizational changes, which in turn mean quite a bit of communication around it, updating our wiki pages for onboarding, team building and so on.

And we got a similar stance from Anton Latukha as well:

I strongly agree on “documented process of 100 steps or to follow” for contribution versus “documented process of 5 steps”. It is so much, that I wanted, but really never bothered to contribute. It is just too much rules and info to fit-in myself for me to make drive-by commits.

I strongly agree with that point of course! It’s not related to the “was the dip caused by the switch to git” debate, since most of the reasons why it’s hard to contribute to KDE predate that switch anyway. But it’s a very important thing to keep in mind if we want to improve. We don’t get as many drive-by contributions as we could, and it’s unfortunate. People’s expectation is that it should be much easier than it is. That’s why at last year’s Akademy we gave the talk Looking at the Application Developer Story with David Faure.

Of course there’s also another position in that debate, which considers the tooling irrelevant to the contribution history of KDE.

First from Luke Parry:

In short, the excitement has gone. KDE has just become a utility that works pretty well for a DE. Yet, It is not pushing any boundaries of how we interact and work with a desktop.

It’s indeed another factor to take into account and I agree with that. For some people, KDE is just a commodity: they’re glad it’s around, but since it’s not exciting enough anymore they don’t feel compelled to contribute. Note this is worrying though, because it means the community (not the software) has fallen into a kind of Tragedy of the Commons trap. That’s why I think it’s important to hear the comments from Florian above; they show a path forward: the KDE community should be consciously groomed like a proper commons (i.e. we’re doing great on the software side, not so much on the people side).

Then we got comments from Martin Flöser, who is very much on the “switch to git is irrelevant” side:

Personally I don’t believe in the git theory for the kde community. It just doesn’t fit. KDE is a highly technical community, our code has partially a superb quality. Why should the community metrics change just because the introduction of git?

I would say: just see Florian’s and Anton’s comments above. There’s a cognitive cost to such changes, especially when git was really a pain to use.

Instead I would rephrase the question: what resulted in the 2010 peak? For this I see two main events in the KDE community: KDE 4 and Nokia buying Qt.

I agree about KDE 4 resulting in more activity, but not all the way to 2010. As for Nokia, well… I really don’t think it brought much more to KDE than extra sponsorship. One particular project saw extra contributions due to Nokia, but it wouldn’t account for a big increase in activity by a long shot.

After the release of KDE 4 new developers were brought in.

That’s the thing: we always had a strong contributor influx, except during the KDE 4 preparation (it was very hard for people to join because everything moved all the time, as the increased activity indicates) and after 2010… which is the part whose reasons we’re still debating. :-)

If I remember correctly you still were a student at that time (or did you already do your phd?)

I already had my PhD at that time. Besides, I don’t know if it goes like this for all PhDs everywhere, but it was really a full time job in my case. I was definitely not working like a student anymore through my PhD. Anyway, very specific and not very relevant so I won’t dwell on this more. ;-)

So what did happen around 2010 that we did not get the new students in? My answer to that is Bologna and Android. From talking with Bachelor students around that time I got the impression that they don’t have the time anymore to do things as open source development next to their studies. The second thing I mention is Android. I think for students we were no longer the interesting and attractive community to join. Why hack on the old thing if you could do Android apps?

I think that’s where we touch the crux of the issue in this debate. I very much agree with the KDE 4 impact and the commodification of the KDE products. Now, I tend to ignore them because we can’t act on them, and that’s why I talk much more about the tooling: I know we can act on that part!

And that’s where you disagree: you think that the tooling change had no impact on the existing community. Well, I’m pretty sure at least part of the “previous developers” were driven away both by personal reasons and by the change of tooling. The two collided: learning new, half-done, annoying-to-use tooling while having less spare time? No way they’d do that. Remember, git back then was really very difficult to learn, with all its quirks.

Second reason why the tooling matters: the students changed. It’s not only about the time available and Android apps being more hip. You might remember that post-PhD I was involved in a university program setting up student projects. They had the time, and they had cool things to pick from, so they worked only on stuff they were motivated by. Still, over the years I could see it was less and less natural for them to contribute to KDE. It’s a strong cultural shift I witnessed over the course of a few years.

It became much easier for them to use git, but the paradox for me is that it became much harder for them to use the rest of our tools. Somehow, as various generations of students became more skilled with Git, they also became more influenced (or brainwashed, pick your position) by the GitHub contribution model. And nowadays that is, for them, the only true way of contributing. Our current contribution processes thus look very alien, even preposterous, to your average student. I don’t like that, but that’s what I witnessed.

Also, Dominik points out that git is a standard:

But git is the de-facto standard, it’s not to blame for the decrease - well, at least not in the last 5 years.

Yes definitely. I was thinking of “git back then”, which is in part why we lost a fair share of the old guard as mentioned above. Nowadays git is fine, but as I mentioned above the other tools seem to get in the way for new contributors.

About Comparing Projects

We got a couple of interesting points from Boudewijn as well revolving around comparing projects:

I think you should superimpose the rust graph on KDE’s graph — put the current 2018 location on KDE’s 2010 location. Everything has a hype curve, skyrockets, drops then settles.

Yes, I definitely agree there and I mentioned it at the end of my previous post: the Rust curves currently look similar to KDE’s early days. That being said, even if we do what you propose with the curves, there’s still a difference between KDE and Rust in my opinion: their team size and commit count variations are much more strongly correlated than KDE’s. The only time KDE showed something similar was roughly during its first five years, as far as I can tell after a quick check; Rust has kept it up almost twice as long now. This is interesting in itself I think.

It would also be interesting, but controversial, to make this graph for larger kde projects – libs/frameworks, plasma, krita/calligra (that doesn’t matter much, krita was always the larger part of koffice/calligra development), kdenlive, digikam, pim.

I don’t think this would be controversial. And that’s actually one of the gazillion things I’d like to do… there are so many angles to look at those things that I can keep myself busy for the next ten or twenty years I think. I kind of touched on it a bit for PIM in my previous posts by the way.

On the Definition of Activity

Sho opened the debate on the commit count metric itself:

I think our commit count going down has a lot to do with how we do code review - which we didn’t do before.

This could be a factor in the 2010 drop, though I somehow doubt it’s a strong one, because I seem to recall we were already doing reviews in 2009. Now, I agree that we likely increased the amount of review at some point, but it didn’t take six years to do that. That’d account for some of the decreasing trend, but not all of it.

This makes it difficult to assess activity by commit count alone without looking at diff sizes too.

Yes, this is something I’d like to explore better. Currently I use commit count for activity, but I know it’s a poor proxy: not all commits are born equal, and reviews are work and collaboration too. For the reviews, I can likely get a partial history through the Phabricator API, but that will never go back many years. For the “value” of a commit, I still haven’t found an approach I like among what I drafted… still looking for one.

About the Choice of Phabricator

Finally, I got an unexpected comment from Anton Latukha regarding Phabricator:

Phabricator community/development is already virtually dead: https://www.openhub.net/p/phabricator

Obviously a very important point… It’s really a shame, because I personally like Phabricator quite a bit, despite the fact that it looks foreign to a fair share of our contributor base. Of course this is concerning, since Phabricator became a very central piece of KDE’s infrastructure.

Really, it pains me (did I say I like Phabricator?), but between Phabricator’s declining contributor base and what I’ve seen with students who consider it confusing, I wonder if we should reconsider it. I hate gratuitous tooling changes (see my points on their price above), but it looks like the price of staying with that particular choice might sooner or later increase by a lot… and it’s already high (see my points above about it looking alien to today’s students). :-/

Last night we were living outside as usual. It had cooled a bit and a stiff cool breeze began blowing, so we moved inside for the first time in a week. We had a wonderful discussion about the state of the world (worrying) and what we might do about it beyond working for freedom in our KDE work. I think I'm not alone in being concerned about visiting Austria since politics there turned "populist". Since I'm living in a country where the same is true at least on the Federal level, that might seem hypocritical. Perhaps it is, but I'm not the only one working to expand the scope of people we welcome, rather than the reverse. I believe the most fortunate--including me--should pay the highest taxes, to provide public goods to all: excellent schools, medical and social care, fine public transport, free libraries, and free software.

We can only do that last bit well with a healthy KDE community. This means uniting around our goals and contributing to the community along with the software: by creating good documentation, helping promote news, contributing timely information for release announcements, joining a working group or the e.V. itself and, most importantly, living up to our Code of Conduct. Our Code of Conduct is one of the best and most positive in free software, and is a key reason I came to KDE and stayed to contribute. It is of little value, however, unless we occasionally re-read it, resolve to personally hold ourselves to a high standard of conduct and, in addition, dare to step up to help resolve situations where it requires courage to do so. This is an important bit:
If you witness others being attacked, think first about how you can offer them personal support. If you feel that the situation is beyond your ability to help individually, go privately to the victim and ask if some form of official intervention is needed. 
Similarly you should support anyone who appears to be in danger of burning out, either through work-related stress or personal problems.
It is sometimes very difficult and discouraging to confront distressing situations, when those whom you respect and even love deeply disappoint. However if we are to grow and thrive as a family, and we are a huge family, this must be done.

I've recently stolen from Boud and Irina's huge library In Search of the Indo-Europeans: Language, Archaeology and Myth by J.P. Mallory. A bit old, but a lovely survey of Eurasia up to historical times. Just this morning with my breakfast I read:
In what did the Proto-Indo-Europeans believe, or, to use their own words, to what did they 'put in their hearts'? This archaic expression is still preserved in a roundabout way in English where the Latin verb credo 'I believe' has been borrowed to fashion our English creed
After our talk last night, this passage prompted me to write today.


More photos from Deventer:
Flower cheese!

Sage, parsley

Sunset

IPA even in Deventer!

Sunday, 5 August 2018

What is the Qt Installer Framework? The Qt Installer Framework is a collection of tools that can be used to make installers on Linux, Windows and Mac. You can either use pre-built versions or compile it from source. There is other software, like NSIS and InstallBuilder, that can be used to make installers, but I … Continue reading Distributing Qt application using Qt Installer framework

Attending the yearly KDE Pim sprint in April in Toulouse was nice. For me it often means leaving a cold, rainy Germany and arriving in warm, almost summer weather with a lot of sun. This time the weather in Germany was also sunny and warm when I left, but spring is always further along in Toulouse. As only around ten people attended the sprint, it was also a time to get to know the people behind the nicknames. Unfortunately there were no new faces this time, but a new contributor joined the Pim team and attended remotely.

As the trains from Germany to Toulouse take some time, for me the sprint normally starts with entering the train and having time to hack. The first things I looked at were some cleanups in the dependency chain in KDE Pim, moving stuff around.

Reaching Toulouse, David and I started to dig into the problem that connections to remote servers sometimes stall and nothing goes back and forth, without any error being triggered. This issue is only visible if the internet connection is not stable, like a connection while riding the train. Yes, it's a good thing that sometimes developers have to face the real world to be able to reproduce bugs. To solve these issues we first had to reproduce them, which led to the problem of how to reproduce an unstable internet connection. It took a while before we had a setup to reproduce the issue, and after a lot of trial and error we finally managed to fix the issues we'd found.

Email security is a big issue in KDE Pim, and KDE made privacy a goal for the next years. As a whole team we discussed what we can improve in the context of KDE Pim and added the topics we want to improve to Phabricator in T7050 (have a look at the subtasks). The sprint took place before the public announcement of EFail, but as we were already informed about it, we could discuss its outcome and add some improvements. You can read about the current situation of KMail and EFail on dot.kde.org.

We also looked at how we can support emails using Memory Hole. Memory Hole is a way to also encrypt the headers of emails. The discussion has started, but I haven't found time to implement it yet; see T742 for more information.

Together with Daniel I had a discussion on how to improve the way we debug issues with Akonadi. Currently it is very hard to debug issues with Akonadi, but we already have a tool for it: Akonadi Console. Still, Akonadi Console has some issues: you can't keep it running for days to wait for issues that happen very rarely, because it slows down your computer too much after some time. The Debug view shows the data transmitted between Akonadi and all clients. To make it possible to keep the Debug view running for days, we had to refactor it to display the logs in a TableView instead of a simple TextView; a TableView scales much better with a large amount of data. We can now keep it running, save those logs and scan them for interesting entries by hand later. Another advantage is that we can now change the filter settings while logging, and we have the ability to add more logic to find the interesting entries. Together with this switch we replaced the hand-written debug log protocol with JSON. This removed the need to implement a separate parser for the protocol, and you can use QJson directly for further filtering.
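As a rough illustration of what the JSON switch enables, here is a minimal, self-contained Qt sketch that filters a JSON-per-line debug log with QJsonDocument. The file name and the field names ("session", "command") are assumptions made up for the example; the actual Akonadi Console log format may differ.

#include <QCoreApplication>
#include <QFile>
#include <QJsonDocument>
#include <QJsonObject>
#include <QTextStream>

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);

    // Assumed layout: one JSON object per line, e.g. {"session": "...", "command": "..."}
    QFile logFile(QStringLiteral("akonadi-debug.log"));
    if (!logFile.open(QIODevice::ReadOnly | QIODevice::Text))
        return 1;

    QTextStream out(stdout);
    while (!logFile.atEnd()) {
        const QByteArray line = logFile.readLine();
        const QJsonDocument doc = QJsonDocument::fromJson(line);
        if (!doc.isObject())
            continue; // skip malformed entries instead of writing a custom parser

        const QJsonObject entry = doc.object();
        // Keep only entries from one (hypothetical) session that contain a FETCH command.
        if (entry.value(QStringLiteral("session")).toString() == QLatin1String("akonadi_imap_resource_0")
            && entry.value(QStringLiteral("command")).toString().contains(QLatin1String("FETCH"))) {
            out << line;
        }
    }
    return 0;
}

The point is simply that once the log is JSON, this kind of ad-hoc filtering needs no hand-written protocol parser.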

Another issue that we've had for a long time was that you needed to restart Akonadi to view the logs, as those are only printed to the console you start Akonadi from; if you had disabled logging, no logs were available at all. Also, you had to enable the correct categories to see a specific issue. Daniel used the time at the sprint to implement another tab in Akonadi Console to display those logs, so you no longer need to restart Akonadi to view them, which is a great step forward for easier debugging.

Thanks to KDE e.V. for supporting me in joining the sprint.

Saturday, 4 August 2018

I am really happy that this year, I am able to attend Akademy again.

This enables me to set up a BOF session. It is intended for members of the KDE community who are interested in KDE’s collaboration with Qt. We will talk about the KDE Free Qt Foundation (legal setup; history and future; perspectives: What is important for the KDE community going forward?)

A cordial invitation to all KDEers! – Tuesday, 14 August, 9:30

It is great that many people from the Qt Company will also be at Akademy, so we will have a number of in-person meetings.

And of course, I can personally report on the activities of the KDE Free Qt Foundation during the Annual General Meeting of KDE e.V. (Below you can also read our formal report for the past year.)

I look forward to meeting many of you in Vienna!

I am going to Akademy!

AGM Report from KDE Free Qt Foundation for 2017

During the past year, the KDE Free Qt Foundation has been working on the
following topics:

Qt as Free Software

All parts of Qt continue to be available under free software licenses (for the main part of Qt, under the LGPLv3 and also under GPLv2 or later; for some add-on modules, at least GPLv3) for desktop Linux, Android, Microsoft Windows, Microsoft Windows Phone, Apple macOS and Apple iOS.

Legally safeguarding this is the purpose of the KDE Free Qt Foundation.

As a separate product, the Qt Company has also released a 3D UI design tool in the meantime under the GPLv3. This code was originally contributed by NVIDIA Inc.

Tooling

The tools of Qt also fall under the Free Software conditions. The proprietary Qt Quick compiler (which in the past lacked compliance with our contract) has been completely replaced by a superior Free Software alternative. We have asked the Qt Company to update their website accordingly.

Of course, the Qt Company retains the option to offer separate products (developed and sold separately from Qt).

Qt 6

The foundation also discussed the plans of the Qt Company for Qt 6. Current plans are that there will be long term support releases in autumn 2018
(Qt 5.12 LTS) and in spring 2020 (Qt 5.15 LTS), and that Qt 6.0 will be released in autumn 2020 as a cleanup release removing all APIs that have been deprecated in Qt 5.15.

There are no plans to remove any functionality from Qt without having an appropriate replacement under the same license conditions. The Qt Company reassured us that any such removals would also be discussed in the foundation, as stipulated by our contracts.

(Olaf Schmidt-Wischhöfer and Martin Konold)

Time passes. In Deventer, it is chimed by the church bells every hour, and during the day, a tiny concert every quarter-hour. To celebrate the Market, there was a concert of bells yesterday. The guest carillon-master was quite showy, with flourishes and trills! The church is in the next block, so we hear the bells very clearly. Behind the house a short distance is the Roman Catholic church, where yesterday we heard the joyous tolling of bells to celebrate a wedding.

After we visited the Market yesterday, Irina took me to the cheese shop. The phrase "cheese shop" doesn't cover how amazing this place is, even before one walks in and smells the symphony of cheese within:


After our trip to the Market, Irina as if by magick produced quail pies for lunch! The previous evening we had eaten at a *great* restaurant just around the corner from their house, and all had the quail. Our leftover halves were packed up and became pies!

This is being typed and put together out on the terrace, shared with the birds of the neighborhood, the sun, and an enormous tree in a neighboring square.



In short, life is good! My thanks to the KDE e.V. for supporting the KDE community and Akademy, and for sponsoring my accommodation while there. My thanks to the Ubuntu community fund for sponsoring my travel here and back home again. My profound and deep thanks to Boud and Irina Rempt for their generosity, thoughtfulness, hospitality, peaceful house and delicious food, and most of all, for asking me to come and live with them in Deventer this week. This is city living at its finest.

Item {
    // The label to display; also used below to decide what a click does.
    property string text;
    Text { ... }
    MouseArea {
        onClicked: {
            // Dispatch on the displayed label of the parent Item.
            if (parent.text === "Quit") {
                Qt.quit()
            } else if (parent.text === "Start") {
                globalObject.start()
            } else if (parent.text === "Stop") {
                globalObject.stop()
            }
        }
    }
}

I don’t always understand why people do things in some ways.

Friday, 3 August 2018

Like many people around, I plan to attend Akademy this year. Unfortunately I was not able to attend it last year, when it was in Spain again, and damn, I love Spain, but this time I cannot miss it, especially when it’s so close to the Czech Republic. I’ll be there from the very first day until Thursday, 16th of August. We will be organizing a BoF focused on Flatpak and Snap on Tuesday, 14th of August in the morning, so if you want to discuss or help with anything Flatpak and Snap related then you are more than welcome. You can also reach me anytime during the conference if you want to discuss other stuff I’ve been doing, like plasma-nm or lately screen sharing through PipeWire. See you in Vienna.

Thursday, 2 August 2018


This week we have received a number of inquiries into how Plasma extensions, particularly those found on the KDE Store, relate to the stability and safety of a Plasma system. This blog post hopes to provide answers, with an engineering focus.

Crash Resilience By Process Isolation

Present-day process architecture diagram: compositor and shell UI are isolated from each other

In Plasma, the shell UI and the compositor are isolated into separate processes (plasmashell and kwin, respectively). The two exchange information via IPC mechanisms such as the windowing system (Wayland or X11) and DBus, using standard protocols where suitable and custom-created ones where needed.
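To give a flavour of what such IPC looks like on the DBus side, here is a minimal QtDBus sketch of one process calling into another. The service, path, interface and method names are hypothetical, not the actual interfaces plasmashell and KWin use.

#include <QCoreApplication>
#include <QDBusInterface>
#include <QDBusReply>
#include <QDebug>

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);

    // Connect to a (made-up) service exported by another process on the session bus.
    QDBusInterface shell(QStringLiteral("org.example.Shell"),
                         QStringLiteral("/Panel"),
                         QStringLiteral("org.example.Shell.Panel"));
    if (!shell.isValid()) {
        qWarning() << "Service not available";
        return 1;
    }

    // Only this message crosses the process boundary; the two processes stay isolated.
    QDBusReply<QString> reply = shell.call(QStringLiteral("currentActivity"));
    if (reply.isValid())
        qDebug() << "Shell reports:" << reply.value();

    return 0;
}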

In a Wayland session, it's kwin that plays host to your application clients and puts them on screen (on X11, both KWin and the apps are clients of the X Server, providing a further layer of isolation: KWin can crash and restart without impacting apps). Shell UI extensions live in the separate plasmashell process.

In the event that a shell extension crashes the shell, this isolation allows the shell to be restarted without impacting KWin and therefore without impacting your applications. They continue to run undisturbed while the shell takes steps to right itself.
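The restart-on-crash principle itself is easy to illustrate. The following is only a toy supervisor sketch, not how Plasma actually restarts its shell; it just shows that a crashing child process leaves the rest of the session untouched.

#include <QCoreApplication>
#include <QDebug>
#include <QProcess>

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);

    QProcess shell;
    shell.setProgram(QStringLiteral("plasmashell")); // example child process

    QObject::connect(&shell, QOverload<int, QProcess::ExitStatus>::of(&QProcess::finished),
                     [&shell](int exitCode, QProcess::ExitStatus status) {
        if (status == QProcess::CrashExit) {
            // The supervisor (and everything else in the session) is unaffected;
            // simply bring the crashed child back up.
            qWarning() << "Child crashed, restarting it";
            shell.start();
        } else {
            qDebug() << "Child exited normally with code" << exitCode;
            QCoreApplication::quit();
        }
    });

    shell.start();
    return app.exec();
}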

Beyond crash resilience, putting limits on untrusted code running in the compositor process also has a security dimension, particularly on Wayland, where one of the design goals is not to allow untrusted code to introspect the windowing system globally.

Meanwhile, to make KWin ready to take over for the X Server in a Wayland session, we significantly upgraded our engineering story, requiring and dramatically raising unit test coverage for KWin and related library code, for one.

Process Isolation: Next Steps

The architecture discussed above shields applications and the compositor from shell UI extensions, but it doesn't shield the shell. We want to fix that next.

Future R&D process architecture diagram: isolate shell UI and extensions too, with individual or batch processes for extensions

The always-stunning David Edmundson has been spearheading several engineering efforts to make the shell itself have a multi-process architecture. Some of this has already quietly been shipping in the Plasma 5.12 LTS release: KRunner plugins, which provide search results in our menus/launchers, can now opt into running out-of-process. We've used this to isolate some of the crash chart-topping plugins, preventing them from taking down the shell when performing a search query.
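To illustrate the out-of-process idea (this is not the actual KRunner plugin API, just a sketch with made-up service and interface names), a plugin can be a small standalone helper that answers queries over DBus; if it crashes, only the helper dies, not the shell that queried it.

#include <QCoreApplication>
#include <QDBusConnection>
#include <QObject>
#include <QStringList>

class Runner : public QObject
{
    Q_OBJECT
    Q_CLASSINFO("D-Bus Interface", "org.example.Runner")
public Q_SLOTS:
    // Called by the shell over DBus; any crash here only takes down this helper process.
    QStringList match(const QString &query)
    {
        QStringList results;
        if (query.startsWith(QLatin1String("spell ")))
            results << QStringLiteral("Suggestion for: ") + query.mid(6);
        return results;
    }
};

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);

    Runner runner;
    QDBusConnection bus = QDBusConnection::sessionBus();
    bus.registerService(QStringLiteral("org.example.Runner"));
    bus.registerObject(QStringLiteral("/runner"), &runner, QDBusConnection::ExportAllSlots);

    return app.exec();
}

#include "main.moc"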

Likewise, for shell UI extensions, he has been working on top of the library stack we initially built for the Wayland transition to allow them to be run out of process and then composited into the shell UI. In addition to making the shell far more resilient to extension crashes, this would also create additional security domains for untrusted extension code: we can build on this to sandbox extensions and drop privileges for them.

All of this is broadly similar to application architecture improvements pioneered by multi-process web browsers in past years, albeit built squarely to leverage the shared free desktop stack (particularly Wayland and DBus). Unlike what took place in Firefox's Project Electrolysis, however, we don't see a need to break extension compatibility in our case. For Plasma's extension API scope, we've always taken a conservative, demand-based approach instead of allowing extensions free access to all data and code by default. This is paying dividends now, allowing us new shell architecture options while preserving the known, stable API contract the shell has with its extensions.

David is set to show off the R&D proof of concept we have working now in his talk at this year's Akademy conference, which I can't wait to attend. Be sure to also keep your eyes peeled on his blog, where he will dive deeper into the details of this project after the conference excitement.

In Summary

Plasma today is architected to prevent shell UI extensions from being able to crash your session and interfere with your apps. We are working to expand on this and prevent extensions from interfering with the shell.

In Context

One of the broad goals of Plasma 5 has been raising quality. This is a user-driven process. Converting feedback we got during the first generation of Plasma into software, we worked hard to bring Plasma users tangible improvements to speed, resource usage and UI polish (this is by no means over, either). We also already implemented a LTS release process to improve our support story.

Dove-tailing with the KDE community's Goals, climbing higher on the ladder of safety and security is one of our immediate next ones. Increasing process isolation in Plasma and achieving state of the art crash resilience and sandboxing, a likely first on the desktop, is a concrete engineering expression of this aim.

Sounds interesting? If you want to be part of a team and a community that keeps pushing the desktop (and not just the desktop) forward in the days to come, join us in one of our channels.

Of course:

I'm going to Akademy promo picture

Friday, 21 August 2015


Update 1: Google Play does not have the newest version yet, but it is incoming in the following days
Update 2: There is an open beta version now, you can get it here from Google Play.

Marble Maps has the following features:
  • Wandering around on the map
    • You can move the map with one finger by dragging it
    • It zooms in when you double-tap on the map
    • You can also zoom with two fingers (only supported on multitouch devices)
  • Handling your position
    • You can check your position on the map
    • You can check the distance and direction to your position when your position is not visible
    • You can center the view on your position
  • Routing
    • You can plan routes via interactive placemarks: search for something, then use the result as a waypoint
    • You can also modify the route with the route editor, available from the menu, instead of using the interactive waypoints
    • To get the routing instructions, visit the menu too
    • You can select from three routing profiles: car, bike or pedestrian
  • Navigation
    • You can navigate along the route you have planned previously, or navigate to a single destination
    • Marble Maps shows your position on the route with a different icon
    • It provides turn-by-turn navigation
    • It uses the Text To Speech interface of the Android system to provide voice instructions; to use it, please install it from Google Play
    • It shows the next instruction and its distance too
    • The current speed and the distance to the destination are also visible

Some technical background:
Marble's base source code became Android compatible this summer. If you want, an Android app can be built on top of it. Any map can be loaded, all of the projections are available, and it supports some plugins too.




And finally my personal experience about the summer:
I liked working on this project very much because I learned a lot of new things, like coding techniques, got closer to QML, and gained a deeper insight into how Android applications work. It was also good to work with the people of the community. I would like to thank everybody who helped me this summer, especially Dennis Nienhüser and Torsten Rahn. Without them Marble on Android would still be a dream.

Thank you Google for this fantastic opportunity!

But the story is not ending here, so stay tuned...