Most recent items from Ubuntu feeds:
Raphaël Hertzog: Freexian’s report about Debian Long Term Support, November 2018 from Planet Ubuntu

Like each month, here comes a report about the work of paid contributors to Debian LTS.
Individual reports
In November, about 209 work hours have been dispatched among 14 paid contributors. Their reports are available:

Abhijith PA did 13 hours (out of 13 extra hours from October).
Antoine Beaupré did 24 hours.
Ben Hutchings did 20 hours.
Brian May did 10 hours.
Chris Lamb did 18 hours.
Emilio Pozuelo Monfort did 38 hours (out of 30 hours allocated + 47.25 extra hours, thus keeping 39.25 extra hours for December).
Hugo Lefeuvre did 15 hours.
Lucas Kanashiro did 4 hours.
Markus Koschany did 30 hours.
Mike Gabriel did 9 hours (out of 8 hours allocated + 4 extra hours, but gave back 2 hours, thus keeping 1 extra hour for December).
Ola Lundqvist did 9 hours (out of 8 hours allocated + 8 extra hours, thus keeping 7 extra hours for December).
Roberto C. Sanchez did 13.75 hours (out of 12 hours allocated + 2.5 extra hours, thus keeping 0.75 extra hours for December).
Santiago Ruano Rincón did 12 hours (out of 18 extra hours, thus keeping 6 extra hours for December).
Thorsten Alteholz did 30 hours.

Evolution of the situation
In November we had a few more hours available to dispatch than we had contributors willing and able to do the work, and thus we are actively looking for new contributors. Please contact Holger if you are interested in becoming a paid LTS contributor.
The number of sponsored hours stayed the same at 212 hours per month but we actually lost two sponsors and gained a new one (silver level).
The security tracker currently lists 30 packages with a known CVE and the dla-needed.txt file has 32 packages needing an update.
Thanks to our sponsors
New sponsors are in bold.

Platinum sponsors:

TOSHIBA (for 38 months)
GitHub (for 29 months)
Civil Infrastructure Platform (CIP) (for 6 months)

Gold sponsors:

The Positive Internet (for 54 months)
Blablacar (for 53 months)
Linode (for 43 months)
Babiel GmbH (for 32 months)
Plat’Home (for 32 months)

Silver sponsors:

Domeneshop AS (for 54 months)
Nantes Métropole (for 48 months)
Dalenys (for 44 months)
Univention GmbH (for 39 months)
Université Jean Monnet de St Etienne (for 39 months)
Ribbon Communications, Inc. (for 33 months)
maxcluster GmbH (for 27 months)
Exonet B.V. (for 23 months)
Leibniz Rechenzentrum (for 17 months)
CINECA (for 6 months)
Ministère de l’Europe et des Affaires Étrangères

Bronze sponsors:

David Ayers – IntarS Austria (for 54 months)
Evolix (for 54 months)
MyTux (for 53 months)
Intevation GmbH (for 51 months)
Linuxhotel GmbH (for 51 months)
Daevel SARL (for 50 months)
Bitfolk LTD (for 48 months)
Megaspace Internet Services GmbH (for 48 months)
NUMLOG (for 48 months)
Greenbone Networks GmbH (for 47 months)
WinGo AG (for 47 months)
Ecole Centrale de Nantes – LHEEA (for 43 months)
Sig-I/O (for 41 months)
Entr’ouvert (for 38 months)
Adfinis SyGroup AG (for 36 months)
GNI MEDIA (for 30 months)
Laboratoire LEGI – UMR 5519 / CNRS (for 30 months)
Quarantainenet BV (for 30 months)
Bearstech (for 22 months)
LiHAS (for 21 months)
People Doc (for 18 months)
Catalyst IT Ltd (for 16 months)
Demarcq SAS (for 10 months)
TrapX Security (for 7 months)
NCC Group (for 4 months)

No comment | Liked this article? Click here. | My blog is Flattr-enabled.

about 2 hours ago

Robert Ancell: Interesting things about the GIF image format from Planet Ubuntu

I recently took a deep dive into the GIF format. In the process I learnt a few things by reading the specification.

A GIF is made up of multiple images

I thought the GIF format would just contain a set of pixels. In fact, a GIF is made up of multiple images. A simple example, say a scene with a background, a house, and a sun, could actually be made up of several separate images composited together.

GIF has transparency, but that doesn't mean you have transparent GIFs

In the above example the sun and house images have the background in them. If the background was very detailed then this would be inefficient. So instead you can set a transparent colour index for each image. Pixels with this index don't replace the background pixels when the images are composited together.

That's the only transparency in the specification. The background colour is actually encoded in the file, so technically a GIF picture has all pixels set to a colour. However, at some point renderers decided they wanted transparency, so they ignored the background colour and rendered it as transparent instead. It's not in the spec, but it's what everyone does. This is the reason that GIF transparency looks bad - there's no alpha channel, just a hack abusing another feature.

You can have more than 256 colours

GIFs are well known for having a palette of only up to 256 colours. However, you can have a different palette for each image in the GIF. That means in the above example you could use a palette with lots of greens and blues for the background, lots of reds for the house and lots of yellows for the sun. The combined image could have up to 768 colours! With some clever encoding you can have a GIF file that approaches full 24-bit colour.

Animation is just delaying the rendering

GIFs are most commonly used for small animations. This wasn't in the original specification, but at some point someone realised that if you inserted a delay between each image you could make an animation! In the above example we could animate by adding more images of the sun, each rotated from the previous frame, with a delay before them.

Why we can't have nice things

With all of the above, GIF is a simple but powerful format. You can make an animation that is made up of small updates efficiently encoded. Sadly, however, someone decided that all images inside a GIF file should be treated as animation frames, and that they should have a minimum delay time (including zero delays being rounded up to 20ms or so). So if you want your GIF to look as you intended, you're stuck with one image per frame and only 256 colours per frame unless the common decoders are fixed. It seems the main reason they continue to be like this is that there are badly encoded GIF files online and nobody wants them to stop working.

GIF, you are a surprisingly beautiful format and it's a shame we don't see your full potential!
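The block structure described above is small enough to assemble by hand. Here is a sketch (not from the original post) that builds a minimal one-image GIF byte by byte and reads its logical screen size back; the colours and pixel data are arbitrary illustrative choices:

```python
import struct

# Hand-assemble a minimal 1x1-pixel GIF from its component blocks
# (illustrative bytes; colours and pixel data chosen arbitrarily).
gif = bytes.fromhex(
    "474946383961"                    # Header: ASCII "GIF89a"
    "01000100" "80" "00" "00"         # Logical Screen Descriptor: 1x1, global colour table of 2 entries
    "000000" "ffffff"                 # Global colour table: black, white
    "2c" "00000000" "01000100" "00"   # Image Descriptor: one image at (0,0), 1x1 pixels
    "02" "02" "4401" "00"             # Image data: LZW min code size 2, one sub-block, block terminator
    "3b"                              # Trailer
)

# A decoder walks the same blocks back: signature first, then the
# little-endian screen dimensions from the Logical Screen Descriptor.
assert gif[:6] == b"GIF89a"
width, height = struct.unpack("<HH", gif[6:10])
print(width, height)  # -> 1 1
```

More images can be appended simply by adding further Image Descriptor and image data blocks before the trailer, which is exactly how multi-image (and animated) GIFs are built.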

about 20 hours ago

Robert Ancell: GIFs in GNOME from Planet Ubuntu

Here is the story of how I fell down a rabbit hole and ended up learning far more about the GIF image format than I ever expected...

We had a problem with users viewing a promoted snap using GNOME Software. When they opened the details page they'd see huge CPU and memory usage. Watching the GIF in Firefox didn't show a problem - it showed a fairly simple screencast demoing the app without any issues.

I had a look at the GIF file and determined:

It was quite large for a GIF (13MB).
It had a lot of frames (625).
It was quite high resolution (1790×1060 pixels).
It appeared the GIF was generated from a compressed video stream, so most of the frame data was just compression artifacts. GIF is lossless so it was faithfully reproducing details you could barely notice.

GNOME Software uses GTK+, which uses gdk-pixbuf to render images. So I had a look at the GIF loading code. It turns out that all the frames are loaded into memory. That comes to 625×1790×1060×4 bytes. OK, that's about 4.4GB... I think I see where the problem is. There's a nice comment in the gdk-pixbuf source that sums up the situation well:

/* The below reflects the "use hell of a lot of RAM" philosophy of coding */

They weren't kidding. 🙂

While this particular example is hopefully not the normal case, the GIF format has somewhat come back from the dead in recent years to become a popular format again. So it would be nice if gdk-pixbuf could handle these cases well. This was going to be a fairly major change to make.

The first step in refactoring is making sure you aren't going to break any existing behaviour when you make changes. To do this, the code being refactored should have comprehensive tests around it to detect any breakages. There are a good number of GIF tests currently in gdk-pixbuf, but they are mostly around ensuring particular bugs don't regress rather than checking all cases.

I went looking for a GIF test suite that we could use, but what was out there was mostly just collections of GIFs people had made over the years. This would give some good real world examples, but no certainty that all cases were covered, and no indication of why your code was breaking if a test failed.

If you can't find what you want, you have to build it. So I wrote PyGIF - a library to generate and decode GIF files - and made sure it had a full test suite. I was pleasantly surprised that GIF actually has a very well written specification, and so implementation was not too hard. Diversion done, it was time to get back to gdk-pixbuf.

Tests plugged in, and the existing code actually had a number of issues. I fixed them, but this took a lot of sanity to do. It would have been easier to replace the code with new code that met the test suite, but I wanted the patches to be back-portable to stable releases (i.e. Ubuntu 16.04 and 18.04 LTS).

And with a better foundation, I could now make GIF frames load on demand. May your GIF viewing in GNOME continue to be awesome.
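The memory arithmetic above is easy to sanity-check:

```python
# Rough memory cost of decoding every frame to RGBA up front, using the
# numbers from the post: 625 frames at 1790x1060 pixels, 4 bytes per pixel.
frames, width, height, bytes_per_pixel = 625, 1790, 1060, 4
total_bytes = frames * width * height * bytes_per_pixel
print(total_bytes)                    # -> 4743500000
print(round(total_bytes / 2**30, 1))  # -> 4.4 (GiB)
```

Loading frames on demand instead means holding only the composited current frame (about 7MB at this resolution) plus the compressed data.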

about 22 hours ago

Podcast Ubuntu Portugal: S01E15 – Open Source Garden from Planet Ubuntu

In this episode, Diogo Constantino attended the CMS Garden Unconference at the Unperfekthaus in Essen together with the speakers and organisers of the Secure Open Source Day, and took the opportunity to record a pleasant conversation with the conference participants, introducing the event and the community.

1 day ago

Jonathan Riddell: Achievement of the Week from Planet Ubuntu

This week I gave KDE Frameworks a web page after only 4 years of us trying to promote it as the best thing ever since cabogganing without one.  I also updated the theme on the KDE Applications 18.12 announcement to this millennium and even made the images in it have a fancy popup effect using the latest in JQuery Bootstrap CSS.  But my proudest contribution is making the screenshot for the new release of Konsole showing how it can now display all the cat emojis plus one for a poodle.

So far no comments asking why I named my computer

1 day ago

Ubuntu Podcast from the UK LoCo: S11E40 – North Dallas Forty from Planet Ubuntu

This week we’ve been playing on the Nintendo Switch. We review our tech highlights from 2018 and go over our 2018 predictions, just to see how wrong we really were. We also have some Webby love and go over your feedback.

It’s Season 11 Episode 40 of the Ubuntu Podcast! Alan Pope, Mark Johnson and Martin Wimpress are connected and speaking to your brain.
In this week’s show:

We discuss what we’ve been up to recently:

Mark has been playing on his Nintendo Switch.

We review the tech highlights from 2018 and go over our 2018 predictions.

We share a Webby Lurve:

Cockpit – The easy-to-use, integrated, glanceable, and open web-based interface for your servers

And we go over all your amazing feedback – thanks for sending it – please keep sending it!

Image credit: Paul Smith

That’s all for this week! You can listen to the Ubuntu Podcast back catalogue on YouTube. If there’s a topic you’d like us to discuss, or you have any feedback on previous shows, please send your comments and suggestions to or Tweet us or Comment on our Facebook page or comment on our Google+ page or comment on our sub-Reddit.

Join us in the Ubuntu Podcast Telegram group.

1 day ago

Alan Pope: Fixing Broken Dropbox Sync Support from Planet Ubuntu

Like many people, I've been using Dropbox to share files with friends and family for years. It's a super convenient and easy way to get files synchronised between machines you own, and to work with others. This morning I was greeted with a lovely message on my Ubuntu desktop.

It says "Can't sync Dropbox until you sign in and move it to a supported file system" with options to "See requirements", "Quit Dropbox" and "Sign in".
Dropbox have reduced the number of file systems they support. We knew this was coming for a while, but it's a pain if you don't use one of the supported filesystems.
Recently I re-installed my Ubuntu 18.04 laptop and chose XFS rather than the default ext4 partition type when installing. That's the reason the error is appearing for me.
I do also use NextCloud and Syncthing for syncing files, but some of the people I work with only use Dropbox, and forcing them to change is tricky.
So I wanted a solution where I could continue to use Dropbox but not have to re-format the home partition on my laptop. The 'fix' is to create a file, format it ext4 and mount it where Dropbox expects your files to be. That's essentially it. Yay Linux. This may be useful to others, so I've detailed the steps below.
Note: I strongly recommend backing up your dropbox folder first, but I'm sure you already did that so let's assume you're good.
This is just a bunch of commands, which you could blindly paste en masse or, preferably, run one by one, checking each did what it should before moving on. It worked for me, but may not work for you. I am not to blame if this deletes your cat pictures. Before you begin, stop Dropbox completely. Close the client.
I've also put these in a github gist.
# Location of the image which will contain the new ext4 partition
# (example value; adjust to taste)
DROPBOXFILE="$HOME/.dropbox.img"

# Current location of my Dropbox folder (example value; adjust to taste)
DROPBOXHOME="$HOME/Dropbox"

# Where we will copy the folder to. If you have little space, you could make this
# a folder on a USB drive (example value; adjust to taste)
DROPBOXBACKUP="$HOME/Dropbox-backup"

# What size is the Dropbox image file going to be. It makes sense to set this
# to whatever the capacity of your Dropbox account is, or a little more.
DROPBOXSIZE="2G"

# Create a 'sparse' file which will start out small and grow to the maximum
# size defined above. So we don't eat all that space immediately.
dd if=/dev/zero of="$DROPBOXFILE" bs=1 count=0 seek="$DROPBOXSIZE"

# Format it ext4, because Dropbox Inc. says so
sudo mkfs.ext4 "$DROPBOXFILE"

# Move the current Dropbox folder to the backup location
mv "$DROPBOXHOME" "$DROPBOXBACKUP"

# Make a new Dropbox folder to replace the old one. This will be the mount point
# under which the sparse file will be mounted
mkdir "$DROPBOXHOME"
# Make sure the mount point can't be written to if for some reason the partition
# doesn't get mounted. We don't want dropbox to see an empty folder and think 'yay, let's delete
# all his files because this folder is empty, that must be what they want'
sudo chattr +i "$DROPBOXHOME"

# Mount the sparse file at the dropbox mount point
sudo mount -o loop "$DROPBOXFILE" "$DROPBOXHOME"

# Copy the files from the existing dropbox folder to the new one, which will put them
# inside the sparse file. You should see the file grow as this runs.
sudo cp -a "$DROPBOXBACKUP/." "$DROPBOXHOME/"
# Create a line in our /etc/fstab so this gets mounted on every boot up
echo "$DROPBOXFILE" "$DROPBOXHOME" ext4 loop,defaults,rw,relatime,exec,user_xattr 0 0 | sudo tee -a /etc/fstab

# Let's unmount it so we can make sure the above line worked
sudo umount "$DROPBOXHOME"

# This will mount as per the fstab
sudo mount -a

# Set ownership and permissions on the new folder so Dropbox has access
sudo chown $(id -un) "$DROPBOXHOME"
sudo chgrp $(id -gn) "$DROPBOXHOME"

Now start Dropbox. All things being equal, the error message will go away, and you can carry on with your life, syncing files happily.
Hope that helps. Leave a comment here or over on the github gist.

1 day ago

Colin King: Linux I/O Schedulers from Planet Ubuntu

The Linux kernel I/O schedulers attempt to balance the need to get the best possible I/O performance with ensuring that I/O requests are "fairly" shared among the I/O consumers. There are several I/O schedulers in Linux; each tries to solve the I/O scheduling issues using different mechanisms/heuristics, and each has its own set of strengths and weaknesses.

For traditional spinning media it makes sense to try and order I/O operations so that they are close together, to reduce read/write head movement and hence decrease latency. However, this reordering means that some I/O requests may get delayed, and the usual solution is to schedule these delayed requests after a specific time. Faster non-volatile memory devices can generally handle random I/O requests very easily and hence do not require reordering.

Balancing the fairness is also an interesting issue. A greedy I/O consumer should not block other I/O consumers, and there are various heuristics used to determine the fair sharing of I/O. Generally, the more complex and "fairer" the solution, the more compute is required, so selecting a very fair I/O scheduler with a fast I/O device and a slow CPU may not necessarily perform as well as a simpler I/O scheduler.

Finally, the types of I/O patterns on the I/O devices influence the I/O scheduler choice, for example mixed random read/writes vs mainly sequential reads and occasional random writes.

Because of this mix of requirements, there is no such thing as a perfect all-round I/O scheduler. The defaults are chosen to be a good choice for the general user, but may not match everyone's needs. To clarify the choices, the Ubuntu Kernel Team has provided a Wiki page describing the choices and how to select and tune the various I/O schedulers. Caveat emptor applies: these are just guidelines and should be used as a starting point to finding the best I/O scheduler for your particular need.
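As a concrete illustration (not from the original article), the active scheduler for a block device can be read from sysfs, where the kernel marks it in square brackets. A small sketch, with the actual sysfs access left as comments since it needs a real block device and root privileges:

```python
# Sysfs reports the available schedulers for a device on one line, with the
# active one in square brackets, e.g. "noop deadline [cfq]".

def active_scheduler(sysfs_line: str) -> str:
    """Return the scheduler name shown in brackets."""
    start = sysfs_line.index("[") + 1
    return sysfs_line[start:sysfs_line.index("]")]

# On a real system (device name "sda" is an example):
#   with open("/sys/block/sda/queue/scheduler") as f:
#       print(active_scheduler(f.read()))
# Writing a scheduler name back to that same file (as root) switches the
# scheduler at runtime; the change does not persist across reboots.

print(active_scheduler("noop deadline [cfq]"))  # -> cfq
```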

3 days ago

Jono Bacon: 10 Ways To Up Your Public Speaking Game from Planet Ubuntu

Public speaking is an art form. There are some amazing speakers, such as Lawrence Lessig, Dawn Wacek, Rory Sutherland, and many more. There are also some boring, rambling disasters that clog up meetups, conferences, and company events.

I don’t claim to be an expert in public speaking, but I have had the opportunity to do a lot of it, including keynotes, presentation sessions, workshops, tutorials, and more. Over the years I have picked up some best practices and I thought I would share some of them here. I would love to hear your recommendations too, so pop them in the comments.

1. Produce Clean Slides

Great talks are a mixture of simple, effective slides and a dynamic, engaging speaker. If one part of this combination is overloading you with information, the other part gets ignored.

The primary focus should be you and your words. Your #1 goal is to weave together an interesting story that captivates your audience. 

Your slides should simply provide a visual tool to help get your words over more effectively. Your slides are not the lead actress, they are the supporting actor.

Avoid extensive amounts of text and paragraphs. Focus on diagrams, pictures, and simple lists.




Notice how I took my company logo off, just in case someone swipes it and thinks that I actually like to make slides like this.

Look at the slides of great speakers to get your creativity flowing.

2. Deliver Pragmatic Information

Keynotes are designed for the big ideas that set the stage for a conference. Regular talks are designed to get over key concepts that can help the audience expand their capabilities.

With both, give your audience information they can pragmatically use. How many times have you left a talk and thought, “Well, that was neat, but, er…how the hell do I start putting those concepts into action?“

You don’t have to have all the answers, but you need to package up your ideas in a way that is easy to consume in the real world, not just on a stage.

Diagrams, lists, and step-by-step instructions work well. Make these higher level for the keynotes and more in-depth for the regular talks. Avoid abstract, generic ideas: they are unsatisfying and boring.

3. Build and Relieve Tension

Great movies and TV shows build a sense of tension (e.g. a character in a hostage situation) and the payoff is when that tension is relieved (e.g. the character gets rescued.)

Take a similar approach in your talks. Become vulnerable. Share times when you struggled, got things wrong, or made mistakes. Paint a picture of the low point and what was running through your mind.

Then, relieve the tension by sharing how you overcame it, bringing your audience along for the ride. This makes your presentation dynamic and interesting, and makes it clear that you are not perfect either, which helps build a closer connection with the audience. Speaking of which…

4. Loosen Up and Be Yourself

Far too many speakers deliver their presentations like they have a rod up their backside.

Formal presentations are boring. Presentations where the speaker feels comfortable in their own skin and is able to identify with the audience are much more interesting.

For example, I was delivering a presentation to a financial services firm a few months ago. I wove into it stories about my family, my love of music, travel experiences, and other elements that made it more personal. After the session a number of audience members came over and shared how it was refreshing to see a more approachable presentation in a world that is typically so formal.

Your goal is to build a connection with your audience. To do this well they need to feel you are on the same level. Speak like them, share stories that relate to them, and they will give you their attention, which is all you can ask for.

5. Involve Your Audience (but not too much)

There is a natural barrier between you and your audience. We are wired to know that the social context of a presentation means the speaker does the talking and the audience does the listening. If an audience member violates this norm (such as by heckling), they are perceived as an asshole.

You need to break this barrier, but never cede control to your audience. If you lose control and make it the social norm for the audience to interrupt, your presentation will be riddled with audience noise.

Give them very specific ways to participate, such as:

Ask how they are doing at the beginning of a talk.
Throw out questions and invite them to put their hands up (or clap loudly).
Invite someone to volunteer for something (such as a role play scenario).
Take and answer questions.

6. Keep Your Ego in Check

We have all seen it. A speaker is welcomed to the stage and they constantly remind you about how great they are, the awards they have won, and how (allegedly) inspirational they are. In some cases this is blunt-force ego, in some cases it is a humblebrag. In both cases it sucks.

Be proud of your work and be great at it, but let the audience sing your praises, not you. Ego can have a damaging impact on your presentation and how you are perceived. This can drive a wedge between you and your audience.

7. Don’t Rush, but Stay on Time

We live in a multi-cultural world in which we travel a lot. You are likely to have an audience from all over the world, speaking many different languages, and from a variety of backgrounds. Speaking at a million words a minute will make understanding you very difficult for some people.

Speak at a comfortable pace, and don’t rush it. Now, some of you will be natural fast-talkers, and will need to practice this. Remember these?:


Well, we now all have them on our phones. Switch it on, practice, and ensure you always finish at least a few minutes before your allocated time. This will give you a buffer.

Running over your allocated time is a sure-fire way to annoy (a) the other speakers who may have to cut their time short, and (b) the event organizer who has to deal with overruns in the schedule. “But it only went over by a few minutes!” Sure, but when everyone does this, entire events get way behind schedule. Don’t be that person.

8. Practice and get Honest Feedback

We only get better when we practice and can see our blind spots. Both are essential for getting good at public speaking.

Start simple. Speak at your local meetups, community events, and other gatherings. Practice, get comfortable, and then submit papers to conferences and other events. Keep practicing, and keep refining.

Critique is essential here. Ask close friends to sit in your talks and ask them for blunt feedback afterwards. What went well? What didn’t go well? Be explicit in inviting criticism and don’t overreact when you get it. You want critical feedback…about your slides, your content, your pacing, your hand gestures…the lot. I have had some very blunt feedback over the years and it has always improved my presentations.

9. Never Depend on Conference WiFi

It rarely works well, simple as that.

Oh, and your mobile hotspot may not work either as many conference centers often seem to be built in borderline faraday cages. Next…

10. Remember, it is just a Presentation

Some people get a little wacky when it comes to perfecting presentations and public speaking. I know some people who have spent weeks preparing and refining their talks, often getting into a tailspin about imperfections that need to be improved.

The most important thing to worry about is the content. Is it interesting? Is it valuable? Does it enrich your audience? People are not going to remember the minute details of how you said something, what your slides looked like, or whether you blinked too much. They will remember the content and ideas: focus on that.

Oh, and a bonus 11th: turn off animations. They are great in the hands of an artisan, but for most of us they look tacky and awful.

I am purely scratching the surface here and I would love to hear your suggestions of public speaking tips and recommendations. Share them in the comments! Oh and be sure to join as a member, which is entirely free.

The post 10 Ways To Up Your Public Speaking Game appeared first on Jono Bacon.

3 days ago

The Fridge: Ubuntu Weekly Newsletter Issue 556 from Planet Ubuntu

Welcome to the Ubuntu Weekly Newsletter, Issue 556 for the weeks of November 25 – December 8, 2018. The full version of this issue is available here.
In this issue we cover:

Google Code-in in KDE
Ubuntu Stats
Hot in Support
Ubuntu at KubeCon & CloudNativeCon
Ubucon Europe 2019 – Sintra!
LoCo Events
Mir News: 30th of November 2018
MATE panel running in Mir
Mir News: 7th of December 2018
Migrating from Bazaar to Git on Launchpad just got easier!
Xfce Screensaver 0.1.3 Released
Jonathan Riddell:
Other Community News
Canonical News
In the Blogosphere
Featured Audio and Video
Meeting Reports
Upcoming Meetings and Events
Updates and Security for 14.04, 16.04, 18.04, and 18.10
And much more!

The Ubuntu Weekly Newsletter is brought to you by:

Krytarik Raido
Chris Guiver
Wild Man
And many others

If you have a story idea for the Weekly Newsletter, join the Ubuntu News Team mailing list and submit it. Ideas can also be added to the wiki!
Except where otherwise noted, this issue of the Ubuntu Weekly Newsletter is licensed under a Creative Commons Attribution ShareAlike 3.0 License

4 days ago

Jono Bacon: Speaking Engagements in Tel Aviv in December from Planet Ubuntu

I am excited to share that I will be heading to Tel Aviv later this month to speak at a few events. I wanted to share a few details here, and I hope to see you there!

DevOps Days Tel Aviv

Dev Ops Days Tel Aviv
Tues 18 December 2018 + Wed 19 December 2018 at Tel Aviv Convention Center, 101 Rokach Blvd, Tel Aviv, Israel.

I am delivering the opening keynote on Tuesday 18th December 2018 at 9am.

Get Tickets

Meetup: Building Technical Communities That Scale

Thu 20th Dec 2018 at 9am at RISE, Ahad Ha’Am St 54, 54 Ahad Ha’Am Street, Tel Aviv-Yafo, Tel Aviv District, Israel.

I will be delivering a talk and participating in a panel (which includes Fred Simon, Chief Architect of JFrog, Shimon Tolts, CTO of Datree, and Demi Ben Ari, VP R&D of Panorays.)

Get Tickets (Space is limited, so grab tickets ASAP)

I popped a video about this online earlier this week. Check it out:

I hope to see many of you there!
The post Speaking Engagements in Tel Aviv in December appeared first on Jono Bacon.

5 days ago

Benjamin Mako Hill: Awards and citations at computing conferences from Planet Ubuntu

I’ve heard a surprising “fact” repeated in the CHI and CSCW communities: that receiving a best paper award at a conference is uncorrelated with future citations. Although it’s surprising and counterintuitive, it’s a nice thing to think about when you don’t get an award and a nice thing to say to others when you do. I’ve thought it and said it myself.

It also seems to be untrue. When I tried to check the “fact” recently, I found a body of evidence that suggests that computing papers that receive best paper awards are, in fact, cited more often than papers that do not.

The source of the original “fact” seems to be a CHI 2009 study by Christoph Bartneck and Jun Hu titled “Scientometric Analysis of the CHI Proceedings.” Among many other things, the paper presents a null result for a test of a difference in the distribution of citations across best papers awardees, nominees, and a random sample of non-nominees.

Although the award analysis is only a small part of Bartneck and Hu’s paper, at least two papers have subsequently brought more attention, more data, and more sophisticated analyses to the question. In 2015, the question was asked by Jacques Wainer, Michael Eckmann, and Anderson Rocha in their paper “Peer-Selected ‘Best Papers’—Are They Really That ‘Good’?“

Wainer et al. build two datasets: one of papers from 12 computer science conferences with citation data from Scopus, and another of papers from 17 different conferences with citation data from Google Scholar. Because of parametric concerns, Wainer et al. used a non-parametric rank-based technique to compare awardees to non-awardees. Wainer et al. summarize their results as follows:

The probability that a best paper will receive more citations than a non best paper is 0.72 (95% CI = 0.66, 0.77) for the Scopus data, and 0.78 (95% CI = 0.74, 0.81) for the Scholar data. There are no significant changes in the probabilities for different years. Also, 51% of the best papers are among the top 10% most cited papers in each conference/year, and 64% of them are among the top 20% most cited.

The question was also recently explored in a different way by Danielle H. Lee in her paper on “Predictive power of conference‐related factors on citation rates of conference papers” published in June 2018.

Lee looked at 43,000 papers from 81 conferences and built a regression model to predict citations. Taking into an account a number of controls not considered in previous analyses, Lee finds that the marginal effect of receiving a best paper award on citations is positive, well-estimated, and large.

Why did Bartneck and Hu come to such different conclusions than later work?

Distribution of citations (received by 2009) of CHI papers published between 2004-2007 that were nominated for a best paper award (n=64), received one (n=12), or were part of a random sample of papers that did not (n=76).

My first thought was that perhaps CHI is different than the rest of computing. However, when I looked at the data from Bartneck and Hu’s 2009 study—conveniently included as a figure in their original study—you can see that they did find a higher mean among the award recipients compared to both nominees and non-nominees. The entire distribution of citations among award winners appears to be pushed upwards. Although Bartneck and Hu found an effect, they did not find a statistically significant effect.

Given the more recent work by Wainer et al. and Lee, I’d be willing to venture that the original null finding was a function of the fact that citation counts are a very noisy measure—especially over a 2-5 year post-publication period—and that the Bartneck and Hu dataset was small, with only 12 awardees out of 152 papers total. This might have caused problems because the statistical test the authors used was an omnibus test for differences in a three-group sample that was imbalanced heavily toward the two groups (nominees and non-nominees) in which there appears to be little difference. My bet is that the paper’s conclusion on awards is simply an example of how a null effect is not evidence of a non-effect—especially in an underpowered dataset.

Of course, none of this means that award winning papers are better. Despite Wainer et al.’s claim that they are showing that award winning papers are “good,” none of the analyses presented can disentangle the signalling value of an award from differences in underlying paper quality. The packed rooms one routinely finds at best paper sessions at conferences suggest that at least some additional citations received by award winners might be caused by extra exposure caused by the awards themselves. In the future, perhaps people can say something along these lines instead of repeating the “fact” of the non-relationship.

5 days ago

Omer Akram: Introducing PySide2 (Qt for Python) Snap Runtime from Planet Ubuntu

Lately, we have been using PySide2 for an internal project. Last week it reached a milestone, and I am now in the process of code cleanup and refactoring, as we had to rush quite a few things for that deadline. We also created a snap package for the project. Our previous approach was to ship the whole PySide2 runtime (170 MB+) with the snap; it worked, but it was a slow process, because each new snap build involved downloading PySide2 from PyPI and installing some deb dependencies.

So I decided to play with the content interface and cooked up a new snap that is now published to the Snap Store. This definitely resulted in an overall size reduction of the snap, but at the same time it opens a lot of different opportunities for app development on the Linux desktop.

I created a 'Hello World' snap that is just 8 KB in size, since it doesn't include any dependencies with it; they are provided by the pyside2 snap. I am currently working on a very simple "sound recorder" app using PySide2 and will publish it to the Snap Store.

With the pyside2 snap installed, we can probably export a few environment variables to make the runtime available outside of the snap environment, for someone who is developing an app on their computer.
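For context, an app snap consuming a shared runtime like this would declare a content plug in its snapcraft.yaml. The following is a minimal sketch of how such a plug generally looks; the plug name, content tag, and target path here are illustrative assumptions, not taken from the published pyside2 snap:

```yaml
# snapcraft.yaml fragment for an app consuming a shared PySide2 runtime
plugs:
  pyside2:                     # hypothetical plug name
    interface: content
    content: pyside2           # must match the content tag of the provider's slot
    target: $SNAP/pyside2      # where the provider's files are mounted at runtime
    default-provider: pyside2  # lets snapd auto-install the runtime snap
```

The app would then typically extend PYTHONPATH (and possibly LD_LIBRARY_PATH) to point into the mounted target directory so Python can import PySide2 from the shared snap instead of bundling it.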

7 days ago

Jonathan Riddell: from Planet Ubuntu

It’s not uncommon to come across some dusty corner of KDE which hasn’t been touched in ages and has only half-implemented features. One of the joys of KDE is being able to plunge in and fix any such problem areas. But it’s quite a surprise when a high profile area of KDE ends up unmaintained. The KDE website is one such area, and it was getting embarrassing. In February 2016 we had a sprint where a new theme was rolled out on the main pages, making the website look fresh and act responsively on mobiles, but since then, for various failures of management, nothing has happened. So while the neon build servers were down for shuffling to a new machine, I looked into why Plasma release announcements were updated but not Frameworks or Applications announcements. I’d automated Plasma announcements a while ago, but it turns out the other announcements are still done manually, so I updated those and poked the people involved. Then of course I got stuck looking at all the other pages which hadn’t been ported to the new theme. On review there were not actually too many of them; if you ignore the announcements, the website is not very large.
Many of the pages could simply be forwarded to more recent equivalents, such as getting the history page (last updated in 2003) and the presentation slides page (last updated for the KDE 4 release) to point to more up-to-date wiki pages.
Others are worth reviving, such as the KDE screenshots page, press contacts, and the support page. The contents could still do with some pondering on what is useful, but while these pages exist we shouldn’t pretend they don’t, so I updated them and added back links to them.
While many of these pages are hard to find, or not linked at all from the main site, they are still the top hits in Google when you search for “KDE presentation”, “KDE history”, or “KDE support”, so it is worth not looking like we are a dead project.
There were also obvious bugs that needed to be fixed: for example, the cookie-opt-out banner didn’t let you opt out, the font didn’t get loaded, and the favicon was inconsistent.
All of these are easy enough fixes, but the technical barrier is too high to get them done easily (you need special permission to gain access, reasonably enough) and the social barrier is far too high (you will get complaints when changing something high profile like this, so it is far easier to just let it rot). I’m not sure how to solve this, but KDE should work out a way to allow project maintenance tasks like this to be more open.
Anyway, yay: the website now has the new theme everywhere (except old announcements) and the pages have up-to-date content.
There is a TODO item to track website improvements if you’re interested in helping, although it misses the main one, which is the stalled port to WordPress; again, a place where it just needs someone to plunge in and do the work. It’s satisfying because it’s a high-profile improvement, but alas it highlights some failings in a mature community project like ours.

8 days ago

Ubuntu Podcast from the UK LoCo: S11E39 – The Thirty-Nine Steps from Planet Ubuntu

This week we’ve been flashing devices and getting a new display. We discuss Huawei developing its own mobile OS, Steam Link coming to the Raspberry Pi, Epic Games launching their own digital store, and we round up the community news.

It’s Season 11 Episode 39 of the Ubuntu Podcast! Alan Pope, Mark Johnson and Martin Wimpress are connected and speaking to your brain.
In this week’s show:

We discuss what we’ve been up to recently:

Martin has been flashing all the things: Lineage on the Nexus 4, Nexus 5, Nexus 7, Nexus 9, Nexus 10 and Moto X Style; UBports 16.04 on the BQ Aquaris M10 FHD.
Alan has been playing with a new display.

We discuss the news:

Huawei is Developing its own Mobile OS
Steam Link app comes to the Raspberry Pi
Fortnite and Unreal developer Epic Games are launching their own digital store

We discuss the community news:

Xubuntu Will Stop Producing 32-bit ISOs Beginning With Xubuntu 19.04
Snapcraft 3.0
Nautilus 3.30 – status
MATE panel running in Mir
New features in Forkstat

We mention some events:

FOSDEM ’19: 2nd to 3rd of Feb 2019 – Brussels, Belgium.

Image credit: Elijah O’Donell

That’s all for this week! You can listen to the Ubuntu Podcast back catalogue on YouTube. If there’s a topic you’d like us to discuss, or you have any feedback on previous shows, please send your comments and suggestions to or Tweet us or Comment on our Facebook page or comment on our Google+ page or comment on our sub-Reddit.

Join us in the Ubuntu Podcast Telegram group.

8 days ago