Thursday, December 13, 2012

Jamming Thursdays!

Right now as I type we have two jams going on! Last week Jono posted about enhancing the ubuntu.com/community page. If you're part of the community, join in raising the banner for your specific focus area. The fun is happening now on #ubuntu-docs. For the full details, see Jono's post. For us quality folks, the pad is here: http://pad.ubuntu.com/communitywebsite-contribute-quality. Feel free to type and edit away!

In addition, as Daniel Holbach mentioned, there is a hackathon for automated testing. Come hang out with us on #ubuntu-quality, learn, ask and write some tests. Again, the full details can be found on Daniel's post.

Come join us!

Wednesday, November 28, 2012

Our first Autopilot testcase

So last time we learned some basics of autopilot testcases. We're going to use the same code branch we pulled last time, now to cover writing an actual testcase.

bzr branch lp:~nskaggs/+junk/autopilot-walkthrough

As a practical example, I'm going to convert our (rather simple and sparse) firefox manual testsuite into an automated test using autopilot. Here's a link to the testcase in question.

If you take a look at the included firefox/test_firefox.py file you should recognize its basic layout. We have a setup step that launches firefox before each test, and then there are the three testcases corresponding to each of the manual tests. The file is commented, so please do have a look through it. We utilize everything we learned last time to emulate the keyboard and mouse to perform the steps mentioned in the manual testcases. Enough code reading for a moment, let's run this thing.

autopilot run firefox

Ok, so hopefully you had firefox launch and run through all the testcases -- and they all, fingers-crossed, passed. So, how did we do it? Let's step through the code and talk about some of the challenges faced in doing this conversion.

Since we want to test firefox in each testcase, our setUp method is simple. Launch firefox and set the focus to the application. Each testcase then starts with that assumption. Inside test_browse_planet_ubuntu we simply attempt to load a webpage. Our assertion for this is to check that the application title changes to "Planet Ubuntu" -- in other words, that the page loaded. The other two testcases expand upon this idea by searching wikipedia and checking for search suggestions.

The test_search_wikipedia method uses the keyboard shortcut to open the searchbar, select wikipedia and then search for linux. Again, our only assertion for success here is that a page with a title containing Linux and wikipedia loaded. We are unable to confirm, for instance, that we properly selected wikipedia as the search engine (although the final assertion would likely fail if this were not the case).
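Both title checks boil down to the same pattern: poll a property until it reaches the expected value, or give up after a timeout. Here is a stand-alone sketch of that pattern; the function and the fake window below are mine for illustration, not autopilot's API:

```python
import time

def eventually_equals(get_value, expected, timeout=5.0, interval=0.05):
    """Poll get_value() until it returns expected or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if get_value() == expected:
            return True
        time.sleep(interval)
    return get_value() == expected

# Stand-in for a browser window whose title changes once the page loads.
titles = iter(["Loading...", "Loading...", "Planet Ubuntu"])
last = ["Loading..."]

def window_title():
    last[0] = next(titles, last[0])
    return last[0]

loaded = eventually_equals(window_title, "Planet Ubuntu", timeout=2.0)
print(loaded)
# -> True
```

Polling with a timeout matters here because a page load is asynchronous; asserting on the title immediately after sending the keystrokes would fail most of the time.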

Finally, the test_google_search_suggestions method is attempting to test that the "search suggestions" feature of firefox is performing properly. You'll notice that we are missing the assertion for checking for search suggestions while searching. With the knowledge we've gained up till now, we don't have a way of knowing if the list is generated or not. In actuality, this test cannot be completed as the primary assertion cannot be verified without some way of "seeing" what's happening on the screen.

In my next post, I'll talk about what we can do to overcome the limitations we faced in doing this conversion by using "introspection". In a nutshell, by using introspection, autopilot will allow us to "see" what's happening on the screen by interacting with the application's data. It's a much more robust way of "seeing" what we see as a user than reading individual screen pixels. With any luck, we'll be able to finish our conversion and look at accomplishing bigger tasks and tackling larger manual testsuites.

I trust you were able to follow along and run the final example. Until the next blog post, might I also recommend having a look through the documentation and trying your hand at writing and converting some tests of your own -- or simply extending and playing around with what you pulled from the example branch. Do let me know about your success or failure. Happy Testing!

Monday, November 26, 2012

Getting started with Autopilot

If you caught the last post, you'll have some background on autopilot and what it can do. Start there if you haven't already read the post.

So, now that we've seen what autopilot can do, let's dig in to making this work for our testing efforts. A fair warning, there is some python code ahead, but I would encourage even the non-programmers among you to have a glance at what is below. It's not exotic programming (after all, I did it!). Before we start, let's make sure you have autopilot itself installed. Note, you'll need to get the version from this ppa in order for things to work properly:

sudo add-apt-repository ppa:autopilot/ppa
sudo apt-get update && sudo apt-get install python-autopilot

Ok, so first things first. Let's create a basic shell that we can use for any testcase that we want to write. To make things a bit easier, there's a lovely bazaar branch you can pull from that has everything you need to follow along.

bzr branch lp:~nskaggs/+junk/autopilot-walkthrough
cd autopilot-walkthrough

You'll find two folders. Let's start with the helloworld folder. We're going to verify autopilot can see the testcases, and then run and look at the 'helloworld' tests first. (Note, in order for autopilot to see the testcases, you need to be in the root directory, not inside the helloworld directory)

$ autopilot list helloworld
Loading tests from: /home/nskaggs/projects/

    helloworld.test_example.ExampleFunctions.test_keyboard
    helloworld.test_example.ExampleFunctions.test_mouse
    helloworld.test_hello.HelloWorld.test_type_hello_world

 3 total tests.


Go ahead and execute the first helloworld test.

autopilot run helloworld.test_hello.HelloWorld.test_type_hello_world
 
A gedit window will spawn, and type hello world to you ;-) Go ahead and close the window afterwards. So, let's take a look at this basic testcase and talk about how it works.

from autopilot.testcase import AutopilotTestCase

class HelloWorld(AutopilotTestCase):

    def setUp(self):
        super(HelloWorld, self).setUp()
        self.app = self.start_app("Text Editor")

    def test_type_hello_world(self):
        self.keyboard.type("Hello World")


If you've used other testing frameworks that follow in the line of xUnit, you will notice the similarities. We implement an AutopilotTestCase object (class HelloWorld(AutopilotTestCase)), and define a new method for each test (i.e., test_type_hello_world). You will also notice the setUp method. This is called by the testrunner before each test is run. In this case, we're launching the "Text Editor" application before we run each test (self.start_app("Text Editor")). Finally our test (test_type_hello_world) simply sends keystrokes to type out "Hello World".

From this basic shell we can easily add more testcases to the helloworld testsuite by adding a new method. Let's add some simple ones now to show off some other capabilities autopilot has to control the mouse and keyboard. If you branched the bzr branch, there are a few more tests in the test_example.py file. These demonstrate some of the utility methods AutopilotTestCase makes available to us. Try running them now. The comments inside the file also explain briefly what each method does.

autopilot run helloworld.test_example.ExampleFunctions.test_keyboard
autopilot run helloworld.test_example.ExampleFunctions.test_mouse

Now there is more that autopilot can do, but armed with this basic knowledge we can put the final piece of the puzzle together. Let's create some assertions, or things that must be true in order for the test to pass. Here's a testcase showing some basic assertions.

autopilot run helloworld.test_example.ExampleFunctions.test_assert
  
Finally, there are some standards that are important to know when using autopilot. You'll notice a few things about each testsuite.
  • We have a folder named testsuite.
  • Inside the folder, we have a file named test_testsuite.py
  • Inside the file, we have a TestSuite class, with test_testcase_name methods
  • Finally, in order for autopilot to see our testsuite we need to let python know there is a submodule in the directory. Ignoring the geekspeak, we need an __init__.py file (this can be blank if not otherwise needed)
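The conventions above can be sketched in a few shell commands; the testsuite and file names below are placeholders, not from the branch:

```shell
# Build the skeleton autopilot expects (run from the directory
# you will later invoke 'autopilot list' from).
mkdir -p helloworld
touch helloworld/__init__.py    # can be blank; marks the folder as a python submodule
touch helloworld/test_hello.py  # file name starts with test_
ls helloworld
```

With this in place, `autopilot list helloworld` run from the parent directory can discover any test methods defined inside test_hello.py.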
Given the knowledge we've just acquired, we can tackle our first testcase conversion! For those of you who like to work ahead, you can already see the conversion inside the "firefox" folder. But the details, my dear Watson, will be revealed in due time. Until the next post, cheerio!

Tuesday, November 20, 2012

A glance at Autopilot

So, as has been already mentioned, automated testing is going to come into focus this cycle. To that end, I'd like to talk about some of the tools and methods for automated testing that exist and are being utilized inside ubuntu.

I'm sure everyone has used unity at some point, and you will be happy to know that there is an automated testsuite for unity. Perhaps you've even heard the name autopilot. The unity team built autopilot as a testing tool for unity. However, autopilot has broader applications beyond unity to help us do automated testing on a grander scale. So, to introduce you to the tool, let's check out a quick demo of autopilot in action, shall we? Run the following command to install the packages needed (you'll need quantal or raring in order for this to work):

sudo apt-get install python-autopilot unity-autopilot

Excellent, let's check this out. A word of caution here, running autopilot tests on your default desktop will cause your computer to send mouse and keyboard commands all by itself ;-) So, before we go any further, let's hop over into a 'Guest Session'. You should be able to use the system indicator in the top right to select 'Guest Session'. Once you are there, you'll be in a new desktop session, so head back over to this page. Without further ado, open a terminal and type:

autopilot run unity.tests.test_showdesktop.ShowDesktopTests.test_showdesktop_hides_apps

This is a simple test to check that the "Show Desktop" button works. The test will spawn a couple of applications, click the show desktop button, and verify that your applications are hidden. It'll clean up after itself as well, so no worries. Neat eh?

You'll notice there's quite a few unity testcases, and you've installed them all on your machine now.

autopilot list unity

As of this writing, I get 461 tests returned. Feel free to try and run them. Pick one from the list and see what happens. For example,

autopilot run unity.tests.test_dash.DashRevealTests.test_alt_f4_close_dash

Just make sure you run them in a guest session -- I don't want anyone's default desktop to get hammered by the tests!

If you are feeling adventurous, you can actually run all the unity testcases like this (this will take a LONG TIME!).

autopilot run unity

As a sidenote, you are likely to find some of the testcases fail on your machine. The testsuite is run constantly by the unity developers, and the live results of commit-by-commit success or failure are actually available on jenkins. Check it out.

So in closing, this cycle we as a community have some goals surrounding easing the burden of testing for ourselves, freeing our resources and minds for the deeper and more thorough testing that automation cannot handle. To help encourage this move of our basic testcases towards automation, the next series of blog posts will be a walkthrough on how to write autopilot testcases. I hope to learn, explore and discover along with all of you. Autopilot tests themselves are written in python, but don't let that scare you off! If you are able to understand how to test, writing a testcase that autopilot can run is simply a matter of learning syntax -- non-programmers are welcome here!

Wednesday, October 31, 2012

UDS-R: Rise of the (quality) machines


Greetings from Copenhagen! I thought I would give a mid-UDS checkup for the quality team community. You may have already heard some of the exciting stuff that has been discussed at UDS. Automated testing is being pursued with full vigor, the release schedule has been changed, and cadence testing is in. In addition, ubuntu is getting into fighting shape by targeting the Nexus 7 as a reference platform for mobile.

I was honored enough to have a quick plenary where attendees here got to see and hear about the various automated testing efforts going on. Does that mean the machines have replaced us? Hardly! The goal with bringing automated testing online is to help us be more proactive with how and why we test. We've done an amazing job of reacting to changes and bugs, but now as a community I would like us to focus on being proactive with our testing. The changes below are all going to help set us firmly in this direction. By proactively testing things, we eliminate bugs, and repetitive or duplicated work for ourselves. This frees us to explore more focused, more interesting, and more in-depth testing. So without further ado, here's a quick rundown of the changes discussed here in Copenhagen -- hang on to your testing hats!

Release
The Release schedule has dropped all alphas, and the first beta, resulting in a beta and then final release milestone only. In addition, the freezes have been moved back a few weeks. The end result is the archive will not be frozen till late in the cycle, allowing development and testing to continue unencumbered. This of course is for ubuntu only. Which brings us to flavors!


Flavors
Flavors will now have complete control over their releases. They can choose to test, freeze, and re-spin according to their own schedule and timing. Some will adopt ubuntu's schedule, others may retain the old milestones or even do something completely different.


ISOs
ISOs will now be automatically 'smoke' tested before general release. No more completely broken installers on the published images! In addition, the ISOs will be published daily as usual, but will not have typical milestones as mentioned above. Preference will be given to the daily ISO -- the current one -- throughout the cycle. Testing will occur in a cadence instead of at milestones.

Cadence
Rather than milestones, a bi-weekly cadence of testing will occur with the goal of assuring good quality throughout the release cycle. The cadence weeks will be scheduled and will feature testing different pieces of ubuntu in a more focused manner. This includes things like unity, the installer, and new features landing in ubuntu, and the focus will also be informed by feedback from the State of ubuntu Quality effort.

State of ubuntu Quality
A bold effort to generate a high level view of what needs testing and what is working well on a per image basis inside of ubuntu. This is an experimental idea whose implementation will garner feedback early in the cycle and will collect data and influence decisions for testing focus during the cycle. *fingers crossed*

AutoPilot
This tool will integrate xpresser to allow for a complete functional UI testing tool. One of the first focuses for testcases will be automating the installer from a UI perspective, to free our manual testing resources from basic installer testing! From the community perspective, we can join in both the writing and executing of automated tests, as well as the development of the tool itself.

Hardware Testing Database
This continuing experiment will become more of a reality. The primary focus of the work this cycle will be to bring the tool, HEXR, online and to do basic integration with the qatracker for linking your hardware profiles. In addition, focused hardware testing using the profiles will be explored.

I hope this gives you a nice preview of what's coming. I would encourage you to have a look at the blueprints and pads for the sessions, and ask questions or volunteer to help in places you are interested. I am excited about the opportunities to continue bringing testing to the next level inside of ubuntu. I owe many thanks to the wonderful community that continues to grow around testing. Here's to a wonderful cycle.

Sunday, October 28, 2012

Readying for UDS

I trust everyone is readying themselves -- don't blink! Ubuntu UDS-R is already upon us. Those of you who have been watching closely may have heard about some of the planned sessions for QA, but if not feel free to take a look. Don't worry, I'll wait.

But wait, there's more! In addition, there is going to be an evening event where testing is the focus. It's happening Tuesday evening. The goal is to learn about some of the testing efforts going on inside ubuntu, including automated testing; and more importantly, to write some testcases! Folks will be on hand to help talk you through and discuss writing both automated and manual test cases.

Looking through the sessions, I hope you have the sense that testing is continuing to play a large role in ubuntu. And further, that you can be even more involved! UI testing, automated testing, testcase writing -- all of these are focus points this cycle and have sessions. Get involved -- and if you're at UDS, please do come to a session or two, add your voice, and grab some work items :-) Let's make it happen for next cycle.

Tuesday, October 9, 2012

Community Charity-a-thon: The Aftermath

I wanted to express my heartfelt thanks to everyone who contributed. To those who gave on behalf of the debian community, thank you as well! I stated that for every five donations I would do a manpage for a package that is missing one :-) I received just under 5 donations marked debian, but not to worry, I'll still create a manpage for one in need. Although I did other work during the marathon, I purposefully held onto creating the manpage until I was a bit more rested -- I have enough trouble speaking English sometimes without adding in sleep deprivation. The man page readers will thank me, and I'm sure those who get my page to review will as well.

To the rest of you, thank you very much. We raised $943.34 for WaterAid. That's amazing! I'm truly touched by your generosity. Here's the complete list of donors, hats off to all of you -- I know several of you donated anonymously, thank you!

Anonymous :-)
Cormac Wilcox
Gema

Anders Jonsson
Arthur Talpaert
Sam Hewitt
Alvaro

Ólavur Gaardlykke
Joey-Elijah Sneddon
steve burdine

Thomas Martin (tenach)
Daniel Marrable
sebsebseb Mageia
Jonas Grønås Drange
Gregor Herrmann
Mark Shuttleworth
phillw
Thijs K
Alvaro
Max Brustkern
Jane Silber
Gema Gomez-Solano
Martin Pitt
Michelle Hall


Now I know no one wants to re-watch that crazy 24 hours of video, but I wanted to bring you a few highlights as well. I spent time doing some of my normal work, but I also promised to do something outside the norm. I was able to scratch an itch, and although my on-air demo failed (an uh-duh moment), I was able to record this video immediately after, demonstrating where we in QA are focusing next cycle. In addition, there were several talks from QA personnel, and I recommend watching this clip if you're interested in hearing Rick's take on where ubuntu is going, and indeed how quality will play a role. You can skip to here if you only want to hear his take on quality. Now is a great time to be involved in QA -- I'm excited to see things unfold for 14.04, and I hope you are too.

For the readers who actually made it this far, I've got the best for last. There were some gags in those 24 hours; for instance, check out my chicken dance! (*cough* this was supposed to be a group thing *cough*). Ohh, and there's always this lovely screencap. To be fair, this was about 20 hours or so in.


Tuesday, October 2, 2012

Preparing for the Community Charity-a-thon

In preparation for the 24 hour marathon, I thought I would share with everyone my thoughts on why I chose my charity, and what I plan to do for 24 hours :-) To start off the post, let me get this right out front <insert flashing lights and whirlygigs> DONATE HERE </insert>

First the charity, WaterAid. I chose WaterAid upon realizing how important water is to me. I love water -- I love to stare out across a vast sea, or to sail along it using the wind, and hearing only the splashing of the waves, and the smell of life beneath it. And of course, I consume water each day in order to sustain life. I am happy to support an organization whose goal it is to provide sustainable water sources to everyone. Water is important to life, and is a basic need for us all as humans. We need clean water, and even moreso, we need sustainable access to it. Water is precious, and it's important for us to not pollute the water we have as well. WaterAid understands this need and works with locals to help create clean renewable water sources. Consider donating to help those who don't have access to the same resource we take for granted -- available anytime out of our faucet.

I'm also placing a call out to those who are interested in QA in both the Debian and ubuntu communities to participate. I plan to spend my time during the 24 hours doing something to further the work of how you interact with ubuntu and QA. So, to that end, I'd like to ask those of you who are interested in ubuntu to donate and install ubuntu during the marathon. I'll be here to provide technical installation support for you during the install. Let's see how many exotic configurations we can get successful installations on. Got a RAID setup or some exotic hardware? Multiple disks, partitions, and OSes? Get your systems backed up now, and let's try an install. NOTE, I'd encourage using the latest daily iso for installation, but you are welcome to also use beta2.

In addition for those of you in the Debian community, I am issuing a challenge for donations. For every 5 donations from the Debian community, I will write a missing man page from the list. I'll be focusing on things I use, but feel free to offer a suggestion during the marathon!

I would also issue a challenge to the greater ubuntu community. Do you have a problem that you are unable to solve within ubuntu? While I can't promise a fix for your issue, I will offer you my personal attention to help solve your problem. I'll help you file a bug, confirm it, or help you debug the problem to the best of my ability. I'll even offer my karma on askubuntu to your question ;-) If I get overwhelmed with donations, I'll pull the highest donators first -- but we do have 24 hours to fill! Note, I plan to do this work on Thursday, on-air, but you can donate in advance. Just leave me a note, or simply send me an email after your donation with your request if you donate in advance.

Finally, if none of the above suits you, I am happy to have a personal 1 on 1 match of highest level house of cards building, or another quick playing game of your choice on air. And don't worry, I'm ruthless competition, no pushover here!

Wednesday, September 19, 2012

Getting your bug fixed; the art of bug shepherding

We've all had this experience. We upgrade our hardware to the new version of ubuntu (or perhaps install ubuntu for the first time on some new hardware), and everything works perfectly. We quickly dive in and enjoy the new experience of our favorite OS. Well, it's almost perfect. There's just this one bug that is bothering you.

Perhaps instead you are a tester. Someone who lives and breathes for breakage and the latest and greatest raw code packaged from the hands of its developers. There's a new upgrade out and everything is perfect; well, except for that one bug.

So what do you do in these situations? Sadly some people may choose to do nothing but hope and wait. "Next cycle, or next release I'm hoping my bug will be fixed". Odds are, no one knows about your bug, and it can't be fixed if it's unknown. And if you are using proprietary software, "wait and see" is about the limit of your options. As a part of ubuntu however, we as a community can do so much more!

With that let me present my patented wizzbang approach to a successful resolution of your bug within ubuntu!
  
First, let me clarify what a successful resolution is. If you have been around ubuntu, you may have seen a bug that expired as 'incomplete'. This is clearly not a successful resolution. In my eyes, a successful resolution to a bug sees the status changed to 'won't fix', 'fix committed', 'triaged', etc.

Ok, so here's the steps:
  1. If you don’t know how to file a good bug, ask first! It's important to do your best to describe the problem you are experiencing, and if possible how to repeat the bug. Check out the post on askubuntu which has a nice summary of resources available to help you.
  2. File a good bug, using your newly formed knowledge from above :-)
  3. Get someone else to confirm it. This is important! If possible, have a friend confirm the bug on their system. Once they've confirmed it, have them mark the bug as affecting them as well on launchpad.
  4. Answer questions promptly when asked by others. Make sure you are getting emails from launchpad, and when someone asks a question on your bug, respond promptly.
  5. Get your bug triaged. If your bug is confirmed and filed correctly, the bug triagers should help triage the bug. If a long time has passed without this occurring, check to make sure your bug is in good order to be triaged. If so, asking politely for a triager to look at your bug on the #ubuntu-bugs channel is a good way to keep your bug moving.
  6. Help debug, test, and confirm any fixes that are made for your issue. If the developer spends time working on your bug, do what you can to help confirm it fixes your issue.
  7. Remember no one will care about your bug as much as you do! When you file a bug, commit to carrying it as far along in the process as you can.
That's it! There's no guarantee every bug you file now will receive your desired outcome, but you should see proper resolution, instead of your bugs expiring. By being a good participant you are ensuring you can feel good about the resolution of the bugs you file. Remember we are all human, and sometimes things get missed. Stick with your bug, and shepherd it through.

So is the process perfect? Not at all. We as a community still need to think more about improving our experience in dealing with problems. Not every "problem" encountered is a bug, and a process to better handle these problems is still worthy of thought. I invite those of you interested in this to look for a UDS session on the topic.

Special thanks to TheLordofTime and hggdh for their discussions surrounding bugs, and of course for our marvelous bugsquad without whom this would not be possible!

Friday, September 7, 2012

Global Jam with QA love

Global Jam is a unique time for folks to get together across the world and celebrate ubuntu. As part of the celebrations, there is an opportunity to download the latest beta (released yesterday!) and check out the next version of ubuntu. You can run it in a livecd or perhaps in a virtual machine. Either way, there's opportunities for you to test the common applications and report your results.

The testcases are available here: Ubuntu Global Jam Testcases. For some of you, this page may seem a bit funny, but fortunately there is a handy walk-through for you to understand how to use the tracker and specifically, how to report results against these tests. Check out the guides below. If you follow the link, you'll even find some video of yours truly giving you a visual demonstration.

https://wiki.ubuntu.com/Testing/QATracker
https://wiki.ubuntu.com/Testing/CallforTesting/Walkthrough

If you get stuck, remember your friends in #ubuntu-testing on freenode are also happy to help. Have fun jamming, and if you do test, Happy Testing! Most of all, celebrate ubuntu!

Thursday, August 23, 2012

Ubuntu QA goes social

Since I posted the survey results, some of the members of the qa community noticed the team was unknown to others in the greater ubuntu community and beyond. After posting the article and ending my day, no sooner did I awake than my inbox already contained links to several social media locations where groups had been set up. I was floored and impressed by everyone's excitement to let others share in the fun we have here in QA (great work everyone!).

So, what does that mean for everyone else? We've gone social! Check out the new groups, and if you're already a member of the social community in question, I would encourage you to join up if you're interested in qa. And for anyone who has been involved in the opportunities we undertake in qa, I would encourage you to write about them. Share them with your friends... in a word, communicate :-) You don't need to find a new medium, take qa and ubuntu to where you already interact!

https://wiki.ubuntu.com/QATeam/Contact

Tuesday, August 21, 2012

Call for Testing: Compiz & Unity

It's the testing event I've been waiting for! A new version of unity has arrived, chock-full of compiz fixes. The team has removed metacity completely, and also migrated to gsettings from gconf. The result is the removal of unity2d, and the enablement of llvmpipe on unity for those running without hardware acceleration.

Now, of course this work, in particular the settings migration, needs to be tested. This is where all of you come in! The package tracker now contains an entry for unity, complete with testcases for this migration. For those who helped test last cycle, you will also notice all of the checkbox testcases have been ported over as well! Many thanks to the unity developers for their help in migrating these tests. For this call for testing, the 'Unity GSetting Migration' testcases have been marked as mandatory, meaning those testcases are the primary focus. However, if you are able, executing the other testcases also helps the unity team ensure there haven't been any regressions.

Please note the 'Unity GSetting Migration' has steps for you to complete BEFORE you install from the ppa. Please read it first before diving in. Here's a link to the testing on the tracker. And if you are new, check out our wiki for a guide on using the qatracker to help test.

Now, as an added bonus, the unity developers have also tagged several multi-monitor bugs and have asked users to go through the list and confirm any bugs that they can. Read the bug report, and if you have a multi-monitor unity setup, see if it's still affecting you. Leave a comment on the bug with your result. The unity team wants to make sure the bugs are able to be triaged and get proper attention if they are still valid.

List of unity bugs, tagged multi-monitor

Thank you in advance for your help, and happy testing everyone!

Monday, August 20, 2012

The grand "Cadence" experiment

It all started innocently enough. A simple idea, turned into a simple post on the mailing list. This idea eventually led the ubuntu QA community to perform an experiment for this cycle, which has been dubbed "cadence testing".

Now, before I manage to confuse everyone with this "cadence testing" term, let's define cadence testing. Scratch that, let's just give a simple definition of what was intended by the original idea. If you want the whole story, read the thread. heh. I'll be waiting (hint: it's LONG!).

Cadence testing was intended to introduce regularity into testing. If the development release could be "stable" every day (which was the grand experiment during the precise cycle), could we not also test to ensure that things were good all throughout the release? If the daily images and archive were now at the quality of previous releases' milestones, could we just eliminate the milestone idea and go with a calendar schedule for testing? Thus, a proposal was made to test every 2 weeks, whether or not a milestone had been planned, and report the results.

Fast forward 2 months to today. So what happened? Well, I'm happy to report that the QA community, despite the confusion, more or less met the goal of testing the desktop images every 2 weeks (milestone or not). But what did this achieve? And where are the results?

Let's step back a moment and talk about what we learned by doing this. My comments are specific to the non-milestone cadence testing weeks. First, the development process inside ubuntu is still built around milestones. The daily images during cadence testing weeks were sometimes stable, and sometimes flat out broken by a new change landing from a development team. Second, the tools we used are built around milestone testing as well. Neither the qatracker nor any available qa dashboard or report has a good way to track an image's health across the cadence week. This meant it was both difficult to test and difficult to see the results of the testing. Finally, the development teams were not expecting results against the daily images and couldn't follow up well on any bugs we reported, nor were we able to coordinate well with the release team, as the bugs reported were not available in a summarized or meaningful way.

Now, I'll save the discussion on my ideas of a healthy QA workflow for a later post, but I think we can agree that testing without good result reporting, and without developer follow-up has a limited impact. So does this mean "cadence testing" was a bad idea? No, it was simply poorly executed. The trouble comes in the assumptions listed above.

The archive has not been "stable" every day, and development teams have continued development, pausing only as required by the current milestones. In addition, changes, even major ones (like the ubiquity changes landing a few weeks ago, or the nvidia change just this past weekend), are not well communicated. Since they land with little or no warning, we as a QA community are left to react to them instead of planning for them. In this environment, cadence testing makes little sense.

So was the experiment a failure then? In my mind, not at all! In fact, I think the future of ubuntu and QA is to push for complete adoption of this idea, and this experiment confirms the obstacles we will face in getting there. I'll be posting more about what this vision for QA looks like, but I'll leave you with a few thoughts until then.

In my mind, QA should enhance and improve developers, testers, and users lives and workflows. Our work is critical to the success of ubuntu. I would like to see a future where users receive regular, timely scheduled updates that the folks in the QA community have vetted by working with the development and release teams to deliver focused quality updates. The ideal workflow is more fluid, more agile and yes, it has a cadence.

Thursday, August 16, 2012

Quality Perceptions Survey Results

A couple Fridays ago I asked for feedback on how quality and the ubuntu QA team fared this cycle. That survey has now been completed, and I have some results to share with everyone. Before I dive into the numbers, let me take a moment to say thank you to all of you who responded. Thank you! I read all of the comments left as well, and all were helpful feedback. Remember, the survey was anonymous, so I cannot respond individually to anything written. Feel free to contact me if you wish to discuss anything further or to receive a response.

The second question on the survey asked rather simply, "What does quality mean to you?".

As it turns out, the most common answers mirrored those of a later question, in which I asked "What's the biggest problem with quality in ubuntu right now?".
Note, I read all of the "other" responses and categorized them into some new categories to display here for general consumption.
So there is some agreement amongst those polled, both about what quality means and about where ubuntu's biggest problems lie. According to the respondents, the largest issue with quality in ubuntu was the very definition of what "quality" is!

Now I asked this question for a specific reason. "Quality" is a subjective term! Perhaps I'll get some disagreement on this, but hear me out. All of the answers to the question are, in my mind, valid with respect to quality. As an example, let's say I asked you to recommend software to balance my checkbook. If I specified I wanted a quality piece of software, would you not recommend a stable (works without crashing), mature (good development/bug workflow), and easy to use (just works) piece of software that has a nice feature set (latest and greatest)? It's easy to see that "quality" can refer to all of this and more.

Still, in my mind, when I speak to wanting a "quality" release of ubuntu, I tend to focus on the stability and ease of use aspects. As the graphs indicate, the respondents seemed to echo this idea. In other words, it's really important to the idea of quality that things "just work". In the context of ubuntu this means applications run without crashing, and the operating system runs on your hardware. If things don't "just work", even if all the other indications of quality are true, you aren't likely to describe or perceive the product as having "good quality".

Let's ponder that thought for a moment and look at some more results. The survey captured about a 50/50 split of folks who run the development release, and over 70% run it or intend to run it before the final release.
So among those 50-70% who run or will run the development release, how many have participated in ubuntu qa?
Yikes! Only about a third. Just under half had no idea a ubuntu QA team existed. There's some clear evangelizing work to be done here. Let me pause here just for a moment to say the team does exist, and would love to have you!

Ok, now onto the last multiple-choice question.
I'm happy to see people desire to help! That's wonderful. The responses regarding time, technical ability, or knowing where to start are all very solvable concerns. I would encourage you to watch this space for invitations to help test. QA work comes in all shapes and sizes; sometimes it takes as little as 15 minutes and the ability to install/uninstall a package and reboot a machine. If this sounds like something you would be able to do, please start by having a look at our wiki page. Send us an email and introduce yourself. There are no requirements or forced participation, and we welcome everyone. And who knows, you might even learn something about ubuntu :-)

Ok, so I've shared the hard numbers from the survey, but I'd like to leave you with a few takeaway thoughts. First, while quality is subjective, our focus in ubuntu QA should be to have things "just work". That doesn't mean we shouldn't also help improve our development and bug processes, or continue to push for new applications and features, but rather that we ensure that our efforts help forward this cause.

I've said it before, but I want to help deliver good computing experiences. That story I shared when I introduced myself was close to home. My first interaction with the ubuntu community came via the forums, and yes, getting a printer to work. The community undertook work to change what was once a nightmare, not for the faint of heart, into child's play. The execution of this work is what defines the experience. This is where QA fits. We aren't just testing; we're delivering the result of the entirety of the ubuntu community's labor.

Judging from the survey results, many of you share this same vision. So won't you join us? QA transcends teams across the ubuntu community. I would encourage you to get involved and be a part of making it happen. The list of "problems with quality" reaches many areas. Would you be part of the solution?

Tuesday, August 7, 2012

Quality mid-cycle checkup

About 2.5 months ago I wrote about the plans for the ubuntu QA community for the quantal cycle. We were building off of lots of buzz from the precise release and we planned to undertake lots of new work, while being very careful to avoid burnout. Our focus was to take QA to the next level and help us communicate and grow as a team to take on the opportunities we have.

So, how are we doing? Let's go over each of the points noted in the original post and talk about the progress and plans.

ISOTesting
Our alpha 1 testing went very well, but alpha 2 and alpha 3 saw less participation. In addition, we were asked to test our isos every 2 weeks as part of a more cadenced approach, and we responded with a plan to do so. Overall, isotesting continues to be a weak spot for us as a community, and it is perhaps the most important piece of testing for the greater ubuntu community. The image we produce is literally the first experience many folks have with ubuntu. If it fails to install, well, there went our chance for a positive first impression :-( I would be happy to hear ideas or comments on isotesting in particular.

Application Testing
This work has been mostly completed. The package tracker now allows us to perform work that was done via checkbox or manual testing last cycle. We can now manage results, tests and reporting all in one tool -- and it's all publicly available. For more information about the qatracker, see this wiki page.

SRU Verification
This work is still on paper, awaiting the 12.04.1 release before further discussions and work begin.

General Testing (eg, Day to Day running of the development version)
I am still experimenting with how to enable better reporting and more focused testing here. The current plan is to track specific packages that are critical to the desktop, and to allow those running the development version to report how each application is working for each specific upload during the development release. This is done with the qatracker. I'll blog more about this and the results in a later post. Contact me, as always, if you're interested in helping.

Calls for Testing
This has been a wonderful success. There have been several calls for testing, and the response has been great. A big thank you to all of you who have helped test. We've had over 50 people involved in testing, and 41 bugs reported. The development teams and I thank you! But we're not done yet; unity testing, among other things, is still coming!

QATracker development
There is still room for more developers on the qatracker project. It's written in drupal, and I would be happy to help you get started. As we grow, there will continue to be a need for people who want to build awesome tools to help us as a community test. If you have ideas for a tool (or know of a tool) that would help us test, please feel free to share it with me.

Hardware Database
Work has been completed to spec out the design, and it is now scheduled to land this cycle rather than in a future cycle. Fingers crossed, we'll sneak this in before we release quantal :-) I'm very excited to share this new tool with you; as soon as it's complete we'll be able to incorporate it into our workflow on the qatracker.

Testcases
Done, and for the most part our testcases have been migrated over. In addition, there is now a special team of folks who help to manage and maintain our testcases. If you have a passion for this work, contact me and I can help get you involved with the team.

Overall, I am happy to see signs of growth and newcomers to the community. If you're on the fence about getting more involved with ubuntu, I would encourage you to check out QA. We collaborate with almost every area of ubuntu in some way, and no two days are the same :-) Send an email to the ubuntu-qa mailing list and introduce yourself.

So what's your opinion? Feel free to respond here with your thoughts and/or fill out the quality survey to give feedback.

Friday, August 3, 2012

Quality Perceptions Survey

What's your perception of quality this cycle? Are things working well for you? It's been several months now since precise landed, and ubuntu development for the next version has been ongoing. The ubuntu QA team has had a busy summer putting into place the new tools we spoke about at UDS. The qatracker has been revamped to allow us to consolidate our testcases and test reporting across all of our activities. In addition, we've been helping in the release of 3 alpha milestones and 3 testing campaigns. To all those who have helped in this testing, a very big thank you!

I have my own thoughts about the impact this testing has had on the ubuntu project, and I will continue to share them to point out the progress we make in this regard. But now, I want your input. I have created a survey to understand the community perspective on how we as a ubuntu project are doing on quality. If you have a few moments, please fill out the survey and let your thoughts and perspective be known. The survey will be anonymous, but I will share an aggregation and summary of the results.

My hope is to gain an understanding of how we can focus our efforts on what's important to ubuntu as a project in terms of quality, as well as how we can help you (yes, you!) become a more active part of QA if you're interested.

Here's a link to the survey. I'll leave it open until next Friday, August 10th. Thanks in advance for your participation.

Tuesday, July 31, 2012

A video walkthrough of ISO Testing

Notice I'm not calling it a good video walk-through, but it is in HD! It can be viewed on youtube here. I had a lot of trouble doing my first screencast, but I think I now have a promising setup for the future. I promise HD, lag-free, better-edited videos in the future. For reference, I highly recommend flowblade for editing and kazam for recording. Kudos to both developers for nice pieces of software!

The text version is available here, and I encourage you to follow along with the video. You don't need anything more than a reasonably modern pc that can run a VM to help iso test. Take 20 minutes to watch and follow along, and take the fear out of iso testing. Then join us next week as we continue our testing cadence and verify our daily isos are in good shape.

And don't worry, no one will know you're running the installation on a separate workspace while you watch youtube videos. It'll be our secret.

Wednesday, July 25, 2012

What can webapps do? Help test and see!

As I'm sure you've already heard, webapps are becoming first-rate citizens on ubuntu. Some of you might even have installed the ppa and are enjoying exploring what webapps do for you on the desktop. If you've been one of those early adopters and are wondering exactly what webapps can do, or how you can help squash bugs, this is your chance!

The webapps team has made available some manual testcases that cover the functionality found in webapps currently. These testcases have been made available via a 'call for testing' for webapps. Head over to the package tracker and select the tests corresponding to your version of ubuntu. For precise, Webapps Precise Testing, and for quantal Webapps Quantal Testing.

Never participated before in a call for testing? No worries! Check out this handy walk-through and you'll be submitting results in no time. Thanks for helping test!

Tuesday, July 24, 2012

Remembering the good ole days

Quality has been a buzzword for a couple releases now. Certainly, it's safe to say we saw many more people talking about quality last cycle than any previous cycle I can remember. Ubuntu of course has done LTS's in the past, but something about this past release was different. People yearned again for the perceived quality of the past LTS release and wanted to see ubuntu succeed.

Ahh yes, the good ole days of the 8.04 or 10.04 LTS releases, when everything was right in the world and ubuntu just worked, etc. Heh, it's easy for us to remember the past and the good things we enjoyed. But to the extent those releases served me well, I have the community to thank. Ubuntu is community, and its success or failure is determined by all of us.

Open source projects are not typically known for their quality. In the same way ubuntu has innovated and paved the way for a consumer-focused desktop linux, I believe we as a community are uniquely positioned to show how open source can be of better quality than competing ideological offerings. In the same way a minuteman can win a battle against a better equipped mercenary, so too can a dedicated community provide better quality software than a commercial offering. This only makes sense. I would rather work with a group of passionate people than with folks who don't care about their work.

One of the best parts about getting to work in QA is seeing the end result of the entire community's work. Those who perform QA represent the last pair of eyes for the work our community does. The developers, translators, doc writers, bug triagers, forums and IRC admins can all do marvellous work in support of ubuntu. But if it doesn't install/work on your pc, then sadly you won't be able to enjoy any of that work.

Recently, those running the development version of ubuntu noticed a bug that caused their webcam to no longer work. The community responded by helping figure out exactly which commit caused the regression. Many folks were involved here! People with the issue tested different kernel versions to narrow down when the regression occurred, while the kernel team made these kernels available and provided insight. Ultimately, with everyone's help, the problem code was identified and confirmed by testing a custom-built kernel. Armed with this knowledge, an upstream bug was created.

Now as I sit here today, my webcam works again! And not just for me, or other ubuntu users, but for everyone who uses the linux 3.5 kernel. After upstreaming the bug, the original developer made a fix available, which we tested. Ultimately, this fix made it into the 3.5 kernel, preventing it from shipping with broken logitech webcams for everyone. This is the power of our community!

The story is but one of many examples of problems the ubuntu community has solved. If you enjoyed or remember the good ole days in ubuntu, I would encourage you to get involved. If you're running ubuntu on your desktop now and want to continue doing so, consider donating your skills to help. A healthy QA community is vital for ubuntu to continue to grow and get better. We, as a community, can be the standard for quality in open source. So how can we take ubuntu, and specifically quality in ubuntu, to the next level? It starts with you.

Monday, July 16, 2012

Testing in Cadence

Last month, an interesting thread emerged on ubuntu-devel: a proposal to change the way we as ubuntu look at testing and quality. In many ways it was more a codification of ideas and thoughts from the precise cycle than a proposal.

One of the outcomes of this was a change to how we test isos. Rather than focus on arbitrary moments in time, we've been asked to stick to a two-week cadence for testing: a regular checkup of our images every two weeks. Quite a task, but not impossible! Given that the change happened mid-cycle, there has been some confusion over what exactly this means. I decided to put together a post detailing exactly what's on the table for us as a community and, more importantly, how you can help!


If you have a look at this wiki page, I've listed the images we test and produce for ubuntu. So far during the quantal cycle we have achieved 100% coverage of the mandatory testcases for most of these isos. That's thanks to the wonderful efforts of folks like you testing isos! But in some cases, like our first non-milestone cadence last week, the coverage was provided by a single person -- meaning we have only one confirmation of success or failure.

I'd like us as a community to take this to the next level. I'm asking you to commit to an iso over the course of the cycle. Would you be willing to run through the mandatory testcases every 2 weeks and make sure your iso is in good shape? If so, sign up on that wiki page, underneath the iso in question. Don't be afraid if this sort of testing sounds scary; I and the rest of the community are happy to help you through your first testcase. As part of iso testing, I'm still growing my knowledge of linux and ubuntu, and interacting with wonderful and talented people while doing it. This is a natural expansion of the 'adopt an iso' campaign with a new cadence. PowerPC and Mac users, this is your chance to make a difference, as your hardware is less common and therefore harder to ensure proper testing for.


During the precise cycle, over 100 people submitted a result to the isotracker for an iso they downloaded and tested as part of daily testing. That's excellent work, and I thank all of you very much! Many more of you downloaded and installed isos throughout the cycle, but perhaps didn't report your work. I would encourage you to get involved and share your results with others. If we have 100 people sign up in support of iso testing, the workload required of each individual will be quite small. Yet the benefits for us as a community will be huge. More hardware and more testing results in more bugs caught sooner. We all want a good upgrade experience in October. This is your chance to be a part of making sure it happens.


As a small addendum, I'd like to point out the results of the work this testing achieves. If you have a look on this page, you'll notice a very long list of bugs; many of which are rated as high or critical in launchpad. All of these bugs were found during iso testing -- a testament to those who have tested before us. We all thank you.

Monday, July 2, 2012

Call for Testing: 12.10 kernel on 12.04 -- Part Deux

As announced earlier, the kernel team is looking for some folks to help bring the 12.10 kernel to 12.04. Once 12.10 has been released, the team wants to enable newer hardware support via the kernel for the LTS version of the desktop. Since the original announcement, we've had 10 people help test the various builds of the kernel from the ppa. Thanks so much to all who tested! Now we'd like to take this testing to the next level.

We've put together a list of commonly used hardware for which we want to ensure proper kernel support. I'm asking for volunteers to run the kernel from the ppa on precise and report results. The catch here is that we want each piece of hardware listed to be represented by at least one person. Make sense? For example, we want at least a couple of people running nvidia cards to report results using both nouveau and the proprietary driver. Same for AMD and Intel. On the wireless side, our goal is to find someone with a chipset from each of the manufacturers listed. This is a first step in our on-going efforts to make testing and quality a more assured and quantitative effort. We're going 'quantal' if you will.

For those who want even more detailed and specific hardware testing, hang tight. If you'll remember, this past UDS we spoke about creating a community hardware database. Work to enable this is on-going, and I hope to be able to share more about it in the coming months. In the meantime, let's build up a list of folks and systems ready to populate such a database, shall we? ;-)

So if you're interested in helping, go ahead and edit that wiki page. Add yourself under one or more pieces of hardware. There's a handy script that should help you identify what's in your system if you're not quite sure. Then head over to the QATracker.
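The wiki's helper script isn't reproduced here, but as a rough sketch of the kind of inventory it gathers (an assumption on my part; the actual script may differ), a few standard commands can summarize the bits that matter for this call for testing:

```shell
#!/bin/sh
# Hypothetical hardware-summary sketch -- not the actual wiki script.
echo "Kernel: $(uname -r) ($(uname -m))"

# CPU model, taken from the first 'model name' line in /proc/cpuinfo.
grep -m1 'model name' /proc/cpuinfo | cut -d: -f2-

# Graphics and network adapters, if lspci (from pciutils) is available.
if command -v lspci >/dev/null 2>&1; then
    lspci | grep -Ei 'vga|3d|network|ethernet'
else
    echo "lspci not found -- install pciutils to list PCI devices"
fi
```

The output gives you the card and chipset names to match against the hardware listed on the wiki page before adding yourself.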


Once there, click on 'Quantal kernel for precise LTS' and you will wind up on a page showcasing the tests and instructions for this call for testing. If you click on 'Link to the installation information' you will get information on installing and uninstalling the package. Similarly, the 'Link to bug reporting instructions' provides details on reporting a bug you find in the test case. Finally, if you click 'Kernel Smoke Tests' you'll arrive on the page to report your testcase results. Note you will need to sign in using your ubuntu sso account to report results.

If you encounter issues, you can always reboot into your current working kernel and be back to normal. As always, if you have any issues in using the tracker, feel free to get in touch with me.
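As a quick sanity check before and after installing the test kernel, you can confirm which kernel you're actually booted into and verify that your previous, known-good kernel is still installed (it stays selectable from the GRUB menu if you need to fall back). A minimal sketch, assuming a Debian/Ubuntu system:

```shell
#!/bin/sh
# Show the kernel currently running -- after rebooting into the test
# kernel, this version string should change.
uname -r

# List installed kernel images; 'ii' marks installed packages, and the
# old kernel should still appear, so you can boot back into it from GRUB.
dpkg --list 'linux-image*' | grep '^ii' || true
```

If `uname -r` still shows your old version after a reboot, the test kernel didn't take, which is itself worth reporting on the tracker.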

Wednesday, June 27, 2012

Bleeding Orange and Purple

So today I headed over to ubuntu planet to get my daily fix, when I happened to spy my floating head next to a post. What's this? The last post from my blog was published! That simply won't do for an introduction, so let this post serve as one!

The reason my blog appeared in the planet blogroll is the same reason I now write this post. I would like to introduce myself as a newly minted ubuntu member! I couldn't be happier to bleed orange and purple. I have been 'around' ubuntu for a long time, but only recently began pursuing a more active role in the project. This culminated in me joining Canonical as the QA Community Coordinator [1] and now becoming a member :-) I would encourage anyone who might want to know more about me to simply visit the about me and contact me pages above. They have links to goodies like my ubuntu wiki page, launchpad information and contact details.

To those of you within the community who I've been working with already, thank you! It's been a blast getting to know everyone. And for those whom I haven't yet had the pleasure of meeting or working with, feel free to say hello. Join a call for testing, or other testing event and get some hands on time with myself and the other folks in QA; we always enjoy meeting new faces and new people.

You can expect to see more QA and testing posts from me in this space. I hope to hear more from all of you as well. Happy Testing!

[1] Yes, Jono Bacon is my boss, and no, it's not quite as cool as you think it would be. jk <3 u Jono!

Thursday, June 7, 2012

Call for Testing: 12.10 kernel on 12.04

EDIT: This has been migrated to the production instance. Check out the new post here to get involved.

The first calls for testing for this cycle are happening! I am excited not only to announce the opportunity to help test this new kernel, but also to unveil some of the new qatracker features that allow us to better serve our needs for calls for testing.

Last cycle, calls for testing were a manual thing -- I asked, and the community responded, following along using instructions from a blog or mailing list post. Now we're going to put some more structure around this, as I spoke about earlier.

The kernel team is committing to keeping precise up-to-date by providing kernels from future releases in precise. The first of these will be the 12.10 kernel, which will land in precise as part of the normal 12.04.X updates. In order for that to happen, the team is making kernels available to test on 12.04. If you need a newer kernel for hardware enablement, this is the kernel you are encouraged to run and report on. For those running the mainline kernel, this differs in that this kernel carries the ubuntu patchset, and official support for running it on 12.04 will begin once it is pushed via update.

So we're really testing a couple things here -- the 12.10 kernel on 12.04, but also the new qatracker. Feedback is encouraged on the qatracker also! Ok, so how does this work?

First, you'll need to use the staging site for the qatracker, where a package tracker has been set up. If you click on the 'Quantal kernel for precise LTS' link, you will wind up on a page showcasing the tests and instructions for this call for testing. If you click the 'Link to the installation information' link you will get information on installing and uninstalling the package and filing bugs against it. Especially note the instructions for filing a bug properly; additional information is requested to make your bug report more helpful to the development team. If you click the 'Kernel Smoke Tests' link you'll arrive on the page with the instructions for the testcase. If you log in to the tracker using your ubuntu SSO credentials, you will be able to report results as well. This should look very familiar to those of you who have used the isotracker in the past. Neat, eh? If you have any issues using the tracker, feel free to get in touch with me.

I'm asking those folks willing to help test to please head over to the qatracker and submit results. Note that the qatracker emails are turned off, but otherwise everything should function as expected for you. To leave feedback on the new site, file a bug and mention you're using the new staging qatracker. Contributions to the qatracker are welcome and encouraged; contact me if you're interested in helping out.

Tuesday, May 29, 2012

Adopt an ISO: Quantal Style

It's almost hard to believe, but the new cycle is starting to ramp up. In just over one week's time, we'll be putting out an alpha 1 iso for quantal! If you remember, last cycle I began an adopt-an-iso campaign to help ensure precise got the iso testing coverage it needed to be an awesome release. This cycle, that campaign will continue with an open call for folks to adopt an iso and help test it all cycle long. Instead of me managing and updating all of our excellent testers via email, I am asking you to subscribe to the iso you're interested in adopting and to help make sure it's in a ready state for each milestone release. Subscribing to the iso will alert you via email when there is a new build ready for you to test, so you don't need to watch the page or await an email from me. If you miss the personal emails from me, you may of course contact me at any time you wish :-)

Interested? Awesome! This wiki page should detail everything you need to get started. Specifically, you should ensure you subscribe to the testcases for the iso you wish to adopt. For example, if I am interested in Ubuntu Desktop i386, I would head over to this page. See that button at the bottom called subscribe? Hit it and you should be subscribed to new builds for that iso. Please note the subscription feature is a work in progress, and there is not (yet!) a management page for subscriptions. Additionally, there is no visual indication on the page that your subscription is active (contributions welcome; contact me if you know drupal and wish to help!). Please read the wiki page on ISO Testing for more information on confirming your subscription in the interim.

Ok, great, so now you're subscribed. There's just one piece left in making sure things go well for your iso. This cycle we are trying something new to help make the alpha, beta, and final release process smoother for all of our isos (and you!). We would like our adopters to run the daily iso before we spin the first candidate for release. What this means for you is that the week before each milestone release date, go ahead and try testing the daily version of the iso. Think of it as a warm-up for the big day. After all, you don't want your iso to be the one causing the re-spin, do you?

This schedule will come in handy. It shows the timeline for how and when we'll be undertaking this testing. I know it looks big and scary, but focus on just the first column, called 'Community Testing'. See the week of May 31st with 'Q-D' listed by it in the 'Community Testing' column? If you glance at the 'Legend' at the top you will notice that 'Q' stands for quantal, and 'D' stands for daily. You will also notice that the Alpha 1 release for quantal is scheduled one week after. So our goal is to spend this week getting our isos in shape for the first spins for Alpha 1. The good news is time spent now is respins saved later. Happier isos make happier users!

Before I close I do want to remind everyone that each cycle is a marathon. We spoke at UDS about burnout, and that is something to keep in mind. Pace yourself and share the work. We'll have a wonderful cycle together. As always, contact me if you need any help. Your response was overwhelming for ubuntu precise, let's keep going strong for quantal. Thanks for helping make ubuntu better for everyone!

Quality in Quantal: A community perspective

I originally posted this wonderful wall of text on the ubuntu-qa mailing list. If you want to get involved in QA on ubuntu this cycle, you should subscribe to that list. Additionally, sign up for the ubuntu testing team. Monitoring this blog and @UbuntuTesting will also keep you informed.


ISO Testing
My goal is to help ensure things are smooth before milestones and before iso testing events. Before we spin an iso, we want to feel good about what's going onto that iso, and we as a community can help make that happen. Overall, I want each individual to have a lighter workload than last cycle, despite a similar amount of overall work to achieve. To do this, I'd like to help enable more people to test, and to expand the 'adopt an iso' program so that folks can focus on testing things they like and are able to test without becoming overwhelmed or burnt out. Additionally, respins will be a continuous focus, and communicating what has changed and what needs testing will be a priority. As a community we want to avoid re-work and extra work, and dedicate ourselves to performing quality testing, not merely producing a large quantity of testing.

Application Testing
Last cycle we utilized checkbox to deliver manual application tests. During UDS, we spoke of expanding the isotracker to do our testcase management, and thus consolidating our application testing by using the same tool used for the isotracker to create an application tracker. This work is ongoing, but should be finished at some point during the cycle so we can adopt and use it. In the interim, we will continue utilizing checkbox or doing manual testing via blogs, mailing lists, etc.

SRU Verification
SRU verification is currently a manual process with a high learning curve and little visibility for many people. During the cycle, we hope to help change that by utilizing a new tracker to do SRU testing. This testing will involve running the stable version of ubuntu (currently precise), but testing fixes to individual packages. That makes it a good fit for those who aren't living on the bleeding edge but wish to help. When this process is ironed out (sometime during the cycle), I will contact everyone again with information on how to get involved.

General Testing (e.g., day-to-day running of the development version)
Some good feedback was given on how to make this better. There are a few things we would like to do to improve this process. First, day-to-day changes should become easier to follow, thanks to some proposed changes to update-manager to better display changelogs for updated packages. I'll also be detailing some information about how 'whoopsie' works and what it means to you. In addition, keeping the development release stable at all times will continue to be a priority for the development teams.

Calls for Testing (a specific feature, new features of a critical package, or focused testing on a specific package)
Last cycle this typically involved me posting a basic testplan on my blog with instructions on how to help test. This cycle, we again hope to consolidate this onto a tracker where the tests and results can be recorded. I will still be utilizing my blog, the @ubuntutesting twitter account, this mailing list, and our IRC meeting to publicize events like this for people to get involved and contribute. It's always fun to see new features before they come to everyone else, and the feedback loop with the developers was welcome on both sides.

QATracker Development
With these changes to the qatracker, there is room for folks who know python and django to get involved and help improve the qatracker codebase to make testing and reporting easier. Contact me, or simply have a look at the code on launchpad and start hacking:
lp:~ubuntu-qa-website-devel/ubuntu-qa-website/drupal7-rewrite
lp:~ubuntu-qa-website-devel/ubuntu-qa-website/drupal7-rewrite-testcase-management
lp:~ubuntu-qa-website-devel/ubuntu-qa-website/python-qatracker
lp:~ubuntu-qa-website-devel/ubuntu-qa-website/python-qatracker-testcase-management
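If you want to start hacking on one of the branches above, a minimal sketch looks like this (assuming you have the bzr client installed and network access to Launchpad; the local directory name simply mirrors the branch name):

```shell
# Fetch a local copy of one of the qatracker branches listed above.
bzr branch lp:~ubuntu-qa-website-devel/ubuntu-qa-website/drupal7-rewrite

# Move into the working tree and skim the recent history before making changes.
cd drupal7-rewrite
bzr log --limit 5
```

From there you can commit locally with `bzr commit` and push your changes to your own Launchpad branch for review.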

Hardware Database
The idea of having a hardware database for testing is not a new one, but work has begun anew. This work will extend beyond this cycle, but ideas are being explored for using ubuntu friendly and other tools to make it a reality.

Testcases
As a testcase management system will soon be in place (hurray!), we'll be migrating all of our testcases over to it. That means we'll have much better visibility and easier maintenance for all of our testcases. Cleaning up and expanding the number of testcases is definitely a goal for the cycle, so expect to hear more about getting involved in this area.

Whew, that's a wall of text, but I hope it helps outline the plans for the cycle. Feedback is appreciated and encouraged. Happy Testing!

Wednesday, May 23, 2012

Executing on an idea; a UDS story

UDS is now behind us, and the excitement of the work that lies before us for the next cycle is fresh on our minds and hearts. Last cycle I solicited and received some amazing ideas for improving how we as a community do QA inside of ubuntu. As UDS neared I encouraged many of those with ideas to participate in UDS by attending, signing up for work items, and advocating their ideas.
This is a key portion of being a part of the community -- you must be willing to act. If you are unwilling to act upon your own idea, why would anyone else? If you don't believe in it, no one else will. Own the problem you wish to solve and you will find others who share your passion along the way to help you achieve your goals. This is the heart of open source.
But how? How can I act? What if the problem is outside of my skillset? Because of the greater community and the nature of open source, you don't have to solve all of the problem by yourself. As you undertake work to execute your idea, you will find it attracts those who are of like-mind and similar persuasion to you. The best part is that they will have different skillsets to bring to the problem and can help you accomplish more than you could alone.
In a previous job, I was given the freedom to spend a percentage of my time on anything I chose, provided I could convince two of my workmates to help out. That requirement was a litmus test for my idea: if the idea had merit, I should be able to convince my colleagues to work on it with me. Ubuntu is one of several open source projects to operate on this idea of 'meritocracy'. The basic premise is to have the best people making the most informed decisions possible about problems specific to their expertise. This is achieved by granting decision-making authority to anyone who demonstrates their ability to exercise it by contributing to the project.
So, returning to UDS I would like to tell you a small story of just one example of executing on an idea. Let me introduce Paolo Sammicheli to you. Paolo is from the Italian Loco team, and has been active in driving growth in the localized iso community. He began his work by starting an "Italian Testing Team" several UDS's ago, and has been advocating greater testing and community participation for several cycles now. This past UDS, Paolo wanted to help kickstart a localized iso community beyond just his Italian loco iso. Before UDS, he had already produced a set of wiki pages documenting how to use the isotracker admin features with a bent towards running your own localized iso tracker. Additionally, the Italian loco team planned and tested during the 12.04 cycle to create a localized ubuntu 12.04 image for release. Finally, Paolo came to UDS and created a blueprint so he could share his idea with others. Have a look at it yourself:

https://blueprints.launchpad.net/ubuntu/+spec/community-q-localized-iso-community-growth

Paolo was able to generate good ideas and see other people attempt to replicate his work within their own locos. Plans were made to have two other loco teams produce localized isos this cycle, and ultimately use their findings as a model for future loco teams. Although the work is ongoing this cycle, Paolo, I think, has been successful at bringing his idea to life.
How can you replicate Paolo's example? A couple key points I see in what happened.

  • Lay the groundwork
    • Start proving the idea out as best you can. Perhaps it's a demo or prototype -- maybe even just a specification or a storyboard. You need to convince yourself (and others!) that your idea makes sense and can be done
  • Tell others
    • Let others know about your work. Blog about it, come to UDS, present it at an Ubuntu User Days event, post it to the forums, talk to people on IRC about it, etc.
  • Do it
    • This is key. You need to start executing your idea as best you can. People are not going to make your idea a reality without you! (and why would you want them to? It's your idea! Own it :-) )
  • Share your work
    • Invite others to work with you on your idea. It's helpful to have specific and easy ways to get involved, but don't limit people. You want to work openly in a way that anyone can participate at any level.
Go forth and own your ideas! I empower all of you to do so. Who knows, maybe your OS also won't "just be a hobby, won't be big and professional like gnu".

Wednesday, May 16, 2012

UDS Highs and Lows

As you may have seen or heard, I attended UDS-Q last week in Oakland, California. I was privileged to serve as the track lead for the QA track. I trust I served all of you brave souls willing to attend the QA track sessions well enough ;-) A big thank you to everyone who participated. We set some big goals for this next cycle, which I'd like to share over the coming days. For now, I'm going to provide the list (perhaps not comprehensive, but I tried) of blueprints I have work items on, as a sneak peek at what we discussed:


You can see more of the 'qa' blueprints here:


A few key takeaways can be found in them:
  • Expansion of the isotracker capabilities to do testcase management
  • Ideas around helping reduce burnout, duplication of work, and in general make the testing experience easier and more fun
  • Utilizing ubuntu friendly for testing
  • Scrapping the team organization ideas I proposed earlier in the cycle; instead, better communication between groups will be pursued
  • Ideas for community involvement in automated and regression testing
One thing you won't find in the blueprints, but which is still an important goal, is QA representation at the next UDS. I want to see more of you there next time! 

So, this post was intended to be about the highs and lows of UDS. Well, let's call the first night's dinner and jetlag a definite low, and the closing party a definite high :-) After five days of discussions I was ready to head home, but the closing party (despite the soberness of leaving) lifted my spirits and helped UDS end on a positive note.

Hmm, perhaps an image or two would explain it better...

This was high, and below is the low :-)

Thursday, April 26, 2012

Thank you QA Community!

Your commitment to quality and excellence has shown itself in this release. People love numbers, so let me spill some!

We had 13 calls for testing this cycle, with 6 iso testing milestones, and lots of bug reports, support and users using the development version of precise. Additionally, we set a new record for iso testing the precise final isos! Have a look yourself!

https://wiki.ubuntu.com/QATeam/ReleaseReports/PreciseFinalTestReport

That's 114 people who helped make sure your ubuntu precise experience was going to be a good one out of the box. Thank you to all of our testers this cycle!

https://wiki.ubuntu.com/PrecisePangolin/ReleaseNotes/Credits/Testers

Wednesday, April 18, 2012

ISO Adoptions

Ubuntu Desktop i386

Yesterday, ~70 people (and counting! you guys are rockstars, thank you so much!) answered the call to adopt, and are now sponsoring over two dozen isos for ubuntu and its flavors, like xubuntu, kubuntu, edubuntu, lubuntu and ubuntu studio. That is wonderful news.


However, there are a couple of sad isos still out there waiting to be adopted. They are the more troublesome ones to adopt, if you will. Everyone loves and wants to help the Ubuntu Desktop iso, but few pay attention to our mac- and wubi-specific testing. If you have access to a windows machine and can help test wubi, please let me know! If you have access to a macbook or other mac hardware, please also let me know! I can help you adopt those troublesome isos and make ubuntu precise a better experience for those with similar hardware.

Ubuntu Desktop amd64+mac
Look at his face and then hit your compose button to email me. You'll be glad you did. Contact me at nicholas.skaggs at the canonical.com domain and I will make sure to get you connected to one of these little guys!