Thursday, August 23, 2012

Since I posted the survey results, some members of the QA community noticed that the team was unknown to others in the greater ubuntu community and beyond. After posting the article and ending my day, no sooner did I awake than my inbox contained links to several social media sites where groups had been set up. I was floored and impressed by everyone's excitement to let others share in the fun we have here in QA (great work, everyone!).
So, what does that mean for everyone else? We've gone social! Check out the new groups, and if you're already a member of the social community in question, I would encourage you to join up if you're interested in QA. And for anyone who has been involved in the opportunities we undertake in QA, I would encourage you to write about them. Share them with your friends... in a word, communicate :-) You don't need to find a new medium; take QA and ubuntu to where you already interact!
https://wiki.ubuntu.com/QATeam/Contact
Tuesday, August 21, 2012
Call for Testing: Compiz & Unity
It's the testing event I've been waiting for! A new version of unity has arrived, chock-full of compiz fixes. The team has removed metacity completely and also migrated from gconf to gsettings. The result is the removal of unity2d, and the enablement of llvmpipe on unity for those running without hardware acceleration.
Now, of course, this work, in particular the settings migration, needs to be tested. This is where all of you come in! The package tracker now contains an entry for unity, complete with testcases for this migration. For those who helped test last cycle, you will also notice that all of the checkbox testcases have been ported over as well! Many thanks to the unity developers for their help in migrating these tests. For this call for testing, the 'Unity GSetting Migration' testcases have been marked as mandatory, meaning that testcase is the primary focus. However, if you are able, executing the other testcases also helps the unity team ensure there haven't been any regressions.
Please note that the 'Unity GSetting Migration' testcase has steps for you to complete BEFORE you install from the ppa. Please read it in full before diving in. Here's a link to the testing on the tracker. And if you are new, check out our wiki for a guide on using the qatracker to help test.
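If you'd like a quick sanity check to run alongside the formal testcase, here's a minimal sketch of the idea: note a handful of your gconf values before you upgrade, then confirm the matching gsettings values afterwards. This is not the official testcase, and the key and schema names in it are assumptions for illustration:

```python
#!/usr/bin/env python
# A minimal sketch, NOT the official testcase: spot-check that a few
# gconf values survived the migration to gsettings. The key and schema
# names below are illustrative assumptions only -- the real 'Unity
# GSetting Migration' testcase on the tracker lists the authoritative keys.
import subprocess

# Hypothetical (gconf key, gsettings schema, gsettings key) triples.
KEYS = [
    ("/apps/compiz-1/plugins/unityshell/screen0/options/launcher_hide_mode",
     "org.compiz.unityshell", "launcher-hide-mode"),
]

def gconf_get(key):
    # gconftool-2 --get prints the value stored for a single gconf key
    return subprocess.check_output(["gconftool-2", "--get", key]).strip()

def gsettings_get(schema, key):
    # gsettings get prints the value stored for a schema/key pair
    return subprocess.check_output(["gsettings", "get", schema, key]).strip()

for gconf_key, schema, gs_key in KEYS:
    old = gconf_get(gconf_key)           # note the old value first
    new = gsettings_get(schema, gs_key)  # then the migrated value
    print("%s: gconf=%r gsettings=%r" % (gs_key, old, new))
```

In practice, record the gconf values before adding the ppa and compare the gsettings values after the upgrade.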
Now, as an added bonus, the unity developers have also tagged several multi-monitor bugs and have asked users to go through the list and confirm any bugs that they can. Read the bug report, and if you have a multi-monitor unity setup, see if it's still affecting you. Leave a comment on the bug with your result. The unity team wants to make sure the bugs can be triaged and get proper attention if they are still valid.
List of unity bugs, tagged multi-monitor
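If you'd rather pull that list programmatically than browse it, a few lines of launchpadlib will do; note that the project name and tag below are my assumptions, so check them against the list linked above:

```python
#!/usr/bin/env python
# A sketch using launchpadlib to list open unity bug tasks by tag.
# The tag name "multimonitor" is an assumption -- confirm the actual
# tag against the bug list linked above.
from launchpadlib.launchpad import Launchpad

lp = Launchpad.login_anonymously("qa-multimonitor-triage", "production")
unity = lp.projects["unity"]

# searchTasks returns open bug tasks; tags= narrows to the tagged set
for task in unity.searchTasks(tags=["multimonitor"]):
    print(task.bug.id, task.status, task.bug.title)
```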
Thank you in advance for your help, and happy testing everyone!
Monday, August 20, 2012
The grand "Cadence" experiment
It all started innocently enough. A simple idea turned into a simple post on the mailing list. This idea eventually led the ubuntu QA community to perform an experiment for this cycle, which has been dubbed "cadence testing".
Now, before I manage to confuse everyone with this "cadence testing" term, let's define cadence testing. Scratch that, let's just give a simple definition of what was intended by the original idea. If you want the whole story, read the thread. heh. I'll be waiting (hint: it's LONG!).
Cadence testing was intended to introduce regularity into testing. If the development release could be "stable" every day (which was the grand experiment during the precise cycle), could we not also test to ensure that things were good all throughout the release? If the everyday images and archive were now at the quality of a previous release's milestones, could we just eliminate the milestone idea and go with a calendar schedule for testing? Thus, a proposal was made to test every 2 weeks, whether or not a milestone had been planned, and report the results.
Fast forward 2 months to today. So what happened? Well, I'm happy to report that the QA community, despite the confusion, more or less met the goal of testing the desktop images every 2 weeks (milestone or not). But what did this achieve? And where are the results?
Let's step back a moment and talk about what we learned by doing this. My comments are specific to the non-milestone cadence testing weeks. First, the development process inside ubuntu is still built around milestones. The daily images during cadence testing weeks were sometimes stable, and sometimes flat out broken by a new change landing from a development team. Second, the tools we used are built around milestone testing as well. Neither the qatracker nor any qa dashboard or report available has a good way to track image health across the cadence week. This meant it was both difficult to test and difficult to see the results of the testing. Finally, the development teams were not expecting results against the daily images and couldn't follow up well on any bugs we reported, nor were we able to coordinate well with the release team, as the bugs reported were not available in a summarized or meaningful way.
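To make that reporting gap concrete, here's a hypothetical sketch of the kind of summary our tools couldn't produce: per-day, per-image health rolled up from individual testcase results across a cadence week. The records below are invented for illustration; real data would have to come out of the qatracker:

```python
# A hypothetical sketch of an "image health across the cadence week"
# report. The result records are invented for illustration; the real
# ones would come from the qatracker.
from collections import defaultdict

# (date, image, testcase, result) records, where result is "pass" or "fail"
results = [
    ("2012-08-13", "desktop-amd64", "live-session", "pass"),
    ("2012-08-13", "desktop-amd64", "entire-disk-install", "fail"),
    ("2012-08-14", "desktop-amd64", "live-session", "pass"),
]

health = defaultdict(lambda: {"pass": 0, "fail": 0})
for date, image, _testcase, result in results:
    health[(date, image)][result] += 1

for (date, image), tally in sorted(health.items()):
    total = tally["pass"] + tally["fail"]
    print("%s %s: %d/%d passed" % (date, image, tally["pass"], total))
```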
Now, I'll save the discussion of my ideas for a healthy QA workflow for a later post, but I think we can agree that testing without good result reporting and without developer follow-up has a limited impact. So does this mean "cadence testing" was a bad idea? No, it was simply poorly executed. The trouble comes from the assumptions listed above.
The archive has not been "stable" every day, and development teams have continued development, pausing only as required by the current milestones. In addition, changes, even major ones (like the ubiquity changes landing a few weeks ago, or the nvidia change just this past weekend), are not well communicated. Since they land with little or no warning, we as a QA community are left to react to them, instead of planning our testing around them. In this environment, cadence testing makes little sense.
So was the experiment a failure then? In my mind, not at all! In fact, I think the future of ubuntu and QA is to push for complete adoption of this idea, and this experiment confirms the obstacles we will face in getting there. I'll be posting more about what this vision for QA looks like, but I'll leave you with a few thoughts until then.
In my mind, QA should enhance and improve the lives and workflows of developers, testers, and users. Our work is critical to the success of ubuntu. I would like to see a future where users receive regular, timely, scheduled updates that the folks in the QA community have vetted by working with the development and release teams to deliver focused, quality updates. The ideal workflow is more fluid, more agile and yes, it has a cadence.
Thursday, August 16, 2012
Quality Perceptions Survey Results
A couple of Fridays ago I asked for feedback on how quality and the ubuntu QA team have fared this cycle. That survey has now been completed and I have some results to share with everyone. Before I dive into the numbers, let me take a moment to say thank you to all of you who responded. Thank you! I read all of the comments left as well, and all were helpful feedback. Remember, the survey was anonymous, so I cannot respond individually to anything written. Feel free to contact me if you wish to discuss anything further or to receive a response.
The second question on the survey asked rather simply, "What does quality mean to you?".
As it turns out, the largest answers mirrored those of a later question, in which I asked "What's the biggest problem with quality in ubuntu right now?".

Note: I read all of the "other" responses and categorized them into some new categories to display here for general consumption.
So there is some agreement amongst those who were polled, both about what quality means and about where ubuntu's biggest problems lie. The respondents indicated that the largest issue with quality in ubuntu was the very definition of what "quality" is!
Now, I asked this question for a specific reason. "Quality" is a subjective term! Perhaps I'll get some disagreement on this, but hear me out. All of the answers for the question are, in my mind, valid with respect to quality. As an example, let's say I asked you to recommend software to balance my checkbook. If I specified I wanted a quality piece of software, would you not recommend a stable (works without crashing), mature (good development/bug workflow), and easy to use (just works) piece of software that has a nice feature set (latest and greatest)? It's easy to see that "quality" can refer to all of this and more.
Still, in my mind, when I speak to wanting a "quality" release of ubuntu, I tend to focus on the stability and ease of use aspects. As the graphs indicate, the respondents seemed to echo this idea. In other words, it's really important to the idea of quality that things "just work". In the context of ubuntu this means applications run without crashing, and the operating system runs on your hardware. If things don't "just work", even if all the other indications of quality are true, you aren't likely to describe or perceive the product as having "good quality".
Let's ponder that thought for a moment and look at some more results. The survey captured about a 50/50 split of folks who run the development release, and over 70% run it or intend to run it before the final release.
So among those 50-70% who run or will run the development release, how many have participated in ubuntu qa?
Yikes! Only about a third. Just under half had no idea an ubuntu QA team existed. There's some clear evangelizing work to be done here. Let me pause here just for a moment to say the team does exist, and would love to have you!
Ok, now onto the last multiple-choice question.
I'm happy to see people desire to help! That's wonderful. The responses regarding time, being technically able, or where to start are all very solvable. I would encourage you to watch this space for invitations to help test. QA work comes in all shapes and sizes; sometimes it takes as little as 15 minutes and the ability to install/uninstall a package and reboot a machine. If this sounds like something you would be able to do, please start by having a look at our wiki page. Send us an email and introduce yourself. There are no requirements or forced participation, and we welcome everyone. And who knows, you might even learn something about ubuntu :-)
Ok, so I've shared the hard numbers from the survey, but I'd like to leave you with a few takeaway thoughts. First, while quality is subjective, our focus in ubuntu QA should be to have things "just work". That doesn't mean we shouldn't also help improve our development and bug processes, or continue to push for new applications and features, but rather that we ensure that our efforts help forward this cause.
I've said it before, but I want to help deliver good computing experiences. That story I shared when I introduced myself was close to home. My first interaction with the ubuntu community came via the forums, and yes, getting a printer to work. The community undertook work to change what was once a nightmare, not for the faint of heart, into child's play. The execution of this work is what defines the experience. This is where QA fits. We aren't just testing; we're delivering the result of the entirety of the ubuntu community's labor.
Judging from the survey results, many of you share this same vision. So won't you join us? QA transcends teams and the ubuntu community. I would encourage you to get involved and be a part of making it happen. The list of "problems with quality" reaches many areas. Will you be part of the solution?
Tuesday, August 7, 2012
Quality mid-cycle checkup
About 2.5 months ago I wrote about the plans for the ubuntu QA community for the quantal cycle. We were building on lots of buzz from the precise release, and we planned to undertake lots of new work while being very careful to avoid burnout. Our focus was to take QA to the next level and to help us communicate and grow as a team to take on the opportunities before us.
So, how are we doing? Let's go over each of the points noted in the original post and talk about the progress and plans.
ISO Testing
Our alpha1 testing went very well, but alpha2 and alpha3 have seen less participation. In addition, we were asked to test our ISOs every 2 weeks as part of a more cadenced approach, and we responded. Overall, ISO testing continues to be a weak spot for us as a community, yet it is perhaps the most important piece of testing for the greater ubuntu community. The image we produce is literally the first experience many folks have with ubuntu. If it fails to install, well, there went our chance at a positive first impression :-( I would be happy to hear ideas or comments on ISO testing in particular.
Application Testing
This work has been mostly completed. The package tracker now allows us to perform work that was done via checkbox or manual testing last cycle. We can now manage results, tests and reporting all in one tool -- and it's all publicly available. For more information about the qatracker, see this wiki page.
SRU Verification
This work is still on paper, awaiting the 12.04.1 release before further discussions and work begin.
General Testing (e.g., day-to-day running of the development version)
I am still experimenting with how to enable better reporting and more focused testing on this. The current plan is to track specific packages that are critical to the desktop, and allow those running the development version to report how the application is working for each specific upload during the development release. This is done with the qatracker. I'll blog more about this and the results in a later post. Contact me, as always, if you're interested in helping.
Calls for Testing
This has been a wonderful success. There have been several calls for testing and the response has been wonderful. A big thank you to all of you who have helped test. We've had over 50 people involved in testing, and 41 bugs reported. The development teams and I thank you! But we're not done yet; unity testing, among other things, is still coming!
QATracker development
There is still room for more developers on the qatracker project. It's written in drupal, and I would be happy to help you get started. As we grow, there will continue to be a need for people who want to build awesome tools to help us as a community test. If you have ideas for a tool (or know of a tool) that would help us test, please feel free to share it with me.
Hardware Database
Work has been completed to spec out the design, and it is now scheduled to land this cycle rather than in a future cycle. Fingers crossed, we'll sneak this in before we release quantal :-) I'm very excited to share this new tool with you; as soon as it's complete, we'll be able to incorporate it into our workflow on the qatracker.
Testcases
Done, and for the most part our testcases have been migrated over. In addition, there is now a special team of folks who help to manage and maintain our testcases. If you have a passion for this work, contact me and I can help get you involved with the team.
Overall, I am happy to see signs of growth and newcomers to the community. If you're on the fence about getting more involved with ubuntu, I would encourage you to check out QA. We collaborate with almost every area of ubuntu in some way, and no two days are the same :-) Send an email to the ubuntu-qa mailing list and introduce yourself.
So what's your opinion? Feel free to respond here with your thoughts and/or fill out the quality survey to give feedback.
Friday, August 3, 2012
Quality Perceptions Survey
What's your perception of quality this cycle? Are things working well for you? It's been several months now since precise landed, and ubuntu development for the next version has been ongoing. The ubuntu QA team has had a busy summer putting into place the new tools we spoke about at UDS. The qatracker has been revamped to allow us to consolidate our testcases and test reporting across all of our activities. In addition, we've been helping with the release of 3 alpha milestones and 3 testing campaigns. To all those who have helped in this testing, a very big thank you!
I have my own thoughts about the impact this testing has had on the ubuntu project, and I will continue to share them to point out the progress we make in this regard. But now, I want your input. I have created a survey to understand the community's perspective on how we as a project are doing on quality. If you have a few moments, please fill out the survey and let your thoughts and perspective be known. The survey will be anonymous, but I will share an aggregation and summary of the results.
My hope is to gain an understanding of how we can focus our efforts on what's important to ubuntu as a project in terms of quality, as well as how we can help you (yes, you!) become a more active part of QA if you're interested.
Here's a link to the survey. I'll leave it open until next Friday, August 10th. Thanks in advance for your participation.