
Wednesday, May 16, 2012

One Million Bugs

Launchpad.net passed one of those interesting milestones with Bug #1000000 today. Ok, so it's mostly interesting to those of us who like base 10; you power-of-2 types will have to wait around for Bug #1073741824.

Putting aside the interesting number, let's use it as a reminder of the amazing participation by the community in not only the Ubuntu project, but all the other projects (including many of my favorites at Linaro) that are hosted on Launchpad. Thanks to the tireless testing and bug tracking/fixing by this entire community, all of these projects are better off. It should also serve as a reminder that we should persevere and push even harder to find all those issues still waiting to be discovered, so we can get them fixed and further increase the quality of Ubuntu and the rest. Finally, thanks to the Launchpad team for their efforts in producing and maintaining a service that is useful to so many of us.
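For the curious, that bug is reachable programmatically too. Here's a minimal sketch using launchpadlib to look it up anonymously; it assumes launchpadlib is installed, and the consumer name is just an arbitrary placeholder made up for this example.

    # Fetch a Launchpad bug read-only with launchpadlib.
    # Assumes: pip install launchpadlib
    from launchpadlib.launchpad import Launchpad

    # Anonymous sessions are read-only; 'one-million-bugs-demo' is an
    # arbitrary consumer name invented for this sketch.
    lp = Launchpad.login_anonymously('one-million-bugs-demo', 'production')

    bug = lp.bugs[1000000]
    print(bug.title)     # the bug's summary line
    print(bug.web_link)  # https://bugs.launchpad.net/bugs/1000000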

Wednesday, June 01, 2011

LAVA Project Changes

If you take a look at http://launchpad.net/lava you'll see some structural changes are afoot:
LAVA is now a project group, with these components:
  lava-server - the core server components
  lava-dashboard - the results dashboard (was launch-control)
  lava-scheduler - the LAVA scheduler
  lava-dispatcher - the dispatcher
  lava-tool - the core pieces of the command-line interface
  lava-test - (coming soon) the test execution framework

All the Linaro validation tools are now going to be consolidated under the LAVA project group on Launchpad. If you are already deploying and experimenting with LAVA, don't worry: instructions (and packages too) are coming soon for installing the latest versions. This is laying the groundwork for the development that will take place over the next few months. More on that later. :)
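If you'd rather poke at the new layout from a script than a browser, here's a minimal sketch along the same lines with launchpadlib; it assumes the 'lava' project group is visible to anonymous API users, and that the standard Launchpad name/summary attributes are what you want to print.

    # List the projects under the LAVA project group on Launchpad.
    # Assumes: pip install launchpadlib
    from launchpadlib.launchpad import Launchpad

    # Read-only anonymous session; 'lava-layout-demo' is an arbitrary
    # consumer name invented for this sketch.
    lp = Launchpad.login_anonymously('lava-layout-demo', 'production')

    group = lp.project_groups['lava']
    for project in group.projects:
        print('%s - %s' % (project.name, project.summary))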

Wednesday, June 23, 2010

Automated Testing Frameworks

A good friend of mine asked me today how many test harnesses, or automated testing frameworks, I've written or helped create by now. Honestly, I had never really stopped to think about it, but it's been more than a few. Over the years, a lot of people have suggested that it shouldn't be necessary to start over and create a new one when there are lots of good ones already out there. There's some common sense in that; in reality, however, it doesn't necessarily work out that way.

The question "How many testing frameworks have you built?" seems, to me, a bit like asking "How many different pairs of shoes have you worn?" I'm not big on shoes or anything, but I do have different ones for different purposes: tennis shoes that I typically wear, dress shoes for church or other occasions where sneakers would be too informal, and motorcycle boots. They each have a very different purpose, and for the most part, need to be completely different. Sure, in theory someone could design a dressy, tough boot that's comfortable to run in. I'll give you a second to try to picture that... Ok, I think you get my point now.

Testing frameworks are kinda like that too. They are often built with a specific purpose in mind, because the existing solutions don't quite do what's needed, or don't do it in a way the developers like. So, rather than force those goals onto an existing project that was created with different ones in mind, developers often start over. This gives both projects the freedom to explore what they want to do, in the way they want to do it, without interfering with one another. Some would call this fragmentation, but I look at it as specialization, with opportunities later for collaboration where it makes sense!