Qs and how to A them, a non-QA-engineer's perspective ;)
QA is a somewhat tricky skillset to describe, because there are a ton of loosely-related jobs that fall under that umbrella. Here are the roles I can think of, and skills that come in handy for them… (note: not the skills that are required for them; these are things that might be worth learning, not things you need to already know before applying for a job)
Testing (black box)
Given a thing, find, isolate, and describe flaws in that thing
- Quickly learning how to use the thing is important for being able to explore it (i.e. practice learning new libraries/frameworks if the thing you're testing is one of those)
- Imagine likely failure modes and design experiments to trigger them (e.g. if a new text field in an app supports floating point numbers, trying out 0, -0, 0.0000…001, FLOAT_MAX, DOUBLE_MAX, and so on likely makes sense.)
- Figure out how to replicate what you just broke ("lab notes" for your experiments can be helpful)
- Using a scripting language to control apps can make things much easier (AppleScript support is common in Mac apps. Games often have internal scripting systems that could be used for precisely controlling your angle when looking for clipping issues, etc...)
- Writing documentation is a good way to explore your understanding of something, and produces documentation as a side effect
- Extracting information from the author of the thing requires knowing how to ask good questions
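The floating-point example above is easy to turn into a reusable checklist. A sketch in Python of the boundary values worth feeding into any float-accepting field (the helper name is made up; the list itself is the point):

```python
import sys

def float_edge_cases():
    """Boundary values worth typing into any field that accepts floats."""
    return [
        0.0, -0.0,                    # signed zero
        1e-323,                       # a subnormal, near the smallest positive double
        sys.float_info.min,           # smallest normal double
        3.4028235e38,                 # roughly FLOAT_MAX
        sys.float_info.max,           # DOUBLE_MAX, the largest finite double
        float("inf"), float("-inf"),  # infinities
        float("nan"),                 # not-a-number
    ]

for value in float_edge_cases():
    print(repr(value))
```

Each of these tends to expose a different bug class: signed zeros and NaN break naive comparisons, subnormals break "is this basically zero?" checks, and the MAX values break anything that does arithmetic before validating.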
Testing (white box)
Given a thing's code, find, isolate, and describe flaws in internal pieces of that thing
- Reading other people's code is a tricky skillset that is hard to build without doing it. Open source projects can provide a great environment to practice this skill.
- Know how to use the local testing framework (XCTest, JUnit, OCUnit, etc...)
- Be aware of common failure modes:
- How does this object behave when used from multiple threads? (Probably poorly)
- If there are any callbacks, what happens when you turn around and call back into it from inside its own callback?
- If there are "automatic" behaviors in the language (key-value observing in ObjC for example), do they work properly? People often forget about them because they "just work"
- If the thing being tested is subclassable, what happens if you make a subclass and override chunks of it?
- What happens if you throw an exception from a callback or an overridden method?
- What happens if you pass nil as an argument? How about the wrong type of object?
- Does it take input from the network or a file? What happens if the network is down or the file is missing? What if the input is invalid?
- Create "mock objects", fake test-only imitations of other pieces of the program that the piece you're testing interacts with, so that you can isolate it (…and be aware of how this can introduce false positives)
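As a sketch of the mock-object idea in Python's unittest.mock, here's an invented `WeatherReport` class whose `fetcher` collaborator (the piece that would normally hit the network) is replaced with a fake:

```python
from unittest.mock import Mock

class WeatherReport:
    """Hypothetical unit under test: formats data obtained from a fetcher."""
    def __init__(self, fetcher):
        self.fetcher = fetcher

    def summary(self, city):
        data = self.fetcher.get(city)  # normally a network call
        if data is None:
            return f"{city}: no data"
        return f"{city}: {data['temp']}°"

# In a test, stand in a Mock for the real network fetcher:
fake_fetcher = Mock()
fake_fetcher.get.return_value = {"temp": 21}

report = WeatherReport(fake_fetcher)
assert report.summary("Oslo") == "Oslo: 21°"
fake_fetcher.get.assert_called_once_with("Oslo")  # verify the interaction too

# The mock also makes the failure path trivial to reach:
fake_fetcher.get.return_value = None
assert report.summary("Oslo") == "Oslo: no data"
```

The false-positive risk mentioned above shows up when the mock's behavior drifts from the real collaborator's: the test keeps passing against the fake while the real integration is broken.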
Screening
Given a report of a bug, determine whether the bug is real, how severe it is, and who it should be assigned to
- Get really good at reading crashlogs and spindumps. Honestly this is just a useful life skill given the state of the software industry
- Bisect builds: if something crashes now, and didn't crash in the past, you can rapidly zero in on when it broke by testing the version exactly in the middle between 'now' and 'in the past', then repeating on whichever half still contains the breakage. Even with 1000 versions to test, you can figure out which one was the first broken one in about 10 tests (log2(1000) ≈ 10).
- Use your local version control system effectively. Being able to quickly search and move around your project's history is essential, and some version control systems (git) have automatic bisecting capabilities that are super handy.
- Be aware of surrounding teams and who's on them. You'll regularly get bugs that aren't your team's responsibility, and knowing approximately where they should go is super useful. Building a relationship with screeners on those teams also lets you be more comfortable saying "hey this looks like one of yours, what do you think?"
- Estimating severity is tricky, and mostly just requires experience to build up a gut feel for "hm we're seeing things that look like this a lot" and "oh last time something broke in this area it ended up having all kinds of nasty results"
- Keep track of a current "live" set of issues that are being reported a lot so you can quickly mark bugs as duplicates. It can take some time for a fix to make it out into the wild, so if something bad broke you'll be seeing it a lot
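The bisection described above is just binary search over build numbers. A minimal sketch, where `is_broken(n)` stands in for "install build n and run the repro steps":

```python
def first_broken_build(good, bad, is_broken):
    """Binary-search builds in (good, bad] for the first broken one.

    `good` is a known-working build number, `bad` a known-broken one,
    and `is_broken(n)` reports whether build n reproduces the bug.
    """
    while bad - good > 1:
        mid = (good + bad) // 2
        if is_broken(mid):
            bad = mid    # bug was introduced at mid or earlier
        else:
            good = mid   # bug was introduced after mid
    return bad

# Pretend the regression landed in build 742 out of 1000:
print(first_broken_build(1, 1000, lambda n: n >= 742))  # prints 742
```

`git bisect` automates exactly this loop over commits, including checking each candidate out for you.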
Bug Investigation
Given a report of a bug, make the bug easier to fix by fleshing out the report with steps to reproduce, when it was introduced, possible causes, etc...
- Mostly the same skillset as screening, just with more time spent per bug
- Stay aware of changes being made, for example by participating in code review. That way you can more easily say "xyz looks like possible fallout from change abc"
Test Automation
Write and maintain programs to automate parts of any of the other QA subroles
- Pretty much white box testing with a focus on making something autonomous that you can leave running rather than having to drive it yourself
- A useful skill here is being able to write little programs that shell out to unix commands and read their output. For example, running the 'leaks' command on the test runner and including any memory leaks in the test output page
- Occasionally it gets more exotic: one time my officemate had a robotic arm for repeatedly sleeping and waking a laptop
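A minimal sketch of shelling out and reading a command's output with Python's subprocess module. `leaks` is macOS-specific, so the demo substitutes a portable command and a made-up leak report line:

```python
import subprocess

def run_and_capture(cmd):
    """Run a command, raise on nonzero exit, return its stdout as text."""
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return result.stdout

# With macOS's `leaks` you might do: run_and_capture(["leaks", str(pid)])
# and scan the output before including it in the test report. Portable demo:
output = run_and_capture(["echo", "42 leaks for 1024 total leaked bytes"])
if "leaks for" in output:
    print("leak report:", output.strip())
```

`check=True` turns a failing command into an exception, which is usually what you want in automation: a silent failure in the harness is worse than a loud one.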
Test Infrastructure Administration
Most software teams have some machines running tests automatically on each change they make ("continuous integration"); someone needs to keep those machines running and up to date
- Typical sysadmin stuff: get comfortable with operating a computer via ssh
- Write scripts for stuff you have to do a lot
- Being able to hack on your local CI system is useful. Buildbot, for example, is open source and written in Python, so knowing Python would help if that's what's in use
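As an example of scripting the stuff you have to do a lot, here's a tiny disk-space health check of the kind you might run on each CI machine (the 90% threshold and the messages are arbitrary choices for illustration):

```python
import shutil

def disk_usage_percent(path="/"):
    """Return how full the filesystem containing `path` is, as a percentage."""
    usage = shutil.disk_usage(path)
    return 100 * usage.used / usage.total

def check_ci_machine(threshold=90.0):
    """Warn when the build volume is nearly full, a common cause of flaky CI."""
    pct = disk_usage_percent()
    if pct >= threshold:
        print(f"WARNING: disk {pct:.0f}% full, clean out old build artifacts")
    else:
        print(f"OK: disk {pct:.0f}% full")

check_ci_machine()
```

Hooking something like this into the CI system's own job scheduler means the machines report on themselves instead of you sshing around to check.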