
    November 21, 2009


    You appear to be making an assumption here that the quality of the data from the survey was inferior to that of the on-site visits. However, that doesn't have to be the case.

    If the sites visited were mostly "important" ones (i.e. larger sites with IT departments who jealously guard the materials Autodesk permits them to have), then that's a very biased sample indeed and the survey is likely to be much closer to the truth than the site visits.

    You also need to consider cause and effect. In recent years, the default has been for the customer to not receive manuals, so of course users you visit are not going to have them around. That doesn't mean they don't want them and wouldn't use them if they had them.

There's a subtext behind that 80% result. My guess is that the majority of customers don't want you to stop sending out manuals. They understood that if they said in your survey that they weren't using the documentation, it would be used as an excuse to stop sending them out. So the survey, while perhaps not reflecting a true answer to the question, did reflect a true indication of customer sentiment. All totally unscientific guesswork, but I bet I'm not too far off with it.

Of course, site visits to "important" sites (with which existing relationships may exist and where many users can be seen at once) are just another example of easier data collection.

    I agree completely about the quality of survey questions; unfortunately, Autodesk surveys in particular seem to be very prone to this.

    Sucks being part of the minority that actually uses the printed manuals though.

    It is extremely important to use people from your target audience, but there are ways of using proxies to good effect.

    Desiree Sy posted an article on just that:

Excellent points, Lynn! This is a battle I've had to fight many times before, and I still struggle to find a balance. For example, when conducting usability testing, how important is it to do live testing with people from your target audience (which I believe gives you the highest-quality data) versus testing people off the street or doing remote testing?

