Thinking out loud about differences between surveys

What with one thing (my daughter’s chickenpox) and another (a holiday in the US) and all the stuff in between (more bank holidays than you can shake a stick at), I haven’t been here for a while.  I’m rapidly rolling towards my first real deadline: I’m presenting a paper at the Social Policy Association Conference in mid-June.  Of course, I don’t yet have the results that I had hoped for when I submitted the proposal.  (This is the point at which I would really appreciate other academics popping up to tell me this is normal…)

Today I am stuck on volunteering rates among youth in the 1970s.  My data (National Child Development Study, which follows a cohort born in 1958) show a surprisingly high rate of volunteering when the cohort reaches 16 (37.9%).  Actually, it’s pretty high when the cohort is 23 as well (23.8%).  Although the General Household Survey did not ask a volunteering question in 1974 (when ‘my’ cohort were 16), it did ask one in 1981, when they were 23.  GHS 1981 shows a rate of 14% for 23 year olds.  Incidentally, the same survey gives 24% volunteering at 16, 20% volunteering at 17 and 14% volunteering at 18.

So what’s going on?  Why does NCDS measure volunteering among 23 year olds in 1981 at 23.8% when GHS makes it 14%?  Well, for a start, NCDS is hardly representative, especially by 1981.  Attrition (that is, when people drop out of an ongoing longitudinal study) disproportionately affects young unmarried men from lower income groups who are less well educated – the very people who are also least likely to volunteer.
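The attrition point is really just arithmetic: if the people who drop out volunteer less than the people who stay, the rate observed among those who remain will overstate the rate in the original cohort.  A minimal sketch, with entirely made-up group sizes, rates and retention figures (not the real NCDS numbers):

```python
# Hypothetical illustration of how differential attrition inflates an
# observed volunteering rate. All numbers below are invented for the sketch.

groups = {
    # name: (initial_n, volunteering_rate, retention_rate)
    "higher-rate group": (5000, 0.20, 0.90),  # volunteers more, drops out less
    "lower-rate group":  (5000, 0.08, 0.60),  # volunteers less, drops out more
}

# Rate in the full original cohort.
true_volunteers = sum(n * rate for n, rate, _ in groups.values())
true_n = sum(n for n, _, _ in groups.values())
true_rate = true_volunteers / true_n

# Rate among those still in the study after attrition.
retained_volunteers = sum(n * rate * kept for n, rate, kept in groups.values())
retained_n = sum(n * kept for n, _, kept in groups.values())
observed_rate = retained_volunteers / retained_n

print(f"true rate:     {true_rate:.1%}")      # 14.0%
print(f"observed rate: {observed_rate:.1%}")  # 15.2%
```

Even with fairly stark differential drop-out, the inflation here is only about a percentage point – which is part of why attrition alone doesn’t feel like the whole story for a gap of nearly ten points.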

But surely there must be something else?  Is the question different?  The GHS asked about voluntary work over the last year and provided a showcard of examples.  The NCDS asked about voluntary work in the past year and provided a showcard of examples.  In fact, the question wordings were the same.  I haven’t found the showcards yet, but I can’t imagine they would be vastly different, given the identical question wording.

Maybe it’s about context then?  The NCDS question follows on from a section about smoking but it is followed by 6 additional questions on volunteering and voluntary organisations.  The GHS question, on the other hand, is preceded by a section on accidents and followed by 5 additional questions on volunteering.  It doesn’t seem nearly enough to make a difference.

Of course, some of the difference between NCDS volunteering at age 16 in 1974 (38%) and GHS volunteering at age 16 in 1981 (24%) could be real.  The world changed.  In 1972, the school leaving age changed from 15 to 16 and the NCDS cohort were only just behind.  Could staying in school have given a temporary boost to volunteering as young people tried to figure out their place in the job market?  On the other hand, unemployment was many times higher in 1981 than in 1972: if uncertainties in the job market push volunteering rates upwards, the effect should have been much greater in 1981.  1972 also saw some reorganisation of the volunteering infrastructure.  The Volunteer Centre was founded, for example, and NCVYS was reorganised.  Could that have influenced the young volunteers?  It seems somehow unlikely…

One further thought…  Observing people in a longitudinal study changes their behaviour.  They anticipate being re-interviewed and alter what they do because they know they will be discussing it with an interviewer.  Could there be a kind of anticipated social approbation bias here?!  Perhaps at 23…  After all, participants were asked about volunteering at age 16.  But at age 16, could respondents have anticipated such a question?  Besides, wouldn’t this effect be pretty small?

The whole thing gets stranger and stranger when I look at who the ‘extra’ volunteers are.  At age 23, there’s nothing obviously weird in the data: those who go on to gain higher level qualifications or to be in higher social classes are also more likely to volunteer.  At age 16, on the other hand, everything is weird.  The excess is in lower social classes and at lower levels of qualifications.  For example, a staggering 48.8% of those who reach age 50 with NO educational qualifications at all were volunteering at age 16, compared to 38.5% of those who reach the equivalent of NVQ level 5.  WHY?!  What if the NCDS sample is just (whisper it…) weird?
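One sanity check worth doing is asking whether a gap that size could simply be sampling noise.  A quick two-proportion z-test sketch – the group sizes here are invented for illustration, and the real cell counts would come from the NCDS cross-tab:

```python
import math

def two_prop_z(p1, n1, p2, n2):
    """z statistic for H0: the two underlying proportions are equal,
    using the pooled estimate of the common proportion."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# 48.8% vs 38.5% volunteering at 16; n = 800 per group is hypothetical.
z = two_prop_z(0.488, 800, 0.385, 800)
print(f"z = {z:.2f}")  # |z| > 1.96 would be unlikely to be chance at the 5% level
```

With groups anywhere near that size the gap is far too large to be a sampling fluke – which pushes the explanation back towards question wording, interpretation, or something odd about the sample itself.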

I’ve checked my coding (several times), rested my head on my keyboard (several times) and I’m stumped.  I’ll get back to you.  Later.

Edit: The volunteering question at age 16 is weird.  I’m not sure it’s weird enough to create the class/education effect, but weird.  The cohort members were given the following text:

Below is a list of things which many people do in their spare time.  You will probably only do a few of these. Please show by ringing one of the numbers for each one whether this is something that you do often, sometimes, never or hardly ever. If it is something that you would like to do but don’t have the chance, please ring 4.

  • Reading books
  • Playing outdoor games and sports
  • Swimming
  • Playing indoor games and sports
  • Watching television
  • Going to parties in friends’ homes
  • Dancing at dance halls, discos etc.
  • Voluntary work to help others.

So there it is.  Who knows what was going through their minds when they tackled that one…?


6 Responses to Thinking out loud about differences between surveys

  1. Pete says:

    Interesting points. I can think of a few possible reasons (aside from a weird sample!).
    Firstly, there is surely a lifecourse issue here – could it be that 16 year olds in the ‘lower classes’ have fewer employment opportunities at that age, or at least less attractive opportunities, so volunteering could provide a useful gap between school and work? Conversely, the more educated participants, especially the university educated, might find they have a bit of time on their hands at 23, post graduation?
    Secondly, could there be differences in how the different classes perceive what volunteering actually is? I agree it is a rubbish question and open to interpretation – i.e. is mowing granny’s lawn on a Saturday morning volunteering, compared with, say, cleaning out the kennels at the RSPCA?

    • Vicki Bolton says:

      Pete, I really want it to be a lifecourse issue! That would mean the data are revealing something interesting! I’ve started to wonder whether there may be a questionnaire timing effect, actually. If people in class IV and V are more likely to leave school as soon as possible (gross generalisation) and the survey was conducted over a period of months, the ‘lower class’ 16 year olds could, as you say, be in a gap between education and work (or be in work, but underemployed). It still doesn’t make vast amounts of sense when you look at other data, though. At all ages, in all states of the economy, for people who are employed or unemployed, formal volunteering is more prevalent among richer, better educated, people in professional and managerial jobs than for others. The sixteen year old class IV and V school leavers with no jobs to go to ‘shouldn’t’ be volunteering at a greater rate than the class I and IIs who are still in school.

      I think your second explanation is more likely. There is no relationship between class and informal volunteering for 16 year olds (Citizenship Survey 2008/9, 2007/8 and 2005, taken together). If the class IVs and Vs were answering with respect to informal (rather than formal) volunteering, that would go some way to explaining the result. Still – it’s a bit weird to see the relationship completely reversed. If the relationship was non-existent, maybe I could buy it…

      It’s really extremely frustrating!

  2. Pete says:

    Hmm yes I see your point – it doesn’t make much sense in terms of life course. Could there be an issue about the accuracy of self-reported class? Probably not, given its ubiquity. The informal/formal difference could work, and the reversal might be explained if lower class kids saw mowing lawns as voluntary work whilst more educated kids didn’t. It’s not impossible that this might be part of a broader class-based attitude to work?

    I’m guessing you’ve seen this, but just in case you haven’t I thought you might find it interesting (if not very useful in terms of the current discussion).

    • Vicki Bolton says:

      Pete, I’ve been thinking about this some more. I’ve checked properly for wave non-response effects, and it’s not that. It’s not a question timing effect, either – and I was almost hoping it would be, just so that I could have an explanation. I quite like your idea that this could be part of the broader class-based attitude to work, actually. I have some attitude questions, but they’re all from later waves… I have to keep reminding myself that if it was easy it would all have been sorted out by now!

  3. Pingback: Not exactly blogging the SPA conference… | Confounding Factor


Comments are closed.