I trust we’re all in this together…?

Spot the odd one out.

  1. I trust you.
  2. She’s a trusting soul.
  3. Do you trust him?
  4. Generally speaking, would you say that most people can be trusted or that you can’t be too careful in dealing with people?

Of course it’s 4, a survey question designed to get at “generalised” or social trust.  It’s so ubiquitous in social surveys that it’s known by an acronym: GTQ – the Generalised Trust Question.  Generalised trust is the sense we have that the people around us – people we don’t know personally, but who occupy the same public spaces as us – are generally safe to be around.  A generalised truster sees friends he hasn’t met yet; a non-truster sees a paedophile behind every bush and a pickpocket in every crowd.

I’ll be honest and admit that after writing that sentence I checked the correlation between ‘trusting’ and ‘reading the Daily Mail’ using the 2009/10 Citizenship Survey.  I was slightly disappointed to find that there’s no difference between Mail readers and the rest of the population.  Guardian readers, on the other hand, are a trusting lot.  Generalised trust stands at around 40% in the population (and among Mail readers) but 65% for Guardian readers.  It’s a small sample (383 Guardian readers out of a core sample of 9305).  Anyway…
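For anyone who fancies replicating that kind of check, it’s a one-liner in Stata – a minimal sketch, with newspaper and gtq standing in for the survey’s actual variable names:

* row percentages: the proportion trusting within each readership group
tab newspaper gtq, row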

Generalised trust is an odd concept.  Trusters and non-trusters must be moving in different circles, as well as having different habits of mind.  An intelligent truster in a neighbourhood full of cheats, thieves and liars would soon change her attitude.  Patrick Sturgis has an interesting paper on the subject which links trust to intelligence.  His hypothesis runs roughly as follows: bright people (well, children who scored well on intelligence tests) go on to be more trusting adults (measured by the GTQ) because they make good judgements.  They give their trust to unknown others under circumstances in which that trust is most likely to be rewarded, so they tend to have good experiences, and so they continue to be trusting.  People who aren’t so bright invest their trust less wisely, have bad experiences, and end up less trusting.  Of course, as Sturgis points out, intelligence test scores in childhood and generalised trust are both strongly social class-related: rich kids are brighter and rich adults are more trusting.  Using the 2009/10 Citizenship Survey data again, 53% of adults in management are trusting, compared to 32% of those in routine or semi-routine jobs.

Trust is an attitude, linked to personal circumstances but also to personal history and character.  Robert Putnam (he of Bowling Alone, social capital, the evils of TV and the demise of schmoozing) uses it as the key social attitude linking community cohesion and political and civic health.  People who schmooze, who meet with their neighbours for coffee, who volunteer in their communities, are more trusting than loners, he says, and their trust grows as a result of their positive contacts with the people around them.  He also says that trust is good for democracy: trusting people have more reasons to get involved with running their communities, they vote more often, and engage more fully with the democratic process, contacting politicians and local officials more often than non-trusters.

This tangled “bowl of spaghetti” (that’s Putnam again) is pretty much impossible to unpick, but it seems to me that the fork around which the spaghetti twirls is social class.  That muffled twang, by the way, was the sound of the spaghetti metaphor unravelling.  Social class at birth predicts IQ, trust, future social class (forget about social mobility right now), future salary, civic and political engagement, newspaper-readership, and pretty much anything else you can think of.  Your father’s job title when you are born offers enormous insight into your future civic and political life (and everything else) – or certainly it has done up until now.  I just wish I could see that changing any time soon.


A sense of dignity

I went to hear Jonathan Wolff (UCL Philosophy) give a lecture on relative poverty and social inequality tonight.  It’s part of a piece of work he’s doing for the Joseph Rowntree Foundation (JRF). The JRF is bringing together academic ideas on poverty from across disciplines – economics, political science, and also philosophy. Wolff said that poverty is much ignored by modern political philosophers, in favour of equality and justice: Rawls himself doesn’t mention it once.  He then emphasised his point by quoting extensively from economists and social statisticians, referencing in particular the works of Adam Smith and Benjamin Seebohm Rowntree.

Adam Smith in The Wealth of Nations addressed the relative, cultural aspects of poverty, writing: “A linen shirt, for example, is, strictly speaking, not a necessary of life…  But in the present times, through the greater part of Europe, a creditable day-labourer would be ashamed to appear in public without a linen shirt…”  There are some activities and goods which are so central to life in our society that we are prepared to forgo even food to buy into them. These activities and goods are strongly status-linked. For a school-child, it might be a set of clothes which are not school uniform. For a worker, it might be a drink in the bar on a Friday night. For a teenager, it might be a smartphone.

Poverty is essentially relative – and this relativity is local. Some groups are so far from the societal goodies that they turn inward. There is no point reaching for the next rung on the social ladder – it’s at ceiling height – so status is measured against a more local norm. Wolff talked about gangs in South Africa which prioritise an unusual dental arrangement: removing the front four incisors.  People with this modification are marked to the extent that they will be unable to compete for status in South African society’s mainstream, but they have high status in Cape Town’s gang culture.

In this, as in so much political science and philosophy, The West Wing has got there first.  In the episode Isaac and Ishmael, we hear the following words spoken by Charlie, the President’s bag-carrier:

Gangs give you a sense of belonging, and usually, an income. But mostly, they give you a sense of dignity. Men are men, and men’ll seek pride… You think bangers are walking around with their heads down, saying, “Oh man, I didn’t make anything out of my life. I’m in a gang.” No, man! They’re walking around saying, “Man, I’m in a gang. I’m with them.”

My colleague Rich Penny (@Rich_Penny) has written his PhD on the subject of Rawls, justice and self-respect: “the process of securing the social basis of self-respect for citizens is much less a matter of ‘traditional’ distribution, and much more a case of cultivating a particular set of social relations and a public political culture that serves to secure for all citizens the space, recognition and encouragement needed for them to develop a sense of self-respect on their own terms.”

Poverty is about so much more than money.


Back in the computer lab – questions answered

I signed up to teach again this semester.  I am demonstrating on a rather nice course covering social science data and methods using Stata (STAT6077, for those in Southampton: it was the course that made me realise I was doing the right MSc).  Today was computer lab number one.  These are the questions I answered:

Q: What does the 25th income percentile mean? A: It’s the level of income below which one quarter of the incomes are found.
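In Stata you can read it straight off the detailed summary – a quick sketch, with income standing in for whatever the variable is actually called:

summarize income, detail
display r(p25)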

Q: Which of these equivalized incomes gives the right answer?  A: None of them. Annoying, isn’t it? For the uninitiated, income data is often collected at a household level.  Analysis, meanwhile, frequently takes place at an individual level.  To bridge the gap, you have some choices.  You can pretend that one person in the household gets to control the whole income (i.e. pretend that individual income equals household income). Or you can pretend that household income is shared evenly among all household members.  Or you can assign 20% of the income to the kids and divide the rest evenly between the adults.  Or you can do as the OECD does and treat individual income as household income divided by the square root of household size.  I think this might be the time (Monday of Week One) to quote George Box: “Essentially, all models are wrong, but some are useful”.
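That last, square-root option is at least a one-liner – a sketch, with hh_income and hh_size as hypothetical variable names:

* OECD square-root scale: household income divided by the square root of household size
gen eq_income = hh_income / sqrt(hh_size)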

Q: How do I access the University filestore? A: It’s on the H: drive. If you want to, for example, keep a log of your Stata workshop session, you can tell Stata:

log using h:\workshop.log, t

This will open a log file for your session and save it in your personal filestore (don’t forget to close your log at the end of the session).  You can access your filestore from any computer in the university and also via a VPN connection at home.
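Closing the log again is even simpler:

log close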

Q: Why are there negative household incomes in the Family Resources Survey?  A: Negative incomes are usually discounted for analysis purposes (thank you Dr Google!).  The variable is constructed using all income, minus council tax, pension contributions, some particular insurance payments, student loan repayments and child maintenance.  There are 36 individuals (out of more than 52,000) in 17 households with a negative income.  And 19 of those were children.  Must be pretty grim.

Q: How does this .do file work?  A: It’s magic. And you can keep a log of everything you do too. In fact, you probably should.  All the details were in the introductory session on Stata in Week 0 and are available on Yves Berger’s website.
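For the record, the magic in a minimal .do file looks something like this – a sketch only, with hypothetical file and variable names rather than the course’s actual materials:

* workshop1.do - open a log, load the data, run the analysis, close the log
log using h:\workshop.log, text replace
use h:\labdata.dta, clear
summarize income, detail
log close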

Q: Can I use Stata at home?  A: Sadly not: the University of Southampton only has on-site licenses. You can, however, buy a student license from Timberlake on their GradPlan.  You’ll end up buying Stata 13 when the computer labs are running Stata 12, though…  And it will cost a minimum of £120.

Q: What is this coefficient of variation, then?  A: Instead of answering this question, allow me to introduce you to the statistical equivalent of Wikipedia.  It’s great, up-to-date, and (unlike Wikipedia) completely acceptable as a reference/citation.  It’s based at UCLA: http://www.ats.ucla.edu/stat/
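Oh, all right, the short answer: it’s the standard deviation divided by the mean. In Stata, for a hypothetical variable income, that’s

summarize income
display r(sd) / r(mean)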

 


Mystery

I spoke yesterday at the Voluntary Sector and Volunteering Research Conference about the first paper of my PhD. I finally answered the question I asked two years ago in this post: http://www.confoundingfactor.org/archives/145  For those who are interested, the answer is the Raising of the School Leaving Age. Those extra 16-year-old volunteers were essentially timetabled into it. More soon.


Another advantage of longitudinal data

I have been having trouble formulating an important argument in one of my PhD papers, so I’m going to rehearse it here.  If anyone can help me to refine it, please let me know – I’d be eternally grateful.

Longitudinal data is good stuff if you want to try to understand a cause/effect relationship.  For example, causes generally precede effects, and having longitudinal data allows you to check whether your hypothesised relationship runs in the right direction through time.  I am interested in whether volunteering ‘causes’ political activity.  I have a dataset (the NCDS) which allows me to check whether volunteering precedes political activity (like voting, signing petitions and attending public meetings) in time.  It does.  So far, so good.

There are still problems, though. In survey terms, volunteering is bound to precede political activity.  The cohort is asked about volunteering for the first time at age 16 and political activity for the first time at age 23.  Therefore, if volunteering and political activity are related at all, it will look as if volunteering is the cause and political activity the effect, even if that’s not really the case.  For example, volunteering and political activity could be jointly caused.  This is incredibly plausible.  Volunteers are generally well-educated and middle-class: political acts are also linked to education and class.  Furthermore, one could argue that there is a ‘civic type’: people with a particular personality or socialisation which predisposes them to volunteer and vote.  Causal inference, therefore, continues to elude me.

There is hope, however.  Longitudinal data like this allows the user to account, to a certain extent, for the effects of character, intelligence and socialisation.  If a person’s volunteering is driven by their character, their innate intelligence, or their familial socialisation, those effects are likely to be evident at any given age.  That is, if there is a ‘volunteering type’ we would expect to see those people over-represented among the active volunteers in the NCDS at age 16 and age 23 (and indeed at 50, when these questions are asked again).  Instead what we see is that volunteers at age 16 are less likely to go on to get an education, less likely to go on to vote and more likely to be from families which are supported by an unskilled worker.  The reasons for THAT are a whole other blog post, by the way.

This pattern in the data makes me all the more confident about my findings.  Volunteers at age 16 are 40 per cent more likely to volunteer at age 50 than non-volunteers (even controlling for an array of potential confounders).  This is true even though the 16 year old volunteers have quite different demographic and socio-economic characteristics to the 50 year old volunteers. When I look at the data and see that the relationship persists even though adolescent volunteering in the NCDS looks so unusual, I feel more confident that I may be looking at a causal relationship. Furthermore, when I perform logistic regression using volunteering at age 16 to predict volunteering at age 50, I am implicitly controlling for all the things which ‘caused’ that adolescent volunteering. Therefore, I have controlled for being a ‘volunteering type’, for socialisation by the family, for innate intelligence, and for a whole host of other things which the literature has linked to volunteering. I remain within the shadow of doubt for sure, but I feel a lot more confident than I would have done without the longitudinal data.
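For the record, the model itself is nothing exotic – a minimal sketch, with hypothetical variable names standing in for the NCDS originals:

* vol16/vol50 = volunteering at 16/50; sex, social class and education as example confounders
logit vol50 vol16 i.sex i.socclass educ_years, or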

And that’s it. Criticism welcome. I am not attempting formal, statistical causal inference, by the way. This is more ‘weight of evidence’. I hope.


Survey Data Not Be All And End All – Shock

Apparently there’s more to research than survey data – who knew?

Those who have been paying attention will remember that I have used the National Child Development Study (NCDS) to examine the relationship between volunteering and political activity for the first paper of my PhD.  It turns out that, as well as 50 years of fascinating survey data on a cohort of people born in a single week in 1958, the NCDS also includes the transcripts of 220 semi-structured interviews (each lasting around an hour and a half) given when the cohort members were aged 50.  This collection of 220 interviews is known as the Social Participation and Identity Study (SPIS).

I’ve just taken a short break from my PhD research to do a collaborative (ESRC-funded) research project using this lovely stuff.  I collaborated with Jane Parry (Southampton) and Katherine Brookfield (soon to arrive at Edinburgh) on a mixed-methods report looking at a sample of 50 of the semi-structured interviews (called “50 at 50” – geddit?).  Our full report will be available shortly at www.tsrc.ac.uk.  It focuses on participation: in clubs, associations, voluntary organisations etc.  It covers the differences between participation data collected in surveys (quantitative) and in open-ended interviews (qualitative).  We also looked at people’s motivations for involvement and the barriers to participation that they face.  Finally, we included a chapter on participating in a cohort study (actually, more accurately, I should say that Katherine and Jane did that – I stuck mostly to the numbers and to formal social and civic participation).

For now, I’ll stick to the differences between quantitative and qualitative data when it comes to studying participation.

We used NCDS quantitative data to make a ‘sample’ of 50 out of the SPIS 220.  The quantitative data covers the whole of the cohort members’ 50 years, but we used only data from the five adult waves (cohort members were surveyed at age 23, 33, 42, 46 and 50).  We singled out:

  • non-participants,
  • perennial participants, and
  • frequent participants.

Non-participants answered ‘no’ to a slew of participation questions at every wave – they didn’t even go to discos when they were 23!  Perennial participants answered ‘yes’ to ONE question per wave about active involvement with a formal/membership organisation.  Frequent participants were active weekly in either (a) many spheres at once, or (b) voluntary work.
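In Stata terms, the sorting looked roughly like this – a sketch with hypothetical names, where each partW is an indicator for any active participation reported at wave W:

* count the adult waves (23, 33, 42, 46, 50) at which each cohort member was active
egen waves_active = rowtotal(part23 part33 part42 part46 part50)
gen nonpart   = (waves_active == 0)
gen perennial = (waves_active == 5)
* frequent participants also needed weekly-frequency information, not shown here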

Looking only at the quantitative survey data, perennial participants and non-participants are indistinguishable in terms of gender, social class, education, marital status, childrearing and care for elders.  Frequent participants (nb: only 8 of them in our sample) look noticeably better educated than their less-involved peers. They were also:

  • healthier
  • less engaged in caring for elderly parents
  • more likely to vote
  • more likely to be self-employed.

For those familiar with the literature, this should be raising some eyebrows.  We would generally expect volunteers, for example, to be better educated, and from the professional/managerial social classes.  We can certainly see some of those differences among the frequent participants, but not among the perennials.  The take-home, however, in spite of these small differences, is that it’s pretty hard to pick someone’s likely participation status from the rest of their NCDS data.

Looking at this from the other side (the dark side?), the qualitative data shows that, as you might expect, frequent participants really are very busy people.  John, one of the frequent participants*, launches into a description of the sports coaching activities he and his wife engage in with the words: “My wife gets up at just after three.”  Frankly, just reading the interview is exhausting.  While both data sources generally agree for the frequent participants, the qualitative interviews show that the majority of non-participants are more engaged than their quantitative data suggests.  Out of 21 non-participants, just seven failed to come up with any participatory activity at all in their qualitative interviews.  Furthermore, some perennial participants do really very little to ‘earn’ that tag.  Some perennials had just attended church services, or played golf at a club – and this put them in the same category as others with much more active records.  Finally, some types of participation were all but absent from the qualitative interviews: political acts and trade union activity were far more prominent in the quantitative than the qualitative data.  (By the way, this last comes under the general heading of ‘if you don’t ask, you don’t get’ – there was little chance that cohort members would be prompted to describe their trade unionism or political acts, given the interview schedule.)

Survey timing clearly played a role here.  Some of the difference between quantitative and qualitative data is down to “events, dear boy, events”.  The gap between quantitative and qualitative data collection was anywhere between one month and two years!  Alison, for example, referred to a changing economic backdrop:  “I think probably we’ve felt the credit crunch crunching over the last few months, so we probably do stay at home a lot more”.  Iona had been involved in various activities and organisations at the time of the NCDS quantitative interview.  However, between then and the SPIS qualitative interview, she was diagnosed with cancer and had started a programme of treatment.  As a result, her participation and volunteering commitments had been suspended (“since September I’ve had somebody else’s life really”).  In that case, the time gap was only three months, but it was very significant for the findings.  The quantitative data identified Fiona as a frequent participant on the basis of her participation in multiple organisations at age 50. However, the qualitative interview data indicated that she rarely participated in any organisations. Highlighting the role of timing, in her SPIS interview Fiona mentioned that work commitments, and the demands of studying part time for a new qualification, had, in the past two years, forced a reduction in her participation activities.

There is also a strong question effect.  The NCDS participation questions are generally predicated on ‘joining’ or membership.  Before cohort members have a chance to tick a box describing their activities, they must first pass a ‘membership’ hurdle.  The survey asks first “Are you a member…?” and only then “How often do you join in the activity of…?”.  In the case of religious activity, this is visible in the quantitative data alone.  Three cohort members (Iain, Susan and Anwen) said that they attended weekly services but were not members of a religious organisation.  When quantitative and qualitative data are compared, it’s clear that ‘membership’ is perceived differently by different cohort members.  For example, Michael (a perennial participant, according to the quantitative data) appeared to identify online contact with several clubs and receiving email newsletters, both rather passive forms of engagement, as ‘joining in’ while Paul (a non-participant) did not identify his membership and subsequent attendance at National Trust properties as ‘joining in’.  The membership hurdle also rules out participatory activities like evening classes and episodic volunteering.  For example, Janet (a quantitative non-participant) was involved informally with a children’s charity through her husband.  She says:

“My husband is a taxi driver and there’s a–, the only volun–, the only charity work that I do is help–, he takes part in a–, it’s a one day kiddies’ outing to PLACE IN SCOTLAND… so we’ve got to blow up the balloons and get everything ready but it’s–, it’s a couple of weeks of preparation, getting everything in, then you’re up at four, five in the morning to put this stuff onto the taxi and things like–, so that’s–, that’s about it.”  The interviewer then asks: “Have you been doing that for quite a while?”  She replies: “Oh, 20-odd years now…  It’s–, it’s maybe not a lot but it’s a lot to me [coughs] ‘cause we pay for it all ourselves as well.”

This certainly sounds like volunteering – but Janet’s quantitative NCDS record suggests no such activity.  The survey questions did not capture her episodic, family-linked volunteering (perhaps partly because she seemed reluctant even to use that word).

Including religious activity as a form of participation has also affected the findings. Some religious participants are very active: running choirs, or Sunday schools, or involved in church administration.  Others just go to the service.  Iain, for example, said: “It’s a sanctuary type thing and really, that’s all it is. If somebody were to say to me, ‘What did the priest say last week?’ I couldn’t tell you what he says. I listen in wee bits, but the majority of the time it’s just me and that’s what I get out of chapel.”  This is about as non-participatory as participation gets!  Louise, on the other hand, revealed in her qualitative interview that she was spending 1.5 to 2 days a week on her church-related activity.  Depending on how the cohort member views their own activity, these people can look pretty much identical in the quantitative data.

So the message here…?

  • When you use a survey question, you rely heavily both on question-order and on the respondent’s interpretation.
  • Unless you are interested purely in church-going, it may be better to ask specific questions about church-related participation.
  • If you don’t ask, you don’t get.

And finally – don’t for a moment think that this has put me off using survey data; it’s just made me determined to use it better.

 

*All names are pseudonyms.


A civic recovery? Or a graph of er… something-or-other.

Last year, I bought a copy of Political Participation in Britain (Paul Whiteley, 2012).  On page 86 there is a graph (bear with me…) of responses to one of the questions from the British Election Study Continuous Monitoring Survey.  This study asks 1000 people a month their views on a variety of political issues.  This particular question reads: “Over the past few years, have you volunteered to get involved in politics or community affairs?”  Whiteley describes “a slow but noticeable decline” from April 2004 to December 2010.  He writes: “It is no coincidence that this was the beginning of the worst financial crisis and recession in eighty years, suggesting that the state of the economy has an influence on social capital”.  At the anniversary of the London Olympics, I wondered whether the Games could have made a difference to how involved people get with their communities, so I went back to the data to bring the graph up to date.

Before we go too much further into unpacking some of this, I’ll show you my version of Whiteley’s graph.  He uses monthly data, while I collapse mine to quarters; his stops at the end of 2010, whereas I now have the data up until the end of Q1 2013.  I also annotated mine with pretty red arrows.
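If you want to do the same collapsing yourself, it’s short work in Stata – a sketch, assuming a monthly date variable called month and a 0/1 response called involved:

* convert the monthly date to quarters, then take quarterly means
gen quarter = qofd(dofm(month))
format quarter %tq
collapse (mean) involved, by(quarter)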

[Graph: ‘volunteered to get involved in politics or community affairs’, BES CMS, 2004 Q2 to 2013 Q1, annotated]

There are quite a few things here that need more thought.  First up, I don’t see an obvious Olympic dividend.  The pick-up in civic activity began well before the Olympics.

Secondly, Whiteley claimed that the data he used (up to Q4 2010) showed a “noticeable” decline.  Looking at it now, with the 2011 and 2012 data, I’m not at all sure that’s the case.  Hell, even if you put your hand up to the screen to cover the later data, I’m not sure you’d really describe the decline as particularly “noticeable”.

Thirdly, look at the question on which the graph is based: “Over the past few years, have you volunteered to get involved in politics or community affairs?”  What’s this really asking? “Over the past few years” is such a woolly timeframe that this is more of an attitudinal question than an activity question.  My best guess is that the question stands for ‘Would you get involved…?’ more than it stands for ‘Did you get involved…?’.

So what about the last segment of the graph?  It looks as though people got more enthusiastic about getting involved in politics and community affairs sometime shortly after the last General Election.  As far as I’m concerned the jury’s still out, but it certainly seems to suggest a political driver, rather than Whiteley’s economic driver, for the change in responses to this question.

I wonder (and I’m whispering very quietly here), could the Big Society have anything to do with this…?

 

EDITED 26/7/13: After a tweet from @karlwilding, I plotted the confidence intervals for the means in the graph above.  This one was done in Excel and has ended up looking less spiky, but it’s the same data.

[Graph: the same series, 2004 Q2 to 2013 Q1, with confidence intervals, plotted in Excel]

The eye is now drawn much more to the dips than to the peaks: if there’s anything here to explain at all, it now looks as though it’s in the 2010 and 2004 troughs.
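For anyone who would rather build the intervals in Stata than Excel, a minimal sketch – again assuming a 0/1 response called involved and a quarterly date variable called quarter:

* quarterly mean and its standard error, then a 95% interval
collapse (mean) p = involved (semean) se = involved, by(quarter)
gen lo = p - 1.96*se
gen hi = p + 1.96*se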


This shouldn’t be a surprise, but ‘gold’ costs an average of £1727 while ‘green’ is free…

I admit, I am coming in a little late on this one.  It turns out that publishing papers may be about to cost money.  An average of £1727 per paper, if RCUK has done its sums right.

Some lovely folks over at The Disorder of Things have helped me catch up a little.  This is how it is.  In the beginning was the Finch Report.  (Actually, that can’t have been the beginning – someone must have commissioned it – but I’m still only just catching up, so I’ll start with that.)  Dame Janet Finch very sensibly said that publicly funded research should be publicly available.  At the moment, most of it sits behind the paywalls of big, expensive journals.

The UK research councils picked up the ball and ran with it.  They said that any research they funded would, henceforth, have to be publicly available.  This is where we get into colour-coding.  RCUK explain things thusly:

Our policy requires that peer reviewed research papers which result from research that is wholly or partially funded by the Research Councils must be published in journals which are compliant with Research Council policy on Open Access.

A journal is compliant with our policy if it provides Gold OA using the CC-BY licence, and RCUK will provide funds to institutions to cover payment of APCs.  However, if a journal is not prepared to offer a Gold CC-BY option, it can achieve compliance by offering a specific Green option which must meet the following requirements.  It must allow, at a minimum, the accepted manuscript with all changes resulting from peer-review, to be deposited in a repository without restrictions on non-commercial re-use and with a maximum embargo period of 6 months.  For a limited transition period the maximum embargo period is extended to 12 months for papers arising from research funded by the AHRC and the ESRC.  This is in recognition that journals in these areas are not yet as well placed to move to an OA model.

So what does this mean for authors?  If the journal they want to publish in only offers policy compliance through a Gold route, they must use that journal’s Gold option.  If the journal only offers compliance through the Green route, the author must ensure that a copy of the post-print is deposited in an appropriate repository – for example, UKPMC for papers arising from MRC funded research.  If the journal offers both a Gold and a Green route to compliance (and some journals already do this), it is up to the author and their institution to decide on the most appropriate route to use.  And, if a journal offers neither a Green nor a Gold compliant route, it is not eligible to take RCUK funded work, and the author must use a different, compliant, journal.

In practical terms, this means that authors have to ‘pay to play’ if they want their work published in their chosen journal.  RCUK have promised block grants to funded institutions to cover the new cost – but everyone has gone suspiciously quiet on the subject of research students.

Will RCUK-funded PhD students have to get additional funding from their universities in order to publish their work? What if a student chooses to work on publication after their viva?

This is particularly difficult in the context of the REF (the Research Excellence Framework).  Universities are so completely focused on proving that they have been doing world-class, impactful research for the last five years that the question of how Open Access policies might affect the next REF in 2020 has been somewhat neglected.  If it costs an average of £1727 per paper to publish in the leading REF-respected journals in the next five years, who do you think will publish?  Early career researchers – the people who really need to get a couple of REF-relevant publications under their belt if they are ever to get a job which lasts more than 18 months?  Or funded Principal Investigators – the people the universities need to bring home their next research grant?  I know it doesn’t have to be either/or, but where there is competition for scarce resources it seems reasonable to assume that those with little sway will be the first to lose out.

Instead of forcing journals to adjust their business models to allow Open Access after an embargo period (‘green’) RCUK is using public money to pay journals – and forcing researchers to use their own money to pay journals – to provide a free subscription for the world (‘gold’).  I’m beginning to think that it would be better if we slowed down to think for a moment (‘red’).

This policy comes from an admirable place, but the wrong people will be paying for it.

PS Yes, I did force a third colour into the colour-coding back there.  I really wanted to get Karma Chameleon’s “loving would be easy if your colours were like my dreams – red, gold and green” in somehow.  Sorry.


The tax payer buys some data. Now who owns it? A private research company.

It turns out that when the Government commissions research (including data collection) from a private research company, it is not standard practice to require, as part of the contract, that the data is later placed in the UK Data Archive.  Who knew?  I’m told that the Government expects it, but cannot require it under the terms of the contract.

I feel a letter coming on…

Of course, there is a healthy serving of self-interest here.  I would like to get my hands on the data from the Citizenship Education Longitudinal Study, which started in 2001 and concluded in 2011 with Citizens in Transition, a follow-up survey which contacted the cohort at age 20.  The follow-up section was paid for by the ESRC (rather than the Department for Education) and they, of course, require that the data be deposited. Not that it’s much use without the previous waves…

EDIT: The follow-up data has been deposited. The original survey is still private.


Who should be responsible for policy evaluation?

The Guardian has published a note by Nick Axford on when a charity might wish to carry out a randomised controlled trial. Axford works for a charity which promotes the use of evidence in designing services for children and families.  This rather neatly wraps up a number of my key interests: policy evaluation, longitudinal data collection and the role of the voluntary sector.

Randomised controlled trials (RCTs) are usually associated with medical, rather than policy, interventions.  Actually, they are usually associated with modern pharmaceuticals.  We generally kid ourselves that our medicine is evidence-based.  A lot of it is ‘stuff that we’ve done before that seemed to work so we did it again’.  And so it is with social interventions.  Hence Axford’s suggestion that maybe we should try to gather some actual evidence for some of our pet policies before putting them into full scale production.

It’s not a bad idea – in fact it’s a good one (although I should add that the evidence wouldn’t have to come from an actual RCT – there are other options).  Who should take responsibility for making sure it happens?  We (well, the National Institute for Clinical Excellence) demand that pharmaceutical companies come up with some evidence before the NHS will buy their products.  Pharmaceutical companies, unlike charities with a new idea for helping children living in poverty, stand to make a lot of money from their products, so it makes sense to ask them to come up with the evidence.  Should we really ask a charity to do the same?  I can see why a charity might wish to promote its ideas this way, but can we expect it?

In fact I think that if the state wants to ‘buy’ the charity’s new idea, it should do the research itself.  Charities don’t have the same kind of financial interest vested in their ‘products’ as pharmaceutical companies, and they simply don’t have the resources to do this kind of research.  Following people over time – which is what you’d need to do to study the effect of a particular policy or intervention – is expensive.

As an aside, I think Axford’s article is missing a paragraph about regression to the mean.  He writes: “One of our evaluations showed that children whose parents attended a parenting programme were better behaved at the end of the programme than at the beginning. Great – the programme worked! Except that the same happened to similar children whose parents didn’t attend the group. The programme made no difference.”  The key here is “similar children”.  If you take a group of children with poor behaviour (the kind to whose parents one might recommend a parenting programme) you might expect to see some improvement in their behaviour over time, whether you intervene or not.  Why?  Because most children are neither consistently bad nor consistently good.  A period of bad behaviour will be followed (eventually!) by a period of better behaviour, no matter what you do.
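You can watch regression to the mean happen in a few lines of simulated data – a sketch, obviously, not anyone’s real behaviour scores:

* 1,000 children, each with a stable 'true' behaviour level plus random noise at each measurement
clear
set seed 2013
set obs 1000
gen true = rnormal()
gen score1 = true + rnormal()
gen score2 = true + rnormal()
* the 'badly behaved' group at time one looks better at time two, with no intervention at all
summarize score2 if score1 < -1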
