I need to vent a lil so just feel free to ignore the following blather.


I'm really glad people aren't hassling me for the last results of the VCP. Election's over, journos have moved on, blah blah... It's good, because the data is just crap.

The post-election responses are absolute nonsense. People regurgitating what the media has told them is their reason for voting, post-event rationalisation, and all manner of people claiming they knew what the outcome would be. Most significantly, the post-election responses do not match the pre-election responses.

I've re-merged and re-analysed four times now, and it's undeniable. Very, very unstable voters, who indicated a different vote intent in every survey, saying after the election that they decided a long time ago or were always going to vote that way. People who posted emotional messages about Bob Hawke's death saying, just two weeks later, that it didn't affect them at all.
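For anyone wondering what "re-merged and re-analysed" actually involves, the check is roughly this shape. A minimal sketch with invented respondent IDs and column names, not the actual VCP pipeline: merge the waves on respondent ID, then flag anyone whose stated intent differed in every wave but who claims post-election to have decided long ago.

```python
import pandas as pd

# Toy panel data (invented, for illustration only): two pre-election
# waves plus a post-election wave, keyed on respondent id.
wave1 = pd.DataFrame({"id": [1, 2, 3], "intent_w1": ["ALP", "LNP", "GRN"]})
wave2 = pd.DataFrame({"id": [1, 2, 3], "intent_w2": ["ALP", "ALP", "LNP"]})
post = pd.DataFrame({"id": [1, 2, 3],
                     "vote": ["ALP", "ALP", "ALP"],
                     "decided": ["long ago", "campaign", "long ago"]})

# Re-merge the waves into one panel, one row per respondent.
panel = wave1.merge(wave2, on="id").merge(post, on="id")

# A respondent is "unstable" if every wave got a different answer.
intent_cols = ["intent_w1", "intent_w2", "vote"]
panel["unstable"] = panel[intent_cols].nunique(axis=1) == len(intent_cols)

# The contradiction: unstable across waves, yet "decided long ago".
flagged = panel.loc[panel["unstable"] & (panel["decided"] == "long ago"),
                    "id"].tolist()
```

Respondent 3 here gives a different answer every survey and then claims the decision was made long ago, which is exactly the pattern that makes the post-election wave untrustworthy.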

I trust the pre-election data. It's the post-election data I think is wrong.

But so much of election study is done post-election. Is it all bad? Is none of it reflective of genuine voter behaviour when they walked into the booth? That would render the entire catalogue of the Australian Election Study since the 1980s bin-worthy.

And, no, I do not have the balls to call Ian McAllister's lifetime of work (with significant contributions from others) rubbish. Even if he does plagiarise.

I am prepared to call my last two surveys rubbish.

End dilemma vent.

@ktxby I had tuned out on the post-election analysis. So much of it seemed to be firing from the hip

@_skeletonmeat It always is. The articles on turnout for example written and filed when there were still millions of votes to count. The media post-rationalises like everyone else. A journo described it to me as fitting the story to the event rather than actually attempting to understand the event.

@ktxby is the quality of the data from this election really that different to previous elections? I've seen a lot of hand-wringing, but a lot of it seems based on the fact that most people really don't know how to read statistics

@_skeletonmeat Yes, quantifiably. Discounting the polls which are the usual guesswork and always misread, and not to overblow my own study, there was another multi-wave pre-election study - the Collaborative Australian Election Study (CAES) headed up by Simon Jackman at USyd. They also made substantial improvements to Vote Compass this year. (SMH/Age's Smartvote was, on the other hand, significantly compromised and a step backwards from YourVote.)

@_skeletonmeat Before this year, the last time we had any kind of panel data (multiple surveys of the same respondents) was 1990, and that was a two-wave set - the first survey three months before and the second just after the election. There have only been two previous three-wave studies - one done at the last election by the Electoral Integrity Project, which is about trust in the system, not campaign effects; the other with a very small sample in Canberra to measure the impact of a speech by Menzies.

@ktxby wow. I would've thought there'd be a lot more than that

@_skeletonmeat In every other country, there is. There was a flurry of studies in the 50s, and then it just stopped. A small kick of interest again in the 80s, but then the AES was established and it was like academia said 'that'll do'. I had to build my own (and it was a fucking traumatic experience, I wish I hadn't) because the data I wanted simply did not exist. The Voter Choice Project was to be the first Australian Columbia study - copying a study pioneered at Columbia University in 1940.

@_skeletonmeat (It probably would qualify as a Columbia study, but it got screwed by the universities into something else.)
