Federal Election 2019: What people actually think, not what they are saying.
The opinion polls got it wrong again.
Where did it all go wrong? Somewhere amid inaccurate sampling, small sample sizes, framed questioning and solicited responses, all of which bring voluntary response bias.
Why on earth would one rely on such primitive methods of data collection with the advanced technology we have available today? It’s like bringing a knife to a gun fight.
Without question, we can perform better-quality research and deliver more accurate outcomes when we embrace better technology as part of our arsenal. Those in the know have been leveraging AI-powered social and media intelligence for quite some time already; long enough that, over ten-odd years, they have seen the technology mature and get even better with time.
When we home in on the digital engagement with both Prime Minister Scott Morrison and Opposition Leader Bill Shorten, we can clearly see why.
Firstly, we are not looking at a tiny unrepresentative sample, nor a sample limited by the needs of an analyst to manually crunch data, or even one skewed with content from news outlets.
From when the 2019 federal election date was announced on 11 April 2019 to 17 May 2019 (the day prior to the election), we captured 304,326 social posts from people engaging directly with either (or both) party leaders.
That is some pretty serious volume. It took somewhere between 30 and 60 seconds to collect all that conversation and run analysis on the data, after spending a whole 60 seconds building the search! Think about those two minutes, and compare them with how long it took (and what the research cost) to produce the questions, make the 6,000 to 8,000 phone calls needed to have 1,000 conversations, and then interpret the results…
We are going to listen in on what people are actually saying, and to examine these public conversations, we have some pretty cool tech at our disposal!
An auto-sentiment algorithm trained on half a million posts (positive, negative, neutral), and an emotional sentiment algorithm trained on two million posts, including hashtags and emojis, covering base-level, cross-cultural emotions (joy, sadness, anger, disgust, fear and surprise).
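To make the mechanics concrete, here is a deliberately tiny, lexicon-based sketch of how a post can be tagged with a sentiment and an emotion. The production algorithms described above are machine-learning models trained on hundreds of thousands of labelled posts; the mini-lexicons and token-voting logic below are invented purely for illustration.

```python
from collections import Counter

# Toy mini-lexicons -- invented purely for illustration. A real system uses
# trained models, not word lists, and also handles hashtags and emojis.
SENTIMENT_LEXICON = {
    "great": "positive", "love": "positive", "win": "positive",
    "fail": "negative", "lies": "negative", "worst": "negative",
}
EMOTION_LEXICON = {
    "love": "joy", "great": "joy",
    "miss": "sadness", "sad": "sadness",
    "lies": "anger", "furious": "anger",
}

def classify(post):
    """Tag a post with (sentiment, emotion) by simple token voting."""
    tokens = post.lower().split()
    sent_votes = Counter(SENTIMENT_LEXICON[t] for t in tokens if t in SENTIMENT_LEXICON)
    emo_votes = Counter(EMOTION_LEXICON[t] for t in tokens if t in EMOTION_LEXICON)
    sentiment = sent_votes.most_common(1)[0][0] if sent_votes else "neutral"
    emotion = emo_votes.most_common(1)[0][0] if emo_votes else None
    return sentiment, emotion
```

Whatever the internals, the input/output shape is the same: each post resolves to one sentiment label and, where detectable, one base-level emotion, which is what makes the aggregate percentages below possible.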
The engagement is revealing.
Engagement with Bill Shorten – Sentiment and Emotion
First, Bill Shorten. He attracted 145,609 posts, 49% of which contained an identifiable sentiment and emotion; 61.6% of that sentiment was negative. The top emotion was joy (32.1%), followed closely by sadness (29.8%) and then anger (19.3%).
Engagement with Scott Morrison – Sentiment and Emotion
Now to Scott Morrison. He attracted 87,723 posts, 61% of which contained an identifiable sentiment and 63% an identifiable emotion. Engagement with Scott Morrison was less negative than with his opponent, at 55.1% negative sentiment. The top emotion was again joy (34.9%), slightly higher by comparison, followed by lower levels of both sadness (25.3%) and anger (15.2%).
Whilst it is clearly apparent that there is widespread dislike of both politicians, as well as engagement from friend and foe alike, the differences in sentiment and emotion in the social data proved indicative of a preference between the leaders, and of attitudes towards the policy initiatives underpinning the election outcome.
Australians were more negatively engaged, and expressed both greater anger and sadness in the language they used talking with (and at) Bill Shorten, compared to Scott Morrison.
We can observe a seven-percentage-point difference in the sentiment of engagement between the two leaders, which is reflected in the final result.
This is one piece of the puzzle in building predictive modelling using sentiment as a leading indicator of decision making. With the right technology, sentiment can easily be tracked in virtually real-time, captured historically and trended over time, and segmented in numerous ways to understand how people are responding to different issues.
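As a rough illustration of the trending idea, the sketch below buckets hypothetical (date, sentiment) records by day and computes each day's negative share; a real platform does this continuously over millions of auto-classified posts.

```python
from collections import Counter, defaultdict
from datetime import date

# Hypothetical (day, sentiment) records standing in for captured posts;
# real input would be the timestamped, auto-classified posts themselves.
records = [
    (date(2019, 4, 11), "negative"), (date(2019, 4, 11), "positive"),
    (date(2019, 4, 12), "negative"), (date(2019, 4, 12), "negative"),
]

def daily_negative_share(records):
    """Bucket posts by day and return each day's share of negative sentiment."""
    buckets = defaultdict(Counter)
    for day, sentiment in records:
        buckets[day][sentiment] += 1
    return {day: counts["negative"] / sum(counts.values())
            for day, counts in sorted(buckets.items())}
```

Swap the daily buckets for hourly ones, or segment the records by topic before bucketing, and the same aggregation gives you the near-real-time, segmented trend lines described above.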
This data is not one-dimensional like one too many Game of Thrones memes: each peak and trough corresponds to an event or topic that can be examined with deeper analysis. There are so many layers you can peel back, and the more you dig, the more interesting it gets when you start connecting and laminating data.
For example, when trying to understand more about an audience, we can identify the other topics most likely to interest that particular audience. Leveraging this, and taking a slightly different (and more interesting) approach to demographics, we can aim to understand just how representative of the Australian population the engaged audience was in each instance.
In a nutshell, an audience-affinities analysis compares the audiences of two distinct conversations and can tell us where they are the same, and where they differ in interest. Each bubble represents a topic; its size, the volume of participation in the conversation.
Audience Affinities for those engaging with Bill Shorten
In the chart above, the further a topic sits to the left, the more likely it represents the interests of the audience engaged with Bill Shorten. The further to the right, by contrast, the more the topic is of interest to the entire Twitter feed from Australia (which we use to “represent” the Australian population).
Where they come together in the middle is where the engaged audience shares similar interests with the “rest of Australia”. As cities and states are also identifiable topics of interest, we can see which of those matter to those in the discussion.
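Under the hood, an affinity comparison of this kind can be sketched as an over-index score: a topic's share of the engaged audience's conversation divided by its share of the baseline feed. The topic counts below are invented purely for illustration.

```python
# Invented topic counts for two conversations: the audience engaged with a
# leader, and the baseline all-of-Australia feed. Real counts would come from
# the platform's topic extraction.
engaged = {"queensland": 300, "nsw": 500, "cricket": 200}
baseline = {"queensland": 100, "nsw": 400, "cricket": 500}

def over_index(engaged_counts, baseline_counts):
    """Score each topic as its share of the engaged conversation divided by
    its share of the baseline. A score above 1 means the topic over-indexes
    for the engaged audience (left of chart); below 1, it skews towards the
    baseline feed (right of chart)."""
    e_total = sum(engaged_counts.values())
    b_total = sum(baseline_counts.values())
    return {topic: (count / e_total) / (baseline_counts[topic] / b_total)
            for topic, count in engaged_counts.items()
            if topic in baseline_counts}
```

Topics scoring near 1 are the ones that converge in the middle of the chart: shared interests rather than audience-specific ones.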
Audience Affinities for those engaging with Scott Morrison
The data above clearly shows a difference in audiences between the two, and that the Australians engaged with Scott Morrison represent a better geographic cross-section of the Australian population, as the “interests” converge very close to the middle (meaning they are not unique to either audience). Judging from the geographical interests of those engaging with him, Scott Morrison knew middle Australia better than Bill Shorten: his message was getting across to all of Australia, because it was all corners of Australia responding to it.
Bill Shorten, on the other hand, was speaking to the people who were going to listen to him anyway: we can see him resonating with those interested in New South Wales and Victoria, with both interests over-indexing in the conversation. Interestingly, Queensland also over-indexes, and we know full well that this state played a role in the election outcome. A simple dive into the conversation of those with more interest in this state helps you start to understand what was driving some of this engagement.
Engagement with Bill Shorten – Sentiment (Queensland)
Social data can tell you quite a lot about what any audience is thinking, and in this case we can clearly see how unsolicited conversation is far more revealing than potentially framed questions and the susceptible responses of the thousand-odd people willing to answer a cold call in any standard opinion poll.
Social media data is proving more accurate at determining sentiment than opinion polling, and it not only clearly shows that there was more dislike for Bill Shorten, but can also tell you, in far more detail, the where and the why. Just imagine the possibilities for accuracy and insight when you use machine learning to train categorisation to your own definition of sentiment or intent!
We live in a digital world where traditional forms of media can quickly find themselves out of date, and it is the deep, fast and reliable customer insight from technologies that go beyond “social listening” or “media listening” that drives the most successful organisations today.
Ultimately, social media (and online) analysis makes digital and social marketing more effective, and proven technologies like Crimson provide the huge volumes of public web data and social intelligence needed to understand customers today: consumer trends, drivers of sentiment, attributes of purchase intent and products, as well as category- and brand-level conversations.
When thinking about your next qualitative analysis or focus group, social and online media offers you access to colossal volumes of data ready to reveal a myriad of powerful business intelligence to make more informed and accurate decisions. If your organisation still doesn’t understand “social media” you simply risk getting left further behind.
As an official partner, we support some of the leading organisations across Australia with Brandwatch and the AI-powered Crimson technology. We don’t use AI as a “buzzword”. We are talking about the building blocks of artificial intelligence which encompass multiple technologies such as machine learning, deep learning, and natural language processing. They are all forms of AI and provide both a higher speed to insight and accuracy due to customisation. It’s faster than hand coding or traditional market research, with either better or comparable accuracy, and can be applied over colossal volumes of data.
It doesn’t necessarily take a “data scientist” to better support informed decision making. The Crimson technology offers unlimited global data access and unlimited user access across the largest library of unsolicited publicly available conversations in the world. Anyone in the organisation can access and analyse any of its data for insight about their customers, partners, competitors or industry stakeholders as it happens, or at any point over the last 10 years. Anything else is yesterday’s tech and old news.
Want to see more? Scroll down and connect with me for a demo to see how you can go deeper into the strategic communications and easily analyse the campaign policy messaging using AI, as well as identify trends in the emotions used in the language.
Viva la data democracy revolution!