Bradley Honan has co-written a chapter in Sentiment Analysis in Social Networks, a book by Federico Alberto Pozzi, Elisabetta Fersini, Enza Messina, and Bing Liu that looks at the role of social networks in modern society.
In our chapter, “The Human Advantage: Leveraging the Power of Predictive Analytics to Strategically Optimize Social Campaigns,” we highlight our work on digital content evaluation and predictive analytics, outlining our firm’s sentiment analysis philosophy and our content analytics offerings.
In case you haven’t seen the book yet, here’s an elaboration on what we wrote about and discussed.
I. Pretesting or Bust
A core component of our social media philosophy is robust pretesting. Given the significant cost and complexity of rolling out an effective social media campaign, we believe testing small pieces of social media content before the full campaign is developed can eliminate work and ideas that are neither effective nor engaging. Too often, social media strategists skip this important step.
We have developed a proprietary tool called the Brand Engagement Lab where companies can recruit their target audience and test pieces of content in a simulated social media environment like Facebook before they go live. In this setting, the audience can browse through content with the ability to “like,” share, and comment on posts as they wish so we can tell if and how consumers are engaging with the content. Insights from this exercise help to optimize a campaign’s approach and strategy by identifying which messaging content and tactics most effectively drive the greatest degree of engagement, favorability, and resonance with consumers.
II. The Census Philosophy is Wrong – There is No Need to Count Everything
Currently, far too many companies are attempting to analyze every single piece of social media content related to their brand in the hope of determining what drives engagement. We think this so-called “census philosophy” is misplaced. Even the US Census, mandated by the Constitution to count every American, does not survey every single person in the United States!
Instead, the Census Bureau relies on estimates based on statistical sampling. If the US Census can effectively sample people rather than count them all, we can certainly identify social media trends and insights without counting every single piece of content individually.
Unfortunately, the census philosophy too often wins the day. When you are analyzing massive amounts of data, you have to use computers and data aggregation technology. Computer software isn’t (yet) smart enough to fully understand human language and sentiment, so important insights are inevitably lost.
For example, a computer could come across a post that said “Pizza is hot.” This could indicate the pizza is warm, trendy, spicy, or even good-looking. Computers cannot account for such nuance and context in language and would therefore likely miscode the content. Instead of analyzing every single piece of content, we feel it is in the best interest of the campaign to gather a scientific sample that yields a representative picture of the social media landscape.
A reasonable sample of 1,000–2,000 pieces of content gives human analysts the time to understand nuances in the tone and message of the content, so we can pinpoint what does and doesn’t work.
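To make the sampling argument concrete, a quick back-of-the-envelope calculation (a generic statistical illustration, not a tool from the chapter) shows why a sample in this range is sound: at a 95% confidence level, the worst-case margin of error for a proportion estimated from 1,000–2,000 items is only a few percentage points.

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Worst-case margin of error for a proportion estimated from a
    simple random sample of size n (z = 1.96 for 95% confidence)."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (1_000, 2_000):
    print(f"n={n}: ±{margin_of_error(n):.1%}")
# n=1000: ±3.1%
# n=2000: ±2.2%
```

In other words, doubling the sample beyond a couple of thousand items buys very little extra precision, which is why sampling beats a full census for this kind of analysis.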
III. Predictive Modeling – More Campaigns Must Do This
Once we have a representative sample of content to analyze, we use in-depth human analysis to review hashtags, tweets, posts, and comments and develop a framework of analysis that captures the variables, trends, and patterns in the available content. Each piece of content is coded for tone, theme, sentiment, and substance, among other variables.
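A coded record from this kind of human-coding exercise can be represented very simply; the field names and values below are illustrative, not the chapter's actual codebook.

```python
from collections import Counter

# Hypothetical coded records; variable names and values are illustrative only.
coded_content = [
    {"platform": "Twitter",  "tone": "positive", "theme": "inspiration", "engagements": 120},
    {"platform": "Twitter",  "tone": "negative", "theme": "pricing",     "engagements": 15},
    {"platform": "LinkedIn", "tone": "positive", "theme": "inspiration", "engagements": 95},
]

# Tally how often each coded tone appears in the sample.
tone_counts = Counter(item["tone"] for item in coded_content)
print(tone_counts)  # Counter({'positive': 2, 'negative': 1})
```

Keeping each coded variable as an explicit field makes the next step, modeling the relationship between variables and engagement, straightforward.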
We then take the information gained from the human coding process and feed it into a predictive model in order to understand the relationship between each variable and the outcome measure(s). Statistical modeling helps the client develop and optimize a campaign strategy that resonates with consumers, increases engagement, and sets up the client for long-term success. For example, we used this method for a life insurance organization and found that content with inspirational images greatly increased engagement across Twitter and LinkedIn.
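As a toy sketch of this modeling step (real work would use proper regression; this simply compares group means on made-up data), one can estimate the lift that a coded variable such as "inspirational image" gives to engagement:

```python
# Hypothetical coded sample: (has_inspirational_image, engagements).
posts = [
    (True, 150), (True, 130), (True, 170),
    (False, 40), (False, 60), (False, 50),
]

def mean_engagement(records, flag):
    """Average engagement for posts where has_inspirational_image == flag."""
    vals = [e for has_img, e in records if has_img == flag]
    return sum(vals) / len(vals)

with_img = mean_engagement(posts, True)      # 150.0
without_img = mean_engagement(posts, False)  # 50.0
print(f"engagement lift: {with_img / without_img:.1f}x")  # engagement lift: 3.0x
```

A finding like a 3x lift on the fabricated numbers above is the kind of signal that, on real coded data, would feed directly into campaign strategy.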
In closing, we are honored to have been chosen to contribute to this insightful book and are happy to have added to the dialog and discussion taking place today.