Archive for June, 2011

Defining and measuring engagement: An essential tool for Media Companies.

Wednesday, June 29th, 2011

According to Jeffrey Graham, “Engagement is like love — everyone agrees it’s a good thing, but everyone has a different definition of what it is.”

Graham is referring to the struggle in the media world to define what is meant by this term.

Defining this term is vital for numerous reasons. Epps (2009) emphasises how a clear definition matters to media companies: it establishes value for advertisers, helps to retain and acquire customers, and justifies investment in content.

A framework of engagement was developed by Forrester, known as ‘The Four I’s of Engagement’. Put in the context of website research, the four I’s can be defined as follows: involvement with the content (time spent on page and number of pages viewed); interactions made by the viewer (playing videos, commenting and rating content); intimacy (the affection the viewer has for the brand); and the influence of the viewer on other people, or their likelihood to advocate the brand to others.
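To make the four dimensions concrete, here is a minimal sketch of how a session log might be summarised against them. The field names, schema and example values are purely illustrative assumptions on our part; they are not part of Forrester's framework or any published implementation.

```python
# Hypothetical sketch: summarising one viewing session against the
# "Four I's" (involvement, interaction, intimacy, influence).
# All field names and values below are illustrative assumptions.

def four_is_summary(session):
    """Group raw session measures into the four engagement dimensions."""
    return {
        "involvement": {
            "seconds_on_page": session.get("seconds_on_page", 0),
            "pages_viewed": session.get("pages_viewed", 0),
        },
        "interaction": {
            "videos_played": session.get("videos_played", 0),
            "comments": session.get("comments", 0),
            "ratings": session.get("ratings", 0),
        },
        "intimacy": {
            # e.g. a sentiment score from -1 (negative) to +1 (positive)
            "sentiment": session.get("sentiment", 0.0),
        },
        "influence": {
            "shares": session.get("shares", 0),
            "recommendations": session.get("recommendations", 0),
        },
    }

# An invented example session.
example = {
    "seconds_on_page": 340, "pages_viewed": 5,
    "videos_played": 2, "comments": 1, "ratings": 0,
    "sentiment": 0.6, "shares": 1, "recommendations": 0,
}
summary = four_is_summary(example)
```

Note that only the first two dimensions fall out of server logs this directly; intimacy and influence require sentiment and advocacy data from elsewhere, which is exactly where the measurement difficulties discussed below arise.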

This framework can be praised for creating a clear definition of engagement that includes not only the state of the person at the time of exposure, but also the long-term effect of the content on viewer behaviour.

ACB believes that defining the term is only half of the battle. The way in which research is conducted in order to measure engagement is of paramount importance. The framework laid out by Forrester used both qualitative and quantitative data sources to measure each of the four factors. However, the methods chosen could be criticised. Metrics used to assess involvement and interaction – time spent on page per session, actions within a page (for example, viewing videos and commenting), and repeat visitation – are complex both to observe and to understand. Intimacy and influence metrics, such as sentiments expressed in blog posts and content shared with friends, potentially miss the subtlety of sarcasm and language, and could be considered invasive.

ACB would suggest that a simpler yet more in-depth design is needed to truly understand engagement. Clearly, traditional methods need reviewing, as stated by Epps (2009): “Both advertisers and media companies… acknowledge that the old metrics of page views and impressions are insufficient to account for the complexity of consumer interaction with advertising and content online.”

As ever, ACB believes it is important to recognise that the amount of time spent on a particular channel or web page is not necessarily reflective of the attention the viewer dedicates to the content. This can be affected by numerous factors, including concurrent activities the individual may be engaged in – putting the kettle on, for example!

Sadly, the opportunity lost with this type of metric is that rather than providing a more refined product – one that delivers insights on audience behaviour across multiple screens and incorporates contextual factors, or ‘life’ – we simply default to a crude measure that may be misleading, even if it delivers a lot of numbers!

Plummer defines engagement as “turning on the mind… a subtle, subconscious process in which consumers begin to combine the ad’s messages with their own associations, symbols and metaphors to make the brand more personally relevant…until they undertake this process, or “co-create” the meaning, they haven’t truly engaged and it is unlikely to impact their behaviour.” ACB admires this detailed and ambitious definition, but measurement is again extremely complex. ACB would assert that without cameras capturing sound and natural behaviour, we will be ‘blind and deaf’ to a true understanding of engagement.

For the purpose of understanding engagement, the unobtrusive, video-ethnographic nature of our study is superior to traditional methods of collecting data, such as surveys and self-report questionnaires. Sanday (1979) states that “one comes to understand something by seeing it as an outsider”.

ACB believes that video ethnography – capturing audience behaviour while simultaneously capturing what is on the TV, laptop or mobile – offers the potential to gain a true understanding of engagement. Second-by-second behavioural micro-analysis is used to assess all aspects of the use of this technology, concurrent behaviours and simultaneous screen use, as well as engagement and interactions. For example, a positive interaction with the content may be clicking on a link, rewinding content, laughing, or sharing or commenting on the content.

In contrast to looking at broad engagement figures for one session, the rigorous method used by ACB can provide a measure of real-time engagement. We are able to see when, and why, in a particular session viewer engagement splits across or switches between multiple screens. This delivers an accurate representation of engagement, along with insights into what affects it, such as aspects of the technology and situational factors including time of day, the presence of others and concurrent behaviours.

ACB has traditionally used a simple definition of engagement. This has stood the test of time and is reliable in delivering insights on the impact of new technology over time. However, ACB is ever keen to innovate and push the boundaries of knowledge, and so is working with partners in Phase Five of the 1-3-9 Media Lab to provide a deeper understanding of engagement, one which includes immersion in the viewing experience. This additional layer of information will provide new insights on engagement and deliver to our members a richer understanding of audience behaviour than has ever been delivered before. It will also have the value of being future-focused, incorporating the latest screens.

ACB would assert that understanding engagement does not require huge samples – that focus is ultimately mistaken. Rather, understanding engagement requires language, laughter and other human responses; most importantly, it requires natural behaviour and the observation of all screens, to compare and contrast.

The Importance of Context in Research

Wednesday, June 22nd, 2011

The identification of the most effective methodology in any type of research is fraught with difficulty, as highlighted by the ongoing debate between qualitative and quantitative research. As stated by Morrison (1998), “The methods one chooses structure the representations that are made of the social world”.

When studying an individual’s behaviour, the question of the most appropriate methodology is particularly pertinent. Behaviour is dependent on the context an individual finds themselves in, and as such the influence of situational factors cannot be overlooked. A reliance on an individual’s memory of their past behaviour is problematic, as that memory is open to numerous misperceptions.

It is only by analysing an individual’s behaviour in the context in which it is engendered that researchers can provide a detailed understanding of that behaviour, and of the individual’s attitudes and decisions. Ethnographic analysis is therefore a useful method for studying the behaviour of people, as the behaviour is witnessed in its natural context rather than reconstructed from the individual’s account after it has occurred. Because people do not have to recall events which happened in the past, ethnography can be praised for its reliance on observational analysis; as Sanday (1979) states, “one comes to understand something by seeing it as an outsider”. The observer is claimed to be better suited to understanding the behaviour of the group, as they are “likely to be more sensitive to the nuances observed at home, which might otherwise be ignored”.

However, a traditional observational study, in which the observer places themselves in the situation they want to study, raises numerous issues in terms of collating information. Webb et al. (1966) discuss the shortcomings of the human observer: his “fallibility as a measuring instrument – his selective perceptions and his lack of capacity to note all elements in a complex set of behaviours.” Furthermore, “in simple observational studies, research is often handicapped by the weaknesses of the human observer, by the unavailability of certain content and by a cluster of variables over which the investigator has no control” (Webb et al., 1966). In order to tackle the inevitable limitations of this approach, video cameras can be used to capture behaviours as and when they occur, allowing a richness of analysis unavailable to simple observational studies.

This is the approach used here at ACB. By placing cameras in the homes of a number of early-majority households, we are able to capture the behaviour surrounding the use of new technology in the context in which it is most organic. After some gentle hot-housing, the cameras remain in the home for a number of months. Although only four of these days undergo our in-depth analysis, the extended period over which the cameras are present allows the family to become familiarised with them, avoiding the “speak clearly into the microphone, please” (Webb et al., 1966) effect.

By the end of the extended capture period, we are left with four days’ worth of footage for each of our participants. Having a permanent record of the data gives us added value over data obtained through traditional ethnographic analysis, as we are not impeded by the limitations of the investigator’s memory. A second-by-second behavioural micro-analysis is then conducted on the footage over a number of months by a team of dedicated researchers. Coding every detail of the use of technology, as well as the individual’s behaviour and key ethnographic insights, means that the data set not only reflects in-context, natural behaviour, but provides a detailed and rich account of it.
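The second-by-second coding described above can be pictured as a timeline of coded events, from which measures such as simultaneous screen use fall out directly. The sketch below is a minimal illustration under our own assumptions; the event schema and behaviour codes are invented for this example and are not ACB's actual coding scheme.

```python
# Illustrative sketch of second-by-second behavioural coding.
# Each coded event is (start_second, end_second, screen, behaviour);
# the schema and codes are assumptions made for illustration only.

def attention_timeline(events, duration):
    """Map each second of a session to the set of coded behaviours."""
    timeline = [set() for _ in range(duration)]
    for start, end, screen, behaviour in events:
        # An event covers seconds [start, end), clipped to the session.
        for second in range(max(start, 0), min(end, duration)):
            timeline[second].add((screen, behaviour))
    return timeline

def seconds_multiscreen(timeline):
    """Count seconds in which more than one screen is in use."""
    return sum(
        1 for codes in timeline
        if len({screen for screen, _ in codes}) > 1
    )

# Ten seconds of footage: TV on throughout, laptop used for seconds 4-7.
events = [
    (0, 10, "tv", "watching"),
    (4, 8, "laptop", "browsing"),
]
timeline = attention_timeline(events, duration=10)
```

A per-second structure like this is what makes it possible to ask exactly when engagement splits across screens, rather than settling for one aggregate figure per session.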

A further benefit of this particular study into the use of technology is the longitudinal nature of the research, which allows us to see how actual audience behaviour changes and develops over time with the emergence of new technology. When placed in conjunction with the collection of quantitative data, longitudinal ethnographic analysis can provide the richest and most detailed account of this behaviour:

“The technical collection of BARB data, from which the rating figures are produced, may not be a ‘serious’ sociological enterprise in its collection of pure information, but the data thus collected are a very important source for the development of sociological knowledge, once subjected to sophisticated longitudinal analysis.” (Morrison, 1998)

Further to our work on the impact of technology in the home, our methodology is also used as a means of assessing consultations between doctors and patients. The evolution of a new means of patient assessment, ‘telehealthcare’, is particularly apt for this kind of analysis. Understanding engagement and the patient experience is significant in that it affects patient satisfaction, which in turn could affect concordance. Given the unfamiliarity of these consultations, a number of potential problems may arise that could affect patient engagement. Reliance on patients’ self-reports here is particularly problematic, and researchers studying the efficacy of these procedures have “urged caution in interpreting the largely positive findings reported in many studies” (McLean, Protti and Sheikh, 2011). Again, only by studying these behaviours as and when they occur can we get a true, honest and objective assessment of the experience.

One final noteworthy merit of video ethnographic analysis is its flexibility when it comes to interpreting the results. In comparison to other studies, in which the design is formed around the hypotheses being tested, the permanence of this record allows researchers to form hypotheses subsequent to the collection of data: “It is not subject to selective decay and can provide the stuff of reliability checks…the same content can be the basis for new hypothesis-testing not considered at the time the data were collected” (Webb et al., 1966).