In my most recent blog post http://chi.anthropology.msu.edu/2018/12/what-is-your-purpose/, I discussed the importance of public engagement by researchers in academia, focusing on the role of biological anthropologists and their unique ability to contribute to the conversation on social race and ancestry. I mentioned how this concern led me to broaden the target audience of my CHI fellowship project from biological anthropologists and related professionals to middle and high school students and introductory college courses. However, because I am not trained or conditioned to reach outside of my disciplinary bubble, one of my goals this semester is to learn how to successfully engage with and educate the public through my project (a brief project description is available in my last blog post). I started asking myself questions to decide how best to promote my project and ensure that people will actually use the website.
Some questions I am concerned with are:
- What variables are used to measure the success of public engagement?
- When is a public engagement project considered successful?
- What media are commonly used to reach the largest number of people?
When researching methods and variables for measuring the success of engagement projects, I relied mostly on sources from the social sciences, since these metrics vary greatly depending on whether the project involves in-person interaction or takes place online, on the discipline, and on the goal of engagement (e.g., education, recruitment). Overall, several common trends emerged. First, and most obvious, is the number of people reached. Second, many projects are concerned with demographic diversity, which will be an important aspect of this project given that its goal is to teach the intricacies of human variation to better understand the origins of social race. Finally, researchers agree that qualitative data should be recorded to gather feedback and input on projects, such as usability, how well users understand the content, and the influence the project had on their decision making.
I will take these variables into account in a few ways. There will now be a user tracker on the website landing page that counts the number of times the landing page has been visited. Because the website is meant to be a teaching tool, this raw visitor count will likely be inaccurate. For example, a teacher may decide to open the site and work through the content as a unit, assign it as a group project, or assign the project to each student, so a single high school class could register anywhere between 1 and approximately 30 users. Even so, this will measure rough rates of engagement over time. To manage some of the inconsistencies in this raw metric, I am considering other ways to count the number of people reached, such as teacher surveys or adding user-focused questions to the quizzes incorporated in the learning tools component of the website. These could simply be check boxes or data entry boxes asking whether the user is part of a course or school and whether the user is taking the quiz as a group or individually. Additional quantitative metrics will be collected from social media posts promoting the project (i.e., likes, shares, and comments).
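To make the idea of the landing-page counter concrete, here is a minimal sketch of how visits could be tallied by day so rough engagement rates over time can be read off later. All names here are illustrative assumptions (the site is not built yet in this post), and in practice an off-the-shelf analytics tool would likely handle this instead.

```javascript
// Illustrative sketch only: an in-memory tally of landing-page
// visits keyed by ISO date, so engagement can be compared across time.
const visitsByDay = {};

// Record one landing-page visit on the given date.
function recordVisit(dateStr) {
  visitsByDay[dateStr] = (visitsByDay[dateStr] || 0) + 1;
}

// Total visits across all recorded days.
function totalVisits() {
  return Object.values(visitsByDay).reduce((sum, n) => sum + n, 0);
}

// Simulate a few landing-page hits.
recordVisit('2019-01-10');
recordVisit('2019-01-10');
recordVisit('2019-01-11');
```

Note that, as the paragraph above points out, one recorded visit here might represent a whole classroom working through the site together, which is exactly why the teacher surveys and quiz questions are needed as a check on this number.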
Measuring the demographic diversity of the project’s engagement will be even trickier, because I do not want to collect personally identifying information from the users. One approach may be to include another initial quiz question or teacher survey asking for the location (city and/or school) of the user(s). This will show whether the website is only being used in certain regions, such as urban school systems, where there tend to be more diverse perspectives and demographic groups, or whether it is reaching a wider audience. This will be critical in deciding how and where to disseminate the project in order to achieve the greatest impact.
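The regional check described above boils down to a simple tally of the optional location responses. The function and sample data below are hypothetical, but they show the kind of aggregate (counts per reported city, with no individual identifiers) the quiz question would produce:

```javascript
// Hypothetical aggregation of the optional location question:
// count quiz responses by reported city to see whether use
// clusters in particular regions. No per-user data is kept.
function tallyByLocation(responses) {
  const counts = {};
  for (const r of responses) {
    const city = r.city || 'not reported';
    counts[city] = (counts[city] || 0) + 1;
  }
  return counts;
}

// Example responses (invented for illustration).
const sample = [
  { city: 'Detroit' },
  { city: 'Detroit' },
  { city: 'Lansing' },
  {}, // user skipped the optional question
];
const counts = tallyByLocation(sample);
```

Keeping the question optional and aggregating immediately, as sketched here, is one way to learn about regional reach without storing individualizing information.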
Finally, I have decided to collect qualitative data through a trial run of the project: I will ask a handful of teachers and professors of different age groups to use the project in their classrooms. I can then take feedback from this initial testing phase to determine whether and how student perspectives on race theory, ancestry, and human variation were altered, and whether and how this might affect their behavior in the future. This will also let me know if the content is presented clearly and effectively so that I can make updates before releasing the project on various digital outlets.
In the upcoming weeks, I will keep these questions about public engagement success in mind throughout project development and continue to research other anthropology public engagement projects to see which social media have the highest rates of engagement. In the meantime, here are some ways anthropologists are already engaging through digital media outlets, such as podcasts and Instagram and Sketchfab accounts:
- Anthropological Airwaves by the American Anthropologist journal
- AnthroPod by Society for Cultural Anthropology
- The Natural History Museum, London
- Jason Herrmann, Archaeologist, Personal Account
- Michigan State University Campus Archaeology