Education-Outreach According to Vanilla Ice: Strategies for High-Quality, Effective Education Efforts

Greta M. Zenner Petersen

"Stop, collaborate and listen." Although it pains me to say this, Vanilla Ice might have been onto something. Two decades after he released his hit single, "Ice, Ice, Baby",1 he is mocked and hardly even considered a B-list celebrity, but in the opening lyrics to his song, he offers valuable advice that we should take seriously when working on education-outreach projects. The process of beginning an education-outreach program or activity can seem daunting – so many possibilities, so many audiences, so many needs – which can leave a person feeling at a loss of where to start. Three imperatives from a forgotten rapper at the end of the last millennium can help.

Stop: Slow Down, Conduct Background Work, and Lay a Foundation

One of my biggest pieces of advice for working on an education-outreach project is to slow down. Frequently, researchers and education professionals alike jump too quickly to the hands-on phase of development, bypassing several important preparatory steps. When included at the beginning of development, these steps will make the entire process go more smoothly and increase the quality of the final product, whether it is an entire multi-year program or a single classroom demonstration. The first of these steps is setting goals, and the second is understanding your audience.

Set Goals

Once you have chosen the approximate content area and approach (e.g., hands-on activity, after-school program, kit, teacher training, etc.) for your education-outreach project, it is essential to set goals. These will help you to establish a framework for your evaluation process, to re-focus during the development phase, and to know if you accomplished what you set out to do. If you fail to decide ahead of time what you hope to achieve and what you hope your target audience will gain, it becomes very difficult to know if you actually succeeded.

The most commonly considered category of goals is content learning goals – what content do we want our target audience to learn or master through participating in the education-outreach project? While this category of goals may feel simple and clear-cut, it is still essential to enumerate and concretely define them. Leaving content learning goals unspoken and abstract invites confusion and makes it difficult to know whether you have achieved them. Do not underestimate the importance of this step. It may seem obvious and straightforward – and maybe it will be – but it can substantially improve the quality and effectiveness of an education-outreach project.

There also exists an equally valid set of learning goals that complement and add to content goals. A recent publication by the National Research Council, Learning Science in Informal Environments: People, Places, and Pursuits,2 enumerated six strands of learning. As the title suggests, the publication explicitly addresses learning within informal science environments, but much of the FEd community’s education-outreach efforts fall into that category, and many characteristics of effective informal education resemble effective formal education. A noteworthy characteristic of these strands of learning is that content makes up only one of the six; a large portion of learning happens through non-content-specific strands. Additional types of learning include becoming excited about science, developing and conducting experiments, reflecting upon science as a way of knowing, participating in science, and personally identifying with science.3 Considering these goals at the outset of a project will make it more robust and impactful.

A final step in preparatory goal-setting is to streamline, simplify, and reduce. I have yet to witness the development of an education-outreach project whose goals started out too succinct and too simple. In contrast, both researchers and education professionals typically begin with learning goals, especially content ones, that are too lofty and too numerous. If in doubt, simplify and reduce. It is also completely appropriate, if not wise, to revisit learning goals throughout the development process and revise them as you obtain feedback from audiences and gain experience leading the project.

Know Your Audience

One of the most important rules in education-outreach is "know your audience". That can prove challenging at times because you might not be able to forecast who your audience will be in advance, especially for events like large-scale expos or science shows. In other situations, such as an undergraduate course, you can. Either way, collecting baseline information and taking it into consideration during the development and delivery of your project can dramatically increase its effectiveness.

Fundamental audience characteristics to consider include age, gender, race, ethnicity, and socio-economic status. It is also important to understand your audience’s prior knowledge as much as possible – what they know, how they formed that knowledge, and what preconceptions they might bring to the event. Resources exist that can help us develop a basic understanding of the ideas and concepts that students and public audiences have mastered at different ages and grade levels. Several of these include:
  • The National Science Education Standards4
  • The Atlas of Scientific Literacy, Volumes 1 and 25
  • National Science Teachers Association (NSTA) journals6: Science and Children (elementary school), Science Scope (middle school), The Science Teacher (high school), and Journal of College Science Teaching (undergraduate)
  • The most recent Science and Engineering Indicators (currently 2010)7

Browsing these sources, especially the Standards and the Atlas, offers a quick way to assess the concepts that students are learning – and are capable of learning – at different ages. I emphasize the capability component because we often assume that if we just explain something well enough, children will understand it. However, some concepts, such as atoms and the particulate nature of matter, are beyond what children younger than middle school can comprehend. No matter how well you explain the idea, young children’s minds and understanding of the world mean that they cannot truly grasp the concept. The NSTA journals are likewise good sources for assessing the content students learn at various levels, as well as the ways that educators help students to learn that content. The publications listed here can help you understand your audience and give you ideas for education projects.

These K-12-related resources can also be helpful for assessing the average American adult’s understanding of science, which is generally considered to be at the eighth-grade level. This means that if you learn about the comprehension level of middle-school students, you will also have an approximate baseline for the general American adult population. Another way to become more familiar with adults’ understanding and perception of science, as well as the sources where they find their information, is the Science and Engineering Indicators. The Indicators are a series of biennial publications by the National Science Board that report on the American and international scientific research enterprises and on the public understanding of science around the world, with a focus on the US. The Science and Engineering Indicators 2010 is the most recent release.

Written sources are good starting points for helping us to understand our target audiences, but first-hand experience is always best. Reading about students and classrooms can only go so far; to develop a real understanding of and appreciation for the realities of your target audience, you must experience it. If possible, visit a site that serves your target audience so you can witness the reality of that environment. It does not have to be extensive – it can even be sitting in one or two class periods or visiting a local museum and informally observing visitors’ interactions.

Collaborate: Form Partnerships and Leverage the Expertise of Others

An unfortunate mistake many people make when exploring education-outreach is trying to do it all on their own. Even a recent article in Nature reinforces the mistaken notion that researchers must develop education materials and engage in outreach efforts unaided.8 The reality is that a wide range of groups, institutions, and professionals exist who are interested in working with scientists on education-outreach efforts. Forging and cultivating such partnerships allows us (and our partners) to divide the workload and take advantage of a range of expertise, thereby strengthening the entire project.

Some potential resources and partners to consider include:
  • Museums, including children’s museums, natural history museums, science museums, and art museums;
  • Area K-12 teachers, schools, and/or districts;
  • Professional societies (like APS) and their education committees (like FEd) and staff (APS has excellent education-outreach staff);
  • Community groups, such as Girl Scouts, Boy Scouts, Boys & Girls Clubs, 4H, etc.; and
  • Your own institution. Many universities, colleges, government labs, and industries have pre-existing education-outreach programs and infrastructures.

Asking for help, i.e., seeking collaborators, is not only acceptable, but also ideal. A strong partnership will strengthen the work and impact of both partners.

And Listen: Evaluate and Assess, Don’t Assume

Evaluation is an essential component of all stages of education-outreach projects and programs. Good scientists review the current literature before conducting an experiment on a specific topic and conduct several tests to make certain that their data show what they claim. The same should be true of education efforts. Conduct some background work at the beginning of the project and conduct evaluation at several points throughout the development process. Evaluation could serve as the topic for an entire newsletter article, so I will only address it briefly here, highlighting a couple of main ideas. The first of these is called front-end evaluation, and the second is formative evaluation.

Front-end evaluation is the process of finding out more about your audiences – who they are and what they know – as well as what similar programs have already been conducted. A research analogy to front-end evaluation would be conducting a literature review: you can learn from the work others have done before you. I mentioned earlier several resources for understanding your audience, and I strongly recommend taking advantage of both the written publications and first-hand experience opportunities. Additionally, to learn more about similar education-outreach efforts, both current and past, talk with colleagues, search the award databases of major funding institutions like the National Science Foundation, and attend education symposia and sessions at professional society meetings. APS dedicates a very rich portion of each meeting’s program to education. These sessions are a wealth of information and contacts for anyone interested in participating in education-outreach.

Formative evaluation is a process by which you ascertain whether your project or program is working and whether you are on the right track toward achieving the goals you established. Formative evaluation can take a wide variety of forms, depending on your goals, audience, and type of project. As I mentioned, this is a large topic, but several questions to ask yourself at the beginning of designing your evaluation are, "What do I want to find out? What kinds of questions should I ask in order to find out that information? And what kinds of evidence can I realistically gather and measure that would provide me with that information?" If you established strong, clear goals at the beginning of your project, the formative evaluation process will be much easier. Use these goals to frame your evaluation. Collaborators can also contribute to evaluation efforts. Professional educators such as teachers and museum staff are experienced in assessing their learners’ experiences. Take advantage of their knowledge. Evaluation may seem a daunting task, especially for those new to education-outreach, but, again, by working with others and learning from what others have done, it can become manageable and extremely helpful.

A Last Thought: Take Advantage of Passion and Enthusiasm

My final recommendation strays from Vanilla Ice’s lyrics, but is perhaps one of the most vital components of effective education-outreach. As with anything you want to do well, start with areas that you are knowledgeable, passionate, and enthusiastic about. For many of us, education-outreach is "extra", something added to our already full plates of research, administration, service, teaching, mentoring, and more. That, combined with the fact that a person’s passion for their topic is arguably the most important component of an education-outreach effort, makes it crucial that we choose education-outreach projects that play to our strengths, expertise, and passion. Share your passion. It is the most important thing you have to give, even if it is not mentioned in a hip-hop song from 1990.

Acknowledgments

This material is based upon work supported by the National Science Foundation through the University of Wisconsin-Madison’s Materials Research Science and Engineering Center on Nanostructured Interfaces (DMR-0520527), the Internships in Public Science Education program (DMR-0424350), and the Nanoscale Informal Science Education Network (ESI-0532536). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author and do not necessarily reflect the views of the National Science Foundation. The author would also like to thank her numerous collaborators and colleagues for making education and outreach a rewarding and exciting endeavor.

1 Van Winkle, Robert "Vanilla Ice". (1990). "Ice Ice Baby". To the Extreme. SBK Records.

2 National Research Council (NRC). (2009). Learning Science in Informal Environments: People, Places, and Pursuits. Committee on Learning Science in Informal Environments. Philip Bell, Bruce Lewenstein, Andrew W. Shouse, and Michael A. Feder, Editors. Board on Science Education, Center for Education. Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press. Also available online at http://www.nap.edu/catalog.php?record_id=12190 (accessed September 10, 2010).

3 NRC. (2009). p. 4.

4 National Research Council. (1996). National Science Education Standards. Washington, DC: National Academy Press. Also available online at http://www.nap.edu/openbook.php?record_id=4962 (accessed September 10, 2010).

5 American Association for the Advancement of Science (AAAS), Project 2061. (2001 and 2007). Atlas of Scientific Literacy, Volumes 1 and 2. Washington, DC: AAAS. Information available online at http://www.project2061.org/publications/atlas/default.htm (accessed September 10, 2010).

6 National Science Teachers Association. Publications and Products: Overview. http://www.nsta.org/publications/ (accessed September 9, 2010).

7 National Science Board. (2010). Science and Engineering Indicators 2010. Arlington, VA: National Science Foundation (NSB 10-01). Also available online at http://www.nsf.gov/statistics/seind10/ (accessed September 10, 2010).

8 C. Lok. (2010). "Science Funding: Science for the Masses". Nature, 465, 416.

Greta Zenner Petersen is the Director of Education for the University of Wisconsin Materials Research Science and Engineering Center (MRSEC) on Nanostructured Interfaces, where she has been sharing cutting-edge research with non-technical audiences and enjoying the fun of education-outreach for nine years.


Disclaimer: The articles and opinion pieces found in this issue of the APS Forum on Education Newsletter are not peer refereed and represent solely the views of the authors and not necessarily the views of the APS.