It's interesting to note how far electronic resources have come over the past twenty years. There was a time when any information derived from the Internet was regarded with suspicion. Then, as government agencies, journalists, researchers, and various other authoritative parties started to take their place on the electronic frontier, educators began to develop protocols for evaluating electronic resources. We had to do this to help our students differentiate between legitimate, authoritative information and the myriad uninformed opinions that often attempt to disguise themselves as something more.
In an era when students leverage the Internet to support their research, it is critical that they be armed with the tools they need to assess the websites and online articles that find their way into bibliographies and works cited.
A while back, I came up with a simple approach to evaluating electronic sources of information. I called it the Five "A" Approach. There are definitely other approaches to evaluating online resources, but I quite like the Five "A" Approach for its speed and simplicity. The five "A"s that I ask my students to examine in an electronic resource (such as a web page or online article) can be described as follows:
i) Authorship: Is the author identified?
ii) Authority: Is the agency or organization associated with the resource identified?
iii) Age: Is the publication date of the resource identified?
iv) Aim: Is the author or organization associated with the resource aligned with any particular goal or objective that might bias the resource?
v) Authenticity: Does the resource identify the sources of its own information?
If the resource fails any of these five criteria, then its authority and authenticity may be suspect. If it fails two or more criteria, then it should probably be rejected as a potential resource.
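The pass/fail rule above can be sketched as a short script. This is a minimal illustration only: the criterion names, the dictionary format, and the verdict labels are my own assumptions, not part of the published rubric.

```python
# A sketch of the Five "A" scoring rule: one failed criterion makes a
# resource suspect; two or more failures mean it should be rejected.
CRITERIA = ["authorship", "authority", "age", "aim", "authenticity"]

def evaluate(resource):
    """resource: dict mapping each criterion to True (identified) or False."""
    failures = [c for c in CRITERIA if not resource.get(c, False)]
    if len(failures) >= 2:
        return "reject", failures
    if failures:
        return "suspect", failures
    return "acceptable", failures

verdict, failed = evaluate({
    "authorship": True, "authority": True, "age": False,
    "aim": True, "authenticity": True,
})
print(verdict, failed)  # suspect ['age']
```

A class could fill in the dictionary together while examining a web page, making the yes/no nature of each criterion explicit.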
Consider using the Five "A" Approach to evaluating electronic resources the next time your students set out to support their work with online resources.
You can download a PDF version of the Five "A" Approach rubric below.
The 5A Test vs. the CRAAP Test
The CRAAP test is a similar approach to evaluating electronic resources that was developed by the Meriam Library at California State University. This test was published in 2010, a couple of years after I developed and published the 5A Approach on newlearner.com. The CRAAP test also involves five questions for researchers to ask when evaluating the reliability of a given resource, suggesting that researchers examine a resource for its currency, relevance, authority, accuracy, and purpose.
While the categories of the two tests are quite parallel, the 5A test was developed specifically for electronic resources, and tends to be presented as a series of objective questions that can be answered with a "yes" or "no." The CRAAP test, on the other hand, is made for general application to any resource, and is more subjective in its design. You will note that each of the five criteria in the 5A test begins with the words, "The student can identify." The CRAAP test, by contrast, tends to ask a number of "who, what, when, does, do" questions for each of the five criteria.
Ultimately, the fundamental difference between the two tests can be attributed to the audience for which each was originally created. I created the 5A test for secondary school students, and so each criterion poses a single, objective question that can more or less serve as an acid test. The Meriam Library, on the other hand, developed the CRAAP test for university students, and so its five criteria have a more general application while also requiring a higher degree of subjective evaluation on the part of the student.
A field study is an organized, well-considered effort to collect data from a real-world environment. Field studies generate primary research data through recorded observations, interviews, or surveys. They are not to be confused with experiments, in which data is generated within a controlled setting, such as a lab.
The objective of any field study is to make inferences about what happens in the real world.
The Field Study Extended Project
In the Field Study Project, each student (or pair of students) would propose, design, and conduct some form of original field study. The field study must explore a phenomenon that is related to a course's overall topic of study (e.g., law, sociology, or economics). The actual field study (i.e., the data collected) would form the basis of a scholarly journal article that can later be published in a journal that the class produces and posts online.
The Field Study extended project includes the following phases:
I) Orientation: Students familiarize themselves with the idea of field studies by reading field study articles curated by the course instructor.
III) Ethical Review
Structure and Length
Each field study component (write-up, presentation, and journal article) should conform to the following guidelines:
Write-Up: There is no prescribed lower or upper word limit for the study write-up. However, this study should be well considered and well implemented so that it can serve as the foundation for both your presentation and your scholarly article.
Presentation: Ten minutes.
Scholarly Article: Maximum of 1500 words, including the front matter. (Front matter for a journal article generally consists of a title, abstract, keywords, and the names of the author or authors.) This is an article intended for publication within a scholarly journal, so it should be representative of the student's very best writing. Single-spaced, in a 12-point font, the article should take up three pages within the scholarly journal.
The Correlation Study Icebreaker: Learn about the tools of social science and break the ice in a single class!
Looking for an Icebreaker for your first social science class?
I've used a certain icebreaker exercise in my economics classes for years, and it never ceases to amaze me. I first go over the basics of how field studies and investigative science work: how scientists will propose hypotheses, gather data, identify correlations, and then attempt to explain causal relationships.
I then ask the students (alone or in groups, depending on the class size) to develop a hypothesis that they might be able to examine by just studying the students in our class. Each student (or group) must then interview each and every student in the class in order to collect the two variables that they wish to examine. Do their classmates have any siblings? How tall are they? Do they wear corrective lenses? Do they wear a watch? How many languages do they speak? The possibilities are endless.
The students plot the data they collect on a graph, and then present both their hypothesis and their findings to the class. The study / icebreaker portion of this exercise can take place in a single 80-minute class, and the presentations can generally be completed in the next class.
Google Spreadsheets Serve Up Excellent Scatter Graphs in Three Easy Steps
Google spreadsheets provide a particularly quick and easy way to illustrate a correlation between variables. If you have access to Google Apps in your school, then a Google spreadsheet can plot the data points and illustrate correlations in three easy steps:
i) Set up three adjacent columns: the first column being for the names of the students interviewed (so the interviewer can track who she has and hasn't interviewed), and the next two columns being for the two variables that are being analyzed.
ii) Highlight just the two columns of data (without any names), and then click on the "Chart Wizard" button. You will see a variety of chart options, but click on "more" chart options to find the scatter graph option. (You must select the "scatter" graph option to plot correlations between variables.)
iii) Click on the "Customize" tab and then scroll down to set your chart title, name your X and Y axes, and even generate a line of best fit. (The line of best fit is a particularly handy feature of Google spreadsheets that Google had previously been criticized for not including. As you can see, Google Apps are constantly evolving.)
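For classes with access to Python rather than Google Apps, the same two quantities the spreadsheet produces, the line of best fit and the strength of the correlation, can be computed directly. This is a sketch under my own assumptions: the sample data (languages spoken vs. hours of weekly reading) is invented for illustration, and a plotting library such as matplotlib could be layered on top to draw the scatter graph itself.

```python
import math

def best_fit_line(xs, ys):
    """Least-squares slope and intercept for paired data."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two variables."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    syy = sum(y * y for y in ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    num = n * sxy - sx * sy
    den = math.sqrt((n * sxx - sx * sx) * (n * syy - sy * sy))
    return num / den

# Hypothetical class data: languages spoken vs. hours of weekly reading.
languages = [1, 2, 2, 3, 1, 2]
reading_hours = [2.0, 4.5, 3.5, 5.0, 1.5, 4.0]

slope, intercept = best_fit_line(languages, reading_hours)
print(f"line of best fit: y = {slope:.2f}x + {intercept:.2f}")
print(f"correlation r = {pearson_r(languages, reading_hours):.2f}")
```

Seeing the formulas spelled out this way can also help students understand what the spreadsheet's "trendline" button is actually doing.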
These mini-studies are a great way for students to meet and learn about each other while also exploring the tools of investigative science. Every class inevitably finds itself exploring issues of correlation, causality, sample bias, split effects, and even post-hoc fallacies.
Try this the next time you're looking for a way to break the ice in your social science course, and let me know how it goes.
Please note: While computers help, you don't need computers to do this exercise. I did this exercise for years before my school became a laptop school. You can download a PDF below that will facilitate a pen and paper version of this exercise.
Over the years, I've seen my students make so many amazing discoveries right before my eyes. For example, did you know that people who wear watches tend to enjoy greater academic success in school? How about this one: Did you know that blue-eyed people tend to wear corrective lenses less than brown-eyed people? Finally, would you believe that people who speak two languages tend to do better in school than people who speak one - or even three - languages? These are just a few of the incredible findings that my students have unearthed during this exercise. While these are just correlations, not causations, they are still pretty amazing discoveries.
The New Learner Lab
Exploring the ever-changing, often challenging, and always controversial world of teaching.