21 October 2013

Socrative 2.0 - "Clicker" Platform Gets Better

I've been a huge fan of the iOS/Android/web app Socrative as a replacement for the clickers in my classroom since the first week I had iPads (January 2011).

I've used it for exit tickets, bell-work, chapter reviews, guided practice during a lecture, reflection on learning, and even a semester final. My students have ALWAYS complained that if they made a mistake or wanted to skip around, there was no "go back" functionality within Socrative. As flashy and fun as the app is, I've always leaned more heavily on Google Forms to give my students that flexibility.

I gave a presentation last week on replacing the Scantron with digital tools that give you richer, quicker feedback and allow you to assess more than multiple choice. I love Socrative as a tool to close what I called the "feedback gap": the time between when you give an assessment, grade it, and pass it back. What I didn't know was that Socrative had a new version in beta that addresses the "go back" functionality I mentioned above! Both you and your students can access the beta version of Socrative 2.0 at beta.socrative.com

The UI is fresh, but there's more here than that!
Rather than write anything more, I decided to run through Socrative 2.0 briefly and share some of the new teacher and student features.

This video screencast walks through:
Administering a saved quiz - choosing the quiz and setting it to student-paced
A student taking the quiz, going back through their work, and editing responses (Socrative 1.0 did not allow editing or skipping around)
A student submitting a response
Viewing specific student responses and results (Socrative 1.0 only showed a student's score)
Finishing the saved quiz and getting a report


I last wrote about Socrative soon after the "insert image into a question" function went public. That feature was much needed, but I appreciate the new student pacing options in Socrative 2.0 even more.

07 October 2013

Writing in Math: Modeling is Powerful


My students encounter writing most in my AP Statistics class. Because I have a responsibility to prepare them for an AP exam that will require them to justify the statistical tests they conduct, the conclusions they make, and the observations they draw from graphs, data sets, or computer output, I have no choice but to engage them in writing tasks.

Students will not learn to write technically on their own. It's unnatural. It's "hard."
For the majority of my students, AP Statistics is their first exposure to sustained, technical, descriptive writing. For the most part they've had short-answer responses on quizzes in other courses that required them to explain why they think their answer is reasonable, or how they came to their answer, but I find myself stretching and pulling all of first quarter to get these kids to write more than a sentence per prompt question.

Consider the following histogram of a roughly symmetric, normal-ish distribution (if I'm losing you there, just know that this bar graph should be symmetric with a high peak in the center and long tails to both positive and negative infinity):

A pretty common prompt in the section where normal curves are introduced would ask something like, "Describe the shape of the distribution." Students usually feel pretty good about themselves if they remember to point out that it's symmetric and has a single peak. If they're getting frisky, they'll mention the tails off to the left and right, and point out that because the distribution is not skewed (with the data clumped to the right or left and a long tail off to one side), we know the mean and median would be roughly equal in value, in the middle of that peak.
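
For anyone who wants to see that mean/median claim concretely rather than take it on faith, here is a minimal sketch in Python (purely illustrative, with simulated data I made up for this post, not anything from my class materials) comparing the two measures of center for a symmetric data set and a skewed one.

```python
import numpy as np

# Simulate a roughly symmetric, normal-ish data set
# (the parameters here are made up purely for illustration).
rng = np.random.default_rng(seed=42)
symmetric = rng.normal(loc=0.0, scale=1.0, size=1000)

# For a symmetric, single-peaked distribution the mean and median
# land in nearly the same place: the middle of the peak.
print("symmetric mean:  ", round(float(np.mean(symmetric)), 3))
print("symmetric median:", round(float(np.median(symmetric)), 3))

# A right-skewed data set pulls the mean out toward the long tail,
# which is exactly why the skew/no-skew observation matters.
skewed = rng.exponential(scale=1.0, size=1000)
print("skewed mean:  ", round(float(np.mean(skewed)), 3))
print("skewed median:", round(float(np.median(skewed)), 3))
```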

I've come to expect these habits early in the year, so I lean heavily on giving positive feedback for effort (they wrote something), and I always give a more specific example of how I would have refined what they said, or I read from my solution manual (sometimes even correcting the manual if I think it could have been more specific).

Students cannot know good technical writing without reading technical writing. 
Ironically, I think your textbook is a good place to start, because every section has passages you can easily pull that attempt to succinctly explain vocabulary or walk students through a procedure.

You hope there comes a time in every student's educational career when they stop filling their writing with flowery words that don't mean anything and simply get to the point, especially in technical writing. However, the pendulum soon swings the other way, and students write much less than they should, forcing the reader to assume much of the knowledge the student should be demonstrating.

To refer to the histogram again, I think a student who had not been introduced to statistical terminology, given a prompt of "Describe the histogram" or "Describe the characteristics of this data set," would probably write too much and still not get across the point about symmetry and the placement of the mean/median: "There are 12 bars on the graph. The first one starts at -3 and goes up a little bit past 0.0..."

I've written all of this so I could share how embarrassingly inspired I was reading this report on a recent Pew Research survey about the Affordable Care Act and the role of Republicans vs. the President in the shutdown.

The writer of the report spent several paragraphs describing the survey's methods, the lengths the survey designers went to in order to eliminate response bias from the participants, and the bias they were unable to eliminate through those methods. It's just a lot of good, descriptive writing that will be great for the Experiment Design chapter in the AP Stats curriculum.
"The analysis in this report is based on telephone interviews conducted October 3-6, 2013, among a national sample of 1,000 adults 18 years of age or older living in the continental United States (500 respondents were interviewed on a landline telephone, and 500 were interviewed on a cell phone, including 250 who had no landline telephone). The survey was conducted by interviewers at Princeton Data Source under the direction of Princeton Survey Research Associates International. A combination of landline and cell phone random digit dial samples were used; both samples were provided by Survey Sampling International. Interviews were conducted in English. Respondents in the landline sample were selected by randomly asking for the youngest adult male or female who is now at home. Interviews in the cell sample were conducted with the person who answered the phone, if that person was an adult 18 years of age or older. For detailed information about our survey methodology, see: http://people-press.org/methodology/. 
The combined landline and cell phone sample are weighted using an iterative technique that matches gender, age, education, race, Hispanic origin and region to parameters from the 2011 Census Bureau’s American Community Survey and population density to parameters from the Decennial Census. The sample also is weighted to match current patterns of telephone status, based on extrapolations from the 2012 National Health Interview Survey. The weighting procedure also accounts for the fact that respondents with both landline and cell phones have a greater probability of being included in the combined sample and adjusts for household size among respondents with a landline phone. Sampling errors and statistical tests of significance take into account the effect of weighting. The following table shows the unweighted sample sizes and the error attributable to sampling that would be expected at the 95% level of confidence for different groups in the survey: 
 
Sample sizes and sampling errors for other subgroups are available upon request.
In addition to sampling error, one should bear in mind that question wording and practical difficulties in conducting surveys can introduce error or bias into the findings of opinion polls.
"