## November 04, 2014

### I'm a little bit stuck - where did I go wrong?

I was asked by a colleague why the amplitude of the tangential modes of a ring or cylinder scales as $$\cos(m\theta)/m$$.

The answer was given by Rayleigh in his Theory of Sound, which you can read on The Internet Archive. Here's the relevant part:

The relevant solution to the differential equation (1) in Rayleigh for the tangential component has the $$\cos(m\theta)/m$$ behavior.
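For readers following along, here is a rough sketch of where the $$1/m$$ comes from, assuming the standard inextensional-vibration treatment of a thin ring (my sign conventions may not match Rayleigh's):

```latex
% For a thin ring of radius a with radial displacement w and
% tangential displacement v, inextensional vibration means the
% circumferential strain of the centerline vanishes:
\[
\epsilon = \frac{1}{a}\left(\frac{\partial v}{\partial \theta} + w\right) = 0
\quad\Rightarrow\quad
w = -\frac{\partial v}{\partial \theta}.
\]
% If the radial part of the mode is w = A\cos(m\theta), then
\[
v = -\frac{A}{m}\sin(m\theta),
\]
% so the tangential amplitude is smaller than the radial amplitude
% by a factor of m.
```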

But, I wasn't able to start with Rayleigh's coordinates and end up with his equation (1).

Here's my work; where did I go wrong?

## October 23, 2014

### New Physics Drawings File: Springs

I added a new file to the Physics Drawings page. The file has my interpretation of stretched springs for use in physics homework, assessments, or presentations. All the files open in LibreOffice and compatible applications. They are CC licensed, so go nuts with using them.

## August 02, 2014

### AAPT Summer meeting twitter analytics #aaptsm14

The Summer meeting of AAPT 2014 has come and gone.  It was fantastic to get to engage with so many incredible teachers, to hear what they are doing in their classrooms, to catch up with old friends, and make some new ones.

The use of twitter at the AAPT conferences has grown over the past few years. Rhett had a post at the end of the 2013 Winter meeting in New Orleans which showed only 116 tweets using the conference hashtag.

This past meeting, there were over 1500 uses of the conference hashtag (#aaptsm14) according to Tweetarchivist. Using the tools from Tweetarchivist and checking some other analytics, I’m guessing that there were at least 42 users who used the conference hashtag to post to twitter, not counting users who retweeted a post containing the hashtag.

75% of the tweets came from the top 25 users.  Who was the most prolific tweeter?  @LCTTA had 202 tweets (representing just over 13% of all tweets), compared to AAPT’s official account, which had a respectable 145 tweets (9.6%). Third place went to @rutherfordcasey with 97 tweets (6.4%).  The top 25 users are displayed below. There were a LOT of great tweets from many attendees (and many who were following from home). Check out some of these physics teachers and consider following them!

I made a word cloud of the most popular words used. By far the most popular word was “students” if I counted the twitter abbreviation “Ss”. Also, “learning” was slightly more popular than “teaching”.
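For anyone curious, the word counts behind a cloud like this can be reproduced in a few lines of Python; the tweet texts below are made-up stand-ins for the real Tweetarchivist export:

```python
from collections import Counter
import re

# Hypothetical sample tweets; substitute the real archive of #aaptsm14 posts.
tweets = [
    "Ss are learning so much in this session #aaptsm14",
    "Great ideas for teaching energy to students #aaptsm14",
]

counts = Counter()
for tweet in tweets:
    # Lowercase and split into word-like tokens (letters, digits, apostrophes).
    for word in re.findall(r"[a-z0-9']+", tweet.lower()):
        counts[word] += 1

# Fold the common Twitter abbreviation "Ss" into "students".
counts["students"] += counts.pop("ss", 0)

print(counts.most_common(5))
```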

What about the use of twitter through the week? The number of tweets per day dipped in the middle of the conference, then rebounded strongly on the last day:

*Screenshot from Tweetarchivist*

The most popular phone used to tweet was the iPhone (no idea what model). I’d love to give props to the attendees who were using either a Blackberry (2 tweets posted) or a Windows Phone (1 tweet posted), but I have no idea which users those were. The breakdown of clients is shown below. I combined the iPad and iPhone results, although the iPhone posts dominated.

| Client | Tweets |
| --- | --- |
| iOS (all clients) | 610 |
| Tweetdeck | 259 |
| Android (all clients) | 227 |
| Web interface | 152 |
| Hootsuite | 114 |
| Other | 148 |
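If you want to reproduce the percentages, here is a quick sketch using the client counts reported above:

```python
# Client counts as reported by Tweetarchivist for #aaptsm14.
clients = {
    "iOS (all clients)": 610,
    "Tweetdeck": 259,
    "Android (all clients)": 227,
    "Web interface": 152,
    "Hootsuite": 114,
    "Other": 148,
}

total = sum(clients.values())  # 1510 tweets in all

# Print each client's share of the total, largest first.
for name, count in sorted(clients.items(), key=lambda kv: -kv[1]):
    print(f"{name:22s} {count:4d}  ({100 * count / total:.1f}%)")
```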

Finally, for fun, I have the top mentions of users as well as the other popular hashtags used in posts also tagged with #aaptsm14. The most-mentioned user was @AAPTHQ with 118 mentions. This makes sense, as there were many people tweeting questions at AAPT. The next most-mentioned user was @rutherfordcasey, who was organizing many of the meet-ups as well as tweeting about great sessions he was in. The popular hashtags used included #perc2014 and #modphys.

Top users by posts they made:

| User | Count |
| --- | --- |
| LCTTA | 202 |
| AAPTHQ | 145 |
| rutherfordcasey | 97 |
| SciEdHenry | 92 |
| drmagoo | 90 |
| eigenadam | 77 |
| TRegPhysics | 57 |
| achmorrison | 55 |
| dyanlj | 39 |
| fnoschese | 37 |
| SteveMaier_ | 30 |
| distractons | 26 |
| MartaStoeckel | 26 |
| QuantumTweep | 21 |
| UniverseAndMore | 20 |
| SJDJ | 19 |
| OpenStax | 18 |
| MsPoodry | 17 |
| Cabertram92 | 16 |
| ng_Holmes | 16 |
| arundquist | 16 |
| rjallain | 15 |
| danny_doucette | 15 |
| astronomatty | 14 |
| MnSTA1 | 13 |

Mentions of users in posts:

| User mentioned | Count |
| --- | --- |
| @AAPTHQ | 118 |
| @rutherfordcasey | 114 |
| @EEtkina | 65 |
| @LCTTA | 58 |
| @SciEdHenry | 54 |
| @achmorrison | 52 |
| @eigenadam | 43 |
| @bohacekp | 37 |
| @TRegPhysics | 33 |
| @MartaStoeckel | 31 |
| @drmagoo | 30 |
| @dyanlj | 28 |
| @kellyoshea | 22 |
| @UniverseAndMore | 22 |
| @arundquist | 21 |
| @distractons | 18 |
| @MsPoodry | 17 |
| @chrisgoedde | 17 |
| @leetramp | 17 |
| @ng_Holmes | 17 |
| @MrBWysocki | 17 |
| @UMNews | 16 |
| @jossives | 16 |
| @sciencegeekgirl | 13 |
| @phyzman | 13 |

Popular hashtags used along with #aaptsm14:

| Hashtag | Count |
| --- | --- |
| #perc2014 | 38 |
| #modphys | 34 |
| #physicsed | 31 |
| #ngss | 15 |
| #newbies | 11 |
| #scavengerhunt | 11 |
| #scied | 10 |
| #msum | 8 |
| #physics | 6 |
| #tweetup | 6 |
| #directmeasurementphysicsvideos | 5 |
| #shameless | 5 |
| #minneapolis | 4 |
| #dmvideo | 4 |
| #perc14 | 3 |
| #gamedev | 3 |
| #dbir | 3 |
| #edcampmsmn | 3 |
| #umn | 3 |
| #arduino | 3 |
| #womeninstem | 3 |
| #math | 3 |
| #scaleup | 3 |
| #edtech | 3 |
| #starwars | 3 |

## August 01, 2014

### Words matter (was What does "brick and mortar" make you think of?)

(Note: I posted a version of this originally on July 31, 2014. It was accidentally deleted, and I was unable to recover the original post. I rewrote the missing parts and added an epilogue at the end.)

The PERC conference kicked off with an interesting talk by Mike Dubson from CU Boulder. There is a great summary of the talk by Stephanie Chasteen over at her blog. The talk was about comparing a traditional large lecture-hall physics class to a MOOC of identical content.

I was struck, however, by the repeated use of the term "brick and mortar" to refer to the face-to-face course. To me, the terminology immediately invoked a business metaphor, with the student as a customer and the professor as a service provider. I asked the presenter if that was an intentional choice of words, but I don't believe that my question was understood before he chose to move on. (Incidentally, I specifically remembered him using the word "customer" to refer to a student enrolled in the MOOC, but he claimed he did not use that word.)

The use of "brick and mortar" and "customer" to refer to a learning environment reminded me of an advertisement for a faculty position at a college that demanded the candidate be able to provide excellent customer service. David Perry wrote a series of posts about it, which you can read on his blog.

My take-away from Perry's articles was that the words we choose to use have an impact on how we act. If we believe education is more than a business transaction, then we must not use words which make it easier for administrators, students, other faculty, parents and the public to expect that sort of relationship with us.

Prof. Dubson had earlier in his talk stated his belief that education is the process by which our collective knowledge and understanding is passed down from generation to generation. At the end of his talk he concluded by telling us about the elementary school teacher who inspired in him a love for reading. He said that those types of teachers cannot be automated or replaced by a computer. His passionate love for great education was clear.

If the type of teacher that drives great education cannot be automated, then it certainly can't be reduced to the simple transactional relationship that words such as "brick and mortar" or "customer" cue us to expect. Nor can the responsibility of passing down the world’s collective knowledge, understanding, and culture be simply bought like a box of tissues with a click of a button or a swipe of a credit card.

Words matter. We need to be intentional about using them.

Epilogue

I was privileged to be able to co-host a discussion session at PERC which was about how the physics education and physics education research community should be thinking about implementing what we called in the abstract “Competency-Based Assessment” but which goes by many names. I was honored that Eugenia Etkina (and so many other people!!) showed up for this discussion.

After listing many of the names by which the assessment strategy is called, I had intended to switch to the more familiar “Standards Based Grading” (SBG) name for our session. Prof. Etkina raised her hand and pointed out how we know that grades and grading cause stress for our students and are not at all what we want to emphasize. She pointed out that the standards are measured by assessments.

I saw immediately the parallel between what she was pointing out to me and what I had been asking the speaker on the day before. Words matter. Even if I don’t intend to use the term “standards based grading” with my students it will better form our thinking if we remind ourselves that we are intentionally taking the emphasis off of grading. In our session we immediately switched to using the term “Standards Based Assessment and Reporting” (SBAR) which I intend to use exclusively going forward.

Words matter. Changing our use of words can be done; we just have to be intentional about it.

## April 21, 2014

### Frustration and failure - things to capitalize on to facilitate learning

There's a great answer to the question on Quora: "Why do we get frustrated when learning something?"

There's much more to the answer on Quora. Hopefully you can see all of it if you click through.

## April 17, 2014

### Dynamically updated figures - real data! Looking for more.

There are a number of graphs that I like to use in astronomy class which are based on historical data.  Over the years, the graphs have become a bit dated and I needed to find new copies of them.  I then discovered that some of these are kept up-to-date online at all times.  Very cool!  Here's one that I discovered, but I'm really looking for more examples of these.

 http://solarscience.msfc.nasa.gov/images/bfly.gif

I really thought that I had more examples of these - images that are dynamically updated, but that the url for the image stays the same. Now I can't seem to find any more.  Anyone know of any others?

## April 14, 2014

### Thoughts on mindset vs. grit

A recent story on NPR caught my attention. The story is about schools teaching "grit" and whether or not it can be done. My introduction to the concept of grit came from a really great episode of This American Life from 2012. Even though grit is only mentioned by name twice in the episode, there was quite a bit of discussion on non-cognitive traits and their importance to learning. My interest in the episode is summed up best by this line:

"Non-cognitive traits like grit and self-control are even more important in college than in high school."

How best to encourage the best non-cognitive traits leading to success in college?  In the NPR piece, Alfie Kohn makes a great point: persistent people persist.

I'm a big believer in the Dweck model of mindsets: fixed vs. growth mindsets. I work to cultivate growth mindsets in my students. It's not easy. It would be great to add grit to my students' toolbox for success in college. I watched Duckworth's TED talk hoping she would have some research to present that would be useful for me to use with my students. Here's her TED talk:

If you watched the talk, you may have noticed that the only research cited was Dweck's work on mindset! The talk is over a year old, so maybe there is new work on grit that I'm not aware of.

I spent a lot of time thinking about these questions over the last few weeks. How can mindset be such a solid concept and grit sound great but have easy criticisms?

Leave it to Dr. Tae to answer my questions in less than 140 characters:
Simple, right? There's nothing WRONG with encouraging grit. It's just not as effective as building the growth mindset. Thanks, Tae!

## April 10, 2014

### Standing on the shoulders of SBG greatness

I've done a lot of reading on the implementation of standards-based grading (SBG) in physics classes. I often tell people I meet that most of the SBG classrooms I know of are in the high schools. I can point to Frank Noschese, Kelly O'Shea, Geoff Schmit, and Shawn Cornally as SBG experts who have successfully used SBG in their classes and share resources online.

Looking online for resources for doing this at the college level has turned up far fewer, at least in my experience. But I do want to acknowledge the great SBG users at the college level who have helped me along my way towards using SBG. These include, though are not limited to:

Ian Beatty
Joss Ives
Todd Zimmerman
Andy Rundquist
Rhett Allain

I'm linking above to resources that they have all posted which have helped me start to focus my plans and methods for how I'm implementing SBG in my classes.

Others with whom I've had great conversations about SBG include Heather Whitney from Wheaton College, Chris Goedde from DePaul University, and Matt Harding, a teacher I went to college with. Thanks to all for helping me figure things out. I couldn't have gotten this far without you.

## April 07, 2014

### Drafting standards for algebra-based intro physics at a two-year college

Last weekend I was at the Illinois Section AAPT meeting, where I gave a presentation of my foray into Standards-Based Grading. My main points in the presentation were that I have observed:

a.) Most of the people who try SBG the first time write too many standards initially

b.) It's really hard to find a list of standards used in college physics classes online

I've been drafting a set of standards that I would feel comfortable using for a first semester physics class. To address the first point from above, I've whittled it down to 18 standards, although several have multiple parts to them.

I believe that I can assess these standards in chunks of less than 18 assessments. I am aiming for 13-14 nominal assessments with the opportunity for re-assessments on any of them.

I am also working on an as-yet-unwritten lab standard (or standards), which I will likely need help with.

To address the second point from my talk, I'm putting the draft up here for review from the community. I would love to see a discussion of physics faculty from all levels getting involved on building a set of standards that work well. (Not that I want the standards to be, uh....standardized on any level beyond a classroom....)

Here are my draft standards for first semester intro physics, algebra-based. We move oscillations and sound to the second semester, in case you're wondering why they don't appear. Thank you (in advance) for any thoughts you have on them.

Physics 101 Standards (Draft Spring 2014)

1.) I can interpret and construct graphs of objects in 1-D motion.

2.) I can apply a logical problem-solving process to model the motion of objects moving in 1-D.

3.) I can resolve vectors into their components.

4.) I can add and subtract vectors graphically as well as by components.

5.) I can recognize situations described by projectile motion and apply an accurate model of the 2-D motion to determine unknown quantities.

6.) I can apply Newton’s laws of motion for objects in equilibrium as well as objects in motion including:

a.) single objects
b.) connected objects
c.) objects in contact with a spring
d.) objects in circular motion

7.) I can recognize situations where the Work-Kinetic Energy theorem applies, and be able to solve problems using the theorem.

8.) I can recognize situations where the conservation of energy principle is appropriate and be able to apply the principles to those situations including:

a.) objects under the influence of a gravitational field
b.) objects in contact with a stretched or compressed spring

9.) I can identify situations where impulse is used and correctly apply the momentum-impulse theorem.

10.) I can identify situations where conservation of linear momentum is appropriate and correctly apply the conservation principle to those situations including:

a.) elastic collisions
b.) inelastic collisions

11.) I can evaluate (graphically and analytically) the quantities of rotating objects in terms of the linear kinematic equivalents including:

a.) angle
b.) angular velocity
c.) angular acceleration
d.) moment of inertia

12.) I can apply the conservation of energy principle to rotating objects.

13.) I can apply Newton's second law for rotational motion for

a.) objects rotating
b.) objects in static equilibrium

14.) I can identify situations where materials are subject to thermal expansion and be able to calculate the change in their length, area or volume.

15.) I can determine the equilibrium temperature when materials of different initial temperatures are brought into thermal contact with each other.

16.) I can differentiate between conduction, convection and radiation mechanisms.

17.) I can apply the ideal gas law and the results of the kinetic theory of gases to calculate properties of gases.

18.) I can determine the energy transferred by heating required to change the temperature of material and cause materials to change phases.

## March 19, 2014

I've been taking baby steps towards standards-based grading (SBG) for over two years, but this semester is the first time that I've really implemented the core ideas of SBG in any of my classes.

Last Fall I had (with my colleague) the opportunity to rewrite learning outcomes for our introductory astronomy course, ASTR101.  This is a general education survey of astronomy course without a laboratory. It is 3 credit hours and covers the solar system, stars, and galaxies.

Our campus assessment specialist pushed us to look at the revised Bloom's taxonomy word list to come up with descriptors for what we wanted our outcomes to be.  I really don't like how our campus uses the Bloom's taxonomy, but my opinions are a topic for another time.

After the outcomes were written and approved, I realized that I could implement them almost unchanged as standards for a real step towards SBG.

Here's what I did:

I went from 3 exams plus a final to no midterm exams, but nearly weekly quizzes. Each quiz is "scored" on a 0-5 point scale which measures the mastery of the standard being assessed. Students have optional homework assignments on MasteringAstronomy (which, by the way, is NOT optimized for SBG) but they are required to do the homework if they want to re-assess by retaking the quiz. If they want to retake the quiz for a third time, they have to visit my office for a discussion about the standard before they are allowed a third shot.  After the third try, the standard is closed.

Grades are weighted - 40% is based on a semester-long astrojournal, 35% is the SBG-style quizzes, 10% is a Just-in-Time-Teaching style reflection/reading review that students submit online, and 15% is a cumulative final.
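As a sanity check on the weighting, here is a tiny sketch of how a final grade works out under this scheme; the component scores are hypothetical:

```python
# Course grade weights as described above (they sum to 100%).
weights = {"astrojournal": 0.40, "quizzes": 0.35, "reflections": 0.10, "final": 0.15}

# Hypothetical component scores for one student, each on a 0-100 scale.
scores = {"astrojournal": 90, "quizzes": 78, "reflections": 95, "final": 82}

# Weighted sum gives the overall course grade.
grade = sum(weights[k] * scores[k] for k in weights)
print(f"Weighted course grade: {grade:.1f}%")
```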

So far, I've had a fairly positive experience with this in astronomy.  I should write down my workflow for getting all the assessments prepared and scored.  I have had some students come in for reassessments.  I am expecting to see more as the semester progresses.

What I could really use is a bit of feedback on how the standards are written.  I can't change the learning outcomes, but I can tweak the standards if appropriate.

Some standards are broken into multiple parts so that I have the option, if necessary, to break them out into multiple assessments. The goal was to have no more than 15-16 assessments. Here are the standards as I wrote them out:

1) Explain how astronomical objects move in the sky.

2a) Explain the cause of the seasons

2b) Explain the cause of moon phases.

2c) Explain the cause of eclipses.

3) Describe how the heliocentric model of the solar system was developed and why it was adopted over the geocentric model of the universe.

4a&b) Apply Kepler's Laws of orbital motion and Newton's Law of Universal Gravitation to objects in the universe.

5) Describe the functions of a telescope and types of telescopes and explain why some telescopes are placed on the ground and some in space.

6) Explain how astronomers use light to determine:
a.) the luminosity of stars,
b.) the temperature of stars,
c.) the size of stars,
d.) the chemical composition of astronomical objects,
e.) and the speed and direction of an astronomical object's motion.

7) Describe the nature of our solar system and how it was formed.

8) Explain how astronomers use the Hertzsprung-Russell diagram to study properties of stars.

9) Describe how stars are formed, evolve and die.

10) Describe the structure and size of the Milky Way galaxy.

11) Compare the Milky Way galaxy to other galaxies.

12) Explain how astronomers know that the universe is expanding and how they determine the age of the universe.

## March 18, 2014

### Distances to brightest naked eye stars

I saw this recent xkcd comic, and had to figure out how many of the naked eye stars are more than 1000 light-years away.

It took me awhile to find a star catalog that was easy to search which also had both the apparent magnitudes and the distances to the stars, but I was able to locate a database of over 87,000 stars in CSV format.

First I found all the stars with magnitude 6.0 or brighter. That narrowed the list down to just over 5,000 stars. Putting the distances into plot.ly, I created this histogram:

Each bar represents a bin of width 100 parsecs. My interest in stars more than 1000 light-years away (about 307 parsecs) means I have to look at the stars more than 300 parsecs away. I added up the stars in the first three bins (0-300 parsecs), which represented about 87% of all the visible stars. So my estimate of the fraction of naked eye stars which are 1000 light-years away or more is about 13%.

So, assuming on a clear night (no moon, ideal viewing conditions) I could see somewhere between 2000-3000 stars total, only about 250-400 stars would be more than 1000 light years away.
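In case anyone wants to repeat the estimate with their own catalog, here is a sketch of the calculation. The column names `mag` and `dist` are assumptions; adjust them to match whatever star catalog CSV you actually have.

```python
import csv

def fraction_beyond(path, limit_mag=6.0, cutoff_pc=306.6):
    """Fraction of naked-eye stars (apparent magnitude <= limit_mag)
    farther away than cutoff_pc parsecs (1000 light-years is ~306.6 pc).

    Assumes a CSV with 'mag' and 'dist' columns, distances in parsecs;
    rename the keys below to match your catalog.
    """
    visible = far = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                mag, dist = float(row["mag"]), float(row["dist"])
            except (KeyError, ValueError):
                continue  # skip rows with missing or malformed data
            if mag <= limit_mag:
                visible += 1
                if dist > cutoff_pc:
                    far += 1
    return far / visible if visible else 0.0
```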

Of course, looking through a telescope changes that figure completely.

## March 16, 2014

### Suggesting twitter to high school teachers did not go well

Recently I had the opportunity to have dinner with high school science teachers from our college's district.  Our department hosted what we call a Science Dinner, which was an open house and a meal after the teachers had a chance to visit labs in the department.

During dinner, the conversation drifted from one topic to the next, including how schools were implementing Next Generation Science Standards, how to implement AP Physics, and the lack of funds for professional development.

I asked the teachers (including a department head) if they had ever thought of exploring online options for professional development such as twitter or facebook. I explained that there are teachers from all over the country on twitter who are asking similar questions and discussing issues which traditional professional development funds would typically cover.  I offered to put all the physics teachers in touch with physics teachers on twitter all over the country if they were interested.

The teachers all listened politely and said that they had never considered online professional development. One of them said that another teacher at her school had quit last year over an interaction that happened on social media. Another teacher said that she would never want to be on twitter because her students might be on twitter, and she would not want to interact with them online.

I was stunned at how quickly the conversation turned to pointing out the dark sides of social media. I had this naive idea that interacting with other teachers from around the country would be really attractive. I guess I underestimated the fear of the unknown.

Next time I have the opportunity to plug social media, I'm going to suggest starting with The Global Physics Department first.  Perhaps that is an avenue to getting teachers to interact with each other online.

## January 07, 2014

### What am I trying to encourage with my exam policies?

I'm trying to figure out how to handle giving quizzes and exams next semester in my algebra-based physics courses, my intro astronomy course, and my general education course in musical acoustics.

There has been much talk online recently about Standards Based Grading (SBG) and related assessment strategies. I'm not diving fully into the SBG waters, and currently my issue isn't directly related to going towards SBG.  The reason I mention SBG is to give some context.

### Introductory Physics Courses

A few years ago when I learned about SBG, I sort of had the wrong idea of how it was supposed to be implemented. I liked the philosophy which allowed for students to learn at their own pace and to be reassessed on understanding of the standards.  I also liked the idea of using student-made screencasts (Thanks to Andy Rundquist for leading me down this path) as assessment methods.  Because I get to hear the students explain the physics in their own words, I can really find out what they understand and what they are simply regurgitating from class or the book.

Grades in intro physics are made up of the following parts: online reflections of what they did in class and read in the textbook, screencasts done for homework, lab reports, weekly quizzes, midterm exams and a final. I consider the lab reports to be drafts which can be corrected and submitted until they are satisfactory. I also consider the screencast homework assignments to be practice for taking quizzes and exams, so I provide feedback on the screencasts and allow them to be resubmitted as many times as needed until correct.

Quizzes and exams are done in a traditional way - all the students spread out as far away from each other in the classroom and work independently on the quiz or exam for a set amount of time. For quizzes, I provide relevant (and sometimes not-so-relevant) equations, but for the exams students prepare their own equation sheet. I usually give 20 minutes for a quiz and 2 hours for an exam.

### General education courses - Intro Astronomy and Physics of Sound, Music and Hearing

In the gen ed courses I do not use screencasts. The only homework that the students are required to do is the classroom reflections. Astronomy is not a lab course, so there are no lab reports, but they do have to do a semester-long astronomy journal project. In the acoustics class, students design and build their own musical instrument.  I'm pretty happy with those parts of the grading process.

But the exams are something else. Again, I have typically given "traditional" type exams where all students work independently. I typically supply equations for these classes.

There is a pattern that has started to emerge over the last few semesters in astronomy. The first part of the pattern is that on the first exam the class average lands somewhere in the mid-60% to mid-70% range. For many students it is shockingly low. However, in the 10-ish years I've been teaching the class, the average on the first exam has never strayed far from this mark. Typically we then have a discussion of how now they know how the exam will be structured (even though we discussed it thoroughly beforehand) and that they should think carefully about what changes they need to make in preparing for the next exam. I've also been weighting the first exam less than later exams in recent years to try to alleviate concern that their grade is sunk after one poor exam.

The next part of the pattern is that on the second exam (out of three midterm exams) the class average goes down. Significantly down. In most semesters before the last three, the class average on the second exam would rise to just below 80%. More recently, the average has declined to the low 60% range.

Frustrated by this pattern, I offered to allow group exams in astronomy on the third midterm.  Working together, the students were able to significantly bring up their scores, although implementing the group exam brings in its own set of challenges in terms of how I score it fairly.

### What am I really trying to encourage?

There are some maxims that are sort of swirling around in my head whenever I think about what I'm going to do next semester. One is the saying about how students don't really respond to what you want them to do (or what's best for them) but they will respond to what they are graded on. I guess I can't really think of the exact saying right now, but I think a lot about how to incentivize the intrinsic motivation to pursue deep learning without having to provide the extrinsic motivation of points towards a grade.

The other related thought that I can't quite decide how to address is the idea that if I want to encourage a type of behavior or thinking, then it SHOULD be a part of the grade somehow.

So for example, last semester in astronomy we used the lecture tutorials by the CAPER team as purely formative assessments. Students were told they would not be graded on them, so they should work together and feel free to make mistakes that we would correct in class. My class never fully bought into the idea of taking the tutorials seriously as a way of being actively engaged in the class. Even after the first exam drew 80% of its questions directly from the lecture tutorials, and the students themselves recognized how much of the exam was based on the tutorials, they did not believe that collaborating with others on the tutorials was necessary.

And why should they have? I was not going to be rewarding them for working with others as a part of their grade, after all. I think that perhaps if group exams were a part of the course from the start, they would have reason to work with others in the class from the beginning.

But, what about the general physics course?  I believe Eric Mazur's Harvard course has some form of open-book, open-note policy on quizzes and exams. Others have used group exams in these courses.  What am I trying to encourage?  I think I am trying to encourage students to work together collaboratively, but am I grading that way?  Should I be? Isn't part of the course figuring out how to take quizzes and exams by yourself?

### The real reason I need to figure this out

I have a conference that is going to take me away from school the last week of the semester before finals. I am not happy with this schedule, but there is not much I can do about it right now. What I'd like to do if possible is eliminate in-class exams. Since I typically give three mid-term exams, that effectively gives me back all the time I would be missing at the end of the semester…although it's really never the same. But if I give take-home exams, for example, how should they be structured? Do I explicitly forbid collaboration and trust the students? That seems to go against the classroom dynamic that I would like to foster of students working together. Do I explicitly encourage students to group up and work on it? That would seem to disadvantage students who have busy work and home schedules and cannot easily pop back and forth to campus.

The one idea I've had is to give the exams as take-home exams and allow for students to group up if they want. But instead of them handing in the exam, have them make screencasts for each problem on the exam. That way, I hear each student explain it in their own words, just like the homework. I just don't know if I can grade that many screencasts in a reasonable amount of time.

### TL;DR

How can I improve the way I assess and evaluate students next term? How closely does the grading policy align with my philosophy on learning and what can I do to improve that?