This page serves as a sort of journal for Geoff’s doctoral program, separate from the Emergentmath blog. Here, Geoff will share thoughts and reflections on classes, insights into literature and other PhD-related musings. While you’re here, feel free to explore Geoff’s main blog, various “miniseries,” and select blog posts. Here is Geoff’s About Me page.
Journal: Urgent vs. Important
One of the challenges I’m facing in my doctoral studies thus far is prioritizing readings, research, assignments, and final papers. Perhaps because it’s been a while since I’ve been a student, I didn’t consider how you have to keep multiple timelines for multiple classes. A significant fear of mine is that I’ll keep up with my shorter assignments at the expense of my larger papers due at the end of the semester. Or I’ll diligently keep up with my coursework but have to play catch-up with my dissertation.
Here’s what I’m going to try.
I wrote about the Urgent vs. Important matrix in Necessary Conditions. That was in the context of lesson and departmental planning. Here’s how it looks in NC.
I’m creating a slight remix of that matrix for my PhD coursework and long-term research and writing. I’m still using the phrases “Urgent vs. Important,” but I’m thinking of them as short-term vs. long-term and high effort vs. low effort. I’m using a Google Jamboard, as it allows easy sketching and easy sticky-noting.
The really important thing is that I keep the items that aren’t due for months in my brain somehow. Keeping tabs on them via this chart might be a way to do that.
Implications for Remote Assessment Environments: a brief discussion of Ercikan, Asil, and Grover (2018)
When schools closed in March of this year, states had the grace and foresight to cancel their usual Spring high-stakes standardized tests. But what about this year? It’s unlikely we’ll be so lucky. And what does it mean to assess a student’s knowledge and skills in a remote learning environment?
In a study by Ercikan, Asil, and Grover (2018), researchers gave adolescents tests on their skills around information and computing technology (ICT). The tests were administered to 14-year-olds in multiple countries (perhaps notably, not the U.S.). Assessments were given to measure ICT skills in four domains: create an online collaborative workspace; explain the process of breathing; create a website/image for a band competition; and create an information sheet about a school trip. Students’ responses were then scored for the quality and completion of their work.
While there were several notable findings, here are some I found particularly interesting:
- Girls performed better on ICT tasks, despite boys reporting having more access to technology.
- Interest and enjoyment were indicators of success.
- Socioeconomic status played a major factor in achievement on the ICT test.
This last finding is particularly troublesome. After all, the test the researchers administered was essentially divorced from core content knowledge. In 2021, the next plausible mass standardized testing window, many students will have spent much of the year attempting to learn remotely. Teachers, parents, and students are all currently struggling with remote learning, particularly those who learn, live, and instruct in poorer neighborhoods. Should standardized assessments resume in the Spring of 2021, it’s inevitable that students who have struggled to learn material via remote learning will perform exceptionally poorly on a remotely delivered assessment.
The authors do provide some recommendations for any remotely implemented assessment system:
- Familiarize students with the technology environment
- Provide tutorials to be accessed before and throughout an assessment
- Administer surveys to gauge ICT access
- Design assessments with least access individuals in mind (emphasis mine)
- Conduct tryouts to ensure the system is stable and satisfactory
- Continue checking for students’ digital divide over time
While schools are doing everything they can to instruct remotely, they can put some of these recommendations in place now in anticipation of remote assessment next Spring, should it come to pass.
Ercikan, K., Asil, M., & Grover, R. (2018). Digital divide: A critical context for digitally based assessments. Education Policy Analysis Archives, 26(51). http://dx.doi.org/10.14507/epaa.26.3817