I feel like it’s been a while since I’ve been able to post an update about my dissertation, so here it is!
- My first article, a literature review comparing online and blended teaching competencies, was reviewed by a journal, which asked for some revisions before publication. I don’t yet know whether it will be accepted, but I have revised and resubmitted it. Here’s hoping!
- My second article takes a subset of the literature from the review and codes each competency as (1) online/digital in nature, (2) in-person, (3) blending online and in-person, or (4) generic. It has been accepted, pending a few revisions! It should be published in March, and I will post a link on this blog so you can read it.
- My final research study was just approved by BYU’s IRB office. We are sending out the test of blended teaching skills, knowledge, and understanding to blended and online schools and teachers across the country. I am looking forward to seeing how people perform on the test and analyzing the results. This is the project I am partnering with The Learning Accelerator on, and the test may or may not end up being posted on their website as a resource for schools.
I still need to hold a formal prospectus meeting with my committee, but today I am actually having a pre-prospectus meeting with 4 members of the committee to discuss details and get their feedback before I launch into the formal prospectus.
This is my latest publication, written for the Brookings Institution and coauthored with Saro Mohammed of The Learning Accelerator!
We have both noticed the gap between the amount of blended learning research being conducted and what actually gets published in peer-reviewed journals. The process of submitting to a journal may seem cumbersome in this digital age, when we can so easily self-publish to a blog, but there are good reasons it exists!
Check it out here!
I’m an educational researcher, invested in the improvement of education for children throughout the world. While some educational research is observational in nature, some seeks to make causal claims about the efficacy of interventions. It is important to keep conducting causal research in order to identify interventions that could improve students’ lives. However, cause and effect is harder to isolate in social contexts such as education, so it is more difficult to design and carry out studies that address the threats to validity they might encounter. In education there are many well-intentioned individuals with big ideas, and money is often poured into improvements and interventions without reference to their actual impact. By conducting experimental and quasi-experimental research in education, we can isolate effects and see whether an intervention was worth its cost. Sharing the results of such studies gives us persuasive power beyond anecdotal evidence of efficacy. Educational policies and practices backed by experiments with defensible design choices help us move into new territory: we can discard notions that are not research-based and accept those that are.
These are lofty goals. It is not enough simply to conduct an experiment (which includes random assignment, and ideally random selection) or a quasi-experiment (which does not include random assignment). Both types of studies are susceptible to “threats to validity”: logical holes in the experiment’s design that can undermine the findings or, if properly addressed, can strengthen a study’s causal claims. While I will not attempt to detail every threat to validity in this essay, I will address a few common ones. The first, particularly for quasi-experimental designs, is selection bias: when individuals select themselves into a treatment or control group, unobserved personal characteristics went into that decision, and these can confound the effects of the treatment. Another threat is the “pretesting effect”: a research subject who remembers taking a pretest (and its contents) may score differently on the posttest because of it. Low statistical power is another threat: with very few research subjects, can you truly determine whether the effects of the treatment were significant? Yet another threat is attrition: when you lose subjects over the course of a study, you lose valuable data from which to draw conclusions. It isn’t difficult to see that the design of a study involves many considerations that must be addressed if it is to be logically sound and stand up to the threats to validity that other researchers might level at it.
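The statistical-power threat can be made concrete with a quick back-of-the-envelope calculation. Here is a minimal sketch of a power calculation for comparing a treatment group and a control group, using a normal approximation to the two-sample t-test; the function name, the effect size (Cohen’s d = 0.3, a small-to-moderate effect), and the group sizes are all illustrative assumptions, not values from any particular study:

```python
from statistics import NormalDist

def power_two_sample(effect_size: float, n_per_group: int, alpha: float = 0.05) -> float:
    """Approximate power of a two-sided, two-sample comparison of means.

    Uses a normal approximation to the t-test: power is the probability
    that the test statistic exceeds the critical value, given the true
    standardized effect size (Cohen's d) and equal group sizes.
    """
    z = NormalDist()
    z_crit = z.inv_cdf(1 - alpha / 2)                  # critical value, e.g. 1.96 for alpha = 0.05
    noncentrality = effect_size * (n_per_group / 2) ** 0.5  # expected test statistic under the alternative
    return z.cdf(noncentrality - z_crit)

print(power_two_sample(0.3, 50))   # ≈ 0.32 — a small study will usually miss this effect
print(power_two_sample(0.3, 175))  # ≈ 0.80 — the conventional power target
```

With 50 subjects per group, the chance of detecting an effect of this size is only about one in three; roughly 175 per group are needed to reach the conventional 80% power, which is one reason underpowered studies so often fail to find real effects.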
One challenge in educational research is the actual implementation of interventions that prove successful. We work with people in a complex ecosystem, some of whom are resistant to change. Making the findings of a study easy to understand and relevant to the group that would be implementing changes requires a great deal of time and effort if they are to carry the changes forward. Incentives for overturning the status quo are often lacking in education due to cultural conditioning, and a proactive school leader might not be around to help make the changes in practice that are found to be effective. Sometimes cronyism in curriculum design and selection gets in the way of switching to methods that might better benefit students. Despite these challenges, it is still worth it to try new things and be open to educational research as a means of providing evidence for making small changes over the years.
The Society for Information Technology and Teacher Education (SITE) is holding a conference this March, and our submission about the Blended Teaching Readiness Instrument Development has been accepted! Charles Graham will be presenting this. To read more about this topic, see the MVLRI report below.
Also, a conference presentation about our analysis of blended teaching competencies has been accepted! The analysis examines whether each blended teaching competency is geared toward in-person skills, online skills, or weaving the two together, or whether it is generic and could apply to any teaching/learning environment. My colleague Cecil Short will be presenting this research on behalf of the team.
Unfortunately I cannot be at this event, but I am excited for the researchers on our team who can go.
I had a great time interviewing my friend Darilyn about her experience teaching at Innovations Early College High School in Salt Lake City. I learned about her journey and how she got to where she is, and what she thinks about her job.
Innovations uses a “flex” blended model, and draws students from the downtown Salt Lake City area. It is in its seventh year of operation, and has been led by Ken Grover who was once the principal at West High in Salt Lake City. Each teacher has a group of students that they individually mentor, in addition to teaching classes.
A few things that struck me as profound from my interview with Darilyn:
1) You don’t need to want to change the world or be super entrepreneurial to thrive in this environment–you simply need to be humble.
2) There is time built into the schedules to care about students, which is why most teachers get into the profession in the first place.
If I come up with any findings that are more nuanced than that I may post them here, but for now, I am going to focus on writing up an analysis for my Qualitative Research class!
I helped to write this report about our development of a Blended Teaching Readiness Instrument! Very excited to have it published through the Michigan Virtual Learning Research Institute (MVLRI) and to share our research with the world.
Read the report here: https://mvlri.org/wp-content/uploads/2017/11/k12-blended-teaching-readiness-phase-1-instrument-development.pdf
This readiness instrument was developed over long hours, consulting with teachers and experts, and will partially inform the blended teaching assessment for preservice and inservice teachers that I’m writing right now for the second phase of my dissertation.
Last week was a whirlwind. In addition to finding out I had been offered a fellowship through The Learning Accelerator, I was given the opportunity to be part of a research panel at the upcoming iNACOL conference in Orlando, Florida this October! This symposium brings together technology companies with K-12 districts, researchers, and non-profits to explore how online and blended learning can be improved in K-12. I am looking forward to meeting people there and sharing the research agenda we have been working on.
Once I have the details of what I’m being asked to share more solidified I will post a rough outline here, or I’ll write up a retrospective when I get home.
On Friday I attended a poster conference at Utah State to share ideas about my literature review and to get feedback from others on K-12 blended teaching competencies. Currently, Jered Borup, Charles Graham, and I have built an instrument that assesses “blended teaching readiness.” It is targeted at school teachers who have some experience in face-to-face teaching environments and may teach with some level of technology integration, but who haven’t yet adopted online teaching components in their classrooms. The hope is that by surveying their readiness for blended teaching, we can assess where their needs are and provide targeted resources to enhance their abilities in blended and online teaching.
We are now conducting a confirmatory factor analysis to validate the 50-60 items in the survey (teachers in a Virginia school district are taking the assessment). Here are the general concepts the survey covers (these categories loosely follow the IBSTPI model categories):
- Foundational Knowledge and Skills
  - Technical Literacy
  - Digital Citizenship
  - Positive Mindsets
- Planning and Design
  - Planning blended activities
  - Planning blended assessments
- Instructional Strategies
  - Personalizing Instruction
  - Facilitating Community-based Activities
  - Facilitating Student-Instructor Interaction
  - Facilitating Student-Content Interaction
- Assessment and Evaluation
  - Implementing Digital Assessments
  - Evaluating and Reflecting
One of our main goals in creating the instrument was to write items targeted specifically to a blended environment. We assume that those taking the survey have experience in the face-to-face classroom, so the skills we want to assess are the ones that require technical literacy and the creative use of digital technologies to change the way the classroom works. More information on that when we are finished with the validation!
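Full confirmatory factor analysis is typically run in dedicated software (for example, the lavaan package in R), but a related and much simpler instrument check, internal-consistency reliability via Cronbach’s alpha, can be sketched in a few lines. This is an illustrative sketch only; the function and the toy response data are hypothetical, not part of our actual analysis:

```python
from statistics import pvariance

def cronbach_alpha(responses):
    """Cronbach's alpha for a survey scale.

    responses: one list per respondent, one numeric score per item.
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals).
    Population variances are used; either variance convention works as
    long as it is applied consistently.
    """
    k = len(responses[0])  # number of items on the scale
    item_vars = [pvariance([r[i] for r in responses]) for i in range(k)]
    total_var = pvariance([sum(r) for r in responses])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Toy data: four respondents answering a three-item scale on a 1-5 agreement scale.
toy = [[4, 5, 4], [2, 3, 2], [5, 5, 4], [3, 3, 3]]
print(cronbach_alpha(toy))  # ≈ 0.96 — the items move together, so consistency is high
```

An alpha near 1 suggests the items are measuring the same underlying construct, which is one piece of evidence (alongside the factor analysis) that a set of survey items hangs together as a scale.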