Educational Leadership and Policy Studies

Power BI, Will It Really Give Me Data Viz Superpowers?

May 15, 2025 by Jonah Hall

What is Power BI?

Power BI is a powerful tool to visualize data.  

It can take multiple large datasets, combine them, transform them, perform calculations, and help you create beautiful visualizations. Think of it as a data wrangler, organizer, and visualizer! Oftentimes, a collection of visualizations is assembled into a report.

My name is Jake Working. I am a third-year student in the ESM PhD program at UTK, and I primarily use Power BI in my day job as a Data Analyst for Digital Learning at UTK. I will briefly discuss some of Power BI’s main functions and point you toward some resources if you want to learn more.

Why use data viz software?

Before we jump into the software, you may be thinking, “Why go through all the trouble of learning another software package just to create visualizations? Aren’t my [insert your software of choice here] visualizations good enough?”

Even once you get comfortable and quick in [your software of choice], at the end of the day these programs’ primary functions are typically to store, present, or analyze your data, not to bring in data for the purpose of creating visualizations.

The advantage of learning data visualization software like Power BI is that it is designed with visualization as its primary purpose. If you have learned or even mastered creating visuals in another program, you can 100% learn and master visualization software like Power BI.

What can Power BI do? 

First, Power BI is excellent at bringing in data. You can connect multiple large and different types of data sources to Power BI, transform them, and perform calculations as necessary to prepare visuals. 

For data sources, if you can access the data, Power BI can connect to or import it. Power BI can take flat files (e.g., Excel, PDF, or CSV), pull directly (snapshot or live) from a database (e.g., MySQL, Oracle, SQL Server), import from a website, an R script, a Python script, and many more! Even if you have multiple data sources, you can load in as many as you need and create relationships between them.

Creating relationships serves as the backbone of your data model if you have multiple data sources. For example, say you have a data source with student demographic data and another with student course information. If both contain a unique identifier, such as their student ID, you can create a relationship between the data sources based on that student ID and Power BI will know which course information connects with which student in your demographic data.  

Most model-building mistakes occur at this step, and it is important to understand how and why you are building your model in a certain way; otherwise, you could get sluggish, incorrect, or confusing output. I suggest reading Microsoft’s overview of relationships and then, later, this two-part blog post on Power BI data modeling best practices (part 1, part 2). Warning! The blog post is overly detailed for beginners, but it contains extremely important information for avoiding common Power BI pitfalls with relationships. I have had to deal with, and overcome, issues related to cardinality, filtering, and schema structure that are discussed in the blog.
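
To make the student-ID example above concrete, here is a minimal sketch in Python (pandas), with hypothetical tables and column names, of what a one-to-many relationship resolves once it is defined in the model:

    import pandas as pd

    # Hypothetical stand-ins for two Power BI data sources
    demographics = pd.DataFrame({
        "student_id": [101, 102, 103],
        "class_year": ["Sophomore", "Junior", "Senior"],
    })
    courses = pd.DataFrame({
        "student_id": [101, 101, 103],
        "course": ["Intro Stats", "Program Evaluation", "Higher Ed Finance"],
    })

    # A one-to-many relationship on student_id: one demographic row can
    # match several course rows; Power BI resolves this matching for you
    # once the relationship exists in the model
    merged = demographics.merge(courses, on="student_id", how="left")
    print(merged)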

An overview of Power BI’s capabilities: bringing in multiple sources of data, cleaning data, creating relationships between data sources, and using the data to generate a visual report. 

Once you have identified your dataset, Power BI can transform your data into clean, workable data within its Power Query editor. The editor has Excel-like functionality, such as updating data types, replacing values, creating new columns, and pivoting data, accessed through the Power Query GUI or its scripting language, M. These transformation steps can be “saved” to your data source and performed on your data each time Power BI connects to or refreshes that source. So, once you have cleaned your data, the cleanup happens automatically using the steps you already created!
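
The same idea can be sketched outside Power BI. In this minimal Python analogy (file and column names are hypothetical), the “saved” transformation steps are a function that is simply re-run each time the data refreshes:

    import pandas as pd

    def clean_submissions(path: str) -> pd.DataFrame:
        """Replayable cleaning steps, analogous to a saved Power Query."""
        df = pd.read_csv(path)
        # Update a data type, as you would in the Power Query editor
        df["SubmittedDate"] = pd.to_datetime(df["SubmittedDate"])
        # Replace placeholder text with a real missing value
        df["Program"] = df["Program"].replace("N/A", pd.NA)
        return df

    # Each refresh reapplies every step to the incoming file
    # df = clean_submissions("forms.csv")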

Power BI can then perform complex calculations on your dataset once you’ve loaded it in. It uses a function and reference library called Data Analysis Expressions (DAX, for short) that is similar to the expressions used in Excel. Check out Microsoft’s overview of how DAX can be used within Power BI and the library of DAX functions. In my own use of Power BI, I mainly use calculated columns and measures.

For example, let’s say I have a column in my dataset that shows the date a form was submitted in this format: mm/dd/yyyy hr:min:sec. If I want to count the number of forms submitted in the calendar year 2025 and display that value on my report, I can create a measure using DAX functions. It would look something like this:
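
A minimal sketch of such a measure (the Submissions table and SubmittedDate column are hypothetical names):

    // Hypothetical table and column names; COUNTROWS counts the rows,
    // and the filter keeps only forms submitted during calendar year 2025
    Forms Submitted 2025 =
    CALCULATE (
        COUNTROWS ( Submissions ),
        YEAR ( Submissions[SubmittedDate] ) = 2025
    )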

Finally, Power BI’s main function is to create engaging visuals and reports that help you draw information from your data. Power BI has a workspace that lets you easily select visuals, drag fields from your data into them, and then edit or customize them. The software is pre-loaded with many useful visuals, but you can search for and download additional user-created visuals as well. Check out the image below showcasing Power BI’s workspace.

Image from Microsoft (source)

Visuals can be used together (like in the image) to create a report. These reports can be published in a shareable environment through the Power BI Service so others can view the report. This is how companies create and distribute data reports! 

One exciting feature of Power BI is the ability to use and interact with Microsoft’s AI, Copilot. Copilot is quite intelligent when it comes to understanding and using data and can even help build visuals and whole reports. Check out this three-minute demo of Copilot within Power BI to get a sense of its capabilities.

I want to try! 

If you are interested in poking around Power BI to see if it could be useful for you, you can download the desktop version for free here. Even if you are just working on personal projects and have data you want to create visuals from, it may be worth trying Power BI!

Microsoft has training, videos, sample data you can play with once you open the program, and a community forum to help with any questions you may have.  

Curious what Power BI can do? Check out some of the submissions from this year’s Microsoft Power BI Visualization World Championships!

Filed Under: Evaluation Methodology Blog

Empathy in Evaluation: A Meta-Analysis Comparing Evaluation Models in Refugee and Displaced Settings

March 15, 2025 by Jonah Hall

By Dr. Fatima T. Zahra

Hello, my name is Fatima T. Zahra. I am an Assistant Professor of Evaluation, Statistics, and Methodology at the University of Tennessee, Knoxville. My research examines the intersection of human development, AI, and evaluation in diverse and displaced populations. Over the past decade, I have worked on projects that explore the role of evaluation in shaping educational and labor market outcomes in refugee and crisis-affected settings. This post departs from a purely technical discussion to reflect on the role of empathy in evaluation practice—a quality that is often overlooked but profoundly consequential. For more information about the work that I do, check out my website.

Evaluation is typically regarded as an instrument for assessing program effectiveness. However, for marginalized and forcibly displaced populations, conventional evaluation models often fall short. Traditional frameworks prioritize objectivity, standardized indicators, and externally driven methodologies, yet they frequently fail to capture the complexity of lived experiences. This gap has spurred the adoption of empathy in evaluation, particularly participatory and culturally responsive frameworks that prioritize community voices, local knowledge, and equitable power-sharing in the evaluation process. Yet the work in this area remains substantially underdeveloped.

A group selfie taken during field work in the Rohingya refugee camps in 2019.

Why Does This Matter?

My recent meta-analysis of 40 studies comparing participatory, culturally responsive, and traditional evaluation models in refugee and displaced settings underscores the importance of empathy-driven approaches. Key findings include: 

  • Participatory evaluations demonstrated high levels of community engagement, with attendance and participation rates ranging from 71% to 78%. Evaluations that positioned community members as co-researchers led to greater program sustainability. 
  • Culturally responsive evaluations yielded statistically significant improvements in mental health outcomes and knowledge acquisition, particularly when interventions incorporated linguistic and cultural adaptations tailored to participants’ lived experiences. 
  • Traditional evaluations exhibited mixed results, proving effective in measuring clinical outcomes but demonstrating lower engagement (54% average participation rate), particularly in cases where community voices were not integrated into the evaluation design. 

The sustainability of programs was not dictated by evaluation models alone but was strongly influenced by community ownership, capacity building, and system integration. Evaluations that actively engaged community members in decision-making processes were more likely to foster lasting impact. 

Lessons from the Field

In our research on early childhood development among Rohingya refugees in Bangladesh, initial evaluations of play-based learning programs suggested minimal paternal engagement. However, when we restructured our approach to include fathers in defining meaningful participation—through focus groups and storytelling sessions—engagement increased dramatically. This shift underscored a critical lesson: evaluation frameworks that do not reflect the lived realities of marginalized communities risk missing key drivers of success. 

Similarly, in a study examining the impact of employment programs in refugee camps, traditional evaluations focused primarily on income and productivity, overlooking the psychological and social effects of work. By incorporating mental well-being as a key evaluation metric—through self-reported dignity, purpose, and social belonging—we found that employment offered far more than economic stability. These findings reinforce an essential principle: sustainable impact is most likely when evaluation is conducted with communities rather than on them, recognizing the full spectrum of human needs beyond economic indicators. 

Rethinking Evaluation: A Call for Change

To advance the field of evaluation, particularly in marginalized and displaced settings, we must adopt new approaches: 

  1. Power-sharing as a foundational principle. Evaluation must shift from an extractive process to a collaborative one. This means prioritizing genuine co-creation, where communities influence decisions from research design to data interpretation. 
  2. Cultural responsiveness as a necessity, not an afterthought. Effective evaluation requires deep listening, linguistic adaptation, and recognition of cultural epistemologies. Without this, findings may be incomplete or misinterpreted. 
  3. Expanding our definition of rigor. Methodological validity should not come at the expense of community relevance. The most robust evaluations integrate standardized measures with locally grounded insights. 
  4. Moving beyond extractive evaluation models. The purpose of evaluation should extend beyond measuring impact to strengthening local capacity for continued assessment and programmatic refinement.

Looking Ahead

The field of evaluation stands at a pivotal juncture. Traditional approaches, which often prioritize external expertise over local knowledge, are proving inadequate in addressing the complexity of crisis-affected populations. Empathy in evaluation (EIE) methodologies—those that emphasize cultural adaptation, power-sharing, and stakeholder engagement—offer a path toward more just, effective, and sustainable evaluation practice. 

For scholars, this shift necessitates expanding research on context-sensitive methodologies. For practitioners, it demands a reimagining of evaluation as a process that centers mutual learning rather than imposing external standards. For policymakers and funders, it calls for investment in evaluation models that are adaptive, participatory, and aligned with the needs of affected populations. 

As evaluators, we hold a critical responsibility. We can either reinforce existing power imbalances or work to build evaluation frameworks that respect and reflect the realities of the communities we serve. If we aspire to generate meaningful knowledge and drive lasting change, we must place empathy, cultural responsiveness, and community engagement at the core of our methodologies.

Additional Resources

For those interested in deepening their understanding of these concepts, I highly recommend the following works: 

  • Evaluation in Humanitarian Contexts:  
  • Mertens, D. M. (2009). Transformative research and evaluation. Guilford Press. 
  • Culturally Responsive Evaluation:  
  • Hood, S., Hopson, R. K., & Kirkhart, K. E. (2015). Culturally responsive evaluation. In K. E. Newcomer, H. P. Hatry, & J. S. Wholey (Eds.), Handbook of practical program evaluation (4th ed., pp. 281–317). Jossey-Bass. https://doi.org/10.1002/9781119171386.ch12 
  • Participatory Research in Development Settings:  
  • Chouinard, J. A., & Cousins, J. B. (2015). The journey from rhetoric to reality: Participatory evaluation in a development context. Educational Assessment, Evaluation and Accountability, 27, 5–39. https://doi.org/10.1007/s11092-013-9184-8 
  • Empathy in Evaluation:  
  • Zahra, F. T. (n.d.). Empathy in evaluation. https://www.fatimazahra.org/blog-posts/Blog%20Post%20Title%20One-gygte 
  • Empathy and Sensitivity to Injustice:  
  • Decety, J., & Cowell, J. M. (2014). Empathy and motivation for justice: Cognitive empathy and concern, but not emotional empathy, predict sensitivity to injustice for others (SPI White Paper No. 135). Social and Political Intelligence Research Hub. https://web.archive.org/web/20221023104046/https://spihub.org/site/resource_files/publications/spi_wp_135_decety.pdf

Final Thought: Evaluation is a mechanism for empowerment and is more than just an assessment tool. Evaluators have the capacity to amplify community voices, shape equitable policies, and drive sustainable change. The question is not whether we can integrate empathy into our methodologies, but whether we choose to do so.  

Filed Under: Evaluation Methodology Blog

Irwin Recognized As Emerging Professional By ACPA

March 5, 2025 by Jonah Hall

Courtesy of the College of Education, Health, & Human Sciences

At its recent convention in Long Beach, California, College Student Educators International (ACPA) recognized Lauren Irwin with the Annuit Coeptis Emerging Professionals Award. This prestigious award honors exemplary educators in the early stages of their careers. Irwin was one of five early-career professionals recognized for their contributions to the field.
Irwin, an assistant professor in the department of Educational Leadership and Policy Studies (ELPS) in the College of Education, Health, and Human Sciences (CEHHS), is a long-time ACPA member and was deeply honored to receive the award.

“ACPA has long been my professional home in student affairs, and it means a lot to receive this recognition,” said Irwin. “The Annuit Coeptis award is ultimately about community and discussion to support the future of our field. As a former student affairs administrator and early-career faculty member, I am honored to be part of this prestigious multigenerational community and to have the opportunity to learn from and with some of the brightest minds in our field.”

Irwin primarily teaches in the College Student Personnel and Higher Education Administration programs. Her research informs student affairs practice, aiming to enhance and affirm the success of both students and practitioners. Her doctoral dissertation, which examined racialization and whiteness in college student leadership programs, earned ACPA’s Marylu McEwen Dissertation of the Year Award. Additionally, her research has been published in numerous scholarly journals.

“I hope to continue centering my commitment to student learning, equity, and inclusion through my teaching, research, and service,” Irwin said.
Through its seven departments and 13 centers, the UT College of Education, Health and Human Sciences enhances the quality of life for all through research, outreach, and practice. Find out more at cehhs.utk.edu

Filed Under: News

Is Your Data Dirty? The Importance of Conducting Frequencies First

March 1, 2025 by Jonah Hall

By Jennifer Ann Morrow, Ph.D.

Data, like life, can be messy. I’ve worked with all types of data, both collected by me and by my clients, for over 25 years and I ALWAYS check my data before conducting my proposed analyses. Sometimes, this part of the analysis process is quick and easy but most of the time it’s like an investigation…you need to be thorough, take your time, and provide evidence for your decision making. 

Data Cleaning Step 3: Perform Initial Frequencies 

After you have drafted your codebook and analysis plan, you should conduct frequencies on all of the variables in your dataset, both numeric and string. I typically use Excel or SPSS to do this (my colleague Dr. Louis Rocconi prefers R), but you can use whatever statistical software you feel most comfortable with. At this step I conduct frequencies and request graphics (e.g., bar chart, histogram) for every variable. This output will be invaluable as you work through your next data cleaning steps.
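
For readers who prefer code, here is a minimal sketch of this step in Python (pandas); the file and variable names are hypothetical:

    import pandas as pd
    import matplotlib.pyplot as plt

    df = pd.read_csv("survey_export.csv")

    # Frequency table for every variable, numeric and string alike,
    # with missing values included so surprises stand out
    for col in df.columns:
        print(f"\n--- {col} ---")
        print(df[col].value_counts(dropna=False))

    # A quick bar chart for one categorical variable
    df["enrollment_status"].value_counts(dropna=False).plot(kind="bar")
    plt.show()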

So, what should you be looking at when reviewing your frequencies? One thing I make note of is any discrepancy in coding between my data and what is listed in my codebook. I’ll flag any spelling issues in my variable names/labels and note anything that doesn’t match my codebook. One thing that I always check is that my value labels (the labels given to my numeric categories) match my codebook and are consistent across sets of variables. Many times, if you are using an online survey software package to collect your data, programming mistakes made when creating the survey can result in mislabeled values. Also, if many individuals have entered data into your database, the chances increase that mistakes were made during the data entry process. During this step I also check that I have properly labeled any values that I’m using to designate missing data and that this is consistent with what I have listed in my codebook.

Lastly, I will highlight when I see variables that may have extreme scores (i.e., potential outliers), variables with more than 5% missing data, and variables with very low sample size in any of their response categories. I’ll use this output in future data cleaning steps to aid in my decision making on variable modification. 

Data Cleaning Step 4: Check for Coding Mistakes 

At this step I take the output where I highlighted potential coding issues and start reviewing it and making variable modification decisions. Coding issues are more common when your data have been entered manually, but you can still have coding errors in online data collection! For any variable with a coding issue, I first determine whether I can verify the data from the original (or another) source. For data that have been entered manually, I’ll go back to the organization/paper survey/data form to verify the data. If a value needs to be changed to the correct response, I make a note to fix it in my next data cleaning step. If I cannot verify the datapoint (as when you have collected your data anonymously) and the value doesn’t fall within the possible values listed in my codebook, then I make a note to set the value as missing when I get to the next data cleaning step.
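
As a rough sketch of this step in Python (pandas), with a hypothetical variable and codebook entry, out-of-range values can be flagged against the codebook and, when unverifiable, set to missing:

    import numpy as np
    import pandas as pd

    df = pd.read_csv("survey_export.csv")

    # Hypothetical codebook entry: satisfaction is coded 1-5, with -9 = missing
    valid_codes = {1, 2, 3, 4, 5, -9}

    # Flag values outside the codebook before changing anything, so the
    # decision can be documented in the project notebook
    bad = ~df["satisfaction"].isin(valid_codes)
    print(df.loc[bad, "satisfaction"])

    # Unverifiable out-of-range values become missing, and the designated
    # missing-data code is converted as well
    df["satisfaction"] = df["satisfaction"].where(~bad).replace(-9, np.nan)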

Additional Advice 

As I am going through my frequencies I will highlight/enter notes directly in the output to make things easier as I move forward through the data cleaning process. I’ll also put notes in my project notebook summarizing any issues and then once I make decisions on variable modifications, I note these in my notebook as well. You will use the output from Step 3 in the next few data cleaning steps to aid in your decision making so keep it handy! 

Resources

12 Steps of Data Cleaning Handout: https://www.dropbox.com/scl/fi/x2bf2t0q134p0cx4kvej0/TWELVE-STEPS-OF-DATA-CLEANING-BRIEF-HANDOUT-MORROW-2017.pdf?rlkey=lfrllz3zya83qzeny6ubwzvjj&dl=0 

Step 1: https://cehhs.utk.edu/elps/organizing-your-evaluation-data-the-importance-of-having-a-comprehensive-data-codebook/ 

Step 2: https://cehhs.utk.edu/elps/clean-correlate-and-compare-the-importance-of-having-a-data-analysis-plan/ 

https://davenport.libguides.com/data275/spss-tutorial/cleaning

https://libguides.library.kent.edu/SPSS/FrequenciesCategorical

https://www.datacamp.com/tutorial/tutorial-data-cleaning-tutorial

https://www.geeksforgeeks.org/frequency-table-in-r

https://www.goskills.com/Excel/Resources/FREQUENCY-Excel

Filed Under: Evaluation Methodology Blog

Boyd Receives Legacy of Excellence Award From ASCA

February 27, 2025 by Jonah Hall

Karen D. Boyd, professor of practice in the College of Education, Health, and Human Sciences (CEHHS) at the University of Tennessee, Knoxville, received the Raymond H. Goldstone Legacy of Excellence Award from the Association for Student Conduct Administration (ASCA) during its 2025 Annual Conference held in Portland, Oregon.

The Goldstone Legacy of Excellence Award is a new initiative launched by the Goldstone Foundation to recognize distinguished individuals who have shaped the field of student conduct and higher education. The Legacy of Excellence Award annually recognizes a select group of individuals who have left an enduring impact on the profession through significant contributions to the field of student conduct; impactful scholarship and research; and/or leadership within ASCA and other organizations.

Boyd has been a part of ASCA since its inception. Her leadership roles have included Conference Chair, President, and Gehring Academy Chair; she has also authored multiple publications and presentations and even served as Interim Executive Director. In addition, Boyd serves as a professor of practice and director of undergraduate education in the department of Educational Leadership and Policy Studies (ELPS).

“It is an honor to be so recognized for doing work in service to the success of my students and colleagues that I have loved so very much,” said Boyd.

Many members, past and present, have benefited from all she implemented in the Association. The future of our field continues to benefit through her role as professor at the University of Tennessee, Knoxville, where her courses are consistently regarded by students as among their favorite and most impactful.

Her work educating professionals and students about the landmark Dixon v. Alabama case, and her partnership on the documentary about the case, have made a significant impact on the conduct field.

The ASCA Annual Conference, spanning from February 5 – February 8, 2025, gathered nearly 650 student conduct and student affairs practitioners for a professional development experience. The awards were presented during the Awards Luncheon on February 6, 2025, where attendees gathered to connect and congratulate the recipients.

Since its inception in 1986, the Association for Student Conduct Administration (ASCA) has been at the forefront of supporting campus judicial officers and student conduct practitioners. ASCA provides members with strategic resources, including communities of practice, webinars, and intensive learning opportunities (the Donald D. Gehring Academy), and partners with the Raymond H. Goldstone Foundation for scholarship funding. Today, ASCA supports over 2,660 members worldwide and is committed to its mission of serving as a vital resource and advocate in the field of student conduct administration. Learn more at theasca.org.

Through its seven departments and 13 centers, the College of Education, Health, and Human Sciences enhances the quality of life for all through research, outreach, and practice. Find out more at cehhs.utk.edu

Filed Under: News

David Hamilton Recognized as Field Award Recipient

February 20, 2025 by Jonah Hall

Mr. David Hamilton, Principal at Cumberland Gap High School in the Claiborne County School District, has been named as this year’s recipient of the William J. and Lucille H. Field Award for Excellence in Secondary Principalship for the State of Tennessee.

Pictured from Left to Right: Dr. James Martinez, Mr. David Hamilton, & Mr. Randy Atkins

The Field Award was established to recognize one outstanding secondary school leader each year who demonstrates leadership excellence through commitment to the values of civility, candor, courage, social justice, responsibility, compassion, community, persistence, service, and excellence. Administered by the College of Education, Health, and Human Sciences at the University of Tennessee, the Field Award identifies a Tennessee secondary school principal whose life and work are characterized by leadership excellence. It also encourages secondary school principals to pause and reflect upon their current leadership practice and to consider their experience, challenges, and opportunities in light of the personal values that they embody.

The Field Award recipient for this year is Mr. David Hamilton, Principal at Cumberland Gap High School (CGHS) in the Claiborne County School District. Mr. Hamilton has served as the principal of CGHS since 2019, and served as the school’s assistant principal from 2003-2018. During that time, he developed and implemented a program that significantly improved student transition and retention, organized initiatives that paired students with community mentors, spearheaded fundraising efforts that raised over $20,000 for student resources and facility upgrades, and established a year-round food and hygiene pantry that ensures students have access to essential resources.

Mr. Hamilton served as a high school health and physical education teacher in the Claiborne County School District from 1999-2003 and coached high school baseball teams between 2003-2006, and again between 2015-2018. Mr. Hamilton holds a Bachelor of Science degree in Health and Physical Education, and Master of Arts and Educational Specialist degrees in Educational Administration and Supervision, all from Lincoln Memorial University. The department of Educational Leadership and Policy Studies at the University of Tennessee, Knoxville is proud to name Mr. David Hamilton as this year’s Field Award winner. Congratulations, Mr. Hamilton! 

Filed Under: News

Giving Yourself Room to Grow is Critical to Long-Term Wellbeing, and In Turn, Success

February 15, 2025 by Jonah Hall

By M. Andrew Young

We’ve all heard (and likely said) “Nobody’s perfect!”, but do we really know how to give ourselves (and others) the proper amount of empathy? 

Hello, my name is M. Andrew Young. I’m a third-year Ph.D. student in the Evaluation, Statistics and Methodology program in the Educational Leadership & Policy Studies department at the University of Tennessee. For the past five years, I have served as a higher education evaluator in my role as a Director of Assessment. In every job I’ve had since completing my undergraduate degree in 2011, I have woven the use of data into the fabric of my work, and this degree program and the field of evaluation are my happy place. I’d like to divert from the ‘normal’ type of technical blog post I’ve written in the past and share something a bit more personal.

I’ve noticed that in higher education, particularly in graduate and professional programs, there are a lot of highly conscientious people. I am one of them. This anecdotal observation, or generalization, extends to faculty, staff, and students alike. A year ago, I was doing some research on the changing landscape of evaluation and assessment career skills, and when I looked at how much the landscape has changed post-pandemic, I was astounded by how rapidly the culture, values, and demands of the workplace had shifted (see this resource in my reference section for more info, even though it is already becoming outdated: Essential Post-Pandemic Skills | ACCA Global, 2021).

The laws of physics demand that for every action there is an equal and opposite reaction, and I have noticed that, oftentimes, conscientiousness, which is a good thing, is counterbalanced by its less useful companion: high levels of self-imposed demands for excellence or even perfection. In 2021, Forbes magazine released an article called “Why Failure Is Essential To Success” (Arruda, 2021). It is a really good read, and the interview with Dr. Sam Collins was eye-opening. The basic premise is that our culture celebrates and glorifies success; we even idolize stories of overcoming adversity, but we rarely see the numerous and deep failures behind those success stories. We love the victory, but we do not fully feel the depths of the pain, depression even, or discouragement people waded through along the journey.

People like me are often so concerned with getting it right the first time, and set a personal standard so high, that when we can’t attain it we immediately sink into an unproductive, self-deprecating, self-condemnatory internal dialogue. Doubts gnaw at our self-concept of our worth and capability to succeed, and there is an insidious voice telling us to give up: that we aren’t capable of succeeding, that we are alone or unique in our struggles, and that the effort we put in won’t result in anything but wasted time, time we could save by just being satisfied with our current status quo.

It is incredible how we can grow without even noticing it in the moment. Let me tell you about Andrew 10 years ago. I worked for a web design and marketing consulting company. The hours were long, the pay was abhorrently low for the job title I had, and I was unhappy and out of my element. The original job I was hired to do was creating data visualizations for marketing surveys. It morphed into learning survey instrument development, data cleaning, statistical analysis, search engine marketing, search engine optimization, and website quality assurance. I was not ready for the work because I was not properly trained nor supported with professional development for what I would encounter. I made a LOT of mistakes, and I was unhappy. I recall a conversation with my then supervisor, one of those uncomfortable conversations where my work quality didn’t measure up to the demands of the job or their expectations. We were speaking about data visualization, and they gave me a scenario of a creative way to visualize geographical map information. Something was said along the lines of, “This is the type of stuff we are looking for,” and my response was, “I don’t know that I am capable of thinking up those things on my own.”

When I reflect on that moment, I chuckle at how simplistic that data solution was within the context of my current knowledge. When I look at the types of data analyses I’m capable of, and the knowledge I possess now, through the lens of what I was capable of only two years ago, I can see the growth. When I look at the quality of my work today compared to the past, distant and recent, there is growth. As a parent of school-aged children now, I see the incredible pressures this culture levies on immediate success and high performance. My middle child, who is four years younger than her older sister, has unrealistic expectations of her own capabilities and limitations, and often finds herself at a comparative disadvantage to her sister. Both my school-aged children have been asked to perform tasks at which they fail, or don’t perform to their level of desire or expectation, and when asked to do it again they’ve huffed in frustration and despair, “I can’t do that, dad!”, to which I always reply, “No. You can’t yet. You CAN figure it out!”

Oh, if only I had learned that lesson earlier in my life. Sometimes we have families with impossible expectations for us. Sometimes we work for employers who want us to perform at a high level and never make mistakes, and who are waiting with the hammer held twitchingly above our heads, ready for us to fail. Sometimes our educational system is designed to grind us through the mill at its speed when we really need to back up and master foundational things… the list goes on.

Let me assure you of some things: you will disappoint those you love. You will make an embarrassing mistake at your job. You will misunderstand a school assignment and get a bad grade. You will send that email or chat message that you didn’t think through well enough. You will forget a deadline. You will get turned down for that promotion. You will receive rejection letters for almost all of those “dream jobs” with the nice salaries you’ve applied for.  

And that’s ok.

Embrace failure. It isn’t the end; it is an opportunity to learn and grow. 

Embrace chuckling at the simpleton’s drivel you produced “back when”; you were proud of it then because it was what you were capable of then.

Pursue growth, not perfection; every project and every challenge is an opportunity to get better, so embrace where you’re at.

Finally, never get comfortable. Life is a journey, not a destination, and if we ever deceive ourselves into thinking that we can rest on our laurels, we stop growing. It takes an oak tree a hundred years to tower over its peers. Do you see it now? If we recognize that our journey is about growth, it is ok to be where we are and recognize that growth takes time and persistence.

Cool Extra Resources:

A UTK Class I HIGHLY recommend to study student success: ELPS 595: Student Success in Higher Education 

A book that was instrumental in my understanding of wellbeing/belonging/success:  

Quaye, S. J., Harper, S. R., & Pendakur, S. L. (Eds.). (2020). Student engagement in higher education: Theoretical perspectives and practical approaches for diverse populations (Third edition). Routledge. 

Wellbeing/Strengths Assessments: 

Gallup Clifton Strengths: https://www.gallup.com/cliftonstrengthsforstudents/ 

EdResearch for Action: https://edresearchforaction.org/research-briefs/evidence-based-practices-for-assessing-students-social-and-emotional-well-being-2/  

 
Full Reference List: 

Arruda, W. (2021, December 10). Why Failure Is Essential To Success. https://www.forbes.com/sites/williamarruda/2015/05/14/why-failure-is-essential-to-success/ 

Essential post-pandemic skills | ACCA Global. (2021). https://www.accaglobal.com/lk/en/affiliates/advance-ezine/careers-advice/post-pandemic-skills.html 

Evidence-Based Practices For Assessing Students’ Social And Emotional Well-Being. (n.d.). EdResearch for Action. Retrieved January 5, 2025, from https://edresearchforaction.org/research-briefs/evidence-based-practices-for-assessing-students-social-and-emotional-well-being-2/ 

Quaye, S. J., Harper, S. R., & Pendakur, S. L. (Eds.). (2020). Student engagement in higher education: Theoretical perspectives and practical approaches for diverse populations (Third edition). Routledge. 

Singh, A. (2021, August 23). The top data science skills for the post-Covid world. https://www.globaltechcouncil.org/data-science/the-top-data-science-skills-for-the-post-covid-world/ 

Filed Under: Evaluation Methodology Blog

Clean, Correlate, and Compare: The Importance of Having a Data Analysis Plan

February 7, 2025 by Jonah Hall

By Dr. Jennifer Ann Morrow

Data Cleaning Step 2: Create a Data Analysis Plan

Hi again! For those who read my earlier blog on Data Cleaning Step 1: Create a Data Codebook, you know I love data cleaning! My colleagues, Dr. Louis Rocconi and Dr. Gary Skolits, love to nerd out and talk about data cleaning and why it is such an important part of analyzing your evaluation data. As I mentioned in my earlier blog post, before we can tackle our evaluation or assessment questions, we need to get our data organized. Creating a data analysis plan is an important part of the data management process. Once I create the first draft of my data codebook (Step 1), I draft a data analysis plan…and both of these get updated as I make changes to my evaluation/assessment dataset.

Why a Data Analysis Plan?

While it can be tempting to just dive right in and conduct your proposed analyses (I mean, who doesn’t want to run a multiple regression right away?!?), it’s good practice to have a detailed plan for how you intend to clean your data and how you will address your evaluation/assessment questions. Creating a data analysis plan BEFORE you start working with your dataset helps you think through the data you need to collect to address your questions, the specific pieces of the data you will use, how you will analyze the data, and the most appropriate ways to disseminate what you find. While creating a data analysis plan can be time consuming, it is an invaluable part of the data management and analysis process. Also, if you are working with a team (as many of us evaluation/assessment professionals do!), it makes collaboration, replication, and report generation easier. Just like the data codebook, the data analysis plan is a living document that changes as you make decisions and modifications to your dataset and planned analyses.

I share the data analysis plan with my clients throughout the life of the project so they are aware of the process but also so they can chime in if they have questions or requests for different ways to approach the analysis of their data. At the end of my time with the project I routinely share a copy of the data codebook, data analysis plan, and a cleaned/sanitized dataset for the client to continue to use to inform their program and organization. 

What is in a Data Analysis Plan?

Whether you create your data analysis plan in Excel, Word, or some other software platform (I tend to prefer Word), these are my suggestions for what you should include in a data analysis plan:

  1. General Instructions to Data Analysts
  2. List of Datasets for the Project
  3. Who is Responsible for Each Section of the Analysis Plan
  4. Evaluation/Assessment Questions
  5. Variables that You Will Use in Your Analyses
  6. Step-by-Step Description of Your Data Cleaning Process
  7. Specific Analyses that You Will Use to Address Each Evaluation/Assessment Question
  8. Proposed Data Visualizations that You Will Use for Each Analysis
  9. Software Syntax/Code (e.g., SPSS, R) that You Will Use to Analyze Your Data

Since there are often multiple people working with my datasets (boy… did it take me a long time to get used to giving up control here!), including step-by-step instructions for how your data analysts should name, label, and save files is extremely important. Also, providing guidance for how data analysts should document what they do (see the project notebook in your data codebook!) and how they arrived at their decisions is invaluable for keeping the evaluation/assessment team aware of each step of the data analysis process.

I typically organize my data analysis plan by first listing any data cleaning that needs to be completed followed by each of my evaluation/assessment questions. This way all of my analyses are organized by the questions that my client wants me to address…and this helps immensely when writing up my evaluation/assessment report for them.  

Including either the software syntax/code (if using something like SPSS or R) or the step-by-step approach to how you are using the software tool (if using something like Excel) to clean and analyze the data is so helpful, not only to your team members but also to your clients. It allows them to easily rerun analyses and critique the steps that you took to analyze the data. I also include notes in my syntax/code about my decision-making process so anyone can easily follow how and why I approached the analyses the way that I did.
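
For instance, a syntax excerpt in an analysis plan might pair the code with a decision note. A minimal Python sketch (the file, variables, and decision shown are hypothetical):

    import pandas as pd

    # Decision note: respondents who skipped Q3 are excluded listwise here,
    # because Q3 feeds the composite score used for Evaluation Question 1.
    df = pd.read_csv("program_survey_clean.csv")
    eq1_sample = df.dropna(subset=["q3"])
    print(eq1_sample.groupby("cohort")["satisfaction_score"].describe())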

Additional Advice

While it is important to develop your data analysis plan early in your project, always remember that it is a living document and will definitely change as you collect data, meet with your client to discuss the evaluation/assessment, and work through the data cleaning process. Your “perfect” plan may not work once you have collected your data, so be flexible in your approach. Just remember to document any changes that you make to the plan and to your data in your project notebook!

Resources

12 Steps of Data Cleaning Handout: https://www.dropbox.com/scl/fi/x2bf2t0q134p0cx4kvej0/TWELVE-STEPS-OF-DATA-CLEANING-BRIEF-HANDOUT-MORROW-2017.pdf?rlkey=lfrllz3zya83qzeny6ubwzvjj&dl=0 

http://fogartyfellows.org/wp-content/uploads/2015/09/SAP_workbook.pdf 

https://cghlewis.com/blog/project_beginning

https://learn.crenc.org/how-to-create-a-data-analysis-plan

https://pmc.ncbi.nlm.nih.gov/articles/PMC4552232/pdf/cjhp-68-311.pdf

https://the.datastory.guide/hc/en-us/articles/360003250516-Creating-Analysis-Plans-for-Surveys

https://www.slideshare.net/slideshow/brief-introduction-to-the-12-steps-of-evaluagio/26168236#1

https://www.surveymonkey.com/mp/developing-data-analysis-plan

https://youtu.be/105wwMySZYc?si=9SEqjP2HWB5k4MDn

https://youtu.be/djVHKjmImrw?si=BdfSxl6C4weZEOgD

Filed Under: Evaluation Methodology Blog

Mr. David Hamilton, Cumberland Gap High School Principal, Named Field Award Recipient

January 30, 2025 by Jonah Hall

Press Announcement – for Immediate Release

Mr. David Hamilton, Principal at Cumberland Gap High School in the Claiborne County School District, has been named as the recipient of the William J. and Lucille H. Field Award for Excellence in Secondary Principalship for the State of Tennessee.

The Field Award was established to recognize one outstanding secondary school leader each year who demonstrates leadership excellence through commitment to the values of civility, candor, courage, social justice, responsibility, compassion, community, persistence, service, and excellence. Administered by the Department of Educational Leadership & Policy Studies in the College of Education, Health, and Human Sciences at the University of Tennessee, the Field Award identifies a Tennessee secondary school principal whose life and work are characterized by leadership excellence. It also encourages secondary school principals to pause and reflect upon their current leadership practice and to consider their experience, challenges, and opportunities in light of the personal values that they embody.

The Field Award recipient for this year is Mr. David Hamilton, Principal at Cumberland Gap High School (CGHS) in the Claiborne County School District. Mr. Hamilton has served as the principal of CGHS since 2019, and served as the school’s assistant principal from 2003-2018. During that time, he developed and implemented a program that significantly improved student transition and retention, organized initiatives that paired students with community mentors, spearheaded fundraising efforts that raised over $20,000 for student resources and facility upgrades, and established a year-round food and hygiene pantry that ensures students have access to essential resources. Mr. Hamilton served as a high school health and physical education teacher in the Claiborne County School District from 1999-2003 and coached high school baseball teams between 2003-2006, and again between 2015-2018. Mr. Hamilton holds a Bachelor of Science degree in Health and Physical Education, and Master of Arts and Educational Specialist degrees in Educational Administration and Supervision, all from Lincoln Memorial University. The department of Educational Leadership and Policy Studies at the University of Tennessee, Knoxville is proud to name Mr. David Hamilton as this year’s Field Award winner. Congratulations, Mr. Hamilton!

Filed Under: News

Grant Writing in Evaluation

January 15, 2025 by Jonah Hall

By Jessica Osborne, Ph.D.

Jessica is the Principal Evaluation Associate for the Higher Education Portfolio at The Center for Research Evaluation at the University of Mississippi. She earned a PhD in Evaluation, Statistics, and Measurement from the University of Tennessee, Knoxville, an MFA in Creative Writing from the University of North Carolina, Greensboro, and a BA in English from Elon University. Her main areas of research and evaluation are undergraduate and graduate student success, higher education systems, needs assessments, and intrinsic motivation. She lives in Knoxville, TN with her husband, two kids, and three (yes, three…) cats. 

I’ve always been a writer. Recently, my mother gave (returned to) me a small notebook within which I was delighted to find the first short story I ever wrote. In blocky handwriting with many misspelled words, I found a dramatic story of dragons, witches, and wraiths, all outsmarted by a small but clever eight-year-old. The content of my writing has changed since then, but many of the rules and best practices remain the same. In this blog, I’ll highlight best practices in grant writing for evaluation, including how to read and respond to a solicitation, how to determine what information to include, and how to write clearly and professionally for an evaluation audience.  

As an evaluator, you can expect to respond to proposals in many different fields or content areas: primary, secondary, and post-secondary education, health, public health, arts, and community engagement, just to name a few. The first step in any of these scenarios is to closely and carefully read the solicitation to ensure you have a deep understanding of project components, requirements, logistics, timeline, and, of course, budget. I recommend a close reading approach that includes underlining and/or highlighting the RFP text and taking notes on key elements to include in your proposal. Specifically, pay attention to the relationship between the evaluation scope and budget and to the contexts and relationships among key stakeholders. In reviewing these elements and determining if and how to respond, make sure you see alignment between what the project seeks to achieve and your (or your team’s) ability to meet project goals. Also, be sure to read up on the funder (if you are not already familiar) to get a sense of their overarching mission, vision, and goals. Instances when you may not want to pursue funding include a lack of alignment between the project scope/budget and your team’s capacity, or conflicts with the funder’s ethics, legal requirements, or overarching vision and mission.

Grant writing in evaluation typically takes two forms: responding as a prime (or solo) author to a request for proposal (RFP) or writing a portion of the proposal as a grant subrecipient. The best practices mentioned here are relevant for either of these cases; however, if working on a team as a subrecipient, you’ll also want to match your writing tone and style to the other authors.  

When responding to an RFP, your content should evidence that you know and understand:  

  • the funder – who they are; why they exist; 
  • the funder’s needs – what they are trying to accomplish; what they need to achieve project goals; 
  • and most importantly, that you are the right person to meet their needs and help them achieve their goals.  

For example, if you are responding to a National Science Foundation (NSF) solicitation, you will want to evidence broader impacts and meticulously detail your research-based methods (they are scientists who want to improve societal outcomes), how your project fits the scope and aims of the solicitation (the goals for most NSF solicitations are specific – be sure you understand what the individual program aims to achieve), and the background and experience for all key personnel (to evidence that you and your team can meet solicitation goals).  

When considering content, be sure to include all required elements listed in the solicitation (I recommend double and triple checking!). If requirements are limited or not provided, at minimum be sure to include:  

  • an introduction highlighting your strengths as an evaluator and how those strengths match the funder’s and / or program’s needs 
  • a project summary and description detailing your recommended evaluation questions, program logic model, evaluation plan, timeline, approach, and methods 
  • figures and tables that clearly and succinctly illustrate key evaluation elements  

When considering writing style and tone, stick to the three C’s:  

  • clear 
  • concise 
  • consistent 

To achieve the three C’s, use active voice, relatively simple sentence structure, and plain language. Syntactical acrobatics containing opaque literary devices tend to obfuscate comprehension, and, while tempting to construct, have no place in evaluation writing. Also, please remember that the best writing is rewriting. Expect and plan for multiple rounds of revision and ask a colleague or team member to revise and edit your work as well.  

And finally, a word on receiving feedback: in the world of evaluation grant writing, much like the world of academic publications, you will receive many more no’s than yes’s. That’s fine. That’s to be expected. When you receive a no, look at the feedback with an eye for improvement – make revisions based on constructive feedback and let go of any criticisms that are not helpful. When you receive a yes, celebrate, and then get ready for the real work to begin! 

Filed Under: Evaluation Methodology Blog
