Educational Leadership and Policy Studies


Giving Yourself Room to Grow is Critical to Long-Term Wellbeing, and In Turn, Success

February 15, 2025 by Jonah Hall

By M. Andrew Young

We’ve all heard (and likely said) “Nobody’s perfect!”, but do we really know how to give ourselves (and others) the proper amount of empathy? 

Hello, my name is M. Andrew Young. I'm a third-year Ph.D. student in the Evaluation, Statistics and Methodology program in the Educational Leadership & Policy Studies department at the University of Tennessee. For the past five years, I have served as a higher education evaluator in the role of Director of Assessment. In every job I've held since earning my undergraduate degree in 2011, I have woven data into the fabric of my work, and this degree program and the field of evaluation are my happy place. I'd like to diverge from the 'normal' technical blog posts I've written in the past and share something a bit more personal.

I've noticed that in higher education, particularly in graduate and professional programs, there are a lot of highly conscientious people. I am one of them. This anecdotal observation extends to faculty, staff, and students alike. A year ago, I was researching the changing landscape of evaluation and assessment career skills, and when I looked at how much the landscape had changed post-pandemic, I was astounded by how rapidly workplace culture, values, and demands had shifted (see this resource in my reference section for more info, even though it is already becoming dated: Essential Post-Pandemic Skills | ACCA Global, 2021).

The laws of physics demand that for every action there is an equal and opposite reaction, and I have noticed that being conscientious, which is a good thing, is often counterbalanced by its less useful companion: high levels of self-imposed demands for excellence or even perfection. In 2021, Forbes released an article called "Why Failure Is Essential To Success" (Arruda, 2021). It is a really good read, and the interview with Dr. Sam Collins was eye-opening. The basic premise is that our culture celebrates and glorifies success; we even idolize stories of overcoming adversity, but we rarely see the numerous, deep failures the people behind those stories encountered along the road to success. We love the victory, but we do not fully feel the depths of the pain, discouragement, or even depression they waded through along the journey.

People like me are often so concerned with getting it right the first time, and set such a high personal standard, that when we can't attain it we immediately sink into an unproductive, self-deprecating, self-condemnatory internal dialogue. Doubts gnaw at our sense of our own worth and capability to succeed, and an insidious voice tells us to give up: that we aren't capable of succeeding, that we are alone or unique in our struggles, and that our effort will amount to nothing but wasted time we could have spent being satisfied with the status quo.

It is incredible how we can grow without even noticing it in the moment. Let me tell you about Andrew 10 years ago. I worked for a web design and marketing consulting company. The hours were long, the pay was abhorrently low for my job title, and I was unhappy and out of my element. The job I was originally hired to do was creating data visualizations for marketing surveys. It morphed into learning survey instrument development, data cleaning, statistical analysis, search engine marketing, search engine optimization, and website quality assurance. I was not ready for the work because I had not been properly trained or supported with professional development for what I would encounter. I made a LOT of mistakes, and I was unhappy. I recall a conversation with my then supervisor, one of those uncomfortable conversations where my work quality didn't measure up to the demands of the job or their expectations. We were speaking about data visualization, and they gave me a scenario of a creative way to visualize geographical map information. Something was said along the lines of, "This is the type of stuff we are looking for," and my response was, "I don't know that I am capable of thinking up those things on my own."

When I reflect on that moment, I chuckle at how simplistic that data solution is within the context of my current knowledge. When I look at the types of data analyses I'm capable of and the knowledge I possess now through the lens of what I could do only two years ago, I can see the growth. When I compare the quality of my work today to my past work, distant and recent, there is growth. As a parent of school-aged children, I see the incredible pressure this culture puts on immediate success and high performance. My middle child, who is four years younger than her older sister, has unrealistic expectations of her own capabilities and limitations, and often finds herself at a comparative disadvantage to her sister. Both of my school-aged children have been asked to perform tasks at which they fail or fall short of their own expectations, and when asked to try again they've huffed in frustration and despair, "I can't do that, dad!", to which I always reply, "No. You can't yet. You CAN figure it out!"

Oh, if only I had learned that lesson earlier in my life. Sometimes we have families with impossible expectations for us. Sometimes we work for employers who want us to perform at a high level, never make mistakes, and wait with the hammer held twitchingly above our heads, ready for us to fail. Sometimes our educational system is designed to grind us through the mill at its speed when we really need to back up and master foundational things. The list goes on.

Let me assure you of some things: you will disappoint those you love. You will make an embarrassing mistake at your job. You will misunderstand a school assignment and get a bad grade. You will send that email or chat message that you didn’t think through well enough. You will forget a deadline. You will get turned down for that promotion. You will receive rejection letters for almost all of those “dream jobs” with the nice salaries you’ve applied for.  

And that’s ok.

Embrace failure. It isn’t the end; it is an opportunity to learn and grow. 

Embrace chuckling at the simpleton’s drivel you produced “back when”; you were proud of it then because it was what you were capable of then.

Pursue growth, not perfection; every project and every challenge is an opportunity to get better, so embrace where you are.

Finally, never get comfortable. Life is a journey, not a destination, and if we ever deceive ourselves into thinking that we can rest on our laurels, we stop growing. It takes an oak tree a hundred years to tower over its peers. Do you see it now? If we recognize that our journey is about growth, it is ok to be where we are and recognize that growth takes time and persistence.

Cool Extra Resources:

A UTK Class I HIGHLY recommend to study student success: ELPS 595: Student Success in Higher Education 

A book that was instrumental to my understanding of wellbeing/belonging/success:

Quaye, S. J., Harper, S. R., & Pendakur, S. L. (Eds.). (2020). Student engagement in higher education: Theoretical perspectives and practical approaches for diverse populations (Third edition). Routledge. 

Wellbeing/Strengths Assessments: 

Gallup Clifton Strengths: https://www.gallup.com/cliftonstrengthsforstudents/ 

EdResearch for Action: https://edresearchforaction.org/research-briefs/evidence-based-practices-for-assessing-students-social-and-emotional-well-being-2/  

 
Full Reference List: 

Arruda, W. (2021, December 10). Why failure is essential to success. Forbes. https://www.forbes.com/sites/williamarruda/2015/05/14/why-failure-is-essential-to-success/

Essential post-pandemic skills | ACCA Global. (2021). https://www.accaglobal.com/lk/en/affiliates/advance-ezine/careers-advice/post-pandemic-skills.html 

Evidence-Based Practices For Assessing Students’ Social And Emotional Well-Being. (n.d.). EdResearch for Action. Retrieved January 5, 2025, from https://edresearchforaction.org/research-briefs/evidence-based-practices-for-assessing-students-social-and-emotional-well-being-2/ 

Quaye, S. J., Harper, S. R., & Pendakur, S. L. (Eds.). (2020). Student engagement in higher education: Theoretical perspectives and practical approaches for diverse populations (Third edition). Routledge. 

Singh, A. (2021, August 23). The top data science skills for the post-Covid world. https://www.globaltechcouncil.org/data-science/the-top-data-science-skills-for-the-post-covid-world/ 

Filed Under: Evaluation Methodology Blog

Clean, Correlate, and Compare: The Importance of Having a Data Analysis Plan

February 7, 2025 by Jonah Hall

By Dr. Jennifer Ann Morrow

Data Cleaning Step 2: Create a Data Analysis Plan

Hi again! For those who read my earlier blog on Data Cleaning Step 1: Create a Data Codebook, you know I love data cleaning! My colleagues, Dr. Louis Rocconi and Dr. Gary Skolits, love to nerd out and talk about data cleaning and why it is such an important part of analyzing your evaluation data. As I mentioned in my earlier blog post, before we can tackle our evaluation or assessment questions, we need to get our data organized. Creating a data analysis plan is an important part of the data management process. Once I create the first draft of my data codebook (Step 1), I draft a data analysis plan, and both of these get updated as I make changes to my evaluation/assessment dataset.

Why a Data Analysis Plan?

While it can be tempting to just dive right in and conduct your proposed analyses (I mean, who doesn't want to run a multiple regression right away?!?), it's good practice to have a detailed plan for how you intend to clean your data and how you will address your evaluation/assessment questions. Creating a data analysis plan BEFORE you start working with your dataset helps you think through the data you need to collect to address your questions, the specific pieces of the data you will use, how you will analyze the data, and the most appropriate ways to disseminate the results. While creating a data analysis plan can be time consuming, it is an invaluable part of the data management and analysis process. Also, if you are working with a team (as many of us evaluation/assessment professionals do!), it makes collaboration, replication, and report generation easier. Just like the data codebook, the data analysis plan is a living document that changes as you make decisions and modifications to your dataset and planned analyses.

I share the data analysis plan with my clients throughout the life of the project so they are aware of the process but also so they can chime in if they have questions or requests for different ways to approach the analysis of their data. At the end of my time with the project I routinely share a copy of the data codebook, data analysis plan, and a cleaned/sanitized dataset for the client to continue to use to inform their program and organization. 

What is in a Data Analysis Plan?

Whether you create your data analysis plan in Excel, Word, or some other software platform (I tend to prefer Word), these are my suggestions for what you should include in a data analysis plan:

  1. General Instructions to Data Analysts
  2. List of Datasets for the Project
  3. Who is Responsible for Each Section of the Analysis Plan
  4. Evaluation/Assessment Questions
  5. Variables that You Will Use in Your Analyses
  6. Step-by-Step Description of Your Data Cleaning Process
  7. Specific Analyses that You Will Use to Address Each Evaluation/Assessment Question
  8. Proposed Data Visualizations that You Will Use for Each Analysis
  9. Software Syntax/Code (e.g., SPSS, R) that You Will Use to Analyze Your Data

Since there are often multiple people working with my datasets (Boy… did it take me a long time to get used to giving up control here!), including step-by-step instructions for how your data analysts should name, label, and save files is extremely important. Also, providing guidance for how data analysts should document what they do (see the project notebook in your data codebook!) and how they arrived at their decisions is invaluable for keeping the evaluation/assessment team aware of each step of the data analysis process.

I typically organize my data analysis plan by first listing any data cleaning that needs to be completed followed by each of my evaluation/assessment questions. This way all of my analyses are organized by the questions that my client wants me to address…and this helps immensely when writing up my evaluation/assessment report for them.  

Including either the software syntax/code (if using something like SPSS or R) or the step-by-step approach to how you are using the software tool (if using something like Excel) to clean and analyze the data is so helpful to not only your team members but also your clients. It allows them to easily rerun analyses and critique the steps that you took to analyze the data. I also include in my syntax/code notes about my decision-making process so anyone can easily follow how and why I approached the analyses the way that I did. 
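As a minimal sketch of what such annotated syntax might look like in R (the file path, variable names, and decision notes below are hypothetical, not from an actual plan):

```r
# --- Data cleaning (documented step by step earlier in the plan) ---
# Decision note: per the team's documented decision, respondents missing
# more than half of the survey items are dropped rather than imputed.
survey    <- read.csv("data/raw/survey_2025.csv")   # hypothetical file path
item_cols <- grep("^q", names(survey))              # survey items q1, q2, ...
survey    <- survey[rowMeans(is.na(survey[item_cols])) <= 0.5, ]

# --- Evaluation Question 1: Does satisfaction differ by campus? ---
# Decision note: only two campuses participated this cycle, so an
# independent-samples t-test is used; revisit if more sites join.
t.test(satisfaction ~ campus, data = survey)
```

Keeping the decision notes as comments right beside the commands means a teammate or client rerunning the file sees the "why" in the same place as the "how."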

Additional Advice

While it is important to develop your data analysis plan early in your project, always remember that it is a living document and will definitely change as you collect data, meet with your client to discuss the evaluation/assessment, and clean your data. Your "perfect" plan may not work once you have collected your data, so be flexible in your approach. Just remember to document any changes that you make to the plan and to your data in your project notebook!

Resources

12 Steps of Data Cleaning Handout: https://www.dropbox.com/scl/fi/x2bf2t0q134p0cx4kvej0/TWELVE-STEPS-OF-DATA-CLEANING-BRIEF-HANDOUT-MORROW-2017.pdf?rlkey=lfrllz3zya83qzeny6ubwzvjj&dl=0 

http://fogartyfellows.org/wp-content/uploads/2015/09/SAP_workbook.pdf 

https://cghlewis.com/blog/project_beginning

https://learn.crenc.org/how-to-create-a-data-analysis-plan

https://pmc.ncbi.nlm.nih.gov/articles/PMC4552232/pdf/cjhp-68-311.pdf

https://the.datastory.guide/hc/en-us/articles/360003250516-Creating-Analysis-Plans-for-Surveys

https://www.slideshare.net/slideshow/brief-introduction-to-the-12-steps-of-evaluagio/26168236#1

https://www.surveymonkey.com/mp/developing-data-analysis-plan

https://youtu.be/105wwMySZYc?si=9SEqjP2HWB5k4MDn

https://youtu.be/djVHKjmImrw?si=BdfSxl6C4weZEOgD

Filed Under: Evaluation Methodology Blog

Mr. David Hamilton, Cumberland Gap High School Principal, Named Field Award Recipient

January 30, 2025 by Jonah Hall

Press Announcement – for Immediate Release

Mr. David Hamilton, Principal at Cumberland Gap High School in the Claiborne County School District, has been named the recipient of the William J. and Lucille H. Field Award for Excellence in Secondary Principalship for the State of Tennessee.

The Field Award was established to recognize one outstanding secondary school leader each year who demonstrates leadership excellence through commitment to the values of civility, candor, courage, social justice, responsibility, compassion, community, persistence, service, and excellence. Administered by the Department of Educational Leadership & Policy Studies in the College of Education, Health, and Human Sciences at the University of Tennessee, the Field Award identifies a Tennessee secondary school principal whose life and work are characterized by leadership excellence. It also encourages secondary school principals to pause and reflect upon their current leadership practice and to consider their experience, challenges, and opportunities in light of the personal values they embody.

The Field Award recipient for this year is Mr. David Hamilton, Principal at Cumberland Gap High School (CGHS) in the Claiborne County School District. Mr. Hamilton has served as principal of CGHS since 2019 and served as the school's assistant principal from 2003 to 2018. During that time, he developed and implemented a program that significantly improved student transition and retention, organized initiatives that paired students with community mentors, spearheaded fundraising efforts that raised over $20,000 for student resources and facility upgrades, and established a year-round food and hygiene pantry that ensures students have access to essential resources. Mr. Hamilton served as a high school health and physical education teacher in the Claiborne County School District from 1999 to 2003 and coached high school baseball from 2003 to 2006, and again from 2015 to 2018. Mr. Hamilton holds a Bachelor of Science degree in Health and Physical Education, and Master of Arts and Educational Specialist degrees in Educational Administration and Supervision, all from Lincoln Memorial University. The Department of Educational Leadership and Policy Studies at the University of Tennessee, Knoxville is proud to name Mr. David Hamilton as this year's Field Award winner. Congratulations, Mr. Hamilton!

Filed Under: News

Grant Writing in Evaluation

January 15, 2025 by Jonah Hall

By Jessica Osborne, Ph.D.

Jessica is the Principal Evaluation Associate for the Higher Education Portfolio at The Center for Research Evaluation at the University of Mississippi. She earned a PhD in Evaluation, Statistics, and Measurement from the University of Tennessee, Knoxville, an MFA in Creative Writing from the University of North Carolina, Greensboro, and a BA in English from Elon University. Her main areas of research and evaluation are undergraduate and graduate student success, higher education systems, needs assessments, and intrinsic motivation. She lives in Knoxville, TN with her husband, two kids, and three (yes, three…) cats. 

I’ve always been a writer. Recently, my mother gave (returned to) me a small notebook within which I was delighted to find the first short story I ever wrote. In blocky handwriting with many misspelled words, I found a dramatic story of dragons, witches, and wraiths, all outsmarted by a small but clever eight-year-old. The content of my writing has changed since then, but many of the rules and best practices remain the same. In this blog, I’ll highlight best practices in grant writing for evaluation, including how to read and respond to a solicitation, how to determine what information to include, and how to write clearly and professionally for an evaluation audience.  

As an evaluator, you can expect to respond to proposals in many different fields or content areas: primary, secondary, and post-secondary education, health, public health, arts, and community engagement, just to name a few. The first step in any of these scenarios is to closely and carefully read the solicitation to ensure you have a deep understanding of project components, requirements, logistics, timeline, and, of course, budget. I recommend a close-reading approach that includes underlining and/or highlighting the RFP text and taking notes on key elements to include in your proposal. Specifically, pay attention to the relationship between the evaluation scope and budget and to the contexts and relationships among key stakeholders. In reviewing these elements and determining if and how to respond, make sure you see alignment between what the project seeks to achieve and your (or your team's) ability to meet project goals. Also, be sure to read up on the funder (if you are not already familiar) to get a sense of their overarching mission, vision, and goals. Instances when you may not want to pursue funding include a lack of alignment between the project scope/budget and your team's capacity, or conflicts between your ethics, legal requirements, or overarching vision and mission and the funder's.

Grant writing in evaluation typically takes two forms: responding as a prime (or solo) author to a request for proposal (RFP) or writing a portion of the proposal as a grant subrecipient. The best practices mentioned here are relevant for either of these cases; however, if working on a team as a subrecipient, you’ll also want to match your writing tone and style to the other authors.  

When responding to an RFP, your content should evidence that you know and understand:  

  • the funder – who they are; why they exist; 
  • the funder’s needs – what they are trying to accomplish; what they need to achieve project goals; 
  • and most importantly, that you are the right person to meet their needs and help them achieve their goals.  

For example, if you are responding to a National Science Foundation (NSF) solicitation, you will want to evidence broader impacts and meticulously detail your research-based methods (they are scientists who want to improve societal outcomes), how your project fits the scope and aims of the solicitation (the goals for most NSF solicitations are specific – be sure you understand what the individual program aims to achieve), and the background and experience for all key personnel (to evidence that you and your team can meet solicitation goals).  

When considering content, be sure to include all required elements listed in the solicitation (I recommend double and triple checking!). If requirements are limited or not provided, at minimum be sure to include:  

  • an introduction highlighting your strengths as an evaluator and how those strengths match the funder’s and / or program’s needs 
  • a project summary and description detailing your recommended evaluation questions, program logic model, evaluation plan, timeline, approach, and methods 
  • figures and tables that clearly and succinctly illustrate key evaluation elements  

When considering writing style and tone, stick to the three C’s:  

  • clear 
  • concise 
  • consistent 

To achieve the three C’s, use active voice, relatively simple sentence structure, and plain language. Syntactical acrobatics containing opaque literary devices tend to obfuscate comprehension, and, while tempting to construct, have no place in evaluation writing. Also, please remember that the best writing is rewriting. Expect and plan for multiple rounds of revision and ask a colleague or team member to revise and edit your work as well.  

And finally, a word on receiving feedback: in the world of evaluation grant writing, much like the world of academic publications, you will receive many more no’s than yes’s. That’s fine. That’s to be expected. When you receive a no, look at the feedback with an eye for improvement – make revisions based on constructive feedback and let go of any criticisms that are not helpful. When you receive a yes, celebrate, and then get ready for the real work to begin! 

Filed Under: Evaluation Methodology Blog

Kelchen Recognized By Education Week As Top Scholar Influencer

January 9, 2025 by Jonah Hall

Courtesy of the College of Education, Health, & Human Sciences

When a reporter seeks expert insight into higher education issues, it's very likely that Robert Kelchen is at the top of their call list. Over the years, Kelchen has continued to receive accolades from Education Week as one of the top influencers who shape educational practice and policy. This year is no different, as Kelchen is once again recognized as a Top 200 education scholar at a United States university.

Kelchen, who serves as department head in Educational Leadership and Policy Studies in the College of Education, Health, and Human Sciences at the University of Tennessee, Knoxville, ranked 31 out of 200 scholars nationwide in Education Week’s Edu-Scholar Public Influence Rankings for 2025. In fact, Kelchen is the only scholar from the University of Tennessee to make this year’s list.

 “In a time of declining trust in higher education, I feel that it is crucial for faculty to demonstrate how our work benefits the public good,” said Kelchen.


Each year, Education Week selects the top 200 scholars from across the U.S. (from an eligible pool of 20,000)  as having the most influence on issues and policy in education. The list is compiled by opinion columnist Rick Hess, resident scholar at the American Enterprise Institute and director of Education Policy Studies.

The selection process  involves a rigorous evaluation by a 24-member committee of university scholars representing institutions nationwide. Criteria include Google Scholar scores, book publications, Amazon rankings, mentions in the Congressional Record, and appearances in media and web platforms.

Kelchen’s reputation as a reliable and insightful source for higher education stories is well-earned. He has participated in more than 200 media interviews annually, with his expertise regularly featured in outlets such as The New York Times, The Washington Post, The Wall Street Journal, Education Week, and The Chronicle of Higher Education.

“It is a pleasure to use my scholarly expertise to help inform policy conversations and the general public on pressing issues such as college affordability, financial aid, and college closures,” said Kelchen.

Through its seven departments and 13 centers, the UT Knoxville College of Education, Health, and Human Sciences enhances the quality of life for all through research, outreach, and practice. Find out more at cehhs.utk.edu

Filed Under: News

Finding Fit: A Statistical Journey

January 2, 2025 by Jonah Hall

By: Sara Hall

As a graduate student in Evaluation, Statistics, and Measurement, I've learned a thing or two about fit. Not just in terms of statistical models, but in my own academic journey and beyond. Life is kind of like running one big regression on your choices: sometimes the model explains everything, and other times it's all error terms and cold coffee. Somewhere in between lies the essence of goodness of fit. In this blog post, I will take you through my experience of finding the right graduate program, using some statistical concepts to illustrate my process.

The Initial Model: Leadership & Decision-Making

Two years ago, I began my graduate studies in a Leadership and Decision-Making program. I had been out of the academy for 10 years. I had a successful career in sales, children old enough to reach the microwave, and a supportive group of friends who could help with childcare as well as with navigating graduate school. A good friend and former colleague was teaching quantitative and qualitative analysis and methodology in a Leadership program. He encouraged me to apply with the promise that we would be working together again and that I could pursue my research interests with his support. For as long as I can remember, I have wanted to teach and do research. The timing was perfect, and this seemed like the best opportunity as a non-traditional student to at least get a PhD to teach and do research, even if not exactly in my field of interest. Two things are relevant to note:

  • My research and career goals do not include a focus on leadership and decision-making.
  • My friend accepted a position (a much better fitting one, see what I did there?) at a different university a week before classes started.

In statistics, we often talk about "goodness of fit," or how well a model describes a set of observations. My program choice was a model that looked great on the spreadsheet but failed to capture the nuances of my data, in this case, my interests and career goals. I was dealing with poor model fit. The residuals, the differences between what I expected and what I experienced, only grew larger as the months turned into years. I was determined to see it through and complete my degree, but my frustration was palpable. I was trying to fit a curvilinear model to a linear relationship. My R-squared value was disappointingly low.

Example of Poor Goodness of Fit

Notice in this exaggerated example that a curve is inappropriately used to fit a clearly linear data pattern. The strong positive linear pattern of the data points suggests that as program value increases, goal opportunity also increases. The fitted curve completely misses the underlying pattern, indicating poor model fit. The R-squared value indicates the model explains none of the variance and performs about 33 times worse than if the prediction were simply based on the mean.
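To make the example reproducible, here is a minimal R sketch of the same idea with simulated data (the variable names are illustrative): a deliberately mis-specified curve earns a far lower R-squared than the simple linear model the data actually follow.

```r
set.seed(42)
# Simulate a strong positive linear relationship
program_value    <- seq(1, 10, length.out = 50)
goal_opportunity <- 2 * program_value + rnorm(50, sd = 1)

linear_fit <- lm(goal_opportunity ~ program_value)       # matches the pattern
curved_fit <- lm(goal_opportunity ~ sin(program_value))  # mis-specified curve

summary(linear_fit)$r.squared  # close to 1: good fit
summary(curved_fit)$r.squared  # low: the curve misses the linear trend
```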

Reassessing the Model: Searching for a Better Fit

Just as we refine our statistical models when they fail to adequately explain our data, I concluded I needed to reassess my academic path. The final straw was being told that theory was less important than application while I was working feverishly to map a theory of identity deconstruction that could be generalized to various populations for use in clinical settings. As a theoretical methodologist who values the balance of theory and action, it was a kick in the gut. It turns out it was just what I needed. I began talking to friends whose interests aligned with mine, reaching out to professors and mentors for advice, and really challenging myself to think through what I wanted to do with my scholarly pursuits and the potential consequences of leaving my current program. From there, I began looking at different programs and creating my own information criteria (I heart Bayes!). In the same way that residuals reflect the gap between outcomes and predictions (or between expectations and experiences, in my case), I wanted to minimize the residuals in my decision-making by selecting the program that most closely aligned with my personal and professional aspirations. I developed a framework, inspired by statistical concepts like the Bayesian information criterion, built around four dimensions of alignment that were of critical importance to my decision to change programs (research, faculty, career, academic). I then used this information set to evaluate and compare the different programs based on how well they matched my interests, goals, and priorities. In this context, I viewed each program as a distinct model whose specification defined how the four dimensions of alignment interact and contribute to program fit.
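For readers who want to see the statistical version of that idea, here is a minimal R sketch (simulated data, illustrative names) of how the Bayesian information criterion rewards the better-specified, more parsimonious model with a lower score:

```r
set.seed(1)
x <- runif(100, 1, 10)
y <- 2 * x + rnorm(100)

simple_model  <- lm(y ~ x)           # well-specified for this data
complex_model <- lm(y ~ poly(x, 5))  # over-specified for this data

# Lower BIC wins: fit is rewarded, needless complexity is penalized
BIC(simple_model, complex_model)
```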

Here is a link to a tutorial, developed by Andres F. Barrientos and Antonio Canale, providing the steps necessary to run Bayesian goodness-of-fit testing for regression models in R.

The New Model: Evaluation, Statistics, & Measurement (ESM)

The ESM program immediately stood out to me. The classes were intriguing, the faculty profiles listed research focuses I wanted to explore, and there were many career options I could see myself enjoying. Specifically, the focus on creating applied learning experiences grounded in a theoretical foundation aligned well with my personal approach to both teaching and learning. I met with faculty who echoed my values while also piquing my curiosity about subject matter I had not previously considered exploring. I wanted to learn from them, and I felt I could contribute positively to the program. After careful consideration, I chose to make the switch to ESM. The difference was immediately apparent: it was like finding a model with an excellent fit! I had a well-specified model, capturing the complexity of my academic aspirations without over- or under-fitting. The residuals between my expectations and experiences shrank. Comparing the two programs across five dimensions of important considerations when choosing a graduate program, the ESM program consistently scored higher across all dimensions, indicating better alignment with those considerations than the Leadership program.
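The original post visualized that comparison as a radar chart. As a rough sketch of how such a chart could be drawn in R with the fmsb package (the dimension names and scores below are illustrative placeholders, not the author's actual ratings):

```r
library(fmsb)  # install.packages("fmsb") if needed

# fmsb expects rows 1 and 2 to be the axis maxima and minima,
# followed by one row of (hypothetical) 0-10 scores per program.
scores <- data.frame(
  Research = c(10, 0, 9, 4),
  Faculty  = c(10, 0, 9, 5),
  Career   = c(10, 0, 8, 4),
  Academic = c(10, 0, 9, 6),
  Support  = c(10, 0, 8, 5),
  row.names = c("max", "min", "ESM", "Leadership")
)

radarchart(scores, pcol = c("steelblue", "tomato"), plwd = 2, plty = 1)
legend("topright", legend = c("ESM", "Leadership"),
       col = c("steelblue", "tomato"), lty = 1, lwd = 2)
```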

The Importance of Fit

Good statistical models strike a balance between simplicity and explanatory power. ESM provided the right balance of theory and application for me. Finding the right graduate program is a lot like fitting a statistical model. Graduate school is a continuous process of adaptation requiring careful analysis and, sometimes, a willingness to start over. Changing programs can be a hard decision, but we shouldn't force ourselves to fit into programs that don't align with our goals and expectations for our educational experience. My journey to ESM is a reminder that it is okay to reassess, to look for a better fit, and to make changes. Both life and regression analysis are iterative processes in which goodness of fit influences the predicted outcomes. It is important to reflect on our experiences and take action when adjustments need to be considered. I encourage you to reflect on how you define success in your graduate journey and ask: does your current path align with that definition? To hold yourself accountable, try setting specific goals at the start of each semester and revisiting them midway through to make modifications if necessary. In both statistics and graduate school, the end game is not just to find any fit, but to find the best fit. When you do, the adjusted R-squared of your experience will be higher, and so will your confidence in achieving your vision of your future.

Whether you are just starting to consider graduate school, evaluating your goodness of fit in a current program, or just wanting to reflect, this YouTube video, Picking the Graduate Program that is Perfect for You, by Dr. Sharon Milgram is full of helpful advice and considerations.

About the Author

I am a current graduate student in the ESM program. My research interests include identity deconstruction and evaluating the use of AI in higher education. I love all things methodology and have a passion for factor analysis.

Filed Under: Evaluation Methodology Blog

Emerging Research Methodologies in the Age of Artificial Intelligence and Big Data

December 15, 2024 by Jonah Hall

By Richard Amoako

As a doctoral student in Evaluation, Statistics, and Methodology (ESM), I am constantly immersed in a world of evolving research methods. Advanced technologies and artificial intelligence (AI) have brought significant shifts to our research space, especially influencing how data is collected, analyzed, and reported. Methodological adaptations prompted by digital advancements shape how researchers address complex questions across disciplines.

Hello! I'm Richard D. Amoako, a third-year doctoral student in the ESM program at the University of Tennessee, Knoxville. In this post, I delve into some research methods and methodologies that are emerging in education and the broader social sciences. By emphasizing methodologies central to my studies, I hope to showcase how technological advancements reshape research. I will start with a discussion contrasting traditional and emerging methods, proceed to an in-depth exploration of internet data mining, and conclude with challenges, ethical considerations, and a look at the future of these exciting developments.

Traditional vs. Emerging Methodologies

The research space has changed significantly in recent years (Selwyn, 2014). Traditional methods such as cross-sectional studies, survey research, longitudinal research, randomized controlled trials, and qualitative interviews have long been the backbone of social science research. These methods have provided valuable insights into human behavior, social phenomena, and educational outcomes. However, the advent of Big Data, AI, and internet-based research has introduced dynamic alternatives that adapt to the digital age’s unique demands and possibilities.

Emerging methodologies, such as data-driven and AI-enhanced methods (including Natural Language Processing, or NLP), adaptive research designs, computational ethnography, crowdsourced data collection, public internet data mining, and multimodal research, reflect a shift toward interdisciplinary work, diverse datasets, and real-time data analysis. NLP, for example, facilitates the analysis of massive datasets, transforming qualitative data analysis through machine learning. Adaptive research designs adjust based on real-time inputs, enabling iterative improvements that are particularly beneficial in health and education. Computational ethnography offers new ways to analyze digital behavior and cultures, making it possible to study online communities on platforms like Reddit or Twitter. Multimodal research combines data from diverse sources, such as text, images, audio, video, physiological signals, and gestures, enabling researchers to gain a richer, more complete understanding of a phenomenon.

Furthermore, crowdsourced data collection and citizen science projects tap into citizen participation, gathering data from thousands of individuals quickly, enabling massive-scale studies that would be excessively costly or impractical using traditional methods. Collectively, these methodologies represent an evolving toolkit for researchers who seek to explore complex phenomena in real-world contexts beyond traditional controlled environments. They not only increase the volume of data available but also democratize the research process, allowing non-scientists to contribute to scientific endeavors.

These emerging methods have immense potential but also present some challenges. AI models, such as NLP, often lack transparency, making it hard to understand how they generate decisions or insights, which can undermine trust. Additionally, big data from sources like crowdsourcing might not always be representative, introducing biases that can limit the accuracy and applicability of the results.

To read more about these methods, I have included some helpful resources at the end of this post for your reference.

Deep Dive into Public Internet Data Mining Methods

In this age of digital data abundance, public internet data mining stands out as a potent research methodology with broad applications across fields like education, technology, and the social sciences. I came across this research approach in one of the readings for my educational data science foundation class. A notable paper that utilized this approach is by Kimmons and Veletsianos (2018). They examined the use of public internet data mining to analyze trends and patterns in online interactions by collecting data from public websites, social media, and forums. Their study highlighted how researchers can work with large datasets by employing tools such as SQL queries, web scraping, or APIs (Application Programming Interfaces) to extract and analyze data from digital platforms.

Public internet data mining opens new avenues for research by enabling researchers to gather large quantities of data from diverse public platforms. For instance, using Python or R, a researcher might automate the extraction of public data, such as tweets or YouTube comments, to examine trends in educational attitudes or analyze discussions surrounding public policies. In one of their studies, Kimmons and Veletsianos (2016) demonstrated how they extracted data from K-12 websites and social media to analyze technology use patterns and engagement in online discussions.

Here, I share how to use web scraping and a web-based API query in R to extract data from publicly accessible websites and to access data in a structured manner through platform-provided APIs.
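The worked example embedded in the original post did not survive conversion, so here is a minimal sketch of both techniques. The scraping target and CSS selector are placeholders, and the API endpoint follows the Urban Institute Education Data Portal pattern cited in the resources below; verify it against the current documentation before relying on it.

```r
library(rvest)    # web scraping
library(httr)     # API requests
library(jsonlite) # JSON parsing

# --- Web scraping with rvest: pull headline text from a public page ---
page      <- read_html("https://example.com")           # placeholder URL
headlines <- page |> html_elements("h1") |> html_text2()
print(headlines)

# --- Structured access through a platform-provided API ---
# Endpoint pattern per the Urban Institute Education Data Portal docs
# (see the resource list below).
resp <- GET("https://educationdata.urban.org/api/v1/college-university/ipeds/directory/2020/")
data <- fromJSON(content(resp, as = "text", encoding = "UTF-8"))
head(names(data$results))  # inspect the fields the API returned
```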

Find other examples here.

For detailed information or training on using R’s rvest package for web scraping, visit here. For an SQL query in R, see here.
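The linked SQL tutorial is not reproduced here, but as a minimal sketch of the idea, the sqldf package lets you run standard SQL directly against an ordinary R data frame (the data below are illustrative):

```r
library(sqldf)  # runs SQL against data frames via SQLite

enrollment <- data.frame(
  school   = c("A", "B", "C"),
  students = c(520, 310, 890)
)

# Standard SQL syntax, executed on the data frame above
sqldf("SELECT school, students FROM enrollment
       WHERE students > 400 ORDER BY students DESC")
```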

In addition to its flexibility, public data mining allows researchers to conduct both quantitative and qualitative analyses, surpassing traditional methods through automated processing and the ability to uncover complex patterns across massive datasets. This method makes it possible to quantify social media engagement metrics, as demonstrated by Kimmons et al. (2017a, 2017b), where they examined higher education institutions' Twitter activity. With its applicability to the social sciences, internet data mining enables real-time monitoring of public sentiment or policy impacts, adding valuable insights that traditional methods may overlook. Through extensive datasets, this approach facilitates exploring subpopulations, such as analyzing student engagement with educational content on different platforms to identify engagement patterns and interests. Unlike traditional methods, where data collection might influence participant behavior, public internet data mining allows researchers to observe and analyze behaviors and interactions as they occur naturally in online spaces.

Challenges and Ethical Considerations

Ethical concerns present a more profound challenge, especially when working with sensitive data that may reveal personal information. Even if the data is publicly available, researchers face dilemmas about privacy and potential harm to participants. While most internet users might not expect their public posts to be aggregated for research, such practices can inadvertently expose them to risks. For example, a study analyzing sentiments toward educational policies could inadvertently expose the identities of specific school districts or teachers if the data were used without careful anonymization. As Kimmons and Veletsianos (2018) note, although such data may not be classified as "human subjects research" (p. 498) by conventional ethical standards, it can nonetheless influence or harm individuals if used irresponsibly. Other challenges include the potential for bias in the data, concerns about data quality, legal issues, and a risk of over-reliance on algorithms and automated tools for data collection and analysis.

Despite their benefits, these emerging methodologies, including internet data mining, raise significant challenges, primarily around the expertise required and the ethical concerns associated with handling large datasets. These methods demand proficiency in various technical skills, such as coding, database management, and API handling, that may be unfamiliar to many researchers. Kimmons and Veletsianos (2018) argue that without interdisciplinary collaboration, researchers may struggle to perform the necessary technical tasks or interpret findings in the appropriate context. For instance, my own experience trying to analyze large-scale social media data highlighted the steep learning curve associated with data cleaning and storage.

Conclusion

Emerging research methodologies in the digital age are remodeling the research space, allowing us to explore real-world phenomena with unprecedented depth. Public internet data mining exemplifies how technology enables the collection and analysis of vast datasets, supporting new ways to examine complex questions in education and beyond. As we integrate these methods into our work, it is crucial to consider the ethical implications and recognize the limitations inherent in using automated and large-scale methods.

As we look to the future, it’s clear that these methodologies will continue to evolve alongside technological advancements. Artificial intelligence and machine learning are likely to play an increasingly significant role in research, potentially automating more aspects of the research process and uncovering patterns that human researchers might miss. However, developing the research methodology of the future relies on our ability to use these innovations thoughtfully, responsibly, and inclusively. By embracing these tools, researchers in all fields can explore vast new territories of knowledge while contributing to ethical practices that respect individual privacy and integrity. I hope that other researchers will be inspired to explore these methodologies and engage critically with the ethical considerations they entail, ultimately contributing to a more inclusive and data-informed research ecosystem.

Resources

Abramson, C. M., Joslyn, J., Rendle, K. A., Garrett, S. B., & Dohan, D. (2018). The promises of computational ethnography: Improving transparency, replicability, and validity for realist approaches to ethnographic analysis. Ethnography, 19(2), 254-284. https://doi.org/10.1177/1466138117725340

Brooker, P. (2022). Computational ethnography: A view from sociology. Big Data & Society, 9(1). https://doi.org/10.1177/20539517211069892

Dataquest. (2020). R API tutorial: Getting started with APIs in R. Retrieved from https://www.dataquest.io/blog/r-api-tutorial/

Javaid, S. (2024). Crowdsourced data collection benefits & best practices. AI Multiple Research. Retrieved from https://research.aimultiple.com/crowdsourced-data/

Keyes, D. (2021). How to scrape data with R. https://rfortherestofus.com/2021/04/how-to-scrape-data-with-r/

Kimmons, R., & Veletsianos, G. (2018). Public internet data mining methods in instructional design, educational technology, and online learning research. TechTrends, 62(5), 492–500. https://doi.org/10.1007/s11528-018-0307-4

Ofosu-Ampong, K. (2024). Artificial intelligence research: A review on dominant themes, methods, frameworks, and future research directions. Telematics and Informatics Reports, 14, 100127. https://doi.org/10.1016/j.teler.2024.100127

Selwyn, N. (2014). Data entry: towards the critical study of digital data and education. Learning, Media and Technology, 40(1), 64–82. https://doi.org/10.1080/17439884.2014.921628

Stryker, C., & Holdsworth, J. (2024). What is NLP (natural language processing)? IBM. Retrieved from https://www.ibm.com/topics/natural-language-processing

Urban Institute. Education Data Portal: https://educationdata.urban.org/documentation/schools.html

YouTube Tutorials

Dean Chereden, How to GET data from an API using R in RStudio: https://www.youtube.com/watch?v=AhZ42vSmDmE

APIs for Beginners 2023 – How to use an API: https://www.youtube.com/watch?v=WXsD0ZgxjRw&t=39s

Automated Web Scraping in R using rvest

Filed Under: Evaluation Methodology Blog

Skills Needed to be a Grant Manager

December 1, 2024 by Jonah Hall

By Paul Kirkland, Ph.D.

Hi! My name is Paul Kirkland, and I am currently the Grant Manager for Monroe County Schools in East Tennessee. I am also an adjunct faculty member for the ESM and EM graduate programs at UTK. I earned my Ph.D. in Educational Psychology and Research with a concentration in Evaluation, Statistics, and Measurement (now called the Evaluation, Statistics, and Methodology program) from the University of Tennessee, Knoxville in 2018. In my professional career, I've served as a high school mathematics teacher, dual enrollment instructor, and research coordinator. In my current role, a large portion of my duties focuses on grant management, Grow Your Own, school safety, and STEM. This post represents my own thoughts and opinions, not those of my employer. In this blog post, I want to reflect on and discuss the skills needed for a career in grant management.

Navigating Complexity through the Eyes of the MAD Hatter

When I began this journey as a Grants Manager in 2021, I had a bad case of Imposter Syndrome and questioned whether I could do something different.  Being a Grant Manager requires juggling many different responsibilities: planning, budgeting, reporting, and communicating with funders and stakeholders.  In my opinion, the mentorship and internship opportunities provided by the ESM program gave me the skill set necessary to fill this position successfully.  The opportunities to conduct real-world evaluation projects, with mentorship and support from the faculty, gave me the confidence to conduct my own grant proposals and evaluations.

What exactly is Grant Management?  One might think of grant management as trying to organize one of Lewis Carroll’s Mad Hatter’s chaotic tea parties with a sense of purpose.  Imagine the Mad Hatter (the grant manager) hosting a tea party where every cup of tea (representing a budget item) has a specific role or purpose.  While ensuring each guest (the people or resources) is at the right place and time, the grant manager has to keep track of all the teapots and plates (the funds) to make sure nothing goes awry.  While the Mad Hatter is known for his chaotic approach, grant management aims to bring order and accountability to this setup.   

In simple terms, Grant Management is managing several types of budgets (all at the same time) where every dollar must have a purpose and be accounted for. This ensures that the funding source is happy, while maintaining eligibility for future grant projects.  The manager must understand and be able to implement the following steps: 

  1. Planning – Outline the project & budget
  2. Budgeting – Track expenditures for approved purposes
  3. Reporting – Basic updates on how the funds are being spent
  4. Compliance – Following the rules set by the grant provider

As the "Mad Hatter," the grant manager needs to keep track of various moving parts, which is very similar to the coursework in any evaluation course.  Every grant report is different and will require you to show "Alice" (the grant provider) that everything is in order and to ensure that the party (the project) fulfills its purpose in an organized, timely, and accountable way.  Through this process, it is imperative to build relationships with the grant providers.  This will make it easier to implement the project if you hit any hiccups along the way.  Subsequently, these relationships will help build a successful grant department.

Walt Disney stated, “We keep moving forward, opening new doors and doing new things, because we’re curious, and curiosity keeps leading us down new paths.”  As you are embracing the fields of methodology, evaluation, statistics, and assessment, I recommend that you do what makes you happy but be open-minded about future opportunities and job growth.  Originally, I would have never thought about having the skill set necessary for the grant management path.  However, this program helped me grow professionally.  I would strongly recommend this field to others.    

If you are interested in this field, here is a list of additional resources I have used: 

  • Grant Professionals Association: https://grantprofessionals.org/ 
  • Grant Learning Center: https://www.grants.gov/learn-grants/ 
  • Foundation Directory Online: https://fconline.foundationcenter.org/ 
  • East Tennessee Foundation: https://easttennesseefoundation.org/ 
  • RJMA Grants Consulting: https://rjma.com/ 
  • Nonprofit Ready: https://www.nonprofitready.org/ 

Filed Under: Evaluation Methodology Blog

Learn About our Evaluation Graduate Programs at UTK!

November 15, 2024 by Jonah Hall

By Jennifer Ann Morrow, Ph.D.

Hi! My name is Jennifer Ann Morrow and I’m the Program Coordinator for the Evaluation Methodology MS program and an Associate Professor in Evaluation Statistics and Methodology at the University of Tennessee-Knoxville. I have been training emerging assessment and evaluation professionals for the past 23 years. My main research areas are training emerging assessment and evaluation professionals, higher education assessment and evaluation, and college student development. My favorite classes to teach are survey research, educational assessment, program evaluation, and statistics. 

Check out my LinkedIn profile: https://www.linkedin.com/in/jenniferannmorrow/  

Are you interested in the field of evaluation? Do you want to earn an advanced degree in evaluation? If your answers are yes, then check out our graduate programs in evaluation here at the University of Tennessee Knoxville. We currently offer two graduate programs, a residential PhD program in Evaluation Statistics and Methodology and a distance education MS program in Evaluation Methodology. There are numerous career paths that an evaluator can take (check out our blog post on this topic!) and earning an advanced degree in evaluation will give you the needed skill sets to be successful in our field. 

Information on the Evaluation Statistics and Methodology PhD program 

Our PhD in Evaluation Statistics and Methodology is a 90-credit residential program that typically takes 4 years to complete (students have up to 8 years to complete their degree). The ESM program is intended for students with education, social science, psychology, economics, applied statistics, and/or related academic backgrounds seeking employment within the growing fields of applied evaluation, assessment, and statistics. While our program is residential, we offer flexibility with evening, online, and hybrid courses. Our PhD program is unique in that it offers focused competency development, theory-to-practice course-based field experiences, theory-to-practice internships targeted to student interests, highly experienced and engaged faculty, and regular access to one-on-one faculty support and guidance. Applications are due on December 1 each year (priority deadline); however, applicants may still apply through April 1 with the understanding that funding and space may be limited the later one applies. Our curriculum is listed below. If you have any questions about our ESM PhD program, please contact our program coordinator, Dr. Louis Rocconi.

ESM Core Courses (15 credit hours) 

  • ESM 533 – Program Evaluation I  
  • ESM 534 – Program Evaluation II 
  • ESM 577 – Statistics in Applied Fields I 
  • ESM 677 – Statistics in Applied Fields II 
  • ESM 581 – Educational Assessment 

Advanced ESM Core (12 credit hours) 

  • ESM 651 – Advanced Seminar in Evaluation 
  • ESM 678 – Statistics in Applied Fields III 
  • ESM 680 – Advanced Educational Measurement and Psychometrics  
  • ESM 667 – Advanced Topics  

Research Core (15 credit hours) 

  • ESM 583 – Survey Research 
  • ESM 559 – Introduction to Qualitative Research in Education  
  • ESM 659 – Advanced Qualitative Research in Education  
  • ESM 682 – Educational Research Methods  
  • 3 credit hours of approved graduate research electives selected in consultation with the major advisor 

Applied Professional Experience (15 credit hours) 

  • ESM 660 (9 credit hours) – Research Seminar 
  • ESM 670 (6 credit hours) – Internship 

Electives (9 credit hours) selected in consultation with the major advisor 

Dissertation/Research (24 credit hours) 

  • ESM 600 – Doctoral Research & Dissertation  
  • Students will enroll in a minimum total of 24 credit hours of dissertation at the conclusion of their coursework. 

Information on the Evaluation Methodology Distance Education MS Program 

Our MS in Evaluation Methodology is a 30-credit distance education program in which all courses are taught asynchronously. Our program prepares professionals who are seeking to enhance their skills and develop new competencies in the rapidly growing field of evaluation methodology. The program is designed to be completed in two years (6 credits, 2 classes per semester); however, students may take up to six years to complete their degree. Courses in the Evaluation Methodology program are taught by experienced professionals in the field of evaluation. Our instructors work as evaluation professionals, applied researchers, and full-time evaluation faculty, many of whom have won prestigious teaching awards and routinely earn positive teaching evaluations. Applications are due by July 1 each year. Check out our curriculum listed below. If you have any questions about the EM MS program, please contact our program coordinator, Dr. Jennifer Ann Morrow.

Required Courses: 27 Credit Hours 

  • ESM 533 – Program Evaluation I 
  • ESM 534 – Program Evaluation II 
  • ESM 559 – Introduction to Qualitative Research in Education 
  • ESM 560 – Evaluation Designs and Data Collection Methods 
  • ESM 570 – Disseminating Evaluation Results 
  • ESM 577 – Statistics in Applied Fields I 
  • ESM 583 – Survey Research 
  • ESM 590 – Evaluation Practicum I 
  • ESM 591 – Evaluation Practicum II 

Electives: 3 Credit Hours 

  • ESM 581 – Educational Assessment 
  • ESM 677 – Statistics in Applied Fields II 
  • ESM 672 – Teaching Practicum in Evaluation, Statistics, & Methodology 
  • ESM 682 – Educational Research Methods 
  • Or another distance education course approved by the program coordinator 

Resources: 

ESM PhD and EM MS Admission information: https://cehhs.utk.edu/elps/admissions-information/ 

ESM PhD program information: https://cehhs.utk.edu/elps/academic-programs/evaluation/evaluation-statistics-methodology-phd/ 

EM MS program information: https://cehhs.utk.edu/elps/academic-programs/evaluation/evaluation-methodology-concentration-masters-in-education-online/ 

UTK’s MAD with Measures Blog: https://cehhs.utk.edu/elps/academic-programs/evaluation/evaluation-methodology-blog/ 

UTK Graduate School: https://gradschool.utk.edu/ 

UTK Admissions for International Students: https://gradschool.utk.edu/future-students/office-of-graduate-admissions/applying-to-graduate-school/admissions-for-international-students/ 

Questions about your UTK Graduate School application: https://gradschool.utk.edu/future-students/office-of-graduate-admissions/contact-graduate-admissions/ 

UTK Vols Online: https://volsonline.utk.edu/  

Applying to graduate school: https://www.apa.org/education-career/grad/applying 

How to apply to grad school: https://blog.thegradcafe.com/how-to-apply-to-grad-school/  

Filed Under: Evaluation Methodology Blog

Hazing Prevention Study Expands

November 11, 2024 by Jonah Hall

Courtesy of the College of Education, Health, & Human Sciences

Penn State’s Timothy J. Piazza Center for Fraternity and Sorority Research has expanded a national hazing prevention study to include nine more campuses. The WhatWorks study emphasizes the prevention of hazardous drinking, hazing and other resulting behaviors, with the goal of changing student, organization and campus culture. 

The newest cohort includes Auburn University; Bowling Green State University; California Polytechnic State University, San Luis Obispo; Mississippi State University; Virginia Tech; the University of Alabama; the University of Kentucky; the University of Missouri; and the University of Tennessee. 

“This thorough volume is the result of a collaborative effort to study hazing from secondary school to higher education,” said Patrick Biddix, Professor of Higher Education at the University of Tennessee, Knoxville.  “It is one of the most comprehensive research projects on hazing prevention, featuring a new definition of hazing and clinical strategies for education and prevention. The findings are influencing national prevention initiatives like the What Works study at Penn State University and are being showcased in various national workshops and presentations.”


Biddix is Jimmy and Ileen Chee Endowed Professor of Higher Education in the Department of Educational Leadership and Policy Studies in the College of Education, Health, and Human Sciences. He is a leading authority in fraternity and sorority research. His 50 academic publications have been cited over 630 times by scholars and researchers.

“We’re glad to partner with the Piazza Center and our peers on this project, not only to participate in the development of best practices, but also to benefit from the research-driven principles identified,” said Steven Hood, vice president for student life at the University of Alabama. “Enhancing and supporting student safety and well-being are at the forefront of everything we do, so we consider this project important in forecasting the best path forward for universities like ours with robust fraternity and sorority communities.” 

The WhatWorks study, a partnership with the WITH US Center for Bystander Intervention at California Polytechnic State University and the Gordie Center at the University of Virginia, is designed with top prevention and content experts from behavioral health, psychology and higher education. The study allows participating campuses to implement comprehensive hazing prevention programs. Participating institutions work with the Piazza Center and partners to test and validate effective methods of hazing prevention over a three-year assessment cycle. 

“We are building campuses’ capacity to implement effective prevention that increases student safety,” said Stevan Veldkamp, executive director of Penn State’s Piazza Center, a unit in the division of Student Affairs. “The study aims to build comprehensive prevention programs and assess them with precision to ultimately eliminate life-threatening incidents.” 

The WhatWorks study is being led by Robert Turrisi, professor of biobehavioral health and prevention research at Penn State. Turrisi, along with Patrick Biddix, professor of higher education at the University of Tennessee, will work with each cohort member to design research-informed prevention strategies. 

Filed Under: News

