Educational Leadership and Policy Studies


ACED Students & Faculty Attend 2025 AAACE Conference

November 3, 2025 by Jonah Hall

Faculty and students from the Adult & Continuing Education (ACED) PhD and master’s programs participated actively, both virtually and onsite, at the 2025 American Association for Adult and Continuing Education (AAACE) Annual Conference! AAACE is recognized as the leading national and international organization for adult education professionals.

The conference was held virtually on September 26, 2025, and in person in Cincinnati, Ohio, October 6-10. Faculty members and mentors Qi Sun, Mitsunori Misawa, and Jennifer Kobrin not only presented their own research but also guided and collaborated with PhD and master’s students on research projects. They worked closely with ACED PhD students Rosite Delgado, Billie McNamara, Dan Wang, Lauren Davenport, Georgette Samaras, and Steven Henley. Master’s students Kortney Jarman and Janie Swanger also wrote and submitted proposals that were presented at the conference, receiving positive feedback. Multiple students presented their own research projects, gaining experience in academic writing and developing their professional presentation skills!

During the virtual conference, Qi Sun delivered a presentation on lifelong learning policy, using China’s adult and continuing education movement and reform as a context and case study to highlight emerging trends in standardization, accessibility, digital transformation, and holistic human development. Additionally, she collaborated with PhD candidate Rosite Delgado on a study titled Exploring the Multiple Dimensions of Faculty Wellness in Higher Education: A Holistic Support Approach, and with PhD student Dan Wang on their research titled Enhancing Cross-Cultural Awareness and Intercultural Communication: Experiential Learning in Teaching Chinese as a Second Language, which Dan Wang presented onsite on their behalf. 

From left to right: Jennifer Kobrin, Dan Wang, & Lauren Davenport at the AAACE 2025 Annual Conference.

From left to right: Mitsunori Misawa, Patricia Higgins, Janie Swanger, and Kortney Jarman at the AAACE 2025 Annual Conference.

Furthermore, Kobrin and PhD student Lauren Davenport presented a session titled “From Exclusion to Empowerment: Supporting Older Adults’ Learning Technology in Nonformal Settings.” Their presentation drew from their ongoing research project exploring how older adults engage with technology, privacy, and digital literacies in community-based, nonformal learning settings. Misawa collaborated with MS graduate Kortney Jarman to co-present “Exploring Workforce Development from Holistic Approaches,” and with master’s student Janie Swanger to co-present “Self-Directed Learning in OBGYN Residency: The Intersection of Medical and Adult Education.”

Some students, such as Georgette Samaras and Billie McNamara, presented their research at their respective commission sessions. Georgette Samaras presented “Mind the Gap: A Psychology Technician Pathway for Workforce Success” at the Commission for Workforce and Professional Development. Billie McNamara presented at the Commission for International Adult Education (CIAE). We are also very proud that Billie has served as an editor for the CIAE proceedings of the AAACE Annual Conference.

PhD Student Georgette Samaras presenting her research at the 2025 AAACE Annual Conference.

Misawa has recently joined the AAACE Board of Directors, serving as the Director of the Commission for Workforce and Professional Development. The annual AAACE Conference convenes educators, scholars, and practitioners from across the nation and internationally to exchange research and promote the advancement of adult and continuing education. The upcoming conference is scheduled to take place in Chattanooga, Tennessee, from October 5 to 9, 2026. The faculty anticipate the opportunity to involve more of our program students in this esteemed event!

Filed Under: News

Bartlett, McGuigan, & Miller Join ELPS this Fall as New Faculty Members

August 29, 2025 by Jonah Hall

The Department of Educational Leadership & Policy Studies is excited to announce that three faculty members joined our department this Fall! Dr. Caroline Bartlett, Dr. Allie McGuigan, and Dr. Ryan Miller each joined the ELPS team following successful faculty searches earlier this year.

First, Dr. Caroline Bartlett joined ELPS as an Assistant Professor! Her research uses quantitative, qualitative, and mixed methods to understand how education policies enhance or constrain educational opportunities for historically underserved groups of students, with a particular focus on multilingual students classified as English learners (ELs). Feel free to check out her CV here!

Her research has been supported by a National Academy of Education/Spencer Foundation Dissertation Fellowship. She holds a Ph.D. in Education Policy and K-12 Educational Leadership from Michigan State University and an M.P.A. with a focus in Education Policy Analysis from Texas A&M University. She teaches education policy and politics. Before her Ph.D., Caroline worked as an English as a Second Language and English teacher!

“I’m excited to begin my faculty career here as an Assistant Professor in ELPS. It has been a pleasure to get to know the department’s outstanding scholars, staff, and EdD students,” said Dr. Bartlett. “I look forward to continuing my research in education policy, law, and finance, while teaching in the EdD program and working and learning alongside EdD students who are engaged with pressing policy issues across the country.”

Next, Dr. Allie McGuigan joined ELPS as an Associate Professor of Practice and the Coordinator of our Higher Education Administration Master’s Program! She obtained her doctorate and graduate certificate in institutional research through Penn State, and her master’s in postsecondary educational leadership and student affairs from San Diego State University. Her research, which focuses on online education, examines relationship building, engagement, and connection for online learners. Dr. McGuigan’s CV can be found here!

Allie’s professional interests span numerous student affairs offices, and she has experience in residence life, new student orientation, summer bridge programs, academic advising, student life and leadership, and more. She also served on the Board of Trustees of The Pennsylvania State University, which developed her interest in university governance and administration. Allie enjoys teaching courses related to college student development, governance, higher education law, and more – and working with online students to help them integrate into their online campuses.

“I am thrilled to join the faculty at UTK and to coordinate the online master’s in higher education administration program. From my very first conversations with students and colleagues, it is clear that this is a fantastic department and program to work in and learn from,” said Dr. McGuigan. “I’ve worked in online program coordination for nearly a decade, and I am eager to use my experience to help continue to grow this program alongside such talented colleagues. I especially look forward to forming meaningful relationships with students and being a small part of their academic and professional journeys within higher education.”

Lastly, Dr. Ryan Miller joined ELPS as a Professor of Higher Education and the Coordinator of our Higher Education Administration Ph.D. Program! Informed by his background as a student affairs practitioner and first-generation college graduate, Ryan’s scholarship focuses on the experiences of minoritized social groups in higher education (primarily LGBTQ+ and disabled students). He has produced more than 70 publications on these topics in outlets including AERA Open, Journal of Higher Education, Journal of College Student Development, The Review of Higher Education, and Journal of Diversity in Higher Education. He teaches courses on student affairs administration, college student development, and research design. Dr. Miller’s Google Scholar can be viewed here!

Nationally, Ryan serves as vice chair for the Council for the Advancement of Higher Education Programs and associate editor (and former editor) of the College Student Affairs Journal. He was named an Emerging Scholar for ACPA College Student Educators International and is a former Fellow of the University of California National Center for Free Speech and Civic Engagement. Ryan’s research has been funded by the National Science Foundation, John M. Belk Endowment, ACPA, and NASPA.

Prior to joining the University of Tennessee, Ryan was a tenured faculty member at the University of North Carolina at Charlotte and an administrator at the University of Texas at Austin and the University of North Florida. He holds graduate degrees in higher education administration, including a Ph.D. from UT Austin and a master’s degree from the Harvard Graduate School of Education. Ryan received the Melvene D. Hardee Dissertation of the Year Award from NASPA and the Outstanding Publication Award from the NASPA Faculty Assembly, and he was named the Bonnie E. Cone Early Career Professor in Teaching at UNC Charlotte.

“I’m excited to continue my faculty career at the University of Tennessee and to work with outstanding graduate students, faculty, and staff,” said Dr. Miller. “The higher education administration Ph.D. program has a rich history and strong reputation nationwide, and I’m looking forward to building on the success of the program while I also continue my research agenda at UTK. I’m teaching first-year doctoral students this semester and have found them to be enthusiastic, intellectually curious, and interested in shaping the future of the field.”

The Department of Educational Leadership & Policy Studies is excited to welcome three new faculty members to the department as we continue our work to enrich the knowledge, skills, and values requisite to effective leadership, teaching, and research in educational settings. ELPS prepares administrators for schools and colleges, faculty for colleges and universities, and policy scholars for service in state, regional and national agencies associated with educational and human service enterprises. For more information on our programs, please visit this page!

Filed Under: News

Morrow, Angelle, & Cervantes Recently Return from BELMAS

July 21, 2025 by Jonah Hall

ELPS faculty members Dr. Jennifer Ann Morrow and Dr. Pamela Angelle recently returned from Brighton, England alongside Higher Education Administration (HEAM) PhD student, Abraham Cervantes, where they presented research at the annual BELMAS Conference!

As stated on the BELMAS homepage, the British Educational Leadership Management and Administration Society (BELMAS) is the Learned Society dedicated to advancing educational leadership. It is a membership organization made up of individual members working across research and practice in all areas of the field. Their members “come from a wide range of backgrounds – from academic researchers to school and system leaders – all committed to advancing understanding and practice in the field.”

Dr. Morrow and Dr. Angelle shared a presentation titled “Artificial Intelligence (AI) and Research: Terrifying or Terrific?” and Cervantes presented “The Politics of Identity: How ‘Latinx’ Reflects the Tension Between Academia and Culture” at the conference!

“BELMAS was a great opportunity to present our work to an international audience. If you get the chance, I highly recommend presenting at international conferences; it is a great way to network with other researchers from around the world and to gain different perspectives on your work.”

-Jennifer Ann Morrow, Ph.D.

Filed Under: News, Uncategorized

Serving with Purpose: Lessons Learned from Consulting in Assessment and Research

July 15, 2025 by Jonah Hall

By Jerri Berry Danso


I’m Jerri Berry Danso, a first-year doctoral student in the Evaluation, Statistics, and Methodology (ESM) program at the University of Tennessee, Knoxville. Before beginning this new chapter, I spent over a decade working in higher education assessment: first as the Director of Assessment for the College of Pharmacy at the University of Florida, and later in Student Affairs Assessment and Research. During those years I learned how purposeful data work can illuminate student learning, sharpen strategic planning, and strengthen institutional effectiveness. Across these roles, I collaborated with faculty, staff, and administrators on a wide range of projects, where I supported outcomes assessment, research design, program evaluation, and data storytelling.

Whether it was designing a survey for a student services office or facilitating a department’s learning outcomes retreat, I found myself consistently in the role of consultant: a partner and guide, helping others make sense of data and translate it into action. Consulting, I’ve learned, is not just about expertise; it also requires curiosity, humility, and a service mindset. And like all forms of service, it is most impactful when done with purpose. My goal in this post is to share the values and lessons that shape my approach so you can adapt them to your own practice.

What Does It Mean to Consult? 

In our field, we often engage in informal consulting more than we realize. Consulting, at its core, is the act of offering expertise and guidance to help others solve problems or make informed decisions. In the context of research, evaluation, assessment, and methodology, this can involve interpreting data, advising on survey design, facilitating program evaluation, or co-creating strategies for data-informed improvement.

I define consulting not only by what we do, but also by how we do it – through relationships built on trust, clarity, and mutual respect. If you’ve ever had someone turn to you for guidance on a research or assessment issue because of your experience, congratulations! You’ve already engaged in consulting. 

My Core Consulting Values

My foundation as a consultant is rooted in an early lesson from graduate school. While earning my first master’s degree in Student Personnel in Higher Education, I took a counseling skills course that fundamentally shaped how I interact with others. We were taught a core set of helping behaviors: active listening, empathy, reflection, open-ended questioning, and attention to nonverbal cues. Though designed for future student affairs professionals, these skills have served me equally well in consulting settings. 

From that experience, and years of practice, my personal consulting values have emerged: 

  • Empathy: Understanding what matters to the client, listening deeply, and genuinely caring about their goals. 
  • Integrity: Being transparent, honest, and grounded in ethical principles, especially when working with data. 
  • Collaboration: Co-creating solutions with clients and recognizing that we are partners, not saviors. 
  • Responsibility: Taking ownership of work, meeting commitments, and communicating clearly when plans change. 
  • Excellence: Striving for quality in both process and product, whether that product is a report, a workshop, or a relationship.

These values are my compass. They help me navigate difficult decisions, maintain consistency, and most importantly, deliver service that is thoughtful and human-centered. 

Lessons from the Field

Over the years, I’ve learned that the best consultants don’t just deliver technical expertise. They cultivate trust. Here are a few key lessons that have stuck with me:

  1. Follow through on your promises. If you say you’ll deliver something by a certain date, do it, or communicate early if something changes. Reliability builds credibility and fosters trust in professional relationships. 
  2. Set expectations early. Clarify what you will provide and what you need from your client to be successful. Unmet expectations often stem from assumptions left unspoken. 
  3. Stick to your values. Never compromise your integrity. For example, a client asked me to “spin” data to present their program in a more favorable light. I gently reminded them that our role was to find truth, not polish it, and that honest data helps us improve. 
  4. Anticipate needs. When appropriate, go a step beyond the request. In one project, I created a detailed methodology plan that the client hadn’t asked for. They later told me it became a key reference tool throughout the project. 
  5. Adapt your communication. Know your audience. Avoid overwhelming clients with technical jargon, but don’t oversimplify in a way that’s condescending. Ask questions, check for understanding, and create space for curiosity without judgment. 

The Art of Service

Good consulting is about more than solving problems; it is equally about how you show up for others. What I’ve come to call the art of service is an intentional approach to client relationships grounded in care, curiosity, and a commitment to helping others thrive. This includes:

  • Practicing empathy and active listening  
  • Personalizing communication and building rapport 
  • Going beyond what’s expected when it adds value 
  • Continuously reflecting on your approach and improving your craft 

These principles align closely with literature on counseling and helping relationships. For instance, Carl Rogers (1951) emphasized the power of empathy, congruence, and unconditional positive regard. These are qualities that, when applied in consulting, build trust and facilitate honest conversations. Gerald Egan (2014), in The Skilled Helper, also highlights how intentional listening and support lead to more effective outcomes. 

A Call to Aspiring Consultants 

You don’t need “consultant” in your job title to serve others through your expertise. Whether you’re a graduate student, an analyst, or a faculty member, you can bring consulting values into your work, especially in the measurement, assessment, evaluation, and statistics fields, where collaboration and service are central to our mission.

So, here’s my invitation to you:

Take some time to define your own values. Reflect on how you show up in service to others. Practice listening more deeply, communicating more clearly, and delivering with care. The technical side of our work is vital, but the human side? That’s where transformation happens. 

Resources for Further Reading

  • Egan, G. (2014). The Skilled Helper: A Problem-Management and Opportunity-Development Approach to Helping (10th ed.). Cengage Learning. 
  • Rogers, C. R. (1951). Client-Centered Therapy: Its Current Practice, Implications and Theory. Houghton Mifflin. 
  • Block, P. (2011). Flawless Consulting: A Guide to Getting Your Expertise Used (3rd ed.). Wiley. 
  • Kegan, R., & Lahey, L. L. (2016). An Everyone Culture: Becoming a Deliberately Developmental Organization. Harvard Business Review Press. 

Filed Under: Evaluation Methodology Blog

Navigating Ambiguity and Asymmetry: from Undergraduate to Graduate Student and Beyond

June 15, 2025 by Jonah Hall

By Jessica Osborne, Ph.D. and Chelsea Jacobs

Jessica is the Principal Evaluation Associate for the Higher Education Portfolio at The Center for Research Evaluation at the University of Mississippi. She earned a PhD in Evaluation, Statistics, and Measurement from the University of Tennessee, Knoxville, an MFA in Creative Writing from the University of North Carolina, Greensboro, and a BA in English from Elon University. Her main areas of research and evaluation are undergraduate and graduate student success, higher education systems, needs assessments, and intrinsic motivation. She lives in Knoxville, TN with her husband, two kids, and three (yes, three…) cats. 

My name is Chelsea Jacobs, and I’m a PhD student in the Evaluation, Statistics, and Methodology (ESM) program at the University of Tennessee, Knoxville. I’m especially interested in how data and evidence are used to inform and improve learning environments. In this post, I’ll share reflections — drawn from personal experience and professional mentorship — on navigating the ambiguity and asymmetry that often define the transition from undergraduate to graduate education. I’ll also offer a few practical tips and resources for those considering or beginning this journey. 

Transitioning from undergraduate studies to graduate school is an exciting milestone, full of possibilities and challenges. For many students, it also marks a shift in how success is measured and achieved. We — Jessica Osborne, PhD, Principal Evaluation Associate at The Center for Research Evaluation at the University of Mississippi, and Chelsea Jacobs, PhD student at the University of Tennessee — have explored these topics during our professional networking and mentoring sessions. While ambiguity and asymmetry may exist in undergraduate education, they often become more pronounced and impactful in graduate school and professional life. This post sheds light on these challenges, offers practical advice, and points prospective graduate students to resources that can ease the transition. 

From Clarity to Exploration: Embracing Ambiguity in Graduate Education 

In undergraduate studies, assessments often come in the form of multiple-choice questions or structured assignments, where answers are concrete and feedback is relatively clear-cut. From a Bloom’s Taxonomy perspective, this often reflects the “remembering” domain. Success may align with effort — study hard, complete assignments, and you’ll likely earn good grades. 

Graduate school, however, introduces a level of ambiguity that can be unexpectedly challenging. Research projects, thesis writing, and professional collaborations often lack clear guidelines or definitive answers. Feedback becomes more subjective, reflecting the complexity and nuance of the work. For example, a research proposal may receive conflicting critiques from reviewers, requiring students to navigate gray areas with the support of advisors, peers, and faculty. 

These shifts are compounded by a structural difference: while undergraduates typically have access to dedicated offices and resources designed to support their success, graduate students often face these challenges with far fewer institutional supports. This makes it all the more important to cultivate self-advocacy, build informal support networks, and learn to tolerate uncertainty. 

Though ambiguity can feel overwhelming, it’s also an opportunity to develop critical thinking and problem-solving skills. Graduate school encourages asking deeper questions, exploring multiple perspectives, and embracing the process of learning rather than focusing solely on outcomes. 

How to Navigate Ambiguity 

Embrace the Learning Curve: Ambiguity is not a sign of failure but a necessary condition for growth—it pushes us beyond routine practice and encourages deeper, more flexible thinking. Seek opportunities to engage with complex problems, even if they feel overwhelming at first, as these moments often prompt the most meaningful development. 

Ask for Guidance: Don’t hesitate to seek clarification from advisors, peers, or those just a step ahead in their academic journey. Opening up about your struggles can reveal how common they are — and hearing how others have navigated doubt or setbacks can help you build the resilience to keep moving forward. Graduate school can be a collaborative space, and connection can be just as important as instruction. 

In the ESM program at UTK, we’re fortunate to be part of a collaborative, non-competitive graduate environment. This isn’t the case for all graduate programs, so it’s an important factor to consider when choosing where to study. 

Uneven Roads: Embracing the Asymmetry of Growth 

As an undergraduate, effort is often emphasized as the key to success, but the relationship between effort and outcome isn’t always straightforward. Study strategies, access to resources, prior preparation, and support systems all play a role — meaning that even significant effort doesn’t always lead to the expected results. Still, at the undergraduate level the formula largely holds: study hard, complete assignments, and you’ll likely earn good grades.

In graduate school and professional life, this symmetry can break down. You might invest months into a research paper, only to have it rejected by a journal. Grant proposals, job applications, and conference submissions often yield similar results—hard work doesn’t always guarantee success, but it does guarantee learning. 

This asymmetry can be disheartening, but it mirrors the realities of many professional fields. Learning to navigate it is crucial for building resilience and maintaining motivation. Rejection and setbacks are not personal failures but part of growth. 

How to Handle Asymmetry 

Redefine Success: Focus on the process rather than the outcome. Every rejection is an opportunity to refine your skills and approach. 

Build Resilience: Mistakes, failures, and rejection are not just normal—they’re powerful learning moments. These experiences often reveal knowledge or skill gaps more clearly than success, making them both memorable and transformative. Cultivating a growth mindset helps reframe setbacks as essential steps in your development. 

Seek Support: Surround yourself with a network of peers, mentors, and advisors who can offer perspective and encouragement. 

Resources for Prospective Graduate Students 

Workshops and seminars can help students build essential skills — offering guidance on research methodologies, academic writing, and mental resilience. 

Here are a few resources to consider: 

  • Books: Writing Your Journal Article in Twelve Weeks by Wendy Laura Belcher is excellent for developing academic writing. The Writing Workshop, recommended by a University of Michigan colleague, is a free, open-access resource. 
  • Research Colloquium: UTK students apply research skills in a colloquium setting. See Michigan State University’s Graduate Research Colloquium for a similar example. These events are common — look into what your institution offers. 
  • Campus Resources: Don’t overlook writing centers, counseling centers, and mental health services. For example, Harvard’s Counseling and Mental Health Services provides a strong model. Explore what’s available at your school. 
  • Professional Networks: Join organizations or online communities in your field. This can lead to mentorship, which is invaluable — and worthy of its own blog post. 

Final Thoughts 

Ambiguity and asymmetry are not obstacles to be feared but challenges to be embraced. They help develop the critical thinking, problem-solving, and resilience needed for both graduate school and a fulfilling professional career. By understanding these aspects and using the right resources, you can navigate the transition with confidence. 

To prospective graduate students: welcome to a journey of growth, discovery, and MADness — Meaningful, Action-Driven exploration of methods and measures. We’re excited to see how you’ll rise to the challenge. 

Filed Under: Evaluation Methodology Blog

My Journey In Writing A Bibliometric Analysis Paper

June 1, 2025 by Jonah Hall

Hello everyone! I am Richard D. Amoako, a third-year doctoral student in Evaluation, Statistics, and Methodology at the University of Tennessee, Knoxville. I recently completed a bibliometric analysis paper for my capstone project on Data Visualization and Communication in Evaluation. Bibliometrics offers a powerful way to quantify research trends, map scholarly networks, and identify gaps in the literature. It is an invaluable research method for evaluators and researchers alike.

Learning bibliometrics isn’t always straightforward. Between choosing the right database, wrangling APIs, and figuring out which R or Python packages won’t crash your laptop, there’s a steep learning curve. That’s why I’m writing this: to share the lessons, tools, and occasional frustrations I’ve picked up along the way. Whether you’re an evaluator looking to map trends in your field or a researcher venturing into bibliometrics for the first time, I hope this post saves you time, sanity, and a few coding headaches. Let’s explore the methodology, applications, and resources that shaped my project. 

Understanding Bibliometric Analysis 

Bibliometric analysis is the systematic study of academic publications through quantitative methods, examining citations, authorship patterns, and keyword frequencies to reveal research trends. It differs from traditional literature reviews by delivering data-driven insights into how knowledge evolves within a field. Common applications include identifying influential papers, mapping collaboration networks, and assessing journal impact (Donthu et al., 2021; Van Raan, 2018; Zupic & Čater, 2015).

For evaluators, this approach is particularly valuable. It helps track the adoption of evaluation frameworks, measure scholarly influence, and detect emerging themes, such as how data visualization has gained traction in recent years. My interest in bibliometrics began while reviewing literature for my capstone project. Faced with hundreds of papers, I needed a way to analyze trends objectively rather than rely on subjective selection. Bibliometrics provided that structure, turning scattered research into actionable insights.

Key Steps in Writing a Bibliometric Paper 

Defining Research Objectives 
The foundation of any successful bibliometric study lies in crafting a precise research question. For my capstone on data visualization in evaluation literature, I focused on: “How has the application of data visualization techniques evolved in program evaluation research from 2010 to 2025?” This specificity helped me avoid irrelevant data while maintaining analytical depth. Before finalizing my question, I reviewed existing systematic reviews to identify underexplored areas – a crucial step that prevented duplication of prior work. When brainstorming and refining your question, generative AI tools (ChatGPT, Claude, Perplexity, Google Gemini, Microsoft Copilot, DeepSeek, etc.) can help you enhance and clarify your ideas.

Database Selection and Data Collection 
Choosing the right database significantly impacts study quality. After comparing options, I selected Scopus for its comprehensive coverage of social science literature and robust citation metrics. While Web of Science (WoS) offers stronger impact metrics, its limited coverage of evaluation journals made it less suitable, though I did examine its potential applications. Google Scholar’s expansive but uncurated collection proved too noisy for systematic analysis. Scopus’s ability to export 2,000 records at once and to include metadata such as author affiliation and country proved invaluable for my collaboration mapping.

Data Extraction and Automation 
To efficiently handle large datasets, I leveraged R’s bibliometrix package and automated my data extraction with the Scopus API (Application Programming Interface). APIs enable software systems to communicate with each other; researchers can use them to pull database records (from Scopus, WoS, and others) without manual downloading. To access the Scopus database programmatically, request access via Elsevier’s Developer Portal.

Pros: Good for large-scale scraping. Cons: Requires API key approval (can take days or weeks).  
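
As one way to set this up, the rscopus package wraps the Scopus Search API. Here is a minimal R sketch, assuming you have an approved API key; the query string and record cap are illustrative, not my exact settings: 

library(rscopus)       # R client for Elsevier/Scopus APIs

set_api_key("YOUR_ELSEVIER_KEY")   # key obtained from Elsevier's Developer Portal

# Query the Scopus Search API; max_count caps how many records are pulled
res <- scopus_search(query = 'TITLE-ABS-KEY("data visualization" AND "evaluation")',
                     max_count = 2000)

# Flatten the returned entries into a data frame for downstream analysis
df <- gen_entries_to_df(res$entries)$df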

For targeted bibliometric searches, carefully construct your keyword strings using Boolean operators (AND/OR/NOT) and field tags like TITLE-ABS-KEY() to balance recall and precision. For example, my search TITLE-ABS-KEY("data visualization" AND "evaluation") retrieved 37% more relevant papers than a simple keyword search by excluding off-topic mentions in references.
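
A slightly fuller query in Scopus’s advanced-search syntax might look like the following (the terms and date limits here are illustrative, not my exact string): 

TITLE-ABS-KEY("data visualization" AND ("program evaluation" OR "evaluation")) AND PUBYEAR > 2009 AND PUBYEAR < 2026 AND DOCTYPE(ar)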

After exporting Scopus results to CSV, a simple script converted and analyzed the data (Aria & Cuccurullo, 2017): 

library(bibliometrix)

# Convert the Scopus CSV export into a bibliometrix data frame
M <- convert2df("scopus.csv", dbsource = "scopus", format = "csv")

# Run the core descriptive bibliometric analysis
results <- biblioAnalysis(M)

This approach provided immediate insights into citation patterns and author networks.  
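
For a quick descriptive overview at this stage, bibliometrix’s summary and plot methods work directly on that results object; a small sketch (k is simply the number of rows to display): 

# Top authors, sources, and annual production; k controls how many rows are shown
S <- summary(results, k = 10, pause = FALSE)
plot(x = results, k = 10, pause = FALSE)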

Data Screening and Cleaning 
The initial search may return many papers; my search returned over 2,000. To narrow down the most relevant articles, you can apply filters such as: 

  1. Removing duplicates via DOI matching (duplicates are common in multi-database studies; see the R sketch after this list) 
  2. Excluding non-journal articles 
  3. Excluding irrelevant articles that do not match your research questions or inclusion criteria 
  4. Manually reviewing random samples to verify relevance 
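
A minimal R sketch of the first three filters, using bibliometrix’s field tags (DO = DOI, DT = document type, TI = title, AB = abstract); the keyword screen shown is a crude illustration, not a substitute for criteria-based screening: 

library(dplyr)

M <- M[!duplicated(M$DO), ]        # 1. remove duplicate records by DOI
M <- filter(M, DT == "ARTICLE")    # 2. keep journal articles only
M <- filter(M, grepl("visualiz", TI, ignore.case = TRUE) |
               grepl("visualiz", AB, ignore.case = TRUE))   # 3. rough relevance screen on title/abstract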

Additional data cleaning may be required; I use R’s tidyverse, janitor, and dplyr packages for these tasks.

The screening process can be overwhelming and time-consuming if performed manually. Fortunately, several tools and websites are available to assist with this task, including abstrackr, covidence.org, rayyan.ai, ASReview, Loonlens.com, and Nested Knowledge. These tools require well-defined inclusion and exclusion criteria, so it is essential to have thoroughly considered criteria in place before you begin. Among these tools, my preferred choice is Loonlens.com, which automates the screening process based on the specified criteria and generates a CSV file with decisions and reasons upon completion.

Analysis and Visualization  

Key analytical approaches included the following (refer to the appendices for R code):

  • Citation analysis to identify influential works 
  • Co-authorship network mapping to reveal collaboration patterns 
  • Keyword co-occurrence analysis to track conceptual evolution 
  • Country and institution analysis to identify geographical collaborations and impacts 

For visualization, VOSviewer creates clear keyword co-occurrence maps, while CiteSpace helps identify temporal trends. The bibliometrix package streamlined these analyses, with functions like conceptualStructure() revealing important thematic connections. Visualization adjustments (like setting minimum node frequencies) transformed initial “hairball” network diagrams into clear, interpretable maps.  
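
As an illustration of the bibliometrix route, a keyword co-occurrence map can be built along these lines (the field, node count, and layout here are illustrative choices, not my exact settings): 

# Keyword co-occurrence network from the keyword field
NetMatrix <- biblioNetwork(M, analysis = "co-occurrences", network = "keywords", sep = ";")

# Plot only the 50 most frequent keywords to avoid a "hairball" diagram
networkPlot(NetMatrix, n = 50, type = "fruchterman", Title = "Keyword co-occurrence", labelsize = 0.7)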

This structured approach – from precise question formulation through iterative visualization – transformed a potentially overwhelming project into manageable stages. The automation and filtering strategies proved particularly valuable, saving countless hours of manual processing while ensuring analytical rigor.

All the R code I used for data cleaning, analysis, and visualization is available on my GitHub repository. 

Challenges & How to Overcome Them 

Bibliometric analysis comes with its fair share of hurdles. Early in my project, I hit a major roadblock when I discovered many key papers were behind paywalls. My solution? I leveraged my university’s interlibrary loan/resource sharing system and reached out directly to authors via ResearchGate to request full text; some responded with their papers. API limits were another frustration, particularly with Scopus’s weekly request cap (20,000 publications per week). I used R’s httr package to space out requests systematically, grouping queries by year or keyword to stay under the limit while automating the process. In addition to using the API, you can access Scopus with your institutional credentials to search for papers manually using your key terms, then export the results in formats such as CSV, RIS, and BibTeX.
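
A minimal sketch of that pacing logic with httr, against Scopus’s search endpoint (the per-year grouping and sleep interval are illustrative): 

library(httr)

# One query per publication year to keep each request small
queries <- paste0('TITLE-ABS-KEY("data visualization" AND "evaluation") AND PUBYEAR = ', 2010:2025)

results <- list()
for (q in queries) {
  resp <- GET("https://api.elsevier.com/content/search/scopus",
              query = list(query = q, apiKey = "YOUR_ELSEVIER_KEY"))
  stop_for_status(resp)            # fail loudly on HTTP errors
  results[[q]] <- content(resp)    # parsed JSON response
  Sys.sleep(2)                     # pause between requests to respect rate limits
}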

The learning curve for R’s Bibliometrix package nearly derailed me in week two. After spending hours on error messages, I discovered the package’s excellent documentation and worked through their tutorial examples line by line. This hands-on approach helped me master essential functions within a week. 

Perhaps the trickiest challenge was avoiding overinterpretation. My initial excitement at seeing strong keyword clusters nearly led me to make unsupported claims. Consult with your advisor, a colleague, or an expert in your field to help you distinguish between meaningful patterns and statistical noise. For instance, I found that a seemingly important keyword connection was due simply to one prolific author’s preferred terminology.

For clarity, I used a consistent color scheme across visualizations to help readers quickly identify key themes: blue for methodological terms, green for application areas, and red for emerging concepts. This small touch markedly improved my visuals’ readability.

Conclusion 

This journey through bibliometric analysis has transformed how I approach research. From crafting precise questions to interpreting network visualizations, these methods bring clarity to complex literature landscapes. The technical hurdles are real but manageable – the payoff in insights is worth the effort. 

For those just starting, I recommend beginning with a small pilot study, perhaps analyzing 100-200 papers on a focused topic. The skills build quickly. 

I’d love to hear about your experiences with bibliometrics or help troubleshoot any challenges you encounter. Feel free to reach out at contact@rd-amoako.com or continue the conversation on research forums and other online platforms. Let’s explore how these methods can advance our evaluation and research practice together.

Interested in seeing the results of my bibliometric analysis and exploring the key findings? Connect with me via LinkedIn or my blog.

View an interactive map of publication counts by country from my project: publications_map.html

Bibliography 

Aria, M., & Cuccurullo, C. (2017). bibliometrix: An R-tool for comprehensive science mapping analysis. Journal of Informetrics, 11(4), 959–975.

Donthu, N., Kumar, S., Mukherjee, D., Pandey, N., & Lim, W. M. (2021). How to conduct a bibliometric analysis: An overview and guidelines. Journal of Business Research, 133, 285–296. https://doi.org/10.1016/j.jbusres.2021.04.070 

Liu, A., Urquía-Grande, E., López-Sánchez, P., & Rodríguez-López, Á. (2023). Research into microfinance and ICTs: A bibliometric analysis. Evaluation and Program Planning, 97, 102215. https://doi.org/10.1016/j.evalprogplan.2022.102215 

Van Eck, N. J., & Waltman, L. (2014). Visualizing bibliometric networks. In Y. Ding, R. Rousseau, & D. Wolfram (Eds.), Measuring scholarly impact: Methods and practice (pp. 285–320). Springer. 

Van Raan, A. F. J. (2018). Measuring science: Basic principles and application of advanced bibliometrics. In W. Glänzel, H. F. Moed, U. Schmoch, & M. Thelwall (Eds.), Handbook of science and technology indicators. Springer.

Waltman, L., Calero-Medina, C., Kosten, J., Noyons, E. C. M., Tijssen, R. J. W., Van Eck, N. J., & Wouters, P. (2012). The Leiden Ranking 2011/2012: Data collection, indicators, and interpretation. Journal of the American Society for Information Science and Technology, 63(12), 2419–2432. https://doi.org/10.1002/asi.22708 

Yao, S., Tang, Y., Yi, C., & Xiao, Y. (2022). Research hotspots and trend exploration on the clinical translational outcome of simulation-based medical education: A 10-year scientific bibliometric analysis from 2011 to 2021. Frontiers in Medicine, 8, 801277. https://doi.org/10.3389/fmed.2021.801277 

Zupic, I., & Čater, T. (2015). Bibliometric methods in management and organization. Organizational Research Methods, 18(3), 429–472. https://doi.org/10.1177/1094428114562629

 Resources: 

  • Bibliometrix Tutorial 
  • Scopus API Guide 
  • VOSviewer 
  • CiteSpace Manual  

Data Screening 

  • abstrackr: https://www.youtube.com/watch?v=jy9NJsODtT8 
  • Covidence: https://www.youtube.com/watch?v=tPGuwoh834A 
  • Rayyan.ai: https://www.youtube.com/watch?v=YFfzH4P6YKw&t=9s 
  • ASReview: https://www.youtube.com/watch?v=gBmDJ1pdPR0 
  • Nested Knowledge: https://www.youtube.com/watch?v=7xih-5awJuM 

R resources:  

My project repository: https://github.com/amoakor/BibliometricAnalysis.git 

Packages: 

tidyverse, bibliometrix, rscopus, janitor, psych, tm 

httr package documentation: https://httr.r-lib.org/, https://github.com/r-lib/httr 

Analyzing & Visualizing Data 

  • Key Metrics to Explore (see the Bibliometrix Tutorial for more examples): 
  1. Citation Analysis: 

citations <- citations(M, field = "article", sep = ";")  # most-cited references in the collection

head(citations$Cited, 10)  # top 10 most cited 

  2. Co-authorship Networks: 

# Build the collaboration network first, then plot it
NetMatrix <- biblioNetwork(M, analysis = "collaboration", network = "authors", sep = ";")
networkPlot(NetMatrix, normalize = "salton", type = "auto")

  3. Keyword Trends: 

conceptualStructure(M, field = "ID", method = "CA", minDegree = 10)  # conceptual map from Keywords Plus

Filed Under: Evaluation Methodology Blog

Power BI, Will It Really Give Me Data Viz Superpowers?

May 15, 2025 by Jonah Hall

What is Power BI?

Power BI is a powerful tool to visualize data.  

It can take multiple large datasets, put them all together, transform them, perform calculations, and help you create beautiful visualizations. Think of it as a data wrangler, organizer, and visualizer! Often, a collection of visualizations is assembled into a report.

My name is Jake Working. I am a third-year student in the ESM PhD program at UTK, and I primarily use Power BI in my day job as a Data Analyst for Digital Learning at UTK. I will briefly discuss some of Power BI’s main functions and point you toward some resources if you want to learn more.

Why use a data viz software? 

Before we jump into the software, you may be thinking, “why go through all the trouble of learning another tool just to create visualizations? Aren’t my [insert your software of choice here] visualizations good enough?” 

Even when you get comfortable and quick in [your software of choice], at the end of the day, that program’s primary function is typically to store, present, or analyze your data, not to bring in data for the purpose of creating visualizations.

The advantage of learning data visualization software like Power BI is that it is designed with visualization as its primary purpose. If you have learned or even mastered creating visuals in another software, you can 100% learn and master visualization software like Power BI. 

What can Power BI do? 

First, Power BI is excellent at bringing in data. You can connect multiple large and different types of data sources to Power BI, transform them, and perform calculations as necessary to prepare visuals. 

For data sources, if you can access the data, Power BI can connect to or import it. Power BI can take flat files (e.g., Excel, PDF, or CSV), pull directly (snapshot or live) from a database (e.g., MySQL, Oracle, SQL Server), import from a website, an R script, a Python script, and many more! Even if you have multiple data sources, you can load in as many as you need and create relationships between them.

Creating relationships serves as the backbone of your data model if you have multiple data sources. For example, say you have a data source with student demographic data and another with student course information. If both contain a unique identifier, such as their student ID, you can create a relationship between the data sources based on that student ID and Power BI will know which course information connects with which student in your demographic data.  

Most model-building mistakes occur at this step, and it is important to understand how and why you are building your model in a certain way, or else you could end up with sluggish, incorrect, or confusing output. I suggest reading Microsoft’s overview of relationships and then, later, this two-part blog post on Power BI data modeling best practices (part 1, part 2). Warning! The blog post is overly detailed for beginners, but it covers extremely important information for avoiding common Power BI pitfalls with relationships. I have had to deal with, and overcome, issues related to cardinality, filtering, and schema structure that are discussed there.

An overview of Power BI’s capabilities: bringing in multiple sources of data, cleaning data, creating relationships between data sources, and using the data to generate a visual report. 

Once you have identified your dataset, Power BI can transform your data into clean, workable data within its Power Query editor. The editor offers Excel-like functionality such as updating data types, replacing values, creating new columns, and pivoting data, via either the Power Query GUI or its scripting language, M. These transformation steps can be “saved” to your data source and performed on your data each time Power BI connects to or updates that data source. So, once you have cleaned up your data once, it is done automatically using the steps you already created!

Power BI can then do complex calculations on your dataset once you’ve loaded it in. It uses a function and reference library called Data Analysis Expressions (DAX, for short) that is similar to the expressions used in Excel. Check out Microsoft’s overview of how DAX can be used within Power BI and the library of DAX functions. In my own work, I mainly use calculated columns and measures.

For example, let’s say I have a column in my data set that shows the date a form was submitted in this format: mm/dd/yyyy hr:min:sec. If I want to count the number of forms submitted in the calendar year 2025 and display that value on my report, I can create a measure using the DAX functions. It would look something like this: 
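
Here is a minimal sketch of such a measure, using the DAX functions CALCULATE, COUNTROWS, and YEAR (the table name Forms and column name Submitted are hypothetical placeholders for your own fields): 

// Count rows in the hypothetical Forms table submitted during calendar year 2025
Forms Submitted 2025 =
CALCULATE(
    COUNTROWS(Forms),
    YEAR(Forms[Submitted]) = 2025
)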

Finally, Power BI’s main function is to create engaging visuals and reports that help you draw information from your data. Power BI has a workspace that allows you to easily select visuals, drag fields from your data into the visuals, and then edit or customize them. The software is pre-loaded with many useful visuals, but you can search for and download additional user-created visuals as well. Check out the image below showcasing Power BI’s workspace.

image from Microsoft (source) 

Visuals can be used together (like in the image) to create a report. These reports can be published in a shareable environment through the Power BI Service so others can view the report. This is how companies create and distribute data reports! 

One exciting feature of Power BI is the ability to use and interact with Microsoft’s AI, Copilot. Copilot is quite capable when it comes to understanding and using data and can even help build visuals and whole reports. Check out this three-minute demo of Copilot within Power BI to get a sense of its capabilities.

I want to try! 

If you are interested in poking around Power BI to see if it could be useful for you, you can download the desktop version for free here. I will note that even if you are working on personal projects and have data you want to create visuals from, it may be worth it to try Power BI! 

Microsoft has training, videos, sample data you can play with once you open the program, and a community forum to help with any questions you may have.  

Curious what Power BI can do? Check out some of the submissions from this year’s Microsoft Power BI Visualization World Championships!

Filed Under: Evaluation Methodology Blog

Empathy in Evaluation: A Meta-Analysis Comparing Evaluation Models in Refugee and Displaced Settings

March 15, 2025 by Jonah Hall

By Dr. Fatima T. Zahra

Hello, my name is Fatima T. Zahra. I am an Assistant Professor of Evaluation, Statistics, and Methodology at the University of Tennessee, Knoxville. My research examines the intersection of human development, AI, and evaluation in diverse and displaced populations. Over the past decade, I have worked on projects that explore the role of evaluation in shaping educational and labor market outcomes in refugee and crisis-affected settings. This post departs from a purely technical discussion to reflect on the role of empathy in evaluation practices—a quality that is often overlooked but profoundly consequential. For more information about the work that I do, check out my website.

Evaluation is typically regarded as an instrument for assessing program effectiveness. However, for marginalized and forcibly displaced populations, conventional evaluation models often fall short. Traditional frameworks prioritize objectivity, standardized indicators, and externally driven methodologies, yet they frequently fail to capture the complexity of lived experiences. This gap has spurred the adoption of empathy in evaluation, particularly participatory and culturally responsive frameworks that prioritize community voices, local knowledge, and equitable power-sharing in the evaluation process. Even so, work in this area remains substantially underdeveloped.

A group selfie taken during field work in the Rohingya refugee camps in 2019.

Why Does This Matter?

My recent meta-analysis of 40 studies comparing participatory, culturally responsive, and traditional evaluation models in refugee and displaced settings underscores the importance of empathy-driven approaches. Key findings include: 

  • Participatory evaluations demonstrated high levels of community engagement, with attendance and participation rates ranging from 71% to 78%. Evaluations that positioned community members as co-researchers led to greater program sustainability. 
  • Culturally responsive evaluations yielded statistically significant improvements in mental health outcomes and knowledge acquisition, particularly when interventions incorporated linguistic and cultural adaptations tailored to participants’ lived experiences. 
  • Traditional evaluations exhibited mixed results, proving effective in measuring clinical outcomes but demonstrating lower engagement (54% average participation rate), particularly in cases where community voices were not integrated into the evaluation design. 

The sustainability of programs was not dictated by evaluation models alone but was strongly influenced by community ownership, capacity building, and system integration. Evaluations that actively engaged community members in decision-making processes were more likely to foster lasting impact. 

Lessons from the Field

In our research on early childhood development among Rohingya refugees in Bangladesh, initial evaluations of play-based learning programs suggested minimal paternal engagement. However, when we restructured our approach to include fathers in defining meaningful participation—through focus groups and storytelling sessions—engagement increased dramatically. This shift underscored a critical lesson: evaluation frameworks that do not reflect the lived realities of marginalized communities risk missing key drivers of success. 

Similarly, in a study examining the impact of employment programs in refugee camps, traditional evaluations focused primarily on income and productivity, overlooking the psychological and social effects of work. By incorporating mental well-being as a key evaluation metric—through self-reported dignity, purpose, and social belonging—we found that employment offered far more than economic stability. These findings reinforce an essential principle: sustainable impact is most likely when evaluation is conducted with communities rather than on them, recognizing the full spectrum of human needs beyond economic indicators. 

Rethinking Evaluation: A Call for Change

To advance the field of evaluation, particularly in marginalized and displaced settings, we must adopt new approaches: 

  1. Power-sharing as a foundational principle. Evaluation must shift from an extractive process to a collaborative one. This means prioritizing genuine co-creation, where communities influence decisions from research design to data interpretation. 
  2. Cultural responsiveness as a necessity, not an afterthought. Effective evaluation requires deep listening, linguistic adaptation, and recognition of cultural epistemologies. Without this, findings may be incomplete or misinterpreted. 
  3. Expanding our definition of rigor. Methodological validity should not come at the expense of community relevance. The most robust evaluations integrate standardized measures with locally grounded insights. 
  4. Moving beyond extractive evaluation models. The purpose of evaluation should extend beyond measuring impact to strengthening local capacity for continued assessment and programmatic refinement. 

Looking Ahead

The field of evaluation stands at a pivotal juncture. Traditional approaches, which often prioritize external expertise over local knowledge, are proving inadequate in addressing the complexity of crisis-affected populations. Empathy in evaluation (EIE) methodologies—those that emphasize cultural adaptation, power-sharing, and stakeholder engagement—offer a path toward more just, effective, and sustainable evaluation practice. 

For scholars, this shift necessitates expanding research on context-sensitive methodologies. For practitioners, it demands a reimagining of evaluation as a process that centers mutual learning rather than imposing external standards. For policymakers and funders, it calls for investment in evaluation models that are adaptive, participatory, and aligned with the needs of affected populations. 

As evaluators, we hold a critical responsibility. We can either reinforce existing power imbalances or work to build evaluation frameworks that respect and reflect the realities of the communities we serve. If we aspire to generate meaningful knowledge and drive lasting change, we must place empathy, cultural responsiveness, and community engagement at the core of our methodologies.

Additional Resources

For those interested in deepening their understanding of these concepts, I highly recommend the following works: 

  • Evaluation in Humanitarian Contexts:  
  • Mertens, D. M. (2009). Transformative Research and Evaluation. Guilford Press. 
  • Culturally Responsive Evaluation:  
  • Hood, S., Hopson, R. K., & Kirkhart, K. E. (2015). Culturally responsive evaluation. In K. E. Newcomer, H. P. Hatry, & J. S. Wholey (Eds.), Handbook of practical program evaluation (4th ed., pp. 281–317). Jossey-Bass. https://doi.org/10.1002/9781119171386.ch12 
  • Participatory Research in Development Settings:  
  • Chouinard, J.A., Cousins, J.B. The journey from rhetoric to reality: participatory evaluation in a development context. Educ Asse Eval Acc 27, 5–39 (2015). https://doi.org/10.1007/s11092-013-9184-8 
  • Empathy in Evaluation:  
  • Zahra, F. T. (n.d.). Empathy in Evaluation. https://www.fatimazahra.org/blog-posts/Blog%20Post%20Title%20One-gygte 
  • Empathy and Sensitivity to Injustice:  
  • Decety, J., & Cowell, J. M. (2014). Empathy and motivation for justice: Cognitive empathy and concern, but not emotional empathy, predict sensitivity to injustice for others (SPI White Paper No. 135). Social and Political Intelligence Research Hub. https://web.archive.org/web/20221023104046/https://spihub.org/site/resource_files/publications/spi_wp_135_decety.pdf 

Final Thought: Evaluation is more than an assessment tool; it is a mechanism for empowerment. Evaluators have the capacity to amplify community voices, shape equitable policies, and drive sustainable change. The question is not whether we can integrate empathy into our methodologies, but whether we choose to do so. 

Filed Under: Evaluation Methodology Blog

Irwin Recognized As Emerging Professional By ACPA

March 5, 2025 by Jonah Hall

Courtesy of the College of Education, Health, & Human Sciences

At its recent convention in Long Beach, California, College Student Educators International (ACPA) recognized Lauren Irwin with the Annuit Coeptis Emerging Professionals Award. This prestigious award honors exemplary educators in the early stages of their careers. Irwin was one of five early-career professionals recognized for their contributions to the field.

Irwin, an assistant professor in the Department of Educational Leadership and Policy Studies (ELPS) in the College of Education, Health, and Human Sciences (CEHHS), is a long-time ACPA member and was deeply honored to receive the award.

“ACPA has long been my professional home in student affairs, and it means a lot to receive this recognition,” said Irwin. “The Annuit Coeptis award is ultimately about community and discussion to support the future of our field. As a former student affairs administrator and early-career faculty member, I am honored to be part of this prestigious multigenerational community and to have the opportunity to learn from and with some of the brightest minds in our field.”

Irwin primarily teaches in the College Student Personnel and Higher Education Administration programs. Her research informs student affairs practice, aiming to enhance and affirm the success of both students and practitioners. Her doctoral dissertation, which examined racialization and whiteness in college student leadership programs, earned ACPA’s Marylu McEwen Dissertation of the Year Award. Additionally, her research has been published in numerous scholarly journals.

“I hope to continue centering my commitment to student learning, equity, and inclusion through my teaching, research, and service,” Irwin said.

Through its seven departments and 13 centers, the UT College of Education, Health, and Human Sciences enhances the quality of life for all through research, outreach, and practice. Find out more at cehhs.utk.edu.

Filed Under: News

Is Your Data Dirty? The Importance of Conducting Frequencies First

March 1, 2025 by Jonah Hall

By Jennifer Ann Morrow, Ph.D.

Data, like life, can be messy. I’ve worked with all types of data, both collected by me and by my clients, for over 25 years, and I ALWAYS check my data before conducting my proposed analyses. Sometimes this part of the analysis process is quick and easy, but most of the time it’s like an investigation: you need to be thorough, take your time, and provide evidence for your decision making. 

Data Cleaning Step 3: Perform Initial Frequencies 

After you have drafted your codebook and analysis plan, you should conduct frequencies on all of the variables in your dataset, both numeric and string. I typically use Excel or SPSS to do this (my colleague Dr. Louis Rocconi prefers R), but you can use any statistical software you feel comfortable with. At this step I conduct frequencies and request graphics (e.g., bar charts, histograms) for every variable. This output will be invaluable as you work through your next data cleaning steps. 
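
If you are an R user, a minimal sketch of this step might look like the following. The file name my_survey_data.csv and the data frame name survey_df are hypothetical placeholders for your own data, not from any actual project.

```r
# A minimal sketch of Step 3 in base R.
# "my_survey_data.csv" and survey_df are hypothetical placeholders.
survey_df <- read.csv("my_survey_data.csv", stringsAsFactors = FALSE)

# Frequency table for every variable, numeric and string alike,
# with missing values counted explicitly.
for (v in names(survey_df)) {
  cat("\n==== Frequencies:", v, "====\n")
  print(table(survey_df[[v]], useNA = "ifany"))
}

# Graphics for every variable: histograms for numeric variables,
# bar charts for categorical/string variables.
for (v in names(survey_df)) {
  if (is.numeric(survey_df[[v]])) {
    hist(survey_df[[v]], main = paste("Histogram of", v), xlab = v)
  } else {
    barplot(table(survey_df[[v]]), main = paste("Bar chart of", v), las = 2)
  }
}
```

Saving this output (for example, by redirecting it with sink() or exporting the plots) keeps it handy for the later data cleaning steps.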

So, what should you be looking at when reviewing your frequencies? First, I make note of any discrepancies in coding between my data and what is listed in my codebook. I flag spelling issues in my variable names and labels and note anything that doesn’t match the codebook. I always check that my value labels (the labels given to my numeric categories) match my codebook and are consistent across sets of variables. If you used an online survey software package to collect your data, programming mistakes made when building the survey can easily result in mislabeled values. Likewise, the more individuals who entered data into your database, the greater the chance that mistakes were made during data entry. During this step I also verify that any values I am using to designate missing data are properly labeled and consistent with what I have listed in my codebook. 
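
One way to make the codebook comparison systematic rather than purely visual is to check every observed value against the values your codebook allows. The sketch below assumes a hypothetical codebook stored as a named list (the variable names gender and satisf_1 are illustrative only) and continues with the survey_df data frame from the sketch above.

```r
# Hypothetical codebook: each entry maps a variable to its allowed values.
codebook <- list(
  gender   = c(1, 2, 3),  # 1 = Male, 2 = Female, 3 = Prefer not to say
  satisf_1 = 1:5          # 5-point Likert item
)

# Flag any observed value that the codebook does not allow.
for (v in names(codebook)) {
  observed <- unique(na.omit(survey_df[[v]]))
  bad <- setdiff(observed, codebook[[v]])
  if (length(bad) > 0) {
    cat("Variable", v, "has values not in the codebook:", bad, "\n")
  }
}
```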

Lastly, I highlight variables that may have extreme scores (i.e., potential outliers), variables with more than 5% missing data, and variables with very low counts in any of their response categories. I use this output in later data cleaning steps to aid my decision making on variable modification. 
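
Continuing the hypothetical survey_df example, here is one way these flags could be operationalized in R. The z-score screen at the end is a common convention for spotting potential outliers, not the only defensible choice; min_n is whatever you consider a "very low" category count.

```r
# Flag variables exceeding the 5% missing-data rule of thumb.
miss_pct <- sapply(survey_df, function(x) mean(is.na(x)) * 100)
print(round(miss_pct[miss_pct > 5], 1))

# Flag sparse response categories in non-numeric variables.
min_n <- 10  # your own threshold for a "very low" count
for (v in names(survey_df)) {
  if (!is.numeric(survey_df[[v]])) {
    counts <- table(survey_df[[v]])
    low <- counts[counts < min_n]
    if (length(low) > 0) {
      cat("\nVariable", v, "has sparse categories:\n")
      print(low)
    }
  }
}

# Potential outliers in numeric variables: a simple z-score screen.
for (v in names(survey_df)) {
  if (is.numeric(survey_df[[v]])) {
    z <- scale(survey_df[[v]])
    n_extreme <- sum(abs(z) > 3, na.rm = TRUE)
    if (n_extreme > 0) cat(v, "has", n_extreme, "cases with |z| > 3\n")
  }
}
```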

Data Cleaning Step 4: Check for Coding Mistakes 

At this step I take the output in which I flagged potential coding issues and begin reviewing and making variable modification decisions. Coding issues are more common when data have been entered manually, but you can still have coding errors in online data collection! For any variable with coding issues, I first determine whether I can verify the data from the original (or another) source. For manually entered data, I go back to the organization, paper survey, or data form to verify the value. If it needs to be changed to the correct response, I make a note to fix it in my next data cleaning step. If I cannot verify the datapoint (for example, when the data were collected anonymously) and the value does not fall within the possible values listed in my codebook, I make a note to set the value to missing in the next data cleaning step. 
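
As a concrete illustration of that decision, here is a short hypothetical sketch continuing the survey_df and satisf_1 example from above (a Likert item whose only valid values are 1 through 5).

```r
# Identify cases where satisf_1 falls outside its codebook range of 1-5.
out_of_range <- which(!(survey_df$satisf_1 %in% 1:5) & !is.na(survey_df$satisf_1))

# Review the flagged cases first; correct any that can be verified
# against the original source.
survey_df[out_of_range, "satisf_1", drop = FALSE]

# For values that cannot be verified, the recode to missing (done in the
# next data cleaning step) is a single line:
survey_df$satisf_1[out_of_range] <- NA
```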

Additional Advice 

As I go through my frequencies, I highlight and enter notes directly in the output to make things easier as I move through the data cleaning process. I also put notes in my project notebook summarizing any issues, and once I make decisions on variable modifications, I note those in my notebook as well. You will use the output from Step 3 in the next few data cleaning steps to aid your decision making, so keep it handy! 

Resources

12 Steps of Data Cleaning Handout: https://www.dropbox.com/scl/fi/x2bf2t0q134p0cx4kvej0/TWELVE-STEPS-OF-DATA-CLEANING-BRIEF-HANDOUT-MORROW-2017.pdf?rlkey=lfrllz3zya83qzeny6ubwzvjj&dl=0 

Step 1: https://cehhs.utk.edu/elps/organizing-your-evaluation-data-the-importance-of-having-a-comprehensive-data-codebook/ 

Step 2: https://cehhs.utk.edu/elps/clean-correlate-and-compare-the-importance-of-having-a-data-analysis-plan/ 

https://davenport.libguides.com/data275/spss-tutorial/cleaning

https://libguides.library.kent.edu/SPSS/FrequenciesCategorical

https://www.datacamp.com/tutorial/tutorial-data-cleaning-tutorial

https://www.geeksforgeeks.org/frequency-table-in-r

https://www.goskills.com/Excel/Resources/FREQUENCY-Excel

Filed Under: Evaluation Methodology Blog

