Educational Leadership and Policy Studies


Reflecting on a Decade After ESM: My Continuing Journey as an Evaluation Practitioner and Scholar

May 15, 2024 by Jonah Hall

By Tiffany Tovey, Ph.D.

Greetings, fellow explorers of evaluation! I’m Tiffany Tovey, a fellow nerd, UTK alum, and practitioner on a constantly evolving professional and personal journey, navigating the waters with a compass called reflective practice. Today, I’m thrilled to reflect together with you on the twists and turns of my journey as an evaluation practitioner and scholar in the decade since I defended my dissertation and offer some insights for you to consider in your own work.

My Journey in Evaluation 

Beginning the unlearning process. The seeds of my journey into social science research were sown during my undergraduate years as a first-generation college student at UTK, where I pursued both philosophy and psychology for my bachelor’s degree. While learning about the great philosophical debates and thinkers, I was traditionally trained in experimental and social psychology under the mentorship of Dr. Michael Olson. This rigorous grounding in knowledge and inquiry gave me a perspective on what was to come: I learned the importance of asking questions, embracing fallibilism, and appreciating the depth of what I now call reflective practice. Little did I know, this foundation set the stage for my immersion into the world of evaluation, starting with the Evaluation, Statistics, and Methodology (ESM) program at UTK.

Upon entering ESM for my Ph.D., I found myself in the messy, complex, and dynamic realm of applying theory to practice. Here, my classical training in positivist, certainty-oriented assumptions was immediately challenged (in ways I am still unlearning to this day), and my interests in human behavior and reflective inquiry found a new, more nuanced, context-oriented environment to thrive. Let me tell you about lessons I learned from three key people along the way: 

  • Communicating Data/Information: Statistics are tools for effectively communicating about and reflecting on what we know about what is and (sometimes) why it is the way it is. Dr. Jennifer Ann Morrow played a pivotal role in shaping my understanding of statistics and its application in evaluation. Her emphasis on making complex statistical information accessible and meaningful to students, clients, and other audiences has stuck with me.

    As important as statistics are, so too are words—people’s lived experiences, which is why qualitative research is SO important in our work, something that all my instructors helped to instill in me in ESM. I can’t help it; I’m a word nerd. Whether qualitative or quantitative, demystifying concepts, constructs, and contexts, outsmarting software and data analysis programs, and digesting and interpreting information in a way that our busy listeners can understand and make use of is a fundamental part of our jobs.
  • Considering Politics and Evaluation Use: Under the mentorship of Dr. Gary Skolits, a retired ESM faculty member and current adjunct faculty member at UTK, I began to understand the intricate dances evaluators navigate in the realms of politics and the use of evaluation findings. His real talk and guidance helped prepare me for the complexities of reflective practice in evaluation, which became the focus of my dissertation. Upon reflection, I see my dissertation work as a continuation of the reflective journey I began in my undergraduate studies, and my work with Gary as a fine-tuning and clarification of the critical role of self-awareness, collaboration, facilitation, and tact in the evaluation process.
  • The Key Ingredient – Collaborative Reflective Practice: My journey was deepened by my engagement with Dr. John Peters, another now-retired faculty member from UTK’s College of Education, Health, and Human Sciences, who introduced me to the value of collaborative reflective practice through dialogue and systematic reflective processes. His teachings seeded my belief that evaluators should facilitate reflective experiences for clients and collaborators, fostering deeper understandings, cocreated learning, and more meaningful outcomes (see the quote by John himself below… and think about the ongoing role of the evaluator during the lifecycle of a project). He illuminated the critical importance of connecting theory to practice through reflective practice—a transformative activity that occupies the liminal space between past actions and future possibilities. This approach encourages us to critically examine the complexities of practice, thereby directly challenging the uncritical acceptance of the status quo.

My post-PhD journey. I currently serve as the director of the Office of Assessment, Evaluation, and Research Services and teach program evaluation, qualitative methods, reflective practice, interpersonal skills, and just-in-time applied research skills to graduate and undergraduate students at UNC Greensboro. Here, I apply my theoretical knowledge to real-world evaluation projects, managing graduate students and leading them on their professional evaluation learning journey. Each project and collaboration has been an opportunity to apply and refine my understanding of reflective practice, effective communication, and the transformative power of evaluation.

My role at UNCG has been a continued testament to the importance of reflective practice. The need for intentional reflective experiences runs throughout my work as director of OAERS, as lead evaluator and researcher on sponsored projects, as a mentor who scaffolds students’ learning, and as a teacher. Building in structured time to think, unpack questions and decisions together, and learn how to go on more wisely is a ubiquitous need. Making space for reflective practice means leveraging the ongoing learning and unlearning process that defines the contours of (1) evaluation practice, (2) evaluation scholarship, and (3) let’s be honest… life itself!

Engaging with Others: The Heart of Evaluation Practice 

As evaluators, our work is inherently collaborative and human centered. We engage with diverse collaborators and audiences, each bringing their unique perspectives and experiences to the table. In this complex interplay of voices, it’s essential that we—evaluators—foster authentic encounters that lead to meaningful insights and outcomes.

In the spirit of Martin Buber’s philosophy, I try to approach my interactions with an open heart and mind, seeking to establish a genuine connection with those I work with. Buber reminds us that “in genuine dialogue, each of the participants really has in mind the other or others in their present and particular being and turns to them with the intention of establishing a living mutual relation between himself and them” (Buber, 1965, p. 22). This perspective is foundational to my practice, as it emphasizes the importance of mutual respect and understanding in creating a space for collaborative inquiry and growth. 

Furthermore, embracing a commitment to social justice is integral to my work as an evaluator. Paulo Freire’s insights resonate deeply with me:  

Dialogue cannot exist, however, in the absence of a profound love for the world and for people. The naming of the world, which is an act of creation and re-creation, is not possible if it is not infused with love. Love is at the same time the foundation of dialogue and dialogue itself. (Freire, 2000, p. 90) 

This principle guides me in approaching each evaluation project with a sense of empathy and a dedication to promoting equity and empowerment through my work. 

Advice for Emerging Evaluators 

  • Dive in and embrace the learning opportunities that come your way.  
  • Reflect on your experiences and be honest with yourself.  
  • Remember, evaluation is about people and contexts, not just techniques and tools.  
  • Leverage your unique personality and lived experience in your work. 
  • Never underestimate the power of effective, authentic communication… and networking. 
  • Most importantly, listen to and attend to others—we are a human-serving profession geared towards social betterment. Be in dialogue with your surroundings and those you are in collaboration with. View evaluation as a reflective practice, and your role as a facilitator of that process. Consider how you can leverage the perspectives of Buber and Freire in your own practice to foster authentic encounters and center social justice in your work. 

Conclusion and Invitation 

My journey as an evaluation scholar is one of continuous learning, reflection, and growth. As I look to the future, I see evaluation as a critical tool for navigating the complex challenges of our world, grounded in reflective practice and a commitment to the public good. To my fellow evaluators, both seasoned and emerging, let’s embrace the challenges and opportunities ahead with open minds and reflective hearts. And to the ESM family at UTK, know that I am just an email away (tlsmi32@uncg.edu), always eager to connect, share insights, and reflect further with you.

Filed Under: Evaluation Methodology Blog

Honors Leadership Scholar leads VEXU Robotics Team to World Championship

September 22, 2021 by Jonah Hall

Four years ago, Grant Kobes, a member of the inaugural class of UT’s Honors Leadership Program, created a strategic leadership plan to found UT’s VEX University competitive robotics team, YNOT, as part of an assignment for his first foundations of leadership course.  Grant personally secured a team mentor and started a new student organization after developing a budget and drafting a constitution.  He hosted interest meetings, performed individual member skills evaluations, and held officer elections.  With no sponsor or work space, Grant and his teammates created their first robot on the floor of his dorm room in White Hall!  Since then, his team’s footprint and successes have been unprecedented.

Team YNOT’s 2021 World Championship robot fleet

Team YNOT has qualified for and competed at the world championship level for the past four years.  This summer, Grant and team YNOT achieved the impossible.   Not only were they crowned World Champions at the 2021 VEXU Robotics World Championship in Greenville, Texas, but they also earned the Excellence Award, the highest award presented in the VEX Robotics competition. This award is presented to the team that exemplifies overall excellence in building a high-quality robotics program including design innovation, build quality, autonomous programming, personal interview, and documentation through an engineering notebook.

2021 Competition robot

Grant evolved his early vision for team YNOT into a four-year capstone leadership project by applying the theoretical leadership concepts he learned in his Honors Leadership Program coursework.  First, he wanted to create an opportunity for UT engineering students to gain hands-on experience with the principles they were studying in class.  “Competitive robotics hones essential skills that all young engineers must possess,” says Kobes, “like the ability to approach a problem using the engineering design process:  to fabricate a prototype, test, and relentlessly revise until the most efficient iteration is achieved.”

Next, he sought to give back to the local community by establishing mentorship relationships between his university level team and middle and high school robotics teams around the state.  “These young teams thrive when given one-on-one attention from an experienced VEX competitor,” says Kobes.  The impact of this collaboration was recognized by judges at the 2018 VEX World Championship, resulting in team YNOT receiving the prestigious Community Award.  This award is presented to the university level team that demonstrated the most meaningful leadership and influence toward promoting STEM education in their local community.

Team YNOT founder Grant Kobes (seated), watches as young students begin a chassis assembly.

Grant continued to develop his personal leadership style through ELPS-coached leadership skills and expanded his vision to include using competitive robotics to recruit the most talented high school students from around the state to the Tickle College of Engineering.  He accomplished this goal by planning and hosting an official VEX qualifying event on UT’s campus in early 2020.  In preparation for the event, Kobes spent the semester inventorying his leadership strengths and weaknesses while outlining the multiple steps necessary to make the tournament a reality.  “Putting this event together required me to utilize many of the leadership skills that I struggle with.  However, it also provided a practical opportunity for me to inspire and empower other YNOT team members who naturally possess these skills to take on leadership roles of their own,” says Kobes.

UT’s first official Vex tournament, Tower Takeover, brought over 150 students, coaches, family, and friends to Rocky Top.  Thirty-two teams from around the state competed for seven qualifying spots at the 2020 Tennessee VEX State Championship.  Upon their arrival on campus, students and their coaches were treated to T-shirts and swag bags from the Tickle College of Engineering and the ISE department.  “We wanted to get information into the hands of prospective students and their parents,” says ISE department head, Dr. John Kobza, who helped team YNOT orchestrate the event. “These kids are already budding industrial engineers integrating technology, people, and information to maximize their performance in the VEX challenge. UT is a great option for them as there are many branches of study available in the Tickle College of Engineering. I hope to see them as UT Volunteers in a few years.”

Another aspect of Grant’s leadership plan, diversity, was also highlighted at the event. In 2017, VEX introduced an initiative called ‘Girl Powered’ in an effort to involve more females in competitive robotics.  The program offered workshops and events specifically for female students.  Since then, VEX has seen an explosion in the number of females on competition teams, as well as all-girl teams.  “For example, the Talbot, Tennessee team, Higher Calling, comprised of only two female high school students, won the Excellence Award at our event,” says Kobes.  “These girls can hold their own against any team in our state.”

Robots stack cubes at UT’s VEX Tower Takeover event

Two all-girls teams go head-to-head at the event as Team YNOT member, Eli Charles (right), serves as referee.

Teams took advantage of their trip to Rocky Top by coming to UT on Friday afternoon and taking campus tours.  “Many students from technical schools, as well as rural programs around the state, were on campus for the first time in their lives,” says Kobes. “With the implementation of the Tennessee Promise scholarship, robotics students who never dreamed they could afford to attend UT to study engineering are now perfect candidates and team YNOT wants to be the first to welcome them to campus.  We were also honored to have twenty elementary students from Green Magnet Academy elementary school, which we mentor, serve as our field resetters during the competition,” says Kobes, “proving that students are never too young to embody the Volunteer spirit.”

Team YNOT continues to host events highlighting the college of engineering including an online event during COVID-19 in which Grant personally proctored thirty-minute Zoom sessions with teams from around the country.  Using skills he perfected working as a technical specialist at the department of ISE’s iLAB, he even created custom awards for the winners.

Grant hosts an online competition session through Zoom

Grant funds his team almost entirely by organizing yearly fundraising campaigns through VOLstarter, UT Knoxville’s crowdfunding platform. Over the past four years, the team has raised over $20,000 which they use for supplies and outreach.  During last year’s BIG ORANGE GIVE, Team YNOT won the Student Organization Challenge, bringing in over $3600.  With the help of team mentor, Dr. John Kobza, Grant was also successful in obtaining a dedicated 1200 sq ft lab space on campus, an exceptional privilege for an undergraduate.

Grant’s leadership efforts were rewarded by the Tickle College of Engineering when he was named the 2020 Outstanding ISE Student of the Year, recognizing both academic excellence and service contributions to the engineering campus community.

At this year’s VEX University Robotics World Championship, Team YNOT accomplished an unprecedented feat by winning both the competitive portion of the event and the highest judged award, the Excellence Award. “I am most proud of the Excellence Award,” says Kobes, “because it represents the efforts of the entire team.  YNOT optimizes our performance using designers from Tickle College of Engineering’s mechanical engineering department, programmers from the computer science department, an automation expert from ISE, and an archival specialist from anthropology who compiles our engineering notebook.  One of our best builders is actually a wildlife and fisheries major! Our success demonstrates what UT students can achieve when they work in collaboration.”

Team members (from left): Andy Zeng, a junior in Computer Engineering; Clare Remy, a recent graduate from the department of Anthropology; Grant Kobes, a senior in the ISE department; Tony Spezia, a sophomore in Mechanical Engineering; Mackenzie Belt, a sophomore in Wildlife and Fisheries; Brandan Roachell, a junior in Computer Science; Jay Ashworth, a junior in Computer Engineering; and Christian Ramsey, a sophomore in Mechanical Engineering.

While Grant’s leadership has brought UT and its students international recognition for their servant-leader hearts and their capacity to make a difference in their community, team YNOT continues to volunteer hundreds of hours as judges and referees at numerous VEX qualifying events around the state.  Upon his graduation in December, Grant will receive a gold medallion, in recognition of his personal contribution of over 225 hours of community service, from UT’s Clay and Debbie Jones Center for Leadership and Service.  “One of the greatest rewards I have received through my HLP experience is the honor of serving alongside like-minded and gifted students,” says Kobes.

With a World Championship title under his belt, Kobes is now focusing on the leadership legacy he leaves at UT through team YNOT by ensuring that the team continues after he graduates.  Kobes has already begun to mentor and train team members in specific areas which will allow them to assume additional leadership positions in the organization.  “The ultimate indicator of my success as a leader is that the organization I leave behind continues to draw the brightest young minds to the University of Tennessee.”

Filed Under: Leadership Studies News, News

CSP Welcomes New Coordinator

August 1, 2021 by Jonah Hall

Mary Dueñas has joined the College Student Personnel (CSP) program as the new program coordinator and is an assistant professor in the Department of Educational Leadership and Policy Studies. Dueñas holds a PhD from the University of Wisconsin-Madison in Educational Leadership and Policy Analysis. With publications in the Journal of College Student Development, Journal of Latinos and Education, International Journal of Qualitative Studies in Education, and Hispanic Journal of Behavioral Sciences, Dueñas’ research focuses on the Latinx college student experience. Her work attends to the critical and social processes that affect this student population, with the intent for the findings to inform how student affairs can and should work with these students to promote their success.

Prior to her position at the University of Tennessee, Knoxville, Dueñas served as a Posse mentor, director of two educational programs, and coordinator for a Chican@ Latin@ Studies Program. As it relates to CSP, Dueñas expressed, “I am genuinely delighted and excited to be the new CSP Coordinator! I am thrilled to continue to uplift students’ experiences and work with campus partners to enhance the program. CSP is a quality program, and to be part of it is something special.”

Filed Under: Uncategorized

Ritchie wins Management in Education Journal Editorial Board Choice Award

February 16, 2021 by Jonah Hall

A recent BELMAS (British Educational Leadership, Management and Administration Society) newsletter announced that the winner of Management in Education’s Board Choice Award for the most downloaded article is Margaret Ritchie for ‘Succession Planning for Successful Leadership: Why We Need to Talk About Succession Planning!’

Margaret is a Fall 2020 graduate of the Educational Leadership and Policy Studies PhD program’s Leadership Studies concentration. An interview with Margaret will appear on the MiE website in the near future.  Her award-winning article can be found here.

Filed Under: Accolades, News

ELPS Associate Professor Mary Lynne Derrington Releases New Book

September 30, 2020 by Jonah Hall

Mary Lynne Derrington, PhD, releases new book
Developing the Organizational Culture of the Central Office: Collaboration, Connectivity, and Coherence. Available from Routledge.

“Central office resources are one of the largest assets in making meaningful change in schools, and this important book guides aspiring district leaders to take up the challenge to transform their schools, while at the same time balancing their core responsibilities. This book helps readers rethink the impact of central office on system and school initiatives, understand and apply transformational thinking, and change strategies at the central office to develop new instructional designs, create new opportunities to prioritize human and fiscal resources, and establish new leadership approaches founded on systems review and change. Full of exemplars from the field, questions for discussion, and suggested readings, this valuable textbook is for use in educational leadership preparation programs.”

Filed Under: News, Publications

Frank Cuevas Named Permanent Vice Chancellor for Student Life

May 21, 2020 by Jonah Hall

Frank Cuevas, who has been serving as the University of Tennessee, Knoxville’s Interim Vice Chancellor for Student Life since January, will take on the role permanently.

“Over the last several months, some of which have been the most challenging we’ve faced as an institution, I have watched Frank’s steady leadership, his steadfast advocacy for our students, and the trust and respect he has built with the Student Life staff,” said Chancellor Donde Plowman. “Frank is collaborative, compassionate, and thoughtful in his decision-making, and he has already proven himself to be a tremendous addition to our senior leadership team. I’m absolutely thrilled he has agreed to take on the role permanently.”

Vice Chancellor for Student Life Frank Cuevas

Cuevas previously served as Assistant and then Associate Vice Chancellor for Student Life. He also oversaw Student Housing at UT for seven years, beginning in 2010. He has been an Adjunct Assistant Professor in Educational Leadership and Policy Studies at UT for seven years.

“I am deeply honored to serve UT as the Vice Chancellor for Student Life. I am passionate about both education and the student experience, so I am very excited to get to work on behalf of our students. I wish to thank Chancellor Plowman for entrusting me with this responsibility. I look forward to continuing to work with her, our campus leadership, and the student life team to support student learning as we work to advance the university’s mission.”

Cuevas was the only finalist out of a nationwide pool of candidates.

Before coming to UT, Cuevas served in various roles in student housing at Florida State University for nearly 20 years. He holds a bachelor’s degree in international affairs, a master’s in student affairs, and a doctor of education in higher education administration, all from Florida State.

—

CONTACT

Tyra Haag (865-974-5460, tyra.haag@tennessee.edu)

Filed Under: Accolades, News, Uncategorized Tagged With: Division of Student Life, Frank Cuevas, Goal 2: Make an Impact, Goal 5: Living our Values

Morrow, Angelle, & Cervantes Recently Return from BELMAS

July 21, 2025 by Jonah Hall

ELPS faculty members Dr. Jennifer Ann Morrow and Dr. Pamela Angelle, alongside Higher Education Administration (HEAM) PhD student Abraham Cervantes, recently returned from Brighton, England, where they presented research at the annual BELMAS Conference!

As stated on the BELMAS homepage, the British Educational Leadership Management and Administration Society (BELMAS) is the Learned Society dedicated to advancing educational leadership. It is a membership organization made up of individual members working across research and practice in all areas of the field. Their members “come from a wide range of backgrounds – from academic researchers to school and system leaders – all committed to advancing understanding and practice in the field.”

Dr. Morrow and Dr. Angelle shared a presentation titled “Artificial Intelligence (AI) and Research: Terrifying or Terrific?” and Cervantes also presented “The Politics of Identity: How ‘Latinx’ Reflects the Tension Between Academia and Culture” at the conference!

“BELMAS was a great opportunity to present our work to an international audience. If you get the chance I highly recommend presenting at international conferences, it is a great way to network with other researchers from around the world and to gain different perspectives on your work.”

-Jennifer Ann Morrow, Ph.D.

Filed Under: News, Uncategorized

Serving with Purpose: Lessons Learned from Consulting in Assessment and Research

July 15, 2025 by Jonah Hall

By Jerri Berry Danso

I’m Jerri Berry Danso, a first-year doctoral student in the Evaluation, Statistics, and Methodology (ESM) program at the University of Tennessee, Knoxville. Before beginning this new chapter, I spent over a decade working in higher education assessment: first as the Director of Assessment for the College of Pharmacy at the University of Florida, and later in Student Affairs Assessment and Research. During those years I learned how purposeful data work can illuminate student learning, sharpen strategic planning, and strengthen institutional effectiveness. Across these roles, I collaborated with faculty, staff, and administrators on a wide range of projects, where I supported outcomes assessment, research design, program evaluation, and data storytelling.

Whether it was designing a survey for a student services office or facilitating a department’s learning outcomes retreat, I found myself consistently in the role of consultant: a partner and guide, helping others make sense of data and translate it into action. Consulting, I’ve learned, is not just about expertise; it also requires curiosity, humility, and a service mindset. And like all forms of service, it is most impactful when done with purpose. My goal in this post is to share the values and lessons that shape my approach so you can adapt them to your own practice.

What Does It Mean to Consult? 

In our field, we often engage in informal consulting more than we realize. Consulting, at its core, is the act of offering expertise and guidance to help others solve problems or make informed decisions. In the context of research, evaluation, assessment, and methodology, this can involve interpreting data, advising on survey design, facilitating program evaluation, or co-creating strategies for data-informed improvement.

I define consulting not only by what we do, but also by how we do it: through relationships built on trust, clarity, and mutual respect. If you’ve ever had someone turn to you for guidance on a research or assessment issue because of your experience, congratulations! You’ve already engaged in consulting.

My Core Consulting Values

My foundation as a consultant is rooted in an early lesson from graduate school. While earning my first master’s degree in Student Personnel in Higher Education, I took a counseling skills course that fundamentally shaped how I interact with others. We were taught a core set of helping behaviors: active listening, empathy, reflection, open-ended questioning, and attention to nonverbal cues. Though designed for future student affairs professionals, these skills have served me equally well in consulting settings. 

From that experience, and years of practice, my personal consulting values have emerged: 

  • Empathy: Understanding what matters to the client, listening deeply, and genuinely caring about their goals. 
  • Integrity: Being transparent, honest, and grounded in ethical principles, especially when working with data. 
  • Collaboration: Co-creating solutions with clients and recognizing that we are partners, not saviors. 
  • Responsibility: Taking ownership of work, meeting commitments, and communicating clearly when plans change. 
  • Excellence: Striving for quality in both process and product, whether that product is a report, a workshop, or a relationship.

These values are my compass. They help me navigate difficult decisions, maintain consistency, and most importantly, deliver service that is thoughtful and human-centered. 

Lessons from the Field

Over the years, I’ve learned that the best consultants don’t just deliver technical expertise. They cultivate trust. Here are a few key lessons that have stuck with me:

  1. Follow through on your promises. If you say you’ll deliver something by a certain date, do it, or communicate early if something changes. Reliability builds credibility and fosters trust in professional relationships.
  2. Set expectations early. Clarify what you will provide and what you need from your client to be successful. Unmet expectations often stem from assumptions left unspoken.
  3. Stick to your values. Never compromise your integrity. For example, a client asked me to “spin” data to present their program in a more favorable light. I gently reminded them that our role was to find truth, not polish it, and that honest data helps us improve.
  4. Anticipate needs. When appropriate, go a step beyond the request. In one project, I created a detailed methodology plan that the client hadn’t asked for. They later told me it became a key reference tool throughout the project.
  5. Adapt your communication. Know your audience. Avoid overwhelming clients with technical jargon, but don’t oversimplify in a way that’s condescending. Ask questions, check for understanding, and create space for curiosity without judgment.

The Art of Service

Good consulting is about more than solving problems; it is equally about how you show up for others. What I’ve come to call the art of service is an intentional approach to client relationships grounded in care, curiosity, and a commitment to helping others thrive. This includes:

  • Practicing empathy and active listening  
  • Personalizing communication and building rapport 
  • Going beyond what’s expected when it adds value 
  • Continuously reflecting on your approach and improving your craft 

These principles align closely with literature on counseling and helping relationships. For instance, Carl Rogers (1951) emphasized the power of empathy, congruence, and unconditional positive regard. These are qualities that, when applied in consulting, build trust and facilitate honest conversations. Gerald Egan (2014), in The Skilled Helper, also highlights how intentional listening and support lead to more effective outcomes. 

A Call to Aspiring Consultants 

You don’t need “consultant” in your job title to serve others through your expertise. Whether you’re a graduate student, an analyst, or a faculty member, you can bring consulting values into your work, especially in the measurement, assessment, evaluation, and statistics fields, where collaboration and service are central to our mission.

So, here’s my invitation to you:

Take some time to define your own values. Reflect on how you show up in service to others. Practice listening more deeply, communicating more clearly, and delivering with care. The technical side of our work is vital, but the human side? That’s where transformation happens.

Resources for Further Reading

  • Egan, G. (2014). The Skilled Helper: A Problem-Management and Opportunity-Development Approach to Helping (10th ed.). Cengage Learning. 
  • Rogers, C. R. (1951). Client-Centered Therapy: Its Current Practice, Implications and Theory. Houghton Mifflin. 
  • Block, P. (2011). Flawless Consulting: A Guide to Getting Your Expertise Used (3rd ed.). Wiley. 
  • Kegan, R., & Lahey, L. L. (2016). An Everyone Culture: Becoming a Deliberately Developmental Organization. Harvard Business Review Press. 

Filed Under: Evaluation Methodology Blog

Navigating Ambiguity and Asymmetry: from Undergraduate to Graduate Student and Beyond

June 15, 2025 by Jonah Hall

By Jessica Osborne, Ph.D. and Chelsea Jacobs

Jessica is the Principal Evaluation Associate for the Higher Education Portfolio at The Center for Research Evaluation at the University of Mississippi. She earned a PhD in Evaluation, Statistics, and Measurement from the University of Tennessee, Knoxville, an MFA in Creative Writing from the University of North Carolina, Greensboro, and a BA in English from Elon University. Her main areas of research and evaluation are undergraduate and graduate student success, higher education systems, needs assessments, and intrinsic motivation. She lives in Knoxville, TN with her husband, two kids, and three (yes, three…) cats. 

My name is Chelsea Jacobs, and I’m a PhD student in the Evaluation, Statistics, and Methodology (ESM) program at the University of Tennessee, Knoxville. I’m especially interested in how data and evidence are used to inform and improve learning environments. In this post, I’ll share reflections — drawn from personal experience and professional mentorship — on navigating the ambiguity and asymmetry that often define the transition from undergraduate to graduate education. I’ll also offer a few practical tips and resources for those considering or beginning this journey. 

Transitioning from undergraduate studies to graduate school is an exciting milestone, full of possibilities and challenges. For many students, it also marks a shift in how success is measured and achieved. We — Jessica Osborne, PhD, Principal Evaluation Associate at The Center for Research Evaluation at the University of Mississippi, and Chelsea Jacobs, PhD student at the University of Tennessee — have explored these topics during our professional networking and mentoring sessions. While ambiguity and asymmetry may exist in undergraduate education, they often become more pronounced and impactful in graduate school and professional life. This post sheds light on these challenges, offers practical advice, and points prospective graduate students to resources that can ease the transition. 

From Clarity to Exploration: Embracing Ambiguity in Graduate Education 

In undergraduate studies, assessments often come in the form of multiple-choice questions or structured assignments, where answers are concrete and feedback is relatively clear-cut. From a Bloom’s Taxonomy perspective, this often reflects the “remembering” domain. Success may align with effort — study hard, complete assignments, and you’ll likely earn good grades. 

Graduate school, however, introduces a level of ambiguity that can be unexpectedly challenging. Research projects, thesis writing, and professional collaborations often lack clear guidelines or definitive answers. Feedback becomes more subjective, reflecting the complexity and nuance of the work. For example, a research proposal may receive conflicting critiques from reviewers, requiring students to navigate gray areas with the support of advisors, peers, and faculty. 

These shifts are compounded by a structural difference: while undergraduates typically have access to dedicated offices and resources designed to support their success, graduate students often face these challenges with far fewer institutional supports. This makes it all the more important to cultivate self-advocacy, build informal support networks, and learn to tolerate uncertainty. 

Though ambiguity can feel overwhelming, it’s also an opportunity to develop critical thinking and problem-solving skills. Graduate school encourages asking deeper questions, exploring multiple perspectives, and embracing the process of learning rather than focusing solely on outcomes. 

How to Navigate Ambiguity 

Embrace the Learning Curve: Ambiguity is not a sign of failure but a necessary condition for growth—it pushes us beyond routine practice and encourages deeper, more flexible thinking. Seek opportunities to engage with complex problems, even if they feel overwhelming at first, as these moments often prompt the most meaningful development. 

Ask for Guidance: Don’t hesitate to seek clarification from advisors, peers, or those just a step ahead in their academic journey. Opening up about your struggles can reveal how common they are — and hearing how others have navigated doubt or setbacks can help you build the resilience to keep moving forward. Graduate school can be a collaborative space, and connection can be just as important as instruction. 

In the ESM program at UTK, we’re fortunate to be part of a collaborative, non-competitive graduate environment. This isn’t the case for all graduate programs, so it’s an important factor to consider when choosing where to study. 

Uneven Roads: Embracing the Asymmetry of Growth 

As an undergraduate, effort is often emphasized as the key to success, and the relationship is usually fairly direct: study hard, complete assignments, and you’ll likely earn good grades. Even there, though, the link between effort and outcome isn’t always straightforward; study strategies, access to resources, prior preparation, and support systems all play a role, meaning that even significant effort doesn’t always lead to the expected results.

In graduate school and professional life, this symmetry can break down. You might invest months into a research paper, only to have it rejected by a journal. Grant proposals, job applications, and conference submissions often yield similar results—hard work doesn’t always guarantee success, but it does guarantee learning. 

This asymmetry can be disheartening, but it mirrors the realities of many professional fields. Learning to navigate it is crucial for building resilience and maintaining motivation. Rejection and setbacks are not personal failures but part of growth. 

How to Handle Asymmetry 

Redefine Success: Focus on the process rather than the outcome. Every rejection is an opportunity to refine your skills and approach. 

Build Resilience: Mistakes, failures, and rejection are not just normal—they’re powerful learning moments. These experiences often reveal knowledge or skill gaps more clearly than success, making them both memorable and transformative. Cultivating a growth mindset helps reframe setbacks as essential steps in your development. 

Seek Support: Surround yourself with a network of peers, mentors, and advisors who can offer perspective and encouragement. 

Resources for Prospective Graduate Students 

Workshops and seminars can help students build essential skills — offering guidance on research methodologies, academic writing, and mental resilience. 

Here are a few resources to consider: 

  • Books: Writing Your Journal Article in Twelve Weeks by Wendy Laura Belcher is excellent for developing academic writing. The Writing Workshop, recommended by a University of Michigan colleague, is a free, open-access resource. 
  • Research Colloquium: UTK students apply research skills in a colloquium setting. See Michigan State University’s Graduate Research Colloquium for a similar example. These events are common — look into what your institution offers. 
  • Campus Resources: Don’t overlook writing centers, counseling centers, and mental health services. For example, Harvard’s Counseling and Mental Health Services provides a strong model. Explore what’s available at your school. 
  • Professional Networks: Join organizations or online communities in your field. This can lead to mentorship, which is invaluable — and worthy of its own blog post. 

Final Thoughts 

Ambiguity and asymmetry are not obstacles to be feared but challenges to be embraced. They help develop the critical thinking, problem-solving, and resilience needed for both graduate school and a fulfilling professional career. By understanding these aspects and using the right resources, you can navigate the transition with confidence. 

To prospective graduate students: welcome to a journey of growth, discovery, and MADness — Meaningful, Action-Driven exploration of methods and measures. We’re excited to see how you’ll rise to the challenge. 

Filed Under: Evaluation Methodology Blog

My Journey In Writing A Bibliometric Analysis Paper

June 1, 2025 by Jonah Hall

Hello everyone! I am Richard D. Amoako, a third-year doctoral student in Evaluation, Statistics, and Methodology at the University of Tennessee, Knoxville. I recently completed a bibliometric analysis paper for my capstone project on Data Visualization and Communication in Evaluation. Bibliometrics offers a powerful way to quantify research trends, map scholarly networks, and identify gaps in the literature, making it an invaluable research method for evaluators and researchers alike.

Learning bibliometrics isn’t always straightforward. Between choosing the right database, wrangling APIs, and figuring out which R or Python packages won’t crash your laptop, there’s a steep learning curve. That’s why I’m writing this: to share the lessons, tools, and occasional frustrations I’ve picked up along the way. Whether you’re an evaluator looking to map trends in your field or a researcher venturing into bibliometrics for the first time, I hope this post saves you time, sanity, and a few coding headaches. Let’s explore the methodology, applications, and resources that shaped my project. 

Understanding Bibliometric Analysis 

Bibliometric analysis is the systematic study of academic publications through quantitative methods: examining citations, authorship patterns, and keyword frequencies to reveal research trends. It differs from traditional literature reviews by delivering data-driven insights into how knowledge evolves within a field. Common applications include identifying influential papers, mapping collaboration networks, and assessing journal impact (Donthu et al., 2021; Van Raan, 2018; Zupic & Čater, 2015).

For evaluators, this approach is particularly valuable. It helps track the adoption of evaluation frameworks, measure scholarly influence, and detect emerging themes, such as how data visualization has gained traction in recent years. My interest in bibliometrics began while reviewing literature for my capstone project. Faced with hundreds of papers, I needed a way to objectively analyze trends rather than rely on subjective selection. Bibliometrics provide that structure, turning scattered research into actionable insights. 

Key Steps in Writing a Bibliometric Paper 

Defining Research Objectives 
The foundation of any successful bibliometric study lies in crafting a precise research question. For my capstone on data visualization in evaluation literature, I focused on: “How has the application of data visualization techniques evolved in program evaluation research from 2010-2025?” This specificity helped me avoid irrelevant data while maintaining analytical depth. Before finalizing my question, I reviewed existing systematic reviews to identify underexplored areas, a crucial step that prevented duplication of prior work. When brainstorming and refining your question, generative AI tools (ChatGPT, Claude, Perplexity, Google Gemini, Microsoft Copilot, DeepSeek, and the like) can help you sharpen and clarify your ideas.

Database Selection and Data Collection 
Choosing the right database significantly impacts study quality. After comparing options, I selected Scopus for its comprehensive coverage of social science literature and robust citation metrics. While Web of Science (WoS) offers stronger impact metrics, its limited coverage of evaluation journals made it less suitable, though I did examine its potential applications. Google Scholar’s expansive but uncurated collection proved too noisy for systematic analysis. Scopus’s ability to export 2,000 records at once, with metadata such as author affiliation and country, proved invaluable for my collaboration mapping.

Data Extraction and Automation 
To efficiently handle large datasets, I leveraged R’s Bibliometrix package and automated my data extraction with the Scopus API (Application Programming Interface). APIs enable software systems to communicate with each other; researchers can use them to access database records (in Scopus, WoS, and others) without manual downloading. To access the Scopus database programmatically, request access via Elsevier’s Developer Portal.

Pros: Good for large-scale scraping. Cons: Requires API key approval (can take days or weeks).  
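
If you prefer to stay in R for retrieval, the rscopus package wraps this API. Here is a minimal sketch of what that can look like, assuming an approved API key; the query string and record cap below are illustrative, not the exact ones from my project:

library(rscopus)

set_api_key("YOUR-ELSEVIER-API-KEY") # key issued via Elsevier's Developer Portal

# Page through the Scopus Search API; max_count caps total records retrieved
res <- scopus_search(
  query = 'TITLE-ABS-KEY("data visualization" AND "evaluation")',
  view = "COMPLETE", # richer metadata (affiliations, keywords) where licensed
  max_count = 2000
)

# Flatten the returned entries into a data frame for screening
papers <- gen_entries_to_df(res$entries)$df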

For targeted bibliometric searches, carefully construct your keyword strings using Boolean operators (AND/OR/NOT) and field tags like TITLE-ABS-KEY() to balance recall and precision. For example, my search TITLE-ABS-KEY("data visualization" AND "evaluation") retrieved 37% more relevant papers than a simple keyword search by excluding off-topic mentions in references.

After exporting Scopus results to CSV, a simple script converted and analyzed the data (Aria & Cuccurullo, 2017): 

library(bibliometrix)

# Import the Scopus CSV export into a bibliometrix data frame
M <- convert2df("scopus.csv", dbsource = "scopus", format = "csv")

# Descriptive results: top authors, sources, citation counts, and more
results <- biblioAnalysis(M)

This approach provided immediate insights into citation patterns and author networks.  

Data Screening and Cleaning 
The initial search may return many papers; my search returned over 2,000. To narrow down to the most relevant articles, you can apply filters such as:

  1. Removing duplicates via DOI matching (in R: M <- M[!duplicated(M$DO), ] # remove by DOI); duplicates are common in multi-database studies
  2. Excluding non-journal articles
  3. Excluding irrelevant articles that do not match your research questions or inclusion criteria
  4. Manual review of random samples to verify relevance

Additional data cleaning may be required; I use R’s tidyverse (dplyr in particular) and janitor packages for these tasks, as in the sketch below.
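
For instance, a first screening pass might look like this, assuming the standard bibliometrix field tags (DO = DOI, DT = document type, PY = publication year):

library(dplyr)

# M is the data frame produced by convert2df() above
M_screened <- M %>%
  distinct(DO, .keep_all = TRUE) %>% # drop records with duplicate DOIs
  filter(DT == "ARTICLE") %>%        # keep journal articles only
  filter(PY >= 2010, PY <= 2025)     # restrict to the study window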

The screening process can be overwhelming and time-consuming if performed manually. Fortunately, several tools and websites are available to assist with this task. Notable examples include abstrackr, Covidence, Rayyan, ASReview, Loonlens.com, and Nested Knowledge. These tools require well-defined inclusion and exclusion criteria, so it is essential to have thoroughly considered criteria in place. Among these tools, my preferred choice is Loonlens.com, which automates the screening process based on the specified criteria and generates a CSV file with decisions and reasons upon completion.

Analysis and Visualization  

Key analytical approaches included (refer to the appendices below for R code):

  • Citation analysis to identify influential works 
  • Co-authorship network mapping to reveal collaboration patterns 
  • Keyword co-occurrence analysis to track conceptual evolution 
  • Country and institution analysis to identify geographical collaborations and impacts 

For visualization, VOSviewer creates clear keyword co-occurrence maps, while CiteSpace helps identify temporal trends. The bibliometrix package streamlined these analyses, with functions like conceptualStructure() revealing important thematic connections. Visualization adjustments (like setting minimum node frequencies) transformed initial “hairball” network diagrams into clear, interpretable maps.  
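
As an illustration, a keyword co-occurrence map takes only a few lines in bibliometrix; limiting the plot to the most frequent keywords (the n = 30 below is an arbitrary choice) is what tames the hairball:

library(bibliometrix)

# Build a keyword co-occurrence network from the screened records
NetMatrix <- biblioNetwork(M, analysis = "co-occurrences",
                           network = "keywords", sep = ";")

# Plot only the 30 most frequent keywords so the map stays legible
networkPlot(NetMatrix, normalize = "association", n = 30,
            Title = "Keyword co-occurrence", type = "fruchterman")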

This structured approach, from precise question formulation through iterative visualization, transformed a potentially overwhelming project into manageable stages. The automation and filtering strategies proved particularly valuable, saving countless hours of manual processing while ensuring analytical rigor.

All the R code I used for data cleaning, analysis, and visualization is available on my GitHub repository. 

Challenges & How to Overcome Them 

Bibliometric analysis comes with its fair share of hurdles. Early in my project, I hit a major roadblock when I discovered many key papers were behind paywalls. My solution? I leveraged my university’s interlibrary loan/resource sharing system and reached out directly to authors via ResearchGate to request the full text; some responded with their papers. API limits were another frustration, particularly with Scopus’s weekly request cap (20,000 publications per week). I used R’s httr package to space out requests systematically, grouping queries by year or keyword to stay under the cap while automating the process (see the sketch below). In addition to the API, you can access Scopus with your institutional credentials, search for papers manually using your key terms, and export the results in formats such as CSV, RIS, and BibTeX.
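
The pattern itself is simple. A rough sketch of the idea, splitting one large query into per-year requests with a pause between calls (the endpoint and parameters follow Elsevier’s public Scopus Search API; adapt the pause and grouping to your own quota):

library(httr)

results <- list()
for (y in 2010:2025) {
  # One query per publication year keeps each request small
  q <- sprintf('TITLE-ABS-KEY("data visualization" AND "evaluation") AND PUBYEAR IS %d', y)
  resp <- GET("https://api.elsevier.com/content/search/scopus",
              query = list(query = q, apiKey = "YOUR-KEY"),
              accept_json())
  results[[as.character(y)]] <- content(resp) # parsed JSON for later flattening
  Sys.sleep(2) # brief pause between calls to respect rate limits
}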

The learning curve for R’s Bibliometrix package nearly derailed me in week two. After spending hours on error messages, I discovered the package’s excellent documentation and worked through their tutorial examples line by line. This hands-on approach helped me master essential functions within a week. 

Perhaps the trickiest challenge was avoiding overinterpretation. My initial excitement at seeing strong keyword clusters nearly led me to make unsupported claims. Consult with your advisor, a colleague, or an expert in your field to help you distinguish between meaningful patterns and statistical noise. For instance, I found that a seemingly important keyword connection was simply an artifact of one prolific author’s preferred terminology.

For clarity, I used a consistent color scheme across visualizations to help readers quickly identify key themes: blue for methodological terms, green for application areas, and red for emerging concepts. This small touch markedly improved my visuals’ readability.

Conclusion 

This journey through bibliometric analysis has transformed how I approach research. From crafting precise questions to interpreting network visualizations, these methods bring clarity to complex literature landscapes. The technical hurdles are real but manageable – the payoff in insights is worth the effort. 

For those just starting, I recommend beginning with a small pilot study, perhaps analyzing 100-200 papers on a focused topic. The skills build quickly. 

I’d love to hear about your experiences with bibliometrics or help troubleshoot any challenges you encounter. Feel free to reach out at contact@rd-amoako.com or continue the conversation on research forums and other online platforms. Let’s explore how these methods can advance our evaluation and research practice together.

Interested in seeing the results of my bibliometric analysis and exploring the key findings? Connect with me via LinkedIn  or my blog. 

View an interactive map of publication counts by country from my project:  publications_map.html  

Bibliography 

Van Eck, N. J., & Waltman, L. (2014). Visualizing bibliometric networks. In Y. Ding, R. Rousseau, & D. Wolfram (Eds.), Measuring scholarly impact: Methods and practice (pp. 285–320). Springer.

Aria, M., & Cuccurullo, C. (2017). bibliometrix: An R-tool for comprehensive science mapping analysis. Journal of Informetrics, 11(4), 959–975.

Donthu, N., Kumar, S., Mukherjee, D., Pandey, N., & Lim, W. M. (2021). How to conduct a bibliometric analysis: An overview and guidelines. Journal of Business Research, 133, 285–296. https://doi.org/10.1016/j.jbusres.2021.04.070 

Liu, A., Urquía-Grande, E., López-Sánchez, P., & Rodríguez-López, Á. (2023). Research into microfinance and ICTs: A bibliometric analysis. Evaluation and Program Planning, 97, 102215. https://doi.org/10.1016/j.evalprogplan.2022.102215 

Van Raan, A. F. J. (2018). Measuring science: Basic principles and application of advanced bibliometrics. In W. Glänzel, H. F. Moed, U. Schmoch, & M. Thelwall (Eds.), Handbook of science and technology indicators. Springer. 

Waltman, L., Calero-Medina, C., Kosten, J., Noyons, E. C. M., Tijssen, R. J. W., Van Eck, N. J., & Wouters, P. (2012). The Leiden Ranking 2011/2012: Data collection, indicators, and interpretation. Journal of the American Society for Information Science and Technology, 63(12), 2419–2432. https://doi.org/10.1002/asi.22708 

Yao, S., Tang, Y., Yi, C., & Xiao, Y. (2022). Research hotspots and trend exploration on the clinical translational outcome of simulation-based medical education: A 10-year scientific bibliometric analysis from 2011 to 2021. Frontiers in Medicine, 8, 801277. https://doi.org/10.3389/fmed.2021.801277 

Zupic, I., & Čater, T. (2015). Bibliometric methods in management and organization. Organizational Research Methods, 18(3), 429–472. https://doi.org/10.1177/1094428114562629

 Resources: 

  • Bibliometrix Tutorial 
  • Scopus API Guide 
  • VOSviewer 
  • CiteSpace Manual  

Data Screening  

abstrackr: https://www.youtube.com/watch?v=jy9NJsODtT8

Covidence: https://www.youtube.com/watch?v=tPGuwoh834A

Rayyan: https://www.youtube.com/watch?v=YFfzH4P6YKw&t=9s

ASReview: https://www.youtube.com/watch?v=gBmDJ1pdPR0

Nested Knowledge: https://www.youtube.com/watch?v=7xih-5awJuM

R resources:  

My project repository https://github.com/amoakor/BibliometricAnalysis.git 

Packages: 

tidyverse, bibliometrix, rscopus, janitor, psych, tm

httr package documentation: https://httr.r-lib.org/, https://github.com/r-lib/httr 

Analyzing & Visualizing Data 

  • Key Metrics to Explore (See the Bibliometrix Tutorial for more examples): 
  1. Citation Analysis:

citations <- citations(M, field = "article", sep = ";") # tally cited references
head(citations$Cited, 10) # top 10 most cited

  2. Co-authorship Networks:

NetMatrix <- biblioNetwork(M, analysis = "collaboration", network = "authors", sep = ";")
networkPlot(NetMatrix, normalize = "salton", type = "auto") # collaboration map

  3. Keyword Trends:

conceptualStructure(M, field = "ID", method = "CA", minDegree = 10)

Filed Under: Evaluation Methodology Blog

Power BI, Will It Really Give Me Data Viz Superpowers?

May 15, 2025 by Jonah Hall

What is Power BI?

Power BI is a powerful tool for visualizing data. 

It can take multiple large datasets, combine them, transform them, perform calculations, and help you create beautiful visualizations. Think of it as a data wrangler, organizer, and visualizer! Often, a collection of visualizations is assembled into a report. 

My name is Jake Working. I am a third-year student in the ESM PhD program at UTK, and I primarily use Power BI in my day job as a Data Analyst for Digital Learning at UTK. I will briefly discuss some of Power BI’s main functions and point you toward some resources if you want to learn more. 

Why use a data viz software? 

Before we jump into the software, you may be thinking, “Why go through all the trouble of learning another program just to create visualizations? Aren’t my [insert your software of choice here] visualizations good enough?” 

Even once you get comfortable and quick in [your software of choice], at the end of the day these programs’ primary functions are typically to store, present, or analyze your data, not to bring data in for the purpose of creating visualizations. 

The advantage of learning data visualization software like Power BI is that it is designed with visualization as its primary purpose. If you have learned, or even mastered, creating visuals in another program, you can 100% learn and master visualization software like Power BI. 

What can Power BI do? 

First, Power BI is excellent at bringing in data. You can connect multiple large data sources of different types to Power BI, transform them, and perform calculations as necessary to prepare visuals. 

For data sources, if you can access the data, Power BI can connect to or import it. Power BI can take flat files (e.g., Excel, PDF, or CSV), pull directly (snapshot or live) from a database (e.g., MySQL, Oracle, SQL Server), import from a website, an R script, a Python script, and many more! Even if you have multiple data sources, you can load in as many as you need and create relationships between them. 

Creating relationships serves as the backbone of your data model if you have multiple data sources. For example, say you have a data source with student demographic data and another with student course information. If both contain a unique identifier, such as their student ID, you can create a relationship between the data sources based on that student ID and Power BI will know which course information connects with which student in your demographic data.  

Most model-building mistakes occur at this step, and it is important to understand how and why you are structuring your model in a certain way, or else you could end up with sluggish, incorrect, or confusing output. I suggest reading Microsoft’s overview of relationships and then, later, this two-part blog post on Power BI data modeling best practices (part 1, part 2). Warning! The blog post is overly detailed for beginners, but it contains extremely important information for avoiding common Power BI pitfalls with relationships. I have had to deal with, and overcome, issues related to cardinality, filtering, and schema structure that are discussed in the blog. 

An overview of Power BI’s capabilities: bringing in multiple sources of data, cleaning data, creating relationships between data sources, and using the data to generate a visual report. 

Once you have identified your dataset, Power BI can transform your data into clean, workable form within its Power Query editor. This editor offers Excel-like functionality such as updating data types, replacing values, creating new columns, and pivoting data, using either the Power Query GUI or its scripting language, M. These transformation steps can be “saved” to your data source and performed each time Power BI connects to or refreshes that source. So, once you have cleaned up your data once, the cleaning happens automatically using the steps you already created! 

Power BI can then perform complex calculations on your dataset once you’ve loaded it in. It uses a function and reference library called Data Analysis Expressions (DAX, for short) that is similar to the expressions used in Excel. Check out Microsoft’s overview of how DAX can be used within Power BI and the library of DAX functions. In my own work within Power BI, I mainly use calculated columns and measures. 

For example, let’s say I have a column in my dataset that shows the date a form was submitted in this format: mm/dd/yyyy hr:min:sec. If I want to count the number of forms submitted in the calendar year 2025 and display that value on my report, I can create a measure using DAX functions. It would look something like this: 
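
A minimal sketch (the table name Forms and the column SubmittedDate are hypothetical placeholders for your own fields):

// Count the rows of the Forms table whose submission year is 2025
Forms Submitted 2025 =
CALCULATE(
    COUNTROWS(Forms),
    FILTER(Forms, YEAR(Forms[SubmittedDate]) = 2025)
)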

Finally, Power BI’s main function is to create engaging visuals and reports that help you draw information from your data. Power BI has a workspace that lets you easily select visuals, drag fields from your data into them, and then edit or customize the results. The software is pre-loaded with many useful visuals, but you can also search for and download additional user-created visuals. Check out the image below showcasing Power BI’s workspace. 

image from Microsoft (source) 

Visuals can be used together (like in the image) to create a report. These reports can be published in a shareable environment through the Power BI Service so others can view the report. This is how companies create and distribute data reports! 

One exciting feature of Power BI is the ability to use and interact with Microsoft’s AI, Copilot. Copilot is quite capable when it comes to understanding and using data and can even help build visuals and whole reports. Check out this three-minute demo of Copilot within Power BI to get a sense of its capabilities. 

I want to try! 

If you are interested in poking around Power BI to see whether it could be useful for you, you can download the desktop version for free here. Even if you are just working on personal projects and have data you want to create visuals from, it may be worth trying Power BI! 

Microsoft has training, videos, sample data you can play with once you open the program, and a community forum to help with any questions you may have.  

Curious what Power BI can do? Check out some of the submissions from this year’s Microsoft Power BI Visualization World Championships! 

Filed Under: Evaluation Methodology Blog

Empathy in Evaluation: A Meta-Analysis Comparing Evaluation Models in Refugee and Displaced Settings

March 15, 2025 by Jonah Hall

By Dr. Fatima T. Zahra

Hello, my name is Fatima T. Zahra. I am an Assistant Professor of Evaluation, Statistics, and Methodology at the University of Tennessee, Knoxville. My research examines the intersection of human development, AI, and evaluation in diverse and displaced populations. Over the past decade, I have worked on projects that explore the role of evaluation in shaping educational and labor market outcomes in refugee and crisis-affected settings. This post departs from a purely technical discussion to reflect on the role of empathy in evaluation practices—a quality that is often overlooked but profoundly consequential. For more information about the work that I do, check out my website. 

Evaluation is typically regarded as an instrument for assessing program effectiveness. However, when working with marginalized and forcibly displaced populations, conventional evaluation models often fall short. Traditional frameworks prioritize objectivity, standardized indicators, and externally driven methodologies, yet they frequently fail to capture the complexity of lived experiences. This gap has spurred the adoption of empathy in evaluation, particularly participatory and culturally responsive frameworks that prioritize community voices, local knowledge, and equitable power-sharing in the evaluation process. Even so, the work in this area remains substantially underdeveloped. 

A group selfie taken during field work in the Rohingya refugee camps in 2019.

Why Does This Matter?

My recent meta-analysis of 40 studies comparing participatory, culturally responsive, and traditional evaluation models in refugee and displaced settings underscores the importance of empathy-driven approaches. Key findings include: 

  • Participatory evaluations demonstrated high levels of community engagement, with attendance and participation rates ranging from 71% to 78%. Evaluations that positioned community members as co-researchers led to greater program sustainability. 
  • Culturally responsive evaluations yielded statistically significant improvements in mental health outcomes and knowledge acquisition, particularly when interventions incorporated linguistic and cultural adaptations tailored to participants’ lived experiences. 
  • Traditional evaluations exhibited mixed results, proving effective in measuring clinical outcomes but demonstrating lower engagement (54% average participation rate), particularly in cases where community voices were not integrated into the evaluation design. 

The sustainability of programs was not dictated by evaluation models alone but was strongly influenced by community ownership, capacity building, and system integration. Evaluations that actively engaged community members in decision-making processes were more likely to foster lasting impact. 

Lessons from the Field

In our research on early childhood development among Rohingya refugees in Bangladesh, initial evaluations of play-based learning programs suggested minimal paternal engagement. However, when we restructured our approach to include fathers in defining meaningful participation—through focus groups and storytelling sessions—engagement increased dramatically. This shift underscored a critical lesson: evaluation frameworks that do not reflect the lived realities of marginalized communities risk missing key drivers of success. 

Similarly, in a study examining the impact of employment programs in refugee camps, traditional evaluations focused primarily on income and productivity, overlooking the psychological and social effects of work. By incorporating mental well-being as a key evaluation metric—through self-reported dignity, purpose, and social belonging—we found that employment offered far more than economic stability. These findings reinforce an essential principle: sustainable impact is most likely when evaluation is conducted with communities rather than on them, recognizing the full spectrum of human needs beyond economic indicators. 

Rethinking Evaluation: A Call for Change

To advance the field of evaluation, particularly in marginalized and displaced settings, we must adopt new approaches: 

  1. Power-sharing as a foundational principle. Evaluation must shift from an extractive process to a collaborative one. This means prioritizing genuine co-creation, where communities influence decisions from research design to data interpretation. 
  2. Cultural responsiveness as a necessity, not an afterthought. Effective evaluation requires deep listening, linguistic adaptation, and recognition of cultural epistemologies. Without this, findings may be incomplete or misinterpreted. 
  3. Expanding our definition of rigor. Methodological validity should not come at the expense of community relevance. The most robust evaluations integrate standardized measures with locally grounded insights. 
  4. Moving beyond extractive evaluation models. The purpose of evaluation should extend beyond measuring impact to strengthening local capacity for continued assessment and programmatic refinement. 

Looking Ahead

The field of evaluation stands at a pivotal juncture. Traditional approaches, which often prioritize external expertise over local knowledge, are proving inadequate in addressing the complexity of crisis-affected populations. Empathy in evaluation (EIE) methodologies—those that emphasize cultural adaptation, power-sharing, and stakeholder engagement—offer a path toward more just, effective, and sustainable evaluation practice. 

For scholars, this shift necessitates expanding research on context-sensitive methodologies. For practitioners, it demands a reimagining of evaluation as a process that centers mutual learning rather than imposing external standards. For policymakers and funders, it calls for investment in evaluation models that are adaptive, participatory, and aligned with the needs of affected populations. 

As evaluators, we hold a critical responsibility. We can either reinforce existing power imbalances or work to build evaluation frameworks that respect and reflect the realities of the communities we serve. If we aspire to generate meaningful knowledge and drive lasting change, we must place empathy, cultural responsiveness, and community engagement at the core of our methodologies. 

Additional Resources

For those interested in deepening their understanding of these concepts, I highly recommend the following works: 

  • Evaluation in Humanitarian Contexts: Mertens, D. M. (2009). Transformative research and evaluation. Guilford Press. 
  • Culturally Responsive Evaluation: Hood, S., Hopson, R. K., & Kirkhart, K. E. (2015). Culturally responsive evaluation. In K. E. Newcomer, H. P. Hatry, & J. S. Wholey (Eds.), Handbook of practical program evaluation (4th ed., pp. 281–317). Jossey-Bass. https://doi.org/10.1002/9781119171386.ch12 
  • Participatory Research in Development Settings: Chouinard, J. A., & Cousins, J. B. (2015). The journey from rhetoric to reality: Participatory evaluation in a development context. Educational Assessment, Evaluation and Accountability, 27, 5–39. https://doi.org/10.1007/s11092-013-9184-8 
  • Empathy in Evaluation: Zahra, F. T. (n.d.). Empathy in evaluation. https://www.fatimazahra.org/blog-posts/Blog%20Post%20Title%20One-gygte 
  • Empathy and Sensitivity to Injustice: Decety, J., & Cowell, J. M. (2014). Empathy and motivation for justice: Cognitive empathy and concern, but not emotional empathy, predict sensitivity to injustice for others (SPI White Paper No. 135). Social and Political Intelligence Research Hub. https://web.archive.org/web/20221023104046/https://spihub.org/site/resource_files/publications/spi_wp_135_decety.pdf 

Final Thought: Evaluation is more than an assessment tool; it is a mechanism for empowerment. Evaluators have the capacity to amplify community voices, shape equitable policies, and drive sustainable change. The question is not whether we can integrate empathy into our methodologies, but whether we choose to do so. 

Filed Under: Evaluation Methodology Blog

Irwin Recognized As Emerging Professional By ACPA

March 5, 2025 by Jonah Hall

Courtesy of the College of Education, Health, & Human Sciences

At its recent convention in Long Beach, California, College Student Educators International (ACPA) recognized Lauren Irwin with the Annuit Coeptis Emerging Professionals Award. This prestigious award honors exemplary educators in the early stages of their careers. Irwin was one of five early-career professionals recognized for their contributions to the field.

Irwin, an assistant professor in the department of Educational Leadership and Policy Studies (ELPS) in the College of Education, Health, and Human Sciences (CEHHS), is a long-time ACPA member and was deeply honored to receive the award.


“ACPA has long been my professional home in student affairs, and it means a lot to receive this recognition,” said Irwin. “The Annuit Coeptis award is ultimately about community and discussion to support the future of our field. As a former student affairs administrator and early-career faculty member, I am honored to be part of this prestigious multigenerational community and to have the opportunity to learn from and with some of the brightest minds in our field.”

Irwin primarily teaches in the College Student Personnel and Higher Education Administration programs. Her research informs student affairs practice, aiming to enhance and affirm the success of both students and practitioners. Her doctoral dissertation, which examined racialization and whiteness in college student leadership programs, earned ACPA’s Marylu McEwen Dissertation of the Year Award. Additionally, her research has been published in numerous scholarly journals.

“I hope to continue centering my commitment to student learning, equity, and inclusion through my teaching, research, and service,” Irwin said.

Through its seven departments and 13 centers, the UT College of Education, Health, and Human Sciences enhances the quality of life for all through research, outreach, and practice. Find out more at cehhs.utk.edu.

Filed Under: News

Is Your Data Dirty? The Importance of Conducting Frequencies First

March 1, 2025 by Jonah Hall

By Jennifer Ann Morrow, Ph.D.

Data, like life, can be messy. I’ve worked with all types of data, both collected by me and by my clients, for over 25 years and I ALWAYS check my data before conducting my proposed analyses. Sometimes, this part of the analysis process is quick and easy but most of the time it’s like an investigation…you need to be thorough, take your time, and provide evidence for your decision making. 

Data Cleaning Step 3: Perform Initial Frequencies 

After you have drafted your codebook and analysis plan, you should run frequencies on all of the variables in your dataset, both numeric and string. I typically use Excel or SPSS (my colleague Dr. Louis Rocconi prefers R), but you can use any statistical software you feel comfortable with. At this step I run frequencies and request graphics (e.g., bar chart, histogram) for every variable. This output will be invaluable as you work through the next data cleaning steps. 
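
If you lean toward R, a minimal base-R sketch of this step might look like the following; survey_df, q1, and age are hypothetical placeholders for your own dataset and variables:

# Frequency table for every variable, keeping a count of missing values
freqs <- lapply(survey_df, function(x) table(x, useNA = "ifany"))
freqs$q1  # inspect one variable's counts against the codebook

# Quick graphics: a bar chart for a categorical variable, a histogram for a numeric one
barplot(table(survey_df$q1, useNA = "ifany"), main = "q1 responses")
hist(survey_df$age, main = "Age distribution")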

So, what should you be looking for when reviewing your frequencies? One thing I make note of is any discrepancy in coding between my data and what is listed in my codebook. I’ll flag any spelling issues in my variable names/labels and note anything that doesn’t match my codebook. I always check that my value labels (the labels given to my numeric categories) match my codebook and are consistent across sets of variables. If you used an online survey package to collect your data, programming mistakes made when creating the survey can easily result in mislabeled values. Also, if many individuals entered data into your database, the chances increase that mistakes were made during data entry. During this step I also check that I have properly labeled any values designating missing data and that they are consistent with what is listed in my codebook. 

Lastly, I will highlight when I see variables that may have extreme scores (i.e., potential outliers), variables with more than 5% missing data, and variables with very low sample size in any of their response categories. I’ll use this output in future data cleaning steps to aid in my decision making on variable modification. 

Data Cleaning Step 4: Check for Coding Mistakes 

At this step I take the output where I highlighted potential coding issues and start reviewing and making variable modification decisions. Coding issues are more common when data has been entered manually, but you can still have coding errors in online data collection! For any variable with coding issues, I first determine whether I can verify the data from the original or another source. For manually entered data, I’ll go back to the organization/paper survey/data form to verify the values. If a value needs to be changed to the correct response, I make a note to fix it in my next data cleaning step. If I cannot verify the datapoint (as when data was collected anonymously) and the value doesn’t fall within the possible values listed in my codebook, I make a note to set the value to missing at the next data cleaning step. 
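
As a companion to the frequencies output, here is a short R sketch of this verification step; survey_df and q1 are again hypothetical, and the codebook is assumed to allow responses 1 through 5:

# Flag values that fall outside the codebook's allowed range
bad_rows <- which(!(survey_df$q1 %in% 1:5) & !is.na(survey_df$q1))
survey_df[bad_rows, "q1"]  # review these before correcting or setting to missing

# If a value cannot be verified against a source document, set it to missing
survey_df$q1[bad_rows] <- NA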

Additional Advice 

As I am going through my frequencies I will highlight/enter notes directly in the output to make things easier as I move forward through the data cleaning process. I’ll also put notes in my project notebook summarizing any issues and then once I make decisions on variable modifications, I note these in my notebook as well. You will use the output from Step 3 in the next few data cleaning steps to aid in your decision making so keep it handy! 

Resources

12 Steps of Data Cleaning Handout: https://www.dropbox.com/scl/fi/x2bf2t0q134p0cx4kvej0/TWELVE-STEPS-OF-DATA-CLEANING-BRIEF-HANDOUT-MORROW-2017.pdf?rlkey=lfrllz3zya83qzeny6ubwzvjj&dl=0 

Step 1: https://cehhs.utk.edu/elps/organizing-your-evaluation-data-the-importance-of-having-a-comprehensive-data-codebook/ 

Step 2: https://cehhs.utk.edu/elps/clean-correlate-and-compare-the-importance-of-having-a-data-analysis-plan/ 

https://davenport.libguides.com/data275/spss-tutorial/cleaning

https://libguides.library.kent.edu/SPSS/FrequenciesCategorical

https://www.datacamp.com/tutorial/tutorial-data-cleaning-tutorial

https://www.geeksforgeeks.org/frequency-table-in-r

https://www.goskills.com/Excel/Resources/FREQUENCY-Excel

Filed Under: Evaluation Methodology Blog

Boyd Receives Legacy of Excellence Award From ASCA

February 27, 2025 by Jonah Hall

Karen D. Boyd, professor of practice in the College of Education, Health, and Human Sciences (CEHHS) at the University of Tennessee, Knoxville, received the Raymond H. Goldstone Legacy of Excellence Award from the Association for Student Conduct Administration (ASCA) during its 2025 Annual Conference, held in Portland, Oregon.

The Goldstone Legacy of Excellence Award is a new initiative launched by the Goldstone Foundation to recognize distinguished individuals who have impacted the field of student conduct and higher education. The Legacy of Excellence Award annually recognizes a select group of individuals who have left an enduring impact on the profession through significant contributions to the field of student conduct; impactful scholarship and research; and/or leadership within ASCA and other organizations.

Boyd has been a part of ASCA since its inception. Her leadership roles have included Conference Chair, President, and Gehring Academy Chair; she has also authored multiple publications and presentations and even served as Interim Executive Director. In addition, Boyd serves as a professor of practice and director of undergraduate education in the department of Educational Leadership and Policy Studies (ELPS).

“It is an honor to be so recognized for doing work in service to the success of my students and colleagues that I have loved so very much,” said Boyd.

Many members, past and present, have benefited from what she implemented in the Association. The future of the field continues to benefit through her role as a professor at the University of Tennessee, Knoxville, where students consistently regard her courses as among their favorite and most impactful.

Her work educating professionals and students about the landmark Dixon v. Alabama case, and her partnership on the documentary regarding the case, has made a significant impact on the conduct field.

The ASCA Annual Conference, spanning from February 5 – February 8, 2025, gathered nearly 650 student conduct and student affairs practitioners for a professional development experience. The awards were presented during the Awards Luncheon on February 6, 2025, where attendees gathered to connect and congratulate the recipients.

Since its inception in 1986, the Association for Student Conduct Administration (ASCA) has been at the forefront of supporting campus judicial officers and student conduct practitioners. ASCA provides members with strategic resources, including communities of practice, webinars, and intensive learning opportunities (the Donald D. Gehring Academy), and partners with the Raymond H. Goldstone Foundation for scholarship funding. Today, ASCA supports over 2,660 members worldwide and is committed to its mission of serving as a vital resource and advocate in the field of student conduct administration. Learn more at theasca.org.

Through its seven departments and 13 centers, the College of Education, Health, and Human Sciences enhances the quality of life for all through research, outreach, and practice. Find out more at cehhs.utk.edu.

Filed Under: News

David Hamilton Recognized as Field Award Recipient

February 20, 2025 by Jonah Hall

Mr. David Hamilton, Principal at Cumberland Gap High School in the Claiborne County School District, has been named as this year’s recipient of the William J. and Lucille H. Field Award for Excellence in Secondary Principalship for the State of Tennessee.

Pictured from Left to Right: Dr. James Martinez, Mr. David Hamilton, & Mr. Randy Atkins

The Field Award was established to recognize one outstanding secondary school leader each year who demonstrates leadership excellence through commitment to the values of civility, candor, courage, social justice, responsibility, compassion, community, persistence, service, and excellence. Administered by the College of Education, Health, and Human Sciences at the University of Tennessee, the Field Award identifies a Tennessee secondary school principal whose life and work are characterized by leadership excellence. It also encourages secondary school principals to pause and reflect upon their current leadership practice and to consider their experiences, challenges, and opportunities in light of the personal values they embody. 

The Field Award recipient for this year is Mr. David Hamilton, Principal at Cumberland Gap High School (CGHS) in the Claiborne County School District. Mr. Hamilton has served as the principal of CGHS since 2019, and served as the school’s assistant principal from 2003-2018. During that time, he developed and implemented a program that significantly improved student transition and retention, organized initiatives that paired students and community mentors, spearheaded fundraising efforts that raised over $20,000 for student resources and facility upgrades, and established a year-round food and hygiene pantry that ensures students have access to essential resources.

Mr. Hamilton served as a high school health and physical education teacher in the Claiborne County School District from 1999-2003 and coached high school baseball from 2003-2006 and again from 2015-2018. He holds a Bachelor of Science degree in Health and Physical Education, and Master of Arts and Educational Specialist degrees in Educational Administration and Supervision, all from Lincoln Memorial University. The department of Educational Leadership and Policy Studies at the University of Tennessee, Knoxville is proud to name Mr. David Hamilton as this year’s Field Award winner. Congratulations, Mr. Hamilton! 

Filed Under: News
