Tory Dellafiora // Assistant Director, Experiential Learning // Florida State University
Abstract
This practitioner piece focuses on a technique to assess student learning and outcomes with preexisting data (content analysis). Utilizing a qualitative approach, student reflections are evaluated for alignment with either professional or programmatic competencies, goals, and outcomes. The author utilizes a previously conducted assessment to walk readers through the process, including considerations for data collection and organization, qualitative coding practices, and tips for next steps.

Student reflections in an experiential learning-based honor society were evaluated for alignment with the NACE Career Readiness Competencies to see if students were reflecting on their experiences through a career-focused lens, even if they weren’t utilizing the exact language articulated by NACE. The researcher used a deductive, rubric-style coding approach to examine student reflections on four specific prompts with language unrelated to the competencies. Once coded, the data was evaluated both quantitatively (for counts and averages) and qualitatively (for narratives and experiences) to examine what competencies were being demonstrated. In addition, the results of this study were utilized to better articulate the programmatic learning objective and outcome for potential scholars.

For general practitioners, this article will walk step-by-step through the process, pointing out challenges, difficulties, and questions that may occur along the way. Breaking these more intense qualitative research methods into digestible chunks for an average professional can provide an opportunity for actionable assessment to help improve and articulate programmatic goals and objectives, as well as provide justification for the work that we do.
Does your experiential learning program include defined outcomes or goals for participating students? How do you know that these goals or outcomes are being met or achieved? How do you evaluate programs for evidence of student learning, growth, and development?
Many experiential learning opportunities, from internships to study abroad, include student reflections as a critical component for program completion. However, how often are those reflections actually evaluated for content rather than completeness? Are reflections evaluated at the macro level to examine programmatic trends, or are they simply used to understand a single student’s learning and growth during an experience? Student reflections are a treasure trove for assessing programmatic objectives at the macro level, adding tools to help advocate for, alter, and improve our programs.
The purpose of this article is to walk you through the process of evaluating student reflections utilizing content analysis, a qualitative method with a quantitative twist – the best of both worlds. I will be using a content analysis conducted previously with a student honor society at Florida State University to show you the steps required to effectively utilize this assessment tool in your own practice and programs.
Garnet and Gold Scholar Society
For some context, the Garnet and Gold Scholar Society (GGSS) at Florida State University provides an opportunity for students to engage in and be recognized for both in- and out-of-classroom experiences that contribute to being a well-rounded undergraduate student. Housed in the Career Center, this program is a partnership that reaches across the university’s Divisions of Academic Affairs and Student Affairs. Students qualify for recognition by completing experiences and reflections in three of five engagement areas and submitting a culminating synthesis essay. The five areas of engagement are international experiences, internships, leadership, research, and service (Garnet and Gold Scholar Society, 2018).
An expert in each area is assigned to read submitted student reflection essays and approve or suggest revisions. These experts work in the assigned engagement area and evaluate students based on the given rubric. Synthesis essays of no more than 1,200 words are submitted as the final milestone for induction into the Society, with the student’s overall program advisor serving as the evaluator of that essay. These essays integrate experiences from all three engagement areas and have students reflect on interconnectedness and collaboration with others, as well as skills that may have been developed through the experiences. These synthesis essays were the unit of analysis for the assessment, as they reflected a student’s overall involvement and participation in the program.
The assessment method used to evaluate these overall synthesis essays was content analysis. Though it will be explored more fully throughout this discussion, content analysis is a qualitative method that allows data to also be explored numerically while identifying the presence or frequency of words or themes within a text. Though there are several frameworks that can be used for content analysis, this assessment followed directed content analysis, which relies on predetermined codes (Hsieh & Shannon, 2005).
Steps for Conducting a Content Analysis
To complete a directed content analysis using predetermined codes, there are seven steps for success. Some of these are easier than others, while some require intense legwork. However, it’s critical to be thoughtful throughout, as being deliberate at the front end will save you time and headaches at the end of the process.
1. Determine your dataset
See what you’re working with! What data exists that you’re able to evaluate? This could be student reflections, interview transcripts, or open-ended feedback from surveys that have never really been examined. For some programs, there may be one definite data source, while other programs may have a wide variety of information to pick from. I recommend comparing apples to apples when conducting your first content analysis; that is to say, if you have multiple student reflections from different points in the program and end-of-program survey responses, stick to one of those reflections rather than combining everything into one massive dataset. You’ll be able to see clearer patterns when sticking to one data source, and it will also keep you from going off the rails.
For the GGSS assessment, we knew that there were a ton of student reflections that were probably full of useful information that had never been examined. Though students reflect on individual engagement areas throughout the process, we chose to focus on the final synthesis essay since every scholar was required to answer the same four questions (no matter what activities they had participated in) and it happened at the end of their time in the program. We also knew that we wanted to utilize this research not only for programmatic benefit and improvement but to report out on results, so it was necessary to get IRB approval and receive student permission to utilize their reflections for the data analysis. Depending on your intent with the research, this step may or may not be necessary. Once we determined what dataset we were going to examine, we had to figure out how to frame it for analysis.
It’s important to note that depending on your evaluation style, steps one and two may be in reverse order. You may say, “I have a dataset that I know has information, but I’m not sure how to frame it” (step one, then step two), or it may be, “I know our students learn X during this experience and want to show that, but I’m not sure which of our datasets would best highlight that information” (step two, then step one).
2. Decide on your rubric/code
To perform a content analysis using deductive codes, you must have a rubric or codebook established before beginning the process. This may sound intimidating, but it really is just asking the questions, “What are the exact items you’re looking for? How will you know when a sentence aligns with what you’re examining?” This rubric could be anything from indications of the different stages of Kolb’s experiential learning cycle to a list of resources that students mention during a post-event survey. Though codes can be developed entirely by the researcher or research team (“We want to look in these responses for students utilizing Career Center resources”), they can also be aligned with a preexisting theory or benchmark outside of the research process.
The GGSS analysis utilized the NACE Career Readiness Competencies (2017). As the Garnet and Gold Scholar Society is housed within the FSU Career Center, we were curious whether the synthesis reflections showed students demonstrating career-ready behaviors or thinking in line with the Center’s broader goals. In addition, utilizing a preexisting set of codes (the competencies themselves) allowed us to dive right in and begin making meaning aligned with the overall Career Center goal of career readiness in our graduates. There are many other frames through which we could have examined this data, but we specifically wanted to find data points that would show the program’s alignment with the departmental mission.
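To make the idea of a codebook more concrete, here is a minimal sketch of how a deductive codebook might be written down. The competency names follow the NACE framework, but the definitions, indicator phrases, and data structure are hypothetical illustrations rather than the actual GGSS rubric.

```python
# A minimal sketch of a deductive codebook for rubric-based coding.
# Competency names follow the NACE Career Readiness framework; the
# definitions and indicator phrases below are hypothetical examples,
# not the actual GGSS rubric.

codebook = {
    "leadership": {
        "definition": "Leverages the strengths of others to achieve common goals.",
        "indicators": ["led a team", "delegated", "motivated others"],
    },
    "critical_thinking": {
        "definition": "Exercises sound reasoning to analyze issues and solve problems.",
        "indicators": ["identified the problem", "weighed options", "problem-solving"],
    },
    "global_intercultural_fluency": {
        "definition": "Values, respects, and learns from diverse cultures and perspectives.",
        "indicators": ["different culture", "global perspective", "host community"],
    },
    # ...the remaining competencies would be defined the same way.
}

# During coding, the analyst (not the software) decides whether a passage
# aligns with a code; the indicators are reminders, not automatic triggers.
for code, entry in codebook.items():
    print(f"{code}: {entry['definition']}")
```

However it is stored, the point of the codebook is simply to make the “what exactly am I looking for?” question answerable in the same way for every essay.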
3. Clean and combine data
Depending on your dataset, the time this step takes will vary widely. For the GGSS analysis, certain programmatic and demographic information was stored in separate locations, with the actual reflections stored in an entirely different system. This is also the stage where you may need to consider what software or method you plan to code with. To prepare for NVivo analysis, the simplest approach is to combine all data into a single Excel spreadsheet for import, while coding by hand may require each reflection to be printed on its own sheet of paper.
This step is also the time to think about what you may want to know later – it’s better to have information you’ll never use than to need a piece of information further down the line and not have it. This can particularly be a problem if you choose to de-identify student information to protect privacy during the process. We included gender, college affiliation, and major in our student profiles, and none of those were utilized for analysis – however, since we included them originally, we still have the option to go back and examine those factors later.
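As one way to picture this combining step, the sketch below merges demographic records and reflection text into a single spreadsheet ready for NVivo import or hand coding. The column names and values are hypothetical placeholders, not the actual GGSS records.

```python
# A minimal sketch of combining separate data sources into one spreadsheet.
# Column names and values are hypothetical placeholders.
import pandas as pd

# Demographic/programmatic information exported from one system...
profiles = pd.DataFrame({
    "student_id": ["001", "002"],
    "gender": ["F", "M"],
    "college": ["Arts & Sciences", "Business"],
    "major": ["Biology", "Marketing"],
})

# ...and the synthesis essays exported from another.
essays = pd.DataFrame({
    "student_id": ["001", "002"],
    "engagement_areas": ["Leadership; Service; Research", "Internship; International; Service"],
    "essay_text": ["(full synthesis essay text)", "(full synthesis essay text)"],
})

# Join on the shared identifier so each row holds everything about one student.
# Extra fields (gender, college, major) stay in the file even if they are not
# analyzed now, so they remain available later.
combined = profiles.merge(essays, on="student_id", how="inner")
combined.to_excel("ggss_reflections_combined.xlsx", index=False)
```

Joining on a shared identifier keeps each student’s demographics attached to their reflection, so those fields remain available for later analysis even if they are never coded.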
4. Coding process – use those rubrics!
This is the step that can feel the most daunting for those not used to qualitative analysis. Because we’ve already determined that we’re using a preexisting rubric or code, part of the fear or struggle is eliminated. As we read through each essay, reflection, or response, we need to ask ourselves, “Does this sentence (or other unit of analysis) show alignment with my rubric or code?” In the simplest terms, that’s it! Particularly in more complex responses, there can be questions about a student’s intent within a unit of analysis, and it is the analyst’s job to determine whether the response truly aligns with the code or rubric.
In addition, this is the step where you may want to consider what your unit of analysis will be: are you only going to be looking for specific words, or will you be highlighting full sentences or paragraphs that demonstrate alignment with your rubric? My recommendation is to highlight as much as needed for a thought to retain its context if it were removed from its current location. For some reflections, this may be half of a sentence, while others may require three or four sentences to capture the whole thought. However, this suggestion is only that – a suggestion. Code in whatever way makes the most sense for your project!
You are also not strictly bound to code only what appears on your rubric. Though that is obviously the focus of your analysis, other things may emerge during the process that you want to examine later or have on hand. For example, the GGSS analysis included an additional code for when students talked positively about the impact of the program itself, as well as another code called ‘Cool Stuff’ that captured narratives that would be great for sharing with campus partners.
Depending on your software or process, the mechanics of coding will look very different. This may mean selecting a certain phrase in NVivo to assign it to a node, utilizing a specific color of highlight in a Word document, or using a specific notation to mark up a physical piece of paper. Code however you are best able to complete the process. If you prefer working by hand, do it by hand. However, do know that each system has its advantages and disadvantages. Particularly for content analysis, software built specifically for coding (such as NVivo) will automatically calculate things such as counts and averages, and it makes it easier to run analyses on demographic factors.
It is also important to note that this is usually the most time- and thought-intensive part of a content analysis. Do whatever you have to in order to be productive; this may mean closing a door, going to a quiet place outside of your usual location, plugging in a second (or third) monitor, or taping a printed version of your rubric to the wall next to your desk. It may mean making a sign for your door or space that indicates you’re working with noise-cancelling headphones and would appreciate the time alone. It’s also a great feeling to get into the coding groove, but don’t make the same mistake I did and sit without moving for hours at a time; this isn’t healthy for your eyes or brain, or for your daily step count. However, these focused chunks of time can allow you to project approximately how much time remains in your coding process. I knew that I could code three synthesis essays in a focused half hour, so I just had to do that math against how many remained in order to budget my time appropriately.
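For those coding by hand or in a word processor rather than in dedicated software, keeping a simple coding log as you go can make the counting in the next step much easier. The sketch below shows one hypothetical way such a log might be recorded; the column names and sample rows are illustrative only, not the format used in the GGSS study.

```python
# A minimal sketch of a hand-coding log: one row per coded excerpt.
# Column names and sample rows are illustrative placeholders.
import csv

rows = [
    # essay_id, engagement_area, code, excerpt (enough text to preserve context)
    ("001", "leadership", "leadership", "As president of the club, I learned to delegate..."),
    ("001", "leadership", "teamwork", "We divided the project so each member's strengths..."),
    ("002", "international", "global_intercultural_fluency", "Living with a host family showed me..."),
]

with open("coding_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["essay_id", "engagement_area", "code", "excerpt"])
    writer.writerows(rows)
```

A log like this is essentially what NVivo builds for you behind the scenes; keeping one by hand preserves the same ability to count and compare codes later.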
5. Analyze for patterns
The first way to look for patterns is by looking at the raw numbers from your analysis. Using the GGSS analysis as an example, is there a particular competency that appeared to be discussed more than any other? Was there a competency discussed more often by students coded to a particular engagement area? As mentioned above, depending on your coding method and software, this may be a very easy or very difficult step. Using software like NVivo, you can run reports, cross-tabulations, and word clouds to look for further patterns.
This is where you want to think big-picture about the particular data that you’ve collected and what the results indicate. Is the format you’ve currently chosen the most appropriate way to share your message? Are you capturing the full picture, or is there something missing in your analysis? For example, are you comparing raw numbers where the sample sizes are not the same? Would it make more sense to compare averages instead? Though students engaged with leadership talked about the leadership competency with great frequency and had a raw count that stood out from the rest of the sample (115 times total, 3.19 times per essay), students engaged in international experiences talked about global and intercultural fluency at a similar rate (73 times total, 3.17 times per essay). The higher raw count for leadership was offset by the higher number of leadership participants, leaving the two engagement areas and competencies at similar per-essay discussion rates.
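To illustrate why per-essay averages can be a fairer comparison than raw counts when group sizes differ, here is a small sketch of that arithmetic. The mention totals echo the leadership and global/intercultural fluency figures above, while the essay counts are approximate back-calculations from those figures rather than exact programmatic data.

```python
# A minimal sketch of comparing raw counts vs. per-essay averages.
# Mention totals echo the figures reported above; essay counts are
# approximate back-calculations, not exact programmatic data.

groups = {
    "leadership / leadership competency": {"mentions": 115, "essays": 36},
    "international / global & intercultural fluency": {"mentions": 73, "essays": 23},
}

for name, g in groups.items():
    avg = g["mentions"] / g["essays"]
    print(f"{name}: {g['mentions']} total mentions, {avg:.2f} per essay")

# Raw counts suggest leadership dominates, but the per-essay averages
# (about 3.2 in both groups) show the two areas were discussed at a
# similar rate once group size is taken into account.
```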
This is also the time to examine the results of your coding in a more qualitative manner. Rather than being concerned with how often somebody discussed a competency, you may be more interested in the language they used to discuss it. Students whose responses aligned with the critical thinking competency were more likely to discuss “problem-solving” and to reflect on the application of critical thinking: “this is how we did things, here is the problem we identified, and here is how we’ll be doing it going forward”. Reading the actual language that students use may give you insight into what language to apply to a program or service going forward.
6. Synthesize and summarize results
This is the part that can sometimes feel overwhelming. You did an amazing assessment, you found incredible results, and ultimately it comes down to the question of “so what?” What are the key points you want someone to take away from all the work that you’ve done? What is the most important thing you discovered? Not only is this critical for an audience, but this is also critical for your own discussion of the work. Being able to summarize the key content in a succinct manner means that more people will have access to your work.
This step could take many different forms, from writing main takeaways on post-it notes or index cards and hanging them on a wall to writing a five-word sentence about the key findings. This is an opportune time to bring in another set of eyes or ears, someone who can help clarify fledgling ideas and assist you in identifying the key takeaways. This is also a key time to identify any qualitative data from your “Cool Stuff” code that would be worth sharing out, as it can serve as part of the summary or as a hook for the data itself.
7. Take action
In the assessment loop, this is the step that is most often missed. Now that you’ve conducted an assessment, you have a responsibility to do something with the data. A final product may be a marketing campaign for your program that includes relevant data, a one-page summary for a leadership team to understand the program’s impact, or a presentation for colleagues about your findings and the implications for their work. This action step may also include making programmatic change, such as rewriting learning objectives, revising reflection prompts, or further explaining program goals on a website. Data is only as useful as the change it inspires, even if that is just assessment-informed change to your daily practice.
Implications for Practice
Hopefully this series of seven action steps has helped break apart what may be an intimidating assessment process into more digestible segments. Content analysis truly is a great place to start when diving into the assessment world, as it allows you to engage with qualitative data in a numeric way while still preserving the language underneath. It is my hope that this aids you in gaining comfort in the world of assessment, and that it is the first of many steps on this journey!
References
Garnet and Gold Scholar Society. (2018). About [Fact sheet]. Retrieved from
http://garnetandgoldscholar.fsu.edu/about
Hsieh, H., & Shannon, S. E. (2005). Three approaches to qualitative content analysis. Qualitative Health Research, 15(9), 1277-1288. doi:10.1177/1049732305276687
National Association of Colleges and Employers. (2017). Career readiness defined [Fact sheet].
Retrieved from http://www.naceweb.org/career-readiness/competencies/career-readiness-defined/