"

12 Evaluating Impact of First-Year Experience Programs in Canadian Higher Education

Heather Doyle, Nathan Barton, & Jack Killeen

Introduction

Higher education in Canada is changing. We are seeing declines in enrolment, changes in provincial funding, tuition fee increases, record levels of student debt, and shifts in international student enrolment (Varela, 2024; Henderson, 2024; Usher, 2022). In student affairs, we face internal and external pressures to achieve more with fewer resources while ensuring our programs and services positively impact students’ growth, development, and success outside the classroom. Assessment has never been more essential for understanding the impact our initiatives have on student learning and success. Although assessment has been widely discussed in higher education, and there is a substantial body of literature on integrating it into student affairs, particularly within the U.S., it is essential to recognize and account for the distinct differences and unique contexts in Canada.

By participating in an ongoing assessment cycle, post-secondary institutions can foster a culture of evidence-based decision-making, ensuring that evidence of student learning and development is used to create and improve programming, services, and events. This chapter explores why assessing first-year experience programming matters in the Canadian context and walks through the assessment cycle, providing practical examples and discussing critical aspects that can be evaluated to help support a successful first-year experience.

Background

When conducting assessment, it is essential to establish a solid foundation by understanding the key factors that need to be evaluated. Good assessment practices go beyond satisfaction surveys by delving into the impact on student achievement, learning, and success. We know that the first-year experience is pivotal in shaping students’ academic success and well-being. For many, the transition from high school to post-secondary represents a significant shift in academic expectations, social dynamics, and personal responsibility (Hassel & Ridout, 2018). It sets the tone for how well students will adapt and is a period of adjustment as they navigate new environments, relationships, and academic demands.

Several Canadian institutions, including York University, Dalhousie University, Humber College, and Ontario Tech University, have adopted the 5 Senses of Success model to support student success in the first year. Originating from Lizzio’s (2006) work, the 5 Senses of Success framework offers a holistic approach to addressing students’ needs and concerns throughout their academic journey. Aligned with research on student development, it has demonstrated predictive power regarding student satisfaction and outcomes. The model focuses on five key areas (or “senses”): purpose, culture/identity, connectedness, resourcefulness, and capability (Lizzio & Wilson, 2013). By adopting a life-cycle perspective on student development, institutions can tailor their programs to address the diverse identity-related tasks and needs students encounter at various stages of their academic journey, thereby enhancing their overall success and satisfaction (Hassel & Ridout, 2018). This model can also serve as a foundation for developing learning outcomes throughout the assessment cycle.

Related to the first year, studies have also shown that students’ high school grades and social integration are predictive of success and retention. However, there is deeper context to consider. According to Lizzio and Wilson (2013), academic risk is not an inherent characteristic or inevitable outcome for specific groups. Instead, they posit that risk results from the interaction between students and their environment, and that first-year students are often unable to make realistic appraisals of what it takes to be a post-secondary student. This is supported by the 2022 Canadian University Survey Consortium (CUSC) survey results. When entering students were asked about their expectations, many reported experiencing a different reality than they had anticipated:

  • 46% indicated that the cost of attending university was more than expected.
  • 55% felt that the time put into coursework was more than they anticipated.
  • 50% indicated that the courses were more academically demanding than expected.

This is where first-year experience programming can help with student success and persistence. Orientation programs can be viewed as anticipatory socialization: a process or set of experiences through which students learn the values, norms, and behaviours they will encounter in a new social setting (Pascarella et al., 1986). When anticipatory socialization is effective, the student becomes more successfully integrated into the new setting and functions more effectively (Pascarella et al., 1986). Students exposed to and participating in first-year experience programs will be more likely to become socially and academically integrated into the post-secondary experience than students who do not participate (Pascarella et al., 1986). This is evidenced by Dalhousie’s own data. Students were surveyed and asked whether they had attended at least one event on campus that term, and were then asked to rate how connected they felt to campus. Students who attended at least one event on campus were 287% more likely to feel connected. Therefore, rather than being a one-time experience, orientation is more effective as an ongoing attempt to enhance students’ successful integration into the campus academic and social systems before and throughout their first year.
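
To make a statistic like “287% more likely” concrete, the sketch below shows one common way such a figure can be computed from survey counts: the proportion of event attendees reporting a sense of connection is divided by the proportion of non-attendees reporting the same, and the ratio is expressed as a percentage above parity. The counts are invented for illustration and do not reflect Dalhousie’s data or its exact method.

```python
# A hypothetical illustration: these counts are invented and are not Dalhousie's data,
# and Dalhousie's exact calculation method is not described in this chapter.
attendees_connected, attendees_total = 290, 500          # attended at least one campus event
non_attendees_connected, non_attendees_total = 45, 300   # attended no campus events

p_attend = attendees_connected / attendees_total              # 0.58
p_no_attend = non_attendees_connected / non_attendees_total   # 0.15

# Ratio of the two proportions: the relative likelihood of reporting a sense of connection.
relative_likelihood = p_attend / p_no_attend                  # ~3.87

# "X% more likely" expresses how far that ratio sits above 1.
pct_more_likely = (relative_likelihood - 1) * 100
print(f"Attendees were {pct_more_likely:.0f}% more likely to report feeling connected.")
# With these invented counts, this prints roughly 287%.
```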

While exciting, the transition from high school to post-secondary (adolescence to adulthood) is also stressful. Students experience high rates of academic, social, and emotional stress, which have only increased since the beginning of the COVID-19 pandemic (American College Health Association, 2022; Barbayannis et al., 2022; Gallagher et al., 2019; Kwan et al., 2021). When surveyed, 47% of Canadian students across 16 campuses indicated they had experienced moderate stress in the past 30 days, and 36.7% reported high stress (American College Health Association, 2022). Alongside these life changes and stressors, late adolescence is also a period of psychological change in which mental health issues begin to arise (Slykerman & Mitchell, 2021). This results in high levels of anxiety and depression among students across campuses (American College Health Association, 2022; Gallagher et al., 2019).

Understanding students’ social and emotional well-being is essential to promoting a psychologically healthy campus. To support students throughout this challenging transition, we can use assessments that help us understand our campus’s well-being, needs, and wants. By assessing the social and emotional well-being of first-year students in post-secondary education, we can gain valuable insight into the psychological health of our campus. Assessment results can be used to understand our strengths and weaknesses, inform targeted interventions, create resources, and adapt services. When collecting data on students’ personal experiences, such as emotional well-being, it is crucial that the data be used to make positive changes, in appreciation of students’ engagement and willingness to share their lived experiences.

Incorporating Assessment

Student affairs professionals must assess our transition programs and services regularly to ensure that the goals and outcomes of first-year experience programming are met and that students acquire the knowledge and skills needed to be successful. By conducting regular assessments, we can ensure that our efforts effectively contribute to students’ academic, personal, and professional development. However, we know that assessment and evaluation remain inconsistent across student affairs units in Canada. According to a recent study by the Canadian Association of College and University Student Services (CACUSS) (Academica Group, 2023), only 31% of Canadian post-secondary institutions surveyed had assessment units as part of student affairs. In addition to the need for embedded assessment units, research has identified several obstacles that can hinder the assessment process (Scott, 2018).

One of the most common obstacles is a lack of skill and knowledge among staff members. Assessing and evaluating programs requires a certain level of expertise, and staff members need to be trained in these areas to carry out assessments effectively. Another common obstacle is a lack of dedicated time to engage in the assessment process. Student affairs staff members have many responsibilities and priorities, and finding time to carry out assessment can be challenging; it is often done off the “side of the desk”. Finally, there may be a general feeling of distrust among staff members regarding assessment. This can stem from various factors, such as fear of negative feedback or a lack of understanding of the purpose of assessment (Scott, 2018). When embedding assessment into our work, it is essential to name and address these barriers to help demystify the assessment process and make it feel like part of our everyday routine. Assessment can often seem daunting or disconnected from our regular tasks, but by integrating it seamlessly into our work, we can create a more holistic and effective approach to evaluating progress and learning.

In addition to the challenges listed above, according to the 2023 Pan-Canadian First Year Experience and Students in Transition Programs Survey, the most common reasons for cancelling or changing programming were lack of student uptake and institutional funding, while the top barriers to implementing new programming were maintaining partnerships, lack of financial and staffing resources, and inconsistent student engagement. Assessment has a significant role to play in addressing these barriers. The absence of relevant data often complicates the process of advocating for resources and partnerships. Without appropriate data to tell our stories, we cannot identify trends and gaps (such as in student engagement) meaningfully and purposefully. Without data, we tend to make decisions based on anecdotal or surface-level information, which often leads to frustration and inconsistent programming.

Overall, student affairs practitioners must overcome these obstacles and prioritize assessment to improve the quality of programs and services provided to students. By leveraging assessment, we can plan and implement programming that is grounded in theoretical knowledge and tailored to the feedback we receive from students. Assessment can also serve students directly by amplifying their voices, creating opportunities to engage with administration, and incorporating input from a wide range of experiences.

The Importance of Incorporating Socially Just Practices into Assessment

Pursuing higher education is often viewed as the catalyst for a better life, and for good reason. According to Statistics Canada (2017), the median income of earners holding a high school diploma in Canada was $49,514, compared to a median income of $75,212 for earners holding a bachelor’s degree. For many, a post-secondary education is considered an equalizer, where everyone across different socioeconomic backgrounds and cultures can enter a post-secondary institution with the same opportunities. However, it is becoming increasingly clear that the first-year experience is not universal. Beyond the stressful transition from high school to post-secondary, Black, Indigenous, and People of Colour (BIPOC) students routinely experience racial microaggressions on predominantly white campuses in Canada. A study conducted in the U.S. found that anxiety and depressive disorders tend to be higher among BIPOC students than white students because of their experiences of discrimination (Pieterse et al., 2010). A qualitative study by Houshmand and Spanierman (2021) found that first-year BIPOC Canadian students felt like they did not belong on campus. In response, they sought out community elsewhere, and some even travelled to a more racially diverse campus where they felt they belonged.

Assessment can potentially help address some of these inequalities experienced by BIPOC students on university campuses. Addressing the first-year student experience is essential because it sets a precedent for students’ overall experience. One study used assessment to identify first-year students at risk of dropping out (Beck & Davidson, 2015) and found that variables dependent on interactions with the academic and social environments were the best predictors of retention. Together, these studies illustrate the importance of cultural spaces, student organizations, and faculty representation in creating a desirable academic and social environment for first-year BIPOC students. Through assessment, institutions can identify areas where students face inequality and work towards creating equitable solutions. However, it is essential to note that assessment is an inherently biased process: assessment designers unconsciously create assessments for students who look like them, learn like them, or hold similar mental models around content, culture, religion, and school (Learner-centered initiatives, 2016). Awareness and recognition of this innate bias are the first step towards creating a just assessment.

Assessment designers should actively seek diverse perspectives from Black, Indigenous, and People of Colour communities and students. Regularly re-designing, revising, and re-administering assessments is crucial. By incorporating the viewpoints of BIPOC students, we can make evidence-based changes that address inequality. It is essential to recognize that each campus has a diverse population with varying needs related to social and emotional well-being, and that diverse students will encounter a wide range of stressors. Their experiences and well-being during the transition period differ significantly based on factors such as ethnicity, identity, socioeconomic background, and membership in marginalized groups. For example, 2SLGBTQIA+ students are more likely to experience mental health issues and higher stress than students who do not identify with this group (Steele et al., 2017; Slykerman & Mitchell, 2021). International students experience higher rates of loneliness, self-stigmatization of mental health issues, and acculturative stress (Alharbi & Smith, 2018; Baghoori et al., 2022). Keep diversity in mind as you navigate the assessment of well-being on your campus. Although you may find that a stressor impacts only an individual or a small group, that experience is valid, and it is essential to address it in your assessment and in how you discuss the results.

The Assessment Cycle

Mission/Purpose/Goals/Values

The Assessment Cycle starts by clearly defining the purpose of the assessment within the context of the unit, service or program while considering the specific goals, objectives, and outcomes the assessment aims to measure or achieve (Goff et al., 2015). It also ensures alignment between assessment goals and institutional mission, program objectives, and student needs. Engaging partners in this process, including students, is vital to ensure full buy-in. At Dalhousie University, our team embarked on a collaborative process to create a Transition and Engagement Framework. During the development, we engaged in workshop discussions to collectively craft a vision and purpose (mission) that effectively captured the spirit of our shared objectives. We recognized that it was essential for our vision and purpose to resonate with all members of our community, from students to faculty to Ancillary Services to Student Affairs, to ensure a unified and impactful approach to our work. We considered our team members’ varied perspectives and expertise, and worked together to reflect our collective strengths and values. Through this process we gained a deeper understanding of our community’s needs and aspirations, and we were able to develop a roadmap for achieving our goals. Overall, the collaborative process was a valuable learning experience that brought our teams closer together and set us on a path towards success. In the assessment cycle, the vision, mission, and values will be the foundation of your work and serve as the centre for all you do.

Identify and Map Outcomes

Student learning outcomes represent a critical aspect of assessment, delineating what students are expected to know, do, and value as a result of their interactions with your program, service, or event. Learning outcomes should be time-bound, transparent, measurable, and aligned with the mission and vision of your program and service. They should be grounded in theory, programming goals, and background documents such as the Council for the Advancement of Standards in Higher Education standards (Robbins et al., 2022). Consider the theories in which you will ground your outcomes. For example, if your institution aligns with the 5 Senses of Success, this should be reflected in your learning outcomes. If you are concerned with students’ social and emotional well-being, consider some of the data points mentioned above and develop outcomes around them. You may be interested in theories of anticipatory socialization and want to measure that through the development of outcomes. Given that learning and development are ongoing processes, adopting a developmental approach is essential, tailoring learning outcomes to students’ evolving needs throughout their post-secondary careers.

When writing outcomes, you will want to consider various learning taxonomies. Bloom’s Taxonomy is one of the most recognized learning frameworks in education. Educators often use Bloom’s Taxonomy to create learning outcomes that target not only subject matter but also the depth of learning they want students to achieve, and then create assessments that accurately report on students’ progress towards these outcomes (Anderson & Krathwohl, 2001). Bloom’s Taxonomy comprises three learning domains: cognitive, affective, and psychomotor, and assigns to each of these domains a hierarchy that corresponds to different levels of learning (Centre for Teaching Excellence, University of Waterloo, n.d.). Another taxonomy often used is Fink’s Taxonomy of Significant Learning. Fink’s taxonomy is designed to help educators create compelling learning experiences for students. It identifies six types of learning that students can achieve, including the acquisition of knowledge, the development of skills, and the formation of values or attitudes (Fink, 2013). When considering taxonomies, we can (and should) also look at different ways of knowing. In the 2016 article, Switching from Bloom to the Medicine Wheel, LaFever proposes expanding on Bloom’s three domains of learning and adding a fourth quadrant based on the Medicine Wheel. This fourth quadrant, spiritual, is essential for achieving balance in curricular design and supporting students’ learning goals (LaFever, 2016). Incorporating different ways of knowing into the development of learning outcomes is crucial for creating a comprehensive and effective educational experience. By recognizing and valuing diverse perspectives and knowledge systems, we can enhance the assessment process and promote more holistic learning (Calgary Board of Education, 2022).

Operational Outcomes

Operational outcomes are metrics that gauge the effectiveness of a program’s operational aspects, typically focusing on demand, resource utilization, and efficiencies. Operational outcomes include retention rates, the number of students at events, office/service utilization rates, satisfaction with events/services, and so on. They are often the default measures we turn to in assessment: How many students attend our events? Were they satisfied? Although these numbers are essential, they only tell part of the story.

The Scarborough Charter on Anti-Black Racism and Black Inclusion in Canadian Higher Education: Principles, Actions, and Accountabilities is a commitment by institutions across Canada to combat anti-Black racism and foster Black inclusion in higher education. In Section 2, the Charter calls for universities and colleges to commit to fostering inclusive excellence: “In governance by identifying the extent of (under) representation through baseline data compilation and analysis, to provide demographic knowledge of Black faculty, staff and students within their institutions, with suitable disaggregation of data.” The disaggregation of data is essential to obtaining an accurate measure of the retention of BIPOC first-year students. Further, disaggregation is needed to determine whether programs enacted to support BIPOC students are improving retention. Without disaggregation, the experiences of BIPOC students are often lost in the broader student body data. In addition to disaggregating data, including students in the data analysis is also essential. Consider disaggregating your data to determine which students are accessing your programming and which are not. What does this tell you? How can you use this data to make your programming more accessible and ensure that you are meeting the needs of all your students?
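
As a minimal sketch of what this disaggregation can look like in practice, the example below compares an overall participation rate with participation rates broken out by self-identified group. The table, column names, and values are hypothetical illustrations, not an actual institutional schema.

```python
import pandas as pd

# Hypothetical participation records: one row per first-year student, with a
# self-identified demographic field and a flag for program participation.
# Column names and values are illustrative, not an actual institutional schema.
students = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5, 6, 7, 8],
    "identity_group": ["Black", "White", "Indigenous", "White",
                       "Racialized", "Black", "White", "Indigenous"],
    "participated": [True, True, False, True, False, False, True, False],
})

# Aggregate view: a single overall rate can hide group-level differences.
overall_rate = students["participated"].mean()

# Disaggregated view: participation rate and group size for each self-identified group.
by_group = students.groupby("identity_group")["participated"].agg(rate="mean", n="size")

print(f"Overall participation rate: {overall_rate:.0%}")
print(by_group)  # small group sizes: interpret carefully and alongside student voices
```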

Retention data also falls under operational data. Although we may not be able to prove that programming is directly linked to student retention (as there are many factors to consider), we can show correlation. At Dalhousie, we have a program called Together@Dal, created to support new students and help them transition to university by connecting them with upper-year mentors. Together@Dal has four outcomes: increasing students’ sense of belonging, increasing their academic self-efficacy, increasing their engagement with the Dalhousie community, and providing the support and resources needed to be academically successful. Three years into the program, students participating in Together@Dal continue to see gains in their first-to-second-year retention rates. We also have this data disaggregated by student type, location (rural/urban), and identity. Although we need to consider participation bias and other factors, by showing the program’s impact both qualitatively and quantitatively, we have been able to demonstrate its value to students, administrators, and donors.

Figure 1: First- to second-year retention rates of Dalhousie students who participated in Together@Dal versus first-year Dalhousie students who did not participate

Mapping

Mapping determines when, where, and through what experiences the desired outcomes will be achieved. Operational outcomes and student learning outcomes can be mapped across the experience, which helps to visually represent delivery and the progression of student learning (Robbins et al., 2022). Mapping also helps identify discrepancies between expectations and potential gaps in delivery. Before initiating an assessment, it is essential to define what is being assessed and to ensure manageability, relevance to goals, and clarity on how achievement will be recognized (Robbins et al., 2022).
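
As a minimal, hypothetical sketch of an outcome map (the outcome and touchpoint names below are invented, not drawn from Robbins et al.), a simple structure pairing each outcome with the experiences intended to deliver it can make coverage gaps visible:

```python
# Illustrative outcome map: each outcome is paired with the program touchpoints
# intended to address it. Outcome and touchpoint names are hypothetical placeholders.
outcome_map = {
    "Sense of belonging": ["Orientation week", "Peer mentoring", "Residence events"],
    "Academic self-efficacy": ["First-year seminar", "Peer mentoring", "Advising check-in"],
    "Help-seeking / resourcefulness": ["Library workshop"],
}

# Mapping makes delivery gaps visible: an outcome that relies on a single
# touchpoint may signal a discrepancy between expectations and delivery.
for outcome, touchpoints in outcome_map.items():
    if len(touchpoints) < 2:
        print(f"Potential gap: '{outcome}' is addressed only by '{touchpoints[0]}'")
```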

Gather Evidence

Next, you want to determine how you will assess your learning outcomes. Consider the type of measure you will be using. Direct assessment methods require students to demonstrate their knowledge and skills to show that the outcome has been attained. Indirect methods, by contrast, rely on indicators of an outcome rather than measuring it directly. Direct and indirect assessment methods can complement each other, offering multifaceted insights into student learning.

Ultimately, the choice of assessment method depends on the question, objectives, and context, as well as the strengths and limitations of each approach. Assessment methods are crucial in gathering data, evaluating outcomes, and informing decision-making. Two primary approaches are qualitative and quantitative methods. Qualitative assessment methods explore and understand complex phenomena by examining individuals’ experiences, perceptions, and behaviours (Tenny et al., 2022). These methods emphasize subjective interpretation and meaning-making, allowing you to capture rich, detailed data.

Qualitative assessment methods include interviews, focus groups, observations, and document analysis. These approaches facilitate the collection of narrative data, personal stories, and contextual information. Focus groups are an excellent example of a more personable form of data collection, as they facilitate an open discussion where ideas can build on one another. When running a focus group, it is essential to include diverse populations to capture varied experiences and perspectives. Consider holding multiple sessions with different groups, faculties, societies/clubs, and so on. Create a welcoming environment by choosing a location that is accessible, comfortable, neutral, and free from distraction. Use open-ended questions to encourage students to share their thoughts, feelings, and perceptions about their social interactions, support networks, and emotional well-being. Establish ground rules such as respect, active listening, and confidentiality of participants and experiences, and discourage interruptions. Designate a moderator and a note-taker who will facilitate the discussion, ask follow-up questions, guide the conversation, and ensure that every voice that wants to be heard is heard. With the consent of participants, record the discussion. Consider following up with the focus group to summarize the discussion and findings and to let participants know how their data will be used. This follow-up can also serve as an opportunity to gather feedback on the process and to express appreciation for their participation.

Quantitative assessment methods involve surveys, standardized tests, and observations using structured protocols. These approaches facilitate the collection of numerical data, measurements, and frequencies. Some ways to embed quantitative assessment in the first-year experience are to track participation in extracurricular activities, on-campus events, and utilization of campus services, and to compare this with retention data. Acquire feedback from campus services such as advising and tutoring, which interact frequently with first-year students, to better understand impact and access. Distribute surveys that examine students’ sense of belonging, social connections and peer support networks, stress, depression, and anxiety levels, substance use, coping strategies, and satisfaction. Dalhousie University adopted a similar strategy by launching the Student Transition Survey in 2023 and 2024[1]. Given to all entering students at the end of their first year, it focuses on better understanding students’ academic efficacy and sense of belonging and utilizes the Mental Health Continuum-Short Form to measure psychological and social well-being (Lamers et al., 2011). We disaggregate the results, such as by identity, Faculty, and living situation (on or off campus), to better understand how different students are experiencing their first year. Using some of this data, we were able to tailor our first-year communications based on the question “What is one thing you wish you had known when you started Dalhousie?” We also found that feeling depressed, stressed, or worried was the number one challenge for all students, which allowed us to focus programming during the first year. In addition, academic workload was the number two challenge for international students and students who lived on campus, while having enough money was the number two challenge for students who identified as 2SLGBTQIA+. Difficulty making new friends came up as a challenge for those who self-identified as racialized, although this was not in the top three for any other group. By disaggregating the data, we can address the differing needs of students without assuming a “one size fits all” model.
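
The sketch below illustrates one way survey responses of this kind might be disaggregated. It assumes a hypothetical table in which each respondent names their biggest first-year challenge and a self-identified group; the column names and values are illustrative only and are not the Student Transition Survey’s actual variables.

```python
import pandas as pd

# Hypothetical survey extract: one row per respondent, with a self-identified group
# and the challenge they named as most significant. Values are invented and the
# column names are not the Student Transition Survey's actual variables.
responses = pd.DataFrame({
    "group": ["International", "International", "On-campus", "On-campus",
              "2SLGBTQIA+", "2SLGBTQIA+", "Racialized", "Racialized"],
    "challenge": ["Feeling stressed or worried", "Academic workload",
                  "Academic workload", "Feeling stressed or worried",
                  "Having enough money", "Feeling stressed or worried",
                  "Making new friends", "Feeling stressed or worried"],
})

# Cross-tabulate challenges by group, then identify the most frequent challenge per group.
counts = pd.crosstab(responses["group"], responses["challenge"])
most_common = counts.idxmax(axis=1)  # with real data, examine the top three, not just one

print(counts)
print(most_common)  # pair these counts with follow-up conversations before acting on them
```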

Summative and formative assessments offer distinct viewpoints on program efficacy. Formative assessment focuses on ongoing awareness, understanding, and support of learning (Chmolova, 2016), providing feedback for timely adjustments, while summative assessment provides an overarching perspective on what has been achieved (Ismail et al., 2022). Although qualitative and quantitative assessment methods differ in their approaches, they are often combined to understand complex phenomena comprehensively (Fullstory, 2021). This mixed-methods approach allows us to triangulate findings, validate results, and gain deeper insights.

Analyze and Interpret Findings

When analyzing assessment data, several steps will help to ensure meaningful insights. Be sure to contextualize the findings by considering the broader educational context, and triangulate your data whenever possible with multiple sources and methods to cross-validate results. Reflect on how the results align with research questions and educational goals, and be mindful of external factors that might influence the outcomes. Actively collaborate with partners during interpretation and reporting, ensuring diverse perspectives are represented throughout the process. Employ various data analysis techniques rather than relying solely on white students’ experiences as a benchmark. Compare your findings to existing literature or benchmarks, determine whether your results are consistent or divergent, and explore the implications of these comparisons. Identify patterns by looking for recurring trends, outliers, or consistent themes; these patterns reveal valuable information about student learning, program effectiveness, or organizational processes. Address limitations by acknowledging any constraints in your data or methodology. When drawing comparisons, try to contextualize the results within relevant frameworks and engage in discussions around small sample sizes (“n”) to understand their implications. Finally, propose actionable recommendations based on your analysis, and consider how programming, services, and practices can be improved based on the assessment results. Interpretation is an ongoing process, and engaging partners ensures that the insights lead to positive change.

As mentioned above, when we disaggregated the Transition Survey data, we saw that the top three challenges first-year students experienced differed across student groups. It would be easy for student affairs professionals to interpret what those differences mean and create programming to address them. However, we also bring our biases, assumptions, and worldviews into these interpretations. As such, it is essential to go to these groups and ask them what they need. In the example above, that means asking racialized students what is challenging about making new friends and involving them in the analysis, rather than making that determination for them.

Implement Change

Closing the loop and using the data for change is often one of the most overlooked aspects of the assessment cycle. Not only is it essential to collect the data, but it is also necessary to use it for meaningful change. Collecting data only for it to sit unused is unethical and leads to mistrust among students and staff. It is important to share results to show students that we are using what we have heard to implement changes and to provide interventions or improvements informed by assessment results. Monitor the effectiveness of interventions and adjustments over time and make further refinements as needed. It is essential to foster a culture of assessment that values ongoing reflection, collaboration, and improvement. Continuously evaluate and refine the assessment cycle to ensure its effectiveness and relevance in supporting student learning and institutional goals (Goff et al., 2015).

Conclusion

This chapter has discussed the importance of assessment in student affairs for enhancing student growth, development, and success. Although higher education faces a challenging context, there has never been a greater need for evidence-based decision-making through ongoing assessment cycles to improve first-year experience programming. We must go beyond satisfaction surveys, focusing on student achievement, learning, and retention while centring students’ voices, identities, and needs in our work.

References

Academica Group. (2023). CACUSS census and benchmarking study: Summary report [Unpublished report]. Canadian Association of College and University Student Services.

Adams, T., & Doyle, H. (2020, January 29–31).  Socially just assessment [Conference session slides]. NACADA: The Global Community for Academic Advising Assessment Institute. Retrieved from https://nacada.ksu.edu/Portals/0/Events/AssessmentInst/2020/documents/C4p-SociallyJust-HDTA_1.pdf?ver=2019-11-25-094512-260

Alharbi, E. S., & Smith, A. P. (2018). Review of the literature on stress and well-being of international students in English-speaking countries. International Education Studies, 11(6), 22. https://doi.org/10.5539/ies.v11n6p22

American College Health Association. (2022). American College Health Association-National College Health Assessment III: Canadian reference group data report, Spring 2022. American College Health Association.

Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives. Allyn & Bacon.

Baghoori, D., Roduta Roberts, M., & Chen, S.-P. (2022). Mental health, coping strategies, and social support among international students at a Canadian university. Journal of American College Health, 1–12. https://doi.org/10.1080/07448481.2022.2114803

Barbayannis, G., Bandari, M., Zheng, X., Baquerizo, H., Pecor, K. W., & Ming, X. (2022). Academic stress and mental well-being in college students: Correlations, affected groups, and COVID-19. Frontiers in Psychology, 13, 886344. https://doi.org/10.3389/fpsyg.2022.886344

Beck, H., & Davidson, W. (2015). Improving the Retention of First-Year College Students: A Temporal Model of Assessment and Intervention. Journal of The First-Year Experience & Students in Transition, 27(2), 83-99

Blaich, C., & Wise, K. (2011, January).  From gathering to using assessment results: Lessons from the Wabash national study (Occasional Paper No. 8). National Institute for Learning Outcomes Assessment (NILOA). Retrieved from https://www.bu.edu/provost/files/2015/09/From-Gathering-to-Using-Assessment-Results_Lessons-from-the-Wabash-Study-C.-Blaich-K.-Wise1.pdf

Centre for Teaching Excellence, University of Waterloo. (n.d.). Bloom’s Taxonomy learning activities and assessments. Retrieved from the University of Waterloo Centre for Teaching Excellence website.

Calgary Board of Education. (2022, March). Indigenous education holistic lifelong learning framework. Retrieved from: https://cbe.ab.ca/about-us/policies-and-regulations/Documents/Indigenous-Education-Holistic-Lifelong-Learning-Framework.pdf

Canadian University Survey Consortium. (2022). 2022 CUSC survey: First-year students.

Chmolova, K. (2016, January 11). Qualitative vs. quantitative methods of verification and evaluation. Class Central: The Report. Retrieved from: https://www.classcentral.com/report/qualitative-vs-quantitative-methods-verification-evaluation/

Doyle, H. (2023). Utilizing appreciative assessment in student affairs. New Directions for Student Services, 2023, 113–121. https://doi.org/10.1002/ss.20500

Fink, L. D. (2013). Creating significant learning experiences: An integrated approach to designing college courses. Jossey-Bass.

Fullstory (2021). Qualitative vs. quantitative data in research: what’s the difference? Retrieved from: https://www.fullstory.com/blog/qualitative-vs-quantitative-data/

Gallagher, K. M., Jones, T. R., Landrosh, N. V., Abraham, S. P., & Gillum, D. R. (2019). College students’ perceptions of stress and coping mechanisms. Journal of Education and Development, 3(2), 25. https://doi.org/10.20849/jed.v3i2.600

Goff, L., Potter, M. K., Pierre, E., Carey, T., Gullage, A., Kustra, E., Lee, R., Lopes, V., Marshall, L., Martin, L., Raffoul, J., Siddiqui, A., & Gastel, G. V. (2015). Learning outcomes assessment: A practitioner’s handbook. Higher Education Quality Council of Ontario. Retrieved from: https://heqco.ca/pub/learning-outcomes-assessment-a-practitioner-s-handbook/

Hassel, S., & Ridout, N. (2018). An investigation of first-year students’ and lecturers’ expectations of university education. Frontiers in Psychology, 8, 2218. https://doi.org/10.3389/fpsyg.2017.02218

Henderson, J. (2024). Nova Scotia to cap university tuition increases at 2% with new funding agreement. Halifax Examiner. Retrieved from: https://www.halifaxexaminer.ca/government/province-house/nova-scotia-to-cap-university-tuition-increases-at-2-with-new-funding-agreement/

Houshmand, S., & Spanierman, L. (2021). Mitigating racial microaggressions on campus: Documenting targets’ responses. New Ideas in Psychology, 63.

Ismail, S. M., Rahul, D. R., Patra, I., et al. (2022). Formative vs. summative assessment: Impacts on academic motivation, attitude toward learning, test anxiety, and self-regulation skill. Language Testing in Asia, 12, 40. https://doi.org/10.1186/s40468-022-00191-4

Keeling, R. P. (Ed.). (2004).  Learning reconsidered: A campus-wide focus on student experience. American College Personnel Association and National Association of Student Personnel Administrators. Retrieved from https://www.naspa.org/images/uploads/main/Learning_Reconsidered_Report.pdf

Kwan, M. Y. W., Brown, D., MacKillop, J., Beaudette, S., Van Koughnett, S., & Munn, C. (2021). Evaluating the impact of Archway: a personalized program for 1st-year student success and mental health and wellbeing. BMC Public Health, 21(1), 59–59. https://doi.org/10.1186/s12889-020-10057-0

Lamers, S. M. A., Westerhof, G. J., Bohlmeijer, E. T., ten Klooster, P. M., & Keyes, C. L. M. (2011). Evaluating the psychometric properties of the mental health Continuum-Short Form (MHC-SF). Journal of Clinical Psychology, 67(1), 99-110. doi: https://doi.org/10.1002/jclp.20741

Learner-centered initiatives. (2016). Understanding bias in assessment design. Retrieved from: https://www.michiganassessmentconsortium.org/wp-content/uploads/UnderstandingBiasInAssessmentDesign.pdf

Lizzio, A. (2006). Designing an Orientation and Transition Strategy for Commencing Students. A Conceptual Summary of Research and Practice. First-Year Experience Project. Griffith University, Brisbane.

Lizzio, A., & Wilson, K. (2013). First-year students’ appraisal of assessment tasks: Implications for efficacy, engagement and performance. Assessment & Evaluation in Higher Education. https://doi.org/10.1080/02602938.2011.637156

LaFever, M. (2016). Switching from Bloom to the Medicine Wheel: Creating learning outcomes that support Indigenous ways of knowing in post-secondary education. Intercultural Education, 27(5), 409–424. https://doi.org/10.1080/14675986.2016.1240496

Montenegro, E., & Jankowski, N. A. (2017, January).  Equity and assessment: Moving towards culturally responsive assessment (Occasional Paper No. 29). National Institute for Learning Outcomes Assessment (NILOA). Retrieved from https://files.eric.ed.gov/fulltext/ED574461.pdf

Pascarella, E. T., Smart, J. C., & Ethington, C. A. (1986). Long-term persistence of two-year college students. Research in Higher Education, 24(1), 47–71. https://doi.org/10.1007/BF00973742

Pieterse, A., Carter, R., Evans, S., & Walter, R. (2010). An exploratory examination of the associations among racial and ethnic discrimination, racial climate, and trauma-related symptoms in a college student population. Journal of Counseling Psychology, 57(3), 255-263.

Rana, U. (2024). How international student cap could affect services at universities, colleges. Global News. Retrieved from: https://globalnews.ca/news/10250089/international-student-cap-affect-services-universities-colleges/

Robbins, R., Zargas, K. M., & Esquivel, S. (2022). Breaking down assessment of academic advising: Gathering evidence, reporting and planning, change and sustainability. NACADA: The Global Community for Academic Advising.

Scott, M. (2018).  Assessment in practice: A mixed model study of the representations of assessment, evaluation, and research competencies in the position descriptions of student affairs and services professionals in the Province of Ontario (Doctoral dissertation, University of Toronto). TSpace Repository. Retrieved from https://tspace.library.utoronto.ca/bitstream/1807/92146/1/Scott_Melinda_A_201811_PhD_thesis.pdf

Slykerman, R. F., & Mitchell, E. A. (2021). Stress, anxiety, and psychological wellbeing in first-year university students: Changes over time. New Zealand Journal of Psychology, 50(3), 39–45.

Steele, L. S., Daley, A., Curling, D., Gibson, M. F., Green, D. C., Williams, C. C., & Ross, L. E. (2017). LGBT identity, untreated depression, and unmet need for mental health services by sexual minority women and Trans-identified people. Journal of Women’s Health, 26(2), 116–127. https://doi.org/10.1089/jwh.2015.5677

Statistics Canada. (2017). Does education pay? A comparison of earnings by level of education in Canada and its provinces and territories (Catalogue no. 98-200-X).

Tenny, S., Brannan, J. M., & Brannan, G. D. (2022). Qualitative study. In StatPearls. StatPearls Publishing. https://www.ncbi.nlm.nih.gov/books/NBK470395/

Usher, A. (2022). The state of postsecondary education in Canada, 2022. Higher Education Strategy Associates.

Varela, W. K. (2024). Data analysis: International student cap exposes chronic underfunding of Ontario and BC post-secondary schools. New Canadian Media. Retrieved from: https://www.newcanadianmedia.ca/data-analysis-international-student-cap-exposes-chronic-underfunding-of-ontario-and-bc-post-secondary-schools/ 

[1] The Dalhousie University First Year Experiences Survey (2023 and 2024) is an institutional quality assurance activity. Data reported signify a sample of student experiences collected within the institution to inform its efforts to improve student transition and engagement experiences. Findings presented here are intended to inform other institutions how they might assess transition experiences of students within their own institutions and are not intended to be generalized outside of this specific quality assurance purpose.

 

License

The Evolving Landscape of Post-secondary Student Transitions in Canada: Striving for Best Practices Copyright © by Steven Smith; Tom Brophy; Adam Daniels; and Amy McEvoy. All Rights Reserved.