Learning as We Go

Investigating the intersection between evaluation, education and labour markets

Brief Research Prospectus

Investigating evaluative thinking skills among educators and young people in preparation for transitions to education, employment, and entrepreneurship.

I am in year 1 of a 4-year PhD program in Educational Studies at the University of Victoria. My dissertation focuses on building capacity for evaluation and fostering key life skills that help young people think critically with a view to achieving goals. This builds on my 15 years of experience as a Research and Evaluation Specialist in education programming, where using data and evidence is essential to achieving results and is driven by inquiry into the factors that influence progress. My work has spanned many roles, including:

  1. Advancing educator capacity to use and understand research, data and performance measurement at the Government of Ontario Ministry of Education;
  2. Leading a national and distinctions-based, mixed-methods evaluation of the Indigenous Post-Secondary Education Program for Indigenous Services Canada/Government of Canada; and
  3. Collaborating with UNICEF and global stakeholders from the private and public sectors to create and implement a global, multi-level measurement framework and reporting tool, managing data inputs from 52 countries, and translating findings through an interactive, multi-media dashboard.

Evaluation is most often applied in public sector contexts for three distinct purposes: accountability, capacity building, and learning (United Nations, n.d.). In my experience, I have become familiar with the numerous challenges that young people face in preparing for and navigating the steps of developing career and life trajectories. Understanding skills competencies, labour market demands, and career interests is essential, as is funding for advanced training or entrepreneurial endeavours. These challenges contribute to a youth employment rate of 54%, the lowest since winter 1998 excluding the pandemic years of 2020/21 (Statistics Canada, 2025). The transition to employment is further complicated by artificial intelligence (AI): more than half of the Canadian workforce is highly exposed to jobs that will be disrupted by AI (Li & Dobbs, 2025). In light of these employment challenges, many youth turn to self-employment; between 2001 and 2020, more 15-24 year olds started businesses than any other age group, but their success rate was the lowest (Liu & Zhang, 2025). It is pivotal that young people learn skills that insulate them from job and career disruptions and facilitate their decision making. Using self-determination theory, which assumes people seek psychological growth and therefore learning, mastery, and connection with others (Deci & Ryan, 2020), I will seek to understand how students are intrinsically and extrinsically motivated in light of whether their psychological needs of autonomy, competence, and relatedness are met.

The Future Skills Centre report (Li & Dobbs, 2025) recommends that education and workforce development programs foster AI-resilient skills, such as critical thinking, leadership, and problem solving. Fostering such skills aligns with Sustainable Development Goal 4 (UN Department of Economic and Social Affairs), the Canadian federal sustainable development strategy (Government of Canada), and SSHRC’s Future Challenge Areas, Working in the Digital Economy and Truth Under Fire in a Post-fact World. Alongside critical thinking, the evaluation community focuses on evaluative thinking, an extension of critical thinking that focuses on expected results, how they are achieved, and the evidence needed to inform decision making and improve results (Campbell-Patton et al., 2023). Although evaluative thinking appears in the evaluation capacity building literature (Archibald et al., 2015, 2020; Tolley, 2025), it is far less prevalent in the education literature (Paproth et al., 2023).

The purpose of my research is to identify and measure the factors that influence the teaching and learning of evaluative thinking in Canadian high schools.  


Study #1 will explore two research questions: (1) How do educators understand evaluative thinking and integrate it into their instructional design? and (2) How can evaluative thinking be measured to reflect the experiences of Canadian youth? I will create a modified competency framework and measurement tool to assess knowledge and application of evaluative thinking skills among youth. It will then be shared with the study participants: approximately 100 educators (depending on the power analysis for required sample size) who regularly teach business skills to students who will soon transition to employment or entrepreneurship. A survey will collect feedback on the proposed tool along with data on educators' perspectives on evaluative thinking, their views of student abilities, and their opinions and attitudes on approaches to fostering evaluative thinking among high school students.
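As a rough illustration of how the required sample size might be estimated, the sketch below computes the per-group n for a two-group comparison of means using a normal approximation, assuming a conventional two-sided α = 0.05 and 80% power; the actual power analysis will depend on the final design and statistical test, and exact t-based values run slightly higher.

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_group(effect_size: float,
                          alpha: float = 0.05,
                          power: float = 0.80) -> int:
    """Per-group n for a two-sample comparison of means
    (normal approximation; effect_size is Cohen's d)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided critical value
    z_beta = NormalDist().inv_cdf(power)           # power quantile
    n = 2 * ((z_alpha + z_beta) / effect_size) ** 2
    return ceil(n)

# e.g., detecting a medium effect (d = 0.5)
print(sample_size_per_group(0.5))  # → 63 per group
```

Larger assumed effects need fewer participants, which is why the target of roughly 100 educators hinges on the effect size the study is powered to detect.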

Study #2 is a mixed-methods study guided by two research questions: (3) Do high school students think evaluatively in preparing for their future? and, using self-determination theory and recognition of competencies as a supportive condition, (4) What supports would help increase student evaluative thinking skills and confidence in their choices as they prepare for future transitions to employment and entrepreneurship? Participants in this experimental study will be recruited by the educators who participated in Study #1 and will represent a cross-section of perspectives based on lived experience (e.g., rural, Indigenous, people with disabilities). As part of their business class, students will be divided into two groups, all of whom will participate in an online focus group guided by questions informed by the measurement framework and tool designed in Study #1. Participants will discuss how they prepare for their future, their sentiments on readiness, their reflections on skills, employment, and entrepreneurship opportunities, and non-academic aspects of transitions, such as self-sufficiency, autonomy, and the tools, resources, and people that they believe will contribute to positive experiences. Group 1 will serve as a control group, while Group 2 will be introduced to the Toulmin Argument Model (Mirzababaei & Pammer-Schindler, 2021) through an AI tool, the Online Toulminator (Caulfield, 2025), which assesses arguments using the Toulmin method. It reviews evidence quality, assumptions, and rebuttals and advises on what may strengthen or weaken arguments, which in this case will be questions around student evaluative thinking processes in planning for their future. The tool is expected to enhance evaluative thinking because it offers a quick and easy way to apply evaluative prompts and receive real-time feedback across a variety of formats, including social media.
Three months after the focus groups, participants will be invited back to one of three additional focus groups to test evaluative thinking skills and the effects of the Toulminator. The focus groups will consist of students from control Group 1, who did not receive information about the Toulminator, and students from Group 2, split by whether they self-report having a) used it a lot or b) used it a little. These discussions will reflect on evaluative thinking processes, decision making, and sentiments about the future, as well as feedback on the Toulminator.
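To make the Toulmin method concrete, the hypothetical sketch below represents an argument's core components and generates the kind of evaluative prompts a tool might return for missing pieces; this is only an illustration of the model's structure, not the Online Toulminator's actual implementation, which is not described in the source.

```python
from dataclasses import dataclass, field

@dataclass
class ToulminArgument:
    """Core components of the Toulmin Argument Model."""
    claim: str                                   # conclusion being argued for
    grounds: list = field(default_factory=list)  # evidence supporting the claim
    warrant: str = ""                            # assumption linking grounds to claim
    qualifier: str = ""                          # strength of the claim (e.g., "probably")
    rebuttal: str = ""                           # conditions under which the claim fails

def feedback(arg: ToulminArgument) -> list:
    """Return evaluative prompts for each missing component."""
    prompts = []
    if not arg.grounds:
        prompts.append("What evidence supports this claim?")
    if not arg.warrant:
        prompts.append("What assumption links your evidence to your claim?")
    if not arg.rebuttal:
        prompts.append("Under what conditions might your claim not hold?")
    return prompts

# A hypothetical student plan with evidence but no stated warrant or rebuttal
plan = ToulminArgument(
    claim="I should start a business after graduation",
    grounds=["I ran a successful summer landscaping service"],
)
print(feedback(plan))
```

Prompts like these are the "evaluative prompts with real-time feedback" the study expects the tool to provide.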

Significance: My research contributes to the literature on evaluation capacity building and career transitions. Findings will be used as evidence on the degree to which evaluative thinking skills (and hence confidence and decision making) are prevalent among educators and students in high schools. Findings will be disseminated yearly in a variety of journals: Canadian Journal of Program Evaluation, American Journal of Evaluation, Canadian Journal of Education, and the Journal of Career Development.

Feasibility: My supervisor, Dr. Valerie Irvine, is an internationally established SSHRC-funded scholar in educational psychology and technology and the Director of the Technology Integration and Evaluation (TIE) Research Lab, funded in part by the Canada Foundation for Innovation, where I am a graduate research associate. The PhD program at UVic provides training central to my research (e.g., education, technology, multi-method research designs, and quantitative analysis) and builds on my extensive experience in evaluation, qualitative research analysis, and measurement. I intend to complete my degree in Spring 2029, and my long-term goal is to enter academia to advance my research program on understanding and advancing Canada’s capacity for evaluative thinking to support transitions to education, employment, and entrepreneurship.

Evaluation Journal Review

New Directions for Evaluation is the official journal of the American Evaluation Association. The journal is administered by Wiley Online Library, which oversees more than 2,000 journals. Wiley views the future of research as open access, open data, and open practices. In practice, this means the author pays a publication charge to make the article freely available; under the hybrid open access option, the author, institution, or funder pays the article publication charge. Publishing in New Directions for Evaluation costs $3,300 USD plus tax.

Wiley offers considerable support to authors submitting articles for publication: manuscript guidelines, editing services, data sharing and citation policies, and advice on search engine optimization, registering for an ORCID iD, licensing requirements, and more. The lack of specific author guidelines for New Directions for Evaluation implies a free-format submission: authors submit manuscripts in the format of their choice, and Wiley updates the formatting if the article is accepted for publication. Information is provided on the types and lengths of the peer review process, with a note that specific details vary by publication and are available upon request from the journal.

I was impressed by the articles in New Directions for Evaluation because of the vast array of research relevant to my own. Its search function allows the reader to easily find related articles. As its title suggests, it focuses on evaluation topics, with an emphasis on innovations, emerging methods, and new applications of evaluation. Many of the articles advance concepts, points of view, and theories that evaluators struggle with in their daily work. I encountered fewer published articles discussing specific case studies or learning in particular contexts unless the study has a unique aspect, such as a new method or application.

The Managing Editor and Co-editors are based within a US university setting; however, there is a good cross-section of representatives on the Board of Directors, with several from private consulting, international organizations, or international academic institutions. New Directions for Evaluation published 49 articles in 2024, predominantly from the US (73%), followed equally (4% each) by Australia, Finland, and New Zealand. Although most readers originated from the US (40%), the UK (8%), Australia (8%), and Canada (7%) closely followed.

2.

The American Journal of Evaluation, conversely, is administered by Sage Journals. It focuses on theory, methods, and the practice of evaluation in society. It is organized into seven sections: 1) Book Reviews, 2) Economic Evaluation, 3) Ethics, Values and Culture, 4) Experimental Methodology, 5) International Developments in Evaluation, 6) Method Notes, and 7) Teaching and Learning of Evaluation. Topics are often broad perspectives on evaluation issues related to education, public administration, behavioral sciences, human services, health sciences, sociology, and criminology, among others. The Editor, Associate Editors, and Section Editors are predominantly based in American academic institutions, along with some non-governmental institutions, international universities, and private consulting firms, which is also consistent with the backgrounds of the Board of Directors.

Unlike New Directions for Evaluation, the American Journal of Evaluation is more forthcoming about how long potential authors can expect to wait for a decision on accepting articles for publication, and it shares that it has a 20.5% acceptance rate. Despite these metrics, its support for prospective scholars is minimal, offering only some formatting guidelines and high-level descriptions of the focus of each section, along with editor contact details for prospective contributors to reach out directly.

Sage Journals does not offer as much explanation or clear support for open access as the Wiley Online Library. Upon acceptance for publication, scholars can choose whether the article will be open access. Hybrid options allow authors to pay the article processing charge themselves or arrange for a funder to do so. The cost to publish under the hybrid option (Sage Choice) for 2025 is $3,650 USD.

The articles in the American Journal of Evaluation are of good quality and very broad in their application of evaluation across settings. This offers the advantage of exploring evaluation from numerous perspectives and disciplines, but also the disadvantage that its breadth sacrifices depth, and it is not immediately clear how the journal advances evaluation as a discipline or explores emerging topics relevant and needed by the evaluation community. I appreciated its continued publication of older articles, which provide good foundational reading, as well as its podcast series and its book sections and reviews.

3.

The Canadian Journal of Program Evaluation, by contrast, is a smaller journal with fewer resources and is administered by the University of Toronto Press. It provides a good history of its editorial journey, which consists predominantly of editors from Canadian academia. Similar to its American counterparts, its Board of Directors is drawn mostly from academia, with some independent members and some based outside Canada; however, its two francophone editors are unique and speak to the bilingual character of the journal.

The Canadian Journal of Program Evaluation is an open-access journal, but it does not explain to readers what this means in practice or offer its views on open scholarship. Instead, through the University of Toronto Press, scholars can publish their article as ‘gold open access’ for a $3,500 CAD article processing charge.

Interestingly, the Canadian Journal of Program Evaluation has some notable features: it distinguishes between full-length articles and practice notes, which share practical knowledge, experiences, and lessons learned; it contains a Roots and Relations section devoted to evaluation topics exploring Indigenous contexts, heritage, and culture; and it provides a dedicated page for peer reviewers.

For submission, prospective authors are directed to language editing services and then to the online peer review system, which manages the submission and peer review process. Specific submission details are not offered without signing up for an account on the system.

On average, two issues are published per year, along with occasional thematic issues. The articles published by the Canadian Journal of Program Evaluation are, unsurprisingly, more Canadian in context, though its contributors are clearly international. It reads as though written directly to the evaluation community, passing on its knowledge to the next generation of evaluators: issues open with editorial remarks, occasionally include tributes to evaluation practitioners, and offer French-language and Indigenous-themed articles in almost every issue. I find the content would be interesting and valuable to any evaluator, but the website is cumbersome: it is not intuitive and does not allow the reader to easily identify articles with similar topics, authors, or themes.

My Research Interests

Unpacking the Topic, Problem, Purpose and Research Questions

Evaluation is a critical process at the start of any large project to ensure that all relevant parties agree on a) the goals, b) how they will achieve them, and, importantly, c) how they will monitor progress. It is the 5,000-foot view of a project that sees the many moving parts and how they influence each other.

  • My research topic focuses on evaluation capacity building.

Evaluation capacity building is multi-dimensional and has captivated researchers for decades. Evaluative thinking is a cornerstone of the profession itself (Patton, 2018; Preskill & Boyle, 2008), and the emphasis leaders place on evidence-based decision making might suggest that evaluative thinking is a widely practiced skill. Unfortunately, this is not the case: evaluators note that many of their evaluation findings point to shortcomings at the pivotal design phase of a project, when evaluative thinking skills are meant to be applied.

  • My research problem is that evaluative thinking skills are not well developed among many project leaders and decision makers, and are often not applied or not applied well.

Recognizing a gap in practice and a lack of scholarly exploration of evaluative thinking, several studies could be pursued. Because evaluative thinking is an important life skill that can facilitate decision making during personal and professional transitions, it bears asking whether and to what degree evaluative thinking is prevalent in the education system.

  • The purpose of my study is to identify and measure the factors that influence the teaching and learning of evaluative thinking in Canadian high schools.

My research focuses on the following questions:

  1. How do educators understand evaluative thinking and integrate it into their instructional design?
  2. How can evaluative thinking be measured to reflect the experiences of Canadian youth?
  3. Do high school students think evaluatively in their decision-making process in preparation for their future?
  4. What supports would help increase student evaluative thinking skills and confidence in student choices as they prepare for future transitions – to post-secondary education, employment and entrepreneurship?

Starting a New Chapter

Hello! I’m very happy to be at UVic starting a PhD in Educational Studies. My journey here has been long and winding, and although I’m not formally trained in education, somehow I’ve found myself developing a career around learning! Perhaps it’s a character trait I couldn’t let go of as a child, or the gratification and curiosity of discovery have too much appeal? Whatever the reason, my career has grown over time not in pedagogy, but in Evaluation.

Most of my career has been in the public sector (Canadian provincial and federal governments, International Organizations) where policies, projects, programs and initiatives are advanced to solve multi-disciplinary problems and the solutions are often societal-based and intangible. I’ve worked in a lot of interesting fields: Education, Justice, Employment, Business Development and Entrepreneurship, Health, Inclusion, International Development, Emergency Response and Community development. Along the way, I’ve met and been mentored by many wonderful people around the world and I’ve had incredible opportunities that I humbly can say have contributed to my skills as an Evaluator.

I get mixed reactions whenever I share what I do for a living; often it is uncertainty about what evaluation is or what I do, or fear that my work will result in publicly funded programs losing their funding. Neither is a good look, but I take it with a grain of salt and dutifully explain in plain language: “evaluation helps you plan and assess whether goals are being met, and its purpose is to address two things: Accountability and Learning.”

Accountability, because we want to know that public funds are being used to benefit taxpayers. What is sometimes forgotten when policies and programs are created is that no one really has the answers. We give governments the benefit of the doubt that what they do will result in the change we desire. But truly, there is no magic recipe or map to follow to solve some of the really difficult challenges we are grappling with: poverty, unemployment, mental health, climate change. It requires a lot of people working in tandem and coordinating action. The best we can do is use resources effectively, advance actions that build on what we know now, and apply evaluation constructively to help us learn what works, what doesn’t, and how to improve.

To that end, I’ve noticed patterns in my work: common challenges that I encounter and that can very often be distilled down to a need to think evaluatively. It’s this key attribute, and the technology that can improve its teaching, collaboration, and scale, that I will explore in my doctoral studies. I’m excited by this prospect and by the possibility of contributing to the wider discussion on evaluation capacity building, organizational learning and development, and applying these skills in settings outside of evaluation. Through this work, I hope to have an impact on others’ ability to define and reach their goals, particularly in advancing towards career goals.

Welcome and Introduction

Before proceeding with this first blog post, we expect you to consider your privacy preferences carefully, including the following options:

  1. Do you want to be online vs. offline?
  2. Do you want to use your name (or part thereof) vs. a pseudonym (e.g., West Coast Teacher)?
  3. Do you want to have your blog public vs. private? (Note, you can set individual blog posts private or password protected or have an entire blog set to private)
  4. Have you considered whether you are posting within or outside of Canada? This blog on opened.ca is hosted within Canada. That said, any public blog post can have its content aggregated/curated onto social networks outside of Canada.

First tasks you might explore with your new blog:

  • Go into its admin panel found by adding /wp-admin at the end of your blog’s URL
  • Add new category or tags to organize your blog posts – found under “Posts” (but do not remove the pre-existing “EdTech” category or sub-categories, Free Inquiry and EdTech Inquiry). We have also pre-loaded the Teacher Education competencies as categories should you wish to use them to document your learning. If you would like to add more course categories, please do so (e.g., add EDCI 306A with no space for Music Ed, etc.)
  • See if your blog posts are appearing on the course website (you must have the course categories assigned to a post first and have provided your instructor with your blog URL)
  • Add pages
  • Embed images or set featured images and embed video in blog posts and pages (can be your own media or that found on the internet, but consider free or creative commons licensed works)
  • Under Appearance,
    • Select your preferred website theme and customize to your preferences (New title, etc.)
    • Customize menus & navigation
    • Use widgets to customize blog content and features
  • Delete this starter post (or switch it to draft status if you want to keep for reference)

Do consider creating categories for each course that you take should you wish to document your learning (or learning from professional activities outside of formal courses). Keep in mind, however, that you may wish to use the course topic as the category rather than the course number, as those outside of your program would not be familiar with the number (e.g., we use “EdTech” instead of “edci336”).

Lastly, as always, be aware of the FIPPA as it relates to privacy and share only those names/images that you have consent to use or are otherwise public figures. When in doubt, ask us.

Please also review the resources from our course website for getting started with blogging.

© 2025 Learning as We Go
