Prairie Women's Health Centre of Excellence

  Evaluating Programs For Women: A Gender-specific Framework (2000 Revised Edition)


Full Report (.pdf) 369KB

More Information

The Executive Summary of this report is also available online in French.

Additional copies of this publication are available.
Please mail a cheque or money order for $10.00 in
Canadian funds to:
Prairie Women's Health Centre of Excellence
56 The Promenade
Winnipeg, MB
R3B 3H9

The research and publication of this study were funded by the Prairie Women's Health Centre of Excellence (PWHCE). The PWHCE is financially supported by the Centre of Excellence for Women's Health Program, Bureau of Women's Health and Gender Analysis, Health Canada. The views expressed herein do not necessarily represent the views of the PWHCE or the official policy of Health Canada.

J. McLaren

Notes on the Revised Edition

The first edition of this report was published in 1999. (1) The gender-specific evaluation framework was tested in two pilot evaluations conducted by the author with:
  • the Birth Control and Unplanned Pregnancy Counselling Program (BCUPC) at the Women's Health Clinic in Winnipeg, Manitoba. The program evaluation report was completed in June 1999. (2)
  • the Grandmothers' and Girls' Violence Prevention Education Program (GGVPE) guided by Intercultural Grandmothers Uniting in Fort Qu'Appelle, Saskatchewan. (3)

Lessons Learned

Usefulness of the Model
Use of the Framework resulted in a strong, comprehensive evaluation process and outcomes that helped the programs identify their strengths and support base and their areas needing improvement, and that yielded recommendations for improving the programs. In both cases, the evaluation proved useful for program improvement. The BCUPC program appreciated the depth and usefulness of the data developed. The GGVPE program viewed the evaluation as thorough and useful, and found the recommendations particularly helpful in making changes to the program.

Awareness of and Support for Women's Participation. The evaluations showed the need for awareness of, and arrangements to support, women's participation. Both evaluations demonstrated that failing to attend to women's unique needs can prevent them from participating. For example, the costs of the following items should be included in the evaluation budget, in whole or in part:

  • honoraria for Evaluation Advisory Committee members and focus group participants to assist with costs such as transportation, meals and child care.
  • community support for transportation (for the second evaluation, women in Fort Qu'Appelle called each other to ensure that grandmothers had rides to meetings--there was no bus service available).
  • lodging and meals when women need to stay overnight.

The Process Is Empowering. The stakeholder members of the two Evaluation Advisory Committees (EACs) commented on their sense of increased knowledge about the program, and about evaluation as a process in which they could experience true participation and impact. The pilot evaluations demonstrated that the use of a stakeholder EAC gives real strength and grounding to the evaluation process. From the first questions set out by the Committee through to their input on final recommendations, the EACs were sources of varied perceptions which were then discussed and refined. The EACs were helpful in sorting out the best ways to collect data and in verifying the facts and perceptions that emerged. They supported and helped focus the evaluation process. The evaluations would not have been as rich and comprehensive if the EACs had not participated. The responsive constructionist process empowers the stakeholder groups that are involved. This is a distinct difference from traditional evaluation, in which power rests with the program funder or sponsor to whom the evaluator reports.

In the GGVPE evaluation, a plain language version of the process proved valuable in presenting and discussing concepts. The collaborating evaluator developed this version for use with the EAC, and it helped participants understand both the concepts and the results at every stage. The oldest of the grandmothers involved in the GGVPE evaluation, a wise Elder, commented that she "could understand every word" of the training and discussion, and of the concepts used in explaining and working through the process with the EAC. The plain language version supported the empowerment process and was an important feature of the collaboration between the evaluators and the EAC.

Celebration Is Important. Women's need to feel connected and appreciated for their contributions to the evaluation was evident in the second evaluation. The EAC marked completion of the work with a celebration: a social afternoon with all the members, the evaluators, the program staff, funders, members of the school staff and community agencies, and the parents of student participants. Sharing a brief ceremony and food provided an opportunity for thanks and congratulations on a job well done, and brought closure to the process. The circle of sharing and the grandmothers' prayers made the occasion a special one. Such a closing celebration is a valuable way to express appreciation to all of those involved.

Indicators Can Be Used for Other Evaluations. The evaluators noted that the safety and comfort indicators used in the study were measurable, differed by gender, and verified research findings reported in the literature. The rating scale, ranging from "0 = did not have any good qualities at all" to "10 = perfect in every way," proved culture-free: focus group and interview participants could use it with ease, and it rendered useful data. It also gave the evaluators insight into participants' thinking and consideration of issues, and provided respondents with an appealing way to think and communicate about the project and the evaluation process.
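As an illustration of how such 0-10 rating-scale responses can be summarized for gender comparison, the sketch below computes mean scores by gender. The records, field names and values are invented for demonstration; the report does not publish raw scores.

```python
from statistics import mean

# Hypothetical rating records: each is a participant's gender and their
# 0-10 score on a safety/comfort indicator. Invented for illustration.
ratings = [
    {"gender": "girl", "score": 7},
    {"gender": "girl", "score": 9},
    {"gender": "boy", "score": 5},
    {"gender": "boy", "score": 6},
]

def mean_by_gender(records):
    """Group 0-10 ratings by gender and return the mean score per group."""
    groups = {}
    for record in records:
        groups.setdefault(record["gender"], []).append(record["score"])
    return {gender: mean(scores) for gender, scores in groups.items()}

print(mean_by_gender(ratings))
```

Disaggregating even this simple summary by gender is what makes the indicator usable for the gender analysis the framework calls for; a pooled mean would mask the difference between the groups.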


Gender and Other Societal Elements Are Intertwined

While gender and societal elements such as socio-economic level and race can be viewed separately to some extent, the pilot evaluations demonstrated how intertwined these elements are and how difficult it is to deal with any one in isolation. Nevertheless, some gender findings were unmistakable, and gender remained a clear focus in both evaluations.

In the BCUPC evaluation, there was clear awareness of gender and of the impact of socio-economic level, race and access to resources. Gender issues did affect the evaluation, including difficulty in gathering client information related to women's feelings of stigma in accessing reproductive health services and, for some, a lack of power, resources and privacy in their own lives.

In the GGVPE evaluation, some participants were not sufficiently aware of how gender affects girls and classroom activities. For example, the project was originally established for girls, but staff seemed unaware that a group's gender composition plays a role in how girls are able to discuss and deal with issues such as violence, and that the experience of violence may differ for girls and boys. In addition, the high proportion of Aboriginal students, including students from First Nations schools who were bussed to a mixed-race school starting in junior high, meant that racial and cultural issues received concentrated attention while gender did not. Classes had a larger proportion of boys than girls, which reinforced the tendency to overlook girls' issues, since boys were harder to control and tended to receive more attention than girls did.

The Importance of Awareness of Gender Issues. With programs like the BCUPC, a high level of gender awareness had been developed and was evident. The evaluation raised awareness of gender, race, age, cultural, socio-economic and power issues. But the Framework also showed that not everyone is aware of how gender affects the dynamics and evolution of programs. In program settings where there is limited gender awareness, it may be necessary to provide gender training for evaluation committee members and others. Some specific examples drawn from project experience included:

The GGVPE program's original mandate was to develop a program for girls. A school mandate to involve both girls and boys changed the entire program focus to one which included both boys and girls. This diffused the original intent and focus of trying to help girls deal more effectively with family, school and community violence. The changed focus was not discussed with the organization funding the program or with the participant grandmothers. Program sponsors were unaware of the magnitude of impact of the changed direction on girls. Although the program achieved good outcomes, it did not provide the opportunity to focus on girls' needs regarding violence. This demonstrates what can happen when women-oriented programs are changed to serve other interests and lose their focus on women.

Lack of awareness of the dynamics affecting girls at school occurred at two levels. Teachers commented that they were not aware of how girls are affected differently in a mixed group versus an all-girl group; they saw no reason to have a program just for girls, nor to separate the sexes for group work or focus groups. Girl students stated that there would be no difference in their behaviour and reactions in a mixed group as opposed to an all-girls group. The research literature, however, clearly indicates large differences in the ability to be open, to share feelings, and even to be heard or acknowledged in mixed groups. This was confirmed in the evaluation: disclosures were made by girls only in an all-girls group.

Evaluators Can Easily Overlook or Find Difficulty in Addressing Gender Issues. Though both evaluators were skilled in feminist approaches, they realized how easy it is to overlook gender aspects that, when examined, were significant (for example, not planning more thoroughly for an all-girl focus group; overlooking the need for specific gender analysis on student questionnaire responses which, when done, revealed significant differences that confirmed research findings on mixed group behaviours of girls). In retrospect, the evaluators noted how difficult it was to keep gender awareness functioning at all times and at all levels of the evaluation including planning, data collection and analysis. For example:

  • there was no identification by gender on the original questionnaires administered prior to the evaluation. When notes were taken in focus group sessions where the tape recorder stopped, the note-taker failed to record whether the speaker was a boy or a girl, so no gender analysis could be performed.
  • when a research assistant conducted the initial analysis of the student questionnaires developed by the evaluators, it was noticed that the data had not been analyzed by gender. When gender analysis was done, significant patterns were revealed.
  • when one male and one female student assisted in the evaluation process, the male took the lead in participating in the Evaluation Advisory Committee meeting. The female student deferred to the male to speak first when asked about issues, and sought his agreement at certain points. Her involvement may have been different had she taken part alone or with another female student.

Unexpected Difficulties in Data Collection. Data collection problems occurred in both pilot evaluations. For example:

  • In the BCUPC evaluation, obtaining information from clients proved difficult. Several methods were attempted to increase the number of responses, but none was sufficiently productive. When the evaluator checked with similar programs in other provinces, they reported similar experiences. The low response rate stems from many factors, such as data being unavailable for women clients, and calls for special approaches in future evaluations; data may have to be collected directly at the time client service is provided.
  • In the GGVPE evaluation, data collection problems took a different form. The intention to gather gender-specific data from students was almost thwarted by a misjudgment in planning. The plan was to hold a focus group with an entire class in one case, and gender-specific groups in another class. The evaluators had assumed that the gender composition would be similar in both cases. After running the first, mixed-class focus group, they discovered that the second class had very few girls, and they had to use a mixed group again. Notably, it was in the all-girls group during the original program that violence was disclosed that had not previously come to light.


Much was learned during the pilot evaluations about how to improve the Framework, particularly with respect to the need to explain elements and processes in more detail. Among the major revisions are:

Program Logic Model
It was not clear to EAC participants why the Logic Model was important. The Framework needed to articulate the purpose of the Logic Model. That is, it is a means of checking and sharing the vision of the program and how it attempts to bring about change. Sometimes, a program has not set out its exact focus. The Logic Model helps to articulate the focus and may indicate the need to refocus activities or direction, as it did in one evaluation. Discussion of the Program Logic Model in the EAC meetings resulted in a process of clarification and rethinking of the original intent of the program. The process was important for developing a clear vision. The Logic Model part of the Framework has been emphasized and expanded in this document.

Evaluation Recommendations
Developing the recommendations, which means deriving them from the data analysis and then discussing, reaching consensus on and rewording them with the EAC, is a step demanding significant time, energy and attention. In terms of significance, the recommendations form one of the most important parts of the evaluation process. It was clear from the pilot evaluations that the recommendations should be identified and treated as a separate step in the process. This change has been incorporated in the revised document.

Action Plan
Inclusion of the action plan step was not appropriate for the evaluation process. It is not within the evaluators' purview to be involved in action that is taken after the evaluation. The step describing the action plan has been removed from the Framework. It could be suggested as a follow-up step the program might take, or that the recommendations be assigned to program staff to develop an action plan to implement changes. However, this would be a separate process in which the evaluator and the EAC may or may not be involved.


Executive Summary

Social structures and processes affect health and the quality of life. A key social factor influencing health is gender. At all levels of society, awareness is expanding about the intimate links between gender and health. Gender-specific health programming is emerging as a significant focus across Canada and internationally, stemming from a growing awareness of the need for effective, gender-sensitive, woman-centred programs and a concomitant need for gender-based program evaluation approaches to examine these programs.

Program evaluation is recognized as an important part of operating programs well. If evaluation and other processes do not reflect gender differentiation, they perpetuate old models that overlook gender needs and differences, and fail to support the empowerment of women. Yet a search of the program evaluation literature reveals that little has been reported in the area of gender-specific, woman-centred evaluation models or processes. A shift to gender-specific evaluation affects how evaluation structures and processes are conceptualized, utilized, managed, analyzed and reported. In turn, the way evaluation is employed affects how woman-centred services are developed and delivered, and how effective they will be.

As policy-makers interested in women's health and women's programming review their progress in addressing key health determinants and attempt to identify what approaches are most effective, questions that have fundamental relevance to these issues emerge:

  1. What are the characteristics of effective gender-specific and woman-centred programs?
  2. What are the elements of effective gender-specific program evaluation frameworks?
  3. What indicators can be identified that could be applied in evaluating gender-specific programming?


This study was undertaken for the Prairie Women's Health Centre of Excellence (PWHCE) to develop a flexible program evaluation framework to address these questions while acknowledging the unique evaluation needs of every program and jurisdiction. The objectives of the project were to:

  • describe the characteristics of effective gender-specific and woman-centred programs;
  • research what models exist for evaluating gender-specific and woman-centred programming;
  • analyze relevant existing health-related evaluation frameworks; and
  • formulate recommendations for an effective gender-specific evaluation framework.


A program is defined as an organized system of services, or a related series of activities, designed to address specified health needs of clients. Some theoretical background, gender lenses, and models of programs that support woman-centred health and development are examined. The study sets out characteristics for programs in which there is an interdisciplinary approach and individual accountability for the program administration.

In a gender-specific, woman-centred program, four key phases are involved: gender-sensitive, woman-centred needs assessment; planning; implementation; and evaluation of the extent to which the program meets women's needs. The planning, implementation and evaluation phases of the program cycle are organized around the outcomes, processes, and structures of gender-specific health services. This enables consideration of what results are achieved, as well as the incorporation of those service strategies and resource approaches appropriate for achieving the desired results.

Program evaluation is a process that studies the extent to which desired outcomes were achieved, optimal resources were employed, and/or adequate structures were in place for undertaking the program processes. A gender-specific, woman-centred evaluation framework builds in gender- and woman-sensitive considerations at each step, and uses gender-based analysis as a key element.


PART 1: Characteristics of Effective Gender-specific and Woman-centred Programs

  • examines gender-specific determinants of health;
  • reviews models bearing on effective gender-sensitive and woman-centred programs; and
  • enumerates key elements of effective gender-specific and woman-centred programs as set out in the literature and derived from experience.

PART 2: The Gender-specific and Woman-centred Program Evaluation Framework

  • examines information available in the literature on approaches to program evaluation;
  • assesses existing program evaluation models, theoretical issues and challenges in developing a gender-sensitive framework; and
  • describes the gender-sensitive and woman-centred program evaluation framework--its principles, purpose, approach, and the types of programs to which it is applicable.

PART 3: The Steps of Conducting a Gender-specific and Woman-centred Program Evaluation

  • sets out the ten generic steps of conducting a program evaluation, outlining the ways in which gender-specific considerations must be brought into play to ensure a gender-sensitive and woman-centred program evaluation; and
  • describes and discusses the process of gender-sensitive analysis.

The study sets out the goals, purposes, approach and principles reflected in the framework. It suggests the use of woman-centred and equity-sensitive processes, and considerations focussing on involvement and empowerment in establishing the evaluation committee, gathering data, analyzing results and developing recommendations. It is based on a set of ten generic steps:

Step 1 Set the contract and organize the Evaluation Committee
Step 2 Develop the information base about the program
Step 3 Conduct the evaluability assessment
Step 4 Specify the type of evaluation
Step 5 Identify the evaluation objectives and indicators
Step 6 Develop the data collection design
Step 7 Conduct the data collection
Step 8 Analyze the data using gender analysis
Step 9 Develop the recommendations
Step 10 Write, present and disseminate the evaluation report
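For teams applying the framework, the ten generic steps above could be tracked as a simple ordered checklist. The sketch below is a hypothetical illustration of that idea, not part of the published framework; the function name and structure are assumptions for demonstration.

```python
# The framework's ten generic steps as an ordered checklist, so an
# evaluation team could track which step comes next. Illustration only.
STEPS = [
    "Set the contract and organize the Evaluation Committee",
    "Develop the information base about the program",
    "Conduct the evaluability assessment",
    "Specify the type of evaluation",
    "Identify the evaluation objectives and indicators",
    "Develop the data collection design",
    "Conduct the data collection",
    "Analyze the data using gender analysis",
    "Develop the recommendations",
    "Write, present and disseminate the evaluation report",
]

def next_step(completed):
    """Return the number and name of the first step not yet completed,
    or None once all ten steps are done."""
    for number, name in enumerate(STEPS, start=1):
        if name not in completed:
            return number, name
    return None

print(next_step({"Set the contract and organize the Evaluation Committee"}))
# → (2, 'Develop the information base about the program')
```

Keeping the steps in a single ordered structure also gives a natural place to attach the gender-specific questions and considerations the framework raises at each step.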

At each step, the framework outlines the ways in which gender-specific considerations must be brought into play to ensure a gender-sensitive and woman-centred program evaluation process and results. It outlines questions and considerations at each step, and invites those involved in evaluation of woman-centred programs to consider gender issues.

Although women's organizations and community groups have long advocated that a greater proportion of health research and service delivery funding be spent on woman-centred activities, little evidence exists to indicate that significant increases have occurred. By helping to demonstrate that gender-specific programs produce effective outcomes, the framework can support the contention that women's health concerns merit gender-specific approaches.

When the desirable characteristics of gender-specific, woman-centred programs and a program evaluation framework have been identified, their application to specific programs enables us to conduct useful program evaluations that can influence both programs and policies, and elicit the cooperation and participation of program staff, their clients and other stakeholders.

The framework should be viewed as a flexible instrument rather than a rigid format for achieving evaluation objectives. The framework is not a definitive work, but a provisional one upon which future efforts can be built. In that spirit, we can learn together, and continue to use the collective process essential for the progress we pursue.


1. Joan McLaren, Evaluating Programs for Women: A Gender-specific Framework, Winnipeg, Manitoba: Prairie Women's Health Centre of Excellence, 1999.

2. The program evaluation report entitled Working with Women: Program Evaluation of the Birth Control/Unplanned Pregnancy Volunteer Counselling Program, Women's Health Clinic by Joan McLaren, was completed in June 1999.

3. The program evaluation report entitled More Than Just Worry: Violence Prevention Education--An Evaluation in a Gender-specific Framework by Joan McLaren and Jayne Melville Whyte, was completed in April 1999.
