  • 09/26/2019 12:51 PM | Greater Boston Evaluation Network (Administrator)

    On September 10, 2019, Bryan Hall, Senior Director, Evaluation at BellXcel, discussed his experience hiring evaluation professionals at his organization, building a three-person evaluation department, and the process and challenges he encountered along the way.   Over 20 GBEN members participated in-person or virtually. 

    Mr. Hall discussed the pre-hiring process and what hiring managers should consider before making a job posting public, the hiring process and what to consider when creating a job posting, the applicant and resume review process, and the key characteristics of strong evaluation professionals.    Below are a few key summary points from Bryan’s presentation.

    The Hiring Process Begins Long Before a Job Posting is Public

    The pre-hiring process is critical, and can make or break a hiring initiative.  Before developing and making public a job posting, it’s important to consider a host of factors. 

    • Hiring managers should identify the key processes and stakeholders that will be part of the entire hiring initiative.  For example, what role will your organization’s Human Resources department play in the hiring process?  Who will participate in the interview process, and what scheduling accommodations will be needed?
    • It’s important to know your full budget for hiring, beyond (but including) the salary range for the position.  Do you have budget to fly a candidate in for an interview?  Will you have budget to train staff once hired?
    • Don’t underestimate the amount of time needed to complete the hiring process.  Some hiring processes can take upwards of a year to complete.  The interview process alone can sometimes take two-to-three months.  Bryan noted that a recent hiring process for an Evaluation Manager position took 6-8 months before leading to a hire. 
    • Think seriously about the workload of the position you are hiring for.  What will their day-to-day, month-to-month work life look like?  It’s important to consider whether you even need a full-time employee at all or if part-time, seasonal, temporary, or consultant staff would be a better fit for your needs.

    The job posting is important for candidates and the hiring manager

    Once you are ready to formally start the hiring process, it’s important to develop a job posting that you will make public to interested candidates.   Similar to the pre-hiring process, it’s important that the job posting strongly reflect your organization’s needs.  A poorly designed job posting can delay your hiring process or attract candidates that may not be the best fit for your needs.  A few considerations for the job posting:

    • The job posting is not the same as the job description you hand a new employee once they start.  Avoid making a job posting an exhaustive list of responsibilities, but instead try to capture the high level job responsibilities and requirements you are looking for.
    • A job posting should include key information that allows an interested candidate to decide if they are a good fit for the job.  Key items to include are:  brief position description, key responsibilities and expectations, key hiring attributes and requirements, a brief description of your organization and work, brief summary of benefits, the process to apply, and salary range (ideally, but not always possible).
    • Seriously consider what attributes a strong candidate must have in order to be considered for the position, and what are simply nice to have.   Most employees learn on the job and receive significant training once they start.  Consider which attributes are flexible and which are non-negotiable for a candidate.
    • Consider the power of transferable and general skill domains, versus hiring for a specific skill set.  For example, if your organization uses Salesforce as a data system, you don’t necessarily need to hire a candidate with Salesforce experience.  Instead, consider – and advertise for in the job posting – a candidate with strong “technology proficiency” in other systems who can be trained on how to use Salesforce.  
    • Don’t let perfect be the enemy of great.  There is no such thing as a perfect candidate.  Of the “must have” and “nice to have” attributes in your job posting, identify the 3-5 that are most important to you, and decide which others you can be flexible on. 

    There are many online resources for job postings

    Your job posting should ideally be hosted on your company’s website and/or LinkedIn account.  In addition, the evaluation world has a few key sites for job postings, including the American Evaluation Association (AEA) website.  Beyond those, consider general job websites like LinkedIn as well as sites geared toward non-profits.  Job sites like Monster and Career Builder may not be that useful for evaluation positions.  Lastly, consider discipline-specific professional associations, as most offer the ability to post job openings.  For example, if you work in the field of public health, consider listing the job through the American Public Health Association (APHA) website. 

    Interview Questions Should Fill in the Gaps of the Resume

    A resume and cover letter (if provided) should tell you 80-90% about a candidate.  The purpose of the interview is to fill in the rest.  Therefore, focus your interview on attributes about the candidate that may not be expressed via the resume such as passion for the work, soft skills, and problem-solving skills.  Example questions that Bryan has used in past interviews include:

    • Why are you interested in this position?  Why did you apply?
    • Tell me about a recent job experience and its relevance to this job.
    • A key job responsibility is ____.  Tell me about a time you did ____ ?  Are you comfortable/do you enjoy doing ______ ?
    • Tell me about your work personality?  How do you work with others and/or independently?  What are your needs as an employee?
    • Tell me about a time you faced a conflict/challenge/problem – how did you approach and resolve it?

    Mr. Hall’s full presentation slides can be found here (members only). 

  • 09/26/2019 12:41 PM | Greater Boston Evaluation Network (Administrator)

    The next 2-year terms for GBEN Treasurer and Clerk start January 1, 2020. This is an opportunity to show your commitment to the value of GBEN and help to shape its future! You may nominate yourself or a committed GBEN colleague. The deadline for nominations is Monday, October 7, 2019.

    Position Descriptions

    GBEN is governed by an Executive Committee, which serves as the board of directors, and consists of a President, a Vice-President, a Treasurer, a Clerk, and chairs from each of the committees. The President, Vice-President, Treasurer, and Clerk are elected by the membership. Any GBEN member may serve on the GBEN Board.

    The Executive Committee meets monthly to oversee all GBEN activities and operations, including all subcommittees. The Committee is also responsible for setting dues and approving a budget for each year. The Executive Committee oversees elections, fills vacancies, holds special elections, and removes Committee members as outlined in GBEN’s by-laws.

    Clerk Position Description:

    • Record the proceedings of GBEN;

    • Keep the records of Bylaws and subsequent amendments; 

    • Handle all the general correspondence of GBEN, as directed by the President and Vice-President;

    • Support creation of agendas for GBEN meetings;

    • Work with the Treasurer to submit annual IRS filing for 501c(3) status and attend to any other administrative and annual reporting work associated with 501c(3) status.

    Treasurer Position Description:

    • Collect dues and any other funds to be received by GBEN;

    • Document all financial transactions related to GBEN;

    • Report monthly financial updates to the President and Vice-President and the Executive Committee;

    • Report at general membership meetings and prepare an annual/fiscal year report;

    • Transact the general business of GBEN in the interim between meetings; 

    • Disburse funds and pay bills in accordance with the provision of the Bylaws or policies of the Executive Committee;

    • Work with the Clerk to submit annual IRS filing for 501c(3) status and attend to any other administrative and annual reporting work associated with 501c(3) status.

    • The outgoing officers shall deliver to their successors all books and materials of their respective offices by January 15th.

    Qualifications and Time Commitment:

    • Membership with GBEN and AEA

    • Some leadership or management experience

    • Minimum of 3 years’ experience with evaluation-related work

    • Capacity to commit 10-15 hours per month

    • Some Board experience helpful

    • Strong organizational skills helpful

    Submission Process:

    Each nomination submission should include:

    • Name, Title, Affiliation, Email, Phone

    • Resume or CV

    • A brief statement answering the following questions:

      • Why are you interested in becoming Clerk or Treasurer of GBEN?

      • What are your qualifications for Clerk or Treasurer?

      • What is your vision for GBEN?

    Submit COMPLETED applications to GBEN via email by Monday, October 7, 2019, or earlier if possible.


    If you have questions about the nominations process, please contact Danelle Marable.

  • 07/31/2019 10:14 AM | Greater Boston Evaluation Network (Administrator)

    Big data.  Data science.  Predictive analytics.  Social network analysis.  The field of evaluation is expanding to new frontiers, becoming a transdisciplinary practice.   

    Based on your work experience and interests in the field of evaluation, what is the next area(s) that you want to learn more about and integrate into your evaluation practice?

  • 06/26/2019 3:24 PM | Greater Boston Evaluation Network (Administrator)

    On Tuesday, June 18th, GBEN hosted its second roundtable on the topic of social network analysis.   Over a dozen GBEN members and guests participated.  The roundtable discussion was led by Kelly Washburn, MPH, from Massachusetts General Hospital’s Center for Community Health Improvement.  Kelly is also one of GBEN’s Programming Committee co-chairs.

    Social network analysis (SNA) is “the mapping and measuring of relationships and flows between people, groups, organizations, computers or other information/knowledge processing entities” (Valdis Krebs, 2002). SNA can show the performance of the network as a whole and its ability to achieve its key goals; characteristics of the network that are not immediately obvious, such as a smaller sub-network operating within it; the relationships between prominent people of interest whose position may provide the greatest influence over the rest of the network; and how directly and quickly information flows between people in different parts of the network.

    Kelly walked through a small social network analysis she conducted to walk participants through the different steps needed, challenges, and lessons learned.  The project discussed was a provider task force improving connections among services providers, streamlining services, and enhancing care coordination efforts.  The SNA provided a baseline on how the task force members work with each other by asking four questions:

    1. Do you know this person?
    2. Have you gone to this person for information in the last year?
    3. Have you worked with this person to obtain new resources in the last year?
    4. Have you referred a client to their organization in the last year?

    The analysis was done in Gephi, a free software package for conducting social network analyses.  Data cleaning was the most tedious part of the project and was done manually; however, there are ways to bypass manual data cleaning.  Once the data are set up in the appropriate Nodes and Edges files, they are uploaded into Gephi.  From there, Kelly followed the steps detailed in Gephi’s manuals to take the data from an initial map to the finalized map.  Following Kelly’s discussion of her project, others in attendance spoke about their own experiences using social network analysis in their work.
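    To make the workflow concrete, here is a minimal sketch in plain Python of how survey answers like the four questions above can be turned into the directed edge list, Nodes/Edges rows, and a simple centrality measure that Gephi works with.  The organization names and relationships below are hypothetical illustrations, not data from the actual task force.

    ```python
    from collections import Counter

    # Each "yes" answer to a relationship question becomes a directed edge:
    # (respondent, person/organization named, which question it came from).
    # All names here are made up for illustration.
    survey_edges = [
        ("Org A", "Org B", "went for information"),
        ("Org A", "Org C", "referred a client"),
        ("Org B", "Org C", "worked to obtain resources"),
        ("Org C", "Org A", "went for information"),
        ("Org D", "Org C", "knows this person"),
    ]

    # In-degree: how often each organization is named by others -- a simple
    # indicator of which task force members sit at the center of the network.
    in_degree = Counter(target for _, target, _ in survey_edges)

    # Gephi expects two CSV files: a Nodes file (Id,Label) and an Edges file
    # (Source,Target,Type). Build those rows from the same edge list.
    nodes = sorted({n for source, target, _ in survey_edges for n in (source, target)})
    node_rows = [f"{n},{n}" for n in nodes]
    edge_rows = [f"{source},{target},Directed" for source, target, _ in survey_edges]

    print(in_degree.most_common(1))  # the most frequently named organization
    ```

    Writing `node_rows` and `edge_rows` out as CSV files gives you inputs you can load directly into Gephi for layout and visualization, while quick measures like in-degree can be checked in code before the mapping step.
    
    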

    Key Challenges and Lessons Learned:

    The roundtable participants discussed a few challenges and lessons learned when conducting a SNA, including:

    • New analytical methods and techniques, like SNA, can require a lot of patience and time to learn and master.  Be sure to invest the necessary time when learning how to conduct a SNA for the first time.
    • Achieving a high response rate requires A LOT of follow-up to ensure the data is representative of the population you are analyzing.  Be sure to invest the necessary time and resources in follow-up for your project.
    • Make sure the questions being asked are the right questions as it’s difficult to change directions once the project and analysis has started.
    • Continually ask yourself and/or your team(s):  Do I need to collect new data or is there already collected data I can use for the SNA?  
    • SNA can be frustrating to administer and master at times.  Patience during the process is key to ensuring a successful outcome. 
    • The visual map was key for the task force in understanding the analysis. 


  • 05/28/2019 1:12 PM | Greater Boston Evaluation Network (Administrator)

    Feminism, at its core, is about the transformation of power—but how do you know that’s happening at the organizational level? How can you understand the core drivers of that transformation? How can your own process of evaluating that transformation democratize the evaluators’ power?

    Taylor Witkowski and Amy Gray are evaluation and learning specialists at Oxfam America, designing and testing feminist monitoring, evaluation and learning processes for a gender justice-focused organizational change initiative.


    Everything is political – even evaluations.

    Traditional evaluations, even when using participatory methods, prioritize certain voices and experiences based on gender, race, class, etc., which distorts perceptions of realities. Evaluators themselves carry significant power and privilege, including through their role in design and implementation, often deciding which questions to ask, which methodologies to use, and who to consult. 

    Feminist evaluation recognizes knowledge is dependent upon cultural and social dynamics, and that some knowledge is privileged over others – reflecting the systemic and structural nature of inequality. However, there are multiple ways of knowing that must be recognized, made visible and given voice. 

    In feminist evaluation, knowledge has power and should therefore be for those who create, hold and share it – therefore, the evaluator should ensure that evaluation processes and findings attempt to bring about change, and that power (knowledge) is held by the people, project or program being evaluated.

    In other words, evaluation is a political activity and the evaluator is an activist.


    Oxfam America is seeking to understand what it means to be a gender just organization—from the inside out.

    In order to do this, Oxfam America recognizes that a holistic, multi-level approach is required. We believe that transformational change begins at the individual level and ripples outwardly into the organizational culture and external-facing work (Figure 1).

    (Figure 1)

    This is why we are investing in a feminist approach to monitoring and evaluation—because even though feminist values—adaptivity, intersectionality, power-sharing, reflexivity, transparency—seem like good practice, without mainstreaming and operationalization they would not be fully understood or tied to accountability mechanisms at the individual, organizational or external levels.

    Therefore, as evaluators, we are holding ourselves accountable to critically exploring and implementing these values in our piece of this process. The foundational elements of this emergent approach include:

    • Power-Sharing: The design, implementation and refinement of the monitoring and evaluation framework and tools are democratized through participatory consultations with a range of stakeholders—a steering committee, the senior leadership team, board members, gender team and evaluation team.
    • Co-Creation: Monitoring includes self-reported data from project contributors as well as process documentation from both consultants and evaluation staff, and data is continually fed into peer validation loops for ongoing reflection and refinement.
    • Transparency: Information regarding the monitoring and evaluation framework, approach and activities are communicated and made accessible to staff on a rolling basis as they evolve.
    • Peer Accountability: Monitoring mechanisms that capture failures and the cultivation of peer-to-peer safe spaces to discuss them create new opportunities for horizontal learning and growth. This includes a social network analysis (SNA) of perceived power dynamics within teams (contributed by team members via an anonymous survey), followed by a group discussion in which they reflected on the visual depiction of their working dynamics through the lens of hierarchy and intersectionality.


    As the monitoring, evaluation and learning (MEL) staff working on this initiative, we recognize that we have an opportunity to directly contribute to change. We therefore see ourselves as activists, ensuring MEL processes and tools share knowledge and power as well as generate evidence that reflects diverse realities and perspectives, and can be used for accountability and learning at multiple levels. As a result of this feminist approach to MEL, participating Oxfam staff can see and influence organizational transformation.                                                                       

    How have you used feminist evaluation in your work? Do you have any tips, resources or lessons learned you’d like to share? Do you think this would make a good roundtable discussion?



  • 04/29/2019 3:12 PM | Greater Boston Evaluation Network (Administrator)

    As evaluators, we sometimes collect more data than we can use.

    What are 1 or 2 methods or tricks you use to make your data collection process more meaningful and/or more aligned to your evaluation questions?

  • 03/27/2019 10:52 AM | Greater Boston Evaluation Network (Administrator)

    On Tuesday, February 5th, GBEN and Northeastern University’s Public Evaluation Lab (NU-PEL) co-hosted a panel on Impact Evaluation.  This was the first GBEN event of 2019 and the first event co-sponsored with NU-PEL.  The event saw the greatest turnout in the history of GBEN with 66 attendees!

    The panel featured five local internal and external evaluation leaders who have recently undertaken randomized controlled trial (RCT) or quasi-experimental impact evaluations. 

    The five panelists were:

    More and more, non-profits must demonstrate impact in order to ensure their ability to grow and innovate.  The purpose of the panel discussion was to explore what drives non-profits to engage in an impact evaluation, how to choose methodology, and lessons learned about communicating results. 

    Here are some of the key takeaways from the engaging panel discussion:

    Methodological Rigor vs Reality

    Several of the panelists discussed the push-and-pull between ideal methodological rigor and what is actually possible and/or ethical for programs.  In particular, Ms. Britt and Mr. Nichols-Barrier spoke about whether randomization was possible, depending on program over-subscription.  On the flip side, Ms. Goldblatt Grace and Professor Farrell from My Life My Choice shared a powerful anecdote about overcoming skepticism toward their project’s rigorous methodology in order to allow a research assistant to be present at the mentor-mentee match sessions.

    Organizations Conduct Impact Evaluations for Lots of Reasons

    The motivating factors behind the decision to evaluate impact are diverse. Organizational values, political context, and funders can all play a role in the decision to conduct an evaluation as well as decisions around study methodologies. 

    Communicating Results

    Several of the panelists shared helpful tips about communicating results, specifically going beyond sticking them in a report that few people read. Ms. Britt shared a strong example of Year Up developing a year-long plan for communicating parts of their results throughout the whole organization, including a big celebratory kick-off event. 

    GBEN would like to thank the five panelists for being a part of this incredible event as well as NU-PEL for co-hosting.  Be on the look-out for future co-hosted events with NU-PEL!

  • 12/19/2018 10:48 AM | Greater Boston Evaluation Network (Administrator)

    On Tuesday, December 4th, GBEN hosted its last roundtable of the year, about budgeting for evaluation.  Like all of our roundtables this year, we had a great turnout with 17 members participating.  The roundtable was designed to be an open dialogue focusing on the critical issues evaluators face when budgeting for evaluations. Guiding questions for the discussion included:

    • How do non-profits and other organizations fund evaluation staff, data management systems, and other elements of successful evaluations?
    • Is there a magic "rule of thumb" on how to allocate budget resources to evaluations?
    • Are there funders interested in supporting evaluation capacity building?
    • What are other processes or best practices to consider when budgeting for evaluation?  

    Here are some of the key takeaways from the group discussion:

    Evaluation staffing and capacity varies across organizations

    Some organizations still have little-to-no dedicated evaluation staff on the payroll.  For many organizations, having a dedicated evaluator is a new organizational initiative.  For those with dedicated evaluation staff, many are grant-funded, meaning once the grant is over, the position may be reduced or eliminated.

    Evaluation budgets can span multiple departments

    Many organizations spread evaluation costs and budgets across varying departments and/or programs.  Evaluation budgets can include staff who may not have the word “evaluation” in their job title or job responsibilities, including front-line program staff, data management and/or administrative staff, and/or information technology and systems staff.  For example, an organization may employ a Database Specialist through their IS/IT department who does critical evaluation-related work by using Efforts to Outcomes (ETO) software. 

    Commitment from organizational leadership is key

    Like all facets of a good evaluation initiative, commitment from senior management within an organization is important for evaluation funding.  Senior leadership and management are often best positioned to seek out, advocate for, and request funding for evaluation from key funding partners. 

    Monitoring your evaluation work helps make the case for future or additional evaluation

    Evaluation staff may hold the key to the magical data kingdom, but often we don’t directly experience or see how the evaluation results are used to change or improve programmatic processes.  Be sure to document the collaborative process between the evaluation team and front-line program staff to highlight programmatic improvements that result from the evaluation findings. 

    Relationships with development staff are key

    It is very helpful to have a relationship between the development team and evaluation team, especially for grant writing.  Development staff understand funder needs and wants and can effectively communicate impressive evaluation results.

    Assert your evaluation needs!

    Most people have no idea what it takes to make an evaluation successful.  Often it takes more than just a staff person.  Evaluation needs can include certain software, equipment, consultants with specific expertise, and other administrative needs like postage and mailing supplies.  Make a wish list of things you need and share it!

    Seek out ways to reduce evaluation costs

    While planning and conducting an evaluation, it’s important to always ask the question: “What do we want, and what do we already have?”  For example, is there data already being collected, or are there existing data systems that could serve your evaluation project?  Finding ways to lower evaluation costs may make it easier to acquire the necessary budget resources for an evaluation initiative. 

    Resources:  (click here to access members-only resources from this roundtable)

    • Budgeting for Evaluation: Key Factors to Consider – a rubric developed by Westat for assessing how much to budget for evaluation
    • Budgeting for Evaluation from the 2014 Americorps Symposium  

  • 10/31/2018 1:52 PM | Greater Boston Evaluation Network (Administrator)

    On October 5, 2018, Danelle Marable, GBEN President and Senior Director for Evaluation, Assessments, and Coalitions at the Massachusetts General Hospital Center for Community Health Improvement (MGH/ICHI), discussed two community health needs assessments that her team is working on: 

    1. Partnering with the North Suffolk Public Health Collaborative, municipalities, other healthcare providers, community coalitions, and organizations to conduct an assessment for Chelsea, Revere, and Winthrop.

    2. Partnering with all Boston hospitals to conduct a Boston-wide assessment. 

    Beginning in 2011, the Affordable Care Act required every non-profit hospital to conduct a community health needs assessment (CHNA), develop strategic plans, and post the findings publicly.  A hospital’s non-profit status could be revoked if the CHNA was not conducted.

    A CHNA is a systematic examination of the health status indicators for a given population that is used to identify key problems and assets in a community. The goal of a CHNA is to develop strategies to address the community’s health needs and identified issues.  A CHNA is instrumental in identifying the social and environmental conditions, as well as social determinants, that can impact the health of these communities, such as childhood experiences, housing, income, employment, healthcare, and community environment.  MGH identified Revere, Chelsea, Charlestown, and Winthrop as primary communities to target with a CHNA and a community health improvement plan (CHIP).

    During the roundtable, Danelle discussed the process around the assessments, community engagement strategies, data collection efforts, and implementation. 


    The CHNA and CHIP is a year-long process that occurs every three years.  MGH uses the MAPP Framework (Mobilizing for Action through Planning and Partnerships) for the CHNA and CHIP, which, in short, outlines a process of engaging partners in comprehensive data collection and strategic planning. MGH/ICHI dedicates one year to visioning, assessment, and identification of strategic issues; their Board of Trustees then reviews and grants approval, allowing 100 days to develop an implementation plan. In Massachusetts, the MGH Trustees are required to attend community advisory committee meetings in addition to other meetings.         

    Community Engagement and Needs Assessment

    MGH/ICHI engages multiple sectors during the CHNA process:  residents, local leaders, community-based organizations, and educators, as well as other hospitals in the communities.  MGH/ICHI started engaging other hospitals so they don’t burden the community with repeated data collection.  Assessment and identification of needs can be a challenge, as prioritization is not always straightforward.  For example, residents in the community of Chelsea spoke of issues of community violence and safety, but other data sources showed decreases in instances of violence in the community. 

    Data Collection

    Data collection for the CHNA involved survey data, focus groups, in-depth interviews, and secondary data sources, including a quality of life survey that addresses broader social determinants such as housing, transportation, and the overall community. Surveys are translated and are distributed online and in print through various community networks.

    Implementation and Evaluation

    A crucial part of the CHNA process is the identification and implementation of evidence-based strategies around certain community needs and issues.  Part of this process involves community dialogue – asking key questions about what can be done, what resources are available, and whether there is the will to implement solutions.   This process does not always lead to straightforward answers.  For example, communities identified opioid use as a crisis and access to Narcan as a response; however, the community balked at increasing access to Narcan under the suspicion that it leads to increased opioid use.  The group had to step back and educate the community regarding the benefits of Narcan.  Once strategies have been implemented, the final phase is comprehensive progress monitoring, implementation evaluation, and impact measurement. 

    Copies of Danelle’s PowerPoint slides can be found in the member roundtable resources section (members only!).  For questions, reach out to Danelle by email.

  • 09/28/2018 9:19 AM | Greater Boston Evaluation Network (Administrator)

    On September 6, 2018, Dana Benjamin-Allen of the Boys & Girls Clubs of America – and a long-time GBEN member – hosted GBEN's first professional development webinar, titled “How to Host Webinars that Don’t Suck!”  The webinar – a live meeting and discussion that occurs via the internet – can sometimes have a reputation for being boring, disengaging, and a waste of an audience member’s time.   For evaluation professionals, creating a data-rich yet engaging webinar can be a challenge. 

    Fourteen GBEN members joined the webinar to learn about tools to increase interactivity and engagement, best practices for online presentations, and the benefits of different distance engagement platforms.   Dana used several tools and activities to model ways to engage the audience and keep them interested in the content of the webinar.  You can access her slide deck on the Roundtable Resources section of the website (members only).  Below are some take away points for engaging your webinar audience.

    Framing the Webinar is Important!

    How you frame and present the webinar is critical to gaining interest and maintaining audience engagement once the webinar starts.  Start the webinar with an introductory slide that presents the intentions and objectives that best resonate with your audience.  It’s important to describe the presenter's relevant background and experience.  

    In addition, it is important to ‘level set’ your audience, that is, to identify the knowledge and experience levels among your audience members regarding the webinar topic.  You may have audience members with diverse backgrounds, varying knowledge levels, and varying experience levels with the topic.  Level setting can help you home in on certain content areas that may be most relevant to this diverse audience, as well as identify gaps in knowledge.   Hot tip:  create interactive activities using online polls and tools like GroupMap to level set your audience.

    Language matters so be sure to use a catchy title to make the webinar more inviting.  The webinar title “Webinars that Don’t Suck!” is a perfect example!  The timing of your webinar is very important.  For some audience members, mornings are better, for others it may be lunch time.  If you are reaching a diverse geography of audience members, be sure to schedule it conveniently for all time zones. Lastly, experiment with different alternatives to the word webinar to increase engagement.  Hot tip: call it an online meeting, online course, virtual workshop, “brown bag”, or like the American Evaluation Association, a virtual “coffee break.” 

    Choose the best facilitation tools and methods based on audience needs and intended engagement level:

    There are various webinar platforms on the market today with varying functions.  Do you need a live whiteboard?  Do you want to share and/or pass the screen to multiple presenters?  Do you want a live camera to accommodate live video of all participants and presenters?  These factors should be considered before choosing a platform that best fits the format of your presentation, the needs of your audience, and the intended engagement level.   Hot tip:  using shared experience activities, such as real-time polling applications, helps connect audiences as well as solicit feedback.

    In addition to picking a webinar platform, it is important to identify which facilitation method is right for you and your audience.  Is there one or multiple presenters?  Are you hosting a moderated panel discussion?  Is the presentation interactive and involve audience participation?  Hot tip: co-hosting the webinar with a colleague who has credibility with your audience can increase audience interest and engagement. 

    Lastly, plan ahead for whether you will take questions and how you will manage them.  Do you need a colleague to help with this?  Will it be interactive (i.e. audio for all participants)?   Between 6-10 questions may be good for a live interactive presentation, but plan extra time, as participants may interact during each question.

    Presentation is Important:

    The design of your presentation slides is very important.  Keep your slides clean and easy to read.  If you are going to use a lot of text in a slide, plan accordingly to walk through the text with the audience.  Also, create intrigue, curiosity, and/or excitement from slide to slide using questions, images, and/or illustrations.  Lastly, use a strong, exciting, and engaging title.    Hot tip:  there are online alternatives to PowerPoint like Prezi, Visme, or Haiku Deck.

    During this webinar, Dana used the following tools (follow the links for more details):


Greater Boston Evaluation Network is a 501(c)3 non-profit organization. 
