Cardiovascular Epidemiology Training Program Evaluation: The Jackson Heart Study
1. Introduction
The Daniel Hale Williams Scholar Program was initiated by Jackson State University (JSU), Jackson, Mississippi, in 2013 as an integral part of the operations of the first center of its kind attached to a national study funded by the National Institutes of Health (NIH), the Jackson Heart Study (JHS) [1] [2]. The JHS was launched on September 26, 2000 and is funded by the National Heart, Lung, and Blood Institute (NHLBI) and the National Institute on Minority Health and Health Disparities (NIMHD) of the NIH. JHS is the largest single-site, community-based study of cardiovascular disease (CVD) among African Americans in the United States. Based in Jackson, Mississippi, and conducted in partnership with Jackson State University, the University of Mississippi Medical Center, and Tougaloo College, JHS has enrolled 5306 participants, aged 35 - 84 [3]-[6]. In 2018, the Mississippi Department of Health was added to the partnership. Since its inception, JHS has completed three exam cycles, and Exam 4 is currently in progress. The study has made significant contributions to understanding the social, environmental, and genetic factors affecting CVD risk in African American communities. It implemented community-based participatory research methods and collected a vast array of clinical and behavioral data that enabled scientists to influence national guidelines on conditions such as hypertension and diabetes. In addition, JHS invested in training a diverse cadre of researchers and public health professionals committed to addressing health disparities when, in 2013, it established the Graduate Training and Education Center (GTEC) at Jackson State University, the home of the Daniel Hale Williams Scholar Program.
The focus of the Daniel Hale Williams Scholar Program is to recruit and train graduate scholars to develop expertise in cardiovascular epidemiology. Advances in cardiovascular epidemiology over the last few decades have provided a better understanding of the associated risk factors and the pathogenesis of CVD. As a result, the GTEC investigators believe that training programs like the JSU JHS GTEC Daniel Hale Williams Scholar Program can inspire and position prospective public health scholars to learn about the identification and treatment of the major risk factors for developing cardiovascular disease [7] [8]. With the knowledge acquired through their activities at GTEC, many of the scholars are expected to develop the capacity to identify key public health issues and policies associated with communities at increased risk of CVD, and to consider careers in public health and the biomedical sciences that help in the fight to reduce health disparities. In doing so, the scholars can help satisfy the vital need for more culturally competent public health and biomedical professionals in the healthcare system.
This study examines the evaluation process that illuminates the effectiveness of the JSU JHS GTEC Daniel Hale Williams Scholar Program. The JSU JHS GTEC is funded by the National Heart, Lung, and Blood Institute (NHLBI) and the National Institute on Minority Health and Health Disparities (NIMHD) of the National Institutes of Health (NIH). JSU JHS GTEC conducts periodic evaluations of the Daniel Hale Williams Scholar Program to examine and respond to the strengths and weaknesses of the program and to assess the extent to which the program is accomplishing its learning outcomes and strategic goals.
Previous research on evaluations of scholar training programs has indicated that structured mentorship, rigorous scholar recruitment, enhanced training, and financial support mechanisms are critical components of successful academic training programs, especially those targeting underrepresented groups in biomedical and public health research. The BUILD initiative [9] suggested that incorporating a training model that includes mentors, evaluation metrics, and external individual and institutional academic partnerships can significantly improve scholars’ research skills and career aspirations. In addition, the Meyerhoff Scholars Program [10] illustrated that scholar training that includes increased academic preparation, community-building, and financial support can contribute immensely to graduate school enrollment among minority scholars.
The evaluations that JSU JHS GTEC conducts on the Daniel Hale Williams Scholar Program help determine the effectiveness of the experiences and training afforded the scholars, identify areas where additional training may be needed, assess the effectiveness of the programmatic activities, and determine whether the training provided is realizing the intended outcomes. The evaluation is also a medium through which GTEC can determine strategies for improving the quality of future training programs. The objective of this study was to outline one phase of the evaluation conducted by the JSU JHS GTEC to assess the center’s initial activities and to gauge whether the center was fulfilling the strength and depth of its commitment to provide a core foundation in cardiovascular epidemiology for the Daniel Hale Williams Scholars.
2. Methodology
The data gathering processes included identifying the major constituents who are involved in and key to the success of GTEC programs. This report represents the qualitative phase of the GTEC evaluation. Additional GTEC evaluations included other qualitative methods, such as case studies, and GTEC also reviewed quantitative program data as part of the comprehensive overall evaluation of the Daniel Hale Williams Scholar Program. This study is limited specifically to the qualitative evaluation.
2.1. Evaluation Phases and Timeline
The evaluation was structured across three sequential phases: 1) Formative Evaluation (Months 1 - 3), 2) Process Evaluation (Months 4 - 8), and 3) Outcome Evaluation (Months 9 - 12). The Formative Evaluation phase focused on stakeholder engagement, needs assessment, and refinement of program goals and activities, laying the foundation for successful implementation. Insights from this phase directly informed the design of instruments and strategies used in the subsequent Process Evaluation, which monitored fidelity, participation, and quality of program delivery. Data collected during the Process phase were used to make mid-course adjustments to improve implementation. Finally, the Outcome Evaluation assessed changes in key indicators aligned with the program’s objectives, such as participant knowledge, behavior change, and system-level impacts. Each phase built upon the findings of the previous one, enabling a responsive and iterative approach that ensured the evaluation remained aligned with program developments and stakeholder needs throughout the 12-month implementation period (Table 1).
Table 1. Evaluation phases, activities, and timeline.
Phase | Timeframe | Key Activities | Purpose
Formative Evaluation | Months 1 - 3 | Stakeholder engagement, needs assessment, program refinement | Align program design with community needs and priorities
Process Evaluation | Months 4 - 8 | Monitor implementation fidelity, participation, and quality | Identify operational strengths and gaps
Outcome Evaluation | Months 9 - 12 | Measure changes in knowledge, behavior, and system-level outcomes | Assess program effectiveness and overall impact
Below is a structured overview of the Daniel Hale Williams Scholar training program in cardiovascular epidemiology, developed by the JSU JHS GTEC, which runs in two-year intervals for each scholar cohort (Table 2). The program is integrative in nature; as a result, some activities may run concurrently throughout the year [11].
Table 2. DHWS program activities and timelines.
Activity | Start Date | End Date | Duration/Frequency
Orientation & Research Camp Training | August 1, 2024 | August 12, 2025 | 2 weeks
Brown Bag Luncheon | August 1, 2024 | August 12, 2025 | Monthly
Quarterly Seminars | August 1, 2024 | August 12, 2025 | Quarterly
Public Health Training Workshops | August 1, 2024 | August 12, 2025 | Ongoing
University of Michigan Summer Courses in Advanced Biostatistics, Epidemiology and Cardiovascular Epidemiology | August 1, 2024 | August 12, 2025 | 3 weeks (Summer)
Individual Research Project: Manuscript Development Using JHS Data | August 1, 2024 | August 12, 2025 | Ongoing
Mentoring & Professional Development | August 1, 2024 | August 12, 2025 | Ongoing
Community Engagement & Related Events | August 1, 2024 | August 12, 2025 | Ongoing
2.2. Activity Descriptions
Orientation, Bootcamp, & Training: Introduction to the program, expectations, and initial skills workshops.
Brown Bag Luncheon: Individual scholars present their prospective research ideas and receive feedback from attendees.
Quarterly Seminars: Scholars attend lectures by prominent national research scientists/investigators.
Epidemiology and Public Health Training Workshops: Regular sessions covering foundational public health topics, leadership, and professional development.
Professional Development at Partner Institution: Scholars travel to the University of Michigan School of Public Health during the summer months to attend classes, gaining hands-on field experience in cardiovascular epidemiology.
Individual Research Project: Scholars investigate a public health issue, culminating in a presentation or scientific poster and a published manuscript.
Mentoring & Professional Development: Ongoing guidance from faculty, mentors, advisors, and biomedical and public health professionals to foster career growth and networking.
Community Engagement & Events: Attendance at community events, volunteering, and exploration to emphasize public health practice.
2.3. Recruitment of Daniel Hale Williams Scholars
The JSU JHS GTEC recruits full-time graduate students enrolled in Public Health, Sociology, Psychology, Medicine, Nursing, Pharmacy, Dentistry, and STEM programs from selected universities in Mississippi. Applicants must have a grade point average (GPA) of 3.0 or above, must have an interest in cardiovascular research, and must commit to participating in the program for two years.
3. Results
Table 3 provides information on activities that were conducted as part of the overall evaluation of the GTEC program. The qualitative evaluation presented for this report includes information gathered from the following:
a) Strengths, Weaknesses, Opportunities and Threats (SWOT) analyses with program staff;
b) Learning Community (LC) advisors’ key informant interviews;
c) GTEC scholars’ key informant interviews.
3.1. Key Insights from LC Advisor Interview
Additional insights from the key informant interviews enriched the SWOT analysis (Table 3). For instance, while the SWOT identified insufficient academic development as a weakness, advisors specifically emphasized a lack of advanced enrichment activities and insufficient alignment of scholar-advisor expertise. This disconnect sometimes led to delays in research deliverables and less impactful mentorship.
Furthermore, financial limitations were underscored with greater nuance in interviews. Advisors revealed that current stipends and advisor compensation discouraged deeper involvement and limited the amount of dedicated mentoring time.
Table 3. Summary of SWOT analysis.
Category | Findings
Strengths | Strong interdepartmental collaboration (JSU, JHS); effective academic resources (labs, training, publications); institutional support for LC advisors
Weaknesses | Limited local LC advisor pool; restricted number of student slots; cumbersome JHS data access; inadequate incentives for both scholars and advisors
Opportunities | Expand student cohorts and LC advisor recruitment; increase access to JHS data; additional funding for technology and training
Threats | Continued barriers to JHS data access; need for more program staffing; uncertainty of long-term funding and competition with other programs
3.2. Insights from Key Informant Interviews
Key informant interviews provided rich, contextual insights that deepened understanding of the SWOT analysis findings. While the SWOT analysis identified internal strengths such as committed leadership and innovative program design, interviewees emphasized the importance of informal peer support networks and culturally responsive practices that were not explicitly captured in the initial assessment. Regarding weaknesses, interviews revealed operational barriers—such as inconsistent communication channels and limited staff capacity—that contributed to implementation delays. These nuanced perspectives complemented the SWOT’s broader identification of resource constraints. In terms of opportunities, interviewees highlighted untapped partnerships with local community organizations and academic institutions that could enhance outreach and sustainability. Finally, the interviews reinforced threats identified in the SWOT—such as funding instability and policy shifts—while adding layers of concern about stakeholder burnout and external political pressures. Overall, the interviews validated and expanded upon the SWOT framework, offering actionable insights for strategic planning and program refinement (Table 4).
Table 4. LC advisors’ recommendations aligned with SWOT findings.
Issue Area | SWOT Category | Advisor Insight | Recommendation
Recruitment of prepared scholars | Weakness | Inconsistent readiness among students | Add screening for research background and expand recruitment pool
Financial support | Weakness | Insufficient compensation for advisors and students | Increase stipends and provide faculty buyout support
Mentorship quality | Weakness | Mismatch in scholar-advisor interests, low engagement due to time constraints | Improve matching and provide guidelines for interaction frequency
Program monitoring | Threat | Lack of deliverable tracking and feedback mechanisms | Implement logs and progress dashboards
3.3. Linking Opportunities to Address Weaknesses
The program’s opportunities offer direct pathways to mitigating its weaknesses:
Expanding LC Advisors & Scholar Recruitment: Addressing the limited advisor pool and small scholar cohorts can increase diversity of expertise and reduce advisor burnout.
Funding for Tech & Research Tools: This can ease the burden of cumbersome data access and allow for more efficient project development and analysis.
Improving JHS Data Access: By creating a streamlined data-sharing process or using sandbox datasets for training, the program can reduce time delays and enhance scholar productivity.
3.4. Strategic Alignment of Opportunities to Strengths, Threats, and Weaknesses
The analysis revealed several opportunities that, if strategically aligned with existing strengths and used to address external threats, can directly contribute to improving current program weaknesses. For example, the program’s strong community trust and culturally responsive framework (identified strengths) position it well to develop partnerships with local health clinics and faith-based organizations (an opportunity) to expand service reach and address the documented weakness of limited participant engagement. Similarly, the opportunity to secure new funding through recently announced grant initiatives can help mitigate the threat of financial instability while also resolving internal weaknesses related to inadequate staffing and outdated technology infrastructure. Moreover, emerging policy support at the state level—though aimed at countering external threats—offers a timely chance to advocate for structural improvements and increased resource allocation, which would strengthen program sustainability and improve service delivery processes. By intentionally mapping opportunities to areas of need, the program can adopt a proactive strategy that transforms vulnerabilities into areas of growth and resilience.
Figure 1 below is a Venn diagram illustrating how each opportunity connects to:
Strengths (e.g., community trust → new partnerships),
Weaknesses (e.g., low staffing → university pipeline), and
Threats (e.g., funding instability → grant diversification).
Figure 1. Alignment of opportunities with strengths, weaknesses, and threats.
3.5. Comparative Analysis with Peer Programs
When compared to similar NIH-funded training initiatives, such as the MARC U-STAR Program [12] or PREP (Post-Baccalaureate Research Education Program) [13], several differences emerge as seen in Table 5.
Table 5. Comparative review of GTEC and NIH-Funded initiatives.
Program Feature | GTEC | MARC U-STAR* | PREP**
Stipend Support | Limited; concerns about adequacy | Fully funded; competitive stipends | Competitive stipends and research support
Mentor Matching | Based on availability; some mismatches noted | Intentional mentor-mentee alignment | Careful pairing with mentors and regular check-ins
Research Readiness Screening | Limited | Requires baseline lab/research experience | Tailored preparatory curriculum
Monitoring & Tracking | Informal; needs improvement | Standardized progress reports | Frequent evaluations and feedback mechanisms
*https://grants.nih.gov/grants/guide/pa-files/PAR-98-093.html [12]; **https://www.training.nih.gov/research-training/pb/pb/ [13].
3.6. Comparative Analysis and Benchmarking Opportunities
To contextualize the evaluation findings, a comparative analysis was conducted with similar programs operating in regional and national settings. Programs with comparable goals—such as [insert program names or types, e.g., “community-based chronic disease prevention initiatives”]—demonstrated parallel strengths in stakeholder engagement and culturally tailored interventions. However, these peer programs often had more robust data systems and formalized quality improvement cycles, highlighting an area where the current program lags behind. For instance, while this program relies on manual tracking methods, benchmarked programs such as [Program A] and [Program B] have adopted integrated digital platforms that enhance efficiency and accuracy in performance monitoring. Additionally, peer programs have institutionalized training and mentoring frameworks to address staff turnover—an ongoing weakness identified in this evaluation. These comparisons underscore the potential for adopting best practices in data management, staff development, and sustainability planning. Benchmarking against high-performing models offers a roadmap for strengthening internal capacity and aligning the program with field-wide standards of excellence.
This comparative review reveals strategic opportunities for GTEC to enhance infrastructure by adopting structured mentoring contracts, improved financial packages, and rigorous scholar selection procedures. These enhancements would position GTEC more competitively and could attract top-tier applicants.
Figure 2 illustrates how each opportunity (e.g., expanded advisors, increased funding) directly addresses a corresponding weakness such as limited mentoring or insufficient training.
Figure 2. Alignment of opportunities with program weaknesses.
3.7. Evaluation Preliminary Recommendations
3.7.1. Programmatic
1) Revising the overall program to include rigorous recruitment of scholars who have research expertise.
2) Hiring of more program office staff to facilitate monitoring of the program.
3) Developing communication protocol to increase transparency and interactions between LC advisors and scholars.
4) Increasing contact with LC advisors on progress of scholars on a quarterly basis.
5) Conducting case studies with those students who have completed the program to document what factors facilitated completion or non-completion of the program.
6) Conducting quantitative analyses with current data containing program information collected by the program’s office.
7) Developing action plans with staff and LC advisors to increase the overall effectiveness of the program.
8) Interviewing GTEC course instructors and other Summer Enrichment instructors.
3.7.2. Training
1) Revise the current training program to include more rigorous methodologies to prepare the scholars for conducting research.
2) Include external institutions in the development of training metrics to ensure scholars have the aptitude to conduct rigorous research.
3) Develop more courses for students to take from academic curricula that facilitate their learning of how to conduct public health and cardiovascular disease activities.
3.7.3. Financial
1) Increase the LC advisors’ FTEs for the program to cover protected buyouts.
2) Develop metrics to monitor scholars’ work linked to reimbursement.
3) Pursue more funding outside of NHLBI for the program.
4) Evaluate the potential need to increase the scholars’ stipend.
3.8. Recommendations with SMART Objectives and Implementation Plan
Based on the findings from the SWOT analysis, key informant interviews, and comparative analysis, the following recommendations are proposed to strengthen program effectiveness. Each recommendation includes SMART objectives, an implementation strategy, and evaluation metrics:
3.8.1. Strengthening Program Infrastructure and Staff Capacity
SMART Objective:
By the end of Quarter 2 (within 6 months), increase full-time staff capacity by 25% and implement monthly staff training sessions to improve program delivery efficiency and reduce implementation delays.
Implementation Plan:
Secure supplemental funding through [specific grant opportunities].
Partner with a local university to establish an intern pipeline.
Develop a quarterly professional development calendar.
Evaluation Metrics:
Staff-to-participant ratio.
Pre/post-training competency assessments.
Time-to-implementation for program components.
Strategy for Addressing Weaknesses/Threats:
This objective directly addresses the internal weakness of limited staff capacity and mitigates the threat of burnout by distributing workload and enhancing staff preparedness.
3.8.2. Enhancing Data Management and Program Monitoring
SMART Objective:
Implement an electronic data tracking system by Month 8, with 100% of program staff trained in its use by Month 9.
Implementation Plan:
Benchmark software used by peer programs.
Procure and install a user-friendly platform (e.g., REDCap or Salesforce).
Conduct hands-on training sessions.
Evaluation Metrics:
Data accuracy and completeness rates.
Frequency and timeliness of reporting.
Staff self-reported confidence in system use.
Strategy for Addressing Weaknesses/Threats:
Modernizing the data system improves monitoring and addresses the weakness of manual reporting while reducing risk exposure from documentation errors—a potential external threat.
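The "data accuracy and completeness" and "timeliness of reporting" metrics named above can be computed mechanically once records are exported from an electronic tracking system. The sketch below is illustrative only: the record fields, field names, and sample data are assumptions for demonstration, not part of the actual GTEC or REDCap system.

```python
from datetime import date

def completeness_rate(records, required_fields):
    """Share of records in which every required field is filled in."""
    if not records:
        return 0.0
    complete = sum(
        1 for r in records
        if all(r.get(f) not in (None, "") for f in required_fields)
    )
    return complete / len(records)

def on_time_rate(records, due_field="due", submitted_field="submitted"):
    """Share of records submitted on or before their due date."""
    dated = [r for r in records if r.get(submitted_field) and r.get(due_field)]
    if not dated:
        return 0.0
    on_time = sum(1 for r in dated if r[submitted_field] <= r[due_field])
    return on_time / len(dated)

# Hypothetical monthly progress reports from two scholars
reports = [
    {"scholar": "A", "mentor": "Dr. X", "summary": "draft done",
     "due": date(2024, 9, 1), "submitted": date(2024, 8, 30)},
    {"scholar": "B", "mentor": "", "summary": "analysis started",
     "due": date(2024, 9, 1), "submitted": date(2024, 9, 3)},
]

print(completeness_rate(reports, ["scholar", "mentor", "summary"]))  # 0.5
print(on_time_rate(reports))  # 0.5
```

In practice, such functions would run against routine exports from whatever platform the program adopts, turning the evaluation metrics into quantities that can be reported on a fixed schedule rather than assembled by hand.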
3.8.3. Expanding Community Partnerships and Outreach
SMART Objective:
Establish five new formal community partnerships within 12 months to enhance participant recruitment and expand service delivery.
Implementation Plan:
Map and engage organizations aligned with program goals (e.g., clinics, churches, schools).
Co-develop outreach activities (e.g., joint health fairs, educational workshops).
Formalize relationships via MOUs.
Evaluation Metrics:
Number and type of new partnerships.
Increase in participant referrals and retention.
Partner satisfaction survey results.
Strategy for Addressing Weaknesses/Threats:
This initiative leverages existing program strengths (community trust and cultural relevance) to overcome low engagement and mitigate threats related to program visibility and reach.
3.8.4. Stabilizing Program Funding and Increasing Sustainability
SMART Objective:
Diversify funding by submitting at least three major grant proposals and engaging two new private-sector sponsors within the next 12 months.
Implementation Plan:
Assign a development team or grant writer.
Identify and track funding cycles through a centralized system.
Host stakeholder briefings to align with funder priorities.
Evaluation Metrics:
Number of proposals submitted and awarded.
Funding diversification index.
Stakeholder investment and renewal rates.
Strategy for Addressing Weaknesses/Threats:
This recommendation addresses funding instability (external threat) while providing the financial base needed to improve internal operations, staffing, and long-term planning.
Ongoing Evaluation of Recommendation Impact
Progress toward each SMART objective will be monitored quarterly using a dashboard of key performance indicators. An external evaluator will conduct mid-year and end-of-year reviews to assess the impact of implemented recommendations on core program outcomes, such as participant satisfaction, service quality, and goal attainment. Findings will inform iterative program adjustments and strategic planning.
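As a simple illustration of how a quarterly dashboard check of this kind might work, the sketch below flags objectives whose cumulative progress trails a pro-rated share of the annual target. The objective names and figures are hypothetical, not actual GTEC data.

```python
def behind_target(objectives, quarter, total_quarters=4):
    """Return the names of objectives whose cumulative progress falls
    below the pro-rated share of the annual target for this quarter."""
    expected_share = quarter / total_quarters
    flagged = []
    for name, (achieved, annual_target) in objectives.items():
        if achieved < annual_target * expected_share:
            flagged.append(name)
    return flagged

# Hypothetical end-of-Q2 snapshot: (achieved so far, annual target)
snapshot = {
    "grant proposals submitted": (1, 3),      # expect >= 1.5 by Q2
    "community partnerships": (3, 5),         # expect >= 2.5 by Q2
    "staff trained on data system": (4, 10),  # expect >= 5 by Q2
}

print(behind_target(snapshot, quarter=2))
# ['grant proposals submitted', 'staff trained on data system']
```

A check like this makes the quarterly review routine: objectives on pace are passed over, and only those behind schedule are surfaced for corrective action or mid-course adjustment.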
4. Discussion
The JSU JHS GTEC evaluation described in this report examined GTEC’s activities as the first Graduate Training and Education Center of its kind attached to a major research study, and set the stage for the programmatic operations as they exist today. The results of this evaluation enabled the GTEC leadership to uncover program gaps and initiate a process to improve program quality that the leadership team believed would ensure overall program effectiveness. The evaluation process and the identified outcomes improved the JSU JHS GTEC leadership’s knowledge and understanding of the program outcomes and enabled the leadership team to make judgments about the overall program. The evaluation results stimulated adjustments that shaped subsequent program activities, actions, and decisions, incorporating the lessons learned from the evaluation.
The DHWS program is a unique strategy for providing rigorous academic and career-focused training to graduate students and has enabled the acquisition of competencies needed to impact cardiovascular disease management programs [14]. Being a successful team member is an important marker of all successful scientific accomplishments. JSU JHS GTEC scholars who aspire to become cardiovascular epidemiologists will need experience working as part of a multidisciplinary team that includes data scientists, bioinformaticians, physiologists, geneticists, molecular biologists, and mathematicians [15]. Many medical educators agree on the significance of strengthening formal programs integrating public health and clinical education, and of students’ understanding and implementation of epidemiological methods [16] [17].
Programs like JSU JHS GTEC are successful in developing graduate students in the area of cardiovascular epidemiology because they serve to build capacity and empower scholars to use evidence-based approaches and to develop innovative interventions to address heart disease and cardiovascular disease prevention on the road to reducing health disparities. This is one way to address the gaps in knowledge and to increase the competence of potential experts in public health and biomedical sciences as they acquire skills in clinical preventive services and quantitative methods of risk and outcomes assessment [18].
5. Conclusion
While the evaluation helped to identify the positive, constructive aspects of the JSU JHS GTEC program activities, it also revealed gaps that needed to be closed in order to enhance the scholars’ engagement and, ultimately, their scholastic benefits.
Overall, the information gathered from this phase of the evaluation was used to enhance program delivery initiatives. This type of information, coupled with additional quantitative data collection, is needed to provide the strongest possible recommendations. The overall evaluation plan also included more data that were integrated within the recommendations from the SWOT analyses, and the LC advisor and scholar assessments. Many of these recommendations offered by the informants in this study are reflected in the programmatic activities that are currently in operation in the JSU JHS GTEC today.
Acknowledgements
The Jackson Heart Study is supported by Contracts HHSN268201800010I, HHSN268201800011I, HHSN268201800012I, HHSN268201800013I, HHSN268201800014I, and HHSN268201800015I from the National Heart, Lung, and Blood Institute (NHLBI), with additional support from the National Institute on Minority Health and Health Disparities (NIMHD).
Disclaimer Statement
The views expressed in this manuscript are those of the authors and do not necessarily represent the views of the National Heart, Lung, and Blood Institute (NHLBI), the National Institute on Minority Health and Health Disparities (NIMHD), the National Institutes of Health, or the US Department of Health and Human Services.