Measuring What Matters: Weaving Science, Service, and Systems Into Data and Evaluation Practice


By Chithra Adams

For nearly three decades, VentureWell has been at the forefront of evaluating STEM innovation and entrepreneurship (I&E) programs. Our approach spans multiple levels—individuals, innovation, institutions, and ecosystems—providing a comprehensive view of the I&E landscape.

As a learning organization, we are dedicated to continuously improving how we innovate by using data to generate insights, measure impact, and refine our strategies—and we are committed to openly sharing what we discover. We combine key metrics with feedback and analysis to inform decision-making. We think creatively about our approach and learn from everything we do, ensuring that our work remains effective, adaptable, and accessible to all.

Three key threads are woven into all our evaluations: the science of evaluation in I&E, insightful service design, and robust data systems. These interwoven principles form the fabric of our approach to data intelligence, equipping us with the knowledge, perspective, and insights needed to drive meaningful change.

Science

We draw on a range of evaluation theories to guide our practice. These theories, such as developmental evaluation, help us determine which approaches and methods to use given the program context, the goals of the evaluation, and how value judgments will be constructed. In addition to evaluation theories, we use the research literature on I&E to guide instrument development, examine I&E constructs, and apply relevant frameworks such as lean startup. By integrating evaluation theories with the I&E literature, we develop models that connect program interventions and strategies with tailored evaluation approaches.

Service

A defining feature of evaluation services is that the value of their tangible and intangible components cannot be separated: value emerges from the interactions among clients, evaluators, and data and evaluation products. As we implement data and evaluation activities, we deliberately sequence those interactions and products so that each informs the other, keeping our evaluation efforts aligned with programmatic contexts. Using data and insights to make decisions and judgments is inherently a human activity, so intentional design and implementation of evaluation interactions and products ensure that the insights generated are both useful and timely.

Systems

STEM I&E is a complicated process, and tracking its outcomes requires a data system and architecture aligned with both individual and institutional I&E journeys. This data system must capture a wide range of I&E outputs, including patents, business formation, and funding raised. We use several procedures, processes, and automations to ensure that the data is accurate, verifiable, and easily retrievable for analysis and reporting.
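To make that data-quality idea concrete, here is a minimal sketch in Python of how outcome records might be validated before entering a reporting pipeline. The schema, field names, and validation rules are hypothetical illustrations of the general approach, not VentureWell's actual data model or tooling.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical record schema for a single I&E outcome; the fields are
# illustrative, not VentureWell's actual data model.
@dataclass
class OutcomeRecord:
    venture_id: str               # unique identifier for the venture
    outcome_type: str             # e.g., "patent", "business_formation", "funding"
    amount_usd: Optional[float]   # populated only for funding outcomes
    reported_on: date             # when the outcome was reported
    source: str                   # provenance (survey, extant data source, ...)

VALID_TYPES = {"patent", "business_formation", "funding"}

def validate(record: OutcomeRecord) -> list[str]:
    """Return a list of data-quality issues; an empty list means the record passes."""
    issues = []
    if record.outcome_type not in VALID_TYPES:
        issues.append(f"unknown outcome_type: {record.outcome_type}")
    if record.outcome_type == "funding" and (record.amount_usd is None or record.amount_usd <= 0):
        issues.append("funding outcomes need a positive amount_usd")
    if record.reported_on > date.today():
        issues.append("reported_on is in the future")
    if not record.source:
        issues.append("missing source, so the record cannot be verified")
    return issues

# Example: flag a funding record that is missing its dollar amount.
bad = OutcomeRecord("v-042", "funding", None, date(2024, 5, 1), "annual survey")
print(validate(bad))  # ['funding outcomes need a positive amount_usd']
```

Checks like these, run automatically as records arrive rather than at reporting time, are one way to keep outcome data accurate and verifiable at scale.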

We collaborated with the American Evaluation Association to share insights into our work. The links below present some of our perspectives, processes, and lessons learned in implementing this holistic approach to I&E evaluation.

Insights, Strategies, and Best Practices for Understanding Impact

Driving Innovation with Developmental Evaluation by Jacki Purtell | How developmental evaluation helps ecosystems learn about success in their local context and emerging best practices.

Anchors Away! Navigating the Evaluation of STEM Innovation & Entrepreneurship Programming by Samantha Langan | The I&E ecosystem framework can help organizations define program logic and goals, including the role of partners from academia, government, and industry in shaping their services.

What Evaluators Can Learn From The Lean Startup by Stefanie Leite and Olivia Noel | How the lean startup framework can improve evaluation practice.

Effective Data Management in Evaluation of Innovation & Entrepreneurship by Stefanie Leite and Olivia Noel | Lessons learned in developing data management for programs of varying scales.

Using Extant Data To Understand Impact and Reach by Emily Markese and Adam Blandford | Using extant data sources to examine impact and reach at the institutional level.

Bringing Everyone In: Evaluating Inclusive Innovation by Jacki Purtell, Samantha Langan, and Polly Todd | Collaborative practice: bringing partners together in evaluation design and collective sense-making.

The Future of Evaluation of STEM Innovation and Entrepreneurship by Chithra Adams | Three future directions for evaluators to play a greater role in STEM I&E.

Building a Strong Foundation: Understanding Data Architectures by Jahannie Torres-Rodríguez and Joey Kleiner | Lessons learned in building a data architecture around STEM I&E programming.

Scaling with Data Pipelines by Jahannie Torres-Rodríguez and Harrison Sharitt | How data pipelines can make data processes efficient and consistent.

Reducing Manual Work with Automations by Jahannie Torres-Rodríguez and Grace Wang | Implementing automations through Salesforce flows.

Prioritizing Data Quality by Olivia Arstein-Kerslake and Lauren Baur | Exploring rigorous approaches and practices to maintain data quality.

Creating a Data Use Working Group: The VentureWell Data Champions Circle by Jahannie Torres-Rodríguez and Olivia Arstein-Kerslake | Building capacity among teams around data use.

How To Use Templates for Your Data Products by Olivia Arstein-Kerslake and Wen Yi Aw | Templates to help kickstart documentation of flows and automations.

Asking the Right Questions from Your Data Systems by Olivia Arstein-Kerslake and Samantha Langan | Three practical tips to help you ask the right questions of a data system.
