
Are you evaluating your program? Ask the stakeholders!

Julie Huetteman, Ph.D., is the Strategic Initiatives Coordinator at Purdue Extension. She is serving as the National Association of Extension Program and Staff Development Professionals (NAEPSDP) eXtension Fellow for 2017.

My role as 2017 eXtension NAEPSDP Fellow for Program Evaluation was launched with the Diversity and Inclusion Issue Corps (now called the Impact Collaborative) in Cincinnati. Since then, I have attended online sessions for those projects to share progress, challenges, and accomplishments. In addition, I have been included on the Corps evaluation team to learn of feedback from project teams.

A recurring theme in this feedback was “stakeholder” involvement, described as 1) key to their program goals; 2) instrumental in providing external input, perspective, and support for their program; and 3) important to their next steps in program planning, implementation, and evaluation.

In my online interactions with project teams, I found myself repeating, “Have you asked them?” I reminded many to “keep asking questions” of their stakeholders, audience, participants, and attendees to connect to those perspectives, interests, and insights.

In education, the “expert” typically shares information or content. But do we know what is of interest to attendees? Do stakeholders understand what is being shared? Is the program of value to participants? How did the audience benefit from taking part in activities? Here is the good news: we don’t have to have all the answers. Instead, we can ask stakeholders questions to get those answers.

A lot is involved in planning, implementing, evaluating and reporting Extension programs, and we want to do the best we can. So, consider asking questions throughout and use feedback to inform your decisions.

  • Are you planning activities that encourage attendees to be active, involved and engaged? Check on current research for best practices, then ask the intended audience: “What activities would you find interesting to do?”
  • When deciding which topics are most important, check the literature, then ask a couple representatives of your future audience: “What topics are important to you?”
  • While planning the evaluation, check on practice guidelines, then ask stakeholders: “What questions might be asked to find out the value of this program?” Alternately, give them draft questions and ask: “Which ones work well to capture the value of the program for you?” followed by “How might you state a question to ask about its benefit to participants?”
  • In your outline or curriculum, schedule specific activities to involve and engage participants, such as asking verbal questions, posting polls, or sharing questions on a slide. Some examples: “Is there anything that needs to be clarified?” “Was this activity helpful?” “What was most valuable to you?” Also: keep questions going throughout; don’t wait until the end of the program to ask.

Ask your audience to 1) help clarify your planning efforts, 2) give feedback during your implementation, 3) craft questions for debriefing, and 4) review and reflect on the evaluation results. Avoid packing your program with so much content that you forget about — or don’t leave time or space for — getting to know the audience. Include questions that capture their ideas about the program, the value they see in it, and their experience of it. Key questions to get started might be: “Has the program met your needs?” “Is this activity/program of value to you?” “Is this of interest to you?” “What is important to you?” “How have you benefited from this presentation/program?”

Ask questions, then listen. Audience responses and feedback can guide your next steps for planning and evaluation. Make time to get to know, and connect with, the audience by asking about their thoughts or perceptions. Ask your audience – before, during and after your program – so that their perspective is the focus of your planning, activities, and evaluation.

Julie can be contacted at jhuettem@purdue.edu


Perspectives: Avoiding Stereotypes in Program Evaluation

Julie Huetteman, Ph.D., is the Strategic Initiatives Coordinator at Purdue Extension. She is serving as the National Association of Extension Program and Staff Development Professionals (NAEPSDP) eXtension Fellow for 2017.

As the 2017 eXtension NAEPSDP Fellow for Program Evaluation, I have been on a journey to expand my awareness and understanding relating to inclusion, and to look at evaluation from this perspective, since participating in the Diversity and Inclusion Corps in Cincinnati.

Quality versus Quantity

I often ponder the busy-ness of those working in Extension. We wear a lot of hats and have many roles, but in providing education to our county or state residents, we want to be sure we are doing the best we can. To help us think about the quality of programming, not just the quantity, I share these thoughts that put stakeholders first.

Another thoughtful and thought-provoking reading recommendation from my colleague, Dr. Pamala Morris, Assistant Dean/Director of Multicultural Programs at the College of Agriculture at Purdue University, led me to Whistling Vivaldi by Claude M. Steele.

This book on “how stereotypes affect us and what we can do” is grounded in how we humans perceive identity. It describes the ways in which stereotyping defines groups and characteristics, how pervasive it is, and how it can influence performance. When individuals experience identity threat, the fear of confirming a negative stereotype tied to a group they belong to, their performance suffers. Stereotype threat arises from many directions and affects how people perform in education settings, as well as in personal and professional situations.

What can we do?

In an education setting, researchers point to a two-part response:

  • Self-affirmation: reinforcing individuals’ sense of competence and worth.
  • Accomplished challenges: succeeding at genuinely challenging tasks can create a mindset that interrupts the negative restrictions of stereotypes.

For example, think of the message that women are not as good as men in math or science, and the resulting performance by women in STEM. Programming that affirms abilities in science — in combination with instruction and challenging STEM opportunities for accomplishment — can help in addressing the gap in performance associated with the stereotype.

Applying these concepts to our Extension setting, we can be deliberate in our efforts to maintain a keener awareness of our communities, to explore how we might affirm our stakeholders’ sense of self, and to provide quality instruction and challenges that encourage achievement in learning.

This awareness can help direct our program evaluation activities to address the participants’ experience and perspective, not our own as program deliverers. Consider asking stakeholders about their experiences, comforts, barriers, challenges, benefits, values, and accomplishments from participating in programs. Here is where we find the quality in our work!

Thanks again to Pamala Morris for sharing and recommending this book on the human situation we live and face every day.

For More Information

You can contact Julie at jhuettem@purdue.edu

Steele, C.M. (2010). Whistling Vivaldi. New York, NY: W.W. Norton. http://books.wwnorton.com/books/Whistling-Vivaldi/



Perspectives: Overcoming Bias in Program Evaluation

Julie Huetteman, Ph.D., is the Strategic Initiatives Coordinator at Purdue Extension. She is serving as the National Association of Extension Program and Staff Development Professionals (NAEPSDP) eXtension Fellow for 2017.

My eXtension NAEPSDP Fellowship for Program Evaluation 2017 started with the Diversity and Inclusion Corps in Cincinnati, and I have been exploring related resources, opportunities, and associations ever since. Here I share thoughts and reflections rather than a set of instructions. We need space and time to ponder our human experience and to learn about other perspectives, so that we can incorporate those thoughts as we plan, develop, deliver, and report on our Extension work.

Blindspot: Hidden Biases of Good People by Mahzarin R. Banaji and Anthony G. Greenwald is one book recommended to me by Dr. Pamala Morris, Assistant Dean/Director of the Office of Multicultural Programs in the College of Agriculture at Purdue University. The book draws on research into the human mind, examining how our biases toward race, ethnicity, gender, age, religion, and so on develop. The researchers share their Implicit Association Test (IAT), which measures how the brain associates people and groups with traits and values. These automatic preferences persist even when we express egalitarian beliefs.

The book prompted a lot of self-reflection about my perceptions of, and openness to, others. Completing the sample tests and activities had me assessing my views, thoughts, and actions toward others. It created time and space to think about our society and our human relations, from the personal and professional to the local, regional, and global.

We can apply these reflections to program evaluation efforts.

  1. Make time to reflect on our own hidden biases.
  2. Make opportunities to include our clients/participants in our activities. Invite the perceptions, thoughts, and direction of our stakeholders from the beginning, and throughout, as we work to plan, develop, deliver, and report program activities and evaluation approaches.

The ultimate result is that the opportunities made available are of value and benefit to stakeholders. Given the busy-ness of our jobs, these steps can be easy to overlook, but they are incredibly worthwhile.

I would like to send a special thank you to Pam, for sharing this resource with me at this moment in our society and for a time of reflection on our human interactions.

Julie Huetteman, Ph.D., Coordinator, Extension Strategic Initiatives, Purdue Extension

Banaji, M.R. & Greenwald, A.G. (2016). Blindspot: Hidden Biases of Good People. New York, NY: Bantam Books.



Data Jams Breathe Life Into Extension Reporting

Water, water everywhere, but not a drop to drink. That’s a sailor cast away on the high seas, surrounded by a never-ending ocean but without a sip of liquid to sustain life.

In like manner, extension administrators and faculty are surrounded by oceans of data but without the means to make sense of that data to help sustain their programs.

Christian Schmieder, eXtension Fellow and Qualitative Research Specialist at the University of Wisconsin Extension, says, “We collect over 1,200 impact statements and 600 to 800 program narratives every year. That’s a lot of time and money invested, but if we don’t use those data well, it’s money out the window.” Schmieder and his colleagues are changing that by teaching extension professionals how to deal with large amounts of textual data through the Data Jam Initiative.

What Is a Data Jam?

Single- or multi-day Data Jams provide an opportunity for extension colleagues to get together in an intensive experience to do analytic work using analysis software. The goal is to end the experience with concrete write-ups, models, theories and visualizations that can be shared with colleagues, partners and relevant stakeholders.

Josset Gauley, Program Development & Evaluation Specialist, FoodWIse, at UW-Extension, says, “A neat thing about the Data Jam is putting aside a whole day to talk about your program outside the normal work-day routine. When you have 5 to 10 people, all concentrated on the same task, you can accomplish a lot. You can see what’s missing, where the gaps are.”

In Data Jams, groups of 4 to 12 extension colleagues look at narratives and impact statements from central data collection systems to answer questions such as, “How do our programs affect those in poverty?” Creating and fostering a common institutional language is key to this process because “analysts need to agree on the meaning of core terms – for example what ‘program’ means, or what we institutionally mean when we say ‘poverty,’” according to Schmieder.

Schmieder continues, “Oftentimes we do analysis in silos; that means that the knowledge we produce remains also in silos, and we keep on re-inventing the wheel.” Data Jams – and the institutional use of specialized analysis software – allow extension personnel to build on each other’s analytic work to “radically reduce the amount of time it takes to analyze and synthesize massive amounts of data.”
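To make the “common institutional language” idea concrete, here is a minimal Python sketch, with an invented codebook and invented statements (this is not the Data Jam curriculum or its software), of what applying agreed-upon definitions to impact statements looks like:

    # Hypothetical sketch: applying a shared codebook to impact statements.
    # The codebook terms and statements are illustrative, not UW-Extension data.

    CODEBOOK = {
        "poverty": ["low-income", "poverty", "food insecurity"],
        "program": ["workshop", "curriculum", "training series"],
    }

    statements = [
        "Our workshop series reached 40 low-income families this spring.",
        "The new curriculum addresses food insecurity in rural counties.",
    ]

    def code_statement(text):
        """Return every agreed-upon code whose indicator terms appear in the text."""
        lowered = text.lower()
        return [code for code, terms in CODEBOOK.items()
                if any(term in lowered for term in terms)]

    for s in statements:
        print(code_statement(s), "->", s)

Actual Data Jams work in Qualitative Data Analysis Software rather than scripts, but the principle is the same: every analyst tags text against the same agreed definitions.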

Collecting usable and relevant data is one of the core challenges for institutional data sets. Engaging more colleagues in the analysis of data increases institutional buy-in into reporting, which ultimately strengthens data quality. During Spring 2017, John Pinkart, FoodWIse Program Coordinator, Oconto and Marinette Counties, Wisc., participated in a three-day Data Jam coordinated by Schmieder and colleagues. What Pinkart discovered was, “We do a good job of setting the context, describing what we’re doing and writing results narratives, but we really struggle to see a lot of evidence of behavior change and impact.” Pinkart’s FoodWIse program receives federal SNAP-Ed funding, so he’s very aware of the need for accessible, useful impact data. Pinkart says after the Data Jam, he and those he coaches will be “much more mindful in describing impact and focus more on holistically connecting direct education, policy system work, and results.”

Gauley allows that not every extension professional is excited about reporting and data evaluation. He says Data Jams are “super valuable” because educators “walk away with a better understanding of how their data are used, and they feel appreciated and more valued by the organization.”  Then, in the future, as they report outcomes and impacts to teachers, parents, policymakers, county boards and others interested in obesity prevention, they will do a better job.

What Is Next?

Schmieder and colleagues are taking a big-picture approach to data analysis and organizational change. Justin Smith, eXtension GODAN Fellow and county extension director in Mason County, Wash., and Schmieder are currently developing and testing analytic workflows based on Data Jams that can be used to seed crowd-analysis of massive data sets, and to quality-control automated categorization of data.
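As a toy illustration of that quality-control step, the sketch below, using invented labels, compares human-assigned Data Jam codes with an automated categorizer’s output and flags disagreements for human review:

    # Hypothetical sketch: spot-checking automated categorization against
    # human-assigned Data Jam codes. All labels here are invented.

    human_codes = ["poverty", "climate", "climate", "nutrition", "poverty"]
    auto_codes  = ["poverty", "climate", "nutrition", "nutrition", "climate"]

    matches = sum(h == a for h, a in zip(human_codes, auto_codes))
    print(f"Agreement: {matches / len(human_codes):.0%}")  # 60% in this toy case

    # Route every disagreement back to a human analyst for review.
    for i, (h, a) in enumerate(zip(human_codes, auto_codes)):
        if h != a:
            print(f"Record {i}: human={h!r}, automated={a!r} -> review")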

Last March, Smith and Schmieder co-conducted a Data Jam focusing on extension data related to climate change. Smith says, “Ultimately, someone will be able to query the eXtension system and be connected not only to extension literature but also local information and data sets from around the world about weather, health, vegetation, population and more to solve problems, such as those related to climate change.”

“We also want to connect people – experts with particular skills and knowledge to inform the data and help design research models,” Smith says, all with an eye toward giving educators the gulps of life-giving knowledge they need to serve their publics.

For More Information

For more information about Data Jams, contact Christian Schmieder at christian.schmieder@ces.uwex.edu or visit the Data Jam website: http://fyi.uwex.edu/datajams/

Contact Josset Gauley at: josset.gauley@ces.uwex.edu or 608-265-4975


News Roundup – April 2017

Knowledge

In much of academia, we think of knowledge as something that we possess, something that can be acquired and collected. We think of knowledge as having value for its own sake. However, we live in a world where knowledge is increasingly commoditized. There is more knowledge than ever, and most of it is freely available and easily shared. You will notice that many items in this roundup are about making sure people who need knowledge can not only find it, but find it at the right time.

What are your thoughts on the relationship between knowledge and Cooperative Extension? Share your comments by tweeting @eXtension4u and using the #coopext hashtag.

News Roundup

eXtension Impact Collaborative Food Systems Fellow Applications Due June 1

The Food Systems Impact Collaborative Program Fellow is responsible for leading the formation and success of the 2017-2018 Food Systems Collaborative, working with the food systems organizing committee, the eXtension Impact Collaborative Project Manager, and the eXtension team. The position will follow, and improve upon, the Impact Collaborative Operations Guide. This is expected to be a one-year commitment from a professional at a Land-Grant University at 0.50-0.70 FTE. The person in this position reports to the Issue Response Program Director. Learn more…

Coming Soon! The eXtension Impact Collaborative for Food Systems

Hundreds of Extension Professionals have already engaged in the Issue Corps experience with program ideas covering the entire spectrum of Extension. eXtension is looking to recruit more than 200 projects and 700 members for the next iteration, which will be called the Impact Collaborative for Food Systems. Get your ideas ready and watch for the call for applications coming this spring! Learn more about the upcoming call…

Reimagining Our Land Grant System as a Networked Knowledge Commons

In the fifth post of a series on Solving for Pattern, Jeff Piestrak (Cornell University Mann Library and eXtension Fellow) discusses the future of knowledge. He highlights both the complexities and the potential of a networked land grant system that is a relevant knowledge source in a world where people are increasingly consuming information in an on-demand, informal way at their convenience. Read Part 5… | Fellowship Final Report…

Data Visualization – Blog Series & Webinar

It has been estimated that 90% of the world’s data has been created in the past two years alone. How can scientists and educators explain and present data in clear and concise ways amidst all this noise? Laura Thompson, University of Nebraska and eXtension Fellow, offers some ideas in her blog series on data visualization.

Laura is also presenting a webinar on data visualization on May 30 at 11 am EDT. Learn more or register for the webinar…

Urban Extension Competencies

Extension professionals who serve in urban areas know that the job can be very different from rural Extension positions. What is unique about urban Extension, and what skills and competencies are needed for this type of work?

eXtension Fellow Publishes Article on Citizen Science in JOE

Katie Stofer, University of Florida and eXtension Fellow, recently published an article in the Journal of Extension, “Tools for Using Citizen Science in Environmental, Agricultural, and Natural Resources Extension Programs.” Read the article…

Upcoming Webinars

Civil Discourse Resources

Last fall, ECOP created a rapid response team on the topic of Civil Discourse and Race Relations. The team has worked to identify existing resources and expertise in the Cooperative Extension System to create a toolbox for educators addressing these topics in their programming. The team’s final report was approved at the ECOP meeting on April 19 and a website is soon to be unveiled. A webinar is scheduled for May 17th at 2 pm EDT, which will focus on the background and need, the work of the team, and recommendations moving forward. Webinar link…

Evaluating for Program Implementation and Integrity

This webinar will explore informal and formal ways to evaluate your program during implementation. Using the Logic Model, we will explore how evaluation can address program inputs and outputs. These evaluation efforts help in the assessment of how a program is delivered, and in the improvement of planned approaches and activities. We will look at evaluation examples and give the attendees opportunities to practice and respond via the chat. The presenter is Julie Huetteman, Coordinator for Extension Strategic Initiatives at Purdue University. May 18, 2017, at 2 pm EDT. Learn more or register…

The Power of Online Maps for Outreach

Are you looking to add interactivity to your Esri Story Map? Or are you simply ready to use online maps to share information with your colleagues or your outreach audience? Esri’s ArcGIS Online is one of the most powerful tools currently available to make and share online maps (and no programming knowledge required!). You can create maps using your own data in addition to using data posted online by people and organizations throughout the world. Shane Bradt, University of New Hampshire, will present on how to get started with the very popular Story Mapping tool and how to use your maps in outreach. May 23, 2017, at 2 pm EDT. Learn more or register for the webinar…

Don’t Miss These Recordings

Compassion and Security: Bridging the Gaps of the Refugee Crisis

Kayla Davis, University of Tennessee, discussed the definition of the term “refugee”. She also presented on the numbers of refugees in the world today, as well as where they are and who is taking them in; humanitarian governance and the refugee camps; and the US asylum process. Watch the recording…

Instructional Design: Strategies & Best Practices for Online/Blended Learning

Gwyn Shelle, Michigan State University, covered strategies and best practices for developing online and blended curriculum. She discussed content development and quality assurance, and demonstrated a variety of self-paced and cohort-based online courses, including different lecture types and interactive activities. Watch the recording…

Using Story Maps to Engage Your Audience

Shane Bradt, University of New Hampshire, discussed a compelling way to share your message with people in person and online. Esri Story Maps offer a method for Cooperative Extension professionals to communicate information and engage their audience using a graphical and narrative approach, with no coding experience needed. Watch the recording…

Webinar recordings and related resources are posted on the link for that webinar in Learn, often within 24 hours after the webinar ends.


Building Evaluation Capacity Through Data Jams, Part 3: Readying Extension for the Systematic Analysis of Large Qualitative Datasets

In this third blog post on the University of Wisconsin-Extension Data Jam Initiative, I will focus on four institutional outcomes of this Evaluation Capacity Building Framework.

Screenshot from the University of Wisconsin-Extension Civil Rights Legacy Datasets in MAXQDA.

INSTITUTIONAL OUTCOME 1: Continuous Use of Institutionally Collected Data

The Data Jam Initiative provides colleagues with the tools, skills, support and community they need to engage in the analysis of large, often fragmented and hard-to-analyze textual datasets. We are currently conducting a longitudinal study measuring the initiative’s impact on analytic self-confidence and proficiency. At this early stage we observe heightened empowerment among Extension professionals, and we see a steep increase in evaluation, research and internal development projects that utilize data from our central data collection system.

INSTITUTIONAL OUTCOME 2: Improvement of Institutional Data Quality

An essential element of the Data Jam Initiative is to communicate to colleagues and leadership how data are being used. Institutionally, this validates colleagues’ efforts regarding reporting, and it supports leadership in adjusting data collection foci based on ongoing, interdisciplinary data analysis. This, in turn, helps keep institutional research, evaluation and communication efforts in alignment with ongoing data collection and storage.

INSTITUTIONAL OUTCOME 3: Building Interdisciplinary Capacity to Quickly Respond to Emerging Analytic Needs

All-Program area Evaluator Data Jam at the University of Wisconsin-Extension, March 2017.

Over time we create a baseline of shared techniques for analysis, and distributed proficiency in utilizing Qualitative Data Analysis software. Consequently, colleagues can tap into shared analytic frameworks when they collaborate on projects. On a larger scale, the institution can quickly and flexibly pull together analysis teams from across the state, knowing that a number of colleagues already share fundamental analytic and technical skills, even if they have never directly worked together. This allows an institution to respond quickly and efficiently to time-sensitive inquiries, and to analyze more data more quickly, while bringing more perspectives into the process through work in larger ad-hoc analysis teams.

INSTITUTIONAL OUTCOME 4: Retaining Analytic Work through Legacy Datasets

Qualitative Data Analysis Software is designed to allow for detailed procedural documentation during analysis. This allows us to retain the analytic work of our colleagues, and to merge it into a single file. For example, we created a “Civil Rights Legacy Dataset” – a Qualitative Data Analysis Software file that contains all programming narratives with information on expanding access to underserved or nontraditional audiences, currently from 2014 to 2016. This amounts to approximately 1,000 records, or 4,000 pages, of textual data. The file is available to anyone in the institution interested in learning about best practices, barriers and programmatic gaps regarding our work with nontraditional and underserved audiences.

The analyses currently being conducted on this dataset by various teams are merged back into the “Legacy File”. Future analysts can view the workbenches of prior analysts and projects, allowing them to use prior insights and processes as stepping stones. This enables the institution to conduct meta-analyses, maintain analytic continuity, and more easily and reliably distribute analytic tasks over time or across multiple analysts. You can find more information on the use of Legacy Datasets in Extension in an upcoming book chapter in Silver & Woolf’s textbook on utilizing Qualitative Data Analysis Software.
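To illustrate the merging idea in miniature (this is invented data and plain Python, not the actual MAXQDA workflow), the sketch below combines coded segments from two hypothetical analyst files into one legacy structure keyed by record ID:

    # Hypothetical sketch: merging coded segments from two analysts' project
    # files into one legacy dataset keyed by record ID. Data are invented.

    from collections import defaultdict

    analyst_files = {
        "team_a": [("rec_014", "expanded access", "Offered evening sessions...")],
        "team_b": [("rec_014", "barrier: transportation", "No bus route serves...")],
    }

    legacy = defaultdict(list)  # record ID -> list of (analyst, code, excerpt)
    for analyst, segments in analyst_files.items():
        for record_id, code, excerpt in segments:
            legacy[record_id].append((analyst, code, excerpt))

    # A future analyst can now see every prior coding pass on a record.
    for record_id, passes in sorted(legacy.items()):
        print(record_id)
        for analyst, code, excerpt in passes:
            print(f"  [{analyst}] {code}: {excerpt}")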

Beyond Qualitative Data: A Pathway for Building Digital Learning and Adaptation Skills

The outcomes above are the immediate institutional effects the Data Jam Initiative was designed for. But perhaps more importantly, we’re creating a baseline of proficiency in negotiating between a technical tool and a workflow. Our tools change. Our methodological approaches differ from project to project. Each new project and each new digital tool requires that we engage in this negotiation process. Every time, we need to figure out how we can best use a tool to facilitate our workflows; this skill is a fundamental asset in institutional professional development, and it transcends the topical area of evaluation.

This means that the Data Jam Initiative, as an approach focused on mentorship and making by imbuing a technical tool with concrete, relevant processes, is not limited to qualitative data – it can be a framework for many contexts in which Extension professionals use software to do or build things: be it visualization tools, digital design and web design, app development, statistics and quantitative research, or big data tools.

The development of the Data Jam Initiative Tool Kit has been supported by an eXtension Fellowship. To access the curriculum, examples, videos and training materials, please visit the UW-Extension Data Jam website: http://fyi.uwex.edu/datajams/


Building Evaluation Capacity Through Data Jams, Part 2: Software as a Teaching Tool for Data Analysis

In this second blog post on the University of Wisconsin-Extension Data Jam Initiative, I will focus on the role of software in Data Jams, and on the skills that colleagues are building in this Evaluation Capacity Building Framework.
Screenshot from the Qualitative Data Analysis Software MAXQDA.


What is Qualitative Data Analysis Software?

The technical backbone of the Data Jam Initiative is Qualitative Data Analysis Software – often abbreviated as QDAS, or CAQDAS (Computer Assisted Qualitative Data Analysis Software). This type of research and evaluation software is designed to support the analysis of large amounts of textual information. It allows for efficient data management and the distributed analysis of large datasets across large teams. While Qualitative Data Analysis Software (such as MAXQDA, NVivo, ATLAS.ti or Dedoose) cannot do qualitative analysis by itself, modern packages typically offer a wide array of options for coding, documentation, teamwork, qualitative data visualization and mapping.
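For readers new to this software category, here is a minimal Python sketch (illustrative only; the packages above use their own file formats and interfaces) of the core record such tools manage, a code attached to a span of a source document with an analyst memo, plus the basic retrieval step:

    # Hypothetical sketch of the core record a QDAS package manages: a code
    # attached to a span of a source document, plus the analyst's memo.

    from dataclasses import dataclass

    @dataclass
    class CodedSegment:
        document: str  # source narrative or impact statement
        start: int     # character offsets of the coded span
        end: int
        code: str      # e.g., "expanded access"
        memo: str      # documentation of the coding decision

    segments = [
        CodedSegment("narrative_2015_031.txt", 120, 208, "expanded access",
                     "Spanish-language outreach; fits our agreed definition."),
    ]

    def retrieve(segments, code):
        """The basic QDAS retrieval step: pull every coded span for one code."""
        return [s for s in segments if s.code == code]

    print(retrieve(segments, "expanded access"))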

Focusing on Analytic Collaboration, Not on Where to Click

Data Jam at the University of Wisconsin-Extension, August 2016.

In a Data Jam, groups of colleagues analyze data together while using the same analytic software tool, and similar analytic techniques. This creates a common experience of bringing a tool (the software) and a process (the analytic techniques) together. We’re not teaching how to click through menus; we’re not teaching theoretical workflows. We analyze, we make things – with a real tool, a real question, real data and concrete techniques. These concrete analytic techniques emphasize writing and documentation throughout the process, and they focus on utilizing analysis groups to drive the analysis. In a Data Jam, colleagues practice how to stay focused on their research question, and how to work as an analysis group to produce a write-up at the end of the day.

Qualitative Data Analysis Software empowers colleagues to quickly explore our large datasets, and to dive into the data in an engaging way – as such, this software is a powerful tool to illustrate and practice methodological workflows and techniques. We’re not only building individual capacity – we are building a community of practice around data analysis in our institution. I will focus on this aspect in the third blog post, but I will briefly describe outcomes on the individual level here.

Individual Capacity Building & Improved Perception of Institutional Data Collection

On the individual level, we are seeing two outcomes in our ongoing evaluation of the initiative. Firstly, we build analytic capacity and evaluation capacity. Colleagues learn how to analyze textual data using state-of-the-art analytic tools, and they learn how to integrate these tools into their evaluation and research workflows. Watch the 3-minute video below for some impressions and learning outcomes from a 4-day Data Jam for Extension research teams.

Secondly, colleagues gain a better understanding of how (and that!) the data they enter in the central data collection system are being used. Our evaluations show that colleagues leave our Data Jams with an increased understanding of why we collect data as an institution, and of why it is important to enter quality data. Experiencing the role of the analyst seems to have a positive effect on colleagues’ perceptions of our central data collection effort, and leaves them excited to tell their colleagues how the data are being used.

Not every colleague will use the software or engage in research in the future; our goal is not to make everyone an analyst. But we establish a basic level of data literacy across the institution – i.e. a common understanding of the procedures, products, pitfalls and potentials of qualitative data analysis. This type of data literacy is a crucial core skill as we are undergoing the Data Revolution.

The development of the Data Jam Initiative Tool Kit has been supported by an eXtension Fellowship. To access the curriculum, examples, videos and training materials, please visit the UW-Extension Data Jam website: http://fyi.uwex.edu/datajams/


Building Evaluation Capacity Through Data Jams, Part 1: A Response to the Data Challenge

Collecting large amounts of textual data is easier than ever – but analyzing those growing amounts of data remains a challenge. The University of Wisconsin-Extension responds to this challenge with the “Data Jam Initiative”, an Evaluation Capacity Building model that focuses on the collaborative, making-centered use of Qualitative Data Analysis Software. In this first of three blog posts I will provide a brief overview of the Initiative, the tools we’re using, and the products we’re making in Data Jams.

Data Jam at the University of Wisconsin-Extension, August 2016.

Extension’s Data Challenge

Extensions collect large amounts of textual data, for example in the form of programming narratives, impact statements, faculty activity reports and research reports, and they continue to develop digital systems to collect and store these data. Collecting large amounts of textual data is easier than ever. Analyzing those growing amounts of data remains a challenge. Extensions and other complex organizations are expected to use data when they develop their programs and services; they are also expected to ground their communications and reports to stakeholders in rigorous data analysis.

Collaborative, Software-Supported Analysis as a Response

Data Jam at the University of Wisconsin-Extension, August 2016.

The University of Wisconsin-Extension responds to this expectation with the Data Jam Initiative, an Evaluation Capacity Framework that utilizes Qualitative Data Analysis Software. In monthly full-day Data Jams and multi-day analysis sessions, colleagues meet to explore and analyze data together. Data Jams are inspired by the concept of Game Jams. In Game Jams, game developers meet for a short amount of time in order to produce quick prototypes of games.

Asking Real Questions, Analyzing Real Data

The most important feature of Data Jams is that we work with data and questions that are relevant to our colleagues; in fact, most topics in Data Jams are brought up by specialists and educators from across the state.  By collaboratively analyzing programming narratives and impact statements from our central data collection system, we start answering questions like:

  • How are equity principles enacted in our Community Food Systems-related work?
  • How do our colleagues state-wide frame their work around ‘poverty’?
  • How does state-wide programming in Agriculture and Natural Resources address Quality Assurance?
  • How are youth applying what they’ve learned in terms of life skills in our state-wide 4-H and Youth Development programming?
  • How does FoodWIse (our state-wide nutrition education program) partner with other organizations, both internally and externally?
Data Jam products are shared with colleagues across the institution via our internal Data Jam blog.

Using Qualitative Data Analysis Software, Data Jammers produce concrete write-ups, models, initial theories and visualizations; these products are subsequently shared with colleagues, partners and relevant stakeholders.
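As a small illustration (invented codes, not actual Data Jam output), the sketch below turns a list of applied codes into the kind of simple frequency summary a write-up or visualization might start from:

    # Hypothetical sketch: turning applied codes into the simple frequency
    # summary a Data Jam write-up or chart might start from.

    from collections import Counter

    codes_applied = [
        "equity", "partnership", "equity", "life skills",
        "partnership", "equity", "quality assurance",
    ]

    counts = Counter(codes_applied)
    total = sum(counts.values())
    for code, n in counts.most_common():
        print(f"{code:18} {'#' * n} ({n}/{total})")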

Building Institutional Capacity to Analyze Large Datasets

Data Jam at the University of Wisconsin-Extension, February 2017
Data Jam at the University of Wisconsin-Extension, February 2017

Through the Data Jam Initiative, we build institution-wide capacity in effectively analyzing large amounts of textual data. We connect teams, researchers, evaluators and educators to develop commonly shared organizational concepts and analytic skills. These shared skills and concepts in turn enable us to distribute the analysis of large data sets across content and evaluation experts within our institution. The overall goal of the initiative is to enable our institution to systematically utilize large textual datasets.

Since early 2016, we have used the Data Jam model in monthly one-day Data Jams across Wisconsin, in regular internal consulting and retreat sessions for project and program area teams, and in graduate methods education on the UW-Madison campus. We have hosted external Data Jams on the Washington State University campus in Pullman and at the United Nations Office of Internal Oversight Services (OIOS).

The development of the Data Jam Initiative Tool Kit has been supported by an eXtension Fellowship. To access the curriculum, examples, videos and training materials, please visit the UW-Extension Data Jam website: http://fyi.uwex.edu/datajams/