Categories
Fellowships Food Systems

Digital Green Helps Solve Food System Challenges with Digital Technology

Digital Green is a global non-profit development organization that empowers smallholder farmers to lift themselves out of poverty by harnessing the collective power of technology and grassroots-level partnerships. They partner with extension actors in developing countries to solve problems like market access, farmer training, and rural nutrition education using digital technology. Digital Green developers build cutting-edge software, such as mobile apps and online data collection and analysis tools, to benefit farmers worldwide.

After much success working in developing countries, Digital Green now seeks to partner with United States Cooperative Extension to pilot digital solutions in our local food systems. This is an exciting opportunity for Extension professionals looking for a strong partner to help implement local solutions through technology!

With sponsors like the Bill & Melinda Gates Foundation and the U.S. Agency for International Development, and a focus on grassroots change through Extension, Digital Green is poised to be a fruitful partner for Extension in the US. All they need are your ideas, vision, and partnership!

Jennifer Cook, the Digital Green eXtension Fellow, is looking for your local ideas and wants to help you implement a pilot project, partnering with Digital Green to develop digital solutions in your local food system. “We want to help you and your community develop efficient and practical digital solutions. You know the local challenges in farm production, market access, nutrition, food waste, and education. A partnership with Digital Green can help you transform obstacles into solutions.”

The video below is a recorded webinar on this opportunity.

If a project meets the criteria outlined in the webinar above, Digital Green will partner to develop, implement, and evaluate the pilot project. Total estimated costs are $100,000. A portion of the costs will be contributed by Digital Green; securing outside funding may be needed for the remainder.

Interested? Get involved by participating in the Impact Collaborative, and contact Jennifer Cook (Jennifer.cook@colostate.edu), the Digital Green Fellow, to discuss the opportunity to pilot your ideas for digital solutions in your food system!

Categories
Diversity & Inclusion Fellowships Information NAEPSDP

Are you evaluating your program? Ask the stakeholders!

Julie Huetteman, Ph.D., is the Strategic Initiatives Coordinator at Purdue Extension. She is serving as the National Association of Extension Program and Staff Development Professionals (NAEPSDP) eXtension Fellow for 2017.

My role as 2017 eXtension NAEPSDP Fellow for Program Evaluation was launched with the Diversity and Inclusion Issue Corps (now called the Impact Collaborative) in Cincinnati. Since then, I have attended online sessions for those projects to share progress, challenges, and accomplishments. In addition, I have been included on the Corps evaluation team to learn of feedback from project teams.

A theme in this feedback was “stakeholder” involvement, expressed as 1) key to their program goals; 2) instrumental in providing external input, perspective and support for their program; and 3) important in their next steps to move forward in program planning, implementation and evaluation efforts.

In my online interactions with project teams, I found myself repeating, “Have you asked them?” I reminded many to “keep asking questions” of their stakeholders, audience, participants, and attendees to connect to those perspectives, interests, and insights.

We don’t have to have all the answers. Instead, consider asking questions of stakeholders to get those answers.

In education, the “expert” typically shares information or content. But do we know what is of interest to attendees? Do stakeholders understand what is being shared? Is the program of value to participants? How did the audience benefit from taking part in activities? Here is the good news: We don’t have to have all the answers. Instead, consider asking questions of stakeholders to get those answers.

A lot is involved in planning, implementing, evaluating and reporting Extension programs, and we want to do the best we can. So, consider asking questions throughout and use feedback to inform your decisions.

  • Are you planning activities that encourage attendees to be active, involved and engaged? Check on current research for best practices, then ask the intended audience: “What activities would you find interesting to do?”
  • When deciding which topics are most important, check the literature, then ask a couple representatives of your future audience: “What topics are important to you?”
  • While planning the evaluation, check on practice guidelines, then ask stakeholders: “What questions might be asked to find out the value of this program?” Alternately, give them draft questions and ask: “Which ones work well to capture the value of the program for you?” followed by “How might you state a question to ask about its benefit to participants?”
  • In your outline or curriculum, schedule specific activities to involve and engage participants, such as asking verbal questions, posting polls, sharing questions on a slide, and so on. Some examples: “Is there anything that needs to be clarified?” “Was this activity helpful?” “What was most valuable to you?” Also: Keep questions going throughout; don’t wait until the end of the program to ask.

Avoid packing your program with so much content that you forget about — or don’t leave time or space for — getting to know the audience.

Ask your audience to 1) help clarify your planning efforts, 2) give feedback during your implementation, 3) craft questions for debriefing, and 4) review and interpret the evaluation results. Avoid packing your program with so much content that you forget about — or don’t leave time or space for — getting to know the audience. Include questions to learn their ideas about the program, the value they perceive in it, and their experience of it. Key questions to get started might be: “Has the program met your needs?” “Is this activity/program of value to you?” “Is this of interest to you?” “What is important to you?” “How have you benefited from this presentation/program?”

Ask questions, then listen. Audience responses and feedback can guide your next steps for planning and evaluation. Make time to get to know, and connect with, the audience by asking about their thoughts or perceptions. Ask your audience – before, during and after your program – so that their perspective is the focus of your planning, activities, and evaluation.

Julie can be contacted at jhuettem@purdue.edu

Categories
Fellowships Information

Perspectives: Avoiding Stereotypes in Program Evaluation

Julie Huetteman, Ph.D., is the Strategic Initiatives Coordinator at Purdue Extension. She is serving as the National Association of Extension Program and Staff Development Professionals (NAEPSDP) eXtension Fellow for 2017.

As the 2017 eXtension NAEPSDP Fellow for Program Evaluation, I have been on a journey to expand my awareness and understanding relating to inclusion, and to look at evaluation from this perspective, since participating in the Diversity and Inclusion Corps in Cincinnati.

Quality versus Quantity

I often ponder the busy-ness of those working in Extension. We wear a lot of hats and have many roles, but in providing education to our county or state residents, we want to be sure we are doing the best we can. To help us think about the quality of programming, not just the quantity, I share these thoughts that put stakeholders first.

Another thoughtful and thought-provoking reading recommendation from my colleague, Dr. Pamala Morris, Assistant Dean/Director of Multicultural Programs at the College of Agriculture at Purdue University, led me to Whistling Vivaldi by Claude M. Steele.

This book on “how stereotypes affect us and what we can do” is based on our human perception of identity. It shares the ways in which stereotyping defines groups and characteristics, how pervasive it is, and how it can influence performance. When individuals experience identity threat from associated restrictive characteristics, their performance is negatively affected. Stereotype threats occur from many perspectives and affect how people perform in education settings, as well as personal and professional situations.

What can we do?

In an education setting, researchers share a two-part explanation:

  • Self-affirmation: reinforcing a sense of competence and worth.
  • Accomplishing challenges: successes can create a mindset that interrupts the negative restrictions of stereotypes.

For example, think of the message that women are not as good as men in math or science, and the resulting performance by women in STEM. Programming that affirms abilities in science — in combination with instruction and challenging STEM opportunities for accomplishment — can help in addressing the gap in performance associated with the stereotype.

Applying these concepts to our Extension setting, we can be deliberate in our efforts to maintain keener awareness of our communities, to explore how we might affirm our stakeholders’ senses of self, and to provide quality instruction and challenges that encourage achievement in learning.

This awareness can help direct our program evaluation activities to address the participants’ experience and perspective, not our own as program deliverers. Consider asking stakeholders about their experiences, comforts, barriers, challenges, benefits, values, and accomplishments from participating in programs. Here is where we find the quality in our work!

Thanks again to Pamala Morris for sharing and recommending this book on the human situation we live and face every day.

For More Information

You can contact Julie at jhuettem@purdue.edu

Steele, C.M. (2010). Whistling Vivaldi. New York, NY: W.W. Norton. http://books.wwnorton.com/books/Whistling-Vivaldi/

 

Categories
Diversity & Inclusion Fellowships Information

Perspectives: Overcoming Bias in Program Evaluation

Julie Huetteman, Ph.D., is the Strategic Initiatives Coordinator at Purdue Extension. She is serving as the National Association of Extension Program and Staff Development Professionals (NAEPSDP) eXtension Fellow for 2017.

My eXtension NAEPSDP Fellowship for Program Evaluation 2017 started with the Diversity and Inclusion Corps in Cincinnati. I have been exploring related resources, opportunities, and associations ever since. Here I share thoughts and reflections more so than a set of instructions. We need space and time to ponder our human experience and learn about other perspectives to incorporate those thoughts as we plan, develop, deliver and report on our Extension work.

Blindspot: Hidden Biases of Good People by Mahzarin R. Banaji and Anthony G. Greenwald is one book recommended to me by Dr. Pamala Morris, Assistant Dean/Director of the Office of Multicultural Programs in the College of Agriculture at Purdue University. The book is about research on our human minds that looks at how our biases develop toward race, ethnicity, gender, age, religion and so on. The researchers share their Implicit Association Test (IAT), which measures how the brain associates people and groups with traits and values. These automatic preferences persist even when egalitarian beliefs are expressed.

For me, the result was a great deal of self-reflection about perceptions and openness to others. Completing the sample tests and activities had me assessing my views, thoughts, and actions about and toward others. It created time and space to think and reflect on our society and our human relations across personal and professional, local and regional, and global contexts.

We can apply these reflections to program evaluation efforts.

  1. Make sure we make time to reflect on our own hidden biases.
  2. Make opportunities to include our clients/participants in our activities. Invite the perceptions, thoughts, and direction of our stakeholders from the beginning, and throughout, as we work to plan, develop, deliver, and report program activities and evaluation approaches.

The ultimate result is that the opportunities made available are of value and benefit to stakeholders. Given the busy-ness of our jobs, these steps can be easy to overlook, but they are incredibly worthwhile.

I would like to send a special thank you to Pam, for sharing this resource with me at this moment in our society and for a time of reflection on our human interactions.

Julie Huetteman, Ph.D., Coordinator, Extension Strategic Initiatives, Purdue Extension

Banaji, M.R. & Greenwald, A.G. (2016). Blindspot: Hidden Biases of Good People. New York, NY: Bantam Books.

 

Categories
Community Design Diversity & Inclusion Extension Fellowships Food Systems i-Three Corps i-Three Event Impact Information Information Technology Innovation Issue Response Social Networking Technology Working Out Loud

Solving for Pattern: Reimagining our Land Grant System as Networked Knowledge Commons, Part 5

Optimizing for Health: Linking Land Grant Knowledge Assets in Support of Healthy People, Food Systems and Communities.

When a living system is suffering from ill health, the remedy is found by connecting with more of itself.

– Francisco Varela

Village rice fields, Shirakawa-go, Gifu-ken Japan (photo by Joel Abroad, https://www.flickr.com/photos/40295335@N00/4888037629/).

The scene above is an example of “satoyama”, a traditional Japanese agricultural landscape where different land uses are maintained in an integrated and harmonious manner over many generations. More broadly it is also what some call a Socio-Ecological Production Landscape (SEPL), supporting both human well-being and biodiversity through sustainable production systems. The Community Development and Knowledge Management for the Satoyama Initiative (COMDEKS) helps sustain and promote SEPLs across the globe by collecting and distributing knowledge and experience from successful on-the-ground practices for replication and upscaling in other parts of the world.

Though perhaps in a less idealized way, I and many of my Community, Local and Regional Food Systems CoP cohorts seek similar outcomes in the form of healthy, multi-functional[1] food systems adapted to local needs and conditions, including urban environments. Of course, as a Community of Practice, knowledge sharing is also of great interest to us. In this post I highlight several findings from my recent eXtension-supported Land Grant Informatics fellowship[2] relevant to realizing these kinds of outcomes by more effectively linking people, technology and information in support of our Land Grant mission and the diverse communities we serve.

“Emergent Health” as an Integrative Framework for Collaborative Action

One thing I sought out early in my investigations was broad systems models and definitions of health and fitness, touched on in my last post. From there, I explored what opportunities might exist for “convening an ecosystem”[3] of Land Grant actors around a shared set of informatics-related objectives supporting those models, in the form of healthy people, communities and food systems.

As it turns out, there is in fact a fair amount of existing momentum to build on, including that documented by the:

  • ECOP Health Task Force report, Cooperative Extension’s National Framework for Health and Wellness, prioritizing greater integration of nutrition, health, environment, and agricultural systems projects, followed up by the
  • APLU Healthy Food Systems, Healthy People initiative, calling for “collaborations and integration among agriculture, food, nutrition, and health care systems that have never before been explored or optimized. Working across these systems and developing solutions that combine multidisciplinary research and education”. The image below from that report illustrates those kinds of collaboration across various scales.

Figure 1 (from APLU Healthy Food Systems, Healthy People report) Integration must occur at many societal levels, including national, state/regional, community/local, and scientist/educator/practitioner.

Underlying many of these initiatives are social ecological models which view:

health as an ‘emergent property’ that results from different interactions among components of a complex, adaptive system. Together the individual determinants of health[4], and the system as a whole – including social and environmental determinants – can develop a high degree of adaptive capacity, resulting in resilience and the ability to address ongoing and new challenges… To achieve and maintain health over long periods, individuals must continually readjust how they… respond… to the changing demands of life… Social action also is required to create circumstances that can promote individual and population health.[5]

This emergent, adaptive view of health echoes that of many others, including those I’ve quoted earlier in this series. It suggests a shift away from top down, one-size-fits-all prescriptive approaches often focused on treating the symptoms of dis-ease toward more facilitative ones (e.g. building soil health as a foundation for healthy food systems). A complex adaptive systems approach would also require the arrows in the diagram above going in both directions, allowing knowledge and insight to flow “up” and down, as well as laterally (e.g between communities).

I document in my report[2] similar efforts and voices across various disciplines and sectors, some aimed squarely at tackling wicked problems like food insecurity and climate change. Many highlight the critical role networks and “boundary spanners” like Cooperative Extension professionals can play in supporting connectivity and feedback, one “circumstance” vital to the health of these systems, including sustainable agricultural systems. They do that partly by enhancing the ability of people, organizations and communities to recognize and leverage multiple forms of capital. That includes data, information and knowledge resources supporting ongoing learning and innovation locally, and collective intelligence on a larger, sometimes global scale.

Yet in spite of the many good reasons and calls for greater collaboration and integration, doing so remains a wicked challenge in itself, a task often at odds with our well-intentioned but increasingly outdated institutional, programmatic and funding structures. The good news is that a number of useful but underutilized tools and strategies already exist. What remains is for Land Grant actors (including Cooperative Extension and libraries like my own) to more systematically and collaboratively link and leverage these in support of network-centric approaches.

Networked Platforms and Stacks Supporting Emergent Learning

As I began to outline in my previous post, this transformation will require new “socio-technical” structures and capabilities. That means systems (including agrifood systems and Land Grant knowledge systems) where social and technical subsystems are optimized to support locally-directed, globally-connected problem solving and innovation, as well as the well-being of those (including Cooperative Extension) engaging with those systems.

In that post I also mention several architectural patterns commonly found in innovative environments, including networks, stacks and “emergent platforms”. If we look at health as an emergent property relying on these, then a systems approach would require the co-creation and maintenance of such structures. Jim Langcuster, now retired from Alabama Cooperative Extension, has written at length about the future of Cooperative Extension depending on its ability to embrace this type of work, transforming itself into an emergent, generative, “open source platform” developing “adaptive digital networks… responsive to the needs of contemporary learners”.

One central recommendation in my report is the collaborative development of a particular kind of knowledge commons, a “Land Grant Knowledge Graph” undergirding and/or linking such adaptive learning networks and platforms, like the underlying (light gray) one on the far right below, potentially supporting a variety of emergent, health-enhancing efforts.

Figure 2. Different kinds of networks serve different kinds of needs, including the emergence of complex adaptive networks. (Image from http://thewisdomeconomy.blogspot.com/2011/08/opportunities-in-chaos.html)

“Graph” knowledge structures, including graph databases, provide an ideal matrix for supporting these environments of connectivity, enabling the modeling of a variety of topics or entities and the relations between them. They offer many practical applications, including support for serendipitous discovery and linking of widely distributed expertise and knowledge artifacts, as illustrated through various research networking tools like VIVO and derivatives such as AgriProfiles. In my report I also outline how Issue-Based Information Systems (IBIS) can be used to help document and map, as a graph, conversations amongst diverse stakeholders working to address wicked problems, through facilitated approaches like dialogue mapping.
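
To make that idea more concrete, here is a minimal sketch of a tiny knowledge graph, my own illustration rather than anything from the report, using the open-source Python library rdflib. The educator, fact sheet, and example.org namespace are all hypothetical; a production Land Grant Knowledge Graph would use persistent institutional identifiers and richer vocabularies like those underlying VIVO.

```python
from rdflib import Graph, Literal, Namespace, RDF
from rdflib.namespace import DCTERMS, FOAF

# Hypothetical namespace; a real graph would use persistent institutional URIs.
EX = Namespace("http://example.org/landgrant/")

g = Graph()
g.bind("ex", EX)

educator = EX["jane_doe"]          # an Extension educator (made up)
topic = EX["soil_health"]          # a topic entity
factsheet = EX["soil_health_101"]  # a knowledge artifact

g.add((educator, RDF.type, FOAF.Person))
g.add((educator, FOAF.name, Literal("Jane Doe")))
g.add((educator, EX.hasExpertise, topic))
g.add((factsheet, DCTERMS.title, Literal("Building Soil Health")))
g.add((factsheet, DCTERMS.subject, topic))
g.add((factsheet, DCTERMS.creator, educator))

# Serendipitous discovery: every resource connected to the soil health topic.
for subject, predicate in g.subject_predicates(object=topic):
    print(subject, predicate)

# Serialize as Turtle so other institutions can link to these statements.
print(g.serialize(format="turtle"))
```

Because every statement reduces to a (subject, predicate, object) triple, graphs built independently by different institutions can be merged and queried as one, which is precisely what makes linking widely distributed expertise tractable.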

Figure 3. “Google Knowledge Graph Card”, returned as part of a Google search result for “Liberty Hyde Bailey”

One of the more developed and commonly used examples of a knowledge graph is Google’s. Figure 3 shows one benefit provided by that graph: the ability to aggregate a wide variety of information resources related to a particular subject in the form of a “Knowledge Graph Card”. In my report I frame this as part of a larger evolution toward a “semantic web”, Tim Berners-Lee’s original vision of the World Wide Web, “where anything could be potentially connected with anything else”, offering countless opportunities and pathways for the sharing, discovery and (re)use of knowledge.

Recommendations

Realizing a Land Grant Knowledge Graph that more effectively links the diverse and widely disparate knowledge resources of our various institutions in support of emergent learning and health might at first seem an impossible undertaking, especially if approached through traditional top-down planning. In his piece Government as Platform, technology thought leader Tim O’Reilly flips that model, suggesting[6] a more emergent, collaborative approach: developing data and information layers “on which we, the people, can build additional applications” – a suggestion worth considering for Land Grant Universities, sometimes referred to as the People’s Colleges. The Obama administration embraced such a role through its Digital Government initiative, outlining three layers of digital services within a digital ecosystem:

  • Information (or Storage) Layer – includes structured information/data such as census data, plus unstructured information such as fact sheets and recommendations.
  • Platform (or Management) Layer – includes all the systems and processes used to manage this information.
  • Presentation Layer – what the “end users” of information create/need in order to leverage that data and information in support of informed decision making and action.
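
To illustrate how these three layers separate concerns, here is a deliberately tiny Python sketch of my own; the data, function names, and fact sheets are entirely hypothetical, with one function standing in for each layer:

```python
# Information (storage) layer: raw data and documents, however they are stored.
FACT_SHEETS = [
    {"id": "fs-001", "title": "Building Soil Health", "topic": "soil"},
    {"id": "fs-002", "title": "Community Food Hubs", "topic": "food systems"},
]

# Platform (management) layer: services that manage and expose the information.
def search_fact_sheets(topic):
    """Return all fact sheets tagged with the given topic."""
    return [fs for fs in FACT_SHEETS if fs["topic"] == topic]

# Presentation layer: whatever an end user or community app builds on top.
def render_reading_list(topic):
    hits = search_fact_sheets(topic)
    return "\n".join("- {title} ({id})".format(**fs) for fs in hits)

print(render_reading_list("soil"))
```

The point of the separation is that anyone can build a new presentation layer (a decision tool, a community dashboard) without touching the layers beneath it.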

Adapting a brilliant XPLANE-created image from Jay Cross’ Informal Learning Blog (now archived following his passing), I’ve created the animation below to illustrate how Land Grant supported networked stacks and platforms might help facilitate emergent learning and innovation at the community level, with each layer able to both “push” and “pull” from those above, below and adjacent to it, enabling multiple pathways for data, information and knowledge exchange. Combined with efforts like those highlighted in a recent GODAN Open Farms documentary, this could greatly contribute to the development and scaling of SEPLs and similar integrated approaches.

 

My report[2] provides several examples of existing organizations and initiatives illustrating typical roles or needs within this stack. It also provides several suggestions on how Extension and others can develop their “sociotechnical capabilities” for interacting with and contributing to such an ecosystem. That includes best practices like following FAIR principles, what I see as a specialized form of Working Out Loud, where you make the “digital trace” left by your work more findable, accessible, interoperable and reusable.

Recommendations relevant to Extension include three broad areas of development (potentially supported in part through eXtension competency-based education (CBE) services):

  • Gaining and promoting a systems-oriented definition of health, including agrifood systems health, based on an understanding of complex adaptive systems and related emerging transdisciplinary frameworks.
  • Building a shared understanding of, and ability to effectively leverage, information and communications tools and systems (including Issue-Based Information Systems capabilities and metaliteracy).
  • Promoting trust and mutual understanding amongst Land Grant personnel and those they work with, nurtured through facilitative/network/systems leadership.

Next Steps

I and several colleagues have already begun exploring how some of these ideas could be implemented in the form of Land Grant system facilitated Crucial Conversations on Health and Wealth. Though we might each use different tools and programmatic structures to realize desired outcomes, we share an interest in collective sensemaking and problem solving approaches which can help communities better address wicked problems like hunger. At the very end of my report is a concept map generated from a recent Diversity & Inclusion designathon session exploring those ideas. Look for future posts/updates as we proceed on that learning journey!

Links to all the various outputs associated with the eXtension co-sponsored fellowship this post emerged from are available here: https://www.extension.org/jeff-piestrak/

 

Endnotes

1. Food systems “multifunctionality” (providing economic, environmental and social functions or benefits simultaneously) is something many researchers and practitioners look at when assessing the health of food systems, particularly those applying a social-ecological systems (SES) framework.
2. My fellowship final report is available for download from Cornell’s eCommons repository here: http://hdl.handle.net/1813/48205
3. In line with eXtension/LG system partner GODAN’s (Global Open Data for Ag & Nutrition) theory of change for realizing a data ecosystem for agriculture and food
4. A broad range of personal, social, economic, and environmental factors can influence the health of people, communities and food systems. More information can be found here: https://www.healthypeople.gov/2020/about/foundation-health-measures/Determinants-of-Health
5. Bircher, J., & Kuruvilla, S. (2014). Defining health by addressing individual, social, and environmental determinants: New opportunities for health care and public health. Journal of Public Health Policy, 35(3), 363–386. https://doi.org/10.1057/jphp.2014.19
6. O’Reilly’s article is chapter two of the book Open Government: Collaboration, Transparency, and Participation in Practice.

Categories
Content Design Fellowships i-Three Lab Information Information Technology Innovation Media Professional Development Technology

Seven Elements of Good Data Visualization

To prompt behavior change, we must be able to effectively communicate data. Not convinced? Read this post on why data visualization matters. The goal of this article is to dig in deeper and present some foundational concepts for creating good data visuals.

A recent eXtension webinar[i] described numerous tools and programs at our disposal for creating more engaging data visualizations, so I won’t address those sorts of resources in this post. What I hope to impart are fundamental concepts of data visualizations that are cross-cutting and applicable, regardless of which tool you choose to use to create them. I have distilled these into my top seven characteristics of good data visualization.

1. What’s your point?

Our goal is to present scientific data in a clear and simple way. But do not misunderstand me; I am not advocating for overly simplistic, watered-down presentations of science. For example, Nate Silver’s FiveThirtyEight[ii] website, featuring data visualizations and journalism on topics of politics, economics, science, and sports, presents lots of complicated data, often using chart types that are unfamiliar and atypical. And even though we have probably all been warned against using visuals that depart from the norm, the site is ranked 618 in the U.S. according to Alexa[iii], and, according to Quantcast[iv], over 371,000 people in the U.S. visit the site each month. Despite my lack of interest in sports in general, I have found myself browsing through numerous sports-related stories on Silver’s site. What makes these fairly complex and unfamiliar graphs engaging and worth spending time looking at? I propose that one key factor is that the authors know their point.

When you are presenting a graph or chart, do you think through what you want people to understand and walk away with? I know that I am guilty of approaching data visualization with the goal of displaying all of the data as neatly and completely as possible. While not a bad idea, more must be considered than whether I was able to fit all the information into the display. Before finalizing a graph to share with others, take a step back and ask yourself, “What is my point?” Then, determine if the graph actually conveys that, or if there is a way to make your point clearer. It could be that a different chart type or color scheme would help elucidate your point.

2. Choose the right chart

I suspect that, by and large, bar charts are the most used chart type, and possibly for good reason: they are simple to read and people are familiar with them. However, they are not a one-size-fits-all solution, and numerous other options should be considered. The following charts are ones I have experimented with in the last year.

Consider trying one of these chart or graphic types out this year. All of these examples were made in Excel – not with any special software – using some creative “tricks” to make them possible. At the end of this post I provide some resources to help you learn these tricks.

Simple text

When you have one or two numbers that tell the story, highlighting a single number is a great option. A caution, however: do not overuse this simple tool or it will lose its impact.

Example of simple text to communicate data
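
If you build figures in code rather than Excel, the same effect takes only a few lines. Here is a minimal matplotlib sketch of my own; the statistic is borrowed from the soybean planting-date example later in this post:

```python
import matplotlib.pyplot as plt

fig, ax = plt.subplots(figsize=(4, 3))
ax.axis("off")  # no axes needed: the number is the chart

ax.text(0.5, 0.6, "2.7 bu/acre", ha="center", va="center",
        fontsize=40, fontweight="bold", color="#b22222")
ax.text(0.5, 0.3, "average yield increase from planting early",
        ha="center", va="center", fontsize=11, color="grey")

fig.savefig("simple_text.png", dpi=150, bbox_inches="tight")
```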

Heatmap

Tables are rarely a good tool for showing data in a live presentation, but they do give you the ability to present a lot of detail and can be useful in printed materials. Combining a table with the technique used in a heatmap – that is, adding colors that vary in intensity to show relative performance – can help readers more quickly process and see patterns. In the example below, the darker colors represent higher yields, allowing the reader to see at a glance which combination of nitrogen application rate and seeding rate results in the best yield.

Example of the heatmap technique applied to a table
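
As a code-based alternative to the Excel version, here is a rough matplotlib sketch of the same technique; the nitrogen rates, seeding rates, and yields below are invented for illustration:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical yields (bu/acre) for nitrogen rate x seeding rate combinations.
nitrogen = ["80 lb N", "120 lb N", "160 lb N"]
seeding = ["28k", "32k", "36k"]
yields = np.array([[168, 175, 178],
                   [172, 181, 186],
                   [173, 184, 189]])

fig, ax = plt.subplots()
ax.imshow(yields, cmap="Greens")  # darker cells = higher yields

ax.set_xticks(np.arange(len(seeding)))
ax.set_xticklabels(seeding)
ax.set_yticks(np.arange(len(nitrogen)))
ax.set_yticklabels(nitrogen)
ax.set_xlabel("Seeding rate (seeds/acre)")

# Keep the numbers in each cell so the figure still reads like a table.
for i in range(len(nitrogen)):
    for j in range(len(seeding)):
        ax.text(j, i, yields[i, j], ha="center", va="center")

ax.set_title("Yield (bu/acre) by nitrogen and seeding rate")
fig.savefig("heatmap_table.png", dpi=150, bbox_inches="tight")
```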

Layered bar graph

A layered bar graph essentially combines the two bars of a side-by-side double bar graph. Both the grey bars and the red bars start at the 0 point on the x-axis. This allows an easy comparison, showing us how much more there is of the grey than the red. Combining the bars is a good technique for saving space and clearly illustrates the difference between the two things you are comparing, especially when you want to emphasize the relative difference and not necessarily the quantitative difference. Instead of a legend, colored text in the title and the color of the bar segments are used to communicate information about the different elements being compared.

Example of a layered bar graph
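
A hedged matplotlib sketch of the same idea, using made-up enrollment and completion counts (the original example was built in Excel):

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical counts, for illustration only.
programs = ["Program A", "Program B", "Program C"]
enrolled = np.array([120, 90, 60])    # grey, wider bars
completed = np.array([85, 70, 30])    # red, narrower bars layered on top

y = np.arange(len(programs))
fig, ax = plt.subplots()
# Both sets of bars start at zero, so the red sits inside the grey.
ax.barh(y, enrolled, height=0.6, color="lightgrey")
ax.barh(y, completed, height=0.3, color="#b22222")

ax.set_yticks(y)
ax.set_yticklabels(programs)
ax.invert_yaxis()
# Name the categories in the title instead of using a legend; in a polished
# version you could color those words to match the bars, as the post shows.
ax.set_title("Completed (red) vs. enrolled (grey) participants")
fig.savefig("layered_bar.png", dpi=150, bbox_inches="tight")
```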

Small multiples

Small multiples is a great tool for breaking complex information into an array of manageable and comparable information. The technique uses multiple views to show different partitions of a dataset, using a series of similar charts or graphs with the same scale and axes that can be easily compared. There are numerous uses for small multiples, and many chart types can be broken down into small multiples; this example uses horizontal bar charts.

Example of a small multiples graph to break down complex info
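
One way to sketch small multiples in matplotlib, again with invented data; the key detail is that every panel shares the same scale and axes:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical survey responses by region, for illustration only.
regions = ["North", "South", "East", "West"]
categories = ["Agree", "Neutral", "Disagree"]
rng = np.random.default_rng(seed=1)
data = rng.integers(10, 60, size=(len(regions), len(categories)))

# One small horizontal bar chart per region, all sharing the same scale.
fig, axes = plt.subplots(1, len(regions), figsize=(10, 2.5),
                         sharex=True, sharey=True)
for ax, region, row in zip(axes, regions, data):
    ax.barh(categories, row, color="grey")
    ax.set_title(region, fontsize=10)

fig.suptitle("Responses by region (identical axes make panels comparable)")
fig.savefig("small_multiples.png", dpi=150, bbox_inches="tight")
```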

3. Less is more

Eliminating unnecessary legends, gridlines, tick marks, and colors will clean up the graph and allow you to focus your learner’s attention on your point.

Eliminating the legend is a good strategy to clean up the graphic and, if done well, makes interpretation of the graph quicker. Labeling bars directly, such as in the small multiples example, makes it easier for the viewer to process information because they do not have to look between the legend and the main part of the chart to determine what each color in the bar chart represents. Color also can be used in a similar way, such as in the layered bar example: colors in the title and on the average lines indicate what the grey and red categories are, making a separate label unnecessary.

Consider whether eliminating axis labels and instead labeling the points directly might be advantageous. When the actual numeric value is important, label the points directly; when the overall trend is important, leave the axis labels in place. To reduce redundancy, however, do not use both axis labels and individual point labels. One exception to this guideline would be to use the axis labels, but label a few key data points to draw attention to them.
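
The sketch below shows direct labeling in matplotlib with invented numbers; note that ax.bar_label requires matplotlib 3.4 or newer:

```python
import matplotlib.pyplot as plt

# Hypothetical acreage values, for illustration only.
crops = ["Corn", "Soybean", "Wheat"]
acres = [410, 360, 150]

fig, ax = plt.subplots()
bars = ax.bar(crops, acres, color="grey")

# Label each bar directly, then drop the now-redundant y-axis and box.
ax.bar_label(bars)
ax.set_yticks([])
for side in ("top", "right", "left"):
    ax.spines[side].set_visible(False)

ax.set_title("Acres enrolled, by crop")
fig.savefig("less_is_more.png", dpi=150, bbox_inches="tight")
```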

4. Use color intentionally

I have seen many graphs like the following, in which each individual site was given a different color. The graph is bright and eye-catching, yet the color is not used in a meaningful way. Separating the various sites with different colors is not important and only detracts from the overall point.

Example of color not used purposefully in a layered bar chart.

Color is a powerful tool and should always be used to convey a message. When I am developing a graphic, I like to first make as much of the graph as possible grey. Then I go back and begin using color to make the key point stand out. In the following graph I have used a lighter shade of red for the late planting date, and a darker shade of red for the early planting date. Color in the subtitle is used to designate what the different colors of bars represent and allows the legend to be eliminated.

Example of layered bar chart with purposeful use of color.
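
A rough matplotlib version of the same grey-first, highlight-second approach, with invented yield numbers; the title also anticipates the advice in the next section by stating the takeaway:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical soybean yields (bu/acre) by planting date, for illustration.
years = ["2008", "2009", "2010"]
early = np.array([58.1, 61.4, 55.9])
late = np.array([55.4, 58.7, 53.2])

x = np.arange(len(years))
fig, ax = plt.subplots()
# Two shades of one hue carry the message; everything else stays quiet.
ax.bar(x - 0.2, early, width=0.4, color="#8b0000", label="Early planting")
ax.bar(x + 0.2, late, width=0.4, color="#e3a6a6", label="Late planting")

ax.set_xticks(x)
ax.set_xticklabels(years)
# The post replaces the legend with colored words in the subtitle; doing that
# in matplotlib takes separate text objects, so a frameless legend stands in.
ax.legend(frameon=False)

# The title states the takeaway rather than just naming the variables.
ax.set_title("Planting soybeans early increased yield in all three years")
fig.savefig("purposeful_color.png", dpi=150, bbox_inches="tight")
```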

5. Create pointed titles and call out key points with text

The previous graph could be given a title along the lines of “Soybean Yield by Planting Date, 2008 to 2010.” However, a much more useful title could be leveraged to communicate the key point – in this case, “Planting Soybeans Early Resulted in an Average 2.7 bu/acre Yield Increase.”

Text also can be used in other strategic locations, such as using the label “12 On-Farm Research Sites” to designate all the sites along the x-axis rather than labeling each one “site 1, site 2,” and so on. A subtitle is used to designate what the different colors of bars represent and provides additional useful information about planting dates.

6. Get feedback and iterate

This process is dynamic and, at least for me, requires lots of trial and error. Utilize the back button. Or create a separate copy before trying a bold remake, which also allows you to compare the first and second versions. On a number of occasions, once I had gotten a graph cleaned up and presentable, I realized my point would be better displayed with a completely different graph type, and I ended up starting the process over again.

Starting with a quick sketch on a sheet of scratch paper can also be helpful. Sometimes you can save time by quickly drawing out some ideas of how to display your variables before beginning your computer work. This also forces you to think through the concept rather than just defaulting to one of Excel’s recommended charts.

Getting feedback can be very valuable. Ask other people to take a look at your graph. Ask them what they think the main point is, and what they notice first. Audiences also often provide great feedback. Take note of what questions your audience has, and then determine if there is a way to make your graph communicate the information they need more clearly.

7. Read up and copy other visualizations

Many of the graphics I have experimented with came from examples that intrigued me by the effectiveness with which they communicated information. I encourage you to browse websites and follow Twitter accounts that routinely produce good data visualizations. If you see something that really communicates information well, take a few minutes to look at it and think about why it is effective, then try to incorporate that into your future designs.

Here are some suggestions to get you started:

Websites

  • Storytelling with Data[v]
  • Stephanie Evergreen[vi]
  • FiveThirtyEight[vii]
  • USDA ERS Chart Gallery[viii]
  • USDA ERS Data Visualizations[ix]

Twitter accounts

  • @BBGVisualData
  • @WSJGraphics
  • @NateSilver538
  • @evergreendata
  • @Rbloggers
  • @Seeing_Data
  • @538viz
  • @FiveThirtyEight
  • @USDA_ERS

This is admittedly a very brief introduction to the concept of data visualization. There are lots of great resources that discuss how to pick the right chart for your data – and even walk you through how to create them. Two of my favorites that are fairly comprehensive are “Storytelling with Data” by Cole Knaflic and “Effective Data Visualization” by Stephanie Evergreen.

Be patient with yourself – as with most things, learning to create good data visualizations takes time. Scott Berinato, author of “Good Charts: The HBR Guide to Making Smarter, More Persuasive Data Visualizations,” sums it up well: “Simplicity takes some discipline and courage to achieve. The impulse is to include everything you know. But charts communicate the idea that you’ve been just that – busy.”[x]

These seven suggestions are meant to serve as a starting point and to encourage you to begin experimenting with the way you communicate data. In the next post, I will take you through a data visualization makeover using the elements I outlined in this post.

Please take a minute to answer these three questions. Your feedback helps direct future articles and resources.

Feedback form: https://docs.google.com/forms/d/e/1FAIpQLSfmAI4bHnnWZYSecowyCiFqkuI3knR8C77jsJaXoMpiU4H2LQ/viewform

 

[i] https://learn.extension.org/events/3007

[ii] www.fivethirtyeight.com

[iii] http://www.alexa.com/siteinfo/fivethirtyeight.com

[iv] https://www.quantcast.com/fivethirtyeight.com

[v] http://www.storytellingwithdata.com

[vi] http://stephanieevergreen.com/

[vii] www.fivethirtyeight.com

[viii] https://www.ers.usda.gov/data-products/chart-gallery/

[ix] https://www.ers.usda.gov/data-products/data-visualizations/

[x] https://hbr.org/2016/06/visualizations-that-really-work

 

Categories
Extension Fellowships Impact Software Technology Working Differently

Building Evaluation Capacity Through Data Jams, Part 3: Readying Extension for the Systematic Analysis of Large Qualitative Datasets

In this third blog post on the University of Wisconsin-Extension Data Jam Initiative, I will focus on four institutional outcomes of this Evaluation Capacity Building Framework.

Screenshot from the University of Wisconsin-Extension Civil Rights Legacy Datasets in MAXQDA.

INSTITUTIONAL OUTCOME 1: Continuous use of Institutionally Collected Data

The Data Jam Initiative provides colleagues with the tools, skills, support and community they need to engage in the analysis of large, often fragmented and hard-to-analyze textual datasets. We are currently conducting a longitudinal study measuring the initiative’s impact on analytic self-confidence and proficiency. At this early stage we observe heightened empowerment in Extension professionals, and we see a steep increase in evaluation, research and internal development projects that utilize the data from our central data collection system.

INSTITUTIONAL OUTCOME 2: Improvement of Institutional Data Quality

An essential element of the Data Jam Initiative is to communicate to colleagues and leadership how data are being used. Institutionally, this validates colleagues’ efforts regarding reporting, and it supports leadership in adjusting data collection foci based on ongoing, interdisciplinary data analysis. This, in turn, helps keep institutional research, evaluation and communication efforts in alignment with ongoing data collection and storage.

INSTITUTIONAL OUTCOME 3: Building Interdisciplinary Capacity to Quickly Respond to Emerging Analytic Needs

All-Program area Evaluator Data Jam at the University of Wisconsin-Extension, March 2017.

Over time we create a baseline of shared techniques for analysis, and distributed proficiency in utilizing Qualitative Data Analysis Software. Consequently, colleagues can tap into shared analytic frameworks when they collaborate on projects. On a larger scale, the institution can quickly and flexibly pull together analysis teams from across the state, knowing that a number of colleagues already share fundamental analytic and technical skills, even if they have never directly worked together. This allows an institution to respond quickly and efficiently to time-sensitive inquiries, and to analyze more data more quickly, while bringing more perspectives into the process through work in larger ad-hoc analysis teams.

INSTITUTIONAL OUTCOME 4: Retaining Analytic Work through Legacy Datasets

Qualitative Data Analysis Software is designed to allow for detailed procedural documentation during analysis. This allows us to retain the analytic work of our colleagues, and to merge it into a single file. For example, we created a “Civil Rights Legacy Dataset” – a Qualitative Data Analysis Software file that contains all programming narratives containing information on expanding access to underserved or nontraditional audiences, currently from 2014 to 2016. This amounts to approximately 1,000 records, or 4,000 pages of textual data. The file is available to anyone in the institution interested in learning about best practices, barriers and programmatic gaps regarding our work with nontraditional and underserved audiences.

The analyses currently being conducted on this dataset by various teams are merged back into the “Legacy File”. Future analysts can view the workbenches of prior analysts and projects, allowing them to use prior insights and processes as stepping stones. This enables the institution to conduct meta-analyses, maintain analytic continuity, and more easily and reliably distribute analytic tasks over time or across multiple analysts. You can find more information on the use of Legacy Datasets in Extension in an upcoming book chapter in Silver & Woolf’s textbook on utilizing Qualitative Data Analysis Software.

Beyond Qualitative Data: A Pathway for Building Digital Learning and Adaptation Skills

The outcomes above are the immediate institutional effects the Data Jam Initiative was designed for. But perhaps more importantly, we’re creating a baseline of proficiency in negotiating between a technical tool and a workflow. Our tools change. Our methodological approaches differ from project to project. Each new project, and each new digital tool, requires that we engage in this negotiation process. Every time, we need to figure out how we can best use a tool to facilitate our workflows; this skill is a fundamental asset in institutional professional development, and it transcends the topical area of evaluation.

This means that the Data Jam Initiative, as an approach focused on mentorship and making by imbuing a technical tool with concrete, relevant processes, is not limited to qualitative data – it can be a framework for many contexts in which Extension professionals use software to do or build things, be it visualization tools, digital design and web design, app development, statistics and quantitative research, or big data tools.

The development of the Data Jam Initiative Tool Kit has been supported by an eXtension Fellowship. To access the curriculum, examples, videos and training materials, please visit the UW-Extension Data Jam website: http://fyi.uwex.edu/datajams/

Categories
Content Extension Fellowships i-Three Lab Information Technology Innovation Media Professional Development Technology

Data Visualization for Extension Professionals: Why Does it Matter?

“We face danger whenever information growth outpaces our understanding of how to process it.”[i]

The ability to generate data has greatly increased in recent years across all sectors, including agriculture. In fact, according to VCloud News, 90% of the world’s data has been created in the last two years alone[ii]. This “big data” is harnessed to improve health, save money, and improve efficiencies. In this era of “big data,” challenges lie not only in storing and processing data, but also in distilling and presenting it so it becomes meaningful and offers insights for our intended audience. Scott Berinato, senior editor at Harvard Business Review, encapsulates this idea in “Visualizations That Really Work”: “Decision making increasingly relies on data, which comes at us with such overwhelming velocity, and in such volume, that we can’t comprehend it without some layer of abstraction.”[iii]

The goal of this post is to discuss how we, as scientists and educators, can present data in clear and concise ways.

Enter data visualization.

What is Data Visualization?

Simply put, data visualization is how we make sense of, and communicate, data.

However, this term can encompass a variety of things and varies by profession – computer programmers, statisticians, graphic designers, business analysts, scientists, journalists, and professional speakers all approach the topic of data visualization differently.

I am not a computer programmer, nor am I a graphic designer. I am a scientist by training, and therefore a practitioner of data visualization. I experiment, and I have much to learn.

I have been convinced of the importance of paying attention to how we visualize data, as much by my own struggles to decipher cluttered, burdensome graphics as by any well-crafted argument. Unfortunately, scientific data is often presented in overly complex charts – charts that make data hard to interpret and, consequently, hard to remember. This is true for information delivered to both the scientific community and Extension audiences. In fact, it could be argued there is a tendency within the scientific community to over-complicate things, as if making our data more convoluted will impress people with our vast knowledge.

Thankfully, scientific data presentation does not have to be cumbersome and overly complex; effective visualizations can make the message clear and memorable.

Why Should Extension Professionals Worry about Data Visualization?

Intuitively, we know that good information, when poorly communicated, cannot prompt desired behavior change. You can’t act on information you don’t understand – and having information does not equal understanding.

There is research evidence that supports this. Pandey, Manivannan, Nov, Satterthwaite, and Bertini (2014)[iv] tested the assumption that “visualization leads to more persuasive messages” by showing participants data in both chart and table form. When participants didn’t have strong beliefs about a topic, the visual information presented in charts was more persuasive in changing their attitudes than the textual information presented in tables. Simply put, data visualizations can lead to greater impact.

So why is there not more emphasis on this important aspect of how we communicate data?

A quick Google Trends[v] analysis shows a rapid increase in searches for “big data” since 2011, while searches for “data visualization” have stayed relatively flat. Why the lack of interest in, and emphasis on, visualizing our data? Surely as we increase the quantity of data we collect, the need for effective data visualization increases correspondingly, if not more.

Google Trends analysis of “Data Visualization” and “Big Data”
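
For anyone who wants to reproduce this comparison, the unofficial pytrends package wraps the Google Trends endpoint. This is a rough sketch under the assumption that pytrends is installed and its API still matches its documentation:

```python
import matplotlib.pyplot as plt
from pytrends.request import TrendReq

# Ask Google Trends for relative search interest in both terms.
pytrends = TrendReq(hl="en-US")
pytrends.build_payload(["big data", "data visualization"],
                       timeframe="2011-01-01 2017-01-01")
interest = pytrends.interest_over_time()  # pandas DataFrame indexed by week

interest[["big data", "data visualization"]].plot()
plt.ylabel("Relative search interest (0-100)")
plt.title("Google Trends: big data vs. data visualization")
plt.savefig("trends.png", dpi=150, bbox_inches="tight")
```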

In Cooperative Extension, our goal is to have impact – for people to make behavior changes as a result of information we share. In order for this to happen, we need to effectively communicate data.  Unfortunately, many obstacles get in the way of effective data communication. I believe one of these obstacles is simply ignorance of the fact that data can be communicated poorly.

Lack of awareness and attention to the issue may be partly to blame, but it may not be all our fault. After all, in the past, data visualization has been left to specialists such as data scientists and professional designers. But now, due to enhanced computing capabilities, new software and tools, and the ability to quickly collect and process massive quantities of data, most Extension professionals routinely produce charts and figures – without formal training in data visualization.

As a 2016 eXtension fellow[vi], my goal is to bring awareness and promote discussion of the topic of data visualization. If Extension is to fulfill the mission of bridging the gap between scientists and the public, so the public can act on the information scientists provide, we must communicate data well.

Fortunately, numerous books, videos, podcasts, and blogs are dedicated to the finer points of good data visualization. As a starting point, in my next post, I offer what I consider my top seven elements of good data visualization.

Please take a moment to complete the anonymous survey below. Information submitted will be used to guide my work during this fellowship.

Survey: https://docs.google.com/forms/d/e/1FAIpQLScYNJo9TjjiQ4yR7UYPYOZ6cFz0lzBe5hIjO6Qq6mjL9f447g/viewform

Endnotes

[i] Silver, N. (2012). The Signal and the Noise: Why So Many Predictions Fail But Some Don’t. New York: Penguin Press.

[ii] http://www.vcloudnews.com/every-day-big-data-statistics-2-5-quintillion-bytes-of-data-created-daily/

[iii] https://hbr.org/2016/06/visualizations-that-really-work

[iv] http://ieeexplore.ieee.org/xpls/icp.jsp?arnumber=6876023

[v] https://trends.google.com/trends/

[vi] https://www.extension.org/laura-thompson/

 

Categories
Extension Fellowships Impact Software Technology Working Differently

Building Evaluation Capacity Through Data Jams, Part 2: Software as a Teaching Tool for Data Analysis

In this second blog post on the University of Wisconsin-Extension Data Jam Initiative, I will focus on the role of software in Data Jams, and on the skills that colleagues are building in this Evaluation Capacity Building Framework.
Screenshot from the Qualitative Data Analysis Software MAXQDA.

 

What is Qualitative Data Analysis Software?

The technical backbone of the Data Jam Initiative is Qualitative Data Analysis Software – often abbreviated as QDAS or CAQDAS (Computer Assisted Qualitative Data Analysis Software). This type of research and evaluation software is designed to support the analysis of large amounts of textual information. It allows for efficient data management and the distributed analysis of large datasets across large teams. While Qualitative Data Analysis Software (such as MAXQDA, NVivo, ATLAS.ti or Dedoose) cannot do qualitative analysis by itself, modern packages typically offer a wide array of options for coding, documentation, teamwork, qualitative data visualization and mapping.

Focusing on Analytic Collaboration, not on where to Click

Data Jam at the University of Wisconsin-Extension, August 2016.

In a Data Jam, groups of colleagues analyze data together while using the same analytic software tool, and similar analytic techniques. This creates a common experience of bringing a tool (the software) and a process (the analytic techniques) together. We’re not teaching how to click through menus; we’re not teaching theoretical workflows. We analyze, we make things – with a real tool, a real question, real data and concrete techniques. These concrete analytic techniques emphasize writing and documentation throughout the process, and they focus on utilizing analysis groups to drive the analysis. In a Data Jam, colleagues practice how to stay focused on their research question, and how to work as an analysis group to produce a write-up at the end of the day.

Qualitative Data Analysis Software empowers colleagues to quickly explore our large datasets, and to dive into the data in an engaging way – as such, this software is a powerful tool to illustrate and practice methodological workflows and techniques. We’re not only building individual capacity – we are building a community of practice around data analysis in our institution. I will focus on this aspect in the third blog post, but I will briefly describe outcomes on the individual level here.

Individual Capacity Building & Improved Perception of Institutional Data Collection

On the individual level, we are seeing two outcomes in our ongoing evaluation of the initiative. First, we build analytic and evaluation capacity: colleagues learn how to analyze textual data using state-of-the-art analytic tools, and they learn how to integrate these tools into their evaluation and research workflows. Watch the 3-minute video below for some impressions and learning outcomes from a 4-day Data Jam for Extension research teams.

https://youtu.be/IOWhots-qdc

Second, colleagues gain a better understanding of how (and that!) the data they enter into the central data collection system are being used. Our evaluations show that colleagues leave our Data Jams with an increased understanding of why we collect data as an institution, and of why it is important to enter quality data. Experiencing the role of the analyst seems to have a positive effect on colleagues’ perceptions of our central data collection effort, and leaves them excited to communicate to their colleagues how the data are being used.

Not every colleague will use the software or engage in research in the future; our goal is not to make everyone an analyst. But we establish a basic level of data literacy across the institution – i.e., a common understanding of the procedures, products, pitfalls and potentials of qualitative data analysis. This type of data literacy is a crucial core skill as we undergo the Data Revolution.

The development of the Data Jam Initiative Tool Kit has been supported by an eXtension Fellowship. To access the curriculum, examples, videos and training materials, please visit the UW-Extension Data Jam website: http://fyi.uwex.edu/datajams/

Categories
Extension Fellowships Impact Software Technology Working Differently

Building Evaluation Capacity Through Data Jams, Part 1: A Response to the Data Challenge

Collecting large amounts of textual data is easier than ever – but analyzing those growing amounts of data remains a challenge. The University of Wisconsin-Extension responds to this challenge with the “Data Jam Initiative”, an Evaluation Capacity Building model that focuses on the collaborative, making-centered use of Qualitative Data Analysis Software. In this first of three blog posts I will provide a brief overview of the Initiative, the tools we’re using, and the products we’re making in Data Jams.

Data Jam at the University of Wisconsin-Extension, August 2016

Extension’s Data Challenge

Extensions collect large amounts of textual data, for example in the form of programming narratives, impact statements, faculty activity reports and research reports, and they continue to develop digital systems to collect and store these data. Collecting large amounts of textual data is easier than ever. Analyzing those growing amounts of data remains a challenge. Extensions and other complex organizations are expected to use data when they develop their programs and services; they are also expected to ground their communications and reports to stakeholders in rigorous data analysis.

Collaborative, Software-Supported Analysis as a Response

Data Jam at the University of Wisconsin-Extension, August 2016

The University of Wisconsin-Extension responds to this expectation with the Data Jam Initiative, an Evaluation Capacity Building Framework that utilizes Qualitative Data Analysis Software. In monthly full-day Data Jams and multi-day analysis sessions, colleagues meet to explore and analyze data together. Data Jams are inspired by the concept of Game Jams, in which game developers meet for a short time to produce quick prototypes of games.

Asking Real Questions, Analyzing Real Data

The most important feature of Data Jams is that we work with data and questions that are relevant to our colleagues; in fact, most topics in Data Jams are brought up by specialists and educators from across the state. By collaboratively analyzing programming narratives and impact statements from our central data collection system, we start answering questions like:

  • How are equity principles enacted in our Community Food Systems-related work?
  • How do our colleagues state-wide frame their work around ‘poverty’?
  • How does state-wide programming in Agriculture and Natural Resources address Quality Assurance?
  • How are youth applying what they’ve learned in terms of life skills in our state-wide 4-H and Youth Development programming?
  • How does FoodWIse (our state-wide nutrition education program) partner with other organizations, both internally and externally?
Data Jam products are shared with colleagues across the institution via our internal Data Jam blog.

Using Qualitative Data Analysis Software, Data Jammers produce concrete write-ups, models, initial theories and visualizations; these products are subsequently shared with colleagues, partners and relevant stakeholders.

Building Institutional Capacity to Analyze Large Datasets

Data Jam at the University of Wisconsin-Extension, February 2017

Through the Data Jam Initiative, we build institution-wide capacity in effectively analyzing large amounts of textual data. We connect teams, researchers, evaluators and educators to develop commonly shared organizational concepts and analytic skills. These shared skills and concepts in turn enable us to distribute the analysis of large data sets across content and evaluation experts within our institution. The overall goal of the initiative is to enable our institution to systematically utilize large textual datasets.

Since early 2016, we have used the Data Jam model in monthly one-day Data Jams across Wisconsin, in regular internal consulting and retreat sessions for project and program area teams, and in graduate methods education on the UW-Madison campus. We have hosted external Data Jams at Washington State University in Pullman and at the United Nations’ Office of Internal Oversight Services (OIOS).

The development of the Data Jam Initiative Tool Kit has been supported by an eXtension Fellowship. To access the curriculum, examples, videos and training materials, please visit the UW-Extension Data Jam website: http://fyi.uwex.edu/datajams/