Classroom Case Study: Teaching Data Ethics with Nonprofit Fundraising Scenarios

Marcus Ellison
2026-05-06
21 min read

A teacher-ready case study on donor profiling, privacy, fairness, and transparent AI outreach in nonprofit fundraising.

Nonprofit fundraising is one of the best real-world settings for teaching data ethics because it sits at the intersection of mission, persuasion, privacy, and power. When students examine how organizations use donor data, they can see that ethical questions are rarely abstract: Who should be profiled? What counts as fair targeting? How much personalization is too much? And when does “data-informed” become “data-extractive”?

This student module is designed for teachers who want to build an instructional unit around fundraising scenarios that feel practical, current, and consequential. It works especially well in media studies, computer science, business, civics, social studies, or career-readiness courses because it teaches students how AI, records, and outreach systems shape real decisions. For a broader framing on responsible tech use in classrooms, see our guide to IoT safety and equity in connected classrooms and the related lesson on reading AI optimization logs for transparency.

The goal is not to tell students that data is bad. The goal is to show them that every fundraising model makes tradeoffs, and those tradeoffs can be designed well or badly. Students will analyze donor profiling, privacy boundaries, fairness risks, transparency practices, and ethical outreach alternatives. That makes this a powerful nonprofit case study for teaching not just technical literacy, but judgment.

1. Why nonprofit fundraising is a strong ethics classroom

It is familiar, but not simplistic

Students usually understand fundraising at a basic level: a nonprofit needs support, donors respond to appeals, and campaign teams want to raise more money. That familiarity helps teachers move quickly into the ethical layer without spending weeks on background. Because the context is concrete, students can focus on the real questions: what data should be used, who should be contacted, and what assumptions are hidden inside the targeting logic.

Unlike many corporate examples, nonprofit fundraising makes the moral stakes visible. The organization is often pursuing a public good, which can make its data practices seem automatically justified. That assumption is exactly why this setting is useful. Students learn that noble intent does not erase privacy concerns or bias, and that ethical design matters even when the mission is admirable.

AI makes the hidden logic easier to examine

Recent nonprofit coverage shows that even smaller organizations are experimenting with AI to identify major gift prospects and improve outreach. That means this lesson is timely, not hypothetical. Students can explore how donor segmentation, predictive scoring, and message personalization may boost revenue while also shaping who gets attention, who gets excluded, and who gets nudged more aggressively. For a practical lens on how audiences are identified and prioritized, connect this module to niche prospecting and high-value audience pockets and employer-facing content strategies.

That makes nonprofit fundraising an ideal ethics case study because students can inspect AI not as magic, but as a decision-support layer. They can ask: what inputs are used, what outputs are generated, and what human judgment remains? Those are the same questions employers ask in data, policy, marketing, and product teams.

It connects ethics to careers

Students often want to know whether ethics has job value. In this unit, the answer is obviously yes. Ethical thinking is not a soft extra; it is a differentiator for analysts, marketers, researchers, educators, and operations teams. Students who can explain fairness, privacy, and transparency in a practical setting are building a skill set that employers recognize.

If you want to extend the career connection, pair the lesson with building a human-led portfolio and project portfolio design with AI tools. Students can document their analysis as a case brief, ethics memo, or stakeholder recommendation, turning the classroom exercise into a portfolio artifact.

2. Learning objectives for the student module

Core knowledge goals

By the end of the instructional unit, students should be able to define data ethics in their own words and distinguish it from general “good intentions.” They should understand donor profiling, predictive scoring, personalization, consent, and algorithmic bias. They should also be able to explain why privacy is not only about keeping secrets, but about respecting context, expectations, and power imbalance.

Teachers can keep these outcomes visible in the room or on the LMS. A strong module should feel concrete enough that students can point to examples and say, “This is where the ethical issue appears.” That is a better educational result than memorizing a definition.

Analytical skills students should practice

The most important skill is ethical reasoning under uncertainty. Students should learn to weigh benefits against risks, identify affected stakeholders, and compare alternative designs. They should be able to evaluate whether a fundraising tactic is fair, whether a dataset is appropriate, and whether a message is transparent enough for the audience.

It also helps to teach students to separate accuracy from legitimacy. A model may predict which donors are likely to give, but that does not automatically make the targeting ethical. In fact, accurate prediction can intensify ethical risk when it is used to exploit vulnerability or entrench inequity.

Transferable workplace skills

This module builds skills that employers value across policy, education, nonprofit operations, marketing analytics, and AI governance. Students practice evidence-based writing, stakeholder analysis, and responsible decision-making. Those abilities are especially relevant in roles where professionals must interpret data without losing sight of human consequences.

For students interested in policy workflows and process controls, the unit pairs well with compliance planning under changing regulations and policy-resilient procurement contracts. This is where ethics stops being theoretical and starts looking like a practical workplace competency.

3. The nonprofit fundraising scenario: how donor profiling works

What donor profiling usually includes

In the classroom, explain donor profiling as the process of using data to estimate who may give, how much they may give, and what type of appeal they may respond to. The data might include giving history, event attendance, email engagement, geographic location, wealth indicators, past campaign response, and sometimes inferred interests. AI systems can combine these signals into a score or segment label that helps staff prioritize outreach.

Students should understand that donor profiling is not one thing. It can range from benign segmentation, such as separating recurring donors from first-time donors, to much more aggressive prediction, such as estimating a person’s financial capacity or emotional responsiveness. That distinction matters because the ethics shift as the model becomes more invasive.
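To make the spectrum concrete in class, the benign end of profiling can be sketched as a toy, rule-based scorer. Everything here is invented for discussion: the field names, weights, and thresholds are hypothetical and do not come from any real fundraising platform.

```python
# Illustrative classroom sketch: a toy rule-based donor score built
# from fictional signals. Weights and thresholds are invented for
# discussion, not drawn from any real fundraising system.

def score_donor(record):
    """Combine simple engagement signals into a score and segment label."""
    score = 0
    score += 2 * record.get("gifts_last_year", 0)      # giving history
    score += 1 if record.get("attended_event") else 0  # event attendance
    score += record.get("email_opens", 0) // 5         # email engagement
    if score >= 5:
        segment = "major-gift prospect"
    elif score >= 2:
        segment = "engaged supporter"
    else:
        segment = "general list"
    return score, segment

# Example: a recurring donor who attended an event and opened many emails
print(score_donor({"gifts_last_year": 3, "attended_event": True, "email_opens": 12}))
# → (9, 'major-gift prospect')
```

Even this tiny example gives students something to interrogate: who chose the weights, why does event attendance count, and what happens to people the data never captured?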

How AI changes the scale of the decision

Before AI, a staff member might sort a spreadsheet manually and choose a few dozen names for a major gift campaign. With AI, a platform can rank thousands of records in seconds and generate personalized copy at scale. This creates efficiency, but it also means the organization can make more decisions, faster, with less human inspection.

That is where students should look for ethical friction. A model that operates at scale can quietly amplify assumptions, especially if a fundraising team trusts scores without questioning them. If a system overweights older donors, high-income neighborhoods, or historically generous zip codes, the nonprofit may appear effective while systematically narrowing who is seen as worth contacting.

Why “likely to give” is not the same as “fair to contact”

A useful classroom distinction is between predictive validity and ethical appropriateness. Just because a donor profile can predict a response does not mean the outreach should happen in the same way for everyone. The right question is not only “Can we contact them?” but “Should we contact them in this manner, with this level of personalization, and using these signals?”

Students can compare this to other data-driven targeting systems, such as the strategy behind new buying modes in ad platforms and automated bid strategies. In both cases, optimization can improve efficiency while obscuring who is being advantaged, excluded, or nudged.

4. Where bias enters the fundraising pipeline

Historical data reflects historical inequality

Bias often enters before the model is trained. If a nonprofit’s historical donor file reflects a narrow group of wealthy supporters, then the model learns from an already unequal pattern. This can lead the system to favor people who resemble the past donor base, while ignoring younger supporters, first-generation professionals, community members, or people from lower-income backgrounds who may still care deeply about the mission.

This is a powerful classroom lesson because it shows why data ethics is not only about model design. It is also about the structure of the underlying world. If the organization has historically underreached certain communities, the model may simply automate that underreach and call it insight.

Proxy variables can create hidden discrimination

Students should learn to spot proxy variables: data points that stand in for sensitive traits without naming them directly. Zip code, household size, home value, neighborhood density, language preference, and event attendance can all become proxies for income, race, age, or social class. Even when a fundraiser avoids explicit protected-class data, the system may still reproduce exclusion through indirect signals.

This is where ethical design becomes a technical and civic issue. Teachers can ask students whether the model should use certain variables at all, or whether they should be stripped out, tested, or constrained. A good conversation here is similar to evaluating fair participation in events where nobody feels like a target, because both situations involve inclusion, comfort, and power in a data-shaped environment.

Optimization can punish new or lower-capacity donors

Another bias pattern appears when the system over-optimizes for return on investment. If the model is trained to maximize revenue, it may prioritize only “high yield” prospects and ignore small donors, emerging donors, or supporters whose gifts are infrequent but meaningful. That may make short-term sense, but it can weaken long-term trust and reduce community breadth.

Teachers can frame this as a fairness question: fair to whom, and for what time horizon? A campaign that only learns from the highest-dollar outcomes may become efficient at extracting money from a small set of people while neglecting relationship-building. That is why “fairness” in fundraising cannot be reduced to just financial output.

5. Privacy, consent, and context in donor data

What makes fundraising data sensitive

Students often assume that fundraising data is less sensitive than medical or financial data, but that is not always true. A donor record can reveal religious affiliation, political values, family structure, grief history, charitable priorities, and membership in vulnerable communities. When AI combines signals from public records, engagement logs, and inferred behavior, it can create a detailed profile that the person never knowingly assembled.

This is why privacy in fundraising is not only a legal issue. It is an issue of dignity and context. People may expect a charity to acknowledge their support, but not necessarily to infer their wealth, emotional state, or probability of making a major gift.

Consent is only meaningful when it is informed

In class, emphasize that consent is meaningful only when people understand what is being collected and how it will be used. A generic privacy policy buried in fine print does not tell donors that AI may rank them, segment them, or personalize the emotional tone of appeals. Students should ask whether the nonprofit provides notice at the right moment and in language ordinary supporters can understand.

To sharpen this point, compare the scenario to digital consumer behavior. If people know how their data will be used, they can make more informed choices. If they do not, the organization may be technically compliant while still being ethically opaque. For a broader consumer-facing privacy lens, see how people navigate deals with privacy in mind and why accuracy matters in compliance document capture.

Respecting context is part of ethical design

Contextual integrity is a helpful concept for students: information shared for one purpose should not automatically be reused for another without careful justification. Someone who attended an awareness event is not necessarily signaling readiness for a high-pressure donation ask. Someone who clicked a newsletter may want information, not a fundraising escalation sequence.

Ethical design means slowing down at these boundary points. A humane system asks not only what can be inferred, but what should remain un-inferred. That lesson travels well beyond nonprofits into education platforms, advertising systems, and consumer apps.

6. A teacher-ready framework for evaluating fundraising ethics

Use the three-question lens: purpose, process, and impact

Teachers can simplify the analysis with a three-question framework. First: what is the purpose of the data use, and is it mission-aligned? Second: what is the process, and is it transparent, limited, and reviewable? Third: what is the impact, and who benefits or bears the burden? These questions help students move beyond gut reactions into structured ethical reasoning.

This framework works well because it can be applied to different fundraising tactics. A major-gift prediction model, a personalized email campaign, and a donor-retention dashboard each look different technically, but students can still test them with the same ethical lens. That consistency makes the lesson easier to grade and easier to remember.

Map stakeholders before judging the model

Encourage students to identify all stakeholders, not just the nonprofit and its finance team. Who else is affected? Donors, prospective donors, community members, staff, board members, volunteers, and beneficiaries all have different interests. Sometimes the people most affected by a fundraising system are not the ones directly being targeted, but the people whose services depend on the resulting revenue.

Students can create a stakeholder map and compare power, risk, and benefit. That mirrors professional analysis in ethics reviews and policy design. It also helps students see why a technically clever solution may still be socially weak.

Look for explicit guardrails

Ask whether the nonprofit has guardrails such as data minimization, human review, contact caps, suppression lists, and subject-access pathways. Guardrails show that the organization is designing for restraint, not just performance. If a system lacks these controls, students should flag it as higher risk even if it is generating impressive results.
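Two of these guardrails are simple enough to sketch directly. The snippet below shows a suppression list and a per-quarter contact cap filtering an outreach queue; the donor IDs, cap value, and data shapes are all hypothetical.

```python
# Sketch of two guardrails named above: a suppression list and a
# per-period contact cap. IDs, the cap value, and data shapes are
# hypothetical examples for class discussion.

SUPPRESSED = {"donor_17"}   # opted-out or do-not-contact records
CONTACT_CAP = 2             # maximum asks per donor this quarter

def filter_outreach(queue, contacts_this_quarter):
    """Drop suppressed donors and anyone already at the contact cap."""
    approved = []
    for donor_id in queue:
        if donor_id in SUPPRESSED:
            continue  # respect do-not-contact requests first
        if contacts_this_quarter.get(donor_id, 0) >= CONTACT_CAP:
            continue  # restraint: no escalation past the cap
        approved.append(donor_id)
    return approved

print(filter_outreach(
    ["donor_17", "donor_22", "donor_35"],
    {"donor_22": 2, "donor_35": 0},
))
# donor_17 is suppressed, donor_22 hit the cap → ['donor_35']
```

The point for students is that guardrails are ordinary, inspectable code, not exotic compliance machinery, so their absence is a design choice.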

For teachers who want to bring in operational thinking, the lesson pairs naturally with building a safer AI assistant without expanding attack surface and securing connected devices in workspace systems. In every case, trust depends on boundaries.

7. A comparison table students can use in class

The table below helps students compare common fundraising approaches and evaluate them through the lenses of privacy, fairness, transparency, and educational usefulness. Teachers can use it as a discussion prompt, a worksheet, or a rubric for student presentations.

| Fundraising approach | Potential benefit | Privacy risk | Fairness risk | Teacher discussion prompt |
|---|---|---|---|---|
| Manual donor segmentation | Simple, explainable outreach | Low to moderate | Can still reflect staff bias | Who decides the categories and why? |
| AI donor scoring | Fast prioritization at scale | Moderate to high if many data sources are used | May favor wealthy or historically visible donors | Should scores be treated as suggestions or decisions? |
| Personalized email automation | Higher response rates | Moderate if behavior is tracked closely | May pressure specific groups differently | What counts as respectful personalization? |
| Lookalike prospect modeling | Finds new prospects efficiently | Moderate | Can replicate historical exclusion | What if the original donor base was unrepresentative? |
| Universal, non-targeted appeal | Broad and inclusive reach | Low | More equitable access to ask and information | When is broad outreach better than precision? |

This table is especially effective because it prevents students from treating precision as inherently superior. In many cases, broader outreach may be more equitable and more aligned with trust-building, even if it is less efficient. That tension is the heart of the ethical lesson.

8. Classroom activities for a complete instructional unit

Activity 1: Donor profile audit

Give students a fictional donor dataset with fields such as donation history, event attendance, region, email opens, and volunteer hours. Ask them to identify which fields are reasonable, which are risky, and which may function as proxies for sensitive traits. Then have them decide which fields should be removed, constrained, or kept with human review.

This activity teaches students to think like auditors rather than just users. They learn that data selection is an ethical act. It also helps them understand that a cleaner model is not always a narrower one; sometimes the better design is the one that intentionally uses less data.
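A simple deliverable for this audit is a field-by-field decision map. The classification below is one possible class answer under the activity's fictional dataset, not an official rubric; students should be able to defend or overturn each label.

```python
# Worksheet-style sketch for Activity 1: classify fictional dataset
# fields as keep, constrain (human review / limited use), or remove.
# These labels are one possible class answer, not an official rubric.

FIELD_AUDIT = {
    "donation_history": "keep",       # directly relevant, expected use
    "event_attendance": "keep",
    "email_opens":      "constrain",  # behavioral tracking; limit use
    "volunteer_hours":  "constrain",
    "region":           "remove",     # can proxy income, race, or class
}

for field, decision in FIELD_AUDIT.items():
    print(f"{field:20s} -> {decision}")
```

Groups can compare their maps and debate disagreements, which usually reveals the hidden assumptions behind each field.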

Activity 2: Fair outreach redesign

Give groups a scenario where a nonprofit wants to increase major gifts among alumni donors. Students should redesign the outreach plan so it remains effective but reduces privacy intrusion and bias. They can propose a tiered outreach model: general updates for everyone, optional engagement for interested supporters, and human-reviewed major gift contact only where appropriate.

As an extension, students can compare their plan to a growth-oriented experimentation mindset similar to early-access product tests or micro-retail experiments. The key difference is that ethical fundraising should optimize for trust, not just conversion.

Activity 3: Transparency memo

Ask students to write a one-page transparency memo explaining, in plain language, what data the nonprofit uses, why it uses it, and what choices donors have. This is an excellent writing assignment because it forces students to translate technical logic into public-facing language. If they can’t explain the practice clearly, the practice may not be ethically mature enough.

You can make the memo more realistic by having students write for different audiences: donors, board members, staff, or the public website. Students will quickly see that transparency is not one sentence; it is a design commitment. For inspiration on communicating complex systems clearly, connect the exercise to guided experiences with real-time data and preserving voice when using AI tools.

9. How to assess student understanding fairly

Use a rubric that values reasoning, not just conclusions

A good ethics rubric should not reward students simply for saying “this is bad.” Instead, it should assess whether they identify stakeholders, explain tradeoffs, recognize bias sources, and propose realistic alternatives. Students can disagree on the final answer and still demonstrate excellent understanding if their reasoning is sound.

Teachers should make the rubric explicit before the task begins. Suggested criteria include clarity of explanation, identification of privacy risks, recognition of fairness issues, quality of proposed safeguards, and practicality of the redesign. That approach mirrors workplace expectations, where good judgment matters more than memorized slogans.

Evidence of mastery

Look for signs that students can move from description to critique. A strong response will explain not only what the system does, but why it may be problematic for certain groups or under certain conditions. Even better, students will propose concrete fixes such as consent language, opt-out options, human review, or more inclusive outreach categories.

You can also ask students to compare the nonprofit scenario to another domain, such as building trust with young audiences or retention analytics for viewers. Cross-domain transfer is a strong indicator that they truly understand the ethics concepts.

Feedback that improves ethical thinking

When giving feedback, avoid framing student answers as right or wrong in a simplistic way. Instead, ask what assumptions they made and whose perspective they left out. Prompt them to consider whether their proposal would still work if the donor base changed, the nonprofit expanded, or regulations tightened.

This kind of feedback helps students develop the habit of ethical iteration. That habit is valuable in school, but it is even more valuable in careers where systems must be revised over time as contexts shift.

10. Teaching notes: common misconceptions and how to handle them

Misconception: If it helps the mission, it is ethical

This is one of the most common misunderstandings students bring to the lesson. Teachers should acknowledge that mission impact matters, but then explain that good outcomes do not automatically justify every method. A nonprofit can do important work and still over-collect data, over-target supporters, or obscure the logic of its models.

A helpful follow-up question is: would we accept the same method if a for-profit company used it? If the answer changes based on the organization’s label rather than the method itself, students are starting to see how ethical reasoning works.

Misconception: Bias only happens when sensitive data is used directly

Students may assume that avoiding race, gender, or income fields removes bias. Explain that models can infer those characteristics indirectly through proxies. This is why fairness testing must include a review of variables, outcomes, and operational consequences, not just a checklist of forbidden columns.

For a broader lesson on inference and policy constraints, teachers can pair this discussion with a supplementary reading on proxy variables and algorithmic inference.

Misconception: Transparency means revealing everything

Transparency does not mean dumping raw logic or exposing sensitive security details. It means giving stakeholders a meaningful explanation of what is collected, why it is collected, how it is used, and what choices they have. Good transparency is understandable, timely, and useful.

Students often appreciate this distinction because it reflects real tradeoffs. An organization can be honest without being reckless, and it can protect security without hiding key ethical facts. That nuance is central to modern data ethics.

11. A practical case brief students can submit

Teachers can assign a concise case brief with five sections: scenario summary, data map, stakeholder analysis, ethical risks, and redesign proposal. This format is manageable for most classes and easy to grade. It also mirrors professional memos used in policy, nonprofit, and operations settings.

Students should include at least one explicit fairness concern, one privacy concern, and one transparency improvement. They should also describe how the nonprofit could test their redesign, such as by piloting a new outreach policy with human review and measuring response quality instead of just conversion rate.
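The pilot-testing idea can be made tangible with a toy metrics function that reports response quality alongside conversion. All of the numbers below are fabricated; "opt-out rate" stands in for whatever quality signal a class chooses.

```python
# Hypothetical pilot-metrics sketch: report a simple quality signal
# (opt-out rate) next to conversion rate, as the brief suggests.
# All counts are fabricated for the exercise.

def pilot_metrics(asks, gifts, opt_outs):
    """Return (conversion_rate, opt_out_rate) for a pilot campaign."""
    conversion = gifts / asks
    opt_out_rate = opt_outs / asks
    return conversion, opt_out_rate

conv, oo = pilot_metrics(asks=200, gifts=18, opt_outs=4)
print(f"conversion {conv:.1%}, opt-out {oo:.1%}")
```

Students can then argue about tradeoffs: a redesign that lowers conversion slightly but cuts opt-outs may be the more trustworthy system.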

What strong student work looks like

Strong work typically names hidden assumptions, identifies at least one proxy variable, and explains why some donor segments may be over- or under-served. It also suggests a less intrusive alternative that still respects fundraising goals. The best submissions do not reject data; they refine its use.

That is the central pedagogical win of this module. Students learn that ethical design is not anti-technology. It is disciplined technology use in service of people, not just metrics.

How to connect the brief to portfolio-building

Students can revise the brief into a presentation slide deck, a one-page policy recommendation, or a short recorded explanation. Those formats become excellent portfolio pieces for college, internships, or entry-level roles. If you want to help students showcase the work professionally, pair it with portfolio guidance and project-based portfolio examples.

Teachers can also encourage students to compare this brief to other applied case writing from policy, product, or nonprofit operations settings.

Conclusion: why this module matters

Teaching data ethics through nonprofit fundraising gives students a rare combination of realism, complexity, and moral clarity. The scenario is familiar enough to feel accessible, but sophisticated enough to surface the core issues shaping modern AI systems: donor profiling, privacy, bias, transparency, and ethical design. Students do not just learn definitions; they practice judgment.

For teachers, this module is flexible. It can be used as a one-day discussion, a week-long instructional unit, or a capstone case study. It also connects naturally to career skills, since the same reasoning used here applies to analytics, communications, policy, and AI governance roles. If students can evaluate a fundraising model with nuance, they are already developing the kind of practical ethics employers value.

Pro Tip: Ask students to finish the lesson by writing one sentence that starts with “A fairer fundraising system would…” This tiny prompt often reveals whether they understand ethics as a design choice, not just a critique.

FAQ

What grade level is this student module best for?

This module works best for middle school, high school, community college, and introductory undergraduate classes. Teachers can simplify the language for younger students or add policy detail for older learners. The scenario is adaptable because the ethical questions are clear even when the technical depth changes.

Do students need AI knowledge before starting?

No. The lesson can begin with basic ideas about data collection, sorting, and prediction. AI concepts such as scoring and personalization are introduced as extensions of familiar decision-making rather than as advanced technical topics. That makes the unit accessible to mixed-ability classes.

How do I keep the discussion from becoming anti-nonprofit?

Frame the lesson as mission-supportive and improvement-oriented. Emphasize that nonprofits often want to do the right thing but may still need guardrails, better transparency, or fairer outreach design. The point is to strengthen trust, not attack charitable work.

What is the biggest ethical risk in donor profiling?

The biggest risk is usually the combination of hidden inference and unequal power. If a system predicts behavior using proxies and then escalates outreach in ways donors do not understand, the organization may cross from helpful targeting into manipulation or exclusion. Fairness and privacy concerns often appear together in that moment.

How can students show they understand transparency?

Have them rewrite a fundraising policy or donor notice in plain language. If they can explain what data is used, why it is used, and what choices people have, they likely understand transparency well. A strong answer is clear, specific, and respectful of the audience’s perspective.
