Upheal’s approach to the environmental impact of AI

At Upheal, we take the environmental impact of AI seriously. We believe innovation should move forward responsibly, and that sustainability needs to be part of how new technologies are built and scaled.
Our infrastructure runs on major cloud providers, including AWS, Google Cloud, and Microsoft Azure, all of which have public commitments around energy efficiency and water sustainability, with goals to become water positive by 2030. Upheal relies entirely on this third-party infrastructure to deliver our services, and our usage represents only a very small fraction of their overall capacity.
From a product perspective, we use a combination of proprietary AI models and third-party large language models, such as Gemini and GPT. We continuously optimize our systems to be more lightweight and efficient, reducing unnecessary compute usage and minimizing environmental impact as usage scales.
We also benefit from ongoing infrastructure improvements made by our cloud partners. For example, Microsoft has introduced advanced server cooling techniques that significantly reduce energy consumption and eliminate the need for water-based cooling in certain data centers.
While no AI system is impact-free, we are committed to making thoughtful infrastructure choices and ongoing efficiency improvements as part of building AI that is both useful and responsible.
Putting AI’s energy impact in perspective
Many of the alarming statistics circulating about AI’s environmental impact are based on older assumptions that no longer reflect today’s models and hardware. A recent analysis from Epoch AI estimates that a typical ChatGPT query (using GPT-4o) uses around 0.3 watt-hours (Wh), about 10× lower than the widely cited ‘3 Wh per query’ figure. The difference comes from more efficient models and chips, and from earlier estimates that assumed unusually long prompts and responses.
In real terms, the figure of 0.3 Wh is less electricity than an LED lightbulb or a laptop uses in a few minutes. Even for a heavy AI user, this is a small fraction of everyday household electricity use. For context, the same Epoch AI piece cites an average US household at ~10,500 kWh/year, or ~28,000 Wh/day.
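As a rough back-of-the-envelope check, the sketch below combines the figures cited above; the 100-queries-per-day usage level is an illustrative assumption for a heavy user, not a measured statistic.

```python
# Back-of-the-envelope: heavy daily ChatGPT use vs. average household electricity.
# The per-query and household figures are the Epoch AI numbers cited above;
# queries_per_day is an illustrative assumption, not a measured statistic.

ENERGY_PER_QUERY_WH = 0.3      # typical GPT-4o query, per Epoch AI
HOUSEHOLD_WH_PER_DAY = 28_000  # ~10,500 kWh/year for an average US household

queries_per_day = 100          # assumed "heavy user"
ai_wh_per_day = queries_per_day * ENERGY_PER_QUERY_WH

print(f"AI usage: {ai_wh_per_day:.0f} Wh/day")                                 # 30 Wh/day
print(f"Share of household use: {ai_wh_per_day / HOUSEHOLD_WH_PER_DAY:.2%}")   # ~0.11%
```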
Impact can climb when usage is not ‘typical’: very long inputs (such as uploading or pasting large documents), very long outputs, or ‘reasoning’-style interactions. Epoch AI notes that long-input requests can raise the energy used per interaction, because processing very large context windows is much more compute-intensive than a short chat.
Many people stream hours of video (on services such as Netflix) every day without worrying about the environmental implications, yet the impact of one hour of Netflix is roughly comparable to asking ChatGPT 100 questions. The analysis behind this comparison argues that even a high-end estimate such as ‘ChatGPT uses as much energy as ~20,000 households’ sounds dramatic but is small compared to video streaming at global scale. The same piece cites Netflix’s reported data-center electricity use and points out that streaming is a far larger category overall, simply because so many people stream video every day.
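For completeness, the equivalence in that comparison can be checked against the same per-query figure; the per-hour streaming number below is simply what the comparison implies, not an independent measurement.

```python
# What "one hour of Netflix ≈ 100 ChatGPT questions" implies, using the
# 0.3 Wh/query estimate cited earlier. The resulting per-hour streaming
# figure follows from the comparison itself, not from a separate measurement.

ENERGY_PER_QUERY_WH = 0.3
QUERIES_PER_STREAMING_HOUR = 100

implied_wh_per_hour = QUERIES_PER_STREAMING_HOUR * ENERGY_PER_QUERY_WH
print(f"Implied energy per hour of streaming: {implied_wh_per_hour:.0f} Wh "
      f"({implied_wh_per_hour / 1000:.2f} kWh)")  # 30 Wh (0.03 kWh)
```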
Upheal’s approach to reducing AI’s environmental impact
At Upheal, we believe AI should meaningfully improve clinicians’ lives without creating unnecessary harm elsewhere. That includes being thoughtful about the environmental footprint of the technology we build and rely on. While AI infrastructure inevitably consumes energy, we believe responsible choices and intentional design can significantly reduce its impact.
Choosing responsible infrastructure partners
Upheal is intentional about the technology partners we work with. We prioritize cloud and infrastructure providers that are actively investing in renewable energy, energy efficiency, and long-term sustainability initiatives.
This means favoring partners that are:
- Committed to transitioning their data centers to renewable or low-carbon energy sources
- Publicly accountable for sustainability goals and progress
- Investing in more efficient data center design, cooling, and energy usage
While we do not operate our own data centers, we believe vendor choice matters, and we see this as one of the most direct ways we can influence environmental outcomes today.
Building efficient AI by design
Rather than training large models unnecessarily or running compute-heavy processes by default, Upheal focuses on:
- Using efficient, task-specific models where possible
- Leveraging existing models and fine-tuning them instead of training from scratch
- Continuously evaluating whether new AI features justify their computational cost
Our goal is to deliver real clinical value without excess computation, keeping AI helpful, targeted, and efficient.
Supporting innovation that reduces resource use
We closely follow advances in AI efficiency and infrastructure sustainability, including:
- Improved model optimization techniques that reduce energy consumption
- New data-center cooling approaches that lower electricity and water usage
- Carbon offset and mitigation programs that help balance unavoidable emissions
As these approaches mature and become practical at scale, we expect them to play a larger role in how AI systems are deployed responsibly.
Aligning with emerging standards and regulation
We support the direction of emerging regulations and industry standards aimed at making digital technology more sustainable.
Clear benchmarks, reporting standards, and environmental accountability help ensure that AI development moves in a direction that balances innovation with responsibility. We see this as a positive step for the industry and for the clinicians and clients who rely on these tools.
A balanced perspective: challenges and opportunities
The challenge of demand
As AI applications expand into healthcare, finance, and everyday life, demand for computing power will only grow. This makes ongoing efficiency improvements essential to avoid an unsustainable trajectory. At the same time, AI itself may create opportunities to support sustainability and environmental efforts.
AI’s role in environmental solutions
AI isn’t just part of the problem—it’s also part of the solution. Many AI-driven applications are tackling climate challenges, including:
- Optimizing renewable energy grids to improve solar and wind power efficiency.
- Reducing waste in manufacturing by streamlining production and minimizing resource use.
- Predicting extreme weather patterns to help mitigate disasters.
The bigger picture
No technology is completely free of environmental costs, but AI, like other innovations, can be developed and used responsibly. A sustainable future isn’t about stopping AI—it’s about using it wisely, efficiently, and to our advantage.
Moving toward responsible AI
AI has the potential to drive both progress and unintended environmental consequences. The challenge is real, but so are the efforts to make AI more sustainable.
- Continued innovation in green computing will be key.
- Investment in renewable energy and efficient AI models can make a difference.
- Public awareness and industry accountability will shape the future of AI sustainability.
With the right balance of technological advancements and environmental responsibility, AI can be a tool for positive change, not just in how we work, but in how we care for our planet.



