Arize Phoenix is an open-source LLM tracing & evaluation platform. Seamlessly instrument, experiment, and optimize AI applications in real time—trace, evaluate, and iterate.
The social mentions collected for this period do not contain reviews of the Phoenix software tool. They instead discuss Phoenix, Arizona, in the context of climate change and water issues, along with a GitHub repository report and projects that merely share the Phoenix name. No meaningful summary of user opinions on strengths, complaints, pricing, or reputation can be drawn from this sample.
- Mentions (30d): 1
- Reviews: 0
- Platforms: 4
- GitHub stars: 9,053 (775 forks)
Features

- Industry: information technology & services
- Employees: 130
- Funding stage: Series C
- Total funding: $131.0M
- GitHub followers: 442
- GitHub repos: 54
- GitHub stars: 9,053
- npm packages: 20
- HuggingFace models: 6
Arizona’s water is drying up. That’s not stopping the data center rush.
It’s no secret that Arizona is worried about its water. The [Colorado River is drying up](https://grist.org/politics/colorado-river-deal-trump-burgum/), [in part due to climate change](https://www.youtube.com/watch?v=AzpYHXgfbbI), and groundwater aquifers are running dry. Some of the state’s biggest industries are suffering as a result: Many farmers have been forced to rip up their cotton and alfalfa fields, and some home developers have been blocked from building new subdivisions.

A state with hydrologic woes of this magnitude would seem an unlikely place to attract new factory-scale industries, which often have substantial water appetites themselves, but over the past year that’s exactly what’s happened. So-called hyperscaler tech companies like Microsoft and Meta have swarmed in to build the data centers fuelling the artificial-intelligence boom, and the Taiwan Semiconductor Manufacturing Company has spent billions of dollars on a factory complex outside Phoenix.

This [rapid](https://www.reuters.com/sustainability/climate-energy/desert-storm-can-data-centres-slake-their-insatiable-thirst-water--ecmii-2025-12-17/) [development](https://fortune.com/2024/04/08/tsmc-water-usage-phoenix-chips-act-commerce-department-semiconductor-manufacturing/) has [triggered](https://www.azcentral.com/story/money/business/tech/2024/11/04/phoenix-provides-water-to-a-new-chipmaker-any-cause-for-worry/75917812007/?gnt-cfr=1&gca-cat=p&gca-uir=true&gca-epti=z1104xxe1104xxv004275d--47--b--47--&gca-ft=198&gca-ds=sophi) [fears](https://www.apmresearchlab.org/10x/data-centers-resource) that the industry will suck up the finite water supplies available to residents of Phoenix and Tucson. So far, however, these predictions have not come true. Even though Arizona will soon be home to nearly 200 data centers and chip factories, these facilities have not yet caused a major bump in the state’s water consumption. The companies’ precise effects on water supply are hard to discern due to their own secrecy about their water usage, but the aggregate picture suggests they have found ways to minimize their impact, whether through new cooling technologies or by recycling water on-site.

And despite [local](https://news.azpm.org/s/102502-marana-data-center-vote-sparks-backlash-three-residents-launch-council-runs/) [backlash](https://www.theguardian.com/us-news/2025/oct/15/tucson-arizona-ai-data-center-project-blue), water experts and many local officials appear to have largely made their peace with the industry’s arrival — and with the Phoenix region’s emergence as one of the nation’s largest AI infrastructure clusters. “There’s not a hair-on-fire context right now,” said Sarah Porter, a fellow at Arizona State University’s Kyl Center for Water Policy. “We just don’t see it.”

Arizona is home to [more than 150 data centers](https://www.datacentermap.com/usa/arizona/), according to an analysis from the Data Center Map, an industry resource. Each of these buildings contains thousands of servers that need to stay cool in the desert heat as they process computational queries. This cooling can be done with air conditioners, but it’s more efficient to surround them with pipes full of cold water, or to use evaporating mists to draw out hot air.
Cooling systems like these *can* consume a huge amount of water, but [no one knows](https://www.azcentral.com/story/money/business/tech/2026/02/04/arizona-data-centers-water-power-use/88054536007/?gnt-cfr=1&gca-cat=p&gca-uir=true&gca-epti=z119875p003550c003550e1185xxv119875d--55--b--55--&gca-ft=206&gca-ds=sophi) how much they *are* consuming. Independent estimates suggest that an average data center can use anywhere from [50,000](https://www.eenews.net/articles/states-push-to-end-secrecy-over-data-center-water-use/) to [5 million](https://www.eesi.org/articles/view/data-centers-and-water-consumption) gallons of water per day. An [analysis](https://www.ceres.org/resources/reports/drained-by-data-the-cumulative-impact-of-data-centers-on-regional-water-stress) from the sustainability advocacy organization Ceres estimated that the data centers active in Phoenix last summer used around 385 million gallons of water per year. Ceres projected that the metro area’s data center water consumption could grow tenfold, to around 3.8 billion gallons per year. But even that worst-case scenario would make data center usage equivalent to just around 1 percent of total [residential water consumption](https://www.azwater.gov/adwr-data-dashboards) in the Phoenix area — and less than half a percent of the region’s total 2024 water usage. (A comparison with agricultural usage is even more stark: Agriculture uses [more than 70 percent](https://environment.arizona.edu/news/where-does-our-water-come) of the state’s water, and still accounts for around 35 percent of water consumption even in the Phoenix metro, the state’s most urban region.) Furthermore, there’s some evidence that Ceres’ estimates may be too high. State data show that
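As a back-of-the-envelope check, the figures quoted above are mutually consistent; the sketch below uses only the numbers reported in the article, and the implied residential total is derived, not reported:

```python
# Quick arithmetic check of the Ceres figures quoted above (US gallons/year).
current_use = 385e6               # Phoenix-area data centers, per Ceres
projected_use = current_use * 10  # "could grow tenfold" -> ~3.85e9 (~3.8B)

# If ~3.8B gal/yr equals roughly 1 percent of residential use, residential
# use is implied to be on the order of 380 billion gallons per year.
implied_residential = projected_use / 0.01
print(f"projected: {projected_use:.2e} gal/yr")
print(f"implied residential: {implied_residential:.2e} gal/yr")
```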
I've created an MCP to build automations using Claude Code.
Hey there! Over the past few days, I’ve been building an MCP Server for my side project (Hooklistener), which lets you create any kind of automation. I’ve built all of this using Claude Code (it’s worth noting that I have a technical background). The backend is primarily Elixir and Phoenix. The workflow is always as follows:

1. Planning mode
2. Implementation phase (using specific agents; for example, I have some with specific instructions for working with Elixir code)
3. Once that’s done, I run the code-simplifier skill and perform a couple of rounds of validation.

The interesting thing about this is that it lets you create simple automations without even touching a UI. For example, imagine you need to send GitHub notifications to Telegram: you could do this directly from Claude Code. I'd appreciate your feedback!

submitted by /u/absoluterror
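To make the GitHub-to-Telegram example concrete, here is a minimal sketch of what such an MCP tool could look like. This is not the poster's Hooklistener code (their backend is Elixir/Phoenix); it assumes the official `mcp` Python SDK, the `httpx` library, and two hypothetical environment variables, TELEGRAM_BOT_TOKEN and TELEGRAM_CHAT_ID:

```python
# Hypothetical MCP server exposing one tool that forwards a GitHub
# notification message to a Telegram chat. Illustrative only.
import os

import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("github-to-telegram")

@mcp.tool()
def notify_telegram(message: str) -> str:
    """Send a GitHub notification message to a Telegram chat."""
    token = os.environ["TELEGRAM_BOT_TOKEN"]    # assumed to be set
    chat_id = os.environ["TELEGRAM_CHAT_ID"]    # assumed to be set
    resp = httpx.post(
        f"https://api.telegram.org/bot{token}/sendMessage",
        json={"chat_id": chat_id, "text": message},
    )
    resp.raise_for_status()
    return "sent"

if __name__ == "__main__":
    mcp.run()  # stdio transport, so a Claude Code session can launch it
```

Once registered, Claude Code can call the tool directly from a session, which matches the "automation without touching a UI" idea described in the post.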
sparX: Phoenix-powered X content skills/agents specifically for Claude Code
sparX is a collection of Claude Code skills, agents, and deep reference material on the X algorithm (Phoenix) that transforms Claude into a full X content studio: drafting, optimizing, scoring, scheduling, trend research, performance review, and visual content creation. I created this to draft optimized X posts with the help of Claude. It's completely open source, MIT licensed. Feedback is very welcome!

submitted by /u/RealEpistates
11.7B Claude tokens in 45 days. Here's every project it built — and what actually happened.
People kept asking what 9.3B tokens actually builds. The number is now 11.7B over 45 days. Here's the honest answer.

**What's real and running:**

- **Phoenix Traffic Intelligence** — Live traffic system on ADOT's AZ-511 feed. 8 Phoenix freeway corridors monitored 24/7. Cascade risk detection, weighted incident scoring (construction zones separated from real incidents), AI-generated crew dispatch recommendations, 2-minute sweep cycle. Already in conversation with City of Phoenix Office of Innovation and AZTech about a pilot.
- **Expression-Gated Consciousness** — A formal mathematical model for the gap between what people know and what they express. 44+ subjects, Pearson r=0.311, three discrete response types confirmed by data. Cold emailed Joshua Aronson (NYU, co-author of the foundational 1995 stereotype threat paper). He replied. Call is pending.
- **LOLM** — Custom transformer architecture built from scratch. Not fine-tuned. Original architecture targeting 10B–100B parameters on Google TPU Research Cloud.
- **Codey** — AI coding platform in development. Structural codebase analysis across 12 LLM providers.

$8,323 estimated API-equivalent compute. No team. No university. No funding. Phoenix, Arizona.

Full breakdown of how the tokens were used, what it cost by day, and how it compares to other documented heavy users: theartofsound.github.io/claude-usage-dashboard

Portfolio showing everything live: theartofsound.github.io/portfolio

If you want to talk about how I'm actually structuring sessions at this scale — multi-agent setups, context management, what burns tokens vs what doesn't — happy to get into it.

submitted by /u/OGMYT
Built a GUI overlay on native Claude Code terminals
I've been experimenting with Claude Code this week and built a GUI overlay on top of it – not a wrapper, not a chat layer. Looking for honest feedback before I open-source it. Full disclosure: some of what I built is probably already possible directly in Claude. I'm not claiming this is the only way to do it – I was just curious to see how far I could push the interface and what becomes possible when you add a visual layer on top. This is the result of a week of tinkering.

What it is: Claude Code terminals run natively underneath. The GUI listens to structured JSON returned from external API calls and renders dynamic visual screens on top – tables, editors, dashboards – without replacing the terminal. All visual screens are configurable by JSON format / style.

What I've built so far:

- Lead research – Describe your ICP, fetch leads, review in a visual table, run ICP scoring as a skill, push selected contacts to your CRM. All in one place.
- Landing page editor – Build and edit ad landing pages visually without leaving the interface. More of a WordPress feeling here.
- SEO/GEO analysis – Results rendered as a browsable overview. Draft and edit blog articles in a side panel from the same screen.
- Ad creative + campaign launcher – Load your brand workspace, preview generated ad variants, select, and launch a campaign directly from the GUI.
- Live meeting & call analysis – Toggle record during a prospect call or meeting and get live feedback as the conversation unfolds: talk ratio, objection signals, topic tracking, suggested next steps. No waiting for a post-call summary. Voice analyzed with Deepgram here.
- Website intelligence + auto brand context – An external API extracts everything from a client's website: active ads, page content, assets, copy tone. It auto-generates a brand voice skill and branding skill from that data. Switch workspaces and those skills are already loaded – every prompt is immediately in the right brand context.
- Team skill sync + auto-evolution – Skills are shared and synced across team members automatically. As the team works, skill files adapt based on internal feedback and real process outcomes – they update themselves over time rather than staying static.
- Multi-workspace switcher – Each workspace carries its own skills, tools, and MCP configs (Google Ads, Meta, LinkedIn, mailing accounts, etc. – configurable from external API or internal systems). Built with agencies in mind: 20–100 clients, clean context switching, no mess.

Who it's for: Sales, marketing, and customer support – specifically non-technical people who want to run serious AI-powered workflows with Claude Code but who are missing an interface to review data.

What I'm genuinely trying to figure out: Is any of this solving a real problem for you or your team? The live call analysis and auto-evolving team skills – useful in practice or over-engineered? What's the one thing that would make you actually use this daily?

Planning to open-source this. Not pitching anything – just collecting feedback before the release. Happy to drop a screen recording in the comments if there's interest.

submitted by /u/Few_Earth_1001
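The JSON-driven screens described above imply a simple contract between agent output and the renderer. A minimal sketch of that idea, with every field name invented for illustration (the post does not publish a schema):

```python
# Hypothetical "table" screen spec and a toy renderer. A real GUI would
# draw widgets; this prints plain text just to show the contract.
import json

screen_spec = json.loads("""
{
  "type": "table",
  "title": "Lead research results",
  "columns": ["name", "company", "icp_score"],
  "rows": [
    {"name": "Ada", "company": "Acme", "icp_score": 87},
    {"name": "Lin", "company": "Globex", "icp_score": 64}
  ]
}
""")

def render_table(spec: dict) -> str:
    """Render a 'table' screen spec as aligned plain text."""
    cols = spec["columns"]
    lines = [spec["title"], " | ".join(cols)]
    for row in spec["rows"]:
        lines.append(" | ".join(str(row[c]) for c in cols))
    return "\n".join(lines)

print(render_table(screen_spec))
```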
The Luminous Vanguard of the Imperial Dominion
Forged in unbreakable carbon steel and illuminated by the empire’s sacred energy, the Luminous Vanguard represents the highest echelon of the Imperial Army—fourteen commanders chosen not only for their strength, but for their unyielding loyalty and symbolic purpose. Each officer bears a distinct armor set, infused with radiant light-strips that pulse like a living force—signifying rank, specialization, and battlefield authority. Their right-arm insignias are not mere decoration, but ancient emblems of power: dragons, phoenixes, beasts, and mythical creatures that embody the spirit of their command. Together, they form an unstoppable war council:

• The Black Dragon Commander – Master of annihilation and fear, striking from shadows with ruthless precision.
• The White Phoenix Marshal – Symbol of rebirth and strategy, rising stronger from every defeat.
• The Silver Hawk Overseer – Eyes of the empire, unmatched in reconnaissance and aerial dominance.
• The Golden Sovereign – The embodiment of imperial authority, leading with absolute command.
• The Emerald Serpent General – Specialist in stealth warfare and silent elimination.
• The Crimson Flame Warden – Bringer of devastation, wielding overwhelming offensive force.
• The Azure Tide Commander – Controller of fluid tactics and battlefield adaptation.
• The Infernal Beast Captain – Aggression incarnate, thriving in chaos and close combat.
• The Obsidian Lion Sentinel – Guardian of the empire, unbreakable and immovable.
• The Violet Revenant – A ghost of war, feared for relentless pursuit and silent judgment.
• The Radiant Gold Executor – Enforcer of imperial law, delivering swift and absolute justice.
• The Shadow Eclipse Knight – Operates beyond sight, mastering deception and psychological warfare.
• The Scarlet Wolf Lord – Leader of elite strike packs, fierce and loyal to the end.
• The Amethyst Warlord – The final authority in battle, cloaked in mystery and unmatched power.

Bound by honor, enhanced by technology, and driven by a single purpose—the expansion and protection of the empire—these fourteen stand as living legends. Where their lights shine… resistance falls.

submitted by /u/Dark-Rose19
Weekly Report Mar 2 -- Mar 9, 2026
# Weekly Report: Mar 2 -- Mar 9, 2026

## Quick Stats

| Metric | Count |
|--------|-------|
| Merged PRs | 47 |
| Open PRs | 24 (11 draft) |
| Open issues | 61 |
| New issues this week | 33 |
| Issues closed this week | 6 |
| CI runs on main | 30 |

## Highlights

An exceptionally active week with 47 merged PRs. Key themes:

- **Realm migration**: Keycloak master-to-kagenti realm migration landed (#764), with follow-up fixes (#851, #863)
- **Platform hardening**: Podman support (#861), Docker Hub rate limit fixes (#844), PostgreSQL mount fix (#852)
- **CI/CD improvements**: OpenSSF Scorecard 7.1->8+ (#807), stale workflow permissions (#859), HyperShift cluster auto-cleanup (#854)
- **New capabilities**: CLI/TUI for Kagenti (#835), Istio trace export to OTel (#795), RHOAI 3.x integration (#809)
- **Dependency updates**: 8 Dependabot PRs (Docker actions major bumps, CodeQL, Trivy)
- **Authorization epic**: 7 new issues (#787-#794) laying out a comprehensive authorization and policy framework
- **Agent sandbox epic**: New epic (#820) for platform-owned sandboxed agent runtime

## Issue Analysis

### Epics (active initiatives)

| # | Title | Owner | Status |
|---|-------|-------|--------|
| #862 | AgentRuntime CR — CR-triggered injection | @cwiklik | New, design phase |
| #820 | Platform-Owned Sandboxed Agent Runtime | @Ladas | Active, PR #758 in progress |
| #828 | Migrate installer from Ansible/Helm to Operator | @pdettori | New, planning |
| #787 | Authorization, Policies, and Access Management | @mrsabath | New, 6 sub-issues filed |
| #841 | Org-wide orchestration: CI, tests, security | @Ladas | Active, PRs #866-#868 open |
| #767 | Migrate from Keycloak master realm | @mrsabath | Mostly done (#764 merged), close candidate |
| #619 | Tracing observability PoC | @evaline-ju | Active (#795 merged) |
| #621 | OpenSSF Scorecard to 10/10 | @Ladas | Active (#807 merged, now 8+) |
| #523 | Refactor APIs for Compositional Architecture | @pdettori | Active, PR #770 open |
| #518 | OpenShift AI deployment issues | @Ladas | Active (#809 merged) |
| #309 | Full Coverage E2E Testing | @cooktheryan | Ongoing |
| #440 | Multi-Team Deployment on RHOAI | @Ladas | Ongoing |
| #439 | Namespace-Based Token Usage Quotas | @Ladas | Ongoing |
| #614 | Feedback review community meeting | @Ladas | Stale (>30d no update) |
| #623 | Identify Emerging Agentic Deployment Patterns | @kellyaa | Stale |
| #612 | Agent Attestation Framework | @mrsabath | Stale, PR #613 still draft |

### Security-Adjacent Issues

| # | Title | Status | Recommendation |
|---|-------|--------|----------------|
| #822 | Keycloak configmap should be secret | Open | High priority — credentials in configmap |
| #106 | Replace hardcoded secret with SPIRE identity | Open | Long-standing, PR #769 in draft |
| #333 | SPIFFE ID missing checks | Open | Stale, needs triage |
| #267 | Replace hard-coded Client Secret File path | Open | Good first issue, needs assignee |

### Bug Reports

| # | Title | Still affects main? | PR exists? | Recommendation |
|---|-------|---------------------|------------|----------------|
| #856 | Warnings during Kagenti install | Likely yes | No | Triage — install warnings |
| #855 | Can't checkout source on Windows | Yes (skill naming) | PR #869 | In progress |
| #829 | Deleting A2A agent doesn't delete HTTPRoute | Likely yes | No | Needs fix |
| #826 | No way to log out of Kagenti | Yes | No | UX bug, needs fix |
| #825 | Build failures lead to stuck state | Likely yes | No | Needs investigation |
| #738 | UI drops spire label on 2nd deploy | Likely yes | No | Stale (>30d) |
| #486 | Installer issues (Postgres/Phoenix) | Partially (#852 fixed PG) | Partial | Re-verify Phoenix |
| #781 | kagenti-deps fails on OCP 4.19 | Unknown | No | Stale, needs triage |
| #606 | Unsupported Helm version | Unknown | No | Stale, needs triage |
| #655 | Duplicated resources between repos | Unknown | No | Stale, needs triage |

### Issues Closed This Week (good velocity)

| # | Title | Fix PR |
|---|-------|--------|
| #833 | UI login fails after realm migration | #834 |
| #831 | --preload fails when images cached | #832 |
| #819 | Remove deprecated Component CRD refs | #818 |
| #813 | Import env vars references bad URL | #821 |
| #810 | Import env vars silently fails on dup | #821 |
| #804 | OAuth secret job SSL error on OCP | #805 |

### Feature Requests

| # | Title | Priority | Recommendation |
|---|-------|----------|----------------|
| #858 | Use new URL for fetching Agent Cards | Medium | Good first issue |
| #836 | AuthBridge sidecar opt-out controls in UI | Medium | Tied to #862 epic |
| #824 | Help text for UI fields | Low | Good UX improvement |
| #823 | Examples as suggestions in UI | Low | Nice-to-have |
| #817 | Auto-add issues/PRs to project board | Medium | PR #870 open |
| #814 | Mechanism to update agent via K8s | Medium | Operator feature |
| #786 | Register MCP servers from UI | Medium | UI feature |
| #783 | Agent card signing/verifica
December 22, 2025
*David Sathuluri is a Research Associate and Dr. Marco Tedesco is a Lamont Research Professor at the Lamont-Doherty Earth Observatory of Columbia University.*

**As climate scientists warn that we are approaching irreversible tipping points in the Earth’s climate system, paradoxically the very technologies being deployed to detect these tipping points – often based on AI – are exacerbating the problem, via acceleration of the associated energy consumption.**

The UK’s much-celebrated £81-million ($109-million) [Forecasting Tipping Points programme](https://www.theguardian.com/environment/2025/feb/18/early-warning-system-for-climate-tipping-points-given-81m-kickstart) involving 27 teams, led by the Advanced Research + Invention Agency (ARIA), represents a contemporary faith in technological salvation – yet it embodies a profound contradiction. The ARIA programme explicitly aims to “harness the laws of physics and artificial intelligence to pick up subtle early warning signs of tipping” through advanced modelling. We are deploying massive computational infrastructure to warn us of climate collapse while these same systems consume the energy and water resources needed to prevent or mitigate it. We are simultaneously investing in computationally intensive AI systems to monitor whether we will cross irreversible climate tipping points, even as these same AI systems could fuel that transition.

## The computational cost of monitoring

Training a single large language model like GPT-3 consumed approximately 1,287 megawatt-hours of electricity, resulting in 552 metric tons of carbon dioxide – equivalent to driving 123 gasoline-powered cars for a year, according to a recent [study](https://arxiv.org/ftp/arxiv/papers/2104/2104.10350.pdf). GPT-4 required roughly [50 times](https://www.weforum.org/stories/2024/07/generative-ai-energy-emissions/) more electricity. As the computational power needed for AI continues to double approximately every 100 days, the energy footprint of these systems is not static but is exponentially accelerating.

> **[UN adopts first-ever resolution on AI and environment, but omits lifecycle](https://www.climatechangenews.com/2025/12/12/un-adopts-first-ever-resolution-artificial-intelligence-ai-environment-lifecycle-unea/)**

And the environmental consequences of AI models extend far beyond electricity usage. Besides massive amounts of electricity (much of which is still fossil-fuel-based), such systems require advanced cooling that consumes enormous quantities of water, and sophisticated infrastructure that must be manufactured, transported, and deployed globally.

## The water-energy nexus in climate-vulnerable regions

A single data center can consume up to [5 million](https://utulsa.edu/news/data-centers-draining-resources-in-water-stressed-communities/#%3A%7E%3Atext=Unfortunately%2C+many+data+centers+rely+on+water-intensive%2Cto+supply+thousands+of+households+or+farms.) gallons of drinking water per day – sufficient to supply thousands of households or farms. In the Phoenix area of the US alone, more than [58 data centers](https://utulsa.edu/news/data-centers-draining-resources-in-water-stressed-communities/) consume an estimated 170 million gallons of drinking water daily for cooling. The geographical distribution of this infrastructure matters profoundly, as data centers requiring high rates of mechanical cooling are disproportionately located in water-stressed and socioeconomically vulnerable regions, particularly in Asia-Pacific and Africa.
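Two of the figures above can be sanity-checked with quick arithmetic; this sketch uses only the numbers quoted in the essay:

```python
# "Doubling approximately every 100 days" compounds to roughly 12.6x per year:
annual_growth = 2 ** (365 / 100)
print(f"annual growth factor: {annual_growth:.1f}x")  # ~12.6x

# 58 Phoenix-area data centers using ~170 million gallons/day implies
# roughly 2.9 million gallons/day each, within the 50,000-5M gal/day
# range cited elsewhere in this report:
per_center = 170e6 / 58
print(f"per data center: {per_center:.2e} gal/day")  # ~2.93e6
```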
At the same time, we are deploying AI-intensive early warning systems to monitor climate tipping points in regions like Greenland, the Arctic, and the Atlantic circulation system – regions already experiencing catastrophic climate impacts. They represent thresholds that, once crossed, could trigger irreversible changes within decades, scientists have warned.

> **[Nine of our best climate stories from 2025](https://www.climatechangenews.com/2025/12/22/nine-of-our-best-climate-stories-from-2025/)**

Yet computational models and AI-driven early warning systems operate according to different temporal logics. They promise to provide warnings that enable future action, but they consume energy – and therefore contribute to emissions – in the present. This is not merely a technical problem to be solved with renewable energy deployment; it reflects a fundamental misalignment between the urgency of climate tipping points and the gradualist assumptions embedded in technological solutions.

The carbon budget concept reveals that emissions have a cumulative effect on temperature rise, with significant lags between atmospheric concentration and temperature impact. Every megawatt-hour consumed by AI systems training on climate models today directly reduces the available carbon budget for tomorrow – including the carbon budget available for the energy transition itself.

## The governance void

The deeper issue is that governance frameworks
Repository Audit Available
Deep analysis of Arize-ai/phoenix — architecture, costs, security, dependencies & more
Pricing found: $50, $10, $3
Key features include: flexible, transparent, and free from lock-in; iterate on your LLM workflow and deploy to production with confidence; trace, evaluate, iterate.
Phoenix has a public GitHub repository with 9,053 stars.
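As an illustration of the "Trace, Evaluate, Iterate" workflow, here is a minimal sketch of launching Phoenix locally and wiring up tracing, based on the project's documented Python API (`pip install arize-phoenix`); exact entry points may vary between versions:

```python
# Minimal sketch of local Phoenix tracing; assumes the arize-phoenix package.
import phoenix as px
from phoenix.otel import register

# Start the local Phoenix UI (collector + trace viewer).
session = px.launch_app()
print(session.url)

# Register an OpenTelemetry tracer provider that exports spans to Phoenix.
tracer_provider = register(project_name="demo")
tracer = tracer_provider.get_tracer(__name__)

# Any spans created through this tracer now appear in the Phoenix UI.
with tracer.start_as_current_span("llm-call"):
    pass  # call your LLM / retrieval code here
```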
Based on user reviews and social mentions, the most commonly cited pain point is token usage.
Of the 14 social mentions analyzed, sentiment is 100% neutral (0% positive, 0% negative).
Chris Urmson, CEO at Aurora Innovation (1 mention)