The global surge in demand for artificial intelligence and cloud computing has driven a rapid expansion of data centers. In 2023, U.S. data centers consumed 176 terawatt-hours (TWh) of electricity, 4.4% of the nation’s total, a share projected to reach as much as 12% by 2028. As data centers become more energy-intensive, they contribute significantly to grid instability and greenhouse gas emissions. The purpose of this research is to design and simulate a carbon-aware workload scheduling algorithm that reduces emissions and energy costs by dynamically adjusting when and where jobs run, using environmental and grid data.

The methodology involves simulating AI job-level workloads using open-source workload profiles (e.g., MLPerf benchmarks) and combining them with ComStock data to estimate baseline building-level energy use and thermal load patterns representative of data centers in California. These simulated workloads are integrated with time-series datasets from the California Independent System Operator (CAISO) for hourly grid carbon intensity, Open-Meteo for local temperature forecasts, and PG&E’s time-of-use electricity pricing. The scheduling algorithm, developed in Python, prioritizes non-urgent computing tasks during periods of lower carbon intensity, cooler ambient temperatures (to reduce cooling energy needs), and lower electricity rates. It uses a hybrid decision engine that combines rule-based logic with machine learning to forecast optimal scheduling windows up to 24 hours in advance.

To evaluate its effectiveness, simulations compare baseline (unscheduled) workloads against carbon-optimized schedules using key metrics such as estimated CO₂ emissions, total energy cost, and cooling demand. Initial simulations suggest the algorithm could reduce carbon emissions and energy costs by 5–10% for flexible workloads, with even greater gains during periods of high renewable output or extreme heat.
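The rule-based portion of such a scheduler can be illustrated with a minimal sketch. The snippet below is not the algorithm described above but a simplified stand-in: it scores each forecast hour by a weighted blend of carbon intensity, ambient temperature, and electricity price, then picks the cheapest contiguous window for a deferrable job. All weights, normalization bounds, and the `HourlyConditions` structure are illustrative assumptions, not values from the research.

```python
from dataclasses import dataclass

@dataclass
class HourlyConditions:
    hour: int                 # hour of day, 0-23
    carbon_intensity: float   # gCO2/kWh (e.g., from CAISO hourly data)
    temperature_c: float      # ambient temperature (e.g., from Open-Meteo)
    price_per_kwh: float      # $/kWh (e.g., a PG&E time-of-use rate)

def score(h: HourlyConditions,
          w_carbon: float = 0.5,
          w_temp: float = 0.2,
          w_price: float = 0.3) -> float:
    """Lower score = better hour to run. Weights and bounds are illustrative."""
    carbon = h.carbon_intensity / 500.0      # assumed upper bound for CA grid
    temp = max(h.temperature_c, 0.0) / 45.0  # hotter air -> more cooling energy
    price = h.price_per_kwh / 0.60           # assumed peak TOU rate
    return w_carbon * carbon + w_temp * temp + w_price * price

def best_window(forecast: list[HourlyConditions], duration: int) -> int:
    """Return the start hour of the lowest-scoring contiguous window."""
    totals = [sum(score(h) for h in forecast[i:i + duration])
              for i in range(len(forecast) - duration + 1)]
    return totals.index(min(totals))
```

In the full system described above, the static weights would be replaced by a learned forecast of the 24-hour scheduling horizon; this sketch shows only the window-selection logic that the rule-based layer provides.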
Scaled across major data centers, this approach could prevent millions of tons of CO₂ emissions annually, ease strain on electrical grids, and support more efficient integration of renewable energy. This research offers a practical, scalable strategy for reducing the environmental footprint of AI computing, one that aligns digital infrastructure with global climate goals.