How to Access NASA’s Climate Data — And How It’s Powering the Fight Against Climate Change, Pt. 1


I can’t think of a more essential dataset. Just today, I saw a headline like this: ‘Heat Waves Are Getting More Dangerous with Climate Change.’ You can’t say we haven’t been warned. Back in 1988, we saw headlines like this: ‘Global Warming Has Begun, Expert Tells Senate.’ And while data science has played its role in revealing that we’ll likely surpass the 1.5 °C goal set by the Paris Agreement, there’s far more we could be doing. For one, many people don’t believe it, yet the data is readily available, free, and easy to access. You can check it yourself! So in this episode, we’ll do exactly that. We’ll also talk about the surprising and interesting ways this data is currently being used to combat the effects of climate change.

But climate data is also incredibly interesting. You’ve probably seen headlines like: ‘Blue Origin launch of 6 people to suborbital space delayed again due to weather.’ Which makes you think: if we can send someone to the moon, why can’t we be sure about the weather? If ‘difficult’ doesn’t describe the problem, a stronger word might. From a data science perspective, this is our Riemann Hypothesis, our P vs NP problem. How well we can model and understand climate data will shape our coming decades on this earth. This is the most important problem we could be working on.

And while New York just went through a heat wave, it’s important to note that climate change means more than just hotter weather.

  • Failing harvests undermine global food security, especially in vulnerable regions.
  • Vector-borne diseases expand into latest regions as temperatures rise.
  • Mass extinctions disrupt ecosystems and erode planetary resilience.
  • Ocean acidification unravels marine food chains, threatening fisheries and biodiversity.
  • Freshwater supplies dwindle under the pressure of drought, pollution, and overuse.

But not all is lost; we’ll discuss some of the ways data has been used to address these problems. Here’s a summary of some of the data NASA keeps track of. We will access a few of these parameters.

Image by Author

Getting the data

We’ll start by picking some interesting locations to examine in this series. All we need are their coordinates — a click away on Google Maps. I use quite a few decimal places here, but the meteorological data source’s resolution is ½° × ⅝°, so there’s no need to be this accurate.

interesting_climate_sites = {
    "Barrow, Alaska (Utqiaġvik)": (71.2906, -156.7886),    # Arctic warming, permafrost melt
    "Greenland Ice Sheet": (72.0000, -40.0000),            # Glacial melt, sea level rise
    "Amazon Rainforest (Manaus)": (-3.1190, -60.0217),     # Carbon sink, deforestation impact
    "Sahara Desert (Tamanrasset, Algeria)": (22.7850, 5.5228),  # Heat extremes, desertification
    "Sahel (Niamey, Niger)": (13.5128, 2.1127),            # Precipitation shifts, droughts
    "Sydney, Australia": (-33.8688, 151.2093),             # Heatwaves, bushfires, El Niño sensitivity
    "Mumbai, India": (19.0760, 72.8777),                   # Monsoon variability, coastal flooding
    "Bangkok, Thailand": (13.7563, 100.5018),              # Sea-level rise, heat + humidity
    "Svalbard, Norway": (78.2232, 15.6469),                # Fastest Arctic warming
    "McMurdo Station, Antarctica": (-77.8419, 166.6863),   # Ice loss, ozone hole proximity
    "Cape Town, South Africa": (-33.9249, 18.4241),        # Water scarcity, shifting rainfall
    "Mexico City, Mexico": (19.4326, -99.1332),            # Air pollution, altitude-driven weather
    "Reykjavík, Iceland": (64.1355, -21.8954),             # Glacial melt, geothermal dynamics
}

Next, let’s select some parameters. You can flip through them in the Parameter Dictionary: https://power.larc.nasa.gov/parameters/

Image by Author

You can only request parameters from one community at a time, so we group the parameters by community.

community_params = {
    "AG": ["T2M","T2M_MAX","T2M_MIN","WS2M","ALLSKY_SFC_SW_DWN","ALLSKY_SFC_LW_DWN",
           "CLRSKY_SFC_SW_DWN","T2MDEW","T2MWET","PS","RAIN","TS","RH2M","QV2M","CLOUD_AMT"],
    "RE": ["WD2M","WD50M","WS50M"],
    "SB": ["IMERG_PRECTOT"]
}

How is this data used?

  • AG = Agricultural. Agroeconomists typically use this community in crop growth models, such as DSSAT and APSIM, as well as in irrigation planners like FAO CROPWAT. It’s also used for livestock heat stress assessment and for building food security early warning systems. This helps mitigate food insecurity caused by climate change. The data follows agroeconomic conventions, allowing it to be ingested directly by agricultural decision-support tools.
  • RE = Renewable Energy. Given the name and the fact that you can get wind speed data here, you can probably guess its use. This data is primarily used to forecast long-term energy yields: wind speed for turbines, solar radiation for solar farms. It can be fed into PVsyst, NREL SAM, and WindPRO to estimate annual energy yields and costs, supporting everything from rooftop array design to national clean energy targets.
  • SB = Sustainable Buildings. Architects and HVAC engineers use this data to ensure their buildings comply with energy performance regulations like IECC or ASHRAE 90.1. It can be dropped directly into EnergyPlus, OpenStudio, RETScreen, or LEED/ASHRAE compliance calculators to verify buildings are up to code.
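As a tiny illustration of the AG community’s use, here is a sketch of a livestock heat stress check built from T2M and RH2M. The THI formula is the commonly cited NRC-style index, and the 72-point stress threshold is an assumption for illustration, not something taken from the POWER data itself:

```python
import pandas as pd

def temperature_humidity_index(t2m_c: pd.Series, rh2m_pct: pd.Series) -> pd.Series:
    """NRC-style THI: (1.8*T + 32) - (0.55 - 0.0055*RH) * (1.8*T - 26), T in C, RH in %."""
    return (1.8 * t2m_c + 32) - (0.55 - 0.0055 * rh2m_pct) * (1.8 * t2m_c - 26)

# Made-up daily readings: a mild day and a hot, humid day
df = pd.DataFrame({"T2M": [18.0, 32.0], "RH2M": [40.0, 80.0]})
df["THI"] = temperature_humidity_index(df["T2M"], df["RH2M"])
df["heat_stress"] = df["THI"] > 72  # threshold often cited for cattle (assumed here)
print(df.round(1))
```

The mild day scores a THI of about 62 (no stress), while the hot, humid day reaches about 86, well into heat stress territory.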

Now we pick a start and end date. 

start_date = "19810101"
end_date   = "20241231"

To make the API call repeatable, we wrap it in a function. We will work with daily data, but if you prefer yearly, monthly, or even hourly data, you only need to change the URL to

…/temporal/{resolution}/point.
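For instance, a sketch of how the endpoint URL varies with resolution (the resolution names hourly, daily, monthly, and climatology are assumptions based on the POWER API’s documented temporal endpoints):

```python
BASE = "https://power.larc.nasa.gov/api/temporal/{resolution}/point"

# Print the endpoint for each temporal resolution
for resolution in ("hourly", "daily", "monthly", "climatology"):
    print(BASE.format(resolution=resolution))
```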

import requests
import pandas as pd

def get_nasa_power_data(lat, lon, parameters, community, start, end):
    """
    Fetch day by day data from NASA POWER API for given parameters and site.
    Dates should be in YYYYMMDD format (e.g., "20100101", "20201231").
    """
    url = "https://power.larc.nasa.gov/api/temporal/day by day/point"
    params = {
        "parameters": ",".join(parameters),
        "community": community,
        "latitude": lat,
        "longitude": lon,
        "start": start,
        "end": end,
        "format": "JSON"
    }
    response = requests.get(url, params=params)
    data = response.json()

    if "properties" not in data:
        print(f"Error fetching {community} data for lat={lat}, lon={lon}: {data}")
        return pd.DataFrame()

    # Build one DataFrame per parameter, then combine
    param_data = data["properties"]["parameter"]
    dfs = [
        pd.DataFrame.from_dict(values, orient="index", columns=[param])
        for param, values in param_data.items()
    ]
    df_combined = pd.concat(dfs, axis=1)
    df_combined.index.name = "Date"
    return df_combined.sort_index().astype(float)

This function retrieves the requested parameters from the specified community and converts the JSON response into a DataFrame. Every valid response contains a properties key — if it’s missing, we print an error and return an empty DataFrame.

Let’s call this function in a loop to fetch the data for all our locations.

all_data = {}
for city, (lat, lon) in interesting_climate_sites.items():
    print(f"Fetching day by day data for {city}...")
    city_data = {}
    for community, params in community_params.items():
        df = get_nasa_power_data(lat, lon, params, community, start_date, end_date)
        city_data[community] = df
    all_data[city] = city_data

Right now, our data is a dictionary whose values are themselves dictionaries: city → community → DataFrame.
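To see the nesting for yourself, a quick inspection loop helps; the small stand-in dictionary below is illustrative and only mimics the shape of all_data:

```python
import pandas as pd

# Illustrative stand-in mimicking all_data's shape: {city: {community: DataFrame}}
all_data = {
    "Sydney, Australia": {
        "AG": pd.DataFrame({"T2M": [21.5, 22.1]}, index=["20240101", "20240102"]),
        "RE": pd.DataFrame({"WS50M": [5.2, 4.8]}, index=["20240101", "20240102"]),
    }
}

# Walk the nesting: city -> community -> DataFrame shape and columns
for city, comms in all_data.items():
    print(city)
    for community, df in comms.items():
        print(f"  {community}: {df.shape[0]} rows x {df.shape[1]} cols {list(df.columns)}")
```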

This makes the data awkward to work with. Next, we combine everything into one DataFrame: we join the communities on the date index, then concatenate the cities. Since there were no missing values, an inner join would yield the same result.

# 1) For every city, join its communities on the date index
city_dfs = {
    city: comms["AG"]
                .join(comms["RE"], how="outer")
                .join(comms["SB"], how="outer")
    for city, comms in all_data.items()
}

# 2) Concatenate into one MultiIndexed DF: index = (City, Date)
combined_df = pd.concat(city_dfs, names=["City", "Date"])

# 3) Reset the index so City and Date become columns
combined_df = combined_df.reset_index()

# 4) Bring latitude/longitude in as columns
coords = pd.DataFrame.from_dict(
    interesting_climate_sites, orient="index", columns=["Latitude", "Longitude"]
).reset_index().rename(columns={"index": "City"})

combined_df = combined_df.merge(coords, on="City", how="left")

# then save into your Drive folder
combined_df.to_csv('/content/drive/MyDrive/climate_data.csv', index=False)

If you’re tired of coding for the day, you can also use their Data Access Viewer. Just click anywhere on the map to retrieve the data. Here I clicked on Venice. Then select a Community, a Temporal Average, and your preferred file type (CSV, JSON, ASCII, NetCDF), and hit Submit. A few clicks and you can get all the weather data in the world.

https://power.larc.nasa.gov/data-access-viewer

Image by Author

Sanity check

Now, let’s perform a quick sanity check to verify that the data we have makes sense.

import matplotlib.pyplot as plt
import seaborn as sns

# Load data
climate_df = pd.read_csv('/content/drive/MyDrive/climate_data.csv')
climate_df['Date'] = pd.to_datetime(climate_df['Date'].astype(str), format='%Y%m%d')

# Filter for the specified cities
selected_cities = [
    'McMurdo Station, Antarctica',
    'Bangkok, Thailand',
]
df_selected_cities = climate_df[climate_df['City'].isin(selected_cities)].copy()

# Create a scatter plot with different colours for every city
plt.figure(figsize=(12, 8))

# Use a colormap for more aesthetic colours
colours = sns.color_palette("Set2", len(selected_cities)) # Using a seaborn color palette

for i, city in enumerate(selected_cities):
    df_city = df_selected_cities[df_selected_cities['City'] == city]
    plt.scatter(df_city['Date'], df_city['T2M'], label=city, s=2, color=colours[i]) # Using T2M for temperature and smaller dots

plt.xlabel('Date')
plt.ylabel('Temperature (°C)')
plt.title('Daily Temperature (°C) for Selected Cities')
plt.legend()
plt.grid(alpha=0.3)
plt.tight_layout()
plt.show()

Yes, temperatures in Bangkok are a lot hotter than at McMurdo Station in Antarctica.

Image by Author

# Filter for the specified cities
selected_cities = [
    'Cape Town, South Africa',
    'Amazon Rainforest (Manaus)',
]
df_selected_cities = climate_df[climate_df['City'].isin(selected_cities)].copy()

# Arrange the colour palette
colours = sns.color_palette("Set1", len(selected_cities))

# Create vertically stacked subplots
fig, axes = plt.subplots(nrows=2, ncols=1, figsize=(12, 10), sharex=True)

for i, city in enumerate(selected_cities):
    df_city = df_selected_cities[df_selected_cities['City'] == city]
    axes[i].scatter(df_city['Date'], df_city['PRECTOTCORR'], s=2, color=colours[i])
    axes[i].set_title(f'Daily Precipitation in {city}')
    axes[i].set_ylabel('Precipitation (mm)')
    axes[i].grid(alpha=0.3)

# Label x-axis only on the underside subplot
axes[-1].set_xlabel('Date')

plt.tight_layout()
plt.show()

Yes, it rains more in the Amazon Rainforest than in Cape Town, South Africa.

South Africa experiences droughts, which place a major burden on the agricultural sector. 

Image by Author

# Filter for Mexico City
df_mexico = climate_df[climate_df['City'] == 'Mexico City, Mexico'].copy()

# Create the plot
plt.figure(figsize=(12, 6))
sns.set_palette("husl")

plt.scatter(df_mexico['Date'], df_mexico['WS2M'], s=2, label='WS2M (2m Wind Speed)')
plt.scatter(df_mexico['Date'], df_mexico['WS50M'], s=2, label='WS50M (50m Wind Speed)')

plt.xlabel('Date')
plt.ylabel('Wind Speed (m/s)')
plt.title('Daily Wind Speeds at 2m and 50m in Mexico City')
plt.legend()
plt.grid(alpha=0.3)
plt.tight_layout()
plt.show()

Yes, wind speeds at 50 meters are a lot faster than at 2 meters.

Typically, the higher you go, the faster the wind moves. At flight altitude, winds can reach speeds of 200 km/h. That is, until you reach space at 100,000 meters.
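One way to quantify this, as a sketch: the wind profile power law v2/v1 = (z2/z1)^α relates speeds at two heights, so paired WS2M and WS50M readings let us back out the shear exponent α. The readings below are made-up values, not taken from the dataset:

```python
import numpy as np

def shear_exponent(v_low, v_high, z_low=2.0, z_high=50.0):
    """Solve v_high / v_low = (z_high / z_low) ** alpha for alpha."""
    return np.log(v_high / v_low) / np.log(z_high / z_low)

# Made-up readings: 2.5 m/s at 2 m, 4.0 m/s at 50 m
alpha = shear_exponent(2.5, 4.0)
print(f"alpha = {alpha:.3f}")  # close to the textbook 1/7 over open terrain

# Extrapolate to a hypothetical 100 m turbine hub height
v_100 = 4.0 * (100.0 / 50.0) ** alpha
print(f"estimated 100 m wind: {v_100:.2f} m/s")
```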

Image by Author

We’ll take a much closer look at this data in the following chapters.

It’s heating up

We just went through a heat wave here in Toronto. By the sounds my AC made, I think it nearly broke. But on a daily temperature graph, you need to look quite carefully to see that temperatures are rising, because of seasonality and significant variability. Things become clearer when we look at the yearly average. We call the difference between the average for a particular year and the baseline an anomaly. With the baseline being the average temperature over 1981–2024, we can see that recent yearly averages sit significantly above the baseline, partly because the cooler early years pull the baseline down. The converse is equally true: early yearly averages sit significantly below the baseline because the hotter recent years pull it up.

With all the technical articles on here, with headlines like ‘Grammar as an Injectable: A Trojan Horse to NLP’, I hope you’re not disappointed by a simple linear regression. But that’s all it takes to show that temperatures are rising. Yet people still don’t believe it.

import numpy as np

# 1) Filter for the Sahara Desert site
city = 'Sahara Desert (Tamanrasset, Algeria)'
df = (
    climate_df
    .loc[climate_df['City'] == city]
    .set_index('Date')
    .sort_index()
)

# 2) Compute annual mean & anomaly
annual = df['T2M'].resample('Y').mean()
baseline = annual.mean()
anomaly = annual - baseline

# 3) 5-year rolling mean
roll5 = anomaly.rolling(window=5, center=True, min_periods=3).mean()

# 4) Linear trend
years = anomaly.index.year
slope, intercept = np.polyfit(years, anomaly.values, 1)
trend = slope * years + intercept

# 5) Plot
plt.figure(figsize=(10, 6))
plt.bar(years, anomaly, color='lightgray', label='Annual Anomaly')
plt.plot(years, roll5, color='C0', linewidth=2, label='5-yr Rolling Mean')
plt.plot(years, trend, color='C3', linestyle='--', linewidth=2,
         label=f'Trend: {slope:.3f}°C/yr')
plt.axhline(0, color='k', linewidth=0.8, alpha=0.6)

plt.xlabel('Year')
plt.ylabel('Temperature Anomaly (°C)')
plt.title(f'{city} Annual Temperature Anomaly')
plt.legend()
plt.grid(alpha=0.3)
plt.tight_layout()
plt.show()

Image by Author

The Sahara is getting hotter by 0.03 °C per year. That’s the hottest desert in the world. We can check every location we picked and see that not a single one has a negative trend.
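That per-site check can be sketched as a loop that fits a linear trend to each city’s annual mean temperature. The synthetic climate_df below stands in for the real one loaded earlier, so the block is runnable on its own:

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for climate_df: two cities with gently warming daily T2M
rng = np.random.default_rng(0)
dates = pd.date_range("1981-01-01", "2024-12-31", freq="D")
years_elapsed = np.asarray(dates.year - 1981)
frames = []
for city, warming in [("City A", 0.03), ("City B", 0.01)]:
    t2m = 15 + warming * years_elapsed + rng.normal(0, 2, len(dates))
    frames.append(pd.DataFrame({"City": city, "Date": dates, "T2M": t2m}))
climate_df = pd.concat(frames, ignore_index=True)

# Fit a linear trend (degrees C per year) to each city's annual mean temperature
trends = {}
for city, grp in climate_df.groupby("City"):
    annual = grp.groupby(grp["Date"].dt.year)["T2M"].mean()
    slope, _ = np.polyfit(annual.index, annual.values, 1)
    trends[city] = slope

for city, slope in sorted(trends.items()):
    print(f"{city}: {slope:+.3f} degrees C/yr")
```

On the real combined data, the same loop over all thirteen sites is what shows that every fitted slope is positive.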

Image by Author

So yes, temperatures are rising.

The forest for the trees

A big reason NASA makes this data open is to combat the effects of climate change. We’ve mentioned modelling crop yields, renewable energy, and sustainable building compliance. However, there are additional ways data can be used to address climate change in a scientific, mathematically grounded manner. If you’re interested in this topic, this video by Luis Seco covers things I didn’t get to address in this article, like

  • The carbon trade and the price of carbon
  • A predictive biomass tool for optimizing tree planting
  • Safe drinking water in Kenya
  • The socioeconomic costs of emissions
  • Controlled burning of forests

I hope you’ll join me on this journey. In the next episode, we’ll discuss how differential equations have been used to model the climate. And while much is being done to address climate change, the earlier list of effects was not exhaustive.

  • Melting ice sheets destabilize global climate regulation and speed up sea-level rise.
  • Climate-related damages cripple economies through escalating infrastructure and health costs.
  • Rising numbers of climate refugees strain borders and fuel geopolitical instability.
  • Coastal cities face submersion as seas rise relentlessly.
  • Extreme weather events shatter records, displacing hundreds of thousands.

But there’s noise, and there’s signal, and they can be separated.

Sources

  • Climate change impacts | National Oceanic and Atmospheric Administration. (n.d.). https://www.noaa.gov/education/resource-collections/climate/climate-change-impacts
  • Freedman, A. (2025, June 23). . CNN. https://www.cnn.com/2025/06/23/climate/heat-wave-global-warming-links
  • . World Meteorological Organization. (2025, May 26). https://wmo.int/news/media-centre/global-climate-predictions-show-temperatures-expected-remain-or-near-record-levels-coming-5-years
  • . The New York Times. (1988, June 24). https://web.archive.org/web/20201202103915/https:/www.nytimes.com/1988/06/24/us/global-warming-has-begun-expert-tells-senate.html
  • NASA. (n.d.). . NASA. https://power.larc.nasa.gov/
  • Wall, M. (2025, June 20). . Space. https://www.space.com/space-exploration/private-spaceflight/watch-blue-origin-launch-6-people-to-suborbital-space-on-june-21

Code Available Here

