Under the Uzès Sun: When Historical Data Reveals Climate Change


Living in Uzès, I’m biologically required to endure the same loop of small talk every year: “It’s boiling, isn’t it? Way hotter than 2020,” or the classic, “Back in my day, we actually had 4 seasons, not just ‘Pre-Oven’ and ‘Deep Fryer.’”

Truthfully, I’m tempted to nod along and complain too, but I have the memory of a goldfish and a brain that demands cold, hard facts before joining a rant. Since I can’t remember whether last July was “sweaty” or “molten,” I’d like some actual data to back up my grumbling.

I work at icCube. It’s basically a professional sin for me to get into a data-driven argument without bringing enterprise-level tooling to a back-of-the-napkin debate.

At the next apéro, when someone starts reminiscing about how “1976 was the true scorcher,” I shouldn’t just be nodding politely while nursing my pastis. I should be whipping out a high-performance, pixel-perfect dashboard that visualizes their nostalgia into oblivion. If I can’t use multi-dimensional analysis to prove that our sweat glands are working harder than they did in the seventies, then what am I even doing with my life?

While this journey began as a quest to settle a local argument in the South of France, this post goes beyond the climate debate. It serves as a blueprint for a classic data challenge: architecting a high-performance analytical system capable of making sense of decades of historical data, applicable to any domain requiring historical vs. current benchmarking.

The Battle Plan

Here is the plan mapping out our tactical strike against vague nostalgia and anecdotal evidence:

  1. Scouting the Intel: Hunting down the raw numbers, because “it feels hot” isn’t a metric and we need the high-octane stuff.
  2. Building the War Room: Architecting a structure robust enough to hold decades of heatwaves without breaking a sweat.
  3. The Analytical Sledgehammer: Deploying the heavy-duty logic required to turn raw data into undeniable, nostalgia-incinerating proof.
  4. The Visual “I Told You So”: Designing the pixel-perfect dashboard to end any apéro argument in three seconds flat.
  5. Post-Victory Lap: Now that we’ve conquered the climate debate, what other local myths shall we incinerate with data?

Scouting the Intel

Data is central to our mission. Therefore, we need to secure accurate, high-fidelity historical temperature records from France.

Météo-France, the national meteorological and climatological service, is a public establishment of the French State. It makes the data produced as part of its public service missions available to all users on the public data portal data.gouv.fr. God bless public data portals. While half the world’s data is locked behind paywalls and registration forms that ask for your blood type, France just… hands it over. Liberté, égalité, température.

The data used in this post is made available under the Open License 2.0.

The Observations

Climatological (daily/hourly) data from all metropolitan and overseas weather stations since their opening, for all available parameters. The data has undergone climatological control: www.

The Weather Stations

Characteristics of meteorological weather stations in metropolitan France and overseas territories in operation: www.

Early Analysis & Transformations

Being like Saint-Thomas, I like to see and review the actual data a bit myself, first to get a good understanding and then to run a few sanity checks before drawing any conclusions later on.

To keep things clean, I’ve been extracting raw temperature data from the pile of observations we have. Being an unrepentant Java geek, I’ve built a set of classes for this mission and tossed them into a GitHub project. Feel free to tear through the code and reuse it as much as you like.
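
To give you a feel for what that extraction looks like, here is a minimal sketch of the kind of parser living in that project, assuming semicolon-separated daily files with columns such as NUM_POSTE (station id), AAAAMMJJ (date), TN and TX (min/max temperature). Those column names are my reading of the Météo-France daily files rather than a guarantee, so double-check them against whatever you download; the real classes in the GitHub project are more thorough.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// A raw daily observation: station id, date (yyyyMMdd), min/max temperature in °C.
record DailyObservation(String stationId, String date, double tMin, double tMax) {}

final class ObservationExtractor {

    // Column names are assumptions based on the Météo-France daily files;
    // adjust them to whatever the downloaded file actually contains.
    static List<DailyObservation> extract(Path csvFile) throws IOException {
        List<String> lines = Files.readAllLines(csvFile);
        List<String> header = Arrays.asList(lines.get(0).split(";"));
        int station = header.indexOf("NUM_POSTE");
        int date = header.indexOf("AAAAMMJJ");
        int tMin = header.indexOf("TN");
        int tMax = header.indexOf("TX");

        List<DailyObservation> observations = new ArrayList<>();
        for (String line : lines.subList(1, lines.size())) {
            String[] cols = line.split(";", -1);
            // Sanity check: skip days where either temperature is missing.
            if (cols[tMin].isBlank() || cols[tMax].isBlank()) continue;
            observations.add(new DailyObservation(
                    cols[station].trim(),
                    cols[date].trim(),
                    Double.parseDouble(cols[tMin]),
                    Double.parseDouble(cols[tMax])));
        }
        return observations;
    }
}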

I’m not going to bore you with a dry lecture on the data right away. That would be like serving a lukewarm rosé: absolutely criminal, possibly illegal in certain Provençal villages.

I’ll be diving into the gritty details when needed.

Building the War Room

If we’re going to settle these terrace debates once and for all, we can’t just turn up with a spreadsheet and a dream. We need an OLAP schema: a structure so robust it makes the local historical stone masonry look flimsy. We’re keeping it lean for this specific fight, but trust me, it’s built to scale when the next “mildest winter ever” argument inevitably breaks out.

Let’s break down the architecture.

The Dimensions

  • Stations: lets us pinpoint the exact weather station on the map of France, because saying “somewhere in the South” won’t cut it. We need coordinates, names, the works.
  • Time/Calendar: The usual suspects: years, months, days. Boring? Sure. Essential for proving your neighbor’s memory is garbage? Absolutely. We’re tossing in Months and Days of Month to fuel a calendar widget that will let me point at any specific date and say: “See? July 1st, 2025 was an absolute hellscape.” Precision is vital when you’re ruining someone’s nostalgic buzz.

The Facts (a.k.a. Measures)

  • Temperatures: The “Holy Trinity” of data points: Average, Maximum, and Minimum. This is the primary input for our “Deep Fryer” versus “Pre-Oven” analysis.

The complete schema definition is parked over in the GitHub project with the source code, ready for when you’re feeling particularly vengeful.
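
To make the shape concrete, here is roughly what one row of the fact table carries once the dimensions are wired up. The field names are illustrative, not the exact identifiers used in the published schema.

import java.time.LocalDate;

// One fact row: links to the Stations and Time dimensions and carries the three measures.
// Field names are illustrative; see the GitHub project for the actual schema definition.
record TemperatureFact(
        String stationId,      // -> Stations dimension (name, coordinates)
        LocalDate day,         // -> Time/Calendar dimension (year, month, day of month)
        double temperatureAvg, // measure: average temperature (°C)
        double temperatureMax, // measure: maximum temperature (°C)
        double temperatureMin  // measure: minimum temperature (°C)
) {}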

The Cube

The result? A loaded schema containing more than 500 million rows of French temperature data stretching back to 1780. Is it absolute overkill for a casual chat over olives? Of course it is. That’s the point.

It gives us a playground to hack into other metrics later on. But let’s save those for when we really intend to make people regret bringing up the weather in the first place.

The Analytical Sledgehammer

Time to build the query that will shut down the next apéro debate in three seconds flat.

To cut through the noise, I’m using the MDX language: a query language specifically designed for this sort of multi-dimensional heavy lifting. To prove that we’re indeed living in a “Deep Fryer,” I’m going to compare every day’s temperature against a historical reference period.

If you don’t speak MDX, skip ahead to the pretty picture. The query basically tells the data engine to find the average “normal” for this specific day over 30 years and subtract it from today’s temperature.

First, the reference period (a.k.a. our baseline) is defined as a static set using the range operator (here, 1991 – 2020):

with
  static set [Period] as { 
    [Time].[Time].[Year].[1991] : [Time].[Time].[Year].[2020] 
  }

“Why 30 years?” Because that’s what climatologists and the World Meteorological Organization decided counts as “normal” before the planet began experimenting with new thermostat settings. It’s the gold standard for a “climatological normal”; long enough to smooth out the weird years, short enough to still remember what “normal” used to feel like.

The daily average temperature is defined as the average of the maximum and minimum temperatures of the day. I’ve experimented with hourly averages; the results are nearly identical. So let’s stick with this simple and well-accepted definition:

with
  [T_Avg_Daily] as 
    ( [Measures].[Temperature (max.)] + [Measures].[Temperature (min.)] ) / 2
    , FORMAT_STRING=".#"

Now, we need to know what the temperature should be. We calculate the average of these daily temperatures aggregated over our reference period:

with
  [T_Avg_Period] as 
    avg( [Period], [T_Avg_Daily] )
    , FORMAT_STRING=".#"

Finally, we calculate the difference, measuring exactly how much hotter (or colder) it is today compared with past years. This delta value puts a precise number on our collective sweat:

with
  [T_Avg_Diff] as 
    IIF( isEmpty( [T_Avg_Daily] ), null, [T_Avg_Daily] - [T_Avg_Period] )

Putting it all together, here is the MDX query that compares the 2025 daily temperatures in Uzès against the record:

with
  static set [Period] as { 
    [Time].[Time].[Year].[1991] : [Time].[Time].[Year].[2020] 
  }

  [T_Avg_Daily] as 
    ( [Measures].[Temperature (max.)] + [Measures].[Temperature (min.)] ) / 2
    , FORMAT_STRING=".#"

  [T_Avg_Period] as 
    avg( [Period], [T_Avg_Daily] )
    , FORMAT_STRING=".#"

  [T_Avg_Diff] as 
    IIF( isEmpty( [T_Avg_Daily] ), null, [T_Avg_Daily] - [T_Avg_Period] )

select
  [Time].[Months].[Months] on 0,
  [Time].[Days of Months].[Days of Months] on 1
  
  from [Observations]

where [T_Avg_Diff]

filterby [Time].[Time].[Year].&[2025-01-01T00:00:00.000]
filterby [Station].[Station].[Name].&[30189001] -- Nîmes Courbessac

The attentive reader will notice I’ve swapped the local Uzès station for the Nîmes-Courbessac station. Why? Because I want that sweet, sweet historical depth to fuel my “back in my day” comparisons, and Nîmes simply has a longer memory. It’s right around the corner, so the temperatures are virtually identical, though, if I’m being honest, Nîmes usually runs a bit hotter.

Image by the author.

In the next section, I’ll show you how to splash some color on these values so you can spot the heatwaves at a glance.

The Visual “I Told You So”

So it’s time to stop looking at raw code and actually build a visual for that MDX result. My plan? Cram the entire year into a single 2D grid, because looking at a scrollable list of 365 dates is a one-way ticket to a migraine.

The setup is simple: months across the horizontal axis, days of the month on the vertical. Each cell represents the temperature delta, that is, the difference (in degrees Celsius) between 2025 and our reference period. To make it “idiot-proof” for the next time I’m three pastis deep, I’ve applied a heat map: the warmer the day was compared with the past, the redder the cell; the colder, the bluer.
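
The dashboard itself is configured in icCube, but the color logic boils down to a simple diverging scale. Here is a rough sketch of the idea; the ±5 °C clamp is my own arbitrary choice for illustration, not a setting taken from the actual widget.

import java.awt.Color;

// Maps a temperature delta (°C) onto a diverging blue-white-red scale.
// Deltas are clamped to ±5 °C, an arbitrary choice for this sketch.
final class DeltaColorScale {

    static Color colorFor(double deltaCelsius) {
        double t = Math.max(-1.0, Math.min(1.0, deltaCelsius / 5.0));
        int fade = (int) Math.round(255 * (1 - Math.abs(t)));
        return t >= 0
                ? new Color(255, fade, fade)   // hotter than the reference: towards red
                : new Color(fade, fade, 255);  // colder than the reference: towards blue
    }
}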

Full disclosure: I’m not a “visual guy.” My aesthetic preference usually begins and ends with “does the query return in under 50 milliseconds?” But even with my lack of artistic flair, the data speaks for itself.

Image by the author.

One glance at this grid and it’s painfully clear: 2025 isn’t just “a bit warm.” It’s a sea of angry crimson that proves our reference period belongs to a world that was significantly less “pre-oven.” If this doesn’t shut down the “back in my day” crowd at the next apéro, nothing will.

My Nostalgia Years (1980-2000)

I’m recalibrating the baseline to match the years of my youth. By shifting the reference period to those “glory days,” it turns out my brain wasn’t exaggerating; the data confirms a clear shift from the manageable summers of the past to this new intensity.

Image by the author.

No wonder the lavender is stressed.

#Days > 35

I started getting curious: was it just my imagination, or is the “oven” setting on the planet actually speeding up? I decided on a quick exercise: counting how many days per year the thermometer hits or cruises past the 35°C mark.
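
In MDX this would just be another calculated measure, but staying true to the Saint-Thomas method, here is a back-of-the-envelope Java sketch of the same count over the extracted daily records. The DailyObservation shape is the hypothetical one from the extraction sketch above, not the project’s actual classes.

import java.util.List;
import java.util.Map;
import java.util.TreeMap;
import java.util.stream.Collectors;

// Same hypothetical record shape as in the extraction sketch: date is "yyyyMMdd".
record DailyObservation(String stationId, String date, double tMin, double tMax) {}

final class HotDayCounter {

    // Counts, per year, the days whose maximum temperature reaches or exceeds 35 °C.
    static Map<String, Long> hotDaysPerYear(List<DailyObservation> observations) {
        return observations.stream()
                .filter(o -> o.tMax() >= 35.0)
                .collect(Collectors.groupingBy(
                        o -> o.date().substring(0, 4),  // year part of "yyyyMMdd"
                        TreeMap::new,
                        Collectors.counting()));
    }
}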

Image by the author.

To the surprise of absolutely no one, the data confirms the “pre-oven” phase is shrinking, and the “deep fryer” era is officially taking over.

2003: When Summer Became a Tragedy

There, in the data, a stark peak towers above all others. The summer of 2003. Fifteen thousand people didn’t survive those relentless days above 35°C. In France alone. A nation that hadn’t yet understood how deadly heat could be. The chart doesn’t capture the empty chairs at dinner tables that autumn, the families forever changed, the realization that came too late.

These charts don’t prove global climate change on their own; they simply prove local lived reality with rigor.

Post-Victory Lap

And that’s how you turn a casual sunset drink into a data-driven interrogation.

We’ve officially unleashed the data and MDX to prove that “it used to be cooler” isn’t just a senior citizen grumbling after one too many Ricards; it’s a verifiable fact. Is bringing a multi-dimensional heatmap to a social gathering the fastest way to lose friends and stop getting invited to apéros? Probably. But is the silence that follows a perfectly executed “I told you so” worth it? Every time.

Data won’t stop the heat, but it will hopefully stop the bad arguments about it.

The “Mistral Madness” Index

Now that the heat is settled, I’m setting my sights on the legendary Mistral. In every village square from Valence to Marseille, there’s a sacred “Rule of Three” that claims once the Mistral starts, it must blow for 3, 6, or 9 days. It’s the kind of local numerology that people defend with their lives.

I’m already prepping a new “Wind-Chill” schema to cross-reference hourly gust speeds with this calendar myth. I want to see whether the wind actually cares about multiples of three, or if it’s just our brains trying to find patterns in the chaos while our shutters are rattling.


If you’ve enjoyed watching me over-engineer a solution to a casual conversation, follow my descent into analytical madness over on Medium. We’re just getting started.
