
Valve genius fixed video game lighting calculations

06 December 2024


The maths was all wrong

The groundbreaking lighting in Half-Life 2 resulted from serious lobbying of hardware makers by a Valve designer who realised they had not added up their sums correctly.

According to PC Gamer, Ken Birdwell wanted to see accurate lighting for the game, but when he looked at the maths being used, it was all off.

“The math that we were using was wrong. Not only that, but the math that everybody was using was wrong. And then, as I started to correct it, I realised just how bad it was… and then I fixed it, and suddenly everything looked great!”

The simple version is that graphics cards stored RGB textures, and displays presented everything, as non-linear intensities. An 8-bit RGB value of 128 encodes a pixel that is only about 22 per cent as bright as a value of 255, yet the graphics hardware of the day performed its lighting calculations as though everything was linear.
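
For the curious, here is a minimal Python sketch of those numbers, assuming the common gamma-2.2 approximation of the display curve (the exact sRGB curve is piecewise, but 2.2 is close enough to make the point):

```python
# A minimal sketch of the claim above, assuming the gamma-2.2
# approximation of the display transfer curve.

GAMMA = 2.2

def decode_gamma(stored):
    """Convert a stored 8-bit value to the linear light intensity it represents."""
    return (stored / 255.0) ** GAMMA

# A stored value of 128 is half of 255 numerically...
print(decode_gamma(128))  # ~0.218 -- only about 22 per cent as bright
print(decode_gamma(255))  # 1.0 -- full brightness
```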

This meant that the hardware of the time rendered light incorrectly, causing bizarre and unnatural shading, particularly on curved surfaces. Surfaces that were supposed to look moderately dim often appeared much darker than intended, creating a distorted, unrealistic effect.

“The net result was that lighting always looked off. If you tried to shade something curved, the dimming due to the surface angle aiming away from the light source would get darker too quickly. Something that was supposed to look 50 per cent as bright as full intensity ended up looking only 22 per cent as bright on the display. It looked very unnatural; instead of a nice curve, everything was shaded way too extreme, rounded shapes looked oddly exaggerated and there wasn’t any way to get things to work in the general case.”
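
Here is a rough Python sketch of the failure Birdwell describes, again under the gamma-2.2 approximation; the function names are ours, not Valve's. A surface tilted 60 degrees from the light should display at half brightness, but doing the cosine falloff in gamma space drags it down to roughly 22 per cent:

```python
import math

GAMMA = 2.2

def shade_wrong(albedo_stored, n_dot_l):
    """The old mistake: scale the stored (gamma-encoded) value directly.

    The display then applies its gamma curve on top, so the falloff
    toward the shadowed side is far steeper than the cosine intended.
    """
    displayed = (albedo_stored / 255.0) * n_dot_l  # maths done in gamma space
    return displayed ** GAMMA                      # what the monitor emits

def shade_right(albedo_stored, n_dot_l):
    """Decode to linear light, do the lighting maths, encode for display."""
    linear = (albedo_stored / 255.0) ** GAMMA      # decode
    lit = linear * n_dot_l                         # physically sensible scaling
    encoded = lit ** (1.0 / GAMMA)                 # re-encode for the display
    return encoded ** GAMMA                        # monitor output == lit

# A surface tilted 60 degrees from the light should be half as bright.
n_dot_l = math.cos(math.radians(60))  # 0.5
print(shade_wrong(255, n_dot_l))      # ~0.218: too dark, the "22 per cent" effect
print(shade_right(255, n_dot_l))      # 0.5: the intended half brightness
```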

Birdwell had a hard job convincing hardware manufacturers, particularly those behind graphics cards, that their sums were off.

“I had to go tell the hardware guys, the people who made hardware accelerators, that fundamentally the math was wrong on their cards,” he recalls.

So, how did the manufacturers react when Birdwell raised the alarm? Not exactly warmly.

“When I pointed this out to the graphics hardware manufacturers in '99 and early 2000s, I hit the 'you've just pointed out that my chips are fundamentally broken until we design brand new silicon, I hate you' reaction,” Birdwell recalls. “It went through the stages of denial, anger, bargaining, etcetera, all in rapid succession with each new manufacturer.”

“That took about two-and-a-half years. I could not convince the guys. Finally, we hired Gary McTaggart [from 3DFX], and Charlie Brown, and those guys had enough pull and enough… I have a fine arts major; nobody will listen to me.”

Birdwell’s discovery, while a major breakthrough at the time, is still relevant in modern-day graphics programming. “This remains a super common graphics mistake,” he admits, adding, “Even today, certain areas of programming require the coder to keep in mind that all the bitmap values are probably nonlinear; you can’t just add them together or blend them or mix them with linear calculations without considering what 'gamma space' you're working in.”
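
As an illustration of that mistake, the snippet below mixes a black and a white pixel, once naively in gamma space and once properly through the exact piecewise sRGB transfer functions (the helper names are ours):

```python
# A minimal sketch of the blending mistake Birdwell warns about,
# assuming 8-bit sRGB values and the standard sRGB transfer functions.

def srgb_to_linear(c):
    c /= 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(l):
    s = 12.92 * l if l <= 0.0031308 else 1.055 * l ** (1 / 2.4) - 0.055
    return round(s * 255.0)

black, white = 0, 255

# Wrong: averaging the stored bytes mixes in gamma space.
naive_mix = (black + white) // 2   # 127 -- displays at only ~21% brightness

# Right: decode, average the actual light intensities, re-encode.
correct_mix = linear_to_srgb((srgb_to_linear(black) + srgb_to_linear(white)) / 2)

print(naive_mix)    # 127
print(correct_mix)  # 188 -- the stored value that actually displays as 50% grey
```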

Today, graphics cards have caught up and automatically convert non-linear sRGB data to linear values for shading, and back again for display, ensuring things look right from the get-go. But that wasn’t always the case.

“Back in the '90s up to maybe the early 2010s it wasn't the case,” Birdwell explains. “You had to be super aware of what 'gamma space' you were in at each step of the process, or things would look super weird.”
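
What the hardware now does for free can be emulated in a few lines. This is an illustrative Python sketch of the pipeline, with the real OpenGL mechanisms named in the comments:

```python
# A rough emulation of the automatic conversions modern GPUs perform:
# with an sRGB texture format (e.g. GL_SRGB8_ALPHA8 in OpenGL) the
# sampler decodes to linear on read, and with an sRGB framebuffer
# (GL_FRAMEBUFFER_SRGB) the output stage re-encodes on write,
# so the shader simply works in linear space throughout.

def sample_srgb_texture(stored):     # what the texture unit does on fetch
    c = stored / 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def write_srgb_framebuffer(linear):  # what the output stage does on store
    s = 12.92 * linear if linear <= 0.0031308 else 1.055 * linear ** (1 / 2.4) - 0.055
    return round(s * 255.0)

def shader(albedo_linear, n_dot_l):  # the lighting maths stays naively linear
    return albedo_linear * n_dot_l

texel = sample_srgb_texture(255)                  # decoded to 1.0 linear
out = write_srgb_framebuffer(shader(texel, 0.5))  # lit at half intensity
print(out)                                        # 188: displays as true half brightness
```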

The frustration must have been palpable, but Birdwell had some help. “I was very happy to pass off those conversations to the newly hired HL2 graphics programmers Gary McTaggart and Charlie Brown, who worked through it all, step by painful step, over the years,” he said.

