Pathfinder 1: Problem Finding

On January the first I announced that this year would be the year of the pathfinder. Since then, I have been a bit mentally stuck. Because I had already made an attempt at solving the problem, I found myself slipping into the gullies created by my past efforts. It was very tempting to just continue where I stopped last time, even though I know my previous approach has important downsides. It makes no sense to solve the wrong problem, so although I was very eager to get started, I realized that it was important to identify the right problem first. After going in circles for quite a while, I decided to get help and figure out how to approach this. I stumbled again on Paul MacCready’s approach during the development of the Gossamer Condor. After analysing the competition, he concluded: “The problem is that we don’t understand the problem.”

My previous approach

In 2020 I made a similar attempt. It showed promise, but it had important problems.

The most important problem was that it only gave you the optimal route, under the assumption that all the thermals would still be there when you arrived where they were last observed. This is obviously not true, even more so in the evening when thermal activity dies down. A friend suggested that I incorporate the chance that a thermal is not present. This makes a lot of sense, but the algorithm I chose gave me no easy way of incorporating it.
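To illustrate what “incorporating the chance that a thermal is not present” could look like, here is a minimal sketch; the exponential-decay model and the 20-minute half-life are assumptions of mine for the example, not part of my original algorithm:

```python
def presence_probability(age_minutes, half_life_minutes=20.0):
    """Chance the observed thermal still exists, assuming (hypothetically)
    that thermals decay exponentially with a 20-minute half-life."""
    return 0.5 ** (age_minutes / half_life_minutes)


def expected_climb(observed_climb_ms, age_minutes):
    """Expected climb rate in m/s: the observed strength weighted by the
    probability that the thermal is still there. If it is gone, we assume
    zero climb (ignoring the time wasted searching for it)."""
    return presence_probability(age_minutes) * observed_climb_ms


# A 2 m/s thermal observed 30 minutes ago is only worth ~0.7 m/s in expectation.
print(expected_climb(2.0, 30.0))
```

A planner that optimizes expected time instead of best-case time would then naturally prefer fresher and stronger observations.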

Another problem was that I collected statistics without any distinction between lift found while gliding and lift found while circling. This means that if someone circles in lift for a long time, the statistics for that location will be heavily biased towards that lift.
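One common way to make that distinction is to look at the turn rate between consecutive fixes: a circling glider turns continuously, a gliding one barely turns. A rough sketch of that idea (the 6°/s threshold and the data layout are my own assumptions):

```python
def is_circling(fixes, turn_rate_threshold_deg_s=6.0):
    """Label each GPS fix as circling (True) or gliding (False) based on the
    turn rate between consecutive fixes. `fixes` is a time-ordered list of
    (timestamp_s, heading_deg) tuples."""
    labels = []
    for prev, cur in zip(fixes, fixes[1:]):
        dt = cur[0] - prev[0]
        # Smallest signed heading difference, handling the 360° wrap-around.
        dh = (cur[1] - prev[1] + 180.0) % 360.0 - 180.0
        rate = abs(dh) / dt if dt > 0 else 0.0
        labels.append(rate >= turn_rate_threshold_deg_s)
    # Give the first fix the same label as the second.
    return labels[:1] + labels
```

With labels like these, the gliding-phase statistics would only use fixes classified as straight flight, so a long climb in one spot no longer dominates that location’s average.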

The algorithm also struggled with missing data. Due to the gaps that naturally exist in the coverage of the Open Glider Network, there are areas where I collect no statistics at all. I had to make assumptions for those locations.

Finding problems

Thermal lift values on the busiest day of 2024: April 11th. It’s nice to see that thermal strengths follow a normal distribution.

The first thing I’m going to do is analyse a lot of gliding days. What is possible with perfect knowledge? How fast could we really fly? How much lift do we really need to stay airborne? How often do predicted thermals still exist when we reach them?
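As a starting point for “how fast could we really fly”, classic MacCready-style bookkeeping already gives an upper bound: average cross-country speed is the glide speed discounted by the time spent climbing back the lost height. A tiny sketch of that calculation (the numbers below are placeholders, not measured polar data):

```python
def cross_country_speed(glide_speed_ms, sink_rate_ms, climb_rate_ms):
    """Average cross-country speed for the classic climb/glide cycle:
    glide at `glide_speed_ms` while sinking at `sink_rate_ms`, then regain
    the lost height in a thermal climbing at `climb_rate_ms`."""
    return glide_speed_ms * climb_rate_ms / (climb_rate_ms + sink_rate_ms)


# Example: gliding at 40 m/s (144 km/h) with 1.0 m/s sink and climbing at
# 2.0 m/s gives roughly 26.7 m/s, i.e. about 96 km/h average.
print(cross_country_speed(40.0, 1.0, 2.0) * 3.6)
```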

Process

The process I’m using this year is a bit different from last year’s. Where last year I started building something straight away, this year I’m first trying to verify my direction. The reason is quite simple: during the presentation of Hobbes in Darmstadt I learnt that tactile feedback would have been much better. I disregarded that quite early (due to safety and complexity concerns), but I do think I could have done a bit more research. Since I know from conversations with a friend who’s an avid cross-country pilot that “just the fastest route” might not be sufficient, I am trying to refine the question first.

This is all new and difficult for me. Although I have attended university, I have never had prolonged guidance on “how to do research”. Therefore I’ve always been more of an engineer/tinkerer than a scientist.

2025: Year of the pathfinder

One day in 2009, I was lying in the grass near my dorm at the Twente University campus. I saw a DG-300, which had launched from the nearby airbase, circling above me. Not long before, I had learnt about the work of Alan Kay, whose team at Xerox PARC invented about half of what we call “Personal Computing” today. Their way of thinking was new and exciting to me. One thought-provoking question that had stuck with me was: “What would be ridiculous not to have in 25 years?”

The dream

Watching the DG-300 circle in a thermal, I thought: “It would be ridiculous not to have a precise prediction of thermals in 25 years”. And so I started thinking….

What would gliding be like if we could prevent out-landings? If we could precisely predict the weather? Predict where thermals will form, when they will form and how fast we can climb in those thermals? This would be a game-changer. No more need for sustainer engines. No more need for late-night retrieves. That sounds pretty great….

Finding an exponential to make it happen

If we could use Moore’s law, this might not actually be that hard to achieve. In 25 years the amount of computing power on board a glider will be enormous, about 100,000 times that of 2009. We can use Moore’s Law either to make the on-board computers faster or to reduce their power consumption, or a bit of both. So even if batteries don’t scale exponentially in capacity over time, we can still use Moore’s Law to compute more for the same amount of energy.
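For reference, that factor follows from assuming a doubling roughly every 18 months: 2^(25 / 1.5) ≈ 2^16.7 ≈ 100,000.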

Make it feasible

But 25 years is a long time, and it probably shouldn’t take 25 years to develop a solution. I decided that weather models, such as WRF, would probably be too slow, and that something based on observations would probably work better than something based on predictions.

In 2009 cellular data wasn’t as ubiquitous as it is in 2025. My conclusion in 2009 was therefore: I should invest my time in Wireless Mesh Networking, so that gliders could communicate with each other without the need for cellular coverage. And that’s what I did: I devoted my Master’s thesis to Wireless Mesh Networking for gliders.

After finishing my thesis, I was not very content with the result. I started working, and the dream of live weather insights faded…. until I got access to the live tracking data from the SkyLines platform. When I looked at the SkyLines data around 2019, I saw that cellular coverage had dramatically improved: 97% of all GPS positions took less than 1 second to reach the SkyLines server, which means that the glider in question had a working cellular connection!

First try

This led to my first attempts. First, I used XCSoar’s thermal detection algorithm to locate thermals. A clubmate of mine flew with a moving map that showed these thermals, and found two of them where I predicted them. Nice!

Then I started to look at faster cross-country flights, and saw that a lot can be won or lost during the gliding phase. I wrote a program that analyses OGN data and collects statistics for the entire world, and another program that finds the fastest route from A to B given those statistics.
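To give an idea of the shape of such a route search, without claiming this is my actual program: treat the area as a grid of cells with an average lift value per cell, and run Dijkstra with estimated flight time as the edge cost. Everything below (cell size, speeds, the way climbs are charged) is a simplified assumption:

```python
import heapq


def fastest_route(grid_lift, start, goal, cell_size_m=1000.0,
                  glide_speed_ms=30.0, sink_rate_ms=1.0):
    """Dijkstra over grid cells, minimising flight time from `start` to `goal`.
    `grid_lift[y][x]` is the average climb rate (m/s) observed in a cell; the
    time cost of entering a cell is the glide time plus the time needed to
    climb back the height lost. Altitude, wind and the real polar are ignored."""
    rows, cols = len(grid_lift), len(grid_lift[0])
    dist = {start: 0.0}
    parent = {}
    queue = [(0.0, start)]
    while queue:
        t, (x, y) = heapq.heappop(queue)
        if (x, y) == goal:
            break
        if t > dist.get((x, y), float("inf")):
            continue  # stale queue entry
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if not (0 <= nx < cols and 0 <= ny < rows):
                continue
            glide_time = cell_size_m / glide_speed_ms
            height_lost = sink_rate_ms * glide_time
            climb = max(grid_lift[ny][nx], 0.1)  # avoid dividing by zero
            cost = glide_time + height_lost / climb
            if t + cost < dist.get((nx, ny), float("inf")):
                dist[(nx, ny)] = t + cost
                parent[(nx, ny)] = (x, y)
                heapq.heappush(queue, (t + cost, (nx, ny)))
    if goal not in parent and goal != start:
        return None, None  # unreachable
    path, node = [goal], goal
    while node != start:
        node = parent[node]
        path.append(node)
    return list(reversed(path)), dist[goal]
```

A real planner would also have to account for altitude, the glider’s polar and the uncertainty in the statistics, which this sketch deliberately leaves out.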

A friend did a 750 km attempt in his ASW24, and I followed him live during the attempt. This is what I saw during that day:

A failed 750 km attempt in an ASW-24. Lift/sink is shown on the map. The green line is the route home. The bottom graph shows the vertical profile: where to climb, how high to climb and how fast to glide.

Just after he rounded the second turn-point I could predict his landing time… and I saw it slip. You can imagine that I was pretty over the moon about this result. I could find the optimal route home in under 2 seconds on a Raspberry Pi 4! This might actually work….

However, when I later tried to verify the algorithm formally, I unfortunately found critical errors and abandoned the project again.

This year

In past years I didn’t know whether what I wanted was feasible. I tried to remove some of the optimizations that violated correctness… but it didn’t work: I needed more than 16 GB of memory to calculate a 50 km route. It just wasn’t working.

I went back to studying the underlying algorithms and talked to some friends about my approach and its problems. I learnt some valuable things, but by then I was focussing on Hobbes. Now that Hobbes has reached its conclusion, I’ve learnt that prolonged attention can show great results after a few months.

So in 2025 I’m going to focus my attention on making the pathfinder work: make it correct, make it fast.