Light On Water, a Forensic and Sketching Study

Russell Foltz-Smith
Oct 23, 2018

Was walking along the Marina the other day, photographing and sketching things. The light was intense and the water very reflective. I stopped to consider it all for a minute.

and then I got very confused. and excited.

Multiple Wave Forms All At Play

In the scene above there are sunlight, reflections, shadows, and at least three different wave patterns in the water. There are also clearly debris and chemicals in the water. Put all this together and you get a very confusing and dynamic scene of light.

When you are trying to sketch or paint something, you are often just rendering blocks of color/light, which is a fairly straightforward copying process if the scene/forms are simple enough. The scene I was looking at was moving rapidly and chaotically, and the forms were almost impossibly complex. The only way to get a decent sketch is to actually know the physics of the situation and be able to build the image from first principles. Where is the light coming from? How are the surfaces moving? What are the surfaces made of? What direction, in reality and in the visual plane, is everything going? And so on.

I captured several photos, a couple of videos, and some slow-motion video to aid my forensic investigation. Our phones nowadays are insane forensic tools. That we can shoot slow-motion, high-definition video really allows us to do high-end physics with almost no effort.

Setting the Scene

First things first, I just wanted to understand the basic setup. There were three “barriers” the waves were bouncing off of: two sides of the marina wall and the docks. Well, and of course a bunch of boats, etc. The light source was just the sun plus any reflected light from the boat surfaces and dock sides. Shadows were cast by the boats, the walls, and other surfaces. There appeared to be oil or some other shiny chemical in parts of the water as well.

The scene occurred at 1:40pm on October 4, 2018, in Marina Del Rey, CA.

The light was only slightly off from overhead from the SSW direction.

WolframAlpha.com: “Sun 1:40p October 4, 2018”

I was positioned outside the Marina Del Rey library, somewhat NNE of the position of the light source (sun).

Google Maps Location of Scene
Google Satellite 3d view of scene

The light was pretty bright and mainly hit the tops of the waveforms. Cast shadows were relatively short. So clearly there were going to be bright point spots, pretty small dark spots in the troughs of the waves, and relatively minor cast shadows from the waves themselves.
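
For reference, you can pull the same sun geometry directly in WolframLang with SunPosition. The coordinates below are my rough guess for the library's location, not a measured position:

    loc = GeoPosition[{33.98, -118.45}];  (* approximate Marina Del Rey library; an assumption *)
    when = DateObject[{2018, 10, 4, 13, 40}, TimeZone -> "America/Los_Angeles"];
    SunPosition[loc, when]  (* {azimuth, altitude}; expect a fairly high sun roughly to the SSW *)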

This gave me a scene that could be analyzed in a particle kind of way. I hoped.

The Simulation

Next I wanted to understand all the sources of the surface topology. There's just a single source of waves from the current in the marina, but there are effectively several walls creating secondary waves, etc.

We can simulate the setup to understand the surface patterns. http://www.falstad.com/ripple/

A fairly simple setup would include two walls and some docks. http://www.falstad.com/ripple/Ripple.html?rol=$+1+476+59+10+0+668+0.048828125%0As+0+337+5+0+0.233333+0+10+100%0Aw+0+248+471+475+217%0A202+0+189+0+246+194+0%0A202+0+77+-1+134+193+0%0A202+0+-15+-3+42+191+0%0A

3d view of basic wave pattern in the marina.
basic interfering wave pattern

This is the base topology. It was further complexified by the Brownian-like motion of the boats, etc., which introduces some randomness into this regular wave pattern.
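
You can rough out this kind of interference directly in WolframLang too. This is only a minimal sketch, assuming one incoming plane wave plus weaker mirrored reflections off two walls; the wavelength and reflection strengths are made up, not measured from the marina:

    k = 2 Pi/5.;  (* assumed wavelength of 5 units *)
    incoming[x_, y_, t_] := Sin[k (x + y)/Sqrt[2] - t];
    reflectX[x_, y_, t_] := 0.6 Sin[k (-x + y)/Sqrt[2] - t];  (* bounce off the wall at x = 0 *)
    reflectY[x_, y_, t_] := 0.6 Sin[k (x - y)/Sqrt[2] - t];   (* bounce off the wall at y = 0 *)
    surface[x_, y_, t_] := incoming[x, y, t] + reflectX[x, y, t] + reflectY[x, y, t];
    DensityPlot[surface[x, y, 0], {x, 0, 30}, {y, 0, 30},
      PlotPoints -> 80, ColorFunction -> GrayLevel, Frame -> False]

Swapping DensityPlot for Plot3D gives the 3D view, and animating t shows the pattern drifting the way the ripple-tank simulation does.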

The Analysis

Using the images, I wondered if I could actually see that basic wave pattern at play.

Slow-motion video of the scene's light and water wave pattern
here are frame 1 and frame 25 of a video
here are frame 1 and frame 25 of wave simulation

I decomposed frames 1 and 25 into a particle representation and then tracked the particles using ImageCorrespondingPoints in WolframLang, which provides easy access to several image point algorithms. By comparing the points in different images, you can build a mapping of particle movement within the scene.
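
Roughly, that step in WolframLang looks like this; the file name is a placeholder and the frame extraction is my assumption about how the clip was imported:

    frames = Import["waves.mov", "ImageList"];  (* "waves.mov" is a placeholder file name *)
    {frame1, frame25} = frames[[{1, 25}]];

    (* matching keypoints between the two frames *)
    {pts1, pts25} = ImageCorrespondingPoints[frame1, frame25];

    (* per-point displacement in the picture plane between frame 1 and frame 25 *)
    displacements = pts25 - pts1;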

Particle map / wave map frame 1 and frame 25
comparing simulated wave pattern frame 1 and frame 25.

The simulated wave pattern investigation showed the direction of the waves clearly. The actual wave images weren't so kind to analysis. The direction of the water waves is somewhat betrayed by the patterns caused by the sun's bright spots, the various effects of debris and chemicals, and the camera angle of the video. Both sets of images did reveal some “swirling” pattern. What can be said beyond that? Not much, specifically.

I decided to review more frames of the video to see if some of the light, cast shadow and reflections revealed their regularity a bit more obviously.

50 or so keypoints from frame 1 tracked throughout the video

Unsurprisingly, there were swirling oscillations in the scene. Patterns closer to the camera showed a longer travel path, and as the visual plane receded toward the horizon we saw paths with the same shape but covering less 2D ground. So we have a mapping between the 2D (picture plane) and 3D (reality) of the wave forms. We can estimate where in reality the waves are relative to the viewer by comparing the lengths of the paths in the graph. You can see that the large red particle path near (200, 200) above covers about 200–250 ticks, while the red path at (50, 500) travels only about 100. Could we safely assume the form at (50, 500) is twice as far away? Probably.
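
A rough WolframLang version of that longer tracking and path-length comparison might look like the following; ImageFeatureTrack is the built-in I'd reach for here (the exact keypoint selection used for the figure is an assumption), and frames is the imported image list from above:

    tracked = ImageFeatureTrack[frames];  (* per-frame positions of features found in frame 1 *)

    (* one trajectory per feature; drop frames where the tracker lost the point *)
    paths = DeleteMissing /@ Transpose[tracked];

    (* total 2D distance each feature traveled across the clip *)
    pathLengths = Total[Norm /@ Differences[#]] & /@ paths;

    Graphics[Line /@ Select[paths, Length[#] > 1 &]]  (* quick look at the swirling trajectories *)

Ratios between entries of pathLengths are what the distance comparison above rests on.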

For the purposes of sketching, this analysis would tell you to make the wave forms half the size if they are twice as far away. Roughly. This is effectively no different from drawing perspective grids, the traditional approach to rendering 3D space on a 2D plane.

I was not satisfied with the abstract nature of the analysis up to that point, so I combined the video frames with the particle mapping.

frame 1 and 2 of the wave video.

The blue dots were the patterns in the image that disappear from frame to frame. The green arrows are drawn from the center of a pattern in the first frame to its location in the second frame. The algorithms that figure this out are not infallible; they are directional. The images above show that, for frames that are pretty close together in the time sequence, this is a reasonable view.
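
A minimal sketch of that kind of overlay, reusing the pts1/pts25 correspondences from earlier (here the blue points are simply all frame-1 keypoints, without the matched/unmatched bookkeeping):

    allPts = ImageKeypoints[frame1];  (* every detected keypoint in frame 1 *)

    Show[frame1,
     Graphics[{
       Blue, PointSize[Small], Point[allPts],
       Green, Arrowheads[Small],
       MapThread[Arrow[{#1, #2}] &, {pts1, pts25}]  (* matched point: frame 1 -> frame 25 *)
      }]]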

This type of analysis works much better for objects in the image that are more clearly defined than blobs of light within a bigger blob of light. One can look at debris on the surface of the water and see its vector path.

I wanted to see longer range behavior. So I considered frames further apart in time.

frame 1 and frame 1000

The photo shows that the light moved but doesn't really indicate much directionally. And it's now clear the algorithm hasn't mapped things correctly. The large downward green vector is clearly wrong; it's just picking up similarly valued patterns.

Highlight particles through 1250 frames.

The animated highlighting of particles showed the challenges these algorithms face in doing anything more than hinting at what's going on. Without knowledge of the scene and of the physics, the algorithms are merely light-pattern detectors. As the light pattern of the image changes, the algorithms do not maintain coherence. They do not understand the overall context of the scene.

OR

it is humans that make a bad assumption! The algorithms have correctly detected that the light patterns DO NOT repeat themselves over short durations in a scene like this one. The wave forms and light may actually take a much longer time to repeat, and those patterns have properly “moved” out of the scene in the direction of the wave.

Highlighting keypoints in 10,000 frames

This is the tricky part of this scene. There were cycles AND there were ephemeral, random patterns AND there were ordered causal patterns.

If one waited long enough in that scene, the cycles and perturbations of the sunlight, the moonrise, the boats coming in and out, and the wind would emerge.

The Synthesis

Finally, having completed the physics/optics and scenic analysis, it was time to synthesize it all into a sketch.

I focused on seeing if I could render the dynamics of the water surface and its play with the light using only a monochromatic approach. Using line, pattern, and light value alone, could I get the visual scenic dynamics of a cyclic, ordered, but ephemeral scene?

A sketch of the scene, a sketch of a wave form, and a sketch description.

I felt good about the study and the synthesized sketch.

AND all of this left me much more fully appreciating the incredible seascapes of painters like J. M. W. Turner.

Turner’s “Keelmen Heaving in Coals by Moonlight”

What he mastered was the full light and water topology AND often fire and people. He did this all so well that you can almost hear the birds, feel the wind, smell the salt, feel those warm fires.

The Affirmation

My study suggests that one needs to achieve enough dynamism in the scene, regardless of pictorial accuracy, to make people experience the scene. The dynamism I'm talking about is that reality-grounded feeling of cycle, order, randomness, ephemeralness. One can use color, light, line, pattern, shape, texture, etc., or all of it, as long as there are surfaces to ride, light to swirl, particles to travel off into the distance, and shapes bobbing.

P.S.

I used a variety of computational tools to study the world.

Primary programming language: WolframLang
and I use Node.js and whatever Google Cloud forces me to use for whatever I may need to get done. I also do some visualization work in Processing.

As for various machine learning algos, databases, etc., I use whatever. I couldn't care less about most of it. As long as it is well supported and has OK documentation, I'll use it / have used it.

Camera: Sony Alpha 6000 and iPhone 8

Sketching on Paper: Faber-Castell watercolor pencils and whatever paper

Sketching on Computer: iPad Pro 12.9-inch, Apple Pencil, and Procreate

Digital Photo/Image Editing: Pixelmator, Photoshop, and GIMP
