Combined constellations

Thinking through the layers of invisible forces at play in the presence of satellites, the video here brings together a layered set of sources.

There’s an interesting interplay here between the timescales of the natural sky cycle – the moon passing from east to west in the Stellarium view – and the constant hovering of the satellite visualisation.

This combination of sources also layers the celestial labels with the repetition of Starlink ID numbers. The straight dotted lines show gateway connections between satellites, linking them into constellation-like shapes. The hexagonal shapes indicate internet coverage in the focus area of Canberra/NSW, but in the way I’ve put the video together they become transparent over land, so the outline of the coast, and of Tasmania, is visible.

Together it feels like a merging of natural and artificial skies and their combined constellations.

Time series and blocky contours

Rather than combining different bands to create an image, I tried using a sequence of images all in the same band, showing the same sky area at different times.

This data comes from SkyMapper Object ID 230719364, all in the z-band, taken between 2014 and 2019. One of these images is contaminated with a satellite streak, but the rest of the sky stays the same. I began with this series in grayscale, compositing the images into one frame set that DS9 can then ‘blink’ to show that sky area through time.
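Outside DS9, this blink structure can be sketched with NumPy: each exposure becomes one slice of a 3D cube, and ‘blinking’ is just cycling through the slices. A minimal sketch with random arrays standing in for the real cutouts (in practice each frame would come from a FITS file, e.g. via astropy.io.fits):

```python
import numpy as np

# Stand-ins for five z-band exposures of the same sky area (2014-2019).
rng = np.random.default_rng(0)
sky = rng.random((64, 64))            # the 'constant' sky shared by all frames
frames = [sky + 0.05 * rng.random((64, 64)) for _ in range(5)]
frames[2][30, :] += 1.0               # a satellite streak in one exposure only

# Composite the series into a single cube that can be 'blinked' slice by slice.
cube = np.stack(frames)               # shape: (time, y, x)

def blink(cube, t):
    """Return the frame shown at blink step t (cycles through time slices)."""
    return cube[t % cube.shape[0]]

print(cube.shape)                     # (5, 64, 64)
```

The streak then shows up as a one-slice anomaly against an otherwise repeating sky, which is exactly what the blink makes visible.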

A blinking sequence of greyscale star images
I then applied one of the preset colour options and exported them as images that I worked with as 3D video layers.

A series of star images in pink and green colours

As 3D video layers, I spaced the images out from earliest to most recent.

Images lined up in 3D space

Dropping the opacity accentuates the parts of this sky section that do stay the same – e.g. the stars – because their repetition creates a sort of tunnel through the multiple layers. The satellite streak is then one slice of noise within that consistency. To exaggerate this, and the way the images interact with each other, I experimented with different blending modes.

The Overlay blending mode ‘Multiplies or screens the input colour channel values, depending on whether or not the underlying colour is lighter than 50% gray. The result preserves highlights and shadows in the underlying layer.’ Using this mode, the satellite streak stands out more, as it is one of the lightest parts of this series of images:

The Multiply blending mode multiplies source colour channels, darkening the accumulation of image data when more layers are combined, so the satellite streak is less visible.
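Both modes are simple per-pixel formulas, so their opposite effects on a bright streak can be checked directly. A minimal NumPy sketch (my own paraphrase of the standard Overlay/Multiply definitions, not the compositor’s actual code), with values normalised to 0–1:

```python
import numpy as np

def multiply(top, under):
    """Multiply: channels are multiplied, so stacking layers only darkens."""
    return top * under

def overlay(top, under):
    """Overlay: multiply where the underlying layer is darker than 50% gray,
    screen where it is lighter, preserving the under-layer's shadows/highlights."""
    return np.where(under < 0.5,
                    2.0 * top * under,
                    1.0 - 2.0 * (1.0 - top) * (1.0 - under))

# A bright streak pixel (0.9) over dark sky (0.2) and over a bright star (0.8):
print(round(float(multiply(0.9, 0.2)), 2))   # 0.18 -> accumulation darkens
print(round(float(overlay(0.9, 0.2)), 2))    # 0.36 -> multiply branch
print(round(float(overlay(0.9, 0.8)), 2))    # 0.96 -> screen branch, pushed brighter
```

This matches what the videos show: Multiply buries the streak in darkness, while Overlay keeps it near the top of the range wherever the underlying sky is light.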

Finally, with this same sky object, I tested what showing the contours of the sky objects looks like, as well as using the ‘block in’ and ‘block out’ functions, which reduce smoothness in the image. There are some interesting details in these results that I hope I can recreate!

3D layers

Still messing around with these FITS images!

I have been thinking about the challenge of mapping the impact of satellites and the dynamic elements of space that are captured differently from every angle.

This time I have returned to more familiar domains of regular old RGB channels. Here, I separated the RGB channels into three separate layers then pulled them apart in 3D space.
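Once the image is an array, pulling the channels apart is a one-line split: an (H, W, 3) image becomes three (H, W) grayscale layers that can then be positioned at different depths. A minimal sketch, with a random array standing in for the composite:

```python
import numpy as np

rng = np.random.default_rng(2)
image = rng.random((48, 48, 3))              # stand-in for the RGB composite

# Separate the channels into three grayscale layers...
red, green, blue = (image[:, :, c] for c in range(3))

# ...and assign each a depth offset, the way the layers are
# spaced apart along the z-axis in the 3D video compositor.
layers = {"red": (red, 0.0), "green": (green, 1.0), "blue": (blue, 2.0)}

print(red.shape, green.shape, blue.shape)    # (48, 48) each
```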

Interesting things happen when this is translated into a projection for 360 video or VR.

Aligning images: difference and sequence


The last FITS images I made used the I, R, and G filter bands as the R, G, B channels. While sorting through SkyMapper images to do this, I felt there were more satellite streaks in the Z band than any other. While I would need to look at a bigger sample to see if this is actually true, it seemed important to make images with the other bands that weren’t included in the RGB images I was creating.

As an experiment, I made new frames using the Z, V and U filter bands as RGB channels. This gives two images of the same ObjectID where the stars are aligned but the satellite streaks, interference, and intensity of data are different.
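Because the stars align between the two band-combinations, subtracting one frame from the other is a quick way to isolate exactly what differs – streaks and interference. A sketch with synthetic arrays standing in for the two aligned composites:

```python
import numpy as np

rng = np.random.default_rng(1)
stars = rng.random((32, 32)) ** 8           # a few bright points, mostly dark sky

irg = stars.copy()                          # I/R/G composite (luminance stand-in)
zvu = stars.copy()                          # Z/V/U composite of the same ObjectID
zvu[10, :] += 0.8                           # streak present in only one frame

diff = np.abs(irg - zvu)                    # aligned stars cancel out
streak_row = int(diff.sum(axis=1).argmax()) # row where the frames disagree most
print(streak_row)                           # 10
```

Blinking does the same comparison perceptually: anything that survives the alternation is shared sky, anything that flickers belongs to only one band-combination.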

To compare, I’ve made images that blink between the two frames:


A while ago I considered whether satellite lines could be used to make a drawing. A playful idea, it emerged again in these short loops, especially when the lines link up across frames, as in the grid above.

Without a clear idea why, I wanted to try lining up the frames in an animation so that the angle of the satellite line circulated around a centre point. I chose 24 frames where the line bisected the frame, and put them into a sequence where the line moved from touching the top left corner to the top right corner.

Then I duplicated this sequence, reversed the order of the frames, rotated it 90 degrees, and mirrored it (brain gymnastics), so that the line then travelled from corner to corner down the left hand side of the frame.
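That duplicate–reverse–rotate–mirror step can be sketched as array operations. The exact order of rotate versus mirror is my assumption of the process described above, and tiny diagonal frames stand in for the 24 streak images:

```python
import numpy as np

def next_side(frames):
    """One 'brain gymnastics' step: duplicate the sequence, reverse the frame
    order, then rotate each frame 90 degrees and mirror it."""
    return [np.fliplr(np.rot90(f)) for f in reversed(frames)]

# Tiny stand-ins for the streak frames: a bright diagonal at varying offsets.
base = [np.eye(8, k=k) for k in range(-3, 4)]

sequence = list(base)
for _ in range(3):                       # repeat twice more -> four sides total
    sequence += next_side(sequence[-len(base):])

print(len(sequence))                     # 4 * 7 = 28
```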

Repeating this two more times meant that the line travelled around the four sides of the frame. Most frames appear at four different angles throughout this sequence:


I say ‘most’ because an interesting part of this process was that mirroring and rotating the images created a difference that altered how they moved in sequence. To counter this I found it was necessary to reorder some of the frames by hand, and take some out, to create smoother motion from the new orientation of lines.

While I can’t pinpoint why this might be useful in any way (!!) it stood out to me that the process relied on reflection, and that this reflection caused unexpected angles and subsequent alterations, which I can’t help comparing to the impacts of a satellite reflecting light.


Some more FITS images made as RGB frames using the I, R, and G bands of satellite contaminated Skymapper images.

I am interested in the streak as a line drawn by sunlight, and in tracing this line back to the satellite that made it.

Planning to animate some of these next.


FITS Images

This past week I began using the astronomy software SAOImageDS9 (DS9) to look at satellite contaminated images from SkyMapper, in FITS format.

Different from typical image files, FITS (Flexible Image Transport System) is an archival data format that stores information such as spectra, lists, tables, or arrays, instead of or as well as ‘images’. DS9 is an application designed for working with image data in this format. The FITS file comes with a ‘Header’ that contains stacks of metadata; here’s an excerpt:

A screenshot of the FITS Header metadata

Since astronomical images measure the intensity of light rather than a defined colour, they are grayscale and appear different depending on the filter band used (and the frequency captured). DS9 can open these files and combine them into frames that correlate and/or separate this information.

I have been trying a very basic version of this: combining FITS files of the same sky object, captured in different bands, into a colour image in RGB channels. This involves layering the red, green, and blue channels together. Sounds simple enough… (twist! It’s not)

I originally charged ahead using the ‘r’ band for red and ‘g’ band for green. When I asked Brad where the ‘blue’ channel was, he explained (reminding me patiently) that it’s spectrum that we’re working with, not colour. 

In the light spectrum, what we see as blue has wavelengths between about 450 and 495 nanometers, green from about 495 to 570, and red from 620 to 750.

A graph of the colour spectrum

Different telescopes have different filter profiles according to what they are trying to measure. SkyMapper filters are plotted on this graph:

A graph of the SkyMapper filter wavelengths

Comparing these graphs was useful for me in thinking through how the different bands are translated to RGB channels due to their wavelengths on the light spectrum.

So when making an RGB image from SkyMapper FITS files in DS9, I used the i band for the red channel, the r band for the green channel, and the g band for the blue channel.
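In array terms, that mapping is just stacking the three band images along a colour axis, after normalising each one. A sketch with random arrays standing in for aligned i-, r-, and g-band cutouts (in practice loaded from FITS, e.g. with astropy.io.fits):

```python
import numpy as np

rng = np.random.default_rng(4)
# Stand-ins for aligned i-, r-, and g-band cutouts of the same sky object.
i_band, r_band, g_band = (rng.random((32, 32)) for _ in range(3))

def normalise(band):
    """Scale a band's intensities to 0-1 (a simple linear stretch;
    DS9 offers fancier scalings such as log or zscale)."""
    lo, hi = band.min(), band.max()
    return (band - lo) / (hi - lo)

# i -> red, r -> green, g -> blue: longer wavelengths map to redder channels.
rgb = np.dstack([normalise(i_band), normalise(r_band), normalise(g_band)])
print(rgb.shape)   # (32, 32, 3)
```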

These are two contaminated SkyMapper cutouts converted in this way:

A satellite streak image with static and colour tint

A satellite streak image with static and colour tint

A screenshot of the ds9 interface

A screenshot of the ds9 interface

And here are some more made from simply messing around with a new imaging format with no rhyme or reason:

Screenshot of a pixelated satellite streak image

A screenshot of the ds9 interface

A screenshot of the ds9 interface

Space Situational Awareness

The shaky star images led me to ask Brad how we would go about creating such images now. I was thinking visually, of images that capture the hand-tremor of a human observer, but this unleashed the topic of Space Situational Awareness (SSA) – the massive, global effort to track and be familiar with the environment of dynamic objects in space.

The recent increase in satellites means there is also an increase in radars watching satellites. This is a huge and multi-pronged industry which immediately brings up the military uses and reasons for tracking satellites – and also a completely different method and aesthetic of visualising them. 

An increasing number of companies offer services specifically for producing high-res interactive visualisations of the satellites that astronomers actively avoid. I am interested in how a single satellite can be captured visually in such starkly different ways, and how the resulting representations encapsulate vastly different world views.

These are some screenshots from a quick but thought-provoking zoom around LeoLabs:

A visualisation of satellites in space
Green objects: satellites currently in space.
Red shapes: LeoLabs’ tracking radars.

A visualisation of satellites in space
The straight line of satellites in this image are Starlink.

Pink objects: space debris 😧

Visualisation of satellites in space
Australian owned satellites.

US owned satellites.

Shaky stars

These images were found in a collection of photographic slides from 1963, part of an archive currently being digitised at Mt Stromlo Observatory.

While at first they don’t look like anything of interest, Brad mentioned them to me because they actually contain a satellite, indicated on the slide envelopes as Syncom 1, NASA’s first geosynchronous communication satellite. Although contact was lost with Syncom 1 five hours after launch, it was sighted on 1st March 1963 from Boyden Observatory at Bloemfontein, South Africa, two days before this first slide was taken.

These images represent an intriguing shift in focus. Because the satellite is the intended subject, the stars are blurred. This creates an effect similar to satellite contamination as we are familiar with now, but from the opposite cause.

It flips the temporal relationship: rather than an exposure time relevant to the motion of stars that satellites zoom past, in this image the timescale is recalibrated to the speed of a satellite. The shakiness of the line then translates directly to the observer’s hand as they traversed the sky. The way the stars move in different directions suggests how the observer moved the viewfinder to find the satellite, actively seeking out a glimpse of other humans in space.

Star streaks (above), satellite streak (below).

Satellite streak

Starlink satellite streaks seen in image by Zdenek Bardon

This seems a compelling reflection of how our relationship with orbital space has shifted. In the older image the trace of the human hand is embedded in the authorship of the photograph. In images like those resulting from Starlink interference, the trace of the human is instead found in a network of technology. The quality of line changes from nuanced and ambiguous to programmed and streamlined, representing a role reversal in whose times we live by.

Syncom 1, NASA’s first geosynchronous communication satellite, 1963

Sky Viewer artifacts

Thinking about how ‘glitch’ artifacts appear in astronomy images, both in image processing and as satellites. These are a series of images from navigating around the different layers of Sky Viewer to capture signal and noise, and to track substance and absence in both the sky being depicted and the process of mapping it.


We have started using SkyMapper, the wide-field survey telescope at the ANU Research School of Astronomy and Astrophysics, to search out satellites that are already embedded in the database.

This telescope is at the summit of Mt Woorat (Siding Spring Mountain), in the Warrumbungle Dark Sky Park on Gamilaroi, Wiradjuri and Weilwan land.

We discussed trying to access images that have been deleted due to too much interference, but it isn’t clear where these go (yet!). So instead, Brad was able to find satellites by searching for images flagged as containing objects with significant elongation – this rules out stars, which appear completely round.
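That elongation search can be sketched as a simple catalog filter: compute each detected object’s major/minor axis ratio and keep rows well above round. The column names, values, and threshold here are hypothetical stand-ins, not the actual SkyMapper schema:

```python
# Hypothetical catalog rows: (object_id, semi_major, semi_minor) in pixels.
catalog = [
    ("obj-001", 3.1, 3.0),    # round: almost certainly a star
    ("obj-002", 3.2, 3.1),
    ("obj-003", 41.7, 2.2),   # highly elongated: satellite streak candidate
    ("obj-004", 12.4, 2.8),
]

def elongation(a, b):
    """Major/minor axis ratio: ~1.0 for round sources, large for streaks."""
    return a / b

# Flag anything well away from round (threshold chosen for illustration).
candidates = [oid for oid, a, b in catalog if elongation(a, b) > 2.0]
print(candidates)   # ['obj-003', 'obj-004']
```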

These images are examples of the results. They still pass as usable because they are image cutouts, only affecting a small section of a bigger view.


The ‘band’ of the images indicates the colour of light that is captured: U (ultraviolet); B (blue); G (green); V (visual); R (red); I (infrared) – but this measures intensity, not visual colour. In radio astronomy, this intensity is shown in contours, not physical images.

The different bands are captured in separate exposures of the same area. The variation is sometimes subtle but sometimes extreme, where prominent stars are the only consistent points. To compare the bands in a slightly different way, I have compiled a series of images of multiple bands describing the same area into an animated image sequence:


This is the same idea, but uses the Sky Viewer service on SkyMapper:

Finally, I experimented with animating and drawing over the images with satellite streaks while thinking through webs and lines in space:

More on Sky Viewer soon!